Reference no: EM133137838
AM41PB Probabilistic Modelling
Please note that the items in each of the questions are mainly for guidance and do not directly correspond to the marking scheme. For instance, the item "Explain which software package you have used or provide the code if written by yourself" will not be marked on its own.
Question 1: Message passing for inference - The use of message passing for inference is described on lecture slides 218-221. It boils down to solving the binary equation

z = AS (mod 2),

where z ∈ {0, 1}^(M-N) is an (M - N)-dimensional binary vector (the syndrome) and S ∈ {0, 1}^M is an M-dimensional binary vector (the noise); the binary matrix A is a sparse (M - N) × M matrix. The matrix used in this case has M = 6, N = 3 and

A = [ 1 1 1 0 0 1 ]
    [ 0 1 1 0 0 1 ]
    [ 0 1 1 0 0 1 ]

(a) Plot the factor graph that represents the problem and representative messages from node to factor and factor to node.
(b) Derive the closed set of message passing equations that represents the problem.
(c) Write the expression for the approximate posterior for the variables S using the messages.
(d) Using an existing software package (or your own code), a syndrome vector z = [0, 0, 0] and a noise vector S with prior probabilities

p(S1 = 1) = 0.9
p(S2 = 1) = 0.9
p(S3 = 1) = 0.1
p(S4 = 1) = 0.9
p(S5 = 1) = 0.9
p(S6 = 1) = 0.1

provide the posterior values (0, 1) of the elements of the vector S for 10 iterations of the algorithm.
(e) Verify that the solution obeys the equation
z = AS (mod 2).
(f) Explain which software package you have used and how it implements the algorithm described in (a) or provide the code if written by yourself (an illustrative sketch is given below).
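For illustration only, the following sketch implements the loopy sum-product updates of (b) in Python/numpy, taking the matrix A, the syndrome z and the priors exactly as quoted above; the package choice and all names (msg_vf, msg_fv, post1, ...) are assumptions, not part of the brief.

# Minimal sketch of loopy sum-product message passing for z = A S (mod 2).
# A, z and the priors p(S_i = 1) are taken as quoted in the question;
# variable names are illustrative only.
import numpy as np

A = np.array([[1, 1, 1, 0, 0, 1],
              [0, 1, 1, 0, 0, 1],
              [0, 1, 1, 0, 0, 1]])          # (M-N) x M parity-check matrix
z = np.array([0, 0, 0])                      # syndrome
prior1 = np.array([0.9, 0.9, 0.1, 0.9, 0.9, 0.1])   # p(S_i = 1) from item (d)

Mchk, Mvar = A.shape
# messages stored as the probability that the bit equals 1
msg_vf = np.tile(prior1, (Mchk, 1))          # variable -> factor, q_{i->m}(1)
msg_fv = np.full((Mchk, Mvar), 0.5)          # factor -> variable, r_{m->i}(1)

for it in range(10):
    # factor -> variable update (parity-check factors)
    for m in range(Mchk):
        idx = np.flatnonzero(A[m])
        delta = 1.0 - 2.0 * msg_vf[m, idx]   # q(0) - q(1) for each neighbour
        for k, i in enumerate(idx):
            prod = np.prod(np.delete(delta, k))
            d = (1 - 2 * z[m]) * prod        # r(0) - r(1)
            msg_fv[m, i] = 0.5 * (1.0 - d)   # r_{m->i}(1)
    # variable -> factor update and approximate posterior
    post1 = np.empty(Mvar)
    for i in range(Mvar):
        ms = np.flatnonzero(A[:, i])
        p1 = prior1[i] * np.prod(msg_fv[ms, i])
        p0 = (1 - prior1[i]) * np.prod(1 - msg_fv[ms, i])
        post1[i] = p1 / (p0 + p1)            # approximate posterior p(S_i = 1)
        for m in ms:
            q1 = prior1[i] * np.prod(msg_fv[np.setdiff1d(ms, m), i])
            q0 = (1 - prior1[i]) * np.prod(1 - msg_fv[np.setdiff1d(ms, m), i])
            msg_vf[m, i] = q1 / (q0 + q1)
    print(f"iteration {it + 1}: p(S_i = 1) = {np.round(post1, 3)}")

S_hat = (post1 > 0.5).astype(int)            # hard decision for item (e)
print("S_hat =", S_hat, "A S_hat mod 2 =", A @ S_hat % 2, "z =", z)

The factor-to-variable step relies on the standard parity-check identity r(0) - r(1) = (1 - 2 z_m) * prod_j (q_j(0) - q_j(1)), which avoids summing explicitly over all configurations of the neighbouring variables.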
Question 2: Mixture models - On BlackBoard, under the section assignments, please find the set of GDP data points of different countries in the file GDP2003-2011.txt (each line includes two real values: the GDP values for 2003 and 2011). Use the Expectation Maximisation (EM) algorithm to train a Gaussian mixture model (GMM) of the data.
(a) Explain briefly the GMM model and the EM algorithm you are using.
(b) Train a GMM of K = 3, 5, 7, 9, 11 Gaussians and plot the resulting mixture model (in the form of Gaussians in this two-dimensional space). Explain the results obtained.
(c) Plot the progression in likelihood values with respect to the iterations of the EM algorithm for the various cases.
(d) Review methods for finding the optimal number of Gaussians in the mixture, explain the various criteria and the one you have been using.
(e) Based on your results, determine which of the models used best represents the data.
(f) Explain which software package you have used and how it implements the algorithm described in (a) or provide the code if written by yourself (see the sketch below for one possible choice).
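One possible illustration, assuming scikit-learn's GaussianMixture class (which fits a GMM by EM) is used as the existing software package and that GDP2003-2011.txt loads as a plain two-column array:

# Minimal sketch: fit GMMs of several sizes to the GDP data by EM.
# The file layout and the use of scikit-learn are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

X = np.loadtxt("GDP2003-2011.txt")           # N x 2 array: GDP 2003, GDP 2011

results = {}
for k in (3, 5, 7, 9, 11):
    gmm = GaussianMixture(n_components=k, covariance_type="full",
                          n_init=5, random_state=0)
    gmm.fit(X)                               # EM runs inside fit()
    results[k] = {
        "avg_loglik": gmm.score(X),          # mean log-likelihood per point
        "bic": gmm.bic(X),                   # one possible selection criterion
        "means": gmm.means_,
        "covariances": gmm.covariances_,
        "weights": gmm.weights_,
    }
    print(f"K={k:2d}  avg log-lik={results[k]['avg_loglik']:.3f}  "
          f"BIC={results[k]['bic']:.1f}")

# The K with the lowest BIC is one defensible answer to item (e); other
# criteria (AIC, held-out likelihood, ...) belong in the discussion for (d).
best_k = min(results, key=lambda k: results[k]["bic"])
print("K selected by BIC:", best_k)

Note that scikit-learn only exposes the final lower bound on the log-likelihood (gmm.lower_bound_); for the per-iteration likelihood curve in (c) one would either run the fit with verbose=2 or implement the EM loop directly.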
Question 3: Gaussian processes - On BlackBoard, under the section assignments, please find a dataset GPdata 2022.dat of 10 points given as input-output pairs (i.e., (x1, y1), (x2, y2), ..., (x10, y10)).
(a) Explain briefly the Gaussian Process algorithm and how it is used for this task.
(b) Use the Gaussian process method to model the data (plot the mean and variance throughout the input range). The assumed variance of the training output is 0.04.
(c) Apply the quadratic covariance matrix with different parameter values and choose your preferred model (parameter values are typically small positive numbers, representing inverse-scale and uncorrupted input variance). What is the value of y for x = 0?
(d) Explain your preferred choice of parameter values.
(e) Explain which software package you have used and how it implements the algorithm described in (a) or provide the code if written by yourself (an illustrative sketch follows).
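A minimal sketch of the GP prediction equations is given below, in Python/numpy with the output-noise variance fixed at 0.04. The kernel() shown is a squared-exponential placeholder with an inverse-scale and a signal-variance parameter; it is an assumption and should be replaced by the quadratic covariance specified in (c). The file name used for loading is also an assumption about how the attachment is stored.

# Minimal GP-regression sketch with a fixed output-noise variance of 0.04.
# The file name and the squared-exponential form of kernel() are assumptions;
# substitute the course's quadratic covariance and its two parameters.
import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt("GPdata_2022.dat")         # assumed: 10 rows of (x, y)
x_train, y_train = data[:, 0], data[:, 1]
noise_var = 0.04                             # given variance of the training outputs

def kernel(xa, xb, inv_scale=1.0, signal_var=1.0):
    """Covariance between two sets of inputs (assumed squared-exponential form)."""
    d2 = (xa[:, None] - xb[None, :]) ** 2
    return signal_var * np.exp(-0.5 * inv_scale * d2)

def gp_predict(x_star, inv_scale, signal_var):
    K = kernel(x_train, x_train, inv_scale, signal_var) + noise_var * np.eye(len(x_train))
    K_s = kernel(x_star, x_train, inv_scale, signal_var)
    K_ss = kernel(x_star, x_star, inv_scale, signal_var)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

xs = np.linspace(x_train.min() - 1, x_train.max() + 1, 200)
mean, var = gp_predict(xs, inv_scale=1.0, signal_var=1.0)   # try several parameter values

plt.plot(x_train, y_train, "k+", label="training data")
plt.plot(xs, mean, label="GP mean")
plt.fill_between(xs, mean - 2 * np.sqrt(var), mean + 2 * np.sqrt(var),
                 alpha=0.3, label="±2 std")
plt.legend(); plt.show()

m0, v0 = gp_predict(np.array([0.0]), inv_scale=1.0, signal_var=1.0)
print("prediction at x = 0: mean =", m0[0], "variance =", v0[0])

Evaluating gp_predict over a grid gives the mean and variance plot required in (b), and evaluating it at x = 0 answers the final part of (c).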
Attachment: Probabilistic Modelling.rar