THE UNIVERSITY OF HONG KONG
DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE
STAT6011/7611/8305 COMPUTATIONAL STATISTICS
(2019 Fall)
Assignment 3, due on November 28
All numerical computation MUST be conducted in Python; attach the Python code.
1. Consider an integral.
(a) Plot the above integrand function in the range of (−2, 5).
(b) Use the Gauss–Legendre, Gauss–Chebyshev (first kind), Gauss–Chebyshev (second kind), and Gauss–Jacobi quadratures, each with 10 nodes and weights, to approximate the integral. Present the nodes, weights, and the approximation results. (A sketch of obtaining the nodes and weights in Python follows.)
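A minimal sketch of how the four sets of 10 nodes and weights could be obtained with NumPy/SciPy. The integrand g and the Jacobi parameters alpha = beta = 1 are placeholders (the actual integrand comes from the problem statement), and if the integration limits are not [-1, 1] a linear change of variables is needed first.

```python
import numpy as np
from scipy.special import roots_chebyt, roots_chebyu, roots_jacobi

def g(x):
    # Placeholder integrand; replace with the integrand from the problem.
    return np.exp(-x**2)

n = 10

# Gauss-Legendre: approximates the plain integral of g over [-1, 1].
x_leg, w_leg = np.polynomial.legendre.leggauss(n)
print(np.sum(w_leg * g(x_leg)))

# Gauss-Chebyshev, first kind: built-in weight 1/sqrt(1 - x^2), so divide it out.
x_c1, w_c1 = roots_chebyt(n)
print(np.sum(w_c1 * g(x_c1) * np.sqrt(1.0 - x_c1**2)))

# Gauss-Chebyshev, second kind: built-in weight sqrt(1 - x^2).
x_c2, w_c2 = roots_chebyu(n)
print(np.sum(w_c2 * g(x_c2) / np.sqrt(1.0 - x_c2**2)))

# Gauss-Jacobi: built-in weight (1 - x)^alpha (1 + x)^beta; alpha = beta = 1 is
# only an illustration -- choose the parameters to match the integrand.
alpha, beta = 1.0, 1.0
x_j, w_j = roots_jacobi(n, alpha, beta)
print(np.sum(w_j * g(x_j) / ((1.0 - x_j)**alpha * (1.0 + x_j)**beta)))
```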
2. Use the dataset q2.csv. The observed data $y = (y_1, \ldots, y_n)$ are from a mixture of normal distributions, i.e., $Y_i \sim \sum_{j=1}^{k} \omega_j f_j(y)$, $i = 1, \ldots, n = 1000$, where each $f_j$ is a normal density function $N(\mu_j, \sigma_j^2)$ and $\omega_j$ is the mixing probability, with $\sum_{j=1}^{k} \omega_j = 1$. Consider the complete data $(y_i, u_i)$, where the missing data $u_i$ indicates which distribution $y_i$ is from.
(a) Write out the complete-data likelihood.
(b) Derive the marginal distribution of $y_i$.
(c) Suppose that we know $k = 2$, $\sigma_1^2 = \sigma_2^2 = 1$ ($j = 1, 2$), and $\omega_1 = \omega_2 = 0.5$, but the $\mu_j$'s are unknown. Derive the $Q(\mu \mid \mu^{(0)})$ function in the E step, and derive the estimators $\{\mu_j^{(1)}\}$ given the previous-step values $\{\mu_j^{(0)}\}$ in the M step. Use the (sample-based) EM algorithm to estimate $\mu_j$ (see the sketch after this part).
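A minimal sketch of the sample-based EM iteration for part (c), assuming q2.csv has a single column named "y"; the column name, starting values, and iteration cap are assumptions.

```python
import numpy as np
import pandas as pd
from scipy.stats import norm

y = pd.read_csv("q2.csv")["y"].to_numpy()   # column name "y" is an assumption

mu = np.array([-0.5, 0.5])                  # arbitrary starting values mu^(0)
for it in range(500):
    # E step: responsibilities P(U_i = j | y_i, mu), with omega_1 = omega_2 = 0.5
    dens = np.column_stack([norm.pdf(y, mu[0], 1.0), norm.pdf(y, mu[1], 1.0)])
    gamma = dens / dens.sum(axis=1, keepdims=True)   # the equal weights cancel
    # M step: weighted means maximize Q(mu | mu^(0))
    mu_new = gamma.T @ y / gamma.sum(axis=0)
    if np.max(np.abs(mu_new - mu)) < 1e-8:
        break
    mu = mu_new
print(it, mu)
```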
(d) Repeat (c) using population EM, i.e., taking the expectation over $Y_i$ based on its true mixture density function $f(y) = 0.5\,N(-1, 1) + 0.5\,N(1, 1)$, where the expectation can be computed using Monte Carlo (MCEM); see the sketch after this part. Comment on the results in (c) and (d).
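One possible reading of the Monte Carlo population-EM step for part (d): replace the sample averages of part (c) with averages over a large draw from the true density. The Monte Carlo sample size and seed below are assumptions; the EM update itself is then the same as in the previous sketch, applied to y_mc instead of the observed data.

```python
import numpy as np

# Draws from the true mixture f(y) = 0.5 N(-1, 1) + 0.5 N(1, 1).
rng = np.random.default_rng(1)
M = 100_000                                  # Monte Carlo sample size (assumed)
labels = rng.integers(0, 2, size=M)          # latent component, probability 0.5 each
y_mc = rng.normal(np.where(labels == 0, -1.0, 1.0), 1.0)
```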
(e) Suppose that we know $k = 2$ and $\sigma_1^2 = \sigma_2^2 = 1$ ($j = 1, 2$), but $\mu_j$ and $\omega_j$ are unknown. If we treat the $u_i$'s as missing data, derive the $Q(\omega, \mu \mid \omega^{(0)}, \mu^{(0)})$ function in the E step, and derive the estimators in closed form, i.e., the iterative equations between $\{\omega_j^{(1)}, \mu_j^{(1)}\}$ and $\{\omega_j^{(0)}, \mu_j^{(0)}\}$, in the M step. Use the (sample-based) EM algorithm to estimate $\mu_j$ and $\omega_j$ (the expected form of the updates is sketched after this part).
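For reference, the updates in part (e) should reduce to the familiar Gaussian-mixture form with unit variances; this is only a sketch of the expected shape, to be verified by the derivation, with $\phi$ the standard normal density:

$$\gamma_{ij} = \frac{\omega_j^{(0)}\,\phi\bigl(y_i - \mu_j^{(0)}\bigr)}{\sum_{l=1}^{k} \omega_l^{(0)}\,\phi\bigl(y_i - \mu_l^{(0)}\bigr)}, \qquad \omega_j^{(1)} = \frac{1}{n}\sum_{i=1}^{n}\gamma_{ij}, \qquad \mu_j^{(1)} = \frac{\sum_{i=1}^{n}\gamma_{ij}\,y_i}{\sum_{i=1}^{n}\gamma_{ij}}.$$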
(f) Repeat (e) using population EM, i.e., taking the expectation over $Y_i$ based on its true mixture density function $f(y) = \omega_1 N(-1, 1) + \omega_2 N(1, 1)$, where the expectation can be computed using Gauss–Hermite quadrature (see the sketch after this part). Comment on the results in (c)–(f), i.e., whether knowing the true weights helps convergence (e.g., how many iterations are needed for convergence).
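A minimal sketch of computing an expectation under the mixture with Gauss–Hermite quadrature. NumPy's hermgauss uses the physicists' convention (weight $e^{-x^2}$), so a change of variables $y = \sqrt{2}\,\sigma x + m$ is needed; the number of nodes is an assumption.

```python
import numpy as np

nodes, weights = np.polynomial.hermite.hermgauss(20)   # 20 nodes (assumed)

def e_normal(h, mean, sd=1.0):
    # E[h(Y)] for Y ~ N(mean, sd^2), via y = sqrt(2)*sd*x + mean.
    y = np.sqrt(2.0) * sd * nodes + mean
    return np.sum(weights * h(y)) / np.sqrt(np.pi)

def e_mixture(h, w1, w2):
    # E[h(Y)] under f(y) = w1 N(-1, 1) + w2 N(1, 1).
    return w1 * e_normal(h, -1.0) + w2 * e_normal(h, 1.0)

# Example: the mixture mean, w1*(-1) + w2*(+1).
print(e_mixture(lambda y: y, 0.5, 0.5))
```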
3. Use the EM algorithm to estimate the parameters in the random-effects model, for $i = 1, \ldots, I$ and $j = 1, \ldots, J$,
$$Y_{ij} = \beta_0 + \beta_1 x_{ij} + u_i + \epsilon_{ij},$$
where $u_i \sim N(0, \sigma_u^2)$ and $\epsilon_{ij} \sim N(0, \sigma_\epsilon^2)$. The unknown parameter vector is $\theta = (\beta_0, \beta_1, \sigma_u^2, \sigma_\epsilon^2)^{T}$.
(a) Write out the complete-data likelihood.
(b) Derive the Q-function and the M-step of the EM algorithm.
(c) Conduct simulations as follows. Set the parameters $\beta_0 = 0.5$, $\beta_1 = 1$, $\sigma_u = 1$, $\sigma_\epsilon = 1$, $I = 100$, and $J = 2$. For each dataset, simulate $x_{ij}$ from Uniform(0, 1), simulate $\epsilon_{ij}$ and $u_i$ from the corresponding normal distributions, and then obtain $y_{ij}$. Use the EM algorithm to obtain the parameter estimates based on each simulated dataset. Repeat the simulation process 1000 times and present the bias (averaged over the 1000 simulations) and standard deviation for $\theta$. Comment on your findings. (A sketch of the simulation setup follows.)
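A minimal sketch of the simulation loop for part (c). Here em_fit is a hypothetical placeholder for the EM routine derived in parts (a)–(b), assumed to return estimates in the order $(\beta_0, \beta_1, \sigma_u^2, \sigma_\epsilon^2)$; the random seed is also an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma_u, sigma_e = 0.5, 1.0, 1.0, 1.0
I, J, n_sim = 100, 2, 1000

estimates = []
for _ in range(n_sim):
    x = rng.uniform(0.0, 1.0, size=(I, J))          # x_ij ~ Uniform(0, 1)
    u = rng.normal(0.0, sigma_u, size=(I, 1))       # u_i ~ N(0, sigma_u^2)
    eps = rng.normal(0.0, sigma_e, size=(I, J))     # eps_ij ~ N(0, sigma_e^2)
    y = beta0 + beta1 * x + u + eps
    estimates.append(em_fit(x, y))                  # hypothetical EM routine
estimates = np.array(estimates)

truth = np.array([beta0, beta1, sigma_u**2, sigma_e**2])
print("bias:", estimates.mean(axis=0) - truth)
print("sd:  ", estimates.std(axis=0, ddof=1))
```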