Course slides
Welcome to David Dowe's short course slides.
Lloyd Allison's
CIA's
32-Mstate.
Gaussian distribution - also known as the Normal distribution.
Consider data x1, x2, ..., xN.
E.g., heights or weights.
We wish to estimate the mean, mu.
We also wish to estimate the standard deviation (s.d.), sigma,
or the variance, sigma^2.
The likelihood function is
f ( x | mu, sigma )
= PI_{i=1}^{i=N} 1/(sqrt(2 pi) sigma)
exp { - (xi - mu)^2 / (2 sigma^2) }
The negative log-likelihood function is
L = - log f ( x | mu, sigma )
= N log (sqrt(2 pi) sigma) + SIGMA_{i=1}^{i=N} (xi - mu)^2 / (2 sigma^2)
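The negative log-likelihood above can be checked numerically. A minimal pure-Python sketch (the function names and the sample data are illustrative, not from the course): one version uses the formula for L directly, the other takes - log of the likelihood product, and the two agree.

```python
import math

def neg_log_likelihood(xs, mu, sigma):
    """L = N log(sqrt(2 pi) sigma) + SUM (xi - mu)^2 / (2 sigma^2)."""
    n = len(xs)
    return (n * math.log(math.sqrt(2 * math.pi) * sigma)
            + sum((x - mu) ** 2 for x in xs) / (2 * sigma ** 2))

def neg_log_likelihood_direct(xs, mu, sigma):
    """- log of the likelihood product, term by term (fine for small N)."""
    f = 1.0
    for x in xs:
        f *= (math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
              / (math.sqrt(2 * math.pi) * sigma))
    return -math.log(f)

xs = [1.2, 0.7, 2.1, 1.5]  # illustrative data
assert abs(neg_log_likelihood(xs, 1.0, 0.8)
           - neg_log_likelihood_direct(xs, 1.0, 0.8)) < 1e-9
```

The direct product would underflow for large N, which is exactly why one works with L rather than f.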
del L / del mu = - SIGMA_{i=1}^{i=N} (xi - mu) / sigma^2
del L / del sigma = N / sigma - SIGMA_{i=1}^{i=N} (xi - mu)^2 / sigma^3
The Maximum Likelihood estimates, mu^_{ML} and sigma^_{ML},
are obtained when del L / del mu = 0 and del L / del sigma = 0.
This gives
mu^_{ML} = SIGMA_{i=1}^{i=N} xi / N = xbar,
and
sigma^2_{ML} = SIGMA_{i=1}^{i=N} (xi - xbar)^2 / N.
Note the divisor N here: the familiar unbiased sample variance divides by N - 1 instead, but that is not the Maximum Likelihood estimate.
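The ML estimates can be sketched in a few lines of Python (the data are illustrative):

```python
def ml_estimates(xs):
    """Maximum Likelihood estimates (mu_hat, sigma2_hat) for a Gaussian.
    Note the divisor N, not N - 1, in the ML variance estimate."""
    n = len(xs)
    mu_hat = sum(xs) / n
    sigma2_hat = sum((x - mu_hat) ** 2 for x in xs) / n
    return mu_hat, sigma2_hat

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(ml_estimates(xs))  # (5.0, 4.0)
```

Dividing by N - 1 instead would give 32/7, not 4: the unbiased estimate and the ML estimate differ.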
To get the Minimum Message Length (MML) estimate,
we need both the Fisher information and the priors.
The Fisher information, F, can be obtained by continuing with
the log-likelihood function as above.
As before,
del L / del mu = - SIGMA_{i=1}^{i=N} (xi - mu) / sigma^2
del L / del sigma = N / sigma - SIGMA_{i=1}^{i=N} (xi - mu)^2 / sigma^3
So,
del^2 L / del mu^2 = N / sigma^2
del^2 L / del mu del sigma = del^2 L / del sigma del mu
= 2 SIGMA_{i=1}^{i=N} (xi - mu) / sigma^3
del^2 L / del sigma^2 = - N / sigma^2 + 3 SIGMA_{i=1}^{i=N} (xi - mu)^2 / sigma^4
Taking expectations, using E(xi - mu) = 0 and E((xi - mu)^2) = sigma^2,
E ( del^2 L / del mu^2 ) = N / sigma^2
E ( del^2 L / del mu del sigma ) = E ( del^2 L / del sigma del mu )
= 0
E ( del^2 L / del sigma^2 ) = - N / sigma^2 + 3 N sigma^2 / sigma^4 = 2 N / sigma^2
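The second-derivative formulas can be verified against central finite differences of L. A sketch under illustrative data and parameter values (the helper names are not from the course):

```python
import math

def L(xs, mu, sigma):
    """Negative log-likelihood of a Gaussian."""
    n = len(xs)
    return (n * math.log(math.sqrt(2 * math.pi) * sigma)
            + sum((x - mu) ** 2 for x in xs) / (2 * sigma ** 2))

def d2_mu2(xs, mu, sigma):
    """Analytic del^2 L / del mu^2 = N / sigma^2."""
    return len(xs) / sigma ** 2

def d2_sigma2(xs, mu, sigma):
    """Analytic del^2 L / del sigma^2 = -N/sigma^2 + 3 SUM (xi-mu)^2 / sigma^4."""
    n = len(xs)
    return -n / sigma ** 2 + 3 * sum((x - mu) ** 2 for x in xs) / sigma ** 4

def fd2(f, t, h=1e-4):
    """Central finite-difference approximation to f''(t)."""
    return (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2

xs = [1.2, 0.7, 2.1, 1.5]  # illustrative data
mu, sigma = 1.0, 0.8
assert abs(fd2(lambda m: L(xs, m, sigma), mu) - d2_mu2(xs, mu, sigma)) < 1e-4
assert abs(fd2(lambda s: L(xs, mu, s), sigma) - d2_sigma2(xs, mu, sigma)) < 1e-3
```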
So the Fisher information is the determinant of the expected second-derivative matrix:
F ( mu, sigma ) = E ( del^2 L / del mu^2 ) E ( del^2 L / del sigma^2 ) - E ( del^2 L / del mu del sigma )^2
= (N / sigma^2) (2 N / sigma^2) = 2 N^2 / sigma^4.
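The expectations E(del^2 L / del mu^2) = N/sigma^2, E(del^2 L / del mu del sigma) = 0 and E(del^2 L / del sigma^2) = 2N/sigma^2, and hence the Fisher information 2 N^2 / sigma^4, can be checked by Monte Carlo: simulate many datasets from the true Gaussian and average the observed second derivatives. A pure-Python sketch (the function name, seed, and sizes are illustrative choices, not from the course):

```python
import random

def observed_second_derivs(xs, mu, sigma):
    """Observed second derivatives of the negative log-likelihood L."""
    n = len(xs)
    d_mumu = n / sigma ** 2
    d_musig = 2 * sum(x - mu for x in xs) / sigma ** 3
    d_sigsig = -n / sigma ** 2 + 3 * sum((x - mu) ** 2 for x in xs) / sigma ** 4
    return d_mumu, d_musig, d_sigsig

random.seed(0)
mu, sigma, n, reps = 0.0, 1.0, 5, 20000
sums = [0.0, 0.0, 0.0]
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    for i, d in enumerate(observed_second_derivs(xs, mu, sigma)):
        sums[i] += d
avgs = [s / reps for s in sums]
# Expectations: N/sigma^2 = 5, 0, 2N/sigma^2 = 10 (up to Monte Carlo noise).
assert abs(avgs[0] - n / sigma ** 2) < 1e-9
assert abs(avgs[1]) < 0.3
assert abs(avgs[2] - 2 * n / sigma ** 2) < 0.5
# Determinant of the expected matrix, approximately 2 N^2 / sigma^4 = 50.
F = avgs[0] * avgs[2] - avgs[1] ** 2
assert abs(F - 2 * n ** 2 / sigma ** 4) < 5.0
```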
--------------------------------------------------------------------------
Copying is not permitted without expressed permission from
David L. Dowe.