Suppose $X=(x_1,x_2,\ldots,x_N)$ are the samples taken from a random distribution whose PDF is parameterized by the parameter $\theta$. Here, assuming the samples are independent and identically distributed, the likelihood of observing $X$ is

$$L(\theta;X) = f(X;\theta) = \prod_{i=1}^{N} f(x_i;\theta)$$

Hereafter we will denote the log likelihood function by

$$\ln L(\theta;X) = \sum_{i=1}^{N} \ln f(x_i;\theta)$$

The maximum likelihood estimate of the unknown parameter $\theta$ is the value of $\theta$ that maximizes the likelihood function (equivalently, the log likelihood function).

In differential calculus, the maximum of a function $f(x)$ is found by taking its first derivative and equating it to zero. Similarly, the maximum likelihood estimate of a parameter $\theta$ is found by partially differentiating the likelihood function (or the log likelihood function) with respect to $\theta$ and equating it to zero.

The first partial derivative of the log likelihood function with respect to $\theta$ is called the score. Equating the score to zero gives the maximum likelihood estimate $\hat{\theta}$:

$$\frac{\partial}{\partial\theta} \ln L(\theta;X)\,\Bigg|_{\theta=\hat{\theta}} = 0$$
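The score-equals-zero idea can also be checked numerically. The sketch below is not from the original post; it assumes NumPy and uses an exponential PDF $f(x;\lambda)=\lambda e^{-\lambda x}$ purely as an illustration. The grid point that maximizes the log likelihood is also the point where the numerically estimated score crosses zero, and both agree with the closed-form MLE $\hat{\lambda}=1/\bar{x}$.

```python
# Illustrative sketch (assumptions: exponential PDF, NumPy; not from the original post).
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1 / 2.5, size=1000)       # samples with true rate lambda = 2.5

def log_likelihood(lam, x):
    # ln L(lambda; x) = N*ln(lambda) - lambda*sum(x) for the exponential PDF
    return len(x) * np.log(lam) - lam * np.sum(x)

lam_grid = np.linspace(0.5, 5.0, 2000)
ll = log_likelihood(lam_grid, x)                     # log likelihood evaluated over the grid
score = np.gradient(ll, lam_grid)                    # numerical first derivative (the score)

lam_max_ll = lam_grid[np.argmax(ll)]                 # maximizer of the log likelihood
lam_zero_score = lam_grid[np.argmin(np.abs(score))]  # where the score crosses zero

print(lam_max_ll, lam_zero_score, 1 / np.mean(x))    # all three are nearly equal
```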
Calculating the MLE for the Poisson distribution:
Let $X=(x_1,x_2,\ldots,x_N)$ be the samples taken from a Poisson distribution given by

$$f(x;\lambda) = \frac{\lambda^{x} e^{-\lambda}}{x!}, \qquad x = 0,1,2,\ldots$$

Calculating the likelihood:

$$L(\lambda;X) = \prod_{i=1}^{N} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} = \frac{\lambda^{\sum_{i=1}^{N} x_i}\, e^{-N\lambda}}{\prod_{i=1}^{N} x_i!}$$

The log likelihood is given by

$$\ln L(\lambda;X) = \left(\sum_{i=1}^{N} x_i\right)\ln\lambda - N\lambda - \sum_{i=1}^{N}\ln(x_i!)$$

Differentiating with respect to $\lambda$ and equating to zero to find the maximum (equivalently, equating the score to zero),

$$\frac{\partial}{\partial\lambda}\ln L(\lambda;X) = \frac{1}{\lambda}\sum_{i=1}^{N} x_i - N = 0$$

Solving for $\lambda$,

$$\hat{\lambda} = \frac{1}{N}\sum_{i=1}^{N} x_i$$

Thus the mean of the samples gives the maximum likelihood estimate of the parameter $\lambda$.
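As a quick sanity check, here is a short sketch (assuming NumPy/SciPy; not part of the original derivation) that simulates Poisson samples and confirms that the sample mean coincides with the numerically computed maximizer of the Poisson log likelihood.

```python
# Numerical check (assumptions: NumPy/SciPy, true lambda = 4.2 chosen arbitrarily).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(1)
x = rng.poisson(lam=4.2, size=5000)        # samples with true lambda = 4.2

def neg_log_likelihood(lam):
    # negative Poisson log likelihood summed over all samples
    return -np.sum(poisson.logpmf(x, lam))

res = minimize_scalar(neg_log_likelihood, bounds=(0.1, 20.0), method="bounded")

print("numerical MLE:", res.x)
print("sample mean  :", np.mean(x))        # matches the numerical MLE
```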
To be updated soon
For the theoretical derivation of the MLE for other PDFs, see the following links:
Theoretical derivation of Maximum Likelihood Estimator for Exponential PDF
Theoretical derivation of Maximum Likelihood Estimator for Gaussian PDF