Provide five examples of the maximum likelihood estimator (MLE), with detailed steps.
MAXIMUM LIKELIHOOD ESTIMATOR::-
Suppose that we have a random sample from a population of interest. We may have a theoretical model for the way that the population is distributed. However, there may be several population parameters whose values we do not know. Maximum likelihood estimation is one way to determine these unknown parameters.
The basic idea behind maximum likelihood estimation is that we determine the values of these unknown parameters in such a way as to maximize an associated joint probability density function or probability mass function. We will see this in more detail in what follows, and then work through some examples of maximum likelihood estimation.
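In symbols: if x_1, x_2, . . ., x_n is an observed random sample from a distribution with density or mass function f(x; θ) depending on a parameter θ, the likelihood is the joint density regarded as a function of θ, and the maximum likelihood estimator maximizes it (equivalently, maximizes its logarithm, since ln is increasing):

```latex
L(\theta) = \prod_{i=1}^{n} f(x_i; \theta),
\qquad
\hat{\theta} = \arg\max_{\theta} L(\theta) = \arg\max_{\theta} \ln L(\theta)
```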
EXAMPLES::-
1) Suppose we have a package of seeds, each of which has a constant probability p of successful germination. We plant n of these seeds and count the number that sprout. Assume that each seed sprouts independently of the others. How do we determine the maximum likelihood estimator of the parameter p?
We begin by noting that each seed is modeled by a Bernoulli distribution with probability of success p. We let X be either 0 or 1, and the probability mass function for a single seed is
f(x; p) = p^x (1 - p)^(1 - x).
Our sample consists of n different X_i, each of which has a Bernoulli distribution. The seeds that sprout have X_i = 1 and the seeds that fail to sprout have X_i = 0.
The likelihood function is given by:
L(p) = Π p^(x_i) (1 - p)^(1 - x_i) = p^(Σx_i) (1 - p)^(n - Σx_i)

To maximize, take logarithms: ln L(p) = (Σx_i) ln p + (n - Σx_i) ln(1 - p). Setting the derivative (Σx_i)/p - (n - Σx_i)/(1 - p) equal to zero and solving for p gives the maximum likelihood estimator p̂ = Σx_i / n, the sample proportion of seeds that germinated.
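As a quick check, here is a minimal Python sketch (assuming NumPy and SciPy are available) that maximizes this Bernoulli likelihood numerically and compares the result with the closed-form estimator p̂ = Σx_i / n. The germination outcomes below are made-up illustrative data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Made-up germination outcomes for n = 10 seeds (1 = sprouted, 0 = did not sprout).
x = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

def neg_log_likelihood(p):
    # Negative Bernoulli log-likelihood: -[(sum x_i) ln p + (n - sum x_i) ln(1 - p)].
    n, s = len(x), x.sum()
    return -(s * np.log(p) + (n - s) * np.log(1 - p))

# Minimize the negative log-likelihood over the open interval (0, 1).
result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")

print(result.x)   # numerical MLE, approximately 0.7
print(x.mean())   # closed-form MLE p-hat = (sum x_i) / n = 0.7
```

Minimizing the negative log-likelihood is the standard way to phrase maximization for numerical optimizers, and the two printed values should agree.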
2)
For another example, suppose that we have a random sample X_1, X_2, . . ., X_n from a population that we are modeling with an exponential distribution. The probability density function for a single random variable is of the form f(x) = θ^(-1) e^(-x/θ).
The likelihood function is given by the joint probability density function, which is a product of the n individual density functions:
L(θ) = Π θ^(-1) e^(-x_i/θ) = θ^(-n) e^(-Σx_i/θ)

Taking logarithms, ln L(θ) = -n ln θ - (Σx_i)/θ. Setting the derivative -n/θ + (Σx_i)/θ² equal to zero and solving gives the maximum likelihood estimator θ̂ = Σx_i / n = x̄, the sample mean.
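The same numerical check works here. Below is a minimal Python sketch (again assuming NumPy and SciPy) with made-up positive data standing in for an observed sample; the numerical maximizer should land on the sample mean.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Made-up waiting-time data; any positive sample works for illustration.
x = np.array([2.1, 0.7, 3.4, 1.2, 2.8, 0.5, 1.9, 2.3])

def neg_log_likelihood(theta):
    # Negative exponential log-likelihood: -[-n ln(theta) - (sum x_i) / theta].
    return len(x) * np.log(theta) + x.sum() / theta

# Minimize over a positive range wide enough to contain the optimum.
result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")

print(result.x)   # numerical MLE for theta
print(x.mean())   # closed-form MLE theta-hat = sample mean; the two should agree
```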