Neas-Seminars

Fox Module 21 Generalized linear models concepts


http://33771.hs2.instantasp.net/Topic8635.aspx

By NEAS - 12/2/2009 2:50:35 PM

Fox Module 21 Generalized linear models concepts

 


           Maximum likelihood estimation

           Link functions


 

 

Read Section 15.1, “Structure of generalized linear models,” from page 379 through the gray box on page 381.

 

Know the three components of generalized linear models on pages 379-380:

 


 

           Random component: the conditional distribution of the response variable.

           Linear predictor: a linear function of regressors.

           Link function: transforms the expectation of the response variable to the linear predictor.


 

 

Know the expressions for the identity, log, inverse, logit, and probit link functions in the middle column of Table 15.1 at the top of page 379.
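
As a quick sketch of how the three components fit together (standard notation, not a substitute for Fox’s table): the linear predictor combines the regressors,

           η_i = α + β_1 X_i1 + β_2 X_i2 + … + β_k X_ik,

and the link function g connects the mean of the response, μ_i = E(Y_i), to the linear predictor: g(μ_i) = η_i. The five link functions listed in the table are

           identity: g(μ) = μ
           log: g(μ) = ln(μ)
           inverse: g(μ) = 1/μ
           logit: g(μ) = ln[μ / (1 − μ)]
           probit: g(μ) = Φ^{-1}(μ), where Φ is the standard normal distribution function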

 

Fox uses matrix algebra to explain maximum likelihood estimation in vector form in Section 14.1.5 on pages 352-355. Fox’s treatment is complex, and it is not needed for this course. The postings for this module on the VEE discussion forum explain maximum likelihood estimation. Focus on the following items (a short summary in symbols follows the list):

 


 

           The relation of the likelihood function to the probability density function.

           Forming and maximizing the likelihood.

           Maximizing the log-likelihood.
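
As a short summary of those three items, in standard notation: for independent observations x_1, …, x_n from a distribution with density (or probability mass) function f(x; θ), the likelihood is that same density viewed as a function of the parameter,

           L(θ) = f(x_1; θ) × f(x_2; θ) × … × f(x_n; θ),

the log-likelihood is ℓ(θ) = ln L(θ) = Σ ln f(x_i; θ), and the maximum likelihood estimate solves

           dℓ(θ)/dθ = 0.

The value of θ that maximizes L also maximizes ℓ, since the logarithm is an increasing function; working with the log-likelihood simply turns the product into a sum.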


 

 

The final exam problems test these concepts. You do not solve GLMs on the final exam, but you may be asked for the linear relation implied by a log link function or a logit link function, as in the example below.
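
For example (a sketch with a single explanatory variable X): with a log link, ln(μ) = α + βX, so μ = e^(α + βX); with a logit link, ln[μ / (1 − μ)] = α + βX, so μ = 1 / (1 + e^(−(α + βX))).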

 

A final exam problem may test the equations for maximizing the likelihood or the log-likelihood. You must know this material for later actuarial exams, and it is extremely useful for your company work; nothing here is wasted.

 

Know the Poisson, exponential, and binomial distributions. For each distribution, given a sample of observed values, know the equation to solve for the maximum likelihood estimate of the parameter (sketched below).
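
As a sketch of what those equations look like for a sample x_1, …, x_n (standard results in any mathematical statistics text):

           Poisson with mean λ: ℓ(λ) = Σ [x_i ln(λ) − λ − ln(x_i!)]; setting dℓ/dλ = Σ x_i / λ − n = 0 gives λ = (Σ x_i) / n, the sample mean.

           Exponential with mean θ: ℓ(θ) = Σ [−ln(θ) − x_i / θ]; setting dℓ/dθ = −n/θ + Σ x_i / θ^2 = 0 gives θ = (Σ x_i) / n, again the sample mean.

           Binomial with m trials and success probability p, with x_i successes observed in group i: ℓ(p) = Σ [x_i ln(p) + (m − x_i) ln(1 − p)] + constant; setting dℓ/dp = 0 gives p = (Σ x_i) / (n m), the observed proportion of successes.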

 

If all observed values come from the same distribution, the maximum likelihood estimate of the mean is simply the sample mean. If the distribution of each observed value depends on explanatory variables, the maximum likelihood estimates generally cannot be solved by pencil and paper. We use statistical software, not hand calculators.  The final exam tests the concepts, not complex examples.
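
To illustrate what the software step looks like, here is a minimal sketch in Python (it assumes the numpy and statsmodels packages; the course does not require any particular package). It simulates data whose Poisson mean depends on one explanatory variable through a log link, then recovers the coefficients by maximum likelihood:

    import numpy as np
    import statsmodels.api as sm

    # Simulate 200 observations: the Poisson mean depends on x through a log link
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 2, size=200)
    mu = np.exp(0.5 + 0.8 * x)        # true relation: ln(mu) = 0.5 + 0.8 x
    y = rng.poisson(mu)

    # Fit the GLM by maximum likelihood; the software performs the iterative maximization
    X = sm.add_constant(x)            # adds the intercept column to the design matrix
    model = sm.GLM(y, X, family=sm.families.Poisson())  # Poisson family uses a log link by default
    result = model.fit()
    print(result.params)              # estimated intercept and slope (should be near 0.5 and 0.8)

With no explanatory variables (an intercept-only model), the fitted mean would simply be the sample mean of y, matching the pencil-and-paper result above.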

 

By bradboy - 2/27/2018 5:37:54 AM

Look at this kid telling it like it is! NEAS, please stop promising things will be posted and not posting them. It is already bad enough that every page reference in this course is inaccurate.

In an effort to be helpful, this link does a good job of explaining maximum likelihood: https://onlinecourses.science.psu.edu/stat414/node/191

Essentially, you're just finding the point where the derivative of the likelihood function is equal to zero. Finding the likelihood function seems like the difficult part.