Fox Module 21 Generalized linear models concepts
Maximum likelihood estimation
Link functions
Read Section 15.1, “Structure of generalized linear models,” on pages 379 through the gray box on page 381.
Know the three components of generalized linear models on pages 379-380 (a worked example follows this list):
Random component: the conditional distribution of the response variable.
Linear predictor: a linear function of the regressors.
Link function: transforms the expectation of the response variable to the linear predictor.
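For example, a Poisson regression of a count Y on regressors with a log link has the three components below (the notation is illustrative, not necessarily Fox's exact symbols):

\[
Y_i \sim \text{Poisson}(\mu_i), \qquad
\eta_i = \alpha + \beta_1 x_{i1} + \cdots + \beta_k x_{ik}, \qquad
g(\mu_i) = \ln \mu_i = \eta_i .
\]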
Know the expressions for the identity, log, inverse, logit, and probit link functions in the middle column of Table 15.1 at the top of page 379.
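For reference, writing \mu = E(Y) for the mean of the response and \eta for the linear predictor, these link functions \eta = g(\mu) are

\[
\text{identity: } \eta = \mu, \qquad
\text{log: } \eta = \ln\mu, \qquad
\text{inverse: } \eta = \mu^{-1}, \qquad
\text{logit: } \eta = \ln\frac{\mu}{1-\mu}, \qquad
\text{probit: } \eta = \Phi^{-1}(\mu),
\]

where \ln is the natural logarithm and \Phi is the standard normal cumulative distribution function.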
Fox uses matrix algebra to explain maximum likelihood estimation in vector form in Section 14.1.5 on pages 352-355. Fox’s section is complex and is not needed for this course. The postings for this module on the VEE discussion forum explain maximum likelihood estimation. Focus on the following items (a schematic summary follows the list):
The relation of the likelihood function to the probability density function.
Forming and maximizing the likelihood.
Maximizing the log-likelihood.
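Schematically, for n independent observations y_1, ..., y_n from a density or mass function f(y; \theta), the procedure is

\[
L(\theta) = \prod_{i=1}^{n} f(y_i;\theta), \qquad
\ell(\theta) = \ln L(\theta) = \sum_{i=1}^{n} \ln f(y_i;\theta), \qquad
\left.\frac{d\ell(\theta)}{d\theta}\right|_{\theta=\hat\theta} = 0 .
\]

Because the logarithm is monotone increasing, the value of \hat\theta that maximizes the log-likelihood also maximizes the likelihood; working with a sum instead of a product is simply easier.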
The final exam problems test these concepts. You don’t solve GLMs on the final exam, but you may be asked for the linear relation implied by a log link function or a logit link function.
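For instance, with one explanatory variable X and mean response \mu:

\[
\text{log link: } \ln\mu = \alpha + \beta X \;\Rightarrow\; \mu = e^{\alpha + \beta X};
\qquad
\text{logit link: } \ln\frac{\mu}{1-\mu} = \alpha + \beta X \;\Rightarrow\; \mu = \frac{1}{1 + e^{-(\alpha + \beta X)}} .
\]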
A final exam problem may test the equations for maximizing the likelihood or the log-likelihood. You must know this material for later actuarial exams, and it is extremely useful for your company work; nothing here is wasted.
Know the Poisson, exponential, and binomial distributions. For each distribution, given a sample of observed values, know the equation to solve for the maximum likelihood parameter estimates.
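As a sketch for a sample y_1, ..., y_n with no explanatory variables, setting the derivative of the log-likelihood to zero gives:

Poisson with mean \lambda:
\[
\ell(\lambda) = -n\lambda + \ln\lambda \sum_i y_i - \sum_i \ln(y_i!), \qquad
\frac{d\ell}{d\lambda} = -n + \frac{\sum_i y_i}{\lambda} = 0
\;\Rightarrow\; \hat\lambda = \bar{y} .
\]

Exponential with mean \theta:
\[
\ell(\theta) = -n\ln\theta - \frac{\sum_i y_i}{\theta}, \qquad
\frac{d\ell}{d\theta} = -\frac{n}{\theta} + \frac{\sum_i y_i}{\theta^2} = 0
\;\Rightarrow\; \hat\theta = \bar{y} .
\]

Binomial with y successes in n trials and success probability \pi:
\[
\ell(\pi) = y\ln\pi + (n-y)\ln(1-\pi) + \text{constant}, \qquad
\frac{d\ell}{d\pi} = \frac{y}{\pi} - \frac{n-y}{1-\pi} = 0
\;\Rightarrow\; \hat\pi = \frac{y}{n} .
\]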
If all observed values come from the same distribution, the maximum likelihood estimate of the distribution’s mean is the sample mean. If the distribution of the observed values depends on explanatory variables, the maximum likelihood estimates cannot be worked out by pencil and paper; we use statistical software, not hand calculators. The final exam tests the concepts, not complex examples.
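As a minimal sketch of what the software does (assuming Python with the numpy and statsmodels packages; the data are simulated and the variable names are invented for illustration), a Poisson GLM with one explanatory variable and a log link can be fit as follows:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data: counts whose mean depends on one explanatory variable.
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 2.0, size=200)      # explanatory variable
    mu = np.exp(0.5 + 0.8 * x)               # true mean under a log link
    y = rng.poisson(mu)                      # Poisson response

    # Design matrix with an intercept column.
    X = sm.add_constant(x)

    # The Poisson family uses the log link by default; fitting is numerical
    # (iteratively reweighted least squares), not pencil and paper.
    result = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print(result.params)                     # estimated intercept and slope

The printed estimates should come out near the simulated intercept 0.5 and slope 0.8.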