Within this very nice piece, Rob drops this bomb of mathematical knowledge:
It is not necessary to actually fit $n$ separate models when computing the CV statistic for linear models.
Say what?
Here is a broader excerpt and the method itself (after the jump).
While cross-validation can be computationally expensive in general, it is very easy and fast to compute LOOCV for linear models. A linear model can be written as
$$\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{e}.$$
Then
$$\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y}$$
and the fitted values can be calculated using
$$\hat{\mathbf{Y}} = \mathbf{X}\hat{\boldsymbol{\beta}} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} = \mathbf{H}\mathbf{Y},$$
where $\mathbf{H} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'$ is known as the “hat-matrix” because it is used to compute $\hat{\mathbf{Y}}$ (“Y-hat”).
If the diagonal values of $\mathbf{H}$ are denoted by $h_1, \dots, h_n$, then the cross-validation statistic can be computed using
$$\text{CV} = \frac{1}{n}\sum_{i=1}^{n}\left[\frac{e_i}{1 - h_i}\right]^2,$$
where $e_i$ is the residual obtained from fitting the model to all $n$ observations. See Christensen’s book Plane Answers to Complex Questions for a proof. Thus, it is not necessary to actually fit $n$ separate models when computing the CV statistic for linear models. This remarkable result allows cross-validation to be used while only fitting the model once to all available observations.
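To see the result in action, here is a minimal sketch in Python/NumPy (my own illustration, not from Rob's post; the toy data and names like cv_shortcut are assumptions) that computes LOOCV both ways for a small regression: once via the hat-matrix diagonal after a single fit, and once by naively refitting the model with each observation left out. The two numbers agree up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n observations from a simple linear model with an intercept.
n = 50
x = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), x])           # design matrix with intercept column
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=n)

# --- Shortcut: fit once, then use the hat-matrix diagonal ---
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # beta-hat = (X'X)^{-1} X'Y
residuals = y - X @ beta_hat                   # residuals from the full fit
H = X @ np.linalg.solve(X.T @ X, X.T)          # hat matrix H = X (X'X)^{-1} X'
h = np.diag(H)                                 # leverages h_i
cv_shortcut = np.mean((residuals / (1 - h)) ** 2)

# --- Naive LOOCV: refit the model n times, leaving one observation out each time ---
errors = []
for i in range(n):
    mask = np.arange(n) != i
    b = np.linalg.solve(X[mask].T @ X[mask], X[mask].T @ y[mask])
    errors.append(y[i] - X[i] @ b)
cv_naive = np.mean(np.square(errors))

print(cv_shortcut, cv_naive)  # identical up to floating-point rounding
```

One fit plus a diagonal extraction replaces $n$ refits, which is exactly the computational point of the quoted passage.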