Update 03-regression-regularisation.Rmd
alanocallaghan authored Apr 16, 2024
1 parent 1bd1447 commit 9012bd2
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions _episodes_rmd/03-regression-regularisation.Rmd
@@ -528,15 +528,15 @@
When $\lambda$ is small, we don't really care a lot about shrinking our coefficients towards zero, and the fitted model is close to the one obtained by
just using ordinary least squares. We see how a penalty term, $\lambda$, might be chosen later in this episode.

For now, to see how regularisation might improve a model, let's fit a model using the same set
- of 20 features (stored as `features`) selected earlier in this episode (these
+ of 20 features (stored as `cpg_markers`) selected earlier in this episode (these
are a subset of the features identified by Horvath et al.), using both
regularised and ordinary least squares. To fit regularised regression models, we will use the **`glmnet`** package.

```{r plot-ridge, fig.cap="A line plot of coefficient estimates against log lambda for a ridge regression model.", fig.alt="A line plot of coefficient estimates against log lambda for a ridge regression model. Lines are depicted in different colours, with coefficients generally having large values on the left of the plot (small log lambda) and moving smoothly and gradually towards zero to the right of the plot (large log lambda). Some coefficients appear to increase and then decrease in magnitude as lambda increases, or switch signs."}
library("glmnet")
## glmnet() performs scaling by default, supply un-scaled data:
- horvath_mat <- methyl_mat[, features] # select the first 20 sites as before
+ horvath_mat <- methyl_mat[, cpg_markers] # select the same 20 sites as before
train_mat <- horvath_mat[train_ind, ] # use the same individuals as selected before
test_mat <- horvath_mat[-train_ind, ]
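As context for the change above, here is a minimal self-contained ridge regression sketch with **`glmnet`**. It uses simulated stand-in data, since the lesson's `methyl_mat`, `cpg_markers`, and `train_ind` objects are not shown in this diff; `alpha = 0` selects the ridge penalty.

```r
library("glmnet")

## Simulated stand-in for the lesson's data: 100 individuals,
## 20 features, and a noisy continuous outcome ("age").
set.seed(42)
x <- matrix(rnorm(100 * 20), nrow = 100, ncol = 20)
age <- as.numeric(x %*% rnorm(20) + rnorm(100))

## alpha = 0 gives ridge regression; glmnet() fits the model over a
## decreasing sequence of lambda values by default.
fit <- glmnet(x, age, alpha = 0)

## Coefficient paths against log(lambda): estimates shrink towards zero
## as the penalty grows, as in the figure described in the diff above.
plot(fit, xvar = "lambda")
```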
