Learn to build a variety of Linear Models using R.

## Contents

- Motivating Example
  - k-Nearest Neighbours
- Background
  - Residuals
  - Best fit and least squares
- Linear Regression
  - Validating Model Assumptions
  - Fit Diagnostics
  - Using `{broom}`

- Generalised Linear Models
  - Logistic Regression
    - Odds, Log Odds and the Logit Function
    - Example: Synthetic Data
    - Thresholding and classification
    - Principle of Parsimony
    - Multicollinearity
    - Example: Myopia Data
    - Beyond binary: One-versus-rest models
    - Model evaluation
  - Poisson Regression

- Feature Importance
- Feature Selection
  - Stepwise (forward selection and backward elimination)
  - Regularisation
    - Lasso and Ridge Regression

- Generalised Additive Models
- Mixed Effects Models
- Using `{caret}`
  - pre-processing
  - train/test splitting
  - feature importance and feature selection
  - model evaluation (using cross-validation and bootstrapping)
  - model tuning
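As a taster of the style of code covered above, the sketch below fits a linear model and a logistic regression on the built-in `mtcars` data and tidies the output with `{broom}`. The choice of dataset and predictors is illustrative only, not part of the course material; `{broom}` must be installed.

```r
library(broom)

# Linear regression: miles per gallon modelled on weight and horsepower
fit <- lm(mpg ~ wt + hp, data = mtcars)
tidy(fit)     # coefficient table as a data frame
glance(fit)   # one-row model summary (R-squared, AIC, ...)

# Logistic regression via glm(), as covered under Generalised Linear Models:
# transmission type (am, 0/1) modelled on weight
logit_fit <- glm(am ~ wt, data = mtcars, family = binomial)
summary(logit_fit)
```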

## Prior Knowledge

We assume that participants have prior experience with R, ideally having completed both the Introduction to R and Data Wrangling courses.