# Week 12: Linear Regression

Slowly, we are moving into the modeling domain. Everything we do contains certain assumptions. A model is essentially a world in which we impose certain assumptions in order to make predictions. For instance, we might assume that if someone drinks coffee every morning, that person will also drink coffee tomorrow morning. Machine learning works the same way: we build ML models from assumptions about the world. This week, we are considering **regression** problems, which solve for numerical outputs. The linear regression line (also known as the least squares or best fit line) is closely related to several domains: statistics, optimization, and linear algebra. In our course, we will only quickly go through the first two, but if you are interested, you are encouraged to take other ML classes at Berkeley.
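As a preview of what the least squares line looks like in practice, here is a minimal sketch using the closed-form formulas for the slope and intercept. The data points are made up purely for illustration:

```python
import numpy as np

# Hypothetical data: x could be hours studied, y could be exam score
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form least squares: slope = cov(x, y) / var(x)
x_mean, y_mean = x.mean(), y.mean()
slope = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
intercept = y_mean - slope * x_mean

print(f"best fit line: y = {slope:.2f}x + {intercept:.2f}")
```

The same fit can be obtained from `np.polyfit(x, y, 1)`; the statistics view (minimizing squared residuals) and the linear algebra view (projecting `y` onto the column space of the design matrix) both land on this line.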

## Discussion

Worksheet · Slides · Slides (Annotated) · Solutions

## Lab

## Meme Submission

## Spotify Playlist

Contribute to the class playlist