Weekly plans and updates for week 40

Hello everybody,

We hope you are all doing fine. Here is our weekly summary, with plans for this week's lectures and lab.

The lab session is straightforward: we keep working on Project 1.

Last week we discussed logistic regression and gradient methods, and started to look at stochastic gradient descent and its family of methods.

Although it may not sound like a very exciting topic (and it feels far removed from all the interesting data we can analyze with ML methods), finding the minima of the cost/risk function is the true bottleneck of all machine learning methods. For our 'simpler' methods, linear regression and logistic regression, we end up with clearly convex cost functions, and the search for an optimal minimum is greatly simplified by that.
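
To make the convex case concrete, here is a minimal sketch (not taken from the course material) of plain gradient descent on the OLS cost for a synthetic one-dimensional data set; the learning rate and iteration count are illustrative choices, not recommendations.

```python
import numpy as np

# Minimal sketch: plain gradient descent on the convex OLS cost
# C(beta) = (1/n) ||y - X beta||^2 for synthetic data y = 2 + 3x + noise.
rng = np.random.default_rng(42)
n = 100
x = rng.uniform(0, 1, size=(n, 1))
y = 2.0 + 3.0 * x + 0.1 * rng.standard_normal((n, 1))

X = np.hstack([np.ones((n, 1)), x])   # design matrix with intercept column
beta = np.zeros((2, 1))               # start from the origin
eta = 0.1                             # learning rate (illustrative value)

for _ in range(1000):
    gradient = (2.0 / n) * X.T @ (X @ beta - y)  # gradient of the OLS cost
    beta -= eta * gradient

print(beta.ravel())  # convexity guarantees convergence, here close to [2, 3]
```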

This week we move into neural networks (the end of Thursday's lecture and all of Friday). Here we may no longer have a convex (or concave) cost function, meaning that we can get stuck in saddle points or local minima, or may not find a minimum at all. Furthermore, we will have zillions of parameters to fit. This is why we spend some time on stochastic gradient methods; they are simply essential for all ML studies.
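
For contrast, here is a minimal sketch of the stochastic variant on the same synthetic OLS problem: the gradient is estimated on random mini-batches rather than the full data set. The batch size, learning rate and epoch count are again illustrative choices.

```python
import numpy as np

# Minimal sketch of stochastic gradient descent with mini-batches
# on the synthetic OLS problem from above.
rng = np.random.default_rng(42)
n = 100
x = rng.uniform(0, 1, size=(n, 1))
y = 2.0 + 3.0 * x + 0.1 * rng.standard_normal((n, 1))
X = np.hstack([np.ones((n, 1)), x])

beta = np.zeros((2, 1))
eta, batch_size, n_epochs = 0.1, 10, 200  # illustrative hyperparameters

for epoch in range(n_epochs):
    perm = rng.permutation(n)                 # reshuffle the data every epoch
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # gradient estimated on a mini-batch instead of the full data set
        gradient = (2.0 / batch_size) * Xb.T @ (Xb @ beta - yb)
        beta -= eta * gradient

print(beta.ravel())  # a noisier path, but it ends near the same minimum [2, 3]
```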

The overview video for this week tries to summarize what we plan to do. See https://www.uio.no/studier/emner/matnat/fys/FYS-STK3155/h20/forelesningsvideoer/OverviewWeek40.mp4?vrtx=view-as-webpage

The slides for week 40 are at https://compphysics.github.io/MachineLearning/doc/web/course.html, scroll down to week 40.

The reading suggestions are chapter 10 of Aurelien Geron's text and chapter 11 of Hastie et al. For stochastic gradient descent, we recommend chapter 4 of Geron's text.

The slides for week 39 have been updated to reflect what we covered last week, including the videos and handwritten notes.

Best wishes to you all and cya at the lab today,

Kristian, Michael, Morten, Nicolai, Per-Dimitri, Stian and Øyvind

Published Sep. 30, 2020 07:42 - Last modified Sep. 30, 2020 07:42