Weekly plans and update for week 39

Hello folks, here is a quick update with the plans for this week and for project 1.

Last week we discussed logistic regression and gradient descent methods, and we barely scratched the surface of the stochastic gradient descent (SGD) family of methods. This will be the topic for Thursday's lecture (with possible migration into Friday's lecture). We will discuss how to set up SGD for both linear regression and logistic regression (classification) problems, with examples. We will also discuss how to use automatic differentiation, an extremely useful algorithm included in the autograd library. On Friday we move on to neural networks. That will keep us busy next week as well.
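To make this concrete, here is a minimal sketch (not the lecture code) of plain minibatch SGD for ordinary least squares, with the gradient supplied by autograd's automatic differentiation. The data, learning rate, number of epochs and batch size are all illustrative choices.

```python
# Minimal sketch: minibatch SGD for OLS, gradient via autograd.
# Assumes autograd is installed (pip install autograd).
import autograd.numpy as np
from autograd import grad

np.random.seed(2019)

# Synthetic data: y = 4 + 3x + noise, design matrix [1, x]
n = 100
x = np.random.rand(n, 1)
y = 4 + 3 * x + 0.1 * np.random.randn(n, 1)
X = np.hstack([np.ones((n, 1)), x])

def cost(beta, X, y):
    """Mean squared error for linear regression."""
    residual = np.dot(X, beta) - y
    return np.mean(residual ** 2)

# autograd builds the gradient of the cost with respect to beta
cost_grad = grad(cost)

beta = np.random.randn(2, 1)
eta = 0.1          # learning rate
n_epochs = 50
batch_size = 5
n_batches = n // batch_size

for epoch in range(n_epochs):
    indices = np.random.permutation(n)
    for b in range(n_batches):
        batch = indices[b * batch_size:(b + 1) * batch_size]
        beta -= eta * cost_grad(beta, X[batch], y[batch])

print(beta.ravel())   # should land near [4, 3]
```

For the logistic regression (classification) case the loop is identical; only the cost function changes (to the cross entropy), and autograd handles the new gradient automatically.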

The material for the latter can be found in, for example, the slides at https://compphysics.github.io/MachineLearning/doc/pub/NeuralNet/html/NeuralNet-bs.html


Otherwise, regarding project 1: if there is interest, we can extend the deadline by one week. I have already received questions about this from several of you.

Also, if you are still struggling with the k-fold cross-validation (it seems many of you get odd results when you use the KFold function from scikit-learn), you can choose to replace cross-validation with the bootstrap method. Most practitioners tend to use CV for the estimation of the MSE, since it tends to give better estimates for small data sets. For CV, remember that for each model (that is, each polynomial degree) you need to compute the MSE, variance and bias for each of the folds (for example $$k=5$$) and produce the averages of these quantities.
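As a concrete illustration of the averaging step, here is a minimal sketch using scikit-learn's KFold directly; the synthetic data are stand-ins for your own, and only the MSE is averaged here (the bias and variance follow the same per-fold pattern). Note the shuffle=True argument: forgetting to shuffle ordered data is a common source of the odd results mentioned above.

```python
# Minimal sketch: k-fold cross-validation of the MSE per polynomial degree.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

np.random.seed(2019)
x = np.random.rand(100, 1)
y = np.sin(2 * np.pi * x).ravel() + 0.1 * np.random.randn(100)

k = 5
kfold = KFold(n_splits=k, shuffle=True, random_state=2019)

for degree in range(1, 6):               # one model per polynomial degree
    X = PolynomialFeatures(degree).fit_transform(x)
    mse_folds = []
    for train_idx, test_idx in kfold.split(X):
        model = LinearRegression().fit(X[train_idx], y[train_idx])
        y_pred = model.predict(X[test_idx])
        mse_folds.append(mean_squared_error(y[test_idx], y_pred))
    # the average over the k folds is the CV estimate for this model
    print(f"degree {degree}: CV MSE = {np.mean(mse_folds):.4f}")
```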


Keep in mind that for the terrain data you need to fine-tune the hyperparameter $$\lambda$$ as well.
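If it helps, a $$\lambda$$ scan can be as simple as the sketch below, here with Ridge regression on stand-in data (in scikit-learn's Ridge, the alpha parameter plays the role of $$\lambda$$); the grid values are illustrative, and you would swap in your own design matrix for the terrain data.

```python
# Minimal sketch: grid scan over the hyperparameter lambda for Ridge.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

np.random.seed(2019)
X = np.random.rand(200, 10)                    # stand-in design matrix
y = X @ np.random.randn(10) + 0.1 * np.random.randn(200)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=2019)

for lmb in np.logspace(-6, 2, 9):              # illustrative lambda grid
    model = Ridge(alpha=lmb).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"lambda = {lmb:.1e}: test MSE = {mse:.4f}")
```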

Best wishes for the week to you all,

Hanna, Lucas, Morten and Stian

Published Sep. 24, 2019 07:34 - Last modified Sep. 24, 2019 07:34