Weekly update for week 42

Weekly update FYS-STK3155/4155

Hi all. Once again, we hope the week has started in the best possible way for everybody.

A longer mail this time, sorry for that.

1) There is a new version of project 2, now with the accuracy function used only for the logistic regression part (part c) and the neural network classification part (part e).
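
For reference, the accuracy score used there is simply the fraction of correctly classified samples. A minimal NumPy sketch (the function name is ours):

import numpy as np

def accuracy(targets, predictions):
    # fraction of samples where the predicted class matches the target
    return np.mean(targets == predictions)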


2) I have fixed typos in the equations in the neural network part and added expressions for the derivatives of the cost function for the cross-entropy version. Please let me know if you spot more typos! I appreciate all feedback here. Don't hesitate to mail me or use Piazza if you find topics which are not clearly spelled out.
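
For completeness, the key result for a sigmoid output $a = \sigma(z)$ with target $t$: the cross-entropy cost for a single sample is
$$ C = -\left[\, t\ln a + (1-t)\ln(1-a) \,\right], $$
and since $\sigma'(z) = \sigma(z)\bigl(1-\sigma(z)\bigr)$, the derivative with respect to the pre-activation simplifies to
$$ \frac{\partial C}{\partial z} = a - t. $$
This is the quantity that enters the output layer in the backpropagation algorithm.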


3) For project 2, you may wish to try activation functions other than the standard sigmoid. The various ReLU variants and the ELU discussed in the neural network lectures (see towards the end of https://compphysics.github.io/MachineLearning/doc/pub/NeuralNet/html/NeuralNet.html) are commonly used nowadays and tend to yield more stable results for the backpropagation algorithm.
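
If you want to experiment, here is a minimal NumPy sketch of these activations and their derivatives (function and parameter names are ours, not from the lecture notes):

import numpy as np

def sigmoid(z):
    # standard logistic sigmoid
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # rectified linear unit: max(0, z)
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # small slope alpha for negative inputs helps avoid "dead" neurons
    return np.where(z > 0, z, alpha * z)

def elu(z, alpha=1.0):
    # exponential linear unit: smooth saturation for z < 0
    return np.where(z > 0, z, alpha * (np.exp(z) - 1.0))

def relu_prime(z):
    # derivative needed in the backward pass
    return np.where(z > 0, 1.0, 0.0)

def elu_prime(z, alpha=1.0):
    return np.where(z > 0, 1.0, alpha * np.exp(z))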


4) You may also wish to explore various gradient descent approaches for parts c)-e). It seems that stochastic gradient descent with mini-batches works best. Feel free to try the variants out; see the sketch below this item.

This is also the topic of Thursday's lecture; we will revisit the gradient descent material with an emphasis on what is needed for project 2, see https://compphysics.github.io/MachineLearning/doc/pub/Splines/html/Splines-bs.html.
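
As a starting point, here is a minimal sketch of mini-batch stochastic gradient descent, assuming you supply a function grad(X_batch, y_batch, theta) that returns the gradient of your cost function over a batch (all names here are illustrative):

import numpy as np

def sgd(X, y, grad, theta, n_epochs=100, batch_size=32, eta=0.01):
    # plain mini-batch SGD with a fixed learning rate eta
    n = X.shape[0]
    for epoch in range(n_epochs):
        perm = np.random.permutation(n)  # reshuffle the data every epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            theta = theta - eta * grad(X[idx], y[idx], theta)
    return theta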


5) Otherwise, this week we will wrap up the discussion of neural networks for supervised learning, including a second look at how to solve differential equations with neural networks. We will also say something about convolutional neural networks, mainly with an eye on TensorFlow applications. We will then move on to other techniques for supervised learning, namely k-nearest neighbors, support vector machines and various tree and forest models. This concludes our supervised learning material and will keep us busy until the end of October. After that we will start with unsupervised learning and end the semester with Bayesian statistics.


6) Lucas (Charpentier) made the heroic effort of deriving the backpropagation algorithm from the text of Hastie et al. (see the relevant chapter there). I attach this heroic effort to this mail; see below for the file. Thanks a million, Lucas!
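
The attached notes are the proper reference; as a mere code companion, here is a minimal NumPy sketch of one backpropagation step for a network with a single hidden layer, sigmoid activations and the cross-entropy cost (all variable names are ours):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(X, t, W1, b1, W2, b2, eta=0.1):
    # forward pass
    a1 = sigmoid(X @ W1 + b1)      # hidden-layer activations
    a2 = sigmoid(a1 @ W2 + b2)     # output probabilities
    # backward pass; the output error is simply a2 - t for
    # cross-entropy with a sigmoid output (see point 2 above)
    delta2 = a2 - t
    delta1 = (delta2 @ W2.T) * a1 * (1.0 - a1)
    # gradient-descent updates
    W2 -= eta * a1.T @ delta2
    b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * X.T @ delta1
    b1 -= eta * delta1.sum(axis=0)
    return W1, b1, W2, b2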


7) Finally, I have added a description of how we evaluate and grade the projects, and how we set the final mark, to the GitHub folder, see https://github.com/CompPhysics/MachineLearning/tree/master/doc/Projects/EvaluationGrading. I hope you'll find it useful.


Best wishes for the week,

The ML gang: Morten, Bendik, Kristine and Øyvind.


Published Oct. 16, 2018 2:41 PM - Last modified Oct. 16, 2018 2:41 PM