Semester page for FYS-STK4155 - Autumn 2022


The deadline for project 3 is now December 18 at midnight.

Best wishes to you all.

11 Dec 2022, 21:47

We will try to organize a final lab session for those interested this coming Thursday (December 15) from 10am to 12pm, both online (same Zoom link as before) and in person at FØ434.
Best wishes to you all and don't hesitate to swing by if you have questions.

Morten et al

11 Dec 2022, 21:45

Dear all, we hope you are all doing well during these hectic exam times.
Since our last regular lab sessions and lectures ended last week (Friday), we thought of organizing some additional question-and-answer sessions this week. Since many of you are probably busy working on exams, we thought of offering these sessions primarily via Zoom, but also in person for those of you who prefer that or are able to come.
This week we are planning Zoom and in-person Q&A and help sessions on Thursday 2pm-4pm and Friday 2pm-4pm. We will use the same Zoom link as we used for the lectures. If you wish to attend in person, we will be at our lab FØ434 at the same times.
Don't hesitate to come by with questions, or just swing by our offices.

Best wishes to you all,
Morten et al

P.S. We are hoping to be able to send you all feedback on project 2 by the middle of next week.

30 Nov 2022, 07:49

Hi all, this is sadly our last week. After our discussions of support vector machines last week (see the lecture notes and videos), we are now ready to wrap up the semester by scratching the surface of unsupervised learning methods. We will focus on the standard principal component analysis (PCA) method (which allows us to revisit the correlation and covariance matrices and the SVD) and one of the simplest (and very intuitive) clustering methods, namely k-means clustering. You will find all this wonderful material, plus a summary and more, by jumping into the lecture slides for week 47, see for example https://compphysics.github.io/MachineLearning/doc/pub/week47/html/week47-reveal.html.
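
If you would like a quick hands-on preview before the lectures, here is a minimal sketch (not taken from the week 47 notes; the synthetic data, the choice of two components and of three clusters are arbitrary placeholders) showing PCA computed via the SVD of the centered data matrix, followed by k-means clustering with scikit-learn:

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2022)
X = rng.normal(size=(200, 5))            # placeholder data: 200 samples, 5 features

# PCA via the SVD: center the data; the right singular vectors of the
# centered matrix are the principal directions, and the squared singular
# values divided by n-1 are the eigenvalues of the covariance matrix.
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
explained_variance = S**2 / (X.shape[0] - 1)
Z = X_centered @ Vt[:2].T                # project onto the first two components

# k-means clustering on the two leading components (k=3 is an arbitrary choice).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=2022).fit(Z)
print(explained_variance)
print(kmeans.labels_[:10], kmeans.cluster_centers_)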

In addition, see also

  - Geron's chapter 9 on PCA

  - Hastie et al Chapter 13 (sections 13.1-13.2 are the most relevant ones)

  - and excellent videos at:

  - We recommend hi...

21 Nov 2022, 22:03

Hi all, 
this is just a quick reminder that the final deadline for project 2 is now set to Friday the 18th (midnight). We pushed it from last Friday to Wednesday and then finally to Friday this week. 
Also, feel free to come with suggestions for topics for project 3, to be presented on Friday this week.
We will also discuss this during the lecture on Thursday.

Finally, here are some general observations from us about project 1. Hopefully these remarks can be of use when you wrap up the report for project 2 and work on project 3 as well.
 Best wishes to you all,
Morten et al.

=== Comments about project 1

Summary after corrections: 
* Many of you have written very nice codes! Thanks, this part is very good, and there were many excellent results.
* However, many of you are not used to writing scientific reports. Here are so...

16 Nov 2022, 06:49

Dear all, welcome to a new week with FYS-STK.

Last week we wrapped up our discussions of decision trees and ensemble methods based on decision trees (bagging, random forests, boosting and gradient boosting). These are all very popular methods which, in particular for classification problems, often produce excellent results for training and prediction. They are all simple to implement and have a low level of mathematical complexity. Last week we also started on our last supervised learning method, support vector machines. This topic will also keep us busy this coming week (a small code sketch follows at the end of this message). We are also planning to run a mini-workshop on possible topics for project 3. Here you'll find the topics presented by different groups in 2020 and 2021.

In 2020 the contributions were (and some of these ended up in thesis work and/or publications; 2020 was online only due to Covid-19!)

  • Maria Emine Nylund: Lego Bricks Classifier...
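
As promised above, here is a small support vector machine sketch (a scikit-learn illustration with an arbitrary synthetic dataset and untuned hyperparameters, not part of the project material): a soft-margin classifier with a Gaussian/RBF kernel.

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder two-class data; replace with your own design matrix and targets.
X, y = make_moons(n_samples=300, noise=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Soft-margin SVM with an RBF kernel; C and gamma should normally be tuned,
# for example with cross-validation.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))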

13 Nov 2022, 10:36

Dear all, welcome to a new week with FYS-STK.

Last week we went through the basic algorithms of decision trees for classification and regression, with an emphasis on the so-called CART algorithm. We also discussed so-called ensemble methods like bagging, voting and random forests before starting with boosting methods. The latter also use decision trees as weak learners. We will go through the details of these methods this week and discuss AdaBoost and gradient boosting. If we get time, we may start with support vector machines, our second-to-last topic this semester. Also, note that the deadline for project 2 is now set to Wednesday November 16, since we also postponed the deadline for project 1 by five days.
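
As a small appetizer for the boosting discussion (a sketch with scikit-learn; the synthetic data and hyperparameters below are arbitrary placeholders, not recommendations), AdaBoost and gradient boosting with shallow trees as weak learners can be set up as follows.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Placeholder classification data.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: scikit-learn's default weak learner is a depth-1 decision tree (a stump).
ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=0)

# Gradient boosting: shallow trees fitted sequentially to the gradient of the loss.
gbm = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3,
                                 random_state=0)

for name, clf in [("AdaBoost", ada), ("Gradient boosting", gbm)]:
    clf.fit(X_train, y_train)
    print(name, "test accuracy:", clf.score(X_test, y_test))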

 

Lab Wednesday and Thursday: work on project 2, new deadline November 16 at midnight.

Lecture Thursday: Boosting m...

8 Nov 2022, 08:14

Important note: Due to the "High-School teachers' week" (Faglig pedagogisk dag in Norwegian) at the University of Oslo, our lecture hall is occupied on Thursday and our Thursday lecture has to be on Zoom only.
It will be recorded as usual. On Friday we are back to our regular auditorium. We apologize for this inconvenience.

2 Nov 2022, 06:35

Dear all, welcome to a new week and a new topic (our third-to-last).

Last week we ended our discussion of deep learning methods with convolutional neural networks and recurrent neural networks. This week we start with another set of very popular methods for both classification and regression. We will start with decision trees, then move over to ensembles of decision trees (random forests, bagging and other methods), and end with boosting methods and gradient boosting.
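
For those who want to play around before the lectures, here is a minimal sketch (using scikit-learn, whose tree implementation is based on an optimized version of the CART algorithm; the dataset, tree depth and number of estimators are arbitrary placeholders) comparing a single decision tree with the bagging and random forest ensembles mentioned above.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A standard binary classification dataset as a placeholder.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# A single CART-style tree, limited in depth to reduce overfitting.
tree = DecisionTreeClassifier(max_depth=4, random_state=1).fit(X_train, y_train)

# Bagging: many trees trained on bootstrap samples of the training data.
bag = BaggingClassifier(n_estimators=100, random_state=1).fit(X_train, y_train)

# Random forest: bagging plus a random subset of features at each split.
forest = RandomForestClassifier(n_estimators=100, random_state=1).fit(X_train, y_train)

for name, clf in [("Single tree", tree), ("Bagging", bag), ("Random forest", forest)]:
    print(name, "test accuracy:", clf.score(X_test, y_test))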

The plans for this week are

Lab Wednesday and Thursday: work on project 2

Lecture Thursday: Basics of decision trees, classification and regression algorithms

Lecture Friday: Decision trees and  ensemble models (bagging and random forests)

Teaching material: Lecture notes week 44 at https://compphysics.gi...

2 Nov 2022, 06:27

Dear all, 

last week we applied neural networks to the solution of differential equations and started our discussion of convolutional neural networks. The plan this week is as follows

* Lab: Wednesday and Thursday work on project 2. The lab on Wednesdays is only at FØ434.

* Lecture Thursday: Convolutional Neural Networks (CNN)

* Lecture Friday: Recurrent Neural Networks (RNN)
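
As a complement to Thursday's lecture, here is a minimal CNN sketch (our illustration, using TensorFlow/Keras as one possible choice of library; the MNIST data, the architecture and the single training epoch are just placeholders) for image classification:

from tensorflow import keras
from tensorflow.keras import layers

# MNIST as a stand-in dataset: 28x28 grayscale digit images.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0     # add a channel axis and rescale to [0, 1]
x_test = x_test[..., None] / 255.0

# A small CNN: two convolution/pooling blocks followed by a dense classifier.
model = keras.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=64, validation_split=0.1)
print("Test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])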

 

=== Video recommendations

Video on Convolutional Neural Networks from MIT at https://www.youtube.com/watch?v=iaSUYvmCekI&ab_channel=AlexanderAmini

Video on Recurrent Neural Networks from MIT at https://www.youtube.com/watch?v=SEnXr6v2ifU&ab_channel=AlexanderAmini

=== Reading recommendations

 

CNN readings

Goodfe...

26 Oct 2022, 06:15