README.md (24 additions, 4 deletions)
Analysis and Machine Learning, spanning from weekly plans to lecture
material and various reading assignments. The emphasis is on deep
learning algorithms, starting with the mathematics of neural networks
(NNs), moving on to convolutional NNs (CNNs) and recurrent NNs (RNNs),
autoencoders, graph neural networks and other dimensionality reduction methods, to finally
discuss generative methods. These will include Boltzmann machines,
variational autoencoders, generative adversarial networks, diffusion methods and others.
FYS5429 zoom link to be announced when semester starts

All teaching material is available from this GitHub link.

The course can also be used as a self-study course. Besides the
lectures, many of you may wish to work independently on your own projects, related for example to your thesis or research. In
general, in addition to the lectures, we have often followed five main
paths:

- Projects (two in total) and exercises that follow the lectures
- The coding path. This often leads to a single project where one focuses on coding, for example, CNNs, RNNs or parts of LLMs from scratch.
- The Physics-Informed Neural Network (PINN) path. Here we define some basic PDEs which are solved using PINNs. We normally start with studies of selected differential equations using NNs, RNNs, GNNs and/or autoencoders before moving over to PINNs.
- The own-data path. Some of you may have data you wish to analyze with different deep learning methods.
- The Bayesian ML path. This is not covered by the present lecture material and normally leads to independent self-study work.
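As an illustration of the coding path, the sketch below implements the forward pass of a vanilla RNN from scratch in NumPy. It is not part of the course material; all names and dimensions are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions chosen for illustration only
n_input, n_hidden, seq_len = 3, 5, 4

# Parameters of a vanilla RNN cell: h_t = tanh(W_x x_t + W_h h_{t-1} + b)
W_x = rng.standard_normal((n_hidden, n_input)) * 0.1
W_h = rng.standard_normal((n_hidden, n_hidden)) * 0.1
b = np.zeros(n_hidden)

def rnn_forward(xs, h0=None):
    """Run a sequence xs of shape (seq_len, n_input) through the RNN,
    returning the hidden state at every time step."""
    h = np.zeros(n_hidden) if h0 is None else h0
    hs = []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h + b)
        hs.append(h)
    return np.array(hs)

xs = rng.standard_normal((seq_len, n_input))
hs = rnn_forward(xs)
print(hs.shape)  # (4, 5): one hidden state per time step
```

A from-scratch project along this path would add the backward pass (backpropagation through time) and a training loop on a small sequence dataset.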
## January 20-24: Presentation of course, review of neural networks and deep learning, and discussion of possible projects

- Presentation of course and overview
## March 3-7

- Recurrent neural networks and codes
- Long Short-Term Memory (LSTM) networks and applications to differential equations
- Graph neural networks (GNNs)
- Slides at https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/pub/week7/ipynb/week7.ipynb
- Whiteboard notes at https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/HandwrittenNotes/2025/NotesFebruary27.pdf
- Recommended reading: Goodfellow et al., chapter 10, and Raschka et al., chapters 15 and 18
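To make the LSTM gate structure concrete, here is a minimal NumPy sketch of a single LSTM time step (our own illustration, not course code; the gate ordering and dimensions are arbitrary conventions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W has shape (4*n_h, n_x + n_h), b has shape (4*n_h,).
    Gate order in the stacked weights: input, forget, output, candidate."""
    n_h = h_prev.size
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[:n_h])           # input gate
    f = sigmoid(z[n_h:2 * n_h])    # forget gate
    o = sigmoid(z[2 * n_h:3 * n_h])  # output gate
    g = np.tanh(z[3 * n_h:])       # candidate cell state
    c = f * c_prev + i * g         # new cell state
    h = o * np.tanh(c)             # new hidden state
    return h, c

rng = np.random.default_rng(1)
n_x, n_h = 3, 4
W = rng.standard_normal((4 * n_h, n_x + n_h)) * 0.1
b = np.zeros(4 * n_h)
h, c = lstm_step(rng.standard_normal(n_x), np.zeros(n_h), np.zeros(n_h), W, b)
print(h.shape, c.shape)  # (4,) (4,)
```

The cell state `c` is what lets gradients flow across long time lags, which is why LSTMs are a natural starting point for sequence models of differential equations.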
## March 10-14

- GNNs
- Autoencoders and PCA
- Slides at https://github.com/CompPhysics/AdvancedMachineLearning/blob/main/doc/pub/week8/ipynb/week8.ipynb
- Recommended reading: Goodfellow et al., chapter 14 for autoencoders, and Raschka et al., chapter 18
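A useful bridge between the two topics of this week: a linear autoencoder with a bottleneck of size k learns (up to rotation) the same subspace as the k leading principal components. A minimal NumPy sketch of PCA via the SVD, on synthetic data of our own choosing:

```python
import numpy as np

def pca(X, k):
    """Project X of shape (n_samples, n_features) onto its k leading principal components."""
    Xc = X - X.mean(axis=0)                  # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                      # principal directions (rows)
    scores = Xc @ components.T               # projected data
    explained = S[:k] ** 2 / np.sum(S ** 2)  # explained-variance ratios
    return scores, components, explained

rng = np.random.default_rng(2)
# Synthetic data that is essentially one-dimensional plus small noise
t = rng.standard_normal(200)
X = np.column_stack([t, 2 * t, -t]) + 0.01 * rng.standard_normal((200, 3))
scores, comps, explained = pca(X, k=1)
print(scores.shape, round(explained[0], 3))  # first component explains ~all variance
```

Training a one-hidden-layer linear autoencoder on the same data and comparing its bottleneck representation with `scores` is a good warm-up exercise for this week.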