This is the repository for the upcoming learning group meetup in October, taking place in Vienna and based on the fast.ai v3 part 2 course, fastai v2 library development, and PyTorch v1.2. (See also the repository from the previous fastai PyTorch course in Vienna v1, which was based on the fast.ai v3 part 1 course material.)
❗ Please register in order to get updates on the meetups.
For this learning group meetup you are expected to have basic knowledge of deep learning or to have gone through the fast.ai v3 part 1 course material, and to have at least one year of programming experience. You should feel comfortable programming in Python and have basic knowledge of calculus and linear algebra. Some machine learning background is advised to make the best use of the course.
- Lesson 8: 16.10.2019 18:00-20:00 - Matrix multiplication; forward and backward passes - Michael Pieler
- Lesson 9: 6.11.2019 18:30-20:30 - Loss functions, optimizers, and the training loop - Liad Magen & Thomas Keil
- Lesson 10: 20.11.2019 18:30-20:30
- Lesson 11: exact date to be announced soon
- Lesson 12: 18.12.2019 18:30-20:30
- Lesson 13: tba
- Lesson 14: tba
Note: All the learning group meetups will take place at Nic.at, Karlsplatz 1, 1010 Wien.
(The first lesson starts with number 8 because the part 1 course covered lessons 1 to 7.)
- To dos before the lesson:
- watch the fastai lesson 8 (timlee lesson notes)
- run the matrix multiplication and the forward and backward pass notebooks
- Do not worry, the first lesson is quite dense and we will tackle the building blocks piece by piece! :-)
- Matrix multiplication on German Wikipedia (the German version has better visualisations)
- Animated matrix multiplication
- Broadcasting visualisation
- Refresh your PyTorch basics with the learning material from our previous fast.ai v3 part 1 learning group.
- Get familiar with PyTorch einsum to build more intuition for matrix multiplication (see the matrix multiplication sketch after this list).
- What is torch.nn really? (This nicely explains the steps needed to train a deep learning model with PyTorch. It covers torch.nn, torch.optim, Dataset, and DataLoader. This setup is a "blueprint" for a deep learning library based on PyTorch; see the minimal training loop sketch after this list.)
- fastai v2 dev test setup
- Go deeper with DL debugging, troubleshooting (pdf or video), and how to avoid it in the first place (i.e., the Karpathy recipe).
- Why understanding backprop can be important for debugging.
- Xavier Glorot and Kaiming He init (see the weight init sketch after this list)
- Publications:
- Matrix calculus for DL (web) (arxiv)
- Xavier Glorot init
- Kaiming He init
- Fixup init
- If you want to present one of the papers in this or one of the next lectures, reach out to us via email! :-)
- If you want to know more about matrix multiplication & Co. on your (Nvidia) GPU.
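A minimal sketch of the matrix multiplication ideas above (explicit loops, broadcasting, and einsum); the helper names and toy shapes are made up for illustration and are not the course notebooks:

```python
import torch

# Toy example: 5 inputs with 3 features, weight matrix of shape 3x4.
a = torch.randn(5, 3)
b = torch.randn(3, 4)

def matmul_loops(a, b):
    # Naive version with explicit loops over rows and columns.
    ar, ac = a.shape
    br, bc = b.shape
    assert ac == br
    out = torch.zeros(ar, bc)
    for i in range(ar):
        for j in range(bc):
            out[i, j] = (a[i, :] * b[:, j]).sum()
    return out

def matmul_broadcast(a, b):
    # One loop removed: a[i] has shape (3,), so a[i].unsqueeze(-1) is (3, 1)
    # and broadcasts against b with shape (3, 4); summing over dim 0 gives row i.
    out = torch.zeros(a.shape[0], b.shape[1])
    for i in range(a.shape[0]):
        out[i] = (a[i].unsqueeze(-1) * b).sum(dim=0)
    return out

def matmul_einsum(a, b):
    # einsum spells out the index contraction explicitly: "ik,kj->ij".
    return torch.einsum('ik,kj->ij', a, b)

for f in (matmul_loops, matmul_broadcast, matmul_einsum):
    assert torch.allclose(f(a, b), a @ b, atol=1e-6)
```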
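A minimal training loop sketch in the spirit of the "What is torch.nn really?" tutorial, wiring together torch.nn, torch.optim, Dataset, and DataLoader; the synthetic data and model sizes are arbitrary:

```python
import torch
from torch import nn, optim
from torch.utils.data import TensorDataset, DataLoader

# Synthetic regression data, just to have something to iterate over.
x = torch.randn(256, 10)
y = x @ torch.randn(10, 1) + 0.1 * torch.randn(256, 1)

train_ds = TensorDataset(x, y)                                # Dataset: indexable (x, y) pairs
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)  # DataLoader: batching + shuffling

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))  # torch.nn
opt = optim.SGD(model.parameters(), lr=0.1)                            # torch.optim
loss_func = nn.MSELoss()

for epoch in range(3):
    for xb, yb in train_dl:
        loss = loss_func(model(xb), yb)    # forward pass
        loss.backward()                    # backward pass: compute gradients
        opt.step()                         # update parameters
        opt.zero_grad()                    # reset gradients for the next batch
    print(epoch, loss.item())
```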
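A small weight init sketch for the two schemes listed above, using PyTorch's built-in helpers (layer sizes are arbitrary); the last lines write the Kaiming formula out by hand for intuition:

```python
import torch
from torch import nn

lin = nn.Linear(100, 50)  # weight shape (50, 100), i.e. fan_in = 100

# Xavier/Glorot init, suited to symmetric activations such as tanh:
# weights ~ N(0, 2 / (fan_in + fan_out))
nn.init.xavier_normal_(lin.weight)

# Kaiming/He init, which accounts for ReLU zeroing roughly half the activations:
# weights ~ N(0, 2 / fan_in)
nn.init.kaiming_normal_(lin.weight, nonlinearity='relu')

# The Kaiming idea written out by hand (normal variant, fan_in mode):
fan_in = lin.weight.shape[1]
with torch.no_grad():
    lin.weight.normal_(0, (2.0 / fan_in) ** 0.5)
print(lin.weight.std())  # close to sqrt(2 / 100) ≈ 0.14
```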
- To dos before the lesson:
- watch the fastai lesson 9
- run the lesson 9 notebook
- tba
- To dos before the lesson:
- watch the fastai lesson 10
- run the lesson 10 notebook
- tba
- To dos before the lesson:
- watch the fastai lesson 11
- run the lesson 11 notebook
- tba
- To dos before the lesson:
- watch the fastai lesson 12
- run the lesson 12 notebook
- tba
- To dos before the lesson:
- watch the fastai lesson 13
- run the lesson 13 notebook
- tba
- To dos before the lesson:
- watch the fastai lesson 14
- run the lesson 14 notebook
- tba
- fast.ai v3 part 2 course details
- fast.ai v3 part 2 course material (this should be the first place to look if you are searching for something)
- fast.ai v3 part 2 course notebooks
- fastai v1 docs (this should be the second place to look if you are searching for something)
- fastai v2 dev repo (We will have a look at the notebooks used for the development of fastai v2 to see how the different parts end up in the library.)
- fast.ai forum (this should be the third place to look if you are searching for something)
- TWiML fast.ai v3 part 2 study group material
- Learning tips
- Do not forget, the path to mastery is not a straight line! (From the book "Chop Wood Carry Water".)
- Please feel free to send us suggestions!