Introduction to UvA Deep Learning Tutorials
The UvA Deep Learning Tutorials project offers a comprehensive series of Jupyter notebooks that serve as an educational companion to a deep learning course. The series aims to reinforce the theoretical concepts presented in lectures with practical, interactive implementations. Created by Phillip Lippe, the tutorials cover a wide range of topics, from the basics of PyTorch and PyTorch Lightning to optimization techniques, transformers, and graph neural networks. JAX+Flax translations of the notebooks are also available, offering an alternative implementation framework.
Course Details
The current course edition runs from October 28 to December 20, 2024, and the notebooks are updated throughout the course. To support learning, video recordings are available through a dedicated YouTube playlist. Additionally, the notebooks are published in HTML format on the project's Read the Docs (RTD) website for easy reading on any device.
Using the Notebooks
Students and learners can engage with the notebooks in various ways, depending on their resources and preferences:
- Locally on CPU: The notebooks are available in the project's GitHub repository and are designed to run on a standard laptop without a GPU. Pretrained models can be downloaded through the notebooks or manually from Google Drive, with a total storage requirement under 1 GB. A conda environment is provided to install all necessary packages.
- Google Colab: For those preferring cloud computing, the notebooks can be run on Google Colab, which supports GPU acceleration. Each notebook offers a link to open it in Colab, where users should remember to enable GPU support under Runtime -> Change runtime type (see the sketch after this list for a quick way to check that the GPU is visible).
- Snellius cluster: For larger, more intensive training tasks, the Snellius cluster is an option. However, it requires converting the notebooks to scripts and is intended for users deeply involved in model training.
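As a quick environment check before working through a notebook, a short snippet along the following lines (a minimal sketch, not taken from the tutorials themselves) confirms whether PyTorch can see a GPU and falls back to the CPU otherwise:

```python
import torch

# Pick the GPU when one is available (e.g. after enabling it in Colab
# via Runtime -> Change runtime type), otherwise fall back to the CPU.
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
print(f"Using device: {device}")

# The notebooks are designed to also run on CPU, so a small sanity check
# such as this matrix multiplication should work in either setting.
x = torch.randn(4, 4, device=device)
print(x @ x.T)
```

On Colab with GPU support enabled this prints `cuda`; on a laptop without a GPU it prints `cpu`, and the notebooks are written to work in both cases.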
Tutorial Session Format
The tutorials are presented during the first hour of each group session, where the content and implementation details are explained. While these tutorials are not part of mandatory graded assignments, they are highly recommended as they support understanding and aid in exam preparation.
Topics Covered
The tutorials are designed to align with the course lectures, providing practical insights into each subject. Some of the key tutorials include:
- Introduction to PyTorch
- Activation functions
- Optimization and Initialization
- Inception, ResNet, and DenseNet architectures
- Transformers and Multi-Head Attention
- Graph Neural Networks
- Adversarial attacks
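To give a flavour of the hands-on style, the sketch below is illustrative only: it uses PyTorch's built-in nn.MultiheadAttention rather than the tutorial's own implementation, but shows the kind of building block the Transformers and Multi-Head Attention tutorial works with:

```python
import torch
import torch.nn as nn

# Illustrative sketch of a multi-head attention layer, the core building
# block discussed in the Transformers and Multi-Head Attention tutorial.
attention = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)

# Batch of 2 sequences, each with 10 tokens of dimension 64.
x = torch.randn(2, 10, 64)

# Self-attention: queries, keys, and values all come from the same input.
output, weights = attention(x, x, x)
print(output.shape)   # torch.Size([2, 10, 64])
print(weights.shape)  # torch.Size([2, 10, 10])
```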
Feedback and Contributions
These tutorials are being presented for the first time in this course. Feedback is encouraged to enhance the educational experience and improve future editions. Students are invited to report bugs or offer suggestions via the feedback form or by contacting Phillip Lippe directly.
A citation format is provided for those who wish to reference the tutorials in their own work.
Conclusion
The UvA Deep Learning Tutorials offer a rich resource for students to immerse themselves in the practical aspects of deep learning. By blending theoretical understanding with hands-on experience, these tutorials empower learners to deepen their knowledge and skillset in the rapidly evolving field of deep learning.