Practical Tutorials for the Machine Learning for Natural Language Processing 1 Lecture at the University of Zurich for the Fall Semester 2025.
In this repository, you will find the notebooks for the tutorials and exercises for the course. The tutorials are designed to help you understand and put into practice the theoretical concepts discussed in the lecture. They are interactive, and you are encouraged to experiment with the code and try different variations to deepen your understanding.
- exercises: Contains the code samples provided for completing the exercises.
- tutorials_notebooks_in_class: Contains the code samples presented within the tutorials.
- lecture_supplementaryl_code_samples: Contains the code samples presented or given within the lecture.
- tutorials_supplementary_code_samples: Contains supplementary code samples provided as part of the tutorials.
From linear to deep learning models for text classification. In this exercise, you will implement a simple linear model for text classification using the sklearn library. You will then extend the model to a deep learning model using the skorch library.
See the exercise sheet for more details:
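A minimal sketch of the linear starting point for this exercise, using a scikit-learn pipeline; the toy texts and labels below are illustrative assumptions, not the exercise data. The skorch extension then wraps a PyTorch module in the same `fit`/`predict` interface.

```python
# Minimal sketch of a linear text classifier with scikit-learn.
# The toy corpus and labels are illustrative, not from the exercise.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "the movie was great and fun",
    "a wonderful, touching film",
    "boring plot and bad acting",
    "terrible film, a waste of time",
]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

# TF-IDF features feed a logistic regression classifier.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

pred = clf.predict(["what a great film"])
```

Because the pipeline bundles vectorizer and classifier, `predict` accepts raw strings directly.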
Building word embeddings with PyTorch. In this exercise, you will implement a Continuous Bag of Words Model using the torch library. See the exercise sheet for more details:
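The core of a CBOW model can be sketched in a few lines of PyTorch: average the embeddings of the context words and predict the centre word from that average. The vocabulary size, window, and training data below are toy assumptions.

```python
# Minimal CBOW sketch in PyTorch: predict a centre word from the mean
# of its context-word embeddings. All sizes and data here are toys.
import torch
import torch.nn as nn

class CBOW(nn.Module):
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.out = nn.Linear(embed_dim, vocab_size)

    def forward(self, context):  # context: (batch, window) of word ids
        averaged = self.embeddings(context).mean(dim=1)  # (batch, embed_dim)
        return self.out(averaged)  # scores over the vocabulary

vocab_size, embed_dim = 10, 8
model = CBOW(vocab_size, embed_dim)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

# A few toy training steps: context word ids -> centre word id.
context = torch.tensor([[1, 2, 4, 5]])  # window of 4 context words
target = torch.tensor([3])              # the centre word
for _ in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(context), target)
    loss.backward()
    optimizer.step()
```

After training, `model.embeddings.weight` holds the learned word vectors.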
Also known as Exercise AB. Handled by Simon Clematide. Please see the OLAT course for details.
Named Entity Recognition using Transformer Encoders. In this exercise, you will fine-tune a BERT model using the HuggingFace library. Additionally, you will experiment with GLiNER to anonymize texts.
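One preprocessing step this exercise involves is aligning word-level BIO labels with the subword tokens a BERT tokenizer produces. Here is a pure-Python sketch of that alignment; the `split` function below is a hypothetical stand-in for a real WordPiece tokenizer, and the example sentence is an assumption.

```python
# Sketch of aligning word-level BIO labels to subword tokens, a step
# needed when fine-tuning BERT-style encoders for NER. The toy splitter
# below stands in for a real WordPiece tokenizer.
def align_labels(words, labels, subword_split):
    """Expand word-level BIO labels to the subword level.

    The first subword keeps the word's label; continuation subwords of a
    B- word get the matching I- label (a common convention).
    """
    tokens, token_labels = [], []
    for word, label in zip(words, labels):
        pieces = subword_split(word)
        tokens.extend(pieces)
        token_labels.append(label)
        cont = "I-" + label[2:] if label.startswith("B-") else label
        token_labels.extend([cont] * (len(pieces) - 1))
    return tokens, token_labels

# Hypothetical subword splitter: breaks words longer than 4 characters.
split = lambda w: [w[:4], "##" + w[4:]] if len(w) > 4 else [w]

words = ["Angela", "Merkel", "visited", "Zurich"]
labels = ["B-PER", "I-PER", "O", "B-LOC"]
tokens, token_labels = align_labels(words, labels, split)
```

The same logic is what `tokenizer(..., is_split_into_words=True)` plus `word_ids()` lets you implement with a real HuggingFace tokenizer.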
LLM Prompting, Prompt Engineering and small LLM fine-tuning using Unsloth AI
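Prompt engineering often starts with assembling a few-shot prompt programmatically. A small sketch, where the task, template, and examples are illustrative assumptions rather than part of the exercise:

```python
# Illustrative few-shot prompt builder for an instruction-tuned LLM.
# Task, template, and examples are assumptions for demonstration.
def build_prompt(instruction, examples, query):
    parts = [instruction, ""]
    for text, label in examples:
        parts.append(f"Text: {text}\nLabel: {label}\n")
    parts.append(f"Text: {query}\nLabel:")
    return "\n".join(parts)

prompt = build_prompt(
    "Classify the sentiment of each text as positive or negative.",
    [("I loved this film.", "positive"),
     ("The plot made no sense.", "negative")],
    "A delightful surprise from start to finish.",
)
```

Ending the prompt with `Label:` cues the model to complete the pattern established by the in-context examples.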
Topic Modeling using LDA and CTM
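As a warm-up for this topic, LDA can be fit in a few lines with scikit-learn; the corpus and the choice of two topics below are toy assumptions.

```python
# Minimal LDA sketch with scikit-learn: fit topics on a toy corpus.
# The documents and topic count are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "neural networks learn word embeddings",
    "word embeddings capture semantics",
    "stock markets and interest rates",
    "interest rates affect stock prices",
]

counts = CountVectorizer().fit_transform(docs)  # bag-of-words counts
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)          # (n_docs, n_topics)
```

Each row of `doc_topics` is a document's distribution over topics and sums to one; CTMs replace the bag-of-words input with contextualized sentence embeddings.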