A curated collection of Machine Learning tutorials created from scratch, combining theory, intuition, math, and hands-on implementation.
These Jupyter notebooks walk through ML concepts step-by-step — perfect for beginners, students, and anyone who wants to understand ML deeply rather than treating it as a black box.
Created by Deepti Jethwani.
This notebook introduces the fundamental concepts of Machine Learning:
- What ML is and how it works
- Types of learning: Supervised, Unsupervised, Reinforcement
- Key terms: features, labels, models, training, testing
- Bias–variance tradeoff
- Underfitting vs Overfitting
- Train/test split basics
- Introduction to evaluation metrics
Goal: To build a strong conceptual foundation before exploring algorithms.
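The train/test split idea above can be sketched in a few lines. This is a minimal illustration with a made-up toy dataset (the array contents and the 80/20 split ratio are illustrative, not taken from the notebook):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy dataset: 100 samples with 2 features each, and binary labels
X = np.arange(200, dtype=float).reshape(100, 2)
y = (X[:, 0] > 100).astype(int)

# Hold out 20% of the data for testing; the model never sees this part
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
print(X_train.shape, X_test.shape)  # (80, 2) (20, 2)
```

Evaluating on held-out data is what exposes overfitting: a model that memorizes the training set will score well on `X_train` but poorly on `X_test`.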
This notebook walks through digit classification using the MNIST dataset:
- Loading and exploring the MNIST dataset
- Preprocessing and normalization
- Building a classification model
- Training the model on 60,000 images
- Testing on 10,000 images
- Visualizing predictions and errors
- Evaluating accuracy
Goal: Learn end-to-end ML workflow on a real-world dataset.
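The same load → preprocess → train → evaluate workflow can be sketched compactly. MNIST itself is larger (60,000 training and 10,000 test images); the sketch below substitutes scikit-learn's bundled 8×8 digits dataset so it runs in seconds, and uses logistic regression as a stand-in classifier:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

digits = load_digits()          # 1,797 8x8 grayscale digit images
X = digits.data / 16.0          # normalize pixel values to [0, 1]
X_train, X_test, y_train, y_test = train_test_split(
    X, digits.target, test_size=0.2, random_state=0
)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(acc)  # typically above 0.9 on this dataset
```

Swapping in the real MNIST data changes only the loading and normalization lines; the workflow is identical.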
A mathematical and practical overview of Linear Regression:
- Intuition of linear relationships
- Hypothesis function
- Cost function (MSE)
- Gradient Descent explained step-by-step
- Formula derivation
- Implementing Linear Regression from scratch
- Plotting regression lines and errors
Goal: Understand the math and mechanics behind regression models.
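The gradient-descent mechanics described above can be sketched from scratch with NumPy. The synthetic data, learning rate, and iteration count here are illustrative choices, not the notebook's:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus Gaussian noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, 100)
y = 2.0 * X + 1.0 + rng.normal(0, 0.5, 100)

w, b = 0.0, 0.0   # parameters of the hypothesis h(x) = w*x + b
lr = 0.01         # learning rate
n = len(X)

for _ in range(2000):
    y_pred = w * X + b
    # Gradients of the MSE cost J = (1/n) * sum((y_pred - y)^2)
    dw = (2 / n) * np.sum((y_pred - y) * X)
    db = (2 / n) * np.sum(y_pred - y)
    w -= lr * dw
    b -= lr * db

print(w, b)  # converges near the true values 2.0 and 1.0
```

Each iteration moves `w` and `b` a small step against the cost gradient, which is exactly the update rule the derivation in the notebook arrives at.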
A complete guide to Logistic Regression for classification:
- Logistic function and sigmoid
- Why linear regression fails at classification
- Decision boundary
- Log loss / Cross-entropy
- Gradient descent for logistic regression
- Implementing logistic regression from scratch
- Evaluating model accuracy
Goal: Learn binary classification deeply.
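The sigmoid, cross-entropy gradient, and decision boundary all appear in this minimal from-scratch sketch (the toy one-feature dataset and hyperparameters are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Separable toy data: label is 1 exactly when the feature is positive
rng = np.random.default_rng(1)
X = rng.normal(0, 1, (200, 1))
y = (X[:, 0] > 0).astype(float)

w = np.zeros(1)
b = 0.0
lr = 0.1

for _ in range(1000):
    p = sigmoid(X @ w + b)            # predicted probabilities
    # Gradient of the average cross-entropy (log) loss
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

# Decision boundary: predict 1 where the probability crosses 0.5
preds = (sigmoid(X @ w + b) >= 0.5).astype(float)
acc = np.mean(preds == y)
print(acc)  # near 1.0 on this separable data
```

Note the gradient has the same form as in linear regression, with the sigmoid output in place of the raw prediction; that parallel is worked out in the notebook.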
Support Vector Machines explained with intuition and implementation:
- Maximum-margin concept
- Hyperplanes and support vectors
- Kernel trick (Linear, Polynomial, RBF)
- Training an SVM classifier
- Visualizing decision boundaries
- Understanding when SVMs perform best
Goal: Build intuition for one of the strongest classical ML models.
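The kernel trick is easiest to see on data that is not linearly separable. A minimal sketch, using scikit-learn's SVC on a concentric-circles toy dataset (parameters are illustrative defaults):

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles: impossible to separate with a straight line,
# a classic showcase for the RBF kernel
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

score = clf.score(X, y)
print(score)                      # training accuracy
print(len(clf.support_vectors_))  # the points that define the margin
```

Only the support vectors matter for the final decision boundary; the remaining points could be removed without changing the fitted model.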
A full tutorial on Decision Trees:
- Gini impurity
- Entropy and information gain
- How trees split data
- Classification and regression trees
- Implementing decision trees
- Training on datasets
- Tree visualization using `.dot` files (Graphviz)
Goal: Understand how tree-based models make decisions.
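The Gini impurity that drives the splits can be sketched in a few lines (the example label sets are illustrative):

```python
import numpy as np
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum(p_k^2) over the class proportions p_k."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

g_pure = gini([0, 0, 0, 0])   # 0.0: a pure node, nothing to split
g_mixed = gini([0, 0, 1, 1])  # 0.5: maximally impure for two classes
print(g_pure, g_mixed)
```

A split is chosen to maximize the impurity reduction, i.e. the parent's Gini minus the weighted average Gini of the children.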
Also included in the collection:
- KNN from scratch
- Naive Bayes
- PCA
- Random Forest
- Gradient Boosting
- Notes