whatsupwithdeepti/ml-from-scratch

ML concepts, intuition, mathematics, and full Python implementations — all in one learning-friendly repository.

📘 ML From Scratch Tutorials

A curated collection of Machine Learning tutorials created from scratch, combining theory, intuition, math, and hands-on implementation.
These Jupyter notebooks walk through ML concepts step-by-step — perfect for beginners, students, and anyone who wants to understand ML deeply rather than treating it as a black box.

Created by Deepti Jethwani.


📂 Repository Contents (What Each Notebook Covers)

1. Key concepts of ML.ipynb

This notebook introduces the fundamental concepts of Machine Learning:

  • What ML is and how it works
  • Types of learning: Supervised, Unsupervised, Reinforcement
  • Key terms: features, labels, models, training, testing
  • Bias–variance tradeoff
  • Underfitting vs Overfitting
  • Train/test split basics
  • Introduction to evaluation metrics

Goal: To build a strong conceptual foundation before exploring algorithms.
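One of the basics above, the train/test split, can be sketched in a few lines. This is an illustrative from-scratch version (the function name and NumPy-based shuffling are this sketch's own choices, not necessarily the notebook's):

```python
import numpy as np

def train_test_split(X, y, test_size=0.2, seed=0):
    """Shuffle indices, then carve off a held-out test set."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))        # random ordering of sample indices
    n_test = int(len(X) * test_size)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return X[train_idx], X[test_idx], y[train_idx], y[test_idx]

X = np.arange(20).reshape(10, 2)
y = np.arange(10)
X_train, X_test, y_train, y_test = train_test_split(X, y)
print(len(X_train), len(X_test))  # 8 2
```

Shuffling before splitting matters: if the data is ordered (say, by class), a naive slice would put one class entirely in the test set.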


2. MNIST - Classification Problem.ipynb

This notebook walks through digit classification using the MNIST dataset:

  • Loading and exploring the MNIST dataset
  • Preprocessing and normalization
  • Building a classification model
  • Training the model on 60,000 images
  • Testing on 10,000 images
  • Visualizing predictions and errors
  • Evaluating accuracy

Goal: Learn end-to-end ML workflow on a real-world dataset.
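The end-to-end workflow (load, normalize, split, fit, evaluate) can be sketched compactly. To keep the example self-contained and fast, this sketch uses scikit-learn's small 8×8 `load_digits` dataset as a stand-in for full MNIST, and logistic regression as a placeholder classifier — the notebook's actual model and dataset sizes (60,000/10,000 images) differ:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# 8x8 grayscale digits, pixel values 0..16 — a lightweight MNIST stand-in
X, y = load_digits(return_X_y=True)
X = X / 16.0  # normalize pixels to [0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.3f}")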


3. Linear Regression ( Math ).ipynb

A mathematical and practical overview of Linear Regression:

  • Intuition of linear relationships
  • Hypothesis function
  • Cost function (MSE)
  • Gradient Descent explained step-by-step
  • Formula derivation
  • Implementing Linear Regression from scratch
  • Plotting regression lines and errors

Goal: Understand the math and mechanics behind regression models.
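The gradient-descent steps above can be sketched from scratch with NumPy. This is a minimal illustration (function name and hyperparameters are this sketch's own, not necessarily the notebook's): for MSE cost J = (1/n) Σ (Xw + b − y)², the gradients are ∂J/∂w = (2/n) Xᵀ(Xw + b − y) and ∂J/∂b = (2/n) Σ (Xw + b − y).

```python
import numpy as np

def fit_linear(X, y, lr=0.05, epochs=1000):
    """Fit y ≈ Xw + b by minimizing MSE with batch gradient descent."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        err = X @ w + b - y              # residuals
        w -= lr * (2 / n) * (X.T @ err)  # dMSE/dw
        b -= lr * (2 / n) * err.sum()    # dMSE/db
    return w, b

# Synthetic data with known slope 3.0 and intercept 0.5
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 0.5 + rng.normal(0, 0.05, 200)

w, b = fit_linear(X, y)
print(w, b)  # recovers roughly [3.0] and 0.5
```

Because the data is generated with known parameters, you can verify that gradient descent converges to them — a useful sanity check when implementing from scratch.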


4. Logistic Regression.ipynb

A complete guide to Logistic Regression for classification:

  • Logistic function and sigmoid
  • Why linear regression fails at classification
  • Decision boundary
  • Log loss / Cross-entropy
  • Gradient descent for logistic regression
  • Implementing logistic regression from scratch
  • Evaluating model accuracy

Goal: Learn binary classification deeply.
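The sigmoid-plus-gradient-descent recipe above can be sketched as follows. A convenient fact used here: for cross-entropy loss with a sigmoid output, the gradient with respect to the pre-activation simplifies to (p − y). Names and hyperparameters are this sketch's own:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Minimize cross-entropy loss with batch gradient descent."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)   # predicted probabilities
        grad = p - y             # d(cross-entropy)/d(Xw + b)
        w -= lr * (X.T @ grad) / n
        b -= lr * grad.mean()
    return w, b

# Toy 1-D problem: label is 1 exactly when x > 0
rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(300, 1))
y = (X[:, 0] > 0).astype(float)

w, b = fit_logistic(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(float)
print("train accuracy:", (preds == y).mean())
```

The learned decision boundary sits where Xw + b = 0, i.e. at x ≈ −b/w — close to the true threshold of 0 on this data.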


5. SVM.ipynb

Support Vector Machines explained with intuition and implementation:

  • Maximum-margin concept
  • Hyperplanes and support vectors
  • Kernel trick (Linear, Polynomial, RBF)
  • Training an SVM classifier
  • Visualizing decision boundaries
  • Understanding when SVMs perform best

Goal: Build intuition for one of the strongest classical ML models.
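The kernel trick can be seen in action by comparing a linear and an RBF kernel on data that no straight line separates well. A minimal sketch using scikit-learn's `SVC` and the `make_moons` toy dataset (both standard library calls; the specific dataset and parameters are this sketch's choices):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Two interleaved half-circles: not linearly separable
X, y = make_moons(n_samples=300, noise=0.15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scores = {}
for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel, C=1.0)
    clf.fit(X_train, y_train)
    scores[kernel] = clf.score(X_test, y_test)
    print(kernel, scores[kernel])
```

The RBF kernel implicitly maps points into a higher-dimensional space where a separating hyperplane exists, which is why it handles the curved boundary that defeats the linear kernel.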


6. Decision Trees.ipynb

A full tutorial on Decision Trees:

  • Gini impurity
  • Entropy and information gain
  • How trees split data
  • Classification and regression trees
  • Implementing decision trees
  • Training on datasets
  • Tree visualization using .dot files

Goal: Understand how tree-based models make decisions.
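The core splitting logic above can be sketched from scratch: compute Gini impurity 1 − Σ pₖ² for a set of labels, then pick the threshold that minimizes the size-weighted impurity of the two children. Function names here are this sketch's own:

```python
import numpy as np

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Find the threshold on feature x minimizing weighted child impurity."""
    best_t, best_score = None, float("inf")
    for t in np.unique(x)[:-1]:          # candidate thresholds
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(gini(y))           # 0.5 for a 50/50 class mix
print(best_split(x, y))  # splitting at 3.0 yields pure children (impurity 0.0)
```

A full tree simply applies `best_split` recursively to each child until the nodes are pure or a depth limit is hit.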


📢 Future Additions (Planned)

  • KNN from scratch
  • Naive Bayes
  • PCA
  • Random Forest
  • Gradient Boosting
  • Notes