Neural Network from Scratch

A pure Python implementation of a neural network with automatic differentiation and backpropagation, built from scratch. This project demonstrates the fundamental concepts of deep learning by implementing a multi-layer perceptron (MLP) with custom automatic differentiation.

Features

  • Custom Value class implementing automatic differentiation
  • Implementation of basic mathematical operations (add, multiply, tanh, exp, etc.)
  • Neural network components (Neuron, Layer, MLP) built from scratch
  • Backpropagation algorithm for training
  • Support for multi-layer perceptrons with configurable architecture

Project Structure

.
├── main_code/
│   ├── engine.py    # Core Value class: automatic differentiation and backpropagation
│   ├── nn.py        # Neural network components (Neuron, Layer, MLP)
│   └── __init__.py
├── using_nn.py      # Example usage of the neural network
└── test.py          # Sanity checks comparing engine.py's gradients against PyTorch

Installation

This project uses only Python's standard library modules (math and random), so no additional package installation is required. Simply clone the repository and you're ready to go:

git clone https://github.com/jaibhasin/neural-network-from-scratch.git
cd neural-network-from-scratch

Usage

Here's a simple example of how to use the neural network:

from main_code.nn import MLP

# Define input data and labels
xs = [
    [2.0, 3.0, -1.0],
    [3.0, -1.0, 0.5],
    [0.5, 1.0, 1.0],
    [1.0, 1.0, -1.0],
]
ys = [1.0, -1.0, -1.0, 1.0]

# Create a neural network with 3 inputs, two hidden layers of 4 neurons each, and 1 output
n = MLP(3, [4, 4, 1])

# Training loop
for k in range(300):
    # Forward pass
    y_pred = [n(x) for x in xs]
    loss = sum((yout - ygt)**2 for ygt, yout in zip(ys, y_pred))

    # Backward pass
    for p in n.parameters():
        p.grad = 0
    loss.backward()

    # Update parameters
    for p in n.parameters():
        p.data -= 0.01 * p.grad

Implementation Details

Value Class

The Value class in engine.py implements automatic differentiation with:

  • Basic mathematical operations (+, -, *, /, **)
  • Activation functions (tanh)
  • Backpropagation algorithm
  • Gradient computation
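To make these ideas concrete, here is a condensed, illustrative sketch of how a Value class like this is typically built (the names mirror this README, but the actual engine.py may differ in detail): each operation records its inputs and a local backward rule, and backward() topologically sorts the graph and applies the chain rule in reverse.

```python
import math

class Value:
    """Minimal scalar autograd node (illustrative sketch, not engine.py itself)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # local chain-rule step, set by each op
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t ** 2) * out.grad  # d tanh(x)/dx = 1 - tanh^2(x)
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the computation graph, then propagate
        # gradients from the output back to every input.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()
```

For example, with a = Value(2.0) and b = Value(3.0), calling (a * b).backward() leaves a.grad == 3.0 and b.grad == 2.0, matching the product rule.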

Neural Network Components

  • Neuron: Basic unit with weights and bias
  • Layer: Collection of neurons
  • MLP: Multi-layer perceptron with configurable architecture
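The relationship between these three components can be sketched as follows. This is a forward-pass-only illustration using plain floats; the real nn.py builds the same structure on top of the Value class so that the network is trainable via backpropagation.

```python
import math
import random

class Neuron:
    """One unit: a weight per input plus a bias, squashed through tanh."""
    def __init__(self, nin):
        self.w = [random.uniform(-1, 1) for _ in range(nin)]
        self.b = random.uniform(-1, 1)
    def __call__(self, x):
        act = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return math.tanh(act)   # output lies in (-1, 1)

class Layer:
    """A collection of neurons that all see the same input vector."""
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]
    def __call__(self, x):
        outs = [n(x) for n in self.neurons]
        return outs[0] if len(outs) == 1 else outs

class MLP:
    """A stack of layers; MLP(3, [4, 4, 1]) matches the usage example above."""
    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]
    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x
```

A network built as MLP(3, [4, 4, 1]) takes a 3-element input and, because the final layer has a single neuron, returns one scalar in (-1, 1).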

License

This project is open source and available under the MIT License.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
