Neural Network Scratch Implementation

This project is a from-scratch implementation of a neural network, written in Python using only basic mathematics. It serves as a learning resource and demonstrates how neural networks can be built without high-level machine learning libraries such as TensorFlow or PyTorch.

Table of Contents

  • Features
  • Installation
  • Usage
  • Project Structure
  • Contributing
  • License

Features

  • Implements core components of a neural network:
    • Activation functions
    • Layers
    • Softmax for output probabilities
  • A structured design to build and test neural networks from scratch.
  • Jupyter notebooks for experimentation and visualization.
  • Modular codebase for reuse and extensibility (a minimal sketch of the core components follows below).
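
To make the features above concrete, here is a minimal sketch of the kinds of components involved: a fully connected layer, a ReLU activation, and a softmax output. The class and method names are illustrative assumptions and may not match the actual definitions in layer.py, activation.py, and softmax.py.

    import numpy as np

    class DenseLayer:
        """A fully connected layer: output = inputs @ weights + biases."""
        def __init__(self, n_inputs, n_neurons):
            # Small random weights, zero biases
            self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
            self.biases = np.zeros((1, n_neurons))

        def forward(self, inputs):
            self.output = inputs @ self.weights + self.biases

    class ReLU:
        """Rectified linear activation: max(0, x) applied element-wise."""
        def forward(self, inputs):
            self.output = np.maximum(0, inputs)

    class Softmax:
        """Turns raw scores into per-sample probabilities (each row sums to 1)."""
        def forward(self, inputs):
            # Subtract the row maximum before exponentiating for numerical stability
            exp_values = np.exp(inputs - np.max(inputs, axis=1, keepdims=True))
            self.output = exp_values / np.sum(exp_values, axis=1, keepdims=True)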

Installation

  1. Clone the repository:

    git clone https://github.com/your-username/Neural_Network_Scratch.git
    cd Neural_Network_Scratch
  2. Install the required Python dependencies:

    pip install -r requirements.txt

Usage

Running the Scripts

  • Neural Network Implementation: Run neural_network.py to build and train the network.

    python neural_network.py
  • Experimentation Notebooks: Open the notebooks in the notebooks directory using Jupyter:

    jupyter notebook notebooks/batch.ipynb

Example Workflow

  1. Modify the data loading script (data.py) to input your dataset.
  2. Configure the network architecture in layer.py and adjust hyperparameters in neural_network.py.
  3. Train the network and evaluate its performance using the provided scripts (a rough end-to-end sketch follows below).
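
As a rough illustration of this workflow, the sketch below wires the components together for a single forward pass and loss computation. It reuses the DenseLayer, ReLU, and Softmax classes from the earlier sketch and substitutes random synthetic data for the dataset that data.py would normally provide; these names are assumptions for illustration, not the project's actual API.

    import numpy as np

    def cross_entropy_loss(probabilities, targets):
        """Mean negative log-likelihood of the correct class for each sample."""
        samples = len(probabilities)
        # Clip to avoid taking log(0)
        correct_confidences = np.clip(probabilities[range(samples), targets], 1e-7, 1 - 1e-7)
        return float(np.mean(-np.log(correct_confidences)))

    # Synthetic stand-in data: 100 samples, 4 features, 3 classes
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 4))
    y = rng.integers(0, 3, size=100)

    # Hypothetical architecture: 4 -> 64 -> 3 with ReLU and softmax
    layer1 = DenseLayer(4, 64)
    activation1 = ReLU()
    layer2 = DenseLayer(64, 3)
    output_activation = Softmax()

    # Forward pass
    layer1.forward(X)
    activation1.forward(layer1.output)
    layer2.forward(activation1.output)
    output_activation.forward(layer2.output)

    loss = cross_entropy_loss(output_activation.output, y)
    accuracy = np.mean(np.argmax(output_activation.output, axis=1) == y)
    print(f"loss: {loss:.4f}  accuracy: {accuracy:.2f}")

Actual training additionally requires a backward pass and a weight-update step (for example, plain gradient descent), which is where the hyperparameters configured in neural_network.py come into play.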

Project Structure

Neural_Network_Scratch/
├── notebooks/
│   ├── batch.ipynb         # Notebook for batch processing experiments
│   ├── node.ipynb          # Notebook for node-level experiments
├── activation.py           # Implementation of activation functions
├── data.py                 # Data preprocessing and loading utilities
├── layer.py                # Implementation of neural network layers
├── neural_network.py       # Main neural network script
├── readme.md               # Project documentation (this file)
├── requirements.txt        # Python dependencies
├── softmax.py              # Softmax function implementation

Contributing

Contributions are welcome! If you'd like to improve this project:

  1. Fork the repository.
  2. Create a new branch for your feature/bugfix.
  3. Commit your changes and submit a pull request.

Example:

git checkout -b feature-new-activation-function
git commit -m "Add new activation function"
git push origin feature-new-activation-function

License

This project is licensed under the MIT License. See the LICENSE file for details.

