This project aims to provide a foundational understanding of neural networks by implementing them from scratch. It serves as an educational tool for those interested in the inner workings of neural networks without relying on high-level libraries.
- Customizable Layers: Build neural networks with varying numbers of layers and units per layer.
- Activation Functions: Implement and experiment with different activation functions.
- Loss Functions: Utilize various loss functions to evaluate model performance.
- Training Algorithms: Train networks using backpropagation and gradient descent.
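To make the activation-function feature above concrete, here is a minimal standalone sketch (plain NumPy, not this repository's actual API) of two common activations together with the derivatives that backpropagation needs:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # Derivative expressed in terms of the activation itself: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    # Zeroes out negative inputs, passes positives through unchanged
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Step function: 1 for positive inputs, 0 otherwise
    return (x > 0).astype(float)
```

Pairing each activation with its derivative like this is what lets the training loop push gradients backwards through each layer.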
To set up the project locally:

1. Clone the repository:

   ```bash
   git clone https://github.com/martonbakk/Homemade-NN.git
   ```

2. Navigate to the project directory:

   ```bash
   cd Homemade-NN
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```
After installation, you can start building and training your neural networks. Refer to the provided example scripts for guidance:
- XOR Problem: Demonstrates training a network to solve the XOR logic gate problem.

  ```bash
  python testnn_xor.py
  ```

- Image Classification: Shows how to train a network for basic image classification tasks.

  ```bash
  python testnn_pic.py
  ```
These scripts illustrate how to define network architectures, specify activation and loss functions, and train the models.
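As a standalone illustration of those same steps (this is a minimal NumPy sketch of the general technique, not the repository's actual code), the snippet below trains a small sigmoid network on XOR with backpropagation and gradient descent; the layer sizes, learning rate, and epoch count are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Architecture: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

lr = 0.5
loss_history = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss_history.append(float(((out - y) ** 2).mean()))

    # Backward pass: chain rule through the squared-error loss
    # (constant factors are absorbed into the learning rate)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)
```

After training, `out` holds the network's predictions for the four XOR inputs, and `loss_history` shows the error shrinking over the epochs, which is the same behavior the example scripts demonstrate at larger scale.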
Contributions are welcome! Feel free to fork the repository, make enhancements, and submit pull requests. For major changes, please open an issue to discuss your ideas.
This project is inspired by the desire to deepen understanding of neural networks by building them from the ground up. It draws on foundational concepts in machine learning and neural network theory.
Note: This project is intended for educational purposes and may not be optimized for production use.