This repository contains the lab work from the Neural Network and Fuzzy Logic course at Rajshahi University of Engineering & Technology (RUET). The notebooks implement various neural network algorithms and demonstrate practical applications of machine learning models.
- K Nearest Neighbour (KNN): Implementation of KNN from scratch and comparison with built-in KNN models.
- Single Layer Perceptron: Implementation of a simple perceptron model for binary classification tasks.
- Multi-Layer Perceptron: A more advanced model involving multiple layers with backpropagation for training.
- K Nearest Neighbour (KNN):
  - K Nearest Neighbour Scratch vs Built-In on Breast Cancer Dataset.ipynb: Comparison of KNN from scratch and built-in methods on the Breast Cancer dataset.
  - Implement K Nearest Neighbour From Scratch and Compare with the Built-in Graphically.ipynb: Visual comparison of custom vs built-in KNN.
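The from-scratch side of the comparison boils down to distance computation plus a majority vote. A minimal sketch of that idea (not the notebook's exact code; the function name, toy data, and `k=3` are illustrative assumptions):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Predict the label of a single sample x by majority vote
    among its k nearest training points (Euclidean distance)."""
    distances = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(distances)[:k]           # indices of the k closest points
    votes = Counter(y_train[i] for i in nearest)  # count labels among neighbours
    return votes.most_common(1)[0][0]

# Toy data: two small clusters with labels 0 and 1.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1])))  # -> 0
```

The built-in counterpart in the notebooks would be scikit-learn's `KNeighborsClassifier`, which adds tie-breaking, distance weighting, and faster neighbour search via KD-trees.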
- Single Layer Perceptron:
  - Single Layer Perceptron.ipynb: Simple perceptron model to classify binary data.
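The perceptron learning rule updates the weights only when a prediction is wrong. A minimal sketch of that training loop on the linearly separable AND function (the learning rate, epoch count, and data are illustrative assumptions, not taken from the notebook):

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Single-layer perceptron with a step activation; labels in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if np.dot(w, xi) + b >= 0 else 0
            err = yi - pred
            w += lr * err * xi  # weights move only on misclassification
            b += lr * err
    return w, b

# Logical AND: linearly separable, so the perceptron converges.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [1 if np.dot(w, xi) + b >= 0 else 0 for xi in X]
print(preds)  # -> [0, 0, 0, 1]
```

For non-separable data such as XOR this loop never converges, which is the usual motivation for the multi-layer perceptron in the next section.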
- Multi-Layer Perceptron (MLP):
  - Multi-Layer Perceptron & Backpropagation Binary Classifier.ipynb: Implementation of a multi-layer perceptron trained with the backpropagation algorithm for binary classification.
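Backpropagation applies the chain rule layer by layer to get gradients, then updates the weights by gradient descent. A self-contained sketch on XOR (the hidden size, learning rate, seed, and iteration count are illustrative assumptions, not the notebook's settings):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)

# XOR: the classic binary task a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2-4-1 architecture; the hidden width of 4 is an arbitrary choice.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))
lr = 1.0

losses = []
for _ in range(5000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))
    # Backward pass: chain rule through MSE and each sigmoid.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print("final loss:", losses[-1])
print("predictions:", (out > 0.5).astype(int).ravel())
```

With full-batch gradient descent the loss should drop well below its initial value; training on XOR can occasionally stall in a poor local minimum, which is why a few hidden units more than the minimum two are used here.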
- Datasets:
  - Breast_Cancer_Dataset.csv: The dataset used for training and testing the KNN model.
- Clone the repository:
git clone https://github.com/AudityGhosh/Neural_Network_Notebooks.git
- Navigate to the directory:
cd Neural_Network_Notebooks
- Open the Jupyter notebooks:
jupyter notebook
- Run the notebooks in the order they appear to explore each neural network model.
- Neural Computing - An Introduction: The book used as a reference for implementing various neural network models.
- Special Thanks: Gratitude to classmate Kuldip Saha and senior Nur E Anika Anan for their valuable support during the development of this project.
Feel free to fork the repository, contribute to the projects, or suggest improvements. Pull requests are welcome!
This project was developed by Audity Ghosh as part of the Neural Network and Fuzzy Logic course at RUET, with special thanks to the authors and contributors.