
Neural Network MLP (C++)

A Multi-Layer Perceptron neural network implemented from scratch in C++ with zero external dependencies. Available as both static and dynamic libraries.

Features

  • Pure C++ implementation with no ML framework dependencies
  • Configurable layer count and neuron count
  • Bias neurons in each layer (except output)
  • ReLU activation for hidden layers
  • Weight modification for training
  • Network truncation support (weight averaging)
  • Available as static (.a) and dynamic (.so) libraries
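The "truncation (weight averaging)" feature is not shown in the API below. As a rough, hypothetical illustration of the idea (not the library's actual API), averaging merges two networks by taking the mean of each corresponding weight:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch of weight averaging: each weight in the merged
// network is the mean of the corresponding weights in two source
// networks. Illustrative only; not the library's real API.
std::vector<double> averageWeights(const std::vector<double>& a,
                                   const std::vector<double>& b) {
    assert(a.size() == b.size());
    std::vector<double> avg(a.size());
    for (std::size_t i = 0; i < a.size(); ++i)
        avg[i] = (a[i] + b[i]) / 2.0;
    return avg;
}
```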

Building

./build.sh

This creates both lib/libredeneural.a and lib/libredeneural.so.

Usage

Include in Your Project

Copy include/redeNeural.hpp to your project's include path and link against the library.

CMake Integration

# Static
target_link_libraries(your_target /path/to/lib/libredeneural.a)

# Dynamic
target_link_libraries(your_target /path/to/lib/libredeneural.so)
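For context, a fuller CMakeLists.txt sketch (target name and paths are placeholders; adjust to your layout):

```cmake
cmake_minimum_required(VERSION 3.10)
project(my_app CXX)
set(CMAKE_CXX_STANDARD 17)

add_executable(my_app main.cpp)
# Point at the directory containing the copied redeNeural.hpp
target_include_directories(my_app PRIVATE ${CMAKE_SOURCE_DIR}/include)
# Static link; swap in libredeneural.so for the dynamic library
target_link_libraries(my_app ${CMAKE_SOURCE_DIR}/lib/libredeneural.a)
```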

Example

#include <redeNeural.hpp>
#include <vector>

int main() {
    // 3 input neurons, 2 hidden layers, 2 output neurons
    RedeNeural network(3, 2, 2);

    std::vector<double> inputs = {0.5, -0.5, 1.0};
    // iniciar() (Portuguese for "start") runs a forward pass and
    // returns the output layer as a boolean vector
    std::vector<bool> output = network.iniciar(inputs);

    return 0;
}

Architecture

  • Input layer with configurable neuron count + bias
  • N hidden layers with ReLU activation + bias
  • Output layer returning boolean vector
  • Weight initialization and modification API for training
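Conceptually, each hidden layer computes a weighted sum of its inputs plus a bias, then applies ReLU. A minimal, self-contained sketch of that step (the library's internal layout may differ; all names here are illustrative):

```cpp
#include <cstddef>
#include <vector>

// ReLU activation: max(0, x)
double relu(double x) { return x > 0.0 ? x : 0.0; }

// One hidden-layer forward step:
//   out[j] = relu(sum_i w[j][i] * in[i] + bias[j])
// Illustrative only; not the library's actual internals.
std::vector<double> layerForward(const std::vector<double>& in,
                                 const std::vector<std::vector<double>>& w,
                                 const std::vector<double>& bias) {
    std::vector<double> out(w.size());
    for (std::size_t j = 0; j < w.size(); ++j) {
        double sum = bias[j];
        for (std::size_t i = 0; i < in.size(); ++i)
            sum += w[j][i] * in[i];
        out[j] = relu(sum);
    }
    return out;
}
```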

Requirements

  • C++17
  • No external dependencies

License

MIT
