A Multi-Layer Perceptron neural network implemented from scratch in C++ with zero external dependencies. Available as both static and dynamic libraries.
- Pure C++ implementation with no ML framework dependencies
- Configurable layer count and neuron count
- Bias neurons in each layer (except output)
- ReLU activation for hidden layers
- Weight modification for training
- Network truncation support (weight averaging; see the sketch after this list)
- Available as static (.a) and dynamic (.so) libraries
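As a rough illustration of the weight-averaging idea behind network truncation, here is a conceptual sketch. The function name and the flat weight-vector layout are assumptions for illustration only, not part of the library's API:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical helper (illustrative only, assumes a.size() == b.size()):
// element-wise average of two weight vectors, which is what
// "weight averaging" amounts to.
std::vector<double> averageWeights(const std::vector<double>& a,
                                   const std::vector<double>& b) {
    std::vector<double> avg(a.size());
    for (std::size_t i = 0; i < a.size(); ++i) {
        avg[i] = 0.5 * (a[i] + b[i]);  // midpoint of the corresponding weights
    }
    return avg;
}
```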
Build the libraries with the included script:

```sh
./build.sh
```

This creates both lib/libredeneural.a and lib/libredeneural.so.
Copy include/redeNeural.hpp to your project's include path and link against the library.
```cmake
# Static
target_link_libraries(your_target /path/to/lib/libredeneural.a)

# Dynamic
target_link_libraries(your_target /path/to/lib/libredeneural.so)
```

A minimal usage example:

```cpp
#include <redeNeural.hpp>
#include <vector>
int main() {
    // 3 input neurons, 2 hidden layers, 2 output neurons
    RedeNeural network(3, 2, 2);
    std::vector<double> inputs = {0.5, -0.5, 1.0};
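    // Run the inputs through the network; the result holds one bool per output neuron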
    std::vector<bool> output = network.iniciar(inputs);
    return 0;
}
```
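To compile and link this example against the static library, a command along these lines should work (the file name and paths are illustrative and depend on where the header and library live):

```sh
g++ -std=c++17 example.cpp -I/path/to/include /path/to/lib/libredeneural.a -o example
```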
The network is structured as follows:

- Input layer with configurable neuron count + bias
- N hidden layers with ReLU activation + bias (see the sketch below)
- Output layer returning a boolean vector
- Weight initialization and modification API for training
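For reference, a hidden layer with ReLU activation and a bias term computes roughly the following. This is a conceptual sketch of the math (names and data layout are assumptions), not the library's internal code:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// y[j] = ReLU(sum_i w[j][i] * x[i] + b[j]) for each neuron j in the layer.
std::vector<double> reluLayer(const std::vector<double>& x,
                              const std::vector<std::vector<double>>& w,
                              const std::vector<double>& b) {
    std::vector<double> y(w.size());
    for (std::size_t j = 0; j < w.size(); ++j) {
        double sum = b[j];                 // bias contribution
        for (std::size_t i = 0; i < x.size(); ++i) {
            sum += w[j][i] * x[i];         // weighted inputs
        }
        y[j] = std::max(0.0, sum);         // ReLU: negative sums clamp to zero
    }
    return y;
}
```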
Requirements:

- C++17
- No external dependencies
License: MIT