A simple, modular neural network implementation in pure C. Learn how neural networks work from the ground up with this educational project.
```sh
# Build the project
make

# Train on XOR problem
make run-xor

# Train on sine wave data
make run

# Train on circle classification
make run-circle
```

This project implements a neural network that can:
- Learn the XOR logic gate
- Approximate mathematical functions (like sine waves)
- Classify points inside/outside a circle
- Use different activation functions (sigmoid, ReLU, tanh, etc.)
- Use different loss functions (MSE, binary cross-entropy)
- Train with momentum and learning rate decay
```sh
# Simple XOR training
./nn -d xor -v

# Train on custom data
./nn -d circle -h 16 -b 64 -v

# Show all options
./nn -?
```

The network provides:

- 2-layer neural network (input → hidden → output)
- Configurable hidden layer size
- Multiple activation functions
- Batch training with momentum
- Gradient descent with backpropagation
- Early stopping to prevent overfitting
- Learning rate decay over time
- Weight decay (L2 regularization)
- Progress visualization
Included datasets:

- XOR: Classic logic gate problem
- Sine: sin(x) × cos(y) function approximation
- Circle: Inside/outside circle classification
- Enhanced Circle: The same task with more boundary examples
For verifying and evaluating the network:

- Gradient checking to confirm the backpropagation math is correct
- Train/test split for measuring generalization
- Accuracy calculation for classification tasks
Example output from an XOR run:

```
training configuration:
epochs: 5000
learning rate: 0.100000
hidden size: 8
batch size: 32
dataset: xor
hidden activation: relu
output activation: sigmoid
loss function: mse
training...
epoch 2500/5000 | train loss: 0.043 | test loss: 0.041
final results:
training accuracy: 98.75%
test accuracy: 97.50%
```
Training follows the standard loop:

1. Forward pass: input → hidden layer → output layer
2. Calculate loss: compare the output to the expected values
3. Backward pass: compute gradients using the chain rule
4. Update weights: adjust each weight against its gradient
5. Repeat: process all training data for many epochs
Built for education - simple enough to understand, complete enough to be useful.