Autograd Neural Network

This neural network library uses automatic differentiation ("autograd") to compute gradients of the loss function with respect to the model's parameters.
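The idea, in a minimal generic sketch (scalar-valued, and deliberately not the implementation in tensor.py): each operation records its inputs and a local derivative rule, and backward() walks the computation graph in reverse, applying the chain rule.

# A minimal, generic sketch of reverse-mode autodiff on scalars.
# Illustration only; see tensor.py for this library's actual system.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # leaves have nothing to propagate

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            # chain rule: d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            # addition passes the upstream gradient through unchanged
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward
        return out

    def backward(self):
        # topologically sort the graph, then apply local rules in reverse
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0  # d(out)/d(out)
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(3.0)
(a * b + a).backward()
print(a.grad, b.grad)  # 4.0 2.0, since d/da (a*b + a) = b + 1 and d/db = a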

See tensor.py for the autograd system, and the Examples directory for usage examples such as the one below.

from nn.tensor import Tensor
import numpy as np

x = Tensor(np.array([2.0, 3.0]), requires_grad=True)
w = Tensor(np.array([4.0, -1.0]), requires_grad=True)

# Elementwise product followed by a sum; this builds the computation graph
out = (x * w).sum(dim=0)

# Backpropagation: populates .grad on every tensor with requires_grad=True
out.backward()

print("x.grad:", x.grad)  # [4.0, -1.0]
print("w.grad:", w.grad)  # [2.0, 3.0]

About

A neural network library built around an automatic differentiation system written from scratch. The main focus of this project is its autograd system.
