In this project, I implemented a neural network from scratch. This includes: activation functions (Linear, Sigmoid, ReLU, SoftMax), layers (FullyConnected, Pool2D, Conv2D), loss functions (CrossEntropy, L2), and the forward, backward, and predict functions of the network itself. You can find my code here.
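
To give a feel for how the pieces fit together, here is a minimal sketch (not the project's actual code) of a network with forward, backward, and predict, assuming NumPy and simplified FullyConnected and ReLU classes; the real implementation also covers Conv2D, Pool2D, SoftMax, CrossEntropy, and the other components listed above.

```python
import numpy as np

class FullyConnected:
    """Dense layer y = xW + b; caches its input so backward can compute gradients."""
    def __init__(self, in_features, out_features):
        # Small random init; the actual project may use a different scheme.
        self.W = np.random.randn(in_features, out_features) * 0.01
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                        # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr=0.1):
        grad_in = grad_out @ self.W.T     # gradient w.r.t. the layer input
        self.W -= lr * (self.x.T @ grad_out)
        self.b -= lr * grad_out.sum(axis=0)
        return grad_in

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out, lr=None):
        return grad_out * self.mask       # pass gradient only where input was positive

class Network:
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def backward(self, grad, lr=0.1):
        # Propagate the loss gradient through the layers in reverse order.
        for layer in reversed(self.layers):
            grad = layer.backward(grad, lr)
        return grad

    def predict(self, x):
        return self.forward(x).argmax(axis=1)

# Tiny usage example: fit random one-hot targets with a mean-squared (L2) loss.
if __name__ == "__main__":
    net = Network([FullyConnected(4, 8), ReLU(), FullyConnected(8, 3)])
    x = np.random.randn(5, 4)
    y = np.eye(3)[np.random.randint(0, 3, size=5)]
    for _ in range(100):
        out = net.forward(x)
        grad = 2 * (out - y) / len(x)     # gradient of the mean L2 loss
        net.backward(grad, lr=0.1)
    print(net.predict(x))
```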