Neural Networks from scratch!
Two different implementations:
- Finite difference implementation (simplest to understand)
- Backprop implementation (using an autograd engine, surprisingly simple once you know about computation graphs)
There is also a PyTorch implementation for comparison.
We train the network to learn the XOR gate, i.e. mappings like 1 ^ 1 = 0. To do that, we build the truth table for XOR and feed the bits into the network.
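To give a flavor of the finite-difference approach, here is a minimal sketch (not the repo's actual code): it builds the XOR truth table, then trains a small network by nudging each parameter by a tiny `eps` and measuring how the loss changes. The 2-4-1 sigmoid architecture, the mean-squared-error loss, and all hyperparameters below are illustrative assumptions.

```python
import numpy as np

# XOR truth table: all four input pairs and their expected outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(params, x):
    w1, b1, w2, b2 = params
    h = sigmoid(x @ w1 + b1)     # hidden layer
    return sigmoid(h @ w2 + b2)  # output layer

def loss(params):
    return float(np.mean((forward(params, X) - Y) ** 2))

rng = np.random.default_rng(0)
# 2 inputs -> 4 hidden units -> 1 output (sizes are an illustrative choice)
params = [rng.normal(size=(2, 4)), np.zeros(4),
          rng.normal(size=(4, 1)), np.zeros(1)]

eps = 1e-3  # finite-difference step
lr = 1.0    # learning rate
initial_loss = loss(params)
for _ in range(10000):
    base = loss(params)
    grads = []
    for p in params:
        g = np.zeros_like(p)
        for idx in np.ndindex(p.shape):
            old = p[idx]
            p[idx] = old + eps
            # Forward difference: how much did the loss move per unit of eps?
            g[idx] = (loss(params) - base) / eps
            p[idx] = old
        grads.append(g)
    for p, g in zip(params, grads):
        p -= lr * g

print(initial_loss, loss(params))
```

Note the cost: every parameter needs its own extra forward pass per step, which is exactly why this variant is the simplest to understand but far too slow for real networks.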
You can read my blog post if you'd like to learn how to implement this step-by-step: https://comsci.blog/2023/08/24/nn-from-scratch.html
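For a flavor of the backprop version, a tiny scalar autograd can be sketched as follows. This is a hypothetical micrograd-style illustration, not the repo's code: each `Value` records its parents in the computation graph, and `backward()` walks the graph in reverse topological order applying the chain rule.

```python
import math

class Value:
    """A scalar node in a computation graph (micrograd-style sketch)."""
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():  # d tanh(x)/dx = 1 - tanh(x)^2
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# One "neuron": y = tanh(x*w + b) at x=0.5, w=2.0, b=-1.0.
x, w, b = Value(0.5), Value(2.0), Value(-1.0)
y = (x * w + b).tanh()
y.backward()
print(x.grad)  # dy/dx = w * (1 - tanh(0)^2) = 2.0
```

Unlike the finite-difference version, one backward pass yields the gradients for every parameter at once, which is what makes training full networks tractable.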