What I Built
Built deep learning architectures from scratch, progressing from simple feed-forward networks to advanced models such as Variational Autoencoders. Implemented FFNs, CNN variants (LeNet-5, AlexNet, VGG-11, ResNet), RNNs, and LSTMs, with applications including baby-name generation, English-French translation with attention, and a β-VAE with latent-space interpolation. A minimal sketch of the from-scratch style is shown below.
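To give a flavor of what "from scratch" means here, this is a minimal NumPy sketch of a two-layer feed-forward network trained with a manually derived backward pass. It is illustrative only, not the project's actual code; all names, shapes, and the toy regression data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data and a two-layer network; shapes are illustrative.
X = rng.standard_normal((32, 4))            # batch of 32 examples, 4 features
y = rng.standard_normal((32, 1))
W1, b1 = rng.standard_normal((4, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.standard_normal((16, 1)) * 0.1, np.zeros(1)

# Forward pass
h_pre = X @ W1 + b1
h = np.maximum(0.0, h_pre)                  # ReLU
y_hat = h @ W2 + b2
loss = np.mean((y_hat - y) ** 2)            # MSE

# Manual backward pass: pure chain rule, no autograd
d_yhat = 2.0 * (y_hat - y) / len(X)         # dL/dy_hat
dW2 = h.T @ d_yhat
db2 = d_yhat.sum(axis=0)
dh = d_yhat @ W2.T
dh_pre = dh * (h_pre > 0)                   # ReLU passes gradient where input > 0
dW1 = X.T @ dh_pre
db1 = dh_pre.sum(axis=0)

# Plain SGD update
lr = 0.1
for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
    p -= lr * g
```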
What I Learned
Backpropagation is beautiful when you implement it manually. Building LSTM cells from scratch in NumPy taught me how gradients flow through time and why they vanish. Implementing the reparameterization trick in VAEs and deriving the ELBO gave me a much deeper understanding of probabilistic modeling.
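For concreteness, here is a minimal sketch of the reparameterization trick under the standard assumption of a diagonal-Gaussian posterior and a standard-normal prior; function names are hypothetical, not taken from the project.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample z ~ N(mu, diag(sigma^2)) as z = mu + sigma * eps, eps ~ N(0, I).

    Sampling directly from the posterior blocks gradients; rewriting the
    sample as a deterministic function of (mu, log_var) plus external noise
    lets gradients flow back into the encoder parameters.
    """
    sigma = np.exp(0.5 * log_var)           # log-variance -> standard deviation
    eps = rng.standard_normal(mu.shape)     # noise independent of the parameters
    return mu + sigma * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(N(mu, sigma^2) || N(0, I)): the regularizer in the ELBO."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

rng = np.random.default_rng(0)
mu, log_var = rng.standard_normal(8), rng.standard_normal(8)
z = reparameterize(mu, log_var, rng)
```

In a β-VAE, the KL term above is simply scaled by β before being added to the reconstruction loss, trading reconstruction fidelity for a more disentangled latent space.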
I also learned that modern architectural ideas like residual connections and attention exist to solve fundamental problems: keeping gradients flowing and preserving information across depth and sequence length.
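A short NumPy sketch of why residual connections help, with illustrative names and shapes (not the project's code): the identity path gives gradients a direct route around each block.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    """y = x + F(x), where F is a small two-layer transform.

    In backprop, dL/dx = (I + dF/dx)^T dL/dy, so even when dF/dx shrinks
    across a deep stack, the identity term keeps the gradient alive.
    """
    return x + W2 @ relu(W1 @ x)

# Toy usage with hypothetical shapes (illustrative only)
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W1 = rng.standard_normal((8, 8)) * 0.1
W2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, W1, W2)
```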
Project
Architectures: FFN, CNN variants (LeNet-5, AlexNet, VGG-11, ResNet), RNN/LSTM, Autoencoders, VAE
Applications: MNIST/CIFAR classification, sequence prediction, language modeling, translation
Citation
@online{prasanna_koppolu,
  author = {Prasanna Koppolu, Bhanu},
  title = {Deep {Learning} from {Scratch}},
  url = {https://bhanuprasanna2001.github.io/projects/dl_from_scratch.html},
  langid = {en}
}