Micrograd is a tiny yet powerful framework that mimics how larger machine learning libraries, such as PyTorch, handle automatic differentiation. It offers an intuitive way to explore the nuances of gradient calculation, showing how computational graphs are built and how gradients propagate back through them. Here, we will explore the key aspects of micrograd using a minimal code example.
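To ground the ideas above, here is a simplified, self-contained sketch of a micrograd-style `Value` class (an illustrative reimplementation, not micrograd's actual source): each arithmetic operation records its inputs, building a computational graph, and `backward()` walks that graph in reverse topological order applying the chain rule.

```python
class Value:
    """A scalar node in a computational graph, in the spirit of micrograd's Value.

    This is a simplified sketch for illustration; it supports only + and *.
    """
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0                 # d(output)/d(this node), filled in by backward()
        self._backward = lambda: None   # local chain-rule step for this node
        self._prev = set(_children)     # parent nodes in the graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply each node's
        # local chain-rule step in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0  # d(output)/d(output) = 1
        for v in reversed(topo):
            v._backward()

# Build a tiny graph: d = a*b + a
a = Value(2.0)
b = Value(-3.0)
d = a * b + a
d.backward()
print(a.grad)  # dd/da = b + 1 = -2.0
print(b.grad)  # dd/db = a     =  2.0
```

Running the snippet computes `d = -4.0` in the forward pass, and `backward()` fills in `a.grad = -2.0` and `b.grad = 2.0`, matching the hand-derived partial derivatives.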