diff --git a/exercises/01_penguin_classification.ipynb b/exercises/01_penguin_classification.ipynb
index 8a6eb99..a80035f 100644
--- a/exercises/01_penguin_classification.ipynb
+++ b/exercises/01_penguin_classification.ipynb
@@ -350,7 +350,7 @@
     "\n",
     "  The `__init__()` method is where we typically define the attributes of a class. In our case, all the \"sub-components\" of our model should be defined here.\n",
     "\n",
-    "  The `forward` method is called when we use the neural network to make a prediction. Another term for \"making a prediction\" is running the forward pass, because information flows forward from the input through the hidden layers to the output. When we compute parameter updates, we run the backward pass by calling the function `loss.backward()`. During the backward pass, information about parameter changes flows backwards, from the output through the hidden layers to the input.\n",
+    "  The `forward` method is called when we use the neural network to make a prediction. Another term for \"making a prediction\" is running the forward pass, because information flows forward from the input through the hidden layers to the output. This builds a computational graph. To compute parameter updates, we run the backward pass by calling the function `loss.backward()`. During the backward pass, `autograd` traverses this graph to compute the gradients, which are then used to update the model's parameters.\n",
     "\n",
     "  The `forward` method is called from the `__call__()` function of `nn.Module`, so that when we run `model(batch)`, the `forward` method is called. \n",
     "- First, we will create quite an ugly network to highlight how to make a neural network in PyTorch on a very basic level.\n",
@@ -360,7 +360,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 2,
+   "execution_count": 14,
    "metadata": {},
    "outputs": [],
    "source": [
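
The reworded markdown cell above describes the forward pass building a computational graph and `autograd` traversing it when `loss.backward()` is called. A minimal sketch of that training step, using a hypothetical stand-in model, optimiser, and dummy data rather than the notebook's actual network, could look like:

```python
import torch
from torch import nn

# Hypothetical stand-in for the notebook's model: two linear layers with ReLU.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)
loss_func = nn.CrossEntropyLoss()

batch = torch.randn(8, 4)            # dummy inputs: 8 samples, 4 features
targets = torch.randint(0, 3, (8,))  # dummy class labels

preds = model(batch)                 # forward pass: nn.Module.__call__ invokes forward(),
                                     # building the computational graph
loss = loss_func(preds, targets)

optimiser.zero_grad()                # clear gradients left over from a previous step
loss.backward()                      # backward pass: autograd traverses the graph and
                                     # fills each parameter's .grad attribute
optimiser.step()                     # use those gradients to update the parameters
```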