Commit 83c0c84

Comment on the softmax function
(cherry picked from commit 29d8b04bd4a62511a58baa74e0e5cc9fc2e6754c)
1 parent 751c500

File tree

1 file changed: +2 additions, −1 deletion

exercises/01_penguin_classification.ipynb

Lines changed: 2 additions & 1 deletion
@@ -449,7 +449,8 @@
 "\n",
 "- Before we jump in and write these loops, we must first choose an activation function to apply to the model's outputs. We chose not to include this in the network itself.\n",
 " - We need to convert our model outputs into something that can be compared to our targets, i.e. [0,0,1]\n",
-" - Here we are going to use the softmax activation function: see [the PyTorch docs](https://pytorch.org/docs/stable/generated/torch.nn.Softmax.html).\n",
+" - \n",
+" - Here we are going to use the softmax activation function: see [the PyTorch docs](https://pytorch.org/docs/stable/generated/torch.nn.Softmax.html). It can be seen as a generalization of the logistic (sigmoid) function to multi-class classification tasks.\n",
 " - For those of you who've studied physics, you may be reminded of the partition function in thermodynamics.\n",
 " - This activation function is good for classification when the result is one of ``A or B or C``.\n",
 " - It's bad if you want to assign two classifications to a single image, say a photo of a dog _and_ a cat.\n",
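To make the softmax comment in the changed cell concrete, here is a minimal, self-contained sketch in plain Python (illustrative only, not part of the notebook). A two-class softmax of scores `[z, 0]` reduces to `sigmoid(z)`, which is the sense in which softmax generalizes the sigmoid to multiple classes:

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability; softmax is
    # invariant to adding a constant to every input.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Probabilities over three mutually exclusive classes sum to 1,
# which is why softmax suits "A or B or C" classification.
probs = softmax([2.0, 1.0, 0.1])

# With two classes, softmax([z, 0])[0] equals sigmoid(z).
two_class = softmax([1.3, 0.0])
```

In the notebook itself this role is played by `torch.nn.Softmax` applied to the model outputs, as the linked PyTorch docs describe; because every output is forced to share one probability budget, softmax is a poor fit when an input can belong to several classes at once.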

0 commit comments
