Commit

Update ch03/02_bonus_efficient-multihead-attention/mha-implementations.ipynb
rasbt authored Oct 24, 2024
1 parent 16121c6 commit ac9dab6
Showing 1 changed file with 1 addition and 1 deletion.
@@ -84,7 +84,7 @@
 "source": [
 "- To run all the code in this notebook, please ensure you update to at least PyTorch 2.5 (FlexAttention is not included in earlier PyTorch releases)\n",
 "- If the code cell above shows a PyTorch version lower than 2.5, you can upgrade your PyTorch installation by uncommenting and running the following code cell (Please note that PyTorch 2.5 requires Python 3.9 or later)\n",
-"- For more specific instructions and CUDA versions, please refer to the official installation guide at https://pytorch.org."
+"- For more specific instructions and CUDA versions, please refer to the official installation guide at https://pytorch.org"
 ]
 },
{
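The notebook cell shown in the diff asks readers to compare their installed PyTorch version against the 2.5 minimum that FlexAttention requires. A minimal sketch of such a version check, using a hypothetical `meets_minimum` helper (not part of the notebook itself):

```python
def meets_minimum(version: str, minimum=(2, 5)) -> bool:
    """Return True if a PyTorch version string is at least `minimum`.

    Hypothetical helper for illustration; in practice one would compare
    torch.__version__ against the required release.
    """
    # Strip any local build suffix such as "+cu121" before parsing
    core = version.split("+")[0]
    # Compare only the (major, minor) components
    parts = tuple(int(p) for p in core.split(".")[:2])
    return parts >= minimum


print(meets_minimum("2.4.1"))        # too old for FlexAttention -> False
print(meets_minimum("2.5.0+cu121"))  # meets the 2.5 minimum -> True
```

In the notebook this comparison would be run against `torch.__version__`, and an upgrade via `pip install --upgrade torch` is the usual remedy when it fails.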
