Understanding PyTorch Buffers (#288)
rasbt authored Jul 26, 2024
1 parent 08040f0 commit deea13e
Showing 4 changed files with 567 additions and 1 deletion.
1 change: 1 addition & 0 deletions README.md
@@ -102,6 +102,7 @@ Several folders contain optional materials as a bonus for interested readers:
   - [Dataloader Intuition with Simple Numbers](ch02/04_bonus_dataloader-intuition)
 - **Chapter 3:**
   - [Comparing Efficient Multi-Head Attention Implementations](ch03/02_bonus_efficient-multihead-attention/mha-implementations.ipynb)
+  - [Understanding PyTorch Buffers](ch03/03_understanding-buffers/understanding-buffers.ipynb)
 - **Chapter 4:**
   - [FLOPS Analysis](ch04/02_performance-analysis/flops-analysis.ipynb)
 - **Chapter 5:**
3 changes: 3 additions & 0 deletions ch03/03_understanding-buffers/README.md
@@ -0,0 +1,3 @@
+# Understanding PyTorch Buffers
+
+- [understanding-buffers.ipynb](understanding-buffers.ipynb) explains the idea behind PyTorch buffers, which are used to implement the causal attention mechanism in Chapter 3
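The buffer idea the notebook covers can be sketched briefly. The snippet below is a minimal illustration (the class name `CausalAttentionMask` is made up for this example, not taken from the book's code): `register_buffer` attaches a non-trainable tensor, such as a causal mask, to a module so that it moves along with `.to(device)` and is saved in the module's `state_dict`, while staying out of `parameters()` and thus out of the optimizer.

```python
import torch
import torch.nn as nn


class CausalAttentionMask(nn.Module):
    """Hypothetical module holding a causal mask as a buffer."""

    def __init__(self, context_length):
        super().__init__()
        # register_buffer stores the mask as module state:
        # it follows .to(device) and appears in state_dict,
        # but it is not a learnable parameter (no gradients).
        mask = torch.triu(torch.ones(context_length, context_length), diagonal=1)
        self.register_buffer("mask", mask)


m = CausalAttentionMask(4)
print(m.mask)                  # upper-triangular mask of future positions
print(list(m.parameters()))    # empty: buffers are not trainable
print("mask" in m.state_dict())
```

Because the mask lives in the buffer rather than a plain attribute, a call like `m.to("cuda")` would move it to the GPU together with the rest of the module, which is the main reason to prefer a buffer over an ordinary tensor attribute here.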