bird-of-paradise committed
Commit bda7411 · verified · 1 Parent(s): 1f4d6e1

adding PyData Global 2025 presentation link

Files changed (1): README.md +31 -10

README.md CHANGED
@@ -9,16 +9,19 @@ This repository provides a detailed guide and implementation of the Transformer
  For implementations of more recent architectural innovations from DeepSeek, see the **Related Implementations** section.
 
  ## Table of Contents
- 1. [Summary and Key Insights](#summary-and-key-insights)
- 2. [Implementation Details](#implementation-details)
-    - [Embedding and Positional Encoding](#embedding-and-positional-encoding)
-    - [Transformer Attention](#transformer-attention)
-    - [Feed-Forward Network](#feed-forward-network)
-    - [Transformer Decoder](#transformer-decoder)
-    - [Encoder-Decoder Stack](#encoder-decoder-stack)
-    - [Full Transformer](#full-transformer)
- 3. [Testing](#testing)
- 4. [Visualizations](#visualizations)
 
  ## Quick Start
  View the complete implementation and tutorial in the [Jupyter notebook](Transformer_Implementation_Tutorial.ipynb).
@@ -216,6 +219,24 @@ These visualizations help understand the inner workings of the transformer and v
 
  For detailed code and interactive examples, please refer to the complete implementation notebook.
 
  ## Related Implementations
 
  This repository is part of a series implementing the key architectural innovations from the DeepSeek paper:
 
9
  For implementions of more recent architectural innovations from DeepSeek, see the **Related Implementations** section.
10
 
11
  ## Table of Contents
12
+ 1. [Summary and Key Insights](## summary-and-key-insights)
13
+ 2. [Implementation Details](# implementation-details)
14
+ - [Embedding and Positional Encoding](### embedding-and-positional-encoding)
15
+ - [Transformer Attention](### transformer-attention)
16
+ - [Feed-Forward Network](### feed-forward-network)
17
+ - [Transformer Decoder](### transformer-decoder)
18
+ - [Encoder-Decoder Stack](### encoder-decoder-stack)
19
+ - [Full Transformer](### full-transformer)
20
+ - [Testing](### testing)
21
+ - [Visualizations](### visualizations)
22
+ 3. [PyData Global 2025 Presentation](## PyData Global 2025 Presentation)
23
+ 4. [Related Implementations](##Related Implementations)
24
+
25
 
26
  ## Quick Start
27
  View the complete implementation and tutorial in the [Jupyter notebook](Transformer_Implementation_Tutorial.ipynb).
 
 
 For detailed code and interactive examples, please refer to the complete implementation notebook.
 
+ ## PyData Global 2025 Presentation
+ For those of you who prefer to learn from videos, you can watch my PyData Global 2025 presentation, "[I Built a Transformer from Scratch So You Don’t Have To](https://www.youtube.com/watch?v=ID5zSzycQBg)". It covers:
+
+ - How the original Transformer architecture works
+ - How to translate each component into PyTorch
+ - Key ideas: attention, masking, positional encoding, FFN
+ - A decoder-only forward pass, step by step
+ - Common implementation bugs, and how to debug them
+ - Where to go next (code, tutorials, training references)
+
 ## Related Implementations
 
 This repository is part of a series implementing the key architectural innovations from the DeepSeek paper:
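
The "attention, masking" bullet in the added presentation section refers to scaled dot-product attention with a causal mask, the mechanism that lets a decoder generate text autoregressively. A minimal NumPy sketch of the idea (illustrative only; the repository's actual implementation is in PyTorch in the linked notebook, and all names here are ours):

```python
import numpy as np

def causal_attention(Q, K, V):
    """Scaled dot-product attention with a causal (lower-triangular) mask.

    Q, K, V: arrays of shape (seq_len, d_k). Position i may only attend
    to positions j <= i, so masked scores are set to -inf before softmax.
    """
    seq_len, d_k = Q.shape
    scores = Q @ K.T / np.sqrt(d_k)                 # (seq_len, seq_len)
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[future] = -np.inf                        # block attention to the future
    # Row-wise softmax; exp(-inf) becomes 0, so future positions get zero weight.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Tiny demo: 4 positions, d_k = 8.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = causal_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because of the mask, the first position can only attend to itself, so its output equals `V[0]` exactly; that property is a quick sanity check when debugging the kind of masking bugs the talk mentions.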