
Commit 0518bee

docs: fix broken link to SelfAttention (#384)
1 parent 773cb27 commit 0518bee

File tree

1 file changed: +1 −1 lines changed

course_UvA-DL/05-transformers-and-MH-attention/MHAttention.py

Lines changed: 1 addition & 1 deletion
@@ -118,7 +118,7 @@
 # * [Attention?
 # Attention!
 # (Lilian Weng, 2018)](https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html) - A nice blog post summarizing attention mechanisms in many domains including vision.
-# * [Illustrated: Self-Attention (Raimi Karim, 2019)](https://towardsdatascience.com/illustrated-self-attention-2d627e33b20a) - A nice visualization of the steps of self-attention.
+# * [Illustrated: Self-Attention (Raimi Karim, 2019)](https://medium.com/data-science/illustrated-self-attention-2d627e33b20a) - A nice visualization of the steps of self-attention.
 # Recommended going through if the explanation below is too abstract for you.
 # * [The Transformer family (Lilian Weng, 2020)](https://lilianweng.github.io/lil-log/2020/04/07/the-transformer-family.html) - A very detailed blog post reviewing more variants of Transformers besides the original one.
