This is how the positional embeddings matrix is constructed in the code:
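Roughly, it does something like this (a NumPy sketch of my reading of the code; the function name and the 10000 frequency base are my paraphrase, not the exact source):

```python
import numpy as np

def concat_sincos(max_len, dim):
    """Positional embeddings laid out as [sin | cos] along the last axis."""
    positions = np.arange(max_len)[:, None]                 # (max_len, 1)
    inv_freq = 1.0 / 10000 ** (np.arange(0, dim, 2) / dim)  # (dim // 2,)
    angles = positions * inv_freq                           # (max_len, dim // 2)
    # Sines fill the first dim // 2 columns, cosines the last dim // 2.
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
```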
This basically creates a matrix of [sin | cos], whereas the implementation in other papers, including the original "Attention Is All You Need", interleaves sin and cos along the embedding dimension (sketched below). Does this have anything to do with the relative positional embedding?
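For comparison, the interleaved layout from the original paper would be something like this (again just a sketch, not code from either implementation):

```python
import numpy as np

def interleaved_sincos(max_len, dim):
    """Positional embeddings with sin on even columns and cos on odd columns."""
    assert dim % 2 == 0, "embedding dimension is assumed to be even"
    positions = np.arange(max_len)[:, None]                 # (max_len, 1)
    inv_freq = 1.0 / 10000 ** (np.arange(0, dim, 2) / dim)  # (dim // 2,)
    angles = positions * inv_freq                           # (max_len, dim // 2)
    pe = np.zeros((max_len, dim))
    pe[:, 0::2] = np.sin(angles)  # even indices: sin(pos / 10000^(2i/d))
    pe[:, 1::2] = np.cos(angles)  # odd indices:  cos(pos / 10000^(2i/d))
    return pe
```

As far as I can tell, the two layouts contain exactly the same values and only differ by a fixed permutation of the embedding columns, so I would have expected them to be interchangeable; hence my question about whether the [sin | cos] layout is specifically tied to the relative positional embedding.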
Thanks!