Slowformer

A fun project that reimplements every layer of a transformer: slow by design, but transparent, interpretable, and visualizable.

Slowformer is a transformer framework designed to prioritize:

  • Weight Transparency: Inspect how the weights at every layer behave, making both training and inference more interpretable.
  • Monitorability: Track weight updates, gradients, and activations with logging and visualization tools (a minimal sketch of the idea follows this list).
  • Deliberate Speed: Crafted for research and experimentation, Slowformer values clarity and explainability over raw computational speed.
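
To make these goals concrete, below is a minimal, hypothetical sketch (plain NumPy, not code from this repository) of the style Slowformer aims for: scaled dot-product attention written step by step, with every intermediate tensor exposed so it can be logged or visualized. All names are illustrative assumptions, not Slowformer's actual API.

```python
import numpy as np

def transparent_attention(Q, K, V, trace=None):
    """Scaled dot-product attention that exposes every intermediate.

    Q, K: arrays of shape (seq_len, d_k); V: array of shape (seq_len, d_v).
    trace: optional dict that collects intermediates for logging/plotting.
    (Illustrative sketch only; names are hypothetical, not Slowformer's API.)
    """
    d_k = Q.shape[-1]

    # 1. Raw similarity between queries and keys.
    scores = Q @ K.T                                        # (seq_len, seq_len)

    # 2. Scale so the softmax stays in a well-behaved range.
    scaled = scores / np.sqrt(d_k)

    # 3. Row-wise softmax -> attention weights.
    scaled = scaled - scaled.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scaled)
    weights = weights / weights.sum(axis=-1, keepdims=True)

    # 4. Weighted sum of the values.
    output = weights @ V                                    # (seq_len, d_v)

    if trace is not None:
        trace.update(scores=scores, scaled=scaled, weights=weights, output=output)
    return output


# Usage: keep every intermediate around for inspection or visualization.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
trace = {}
out = transparent_attention(Q, K, V, trace=trace)
print({name: arr.shape for name, arr in trace.items()})
```

Because the intermediates are kept rather than discarded, a heatmap of `trace["weights"]` shows directly which positions attend to which, which is exactly the kind of visibility the list above describes.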

ToDo List

  • Implement Basic Transformer Model

Reading List

Transformer Architecture
