
Transformer Explainer: Interactive Learning of Text-Generative Models

Transformer Explainer is an interactive visualization tool designed to help anyone learn how Transformer-based models like GPT work. It runs a live GPT-2 model right in your browser, allowing you to experiment with your own text and observe in real time how internal components and operations of the Transformer work together to predict the next tokens.


🚀 Live Demo | 📺 Demo Video
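
At its core, the tool visualizes a single forward pass: a prompt is tokenized, passed through GPT-2, and the model's output probabilities determine the next token. The sketch below shows that prediction step in isolation, using the Transformers.js library (@xenova/transformers) as an assumed, browser-capable runtime; it is an illustrative example only, not how Transformer Explainer itself loads or runs the model.

// Illustrative sketch only: assumes the @xenova/transformers package,
// not necessarily the runtime Transformer Explainer uses internally.
import { pipeline } from '@xenova/transformers';

// Load a GPT-2 text-generation pipeline (downloads model weights on first use).
const generator = await pipeline('text-generation', 'Xenova/gpt2');

// Continue the prompt by a few tokens and print the result.
const output = await generator('Data visualization empowers users to', {
  max_new_tokens: 5,
});
console.log(output[0].generated_text);

Transformer Explainer visualizes what happens inside that single call, showing how the model's internal computations combine to produce the probability distribution over next tokens.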

How to run locally

Prerequisites

  • Node.js 20 or higher
  • npm (included with Node.js)

Steps

git clone https://github.com/poloclub/transformer-explainer.git
cd transformer-explainer
npm install
npm run dev

Then open http://localhost:5173 in your web browser.
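
Port 5173 is Vite's default dev-server port, so the project is assumed to use Vite here; if that port is already taken, or you want to reach the dev server from another device, you can forward flags through npm to the underlying server:

npm run dev -- --port 3000 --host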

Credits

Transformer Explainer was created by Aeree Cho, Grace C. Kim, Alexander Karpekov, Alec Helbling, Jay Wang, Seongmin Lee, Benjamin Hoover, and Polo Chau at the Georgia Institute of Technology.

License

The software is available under the MIT License.

Contact

If you have any questions, feel free to open an issue or contact Aeree Cho or any of the contributors listed above.
