[TL] initial implement flashattention op in TL #202

Merged: 6 commits into microsoft:main on Sep 30, 2024

Conversation

@tzj-fxz (Contributor) commented on Sep 29, 2024:

  • add an initial implementation of flashattention under ops/general_flashatten
  • wrap the implementation in a config class and an op class (see the sketch after this list)
  • fix a kernel name bug
  • modify the tilelang test script
  • add a test script for the flashattention op
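
Below is a minimal sketch of the "config class + op class" wrapping pattern this list describes, with a plain NumPy reference for the attention math. The names `FlashAttenConfig` and `FlashAtten` are illustrative assumptions and are not guaranteed to match the actual classes added under bitblas/ops/general_flashatten; the real op builds and compiles a TileLang kernel rather than calling NumPy.

```python
# Hypothetical sketch of the config/op wrapping pattern; names are assumptions,
# not the actual BitBLAS API.
from dataclasses import dataclass

import numpy as np


@dataclass(frozen=True)
class FlashAttenConfig:
    """Static problem description the op is specialized for."""
    batch: int
    heads: int
    seq_len: int
    dim: int
    dtype: str = "float16"
    is_causal: bool = False


class FlashAtten:
    """Operator wrapper: holds a config and exposes a callable forward()."""

    def __init__(self, config: FlashAttenConfig):
        self.config = config

    def forward(self, q: np.ndarray, k: np.ndarray, v: np.ndarray) -> np.ndarray:
        # q, k, v: [batch, heads, seq_len, dim]
        cfg = self.config
        scale = 1.0 / np.sqrt(cfg.dim)
        scores = np.einsum("bhqd,bhkd->bhqk", q, k).astype(np.float32) * scale
        if cfg.is_causal:
            # Mask out positions where the key index exceeds the query index.
            mask = np.triu(np.ones((cfg.seq_len, cfg.seq_len), dtype=bool), k=1)
            scores = np.where(mask, -np.inf, scores)
        # Numerically stable softmax over the key axis.
        scores -= scores.max(axis=-1, keepdims=True)
        probs = np.exp(scores)
        probs /= probs.sum(axis=-1, keepdims=True)
        return np.einsum("bhqk,bhkd->bhqd", probs, v).astype(q.dtype)


if __name__ == "__main__":
    # Example usage with random inputs.
    cfg = FlashAttenConfig(batch=1, heads=2, seq_len=8, dim=16, is_causal=True)
    rng = np.random.default_rng(0)
    shape = (cfg.batch, cfg.heads, cfg.seq_len, cfg.dim)
    q, k, v = (rng.standard_normal(shape, dtype=np.float32) for _ in range(3))
    out = FlashAtten(cfg).forward(q, k, v)
    print(out.shape)  # (1, 2, 8, 16)
```

Keeping the shape/dtype parameters in an immutable config object lets the op class specialize (and cache) one compiled kernel per configuration, which is the point of the wrapping described in the second bullet.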

@LeiWang1999 (Contributor) left a review comment:

Thanks for your contributions, @tzj-fxz.
I've left some comments.

Review threads (outdated, resolved): bitblas/ops/general_flashatten/__init__.py; bitblas/ops/operator.py (2 threads)
@LeiWang1999 (Contributor) commented:

Thanks @tzj-fxz, overall LGTM; the test still needs to be fixed.

@tzj-fxz (Contributor, Author) commented on Sep 30, 2024:

> Thanks @tzj-fxz, overall LGTM; the test still needs to be fixed.

OK, the CI tests now pass in the latest commit.

@LeiWang1999 merged commit 155a1f1 into microsoft:main on Sep 30, 2024.
6 checks passed
@LeiWang1999 (Contributor) commented:
Merged, thanks @tzj-fxz
