Releases: Dao-AILab/flash-attention

v2.0.6.post1 (14 Aug 17:04)
Use single-thread compilation for cuda12.1, torch2.1 to avoid OOM in CI
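
Context for this fix: building the CUDA 12.1 / torch 2.1 wheels was exhausting memory on the CI runners, presumably because too many parallel nvcc jobs ran at once. Below is a minimal sketch of applying the same cap to a local source build, assuming the MAX_JOBS environment variable that torch.utils.cpp_extension-based builds honor; the version pin is illustrative.

```python
# Minimal sketch, not the project's CI script: cap compilation parallelism
# via MAX_JOBS, which torch.utils.cpp_extension-based builds (including
# flash-attn's setup.py, an assumption here) honor, trading build speed
# for lower peak RAM.
import os
import subprocess

env = dict(os.environ, MAX_JOBS="1")  # one compilation job at a time
subprocess.run(
    ["pip", "install", "flash-attn==2.0.6.post1", "--no-build-isolation"],
    env=env,
    check=True,
)
```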

v2.0.6 (13 Aug 23:53)
Bump to v2.0.6

v1.0.9 (17 Jul 10:19)
Bump to v1.0.9

v1.0.8 (03 Jul 00:11)
Bump to v1.0.8

v1.0.7 (30 May 21:22)
Bump version to 1.0.7

v1.0.6 (27 May 02:55, commit 7c766b1)
Merge pull request #243 from ksivaman/bump_version_to_v1_0_6: bump to v1.0.6

v1.0.5 (12 May 21:24)
Add ninja to pyproject.toml build-system, bump to v1.0.5

v1.0.4 (26 Apr 16:21)
[Docs] Clearer error message for backward pass with head dim d > 64; bump to v1.0.4

v1.0.3.post0 (21 Apr 20:38)
Bump version to v1.0.3.post0

v1.0.3 (21 Apr 19:05)
Bump version to 1.0.3