Releases: OpenProteinAI/flash-attention

v2.6.3-alibi-as-bh-bias-16 (25 Mar 00:52, commit 4ebb8be)
Export minv and maxv for PyTorch 2.6.

v2.6.3-alibi-as-bh-bias-15 (08 Feb 14:39, commit 4bda2db)
alibi.h: use the non-max-seqlen formula, which seems more correct actua…
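
Several releases in this list turn on how the ALiBi bias is aligned to the attention diagonal when the query and key lengths differ. As a point of reference only, below is a minimal plain-PyTorch sketch of a bottom-right-aligned ALiBi bias built from per-(batch, head) slopes; the function name, shapes, and offset convention are illustrative assumptions, not the fork's CUDA implementation in alibi.h.

```python
import torch

def alibi_bias(slopes: torch.Tensor, seqlen_q: int, seqlen_k: int,
               causal: bool = True) -> torch.Tensor:
    """Illustrative (batch, heads, seqlen_q, seqlen_k) ALiBi bias (sketch only).

    slopes: (batch, heads) per-head ALiBi slopes.
    The (seqlen_k - seqlen_q) offset keeps the zero of the bias on the
    diagonal that ends at the last key position, which is roughly what the
    "accounting for max_seqlens" releases below are adjusting.
    """
    rows = torch.arange(seqlen_q).view(-1, 1)      # query positions
    cols = torch.arange(seqlen_k).view(1, -1)      # key positions
    rel = cols - (rows + seqlen_k - seqlen_q)      # 0 on the aligned diagonal
    # Causal: linear penalty to the left of the diagonal (positions to the
    # right are masked out anyway). Non-causal: symmetric penalty on both sides.
    dist = rel if causal else -rel.abs()
    return slopes[:, :, None, None] * dist[None, None].to(slopes.dtype)

# Example with hypothetical slope values:
slopes = torch.tensor([[0.5, 0.25]])               # batch=1, heads=2
bias = alibi_bias(slopes, seqlen_q=3, seqlen_k=5)  # shape (1, 2, 3, 5)
```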

v2.6.3-alibi-as-bh-bias-12 (08 Feb 14:23, commit efbbaf4)
Also modify alibi.h.

v2.6.3-alibi-as-bh-bias-11 (08 Feb 13:32, commit f5ce6ee)
Follow the original expression more closely.

v2.6.3-alibi-as-bh-bias-10 (08 Feb 02:38, commit a645edd)
Diagonal, noncausal: try accounting for max_seqlens too.

v2.6.3-alibi-as-bh-bias-8 (07 Feb 14:47, commit a8d7fcc)
Redo causal and noncausal ALiBi for the diagonal.

v2.6.3-alibi-as-bh-bias-7 (07 Feb 14:30, commit c42bf6b)
…and only compile for py310 and torch24 (Python 3.10, PyTorch 2.4).

v2.6.3-alibi-as-bh-bias-6 (07 Feb 14:13, commit d2dd0fd)
Compile for PyTorch 2.4.1 too.

v2.6.3-alibi-as-bh-bias-5 (05 Feb 23:59, commit f124d98)
Fix the TORCH_CUDA_VERSION env var for PyTorch 2.5.

v2.6.3-alibi-as-bh-bias-4 (05 Feb 23:45, commit 42a895e)
Compile for PyTorch 2.5.1.