Commit 7d00eed
Tag and save Pallas LSE statistics to eliminate attention recomputation entirely
1 parent: b195959
2 files changed: 2 additions & 1 deletion
File tree
- src/maxdiffusion
- kernels/splash_attention
- models/flux/transformers
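The commit message describes caching the log-sum-exp (LSE) statistics produced by the Pallas splash-attention kernel so that later passes can reuse them instead of recomputing attention. The actual kernel change is not shown in this diff view, so the following is only a minimal NumPy sketch of the general technique, with hypothetical function names (`attention_forward`, `probs_from_saved_lse`) that are not taken from the maxdiffusion codebase:

```python
import numpy as np

def attention_forward(q, k, v):
    """Forward attention pass that also returns the per-row LSE statistic.

    Sketch only: real splash-attention kernels compute this blockwise on TPU.
    """
    s = (q @ k.T) / np.sqrt(q.shape[-1])          # scaled scores, (seq_q, seq_k)
    m = s.max(axis=-1, keepdims=True)             # row max for numerical stability
    lse = m + np.log(np.exp(s - m).sum(axis=-1, keepdims=True))  # (seq_q, 1)
    p = np.exp(s - lse)                           # softmax expressed via the LSE
    return p @ v, lse

def probs_from_saved_lse(q, k, lse):
    """Rebuild the softmax probabilities from a cached LSE statistic.

    Because lse is saved from the forward pass, the max/log-sum-exp
    reduction over the key dimension never has to be recomputed.
    """
    s = (q @ k.T) / np.sqrt(q.shape[-1])
    return np.exp(s - lse)
```

The design point is that the LSE is a tiny per-query vector, so saving it alongside the attention output is cheap, while recomputing it would require another full pass over the score matrix.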
Lines changed: 1 addition & 0 deletions

[Diff hunk spanning original lines 1264–1269: one line added at new line 1267; diff content not recoverable from this view.]
Lines changed: 1 addition & 1 deletion

[Diff hunk spanning original lines 495–501: line 498 replaced; diff content not recoverable from this view.]