Conversation

@vancyland
This PR adds a SparseCausalAttention implementation to ./animatediff/models/attention.py.

To use SparseCausalAttention in the self-attention blocks, simply set unet_use_cross_frame_attention = True.

If you want to improve the code further, focus on the class SparseCausalAttention2D(CrossAttention).
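Sparse causal attention (as popularized by Tune-A-Video and used in AnimateDiff-style UNets) lets each frame's tokens attend to keys/values gathered from the first frame and the immediately preceding frame, rather than the frame itself alone. The PR's actual class subclasses CrossAttention in PyTorch; the following is only a minimal NumPy sketch of the key/value gathering pattern, under the assumption of that first+previous frame layout (the function name and shapes here are illustrative, not the PR's API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sparse_causal_attention(x, f):
    """Sketch of sparse causal self-attention over video frames.

    x: hidden states of shape (batch * f, n_tokens, dim), frames stacked
       along the batch axis, as in AnimateDiff-style UNets.
    f: number of frames per video.

    Each frame's queries attend to keys/values concatenated from the
    first frame and the previous frame (frame 0 falls back to itself).
    Projections (to_q/to_k/to_v) and multi-head splitting are omitted.
    """
    bf, n, d = x.shape
    b = bf // f
    x = x.reshape(b, f, n, d)

    first = np.zeros(f, dtype=int)            # frame 0 for every frame
    prev = np.maximum(np.arange(f) - 1, 0)    # previous frame, clamped at 0

    # Gather keys/values: (b, f, 2 * n_tokens, dim)
    kv = np.concatenate([x[:, first], x[:, prev]], axis=2)

    scores = x @ kv.transpose(0, 1, 3, 2) / np.sqrt(d)
    out = softmax(scores, axis=-1) @ kv
    return out.reshape(bf, n, d)
```

In the real module, the same index trick is applied to the key/value tensors before the usual scaled dot-product attention, which is why only the self-attention path (enabled via unet_use_cross_frame_attention) is affected.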

To view the corresponding experimental results, please visit this link.

@vancyland vancyland changed the title Update attention.py Update attention.py by adding SparseCausalAttention implementation Sep 6, 2024
