Pull requests: AlibabaPAI/FlashModels

- Fix torch.load of transformers patch when enabling low_cpu_mem_usage (#10, opened Sep 23, 2024 by lausannel)
- Add Llama3 profile scripts (#9, opened Sep 13, 2024 by lausannel)
- Patch FlashAttention2 for Llama (#3, opened Jun 27, 2024 by Seventeen17)