
Can the model run without flash attention? #40

@bxwldljh

Description


Hello. I'm using a V100 GPU, which does not support flash attention. I tried setting attn_implementation="eager", but I still get an error saying flash-attn is missing. Can the model run without flash attention?
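For context, a minimal sketch of what loading with the eager attention path looks like in Hugging Face Transformers (requires transformers >= 4.36; `model_id` is a placeholder, not this repository's actual checkpoint name):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your/model-id"  # placeholder: replace with the actual checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

# Request the eager (pure-PyTorch) attention implementation so the
# flash-attn CUDA kernels are never dispatched at runtime.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="eager",  # avoid flash-attn / SDPA kernel paths
    torch_dtype=torch.float16,    # V100 (Volta) supports fp16 but not bf16
    trust_remote_code=True,
)
```

If the missing-flash-attn error persists even with this setting, one common cause (an assumption here, not confirmed for this repo) is that the model's custom modeling code does `import flash_attn` unconditionally at module level, so the import fails before `attn_implementation` is ever consulted; in that case the import would need to be guarded in the modeling file.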
