Building flash-attn from source is dog shit; next time, skip the build and pick a prebuilt wheel directly from the releases page:
https://github.com/Dao-AILab/flash-attention/releases
Install the wheel whose name contains FALSE (cxx11abiFALSE), not the TRUE one: the pip-distributed PyTorch binaries are built with the pre-C++11 ABI, so the ABI flag in the wheel name has to match. To check which ABI your torch was built with:
python -c "import torch; print(torch.compiled_with_cxx11_abi())"
False means take the cxx11abiFALSE wheel.
conda activate <env-name>
pip uninstall flash-attn
pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.5/flash_attn-2.5.5+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl --upgrade
Note: FLASH_ATTENTION_FORCE_BUILD=TRUE only matters when pip builds the package from source (it is read by setup.py); installing a .whl directly never runs setup.py, so the variable does nothing here and can be dropped.
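As a sanity check when picking a wheel, the release filename encodes the flash-attn version, CUDA version, torch version, C++11 ABI flag, and the CPython tag. A small illustrative helper (the function name and parameters are my own, not part of flash-attention) that assembles the expected name:

```python
def wheel_name(fa_version, cuda, torch_ver, abi, py_tag):
    # Mirrors the release filename pattern used on the GitHub releases page, e.g.
    # flash_attn-2.5.5+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
    return (f"flash_attn-{fa_version}+cu{cuda}torch{torch_ver}"
            f"cxx11abi{abi}-{py_tag}-{py_tag}-linux_x86_64.whl")

# The wheel used in the install command above:
print(wheel_name("2.5.5", "122", "2.1", "FALSE", "cp310"))
# → flash_attn-2.5.5+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

To find your own CPython tag: python -c "import sys; print(f'cp{sys.version_info.major}{sys.version_info.minor}')"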