Method 1: try pip directly:
Running `pip install flash_attn` may fail with an error:
```
pip install flash-attn -i https://pypi.tuna.tsinghua.edu.cn/simple
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting flash-attn
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/3b/b2/8d76c41ad7974ee264754709c22963447f7f8134613fd9ce80984ed0dab7/flash_attn-2.8.3.tar.gz (8.4 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "/home/ysr/.conda/envs/wan/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 389, in <module>
          main()
        File "/home/ysr/.conda/envs/wan/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 373, in main
          json_out["return_val"] = hook(**hook_input["kwargs"])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/ysr/.conda/envs/wan/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 143, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-7rn47hok/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 331, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=[])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-7rn47hok/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 301, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-7rn47hok/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 512, in run_setup
          super().run_setup(setup_script=setup_script)
        File "/tmp/pip-build-env-7rn47hok/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 317, in run_setup
          exec(code, locals())
        File "<string>", line 22, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed to build 'flash-attn' when getting requirements to build wheel
```

This is a very typical Python packaging/build error, seen especially when installing from source or from a source tarball cached on PyPI:

```
ModuleNotFoundError: No module named 'torch'
```

Error analysis
When pip installs flash-attn, it goes through several steps, one of which is collecting the dependencies needed to build the wheel (Getting requirements to build wheel). flash-attn's setup.py (or equivalent build configuration) imports torch at run time to inspect the environment, determine the CUDA architectures to compile for, and so on. Inside the build environment (pip-build-env-*), the torch module is not found, hence the ModuleNotFoundError.
Why can't the build environment find PyTorch even though it is already installed?
Because pip, by default, runs the build script in an isolated temporary environment (populated from the requirements declared in pyproject.toml) to keep builds clean. The PyTorch in your main environment, although installed, is not copied into that temporary build environment.
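The mechanism can be illustrated in a few lines: flash-attn's setup.py effectively does `import torch` near the top, and whether that succeeds depends entirely on which environment the script runs in. A minimal sketch (the helper name `build_deps_available` is mine, not anything from flash-attn or pip):

```python
import importlib.util

# flash-attn's setup.py effectively does "import torch" at module level;
# the import is resolved against the environment the script runs in.
def build_deps_available(*modules):
    """Return True if every named module is importable in this interpreter."""
    return all(importlib.util.find_spec(m) is not None for m in modules)

# In your main env this kind of check passes once torch is installed;
# inside pip's throwaway build env (pip-build-env-*) torch is absent,
# so the same import fails there with ModuleNotFoundError.
print(build_deps_available("json"))                       # stdlib: True
print(build_deps_available("surely_missing_module_xyz"))  # False
```

This is why the traceback above points into `/tmp/pip-build-env-7rn47hok/...` rather than into your conda environment.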
Solution: disable build isolation
The simplest and most direct fix is to disable pip's build isolation. This lets the build script directly access the torch module already installed in your current virtual environment (wan).
Try installing with the following command:

```shell
pip install flash-attn --no-build-isolation -i https://pypi.tuna.tsinghua.edu.cn/simple
```

--no-build-isolation: disables build isolation so that flash-attn's build script can see the torch module in your current environment.
A second error may then appear:
(Something about CUDA not being found; I lost my record of the exact message. The fix is export CUDA_HOME=xxx, where xxx looks like /some/prefix/path/cuda-12.1.)
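If you are unsure what value CUDA_HOME should take, it is simply the CUDA install prefix, i.e. the directory two levels above the nvcc binary. A tiny sketch of the derivation (the path below is a hypothetical example; on a real machine locate nvcc first with `which nvcc` or Python's `shutil.which("nvcc")`):

```python
import os.path

# CUDA_HOME is the parent of the bin/ directory that contains nvcc.
def cuda_home_from_nvcc(nvcc_path):
    return os.path.dirname(os.path.dirname(nvcc_path))

# Hypothetical example path; substitute the real output of `which nvcc`.
nvcc_path = "/usr/local/cuda-12.1/bin/nvcc"
print("export CUDA_HOME=" + cuda_home_from_nvcc(nvcc_path))
# export CUDA_HOME=/usr/local/cuda-12.1
```

Put the resulting export line in your shell (or ~/.bashrc) before rerunning the pip command.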
Error 3: network problems
The server may be unable to reach the public internet, or there may be some other network issue. If this happens, the best fix is to find the whl matching your versions in the GitHub releases and install it by hand.
```
Building wheel for flash-attn (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for flash-attn (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [22 lines of output]
    No CUDA runtime is found, using CUDA_HOME='/opt/Software/cuda/12.9.1'
    /home/ysr/.conda/envs/wan/lib/python3.11/site-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
    !!

    ********************************************************************************
    Please consider removing the following classifiers in favor of a SPDX license expression:

    License :: OSI Approved :: BSD License

    See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
    ********************************************************************************

    !!
      self._finalize_license_expression()
    torch.__version__ = 2.5.1+cu121
    running bdist_wheel
    Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
    error: Remote end closed connection without response
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Failed to build flash-attn
error: failed-wheel-build-for-install

× Failed to build installable wheels for some pyproject.toml based projects
╰─> flash-attn
```

The error output actually already tells you which whl it guessed for your setup:

Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl

Download it from that URL directly, upload it to the server, and install it manually:

```shell
pip install flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
```
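The wheel filename encodes everything that must match your environment: flash-attn version, CUDA major version, torch major.minor version, C++11 ABI flag, and Python tag. Here is a small helper (my own, not part of flash-attn) that assembles the expected name so you can sanity-check a download before uploading it; on a live machine you would fill the fields from `torch.__version__`, `torch.version.cuda`, `torch._C._GLIBCXX_USE_CXX11_ABI`, and your Python version:

```python
# Assemble the expected flash-attn wheel filename, mirroring the pattern
# the build log guesses:
#   flash_attn-<ver>+cu<major>torch<maj.min>cxx11abi<BOOL>-<pytag>-<pytag>-linux_x86_64.whl
def flash_attn_wheel_name(flash_ver, cuda_major, torch_ver, cxx11abi, py_tag):
    abi = "TRUE" if cxx11abi else "FALSE"
    return (f"flash_attn-{flash_ver}+cu{cuda_major}torch{torch_ver}"
            f"cxx11abi{abi}-{py_tag}-{py_tag}-linux_x86_64.whl")

# Values matching the log above: torch 2.5.1+cu121, Python 3.11, old C++ ABI.
name = flash_attn_wheel_name("2.8.3", 12, "2.5", cxx11abi=False, py_tag="cp311")
print(name)
# flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
```

If the name your environment produces differs from the whl you downloaded (wrong cp tag, wrong torch version, wrong ABI flag), the install will fail or the import will crash, so it is worth checking before transferring the file.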