
ImportError: cannot import name 'LlamaFlashAttention2' from 'transformers.models.llama.modeling_llama'

  • Operating system: Ubuntu 22.04

Error description

When training with LLaMA-Factory via

`python3 my_run_sft.py train ./config.yaml`

the following error sometimes occurs:

Traceback (most recent call last):
  File "/media/dingxin/data/study/GPT/llama/LLaMA-Factory/my_run_sft.py", line 71, in <module>
    from transformers.models.llama.modeling_llama import (
ImportError: cannot import name 'LlamaFlashAttention2' from 'transformers.models.llama.modeling_llama' (/media/dingxin/data/anaconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py)

Solution

Install the pinned transformers version 4.47.1, an older release that still exports the `LlamaFlashAttention2` class (newer releases refactored the attention implementation and removed it):

pip install transformers==4.47.1
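Before relaunching training, it can help to confirm the environment really picked up a release older than the one that dropped the class. The `version_tuple` helper below is a hypothetical sketch (not part of the original post) for comparing `"X.Y.Z"` version strings numerically rather than lexically:

```python
# Hypothetical helper: compare "X.Y.Z" version strings numerically,
# avoiding string-comparison quirks such as "4.9" > "4.10".
def version_tuple(version):
    return tuple(int(part) for part in version.split(".")[:3])

# The pinned release predates 4.48, where the attention refactor landed
# (assumed cutoff; check the transformers release notes for your setup).
assert version_tuple("4.47.1") == (4, 47, 1)
assert version_tuple("4.47.1") < version_tuple("4.48.0")
```

In a real session you would compare `transformers.__version__` against the pinned value instead of the literals used here.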

This resolved the issue.
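If pinning the version is not an option, the importing script can instead be made tolerant of the class being absent. The `safe_import` helper below is a hypothetical sketch of that pattern (it is not from the original post, and the fallback logic would need to match what `my_run_sft.py` actually does with the class):

```python
import importlib


def safe_import(module_name, attr_name):
    """Return `attr_name` from `module_name`, or None if either is missing."""
    try:
        module = importlib.import_module(module_name)
        return getattr(module, attr_name)
    except (ImportError, AttributeError):
        return None


# Usage sketch with the class from the traceback (requires transformers):
# LlamaFlashAttention2 = safe_import(
#     "transformers.models.llama.modeling_llama", "LlamaFlashAttention2"
# )
# if LlamaFlashAttention2 is None:
#     pass  # fall back to the default attention implementation
```

This keeps the script importable across transformers releases, at the cost of needing an explicit fallback path wherever the class is used.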