I had the misfortune of following the setup instructions about 5 hours after the release of transformers v4.35. The instructions say to upgrade to the latest release, so I got the following error:
```
(obsidian) $ python -m llava.serve.controller --host 0.0.0.0 --port 10000
[2023-11-02 21:08:06,589] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "[...]/miniconda3/envs/obsidian/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "[...]/miniconda3/envs/obsidian/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "[...]/Obsidian/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "[...]/Obsidian/llava/model/__init__.py", line 3, in <module>
    from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig
  File "[...]/Obsidian/llava/model/language_model/llava_mpt.py", line 26, in <module>
    from .mpt.modeling_mpt import MPTConfig, MPTForCausalLM, MPTModel
  File "[...]/Obsidian/llava/model/language_model/mpt/modeling_mpt.py", line 19, in <module>
    from .hf_prefixlm_converter import add_bidirectional_mask_if_missing, convert_hf_causal_lm_to_prefix_lm
  File "[...]/Obsidian/llava/model/language_model/mpt/hf_prefixlm_converter.py", line 15, in <module>
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' ([...]/miniconda3/envs/obsidian/lib/python3.10/site-packages/transformers/models/bloom/modeling_bloom.py)
```
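The root cause is that transformers v4.35 removed the private per-model `_expand_mask` helpers (including the one in `modeling_bloom`) that the vendored MPT code imports. A longer-term fix on the Obsidian side could be a compatibility guard in `hf_prefixlm_converter.py`; the sketch below is untested and the vendored fallback body is reconstructed from my reading of the pre-4.35 bloom implementation, so please verify it against the transformers 4.34.0 source before relying on it:

```python
# Possible compatibility guard for hf_prefixlm_converter.py (untested sketch).
# transformers v4.35 removed the per-model _expand_mask helpers, so the old
# import fails; fall back to a vendored copy of bloom's pre-4.35 helper.
import torch

try:
    # Works on transformers <= 4.34
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
except ImportError:
    # Assumed equivalent of the removed helper; check against transformers==4.34.0.
    def _expand_mask_bloom(mask: torch.Tensor, tgt_length: int) -> torch.BoolTensor:
        """Expand a [batch, src_len] padding mask to a [batch, 1, tgt_len, src_len] bool mask."""
        batch_size, src_length = mask.shape
        tgt_length = tgt_length if tgt_length is not None else src_length
        # True marks positions to be masked out, matching the old bloom helper.
        expanded_mask = ~(mask[:, None, None, :].to(torch.bool))
        return expanded_mask.expand(batch_size, 1, tgt_length, src_length)
```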
As an immediate workaround, downgrading to v4.34 (`pip install --upgrade transformers==4.34.0`) works.
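Until the vendored MPT code is updated, it may also be worth pinning the dependency in the project's requirements (e.g. `transformers<4.35`) so new installs don't pull in the breaking release by default.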