
Error raised after entering a question #196

@xdicac

Description

The steps to reproduce are as follows:
D:\programs\lib\JittorLLMs>python cli_demo.py chatglm
[i 0223 16:24:36.015000 72 compiler.py:956] Jittor(1.3.8.5) src: d:\programs\lib\jittorllms\env\lib\site-packages\jittor
[i 0223 16:24:36.083000 72 compiler.py:957] cl at C:\Users\zxd\.cache\jittor\msvc\VC_____\bin\cl.exe(19.29.30133)
[i 0223 16:24:36.083000 72 compiler.py:958] cache_path: C:\Users\zxd\.cache\jittor\jt1.3.8\cl\py3.9.13\Windows-10-10.xf4\11thGenIntelRCxe1\main
[i 0223 16:24:36.173000 72 __init__.py:227] Total mem: 15.80GB, using 5 procs for compiling.
[i 0223 16:24:37.409000 72 jit_compiler.cc:28] Load cc_path: C:\Users\zxd\.cache\jittor\msvc\VC_____\bin\cl.exe
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Loading checkpoint shards: 100%|████████████████████████████████████████████| 8/8 [00:56<00:00, 7.03s/it]
User input: 你是谁 ("Who are you?")
Traceback (most recent call last):
File "D:\programs\lib\JittorLLMs\cli_demo.py", line 9, in <module>
model.chat()
File "D:\programs\lib\JittorLLMs\models\chatglm\__init__.py", line 36, in chat
for response, history in self.model.stream_chat(self.tokenizer, text, history=history):
File "C:\Users\zxd/.cache\huggingface\modules\transformers_modules\local\modeling_chatglm.py", line 1259, in stream_chat
for outputs in self.stream_generate(**input_ids, **gen_kwargs):
File "C:\Users\zxd/.cache\huggingface\modules\transformers_modules\local\modeling_chatglm.py", line 1334, in stream_generate
model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
File "C:\Users\zxd/.cache\huggingface\modules\transformers_modules\local\modeling_chatglm.py", line 1086, in prepare_inputs_for_generation
mask_positions = [seq.index(mask_token) for seq in seqs]
File "C:\Users\zxd/.cache\huggingface\modules\transformers_modules\local\modeling_chatglm.py", line 1086, in <listcomp>
mask_positions = [seq.index(mask_token) for seq in seqs]
ValueError: 150001 is not in list
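The exception comes from `list.index`, which raises `ValueError` when the element is absent: `prepare_inputs_for_generation` expects every tokenized input to contain the mask token (id 150001 per the traceback), but the tokenizer never emitted it, which typically indicates a version mismatch between the downloaded tokenizer files and the model code. A minimal sketch of the failure mode (the token id is taken from the traceback; the sequences and helper name are illustrative, not the actual model code):

```python
# list.index raises ValueError when the element is missing, which is
# exactly what prepare_inputs_for_generation hits when the tokenizer
# never produced the expected mask token id.
MASK_TOKEN = 150001  # ChatGLM's mask token id, per the traceback above

def find_mask_positions(seqs, mask_token=MASK_TOKEN):
    """Return the position of mask_token in each sequence, or None when
    absent, instead of letting list.index raise ValueError."""
    return [seq.index(mask_token) if mask_token in seq else None
            for seq in seqs]

# A sequence containing the mask token resolves normally...
print(find_mask_positions([[5, 150001, 7]]))  # -> [1]
# ...while one without it yields None rather than "150001 is not in list".
print(find_mask_positions([[5, 6, 7]]))       # -> [None]
```

This guard only illustrates why the traceback occurs; the usual remedy is to re-download the tokenizer/model files so that their versions match, rather than suppressing the error.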
