Is there an existing issue for this?

Current Behavior

Calling response, history = model.chat(tokenizer, "你好", history=[]) raises TypeError: 'NoneType' object is not callable. What type is history supposed to be?

Expected Behavior

No response

Steps To Reproduce

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained(r"E:\workspace\QA\MindChat-main\THUDM\chatglm-6b-int4", trust_remote_code=True)
model = AutoModel.from_pretrained(r"E:\workspace\QA\MindChat-main\THUDM\chatglm-6b-int4", trust_remote_code=True).half()
response, history = model.chat(tokenizer, "你好", history=[])
print(response)

Environment

- OS: Windows 10
- Python: 3.10
- Transformers: 4.27.1
- PyTorch: 2.3.1
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): False

Anything else?

No response
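On the question about history: in ChatGLM-6B's published usage examples it is simply a Python list. You pass an empty list on the first turn, and chat() returns an updated list of (query, response) string pairs that you feed back in on the next turn. A minimal sketch based on the upstream usage examples, not on this repository's code:

history = []  # first turn: no prior conversation
response, history = model.chat(tokenizer, "你好", history=history)
# history now holds the running dialogue as (query, response) string pairs,
# e.g. [("你好", response)], and is passed back in on the next turn
response, history = model.chat(tokenizer, "请再介绍一下你自己", history=history)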
How to solve it?
I ran into this as well and solved it. Check whether there is a gcc error earlier in the console output; in my case it was a gcc problem. Installing gcc 5.1.0 (TDM-GCC) and restarting PyCharm fixed it. Download site: https://sourceforge.net/projects/tdm-gcc/files/TDM-GCC%20Installer/
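A sketch of how one might act on that hint before loading the model. The compiler check and the CPU float32 fallback follow the upstream ChatGLM-6B CPU deployment instructions; the model path is taken from the report above, and whether a missing gcc is the cause here is an assumption based on the comment above:

import shutil
import torch
from transformers import AutoTokenizer, AutoModel

# chatglm-6b-int4 compiles its CPU quantization kernel with gcc at load time;
# per the comment above, a missing or broken gcc can leave the model partly
# uninitialized and later calls may fail with "'NoneType' object is not callable".
assert shutil.which("gcc") is not None, "gcc not found on PATH (e.g. install TDM-GCC and restart the IDE)"

model_path = r"E:\workspace\QA\MindChat-main\THUDM\chatglm-6b-int4"
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

if torch.cuda.is_available():
    model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
else:
    # CUDA is unavailable in the reported environment; on CPU the upstream
    # instructions load the model in float32 rather than half precision.
    model = AutoModel.from_pretrained(model_path, trust_remote_code=True).float()

model = model.eval()
response, history = model.chat(tokenizer, "你好", history=[])
print(response)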