(lora) R:\llama\lora>python app.py --data_dir="./data" --base_model='meta-llama/Llama-2-7b-chat-hf'
fatal: not a git repository (or any of the parent directories): .git
Cannot get git commit hash: Command '['git', 'rev-parse', 'HEAD']' returned non-zero exit status 128.
bin R:\llama\lora\lora\lib\site-packages\bitsandbytes\libbitsandbytes_cuda118.dll
GPU compute capability: (8, 6)
GPU total number of SMs: 28
GPU total cores: 3584
GPU total memory: 12884901888 bytes (12288.00 MB) (12.00 GB)
CPU available memory: 52328894464 bytes (49904.72 MB) (48.74 GB)
Will keep 2 offloaded models in CPU RAM.
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:07<00:00, 3.58s/it]
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
Loading base model meta-llama/Llama-2-7b-chat-hf...
Traceback (most recent call last):
  File "R:\llama\lora\llama_lora\ui\finetune\training.py", line 283, in training
    train_output = Global.finetune_train_fn(
  File "R:\llama\lora\llama_lora\lib\finetune.py", line 203, in train
    model = AutoModelForCausalLM.from_pretrained(
  File "R:\llama\lora\lora\lib\site-packages\transformers\models\auto\auto_factory.py", line 566, in from_pretrained
    return model_class.from_pretrained(
  File "R:\llama\lora\lora\lib\site-packages\transformers\modeling_utils.py", line 3236, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'llm_int8_skip_modules'
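A minimal sketch of what this `TypeError` means, using hypothetical stand-in classes rather than transformers itself: `from_pretrained` forwards its keyword arguments to the model's `__init__`, so if the installed transformers release does not recognize `llm_int8_skip_modules` at that level (typically a version mismatch between the app's expected transformers/bitsandbytes versions and the installed ones), the constructor rejects it.

```python
# Hypothetical stand-ins illustrating the failure mode, not the real
# transformers classes.
class OldCausalLM:
    def __init__(self, config):
        # Older-style constructor: no quantization-related parameters.
        self.config = config

def from_pretrained(model_class, config, **model_kwargs):
    # transformers-style factory: unrecognized kwargs reach the constructor,
    # where a class that lacks the parameter raises TypeError.
    return model_class(config, **model_kwargs)

try:
    from_pretrained(OldCausalLM, config={}, llm_int8_skip_modules=["lm_head"])
except TypeError as err:
    print(err)  # ...got an unexpected keyword argument 'llm_int8_skip_modules'
```

In practice this usually points at installing the transformers (and bitsandbytes/peft) versions pinned by the project's requirements; in current transformers releases, `llm_int8_skip_modules` is meant to be carried inside a `BitsAndBytesConfig` passed as `quantization_config` rather than handled as a raw model kwarg.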
Hi. I'm trying to train locally with my RTX 3060 on Windows 10. Can somebody help me with this error?
I think I did these steps to get it working with CUDA.
And this is the error.