
Unable to instantiate model: Model format not supported (no matching implementation found) (type=value_error) #2

Open
Lbaiall opened this issue Mar 31, 2024 · 1 comment

Comments


Lbaiall commented Mar 31, 2024

I have tried switching to every model I have, but it still gets stuck at this same line:
Y:\mick.ai\rag>python rag.py
loading model
loading directory Y:\mick.ai\README.txt
Y:\mick.ai\README.txt
instantiated loader
splitting text and embedding using gpt4all embeddings
load_gguf: gguf_init_from_file failed
magic_match: unsupported model architecture:
Traceback (most recent call last):
  File "Y:\mick.ai\rag\rag.py", line 41, in <module>
    vectorstore = Chroma.from_documents(documents=splits, embedding=GPT4AllEmbeddings())
                                                                    ^^^^^^^^^^^^^^^^^^^
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for GPT4AllEmbeddings
__root__
  Unable to instantiate model: Model format not supported (no matching implementation found) (type=value_error)
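
For anyone hitting the same error: the "magic_match: unsupported model architecture" log above usually means gpt4all found an embedding model file in a format its backend cannot load. Below is a minimal sketch of one possible workaround, assuming a recent langchain-community release where GPT4AllEmbeddings accepts a model_name argument; the keyword, the GGUF file name, and the splits stand-in are assumptions, not the maintainer's fix.

from langchain_community.embeddings import GPT4AllEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_core.documents import Document

# Stand-in for the `splits` list that rag.py builds from README.txt.
splits = [Document(page_content="example chunk of README.txt")]

# Pin the embedding model explicitly instead of relying on the default lookup,
# so gpt4all loads a GGUF file it actually supports.
embeddings = GPT4AllEmbeddings(
    model_name="all-MiniLM-L6-v2.gguf2.f16.gguf",  # assumed: gpt4all's default GGUF embedding model
)

vectorstore = Chroma.from_documents(documents=splits, embedding=embeddings)
print(vectorstore.similarity_search("README", k=1))

If pinning the model name does not help, swapping in a different embedding class (for example HuggingFaceEmbeddings from langchain_community) sidesteps the gpt4all loader entirely.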

ruddythor (Owner) commented

Oh, I'm just now seeing this! I'll look into whether I can fix it. Thanks for the comment!
