
Investigate Consolidating Llama3 and Functionary into just Llama3.1 #55

Open
LachsBagel opened this issue Aug 18, 2024 · 1 comment
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

@LachsBagel
Collaborator

This will allow users to download one set of model parameters instead of two, reducing download and installation time.

LachsBagel added the enhancement (New feature or request) label Aug 18, 2024
@LachsBagel
Collaborator Author

Currently, the agents Docker container runs the functionary model, while the host OS runs the llama3 model directly via Ollama.

Investigate whether llama3.1 8B can replace both of the above models by running on the host OS via Ollama, so that functionary can be removed from the agents container. Ensure the container's agents retain the same functionality they presently have; a quick tool-calling check is sketched below.
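
One way to check the second half of this (whether llama3.1 served by Ollama can take over functionary's tool-calling role) is to hit Ollama's /api/chat endpoint with a tool definition and confirm the model returns structured tool calls. This is a minimal sketch, assuming Ollama >= 0.3.0 (which added tool support) is running on the host with llama3.1:8b pulled; the get_token_price tool is hypothetical and only there to exercise the tool-calling path.

```python
# Minimal sketch: verify llama3.1 tool calling through Ollama's chat API.
# Assumes Ollama (>= 0.3.0) is running on the host at its default port and
# `ollama pull llama3.1:8b` has already been run.
import json
import requests

# Hypothetical tool definition, only used to exercise the tool-calling path
# the agents currently get from functionary.
tools = [{
    "type": "function",
    "function": {
        "name": "get_token_price",
        "description": "Get the current USD price of a crypto token",
        "parameters": {
            "type": "object",
            "properties": {
                "symbol": {"type": "string", "description": "Token symbol, e.g. ETH"}
            },
            "required": ["symbol"],
        },
    },
}]

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1:8b",
        "messages": [{"role": "user", "content": "What is ETH trading at right now?"}],
        "tools": tools,
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
message = resp.json()["message"]

# If llama3.1 can stand in for functionary, we expect structured tool calls
# here rather than a free-text answer.
print(json.dumps(message.get("tool_calls", []), indent=2))
```

Note that if the agents keep running in Docker while Ollama stays on the host, the container would typically reach it at host.docker.internal:11434 (at least on Docker Desktop) rather than localhost.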

LachsBagel added the help wanted (Extra attention is needed) label Aug 18, 2024