✨ feat: Support Cloudflare Workers AI #3402
base: main
Conversation
This commit adds support for Cloudflare as a model provider. It includes changes to the `ModelProvider` enum, the `UserKeyVaults` interface, the `getServerGlobalConfig` function, the `DEFAULT_LLM_CONFIG` constant, the `getLLMConfig` function, the `AgentRuntime` class, and the `DEFAULT_MODEL_PROVIDER_LIST` constant.
@BrandonStudio is attempting to deploy a commit to the LobeHub Pro Team on Vercel. A member of the Team first needs to authorize it.
Thank you for raising your pull request and contributing to our Community.
getModelBeta,
getModelDisplayName,
getModelFunctionCalling,
getModelTokens,
@arvinxx Do I have to export these functions in order to test here?
@arvinxx When would you like to review? It has been a long time.
@BrandonStudio Yes, I will review it this week.
Need to rebase with
💻 变更类型 | Change Type
🔀 变更说明 | Description of Change
✖ Does not support function calling for now.
Only stable models (not beta) are enabled by default.
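Defaulting to stable models only can be sketched as a simple filter over the model cards, keeping entries not flagged as beta. This is a minimal illustration; the `ChatModelCard` shape and the model IDs below are assumptions, not the actual model list.

```typescript
// Illustrative sketch: enable only stable (non-beta) models by default.
interface ChatModelCard {
  id: string;
  beta?: boolean;   // true for beta models, which stay disabled by default
  enabled?: boolean;
}

// Hypothetical model entries (IDs are examples, not the real catalog).
const cloudflareModels: ChatModelCard[] = [
  { id: '@cf/meta/llama-3-8b-instruct', beta: false },
  { id: '@cf/experimental/some-beta-model', beta: true },
];

// Keep only stable models and mark them enabled.
const defaultEnabled = cloudflareModels
  .filter((m) => !m.beta)
  .map((m) => ({ ...m, enabled: true }));

console.log(defaultEnabled.map((m) => m.id));
```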
📝 补充信息 | Additional Information
Based on #2966 by @sxjeru.
i18n translations contributed by @sxjeru and generated with groq/llama-3.1-8b-instant.