Adding Unify AI Support #1446
base: main
Conversation
Codecov Report

Attention: patch coverage is low; only 1 of the 67 new lines is hit (see the diff below).
```diff
@@            Coverage Diff             @@
##             main    #1446      +/-   ##
==========================================
- Coverage   62.59%   62.35%   -0.24%
==========================================
  Files         287      288       +1
  Lines       17589    17656      +67
==========================================
+ Hits        11009    11010       +1
- Misses       6580     6646      +66
```

☔ View full report in Codecov by Sentry.
api_type: "unify" | ||
model: "llama-3-8b-chat@together-ai" # or Get a list of models here: https://docs.unify.ai/python/utils#list-models | ||
base_url: "https://api.unify.ai/v0" | ||
api_key: "Enter your Unify API key here" # or Get your API key from https://console.unify.ai |
Keep a placeholder value such as `YOUR_API_KEY` here.
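For example, the sample config could ship an obvious placeholder instead of prose (a suggested wording, not the PR's current text):

```yaml
api_key: "YOUR_API_KEY"  # get your key from https://console.unify.ai
```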
```python
self.model = self.config.model
self.client = Unify(
    api_key=self.config.api_key,
    endpoint=f"{self.config.model}@{self.config.provider}",
)
```
There is no `provider` field in `LLMConfig`.
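Since the YAML above already encodes the provider in the model string, one option (a sketch, not the PR's code) is to pass `config.model` through unchanged:

```python
self.client = Unify(
    api_key=self.config.api_key,
    endpoint=self.config.model,  # already in "model@provider" form, e.g. "llama-3-8b-chat@together-ai"
)
```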
Suggest adding only an async client.
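A minimal sketch, assuming the `unify` package exposes an `AsyncUnify` client with the same constructor arguments as `Unify`:

```python
from unify import AsyncUnify

def _init_client(self):
    # MetaGPT drives providers through async methods, so one async client suffices.
    self.async_client = AsyncUnify(
        api_key=self.config.api_key,
        endpoint=self.config.model,
    )
```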
```python
def _const_kwargs(self, messages: list[dict], stream: bool = False) -> dict:
    return {
        "messages": messages,
        "max_tokens": self.config.max_token,
```
```python
    return resp
return resp.choices[0].message.content if resp.choices else ""
```

```python
def _update_costs(self, usage: dict):
```
No need to add this; it is already implemented in `BaseLLM`.
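That is, the subclass can simply inherit it (a sketch, assuming the usual `BaseLLM` import path):

```python
from metagpt.provider.base_llm import BaseLLM

class UnifyLLM(BaseLLM):
    # No _update_costs override here: BaseLLM already implements cost tracking,
    # so request handlers can call self._update_costs(usage) directly.
    ...
```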
```python
async def _achat_completion(self, messages: list[dict], timeout=USE_CONFIG_TIMEOUT) -> ChatCompletion:
    try:
        response = await self.async_client.generate(
            messages=messages,
```
`_const_kwargs` is not used here?
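A sketch of wiring it in, assuming the Unify client's `generate` accepts the keyword arguments that `_const_kwargs` builds:

```python
async def _achat_completion(self, messages: list[dict], timeout=USE_CONFIG_TIMEOUT) -> ChatCompletion:
    # Build request parameters in one place so stream and non-stream calls stay consistent.
    kwargs = self._const_kwargs(messages, stream=False)
    response = await self.async_client.generate(**kwargs)
    ...
```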
```python
async def _achat_completion_stream(self, messages: list[dict], timeout=USE_CONFIG_TIMEOUT) -> str:
    try:
        stream = self.client.generate(
```
This should use `async_client`.
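A sketch of the streaming path on the async client, assuming `generate(..., stream=True)` yields text chunks:

```python
async def _achat_completion_stream(self, messages: list[dict], timeout=USE_CONFIG_TIMEOUT) -> str:
    # Use the async client so the event loop is not blocked while streaming.
    stream = await self.async_client.generate(messages=messages, stream=True)
    collected = []
    async for chunk in stream:
        collected.append(chunk)  # each chunk is assumed to be a text fragment
    return "".join(collected)
```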
"finish_reason": "stop", | ||
}], | ||
usage=CompletionUsage( | ||
prompt_tokens=count_message_tokens(messages, self.model), |
The main branch currently has no `count_message_tokens` function. Also, suggest supporting Unify's common models, not only the OpenAI ones.
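One option is a local fallback counter via `tiktoken`, rather than assuming every Unify endpoint has an OpenAI tokenizer; `_count_message_tokens` below is a hypothetical helper, not existing MetaGPT API:

```python
import tiktoken

def _count_message_tokens(self, messages: list[dict]) -> int:
    # Hypothetical fallback: use the model's tokenizer when tiktoken knows it,
    # otherwise a generic encoding, so non-OpenAI Unify models still get counts.
    try:
        encoding = tiktoken.encoding_for_model(self.model)
    except KeyError:
        encoding = tiktoken.get_encoding("cl100k_base")
    return sum(len(encoding.encode(m.get("content", ""))) for m in messages)
```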
```python
def __init__(self, config: LLMConfig):
    self.config = config
    self._init_client()
    self.cost_manager = CostManager(token_costs=OPENAI_TOKEN_COSTS)  # using OpenAI costs, as Unify is compatible
```
What about non-OpenAI models?
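A possible sketch, assuming MetaGPT's `TokenCostManager` (the price-free counter used for other non-OpenAI providers) and its `TOKEN_COSTS` table are importable as below:

```python
from metagpt.utils.cost_manager import CostManager, TokenCostManager
from metagpt.utils.token_counter import TOKEN_COSTS  # OpenAI price table

def _make_cost_manager(model: str):
    # Price only the models we have costs for; otherwise just track token counts.
    if model in TOKEN_COSTS:
        return CostManager(token_costs=TOKEN_COSTS)
    return TokenCostManager()
```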
So is this a component similar to oneapi? I don't see the need to write another class independent of OpenAI. It seems that directly registering an `LLMType` would be enough?
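The suggestion could look roughly like this (a sketch: `LLMType.UNIFY` does not exist yet, and the registry/import paths are assumed from the other providers):

```python
from metagpt.configs.llm_config import LLMType
from metagpt.provider.llm_provider_registry import register_provider
from metagpt.provider.openai_api import OpenAILLM

@register_provider(LLMType.UNIFY)  # hypothetical new enum member
class UnifyLLM(OpenAILLM):
    """Unify's API is OpenAI-compatible, so the OpenAI provider logic can be reused."""
```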