Option to integrate Claude API #548
-
I made a fork and implemented Portkey AI Gateway support, which I ran locally, but then I found litellm and started running that locally instead. Portkey AI Gateway is a bit lighter weight (about 10% the SLOC!), but I think it's also less featureful, and it's definitely not as actively developed. Anyway, litellm works great, except that I'm not sure how to get function calling working, and I had to edit the model_prices_and_context_window.json to support gemini.
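The edit is just adding an entry for the model; mine looks roughly like this (the token prices and context size here are placeholder values, so check the upstream file for the real ones):

```json
{
  "gemini-pro": {
    "max_tokens": 30720,
    "input_cost_per_token": 0.00000025,
    "output_cost_per_token": 0.0000005,
    "litellm_provider": "gemini",
    "mode": "chat"
  }
}
```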
litellm_config.yaml:
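Something along these lines (the model names and API-key env vars are illustrative; point them at whatever providers you actually use):

```yaml
# Route OpenAI-style requests to Anthropic and Gemini through the litellm proxy
model_list:
  - model_name: claude            # name that clients will request
    litellm_params:
      model: anthropic/claude-2.1
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: gemini-pro
    litellm_params:
      model: gemini/gemini-pro
      api_key: os.environ/GEMINI_API_KEY
```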
and how to run litellm:
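Assuming the config above is saved as litellm_config.yaml, starting the proxy is roughly:

```sh
# Start the proxy; it exposes an OpenAI-compatible API on the given port
litellm --config litellm_config.yaml --port 4000
```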
If you don't need gemini, just remove the model_prices stuff. With this setup, I can …
-
Hi,
Since there are a lot of competing LLM APIs that offer similar functionality, do you think it's a good idea to support services like Claude as well?