chore: Adding an example for ab-testing and loadbalance
noble-varghese committed Sep 12, 2023
1 parent 4e30cf3 commit 907f676
Showing 3 changed files with 292 additions and 14 deletions.
31 changes: 17 additions & 14 deletions CHANGELOG.md
````diff
@@ -1,27 +1,30 @@
-# Changelog
+### Changelog
 
-We are excited to announce the **stable release** of the all-new **Portkey Python SDK**, version 0.1.44! This SDK makes it easier than ever to add production capabilities to your existing LLM systems with one line of change to your code.
+All notable changes to this project will be documented in this file. Dates are displayed in UTC.
 
-## Key Features and Enhancements
+Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
 
-- **Stability and Reliability**: This release marks the stable version of Portkey Python SDK, thoroughly tested to ensure reliable performance in your projects.
+#### v0.1.44
 
-- **Ease of Use**: The SDK follows OpenAI SDK footprint, and with one line of change to your existing code, you can add Portkey's production features to your app.
+> 11 September 2023
-- **Community Support**: [Join our growing community](https://discord.gg/QHJ3RgcvKT) of practitioners putting LLMs in production. Share ideas, resolve doubts, and collaborate on projects.
+- feat: added changie to generate changelogs [`#4`](https://github.com/Portkey-AI/portkey-python-sdk/pull/4)
+- feat: version upgrade - 0.1.44 [`#3`](https://github.com/Portkey-AI/portkey-python-sdk/pull/3)
+- feat: Workflow update [`cb80617`](https://github.com/Portkey-AI/portkey-python-sdk/commit/cb806173049d2a1f690935320e5ad4738910a452)
+- fea: Initial Commit [`2c3631a`](https://github.com/Portkey-AI/portkey-python-sdk/commit/2c3631ac65ff58158695e84881993460fd27cb82)
+- feat: adding the streaming capability into rubeus sdk [`f06e23b`](https://github.com/Portkey-AI/portkey-python-sdk/commit/f06e23bfa676995d578f64eff3401db917660742)
 
-## Getting Started
+<!-- auto-changelog-above -->
 
-```py
-pip install portkey-ai
-```
-For comprehensive documentation on Portkey production features, [check out our docs here.](https://docs.portkey.ai/)
+We are excited to announce the **stable release** of the all-new **Portkey Python SDK**, version 0.1.44! This SDK makes it easier than ever to add production capabilities to your existing LLM systems with one line of change to your code.
 
-## Feedback and Contributions
+## Key Features and Enhancements
 
-We welcome your feedback and contributions! Feel free to report issues, suggest enhancements, or submit pull requests on our [GitHub repository](https://github.com/Portkey-AI/portkey-python-sdk).
+- **Stability and Reliability**: This release marks the stable version of Portkey Python SDK, thoroughly tested to ensure reliable performance in your projects.
 
-Thank you for your support and enthusiasm for the Portkey Python SDK. We look forward to seeing the amazing projects you will build with it!
+- **Ease of Use**: The SDK follows the OpenAI SDK footprint, and with one line of change to your existing code, you can add Portkey's production features to your app.
 
+- **Community Support**: [Join our growing community](https://discord.gg/QHJ3RgcvKT) of practitioners putting LLMs in production. Share ideas, resolve doubts, and collaborate on projects.
 
 Happy coding!
````
26 changes: 26 additions & 0 deletions examples/demo.py
```python
import portkey as pk
from portkey import Config, LLMOptions
from getpass import getpass

# Prompt for the Portkey API key instead of hardcoding it.
API_KEY = getpass("Enter your PORTKEY_API_KEY ")

# Setting the API key
pk.api_key = API_KEY

# Fallback mode: providers are tried in the order they are listed.
pk.config = Config(
    mode="fallback",
    llms=[
        LLMOptions(virtual_key="open-ai-key-66a67d", provider="openai"),
        LLMOptions(virtual_key="anthropic-key-351feb", provider="anthropic"),
    ],
)

response = pk.Completions.create(
    model="text-davinci-002",
    prompt="Who are you?"
)

print(response.choices[0].text)
```
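The fallback semantics configured above can be illustrated with a standalone sketch: try each provider in its listed order and return the first successful response. This is only an illustration of the behaviour, not the SDK's internal implementation, and the provider functions below are hypothetical stand-ins:

```python
# Minimal illustration of fallback semantics: providers are tried in
# the order they are listed; the first success wins.
def complete_with_fallback(providers, prompt):
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            errors.append((name, exc))
    raise RuntimeError(f"All providers failed: {errors}")

# Hypothetical providers: the primary fails, so the backup is used.
def flaky_primary(prompt):
    raise TimeoutError("primary provider unavailable")

def backup_stub(prompt):
    return f"echo: {prompt}"

winner, text = complete_with_fallback(
    [("openai", flaky_primary), ("anthropic", backup_stub)],
    "Who are you?",
)
print(winner, "->", text)  # anthropic -> echo: Who are you?
```

This is why the order of `LLMOptions` matters: the first entry is always attempted before any backup is consulted.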
249 changes: 249 additions & 0 deletions examples/fallback.ipynb
# Portkey | Building Resilient LLM Apps

**Portkey** is a full-stack LLMOps platform that productionizes your Gen AI app reliably and securely.

### Key Features of Portkey

1. **AI Gateway**:
   - **Automated Fallbacks & Retries**: Ensure your application remains functional even if a primary service fails.
   - **Load Balancing**: Efficiently distribute incoming requests among multiple models.
   - **Semantic Caching**: Reduce costs and latency by intelligently caching results.

2. **Observability**:
   - **Logging**: Keep track of all requests for monitoring and debugging.
   - **Request Tracing**: Understand the journey of each request for optimization.
   - **Custom Tags**: Segment and categorize requests for better insights.

To harness these features, let's start with the setup:

```python
# Install the Portkey AI Python SDK
!pip install portkey-ai -U
!portkey --version
```

```python
# Import the necessary modules
import portkey as pk
from portkey import Config, LLMOptions
from getpass import getpass
```

#### **Step 1: Get your Portkey API key**

Log into [Portkey here](https://app.portkey.ai/), then click the profile icon at the top right and "Copy API Key". Let's also set the OpenAI & Anthropic API keys.

```python
# Enter the key at the prompt; it is not echoed or stored in the notebook.
API_KEY = getpass("Enter your PORTKEY_API_KEY ")

# Setting the API key
pk.api_key = API_KEY

# NOTE: For a self-hosted deployment, uncomment this line and set your custom URL.
# pk.base_url = ""
```

#### **Step 2: Configure Portkey Features**

To harness the full potential of Portkey, you can configure various features as illustrated above. Here's a guide to all Portkey features and the expected values:

| Feature | Config Key | Value (Type) | Required |
|---------------------|-------------------------|--------------------------------------------------|-------------|
| API Key | `api_key` | `string` | ✅ Required (can be set externally) |
| Mode | `mode` | `fallback`, `ab_test`, `single` | ✅ Required |
| Cache Type | `cache_status` | `simple`, `semantic` | ❔ Optional |
| Force Cache Refresh | `cache_force_refresh` | `boolean` | ❔ Optional |
| Cache Age | `cache_age` | `integer` (in seconds) | ❔ Optional |
| Trace ID | `trace_id` | `string` | ❔ Optional |
| Retries | `retry` | `integer` [0,5] | ❔ Optional |
| Metadata | `metadata` | `json object` [More info](https://docs.portkey.ai/key-features/custom-metadata) | ❔ Optional |
| Base URL | `base_url` | `url` | ❔ Optional |

To set up Portkey for different modes and features, refer to the IPython Notebook examples in the `examples/` directory.

For more information and detailed documentation, please visit the [Portkey Documentation](https://docs.portkey.ai/).

## Example 1: Configuring Portkey for Fallback Mode

In this example, we'll demonstrate how to configure Portkey for Fallback Mode using the SDK. Fallback Mode lets you define a backup strategy for when your primary service is unavailable.

`Note`: The order in which the `LLMOptions` are defined matters for fallbacks: providers are tried in the order listed, so define your fallback strategy in order of preference.

```python
pk.config = Config(
    mode="fallback",
    llms=[
        LLMOptions(model="text-davinci-002", virtual_key="open-ai-key-66a67d", provider="openai"),
        LLMOptions(model="claude-2", virtual_key="anthropic-key-351feb", provider="anthropic", max_tokens=250),
    ],
)
```

```python
# Example 1: Basic completion

response = pk.Completions.create(
    prompt="Who are you?"
)

print(response.choices[0].text)
```

```python
# Example 2: Streaming results

response2 = pk.Completions.create(
    prompt="Translate the following English text to French: 'Hello, how are you?'",
    stream=True  # Stream back partial progress
)

for event in response2:
    if event.choices[0].text:
        print(event.choices[0].text, end="", flush=True)
```

#### **Configuring Portkey for Load Balancing (A/B Test) Mode**

To use Portkey's Load Balancing Mode, follow the steps below. Load Balancing Mode distributes incoming requests across multiple services to ensure high availability and scalability.

`NOTE`: Load balancing is also referred to as A/B testing.

```python
pk.config = Config(
    mode="ab_test",
    llms=[
        LLMOptions(model="text-davinci-002", virtual_key="open-ai-key-66a67d", provider="openai", weight=0.4),
        LLMOptions(model="claude-2", virtual_key="anthropic-key-351feb", provider="anthropic", max_tokens=250, weight=0.6),
    ],
)
```

```python
# Example 1: Basic completion

response = pk.Completions.create(
    prompt="Summarize the key points from the article titled 'The Impact of Climate Change on Global Biodiversity.'"
)

print(response.choices[0].text)
```

```python
# Example 2: Streaming results

response2 = pk.Completions.create(
    prompt="Generate a creative short story about a detective solving a mysterious case.",
    stream=True  # Stream back partial progress
)

for event in response2:
    if event.choices[0].text is None:
        break
    print(event.choices[0].text, end="", flush=True)
```
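The `ab_test` configuration above splits traffic by the `weight` values (0.4 / 0.6 in the example). The routing idea can be sketched as standalone weighted random selection; this is an illustration of the concept, not Portkey's actual gateway code, and the target names are just labels:

```python
import random

# Weighted routing: pick a target in proportion to its weight,
# mirroring the 0.4 / 0.6 split configured above.
targets = [("openai", 0.4), ("anthropic", 0.6)]

def pick_target(rng):
    names = [name for name, _ in targets]
    weights = [w for _, w in targets]
    # random.choices performs weighted sampling with replacement.
    return rng.choices(names, weights=weights, k=1)[0]

# Route 10,000 simulated requests and count where they land.
rng = random.Random(0)
counts = {"openai": 0, "anthropic": 0}
for _ in range(10_000):
    counts[pick_target(rng)] += 1

print(counts)  # roughly a 40/60 split between the two targets
```

Over many requests the observed traffic converges to the configured weights, which is what makes the mode useful for A/B testing two models against live traffic.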
