Establish versioned API path #323

Open · wants to merge 3 commits into base: main
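This PR moves every backend route under a versioned `/api/v1` prefix and updates the API docs, unit tests, frontend fetch calls, and Vite dev proxy to match. For API clients the change is purely a path change; below is a minimal before/after sketch based on the `API.md` example further down (host and port assume the default local deployment):

```python
import requests

# Pre-#323, clients called the unversioned path:
#   requests.post("http://127.0.0.1:8100/assistants", json=payload)
# With this change, the same route lives under the versioned prefix:
payload = {"name": "bar", "config": {"configurable": {}}, "public": True}
requests.post("http://127.0.0.1:8100/api/v1/assistants", json=payload)
```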
22 changes: 11 additions & 11 deletions API.md
@@ -3,7 +3,7 @@
This documentation covers how to get started with the API that backs OpenGPTs.
This allows you to easily integrate it with a different frontend of your choice.

-For full API documentation, see [localhost:8100/docs](localhost:8100/docs) after deployment.
+For full API documentation, see [localhost:8100/api/v1/docs](localhost:8100/api/v1/docs) after deployment.

If you want to see the API docs before deployment, check out the [hosted docs here](https://opengpts-example-vz4y4ooboq-uc.a.run.app/docs).

@@ -17,7 +17,7 @@ This should look something like:

```python
import requests
-requests.post('http://127.0.0.1:8100/assistants', json={
+requests.post('http://127.0.0.1:8100/api/v1/assistants', json={
"name": "bar",
"config": {"configurable": {}},
"public": True
@@ -61,7 +61,7 @@ Notably different from OpenAI's assistant API, we require starting the thread wi

```python
import requests
-requests.post('http://127.0.0.1:8100/threads', cookies= {"opengpts_user_id": "foo"}, json={
+requests.post('http://127.0.0.1:8100/api/v1/threads', cookies= {"opengpts_user_id": "foo"}, json={
"name": "hi",
"assistant_id": "9c7d7e6e-654b-4eaa-b160-f19f922fc63b"
}).content
@@ -82,7 +82,7 @@ We can check the thread, and see that it is currently empty:
```python
import requests
requests.get(
-'http://127.0.0.1:8100/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state',
+'http://127.0.0.1:8100/api/v1/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state',
cookies= {"opengpts_user_id": "foo"}
).content
```
@@ -99,7 +99,7 @@ Let's add a message to the thread!
```python
import requests
requests.post(
-'http://127.0.0.1:8100/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state',
+'http://127.0.0.1:8100/api/v1/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state',
cookies= {"opengpts_user_id": "foo"}, json={
"values": [{
"content": "hi! my name is bob",
@@ -125,7 +125,7 @@ If we now run the command to see the thread, we can see that there is now a mess
```python
import requests
requests.get(
-'http://127.0.0.1:8100/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state',
+'http://127.0.0.1:8100/api/v1/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state',
cookies= {"opengpts_user_id": "foo"}
).content
```
@@ -143,7 +143,7 @@ We can now run the assistant on that thread.

```python
import requests
-requests.post('http://127.0.0.1:8100/runs', cookies= {"opengpts_user_id": "foo"}, json={
+requests.post('http://127.0.0.1:8100/api/v1/runs', cookies= {"opengpts_user_id": "foo"}, json={
"assistant_id": "9c7d7e6e-654b-4eaa-b160-f19f922fc63b",
"thread_id": "231dc7f3-33ee-4040-98fe-27f6e2aa8b2b",
"input": {
@@ -157,7 +157,7 @@ If we now check the thread, we can see (after a bit) that there is a message fro

```python
import requests
-requests.get('http://127.0.0.1:8100/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state', cookies= {"opengpts_user_id": "foo"}).content
+requests.get('http://127.0.0.1:8100/api/v1/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state', cookies= {"opengpts_user_id": "foo"}).content
```
```shell
b'{"values":[{"content":"hi! my name is bob","additional_kwargs":{},"type":"human","example":false},{"content":"Hello, Bob! How can I assist you today?","additional_kwargs":{"agent":{"return_values":{"output":"Hello, Bob! How can I assist you today?"},"log":"Hello, Bob! How can I assist you today?","type":"AgentFinish"}},"type":"ai","example":false}],"next":[]}'
@@ -174,7 +174,7 @@ Continuing the example above, we can run:

```python
import requests
-requests.post('http://127.0.0.1:8100/runs', cookies= {"opengpts_user_id": "foo"}, json={
+requests.post('http://127.0.0.1:8100/api/v1/runs', cookies= {"opengpts_user_id": "foo"}, json={
"assistant_id": "9c7d7e6e-654b-4eaa-b160-f19f922fc63b",
"thread_id": "231dc7f3-33ee-4040-98fe-27f6e2aa8b2b",
"input": {
@@ -190,7 +190,7 @@ Then, if we call the threads endpoint after a bit we can see the human message -

```python
import requests
-requests.get('http://127.0.0.1:8100/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state', cookies= {"opengpts_user_id": "foo"}).content
+requests.get('http://127.0.0.1:8100/api/v1/threads/231dc7f3-33ee-4040-98fe-27f6e2aa8b2b/state', cookies= {"opengpts_user_id": "foo"}).content
```

```shell
@@ -210,7 +210,7 @@ Below is an example of streaming back tokens for a response.
import requests
import json
response = requests.post(
-'http://127.0.0.1:8100/runs/stream',
+'http://127.0.0.1:8100/api/v1/runs/stream',
cookies= {"opengpts_user_id": "foo"}, json={
"assistant_id": "9c7d7e6e-654b-4eaa-b160-f19f922fc63b",
"thread_id": "231dc7f3-33ee-4040-98fe-27f6e2aa8b2b",
6 changes: 3 additions & 3 deletions backend/app/api/__init__.py
@@ -14,16 +14,16 @@ async def ok():

router.include_router(
assistants_router,
prefix="/assistants",
prefix="/api/v1/assistants",
tags=["assistants"],
)
router.include_router(
runs_router,
prefix="/runs",
prefix="/api/v1/runs",
tags=["runs"],
)
router.include_router(
threads_router,
prefix="/threads",
prefix="/api/v1/threads",
tags=["threads"],
)
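A note on the pattern above: the `/api/v1` string is now repeated in each `include_router` call. Here is a hedged sketch of an equivalent FastAPI arrangement that keeps the version prefix in one place; it is an illustration of the alternative, not what this PR implements:

```python
from fastapi import APIRouter

# Stub routers stand in for the real assistants/runs/threads routers,
# which are defined elsewhere in the backend package.
assistants_router = APIRouter()
runs_router = APIRouter()
threads_router = APIRouter()

API_V1_PREFIX = "/api/v1"  # the version string lives in exactly one place

api_v1 = APIRouter(prefix=API_V1_PREFIX)
api_v1.include_router(assistants_router, prefix="/assistants", tags=["assistants"])
api_v1.include_router(runs_router, prefix="/runs", tags=["runs"])
api_v1.include_router(threads_router, prefix="/threads", tags=["threads"])

router = APIRouter()
router.include_router(api_v1)  # exposes /api/v1/assistants, /api/v1/runs, /api/v1/threads
```

Nested prefixes concatenate, so both arrangements register the same paths; the per-router form in the diff is simply more explicit.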
4 changes: 2 additions & 2 deletions backend/app/server.py
@@ -25,7 +25,7 @@
app.include_router(api_router)


@app.post("/ingest", description="Upload files to the given assistant.")
@app.post("/api/v1/ingest", description="Upload files to the given assistant.")
async def ingest_files(
files: list[UploadFile], user: AuthedUser, config: str = Form(...)
) -> None:
@@ -47,7 +47,7 @@ async def ingest_files(
return ingest_runnable.batch([file.file for file in files], config)


@app.get("/health")
@app.get("/api/v1/health")
async def health() -> dict:
return {"status": "ok"}

20 changes: 10 additions & 10 deletions backend/tests/unit_tests/app/test_app.py
@@ -24,7 +24,7 @@ async def test_list_and_create_assistants(pool: asyncpg.pool.Pool) -> None:

async with get_client() as client:
response = await client.get(
"/assistants/",
"/api/v1/assistants/",
headers=headers,
)
assert response.status_code == 200
@@ -33,7 +33,7 @@ async def test_list_and_create_assistants(pool: asyncpg.pool.Pool) -> None:

# Create an assistant
response = await client.put(
f"/assistants/{aid}",
f"/api/v1/assistants/{aid}",
json={"name": "bobby", "config": {}, "public": False},
headers=headers,
)
@@ -47,7 +47,7 @@ async def test_list_and_create_assistants(pool: asyncpg.pool.Pool) -> None:
async with pool.acquire() as conn:
assert len(await conn.fetch("SELECT * FROM assistant;")) == 1

response = await client.get("/assistants/", headers=headers)
response = await client.get("/api/v1/assistants/", headers=headers)
assert [
_project(d, exclude_keys=["updated_at", "user_id"]) for d in response.json()
] == [
@@ -60,7 +60,7 @@ async def test_list_and_create_assistants(pool: asyncpg.pool.Pool) -> None:
]

response = await client.put(
f"/assistants/{aid}",
f"/api/v1/assistants/{aid}",
json={"name": "bobby", "config": {}, "public": False},
headers=headers,
)
@@ -74,7 +74,7 @@

# Check not visible to other users
headers = {"Cookie": "opengpts_user_id=2"}
response = await client.get("/assistants/", headers=headers)
response = await client.get("/api/v1/assistants/", headers=headers)
assert response.status_code == 200, response.text
assert response.json() == []

@@ -87,7 +87,7 @@ async def test_threads() -> None:

async with get_client() as client:
response = await client.put(
f"/assistants/{aid}",
f"/api/v1/assistants/{aid}",
json={
"name": "assistant",
"config": {"configurable": {"type": "chatbot"}},
@@ -97,25 +97,25 @@
)

response = await client.put(
f"/threads/{tid}",
f"/api/v1/threads/{tid}",
json={"name": "bobby", "assistant_id": aid},
headers=headers,
)
assert response.status_code == 200, response.text

response = await client.get(f"/threads/{tid}/state", headers=headers)
response = await client.get(f"/api/v1/threads/{tid}/state", headers=headers)
assert response.status_code == 200
assert response.json() == {"values": None, "next": []}

response = await client.get("/threads/", headers=headers)
response = await client.get("/api/v1/threads/", headers=headers)

assert response.status_code == 200
assert [
_project(d, exclude_keys=["updated_at", "user_id"]) for d in response.json()
] == [{"assistant_id": aid, "name": "bobby", "thread_id": tid}]

response = await client.put(
f"/threads/{tid}",
f"/api/v1/threads/{tid}",
headers={"Cookie": "opengpts_user_id=2"},
)
assert response.status_code == 422
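One check the updated tests do not add, sketched here as a hypothetical follow-up: assuming the unversioned routes were removed outright rather than aliased, the old paths should no longer resolve.

```python
async def test_unversioned_paths_are_gone() -> None:
    # Hypothetical regression test: the pre-/api/v1 path should now 404.
    async with get_client() as client:
        response = await client.get(
            "/assistants/",
            headers={"Cookie": "opengpts_user_id=1"},
        )
        assert response.status_code == 404
```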
2 changes: 1 addition & 1 deletion frontend/src/api/assistants.ts
Original file line number Diff line number Diff line change
Expand Up @@ -4,7 +4,7 @@ export async function getAssistant(
assistantId: string,
): Promise<Config | null> {
try {
-const response = await fetch(`/assistants/${assistantId}`);
+const response = await fetch(`/api/v1/assistants/${assistantId}`);
if (!response.ok) {
return null;
}
2 changes: 1 addition & 1 deletion frontend/src/api/threads.ts
@@ -2,7 +2,7 @@ import { Chat } from "../types";

export async function getThread(threadId: string): Promise<Chat | null> {
try {
-const response = await fetch(`/threads/${threadId}`);
+const response = await fetch(`/api/v1/threads/${threadId}`);
if (!response.ok) {
return null;
}
6 changes: 3 additions & 3 deletions frontend/src/hooks/useChatList.ts
@@ -32,7 +32,7 @@ export function useChatList(): ChatListProps {

useEffect(() => {
async function fetchChats() {
const chats = await fetch("/threads/", {
const chats = await fetch("/api/v1/threads/", {
headers: {
Accept: "application/json",
},
@@ -44,7 +44,7 @@
}, []);

const createChat = useCallback(async (name: string, assistant_id: string) => {
-const response = await fetch(`/threads`, {
+const response = await fetch(`/api/v1/threads`, {
method: "POST",
body: JSON.stringify({ assistant_id, name }),
headers: {
@@ -59,7 +59,7 @@

const deleteChat = useCallback(
async (thread_id: string) => {
-await fetch(`/threads/${thread_id}`, {
+await fetch(`/api/v1/threads/${thread_id}`, {
method: "DELETE",
headers: {
Accept: "application/json",
2 changes: 1 addition & 1 deletion frontend/src/hooks/useChatMessages.ts
@@ -3,7 +3,7 @@ import { Message } from "../types";
import { StreamState, mergeMessagesById } from "./useStreamState";

async function getState(threadId: string) {
-const { values, next } = await fetch(`/threads/${threadId}/state`, {
+const { values, next } = await fetch(`/api/v1/threads/${threadId}/state`, {
headers: {
Accept: "application/json",
},
8 changes: 5 additions & 3 deletions frontend/src/hooks/useConfigList.ts
@@ -51,7 +51,7 @@ export function useConfigList(): ConfigListProps {

useEffect(() => {
async function fetchConfigs() {
const myConfigs = await fetch("/assistants/", {
const myConfigs = await fetch("/api/v1/assistants/", {
headers: {
Accept: "application/json",
},
@@ -73,7 +73,7 @@
assistantId?: string,
): Promise<string> => {
const confResponse = await fetch(
assistantId ? `/assistants/${assistantId}` : "/assistants",
assistantId
? `/api/v1/assistants/${assistantId}`
: "/api/v1/assistants",
{
method: assistantId ? "PUT" : "POST",
body: JSON.stringify({ name, config, public: isPublic }),
@@ -94,7 +96,7 @@
"config",
JSON.stringify({ configurable: { assistant_id } }),
);
-await fetch(`/ingest`, {
+await fetch(`/api/v1/ingest`, {
method: "POST",
body: formData,
});
2 changes: 1 addition & 1 deletion frontend/src/hooks/useMessageEditing.ts
@@ -13,7 +13,7 @@ export function useMessageEditing(
}, []);
const commitEdits = useCallback(async () => {
if (!threadId) return;
-fetch(`/threads/${threadId}/state`, {
+fetch(`/api/v1/threads/${threadId}/state`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ values: Object.values(editing) }),
2 changes: 1 addition & 1 deletion frontend/src/hooks/useSchemas.ts
@@ -36,7 +36,7 @@ export function useSchemas() {

useEffect(() => {
async function save() {
const configSchema = await fetch("/runs/config_schema")
const configSchema = await fetch("/api/v1/runs/config_schema")
.then((r) => r.json())
.then(simplifySchema);
setSchemas({
2 changes: 1 addition & 1 deletion frontend/src/hooks/useStreamState.tsx
@@ -33,7 +33,7 @@ export function useStreamState(): StreamStateProps {
setController(controller);
setCurrent({ status: "inflight", messages: input || [] });

await fetchEventSource("/runs/stream", {
await fetchEventSource("/api/v1/runs/stream", {
signal: controller.signal,
method: "POST",
headers: { "Content-Type": "application/json" },
2 changes: 1 addition & 1 deletion frontend/vite.config.ts
@@ -9,7 +9,7 @@ export default defineConfig({
usePolling: true
},
proxy: {
"^/(assistants|threads|ingest|runs)": {
"^/api/v1/(assistants|threads|ingest|runs)": {
target: process.env.VITE_BACKEND_URL || "http://127.0.0.1:8100",
changeOrigin: true,
rewrite: (path) => path.replace("/____LANGSERVE_BASE_URL", ""),