Commit

Merge branch 'main' into app-template

shhdgit authored Mar 27, 2024
2 parents 9c07d10 + 01ff9e5 commit a22b98c

Showing 38 changed files with 9,979 additions and 4 deletions.
26 changes: 26 additions & 0 deletions .github/workflows/ci-ui.yml
@@ -0,0 +1,26 @@
name: CI UI

on:
  pull_request:
    branches:
      - main

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Use Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20.x"
          cache: "yarn"
          cache-dependency-path: "**/yarn.lock"

      - name: Run lint
        working-directory: ./ui
        run: |
          yarn --frozen-lockfile
          yarn gen:api
          yarn lint
4 changes: 2 additions & 2 deletions blocks/__init__.py
@@ -2,9 +2,9 @@
from .condition import ListCondition, NumberCondition, TextCondition
from .dict import KeySelector
from .input import DictInput, ListInput, TextInput
-from .invoke import AsyncInvoker, Invoke
+from .invoke import AsyncInvoker, Invoke, InvokeWithDict, InvokeWithList
from .list import ConcatList, JoinList
-from .llm import LLMChain
+from .llm import ChatLLMChain, LLMChain
from .output import TextOutput
from .text import ComposeDict, ListParser
from .tools import GoogleSearch
42 changes: 41 additions & 1 deletion blocks/llm.py
@@ -1,5 +1,7 @@
import langchain.chains
-from langchain_core.language_models import BaseLanguageModel
+from langchain.prompts.chat import BaseChatPromptTemplate
+from langchain_core.language_models import BaseChatModel, BaseLanguageModel
+from langchain_core.messages import AIMessage, HumanMessage
from langchain_core.prompts import StringPromptTemplate

from observability import span
@@ -22,3 +24,41 @@ def __init__(
    @span(name="LLM Chain")
    def __call__(self, text: str, **kwargs) -> str:
        return self.chain.predict(text=text, **kwargs)


@block(name="Chat_LLM", kind="llm")
class ChatLLMChain(BaseBlock):
    """
    ChatLLM is a block that uses a chat model to generate a response to a given message.

    Attributes:
        model: BaseChatModel from LangChain.
        prompt_template_type: BaseChatPromptTemplate from LangChain which contains a system_template to use.
    """

    def __init__(
        self, model: BaseChatModel, prompt_template_type: BaseChatPromptTemplate
    ):
        self.chat = model
        self.prompt = prompt_template_type

    @span(name="ChatLLM")
    def __call__(self, messages: list, **kwargs) -> str:
        """
        Generate a response to a given message.

        Args:
            messages: A list of string messages.
            **kwargs: Additional arguments to pass to the prompt template.
        """
        ms = []
        for i, m in enumerate(messages):
            if not isinstance(m, str):
                raise TypeError(f"messages[{i}] must be a string, but got {type(m)}")
            if i % 2 == 0:
                ms.append(HumanMessage(content=m))
            else:
                ms.append(AIMessage(content=m))
        prompt_value = self.prompt.format_prompt(messages=ms, **kwargs)
        response = self.chat.generate_prompt([prompt_value])
        return response.generations[0][0].text
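
For context, a minimal usage sketch of the new `ChatLLMChain` block might look like the following. The `ChatOpenAI` model and the `ChatPromptTemplate`/`MessagesPlaceholder` prompt are illustrative assumptions rather than anything this commit prescribes, and the sketch assumes the `@block` decorator leaves the constructor callable as written:

```python
# Hypothetical usage sketch -- model and prompt choices are assumptions, not part of this commit.
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI  # requires OPENAI_API_KEY in the environment

# Any BaseChatPromptTemplate that accepts a `messages` variable should work here.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        MessagesPlaceholder("messages"),
    ]
)

chain = ChatLLMChain(model=ChatOpenAI(), prompt_template_type=prompt)

# Even-indexed strings become HumanMessage, odd-indexed strings become AIMessage.
reply = chain(["Hi there!", "Hello! How can I help?", "What is LinguFlow?"])
print(reply)
```
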
9 changes: 9 additions & 0 deletions docs/deployment/_category_.json
@@ -0,0 +1,9 @@
{
  "position": 3,
  "label": "Deployment",
  "collapsible": false,
  "collapsed": true,
  "link": {
    "type": "generated-index"
  }
}
40 changes: 40 additions & 0 deletions docs/deployment/local.md
@@ -0,0 +1,40 @@
---
title: Local Deployment
sidebar_label: Local
sidebar_position: 3.1
---

# Local Deployment

Deploy LinguFlow on your local machine using Docker Compose. This setup is well suited to developing and testing LinguFlow applications and to diagnosing integration issues.

**Requirements**: Docker and Docker Compose, both of which are part of [Docker Desktop](https://docs.docker.com/get-docker/) for Mac or Windows users.

## Getting Started

Follow these steps to get LinguFlow up and running on your local environment:

```sh
# Clone the LinguFlow repository
git clone git@github.com:pingcap/LinguFlow.git
# Navigate into the LinguFlow directory
cd LinguFlow

# Start the UI and API server
docker-compose -f docker-compose.dev.yaml up
```

You can now access LinguFlow at http://localhost:5173.

## Updating LinguFlow

To update LinguFlow to the latest version locally, a simple `git pull` is usually sufficient. However, there are two exceptions:

- When the dependencies of `LinguFlow` have been updated (as listed in `requirements.txt`).
- When the database model of `LinguFlow` has been updated (as defined in `model.py`).

In these scenarios, you'll need to rebuild the LinguFlow Docker image by running:

```sh
docker-compose -f docker-compose.dev.yaml build
```
67 changes: 67 additions & 0 deletions docs/deployment/self_host.md
@@ -0,0 +1,67 @@
---
title: Self-Host Deployment
sidebar_label: Self-Host
sidebar_position: 3.2
---

# Self-Host Deployment

LinguFlow Server, encompassing both the API and Web UI, is open-source and can be self-hosted using Docker, offering flexibility for deployment.

## Prerequisites: Database

A database is essential for storing LinguFlow's business data, including applications, versions, and interactions.

LinguFlow is compatible with [several databases](https://docs.sqlalchemy.org/en/20/dialects/index.html#support-levels-for-included-dialects). [TiDB Serverless](https://www.pingcap.com/tidb-serverless/) by PingCAP is recommended. It's a fully-managed, MySQL-compatible database that scales automatically and offers free quotas, making it an excellent choice for small development teams.

Regardless of the database selected, ensure you have the connection string ready once the database is set up.

## Configuring the Database

LinguFlow leverages [alembic](https://alembic.sqlalchemy.org/en/latest/) for automatic table structure creation. However, the database must be manually created first. For TiDB Serverless, execute `create database linguflow` via the Chat2Query page on its web console. For other databases, use the respective database client to connect and execute `create database linguflow`.
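
If you prefer to create the database from Python instead of a SQL client, a minimal sketch along the following lines should work; the host, port, user, and password are placeholders, and the TLS option is only needed for TiDB Serverless:

```python
# Hypothetical helper -- connection details are placeholders, not LinguFlow defaults.
import pymysql

conn = pymysql.connect(
    host="<HOST>",
    port=4000,  # TiDB Serverless default; adjust for your database
    user="<USER>",
    password="<PASSWORD>",
    ssl_ca="/etc/ssl/certs/ca-certificates.crt",  # needed for TiDB Serverless; omit for plain MySQL
)
try:
    with conn.cursor() as cur:
        # Same statement the docs run via Chat2Query or a database client.
        cur.execute("CREATE DATABASE IF NOT EXISTS linguflow")
finally:
    conn.close()
```
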

After creating the database, proceed with the automatic table structure creation:

```sh
# Install alembic
pip install alembic

# Initialize alembic in the LinguFlow directory
cd LinguFlow
alembic init alembic
sed -i '1s|^|import model\n|' alembic/env.py
sed -i "s|target_metadata =.*|target_metadata = model.Base.metadata|" alembic/env.py
sed -i "s|sqlalchemy.url =.*|sqlalchemy.url = <database_url>|" alembic.ini
alembic revision --autogenerate -m "init"
alembic upgrade head
```

**Note**: Replace `<database_url>` with your actual database connection string in SQLAlchemy format. For TiDB Serverless, it resembles:

```
mysql+pymysql://<USER>:<PASSWORD>@<HOST>:<PORT>/linguflow?ssl_ca=/etc/ssl/certs/ca-certificates.crt&ssl_verify_cert=true&ssl_verify_identity=true
```
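
Before wiring this URL into alembic and `docker-compose.yaml`, it can be worth a quick sanity check that SQLAlchemy can actually reach the database with it. A minimal sketch, assuming `sqlalchemy` and `pymysql` are installed locally:

```python
# Sanity-check sketch: verify the connection string before running alembic.
from sqlalchemy import create_engine, text

database_url = "<database_url>"  # paste the full SQLAlchemy URL shown above

engine = create_engine(database_url)
with engine.connect() as conn:
    # Succeeds only if the URL, credentials, and TLS options are all correct.
    print(conn.execute(text("SELECT 1")).scalar())
```
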

## Deploying the Application

Before deployment, edit `docker-compose.yaml` to update the `DATABASE_URL` environment variable with your actual database URL. Then, on the production host:

```sh
docker-compose up -d
```

Access the LinguFlow page at `http://{your-public-ip}`.

## How to Update

To update the application:

```sh
cd LinguFlow
docker-compose down
git pull
docker-compose build --no-cache
# Generate and apply any database schema migrations required by the new version
alembic revision --autogenerate -m "update schema"
alembic upgrade head
docker-compose up
```
9 changes: 9 additions & 0 deletions docs/develop/_category_.json
@@ -0,0 +1,9 @@
{
  "position": 4,
  "label": "Develop",
  "collapsible": false,
  "collapsed": true,
  "link": {
    "type": "generated-index"
  }
}
27 changes: 27 additions & 0 deletions docs/develop/application_and_version.md
@@ -0,0 +1,27 @@
---
title: Application & Version
sidebar_label: Application & Version
sidebar_position: 4.1
---

# Application & Version

In LinguFlow, you can build your own LLM applications, each of which can have multiple versions.

## Application

Create an application and assign it a meaningful name that reflects its business purpose. Each application should correspond to a specific business function, addressing a particular business challenge.

Applications can also [invoke each other](builder/blocks#invoke-category), enabling a modular approach to problem-solving.

### Optional: Enabling Tracing

When creating an application in LinguFlow, you can optionally enable [tracing](../run/tracing) by providing the `LANGFUSE_SECRET_KEY` and `LANGFUSE_PUBLIC_KEY` of a [Langfuse Cloud](https://langfuse.com/) project. Once valid keys are entered, tracing data is transmitted to Langfuse Cloud during both debugging and production use of the application, giving you real-time insight into its performance and behavior for monitoring and debugging.

## Version

Within each application, you can create multiple versions. Assign each version a suitable name and manage them accordingly.

In an application, only one specific version can be designated as the `Published Version`. When you [run an Application](../run/call_an_application), it is this `Published Version` that is actually called.

When it's time to update a version, you can create a new version based on the existing `Published Version`. This new version can then be optimized and [debugged](builder/debugging). Once the new version meets all business requirements, it can be set as the new `Published Version`, ensuring a seamless transition and continuous improvement of your application.
9 changes: 9 additions & 0 deletions docs/develop/builder/_category_.json
@@ -0,0 +1,9 @@
{
  "position": 4.2,
  "label": "Builder",
  "collapsible": false,
  "collapsed": true,
  "link": {
    "type": "generated-index"
  }
}