feat: support multi agent for ts #300

Merged · 32 commits into main from feat/support-multi-agent-for-ts · Sep 26, 2024
Changes shown from the first 17 of 32 commits.

Commits
413593b
feat: update question to ask creating multiagent in express
thucpn Sep 18, 2024
622b84b
feat: add express simple multiagent
thucpn Sep 18, 2024
f464b40
fix: import from agent
thucpn Sep 18, 2024
0ebcb9f
Create yellow-jokes-protect.md
marcusschiesser Sep 19, 2024
f43f00a
create workflow with example agents
thucpn Sep 19, 2024
6c05872
remove unused files
thucpn Sep 19, 2024
2c7a538
update doc
thucpn Sep 19, 2024
5daf519
feat: streaming event
thucpn Sep 19, 2024
b875618
fix: streaming final result
thucpn Sep 19, 2024
b030a3d
fix: pipe final streaming result
thucpn Sep 19, 2024
33ce593
feat: functional calling agent
thucpn Sep 20, 2024
de5ba29
fix: let default max attempt 2
thucpn Sep 20, 2024
aff4f0c
fix lint
thucpn Sep 20, 2024
c4041e2
refactor: move workflow folder to src
thucpn Sep 20, 2024
f659721
refactor: share settings file for ts templates
thucpn Sep 20, 2024
54d74f8
fix: move settings.ts to setting folder
thucpn Sep 20, 2024
d69cd42
refactor: move workflow to components
thucpn Sep 20, 2024
054ee5b
Update templates/components/multiagent/typescript/workflow/index.ts
marcusschiesser Sep 23, 2024
7297edf
create ts multi agent from streaming template
thucpn Sep 23, 2024
3ebc3ec
remove copy express template
thucpn Sep 23, 2024
8cfabc5
enhance streaming and add handle tool call step
thucpn Sep 23, 2024
305296b
update changeset
thucpn Sep 23, 2024
ea3bbcf
refactor: code review
thucpn Sep 25, 2024
325c7ca
fix: coderabbit comment
thucpn Sep 25, 2024
45f7529
enable multiagent ts test
thucpn Sep 25, 2024
234b15e
fix: e2e apptype for nextjs
thucpn Sep 25, 2024
32c3d89
refactor: use context write event instead of append data annotation d…
thucpn Sep 25, 2024
7079b68
fix streaming
marcusschiesser Sep 25, 2024
6ecd5f8
Merge branch 'main' into feat/support-multi-agent-for-ts
marcusschiesser Sep 26, 2024
0679c37
fix: writer is just streaming
marcusschiesser Sep 26, 2024
fa45102
fix: clearly separate streaming events and content and use workflowEv…
marcusschiesser Sep 26, 2024
2fb502e
fix: add correct tool calls for tool messages
marcusschiesser Sep 26, 2024
5 changes: 5 additions & 0 deletions .changeset/yellow-jokes-protect.md
@@ -0,0 +1,5 @@
---
"create-llama": patch
---

Add multi agents template for Express
18 changes: 17 additions & 1 deletion helpers/typescript.ts
@@ -33,7 +33,11 @@ export const installTSTemplate = async ({
* Copy the template files to the target directory.
*/
console.log("\nInitializing project with template:", template, "\n");
const type = template === "multiagent" ? "streaming" : template; // use nextjs streaming template for multiagent
let type = "streaming";
if (template === "multiagent" && framework === "express") {
// use nextjs streaming template as frontend for express and fastapi
type = "multiagent";
}
const templatePath = path.join(templatesDir, "types", type, framework);
const copySource = ["**"];

@@ -124,6 +128,13 @@ export const installTSTemplate = async ({
cwd: path.join(compPath, "vectordbs", "typescript", vectorDb ?? "none"),
});

if (template === "multiagent") {
await copy("**", path.join(root, relativeEngineDestPath, "workflow"), {
parents: true,
cwd: path.join(compPath, "multiagent", "typescript", "workflow"),
});
}

// copy loader component (TS only supports llama_parse and file for now)
const loaderFolder = useLlamaParse ? "llama_parse" : "file";
await copy("**", enginePath, {
@@ -145,6 +156,11 @@ export const installTSTemplate = async ({
cwd: path.join(compPath, "engines", "typescript", engine),
});

// copy settings to engine folder
await copy("**", enginePath, {
cwd: path.join(compPath, "settings", "typescript"),
});

/**
* Copy the selected UI files to the target directory and reference it.
*/
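For reviewers tracing the new template-type logic above, it reduces to a small pure function. A minimal sketch in TypeScript (resolveTemplateType is a hypothetical helper for illustration; the real code inlines this logic in installTSTemplate):

import path from "path";

// Hypothetical helper mirroring the selection logic added in this diff.
function resolveTemplateType(template: string, framework: string): string {
  // All TS templates start from the streaming frontend...
  let type = "streaming";
  // ...except the Express multiagent combination, which has its own type.
  if (template === "multiagent" && framework === "express") {
    type = "multiagent";
  }
  return type;
}

// resolveTemplateType("multiagent", "express") -> "multiagent"
// resolveTemplateType("multiagent", "nextjs")  -> "streaming"
const templatesDir = "templates"; // placeholder; the real value comes from the helpers
const templatePath = path.join(
  templatesDir,
  "types",
  resolveTemplateType("multiagent", "express"),
  "express",
);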
9 changes: 4 additions & 5 deletions questions.ts
@@ -410,10 +410,7 @@ export const askQuestions = async (
return; // early return - no further questions needed for llamapack projects
}

if (program.template === "multiagent") {
// TODO: multi-agents currently only supports FastAPI
program.framework = preferences.framework = "fastapi";
} else if (program.template === "extractor") {
if (program.template === "extractor") {
// Extractor template only supports FastAPI, empty data sources, and llamacloud
// So we just use example file for extractor template, this allows user to choose vector database later
program.dataSources = [EXAMPLE_FILE];
@@ -424,7 +421,9 @@
program.framework = getPrefOrDefault("framework");
} else {
const choices = [
{ title: "NextJS", value: "nextjs" },
...(program.template === "multiagent"
? []
: [{ title: "NextJS", value: "nextjs" }]), // Not supported nextjs for multiagent for now
{ title: "Express", value: "express" },
{ title: "FastAPI (Python)", value: "fastapi" },
];
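The framework prompt now assembles its choices with a conditional spread, dropping NextJS whenever the multiagent template is selected. The same idiom in isolation (the Choice type and frameworkChoices function are assumptions for illustration; the real array is built inline in askQuestions):

type Choice = { title: string; value: string };

function frameworkChoices(template: string): Choice[] {
  return [
    // NextJS is hidden because multiagent only supports Express and FastAPI for now.
    ...(template === "multiagent"
      ? []
      : [{ title: "NextJS", value: "nextjs" }]),
    { title: "Express", value: "express" },
    { title: "FastAPI (Python)", value: "fastapi" },
  ];
}

// frameworkChoices("multiagent") -> Express, FastAPI (Python)
// frameworkChoices("streaming")  -> NextJS, Express, FastAPI (Python)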
62 changes: 62 additions & 0 deletions templates/components/multiagent/typescript/workflow/agents.ts
@@ -0,0 +1,62 @@
import { StreamData } from "ai";
import { ChatMessage, QueryEngineTool } from "llamaindex";
import { getDataSource } from "../engine";
import { FunctionCallingAgent } from "./factory";

const getQueryEngineTool = async () => {
const index = await getDataSource();
if (!index) {
throw new Error("Index not found. Please create an index first.");
}

const topK = process.env.TOP_K ? parseInt(process.env.TOP_K) : undefined;
return new QueryEngineTool({
queryEngine: index.asQueryEngine({
similarityTopK: topK,
}),
metadata: {
name: "query_index",
description: `Use this tool to retrieve information about the text corpus from the index.`,
},
});
};

export const createResearcher = async (
chatHistory: ChatMessage[],
stream: StreamData,
) => {
return new FunctionCallingAgent({
name: "researcher",
tools: [await getQueryEngineTool()],
systemPrompt:
"You are a researcher agent. You are given a researching task. You must use your tools to complete the research.",
chatHistory,
stream,
});
};

export const createWriter = (
chatHistory: ChatMessage[],
stream: StreamData,
) => {
return new FunctionCallingAgent({
name: "writer",
systemPrompt:
"You are an expert in writing blog posts. You are given a task to write a blog post. Don't make up any information yourself.",
chatHistory,
stream,
});
};

export const createReviewer = (
chatHistory: ChatMessage[],
stream: StreamData,
) => {
return new FunctionCallingAgent({
name: "reviewer",
systemPrompt:
"You are an expert in reviewing blog posts. You are given a task to review a blog post. Review the post for logical inconsistencies, ask critical questions, and provide suggestions for improvement. Furthermore, proofread the post for grammar and spelling errors. Only if the post is good enough for publishing, then you MUST return 'The post is good.'. In all other cases return your review.",
chatHistory,
stream,
});
};
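A hedged sketch of how these three factories might be instantiated together (the chatHistory and stream values are placeholders; how the agents hand work to each other is defined by the workflow in factory.ts and index.ts, not here):

import { StreamData } from "ai";
import { ChatMessage } from "llamaindex";
import { createResearcher, createReviewer, createWriter } from "./agents";

async function createAgents(chatHistory: ChatMessage[], stream: StreamData) {
  // createResearcher is async because it must build the query engine tool first.
  const researcher = await createResearcher(chatHistory, stream);
  const writer = createWriter(chatHistory, stream);
  const reviewer = createReviewer(chatHistory, stream);
  return { researcher, writer, reviewer };
}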
152 changes: 152 additions & 0 deletions templates/components/multiagent/typescript/workflow/factory.ts
@@ -0,0 +1,152 @@
import {
Context,
StartEvent,
StopEvent,
Workflow,
WorkflowEvent,
} from "@llamaindex/core/workflow";
import { StreamData } from "ai";
import {
BaseToolWithCall,
CallbackManager,
ChatMemoryBuffer,
ChatMessage,
EngineResponse,
LLM,
OpenAIAgent,
Settings,
} from "llamaindex";
import { AgentInput, FunctionCallingStreamResult } from "./type";

class InputEvent extends WorkflowEvent<{
input: ChatMessage[];
}> {}

export class FunctionCallingAgent extends Workflow {
name: string;
llm: LLM;
memory: ChatMemoryBuffer;
tools: BaseToolWithCall[];
systemPrompt?: string;
writeEvents: boolean;
role?: string;
callbackManager: CallbackManager;
stream: StreamData;

constructor(options: {
name: string;
llm?: LLM;
chatHistory?: ChatMessage[];
tools?: BaseToolWithCall[];
systemPrompt?: string;
writeEvents?: boolean;
role?: string;
verbose?: boolean;
timeout?: number;
stream: StreamData;
}) {
super({
verbose: options?.verbose ?? false,
timeout: options?.timeout ?? 360,
});
this.name = options?.name;
this.llm = options.llm ?? Settings.llm;
this.memory = new ChatMemoryBuffer({
llm: this.llm,
chatHistory: options.chatHistory,
});
this.tools = options?.tools ?? [];
this.systemPrompt = options.systemPrompt;
this.writeEvents = options?.writeEvents ?? true;
this.role = options?.role;
this.callbackManager = this.createCallbackManager();
this.stream = options.stream;

// add steps
this.addStep(StartEvent<AgentInput>, this.prepareChatHistory, {
outputs: InputEvent,
});
this.addStep(InputEvent, this.handleLLMInput, {
outputs: StopEvent,
});
}

private get chatHistory() {
return this.memory.getAllMessages();
}

private async prepareChatHistory(
ctx: Context,
ev: StartEvent<AgentInput>,
): Promise<InputEvent> {
const { message, streaming } = ev.data.input;
ctx.set("streaming", streaming);
this.writeEvent(`Start to work on: ${message}`);
if (this.systemPrompt) {
this.memory.put({ role: "system", content: this.systemPrompt });
}
this.memory.put({ role: "user", content: message });
return new InputEvent({ input: this.chatHistory });
}

private async handleLLMInput(
ctx: Context,
ev: InputEvent,
): Promise<StopEvent<string | ReadableStream<EngineResponse>>> {
const chatEngine = new OpenAIAgent({
tools: this.tools,
systemPrompt: this.systemPrompt,
chatHistory: this.chatHistory,
});

if (!ctx.get("streaming")) {
const response = await Settings.withCallbackManager(
this.callbackManager,
() => {
return chatEngine.chat({
message: ev.data.input.pop()!.content,
});
},
);
this.writeEvent("Finished task");
return new StopEvent({ result: response.message.content.toString() });
}

const response = await Settings.withCallbackManager(
this.callbackManager,
() => {
return chatEngine.chat({
message: ev.data.input.pop()!.content,
stream: true,
});
},
);
ctx.writeEventToStream({ data: new FunctionCallingStreamResult(response) });
return new StopEvent({ result: response });
}

private createCallbackManager() {
const callbackManager = new CallbackManager();
callbackManager.on("llm-tool-call", (event) => {
const { toolCall } = event.detail;
this.writeEvent(
`Calling tool "${toolCall.name}" with input: ${JSON.stringify(toolCall.input)}`,
);
});
callbackManager.on("llm-tool-result", (event) => {
const { toolCall, toolResult } = event.detail;
this.writeEvent(
`Getting result from tool "${toolCall.name}": \n${JSON.stringify(toolResult.output)}`,
);
});
return callbackManager;
}

private writeEvent(msg: string) {
if (!this.writeEvents) return;
this.stream.appendMessageAnnotation({
type: "agent",
data: { agent: this.name, text: msg },
});
}
}
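To see the two-step workflow end to end, a hedged driver sketch (it assumes Workflow.run accepts the StartEvent that prepareChatHistory consumes, per the @llamaindex/core/workflow API this file imports; AgentInput comes from ./type as { message, streaming }):

import { StartEvent } from "@llamaindex/core/workflow";
import { StreamData } from "ai";
import { FunctionCallingAgent } from "./factory";
import { AgentInput } from "./type";

async function runWriterOnce(message: string) {
  const stream = new StreamData();
  const writer = new FunctionCallingAgent({
    name: "writer",
    systemPrompt: "You are an expert in writing blog posts.",
    stream,
  });
  // StartEvent -> prepareChatHistory -> InputEvent -> handleLLMInput -> StopEvent
  const stopEvent = await writer.run(
    new StartEvent<AgentInput>({ input: { message, streaming: false } }),
  );
  // For non-streaming input the StopEvent carries the final message as a string;
  // with streaming: true it would carry a ReadableStream<EngineResponse> instead.
  return stopEvent;
}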