
Commit

Description comments on LLM module.
samchon committed Sep 7, 2024
1 parent a98fe94 commit e296152
Showing 4 changed files with 141 additions and 5 deletions.
2 changes: 1 addition & 1 deletion benchmark/package.json
@@ -72,6 +72,6 @@
"suppress-warnings": "^1.0.2",
"tstl": "^3.0.0",
"uuid": "^9.0.1",
"typia": "../typia-6.10.0-dev.20240906.tgz"
"typia": "../typia-6.10.0-dev.20240907.tgz"
}
}
2 changes: 1 addition & 1 deletion errors/package.json
@@ -32,6 +32,6 @@
"typescript": "^5.3.2"
},
"dependencies": {
"typia": "../typia-6.10.0-dev.20240906.tgz"
"typia": "../typia-6.10.0-dev.20240907.tgz"
}
}
140 changes: 138 additions & 2 deletions src/llm.ts
@@ -1,7 +1,72 @@
import { ILlmApplication, ILlmSchema } from "@samchon/openapi";

export function application(): never;
export function application<T extends object>(): ILlmApplication;
/**
* > You must configure the generic argument `App`.
*
* TypeScript functions to LLM function schemas.
*
* Creates an application of LLM (Large Language Model) function calling schemas from
* a TypeScript class or interface type containing the target functions to be called by
* the LLM function calling feature.
*
* If you deliver the returned {@link ILlmApplication.functions} objects to an LLM provider
* like [OpenAI (ChatGPT)](https://openai.com/), the LLM will automatically select the
* proper function and fill its arguments from the conversation (chat messages)
* with the user (human). This is the concept of LLM function calling.
*
* However, some parameters (or their nested properties) must be composed by a human,
* not by the LLM. File uploads and sensitive information like secret keys (passwords)
* are typical examples. In that case, you can separate the function parameters into
* LLM and human sides by configuring the
* {@link ILlmApplication.IOptions.separate} property.
*
* Additionally, the actual function call is executed by you, not by the LLM.
* When the LLM selects the proper function and fills its arguments, you just call
* the function with those LLM-prepared arguments, and then inform the LLM of the
* return value through a system prompt. The LLM will continue the conversation
* based on the return value.
*
* @template App Target class or interface type collecting the functions to call
* @param options Options for the LLM application construction
* @returns Application of LLM function calling schemas
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export function application(options?: ILlmApplication.IOptions): never;

/**
* TypeScript functions to LLM function schemas.
*
* Creates an application of LLM (Large Language Model) function calling schemas from
* a TypeScript class or interface type containing the target functions to be called by
* the LLM function calling feature.
*
* If you deliver the returned {@link ILlmApplication.functions} objects to an LLM provider
* like [OpenAI (ChatGPT)](https://openai.com/), the LLM will automatically select the
* proper function and fill its arguments from the conversation (chat messages)
* with the user (human). This is the concept of LLM function calling.
*
* However, some parameters (or their nested properties) must be composed by a human,
* not by the LLM. File uploads and sensitive information like secret keys (passwords)
* are typical examples. In that case, you can separate the function parameters into
* LLM and human sides by configuring the
* {@link ILlmApplication.IOptions.separate} property.
*
* Additionally, the actual function call is executed by you, not by the LLM.
* When the LLM selects the proper function and fills its arguments, you just call
* the function with those LLM-prepared arguments, and then inform the LLM of the
* return value through a system prompt. The LLM will continue the conversation
* based on the return value.
*
* @template App Target class or interface type collecting the functions to call
* @param options Options for the LLM application construction
* @returns Application of LLM function calling schemas
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export function application<App extends object>(
options?: ILlmApplication.IOptions,
): ILlmApplication;

/**
* @internal
@@ -10,7 +75,78 @@ export function application(): never {
halt("application");
}
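
The calling convention documented above (the LLM selects a function and composes arguments; your code performs the actual call and reports the result back) can be sketched as follows. Everything here is an illustrative assumption, not part of typia's API: the `LlmCall` shape, the `execute` dispatcher, and the `add` implementation are hypothetical.

```typescript
// Minimal sketch of the function-calling loop, assuming the LLM provider
// has already returned a selected function name plus JSON arguments.
type LlmCall = { name: string; arguments: Record<string, unknown> };

// Hypothetical application-side implementations of the callable functions.
const implementations: Record<string, (args: any) => unknown> = {
  add: (args: { x: number; y: number }) => args.x + args.y,
};

function execute(call: LlmCall): unknown {
  const fn = implementations[call.name];
  if (fn === undefined) throw new Error(`unknown function: ${call.name}`);
  // The call is executed by your code, not by the LLM; the return value
  // would then be reported back to the LLM through a system prompt.
  return fn(call.arguments);
}

// Pretend the LLM selected "add" and filled these arguments:
console.log(execute({ name: "add", arguments: { x: 1, y: 2 } })); // 3
```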

/**
* > You must configure the generic argument `T`.
*
* TypeScript type to LLM type schema.
*
* Creates an LLM (Large Language Model) type schema, the type metadata used in
* [LLM function calling](https://platform.openai.com/docs/guides/function-calling),
* from a TypeScript type.
*
* The returned {@link ILlmSchema} type is similar to the OpenAPI v3.0 based JSON schema
* definition, but simplified for LLM function calling by removing the
* {@link OpenApiV3.IJson.IReference reference} type embodied by the
* {@link OpenApiV3.IJson.IReference.$ref `$ref`} property.
*
* If you actually want to perform the LLM function calling with TypeScript functions,
* you can do it with the {@link application} function. Let's enjoy the
* LLM function calling with native TypeScript functions and types.
*
* > **What is LLM function calling?**
* >
* > The LLM (Large Language Model) selects the proper function and fills its
* > arguments, but the actual function call is executed by you, not by the LLM.
* >
* > Nowadays, most LLM providers like OpenAI support the "function calling"
* > feature, which means the LLM automatically selects a proper function and
* > composes parameter values from the user's chat messages.
* >
* > When the LLM selects the proper function and its arguments, you just call the
* > function with those arguments, and then inform the LLM of the return value
* > through a system prompt. The LLM will continue the conversation based on the
* > return value.
*
* @template T Target type
* @returns LLM schema
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export function schema(): never;

/**
* TypeScript type to LLM type schema.
*
* Creates an LLM (Large Language Model) type schema, the type metadata used in
* [LLM function calling](https://platform.openai.com/docs/guides/function-calling),
* from a TypeScript type.
*
* The returned {@link ILlmSchema} type is similar to the OpenAPI v3.0 based JSON schema
* definition, but simplified for LLM function calling by removing the
* {@link OpenApiV3.IJson.IReference reference} type embodied by the
* {@link OpenApiV3.IJson.IReference.$ref `$ref`} property.
*
* If you actually want to perform the LLM function calling with TypeScript functions,
* you can do it with the {@link application} function. Let's enjoy the
* LLM function calling with native TypeScript functions and types.
*
* > **What is LLM function calling?**
* >
* > The LLM (Large Language Model) selects the proper function and fills its
* > arguments, but the actual function call is executed by you, not by the LLM.
* >
* > Nowadays, most LLM providers like OpenAI support the "function calling"
* > feature, which means the LLM automatically selects a proper function and
* > composes parameter values from the user's chat messages.
* >
* > When the LLM selects the proper function and its arguments, you just call the
* > function with those arguments, and then inform the LLM of the return value
* > through a system prompt. The LLM will continue the conversation based on the
* > return value.
*
* @template T Target type
* @returns LLM schema
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export function schema<T>(): ILlmSchema;

/**
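
To make the `$ref` removal concrete, here is a hand-written sketch of what an {@link ILlmSchema}-style schema for a simple type could look like, with every nested type inlined. The `IMember` type and the literal schema object are illustrative assumptions, not actual typia output.

```typescript
// Hypothetical source type for illustration.
interface IMember {
  name: string;
  age: number;
}

// Hand-written sketch of an LLM-oriented schema for IMember: nested types
// are inlined in place, so no "$ref" reference appears anywhere.
const memberSchema = {
  type: "object",
  properties: {
    name: { type: "string" },
    age: { type: "number" },
  },
  required: ["name", "age"],
};

// Unlike an OpenAPI v3.0 JSON schema with components, nothing is referenced:
console.log(JSON.stringify(memberSchema).includes("$ref")); // false
```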
2 changes: 1 addition & 1 deletion test-esm/package.json
@@ -36,6 +36,6 @@
"typescript": "^5.4.5"
},
"dependencies": {
"typia": "../typia-6.10.0-dev.20240906.tgz"
"typia": "../typia-6.10.0-dev.20240907.tgz"
}
}
