[Idea] Integrating the Assistant API could enable a new plugin development approach: Tools Studio #487
Replies: 3 comments 1 reply
-
LobeChat's official plugin mechanism will not change. The reason is that we want our plugin capability to be decoupled from any specific model provider. We want LobeChat users to be able to easily swap the underlying model service for open-source models such as llama, ChatGLM, or Baichuan. As long as the local model's function-calling capability is up to standard, the plugin pipeline is fully compatible under the current architecture.

If we rebuilt our plugins on the Assistant model, LobeChat's entire product framework would be dragged along by OpenAI's iteration cadence, which is not the direction we want. Moreover, OpenAI's Assistant API depends entirely on OpenAI's servers. It is convenient for developers: no research needed, just call the API and get unlimited context, file retrieval, and the code interpreter out of the box. But be aware that the cost is extremely high: a single thread storing a 1 GB file costs $0.20 per day, and each invocation of the code interpreter is billed separately on top of that. This effectively shifts the cost onto users, and their data ends up stored on OpenAI's servers.

So our plan is to offer the Assistant API to users as an enhancement (possibly as a plugin; to be decided).
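The decoupling argument above rests on one contract: the backend model only needs to understand an OpenAI-style function (tool) definition, so the same plugin works regardless of which model sits behind it. A minimal sketch of that idea (the names and shapes here are illustrative, not LobeChat's actual plugin schema):

```typescript
// A provider-agnostic tool definition in the OpenAI function-calling shape.
// Any backend (GPT, ChatGLM, Baichuan, llama variants with function-call
// support) that understands this JSON-Schema-based format can drive the
// same plugin without changing the plugin itself.
interface ToolFunction {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required?: string[];
  };
}

// A hypothetical plugin definition.
const weatherPlugin: ToolFunction = {
  name: "get_weather",
  description: "Look up the current weather for a city",
  parameters: {
    type: "object",
    properties: {
      city: { type: "string", description: "City name" },
    },
    required: ["city"],
  },
};

// The chat layer only swaps the model identifier; the tool list is unchanged.
function buildRequest(model: string, tools: ToolFunction[]) {
  return {
    model,
    messages: [{ role: "user", content: "What's the weather in Hangzhou?" }],
    tools: tools.map((fn) => ({ type: "function", function: fn })),
  };
}

const req = buildRequest("chatglm3-6b", [weatherPlugin]);
```

Switching providers then means calling `buildRequest("gpt-4", [weatherPlugin])` with an identical tool list, which is why the plugin pipeline stays compatible as long as the model's function calling is up to standard.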
-
I agree with this. Domestic models such as GLM and Baichuan are already on par with OpenAI in some areas, so there is no need to be led by OpenAI's feature set; decoupling is absolutely the right call. I'm currently weighing whether to stay compatible with GLM's function calling or wait for GLM's API to align with OpenAI's… GLM's function-calling deep web search is quite strong.
-
🥰 Feature Description
Related documentation:
https://platform.openai.com/docs/assistants/overview
https://platform.openai.com/docs/assistants/tools/code-interpreter
https://platform.openai.com/docs/assistants/tools/function-calling
https://platform.openai.com/docs/assistants/tools/knowledge-retrieval
https://platform.openai.com/docs/assistants/tools/supported-files
Now that the Assistant API exists, I wonder whether lobe-chat's official plugin mechanism could be changed to use the retrieval approach.
One idea: change the plugin mechanism to create plugins the way official GPTs are created. The Tools Studio could be split out into a separate package, something like @lobe-chat/tools-studios, and the marketplace module would then serve as both a GPTs store and the LobeChat store.
That way, plugins would also be interoperable with official ChatGPT GPTs and could use the official Assistant API, so we wouldn't have to define and standardize plugin development ourselves.
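For context, the Assistants API linked above declares tools as a typed list, where function-style plugin definitions sit alongside OpenAI-hosted tools (`code_interpreter`, `retrieval`). A hypothetical sketch of the payload a Tools Studio built on this model might produce; this is a plain object following the Assistants API beta docs, not a live SDK call, and all names here are illustrative:

```typescript
// Sketch of an Assistants API "create assistant" payload, combining a
// plugin-style function tool with OpenAI-hosted tools. Tool type names
// follow the Assistants API beta documentation.
type AssistantTool =
  | { type: "code_interpreter" }
  | { type: "retrieval" }
  | {
      type: "function";
      function: { name: string; description: string; parameters: object };
    };

interface FunctionDef {
  name: string;
  description: string;
  parameters: object;
}

// Hypothetical helper: wrap plugin function definitions together with the
// hosted tools into one assistant payload.
function buildAssistantPayload(model: string, functions: FunctionDef[]) {
  const tools: AssistantTool[] = [
    { type: "code_interpreter" },
    { type: "retrieval" },
    ...functions.map((fn) => ({ type: "function" as const, function: fn })),
  ];
  return { model, name: "tools-studio-assistant", tools };
}

const payload = buildAssistantPayload("gpt-4-1106-preview", [
  {
    name: "search_docs",
    description: "Search project documentation",
    parameters: { type: "object", properties: { query: { type: "string" } } },
  },
]);
```

Note the trade-off raised in the replies: the `code_interpreter` and `retrieval` entries only work against OpenAI's hosted service, whereas the `function` entry is the part that stays portable across providers.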
🧐 Proposed Solution
As above.
📝 Additional Information
No response