Context
WASM would be a great technology for writing an LLM tool once and using it from both Python and TypeScript.
The problem, however, is that even a simple HTTP fetch is not standardized across WASM runtimes. HTTP support is the minimum requirement for a runtime hosting LLM tools, since tools usually make API calls.
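Because no standard fetch exists, each embedder currently has to expose HTTP to the sandboxed tool as a host function. A minimal sketch of that pattern in plain Python — a dict of imports stands in for the runtime's real host-import mechanism, and the `http_get` name and Wikipedia tool are hypothetical, not part of any WASM spec:

```python
import json
import urllib.request

# Host side: functions the embedder exposes to the sandboxed tool.
# In a real WASM runtime these would be registered as host imports;
# here a plain dict stands in for that mechanism.
def http_get(url: str) -> str:
    """Perform the HTTP request on behalf of the sandboxed tool."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

HOST_IMPORTS = {"http_get": http_get}

# "Guest" side: an LLM tool that can only reach the network
# through the host-provided import, never directly.
def wikipedia_tool(imports: dict, title: str) -> str:
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    return imports["http_get"](url)
```

The point of the sketch: the tool itself stays portable, but every host (Python, TS, ...) must agree on the shape of `http_get` — exactly the contract that is not standardized today.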
Status
LlamaIndexTS can already use WASM tools; see https://github.com/run-llama/LlamaIndexTS/blob/main/packages/wasm-tools/README.md
Options
How can we perform an HTTP fetch from inside a WASM tool?
Extism looks more mature (as of August 2024).
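If Extism is the path, its plugin manifest already has a knob for this: `allowed_hosts` whitelists the domains a plugin may reach through Extism's HTTP support, so the host keeps control over outbound calls. A sketch of such a manifest — the wasm URL is a placeholder, not a real artifact:

```json
{
  "wasm": [
    { "url": "https://example.com/wikipedia_tool.wasm" }
  ],
  "allowed_hosts": ["*.wikipedia.org"]
}
```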
Next step
Try Extism to implement a Wikipedia tool.