The following steps describe how to run OpenSearch and OpenSearch Dashboards with the OpenSearch Assistant and the query generation functionality in the Observability Log Explorer page on your cluster.
- Set up a 2.12+ OpenSearch cluster with OpenSearch Dashboards by following one of the installation options here: https://opensearch.org/docs/latest/install-and-configure/
- Note: If you are using a minimal distribution, the following OpenSearch and OpenSearch Dashboards plugins are required to run the assistant.
    - Required OpenSearch plugins: ML Commons, Flow Framework, Skills, SQL, and Observability
    - Required OpenSearch Dashboards plugins: Dashboard Assistant, Dashboard Observability
- Enable the features with the following settings:
- To enable the chat assistant feature, set `assistant.chat.enabled` to `true` in the `opensearch_dashboards.yml` file, and configure the root agent id by calling the API as follows:

```
PUT /.plugins-ml-config/_doc/os_chat
{
  "type": "os_chat_root_agent",
  "configuration": {
    "agent_id": "your root agent id"
  }
}
```
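To confirm the config document was stored (an optional quick check, assuming the cluster is reachable from the Dev Tools console):

```
GET /.plugins-ml-config/_doc/os_chat
```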
- To enable the query assistant feature, set `observability.query_assist.enabled` to `true` in the `opensearch_dashboards.yml` file, and configure the PPL agent id by calling the API as follows:

```
PUT /.plugins-ml-config/_doc/os_query_assist_ppl
{
  "type": "os_query_assist_ppl_agent",
  "configuration": {
    "agent_id": "your ppl agent id"
  }
}
```

- Optionally, you can also enable the summarization feature for PPL responses by setting `observability.summarize.enabled` to `true` in the `opensearch_dashboards.yml` file, then configure the agent ids:

```
PUT /.plugins-ml-config/_doc/os_query_assist_response_summary
{
  "type": "os_query_assist_response_summary_agent",
  "configuration": {
    "agent_id": "your response summary agent id"
  }
}

PUT /.plugins-ml-config/_doc/os_query_assist_error_summary
{
  "type": "os_query_assist_error_summary_agent",
  "configuration": {
    "agent_id": "your error summary agent id"
  }
}
```
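For reference, with all three features enabled, the relevant section of `opensearch_dashboards.yml` looks like this (a minimal sketch showing only the feature flags named above):

```yaml
assistant.chat.enabled: true
observability.query_assist.enabled: true
observability.summarize.enabled: true
```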
- After OpenSearch and OpenSearch Dashboards are running, set up ML Commons to connect to the LLM.
- Allow ML Commons to run on data nodes:

```
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.only_run_on_ml_node": "false"
  }
}
```
- Add trusted endpoints (reference doc):

```
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^https://runtime\\.sagemaker\\..*[a-z0-9-]\\.amazonaws\\.com/.*$",
      "^https://api\\.openai\\.com/.*$",
      "^https://api\\.cohere\\.ai/.*$",
      "^https://bedrock-runtime\\.[a-z0-9-]+\\.amazonaws\\.com/.*$"
    ]
  }
}
```
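To verify both settings were applied, you can read back the persistent cluster settings (an optional check):

```
GET /_cluster/settings?flat_settings=true
```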
- Call the Flow Framework plugin to set up the cluster for the assistant.
    - See https://github.com/opensearch-project/flow-framework/tree/HEAD/sample-templates for sample templates. For setting up the chat assistant, use the `observability-chat-agent` template; for the query assist feature, use the `query-assist-agent` template.
    - Note that models from other services can be used instead.
    - Note that if using a Bedrock model, IAM credentials need to be passed into the template to connect to Bedrock.
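As a rough sketch of applying one of those templates (the `<workflow_id>` placeholder is illustrative; see the Flow Framework API documentation for details), a workflow can be created and provisioned in one call, then checked for the created agent ids:

```
POST /_plugins/_flow_framework/workflow?provision=true
{ ...contents of the chosen sample template... }

GET /_plugins/_flow_framework/workflow/<workflow_id>/_status
```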
- To create your skill, work backwards from the goal to see how that skill can be achieved by accessing different OpenSearch APIs/functions. For example, a skill that finds the alerts related to a question would need to use the Alerting plugin APIs to get that information.
- To power the skill to get alerts, we must build a tool to search alerts.
- To create a tool, you create it here. This is an example tool that searches alerts.
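As an illustration of how such a tool is wired into an agent (the agent name and description below are hypothetical; `SearchAlertsTool` is one of the tools shipped in the Skills plugin), the tool can be attached to a flow agent registered through ML Commons:

```
POST /_plugins/_ml/agents/_register
{
  "name": "Alerts_Agent",
  "type": "flow",
  "description": "Agent that searches alerts related to a question",
  "tools": [
    {
      "type": "SearchAlertsTool"
    }
  ]
}
```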