
Get offline batch inference details using task API in m #8305

Merged
merged 8 commits on Sep 17, 2024
8 changes: 7 additions & 1 deletion _ml-commons-plugin/api/model-apis/batch-predict.md
@@ -31,7 +31,13 @@

## Prerequisites

Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The optional `action_type` parameter supports canceling the batch job running on OpenAI:
Before using the Batch Predict API, you need to create a connector to the externally hosted model. For every action, specify the `action_type` parameter that describes the action:

Check failure on line 34 in _ml-commons-plugin/api/model-apis/batch-predict.md

GitHub Actions / style-job: [vale] OpenSearch.SpacingPunctuation: There should be no space before and one space after the punctuation mark in 'model. For'.
Collaborator:
There are extra spaces before "For every" and after "specify the". I would say "For each action".


- `batch_predict`: Runs the batch predict operation.
- `batch_predict_status`: Checks the batch predict operation status.
- `cancel_batch_predict`: Cancels the batch predict operation.

For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The `cancel_batch_predict` action is optional and supports canceling the batch job running on OpenAI:

```json
POST /_plugins/_ml/connectors/_create
...
```
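
The full connector request body is collapsed in the diff above. As a rough, illustrative sketch (not the exact body from this PR), a connector that registers all three action types might look as follows; the field values, the `input_file_id` and `id` parameter names, and the exact OpenAI Batch API request bodies shown here are assumptions used only for illustration:

```json
POST /_plugins/_ml/connectors/_create
{
  "name": "OpenAI text-embedding-ada-002 batch connector",
  "description": "Illustrative connector for OpenAI offline batch inference",
  "version": "1",
  "protocol": "http",
  "parameters": {
    "model": "text-embedding-ada-002"
  },
  "credential": {
    "openAI_key": "<your OpenAI API key>"
  },
  "actions": [
    {
      "action_type": "batch_predict",
      "method": "POST",
      "url": "https://api.openai.com/v1/batches",
      "headers": {
        "Authorization": "Bearer ${credential.openAI_key}"
      },
      "request_body": "{ \"input_file_id\": \"${parameters.input_file_id}\", \"endpoint\": \"/v1/embeddings\", \"completion_window\": \"24h\" }"
    },
    {
      "action_type": "batch_predict_status",
      "method": "GET",
      "url": "https://api.openai.com/v1/batches/${parameters.id}",
      "headers": {
        "Authorization": "Bearer ${credential.openAI_key}"
      }
    },
    {
      "action_type": "cancel_batch_predict",
      "method": "POST",
      "url": "https://api.openai.com/v1/batches/${parameters.id}/cancel",
      "headers": {
        "Authorization": "Bearer ${credential.openAI_key}"
      }
    }
  ]
}
```

Once the connector and model are set up, the batch job started by `batch_predict` is tracked as a task; per this PR's title, the offline batch inference details can then be retrieved through the Tasks API (for example, `GET /_plugins/_ml/tasks/<task_id>`, where the task ID is returned by the batch predict call), and the optional `cancel_batch_predict` action supports canceling the job running on OpenAI.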