Get offline batch inference details using task API in m #8305
Conversation
Signed-off-by: Bhavana Ramaram <[email protected]>
Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged. Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a maintainer. When you're ready for doc review, tag the assignee of this PR. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.
Signed-off-by: Fanit Kolchina <[email protected]>
@kolchfa-aws @rbhavna Just a couple minor changes. Thanks!
Co-authored-by: Nathan Bower <[email protected]> Signed-off-by: kolchfa-aws <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>
Thanks, @rbhavna! LGTM
```diff
@@ -31,7 +31,7 @@ POST /_plugins/_ml/models/<model_id>/_batch_predict

 ## Prerequisites

-Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request:
+Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The optional `action_type` paramter supports canceling the batch job running on OpenAI:
```
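To make the discussion above concrete, here is a minimal, hypothetical sketch of what a connector body with batch actions might look like. The endpoint URLs, field names, and values below are illustrative assumptions, not the exact payload from the PR; the merged documentation is authoritative.

```python
# Hypothetical sketch of an ml-commons connector body for an externally
# hosted OpenAI model. All field values are placeholders for illustration.
connector_body = {
    "name": "OpenAI text-embedding-ada-002 batch connector",
    "protocol": "http",
    "parameters": {"model": "text-embedding-ada-002"},
    "actions": [
        {
            # Submits the batch job to the external service.
            "action_type": "batch_predict",
            "method": "POST",
            "url": "https://api.openai.com/v1/batches",
        },
        {
            # Optional: cancels a batch job running on OpenAI.
            "action_type": "cancel_batch_predict",
            "method": "POST",
            "url": "https://api.openai.com/v1/batches/${parameters.id}/cancel",
        },
    ],
}

action_types = [a["action_type"] for a in connector_body["actions"]]
print(action_types)  # ['batch_predict', 'cancel_batch_predict']
```

The point the reviewers converge on is that the cancel action is optional and distinguished from the submit action only by its `action_type` value.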
The optional action_type parameter allows for checking the status or canceling a batch job running on OpenAI.
Can we have something like this:

> Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The `action_type` `cancel_batch_predict` is optional and supports canceling the batch job running on OpenAI:
Signed-off-by: kolchfa-aws <[email protected]>
Signed-off-by: Fanit Kolchina <[email protected]>
LGTM with noted comments.
```diff
@@ -31,7 +31,13 @@ POST /_plugins/_ml/models/<model_id>/_batch_predict

 ## Prerequisites

-Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The optional `action_type` parameter supports canceling the batch job running on OpenAI: 
+Before using the Batch Predict API, you need to create a connector to the externally hosted model. For every action, specify the `action_type` parameter that describes the action:
```
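The revised wording says that each connector action is identified by its `action_type`. As a hedged illustration of that idea, here is a small helper that selects an action from a connector body by type; the connector dict is a made-up example, not the PR's actual payload.

```python
def find_action(connector, action_type):
    """Return the first connector action whose action_type matches, else None."""
    return next(
        (a for a in connector.get("actions", [])
         if a.get("action_type") == action_type),
        None,
    )

# Made-up connector body with one action per action_type.
connector = {
    "actions": [
        {"action_type": "batch_predict", "method": "POST"},
        {"action_type": "cancel_batch_predict", "method": "POST"},
    ]
}

cancel = find_action(connector, "cancel_batch_predict")
print(cancel["method"])  # POST
```

This mirrors how a plugin could dispatch a status check or a cancellation by looking up the matching action rather than hardcoding one endpoint.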
There are extra spaces before "For every" and after "specify the". I would say "For each action".
Signed-off-by: Fanit Kolchina <[email protected]>
…roject#8305)

* get offline batch inference details using task API in ml
* Doc review
* Typo fix
* Apply suggestions from code review
* Update _ml-commons-plugin/api/model-apis/batch-predict.md
* Update _ml-commons-plugin/api/model-apis/batch-predict.md
* Add parameter values
* Extra spaces

Signed-off-by: Bhavana Ramaram <[email protected]>
Signed-off-by: Fanit Kolchina <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>
Co-authored-by: Fanit Kolchina <[email protected]>
Co-authored-by: kolchfa-aws <[email protected]>
Co-authored-by: Nathan Bower <[email protected]>
Signed-off-by: Noah Staveley <[email protected]>
Description
Adds documentation for getting offline batch inference details using the Task API in ml-commons.
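The documented flow (submit a batch job, then look up its details with the Task API) can be sketched roughly as follows. The endpoint path matches the ml-commons Tasks API (`GET /_plugins/_ml/tasks/<task_id>`), but the response shape and state names below are illustrative assumptions; the documentation added by this PR is authoritative.

```python
# Sketch of checking the ml-commons Task API for offline batch job details.
task_id = "example_task_id"  # placeholder; returned by the batch predict call
path = f"/_plugins/_ml/tasks/{task_id}"

sample_response = {  # hypothetical response for a finished batch job
    "task_type": "BATCH_PREDICTION",
    "state": "COMPLETED",
    "task_id": task_id,
}

def is_terminal(task):
    # Treat these states as terminal; the exact state names are assumptions.
    return task.get("state") in {"COMPLETED", "FAILED", "CANCELLED"}

print(path)                          # /_plugins/_ml/tasks/example_task_id
print(is_terminal(sample_response))  # True
```

A caller would poll this path until the task reaches a terminal state, then fetch the batch results from wherever the connector's external service wrote them.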
Issues Resolved
closes #8210
Version
2.17