
Get offline batch inference details using task API in m #8305

Merged
merged 8 commits into from
Sep 17, 2024

Conversation

@rbhavna (Contributor) commented Sep 17, 2024

Description

Add documentation for getting offline batch inference details using task API in ml-commons
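As a sketch of what this PR documents: batch inference jobs in ml-commons are tracked as tasks, so details can be retrieved through the Task API (`GET /_plugins/_ml/tasks/<task_id>`). The helper below is illustrative only, and the response fields shown are placeholders, not an exact server payload:

```python
import json

def task_status_path(task_id: str) -> str:
    """Build the ml-commons Task API path for a given task ID."""
    return f"/_plugins/_ml/tasks/{task_id}"

def summarize_task(response_body: str) -> dict:
    """Extract the fields most relevant to an offline batch inference job.

    The field names here are illustrative; consult the merged docs for the
    exact response format.
    """
    doc = json.loads(response_body)
    return {"task_type": doc.get("task_type"), "state": doc.get("state")}

# Illustrative response body.
sample = json.dumps({"task_type": "BATCH_PREDICTION", "state": "COMPLETED"})
print(task_status_path("example-task-id"))
print(summarize_task(sample))
```

Polling this endpoint is how a client would learn whether a batch job is still running, completed, or failed.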

Issues Resolved

closes #8210

Version

2.17

Frontend features

If you're submitting documentation for an OpenSearch Dashboards feature, add a video that shows how a user will interact with the UI step by step. A voiceover is optional.

Checklist

  • By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license and subject to the Developer Certificate of Origin.
    For more information on the Developer Certificate of Origin and signing off your commits, see the DCO documentation.


Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged.

Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a maintainer.

When you're ready for doc review, tag the assignee of this PR. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.

@natebower (Collaborator) left a comment


@kolchfa-aws @rbhavna Just a couple minor changes. Thanks!

_ml-commons-plugin/api/model-apis/batch-predict.md (review threads resolved)
Co-authored-by: Nathan Bower <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>
@kolchfa-aws (Collaborator) left a comment


Thanks, @rbhavna! LGTM

@kolchfa-aws kolchfa-aws added release-notes PR: Include this PR in the automated release notes v2.17.0 labels Sep 17, 2024
@@ -31,7 +31,7 @@ POST /_plugins/_ml/models/<model_id>/_batch_predict

## Prerequisites

Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request:
Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The optional `action_type` parameter supports canceling the batch job running on OpenAI:
A contributor commented:

The optional action_type parameter allows for checking the status or canceling a batch job running on OpenAI.

The author replied:

Can we have something like this:
Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI text-embedding-ada-002 model, send the following request. The action_type cancel_batch_predict is optional and supports canceling the batch job running on OpenAI:
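The `cancel_batch_predict` action discussed in this thread could look roughly like the following. This is a sketch only: the payload shape follows the general ml-commons connector format, but the URLs, names, and parameter placeholders are illustrative, not the exact documented request:

```python
import json

# Illustrative connector payload whose "actions" list includes an optional
# cancel_batch_predict action alongside batch_predict. URLs are placeholders.
connector = {
    "name": "OpenAI embedding connector (example)",
    "protocol": "http",
    "actions": [
        {
            "action_type": "batch_predict",
            "method": "POST",
            "url": "https://api.openai.com/v1/batches",
        },
        {
            # Optional action: lets ml-commons cancel the remote batch job.
            "action_type": "cancel_batch_predict",
            "method": "POST",
            "url": "https://api.openai.com/v1/batches/${parameters.id}/cancel",
        },
    ],
}

action_types = [a["action_type"] for a in connector["actions"]]
print(json.dumps(action_types))
```

The key point of the thread is that `cancel_batch_predict` is an additional, optional entry in `actions`, not a replacement for `batch_predict`.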

@natebower (Collaborator) left a comment


LGTM with noted comments.

@@ -31,7 +31,13 @@ POST /_plugins/_ml/models/<model_id>/_batch_predict

## Prerequisites

Before using the Batch Predict API, you need to create a connector to the externally hosted model. For example, to create a connector to an OpenAI `text-embedding-ada-002` model, send the following request. The optional `action_type` parameter supports canceling the batch job running on OpenAI:
Before using the Batch Predict API, you need to create a connector to the externally hosted model. For every action, specify the `action_type` parameter that describes the action:
A collaborator commented:

There are extra spaces before "For every" and after "specify the". I would say "For each action".
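The revised prerequisite wording above requires an `action_type` on each action in the connector. A small validation helper, hypothetical and not part of ml-commons, makes the rule concrete:

```python
def validate_actions(actions: list[dict]) -> bool:
    """Check that every connector action declares an action_type.

    Hypothetical helper for illustration; ml-commons performs its own
    validation server-side.
    """
    missing = [i for i, action in enumerate(actions) if "action_type" not in action]
    if missing:
        raise ValueError(f"actions missing action_type at indexes: {missing}")
    return True

# Example: both actions declare an action_type, so validation passes.
actions = [
    {"action_type": "batch_predict"},
    {"action_type": "cancel_batch_predict"},
]
print(validate_actions(actions))
```

This mirrors the doc change: rather than a single optional parameter, each action entry carries its own `action_type` describing what it does.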

Signed-off-by: Fanit Kolchina <[email protected]>
@kolchfa-aws kolchfa-aws merged commit db292d9 into opensearch-project:main Sep 17, 2024
5 checks passed
noahstaveley pushed a commit to noahstaveley/documentation-website that referenced this pull request Sep 23, 2024
…roject#8305)

* get offline batch inference details using task API in ml

Signed-off-by: Bhavana Ramaram <[email protected]>

* Doc review

Signed-off-by: Fanit Kolchina <[email protected]>

* Typo fix

Signed-off-by: Fanit Kolchina <[email protected]>

* Apply suggestions from code review

Co-authored-by: Nathan Bower <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>

* Update _ml-commons-plugin/api/model-apis/batch-predict.md

Signed-off-by: kolchfa-aws <[email protected]>

* Update _ml-commons-plugin/api/model-apis/batch-predict.md

Signed-off-by: kolchfa-aws <[email protected]>

* Add parameter values

Signed-off-by: Fanit Kolchina <[email protected]>

* Extra spaces

Signed-off-by: Fanit Kolchina <[email protected]>

---------

Signed-off-by: Bhavana Ramaram <[email protected]>
Signed-off-by: Fanit Kolchina <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>
Co-authored-by: Fanit Kolchina <[email protected]>
Co-authored-by: kolchfa-aws <[email protected]>
Co-authored-by: Nathan Bower <[email protected]>
Signed-off-by: Noah Staveley <[email protected]>
Labels
release-notes PR: Include this PR in the automated release notes v2.17.0
Development

Successfully merging this pull request may close these issues.

[DOC] Add support for Offline Batch Inference and Ingestion
4 participants