[FEATURE] Offline Batch Inference and Batch Ingestion #2840
Labels: 2.17, enhancement, feature, Roadmap:Cost/Performance/Scale, v2.17.0
Is your feature request related to a problem?
This feature is related to #1840 and #2488. It covers ingesting batch inference results from output files (stored in S3, or produced by services such as OpenAI and Cohere) into the OpenSearch cluster. Batch inference itself was released in OpenSearch 2.16.
For more details and real-world examples of the end-to-end workflow, please refer to RFC #2891.
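To illustrate the kind of ingestion this feature targets, here is a minimal sketch (not the plugin's actual implementation) that converts batch inference output lines — one JSON object per line, as batch APIs like OpenAI's typically emit — into an OpenSearch `_bulk` request body. The field names (`custom_id`, `embedding`) and the index name are illustrative assumptions, not part of this proposal.

```python
import json

def batch_output_to_bulk(lines, index_name="my-knn-index"):
    """Build an OpenSearch _bulk request body from batch inference result lines."""
    actions = []
    for line in lines:
        record = json.loads(line)
        doc_id = record["custom_id"]      # assumed per-record identifier
        embedding = record["embedding"]   # assumed model output field
        actions.append(json.dumps({"index": {"_index": index_name, "_id": doc_id}}))
        actions.append(json.dumps({"embedding": embedding}))
    return "\n".join(actions) + "\n"      # _bulk bodies must end with a newline

# Example batch inference output, two records:
results = [
    '{"custom_id": "doc-1", "embedding": [0.1, 0.2]}',
    '{"custom_id": "doc-2", "embedding": [0.3, 0.4]}',
]
print(batch_output_to_bulk(results))
```

The resulting body could be sent to the cluster's `_bulk` endpoint; the actual feature would handle reading the files from S3 and mapping fields via the connector configuration rather than client-side code like this.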