
[Inference API] Fix output stream ordering in InferenceActionProxy #124225

Conversation

@timgrein (Contributor) commented Mar 6, 2025

While working on propagating the product use case, I noticed an ordering issue in the stream output of InferenceActionProxy: taskType was written after inferenceEntityId, but the reading side expects it the other way around. I assume we need/should backport this?
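A minimal sketch of the kind of mismatch (plain java.io for illustration, not the actual Elasticsearch StreamOutput/StreamInput classes or the real InferenceActionProxy fields; names and values are hypothetical): wire serialization is positional, so the reader must consume fields in exactly the order the writer produced them, and the fix is to make the two sides agree.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Illustrative sketch only: shows how a write/read ordering mismatch
// silently swaps two fields of the same wire type.
class OrderingSketch {

    static void write(DataOutputStream out, String inferenceEntityId, String taskType) throws IOException {
        // Buggy ordering: the id is written first, then the task type ...
        out.writeUTF(inferenceEntityId);
        out.writeUTF(taskType);
    }

    static void read(DataInputStream in) throws IOException {
        // ... while the reader expects the task type first, so the two values
        // land in the wrong fields without any deserialization error.
        String taskType = in.readUTF();
        String inferenceEntityId = in.readUTF();
        System.out.println("taskType=" + taskType + ", inferenceEntityId=" + inferenceEntityId);
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        write(new DataOutputStream(buffer), "my-endpoint", "sparse_embedding");
        // Prints taskType=my-endpoint, inferenceEntityId=sparse_embedding:
        // the swap only shows up as corrupted values at the call site.
        read(new DataInputStream(new ByteArrayInputStream(buffer.toByteArray())));
    }
}
```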

@timgrein added the :ml (Machine learning), Team:ML (Meta label for the ML team), v9.0.0, v8.18.0, v8.18.1, v9.1.0, and auto-backport (Automatically create backport pull requests when merged) labels on Mar 6, 2025
@elasticsearchmachine (Collaborator) commented:

Pinging @elastic/ml-core (Team:ML)

@elasticsearchmachine (Collaborator) commented:

Hi @timgrein, I've created a changelog YAML for you.

@jonathan-buttner (Contributor) left a comment:


😬 thanks for catching that 🤦‍♂️

@elasticsearchmachine (Collaborator) commented:

💚 Backport successful

Backported branches: 9.0, 8.18

Labels: auto-backport (Automatically create backport pull requests when merged), >bug, :ml (Machine learning), Team:ML (Meta label for the ML team), v8.18.0, v8.18.1, v9.0.0, v9.0.1, v9.1.0

3 participants