
[ML] Fix issues in dynamically reading the number of allocations #115095

Open
wants to merge 2 commits into base: 8.16

Conversation

davidkyle
Member

Relates to problems in the GET inference API, which should dynamically update the num_allocations field with the actual number of allocations from the deployed model. This is required for adaptive allocations, where the field changes dynamically.

The first issue is that GroupedActionListener throws if it is constructed with size == 0. This is now guarded against by skipping the model update when the list of models is empty.

The second issue is that the wrong field was being updated, so the update was not visible in the API response. Tests are added to cover both cases.

This is a non-issue because the code is not yet live.
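
A minimal sketch of the shape of the fix, not the actual Elasticsearch source: it assumes the handler fans out one request per model and collects results with a GroupedActionListener. The UnparsedModel/ParsedModel records, parseAndUpdateAllocations, and the constructor argument order (group size first, then the delegate) are assumptions for illustration only.

import java.util.List;

import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.support.GroupedActionListener;

// Illustrative stand-ins for the real inference model types.
record UnparsedModel(String inferenceId) {}
record ParsedModel(String inferenceId, int numAllocations) {}

class GetInferenceModelSketch {

    void parseModels(List<UnparsedModel> unparsedModels, ActionListener<List<ParsedModel>> listener) {
        // Guard for the first issue: GroupedActionListener rejects a group size of 0,
        // so answer immediately when there are no models to parse.
        if (unparsedModels.isEmpty()) {
            listener.onResponse(List.of());
            return;
        }

        // Collect one parsed model per request; constructor argument order is assumed here.
        var grouped = new GroupedActionListener<ParsedModel>(
            unparsedModels.size(),
            ActionListener.wrap(models -> listener.onResponse(List.copyOf(models)), listener::onFailure)
        );

        for (UnparsedModel unparsed : unparsedModels) {
            parseAndUpdateAllocations(unparsed, grouped);
        }
    }

    // Second issue: the parsed model's num_allocations must be overwritten with the value
    // reported by the deployment so adaptive allocation changes show up in the GET response.
    void parseAndUpdateAllocations(UnparsedModel unparsed, ActionListener<ParsedModel> listener) {
        // ... fetch deployment stats, then listener.onResponse(new ParsedModel(...)) ...
    }
}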

@elasticsearchmachine
Collaborator

Pinging @elastic/ml-core (Team:ML)

elasticsearchmachine added the Team:ML (Meta label for the ML team) and v8.16.1 labels on Oct 18, 2024
@@ -126,6 +126,11 @@ private void getModelsByTaskType(TaskType taskType, ActionListener<GetInferenceM
}

private void parseModels(List<UnparsedModel> unparsedModels, ActionListener<GetInferenceModelAction.Response> listener) {
if (unparsedModels.isEmpty()) {
Member Author


Without this check the GroupedActionListener was constructed with 0 requests, which throws an exception.
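
To make that concrete, a hypothetical snippet (the constructor argument order and the exact exception are assumptions from memory, not taken from this PR):

import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.support.GroupedActionListener;

class EmptyGroupIllustration {
    public static void main(String[] args) {
        // A GroupedActionListener sized for zero results is rejected during construction,
        // which is why parseModels now returns early for an empty model list.
        var grouped = new GroupedActionListener<String>(
            0,                                          // group size: expected number of results
            ActionListener.wrap(results -> {}, e -> {}) // delegate, never reached
        );
        // Expected to throw an IllegalArgumentException before 'grouped' can be used.
    }
}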

Labels: :ml (Machine learning), >non-issue, Team:ML (Meta label for the ML team), v8.16.0, v8.16.1