[ML] Enhancements to model memory estimation #60386
Meta

- `/api/ml/validate/calculate_model_memory_limit` Kibana endpoint to call the new Elasticsearch endpoint, removing the calculation in `ml/server/models/calculate_model_memory_limit/calculate_model_memory_limit.js`. ([ML] Use a new ML endpoint to estimate a model memory #60376)
- `calculateModelMemoryLimit` function to be supplied with the `analysis_config` object. ([ML] Use a new ML endpoint to estimate a model memory #60376)
- `/api/ml/modules/setup/{moduleId}` endpoint will take an additional parameter to indicate whether an estimate of the model memory limit should be made by checking the cardinality of fields in the job configurations. When called from the ML data recognizer wizard, this will be `true`, making use of the start/end times specified for the datafeed. If the setup endpoint is not supplied with start/end times, the `calculateModelMemoryLimit` endpoint will attempt to check over the most recent 3 months of data. Solutions calling the setup endpoint would be expected to pass `true` in most cases; if they wish to supply their own estimates, this can be done in the `jobOverrides` parameter. If there is no data in the datafeed index(es), or if `false` is passed to the setup endpoint, the existing `model_memory_limit` values supplied in the module job JSON configuration files will be used. ([ML] Module setup with dynamic model memory estimation #60656)
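To illustrate the first two items, a minimal sketch of supplying the whole `analysis_config` object to the validation call follows. The `buildCalculateMmlRequest` helper and its field names (`analysisConfig`, `indexPattern`, `timeFieldName`, `earliestMs`, `latestMs`) are assumptions for illustration, not the actual Kibana schema:

```typescript
// Hypothetical helper building the body for a POST to
// /api/ml/validate/calculate_model_memory_limit. All field names here are
// illustrative assumptions, not the real Kibana request schema.
interface Detector {
  function: string;
  field_name?: string;
  by_field_name?: string;
}

interface AnalysisConfig {
  bucket_span: string;
  detectors: Detector[];
  influencers: string[];
}

function buildCalculateMmlRequest(
  analysisConfig: AnalysisConfig,
  indexPattern: string,
  timeFieldName: string,
  earliestMs?: number,
  latestMs?: number
) {
  // The endpoint is handed the whole analysis_config object, so the memory
  // calculation no longer needs to live in calculate_model_memory_limit.js.
  return { analysisConfig, indexPattern, timeFieldName, earliestMs, latestMs };
}
```

Passing the full `analysis_config` keeps the Kibana side a thin proxy: the Elasticsearch endpoint owns the estimation logic, so the two cannot drift apart.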
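The fall-back behaviour described for the setup endpoint can be sketched as the following decision logic. This is a hedged illustration of the rules stated above, not the actual implementation; the option names (`estimateModelMemory`, `overrideMml`, `datafeedHasData`) are invented for the example:

```typescript
// Illustrative sketch of the model_memory_limit fall-back rules described
// in the last item above. Names and shapes are assumptions, not Kibana code.
interface SetupOptions {
  estimateModelMemory: boolean; // the new parameter on /api/ml/modules/setup/{moduleId}
  start?: number;               // datafeed start time, epoch ms
  end?: number;                 // datafeed end time, epoch ms
  overrideMml?: string;         // an explicit limit supplied via jobOverrides
  datafeedHasData: boolean;     // whether the datafeed index(es) contain any data
}

const THREE_MONTHS_MS = 90 * 24 * 60 * 60 * 1000;

// Time range over which field cardinality would be checked,
// or null when no estimation should be attempted.
function estimationRange(opts: SetupOptions, nowMs: number): [number, number] | null {
  if (!opts.estimateModelMemory || !opts.datafeedHasData) {
    return null;
  }
  if (opts.start !== undefined && opts.end !== undefined) {
    // Wizard case: use the start/end times specified for the datafeed.
    return [opts.start, opts.end];
  }
  // No start/end supplied: check over the most recent 3 months of data.
  return [nowMs - THREE_MONTHS_MS, nowMs];
}

// Which model_memory_limit the job ends up with.
function resolveMml(opts: SetupOptions, estimated: string, moduleDefault: string): string {
  if (opts.overrideMml !== undefined) {
    return opts.overrideMml; // caller supplied its own estimate via jobOverrides
  }
  if (estimationRange(opts, Date.now()) !== null) {
    return estimated; // dynamic estimate from field cardinality
  }
  return moduleDefault; // existing value from the module job JSON config
}
```

For example, a solution passing `estimateModelMemory: true` against an empty index would still fall back to the `model_memory_limit` shipped in the module JSON.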