[Lens] Better handling of "too many buckets" error #93998
Labels: enhancement, Feature:Lens, impact:needs-assessment, Team:Visualizations
This error is hard to test using small datasets; it only appears with large, high-cardinality data. For this reason I haven't been able to reproduce it in a while, and I think we will need to generate special data to test this issue.
Users can already get unpredictable but frequent query failures due to the 65k `search.max_buckets` limit. For example, a query might fail on a large time range but succeed on a small one, because the large time range produces more results. It will happen any time the query produces more than 65k buckets, which could easily happen with a datatable. When users hit the failure, it looks like this:

[Screenshot: the "too many buckets" error message]
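To make the failure mode concrete, here is an illustrative back-of-the-envelope estimate (the numbers are made up, not from a real dataset): bucket counts multiply across nested aggregations, so a modest-looking configuration can exceed the limit on a large time range.

```ts
// Illustrative arithmetic only: nested bucket aggregations multiply.
// A 30-day range at an hourly interval, split by the top 100 values
// of a single field, already exceeds the default limit.
const dateHistogramBuckets = 30 * 24; // 720 hourly buckets
const topValues = 100;                // Terms aggregation size
const estimatedBuckets = dateHistogramBuckets * topValues; // 72,000
const SEARCH_MAX_BUCKETS = 65536;     // Elasticsearch default for search.max_buckets
console.log(estimatedBuckets > SEARCH_MAX_BUCKETS); // true — the query would fail
```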
If this happens while viewing a dashboard, the user has no way to fix it. I propose two changes:
1. When the user has multiple levels of aggregation that multiply together to cause too many buckets, we should warn them in the Lens editor even if the query will succeed. This will make users aware that there could be a problem. (See the estimation sketch after this list.)
2. The Lens embeddable should offer inline controls to reduce the size of the Terms aggregation if the query fails. For example, we could offer to "try again with 10 top values" instead of 100. (See the retry sketch after this list.)
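For the editor warning (item 1), a minimal sketch of the idea, assuming a simplified column model rather than the actual Lens datasource API: multiply the configured sizes of the bucket columns and warn when the worst-case product exceeds the limit.

```ts
// Hypothetical column shape for illustration; not the real Lens types.
interface BucketColumn {
  type: 'terms' | 'date_histogram' | 'filters';
  size: number; // configured Terms size, or estimated bucket count for the range
}

const MAX_BUCKETS = 65536; // assumed to match the search.max_buckets default

function estimateBuckets(columns: BucketColumn[]): number {
  // Nested bucket aggregations multiply: each parent bucket can
  // produce up to `size` child buckets.
  return columns.reduce((total, col) => total * col.size, 1);
}

function getBucketWarning(columns: BucketColumn[]): string | undefined {
  const estimate = estimateBuckets(columns);
  if (estimate > MAX_BUCKETS) {
    return `This configuration can produce up to ${estimate} buckets, which ` +
      `exceeds the ${MAX_BUCKETS} bucket limit and may fail on large time ranges.`;
  }
  return undefined;
}
```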
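For the inline control (item 2), a minimal retry sketch, assuming a hypothetical `runSearch` wrapper and a simplified query object. On a "too many buckets" failure, the embeddable could offer a one-click retry with a reduced Terms size.

```ts
// Hypothetical query shape for illustration.
interface LensQuery {
  termsSize: number; // size of the Terms aggregation, e.g. 100
}

async function retryWithFewerTopValues(
  query: LensQuery,
  runSearch: (q: LensQuery) => Promise<unknown>
): Promise<unknown> {
  try {
    return await runSearch(query);
  } catch (e) {
    if (isTooManyBucketsError(e) && query.termsSize > 10) {
      // Offer: "try again with 10 top values" instead of 100.
      return runSearch({ ...query, termsSize: 10 });
    }
    throw e;
  }
}

function isTooManyBucketsError(e: unknown): boolean {
  // Elasticsearch reports this as a too_many_buckets_exception;
  // matching on the message string here is a simplification.
  return e instanceof Error && e.message.includes('too_many_buckets');
}
```

In practice the retry would be user-initiated (a button in the error state) rather than automatic, so users stay aware that they are seeing fewer top values.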
This is related to #93912.