
Server side caching implementation for the Index Pattern api #5575

Closed — wants to merge 69 commits

Conversation

@Bargs (Contributor) commented Dec 4, 2015

Requires #5213

When an index pattern is requested via the API, it pulls field mappings directly from the matching indices if no index template exists. This avoids duplicating data in the .kibana index pattern, but retrieving all of these mappings can be a costly operation if there are many indices and many fields in each index. We can't rely on browser caching because we need to be able to invalidate the cache any time an update is made. To keep things fast, let's cache the normalized index-pattern resources that we're returning to clients when they GET them from the API.

At the moment this PR is a rough proof of concept showing how we could use Hapi and its support for catbox to easily cache these responses server side.

Bargs added 30 commits December 3, 2015 20:53
…re are no existing indices that match the pattern
Bargs added 24 commits December 3, 2015 20:53
Adding a 'suites' property to the existing intern configuration caused
an error to be thrown at the beginning of the functional test run, even
if the value of 'suites' was just an empty array. The existence of the
property seemed to enable execution of the config file in the selenium
browser, because it complained about not having the node require
function. To fix this, I created a separate API test config file without
the node require and removed the 'suites' property from intern.js.
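The split described in the commit message above might look roughly like the following. This is a hypothetical sketch of the separate API test config, assuming intern's AMD-style config format of the era; the file path, suite name, and property values are illustrative and not taken from the actual PR:

```javascript
// test/intern_api.js — hypothetical standalone config for the API tests.
// Crucially it contains no node `require` shim, so nothing here breaks
// if a runner ever evaluates it in a selenium-driven browser.
define(function () {
  return {
    // unit-style API suites live here instead of in intern.js,
    // whose 'suites' property was removed
    suites: ['test/unit/api/index_patterns'],
    excludeInstrumentation: /^(?:node_modules|test)\//
  };
});
```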