Caching strategy
- JS files
  - build.js (compiled application): URL hash (`build.js?v={hash}` with a long cache expire date set in the header); see the hashing sketch after this list
  - dependencies: concatenate into a single file, then URL hash
- CSS
  - own CSS files: concatenate, then URL hash
  - CSS files of dependencies: concatenate, then URL hash
- i18n files for Angular: URL hash?
- Angular template cache file: URL hash?
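As an illustration of the URL-hash idea, a content hash of the compiled file could be appended as a query parameter, while the file itself is served with a far-future expiry. A minimal sketch, assuming the compiled file lives at `static/build.js` (path and helper name are hypothetical, not the project's actual build setup):

```python
# Sketch of URL-hash cache busting; file path and helper name are hypothetical.
import hashlib
from pathlib import Path


def hashed_url(file_path, base_url):
    """Return a URL like build.js?v=<hash> derived from the file content."""
    digest = hashlib.sha1(Path(file_path).read_bytes()).hexdigest()[:10]
    return '{0}?v={1}'.format(base_url, digest)


if __name__ == '__main__':
    # e.g. "/static/build.js?v=3f2a9c1b44"
    print(hashed_url('static/build.js', '/static/build.js'))
```

Because the URL changes whenever the content changes, the hashed files can be served with a long `Cache-Control: max-age` and never need explicit invalidation.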
Pages that request data from the API on the client side via XHR requests (e.g. the listing and edit pages).
- Set the `Last-Modified` or `ETag` header depending on the last Git ref? (see the sketch after this list)
- Use dogpile.cache to cache rendered pages.
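A rough sketch of the Git-ref approach in a Pyramid-style view, combined with dogpile.cache for the rendered HTML. The view and the `render_listing_page` helper are hypothetical; only the pattern (one ETag per deployment, cached rendering) is taken from the notes above:

```python
# Sketch: ETag derived from the current Git ref, rendered page cached with
# dogpile.cache. View and helper names are hypothetical.
import subprocess

from dogpile.cache import make_region
from pyramid.httpexceptions import HTTPNotModified
from pyramid.response import Response

region = make_region().configure('dogpile.cache.memory')

# Resolved once at startup; each deployment produces a new ETag.
GIT_REF = subprocess.check_output(
    ['git', 'rev-parse', '--short', 'HEAD']).decode().strip()


def render_listing_page(lang):
    # Placeholder for the real template rendering.
    return '<html><body>listing ({0})</body></html>'.format(lang)


def listing_view(request):
    if GIT_REF in request.if_none_match:
        # The client already has the page of this deployment.
        return HTTPNotModified()

    # The page only changes with a new deployment (the data is loaded
    # client-side via XHR), so deployment + language is enough as key.
    key = 'listing-{0}-{1}'.format(request.locale_name, GIT_REF)
    html = region.get_or_create(
        key, lambda: render_listing_page(request.locale_name))

    response = Response(html)
    response.etag = GIT_REF
    return response
```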
Pages that request data from the API on the server side (e.g. the detail and history pages).
- Set the `ETag` header retrieved from the API (see the sketch after this list).
- Use dogpile.cache to cache rendered pages.
- A cookie is used to communicate the interface language of a user to the server side. This cookie should be ignored; the only view that uses (or will use) it should not be cached (see https://github.com/c2corg/v6_ui/issues/204#issuecomment-229019655), or we configure Varnish to use the locale in the cookie as part of the cache key.
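For the server-side case, the ETag of the API response could simply be forwarded on the rendered page, and the rendering cached under a key that includes that ETag. A sketch under these assumptions (API URL, helper names and key layout are made up):

```python
# Sketch: reuse the API's ETag for the rendered page and cache the HTML.
# API URL, helper names and cache key layout are hypothetical.
import requests
from dogpile.cache import make_region
from pyramid.response import Response

region = make_region().configure('dogpile.cache.memory')

API_URL = 'https://api.example.org'  # assumption, not the real API URL


def render_detail_page(document, lang):
    # Placeholder for the real template rendering.
    return '<html><body>{0} ({1})</body></html>'.format(
        document['document_id'], lang)


def detail_view(request):
    document_id = request.matchdict['id']
    lang = request.locale_name

    api_response = requests.get(
        '{0}/waypoints/{1}'.format(API_URL, document_id))
    etag = api_response.headers.get('ETag', '')

    # The API is still asked for the data, but the (expensive) rendering is
    # cached; a changed document means a new ETag and thus a new cache key.
    key = 'waypoint-{0}-{1}-{2}'.format(document_id, lang, etag)
    html = region.get_or_create(
        key, lambda: render_detail_page(api_response.json(), lang))

    response = Response(html)
    if etag:
        response.headers['ETag'] = etag
    return response
```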
For example, `/waypoints/123` returns a single waypoint (though data from other documents, e.g. associated documents, is also included in the response).
- Set an `ETag` header which reflects the state/version of the document and all associated documents. The difficulty is knowing the version of the documents, because the version of the associated documents must also be taken into account. The idea would be to save a version number for each document in the database; every time the document or one of its associated documents changes, this version number is updated. The cache key would be `document_id-lang-version`, so that no invalidation of old cache entries is needed (see the sketch after this list).
- Use dogpile.cache to cache the response of the service.
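A minimal sketch of the version-based cache key; the database helpers below are placeholders, only the `document_id-lang-version` key scheme comes from the note above:

```python
# Sketch of the "document_id-lang-version" cache key; DB helpers are stubs.
from dogpile.cache import make_region

region = make_region().configure('dogpile.cache.memory')


def current_version(document_id):
    # Placeholder: read the version counter stored for this document.
    # The counter is incremented whenever the document itself or one of
    # its associated documents changes.
    return 7


def load_waypoint(document_id, lang):
    # Placeholder: build the full response from the database, including
    # the data of associated documents.
    return {'document_id': document_id, 'lang': lang}


def get_waypoint(document_id, lang):
    version = current_version(document_id)
    # Keys of outdated versions are simply never requested again, so no
    # explicit cache invalidation is needed.
    key = '{0}-{1}-{2}'.format(document_id, lang, version)
    return region.get_or_create(
        key, lambda: load_waypoint(document_id, lang))
```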
Services like the search or advanced search return a list of documents. It does not make sense to cache the whole list as is, because with filter parameters like the bbox or search terms there will not be many cache hits. But caching the individual documents contained in a result list makes sense. E.g. for the advanced search: the search request to ElasticSearch returns a list of document ids that should be included in the response. Each document is then either retrieved from a cache or retrieved from the database (and then put into the cache).
- Set `no-cache` as header? Or use a low TTL (5 sec)?
- Use dogpile.cache to cache individual documents (see the sketch after this list). The list of documents itself is not cached though.
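A sketch of caching the individual documents of a search result with dogpile.cache's `get_or_create_multi`; the ElasticSearch query and the database loader are placeholders:

```python
# Sketch: cache the documents of a search result individually, never the list.
# ElasticSearch query and DB loader are hypothetical placeholders.
from dogpile.cache import make_region

region = make_region().configure('dogpile.cache.memory')


def search_document_ids(query):
    # Placeholder for the ElasticSearch request returning matching ids.
    return [123, 456, 789]


def load_documents_from_db(*keys):
    # Called only for the keys missing in the cache; must return the values
    # in the same order as the requested keys.
    return [{'document_id': int(key.split('-')[1])} for key in keys]


def search(query, lang):
    ids = search_document_ids(query)
    keys = ['doc-{0}-{1}'.format(document_id, lang) for document_id in ids]
    # Cached documents are reused, missing ones are loaded from the database
    # and stored; the result list itself is rebuilt on every request.
    documents = region.get_or_create_multi(keys, load_documents_from_db)
    return {'documents': documents, 'total': len(documents)}
```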
No special config should be needed if the caching headers are set correctly?
- Use mod_pagespeed or the equivalent for nginx to cache assets?
- Use Redis as the backend for dogpile.cache (a configuration sketch follows).
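Configuring Redis as the dogpile.cache backend follows the documented `dogpile.cache.redis` setup; the host, TTL and lock values below are assumptions:

```python
# Sketch: dogpile.cache region with the Redis backend (values are assumptions).
from dogpile.cache import make_region

region = make_region().configure(
    'dogpile.cache.redis',
    expiration_time=3600,                     # TTL tracked by dogpile.cache
    arguments={
        'host': 'localhost',
        'port': 6379,
        'db': 0,
        'redis_expiration_time': 3600 + 60,   # let Redis expire keys as well
        'distributed_lock': True,             # share the dogpile lock via Redis
    })
```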