LoadingCache hits downstream multiple times in case of TTL expiry #6180
I've looked into this previously for #1975 / ben-manes/caffeine#7. A reason why it hasn't moved forward is that while it makes sense for
Yeah, so by coalescing I meant it will do something like this:

Moreover, this works beautifully even if the user wants to fetch a single key from the cache instead of using the current logic of:

So this is good for preventing latency, but the downstream takes a hit in terms of network bandwidth and throughput (even though it provides a bulk API for optimising this use case).

Not sure if users moved to their own implementation or other open-source libraries because of this 😓
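The snippets referenced above were not preserved in this thread, but the coalescing idea can be sketched roughly as follows. This is a hypothetical illustration (the `CoalescingLoader` class and `bulkLoad` function are invented names, not Guava API): concurrent single-key misses are collected so that one bulk downstream call serves all of them.

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;

// Hypothetical sketch: coalesce lookups for multiple keys into one bulk
// downstream call, instead of one downstream call per key.
final class CoalescingLoader<K, V> {
    private final Function<Set<K>, Map<K, V>> bulkLoad; // downstream bulk API
    private final Map<K, CompletableFuture<V>> pending = new HashMap<>();
    private final Object lock = new Object();

    CoalescingLoader(Function<Set<K>, Map<K, V>> bulkLoad) {
        this.bulkLoad = bulkLoad;
    }

    // Returns a future per requested key. Keys not already in flight are
    // batched into a single bulkLoad call; keys already being loaded by a
    // concurrent caller share that caller's future.
    Map<K, CompletableFuture<V>> getAll(Set<K> keys) {
        Map<K, CompletableFuture<V>> result = new LinkedHashMap<>();
        Set<K> toLoad = new LinkedHashSet<>();
        synchronized (lock) {
            for (K key : keys) {
                CompletableFuture<V> f = pending.get(key);
                if (f == null) {
                    f = new CompletableFuture<>();
                    pending.put(key, f);
                    toLoad.add(key);
                }
                result.put(key, f);
            }
        }
        if (!toLoad.isEmpty()) {
            try {
                Map<K, V> loaded = bulkLoad.apply(toLoad); // one round trip
                synchronized (lock) {
                    for (K key : toLoad) {
                        pending.remove(key).complete(loaded.get(key));
                    }
                }
            } catch (RuntimeException e) {
                synchronized (lock) {
                    for (K key : toLoad) {
                        pending.remove(key).completeExceptionally(e);
                    }
                }
            }
        }
        return result;
    }
}
```

A caller asking for three cold keys triggers exactly one `bulkLoad` invocation rather than three, which is the bandwidth/throughput win described above.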
The caches already provide the needed API hooks for users to implement it themselves. However, it seems to be a bit out of scope for the caches to provide this utility themselves. The analogy is that retries and backoff on failure are also reasonable needs in practice, but it does not make sense for this resilience logic to be specialized by a caching library for its loads rather than be offered by a generalized library (e.g. failsafe).

The above example might be a good starting point for reviewing how it could work, even if you prefer to code it anew. I do think it would be beneficial to have a general-purpose library for these cases, but one that is decoupled and independent from these caching projects. I think it is more reasonable to scope our issues to providing the API that is needed for hooking in this functionality.
I think this should be closed and users directed to use their favorite Reactive Streams library. See this example using Reactor, which could trivially be adjusted to complete Guava's ListenableFuture.
Hi,

We are using LoadingCache for our use case, where Key and Value are complex objects. In our service implementation, we try to fetch multiple keys in one go from the cache object.

However, when a key has expired, Guava (`CacheLoader<KeyObject, ValueObject>`) immediately calls the downstream to get the fresh value of the object corresponding to the key using `reload(KeyObject key, ValueObject oldValue)`, and this results in multiple downstream calls when the request contains multiple keys whose TTL has expired.

Since this is a "get multiple keys" call to Guava, it makes more sense for `LoadingCache` to accumulate all keys that have expired and make a single call to the downstream to get the updated values. It can then merge these fresh values with the values for keys that have not yet expired.

For the implementation, we can use `loadAll(Iterable<? extends Key> keys)` of the `CacheLoader` to get values for the expired keys. This new implementation can help reduce service network calls, as opposed to what we see now, where the cache calls downstream asynchronously for every key.