Add LRU and loading caches #215

Merged
merged 5 commits into master from fes-errors on Sep 28, 2018
Conversation

erwinvaneyk
Member

This PR replaces the old MapCache with an LRUCache. The MapCache was a simple implementation that was not a true cache; it was simply a map, and we relied on the controller and other users to delete invocations and workflows to avoid OOMs. The new LRUCache is a true cache: it evicts entities once it reaches its size limit. The old MapCache has been moved to the testutils package, since it remains a good candidate for testing and development purposes.
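
For context, here is a minimal, self-contained sketch of the eviction behavior an LRU cache provides. This is an illustrative example only, not the actual LRUCache added in this PR; all names are hypothetical.

```go
package main

import (
	"container/list"
	"fmt"
)

// lruCache is a minimal LRU cache: once the size limit is reached,
// the least recently used entry is evicted.
type lruCache struct {
	maxSize int
	order   *list.List               // front = most recently used
	items   map[string]*list.Element // key -> element in order
}

type entry struct {
	key string
	val interface{}
}

func newLRUCache(maxSize int) *lruCache {
	return &lruCache{
		maxSize: maxSize,
		order:   list.New(),
		items:   map[string]*list.Element{},
	}
}

func (c *lruCache) Put(key string, val interface{}) {
	if el, ok := c.items[key]; ok {
		el.Value.(*entry).val = val
		c.order.MoveToFront(el)
		return
	}
	c.items[key] = c.order.PushFront(&entry{key, val})
	if c.order.Len() > c.maxSize {
		// Evict the least recently used entry.
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
}

func (c *lruCache) Get(key string) (interface{}, bool) {
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	c.order.MoveToFront(el)
	return el.Value.(*entry).val, true
}

func main() {
	c := newLRUCache(2)
	c.Put("wf-1", "workflow 1")
	c.Put("wf-2", "workflow 2")
	c.Put("wf-3", "workflow 3") // evicts wf-1, the least recently used
	_, ok := c.Get("wf-1")
	fmt.Println(ok) // false
}
```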

To ensure that entities can still be accessed even after they have been evicted from the cache, this PR also introduces the LoadingCache (a partial implementation was already in the codebase, but unused). The LoadingCache allows cache consumers to set up a fallback for cache misses. In this system, the LoadingCache's fallback is to reach directly into the event store and fetch the relevant events.
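
A rough sketch of the loading-cache idea described above, with a hypothetical loader standing in for the event-store fallback. Names and signatures are illustrative and are not the fes API.

```go
package main

import "fmt"

// Cache is the underlying cache interface assumed for this sketch.
type Cache interface {
	Get(key string) (interface{}, bool)
	Put(key string, val interface{})
}

// loadingCache wraps a Cache with a fallback loader that is invoked on a
// cache miss, e.g. to rebuild the entity from the event store.
type loadingCache struct {
	cache  Cache
	loader func(key string) (interface{}, error)
}

func (lc *loadingCache) Get(key string) (interface{}, error) {
	if val, ok := lc.cache.Get(key); ok {
		return val, nil
	}
	// Cache miss: fall back to the loader and repopulate the cache.
	val, err := lc.loader(key)
	if err != nil {
		return nil, err
	}
	lc.cache.Put(key, val)
	return val, nil
}

// mapCache is a trivial map-backed cache, akin to the old MapCache.
type mapCache struct{ m map[string]interface{} }

func newMapCache() *mapCache                           { return &mapCache{m: map[string]interface{}{}} }
func (c *mapCache) Get(key string) (interface{}, bool) { v, ok := c.m[key]; return v, ok }
func (c *mapCache) Put(key string, val interface{})    { c.m[key] = val }

func main() {
	lc := &loadingCache{
		cache: newMapCache(),
		loader: func(key string) (interface{}, error) {
			// Hypothetical fallback: replay events from the event store.
			return fmt.Sprintf("entity %s rebuilt from event store", key), nil
		},
	}
	v, _ := lc.Get("invocation-42")
	fmt.Println(v)
}
```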

- Converted fes.ErrEventStore to be value-based to avoid data races with
the With* methods (see the sketch after this list).

- Tightened the locking in the in-memory event store to avoid data
races.

- Changed the List of LoadingCache to simply use the List of the
underlying cache implementation.

- Refactored the applyEvent logic of the SubscribedCache.
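
To illustrate the first bullet, here is a minimal sketch of why value-based errors with copying With* builders avoid data races. The types and method names are illustrative, not the actual fes.ErrEventStore API.

```go
package main

import "fmt"

// EventStoreErr is a value-based error: the With* builders return a modified
// copy instead of mutating shared state, so concurrent use cannot race.
type EventStoreErr struct {
	msg       string
	aggregate string
}

func (e EventStoreErr) Error() string {
	if e.aggregate != "" {
		return fmt.Sprintf("%s (aggregate: %s)", e.msg, e.aggregate)
	}
	return e.msg
}

// WithAggregate operates on a copy of the receiver; the shared "template"
// error below is never modified, even when goroutines call this concurrently.
func (e EventStoreErr) WithAggregate(aggregate string) EventStoreErr {
	e.aggregate = aggregate
	return e
}

// ErrNotFound acts as a shared template that callers specialize per use.
var ErrNotFound = EventStoreErr{msg: "event store: aggregate not found"}

func main() {
	err := ErrNotFound.WithAggregate("invocation-42")
	fmt.Println(err)         // event store: aggregate not found (aggregate: invocation-42)
	fmt.Println(ErrNotFound) // template unchanged
}
```
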
@erwinvaneyk erwinvaneyk merged commit 484871b into master Sep 28, 2018
@erwinvaneyk erwinvaneyk deleted the fes-errors branch September 28, 2018 12:00