
Constrain memory usage in the in-memory backend #209

Merged: 1 commit merged into master on Sep 21, 2018
Conversation

erwinvaneyk
Member

Originally intended only for simple development and testing, this backend currently consists solely of a map containing all events. Because it never removes old or completed invocations/workflows, the event store consumes more and more memory over time until an inevitable OOM occurs.

The goal of this PR is to make the in-memory event store feasible for longer-term or more intensive deployments, such as benchmarks and use cases that do not require persistence of events. The solution in this PR is an approach akin to TinyLFU for caches: the store is assigned a specific limit on the number of entities (n) it may contain, which are divided over two segments (a minimal sketch follows after this list):

  • store, which contains all currently active event streams; these must never be deleted under any circumstance.
  • buffer, an LRU cache that contains all completed event streams (event streams whose last event has the completed flag). The size of the buffer is dynamic: it fills all space available between the store and n, evicting entities when that space is exceeded.
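
To illustrate the idea, here is a minimal sketch of the two-segment layout. This is not the implementation merged in this PR: the names `backend`, `maxEntities`, `appendEvent`, and `markCompleted` are hypothetical, and a plain `container/list`-based LRU stands in for whatever cache implementation the PR actually uses.

```go
package mem

import "container/list"

// backend is a hypothetical sketch of the two-segment in-memory store:
// active streams live in `store` and are never evicted; completed streams
// move to `buffer`, an LRU that uses whatever space remains up to maxEntities.
type backend struct {
	maxEntities int                      // n: upper bound on store + buffer entries
	store       map[string][]string      // active event streams (never evicted)
	buffer      map[string]*list.Element // completed event streams, keyed by stream ID
	order       *list.List               // LRU order of buffer entries (front = most recent)
}

type bufferedStream struct {
	key    string
	events []string
}

func newBackend(maxEntities int) *backend {
	return &backend{
		maxEntities: maxEntities,
		store:       map[string][]string{},
		buffer:      map[string]*list.Element{},
		order:       list.New(),
	}
}

// appendEvent adds an event to an active stream in the store.
func (b *backend) appendEvent(key, event string) {
	b.store[key] = append(b.store[key], event)
}

// markCompleted moves a finished stream from the store into the LRU buffer,
// evicting the least recently used completed streams while the total exceeds n.
func (b *backend) markCompleted(key string) {
	events, ok := b.store[key]
	if !ok {
		return
	}
	delete(b.store, key)
	b.buffer[key] = b.order.PushFront(&bufferedStream{key: key, events: events})

	// The buffer is dynamic: it may only use the space left over by the store.
	for len(b.store)+len(b.buffer) > b.maxEntities && b.order.Len() > 0 {
		oldest := b.order.Back()
		b.order.Remove(oldest)
		delete(b.buffer, oldest.Value.(*bufferedStream).key)
	}
}
```

The key property of this layout is that eviction never touches active streams: only completed streams in the buffer are removed, and only when store and buffer together exceed n.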

Concretely, this PR...

  • Adds benchmarks for the in-memory store (an illustrative sketch follows after this list).
  • Adds an LRU cache to the in-memory backend as a buffer for completed invocations.
  • Adds a couple of metrics to the in-memory store.
  • Adds configuration options to the in-memory store, allowing users to specify limits on its resource usage.
  • Introduces an EventStoreErr to the event store package, which should eventually serve as the base error for the package (see the sketch after this list).
  • Fixes a couple of incorrect logrus log statements, which would otherwise error in newer logrus versions (an illustrative example follows after this list).
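
The benchmark code itself is not shown in this description; as a rough idea of its shape, a standard Go testing.B benchmark against the hypothetical sketch above could look like this (all names are assumptions):

```go
package mem

import (
	"fmt"
	"testing"
)

// BenchmarkBackendAppend is a hypothetical benchmark shape: it appends events
// to many streams and completes them, so eviction in the LRU buffer is
// exercised once the entity limit is reached.
func BenchmarkBackendAppend(b *testing.B) {
	be := newBackend(1000) // assumed constructor from the sketch above
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		key := fmt.Sprintf("invocation-%d", i)
		be.appendEvent(key, "TASK_STARTED")
		be.appendEvent(key, "TASK_SUCCEEDED")
		be.markCompleted(key)
	}
}
```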
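
The definition of EventStoreErr is likewise not included here; a plausible sketch of a package-level base error, with assumed fields and constructor, would be:

```go
package eventstore

import "fmt"

// EventStoreErr is a sketch of a base error for the event store package.
// The fields and constructor here are assumptions for illustration only.
type EventStoreErr struct {
	msg string // human-readable description
	key string // aggregate/stream the error relates to, if any
}

func newEventStoreErr(msg, key string) *EventStoreErr {
	return &EventStoreErr{msg: msg, key: key}
}

func (e *EventStoreErr) Error() string {
	if e.key == "" {
		return fmt.Sprintf("eventstore: %s", e.msg)
	}
	return fmt.Sprintf("eventstore: %s (key: %s)", e.msg, e.key)
}
```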
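
The specific log statements that were fixed are not quoted in this description, nor is the exact failure mode in newer logrus versions; as a purely illustrative example of this class of fix, a common mistake is passing printf-style format verbs to logrus's non-formatting methods instead of using the *f variants:

```go
package example

import (
	log "github.com/sirupsen/logrus"
)

func logExample(err error) {
	// Problematic: Error takes plain values, so the format verb is never expanded.
	// (Purely illustrative; the actual statements fixed in this PR are not shown here.)
	log.Error("failed to append event: %v", err)

	// Fixed: use the formatting variant when format verbs are involved.
	log.Errorf("failed to append event: %v", err)
}
```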

@erwinvaneyk erwinvaneyk changed the title Constrain memory usage in-memory backend Constrain memory usage in the in-memory backend Sep 21, 2018
@erwinvaneyk erwinvaneyk merged commit c49dfc4 into master Sep 21, 2018
@erwinvaneyk erwinvaneyk deleted the mem-cache branch September 21, 2018 15:53