This repository has been archived by the owner on Apr 26, 2024. It is now read-only.

Optionally measure size of cache by sum of length of values #1815

Merged

erikjohnston merged 11 commits into develop from erikj/iter_cache_size on Jan 17, 2017

Conversation

erikjohnston
Member

No description provided.
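Since the PR carries no description, here is a rough sketch of the behaviour the title describes (names are illustrative, not Synapse's actual API): when a size callback is supplied, the cache's reported size is the sum of the callback over all values rather than the number of entries.

```python
# Illustrative sketch of the feature in the PR title. "measured_size"
# and its signature are hypothetical, not Synapse's real API.
def measured_size(cache, size_callback=None):
    if size_callback is None:
        # Default behaviour: one entry counts as one unit.
        return len(cache)
    # Optional behaviour: weigh each entry by the callback,
    # e.g. size_callback=len measures by length of each value.
    return sum(size_callback(v) for v in cache.values())
```

With `size_callback=len`, a cache holding two strings of lengths 2 and 3 reports size 5 instead of 2.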

erikjohnston force-pushed the erikj/iter_cache_size branch from 87c56c6 to 46aebbb on January 16, 2017 15:00
@@ -58,6 +58,18 @@ def __init__(self, max_size, keylen=1, cache_type=dict):

lock = threading.Lock()

def cache_len():
if size_callback is not None:
return sum(size_callback(node.value) for node in cache.itervalues())
Contributor

This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.
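The reviewer's suggestion can be sketched as follows (a minimal toy, not the Synapse implementation): keep a running size total that is adjusted on every insert, replace, and removal, so reading the size is O(1) instead of a full iteration.

```python
# Minimal sketch of incremental size tracking. Class and method names
# are hypothetical; Synapse's LruCache differs in detail.
class SizedCache:
    def __init__(self, size_callback=None):
        self._cache = {}
        self._size_callback = size_callback
        self._size = 0  # running total, updated on every mutation

    def set(self, key, value):
        old = self._cache.get(key)
        if old is not None and self._size_callback:
            # Replacing an entry: subtract the old value's size first.
            self._size -= self._size_callback(old)
        self._cache[key] = value
        if self._size_callback:
            self._size += self._size_callback(value)

    def pop(self, key):
        value = self._cache.pop(key)
        if self._size_callback:
            self._size -= self._size_callback(value)
        return value

    def __len__(self):
        # O(1): no iteration over the cache contents.
        if self._size_callback:
            return self._size
        return len(self._cache)
```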

Contributor

If you want to be awful you can move the if statement outside the function def and define the function twice...
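The trick being suggested is to hoist the `None` check out of the hot path: decide once, at construction time, which implementation to use, instead of re-testing `size_callback` on every call. A hedged sketch (helper name is hypothetical; the original uses Python 2's `itervalues()`):

```python
# Sketch of "move the if outside the def and define the function twice".
# The branch runs once when the cache is built, not on every len() call.
def make_cache_len(cache, size_callback):
    if size_callback is not None:
        def cache_len():
            return sum(size_callback(v) for v in cache.values())
    else:
        def cache_len():
            return len(cache)
    return cache_len
```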

)

def __len__(self):
return len(self._cache)
if self.iterable:
return sum(len(value.value) for value in self._cache.itervalues())
Contributor
This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.

Instead of calculating the size of the cache repeatedly, which can take
a long time now that it can use a callback, cache the size and update
it on insertion and deletion.

This requires changing the cache descriptors to have two caches, one for
pending deferreds and the other for the actual values. There's no reason
to evict from the pending deferreds as they won't take up any more
memory.
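The two-cache split described in the commit message can be sketched roughly as below. This is an illustration only, using `concurrent.futures.Future` as a stand-in for Twisted deferreds, with hypothetical names: completed values live in the size-limited, evictable cache, while in-flight computations live in a plain dict that is never evicted, since a pending result occupies no extra memory worth reclaiming.

```python
from concurrent.futures import Future

# Toy sketch of a descriptor with two caches (not Synapse's code):
# _values is evictable, _pending (in-flight computations) is not.
class DescriptorCache:
    def __init__(self, max_entries):
        self._max = max_entries
        self._values = {}   # completed results, subject to eviction
        self._pending = {}  # Futures for in-flight work, never evicted

    def start(self, key):
        """Record an in-flight computation; returns a Future to resolve."""
        f = Future()
        self._pending[key] = f
        return f

    def complete(self, key, value):
        """Move a finished computation into the evictable value cache."""
        self._pending.pop(key).set_result(value)
        self._values[key] = value
        while len(self._values) > self._max:
            # Evict the oldest insertion (stand-in for LRU eviction).
            self._values.pop(next(iter(self._values)))
```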
erikjohnston force-pushed the erikj/iter_cache_size branch from c0698e0 to d906206 on January 17, 2017 11:31

@synchronized
def cache_set_default(key, value):
node = cache.get(key, None)
if node is not None:
evict() # As the new node may be bigger than the old node.
Contributor
This doesn't seem necessary if we aren't modifying the cache.
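The point under discussion can be illustrated with a toy (not Synapse's code; function names are hypothetical): once entries are weighed by size, overwriting a value can push the total over the limit, so a plain set must evict afterwards, whereas a set-default that finds an existing key leaves the cache unchanged and has nothing to evict.

```python
def evict_oldest(cache):
    """Drop the entry inserted first (stand-in for LRU eviction)."""
    cache.pop(next(iter(cache)))

def set_value(cache, key, value, max_size):
    cache[key] = value  # the replacement may be bigger than the old value
    while sum(len(v) for v in cache.values()) > max_size:
        evict_oldest(cache)

def set_default(cache, key, value, max_size):
    if key in cache:
        return cache[key]  # cache unchanged, so no eviction is needed
    cache[key] = value
    while sum(len(v) for v in cache.values()) > max_size:
        evict_oldest(cache)
    return value
```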

@@ -128,7 +128,7 @@ def test_set(self):
m = Mock()
cache = LruCache(1)

cache.set("key", "value", m)
cache.set("key", "value", [m])
Contributor
Maybe use callbacks=[m] here?

erikjohnston force-pushed the erikj/iter_cache_size branch from 2e8afb5 to 9e8e236 on January 17, 2017 11:50
NegativeMjark (Contributor) left a comment:
LGTM

erikjohnston merged commit d11d7cd into develop on Jan 17, 2017
erikjohnston deleted the erikj/iter_cache_size branch on March 29, 2017 10:58