Optionally measure size of cache by sum of length of values #1815
Conversation
Force-pushed from 87c56c6 to 46aebbb.
```diff
@@ -58,6 +58,18 @@ def __init__(self, max_size, keylen=1, cache_type=dict):

         lock = threading.Lock()

+        def cache_len():
+            if size_callback is not None:
+                return sum(size_callback(node.value) for node in cache.itervalues())
```
This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.
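A minimal sketch of that suggestion, in the same closure-based style as lrucache.py (the helper names and the plain dict are illustrative, not the PR's final code):

```python
# Illustrative only: keep a running total instead of walking the whole cache
# on every length query.  A one-element list acts as a mutable cell so the
# nested functions can update it (Python 2 has no `nonlocal`).
size_callback = len  # e.g. measure values by their length
cache = {}
cache_size = [0]

def entry_size(value):
    return size_callback(value) if size_callback is not None else 1

def cache_set(key, value):
    old = cache.get(key)
    if old is not None:
        cache_size[0] -= entry_size(old)
    cache[key] = value
    cache_size[0] += entry_size(value)

def cache_pop(key):
    old = cache.pop(key, None)
    if old is not None:
        cache_size[0] -= entry_size(old)
    return old

def cache_len():
    # O(1): just return the counter kept in step with the cache.
    return cache_size[0]
```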
If you want to be awful you can move the if statement outside the function def and define the function twice...
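A rough illustration of that pattern, assuming it sits inside `LruCache.__init__` next to the diff above (so `cache`, `size_callback` and `node.value` come from the surrounding code); note it is still O(n) per call in the callback case, unlike the running total sketched earlier:

```python
# Choose the implementation once, so the hot path never re-tests size_callback.
if size_callback is not None:
    def cache_len():
        return sum(size_callback(node.value) for node in cache.itervalues())
else:
    def cache_len():
        return len(cache)
```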
```diff
 )

     def __len__(self):
-        return len(self._cache)
+        if self.iterable:
+            return sum(len(value.value) for value in self._cache.itervalues())
```
This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.
Instead of calculating the size of the cache repeatedly, which can take a long time now that it can use a callback, cache the size and update it on insertion and deletion. This requires changing the cache descriptors to use two caches: one for pending deferreds and one for the actual values. There's no reason to evict from the pending deferreds, as they won't take up any more memory.
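A very rough sketch of that two-cache shape, using plain dicts and Twisted deferreds; all names are illustrative, and the real change lives in Synapse's cache descriptors (which also wrap the pending result so multiple callers can wait on one computation):

```python
from twisted.internet import defer


class TwoCacheSketch(object):
    """Completed results live in the size-limited cache; in-flight deferreds
    live in a separate dict that is never evicted, since a pending deferred
    does not yet hold a large value."""

    def __init__(self):
        self.cache = {}                 # stands in for the size-limited LruCache
        self.pending_result_cache = {}  # key -> Deferred, deliberately unbounded

    def fetch(self, key, compute):
        if key in self.cache:
            return defer.succeed(self.cache[key])

        pending = self.pending_result_cache.get(key)
        if pending is not None:
            # Simplification: the real code hands each caller its own view of
            # the pending result rather than sharing one Deferred.
            return pending

        d = defer.maybeDeferred(compute, key)
        self.pending_result_cache[key] = d

        def _move_to_cache(result):
            # Promote the finished value: drop it from the pending cache and
            # store it, so it only counts towards the cache size once real.
            self.pending_result_cache.pop(key, None)
            self.cache[key] = result
            return result

        d.addCallback(_move_to_cache)
        return d
```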
Force-pushed from c0698e0 to d906206.
```diff
         @synchronized
         def cache_set_default(key, value):
             node = cache.get(key, None)
             if node is not None:
+                evict()  # As the new node may be bigger than the old node.
```
This doesn't seem necessary if we aren't modifying the cache.
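In other words, something like the shape below, where only the branch that actually inserts a new node calls evict(); add_node, evict and synchronized are the surrounding helpers in lrucache.py, shown here purely as an illustration:

```python
@synchronized
def cache_set_default(key, value):
    node = cache.get(key, None)
    if node is not None:
        # setdefault leaves an existing entry untouched, so the cache
        # cannot have grown and there is nothing to evict.
        return node.value
    add_node(key, value)
    evict()  # only a genuinely new entry can push us over the size limit
    return value
```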
```diff
@@ -128,7 +128,7 @@ def test_set(self):
         m = Mock()
         cache = LruCache(1)

-        cache.set("key", "value", m)
+        cache.set("key", "value", [m])
```
Maybe use callbacks=[m] here?
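I.e., assuming the new argument is named callbacks, the test line would read:

```python
cache.set("key", "value", callbacks=[m])
```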
Force-pushed from 2e8afb5 to 9e8e236.
LGTM