Add support for async_simple_cache_middleware
#2579
Conversation
    make_request: Callable[[RPCEndpoint, Any], Any], async_w3: "Web3"
):
    middleware = await async_construct_simple_cache_middleware(
        cache_class=cast(Type[Dict[Any, Any]], functools.partial(lru.LRU, 256)),
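As a side note on the `cache_class=functools.partial(lru.LRU, 256)` pattern above: `cache_class` only needs to be a zero-argument callable that returns a mapping, so `functools.partial` freezes the size argument into a factory. A minimal sketch, using a hypothetical `BoundedDict` stand-in for `lru.LRU` (not the real library):

```python
import functools
from collections import OrderedDict

class BoundedDict(OrderedDict):
    """Hypothetical stand-in for lru.LRU: a dict that drops its
    oldest entry once it grows past max_size."""

    def __init__(self, max_size):
        super().__init__()
        self.max_size = max_size

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        if len(self) > self.max_size:
            self.popitem(last=False)  # evict oldest entry

# functools.partial turns the one-argument constructor into the
# zero-argument factory that cache_class expects.
cache_class = functools.partial(BoundedDict, 256)
cache = cache_class()
cache["eth_chainId"] = 1
print(len(cache))  # 1
```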
Interested in feedback on using LRU here vs anything else
The problem with LRU is that it's not thread safe, IIRC.
Correct, yeah (#1847 for reference). Currently we use LRU with sync as well. Not sure whether we want to add this in as a base and improve on both together, or take care of it in this PR.
I didn't plan on spending a ton of bandwidth on this this week, so maybe that's the route to take if someone else wants to improve on this base in a separate PR before the next beta release?
The other issue with using LRU with an async method is that it will not await the callback it calls when it evicts an item. That was really the main reason I wrote SessionCache in request.py, for what it's worth. So if you are going to be using a callback with the above, it will just return a coroutine (if I remember correctly) instead of executing the async callback.
If you really want to, you could pull SessionCache out of request.py, put it somewhere in _utils, and use that, but that would be a decent amount of work. SessionCache itself is solid; the main issue still outstanding in request.py is how and where we await the close of an evicted async item.
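The "it will just return a coroutine" point can be shown with a minimal sketch (hypothetical names, not web3.py code): a synchronous cache invokes its eviction callback with a plain call, so passing an async function only creates a coroutine object and the cleanup body never runs.

```python
import asyncio

async def close_session(session):
    # async cleanup we would want to run on eviction
    session["closed"] = True

def evict(item, callback):
    # A sync cache (like lru.LRU) invokes its eviction callback
    # synchronously; calling an async function here only *creates*
    # a coroutine object, it never executes the body.
    return callback(item)

session = {"closed": False}
result = evict(session, close_session)
print(asyncio.iscoroutine(result))  # True
print(session["closed"])            # False: cleanup never ran
result.close()  # silence the "coroutine was never awaited" warning
```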
async def middleware(method: RPCEndpoint, params: Any) -> RPCResponse:
    if method in rpc_whitelist:
        with lock:
Interested in feedback on this main section, the thread locking.
This is the exact issue I currently have in request.py with async and thread locking. The await below will release execution of this method so something else can run while you are still holding the lock. That will eventually cause a deadlock.
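The hazard and the usual fix can be sketched as follows (hypothetical names, not the PR's code): never await while holding a `threading.Lock`; for coroutines sharing one event loop, an `asyncio.Lock` yields control while waiting, so the holder can finish and release. Note `asyncio.Lock` only coordinates coroutines on one loop, not across threads.

```python
import asyncio
import threading

cache = {}
thread_lock = threading.Lock()

async def cached_fetch_unsafe(key, fetch):
    # DON'T: awaiting while holding a threading.Lock suspends this
    # coroutine with the lock still held; another coroutine on the
    # same event loop that tries to take the lock then blocks the
    # whole loop, and the holder can never resume to release it.
    with thread_lock:
        if key not in cache:
            cache[key] = await fetch(key)
        return cache[key]

async def cached_fetch(key, fetch, lock):
    # DO: asyncio.Lock yields to the event loop while waiting, so
    # other coroutines keep running and the lock gets released.
    async with lock:
        if key not in cache:
            cache[key] = await fetch(key)
        return cache[key]

async def fake_rpc(key):
    await asyncio.sleep(0)  # stand-in for the real RPC round trip
    return f"result-for-{key}"

async def main():
    lock = asyncio.Lock()
    return await asyncio.gather(
        cached_fetch("eth_chainId", fake_rpc, lock),
        cached_fetch("eth_chainId", fake_rpc, lock),
    )

results = asyncio.run(main())
print(results)  # both calls see the same cached value
```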
Yeah I think I get it a bit more now. Thanks for the explanation. I think the first step here is probably writing a good test that fails, with the situation you described, and work to make it not fail. I'll see if I can whip something up this week but I'm out for half of the week. Realistically probably won't get to this until next week.
I was able to use the POC in #2446 to pretty reliably reproduce the request.py issue, so maybe you could use something similar.
This is a pretty rough copy of the sync version with a few changes to make it work for async. Interested in async contributor feedback on whether the meat of the thread locking / simple cache setup could be improved upon. I just added this quickly so that we don't have to re-introduce the chain_id setter. cc: @dbfreem, @Eenae, @DefiDebauchery
- Add async support for simple cache middleware
- Refactor ``SessionCache`` as a more generic ``SimpleCache`` class to be used internally as a standardized cache where appropriate
- Use ``SimpleCache`` as the default cache for the simple cache middleware
- Include tests for dictionary-based cache and ``SimpleCache`` for simple cache middleware
I had a couple nits, but looks good once we pull out the dictionary cache!
lgtm!
- Remove dictionary cache support altogether for the simple cache middleware, in favor of using the ``SimpleCache`` class.
- Handle missing keys in ``SimpleCache`` a bit more gracefully than throwing a ``KeyError``: just return ``None`` if the key is not in the cache.
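The graceful-miss behavior can be sketched with a minimal cache class. This is a hypothetical stand-in for `SimpleCache` (bounded, lock-guarded, and returning `None` on a miss), not web3.py's actual implementation:

```python
import threading
from collections import OrderedDict
from typing import Any, Optional

class MiniCache:
    """Hypothetical sketch of a SimpleCache-style wrapper: bounded,
    thread-safe, and returning None for missing keys instead of
    raising KeyError."""

    def __init__(self, size: int = 256) -> None:
        self._size = size
        self._data: "OrderedDict[str, Any]" = OrderedDict()
        self._lock = threading.Lock()

    def cache(self, key: str, value: Any) -> Any:
        with self._lock:
            self._data[key] = value
            self._data.move_to_end(key)
            while len(self._data) > self._size:
                self._data.popitem(last=False)  # evict oldest entry
        return value

    def get_cache_entry(self, key: str) -> Optional[Any]:
        with self._lock:
            # graceful miss: None rather than KeyError
            return self._data.get(key)

c = MiniCache(size=2)
c.cache("eth_chainId", 1)
print(c.get_cache_entry("eth_chainId"))  # 1
print(c.get_cache_entry("missing"))      # None
```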
Updated... I'm not quite sure how to add bullets to the feature newsfragment 🤔. Other than that, this should be ready for a quick re-review with the removal of the dictionary-based cache support.
Sweet!
What was wrong?
Related to Issue #2578

Hard-coding an internal `chain_id` was considered at some point, but ultimately we landed on adding `eth_chainId` to the list of whitelisted endpoints for `simple_cache_middleware`. The `chain_id` setter had made its way into the library already, and some users had gotten around the multiple `eth_chainId` checks by using this setter, especially with `AsyncEth`. Rather than introduce this setter back, this PR adds support for caching `eth_chainId` calls using the `async_simple_cache_middleware` with `AsyncEth`.

How was it fixed?
- Add `async_simple_cache_middleware` and `async_construct_simple_cache_middleware`
- Refactor `SessionCache` to a more generic `SimpleCache` within the caching `_utils` and use this cache class in the simple cache middleware as the default cache (for sync and async). Leave support for a simple dictionary cache as well.

Todo:
Cute Animal Picture