Speed-up account iteration #1454
Comments
Good question! For a bit of background, Subxt supports either the "old" RPC methods (still the default) or the "new" V2 RPC methods (https://github.com/paritytech/json-rpc-interface-spec/), and you can instantiate a client against either backend. The new storage APIs stream entries back rather than fetching them in caller-chosen pages, which is why the `page_size` parameter is no longer exposed on `iter()`.

That said, I think we should put a bit of time into testing the speed of the current (legacy) storage iteration and see what we can do to improve it. One option might be to allow the page size used by the legacy backend to be configured. If you'd like to be a guinea pig, you could fork Subxt, increase the number here, and see whether it makes things much faster for you:

subxt/subxt/src/backend/legacy/mod.rs, line 336 (at commit 14b796a)
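For context, here is a minimal sketch (not Subxt's internal code) of what the legacy backend's paged key fetching boils down to: each page of keys is a separate `state_getKeysPaged` round trip, so the page size directly controls how many round trips a full scan of a big map like `System.Account` needs. The node URL and the page size of 1000 below are placeholders, and the exact import paths can differ between Subxt versions.

```rust
// A minimal sketch (NOT Subxt's internal code) of how legacy storage iteration
// pages through keys: each `state_getKeysPaged` call is one RPC round trip, so a
// small page size means many round trips over a large map like System.Account.
use subxt::backend::rpc::{rpc_params, RpcClient};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder node URL.
    let rpc = RpcClient::from_url("wss://rpc.example.com:443").await?;

    // Storage prefix for System.Account: twox128("System") ++ twox128("Account").
    let prefix = "0x26aa394eea5630e07c48ae0c9558cef7b99d880ec681799c0cf30e8886371da9";

    let page_size = 1000u32; // placeholder; the legacy backend hard-codes its own value
    let mut start_key: Option<String> = None;
    let mut total = 0usize;

    loop {
        // One round trip per page; larger pages mean fewer round trips overall.
        let keys: Vec<String> = rpc
            .request("state_getKeysPaged", rpc_params![prefix, page_size, &start_key])
            .await?;
        total += keys.len();
        match keys.last() {
            // A full page probably means there are more keys after the last one.
            Some(last) if keys.len() as u32 == page_size => start_key = Some(last.clone()),
            _ => break,
        }
    }

    println!("found {total} account keys");
    Ok(())
}
```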
Thanks @jsdw for the detailed answer. Increasing that number in a fork speeds up the iteration a lot for me, so I think it would be really helpful if a configuration option for the page size were exposed directly.
Ah great, it's good to hear that helps a bunch; adding that configuration option to the legacy backend sounds very reasonable to me.
@azero-live I added a config option as suggested above; see #1458 for example usage :)
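For later readers, here is a rough sketch of what using such a page-size option could look like. The `LegacyBackend::builder()`, `storage_page_size`, and `from_backend` names are guesses at the shape of the option, not a confirmed API, so check #1458 for the actual usage.

```rust
// A rough sketch, assuming a builder-style page-size option on the legacy
// backend; the exact method names may differ from what #1458 landed.
use std::sync::Arc;
use subxt::backend::legacy::LegacyBackend;
use subxt::backend::rpc::RpcClient;
use subxt::{OnlineClient, PolkadotConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder node URL.
    let rpc = RpcClient::from_url("wss://rpc.example.com:443").await?;

    // Ask the legacy backend to fetch storage keys/values in larger pages,
    // trading a bigger response per request for far fewer RPC round trips.
    let backend = LegacyBackend::<PolkadotConfig>::builder()
        .storage_page_size(1000)
        .build(rpc);

    let api = OnlineClient::<PolkadotConfig>::from_backend(Arc::new(backend)).await?;

    // ...iterate System.Account via api.storage() as usual...
    let _ = api;
    Ok(())
}
```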
Perfect, thank you @jsdw for the fast response and integration! 😄
Dear subxt team,
I'm currently using the latest subxt v0.34.0 and would like to iterate through all existing accounts of the Aleph Zero network, following this example: https://github.com/paritytech/subxt/blob/master/subxt/examples/storage_iterating.rs. Unfortunately, iterating the accounts is extremely slow.

I know that earlier subxt versions allowed an additional `page_size` param to be passed to `iter()`. Is there another option I'm missing that would let me iterate through all the accounts quickly?

Thanks for all the help and efforts.
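For reference, a condensed sketch of the iteration being described, modelled on the linked storage_iterating.rs example; the metadata path, node URL, and generated `runtime` module are placeholders for the target chain.

```rust
// A condensed sketch of the account iteration described above, based on the
// linked storage_iterating.rs example. The metadata path, node URL, and the
// generated `runtime` module are placeholders for the target chain.
use futures::StreamExt;
use subxt::{OnlineClient, PolkadotConfig};

#[subxt::subxt(runtime_metadata_path = "metadata.scale")]
pub mod runtime {}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api = OnlineClient::<PolkadotConfig>::from_url("wss://rpc.example.com:443").await?;

    // Iterate every entry of System.Account; with the legacy backend this is
    // fetched page by page under the hood, hence the interest in the page size.
    let query = runtime::storage().system().account_iter();
    let mut stream = api.storage().at_latest().await?.iter(query).await?;

    let mut count = 0u64;
    while let Some(entry) = stream.next().await {
        // The yielded item carries the raw key bytes and the decoded AccountInfo;
        // its exact shape varies a little between subxt versions.
        let _entry = entry?;
        count += 1;
    }
    println!("iterated {count} accounts");
    Ok(())
}
```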