
feat(cli): add a RPC batch option to cli #2322

Merged: 8 commits into latticexyz:main, Mar 5, 2024

Conversation

0x-shanks
Contributor

This PR adds an RPC batch size option to the deploy command.

Resolves: #2236 (Add RPC call concurrency restrictions to deploy command options)


changeset-bot bot commented Feb 27, 2024

🦋 Changeset detected

Latest commit: c812110

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 30 packages
Name Type
@latticexyz/cli Major
@latticexyz/abi-ts Major
@latticexyz/block-logs-stream Major
@latticexyz/common Major
@latticexyz/config Major
create-mud Major
@latticexyz/dev-tools Major
@latticexyz/ecs-browser Major
@latticexyz/faucet Major
@latticexyz/gas-report Major
@latticexyz/network Major
@latticexyz/noise Major
@latticexyz/phaserx Major
@latticexyz/protocol-parser Major
@latticexyz/react Major
@latticexyz/recs Major
@latticexyz/schema-type Major
@latticexyz/services Major
@latticexyz/solecs Major
solhint-config-mud Major
solhint-plugin-mud Major
@latticexyz/std-client Major
@latticexyz/std-contracts Major
@latticexyz/store-cache Major
@latticexyz/store-indexer Major
@latticexyz/store-sync Major
@latticexyz/store Major
@latticexyz/utils Major
@latticexyz/world-modules Major
@latticexyz/world Major


@0x-shanks
Contributor Author

0x-shanks commented Feb 27, 2024

@holic

#2236 (comment)

viem has no corresponding throttling option, but it does have one that batches requests instead, so I substituted that:
https://viem.sh/docs/clients/transports/http#batch-json-rpc

In my environment, setting batchSize to 1000 allowed me to deploy without any errors.
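
For reference, a minimal sketch of what this looks like with viem's http transport (the RPC URL is a placeholder, and batchSize: 1000 is just the value from my tests, not a recommendation):

```ts
import { createPublicClient, http } from "viem";
import { foundry } from "viem/chains";

// viem's http transport can merge JSON-RPC requests into batches:
// batchSize caps how many calls go into one HTTP request, and
// wait (in ms) is how long the transport collects calls before flushing.
const client = createPublicClient({
  chain: foundry,
  transport: http("https://rpc.example.com", {
    batch: { batchSize: 1000 },
  }),
});
```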

I have closed the PR I made in the past and opened this new one:

#2237

#2322

@holic
Member

holic commented Feb 27, 2024

Out of curiosity, does just using batch: true do the same thing for you? If so, I think we could feasibly turn this on by default for the deployer (assuming viem gracefully falls back if batching isn't supported for a particular RPC).

@0x-shanks
Contributor Author

I don't think just batch: true will work, because if you don't specify batch.wait, requests won't be collected into batches over a window like 1 second (the default wait is 0 ms).
https://viem.sh/docs/clients/transports/http#batchwait-optional
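
For instance, the difference is between these two transports (a hedged sketch; the endpoint is a placeholder):

```ts
import { http } from "viem";

// batch: true uses the defaults (batchSize: 1000, wait: 0 ms),
// so pending calls are flushed every event-loop tick.
const defaultBatching = http("https://rpc.example.com", { batch: true });

// An explicit wait collects requests over a longer window (here 1 s)
// before flushing them as one batch.
const windowedBatching = http("https://rpc.example.com", {
  batch: { batchSize: 1000, wait: 1000 },
});
```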

@0x-shanks
Contributor Author

Do you usually update the documentation in the same PR?

@holic
Member

holic commented Feb 27, 2024

If you read further up, it'll still batch within the same "tick" (see also zero delay):

[screenshot of the viem docs: with a wait of 0, requests sent within the same tick are still batched]

This makes me suspect most of our operations would be batched because they're often sent simultaneously.
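
A small sketch of that behavior, assuming a plain viem public client (placeholder URL): even with the default wait of 0 ms, calls issued in the same tick go out as one batch.

```ts
import { createPublicClient, http } from "viem";
import { mainnet } from "viem/chains";

const client = createPublicClient({
  chain: mainnet,
  transport: http("https://rpc.example.com", { batch: true }),
});

// Both calls are created in the same event-loop tick, so the transport
// merges them into a single HTTP POST carrying a JSON-RPC batch:
// [{ method: "eth_blockNumber", ... }, { method: "eth_chainId", ... }]
const [blockNumber, chainId] = await Promise.all([
  client.getBlockNumber(),
  client.getChainId(),
]);
```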

@holic
Member

holic commented Feb 27, 2024

Do you usually update the document side with the same PR?

Ideally, yep!

@0x-shanks
Contributor Author

0x-shanks commented Feb 27, 2024

The same effect as throttling could only be achieved by setting a sufficiently large batchSize together with a sufficiently long wait.

For example, each ensure* function calls Promise.all internally, and Promise.all issues multiple eth_calls that get grouped into a single batch call:
https://github.com/latticexyz/mud/blob/main/packages/cli/src/deploy/deploy.ts#L93-L114

Presumably, the requests grouped into a single batch are those issued within a certain period of time (the wait window).

My hypothesis is this (sketched numerically below):

  1. If batchSize is small, requests are split into more batches, more requests are sent per second, and the rate limit is hit.
  2. If wait is short, requests are sent at short intervals even when batchSize is large, and the rate limit is hit.

I'm not sure which of these should be exposed as CLI arguments.
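
A back-of-the-envelope sketch of that hypothesis (the numbers are made up for illustration, not measurements):

```ts
// If N calls arrive within one wait window, the transport flushes
// ceil(N / batchSize) HTTP requests per window, so the provider sees
// roughly ceil(N / batchSize) * (1000 / wait) requests per second
// while the deploy keeps issuing calls.
const N = 2000;        // hypothetical simultaneous eth_calls from Promise.all
const batchSize = 10;  // small batches -> many HTTP requests
const wait = 10;       // short window -> frequent flushes

const requestsPerSecond = Math.ceil(N / batchSize) * (1000 / wait);
console.log(requestsPerSecond); // 20000 req/s: easily rate limited

// Raising batchSize (fewer batches) or wait (fewer flushes per second)
// both push this number down, which matches the hypothesis above.
```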

@0x-shanks
Contributor Author

I'm guessing based on the actual results I've tried, so I could be wrong.

@holic
Member

holic commented Feb 28, 2024

I'm guessing based on the actual results I've tried, so I could be wrong.

Could you try with just batch: true and let us know if this still hits RPC rate limits?

@0x-shanks
Contributor Author

0x-shanks commented Feb 28, 2024

Of course, I've tried batch: true, and it still hit RPC rate limits.

My environment is this:

[screenshot of environment details]

Here are the results of testing with each parameter fixed:

  • batchSize: 10, wait: 1000 ms: hit RPC rate limits
  • batchSize: 100, wait: 1000 ms: no error
  • batchSize: 1000, wait: 1000 ms: no error
  • batchSize: 100, wait: 0 ms: hit RPC rate limits
  • batchSize: 100, wait: 10 ms: hit RPC rate limits
  • batchSize: 100, wait: 100 ms: hit RPC rate limits

For reference, the defaults when batch: true are batchSize: 1000 and wait: 0 ms.
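
In viem terms, the passing configurations above correspond to a transport like this (placeholder endpoint; values taken from the tests above, not proposed defaults):

```ts
import { http } from "viem";

// batchSize: 100 with wait: 1000 ms avoided the rate limit above,
// while batch: true alone (batchSize: 1000, wait: 0 ms) did not.
const transport = http("https://rpc.example.com", {
  batch: { batchSize: 100, wait: 1000 },
});
```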

@0x-shanks 0x-shanks changed the title feat(cli): Add a RPC batch size option to cli feat(cli): Add a RPC batch option to cli Feb 29, 2024
@holic
Member

holic commented Mar 5, 2024

Asked if we could enable this by default, but it sounds like there's no good way to detect whether batching is supported, so keeping it as an option makes sense: https://discord.com/channels/1156791276818157609/1212120089219043369

@holic holic changed the title feat(cli): Add a RPC batch option to cli feat(cli): add a RPC batch option to cli Mar 5, 2024
@holic holic merged commit 645736d into latticexyz:main Mar 5, 2024
10 of 12 checks passed