
Sending message that exceeds local message size limit #5455

Closed
fab-10 opened this issue May 15, 2023 · 3 comments · Fixed by #7507
Labels
bug Something isn't working good first issue Good for newcomers help wanted Extra attention is needed P3 Medium (ex: JSON-RPC request not working with a specific client library due to loose spec assumption) peering snack Smaller coding task - less than a day for an experienced dev

Comments

@fab-10
Contributor

fab-10 commented May 15, 2023

Description

It occasionally happens that Besu tries to send messages that are bigger than the configured size limit, and as stated in the code, this should be a bug.

Frequency: Rare

Logs (if a bug)

Error message

Sending eth message to peer (PeerId 0x771af70ea1f0f269c70aabd8be2411d1dc20fe9694380cc137d67c3ca033157e118ed04a1fbf8c37aef28a50e81f19531177907d84cb64f7e7eda7af56172815, reputation PeerReputation 118, validated? true, disconnected? false, client: erigon/nd-222-222-222/v2.42.0-stable-beb97784/linux-amd64/go1.20.2, connection [Connection with hashCode 1450331933 with peer 0x771af70ea1f0f269c70aabd8be2411d1dc20fe9694380cc137d67c3ca033157e118ed04a1fbf8c37aef28a50e81f19531177907d84cb64f7e7eda7af56172815 inboundInitiated true initAt 1684118305150], enode enode://771af70ea1f0f269c70aabd8be2411d1dc20fe9694380cc137d67c3ca033157e118ed04a1fbf8c37aef28a50e81f19531177907d84cb64f7e7eda7af56172815@188.42.93.159:0) which exceeds local message size limit of 10485760 bytes.  Message code: 6, Message Size: 10485762

Versions (Add all that apply)

  • Software version: 23.4.0
@fab-10 fab-10 added bug Something isn't working peering labels May 15, 2023
@non-fungible-nelson non-fungible-nelson added P4 Low (ex: Node doesn't start up when the configuration file has unexpected "end-of-line" character) TeamRevenant GH issues worked on by Revenant Team snack Smaller coding task - less than a day for an experienced dev labels May 25, 2023
@fab-10 fab-10 added the good first issue Good for newcomers label May 22, 2024
@macfarla macfarla added P3 Medium (ex: JSON-RPC request not working with a specific client library due to loose spec assumption) help wanted Extra attention is needed and removed TeamRevenant GH issues worked on by Revenant Team P4 Low (ex: Node doesn't start up when the configuration file has unexpected "end-of-line" character) labels Jul 22, 2024
@rodionlim
Contributor

Is the expected behaviour for Besu to drop the message instead of sending it out?

@fab-10
Contributor Author

fab-10 commented Aug 21, 2024

The message should not have been created in the first place; however, with the current code it is sent anyway, and the receiving peer is free to just ignore it.

So the ideal solution is to avoid creating these messages at all, or otherwise, as a last resort, to drop them instead of sending.
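The "last resort" fallback described above can be sketched as a size guard on the outbound path. This is a minimal, hypothetical illustration, not Besu's actual API; the class and method names (`MessageSizeGuard`, `shouldSend`) are invented, and only the 10485760-byte limit comes from the log above.

```java
// Hypothetical sketch of a catch-all size guard on the send path.
// Names are illustrative; only the 10 MiB limit matches the log above.
public class MessageSizeGuard {
    // Local message size limit from the error log: 10485760 bytes.
    static final int MAX_MESSAGE_SIZE = 10 * 1024 * 1024;

    // Returns true only when the payload fits within the local limit;
    // callers would drop (and log) the message otherwise instead of sending it.
    static boolean shouldSend(byte[] payload) {
        return payload.length <= MAX_MESSAGE_SIZE;
    }

    public static void main(String[] args) {
        System.out.println(shouldSend(new byte[1024]));       // small message passes
        System.out.println(shouldSend(new byte[10485762]));   // size from the log is rejected
    }
}
```

With such a guard in place, the oversized 10485762-byte message from the log would be dropped locally rather than forwarded to the peer.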

@rodionlim
Contributor

@fab-10, let me give this a pass. It seems it's the block messages that are occasionally slipping through, so I added a check before creating the block message itself. I also amended the catch-all to drop messages instead of forwarding them when their size exceeds the limit.
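The "check before creating the block message" part of the fix can be sketched as follows: stop accumulating block bodies once the next one would push the encoded response past the limit. This is a simplified illustration under assumed names (`BodiesResponseBuilder`, `buildResponse`, `LIMIT`), not Besu's actual implementation.

```java
// Hypothetical sketch: cap a block-bodies response at the local size limit
// while building it, so an oversized message is never created.
// Names are illustrative, not Besu's actual code.
import java.util.ArrayList;
import java.util.List;

public class BodiesResponseBuilder {
    // Local message size limit (matches the 10485760 bytes in the log).
    static final int LIMIT = 10 * 1024 * 1024;

    // Returns the longest prefix of encoded bodies whose total size
    // stays within LIMIT; remaining bodies are simply left out.
    static List<byte[]> buildResponse(List<byte[]> encodedBodies) {
        List<byte[]> response = new ArrayList<>();
        int total = 0;
        for (byte[] body : encodedBodies) {
            if (total + body.length > LIMIT) {
                break; // adding this body would exceed the limit
            }
            response.add(body);
            total += body.length;
        }
        return response;
    }
}
```

For example, given two 6 MB bodies, only the first fits under the 10 MiB limit, so the builder returns a one-element response instead of a 12 MB message.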
