Wait a bit before reading the next socket chunk
Showing 2 changed files with 14 additions and 0 deletions.
2dedb65
Hi,
Just a few thoughts about this commit. My first thought is: WHY does the system need a few milliseconds before it can process the next inbound packet?
What is happening in the loop that prevents it being decoded/processed successfully? Is Node just sending things too quickly? But if that were the case, why would slowing down the PHP processing code help?
Whilst it's great that `usleep(1000)` fixes this issue for both your system and my system (in Homestead, anyway), perhaps the next person to come along will have an even slower system. Maybe they'll need `usleep(5000)`, etc. It just seems risky to use an arbitrary value and then hope that the problem is fixed. I'll try and have a look into the code as well to see if I can suggest anything too.
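Roughly, the pattern in question looks like this (an illustrative sketch only, not the actual library code; the 4-byte "chunks remaining" header and the names are placeholders):

```php
<?php

// Illustrative sketch of the pattern under discussion: pause for an arbitrary
// amount of time before reading each 1024-byte chunk, then append it to a string.
function readPayload($socket): string
{
    $payload = '';

    do {
        usleep(1000); // wait "a bit" before reading the next chunk: the value in question

        $chunk = socket_read($socket, 1024);               // next 1024-byte chunk
        $remaining = unpack('N', substr($chunk, 0, 4))[1]; // chunks still to come (assumed header)
        $payload .= substr($chunk, 4);                     // strip the header, append the body
    } while ($remaining > 0);

    return $payload;
}
```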
My next thought was the loop in the `readNextProcessValue()` method. I see that we take each inbound packet, look to see how many more chunks belong to this data block, then remove the header and concatenate the string onto a `$payload` variable.

My concern here is that if we received a very large data payload from Node, we could possibly run out of memory when PHP is dealing with it. It feels like if the inbound data was held in a stream, it would be much safer to deal with large payloads.
A rough outline:
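(a minimal sketch only, assuming a `php://temp` buffer; the header layout and function names are placeholders rather than the library's actual code)

```php
<?php

// Buffer the inbound payload in a php://temp stream instead of concatenating
// onto a PHP string. php://temp transparently spills to a temporary file once
// it grows past its in-memory limit, so large payloads don't exhaust memory.
function readPayloadIntoStream($socket)
{
    $payload = fopen('php://temp', 'r+b');

    do {
        $chunk = socket_read($socket, 1024);               // next 1024-byte chunk
        $remaining = unpack('N', substr($chunk, 0, 4))[1]; // chunks still to come (assumed header)
        fwrite($payload, substr($chunk, 4));               // append the body to the stream
    } while ($remaining > 0);

    rewind($payload);

    return $payload; // a resource the caller can consume incrementally
}
```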
Thanks for your help in digging into this.
2dedb65
You are totally right! I was thinking about using streams yesterday; they will be necessary if a heavy screenshot (or another type of data) is transferred. However, if you're calling the `screenshot` method with PuPHPeteer, you're expecting it to return a string (as specified in Puppeteer's documentation), not a resource, so if we need to transform the resource into a string, there is no benefit. Maybe we could add an option to return strings as streams? But I'm not sure about this.

Honestly, the socket part of my code is really crappy (especially in PHP); I don't have that much experience with sockets. I can't even remember why I'm splitting the payloads into chunks of 1024 bytes…
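For example (illustrative only; the sizes and names are made up):

```php
<?php

// Suppose the socket layer buffered the payload in a php://temp stream
// rather than in a string…
$payload = fopen('php://temp', 'r+b');
fwrite($payload, str_repeat('x', 10 * 1024 * 1024)); // stand-in for a large screenshot
rewind($payload);

// …if screenshot() must still return a string, the stream has to be drained
// into memory anyway, so the memory saving disappears at the API boundary.
$screenshot = stream_get_contents($payload);
```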
A refactoring will be needed, I'm totally for it. 👍
2dedb65
@jonnywilliamson You might be interested in following issue #12.
(and thank you for the beer! 🍺)
2dedb65
Yes I am!
Let me have a look and see if I can help with anything.