http.batch socket: too many open files #296

Closed

ppcano opened this issue Aug 18, 2017 · 8 comments

Comments

@ppcano
Contributor

ppcano commented Aug 18, 2017

The http.batch documentation does not describe how many resources can be loaded in parallel in a single batch statement.

  1. @liclac: what would be a reasonable maximum number of requests in a single http.batch statement? We need to know this to improve the HAR converter WIP #291. I have tested a bit more, and the error is also related to the number of VUs and the sleep periods.

  2. For users who run into the same error, could we help them with a better error message, or avoid the error altogether? Thoughts?

...
WARN[0008] Request Failed            error="Get https://d3dfcx9bcohi6j.cloudfront.net/static/images/logo-harvard.06e80d50d253.png: dial tcp 54.230.96.198:443: socket: too many open files"
WARN[0008] Request Failed            error="Get https://d3dfcx9bcohi6j.cloudfront.net/static/images/logo-sephora.d6de678a1214.png: dial tcp 54.230.96.198:443: socket: too many open files"
WARN[0008] Request Failed            error="Get https://d3dfcx9bcohi6j.cloudfront.net/static/images/logo-sfdc.51c6a0da92cf.png: dial tcp 54.230.96.198:443: socket: too many open files"
WARN[0008] Request Failed            error="Get https://d3dfcx9bcohi6j.cloudfront.net/static/images/logo-blue-medium.a122a7ddd06f.png: dial tcp 54.230.96.198:443: socket: too many open files"
WARN[0008] Request Failed            error="Get https://d3dfcx9bcohi6j.cloudfront.net/static/images/insights-arrow.9e9a1171b89b.png: dial tcp 54.230.96.198:443: socket: too many open files"
...
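
For reference, a minimal sketch of the kind of script that can trigger this, assuming placeholder URLs and arbitrary VU/asset counts (none of these values come from the original script):

    import http from "k6/http";
    import { sleep } from "k6";

    export let options = { vus: 50, duration: "30s" };

    export default function () {
      let urls = [];
      for (let i = 0; i < 100; i++) {
        urls.push("https://example.com/static/asset-" + i + ".png");
      }
      // With no cap on batch parallelism, every VU opens all of these
      // connections at once and the process can run out of file descriptors.
      http.batch(urls);
      sleep(1);
    }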
@liclac
Contributor

liclac commented Aug 21, 2017

There should definitely be a limit to how many requests are executed in parallel.

@liclac
Contributor

liclac commented Dec 4, 2017

Note: the current solution doesn't do per-host limiting, because that adds a significant amount of complexity for something I honestly can't see being very useful.

If you have a use case where a granular limit trumps a general one, please comment.
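
For anyone landing here later, a sketch of what the general (non-per-host) limit looks like from the script side, assuming the batch option discussed in this thread; URLs and values are placeholders:

    import http from "k6/http";

    export let options = {
      // Upper bound on how many requests a single http.batch() call keeps
      // in flight at once; the value here is only an example.
      batch: 10,
    };

    export default function () {
      let urls = [];
      for (let i = 0; i < 40; i++) {
        urls.push("https://example.com/img-" + i + ".png");
      }
      http.batch(urls); // at most 10 of these run concurrently
    }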

@ppcano
Contributor Author

ppcano commented Dec 5, 2017

@liclac

What will the expected behavior be if, for example, the user sets the batch option to 10 and adds 20 requests to a batch?

@liclac
Contributor

liclac commented Dec 5, 2017

All of them get run in parallel.

@robingustafsson
Member

Wait, why would all 20 requests, in @ppcano's example above, run in parallel?

Per-host limiting is needed as a tweakable knob to more realistically simulate a browser's connection management (a common use case when testing websites that need to support browsers with varying per-host connection limits).

I guess this CL (as a solution to golang/go#13957) will make implementing this easy when it lands (hopefully in Go 1.10, Feb next year) :)

@liclac
Contributor

liclac commented Dec 5, 2017

Oh, I misread it. At most 10 requests will be run in parallel; a new one is started every time a previous one finishes.
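
In other words, the batch value behaves like a small worker pool. A conceptual JavaScript sketch of that scheduling (not k6's actual implementation, which lives in the Go code) could look like this:

    // Run at most `limit` tasks in parallel; start a new task whenever one
    // finishes. Conceptual only.
    async function runBatch(tasks, limit) {
      const results = new Array(tasks.length);
      let next = 0;
      async function worker() {
        while (next < tasks.length) {
          const i = next++;
          results[i] = await tasks[i]();
        }
      }
      const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
      await Promise.all(workers);
      return results;
    }

    // 20 dummy "requests" with a limit of 10: never more than 10 in flight.
    const tasks = Array.from({ length: 20 }, (_, i) => () =>
      new Promise((resolve) => setTimeout(() => resolve(i), 50)));
    runBatch(tasks, 10).then((r) => console.log(r.length, "tasks completed"));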

@liclac
Contributor

liclac commented Dec 5, 2017

I'll toss in a batchPerHost option. The problem is that it's going to need some major refactoring of how the module works to avoid parsing URLs multiple times, and more complexity is not what this module needs right now.

@liclac
Contributor

liclac commented Dec 8, 2017

batchPerHost added. 06488b8
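
For completeness, a sketch of how the two options could be combined once batchPerHost lands, assuming it sits alongside the existing batch option; hosts, URLs, and values are placeholders:

    import http from "k6/http";

    export let options = {
      batch: 20,       // overall cap on parallel requests per http.batch() call
      batchPerHost: 6, // per-host cap, e.g. to mimic a browser's connection limit
    };

    export default function () {
      http.batch([
        "https://example.com/a.png",
        "https://example.com/b.png",
        "https://static.example.com/c.png",
      ]);
    }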
