"Connection reset by peer" failure When doing local test with lost ( >1000) concurent users #545
Comments
Can you show the full stacktrace?
Actually, it only shows up in the "failure" tab of the web interface; in the console output of the locust process there's no stacktrace, even with the log level set to DEBUG. I think it's caught somewhere in the code.
Nothing actionable here.
Will it be reopened if I can produce a project reproducing the problem and/or if I dig further into the code to see if there's a try/except in Locust swallowing the exception? Or is it closed because it's a known Python issue and you can't do anything about it?
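For reference, a minimal sketch, assuming the Locust 0.7.x event-hook API (`locust.events.request_failure`), of how the exception behind each failure could be logged to the console; the handler name and log format are my own, and the exact arguments the hook passes may vary by version:

```python
# Hedged sketch for Locust 0.7.x: attach a handler to the request_failure
# event so the exception behind each failed request (e.g. "Connection reset
# by peer") is logged to the console instead of only appearing in the web UI.
import logging
from locust import events

def log_request_failure(*args, **kwargs):
    # The 0.7.x hook fires with (request_type, name, response_time, exception,
    # ...); accept everything and log it rather than assuming an exact signature.
    logging.error("Request failure event: args=%r kwargs=%r", args, kwargs)

events.request_failure += log_request_failure
```

Dropping something like this into the locustfile would show whether the exceptions are being recorded as failures by Locust's HTTP client rather than raised.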
If you provide more info, we can re-open it.
"Connection reset by peer" means the server (your target host) has closed its connection to Locust. With high probability it has nothing to do with Locust and everything to do with your target server. You might find some hints here: https://github.com/locustio/locust/wiki/FAQ#increase-my-request-raterps
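As a concrete example of the kind of tuning that FAQ entry points at, here is a minimal sketch, assuming the resets are caused by the per-process open-file limit on the Ubuntu machine running both Locust and the Tornado service (a common bottleneck with >1000 concurrent connections):

```python
# Hedged sketch: check and raise the open-file limit for the current process
# before launching a high-concurrency local test. Run in the process that
# starts Locust (and similarly for the Tornado server).
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("RLIMIT_NOFILE: soft=%d hard=%d" % (soft, hard))

# Every concurrent HTTP connection consumes a file descriptor on both ends,
# so a default soft limit of 1024 is easily exhausted by >1000 users.
if soft < hard:
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```

The shell equivalent is checking `ulimit -n` before starting both processes.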
Thank you.
I'm using Locust 0.7.5. When I run a local load test (on my Ubuntu 16.04 machine) against a Tornado web service with a high number of concurrent users, a certain percentage of the requests (~8%) fail with the error "Connection reset by peer".
According to http://www.itmaybeahack.com/homepage/iblog/architecture/C551260341/E20081031204203/index.html, it seems to be a known Python issue, which would also rule out a problem on the server side (corroborated by the fact that I see no error or warning on the server side).
Is this a known Locust issue? If so, what are the workarounds?
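For context, here is a minimal sketch of the kind of locustfile that would reproduce this setup, assuming the 0.7.x API (`HttpLocust`/`TaskSet`); the host, endpoint, and wait times are illustrative placeholders, not details from the original report:

```python
# Hedged sketch of a 0.7.x-style locustfile hitting a local Tornado service.
from locust import HttpLocust, TaskSet, task

class UserBehavior(TaskSet):
    @task
    def index(self):
        # Single GET against the service under test; "/" is a placeholder path.
        self.client.get("/")

class WebsiteUser(HttpLocust):
    task_set = UserBehavior
    host = "http://127.0.0.1:8888"  # assumed local Tornado port
    min_wait = 500
    max_wait = 1500
```

Launched with a high client count (e.g. `locust -f locustfile.py -c 1500 -r 100`, using the 0.7.x CLI flags), this is the scenario in which roughly 8% of requests failed.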