Requests Per Second Plot Breaks When There are too Many Unique URLs #1059
Comments
What about it is broken?
Please give a clear description of the issue, including details of what you are expecting and what is actually happening, and leave out other information, conjectures, or ideas for possible fixes. This issue report is packed with information that is irrelevant, which makes it difficult to follow. (For example, I can't even tell if this is a bug report or a feature request for tagging responses; it seems like both.)
Fair, I'm a bit new to this, if you can't tell. Primarily I want to fix the issue, but I believe that tagging would conveniently solve the bug as well. Here is a revised bug report:

Description
Requests per second stops plotting under certain circumstances.

Expected behavior
The reported RPS in the Locust web client should be graphed accurately in the total requests per second chart.

Actual behavior
Total requests per second is not graphing properly under certain circumstances.

Generating unique URLs: [chart screenshot]
Repeatedly hitting the same URL: [chart screenshot]

Environment settings

Steps to reproduce
Write a locustfile that makes requests on a large number of unique URLs. Once the number of requested URLs hits about 500, the graph breaks.
Have you tried adding name="string" to your request? I.e., unless I am mistaken, this is what you are after.
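For example, something along these lines (the path and the grouping label here are just placeholders for whatever your endpoints look like):

    # Every request issued with the same name= is aggregated into a single stats entry,
    # so per-document URLs no longer show up as separate rows or plot series.
    self.client.get("/v1/rest/document/%s" % sid,
                    headers=self.header,
                    name="/v1/rest/document/[sid]")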
Wow, thanks for pointing that out. That's exactly the feature I was looking for. However, I'm going to leave this issue open because the total requests per second plot still breaks above 500 URLs.
Fixed by #1060 if I'm not mistaken.
Description of issue
Plotting requests per second breaks when locust has visited around 500 unique URLs
Expected behavior
The total requests per second chart keeps plotting for as long as the test runs.
Actual behavior
It seems to break once you hit too many unique URLs. I believe this is because Locust fetches the stats for every URL each time it updates the web UI.
Environment settings
Steps to reproduce (for bug reports)
I am using Locust to test a retail point-of-sale REST API. The API uses document SIDs in the URL, so every time I make a new transaction I get 3-4 new URLs. Not only does this make analyzing the results harder, since I need to group the URLs together with regular expressions in pandas, but I believe it is breaking the plotting of requests per second as well.
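For reference, the post-hoc grouping looks roughly like this (the CSV path, the Name column, and the regex are just what I happen to use; adjust to your own export):

    import pandas as pd

    # Collapse the per-SID URLs into one logical endpoint before aggregating the stats.
    stats = pd.read_csv("requests.csv")
    stats["endpoint"] = stats["Name"].str.replace(
        r"/v1/rest/document/\w+", "/v1/rest/document/[sid]", regex=True)
    print(stats.groupby("endpoint").sum(numeric_only=True))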
I can hammer the API with requests that don't use the document SID in the URL forever, but I can only run a few hundred transactions before the plotting breaks.
To make sure, I created 15 new receipts per second for about 30 minutes and everything worked correctly. Afterward, I changed my locustfile to make a blank receipt and then request its contents (generating a request on a new URL), and I was only able to run the load test for about two minutes before the plotting broke (at around 500 unique URLs).
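A stripped-down locustfile along these lines reproduces it for me (the endpoint paths and the sid field are specific to my API, so treat them as placeholders):

    from locust import HttpLocust, TaskSet, task

    class TransactionBehavior(TaskSet):
        @task
        def create_and_read_document(self):
            # Each POST returns a new document SID, so the follow-up GET hits a
            # brand-new URL and adds one more entry to the stats table.
            resp = self.client.post("/v1/rest/document", json={})
            sid = resp.json()["sid"]
            self.client.get("/v1/rest/document/%s" % sid)

    class ApiUser(HttpLocust):
        task_set = TransactionBehavior
        min_wait = 1000
        max_wait = 1000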
Possible fix
I would love to be able to set the field that requests are grouped by to a string. Something like:
response = self.client.post("/v1/rest/document", headers=self.header, json=payload, tag="creating a document")
but I'm not incredibly experienced, so this may not be as easy to implement as I'm hoping.
Also, this assumes I'm correct about why the requests per second plot is breaking. Furthermore, it would save me (and people doing similar testing) from having to group API calls after running a test.