
Fix uneven distribution of locust classes between running locusts #860

Closed
wants to merge 8 commits into from

Conversation

sivukhin commented Aug 5, 2018

This PR fixes a bug related to issue #848.

Bug description:

There is a function responsible for calculating which locust classes will be spawned or killed after startup or after a configuration change. The algorithm for spawning n locusts works as follows:

  1. Calculate the share p_i of each locust class, based on the class weight
  2. Calculate the number of locusts to spawn for each class with the formula round(p_i * n)

Of course, because of rounding, this algorithm doesn't guarantee that exactly n locusts will be spawned. This behavior can cause small but annoying bugs (one such bug is described in the related issue).
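
To make the failure mode concrete, here is a minimal sketch (hypothetical class names and counts, not Locust's actual code) of how rounding each class's share independently can miss the requested total:

```python
# Hypothetical example: spawn n = 10 locusts across three equally weighted classes.
weights = {"UserA": 1, "UserB": 1, "UserC": 1}
n = 10

total_weight = sum(weights.values())
# Old approach: round each class's share independently.
spawned = {cls: round(w / total_weight * n) for cls, w in weights.items()}

print(spawned)                # {'UserA': 3, 'UserB': 3, 'UserC': 3}
print(sum(spawned.values()))  # 9 -- one short of the requested 10
```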

Pull request description:

This pull request introduces a small, self-contained unit, LocustsCollection, that is responsible for evenly distributing locust classes among the currently running locust executors. This class implements a simple algorithm that maintains the following invariant:

For each locust class, the number of running locust executors differs from p_i * n by no more than one.
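
The PR's actual LocustsCollection code is not reproduced here, but one scheme that satisfies this invariant is a largest-remainder distribution; the sketch below (invented helper name, hypothetical class names) shows the idea:

```python
from math import floor

def distribute(weights, n):
    """Allocate n locusts so every class stays within one of p_i * n (sketch)."""
    total = sum(weights.values())
    exact = {cls: w / total * n for cls, w in weights.items()}
    # Start from the floor of each exact share...
    counts = {cls: floor(share) for cls, share in exact.items()}
    leftover = n - sum(counts.values())
    # ...then give the remaining locusts to the classes with the largest
    # fractional parts, so no class ends up more than one away from its share.
    by_remainder = sorted(exact, key=lambda cls: exact[cls] - counts[cls], reverse=True)
    for cls in by_remainder[:leftover]:
        counts[cls] += 1
    return counts

print(distribute({"UserA": 1, "UserB": 1, "UserC": 1}, 10))
# {'UserA': 4, 'UserB': 3, 'UserC': 3} -- exactly 10 in total
```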

This PR also includes small refactorings and minor improvements in runners.py and the tests.


codecov bot commented Aug 5, 2018

Codecov Report

Merging #860 into master will increase coverage by 1.08%.
The diff coverage is 80.76%.


@@            Coverage Diff             @@
##           master     #860      +/-   ##
==========================================
+ Coverage   66.55%   67.63%   +1.08%     
==========================================
  Files          14       15       +1     
  Lines        1438     1480      +42     
  Branches      226      233       +7     
==========================================
+ Hits          957     1001      +44     
+ Misses        430      428       -2     
  Partials       51       51
Impacted Files                 Coverage Δ
locust/web.py                  77.31% <0%> (ø) ⬆️
locust/runners.py              51.82% <59.37%> (+0.51%) ⬆️
locust/locusts_collection.py   97.77% <97.77%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@LukeCarrier

@Umqra thanks for this -- I just identified the same problem with one of our suites and you've saved me a lot of debugging!

sivukhin (Author) commented Oct 9, 2018

@LukeCarrier Is there anything else I should do to get this pull request merged?

@LukeCarrier

@Umqra I'm just another Locust user, I'm afraid -- I'm not knowledgeable enough about Locust's internals to give a quality review. Hopefully a maintainer will check in soon.

cyberw (Collaborator) commented Oct 20, 2019

Hi @Umqra! First of all, sorry for the lack of response from maintainers.

@heyman, do you think this makes sense? The main downside I can think of is that it increases complexity a little, but it does solve a real issue.

heyman (Member) commented Oct 20, 2019

I'm not sure about this one. I think the problem should be very uncommon in real world scenarios. To me it doesn't sound like you're doing a load test if you're affected by rounding errors in the number of spawned users.

Still, the behaviour of this PR is better than the current behaviour, but I'm not sure it justifies the extra code and complexity. I wonder if it wouldn't be possible to instead add some extra logic to LocustRunner.weight_locusts() and achieve the same thing?
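
As a rough illustration of that alternative (a hypothetical standalone helper, not the actual change made in #1113 and not LocustRunner.weight_locusts() itself), the existing rounded allocation could be nudged after the fact until it sums to the requested amount:

```python
def correct_rounding(counts, weights, amount):
    """Adjust a round()-based per-class allocation so it sums to `amount` (sketch)."""
    total_weight = sum(weights.values())
    while sum(counts.values()) != amount:
        diff = amount - sum(counts.values())
        # How far each class currently is below its exact share.
        errors = {cls: weights[cls] / total_weight * amount - counts[cls] for cls in counts}
        if diff > 0:
            # Too few locusts: add one to the most under-allocated class.
            counts[max(errors, key=errors.get)] += 1
        else:
            # Too many locusts: remove one from the most over-allocated class.
            counts[min(errors, key=errors.get)] -= 1
    return counts
```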

heyman (Member) commented Oct 20, 2019

I took a stab at it in #1113.

cyberw (Collaborator) commented Oct 28, 2019

Closing this in favour of #1113

cyberw closed this Oct 28, 2019