Wrong task weight calculation across several TaskSets #766
Comments
How are you running Locust? If you specify multiple Locust classes, each is controlled independently, so the behavior you are referring to is expected because the timing is unique per Locust. I think what you want is something like the sketch below, which will combine the task sets into a single Locust.
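The snippet from this comment was not preserved in the export. Below is a minimal sketch of one way to combine both task sets under a single Locust, assuming the pre-1.0 HttpLocust/TaskSet API; the class names, the /heavy and /light endpoints, the wait times, and the interrupt() calls are illustrative assumptions, not the commenter's original code.

```python
from locust import HttpLocust, TaskSet, task

class Heavy(TaskSet):
    @task
    def heavy_1(self):
        self.client.get("/heavy")  # hypothetical endpoint
        self.interrupt()           # hand control back to the parent TaskSet

class Light(TaskSet):
    @task
    def light_1(self):
        self.client.get("/light")  # hypothetical endpoint
        self.interrupt()

class CombinedBehaviour(TaskSet):
    # A single parent TaskSet owns both child TaskSets, so one scheduler
    # applies the 100:1 weighting when choosing between them.
    tasks = {Heavy: 100, Light: 1}

class WebsiteUser(HttpLocust):
    task_set = CombinedBehaviour
    min_wait = 1000
    max_wait = 2000
```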
@aldenpeterson-wf It won't work. First of all, we get:
Oops. Needs to be:
This isn't really the idea behind Locust, though. The paradigm Locust is designed around is simulating users that have task sets associated with them. What you are requesting is more of a "lots of users that all share a task set" model. You could pretty easily accomplish what you are looking to do with a custom client, too.
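For contrast, here is a minimal sketch of the user-centric paradigm this comment describes, again assuming the pre-1.0 API: each simulated user type carries its own task set, and the ratio between user types is expressed with the weight attribute on the Locust classes. Class names and endpoints are assumptions for illustration.

```python
from locust import HttpLocust, TaskSet, task

class HeavyTasks(TaskSet):
    @task
    def heavy_1(self):
        self.client.get("/heavy")  # hypothetical endpoint

class LightTasks(TaskSet):
    @task
    def light_1(self):
        self.client.get("/light")  # hypothetical endpoint

class HeavyUser(HttpLocust):
    # Roughly 100 HeavyUser instances are spawned for every LightUser.
    task_set = HeavyTasks
    weight = 100

class LightUser(HttpLocust):
    task_set = LightTasks
    weight = 1
```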
@aldenpeterson-wf Thanks for this code snippet.
@mkusz I'm assuming this is resolved - let me know if you need me to reopen this.
Description of issue / feature request
Defining weights for tasks spread across more than one TaskSet leads to a wrong probability of task execution. Take a look at the following code (reconstructed below).
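The original snippet was not captured in this export; the following is a reconstruction based on the task names and weights mentioned in the report, assuming the pre-1.0 HttpLocust/TaskSet API. The two-Locust layout, endpoints, and wait times are assumptions, consistent with the maintainer's question about running multiple Locusts.

```python
from locust import HttpLocust, TaskSet, task

class Heavy(TaskSet):
    @task(100)
    def heavy_1(self):
        self.client.get("/heavy")  # hypothetical endpoint

class Light(TaskSet):
    @task(1)
    def light_1(self):
        self.client.get("/light")  # hypothetical endpoint

class HeavyUser(HttpLocust):
    task_set = Heavy
    min_wait = 1000
    max_wait = 2000

class LightUser(HttpLocust):
    task_set = Light
    min_wait = 1000
    max_wait = 2000
```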
Actual and expected behavior
Using the above code, my expectation is that heavy_1 executes roughly 100 times more often than light_1. That is not happening; instead, both tasks execute with a 1:1 probability.
When both tasks are in the same TaskSet, it works as expected.
When we add a dummy task to Heavy, like the sketch below, everything starts to work correctly.
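The dummy-task snippet was not captured either; this is a minimal sketch of what such a workaround could look like, under the same assumptions as the reconstruction above.

```python
from locust import TaskSet, task

class Heavy(TaskSet):
    @task(100)
    def heavy_1(self):
        self.client.get("/heavy")  # hypothetical endpoint

    # Low-weight dummy task: its only purpose is to give Heavy a second
    # task, which (per the report) changes how the weights are applied.
    @task(1)
    def dummy(self):
        pass
```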
It seems that the weight calculation is always relative to the task with the lowest weight, not to one common value shared across all TaskSets.
Environment settings (for bug reports)
Steps to reproduce (for bug reports)
Described above.