
generator already executing #2141

Closed
stampedeboss opened this issue Jul 26, 2022 · 7 comments · Fixed by #2224

Comments

@stampedeboss

Describe the bug

2022-07-26 18:51:06,429 | locust.runners | CRITICAL| Unhandled exception in greenlet: <Greenlet at 0x7fa78408ee60: <bound method MasterRunner.heartbeat_worker of <locust.runners.MasterRunner object at 0x7fa7969cf910>>>
Traceback (most recent call last):
File "src/gevent/greenlet.py", line 906, in gevent._gevent_cgreenlet.Greenlet.run
File "/usr/local/lib/python3.10/site-packages/locust/runners.py", line 946, in heartbeat_worker
self.start(user_count=self.target_user_count, spawn_rate=self.spawn_rate)
File "/usr/local/lib/python3.10/site-packages/locust/runners.py", line 757, in start
for dispatched_users in self._users_dispatcher:
File "/usr/local/lib/python3.10/site-packages/locust/dispatch.py", line 114, in __next__
users_on_workers = next(self._dispatcher_generator)
ValueError: generator already executing
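For context: CPython raises `ValueError: generator already executing` whenever `next()` is called on a generator that is already suspended inside a running `next()` call — here, presumably, two greenlets (the heartbeat handler and an earlier `start()`) iterating the same `self._users_dispatcher`. A minimal sketch of the same failure using plain threads instead of greenlets (`slow_dispatcher` is a hypothetical stand-in for the dispatcher generator):

```python
import threading
import time

def slow_dispatcher():
    """Stand-in for the users dispatcher: each next() takes a while."""
    while True:
        time.sleep(0.2)  # simulate work done inside next()
        yield "users_on_workers"

gen = slow_dispatcher()
errors = []

def consume():
    try:
        next(gen)
    except ValueError as e:
        errors.append(str(e))  # "generator already executing"

t1 = threading.Thread(target=consume)
t2 = threading.Thread(target=consume)
t1.start()
time.sleep(0.05)  # make sure t1 is already inside next() before t2 calls it
t2.start()
t1.join()
t2.join()
print(errors)  # ['generator already executing']
```

The second caller hits the re-entry check and raises, which matches the traceback above.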

Expected behavior

No exception

Actual behavior

Test dies

Steps to reproduce

Running distributed
2022-07-26 18:33:52,156 | workload | INFO | Test Requested: Stepped
2022-07-26 18:33:52,156 | workload | INFO | Total Users: 1500
2022-07-26 18:33:52,156 | workload | INFO | Test Duration: 900.0
2022-07-26 18:33:52,156 | workload | INFO | Step Duration: 12.0
2022-07-26 18:33:52,156 | workload | INFO | Users per Step: 30
2022-07-26 18:33:52,156 | workload | INFO | User Spawn Rate: 2.5
2022-07-26 18:33:52,233 | locust.main | INFO | Starting web interface at http://0.0.0.0:8089 (accepting connections from all network interfaces)
2022-07-26 18:33:52,247 | locust.main | INFO | Starting Locust 2.10.1
2022-07-26 18:48:40,998 | workload | INFO | Test Requested: step_tick - User Count (30, 2.5)
2022-07-26 18:48:52,315 | workload | INFO | Test Requested: step_tick - User Count (30, 2.5)
2022-07-26 18:48:53,316 | workload | INFO | Test Requested: step_tick - User Count (60, 2.5)
2022-07-26 18:49:04,619 | workload | INFO | Test Requested: step_tick - User Count (60, 2.5)
2022-07-26 18:49:05,620 | workload | INFO | Test Requested: step_tick - User Count (90, 2.5)
2022-07-26 18:49:16,963 | workload | INFO | Test Requested: step_tick - User Count (90, 2.5)
2022-07-26 18:49:17,964 | workload | INFO | Test Requested: step_tick - User Count (120, 2.5)
2022-07-26 18:49:29,275 | workload | INFO | Test Requested: step_tick - User Count (150, 2.5)
2022-07-26 18:49:40,586 | workload | INFO | Test Requested: step_tick - User Count (150, 2.5)
2022-07-26 18:49:41,587 | workload | INFO | Test Requested: step_tick - User Count (180, 2.5)
2022-07-26 18:49:52,908 | workload | INFO | Test Requested: step_tick - User Count (180, 2.5)
2022-07-26 18:49:53,909 | workload | INFO | Test Requested: step_tick - User Count (210, 2.5)
2022-07-26 18:50:05,215 | workload | INFO | Test Requested: step_tick - User Count (240, 2.5)
2022-07-26 18:50:16,521 | workload | INFO | Test Requested: step_tick - User Count (240, 2.5)
2022-07-26 18:50:17,523 | workload | INFO | Test Requested: step_tick - User Count (270, 2.5)
2022-07-26 18:50:28,830 | workload | INFO | Test Requested: step_tick - User Count (270, 2.5)
2022-07-26 18:50:29,831 | workload | INFO | Test Requested: step_tick - User Count (300, 2.5)
2022-07-26 18:50:41,132 | workload | INFO | Test Requested: step_tick - User Count (330, 2.5)
2022-07-26 18:50:52,424 | workload | INFO | Test Requested: step_tick - User Count (330, 2.5)
2022-07-26 18:50:53,425 | workload | INFO | Test Requested: step_tick - User Count (360, 2.5)
2022-07-26 18:51:05,652 | workload | INFO | Test Requested: step_tick - User Count (390, 2.5)
2022-07-26 18:51:06,429 | locust.runners | CRITICAL| Unhandled exception in greenlet: <Greenlet at 0x7fa78408ee60: <bound method MasterRunner.heartbeat_worker of <locust.runners.MasterRunner object at 0x7fa7969cf910>>>
Traceback (most recent call last):
File "src/gevent/greenlet.py", line 906, in gevent._gevent_cgreenlet.Greenlet.run
File "/usr/local/lib/python3.10/site-packages/locust/runners.py", line 946, in heartbeat_worker
self.start(user_count=self.target_user_count, spawn_rate=self.spawn_rate)
File "/usr/local/lib/python3.10/site-packages/locust/runners.py", line 757, in start
for dispatched_users in self._users_dispatcher:
File "/usr/local/lib/python3.10/site-packages/locust/dispatch.py", line 114, in __next__
users_on_workers = next(self._dispatcher_generator)
ValueError: generator already executing

Environment

  • OS: linux
  • Python version: 3.10.5
  • Locust version: 2.10.1
  • Locust command line that you ran: usr/local/bin/locust -f /opt/locust/locustfile.py --config /opt/locust/locust.conf --master
  • Locust file contents (anonymized if necessary):
@cyberw
Collaborator

cyberw commented Aug 13, 2022

Need more details

@cyberw cyberw closed this as completed Aug 27, 2022
@cyberw cyberw added the invalid label Aug 27, 2022
@rijenkii

Able to reproduce.

Environment

OS: Artix Linux x64
Python version: 3.10.7
Locust version: 2.12.1

Steps

  1. Start the master
  2. Start all workers

After the last worker launches, the test starts without problems, but the master prints this before the first status report:

[2022-10-10 16:16:13,073] RijenkiiBook/INFO/root: Waiting for workers to be ready, 7 of 8 connected
[2022-10-10 16:16:14,074] RijenkiiBook/INFO/root: Waiting for workers to be ready, 7 of 8 connected
[2022-10-10 16:16:14,223] RijenkiiBook/INFO/locust.runners: Worker RijenkiiBook_e01d5d8ce3e44fcdbe7a23718efcbd8f (index 7) reported as ready. 8 workers connected.
Type     Name         # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|-----------|-------|-------------|-------|-------|-------|-------|--------|-----------
--------|-----------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated        0     0(0.00%) |      0       0       0      0 |    0.00        0.00

[2022-10-10 16:16:15,075] RijenkiiBook/INFO/locust.main: Run time limit set to 120 seconds
[2022-10-10 16:16:15,076] RijenkiiBook/INFO/locust.main: Starting Locust 2.12.1
[2022-10-10 16:16:15,076] RijenkiiBook/INFO/locust.main: Run time limit set to 120 seconds
[2022-10-10 16:16:15,077] RijenkiiBook/INFO/locust.runners: Sending spawn jobs of 100 users at 5.00 spawn rate to 8 ready workers
[2022-10-10 16:16:15,078] RijenkiiBook/INFO/locust.runners: Sending spawn jobs of 100 users at 5.00 spawn rate to 8 ready workers
Traceback (most recent call last):
  File "src/gevent/greenlet.py", line 906, in gevent._gevent_cgreenlet.Greenlet.run
  File "/home/rijenkii/.local/lib/python3.10/site-packages/locust/runners.py", line 791, in start
    for dispatched_users in self._users_dispatcher:
  File "/home/rijenkii/.local/lib/python3.10/site-packages/locust/dispatch.py", line 115, in __next__
    users_on_workers = next(self._dispatcher_generator)
ValueError: generator already executing
2022-10-10T09:16:15Z <Greenlet at 0x7f657d7a6440: <bound method MasterRunner.start of <locust.runners.MasterRunner object at 0x7f657d6ae8f0>>(100, 5.0)> failed with ValueError

[2022-10-10 16:16:15,086] RijenkiiBook/CRITICAL/locust.main: Unhandled exception in greenlet: <Greenlet at 0x7f657d7a6440: <bound method MasterRunner.start of <locust.runners.MasterRunner object at 0x7f657d6ae8f0>>(100, 5.0)>
Traceback (most recent call last):
  File "src/gevent/greenlet.py", line 906, in gevent._gevent_cgreenlet.Greenlet.run
  File "/home/rijenkii/.local/lib/python3.10/site-packages/locust/runners.py", line 791, in start
    for dispatched_users in self._users_dispatcher:
  File "/home/rijenkii/.local/lib/python3.10/site-packages/locust/dispatch.py", line 115, in __next__
    users_on_workers = next(self._dispatcher_generator)
ValueError: generator already executing
Type     Name         # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|-----------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      http://example.org       3     0(0.00%) |    446     438     453    450 |    0.00        0.00
--------|-----------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated        3     0(0.00%) |    446     438     453    450 |    0.00        0.00

Master command:

locust ActivityList \
    --headless \
    --autostart \
    --autoquit 0 \
    --run-time 2m \
    --users 100 \
    --spawn-rate 5 \
    --csv (date -Iminutes) \
    --csv-full-history \
    --html (date -Iminutes) \
    --master \
    --expect-workers 8

Worker commands:

locust --worker --master-host localhost &
locust --worker --master-host localhost &
locust --worker --master-host localhost &
locust --worker --master-host localhost &
locust --worker --master-host localhost &
locust --worker --master-host localhost &
locust --worker --master-host localhost &
locust --worker --master-host localhost &

locustfile:

from locust import FastHttpUser
from locust import constant_throughput
from locust import task


class ActivityList(FastHttpUser):
    wait_time = constant_throughput(1)

    @task
    def task(self):
        self.client.get("http://example.org")

@cyberw
Collaborator

cyberw commented Oct 10, 2022

Interesting. Can you reduce this to a minimum example? Are the extra parameters (particularly autostart, csv) important, for example?

@rijenkii

rijenkii commented Oct 10, 2022

The smallest example I was able to construct:

locustfile.py:

import locust


class User(locust.HttpUser):
    host = "http://example.org"

    @locust.task
    def task(self):
        self.client.get("")

Worker:

locust --worker --master-host localhost

Master:

locust --master --headless --autostart --users 3 --spawn-rate 1
  • without passing --master the exception does not occur
  • without passing --headless the exception does not occur
  • even though passing --autostart makes Locust print Option --autostart is ignored for headless mode and worker process., the exception does not occur without it
  • with --users set lower than 3, the exception does not occur
  • changing --spawn-rate has no effect on whether the exception appears

Offtopic: if --headless implies --autostart, why doesn't --headless --autoquit work?

@cyberw
Collaborator

cyberw commented Oct 10, 2022

So passing --headless AND --autostart is necessary to reproduce the error?

Then at least a workaround is easy, because --headless implies autostart, so there is no need to specify that too (hence the warning message).

Headless also implies autoquit, because it doesn't make any sense to keep a non-UI process running longer than the test run.

Headless has existed for many years. Autostart is only relevant for UI runs, and it is a very new option, in case you are interested in the history :)

@rijenkii

So passing --headless AND --autostart is necessary to reproduce the error?

Then at least a workaround is easy, because --headless implies autostart, so there is no need to specify that too (hence the warning message).

Yeah, but I would have expected --autostart to just be ignored, without any side effects.

Headless also implies autoquit, because it doesn't make any sense to keep a non-UI process running longer than the test run.

Headless has existed for many years. Autostart is only relevant for UI runs, and it is a very new option, in case you are interested in the history :)

Aaah, makes sense, thanks for the explanation.
Yeah, the fact that headless implies autoquit seems obvious looking back... I just never tried launching it like that for some reason.

@cyberw
Collaborator

cyberw commented Oct 11, 2022

Yeah, but I would have expected --autostart just being ignored, without any side-effects.

I agree, it is still a bug. I can take a look at it some time…
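
For reference, a common way to avoid this class of error is to serialize access to the shared generator so concurrent consumers block instead of re-entering it. A sketch of that idea (this is not the actual fix from #2224; `counter` and `GuardedGenerator` are hypothetical names):

```python
import threading

def counter():
    """Infinite generator standing in for the users dispatcher."""
    n = 0
    while True:
        n += 1
        yield n

class GuardedGenerator:
    """Hypothetical wrapper: a lock serializes next() calls, so a
    second concurrent consumer waits instead of triggering
    'ValueError: generator already executing'."""

    def __init__(self, gen):
        self._gen = gen
        self._lock = threading.Lock()

    def __iter__(self):
        return self

    def __next__(self):
        with self._lock:
            return next(self._gen)

g = GuardedGenerator(counter())
print(next(g))  # 1
print(next(g))  # 2
```

In a gevent-based codebase like Locust's the same pattern would use a gevent-compatible lock rather than `threading.Lock`.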
