
[Question] How can I control the speed of sending requests? #472

Closed
yisake opened this issue Sep 4, 2016 · 23 comments

Comments

yisake commented Sep 4, 2016

With 1000 simulated users, the request rate is not 1000 requests/second.
So how can I control the rate?

heyman (Member) commented Sep 4, 2016

Use the min_wait and max_wait attributes of the Locust or TaskSet classes.
(See http://docs.locust.io/en/latest/writing-a-locustfile.html#the-min-wait-and-max-wait-attributes.)

The idea with Locust is that you implement user behaviour using code, and then you choose how many users you want to simulate. Therefore there isn't a way of saying "I want to start my test with X number of requests/second".
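
(For reference, a minimal locustfile sketch of that idea, assuming the HttpLocust/TaskSet API that was current at the time of this thread; the host and the /index endpoint are placeholders.)

```python
from locust import HttpLocust, TaskSet, task

class UserBehaviour(TaskSet):
    @task
    def index(self):
        # User behaviour is plain code; this task issues one GET per execution.
        self.client.get("/index")

class WebsiteUser(HttpLocust):
    task_set = UserBehaviour
    host = "http://localhost:8000"  # assumed test target
    min_wait = 1000  # lower bound on the wait between tasks, in milliseconds
    max_wait = 5000  # upper bound on the wait between tasks, in milliseconds
```

You then choose the number of simulated users (and the hatch rate) when starting the test; the resulting requests/second follows from that number and the wait times rather than being set directly.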

heyman closed this as completed Sep 4, 2016
yisake (Author) commented Sep 6, 2016

@heyman Does that mean every task will be executed once per second for every user?

heyman (Member) commented Sep 6, 2016

If you set both min_wait and max_wait to 1000, the wait time between the execution of two tasks will be 1 second for each user. If you set min_wait and max_wait to 10000 and 40000, the average wait time between tasks, for each user, will be 25 seconds.
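
(In code, those two cases would look roughly like this; a sketch assuming the same 2016-era API, with waits in milliseconds and a placeholder task.)

```python
from locust import HttpLocust, TaskSet, task

class Browse(TaskSet):
    @task
    def index(self):
        self.client.get("/")  # placeholder request

class OneSecondUser(HttpLocust):
    task_set = Browse
    min_wait = 1000
    max_wait = 1000   # min == max, so a constant 1 second between tasks, per user

class SlowUser(HttpLocust):
    task_set = Browse
    min_wait = 10000
    max_wait = 40000  # uniformly random wait; mean (10000 + 40000) / 2 = 25 seconds
```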

yisake (Author) commented Sep 6, 2016

Understood. How should I compare Locust with JMeter? Could I compare the time each takes to send the same number of requests?

yisake (Author) commented Sep 6, 2016

@heyman It seems Locust uses gevent to send requests, and gevent cannot make use of multiple CPU cores (e.g. a 4-core CPU). Am I right? In my test Locust took more time than JMeter for a large number of requests. If Locust could solve this multi-core problem, I think it would be more powerful than other load testing tools such as JMeter.

heyman (Member) commented Sep 6, 2016

Yeah, to utilize multiple cores you should run Locust distributed (See: http://docs.locust.io/en/latest/running-locust-distributed.html)
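
(Roughly, using the command-line flags of the Locust version current at the time; the locustfile name and master address are placeholders, and --slave has since been renamed --worker in later releases.)

```
# Start one master process:
locust -f locustfile.py --master

# Start one slave process per CPU core, pointing at the master:
locust -f locustfile.py --slave --master-host=127.0.0.1
```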

yisake (Author) commented Sep 15, 2016

@heyman Can I set min_wait and max_wait to 0? If so, is the time between task executions also 0?

heyman (Member) commented Sep 15, 2016

@yisake Yes

yisake (Author) commented Sep 15, 2016

Many thanks!

yisake (Author) commented Sep 15, 2016

@heyman I've run a test with Locust and JMeter, simulating 800 users in each, and captured the requests with Wireshark as shown below. Locust's rate is 530 RPS, while JMeter's is 824.3 KB/sec. Judging from these rates, JMeter is faster than Locust. Why does this happen?

Locust, 800 users, Wireshark capture: [image]

JMeter, 800 users, Wireshark capture: [image]

heyman (Member) commented Sep 16, 2016

It doesn't make sense to compare RPS to KB/s. However, I can see a number of reasons why you might achieve higher throughput with JMeter in your setup. JMeter uses Java threads for its "users", which lets it utilize all CPU cores (while using a lot more memory). To utilize multiple CPU cores with Locust you must run it distributed, with one master and multiple slave processes.

Overall, Locust and JMeter take quite different approaches to load testing, and you should use the one that suits your needs best. If you want a framework for defining real user behaviour in code, and then simulating very large numbers of those users, I think Locust would be a good choice. If you just want to achieve high RPS throughput against a few URL endpoints, you're probably better off with JMeter or even Apache Bench.

yisake (Author) commented Sep 16, 2016

1. I agree with your point about the CPU core difference between JMeter and Locust.

2. My question is: since I simulate 800 users, why is the rate only 473.607 packages/sec, as shown below? Shouldn't it be almost 800 packages/sec?
[image]

cgoldberg (Member) commented

> should be 800packages/sec

package != packet != request

yisake (Author) commented Sep 16, 2016

@cgoldberg I filtered with http && tcp.dstport==8000 (my server port is 8000), so the filtered result is the rate at which the client sends requests.

yisake (Author) commented Sep 16, 2016

@cgoldberg If the simulated user count is 800, what should the packet rate be, in your opinion?

yisake (Author) commented Sep 16, 2016

@heyman @cgoldberg
I just simulated 600 users with 2 slaves on a single machine, and the packet rate was almost 1000 packets/sec.

So my conclusion is that a single core has a limited load-generation capacity, and with multiple CPU cores the packet rate for Locust will increase roughly linearly. Right?

cgoldberg (Member) commented

@yisake I really don't understand your questions or your use of English, sorry.

yisake (Author) commented Sep 17, 2016

@cgoldberg I'm sorry, my English is not good. My question is: why is the packet rate different between 1 CPU core and 2 CPU cores (1 slave vs. 2 slaves) with the same simulated user count? The RPS with 2 slaves (1096 req/sec) is higher than with 1 slave (647 req/sec).

yisake (Author) commented Sep 19, 2016

@cgoldberg How can I set different hosts (http://www.google.com/index and http://www.baidu.com/index) in one locustfile?

heyman (Member) commented Sep 19, 2016

> How can I set different hosts (http://www.google.com/index and http://www.baidu.com/index) in one locustfile?

You can specify a full URL (with hostname) when you make requests using the Locust client.
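
(For example, a sketch assuming the 2016-era API; the hosts come from the question above, while the /index paths and the name= grouping labels are illustrative.)

```python
from locust import HttpLocust, TaskSet, task

class MultiHostTasks(TaskSet):
    @task
    def google_index(self):
        # An absolute URL overrides the configured host for this request.
        self.client.get("http://www.google.com/index", name="google /index")

    @task
    def baidu_index(self):
        self.client.get("http://www.baidu.com/index", name="baidu /index")

class MultiHostUser(HttpLocust):
    task_set = MultiHostTasks
    min_wait = 1000
    max_wait = 1000
```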

Please don't use the Locust GitHub issues unless you have a real indication that there is actually an issue with Locust. This isn't the right forum for pure support questions, especially ones that have clear answers in the documentation.

yisake (Author) commented Sep 19, 2016

@heyman Sorry about that. Where can I ask for support, by mail or some other channel? Do you have any suggestions?

heyman (Member) commented Sep 19, 2016

@yisake You could use StackOverflow. Personally I don't really have time to answer pure support requests.

kaylin123 commented

@yisake Bro, thanks for all your hard work.

locustio locked and limited the conversation to collaborators on Apr 21, 2020