I am using an ActiveMQ instance as the broker and have long-running tasks (1-2 minutes), so I have set high heartbeat values (~5 minutes).
Also, "activemq.prefetchSize" is set to 1 and the messages are fairly big (one image per message).
The consumer processes the first task fine (and it is ack'd), but then it idles for the same amount of time every run (I think around 230 s) doing nothing before continuing.
It might be related to the heartbeat, as I don't notice this with faster tasks.
Does anybody have an idea how the consumer can continue processing right after a message completes when there are still messages in the queue?
(I want the prefetchSize low to spread the workload evenly across consumers.)
P.S.: Usually I already have the consumers running when I submit the workload, which is when this behavior occurs. Coincidentally, this time I happened to start them after filling the queue, and it just ran through without idling.
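For reference, here is a minimal sketch of the consumer setup described above. It assumes a stomp.py 8.x client; the actual client library, broker address, queue name, and processing function are not stated in this issue and are placeholders.

```python
# Hypothetical consumer illustrating the configuration described above:
# prefetchSize=1, client-individual acks, and ~5 minute heartbeats.
# stomp.py 8.x and an ActiveMQ broker on localhost:61613 are assumed.
import time

import stomp


def process_image(body):
    """Placeholder for the real long-running image-processing task (1-2 minutes)."""
    time.sleep(90)


class TaskListener(stomp.ConnectionListener):
    def __init__(self, conn):
        self.conn = conn

    def on_message(self, frame):
        process_image(frame.body)
        # Acknowledge only after the work is done, so the broker can
        # dispatch the next message to this consumer.
        self.conn.ack(frame.headers['message-id'], frame.headers['subscription'])


# Heartbeats are given in milliseconds (~5 minutes each way).
conn = stomp.Connection([('localhost', 61613)], heartbeats=(300000, 300000))
conn.set_listener('', TaskListener(conn))
conn.connect(wait=True)
conn.subscribe(destination='/queue/tasks', id=1, ack='client-individual',
               headers={'activemq.prefetchSize': '1'})

# Keep the process alive while messages are handled in the listener thread.
while True:
    time.sleep(1)
```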