Can't ensure unique job simultaneously. #111
First of all, I think the first argument might be the job id, if I am not mistaken. Even if it is not, the first argument would be the entire hash, so something is already going wrong there. See https://github.com/mhenrixon/sidekiq-unique-jobs#finer-control-over-uniqueness for how the method handles the options hash. The second thing I notice is that when one user requests 25 jobs, 24 of them would simply be dropped after the first one is scheduled if they arrive quickly. That may or may not be the desired behaviour, but I am guessing they should be able to run more than once, so you will want to look into that.
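The "finer control over uniqueness" section linked above can be sketched as follows. This is a minimal stand-in (class and argument names are illustrative, not from this thread), showing the `unique_args` hook: a class method that filters the job's argument array down to the values that should define uniqueness.

```ruby
# Hypothetical worker: in a real app this class would `include Sidekiq::Worker`
# and declare something like:
#   sidekiq_options unique: :while_executing, unique_args: :unique_args
class UserTaskWorker
  # Receives the full argument array; returns only the user id, so two jobs
  # for the same user collide while jobs for different users never do.
  def self.unique_args(args)
    [args.first]
  end
end
```

With this filter, `UserTaskWorker.unique_args([42, "upload"])` reduces to `[42]`, so uniqueness is decided by the user id alone rather than by the whole argument list.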
Each task takes roughly one second to run. But picking up on what you said: what I understand is that this repo only blocks the same job (id) from being processed more than once simultaneously, but it can't block different jobs with the same params (or part of them, like user_id) from running simultaneously. Am I missing something? If not, do you know of anything that could help me here?
From this I conclude that there is no way to keep the "not-unique" jobs on the queue; they are always dropped. Please confirm.
Search for run_lock or RunLock and how it is used in the code base.
Check if 4.0.0 fixes your issue and report back if not.
I think it may work with "Until Executed", but from the documentation I can't understand how to choose between Until Executed and Until Executing.
Check the readme again, @acegilz. I improved it with information on how to achieve the lock in various ways.
I have one question. If I run a sidekiq job via Myjob.new.perform instead of Myjob.perform_async, will it still respect uniqueness and go through this gem?
@acegilz no, it will not, because it is not passing through sidekiq at all. What you are doing then is saying "run this job immediately".
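The difference can be sketched in plain Ruby (no Sidekiq required; the `perform_async` body here is a stub showing only the shape of what gets enqueued, not the real implementation):

```ruby
class MyJob
  def perform(user_id)
    "processed #{user_id}"            # runs right here, in this process
  end

  # The real perform_async serializes a job hash and pushes it to Redis;
  # client middleware, including sidekiq-unique-jobs, runs on that path.
  def self.perform_async(*args)
    { "class" => name, "args" => args }  # you get back a handle, not a result
  end
end

MyJob.new.perform(1)     # inline call: bypasses Redis and all middleware
MyJob.perform_async(1)   # enqueue: uniqueness checks apply on this path only
```

So `new.perform` executes the job body synchronously and skips every middleware hook, while `perform_async` only enqueues, which is where this gem's locking happens.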
Right, I got it and it makes sense. It's not related to this, but btw, do you have any suggestion for this (the reason I was trying new.perform, as suggested by mperham): sidekiq/sidekiq#2601
Suggest you read up on after_unlock.
I have read it multiple times but still can't understand how to send a callback via after_unlock. I need to send a response to the controller after the job is concluded, but I can't make it work. Is there any way to make the controller "wait" until the worker fires a callback/response, or do I have to stick with the workaround I am using (sidekiq/sidekiq#2601)? Here's what I am trying to do:

Controller:

```ruby
callback_result = Worker.perform_async(user)
```

How do I make callback_result contain the worker's result (only after the lock is released!) instead of the JID (returned instantly, while the job is still processing)?

Worker:

```ruby
sidekiq_options queue: "critical", retry: false, unique: :while_executing, unique_args: :unique_args

def self.unique_args(user)
  [user]
end

def perform(user)
  a = Model.find_or_create_by(user_id: user.id)
end

def after_unlock
  if a.valid?
    return "completed"
  else
    return "fail"
  end
end
```
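One thing worth noting about the code above: after_unlock runs inside the worker process, so its return value never reaches the controller. A common workaround is to write the outcome to a shared store keyed by job id and have the controller poll for it. A minimal sketch (the RESULTS hash stands in for Redis, and all names are illustrative):

```ruby
RESULTS = {}  # stand-in for Redis; in production use a real shared store

class ResultWorker
  def perform(jid, user_id)
    record_valid = true  # stand-in for Model.find_or_create_by(...).valid?
    # Store the outcome under the job id instead of returning it.
    RESULTS[jid] = record_valid ? "completed" : "fail"
  end
end

# Controller side: keep the jid returned by perform_async, then poll.
jid = "abc123"                     # perform_async would return this
ResultWorker.new.perform(jid, 42)  # in production Sidekiq runs this
RESULTS.fetch(jid, "pending")      # poll until the value appears
```

The controller never blocks on the worker; it re-checks the store (for example via a small JSON endpoint the frontend polls) until the result key exists.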
First of all, see https://github.com/mperham/sidekiq/wiki/Best-Practices: the user argument will always be unique because each call passes a different object instance (pass a simple identifier like user.id instead).
I have sidekiq + sidekiq-unique-jobs running with 25 workers, performing some tasks demanded by users.
From the documentation I'm not 100% sure, but I think that with unique_args I could specify which params should count for uniqueness, so that if the same param ("user_id") is already being processed by any worker (for example, the same user performing multiple actions simultaneously), none of the other 24 workers would pick up any of those jobs until :user_id is unique again, leaving the other jobs on the queue until then to avoid deadlocks.
I am not interested in reducing the number of workers, because the idea is not to ensure one job at a time, but one job per user at a time. If 25 different users request 25 jobs, they are all processed in parallel; if one user requests 25 jobs, each one begins only when the previous one is done.
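The semantics described above (one job per user at a time, different users in parallel) can be simulated in plain Ruby with per-user locks. This is only an in-process sketch of the desired behaviour; sidekiq-unique-jobs' while_executing lock is the cross-process equivalent, coordinated through Redis.

```ruby
LOCK_TABLE_GUARD = Mutex.new  # guards creation of per-user locks
USER_LOCKS = {}
LOG = []

def lock_for(user_id)
  LOCK_TABLE_GUARD.synchronize { USER_LOCKS[user_id] ||= Mutex.new }
end

def run_for_user(user_id)
  lock_for(user_id).synchronize do
    yield  # job body: at most one job per user executes at a time
  end
end

# Five jobs for the same user: they all run, but strictly one after another.
threads = 5.times.map do |i|
  Thread.new { run_for_user(1) { LOG << i } }
end
threads.each(&:join)
LOG.size  # all five jobs completed, serialized per user
```

Jobs for different user ids take different mutexes and therefore run in parallel, which matches the requirement of one job per user rather than one job overall.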
Can you please confirm I am using the correct repo? This is part of my code:
Controller:
Worker: