concurrent-ruby 1.1.10 spikes volume of jobs #701
Comments
I had a feeling it wouldn't be so easy. Shame they removed the timeouts instead of using my version. I'll copy the previous version of the timer task and use that instead. Thank you for reporting.
#702 I wish they had taken my suggestion and fixed the issue instead of making it permanent...
Released as v7.1.19: https://github.com/mhenrixon/sidekiq-unique-jobs/releases/tag/v7.1.19
I hear you. I'll write my own timeout functionality then, and hopefully it won't matter which concurrent-ruby version you have. Frigging dependencies, aye?
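The "write my own timeout functionality" plan above could be sketched like this (a minimal illustration, not the gem's actual code; the helper name is hypothetical): run the work on a separate thread and stop waiting once the limit passes, with no dependency on concurrent-ruby's TimerTask timeouts.

```ruby
# Hypothetical sketch of a dependency-free timeout helper: run the block on a
# separate thread and give up waiting after `seconds`. Thread#join(limit)
# returns the thread if it finished in time, or nil on timeout.
def run_with_timeout(seconds)
  result = nil
  worker = Thread.new { result = yield }
  if worker.join(seconds)
    result
  else
    worker.kill # abandon the stuck work; the caller decides how to recover
    nil
  end
end
```

A quick usage example: `run_with_timeout(0.1) { sleep 5 }` returns nil instead of blocking, while a block that finishes in time returns its value.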
Really appreciate that!
Haha yup
@oneandonlymike see if v7.1.20 works better for you
v7.1.20 seems to have done the trick! Thanks!
Describe the bug
When using concurrent-ruby 1.1.10, the gem does not behave as expected. In the application we are implementing it in, after upgrading to sidekiq-unique-jobs 7.1.16 we started getting alerts about a higher than normal volume of calls going out to a service that was approaching our rate limit. Attempting the upgrade again with concurrent-ruby dropped to 1.1.9, we had no issues. It looks like the fix applied in #688 for #697 has some unexpected consequences.
Expected behavior
Jobs would be unique, allowing for a consistent pattern of calls to hold.
Current behavior
Jobs spike in volume, suggesting some non-unique jobs are making it through.
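The reporter's workaround of keeping concurrent-ruby on 1.1.9 while staying on sidekiq-unique-jobs 7.1.16 could be expressed as a Gemfile pin (a sketch, not taken from the original report):

```ruby
# Gemfile — hold concurrent-ruby below 1.1.10 until the regression is resolved
gem "sidekiq-unique-jobs", "7.1.16"
gem "concurrent-ruby", "< 1.1.10"
```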
Worker class
Additional context
I noticed the upgrade contained separate logic based on the version of concurrent-ruby used, so it makes sense that downgrading it fixed the problem. I haven't looked deep enough to see what could be wrong, though.
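The version-gated logic mentioned above might look something like this (a hypothetical sketch using RubyGems' version comparison; the helper name is illustrative, not from the gem):

```ruby
# Hypothetical helper: detect whether the installed concurrent-ruby is at or
# past 1.1.10, the release the maintainer says removed the timeouts, and
# branch behavior accordingly. Gem::Version compares versions semantically,
# so "1.1.9" < "1.1.10" even though a string comparison would say otherwise.
def timer_task_timeouts_removed?(version_string)
  Gem::Version.new(version_string) >= Gem::Version.new("1.1.10")
end
```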