
"payload is not unique", but cannot find digest or scheduled job #335

Closed
slhck opened this issue Oct 4, 2018 · 5 comments
slhck commented Oct 4, 2018

Describe the bug

I am trying to schedule a job:

IpCacheCleanupWorker.perform_in 1.second

Expected behavior

I expect the job to be scheduled and executed.

Current behavior

The job is not scheduled; instead it returns:

payload is not unique {"class"=>"IpCacheCleanupWorker", "args"=>[], "at"=>1538660761.5906074, "retry"=>false, "queue"=>"default", "lock"=>:until_and_while_executing, "log_duplicate_payload"=>true, "jid"=>"18c78475c0c9dcb7abd0573a", "created_at"=>1538660760.590708, "lock_timeout"=>0, "lock_expiration"=>nil, "unique_prefix"=>"uniquejobs", "unique_args"=>[], "unique_digest"=>"uniquejobs:07f59da973638b67cf8d63abaff6968c"}

When I filter the digests in the web view for uniquejobs:07f59da973638b67cf8d63abaff6968c, nothing is returned.

irb(main):006:0> redis_cache.scan(0, match: "uniquejobs:07f59da973638b67cf8d63abaff6968c")
=> ["768", []]
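Note that a single SCAN call only inspects one cursor batch, so a non-zero cursor ("768") with an empty batch doesn't prove the key is absent; the cursor has to be iterated until it wraps back to 0. A minimal console sketch that walks the full keyspace, assuming the redis-rb client and a reachable Redis (not runnable without one):

```ruby
require "redis" # redis-rb gem

redis = Redis.new # assumes the default localhost:6379

# scan_each drives the SCAN cursor internally until it returns to 0,
# so an empty result here really does mean the key is not present.
matches = []
redis.scan_each(match: "uniquejobs:07f59da973638b67cf8d63abaff6968c*") do |key|
  matches << key
end
puts matches.inspect
```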

I also don't see it scheduled:

irb(main):008:0> ss = Sidekiq::ScheduledSet.new
=> #<Sidekiq::ScheduledSet:0x000055fe976957a8 @name="schedule", @_size=266>
irb(main):009:0> ss.select {|retri| retri.klass == IpCacheCleanupWorker.name}
=> []
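Since the worker runs with retry: false, a crashed job would not reappear in the scheduled set; checking the retry and dead sets as well can help locate it. A hedged console sketch using the public Sidekiq API (assumes a running Sidekiq/Redis):

```ruby
require "sidekiq/api"

# Look for the worker in all three job sets, not just the scheduled one.
# A job that crashed with retry disabled may sit in the dead set while
# its unique lock remains behind.
sets = [Sidekiq::ScheduledSet.new, Sidekiq::RetrySet.new, Sidekiq::DeadSet.new]
sets.each do |set|
  hits = set.select { |job| job.klass == "IpCacheCleanupWorker" }
  puts "#{set.class.name}: #{hits.size}"
end
```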

Worker class

class IpCacheCleanupWorker
  include Sidekiq::Worker

  RESCHEDULE_TIME = 1.hour

  sidekiq_options lock: :until_and_while_executing,
                  log_duplicate_payload: true
  sidekiq_options retry: false

  def perform
    IpAddressResolveCache.cleanup
    self.class.perform_in IpCacheCleanupWorker::RESCHEDULE_TIME
  end
end
@mhenrixon (Owner) commented

Interesting. I think running every 1 second is not such a good idea with uniqueness, but I have to try it out before I can give a definite answer. It seems like anything below 5 seconds would give you flaky behavior.

slhck commented Oct 4, 2018

Thanks for the quick answer. The 1.second was just for testing; it's not like I want to run this every second :)

If I remember correctly, the job initially crashed and I restarted Sidekiq afterwards; now something seems to be stuck.

slhck commented Oct 4, 2018

Just for clarification: the job is supposed to run every hour. I only used 1.second to test whether I could queue it at all. Running .perform_async also fails with the same error.

I already tried restarting Sidekiq and the web server.

@mhenrixon (Owner) commented

sidekiq/sidekiq#3980 is related to your initial problem. You might want to change to retry: 0 for now and add a death handler to prevent this issue. In the future Sidekiq will treat retry: false the same way as retry: 0 and call the death handler. See: https://github.com/mhenrixon/sidekiq-unique-jobs#cleanup-dead-locks

For now you should always get the same key (uniquejobs:07f59da973638b67cf8d63abaff6968c), so you can delete it directly. Either form works:

SidekiqUniqueJobs::Digests.del(digest: '07f59da973638b67cf8d63abaff6968c')
# OR
SidekiqUniqueJobs::Digests.del(digest: 'uniquejobs:07f59da973638b67cf8d63abaff6968c')
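A sketch of the death-handler wiring suggested above, as a server config fragment. It assumes the lock digest is stored under the job's unique_digest key (as in the payload quoted earlier in this issue); verify the exact key and API against your installed sidekiq-unique-jobs version:

```ruby
# config/initializers/sidekiq.rb
# Release the unique lock when a job dies, so a crashed run cannot
# leave an orphaned digest behind. Only fires with retry: 0, not
# retry: false, on the Sidekiq versions current at the time.
Sidekiq.configure_server do |config|
  config.death_handlers << lambda do |job, _exception|
    digest = job["unique_digest"]
    SidekiqUniqueJobs::Digests.del(digest: digest) if digest
  end
end
```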

I honestly don't remember off the top of my head how the code works right now. I'm in the middle of moving to another country, so my brain is a little scattered.

slhck commented Oct 4, 2018

In the meantime I “fixed” it by deleting all uniquejobs keys from Redis. Seems to run again.
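For reference, that brute-force cleanup can be done from a console roughly like this. It is destructive: it releases any currently held locks too, so only run it while no unique jobs are in flight (assumes Sidekiq's Redis connection and the redis-rb scan_each helper):

```ruby
# One-off console cleanup: remove every sidekiq-unique-jobs lock key.
Sidekiq.redis do |conn|
  conn.scan_each(match: "uniquejobs:*") do |key|
    conn.del(key)
  end
end
```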

Thanks for your quick help, and good luck with the move!

@slhck slhck closed this as completed Oct 4, 2018