
Will a second job be lost if the job is already queued or already scheduled? #43

Closed
justqyx opened this issue Jul 3, 2014 · 2 comments

justqyx commented Jul 3, 2014

A unique job is already queued or scheduled, and then a new job comes in. Will it be lost?

class QueueWorker
  include Sidekiq::Worker
  sidekiq_options queue: 'test', unique: true, unique_args: :unique_args

  # Uniqueness is computed from user_id and client_id only; options is ignored.
  def self.unique_args(user_id, client_id, options)
    [user_id, client_id]
  end

  def perform(*args)
    sleep 10
  end
end

QueueWorker.perform_async(1, 1, {})  # No.1
QueueWorker.perform_async(2, 1, {})  # No.2
QueueWorker.perform_async(1, 1, {})  # No.3 (same unique_args as No.1)

After reading the source code, I found that if the No.1 job was previously scheduled and is now being queued, the No.3 job will be lost!


justqyx commented Jul 4, 2014

After a test, I got the answer!

Sidekiq Client

irb(main):002:0> LazyWorker.perform_async 1, 1
=> "5ad101f1c2e7a8342c5be6da"
irb(main):003:0> LazyWorker.perform_async 1, 1
=> nil
irb(main):004:0> LazyWorker.perform_async 1, 2
=> "920d6e6e357c423bddc4088a"

The Redis Log

127.0.0.1:6379[1]> keys *
1) "test:queue:default"
2) "test:queues"
3) "test:sidekiq_unique:1daeaf93b1a9b8c152a7dc40b977bb18"
127.0.0.1:6379[1]> keys *
1) "test:queue:default"
2) "test:queues"
3) "test:sidekiq_unique:a79c7aa2db1706c2c4d29140d2d6d52d"
4) "test:sidekiq_unique:1daeaf93b1a9b8c152a7dc40b977bb18"
127.0.0.1:6379[1]> keys *
1) "test:queue:default"
2) "test:queues"
3) "test:sidekiq_unique:a79c7aa2db1706c2c4d29140d2d6d52d"
4) "test:sidekiq_unique:1daeaf93b1a9b8c152a7dc40b977bb18"

Is it possible to automatically requeue the No.3 job after the No.1 job has completed?
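
For reference, one possible caller-side workaround (a hedged sketch, not a feature of the gem): since perform_async returns nil when the duplicate is rejected, the caller can detect that and push a delayed retry with Sidekiq's perform_in. The delayed push may itself be rejected again if the unique lock is still held.

jid = QueueWorker.perform_async(1, 1, {})  # No.3
if jid.nil?
  # The unique lock for [1, 1] is still held by job No.1, so schedule a
  # delayed retry. The 15-second delay is an assumption about how long
  # No.1 runs; this push is subject to the same uniqueness check.
  QueueWorker.perform_in(15, 1, 1, {})
end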

mhenrixon (Owner) commented

The whole idea of the gem was to only allow unique jobs to be scheduled; if a duplicate were allowed to be scheduled in the future, then it wouldn't be uniqueness anymore.

Could you tweak your unique_args to take some time constraint into consideration? It could also be that you are in fact looking for some type of scheduler. I don't see how else the gem could help you here.
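
As an illustration of the "time constraint" suggestion, here is a minimal sketch (the window size is an assumption, not part of the gem): include a coarse time bucket in the array returned by unique_args, so that identical jobs only count as duplicates within the same window.

class QueueWorker
  include Sidekiq::Worker
  sidekiq_options queue: 'test', unique: true, unique_args: :unique_args

  def self.unique_args(user_id, client_id, options)
    # Bucket the current time into 10-minute windows (window size is an
    # assumption); jobs with the same user_id and client_id are only
    # considered duplicates within the same window.
    [user_id, client_id, Time.now.to_i / 600]
  end

  def perform(*args)
    sleep 10
  end
end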
