Unable to use unique job_id twice #221

Closed · ikhomutov opened this issue Jan 11, 2021 · 4 comments

@ikhomutov

It seems you cannot reuse the same job_id for another task if the first task has finished and there is a 'result' key for that job_id in redis. Is this expected behavior? If so, how can I change it?
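
A minimal reproduction might look like this, assuming a hypothetical task named download_content is registered with a running worker:

```python
import asyncio

from arq import create_pool
from arq.connections import RedisSettings


async def main():
    redis = await create_pool(RedisSettings())

    # First enqueue succeeds and returns a Job instance.
    job = await redis.enqueue_job('download_content', _job_id='content-42')
    print(job)  # <arq job content-42>

    # Once the job has finished, its result is kept in redis under the same
    # job_id, so re-enqueueing with that _job_id returns None instead of a Job.
    job2 = await redis.enqueue_job('download_content', _job_id='content-42')
    print(job2)  # None


asyncio.run(main())
```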

@ikhomutov (Author)

Ah, never mind. Found a comment in #138 about setting the keep_result parameter.
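
For reference, a minimal sketch of that workaround (the download_content task is hypothetical): with keep_result = 0 on the worker, results are never stored, so a finished job's _job_id becomes reusable immediately.

```python
from arq.connections import RedisSettings


async def download_content(ctx, url):
    # hypothetical task body
    ...


class WorkerSettings:
    functions = [download_content]
    redis_settings = RedisSettings()
    # Don't store results at all (the default is 3600 seconds), so a
    # finished job's _job_id can be reused straight away.
    keep_result = 0
```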

@ross-nordstrom (Contributor)

@samuelcolvin it might be worth updating the docs for this. The section on Job Uniqueness disagrees with the current behavior (which appears to have changed in #138, as mentioned):

> It guarantees that a job with a particular ID cannot be enqueued again until its execution has finished.

https://arq-docs.helpmanual.io/#job-uniqueness

Anyway, the keep_result workaround solved it for me. It would also be nice to be able to set keep_result on a per-job basis, in case I end up with more flavors of jobs with different result-retention needs.

@samuelcolvin (Member)

PR welcome to update the docs.

@KShah707

> It would also be nice to be able to set the keep_result on a per job basis in case I end up with more flavors of jobs with different result-retention needs.

I think you can, if you use the arq.worker.func wrapper to declare your job: https://arq-docs.helpmanual.io/#arq.worker.func
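
Something like this, assuming two hypothetical tasks with different retention needs:

```python
from arq.worker import func


async def quick_task(ctx):
    ...


async def audit_task(ctx):
    ...


class WorkerSettings:
    functions = [
        # Result discarded immediately: the job_id is reusable as soon
        # as the job finishes.
        func(quick_task, keep_result=0),
        # Result kept for a day before the job_id can be reused.
        func(audit_task, keep_result=86400),
    ]
```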
