Feature/delay downloads #3360
Conversation
This will enable a user to delay a download for nzbs, torrents, or both. For example, delaying downloads for torrents will not snatch a torrent result, but will snatch an nzb result when it hits one.
I need to add this table, as the others are used and are always overwritten. * Needed to remove the INSERT OR REPLACE, because the date_added may never be overwritten.
…OR REPLACE anymore. This has the advantage that we're eliminating a lot of (duplicated) search results early on. * Fixed the INSERT and the UPDATE.
* Currently no UI available. Config needs to be done in config.ini.
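Since there's no UI yet, the config.ini entries would presumably look something like the fragment below. This is a hypothetical sketch: the section layout and key names are my guesses based on the attribute names in this PR, not confirmed config paths.

```ini
# Hypothetical fragment; real section and key names may differ.
[Providers]
  [[rarbg]]
  fail_over_enabled = 1
  fail_over_hours = 8
```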
Anyone willing to volunteer for running some tests?
@p0psicles
@NicoLeOca changes to the config should be made while medusa is shutdown or they will not persist. |
@labrys thanks, that helped. I am currently using this branch with 6 providers. Are there specific logs you want me to keep an eye on?
@p0psicles what I did:
Thanks for the error report. If you run a daily search with the same provider, the database field is added, and you also won't have the error with a manual search anymore.
@p0psicles
@NicoLeOca yes, every time it's holding back a result, you will see this log. So I'm curious what happens around this time.
@p0psicles you need to create a cache.db migration so this error doesn't happen; that way the column will be created at boot. Here:
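For illustration, a minimal standalone sketch of such a migration using plain sqlite3. Medusa has its own schema-upgrade machinery; this is not its actual API, just the idea of adding the column at boot when it's missing:

```python
import sqlite3

def add_date_added_column(cache_db_path, provider_table):
    """Add the date_added column to a provider cache table if missing."""
    conn = sqlite3.connect(cache_db_path)
    try:
        # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk).
        columns = [row[1] for row in
                   conn.execute('PRAGMA table_info([{0}])'.format(provider_table))]
        if 'date_added' not in columns:
            conn.execute('ALTER TABLE [{0}] ADD COLUMN date_added NUMERIC'
                         .format(provider_table))
            conn.commit()
    finally:
        conn.close()
```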
@p0psicles just in case it changes something.
@fernandog we only do that for the static tables.
@NicoLeOca I'm going to add a lot more verbose debug logs. I'll start every log with `DELAY:`. I'll add the date_added field to the manual search results.
medusa/search/core.py
Outdated
```diff
@@ -473,6 +473,8 @@ def search_for_needed_episodes(force=False):
     # fail_over_delay time (hours) skip it. For this we need to check if the result has already been
     # stored in the provider cache db, and if it's still younger than the provider's attribute fail_over_delay.
     if cur_provider.fail_over_enabled and cur_provider.fail_over_hours:
+        log.debug('DELAY: Provider {provider} delay enabled, with an expiration of {delay}',
+                  {'provider': cur_provider.name, 'delay': cur_provider.fail_over_hours})
```
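The comment in that diff describes the core check. A self-contained sketch of what it boils down to, assuming the per-provider cache table and the date_added column this PR adds (the query itself is my illustration, not the PR's code):

```python
import sqlite3
import time

def first_result_age_hours(cache_db_path, provider_table, season, episodes):
    """Age in hours of the earliest cached result for an episode, or None.

    A provider's result stays delayed while this age is still below the
    provider's fail_over_hours.
    """
    conn = sqlite3.connect(cache_db_path)
    try:
        row = conn.execute(
            'SELECT MIN(date_added) FROM [{0}]'
            ' WHERE season = ? AND episodes = ?'.format(provider_table),
            (season, episodes)).fetchone()
    finally:
        conn.close()
    if row and row[0]:
        return (time.time() - row[0]) / 3600.0
    return None
```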
maybe add `hours` here? or the correct unit?
ooh yeah sure.
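Presumably the follow-up just appends the unit to the message, along these lines (my sketch of the requested change, not the actual commit):

```diff
-        log.debug('DELAY: Provider {provider} delay enabled, with an expiration of {delay}',
+        log.debug('DELAY: Provider {provider} delay enabled, with an expiration of {delay} hours',
                   {'provider': cur_provider.name, 'delay': cur_provider.fail_over_hours})
```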
medusa/search/core.py
Outdated
```diff
                 'left': int((int(time.time()) + (first_result['date_added'] - cur_provider.fail_over_hours * 3600)) / 3600)}
             )
         else:
             log.debug(u'DELAY: Provider {provider}, searched cache but could not get any results for: {season}x{episode}',
```
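To make the 'left' value concrete, here is a worked example of the hours-remaining arithmetic, assuming date_added is a unix timestamp: the expiry is date_added plus the delay, and what's left is expiry minus now, in whole hours (my reading of the intent; the exact expression in the diff may differ):

```python
import time

# Assume the first result was cached 3 hours ago with an 8-hour delay.
now = int(time.time())
date_added = now - 3 * 3600
fail_over_hours = 8

expires_at = date_added + fail_over_hours * 3600
hours_left = int((expires_at - now) / 3600)
print(hours_left)  # -> 5
```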
can you use `episode_num` instead of this?

```python
from medusa.helper.common import episode_num

episode_num(season=None, episode=None, numbering='standard')
```
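For reference, the helper formats season/episode pairs; with the default 'standard' numbering it should produce the familiar SxxEyy form (my recollection of its behavior, worth double-checking against the source):

```python
from medusa.helper.common import episode_num

episode_num(5, 12)  # -> 'S05E12' with the default 'standard' numbering
```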
also add show name
* Added more info to delayed download logging.
I would make fail-over hours a decimal in case someone wants fractional hours.
Also I would use the terms
…arch_delay and search_delay.
I'm not sure about that. First I'd like to see how this works, and which applications we can find. Then later I'm willing to change it for a while and experiment with using the pubdate.
@p0psicles one thing I've always thought would be nice would be modifying the manual search page to add an automated selection ordering, i.e. the order in which Medusa would select results if it were to pick automatically. Mentioning this here since you brought up ways to visualize releases and their delays.
Some episodes were snatched but were not supposed to be.
application.log
Just pulled the latest commit.
@NicoLeOca Looking at your logs, those were all snatched through a backlog search. I think you just missed that commit. Can you confirm that you didn't have any wrong snatches through a daily search?
@p0psicles
I think you hit on a bug. Not really related to this branch though. I'll look into it.
I'm hesitant to implement this for the proper search, as I think a proper should always be downloaded, right? @fernandog what do you think?
IMO yes. A proper can't be delayed.
Cool, then this is ready for review.
I'd add a note like
Just updated to the latest version.
medusa/tv/cache.py
Outdated
```diff
@@ -51,7 +51,8 @@ def __init__(self, provider_id):
         b' url TEXT,'
         b' time NUMERIC,'
         b' quality NUMERIC,'
-        b' release_group TEXT)'.format(name=provider_id))
+        b' release_group TEXT'
```
Missing `,`?
Yeah, good catch. But strangely enough it's not required.
I've added it anyway.
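So the resolved tail of the CREATE TABLE presumably ends up along these lines, with the trailing comma plus the new column (a sketch of the resolution, not the exact committed lines):

```python
b' quality NUMERIC,'
b' release_group TEXT,'
b' date_added NUMERIC)'.format(name=provider_id))
```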
```python
if item['link'] not in [_['link'] for _ in items_list_without_dups]:
    items_list_without_dups.append(item)

items_list = items_list_without_dups
```
What is this for and why is it needed now?
Because we can't add the same URL in one mass update. And for this feature I can't use the INSERT OR REPLACE anymore.
An added benefit is that we don't process/loop through duplicate results.
```python
unique_items = {item['link']: item for item in items_list}.values()
```

This uses the `link` as a dictionary key, ensuring no duplicates, with the original item as the value. The resulting dict's values are now all unique.
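A quick demonstration of that dict-based dedup; note that for duplicate links the last item wins, which matters if duplicates can differ:

```python
items_list = [
    {'link': 'magnet:?xt=a', 'title': 'Show.S01E01.720p'},
    {'link': 'magnet:?xt=a', 'title': 'Show.S01E01.720p.PROPER'},  # same link
    {'link': 'magnet:?xt=b', 'title': 'Show.S01E01.1080p'},
]

unique_items = list({item['link']: item for item in items_list}.values())
# Two items remain: the PROPER entry (last seen for link 'a') and the 1080p one.
```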
Codecov Report

```
@@            Coverage Diff             @@
##           develop    #3360      +/-   ##
===========================================
+ Coverage    33.06%   33.41%   +0.34%
===========================================
  Files          275      275
  Lines        34613    35642    +1029
===========================================
+ Hits         11446    11910     +464
- Misses       23167    23732     +565
```

Continue to review the full report at Codecov.
@NicoLeOca I would recommend switching back to develop now.
This feature will add two new attributes to each provider, named `fail_over_enabled` and `fail_over_hours`. I know those are shitty names; I'm open to suggestions. With this feature you can delay the download for a provider by x hours, while in the meantime it can pick up a result from another provider that doesn't have the delay, or where the delay has expired.

It will compare the time a result was added to the cache table with the time of the first result it got across all providers for that specific season/episode (season/multi-episode results are not supported).

For this I have added a column "date_added" to the provider cache database.

I've also had to remove duplicate items early on in the process, as we can't use INSERT OR REPLACE anymore: I now selectively update a number of fields for results that are already in the db, as sketched below.
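A minimal sketch of that insert-or-selective-update flow, preserving date_added on updates. Table and column names follow the PR's description; the SQL itself is my illustration, not the PR's code:

```python
import time

def save_cache_result(conn, provider_table, result):
    """Insert a new result, or update an existing one without touching date_added."""
    now = int(time.time())
    exists = conn.execute(
        'SELECT 1 FROM [{0}] WHERE url = ?'.format(provider_table),
        (result['url'],)).fetchone()
    if exists:
        # Refresh the searchable fields only; date_added keeps its original value.
        conn.execute(
            'UPDATE [{0}] SET name = ?, quality = ?, time = ? WHERE url = ?'
            .format(provider_table),
            (result['name'], result['quality'], now, result['url']))
    else:
        conn.execute(
            'INSERT INTO [{0}] (name, url, time, quality, date_added)'
            ' VALUES (?, ?, ?, ?, ?)'.format(provider_table),
            (result['name'], result['url'], now, result['quality'], now))
```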
Additionally I'm going to add a feature where it will reset snatched nzbs after an x amount of time. I'll make it configurable so that it can also be applied to torrents. This will be done in a follow-up PR.
TODO:

- [ ] ~~implement delay for proper search~~ decided not to implement this.

HOWTO:
To use the feature, go to your provider -> options (second tab) and enable the search delay.
The default is set to 8 hours, but you can of course change that.