Settings for python/yaml scrapers (similar to plugin settings) #4433

Open
DuctTape42 opened this issue Jan 8, 2024 · 1 comment

Comments

@DuctTape42

Is your feature request related to a problem? Please describe.
v24 included a plugin manager for scrapers and plugins, and settings for plugins. However, there are no such settings for scrapers. Prior to v24, the general solution was to edit the scraper (either the YAML or a config file) to add the information (API keys, usernames/passwords, etc.). With the new model, that's a lot more difficult.

Describe the solution you'd like
I'd like to see a new UI and API that works like the plugin infrastructure, but for scrapers: the same sort of configuration block in the scraper YAML file, and Python/JS scrapers could use the same GraphQL API. The values should also be available to YAML scrapers, probably using the same brace syntax as for {title}; a rough sketch follows.
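For YAML scrapers, a minimal sketch of what that could look like. This is an assumption, not existing behaviour: neither the settings block nor the {apiKey} placeholder exists today, and the scraper definition below is heavily simplified rather than a working scraper.

settings:
  apiKey:
    displayName: API Key
    type: string

sceneByFragment:
  action: scrapeJson
  # Hypothetical: the declared setting is substituted with the same brace
  # syntax already used for placeholders like {title}.
  queryURL: https://example.com/api/scenes?title={title}&key={apiKey}
  scraper: sceneScraper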

Describe alternatives you've considered
There are two easy (and not mutually exclusive) alternatives:

  1. Keep doing what we're doing. Scruffy's plugin template showed a pretty clever way to create a template config on first use, which users can then edit. I haven't yet tested what happens on uninstall or upgrade (whether the config gets deleted because it lives in the same directory).
  2. Use environment variables to pass information to Python scrapers. This works best for Docker users (and probably systemd users), since you can set the environment variables in the docker compose file or docker command; for other ways of launching Stash you could write wrapper scripts. See the sketch after this list.
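A minimal sketch of alternative 2, assuming a hypothetical EXAMPLE_SCRAPER_API_KEY variable set via docker compose or a wrapper script:

import json
import os
import sys

# Hypothetical variable name; it would be set in the docker compose file, e.g.
#   environment:
#     - EXAMPLE_SCRAPER_API_KEY=abc123
# or exported by a wrapper script for non-Docker installs.
API_KEY = os.environ.get("EXAMPLE_SCRAPER_API_KEY")

def main():
    if not API_KEY:
        print("EXAMPLE_SCRAPER_API_KEY is not set", file=sys.stderr)
        sys.exit(1)
    # Stash script scrapers receive a JSON fragment on stdin and reply on stdout.
    fragment = json.loads(sys.stdin.read())
    # ... call the site's API with API_KEY here ...
    json.dump(fragment, sys.stdout)

if __name__ == "__main__":
    main()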

Both alternatives are serviceable for script-based scrapers, but I think they make using these sorts of plugins a more advanced scenario (especially with the config being Python source), and they leave the file-based (YAML) scrapers behind.

Additional context
There's been some discussion about this on Discord. I think @DingDongSoLong4 or maybe @Maista6969 may have some additional context or thoughts here.

@Maista6969
Contributor

Thanks for bringing this up, and apologies for the late reply! I've already been mulling over some thoughts on this with regard to plugins, but everything below would apply equally to scraper settings.

I would love it if we kept iterating on the plugin settings introduced in #4143, specifically:

  • specifying a default value for a setting
  • requiring the user to change a setting before the plugin can run
  • persisting settings across plugin uninstall / reinstall
  • more setting types, like enums and lists (not crucial but nice to have)

The problem of missing defaults has already popped up in every plugin that uses this. It is usually worked around in the script code, but that is not satisfying because it's unclear to the user what the default actually is: if the UI defaults booleans to false but the plugin itself assumes true, for example, that's a bad user experience.
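For instance, a sketch of the usual script-side workaround (get_plugin_settings is a hypothetical stand-in for however the plugin actually obtains its saved settings):

def get_plugin_settings() -> dict:
    # Hypothetical helper: however the plugin fetches its saved settings,
    # values the user never touched simply won't be present.
    return {}  # simulate a fresh install with nothing configured

settings = get_plugin_settings()

# Script-side default, invisible to the UI: the checkbox renders as unchecked
# (false) while the plugin actually behaves as if it were true.
foo_the_baz = settings.get("fooTheBaz", True)
print(f"fooTheBaz = {foo_the_baz}")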

Other plugins can require, e.g., an API key before they'll ever be able to function, and the current feedback in the UI when a plugin fails could also use some work.

As a first draft, I'm imagining something like this in the .yml file that defines a plugin/scraper:

settings:
  foo:
    displayName: Foo
    description: Foo the baz before xyz?
    type: BOOLEAN
    default: true
  bar:
    displayName: Favorite bar
    type: string
    choices:
      - Cheers
      - Paddy's Pub
      - Ten Forward
  apiKey:
    displayName: Other site API Key
    description: Found in your account settings on OtherSite.com
    type: string
    required: true
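
For script scrapers, one way the resolved settings could be delivered is alongside the usual stdin JSON fragment. A sketch under that assumption (the "settings" key does not exist today):

import json
import sys

def main():
    # Stash script scrapers read a JSON fragment from stdin and write JSON to stdout.
    # The "settings" key is hypothetical: it assumes Stash would pass the resolved
    # values from the settings block above along with the fragment.
    frag = json.loads(sys.stdin.read())
    settings = frag.get("settings", {})

    # Defaults mirror the YAML draft; a required setting like apiKey would ideally
    # be validated by Stash before the scraper is ever invoked.
    foo = settings.get("foo", True)
    bar = settings.get("bar", "Cheers")
    api_key = settings.get("apiKey")
    if not api_key:
        print("apiKey is not configured", file=sys.stderr)
        sys.exit(1)

    json.dump({"title": f"Example result from {bar}", "details": "baz was fooed" if foo else ""}, sys.stdout)

if __name__ == "__main__":
    main()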
