use --no-deps when installing using pip #7345

Closed · keewis opened this issue Jul 30, 2020 · 14 comments · Fixed by #7397
Labels: Needed: documentation (Documentation is required)

Comments

keewis (Contributor) commented Jul 30, 2020

#5635 switched to python -m pip install --upgrade --upgrade-strategy eager . with

python:
  install:
    - method: pip
      path: .

which upgrades every dependency, even ones we pinned to a specific version. I could use method: setuptools instead, but that is unreliable: it fails if a dependency itself depends on the package being installed (i.e. a circular dependency graph).

Would it be possible to use --no-deps:

python -m pip install --upgrade --upgrade-strategy eager --no-deps .

so that --upgrade reinstalls only the package at . instead of also upgrading every dependency?

If you're concerned about someone's docs build failing because a dependency is missing, would it be possible to enable this flag conditionally (e.g. when using conda, where upgrading dependencies with pip is not really a good idea)?

keewis (Contributor, Author) commented Aug 5, 2020

Gentle ping. If I understand it correctly, what I'm asking for is to add --no-deps here:

self.build_env.run(
    self.venv_bin(filename='python'),
    '-m',
    'pip',
    'install',
    '--upgrade',
    '--upgrade-strategy',
    'eager',
    *self._pip_cache_cmd_argument(),
    '{path}{extra_requirements}'.format(
        path=local_path,
        extra_requirements=extra_req_param,
    ),
    cwd=self.checkout_path,
    bin_path=self.venv_bin(),
)

stsewd (Member) commented Aug 5, 2020

Don’t install package dependencies.

https://pip.pypa.io/en/stable/reference/pip_install/#cmdoption-no-deps

If I understand this correctly, --no-deps will not install the deps that are in the setup.py file, so users will need to install those in another way (like a requirements file)?

Can you link me to your package and config file? I'm not sure I understand the problem correctly.

stsewd added the 'Needed: more information' and 'Support' labels on Aug 5, 2020
keewis (Contributor, Author) commented Aug 5, 2020

If I understand this correctly, --no-deps will not install the deps that are in the setup.py file, so users will need to install those in another way (like a requirements file)?

Yes, that's right: it skips dependency resolution and installs only the package itself. For that to work properly, all dependencies have to be installed beforehand, either manually, via a requirements file (for pip), or via an environment file (for conda).
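
For illustration, the resulting two-step pattern could look like this (a sketch; requirements.txt stands in for whatever pin file a project actually uses):

python -m pip install -r requirements.txt    # install the pinned dependencies first
python -m pip install --no-deps --upgrade .  # then the package itself, leaving its deps untouched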

On my personal setup there are two experimental branches that differ only in the install method:

  • setuptools (readthedocs.yml, conda environment, build log): the install itself works and conda list prints the correct version, but the version imported in conf.py is the most recently released one, pulled in by a dependency (the circular dependency graph I mentioned): it prints xarray: 0.16.0. This is a general issue that I can reproduce locally.
  • pip (readthedocs.yml, conda environment, build log): with pip, conf.py imports the correct version of xarray: it prints xarray: 0.16.1.dev21+g5e2b36d7. However, even though pandas is pinned to 1.0.* in the environment file, it was upgraded by the call to pip install --upgrade --upgrade-strategy eager. This is a problem because we missed a change in pandas, and our docs build fails with the new version (which is why we pinned it in the first place).

I think this can be fixed using --no-deps, which wouldn't hurt us since the environment file installs all dependencies anyway. If you'd rather not enable this unconditionally, or only in combination with conda, could you add a configuration setting exposing it?
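
Something like this, to sketch the idea (hypothetical syntax; no such option exists today):

python:
  install:
    - method: pip
      path: .
      no_deps: true  # hypothetical flag, not a real Read the Docs option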

The no-response bot removed the 'Needed: more information' label on Aug 5, 2020
stsewd (Member) commented Aug 10, 2020

Thanks for the explanation. I think the way to solve this would be to set an upper limit for pandas in https://github.com/keewis/xarray/blob/6ac5056dc0377b05458553e39982bc34b22964d0/setup.cfg#L78, since the build breaks unless your users have pandas 1.0.x, as pinned in https://github.com/keewis/xarray/blob/6ac5056dc0377b05458553e39982bc34b22964d0/ci/requirements/doc.yml#L20.
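
For example (the bound below is illustrative, not the project's actual constraint):

# illustrative snippet, not xarray's actual setup.cfg
[options]
install_requires =
    pandas >=0.25, <1.1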

keewis (Contributor, Author) commented Aug 10, 2020

That's true; we could fix this particular issue in setup.cfg.

That's just a workaround, though, and I would argue that that file is not the right place to pin dependencies for one particular CI job, because it would affect other CI jobs where we actually want to install the conflicting version of pandas. Besides, if the issue were not with upgrading pandas but with a package like cartopy, which doesn't provide wheels (or whose wheel conflicts with a different conda package), we'd want to avoid reinstalling that package with pip.

Additionally, we're currently considering a workflow where all dependencies are pinned to specific versions and updated automatically via PRs opened by dependabot. That workflow would be impossible if we couldn't rely on the version pins in the environment file.

stsewd (Member) commented Aug 13, 2020

I think you can solve this by not calling pip install . at all, since all the dependencies are already installed, and instead using the sys.path trick (https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html) so Sphinx can find your package.
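
That trick is the standard pattern from the Sphinx autodoc docs; a minimal sketch (the relative path is an assumption and depends on where conf.py lives):

# conf.py -- make the in-tree package importable without installing it
import os
import sys
sys.path.insert(0, os.path.abspath('..'))  # directory containing the package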

Another solution would be to first install the package and only after that install the pinned packages (not sure if this is possible with conda).

keewis (Contributor, Author) commented Aug 13, 2020

just use the sys.path trick

That's what we're currently doing. It works, but since we also use setuptools_scm (which doesn't report the version correctly unless the package has been installed), the displayed version is not the right one.
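
(For context, setuptools_scm computes the version from git tags rather than from the source tree; its public get_version helper shows what it would report for a checkout. A sketch:)

# what setuptools_scm would report for the current working tree
from setuptools_scm import get_version

print(get_version(root='.'))  # e.g. '0.16.1.dev21+g5e2b36d7' when a tag is reachable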

Another solution would be to first install the package and after that install the pinned packages

I'll investigate and report back.

keewis (Contributor, Author) commented Aug 18, 2020

It works if I use something like this in the conda environment file:

dependencies:
  - pip
  - pip:
    - -e ../..  # path to the folder containing setup.py, relative to the environment file

I still think that python.install with the pip method should not aggressively upgrade dependencies, but this trick can work around that. If you don't want to change the behavior of the pip method, maybe you could mention this trick in the docs?

stsewd (Member) commented Aug 18, 2020

I don't think we will change this, as it would break some builds, and I'm not sure it's common enough to justify a new setting. We can definitely document it in our FAQ: https://github.com/readthedocs/readthedocs.org/blob/master/docs/faq.rst. Feel free to update those docs!

stsewd added the 'Needed: documentation' label and removed the 'Support' label on Aug 18, 2020
keewis (Contributor, Author) commented Aug 19, 2020

Okay, so it seems to work fine, except with PRs, where it doesn't. I think I tracked this down to the shallow clone of the git repository, which is created using:

git clone --no-single-branch --depth 50 https://github.com/pydata/xarray.git .
git fetch origin --force --tags --prune --prune-tags --depth 50 pull/4355/head:external-4355
git checkout --force 9076bb28a0bbc149308a3742cba6dedf9aea36b2

If the number of commits since the last release exceeds 50, setuptools_scm can't find the version anymore, because no tagged commit is reachable from the commit being built (which means normal branches could also fail; we just haven't hit that limit yet). As soon as I run git fetch --unshallow, version detection succeeds.

Do you have any advice on how to fix that?
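
(One conceivable client-side workaround, sketched here as an untested assumption rather than something tried in this thread, would be to unshallow the clone from conf.py before setuptools_scm queries git:)

# conf.py -- hypothetical workaround: fetch full history so a tag becomes reachable
import subprocess
subprocess.run(['git', 'fetch', '--unshallow'], check=False)  # ignore the error on already-full clones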

stsewd (Member) commented Aug 19, 2020

@keewis I can disable the shallow cloning for your project. Currently, we don't expose that option to users. What's the project?

keewis (Contributor, Author) commented Aug 19, 2020

xarray (slug: xray) and xarray-keewis (only xray is really important, though)

stsewd (Member) commented Aug 19, 2020

Done, your projects shouldn't shallow clone now: https://readthedocs.org/projects/xray/builds/11699942/

keewis (Contributor, Author) commented Aug 19, 2020

Thank you very much for the quick responses; I can confirm they no longer shallow clone.
