
non-hacky, wheel-compatible way to implement a post-install hook #64

Open
glyph opened this issue Jun 15, 2015 · 120 comments

@glyph commented Jun 15, 2015

Sometimes you have to generate data dependent upon the installation environment where a package is being installed. For example, Twisted's plugin system has to make an index of all the files which have been installed into a given namespace package for quick lookup. There doesn't seem to be a way to do this properly in a wheel, since setup.py is not run upon installation.
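
For concreteness, the kind of index described here could be sketched like this - a simplified illustration, not Twisted's actual twisted.plugin implementation; the function names and cache filename are made up:

```python
import json
import os

def build_plugin_index(package_dir):
    """Map plugin module names to their source files in one directory scan."""
    return {
        name[:-3]: name
        for name in sorted(os.listdir(package_dir))
        if name.endswith(".py") and not name.startswith("__")
    }

def write_plugin_cache(package_dir, cache_name="plugin.cache.json"):
    """The step a post-install hook would run once, so imports at runtime
    can read a single cache file instead of stat()-ing every plugin."""
    index = build_plugin_index(package_dir)
    with open(os.path.join(package_dir, cache_name), "w") as f:
        json.dump(index, f)
    return index
```

The issue being raised is precisely that, with wheels, there is no reliable moment at install time to run something like write_plugin_cache().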

@daenney commented Jun 15, 2015

IMHO, namespace packages were a terrible idea, at least before Python 3.3's implicit namespace packages, and I'm still not sold on those either.

Why does Twisted feel the need to build up that plugin file index? Is the normal import procedure really that slow that this index provides a significant speedup?

@dstufft (Member) commented Jun 15, 2015

A post install hook is probably a reasonable request.

@glyph (Author) commented Jun 15, 2015

The plugin file index is frequently used by command-line tools where start-up time is at a premium. It really varies a lot - on a hot cache, on an SSD, it's not noticeable, but on a loaded machine with spinning rust it can be quite pronounced.

In any case, the Twisted plugin system is just an example. Whether it is a good idea or not (and there's certainly room for debate there) it already exists, and it's very hard to make it behave correctly without user intervention in the universe of wheels. I was actually prompted to post this by a requirement from pywin32, not Twisted, but pywin32's use-case is a bit harder to explain and I don't understand it as well.

@ncoghlan (Member) commented Jun 15, 2015 via email

The main concern I have with post-install hooks is the arbitrary code
execution at installation time.

I'm a lot more comfortable with installer plugins that process additional
metadata from the wheel's dist-info directory, as that reduces the scope of
code to be reviewed that is likely to be run with elevated privileges.

In the absence of metadata 2.0, one possible scheme might be to allow
metadata file handlers to be registered by way of entry points.

@dstufft (Member) commented Jun 15, 2015

I doubt @glyph cares if the post-install hook is done via a metadata plugin or via some script in the distribution (though I totally agree it should be via a plugin).

@glyph (Author) commented Jun 16, 2015

I am not sure I'm clear on the difference. In order for this to work reasonably, it has to be something that can come from the distribution, not in pip or in a plugin installed into pip; how exactly that thing is registered certainly doesn't matter to me.

@glyph (Author) commented Jun 16, 2015

@ncoghlan - ironically enough, it's easier to execute arbitrary code at installation time when one is definitely installing with elevated privileges; i.e. in a deb, an RPM, or a Windows installer. The reason that I want this functionality is that (especially with pip now doing wheels all the time) it is no longer possible to do this in a way which happens sensibly when you are running without elevated privileges, i.e. just pip installing some stuff into a virtualenv.

Registering COM plugins is another great example of a thing one might need to do at install time which can't really be put off; I wouldn't know if you could sensibly run COM plugins in a virtualenv though, that's @mhammond's area of expertise, not mine.

@mhammond:
pywin32's post-install script does things like creating shortcuts on the Start menu, writing registry entries, and registering COM objects - and it does adapt, falling back to doing these things for just the current user if it doesn't have full elevation. It doesn't currently adapt to not having the post-install script run at all.

But as @glyph and I have been discussing, many users do not want or need some of this - if a user just wants to "casually" use pywin32 (i.e., so that importing win32api works), many would be happy. I think it's a fine compromise for pywin32 to have multiple ways of being installed - use the full-blown executable installer if you want the bells and whistles, but also support being installed via pip/wheel etc., where you get nothing fancy. But even in that environment there are some files that should be copied (from inside the package to the site directory) for things to work as expected.

@glyph (Author) commented Jun 16, 2015

Would it make sense for pywin32 to do things like start-menu and COM registration as extras, or separate packages? It seems, for example, that pythonwin could just be distributed separately, since I don't need an IDE to want to access basic win32 APIs ;)

@glyph (Author) commented Jun 16, 2015

I only bring that up by way of pointing out even if everything were nicely separated out, those extras would still need their own post-install hooks, and each one would be doing something quite different.

@mhammond:
I don't think that someone seeking out the pywin32 installer is well served by splitting it into multiple packages to download and run. Conversely, I don't think people looking for a pip/wheel installation of pywin32 are going to be interested in the COM integration - they probably only want it because some package they care about depends on it. So IMO, a "full-blown pywin32 installer" and an "automatable/scriptable minimal installation" will keep the vast majority of people happy.

@glyph (Author) commented Jun 16, 2015

I definitely want to be able to put COM components into isolated environments, though, which is why I strongly feel that whatever the solution is for virtualenv has to be an instantiation of whatever is happening system-wide.

@ncoghlan (Member):

Regarding the installer plugin based approach discussed in http://legacy.python.org/dev/peps/pep-0426/#support-for-metadata-hooks, the way I would expect a system like that to work is:

  • either twisted itself, or a suitably named separate component (e.g. "twisted-plugin") provides the necessary metadata processing utility that handles installation and uninstallation of Twisted plugins
  • any package that needs that metadata to be processed declares a normal dependency on the project that provides the installer plugin

From an end-user perspective, the default behaviour would be that the plugin gets downloaded and invoked automatically by the subsequent package installation. No extra steps, at least when running with normal user level privileges (as noted in https://bitbucket.org/pypa/pypi-metadata-formats/src/default/metadata-hooks.rst there's a reasonable case to be made that installing and running metadata hooks should be opt-in when running with elevated privileges).

While those links refer to the PEP 426 metadata format, I've come to the view that a better model would likely be to put extensions in separate files in the dist-info directory, and have metadata handlers be registered based on those filenames. Not coincidentally, that has the virtue of being an approach that could be pursued independently of progress on metadata 2.0.

The key difference between this approach and allowing arbitrary post-install and pre-uninstall scripts lies in the maintainability. Instead of the Twisted project saying "here is the code to add to your post-install and pre-uninstall scripts to procedurally register and unregister your plugins with Twisted", they'd instead say "declare your Twisted plugins in this format, and depend on this project to indicate you're providing a Twisted plugin that needs to be registered and unregistered". If the registration/unregistration process changes, the Twisted project can just update the centrally maintained metadata processing plugin, rather than having to try to propagate a code update across the entire Twisted plugin ecosystem. (I admit such a change is unlikely in the case of Twisted specifically, but it gives the general idea).

Likewise for pywin32 and updating it for changes to the way it interacts with the underlying OS - I strongly believe it's better to centralise the definition of that operating system interaction code in pywin32 itself, and use a declarative metadata based approach in the projects relying on it.

There's an interesting question around "Do post-install and pre-uninstall hooks declared by a package also get run for that particular package?", and my answer to that is "I'm not sure, as I can see pros and cons to both answers, so a decision either way would need to be use case driven".

(From my position working on Red Hat's hardware integration testing system, I got an interesting perspective on the way systemd ended up taking over the world of Linux init systems. My view is that one of the biggest reasons for its success is that it replaced cargo-culted imperative shell scripts for service configuration that resulted in highly variable quality in service implementations with declarative configuration files that ensure that every single Linux service managed via systemd provides a certain minimum level of functionality, and that new features, like coping with containerisation, can be implemented just by updating systemd, rather than hoping for every single sysvinit script in the world to get modified appropriately)
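
The dispatch model described above could be sketched as follows - a toy illustration of installer-side behaviour, where handlers keyed by dist-info filename run after the wheel's files are copied. The registry is shown as a module-level dict for brevity; a real installer would populate it from entry points, and the filenames and handler names here are invented:

```python
import os

# filename in .dist-info -> handler callable(path); a real installer would
# populate this registry from entry points declared by installed plugins.
METADATA_HANDLERS = {}

def register_handler(filename):
    """Register a handler for a given dist-info metadata filename."""
    def decorator(func):
        METADATA_HANDLERS[filename] = func
        return func
    return decorator

def process_dist_info(dist_info_dir):
    """Run the registered handler for each recognised metadata file."""
    processed = []
    for name in sorted(os.listdir(dist_info_dir)):
        handler = METADATA_HANDLERS.get(name)
        if handler is not None:
            handler(os.path.join(dist_info_dir, name))
            processed.append(name)
    return processed
```

The point of the design is that only code from the (separately reviewed and installed) handler project runs, not arbitrary code shipped in the wheel being installed.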

@guyverthree commented Jun 26, 2015

I do see a need for this when using wheel convert to transform a Windows installer file into a wheel package. The installer would previously have run these scripts.

I'm converting a few because I prefer the wheel way of doing things, and I have to work with Windows nearly 90% of the time.

The packages I have seen include py2exe, pywin32, PySide, and Spyder, to name ones that ship a post-install script in the wheel package.

Or should pip be extended to run these post-install scripts from metadata in the wheel archive? The files are already there; pip is technically what performs the install, and the wheel is just the package format.

@graingert:
this would also allow https://pypi.python.org/pypi/django-unchained to ship a wheel

@stuaxo commented May 9, 2018

I'd like this to be able to do things like registering metadata and installing language files for GTK.

For Shoebot, we are making a library to help text editors plug us in. Those text editors need it extracted to some directory (that bit could be done from a wheel), but then some supplementary things need to happen; e.g., for gedit, installing a .plugin file and registering a .lang file.

The destination of these files depends on the platform. On installation, I'd be happy to tell the installer where they went so that it could uninstall them if needed.

@sloria commented Jun 20, 2018

I'm also interested in this. The use case I have in mind is installing git hooks for development environments, the way husky does via a devinstall hook. This would require pip to distinguish "development" installs, which it doesn't support AFAIK. So I'm not sure how this would look in practice.

@TylerGubala:
I need either something similar to this, or the ability to install files relative to the Python executable.

My use case is that I am trying to package the module BPY from Blender as a Python Module. Unfortunately, there is a folder, named after the current version of Blender (2.79, at the moment) that needs to be sibling to the Python executable.

Unfortunately (on Windows, in my experience at least), that location can vary, at least given what's available in setuptools. Currently I can describe this folder of .py files as 'scripts' per the setuptools documentation, and that works in most cases where the user installs into a freshly created venv.

py -3.6-64 -m venv venv
venv\Scripts\activate
(venv)py -m pip install --upgrade pip
(venv)py -m pip install bpy

  • Path to python.exe in venv: ./venv/Scripts/python.exe
  • Path to 2.79 folder: ./venv/Scripts/2.79
  • 2.79 folder is sibling to python.exe: True

But consider the case where someone is installing this into either their system install on Windows, or a Conda env, such as is the case here: TylerGubala/blenderpy#13

py -3.6-64 -m pip install --upgrade pip
py -m pip install bpy

  • Path to python.exe: {PYTHONPATH}/python.exe
  • Path to 2.79 folder: {PYTHONPATH}/Scripts/2.79
  • 2.79 folder is sibling to python.exe: False

The result of not having the 2.79 folder in the correct location (sibling to the executable) is that when you import the bpy module you will receive the following error due to it not being able to find the required libraries:

import bpy

AL lib: (EE) UpdateDeviceParams: Failed to set 44100hz, got 48000hz instead
ModuleNotFoundError: No module named 'bpy_types'
ModuleNotFoundError: No module named 'bpy_types'
ModuleNotFoundError: No module named 'bpy_types'
ModuleNotFoundError: No module named 'bpy_types'
ModuleNotFoundError: No module named 'bpy_types'
F0829 15:50:51.174837 3892 utilities.cc:322] Check failed: !IsGoogleLoggingInitialized() You called InitGoogleLogging() twice!
*** Check failure stack trace: ***
ERROR (bpy.rna): c:\users\tgubs.blenderpy\blender\source\blender\python\intern\bpy_rna.c:6662 pyrna_srna_ExternalType: failed to find 'bpy_types' module
ERROR (bpy.rna): c:\users\tgubs.blenderpy\blender\source\blender\python\intern\bpy_rna.c:6662 pyrna_srna_ExternalType: failed to find 'bpy_types' module
ERROR (bpy.rna): c:\users\tgubs.blenderpy\blender\source\blender\python\intern\bpy_rna.c:6662 pyrna_srna_ExternalType: failed to find 'bpy_types' module

Obviously, I can automate this via a script: simply find the 2.79 folder where it's expected and move it next to python.exe if it isn't there already. However, this brings the installation up to two commands - one to install the bpy module from PyPI and another to make the install correct. That's clumsy, a minor annoyance, and prone to people accidentally skipping the second command.
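
The fix-up step described here could be scripted roughly like this - a hypothetical sketch; the real layout varies by platform and Python distribution, and the function name is made up:

```python
import os
import shutil
import sys

def relocate_version_dir(version="2.79", executable=None):
    """Move the version-numbered folder next to the Python executable.

    setuptools places 'scripts' data under the Scripts directory, but the
    compiled extension expects it as a sibling of python.exe.
    """
    exe_dir = os.path.dirname(executable or sys.executable)
    target = os.path.join(exe_dir, version)
    if os.path.isdir(target):
        return target  # already a sibling of the executable; nothing to do
    candidate = os.path.join(exe_dir, "Scripts", version)
    if os.path.isdir(candidate):
        shutil.move(candidate, target)
    return target
```

In a venv on Windows, python.exe already lives in Scripts, so the folder is already a sibling; the move is only needed for system or Conda installs, which is exactly the second-command problem described above.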

One might suggest that bpy be made an sdist-only distribution that handles the 2.79 folder specifically. However, the sheer size of the source code and precompiled libraries (especially on Windows: > 6 GB to download!) makes this somewhat of a non-starter. Currently, the only thing I think I might be able to do is make bpy's install_requires reference a package on PyPI whose sole purpose is to subclass setuptools.command.install.install and perform the 2.79 move for the user. But this seems like a band-aid and not a great solution.

It would also be awesome for contributors to have a sensible, declarative way of doing this, where everyone could expect post-install scripts to be placed such that setuptools and pip understand them, irrespective of whether it's an sdist or bdist_wheel installation.

I hope that all makes sense. Unfortunately, for Blender in specific, I don't have control over the source code. There are many other issues and considerations that drive the motivation of having the 2.79 folder relative to the python executable that I won't go into here.

Suffice to say that a simple, post install script, OR being able to specify that the 2.79 folder must be placed relative to the executable, would have resolved this issue in a snap.

Hopefully that all makes sense!

@graingert commented Sep 1, 2018 via email

@TylerGubala:
Is there a smart way to do that?

The module in question is a .pyd, so it's compiled from C code that I don't have control over.

I'd love to hear more about your suggestion!

@graingert:
You can make a bpy package, move the current module to bpy/_speedups.pyd, and then in bpy/__init__.py do all the setup and re-export bpy._speedups.

@glyph (Author) commented Sep 1, 2018

Maybe you could make that folder on first import?

In the general case, this is definitely not the right answer. If you're trying to package things up for distribution, the location relative to the python executable definitely shouldn't be writable all the time, and certainly shouldn't be writable by the user using the application just because it's writable by the user who installed it.

@njsmith (Member) commented Sep 1, 2018

I hope that all makes sense. Unfortunately, for Blender in specific, I don't have control over the source code. There are many other issues and considerations that drive the motivation of having the 2.79 folder relative to the python executable that I won't go into here.

I want to push back on this. Obviously I don't know what all the considerations here are, so I may well be missing something, but at first glance this seems like an unlikely and unreasonable requirement.

@glyph is right that the user may not have write access to the folder where the python package is, but ... The installer doesn't necessarily have those permissions either!

Is this just because you need the directory to be on the dll search path? There are lots of ways to manage that that don't require creating a non-standard python environment. Or what are the issues here?

@TylerGubala:
It's not the .dll search path; it's that the Blender C module code depends on Python packages that exist as normal .py files. These files must exist in a folder that matches the Blender version (2.79 at the time of writing) otherwise the .pyd module simply won't work, as it cannot find the .py files it depends upon, which are supposed to exist at {PYTHON_EXE_DIR}/2.79/....

The mechanism for finding said .py files is part of the C code that I do not have control over.

Is the worry here security?

@njsmith (Member) commented Sep 1, 2018

@TylerGubala I understand you're in a difficult position, where you don't control either how Blender works or how Python packaging works, and are just trying to figure out some way to make things work. But... it sounds like you're saying that the Blender devs made some decisions about how to arrange their Python environment that are incompatible with how Python normally works, so now Python needs to change to match Blender. Maybe it would be better to like, talk to the Blender devs and figure out some way they could make their code work in a regular Python environment?

@joshuacwnewton commented Apr 1, 2021

Commenting here to describe another use case for post-install hooks, and to ask for advice on how to approach this in a wheel-friendly way.

Our package depends on several large externally-hosted files (pretrained ML model files, image datasets) that we'd like to download post-install.

  • In other words, sort of like package_data, but for files that are too big to bundle.
  • Currently, we aren't published on PyPI. Instead, we install via a shell script that runs pip install for our package, then downloads the necessary files afterwards. (This is understandably hacky, which is why we're looking for an alternative.)
  • Similar ask from someone else.

Another ML-based library, NLTK, currently uses a downloader script for this.

  • Downside: Needs to be invoked manually by the user after installing the package.
  • Upside: Provides more control + transparency about what gets installed, when it gets installed, what servers it gets installed from, and what locations it gets installed to.

Control + transparency is also the same rationale (more or less) given in this separate issue (with a much different context, mind you): pypa/pip#5898 (comment).

[...] brought along with it a rash of issues where "pip install " would depend on an unknowable set of servers outside of just their configured locations. This caused a number of problems, most obviously in locked down environments where accessing a service required getting a hole punched in a firewall, [...]

Basically, a user should be in charge of where they download files from, it should not be something under someone else's control.


For those reasons, is a manual post-install step the way to go, then? Or is there a way we can tie this to pip?

@con-f-use commented Apr 1, 2021

Wheels will never have arbitrary, automatic code execution on install. That goes against the entire concept.

Have you considered a console_script in the entry_points of your setup.py/pyproject.toml? The user would still have to run it manually, but at least it would be included in your package, and your package could be pip-installable and on PyPI or whatever index you use. Or, on first import, you could check whether the files are there and of the right size and, if not, download them?
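
The first-import check suggested here could look roughly like this - a sketch with a placeholder URL and size; real code should also verify a checksum rather than just the byte count:

```python
import os
import urllib.request

def ensure_data(path, url, expected_size):
    """Download `url` to `path` unless a file of the expected size exists.

    Returns True if a download happened, False if the file was already there.
    """
    if os.path.exists(path) and os.path.getsize(path) == expected_size:
        return False  # already present and plausible; nothing to do
    urllib.request.urlretrieve(url, path)
    return True
```

Called from the package's root __init__.py this runs on first import; wrapped in a console_script entry point it becomes the explicit manual step described above.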

@glyph (Author) commented Apr 1, 2021

Wheels will never have arbitrary, automatic code execution on install. That goes against the entire concept.

If this is the case, then will you close this issue definitively? This is the issue, here.

@henryiii (Contributor) commented Apr 1, 2021

What about a standard way to do this, but opt-in only? Like a wheel.post-install = pkg.pi:main entry point, and a --run-post-install-hooks flag in installers (pip) that will run this if the flag is present? Just a thought, possibly not a good one.
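
That hypothetical opt-in hook might be declared by a package like this (the wheel.post-install group name comes from the sketch above and is not an existing standard; mypkg is a placeholder):

```toml
# pyproject.toml of a package opting in to the hypothetical hook
[project.entry-points."wheel.post-install"]
mypkg = "mypkg.post_install:main"
```

An installer given --run-post-install-hooks could then look the entry point up with importlib.metadata after unpacking the wheel and call it; with the flag absent, nothing runs.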

@con-f-use commented Apr 1, 2021

That would have no advantage over running something on first import of the package, or over including a console_script entry point that the user has to run.

@joshuacwnewton commented Apr 1, 2021

Wheels will never have arbitrary, automatic code execution on install. That goes against the entire concept.

Thanks so much for the quick reply. I had a feeling this was the case, but the issue was still open, so I appreciate the explicit confirmation. 🙂

Have you considered a console_script in the entry_points of your setup.py/pyproject.toml

Indeed! We actually have this already.

I think we mainly wanted a post-install hook as a convenience to save the user a step.

Or on first import you could check if the files are there and of the right size and, if not, download them?

I was wondering about this, but then I read this earlier comment from @glyph: #64 (comment), which gave me a bit of pause. (That was in a different context, mind you, so perhaps it doesn't apply to our situation, and perhaps checking on first import is valid for us after all.)

@henryiii (Contributor) commented Apr 1, 2021

Running something on first import or using a console_script requires you look up the docs of the package, for each package. Having a standard way to spell it and run it for all packages is a minor improvement, I'd think. Someone installing trusted packages could just activate this once and get it for everything they install. Again, not something I've thought about much, just seemed like a possible improvement over not having anything at all.

@con-f-use commented Apr 1, 2021

Running something on first import can be done automatically - just put the code in the root __init__.py of your package. Your flag would also require people to look into pip's documentation. Again, people switched from setup.py + source distributions to pyproject.toml + wheels exactly because of the (mostly portability and, to a lesser extent, security) problems that this code execution has caused in the past.

@henryiii (Contributor) commented Apr 1, 2021

pip install -r requirements.txt --run-post-install-hooks or pipenv install --run-post-install-hooks could handle a bunch of packages that all have post-install hooks in one shot. Installing stuff on first import means you have to have internet access while you run the code, rather than just when you install it. It's opt-in only, so it doesn't worsen the security/portability problems - you don't have to opt in.

@henryiii (Contributor) commented Apr 1, 2021

You can pip-compile your requirements.txt with hashes so you don't have unexpected code running.

Your flag would also require people to look into pip's documentation

So does adding -r for requires. Or looking up pip install, etc.

@henryiii (Contributor) commented Apr 1, 2021

And a lot of packages (most of what I do and work with) require compilation, so setup.py vs pyproject.toml is not valid there. NumPy and tons of other packages are compiled, so can't have "no code execution" from SDist. It's only wheel being discussed here. I feel like having opt-in post-install hooks would be a minor benefit.

@pfmoore (Member) commented Apr 1, 2021

Wheels will never have arbitrary, automatic code execution on install. That goes against the entire concept.

If this is the case, then will you close this issue definitively? This is the issue, here.

The reality is that wheels do not currently have code execution on install. To add something like that would without a doubt require a PEP, and it would probably be a very contentious PEP because there are a lot of people for whom "installing doesn't execute code" is an important characteristic.

So I think the reality is that this issue will never result in a post-install hook being added, but that doesn't mean it's never going to happen, just that someone needs to propose a PEP, which has a relatively high probability of not getting accepted. And I don't get the feeling anyone is willing to do that.

@con-f-use:
Also, there are mechanisms to distribute binaries for the different architectures you want to support with your wheels. See manylinux and so on. As someone who has fought many many distributability and portability issues with python code in their company, I think, not compiling anything on the target machines at installation is a blessing.

@joshuacwnewton:
So I think the reality is that this issue will never result in a post-install hook being added, but that doesn't mean it's never going to happen, just that someone needs to propose a PEP, which has a relatively high probability of not getting accepted.

I can't speak for anyone else, but to me this sounds like a good resolution for this issue. Thank you for taking the time to share your thoughts. 🙂

@nhatkhai commented Apr 1, 2021

I think, not compiling anything on the target machines at installation is a blessing.

Until your project's transitive dependencies require many manual post-install runs, especially when the instructions change from version to version. I do agree it is a blessing to keep wheels clean of post-install steps - provided eggs are still supported, with an additional flag, so that people don't have to opt in.

@KOLANICH commented Apr 1, 2021

If one really needs to run post-install hooks from within the package being installed, one should use the distro's package manager.

@con-f-use commented Apr 1, 2021

Until your project's transitive dependencies require many manual post-install runs

At that point you fare better with something like a Docker container, a VM image, a custom package in a package manager, or a nix-shell, rather than a whole dependency tree of Python packages with post-install hooks. There are just too many possible points of failure otherwise.

@glyph (Author) commented Apr 1, 2021

… not compiling anything on the target machines at installation is a blessing …

As the original filer of the issue here, I can assure you that this is absolutely the opposite of what I want :-)

@glyph (Author) commented Apr 1, 2021

The reality is that wheels do not currently have code execution on install. To add something like that would without a doubt require a PEP, and it would probably be a very contentious PEP because there are a lot of people for whom "installing doesn't execute code" is an important characteristic.

So I think the reality is that this issue will never result in a post-install hook being added, but that doesn't mean it's never going to happen, just that someone needs to propose a PEP, which has a relatively high probability of not getting accepted. And I don't get the feeling anyone is willing to do that.

Thanks for the summary, @pfmoore ; this aligns with my understanding.

For my part, I'm much less attached to this issue than I once was. Twisted already has the equivalent of some __init__.py machinery that does the relevant post-install work, and has all along. And the changing idioms in how Python applications are deployed - they're almost always put into self-contained virtualenvs for both development and production, and integration with the host platform, particularly on Linux, is much less interesting - mean that I have fewer and fewer use-cases for this sort of thing.

In the cases where you really need something like this, for applications - adding menu entries to GNOME & KDE, for example - it seems like the right solution would be to have platform-specific tools.

I still think that it would be nice to have a dedicated entrypoint, and I would be much obliged to anyone willing to put in the work on the PEP (I'd even contribute a bit to the writing, if someone else were managing the process!) but you're right in that I'm not sufficiently motivated to make it happen myself.

@nhatkhai:
I will try conda next time, as it seems to support my use case while still working with pip, NuGet, etc., which refuse to solve cross-environment dependencies.

@danx0r commented May 29, 2022

You know you're headed for trouble when your question leads to a 6-year comment thread...

I was looking for a way to download large data files (ML models) as a post-install step. I hate hate hate code that does this on first run, that is always a headache when moving to production.

My thoughts after reviewing this thread:

  1. Support a console script such as:
     mymodule-init
  2. Support a similar imported method:
     import mymodule
     mymodule.init()
  3. Fall back to running on first load:
     import mymodule
     mymodule.main()  # main checks for data, downloads if necessary
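
The three options above could share one routine - a minimal sketch with illustrative names and a stubbed download step:

```python
import os

def init(data_path, fetch):
    """Ensure the data file exists, calling `fetch(path)` to download it.

    Callable explicitly (options 1 and 2) or as a guard on first use (option 3).
    """
    if not os.path.exists(data_path):
        fetch(data_path)
    return data_path

def main(data_path="model.bin", fetch=None):
    # Target for a hypothetical `mymodule-init` console_script; also safe
    # to call lazily from main(), since it is a no-op once the data exists.
    return init(data_path, fetch or (lambda p: open(p, "wb").close()))
```

Because the check is idempotent, shipping all three entry points is mostly a matter of exposing the same function in three places.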

@KOLANICH:
I was looking for a way to download large data files (ML models) as a post-install step.

It should likely not be done this way; such files should be downloaded only when they are actually needed.

@KOLANICH:
Having all three options simultaneously would be nice.

@jvanasco commented Jun 1, 2022

It would be nice if this could be handled in a way similar to how the installation of extra packages is handled, like how we used to have to do this:

pip install requests[security]

The idea - as several mentioned above - is to offer end users the opportunity to explicitly opt-in to a post-installation hook just as they can opt-in to installing additional packages.

An example of the need for this is the tldextract package (https://github.com/john-kurkowski/tldextract), which utilizes a cache that must be created/updated post-install. The project handles this by running the step only when needed (i.e. on first invocation/import) or when an upgrade is explicitly requested, but this creates an issue, as there is a performance hit on first usage. This is further complicated by the idiosyncrasies of copy-on-write memory in forking/multiprocessing situations.

Although this can be easily handled by automated deployments, it requires managing a web of comments to ensure the custom post-installation hook is not disabled and that any future automated tools remember to reimplement it.

sidenote: I feel like I've spent 10-20 years complaining about the same issues as @glyph, though usually it's within a few weeks or years, not 7+ like this one.

@nhatkhai commented Jun 2, 2022

Until this important missing feature exists, I see wheel as a broken solution for dependency management, in the same way NuGet decided to stop supporting post-installation steps (terrible: it saves the tool work but pushes the complexity elsewhere). I will keep using the egg distribution format until this is supported in wheel.
