
Solve for the "lowest" versions of specs #330

Closed
wolfv opened this issue Sep 12, 2023 · 6 comments · Fixed by #660
Labels
enhancement New feature or request help wanted Extra attention is needed

Comments

@wolfv
Contributor

wolfv commented Sep 12, 2023

In some discussion I heard that Cargo tries to resolve against the earliest versions of all the specs to verify that the lower bounds are correct.

I think that could be a cool feature to have under some flag. This would allow users to make sure (e.g. in a test) that their software still works with the minimum supported version of NumPy etc.

@baszalmstra baszalmstra added the enhancement New feature or request label Sep 19, 2023
@YeungOnion
Contributor

I'm interpreting this as allowing the user to make sure that the minimums support their software, by solving a modified matchspec in which every constraint that implies a lower bound is converted to another type of constraint while all others are preserved. My question would be: what should the behavior be for each kind of constraint?

Would a spec be "expanded" to have defaults? E.g. would >=1.2 be tested for the minimum with the spec ==1.2.0.0.0 (with as many .0s/defaults as needed to fully specify)?
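To illustrate the "expansion" idea, here is a minimal sketch of padding a lower bound with zero segments so it could be tested as an exact pin. This is a hypothetical helper, not part of the project's actual matchspec API; the function name and the fixed segment count are assumptions for illustration.

```rust
/// Hypothetical helper: pad a lower-bound version string like "1.2"
/// with trailing zero segments so it can be tested as an exact pin,
/// e.g. "1.2" with 5 segments becomes "==1.2.0.0.0".
fn expand_to_exact(lower_bound: &str, segments: usize) -> String {
    let mut parts: Vec<String> = lower_bound.split('.').map(String::from).collect();
    // Append "0" segments until the version is fully specified.
    while parts.len() < segments {
        parts.push("0".to_string());
    }
    format!("=={}", parts.join("."))
}

fn main() {
    assert_eq!(expand_to_exact("1.2", 5), "==1.2.0.0.0");
    assert_eq!(expand_to_exact("1.2.3", 3), "==1.2.3");
    println!("{}", expand_to_exact("1.2", 5));
}
```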

@maresb

maresb commented Feb 14, 2024

My original interpretation is that you use the exact same specs but flip the priority so that lower version numbers are favored, and then you'd get a single solution with low versions for at least most things.

I can see the value in both interpretations. I'm really curious what others think.

@wolfv
Contributor Author

wolfv commented Feb 15, 2024

I think it would be potentially hard or impossible to solve for all packages "pinned" to the lowest versions (e.g. you might depend on two packages with conflicting lower bounds).

For that reason, the way I see it working would be to just reverse the sorting and try all the lowest versions first.
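The approach described above amounts to flipping the candidate ordering inside the solver. A minimal sketch, assuming a simplified `Version` type and a `lowest_first` flag (not resolvo's actual API):

```rust
// Simplified stand-in for a package version; ordering is derived
// from the tuple components (major, minor, patch).
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Version(u32, u32, u32);

/// Order candidates for trial by the solver. By default the highest
/// version is tried first; with `lowest_first` set, the sort is left
/// ascending, so the solver attempts the oldest version satisfying
/// the constraints before any newer one.
fn order_candidates(mut candidates: Vec<Version>, lowest_first: bool) -> Vec<Version> {
    candidates.sort();
    if !lowest_first {
        candidates.reverse(); // default behavior: prefer the newest version
    }
    candidates
}

fn main() {
    let cands = vec![Version(1, 2, 0), Version(2, 0, 1), Version(1, 19, 0)];
    let lowest = order_candidates(cands.clone(), true);
    assert_eq!(lowest[0], Version(1, 2, 0));
    let highest = order_candidates(cands, false);
    assert_eq!(highest[0], Version(2, 0, 1));
    println!("ok");
}
```

Because the specs themselves are untouched, the result is still a valid solution; it simply lands on the lowest versions the solver can reach, rather than guaranteeing every package is at its spec's lower bound.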

@baszalmstra
Collaborator

Yeah, that's also the approach I would take. It should also be relatively simple to implement.

@YeungOnion
Contributor

I like it; easier-to-specify behavior leads to less surprise. Being simple to implement is nice too; I can spend some time working on that later next week. Would it make sense to notify the user if the solve's lowest versions are higher than what their matchspec implies?

@maresb

maresb commented Feb 16, 2024

Would it make sense to notify the user if the solve's lowest versions are higher than what their matchspec implies?

In my view this wouldn't be terribly helpful as default behavior. For me, the lower bounds I include in my dependency specifications typically indicate the feature set that my particular project is using. For instance, maybe I use Pandas 2.0 which depends on NumPy 1.20.3. If my code uses NumPy 1.19 features then I'll include numpy >=1.19. I don't care that the pandas >=2 pin is stricter. I want the solver to solve my specs not write my specs.

A possible exception would be if I'm actively trying to support python >=3.7 but I've pinned pandas >=2 which requires python >=3.8. Then it'd be nice to know that it's time to give up on Python 3.7. But if I'm actively trying to support Python 3.7, then I'll have a CI run dedicated to testing this. So actually I don't find this argument so convincing.

@baszalmstra baszalmstra added the help wanted Extra attention is needed label Feb 29, 2024