The other No Arch option #14

Closed
taldcroft opened this issue Jul 3, 2018 · 14 comments

Comments

@taldcroft
Member

Thinking more about #13, it seems we could do away with all of that and have a 100% conda install if we drop the arch directory. There would then just be the SKA root where conda gets installed (to bin/, lib/, etc.), along with a data link for mutable data. The current template stuff could be installed by conda packages.

Features / possible issues:

  • Production data meant for sharing between environments would live in /proj/sot/ska/data. A production Ska3 would make a link to that.
  • The current trick of using soft links to install a new version of Ska via rsync (allowing easy fall-back) is no longer possible, but we probably don't care.
  • The current perl launcher in /proj/sot/ska/bin/perl is not possible. We likely do not need the Python one for Ska3, but the perl one could be replaced by a ska3perl or whatever (see the sketch after this list).
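
For illustration, a minimal shell sketch of what that layout and a ska3perl-style launcher could look like. The /proj/sot/ska3/flight prefix and the ska3perl name are assumptions for discussion, not settled choices:

# Install conda directly at the Ska root (gives bin/, lib/, etc.)
SKA=/proj/sot/ska3/flight   # hypothetical install prefix
bash Miniconda3-latest-Linux-x86_64.sh -b -p "$SKA"

# Shared mutable data lives outside the install; link it in
ln -s /proj/sot/ska/data "$SKA/data"

# ska3perl: thin wrapper replacing the old /proj/sot/ska/bin/perl launcher
printf '%s\n' '#!/bin/bash' \
    'export SKA=/proj/sot/ska3/flight' \
    'exec "$SKA/bin/perl" "$@"' > "$SKA/bin/ska3perl"
chmod +x "$SKA/bin/ska3perl"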

Other problems @jeanconn?

@jeanconn
Contributor

jeanconn commented Jul 3, 2018

I didn't think that arch was keeping us from having basically a 100% conda install. I thought arch just gave us a reasonable mechanism for handling two-arch Ska runtime environments on GRETA with one env init script (for when we can use conda for GRETA management).

I thought the only reason we need #13 right now is that we need to figure out more about the other "setup" pieces (pinned, .condarc). I don't know whether those could also be conda-packaged, if we actually need them.

@jeanconn
Contributor

jeanconn commented Jul 3, 2018

But if you think the matlab python is going to stay Python 2 until GRETA is all 64-bit CentOS 7, the multi-arch thing isn't really important.

@taldcroft
Member Author

> matlab python is going to stay Python 2 until GRETA is all 64-bit CentOS 7

Definitely, since there won't ever be a 32-bit Ska3. So yes, that was implicit in not needing arch: Ska3 is strictly 64-bit, and I can't see a need for Linux and Mac installs in the same directory.

@taldcroft
Member Author

About the pinned stuff, can that be replicated with a meta-package that has strict requirements? That seems like a much cleaner approach, where the pinned file is just for folks who need something quick-n-easy.

My understanding is that the only practical difference between pinned and not pinned at this point is that we know letting the pinned ones float will break stuff, while for the rest there is a decent chance newer versions will work. But in terms of the environment, every ska3-core package is specified at a particular version.
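
To make that concrete, here is a hypothetical meta.yaml for such a meta-package; the name, version, and pins are purely illustrative, not the real ska3-core contents:

package:
  name: ska3-core
  version: "1.0"

requirements:
  run:
    # every run requirement at an exact version, so the solver enforces the pins
    - numpy ==1.13.3
    - scipy ==0.19.1
    - astropy ==2.0.2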

@jeanconn
Contributor

jeanconn commented Jul 3, 2018

Yes, wrt using the meta-package for strict requirements, that's what I meant by "figure out more".

You've mentioned that for your use cases the idea of a working dev ska basically ceases to exist, because you want to keep your packages at the ska3-core versions, but it looks like you are still thinking about dev ska use cases a bit in #16. IIRC my last dev ska use was to add a new package to play with. I'm not sure what behavior I'd get if I did:

  • create runtime env using ska3-core and ska3-flight
  • install a new package into the runtime (my last use case was astroquery)

Without pinning, I think the install process now finds the minimal set of changes that gives me astroquery, and that process is not guaranteed to preserve all ska3-core packages at their current versions. So I think you are right that we'd probably need something like:

create a new environment with ska3-core and astroquery, and if that doesn't work, try ska3-pinned and astroquery.
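
In conda terms, that fall-back might look like the following (the env name is a placeholder, and ska3-pinned is the hypothetical metapackage discussed above):

# First try: create the env with ska3-core plus astroquery
conda create -n ska3-dev ska3-core astroquery

# Fall-back if core versions drift: use the fully pinned metapackage instead
conda create -n ska3-dev ska3-pinned astroquery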

So yes, that would probably deal with pinned.

With regard to .condarc, I'm still trying to limit the set of conda packages that we get from the default channels to ones that are probably CentOS 5 compatible, via an explicit .condarc. If we actually need/want it, we can add a manual copy step if you don't want a Makefile.
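
Something like this minimal .condarc is what I have in mind (the channel URL here is a placeholder, not our actual channel):

# Limit package sources to channels whose builds are CentOS 5 compatible
channels:
  - https://example.com/ska3-conda   # placeholder for our CentOS 5 safe channel
  - defaults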

@taldcroft
Member Author

OK, just looked at https://github.com/sot/skare/blob/py3/dot_condarc and was reminded of the issue. So eventually that will go away.

I am just fighting hard to put all the effort onto the build side so that installation can be done without being in a skare3 repo. Maybe that's impossible or not worth the effort, and if so then do what you need to do for now. It's the constant battle between removing things to make it more perfect, and perfect being the enemy of good. 😄

@taldcroft
Member Author

I don't think we really need a separate ska3-pinned (which would be a dependency of ska3-core). Can't those requirements just be regular old requirements that we procedurally remember not to update willy-nilly? In a year, when we get away from CentOS 5, this whole mess will go away.

@taldcroft
Member Author

I would be surprised / disappointed if the requirements of an installed metapackage were forgotten once it is installed. Say we install a metapackage numpy113 that requires numpy=1.13, and then try to install another package that has a run-time requirement of numpy=1.14. That should just fail: I thought that every time you install something, conda solves the whole dependency graph.

@jeanconn
Contributor

jeanconn commented Jul 3, 2018

I'm probably just holding on to a memory of old/bad behavior and this is all fixed in conda > 4 and works as you expect.

@taldcroft
Member Author

🎉 so the grand plan is slowly morphing into place!

@jeanconn
Contributor

jeanconn commented Jul 3, 2018

Sorry, haven't actually tested; working on ACA pre-review.

@taldcroft
Member Author

I.e. let's baseline no arch/ dir and have ska3-flight and ska3-core packages.

(Oops, I misread that and got overexcited.) But it must work; any other behavior would be a catastrophic bug.

@jeanconn
Contributor

jeanconn commented Jul 3, 2018

Looks like the test works as anticipated:

fido: more a/meta.yaml 
package:
  name: a
  version: 1.0

requirements:
  run:
    - numpy ==1.13
fido: more b/meta.yaml 
package:
  name: b
  version: 1.0

requirements:
  run:
    - a ==1.0
    - pep8 ==1.7.0

After installing the "b" metapackage I get the appropriate error if I try to change numpy versions.

(chained) bash-4.1$ conda install numpy=1.12
Fetching package metadata .............
Solving package specifications: .

UnsatisfiableError: The following specifications were found to be in conflict:
  - b -> a ==1.0 -> numpy ==1.13
  - numpy 1.12*
Use "conda info <package>" to see the dependencies for each package.

@taldcroft
Member Author

Closed by #19.
