
Updating the Core Conda Packages


This page explains what the corresponding workflows do to update the ska3-core packages.

Be aware that this page might be a bit outdated in relation to the code in the workflows.


Updating ska3-core entails:

  • assembling the list of all packages installed for each platform (linux-64, osx-64, win-64),
  • saving these lists and combining them into a single YAML file,
  • collecting the actual packages to place them in our repository, and
  • collecting the patches that need to be applied to the packages.

There are two different types of updates:

  • from scratch. This starts from the specifications in ska3-core-latest/meta.yaml, ska3-flight-latest/meta.yaml and ska3-perl-latest/meta.yaml, which list direct dependencies without specifying versions. It uses a top-level script called install_from_scratch.py for each of these meta-packages.
  • incrementally. This starts from an existing ska3-core package already in the conda channel, adding/updating some packages on top of it.

Creating from scratch

  1. If the update includes increasing the Python version, then some ska3 packages might need to be re-built (most likely any package that lists python in its recipe).

  2. Edit pkg_defs/ska3-*-latest/meta.yaml and optionally edit environment files in their respective directories. The idea of these environment files is that they can help the dependency resolution by splitting the installation into stages, and they can also specify alternate channels.
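
    The staged idea can be pictured with a short sketch. This is not the actual install_from_scratch.py; the env-*.yml file naming and the use of conda env update are assumptions made only for illustration:

    # hypothetical sketch of a staged install, not install_from_scratch.py
    import glob
    import subprocess
    import sys

    def staged_install(pkg_def_dir, env_name):
        # Apply each staged environment file in order, so the solver only has
        # to resolve a small set of new packages at each step.
        for env_file in sorted(glob.glob(f"{pkg_def_dir}/env-*.yml")):
            subprocess.run(
                ["conda", "env", "update", "-n", env_name, "-f", env_file],
                check=True,
            )

    if __name__ == "__main__":
        staged_install(sys.argv[1], sys.argv[2])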

  3. Assemble the package list and create new meta.yaml files (the recipes with specific versions). This proceeds in two stages:

    1. Create working environments in each platform and list all packages installed, following this gist.
    2. Combine the package lists from all platforms into a final meta.yaml file for each meta-project, following this gist

    There is a GitHub workflow that should automate this (described below), but unfortunately it always needs tweaks and is not working as of this writing. The following Python code triggers the workflow:

    from skare3_tools import github
    repo = github.Repository('sot/skare3')
    repo.dispatch_event(event_type='conda-meta-yaml', client_payload={'version': '2021.1', 'skare3_branch': '2021.1'})

    If the workflow succeeds, it produces three artifacts:

    • conda-meta.zip: the meta.yaml files for ska3-core, ska3-flight and ska3-perl
    • conda-packages.zip: the built ska3-core, ska3-flight and ska3-perl conda packages
    • json-files.zip: various JSON files, including the list of all packages downloaded and a few patch_instructions.json files.
  4. Download the packages into the conda channel (I do this on kady using this gist):

    unzip json-files.zip
    ./fetch_packages.py --channel www/ASPECT/ska3-conda/prime *json
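
    Roughly, the script walks the downloaded JSON package lists and fetches each package into the matching platform subdirectory of the channel. The sketch below is not the real fetch_packages.py; building the URL from the base_url/platform/dist_name fields of conda list --json, and trying both archive extensions, are assumptions:

    # hypothetical sketch of fetching packages, not the real fetch_packages.py
    import json
    import pathlib
    import sys
    import urllib.request

    def fetch(channel_dir, json_files):
        for json_file in json_files:
            for pkg in json.load(open(json_file)):
                # conda list --json records base_url, platform and dist_name;
                # the archive extension (.conda vs .tar.bz2) is assumed here.
                for ext in (".conda", ".tar.bz2"):
                    url = f"{pkg['base_url']}/{pkg['platform']}/{pkg['dist_name']}{ext}"
                    dest = pathlib.Path(channel_dir, pkg["platform"], pkg["dist_name"] + ext)
                    dest.parent.mkdir(parents=True, exist_ok=True)
                    try:
                        urllib.request.urlretrieve(url, str(dest))
                        break  # keep the first extension that exists upstream
                    except OSError:
                        dest.unlink(missing_ok=True)

    if __name__ == "__main__":
        fetch(sys.argv[1], sys.argv[2:])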
    
  5. Combine the noarch/patch_instructions.json into a tarball named patch_instructions.tar.bz2 (using this gist):

    ./combine_patch_instructions.py patch_instructions-*/*/*json
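
    A rough sketch of what this combination might look like is below. It is not the actual gist; the merge strategy and the assumption that patch_instructions.json holds dict-valued package overrides plus list fields are made only for illustration:

    # hypothetical sketch, not the real combine_patch_instructions.py
    import io
    import json
    import pathlib
    import sys
    import tarfile

    def combine(out_tarball, json_paths):
        # Group the per-architecture downloads by subdir (linux-64, osx-64,
        # win-64, noarch); files for the same subdir are merged key by key.
        merged = {}
        for path in map(pathlib.Path, json_paths):
            instructions = merged.setdefault(path.parent.name, {})
            for key, value in json.loads(path.read_text()).items():
                if isinstance(value, dict):
                    instructions.setdefault(key, {}).update(value)
                else:
                    instructions[key] = value
        with tarfile.open(out_tarball, "w:bz2") as tar:
            for subdir, instructions in merged.items():
                data = json.dumps(instructions, indent=2).encode()
                info = tarfile.TarInfo(f"{subdir}/patch_instructions.json")
                info.size = len(data)
                tar.addfile(info, io.BytesIO(data))

    if __name__ == "__main__":
        combine("patch_instructions.tar.bz2", sys.argv[1:])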
    
  6. Index the channel:

    conda index -p patch_instructions.tar.bz2  www/ASPECT/ska3-conda/prime
    

Workflow Details

The following procedure results in meta.yaml files for ska3-core, ska3-flight and ska3-perl, taking the latest versions of all packages available in the conda channels (it should be equivalent to the create-ska3-meta.sh and combine-meta.sh gists):

  1. Create/review the ska3-core-latest, ska3-flight-latest, ska3-perl-latest packages. These packages list only explicit package dependencies we want in the environment, and to a great extent they need to be made by hand. No versions are specified in these packages.
  2. ska3-core-latest, ska3-flight-latest, ska3-perl-latest packages must be built and in some conda repository before proceeding, because they are the basis for the procedure.
  3. Repeat the following for each architecture in ["ubuntu", "macos", "windows"]:
    python ./skare3/pkg_defs/ska3-core-latest/install_from_scratch.py
    conda list --json > ska3-core-${ARCH}.json
    python ./skare3/pkg_defs/ska3-flight-latest/install_from_scratch.py
    conda list --json > ska3-flight-${ARCH}.json
    conda install ska3-perl-latest
    conda list --json > ska3-perl-${ARCH}.json
    python ./skare3/patch_instructions.py get -o patch_instructions-${ARCH}
    
  4. Combine these into meta.yaml files for ska3-core/flight/perl. This step can be done automatically on GitHub, using the conda-meta.yml workflow, which uses combine_arch_meta.py in skare3. Roughly speaking:
    • ska3-flight includes all Ska packages installed by ska3-flight-latest,
    • ska3-core includes all non-Ska packages that get installed when doing `conda install ska3-flight-latest`,
    • ska3-perl includes all packages installed by ska3-perl-latest and not by ska3-flight-latest.

    The steps of the process are (a sketch of these selection rules follows the commands below):
    ./skare3/combine_arch_meta.py --name ska3-core --version ${SKA3_VERSION} \
      --out pkg_defs/ska3-core/meta.yaml \
      --env linux=json/ska3-flight-ubuntu.json \
      --env osx=json/ska3-flight-macos.json \
      --env win=json/ska3-flight-windows.json \
      --not-in skare3/pkg_defs/ska3-flight-latest/meta.yaml \
      --exclude ska3-flight
    ./skare3/combine_arch_meta.py --name ska3-flight --version ${SKA3_VERSION} \
      --out pkg_defs/ska3-flight/meta.yaml \
      --env linux=json/ska3-flight-ubuntu.json \
      --env osx=json/ska3-flight-macos.json \
      --env win=json/ska3-flight-windows.json \
      --in skare3/pkg_defs/ska3-flight-latest/meta.yaml \
      --include ska3-core \
      --build "noarch: generic"
    ./skare3/combine_arch_meta.py --name ska3-perl --version ${SKA3_VERSION} \
      --out pkg_defs/ska3-perl/meta.yaml \
      --env linux=json/ska3-perl-ubuntu.json \
      --env osx=json/ska3-perl-macos.json \
      --subtract-env linux=json/ska3-flight-ubuntu.json \
      --subtract-env osx=json/ska3-flight-macos.json \
      --build "skip: True  # [win]"
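
    The selection rules above can be pictured with a small sketch. This is not the real combine_arch_meta.py; the package names in flight_latest_deps are placeholders and only one platform is shown:

    import json

    def load_pkgs(json_file):
        """Map package name to version for one conda list --json dump."""
        return {pkg["name"]: pkg["version"] for pkg in json.load(open(json_file))}

    flight_env = load_pkgs("json/ska3-flight-ubuntu.json")
    perl_env = load_pkgs("json/ska3-perl-ubuntu.json")
    # Direct dependencies named in ska3-flight-latest/meta.yaml (read elsewhere);
    # the names here are placeholders for illustration.
    flight_latest_deps = {"kadi", "cxotime", "ska_sun"}

    # ska3-flight: the Ska packages explicitly listed in ska3-flight-latest
    ska3_flight = {name: flight_env[name] for name in flight_latest_deps if name in flight_env}
    # ska3-core: everything else that conda pulled in for ska3-flight-latest
    ska3_core = {name: ver for name, ver in flight_env.items() if name not in flight_latest_deps}
    # ska3-perl: packages added by ska3-perl-latest on top of the flight environment
    ska3_perl = {name: ver for name, ver in perl_env.items() if name not in flight_env}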
    

Incremental changes

There is a GitHub workflow to do (part of) this on the three standard platforms. In the following example, we trigger the workflow, which installs ska3-core 2022.2, updates the sherpa package to version 4.14.0 from conda-forge, and creates artifacts with conda packages (no yaml files yet):

from skare3_tools import github
repo = github.Repository('sot/skare3')
repo.dispatch_event(event_type='incremental-conda-meta', client_payload={'ska3_core_version': '2022.2', 'update': '-c conda-forge sherpa==4.14.0'})

Details

Most updates to core packages are done incrementally from a base version. In this case, it is not convenient to follow the process outlined above, because it would pull in the newest versions of all packages. Instead, we follow a procedure to update the versions of only a few selected packages and make sure all the dependencies are collected and registered in the corresponding meta.yaml file.

For now, this procedure assembles only ska3-core/meta.yaml; it assumes packages come from the defaults channel, and it is not automated. Adjust as needed. These are the steps on each architecture (windows, macos, ubuntu):

  1. conda create -y -n pkg-dev
    conda activate pkg-dev
    # install the base ska3-core from the flight channel
    mamba install -y --override-channels -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight ska3-core
    # conda install -y --strict-channel-priority --override-channels -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight -c defaults -c conda-forge mamba
    # remove the ska3-core meta-package so its pinned dependencies can be updated
    mamba uninstall -y ska3-core
    # update/install only the selected packages
    mamba update -y [--strict-channel-priority] [--override-channels] -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight -c defaults [package_spec [package_spec ...]]
    mamba install -y [--strict-channel-priority] [--override-channels] -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight -c defaults [package_spec [package_spec ...]]
    # record the resulting package list for this architecture
    conda list --json > ska3-core-${ARCH}.json
    
  2. I currently have a prototype script to gather all packages that are not from our conda channel, so I do:
    git clone --branch improvements https://github.com/sot/skare3_tools.git
    python ./skare3_tools/skare3_tools/conda.py --directory packages/${ARCH} --exclude-channel aspect/ska3-conda/flight
    
  3. Combine the per-architecture JSON files into meta.yaml (the encoding of the Windows file was different and had to be fixed before this step; see the snippet after the command):
    ./skare3/combine_arch_meta.py --name ska3-core --version ${SKA3_VERSION} \
      --out skare3/pkg_defs/ska3-core/meta.yaml \
      --env linux=linux/ska3-core-linux.json \
      --env osx=macos/ska3-core-macos.json \
      --env win=win/ska3-core-windows.json
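
    For reference, if the Windows dump came out as UTF-16 (which PowerShell redirection often produces), one way to normalize it before combining is shown below; that this was the actual encoding problem is an assumption:

    # re-encode the Windows package list as UTF-8 (assumes it was UTF-16)
    import pathlib

    path = pathlib.Path("win/ska3-core-windows.json")
    path.write_text(path.read_text(encoding="utf-16"), encoding="utf-8")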
    