SDSC: PKG - expanse/0.17.3/cpu/b - Missing Wannier90 (example application) #46

Closed · nwolter opened this issue Mar 8, 2023 · 20 comments

nwolter commented Mar 8, 2023

No description provided.

nwolter changed the title from "SDSC: PKG - expanse/0.17.3/cpu/a - Missing Wannier90" to "SDSC: PKG - expanse/0.17.3/cpu/a - Missing Wannier90 (example application)" on Mar 8, 2023

mkandes commented May 5, 2023

Still on the build list to do.

mkandes changed the title from "SDSC: PKG - expanse/0.17.3/cpu/a - Missing Wannier90 (example application)" to "SDSC: PKG - expanse/0.17.3/cpu/b - Missing Wannier90 (example application)" on May 5, 2023

mkandes commented Aug 22, 2023

Starting with a shared Spack instance configuration and environment for expanse/0.17.3/cpu/b set up in my HOME directory, we'll again walk through the process of creating and testing a new spec build script prior to deploying it into the production instance.

[mkandes@login02 ~]$ spack --version
0.17.3
[mkandes@login02 ~]$ which spack
alias spack='spack --config-scope /home/mkandes/.spack/0.17.3/cpu/b/'
	spack ()
	{ 
	    : this is a shell function from: /cm/shared/apps/spack/0.17.3/cpu/b/share/spack/setup-env.sh;
	    : the real spack script is here: /cm/shared/apps/spack/0.17.3/cpu/b/bin/spack;
	    _spack_shell_wrapper "$@";
	    return $?
	}
[mkandes@login02 ~]$
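
The spack alias shown above is what layers my personal configuration scope on top of the shared instance. A minimal sketch of how that is typically set up, assuming the alias lives in ~/.bashrc (the paths are the ones reported by which spack above):

# Assumed ~/.bashrc snippet (illustrative only): source the shared instance's
# setup script, then layer a per-user config scope over it via an alias.
. /cm/shared/apps/spack/0.17.3/cpu/b/share/spack/setup-env.sh
alias spack='spack --config-scope /home/mkandes/.spack/0.17.3/cpu/b/'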


mkandes commented Aug 22, 2023

First, check spack info for the package to determine which variants to use and whether any dependencies need to be strictly enforced. For example, you typically want to pin the BLAS, MPI, and other virtual dependency providers used during the build, since there are usually many candidates that could satisfy them.

[mkandes@login02 ~]$ spack info wannier90
MakefilePackage:   wannier90

Description:
    Wannier90 calculates maximally-localised Wannier functions (MLWFs).
    Wannier90 is released under the GNU General Public License.

Homepage: http://wannier.org

Externally Detectable: 
    False

Tags: 
    None

Preferred version:  
    3.1.0    https://github.com/wannier-developers/wannier90/archive/v3.1.0.tar.gz

Safe versions:  
    3.1.0    https://github.com/wannier-developers/wannier90/archive/v3.1.0.tar.gz
    3.0.0    https://github.com/wannier-developers/wannier90/archive/v3.0.0.tar.gz
    2.1.0    https://github.com/wannier-developers/wannier90/archive/v2.1.0.tar.gz
    2.0.1    https://github.com/wannier-developers/wannier90/archive/v2.0.1.tar.gz

Deprecated versions:  
    None

Variants:
    Name [Default]    When    Allowed values    Description
    ==============    ====    ==============    ======================================

    shared [on]       --      on, off           Builds a shared version of the library

Installation Phases:
    edit    build    install

Build Dependencies:
    blas  lapack  mpi

Link Dependencies:
    blas  lapack  mpi

Run Dependencies:
    None

Virtual Packages: 
    None

[mkandes@login02 ~]$
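
To pin the BLAS and MPI providers, look up the builds already installed in the instance and their hashes, then reference them in the spec with Spack's /<hash> syntax. A minimal sketch, assuming the openblas@0.3.18 and openmpi@4.1.3 builds below exist in the instance:

# List candidate providers along with their hashes and variants.
spack find --long --variants openblas@0.3.18 % gcc@10.2.0
spack find --long --variants openmpi@4.1.3 % gcc@10.2.0

# Capture the 7-character hashes for use in the build spec.
spack find --format '{hash:7}' openblas@0.3.18 % gcc@10.2.0 ~ilp64 threads=none
spack find --format '{hash:7}' openmpi@4.1.3 % gcc@10.2.0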


mkandes commented Aug 22, 2023

Once you've selected the package version, compiler, variants, and dependencies, run a manual check with spack spec to confirm you can produce the required build spec.

[mkandes@login02 ~]$ spack spec -l wannier90@3.1.0 % gcc@10.2.0 +shared ^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % gcc@10.2.0 ~ilp64 threads=none) ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % gcc@10.2.0)
Input spec
--------------------------------
wannier90@3.1.0%gcc@10.2.0+shared
    ^openblas@0.3.18%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-zen2
    ^openmpi@4.1.3%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=none fabrics=ucx schedulers=slurm arch=linux-rocky8-zen2
        ^hwloc@2.6.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-zen2
            ^libpciaccess@0.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
            ^libxml2@2.9.12%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-zen2
                ^libiconv@1.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  libs=shared,static arch=linux-rocky8-zen2
                ^xz@5.2.5%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-zen2
                ^zlib@1.2.11%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-zen2
            ^ncurses@6.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-zen2
        ^libevent@2.1.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-zen2
        ^lustre@2.15.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
        ^numactl@2.0.14%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-zen2
        ^pmix@3.2.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-zen2
        ^slurm@21.08.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-zen2
        ^ucx@1.10.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma~cuda+dc~debug+dm~gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=none arch=linux-rocky8-zen2
            ^rdma-core@43.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-zen2

Concretized
--------------------------------
4vm3ixw  wannier90@3.1.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +shared arch=linux-rocky8-zen2
fgk2tlu      ^openblas@0.3.18%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-zen2
oq3qvsv      ^openmpi@4.1.3%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=none fabrics=ucx schedulers=slurm arch=linux-rocky8-zen2
7rqkdv4          ^hwloc@2.6.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-zen2
ykynzrw              ^libpciaccess@0.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
mgovjpj              ^libxml2@2.9.12%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-zen2
zduoj2d                  ^libiconv@1.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  libs=shared,static arch=linux-rocky8-zen2
paz7hxz                  ^xz@5.2.5%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-zen2
ws4iari                  ^zlib@1.2.11%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-zen2
5lhvslt              ^ncurses@6.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-zen2
bimlmtn          ^libevent@2.1.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-zen2
fy2cjdg          ^lustre@2.15.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
ckhyr5e          ^numactl@2.0.14%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-zen2
dpvrfip          ^pmix@3.2.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-zen2
4kvl3fd          ^slurm@21.08.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-zen2
dnpjjuc          ^ucx@1.10.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma~cuda+dc~debug+dm~gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=none arch=linux-rocky8-zen2
xjr3cuj              ^rdma-core@43.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-zen2

[mkandes@login02 ~]$


mkandes commented Aug 22, 2023

If the build spec looks good, codify it in a *.sh spec build job script.

#!/usr/bin/env bash

#SBATCH --job-name=wannier90@3.1.0
#SBATCH --account=use300
##SBATCH --reservation=rocky8u7_testing
#SBATCH --partition=ind-shared
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=16
#SBATCH --mem=32G
#SBATCH --time=00:30:00
#SBATCH --output=%x.o%j.%N

declare -xr LOCAL_TIME="$(date +'%Y%m%dT%H%M%S%z')"
declare -xir UNIX_TIME="$(date +'%s')"

declare -xr LOCAL_SCRATCH_DIR="/scratch/${USER}/job_${SLURM_JOB_ID}"
declare -xr TMPDIR="${LOCAL_SCRATCH_DIR}"

declare -xr SYSTEM_NAME='expanse'

declare -xr SPACK_VERSION='0.17.3'
declare -xr SPACK_INSTANCE_NAME='cpu'
declare -xr SPACK_INSTANCE_VERSION='b'
declare -xr SPACK_INSTANCE_DIR="/cm/shared/apps/spack/${SPACK_VERSION}/${SPACK_INSTANCE_NAME}/${SPACK_INSTANCE_VERSION}"

declare -xr SLURM_JOB_SCRIPT="$(scontrol show job ${SLURM_JOB_ID} | awk -F= '/Command=/{print $2}')"
declare -xr SLURM_JOB_MD5SUM="$(md5sum ${SLURM_JOB_SCRIPT})"

declare -xr SCHEDULER_MODULE='slurm'

echo "${UNIX_TIME} ${SLURM_JOB_ID} ${SLURM_JOB_MD5SUM} ${SLURM_JOB_DEPENDENCY}" 
echo ""

cat "${SLURM_JOB_SCRIPT}"

module purge
module load "${SCHEDULER_MODULE}"
module list
. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"
shopt -s expand_aliases
source ~/.bashrc

declare -xr SPACK_PACKAGE='wannier90@3.1.0'
declare -xr SPACK_COMPILER='gcc@10.2.0'
declare -xr SPACK_VARIANTS='+shared'
declare -xr SPACK_DEPENDENCIES="^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % ${SPACK_COMPILER} ~ilp64 threads=none) ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % ${SPACK_COMPILER})"
declare -xr SPACK_SPEC="${SPACK_PACKAGE} % ${SPACK_COMPILER} ${SPACK_VARIANTS} ${SPACK_DEPENDENCIES}"

printenv

spack config get compilers
spack config get config
spack config get mirrors
spack config get modules
spack config get packages
spack config get repos
spack config get upstreams

time -p spack spec --long --namespaces --types --reuse "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
  echo 'ERROR: spack concretization failed.'
  exit 1
fi

time -p spack install --jobs "${SLURM_CPUS_PER_TASK}" --fail-fast --yes-to-all --reuse "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
  echo 'ERROR: spack install failed.'
  exit 1
fi

spack module lmod refresh -y

#sbatch --dependency="afterok:${SLURM_JOB_ID}" ''

sleep 30
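
The commented-out sbatch line near the end is a placeholder for chaining a follow-on spec build job once this one completes successfully. If used, it would look something like the following (hedged sketch; the downstream script name is purely hypothetical):

# Hypothetical follow-on build chained after a successful install (script name is illustrative only).
sbatch --dependency="afterok:${SLURM_JOB_ID}" 'another-package@1.0.0.sh'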


mkandes commented Aug 22, 2023

Then test the spec build job script ...

[mkandes@login01 specs]$ sbatch wannier90@3.1.0.sh 
Submitted batch job 24775410
[mkandes@login01 specs]$ squeue -u $USER
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
   24578139_[8-19]   compute tar-ilsv  mkandes PD       0:00      1 (JobArrayTaskLimit)
        24578139_7   compute tar-ilsv  mkandes  R   18:45:18      1 exp-1-17
          24775410 ind-share wannier9  mkandes  R       0:02      1 exp-15-40
[mkandes@login01 specs]$

Does the package build successfully?


mkandes commented Aug 22, 2023

Nope.

==> Installing wannier90-3.1.0-4vm3ixwon6xieneqqg2wkm4rlurcvkzg
==> No binary for wannier90-3.1.0-4vm3ixwon6xieneqqg2wkm4rlurcvkzg found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/40/40651a9832eb93dec20a8360dd535262c261c34e13c41b6755fa6915c936b254.tar.gz
==> No patches needed for wannier90
==> wannier90: Executing phase: 'edit'
==> wannier90: Executing phase: 'build'
==> Error: ProcessError: Command exited with status 2:
    'make' 'wannier' 'post' 'lib' 'w90chk2chk' 'w90vdw' 'dynlib'

37 errors found in build log:
     15     
     16      1246 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            double_complex, &
     17           |                      1
     18     ......
     19      1344 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            Integer, &
     20           |                      2
  >> 21     Error: Type mismatch between actual argument at (1) and actual argu
            ment at (2) (COMPLEX(8)/INTEGER(4)).
     22     ../comms.F90:1214:22:
     23     
     24      1214 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            double_precision, &
     25           |                      1
     26     ......
     27      1344 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            Integer, &
     28           |                      2
...

Need to try again. Maybe this package requires BLAS with +ilp64?
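
A retry along those lines might look something like the following (a sketch only; it assumes an +ilp64, threads=none build of openblas@0.3.18 with gcc@10.2.0 already exists in the instance):

spack spec -l wannier90@3.1.0 % gcc@10.2.0 +shared \
  ^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % gcc@10.2.0 +ilp64 threads=none) \
  ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % gcc@10.2.0)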


mkandes commented Aug 22, 2023

Nope. Still failing with the same set of compilation errors.

==> Installing wannier90-3.1.0-uc6whmozesahjnp4spoahpgnumh5pdpv
==> No binary for wannier90-3.1.0-uc6whmozesahjnp4spoahpgnumh5pdpv found: installing from source
==> Using cached archive: /home/mkandes/.spack/0.17.3/cpu/b/var/spack/cache/_source-cache/archive/40/40651a9832eb93dec20a8360dd535262c261c34e13c41b6755fa6915c936b254.
tar.gz
==> No patches needed for wannier90
==> wannier90: Executing phase: 'edit'
==> wannier90: Executing phase: 'build'
==> Error: ProcessError: Command exited with status 2:
    'make' 'wannier' 'post' 'lib' 'w90chk2chk' 'w90vdw' 'dynlib'

37 errors found in build log:
     15     
     16      1246 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            double_complex, &
     17           |                      1
     18     ......
     19      1344 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            Integer, &
     20           |                      2
  >> 21     Error: Type mismatch between actual argument at (1) and actual argu
            ment at (2) (COMPLEX(8)/INTEGER(4)).
     22     ../comms.F90:1214:22:
     23     
     24      1214 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            double_precision, &
     25           |                      1
     26     ......
     27      1344 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_
            Integer, &
     28           |                      2
  >> 29     Error: Type mismatch between actual argument at (1) and actual argu
            ment at (2) (REAL(8)/INTEGER(4)).
     30     ../comms.F90:1182:22:
...
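
For reference, these type-mismatch errors are the well-known GCC 10 behavior change: gfortran now treats mismatched argument types across calls to the same procedure (here, the MPI_scatterv calls in comms.F90) as errors rather than warnings. A common generic workaround, shown only as a hedged sketch and not necessarily what the upstream fix tried below does, is to relax the check by adding -fallow-argument-mismatch to the Fortran flags, e.g. directly on the spec:

# Sketch only: pass -fallow-argument-mismatch via spec-level fflags.
spack spec -l wannier90@3.1.0 % gcc@10.2.0 +shared fflags="-O2 -march=native -fallow-argument-mismatch"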


mkandes commented Aug 22, 2023

Perhaps the issue is with the version of OpenMPI?

[mkandes@login01 ~]$ module spider wannier90

----------------------------------------------------------------------------
  wannier90:
----------------------------------------------------------------------------
     Versions:
        wannier90/1.2-openblas
        wannier90/3.1.0-openblas

----------------------------------------------------------------------------
  For detailed information about a specific "wannier90" package (including how to load the modules) use the module's full name. Note that names that have a trailing (E) are extensions provided by other modules.
  For example:

     $ module spider wannier90/3.1.0-openblas
----------------------------------------------------------------------------

 

[mkandes@login01 ~]$ module spider wannier90/1.2-openblas

----------------------------------------------------------------------------
  wannier90: wannier90/1.2-openblas
----------------------------------------------------------------------------

    You will need to load all module(s) on any one of the lines below before the "wannier90/1.2-openblas" module is available to load.

      cpu/0.15.4  gcc/9.2.0
 
    Help:
      Wannier90 calculates maximally-localised Wannier functions (MLWFs).
      Wannier90 is released under the GNU General Public License.


 

[mkandes@login01 ~]$ module spider wannier90/3.1.0-openblas

----------------------------------------------------------------------------
  wannier90: wannier90/3.1.0-openblas
----------------------------------------------------------------------------

    You will need to load all module(s) on any one of the lines below before the "wannier90/3.1.0-openblas" module is available to load.

      cpu/0.15.4  gcc/9.2.0  openmpi/3.1.6
 
    Help:
      Wannier90 calculates maximally-localised Wannier functions (MLWFs).
      Wannier90 is released under the GNU General Public License.


 

[mkandes@login01 ~]$


mkandes commented Aug 22, 2023

If that is the case, we may not be able to deploy wannier90@3.1.0 in expanse/0.17.3/cpu/b, since, if I recall correctly, openmpi@3.1.6 can no longer be built against the version of mlnx_ofed we're now using.


mkandes commented Aug 22, 2023

Actually, it was a problem with Lustre integration.

==> Installing openmpi-3.1.6-m22tb3xif4qf6dl5nv3qkr5rc3wcirpv
==> No binary for openmpi-3.1.6-m22tb3xif4qf6dl5nv3qkr5rc3wcirpv found: installing from source
==> Warning: Expected user 527834 to own /scratch/spack_cpu, but it is owned by 0
==> Fetching https://mirror.spack.io/_source-cache/archive/50/50131d982ec2a516564d74d5616383178361c2f08fdd7d1202b80bdf66a0d279.tar.bz2
==> No patches needed for openmpi
==> openmpi: Executing phase: 'autoreconf'
==> openmpi: Executing phase: 'configure'
==> openmpi: Executing phase: 'build'
==> Error: ProcessError: Command exited with status 2:
    'make' '-j16' 'V=1'

1 error found in build log:
     13882    libtool: compile:  /cm/shared/apps/spack/0.17.3/cpu/b/lib/spack/e
              nv/gcc/gcc -DHAVE_CONFIG_H -I. -I../../../../opal/include -I../..
              /../../ompi/include -I../../../../oshmem/include -I../../../../op
              al/mca/hwloc/hwloc1117/hwloc/include/private/autogen -I../../../.
              ./opal/mca/hwloc/hwloc1117/hwloc/include/hwloc/autogen -I../../..
              /../ompi/mpiext/cuda/c -I../../../.. -I../../../../orte/include -
              I/cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/g
              cc-10.2.0/zlib-1.2.11-ws4iari52j2lphd52i7kd72yj37o32zt/include -I
              /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gc
              c-10.2.0/hwloc-1.11.13-rzxeveahbz2gyxymflgrf3zq3cauadpn/include -
              I/cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/g
              cc-10.2.0/libevent-2.1.8-bimlmtn2x74wxpfxjy6yioltrzjdmeio/include
               -I/cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2
              /gcc-10.2.0/hwloc-1.11.13-rzxeveahbz2gyxymflgrf3zq3cauadpn/includ
              e -O3 -DNDEBUG -finline-functions -fno-strict-aliasing -fexceptio
              ns -pthread -MT fs_lustre_file_get_size.lo -MD -MP -MF .deps/fs_l
              ustre_file_get_size.Tpo -c fs_lustre_file_get_size.c  -fPIC -DPIC
               -o .libs/fs_lustre_file_get_size.o
     13883    In file included from /usr/include/linux/fs.h:18,
     13884                     from /usr/include/linux/lustre/lustre_user.h:54,
     13885                     from /usr/include/lustre/lustreapi.h:46,
     13886                     from ../../../../ompi/mca/fs/lustre/fs_lustre.h:
              37,
     13887                     from fs_lustre.c:30:
  >> 13888    /usr/include/sys/mount.h:35:3: error: expected identifier before 
              numeric constant
     13889       35 |   MS_RDONLY = 1,  /* Mount read-only.  */
     13890          |   ^~~~~~~~~
     13891    fs_lustre.c: In function 'mca_fs_lustre_component_file_query':
     13892    fs_lustre.c:91:55: warning: passing argument 1 of 'mca_fs_base_ge
              t_fstype' discards 'const' qualifier from pointer target type [-W
              discarded-qualifiers]
     13893       91 |             fh->f_fstype = mca_fs_base_get_fstype ( fh->f
              _filename );
     13894          |                                                     ~~^~~
              ~~~~~~~~~

See build log for details:
  /scratch/spack_cpu/job_21730823/spack-stage/spack-stage-openmpi-3.1.6-m22tb3xif4qf6dl5nv3qkr5rc3wcirpv/spack-build-out.txt

==> Error: Terminating after first install failure: ProcessError: Command exited with status 2:
    'make' '-j16' 'V=1'
real 390.67
user 448.62
sys 1712.94
ERROR: spack install failed.

https://github.com/sdsc/spack/blob/sdsc-0.17.3/etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc%4010.2.0/openmpi%403.1.6.o21730823.exp-15-56


mkandes commented Aug 23, 2023

There appears to have been a fix applied to the wannier90 package upstream in spack/spack in December 2022 that may be related to the issue here: spack@8332a59

Let's try the latest version of the package in my local package repository.

[mkandes@login02 ~]$ cd .spack/0.17.3/cpu/b/var/spack/repos/mkandes/packages/
[mkandes@login02 packages]$ ls
spark
[mkandes@login02 packages]$ mkdir wannier90
[mkandes@login02 packages]$ cd wannier90/
[mkandes@login02 wannier90]$ ls
[mkandes@login02 wannier90]$ wget https://raw.githubusercontent.com/spack/spack/e0059ef9613cd7a1a77611aa0957a60acc82582e/var/spack/repos/builtin/packages/wannier90/make.sys
--2023-08-23 12:42:25--  https://raw.githubusercontent.com/spack/spack/e0059ef9613cd7a1a77611aa0957a60acc82582e/var/spack/repos/builtin/packages/wannier90/make.sys
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.111.133, 185.199.110.133, 185.199.108.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.111.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 84 [text/plain]
Saving to: ‘make.sys’

make.sys            100%[===================>]      84  --.-KB/s    in 0s      

2023-08-23 12:42:25 (5.91 MB/s) - ‘make.sys’ saved [84/84]

[mkandes@login02 wannier90]$ wget https://raw.githubusercontent.com/spack/spack/e0059ef9613cd7a1a77611aa0957a60acc82582e/var/spack/repos/builtin/packages/wannier90/package.py
--2023-08-23 12:42:40--  https://raw.githubusercontent.com/spack/spack/e0059ef9613cd7a1a77611aa0957a60acc82582e/var/spack/repos/builtin/packages/wannier90/package.py
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.110.133, 185.199.108.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 7385 (7.2K) [text/plain]
Saving to: ‘package.py’

package.py          100%[===================>]   7.21K  --.-KB/s    in 0s      

2023-08-23 12:42:41 (111 MB/s) - ‘package.py’ saved [7385/7385]

[mkandes@login02 wannier90]$
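
Before re-running spack spec, confirm that the personal package repository is registered ahead of the sdsc and builtin repos, since repositories are searched in order and the first matching package.py wins:

# The mkandes repo should be listed before sdsc and builtin for its wannier90 to be picked up.
spack repo list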


mkandes commented Aug 23, 2023

Check that the new package used in the concretization comes from my namespace.

[mkandes@login02 ~]$ spack spec -lN wannier90@3.1.0 % gcc@10.2.0 +shared ^openblas@0.3.18/$(spack find --format '{hash:7}' openblas@0.3.18 % gcc@10.2.0 ~ilp64 threads=none) ^openmpi@4.1.3/$(spack find --format '{hash:7}' openmpi@4.1.3 % gcc@10.2.0)
Input spec
--------------------------------
.wannier90@3.1.0%gcc@10.2.0+shared
    ^builtin.openblas@0.3.18%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-zen2
    ^sdsc.openmpi@4.1.3%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=none fabrics=ucx schedulers=slurm arch=linux-rocky8-zen2
        ^builtin.hwloc@2.6.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-zen2
            ^builtin.libpciaccess@0.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
            ^builtin.libxml2@2.9.12%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-zen2
                ^builtin.libiconv@1.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  libs=shared,static arch=linux-rocky8-zen2
                ^builtin.xz@5.2.5%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-zen2
                ^builtin.zlib@1.2.11%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-zen2
            ^builtin.ncurses@6.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-zen2
        ^builtin.libevent@2.1.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-zen2
        ^builtin.lustre@2.15.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
        ^builtin.numactl@2.0.14%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-zen2
        ^builtin.pmix@3.2.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-zen2
        ^builtin.slurm@21.08.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-zen2
        ^builtin.ucx@1.10.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma~cuda+dc~debug+dm~gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=none arch=linux-rocky8-zen2
            ^builtin.rdma-core@43.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-zen2

Concretized
--------------------------------
4x2owia  mkandes.wannier90@3.1.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +shared arch=linux-rocky8-zen2
fgk2tlu      ^builtin.openblas@0.3.18%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-zen2
oq3qvsv      ^sdsc.openmpi@4.1.3%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=none fabrics=ucx schedulers=slurm arch=linux-rocky8-zen2
7rqkdv4          ^builtin.hwloc@2.6.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-zen2
ykynzrw              ^builtin.libpciaccess@0.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
mgovjpj              ^builtin.libxml2@2.9.12%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-zen2
zduoj2d                  ^builtin.libiconv@1.16%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  libs=shared,static arch=linux-rocky8-zen2
paz7hxz                  ^builtin.xz@5.2.5%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-zen2
ws4iari                  ^builtin.zlib@1.2.11%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-zen2
5lhvslt              ^builtin.ncurses@6.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-zen2
bimlmtn          ^builtin.libevent@2.1.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-zen2
fy2cjdg          ^builtin.lustre@2.15.2%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
ckhyr5e          ^builtin.numactl@2.0.14%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-zen2
dpvrfip          ^builtin.pmix@3.2.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-zen2
4kvl3fd          ^builtin.slurm@21.08.8%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-zen2
dnpjjuc          ^builtin.ucx@1.10.1%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma~cuda+dc~debug+dm~gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=none arch=linux-rocky8-zen2
xjr3cuj              ^builtin.rdma-core@43.0%gcc@10.2.0 cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-zen2

[mkandes@login02 ~]$


mkandes commented Aug 23, 2023

That did the trick! Success!

==> Installing wannier90-3.1.0-6vyb54qdivtduxyzffyshioyaooyouxg
==> No binary for wannier90-3.1.0-6vyb54qdivtduxyzffyshioyaooyouxg found: installing from source
==> Using cached archive: /home/mkandes/.spack/0.17.3/cpu/b/var/spack/cache/_source-cache/archive/40/40651a9832eb93dec20a8360dd535262c261c34e13c41b6755fa6915c936b254.tar.gz
==> No patches needed for wannier90
==> wannier90: Executing phase: 'edit'
==> wannier90: Executing phase: 'build'
==> wannier90: Executing phase: 'install'
==> wannier90: Successfully installed wannier90-3.1.0-6vyb54qdivtduxyzffyshioyaooyouxg
  Fetch: 1.51s.  Build: 2m 23.51s.  Total: 2m 25.02s.
[+] /home/mkandes/.spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/wannier90-3.1.0-6vyb54qdivtduxyzffyshioyaooyouxg
real 181.85
user 53.27
sys 2.52
==> Regenerating lmod module files
[mkandes@login02 specs]$ module avail

---- /home/mkandes/.spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64 ----
   gcc/10.2.0/hadoop/3.3.0/bfyu354
   gcc/10.2.0/py-numpy/1.21.3/53ovhtc
   gcc/10.2.0/spark/3.4.0/po6mvtn
   intel-mpi/2019.10.317-kdx4qap/gcc/10.2.0/elpa/2021.05.001/fyc633z
   openmpi/4.1.3-oq3qvsv/gcc/10.2.0/elpa/2021.05.001/eq4324u
   openmpi/4.1.3-oq3qvsv/gcc/10.2.0/wannier90/3.1.0/6vyb54q


mkandes commented Aug 24, 2023

We don't actually have any standalone tests for wannier90 that I can see.

[mkandes@login02 software]$ cp -rp /cm/shared/examples/sdsc/wannier90/
qe6.7_wannier1.2/     vasp5.4.4_wannier1.2/ 
[mkandes@login02 software]$ cp -rp /cm/shared/examples/sdsc/wannier90/vasp5.4.4_wannier1.2/Si_bandstructure_GW/
INCAR.step1               KPOINTS                   vasp-shared-wannier90.sb
INCAR.step2               POSCAR                    wannier90.win
INCAR.step3               POTCAR                    

As such, we will deploy it into production and leave further testing for later.


mkandes commented Aug 24, 2023

Resync my personal fork of the sdsc/spack repo with upstream, then create a new branch for the updated Spack package and the wannier90 spec build script.

[mkandes@login01 mkandes]$ git fetch --tags upstream
remote: Enumerating objects: 25, done.
remote: Counting objects: 100% (9/9), done.
remote: Total 25 (delta 8), reused 8 (delta 8), pack-reused 16
Unpacking objects: 100% (25/25), 4.95 KiB | 2.00 KiB/s, done.
From https://github.com/sdsc/spack
   e9bb2189ca..67e74f0e08  sdsc-0.17.3 -> upstream/sdsc-0.17.3
[mkandes@login01 mkandes]$ git log
commit 7e3d0453067e6d31c67922c47683148a9c5192e6 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Mon Jul 10 17:11:31 2023 -0700

    Add custom spark package to sdsc package repo in sdsc-0.17.3
    
    The custom change here is simply to include the latest version of
    Spark at the time of this writing, which is Spark v3.4.0, and the sha256
    hash associated with its downloadable tarball from the Spark project.

commit e9bb2189ca01894c673113228dd061cc046bc948
Merge: c33518253b 90f124605b
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Mon Jul 10 16:26:01 2023 -0700

    Merge pull request #89 from mkandes/sdsc-0.17.3
    
    Add spark@3.4.0 % gcc@10.2.0 to expanse/0.17.3/cpu/b

commit 90f124605b39fb79339dd2efe492ec1437b20b79
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Mon Jul 10 16:18:34 2023 -0700
[mkandes@login01 mkandes]$ git merge upstream/sdsc-0.17.3
Updating 7e3d045306..67e74f0e08
Fast-forward
 ...8.exp-15-02 => spark@3.4.0.o23942559.exp-15-56} | 191 ++++++++++-----------
 .../0.17.3/cpu/b/specs/gcc@10.2.0/spark@3.4.0.sh   |   2 +-
 .../sdsc/expanse/0.17.3/cpu/b/yamls/modules.yaml   |   2 +-
 3 files changed, 94 insertions(+), 101 deletions(-)
 rename etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/{spark@3.4.0.o22902088.exp-15-02 => spark@3.4.0.o23942559.exp-15-56} (72%)
[mkandes@login01 mkandes]$ git log
commit 67e74f0e081b45fa82d7ff097a03e4cd5b3757ea (HEAD -> sdsc-0.17.3, upstream/sdsc-0.17.3)
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Fri Jul 14 18:23:11 2023 -0700

    Update modules.yaml for expanse/0.17.3/cpu/b to whitelist hadoop

commit bd28762c3a74ea7b511b6c4abbedb3ab50b6ec48
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Tue Jul 11 16:56:09 2023 -0700

    Deploy spark@3.4.0 % gcc@10.2.0 into prod within expanse/0.17.3/cpu/b

commit 553657e0d4e92fb543587583665818cef36d4b94
Merge: e9bb2189ca 7e3d045306
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Mon Jul 10 17:15:28 2023 -0700

    Merge pull request #90 from mkandes/sdsc-0.17.3
    
    Add custom spark package to sdsc package repo in sdsc-0.17.3

commit 7e3d0453067e6d31c67922c47683148a9c5192e6 (origin/sdsc-0.17.3, origin/HEAD)
[mkandes@login01 mkandes]$ git push
Username for 'https://github.com': mkandes
Password for 'https://mkandes@github.com': 
Enumerating objects: 51, done.
Counting objects: 100% (39/39), done.
Delta compression using up to 64 threads
Compressing objects: 100% (17/17), done.
Writing objects: 100% (25/25), 9.89 KiB | 4.95 MiB/s, done.
Total 25 (delta 9), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (9/9), completed with 6 local objects.
To https://github.com/mkandes/spack.git
   7e3d045306..67e74f0e08  sdsc-0.17.3 -> sdsc-0.17.3
[mkandes@login01 mkandes]$ git branch sdsc-0.17.3-gh-46-pkg-spec-wannier90
[mkandes@login01 mkandes]$ git branch
* sdsc-0.17.3
  sdsc-0.17.3-gh-46-pkg-spec-wannier90
[mkandes@login01 mkandes]$ git checkout sdsc-0.17.3-gh-46-pkg-spec-wannier90
Switched to branch 'sdsc-0.17.3-gh-46-pkg-spec-wannier90'
[mkandes@login01 mkandes]$ ls
bin              DEPLOYMENT.md   LICENSE-MIT     README.md
CHANGELOG.md     etc             NOTICE          SECURITY.md
CONTRIBUTING.md  lib             pyproject.toml  share
COPYRIGHT        LICENSE-APACHE  pytest.ini      var
[mkandes@login01 mkandes]$


mkandes commented Aug 24, 2023

Pull request created and merged. #94


mkandes commented Aug 24, 2023

Now it's time to pull the changes from the sdsc/spack repo into the production expanse/0.17.3/cpu/b instance and deploy wannier90 as specified by the spec build script.

Pull changes first.

[mkandes@login02 ~]$ !946
sudo -u spack_cpu ssh spack_cpu@login.expanse.sdsc.edu
PIN+Yubi: 
Welcome to Bright release         9.0

                                                         Based on Rocky Linux 8
                                                                    ID: #000002

--------------------------------------------------------------------------------

                                 WELCOME TO
                  _______  __ ____  ___    _   _______ ______
                 / ____/ |/ // __ \/   |  / | / / ___// ____/
                / __/  |   // /_/ / /| | /  |/ /\__ \/ __/
               / /___ /   |/ ____/ ___ |/ /|  /___/ / /___
              /_____//_/|_/_/   /_/  |_/_/ |_//____/_____/

--------------------------------------------------------------------------------

Use the following commands to adjust your environment:

'module avail'            - show available modules
'module add <module>'     - adds a module to your environment for this session
'module initadd <module>' - configure module to be loaded at every login

-------------------------------------------------------------------------------
[spack_cpu@login02 ~]$ srun --partition=ind-shared --reservation=root_73  --account=use300 --nodes=1 --nodelist=exp-15-56 --ntasks-per-node=1 --cpus-per-task=16 --mem=32G --time=12:00:00 --pty --wait=0 /bin/bash
[spack_cpu@exp-15-56 ~]$ cd /cm/shared/apps/spack/0.17.3/cpu/b/
[spack_cpu@exp-15-56 b]$ ls
bin              DEPLOYMENT.md   LICENSE-MIT     pytest.ini   var
CHANGELOG.md     etc             NOTICE          README.md
CONTRIBUTING.md  lib             opt             SECURITY.md
COPYRIGHT        LICENSE-APACHE  pyproject.toml  share
[spack_cpu@exp-15-56 b]$ git log
commit 67e74f0e081b45fa82d7ff097a03e4cd5b3757ea (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Fri Jul 14 18:23:11 2023 -0700

    Update modules.yaml for expanse/0.17.3/cpu/b to whitelist hadoop

commit bd28762c3a74ea7b511b6c4abbedb3ab50b6ec48
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Tue Jul 11 16:56:09 2023 -0700

    Deploy spark@3.4.0 % gcc@10.2.0 into prod within expanse/0.17.3/cpu/b

commit 553657e0d4e92fb543587583665818cef36d4b94
Merge: e9bb2189ca 7e3d045306
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Mon Jul 10 17:15:28 2023 -0700

    Merge pull request #90 from mkandes/sdsc-0.17.3
    
    Add custom spark package to sdsc package repo in sdsc-0.17.3

commit 7e3d0453067e6d31c67922c47683148a9c5192e6
[spack_cpu@exp-15-56 b]$ git status
On branch sdsc-0.17.3
Your branch is up to date with 'origin/sdsc-0.17.3'.

Untracked files:
  (use "git add <file>..." to include in what will be committed)
	etc/spack/compilers.yaml
	etc/spack/licenses/aocc/
	etc/spack/licenses/intel/
	etc/spack/modules.yaml
	etc/spack/packages.yaml

nothing added to commit but untracked files present (use "git add" to track)
[spack_cpu@exp-15-56 b]$ git stash
No local changes to save
[spack_cpu@exp-15-56 b]$ git pull
remote: Enumerating objects: 21, done.
remote: Counting objects: 100% (21/21), done.
remote: Compressing objects: 100% (9/9), done.
remote: Total 21 (delta 7), reused 20 (delta 7), pack-reused 0
Unpacking objects: 100% (21/21), 5.24 KiB | 191.00 KiB/s, done.
From https://github.com/sdsc/spack
   67e74f0e08..56ae69a027  sdsc-0.17.3 -> origin/sdsc-0.17.3
Updating 67e74f0e08..56ae69a027
Fast-forward
 .../gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.sh    |  76 ++++++++
 var/spack/repos/sdsc/packages/wannier90/make.sys   |   7 +
 var/spack/repos/sdsc/packages/wannier90/package.py | 199 +++++++++++++++++++++
 3 files changed, 282 insertions(+)
 create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.sh
 create mode 100644 var/spack/repos/sdsc/packages/wannier90/make.sys
 create mode 100644 var/spack/repos/sdsc/packages/wannier90/package.py
[spack_cpu@exp-15-56 b]$ git stash pop
No stash entries found.
[spack_cpu@exp-15-56 b]$ git log
commit 56ae69a027390e8f6d80ca45f1f631a266aba598 (HEAD -> sdsc-0.17.3, origin/sdsc-0.17.3, origin/HEAD)
Merge: 67e74f0e08 f7e36a2601
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Wed Aug 23 18:46:34 2023 -0700

    Merge pull request #94 from mkandes/sdsc-0.17.3-gh-46-pkg-spec-wannier90
    
    Add wannier90@3.1.0 % gcc@10.2.0 ^openmpi@4.1.3 to expanse/0.17.3/cpu/b

commit f7e36a26016958936be369531345fb312bc4512a
Author: Marty Kandes <mkandes@sdsc.edu>
Date:   Wed Aug 23 18:37:25 2023 -0700

    Add wannier90@3.1.0 % gcc@10.2.0 ^openmpi@4.1.3 to expanse/0.17.3/cpu/b
    
    Also included is an updated package.py file for wannier90 in the custom
    sdsc package repo. This updated version is based on wannier90 package.py
    file available in the spack/spack upstream on the date of this commite.
    This fixes a build issue encountered with the builtins version available
    in Spack v0.17.3. The fix was provided in this commit [1].
    
[spack_cpu@exp-15-56 b]$

And then run the spec build script.

[spack_cpu@exp-15-56 b]$ cd etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/
[spack_cpu@exp-15-56 openmpi@4.1.3]$ sbatch wannier90@3.1.0.sh 
Submitted batch job 24808571
[spack_cpu@exp-15-56 openmpi@4.1.3]$ squeue -u $USER
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
          24808571 ind-share wannier9 spack_cp  R       0:05      1 exp-15-56
          24808493 ind-share     bash spack_cp  R       9:39      1 exp-15-56
[spack_cpu@exp-15-56 openmpi@4.1.3]$


mkandes commented Aug 24, 2023

Deployment successful.

[mkandes@login01 ~]$ module spider wannier90/3.1.0

----------------------------------------------------------------------------
  wannier90/3.1.0: wannier90/3.1.0/wrzoklo
----------------------------------------------------------------------------

     Other possible modules matches:
        openmpi/4.1.3-oq3qvsv/gcc/10.2.0/wannier90/3.1.0

    You will need to load all module(s) on any one of the lines below before the "wannier90/3.1.0/wrzoklo" module is available to load.

      cpu/0.17.3b  gcc/10.2.0/npcyll4  openmpi/4.1.3/oq3qvsv
 
    Help:
      Wannier90 calculates maximally-localised Wannier functions (MLWFs).
      Wannier90 is released under the GNU General Public License.
      

----------------------------------------------------------------------------
  To find other possible module matches execute:

      $ module -r spider '.*wannier90/3.1.0.*'

[mkandes@login01 ~]$

Committing the spec build script's standard output back to the sdsc/spack repo for the record.

[spack_cpu@exp-15-56 openmpi@4.1.3]$ git status
On branch sdsc-0.17.3
Your branch is up to date with 'origin/sdsc-0.17.3'.

Untracked files:
  (use "git add <file>..." to include in what will be committed)
	../../../../../../../../compilers.yaml
	../../../../../../../../licenses/aocc/
	../../../../../../../../licenses/intel/
	../../../../../../../../modules.yaml
	../../../../../../../../packages.yaml
	wannier90@3.1.0.o24808571.exp-15-56

nothing added to commit but untracked files present (use "git add" to track)
[spack_cpu@exp-15-56 openmpi@4.1.3]$ git add wannier90@3.1.0.o24808571.exp-15-56
[spack_cpu@exp-15-56 openmpi@4.1.3]$ git commit
[sdsc-0.17.3 2b4e0bc03b] Deploy wannier90@3.1.0 % gcc@10.2.0 ^openmpi@4.1.3 into exp/0.17.3/cpu/b
 1 file changed, 689 insertions(+)
 create mode 100644 etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc@10.2.0/openmpi@4.1.3/wannier90@3.1.0.o24808571.exp-15-56
[spack_cpu@exp-15-56 openmpi@4.1.3]$ git push
Username for 'https://github.com': mkandes
Password for 'https://mkandes@github.com': 
Enumerating objects: 24, done.
Counting objects: 100% (24/24), done.
Delta compression using up to 128 threads
Compressing objects: 100% (9/9), done.
Writing objects: 100% (13/13), 9.97 KiB | 9.97 MiB/s, done.
Total 13 (delta 5), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (5/5), completed with 5 local objects.
To https://github.com/sdsc/spack.git
   56ae69a027..2b4e0bc03b  sdsc-0.17.3 -> sdsc-0.17.3
[spack_cpu@exp-15-56 openmpi@4.1.3]$


mkandes commented Aug 24, 2023

Ready for testing and/or to close. 2b4e0bc
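
When that testing does happen, a minimal smoke test might look something like the following (a sketch only; module names are taken from the module spider output above, and the example input, here called silicon, is assumed to come from one of the Wannier90 tutorial examples):

# Within a batch or interactive job allocation on Expanse.
module purge
module load cpu/0.17.3b gcc/10.2.0/npcyll4 openmpi/4.1.3/oq3qvsv wannier90/3.1.0/wrzoklo
srun --mpi=pmi2 --ntasks=4 wannier90.x silicon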

mkandes closed this as completed Aug 25, 2023