Update the gdas.cd hash and enable GDASApp to run on WCOSS2 #3220

Draft
wants to merge 3 commits into base: develop
4 changes: 4 additions & 0 deletions env/WCOSS2.env
@@ -13,6 +13,10 @@ step=$1
export launcher="mpiexec -l"
export mpmd_opt="--cpu-bind verbose,core cfp"

# Add path to GDASApp libraries
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:${HOMEgfs}/sorc/gdas.cd/build/lib"
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/opt/cray/pe/mpich/8.1.19/ofi/intel/19.0/lib"

Comment on lines +16 to +19
Contributor


This is fine for now, but will not be acceptable for implementation. I hope there is a more robust solution than this by that time.

More importantly, this has an impact on every executable in every job -- not just GDASApp executables.

Contributor Author


I completely agree. I strongly dislike these two lines. They are temporary patches to allow GFS v17 testing and development to continue on WCOSS2.

The line

export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:${HOMEgfs}/sorc/gdas.cd/build/lib"

was added because craype/2.7.17 adds

-static-libgcc -static-libstdc++ -Bstatic -lstdc++ -Bdynamic -lm -lpthread

to the ftn command. GDASApp executables failed because they could not find JEDI libraries. Might the addition of a GDASApp install option (something we must have) resolve this problem?
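
For illustration only, a sketch of what an install step with the RPATH baked in might look like, assuming the GDASApp build honors the standard CMake RPATH variables (the install prefix and paths below are hypothetical, not the current build procedure):

```bash
# Hypothetical install step -- not the current GDASApp build procedure.
# CMAKE_INSTALL_RPATH embeds the library search path in the installed
# executables, so the LD_LIBRARY_PATH export above would not be needed.
cmake -S sorc/gdas.cd -B sorc/gdas.cd/build \
  -DCMAKE_INSTALL_PREFIX="${HOMEgfs}/install/gdas" \
  -DCMAKE_INSTALL_RPATH="${HOMEgfs}/install/gdas/lib" \
  -DCMAKE_INSTALL_RPATH_USE_LINK_PATH=ON
cmake --build sorc/gdas.cd/build -j 8
cmake --install sorc/gdas.cd/build
```

With the RPATH recorded in the binaries, the JEDI shared libraries would be found at run time without modifying the environment for every job.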

Another concern with the added ftn options is the following warning, found in build_gdas.log:

icpc: warning #10315: specifying -lm before files may supersede the Intel(R) math library and affect performance
ifort: warning #10315: specifying -lm before files may supersede the Intel(R) math library and affect performance

It would be unfortunate if default compiler options resulted in degraded code performance.
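
For reference, one way to confirm exactly which flags the craype wrapper injects, and in what order, is the wrapper's verbose option (sketch only; assumes a PrgEnv-intel/craype environment, and hello.f90 is just a placeholder source file):

```bash
# Print the underlying compiler invocation and pull out the injected
# static-link and math-library flags; hello.f90 is a placeholder source.
ftn -craype-verbose hello.f90 2>&1 | tr ' ' '\n' \
  | grep -E '^-static|^-B|^-lm$|^-lstdc\+\+$'
```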

The line

export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/opt/cray/pe/mpich/8.1.19/ofi/intel/19.0/lib"

was recommended by GDIT. GDASApp testing identified inconsistencies across system modules. Some GDASApp executables failed with undefined symbol messages for MPI routines. GDIT is working on a solution.
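
For anyone hitting the same failures, a quick diagnostic (the executable name here is a placeholder) is to check which MPI library the binary resolves against and which MPI symbols remain undefined:

```bash
# Show which libmpi the dynamic loader picks up with the current environment,
# then list the undefined MPI symbols in the executable.
exe="${HOMEgfs}/sorc/gdas.cd/build/bin/fv3jedi_var.x"   # placeholder executable
ldd "${exe}" | grep -i mpi
nm -D --undefined-only "${exe}" | grep -i ' mpi_'
```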

# Calculate common resource variables
# Check first if the dependent variables are set
if [[ -n "${ntasks:-}" && -n "${max_tasks_per_node:-}" && -n "${tasks_per_node:-}" ]]; then
2 changes: 1 addition & 1 deletion sorc/gdas.cd
Submodule gdas.cd updated 53 files
+24 −0 .github/pull_request_template.md
+6 −0 ci/ci_tests.sh
+123 −47 ci/driver.sh
+0 −169 ci/gw_driver.sh
+1 −0 ci/hera.sh
+13 −0 ci/hercules.sh
+3 −1 ci/orion.sh
+64 −10 ci/run_ci.sh
+0 −80 ci/run_gw_ci.sh
+41 −32 ci/stable_driver.sh
+68 −31 modulefiles/GDAS/wcoss2.intel.lua
+3 −9 parm/aero/jcb-base.yaml.j2
+6 −10 parm/io/fv3jedi_fieldmetadata_history.yaml
+1 −1 parm/jcb-algorithms
+1 −1 parm/jcb-gdas
+10 −2 parm/soca/marine-jcb-base.yaml
+3 −0 parm/soca/obsprep/obsprep_config.yaml
+1 −1 prototypes/gen_prototype.sh
+1 −1 sorc/da-utils
+1 −1 sorc/fv3-jedi
+1 −1 sorc/ioda
+1 −1 sorc/iodaconv
+1 −1 sorc/oops
+1 −1 sorc/saber
+1 −1 sorc/soca
+1 −1 sorc/ufo
+1 −1 sorc/vader
+3 −3 test/atm/global-workflow/CMakeLists.txt
+5 −4 test/atm/global-workflow/jjob_ens_init_split.sh
+8 −4 test/gw-ci/CMakeLists.txt
+3 −0 test/gw-ci/create_exp.sh
+17 −17 test/testreference/C96C48_ufs_hybatmDA_3dvar-fv3inc.ref
+34 −34 test/testreference/C96C48_ufs_hybatmDA_3dvar.ref
+4 −4 test/testreference/C96C48_ufs_hybatmDA_lgetkf_observer.ref
+8 −8 test/testreference/C96C48_ufs_hybatmDA_lgetkf_solver.ref
+30 −33 test/testreference/atm_jjob_3dvar.ref
+65 −65 test/testreference/atm_jjob_lgetkf.ref
+29 −29 test/testreference/atm_jjob_lgetkf_observer.ref
+48 −48 test/testreference/atm_jjob_lgetkf_solver.ref
+2 −0 ush/module-setup.sh
+37 −20 ush/soca/prep_ocean_obs.py
+19 −12 ush/soca/prep_ocean_obs_utils.py
+1 −1 utils/chem/chem_diagb.h
+1 −1 utils/fv3jedi/fv3jedi_fv3inc.h
+1 −1 utils/ioda_example/gdas_meanioda.h
+1 −1 utils/land/land_ensrecenter.h
+1 −1 utils/obsproc/applications/gdas_obsprovider2ioda.h
+1 −1 utils/soca/gdas_ens_handler.h
+14 −8 utils/soca/gdas_incr_handler.h
+26 −18 utils/soca/gdas_postprocincr.h
+1 −1 utils/soca/gdas_soca_diagb.h
+1 −1 utils/soca/gdas_socahybridweights.h
+1 −1 utils/soca/gdassoca_obsstats.h
2 changes: 2 additions & 0 deletions ush/module-setup.sh
@@ -51,6 +51,8 @@ elif [[ ${MACHINE_ID} = s4* ]] ; then

elif [[ ${MACHINE_ID} = wcoss2 ]]; then
# We are on WCOSS2
# Ignore default modules of the same version lower in the search path (req'd by spack-stack)
export LMOD_TMOD_FIND_FIRST=yes
module reset

elif [[ ${MACHINE_ID} = cheyenne* ]] ; then
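
The LMOD_TMOD_FIND_FIRST setting makes Lmod take the first matching module in MODULEPATH order rather than the highest version found anywhere on the path, which is what the spack-stack module tree relies on. A rough sketch of the intended effect (the module tree path and module name are placeholders):

```bash
export LMOD_TMOD_FIND_FIRST=yes
module reset
# With find-first behavior, a versionless "module load" resolves from the
# first tree on MODULEPATH that provides the module, so the spack-stack
# copy wins over a default module lower in the search path.
module use /path/to/spack-stack/modulefiles/Core   # placeholder path
module load cmake                                   # placeholder module name
```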