From f96bdb420951d01eef80ba829934a26c2ac60f2c Mon Sep 17 00:00:00 2001
From: "Judy.K.Henderson"
Date: Thu, 18 May 2023 00:31:32 +0000
Subject: [PATCH] Merge with 01May23 NOAA-EMC/develop branch
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Squashed commit of the following:

commit 608ff650fbc902114203a8d03893e3d26fa05d70
Author: Kate Friedman
Date: Mon May 1 22:19:39 2023 -0400

Remove gdas bump fix files (#1553)

* Remove bump_ver from versions/fix.ver - No longer need a bump version variable in fix.ver.

Refs #1552

commit 810071bff4878b16e5b70113fff990a1b1e80a4f
Author: Walter Kolczynski - NOAA
Date: Mon May 1 22:18:31 2023 -0400

Remove remnant WAVE_RUN from archive (#1556)

`$WAVE_RUN` is no longer used in the workflow, but one reference remained in the archive job, which would cause failures. The conditional did not need to be replaced by `$RUN` since `$RUN` has already been checked at that point.

Fixes #1548

commit f2ea92bf7345be1669677affab401bf25dd917ff
Author: Walter Kolczynski - NOAA
Date: Mon May 1 17:08:30 2023 -0400

Update UFS to develop as of 2023 Apr 17 (#1509)

Updates the UFS model hash to the version as of 2023 Apr 17. Some associated changes accompany this update:
- Restart filenames for MOM6 (ufs-community/ufs-weather-model#1599)
- Remove store coriolis setting from MOM6 namelist (ufs-community/ufs-weather-model#1599)
- Change in atm 'log' file names (ufs-community/ufs-weather-model#1704)
- Additions to diag_table for frozen species (ufs-community/ufs-weather-model#1529)
- Restart quilting (ufs-community/ufs-weather-model#1633)
- Update to post itag (ufs-community/ufs-weather-model#1690)

The switch to restart quilting adds the constraint that write-group sizes be divisible by the number of tiles, so all were increased to the next multiple of 6. In the process of updating the diag tables, unused tables were removed.
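The write-group adjustment above amounts to rounding each group size up to the next multiple of the six cubed-sphere tiles. A hypothetical helper (not part of the workflow) illustrating the arithmetic:

```shell
#!/usr/bin/env bash
# Hypothetical helper: round a write-group task count up to the next
# multiple of the 6 cubed-sphere tiles, as restart quilting requires.
next_tile_multiple() {
  local n=${1}
  echo $(( ( (n + 5) / 6 ) * 6 ))
}

next_tile_multiple 8    # prints 12
next_tile_multiple 12   # prints 12
```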
Closes #1279 as moot
Closes #1445
Closes #1499
Partially addresses #1277

commit 6d3ed8ac4b71759fbace38f3dd544f7ee018b821
Author: Guillaume Vernieres
Date: Mon May 1 14:36:01 2023 -0400

Adapt the marine DA to the new COM structure (#1554)

commit 6c48e94b4f3c7cf180cd443a13b957c8ab87ab4c
Author: Cory Martin
Date: Mon May 1 02:02:37 2023 -0400

Update aerosol DA to use new COM structure (#1551)

This PR updates the j-jobs and python classes for aerosol DA to use the new COM directory structure. It also removes the chem history staging for the ICSDIR in setup_expt.py. The aerosol fields are treated as FV3 tracers, so they are either available (warm start) with the fv_tracer files or start at 0 (cold start) and must be spun up from emissions.

Fixes #1516

commit ec2dd3ab8fb78c4e0aff7d81cb64095fe6130ad6
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Fri Apr 28 14:45:09 2023 -0400

Update UFSDA ATM ens for new COM directory structure (#1538)

g-w PR #1421 changed the GFS COM directory structure. This PR updates the UFSDA ATM ensemble analysis jobs and python script to work with the updated GFS COM directory structure.

Fixes #1518

commit 406d6900963ca9c5152ef4b84de0d2840b0c2fd3
Author: Rahul Mahajan
Date: Fri Apr 28 12:55:05 2023 -0400

Run an ensemble forecast of the coupled model (#1545)

commit ff37168eddd52e00b288cde4bdbdb1927369abe6
Author: Walter Kolczynski - NOAA
Date: Fri Apr 28 12:53:59 2023 -0400

Fix ocean anl path in staging (#1544)

The path for ocean analysis files was not properly updated after analysis was moved out of model_data into its own directory.

commit 8506ec6977c53018a100fd4881c296016a7df630
Author: Cory Martin
Date: Thu Apr 27 15:48:47 2023 -0400

Two minor bugfixes (#1542)

Found some odd bugs in the aerosol DA changes that need to be fixed to work properly. Not sure how they worked before with testing (by luck?), but these are straightforward fixes to implement.
One has a missing `/` in a path for FileHandler, and the other has a missing `.nc`, causing files to not be found.

commit 5f66da919c2525e189a169bad1f3fcbab3a64739
Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com>
Date: Thu Apr 27 14:04:55 2023 -0400

Move guts of ocean analysis post out of j-job (#1539)

Moves most of the content of JGDAS_GLOBAL_OCEAN_ANALYSIS_POST to scripts/exgdas_global_marine_analysis_post.py in GDASApp; the j-job now just calls that script.

Addresses first bullet of #1480

commit 3dd6bbe7a77145ab31d00bd4c23af7649353cbd9
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Wed Apr 26 17:02:05 2023 -0400

Update UFSDA ATM anl for new COM directory structure (#1537)

g-w PR #1421 changed the GFS COM directory structure. This PR updates the UFSDA ATM variational analysis jobs and python scripts to work with the updated GFS COM directory structure.

Fixes #1517

commit b5d173f6a95ccc5df55aecb9dc490fffb79b26ca
Author: Walter Kolczynski - NOAA
Date: Wed Apr 26 13:39:45 2023 -0400

Fix incorrect ocean history tmpl during workflow generation (#1533)

The trailing '_TMPL' was accidentally omitted from the ocean history template used during workflow generation.

Refs: #1532

commit 789ae97a1210117b9b629c4a47d6cfa999b551ab
Author: Walter Kolczynski - NOAA
Date: Wed Apr 26 13:39:31 2023 -0400

Fix ocean staging from flat structure (#1531)

A typo led to the same variable being defined twice instead of the variable that was supposed to be defined.

Fixes #1530

commit 1e4a24c35088a3db6348156c15bbc81b6f814e40
Author: Guillaume Vernieres
Date: Wed Apr 26 00:17:56 2023 -0400

Fix warm-start IC staging (#1529)

The incorrect path was used for staging coupled components during the COM refactor update (#1421). These are now corrected.
Fixes #1528

commit 7421d805e6f50a59cd0be611d3ad7568ae603985
Author: Walter Kolczynski - NOAA
Date: Tue Apr 25 15:37:04 2023 -0400

Quiet generate_com (#1526)

Turns off trace for the duration of the generate_com function unless DEBUG_WORKFLOW is set to something other than "NO" (the default). In its place, the function now echoes the assignment.

Closes #1524

commit 23e6cc22a456c4a4294216037673473db0a144af
Author: Guillaume Vernieres
Date: Tue Apr 25 11:45:16 2023 -0400

Add ocnanlvrfy job and bugfixes. (#1514)

commit efa5180462f71ec476aeb6c5de4ba074a9d38a29
Author: Walter Kolczynski - NOAA
Date: Mon Apr 24 15:37:52 2023 -0400

Reorganize COM and refactor to use templates (#1421)

Reorganizes the entire COM directory into a more hierarchical structure and uses centrally-defined templates to define COM paths.

## Hierarchical Structure

To organize output much better and avoid having 30000+ files in a single directory, each component's COM directory is divided into a number of subdirectories for each type of output.
Sample directory trees:

### Cycled atmosphere only

```
gdas.20211222/00
├── analysis
│   └── atmos
│       └── gsidiags
│           ├── dir.0000
│           ├── dir.0001
│           ├── (Additional dir.* directories omitted for brevity)
│           └── dir.0083
├── model_data
│   └── atmos
│       ├── history
│       ├── master
│       └── restart
├── obs
└── products
    └── atmos
        ├── cyclone
        │   └── tracks
        └── grib2
            ├── 0p25
            ├── 0p50
            └── 1p00

101 directories
```
```
enkfgdas.20211222/00
├── earc00
├── ensstat
│   ├── analysis
│   │   └── atmos
│   │       └── gsidiags
│   │           ├── dir.0000
│   │           ├── dir.0001
│   │           ├── (Additional dir.* omitted for brevity)
│   │           └── dir.0039
│   └── model_data
│       └── atmos
│           └── history
├── mem001
│   ├── analysis
│   │   └── atmos
│   └── model_data
│       └── atmos
│           ├── history
│           ├── master
│           └── restart
└── mem002
    ├── analysis
    │   └── atmos
    └── model_data
        └── atmos
            ├── history
            ├── master
            └── restart

64 directories
```
```
gfs.20211222/00
├── analysis
│   └── atmos
├── model_data
│   └── atmos
│       ├── history
│       ├── master
│       └── restart
├── obs
└── products
    └── atmos
        ├── bufr
        ├── cyclone
        │   ├── genesis_vital
        │   └── tracks
        ├── gempak
        │   ├── 0p25
        │   ├── 0p50
        │   ├── 1p00
        │   ├── 35km_atl
        │   ├── 35km_pac
        │   └── 40km
        ├── grib2
        │   ├── 0p25
        │   ├── 0p50
        │   └── 1p00
        └── wmo

26 directories
```
```
enkfgfs.20211222/00
├── earc00
├── ensstat
│   ├── analysis
│   │   └── atmos
│   │       └── gsidiags
│   │           ├── dir.0000
│   │           ├── dir.0001
│   │           ├── (Additional dir.* directories removed for brevity)
│   │           └── dir.0039
│   └── model_data
│       └── atmos
│           └── history
├── mem001
│   ├── analysis
│   │   └── atmos
│   └── model_data
│       └── atmos
│           ├── history
│           ├── master
│           └── restart
└── mem002
    ├── analysis
    │   └── atmos
    └── model_data
        └── atmos
            ├── history
            ├── master
            └── restart

64 directories
```

### S2SWA coupled prototype (forecast-only):

```
gfs.20130401/00/
├── model_data
│   ├── atmos
│   │   ├── history
│   │   ├── input
│   │   ├── master
│   │   └── restart
│   ├── chem
│   │   └── history
│   ├── ice
│   │   ├── history
│   │   ├── input
│   │   └── restart
│   ├── med
│   │   └── restart
│   ├── ocean
│   │   ├── history
│   │   ├── input
│   │   └── restart
│   └── wave
│       ├── history
│       ├── prep
│       └── restart
└── products
    ├── atmos
    │   ├── cyclone
    │   │   ├── genesis_vital
    │   │   └── tracks
    │   ├── gempak
    │   │   ├── 0p25
    │   │   ├── 0p50
    │   │   ├── 1p00
    │   │   ├── 35km_atl
    │   │   ├── 35km_pac
    │   │   └── 40km
    │   ├── grib2
    │   │   ├── 0p25
    │   │   ├── 0p50
    │   │   └── 1p00
    │   └── wmo
    ├── ocean
    │   ├── 2D
    │   ├── 3D
    │   ├── grib
    │   │   ├── 0p25
    │   │   └── 0p50
    │   └── xsect
    └── wave
        ├── gempak
        ├── gridded
        ├── station
        └── wmo

51 directories
```

### Trees with files

gdas: https://gist.github.com/WalterKolczynski-NOAA/f1de04901e2703fd24d38146d2669789
gfs: https://gist.github.com/WalterKolczynski-NOAA/5d1b7c0a0f4b8cfff0be1ae54082316a
enkfgdas: https://gist.github.com/WalterKolczynski-NOAA/860aaa804e3e70e191e7cae2ebb1055b
enkfgfs: https://gist.github.com/WalterKolczynski-NOAA/130bfff4650ed8b07cf395079b65d318
S2SWA P8: https://gist.github.com/WalterKolczynski-NOAA/6ae90c6eafb573878f60682ce47179db

## Templating

All of the COM paths have been replaced with new variables that are derived from a set of templates centrally defined in `config.com`. Variables in the templates are then substituted at runtime to generate the COM paths via `envsubst`. To facilitate this, there is a new function, `generate_com` (see below), provided to automatically generate the COM paths. Where possible, COM paths are defined at the j-job level and made read-only. However, many of the EnKF scripts loop over the ensemble members, forcing the definitions to be made at the ex-script level instead (and be mutable).

The arguments to `generate_com()` are the list of COM variables to generate, each optionally followed by a colon and the template to use.
When no template is specified, the variable will be generated using the `${varname}_TMPL` template. Two options are accepted, `-r` and `-x`, which mark the variable as read-only and for export, respectively (the same as with the `declare` builtin). It is best practice to define any additional variables needed by the template on the same line to avoid adding them to the calling script's scope. Here are some examples used in the code:

Generate the path to the atmos analysis directory for the current cycle and `$RUN` (implicitly from the `$COM_ATMOS_ANALYSIS_TMPL` template) and mark as read-only and export:
```
YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS
```

Generate the path to the atmos history directory for the previous cycle's gdas from the `$COM_ATMOS_HISTORY_TMPL` template and mark as read-only and export:
```
RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
    COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL
```

Generate the path to the first ensemble member's history directory for the current cycle and `$RUN` and mark for export:
```
MEMDIR='mem001' YMD=${PDY} HH=${cyc} generate_com -x COM_ATMOS_HISTORY
```

## Additional information

The staging of initial conditions in `setup_expt.py` has been updated to stage in the new locations. The source of the initial conditions can **either** be in the new hierarchical structure or in the old flat structure, and the script will stage the files in the new structure. The destination paths are hard-coded here, so if any changes are made to the analysis, input, or restart templates, they will need to be mirrored in `setup_expt.py`.

### Stipulations

All changes in this PR are subject to approval by several stakeholders, including NCO. Sample COM trees above are subject to revision based on feedback (for instance, file X isn't really an obs file). File name updates are not included in this PR. File names (primarily for coupled components) will be updated to comply with NCO standards in a future PR.
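The template expansion behind `generate_com` described above can be illustrated with a small, self-contained sketch. This is a hypothetical pure-bash stand-in (the real function uses `envsubst` against the templates in `config.com`, plus the `-r`/`-x` handling); the template and values below are only illustrative:

```shell
#!/usr/bin/env bash
# Hypothetical stand-in for generate_com: expand ${VAR} references in a
# ${varname}_TMPL template using the current shell variables.
COM_ATMOS_ANALYSIS_TMPL='${ROTDIR}/${RUN}.${YMD}/${HH}/analysis/atmos'

generate_com_sketch() {
  local varname="${1}" tmplname="${1}_TMPL"
  local path="${!tmplname}"
  # Replace each ${VAR} token with the value of VAR, like envsubst would:
  while [[ "${path}" =~ \$\{([A-Za-z_][A-Za-z0-9_]*)\} ]]; do
    local var="${BASH_REMATCH[1]}"
    path="${path//\$\{${var}\}/${!var}}"
  done
  printf -v "${varname}" '%s' "${path}"
}

ROTDIR="/scratch/comroot"   # illustrative values, not real paths
RUN="gdas" YMD="20211222" HH="00"
generate_com_sketch COM_ATMOS_ANALYSIS
echo "${COM_ATMOS_ANALYSIS}"   # /scratch/comroot/gdas.20211222/00/analysis/atmos
```

Defining the template variables on the same line as the call, as the examples above do, keeps them out of the calling script's scope.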
AWIPS jobs are now almost working (they do not work in current develop), but one last program is still ending with an error. Work on fit2obs is deferred, so that portion of the verify job does not work. WAFS scripts are all external and have not yet been updated. WAFS is expected to be packaged separately going forward, so it will need to be updated like any other downstream package.

Some scripts that are not part of our normal development workflow have not yet been updated. I may be able to knock a few more off this list, but some just aren't available in development mode currently:
- All UFSDA app jobs (to be handled separately)
- With associated dev jobs (may still modify and test)
  - JGDAS_ATMOS_GLDAS
  - ~~JGLOBAL_WAVE_GEMPAK~~
  - ~~JGLOBAL_WAVE_POST_BNDPNT~~
  - ~~JGLOBAL_WAVE_POST_BNDPNTBLL~~
  - ~~JGLOBAL_WAVE_PRDGEN_BULLS~~
  - ~~JGLOBAL_WAVE_PRDGEN_GRIDDED~~
  - ~~JGLOBAL_WAVE_PREP~~
- With no associated dev job
  - JGDAS_ATMOS_GEMPAK_META_NCDC
  - JGFS_ATMOS_FBWIND
  - JGFS_ATMOS_FSU_GENESIS
  - JGFS_ATMOS_GEMPAK_META
  - JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF
  - JGLOBAL_ATMOS_EMCSFC_SFC_PREP
  - JGLOBAL_ATMOS_POST_MANAGER
  - JGLOBAL_ATMOS_TROPCY_QC_RELOC
  + All downstream scripts for the above

There are also a few scripts that are not available to the development workflow that I have already made a good-faith effort at updating:
- JGDAS_ATMOS_GEMPAK
- JGFS_ATMOS_PGRB2_SPEC_NPOESS

## Related Issues

Closes #761
Fixes #978
Fixes #999
Fixes #1207
Partially addresses #198
Partially addresses #289
Partially addresses #293
Partially addresses #1299
Partially addresses #1326

commit 408ef65a8e2318125ad61478746024b2d0ef463d
Author: Walter Kolczynski - NOAA
Date: Mon Apr 24 15:26:58 2023 -0400

Move GDASApp hash to stable version (#1508)

Changes the GDASApp hash to a more stable version than the tip of develop.
See the post-merge conversation in #1506.

commit 699a759f80352aaade4203425c5df24842fcefa3
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Mon Apr 24 13:37:04 2023 -0400

update GDASApp hash to d34f616 (#1505) (#1506)

commit 44f5c28518a7d4b9e06658c6c21b9b1ee1d0918e
Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com>
Date: Mon Apr 24 12:26:14 2023 -0400

ignore archiving sfluxgrbf00[124578] files when they are not present (#1498)

* ignore archiving certain sfluxgrbf??? files when they are not present.

Co-authored-by: Rahul Mahajan

commit 2e88dbfc5ff7a0bb2c6c1630b0c2783d76049d21
Author: Jiarui Dong
Date: Mon Apr 24 09:17:48 2023 -0400

Add initial land DA cycling scripts (#1351)

This PR adds rocoto jobs, j-jobs, config files, and updates to the machine.env to enable land-DA cycling capability.

commit f159d39a3b28dfcc120cdcdf87d11a611c75061f
Author: TerrenceMcGuinness-NOAA
Date: Fri Apr 21 15:46:35 2023 -0400

Add CI cron jobs (#1476)

As a maintainer of the CI framework, I need a set of cron jobs that will fully automate the CI pipeline so that whenever the appropriate label on GitHub is created, the PR gets cloned and built, followed by a set of functional experiments that are executed and reported on.

commit 587e469a1be5e278326fc0cbceefedc90caf75bf
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Fri Apr 21 13:28:11 2023 -0400

Refactor UFS-DA ATM ens component to use python g-w (#1373)

This PR contains UFS-DA ATM ens changes originally in PR #1354. Below is a list of changes in this PR:
- rename UFS-DA ATM ens jobs atmensanalprep, atmensanalrun, and atmensanalpost as atmensanlinit, atmensanlrun, and atmensanlfinal, respectively
- replace UFS-DA ATM ens shell scripts with python scripts
- rename UFS-DA ATM ens j-jobs consistent with initialize, run, and finalize functions.
Update j-jobs to execute python scripts instead of shell scripts
- rename UFS-DA ATM ens rocoto jobs to be consistent with initialize, run, and finalize functions. Update jobs to set python paths and execute renamed j-jobs
- update rocoto workflow generation to new names for UFS-DA ATM ens jobs
- update UFS-DA ATM ens job names in machine dependent env files to new job names
- rename UFS-DA ATM ens configuration files consistent with change in job names
- add python class for UFS-DA ATM ens analysis
- unify JEDIEXE link for UFS-DA Aerosol, ATM, and ENS
- properly set `cycledefs` for `gfsatmanlinit`
- remove unused `FV3JEDI_FIX` from atmanl and atmensanl config

The above changes are part of a larger g-w effort to transition from shell scripts to python. UFS-DA Aerosol was the first GDASApp system to be converted. PR #1372 converted UFS-DA atmospheric variational DA to the python-based approach. This PR converts UFS-DA atmospheric local ensemble DA to the python-based approach.

Fixes #1313
Depends (in part) on #1370 and #1372 and NOAA-EMC/GDASApp#388

commit 7db70496063fe32928cacb9790e45a1e987a3510
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Fri Apr 21 12:42:00 2023 -0400

Added Fit2Obs to S4. #1489 (#1497)

Adds Fit2Obs support for S4 by adding the module use/load commands to the module_base.s4.lua modulefile.

Fixes #1489.

commit fb236523140b09686a4c2961e0552e7bd5dbf04f
Author: Guillaume Vernieres
Date: Fri Apr 21 12:40:11 2023 -0400

Add new task to post-process marine DA (#1485)

The work in this PR is only meant to bring us closer to a viable WCDA system. The refactoring of the marine DA to the new standard introduced by @aerorahul and used by @RussTreadon-NOAA and @CoryMartin-NOAA will be addressed after this [Epic](https://github.com/noaa-emc/gdasapp/issues/416) is resolved.
### Motivation and context

This work adds a separate j-job, `JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT`, that calls a script that will live in the GDASApp for the time being (PR to come once this is merged) and does the following:
- prepares the `SOCA` increment for `MOM6` IAU
- recursively applies the `SOCA2CICE` change of variable, a mapping from the 2D sea-ice analysis variable to the CICE6 dynamic and thermodynamic variables
- merges the `Tref` increment from the `NSST` analysis with the `SOCA` increment

### Summary of the change

- HPC environment: the new j-job runs a `JEDI` executable twice and one python script. All are serial jobs, but the JEDI executable needs to be invoked as an MPI job with 1 PE.
- `jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT`: this script points to an ex-script that is not in the GDASApp develop yet.
- addition of the option to merge the Tref NSST increment with the MOM6 increment. This is triggered with the `DO_MERGENSST` switch.
- The new j-job dependency was added, with the option to wait for the surface analysis file `sfcanl.nc` if `do_mergensst` is true.

Refs: #1480. Fixes NOAA-EMC/GDASApp#418

commit 740daba8d6d34a327199701c1df7d6e10da73ec5
Author: Kate Friedman
Date: Fri Apr 21 03:11:51 2023 -0400

Create fix file issue template (#1495)

Create fix_file.md template file for new fix file request issues. This should help formalize the process and document updates. New issues will auto-assign to @KateFriedman-NOAA and @WalterKolczynski-NOAA (the developers with access to make fix file changes).

Fixes #1492

commit 35942896ca4eeef243fe35d47416be64fe0058ff
Author: Kate Friedman
Date: Thu Apr 20 10:00:23 2023 -0400

Update TC_tracker version to v1.1.15.6

New ens_tracker.v1.1.15.6 tag installed on supported platforms. Adds Jet support and moves the package to use the new EPIC-installed hpc-stacks on R&Ds.
Refs #1463

commit 86c3923bf60b1ce39165070bf2e5c3d60193d6dd
Author: Kate Friedman
Date: Thu Apr 20 09:39:56 2023 -0400

Update GSI-Monitor hash to reflect recent assimilation changes.

New GSI-Monitor hash 45783e3 to update two fix files.

Refs #1483

commit 2f347f6ddc770f2524394af25561a8da0d8dfb50
Author: Kate Friedman
Date: Wed Apr 19 15:54:19 2023 -0400

Fit2Obs updates for package reorganization and invocation via module (#1484)

The Fit2Obs repo has been reorganized to meet a few NCO standards and to add a module for invoking it from other packages.

Refs #1472

commit b2ed8648f80946de85983a51664b120540854cc9
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Tue Apr 18 16:21:14 2023 -0400

Refactor UFS-DA ATM var component to use python g-w (#1372)

This PR contains the UFS-DA ATM var changes in PR https://github.com/NOAA-EMC/global-workflow/pull/1354. Below is a list of changes in this PR:
- rename UFS-DA ATM var jobs `atmanalprep`, `atmanalrun`, and `atmanalpost` as `atmanlinit`, `atmanlrun`, and `atmanlfinal`, respectively
- replace UFS-DA ATM var shell scripts with python scripts
- rename UFS-DA ATM var j-jobs consistent with initialize, run, and finalize functions. Update j-jobs to execute python scripts instead of shell scripts
- rename UFS-DA ATM var rocoto jobs to be consistent with initialize, run, and finalize functions. Update jobs to set python paths and execute renamed j-jobs
- update rocoto workflow generation to new names for UFS-DA ATM var jobs
- update UFS-DA ATM var job names in machine dependent `env` files to new job names
- rename UFS-DA ATM var configuration files consistent with change in job names
- add UFS-DA ATM yaml files for UFS-DA fix files, crtm files, and increments
- add python class for UFS-DA ATM var analysis
- link UFS-DA python increment conversion script from GDASApp to g-w `ush` directory

The above changes are part of a larger g-w effort to transition from shell scripts to python.
UFS-DA Aerosol was the first GDASApp system to be converted. This PR represents the second GDASApp system, UFS-DA atmospheric variational DA, to be converted. PR #1373 contains changes to convert the UFS-DA ensemble DA to a python-based approach.

Fixes (in part) #1313
Depends (in part) on PR #1370 and [GDASApp #388](https://github.com/NOAA-EMC/GDASApp/pull/388)

commit 70a7d99bb65fd7661d7d2bac10633c0dda5d39fd
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Tue Apr 18 10:26:00 2023 -0400

Port the global workflow to Jet (#1301)

Provides initial cycled and free-forecast support for the global workflow on Jet. References #357. Not included in this port is support for GLDAS, verif-global, and TC_tracker. Jet will eventually support the following, with the currently tested options bolded, on the xjet and kjet partitions:
- uncoupled, cycled and free-forecast experiments at C48, C96, C192, C384, and C768 resolutions
- coupled, free-forecast experiments (ATMA, ATMW, S2S, and S2SW) at C384

~Note that currently coupled ICs are not available on Jet, so only ATM-only experiments can be performed.~

Fixes #357

commit d2b268ab3d965c9a2cf998dd048ac7a3ee7dc36b
Author: Walter Kolczynski - NOAA
Date: Mon Apr 17 21:15:04 2023 -0400

Consolidate wave parm files (#1477)

When the new wave parm directory was created, the existing parm files were not moved into the new directory. These files were used for AWIPS, so they were not generally tested.

commit 8dcfaa6fbc5e0a94d44952f710f77c3b18ffa50d
Author: Walter Kolczynski - NOAA
Date: Mon Apr 17 13:37:26 2023 -0400

Split MPMD stdout into tasks on slurm (#1469)

It can be difficult to debug MPMD jobs because their logs are all written concurrently to a single file. While tags designating the task via the preamble and PS4 can help identify which line is from which task, it is still difficult to follow a single task through the log, particularly for larger MPMD jobs with dozens of tasks.
Individual stdout files are now created by using the `srun` `--output` option. These files are written to the working directory (in `$DATA`).

Fixes: #1468

commit 6e7e4f1db78bbb67d94d54bdcb0a021a626b01f1
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Mon Apr 17 09:59:01 2023 -0600

GFS fixed-file YAMLs. (#1471)

Add a few fixed-file YAMLs used in the GFS.

commit 2ec4125f67e6b6c1d8dee6f6fcf1d0798f587a31
Author: Rahul Mahajan
Date: Fri Apr 14 20:55:57 2023 -0400

Initial blocks in place for forecast refactor work (#1466)

This PR is the first in a series of PRs transforming the forecast job. It does not affect the current function of the forecast job. This PR:
- adds initial blocks to separate task-specific and model configuration for the task blocks

commit d47f33f142824c0d2111f7e2f08c43f99b33bff4
Author: Walter Kolczynski - NOAA
Date: Fri Apr 14 15:51:14 2023 -0400

Update buoys file and fix boundary point jobs (#1465)

The buoy file used by the wave jobs ([wave_gfs.buoys](https://github.com/NOAA-EMC/global-workflow/blob/develop/parm/wave/wave_gfs.buoys)) was just a copy of [wave_gfs.buoys.dat](https://github.com/NOAA-EMC/global-workflow/blob/develop/parm/wave/wave_gfs.buoys.dat). In addition to being a duplicate, the file was a truncated version without any boundary points, causing boundary point jobs to fail. The duplicate file has been removed and replaced by a symlink to the full buoy list [wave_gfs.buoys.full](https://github.com/NOAA-EMC/global-workflow/blob/develop/parm/wave/wave_gfs.buoys.full). This maintains the provenance of the file and prevents the former duplicate from becoming out-of-sync. Users who still want to use the truncated buoy list can change the target of the symlink to wave_gfs.buoys.dat.

There are also a few minor bug fixes that were necessary to get the boundary point jobs to run:
- `FHMAX_WAV_IBP` had been set in the bndpnt config file but not used in the j-job.
This was invisible unless a user changed the value, since the config and j-job used the same default.
- Checks against `FHMAX_WAV` would set the unused `FHMAX_WAV_IBP` to the max value instead of the `FHMAX_WAV_PNT` used for the loop. This is a problem when running for less than 180 h (the default value).
- The boundary point bulletin job was not in the env job list for Orion (other machines have it).

Now the boundary point jobs set `FHMAX_WAV_PNT` to `$FHMAX_WAV_IBP`. `FHMAX_WAV_IBP` was moved from the bndpnt config to config.wave so it is visible to bndpntbll as well.

Fixes #1464

commit e496e393b16565207c227f4b69a5691d97098624
Author: Kate Friedman
Date: Wed Apr 12 15:59:00 2023 -0400

Move Fit2Obs to stand-alone job (#1456)

This PR moves the Fit2Obs invocation out of the vrfy job and into its own dedicated `fit2obs` job in the gdas suite. This new dedicated job uses the latest Fit2Obs tag `wflow.1.0`.

Fit2Obs requires a type of spin-up. The job looks back `VBACKUP_FITS` hrs and needs available inputs for that lookback cycle in the `ROTDIR`. The `jobs/JGDAS_FIT2OBS` script will first check that `xdate` (`CDATE` - `VBACKUP_FITS`) > `SDATE` and, if that is met, will check that the needed inputs exist.
- If `xdate>SDATE` is not yet satisfied, the job will exit 0 with "Too early for FIT2OBS to run. Exiting.". The conditional is greater-than and not greater-than-or-equal since the first half cycle generally does not have some of the needed inputs (e.g. prepbufr). Thus the first half cycle is not included in the valid lookback cycles. This avoids erroneous job failures for the first cycle to run the Fit2Obs package. Additional logic could be introduced to include the half cycle if all of its inputs are available.
- If any of the needed inputs are missing, the job will abort with "FATAL ERROR: FILE MISSING: ${file}".

This spin-up means that the first cycles will run the job but exit 0 immediately.
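The lookback gate described above can be sketched as follows. This is a hypothetical illustration, not the actual j-job: the variable values are made up, and GNU `date` is assumed for the date arithmetic.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the JGDAS_FIT2OBS lookback gate with made-up values.
SDATE=2021122200      # experiment start cycle (YYYYMMDDHH)
CDATE=2021122300      # current cycle
VBACKUP_FITS=24       # lookback in hours

# xdate = CDATE minus VBACKUP_FITS hours (GNU date assumed):
xdate=$(date -ud "${CDATE:0:8} ${CDATE:8:2}:00 ${VBACKUP_FITS} hours ago" +%Y%m%d%H)

# Strictly greater-than: the first half cycle is excluded on purpose.
if (( xdate > SDATE )); then
  echo "Running Fit2Obs for lookback cycle ${xdate}"
else
  echo "Too early for FIT2OBS to run. Exiting."
fi
```

With these values the lookback cycle lands exactly on `SDATE` (the half cycle), so the strict comparison fails and the job would exit 0.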
The 6th cycle (if `VBACKUP_FITS=24`) will be the first cycle to run the Fit2Obs package and produce output in the online archive.

Changes:
1. Remove fit2obs variables and settings from `config.vrfy` and move them into the newly created `config.fit2obs` for the `fit2obs` job.
2. Remove fit2obs submission/invocation from `jobs/rocoto/vrfy.sh`.
3. Create new `fit2obs` job scripts: `jobs/rocoto/fit2obs.sh` and `jobs/JGDAS_FIT2OBS`.
4. Add new `fit2obs` job to setup scripts: `workflow/applications.py` and `workflow/rocoto/workflow_tasks.py`.
5. Add new `fit2obs` job to all env files.
6. Add new `fit2obs` job into `config.resources` (use 1 node on WCOSS2 and 3 nodes elsewhere).
7. Add `export DO_FIT2OBS="YES"` to `config.base.emc.dyn`.

Resolves #1405
Resolves #1232

commit 363a2b47de11ab327408d2df20a3718f25062fa5
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Wed Apr 12 14:42:50 2023 -0400

Rework arch job dependencies (#1455)

This reworks the dependencies for arch tasks so that if there are no verification tasks selected and it is an uncoupled experiment, a dependency on the cycle's post jobs is added. It also fixes the dependency checks in rocoto.py to check for an empty list or string instead of checking the first element of a list, which may not exist if no dependencies are given. Lastly, it issues a warning to the user if there are no dependencies for any job.

Fixes #1451

commit 7e661f4cd00cf99b12e55574318bbd0eb1a5eaa0
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Wed Apr 12 10:36:58 2023 -0400

Stage bias files for UFSDA aerosols (#1370)

Updates UFSDA aerosols to stage bias files. Also adds the threads and aprun commands for the run portion of the UFSDA aero jobs.

Fixes (in part) #1313.

commit d5ae3328fa4041b177357b1133f6b92e81c859d7
Author: Henry R.
Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Tue Apr 4 03:34:48 2023 -0600

Add logging level to logger (#1442)

This PR provides task-level support for the respective pygfs applications. As an example, the logging level may be defined within jobs/JGLOBAL_FORECAST as follows.
```
#! /usr/bin/env bash
source "${HOMEgfs}/ush/preamble.sh"
source "${HOMEgfs}/ush/jjob_header.sh" -e "fcst" -c "base fcst"
export LOGGING_LEVEL="DEBUG"
```
This feature allows a user to change the logging level from the run-time environment rather than requiring modification of the respective pygfs module or task.

Fixes #1438

commit de81c5911c308679366b97fda19f015b1388d9dc
Author: TerrenceMcGuinness-NOAA
Date: Mon Apr 3 17:53:12 2023 -0400

Update hash for GDASApp to db2f998 (#1443)

When the update described below changed the hash in the `${HOMEgfs}/sorc/checkout.sh` script for the corresponding GDASApp, it was entered in error:

_3e73038c - Use V2 version of fix files needed for Thompson MP (#1422) (7 days ago)_

Examining the GDASApp repo confirms that the correct hash should be **db2f998**.

Fixes #1441

commit 0d1e993b2b5db6160c4a6b88b67899dc8e9754f8
Author: Kate Friedman
Date: Fri Mar 31 09:58:22 2023 -0400

Remove para module paths for ncdiag on WCOSS2 (#1437)

The ncdiag/1.0.0 module moved from para to prod on WCOSS2 on March 27th (RFC 10769).

* Remove para module paths for ncdiag on WCOSS2 in global-workflow module_base.wcoss2.lua
* Update GSI-EnKF hash to update ncio and ncdiag
* Update GSI-Monitor hash to update ncdiag

Refs #1426

commit 88e091a8e92cbc813830b21562fa392f8dd2d3d9
Author: Rahul Mahajan
Date: Fri Mar 31 02:50:53 2023 -0400

Use P8 settings for C384 atm by default (#1440)

Switches the default C384 FV3 timestep to 300s and reduces the decomposition for the gfs CDUMP to 8×8 with 48 write tasks per group. These are the settings used by P8. MDAB has advised these settings can be used for non-P8 runs.
Fixes #1439

commit 3cfdbe04e864847a3f89b6f8b89799b18b7e1f5e
Author: Walter Kolczynski - NOAA
Date: Wed Mar 29 10:46:35 2023 -0400

Enforce rstprod on relevant tarballs (#1436)

Ensures that tarballs that contain restricted data are properly restricted to the rstprod group.

Fixes #1433

commit 809b33bf50193a083c7ffdbd87bb83e0c78b2a9c
Author: Kate Friedman
Date: Tue Mar 28 09:44:37 2023 -0400

GFSv16.3.5[6] GSI updates (#1404)

* Update GSI hash to 31b8b29
* Remove temporary hack that forced GSI to build with crtm/2.4.0
* Remove G18 ABI from exglobal_atmos_analysis.sh
* Remove GMI from processing in the GSI

Refs #1322, #1321

commit 222f055e1e082faf7f9e489297e5c46ccc582a20
Author: Cory Martin
Date: Mon Mar 27 17:48:39 2023 -0400

Have aerostat tar file extract to basename only (#1424)

This bugfix makes it so that when the aerosol diags are extracted, they are extracted directly into the directory rather than into a full directory tree mirroring where the runtime directory was located.

Closes #1423

commit c549acb914c7ae479740c8f1007f5f3f36b91db5
Author: Rahul Mahajan
Date: Mon Mar 27 10:00:18 2023 -0400

Updates in the aerosol tasks (#1420)

- removes the definition of environment variables in the shell script j-jobs
- eliminates the use of !ENV in the yaml files for aerosol jobs
- eliminates use of CDATE in the python tasks; uses current_cycle and previous_cycle instead
- uses jinja templates where appropriate
- uses the Executable class to run the variational analysis executable
- adds verbose logging to the actions in the aerosol analysis task
- uses string templates instead of string replace where looping over tiles is required
- links aerosol ICs when cycling with the ATMA app.
ICs courtesy of @CoryMartin-NOAA Co-authored-by: Walter Kolczynski - NOAA commit 3e73038c12f8261543c874cfffed11f8ce496399 Author: Rahul Mahajan Date: Fri Mar 24 12:40:17 2023 -0400 Use V2 version of fix files needed for Thompson MP (#1422) It has been noted by some developers who look at the run log in realtime that the model takes a while during the calculation of Thompson tables. Specifically see this part of the output from the forecast log: ``` 0: Calculating Thompson tables part 1 took 0.334 seconds. 0: Calling radar_init took 0.000 seconds. 0: creating rain collecting graupel table 0: ThompMP: computing qr_acr_qg 0: Writing qr_acr_qgV2.dat in Thompson MP init 0: Computing rain collecting graupel table took 203.539 seconds. 0: creating rain collecting snow table 0: ThompMP: computing qr_acr_qs 0: Writing qr_acr_qsV2.dat in Thompson MP init 0: Computing rain collecting snow table took 36.694 seconds. 0: creating freezing of water drops table 0: Computing freezing of water drops table took 2.084 seconds. 0: Calculating Thompson tables part 2 took 2.084 seconds. 0: ... DONE microphysical lookup tables ``` These tables are already available in the `fix` space and are being used in the ufs-weather-model regression tests. Fixes #1411 commit 4ff622a36e3f9aa2f39e847a7b9e2b536ebcdc0a Author: TerrenceMcGuinness-NOAA Date: Thu Mar 23 01:55:42 2023 -0400 Add experiment creation for CI (#1388) Adds a new python script to create an experiment on the fly for CI. Also adds detect_machine.sh. Closes #1375 commit 6bb2d64fa171604b9d5e8f58e4edf8bf946840e7 Author: Kate Friedman Date: Wed Mar 22 15:44:00 2023 -0400 Update state of operations in RTD to GFSv16.3.7 (#1417) Update the "State of operations" blurb in index.rst to note the updated GFSv16.3.7 operational version. 
Refs #1368 commit ea15b26585c0c804f44776cabb6ae3d08b76234f Author: Walter Kolczynski - NOAA Date: Wed Mar 22 14:07:06 2023 -0400 Fix typo in wave awips gridded task def (#1412) The task name was misspelled in the task definition for gridded wave awips. commit 74b344ddb28fb40b8a65479f1584cf7daa376dfe Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Mar 22 09:05:42 2023 -0600 Make the new methods in yaml_file public. commit 478f48ede4ee28e8193ff02befb0a954bd7eca61 Author: Walter Kolczynski - NOAA Date: Tue Mar 21 17:18:46 2023 -0400 Fix groupsize for early cycle EnKF rocoto task (#1408) When generating the rocoto tasks for enkfgfs, the different groupsize for gfs would not be picked up because the cdump comparison was not properly updated when the cdump/run was updated to include 'enkf'. commit 49b96ed037c6919c651c6b7dd2b6c9fbe9866904 Author: ChunxiZhang-NOAA <49283036+ChunxiZhang-NOAA@users.noreply.github.com> Date: Mon Mar 20 15:35:19 2023 -0400 Use fracoro data for all new UFS applications (#1242) The new fracoro data should be used for all new UFS applications regardless of whether frac_grid is used. Most problems in Issue [#863](https://github.com/NOAA-EMC/global-workflow/issues/863) have been resolved. However, one requirement remains: the latest fix, mask and oro datasets (fracoro) created by Shan/Mike/Helin should work for both fractional and non-fractional grids. Note that this also requires changes in UFS_UTILS. A corresponding PR [#741](https://github.com/ufs-community/UFS_UTILS/pull/741) in UFS_UTILS has been created. Fixes: #863 Dependency: UFS_UTILS [PR#741](https://github.com/ufs-community/UFS_UTILS/pull/741) commit e5af1b45f5aaa3acfe8e6ee37e690edff9aa608e Author: Rahul Mahajan Date: Fri Mar 17 13:02:54 2023 -0400 Updates to python tools for use in DA tasks (#1400) Adds significant updates to pygw tools that expand the use of templated yaml files to make their use in the tasks clear and easier to use. 
All changes come with associated tests. New tests are added for timetools.py and jinja.py. New methods to parse a "simple" $( ... ) templated as well as jinja2 {{ ... }} templated yaml files are added along with their tests. commit 3fe3592338598ef6b957d29e809dc70df1f82cec Author: Guillaume Vernieres Date: Wed Mar 15 01:48:46 2023 -0400 Marine DA prep j-job needs more memory (#1393) The concatenation step runs out of memory in the marine-gdas prep step. This is not optimized and probably overkill, but 24GB should cover all cases. Fixes #1389 commit 6cf486190c83f64370f1c2dccabd85700fbc0a3a Author: TerrenceMcGuinness-NOAA Date: Tue Mar 14 15:50:18 2023 -0400 Update rocoto_viewer to replace deprecated getiterator call (#1397) Rocoto viewer was using a deprecated function `getiterator` that caused it to fail on python 3.8+. The replacement method `iter` is now used. Fixes #522 commit 2929430369b0e6b4e6d42b45637f552f8cb7e59e Author: Rahul Mahajan Date: Mon Mar 13 16:29:51 2023 -0400 Reset modules properly at beginning of forecast job (#1394) Following the PR last week that enabled ESMF threading, we had to replace `load_fv3gfs_modules.sh` with loading ufs-weather-model specific modules for the `fcst` and `efcs` jobs. `module-setup.sh` is needed after `detect_machine.sh`. Previously, both these functions were performed in `load_fv3gfs_modules.sh`. commit 995e2b0c819d2bf45e9cde7b3fb4dd638034f91d Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Mar 10 18:34:28 2023 -0700 Add base workflow exception class (#1392) Adds a new WorkflowException that can serve as a base class for any new exceptions we wish to create to cover errors not well represented by the native python exceptions. Also adds a test exception. Closes #1391. 
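The rocoto_viewer fix above swaps a removed ElementTree call for its replacement. A minimal sketch of that swap (the XML snippet here is a made-up stand-in, not real Rocoto output):

```python
# On Python 3.9+, Element.getiterator() is gone; Element.iter() is the
# replacement, with the same tag-filtering behavior.
import xml.etree.ElementTree as ET

root = ET.fromstring(
    "<workflow><task name='fcst'/><task name='post'/></workflow>"
)

# Old (removed): tasks = root.getiterator('task')
tasks = [elem.get("name") for elem in root.iter("task")]
print(tasks)  # ['fcst', 'post']
```

`iter()` walks the element and all descendants, so it is a drop-in replacement wherever `getiterator()` was used to traverse the workflow tree.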
commit aa8175dc39bd64e895d2362dc0ec37ed76ababef Author: Rahul Mahajan Date: Fri Mar 10 19:50:28 2023 -0500 Enable ESMF threading in the ufs-weather-model forecast (#1371) Transitions the workflow to use ESMF-managed threading for UFS. This allows for per-component specification of threads rather than a single value for all components. The resource calculation is updated to handle the different thread counts for each component. The variable `NTHREADS_FV3` (which set the global thread count) is removed. Now each component has a `${COMPONENT}THREADS` variable. In order to run properly, the launcher commands for each machine had to be modified so the number of processes is the number of CPUs on all nodes. Also, the forecast job now uses the UFS modulefile rather than the typical workflow runtime module. `prod_util` is then loaded manually as it is needed to run the workflow. Additionally, on WCOSS2 `cray-pals` is also loaded manually as it is needed there. This arrangement is temporary until a more permanent solution is implemented. Notes: 1. As a result of threading, the `WRTTASKS_PER_GROUP` in the `model_configure` ends up being a multiple of the number of threads used in quilting. At present, they are assumed to be the same as the threads for FV3. 2. The `WCOSS2.env` file needs review, as the sections for the `fcst` and `efcs` steps differ from each other and differ substantially from the [job card](https://github.com/ufs-community/ufs-weather-model/blob/develop/tests/fv3_conf/fv3_qsub.IN_wcoss2) in the ufs-weather-model for WCOSS2. 
Closes #1042 commit 8a2d5061da3c3067291e51680408339ec2efec5c Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Thu Mar 9 16:11:22 2023 -0500 Cleanup ocean, ice, and med directories when CDUMP=gdas (#1387) commit 780a511e51c83b339b02709a782c123a16c5788d Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Thu Mar 9 15:59:40 2023 -0500 Archive ocean and ice files when CDUMP=gdas (#1384) commit d8fdd29ab236798937fbd1c5e1aa9266db98dcc0 Author: Guillaume Vernieres Date: Wed Mar 8 16:38:06 2023 -0500 Updated diag_table_da to allow output of ocean fields for SOCA at various resolutions (#1382) commit 815823997e7886dd4c054fa2bf573dabd5d6047c Author: Rahul Mahajan Date: Wed Mar 8 09:38:11 2023 -0500 Add mechanism to detect machine and clean module env. (#1381) commit 4437181a72629e6f6f7214de749dc890e32a4994 Author: Kate Friedman Date: Tue Mar 7 08:57:10 2023 -0500 Update initial condition documentation and Orion BASE_CPLIC path (#1376) * The initial conditions section of the Read-The-Docs documentation is updated to add information about staged initial conditions that were pulled into global account space on supported platforms. Some reformatting of the section is also done. * The Orion BASE_CPLIC path is also updated after prototype ICs were copied from @WalterKolczynski-NOAA's personal area on Orion to the "glopara" area maintained by @KateFriedman-NOAA. Refs #1345 commit 1a48aca822b1caf9325b37e9819243f91454f4e6 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Mar 7 07:56:59 2023 -0500 Add a missing dollar sign in the earc script (#1378) commit 33e8a8ea56920bca4e31bf961647ec989f24582b Author: TerrenceMcGuinness-NOAA Date: Mon Mar 6 22:52:33 2023 -0500 Add BASH scripts for initial CI testing system using GitHub Labels (#1362) This PR is a set of high-level BASH scripts for creating a basis of a CI System using GitHub labels. 
These scripts will first poll the **global-workflow** GitHub repo for open PRs with the label **${hostname}-CI**. The label designates the RDHPCS system on which the CI functional tests will run. A second script will then clone and build the designated PRs on the specified RDHPCS system. Closes #1374 commit 47afc78568563bc4c55208e567bfb967684eae0a Author: Rahul Mahajan Date: Fri Mar 3 10:43:54 2023 -0500 Consolidate ocean, ice and wave task info into config.ufs (#1334) config.fv3 is renamed to config.ufs and contains ocean, ice, and wave task information * config.fv3 is used to record FV3 and write grid component task decomposition based on resolution. * Ocean (MOM6) and Ice (CICE6) decomposition, tasks and timesteps are added to config.ufs. * Updates are made to config.fcst and config.efcs to source config.ufs based on the configuration of the (coupled) model. * Several settings were buried deep in parsing_namelists_MOM.sh and have now been elevated to config.ocn * Some improvements are made to limit the export of variables from functions in nems_configure.sh. * This will help with the ESMF threading work as it breaks down the work into multiple steps. commit 0c523d9b75a6a5d24a55fcc3f56ed505dedac086 Author: Jessica Meixner Date: Thu Mar 2 16:27:53 2023 -0500 Update to HR1 (#1197) Updates settings for the HR1 prototype. Compilation is switched to "mixed-mode", with a 32-bit atmosphere and 64-bit for other components. Atmosphere physics options are updated. The wave model is changed to run on the outer loop, and its resolution is changed to ¼-deg. The model version was already updated previously, but this commit updates some of the associated input files. 
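The CI polling step described above reduces to a label filter over open-PR records. A hedged sketch of that filter, using records shaped like the GitHub API response; the function and field names here are illustrative, not the actual scripts':

```python
# Keep the PR numbers whose labels include "<hostname>-CI", mirroring
# how the CI driver selects PRs destined for a given RDHPCS system.
def prs_with_ci_label(prs, hostname):
    label = f"{hostname}-CI"
    return [pr["number"] for pr in prs
            if any(lbl["name"] == label for lbl in pr.get("labels", []))]

# Fake PR records for illustration only.
open_prs = [
    {"number": 1388, "labels": [{"name": "hera-CI"}]},
    {"number": 1375, "labels": [{"name": "documentation"}]},
]
print(prs_with_ci_label(open_prs, "hera"))  # [1388]
```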
commit 5d6c71ab34667ef0dac777a89a19a37edd2652fc Author: Rahul Mahajan Date: Thu Mar 2 11:53:10 2023 -0500 Add options while setting up Rocoto XML that are useful for CI (#1365) Adds options for maximum tries (default is 2), cyclethrottle (default is 3), taskthrottle (default is 25) and verbosity (default is 10) commit 6024e68d8f519d4d1de224de0d7d6799e02f2e07 Author: Rahul Mahajan Date: Thu Mar 2 11:06:17 2023 -0500 Revert "Add options while setting up Rocoto XML that are useful for CI (#1363)" (#1364) This reverts commit c318cbdefc80b390a6bf897229fd2e206eb6873c. commit 53952153eea6a9afc83512f84d860823b300bc97 Author: Guillaume Vernieres Date: Thu Mar 2 10:02:14 2023 -0500 MOM6 backgrounds in cycled DA mode were hardcoded for IAU (#1355) * fixed mom6 bkg output Co-authored-by: Rahul Mahajan commit c318cbdefc80b390a6bf897229fd2e206eb6873c Author: Rahul Mahajan Date: Thu Mar 2 09:50:09 2023 -0500 Add options while setting up Rocoto XML that are useful for CI (#1363) maxtries, cyclethrottle, taskthrottle and verbosity are command-line options to setup_xml.py commit 0e1c753e7bd4d4e98b40372168578198c4300124 Author: Rahul Mahajan Date: Thu Mar 2 09:48:51 2023 -0500 Run executables or scripts from within python. (#1341) - adds the ability to run executables (binaries or shell scripts) via the subprocess.Popen call - allows setting env. variables for the subprocess without having to modify the calling environment - allows passing custom arguments to the executable - allows capturing stdout and stderr as well as passing stdin commit 2e92b7c582f116434ca49af7f7e6b5ec48842f48 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Wed Mar 1 16:21:13 2023 -0500 Remove extra 'enkf' from folder/file names in archive scripts (#1360) The archiving scripts have had preceding `enkf`s removed and `$CDUMP` has been replaced with `$RUN` throughout the scripts. This fixes #1353 and partially addresses #1299. 
Fixes #1353 Refs #1299 commit 0a18568a58d4c33e63d728424223900e98ee0350 Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Wed Mar 1 16:12:06 2023 -0500 Fix bugs to create analysis files for early cycle (#1343) The early-cycle EnKF should generate analysis files; however, when the CDUMP was changed from "gfs" to "enkfgfs" for the enkfgfs* jobs, some scripts were not updated accordingly. This PR fixes that bug. commit edbf8d955e04b54909c20379d4f674d0a9f3e1f7 Author: Cory Martin Date: Wed Mar 1 16:10:59 2023 -0500 Make necessary bugfixes to get aerosol cycling going (#1349) Makes a number of bugfixes for issues that were overlooked in #1106, now allowing 3DVar aerosol DA cycling on Hera/Orion. Also updates the GDASApp hash to the most recent commit in develop. commit c4d05e57d5e7192ac3b8a93e6880efe5bbad2e65 Author: Kate Friedman Date: Tue Feb 28 16:27:46 2023 -0500 Create production_update.md (#1348) New template for operational production updates. Includes checklist for workflow side. commit 8134f975d51905789f7a59b07d713306a91ba10b Author: Rahul Mahajan Date: Mon Feb 27 14:36:03 2023 -0500 Update feature_request.md commit a243b5c1c2bb21a47bcc9dee260ab5b90bb06843 Author: Rahul Mahajan Date: Mon Feb 27 12:21:27 2023 -0500 Create a template for requesting new features in the global workflow to separate from general issues (#1336) commit 0c621d0b9ead8d46fb287d4eb547386b3335f9f8 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Feb 24 19:30:00 2023 -0500 allow script to access eva yaml generators (#1331) commit f69d3e54d92b257bf1102c13eb1197a967288c39 Author: Rahul Mahajan Date: Fri Feb 24 12:06:46 2023 -0500 Add license and status badges for the CI (#1332) commit ea414291341e77eb6eee64241a702141152efcfd Author: Rahul Mahajan Date: Wed Feb 22 17:02:15 2023 -0500 Update ufs_utils hash that supports global_cycle with NoahMP. 
(#1315) - Updates ufs_utils hash that contains upgrades to `global_cycle` that update only the greenness fraction. This update expects a pre-existing surface restart file to be updated. - Corresponding updates to scripts that call `global_cycle` to stage the surface restart file to update. - Updates to `checkout.sh` and `Externals.cfg`. Temporary pointers to [this](https://github.com/GeorgeGayno-NOAA/UFS_UTILS/tree/feature/cycle_noahmp) branch until `develop` in `ufs_utils` is updated. - Updates to `setup_expt.py` to force using the same `CCPP_SUITE` and `IMP_PHYSICS` for cycled and forecast-only modes and all apps. - Only builds utilities from ufs_utils that are used in the GFS application. @GeorgeGayno-NOAA still needs to work with the land team to determine which fields (other than greenness fraction) need to be updated through `global_cycle`. Depends on https://github.com/ufs-community/UFS_UTILS/pull/774 Fixes #1314 Updating the UFS_utils hash will also resolve #1275 Fixes #1275 commit 6addad94b510b08a83e43236b9d8c430b8aeddce Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Wed Feb 22 15:02:07 2023 -0500 Save yaml for ocn analysis post and fix bug to save logs (#1318) Temporarily saves var.yaml to comrot in ocean analysis post and fixes a bug that caused log files to not be saved. Necessary for https://github.com/NOAA-EMC/GDASApp/issues/202. This is temporary pending an evaluation of what needs to be saved and the appropriate place. Refs https://github.com/NOAA-EMC/GDASApp/issues/202 commit dbbd8b19ce847136dbb6457a082e58ab17f5678f Author: Kate Friedman Date: Wed Feb 22 14:53:29 2023 -0500 Update crtm to v2.4.0 (#1319) * Update module_base modulefiles to use crtm/2.4.0. * Remove line in module_base modulefiles to set CRTM_FIX; this variable is now set in the crtm module as of v2.4.0. 
* Update ufs-weather-model hash to c22aaad * Temporarily set crtm_ver in GSI build script to force GSI to build with crtm/2.4.0 Refs #1233 commit adae24cf6c08ca327f74c8bcd3aa9c89cbb28724 Author: Walter Kolczynski - NOAA Date: Wed Feb 22 14:45:49 2023 -0500 Fix python style errors (#1330) Some python style errors crept into develop because we ignored pynorm failures during the PR process. These errors are now corrected. commit e18a79036f16f60e7560771d17ab89d10ea39a96 Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Wed Feb 22 13:17:37 2023 -0500 Fix workflow generation post times for early cycle (#1329) In PR #1309 a setting was overlooked that controls what hours have post tasks created for the early cycle. Fixes #1328 commit 07fedaa649e4576a2bd3d2fca32bb6c144fa2a85 Author: Walter Kolczynski - NOAA Date: Fri Feb 17 16:43:16 2023 -0500 Change RUN and CDUMP for ensemble jobs (#1309) Updates `$RUN` and `$CDUMP` for ensemble jobs to include `enkf`. Previously, the `$RUN` for EnKF jobs had been set to `gdas` or `gfs`. However, this violates NCO policy and was also complicating Issue #761. Now the `$RUN` for EnKF jobs is either `enkfgdas` or `enkfgfs`. Theoretically, `$CDUMP` shouldn't need to change. However, `$CDUMP` and `$RUN` are used interchangeably throughout much of the workflow (Issue #1299), so for now the `$CDUMP` is kept identical to `$RUN`. This will be corrected in a future PR. This change **changes the name** of EnKF output files. Files now begin with `enkfgdas` (or `enkfgfs`) rather than `gdas`. Closes #1298 commit 383c8c3a25dbeedb3a4892d22d2531286e87389f Author: Guillaume Vernieres Date: Thu Feb 16 11:10:50 2023 -0500 Provide default for DEBUG_WORKFLOW in load_ufsda_modules.sh commit 3bfcb8975acab6a14634d95189b6b10379e37afc Author: Guillaume Vernieres Date: Wed Feb 15 11:10:34 2023 -0500 Allow increments to be added for ocean and ice cycling. 
(#1308) Removed forgotten commented out call to the B-mat j-job in jobs/rocoto/ocnanalbmat.sh ... oops Provide ocean increments via `mom6_increment.nc` Link to the JEDI/SOCA increment in ush/forecast_postdet.sh Fixed a dependency bug in workflow/rocoto/workflow_tasks.py MOM6 Increment is required in ROTDIR for the first 1/2 cycle, changes reflected in workflow/setup_expt.py Co-authored-by: Rahul Mahajan commit a1968e6cd7546151e670c5a0d6f1dd4b5d859c10 Author: Kate Friedman Date: Tue Feb 14 12:51:33 2023 -0500 Module base file reorder and reduction (#1306) * Adjust the order of module loads in module_base modulefiles to conform with desired order: compilers, mpi, 3rd party, hdf5, netcdf, nceplibs. * Remove modules that aren't needed for runtime. * Checked dependencies of remaining modules to make sure prereqs are loaded beforehand as needed. * Correct bug with hpss module in module_base.jet.lua Refs #479 commit 5ac68361917e81555b13e0a2b160f2f7546b8fb4 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Feb 14 12:00:43 2023 -0500 Update S4 environment and module files (#1303) Updates the S4 module file and environment file to keep up to date with the current develop branch. Fixes #1297. Addresses one bug in #1195. This also increases the memory request for the ediag job. During testing, that job initially failed. For the 5 cycles run between 2022051500 - 2022051600, the largest memory footprint was ~26.5GB. commit 1040216d8a4efb9955efecebf59775e91d8845e2 Author: Cory Martin Date: Fri Feb 10 17:16:44 2023 -0500 Add in initial 3DVar aerosol DA cycling capability (#1106) This PR adds an initial 3DVar aerosol DA cycling capability to support scientific development and testing towards an operationally viable candidate system for aerosol DA for GFSv17/GEFSv13. 
This PR includes the following: - Three new j-jobs and ex-scripts for aeroanlinit, aeroanlrun, and aeroanlfinal - modifies the rocoto scripts that call the above j-jobs to actually call them - makes modification to the config.resources and config.aeroanl files - Introduces an object-oriented python structure to initialize/finalize the aerosol analysis with the intention of eventually using the top-level classes for other analysis jobs (soca, atm, land) Closes #982 commit cc54b8cdab3b83b20c866bbaa5dba0004dae5425 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Fri Feb 10 17:15:16 2023 -0500 Fix name of MOM6 n restarts for future cycle points (#1307) Script fix for C384 O0.25 3DVAR cycling. Original scripts had wrong naming structure for MOM *res${n}.nc restart files. Refs: #947, #1289 commit d8c1bd5dfb6b2654b5b8c5121af68f7473fac26e Author: Kate Friedman Date: Fri Feb 10 12:26:30 2023 -0500 Update RTD GFS operational version to v16.3.6 (#1305) Update the status of operations to the newly implemented v16.3.6 version on the read-the-docs main page. Refs #1278 commit 9d11e1e871639523813fcc606aab40f1a6a8103f Author: Guillaume Vernieres Date: Tue Feb 7 21:58:29 2023 -0500 Stop wiping data at beginning of bmat vrfy j-job (#1302) The ocean b-matrix verification job relies on data from the b-matrix job still residing in $DATA, but that directory was being wiped when this job began. Now setting WIPE_DATA to NO to prevent the deletion. Closes GDASApp/issues/318 commit ae9d140273df9f78b4ae7d64e441e31f7b0d9e10 Author: Kate Friedman Date: Tue Feb 7 14:56:25 2023 -0500 Add back in module load block in coupled_ic.sh (#1300) - Resolve bug introduced from removal of module load script block in coupled_ic.sh. - Add block back in for now and then address errors produced by it more appropriately. 
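The aerosol DA PR above introduces an object-oriented python structure whose top-level classes are intended for reuse by other analysis jobs (soca, atm, land). A hypothetical minimal sketch of that shape; the class and method names here are illustrative, not the actual pygfs API:

```python
# A shared Analysis base class defines the initialize/execute/finalize
# life cycle; each concrete job (aerosol here) fills in the stages.
class Analysis:
    def __init__(self, config):
        self.config = config

    def initialize(self):
        # stage fix files, backgrounds, and observations (job-specific)
        raise NotImplementedError

    def execute(self):
        # run the analysis executable (job-specific)
        raise NotImplementedError

    def finalize(self):
        # copy products to COM and clean up (job-specific)
        raise NotImplementedError


class AerosolAnalysis(Analysis):
    def initialize(self):
        return f"staging inputs for {self.config['cdate']}"

    def execute(self):
        return "running 3DVar aerosol analysis"

    def finalize(self):
        return "copying increments to COM"


job = AerosolAnalysis({"cdate": "2021032312"})
print(job.execute())  # running 3DVar aerosol analysis
```

The design lets the j-jobs drive every analysis type through the same three entry points while the subclasses own the job-specific staging and execution details.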
commit 2615fff62853769f1cc5beb5da49a70cb0ad267d Author: Walter Kolczynski - NOAA Date: Mon Feb 6 15:24:32 2023 -0500 Turn off differential pylint (#1296) Turns off the differential pylint test until it can be evaluated further. The YAML block is left in commented form in case we wish to turn it back on later. commit bdb0db77fdb3cc93a2eae981843499cd38724ba5 Author: Rahul Mahajan Date: Fri Feb 3 11:13:18 2023 -0500 Purge ICSDIR (#1295) This PR: - updates the `coupled_ic.sh` job to copy initial conditions from `BASE_CPLIC` directly to `ROTDIR` with the names that the workflow expects, conforming to the naming convention within the workflow - obsoletes the need for and use of `ICSDIR` in forecast-only experiments, which served as an intermediate space for staging initial conditions - updates the documentation section for `forecast-only` This is a non-breaking change. A change in documentation is required as the instructions for setting up the coupled forecast-only experiment no longer need to pass the argument `--icsdir` to `setup_expt.py`. Fixes #1276 commit 1b0905c4f25e793a5a210ee0bfdb737bb5136c04 Author: Rahul Mahajan Date: Thu Feb 2 14:42:18 2023 -0500 Deprecate `FDATE` (#1294) There used to be a fringe functionality where the `gdas.tHHz.radstat` could be ignored in the first full cycle `gdasanal` task. All cycled experiments that start with a cold-start or warm-start must have initial bias correction coefficients (the so-called `gdas.tHHz.abias` files) and the `gdas.tHHz.radstat` file. Experts may customize their configurations if they choose to set up without a `gdas.tHHz.radstat` file, thereby initializing the radiance diagnostics; they presumably know what they are doing. There are no updates to documentation as this was never an advertised mode of starting an experiment. 
Closes #1005 commit 219c23b182ef1aee845faad19f4b6c1c0fe817be Author: Rahul Mahajan Date: Wed Feb 1 10:36:25 2023 -0500 Atmosphere cycling with a Coupled model (#1274) This PR enables: - cycling the atmosphere with GSI using the coupled model (S2S) intended for WCDA development - staging of cycled ICs for the coupled/atm-only model (in warm start and cold start) - updated dependencies in the XML to account for coupled model tasks (currently disabled in cycled mode due to inadequacies in the `ocnpost` jobs) This PR also: - disables `GLDAS` by default as it is expected to be deprecated. Enthusiastic developers may turn it `ON` - allows a little flexibility for DA developers by turning ON 3DVar automatically if `nens = 0`. Also in this PR: - `diag_table_da` was updated to include instantaneous ocean fields. When running the model with `APP=ATM`, these entries from the `diag_table_da` are ignored. - `diag_table` was also updated to include coupled fields from `diag_table_cpl`. This is a step towards unifying the `diag_table`. There will be more work done in this area in the near future. - `MOM_input_template_500` is added for the 5 degree ocean configuration. This PR was built on initial work from @guillaumevernieres and @NeilBarton-NOAA @guillaumevernieres provided initial conditions @NeilBarton-NOAA and @guillaumevernieres both provided updates to the scripts that were used in the creation of this PR. The following commands are used to set up and configure the cycled experiment with the S2S model configuration. @guillaumevernieres @NeilBarton-NOAA @DeniseWorthen and @junwang-noaa provided valuable assistance in debugging the coupled model failures. To set up and run a coupled model, cycled with atmosphere test at C48 atmosphere 5 degree ocean/ice resolution: ``` # Setup experiment and COMROT directories. Copy initial conditions to COMROT. 
./setup_expt.py cycled --expdir --comrot --idate 2021032312 --edate 2021032400 --resdet 48 --nens 0 --gfs_cyc 0 --icsdir /scratch1/NCEPDEV/stmp2/Rahul.Mahajan/ICSDIR/C48O500 --start warm --pslot --app S2S # cd into EXPDIR and disable IAU, METP in config.base # Generate XML ./setup_xml.py ``` The test was concluded successfully w/ no failures. The test result can be viewed at ` /scratch1/NCEPDEV/stmp2/Rahul.Mahajan/EXPDIR/prcycs2s` on Hera. The test only performed a 3DVar with the GSI in the atmosphere and no IAU. Including IAU will require additional work and flexibility both in the model code as well as in the workflow. A test with C384 and 0.25 degree ocean/ice model was also configured by obtaining files from @NeilBarton-NOAA and @guillaumevernieres . The forecast model crashed after about 3 hours of integration with an "Out of memory" fault. Additional compute resources might be needed to get that configuration up and running. Controls for "restart", "restart interval", "history" and "history frequencies" are controlled in multiple places for multiple components. A discussion with @junwang-noaa and @DeniseWorthen is ongoing to develop a solution for higher level control of clock related properties in the coupled model. Adopting the current infrastructure built for forecast only prototypes for cycling was very enlightening. Several enhancements could be made to make this system more flexible for the needs of both model development as well as DA development. I have left verbose comments and an extensive array of `TODO` (to be transformed into GH issues), for mapping out future work. commit a9ab1e753808454d713e3863ac34e911fe65bbe5 Author: Kate Friedman Date: Tue Jan 31 16:28:47 2023 -0500 Update obsproc to v1.1.2 (#1293) Update HOMEobsproc path in config.base to new v1.1.2 package version. 
Refs #1291 commit 3cbca7d7ebba77e0b13d201c9f2669a47ccd7c90 Author: Kate Friedman Date: Tue Jan 31 15:27:05 2023 -0500 Updates to RTD documentation based on full review (#1287) Reviewed documentation to correct spelling mistakes and update contents for current state of system. Refs #1272 commit 6b5d92d0f5270b11d52b1ce4d315f9c563108ce5 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Tue Jan 31 12:42:22 2023 -0500 Add PYTHONPATH to soca ocean prep jjob (#1292) Adds the UFSDA location to PYTHONPATH in soca prep jjob so the path building is not done in script. Refs: https://github.com/NOAA-EMC/GDASApp/issues/242 and https://github.com/NOAA-EMC/GDASApp/pull/234 The issues require changes also to GDASApp, but this PR will not break GDASApp as is commit 32b21694ea6bfcc5627196f3924235b4689a1af3 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Mon Jan 30 22:04:29 2023 -0500 Add ocnanalbmat (#1286) This adds the ocnanalbmat task to S2S workflow between ocnanalprep and ocnanalrun. At present this is a stub task that does not actually run the associated jjob, pending more extensive testing Partially addresses https://github.com/NOAA-EMC/GDASApp/issues/263 commit 807b5290570c0633fa83b266757d00495f07853f Author: Jan Macku Date: Tue Jan 31 02:13:19 2023 +0100 Update ShellCheck Action and add differential PyLint Action (#1290) Adds a differential pylint github action and updates the differential shellcheck action version. 
commit 659b462b56726875593eeeb6dd36315b3cb56f92 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Mon Jan 30 19:44:05 2023 +0000 Modify workflow to access files from obs component (#1202) Observations in the current cycled system are linked/copied or created in the `atmos/` subdirectory under ROTDIR. Three files are linked from the EMC dump to the atmos subdirectory: `syndata.tcvitals.tm00` `snogrb_t1534.3072.1536` `seaice.5min.blend.grb` The nsstbufr and prepbufr files will be created in the obs subdirectory. Only the prep jobs should need to access the GDA location to link files to the obs component. All other jobs should use obsproc files under the obs component. Fixes #1198 commit e092e8b32471bbe39123eb928b47124c7b71afa7 Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Fri Jan 27 11:26:52 2023 -0500 Correct issue in linking final restart files (#1285) There was a bug in forecast_predet.sh: when running the cycled model, interval_restart_gfs=0 caused the condition to fail, so the following statements would not run and gmemdir would not be set in forecast_postdet.sh. The condition is deleted to resolve this issue. Fixes #1284 commit 9a8759251e3ed12940a0d9b0891d3fa7135eee5e Author: Rahul Mahajan Date: Thu Jan 26 10:54:02 2023 -0500 Remove execute permissions from config files (#1281) Removes `exec` permissions from `parm/config/config.*` files. These files are not meant to be executed on their own; rather, they are sourced by J-Jobs. 
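The permissions cleanup above amounts to stripping the execute bits from sourced-only config files. A small sketch of that effect against a scratch file (the path and file name are illustrative, not the repo's):

```python
# Create a throwaway config file, give it the old (wrong) executable
# mode, then strip all execute bits the way the cleanup does.
import os
import stat
import tempfile

path = os.path.join(tempfile.mkdtemp(), "config.base")
open(path, "w").close()
os.chmod(path, 0o755)  # simulate the old, wrongly-executable mode

mode = stat.S_IMODE(os.stat(path).st_mode)
os.chmod(path, mode & ~0o111)  # clear user/group/other execute bits

print(oct(stat.S_IMODE(os.stat(path).st_mode)))  # 0o644
```

On the shell side the equivalent one-liner would be a `chmod a-x` over `parm/config/config.*`; sourcing the files with `.`/`source` does not require the execute bit.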
commit 32e9778acb0c6cb771e4386a7200606d2b9e8c7a Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Wed Jan 25 18:21:00 2023 -0500 Make needed updates to run forecast from GEFS (#1203) Since some of the GEFS functions are empty or missing, this PR completes/adds those functions and also allows some variables to accept values from other scripts Refs: #1147 commit b57638936ae4d8c83c3bba86720f61cc4fe3ffe4 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Wed Jan 25 23:12:51 2023 +0000 Remove unnecessary variables which reference nemsio (#1259) Remove unnecessary variables which reference nemsio (part 2 of #601) Wire OUTPUT_FILE as netcdf Replace SUFFIX with wired .nc name Remove OUTPUT_FILE Remove use of affix and replace with wired name "nc" Replace format with wired name "netcdf" Fixes #601 commit 2ecfa880a3f9ab52ff3e7e46373db9e6d9d67b88 Author: Travis Elless <113720457+TravisElless-NOAA@users.noreply.github.com> Date: Wed Jan 25 23:10:52 2023 +0000 Create analysis files for early-cycle EnKF by default (#1237) The upcoming GEFS implementation plans to use analysis files from the early-cycle EnKF to cold start the EPS. This PR allows the early-cycle EnKF members to produce analysis files as the default option. Running in this mode will break early-cycle EnKF forecast functionality - running with this new config.base option set to "NO" and/or a future PR will be needed to restore the early-cycle EnKF forecast job. These changes do not impact the ability of the workflow to cycle successfully. commit 710d1b75dcff1dbb5256f56cbde29f89a21fc95f Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Wed Jan 25 18:08:52 2023 -0500 Don't wipe $DATA before running ocean bmat (#1280) JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT reuses the rundir from JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP and so shouldn't wipe the data before running. 
Necessary for https://github.com/NOAA-EMC/GDASApp/issues/288 commit f78afebc3125dbfb9a8450b47ad5c72a5a6c16a8 Author: Guillaume Vernieres Date: Wed Jan 25 15:11:05 2023 -0500 More marine DA j-jobs (#1270) Addition of 2 j-jobs: - jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY This job currently dumps a few files of the impulse response of B. Not meant to ever go into ops. - jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY This job will eventually generate relevant figures from the marine DA cycle. It currently points to a script that does not yet exist. I also took this PR opportunity to: - update how we load the modules, following what was done for the atmosphere. - modify `ush/load_ufsda_modules.sh` so we can optionally specify what to load (GDAS or EVA); the default is GDAS when no argument is present. - Use POST to copy what's needed in `COM` Fixes #1273 commit fb54bb6606d0959c063aa03ff2182da637d61c3e Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Wed Jan 25 15:08:16 2023 -0500 Update UFS-DA atmospheric prep script to be consistent with GDASApp update (#1265) Updates the GDASApp version to incorporate the changes of https://github.com/NOAA-EMC/GDASApp/pull/278, which restores the ability of `run_jedi_exe.py` to execute UFS-DA applications, specifically `fv3jedi_var.x`. Included in https://github.com/NOAA-EMC/GDASApp/pull/278 is the removal of entry `ufsda.stage` from `ush/ufsda/__init__.py`. Scripts which simply `import ufsda` must now specify the functions to import from `ufsda.stage`. g-w issue #1262 documents the addition of the `ufsda.stage` line to `scripts/exgdas_global_atmos_analysis_prep.py`. This change is required for g-w to successfully stage files used by the var and ensemble UFS-DA applications.
Fixes #1262 commit f98433ff3661ef848d381ae65322623573a114cc Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Mon Jan 23 10:11:39 2023 -0500 Add new jjob for ocean analysis bmat (#1239) Adds jjob for ocean analysis bmat. The intent is to have separate tasks for the ocean analysis and generation of the b matrix. Once this is merged, additional modifications will be made to GDASApp to allow it to be run with the other global-workflow components of ocean/ice DA. Refs: https://github.com/NOAA-EMC/GDASApp/issues/263 commit 5ebad4c88829f0810d05328944c2232273607f65 Author: Kate Friedman Date: Mon Jan 23 09:42:43 2023 -0500 Retire ecf/versions in develop (#1267) NCO confirmed the `ecf/versions` folder and version files within are no longer needed. All versions come from the `versions/` folder version files. Refs #1205 commit 6713eaeb829df2cffbc0c959d17b1f14033d783c Author: Rahul Mahajan Date: Mon Jan 23 09:32:31 2023 -0500 Deploy documentation to RTD (#1264) Adds a GitHub action that will build and deploy documentation as artifacts with every push to develop. The artifacts can be downloaded from Actions -> Documentation -> Artifacts Refs: #9 commit f90058590872d824d5da2a1ed263d738d9b3fb62 Author: Rahul Mahajan Date: Fri Jan 20 21:47:35 2023 -0500 Temporarily disable failing pytest (#1263) Disables the test that fails in the GH runner but passes on `localhost`. Once I figure out why it is failing in the runner, it will be re-enabled. commit 2e350d6868b2b0a0b7ba474dd1873dbae462665b Author: Kate Friedman Date: Fri Jan 20 14:01:46 2023 -0500 Remove incorrect/misleading comments in config.base (#1261) Remove incorrect comments in config.base.emc.dyn and config.base.nco.static Refs #1260 commit 9fc82753857e3eb97fc91ded0b1c46c8cd51c912 Author: arun chawla <49994787+arunchawla-NOAA@users.noreply.github.com> Date: Fri Jan 20 13:55:12 2023 -0500 Add initial Sphinx documentation (#1258) Adds Sphinx documentation in the docs directory.
The documentation reflects the current information in the wiki. Future commits will automate the rendering of the docs to readthedocs and add Sphinx parsing of script documentation. Refs: #9 commit 78509d6da4547124945265398b6e371210010987 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu Jan 19 22:08:21 2023 +0000 Remove nemsio support (#1255) Removes remaining nemsio support from scripts. Now-unnecessary variables will be removed in a follow-up PR. Refs: #601 commit 62944fbc00869dd84b03630ad65bf993a7524752 Author: Walter Kolczynski - NOAA Date: Sat Jan 14 19:32:29 2023 -0500 Increase wallclock for diag jobs (#1216) Diag jobs were failing due to insufficient wall clock, so the wall clock is increased until a more complete review of the resources can be completed. Refs #1215 commit 67d17218b5ea5f491f8115c61d546c11aab2b546 Author: Walter Kolczynski - NOAA Date: Sat Jan 14 19:31:59 2023 -0500 Use correct resources for GFS gempak (#1214) The GFS gempak job was using the GDAS settings instead of switching to the GFS settings for `npe` and `npe_node`. This is part of a package of PRs that were tested together as part of a j-job refactor. Fixes: #1213 commit ca5baebed1cefae3d586b6c6fa04f46166b5e876 Author: Walter Kolczynski - NOAA Date: Fri Jan 13 19:12:05 2023 -0500 Abstract common j-job tasks (#1230) Takes all of the tasks that are common to all j-jobs and abstracts them out into a shared script that is sourced by each job:
- Set and create $DATA directory
- Call setpdy and set $cycle
- Set pid, pgmout, and pgmerr
- Source config files
- Source machine environment file
The common j-job header script is called by passing the job name for the `${machine}.env` files using the `-e` option, and a list of config files to source with the `-c` option.
```
${HOMEgfs}/ush/jjob_header.sh -e [env_job] -c "[config1 [config2 [...]]]"
```
The job name argument (`-e`) is mandatory; the config list is optional but recommended to always use as well.
Some pre j-job rocoto entry scripts (`jobs/rocoto/*`) are currently doing much more than they should be. These sometimes required extra finagling, usually pre-calling the j-job header in the rocoto script before it does something. Refs: #1069 commit e8d47783100c51d96e988dc936258f185d502269 Author: Walter Kolczynski - NOAA Date: Fri Jan 13 19:08:32 2023 -0500 Add missing mkgfsawps.x link (#1218) The mkgfsawps.x and overgridid.x executables were not being linked by the link script, but they are necessary to run AWIPS jobs. This fix alone is not sufficient to restore AWIPS functionality. Fixes #1217 commit 4e2e01a5ae8e3fa45e04055af235834684711b00 Author: Walter Kolczynski - NOAA Date: Fri Jan 13 19:07:01 2023 -0500 Fix post sounding job (#1212) A few minor errors were blocking the sounding job from completing successfully: - CFP APRUN command was using an incorrect number of PEs - Updates were needed to run on Orion. Removed the machine-specific code for Hera and Jet so it runs the same on all machines, while refactoring it to use a C-style loop. - Bash number-base errors (see also #1195), which are corrected in the same C-style loop refactor. This is part of a package of PRs that were tested together as part of a j-job refactor. Refs #1195 Fixes #1211 commit 469cc7bf9278e64330b7d35ad8a349bcb40848cc Author: Walter Kolczynski - NOAA Date: Fri Jan 13 14:57:52 2023 -0500 Revert "Use fracoro data for all new UFS applications (#1182)" (#1240) This reverts commit 1f258e43ae04acc9d16953793b67769bb53abc27. commit 1f258e43ae04acc9d16953793b67769bb53abc27 Author: ChunxiZhang-NOAA <49283036+ChunxiZhang-NOAA@users.noreply.github.com> Date: Fri Jan 13 14:42:26 2023 -0500 Use fracoro data for all new UFS applications (#1182) The new fracoro data should be used for all new UFS applications regardless of whether frac_grid is used. Most problems in Issue [#863](https://github.com/NOAA-EMC/global-workflow/issues/863) have been resolved.
However, one problem remains: the latest fix, mask, and oro datasets (fracoro) created by Shan/Mike/Helin should work for both fractional and non-fractional grids. This is related to the changes in UFS_UTILS; PR [#741](https://github.com/ufs-community/UFS_UTILS/pull/741) in UFS_UTILS has been created. Fixes #863 commit 55667e958a8ebf667ed18a13283b95e125af13d8 Author: Walter Kolczynski - NOAA Date: Thu Jan 12 13:15:23 2023 -0500 Revert "Merge GFS v16.3 operational GSI changes into develop branch. (#1158)" (#1238) Reverts #1158 due to crtm version mismatches it creates that can't be immediately corrected. commit e3f351f6fc5b9dc6465ba7faee3920929178f80e Author: Guillaume Vernieres Date: Thu Jan 12 00:46:47 2023 -0500 Add more user defined parameters for the marine DA (#1235) This is the companion PR to [GDASApp PR#270](https://github.com/NOAA-EMC/GDASApp/pull/270) and needs to be merged before I can trigger the GDASApp `CI` (something that we will change eventually ...). See [GDASApp issue #269](https://github.com/NOAA-EMC/GDASApp/issues/269) commit 2ba4f7e8e17ab88215d4c9cf5744ee7c4a119167 Author: Rahul Mahajan Date: Thu Jan 12 00:18:44 2023 -0500 Update pytests action version and run sequentially (#1236) @aerorahul and @guillaumevernieres noticed that some pytests were failing abruptly in the CI. Manually booting them fixed the issue. However, we want those tests to run automatically. This PR executes the pytests for the different versions of python in the matrix sequentially. It is hypothesized (not confirmed) that the pytest artifacts were being clobbered due to parallel execution. This PR also updates the actions to a later version.
This eliminates certain deprecation warnings. commit ddc86880ec54c01abced62e3d06c441aeee90b68 Author: Rahul Mahajan Date: Tue Jan 10 21:42:10 2023 -0500 Add utility to compare Fortran namelists (#1234) Oftentimes it is necessary to compare Fortran namelists between a UFS-weather-model regression test and a global-workflow experiment, or in other example applications. This PR adds a simple utility that loads two namelists and spits out the differences between them. The differences are calculated as a departure from the first namelist. This utility leverages `f90nml` (approved for use on WCOSS2). The usage is as follows:
```
❯❯❯ python3 compare_f90nml.py -h
usage: compare_f90nml.py [-h] [-r] left_namelist right_namelist

Compare two Fortran namelists and display differences (left_namelist - right_namelist)

positional arguments:
  left_namelist   Left namelist to compare
  right_namelist  Right namelist to compare

options:
  -h, --help     show this help message and exit
  -r, --reverse  reverse diff (right_namelist - left_namelist) (default: False)
```
The comparison is done as follows:
- Both namelists are loaded.
- We loop over the keys of `left_namelist` and look for the same key in `right_namelist`. If the key is found, the values are compared. If the key is not found, a note is made that the key is undefined in `right_namelist`.
- Differences in the values are printed to screen.
- The `-r | --reverse` option reverses the namelists. This allows the user to use `right_namelist` as the reference.
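The key-walking comparison described above can be sketched in plain Python. This is a simplified illustration over nested dicts, not the actual `compare_f90nml.py`; in the real utility, `f90nml` parses the namelist files into such group/key mappings.

```python
# Simplified sketch of the namelist comparison logic: parsed namelists
# behave like nested dicts {group: {key: value}}, so the diff walks the
# left namelist and records values that differ or are missing on the right.
def nml_diff(left, right):
    """Return {group: {key: (left_val, right_val)}} for differing entries."""
    diffs = {}
    for group, keys in left.items():
        rgroup = right.get(group, {})
        for key, lval in keys.items():
            rval = rgroup.get(key, "UNDEFINED in right_namelist")
            if lval != rval:
                diffs.setdefault(group, {})[key] = (lval, rval)
    return diffs

# values taken from the control_p8 / cpld_control_p8 example below
left = {"fv_core_nml": {"dnats": 0}, "fms_nml": {"domains_stack_size": 3000000}}
right = {"fv_core_nml": {"dnats": 2}, "fms_nml": {"domains_stack_size": 3000000}}
print(nml_diff(left, right))  # {'fv_core_nml': {'dnats': (0, 2)}}
```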
If differences are found, they are shown as follows (examples of `input.nml` from the `control_p8` and `cpld_control_p8` regression tests of the ufs-weather-model):
```
❯❯❯ python3 compare_f90nml.py control_p8.nml cpld_control_p8.nml
comparing: control_p8.nml | cpld_control_p8.nml
-----------------------------------------------
atmos_model_nml:
  ccpp_suite : FV3_GFS_v17_p8 | FV3_GFS_v17_coupled_p8
fms_nml:
  domains_stack_size : 3000000 | 8000000
fv_core_nml:
  dnats : 0 | 2
gfs_physics_nml:
  min_seaice : 0.15 | 1e-06
  use_cice_alb : False | True
  nstf_name : [2, 1, 0, 0, 0] | [2, 0, 0, 0, 0]
  cplchm : False | True
  cplflx : False | True
  cplice : False | True
  cplwav : False | True
  cplwav2atm : False | True
```
commit 721e8ae03275501e57bfac999943a426023c19c2 Author: Rahul Mahajan Date: Mon Jan 9 11:49:57 2023 -0500 Updates for pygw (#1231) A few minor updates to `pygw`:
- Create a `runtime_config` in the `task.py` base class to store runtime attributes such as `PDY`, `cyc`, `DATA`, etc.
- Improved logging decorator for debugging.
- Allow `cp` to take `target` as a directory. The file from `source` will be copied to the `target` under the same name. This is the default unix `cp` behavior.
- Ignore `vscode` files in the project.
- As a bonus, documentation is added to demonstrate usage.
commit d07db2b4db4bb0c9eb33a557217f7f479dfcc6f9 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Mon Jan 9 15:35:11 2023 +0000 Merge GFS v16.3 operational GSI changes into develop branch.
(#1158) Changes were made in the GSI scripts for the operational system that need to be brought back to development: - Remove hard-wired rCDUMP in EnKF forecast job - Turn on `cao_check` and `ta2tb` - Update cloud coefficient fix file Fixes #1148 commit 79e564a7f073c0f128b63c2100d1773dbc59085b Author: Rahul Mahajan Date: Mon Jan 9 10:33:02 2023 -0500 Move member up in directory hierarchy (#1201) Moves the member directory one level higher in the directory hierarchy in anticipation of additional components being run for the members in addition to atmos. This results in COM directories that are arranged as `memXXX/atmos` instead of `atmos/memXXX`. This PR also adds a "hack" to allow the `gdaspostanl` job to succeed in the first half cycle (see #1034). The "hack" will be removed with the refactoring of the post jobs in the (near) future. Fixes #1196 Refs #1034 commit cf1b3281f66409ee090e22a184758c12c2d6c8e8 Author: Rahul Mahajan Date: Tue Dec 20 10:28:14 2022 -0500 Enable staging ics for cycled experiments. (#1199) commit 8b39403e2683b1d16186c90700ddfb124b73af1e Author: Rahul Mahajan Date: Tue Dec 20 10:11:02 2022 -0500 Add tests for configuration.py (#1192) Add tests for configuration.py; update README.md to illustrate to developers how to run tests locally and manually; add .gitignore to pygw commit 3e240bb8fe8880f31af591d7829b5051506a6485 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Mon Dec 19 11:35:07 2022 -0500 Replace ocnanal_${CDATE}} with ${RUN}ocnanal_${cyc} (#1191) Replace ocnanal_${CDATE}} with ${RUN}ocnanal_${cyc} as both `$RUN` and `$cyc` are available at the beginning of a job.
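The `memXXX/atmos` layout change above can be illustrated with a small path helper. This is purely schematic: the function name, arguments, and `RUN.PDY/cyc` naming here are invented for illustration and are not the workflow's actual code.

```python
import os

# Schematic sketch of the new COM hierarchy described above: the member
# directory now sits ABOVE the component directory (memXXX/atmos), rather
# than below it (atmos/memXXX).
def com_dir(rotdir, run, pdy, cyc, mem, component="atmos"):
    """Build a member COM path with the member above the component."""
    return os.path.join(rotdir, f"{run}.{pdy}", cyc, f"mem{mem:03d}", component)

print(com_dir("/rotdir", "enkfgdas", "20230101", "00", 1))
# /rotdir/enkfgdas.20230101/00/mem001/atmos
```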
commit 6d33752bd8b54bd35355dda7413afbe9bb52a5b4 Author: Rahul Mahajan Date: Mon Dec 19 10:45:36 2022 -0500 define NET and RUN in the Rocoto XML to accurately mimic the ecf in ecflow (#1193) commit 8581eacf1a26efba614a1c04715fbc24cb7ae858 Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Sun Dec 18 00:01:48 2022 -0500 Fix checking for restart files (#1186) Undoes the portion of PR #1179 that caused a new bug while attempting to fix #1140, without removing the linter fixes. Instead `/dev/null` is 'searched' if `${RSTDIR_ATM}` is not defined. That situation will always result in zero files found, ensuring a rerun is not triggered. Fixes #1140 Fixes #1185 Moots #1190 commit 9be5c41842d1a7632abc86db875f04c54d2926dd Author: mdtoyNOAA <73618848+mdtoyNOAA@users.noreply.github.com> Date: Fri Dec 16 12:27:20 2022 -0700 Fix 'DEBUG' option in build_ufs.sh (#1188) commit 98545daf3c7383f145ef9d8de5d1a6657b086ea1 Author: Kate Friedman Date: Thu Dec 15 13:21:42 2022 -0500 Update archive job memory request value for R&Ds (#1183) - Revert memory request back to 2048M on R&Ds. - Keep 50GB setting for WCOSS2. Refs #1144 commit a18efae579f7079d99155d63d3f5a89c9514c6b7 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu Dec 15 09:30:08 2022 -0500 Reorder post so all flux files are generated when running offline (#1181) Post would fail when running without inline post because it would attempt to create 1p00 flux files before the flux file had been generated. `inter_flux.sh` also had to be updated to point at the correct file for offline post, which is now the same as inline. Fixes #1157 commit 5b454c31204b973a5472aeb3518f1a48ef102284 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu Dec 15 09:23:37 2022 -0500 Stop checking for restarts on non-GFS CDUMPs (#1179) The forecast was attempting to check `RSTDIR_ATM` for restart files, but that variable is not defined for gdas, resulting in an unbound variable error. 
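The `/dev/null` fallback from #1186 above can be rendered in Python (the original is a shell idiom along the lines of `${RSTDIR_ATM:-/dev/null}`; this translation is illustrative only):

```python
import glob
import os

# When the restart directory variable is undefined, "search" /dev/null
# instead; globbing under /dev/null can never match, so zero files are
# found and no rerun is triggered.
restart_dir = os.environ.get("RSTDIR_ATM", "/dev/null")
restart_files = glob.glob(os.path.join(restart_dir, "*.nc"))
# zero matches when RSTDIR_ATM is unset => rerun check safely evaluates false
```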
Because the error occurred in a subshell, the error had no impact on the execution, just an unwanted error message in the log. Now that directory is only checked when `$CDUMP` is 'gfs'. Also fixes some linter warnings. Fixes #1140 commit f2ecf3ab314451cd5522dc30a913e61ea59ea3a4 Author: Rahul Mahajan Date: Thu Dec 15 08:14:13 2022 -0500 Add missing jobids in some pre-job scripts (#1176) Some of the pre-job scripts (`jobs/rocoto/*`) were not defining `job` and `jobid` before calling the j-job. Also refactors the archive j-jobs to match the new j-job structure. Fixes #1169 commit 62b2590cdda9a6558f9bf46c3aa61bdb40007e37 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu Dec 15 08:10:28 2022 -0500 Remove existing directory if it exists when getic runs (#1165) If getic is run with an existing `$ROTDIR` (possibly because it failed the first time), it would fail because the directory already exists. Now any existing directory is deleted before attempting to move the ICs. Fixes #1156 commit b6da693c8a016fcfb3aacd6e100d153bb264ba30 Author: Rahul Mahajan Date: Wed Dec 14 11:44:33 2022 -0500 Add logging decorator, test and test for yaml_file (#1178) Add a logging decorator (logit) and its associated pytest; add a pytest for yaml_file commit bf06289c280bafd557c9a9d34ce77d8d18fd2ff8 Author: Rahul Mahajan Date: Tue Dec 13 10:38:54 2022 -0500 fix coding norm check in `hosts.py` (#1174) `pynorms` test was trapping an error in `hosts.py` on the conditional for the singularity container, which is now fixed. Also start ignoring git hidden directories from pynorms. commit 3e53e061dbf4fcf91bf39c5bd87b7ef31e806d55 Author: Guillaume Vernieres Date: Mon Dec 12 20:05:03 2022 -0500 Fix some bugs and make other changes so ctest in GDASApp works (#1172) Fix some bugs and make other changes so ctest in GDASApp works. `$CDATE` is used instead of `$cyc` for the time being, since `cyc` isn't defined before that point in the scripts.
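A logging decorator along the lines of the pygw `logit` mentioned above might look like the following sketch (illustrative only; the actual pygw implementation differs):

```python
import functools
import logging

def logit(logger, name=None):
    """Decorator factory: log entry and exit of the wrapped callable."""
    def decorator(func):
        label = name or func.__name__
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logger.debug("BEGIN: %s", label)   # entry marker for debugging
            result = func(*args, **kwargs)
            logger.debug("END: %s", label)     # exit marker
            return result
        return wrapper
    return decorator

logger = logging.getLogger("pygw")

@logit(logger)
def add(a, b):
    return a + b

print(add(2, 3))  # 5
```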
Fix some lines where multiple arguments were enclosed in quotation marks when they should be separate. Update `config.ocnanal` to template out some variables and add them to the defaults yaml. Fixes #1164 Fixes #1173 Companion PR to [GDASApp/PR234](https://github.com/NOAA-EMC/GDASApp/pull/234) See also [GDASApp/232](https://github.com/NOAA-EMC/GDASApp/issues/232) commit 5a748ee99198e7575862f6f05297a1599470d61d Author: Guillaume Vernieres Date: Mon Dec 12 16:02:47 2022 -0500 Support for the GDASApp testing in containers (#1151) Adds support for running the GDASApp portion of global-workflow in a container. `CONTAINER.env` is quite minimalist at this point; it could certainly be cleaned up, but it will need to be expanded later anyway to handle additional jobs. Fixes #1234 commit 3085dfc349098436b4b64b11cbcf0fe2b631011f Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Mon Dec 12 15:58:34 2022 -0500 ATM 3DVAR with and without IAU (#1113) This is the first of a series of PRs to address issue https://github.com/NOAA-EMC/global-workflow/issues/947 (cycling the S2SW app with 3DVAR). Using the ATM app, 3DVAR with IAU did not cycle out-of-the-box. This PR includes changes to successfully cycle ATM with 3DVAR with and without IAU. Refs: #947 commit dda2edbcf895a123ff5f4b1e77b28dd88044a24c Author: Rahul Mahajan Date: Mon Dec 12 15:56:49 2022 -0500 Enable checking for python norms and fix violating code (#1168) This PR: - enables checking coding norms on python scripts using `pycodestyle` via github actions - adds the rules of the norms in `.pycodestyle` - fixes python scripts that violate the norms using `autopep8`.
If a python script is failing during norm check, it can be fixed with the following command:
```
$> autopep8 -i --max-line-length 160 --ignore E402,W504 /path/to/offending_script.py
```
commit 6274c27f43bd2bb5a01142e5424d7ff7e1663ead Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Mon Dec 12 15:52:36 2022 -0500 Enforce decimal math in atmos post (#1171) `fhr` was still being treated as an octal in some places of atmos post. Instances where `fhr` is used for math are now updated to `d_fhr`, which is always a decimal representation of the `fhr` string. Fixes #1141 commit f19d795061fad696ada891e21df4db5de0e5007a Author: Rahul Mahajan Date: Mon Dec 12 15:48:42 2022 -0500 Update marine DA j-jobs to new format (#1149) The new marine DA jobs were added (PR #1134) roughly concurrently with the refactoring of the j-job scripts (PR #1120), and thus didn't pick up the changes there. These new jobs are now updated. commit 287f52c8943d639d5f49a7d73faeedcc003ca685 Author: Cory Martin Date: Mon Dec 12 07:03:37 2022 +0000 Add utility to manipulate files en masse (#1166) This PR adds the ability to pass in a configuration and manipulate files in bulk. Currently it just works for `copy` and `mkdir`. A sample YAML file:
```
mkdir:
- ${CRMTEST}/output
copy:
- - ${CRMTEST}/fakefile1
  - ${CRMTEST}/output/fakefile1
- - ${CRMTEST}/fakefile2
  - ${CRMTEST}/output/fakefile2
```
To use it in a python script:
```
from pygw.utils import FileHandler
FileHandler(path='/export/emc-lw-cmartin/cmartin/work/tmp/test.yaml').sync()
```
commit c59c0d8c6b3dbca4902b48cdc7f86afc3d6426b3 Author: Rahul Mahajan Date: Fri Dec 9 17:24:55 2022 -0500 add action to run pytests (#1167) This PR adds a GitHub action that runs pytests on the python code within pygw. The package is tested with python versions 3.7, 3.8, 3.9, and 3.10. After this PR, tests will be required with every new feature added to `pygw`.
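The bulk file manipulation from #1166 above can be sketched as follows. This is a hypothetical simplification, not the actual pygw class: it takes an already-parsed dict rather than a YAML path, and only supports the `mkdir` and `copy` actions.

```python
import os
import shutil
import tempfile

class FileHandler:
    """Sketch: apply a {"mkdir": [...], "copy": [[src, dst], ...]} config."""
    def __init__(self, config):
        self.config = config

    def sync(self):
        # create all requested directories first, then copy the pairs
        for directory in self.config.get("mkdir", []):
            os.makedirs(directory, exist_ok=True)
        for source, destination in self.config.get("copy", []):
            shutil.copyfile(source, destination)

# usage demo in a scratch directory
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "fakefile1")
    with open(src, "w") as f:
        f.write("data")
    FileHandler({"mkdir": [os.path.join(tmp, "output")],
                 "copy": [[src, os.path.join(tmp, "output", "fakefile1")]]}).sync()
    copied = os.path.isfile(os.path.join(tmp, "output", "fakefile1"))
```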
commit ce05e203373bf7dc509bba567f2af73d702e9840 Author: Jan Macku Date: Fri Dec 9 23:24:09 2022 +0100 Pin `differential-shellcheck` to `v3` tag (#1162) Pins the differential-shellcheck to v3 instead of using latest to avoid API changes breaking existing CI. Also removes unneeded write permissions. commit cfde4e7f5e087441f5a6991aaf6dc4b23a9b06d0 Author: Rahul Mahajan Date: Thu Dec 8 18:11:45 2022 -0500 Add a task base class and basic logger (#1160) - Adds a very basic base class `Task` that will be used when creating new tasks or refactoring existing tasks. - Adds a very basic logger that can write output to stdout as well as a file. - Adds a test for Logger commit cab4fafaa3afcc11cfc3a3509c4bf3e31faa7df8 Author: Walter Kolczynski - NOAA Date: Wed Dec 7 16:46:51 2022 -0500 Recursively convert dict to AttrDict when making an AttrDict (#1154) Dicts within an AttrDict were not being converted to AttrDicts at creation/update time. This meant that (among other things) when updating, nested dicts of the same name would be overwritten instead of merged. commit 21056c1a748f48c4b08c6c5bc1fcebcdd0d10192 Author: Rahul Mahajan Date: Wed Dec 7 16:13:43 2022 -0500 move configuration.py to pygw. Use it from there. return AttrDict after sourcing configs (#1153) commit c961e9a44084d6bfc7042bd8e2190bcf9091d256 Author: Guillaume Vernieres Date: Tue Dec 6 11:21:36 2022 -0500 JEDI based Marine DA tasks (#1134) This PR adds the relevant scripts/steps to exercise the JEDI based marine DA eventually needed for GFSv17. See issue #1072 for more details. The branch used for this PR was extracted from a ["rogue" branch](https://github.com/guillaumevernieres/global-workflow/tree/feature/add-soca) that included changes to the model related scripts to allow for WCDA cycling. The testing was done with the coupled `C48/5deg` atmos/ocean model, using the `GSI` for the atmos initialization and the JEDI based marine DA system (SOCA) for the ocean initialization.
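The recursive conversion described in the AttrDict change above (#1154) can be sketched as follows. This is a simplified illustration; the real pygw AttrDict differs in detail.

```python
class AttrDict(dict):
    """dict subclass exposing keys as attributes, converting nested dicts."""
    def __init__(self, mapping=None, **kwargs):
        super().__init__()
        for key, value in {**(mapping or {}), **kwargs}.items():
            self[key] = value

    def __setitem__(self, key, value):
        # recursive conversion: plain nested dicts become AttrDicts at
        # creation/update time, so attribute access works at every level
        if isinstance(value, dict) and not isinstance(value, AttrDict):
            value = AttrDict(value)
        super().__setitem__(key, value)

    __getattr__ = dict.__getitem__
    __setattr__ = __setitem__

cfg = AttrDict({"coupled": {"ocean": {"model": "MOM6"}}})
print(cfg.coupled.ocean.model)  # MOM6
```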
It is probably step 1 of at least 2 steps to get this work to a functional state in develop. I just don't know yet whose responsibility it is to fix the coupled model scripting to make it work in cycled mode. #### A few notes: - Copying some of the `SOCA` fixed files has yet to be coordinated: an environment variable in `config.ocnanal` is pointing to `/scratch2/NCEPDEV/ocean/Guillaume.Vernieres/data/static/72x35x25/`. This needs to be copied elsewhere. - This work as-is does not cycle and will require more work on the model side, as noted above. - The testing of that ["rogue" branch](https://github.com/guillaumevernieres/global-workflow/tree/feature/add-soca), from which this PR is derived, was only done on `Hera`. Fixes #1072 commit 7971a5f6dcca8740fe20d36680a2fe0a479c9b1b Author: Rahul Mahajan Date: Tue Dec 6 11:05:38 2022 -0500 Allow customizations based on user/configuration (#1146) * allow for modifying any config file from a higher yaml commit 6ab858ea1e5f7827026655007617730c6bc9eda6 Author: Rahul Mahajan Date: Tue Dec 6 11:04:07 2022 -0500 First step towards making j-jobs consistent in use from ecflow and rocoto (#1120) Several jobs run in development mode do not adhere to EE2 standards. Sometimes these development jobs do get used for operations, at which time there is a rush to standardize them. These jobs are also used to run parallel retrospectives and near-real time parallels. This work is performed with GFSv17 and GEFSv13 as the target implementations. This PR attempts to standardize the use of several variables as defined in the NCO EE2 standards document. Specifically, the following are defined in the job-card: `DATAROOT` `cyc` `HOMEgfs` `job` `jobid=$job/$$` or `jobid=$PBS_JOBNAME/$PBS_JOBID` depending on Rocoto/Ops. The following are defined in the j-job: `pid` `DATA=DATAROOT/jobid` `cycle` This PR also fixes a bug in `exgfs_atmos_post.sh` that is only encountered when `WAFSF=YES` and in the `gdaspostanl` job.
An overzealous linter added extra white-space and put quotes around an argument for `wgrib2`. commit 540b8a6eb43e3d3e4f07b1ed3719382a43000265 Author: Rahul Mahajan Date: Fri Dec 2 10:04:21 2022 -0500 enable APP=S2SWA on WCOSS2 (#1142) commit e2c29749ffe4050ce4e0273ef4802b8458bbc59c Author: Walter Kolczynski - NOAA Date: Wed Nov 30 15:47:29 2022 -0500 Fix typo in .shellcheckrc commit 17c50577989440b2fb680670a02aa96ab6b62588 Author: Kate Friedman Date: Wed Nov 30 15:40:09 2022 -0500 Remove prod_envir module load from WCOSS2 (#1138) The prod_envir module load in module_base.wcoss2.lua is not needed and is being removed. commit 4e3335b269754340521f435705fb3cd840638a79 Author: Rahul Mahajan Date: Wed Nov 30 14:10:00 2022 -0500 Link staged GSI fix files instead of cloning them from gerrit (#1132) * disable cloning GSI recursively. Link GSI fix files from central space * add `gsi_ver` variable in fix.ver Refs #1128 commit cc2f94b82199897bcc869805644a01e39f71fac3 Author: Kate Friedman Date: Tue Nov 29 12:50:12 2022 -0500 Address shellcheck warnings in env files (#1136) * Update env files for shellcheck warnings: SC2004, SC2034, SC2086, SC2166, SC2250, SC2292 * Remove outdated and unneeded gfs.ver from env folder Refs #397 commit 2aa654cab3c91696b92463ee933a2e7d090a879f Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Mon Nov 28 23:11:22 2022 -0500 Adds group size and nmem for GEFS (#1127) Adds additional settings for ensemble forecast group size and number of members since gfs CDUMP (GEFS) will have different values than GDAS. 
NMEM_EFCS sets the number of members for GEFS (default: 30); NMEM_EFCSGRP_GFS sets the number of members per job for GEFS (default: 1). Fixes #1117 commit 9955a2614789ddac0ea521ad922b0acaad22ef20 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Tue Nov 22 16:29:35 2022 -0500 Remove unnecessary sCDATE assignment in forecast_predet.sh (#1133) `ush/forecast_predet.sh` contains an unnecessary sCDATE assignment. This line is therefore safe to remove. Fixes #956 commit 5df144d9fca642a80c48e3b14216599d0062e428 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Tue Nov 22 16:14:37 2022 -0500 Convert archive jobs to proper j-jobs (#1115) Updates the archive jobs to the standard j-job/exscript format. Fixes: #1051 Refs: #720 commit 3a9c27a43d912707f338552a41a380c41f640180 Author: Rahul Mahajan Date: Tue Nov 22 16:01:32 2022 -0500 Update C48 forecast to run with one thread (#1131) C48 forecasts currently fail on WCOSS2 when running with two threads. Two threads are likely not really needed, so the forecast job is reduced to single-threaded on all platforms, at least until the problem can be corrected. Also moves the block capping the write block size to the node size outside the resolution-specific block, since that is needed for all resolutions. Fixes #1129 commit 6e60e1d573ba78f54378cd27654eede3f8765cd8 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Thu Nov 17 16:02:50 2022 -0500 Improved error messages from atmos analysis (#1125) The logical test used to set gsi namelist variable `nhr_obsbin` was not consistent with other checks in the script which use variable `l4densvar`. This relocates the setting for `nhr_obsbin` to be consistent with other tests using `l4densvar`, which will allow the appropriate error messages to be output by the executable.
Fixes #1123 Fixes [NCO bugzilla #1196](http://www2.spa.ncep.noaa.gov/bugzilla/show_bug.cgi?id=1196) commit 87f44e5225e92076df76297ba98247be83d8bfe6 Author: Kate Friedman Date: Thu Nov 17 15:36:26 2022 -0500 Update MODULEPATH for Orion (#1126) New hpc-stack location on Orion managed by Hang Lei: /apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/stack Refs #1119 commit 2e6883678a136007f0f3bc77334acb9b8478f3ce Author: Kate Friedman Date: Thu Nov 17 13:45:32 2022 -0500 MPMD variable updates and fix (#1124) * Correct wave_mpmd setting in JGLOBAL_WAVE_INIT * Update Jet and S4 env files for mpmd_opt * Rename mpmd to mpmd_opt in WCOSS2 env file commit 196e928a4486ce171393a1033e68cd115b428634 Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Thu Nov 17 10:48:07 2022 -0500 Introduce FHMAX_ENKF_GFS to extend ensemble forecast capabilities (#1122) Allow running ensemble forecasts in the early cycle (`CDUMP=gfs`) to an arbitrary forecast length controlled by `FHMAX_ENKF_GFS` commit 53706b737dae603d02d4229cf4160b6f2ebd9afc Author: Kate Friedman Date: Tue Nov 15 09:14:48 2022 -0500 Update R&D launcher commands for tasks and multi-prog (#1112) * Update multi-prog in HERA.env and ORION.env * Update launcher commands in HERA.env and ORION.env * Adjust C96 & C48 eobs resources in config.resources Refs #1060 commit f64f9e50a809e99b3edab91476c31dc9415d64cd Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Thu Nov 10 13:57:18 2022 -0500 Correct crtm path in UFS DA atmospheric analysis scripts (#1111) UFS DA unified forward operator (UFO) validation uses `crtm/2.3.0` for radiances. UFS DA scripts which exercise UFO for radiance assimilation should use the same CRTM coefficients. UFS DA scripts `exgdas_global_atmos_analysis_run.sh` and `exgdas_global_atmos_ensanal_run.sh` currently use CRTM coefficients from `crtm/2.3.0_jedi`. This is not correct.
The CRTM coefficient path for the two UFS DA analysis scripts in question has been updated in [`feature/ufsda_crtm`.](https://github.com/NOAA-EMC/global-workflow/tree/feature/ufsda_crtm) Fixes #1110 commit 51488b4c25c2827d43787be226e5c9c8d12a3c13 Author: Walter Kolczynski - NOAA Date: Wed Nov 2 16:10:16 2022 -0400 Correct syntax in remaining sorc scripts (#1105) Updates remaining shell scripts in sorc directory to pass shellcheck without any warnings. Refs: #397 commit a9aaa580745404a225a5e5696a04e4555f04e484 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Nov 1 17:03:08 2022 -0400 Add GSI background error covariance as an option for UFS DA variational assimilation (#1104) Adds the ability to run fv3jedi_var.x with the GSI static B. commit c5fc2b75cc16c18aa250b3cc8e8df3a08c26bb0e Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Mon Oct 31 10:36:10 2022 -0400 Add Early Cycle EnKF workflow (#1022) Updates workflow to allow early cycle EnKF. Refs: #1021 commit 52100d247a73fa49f27df6d218f47c39035b25c5 Author: Kate Friedman Date: Thu Oct 27 15:33:34 2022 -0400 Correct errors with gdas and monitoring symlinks (#1101) Refs #1100 commit a8e88910df32c4733ebfbdad5bc26c40996d25e5 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Thu Oct 27 09:22:11 2022 -0400 Fixed gfs-utils links (#1099) Refs #1098 commit 2bc180b8424adaf5738974f5d3597da0f3560868 Author: Walter Kolczynski - NOAA Date: Wed Oct 26 10:00:48 2022 -0400 Fix build scripts and bring into compliance (#1096) Adds module-setup calls and updates gfs-utils to reset modules when building. The gfs-utils update also fixes the build issue on Orion by updating the stack version to 1.2. Brings all build scripts into full shellcheck compliance. Also removes the workflow utils modulefiles since they are no longer needed after moving those programs to gfs-utils. 
Fixes #1093 commit c8c6994d60a8f4ee56f226e4d45738961d52ff6a Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Oct 25 18:35:21 2022 -0400 Feature/updates for gdas app (#1091) Updates the GDASapp version to a newer version that builds correctly. The former version no longer builds because submodules point to develop branches that are no longer compatible. Moves module loads out of the j-jobs and into their appropriate place in the rocoto job. A new load module script is added to handle the different module set needed for UFSDA. Also temporarily turns off strict mode for the UFSDA jobs to avoid a PS1 unbound error in conda. Fixes #1043 Fixes #1067 commit 26c23be78a87806b081359f4647fff7a3f423981 Author: Kate Friedman Date: Tue Oct 25 15:06:43 2022 -0400 Change GLDAS USE_CFP to NO on Hera (#1094) Refs #1089 commit 5c03697b1c850919b88b6a68afeea058674b5742 Author: Kate Friedman Date: Mon Oct 24 14:16:21 2022 -0400 Resource updates to support WCOSS2 (#1070) * Add WCOSS2 BASE_CPLIC to config.coupled_ic * Remove errant ) from WCOSS2 hosts file for COMINsyn * Updates to config.fv3 from WCOSS2 testing * Add "is_exclusive" setting to config.resources * Add WCOSS2 to machine npe_node_max check in config.resources * Update resources in config.resources * Update build_ufs.sh to set S2SW as default app on WCOSS2 * Set hpssarch to NO by default on WCOSS2 Refs #419 commit d01de06d54cba36cd688e4b88cb452015a493185 Author: Walter Kolczynski - NOAA Date: Mon Oct 24 13:37:24 2022 -0400 Set COMPILER in link for detect machine (#1092) The new detect_machine script in gfs-utils requires COMPILER be set. Also added the gfs-utils scripts to the git ignore list. Fixes #1090 commit b9b8322fbf0ad1fc9ea9c5bf584546c3a2865bdf Author: Walter Kolczynski - NOAA Date: Mon Oct 24 09:25:29 2022 -0400 gfs utils update (#1088) The recent update of the gfs_utils version in PR #1082 introduced some gfs_utils changes that weren't accounted for in the original PR.
gfs_utils changed the name of the script that determines the machine to `detect_machine.sh`, so scripts that call that script had to be updated. The variable name holding the machine name has also changed (from `$target` to `$MACHINE_ID`) and it may now include a compiler at the end, so changes were necessary to account for that. The WW3 build is changed completely to use the UFS modules to be consistent with the rest of UFS instead of maintaining separate modules in workflow that may use different module versions. Also reverts an inadvertent removal of execute permissions for `checkout.sh` Fixes #1086 commit 9118ab33eca68690f99a4be288101b0b1d565bf6 Author: Kate Friedman Date: Sat Oct 22 07:29:17 2022 -0400 GFS-UTILS update for build and ush scripts (#1082) This PR updates the gfs-utils version (af933d3) and ush script symlinks/paths. This will resolve issue #1059 until `finddate.sh` can be used via the `prod_util` module. * New gfs-utils hash: af933d3 * A number of ush scripts were moved into the new gfs-utils repo but not added to `link_workflow.sh`. This PR adds them to `link_workflow.sh`. * Several workflow scripts are updated to point to the gfs-utils ush symlinks under the top level `/ush` folder instead of the prior `/util/ush` folder. Fixes #1059 commit a30b7167154430d5235065c6df2f3477f96a52df Author: Walter Kolczynski - NOAA Date: Sat Oct 22 00:33:02 2022 -0400 Update UFS version to 2022 Oct 19 (#1083) No settings modifications were necessary for this update. This advance of the UFS version (finally) fixes the problem with missing PV and sigma levels. commit 6f5fa7949299ef70b5e846697461b8538aef6a5d Author: Walter Kolczynski - NOAA Date: Wed Oct 19 23:37:45 2022 -0400 Use more cycledefs for task control (#1078) Splits the existing rocoto cycle definitions up to offer better job control. This means that only the jobs that are due to run will appear in a cycle's job list from rocotostat/rocotoviewer. 
It also allows for the removal of some of the cycleexist dependencies that were there solely to prevent the job from running in the half cycle. A side effect of this change is that the half-cycle will be recognized as a completed cycle, fixing the bug with archive jobs starting in the fourth cycle (#1003). The gdas cycledef has been split into a `gdas_half` for the first half-cycle and `gdas` for the other GDAS cycles. Tasks that run during that first half-cycle therefore run on two cycledefs. For gfs, instead of slicing perpendicular to time, a new cycledef `gfs_cont` (continuity) was created in parallel to the existing gfs cycledef that omits the first cycle. This was done since only one job (`aerosol_init`) currently skips the first cycle, and this prevents the need to provide two cycledefs for every gfs task but one. Since some time math is now being done on sdate in workflow_xml.py, we now keep those as datetime objects and only convert to string when writing the cycledef strings. In order to access the pygw utilities in the workflow directory, a symlink is created in `workflow` pointing to the pygw location in `ush`. A better solution may be found in the future. Fixes #1003 commit 67770120c72fd810311e3e5a02abec25f9e7725d Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Tue Oct 18 01:58:37 2022 +0000 removing superfluous EFSOI-specific files from develop (#1079) commit 7fc0f261e673cac9df06ef2208f2a840529cea48 Author: Walter Kolczynski - NOAA Date: Mon Oct 17 14:57:07 2022 -0400 Update UFS to Sept 9 version (#1073) Updates the UFS version. This captures the conversion of UFS modules from TCL to lua. A couple of the CICE namelist variables are no longer valid in this version, so they are removed. Due to memory limitations on Hera and the increased memory requirements of GOCART, the number of threads is increased there when running the forecast with aerosols.
Also added a temporary block to delete any existing gocart output files. The ability to clobber files was deactivated a while ago and I got fed up with forecast jobs failing on retry. commit fd771cb82ab603a64f16ba8639f3e71c8fc7440d Author: Ed Safford <62339196+EdwardSafford-NOAA@users.noreply.github.com> Date: Wed Oct 12 16:55:28 2022 -0400 Modify default file location for monitor data when using rocoto (#1065) Added a $cyc subdirectory to the default file location for monitor data in rocoto. This ensures no file name collisions occur in the output monitor files. This change was also recently made to the GFSv16.3.0 package @ 4335ef2. Additionally a problem with the definition of the previous cycle (m1) for the DA monitors was identified and corrected. This corrects the MinMon's output data. Fixes #1055 commit 8172530245972c7f569a2bf950b1929282b937e4 Author: Walter Kolczynski - NOAA Date: Tue Oct 11 15:58:37 2022 -0400 Fix companion ocean resolution for C48 (#1066) The ocean resolution for atmosphere C48 should be 5 deg, not 4 deg. Fixes #1054 commit e8ef5fc6cc2781f5c3c47e7cf2762a6f7de2d123 Author: Walter Kolczynski - NOAA Date: Tue Oct 11 14:25:02 2022 -0400 Add trailing slash for gldas topo path (#1064) GLDAS requires the namelist definition for the topo directory to have the trailing slash. Fixes #1063 commit 9553ef690b12709fd3024f07ad81257d02453ac6 Author: Walter Kolczynski - NOAA Date: Tue Oct 11 11:22:50 2022 -0400 Limit number of CPU for post (#1061) Limits the number of MPI tasks for post to the resolution of the forecast. UPP seems to fail if it is given more ranks than the resolution. Fixes #1060 commit e915eb64095a3ccf3d723892ffa1a2092c8e9a3f Author: Walter Kolczynski - NOAA Date: Fri Oct 7 20:31:29 2022 -0400 Fix eupd trace (#1057) When the preamble was implemented, a `set +x` was inadvertently left in the ensemble update script while the subsequent `set -x` was removed. This led to much of the trace not appearing in the output.
Also removed a jlog print that is only encountered when ICs are missing. It complicated the real IC missing error with an additional unbound variable error. commit e09989b8285f71b44a0958fd1c60e7ca49d73661 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Oct 7 20:12:04 2022 -0400 Port to S4 (#1023) Ports the global workflow to the S4 cluster. Note that this does not include support for S2S experiments. Additionally, S4 does not support C768 experiments. A couple of special notes: - S4 does not have access to rstprod data. Among other things, this means that the nsstbufr and prepbufr.acft_profiles files must be created on the fly. The way I accomplished this was by moving `MAKE_NSSTBUFR` to, and creating `MAKE_ACFTBUFR` in, config.base.emc.dyn and setting them via setup_workflow.xml. This seems like a bit of a kludge and I am open to suggestions on how else to address this. Both options need to be set for the prep and analysis jobs. - S4 _can_ run S2S+ experiments, but this requires significant, and convoluted, modifications to the configuration files. Support for these is thus not enabled by default. Instead, I have placed a set of configuration files in S4:/data/users/dhuber/save/s2s_configs. Users interested in performing these experiments should contact me to set it up.
Closes #138 commit b26a8ac85b2b981356417ad7ced3d1d420cede68 Author: Kate Friedman Date: Fri Oct 7 10:51:25 2022 -0400 Update to obsproc.v1.0.2 and prepobs.v1.0.1 (#1049) * Update HOMEobsproc paths in config.base * Update primary obsproc JJOB call in prep.sh * Add prepobs module load to R&D module_base modulefiles * Add cdate10 setting to config.prep * Add launcher_PREP to HERA and ORION env files * Add needed COMINtcvital path to config.prep * Retire config.prepbufr and prepbufr step from config.resources Refs #1033 commit 8cb27c6e48437f28ffb965e0043dc00e90a42a31 Author: Walter Kolczynski - NOAA Date: Thu Oct 6 17:23:22 2022 -0400 Add GDAS to the partial build list (#1050) When the GDAS app was added to the workflow, the corresponding build setting was not added to partial_build and the build configuration file. This means that after `build_all.sh` was updated to correct syntax issues, the build would fail because `$Build_gdas` was undefined. Note: the GDAS app still does not build currently due to unrelated problems within the gdas repo. Refs #1043 commit 9b3fa14ec9df6697b9de76f8a536ceff73358935 Author: Walter Kolczynski - NOAA Date: Thu Oct 6 17:22:45 2022 -0400 Fix group number being treated as octal in gdas arch (#1053) The group number was being treated as an octal in the gdas archive job, resulting in out-of-range errors when more than 7 groups were used. Fixes #1032 commit 235d597113fe2af04cf58c3aa7ac8a6892e2fc64 Author: Walter Kolczynski - NOAA Date: Thu Oct 6 17:21:56 2022 -0400 Remove trace from link script (#1046) Removes the trace from the link script. Since this makes the script silent (including some errors), some messages are added to give feedback about the script's success or failure. UFS Utils' [link_fixdirs.sh](https://github.com/ufs-community/UFS_UTILS/blob/develop/fix/link_fixdirs.sh) script turns the trace on when it runs, so its STDERR is thrown away.
Also corrects a bug where the case of `$RUN_ENVIR` as set did not match the case it was compared against. Fixes #1044 commit 695ba93b2e9464a6fd83c432d80293e2ab823735 Author: Kate Friedman Date: Tue Oct 4 14:58:39 2022 -0400 Update gfs-utils hash to 3a609ea (#1048) Update gfs-utils hash to add WCOSS2 support for that component. Refs #1047 commit e464e27611c14ad6db1e347dadea47c30b5e3b96 Author: Walter Kolczynski - NOAA Date: Tue Oct 4 14:25:56 2022 -0400 Fix link script usage statement (#1045) The function being called when passed -h was not the same as the one used for the function definition (there was a leading `_`). Fixes #1044 commit 833b8f4c124fb48b5a04bc0f82c2b5fc9df1165c Author: Walter Kolczynski - NOAA Date: Tue Oct 4 03:04:38 2022 -0400 Replace preamble variable commands with functions (#1012) Replaces the commands to restore the strict and trace state with functions. Functions are a more logical choice for this sort of behavior. Refs: #397 commit e7f72e8b6e3f4244465bbfb077f00bc4d7610739 Author: Walter Kolczynski - NOAA Date: Tue Oct 4 03:03:15 2022 -0400 Implement fix reorg and remove gfs-utils code (#1009) Removes all of the code and scripts that were moved to the new [gfs-utils repo](https://github.com/NOAA-EMC/gfs-utils) and adjusts workflow scripts to build and use them from the new location. Some of the build scripts had unnecessary calls to machine-setup that are removed because the lower-level script already has the same functionality. This PR also includes updates to use the new fix organization. This includes the addition of a fix versions file, updates to the link script, and some changes in the fix directories used by scripts to account for files that have been relocated. The versions file sets the version number for each component of fix, so that fix can be more easily maintained and documented. The initial versions are all the same, and correspond to the old fix_NEW directory (other than some directories having been renamed or reorganized).
The exception is gdas, which has already had a new set of fix files added. The fix update also required an update to UFS-Utils. Finally, the link script has been updated to match the syntax of the rest of the build system ([checkout.sh](sorc/checkout.sh) and [build_all.sh](sorc/build_all.sh)). [link_workflow.sh](sorc/link_workflow.sh) now detects the machine automatically instead of requiring an argument, and dev mode, which used to be set using `emc` as an argument, is now assumed. To run in ops mode (copy instead of link), the `-o` option is used. The full syntax is now simply:

```
./link_workflow.sh [-o]

Options:
  -o: Run in operations (NCO) mode (copy instead of link)
```

Fixes #356 Fixes #966 commit 948f941918ebb85444aabfa68e8f24bbe73d9202 Author: Walter Kolczynski - NOAA Date: Mon Oct 3 22:56:03 2022 -0400 Rename post scripts (#1038) Rename post scripts to remove the unnecessary "ncep" qualifier. Also changes the executable name for upp to be the one produced by upp (`upp.x`). The ush/global_nceppost.sh script is removed as it is not used by any job. We may want to rename the gfs_post.sh script (the one that is used) to global_post.sh to be more reflective of its usage. commit 065a0dda6ceff7224978fb147073c74feb3a13e6 Author: Kate Friedman Date: Thu Sep 29 17:06:40 2022 -0400 Fix missing @ symbol with COMINsyn in config.base (#1039) Refs #419 commit a62abc7aea8131ba262b997ab4b8b67b11552b53 Author: Kate Friedman Date: Thu Sep 29 11:09:14 2022 -0400 WCOSS2 run support and script/config updates (#1030) - WCOSS2 port changes requested by NCO in the move of operations to WCOSS2; mainly within scripts. - Adding WCOSS2 run support by creating WCOSS2.env, wcoss2.yaml hosts file for rocoto, and config updates. - Also some cleanup that was done in the GFS operations port to WCOSS2.
Refs #419 commit 72131c6285a8e37fcf1e02f766ffd77aa37f6669 Author: Kate Friedman Date: Wed Sep 28 14:57:22 2022 -0400 Remove base_svn from Hera and Orion hosts files (#1036) Refs #1035 commit b3e10f37b57e2d377784056ec31078af2d9a2c39 Author: Rahul Mahajan Date: Tue Sep 27 14:33:51 2022 -0400 initial commit for incoming yaml work (#1029) simple yaml and jinja toolsets commit 9e0d8d166e2ce07767f7d9d4ff66877a946856bf Author: Ed Safford <62339196+EdwardSafford-NOAA@users.noreply.github.com> Date: Mon Sep 26 13:04:44 2022 -0400 Fix radiance verification failing to find diag files (#1031) Correct exgdas_atmos_verfrad's failure to find diagnostic files. Additionally, fixed two other problems in child scripts which caused halts and/or failure to move resulting data files to the correct location. Fixes #1028 commit fae9e0a5c9270e1a4afa83338e99a035aed2f5a0 Author: Rahul Mahajan Date: Fri Sep 23 15:11:48 2022 -0400 Supported resolutions on platforms and defaults for mode (#1026) * add default options for cycled and forecast-only. Validate supported resolutions on the platforms commit a665817e64cff25836fe283c10105830b3842ed3 Author: Kate Friedman Date: Thu Sep 22 11:15:43 2022 -0400 Add GLDAS scripts & fix GLDAS job (#1018) Absorb GLDAS scripts into global-workflow and fix GLDAS job by updating scripts to use the new GLDAS fix file set. * Remove GLDAS scripts from .gitignore * Remove GLDAS script symlinks from link_workflow.sh * Add GLDAS scripts to global-workflow * Updates to GLDAS scripts, includes converting GLDAS scripts to replace machine checks with CFP variables * Address linter warnings and remove obsolete platforms Refs #622 #1014 commit f180a546b68ff5455faf9fdb280f1ee4d710a36a Author: Ed Safford <62339196+EdwardSafford-NOAA@users.noreply.github.com> Date: Thu Sep 22 08:41:28 2022 -0400 Update GSI Monitor for radmon fix Moves the hash value for gsi-monitor from acf8870 to c64cc47 to pick up a recent fix to the radmon_angle.x executable.
Fixes #1024 commit f517d48dddbe2e03a3b9758c902a5d1474af6c26 Author: Walter Kolczynski - NOAA Date: Wed Sep 7 17:10:43 2022 -0400 Correct shell linter config (#1013) Per comment from the developer on PR #1008 after it was merged, the `shell-scripts` option for the shell linter does not do the globbing we expected. However, the linter action also only annotates when syntax issues are *added*. It will not flag files with existing errors, even if they are otherwise changed (at least, this is my understanding). So, immediately processing all shell files shouldn't be an issue (fingers crossed). Refs: #397 commit a2eec51037b732b625fb3425500cf054a5f8b666 Author: Ed Safford <62339196+EdwardSafford-NOAA@users.noreply.github.com> Date: Wed Sep 7 17:08:34 2022 -0400 Correct diagnostic file handling in ush/ozn_xtrct.sh (#1016) The OznMon's ush/ozn_xtrct.sh script creates a list of available sat/instrument sources for a given cycle in order to report missing files. The logic for extracting the list of sources worked on wcoss2 and hera but failed on orion. The logic used a listing of the diagnostic files which included the user name. That created a parsing problem if the user name was not the expected length. The solution is to simply do a one-file-per-line listing (ls -1) and parse the results. Additionally, the processing logic was condensed and now avoids running the extraction executables in the case of a missing diag file. Fixes #1004 commit 855ee864e6077051f0a486b8b59c9056d50f955e Author: Walter Kolczynski - NOAA Date: Tue Aug 30 21:54:14 2022 -0400 Add shell linter Github action for pull requests (#1007) Add Github CI to enforce shell script standards. Refs: #397 commit 392ca6fc71c390a75cf0c4d621daa491c721e08a Author: Kate Friedman Date: Fri Aug 26 16:16:16 2022 -0400 Build updates for WCOSS2 (#1002) Enable build of the global-workflow and its components on WCOSS2.
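The one-file-per-line listing fix described for ush/ozn_xtrct.sh (#1016) can be sketched as follows. This is a hedged illustration, not the actual OznMon code: the `diag_<source>_ges.<cycle>.gz` filenames and the `sed` expressions are assumptions chosen to show why parsing `ls -1` output is robust where parsing a long listing (with user names of varying length) was not.

```shell
#!/usr/bin/env bash
# Sketch: extract available sat/instrument sources from diag file names.
set -eu

workdir=$(mktemp -d)
trap 'rm -rf "${workdir}"' EXIT

# Hypothetical diag files for one cycle (naming is illustrative only)
touch "${workdir}/diag_omi_aura_ges.2022101112.gz" \
      "${workdir}/diag_sbuv2_n19_ges.2022101112.gz"

# "ls -1" prints bare names, one per line, so the parsing does not
# depend on user name, permissions, or column widths of a long listing.
avail_sources=$(ls -1 "${workdir}" | sed -e 's/^diag_//' -e 's/_ges\..*//' | sort -u)
echo "${avail_sources}"
```

A missing diag file simply never appears in `avail_sources`, so downstream steps can skip the extraction executables for it instead of failing.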
commit 5429096ec944ba3133ff7b77ad0587c4b7b0ffc6 Author: Kate Friedman Date: Fri Aug 26 11:40:46 2022 -0400 Update UFS_UTILS tag to `ufs_utils_1_8_0` (#1001) Update UFS_UTILS tag to ufs_utils_1_8_0 New tag contains: - cleanup of decommissioned platforms - adding support for WCOSS2 - regression test updates - code updates to chgres_cube and emcsfc_snow2mdl - some small script updates and cleanup - change to the FNAISC file from CFSR.SEAICE.1982.2012.monthly.clim.grb to IMS-NIC.blended.ice.monthly.clim.grb Refs #974 commit 1f142296279730566610e10fb93bd8beb636af07 Author: Walter Kolczynski - NOAA Date: Thu Aug 25 15:06:24 2022 -0400 Fix preamble id (#996) I was too clever by half when writing the preamble and used a check for an argument that doesn't actually work. Using the more typical method works correctly. commit 3a8fc8ecd2fdbe8484dd882a512e765d3c70b307 Author: Kate Friedman Date: Thu Aug 25 13:49:24 2022 -0400 Add missing "atmos" into job dependencies (#998) The "atmos" subfolder in the GDA was missing in the dependency path for the prep and atmanalprep jobs. Add it in for correctness. Updated dependencies in workflow/rocoto/workflow_tasks.py. Refs: #997 commit aae25112286e70e2f662b15f230bba958dfb897b Author: benwgreen <54944233+benwgreen@users.noreply.github.com> Date: Wed Aug 24 09:17:24 2022 -0600 Bugfix in arch.sh to remove hardwired "htar" (#992) To support "LOCALARCH", $TARCMD is used instead of hardwiring "htar". One line had a hardwired htar; this PR changes "htar" to "$TARCMD". 
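The `$TARCMD` pattern from the arch.sh bugfix (#992) can be sketched like this. The variable names `LOCALARCH` and `TARCMD` come from the commit text; the selection logic and values here are assumptions for illustration, not the workflow's actual configuration code.

```shell
#!/usr/bin/env bash
# Sketch: route all archive commands through ${TARCMD} so LOCALARCH
# runs (no HPSS) use plain tar instead of a hardwired "htar".
set -eu

LOCALARCH="YES"   # assumed: YES selects local tar archiving instead of HPSS
if [[ "${LOCALARCH}" == "YES" ]]; then
  TARCMD="tar"
else
  TARCMD="htar"
fi

workdir=$(mktemp -d)
trap 'rm -rf "${workdir}"' EXIT
echo "data" > "${workdir}/file.txt"

# Any single hardwired "htar" here would break machines without HPSS.
${TARCMD} -cf "${workdir}/archive.tar" -C "${workdir}" file.txt
```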
commit 692e3fc891e32d7b7d1a55aca63bb43a3b8ad9b0 Author: Cory Martin Date: Tue Aug 23 16:13:38 2022 -0400 Add in stubs for aerosol DA tasks + bugfix for setup_expt where cycled and ATMA are used (#990) This PR adds in stubs for aerosol DA tasks (jobs/rocoto shell scripts), the ability for setup_expt and setup_xml to include three aerosol DA related tasks, and in the process of adding this capability, fixes a bug in which the combination of ATMA and cycled revealed that @property was being used incorrectly. Fixes #981 commit da164f8b00c68d1309759645a57f0df744d46ef2 Author: Walter Kolczynski - NOAA Date: Tue Aug 23 11:05:23 2022 -0400 Add GSI monitor scripts (#969) As part of the overarching goal of moving global workflow scripts out of component repositories and under global workflow, all of the scripts currently linked in GSI monitor are added to global workflow. Scripts have been updated to reflect standardization already made to other workflow scripts (preamble, no backticks, etc.). See 7aa637 for those modifications specifically. Related GSI-Monitor issue: NOAA-EMC/GSI-Monitor/pull/23 Fixes #967 commit 22977c9aae7016e50362947fd6ef36385d736ce2 Author: Walter Kolczynski - NOAA Date: Tue Aug 23 11:04:08 2022 -0400 Fix product generation at some fcst hrs (#988) 0.5 and 1.0 degree grib files were not being generated for most two-digit forecast hours due to a problem using a zero-padded number. Fixes #985 commit 8a62c3a9e2dcc5de05ca974be8852736d423de76 Author: Cory Martin Date: Fri Aug 19 14:09:07 2022 -0400 Add initial config files for global aerosol DA (#986) Add new config files for aerosol da commit cfcc21f9f96106b17bba9fd194cd360dda1fffb8 Author: Jessica Meixner Date: Thu Aug 18 07:10:44 2022 +0000 Update diag table to remove wav-ocn coupling fields (#979) Removes the wavocn output which is used to evaluate wav->ocean coupling. These were only needed for prototype diagnostics, and when these variables are output without wave coupling, it can lead to failures.
Fixes #977 commit d81a8b509c8c734ceaf4aa44f120c80452c38706 Author: Rahul Mahajan Date: Mon Aug 15 16:49:10 2022 -0400 use a robust Findwgrib2.cmake to find wgrib2 built w/ native wgrib2 build (#970) commit c29236b6bf6a4280cfc799e3b7adf54b37dd3211 Author: Rahul Mahajan Date: Wed Aug 10 11:15:16 2022 -0400 Externals.cfg was stale and had drifted off (#965) commit 9a538ee8baedb05f23d988003aaedc260ab7ecc7 Author: Walter Kolczynski - NOAA Date: Tue Aug 9 10:36:50 2022 -0400 Fix post comparison with zero-padded numbers (#964) An error was introduced with PR #929 that was causing 0p50 and 1p00 grib files to not be produced due to an error in comparing a zero-padded string. Switching to arithmetic comparison solves the issue. Also updated the method of the zero-padding to the preferred printf since the code was in close proximity. commit 1026b2c96eb8d987527b7b38a5d75f5b1786533c Author: Kate Friedman Date: Wed Aug 3 14:45:48 2022 -0400 Remove SDATE=CDATE IAU block in NCO config.base (#963) - Remove the block in config.base.nco.static that checks if CDATE=SDATE and turns IAU settings to 0. - This block is not needed in operations and causes issues in pre-implementation developer testing when starting a new warm-started parallel with wave restarts. Refs: #960 commit a658e75e579ebc4f454377e5a9cf4c1fc54e4e3d Author: Jessica Meixner Date: Wed Aug 3 18:34:34 2022 +0000 Updates for P8 (#946) These are the final updates for Prototype 8.
These changes include: * Updating to the latest ufs-weather-model hash (in progress, waiting for PR) which will update the calculation of 2m T * A small update to the organic carbon coefficients for p8, raising them from 0.3 -> 0.4 for oc1 and oc2 * Uses 10km input files for aerosols * Sets do_gsl_drag_tofd=false by default, which helps with stability of the model Closes #937 commit b2155ad3dc999a2f41aeace58f3b199a8ddde65c Author: Walter Kolczynski - NOAA Date: Wed Aug 3 10:04:06 2022 -0400 Fix GLDAS j-job link (#954) The cd was misplaced when checking for the existence of gldas.fd to create the link for the j-job, so the directory was never found and the job never linked. commit 395720cef000ef221c49c93d4f68417b7fda64b6 Author: Walter Kolczynski - NOAA Date: Mon Aug 1 17:38:50 2022 -0400 Add ocean post to archive dependencies (#949) The archive job was not waiting for ocean post to complete because there was no dependency. Fixes #948 commit 145b67f70f44abbb713e19073016bacfbfcb8184 Author: Walter Kolczynski - NOAA Date: Fri Jul 29 13:42:38 2022 -0400 Initial commit of directory comparison tools (#934) Adds a new `test/` directory to the top level. Inside are miscellaneous scripts I have used to test bitwise identicality of experiments. Main scripts: - `diff_ROTDIR.sh`: Compares two output directories - `diff_UFS_rundir.sh`: Compares two UFS run directories Other scripts and files are helpers to these two main scripts. May eventually form the starting point of a global workflow regression test (#267) Refs #267 commit e480093446797e556bc1371f9f80610a7c9a6d4b Author: Walter Kolczynski - NOAA Date: Fri Jul 29 13:42:02 2022 -0400 Add preamble, convert to bash, and remove env (#929) This is the first in a wave of commits to improve and standardize the scripts within global workflow. In this commit, all scripts run during execution are converted to bash and a preamble is added to every script that is not sourced by another script.
Every script executed during a forecast cycle is converted to bash. This was mostly straightforward, though there were a couple of Korn-shell conventions (primarily using `typeset` to format strings) that had to be replaced with bash-compatible alternatives like `printf`. This in turn required a few modifications to prevent zero-padded numbers from being treated as octals (others may have been pre-existing bugs). The preamble contains a number of features to standardize code and improve script flow and debugging. - First, it uses two variables, `$STRICT` and `$TRACE`, to control the behavior of `set`. When `$STRICT` is `"YES"`, error on undefined variables (`set -u`) and exit on non-zero return (`set -e`) are turned on. When `$TRACE` is `"YES"`, command trace (`set -x`) is turned on and a useful string is set to `$PS4` that gives the name and line number of the script. Both `$STRICT` and `$TRACE` default to `"YES"`. They also set up commands, `$ERR_EXIT_ON` and `$TRACE_ON`, that will restore the setting of each in the event a script needs to temporarily turn them off. - Second, the preamble sets up primitive timing of scripts using Posix `date`. - Third, it echoes that the script is beginning and at what time. - Finally, it also establishes a postamble function and sets it to run as a trap of EXIT. The postamble will use the end time to calculate the run time of the script, then print that the script has ended, at what time, how long elapsed, and the exit code. By setting this up as a trap instead of just calling it at the end of the script, it ensures the postamble is called even if the script exits early because there is an error. - In response to this standardization, parts of scripts that performed these preamble functions (announcing start/end, `set -x`, etc.) have been deleted.
For some scripts where temporarily turning off `-x` or `-e` is needed, they now use `$ERR_EXIT_ON` and `$TRACE_ON` to return to the correct state afterwards, instead of blindly turning the setting back on. - Additionally, some modifications were needed to comply with `set -eu`. Mostly taking care of undefined variables, but also a couple of instances where a non-zero return code had to be dealt with. If users wish to use their own preamble script instead, the default script can be overridden by setting `$PREAMBLE_SCRIPT` before the run begins. Instances where scripts would print the full list of environment variables have been removed. These can be spot-added back in to debug as necessary. Alternatively, a future PR will add them back in in a standardized way. `rstprod.sh` is added to the link list from gsi_monitor.fd, as it is needed for the radmon scripts. The placeholders for AWIPS and GEMPAK in the Hera and Orion environment scripts were replaced with the correct definitions. There were also other modifications to AWIPS and GEMPAK scripts to get them working for development (AWIPS still isn't working and will be fixed in the future). GSI scripts that were brought in recently had all of their backticks replaced with `$( )`, as was done with all other scripts previously. Refs: #397 commit 949513642d33cb3976d0f8e7dd273aedec505a17 Author: Rahul Mahajan Date: Thu Jul 28 18:36:58 2022 -0400 minimal intervention to create a data-atmosphere xml (#936) commit e4b01b99f50c674635477ff3e2e962b9d5ed54aa Author: Rahul Mahajan Date: Thu Jul 28 18:28:28 2022 -0400 Remove Cray, Dell, WCOSS1 from module-setup.sh.inc (#943) This file was missed in the initial cleanup.
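The preamble/postamble mechanism described above (#929) can be sketched roughly as follows. This is a simplified illustration under stated assumptions, not the workflow's actual preamble script: `$STRICT`, `$TRACE`, `$PS4`, and the EXIT trap come from the commit text, while the exact messages and variable handling here are made up.

```shell
#!/usr/bin/env bash
# Sketch of a strict/trace preamble with an EXIT-trap postamble.
STRICT=${STRICT:-"YES"}
TRACE=${TRACE:-"YES"}

if [[ ${STRICT} == "YES" ]]; then
  set -eu                 # exit on error (-e) and on undefined variables (-u)
fi
if [[ ${TRACE} == "YES" ]]; then
  # PS4 prefixes each traced command with the script name and line number
  export PS4='+ ${0##*/}:${LINENO}: '
  set -x
fi

start_time=$(date +%s)    # primitive timing via POSIX date
echo "Begin ${0##*/} at $(date -u)"

postamble() {
  rc=$?                   # exit code the script is about to return
  set +x
  end_time=$(date +%s)
  echo "End ${0##*/} at $(date -u) with rc=${rc} (elapsed: $((end_time - start_time))s)"
}
# Running the postamble from an EXIT trap means it fires even when
# "set -e" aborts the script early, so the end time and exit code
# are always reported.
trap postamble EXIT

: # stand-in for the body of the script
```

Scripts that need to suspend `-x` or `-e` temporarily can then restore the configured state afterwards rather than unconditionally re-enabling the options.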
commit 1ed89c7d202974bd4ccd2b048fedc56382edc9ec Author: Rahul Mahajan Date: Thu Jul 28 15:35:37 2022 -0400 bring GDASApp jjobs and exscripts to global-workflow (#941) commit f04f3ba4dfd5b358aea425f65f82d40917597776 Author: Rahul Mahajan Date: Tue Jul 26 22:07:42 2022 -0400 change gdasechgres dependency to just gdasefcs01 instead of gdasefmn (#933) Replaces the dependency of `gdasechgres` on `gdasefmn` with `gdasefcs01`. Presently, `gdasechgres` has 2 dependencies: - `gdasfcst` - deterministic forecast - `gdasefmn` - ensemble forecasts (all of them). The work done in `gdasechgres` actually depends only on `mem001/atmos/gdas.tHHz.atmf006.nc`. This file is used as a template as well as for obtaining `hgtsfc`. As such, there is no reason to depend on the entire ensemble of forecasts being complete before `gdasechgres` can start. commit 490de7bae1322cbaa8ee3d52406034c8edf62dd3 Author: Rahul Mahajan Date: Tue Jul 26 22:05:59 2022 -0400 Remove obsolete platforms (WCOSS1, Dell, Cray, Theia) references. (#922) Removes code related to decommissioned HPC platforms WCOSS 1 (Dell & Cray) and Theia. Some references remain in scripts outside the global-workflow repo that are cloned as part of `checkout.sh`. Scripts from the `driver` directory that were hard-wired for one of the WCOSS1 platforms are also removed. Additionally, this commit also switches to using serial netCDF for resolutions C48, C96, and C192. Running with parallel netCDF (on Hera) gave errors when testing at C96 for the deterministic forecast. If someone gives a very compelling reason to use parallel netCDF at these resolutions as default, I would be very interested in what they have to say. Closes #680 commit 4eb296f7e82459b1d8188636ca3db60b5fa10091 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Jul 26 14:50:30 2022 -0400 Optimize DA clone, build, and link (#931) The PR contains changes to optimize the DA clone, build, and link.
Changes are made to `checkout.sh`, `build_all.sh`, and `link_workflow.sh` in the g-w `sorc/` directory. These changes are in g-w branch `feature/clone`. Two arguments are added to `checkout.sh` to allow the user to specify which DA package to build the global workflow with. These options are - `-g`: clone from the [GSI](https://github.com/NOAA-EMC/GSI) repo and build the g-w for GSI-based DA - `-u`: clone from the [GDASApp](https://github.com/NOAA-EMC/GDASApp) repo and build the g-w for UFS-based DA If no option is specified, `checkout.sh` does not clone any DA or DA-related repos. This is the default behavior of `checkout.sh`. (_DA related_ repos include [GLDAS](https://github.com/NOAA-EMC/GLDAS), [GSI-utils](https://github.com/NOAA-EMC/GSI-utils), and [GSI-Monitor](https://github.com/NOAA-EMC/GSI-Monitor).) `build_all.sh` is modified to detect which repos have been cloned and to build accordingly. `link_workflow.sh` is modified to detect which directories are present and link/copy accordingly. Closes #930 commit 2cad536551180d25bfcfc2b5d35fe1089de7f3c3 Author: Kate Friedman Date: Tue Jul 26 13:25:37 2022 -0400 WCOSS2 gempak ush scripts updates and cleanup of old release notes (#920) * WCOSS2 updates to gempak ush scripts - Add /gempak subfolder where needed in gempak ush scripts. - Remove unneeded commented out path settings from older iterations. * Removing older release notes - Cleaning out older GFS version release notes; includes current GFSv16.2.1 release notes, will commit GFSv16.3 release notes with implementation this fall. - Will then keep only the latest release notes moving forward. Refs: #419 commit ffcd5bbde7947902a73eebff7dfe04c2ab045b0a Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Fri Jul 22 16:00:51 2022 -0400 Add GDASapp (first wave of JEDI changes) (#871) Merge changes associated with g-w issue #521 from g-w branch feature/ufsda_gdasapp into develop.
feature/ufsda_gdasapp contains g-w extensions for JEDI based atmospheric DA. Specifically, this PR adds the option to add JEDI based variational and/or ensemble DA jobs to replace GSI based DA jobs. The toggling on/off of JEDI_VAR and JEDI_ENS jobs is controlled via two new variables added to `config.base.emc.dyn` and `config.base.nco.static`
```
# DA engine
export DO_JEDIVAR="NO"
export DO_JEDIENS="NO"
```
When both variables are `NO`, the global workflow uses GSI based DA jobs. Thus, the PR does not alter the default behavior of the develop global workflow. When `DO_JEDIVAR=YES`, GSI jobs `anal` and `analdiag` are replaced by JEDI_VAR jobs `atmanalprep`, `atmanalrun`, and `atmanalpost`. When `DO_JEDIENS=YES`, GSI jobs `eobs`, `ediag`, and `eupd` are replaced by JEDI_ENS jobs `atmensanalprep`, `atmensanalrun`, and `atmensanalpost`. `checkout.sh`, `build_all.sh`, and `link_workflow.sh` are updated to clone, build, and install the GDASapp in the global workflow. Local directory `sorc/gdas.cd` contains the GDASApp superstructure plus the relevant components of JEDI needed to run GDASApp. Closes #521 commit 98f4d16e9bba86d2c433aa0521d960b566062a1f Author: Walter Kolczynski - NOAA Date: Thu Jul 21 16:02:31 2022 -0400 Add postsnd job when bufrsnd is on (#926) In the workflow refactoring, the addition of postsnd to the task list when bufrsnd is true was inadvertently left out. It is now added back in. commit e2869a1247ad2ba72c1bfe82f7682323d5128f4c Author: Kate Friedman Date: Tue Jul 19 16:16:20 2022 -0400 Updated GFS transfer*list files from operations (develop) (#918) Updated transfer list files from WCOSS2 ops. - In the move to WCOSS2 the transfer*list files were moved into a new transfer folder under the upper-level parm folder.
- The transfer*list files were updated to clean out unneeded paths and the beginning of each path was updated from `com/gfs/_ENVIR_` to `_COMROOT_/gfs/_SHORTVER_` Refs: #419 commit 1651014250a748c8c2e2878663c3e8376639044a Author: ChunxiZhang-NOAA <49283036+ChunxiZhang-NOAA@users.noreply.github.com> Date: Tue Jul 19 11:18:33 2022 -0400 Fix history type of cldfra in the diag_tables (#915) The history type of the new cloud fraction field (cldfra) is changed so it is written to the correct output file. Fixes #914 commit 5c9639dba419ef51608aa8962b54ae69c8ad8c60 Author: Rahul Mahajan Date: Tue Jul 19 11:11:38 2022 -0400 Combine ecflow and rocoto workflow generation (#916) Consolidates the workflow generation systems for ecflow and rocoto. Further unification will follow to allow choosing one or the other. At this time, they are separate and need separate input criteria. Moves `ush/rocoto` to `workflow/` at the top level. Further refinement will be performed as needed. commit b690ed6c7c1a250171484aba267d0724d8c12d11 Author: Walter Kolczynski - NOAA Date: Mon Jul 18 19:33:36 2022 -0400 Remove new GSI scripts from gitignore (#917) When the scripts were moved over from GSI into global workflow, they were never removed from the .gitignore. This has now been addressed. Follow-up to PR #904 commit aa2542eb4c95827bc1fc7a4a76d4d0f5bc701f74 Author: Rahul Mahajan Date: Mon Jul 18 13:49:09 2022 -0400 Add ecFlow generator to `develop` (#912) * merge workflow_generator from dev_v16 at 7b7947e2 and move workflow_generator to ush/ecflow commit 5a58fa8a2f75590a671b45bf75a431f2acf2340b Author: Rahul Mahajan Date: Mon Jul 18 13:02:59 2022 -0400 Update gsi-utils hash to include bugfix in handling 3D sfc files (#913) Update GSI-Utils hash to include a bugfix for processing 3D fields in surface netCDF files.
Fixes #909 commit e8361cc343743bb5087e75f89406c1265404e359 Author: Rahul Mahajan Date: Mon Jul 18 12:44:25 2022 -0400 Combine setup_workflows scripts (#859) `setup_workflow.py` and `setup_workflow_fcstonly.py` are used for setting up the XML for cycled and forecast-only configurations. They share large parts of the tasks and dependencies. This PR unifies the two scripts and makes room for extension to incoming applications. The above two scripts are being replaced by `setup_xml.py`. The usage is:
```
$> setup_xml.py /path/to/experiment_directory
```
This PR also removes the handicap of defining hundreds of entities for task resources. Instead, the task resources are placed with the task itself. This PR also does the following: - moves the declaration of `DOBNDPNT_WAVE="NO"` from `config.wave` to `config.base` - reduces the timestep at `C48` from `1200s` to `450s` in `config.fv3`. Several ensemble members failed; inspecting `config.fv3.nco.static`, the value there is `450s` - `eobs` job times out with a wallclock time of `15m`. It is increased to `45m` to play it safe. commit 13385d9c9018def1fbce6772e53665ccb90b2a2a Author: Rahul Mahajan Date: Fri Jul 15 14:02:12 2022 -0400 Load `ncdiag` module instead of building it in the GSI, use UFSWM build system, pull sfcanl out from anal. (#905) `ncdiag` is no longer being built within the GSI. It is available as a module on all systems: Hera, Orion, and WCOSS2. This PR: - loads the ncdiag module - finds the `ncdiag_cat_serial.x` executable from that module (once loaded) - updates hashes for GSI, GSI-utils and GSI-monitor. No code changes were made in these repos; only the location of `ncdiag` changed. In addition, this PR: - Uses the ufs-weather-model compilation script. The global workflow was duplicating, and missing, configuration options set in the ufs-weather-model. This also enables a "debug" mode of compiling the model (#300). - Fixes #906 - Fixes an improper link to `fix_gsi`. It was being linked to a non-existent directory.
- separates out the creation of the Gaussian surface analysis from the GSI atmospheric analysis. This capability is pulled from PR #871. It was necessary to pull these changes from PR #871 since the `j-jobs` and `ex-scripts` from the GSI have been brought to global-workflow and the companion PR for PR #871 in the [GSI PR 415](https://github.com/NOAA-EMC/GSI/pull/415) was already merged prior to the updates needed for `ncio` and `ncdiag`. I realize all these changes could have been streamlined into smaller atomic PRs. **Note:** This PR also requires a documentation update as a new job `sfcanl` has been added to the workflow. commit b61c2375a05daff20805d0216ade8201192c0199 Author: Fanglin Yang Date: Thu Jul 14 20:07:54 2022 -0400 Update ice climo, fix option for non-fractional grid, and add cloud fraction for Thompson (#902) Updates the ice climatology from CFSR to IMS_NIC blended. Fixes an issue in the diag table where the cloud fraction entry for Thompson was not included. Adds a new coupled-mode setting for when the fractional grid is not used. Fixes #886 Co-authored-by: Chunxi.Zhang-NOAA commit 680270975e2eb2f09c58a6112a87c0ecaec9e2a5 Author: Barry Baker Date: Thu Jul 14 17:03:34 2022 -0400 Update aerosol variable names and dust alpha (#888) The names of some aerosol fields have changed and needed to be updated. The dust alpha parameter is updated to improve model performance. A typo in the scavenging parameter definition, where a string was terminated with a 'smart-quote' instead of a standard one, is corrected. commit 06b25267b7839808508eed9e91dcd2b76a33c3ca Author: Jessica Meixner Date: Thu Jul 14 18:52:56 2022 +0000 Allow for wave mesh to be different than the ocean/ice mesh (#897) This PR updates the configuration to allow for the wave mesh to be different from the ocean/ice mesh. This PR also changes the default wave grid back to gwes_30m and updates the load-balancing for this change.
The ufs-weather-model is updated to use the code that allows for the wave mesh to be different from the ocean/ice mesh. commit 4f3e14b59ff6c83fbaaf6888c3a3068eae3d77d7 Author: Rahul Mahajan Date: Wed Jul 13 15:53:18 2022 -0400 Migrate `jobs/`, `scripts/` and `ush/` from GSI to global-workflow (#904) This PR moves the `jobs/`, `scripts/` and `ush/` directories from NOAA-EMC/GSI into the global-workflow. History is preserved. Corresponding GSI PR https://github.com/NOAA-EMC/GSI/pull/436 commit 46f7589807bcb070c8f86601b74f1a84701b669a Author: Rahul Mahajan Date: Wed Jul 13 15:51:56 2022 -0400 build GSI utilities from GSI-utils repo (#889) **Description** The GSI Utilities have been moved to https://github.com/NOAA-EMC/GSI-utils This PR: - checks out GSI utilities from the above repo - builds the utilities from the above repo - does not build the utilities from the GSI repo. In a subsequent PR, when the GSI removes the utilities, that option will become redundant. - updates the gsi-monitor hash and uses `build.sh` from that repository - creates links appropriately. commit aac2f3a6980cb61a0dbe272b667f602e3051cb16 Author: Jessica Meixner Date: Mon Jul 11 06:57:25 2022 +0000 Replace print_esmf with esmf_logkind (#898) Removes print_esmf, as it is no longer a variable, and adds the equivalent new variable, esmf_logkind, to config.fcst. The change has already been made in nems_configure.sh, so the setting only needed to be changed in the config file. commit 4448dd48b2aeebdf4257a2ab08bd147fcb511f92 Author: Walter Kolczynski - NOAA Date: Fri Jul 8 13:03:29 2022 -0400 Restore cycling capability (#895) Restores the ability to run in cycled mode. This requires GSI to be built with ncio/1.1.2, so the build script now sets 1.1.2 as the ncio version until either the GSI default version is updated or module version files are added to global-workflow (Issue #671).
Two new variables, COMIN_OBS and COMIN_GES_OBS, had to be added to the config file in the wake of GSI updates for WCOSS2 to ensure the correct paths are used instead of trying to use the non-functional compath.py. Note that, for now, in order to use cycling, the microphysics setting (imp_physics) must be changed in config.base from Thompson (8) back to GFDL (11), along with a corresponding CCPP suite (like FV3_GFS_v16). Fixes #711 commit a39bd6364971894eee4349b028ebf54ab496dee0 Author: Rahul Mahajan Date: Tue Jul 5 16:10:46 2022 -0400 Merge `ecf` changes from `WCOSS2` into `develop` (#885) * moved ecflow/ecf to ecf with updates from feature/ops-wcoss2 commit 3076e13ca5976206f1cdab58cca1b6f80fbb7716 Author: Fanglin Yang Date: Fri Jul 1 10:33:39 2022 -0400 Fix settings of dt_inner and lheatstrg (#883) For Thompson microphysics, dt_inner is currently set to half of the physics timestep. It should be set to the same as the physics timestep if semi-Lag sedimentation is applied to rain and graupel. Canopy heat storage is incorrectly turned off for NOAH LSM, and on for NOAH-MP LSM. This setting needs to be reversed. Fixes #884 commit 3b9636c4eadc8c31548e170baeefd358483ec71c Author: Ali.Abdolali <37336972+aliabdolali@users.noreply.github.com> Date: Wed Jun 29 18:22:06 2022 +0000 Update buoy locations (#881) Updates buoy locations for those that were incorrect or have been moved. commit f0f1025d07a1af216e2125111ae7bd9ac43213c2 Author: Barry Baker Date: Tue Jun 28 12:05:31 2022 -0400 Update dust input files to not include _FillValue = NaN (#873) There is an issue with the FENGSHA dust inputs having the netCDF attribute _FillValue == NaN. This is in relation to ufs-community/ufs-weather-model#1259, ufs-community/ufs-weather-model#1192 and #872 Some dust emissions files have also been updated. File changes only include the netCDF attribute _FillValue changing from NaN to something appropriate for each variable.
Fixes #873 commit 4c8b388ae4fd1fb618021b5c11d5e9c82c2ee2f7 Author: Walter Kolczynski - NOAA Date: Mon Jun 27 18:14:39 2022 -0400 Replace --aerosols with new apps (#854) Removes the --aerosols option for setup_expt and replaces it with new apps for aerosols to match the idiom for other components. Currently, ATMA and S2SWA are supported. commit 65cdcce2f841f5e3e4926d4dab73fd5a8a97bd84 Author: Walter Kolczynski - NOAA Date: Mon Jun 27 17:47:46 2022 -0400 Update component versions (#851) Updates the versions of components except UFS and verify to the current tip of their respective develop branches. Verify is not yet updated because we are currently on a branch that hasn't been merged to develop yet in order to use the module/py environment fix. GSI will still need to be updated further before use after the resolution of NOAA-EMC/GSI/issues/348 Also updated the GSI and UPP build scripts to take in debug (`-d`), operations (`-o`), and verbose (`-v`) options and apply them as appropriate to the component build scripts. The ops flag for GSI still needs work, as I encountered issues using the [build_4nco_global.sh](https://github.com/NOAA-EMC/GSI/blob/develop/ush/build_4nco_global.sh) or [prune_4nco_global.sh](https://github.com/NOAA-EMC/GSI/blob/develop/ush/prune_4nco_global.sh) scripts. Also fixes a typo in the `parm/post` file list. Updates are in preparation for the [COM reorg](https://github.com/NOAA-EMC/global-workflow/issues/761) commit 2dc2af0e84f52864a719d71d3dbf14f6368f4c0f Author: Jessica Meixner Date: Fri Jun 24 18:49:39 2022 +0000 Update calculation of restart based on if a wave IC exists (#875) Inspect and assert wave initial conditions exist during a RERUN. Take appropriate action. commit b41a36a10b2f949b0c005e48b978261b3d66eb12 Author: Rahul Mahajan Date: Thu Jun 16 22:38:28 2022 -0400 update from dev_v16 utils.f90.
See issue #713 (#868) commit a2b9f483c4aaad37e5d26362186b28a782c0541f Author: Barry Baker Date: Tue Jun 14 23:51:47 2022 -0400 Update GOCART settings for p8/p81 (#818) Wet scavenging coefficients for GOCART are updated. Fixes a bug where, when running atm-aerosol, aerosol settings were not properly read. Changes the location of aerosol emissions data on Hera now that Raffaele has moved to a new position. Dust inputs for the fengsha dust emission scheme are updated. These updates include a newer soil database (SOILGRIDSv2) and updates for the drag partition and threshold velocity. It also appropriately scales the emissions by modifying the alpha value found in the dust component configuration file. Fixes #814 Fixes #815 Fixes #816 commit 027eab90e2cb94a1055f9bc54245e7d5979aca3b Author: Kate Friedman Date: Fri Jun 10 09:10:24 2022 -0400 Retire VSDB (#848) Remove VSDB variables, script blocks, and scripts from global-workflow develop. Refs: #844 commit d6705e2564a698eeddf2424bd580482546658788 Author: Kate Friedman Date: Fri Jun 10 09:05:36 2022 -0400 Add "atmos" COMPONENT subfolder to DMPDIR paths (develop) (#847) - add atmos to WAVICEFILE DMPDIR path in JGLOBAL_WAVE_PREP - add COMPONENT to all DMPDIR paths (COMPONENT=atmos already set) - add "atmos" to all DMPDIR paths in config.anal - add COMPONENT to COMIN_OBS default that uses DMPDIR in drive_makeprepbufr.sh - add COMPONENT to SOURCE_DIR default that uses DMPDIR in getdump.sh - add "atmos" to prep job dependency on updated.status.tm00.bufr_d in setup_workflow.py Refs: #802 commit 59604d60a787fc758380a1ef513601d6ffa97566 Author: Walter Kolczynski - NOAA Date: Fri Jun 10 03:29:16 2022 -0400 Move remaining global post scripts over from UPP (#771) Several scripts currently located under NOAA-EMC/UPP are only used for global, so they have been moved within global-workflow so they can be managed by the global-workflow team.
Scripts are just copied from the current tip of UPP develop, other than some changes to standardize style and indentation. Also coming over are four parm files that are used to produce GFS products. As a result, we are no longer linking the entire upp parm directory; instead we are creating individual links for the files we use that still reside in upp. Fixes #630 Fixes #769 commit d78c942a271e669b1fd5adf4f99af108c9f884fe Author: Jessica Meixner Date: Tue Jun 7 14:56:26 2022 +0000 update readme for new linking procedure (#843) commit 3194f52d1625379e067fd16e84016d45ba5560fc Author: Walter Kolczynski - NOAA Date: Tue Jun 7 01:35:15 2022 -0400 Consolidate post scripts to eliminate coupled mode of link script (#766) The two scripts that had modified versions for the coupled model have been joined with changes to the original scripts, eliminating the need for a special coupled mode for the link script. As part of this, those two scripts are no longer copied from UPP and are now part of the global-workflow repository. Fixes: #679 Fixes: #746 Refs: #270 commit d1d2606406aa2deb8d95f348b471770f4388777c Author: Jessica Meixner Date: Mon Jun 6 04:49:38 2022 +0000 Updates for wave mesh cap (#831) Updates to work with the new WW3 mesh cap, including pointing to the new ufs-weather-model tag for p8c along with associated namelist changes. There is a new setting in config.ww3, `waveMULTIGRID`, that controls whether WW3 uses the generic shell (shel) or multigrid. The build script for WW3 pre/post has been updated to use the app to determine whether the regular switch file (for ATMW) or the meshcap version (other) is used. In light of this, `build_all.sh` now passes the app to WW3 pre/post as it does with the UFS build. A large section of the wave prep script, a hold-over from running WW3 offline from the atmosphere, dealing with pre-processing winds was removed.
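The `waveMULTIGRID` switch described above might reduce to a driver selection like the following. This is a sketch: the `.true.`/`.false.` value format and the function name are assumptions; `ww3_multi` and `ww3_shel` are the standard WW3 multigrid and generic-shell executables.

```shell
# Sketch of selecting the WW3 driver from waveMULTIGRID (value format assumed).
ww3_driver() {
  local waveMULTIGRID=${1:?usage: ww3_driver <.true.|.false.>}
  if [[ ${waveMULTIGRID} == ".true." ]]; then
    echo "ww3_multi"   # multigrid shell
  else
    echo "ww3_shel"    # generic shell, used with the mesh cap
  fi
}
```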
Another large section was moved out of prep and into a new `parsing_namelists_WW3.sh` script that is called during the forecast job (similar to those for some other components). The ability to set `esmf_logkind` for the UFS model is restored (defaults to ESMF_LOGKIND_MULTI). The jlog files were removed for the non-pdgen wave script and ush files. Resources for the coupled model are rebalanced. Fixes #736 commit 7173fa58f0da831ad181a3afdaf3c861afa49e49 Author: Rahul Mahajan Date: Wed Jun 1 17:05:21 2022 -0400 Extend `rocoto.py` for handling offsets in data dependencies. (#835) Extend rocoto.py data_dep to handle offsets to cyclestr in any number of instances. Remove the hack in aerosol_init dependency generation. commit a9998af41cb8c2825f34745091a41ea400815d8d Author: Walter Kolczynski - NOAA Date: Wed Jun 1 14:11:49 2022 -0400 Refactor checkout script (#809) The checkout script is refactored. First, the repetitive checkout code is abstracted into a function that is called for each component. Second, if the clone already exists, checkout will still check out the requested commit. Third, a new option (-c) is added that will delete any existing clones and create a fresh clone. These last two options allow updating of versions without needing to manually delete directories (and remembering which directories are created by checkout). Additionally, a usage statement and code documentation are added. There is no change to the existing command-line options, only new options. No change is necessary by users (although the checkout order and print messages have changed). The new syntax is as follows:
```
checkout.sh [-c][-h][-m ufs_hash][-o]
  -c: Create a fresh clone (delete existing directories)
  -h: Print this help message and exit
  -m ufs_hash: Check out this UFS hash instead of the default
  -o: Check out operational-only code (GTG and WAFS). Only authorized users can check out GTG.
```
Fixes #808 commit 85bca2479420927d29a1a9db41b542568d527c13 Author: Jessica Meixner Date: Wed Jun 1 15:05:07 2022 +0000 Update IC location for hera because climate data moved (#829) The location of the coupled ICs on Hera is changing, so the path in the config needs to be updated. commit dbd9fa93ced07f39accf7ff4eec2859f0ed5d35f Author: ChunxiZhang-NOAA <49283036+ChunxiZhang-NOAA@users.noreply.github.com> Date: Mon May 30 21:50:56 2022 -0400 Updated SDFs and namelist settings for P8c (#795) Updates the model version in preparation for prototype 8c, along with commensurate setting updates. Some CCPP suites are removed from the UFS build as they are no longer available. Also updates the coupled initial conditions, and there is a simultaneous update to fixed orography files. Refs: #736 commit 05accc1c72b1311ae9689bef5e7a1291dc12a96a Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon May 23 17:36:51 2022 -0400 Correct incorrect increment prefix variable (#805) One instance of the increment prefix variable in `ush/forecast_postdet.sh` used `PREFIX_INC` instead of the `PREFIX_ATMINC` used elsewhere. Fixes #804 commit ac7cb1000d5c1d31babd36f4c3e17f643f30fd4d Author: Walter Kolczynski - NOAA Date: Sun May 22 16:50:30 2022 -0400 Remove separate UPP clone (#803) Now that UPP is cloned within UFS, there is no need to create an independent clone in global-workflow. Scripts are updated to use the UPP within UFS. In addition to removing redundancy, this will avoid version mismatch issues when UFS and global-workflow checkout different versions. The upp directory is now linked from within UFS to sorc as upp.fd (instead of the gfs_post.fd the clone was placed in). Additionally, the build script is renamed to build_upp.sh.
Fixes #770 Moots #797 commit 58728fb2c689c11335f4bf8ac371e18070c6fbd2 Author: Walter Kolczynski - NOAA Date: Fri May 20 10:14:58 2022 -0400 Fix build script options (#801) There was an issue where, if build_all.sh was called with only options that are not used by partial_build.sh, partial_build.sh would see those options because $@ was not overwritten. The scripts have been updated to consume those options so they are no longer present if $@ isn't overwritten in the new script call. Also fixed the getopts statement in partial_build.sh so errors are handled by the script instead of by bash (bash would report the wrong script if the script is sourced). Fixes #800 commit fa12db279cf829b861d08c06e3e3534a335e9fd7 Author: Walter Kolczynski - NOAA Date: Wed May 18 01:04:32 2022 -0400 Refactor some build scripts (#794) Refactors the build_ufs script to allow the build of any UFS app to support future expansion of global-workflow capability. The default should be sufficient for most users, as S2SWA can be used for ATM, ATMA, S2S, and S2SW as well. The new format of the command is now:
```
build_ufs.sh [-a UFS_app][-v]
  -a UFS_app: Specify the UFS application to build. The default if none is provided is S2SWA
  -v: Turn on verbose mode (BUILD_VERBOSE)
```
build_all.sh is similarly refactored to include the same options. Also, a new `-c build_config` option is added to specify an alternative list of programs to build.
```
build_all.sh [-a UFS_app][-c build_config][-h][-v]
  -a UFS_app: Build a specific UFS app instead of the default
  -c build_config: Selectively build based on the provided config instead of the default config
  -h: Print usage message and exit
  -v: Run all scripts in verbose mode
```
partial_build.sh is also updated to take the new `-c` option to specify a build config.
```
partial_build.sh [-c config_file][-h][-v]
  -c config_file: Selectively build based on the provided config.
  The default if none is specified is gfs_build.cfg
  -h: Print usage message and exit
  -v: Run in verbose mode
```
In addition to the above, build_all and partial_build had their indentation redone and their usage/help statements updated/added. Also, the build configuration file was renamed from fv3gfs_build.cfg to gfs_build.cfg. Fixes #745 Fixes #751 commit 2abee02765a3bae48f2b4e6a900400d6731eda06 Author: Walter Kolczynski - NOAA Date: Wed May 18 01:02:57 2022 -0400 Add continuous aerosols support (#693) Adds support for continuous aerosols when running with GOCART (using the forecast aerosol tracers from the previous cycle as initial tracer fields). This adds a new task to the workflow mesh in forecast-only mode. The new task does not run in the first cycle, though it will appear in rocotostat/rocotoviewer. In subsequent cycles, the task will launch as soon as the needed restart files from the previous cycle are available and gfs_init has completed. There is currently no mechanism to have aerosol_init run for the first cycle (for instance, continuing from a previous experiment); users would have to run the scripts off-line to add aerosols to the initial conditions. To support the introduction of the necessary python scripts, a new python environment is added to provide necessary libraries. This virtual environment is temporarily being housed in personal space, but an issue has been opened with hpc-stack to add it to the standard stack installations. The introduction of miniconda also caused the conversion of all of the rocoto entry scripts (the ones rocoto calls to run the job) to bash. Most of them had been in Korn shell (ksh). To make sure the needed restart files are produced, config.fcst now makes sure the forecast cadence is included in the list of restart times. This may need more testing to ensure it works properly with other restart lists. As part of this, STEP_GFS was updated to properly determine the step from gfs_cyc instead of being a set value.
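The STEP_GFS change at the end of the commit above amounts to simple arithmetic on `gfs_cyc` (gfs forecasts per day). A sketch, assuming the usual workflow convention that `gfs_cyc=0` means gfs forecasts are disabled; the function name is illustrative.

```shell
# Derive the interval in hours between gfs cycles from gfs_cyc instead of
# hard-coding it. gfs_cyc is the number of gfs forecasts per day.
step_gfs() {
  local gfs_cyc=${1:?}
  if (( gfs_cyc > 0 )); then
    echo $(( 24 / gfs_cyc ))   # e.g. 4 cycles/day => every 6 hours
  else
    echo 0                     # gfs forecasts disabled
  fi
}
```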
Closes #366 Fixes #516 Fixes #630 commit 69b39ee5af8cfbb5cbf5cf62c92d01958b697652 Author: Kate Friedman Date: Thu May 12 12:18:50 2022 -0400 Add --init to GSI submodule command (#781) Needed to obtain the libsrc submodule during checkout; this allows the GSI to build correctly since this GSI hash still needs the libsrc submodule during build. Refs: #780 commit b09f92c21028ea37e4663ce4c304935a2dc45b33 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu May 12 08:20:16 2022 -0400 Update field_table to be consistent with UFS (#778) Changes the fixed surface value of hydrometeor number concentrations and sub-grid TKE to 0. Also renames the field table. Both changes are consistent with changes made in UFS. Fixes #676 commit d72a62d6c0ab2e9f53ca3e6133148f29982f9d32 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu May 12 08:15:23 2022 -0400 Remove special MOD_PATH from build scripts (#775) We always use pre-installed libraries and special MOD_PATHs are no longer necessary, so these are removed from the build scripts. This also moots MOD_PATHs that were hard-coded to Hera locations in some scripts. The MOD_PATH in build_ufs.sh remains as it is still needed and is a relative path into the UFS repo. Fixes #298 commit 2adf123814a2f29e9cca54a11fbb2807b3f6cf1b Author: Kate Friedman Date: Wed May 4 12:31:20 2022 -0400 Revert g2tmpl back to 1.10.0 (#773) Resolves a bug with a mismatch between hpc-ips and g2tmpl versions on the WCOSS-Dells. Also reverts back to 1.10.0 on Hera and Orion to be consistent. Refs: #772 commit b90f4e99bc42f9a8afbeaa9479e674fb75e096ea Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Tue May 3 09:45:19 2022 -0400 Remove git submodule update for upp checkout (#768) UPP no longer has a CMakeModules submodule, so the submodule update is removed from the checkout script.
Fixes: #767 commit ad2b14d9b1ed0fe89fe59bfb96cc03630e8457de Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Tue May 3 09:44:57 2022 -0400 Remove duplicate file linking for Thompson MP (#765) Files for Thompson MP were being copied twice by the link script: once by checking imp_physics, then again checking the CCPP suite. This is now reduced to one check. Fixes: #675 commit a9fc0033b908aefb42d5b2d191c65af5cb7660c4 Author: Kate Friedman Date: Thu Apr 28 15:17:22 2022 -0400 Remove unneeded/outdated modulefiles (#762) Remove unneeded modulefiles for WCOSS-Cray and GSI monitoring. commit 3333dee304b6b53d3d716c60d0c87ad06d629c1c Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Thu Apr 28 14:35:57 2022 -0400 Remove prepbufr_pre-qc from archive lists (#753) The prepbufr_pre-qc file does not need to be archived, so it no longer is. Fixes #361 commit 88b1b1591d3201709f5f981908aaf7890d3021bd Author: Walter Kolczynski - NOAA Date: Tue Apr 26 20:40:41 2022 -0400 Convert module files from Tcl to Lua (#756) Converts all module files from Tcl format to Lua. As a side effect, this change enforces using the files as modules rather than improperly sourcing them. Fixes #670 commit ca80aeb86cb221263611605ba23858448bce0bad Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Mon Apr 25 06:23:06 2022 -0400 Ensure hourly files when GLDAS is on (#740) GLDAS requires hourly output, so make sure FHOUT is 1 when GLDAS is on. Fixes #695 commit 9d00239088651b73a9667b70bd6247b34e35046f Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Tue Apr 19 21:22:22 2022 -0400 Update cyclone tracker to handle return code correctly #642 Updates the cyclone tracker call to correctly exit with an error code if the tracker script reports one.
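The tracker fix above is the familiar pattern of capturing a child script's status and propagating it rather than swallowing it. A minimal sketch; the function and message wording are illustrative, not the workflow's actual code.

```shell
# Run a command and propagate its return code instead of discarding it.
run_and_check() {
  "$@"
  local rc=$?                  # capture the child's exit status immediately
  if (( rc != 0 )); then
    echo "FATAL: '$*' failed with rc=${rc}" >&2
    return "${rc}"             # propagate the failure to the caller
  fi
}
```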
Fixes #507 commit 33d3dd14d6df8109274dde4606bb7a50e4638fa2 Author: Jessica Meixner Date: Tue Apr 19 21:18:56 2022 -0400 Update ww3 pre and post exe build to use cmake (#731) Updates the WW3 pre/post build system to use cmake. Fixes #688 commit 39facec1e03a404f84fd05b8739cd8e4010a0779 Author: Kate Friedman Date: Tue Apr 12 11:36:30 2022 -0400 Add WCOSS2 GFSv16.2 operational def files into develop (#717) Add WCOSS2 operational GFS def files. Refs: #399 commit 0a85223e48a1a8dcced163791d8e7ae5a42a4fab Author: Xianwu Xue - NOAA <48287866+XianwuXue-NOAA@users.noreply.github.com> Date: Tue Apr 12 11:34:17 2022 -0400 Initialize "err" to avoid potential task fail (#715) If err is not initialized and "set -x" is changed to "set -xue", the waveinit task fails with the error message: "line 234: err: unbound variable" On branch bugfix/issue_714 Changes to be committed: modified: scripts/exgfs_wave_init.sh Refs: #714 commit fc6e1c3316397432d2da218b3c4fc0ea1fe70644 Author: Rahul Mahajan Date: Tue Apr 5 11:07:21 2022 -0400 config.base should not exist in the repository unless it is in ops. Only the template exists in the repo; the experiment directory should not contain the templated config.base.emc.dyn (#707) commit cabf437044cb937caf7a7830068fd9e413c5921f Author: Jessica Meixner Date: Wed Mar 30 21:28:50 2022 -0400 Update model and settings for Prototype P8b (#681) Updates model and setting defaults for prototype 8b. The UFS version is updated to a recent version of UFS (tag Prototype-P8b). As part of this, the print_esmf option is removed from the model configure file and replaced with a new setting in nems.configure. We have not made this option user-configurable at this point, but that may come in the future. The default CCPP suites have been updated for all modes to either FV3_GFS_v17_p8 or FV3_GFS_v17_coupled_p8. This involves changing the microphysics to Thompson and changing the gravity wave drag version (knob_ugwp_version) to 0.
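The `err` fix earlier in the log (#715) is worth spelling out: under `set -u`, referencing a never-assigned variable aborts the script, so `err` must be initialized before its first read. A sketch only; the line number and messages are from the issue report, and the surrounding logic is simplified.

```shell
set -u     # treat expansion of unset variables as an error
err=0      # initialize before any reference; without this line, the first
           # read of $err below would abort with "err: unbound variable"
# ... wave init steps would accumulate failures into err here ...
if (( err != 0 )); then
  echo "FATAL: wave init failed with err=${err}" >&2
  exit "${err}"
fi
```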
This change breaks cycled mode for the time being. A future PR will revert the defaults to ones that allow cycled mode to run, along with a more robust system for changing settings. Tiled fix files are now used for all modes instead of just coupled. Cellular automata now defaults to ON. There are also other miscellaneous settings that have had their defaults changed. Fixes #641, #687 commit e3f707cb9c05b45e9d4019679e406f54008279a5 Author: XuLi-NOAA <55100838+XuLi-NOAA@users.noreply.github.com> Date: Thu Mar 24 05:43:31 2022 -0400 Modify eobs for EnKF thinning In the Hybrid EnKF GSI, the thinning box size, dmesh, is different for the full-resolution analysis and the EnKF. These are set in GSI scripts for the full-resolution analysis, and reset in the global workflow in config.eobs. Previously, only two thinning box sizes, dmesh(1) & dmesh(2), were defined in GSI scripts; recently, two more, dmesh(3) & dmesh(4), were added. Accordingly, these two need to be added in config.eobs as well. Specifically, add dmesh(3)=225.0,dmesh(4)=100.0 to config.eobs. Fixes #595 commit c32eea459af3177cee28e151b43fe5e12bdfc412 Author: Walter Kolczynski - NOAA Date: Tue Mar 22 11:18:10 2022 -0400 Update ocean resolution for C48 (#651) The ocean resolution for C48 was set as 1-deg, but the fractional grid fix files available at C48 are for a 4-deg ocean grid. The ocean resolution is now set to this value when FV3 is C48. Also commented out a currently redundant block setting OCNRES in config.ocn. The one in ecfs has to stay as it recalculates the ocean resolution based on the EnKF resolution. Fixes #650 commit d758e8b227fca7b5a3320adc43c29ba492c3bfe9 Author: Walter Kolczynski - NOAA Date: Thu Mar 10 05:54:49 2022 -0500 Add single forecast GOCART support (#659) Adds support for running single coupled free forecasts with GOCART aerosols. Support for continuous aerosol fields from cycle to cycle will be added in a future PR. To turn on aerosols, there is a new --aerosols option for setup_expt.py.
The option takes no arguments. When it is used, aerosols will be turned on by setting DO_AERO="YES" in config.base. GOCART output files are placed in the chem directory in COM. There is a new tarball archived, chem.tar, that contains the GOCART output. Partially addresses #516 Fixes #403 commit a0e23e254c7c881b20d66c1c53cd36de509e6d69 Author: Walter Kolczynski - NOAA Date: Tue Mar 1 10:50:39 2022 -0500 Specify memory for init job (#669) The init job has been failing on HPC using slurm due to out-of-memory errors. Now the init job specifies the amount of memory needed. Fixes #631 commit 6874e8939211fdc07143450bfd1b6a2863172e10 Author: arun chawla <49994787+arunchawla-NOAA@users.noreply.github.com> Date: Fri Feb 25 16:39:45 2022 -0500 Cleanup of utils directory (#660) Removes all of the old, unused GSM scripts and code from the utils directory. Those codes that might still be used have been moved to a new GSM-utils repo. Fixes #618 commit e3d64abc9c02aecb9d5d38755f6524bf9e277e66 Author: Walter Kolczynski - NOAA Date: Thu Feb 24 22:49:45 2022 -0500 Fix bug with partition_batch on WCOSS (#668) partition_batch was not being defined for WCOSS machines, but was being used in a substitution for config.base. Fixes #667 commit 9d75d8d78037c7a74d1a9c61a88d9d6a7e9b3be9 Author: Walter Kolczynski - NOAA Date: Mon Feb 21 02:10:00 2022 -0500 Split output filetype variable into atm and sfc (#602) The determination of the FV3 output filetype was a bit unwieldy as it contained two different settings (one for the atmosphere and one for the surface) that were not necessarily changed at the same time. They were also being determined in a different location than other settings based on the model resolution. Now the old OUTPUT_FILETYPES variable has been split into two different variables, OUTPUT_FILETYPES_ATM and OUTPUT_FILETYPES_SFC. The determination was also moved into the config.fv3 file, where other resolution-dependent computational settings are set.
This has resulted in some functional change on WCOSS Dell, as the EnKF was using different chunking settings and never used parallel output for the surface, likely because it wasn't being run at a low enough resolution to consider it. However, there is no reason to believe the two forecast modes should have different chunk or output settings. So, I've used the EnKF chunking settings and the free forecast switchover point for the surface output mode. The new filetype settings are now also divorced from the OUTPUT_FILE setting. However, that setting will soon be unnecessary as nemsio is removed from the code as an option (see Issue #601). Fixes #600 Refs #601 commit cb8b5adf16200e7b01b8236a960efce5b6d8ce5d Author: Walter Kolczynski - NOAA Date: Thu Feb 17 15:55:24 2022 -0500 Correct MODE comparisons for forecast only (#658) When the setup_expt scripts were combined, the forecast mode became a mandatory argument. The value of this option is then directly used for the MODE variable, but the argument name (forecast-only) does not match what was previously used for MODE in the forecast-only script (free), and some scripts were still testing against the old value instead of the new one. Those comparisons have now been updated to use the new MODE name. Fixes #657 commit f8867d3f0b1f3b5d238dced412996c56e30d31a1 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Thu Feb 10 20:23:54 2022 -0500 Add LOCALARCH option (#628) Adds the option to archive output locally during archive jobs. This is intended for systems that do not have access to HPSS (e.g. Orion and soon S4), but can be used on any machine. To enable, the LOCALARCH setting in config.base should be set to "YES" (HPSSARCH must be "NO"). When enabled, the tarballs normally created on HPSS will instead be created in the local directory specified by $ATARDIR. Defaults have been added to setup_expt.py to point to a local ATARDIR and LOCALARCH (currently =NO). Fixes #624.
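The LOCALARCH/HPSSARCH branching described for #628 above can be sketched as follows. This is a minimal illustration, not the actual archive script; the directory layout and the plain `tar` call are hypothetical stand-ins (the HPSS path would use `htar`):

```shell
#!/bin/bash
# Sketch: archive locally when LOCALARCH=YES and HPSSARCH=NO (#628).
set -eu

HPSSARCH="NO"
LOCALARCH="YES"
ATARDIR=$(mktemp -d)      # stand-in for the local archive directory

srcdir=$(mktemp -d)       # stand-in for files to be archived
touch "${srcdir}/gfs.t00z.logf000.txt"

if [ "${HPSSARCH}" = "YES" ]; then
  : # htar -cvf "${ATARDIR}/gfsa.tar" ...   (HPSS path, not exercised here)
elif [ "${LOCALARCH}" = "YES" ]; then
  # Same tarball name, but written to the local ATARDIR instead of HPSS
  tar -cf "${ATARDIR}/gfsa.tar" -C "${srcdir}" .
fi
```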
commit 2950c7b97f99341caa738526138022987f665627 Author: Jessica Meixner Date: Thu Feb 10 18:07:43 2022 -0500 Turn on fractional grid by default for uncoupled forecasts (#638) Updates input.nml so that there is more consistency between the cpl and standalone atm input.nml. In particular this addresses the fact that frac_grid was only being set if cpl was true. Fixes #571 commit 64b1c1e5ce37fba48dd717bd11356a6f57d9def4 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Wed Feb 9 17:02:39 2022 -0500 Specify warm start as .true. or .false. (#644) Corrects the assignment of EXP_WARM_START to either .true. or .false. depending on the value passed to setup_expt.py via `--start`. Fixes #643 commit e537f0cecabc8e16e26b14e606558de906b19e3f Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Feb 8 23:33:21 2022 -0500 Archive TC tracking logs if produced (#627) The archive job was failing if there were no tropical cyclone files (such as when there are no cyclones). This adds a check for the TC tracking logs in case they are not produced to prevent the gfsarch job from failing on cycles when there were no TCs to track. Fixes #625. commit 32f93becde5ffa07c162252b95417845f2ab5159 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Feb 8 20:59:37 2022 -0500 Fix separate threading for GFS and GDAS forecasts. (#621) Fix separate threading options for GDAS and GFS forecasts (#610). This is performed by keeping nth_fcst_gfs separate from nth_fcst and declaring the new variable npe_node_fcst_gfs. Fixes #610 commit 9bb09a92c55d4534feca5d521fbbd1c664730317 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Feb 8 20:58:05 2022 -0500 Updated the RadMon and OznMon links. 
#596 (#597) commit 97ebc4d4e6483b135468d85cd9ed974f88955914 Author: Rahul Mahajan Date: Thu Feb 3 02:56:34 2022 -0500 Unify `setup_expt.py` and `setup_expt_fcstonly.py` (#537) `setup_expt.py` and `setup_expt_fcstonly.py` are unified under the former name. The user is now required to provide a `mode` as the first positional argument to `setup.py`. Valid options are `cycled` and `forecast-only`. ``` ❯ python3 setup_expt.py -h usage: setup_expt.py [-h] {cycled,forecast-only} ... Setup files and directories to start a GFS parallel. Create EXPDIR, copy config files. Create COMROT experiment directory structure, link initial condition files from $ICSDIR to $COMROT positional arguments: {cycled,forecast-only} cycled arguments for cycled mode forecast-only arguments for forecast-only mode optional arguments: -h, --help show this help message and exit ``` Upon choosing one of these modes, options specific to the mode can be realized as follows for the `forecast-only` and `cycled` modes respectively. ``` ❯ python3 setup_expt.py forecast-only -h feature/unify-setups usage: setup_expt.py forecast-only [-h] [--pslot PSLOT] [--resdet RESDET] [--comrot COMROT] [--expdir EXPDIR] --idate IDATE --edate EDATE [--icsdir ICSDIR] [--configdir CONFIGDIR] [--cdump CDUMP] [--gfs_cyc {0,1,2,4}] [--start {warm,cold}] [--app {ATM,ATMW,S2S,S2SW}] optional arguments: -h, --help show this help message and exit --pslot PSLOT parallel experiment name --resdet RESDET resolution of the deterministic model forecast --comrot COMROT full path to COMROT --expdir EXPDIR full path to EXPDIR --idate IDATE starting date of experiment, initial conditions must exist! 
--edate EDATE end date experiment --icsdir ICSDIR full path to initial condition directory --configdir CONFIGDIR full path to directory containing the config files --cdump CDUMP CDUMP to start the experiment --gfs_cyc {0,1,2,4} GFS cycles to run --start {warm,cold} restart mode: warm or cold --app {ATM,ATMW,S2S,S2SW} UFS application ``` ``` ❯ python3 setup_expt.py cycled -h feature/unify-setups usage: setup_expt.py cycled [-h] [--pslot PSLOT] [--resdet RESDET] [--comrot COMROT] [--expdir EXPDIR] --idate IDATE --edate EDATE [--icsdir ICSDIR] [--configdir CONFIGDIR] [--cdump CDUMP] [--gfs_cyc {0,1,2,4}] [--start {warm,cold}] [--resens RESENS] [--nens NENS] [--app {ATM,ATMW}] optional arguments: -h, --help show this help message and exit --pslot PSLOT parallel experiment name --resdet RESDET resolution of the deterministic model forecast --comrot COMROT full path to COMROT --expdir EXPDIR full path to EXPDIR --idate IDATE starting date of experiment, initial conditions must exist! --edate EDATE end date experiment --icsdir ICSDIR full path to initial condition directory --configdir CONFIGDIR full path to directory containing the config files --cdump CDUMP CDUMP to start the experiment --gfs_cyc {0,1,2,4} GFS cycles to run --start {warm,cold} restart mode: warm or cold --resens RESENS resolution of the ensemble model forecast --nens NENS number of ensemble members --app {ATM,ATMW} UFS application ``` Note, `cycled` mode presents some extra options e.g. `nens` as well as a reduced list of the UFS weather model applications. The functionality of `--icsdir` had been broken for cycled and was hard-coded in free forecast. The functionality has now been repaired for cycled. If you provide one, $COMROT will be populated with appropriate links. If none is specified, no links will be created in $COMROT. In coupled mode free-forecast, ICs are copied *to* icsdir from the central maintained prototype location. Coupled users will now need to set this explicitly. 
For non-coupled forecast-only, this setting currently does nothing. The default value for `--configdir` has been updated to the appropriate location in the workflow. Most users will no longer need to set it unless they want to point to a different config source. The default values for `--comrot` and `--expdir` are updated from None to $HOME to facilitate offline testing of workflow creation. There are some irrelevant sections, such as `gfs_cyc` in forecast-only, that are still preserved in this PR. They will be cleaned up in subsequent PRs. Another unnecessary complication is the `--start` argument. The logic presented here would ideally be selected at runtime based on the type of ICs populated in comrot. It is left unchanged. commit d7319f19aceca6ae6d7ce9b06c6eb731832d1de1 Author: Walter Kolczynski - NOAA Date: Wed Feb 2 11:19:43 2022 -0500 Stop archiving gfsarch.log as it is being written (#581) The gfs archive job was failing because it was attempting to archive its own log file into gfsa.tar while it was being written. To exclude that file pattern, bash extended globbing is turned on, which allows the use of a negating group. Fixes: #558 commit 1b300dbf98eccdf03117b3795a2d8da3310a6126 Author: Walter Kolczynski - NOAA Date: Wed Feb 2 11:18:55 2022 -0500 Fix build on non-WCOSS2 machines (#612) The UFS_UTILS and GLDAS versions are updated to correct build problems on development machines. Each had been using a beta version of ESMF that was removed from the hpc-stack installation without warning. Additionally, GLDAS had introduced bugs into their build scripts during the WCOSS2 port. These issues are now all corrected in the new versions. Also updates the UFS_UTILS repository to its new location under UFS instead of EMC.
Fixes #476, #561 commit d3028b9d8268028226f9c27800fcd6655e9e4bb8 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Wed Jan 19 14:29:51 2022 -0500 Fix setting of OUTPUT_FILETYPES (#589) The setting of OUTPUT_FILETYPES was being overwritten by the FV3 model configure parsing script, so the settings determined in the config.fcst file were being ignored. Now that block is removed and config.fcst is updated to make sure it is set for any machine. Fixes #588 commit 13421b01a07e5d1cca32ee7579a4094d8209b072 Author: Rahul Mahajan Date: Wed Jan 19 12:16:29 2022 -0500 Update PR template commit 322a61a61238e8486cb42d7d26282b2728d0c32f Author: Rahul Mahajan Date: Wed Jan 19 12:13:33 2022 -0500 Move PR template MD file. Update issue templates commit 7a52fc8d8bbfc5896b7d03004d5189b2a6b24013 Author: Rahul Mahajan Date: Wed Jan 19 11:32:51 2022 -0500 rename pull_request_template.md commit 88ec66eb834e9a78c94feae4f37f774c7807aa3c Author: Rahul Mahajan Date: Wed Jan 19 10:34:54 2022 -0500 make changes to the templates commit 66d84e21ad7ae3fafb4ad5097faf86dfa1b99b60 Author: Walter Kolczynski - NOAA Date: Wed Jan 19 10:08:45 2022 -0500 Fix lfrac entry in diag table (#570) Land fraction was listed in the wrong module, keeping it from being written to output. Refs: #562 commit e2657adbdadaac7089591390428324c6aec260a9 Author: Rahul Mahajan Date: Tue Jan 18 17:04:04 2022 -0500 Update NCO_bug_report.md commit a9d6851a0c3b6051cb1c7c7ec482fc50d762bef4 Author: Rahul Mahajan Date: Tue Jan 18 16:59:18 2022 -0500 Update bug report issue template commit 2fab8b4213645b79b3fea8127e7d4ab21044188e Author: Rahul Mahajan Date: Tue Jan 18 15:13:53 2022 -0500 Add templates for Github (#560) Adds GitHub templates for new issues and PRs to standardize and make sure all the needed information is included.
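The extended-globbing fix described for #581 above excludes the archive job's own log from the tarball with a negating group. A minimal sketch (the file names besides gfsarch.log are hypothetical; the real job passes the expansion to htar/tar):

```shell
#!/bin/bash
# Sketch: use bash extglob so !(pattern) matches everything except pattern.
set -eu

workdir=$(mktemp -d)
touch "${workdir}/gfsarch.log" "${workdir}/gfsfcst.log" "${workdir}/gfspost.log"

shopt -s extglob           # enable the extended !(...) negating group
cd "${workdir}"

# Expands to every file except the log still being written
files=( !(gfsarch.log) )
echo "${files[@]}"
```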
commit 03be05309b184421c473f1b43082b766b114fd46 Author: Walter Kolczynski - NOAA Date: Tue Jan 18 10:02:02 2022 -0500 Fix toggle for building workflow_utils (#580) The fv3gfs_build.cfg did not have a space between the periods and the setting for workflow_utils, which means the setting was ignored and the default of yes was always used. Fixes: #577 commit aadd3bae3d4fdc7827a93ab51b336a7f6b2fdb02 Author: Walter Kolczynski - NOAA Date: Tue Jan 18 10:01:24 2022 -0500 Remove redundant toggle for gldas build (#579) The build cfg files controlling which components are built and the partial_build.sh script all have two instances trying to set the value for gldas. This means one of the settings did nothing when changed. Fixes: #578 commit 86d4b305a070c080dcfd44a3f9a31ca2280759fa Author: Walter Kolczynski - NOAA Date: Fri Jan 14 01:05:59 2022 -0500 Update config missed in last GSI update (#569) When the GSI version was updated in PR #530, updates to the config.anal file were mistakenly omitted. That file is now updated following ops (PR #451, #489). Refs: PR #530 commit 33308ea3b477db17a5db455a4c4887e3d5cbc711 Author: Jessica Meixner Date: Thu Jan 6 00:44:43 2022 -0500 Updates for P8a (#538) Updates ufs-weather-model to the 2021 Dec 23 commit and the matching UPP hash. Coupled settings are updated to run the P8a mini prototype. 
Updates include: Turn on ice-albedo feedback in atm (Requires changing input.nml to set use_cice_alb=true in &gfs_physics_nml ) Updates to CA namelists Updates for NOAH-MP, which require an input.nml update to set iopt_sfc="3" and modifying parm_fv3diag to include pahi, pah_ave, ecan_acc, etran_acc, edir_acc, wa_acc, lfrac in the grib outputs "gfs_phys", "pahi", "pahi", "fv3_history2d", "all", .false., "none", 2 Closes #525 commit ae7092405e7552f76674b33fc81cb3973a68ad4e Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Mon Jan 3 17:28:47 2022 -0500 Reordered CDUMP_LIST Refs #541 (#542) commit 033b1d8ef29b7c829e8131758135b372ae61de26 Author: Walter Kolczynski - NOAA Date: Sun Dec 19 22:18:48 2021 -0500 Update GSI version to 2021 Dec 14 (#530) commit b187e2aa15bfd7600c49627fd4c2d02915b50abe Author: Walter Kolczynski - NOAA Date: Thu Dec 16 22:15:45 2021 -0500 Replace all backticks for command substitution (#526) All instances where backticks are used for command substitution are replaced with $( ). This standardizes usage around $( ), which can be nested and cannot be confused with single quotation marks. Refs: #397 commit dd03ed0953bfc65f5a9c85529dd2c95ac0625189 Author: Walter Kolczynski - NOAA Date: Fri Dec 10 15:35:00 2021 -0500 Add coupled support and update UFS (#500) Adds support for the full coupled model except aerosols (FV3-WW3-MOM6-CICE) following the prototype settings. Support for aerosols will be added soon. This also updates the UFS version to develop as of Oct 7. There are associated additions or changes to settings, but we've tried to have defaults maintain the same behavior as previously. Issues related to memory in prep and ocnpost on Hera and Orion due to changes in the slurm configuration have been addressed.
Due to a change in the dycore that uses the checksum in the NetCDF files to check for data integrity, warm start ICs may need an additional offline step before use to update the checksum: ``` ncatted -a checksum,,d,, ${RESTARTinp}/${PDY}.${cyc}0000.fv_core.res.tile1.nc ./fv_core.res.tile1.nc ncatted -a checksum,,d,, ${RESTARTinp}/${PDY}.${cyc}0000.fv_tracer.res.tile1.nc ./fv_tracer.res.tile1.nc ``` A new setting APP controls what components (and their respective jobs) are turned on. The setting can be found in config.base, but can be set at experiment setup time using the new --app option. Recognized values are ATM (atm-only), ATMW (atm and waves), S2S (atm-ocn-ice), S2SW (atm-ocn-ice-wav). If no value is given, the default is ATM, so there is no change in behavior if you omit the option. For now, these values follow their UFS equivalents, but they may diverge in the future based on the needs of global workflow. ATMW is known to not quite work; S2S is so far untested. When run using the S2SW app, the workflow will automatically substitute the coupled IC copying job in place of the normal gfsprep jobs and use roughly the settings from prototype 7.2 (except aerosols). The alternate settings for the coupled configuration (compared to ATM) are taken from the config.defaults.s2sw file. Coupled prototype ICs are currently being maintained on Hera, Orion, and WCOSS-Dell. The locations are set in the config.coupled_ic. There is a base location (BASE_CPLIC) and a setting for each component specifying a subdirectory within the base location. There are new options available for the checkout.sh, build_all, and link_fv3 scripts: - checkout.sh has a new -m option that allows you to override the UFS hash that is checked out. - build_all.sh has two new options: -c to build UFS as the coupled model, and -a to build UFS as ATM-GOCART (but aerosols are not fully supported yet). These options are then passed to build_ufs.sh. 
The options are temporary, as all configurations will be available from a single executable soon. When building for coupled, the subcomponents built are controlled by cpl_build.cfg. - link_fv3.sh is renamed to link_workflow.sh and adds an optional third argument, "coupled". When provided, the files necessary for coupled are linked. This option is also temporary, as the link script will be updated to always link these files. ### To run in coupled mode with prototype settings/inputs 1. Clone as usual 2. Run checkout.sh 3. Run build_all.sh with the -c option `./build_all.sh -c` (the -c option is temporary) 4. Run link using the coupled argument: `./link_workflow.sh emc coupled` 5. Run setup_expt_fcstonly.py using the --app option S2SW 6. Modify config files as necessary. config.defaults.s2sw will automatically override some settings when app is S2SW 7. Run setup_workflow_fcstonly.py as usual 8. Set up rocoto to run your experiment ### Technical updates Converts global forecast into a set of modular scripts. The scripts define a bunch of functions to be called, with separate functions for each component in each classification. For instance, there is an FV3_GFS_predet, MOM6_predet, CICE_predet, etc. that are conditionally run if the associated component is on. The diag table for AOD 550 (diag_table_aod) is reduced to just the portion needed in addition to the normal diag table so it can be appended. Ideally, in the future, we extend this treatment so diag tables can be built based on settings instead of having separate ones for each combination. Wave input file templates that were formerly kept in the fix directory are now more appropriately moved inside the global-workflow in parm. The group labels for the post jobs now indicate the forecast hours included instead of just an index. This required minor updates to check for 'anl' instead of 0 for analysis jobs.
The checkout script now writes its logs to the logs/ directory the same way as build does, instead of leaving them in sorc/. The UFS model source has been moved to a ufs_model.fd directory, and the build script has been renamed build_ufs.sh. Error checking was added to the build_ww3prepost.sh script. The archive script has been streamlined and the section that saves gaussian grid files was fixed, but turned off by default (see PR #517). rocoto_viewer has been updated to python3 and the dependency on prod_utils was removed, which makes the script more portable. Be sure to load a sufficient version of python to use it. For those coming from coupled-crow: ice and ocean output files are now linked to COM from the forecast run directory instead of being copied at the end of the forecast. This allows post to run as files are produced instead of after the forecast is complete. Ocean post jobs are handled the same as atmosphere post: forecast hours are grouped into a number of tasks determined by NPOSTGRP in config.ocnpost. The ocean and ice output are now also located in their own component directories in COM (the joint ocean-ice files are located in the ocean directory for now). The atmosphere latitudes are reversed from coupled-crow (see c59260b0). Co-authored-by: jikuang Co-authored-by: Kate.Friedman Co-authored-by: Rahul Mahajan Co-authored-by: JessicaMeixner-NOAA Co-authored-by: JianKuang-UMD <51758200+JianKuang-UMD@users.noreply.github.com> Co-authored-by: lgannoaa <37596169+lgannoaa@users.noreply.github.com> commit 8abe1dfa8d613f9398cd86564046b53235f62749 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Wed Dec 8 16:36:52 2021 -0500 Fix resource assignment issue found in analysis job running global_cycle (#467) Correct settings for nth_anal on WCOSS-Dell and nth_cycle everywhere.
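The backtick replacement standardized in commit b187e2a (#526) above is purely syntactic. A minimal sketch of why `$( )` is preferred; the example commands are illustrative, not taken from the workflow scripts:

```shell
#!/bin/bash
# Sketch: $( ) command substitution nests cleanly; backticks do not.
set -eu

# Old style needs escaped inner backticks and is easy to misread as quotes:
#   outer=`echo \`printf '%s' hello\` world`

inner=$(printf '%s' "hello")
outer=$(echo "$(printf '%s' "${inner}") world" | tr 'a-z' 'A-Z')
echo "${outer}"
```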
commit fdffeaaf013f638ac43ad421981e48fabf7a7070 Author: malloryprow Date: Thu Nov 18 14:17:02 2021 -0500 Update gfsmetp, gfsarch dependencies (related update of EMC_verif-global tag) (#508) * Update gfsmetp and gfsarch dependencies * Update EMC_verif-global tag to verif_global_v2.8.0 Refs: #437, #472 commit 108abc589b746b2ba585d18a6b6587e5a0d89f39 Merge: f6f1bb702 e5cd63693 Author: Walter Kolczynski - NOAA Date: Tue Nov 16 10:11:39 2021 -0500 Merge pull request #497 from NOAA-EMC/feature/shebang Update ush/ script files with "python3" in shebang commit e5cd636930b0e36ae9b2b25697cda317a2df996d Author: jikuang Date: Fri Nov 12 11:50:53 2021 -0600 update the following files with "python3" in shebang commit f6f1bb7026bcb4983b0f9e921a6332b6722fe724 Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Wed Nov 10 10:08:42 2021 -0500 Add flag to launcher command to prepend task number to lines in stdout/err (#493) Modified env file settings for launcher variable to prepend task number label with "-l" flag. commit 096fc1fb76ee01aff7908b91adc9851da9235cad Author: Rahul Mahajan Date: Tue Nov 9 08:47:22 2021 -0500 Update NSSTBUFR file logic in prep job (#469) Add switch in config.prep to toggle NSSTBUFR file creation in prep.sh. Add logic in prep.sh to copy NSSTBUFR from DMPDIR if not creating as part of the workflow. NSSTBUFR is created with different DTYPS_nsst before 2020102200. 
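The NSSTBUFR switch described for #469 above follows a make-or-copy pattern. This sketch is hypothetical: the variable names echo the commit text, but the file names and directories are stand-ins, not the actual prep.sh logic:

```shell
#!/bin/bash
# Sketch: build NSSTBUFR in the workflow, or copy it from DMPDIR (#469).
set -eu

MAKE_NSSTBUFR="NO"            # toggle as would be set in config.prep
DMPDIR=$(mktemp -d)           # stand-in global dump directory
DATA=$(mktemp -d)             # stand-in job working directory
touch "${DMPDIR}/nsstbufr"    # pretend the dumped file exists

if [ "${MAKE_NSSTBUFR}" = "YES" ]; then
  : # create NSSTBUFR here as part of the workflow
else
  # not creating it, so stage the copy from the dump directory
  cp "${DMPDIR}/nsstbufr" "${DATA}/nsstbufr"
fi
```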
commit 84194a735b46c907656fe2b2cca3ca97799396f0 Merge: a39cb11ef 0740b5b6c Author: Walter Kolczynski - NOAA Date: Wed Oct 27 11:52:55 2021 -0400 Merge pull request #474 from NOAA-EMC/feature/setx2 Hide module load commands commit 0740b5b6c622d64b770c73aa47d9789494d94a99 Author: jikuang Date: Tue Oct 26 13:19:07 2021 -0500 wrap module load commands with set +x commit a39cb11ef9e72b2913d81521d6f9e51da3a8bc22 Author: Kate Friedman Date: Wed Sep 29 09:36:31 2021 -0400 Update EMC_verif-global tag to verif_global_v2.5.2 (#450) New tag provides the following updates since the verif_global_v2.2.1 tag: - Added capability to produce a scorecard (no need to use METviewer AWS to produce scorecards) - Added capability to produce fit-to-obs plots - Added support for Jet - Updated DA ensemble plots graphics to support when models ensemble mean and spread output is on different grids - Hot fix for new METviewer AWS host name - Hot fix for reorganizing precipitation verification input files Refs: #438 commit 6f74cacdffe22f67e99aed1ac0e75720dac27c30 Author: JianKuang-UMD <51758200+JianKuang-UMD@users.noreply.github.com> Date: Tue Sep 28 09:53:16 2021 -0400 Remove firstcyc job (#440) The need for the firstcyc job has gone away, removing redundant job. 
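The `set +x` wrapping from feature/setx2 (#474) above hides noisy blocks from the xtrace log. A sketch under stated assumptions: no modules system is available here, so the `module load` is simulated with a plain assignment:

```shell
#!/bin/bash
# Sketch: suspend xtrace around verbose commands, then resume it (#474).
set -x

set +x                        # suspend command tracing for the noisy block
loaded="intel netcdf"         # stands in for: module load intel netcdf
set -x                        # resume tracing for the rest of the script

echo "${loaded}"
```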
commit 1ca27148d776eedf96636a845feb95881cfaa138 Author: Kate Friedman Date: Thu Sep 9 11:53:11 2021 -0400 Update repository name for EMC_post component to UPP (#441) - the "EMC_post" repository was renamed to "UPP" on September 6th 2021 - update repository url in checkout script and Externals.cfg Refs: #433 commit 7233d0c46cc9d24b01bee0ef6be12d775d2523a4 Author: Kate Friedman Date: Mon Aug 23 11:22:26 2021 -0400 Add rstprod support to Orion (#421) - Update EMC_verif-global tag to verif_global_v2.2.1 - Turn on rstprod support by default on Orion - Change default DMPDIR path on Orion to new rstprod-supported GDA Refs: #347 commit 20c331dd9678834b980ccc932b6235a8266d4a88 Author: Kate Friedman Date: Thu Aug 19 13:01:07 2021 -0400 Update obsproc package versions for TAC2BUFR implementation (#423) Update obsproc package versions for TAC2BUFR implementation: - obsproc_prep v5.5.0 - obsproc_global v3.4.2 - new packages installed on WCOSS-Dell, Hera, Jet, Orion Refs: #341 commit 7f0f7400520b031e2428238a9741d4d8bfb8207a Author: Kate Friedman Date: Wed Aug 18 12:41:09 2021 -0400 Update vrfy/metp jobs to use jobid in their DATAROOT folders (#414) Update vrfy/metp jobs to use jobid in their respective DATAROOT folder names; fixes race condition between vrfy and metp jobs Refs: #401 commit 9233d965cd19a94ee649b4ee8c117bb587b78923 Author: Kate Friedman Date: Tue Aug 17 12:48:12 2021 -0400 Update workflow_utils build modules and remove ncio module hack (#412) Add ncio/1.0.0 module load to workflow_utils and remove hack that builds this library inline. 
Refs: #407 commit df26e953792913669698ba64b414b3be5184f43d Author: Kate Friedman Date: Tue Aug 17 11:15:38 2021 -0400 UFS_UTILS tag update - gdas_init support on Jet and HPSS path update for GFSv16 real-time parallel (#410) * Update UFS_UTILS tag to ufs_utils_1_6_0 * Update HPSS path for real-time GFSv16 pre-implementation parallels Refs: #400 commit 34427f560c729ee3b7cc91ff357b3ad908a7486a Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Mon Aug 9 15:47:27 2021 -0400 Free-forecast mode support on Jet (#394) Support for free-forecast mode on Jet is added. Setup support for multiple partitions is included (xjet, kjet, sjet, vjet), but only xjet (default) is tested/recommended. Verification pieces and cycled mode are not yet supported. commit 8c777ad04bce0555bed9f6666b808515b0a3766d Merge: ce66c74bb fc48af56d Author: Kate Friedman Date: Tue Jul 20 15:53:45 2021 -0400 Merge pull request #370 from BrianCurtis-NOAA/feature/rocoto-py3 Convert setup scripts to python3 commit fc48af56d3a0ccbd3e142d22452b9b5e34173b7b Author: Brian Curtis Date: Mon Jul 19 20:48:05 2021 +0000 Missed a set of raw_input(), in py3 they are just input() commit 44dc907a2ea3b8a29334b2416183ca8775ff4c82 Author: Brian Curtis Date: Mon Jul 19 19:59:54 2021 +0000 revert rocoto_viewer as it's being worked on elsewhere commit 37f25384c29f6697be5c851868ce98d8b6c31207 Author: Brian Curtis Date: Mon Jul 19 19:27:32 2021 +0000 Fixes from testing commit c0716f41f2446fec9a441b285d9033f73345c5d4 Author: Brian Curtis Date: Mon Jul 19 10:28:06 2021 -0400 Merge w/upstream and update those changes to py3 commit cf8758f84418c236fb45913912491a1c27b2829f Merge: f9013decd ce66c74bb Author: Brian Curtis Date: Mon Jul 19 09:21:25 2021 -0400 fix merge conflict in workflow_fcstonly commit ce66c74bbc43e997bae66a2c2035f2412d3f602b Merge: 96347ea52 7a166da9f Author: Kate Friedman Date: Wed Jul 14 16:18:55 2021 -0400 Merge pull request #362 from KateFriedman-NOAA/bugfix/ffhighres Correct wrong settings for high-res
warm-start free-forecast commit 7a166da9f20a59729be39fff953bfe367987f046 Author: Kate.Friedman Date: Wed Jul 14 19:19:45 2021 +0000 Reduce C768 npe_wav_gfs from ops 440 to dev 140 Refs: #353 commit b34c581c3b19ecdcdb4eff463e855597e5c08d80 Author: Kate.Friedman Date: Wed Jul 14 19:15:31 2021 +0000 Adjust how EXP_WARM_START is set from setup_expt_fcstonly.py step - remove default value for start (EXP_WARM_START) - if user provides start flag then use it for EXP_WARM_START - if user doesn't provide start flag then make a decision for them based on resolution - if start=None and res=768 then start=warm - if start=None and res!=768 then start=cold Refs: #353 commit 69cf53412a61e7fe0fb49e8c688bc2cbb4ded572 Author: Kate.Friedman Date: Thu Jun 24 16:38:25 2021 +0000 Set EXP_WARM_START=true in config.base via setup_expt_fcstonly.py if resolution is C768 - have free-forecast setup_expt script check if resolution is operational resolution (C768) and if so, set EXP_WARM_START=.true. - forcing EXP_WARM_START=.true.
when C768 covers most C768 scenarios regarding cold vs warm starting - the --start argument still works but new check will override if res=768 - users can still change config.base setting as needed for special scenarios Refs: #353 commit 08f62e45319c9772ef33cdb3c7aa3ec165fede09 Author: Kate.Friedman Date: Wed Jun 23 21:21:51 2021 +0000 Revert EXP_WARM_START if-block addition in config.getic - need to determine better way to set EXP_WARM_START Refs: #353 commit 1c03067ca570295b65f967467ef2329fb7999578 Author: Kate.Friedman Date: Wed Jun 23 19:33:18 2021 +0000 Revert config.base EXP_WARM_START check change Refs: #353 commit 0f1cea6662695cad9ecf7df8317a36e1ab5c57e1 Author: Kate.Friedman Date: Wed Jun 23 19:12:27 2021 +0000 Add checks for EXP_WARM_START in config.base and config.getic - add OPS_RES variable to both config.base.emc.dyn and config.getic - add if-block in both configs to force EXP_WARM_START=true if CASE=OPS_RES and gfs_ver=v16 (current ops) - check in config.base helps with later IAU checks - help catch when user doesn't set EXP_WARM_START=true when running v16 C768 Refs: #353 commit 7ed96c34f3dd6e4c4251e87d9abf62a0dbb413a2 Author: Kate.Friedman Date: Wed Jun 23 19:10:17 2021 +0000 Add cd to ROTDIR when pulling ops warm starts - pull v16 warm starts directly into ROTDIR; don't need temporary location to handle subfolder diffs - resolves issue with GDATE gdas restart files being left in EXTRACT_DIR Refs: #353 commit 06c5e3cb7bd9d82d4e51a63168465118a34e5882 Author: Kate.Friedman Date: Wed Jun 23 19:06:42 2021 +0000 Reduce C768 resource settings to fit node limits - default C768 resources on Hera were 218 nodes which is higher than the 210 node limit per job - bring C768 settings in config.fv3 down to 148 nodes (on Hera) - tested new settings in free-forecast mode on Hera Refs: #353 commit 96347ea527f7b0ab61a1aae6576e2709fb387c7c Merge: cfca8bb2c 5c042e087 Author: Kate Friedman Date: Wed Jun 23 11:13:06 2021 -0400 Merge pull request #316 from 
NOAA-EMC/feature/hpc-stack GFS components update for hpc-stack support commit 5c042e087c02cb82fe24ce666e477791f5e2417e Merge: b36414e4c cfca8bb2c Author: kate.friedman Date: Thu Jun 17 19:33:03 2021 +0000 Merge remote-tracking branch 'origin/develop' into feature/hpc-stack commit cfca8bb2ca0dc0105b905c536f346002408db771 Merge: e08f55583 fba8cef5b Author: Kate Friedman Date: Thu Jun 17 14:40:18 2021 -0400 Merge pull request #342 from KateFriedman-NOAA/issue178 Free-forecast integration with chgres_cube and resolve known mode bugs commit fba8cef5b870d46df1c665d3e9cabb23904ad3cf Author: Kate.Friedman Date: Wed Jun 16 13:57:14 2021 +0000 Remove wave restart pull in getic script Refs: #178 commit 921838feb9f6fd3fc3c69579c677ffdd7175a103 Author: Kate.Friedman Date: Wed Jun 16 13:18:35 2021 +0000 Remove old compile command from build_fv3.sh Refs: #178 commit b36414e4ce49579c5db1829fab46e1a4ef302569 Merge: 9cf615d11 e08f55583 Author: kate.friedman Date: Tue Jun 15 18:36:55 2021 +0000 Merge remote-tracking branch 'origin/develop' into feature/hpc-stack Refs: #164 commit 65ff48e049a651e74be8be121163c20957bf1c44 Author: kate.friedman Date: Thu Jun 10 18:37:24 2021 +0000 Add v16 pgb anl pull to getic script Refs: #178 commit 37e7c2e0806678c93a11bae682f48da172940fd6 Author: kate.friedman Date: Thu Jun 10 18:36:59 2021 +0000 Add OPS_RES variable to init script Refs: #178 commit db18627960d2def7bbfee30994f4be6586babe1c Author: kate.friedman Date: Thu Jun 10 18:36:10 2021 +0000 Increase init job walltime to 30 mins from 15 mins Refs: #178 commit 30aefb11dd8130f1d4a1e6eb951e1fa400fdc361 Author: kate.friedman Date: Thu Jun 10 15:43:27 2021 +0000 Update/fix pull of v14/v15 pgrb2 anl files in getic job Refs: #178 commit 18970c716d79922fa02b86165996bb47fb42d3fa Author: kate.friedman Date: Wed Jun 9 12:04:01 2021 -0500 Disconnect archive job in workflow from HPSS access check - remove dependency on HPSS access to check for adding gdas[gfs]arch job to workflow - add HPSSARCH variable to
ARCHIVE_TO_HPSS definition in cycled workflow, similar to prior addition to free-forecast script Refs: #178 commit b52d2f0b82e0e376905c64daa593362396f8da1b Author: kate.friedman Date: Tue Jun 8 13:06:30 2021 -0500 Update UFS_UTILS checkout to ufs_utils_1_4_0 tag - new ufs_utils_1_4_0 tag includes updates to support this branch - From tag release notes: - Update the GDAS initialization utility to ingest GFS v16 data. - Update to process either the GFS or GDAS CDUMPs. Refs: #178 commit 4c858c0f059abd0ff4c937c7bab692d7b13a8163 Author: kate.friedman Date: Tue Jun 8 13:05:08 2021 -0500 Adjust comments in getic job script Refs: #178 commit 898a43fb74005e969ecb5baf28ecf2d51f7c5dda Author: Kate.Friedman Date: Tue Jun 8 15:57:32 2021 +0000 Adjust getic/init job scripts - adjust EXTRACT_DIR, DATA, and ROTDIR usage in getic and init jobs - remove duplicate pgb file pull from init job Refs: #178 commit 1cdc2a44c6d29cfb8f06a50238ea29d28642c1f1 Author: Kate.Friedman Date: Tue Jun 8 15:56:46 2021 +0000 Add cmake module load to Hera block in machine-setup.sh Refs: #178 commit fe448ff29aedca6a40a748ee47ae001b1639188c Author: kate.friedman Date: Thu Jun 3 19:32:15 2021 +0000 Set DO_WAVE to NO - turning off waves by default in config.base Refs: #178 commit bfb0a33305a46d87f62c4a5c8bd7799a60362678 Author: kate.friedman Date: Thu Jun 3 13:13:08 2021 -0500 Remove RUN_CCPP option, force CCPP now - remove RUN_CCPP case option from build_all.sh - remove RUN_CCPP option in build_fv3.sh, force CCPP build now - remove pre-CCPP GFS ops tag checkout option in checkout.sh, forcing CCPP hash checkout now Refs: #178 commit 771782f109de15f9f0214bcceb852b51bb4b0e02 Author: kate.friedman Date: Thu Jun 3 13:09:42 2021 -0500 Replace Orion checks with hpssarch checks for getic job - add hpssarch variable to setup_workflow_fcstonly.py - replace Orion checks with hpssarch=YES checks Refs: #178 commit e2ac5872ca482eb335bc6294e00594dc3a806718 Author: kate.friedman Date: Wed Jun 2 14:23:35 2021 +0000 Update 
WCOSS-Dell section of machine-setup.sh - needed cmake module loaded for building - added cmake module load and prereq stack loads Refs: #178 commit f0afb91bae738413539ee2f1fdc63b79e93b3e52 Author: kate.friedman Date: Wed Jun 2 14:22:43 2021 +0000 Change BDATE to GDATE in getic.sh Refs: #178 commit 69ac287062512b929a5193ebc86e4c7fcbbb1917 Merge: e62cefb39 e08f55583 Author: Kate Friedman Date: Wed Jun 2 10:17:52 2021 -0400 Merge branch 'NOAA-EMC:develop' into issue178 commit e62cefb3943a87a90fd40234f08845534df818a1 Author: Kate.Friedman Date: Tue Jun 1 16:50:28 2021 +0000 Correct gdas operational tarball name in getic job script Refs: #178 commit 127723adfde2b7b7c79bd4d346cafcb1f265dc1c Author: Kate.Friedman Date: Tue Jun 1 16:44:37 2021 +0000 getic job updates to add OPS_RES variable and pull from operational tarballs for warm starting - add OPS_RES variable to define current operational resolution; use variable where needed now - updated getic job script to use OPS_RES variable in if-block check - updated getic job script with new operational tarballs to pull for warm starting high res runs Refs: #178 commit 499f217f92b5bb0fad9e71de221937d9973046ce Author: Kate.Friedman Date: Tue Jun 1 15:49:41 2021 +0000 Resolve bug in wavepostpnt jobs when DOBLL_WAV=NO - change DOBNDPNT_WAV=YES and CFP_MP=YES if-block to only prepare cmdtarfile for CFP when DOBLL_WAV=YES to avoid scenario where only a single command is added to cmdtarfile Refs: #178 commit e08f5558372d43d88890ca639e70ab923071361b Merge: e09a398f1 b2879fe9f Author: Kate Friedman Date: Wed May 26 12:27:40 2021 -0400 Merge pull request #327 from NOAA-EMC/operations Operations updates for GFSv16.1.1 commit 824ff7254fba04a0b3726316d300f9fa6282c937 Author: Kate.Friedman Date: Tue May 25 20:42:25 2021 +0000 Change UFS_UTILS version to feature branch for testing Refs: #178 commit 151e0b56b6e38f40449c9eb0807ffa92b6aa46f9 Author: Kate.Friedman Date: Tue May 25 20:41:32 2021 +0000 Correct if-statement syntax in config.init 
Refs: #178 commit 005d33d7b100b279e0db5290824c3e7acabfa4ba Merge: 5ec376bab e09a398f1 Author: Kate Friedman Date: Mon May 24 15:23:05 2021 -0400 Merge branch 'NOAA-EMC:develop' into issue178 commit b2879fe9f90def631047f533db2144858d827023 Merge: 68b9157bd a8ed770d9 Author: Kate Friedman Date: Mon May 24 10:07:05 2021 -0400 Merge pull request #326 from NOAA-EMC/release/gfsv16.1.1 GFSv16.1.1 commit e09a398f1cedc7252219e75bf0b319d2601bc8df Merge: e3fcfebc1 6bad810bc Author: Kate Friedman Date: Fri May 21 09:58:49 2021 -0400 Merge pull request #317 from malloryprow/feature/EMC_verif-global_upgrade EMC_verif-global v2.0.0 major release updates commit 6bad810bccdfc893fd51b8b55c60c1f732547c3a Author: Mallory Row Date: Fri May 21 13:55:13 2021 +0000 Update Externals.cfg EMC_verif-global tag to verif_global_v2.0.2 Refs: #315 commit 9cf615d11b096ea4843f4e5938a6a65f3500df49 Author: kate.friedman Date: Thu May 20 20:10:18 2021 +0000 Modulefile updates for testing with waves on - updated ww3 and base modulefiles to add modules and update some - added new: pio/2.5.2i, fms/2020.04.03 - updated esmf module to esmf/8_1_1 and nceppost/dceca26 to upp/10.0.6 Refs: #164 commit b155ff37040658703168268d4e58fe4e6b45a50d Author: Mallory Row Date: Thu May 20 15:00:04 2021 +0000 Update EMC_verif-global tag to verif_global_v2.0.2 Refs: #315 commit cae62e3f47e245e70fcf3721fb4c022ea11c059f Merge: 8def77afd e3fcfebc1 Author: kate.friedman Date: Wed May 19 19:45:05 2021 +0000 Sync merge with develop and WW3 build updates - Merge remote-tracking branch 'origin/develop' into feature/hpc-stack - WW3 build updates: revert back to separate module files for build; address modulefile reorganization in separate issue - remove CCPP option with system build, doing CCPP by default now Refs: #164 commit 5ec376bababd935e4f3b15c4754105f5bd750657 Merge: 6b6b9ed39 e3fcfebc1 Author: Kate.Friedman Date: Wed May 19 18:24:31 2021 +0000 Merge remote-tracking branch 'origin/develop' into issue178 * origin/develop: 
change fv3gfs checkout head fixed a compilation error, NEMs/exe existing, in hera build_fv3.sh with ATM fixed line 43 of link_fv3gfs.sh, build_fv3.sh v15 to v16 bugfix for not removing last point from list in wave post pnt jobs changes added in link_fv3gfs for fix_aer changed fix dir for link_fv3gfs changes on build_fv3.sh and symlinks to optical data files instead of cp Update util_shared module version to 1.3.0 for wave footer fix Revert addition of ecflow trigger for wavepostbndpntbll job update ufs-weather-model tag to GFS.v16.0.16 add ecflow updates for new wave bndpntbll job Additional WAFS tag update to gfs_wafs.v6.0.21 for GFSv16.0.8 Back out updates to add config.resources.nco.static WAFS tag update to gfs_wafs.v6.0.20 in Externals.cfg WAFS tag update (gfs_wafs.v6.0.20) for v16 post-implementation fixes fix for missing wave boundary cbull and bull files Fix cycle date in bull and cbull wave files Reverting transfer parm file changes committed at 39bab45 Component tag updates for nwprod/gfsv16.0.7 Updated transfer parm files for gdas, enkf, and gfs dissemination updated config.fcst testing merra2 workflow in hera update merra2 before a new pull request Issue #215 - GFSv15.3.5 ops updates for gempak and bufrlib versions update config.base.emc.dyn high resolution MERRA2 data used fixed a bug related to orion added merra2 input and update diag_table merra2 workflow for orion Clean up comments Updated sorc/checkout.sh to pick gfsda.v15.3.3 ( EUM bufr changes ) commit 4641c5d206a95e0e66276fa86d46ecd580213fd5 Merge: eff8a1258 e3fcfebc1 Author: Mallory Row Date: Wed May 19 14:51:19 2021 +0000 Merge branch 'develop' into feature/EMC_verif-global_upgrade commit e3fcfebc1ff9c9f9ceda7b775de0c34117de6459 Merge: 95e0dffea fa5d1e2ba Author: Kate Friedman Date: Wed May 19 10:02:19 2021 -0400 Merge pull request #254 from AnningCheng-NOAA/merra2 Addition of support for MERRA2 commit fa5d1e2ba242a08e26cb7bac8c68b700a83d1d60 Author: anning.cheng Date: Tue May 18 10:41:16 2021
-0400 change fv3gfs checkout head commit 283c4a044847bbadd0cef46e14dde4de97abe986 Author: anning.cheng Date: Mon May 17 15:56:56 2021 -0400 fixed a compilation error, NEMs/exe existing, in hera commit 6b574160f94329754677e7f4541ab07c44185bb0 Author: anning.cheng Date: Fri May 14 13:44:18 2021 +0000 build_fv3.sh with ATM commit dfc8aa9a2cf5370fff38520d2a28f3960be3b583 Merge: bf5d11063 95e0dffea Author: anning.cheng Date: Wed May 12 20:11:44 2021 +0000 Merge remote-tracking branch 'upstream/develop' into merra2 commit eff8a1258aa70c061fe69697388f8914935726ee Merge: b1ad0b963 95e0dffea Author: Mallory Row Date: Wed May 12 19:15:17 2021 +0000 Merge branch 'develop' into feature/EMC_verif-global_upgrade commit 95e0dffea435af17be9150fcd0c842763768b45f Merge: fc727f405 dd4d18790 Author: Kate Friedman Date: Wed May 12 14:51:10 2021 -0400 Merge pull request #323 from NOAA-EMC/sync/operations_v16.0.9 Bring operations branch v16.0.9 changes into develop commit dd4d1879093516a7950c9e0e59f6e44e462199f6 Merge: fc727f405 68b9157bd Author: kate.friedman Date: Mon May 10 16:40:13 2021 +0000 Merge remote-tracking branch 'origin/operations' into sync/operations_v16.0.9 * origin/operations: bugfix for not removing last point from list in wave post pnt jobs Update util_shared module version to 1.3.0 for wave footer fix Revert addition of ecflow trigger for wavepostbndpntbll job update ufs-weather-model tag to GFS.v16.0.16 add ecflow updates for new wave bndpntbll job Additional WAFS tag update to gfs_wafs.v6.0.21 for GFSv16.0.8 Back out updates to add config.resources.nco.static WAFS tag update to gfs_wafs.v6.0.20 in Externals.cfg WAFS tag update (gfs_wafs.v6.0.20) for v16 post-implementation fixes fix for missing wave boundary cbull and bull files Fix cycle date in bull and cbull wave files Reverting transfer parm file changes committed at 39bab45 Component tag updates for nwprod/gfsv16.0.7 Updated transfer parm files for gdas, enkf, and gfs dissemination Issue #215 - GFSv15.3.5 ops 
updates for gempak and bufrlib versions Clean up comments Updated sorc/checkout.sh to pick gfsda.v15.3.3 ( EUM bufr changes ) Refs: #309 commit 8def77afd4a9f1cd6cdca42c956bc7ebf675a58d Merge: c6e2a96e8 d46e8cf49 Author: Kate Friedman Date: Mon May 10 11:45:02 2021 -0400 Merge pull request #322 from JessicaMeixner-NOAA/hpcstackww3 Updates for ww3 pre and post exec builds for hpc-stack commit d46e8cf49aff0dbf72e457f1177e10e6f50252a6 Author: Jessica Meixner Date: Mon May 10 15:31:56 2021 +0000 update for ww3_pre/post build: -- remove ww3 specific module files -- updated module base files to be in sync with ufs-weather-model except that jasper is a more recent version which is needed for ww3_grib -- added extra ww3 execs needed for this branch -- added ww3prepost build to build_all and partial build commit c6e2a96e86611cce00455609d9be803d580e7bc0 Author: Kate.Friedman Date: Mon May 10 14:30:13 2021 +0000 Change fix file linking to scan for possible subfolders in set - instead of providing subfolder names, scan for subfolders in FIX_DIR path and then create symlinks based on existing subfolders in set - will not have to keep updating list of subfolders now Refs: #164 commit aa1d1e9ca1d8521ef22e1bdb654d4a70e7b1e29e Author: Kate.Friedman Date: Mon May 10 14:27:13 2021 +0000 Change CCPP_SUITE from FV3_GFS_v16beta to FV3_GFS_v16 - newer ufs-weather-model hash supports FV3_GFS_v16 suite Refs: #164 commit a8ed770d9c1e8614c5477d0129d6eaf5fe4e2085 Author: kate.friedman Date: Thu May 6 19:32:36 2021 +0000 Companion update to Externals.cfg for WAFS tag update to gfs_wafs.v6.0.22 Refs: #1,#321 commit 68d9b5b5ff072b1f62131a9f7495c33c14c4a850 Author: kate.friedman Date: Thu May 6 19:28:17 2021 +0000 WAFS tag update to gfs_wafs.v6.0.22 for bug fix Refs: #1,#321 commit 68b9157bd59278dc07369d316999d3c8485ebf7c Merge: 1f5af6283 97fe99096 Author: Kate Friedman Date: Wed May 5 10:50:42 2021 -0400 Merge pull request #306 from NOAA-EMC/release/gfsv16.0.0_to_ops GFSv16.0.9 commit
1aea3ee393c99d4f76556fbbdbebc43f629558c2 Author: kate.friedman Date: Wed May 5 14:30:06 2021 +0000 Add build script and modulefiles for WW3 build - WW3 execs no longer built by FV3 build, need separate build now - add new build_ww3prepost.sh script for building WW3 execs in ufs-weather-model checkout - add modulefiles for WCOSS-Dell, Hera, and Orion Refs: #164 commit 214431a2c04f6e685e966810621fc2fae771d2f5 Author: kate.friedman Date: Wed May 5 14:26:00 2021 +0000 Update WAFS execs to include .x - update WAFS exec names in link_fv3gfs.sh to include ".x" extension that was added with new version Refs: #164 commit eb1613764794a35d165e55423682ed592b2aaacb Author: kate.friedman Date: Wed May 5 14:22:06 2021 +0000 ufs-weather-model hash update for WW3 support with hpc-stack - update ufs-weather-model hash to 554aedcd63e4a7c5012570406132eaf76e249ca9 - update build_fv3.sh for new compile arguments for CCPP/WW3 build Refs: #164 commit b1ad0b96328a54112ecc13ff4b71dcc08556a8c1 Author: Mallory Row Date: Wed May 5 14:13:58 2021 +0000 Update EMC_verif-global tag to verif_global_v2.0.1 Refs: #315 commit c206e8d2606f14ee6188740f881673ff512b866b Author: Mallory Row Date: Mon May 3 19:16:42 2021 +0000 Fix typo in config.metp There was a typo in the grid-to-obs METviewer related config variables. The variables started with "g2g1" instead of "g2o1". Refs: #315 commit f9013decd5af75f66eda619175bacc7549f4746a Author: Brian Curtis Date: Mon May 3 14:37:13 2021 -0400 Files to test for python3 compatibility commit 0b9607711900e0da0de3d3dc1af91cc4dc9dd04d Author: catherine.thomas Date: Mon May 3 12:31:01 2021 -0400 Update v16.1 release notes with wave script fix Refs: #301 commit fda26e9010ef59f69cd378c3092bdc82e57e4ec2 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon May 3 12:08:12 2021 -0400 Update config.base.nco.static Change HOMEobsproc_prep and HOMEobsproc_network to point at NCO installation of these packages. 
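The fix file linking change above (commit c6e2a96e8: scan FIX_DIR for whatever subfolders exist instead of maintaining a hard-coded list) could be sketched roughly as follows; the function name and paths are illustrative assumptions, not the actual link_fv3gfs.sh code:

```shell
# Hypothetical sketch of scan-and-symlink fix linking; not the real
# link_fv3gfs.sh. Subfolders of the fix set are discovered at run time
# and symlinked one by one, so a new subfolder needs no script change.
link_fix_subfolders() {
    fix_dir=$1   # source fix set, e.g. the developmental v17+ fix dir
    target=$2    # destination directory for the symlinks
    mkdir -p "${target}"
    for dir in "${fix_dir}"/*/; do
        [ -d "${dir}" ] || continue
        # -sfn replaces any stale link left behind by a previous fix set
        ln -sfn "${dir%/}" "${target}/$(basename "${dir}")"
    done
}
```

With this approach, a new subfolder dropped into the fix set is picked up automatically the next time linking runs.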
commit f9a6dd71ef7d580d6b83f50eb53814a06a944577 Merge: 173bb4814 97fe99096 Author: catherine.thomas Date: Mon May 3 11:55:09 2021 -0400 Merge branch 'release/gfsv16.0.0_to_ops' into release/gfsv16.1 Include wave fixes in v16.1 Refs: #301 commit 173bb4814ed2d2bc6e3436263f55c199f8dc8721 Author: catherine.thomas Date: Mon May 3 11:52:42 2021 -0400 Update v16.1 release notes for wave fixes Refs: #301 commit 858f216330d7ab71137afe6331f6576fec413aee Author: kate.friedman Date: Mon May 3 15:19:56 2021 +0000 Update EMC_post, UFS_UTILS, EMC_gfs_wafs versions in Externals.cfg Refs: #164 commit 6ce83363726ece6dbd34719430dd44f5e63bbc3c Author: kate.friedman Date: Mon May 3 15:14:54 2021 +0000 Component updates for UPP, WAFS, UFS_UTILS - new tag in checkout for UFS_UTILS to use hpc/1.1.0 - new hash in checkout for UPP/EMC_post to support GTG and GTG code copy fix - new hash in checkout for WAFS/EMC_gfs_wafs to support hpc-stack - build script update for WAFS (cmake/stack) - WAFS GCIP config update to fix runtime error Refs: #164 commit 709095709ca60498d05ea0f519b494501e1ecdf2 Author: kate.friedman Date: Mon May 3 15:11:51 2021 +0000 Update util_shared to 1.3.0 on WCOSS-Dell - version changed in production for v16.0.9 footer fix Refs: #164 commit bf5d110632fc67653441e6830bfdcdddda906706 Author: anning.cheng Date: Mon May 3 10:53:37 2021 -0400 fixed line 43 of link_fv3gfs.sh, build_fv3.sh v15 to v16 commit 97fe990968955486fe2b7f3035c9d9423cef5e3e Merge: c582db288 f0fd3d410 Author: Kate Friedman Date: Mon May 3 09:15:15 2021 -0400 Merge pull request #318 from JessicaMeixner-NOAA/v16/bugforpnt Bug fix for missing buoy in wave post point commit f0fd3d4108debaa61f000ac700fd90904f1b600b Author: jessica.meixner Date: Fri Apr 30 17:56:34 2021 +0000 bugfix for not removing last point from list in wave post pnt jobs commit fb47a5a0f7391f670461b00eaf324c3b0102ef1f Author: kate.friedman Date: Wed Apr 28 17:50:15 2021 +0000 Update TC_tracker version in config.vrfy - new TC_tracker.v1.1.15.4 
was made available by Jiayi Peng - new version supports hpc-stack on supported platforms - installed locally on WCOSS-Dells, Hera, and Orion Refs: #164 commit 587ac0c914430d5b887327d7c566c452d2eace62 Author: anning.cheng Date: Wed Apr 28 10:36:49 2021 -0400 changes added in link_fv3gfs for fix_aer commit 48690cd07ceeb90c6808346a04a31a6166505c86 Author: Mallory Row Date: Wed Apr 28 14:00:46 2021 +0000 Update EMC_verif-global tag to verif_global_v2.0.0 commit 1d0e0c29979101b451f1fda265af3f9a716bb9de Author: kate.friedman Date: Tue Apr 27 19:33:09 2021 +0000 Reverting resource related changes - will commit resource adjustments separately Refs: #164 commit 101905ce4d92a544d353700b0acdc137aa61b088 Author: anning.cheng Date: Tue Apr 27 14:15:59 2021 -0400 changed fix dir for link_fv3gfs commit 3271bac5b2072bfa2c0c0d229f050b647cd28925 Author: catherine.thomas Date: Tue Apr 27 10:08:54 2021 -0400 Update Externals.cfg and release notes for v16.1 Updated Externals.cfg GSI tag to gfsda.v16.1.0. Added sections in the release notes to explicitly say no changes to parm/config and scripts. Refs: #301 commit 31389bc23b89b1713712da17c8f226b649286b40 Merge: b1e924cd4 c582db288 Author: catherine.thomas Date: Tue Apr 27 08:31:41 2021 -0400 Merge branch 'release/gfsv16.0.0_to_ops' into release/gfsv16.1 This update brings in the wave footer fix for GFSv16.0.9. Refs: #301 commit b1e924cd44eba056443cb2a6789b1dc8040a02e1 Author: catherine.thomas Date: Tue Apr 27 07:31:44 2021 -0400 Change GSI tag to gfsda.v16.1.0 The GSI tag has been updated for the v16.1 implementation. This implementation turns on commercial RO data from GeoOptics. See release notes docs/Release_Notes.gfs.v16.1.0.txt in the global-workflow and release notes doc/Release_Notes.gfsda.v16.1.0.txt in the GSI for details.
Refs: #301 commit 3142a8219c74b7b33a2bee1aed4ed7acd34ed0dc Author: catherine.thomas Date: Mon Apr 26 16:59:43 2021 -0400 Add Release Notes for GFSv16.1 Added the file docs/Release_Notes.gfs.v16.1.0.txt to release branch release/gfsv16.1. Refs: #301 commit 7cd6e0e3000a47b39fa0c7e75ad66150e96385d8 Author: Mallory Row Date: Fri Apr 23 14:44:03 2021 +0000 Fix setting of 'SDATE_GFS' for free forecast XML In the workflow XML for free forecasts there is no SDATE_GFS variable. The 'SDATE_GFS' variable was changed to be set to the value of 'SDATE'. Refs: #315 commit d34b99d4e3d369b784fd9d2c639a0bddd1bbfc0c Author: Mallory Row Date: Fri Apr 23 12:59:51 2021 +0000 Clean up and add new settings in metp.sh SDATE_GFS was added into the script preamble following its addition as a rocoto exported environment variable for metp tasks. Config.vrfy was removed from the list of relevant configs to source. Variables CDATEm1 and PDYm1 were removed and replaced with VDATE. VDATE is the verification date to be processed by the metp tasks, based on VRFYBACK_HRS set in config.metp. Lines were added to catch the exit status of VERIFGLOBAL_SH and to exit if not successfully run so it can be caught by rocoto. Refs: #315 commit 998f1a454427dedd46bb20b36e381bedbdf3e349 Author: Mallory Row Date: Fri Apr 23 12:49:38 2021 +0000 Revamp of config.metp Following the EMC_verif-global configuration remake, config.metp has been revamped to support the new configuration settings. Refs: #315 commit 6a01ae405202b8984fe8efa73573d7fdba2b74d2 Author: Mallory Row Date: Thu Apr 22 17:17:14 2021 +0000 Add 'SDATE_GFS' rocoto env var for metp tasks The connector script between EMC_verif-global and global-workflow run_verif_global_in_global_workflow.sh (housed in EMC_verif-global) contains a new feature that checks whether the initialization date of the maximum forecast hour for verification is equal to or after the first GFS cycle initialization.
If the date is before, the maximum forecast hour is adjusted to the maximum hour relative to the forecast verification date and the first GFS cycle initialization. This feature was added to cut down on METplus calls being done on data that don't exist. Refs: #315 commit dfec5816add3c56721360a418ec9f3459cfd4d8c Author: Kate.Friedman Date: Thu Apr 22 15:27:56 2021 +0000 Update UFS_UTILS hash to 9d9dd23e37a51fcc5ff6b7499b834c85ab32e5f3 Refs: #164 commit be564bb85eeefbb1699992e0ed4f0705e4e2ee24 Author: anning.cheng Date: Thu Apr 22 11:22:02 2021 -0400 changes on build_fv3.sh and symlinks to optical data files instead of cp commit 15c6619c6e211eb67ca6aa5c995e4a3bf4cf586f Author: Kate.Friedman Date: Thu Apr 22 14:03:46 2021 +0000 Adjust C768 fcst thread value for reduced node usage - change C768 nth_fv3_gfs value from 7 to 2 Refs: #164 commit ece41632a76e697184a46bb2a428c35ccdf08305 Author: kate.friedman Date: Thu Apr 22 13:58:22 2021 +0000 Add separate gfs resource variables to fcst in config.resources Refs: #164 commit 00c18b5e341d601fd83da2f9df2ac928e8dcf7c8 Author: Kate.Friedman Date: Thu Apr 22 13:44:13 2021 +0000 Update FIX_DIR paths in link_fv3gfs.sh - change platform FIX_DIR paths from fix_nco_gfsv16 set to developmental fix set for v17+ Refs: #164 commit b6c392c7569ed909d6b0b27c0f6a4f828b7acdfd Author: Kate.Friedman Date: Thu Apr 22 13:40:25 2021 +0000 Set DO_WAVE=NO in config.base - set default DO_WAVE setting to NO in config.base.emc.dyn - turn off waves until hpc-stack for WW3 execs build is ready Refs: #164 commit 2b703d335b8e3e97c9764bd60d87188ed11938fd Author: Kate.Friedman Date: Thu Apr 22 13:17:28 2021 +0000 Set WAFS checkout to be optional with -o flag - add checkout_wafs to -o flag settings Refs: #164 commit 305bf5e0fb3c665cf71f10023feed1a121be329b Author: Kate.Friedman Date: Thu Apr 22 13:01:04 2021 +0000 Update Externals.cfg to match checkout.sh versions Refs: #164 commit 3cc02d507df48e2b9f99163d737123e5b5c716c0 Author: Kate.Friedman Date: Thu Apr 22 
12:52:46 2021 +0000 Update GLDAS and UFS_UTILS checkouts - update GLDAS checkout to gldas_gfsv16_release.v1.15.0 tag; corrects build on Orion - update UFS_UTILS checkout to 4f44bf89 hash of develop (supports stack) Refs: #164 commit 9d733f87203ebd2298d22c0ac9491520bae60983 Author: kate.friedman Date: Thu Apr 22 12:15:13 2021 +0000 Add support for separate gfs thread values - nth_fv3_gfs was added to configs, add support to setup scripts for resource calculations Refs: #164 commit 4c64dce093fb3925b40b2a3f0b57fd196b14d92c Author: kate.friedman Date: Thu Apr 22 12:14:21 2021 +0000 Reduce npe_wav_gfs from 440 to 140 for dev users Refs: #164 commit 35c16e3aafc3addd3d24225880b5319332cca97a Author: kate.friedman Date: Thu Apr 22 12:12:16 2021 +0000 Add wgrib2 module load to module_base.wcoss_dell_p3 - add wgrib2/2.0.8 module load - add setting of WGRIB2 to equal wgrib2 Refs: #164 commit c582db288999b316d8cae7686699ce8790622bc7 Author: kate.friedman Date: Wed Apr 21 14:55:00 2021 +0000 Update util_shared module version to 1.3.0 for wave footer fix - wave footer fix comes via a new util_shared module version on WCOSS - update module version to 1.3.0 in module_base.wcoss_dell_p3 and gfs.ver Refs: #1, #309 commit 918c450d7e1e888bfb2ef604e91bd6e85db4e55e Merge: 88c54d9c0 072a31ae9 Author: catherine.thomas Date: Fri Apr 16 11:30:13 2021 -0400 Merge release/gfsv16.0.0_to_ops into release/gfsv16.1 Syncing release branch for v16.1 with the soon-to-be operations branch. This update includes GFSv16.0.8 updates for WAFS and waves.
Refs: #301 commit 88c54d9c0b1546473d980b5361e3187f9de4a14e Author: russ.treadon Date: Thu Apr 15 10:58:33 2021 +0000 Issue #301: update file/directory retention/cleanup and DA build option * modify arch.sh and earc.sh file/directory retention and clean up * add pruned NCO install option to build_gsi.sh commit 072a31ae93f8b5ee3b3eaeab42bb1bcc764fbf60 Author: kate.friedman Date: Tue Apr 13 16:54:27 2021 +0000 Revert addition of ecflow trigger for wavepostbndpntbll job - Remove wavepostbndpntbll trigger from prod def file for each cycle Refs: #1 commit 6b6b9ed3949c2e3ca3d697ffd3bb3a65ec728744 Author: Kate.Friedman Date: Mon Apr 12 17:13:06 2021 +0000 getic and init job updates for v16 in ops - update GFSv16 implementation cycle in config if-blocks to be 2021032100 - add note about missing ops data on HPSS for 2021032106 cycle - add run_v16.chgres.sh as default RUNICSH script Refs: #178 commit 48ea7abd9fe4cfd0e774c92383948c9bb477a905 Author: Kate.Friedman Date: Mon Apr 12 17:10:59 2021 +0000 Low resolution resource adjustments - C96, C192, and C384 config.fv3 resource updates for successful jobs - provided by Jun Wang and Fanglin Yang - tested on Hera Refs: #178 commit b55ecf8e5ae761ddc8e3897411aceb4e0529e106 Author: kate.friedman Date: Tue Apr 6 15:34:05 2021 +0000 Remove Tide/Gyre reference from machine-setup.sh Refs: #164 commit ab488f1aeb383d84a67b37e47a888d58662cf82f Author: kate.friedman Date: Tue Apr 6 15:33:11 2021 +0000 Update OznMonBuild, RadMonBuild and module_base for hpc-stack on WCOSS-Dell Refs: #164 commit 792b91045c1b72d0439576c0ed38c64cfec5c08b Author: russ.treadon Date: Thu Apr 1 15:18:12 2021 +0000 Issue #301: modify earc.sh to retain enkfgdas files needed by metplus commit c9087cfc00f61a30ece07d13a879ad0dc8ee95cb Merge: b0b01c6cd c145448eb Author: Kate Friedman Date: Wed Mar 31 23:02:31 2021 -0400 Merge pull request #310 from JessicaMeixner-NOAA/v16/wavemissingpoints Wave fixes for missing bull and cbull points and Arctic grid grib update commit 
c145448ebe7f08f110121932b84dc279b565fe79 Author: jessica.meixner Date: Thu Apr 1 01:50:47 2021 +0000 update ufs-weather-model tag to GFS.v16.0.16 commit d804efe8aa42ba6bb468c80a58079d6ec25398d8 Merge: b3eec5985 b0b01c6cd Author: jessica.meixner Date: Wed Mar 31 18:44:34 2021 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0_to_ops' into wavemissingpoints commit b3eec5985b35216eafaf1c5358fd05ba9717c687 Author: jessica.meixner Date: Wed Mar 31 18:43:46 2021 +0000 add ecflow updates for new wave bndpntbll job commit b0b01c6cd102be18f06c45c2d59667b64a5ef22c Merge: 06e9f0835 1d6893b6a Author: Kate.Friedman Date: Tue Mar 30 16:12:45 2021 +0000 Additional WAFS tag update to gfs_wafs.v6.0.21 for GFSv16.0.8 Refs: #1 commit 1d6893b6a700df8228ff14da904b72450152abae Author: Kate.Friedman Date: Tue Mar 30 16:11:44 2021 +0000 Additional WAFS tag update to gfs_wafs.v6.0.21 for GFSv16.0.8 Refs: #1 commit 06e9f08359e2e62c11459d2dc8e8fcba5685eb2e Author: Kate.Friedman Date: Tue Mar 30 15:30:35 2021 +0000 Back out updates to add config.resources.nco.static Will retain in other release branch. Removing to keep continuity with nwprod for GFSv16 post-implementation fix updates. Refs: #1 commit 188941144b3af82e2697acb73df5171088126d9c Merge: 9a3fcf3c2 c77a449cd Author: Kate.Friedman Date: Tue Mar 30 15:27:37 2021 +0000 WAFS tag update to gfs_wafs.v6.0.20 in Externals.cfg Refs: #1 commit c77a449cd537b60caf813e309dd0567daf788665 Author: Kate.Friedman Date: Tue Mar 30 15:26:39 2021 +0000 WAFS tag update to gfs_wafs.v6.0.20 in Externals.cfg Refs: #1 commit 9a3fcf3c24137a0a72b8f46dba0428a8a60cef47 Merge: 54cb17c6e 7f7fa4c05 Author: Kate.Friedman Date: Tue Mar 30 13:28:25 2021 +0000 WAFS tag update (gfs_wafs.v6.0.20) for v16 post-implementation fixes Refs: #1 commit 7f7fa4c051eb8acaa39d11e851ec6405727eaab6 Author: Kate.Friedman Date: Tue Mar 30 13:22:30 2021 +0000 WAFS tag update (gfs_wafs.v6.0.20) for v16 post-implementation fixes Tag updates: 1. 
Change WAFS 1.25 products (except for icing & turbulence) on model pressure levels. 2. Add CAT and MWT back to 0.25 WAFS product Refs: #1 commit 913b103bf57002780016896d930754f7104137fa Author: jessica.meixner Date: Tue Mar 30 13:10:50 2021 +0000 fix for missing wave boundary cbull and bull files This adds an extra job, JGLOBAL_WAVE_POST_BNDPNTBLL, and changes the point scripts so that a single script handles all point files, with the differences carried in the exported variables of the three JJOB scripts commit 54cb17c6efce54f541ff08a69ef0a6c0497fb595 Merge: de7b54eaf fb6a6e3bd Author: kate.friedman Date: Thu Mar 25 14:47:46 2021 +0000 Post-implementation fix in ush/wave_outp_spec.sh Merge remote-tracking branch 'origin/release/gfsv16.0.0' into release/gfsv16.0.0_to_ops * origin/release/gfsv16.0.0: Fix cycle date in bull and cbull wave files Refs: #1 commit fb6a6e3bd573d7d1c1a0e89c742acb14ca8f1cda Merge: 27e91128f bda505e25 Author: Kate Friedman Date: Thu Mar 25 10:40:25 2021 -0400 Merge pull request #307 from JessicaMeixner-NOAA/bug/v16wavecycledate Fix cycle date for wave bulletins commit bda505e2509bfb3ec5077c32273ecafb9eb2f91e Author: jessica.meixner Date: Thu Mar 25 14:34:14 2021 +0000 Fix cycle date in bull and cbull wave files commit de7b54eaf3fd18c5bf0dc9327fd82939757f3489 Merge: 1f5af6283 27e91128f Author: kate.friedman Date: Tue Mar 23 15:31:54 2021 +0000 GFSv16.0.7 release package for operations branch Merge remote-tracking branch 'origin/release/gfsv16.0.0' into release/gfsv16.0.0_to_ops * origin/release/gfsv16.0.0: (1073 commits) Reverting transfer parm file changes committed at 39bab45 Component tag updates for nwprod/gfsv16.0.7 Updated transfer parm files for gdas, enkf, and gfs dissemination Add config.resources.nco.static ecflow forecast job resource updates from NCO v16.0.7 install Add missing symlinks for WAFS source code folders Update EMC_gfs_wafs tag to gfs_wafs.v6.0.19 Remove KEEPDATA from config.base.nco.static Pull in config changes from NCO v16.0.7
install Pull in workflow changes from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.4 ecFlow resource adjustments from NCO for forecast and post jobs issue #257 shorten run time on Mars Dell1 file system Update Fit2Obs tag to newm.1.3 for bugfix Update EMC_verif-global tag for Hera bug Pull in nwpara/gfsv16.0.6 updates for parse-storm-type.pl Issue #1 and issue #238 - update Externals.cfg to match checkout.sh updates for v16.0.6 move errchk definition from script to job for wave prdgen and gempak adding definition of errchk which was undefined in these scripts Issue #1 and issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 Issue #1 and issue #241 - update EMC_verif-global tag Issue #1 and issue #238 - remove ak_10m grid from config.wave and update checkout.sh tags for ufs-weather-model and EMC_gfs_wafs Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 in Externals.cfg Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 Issue #1 and issue #226 - update exgfs_atmos_grib2_special_npoess.sh for dbn_alert issue #227 reducing output bufr files to 64 levels per NCO request issue #227 reducing output bufr files to 64 levels per NCO request Issue #1: update ecflow to be consistent with NCO's gfs.v16.0.4 and update checkout to bring in new UPP tag (see issue #226) Updated ush script scale_dec.sh Updated scripts gfs_v16.0 Issue #1 - add grib_util module load to several analysis ecflow scripts Issue #1 - update WAFS tag to gfs_wafs.v6.0.17 for dbn_alert change Issue #1 - correct gridded wave parm files for v16.0.3 Issue #1 - changes from NCO for GFSv16.0.3 Issue #1 - update Externals.cfg with final tags for GFSv16.0.2 Issue #1 - correct permissions on jobs/rocoto/postsnd.sh Issue #1 - update gempak version to 7.3.3 in the ecflow gfs.ver file Issue #1 - update gempak and dumpjb versions to 7.3.3 and 5.1.0 respectively Script alert updates from NCO for wave downstream Add override for COMIN_WAV_RTOFS in emc
mode for waveprep job Issue #197: place CDATE specific sections of config.anal and config.prep inside RUN_ENVIR=emc blocks. These sections are used for retrospective parallels and therefore do not need to be executed in operations (NCO). modified: JGLOBAL_FORECAST modified: JGLOBAL_FORECAST to make it work for both emc and nco running environments. modified: JGLOBAL_FORECAST A test showed that jobid is not defined in JGLOBAL_FORECAST running in the Rocoto environment. jobid is defined in ./env files. Defining DATA without sourcing ./env/$machine.env caused the script to fail. Move the definition of DATA after sourcing env parameters modified: checkout.sh to use WAFS tag gfs_wafs.v6.0.16 modified: checkout.sh to update UPP to upp_gfsv16_release.v1.1.1, a minor syntax bug fix modified: link_fv3gfs.sh to 1) use hard copies of external fix fields and executable for NCO installation 2) use soft links for all other files and directories for both NCO and EMC installations Compared local files in NCO implementation directory with release/gfs.v16.0.0 branch, changes made by NCO (Jen Yang) in the following files are either accepted or rejected. create a new branch release/gfsv16.0.0.nco to merge changes made by NCO in /gpfs/dell1/nco/ops/nwpara/gfs-v16/gfs.v16.0.1 back to EMC's repository update for the wave parm so that the wave model will look for the correct restart for when gfs is not run every cycle updates for checking if RTOFS files exist and only processing RTOFS files for needed fhr Issue #1 - update WAFS tag to gfs_wafs.v6.0.14 and update dumpjb version to 5.1.0 modified: jobs/JGFS_ATMOS_POSTSND and jobs/rocoto/postsnd.sh to remove redundant variables in the two scripts and make them work for both EMC and NCO parallels.
Issue #1 - update WAFS tag to gfs_wafs.v6.0.13 Rename Release_Notes.gfs.v16.0.0.txt to Release_Notes.gfs.v16.0.0.md Issue #1 - update WAFS tag to gfs_wafs.v6.0.12 for removal of in-cloud turbulence per AWC Issue #1 - pull in corrected npe_eobs values in config.resources Issue #1 - update FV3 tag to GFS.v16.0.14 for Hera/Orion build support Issue #1 - adjust WAFS dependencies to wait for f036 post output Issue #1 - adding release notes for GFSv16 Issue #94 producing awips files with masks and deleting wmo headers for arctic ocean updates to add glo_30m to the created grib files for waves for awips processing Modify gfs/gdas post job wall clock to 20 minutes. Issue #1 - update link_fv3gfs.sh to point to newly frozen fix_nco_gfsv16 FIX_DIR Modify two wafs jobs trigger as: jgfs_atmos_wafs_grib2 trigger ../../post/jgfs_atmos_post_f000 == complete jgfs_atmos_wafs_grib2_0p25 trigger ../../post/jgfs_atmos_post_f036 == complete Issue #1 - update WAFS tag to gfs_wafs.v6.0.10 and change WAFS job dependencies Issue #1 - update config.fv3 and config.resources with v16rt2 values Issue #1 - update gfs_util modulefiles Change config.resources for eobs for low resolution cases Issue #1 - update to fbwndgfs modulefiles for WCOSS-Dell and WCOSS-Cray updates to parm to reduce the number of wave variables changes to the config so that wave models are interpolated to the multi_1 masked files for the regional output grids Modify module for each job to match implementation package change Modify two wafs jobs trigger Modify wall clock and resource for running jobs in NCO Modify obsproc package location add a dependency for the wavepostpnt on wavepostbndpnt for just gfs as this job does not exist for gdas Issue #1 - update WAFS tag to gfs_wafs.v6.0.9 Issue #1 - remove POE/BACK block from config.prep and set POE=YES/BACK=off as defaults in env/WCOSS_DELL_P3.env prep section Issue #1 - remove unneeded DMPDIR and ICSDIR from config.base.nco.static for rocoto add a dependency to wavepostpnt 
job on wavepostbndpnt so that both jobs will not run at the same time, which would slow both jobs down. This is the reason for the dependency, otherwise there is not a "true" dependency between the jobs Issue #1 - return POE=YES and BACK=off setting for prep on WCOSS_DELL_P3 Issue #1 - remove hardcoded POE and BACK values from config.prep Issue #1 - move ABIBF, AHIBF, and HDOB pointers into RUN_ENVIR=emc block Issue #1 - update config.fv3 based on real-time parallel Issue #1 - update prep job resources Issue #1 - update g2tmpl module load in modulefiles/module_base.wcoss_dell_p3 Issue #1 - config updates from real-time parallel Issue #1 - remove unneeded line in vrfy.sh and update link_fv3gfs.sh for UFS_UTILS execs Issue #1 - update GSI tag to gfsda.v16.0.0 Issue #1 - update component tags and modulefiles for nwtest lib updates, remove unneeded module load and modulefile from downstream wave job rocoto scripts Issue #1 - adjust error handling in wave rocoto job scripts issue #142 generate station i,j grid issue #142 generate station i,j grid issue #142 generate station i,j grid issue #142 add 6 bufr station data issue #142 add 6 bufr stations for Thailand TMD issue #145 change dev path to prod for parallel netcdf modules issue #145 change dev path to prod for parallel netcdf modules Issue #1 - updates for modules and small fixes reverting changes to configs that were not intended to be committed fix resource time estimates bug fix in exgfs_wave_post_pnt.sh update resources and trigger from 192->180 last of EE2 changes updates for EE2 from waves Issue #94 fix for failing silently Issue #1 - update WAFS tag to gfs_wafs.v6.0.8 Issue #94 add native grids as default grids Issue #1: update name of ncdiag executable and source code directory to be consistent cleaning up the rearranged scripts Jobs were tested with PDY 20200925, code managers from post, gempak, wave, and post process certified the test run result. 
Issue #1: update parm/config.vrfy to define VSDBJOBSH (used by jobs/rocoto/vrfy.sh) Code manager indicated all wafs jobs wall clock is 30 mins. Code manager indicated job card for scripts/gfs/atmos/gempak/jgfs_atmos_pgrb2_spec_gempak.ecf needs to be changed The EMC realtime parallel does not use operational job settings. Ecflow job cards roll back the settings from the module_used_gfs-16_job Google Sheet document. Update GLDAS tag to gldas_gfsv16_release.v1.10.0 Update gfswafs job to run with loop over fcsthrs Issue #1: update vrfy.sh to submit vsdb processing as separate job (only on WCOSS_DELL_P3) Issue #94 add /fakedbn to run DBN_alerts Update config.awips for newly named JJOB scripts Update WAFS jobs/rocoto scripts to use new JJOB names Making J-Job naming change according to code manager. Remove temp files Update post.sh UPP JJOB script name to submit Modify each ecflow script with old j-job name for test. Update config.base.nco.static with config.base.emc.dyn changes Script name updates for sfc_prep and tracker Name change for tropcy scripts and update WAFS tag Fixing spelling mistake in config.gldas Updated drivers and release notes Updated scripts Update Externals.cfg with new UPP tag upp_gfsv16_release.v1.0.16. 1) Update sorc/checkout.sh with new UPP tag upp_gfsv16_release.v1.0.16. 2) Update sorc/link_fv3gfs.sh with new file name convention for jjob and ex-script of post processing part. Updated job names Update EMC_verif-global tag to verif_global_v1.11.0 Rename scripts to match ecf script naming convention. Add SENDDBN and DBNROOT. Update GLDAS tag. 
In anticipation of changes from the GLDAS repo: renaming JGDAS_GLDAS to JGDAS_ATMOS_GLDAS, and exgdas_gldas.sh to exgdas_atmos_gldas.sh renamed: jobs/JGFS_POSTSND -> jobs/JGFS_ATMOS_POSTSND renamed: scripts/exgfs_postsnd.sh -> scripts/exgfs_atmos_postsnd.sh modified: docs/archive/README_bufr driver/product/run_postsnd.sh driver/product/run_postsnd.sh.cray driver/product/run_postsnd.sh.dell driver/product/run_postsnd.sh.hera driver/product/run_postsnd.sh.jet parm/config/config.postsnd renamed: scripts/exglobal_fcst_nemsfv3gfs.sh -> scripts/exglobal_forecast.sh and modified jobs/JGLOBAL_FORECAST parm/config/config.fcst updates for optimizing point jobs Issue #1 - update SEND variables and add DBNROOT to base configs and add check to build_enkf_chgres_recenter_nc.sh for GSI build Issue #1 - update to UFS_UTILS ops-gfsv16.0.0 tag Issue #1 - fix to link_fv3gfs.sh for new GLDAS tag Issue #131 Unify dbn_alert path Issue #1: update to UPP tag "upp_gfsv16_release.v1.0.15" Fix for running prep on Hera ecflow full day cycle included Update WAFS tag to gfs_wafs.v6.0.6 Small updates: - new UPP tag - new GLDAS tag - new WAFS tag - new module for WAFS - EE2 updates to awips scripts - added WAFS to archival - break downstream and WAFS archival into separate gfs_downstream tarball - update gfsarch dependencies to wait for all wavepost jobs to complete Issue #131 reduce scripts output to logfile Issue #131 added a path to DBNROOT Adding wafs wave and downstream jobs updates for by hour post Issue #1: add fhrgrp and fhrlst back to gfsawips in setup_workflow.py (bugfix) adding the line to go back a day for RTOFS for the if not NCO section because RTOFS will not be available until 06 cycle Restructured ecflow - up to post step deleted relocate_mv_nvortex.fd since storm relocation is no longer needed. 
modified build_tropcy_NEMS.sh to remove references to relocate_mv_nvortex Issue #1: correct DA typos in sorc/link_fv3gfs.sh Increase walltime for new wavepost jobs Issue #1: update name of DA jobs and scripts in accordance with WCOSS Implementation Standards Issue #94 add waves-prdgen, ICE->ICEC, Sleep in gempak script add gfs gempak downstream jobs into def file worked on wcoss ecflow script rename after redesign approved - not including all wave jobs Issue #1: clean up DA sections of link_fv3gfs.sh fix from Bhavani for having first wave grib file be set as a forecast instead of analysis fix from Bhavani for having first wave grib file be set as a forecast instead of analysis fix from Bhavani for having first wave grib file be set as a forecast instead of analysis ecflow gfs v16 nco review 3 updates to split boundary points plus saving config file updates Adding missing space to if-block in env files to resolve runtime failure Issue #1: remove pgrb2b.0p25 dependency from gfsawips in setup_workflow.py ecflow gfsv16 redesign 2 Issue #1 - update WAFS tag to gfs_wafs.v6.0.4 and remove HOURLY variable from WAFS configs Issue #1 - change wavegempak and waveawipsgridded dependency to match waveawipsbulls and start when wavepostsbs is complete Issue #1: set n_sponge=42 in gfs section of config.fcst adding pnt jobs as separate jobs for env moving definitions of wavempexec and wave_mpmd from jobs to env Remove unneeded settings from config.post Added null DBNROOT to wave awips configs add extra script for by hour points for waves updates for boundary points by hour parallelization lowering the resource requirement for wave prep job Issue #1: rename enkf_chgres_recenter executables in accordance with WCOSS Implementation Standards modified: checkout.sh to use gldas_gfsv16_release.v1.6.0 update module for cdo Update to WAFS tag and added SENDDBN_NTC to both base configs modified: link_fv3gfs.sh to not link or copy 0readme fix_chem fix_fv3 fix_sfc_climo which are not used by 
GFS.v16 and are of large size modified: link_fv3gfs.sh to remove chgres_cube.fd and chgres_cube.fd in sorc/link_fv3gfs.sh modified: link_fv3gfs.sh to allow "fix" directories to be removed before rerunning link_fv3gfs.sh for RUN_ENVIR=nco case Added WAFS jobs to free-forecast mode, updates for extending WAFS to fh120, and two bug fixes in link_fv3gfs.sh and hpssarch_gen.sh Renamed global-workflow-owned ex-scripts to remove ecf extension and updated other scripts which call those ex-scripts Remove UFS_UTILS ecf extensions Issue #1: (1) update earc.sh directory removal to be consistent with arch.sh, (2) update config files to be consistent with EMC real-time GFS v16 parallel Remove ecf script name extensions from downstream wave scripts Add new downstream wave jobs to workflow Add new downstream WAFS jobs Issue #1 - update GLDAS and UPP workflow files for removal of ecf script extension Issue #1: remove ".ecf" suffix from DA scripts referenced in sorc/link_fv3gfs.sh Issue #1: remove ".ecf" extension from DA exscripts (as per WCOSS Implementation Standards) referenced from parm/config files updates to resources for wave jobs Issue #1: update name of DA enkf chgres script in config.echgres Issue #1: Rename DA enkf chgres job and script as per EE2 guidance Workflow changes for wave gempak and awips downstream jobs Fixing wavepostbndpnt dependency in setup_workflow_fcstonly.py ... 
* conflicts resolved Refs: #1 commit 384440ced59c18af777f7f9bb30016f3e71a1c31 Merge: 2f6d7ab06 fc727f405 Author: kate.friedman Date: Mon Mar 22 17:15:27 2021 +0000 Merge remote-tracking branch 'origin/develop' into feature/hpc-stack * origin/develop: Add two WAFS sorc folders to .gitignore Remove unneeded builds from build_all and partial build Delete unneeded files after workflow utils build was converted to cmake Remove global_chgres exec symlink from link_fv3gfs.sh Refs: #164 commit f35e6090bcf097355e691266bbc12ace6591c150 Merge: 5c77500bc fc727f405 Author: Kate.Friedman Date: Mon Mar 22 15:48:55 2021 +0000 Merge remote-tracking branch 'upstream/develop' into issue178 * upstream/develop: Add two WAFS sorc folders to .gitignore Remove unneeded builds from build_all and partial build Delete unneeded files after workflow utils build was converted to cmake Refs: #178 commit 5c77500bc4742a840a26f05aedffe2aca3e15ec7 Author: Kate.Friedman Date: Mon Mar 22 15:44:40 2021 +0000 Add exit in init.sh when C768 v16 warm starts detected Exit init.sh and init job when high res C768 v16 warm starts are detected and running chgres_cube is not needed. 
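The init.sh early exit described in the commit above can be sketched roughly as below; the variable names (`case_res`, `warm_start`) and message text are illustrative assumptions, not the actual script:

```shell
# Rough sketch of the init.sh short-circuit: when high-res C768 v16 warm-start
# ICs are detected there is nothing for chgres_cube to do, so the job exits
# cleanly instead of running the regrid.
warm_start_check() {
  local case_res=$1 warm_start=$2
  if [ "${case_res}" = "C768" ] && [ "${warm_start}" = ".true." ]; then
    echo "C768 v16 warm starts detected; chgres_cube not needed, exiting init"
    return 0   # the real job would 'exit 0' here
  fi
  echo "cold start or lower resolution; running chgres_cube"
  return 1
}

warm_start_check C768 .true.
```

The guard runs before any regridding work is scheduled, so the init job succeeds immediately for warm-start cases.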
Refs: #178 commit 27e91128f62a7975f16f3fc932727444d1dc812c Author: kate.friedman Date: Thu Mar 18 20:54:22 2021 +0000 Reverting transfer parm file changes committed at 39bab45 - updates to the transfer parm files were erroneously committed after a diff between nwpara/gfsv16.0.7 and release/gfsv16.0.0 branch; v16.0.7 had been moved to nwprod/gfsv16.0.7 and transfer parm file diffs were not in final install - confirmed with SPA Jen that nwprod/gfsv16.0.7 is finalized (it is) - pulled in transfer parm files from nwprod/gfsv16.0.7 to sync them again Refs: #1,#238 commit 41c19e93f4b327b98a3e8b44245cbbfcde8f855b Author: kate.friedman Date: Tue Mar 16 19:37:44 2021 +0000 Component tag updates for nwprod/gfsv16.0.7 - Final nwprod implementation component tag versions set in Externals.cfg and checkout.sh - GLDAS: roll back to gldas_gfsv16_release.v1.12.0 - EMC_verif-global: roll back to verif_global_v1.11.0 - WAFS: update in Externals.cfg to gfs_wafs.v6.0.19 Notes about local nwprod install changes: - EMC_post: nwprod install shows upp_gfsv16_release.v1.1.1 tag but contains local updates found in upp_gfsv16_release.v1.1.3 tag - WAFS: nwprod install shows gfs_wafs.v6.0.17 tag but contains local updates found in gfs_wafs.v6.0.19 tag Refs: #1,#238 commit 39bab455137a473be19cfdf93d62af41a069f0cf Author: kate.friedman Date: Tue Mar 16 18:36:08 2021 +0000 Updated transfer parm files for gdas, enkf, and gfs dissemination Refs: #1,#238 commit fc727f405afd4948ee99be1c13c9f2541405cc8b Merge: 258b342d5 7582541c3 Author: Kate Friedman Date: Wed Mar 10 16:40:18 2021 -0500 Merge pull request #288 from NOAA-EMC/feature/issue272 Post-cmake cleanup and decommissioning commit 7582541c3b4ec182756ad1fa9164c21e0345f058 Author: kate.friedman Date: Wed Mar 10 20:08:43 2021 +0000 Add two WAFS sorc folders to .gitignore - two missing WAFS symlinks were added to link_fv3gfs.sh in prior commit; include them in gitignore file now Refs: #272 commit 4f7243bc68164faf0008893467c5a2ea53d216ae Author: 
Kate.Friedman Date: Tue Mar 9 18:07:23 2021 +0000 Remove global_chgres exec from link_fv3gfs.sh; no longer exists from UFS_UTILS Refs: #178 commit fcd720edef1fe242c27d543f445bb3b4ee4fe9ec Author: Kate.Friedman Date: Tue Mar 9 17:54:51 2021 +0000 Remove unneeded UFS_UTILS symlinks in link_fv3gfs.sh Refs: #178 commit 2f6d7ab06b3be6eaedb3406efca77ded61ef7aca Author: kate.friedman Date: Tue Mar 9 17:41:14 2021 +0000 Module base updates for WCOSS-Dell, Hera, and Orion - add zlib module load - reorder wgrib2, hdf5, and netcdf module loads to accommodate module dependencies Refs: #164 commit 1285d803fd402945ebc03eb42eb8bfa3d2aa10da Author: kate.friedman Date: Tue Mar 9 10:59:09 2021 -0600 Update GSI checkout to clone GSI-master @ 9c1fc15 Refs: #164 commit 14736c8320bec43651208950c16ea85ae2372985 Merge: e6a9e5767 bea02f8a6 Author: Kate.Friedman Date: Tue Mar 9 16:56:42 2021 +0000 Merge branch 'feature/hpc-stack' of https://github.com/NOAA-EMC/global-workflow into feature/hpc-stack * 'feature/hpc-stack' of https://github.com/NOAA-EMC/global-workflow: Correct if-block syntax in config.postsnd and config.wavepostbndpnt after testing setup scripts FHMAX_WAV_IBP variable check for wavepostbndpnt job Bufr sounding job update Add config.resources.nco.static ecflow forecast job resource updates from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.5 Add missing symlinks for WAFS source code folders Update EMC_gfs_wafs tag to gfs_wafs.v6.0.19 Remove KEEPDATA from config.base.nco.static Pull in config changes from NCO v16.0.7 install Pull in workflow changes from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.4 ecFlow resource adjustments from NCO for forecast and post jobs issue #257 shorten run time on Mars Dell1 file system commit e6a9e576744a1c245148bdc4b967a5674c9416b5 Author: Kate.Friedman Date: Tue Mar 9 16:55:41 2021 +0000 Remove UFS_UTILS symlinks no longer needed in link_fv3gfs.sh Refs: #164 commit 
370e81474ce8f4076b0853b37de7da9c65d65c88 Merge: 364e46ff2 258b342d5 Author: Kate.Friedman Date: Tue Mar 9 16:14:40 2021 +0000 Merge remote-tracking branch 'upstream/develop' into issue178 * upstream/develop: Correct if-block syntax in config.postsnd and config.wavepostbndpnt after testing setup scripts FHMAX_WAV_IBP variable check for wavepostbndpnt job Bufr sounding job update Add config.resources.nco.static ecflow forecast job resource updates from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.5 Add missing symlinks for WAFS source code folders Update EMC_gfs_wafs tag to gfs_wafs.v6.0.19 Remove KEEPDATA from config.base.nco.static Pull in config changes from NCO v16.0.7 install Pull in workflow changes from NCO v16.0.7 install remove flags that were not present in Makefiles update jasper to 2.x.25. remove compiler flag in fv3nc2nemsio Update EMC_verif-global tag to verif_global_v1.13.4 another use of _d where an _4 is needed Adjust how target is set for build_workflow_utils Add workflow_utils to build_all and link scripts copy/paste error from enkf_chgres_recenter.fd to enkf_chgres_recenter_nc.fd. When going from nemsio to netcdf, the linking of ip, sp and w3nco changed from _d to _4. update .gitignore to exclude build and install directories as well as compiled files. bugfix in build_workflow_utils.sh hack. nceplibs-ncio now creates the module ncio and not fv3gfs_ncio. 
Update EMC_verif-global tag to verif_global_v1.13.4 add cmake build capability for workflow utilities ecFlow resource adjustments from NCO for forecast and post jobs issue #257 shorten run time on Mars Dell1 file system Refs: #178 commit 364e46ff2acabed7bd9f6cb07ca10aa19033a980 Author: Kate.Friedman Date: Tue Mar 9 16:08:57 2021 +0000 Add pull of gdas wave restart files in getic.sh if DO_WAVE=YES Refs: #178 commit bea02f8a6fc496794471e57aa75e030092027bd5 Merge: fd50508f8 258b342d5 Author: kate.friedman Date: Mon Mar 8 14:02:51 2021 -0600 Merge remote-tracking branch 'origin/develop' into feature/hpc-stack * origin/develop: Correct if-block syntax in config.postsnd and config.wavepostbndpnt after testing setup scripts FHMAX_WAV_IBP variable check for wavepostbndpnt job Bufr sounding job update Add config.resources.nco.static ecflow forecast job resource updates from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.5 Add missing symlinks for WAFS source code folders Update EMC_gfs_wafs tag to gfs_wafs.v6.0.19 Remove KEEPDATA from config.base.nco.static Pull in config changes from NCO v16.0.7 install Pull in workflow changes from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.4 ecFlow resource adjustments from NCO for forecast and post jobs issue #257 shorten run time on Mars Dell1 file system Refs: #164 commit e9e1f3ba976f053c6738d95d6de673a065d293c1 Author: kate.friedman Date: Mon Mar 8 19:54:24 2021 +0000 Remove unneeded builds from build_all and partial build - remove build sections in build_all.sh that ran builds that are now consolidated into build_workflow_utils.sh - remove build options in partial build script and config that are now consolidated into workflow_utils build Refs: #272 commit 26fa8c355c241a64d19097da3caa54cba0e46b77 Author: kate.friedman Date: Mon Mar 8 19:38:45 2021 +0000 Delete unneeded files after workflow utils build was converted to cmake - remove modulefiles that are no longer needed; consolidated 
into new workflow_utils.* modulefiles - remove build scripts that are no longer needed; consolidated into new build_workflow_utils.sh script Refs: #272 commit 258b342d5b6535cf87973105d770a63f4245d44a Merge: f6aa69393 729631d66 Author: Kate Friedman Date: Mon Mar 8 12:59:46 2021 -0500 Merge pull request #282 from NOAA-EMC/feature/issue238 GFSv16.0.7 changes from NCO commit 729631d66f72ba8a3929e4927129d000879c9a1b Author: kate.friedman Date: Mon Mar 8 17:40:17 2021 +0000 Correct if-block syntax in config.postsnd and config.wavepostbndpnt after testing setup scripts Refs: #238 commit a1edf0de4deaccbf7f77f2fb4d24d1ca96c26171 Author: kate.friedman Date: Mon Mar 8 17:21:05 2021 +0000 FHMAX_WAV_IBP variable check for wavepostbndpnt job - add variable override in exgfs_wave_post_bndpnt.sh for FHMAX_WAV_IBP - add check to config.wavepostbndpnt for when FHMAX_GFS is less than FHMAX_WAV_IBP; set FHMAX_WAV_IBP to equal FHMAX_GFS when the gfs forecast is shorter than 180 hrs; resolves runtime error with wavepostbndpnt job waiting for forecast output that will never arrive Refs: #238 commit ac76613f33f7d196a1d04972bc7c975d579da571 Merge: 0f97ccdad f6aa69393 Author: anning.cheng Date: Mon Mar 8 10:09:47 2021 -0600 Merge remote-tracking branch 'upstream/develop' into merra2 * upstream/develop: remove flags that were not present in Makefiles update jasper to 2.x.25. remove compiler flag in fv3nc2nemsio Update EMC_verif-global tag to verif_global_v1.13.4 another use of _d where an _4 is needed Adjust how target is set for build_workflow_utils Add workflow_utils to build_all and link scripts copy/paste error from enkf_chgres_recenter.fd to enkf_chgres_recenter_nc.fd. When going from nemsio to netcdf, the linking of ip, sp and w3nco changed from _d to _4. update .gitignore to exclude build and install directories as well as compiled files. bugfix in build_workflow_utils.sh hack. nceplibs-ncio now creates the module ncio and not fv3gfs_ncio. 
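The FHMAX_WAV_IBP check described in the commit above amounts to a clamp in config.wavepostbndpnt; a minimal sketch, with example values only (the actual defaults live in the config files):

```shell
# Clamp the wave boundary-point forecast length to the gfs forecast length so
# the wavepostbndpnt job never waits for forecast output that will not exist.
FHMAX_GFS=120        # example: a gfs forecast shorter than the 180 h default
FHMAX_WAV_IBP=180    # default boundary-point wave forecast length

if (( FHMAX_GFS < FHMAX_WAV_IBP )); then
  FHMAX_WAV_IBP=${FHMAX_GFS}
fi
echo "FHMAX_WAV_IBP=${FHMAX_WAV_IBP}"
```

Without the clamp, wavepostbndpnt polls for boundary-point output out to 180 h and eventually times out whenever FHMAX_GFS is shorter.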
add cmake build capability for workflow utilities Update Fit2Obs tag to newm.1.3 for bugfix Update Fit2Obs tag to newm.1.3 for bugfix Update EMC_verif-global tag for Hera bug Pull in nwpara/gfsv16.0.6 updates for parse-storm-type.pl HOTFIX: Update EMC_verif-global tag for Hera bug Issue #1 and issue #238 - update Externals.cfg to match checkout.sh updates for v16.0.6 move errchk definition from script to job for wave prdgen and gempak adding definition of errchk which was undefined in these scripts Issue #179 and issue #243 - update Fit2Obs to newm.1.2 tag and correct COMROOT path for Hera HOTFIX: Issue #241 - update EMC_verif-global tag to remove use of /tmp space Issue #1 and issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 Issue #1 and issue #241 - update EMC_verif-global tag Issue #1 and issue #238 - remove ak_10m grid from config.wave and update checkout.sh tags for ufs-weather-model and EMC_gfs_wafs Issue #179 - update to config.vrfy for Fit2Obs tag which supports Orion Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 in Externals.cfg Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 Issue #233 - remove unnecessary extra space in tag line for gldas Issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 Issue #1 and issue #226 - update exgfs_atmos_grib2_special_npoess.sh for dbn_alert modified: exglobal_forecast.sh The breakpoint restart only works for the first restart from a breakpoint. Restart files written in RERUN_RESTART after the first restart have a 3-hour time shift for DO_IAU=YES cases. Forecasts starting from the 2nd breakpoint and beyond will fail because of incorrect initial conditions. This commit fixes this bug. 
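The 3-hour shift mentioned above comes from the IAU half-window: with a 6 h IAU window the model clock starts 3 h before the cycle time, so restart timestamps are offset when DO_IAU=YES. A hedged sketch of that timing arithmetic (everything except DO_IAU and CDATE is an illustrative assumption, and GNU date is assumed):

```shell
# Illustrative only: compute the model start time SDATE for a cycle CDATE.
# With DO_IAU=YES and a 6 h IAU window, integration begins half the window
# (3 h) before CDATE, which is the offset restart files carry.
CDATE=2021031800          # cycle date, YYYYMMDDHH (example value)
DO_IAU=YES
IAU_WINDOW=6              # assumed IAU window length in hours

if [ "${DO_IAU}" = "YES" ]; then
  half=$(( IAU_WINDOW / 2 ))
  SDATE=$(date -u -d "${CDATE:0:8} ${CDATE:8:2}:00 ${half} hours ago" +%Y%m%d%H)
else
  SDATE=${CDATE}
fi
echo "SDATE=${SDATE}"
```

A rerun that treats a restart timestamp as if it were the cycle time ignores this offset, which is consistent with the second-and-later breakpoint failures the commit describes.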
issue #227 reducing output bufr files to 64 levels per NCO request issue #227 reducing output bufr files to 64 levels per NCO request Issue #1: update ecflow to be consistent with NCO's gfs.v16.0.4 and update checkout to bring in new UPP tag (see issue #226) Updated ush script scale_dec.sh Updated scripts gfs_v16.0 Issue #1 and issue #220 - set C192/C96/C48 npe_eobs back to dev values for develop Issue #1 - add grib_util module load to several analysis ecflow scripts Issue #189 - update ufs-weather-model hash Issue #201 - workaround for failing post000 job before hpc-stack solution Issue #1 - update WAFS tag to gfs_wafs.v6.0.17 for dbn_alert change Issue #1 - correct gridded wave parm files for v16.0.3 Issue #1 - changes from NCO for GFSv16.0.3 Issue #1 - update Externals.cfg with final tags for GFSv16.0.2 Fixes for issue #202 (FINDDATE) and issue #208 (postsnd.sh permissions) Issue #1 - correct permissions on jobs/rocoto/postsnd.sh Issue #1 - update gempak version to 7.3.3 in the ecflow gfs.ver file Issue #1 - update gempak and dumpjb versions to 7.3.3 and 5.1.0 respectively Script alert updates from NCO for wave downstream Add override for COMIN_WAV_RTOFS in emc mode for waveprep job Issue #197: place CDATE specific sections of config.anal and config.prep inside RUN_ENVIR=emc blocks. These sections are used for retrospective parallels and therefore do not need to be executed in operations (NCO). modified: JGLOBAL_FORECAST modified: JGLOBAL_FORECAST to make it work for both emc and nco running environments. modified: JGLOBAL_FORECAST A test showed that jobid is not defined in JGLOBAL_FORECAST running in the Rocoto environment. jobid is defined in ./env files. Defining DATA without sourcing ./env/$machine.env caused the script to fail. 
Move the definition of DATA after sourcing env parameters modified: checkout.sh to use WAFS tag gfs_wafs.v6.0.16 modified: checkout.sh to update UPP to upp_gfsv16_release.v1.1.1, a minor syntax bug fix modified: link_fv3gfs.sh to 1) use hard copies of external fix fields and executable for NCO installation 2) use soft links for all other files and directories for both NCO and EMC installations Compared local files in NCO implementation directory with release/gfs.v16.0.0 branch, changes made by NCO (Jen Yang) in the following files are either accepted or rejected. create a new branch release/gfsv16.0.0.nco to merge changes made by NCO in /gpfs/dell1/nco/ops/nwpara/gfs-v16/gfs.v16.0.1 back to EMC's repository update for the wave parm so that the wave model will look for the correct restart for when gfs is not run every cycle updates for checking if RTOFS files exist and only processing RTOFS files for needed fhr commit b644f792bbb939d6ab25957b93f812aa02b497a9 Author: kate.friedman Date: Mon Mar 8 15:07:34 2021 +0000 Bufr sounding job update - set DO_BUFRSND to NO by default - in config.postsnd, if FHMAX_GFS is less than 180 then set DO_BUFRSND to FHMAX_GFS Refs: #238 commit 86b0345a550a7128fd6420befcb793a94411e4db Author: Kate.Friedman Date: Mon Mar 8 14:09:40 2021 +0000 Add pull of IAU increment files off HPSS for free-forecast Refs: #178 commit efdf917a843cc301a2fe7d5386aea1e29dd02c46 Author: Kate.Friedman Date: Fri Mar 5 21:29:22 2021 +0000 Update in wave prep script and add ability to pull warm starts - Update getic job to detect warm start config info and pull warm start RESTART files off HPSS and place in ROTDIR; just for retro ICs, will add ops after implementation - Add check to JGLOBAL_WAVE_PREP for prior cycle rtofs ROTDIR symlink; normally cycled mode would have already created this during prior cycle so added check for use in free-forecast mode Refs: #178 commit 65fce4ce9b2d7e63a0bd038fff1a206f8b4a5815 Merge: 084cb6aac 9027452bf Author: kate.friedman Date: 
Fri Mar 5 17:13:37 2021 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into feature/issue238 * origin/release/gfsv16.0.0: Add config.resources.nco.static ecflow forecast job resource updates from NCO v16.0.7 install Refs: #238 commit 9027452bfd5345b89bc09656eb28211ca15fe171 Author: kate.friedman Date: Fri Mar 5 17:01:25 2021 +0000 Add config.resources.nco.static - added parm/config/config.resources.nco.static; contains resources for operations - added copy of config.resources.nco.static in link_fv3gfs.sh when nco mode Refs: #1,#238 commit 9ccb24c73e85119901075341c2aaa81b623e6104 Author: kate.friedman Date: Fri Mar 5 15:43:32 2021 +0000 ecflow forecast job resource updates from NCO v16.0.7 install Refs: #1,#238 commit be0ed38a53d2422923ccd5380d610785e61deafa Author: Kate.Friedman Date: Thu Mar 4 18:06:24 2021 +0000 Consolidate/cleanup init config and script Refs: #178 commit d32199cd58329b4b2c9c5c3e84e7d8d0801305e0 Author: Kate.Friedman Date: Thu Mar 4 17:43:58 2021 +0000 Move getic/init settings from scripts to configs - consolidate RETRO variable in config.getic - move configuration settings from getic/init scripts to configs - add dependency to setup_workflow_fcstonly.py - add getic and init to task list at top of config.resources Refs: #178 commit 084cb6aacc0c65440c5fecd3010a1b691ee98d87 Author: kate.friedman Date: Thu Mar 4 15:22:42 2021 +0000 Update EMC_verif-global tag to verif_global_v1.13.5 Refs: #238,#279 commit fd50508f84d3ec721747a069412c929b7269a602 Author: kate.friedman Date: Thu Mar 4 08:43:37 2021 -0600 hpc-stack updates for Orion - update OznMon and RadMon modulefiles - update module_base.orion - update Orion if-block in machine-setup.sh - remove duplicate module use for WCOSS-Dell in machine-setup.sh Refs: #164 commit 95ec834eb3192b12cae979ef5696b9670cc84b72 Author: kate.friedman Date: Thu Mar 4 14:35:43 2021 +0000 hpc-stack updates for WCOSS-Dell - update OznMon, Radmon modulefiles - update module_base.wcoss_dell_p3 - remove WW3 
execs from link_fv3gfs.sh; will come from new build script - update wcoss_dell_p3 section of machine-setup.sh for hpc-stack - update checkout.sh for hpc-stack supported components (some test versions) Refs: #164 commit 0f97ccdadbf36ab02fea69933531fb0de878d0c5 Merge: 960ca42f0 1ea07da4b Author: anning.cheng Date: Mon Mar 1 10:53:55 2021 -0600 Merge branch 'merra2' of https://github.com/AnningCheng-NOAA/global-workflow into merra2 * 'merra2' of https://github.com/AnningCheng-NOAA/global-workflow: testing merra2 workflow in hera commit 960ca42f0999f882a2e4dc59d105edef9e7a0fe0 Author: anning.cheng Date: Mon Mar 1 10:53:14 2021 -0600 updated config.fcst commit 428e1ba753ba3151cc284482b828f1d14a394567 Merge: 7e2693626 25152a1dc Author: kate.friedman Date: Mon Mar 1 16:04:08 2021 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into feature/issue238 * origin/release/gfsv16.0.0: Add missing symlinks for WAFS source code folders Update EMC_gfs_wafs tag to gfs_wafs.v6.0.19 Refs: #1,#238 commit 25152a1dce8316673c044c37c9362f297235aaf8 Author: kate.friedman Date: Fri Feb 26 20:45:59 2021 +0000 Add missing symlinks for WAFS source code folders - added symlinks in link_fv3gfs.sh for wafs_blending_0p25.fd and wafs_grib2_0p25.fd Refs: #1,#238 commit 80e4a39a3cf96c9fa69636ed9aab30f04e170da0 Author: kate.friedman Date: Fri Feb 26 19:45:07 2021 +0000 Update EMC_gfs_wafs tag to gfs_wafs.v6.0.19 - new tag includes changes made locally in NCO nwpara/gfsv16.0.7 install Refs: #1,#238 commit 7e2693626c9c9ef0e625ea09da6f7c60a62746d4 Merge: 2beea1293 264e51f4b Author: kate.friedman Date: Wed Feb 24 21:21:17 2021 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into feature/issue238 * origin/release/gfsv16.0.0: Remove KEEPDATA from config.base.nco.static commit 2beea12934ecd2d374a6a411733e86bc6343d721 Merge: f6aa69393 378625624 Author: kate.friedman Date: Wed Feb 24 21:20:34 2021 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into feature/issue238 * 
origin/release/gfsv16.0.0: Pull in config changes from NCO v16.0.7 install Pull in workflow changes from NCO v16.0.7 install Update EMC_verif-global tag to verif_global_v1.13.4 ecFlow resource adjustments from NCO for forecast and post jobs issue #257 shorten run time on Mars Dell1 file system commit 264e51f4b2e602433bab7c6f968164a550d5b644 Author: kate.friedman Date: Wed Feb 24 21:01:59 2021 +0000 Remove KEEPDATA from config.base.nco.static - NCO requested that KEEPDATA be removed from config.base and config.fcst since KEEPDATA gets set at the ecflow level and shouldn't be changed by the configs lower down. - Already commented out in config.fcst in prior commit. - Removed KEEPDATA from config.base.nco.static since it becomes config.base in prod mode with ecflow. - Retaining KEEPDATA in config.base.emc.dyn since that only becomes config.base in dev mode (rocoto doesn't set KEEPDATA). Refs: #1,#238 commit 3786256240e76982c54a5be6d8a7277530d9dd47 Author: kate.friedman Date: Wed Feb 24 17:37:36 2021 +0000 Pull in config changes from NCO v16.0.7 install - add line to config.fcst to allow running gdas and gfs with different threading - add nth_fv3_gfs to config.fv3 to support different threading for gfs - changes to post settings in config.resources to make post use all tasks on a node and speed up post jobs Refs: #1,#238 commit 9629e98bf1f34fb36b785092988e8335a53172df Author: kate.friedman Date: Wed Feb 24 15:45:07 2021 +0000 Pull in workflow changes from NCO v16.0.7 install - ecflow forecast scripts changes for resources - fixes to prdgen wave scripts Refs: #1,#238 commit d41796ae9d7024624078c5972aed032eea25ca65 Author: Kate.Friedman Date: Tue Feb 23 18:37:10 2021 +0000 Updates from testing on Hera - add WGRIB2 setenv to module_base.hera; wgrib2 was unknown in post jobs without - set RUN_CCPP default to YES; should be running with CCPP now, consider removing variable Refs: #164 commit 2cce21a86a4e95b0f608eb21fcff857e61a20ca5 Merge: af971d76b f6aa69393 Author: 
kate.friedman Date: Mon Feb 22 10:09:33 2021 -0600 Merge remote-tracking branch 'origin/develop' into feature/hpc-stack * origin/develop: remove flags that were not present in Makefiles update jasper to 2.x.25. remove compiler flag in fv3nc2nemsio Update EMC_verif-global tag to verif_global_v1.13.4 another use of _d where an _4 is needed Adjust how target is set for build_workflow_utils Add workflow_utils to build_all and link scripts copy/paste error from enkf_chgres_recenter.fd to enkf_chgres_recenter_nc.fd. When going from nemsio to netcdf, the linking of ip, sp and w3nco changed from _d to _4. update .gitignore to exclude build and install directories as well as compiled files. bugfix in build_workflow_utils.sh hack. nceplibs-ncio now creates the module ncio and not fv3gfs_ncio. add cmake build capability for workflow utilities Update Fit2Obs tag to newm.1.3 for bugfix Update Fit2Obs tag to newm.1.3 for bugfix Update EMC_verif-global tag for Hera bug Pull in nwpara/gfsv16.0.6 updates for parse-storm-type.pl HOTFIX: Update EMC_verif-global tag for Hera bug Issue #1 and issue #238 - update Externals.cfg to match checkout.sh updates for v16.0.6 move errchk definition from script to job for wave prdgen and gempak adding definition of errchk which was undefined in these scripts Issue #1 and issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 Issue #1 and issue #241 - update EMC_verif-global tag Issue #1 and issue #238 - remove ak_10m grid from config.wave and update checkout.sh tags for ufs-weather-model and EMC_gfs_wafs Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 in Externals.cfg Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 Issue #1 and issue #226 - update exgfs_atmos_grib2_special_npoess.sh for dbn_alert issue #227 reducing output bufr files to 64 levels per NCO request issue #227 reducing output bufr files to 64 levels per NCO request Issue #1: update ecflow to be consistent with NCO's gfs.v16.0.4 and 
update checkout to bring in new UPP tag (see issue #226) Updated ush script scale_dec.sh Updated scripts gfs_v16.0 Conflicts: Externals.cfg sorc/checkout.sh Refs: #164 commit f6aa69393dbd9e3f99d9d9ee803e4e0c9ea3d1a3 Merge: 2d055914e 1d7776d66 Author: Kate Friedman Date: Fri Feb 19 10:15:41 2021 -0500 Merge pull request #264 from NOAA-EMC/feature/cmake Build workflow utilities with CMake. commit 1d7776d66344a0b7423d64ae6b7f221f51472a46 Author: Rahul Mahajan Date: Fri Feb 19 10:06:57 2021 -0500 remove flags that were not present in Makefiles commit d3df2ea9a67d149d4b3f2b85fd22cade380ac928 Author: Rahul Mahajan Date: Thu Feb 18 10:53:33 2021 -0500 update jasper to 2.x.25. remove compiler flag in fv3nc2nemsio commit 2d055914e5c67e3e93f2ae93597a898b27dc76e4 Merge: 0e29b80df f2ba50e08 Author: Kate Friedman Date: Thu Feb 18 10:51:04 2021 -0500 Merge pull request #271 from NOAA-EMC/issue266 Update EMC_verif-global tag to verif_global_v1.13.4 commit f2ba50e083b3afff2574f3c1fa9d362d77d8ce03 Author: kate.friedman Date: Thu Feb 18 15:35:49 2021 +0000 Update EMC_verif-global tag to verif_global_v1.13.4 - Added missing environment variable in ush/run_verif_global_in_global_workflow.sh - Updated ush/run_verif_global_in_global_workflow.sh and ush/set_up_verif_global.sh to point to new archive directory locations Refs: #266 commit 522af45f29a316f0cd88f2cfb0a1e2f704e0d3d5 Author: Rahul Mahajan Date: Wed Feb 17 15:57:47 2021 -0500 another use of _d where an _4 is needed commit 8e90d448ac6fe55ba40fc1a19c818738c9443426 Author: Kate.Friedman Date: Wed Feb 17 20:44:49 2021 +0000 Adjust how target is set for build_workflow_utils - Add $target to build_workflow_utils.sh command in build_all.sh - Move machine-setup source back down into if-block in build_workflow_utils.sh Refs: #264 commit 7c856bf3d8e1ac13c0c1c2eab805836289b7b0bc Author: Kate.Friedman Date: Wed Feb 17 18:56:19 2021 +0000 Add workflow_utils to build_all and link scripts - add build_workflow_utils.sh to build_all.sh - add 
workflow utils exec symlinks to link_fv3gfs.sh since these execs no longer land in /exec folder - add workflow_utils build to partial_build.sh and fv3gfs_build.cfg for use by build_all - turn off builds in fv3gfs_build.cfg that are now covered by workflow_utils build - move machine-setup source in build_workflow_utils.sh before target is needed Further cleanup and decommissioning of consolidated builds will come in follow-up commit/PR. Refs: #264 commit ccba5ee2edf8595f8d4a12d3cd956f1d0d4f1c83 Author: Rahul Mahajan Date: Wed Feb 17 11:18:06 2021 -0500 copy/paste error from enkf_chgres_recenter.fd to enkf_chgres_recenter_nc.fd. When going from nemsio to netcdf, the linking of ip, sp and w3nco changed from _d to _4. commit 1d26a83b2824d646bc1483c18496787b6ceb7672 Author: Rahul Mahajan Date: Tue Feb 16 13:07:07 2021 -0600 update .gitignore to exclude build and install directories as well as compiled files. bugfix in build_workflow_utils.sh hack. nceplibs-ncio now creates the module ncio and not fv3gfs_ncio. 
commit bd103a1a017173694d6e10822ab5a55b4ce1da67 Author: kate.friedman Date: Tue Feb 16 16:43:58 2021 +0000 Update EMC_verif-global tag to verif_global_v1.13.4 - Added missing environment variable in ush/run_verif_global_in_global_workflow.sh - Updated ush/run_verif_global_in_global_workflow.sh and ush/set_up_verif_global.sh to point to new archive directory locations Refs: #1,#266 commit 3a6b186904a9d8e6cdc5d89e80c23094a70717ad Author: Rahul Mahajan Date: Sun Feb 14 19:50:56 2021 -0500 add cmake build capability for workflow utilities commit 181606755de789d1764d1f21a5a2e72ea302da34 Author: Kate.Friedman Date: Fri Feb 12 14:19:51 2021 +0000 Updates for archival, NSST, and v16 retros - add variables: MODE to delineate free-forecast and cycled modes; RETRO variable to toggle between v16 ops and retro inputs - update arch step to archive INPUT files instead of RESTART files when in free (free-forecast) mode; other small fixes to arch step - set NSST to spinup before availability date - added support for running off v16 retrospective inputs - fix missing bufrsnd job in free-forecast mode - update resource settings for C48 and C96; further refinement coming Refs: #178 commit 1f0be181e9a84ee5c5ddef927c4b3c2c03c430dd Author: kate.friedman Date: Thu Feb 11 19:11:11 2021 +0000 ecFlow resource adjustments from NCO for forecast and post jobs Refs: #1,#238 commit 10ed5f1152c385a4d19864ac5924b77dd66dc75e Author: Kate.Friedman Date: Tue Feb 9 19:08:38 2021 +0000 Add wave jobs to arch job dependencies Refs: #178 commit 77d7408b759a761d1697ad672dd4feae14d9c5b1 Author: Kate.Friedman Date: Tue Feb 9 18:49:05 2021 +0000 Move GFSv16 version date and set C96 to use 4 threads - change config.fv3 to use 4 threads for C96 (too few nodes with 1 thread) - update GFSv16 version if-block date in getic and init scripts to use updated/tentative implementation date; update when finalized Refs: #178 commit 1ea07da4b138f271772ed216b18233ac0d7b6e39 Author: anning.cheng Date: Tue Feb 9 02:20:22 2021 
-0500 testing merra2 workflow in hera commit 613a181c040682dfb41997c64c2d220a33eeecbe Author: Kate.Friedman Date: Thu Feb 4 20:57:07 2021 +0000 Adjusting from testing - add COMPONENT setting back into getic.sh - add gfsinit as dependency to gfswaveprep since UFS_UTILS gdas_init scripts remove the atmos folder Refs: #178 commit 7a34968e3a66fc31127de1450d5ce194d9b9681b Author: kate.friedman Date: Thu Feb 4 14:32:30 2021 -0600 Small adjustments from testing - add needed exports in getic.sh - add check to init for whether it needs to run - add check to init for copying pgbanl files when getic doesn't run - remove atmos subfolder from init dependencies for older versions Refs: #178 commit ed74d83b64cbc98a36698dfb39b98fc860892ea4 Author: kate.friedman Date: Thu Feb 4 13:28:59 2021 -0600 Adjustments for Orion, dependencies, and using DATA folder - update getic and init jobs to use DATA folder and ROTDIR - move pgb copy back to getic job since it now dumps into ROTDIR - update setup_workflow_fcstonly.py to turn off getic on Orion and adjust dependencies for init for input files from supported versions Refs: #178 commit 28267e7b66932e69338fb41d3dc338a3d27192dd Merge: 1871c7a95 a2ae84fa5 Author: Kate Friedman Date: Thu Feb 4 11:23:34 2021 -0500 Merge pull request #258 from GuangPingLou-NOAA/release/gfsv16.0.0 Issue #257 shorten run time of postsnd job on WCOSS-Dell commit a2ae84fa532c60496c87419b9e90766a2cd7e659 Author: Guang.Ping.Lou Date: Thu Feb 4 15:45:58 2021 +0000 issue #257 shorten run time on Mars Dell1 file system commit b414ea34ae9b11c33f6070a1687732e05e6e10f2 Merge: abdc1334e 0e29b80df Author: Kate.Friedman Date: Wed Feb 3 19:38:17 2021 +0000 Merge remote-tracking branch 'upstream/develop' into issue178 * upstream/develop: Update Fit2Obs tag to newm.1.3 for bugfix Update Fit2Obs tag to newm.1.3 for bugfix Update EMC_verif-global tag for Hera bug Pull in nwpara/gfsv16.0.6 updates for parse-storm-type.pl HOTFIX: Update EMC_verif-global tag for Hera bug Issue #1 and 
issue #238 - update Externals.cfg to match checkout.sh updates for v16.0.6 move errchk definition from script to job for wave prdgen and gempak adding definition of errchk which was undefined in these scripts Issue #179 and issue #243 - update Fit2Obs to newm.1.2 tag and correct COMROOT path for Hera HOTFIX: Issue #241 - update EMC_verif-global tag to remove use of /tmp space Issue #1 and issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 Issue #1 and issue #241 - update EMC_verif-global tag Issue #1 and issue #238 - remove ak_10m grid from config.wave and update checkout.sh tags for ufs-weather-model and EMC_gfs_wafs Issue #179 - update to config.vrfy for Fit2Obs tag which supports Orion Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 in Externals.cfg Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 Issue #233 - remove unnecessary extra space in tag line for gldas Issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 Issue #1 and issue #226 - update exgfs_atmos_grib2_special_npoess.sh for dbn_alert issue #227 reducing output bufr files to 64 levels per NCO request issue #227 reducing output bufr files to 64 levels per NCO request Issue #1: update ecflow to be consistent with NCO's gfs.v16.0.4 and update checkout to bring in new UPP tag (see issue #226) Updated ush script scale_dec.sh Updated scripts gfs_v16.0 Refs: #178 commit abdc1334e3c58d40977b20fb7acabd4c14a80cf6 Author: Kate.Friedman Date: Wed Feb 3 19:28:44 2021 +0000 Further init updates for free-forecast mode - move COMPONENT setting from getic to init job - cleanup how getic job sets tarball paths and pulls pgbanl files - move second step of pgbanl pull to init job for consistency - add MODE variable to config.base and its definition to setup scripts - add MODE setting to config.base if-block that adjusts IAU variables - add missing COMPONENT subfolder to data dependencies for getic and init jobs in setup_workflow_fcstonly.py - remove unneeded 
commented out wavestat job from setup_workflow_fcstonly.py Refs: #178 commit 29bd69b7545b155e3d19e86031b7e4e225195233 Author: anning.cheng Date: Tue Feb 2 15:04:13 2021 -0600 update merra2 before a new pull request commit 0e29b80df492677380fba6adf1e73d0efba21f3a Merge: 4ffed84c2 9f3c6dffe Author: Kate Friedman Date: Tue Feb 2 13:30:58 2021 -0500 Merge pull request #253 from NOAA-EMC/issue240 NCO updates for GFSv16.0.4[5][6] commit 9f3c6dffe14d2e50eec8e9406c220d740bf0ea3d Merge: 446a3274a 1871c7a95 Author: kate.friedman Date: Tue Feb 2 16:32:47 2021 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into issue240 SPA Jen and Qingfu corrected ush/parse-storm-type.pl. Pulled this into release/gfsv16.0.0 for v16.0.6. Refs: #240 commit 446a3274affb7a82c0804d33191b6552b9a7899f Merge: 8e0d1ad58 4ffed84c2 Author: kate.friedman Date: Tue Feb 2 16:27:08 2021 +0000 Merge remote-tracking branch 'origin/develop' into issue240 Sync merge with develop branch to get up-to-date. Refs: #240 commit af971d76b2c62aa27d7e989877805a1950e52982 Author: Kate.Friedman Date: Mon Feb 1 17:40:21 2021 +0000 Updates for hpc-stack - Remove unneeded ignores from .gitignore - Update Externals.cfg and checkout.sh for component stack versions - Update link_fv3gfs.sh for changed component links - Further Hera module updates for hpc-stack Refs: #164 commit 4ffed84c2fd152da56894d3cf1fe26c14cf86a5a Author: Kate.Friedman Date: Mon Feb 1 15:02:05 2021 +0000 Update Fit2Obs tag to newm.1.3 for bugfix Jack Woollen reported a bug in the newm.1.2 tag related to insufficient rstprod permission settings on produced files. Need fix on WCOSS/Hera. New tag newm.1.3 resolves bug. Refs: #252 commit 1871c7a9578c4969ca5fabc75a98468a21558cb5 Author: Kate.Friedman Date: Mon Feb 1 14:48:10 2021 +0000 Update Fit2Obs tag to newm.1.3 for bugfix Jack Woollen reported a bug in the newm.1.2 tag related to insufficient rstprod permission settings on produced files. Need fix on WCOSS/Hera. New tag newm.1.3 resolves bug. 
Refs: #1,#252 commit b251164cb3adfab274d7f7ea15c30eaff5eabf79 Author: Kate.Friedman Date: Fri Jan 29 19:45:44 2021 +0000 Add missing component subfolder to fcst dependency The free-forecast mode fcst job data dependency file path was missing the component "atmos" subfolder. Added and tested in experiment on Hera. Refs: #178 commit 2147c59ac06d1afb69bbe95e0fa9f155b8e13594 Author: kate.friedman Date: Fri Jan 29 18:50:45 2021 +0000 Update EMC_verif-global tag for Hera bug Users reported issues plotting within EMC_verif-global on Hera. EMC_verif-global code manager made hotfix and provided new tag to resolve issues. No other HPCs impacted by this hotfix update. EMC_verif-global tag updated in both Externals.cfg and sorc/checkout.sh. Refs: #1,#251 commit 9066e6deef04dfc23eeeba47e1742d76dc508209 Author: kate.friedman Date: Fri Jan 29 17:03:04 2021 +0000 Pull in nwpara/gfsv16.0.6 updates for parse-storm-type.pl NCO reported "some JTWC tcvital missing in the gfs parallel run". The parse-storm-type.pl script was updated by Qingfu Liu to resolve issues and SPA Jen pulled it into the nwpara/gfsv16.0.6 install. Pulling the script update into our release branch to match nwpara. Refs: #1, #238 commit 94b30e3279985c13e4bfc05c60e487b1bda874ed Author: kate.friedman Date: Fri Jan 29 16:28:17 2021 +0000 HOTFIX: Update EMC_verif-global tag for Hera bug Users reported issues plotting within EMC_verif-global on Hera. EMC_verif-global code manager made hotfix and provided new tag to resolve issues. No other HPCs impacted by this hotfix update. EMC_verif-global tag updated in both Externals.cfg and sorc/checkout.sh. Refs: #251 commit a7391d70716c5a543ec57cf74f2c5cf133c15330 Author: kate.friedman Date: Thu Jan 28 19:04:52 2021 +0000 Update free-forecast mode chgres jobs for chgres_cube Update free-forecast mode to interface with UFS_UTILS gdas_init utility scripts. Update getic job to use gdas_init get scripts to pull ICs off HPSS for GFS versions 13 and later. 
Rename fv3ic job to "init" and update it to interface with gdas_init run scripts to run chgres_cube and produce GFSv16 ICs. Update job dependencies to detect need to run chgres jobs and hold forecast jobs until ICs are generated or present. Further updates coming for this task. Tested on WCOSS-Dell, need to test elsewhere still. Will disable getic job on Orion. Refs: #1, #178 commit aecf55db6c3e591e704be9d7404b18ed97d41add Author: Kate.Friedman Date: Mon Jan 25 20:04:32 2021 +0000 Issue #164 - stack updates for UFS_UTILS build commit cf4a226caa40c132a3c944fbf9585db0ac93427e Author: Kate.Friedman Date: Mon Jan 25 19:57:07 2021 +0000 Issue #164 - stack updates for regrid_nemsio build commit bc273e98a506d4e2724d4c5c35718f74776ce9d7 Author: Kate.Friedman Date: Mon Jan 25 19:50:49 2021 +0000 Issue #164 - updates to partial build and stack updates for tropcy build commit e09cb35b1ad654715ec9341f92ac98a2a9813730 Author: Kate.Friedman Date: Mon Jan 25 19:18:12 2021 +0000 Issue #164 - update GLDAS tag in Externals.cfg and stack updates for fbwndgfs and grib_util builds commit 8e0d1ad58de7379fa9caa81e739f3e0cc26f0d93 Merge: 479000a5e f53e3ca4f Author: kate.friedman Date: Mon Jan 25 16:27:18 2021 +0000 Issue #1 and issue #240 - merge v16.0.4, v16.0.5, and initial v16.0.6 changes from NCO into develop commit f53e3ca4f66c300c8668fb2f3c4682e6046fb71a Author: kate.friedman Date: Mon Jan 25 16:18:04 2021 +0000 Issue #1 and issue #238 - update Externals.cfg to match checkout.sh updates for v16.0.6 commit fad08a2751627baa3b193ca9496d568597ac7ee8 Merge: 40184c2a2 479000a5e Author: Kate.Friedman Date: Mon Jan 25 16:09:33 2021 +0000 Merge remote-tracking branch 'origin/develop' into feature/hpc-stack * origin/develop: Issue #179 and issue #243 - update Fit2Obs to newm.1.2 tag and correct COMROOT path for Hera HOTFIX: Issue #241 - update EMC_verif-global tag to remove use of /tmp space commit 80ffcfc1d23ab85e6f22f91a6cdd5a3538b276b2 Merge: 676e22f50 52485c41f Author: Kate Friedman Date: 
Mon Jan 25 10:36:50 2021 -0500 Merge pull request #247 from JessicaMeixner-NOAA/errorcheckwave3 Issue #1 and issue #238 - Move errchk from script level to job level for three wave jobs commit 52485c41f59447021957d3f0f0e0b6db38a7e398 Author: jessica.meixner Date: Mon Jan 25 15:24:09 2021 +0000 move errchk definition from script to job for wave prdgen and gempak commit 676e22f507076767d1d61ed1a977f15b545bd140 Merge: addaed49a 7abf72c6b Author: Kate Friedman Date: Mon Jan 25 10:02:10 2021 -0500 Merge pull request #246 from NOAA-EMC/updates/gfsv16.0.6 Issue #238 - component tags, errchk, and wave grid changes from NCO for v16.0.6 commit 7abf72c6b7c603c7671bd530f4ffa321f857ccd9 Merge: 12b7c1472 cd9af8b85 Author: Kate Friedman Date: Mon Jan 25 09:52:17 2021 -0500 Merge pull request #245 from JessicaMeixner-NOAA/v16_wave_errorchecks Issue #1 and issue #238 - adding definition of errchk which was undefined in three wave scripts commit cd9af8b854587bac619e1bad3dadff95b8140081 Author: jessica.meixner Date: Mon Jan 25 14:18:46 2021 +0000 adding definition of errchk which was undefined in these scripts commit 40184c2a21a22d9ac432a02e1e5bdc4ee18f0887 Author: Kate.Friedman Date: Fri Jan 22 16:36:46 2021 +0000 Issue #164 - remove build_libs.sh, update module_base.hera for hpc-stack, plus updates for building enkf_chgres_recenter, gfs_bufr, and tocsbufr with hpc-stack commit 6ab44fef78f33785268492f54ee1b2c0f8a17242 Author: Kate.Friedman Date: Fri Jan 22 14:44:59 2021 +0000 Issue #164 - update machine-setup.sh for Hera stack, update checkout/build/linking for UPP tag that supports hpc-stack, remove post patch in HERA.env commit 479000a5e7b4b9c575784ecca400594525396556 Merge: bbf300085 5a16a7a01 Author: Kate Friedman Date: Thu Jan 21 11:31:41 2021 -0500 Merge pull request #244 from NOAA-EMC/issue243 Issue #179 and issue #243 - update Fit2Obs to newm.1.2 tag and correct COMROOT path for Hera commit 5a16a7a010477b5bbe530a684608b151b8f05755 Author: kate.friedman Date: Thu Jan 21 
16:27:54 2021 +0000 Issue #179 and issue #243 - update Fit2Obs to newm.1.2 tag and correct COMROOT path for Hera commit bbf3000856ebb20ca761c6e7d761f5fcefad0228 Author: kate.friedman Date: Fri Jan 15 18:36:20 2021 +0000 HOTFIX: Issue #241 - update EMC_verif-global tag to remove use of /tmp space commit addaed49ac7a69d66a56ed7fd2096686e201a51f Author: kate.friedman Date: Fri Jan 15 18:28:14 2021 +0000 Issue #1 and issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 commit 112586a79c4f48baca0b705fd5d0adea9f3c9ec0 Author: kate.friedman Date: Fri Jan 15 18:17:39 2021 +0000 Issue #1 and issue #241 - update EMC_verif-global tag commit 12b7c147207581da7718a82dd2174a94489a67e6 Author: kate.friedman Date: Thu Jan 14 17:04:29 2021 +0000 Issue #1 and issue #238 - remove ak_10m grid from config.wave and update checkout.sh tags for ufs-weather-model and EMC_gfs_wafs commit 43ae19be02687cdc362a34e11d299009f3dd8ffc Merge: c7e6a7f8c 48be4d182 Author: Kate Friedman Date: Thu Jan 14 10:35:16 2021 -0500 Merge pull request #239 from KateFriedman-NOAA/issue179 Issue #179 - update to config.vrfy for Fit2Obs tag which supports Orion commit 48be4d18263ea4f1c706c55e4b6b70d87f19c729 Author: kate.friedman Date: Thu Jan 14 08:38:20 2021 -0600 Issue #179 - update to config.vrfy for Fit2Obs tag which supports Orion commit c7e6a7f8c6f7b0d5648b1379633c252f165c2ca6 Merge: ef8b64150 4be7954dd Author: Kate Friedman Date: Wed Jan 13 14:20:49 2021 -0500 Merge pull request #237 from KateFriedman-NOAA/issue233 Update GLDAS tag to gldas_gfsv16_release.v1.13.0 commit 3b0cb61d71fcd2e5186fc1ae14c873a67b7c515d Author: kate.friedman Date: Wed Jan 6 16:30:10 2021 +0000 Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 in Externals.cfg commit 749f0682e432352a052d3dc0153eb6a87a70c0d0 Author: kate.friedman Date: Wed Jan 6 16:23:42 2021 +0000 Issue #1 and issue #226 - update UPP tag to upp_gfsv16_release.v1.1.3 commit 4be7954ddcbb56e7ee5167f6d43054d8821b6a4f Author: kate.friedman 
Date: Tue Jan 5 12:06:51 2021 -0600 Issue #233 - remove unnecessary extra space in tag line for gldas commit bcbdd12c39124eae14ec0b52d0a05b7c6944c27e Author: kate.friedman Date: Tue Jan 5 12:03:15 2021 -0600 Issue #233 - update GLDAS tag to gldas_gfsv16_release.v1.13.0 commit 91597f544a860b0e6b77019cd0c183ea7e1e580a Author: kate.friedman Date: Tue Jan 5 15:18:09 2021 +0000 Issue #1 and issue #226 - update exgfs_atmos_grib2_special_npoess.sh for dbn_alert commit ef8b64150a0cb7137b1ec48e9c8f3c7e9b3223de Merge: fca3433bf cffc5682f Author: Fanglin Yang Date: Mon Jan 4 10:52:08 2021 -0500 Merge pull request #231 from yangfanglin/feature/multiple_restart modified exglobal_forecast.sh to enable multiple reruns from breakpoint restart initial conditions commit cffc5682f34eb9ab0bbf42ee11eaa848152c0dbc Author: fanglin.yang Date: Wed Dec 30 18:43:34 2020 +0000 modified: exglobal_forecast.sh The breakpoint restart only works for the first restart from a breakpoint. Restart files written in RERUN_RESTART after the first restart have a 3-hour time shift for DO_IAU=YES cases. Forecasts starting from the 2nd breakpoint and beyond will fail because of incorrect initial conditions. This commit fixes this bug. 
commit 3a955b4b2611fea6925aeca7e442c119c2b6c4cb Merge: 104328ab2 43c46eb46 Author: Fanglin Yang Date: Wed Dec 23 13:20:36 2020 -0500 Merge pull request #229 from GuangPingLou-NOAA/release/gfsv16.0.0 Release/gfsv16.0.0 commit 43c46eb4636c6327431ed87e8499cc9b4ef1ffa3 Author: Guang Ping Lou Date: Wed Dec 23 17:48:45 2020 +0000 issue #227 reducing output bufr files to 64 levels per NCO request commit 1d3c1554cf453cce448d5ebcc138941a834ad8b3 Author: Guang Ping Lou Date: Wed Dec 23 17:48:30 2020 +0000 issue #227 reducing output bufr files to 64 levels per NCO request commit 104328ab22dc780a8d80bf3ceaa53017d410265f Author: russ.treadon Date: Wed Dec 23 13:54:00 2020 +0000 Issue #1: update ecflow to be consistent with NCO's gfs.v16.0.4 and update checkout to bring in new UPP tag (see issue #226) commit 47efddfdc8815aacc0b7328135b8626a0f0862b2 Merge: e89045b9f eaa91a792 Author: Fanglin Yang Date: Tue Dec 22 12:41:01 2020 -0500 Merge pull request #225 from NOAA-EMC/update_gfsv16 Update gfsv16 commit eaa91a7927cc1f59f3709f5c5af401d9e5c20a1c Author: BoiVuong-NOAA Date: Tue Dec 22 03:54:04 2020 +0000 Updated ush script scale_dec.sh commit 53f37b8bf9a3ef861c026e842c565f7b370a2a06 Author: BoiVuong-NOAA Date: Mon Dec 21 13:52:42 2020 +0000 Updated scripts gfs_v16.0 commit fca3433bf869b12cade480e1394688d7b6f95687 Merge: a95cce5f3 abb168bc3 Author: Kate Friedman Date: Fri Dec 18 09:30:21 2020 -0500 Merge pull request #221 from NOAA-EMC/nco_v16_changes NCO changes for v16.0.3 commit abb168bc32fdff7b08ce8ecc54ea832aabf03c44 Author: kate.friedman Date: Thu Dec 17 19:17:26 2020 +0000 Issue #1 and issue #220 - set C192/C96/C48 npe_eobs back to dev values for develop commit 65b4d965fa6ebee445a509a0593e925ae5d5b513 Merge: a95cce5f3 e89045b9f Author: kate.friedman Date: Thu Dec 17 18:46:06 2020 +0000 Issue #1 - merge v16.0.3 changes from NCO into develop commit e89045b9fb1d9026810ef5c77a3554c67c41718e Author: kate.friedman Date: Wed Dec 16 16:38:01 2020 +0000 Issue #1 - add grib_util module 
load to several analysis ecflow scripts commit 1f5af6283f3b1a6d2faec127e7d8c814e793f2a4 Merge: b7ece977f fc4dd668b Author: Kate Friedman Date: Tue Dec 15 14:44:51 2020 -0500 Merge pull request #217 from NOAA-EMC/release/gfsv15.3.5 GFSv15.3.5 commit fc4dd668b32f3c76b84800c1ebc4c3b277db91e4 Author: kate.friedman Date: Tue Dec 15 19:39:41 2020 +0000 Issue #215 - GFSv15.3.5 ops updates for gempak and bufrlib versions commit a95cce5f3e123a241000b209fb50cc8911d13466 Merge: 1562bb97a 2646921d0 Author: Kate Friedman Date: Tue Dec 15 14:05:27 2020 -0500 Merge pull request #216 from NOAA-EMC/issue189 Issue #189 - update ufs-weather-model hash commit 2646921d0da6dc8a136ebeba6cceceacf3806c23 Author: kate.friedman Date: Tue Dec 15 18:56:33 2020 +0000 Issue #189 - update ufs-weather-model hash commit 1562bb97ac21ce2406ed21f76b7c8d58cecf4a2a Merge: f3d11b9bc 7a9bc00e3 Author: Kate Friedman Date: Tue Dec 15 09:51:07 2020 -0500 Merge pull request #213 from NOAA-EMC/hotfixes Hotfixes - issues #201, 202, 208 commit 7a9bc00e367126044f83b0fc9b737386788899a9 Author: Kate.Friedman Date: Mon Dec 14 21:35:58 2020 +0000 Issue #201 - workaround for failing post000 job before hpc-stack solution commit 425588f711812819744c8060a46444b9d4bf2b63 Author: kate.friedman Date: Mon Dec 14 21:06:20 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.17 for dbn_alert change commit ec5e2e5dd98d80ca3e3d696d5eb19a5a4180c87b Author: kate.friedman Date: Mon Dec 14 15:06:16 2020 +0000 Issue #1 - correct gridded wave parm files for v16.0.3 commit 39246c6e1604e9e81ddc384c951861c43e30cdcf Author: kate.friedman Date: Fri Dec 11 19:26:51 2020 +0000 Issue #1 - changes from NCO for GFSv16.0.3 commit b9f7de8af3cb966fa756e38feaf98ce1f84ee2f3 Author: Kate.Friedman Date: Thu Dec 10 19:15:00 2020 +0000 Issue #1 - update Externals.cfg with final tags for GFSv16.0.2 commit cff28bdbf0daa6980766a011f73e2a021aee2341 Author: Kate.Friedman Date: Thu Dec 10 15:59:05 2020 +0000 Fixes for issue #202 (FINDDATE) and issue #208 
(postsnd.sh permissions) commit 7c7482de947ae686b619fc72e7dd192410b7e0f7 Author: Kate.Friedman Date: Thu Dec 10 14:10:25 2020 +0000 Issue #1 - correct permissions on jobs/rocoto/postsnd.sh commit dee856c5b54c19cc9de6898908fdb9d4f8d99baa Author: kate.friedman Date: Wed Dec 9 20:58:20 2020 +0000 Issue #1 - update gempak version to 7.3.3 in the ecflow gfs.ver file commit 0ee264bef7ea71d32426b6b6d77109e73b4235ff Author: kate.friedman Date: Wed Dec 9 18:49:54 2020 +0000 Issue #1 - update gempak and dumpjb versions to 7.3.3 and 5.1.0 respectively commit b5d97ab6912596d3b343843ba3ef6e8c5f71a21a Merge: 6da1a24ba be5f9ece8 Author: Kate Friedman Date: Mon Dec 7 14:49:29 2020 -0500 Merge pull request #205 from NOAA-EMC/release/gfsv16.0.0.nco Script alert updates from NCO for wave downstream commit be5f9ece8329ee6b18af92d044809f018404669b Author: kate.friedman Date: Mon Dec 7 19:45:23 2020 +0000 Script alert updates from NCO for wave downstream commit 6da1a24ba89c80ee5f5df38136d03483207d39dd Merge: ce1ae9709 069f2662b Author: Kate Friedman Date: Mon Dec 7 14:12:37 2020 -0500 Merge pull request #204 from NOAA-EMC/release/gfsv16.0.0.nco GFSv16 NCO changes - early December edition commit 069f2662bc2eabc822f13eec4e2f69f1c39606a8 Author: kate.friedman Date: Wed Dec 2 20:01:17 2020 +0000 Add override for COMIN_WAV_RTOFS in emc mode for waveprep job commit 378953d513ce587bb801c85b3899afaee0217961 Author: anning.cheng Date: Wed Dec 2 09:12:38 2020 -0600 update config.base.emc.dyn commit 81442ad41582eebcf34a9a01fb7e2a7e0a641ebe Author: anning.cheng Date: Tue Dec 1 13:15:08 2020 -0600 high resolution MERRA2 data used commit 79f23fa4c2079ce5cd75da95267d2d4442b757a0 Author: anning.cheng Date: Tue Dec 1 12:54:50 2020 -0600 fixed a bug related to orion commit 947bbad1aad4d2967f81c8498a7f96b1394b66d9 Author: anning.cheng Date: Tue Dec 1 10:02:08 2020 -0600 added merra2 input and update diag_table commit 3bdc629964b6a217944fc56a19d842a7d043dde5 Author: anning.cheng Date: Tue Dec 1 01:31:07 
2020 -0600 merra2 workflow for orion commit 194f280cf7351d58b0295af35c91e74ac6925d0b Author: russ.treadon Date: Mon Nov 30 18:47:19 2020 +0000 Issue #197: place CDATE specific sections of config.anal and config.prep inside RUN_ENVIR=emc blocks. These sections are used for retrospective parallels and therefore do not need to be executed in operations (NCO). commit d82efa8417fe6247ed08cd74a447460ba6bb51f9 Author: fanglin.yang Date: Mon Nov 30 18:06:12 2020 +0000 modified: JGLOBAL_FORECAST commit 25a28c8dacc8659071eca65be15fb4de08b18d80 Author: fanglin.yang Date: Mon Nov 30 17:02:05 2020 +0000 modified: JGLOBAL_FORECAST to make it work for both emc and nco running environments. commit 7283c7e60c98639c6466ce5903ab03c36a7f5f57 Author: fanglin.yang Date: Fri Nov 27 22:16:31 2020 +0000 modified: JGLOBAL_FORECAST A test showed that jobid is not defined in JGLOBAL_FORECAST running in the Rocoto environment. jobid is defined in ./env files. Defining DATA without sourcing ./env/$machine.env caused the script to fail. 
Move the definition of DATA after sourcing env parameters commit e6003773c0af51bd15eb2bdb6831c99dcce1cc5b Author: fanglin.yang Date: Tue Nov 24 03:25:23 2020 +0000 modified: checkout.sh to use WAFS tag gfs_wafs.v6.0.16 commit bd5294ee6844f8a5adbd04eae78999068c0961a2 Author: fanglin.yang Date: Sat Nov 21 17:32:05 2020 +0000 modified: checkout.sh to update UPP to upp_gfsv16_release.v1.1.1, a minor syntax bug fix commit 5a516b4cdf0333e4b8316c8cbda7be9f9dccc543 Author: fanglin.yang Date: Fri Nov 20 05:08:33 2020 +0000 modified: link_fv3gfs.sh to 1) use hard copies of external fix fields and executable for NCO installation 2) use soft links for all other files and directories for both NCO and EMC installations commit 759cf3341bb400d3b3254a05a624e3ab671ef6cd Author: fanglin.yang Date: Fri Nov 20 04:51:48 2020 +0000 Compared local files in NCO implementation directory with release/gfs.v16.0.0 branch; changes made by NCO (Jen Yang) in the following files are either accepted or rejected. Use EMC's updated version modified: jobs/JGFS_ATMOS_POSTSND Use NCO's updated version modified: jobs/JGLOBAL_FORECAST modified: jobs/JGLOBAL_WAVE_GEMPAK modified: jobs/JGLOBAL_WAVE_POST_BNDPNT modified: jobs/JGLOBAL_WAVE_POST_PNT modified: jobs/JGLOBAL_WAVE_POST_SBS modified: jobs/JGLOBAL_WAVE_PRDGEN_BULLS modified: jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED modified: parm/config/config.base.nco.static Updated by both NCO and EMC modified: parm/config/config.wave Updated gldas tag to gldas_gfsv16_release.v1.12.0 modified: sorc/checkout.sh commit f39ea0cbbabaa5e69e03cec4db309e2adfb33756 Author: fanglin.yang Date: Thu Nov 19 04:43:17 2020 +0000 create a new branch release/gfsv16.0.0.nco to merge changes made by NCO in /gpfs/dell1/nco/ops/nwpara/gfs-v16/gfs.v16.0.1 back to EMC's repository modified: ecflow/ecf/scripts/gdas/atmos/analysis/jgdas_atmos_analysis.ecf modified: ecflow/ecf/scripts/gdas/atmos/analysis/jgdas_atmos_analysis_calc.ecf modified: 
ecflow/ecf/scripts/gdas/atmos/analysis/jgdas_atmos_analysis_diag.ecf modified: ecflow/ecf/scripts/gdas/atmos/gempak/jgdas_atmos_gempak.ecf modified: ecflow/ecf/scripts/gdas/atmos/gempak/jgdas_atmos_gempak_meta_ncdc.ecf modified: ecflow/ecf/scripts/gdas/atmos/init/jgdas_atmos_gldas.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/dump/jgdas_atmos_dump.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/dump/jgdas_atmos_dump_alert.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/dump/jgdas_atmos_dump_post.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/dump/jgdas_atmos_tropcy_qc_reloc.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/prep/jgdas_atmos_emcsfc_sfc_prep.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/prep/jgdas_atmos_prep.ecf modified: ecflow/ecf/scripts/gdas/atmos/obsproc/prep/jgdas_atmos_prep_post.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_anl.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f000.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f001.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f002.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f003.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f004.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f005.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f006.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f007.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f008.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_f009.ecf modified: ecflow/ecf/scripts/gdas/atmos/post/jgdas_atmos_post_manager.ecf modified: ecflow/ecf/scripts/gdas/atmos/post_processing/jgdas_atmos_chgres_forenkf.ecf modified: ecflow/ecf/scripts/gdas/atmos/verf/jgdas_atmos_verfozn.ecf modified: ecflow/ecf/scripts/gdas/atmos/verf/jgdas_atmos_verfrad.ecf modified: ecflow/ecf/scripts/gdas/atmos/verf/jgdas_atmos_vminmon.ecf modified: 
ecflow/ecf/scripts/gdas/enkf/analysis/create/jgdas_enkf_diag.ecf modified: ecflow/ecf/scripts/gdas/enkf/analysis/create/jgdas_enkf_select_obs.ecf modified: ecflow/ecf/scripts/gdas/enkf/analysis/create/jgdas_enkf_update.ecf modified: ecflow/ecf/scripts/gdas/enkf/analysis/recenter/ecen/jgdas_enkf_ecen.ecf modified: ecflow/ecf/scripts/gdas/enkf/analysis/recenter/jgdas_enkf_sfc.ecf modified: ecflow/ecf/scripts/gdas/enkf/forecast/jgdas_enkf_fcst.ecf modified: ecflow/ecf/scripts/gdas/jgdas_forecast.ecf modified: ecflow/ecf/scripts/gdas/wave/init/jgdas_wave_init.ecf modified: ecflow/ecf/scripts/gdas/wave/post/jgdas_wave_postpnt.ecf modified: ecflow/ecf/scripts/gdas/wave/post/jgdas_wave_postsbs.ecf modified: ecflow/ecf/scripts/gdas/wave/prep/jgdas_wave_prep.ecf modified: ecflow/ecf/scripts/gfs/atmos/analysis/jgfs_atmos_analysis.ecf modified: ecflow/ecf/scripts/gfs/atmos/analysis/jgfs_atmos_analysis_calc.ecf modified: ecflow/ecf/scripts/gfs/atmos/gempak/jgfs_atmos_gempak.ecf modified: ecflow/ecf/scripts/gfs/atmos/gempak/jgfs_atmos_gempak_meta.ecf modified: ecflow/ecf/scripts/gfs/atmos/gempak/jgfs_atmos_gempak_ncdc_upapgif.ecf modified: ecflow/ecf/scripts/gfs/atmos/gempak/jgfs_atmos_npoess_pgrb2_0p5deg.ecf modified: ecflow/ecf/scripts/gfs/atmos/gempak/jgfs_atmos_pgrb2_spec_gempak.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/dump/jgfs_atmos_dump.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/dump/jgfs_atmos_dump_alert.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/dump/jgfs_atmos_dump_post.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/dump/jgfs_atmos_tropcy_qc_reloc.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/prep/jgfs_atmos_emcsfc_sfc_prep.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/prep/jgfs_atmos_prep.ecf modified: ecflow/ecf/scripts/gfs/atmos/obsproc/prep/jgfs_atmos_prep_post.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_anl.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f000.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f001.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f002.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f003.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f004.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f005.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f006.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f007.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f008.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f009.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f010.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f011.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f012.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f013.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f014.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f015.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f016.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f017.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f018.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f019.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f020.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f021.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f022.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f023.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f024.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f025.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f026.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f027.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f028.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f029.ecf 
modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f030.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f031.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f032.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f033.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f034.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f035.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f036.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f037.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f038.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f039.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f040.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f041.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f042.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f043.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f044.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f045.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f046.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f047.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f048.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f049.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f050.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f051.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f052.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f053.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f054.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f055.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f056.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f057.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f058.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f059.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f060.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f061.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f062.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f063.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f064.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f065.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f066.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f067.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f068.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f069.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f070.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f071.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f072.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f073.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f074.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f075.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f076.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f077.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f078.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f079.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f080.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f081.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f082.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f083.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f084.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f085.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f086.ecf 
modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f087.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f088.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f089.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f090.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f091.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f092.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f093.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f094.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f095.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f096.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f097.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f098.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f099.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f100.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f101.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f102.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f103.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f104.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f105.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f106.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f107.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f108.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f109.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f110.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f111.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f112.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f113.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f114.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f115.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f116.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f117.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f118.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f119.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f120.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f123.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f126.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f129.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f132.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f135.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f138.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f141.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f144.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f147.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f150.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f153.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f156.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f159.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f162.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f165.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f168.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f171.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f174.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f177.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f180.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f183.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f186.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f189.ecf 
modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f192.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f195.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f198.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f201.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f204.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f207.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f210.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f213.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f216.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f219.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f222.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f225.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f228.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f231.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f234.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f237.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f240.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f243.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f246.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f249.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f252.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f255.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f258.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f261.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f264.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f267.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f270.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f273.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f276.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f279.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f282.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f285.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f288.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f291.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f294.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f297.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f300.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f303.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f306.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f309.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f312.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f315.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f318.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f321.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f324.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f327.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f330.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f333.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f336.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f339.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f342.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f345.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f348.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f351.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f354.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f357.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f360.ecf 
modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f363.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f366.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f369.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f372.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f375.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f378.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f381.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_f384.ecf modified: ecflow/ecf/scripts/gfs/atmos/post/jgfs_atmos_post_manager.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f000.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f003.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f006.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f009.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f012.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f015.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f018.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f021.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f024.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f027.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f030.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f033.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f036.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f039.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f042.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f045.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f048.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f051.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f054.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f057.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f060.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f063.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f066.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f069.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f072.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f075.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f078.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f081.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f084.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f090.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f096.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f102.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f108.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f114.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f120.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f126.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f132.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f138.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f144.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f150.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f156.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f162.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f168.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f174.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f180.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f186.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f192.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f198.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f204.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f210.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f216.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f222.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f228.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f234.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post_processing/awips_20km_1p0/jgfs_atmos_awips_f240.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f000.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f003.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f006.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f009.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f012.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f015.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f018.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f021.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f024.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f027.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f030.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f033.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f036.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f039.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f042.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f045.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f048.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f051.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f054.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f057.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f060.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f063.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f066.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f069.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f072.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f075.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f078.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f081.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f084.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f090.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f096.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f102.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f108.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f114.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f120.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f126.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f132.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f138.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f144.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f150.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f156.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f162.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f168.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f174.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f180.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f186.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f192.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f198.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f204.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f210.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f216.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f222.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f228.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f234.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_f240.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/bufr_sounding/jgfs_atmos_postsnd.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/bulletins/jgfs_atmos_fbwind.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_blending.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_blending_0p25.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2_0p25.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f00.ecf modified: 
ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f06.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f102.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f108.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f114.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f12.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f120.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f18.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f24.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f30.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f36.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f42.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f48.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f54.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f60.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f66.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f72.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f78.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f84.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f90.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/grib_wafs/jgfs_atmos_wafs_f96.ecf modified: ecflow/ecf/scripts/gfs/atmos/post_processing/jgfs_atmos_wafs_gcip.ecf modified: ecflow/ecf/scripts/gfs/atmos/verf/jgfs_atmos_vminmon.ecf modified: ecflow/ecf/scripts/gfs/jgfs_forecast.ecf modified: 
ecflow/ecf/scripts/gfs/wave/gempak/jgfs_wave_gempak.ecf modified: ecflow/ecf/scripts/gfs/wave/init/jgfs_wave_init.ecf modified: ecflow/ecf/scripts/gfs/wave/post/jgfs_wave_post_bndpnt.ecf modified: ecflow/ecf/scripts/gfs/wave/post/jgfs_wave_postpnt.ecf modified: ecflow/ecf/scripts/gfs/wave/post/jgfs_wave_postsbs.ecf modified: ecflow/ecf/scripts/gfs/wave/post/jgfs_wave_prdgen_bulls.ecf modified: ecflow/ecf/scripts/gfs/wave/post/jgfs_wave_prdgen_gridded.ecf modified: ecflow/ecf/scripts/gfs/wave/prep/jgfs_wave_prep.ecf modified: jobs/JGFS_ATMOS_POSTSND modified: jobs/JGLOBAL_ATMOS_EMCSFC_SFC_PREP modified: jobs/JGLOBAL_FORECAST modified: jobs/JGLOBAL_WAVE_GEMPAK modified: jobs/JGLOBAL_WAVE_INIT modified: jobs/JGLOBAL_WAVE_POST_BNDPNT modified: jobs/JGLOBAL_WAVE_POST_PNT modified: jobs/JGLOBAL_WAVE_POST_SBS modified: jobs/JGLOBAL_WAVE_PRDGEN_BULLS modified: jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED modified: jobs/JGLOBAL_WAVE_PREP modified: parm/config/config.wave modified: parm/transfer_gdas_1a.list modified: parm/transfer_gdas_1b.list modified: parm/transfer_gdas_1c.list modified: parm/transfer_gdas_enkf_enkf_05.list modified: parm/transfer_gdas_enkf_enkf_10.list modified: parm/transfer_gdas_enkf_enkf_15.list modified: parm/transfer_gdas_enkf_enkf_20.list modified: parm/transfer_gdas_enkf_enkf_25.list modified: parm/transfer_gdas_enkf_enkf_30.list modified: parm/transfer_gdas_enkf_enkf_35.list modified: parm/transfer_gdas_enkf_enkf_40.list modified: parm/transfer_gdas_enkf_enkf_45.list modified: parm/transfer_gdas_enkf_enkf_50.list modified: parm/transfer_gdas_enkf_enkf_55.list modified: parm/transfer_gdas_enkf_enkf_60.list modified: parm/transfer_gdas_enkf_enkf_65.list modified: parm/transfer_gdas_enkf_enkf_70.list modified: parm/transfer_gdas_enkf_enkf_75.list modified: parm/transfer_gdas_enkf_enkf_80.list modified: parm/transfer_gdas_enkf_enkf_misc.list modified: parm/transfer_gdas_misc.list modified: parm/transfer_gfs_1.list modified: parm/transfer_gfs_10a.list modified: 
parm/transfer_gfs_10b.list modified: parm/transfer_gfs_2.list modified: parm/transfer_gfs_3.list modified: parm/transfer_gfs_4.list modified: parm/transfer_gfs_5.list modified: parm/transfer_gfs_6.list modified: parm/transfer_gfs_7.list modified: parm/transfer_gfs_8.list modified: parm/transfer_gfs_9a.list modified: parm/transfer_gfs_9b.list modified: parm/transfer_gfs_misc.list modified: parm/transfer_rdhpcs_gdas.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_1.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_2.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_3.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_4.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_5.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_6.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_7.list modified: parm/transfer_rdhpcs_gdas_enkf_enkf_8.list modified: parm/transfer_rdhpcs_gfs.list modified: parm/transfer_rdhpcs_gfs_nawips.list

commit ce1ae9709fe506f32833910f3cf4f68117c7d0f2
Merge: b096e2941 e9d00e41e
Author: Fanglin Yang
Date: Wed Nov 18 16:55:14 2020 -0500

    Merge pull request #192 from JessicaMeixner-NOAA/bf/waveICfreq

    Wave parm update for release/gfsv16

commit e9d00e41e939fca68a39293b973326531f3c6983
Author: jessica.meixner
Date: Wed Nov 18 19:51:25 2020 +0000

    Update the wave parm so that the wave model looks for the correct restart when gfs is not run every cycle

commit f3d11b9bc7e669f4c11266654649c4d5e28f16a9
Merge: e3972f177 32a004aae
Author: Kate Friedman
Date: Tue Nov 17 14:01:54 2020 -0500

    Merge pull request #186 from lgannoaa/feature/ccpp

    Initial support for CCPP.
commit 32a004aae3949e8fd3bda7a4f628995317d33f47 Merge: 66cc1a062 036cc113d Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Tue Nov 17 13:15:37 2020 -0500 Merge pull request #2 from KateFriedman-NOAA/ccpp Adjust efcs walltime and eupd thread value on Orion commit 036cc113dcd5787ce4bccfb11fb71ed7935bc982 Author: kate.friedman Date: Tue Nov 17 10:44:02 2020 -0600 Adjust efcs walltime and eupd thread value on Orion commit 66cc1a062ba1b430e9ebaf41715997a3de0ff023 Merge: 9e8f96050 2432dc5bb Author: lgannoaa <37596169+lgannoaa@users.noreply.github.com> Date: Mon Nov 16 15:48:34 2020 -0500 Merge pull request #1 from KateFriedman-NOAA/ccpp Fix missing COMROOT setting for Orion commit 2432dc5bbb1b3bc4c317063811333a5b30065794 Author: kate.friedman Date: Mon Nov 16 14:40:59 2020 -0600 Fix missing COMROOT setting on Orion commit 9e8f960508e82f31e9d57da0fc37600b9697e4c9 Author: Lin.Gan Date: Mon Nov 16 18:24:08 2020 +0000 Modify build_all.sh to change the option string in its while getopts loop from "oc" to "c" commit 587e1986cb5c6307f2d10a01758994ce1155e36d Author: Lin.Gan Date: Mon Nov 16 18:21:16 2020 +0000 scripts/exglobal_forecast.sh Merge with v16ccpp commit 6066462b23d34db815ed73c630ef624c41239215 Author: Lin.Gan Date: Mon Nov 16 17:53:44 2020 +0000 As requested, using 2e25df5fe952d27355ed58963148f46b82565469 for ufs-weather-model. commit bf312432b662c3e2b7bd8cd5de19dd5b5ee0c042 Merge: dd7b0068c d59c09c6d Author: Lin.Gan Date: Mon Nov 16 17:41:32 2020 +0000 Merge remote-tracking branch 'upstream/v16ccpp' into feature/ccpp Pull in the iovr=3 setting in config.fcst commit dd7b0068cf0b598e58af41a92712e393674d14fd Author: Lin.Gan Date: Mon Nov 16 14:39:04 2020 +0000 Modify exglobal_forecast.sh to create the namelist in runnable sequence.
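The getopts change noted above (dropping the unused "o" flag so only "-c" toggles CCPP) follows the standard POSIX shell pattern. A minimal illustrative sketch; the RUN_CCPP variable name and surrounding logic are assumptions for illustration, not the actual contents of build_all.sh:

```shell
#!/bin/sh
# Sketch of the option parsing described in the commit above.
# RUN_CCPP and the usage text are assumed names, not taken from build_all.sh.
RUN_CCPP=NO
while getopts "c" option; do
  case "${option}" in
    c) RUN_CCPP=YES ;;                     # -c turns the CCPP build on
    *) echo "usage: $0 [-c]" >&2; exit 1 ;;
  esac
done
shift $((OPTIND - 1))                      # drop parsed options, keep positional args
echo "RUN_CCPP=${RUN_CCPP}"
```

Invoked with -c this reports RUN_CCPP=YES; with no flag it stays at the NO default, matching the "Turn on CCPP usage: checkout.sh -c build_all.sh -c" description in the log.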
commit d59c09c6d9445110871da231ef9945d201986f81 Author: fanglin.yang Date: Mon Nov 16 04:06:48 2020 +0000 modified: ../scripts/exglobal_forecast.sh commit d8aa4423f1a6f2c63c00a4ed5a38a4df6501ef7b Author: fanglin.yang Date: Mon Nov 16 04:04:43 2020 +0000 modified checkout.sh to check out a hash instead of the head of the develop branch commit e29fd0c517840a9996987e7ae21f78cc2af8ca12 Author: fanglin.yang Date: Mon Nov 16 03:00:50 2020 +0000 modified: ../parm/config/config.fcst and exglobal_forecast.sh 1. use iovr for the model after https://github.com/NCAR/ccpp-physics/pull/514, and iovr_sw and iovr_lw for older versions of the model, controlled by RUN_CCPP 2. remove the if block and use atmos_model_nml to turn on/off the CCPP option commit 1ec0ebf31767422a67e3f56c951f5ccb2305c7e5 Author: fanglin.yang Date: Sun Nov 15 17:48:51 2020 -0500 bug fix in exglobal_forecast.sh commit 336647bcaced053baeee43ef2417b34b6b196439 Author: fanglin.yang Date: Sun Nov 15 12:55:20 2020 -0500 modified: exglobal_forecast.sh to add min_lakeice = ${min_lakeice:-"0.15"} min_seaice = ${min_seaice:-"0.15"} commit 9616b2eef626ce8db2f3a5da343b0a7ece2137cb Author: fanglin.yang Date: Sun Nov 15 04:05:48 2020 +0000 modified: build_fv3.sh commit fe79b87abbeca26939dbcc1ccd7cb5fb2624e666 Author: Lin.Gan Date: Fri Nov 13 20:11:26 2020 +0000 Clean up development code from build_all commit 9cd719d0609cefb3261870528b88c54d24117c47 Author: Lin.Gan Date: Fri Nov 13 19:53:31 2020 +0000 Modified build_all.sh, checkout.sh, and partial_build.sh to add a switch to turn the CCPP option on/off. Turn on CCPP usage: checkout.sh -c; build_all.sh -c Turn off CCPP usage: checkout.sh; build_all.sh commit 8aed1e4d44166ffb6036409bced3ad04f038dfc4 Author: fanglin.yang Date: Fri Nov 13 17:04:07 2020 +0000 modified: ../scripts/exglobal_forecast.sh modified: build_all.sh build_fv3.sh to automatically detect model version, CCPP vs IPD commit 7826668c6baaf113785e88b79bc059f8a430eace Author: Lin.Gan Date: Fri Nov 13 15:44:31 2020 +0000 Remove development
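The min_lakeice/min_seaice lines cited above rely on the shell's ${var:-default} parameter expansion, which substitutes the default only when the variable is unset or empty; a small sketch of that behavior (variable name taken from the commit, values illustrative):

```shell
# ${var:-default} keeps an existing value and falls back to the default otherwise.
unset min_lakeice
echo "min_lakeice = ${min_lakeice:-0.15}"   # unset -> prints the 0.15 default

min_lakeice="0.25"
echo "min_lakeice = ${min_lakeice:-0.15}"   # set -> prints 0.25
```

This lets exglobal_forecast.sh honor a value exported by an upstream config while still providing a safe default.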
configuration from config.nsst commit 1e5ebe44fd525135114b0fc74a698cadd2eb5eaa Merge: 0b540d78c b0536156e Author: Lin.Gan Date: Fri Nov 13 15:36:06 2020 +0000 Merge remote-tracking branch 'upstream/v16ccpp' into feature/ccpp commit 0b540d78c47755cb0bd68a8dc934461ec1bef345 Author: Lin.Gan Date: Fri Nov 13 15:35:26 2020 +0000 update eupd resource commit 267c661710562ab620c07212626268f0201c6c20 Author: Lin.Gan Date: Fri Nov 13 15:31:56 2020 +0000 Merge with v16ccpp modified: parm/config/config.base.emc.dyn modified: parm/config/config.resources modified: scripts/exglobal_forecast.sh modified: sorc/build_fv3.sh modified: sorc/checkout.sh commit b0536156ed8383150fc65d5897c6acd17c0df248 Author: fanglin.yang Date: Fri Nov 13 06:30:25 2020 +0000 modified: parm/config/config.base.emc.dyn parm/config/config.base.nco.static scripts/exglobal_forecast.sh sorc/build_all.sh sorc/build_fv3.sh sorc/checkout.sh to add the option to check out ufs_weather_model and run with CCPP physics commit b80676925e5dca6c26d8dbf59c8ca74c0efe5fcb Author: Lin.Gan Date: Thu Nov 12 20:18:08 2020 +0000 sync config.base.emc.dyn and config.resources with development commit 6e6fdd79515966f84334e9319dbb3b9cfff9095c Author: Lin.Gan Date: Thu Nov 12 19:28:06 2020 +0000 Remove GSD suite file and aero IC extract util commit 8aed527d4358e131fba73785469d1fec957e619a Author: Lin.Gan Date: Thu Nov 12 18:23:21 2020 +0000 deleted: jobs/rocoto/aeroic.sh commit d2191414c48835799e578a6010d7dbfdb6babb78 Author: Lin.Gan Date: Thu Nov 12 18:11:43 2020 +0000 Changes to be committed: modified: jobs/rocoto/fcst.sh deleted: modulefiles/module_base.wcoss_dell_p3_fcst deleted: parm/config/config.aeroic modified: parm/config/config.base.emc.dyn modified: parm/config/config.resources modified: sorc/build_fv3.sh modified: sorc/checkout.sh modified: sorc/link_fv3gfs.sh deleted: ush/load_fv3gfs_modules_fcst.sh deleted: ush/rocoto/setup_workflow_fcstonly_aeroic.py commit 490f0a73d852f709e1232d4a4a9ee7f3d95d58d9 Merge: 52f59845f
e3972f177 Author: Lin.Gan Date: Thu Nov 12 15:49:48 2020 +0000 Merge remote-tracking branch 'upstream/develop' into feature/ccpp commit 52f59845f62bd9cc16b6b1197a7de04a4b4a1d86 Author: Lin.Gan Date: Thu Nov 12 15:04:33 2020 +0000 As of 11/12/2020, this package is tested with a C768 cycled (6 cycle) run on the Dell system using FV3_GFS_v16beta (imp_physics=11). The wave components are turned off. Ready to merge back to the development branch. The expdir and log files are available on HPSS: /NCEPDEV/emc-global/1year/Lin.Gan/WCOSS_DELL_P3/feature_ccpp/FV3_GFS_v16beta/FV3_GFS_v16beta-LOG.tar /NCEPDEV/emc-global/1year/Lin.Gan/WCOSS_DELL_P3/feature_ccpp/FV3_GFS_v16beta/FV3_GFS_v16beta-EXPT.tar File parm/config/config.resources has been modified to fix the EUPD job issue in the job card resource. Files sorc/build_fv3.sh and sorc/link_fv3gfs.sh changed to clean up development remarks. Removed a developer directory that is not required for the FV3_GFS_v16beta configuration. commit e3972f1778242d696bcc4eb4a979d46373e8d193 Merge: 9f7eebaf6 671856ddf Author: Kate Friedman Date: Fri Nov 6 12:55:23 2020 -0500 Merge pull request #174 from NOAA-EMC/port2orion GFSv16 release hand-off state, Orion support, and additional low res R&D updates commit b096e2941e4817eace7d8d9e9173ca92a3f72944 Merge: 9fa9ffe97 670b97c26 Author: Kate Friedman Date: Fri Nov 6 10:54:07 2020 -0500 Merge pull request #173 from JessicaMeixner-NOAA/bugfix/rtofsissues Updates for RTOFS preprocessing for wave model commit 671856ddf96ff4ae2df8a5feacd1b184ce9b8702 Merge: 5383477af 9f7eebaf6 Author: Kate.Friedman Date: Thu Nov 5 20:43:26 2020 +0000 Sync merge with develop to resolve conflicts commit 5383477af879dd671dbab066b41eea7e2428e50f Author: Kate.Friedman Date: Thu Nov 5 18:46:43 2020 +0000 Change wavepostbndpnt to wait for fcst to end commit 201abf3e7ed9b5cb66bec43bcb9e4d34b3c6d5f9 Merge: f4ceb1826 9fa9ffe97 Author: kate.friedman Date: Thu Nov 5 12:10:32 2020 -0600 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into
port2orion * origin/release/gfsv16.0.0: Issue #1 - update WAFS tag to gfs_wafs.v6.0.14 and update dumpjb version to 5.1.0 modified: jobs/JGFS_ATMOS_POSTSND and jobs/rocoto/postsnd.sh to remove redundant variables in the two scripts and make them work for both EMC and NCO parallels. Issue #1 - update WAFS tag to gfs_wafs.v6.0.13 Rename Release_Notes.gfs.v16.0.0.txt to Release_Notes.gfs.v16.0.0.md Issue #1 - update WAFS tag to gfs_wafs.v6.0.12 for removal of in-cloud turbulence per AWC commit f4ceb182685a5d6d6fb18b2e40c48ae0623a0794 Author: Kate.Friedman Date: Thu Nov 5 14:49:08 2020 +0000 Hera updates for OUTPUT_FILETYPES and resources commit 896d19529c828a62574c835d6d372701588fe758 Author: kate.friedman Date: Wed Nov 4 10:19:17 2020 -0600 Set nth_fcst to 4 for C384 deterministic commit 37d01e9fc68517c39b66a1ab67453b016666d590 Author: kate.friedman Date: Tue Nov 3 14:06:59 2020 -0600 Tie DOIAU_ENKF to DOIAU and add DOIAU check for IAU_OFFSET and IAU_FHROT in config.base commit 1a11fd21740860275f9be14caf1e615691075414 Author: kate.friedman Date: Tue Nov 3 14:49:12 2020 +0000 Increase gfsfcst walltime for C192 commit 670b97c2673054b8d18a004390af0093520b1968 Author: jessica.meixner Date: Mon Nov 2 20:46:33 2020 +0000 updates for checking if RTOFS files exist and only processing RTOFS files for needed fhr commit 190b78c8d34f60e84c80b47a97cb4184e4e11fd9 Author: Kate.Friedman Date: Mon Nov 2 18:41:32 2020 +0000 Set nth_fcst to 4 for C384 on Hera to handle less memory commit 713c51eabc40b80e02314b5955b0efab3b235344 Author: Kate.Friedman Date: Mon Nov 2 17:54:49 2020 +0000 Add FDATE calculation to setup scripts and change FDATE to the parsed value in config.base.emc.dyn commit 306ea5f15120d808760bd58b247b5861a8813670 Author: kate.friedman Date: Mon Nov 2 10:43:20 2020 -0600 Reduce C384 nth_fv3 to 1 in config.fv3 commit 9fa9ffe97d23ad69c33658875c9b0c3440c2c97f Author: kate.friedman Date: Mon Nov 2 15:24:48 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.14 and update
dumpjb version to 5.1.0 commit 24226384780c9c2438abec1864923f87371798d4 Merge: 705933436 a3b463859 Author: fanglin.yang Date: Fri Oct 30 19:48:13 2020 +0000 Merge branch 'release/gfsv16.0.0' of https://github.com/NOAA-EMC/global-workflow into release/gfsv16.0.0 commit 705933436f2f42d5fde17b8aa4a57918f2c66b6a Author: fanglin.yang Date: Fri Oct 30 19:46:51 2020 +0000 modified: jobs/JGFS_ATMOS_POSTSND and jobs/rocoto/postsnd.sh to remove redundant variables in the two scripts and make them work for both EMC and NCO parallels. commit c9e0566eb2e6179410583c14d48a971a79dc4c88 Author: Kate.Friedman Date: Thu Oct 29 20:50:46 2020 +0000 Revert epos change in setup_workflow.py commit a3b463859154c1127bb66764a65baa89929a0a5f Author: kate.friedman Date: Wed Oct 28 18:18:51 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.13 commit ab0577de3c4bbbdc58219fd1c95a142cf93b5f4a Author: Kate.Friedman Date: Wed Oct 28 13:52:34 2020 +0000 Issue #1 - update anal, eobs, and eupd resources for low res commit 97f1ae89baf37d8bd1940f179fecb6bdc1279fa2 Author: Kate.Friedman Date: Tue Oct 27 17:52:28 2020 +0000 Issue #1 - adjust epos groups for DOIAU/DOIAU_ENKF=NO commit cb6d74e0c7363f7e54d879a97f457f7d4a37d926 Author: Kate Friedman Date: Mon Oct 26 15:56:58 2020 -0400 Rename Release_Notes.gfs.v16.0.0.txt to Release_Notes.gfs.v16.0.0.md commit c09678fb33d0ece36de1af223b5cdf8384d5b9c3 Author: kate.friedman Date: Mon Oct 26 14:43:09 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.12 for removal of in-cloud turbulence per AWC commit 59f08b31916d3585aa9a4832dc9de865e3a350cf Merge: cbe5ddff8 1171a6223 Author: Lin.Gan Date: Thu Oct 22 19:47:42 2020 +0000 Merge remote-tracking branch 'upstream/port2orion' into feature/ccpp commit cbe5ddff8013a03a9391eead40796f9c30d7ca28 Author: Lin.Gan Date: Thu Oct 22 18:35:49 2020 +0000 Cycled warm-start run using the develop ufs-weather-model branch and the feature/ccpp branch merged on 10/13 (4da0f) with feature/gfsv16b (31563).
CCPP_SUITE="FV3_GFS_v16beta" build_fv3 option: CCPP=Y 32BIT=Y SUITES=FV3_GFS_v15,FV3_GSD_v0,FV3_GSD_noah,FV3_GFS_v16beta Waves are turned off for this test because it is an atmospheric physics comparison. commit b7ece977f49c7a13aa39c3b57f0f06cf590d9ba6 Merge: 7728ce18f 09a669c4d Author: Kate Friedman Date: Thu Oct 22 11:07:24 2020 -0400 Merge pull request #165 from NOAA-EMC/release/gfsv15.3.3 GFSv15.3.3 commit 1171a62238f0877fc8956df669f07fc9a418c145 Author: Kate.Friedman Date: Thu Oct 22 14:01:31 2020 +0000 Issue #1 - increase wavepostbndpnt and wavepostpnt walltimes to give more time on Hera commit c3bab19722a8b301e9ac4d2367b7e3e5dc2d88e7 Author: Kate.Friedman Date: Wed Oct 21 20:13:38 2020 +0000 Fix wrong DATE in new if-block in config.anal for cold start checking commit 636ded675b81b15e8b3af2a211edf44f17972cf0 Author: kate.friedman Date: Tue Oct 20 19:23:16 2020 +0000 Issue #1 - add wtime_fcst_gfs time of 4hrs for C384 commit 4007f116adcde8372e590489d0faa0181babbc15 Author: kate.friedman Date: Tue Oct 20 18:29:15 2020 +0000 Issue #1 - fix firstcyc queue bug in workflow_utils.py on WCOSS commit ebdb058fd138a7aa408d69c967747686ea5b77d9 Author: kate.friedman Date: Tue Oct 20 13:23:05 2020 -0500 Issue #1 - resource updates from low res testing - set io_layout to "1,1" for low res gfs in config.fcst - set npe_wav[_gfs] to 140 for all resolutions, same wave grid - increase nth_fv3 to 4 for C192 and C384 - set smaller walltime for gfsfcst when resolutions less than C768 commit 3f0ab6ff2dd4ca42a45e68972b1936d398b32097 Author: Kate.Friedman Date: Tue Oct 20 17:55:18 2020 +0000 Issue #1 - increase C768 npe_eobs to 200 for Hera nodes with less memory commit d5b69c49ddbd03982db79f09707a49d01ef0b443 Merge: 4ebdd230c 794e6655c Author: Kate.Friedman Date: Tue Oct 20 16:56:27 2020 +0000 Merge remote-tracking branch 'origin/release/gfsv16.0.0' into port2orion * origin/release/gfsv16.0.0: Issue #1 - pull in corrected npe_eobs values in config.resources Issue #1 - update FV3 tag
to GFS.v16.0.14 for Hera/Orion build support commit 4ebdd230c121a3c839829aa4ef855d80eb462549 Author: Kate.Friedman Date: Tue Oct 20 16:53:46 2020 +0000 Add UPP netcdf module library load to HERA.env to resolve runtime netcdf version mismatch commit 794e6655cb227973e47b2373d0e4c97bf297479e Author: kate.friedman Date: Tue Oct 20 16:45:58 2020 +0000 Issue #1 - pull in corrected npe_eobs values in config.resources commit 1d0cad425b1e163377bba8ace6622bbbb2d81de0 Author: Kate.Friedman Date: Mon Oct 19 19:24:55 2020 +0000 Issue #1 - add parm mon folder to ignore list and move parm section of ignore list up commit d632c87a8f2ce85d54fa45da14aa09f91a0ec680 Author: Kate.Friedman Date: Mon Oct 19 19:20:31 2020 +0000 Issue #1 - add FDATE to config.base commit 22d4118f4bc7853fa81f6505fd9aaf5bc6dab4f8 Author: Kate.Friedman Date: Mon Oct 19 19:12:29 2020 +0000 Issue #1 - save GFSv16 hand-off resource settings into new static nco configs commit 6fd73d7fdf40b291ff49193b606538b540e46139 Author: Kate.Friedman Date: Mon Oct 19 19:11:37 2020 +0000 Issue #1 - new FV3 tag that builds on Hera/Orion and update for building/running high res system on Hera commit 484550c13d8ef957478f20afd6e722f7f610bb8f Author: kate.friedman Date: Fri Oct 16 15:40:35 2020 +0000 Issue #1 - update FV3 tag to GFS.v16.0.14 for Hera/Orion build support commit 58dcf6d961f7478fec64a850e7bcd19ffd8f3741 Author: kate.friedman Date: Thu Oct 15 15:08:38 2020 -0500 Update .gitignore for script renaming and removed external files commit e3f8df65082853baa5a22ed3b242efdae66d40ba Author: kate.friedman Date: Thu Oct 15 11:59:58 2020 -0500 Add wave env updates into ORION.env, increase waveinit tasks, and update resources for analysis and efcs walltimes commit 4da0feaaf8b63ee4662be8be9c5f59d547164b0b Merge: ee0f27b3f 31563a598 Author: lin.gan Date: Tue Oct 13 20:51:48 2020 +0000 Merge remote-tracking branch 'origin/feature/gfsv16b' into feature/ccpp commit e5e7cf1ea61d99080ec95b446c1bee45a57321a3 Merge: a1a7ac949 4b2d26db3 
Author: kate.friedman Date: Tue Oct 13 14:25:56 2020 -0500 Sync merge with release/gfsv16.0.0 branch commit 4b2d26db3b9b66ecfdd4756bfcddb6e16f255b03 Author: kate.friedman Date: Tue Oct 13 17:41:24 2020 +0000 Issue #1 - adjust WAFS dependencies to wait for f036 post output commit 9ef1999270e05c1ff5bae5ea5f737d9474c98b3d Author: kate.friedman Date: Fri Oct 9 19:27:09 2020 +0000 Issue #1 - adding release notes for GFSv16 commit bf21010067bf7587be93229a979da347278fd7eb Merge: d5e8be989 31563a598 Author: kate.friedman Date: Fri Oct 9 18:15:36 2020 +0000 Merge remote-tracking branch 'origin/feature/gfsv16b' into release/gfsv16.0.0 commit d5e8be989d1f2f45095fea17fcb5ed1385759e89 Merge: 7728ce18f 561e19532 Author: kate.friedman Date: Fri Oct 9 18:13:53 2020 +0000 GFSv16 package changes from feature/gfsv16b before final wave updates commit 31563a5982b26d7441e2673d68d1d1dac4fe231d Merge: bf27d0b4d 6edf0f7b3 Author: Kate Friedman Date: Thu Oct 8 19:03:10 2020 -0400 Merge pull request #158 from RobertoPadilla-NOAA/feature/gfsv16_wave_prdgen Feature/gfsv16 wave prdgen commit 6edf0f7b396587c4e014ba08fb6199461a970c41 Merge: 896ba3127 bf27d0b4d Author: wx21rph Date: Thu Oct 8 22:54:12 2020 +0000 Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16_wave_prdgen Generating AWIPS files with masks and eliminating WMO headers for the Arctic Ocean commit 896ba31270185a1edaa3c6d32ec3c4295badf136 Author: wx21rph Date: Thu Oct 8 22:50:48 2020 +0000 Issue #94 producing awips files with masks and deleting wmo headers for arctic ocean commit bf27d0b4de54e6b873e20d784012c93384ad2358 Merge: 561e19532 959dac21e Author: Kate Friedman Date: Thu Oct 8 18:23:17 2020 -0400 Merge pull request #157 from JessicaMeixner-NOAA/feature/gfsv16b-wavegrids Adding wave grids to grib interpolation commit 959dac21e98ff96130f8e8c90a4b94b18c76f4f3 Merge: 50f33dd92 561e19532 Author: jessica.meixner Date: Thu Oct 8 21:57:29 2020 +0000 Merge remote-tracking branch 'EMC/feature/gfsv16b' into
feature/gfsv16b-wavegrids commit 50f33dd92e1f78f8879cb037652b1b4393511842 Author: jessica.meixner Date: Thu Oct 8 21:44:23 2020 +0000 updates to add glo_30m to the created grib files for waves for awips processing commit 561e19532e68aaff69f5b2568c3ebb2a536b7149 Merge: 7421143a1 853a46199 Author: Kate Friedman Date: Thu Oct 8 16:10:32 2020 -0400 Merge pull request #156 from lgannoaa/feature/gfsv16b Modify gfs/gdas post job to 20 minutes in wall clock. commit 853a46199f2c59ff84422024b55a70e2679213ee Author: Lin.Gan Date: Thu Oct 8 19:55:56 2020 +0000 Modify gfs/gdas post job to 20 minutes in wall clock. commit 7421143a169264c8480aea5970a6d67f9bc56c43 Author: kate.friedman Date: Thu Oct 8 18:40:28 2020 +0000 Issue #1 - update link_fv3gfs.sh to point to newly frozen fix_nco_gfsv16 FIX_DIR commit 8ac89aee11e7099303600ba6df8e3d4bf4d70a66 Merge: db9fa1730 eb4ad338d Author: Kate Friedman Date: Thu Oct 8 14:37:11 2020 -0400 Merge pull request #155 from lgannoaa/feature/gfsv16b Further ecflow updates commit eb4ad338dfa1f51c9ec4ada58601ffd3f6894d7f Author: Lin.Gan Date: Thu Oct 8 17:58:29 2020 +0000 Modify two wafs jobs trigger as: jgfs_atmos_wafs_grib2 trigger ../../post/jgfs_atmos_post_f000 == complete jgfs_atmos_wafs_grib2_0p25 trigger ../../post/jgfs_atmos_post_f036 == complete commit 589b1df4d5637f6e4521e79d4f286a463da477f3 Merge: 445a16b03 db9fa1730 Author: Lin.Gan Date: Thu Oct 8 17:46:36 2020 +0000 Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b commit db9fa173028efd7ed796024a2f4753629e91922b Author: kate.friedman Date: Thu Oct 8 16:01:50 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.10 and change WAFS job dependencies commit 2a0c9739631ae4b1305a2653a1783cce287aeb96 Author: kate.friedman Date: Thu Oct 8 15:12:10 2020 +0000 Issue #1 - update config.fv3 and config.resources with v16rt2 values commit b5fbac7affc200e2c9a420f81615d0ae80b1dacf Merge: 5dac33742 6402996f0 Author: Kate Friedman Date: Thu Oct 8 10:36:43 2020 -0400 Merge 
pull request #154 from CoryMartin-NOAA/bugfix/eobs_resources Change config.resources for eobs for low resolution cases commit 5dac33742edd372a4e9b08c0b3029bb928cc569b Author: kate.friedman Date: Thu Oct 8 14:35:23 2020 +0000 Issue #1 - update gfs_util modulefiles commit 6402996f0cb5f787d04ac08efc8dd92dca767618 Author: CoryMartin-NOAA Date: Thu Oct 8 14:28:02 2020 +0000 Change config.resources for eobs for low resolution cases commit aae0912df6b1398cdad563495475916970590b1a Author: kate.friedman Date: Thu Oct 8 13:51:06 2020 +0000 Issue #1 - update to fbwndgfs modulefiles for WCOSS-Dell and WCOSS-Cray commit 244e91ca0caf433f82e6e709413db4bac40aade2 Merge: 3bdda7fb5 3b051e93a Author: Kate Friedman Date: Thu Oct 8 09:40:24 2020 -0400 Merge pull request #153 from JessicaMeixner-NOAA/bugfix/wavedependency update wave post pnt dependency commit 5962e117f93f1fc9f040f97b9446bcf9dc1138e1 Author: jessica.meixner Date: Thu Oct 8 11:24:41 2020 +0000 updates to parm to reduce the number of wave variables changes to the config so that wave models are interpolated to the multi_1 masked files for the regional output grids commit 445a16b030384a84c60318600a5ccdeeccbd063b Author: Lin.Gan Date: Thu Oct 8 04:03:32 2020 +0000 Modify module for each job to match implementation package change Modify two wafs jobs trigger Modify wall clock and resource for running jobs in NCO Modify obsproc package location commit 3b051e93adf60717f446564a08e5f3a11fd742af Author: jessica.meixner Date: Thu Oct 8 01:10:14 2020 +0000 add a dependency for the wavepostpnt on wavepostbndpnt for just gfs as this job does not exist for gdas commit 9f7eebaf6b980fd78498c51087d4333f5e56cb74 Author: kate.friedman Date: Wed Oct 7 20:36:19 2020 +0000 Temporarily peg GSI checkout to hash of release branch commit 3bdda7fb50676d25dbc03fffa739889a141b913c Author: kate.friedman Date: Wed Oct 7 17:36:27 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.9 commit 3a700dbda1d6f6ef965cf2fbf30cfdf6cb2a8fe2 Author: 
kate.friedman Date: Wed Oct 7 14:36:02 2020 +0000 Issue #1 - remove POE/BACK block from config.prep and set POE=YES/BACK=off as defaults in env/WCOSS_DELL_P3.env prep section commit 8cc7d57be3c1d08529dd37633a83a6ffa8ca94b3 Author: kate.friedman Date: Wed Oct 7 14:30:19 2020 +0000 Issue #1 - remove unneeded DMPDIR and ICSDIR from config.base.nco.static commit 343ea3daece25bfeb1f8e870d361a364d1c228f0 Merge: 22f2a407e b57cd17da Author: Kate Friedman Date: Wed Oct 7 09:47:18 2020 -0400 Merge pull request #152 from JessicaMeixner-NOAA/feature/addwavedependency for rocoto add a dependency to wavepostpnt job on wavepostbndpnt commit b57cd17dafee5415b4955fcd8bbc76eb27e88740 Author: jessica.meixner Date: Wed Oct 7 13:19:50 2020 +0000 for rocoto add a dependency to wavepostpnt job on wavepostbndpnt so that both jobs will not run at the same time which will slow both jobs down. This is the reason for the dependency, otherwise there is not a "true" dependency between the jobs commit 22f2a407e853b7d5342ee52faa53affdb25ac44e Author: kate.friedman Date: Tue Oct 6 20:38:42 2020 +0000 Issue #1 - return POE=YES and BACK=off setting for prep on WCOSS_DELL_P3 commit c530316d07ae129f26907b2cae2079b55b37d0c2 Author: kate.friedman Date: Tue Oct 6 19:32:52 2020 +0000 Issue #1 - remove hardcoded POE and BACK values from config.prep commit aece8baef1cd2f46991cb1235c315e228f7b661e Author: kate.friedman Date: Tue Oct 6 19:02:37 2020 +0000 Issue #1 - move ABIBF, AHIBF, and HDOB pointers into RUN_ENVIR=emc block commit c75766cd176802242f003ad3612aadff02563fb2 Author: kate.friedman Date: Tue Oct 6 18:17:35 2020 +0000 Issue #1 - update config.fv3 based on real-time parallel commit f79ec7e0ce356213681e8c87655bb3fb66df16f1 Author: kate.friedman Date: Tue Oct 6 17:40:16 2020 +0000 Issue #1 - update prep job resources commit 577b060e63b1f47c77fd8e7a2d1b7f111fe6fcd4 Author: kate.friedman Date: Tue Oct 6 16:08:07 2020 +0000 Issue #1 - update g2tmpl module load in modulefiles/module_base.wcoss_dell_p3 
commit c28d8cea538cdcdfc8557cb71120436edd86e313 Merge: 6197bc623 1b79fd35e Author: Kate Friedman Date: Tue Oct 6 11:16:32 2020 -0400 Merge pull request #150 from NOAA-EMC/feature/gfsv16b_updates nwtest module library and tag updates commit 1b79fd35e79fe7d0bc8eb460dfc08d44261fc470 Author: kate.friedman Date: Tue Oct 6 14:34:26 2020 +0000 Issue #1 - config updates from real-time parallel commit 11ca41294784278dc425e1ca1320253239a85d40 Author: kate.friedman Date: Tue Oct 6 13:44:29 2020 +0000 Issue #1 - remove unneeded line in vrfy.sh and update link_fv3gfs.sh for UFS_UTILS execs commit 97e9d7f2d75bb7167b56aaa70c53f53e4a189192 Author: kate.friedman Date: Mon Oct 5 18:50:40 2020 +0000 Issue #1 - update GSI tag to gfsda.v16.0.0 commit f6689d462ba4528c99a2a331ae8890759ba25ecb Merge: 01e362089 8048cd028 Author: Kate Friedman Date: Mon Oct 5 14:05:27 2020 -0400 Merge pull request #149 from GuangPingLou-NOAA/feature/gfsv16b Add 6 bufr stations to the bufr sounding output commit 01e362089fbf71a962fcdfc7db48f243b74e2e50 Author: kate.friedman Date: Mon Oct 5 17:09:04 2020 +0000 Issue #1 - update component tags and modulefiles for nwtest lib updates, remove unneeded module load and modulefile from downstream wave job rocoto scripts commit 6197bc623013d7fca82c556f3fb51c15832578b3 Merge: 451669a8f cd86b0edd Author: Kate Friedman Date: Mon Oct 5 10:47:08 2020 -0400 Merge pull request #148 from NOAA-EMC/feature/gfsv16b_updates Updates to optimize wave post jobs commit cd86b0eddd305483007e1381d0d4a7256fa05680 Merge: 9b32bf0c8 95bc516af Author: kate.friedman Date: Mon Oct 5 14:16:50 2020 +0000 Merge branch 'feature/gfsv16b_updates' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b_updates commit 9b32bf0c83ac9eb38f8d3b1b405b704b26c6592a Author: kate.friedman Date: Mon Oct 5 14:16:21 2020 +0000 Issue #1 - adjust error handling in wave rocoto job scripts commit 8048cd0285c8b9c8606673b04d60e8a137fe89b8 Author: Guang.Ping.Lou Date: Sat Oct 3 00:45:21 2020 +0000 issue 
#142 generate station i,j grid commit e4e4b84235344b6da44ae732020115581abb9afe Author: Guang.Ping.Lou Date: Sat Oct 3 00:45:00 2020 +0000 issue #142 generate station i,j grid commit 68e5fd7279bf1f9c30fa4daa16bf44795f5e91f6 Author: Guang.Ping.Lou Date: Sat Oct 3 00:44:46 2020 +0000 issue #142 generate station i,j grid commit fcf1f415333d53fe3a97546d4a16de55d6e1d703 Author: Guang.Ping.Lou Date: Sat Oct 3 00:43:49 2020 +0000 issue #142 add 6 bufr station data commit 026b4e0a2ec1da85a3cc8c73053dd85fd4d56439 Author: Guang.Ping.Lou Date: Sat Oct 3 00:42:58 2020 +0000 issue #142 add 6 bufr stations for Thailand TMD commit faaf8041faeecbc4c39cb37782bbe291ebd7a011 Author: Guang.Ping.Lou Date: Sat Oct 3 00:41:09 2020 +0000 issue #145 change dev path to prod for parallel netcdf modules commit e6ba71f58ab5e88957f12fc742fded77a58f75d2 Author: Guang.Ping.Lou Date: Sat Oct 3 00:40:34 2020 +0000 issue #145 change dev path to prod for parallel netcdf modules commit 95bc516af61e4aa35997f45ea625773f947c2125 Merge: 28904cddc a4d0e4d7a Author: Kate Friedman Date: Fri Oct 2 15:34:48 2020 -0400 Merge pull request #143 from JessicaMeixner-NOAA/feature/gfsv16b-wave-byhr optimize wave post and EE2 commit 451669a8f8d7778addd7bbd090d2fe71c9a6e3f8 Merge: 340f849f1 28904cddc Author: Kate Friedman Date: Fri Oct 2 15:32:41 2020 -0400 Merge pull request #144 from NOAA-EMC/feature/gfsv16b_updates Issue #1 - updates for modules and small fixes commit 28904cddca4d3e432f6b155ae018b6a081aea82e Author: kate.friedman Date: Fri Oct 2 19:05:10 2020 +0000 Issue #1 - updates for modules and small fixes - fix to run ens_tracker without tclogg module in modulefiles/module_base.wcoss_dell_p3 - fix to sorc/syndat_qctropcy.fd/qctropcy.f for compile warning - fix to gfswaveawipsbulls dependency in setup scripts - update to bufr/11.3.0 from bufr/11.2.0 in modulefiles - add override ability for POE for prep jobs in env/WCOSS_DELL_P3.env commit a4d0e4d7a063c14f711bee5ef32cd0b183121ea7 Author: jessica.meixner Date: 
Fri Oct 2 18:59:22 2020 +0000 reverting changes to configs that were not intended to be committed commit 2e6ba320b39b50030c8097a09700e2eb81cb724a Merge: cb7b27c96 340f849f1 Author: jessica.meixner Date: Fri Oct 2 18:08:59 2020 +0000 Merge remote-tracking branch 'EMC/feature/gfsv16b' into feature/gfsv16b-wave-byhr commit cb7b27c96c59388e39283b2c01aff2da92fcedbf Author: jessica.meixner Date: Fri Oct 2 18:08:16 2020 +0000 fix resource time estimates commit d51a260adc0fb003730ae4090c528856ac18be93 Author: jessica.meixner Date: Fri Oct 2 18:04:50 2020 +0000 bug fix in exgfs_wave_post_pnt.sh commit 340f849f1cad6b562a941f15318eaf2d050743f1 Merge: 897286fbd d5457ec1e Author: Kate Friedman Date: Fri Oct 2 12:00:00 2020 -0400 Merge pull request #141 from RobertoPadilla-NOAA/feature/gfsv16_wave_prdgen Updates to downstream wave jobs commit 2b2635d441beb988e6c09a329eb80e1d519266ee Author: jessica.meixner Date: Fri Oct 2 14:38:06 2020 +0000 update resources and trigger from 192->180 commit 56193a6bf9e8942edd54b7b7078143a3c91bc77f Author: jessica.meixner Date: Fri Oct 2 14:10:05 2020 +0000 last of EE2 changes commit ad5dcff885c3690f991936412453506deed543a6 Author: jessica.meixner Date: Fri Oct 2 14:01:25 2020 +0000 updates for EE2 from waves commit d5457ec1e15e670677eea3171e68df20e4d6ca53 Merge: 4c488d309 897286fbd Author: wx21rph Date: Thu Oct 1 19:02:34 2020 +0000 Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16_wave_prdgen Issue #94 Adding error capture in the Jjobs commit 4c488d30947cf556d6cbf05d570fa6c41ae59ffb Author: wx21rph Date: Thu Oct 1 18:45:21 2020 +0000 Issue #94 fix for failing silently commit 897286fbdfea6a8d5765f9d89ab7c586bcadb201 Author: kate.friedman Date: Thu Oct 1 18:14:37 2020 +0000 Issue #1 - update WAFS tag to gfs_wafs.v6.0.8 commit e133c5d1d509781824140f5ec1d302502394dc89 Merge: a25a7deac 545742c42 Author: Kate Friedman Date: Thu Oct 1 09:13:05 2020 -0400 Merge pull request #140 from lgannoaa/feature/gfsv16b Update ecflow
scripts for GFSv16 commit ee0f27b3f4e3e544be791b3554dd57a808b66e4c Author: Judy.K.Henderson Date: Wed Sep 30 22:21:02 2020 +0000 - updated to use 28Sep develop ufs-weather-model, d021e7b0395ccac2b7a30b414b58a8c924d2784f f61416fef691d9ba39a40df1ce72aa574f54c390 FMS (2019.01.03) 9e1ba7c7448a8d009f39b5588e9498a7dbab1c60 FV3 (heads/develop) 9d05172b711f4ab5d6f978dbe575bd67a681b55a NEMS (heads/develop) 96e3f3a8fa0389a4b110b0fa23e7a414f6d92038 WW3 (6.07.1-50-g96e3f3a8) ffdd19bc6c1df747394b7e9958a76238fcd44242 stochastic_physics (ufs-v1.0.0-70-gffdd19b) - changed compilation options in build_fv3.sh - removed fv3gfs.fd_jkh directory since changes are already in develop branch - updated getic script to retrieve files after 00Z 26Feb20 from mass store with prefix name of 'com' commit 8f79b61ab8b0c7ed17b707eaae7736a3aedd33ef Merge: cc9e98a6d a25a7deac Author: wx21rph Date: Wed Sep 30 16:09:27 2020 +0000 Issue #94 resolving conflicts commit cc9e98a6d03fa422f5080626da02426c174e2b45 Author: wx21rph Date: Wed Sep 30 15:23:29 2020 +0000 Issue #94 add native grids as default grids commit 545742c42c635d24c360a8c9b0902483cab66359 Merge: 65ba88e5a a25a7deac Author: Lin.Gan Date: Wed Sep 30 14:05:40 2020 +0000 Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b commit a25a7deac9053dbb2ce119e3754f32c2d3f225b9 Author: russ.treadon Date: Tue Sep 29 22:46:35 2020 +0000 Issue #1: update name of ncdiag executable and source code directory to be consistent commit 2eab17ce11be26c7d113e16e3a9159e0e6395cac Author: jessica.meixner Date: Tue Sep 29 19:05:21 2020 +0000 cleaning up the rearranged scripts commit 65ba88e5ad97a560e4b1c6dd7a3568185e5ccbd9 Author: Lin.Gan Date: Tue Sep 29 18:22:29 2020 +0000 Jobs were tested with PDY 20200925; code managers from post, gempak, wave, and post-processing certified the test run results.
This merge included an update from high-watermark testing (tested by the gfs team using devonprod); the results impact the following jobs: jgfs_atmos_analysis.ecf jgfs_forecast.ecf jgdas_atmos_analysis.ecf jgdas_enkf_update.ecf jgdas_enkf_ecen.ecf commit d86cfee7befd1d97c9ff6fd8cb58d2f9c9f9f1d8 Merge: abe24e279 69bdae3e7 Author: jessica.meixner Date: Mon Sep 28 18:47:23 2020 +0000 Merge remote-tracking branch 'EMC/feature/gfsv16b' into feature/gfsv16b-wave-byhr commit 121bafee1e227119ddcba233da5b59af242128e4 Merge: 0d7ba5fb2 69bdae3e7 Author: Lin.Gan Date: Mon Sep 28 18:42:03 2020 +0000 Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b commit 95f03be594d2944f14f0f6baa600bae27691e891 Merge: 7278b0677 69bdae3e7 Author: Guang.Ping.Lou Date: Mon Sep 28 17:30:03 2020 +0000 Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b commit 69bdae3e7cdb35db28d899b7384daab48e888eb4 Author: russ.treadon Date: Mon Sep 28 16:58:47 2020 +0000 Issue #1: update parm/config.vrfy to define VSDBJOBSH (used by jobs/rocoto/vrfy.sh) commit abe24e279666ecccb27fb875322373b4920a66d3 Merge: d4ebc2e67 ef6d2c5ca Author: jessica.meixner Date: Fri Sep 25 21:56:05 2020 +0000 Merge remote-tracking branch 'EMC/feature/gfsv16b' into feature/gfsv16b-wave-byhr Conflicts: parm/config/config.resources commit 0d7ba5fb286ac280efda5510a23e2dbf625ba19a Author: Lin.Gan Date: Fri Sep 25 21:34:42 2020 +0000 The code manager indicated that all wafs jobs' wall clock is 30 mins, and that the job card for scripts/gfs/atmos/gempak/jgfs_atmos_pgrb2_spec_gempak.ecf needs to be changed commit 998228f94e36f63a0586f431fbcd32fb646d0ee6 Author: Lin.Gan Date: Fri Sep 25 21:02:57 2020 +0000 The EMC realtime parallel does not use operational job settings. Ecflow job cards roll back to the settings from the module_used_gfs-16_job Google Sheet document.
commit 0e700ef159835510b3e3d02d27448826cb8dd914
Merge: 9b8e2f3a2 ef6d2c5ca
Author: Lin.Gan
Date: Fri Sep 25 19:49:13 2020 +0000

    Bring change from upstream_feature_gfsv16b

commit ef6d2c5ca3243c579615ee47bcef494bc42f335e
Author: kate.friedman
Date: Fri Sep 25 16:17:05 2020 +0000

    Update GLDAS tag to gldas_gfsv16_release.v1.10.0

commit a9f8cb2616259b5610b9d101b5f9e50ea3013a2e
Author: kate.friedman
Date: Fri Sep 25 16:06:11 2020 +0000

    Update gfswafs job to run with loop over fcsthrs

commit 8187a31eca915184a43c0649db0170271b4ee6a9
Author: russ.treadon
Date: Fri Sep 25 14:06:43 2020 +0000

    Issue #1: update vrfy.sh to submit vsdb processing as separate job (only on WCOSS_DELL_P3)

commit 320f33033ac0f6f6919bd933c0a18d2a4088fcd6
Author: Roberto Padilla
Date: Thu Sep 24 23:29:10 2020 +0000

    Issue #94 add /fakedbn to run DBN_alerts

commit 0d2c6280ac80d944ea7e9e2700430f42d24b3867
Merge: f98765509 83655eda7
Author: Kate Friedman
Date: Thu Sep 24 17:17:17 2020 -0400

    Merge pull request #139 from NOAA-EMC/feature/gfsv16b_jobname

    Update script names to match ecflow convention

commit 83655eda76114414f39c5e32a61b93186b42ac3c
Author: kate.friedman
Date: Thu Sep 24 18:30:00 2020 +0000

    Update config.awips for newly named JJOB scripts

commit 2f5fb01d02516a6fe72664c54cd97a5585f99a3f
Author: kate.friedman
Date: Thu Sep 24 18:20:14 2020 +0000

    Update WAFS jobs/rocoto scripts to use new JJOB names

commit cb4a7bb6b507c2e3105d6fa8235edea0889e9bd9
Author: Judy.K.Henderson
Date: Thu Sep 24 17:19:55 2020 +0000

    modifications to python scripts
    - remove aeroic task from setup_workflow_fcstonly.py
    - delete setup_workflow_fcstonly_noaeroic.py
    - add setup_workflow_fcstonly_aeroic.py

commit 9b8e2f3a2b782c9dc602c620ce7efc2f7210d6dd
Author: lin.gan
Date: Thu Sep 24 15:30:45 2020 +0000

    Making J-Job naming change according to code manager.
    Remove temp files

commit ff50171eea7b99c7e89b0b773824464da5fe949f
Author: kate.friedman
Date: Thu Sep 24 14:24:23 2020 +0000

    Update post.sh UPP JJOB script name to submit

commit 3568bdf3aec8e53853686f3bd8d3470e9f547af1
Author: lin.gan
Date: Wed Sep 23 21:38:36 2020 +0000

    As requested from management, point ufs-weather-model to development: b8c5c22b2a2effe7b925fae1fa449ddec96be848 git submodule
      f61416fef691d9ba39a40df1ce72aa574f54c390 FMS (2019.01.03)
      6bc61df3c363f9134a46439ff4a5a4a803daafb1 FV3 (heads/develop)
      9d05172b711f4ab5d6f978dbe575bd67a681b55a NEMS (heads/develop)
      96e3f3a8fa0389a4b110b0fa23e7a414f6d92038 WW3 (6.07.1-50-g96e3f3a8)
      ffdd19bc6c1df747394b7e9958a76238fcd44242 stochastic_physics (ufs-v1.0.0-70-gffdd19b)
    Remove aerosol from checkout

commit 0eb85537f30b1c89ed160374e2e7f8470d7a20d0
Author: lin.gan
Date: Wed Sep 23 18:42:52 2020 +0000

    Modify each ecflow script with old j-job name for test. Modify the following in each definition file.
    - ecen family requires new extern from previous cycle
    - Job jgfs_atmos_npoess_pgrb2_0p5deg trigger changed to be (requested from code manager):
      1. jgfs_atmos_post_manager:release_post180
    - Job jgdas_enkf_select_obs trigger changed to be:
      1. previous cycle enkf post complete
      2. current cycle jgdas_atmos_analysis_calc complete (new job)
    Tested as of 9/21/2020 before production switch with the following conditions:
    1. Known issue in wafs gcip. Job failed. Waiting for code manager to fix.
    2. obsproc testing still ongoing.
    3. Code manager still updating j-job and ex-script names. Testing on hold until package is ready and WCOSS availability.
commit 9fb09168c9aa2057a0c919f66bdf43b8e4e3a545
Author: kate.friedman
Date: Wed Sep 23 15:41:20 2020 +0000

    Update config.base.nco.static with config.base.emc.dyn changes

commit a1a7ac9495446976b9e9ccae48a39bb4b868c19b
Merge: ee4bc2b72 70f5064e8
Author: kate.friedman
Date: Wed Sep 23 10:19:52 2020 -0500

    Issue #5 - sync merge with feature/gfsv16b_jobname

commit 70f5064e8e345e073d9c0dc7da812d0a2a51d91d
Author: kate.friedman
Date: Wed Sep 23 14:00:07 2020 +0000

    Script name updates for sfc_prep and tracker

commit 4346c03a1aaf1cf1429a59b6a5682065c2c6e51b
Author: kate.friedman
Date: Wed Sep 23 13:51:44 2020 +0000

    Name change for tropcy scripts and update WAFS tag

commit af8f685382eda16a0dcf02c26917fe57e4b32a58
Merge: 12f16a874 f98765509
Author: lin.gan
Date: Tue Sep 22 21:16:05 2020 +0000

    Merge feature_gfsv16b into feature_ccpp for testing

commit 0ef9e49d9752643fa3ebdc8fc0ef9768968836d2
Author: kate.friedman
Date: Tue Sep 22 17:50:35 2020 +0000

    Fixing spelling mistake in config.gldas

commit 9fdb9eef052586c36abf034cd335ec649bf9cf19
Merge: 869dfa350 f98765509
Author: lin.gan
Date: Tue Sep 22 17:13:20 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b

commit 4b1affe520e0610c14168ccba4ef3439a3b7a966
Author: BoiVuong-NOAA
Date: Tue Sep 22 16:39:39 2020 +0000

    Updated drivers and release notes

commit 2ed155ccee58d3cc72d7fdf6577806c7a14e58e9
Author: BoiVuong-NOAA
Date: Tue Sep 22 15:55:01 2020 +0000

    Updated scripts

commit 6d171bc82f2c63f146fe1cf760c350b2effc20e6
Author: Wen Meng
Date: Tue Sep 22 15:26:37 2020 +0000

    Update Externals.cfg with new UPP tag upp_gfsv16_release.v1.0.16.

commit 4b4069b715c9ebcc5122b79a1508f70ad7c78b46
Author: Wen Meng
Date: Tue Sep 22 14:59:24 2020 +0000

    1) Update sorc/checkout.sh with new UPP tag upp_gfsv16_release.v1.0.16.
    2) Update sorc/link_fv3gfs.sh with new file name convention for jjob and ex-script of post processing part.
commit c071ac0159cce98ea98ce5fa8dc5337b284ad136
Author: BoiVuong-NOAA
Date: Tue Sep 22 14:48:45 2020 +0000

    Updated job names

commit ea13d0c9146457aadd65d567fda794beeeef6c7d
Author: kate.friedman
Date: Tue Sep 22 14:27:47 2020 +0000

    Update EMC_verif-global tag to verif_global_v1.11.0

commit 92e69985706cf62cdd7b383fd45c4cd50f937053
Author: kate.friedman
Date: Tue Sep 22 14:15:24 2020 +0000

    Rename scripts to match ecf script naming convention. Add SENDDBN and DBNROOT. Update GLDAS tag.

commit 37ae040a0f21ec344228128e3e79910059c3b734
Author: fanglin.yang
Date: Tue Sep 22 04:30:58 2020 +0000

    In anticipating changes from the GLDAS repo: renaming JGDAS_GLDAS to JGDAS_ATMOS_GLDAS, and exgdas_gldas.sh to exgdas_atmos_gldas.sh
    modified: driver/gdas/test_gdas_gldas.sh jobs/rocoto/gldas.sh parm/config/config.gldas sorc/link_fv3gfs.sh

commit c58b93a61f77d65f042f8226445163029ebd9f8d
Author: fanglin.yang
Date: Tue Sep 22 04:06:06 2020 +0000

    renamed: jobs/JGFS_POSTSND -> jobs/JGFS_ATMOS_POSTSND
    renamed: scripts/exgfs_postsnd.sh -> scripts/exgfs_atmos_postsnd.sh
    modified: docs/archive/README_bufr driver/product/run_postsnd.sh driver/product/run_postsnd.sh.cray driver/product/run_postsnd.sh.dell driver/product/run_postsnd.sh.hera driver/product/run_postsnd.sh.jet parm/config/config.postsnd

commit bec1b83027e162e4c083464d2cdb3fbdbd5168e8
Author: fanglin.yang
Date: Tue Sep 22 03:51:02 2020 +0000

    renamed: scripts/exglobal_fcst_nemsfv3gfs.sh -> scripts/exglobal_forecast.sh
    and modified jobs/JGLOBAL_FORECAST parm/config/config.fcst

commit d4ebc2e6798d13af1ced8a51899e80e209a13c71
Author: jessica.meixner
Date: Mon Sep 21 18:28:15 2020 +0000

    updates for optimizing point jobs

commit 7278b0677be6168a9695fca6c2e08154d9af0f63
Merge: 928e3e4b5 f98765509
Author: Guang Ping Lou
Date: Fri Sep 18 18:35:06 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b

commit f98765509cc7fc0e7b8979abfc73dbc5db035825
Author: kate.friedman
Date: Fri Sep 18 16:58:57 2020 +0000

    Issue #1 - update SEND variables and add DBNROOT to base configs and add check to build_enkf_chgres_recenter_nc.sh for GSI build

commit e9e8d63db112ab8f5e15de08a1714cfebd386a2a
Author: kate.friedman
Date: Fri Sep 18 15:18:48 2020 +0000

    Issue #1 - update to UFS_UTILS ops-gfsv16.0.0 tag

commit 3df7625d74012552cf69fcfe7c0b5384b860c999
Author: kate.friedman
Date: Fri Sep 18 13:25:00 2020 +0000

    Issue #1 - fix to link_fv3gfs.sh for new GLDAS tag

commit 928e3e4b53e969f8ecdd3544c3bf2668b57c66b3
Merge: dd0814266 1fc7bde9b
Author: Guang Ping Lou
Date: Thu Sep 17 20:00:17 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b

commit dd08142664e5695e3f27890c0560f0b5e48841a8
Author: Guang Ping Lou
Date: Thu Sep 17 19:57:20 2020 +0000

    Issue #131 Unify dbn_alert path

commit 1fc7bde9b7f99da7d0430387ec4a547e73ea641b
Author: russ.treadon
Date: Thu Sep 17 19:37:41 2020 +0000

    Issue #1: update to UPP tag "upp_gfsv16_release.v1.0.15"

commit 9eae5a8af39c4a8c5d47c16725ccc62b5d3c60cf
Merge: ce1c78255 99c150992
Author: Kate Friedman
Date: Thu Sep 17 13:46:58 2020 -0400

    Merge pull request #135 from KateFriedman-NOAA/feature/gfsv16b-down

    Small updates to downstream jobs and tag updates

commit 99c1509923399845ba794cb8fad981d12935dd03
Author: kate.friedman
Date: Thu Sep 17 13:13:14 2020 +0000

    Fix for running prep on Hera

commit 869dfa350a013f66e5fdca66aa9b1f0192c1570c
Author: lin.gan
Date: Wed Sep 16 21:25:10 2020 +0000

    ecflow full day cycle included

commit 515eeb5c6b188e873b304619c40c9a6efcfd9046
Author: kate.friedman
Date: Wed Sep 16 19:53:26 2020 +0000

    Update WAFS tag to gfs_wafs.v6.0.6

commit 89f138af89814c865331b9aefcb8450b5915d1f7
Author: kate.friedman
Date: Wed Sep 16 16:33:21 2020 +0000

    Small updates:
    - new UPP tag
    - new GLDAS tag
    - new WAFS tag
    - new module for WAFS
    - EE2 updates to awips scripts
    - added WAFS to archival
    - break downstream and WAFS archival into separate gfs_downstream tarball
    - update gfsarch dependencies to wait for all wavepost jobs to complete

commit 9930e8770a2940c46e22b243c11d2ac14f95611a
Merge: b3433d59d ce1c78255
Author: lin.gan
Date: Wed Sep 16 13:25:36 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b

commit b100c22b5bfcf455ff3ff15bd71f0ddc15f1d071
Author: Guang Ping Lou
Date: Tue Sep 15 18:57:21 2020 +0000

    Issue #131 reduce scripts output to logfile

commit 6866e6cd2ef1e76dd58de27ebb7d68b2bd3fa304
Author: Guang Ping Lou
Date: Tue Sep 15 18:55:03 2020 +0000

    Issue #131 added a path to DBNROOT

commit b3433d59d1d5b75f015b0e044a8b4e03d55407f7
Author: Lin.Gan
Date: Tue Sep 15 17:59:32 2020 +0000

    Adding wafs wave and downstream jobs

commit 4c7ccd466bc71b3e73194ffc48385225e01e7c0c
Author: Jessica.Meixner
Date: Tue Sep 15 15:29:44 2020 +0000

    updates for by hour post

commit ce1c78255f4bf8f40c2f116547062c7f1b3832c5
Author: russ.treadon
Date: Tue Sep 15 10:25:37 2020 +0000

    Issue #1: add fhrgrp and fhrlst back to gfsawips in setup_workflow.py (bugfix)

commit 67b33dd15633345e0fd59dd37728c26b36d4c0b3
Merge: c33869dab 586fe6ce1
Author: Kate Friedman
Date: Mon Sep 14 10:34:39 2020 -0400

    Merge pull request #130 from JessicaMeixner-NOAA/bugfix/rtofs

    Bugfix for rtofs

commit 586fe6ce19c5fc281dec16be3347d0311f3e99ce
Author: Jessica.Meixner
Date: Mon Sep 14 14:29:01 2020 +0000

    adding the line to go back a day for RTOFS for the "if not NCO" section, because RTOFS will not be available until the 06 cycle

commit 15fcb8a46e2cbc93282d1ad038e6decc5fcdacfb
Author: Lin.Gan
Date: Mon Sep 14 13:30:24 2020 +0000

    Restructured ecflow - up to post step

commit c33869dab2704396ea6e569dcd202a2b9866827a
Author: fanglin.yang
Date: Mon Sep 14 02:34:08 2020 +0000

    deleted relocate_mv_nvortex.fd since storm relocation is no longer needed.
    modified build_tropcy_NEMS.sh to remove references to relocate_mv_nvortex

commit 4320bf9fd55757df79b63ed15c2041b1acd0e604
Author: russ.treadon
Date: Sat Sep 12 00:05:38 2020 +0000

    Issue #1: correct DA typos in sorc/link_fv3gfs.sh

commit 674cd841a7be3a2a9c772cd455ab980195ea9fcc
Merge: 46c2404eb ad1a9d904
Author: Kate Friedman
Date: Fri Sep 11 15:06:36 2020 -0400

    Merge pull request #124 from JessicaMeixner-NOAA/feature/gfsv16b-wave

    Splitting post jobs for waves

commit ad1a9d9044e89b75ec82bb201e35005552d3d177
Merge: 986f8f0b9 fd687727b
Author: Jessica Meixner
Date: Fri Sep 11 14:01:35 2020 -0400

    Merge pull request #4 from KateFriedman-NOAA/feature/gfsv16b-splitwavepost

    Sync with feature/gfsv16b and small updates to split wave post changes

commit fd687727b2df80f81fabc41b94f808b74e68783c
Merge: 986f8f0b9 fc60e895d
Author: kate.friedman
Date: Fri Sep 11 17:52:30 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b-wave' into feature/gfsv16b-splitwavepost

commit fc60e895dea1c123a4b8e02b50df0d0cfa43f77f
Author: kate.friedman
Date: Fri Sep 11 17:45:03 2020 +0000

    Increase walltime for new wavepost jobs

commit e0dd1097de2c335400a546137569886a838c46ba
Merge: 0844b0043 46c2404eb
Author: kate.friedman
Date: Fri Sep 11 14:20:15 2020 +0000

    Merge branch 'feature/gfsv16b' into feature/gfsv16b-wave

commit 46c2404eba485d08e381ef81ebd5ac10c5ad605f
Merge: c55465702 9d9b79c91
Author: Kate Friedman
Date: Fri Sep 11 10:15:49 2020 -0400

    Merge pull request #128 from RobertoPadilla-NOAA/feature/gfsv16_wave_prdgen

    Feature/gfsv16 wave prdgen

commit c554657023d82ac934b481593e55406c45157932
Author: russ.treadon
Date: Fri Sep 11 13:59:51 2020 +0000

    Issue #1: update name of DA jobs and scripts in accordance with WCOSS Implementation Standards

commit 9d9b79c911add71065b0999821772d026609ad62
Merge: cb78f7b83 d3946f900
Author: wx21rph
Date: Fri Sep 11 13:48:19 2020 +0000

    Issue #94 solving a conflict

commit cb78f7b839a7b5fd5082f39592e3b450ce059e82
Author: wx21rph
Date: Fri Sep 11 13:21:41 2020 +0000

    Issue #94 add waves-prdgen, ICE->ICEC, Sleep in gempak script

commit b4a99e3d8b2f6a3837084ddfb08da6a667de81e0
Merge: edb913733 e61485661
Author: Lin.Gan
Date: Fri Sep 11 13:19:13 2020 +0000

    Merge branch 'feature/gfsv16b' of https://github.com/lgannoaa/global-workflow into feature/gfsv16b

commit 0a01661b9594e07381cb0039abad198d7e9eca04
Merge: 6698047df d3946f900
Author: kate.friedman
Date: Thu Sep 10 18:20:01 2020 +0000

    Update develop branch to latest version of GFSv16 implementation branch

commit 09a669c4d155d1bcd3db302b1a26c7a4a7fa8dd7
Merge: 7728ce18f 2ee26105a
Author: Kate Friedman
Date: Thu Sep 10 12:58:57 2020 -0400

    Merge pull request #127 from ilianagenkova/feature/gfsv15.3.3_EUM_bufr

    Updating gfsv15.3.3 with EUM bufr changes for OPS implementation

commit 2ee26105a6dd8e19015000bbe13b6cefeae65830
Author: Iliana Genkova
Date: Thu Sep 10 12:54:04 2020 -0400

    Clean up comments

commit 35bf161c7de338ab75bd2d61d60de833314a93f9
Author: Iliana Genkova
Date: Thu Sep 10 12:26:22 2020 -0400

    Updated sorc/checkout.sh to pick gfsda.v15.3.3 (EUM bufr changes)

commit e614856610da39345976823d29bdc32781f38c71
Author: lin.gan
Date: Thu Sep 10 16:26:11 2020 +0000

    add gfs gempak downstream jobs into def file

commit 907291d81c512968817241b883a0084b8506f2b2
Merge: 3f9695b44 d3946f900
Author: lin.gan
Date: Thu Sep 10 15:46:45 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b

commit ee4bc2b7268bf8f23ca7f7b3b631ddd6730c6a8c
Merge: 41d146fc0 d3946f900
Author: kate.friedman
Date: Wed Sep 9 08:54:17 2020 -0500

    Issue #5 - sync merge with feature/gfsv16b

commit 6b38e648f244afbba9590ea1a4de70d4ce2ed119
Author: jessica.meixner
Date: Tue Sep 8 22:14:13 2020 +0000

    worked on wcoss

commit edb913733023b1025786901aee895d2dbca5edc7
Author: Lin.Gan
Date: Tue Sep 8 16:09:07 2020 +0000

    ecflow script rename after redesign approved - not including all wave jobs

commit d3946f9006f433e0a6b2f459aeaa7fd8bf800aff
Author: russ.treadon
Date: Tue Sep 8 14:46:39 2020 +0000

    Issue #1: clean up DA sections of link_fv3gfs.sh

commit 986f8f0b9858f61d32d337a67efa966a7948aa94
Author: jessica.meixner
Date: Fri Sep 4 15:42:25 2020 +0000

    fix from Bhavani for having first wave grib file be set as a forecast instead of analysis

commit 0844b0043c937925c5ad76b4fad03308483d49b3
Author: jessica.meixner
Date: Fri Sep 4 15:42:25 2020 +0000

    fix from Bhavani for having first wave grib file be set as a forecast instead of analysis

commit c1d5c45c13b46964d9d2479962b8c56307a6ad11
Author: jessica.meixner
Date: Fri Sep 4 15:42:25 2020 +0000

    fix from Bhavani for having first wave grib file be set as a forecast instead of analysis

commit 3f9695b44deef4790ae443f4d0c1844b1385374d
Author: Lin.Gan
Date: Thu Sep 3 19:07:35 2020 +0000

    ecflow gfs v16 nco review 3

commit 256885283bb42c2309e820ca53b44042b8223617
Author: jessica.meixner
Date: Thu Sep 3 17:44:47 2020 +0000

    updates to split boundary points plus saving config file updates

commit 8a203e5d5afa92a2b925f69efa5237a2c0d3dca6
Merge: 40c689b5d 26c84b3ca
Author: Lin.Gan
Date: Thu Sep 3 17:36:19 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b

commit 64409b1e9e976b75afd8a3548e7023dce534c4ae
Author: kate.friedman
Date: Thu Sep 3 15:42:23 2020 +0000

    Adding missing space to if-block in env files to resolve runtime failure

commit 0485442f5876b82de73c47429d3a70f58f0aa263
Merge: 26c84b3ca 2006bbcb1
Author: kate.friedman
Date: Thu Sep 3 15:37:50 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b-wave' into feature/gfsv16b-wave

    * upstream/feature/gfsv16b-wave:
      adding pnt jobs as separate jobs for env
      moving definitions of wavempexec and wave_mpmd from jobs to env
      lowering the resource requirement for wave prep job
      update module for cdo
      updates to resources for wave jobs
      Fixing wavepostbndpnt dependency in setup_workflow_fcstonly.py
      cleaning up wave point post scripts
      rename wave post script
      remove gridded so can rename
      updates to boundary point
      Remove extra space from line
      Fix wavepostbndpnt dependency and increase wavepostbntpnt and wavepostpnt walltimes
      fix typos
      updating WCOSS work around for CDO, CDO_ROOT is missing from module file
      adding module use for cdo module on wcoss dell
      updates for new weights file and adding cdo module for wave prep
      Adding new wavepostbndpnt and wavepostpnt jobs
      updating wave post scripts for restructured format
      updates to JJOBs for wave post point for new structure
      changing exit 0 to exit number for FATAL errors in ice prep for waves
      updating error message and exit if there is no current input file
      update WAVE_PREP so that currents do not check for previous 24 hour so that parallels will be reproducible also deleted unused variable
      renaming wave job and scripts
      update jobs for waves
      Issue #94 - pull in two fixes for wave job setup in free-forecast mode
      updates for wave scripts to split them, also added updates from GEFS branch to add extra error checking
      updates from gefs post workflow: updates for re-run case as well as extra error messages

commit 26c84b3ca81dec965ab31d00409b537f61174dc1
Author: russ.treadon
Date: Thu Sep 3 14:06:29 2020 +0000

    Issue #1: remove pgrb2b.0p25 dependency from gfsawips in setup_workflow.py

commit 40c689b5d5886d9a4bed92e0c6262bfc5988836d
Author: Lin.Gan
Date: Thu Sep 3 02:00:09 2020 +0000

    ecflow gfsv16 redesign 2

commit 609e157c2477b662b55bbf3d53999cdd30856bc8
Author: kate.friedman
Date: Wed Sep 2 20:35:28 2020 +0000

    Issue #1 - update WAFS tag to gfs_wafs.v6.0.4 and remove HOURLY variable from WAFS configs

commit b0bebad6258bd81c0b0c89b559a04d9b059c838e
Author: kate.friedman
Date: Wed Sep 2 20:08:01 2020 +0000

    Issue #1 - change wavegempak and waveawipsgridded dependency to match waveawipsbulls and start when wavepostsbs is complete

commit 084c89e3d562f8c9977eaa17624646be388f7854
Author: russ.treadon
Date: Wed Sep 2 19:58:37 2020 +0000

    Issue #1: set n_sponge=42 in gfs section of config.fcst

commit 2006bbcb1e54e3cd5ae0ff2826b6d0b211c1d013
Author: Jessica.Meixner
Date: Wed Sep 2 19:22:38 2020 +0000

    adding pnt jobs as separate jobs for env

commit c61cd59488b64ead77e13b2eb5faeb383d052ea0
Author: Jessica.Meixner
Date: Wed Sep 2 19:02:21 2020 +0000

    moving definitions of wavempexec and wave_mpmd from jobs to env

commit 68ed2d670ebe25713554f2165a0bbc7c35fb5030
Merge: 4e7975d26 70abda260
Author: jessica.meixner
Date: Wed Sep 2 17:45:26 2020 +0000

    Merge remote-tracking branch 'EMC/feature/gfsv16b' into feature/gfsv16b-wave

    Conflicts:
      parm/config/config.resources
      ush/rocoto/setup_workflow.py
      ush/rocoto/setup_workflow_fcstonly.py

commit 70abda260dc9ab47533d25cd570dd055c4644123
Merge: 0a31d568b 52e987ac6
Author: Kate Friedman
Date: Wed Sep 2 10:28:22 2020 -0400

    Merge pull request #122 from KateFriedman-NOAA/feature/gfsv16b_wavedown

    Add downstream wave jobs to GFSv16

commit 0a31d568bdacd0fb2d8bc6e1fddb09d2c72d014c
Author: kate.friedman
Date: Wed Sep 2 14:09:57 2020 +0000

    Remove unneeded settings from config.post

commit 52e987ac67477932e5b034321cf52f1cc756081c
Author: kate.friedman
Date: Wed Sep 2 13:05:42 2020 +0000

    Added null DBNROOT to wave awips configs

commit b3c7b7379a6ebb53b4d4533b894b1bf36df3c2fd
Merge: 074f27824 add6ea0ba
Author: kate.friedman
Date: Wed Sep 2 13:02:41 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b_wavedown

commit 806a57838b2a6fd8a9df993fd6cd8debf9fb659a
Author: Jessica.Meixner
Date: Wed Sep 2 12:28:43 2020 +0000

    add extra script for by hour points for waves

commit ab3e0b196bbb147da99da4ef34710e8d5fbb68ea
Merge: 054f8c536 4e7975d26
Author: Jessica.Meixner
Date: Wed Sep 2 11:40:04 2020 +0000

    Merge branch 'feature/gfsv16b-wave' of github.com:JessicaMeixner-NOAA/global-workflow into feature/gfsv16b-wave

commit 054f8c53608689b00b32d035ce728ed3d8cb3ad9
Author: Jessica.Meixner
Date: Wed Sep 2 11:39:28 2020 +0000

    updates for boundary points by hour parallelization

commit 4e7975d265857e4e634e0c7352f6ed2de1a99336
Author: jessica.meixner
Date: Tue Sep 1 22:45:05 2020 +0000

    lowering the resource requirement for wave prep job

commit add6ea0bae2c56bba2e3d76460771095b98786b1
Author: russ.treadon
Date: Tue Sep 1 19:40:10 2020 +0000

    Issue #1: rename enkf_chgres_recenter executables in accordance with WCOSS Implementation Standards

commit 8a7e8d9944b8727da91bfb28bbfe44958f0aeb65
Merge: 4c9978a8b 595edcfaa
Author: Kate Friedman
Date: Tue Sep 1 15:00:08 2020 -0400

    Merge pull request #121 from KateFriedman-NOAA/feature/gfsv16b-wafs

    Adding WAFS jobs to GFSv16

commit 4c9978a8b0a98527d7c78a3da0a0a55e39cefed0
Author: fanglin.yang
Date: Tue Sep 1 18:48:44 2020 +0000

    modified: checkout.sh to use gldas_gfsv16_release.v1.6.0

commit 595edcfaaf1372889c8d569e01ec4dfefb51c469
Merge: 8a593c8df 06bda7dec
Author: kate.friedman
Date: Tue Sep 1 17:30:08 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b-wafs

commit ada0fb3155a57bdba074c84e1ab3d73eabb0e2f4
Author: jessica.meixner
Date: Tue Sep 1 17:18:27 2020 +0000

    update module for cdo

commit 074f2782438471740453e356b856a4756e59c858
Merge: a1a3d69ba 8a593c8df
Author: kate.friedman
Date: Tue Sep 1 14:30:56 2020 +0000

    Sync with WAFS branch

commit a1a3d69ba40564f550053b99b4328ba41c7cc45d
Merge: 249b6ef76 06bda7dec
Author: kate.friedman
Date: Tue Sep 1 14:22:20 2020 +0000

    Sync merge with feature/gfsv16b

commit 8a593c8df03d8a879c5488fbd671fef94f5d867d
Author: kate.friedman
Date: Tue Sep 1 14:09:49 2020 +0000

    Update to WAFS tag and added SENDDBN_NTC to both base configs

commit 5270ed831c0b3c428736cbd8a789a4d3cfdcc0fe
Merge: 95ec3329e eabda84f9
Author: kate.friedman
Date: Tue Sep 1 14:03:53 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b-wafs

commit 06bda7dec74831951034b72c83abbb2d994ee539
Author: fanglin.yang
Date: Tue Sep 1 14:02:44 2020 +0000

    modified: link_fv3gfs.sh to not link or copy 0readme fix_chem fix_fv3 fix_sfc_climo, which are not used by GFS.v16 and are of large size

commit eabda84f97c11251f672d95bb604ad0c50e8da18
Author: fanglin.yang
Date: Tue Sep 1 13:48:11 2020 +0000

    modified: link_fv3gfs.sh to remove chgres_cube.fd and chgres_cube.fd in sorc/link_fv3gfs.sh

commit 223492bf6c56c0491c0034dc1d4a0a3e6e97a607
Author: fanglin.yang
Date: Tue Sep 1 03:59:53 2020 +0000

    modified: link_fv3gfs.sh to allow "fix" directories to be removed before rerunning link_fv3gfs.sh for RUN_ENVIR=nco case

commit 95ec3329e72a54b0dabdccdecd467972c4914dcd
Author: kate.friedman
Date: Mon Aug 31 18:57:32 2020 +0000

    Added WAFS jobs to free-forecast mode, updates for extending WAFS to fh120, and two bug fixes in link_fv3gfs.sh and hpssarch_gen.sh

commit 6f660d9a9a5758f3d74f87f170a95a26ca0c0e6a
Merge: 8da419164 3282a8996
Author: kate.friedman
Date: Mon Aug 31 17:58:55 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16b-wafs

commit 3282a8996ab0510699932a53897b87730b2cc492
Author: kate.friedman
Date: Mon Aug 31 17:47:37 2020 +0000

    Renamed global-workflow-owned ex-scripts to remove ecf extension and updated other scripts which call those ex-scripts

commit 9cde3a41f34d0c2618e8fb06ff1276cec5378ef5
Author: kate.friedman
Date: Fri Aug 28 18:46:57 2020 +0000

    Remove UFS_UTILS ecf extensions

commit f7a92dbc1c3ad822d3b8e567967e91384631288d
Author: russ.treadon
Date: Fri Aug 28 18:09:26 2020 +0000

    Issue #1: (1) update earc.sh directory removal to be consistent with arch.sh, (2) update config files to be consistent with EMC real-time GFS v16 parallel

commit 249b6ef761fb41b5d102b282fccedeb827192dff
Author: kate.friedman
Date: Fri Aug 28 16:52:54 2020 +0000

    Remove ecf script name extensions from downstream wave scripts

commit 004a29f39204314159d2cbd028950cfaba07cf33
Author: kate.friedman
Date: Fri Aug 28 16:50:03 2020 +0000

    Add new downstream wave jobs to workflow

commit 8da4191644ff55f46e8f394e4e1d29641d3c117c
Merge: 8f97726a4 6319bda47
Author: kate.friedman
Date: Fri Aug 28 16:42:11 2020 +0000

    Sync merge with feature/gfsv16b

commit 8f97726a4170d56dcaf6732ae141fae9aaebfd8d
Author: kate.friedman
Date: Fri Aug 28 16:39:50 2020 +0000

    Add new downstream WAFS jobs

commit 6319bda47548873acb8859d9a47d2ac6a6ef4c67
Author: kate.friedman
Date: Fri Aug 28 15:41:55 2020 +0000

    Issue #1 - update GLDAS and UPP workflow files for removal of ecf script extension

commit 8f7615c76f76611d748246a80d9286de7c4d3295
Author: russ.treadon
Date: Thu Aug 27 20:53:25 2020 +0000

    Issue #1: remove ".ecf" suffix from DA scripts referenced in sorc/link_fv3gfs.sh

commit af754d1fa0aa4a521fcd5b4fc04eaa88d475dbff
Author: russ.treadon
Date: Thu Aug 27 20:47:14 2020 +0000

    Issue #1: remove ".ecf" extension from DA exscripts (as per WCOSS Implementation Standards) referenced from parm/config files

commit a6e91b48b7024c83491fd1f93c6d7df8e3301af5
Author: jessica.meixner
Date: Thu Aug 27 20:25:45 2020 +0000

    updates to resources for wave jobs

commit f0b9a98973b7983eadab51c7009723b5d0b4a564
Author: russ.treadon
Date: Wed Aug 26 23:04:02 2020 +0000

    Issue #1: update name of DA enkf chgres script in config.echgres

commit 01f33c498bf0b5c0d48486cd04dab9c93c721906
Author: russ.treadon
Date: Wed Aug 26 20:58:29 2020 +0000

    Issue #1: Rename DA enkf chgres job and script as per EE2 guidance

commit 7d3f37819aba17dc794d10f7453061802779b435
Author: kate.friedman
Date: Wed Aug 26 18:35:59 2020 +0000

    Workflow changes for wave gempak and awips downstream jobs

commit 485632eafe37c1866011af1bd0a601fb1c7adc4c
Merge: 2946baad8 e191fd4bb
Author: Jessica Meixner
Date: Wed Aug 26 11:49:40 2020 -0400

    Merge pull request #3 from KateFriedman-NOAA/feature/gfsv16b-wave

    Fixing wavepostbndpnt dependency in setup_workflow_fcstonly.py

commit e191fd4bb87af787feab486bf072fe4dce7f12f4
Author: kate.friedman
Date: Wed Aug 26 15:47:49 2020 +0000

    Fixing wavepostbndpnt dependency in setup_workflow_fcstonly.py

commit 4641a72e1ced493c7ec5d837a3093908e388b5df
Merge: a1b08c928 25ae4899c
Author: Kate Friedman
Date: Wed Aug 26 11:33:23 2020 -0400

    Merge pull request #119 from GeorgeGayno-NOAA/feature/gfsv16b

    enkf_chgres_recenter_nc.fd - Remove hard-coded vertical levels.

commit 25ae4899cee3dc3aeff1e36f0ac1d0b0773ca2f9
Author: George Gayno
Date: Wed Aug 26 14:25:36 2020 +0000

    feature/gfsv16b: Add Cory's updates to build 'enkf_chgres_recenter_nc' on Cray. Issue #102.

commit 2946baad88d908b32793f542cab42f0c89adf009
Author: jessica.meixner
Date: Wed Aug 26 13:44:06 2020 +0000

    cleaning up wave point post scripts

commit 3590e9a45e2b13b22ff3c3d4827907ddcf55eebf
Merge: 727c3ffde a1b08c928
Author: George Gayno
Date: Tue Aug 25 20:20:17 2020 +0000

    feature/gfsv16b: Merge updates from authoritative feature/gfsv16b branch. Issue #112.

commit a1b08c92892278453f25c2530f8b8fb968af9a34
Merge: 45ad029ba ce3816d3e
Author: Kate Friedman
Date: Tue Aug 25 16:01:57 2020 -0400

    Merge pull request #116 from RobertoPadilla-NOAA/feature/gfsv16_wave_prdgen

    Feature/gfsv16 wave prdgen

commit ce3816d3e88b8d2de7472a214e66cb54aceca9a6
Merge: b0c916106 45ad029ba
Author: wx21rph
Date: Tue Aug 25 19:55:53 2020 +0000

    Merge remote-tracking branch 'upstream/feature/gfsv16b' into feature/gfsv16_wave_prdgen
    Issue #94 Adding prdgen for waves

commit 90181842fbb7a22479f12d3a67bcdc8fa4b8f669
Author: jessica.meixner
Date: Tue Aug 25 19:02:22 2020 +0000

    rename wave post script

commit 1f15be0a8207d3117200bcc145a5ae66b7445ec3
Author: jessica.meixner
Date: Tue Aug 25 19:01:41 2020 +0000

    remove gridded so can rename

commit 1fc5b368fe0cd438e2202b62de7753b7efed7af6
Author: Jessica.Meixner
Date: Tue Aug 25 19:00:20 2020 +0000

    updates to boundary point

commit f0e3ae5c2cd441673a4a670618e87f050a12c9e9
Merge: 98d6c1633 b74b21e71
Author: Jessica Meixner
Date: Tue Aug 25 14:15:57 2020 -0400

    Merge pull request #2 from KateFriedman-NOAA/feature/gfsv16b-wave

    Feature/gfsv16b wave

commit b74b21e71edbb62207ccfcafe4c6ed5b6d301cfe
Author: kate.friedman
Date: Tue Aug 25 18:14:15 2020 +0000

    Remove extra space from line

commit 25e2cc4596213b39d2a3dc3ee8afda4a86e10d38
Author: kate.friedman
Date: Tue Aug 25 18:12:04 2020 +0000

    Fix wavepostbndpnt dependency and increase wavepostbntpnt and wavepostpnt walltimes

commit 45ad029bad4367961e5eff987f27226b0ac78d5a
Author: russ.treadon
Date: Tue Aug 25 13:40:45 2020 +0000

    Issue #1: several minor changes
    * Externals.cfg and sorc/checkout.sh: update to tag verif_global_v1.10.1
    * jobs/rocoto/arch.sh: update ARCDIR cp to use new path to tracker output
    * scripts/run_gfsmos_master.sh.dell: remove module purge from mos driver script

commit 0a85a36ef0784bebf96d2f97c24dd4c642ac23f1
Merge: d5861551a 5190efb11
Author: Kate Friedman
Date: Tue Aug 25 09:20:42 2020 -0400

    Merge pull request #118 from GuangPingLou-NOAA/feature/gfsv16b

    Issue #117 Modify Bufr station Hilo's grid point representation

commit 5190efb119c6eec6a7c67d90327dee4666307541
Author: Guang.Ping.Lou
Date: Tue Aug 25 12:30:45 2020 +0000

    Issue #117 Modify Bufr station Hilo's grid point representation

commit 98d6c16336e5b0a3241174c54b7631667a4f919d
Merge: 0eacf5e21 32bd57ef2
Author: Jessica.Meixner
Date: Mon Aug 24 22:27:39 2020 +0000

    Merge branch 'feature/gfsv16b-wave' of github.com:JessicaMeixner-NOAA/global-workflow into feature/gfsv16b-wave

commit 0eacf5e21c3149d2d922b4c2495e48fb2355e639
Author: Jessica.Meixner
Date: Mon Aug 24 22:27:28 2020 +0000

    fix typos

commit b0c9161061302ccf4b890cceabe5b4eaecb67e61
Author: wx21rph
Date: Mon Aug 24 22:25:05 2020 +0000

    Issue #94 changing native by interpolated grids for gempak

commit 8a68d3c6d6996f389609d5961f32f0db116797d1
Author: wx21rph
Date: Mon Aug 24 21:47:44 2020 +0000

    Issue #94 removing the load of modulefiles and sending set.pdy to j-jobs

commit 32bd57ef2b390dbe73a744188a40ca4c42c306fe
Author: jessica.meixner
Date: Mon Aug 24 20:35:39 2020 +0000

    updating WCOSS work around for CDO, CDO_ROOT is missing from module file

commit a2d06bb0bf82ee48899cb26ea9abf2331b543107
Author: jessica.meixner
Date: Mon Aug 24 18:55:02 2020 +0000

    adding module use for cdo module on wcoss dell

commit e5c9f037bd82ed4bc72cf11241d7c053160ff5b9
Author: jessica.meixner
Date: Mon Aug 24 18:29:36 2020 +0000

    updates for new weights file and adding cdo module for wave prep

commit 96655fbe5b5daf130d4a9a458ce47059c047376a
Author: wx21rph
Date: Mon Aug 24 16:56:37 2020 +0000

    Issue #94 removing modulefiles load from j-jobs

commit 7f9344f777075ce3220d9a9fccba74c07e4f40e2
Author: wx21rph
Date: Mon Aug 24 14:32:26 2020 +0000

    Issue #94 Solving reviewers' comments

commit c9598a3207bb06cbd7d2619d97f723f6511da9a4
Merge: bf97710be 92a2a43db
Author: Jessica Meixner
Date: Mon Aug 24 10:12:43 2020 -0400

    Merge pull request #1 from KateFriedman-NOAA/feature/gfsv16b-wave

    Adding new wavepostbndpnt and wavepostpnt jobs

commit 92a2a43db190982944b4166b929a791a1b0b0750
Author: kate.friedman
Date: Mon Aug 24 14:06:23 2020 +0000

    Adding new wavepostbndpnt and wavepostpnt jobs

commit bf97710bece30195679740b0de2b32e0adc6c3c6
Author: Jessica.Meixner
Date: Fri Aug 21 00:20:56 2020 +0000

    updating wave post scripts for restructured format

commit a914521cec478667191b365e2a2a62e3553e0ff7
Author: Jessica.Meixner
Date: Thu Aug 20 23:52:47 2020 +0000

    updates to JJOBs for wave post point for new structure

commit 1f195f8d432072f4e4f66406ea6b09256a185d2b
Author: Jessica.Meixner
Date: Thu Aug 20 15:20:28 2020 +0000

    changing exit 0 to exit number for FATAL errors in ice prep for waves

commit f836a7869dbb88badd93fcd9d08314284eee4f96
Author: Jessica.Meixner
Date: Thu Aug 20 15:17:28 2020 +0000

    updating error message and exit if there is no current input file

commit 529cc3692f4fc003295584e761279be3c9405b68
Author: Jessica.Meixner
Date: Thu Aug 20 14:50:50 2020 +0000

    update WAVE_PREP so that currents do not check for previous 24 hour so that parallels will be reproducible also deleted unused variable

commit 57c4258ba73778fe95118ebfae98c4bbf012ae37
Author: Jessica.Meixner
Date: Thu Aug 20 14:09:22 2020 +0000

    renaming wave job and scripts

commit 1626cb2f90bfea5663f198cfc2ac2916a9200e1a
Merge: da2922163 d5861551a
Author: Jessica.Meixner
Date: Thu Aug 20 14:02:54 2020 +0000

    Merge remote-tracking branch 'EMC/feature/gfsv16b' into feature/gfsv16b-splitwavepost

commit da2922163922f4b4b14dba78dba36e9eced3a2c5
Author: Jessica.Meixner
Date: Thu Aug 20 13:50:15 2020 +0000

    update jobs for waves

commit a1c456f267ddf701aa47ad33fd231a2f1ae0bf99
Author: wx21rph
Date: Wed Aug 19 20:09:47 2020 +0000

    Issue #94 renaming the J-jobs

commit 41d146fc0fda97a41ef829ecff41b005bbf21740
Merge: c2a4f04e5 d5861551a
Author: kate.friedman
Date: Wed Aug 19 13:54:13 2020 -0500

    Issue #5 - sync merge with feature/gfsv16b after COMPONENT update

commit 036617c8a9b5da765b82c276f7b087aa4e16d679
Author: wx21rph
Date: Wed Aug 19 14:24:01 2020 +0000

    Issue #94 Updating the modulefiles for waves-prdgen

commit d5861551ae5b1fe2c2b00802159a42f7fde06357
Merge: 9b923ac30 0e0699aac
Author: Kate Friedman
Date: Wed Aug 19 09:49:08 2020 -0400

    Merge pull request #109 from NOAA-EMC/feature/gfsv16b-restructure

    GFSv16 restructure for new $COMPONENT subfolder

commit 1e76093c8554c3ccb87a7ed243f738f5763ca5da
Author: wx21rph
Date: Wed Aug 19 13:47:44 2020 +0000

    Issue #94 adding waves-prdgen to gfsv16

commit 0e0699aac7cb00988c999546dcdf3a52b11706e3
Author: kate.friedman
Date: Wed Aug 19 13:47:29 2020 +0000

    Issue #94 - fix to messed up shebang in setup_expt_fcstonly.py

commit c7a708fd54ffe60c6259f35c4230e6e73ecd6f77
Author: wx21rph
Date: Tue Aug 18 21:16:19 2020 +0000

    Issue #94 adding waves-prdgen for gfsv16

commit 0b2744e53a35c570178b7e6cd7ede72b6255c579
Author: Kate.Friedman
Date: Fri Aug 14 17:36:31 2020 +0000

    Issue #94 - pull in two fixes for wave job setup in free-forecast mode

commit 4888001e76d24277b3a75d1d520945383f49d456
Author: Jessica.Meixner
Date: Fri Aug 14 18:55:40 2020 +0000

    updates for wave scripts to split them, also added updates from GEFS branch to add extra error checking

commit 00dc1973010b68e65e176c90a1eeaef359b7beae
Author: Kate.Friedman
Date: Fri Aug 14 17:36:31 2020 +0000

    Issue #94 - pull in two fixes for wave job setup in free-forecast mode

commit c2a4f04e50ccb45febf55f7985c64a287973dee0
Author: kate.friedman
Date: Fri Aug 14 08:21:01 2020 -0500

    Issue #5 - update for COMINsyn on Orion

commit fe849e918b30e6a9ca22ebe1a81e350afc4ed6cf
Merge: 0c06cda7d 9b923ac30
Author: kate.friedman
Date: Thu Aug 13 18:49:52 2020 +0000

    Issue #94 - Sync merge branch 'feature/gfsv16b' into feature/gfsv16b-restructure

commit 9b923ac308841d7accfc6ed2a4ef9733dce2305e
Author: kate.friedman
Date: Thu Aug 13 18:37:51 2020 +0000

    Issue #1 - update to prod_util version (1.1.3 -> 1.1.4) on WCOSS-Dell for post P1/P2 removal

commit bf0bff444bdd8a02e1c875eadbdc6b60628cff76
Author: kate.friedman
Date: Wed Aug 12 10:00:51 2020 -0500

    Issue #5 - update to JGFS_CYCLONE_TRACKER for machine=ORION

commit 0c06cda7dd667e936e8aa71fc1c584a4839094ba
Author: kate.friedman
Date: Mon Aug 10 18:50:44 2020 +0000

    Issue #94 - update CFP module version on WCOSS-Dell

commit b73d25224d8208291048b774a91497928b051713
Author: kate.friedman
Date: Mon Aug 10 17:58:55 2020 +0000

    Issue #94 - update WCOSS-Dell module versions to prod_util and EnvVars

commit 485ee56c1e772e8efa944ee9a3f350130d49a718
Author: kate.friedman
Date: Mon Aug 10 13:00:42 2020 +0000

    Issue #94 - fix AWIPS file dependency

commit 5a7e45e398efafb4b5b6cde0a69f799a99406d34
Author: kate.friedman
Date: Fri Aug 7 09:48:45 2020 -0500

    Issue #5 - add missing AND condition for wavepostsbs

commit bdaa28fbdc2372ed85320a4b044a33d350e2e7be
Author: kate.friedman
Date: Fri Aug 7 13:34:22 2020 +0000

    Issue #94 - updates for FSU tracker

commit 12f16a874f129106bc7b0aa1f7485c07a0508035
Author: Judy.K.Henderson
Date: Fri Aug 7 00:21:32 2020 +0000

    save field_table_gsd as a file instead of link

commit 86e39906eadf3dca87d44861752a5c881dfb41a9
Author: Judy.K.Henderson
Date: Fri Aug 7 00:08:19 2020 +0000

    updated diag tables for GSL
    moved 4D variables to gfs_dyn ( refl_10cm,nwfa,nifa,qc_bl,cldfra_bl,el_pbl,qke )

commit e8800078f8d00e9447ed81b83eb49135110deb8f
Author: kate.friedman
Date: Thu Aug 6 14:31:05 2020 -0500

    Issue #5 -
change UFS_UTILS checkout back to auth repo release/ops-gfsv16 branch after commit to it commit df20d848de2420eacba296ebdf9c9fe68d697351 Author: kate.friedman Date: Wed Aug 5 15:46:27 2020 -0500 Issue #5 - fix for errant text in setup_workflow_fcstonly.py for waveinit job settings commit 4cee6f65467c34e171ed3e0bdc72458a875e5e21 Author: kate.friedman Date: Wed Aug 5 17:14:04 2020 +0000 Issue #94 - adding -o flag to checkout.sh for optional GTG checkout with EMC_post commit 563d2931378c3d6276212cc9c4e2a4e9a286589c Author: kate.friedman Date: Tue Aug 4 18:19:47 2020 +0000 Issue #94 - small fix to updated UFS_UTILS part of link_fv3gfs.sh commit 7c4f8eab074115323a7812dc498e8ea043916718 Author: kate.friedman Date: Tue Aug 4 08:53:39 2020 -0500 Issue #5 - add new APRUNCFP setting to ORION.env commit 5652485ef0cc42f8814df9d08e8b953b025d08b4 Merge: 5cf8a458e d1751309f Author: kate.friedman Date: Mon Aug 3 13:10:27 2020 -0500 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: Issue #1: refactor CFP in DA sections of HERA.env and WCOSS_DELL_P3.env Issue #1: add enkf member sfcf006.nc to enkf tarballs Revert "modified: config.base.emc.dyn to use obsproc_prep.iss70457.netcdfhistory_atmos" modified: config.base.emc.dyn to use obsproc_prep.iss70457.netcdfhistory_atmos commit 5cf8a458e3cea88ce628f9ae1f5c7f1e47d0cdbe Author: kate.friedman Date: Mon Aug 3 13:09:17 2020 -0500 Issue #5 - add setup script settings for HPSSARCH to turn to NO on Orion by default commit 206797c0d350d9ee7b489c9b7ff3afd4c8d3ca38 Author: kate.friedman Date: Mon Aug 3 09:42:14 2020 -0500 Issue #5 - GLDAS tag update, rocoto module update, UFS_UTILS branch and build update commit 8a1fea478bcfdec3ad436e77fdfe8c601a050c77 Author: kate.friedman Date: Thu Jul 30 18:27:57 2020 +0000 Issue #94 - move UFS_UTILS link script step from build_ufs_utils.sh to link_fv3gfs.sh based on feedback from GEFS team commit f54d938ac8faf3c5d0362770b645dcef57b43c30 Author: kate.friedman Date: Wed Jul 29 
14:28:03 2020 +0000 Issue #94 - reverting GSI and EMC_gfs_wafs checkouts back to authoritative repos ahead of their update; will update EMC_gfs_wafs to new tag when available commit d3866af6cc8060f88b3bd780b7f6a6f52a8ecba0 Author: kate.friedman Date: Wed Jul 29 13:39:07 2020 +0000 Issue #94 - add default to fhmax in vrfy.sh for when prior tasks disabled and set new path to restructured gfsmos commit 9c2791ed037d62f35aa33875c20de8ac3f823337 Merge: e8a714b2f d1751309f Author: kate.friedman Date: Mon Jul 27 14:44:59 2020 +0000 Issue #94 - Sync merge branch 'feature/gfsv16b' into feature/gfsv16b-restructure commit e8a714b2fa20cce87ed018b3a9215bef37ee0d10 Author: kate.friedman Date: Mon Jul 27 14:37:34 2020 +0000 Issue #94 - retire BASE_SVN commit 5867d4b3de7b6ee7d8f913d573b4a0f44d973320 Author: Jessica.Meixner Date: Fri Jul 24 15:00:41 2020 +0000 updates from gefs post workflow: updates for re-run case as well as extra error messages commit 03daf01d440e702deccfcd246db069783f2c513d Author: kate.friedman Date: Fri Jul 24 14:14:25 2020 +0000 Issue #94 - removing new install script, will introduce via another branch, needs more testing commit 9c0765cec06ea0f80528cba722884a4971beb345 Author: kate.friedman Date: Fri Jul 24 14:08:36 2020 +0000 Issue #94 - add to gempak scripts and cleanup based on PR feedback commit fb70d6fb7a2407bd17c2c587224e66401a6cf1c0 Author: kate.friedman Date: Thu Jul 23 14:35:56 2020 +0000 Issue #94 - updating GLDAS and EMC_post tags in Externals.cfg commit 3bfdc01419b1ea8748f8ce7ebcf00ac6b520f8d7 Author: kate.friedman Date: Thu Jul 23 14:34:28 2020 +0000 Reverting link_fv3gfs.sh, will commit in different changeset commit d1751309fc1999cdc4fdbc603e429ee61fbcf38c Author: russ.treadon Date: Thu Jul 23 11:20:15 2020 +0000 Issue #1: refactor CFP in DA sections of HERA.env and WCOSS_DELL_P3.env commit 821c93cae54d9fa24887cbe2072df4f63aeadb7d Merge: 95ce82355 145f81056 Author: kate.friedman Date: Tue Jul 21 17:15:43 2020 +0000 Issue #94 - Sync
merge branch 'feature/gfsv16b' into feature/gfsv16b-restructure commit 95ce823552c20aa15426c4c98feb3c7d147ec89e Author: kate.friedman Date: Tue Jul 21 14:10:09 2020 +0000 Issue #94 - new GLDAS tag with atmos directory commit 145f81056802cb7246a50f7951c2e191a994032f Author: russ.treadon Date: Mon Jul 20 21:09:40 2020 +0000 Issue #1: add enkf member sfcf006.nc to enkf tarballs commit ab89088d3a59ee5d5389fa8e527e0444692c1542 Author: kate.friedman Date: Mon Jul 20 19:07:02 2020 +0000 Issue #94 - component checkout updates to use forks and new UPP tag commit 4eec6641424d3947ae792f632f9b401bb569173a Author: kate.friedman Date: Mon Jul 20 13:31:23 2020 +0000 Issue #94 - fix dirpath in ush/hpssarch_gen.sh commit 727c3ffdeee5b77e65b6665b2cf34b6a30736bbc Author: George Gayno Date: Fri Jul 17 18:27:26 2020 +0000 feature/gfsv16b This commit references #102. Some cleanup to enkf_chgres_recenter_nc. commit d0378245016ba506b9e57a35dcaed8ce10830a9a Author: George Gayno Date: Fri Jul 17 15:59:59 2020 +0000 feature/gfsv16b This commit references #102. Initial updates to enkf_chgres_recenter_nc to output any number of vertical levels. Turn off horizontal interpolation if input and output grids are the same. 
commit 273c5353bc310b22459fe605662c07922a41e4f4 Author: kate.friedman Date: Thu Jul 16 17:52:43 2020 +0000 Issue #94 - revert RSTDIR_WAVE if-block compression to retain breakpoint restarting commit 3d9f0dc81a63b5d1b54981314fac36a82119e811 Author: kate.friedman Date: Thu Jul 16 17:19:21 2020 +0000 Issue #94 - remove COMPONENTatmos and COMPONENTwave, use COMIN[COMOUT]atmos[wave] instead commit 19d3f72b5f6f1ec145cf95e47c1fe9ca970d40cb Author: kate.friedman Date: Thu Jul 16 16:05:56 2020 +0000 Issue #94 - updates to wave scripts based on NCO feedback commit 0a8e95bce500552b0ed85b86c43d35dd298eccc9 Author: kate.friedman Date: Thu Jul 16 10:55:29 2020 -0500 Issue #5 - reverted back to GLDAS feature/orion_port after committing build fix to it commit 8e906c4cd1cce7d2b626ae7a0c6a799406190d21 Author: kate.friedman Date: Thu Jul 16 10:00:59 2020 -0500 Issue #5 - updates to GLDAS version, module_base.orion, and ORION.env commit 0056b413a65aea9700561c5a9b07946fb34be6b5 Merge: 987d8cf56 d7425d02b Author: kate.friedman Date: Thu Jul 16 13:35:13 2020 +0000 Issue #94 - Sync merge branch 'feature/gfsv16b' into feature/gfsv16b-restructure commit 987d8cf5621e8c302d0f9a7941f4d74222fbf218 Author: kate.friedman Date: Wed Jul 15 17:50:34 2020 +0000 Issue #94 - added and [wave] back in after feedback from Steven Earle, also renamed the wave scripts to match the correct convention commit 49825b3b93080b70b56752477a9406e1765ecdf3 Author: BoiVuong-NOAA Date: Tue Jul 14 23:04:08 2020 +0000 GitHub Issue#94 update gempak's ush scripts commit d7425d02bb9027fe3ea5ceb31d515d70dddb4ec4 Author: fanglin.yang Date: Tue Jul 14 19:50:54 2020 +0000 Revert "modified: config.base.emc.dyn to use obsproc_prep.iss70457.netcdfhistory_atmos" This reverts commit 49772711c37d74740427c241a477afe21ee62d7c. 
commit 49772711c37d74740427c241a477afe21ee62d7c Author: fanglin.yang Date: Tue Jul 14 19:45:32 2020 +0000 modified: config.base.emc.dyn to use obsproc_prep.iss70457.netcdfhistory_atmos commit c957b86ebf9889f88d39e875e53d3b060a40b10e Author: kate.friedman Date: Tue Jul 14 11:38:18 2020 -0500 Issue #5 - fix updated lmod path and add wavepostsbs to Orion.env if-block for CFP_MP commit a8812e08bfde823fdff54522103322a29d0cf8ed Author: kate.friedman Date: Tue Jul 14 10:36:56 2020 -0500 Issue #5 - update gitignore for new scripts added recently commit 94ed497a3395e605cd3feb0996dae5d9eb346243 Author: kate.friedman Date: Tue Jul 14 10:33:51 2020 -0500 Issue #5 - add updated cdo module to module_base.orion commit 0f92e24c7b0988b07cc394f391008144ab768f11 Merge: 01a64acc8 8b0f57fdf Author: kate.friedman Date: Tue Jul 14 10:11:53 2020 -0500 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: modified: ../Externals.cfg and checkout.sh to check out UPP tag upp_gfsv16_release.v1.0.10 and model tag GFS.v16.0.10 Updated the algorithm used to compute CAPE and CIN in UPP. The computation is now bounded from the surface up to 1 hPa instead of the model top to avoid producing erroneous large CAPE and CIN values. Issue #1 - add mod_icec.sh to UPP ush script symlinking in link_fv3gfs.sh commit 8b0f57fdfd4a8710e176f0035356fb01d781dd70 Author: fanglin.yang Date: Mon Jul 13 23:43:55 2020 +0000 modified: ../Externals.cfg and checkout.sh to check out UPP tag upp_gfsv16_release.v1.0.10 and model tag GFS.v16.0.10 Updated the algorithm used to compute CAPE and CIN in UPP. The computation is now bounded from the surface up to 1 hPa instead of the model top to avoid producing erroneous large CAPE and CIN values.
commit 37e0f57d948fa98c66fcc9d9695de2769a4a38d2 Author: kate.friedman Date: Mon Jul 13 17:26:52 2020 +0000 Issue #1 - add mod_icec.sh to UPP ush script symlinking in link_fv3gfs.sh commit 663542ea2219eae322437058693cae3ade14e125 Author: kate.friedman Date: Fri Jul 10 20:01:17 2020 +0000 Issue #94 - fix dependencies with in the path commit 866c4c4cf1a2cd177fed81396ddb40e657e8d55b Author: kate.friedman Date: Fri Jul 10 15:22:33 2020 +0000 Issue #94 - convert variables into hard-coded values per feedback commit 01a64acc8587669869411b566674a2be09583f74 Merge: 5b6ad0874 d57fc0280 Author: kate.friedman Date: Thu Jul 9 12:18:12 2020 -0500 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: modified: checkout.sh to checkout model tag GFS.v16.0.9 Issue #1: set g2o1_obtype_conus_sfc in config.metp to "ONLYSF ADPUPA" Issue #1: update sorc/checkout.sh to checkout verif_global_v1.9.0 further updated exglobal_fcst_nemsfv3gfs.sh to use restart_wave directory for gdas cycle as well modified: jobs/JGLOBAL_FORECAST parm/config/config.wave scripts/exglobal_fcst_nemsfv3gfs.sh add WW3 break-point restart capability and clean up forecast script. Conflicts: sorc/checkout.sh commit 7728ce18f9c9783cbe139ca7c61ea719b7e8c595 Author: kate.friedman Date: Thu Jul 9 14:00:43 2020 +0000 Issue #97 - GFSv15.3.2 RFC 7036 – On WCOSS, implement GFS.v15.3.2 updates to the GSI fix file and global_satinfo.txt. This change is being made to address minimization issues in the GSI and tighten quality control for the seven CrIS water vapor channels. To be implemented on July 6 at 1400Z. commit d57fc02801232de745e1c66b6cb0b459218dc042 Author: fanglin.yang Date: Wed Jul 8 22:59:35 2020 +0000 modified: checkout.sh to checkout model tag GFS.v16.0.9 1. WW3 update: write all restart files in a sub-directory restart_wave 2. 
Port production/GFS.v16 to Orion (#129) * Update FV3 submodule * Update NEMS submodule * Add 'ulimit -s unlimited' to Orion job card template * Update NEMS submodule to point to fix_moduleinit branch in 'junwang-noaa/NEMS' * Update modulefiles/orion.intel/fv3 * Update regression test configuration on Orion (for gfs_v16) * Update NEMS submodule (change /apps/lmod/init path after Orion maintenance) * Update NEMS submodule (point to NOAA-EMC/NEMS) commit 233f9f615ab1d5cee29c45143655978112bb79fe Merge: 506e3bf4d ae7dbdb2a Author: kate.friedman Date: Wed Jul 8 19:44:40 2020 +0000 Issue #94 - Sync merge branch 'feature/gfsv16b' into feature/gfsv16b-restructure commit 506e3bf4d627093b06bc0a2d5cac63f99767742f Author: kate.friedman Date: Wed Jul 8 17:43:14 2020 +0000 Issue #94 - initial add of COMPONENTatmos variable to scripts commit ae7dbdb2a37daabaa178b8924bad22684a673d4e Merge: 6e73de0ce 58fb79548 Author: Fanglin Yang Date: Wed Jul 8 11:30:07 2020 -0400 Merge pull request #96 from NOAA-EMC/feature/gfsv16b_restart modified: exglobal_fcst_nemsfv3gfs.sh etc to enable WW3 break-point restart capability commit 58fb79548b2ac34a477b4520ca99ce32280b00fd Merge: 4a1cf2b6f 6e73de0ce Author: fanglin.yang Date: Tue Jul 7 15:58:25 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b_restart commit 6e73de0ce5cd82c9e97d8078dd1ddd253f05a258 Author: russ.treadon Date: Tue Jul 7 15:03:09 2020 +0000 Issue #1: set g2o1_obtype_conus_sfc in config.metp to "ONLYSF ADPUPA" commit 22ad770016b8a61bb672f1f76d62cb3d133f247c Author: russ.treadon Date: Tue Jul 7 13:47:26 2020 +0000 Issue #1: update sorc/checkout.sh to checkout verif_global_v1.9.0 commit 4a1cf2b6f45a76b91fbfa048def2cd647598e154 Author: fanglin.yang Date: Mon Jul 6 23:59:03 2020 +0000 further updated exglobal_fcst_nemsfv3gfs.sh to use restart_wave directory for gdas cycle as well commit 5fe1a217d28e6710502b9524d3e3cce035364c37 Author: fanglin.yang Date: Sat Jul 4 19:43:29 2020 
+0000 modified: jobs/JGLOBAL_FORECAST parm/config/config.wave scripts/exglobal_fcst_nemsfv3gfs.sh add WW3 break-point restart capability and clean up forecast script. commit df83408157ff1f01794fda14d3338161d9e2415b Author: Judy.K.Henderson Date: Wed Jul 1 17:41:11 2020 +0000 updated files in jkh directories with current versions for GSI and fv3gfs commit 788a7e692d94f4619e6f4c9b1fa99a32d3545188 Author: Judy.K.Henderson Date: Mon Jun 29 20:19:18 2020 +0000 comment out setting of io_layout in config.fcst commit 3fe6f9fc3aff9cff75aa6179cc545e635d28f312 Author: kate.friedman Date: Mon Jun 29 19:44:12 2020 +0000 Issue #94 - commit initial version of new installation script commit 30e56f2e84c6f1a66dc73db53cabc898b657374c Author: kate.friedman Date: Mon Jun 29 19:41:31 2020 +0000 Issue #94 - generalization updates to link script - change 'nco' mode to 'prod' mode - change 'emc' mode to 'dev' mode - update machine values to match target names commit 5b6ad087408533658dddbb2bb9c4cf74f1fc7c0a Merge: fef33e203 28b1faf03 Author: kate.friedman Date: Mon Jun 29 08:42:35 2020 -0500 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: modified: config.vrfy to add elif [ $machine = "HERA" ] ; then export RUNGFSMOSSH="$HOMEgfs/scripts/run_gfsmos_master.sh.hera" corrected a typo in hpssarch_gen.sh - echo "${dirname}${head}atma000.ensres${SUFFIX} " >>gdas.txt + echo "${dirname}${head}atma009.ensres${SUFFIX} " >>gdas.txt modified: config.vrfy to point a different syndat directory on Hera export COMROOTp1="/scratch1/NCEPDEV/global/glopara/com" export COMINsyn=${COMINsyn:-${COMROOTp1}/gfs/prod/syndat} Issue #1: HPSS archive and MOS script changes commit a0de3972ec00736cec46b3c11e7dfcd07dcd9372 Author: Judy.K.Henderson Date: Mon Jun 29 13:24:38 2020 +0000 revert paths for HOMEobsproc_prep and HOMEobsproc_network since updated paths do not exist on hera export HOMEobsproc_prep="$BASE_GIT/obsproc/gfsv16/obsproc_prep.iss70457.netcdfhistory_new" export 
HOMEobsproc_network="$BASE_GIT/obsproc/gfsv16/obsproc_global.iss71402.supportGFSv16" (do not exist on hera) export HOMEobsproc_prep="$BASE_GIT/obsproc/gfsv16b/obsproc_prep.iss70457.netcdfhistory" export HOMEobsproc_network="$BASE_GIT/obsproc/gfsv16b/obsproc_global.iss71402.supportGFSv16" commit 99b8105e69800ce74cc056c1c7c629264a12dcac Author: Judy.K.Henderson Date: Fri Jun 26 22:25:12 2020 +0000 Merged with 26Jun feature/gfsv16b branch Corrected setting of nwat for Thompson MP in config.fcst Squashed commit of the following: commit 28b1faf03c5ad12e4e9a44f1d02c754f1441ebc7 Author: fanglin.yang Date: Fri Jun 26 02:31:18 2020 +0000 modified: config.vrfy to add elif [ $machine = "HERA" ] ; then export RUNGFSMOSSH="$HOMEgfs/scripts/run_gfsmos_master.sh.hera" commit 61f4a52e299482687d84ef6686e6a65f64fe57f3 Author: fanglin.yang Date: Fri Jun 26 02:21:33 2020 +0000 corrected a typo in hpssarch_gen.sh - echo "${dirname}${head}atma000.ensres${SUFFIX} " >>gdas.txt + echo "${dirname}${head}atma009.ensres${SUFFIX} " >>gdas.txt commit b10a9306b732e543d2be3b932d556ebcdcbe8a5e Author: fanglin.yang Date: Thu Jun 25 20:25:30 2020 +0000 modified: config.vrfy to point a different syndat directory on Hera export COMROOTp1="/scratch1/NCEPDEV/global/glopara/com" export COMINsyn=${COMINsyn:-${COMROOTp1}/gfs/prod/syndat} commit b8192e54988f2fb2f4cda0510af02a090dfdda2e Author: russ.treadon Date: Thu Jun 25 18:20:01 2020 +0000 Issue #1: HPSS archive and MOS script changes * replace enkf member atmi*nc with ratmi*nc in HPSS enkf tarballs * add ensemble resolution analysis to HPSS gdas tarball * allow variable range to be externally set in run_gfsmos_master scripts commit e599c368a2d55018e4a1567717efd7ffa09f14d9 Merge: 99277ae3 1e56eddb Author: Kate Friedman Date: Wed Jun 24 14:16:22 2020 -0400 Merge pull request #93 from JessicaMeixner-NOAA/bugfix/exiterr fix for exiting properly with error for wave prep/init scripts commit 1e56eddb055b1414385e276ac73255d3ede9e9e9 Author: JessicaMeixner-NOAA 
Date: Wed Jun 24 12:36:53 2020 -0500 fix for exiting properly with error for wave prep/init scripts commit 99277ae34ef50454fa15e7e28b564c2e34e3406c Merge: 14dd3c94 4f8d5a5f Author: fanglin.yang Date: Tue Jun 23 16:05:16 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit 4f8d5a5f28c7120905fe36e9254f0199da986188 Author: Kate Friedman Date: Tue Jun 23 11:57:26 2020 -0400 Update README.md Remove use/mention of manage_externals until checkout.sh is retired. commit 14dd3c94938b0e69601e25a42104b2fc23944ebd Author: fanglin.yang Date: Tue Jun 23 15:53:11 2020 +0000 modified: Externals.cfg and sorc/checkout.sh to check out model tag GFS.v16.0.7 and UPP tag upp_gfsv16_release.v1.0.9. Changes include: 1) Inline POST Issues #136 and #142 * Update ceiling height calculation for global FV3. * add low, middle, high instantaneous cloud fraction * add radar reflectivity at model layers 1 and 2, and radar reflectivities at 1 and 4-km height. * fix a bug in initializing DBZI * output mixed layer CAPE/CIN * remove simulated GOES-12 brightness temperature. * change the names of time averaged low/mid/high cloud fractions in grib2 files from "TCDC" to "LCDC/MCDC/HCDC", respectively.
2) Model Issue #152 * update in-line post control files * upgrade post library to 8.0.9 for hera and wcoss_dell_p3 commit dd76002425a03905bfc6ef63d3f43a6813814497 Merge: df89cc80 dd599eaa Author: Kate Friedman Date: Tue Jun 23 11:34:40 2020 -0400 Merge pull request #91 from christopherwharrop-noaa/feature/fix_externals Update version of upp in Externals.cfg to be consistent with sorc/che… commit dd599eaa4f379e1eb8fc5e057f7904b0e6290d48 Author: Christopher Harrop Date: Tue Jun 23 15:25:08 2020 +0000 Update version of upp in Externals.cfg to be consistent with sorc/checkout.sh commit df89cc800d3c479c132a5e679a2562af91b32f62 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Fri Jun 19 18:40:26 2020 -0400 Update config.resources Generalize setting of echgres threads to be maximum permitted on given platform instead of explicitly setting echgres threads on every platform. commit ff8cd28365377f0ceed87ff8b4b9a9b4b6e91368 Merge: 1dd83b81 965ff420 Author: russ.treadon Date: Thu Jun 18 20:07:59 2020 +0000 Issue #1: merge branch 'feature/chgresfcst' at 965ff42 into feature/gfsv16b commit 965ff4203f132c6b032398a7a56494329714247c Author: russ.treadon Date: Thu Jun 18 20:02:23 2020 +0000 Issue #85: update DA checkout to point at release/gfsda.v16.0.0 commit 71d714119960feed1f0807708cdf85677ab81d73 Author: russ.treadon Date: Tue Jun 16 20:11:33 2020 +0000 Issue #85: (1) rename "chgresfcst" as "echgres", (2) add chgres variables to env commit 1dd83b818e9f2babbd11b762951721cf2a4c415e Author: fanglin.yang Date: Tue Jun 16 16:07:45 2020 +0000 modified: run_gfsmos_master.sh.dell to still set range=both as the default for running the real-time parallel commit eb0e3b4d009047251c535eea6fdc5958e58e68f7 Author: fanglin.yang Date: Tue Jun 16 14:43:29 2020 +0000 update checkout.sh to switch back to post version upp_gfsv16_release.v1.0.8. 1.0.9 still has issues. 
commit dfc76f0715a29e52e4d7133a8259c492229a5c22 Author: Kate.Friedman Date: Tue Jun 16 13:32:01 2020 +0000 Issue #1 - sync Externals.cfg with checkout.sh update for FV3 GFSv16.0.6 tag commit 7078bb5c680d54297101fae573579be24411a94a Author: fanglin.yang Date: Tue Jun 16 13:22:43 2020 +0000 modified: scripts/run_gfsmos_master.sh.dell to set default verification type to short. modified: sorc/checkout.sh updated to model tag GFS.v16.0.6 to fix contrib issue on HERA commit 15d5bed4119b067905f5a4bf47656557f2211883 Author: russ.treadon Date: Mon Jun 15 20:47:47 2020 +0000 Issue #85: add cfp option to chgresfcst; enable threads with chgresfcst commit d8782697fa698c00daa242a656246cf5f5d9b537 Author: Kate.Friedman Date: Mon Jun 15 16:00:35 2020 +0000 Issue #1 - update Externals.cfg to match updates to checkout.sh commit a2bd621727701e9526feccc7ce8ebd43be31860f Merge: 22b735d3 295cd05f Author: Kate Friedman Date: Mon Jun 15 11:57:14 2020 -0400 Merge pull request #84 from NOAA-EMC/feature/gfsv16b_herawavepost Adapting wavepostsbs for running on Hera commit 295cd05f306e4d70d09880a41fd60fadce74bab3 Author: Jose-Henrique Alves <47567389+ajhenrique@users.noreply.github.com> Date: Mon Jun 15 11:53:13 2020 -0400 Update exwave_post_sbs.sh Removing obsolete nm variable entries commit 22b735d310413989568dca4b30d19eadcd4c3fdb Author: fanglin.yang Date: Fri Jun 12 20:06:49 2020 +0000 modified: checkout.sh to check out upp_gfsv16_release.v1.0.9 output cloud ceiling height and instant total cloud fraction. output instant cloud fraction at low/mid/high cloud layer. correct grib2 names of time averaged cloud fraction at low/mid/high cloud layer from "TCDC" to "LCDC, MCDC, HCDC". output radar reflectivity at 1/4 km above ground and model layer 1/2. output mixed layer CAPE/CIN. Remove simulated GOES-12 brightness temperature from gfs product. Add the bug fix of initializing DBZI from Ruiyu.
commit 10ce1d4140c7c0fe795ded39e1fa550e0c202c59 Author: Kate.Friedman Date: Thu Jun 11 18:42:22 2020 +0000 Issue #1 - Hotfix to update anaconda module contrib path on Hera commit 58d1139c196f92bd4bb149008a8d61eb6457fb3c Author: henrique.alves Date: Thu Jun 11 02:44:41 2020 +0000 Adapting wavepostsbs for running on Hera commit c330e60197c38acb724cfdf4a30a20417a6618b4 Author: CoryMartin-NOAA Date: Wed Jun 10 21:17:00 2020 +0000 add checkout.sh to test on Dell commit 201609b2d43acd13a08bf1d5ab2251db90a11d32 Author: CoryMartin-NOAA Date: Wed Jun 10 20:51:51 2020 +0000 Commit changes from debugging addition of chgresfcst on hera commit 4405a2c74c8b5a40ee6edd7b4c2faba9bd41b59c Author: russ.treadon Date: Wed Jun 10 14:36:03 2020 +0000 Issue #1: update parm/config/config.base.emc.dyn to be consistent with GFS v16 real-time parallel config.base commit fc3066c2b7a5edd9f0d510b88f8542b07b8a8589 Author: CoryMartin-NOAA Date: Wed Jun 10 14:19:07 2020 +0000 First draft to add chgresfcst to rocoto workflow commit 9f2e4ecfe5799e13a4f6b9e80f7ff3e7b4a3633c Author: russ.treadon Date: Mon Jun 8 00:34:07 2020 +0000 Issue #1: correct typo in scripts/exwave_prep.sh commit 5a8b8f2e80532b7e446c51a69dcb83c7a212395d Author: russ.treadon Date: Mon Jun 8 00:15:31 2020 +0000 Issue #1: check for existence of 0p50 and 1p00 pgrb files before attempting to write to HPSS commit a7306aa93d537da5b165297e0dc34ba88856d4c7 Author: fanglin.yang Date: Sun Jun 7 23:22:53 2020 +0000 modified: jobs/rocoto/post.sh wait for 5 minutes if forecast history file does not exist before exit modified: modulefiles/module_base.hera use GV's temporary build of netcdfp/4.7.4 and esmflocal/8.0.1.08bs on HERA modified: parm/config/config.base.emc.dyn add restart_interval_gfs=0 to config.base. It is used by config.fcst and config.wave modified: parm/config/config.fcst -- fix a bug related to setting npe_wav for gfsfcst.
if [ "$CDUMP" = "gfs" ]; then npe_wav=$npe_wav_gfs ; fi -- set io_layout="4,4" for writing gfs restart files modified: parm/config/config.wave set WAVE restart frequency based on restart_interval_gfs (by H. Alves). commit 1082885b082e8a837aef095deb6a3343fca26cb3 Merge: 968b9860 bf5a5c44 Author: Fanglin Yang Date: Thu Jun 4 20:26:47 2020 -0400 Merge pull request #83 from NOAA-EMC/feature/gfsv16b_restart revive GFS forecast break-point restart capability with IAU turned on commit bf5a5c44bdebff5663225e9e9548ba83f498f7cd Merge: b27a01db 968b9860 Author: fanglin.yang Date: Thu Jun 4 23:36:52 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b_restart commit 968b98609fe2016518f3adcc6b178bdca0b73bcf Merge: 9b36cfde dc512dd6 Author: Guang Ping Lou Date: Thu Jun 4 19:20:53 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit 9b36cfde2032e9296cc7567704e1274560b2897e Author: Guang Ping Lou Date: Thu Jun 4 19:12:56 2020 +0000 Unify output path in gfs_bfr2gpk.sh commit 7675368c3f7f3942b7f8a57bad4c29a38cef48fa Author: Guang Ping Lou Date: Thu Jun 4 19:11:23 2020 +0000 remove station elevation adjustment to T,Q and evaporation bug fix commit dc512dd68df088361c340ac01ec3c8130707ffd1 Merge: a7b25a3b 987c32af Author: Fanglin Yang Date: Thu Jun 4 12:55:33 2020 -0400 Merge pull request #82 from NOAA-EMC/feature/gfsv16b_wavehera Adjustment to wave workflow for running wave component on Hera commit 987c32af7c7e94c53f50d20e1ad3c8b27e2c9afb Author: Jose-Henrique Alves <47567389+ajhenrique@users.noreply.github.com> Date: Thu Jun 4 12:08:05 2020 -0400 Updating checkout to ufs-weather-model GFS.v16.0.5 commit a7b25a3bb0b7b5f1b8a15204ff1e7c061ea1199a Author: Mallory Row Date: Thu Jun 4 14:59:22 2020 +0000 modified: checkout.sh to check out METplus verification tag verif_global_v1.8.0 commit 0acee674759a5e2c8fbd4ec7b1cdd0459f95e2bd Author: russ.treadon Date: Wed Jun 3 
20:10:09 2020 +0000 Issue #1: remove redundant entry from config.ediag; update checkout.sh to pull GFS v16 DA from github commit b27a01db1516f9aa229e75957885ffa2125d31d6 Author: fanglin.yang Date: Wed Jun 3 16:58:29 2020 +0000 modified: parm/config/config.fcst modified: scripts/exglobal_fcst_nemsfv3gfs.sh GFS forecast restart capability from a breakpoint is no longer working with IAU turned on. This function has been overhauled to make it more general and work for cases with and without IAU commit ba895481be53906878f58c2998a398bc59870ea2 Author: Jose-Henrique Alves <47567389+ajhenrique@users.noreply.github.com> Date: Wed Jun 3 09:59:35 2020 -0400 Update JWAVE_PREP Removing lines used for testing presence of files while debugging. commit bb79d7a33c423770f4972c9c643c5dfa7fb0f3be Author: wx20ha Date: Wed Jun 3 02:45:33 2020 +0000 Fixing a few minor bugs in wave_tar.sh after testing on WCOSS commit 8dc2e255e77a55c0f7d90a5018f769ade1e83a9f Author: fanglin.yang Date: Wed Jun 3 01:41:18 2020 +0000 add new file: run_gfsmos_master.sh.hera commit 3393cac802b289cb9b6867c003ff27ef1711327a Author: henrique.alves Date: Thu May 28 00:01:51 2020 +0000 Correcting minor bug in wave_prnc_cur.sh Adjusting indents in exwave_prep.sh Adding defaults for current processing in config.waveprep. commit 32c5f29b62cf747662909ff50a6bf8f327127a09 Merge: 49abb906 e87b5a18 Author: henrique.alves Date: Wed May 27 17:40:34 2020 +0000 Merging latest feature/gfsv16b branch into feature/gfsv16b_wavehera commit 49abb9068effc5aeeba8f861f44207b67ff442c9 Author: henrique.alves Date: Wed May 27 17:36:12 2020 +0000 Adding comment indicating how to regenerate cdo interpolation weights.
commit e87b5a18f9dbad2230b9d3324f0c48a7b25d9b62 Author: wx20ha Date: Wed May 27 02:01:57 2020 +0000 JWAVE_PREP updated to use CDO_ROOT defined in config. config.waveprep updated to default to WCOSS rtofs operational cdo if no module found. wave_prnc_cur.sh bug fixed, now provides proper fhr in temp file names. exwave_prep.sh adjusted for WCOSS and Hera. commit 70d71310132813c1066a9b0f881ef686602a23c4 Author: russ.treadon Date: Tue May 26 20:39:33 2020 +0000 Issue #1: remove "_break" from commented out lines in config.anal and config.prep. "_break" will cause failure if the line is active commit 914cb8dff96e45e690c8e1a2932c8d846afb1655 Author: russ.treadon Date: Tue May 26 20:37:27 2020 +0000 Issue #1: correct typo in parm/config/config.anal commit 2e12e63b13533a8a1b741bf55cbbaba49beb14cc Author: russ.treadon Date: Tue May 26 19:18:57 2020 +0000 Issue #1: update config.anal logic to point ABIBF at the correct GDA directory commit 38cd82133b64af289648ca5e88fde1ac3dcaeab0 Author: russ.treadon Date: Tue May 26 18:04:59 2020 +0000 Issue #1: update config files * parm/config/config.anal - add logic to use correct global_convinfo.txt prior to GFS v15.3 implementation (2020052612) * parm/config/config.awips - set NAWIPSGRP to equal NPOSTGRP (config.post) * parm/config/config.fcst - add double quotes around CDUMP on levs test to prevent setup_workflow.py runtime error commit 0dff61f103bab78f9d2426599f976dc318eff4e0 Author: fanglin.yang Date: Mon May 25 03:08:27 2020 +0000 modified: config.fv3 to reduce tasks assigned to the WAVE component. 70 tasks at C768 is adequate for wave.
commit 20ef779fd90a267affc0e80ae1f8a3f64afd2f8d Author: henrique.alves Date: Fri May 22 19:52:55 2020 +0000 Redefining mpmd command for working on Hera with slurm Adapting wave scripts to execute mpmd command on Hera Adjusting wave_prnc_cur.sh for properly catting files on Hera commit e248236b233b6bcf2dc008006a82ccd459d7e3f3 Author: fanglin.yang Date: Fri May 22 15:23:59 2020 +0000 modified: config.fcst to set if [ $LEVS = "128" -a $CDUMP = "gdas" ]; then ... lheatstrg=".false." commit 96460e6e5f4b09b9fd10d39550bf022dc62c37ab Author: russ.treadon Date: Fri May 22 13:02:06 2020 +0000 Issue #1: replace "nawips" with "gempak" in hpssarch_gen.sh path to gfs sfc and snd files commit 7222f84041f2a2f221e85d6543657f229408328c Author: fanglin.yang Date: Thu May 21 14:09:36 2020 +0000 modified: checkout.sh to check out model tag GFS.v16.0.4. Changes include: 1. Remove constraints on mixing length and background diffusivity over inversion layers on land 2. Enhance mass flux for deep convection, thereby increasing subsidence warming to reduce cold bias in the lower troposphere 3. Fix an RRTMG solar radiation bug which affects SW absorption in the UV region in the upper atmosphere. commit 4fa08a77e59660f4b58279375a83d179c891b385 Author: fanglin.yang Date: Wed May 20 01:20:21 2020 +0000 modified: HERA.env to add export CFP_MP="YES" # For analdiag with SLURM commit 9f7df9d3552fd971cde09148bc392ad93bf104e0 Author: fanglin.yang Date: Mon May 18 15:06:19 2020 +0000 modified: checkout.sh to check out upp_gfsv16_release.v1.0.8 1) Add configuration for Orion. 2) Make fields at isobaric levels have 41 vertical levels for all forecast hours and analysis in pgrb2 dataset. 3) Remove SPFH at isobaric levels from pgrb2b dataset. 
commit 5e4a1335ef3feb0242245b5661f22650a0a2e576 Author: Mallory Row Date: Fri May 15 13:08:25 2020 +0000 modified: checkout.sh to check out METplus verification tag verif_global_v1.7.2 commit 42913497cbb82435318797ee2148ec35311e8ea6 Author: fanglin.yang Date: Tue May 12 03:05:17 2020 +0000 modified: checkout.sh to check out gldas_gfsv16_release.v1.2.0. commit 0c0614cd03e6ee178275b85be7636f842f1eb77e Author: russ.treadon Date: Fri May 8 18:29:07 2020 +0000 Issue #1: change number of tasks for analdiag and ediag to 112 and 56, respectively, in config.resources commit a601acda14b2c8c58d2d3ae484fa57812bf8801a Author: emc.glopara Date: Fri May 8 04:17:27 2020 +0000 updated config.vrfy to point to the fit2obs version that supports reading netcdf history files export fitdir="$BASE_GIT/verif/global/Fit2Obs/ncf-vqc/batrun" export PREPQFITSH="$fitdir/subfits_hera_slurm" commit ad86a552ac5893e2c57a36772b94d3e05ccf4d33 Author: Mallory Row Date: Wed May 6 15:28:49 2020 +0000 modified: checkout.sh to check out METplus verification tag verif_global_v1.7.1 commit 20572b53aab8579b20aaf4e365c59f32fd386b5f Author: russ.treadon Date: Fri May 1 19:08:23 2020 +0000 Issue #1: update files written to enkf HPSS tarballs to be consistent with GFS v16 DA updates commit 20baab7ab7f7c151330cea30027b90ff30bfc83b Author: fanglin.yang Date: Wed Apr 29 18:49:05 2020 +0000 modified: checkout.sh to check out model tag GFS.v16.0.3 In Sfc_diff.f, a bug was introduced when the surface layer scheme was updated last time to reduce 2-m temperature cold biases. The bug only has an impact over sea-ice points, where momentum and thermal roughness are nevertheless very small. commit af6346497abe1d05d408c8c2b819427120a22961 Author: fanglin.yang Date: Tue Apr 28 18:18:44 2020 +0000 modified: qctropcy.f by Qingfu Liu A bug was found that the history files (syndat_stmcat, syndat_stmcat.scr) save the first and last storm ID used. 
If the FORTRAN code finds that the storm ID has been used in the current hurricane season, the code will change the storm ID by adding 1 to the original storm ID. The fix is to skip the change of the storm ID. See also https://github.com/NOAA-EMC/global-workflow/issues/63 commit 45f282db443a93dad0234b248f35ca706da945e6 Author: Judy.K.Henderson Date: Fri Jun 26 22:18:58 2020 +0000 revert config.nsst to original values and add if statements for GSD suites commit 28b1faf03c5ad12e4e9a44f1d02c754f1441ebc7 Author: fanglin.yang Date: Fri Jun 26 02:31:18 2020 +0000 modified: config.vrfy to add elif [ $machine = "HERA" ] ; then export RUNGFSMOSSH="$HOMEgfs/scripts/run_gfsmos_master.sh.hera" commit 61f4a52e299482687d84ef6686e6a65f64fe57f3 Author: fanglin.yang Date: Fri Jun 26 02:21:33 2020 +0000 corrected a typo in hpssarch_gen.sh - echo "${dirname}${head}atma000.ensres${SUFFIX} " >>gdas.txt + echo "${dirname}${head}atma009.ensres${SUFFIX} " >>gdas.txt commit 699c2e038169048dddd250255daa48ae5ee09e06 Author: Judy.K.Henderson Date: Thu Jun 25 21:07:05 2020 +0000 corrected syntax error and made all if CCPP_SUITE statements consistent commit b10a9306b732e543d2be3b932d556ebcdcbe8a5e Author: fanglin.yang Date: Thu Jun 25 20:25:30 2020 +0000 modified: config.vrfy to point to a different syndat directory on Hera export COMROOTp1="/scratch1/NCEPDEV/global/glopara/com" export COMINsyn=${COMINsyn:-${COMROOTp1}/gfs/prod/syndat} commit b8192e54988f2fb2f4cda0510af02a090dfdda2e Author: russ.treadon Date: Thu Jun 25 18:20:01 2020 +0000 Issue #1: HPSS archive and MOS script changes * replace enkf member atmi*nc with ratmi*nc in HPSS enkf tarballs * add ensemble resolution analysis to HPSS gdas tarball * allow variable range to be externally set in run_gfsmos_master scripts commit fef33e20369128975e8c08e9d0890e0c99d0e16d Author: kate.friedman Date: Thu Jun 25 11:24:27 2020 -0500 Issue #5 - fix to gfsanalcalc dependency (or -> and) commit cfc8a111b8016623c2280dfd359543409c6353fe Author: 
kate.friedman Date: Thu Jun 25 09:25:01 2020 -0500 Issue #5 - fixes for config.base commit 6700a568822089b2fc669e0990270f6cadd8bd10 Merge: be717cec4 e599c368a Author: kate.friedman Date: Thu Jun 25 08:45:03 2020 -0500 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: fix for exiting properly with error for wave prep/init scripts commit e599c368a2d55018e4a1567717efd7ffa09f14d9 Merge: 99277ae34 1e56eddb0 Author: Kate Friedman Date: Wed Jun 24 14:16:22 2020 -0400 Merge pull request #93 from JessicaMeixner-NOAA/bugfix/exiterr fix for exiting properly with error for wave prep/init scripts commit 1e56eddb055b1414385e276ac73255d3ede9e9e9 Author: JessicaMeixner-NOAA Date: Wed Jun 24 12:36:53 2020 -0500 fix for exiting properly with error for wave prep/init scripts commit be717cec4f252ae6949ce4a0a33763de836f3080 Merge: df8dc52e3 99277ae34 Author: Kate.Friedman Date: Tue Jun 23 16:20:51 2020 +0000 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: Update README.md modified: Externals.cfg and sorc/checkout.sh to check out model tag GFS.v16.0.7 and UPP tag upp_gfsv16_release.v1.0.9. Changes include: Update version of upp in Externals.cfg to be consistent with sorc/checkout.sh commit 99277ae34ef50454fa15e7e28b564c2e34e3406c Merge: 14dd3c949 4f8d5a5f2 Author: fanglin.yang Date: Tue Jun 23 16:05:16 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit 4f8d5a5f28c7120905fe36e9254f0199da986188 Author: Kate Friedman Date: Tue Jun 23 11:57:26 2020 -0400 Update README.md Remove use/mention of manage_externals until checkout.sh is retired. commit 14dd3c94938b0e69601e25a42104b2fc23944ebd Author: fanglin.yang Date: Tue Jun 23 15:53:11 2020 +0000 modified: Externals.cfg and sorc/checkout.sh to check out model tag GFS.v16.0.7 and UPP tag upp_gfsv16_release.v1.0.9. 
Changes include: 1) Inline POST Issues #136 and #142 * Update ceiling height calculation for global FV3. * add low, middle, high instantaneous cloud fraction * add radar reflectivity at model layers 1 and 2, and radar reflectivities at 1- and 4-km heights. * fix a bug in initializing DBZI * output mixed layer CAPE/CIN * remove simulated GOES-12 brightness temperature. * change the names of time-averaged low/mid/high cloud fractions in grib2 files from "TCDC" to "LCDC/MCDC/HCDC", respectively. 2) Model Issue #152 * update in-line post control files * upgrade post library to 8.0.9 for hera and wcoss_dell_p3 commit dd76002425a03905bfc6ef63d3f43a6813814497 Merge: df89cc800 dd599eaa4 Author: Kate Friedman Date: Tue Jun 23 11:34:40 2020 -0400 Merge pull request #91 from christopherwharrop-noaa/feature/fix_externals Update version of upp in Externals.cfg to be consistent with sorc/che… commit dd599eaa4f379e1eb8fc5e057f7904b0e6290d48 Author: Christopher Harrop Date: Tue Jun 23 15:25:08 2020 +0000 Update version of upp in Externals.cfg to be consistent with sorc/checkout.sh commit df8dc52e36f605199755615e4bac54d22e139486 Merge: ff46607bc df89cc800 Author: Kate.Friedman Date: Mon Jun 22 14:41:17 2020 +0000 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: Update config.resources Issue #85: update DA checkout to point at release/gfsda.v16.0.0 Issue #85: (1) rename "chgresfcst" as "echgres", (2) add chgres variables to env Issue #85: add cfp option to chgresfcst; enable threads with chgresfcst add checkout.sh to test on Dell Commit changes from debugging addition of chgresfcst on hera First draft to add chgresfcst to rocoto workflow commit df89cc800d3c479c132a5e679a2562af91b32f62 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Fri Jun 19 18:40:26 2020 -0400 Update config.resources Generalize setting of echgres threads to be maximum permitted on given platform instead of explicitly setting echgres threads on every 
platform. commit ff8cd28365377f0ceed87ff8b4b9a9b4b6e91368 Merge: 1dd83b818 965ff4203 Author: russ.treadon Date: Thu Jun 18 20:07:59 2020 +0000 Issue #1: merge branch 'feature/chgresfcst' at 965ff42 into feature/gfsv16b commit 965ff4203f132c6b032398a7a56494329714247c Author: russ.treadon Date: Thu Jun 18 20:02:23 2020 +0000 Issue #85: update DA checkout to point at release/gfsda.v16.0.0 commit 6698047dff1c821f39edb89f4b22e91aeba84faf Merge: 958ec2efc b1c6f7cdc Author: Kate Friedman Date: Thu Jun 18 14:39:26 2020 -0400 Merge pull request #89 from NOAA-EMC/feature/develop_v15.3.1 Issue #65 - updates from v15.3.0 and v15.3.1 operations into develop commit b1c6f7cdcaf4bba90088781316802d970d7e8cc3 Author: kate.friedman Date: Thu Jun 18 18:29:56 2020 +0000 Issue #65 - updates from v15.3.0 and v15.3.1 operations into develop - AWIPS data card updates (RFC 6963) - ACCOUNT change on WCOSS in config.base.nco.static - obsproc_prep path adjustment to OT tag install in config.base.emc.dyn - add dictionaries version export to config.prep - RDHPCS gdas transfer list update - fix to link_fv3gfs.sh to harden fix folder symlinking - bug fix to syndat_qctropcy.fd/qctropcy.f commit c61841dd2e0ef498388465591a0593f84ff17533 Author: Judy.K.Henderson Date: Thu Jun 18 16:58:11 2020 +0000 - made change for contrib modules commit 027b5709f1f1b136200e884941841d8c8c68e1e3 Author: Judy.K.Henderson Date: Thu Jun 18 16:41:53 2020 +0000 -- set DA diag table to diag_table_da_gsd when running GSD_v0 or GSD_noah suites -- set radiation and LSM values differently for GSD_v0 and GSD_noah suites commit eaf0d1b21fda064315be9508f065a8904133f83d Merge: 23f25b7e6 4a4b3fb8b Author: Kate Friedman Date: Thu Jun 18 11:30:43 2020 -0400 Merge pull request #88 from NOAA-EMC/release/gfsv15.3.1 GFSv15.3.1 commit ff46607bccf6d8ccc4f4aa1e171a847a9161ab0b Merge: ed6b27971 1dd83b818 Author: Kate.Friedman Date: Wed Jun 17 17:23:44 2020 +0000 Issue #5 - Sync merge branch 'feature/gfsv16b' into port2orion * feature/gfsv16b: 
modified: run_gfsmos_master.sh.dell to still set range=both as the default for running the real-time parallel update checkout.sh to switch back to post version upp_gfsv16_release.v1.0.8. 1.0.9 still has issues. Issue #1 - sync Externals.cfg with checkout.sh update for FV3 GFSv16.0.6 tag modified: scripts/run_gfsmos_master.sh.dell to set default verification type to short. modified: sorc/checkout.sh updated to model tag GFS.v16.0.6 to fix contrib issue on HERA Issue #1 - update Externals.cfg to match updates to checkout.sh Update exwave_post_sbs.sh Adapting wavepostsbs for running on Hera commit 71d714119960feed1f0807708cdf85677ab81d73 Author: russ.treadon Date: Tue Jun 16 20:11:33 2020 +0000 Issue #85: (1) rename "chgresfcst" as "echgres", (2) add chgres variables to env commit 1dd83b818e9f2babbd11b762951721cf2a4c415e Author: fanglin.yang Date: Tue Jun 16 16:07:45 2020 +0000 modified: run_gfsmos_master.sh.dell to still set range=both as the default for running the real-time parallel commit eb0e3b4d009047251c535eea6fdc5958e58e68f7 Author: fanglin.yang Date: Tue Jun 16 14:43:29 2020 +0000 update checkout.sh to switch back to post version upp_gfsv16_release.v1.0.8. 1.0.9 still has issues. commit dfc76f0715a29e52e4d7133a8259c492229a5c22 Author: Kate.Friedman Date: Tue Jun 16 13:32:01 2020 +0000 Issue #1 - sync Externals.cfg with checkout.sh update for FV3 GFSv16.0.6 tag commit 7078bb5c680d54297101fae573579be24411a94a Author: fanglin.yang Date: Tue Jun 16 13:22:43 2020 +0000 modified: scripts/run_gfsmos_master.sh.dell to set default verification type to short. 
modified: sorc/checkout.sh updated to model tag GFS.v16.0.6 to fix contrib issue on HERA commit 15d5bed4119b067905f5a4bf47656557f2211883 Author: russ.treadon Date: Mon Jun 15 20:47:47 2020 +0000 Issue #85: add cfp option to chgresfcst; enable threads with chgresfcst commit d8782697fa698c00daa242a656246cf5f5d9b537 Author: Kate.Friedman Date: Mon Jun 15 16:00:35 2020 +0000 Issue #1 - update Externals.cfg to match updates to checkout.sh commit a2bd621727701e9526feccc7ce8ebd43be31860f Merge: 22b735d31 295cd05f3 Author: Kate Friedman Date: Mon Jun 15 11:57:14 2020 -0400 Merge pull request #84 from NOAA-EMC/feature/gfsv16b_herawavepost Adapting wavepostsbs for running on Hera commit 295cd05f306e4d70d09880a41fd60fadce74bab3 Author: Jose-Henrique Alves <47567389+ajhenrique@users.noreply.github.com> Date: Mon Jun 15 11:53:13 2020 -0400 Update exwave_post_sbs.sh Removing obsolete nm variable entries commit 58d1139c196f92bd4bb149008a8d61eb6457fb3c Author: henrique.alves Date: Thu Jun 11 02:44:41 2020 +0000 Adapting wavepostsbs for running on Hera commit c330e60197c38acb724cfdf4a30a20417a6618b4 Author: CoryMartin-NOAA Date: Wed Jun 10 21:17:00 2020 +0000 add checkout.sh to test on Dell commit 201609b2d43acd13a08bf1d5ab2251db90a11d32 Author: CoryMartin-NOAA Date: Wed Jun 10 20:51:51 2020 +0000 Commit changes from debugging addition of chgresfcst on hera commit fc3066c2b7a5edd9f0d510b88f8542b07b8a8589 Author: CoryMartin-NOAA Date: Wed Jun 10 14:19:07 2020 +0000 First draft to add chgresfcst to rocoto workflow commit 4a4b3fb8bfe9a1507945a2288e7ecca7cc0918a1 Merge: bd5a99d09 23f25b7e6 Author: kate.friedman Date: Tue Jun 2 14:32:47 2020 +0000 Issue #65 - Sync merge branch 'operations' into release/gfsv15.3.1 after v15.3.0 commit commit 23f25b7e69fa3dd3d489668b68c27bc9e0543eeb Merge: f1c8a4c57 8b77bc281 Author: Kate Friedman Date: Tue Jun 2 10:15:21 2020 -0400 Merge pull request #79 from NOAA-EMC/release/gfs.v15.3.0 GFSv15.3.0 commit 8b77bc281b99226295938bbc3447df1e70fdff09 Author: 
kate.friedman Date: Tue Jun 2 14:10:26 2020 +0000 Issue #63 - adjustments to use correct obsproc commit 737cfa89f727a633f7e1ea4465e53eb73cee5007 Author: kate.friedman Date: Wed May 27 14:28:25 2020 +0000 Issue #63 - final NCO changes ahead of v15.3.0 implementation on May 26th at 13Z commit 4e7d35633c5a609c3b9698e32f6dc771f5862b9b Author: Judy.K.Henderson Date: Tue May 12 20:41:12 2020 +0000 removed extra files commit 7df51727311615e95654bf5bfbfbd23f31586f77 Author: Judy.K.Henderson Date: Tue May 12 18:29:07 2020 +0000 corrected setting of effr_in to .true. for GSD suites commit bd5a99d0943bd65bf2f1438b1112ae55d94fd434 Author: kate.friedman Date: Tue May 12 18:07:37 2020 +0000 Issue #65 - fix to link_fv3gfs.sh for emc mode commit 0c403efd38da534d9251ff3e8824088fdc4aff7c Merge: f75685917 a5dfbf5eb Author: kate.friedman Date: Wed May 6 15:47:12 2020 +0000 Issue #65 - sync merge branch 'release/gfs.v15.3.0' into release/gfsv15.3.1 commit a5dfbf5eb3e5c973a8d5dd45c00a0cd5bb003616 Author: kate.friedman Date: Wed May 6 13:05:24 2020 +0000 Issue #63 - fix to qctropcy.f for compile errors commit 59b0c84b766010ee8e0d9beec65cc8e73f7f11fc Author: kate.friedman Date: Tue May 5 21:04:18 2020 +0000 Issue #63 - updating EMC_gfs_wafs clone path to point to GitHub auth repo commit d8ebf59cadb1f75fb66790b001a2e19d0899e857 Author: kate.friedman Date: Tue May 5 19:05:24 2020 +0000 Issue #63 - updating EMC_gfs_wafs tag to gfs_wafs.v5.0.11 commit a22489632496483eee6a580a25dc5f1af72aef3e Author: Judy.K.Henderson Date: Mon May 4 20:19:49 2020 +0000 added new diag table for DA when running GSD_noah suite commit 1d15c729f9a27aba4f9c49ef52d87de5bb61a23f Author: kate.friedman Date: Mon May 4 19:52:10 2020 +0000 Issue #63 - backing out link script change for rsync and using cp again commit f7568591769f7e10264a6f655c458edba0e20e7a Merge: 0342a079c 43d60c780 Author: kate.friedman Date: Mon May 4 17:31:35 2020 +0000 Issue #65 - Sync merge branch 'release/gfs.v15.3.0' into release/gfsv15.3.1 commit 
43d60c780d51568047ce7ae46f1522822e074812 Author: kate.friedman Date: Mon May 4 16:47:12 2020 +0000 Issue #63 - updated NCO mode LINK in link_fv3gfs.sh from cp to rsync commit a54ffbbe6345a7472f9a1383c98c629b9efad03d Author: Judy.K.Henderson Date: Fri May 1 21:21:00 2020 +0000 modified name of diag_table when running GSD_noah suite commit c90ea4b242b2a895c323fb1240d8c87c3d51e7b0 Author: russ.treadon Date: Fri May 1 19:23:17 2020 +0000 Issue #63: update ProdGSI tag to gfsda.v15.3.0 commit 0342a079ceecef65fa569af3f30861d4271ca36c Author: BoiVuong-NOAA Date: Fri May 1 17:54:21 2020 +0000 Issue #65 updated DATA cards for AWIPS commit 4fc766bc94ad72e5efc21d5443224f474fea2bf7 Author: Judy.K.Henderson Date: Thu Apr 30 23:14:02 2020 +0000 - remove modular forecast scripts and obsolete exglobal script commit b53992afdbc8b8279bf99a23ca65dcbe4dbbaccf Author: kate.friedman Date: Tue Apr 28 20:09:09 2020 +0000 Issue #63 - update ProdGSI and EMC_post_gtg tags to match current ops versions commit 6ed13256cb9b3f1a01249d3f9b5d1423e7c87caf Author: kate.friedman Date: Tue Apr 28 19:36:31 2020 +0000 Issue #63 - bug fix in syndat_qctropcy.fd/qctropcy.f and update to link_fv3gfs.sh to point to fix_nco_gfsv15 set that includes fix to storm names in syndat_stmnames commit 204201bb2d73d3b0d2ba9214d9eaeed1ddac3e69 Author: kate.friedman Date: Tue Apr 28 19:13:17 2020 +0000 Issue #63 - backing out changes meant to go in with v15.3 changes commit f1c8a4c57b46ac09ea092c3bd9c5ebccb2ffcb1d Author: kate.friedman Date: Tue Apr 28 18:41:32 2020 +0000 Issue #52 - final NCO svn log for v15.2.12 showed jobs/JGFS_POSTSND also changed commit 9829f71c21a1dc34f2edccce998688d9577e5ee7 Merge: 9b2eca6e5 f60f47a52 Author: kate.friedman Date: Fri Apr 24 14:34:07 2020 +0000 Merge remote-tracking branch 'upstream/gmtb_ccpp_hera' into feature/ccpp commit f60f47a5241b45b910dc2799f74561e8803090b5 Author: Judy.K.Henderson Date: Wed Apr 22 23:26:36 2020 +0000 changed sorc/aeroconv directory name to sorc/aeroconf.fd 
commit d04907a484cb18ad5e64be998ebd105876823c5c Author: Judy.K.Henderson Date: Wed Apr 22 20:45:57 2020 +0000 updated jkhNOTES commit 4861bab8d0006afcbae93546e898cfd114b4feea Merge: adcfca84a b29b0d6c2 Author: russ.treadon Date: Wed Apr 22 19:49:12 2020 +0000 Issue #37: Merge branch 'operations' at b29b0d6 into feature/gfsv15.3 commit bd4998d2af01e53b66c1e55da723fc017b05bd31 Author: Judy.K.Henderson Date: Wed Apr 22 19:08:36 2020 +0000 - backed out adding CCPP_SUITE, ATARDIR, and HPSS_PROJECT to setup_expt* scripts - added setting of variables in config.base.emc.dyn - removed config.base_emc and config.base_gsd from repository commit 958ec2efce774535c09009f8bbbbcdd87a74e028 Merge: 3b2993f38 69308c16a Author: Kate Friedman Date: Wed Apr 22 13:16:35 2020 -0400 Merge pull request #62 from NOAA-EMC/feature/develop_i52 Issue #52 - GFSv15.2.12 update to obsproc_global version in develop from RFC 6789 commit 69308c16a23684043573750cbdec143afb4b0a77 Author: kate.friedman Date: Wed Apr 22 17:13:40 2020 +0000 Issue #52 - GFSv15.2.12 update to obsproc_global version in develop from RFC 6789 commit b29b0d6c228968dc22e1b92ce3a67f2b51e2cba0 Merge: 14d09bfc9 675b2b316 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Wed Apr 22 12:39:40 2020 -0400 Merge pull request #61 from NOAA-EMC/feature/operations_v15.2.12 Issue #52 - updates for GFSv15.2.12 from RFC 6789 commit 675b2b316a47d9bb76895e39e0f128867cbc89b8 Author: kate.friedman Date: Wed Apr 22 16:25:33 2020 +0000 Issue #52 - updates for GFSv15.2.12 from RFC 6789 commit 273df73e182432e69e01383a797bdd5d47e2bdba Author: Judy.K.Henderson Date: Tue Apr 21 16:56:27 2020 +0000 - add comment when running with 20 ensemble members to jkhNOTES commit 8a674828022d9eeb1752f2b078b3c328f17f0b7d Author: Judy.K.Henderson Date: Tue Apr 21 00:37:55 2020 +0000 - updated GSI tag in Externals.cfg - added CCPP_SUITE, HPSS_PROJECT, ATARDIR to setup_expt* python scripts and config.base.emc.dyn - changed path for Thompson 
lookup tables from $FV3INP to $FIX_AM - add _jkh directories for gsi.fd and fv3gfs.fd with changed files needed for running cycling with Thompson MicroPhysics and reading Thompson lookup tables with threads>1 - added GSD-specific experiment setup python scripts commit 14d09bfc9e7abce21007ec42d6ad9236e441c762 Merge: f5c4aff96 2a84d9921 Author: Kate Friedman Date: Tue Apr 14 09:16:29 2020 -0400 Merge pull request #57 from NOAA-EMC/feature/ops-v15.2.11 Issue #52 - updates for GFSv15.2.11 from RFC 6745 commit 2a84d99215a3a77c8f47c23c12c6bb7f6d67aa98 Author: kate.friedman Date: Tue Apr 14 13:06:32 2020 +0000 Issue #52 - updates for GFSv15.2.11 from RFC 6745 commit 5561ba9b2a1fb089db8f5fe6fdb2b10863d66227 Author: Judy.K.Henderson Date: Tue Apr 14 00:00:57 2020 +0000 - added aeroic task to setup workflow python scripts commit 984c0c43feb5acd35d8806beb5e7041d40b21af4 Author: Judy.K.Henderson Date: Mon Apr 13 23:36:16 2020 +0000 - added sorc/aeroconv.fd to Externals.cfg file - added script to extract INPUT/ and thirdparty/ sub-directories under aeroconv.fd commit 9446debd6ae49488aacf8123f15909c9615268df Author: Judy.K.Henderson Date: Mon Apr 13 20:59:54 2020 +0000 - add GFS_v16beta and GSD_noah CCPP suites commit a34e8b39d587b2a5a8931042d1b4da6795242af2 Author: Judy.K.Henderson Date: Mon Apr 13 20:44:19 2020 +0000 - changes to add CCPP changes to exglobal forecast script commit d0c5a0a12dc983d700111f7a2b17cf4b3f75aa6c Author: Judy.K.Henderson Date: Mon Apr 13 20:38:59 2020 +0000 updated Externals.cfg with tags in checkout.sh for FV3GFS and EMC_Post commit d9fb3a1ef5c628cd726657ca0cd5df2fedec2dd1 Author: Judy.K.Henderson Date: Fri Apr 10 19:36:55 2020 +0000 Merge branch 'feature/gfsv16b' into gmtb_ccpp_hera Squashed commit of the following: commit 073e5f64eab5badbd8a30b2ea702ce27159186fe Author: fanglin.yang Date: Thu Apr 9 16:44:14 2020 +0000 modified: arch.sh add external reference to VFRARCH. 
commit 53c818babc7c5e3dadc089823604b58bcb31bf0f Author: fanglin.yang Date: Thu Apr 9 16:30:04 2020 +0000 modified: arch.sh to clean old data under vrfyarch and touch remaining data to prevent them from being removed by the operating system commit 44dd4c28609d4e7c24cf87d5f69796ea8f7a6245 Author: russ.treadon Date: Thu Apr 9 12:00:15 2020 +0000 Issue #1: update selection of global_convinfo.txt and prepobs_errtable.global to be consistent with GFS v15.2.11 commit a8b655c0b543b1a3d94b7bb70130e58c2a747663 Author: fanglin.yang Date: Thu Apr 9 02:50:32 2020 +0000 modified: module_base.hera and module_base.wcoss_dell_p3 to point to esmf/8.0.1bs08 commit 4c1acf26cac49b9dd26ad404c6547cea1a969b7f Author: fanglin.yang Date: Thu Apr 9 02:18:43 2020 +0000 modified: checkout.sh to point to model tag GFS.v16.0.2, which includes the following updates https://github.com/ufs-community/ufs-weather-model/pull/100 * in FV3: code changes to improve atm fcst->wrt data transfer and fix IAU restart files * in NEMS: Code changes are added to reduce the atm->wav coupling time and to turn off flush for esmf print in MAIN_NEMS.F90 * When print_esmf is .true., set Verbosity = high and HierarchyProtocol = off in atm-wav nems.configure (print_esmf can be off in model_configure at run time). * update esmf lib to esmf801bs08. 
commit 41b68b5b7257a6f5a3ee1c88f3b8d3251b8097a9 Merge: 556614f5 ad9dd289 Author: Fanglin Yang Date: Tue Apr 7 00:15:44 2020 -0400 Merge pull request #51 from ajhenrique/bugfix/gfsv16b_sbspost Wave component workflow updates commit ad9dd28990f6ee6727f6dec43d335545c979093e Merge: 3b135fe9 556614f5 Author: henrique.alves Date: Mon Apr 6 00:01:32 2020 +0000 Merging changes in gfsv16b: pointing to junwang:ufs-weather-model:esmf8.0.1 commit 3b135fe9b6a2adfd337aa5c59188a3641ca68432 Author: henrique.alves Date: Sun Apr 5 23:54:36 2020 +0000 Adding ROTDIR path for operational NCO envir commit 556614f596f1e3fdb0dca3affe320aaea3650b23 Author: fanglin.yang Date: Sat Apr 4 00:40:20 2020 +0000 modified: sorc/checkout.sh to check out model branch gfsv16_chsp which contains physics updates (czil, background diffusivity and mixing length over stable boundary layer, and canopy heat storage) for reducing surface cold biases. modified: parm/config/config.fcst, turn on canopy heat storage for slm=1. commit 83c546b74d1c8fae22535082dec8cc03d8477ad7 Author: Henrique Alves Date: Fri Apr 3 18:55:08 2020 +0000 Adding back input file setting in global config.wave script. Changing print_esmf to .false. to suppress verbosity in esmf library calls commit 551ae13fcaf35f86580707bba583bc3b825d796b Author: Henrique Alves Date: Fri Apr 3 15:52:01 2020 +0000 Added RUN_ENVIR variable to wave j-jobs. 
Removed wavelog from wave scripts, now using only jlogfile commit 6c803647fb3c8c14e5da032b0f14add219d74cb5 Author: Henrique Alves Date: Fri Apr 3 15:39:34 2020 +0000 Updated KEEPDATA block and added exit 0 to wave j-jobs commit 42e08a505263aee0c990417487e7a62d37405d80 Author: Henrique Alves Date: Fri Apr 3 15:34:38 2020 +0000 Updated wavempexec and wave_mpmp to point to variables launcher and mpmp defined in env files commit ddb1b953789b4253d70db9bceb2e9a2d5b6fcc43 Author: Henrique Alves Date: Fri Apr 3 15:33:12 2020 +0000 Updated wavempexec and wave_mpmp to point to variables launcher and mpmp defined in env files commit 6c272f9e0cb13da7481431c60fc5f4c494d8fb87 Author: Henrique Alves Date: Fri Apr 3 15:06:29 2020 +0000 Unified calls to EXECwave and EXECcode. Removed legacy ush/wave_* scripts. commit 6e1e023b2e2d547b3429bd66a724c735db542c74 Author: Henrique Alves Date: Fri Apr 3 13:26:18 2020 +0000 Removed commented-out parameter definitions (legacy) from config files. Moved variables/parms specific to single wave steps out of general config.wave file. commit d5d3701eb32f3a9052d2bb9944bb2a402cd95018 Author: Henrique Alves Date: Fri Apr 3 13:25:54 2020 +0000 Removed commented-out parameter definitions (legacy) from config files. Moved variables/parms specific to single wave steps out of general config.wave file. commit 84c8730053c930508bf4ca211b1440ce32f735a1 Merge: 8ef7dee8 49faf67f Author: Henrique Alves Date: Thu Apr 2 19:57:12 2020 +0000 Merge remote-tracking branch 'gfsv16b/feature/gfsv16b' into bugfix/gfsv16b_sbspost commit 8ef7dee8b60cf548b8c0fbe95a57d8a49845ac38 Author: Henrique Alves Date: Thu Apr 2 19:54:07 2020 +0000 Adding idx file for subgrid grib2 files. Redirecting subgrid grib2 source to new native wave grid gnh_10m. 
Correcting bug on timing of second restart file stream on gdasfcst runs commit 4355153d85c4d9a553dc4838659005a791fe12d7 Author: Henrique Alves Date: Wed Apr 1 19:44:37 2020 +0000 adding hooks and parms for new wave model grids in several scripts. commit 49faf67fd401856923947972053395d351330ede Author: Boi Vuong Date: Mon Mar 30 14:32:42 2020 +0000 GitHub Issue #1 updated AWIPS parm cards with DZDT commit 84c28812d8d681c486132bb02cb9478055ec88b1 Merge: 7470bfe4 0e0dc873 Author: russ.treadon Date: Fri Mar 27 18:50:36 2020 +0000 Issue #1: Merge branch 'develop' at 0e0dc87 into feature/gfsv16b commit 7470bfe459fcb28751800b525d9387fbc486b8e3 Author: russ.treadon Date: Fri Mar 27 18:34:52 2020 +0000 Issue #1: maintenance and consistency updates * modulefiles/module_base.wcoss_dell_p3 - change prod_envir from version 1.0.2 to 1.1.0 * scripts/exglobal_fcst_nemsfv3gfs.sh - add PREFIX_ATMINC for EnKF. PREFIX_ATMINC defaults to a blank string - no change in current behavior * ush/hpssarch_gen.sh - add ensemble mean sfcfXXX to enkf tarball commit 79b91dc2399609d806b99f0ed5271494d1bab6d0 Author: Henrique Alves Date: Fri Mar 27 04:46:51 2020 +0000 Adding missing cd fv3gfs.fd back in checkout.sh commit 0e0dc873ceff26f9666fc40287ad2457171719c7 Merge: b133700a 95a63432 Author: Kate Friedman Date: Thu Mar 26 10:54:44 2020 -0400 Merge pull request #42 from NOAA-EMC/hotfix/viewer Issue #41 - Update PRODUTIL paths for WCOSS in viewer commit 95a63432f3fbdd8745d0617089485cdc332f57e7 Author: kate.friedman Date: Thu Mar 26 14:43:44 2020 +0000 Issue #41 - Update PRODUTIL paths for WCOSS in viewer commit b133700a4a65fe5334e5b5522a8c6f83bf0bbd0c Author: kate.friedman Date: Thu Mar 26 13:01:38 2020 +0000 Issue #21 - correct syntax for machine if-blocks in setup_expt scripts commit c3fbe7970c03aaad8e5b511b9d43931da9382fd3 Author: Henrique Alves Date: Thu Mar 26 12:05:46 2020 +0000 Adding new grids, changes to modulefiles for pointing to new esmf module commit 
f4ee5098988106037528650ba8a15922150b70af Author: kate.friedman Date: Wed Mar 25 20:52:49 2020 +0000 Issue #21 - remove unneeded logic from partition check commit 4acc80f6037a4d0fd43a64e9178e7aadbd6162f5 Merge: accb6f4b fb1c79f4 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Wed Mar 25 15:22:51 2020 -0400 Merge pull request #40 from NOAA-EMC/port2wcoss3p5 Issue #21 - Add support for WCOSS phase 3.5 commit fb1c79f4b7136e09c9d35d9d05303116036f58c9 Author: kate.friedman Date: Wed Mar 25 16:06:54 2020 +0000 Issue #21 - updated WCOSS phase 3.5 queues commit 08a535492952d863f860977f0ab0da4bc79bc931 Author: kate.friedman Date: Wed Mar 25 14:58:14 2020 +0000 Issue #21 - add phase 3.5 support - add partition option to setup scripts - remove machine if-blocks from config.base and add variable population to setup_expt*py scripts - add phase 3.5 ppn value to WCOSS_DELL_P3 env and config.resources files commit be21d3f2bc06ef8e7e31edce08756ee6aee36d31 Author: Henrique Alves Date: Tue Mar 24 20:10:07 2020 +0000 Pointing exec to new location of ww3 executables (HOMEgfs/exec). 
Adding new NH 1/6, SH 1/4 deg grids for testing commit c4cc28b08ca58007882e9823a68f26c2c5eaf92b Author: kate.friedman Date: Tue Mar 24 17:22:55 2020 +0000 Fix wave config begin and end echoes commit 20d79d274be862b7147f17cd7654f2c770ce7425 Merge: 3efe1c23 accb6f4b Author: russ.treadon Date: Mon Mar 23 18:19:08 2020 +0000 Issue #1: Merge branch 'develop' accb6f4 into feature/gfsv16b commit accb6f4b919871221e8037d5f154b157d88d1102 Merge: 057b2a82 8b51b56f Author: Kate Friedman Date: Mon Mar 23 14:01:30 2020 -0400 Merge pull request #39 from NOAA-EMC/feature/verif-tag Issue #38 - update EMC_verif-global pointer from VLab to GitHub commit 8b51b56f84289d1c01863b61421b21eccd22939c Author: kate.friedman Date: Mon Mar 23 17:40:18 2020 +0000 Issue #38 - update EMC_verif-global pointer from VLab to GitHub commit 3efe1c23fbe906953f3df4e6e6b43a604505dcf1 Author: Fanglin Yang Date: Sun Mar 22 16:49:40 2020 +0000 update arch.sh to remove selected RESTART files commit 586d0994aab2348c1deea521b37183ddf0a8f2d7 Author: kate.friedman Date: Thu Mar 19 14:01:16 2020 +0000 Issue #1 - update EMC_verif-global to tag verif_global_v1.6.0 in Externals.cfg commit f412f2ca8071baa6bd91f641c1fd8102e9271bef Author: russ.treadon Date: Thu Mar 19 13:53:34 2020 +0000 Issue #1: update EMC_verif-global to tag verif_global_v1.6.0 commit 94abe6a1caca4724fbc580d58928a6cf74a65590 Author: kate.friedman Date: Wed Mar 18 19:02:09 2020 +0000 Fix syntax error in setup_workflow_fcstonly.py commit 06ff92be074bac2728055a62ce0fa263a4d13cf6 Author: Guang Ping Lou Date: Tue Mar 17 18:10:37 2020 +0000 modify job card for tests commit a2b76f1d402c8568612dc727ebf36b548748589d Author: Guang Ping Lou Date: Tue Mar 17 18:10:04 2020 +0000 modify namsnd TIMSTN commit 81bc3fc4090c4fd3bdec070abd9c117d5a09491d Author: Guang Ping Lou Date: Tue Mar 17 18:09:24 2020 +0000 modify collectives mpmd design commit 9cbf9be8ea2a959a6711c10838ba46347451b7d1 Author: Guang Ping Lou Date: Tue Mar 17 18:08:41 2020 +0000 modify 
config.resources for postsnd number of tasks commit e3945e03031db23dbb8681e3d1034cc07377ae2d Author: russ.treadon Date: Mon Mar 16 10:38:07 2020 +0000 Issue #1: add logic to check WAVE_CDUMP prior to setting npe_fcst_gfs commit 6fdc6365fa13c77f01ae0cdb82e74bbbd149f914 Author: russ.treadon Date: Mon Mar 16 10:35:14 2020 +0000 Issue #1: DA workflow updates for GFS v16 (does not change results) commit c728925aaf6f8c2aac170e6987ed34e28e411053 Author: Henrique Alves Date: Sun Mar 15 17:16:56 2020 +0000 Changes to test IOSTYP=3 and correct restart lines on ww3_multi.inp for excluding WW3 restarts. commit 9bc784f708e09e9f5392c662cae2ed51cfb09011 Author: Henrique Alves Date: Sat Mar 14 17:43:00 2020 +0000 Changes to allow GFS fcst cycle to have WW3 restart specs different from GDAS fcst (e.g., turn off or on for GFS) commit 91610f80cef8911ac6aaae2ded67a95b4ff1d061 Author: russ.treadon Date: Wed Mar 11 17:25:36 2020 +0000 Issue #1: update Externals.cfg to GFS v16 components commit 7dcf23281cddb9aabc20c73ec077a9a52ded56b8 Merge: 55130516 057b2a82 Author: russ.treadon Date: Wed Mar 11 17:21:33 2020 +0000 Issue #1: Merge branch 'develop' at 057b2a8 into feature/gfsv16b commit 057b2a82fa43f7bc36c3e757ca7d48fb86b9541c Merge: 0377d20f 622167d5 Author: Kate Friedman Date: Wed Mar 11 12:00:58 2020 -0400 Merge pull request #29 from NOAA-EMC/feature/manage_externals Issue #3 - Introduce manage_externals as replacement for checkout.sh commit 50649e274d2b074f4d3c6f18503b8bb6997a5bde Author: Henrique Alves Date: Wed Mar 11 03:16:10 2020 +0000 General cleanup for reducing size of wavepostsbs RUNDIR and improving runtime.
Reorganized parameters in postsbs to provide more clarity and upfront control at the config.wave level commit 55130516cea5ea67b758b52c9445aca667e1c35a Author: kate.friedman Date: Mon Mar 9 13:59:44 2020 +0000 Add ability to have wave jobs depending on cdump commit 622167d5fb3322921a1702639ebccb42da1f5e1b Author: kate.friedman Date: Fri Mar 6 18:20:31 2020 +0000 Issue #3 - added explicit config flag example for checkout_externals in README and blurb about this replacing checkout.sh commit e83b90d50999f64ed5208d94a8cceb8179c9395f Author: kate.friedman Date: Fri Mar 6 17:00:15 2020 +0000 Issue #3 - remove prod_util and grib_util sections from build_all.sh, removed elsewhere already commit 8699b46aa1797e8dd29edff1d4bd3b511ad5cb1c Author: kate.friedman Date: Fri Mar 6 16:30:54 2020 +0000 Issue #3 - updated README with new manic version commit e602cd3d536b55b86b04285aedd5781d8d3a9f82 Author: kate.friedman Date: Fri Mar 6 16:27:57 2020 +0000 Issue #3 - updated link_fv3gfs.sh to adjust wafs links commit 830c73f430d70cc516dea419a8969c6fd9fc0910 Author: kate.friedman Date: Fri Mar 6 15:21:45 2020 +0000 Issue #3 - update EMC_verif-global tag in Externals.cfg after sync with develop commit 40084e67810d21366d0e9af6d9937c29aa4965ad Merge: f662fffa 0377d20f Author: kate.friedman Date: Fri Mar 6 15:18:52 2020 +0000 Issue #3 - Merge branch 'develop' into feature/manage_externals commit cfa5be4b414f67efa8eb17c8fe5060fe94cd48ea Author: kate.friedman Date: Fri Mar 6 14:05:29 2020 +0000 Issue #1 - remove gfswave restart archival commit 85b3a1d24e89c60e9977ecc549656a0145caeb8b Author: russ.treadon Date: Thu Mar 5 18:51:22 2020 +0000 Issue #1: update postsnd section of config.resources commit 6c8dc92953e5c003882ffbb8376a8b1b2c8668d2 Author: russ.treadon Date: Thu Mar 5 18:40:38 2020 +0000 Issue #1: update config files to be consistent with settings in pre-implementation GFS v16 real-time and retrospective parallels * parm/config/config.anal - reduce second outer loop iteration count for
gfs * parm/config/config.epos - add ENKF_SPREAD to toggle on/YES or off/NO generation of ensemble spread files * parm/config/config.fcst - set adjust_dry_mass for GDAS (true) and GFS (false) * parm/config/config.fv3 - remove nth_efcs, set C384 nth_fv3=1 * parm/config/config.resources - set nth_efcs to nth_fv3 (default 2) commit a107109d8fdd0292a3017f26e56863fbb28a63ce Author: kate.friedman Date: Thu Mar 5 16:08:23 2020 +0000 Fixed bug in config.fcst related to extra then commit 0de559a088ac670f2b105168988c18ee1b57488a Author: russ.treadon Date: Thu Mar 5 15:55:50 2020 +0000 Issue #1: update exglobal_fcst_nemsfv3gfs.sh and checkout.sh * scripts/exglobal_fcst_nemsfv3gfs.sh - turn on adjust_dry_mass with dry_mass=98320.0, turn off iau_drymassfixer * sorc/checkout.sh - update to model tag GFS.v16.0.1 commit ce23e52730a9f6fe20215cb47563dd6f2dcb254a Merge: fa7bca5a 0377d20f Author: russ.treadon Date: Thu Mar 5 15:36:50 2020 +0000 Issue #1: Merge branch 'develop' at 0377d20 into feature/gfsv16b commit 0377d20f3d019f77a47fc9860d6146fd3c8e5d94 Merge: 1b359dbe 25524675 Author: Kate Friedman Date: Thu Mar 5 08:43:16 2020 -0500 Merge pull request #28 from NOAA-EMC/feature/metplus2 Issue #8 - add switch for MET+ jobs commit 25524675a63e59829655bbd9a09abc4dca246357 Author: kate.friedman Date: Thu Mar 5 13:31:02 2020 +0000 Issue #8 - add switch for MET+ jobs commit fa7bca5a5fb26886219190e9e5e5d6efb9f9ddab Merge: af73a39f cd67f975 Author: Kate Friedman Date: Thu Mar 5 08:08:57 2020 -0500 Merge pull request #25 from NOAA-EMC/feature/wave2global Feature/wave2global into feature/gfsv16b commit cd67f97592227fea0238327dade887d4584300c3 Merge: c72d7b5b af73a39f Author: kate.friedman Date: Thu Mar 5 13:03:10 2020 +0000 Merge branch 'feature/gfsv16b' into feature/wave2global commit c72d7b5bca4a67e83916edd17fb0e5c4612a7867 Author: Jose-Henrique Alves <47567389+ajhenrique@users.noreply.github.com> Date: Wed Mar 4 23:53:41 2020 -0500 Clean up exwave_post_sbs commit 
2e738f20930dfd7d5216920ec7cc26774be811f3 Author: Henrique Alves Date: Thu Mar 5 04:03:44 2020 +0000 Moving standalone fv3 model_config exglobal_fcst block into if/else/fi cplwav model_config block. Reinstating config.wave block in JGLOBAL_FORECAST. Pointing EXECwave to HOMEgfs/exec directory for WW3 util executables (changed link_fv3gfs.sh accordingly). Removing debug options from compile.sh line in build_fv3.sh. commit b7638436b1e737077fbb2dad705e7ed157df261e Author: kate.friedman Date: Wed Mar 4 21:02:04 2020 +0000 Fix to JWAVE_PREP to look back a day for rtofs commit 33cf0fc0f2a5d3b3ca749c9941835398f94e7606 Author: kate.friedman Date: Wed Mar 4 19:59:56 2020 +0000 Adjustments after going through PR review commit 1b359dbeb31b94382619dfc9c67e77fffe46aaa0 Merge: 0359d342 31bb7d32 Author: Kate Friedman Date: Wed Mar 4 10:19:36 2020 -0500 Merge pull request #26 from NOAA-EMC/feature/metplus Feature/metplus - refactored MET+ jobs to resolve timing issues commit 5c8fa3dd2666b7fd90b473fb12629016ef402e3e Author: kate.friedman Date: Wed Mar 4 13:03:12 2020 +0000 Adjust restart_interval if-blocks for DOIAU=YES in configs commit 2b337a8f8c900146905915032826faf9e6f07dc0 Merge: a4daafee e0a1a0ae Author: Henrique Alves Date: Tue Mar 3 18:31:09 2020 +0000 Merge branch 'feature/wave2global' of github.com:NOAA-EMC/global-workflow into feature/wave2global commit a4daafeecaf3fed38e637389ab5f4041b5139a4c Author: Henrique Alves Date: Tue Mar 3 18:30:58 2020 +0000 Adjustment to waveprep resources to match number of rtofs files, bugfix on generation of ascii spectral data commit af73a39fc8ae7868ea9d08081b98f84345fa4f95 Author: Boi Vuong Date: Tue Mar 3 13:44:59 2020 +0000 Github Issue #1 Updated postsnd.sh commit e0a1a0aeb4066ab834fd93a86780196f55a7a447 Author: kate.friedman Date: Mon Mar 2 18:45:50 2020 +0000 Fix gdasfcst dep on gldas commit 817f8faa6b0c71ef970da302d6e46b57c9ea8129 Merge: 48ed4c5f 203d1747 Author: kate.friedman Date: Mon Mar 2 13:41:13 2020 +0000 Merge branch
'feature/gfsv16b' into feature/wave2global commit 48ed4c5f4a6b0093941a97d475abdb6bed2937e5 Author: Henrique Alves Date: Mon Mar 2 04:22:02 2020 +0000 General cleanup. wavepostsbs wall time limit matches gfsfcst walltime. commit 4af83a430a8bb58f83d686535d39d9745de9ab6c Author: Henrique Alves Date: Sun Mar 1 02:51:02 2020 +0000 Cleaning up prior to merging to gfsv16b commit 14f52dd35b7d6a1a9b98cff0da2426015cfabd3d Merge: 4400d7c3 d39ba74d Author: Henrique Alves Date: Sat Feb 29 18:23:17 2020 +0000 Merge branch 'feature/wave2global' of github.com:NOAA-EMC/global-workflow into feature/wave2global commit 4400d7c3e5e26bacc0ba892ef3178d4d906d57a8 Author: Henrique Alves Date: Sat Feb 29 18:23:13 2020 +0000 Changes to deal with waveprep and wavepost commit 203d174798aae0d4e581930fa8f49abb8421f81f Author: russ.treadon Date: Sat Feb 29 01:17:57 2020 +0000 Issue #1: two GFS v16 DA updates * sorc/checkout.sh - check out ProdGSI tag gfsda.v16.0.0 * parm/config/config.anal - remove REALTIME=NO section since not needed commit d39ba74d71b5bd5c510eb32cac17facdbe5a9a49 Author: kate.friedman Date: Fri Feb 28 19:01:02 2020 +0000 Removed extra forward slashes in wave part of hpssarch_gen.sh commit 82889365da44196804b7d751a1c8c5992ca102ea Author: kate.friedman Date: Fri Feb 28 18:44:13 2020 +0000 Added waves to archival commit a8f4200d04d7c49a1ff1cd443f1f36fd949cb5db Merge: 9321bdc9 8997f2a2 Author: kate.friedman Date: Fri Feb 28 13:58:28 2020 +0000 Merge branch 'feature/gfsv16b' into feature/wave2global commit 9321bdc9a5a85a4b458b3539bb4f15bdcbf2cf33 Author: kate.friedman Date: Fri Feb 28 13:39:34 2020 +0000 Cleanup and added adjustments to other env files commit 8997f2a27a45ee018d33abdf1883914a38c88ffd Author: russ.treadon Date: Thu Feb 27 14:37:24 2020 +0000 Issue #1: check existence of EnKF spread file before adding to HPSS archive list commit 2d6620e9b961f290daab1369778644c799bec5cc Author: Guang.Ping.Lou Date: Thu Feb 27 13:45:39 2020 +0000 rm nemsio utility commit
0120cf577df7faf2413e0b585423728ecd62657d Author: Guang.Ping.Lou Date: Thu Feb 27 13:40:45 2020 +0000 modify config.resources for postsnd ppn=4 commit 0891840f8f43dccdd81573f2d6513b573de921aa Author: Guang.Ping.Lou Date: Thu Feb 27 13:32:07 2020 +0000 modify config.resources for postsnd ppn=4 commit 67e20de5ac6e8f52312bc71adc5419330ebe6542 Author: Henrique Alves Date: Thu Feb 27 04:22:36 2020 +0000 Adding current smoothing flag to config.wave. Previous push also contained changes to reflect improved wave model physics. commit 10164bdae0fdc53a604dc8e58b2a6a5fce5ebc99 Author: Henrique Alves Date: Thu Feb 27 03:47:36 2020 +0000 Adjustments to use gfs/gdas seaice file instead of omb file. Changes to intake rtofs 1hr files when available. commit 7145fd01ffcf3043dd1dea458ae866f208f32633 Author: kate.friedman Date: Wed Feb 26 19:34:24 2020 +0000 Modify prep.sh to create rtofs symlink in ROTDIR if DO_WAVE=YES commit 5ac8c8683b6f3de8cc586bae8d3825b2bd67011c Author: kate.friedman Date: Wed Feb 26 18:44:45 2020 +0000 Fix for missing nth_postsnd in config.resources commit 91456a9b7cfc285b78ec083a419592fafe6f75d0 Author: Guang.Ping.Lou Date: Wed Feb 26 15:52:32 2020 +0000 modified config.resources for bufrsnd commit 011a393c162a422859b1184e9f32dd50d1d48882 Merge: d01c3543 8806c9e4 Author: Kate.Friedman Date: Wed Feb 26 13:50:49 2020 +0000 Merge branch 'feature/gfsv16b' into feature/wave2global commit d01c3543e59bbabb491454788f17e818305de6fc Author: kate.friedman Date: Wed Feb 26 13:48:26 2020 +0000 Updates to checkout.sh from feature/gfsv16b commit 8806c9e4e3572169146a2ba3605a86680e35287c Merge: e87695c8 318d8b46 Author: Guang.Ping.Lou Date: Wed Feb 26 13:45:13 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b Commit NetCDF parallel reading capability.
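The NetCDF parallel reading capability noted above splits the sounding reads across tasks. A hedged shell sketch of the underlying block decomposition — the real implementation is Fortran in gfsbufr.f, so this function is only illustrative:

```shell
# Illustrative block partition: which record range does task r read when
# n records are split across p tasks? (Not the actual gfsbufr.f code.)
records_for_rank() {
  local n=$1 p=$2 r=$3
  local per=$(( (n + p - 1) / p ))     # ceiling(n / p) records per task
  local start=$(( r * per ))
  local end=$(( start + per - 1 ))
  if [ "$end" -gt $(( n - 1 )) ]; then end=$(( n - 1 )); fi   # clamp last block
  echo "$start $end"
}
```

For example, splitting 12 forecast records over 4 tasks gives task 0 the range `0 2` and task 3 the range `9 11`; each task can then open the NetCDF history files and read only its own records.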
commit 3b5d451aa165dc368c674cd2a71d6786097869e7 Merge: 88d8abda 318d8b46 Author: kate.friedman Date: Wed Feb 26 13:43:52 2020 +0000 Merge branch 'feature/gfsv16b' into feature/wave2global commit e87695c8b151e3c64cafed97ad921849d3998ece Author: Guang.Ping.Lou Date: Wed Feb 26 13:42:41 2020 +0000 reinstate nemsio utility files for nemsio data format commit 318d8b46ff7269899db92ae72f68fa1110406a16 Author: russ.treadon Date: Tue Feb 25 20:17:40 2020 +0000 Issue #1: add gldas and esfc to the list of log files to write to HPSS commit 6102b6971fc9ee1f5dfc20c98a625b20e7f2cbbc Author: Guang.Ping.Lou Date: Tue Feb 25 19:32:21 2020 +0000 Adding a parallel reading interface commit c0b9ddc5dfe0d5dfcf1592fabf295ad951cc20b3 Author: Guang.Ping.Lou Date: Tue Feb 25 19:31:19 2020 +0000 adding parallel reading interface commit 057b10cc02f66cdede531a04967a2d6a51efb148 Author: Guang.Ping.Lou Date: Tue Feb 25 19:30:16 2020 +0000 Clean up unused variable commit 25ac8f0a64ea710e7b8dd55dd3d03cdfd600c671 Author: Guang.Ping.Lou Date: Tue Feb 25 19:28:59 2020 +0000 Modify to call parallel reading subroutine commit c1a25a5ec5d46d52cd931b5178abba45bfb41345 Author: Guang.Ping.Lou Date: Tue Feb 25 19:20:26 2020 +0000 parallelizing gfsbufr.f for reading NetCDF files commit b28020f0c03ce54db3022de43ecc0f7119bce6d2 Author: Guang.Ping.Lou Date: Tue Feb 25 19:17:32 2020 +0000 modify driver for parallel read test commit bce7d9642decf393e2b314e837f557765a6382b8 Author: Guang.Ping.Lou Date: Tue Feb 25 19:14:45 2020 +0000 modify postsnd resource for parallel read commit f89912416cf5347fbcd6e1378aae61e99ffa14ac Author: Guang.Ping.Lou Date: Tue Feb 25 18:59:26 2020 +0000 remove testing jobs script JGFS_POSTSND_netcdf commit 88d8abda4adf3c945b1c5526dfca6b1774b91f3b Merge: 33c65a19 9d6e8464 Author: kate.friedman Date: Tue Feb 25 17:49:29 2020 +0000 Sync merge with feature/gfsv16b commit 9d6e8464aa207da41901278c9e5454efd9e475a2 Author: fanglin.yang Date: Tue Feb 25 04:45:44 2020 +0000
setup_workflow_fcstonly.py failed. Updated to get "reservation" from config.base. commit 36367563af24402db621179bda139f17d3ca1d44 Author: russ.treadon Date: Mon Feb 24 20:37:27 2020 +0000 Issue #1: update config files for use in GFS v16 real-time and retrospective parallels commit 33c65a19cbe68342cd9ab3f434a22a399be3a1b4 Merge: 9a1f79d7 0aa8dacf Author: Henrique Alves Date: Mon Feb 24 15:37:48 2020 +0000 Bugfix for creating spectra files in wavepostsbs step commit 0aa8dacfeef3fd1bf3abe368c9cdaf9a16b57cde Merge: 8148766c 431c7866 Author: kate.friedman Date: Fri Feb 21 13:57:55 2020 +0000 Merge branch 'feature/gfsv16b' into feature/wave2global commit 8148766c489057553e20fc42bad494f1ed5aa70c Author: kate.friedman Date: Fri Feb 21 13:56:56 2020 +0000 Small fix to exglobal_fcst_nemsfv3gfs.sh commit 431c78668c98268cb749397ec8218428104c9687 Author: russ.treadon Date: Thu Feb 20 15:19:42 2020 +0000 Issue #1: remove RESERVATION from archive jobs commit ebf02b19f98f90ae3c8bc57cc5c2325260aaab0a Author: kate.friedman Date: Thu Feb 20 14:05:52 2020 +0000 Remove output_1st_tstep_rst override and add iau_drymassfixer commit 36cb42dc5d8985f8402405dce811f04063f572ab Author: Henrique Alves Date: Thu Feb 20 12:30:28 2020 +0000 Removing references to gens/gefs from exglobal_fcst_nemsfv3gfs.sh in preparation for merge back to gfsv16b commit 9a1f79d7aa0343ea95d1af419f55a2d0d356e99d Author: Henrique Alves Date: Thu Feb 20 12:19:11 2020 +0000 Adding back lost and found EOF in section that creates input.nml commit 635d95d58c932909dc6a29c62515c385dc01587a Merge: f31de5aa c4454b3a Author: kate.friedman Date: Wed Feb 19 20:42:49 2020 +0000 Additional sync merge with feature/gfsv16b today commit f31de5aaa6c915ac8bc1184726bead71dc9157d4 Author: kate.friedman Date: Wed Feb 19 20:18:23 2020 +0000 Removing copies of exglobal_fcst that were added erroneously commit c4454b3ab2c5fbda7890546f31f2fe38daa5edd9 Author: fanglin.yang Date: Wed Feb 19 16:41:38 2020 +0000 modified: sorc/checkout.sh to
check out model tag GFS.v16.0.0, which contains updates of dry mass fixer, WW3 thread reproducibility and post/8.0.5 lib modified: scripts/exglobal_fcst_nemsfv3gfs.sh to add iau_drymassfixer = .true. commit 31bb7d32181ca84229c3c3374226bbd37784ddc4 Merge: eb73e520 0359d342 Author: Mallory Row Date: Wed Feb 19 15:24:42 2020 +0000 Merge branch 'develop' into feature/metplus commit e058a9054d7edb2b92ee590b6dede733fe4bc4a3 Author: kate.friedman Date: Wed Feb 19 15:05:59 2020 +0000 Changes to config.resources after wave tests commit 24844e403745fb5b292fcde0aae53a3c497aa295 Merge: 0f2f2b0b c2e99795 Author: kate.friedman Date: Wed Feb 19 14:03:58 2020 +0000 Sync merge with feature/gfsv16b commit 0f2f2b0be1ff97485a23cdaf214d428927290c3d Author: kate.friedman Date: Wed Feb 19 13:36:14 2020 +0000 Updates to config.wave and scripts/exwave_post_sbs.sh commit c2e997959144b2628185b58e5480f25a7cd8852a Author: russ.treadon Date: Wed Feb 19 00:42:49 2020 +0000 Issue #1: add backup option to fit2obs, update config files * jobs/rocoto/vrfy.sh - set verification date based on VBACKUP_FITS * parm/config/config.anal - correct ozinfo typo, enclose realtime settings in REALTIME block * parm/config/config.resources - set npe_node_post for WCOSS_DELL_P3, reduce C768 eobs npe_eobs * parm/config/config.vrfy - add VBACKUP_FITS commit ed160c03ff53e10aa0e3d57ac802e8e6fecd54a3 Author: fanglin.yang Date: Tue Feb 18 22:22:10 2020 +0000 modified: sorc/checkout.sh to check out gldas release branch gldas_gfsv16_release.v1.0.0 commit c08ebb8a2aca1061272c05643160aebc2a06eac5 Author: fanglin.yang Date: Tue Feb 18 17:22:01 2020 +0000 modified: checkout.sh to use UPP: upp_gfsv16_release.v1.0.5 ufs-util: release/ops-gfsv16 commit a6bc7f8ad50a0c7400c17c3d1423e2aefccff2ed Author: fanglin.yang Date: Tue Feb 18 17:12:17 2020 +0000 modified: arch.sh to archive gfs_pgrb2b group into HPSS commit 958ee38b6fb9c76056a3045f43dec92a503f48c9 Author: kate.friedman Date: Fri Feb 14 20:38:53 2020 +0000 Increasing resources 
for eupd when C384 commit db5c593f2c34111a13bddaa1f4f59edcd74c2118 Author: russ.treadon Date: Fri Feb 14 20:12:00 2020 +0000 Issue #1: update modulefiles/module_base.hera to crtm/2.3.0 commit f662fffa25a99617828e4322bf789978cf523248 Author: kate.friedman Date: Fri Feb 14 15:57:05 2020 +0000 Issue #3 - Updated README with new manic tag v1.1.7 commit 94cd971fb14c4b7204822ca780d6e9dbe945b595 Author: kate.friedman Date: Fri Feb 14 15:31:32 2020 +0000 Updates to wave scripts commit 4e006dc33fb68febaa33524dfafaa1f0d3282939 Merge: fcc5a74a a94a44a8 Author: fanglin.yang Date: Fri Feb 14 00:50:00 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit fcc5a74a95d37ef2d7193ec6c0c8e5513a0f3548 Author: fanglin.yang Date: Fri Feb 14 00:48:44 2020 +0000 modified: config.fcst to fix a bug commit e3196a84a0ecd2b54d59abfdc9184622a9c605ca Author: Kate Friedman Date: Thu Feb 13 15:59:06 2020 -0500 Update README.md commit e46b175d8a309010e421ccd54e6d6eb083af3579 Merge: 4bd0e203 0359d342 Author: Kate.Friedman Date: Thu Feb 13 20:38:04 2020 +0000 Issue #3 - sync merge with develop branch commit aec4c288e1f96ee7f3f2a834014edde0ef4ea9a2 Author: Kate.Friedman Date: Thu Feb 13 19:09:56 2020 +0000 Added EUPD_CYC variable to config.base.emc.dyn commit a94a44a82be752f6542b50c8d6ce082f7272dc4c Merge: 0f0894bf 0359d342 Author: russ.treadon Date: Thu Feb 13 18:59:51 2020 +0000 Issue #1: Merge branch 'develop' at 0359d34 into feature/gfsv16b commit 0359d3425a8710e7b696b94456ec8e54e9a2fd9f Merge: 1d9a1f00 bd00cb98 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Thu Feb 13 09:53:59 2020 -0500 Merge pull request #19 from NOAA-EMC/feature/remove_theia Feature/remove theia commit 0f0894bfcc8a25ea921026d911a38e7b726ce638 Author: Boi Vuong Date: Thu Feb 13 14:15:16 2020 +0000 GitHub Issue #1 removed script exnawips.sh.ecf (not used) commit adabbaf95b32b7292cc0f09447f98773795f2b1e Author: Guang.Ping.Lou Date: Thu 
Feb 13 00:34:02 2020 +0000 change netcdf modules commit 767b9c33e29aff3470d914edb9289ef8236331ed Author: Guang.Ping.Lou Date: Thu Feb 13 00:32:25 2020 +0000 remove hardwired libs commit be3c324dc1e35f336b1df6f2be8ab335f45f1b9e Author: Guang.Ping.Lou Date: Thu Feb 13 00:31:13 2020 +0000 Modified modules for parallel_netcdf commit 3e3c119d096f5d03082183655d0f5db4e4991951 Author: Guang.Ping.Lou Date: Thu Feb 13 00:29:53 2020 +0000 Added netcdf_parallel modules commit 0b25f5b648892a068ac6ae1c30890729576b2c5c Author: fanglin.yang Date: Wed Feb 12 20:28:27 2020 +0000 Changes to be committed: deleted: ../driver/product/driver_WAFS.README deleted: ../scripts/exgfs_grib_wafs.sh.ecf deleted: mkwfsgbl.sh commit 2a2a8d3fe1fcd0bebf07d34c32697f64f5ac745c Author: Kate.Friedman Date: Wed Feb 12 18:00:14 2020 +0000 Adjusting config.resources for C384 commit 8bdf2387b1fad22cfa2da37a7428406a847d2362 Author: Kate.Friedman Date: Wed Feb 12 17:33:53 2020 +0000 Reducing npe_eobs from 400 to 100 commit 648479c086776b0c7b4d24b4052130d48dfaa64a Author: russ.treadon Date: Wed Feb 12 15:56:39 2020 +0000 Issue #1: add reservation keyword to rocoto workflow generator for WCOSS_DELL_P3 commit c48ac6df8c5c166a1ef51b173372f3f26f9d877a Author: Kate.Friedman Date: Wed Feb 12 14:46:26 2020 +0000 Removing gefs from checkout.sh and link_fv3gfs.sh commit fc0b624223baa54315e7ad52f48b689fff29bccb Author: fanglin.yang Date: Wed Feb 12 02:37:36 2020 +0000 modified: exglobal_fcst_nemsfv3gfs.sh -- bug fix commit 97e34f9728ae3bda52fe636994c91453b5cccfb4 Author: fanglin.yang Date: Tue Feb 11 17:47:34 2020 +0000 removed the following obsolete lines from exglobal_fcst_nemsfv3gfs.sh JCAP_STP=${JCAP_STP:-$JCAP_CASE} LONB_STP=${LONB_STP:-$LONB_CASE} LATB_STP=${LATB_STP:-$LATB_CASE} commit 60098fc7238b35f461ec9aa8fabd924fb09372ce Author: fanglin.yang Date: Tue Feb 11 17:34:40 2020 +0000 1. modified: parm/config/config.fcst to set lheatstrg=".false." 
for both NOAH-LSM and NOAH-MP per the decision made by the physics group. More development is required to use the canopy heat storage parameterization. 2. modified: scripts/exglobal_fcst_nemsfv3gfs.sh to 1) change the default of lheatstrg to ".false." 2) for ensemble forecast, change &nam_stochy ntrunc = $JCAP_STP lon_s = $LONB_STP lat_s = $LATB_STP EOF to &nam_stochy / EOF Phil Pegion noted: "If you are concerned about the ensemble runtime with stochastic physics on, I recommend removing ntrunc, lat_s, and lon_s from the namelist. I have fixed the bug that forced the spectral resolution of the random patterns to be related to the number of mpi tasks. Now the code calculates the appropriate truncation for a given length scale. It probably won't save a lot of time, but it is worth it. Scientifically, the patterns are the same." commit ababd08e676f45524ec616e60a0290c26d3b1795 Author: Boi Vuong Date: Tue Feb 11 16:28:12 2020 +0000 Updated data card grib2_awpgfs102.003 commit eef10607bedf12153580d85d5a7952ea90c8b198 Author: fanglin.yang Date: Tue Feb 11 16:14:29 2020 +0000 revert to the older w3emc lib since w3emc_para does not support mersenne_twister modified: modulefiles/gfs_bufr.hera modified: modulefiles/gfs_bufr.wcoss_dell_p3 commit 8fe97ac689aebb2243b6e69cbf7015e636d68852 Author: fanglin.yang Date: Tue Feb 11 15:45:33 2020 +0000 modified: modulefiles/gfs_bufr.hera and modulefiles/gfs_bufr.wcoss_dell_p3 to load parallel netcdf modules commit 78a7294eb1937758fa81f72cc5045d3d680a07c6 Author: kate.friedman Date: Mon Feb 10 19:29:15 2020 +0000 Fixed missing fi in checkout.sh commit 15628bda0621187949204d0d9af4d901e205495d Author: kate.friedman Date: Mon Feb 10 19:21:39 2020 +0000 Change fv3gfs checkout to develop branch of ufs-weather-model commit 631820beca07522faedccc47d7415c4b6b250273 Author: kate.friedman Date: Mon Feb 10 19:08:30 2020 +0000 Changing checkout.sh to Henrique's fork branch gfsv16_wave commit cc742b672af73931999bddf115bc32f07ed69f87 Merge: ba63c4f8
1d9a1f00 Author: russ.treadon Date: Mon Feb 10 17:19:58 2020 +0000 Issue #1: Merge branch 'develop' at commit:1d9a1f0 into feature/gfsv16b commit a518485866536102312c477926ae42a59d5e561e Author: kate.friedman Date: Mon Feb 10 17:02:22 2020 +0000 Changing fv3gfs.fd checkout back to gfsv16_updates commit c75d35b6802592b17640fecace38d9103a14b1bc Merge: 2a46ee2e ba63c4f8 Author: kate.friedman Date: Mon Feb 10 16:44:53 2020 +0000 Sync merge with feature/gfsv16b commit 2a46ee2ee71da5c75d11cbc96b048a1e34148239 Author: kate.friedman Date: Mon Feb 10 15:42:35 2020 +0000 Updates for wave and fcst scripts, as well as config.wave commit ba63c4f83955bfc38bc3f2fbd9f8c6447630a803 Merge: fc2144e6 75a9fb5d Author: Fanglin Yang Date: Sat Feb 8 20:35:14 2020 -0500 Merge pull request #17 from NOAA-EMC/feature/gfsv16b_paranetcdf Feature/gfsv16b paranetcdf commit 75a9fb5d4dd6d4ef91bdee93769031465ddd57e9 Merge: a2932c5b 6f256b4f Author: fanglin.yang Date: Sun Feb 9 01:22:29 2020 +0000 Merge branch 'feature/gfsv16b_paranetcdf' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b_paranetcdf commit a2932c5be934fb45416c92caa01f0fa3df4829d1 Author: fanglin.yang Date: Sun Feb 9 01:21:52 2020 +0000 modified: parm/config/config.base.emc.dyn modified: parm/config/config.base.nco.static commit 6f256b4f7aadb9913cd78962e19dcabcf7ba49e4 Author: Cory.R.Martin@noaa.gov Date: Fri Feb 7 20:07:24 2020 +0000 Added namelist option 'cld_amt' to enkf_chgres_recenter_ncio. Default is .false., when .true. 
will attempt to read in cld_amt and print error (but continue) if it is missing from input file commit eb0681751af6b850e99c880c11343af10000d54d Author: fanglin.yang Date: Fri Feb 7 16:20:43 2020 +0000 modified: sorc/checkout.sh update ufs_util repo commit b9feca17a5e566d75995049ba8e1497eb205b96d Author: fanglin.yang Date: Fri Feb 7 15:03:52 2020 +0000 modified: ../../modulefiles/module_base.hera modified: config.anal control ABI and AHI data usage modified: config.resources updates based on pull reviews modified: ../../sorc/gaussian_sfcanl.fd/makefile.sh use parallel netcdf commit eb73e520716215c3f11cc4cdfce3831408221766 Author: Mallory Row Date: Fri Feb 7 14:04:37 2020 +0000 Update EMC_verif-global checkout to verif_global_v1.5.0 commit bd00cb9812c5fb400ba4399d183b2198b8e80372 Author: Kate.Friedman Date: Fri Feb 7 13:41:05 2020 +0000 Issue #4 - bug fix in getic.sh for v15 commit 1c85197d7a1beb34f2e3a52969d631d42003e6eb Merge: 67dae409 1d9a1f00 Author: Kate.Friedman Date: Fri Feb 7 13:27:18 2020 +0000 Issue # 4 - Sync merge branch 'develop' into feature/remove_theia commit 1d9a1f00b73cb3852d352e9a41a15651a99fb656 Merge: 3ed9267b bdbecaa7 Author: Kate Friedman Date: Fri Feb 7 08:11:21 2020 -0500 Merge pull request #18 from lgannoaa/exception_handling Exception handling commit 85deaf78b86184f9b52afe5d68b370ba640c152b Author: kate.friedman Date: Thu Feb 6 19:16:18 2020 +0000 Updated resource configs based on C384 and C768 tests commit 4bd0e20300cc2a79e79433b2ec8cdb15c8f01c9e Author: Kate Friedman Date: Thu Feb 6 11:55:31 2020 -0500 Update README.md commit d9ea1acab54f65a987b32d56587dfd1b6bcd037c Author: Kate.Friedman Date: Thu Feb 6 16:03:11 2020 +0000 Issue #3 - reduce hashes down to minimum 8 characters commit 92d07793ce4be5ac1e12aedb536ace89ec6fcc7b Author: fanglin.yang Date: Thu Feb 6 15:28:15 2020 +0000 Update to use parallel netcdf libs modified: modulefiles/fv3gfs/enkf_chgres_recenter.wcoss_dell_p3 modified: modulefiles/fv3gfs/enkf_chgres_recenter_nc.hera 
modified: modulefiles/fv3gfs/enkf_chgres_recenter_nc.wcoss_dell_p3 modified: modulefiles/fv3gfs/gaussian_sfcanl.hera modified: modulefiles/fv3gfs/gaussian_sfcanl.wcoss_dell_p3 commit a6a28943d08089d96dcf3ea7953ffaf9877e3caa Author: fanglin.yang Date: Thu Feb 6 15:10:40 2020 +0000 Point new UPP tag upp_gfsv16_release.v1.0.2, for upgrading with netcdf_parallel 4.7.4, hdf_parallel 1.10.6 and w3emc_para 2.4.0 on Dell and Hera commit cf1c0a5cfec92cebaf648a09f42da435750154b6 Author: russ.treadon Date: Thu Feb 6 13:43:17 2020 +0000 Issue #16: Update EMC_verif-global to tag verif_global_v1.5.0 commit 41ca86bc4001549b98d6b6594288c9752215a1f3 Author: fanglin.yang Date: Thu Feb 6 04:19:11 2020 +0000 modified: parm/config/config.base.emc.dyn to point new obsproc_prep to support parallel netcdf commit 61b72ba02f920b4817ee199d9cae685e2e3c2234 Author: fanglin.yang Date: Thu Feb 6 01:10:06 2020 +0000 modified: config.resources commit 17bcdf2a1e0e8f1febfc6d57994a12ac625d9d75 Author: kate.friedman Date: Tue Feb 4 19:11:42 2020 +0000 Wave changes for running with IAU on commit 3c0ebdde3307741f88e6ecef4f59faaf5efdd5ad Author: fanglin.yang Date: Tue Feb 4 17:11:56 2020 +0000 modified: modulefiles/module_base.hera and modulefiles/module_base.wcoss_dell_p3 to point to parallel versions of netCDF libs and esmf lib modified: parm/config/config.efcs and parm/config/config.fcst to set chunksizes correctly for high-res and enkf forecasts commit f89bafe55aa0b15ae306dbaff8d97cbd17dc27c5 Author: russ.treadon Date: Mon Feb 3 14:09:33 2020 +0000 Issue #16: update DA checkout to feature/parallel_ncio commit bad2fc1a43a4df12f3f57c4da9e7612402885cd4 Author: fanglin.yang Date: Mon Feb 3 05:01:43 2020 +0000 modified: parm/config/config.fcst modified: parm/config/config.fv3 commit 183aac6be0ff71d8368ee139695aeca7caf72464 Author: fanglin.yang Date: Mon Feb 3 04:10:36 2020 +0000 Modified config.resources to use npe_node_max to define computing resources.
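The config.resources commit above derives per-node task counts from npe_node_max rather than hard-coding them per machine. A minimal sketch of that arithmetic — the variable values here are illustrative, not the actual settings:

```shell
# Illustrative: derive MPI tasks per node from the machine-wide core count,
# in the spirit of the npe_node_max-based config.resources. Values are made up.
npe_node_max=28                             # cores per node (example value)
nth_fv3=2                                   # OpenMP threads per MPI task
npe_node_fv3=$(( npe_node_max / nth_fv3 ))  # MPI tasks that fit on one node
```

Deriving everything from one machine-level constant is what lets the same config file serve multiple platforms.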
Pull over new settings found in the config files in v16rt2. Made certain adjustments and cleaned up certain parameters in a few scripts to remove redundancy. modified: parm/config/config.anal modified: parm/config/config.base.emc.dyn modified: parm/config/config.efcs modified: parm/config/config.eobs modified: parm/config/config.eupd modified: parm/config/config.fcst modified: parm/config/config.fv3 modified: parm/config/config.gldas modified: parm/config/config.post modified: parm/config/config.resources commit 9d5e0a0348ae7b825629bc4c601b8ee8e726418d Author: fanglin.yang Date: Sun Feb 2 19:04:59 2020 +0000 Github Issue #16 1. add options to workflow scripts to use parallel netcdf for I/O. The application varies with model resolution and computing platform. See config.fcst and exglobal_fcst_nemsfv3gfs.sh for the settings. 2. Update UPP to upp_gfsv16_release.v1.0.1 modified: modulefiles/module_base.hera modified: modulefiles/module_base.wcoss_dell_p3 modified: parm/config/config.fcst modified: scripts/exglobal_fcst_nemsfv3gfs.sh modified: sorc/checkout.sh deleted: parm/config/config.resources.C96 commit 3a66788620477dd3159fe9633a860c799f421eee Author: fanglin.yang Date: Sun Feb 2 05:17:40 2020 +0000 The following files are no longer used deleted: README.iau deleted: config.anal.iau deleted: config.base.emc.dyn.iau deleted: config.efcs.iau deleted: config.eobs.iau deleted: config.eupd.iau deleted: config.vrfy.iau commit fc2144e64bedc1f7fc7bd302e42baa94bfe06e7f Author: Guang.Ping.Lou Date: Fri Jan 31 16:09:00 2020 +0000 issue #15 Remove station height adjustment commit 72cb96233fb636b83ca9f339e23955629c97b82d Author: Guang.Ping.Lou Date: Fri Jan 31 16:08:20 2020 +0000 issue #15 Porting bufrsnd to Hera commit 0799ecd1922652d3d17e88dae62fea1c6d8a96e3 Author: Guang.Ping.Lou Date: Fri Jan 31 16:08:02 2020 +0000 issue #15 Porting bufrsnd to Hera and remove height adjustment commit a0ac43a8f1541ae47f156442a303689e5d74a9ec Author: Guang.Ping.Lou Date: Fri Jan 31 16:07:34
2020 +0000 issue #15 Porting bufrsnd to Hera and remove height adjustment commit 85dc2f2486f841e7ad4785e5170c2edd2fbba35e Author: Guang.Ping.Lou Date: Fri Jan 31 16:04:28 2020 +0000 issue #15 Porting bufrsnd to Hera commit 26d52df758693bde55b047bfb874df4439c6d57a Author: Guang.Ping.Lou Date: Fri Jan 31 16:03:20 2020 +0000 issue #15 Modify driver for test runs commit 0e34bc5f4d38d742a11e3275ee560708c417c37f Author: Guang.Ping.Lou Date: Fri Jan 31 14:53:37 2020 +0000 port v16b bufrsnd to Hera commit 87d3d18c291af5543de72e38e8dfe310223e1e1e Author: fanglin.yang Date: Fri Jan 31 03:53:20 2020 +0000 modified parm/config/config.resources to apply more tasks to gldas aprun_gaussian commit 50b70215096dbc87e0484ed0eea9572238c37432 Author: kate.friedman Date: Thu Jan 30 14:55:48 2020 +0000 Resource adjustments and fix for restart copying in exglobal commit bdbecaa7220f2462cc75e802570845809ebcfc75 Author: Lin.Gan Date: Wed Jan 29 15:15:52 2020 +0000 Display exception handling message for individual package with location of the log file commit 85b254d8e65318adaeef87ae7187218de59f94c5 Author: Henrique Alves Date: Tue Jan 28 21:18:35 2020 +0000 Adjusting resources for wave component commit 3f858be34da7134a071762376f4c4e69ea73f4a0 Author: kate.friedman Date: Tue Jan 28 20:43:02 2020 +0000 Further adjustment to config.efcs for restart_interval for cold start SDATE commit a674d30c8fe1da53c2d8bda5b695a474e8906f6b Merge: 9944f2fe 3fda3e35 Author: kate.friedman Date: Tue Jan 28 20:26:10 2020 +0000 Merge branch 'feature/wave2global' of https://github.com/NOAA-EMC/global-workflow into feature/wave2global commit 9944f2fe15576f0b401996fefe3c910888708e0d Author: kate.friedman Date: Tue Jan 28 20:25:59 2020 +0000 Resource adjustments for C384, restart_interval adjustment for cold start, and fcst dependency update commit b64fd5ff43f88bd5f2d27b0a16fb803eac8aff8c Merge: cf008631 3ed9267b Author: kate.friedman Date: Tue Jan 28 15:24:37 2020 +0000 Merge branch 'develop' into
feature/manage_externals commit cf0086311daaf62ee33df010946d0a0ddc5bc400 Author: kate.friedman Date: Tue Jan 28 15:23:14 2020 +0000 Issue #3 - remove copy of manage_externals under util and add README.md file commit 3fda3e359f66611c77a97ca9c30a6204a36077c3 Author: Henrique Alves Date: Tue Jan 28 14:18:47 2020 +0000 Adjusting output stride on station outputs, adding bulletin output to wavepostsbs using unified wave_outp_spec.sh (removing wave_outp_bull.sh), adjusting wave resources in config.fv3 commit e4bd7d09ef3cc06d14dbfcb07bd3f211d4a5e54c Merge: 99b19f18 b3d88593 Author: kate.friedman Date: Tue Jan 28 13:58:16 2020 +0000 Sync merge with feature/gfsv16b commit 99b19f18e0a01fcf49b442ec66f1aea43b16433c Author: kate.friedman Date: Tue Jan 28 13:54:46 2020 +0000 Disabling additional wave jobs for later implementation commit c12e87987113fa6f4b543bc0dae61d4259703c03 Author: Lin.Gan Date: Mon Jan 27 19:38:19 2020 +0000 Implement exception handling in build_all script commit bfc7bb0b237d4cf4240551aa8b9c5d025724ac16 Author: Kate.Friedman Date: Mon Jan 27 19:06:17 2020 +0000 Issue #3 - initial add of manage_externals and needed Externals.cfg. Also added .gitignore file and removed scripts/files associated with no-longer-used prod_util and grid_util builds. 
commit b3d885930adfc07c67bc8fbe146533868f96aba8 Author: russ.treadon Date: Mon Jan 27 17:54:31 2020 +0000 Issue #1: update gldas workflow dependency to improve parallel throughput commit e216205aec6f481b4e96c8707d485dd2eacfb0e7 Author: Henrique Alves Date: Mon Jan 27 00:01:14 2020 +0000 Updating resource config in config.fv3 for running cplwav in c384 with ppn=7,thrd=4; checkout.sh now points to the latest ufs-weather-model develop branch; updated fv3gfs_build.cfg to match ufs-weather-model commit 4db98e694624b62e45417b3bb6c7c771f633cb42 Author: russ.treadon Date: Fri Jan 24 17:18:14 2020 +0000 Issue #1 - update EMC_verif-global checkout to tag verif_global_v1.4.1 commit 4d5713d3983c6cc6a8e497e892760752e09f15a0 Author: Lin.Gan Date: Fri Jan 24 15:51:28 2020 +0000 Testing github commit commit 786806f3cd5d858615f3f74dec78891a2189ee79 Author: Mallory Row Date: Fri Jan 24 15:04:16 2020 +0000 Missed file format updates in a few places in config.metp commit 7d0f783a41d8f1c50e9251b54884410e0bb7ad1a Author: russ.treadon Date: Fri Jan 24 14:41:02 2020 +0000 Issue #1 - GFS v16 updates for MOS, checkout, and archive * scripts/run_gfsmos_master.sh.dell - update MOS script to point at MDL gfsmos * sorc/checkout.sh - check out new fv3gfs branch and UPP tag * ush/hpssarch_gen.sh - add loginc to gdas and gfs tarballs commit d0a3b53c8117676c351c50287d4583951c94d42c Author: Lin.Gan Date: Fri Jan 24 14:30:29 2020 +0000 init commit for exception handling branch commit c11dfef0f7fb8da2866e6a022ac0ff60044766a7 Author: Mallory Row Date: Fri Jan 24 14:07:29 2020 +0000 Update EMC_verif-global tag to verif_global_v1.4.1 commit df2e6b0bd9a2af2107a78c12f9c648a61fe0d5b9 Merge: bc07de02 09e68b48 Author: kate.friedman Date: Thu Jan 23 20:00:34 2020 +0000 Sync merge with feature/gfsv16b commit bc07de02af5743194cb9203d0d36e62e9a25b223 Author: kate.friedman Date: Thu Jan 23 19:43:12 2020 +0000 Fixed wavepostsbs dependencies commit 4b54c37b425c947e79c4ddc192a6e2f06b8528fa Author: kate.friedman Date: 
Thu Jan 23 15:04:20 2020 +0000 Turn off cplwav in efcs config and adjust dependencies for fcst job commit 0ea809c208ce606c957eed4d346a3828d8186010 Author: Mallory Row Date: Thu Jan 23 13:29:56 2020 +0000 Update file format variable in config.metp of online archive files commit b886c848a60871a86bb0b45d024a34368ad1d898 Author: kate.friedman Date: Wed Jan 22 21:01:34 2020 +0000 Change to base config and update to wave dependencies commit 154b118a7808bfa60abe0620e4bee21cb4faabee Author: Henrique Alves Date: Wed Jan 22 03:37:24 2020 +0000 Changing WAV_MOD_ID to MDC (model component) tag; updating some paths for wave components in several scripts. Correcting wave_tar bug. commit 0edcb211586f1f96d01f172215568fa7eee3a7a2 Author: Henrique Alves Date: Tue Jan 21 20:15:14 2020 +0000 Several changes to change the directory name from to etc. Updates to wavepostsbs. commit c0d7179f34837e40db9ccb1b941ad9f312283a6e Author: Mallory Row Date: Tue Jan 21 16:37:16 2020 +0000 Add updated env machine files for gfsmetp commit 82e690717d72c7b021c637270108f4bacfb6816d Author: Mallory Row Date: Tue Jan 21 16:29:23 2020 +0000 Update config.resources for gfsmetp commit 72e8adf1c8a8859786cbbcf1b976640d73c5c867 Author: Mallory Row Date: Tue Jan 21 16:19:31 2020 +0000 Update EMC_verif-global tag checkout to 1.4.0 commit 6872f79f3f9052377ff863da1bbac482548ee0ce Author: Mallory Row Date: Tue Jan 21 16:14:25 2020 +0000 Add rocoto METplus job script commit 9c94156670bd810561bd2a699648afb946511ea9 Author: Mallory Row Date: Tue Jan 21 16:09:13 2020 +0000 Changes to setup_workflow.py for gfsmetp metatask commit 09e68b4834b0b54552ef0ae3829a41063200fe5f Merge: a49e4e54 3ed9267b Author: russ.treadon Date: Mon Jan 20 22:05:53 2020 +0000 GitHub Issue #1 Merge branch 'develop' at revision 3ed9267 into feature/gfsv16b commit 0ad851882686087fdd21a4a8f88b65dd3960cd1c Author: kate.friedman Date: Fri Jan 17 17:29:09 2020 +0000 ACCOUNT fix in config.base.emc.dyn and dependency fix to setup_workflow.py commit
80f13fb4b785fbde8e925cfb6deb547297f34e70 Author: Henrique Alves Date: Fri Jan 17 03:29:35 2020 +0000 Removing underscore from COM wave directory names commit 39a9df9385631df4a3cba4e6663fc6ab5c3f238d Author: Henrique Alves Date: Fri Jan 17 03:08:41 2020 +0000 Changing back waveprep to include ice and currents by default commit c36df4d748bbc2a8dc7b4a044a1f62f637e3ba33 Author: Henrique Alves Date: Thu Jan 16 19:43:26 2020 +0000 Updating post sbs script to copy station files to correct directory. commit a49e4e5403b1e16025b2a5890cef3de22627119a Author: Boi Vuong Date: Thu Jan 16 14:59:08 2020 +0000 Added build script for gfs_util commit f1cd7ab43d24fc576c82ea227cd7d79cc08f4835 Author: Henrique Alves Date: Wed Jan 15 20:48:11 2020 +0000 Adding block sourcing config files into wave j-jobs commit b736f8315497bfc29fd77faa9d98bf558ddea919 Author: kate.friedman Date: Wed Jan 15 20:40:16 2020 +0000 Removed config sourcing from rocoto job scripts commit 4f6840b25d85152ea7393a8959619bb4caab9d67 Author: Henrique Alves Date: Wed Jan 15 20:38:41 2020 +0000 Removing dependency on log file for wave post sbs commit 429c409799f1999babbfb3653e6f03be66f8fec3 Merge: 3f685fd3 a6905174 Author: Henrique Alves Date: Wed Jan 15 20:35:24 2020 +0000 Merge branch 'feature/wave2global' of github.com:NOAA-EMC/global-workflow into feature/wave2global commit 3f685fd3423add0ba2cfb7c3115a674891c44e55 Author: Henrique Alves Date: Wed Jan 15 20:29:58 2020 +0000 Reinstating Ratkos NO NO in build_fv3.sh commit 8f48844906f851a41db742f7686f7e7cae6b95c0 Merge: 5eeaffa1 7c595189 Author: BoiVuong-NOAA Date: Wed Jan 15 20:12:52 2020 +0000 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit a69051742c5edd0e4d34d178329fc24cacede78a Author: kate.friedman Date: Wed Jan 15 20:12:20 2020 +0000 Adding supplemental config source to JGLOBAL_FORECAST for wave commit 5eeaffa1c3f4e35fe067b7e229f16f93574bcb86 Author: BoiVuong-NOAA Date: Wed Jan 15 20:12:08 2020 +0000 
Updated GFS drivers commit 7c595189e8c671a7b2eb6363d82020fb9ed58c13 Author: Boi Vuong Date: Wed Jan 15 20:09:54 2020 +0000 Github Issue #1 updated JGFS_PGRB2_SPEC_GEMPAK commit 1f5c593705df384608b1f1dd84df9114d5d8a812 Author: Henrique Alves Date: Wed Jan 15 19:59:27 2020 +0000 Removing KEEPDATA from JWAVE_POST_SBS commit 5d2f2a21947dfb00762b1085fd267ca959b7fea6 Author: Boi Vuong Date: Wed Jan 15 19:50:00 2020 +0000 Github Issue #1 modified rocoto gempak.sh commit baa06afcecef64faad51bfac3b4fd2c064df3fb5 Author: Henrique Alves Date: Wed Jan 15 19:32:06 2020 +0000 Changes to update sbs post and reconciling parameters for output post data. commit 8d430ceeac22bb8943bdf2052b85cccc9b621440 Author: BoiVuong-NOAA Date: Wed Jan 15 16:28:13 2020 +0000 GitHub Issue #1 Updated ush,gempak,jobs and docs commit 53f719f63f569bd061bd3cfd3b63d5040a962dea Author: BoiVuong-NOAA Date: Wed Jan 15 14:19:46 2020 +0000 GitHub Issues #1 Bugzilla ticket 889 fixed bug in rdbfmsua.f commit 8b94a2d6caad3bc5eebfe69bffc5224bebd1856a Author: BoiVuong-NOAA Date: Wed Jan 15 14:15:16 2020 +0000 GitHub Issues #1 Updated some gfs driver commit ed2317a953992974dc2e61517cfe3d22d3c2d75f Author: BoiVuong-NOAA Date: Wed Jan 15 05:10:51 2020 +0000 Updated GFS driver commit 3ed9267b2f540694e957ee33a746f00857a5a1a2 Author: kate.friedman Date: Tue Jan 14 19:32:36 2020 +0000 Issue #10 - mid-year update to bufr station list (develop) commit f59fc0c4550a5e8617fb70ca150e91a65de89e18 Author: BoiVuong-NOAA Date: Mon Jan 13 20:03:34 2020 +0000 GitHub Issue#1 Revert parm card for AWIPS grid 211 commit 21feab285859881814509d75c794bc4ae4f74a3b Author: BoiVuong-NOAA Date: Fri Jan 10 22:00:38 2020 +0000 GitHub Issue #1 Updated AWIPS grid 211 commit e292d96010dc1e63f28d2c9d95c5e61776abe0dd Author: BoiVuong-NOAA Date: Fri Jan 10 21:12:36 2020 +0000 GitHub Issue Updated AWIPS parm card grid#211 commit fcf2d09159f80142364fad616661014e434b2a89 Author: Guang.Ping.Lou Date: Fri Jan 10 20:55:57 2020 +0000 modify gfs_bufr.sh for 
generalization commit 528cde1998fa6b341794f4ad15820e8cbc517568 Author: Guang.Ping.Lou Date: Fri Jan 10 20:55:05 2020 +0000 modify exgfs_postsnd.sh.ecf commit 0b5a056b7c089e428806ceecb36ae0ddd0535060 Author: Guang.Ping.Lou Date: Fri Jan 10 20:40:21 2020 +0000 Added 25 bufr stations and all station j,i commit 349c299740b5e1d022faa2d17aa6aa05ed8d7165 Author: Guang.Ping.Lou Date: Fri Jan 10 20:36:57 2020 +0000 add module nemsio/2.2.3 commit 3f78f6a9abc89127afc293b8aa62912a142b58da Author: Guang.Ping.Lou Date: Fri Jan 10 20:35:06 2020 +0000 change gfsbufr.f makefile_module meteorg.f for either nemsio or netcdf commit ad40a4436835d7e6e8c7ec25a74b48562a5cbaf8 Author: Guang.Ping.Lou Date: Fri Jan 10 20:30:53 2020 +0000 added two interface subroutines for either netcdf or nemsio commit 1915aa921dfe2ef17799599e5a3084547caa3ca2 Author: kate.friedman Date: Fri Jan 10 18:36:32 2020 +0000 Issue #8 - pulled in config.metp and modifications to two setup scripts commit e6871dff5fc8de254c5d5d2e6d5576f87c909400 Author: BoiVuong-NOAA Date: Fri Jan 10 15:57:38 2020 +0000 Updated gempak/fix to add GOES 17 image files commit f4185149ddab7d6cabb695c102497cebbb64a693 Author: BoiVuong-NOAA Date: Fri Jan 10 10:36:23 2020 +0000 GitHub Issue#1 Removed util/fix and parm commit 07c1e8d88cc4266937f98261311aa38cbd05a451 Author: BoiVuong-NOAA Date: Thu Jan 9 15:12:28 2020 +0000 GitHub Issue #1 Removed FAX programs ../util/sorc commit 67dae40974485e7ffef4713209142301e5e4ba9e Merge: f78eb1b4 091f4ba1 Author: kate.friedman Date: Wed Jan 8 20:15:45 2020 +0000 Merge branch 'develop' into feature/remove_theia commit f78eb1b4228927a0937b5de2098ba2cece6a4aca Author: kate.friedman Date: Wed Jan 8 20:13:08 2020 +0000 Issue #4 - removed references to Theia and Theia scripts commit 091f4ba1d04f1600e352f2fe090ae9af0880c95d Author: kate.friedman Date: Wed Jan 8 19:37:45 2020 +0000 Issue #7 - missed update to gdas transfer file from GFSv15.2.5 updates commit 32bb8f60be17e70a76debb6d7307475f59fe8a11 Author: George
Gayno Date: Wed Jan 8 18:59:37 2020 +0000 feature/gfsv16b: Update to "env/HERA.env" - update computation of tasks used for parallel processing of GLDAS data according to Jesse's recent change to remove unused files. commit 465512fe5ab3e73459da57a720ac52caa21b69a6 Author: BoiVuong-NOAA Date: Wed Jan 8 14:50:34 2020 +0000 GitHub Issue #1 Updated modulefiles and sorc commit d6184a020889585803a84ce611b3b524b19fa294 Author: BoiVuong-NOAA Date: Tue Jan 7 22:36:05 2020 +0000 GitHub Issue #1 Updated ush and ecflow commit a45fad17460e5c9aa1f5f127d285462a5a5a9aae Author: BoiVuong-NOAA Date: Tue Jan 7 22:13:43 2020 +0000 GitHub Issue #1 Updated sorc and modulefiles commit 98fcd0da14c87364d4faab261f6823e76cb303ba Author: Boi Vuong Date: Tue Jan 7 21:08:06 2020 +0000 GitHub Issue #1 Updated Release_Notes.gfs_downstream commit 9c384d1a5635403d38a8786ed4145dab054b4345 Author: Boi Vuong Date: Tue Jan 7 21:00:58 2020 +0000 GitHub Issue #1 Updated GFS AWIPS parm cards commit 293f0f1910a0f112a3155cee1651bea80ceff098 Author: Boi Vuong Date: Tue Jan 7 19:56:06 2020 +0000 GitHub Issue #1 Updated GFS AWIPS parm cards commit 465d6294b1f164b2f42a1fd50783e5151ef05a6d Author: Boi Vuong Date: Tue Jan 7 19:43:49 2020 +0000 GitHub Issue #1 Updated GFS v16.0 driver commit ead449d9d764fe0824db27bbb9b82c34b1d32a8b Author: Boi Vuong Date: Tue Jan 7 15:21:05 2020 +0000 GitHub Issue #1 Updated gempak files commit e45a4a956f85026ca28b4e0d1e9cb6a05cd247b2 Author: Boi Vuong Date: Mon Jan 6 20:51:49 2020 +0000 GitHub Issue #1 Updated AWIPS_WMO_parm files commit 6b035ff7990484b4f167183eba09be79bab3ed5c Author: Boi Vuong Date: Mon Jan 6 19:06:29 2020 +0000 GitHub Issue #1 Removed GFS synthetic GOES 12/13 on global 1 deg commit ff9b3f11fbf6c214a15e3db09b1a6f0f28865310 Author: Boi Vuong Date: Mon Jan 6 16:48:13 2020 +0000 GitHub Issue #1 Removed bulletins jobs and scripts commit f11ecbc9733a819a5f78642105a8cf727b08610e Author: George Gayno Date: Mon Jan 6 15:59:57 2020 +0000 feature/gfsv16b: Update 
"env/HERA.env" for parallel processing of GLDAS data on Hera. commit 663a29f1aee13e9c32aa140c5cd1e0ad10479f2b Author: George Gayno Date: Fri Jan 3 21:40:51 2020 +0000 feature/gfsv16b branch: Updates for GLDAS to optimize data processing: Add variable APRUN_GLDAS_DATA_PROC to env/WCOSS_C.env to run data processing in parallel with cfp on Cray. commit 74e25778c104399b039b33dbeeb87e068f9f26b2 Author: George Gayno Date: Fri Jan 3 19:35:56 2020 +0000 feature/gfsv16b branch: Updates for GLDAS data processing optimization: (1) Delete JGDAS_GLDAS job and instead link the GLDAS repo version using 'link_fv3gfs.sh'; (2) Update 'link_fv3gfs.sh' to link new gldas script 'gldas_process_data.sh'; (3) Add environment variable - APRUN_GLDAS_DATA_PROC - to WCOSS_DELL_P3.env to run data processing with 'cfp' on Dell. commit 99069319a63ba02c1d747ec601425e91e6bb9d62 Merge: 71ee0e33 a43e4270 Author: fanglin.yang Date: Sun Dec 29 22:50:32 2019 -0500 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit 71ee0e333370a24a1f4da5a00a5ab079af446c76 Author: fanglin.yang Date: Sun Dec 29 22:48:12 2019 -0500 further update arch.sh to remove gdas cycle old directories passing the gldas sflux file retention period commit a43e4270b269fcd3d8a97da6a5f2e0d2660cc234 Author: russ.treadon Date: Fri Dec 27 14:29:07 2019 +0000 Issue #1 - default OUTPUT_HISTORY to ".true." 
in ush/hpssarch_gen.sh commit 784fb65dee27135340771b1c778d1d69219ba05c Merge: 9562fc71 7440aad1 Author: Henrique Alves Date: Fri Dec 27 05:01:12 2019 +0000 Merging latest feature/gfsv16b branch into wave2global commit 9562fc71eb80e33f4d132dd9ad9ca5903be1435f Author: Henrique Alves Date: Fri Dec 27 04:47:20 2019 +0000 Adjusting resources and env for running C384 coupled commit 5ce0bdf026c4c9a4a62252a264816710558dfd20 Author: Henrique Alves Date: Thu Dec 26 19:00:42 2019 +0000 Changes to support wave side by side post commit 7440aad157110974db359e6b0910758fea121844 Author: russ.treadon Date: Thu Dec 26 12:57:40 2019 +0000 Issue #1 - add or dependency on loganl.txt for ecmn metatask and esfc task commit 383a00df519d43342151650c9ac222b6a995d42f Author: russ.treadon Date: Thu Dec 26 11:41:36 2019 +0000 Issue #1 - (1) load g2tmpl/1.6.0 in module_base.wcoss_dell.p3; (2) set DOSFCANL_ENKF in config.esfc commit c2296c6937992fd1dd50e79b8829d69befc9ec12 Author: fanglin.yang Date: Wed Dec 25 11:19:35 2019 -0500 Issue #1 - update arch.sh to save sfcanl for running gldas as ICs commit 68ab52f8b000aab6bde0f5bc106b014ff42f2318 Author: fanglin.yang Date: Tue Dec 24 12:45:56 2019 -0500 modified jobs/rocoto/arch.sh to clean all but sflux grib2 files within the last 96 hours of forecast cycles and before RMOLDEND if DO_GLDAS=YES commit b1b9a7ee651d4faa853e4662701125340c9859a2 Author: fanglin.yang Date: Mon Dec 23 20:18:00 2019 -0500 modified setup_workflow.py -- update gdasfcst step dependency if gldas is turned on commit b4d90d97b44a546b0845bb0bc4e30358ab27bb06 Merge: 96c28da1 49af82c3 Author: fanglin.yang Date: Sun Dec 22 14:29:13 2019 -0500 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit 96c28da1f23b577070b2e9722eba558cb151d6df Author: fanglin.yang Date: Sun Dec 22 14:27:11 2019 -0500 modified: modulefiles/module_base.hera modified: ush/hpssarch_gen.sh commit 07d9ec5aa1e54e9b78ee1e18b0fa0d984f129bea Author: Henrique Alves
Date: Sun Dec 22 17:55:13 2019 +0000 Temporary changes to work with updated WW3 code under GEFS_v12 and to bugfix wave_post_sbs. commit 49af82c306f0d06c683cd637d164d0b2c16b8615 Author: russ.treadon Date: Fri Dec 20 18:41:43 2019 +0000 Issue #1 - update handling of model restarts upon forecast completion in exglobal_fcst_nemsfv3gfs.sh commit a69df4b414f5247a9db24ddebb42bc409f26d090 Author: Henrique Alves Date: Fri Dec 20 12:58:02 2019 +0000 Additional changes to have wave init, prep, postsbs handed over for experiments by gfsv16 group commit 2d8499c9b9331aea6abf5d0910c0f26b9047452a Author: russ.treadon Date: Thu Dec 19 19:01:31 2019 +0000 Issue #1 - Fit2Obs changes (1) provide default definition for CONVNETC (2) update WCOSS_DELL_P3 fitdir commit d50076fb3ab4491be632d1ba2f5d7673c3c9b42b Author: russ.treadon Date: Thu Dec 19 18:45:47 2019 +0000 Issue #1 - (1) set RUN_ENVIR to OUTPUT_FILE for Fit2Obs, (2) change log file dependency for post000 job commit f6d5044b40c76bdcdf37ebccfbd8cb64e3413bde Merge: ead3c9b4 3fd4bcfa Author: russ.treadon Date: Wed Dec 18 19:21:11 2019 +0000 Issue #1 - merge branch 'develop' at commit:3fd4bcfa into feature/gfsv16b commit 46aa4bae69f1a9a73965b863557cd359161ff7d6 Author: Henrique Alves Date: Tue Dec 17 21:02:05 2019 +0000 Adding resource adjustments to allow fcst step to run the coupled FV3-WW3 system.
Adding the WAV_POST_SBS job, modifying waveprep to use new unified point output grid tag uoutpGRD commit 3fd4bcfa0bb5e133774b5ad64aaf45b42074c05c Author: kate.friedman Date: Tue Dec 17 19:12:28 2019 +0000 GitHub Issue #2 - GFSv15.2.6 obsproc version update, earc bug fix, and tracker path update commit e76895309db47e7c1d53b50a283f2851c57bf5e2 Merge: 8441f9da ead3c9b4 Author: fanglin.yang Date: Mon Dec 16 21:56:37 2019 -0500 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit ead3c9b41f13260ee838c94a5956fdb7bea62ee3 Author: fanglin.yang Date: Tue Dec 17 02:46:45 2019 +0000 recommit updated exglobal_fcst_nemsfv3gfs.sh commit 8441f9da476cc179bcffc4212e90e8a24ca41b93 Merge: 66a1f833 41208b38 Author: fanglin.yang Date: Mon Dec 16 21:28:37 2019 -0500 Merge branch 'feature/gfsv16b' of https://github.com/NOAA-EMC/global-workflow into feature/gfsv16b commit 41208b38e0d641e86db575e3a29a83bd9133f05e Author: fanglin.yang Date: Tue Dec 17 01:51:47 2019 +0000 modified: exglobal_fcst_nemsfv3gfs.sh add postxconfig-NT_FH00.txt for inline post define rst_invt1 as the first element of restart_interval since restart_interval is no longer a single number commit 66a1f833592fd3790158792bdbd656b4ddfcbfd0 Author: fanglin.yang Date: Mon Dec 16 20:47:13 2019 -0500 modified: exglobal_fcst_nemsfv3gfs.sh add postxconfig-NT_FH00.txt for inline post define rst_invt1 as the first element of restart_interval since restart_interval is no longer a single number commit b6201678a6436f2821b20d9199d50b5f9d67afe7 Author: fanglin.yang Date: Mon Dec 16 20:48:23 2019 +0000 modified: sorc/build_fv3.sh to force the compiler to clean and recompile with any changes deleted: parm/config/config.base. This is created at the link time. 
Should not be included in the repo commit c805357ac81b4ede287043339568936c761e37d6 Author: fanglin.yang Date: Mon Dec 16 19:10:13 2019 +0000 Vlab issue #65358 change model repo in checkout.sh to https://github.com/junwang-noaa/ufs-weather-model commit b17c0524648b4b975d64d0be07e0751133466148 Author: fanglin.yang Date: Mon Dec 16 13:56:33 2019 -0500 Vlab issue #65358 modified: sorc/checkout.sh to checkout new model branch postRstUgwd modified: scripts/exglobal_fcst_nemsfv3gfs.sh to include postxconfig-NT_FH00.txt to use the new UPP modified: parm/config/config.fcst add instruction for setting restart_intervals for different applications:
    # restart_interval:        $number
    #    number=0, writes out restart files at the end of forecast.
    #    number>0, writes out restart files at the frequency of $number and at the end of forecast.
    # restart_interval:        "$number -1"
    #    writes out restart files only once at $number forecast hour.
    # restart_interval:        "$number1 $number2 $number3 ..."
    #    writes out restart file at the specified forecast hours
    export restart_interval=${restart_interval:-6}
commit 1df4ff49472efc5d419d4f867d3fa9169e929150 Author: russ.treadon Date: Mon Dec 16 15:30:03 2019 +0000 Issue #1 - remove references to sfc from ecen script and ecen workflow * jobs/rocoto/ecen.sh - remove FHSFC_ECEN * ush/rocoto/setup_workflow.py - remove fhrsfc and associated scripting commit b8cff19a2cea88b1f73e0828f9c47b6b85f20ceb Author: Henrique Alves Date: Mon Dec 16 06:13:26 2019 +0000 Fixing issues with waveinit and waveprep and attempting to run gdasfcst. commit 4048d5b24a1b3c589c3a5d9147381d7a159f499c Author: fanglin.yang Date: Sun Dec 15 19:25:11 2019 -0500 modified: HERA.env JET.env WCOSS_C.env to set up env parameters to run the newly added esfc step.
commit 1598a13aa41b3b78ec75c05d79aa285f6efc0f46 Merge: a4c2b0cf 9740a139 Author: fanglin.yang Date: Sat Dec 14 11:52:16 2019 -0500 Merge branch 'feature/gfsv16b' of gerrit:global-workflow into feature/gfsv16b commit a4c2b0cf28209bc19a9e7d30b5c94493e11ce70a Author: fanglin.yang Date: Sat Dec 14 11:51:25 2019 -0500 set export WRITE_DOPOST=F in config.efcs to force efcs step not to use inline post commit 9740a139c8204c829eab8ec48dc4a7408dc752c2 Author: russ.treadon Date: Sat Dec 14 11:46:29 2019 +0000 VLab Issue #65358 - correct efcs dependency error in setup_workflow.py commit 25b0a9fd0895e19f91578bbc77d690600548ed57 Author: fanglin.yang Date: Fri Dec 13 16:28:22 2019 -0500 VLab Issue #65358 set OUTPUT_HISTORY=".false." in config.base as default commit 61f3259272a49c5c5b6b494aa322f724553be73c Author: fanglin.yang Date: Fri Dec 13 16:19:33 2019 -0500 modified: parm/config/config.arch and parm/config/config.vrfy to get rid of the following error messages when running setup_workflow.py .../config.vrfy: line 39: [: =: unary operator expected .../config.arch: line 21: [: too many arguments commit bcfcd23cc11644dc138b1d4aa19ad22a3c3de53c Author: kate.friedman Date: Fri Dec 13 20:31:18 2019 +0000 Added additional wave jobs to setup scripts commit 97cdff0342611b2b2ae4fccc1990a2787f301b3a Author: russ.treadon Date: Fri Dec 13 20:16:39 2019 +0000 VLab Issue #65358 - config file updates * parm/config/config.ecen - define NECENGRP (number of ecen groups) * parm/config/config.resources - add gldas to help comments commit 708454372802679d8415629617c564264e05f4ff Author: russ.treadon Date: Fri Dec 13 19:46:38 2019 +0000 VLab Issue #65358 - modify ecen to reduce run time * refactor task ecen as metatask ecmn * add ensemble sfcanl tile job commit 290bc09e17bb41ce94f4a98c7e942a0e67757093 Author: kate.friedman Date: Fri Dec 13 19:04:38 2019 +0000 Adding files for additional wave jobs commit 65410e5ccec775f07a0836e67ca6058efc27db7a Author: russ.treadon Date: Fri Dec 13 17:35:47 2019
+0000 VLab Issue #65358 - updates to FSU tracker * jobs/JGFS_FSU_GENESIS - correct typo * jobs/rocoto/arch.sh - copy tracker to ARCDIR * jobs/rocoto/vrfy.sh - add section to execute FSU tracker * modulefiles/module_base.wcoss_dell_p3 - load FSU tracker module * parm/config/config.vrfy - add VRFYFSU flag; update to ens_tracker v1.1.15.2; set GENESISFSU commit 7b0bc3e8cff68764907bc838825d373ceb1820fd Author: fanglin.yang Date: Fri Dec 13 04:16:40 2019 +0000 modified: sorc/checkout.sh to check out gldas repo master instead of a feature branch modified: env/WCOSS_C.env correct a bug commit 8ff056ea594f6ff92fcc39ee49c1cd4549ead4c0 Author: kate.friedman Date: Thu Dec 12 18:32:11 2019 +0000 VLab Issue #65358 - added new FSU tracker JJOB script commit 23ea303716477284c74b187d4142f40c4f4a2e1b Author: Henrique Alves Date: Thu Dec 12 05:19:27 2019 +0000 Updating build_fv3.sh to correctly add ww3 compile options commit 212aeb449300a0795d8d0055ab6f12f39bfc4d98 Author: Henrique Alves Date: Thu Dec 12 02:58:50 2019 +0000 Adding WW3 as a component in gfsv16 to enable coupled runs. commit 6e450a8947967795a9fab9da2b5ebc4d415c0387 Author: Henrique Alves Date: Thu Dec 12 02:51:08 2019 +0000 Changes to make wave init functional end-to-end under GFSv16 workflow.
Removing parm/wave directory (to be moved to GEFS repo, in GFSv16 parm were converted to config) commit 530795269fd678f7ee6c7354d3fe674b8c46a458 Author: Kate.Friedman Date: Wed Dec 11 20:47:44 2019 +0000 HOTFIX - VLab Issue #72346 - fix to rocoto_viewer on Hera commit c4b7e07b355b9e92a8070659210cfd95961a1b05 Author: kate.friedman Date: Wed Dec 11 15:01:28 2019 +0000 Added wave tasks to free-forecast mode and adjusted cycled mode dependencies commit 1c8f1e8a937dcdcf0b355292b0c81d6667398186 Author: russ.treadon Date: Wed Dec 11 14:17:23 2019 +0000 Vlab issue #65358 Add section to copy GDAS atcfunix files to $ARCDIR commit e36fed5ffaed435a2d774fa859b0aebd62de149b Author: Henrique Alves Date: Wed Dec 11 13:27:52 2019 +0000 Modifying spec of unified output point parameter from buoy to uoutpGRD to avoid errors as parameter buoy is also used to label output points themselves commit 126bd34442e34862da1ac1eaf00a2d9903cdd595 Author: Henrique Alves Date: Tue Dec 10 18:44:27 2019 +0000 Partial changes to add side-by-side post commit 66add7f33225e5bef12c941fd27b9c0539b155df Author: kate.friedman Date: Tue Dec 10 17:18:00 2019 +0000 Added wave rocoto job scripts, configs, and setup criteria commit b6a2742ebdce9e522a9b6f5af77d3f8a379c8413 Author: Henrique Alves Date: Mon Dec 9 16:36:20 2019 +0000 Adding changes to ensure RTOFS current files are read and prepared for coupled GFS runs. Several changes to add side-by-side post. 
commit dea2c7b68483673f74426c997070d3b71e01abe2 Author: russ.treadon Date: Mon Dec 9 14:36:59 2019 +0000 Vlab issue #65358 * env/WCOSS_DELL_P3.env allow forecast model to use up to 28 pe per node on WCOSS_DELL_P3 * ush/gaussian_sfcanl.sh - set NETCDF_OUT based on OUTPUT_FILE commit e7b3b9a0b16f7bf84ab2690a641cf4061b5f7097 Author: fanglin.yang Date: Sat Dec 7 13:43:54 2019 -0500 modified: config.base.emc.dyn remove duplicated IAU session modified: ../../ush/hpssarch_gen.sh add missing gfs analysis increment files to HPSS archive commit 9d447f9363e999d5a7f5ff337c68a225a137bad9 Author: Henrique Alves Date: Fri Dec 6 20:48:56 2019 +0000 Reconciling exglobal_fcst, checkout and link scripts to allow running both GFS and GEFS configs. Adding ability to link out_grd/pnt wave files to COMOUTWW3 commit db1d3be780e8bcee8ac1352d5d8fe4b604d330f9 Merge: 25e423bd 37a3bd27 Author: Henrique Alves Date: Fri Dec 6 15:51:10 2019 +0000 Merging feature/gfsv16b 37a3bd27 into feature/wave2global commit 37a3bd27f3a81e76366c8cf06b9a43e7764b1d2c Author: Guang.Ping.Lou Date: Fri Dec 6 13:59:37 2019 +0000 placement change for getncdimlen commit 53679d41ce93f07a724899a6c4b6273494207777 Author: Guang.Ping.Lou Date: Thu Dec 5 20:59:25 2019 +0000 remove exgfs_postsnd.sh.ecf_netcdf commit 25e423bd5c804c73b2f9ba12ed3c29d4d2e13abd Author: Henrique Alves Date: Thu Dec 5 19:50:43 2019 +0000 Cleaning up (changing wavemodTAG to WAV_MOD_TAG etc) commit edb1e10fd794ba6b547a2fce09cad591c89a9d46 Merge: f8b86c4e 395b55bb Author: Guang.Ping.Lou Date: Thu Dec 5 19:38:15 2019 +0000 Merge branch 'feature/gfsv16b' of ssh://vlab.ncep.noaa.gov:29418/global-workflow into feature/gfsv16b Pull newly committed files commit f8b86c4e760d8b568a48d27aaba55884b263dad5 Author: Guang.Ping.Lou Date: Thu Dec 5 19:35:59 2019 +0000 LEVS is obtained by getncdimlen commit 395b55bba93a5e2485b222ddfeda4b77bf7364e8 Author: kate.friedman Date: Thu Dec 5 19:34:13 2019 +0000 VLab Issue #65358 - small fixes after sync merge with develop
branch commit bd7c5fb5ff29085d1a78370829ed395e826002eb Author: Guang.Ping.Lou Date: Thu Dec 5 19:26:46 2019 +0000 update to include nc utility getncdimlen commit d66e380827a2be46a40d0ea324fc34787f9b8644 Merge: a89c1de3 4cfec8b5 Author: kate.friedman Date: Thu Dec 5 19:19:09 2019 +0000 Merge branch 'feature/gfsv16b' of gerrit:global-workflow into feature/gfsv16b commit a89c1de349e6a557dda62100ad7924cd05be3217 Merge: 0cc97b17 e4b6b7d3 Author: kate.friedman Date: Thu Dec 5 19:19:04 2019 +0000 Sync merge with develop branch commit 4cfec8b51ec9e2b3d923f9394eec721fd72e6ec5 Author: Boi Vuong Date: Thu Dec 5 18:56:20 2019 +0000 Updated GEMPAK jobs commit 0cc97b170a3f7b3ac43a0291ed7590ac1193d389 Author: fanglin.yang Date: Thu Dec 5 16:15:13 2019 +0000 update config.vrfy to point to /gpfs/dell1/nco/ops/com/gfs/prod/syndat instead of the default /gpfs/tp1/nco/ops/com/arch/prod/syndat defined in JGFS_CYCLONE_TRACKER commit eb2197553c83c8c2c7b8233781fa8caac2bf7fe7 Author: Henrique Alves Date: Thu Dec 5 05:58:01 2019 +0000 Adjustments to adding stats step commit ee2e88f2275fd9cf19f2e4edb0b985031b4436dc Merge: 9c8f1bac ad441181 Author: fanglin.yang Date: Wed Dec 4 21:53:57 2019 +0000 Merge branch 'feature/gfsv16b' of gerrit:global-workflow into feature/gfsv16b commit 9c8f1bac623d048ffbf9e1000c372d24d0455f63 Author: fanglin.yang Date: Wed Dec 4 21:51:10 2019 +0000 GFS.v16 workflow is still using the gfs.v15 version of syndat_qctropcy.sh.   syndat_qctropcy.sh was updated during gfs.v15.2 development.  A bug was recently discovered which led to tcvitals not being copied back to HOMENHC (named nhc). I have copied the latest syndat_qctropcy.sh from ./nwprod/gfs.v15.2.5/ush to gfs.v16 workflow branch feature/gfsv16b, and made further changes to it per Diane's suggestion.       
if [ -s $HOMENHC/tcvitals ]; then
   cp nhc $HOMENHC/tcvitals
fi
if [ -s $HOMENHCp1/tcvitals ]; then
   cp nhc $HOMENHCp1/tcvitals
fi
commit ad44118153c8b906058401b8e68a658f0e14dc40 Author: russ.treadon Date: Wed Dec 4 20:48:20 2019 +0000 Correct typo in ufs-weather-model tag. The correct tag is gfs_v16.0.1, not gfs.v16.0.1. commit 094b43cd70bc7ea5df0feb4143899e15070ab9d3 Author: russ.treadon Date: Wed Dec 4 20:44:50 2019 +0000 Update ufs-weather-model to tag gfs_v16.0.1 commit 415def2b802e28710235d74df033a64bc51b05a6 Author: Guang.Ping.Lou Date: Wed Dec 4 17:54:30 2019 +0000 update build_gfs_bufrsnd.sh commit d7c826ffbf32140232b9b25f6ae16151499989eb Author: Guang.Ping.Lou Date: Wed Dec 4 15:39:44 2019 +0000 update to gfs_bufr.sh for NetCDF commit e8e838f761ae78e57551bd377d4a625147680b4d Author: Guang.Ping.Lou Date: Wed Dec 4 15:38:48 2019 +0000 update to exgfs_postsnd.sh.ecf commit 814cdbce073da3fcc4af0b0a2d052d058a89dc51 Merge: e28880e7 6b6b72ee Author: Guang.Ping.Lou Date: Wed Dec 4 15:35:48 2019 +0000 Merge branch 'feature/gfsv16b' of ssh://vlab.ncep.noaa.gov:29418/global-workflow into feature/gfsv16b commit e28880e732afbd737b10cacccb17c614a0176dd8 Author: Guang.Ping.Lou Date: Wed Dec 4 15:15:53 2019 +0000 bufr sounding run script updates for NetCDF commit 6b6b72ee6473191cb04a14943387749d0861f3bf Merge: e374ae8a be27010c Author: russ.treadon Date: Wed Dec 4 14:32:48 2019 +0000 Merge branch 'vrfy_metplus' at commit:be27010c into feature/gfsv16b commit e374ae8a5a99a459fb0ccafef65af336f173063c Author: fanglin.yang Date: Tue Dec 3 16:36:40 2019 -0500 remove duplicated checkout of gldas commit ed084df89006436392777bdce7502b426c505a8c Author: Henrique Alves Date: Mon Dec 2 16:05:53 2019 +0000 Changes made to file names to adjust to wave scripts ported to global-workflow commit be27010c910430722f9bbf1d5bf33f8573d0af6a Merge: dd3a89e5 f7754702 Author: Mallory Row Date: Mon Dec 2 15:12:17 2019 +0000 Merge branch 'feature/gfsv16_hera_crm'
into vrfy_metplus commit 225229335fbc90cdf46d211d3dc119ea3c4586e1 Author: fanglin.yang Date: Mon Dec 2 04:42:17 2019 +0000 update config.arch to keep at least 96 hours of gdas data that are used as forcing to drive gldas forecast commit 0fbde08210709c83602092f7ddf5702b375af8a9 Author: fanglin.yang Date: Mon Dec 2 03:59:28 2019 +0000 modified: HERA.env and JET.env to include APRUN definitions for gldas jobs. modified: ../parm/config/config.gldas to add CPCGAUGE to define locally staged CPC Gauge precipitation datasets. commit 3bf2325afd9eff8b1e32ec3cf122e36aea4b7687 Author: fanglin.yang Date: Mon Dec 2 02:22:44 2019 +0000 fix a bug in link_fv3gfs.sh commit 5127533d8be5c3c1ea3ee69d864cf577de8eda6c Author: fanglin.yang Date: Sun Dec 1 21:32:11 2019 +0000 update jobs/JGDAS_GLDAS commit ce071a37c62906d9ff7a8d1a6a9f805a12f77d44 Merge: d73b3070 80faa242 Author: fanglin.yang Date: Sun Dec 1 17:21:15 2019 +0000 Merge branch 'gfsv16_gldasnoah' of gerrit:global-workflow into feature/gfsv16b commit 80faa242cfd613df12e59ef0016e1068c8db1beb Author: fanglin.yang Date: Sun Dec 1 17:18:03 2019 +0000 update JGDAS_GLDAS rocoto/gldas.sh to skip all but 00Z cycle for gldas step commit d73b307000bbc87b4c4e30ea7906f5c36dfe190b Merge: f7754702 1761b012 Author: fanglin.yang Date: Sun Dec 1 04:23:28 2019 +0000 Merge branch 'gfsv16_gldasnoah' of gerrit:global-workflow into feature/gfsv16b, which is cloned from branch feature/gfsv16_hera_crm. All conflicts have been resolved. All modified files have been cross-checked. This merge adds an option to run GLDAS after the gdas anal step to spin up the land surface model with observed precipitation. GLDAS is checked out from https://github.com/NOAA-EMC/GLDAS. So far GLDAS is only run once per day (00Z cycle). GDAS surface ICs on the tiles are updated with GLDAS spin-up soil moisture and soil temperature over snow-free land surface.
The GLDAS spin-up run uses gdas flux files and CPC daily precip as forcing, and is run for the past 72 or 69 hours depending on the option of DOIAU. If DOIAU=NO, surface ICs centered at the analysis time are updated; if DOIAU=YES, surface ICs at the beginning of the IAU window (-3 hr) are updated. The xml file created by running setup_workflow.py contains the gdas_gldas step only if RUN_GLDAS is set to YES in config.base.
new file: driver/gdas/para_config.gdas_gldas
new file: driver/gdas/test_gdas_gldas.sh
modified: env/WCOSS_C.env
modified: env/WCOSS_DELL_P3.env
new file: jobs/JGDAS_GLDAS
new file: jobs/rocoto/gldas.sh
modified: parm/config/config.base.emc.dyn
modified: parm/config/config.base.emc.dyn.iau
modified: parm/config/config.base.nco.static
new file: parm/config/config.gldas
modified: parm/config/config.resources
modified: sorc/build_gldas.sh
modified: sorc/checkout.sh
modified: sorc/fv3gfs_build.cfg
modified: sorc/link_fv3gfs.sh
modified: sorc/partial_build.sh
deleted: ush/rocoto/config.base
modified: ush/rocoto/setup_workflow.py
commit 1761b01213fcb5a6cca16e5b1f5fc9885df8f475 Author: fanglin.yang Date: Sat Nov 30 16:42:27 2019 +0000 modified: jobs/rocoto/gldas.sh modified: parm/config/config.base.emc.dyn modified: parm/config/config.base.emc.dyn.iau modified: parm/config/config.base.nco.static modified: parm/config/config.gldas modified: scripts/exglobal_fcst_nemsfv3gfs.sh commit 49d7b118c6ee980ad2c3e1e5d8e960926fe5cc1f Author: Henrique Alves Date: Wed Nov 27 17:09:34 2019 +0000 Saving changes to add gwes POST to global-workflow commit dd3a89e5b1317ca41581bfeafb061f2ae60829ed Merge: ee0f88f8 de8febca Author: Mallory Row Date: Tue Nov 26 18:19:51 2019 +0000 Merge branch 'feature/gfsv16' into vrfy_metplus commit edcc6b11b1e3ed9fa3bf41e64959218c6cc87233 Author: fanglin.yang Date: Tue Nov 26 16:48:48 2019 +0000 remove exgdas_gldas.sh.ecf from global-workflow repo.
Use the version under the gldas repo commit ee0f88f8179a3e34260e0a45e9340f26fef7e524 Author: Mallory Row Date: Tue Nov 26 15:10:34 2019 +0000 Add parameter to control getting prepbufr data from HPSS commit 8de30c43143336d2ff455b29501f925d21dddaf3 Author: Mallory Row Date: Tue Nov 26 15:09:55 2019 +0000 Update EMC_verif-global tag to verif_global_v1.4.0 commit 11d7fae23fc23998722f0f742add4d87af5bab61 Author: Mallory Row Date: Tue Nov 26 15:08:56 2019 +0000 Update metp resources commit f7754702b92d877ce205304bb4229e2f4d0e9eb4 Merge: 3e05eacd de8febca Author: russ.treadon Date: Mon Nov 25 17:14:58 2019 +0000 Merge branch 'feature/gfsv16' at commit:de8febca into feature/gfsv16_hera_crm commit 3e05eacd483503fad68f3adf779c929668b47e80 Author: russ.treadon Date: Mon Nov 25 15:55:49 2019 +0000 Workflow changes for nc/nemsio * sorc/build_fv3.sh - remove "NO NO" from fv3 compile.sh, remove commented out mv line * ush/hpssarch_gen.sh - update enkf HPSS archive list for use with nc/nemsio * ush/rocoto/setup_workflow.py - replace ".nc" suffix with gridsuffix variable commit 5dff08e8813b5fe1c2c662c582ee5eaa010550a6 Author: fanglin.yang Date: Mon Nov 25 02:34:17 2019 +0000 further updated exgdas_gldas.sh.ecf to point to the 6-tile restart files instead of copying them to the local running directory to speed up the job commit ba7a01d9e3b316c27ada5357d09d8700ab1c6bee Author: fanglin.yang Date: Sun Nov 24 23:13:27 2019 +0000 jobs/JGDAS_GLDAS and scripts/exgdas_gldas.sh.ecf have been extensively modified to make them applicable in the GFS global workflow. Attempts were also made to generalize the scripts for running the system at any given resolution instead of the hardwired T1534 resolution. Ush scripts in the GLDAS repo have also been updated.
The following scripts have also been modified env/WCOSS_DELL_P3.env parm/config/config.base.emc.dyn parm/config/config.base.emc.dyn.iau parm/config/config.base.nco.static parm/config/config.gldas sorc/link_fv3gfs.sh ush/rocoto/setup_workflow.py commit f70a299d03e1c5c902a5b6c65514a2509e72ea82 Author: Henrique Alves Date: Thu Nov 21 02:29:45 2019 +0000 Modified wave PREP step config files to accommodate the move of wave j-jobs, ex/ush scripts, fix_wave and parm files into the global-workflow structure. Changed COMICE to COMINice in wave scripts. Renamed ush scripts for a direct relationship to WW3 package program names (e.g. ww3_prnc) commit 248fd28a4603d23b1ff1ee353e0c521ea36041d5 Author: Henrique Alves Date: Tue Nov 19 22:46:04 2019 +0000 Adding changes that make INIT step functional for GEFS after merging wave j-jobs, ex/ush-scripts, parm, fix into global-workflow commit 3173916a4d4026c8c3d8f629c99a49da28e4c416 Author: Cory.R.Martin@noaa.gov Date: Mon Nov 18 18:22:32 2019 +0000 Changed rocoto setup python scripts to look for .txt and not .nc commit 5832e76a992511ce445396c6039fbb34e9828a1a Author: Cory.R.Martin@noaa.gov Date: Mon Nov 18 17:49:38 2019 +0000 Fixed another spot where logfNNN.txt should be used instead of .nemsio/.c commit de8febcaecc94de0082cd8b30e053ca561e45035 Author: russ.treadon Date: Mon Nov 18 16:38:57 2019 +0000 Vlab issue #65358 Update sorc/checkout to check out ufs-weather-model tag gfs.v16.0.0 in sorc/fv3gfs.fd commit dc23e194aac0ec66d443ff804d9d56e4b44ba149 Author: fanglin.yang Date: Mon Nov 18 09:40:00 2019 +0000 Set up branch gfsv16_gldasnoah to add GLDAS to GDAS cycle for spinning up land initial conditions.
The following files were copied over from workflow branch gfsv16_gldas, which was set up by Hang Lei
driver/gdas/para_config.gdas_gldas
driver/gdas/test_gdas_gldas.sh
jobs/JGDAS_GLDAS
jobs/rocoto/gldas.sh
parm/config/config.gldas
scripts/exgdas_gldas.sh.ecf
sorc/fv3gfs_build.cfg
sorc/partial_build.sh
sorc/build_gldas.sh
sorc/checkout.sh
The following files were modified locally in this branch to set up the running environment and to use the Python workflow setup script to create rocoto xml files. An option DO_GLDAS is defined in config.base. The xml file will be generated with or without the GLDAS step based on the DO_GLDAS setting.
modified: env/WCOSS_C.env
modified: env/WCOSS_DELL_P3.env
modified: parm/config/config.base.emc.dyn
modified: parm/config/config.base.emc.dyn.iau
modified: parm/config/config.base.nco.static
modified: parm/config/config.resources
modified: sorc/link_fv3gfs.sh
deleted: ush/rocoto/config.base
modified: ush/rocoto/setup_workflow.py
Much more work is to be done to get the GLDAS step merged into the workflow. The J-job script JGDAS_GLDAS, the running script exgdas_gldas.sh.ecf, and all GLDAS ush scripts need to be updated to add error checks, to use standard module util executables, and to correctly pass directory names and variable names from the parent script to the child script. config.gldas is essentially empty at this point. Some parameters found in driver/gdas/para_config.gdas_gldas need to be copied over to config.gldas. Python scripts need to be further updated to run GLDAS for only the 00Z cycle or for all four cycles. All GLDAS fix files and settings are hardwired to the C768 (T1534) resolution. Need to make it flexible to run the system at different resolutions.
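The DO_GLDAS gating described above can be sketched roughly as follows. This is a hypothetical illustration only: the real logic lives in ush/rocoto/setup_workflow.py, and the variable and task names here are assumptions, not the actual global-workflow code.

```shell
# Illustrative sketch (assumed names): include the gldas step in the
# generated task list only when DO_GLDAS=YES, as config.base would set it.
DO_GLDAS=${DO_GLDAS:-NO}          # would normally be sourced from config.base
tasks="anal fcst post"
if [ "$DO_GLDAS" = "YES" ]; then
  tasks="$tasks gldas"            # the gdas_gldas step is added to the XML
fi
echo "tasks: $tasks"
```

The same pattern generalizes to other optional steps (e.g. wave or metp tasks) that the setup script toggles from config.base switches.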
commit d922bb913ea0330d61f010a80345f0ee90dfd0e9 Author: Cory.R.Martin@noaa.gov Date: Fri Nov 15 21:52:24 2019 +0000 Change logfNNN files from .nc/.nemsio suffix to .txt in hpssarch_gen.sh commit e8cbf92f96d1c6dc24d30fa86d9a4ccd8f2358a3 Author: Cory.R.Martin@noaa.gov Date: Fri Nov 15 21:50:04 2019 +0000 Change logfNNN files from .nc/.nemsio suffix to .txt because they are ASCII commit 6923dcc269bca50c8f4be47a82002d02138486ef Author: Guang.Ping.Lou Date: Fri Nov 15 20:25:02 2019 +0000 Vlab issue #65358 modify scripts to read NetCDF data commit c1381a06b7c079b65633f764452440ea24f52b62 Merge: 81edb37c 2f40ff96 Author: Guang.Ping.Lou Date: Fri Nov 15 20:17:20 2019 +0000 Merge branch 'feature/gfsv16' of ssh://vlab.ncep.noaa.gov:29418/global-workflow into feature/gfsv16 commit 81edb37c7552a1aae57713e08cc27bb289896ffe Author: Guang.Ping.Lou Date: Fri Nov 15 20:11:15 2019 +0000 Vlab issue #65358 modify Bufrsnd to read in NetCDF data commit 2f40ff96a66547aa1d978c54f31df6eb780c24a2 Author: russ.treadon Date: Wed Nov 13 21:01:36 2019 +0000 Vlab issue #65358 Update gfs_post.fd checkout to clone from https://github.com/NOAA-EMC/EMC_post.git commit 041380f0a7b68e50f6eb396a93db711a7954caab Author: russ.treadon Date: Fri Nov 8 20:50:33 2019 +0000 Vlab issue #65358 Replace crtm/2.2.6 with crtm/2.3.0 in module_base.wcoss_dell_p3 commit 79c318ddd22a52f9340808c9d9768b866dc39880 Author: Henrique Alves Date: Fri Nov 8 17:53:47 2019 +0000 Finalizing changes to retain history relative to GEFS/scripts source for exwave scripts commit 4388122b7ca0e32caec0e44435c2ec4434decf4d Author: Henrique Alves Date: Fri Nov 8 17:50:52 2019 +0000 Staging intermediate name change to retain history in exwave scripts relative to source GEFS/scripts commit f4c5e2c214145d165c8de73a0326c7d30285c3ac Author: Henrique Alves Date: Fri Nov 8 17:48:08 2019 +0000 Staging removal of obsolete files commit 6b8321b25ac9291552addec1bb5a4b2eec5eff9c Author: Henrique Alves Date: Fri Nov 8 15:10:24 2019 +0000 Removing 
extensions gefs/gens (all will be set at the appropriate j-job, links will be made to fix_wave.), making error reporting codes consistent across ex-scripts. commit 1800288101815c3adf17ee071d3d04ab525e154b Author: Henrique Alves Date: Fri Nov 8 13:48:18 2019 +0000 Renaming after generalization of calls to ex-scripts from GEFS workflow commit 7bae5b660bbb12efbd7200180d5435106ecc0e85 Author: Henrique Alves Date: Fri Nov 8 13:30:06 2019 +0000 Changing extension to gens, matching the NCO parameter, which is set to gens for all gefs-related runs. commit a6d30d986d119a80129be21235534c66abd7c3b8 Author: jswhit2 Date: Fri Nov 8 13:23:38 2019 +0000 handle IAU 'coldstart' using warm start files from non-IAU run commit 4bd4562e77da8552ceeac317c945570afc389ddc Author: Henrique Alves Date: Thu Nov 7 21:48:25 2019 +0000 Adding wave components to global-workflow (fix/wave, jobs, parm/wave, scripts/exwave* ush/wave_*). Massive changes still required for this to work. Paths need to be re-routed, script names need to be adjusted, hard-wired ush-script parameters need to be moved into parm/wave.
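The "consistent error reporting codes across ex-scripts" mentioned above might look like the sketch below. This is a minimal illustration under assumed conventions; the function name and message format are not the actual global-workflow pattern (operational scripts typically use err_chk from the NCO prod_util package).

```shell
# Hypothetical uniform error check for wave ex-scripts: every program call
# is followed by the same check, so failures report the same way everywhere.
check_err() {
  rc=$1
  prog=$2
  if [ "$rc" -ne 0 ]; then
    echo "FATAL ERROR: $prog failed with exit code $rc"
  fi
  return 0
}

true
check_err $? "ww3_prnc"     # rc=0: silent, script continues
check_err 1 "ww3_grib"      # rc=1: prints the FATAL ERROR line
```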
commit d22bff4c1d41348f91de58a9e8d7aaf444cc836c Author: CoryMartin-NOAA Date: Thu Nov 7 18:24:39 2019 +0000 check sign of delz in enkf_chgres_recenter; change some scripts to use SUFFIX commit 3336d26bf928a1d2a0d7d25b83956fc35510fc23 Merge: 8d994cfb d635fe37 Author: CoryMartin-NOAA Date: Thu Nov 7 14:09:48 2019 +0000 Merge branch 'feature/gfsv16' into feature/gfsv16_hera_crm commit 8d994cfb7ead6630c5cc955d5fb0140336dec3b8 Author: CoryMartin-NOAA Date: Wed Nov 6 18:26:34 2019 +0000 Change sign of delz for enkf_chgres_recenter_ncio commit d635fe375f8374ba279393f24fc5471f70639ef7 Author: Boi Vuong Date: Tue Nov 5 19:56:32 2019 +0000 Vlab Issue #65358 Updated ecflow's defs and scripts commit 391c793d900b1fc73ec93d917bcd96bed10c72a6 Author: Boi Vuong Date: Tue Nov 5 19:24:47 2019 +0000 Updated drivers and gempak/ush scripts commit fb95cd3c304b623bf95e2618b71b2e649751f685 Author: Walter Kolczynski Date: Tue Nov 5 05:34:56 2019 +0000 Change way wave restart files are produced In order to facilitate faster cycle turnover, instead of copying the wave restart files at the end of the forecast, symbolic links to the final destination are created before the forecast. This allows the restart directory to be populated immediately, so that the next cycle can commence while the current cycle finishes. Change-Id: I887dddec88e32ad08fefc5a8080c9f553965b9dc commit 750ba243dc1040eb008bda9eddeecc313de5ad28 Author: Walter Kolczynski Date: Sun Nov 3 08:07:06 2019 +0000 Update NEMS app for new WW3 version An undocumented change to how the peak frequencies were computed in WW3 resulted in unrealistically large peak frequencies. Coupled with a bug in Jasper, this caused crashes during grib2 encoding. Change-Id: Iface172ec8f6da816a4b56af73e8fb091dd57584 commit 104fa736f01666a9a70f7f4654d6fbe50fc07bf4 Author: George Gayno Date: Fri Nov 1 20:02:17 2019 +0000 Vlab issue #65358. Update 'checkout.sh' to check out GLDAS repo. Update build system to build GLDAS programs. 
Change-Id: Icf3e496ab76f398779cd97e01a250f8720ddfcb1 commit c2696f6951c8e0dca1b853009a88cef6846a02bf Author: Walter Kolczynski Date: Fri Nov 1 10:02:45 2019 +0000 Update NEMS app to incorporate latest WW3 changes The NEMS app is updated to incorporate changes to WW3 for the grib2 encoders and make profiling optional to improve performance. Change-Id: I6b7c96128f116698c27f037129db4ba8f591551b commit e3059e4e923c1b96bdd163f1d3037b17829c2e98 Author: Mallory Row Date: Thu Oct 31 17:53:18 2019 +0000 Initial commit of adding gfsmetp metatask creation in cycled XML commit 8a5e53588c2754dc3c90a559625bcaf42cc88dfa Author: Mallory Row Date: Thu Oct 31 16:48:15 2019 +0000 Fix gfsmetp gfsarch cycle_offset dependency value commit d59bdbaff3d2cf6b04960348864985c96c3f8f75 Author: George Gayno Date: Thu Oct 31 15:42:08 2019 +0000 Vlab issue #65358. gaussian_sfcanl - Rename program build script to better match its name. Change-Id: Ied731beaa567a50e9d881eaee1e58fc63ae28557 commit 718cebf287f55ed6a68027041c513ce23f230145 Author: George Gayno Date: Thu Oct 31 15:24:05 2019 +0000 Vlab issue #65358. gaussian_sfcanl - Update comments. Add global attributes to netcdf file. Change soill flag value over land ice and open water from '1' to '0' to be consistent with sample gaussian netcdf files from Fanglin. 
Change-Id: Idadcbd725a325cb8f06fc51c4ecf525aad35520d commit a1eb3b46d7a1476e2ae95f1643afaa613946affa Merge: 3ad83dc1 a9f2be62 Author: Mallory Row Date: Thu Oct 31 13:18:44 2019 +0000 Merge branch 'feature/gfsv16' into vrfy_metplus commit 3ad83dc15ca14a6d380be82eaf7cee1a13381c31 Author: Mallory Row Date: Thu Oct 31 13:17:35 2019 +0000 Initial commit of adding gfsmetp metatask creation in XML commit c79d7e64cdf24c74d8df75194038a9cca3aa0630 Author: Mallory Row Date: Thu Oct 31 12:08:05 2019 +0000 Initial commit of adding gfsmetp metatask commit a9f2be62ae42b1d814cc61d69485bab82909eb9d Author: russ.treadon Date: Thu Oct 31 10:28:46 2019 +0000 Vlab issue #65358 Add tcyc to scripts/exglobal_fcst_nemsfv3gfs.sh. Update EMC_post checkout. commit 46ae4ca6bbacf3c3edf65a5402095dd4ac0a9cd3 Author: George Gayno Date: Wed Oct 30 21:29:19 2019 +0000 Vlab issue #65358. gaussian_sfcanl: Add remaining nst variables to netcdf output option. Add logical flag - netcdf_out - to set output option at runtime (output netcdf when true). ./ush/gaussian_sfcanl.sh - Add 'netcdf_out' logical flag. For now use default of false (nemsio) so as to not break current functionality. Change-Id: I0a79474655592cc7a95a6ad2da4d16493ac4b063 commit cd70f5ef1cbf2cdcbb93916aefccbd3fe0cc0b89 Author: George Gayno Date: Wed Oct 30 17:51:57 2019 +0000 Vlab issue #65358. gaussian_sfcanl: Add remaining noah fields for netcdf output option. Change-Id: I352a417962c29b38f9cdf9684debe54bbaff81a3 commit 4db99b7d96c65f61cf231cc90a160bbf7fb23c2f Merge: 541867a1 b295567e Author: CoryMartin-NOAA Date: Wed Oct 30 14:40:37 2019 +0000 Merge branch 'port2hera' into feature/gfsv16_hera_crm commit 5e84c0bd9b72e09953244375d41374d0949b64e5 Author: George Gayno Date: Tue Oct 29 21:14:35 2019 +0000 Vlab issue #65358 gaussian_sfcanl: Additional noah fields output to netcdf file. 
Change-Id: I3e5d3ee194fd2672fca01ce77039ad1684bf9aa4 commit f863d6418d75359de448b1ddf8a11cc1f75fecb7 Author: Boi Vuong Date: Tue Oct 29 14:36:30 2019 +0000 Vlab Issue #65358 Updated gempak files and scripts commit 29ba953573f244080177023093f710763f405960 Author: Boi Vuong Date: Tue Oct 29 14:09:01 2019 +0000 Vlab Issue #65358 Updated GFS drivers commit 21a1644c83e5b03743e07b9db3f46a0133b4b503 Author: Boi Vuong Date: Tue Oct 29 13:38:58 2019 +0000 Vlab Issue #65358 Removed GFS FAX jobs and scripts commit d7a36e1f212d1089a749004260f99789ebe9177f Author: George Gayno Date: Mon Oct 28 20:55:37 2019 +0000 Vlab issue #65358. Begin updates to "gaussian_sfcanl" program to output surface file in netcdf format. Change-Id: I3c8dd74c93e36f7c3a3ae04be2d0d10a92da382e commit 632bfe92a6912e981b98f84d9782adcd5650d9f9 Author: Walter.Kolczynski Date: Fri Oct 25 16:56:16 2019 +0000 Fix previous merge enabling Hera Change-Id: I277baff3dd5c8f7485d4be1c8052837c3fbadc46 commit a9de0f78292989e552fb66401f9d1777e464606f Author: fanglin.yang Date: Fri Oct 25 14:13:03 2019 +0000 VLab Issue #65358 modified: modulefiles/module_base.wcoss_c, add prerequisite intel/16.3.210 for Module 'esmf/8.0.0bs48' modified: parm/config/config.fcst, set default land model to lsm=1 (Noah LSM) modified: sorc/checkout.sh, check out model gfsv16_bugfix and UPP post_gfs_netcdf for running model with inlinepost.
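The WRITE_DOPOST commit below this point describes merging two near-duplicate model_configure generation blocks (one for IAU on, one for off) into a single block. The idea can be sketched as follows; the keys and values in the heredoc are illustrative assumptions, not the real model_configure contents.

```shell
# Hedged sketch: compute the IAU-dependent values first, then write one
# model_configure template that serves both the IAU on and off cases,
# rather than maintaining two nearly identical blocks.
DOIAU=${DOIAU:-NO}
WRITE_DOPOST=${WRITE_DOPOST:-.false.}
if [ "$DOIAU" = "YES" ]; then
  iau_offset=6      # hypothetical IAU-on value
else
  iau_offset=0      # IAU off
fi
cat > model_configure <<EOF
iau_offset:   ${iau_offset}
write_dopost: ${WRITE_DOPOST}
EOF
cat model_configure
```

Keeping a single parameterized block means a future edit is made once, which is the "prone to mistakes" problem the commit message calls out.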
commit 78924915ed4add9511c6b283119015f048201baf Author: Walter.Kolczynski Date: Fri Oct 25 13:47:42 2019 +0000 Add capability to run on Hera Change-Id: I61da1adcfa1b0ff09222c15b2c4e58c6af8a853b commit a29c723c22ae8b429295f272cefcb5eb8fa39e97 Merge: 1958210a e150d73e Author: Mallory Row Date: Fri Oct 25 12:13:23 2019 +0000 Merge branch 'feature/gfsv16' into vrfy_metplus commit 541867a12cfd55b250236cbb246adcf26a126d8a Author: CoryMartin-NOAA Date: Thu Oct 24 20:32:27 2019 +0000 Fixed some bugs associated with netCDF/calc_analysis commit e150d73ec3b80a8fccf2955cd181a4f6ab3e7bde Author: George Gayno Date: Thu Oct 24 19:00:31 2019 +0000 Vlab issue #65358 Begin update of gaussian_sfcanl program to output gaussian surface file in netcdf. Retain current option to output file in nemsio. Change-Id: I8e0b41ef7372a115bd13dc4d068bec8df86b6ee5 commit 24e712314d58cd1e53996bd91761b60fcf2b7095 Author: fanglin.yang Date: Tue Oct 22 15:33:49 2019 -0400 VLab Issue #65358 Add the option, WRITE_DOPOST, to global-workflow to run post from within the model. The model writes out post-processed "master" and "sflux" files. Corresponding changes have been made to the offline UPP in branch https://github.com/yangfanglin/EMC_post/tree/inlinepost. In the current scripts/exglobal_fcst_nemsfv3gfs.sh, model_configure is created in two different blocks for when IAU is on and off, separately. This is prone to making mistakes when the script is updated. These two blocks are merged into one which works for both IAU on and off cases. modified: jobs/rocoto/post.sh modified: parm/config/config.base.emc.dyn modified: parm/config/config.base.emc.dyn.iau modified: parm/config/config.base.nco.static modified: parm/config/config.post modified: scripts/exglobal_fcst_nemsfv3gfs.sh commit 1e7a76de9a9598b86592b64905e04ee3778ebff9 Author: George Gayno Date: Tue Oct 22 19:06:06 2019 +0000 Vlab issues #65358 and #67852.
Remove 'gdas2gldas' and 'gldas2gdas' programs as these will reside under the GLDAS repo on github: https://github.com/NOAA-EMC/GLDAS Change-Id: I87bbac3de9d33db12925d3d6b077b5812c0ebdf4 commit 8da159d91be6bbac829e7a3da7aa81b4f9c2ebef Merge: cee64779 cec5365d Author: CoryMartin-NOAA Date: Tue Oct 22 14:49:27 2019 +0000 Merge branch 'feature/gfsv16' into feature/gfsv16_hera_crm commit cee64779535a5e9d137db66028e41d3f22a753c9 Merge: 616bd08d b5d8124c Author: CoryMartin-NOAA Date: Tue Oct 22 14:49:06 2019 +0000 Merge branch 'port2hera' into feature/gfsv16_hera_crm commit 616bd08dd0df82d183407f8ec6d26f9804d9dc0b Author: CoryMartin-NOAA Date: Tue Oct 22 14:47:52 2019 +0000 Added enkf_chgres_recenter_nc to gfsv16 workflow build scripts commit 7fee101302a547c525fba9df6cac5c42f66d35cb Author: CoryMartin-NOAA Date: Tue Oct 22 14:44:58 2019 +0000 Added enkf_chgres_recenter_nc to gfsv16 workflow commit 1958210a12c976afc7b173f841c79956f7d1ecb6 Merge: adf6de46 cec5365d Author: Mallory Row Date: Mon Oct 21 15:56:39 2019 +0000 Merge branch 'feature/gfsv16' into vrfy_metplus commit cec5365d8af681ab514cf4609d4ecbabcec587ed Author: russ.treadon Date: Mon Oct 21 15:20:18 2019 +0000 VLab Issue #65358 Update EMC_verif-global tag to verif_global_v1.3.1 commit e748ef122e196028816da5aa3a05f0179cabddc9 Author: Walter Kolczynski Date: Sun Oct 20 08:06:20 2019 +0000 Update NEMS app to version for GEFS retrospective The NEMS app is updated to check out the version for phase 2 of the GEFS v12 retrospective. 
Change-Id: Iddfa75a78965cf19eca2bca7f9fbbf45dcd457a6 commit 2cf1a28142d44713a31b3c1238b5307eee9bbc23 Merge: 29edaf0d 59746e18 Author: CoryMartin-NOAA Date: Thu Oct 17 18:04:11 2019 +0000 Merge branch 'feature/gfsv16' into feature/gfsv16_hera_crm commit 29edaf0d14f6c0992cca560c020bf820af828725 Merge: 59a76637 4a98049f Author: CoryMartin-NOAA Date: Thu Oct 17 16:49:38 2019 +0000 Merge branch 'port2hera' into feature/gfsv16_hera_crm commit 59a76637c4bc734ce8d80360776b231ded1d902a Author: CoryMartin-NOAA Date: Thu Oct 17 16:43:20 2019 +0000 First attempt to get global-workflow working with netCDF IO and GFSv16 commit 59746e1893c85b262fd610c73999a92c745155f1 Author: russ.treadon Date: Tue Oct 15 18:50:12 2019 +0000 Vlab Issue #65358 Add off hour sfluxfXXX.grib2 files to HPSS gdas.tar commit adf6de4607d2ffb94f45d61c19a7bad8d37b5187 Merge: 473728bb 88cbff41 Author: Mallory Row Date: Tue Oct 15 12:14:44 2019 +0000 Merge branch 'feature/gfsv16' into vrfy_metplus commit b7dc7e6124dbde4b0c41cfaca6a457e5bc995066 Author: Walter Kolczynski Date: Sat Oct 12 18:29:16 2019 +0000 Update app for WW3 OMP fix Change-Id: I41523cb5313b18eba4301d599f99ec67ea6c9bc4 Refs: #58418 commit 88cbff4179ea1a08140b9d59dd59fdf7f7dd8d08 Author: fanglin.yang Date: Fri Oct 11 20:26:34 2019 +0000 Vlab Issue #65358 Update ufs_util to release/v2.0.0 which includes A few sea ice related issues in the NSST analysis: Issue #18 (#20) Updates to global_cycle: Tf analysis for the grids with sea ice done using a salinity dependent formula (instead of using 271.2 K). Update sfcsub.F to the latest version used by the forecast model (includes elimination of masked interpolation for substrate temperature and some updates to Fortran 90 standards). The global_cycle.sh script was updated to reference a new global salinity dataset. Updates to global_chgres - Update sfcsub.F to use same version as global_cycle. Feature/hera port (#25) Update repository to run on Hera, the replacement for Theia. 
Remove all references to Theia. commit 59883cb4ff503fd7a92e371389a5ed4f467da5e2 Author: Walter Kolczynski Date: Thu Oct 10 01:51:06 2019 +0000 Fix bug with creating WW3 restart directory Change-Id: I7002db6175d4e994318e6274071dca1475ae3121 Refs: #58418 commit a2b355c0936ab249f3562f257ac312a9f25e11d0 Author: fanglin.yang Date: Wed Oct 9 19:26:52 2019 +0000 change DELTIM for a few low-res cases in config.fv3 from 420s to 450s. 3600 is not evenly divisible by 420, which leads to fractional forecast hours in the forecast output and breaks downstream enkf jobs commit 519a99c671db60eb8d994190fa75102c8a4b01e4 Author: Walter Kolczynski Date: Wed Oct 9 19:10:26 2019 +0000 Update NEMS app version for Hera port Change-Id: I19cdf40e4c0a147d138cc59fc720b7608b51554a Refs: #58418 commit c7a4df4d086821f26a394b1b5fbc65df715fdd17 Author: catherine.thomas Date: Tue Oct 8 16:55:36 2019 -0400 Vlab Issue #65358 Added a commented out line for convective cloud to the diag_table_da file commit a9309088cee6591eda24318af55f8566c79285a5 Author: Walter Kolczynski Date: Mon Oct 7 20:18:14 2019 +0000 Update post to more recent version Change-Id: I8aab5d3d532fbfc444abd4c1c71e10f55906500c Refs: #58418 commit 7ef572243ab968aed5a4d2cd75d7ed0fd46a0e09 Author: russ.treadon Date: Sun Oct 6 21:00:04 2019 +0000 VLab Issue #65358 Update sorc/checkout.sh to populate gfs_post.fd with develop branch from https://github.com/NOAA-EMC/EMC_post commit 21b037b1b4c8285ef621e1469710c7ac5acc049e Merge: be5986e4 cc2d15f7 Author: fanglin.yang Date: Fri Oct 4 15:30:06 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into feature/gfsv16 commit be5986e45f81b762fc1404f0354e84fa635285c0 Merge: e52ecec6 332ef296 Author: fanglin.yang Date: Fri Oct 4 15:23:57 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit cc2d15f70f85cfb7d78bbbc958c760e2bb33b460 Author: George Gayno Date: Fri Oct 4 19:23:12 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - Add 'noahmp' flag.
When true, process noah and noahmp fields. When false, only process noah fields. The output nemsio file will always contain records for noah and noahmp. When processing noah only, the noahmp records will contain missing flag values. Change-Id: I3c02662926e64386493d576f989296c718e0fcf3 commit e52ecec638ca357edf18504b72b15028974c0809 Author: fanglin.yang Date: Fri Oct 4 15:22:08 2019 -0400 VLab Issue #65358 update esmf lib in module files to version esmf/8.0.0bs48 Update model to tag version gfs.v16_PhysicsUpdate commit 332ef2961ff40e2b759dc3e165ed7ca72993897e Author: George Gayno Date: Fri Oct 4 12:25:29 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - Remove unused namelist variable "orog_files_gdas_grid". Change-Id: I4be15b0e6749986a673303dc04a3e27d93b08168 commit 735f421618849dbda49a1c30ccaabd490e333544 Merge: 865ea5aa c68728b5 Author: Walter Kolczynski Date: Thu Oct 3 17:05:25 2019 +0000 Merge branch 'develop' into lgan_fv3_ww3 commit 865ea5aa37fc866b83dc09fb10f6b1021dc09eee Author: Walter Kolczynski Date: Thu Oct 3 16:54:24 2019 +0000 Update NEMS app checkout to ver with GSDCHEM 0.8.8 Change-Id: Idd286ae4300f0b067d3f38a2aa76b27f2d3911df Refs: #58418, #62104 commit cb4594e9562620a9e42adeaf2bad9db823c4e844 Merge: 25a564c5 8fbf33af Author: CoryMartin-NOAA Date: Thu Oct 3 15:11:14 2019 +0000 Merge remote-tracking branch 'origin/feature/gfsv16' into feature/gfsv16_hera_crm commit 25a564c5ea37275e1c4605b2532d19adccd6d42c Author: CoryMartin-NOAA Date: Thu Oct 3 14:55:20 2019 +0000 Change checkout of GSI to be feature/fv3_ncio commit 8fbf33af5c8c1f7f5a2e2077b35be89889c761a1 Author: russ.treadon Date: Wed Oct 2 00:16:54 2019 +0000 VLab Issue #65358 Add logicals do_sppt, do_shum, and do_skeb to namelist gfs_physics_nml in exglobal_fcst_nemsfv3gfs.sh commit ad1b52817e367f2dd09f961618af72ba3cd84939 Author: Walter Kolczynski Date: Tue Oct 1 20:19:19 2019 +0000 Fix issue with exglobal not creating wave restart directory The wave restart directory was not being created 
by exglobal, causing the copying of wave restart files to fail. Modified the directory creation to use the new restart subdirectory. Change-Id: I77fe3a72249efd3e688ece47f0778b30e034ff24 Refs: #58418 commit 3a0626812faf1d41047554eb3628adb5160eeb77 Author: russ.treadon Date: Tue Oct 1 12:50:49 2019 +0000 VLab Issue #65358 Update EMC_verif-global tag to verif_global_v1.2.2 commit 20bba55d80a776d94681dfbd63c74ca9b4cbff2c Author: George Gayno Date: Mon Sep 30 19:32:10 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - Some code clean up. Change-Id: I48407fe4643b50b41da04964c32ae0269911b73e commit 3e44b803de2384396e9409b17ee55a754ffdc625 Author: russ.treadon Date: Mon Sep 30 13:33:18 2019 +0000 VLab Issue #65358 Update EMC_verif-global tag to verif_global_v1.2.1 commit 173e362c714a857f46fcfe1a2b7759fa3b0810fc Author: George Gayno Date: Thu Sep 26 16:18:13 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - Correct diagnostic write of interpolated soil type. Turn off diagnostic write for ops. Change-Id: I6abcfdb6d6cc76c347126c661e367fc5a52de76c commit 18646deaa1e32ee6580f49d6bf86f68f3fa25ce4 Author: russ.treadon Date: Thu Sep 26 13:00:33 2019 +0000 VLab Issue #65358 Update to EMC_verif-global tag verif_global_v1.2.0 commit 354fb0c7de5b98d0537eba8207fd853b47d245a8 Author: George Gayno Date: Wed Sep 25 21:22:17 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - Add soil moisture rescaling. Add logic to write routine to account for gdas interpolation mask. Change-Id: I58d40377ed3b75404f732570f71bab6a59e09a4b commit 8a312b398ea79bacef025d19252ef0fd98cffa82 Author: Henrique Alves Date: Wed Sep 25 02:26:51 2019 +0000 global-workflow wave component: adding rundata and restart directories to store model run files (binary ww3, log, forcing inputs etc) and restart following new directory tree structure. commit 8949835cb0d37b08ea41ac825cd45c2639497285 Author: George Gayno Date: Tue Sep 24 20:53:15 2019 +0000 Vlab issues #65358 and #67852. 
gldas2gdas - add build module for theia. fix unallocated variable 'weasd'. Change-Id: Iccd97dae715d7d0878595c604da34ac7e0d499a5 commit bd2ed957a24bd987c77ee00e6f5fde95040ae242 Author: George Gayno Date: Tue Sep 24 20:07:02 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - Add processing of soil temp and soil moisture. Read gldas data from nemsio file instead of 1d restart file. Flip n/s poles for nemsio file. Change-Id: I3b7a96338cc8e5d706fb6ae538b7aaec2872d3ec commit 473728bba08612c8e32c06ec0a4e185e5072f638 Author: Mallory Row Date: Tue Sep 24 15:14:02 2019 +0000 Update EMC_verif-global tag number to 1.2.0 commit abd00919109042029c90390d7fccde6b0c194478 Author: Mallory Row Date: Tue Sep 24 15:12:28 2019 +0000 Updates for version 16 commit 673142abf313d8a22caf77b6b5cb77451ebfd772 Author: George Gayno Date: Tue Sep 24 00:16:06 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - flip n/s poles of gldas grid. read gldas soil type and use as interpolation mask. Change-Id: I117e89f2b61817683c937857e795fbdadf412369 commit 1db363f5f914d9ecb3ce789cad2306d0495b16da Author: George Gayno Date: Mon Sep 23 23:19:23 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - update interpolation to account for gdas mask. Change-Id: Ia46bb70b399c5aa8828ac38f71e8cb2ceb67154b commit 6f1ad409b5017244c37758eca1a627b04e6f324f Author: George Gayno Date: Mon Sep 23 18:30:22 2019 +0000 Vlab issues #65358 and #67852. gldas2gdas - Basic framework of new program to initialize gdas from gldas data. Change-Id: I765f6fc5a092be301eb800ce2a2c124946a800e4 commit 4d89ddacc73d333c17b8c62f3c103183bfc01ffe Author: George Gayno Date: Sun Sep 22 18:02:07 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - write noah fields to nemsio file. Change-Id: Iaf67dbe387e4d01dc8373f65b61fd85a04c9bf3d commit 80c30b5c62b356b05a565f70e513ad444ffc3dfc Author: George Gayno Date: Sun Sep 22 00:08:08 2019 +0000 Vlab issues #65358 and #67852.
gdas2gldas - add read of config namelist to model_grid.F90 Change-Id: I347e66889bfc173f7b2142b8bb05bd763e4bf3b1 commit 5793d9ee9029f45458a446db9c03cafc22389ed9 Author: George Gayno Date: Sat Sep 21 23:24:35 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - remap noah fields. Change-Id: I4b1470e663a9351a5d8a6c0754cb3c9094beab25 commit 23f2f9983df89588add9d93b97282c5ee916e656 Author: George Gayno Date: Sat Sep 21 18:46:20 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - setup target grid noah fields. Change-Id: I9217521ed816a25458723f13bf5b15003449d4b4 commit eb4b8407cd38c7cd2e4b2dc58d51dac2641488ad Author: George Gayno Date: Sat Sep 21 18:22:11 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - Add read of noah fields. Change-Id: I9da5b601e6de2f72831da1a796fe23688a93ea53 commit e046668e006e3f5876d9bed7f9decf3c32d6f8ca Author: George Gayno Date: Sat Sep 21 16:27:32 2019 +0000 Vlab #65358 and #67852. gdas2gldas - add remaining noahmp fields. Change-Id: If508823e20a2e29803ed55acc878eaaabfe08a0e commit 346ad154c49c8e452953f50d408ef348b7710fef Author: George Gayno Date: Sat Sep 21 14:22:45 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - Flip the N/S poles in output nemsio file. Change-Id: I83a1b9aa936f68158fb27d10d4c6302950d41689 commit 274c3b909c67d74b8e8f3da042c3a83942ac69b8 Author: George Gayno Date: Sat Sep 21 04:05:00 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - add more noahmp fields. Change-Id: I7c34973adcde8f212dae3ca31b7c056e2857cf04 commit b4e48920b61d8591f86ed95264d8e42424ed6770 Author: George Gayno Date: Sat Sep 21 03:01:44 2019 +0000 Vlab issues #65358 and #67852. gdas2gldas - add processing of some noahmp fields. Change-Id: Ib3d453a52270f84a9b340f759d27e25a07848e40 commit be85a6cd59778653998519d827f48a2226ce0f4f Author: George Gayno Date: Fri Sep 20 21:13:16 2019 +0000 Vlab issues #65358 and #67852. 'gdas2gldas' program - process orography.
Change-Id: I262568ec844a020a4bd5073b330e53831a514422 commit e7f52d7b8050981db5f9595bccef0463f6eb4ddb Merge: 9e453009 b8619596 Author: fanglin.yang Date: Fri Sep 20 16:37:42 2019 -0400 VLab Issue #65358 update parm/config/config.fcst and scripts/exglobal_fcst_nemsfv3gfs.sh to remove the ishuffle option commit b8619596144549b4bb43b1ba4006e500bbd7b71e Merge: fb738ff4 67f1a3b2 Author: fanglin.yang Date: Fri Sep 20 16:29:00 2019 -0400 Merge branch 'gfsv16fyang3' of gerrit:global-workflow into gfsv16fyang3 commit fb738ff4591cf2997c5c8d66c8373d2f5a0b3ed0 Author: fanglin.yang Date: Fri Sep 20 16:28:29 2019 -0400 remove the ishuffle option for netcdf output commit 9e4530096094ca1ef3245dfb67778a811a5a2261 Author: George Gayno Date: Fri Sep 20 20:07:02 2019 +0000 Vlab issues #65358 and #67852. Updates to gdas2gldas program. Change mapping to ignore input and target grid masks. Begin processing of noahmp fields. Change-Id: I6edc0c2bb675cbd6c97cf27c1dda4efb2938da3b commit 6c143cfe01cea314b15e2088b4c52575b5503058 Author: catherine.thomas Date: Fri Sep 20 08:59:14 2019 -0400 VLab Issue #65358 Modified nsig_ext in config.anal and config.eobs for L127 GPSRO commit 4af57a1bec2e2fe635f9e58f80c7433f11ecacba Author: russ.treadon Date: Thu Sep 19 18:10:56 2019 +0000 VLab Issue #65358 Remove nemsio_cvt from list of executables and directories to link/copy in sorc/link_fv3gfs.sh since nemsio_cvt.fd is no longer in ufs_utils. commit 7c8143f872c5437955f5d890d9861dfd3ee185b9 Author: fanglin.yang Date: Thu Sep 19 02:22:55 2019 +0000 Merge changes made in branch gfsv16_physupdt_netcdf into branch feature/gfsv16. Update the model and scripts to use the latest physics configuration including UGWP, and add the option to write out forecast history files in netcdf format with compression. See https://vlab.ncep.noaa.gov/redmine/issues/68678 for details of the model changes. 
UFS_UTIL was also updated to use high-res land-sea mask (FNMSKH) in global_chgres.sh and global_cycle.sh for the interpolation of climatological SST to reduce temperature biases over small lakes. ----------------------------------- Squashed commit of the following: commit 665f70386c7e1e8c6cccc902f1dbe011dbc584c5 Author: fanglin.yang Date: Wed Sep 18 11:09:54 2019 -0400 latest changes have been merged to branch gfsv16_physupdt_netcdf commit 856bd56b425c6b14ebb4495adff1978b7d04b91e Author: fanglin.yang Date: Tue Sep 17 22:35:46 2019 -0400 use FV3 branch SM_gfsv16_physupdt_netcdf, which is forked from gfsv16_physupdt_netcdf but updated to the latest master I5ce54ed1bb9ab1ce10f28ddb1e0e6ba6689905b5 commit aa3e11ec17378fef2e7e734a4b9502c1112b9984 Merge: 0d853b7c 9f079f3b Author: fanglin.yang Date: Tue Sep 17 15:52:51 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 0d853b7c0df9e1f9901062b84b659685470fd559 Author: fanglin.yang Date: Mon Sep 16 23:32:51 2019 -0400 change ufs_util repo back to the master version commit 2b2449bf0b1a1b9f62429a76990e0247175704cf Author: fanglin.yang Date: Sun Sep 15 14:35:47 2019 -0400 add netcdf option in config.post and rocoto/post.sh commit 7c9bb3dc1a1598387de26d3fd453a9d3197076ac Author: fanglin.yang Date: Sat Sep 14 13:15:46 2019 -0400 Add options to write 1) RESTART files with compression 2) forecast files in netcdf format with compression make sure LINKed file names in DATA and ROTDIR match each other modified: parm/config/config.fcst modified: scripts/exglobal_fcst_nemsfv3gfs.sh commit 13ea6ceb47da5fd0173fe6dd3f7f400851910892 Author: fanglin.yang Date: Sat Sep 14 01:37:54 2019 +0000 modified: ../parm/config/config.fv3 Further tuned GWD coefficients. 
modified: checkout.sh: use temporary https://github.com/yangfanglin/UFS_UTILS commit 719688bc765e8f70d3165729517bba3682780fbb Author: fanglin.yang Date: Fri Sep 13 04:38:18 2019 +0000 VLab Issue #65358 1) update scripts to turn on all physics updates targeted for GFS.v16. 2) tuned coefficients for running ogwd and mountain block for 127-L GFS. 3) adjust computing resources in config.fv3 for 127-L GFS. 4) add to workflow the option to write out netcdf files with compression. See Jeff Whitaker's FV3 ticket https://vlab.ncep.noaa.gov/redmine/issues/68487 for details. 5) Create a UFS_UTIL branch "gfsv16" to set FNMSKH to T1534 land-sea masks for running CHGRES and surface cycle. Changes to be committed: modified: parm/config/config.base.emc.dyn modified: parm/config/config.base.emc.dyn.iau modified: parm/config/config.base.nco.static modified: parm/config/config.fcst modified: parm/config/config.fv3 modified: scripts/exglobal_fcst_nemsfv3gfs.sh modified: sorc/checkout.sh commit 31f91457604f68813ea7d57ed7503694a86fbfff Merge: 23671eee 8214216f Author: fanglin.yang Date: Fri Sep 13 02:56:08 2019 +0000 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 23671eee2fe94527cd659fa9ea3afb19fe0004d0 Merge: ab07a77c 06a217f1 Author: fanglin.yang Date: Tue Aug 27 22:53:19 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit ab07a77c994d4c38a69b1656e29edbd6b0c0c387 Author: fanglin.yang Date: Sun Aug 18 12:51:30 2019 +0000 remove outdated readme.txt commit 24bcacb621073dcbad0fdd457f1d48cdb7daf53a Author: fanglin.yang Date: Sun Aug 18 03:55:36 2019 +0000 add back checkout.sh commit 2d33c2f6636eaa690c40284cb526709aaa989394 Merge: 8a7d50ef 4222fbf4 Author: fanglin.yang Date: Sun Aug 18 03:46:57 2019 +0000 update to feature/gfsv16 commit 8a7d50ef804b1afaa305c35b9360ce075202b882 Author: fanglin.yang Date: Sat Jun 29 19:11:19 2019 +0000 update model tag to gfs.v16_preCCPP_20190610_v1.0.1, in which io/FV3GFS_io.F90 was 
updated to restore warm-restart reproducibility capability commit 5bd7d817ae18259acfd70dde98df9e4c798afc1f Author: fanglin.yang Date: Wed Jun 26 02:48:38 2019 +0000 use model branch gfs.v16_preCCPP_20190610_v1.0.0 which contains Helin's fix to FV3GFS_io.F90 commit ae08fe77a25a1d6c6d57057c263dc44b33ad5478 Author: fanglin.yang Date: Wed Jun 26 00:08:26 2019 +0000 change lheatstrg=T if lsm=1 and lheatstrg=F if lsm=2 commit 513eb3a9cbcd38e3c8c2d60303edbd06c0ade2a3 Merge: 57a51f8c 0637ec7a Author: fanglin.yang Date: Tue Jun 25 20:05:02 2019 +0000 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 57a51f8c5f2b71b2c7b913612e2331ad2bf866b2 Author: fanglin.yang Date: Tue Jun 25 20:03:58 2019 +0000 set default optveg=1 in exglobal_fcst_nemsfv3gfs.sh to turn off dynamic vegetation commit 67f1a3b2b8cd2a0bc5ca803338ed770bcfe28575 Merge: 665f7038 d0c8f2dd Author: fanglin.yang Date: Thu Sep 19 02:20:14 2019 +0000 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 665f70386c7e1e8c6cccc902f1dbe011dbc584c5 Author: fanglin.yang Date: Wed Sep 18 11:09:54 2019 -0400 latest changes have been merged to branch gfsv16_physupdt_netcdf commit d0c8f2dd01fa6a21c0ac75c7cfd220a7342771f2 Author: George Gayno Date: Wed Sep 18 12:12:43 2019 +0000 Vlab issues #65358 and #67852 Updates to 'gdas2gldas' program - Add interpolation of soil temp and soil moisture from tiled gdas to gaussian gldas. Add routine to update gldas restart file with updated soil fields. Change-Id: Ie2fba2c6e74b160cb8c27f917c09e673da5c588b commit 67adca4ac31d1219c173d3af0c7a905f4e7d3382 Author: Henrique Alves Date: Wed Sep 18 02:47:03 2019 +0000 Winding back changes to wave-related parms PDY_PCYC etc, and adding correct parm cyc defining directory, that was erroneously pointing to non-existent parm cycm1. 
commit 856bd56b425c6b14ebb4495adff1978b7d04b91e Author: fanglin.yang Date: Tue Sep 17 22:35:46 2019 -0400 use FV3 branch SM_gfsv16_physupdt_netcdf, which is forked from gfsv16_physupdt_netcdf but updated to the latest master I5ce54ed1bb9ab1ce10f28ddb1e0e6ba6689905b5 commit 65fe319c30dfceff7ef265c3f5aa6eb2a1ce049e Author: Henrique Alves Date: Tue Sep 17 21:44:59 2019 +0000 Adding parms CDATE_PCYC, PDY_PCYC and cyc_pcyc to exglobal_fcst_nemsfv3gfs.sh for setting location of wave component restart file as a function of time between cycles commit aa3e11ec17378fef2e7e734a4b9502c1112b9984 Merge: 0d853b7c 9f079f3b Author: fanglin.yang Date: Tue Sep 17 15:52:51 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 0d853b7c0df9e1f9901062b84b659685470fd559 Author: fanglin.yang Date: Mon Sep 16 23:32:51 2019 -0400 change ufs_util repo back to the master version commit 9f079f3b274b94d0569162ac9d7f6a17e1886a59 Author: George Gayno Date: Mon Sep 16 17:25:14 2019 +0000 Vlab issues #65358 and #67852. Update gdas2gldas program to use the input grid mask - snow-free and non-glacial ice land areas - during interpolation. Change-Id: I61936020832c9a7608a9003212160c1eec598319 commit 7858946954b4abe5ab2747a2c74ee311bbe1f16e Author: catherine.thomas Date: Mon Sep 16 11:10:37 2019 -0400 VLab Issue #65358 Modified anal and eobs config files with GPSRO parameters for L127. 
commit 2b2449bf0b1a1b9f62429a76990e0247175704cf Author: fanglin.yang Date: Sun Sep 15 14:35:47 2019 -0400 add netcdf option in config.post and rocoto/post.sh commit 7c9bb3dc1a1598387de26d3fd453a9d3197076ac Author: fanglin.yang Date: Sat Sep 14 13:15:46 2019 -0400 Add options to write 1) RESTART files with compression 2) forecast files in netcdf format with compression make sure LINKed file names in DATA and ROTDIR match each other modified: parm/config/config.fcst modified: scripts/exglobal_fcst_nemsfv3gfs.sh commit 13ea6ceb47da5fd0173fe6dd3f7f400851910892 Author: fanglin.yang Date: Sat Sep 14 01:37:54 2019 +0000 modified: ../parm/config/config.fv3 Further tuned GWD coefficients. modified: checkout.sh: use temporary https://github.com/yangfanglin/UFS_UTILS commit 5d1fe312749c5fd01090184fabc50bd53cdd33b5 Author: George Gayno Date: Fri Sep 13 23:46:56 2019 +0000 Vlab issues #65358 and #67852 Updates to new gdas2gldas program. Compile on Dell. Read in gldas restart file to set target grid mask. Change-Id: Id862601158715f0f41092bd2192573e02eb832b3 commit 4399d3d275e65ef5ca4219e566d89e0414fca83f Author: russ.treadon Date: Fri Sep 13 13:01:30 2019 +0000 VLab Issue #65358 Only add GSI diagnostic file to tape archive list if diagnostic file exists commit 719688bc765e8f70d3165729517bba3682780fbb Author: fanglin.yang Date: Fri Sep 13 04:38:18 2019 +0000 VLab Issue #65358 1) update scripts to turn on all physics updates targeted for GFS.v16. 2) tuned coefficients for running ogwd and mountain block for 127-L GFS. 3) adjust computing resources in config.fv3 for 127-L GFS. 4) add to workflow the option to write out netcdf files with compression. See Jeff Whitaker's FV3 ticket https://vlab.ncep.noaa.gov/redmine/issues/68487 for details. 5) Create a UFS_UTIL branch "gfsv16" to set FNMSKH to T1534 land-sea masks for running CHGRES and surface cycle. 
Changes to be committed: modified: parm/config/config.base.emc.dyn modified: parm/config/config.base.emc.dyn.iau modified: parm/config/config.base.nco.static modified: parm/config/config.fcst modified: parm/config/config.fv3 modified: scripts/exglobal_fcst_nemsfv3gfs.sh modified: sorc/checkout.sh commit 31f91457604f68813ea7d57ed7503694a86fbfff Merge: 23671eee 8214216f Author: fanglin.yang Date: Fri Sep 13 02:56:08 2019 +0000 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 8214216fcf3409dde81a05d9c2c40680036b6ee8 Merge: b6ffd564 7c0cd1f2 Author: russ.treadon Date: Wed Sep 11 20:14:02 2019 +0000 VLab Issue #65358 Merge branch 'vrfy_metplus' at 7c0cd1f2 into feature/gfsv16 commit b6ffd564607bffaddb62ec2a22aa44c9d31bdaee Author: russ.treadon Date: Sat Sep 7 14:58:25 2019 +0000 VLab Issue #65358 Add IAU_OFFSET=0 to DOIAU_coldstart section of scripts/exglobal_fcst_nemsfv3gfs.sh commit 7c8383dbb5b5352cba4bb0750bea8dd3b9a252c5 Author: russ.treadon Date: Fri Sep 6 13:49:09 2019 +0000 VLab Issue #65358 Remove module_base.wcoss_cray since it is no longer needed. module_base.wcoss_c is used on Luna and Surge (WCOSS_C) commit 286b7afad87530db3ef83952c76d0f778f2c0198 Author: fanglin.yang Date: Thu Sep 5 23:36:49 2019 -0400 VLab Issue #65358 correct a namelist mistake in exglobal_fcst_nemsfv3gfs.sh. load yaml-cpp that is required by esmf commit eead3959412221b583b86eff54948b4ca35ea62a Author: fanglin.yang Date: Thu Sep 5 20:18:23 2019 +0000 VLab Issue #65358 1. modified: sorc/checkout.sh as a temporary resolution to check independent sub-modules of NEMSfv3gfs 2. modified: parm/config/config.fcst and modified: scripts/exglobal_fcst_nemsfv3gfs.sh to add an option to use Wyser 1996 cloud ice radius scheme. See FV3 Redmine issue https://vlab.ncep.noaa.gov/redmine/issues/68141 for the detail. 
commit af747496f8b4b2de287d061a5094135083410803 Author: Walter Kolczynski Date: Thu Sep 5 18:59:49 2019 +0000 Update NEMS app version to a master commit Previously the checkout for the FV3-GSDCHEM-WW3 app temporarily pointed to an interim commit that included GSDCHEM 0.87 and used ESMF 8.0.0bs40+ to facilitate development while waiting for the app to merge a full update to master. That update has now been made to the app master, so the checkout commit is now updated to point to that new version. The new FV3-GSDCHEM-WW3 app commit updates to GSDCHEM 0.87 and ESMF 8.0.0bs47, and also updates the other components to recent development versions. Change-Id: I0318535aa7564b48567a6442cdda074007095c03 Refs: #58418 commit dbcf983d06ece8959a72f2e7a4713382194cfe2d Author: russ.treadon Date: Wed Sep 4 17:43:47 2019 +0000 VLab Issue #65358 Update Theia and WCOSS module_base esmf module load to esmf/8.0.0bs47 (consistent with forecast model) commit 67390029f44f5e14532f80e002f4d9b46cd92454 Author: fanglin.yang Date: Wed Sep 4 12:24:29 2019 -0400 VLab Issue #65358 Further update config.fcst to turn off canopy heat storage (config.fcst) for Noah-MP. Including canopy heat storage does help improve the 500-hPa ACC but tends to increase the lower-troposphere cold biases. 
commit 96a27fc42966baaddc15834587e071a4122cfdd9 Merge: 6f7bf392 ede312e0 Author: fanglin.yang Date: Wed Sep 4 10:33:39 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into feature/gfsv16 commit 6f7bf3921d00aaf6470ffb105bcd4b801f5fda35 Author: fanglin.yang Date: Wed Sep 4 10:31:54 2019 -0400 VLab Issue #65358 Update config.fcst to turn on canopy heat storage (config.fcst) along with Noah-MP commit ede312e030e602fa525ac64ed047ba673d527b22 Author: Xu.Li Date: Tue Sep 3 19:01:59 2019 +0000 Vlab Issue: #67743, modify to apply real SST climatology updates to NSST background over small water bodies commit c1ad18d74918943b9492affb71c93a4fd2f77e6f Author: fanglin.yang Date: Sat Aug 31 15:27:27 2019 -0400 VLab Issue #65358 set in config.fcst hord=5 for both gfs and gdas cycle forecasts commit a90f50af2c12654db6e910dbf9bbc2799807d82f Author: fanglin.yang Date: Sat Aug 31 13:05:06 2019 -0400 remove obsolete readme.txt commit ecd87d7d78bec7839a45269c907030a0501ce133 Author: George Gayno Date: Thu Aug 29 19:46:53 2019 +0000 Vlab issues #65358 and #67852 Baseline new program - tentatively called gdas2gldas - that will update GLDAS with tiled surface data from GDAS. Change-Id: I5f97b4665ec389f85fda1db7caef2425472855ca commit 3834dfb268a375a4e529fee140dd770b8389f8aa Author: russ.treadon Date: Wed Aug 28 18:25:31 2019 +0000 VLab Issue #65358 Update forecast model checkout to NEMSfv3gfs branch gfsv16_physupdt. commit 1677483ccf48b72302ade130ba4dfd99d63ac739 Merge: b41d8472 c68728b5 Author: russ.treadon Date: Wed Aug 28 18:00:20 2019 +0000 VLab Issue #65358 Merge branch develop at commit:c68728b5 into feature/gfsv16. 
commit b41d8472c3b9a0ac751bb7871de2f90f24904a89 Author: russ.treadon Date: Wed Aug 28 15:22:53 2019 +0000 VLab issue #65358 Undo MinMon changes committed at 6325027f given developer updates to DA_GFSv16 commit 659443c19182c3bbcdd13100ba2c151fa3b9e26e Author: catherine.thomas Date: Wed Aug 28 10:23:49 2019 -0400 VLab Issue #65358 Updated intel, impi, and esmf modules for Theia commit f0e4cf7d0f0e323d5ab9af426633e7d599c98120 Merge: 06a217f1 f94a5a43 Author: russ.treadon Date: Wed Aug 28 12:57:32 2019 +0000 VLab issue #65358 Merge branch 'develop' at f94a5a43 into feature/gfsv16. commit 23671eee2fe94527cd659fa9ea3afb19fe0004d0 Merge: ab07a77c 06a217f1 Author: fanglin.yang Date: Tue Aug 27 22:53:19 2019 -0400 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 06a217f181769757b4ddc92a5ec208bfeae073c6 Author: russ.treadon Date: Mon Aug 26 14:51:27 2019 +0000 VLab issue #65358 resolve Bug #67573 in feature/gfsv16: correct typo in firstcyc queue_arch definition on non-slurm platforms commit 6325027f1c51ca3adee214c5dd7742ffc6668b00 Author: russ.treadon Date: Mon Aug 26 14:43:03 2019 +0000 VLab issue #65358 Update MinMon paths for development parallels; update WCOSS_D Fit2Obs to process netCDF files commit 57f74c842ba30e40f6e2c3ba1bd4802388aa068f Author: Walter Kolczynski Date: Fri Aug 23 16:34:57 2019 +0000 Update UFS_utils version to work with GEFS The GEFS init system requires changes to global_chgres_driver.sh that are not present in v1.0.0. The UFS_utils version is updated to the current tip of the develop branch, which includes the necessary changes. 
Change-Id: I097b16342ca081e867786410a6ac9916fd5576fb Refs: #58418 commit ab07a77c994d4c38a69b1656e29edbd6b0c0c387 Author: fanglin.yang Date: Sun Aug 18 12:51:30 2019 +0000 remove outdated readme.txt commit 24bcacb621073dcbad0fdd457f1d48cdb7daf53a Author: fanglin.yang Date: Sun Aug 18 03:55:36 2019 +0000 add back checkout.sh commit 2d33c2f6636eaa690c40284cb526709aaa989394 Merge: 8a7d50ef 4222fbf4 Author: fanglin.yang Date: Sun Aug 18 03:46:57 2019 +0000 update to feature/gfsv16 commit 4222fbf4d1ae83b57f23974c5b096a2cadfca933 Merge: 90000d08 7448544c Author: russ.treadon Date: Tue Aug 13 10:09:01 2019 +0000 VLab issue #65358 Merge branch 'develop' at 7448544c into feature/gfsv16. Brings in Theia bug fixes #66187. commit 6fddfaf8212d29ee9a1eddbdbc7f710d2a19e426 Merge: 9fd2726d 9dc4c7a1 Author: Walter.Kolczynski Date: Mon Aug 12 18:55:36 2019 +0000 Merge branch 'develop' into lgan_fv3_ww3 Change-Id: I068787c69f6610fa6896af16775378286d3e948b commit 90000d08d00b1ba188237eabd5d9a79801e7d80d Merge: 0e676894 9dc4c7a1 Author: russ.treadon Date: Fri Aug 9 12:49:26 2019 +0000 VLab issue #65358 Merge branch 'develop' at 9dc4c7a1 into feature/gfsv16. Brings in hotfix #67072 commit 0e67689461475e2955d6871d3495530332d8cba7 Author: russ.treadon Date: Thu Aug 8 14:56:43 2019 +0000 VLab issue #65358 Generalize ush/hpssarch_gen.sh to include restart files needed by IAU commit 9fd2726d64d62b7fc6196fb74060dc2ec15b3dab Author: Walter.Kolczynski Date: Mon Aug 5 18:24:21 2019 +0000 Update FV3-GSDCHEM-WW3 version Updated the checkout script to fetch a newer commit of the FV3-GSDCHEM-WW3 app. This newer version uses ESMF v8.0.0bs42, updates GSDCHEM to 0.87, WW3 to an OMP-enabled version, and FV3 to the latest develop commit. 
Change-Id: I7c16511e2d5d183abd96371afd0d0b237e5349aa Refs: #58418, #62104 commit f123abb87a153876a1cdcff5285d7c4dd5d66567 Merge: f4b47b5d dcfa5eee Author: Walter.Kolczynski Date: Wed Jul 31 18:31:17 2019 +0000 Merge remote-tracking branch 'origin/develop' into lgan_fv3_ww3 commit c9ffb656f75ff2cf0af4626dae00fdab2815b8ee Merge: 9c9e7416 dcfa5eee Author: russ.treadon Date: Tue Jul 23 17:10:02 2019 +0000 VLab issue #65358 Merge branch 'develop' at dcfa5eee into feature/gfsv16 commit f4b47b5db243f4f0d172b3430e8a6f494946ca2a Author: Henrique Alves Date: Thu Jul 18 21:49:50 2019 +0000 Cleaning up esmf profiling parms commit 056c6cd12288a6a8f3b1fd0fa6165f0b14667a71 Merge: 2bb5bacc 01b12587 Author: Henrique Alves Date: Thu Jul 18 21:45:33 2019 +0000 Merge branch 'lgan_fv3_ww3' of gerrit:global-workflow into lgan_fv3_ww3 commit 2bb5bacc59c830276c91c03ab490cfbc4781844e Author: Henrique Alves Date: Thu Jul 18 21:45:23 2019 +0000 Branch lgan_fv3_ww3: changing exglobal_fcst to allow writing wave component restart files following number of daily cycles (gfs_cyc), and creating proper comout for wave restart files commit 9c9e741659b7d0a04abd269837869c22b59108db Author: russ.treadon Date: Thu Jul 18 14:58:16 2019 +0000 VLab issue #65358 Replace IAU_DELTHRS with IAU_OFFSET to set iau_offset in model_configure commit 01b12587ae6567677d95e15e990397f142c53633 Author: Walter.Kolczynski Date: Tue Jul 16 17:49:44 2019 +0000 Fix clean arguments for building fv3 The arguments used when building the FV3-GSDCHEM-WW3 app used named arguments, but the script expects arguments without names. 
Change-Id: Ibc98a4c883f4f28c361104d03e4b17fb070dc7f8 Refs: #58418 commit 1743914f7d2475f128b9c954eef1fa063fe377b3 Author: russ.treadon Date: Tue Jul 16 15:43:41 2019 +0000 VLab issue #65358 * jobs/rocoto/vrfy.sh - set RUN_ENVIR to devpara for fit2obs only * scripts/exglobal_fcst_nemsfv3gfs.sh - IAU related changes * sorc/checkout.sh - checkout NEMSfv3gfs tag gfs.v16_tag_v2.0.0; remove fix_l127 from ProdGSI checkout commit 7135cc11b621165edc1fd7677af053e915ceb364 Author: Henrique Alves Date: Mon Jul 8 17:03:55 2019 +0000 Small bugfix on copying out_pnt for wave output commit 65c3720495cc39ddd64c21b4c4e0080fccd877b0 Author: Henrique Alves Date: Mon Jul 8 15:07:19 2019 +0000 Changes made to exglobal_fcst to streamline linking and copying IO from/to wave model component. commit 8588614fc1c2015687615b58d6b160b6b8a04a52 Merge: 222059b4 6ca55564 Author: russ.treadon Date: Fri Jul 5 14:40:02 2019 +0000 Redmine issue #65358 Merge branch 'master' at 6ca55564 into feature/gfsv16 * add safeguard to not execute select vrfy jobs when CDATE equals SDATE * remove five sorc/build*sh scripts since they are now built by build_ufs_utils.sh commit 222059b4606ee7fc8944f73b42eb2f09171887fd Author: russ.treadon Date: Sat Jun 29 22:37:25 2019 +0000 Redmine issue #65358 Mistakenly removed ProdGSI checkout DA_GFSv16 at 413d08ba. Restore DA_GFSv16 checkout with this commit. 
commit 413d08baf228ae5dc2d3731ddb620ac59cad801f Author: russ.treadon Date: Sat Jun 29 21:16:48 2019 +0000 Redmine issue #65358 Update fv3gfs model to gfs.v16_preCCPP_20190610_v1.0.1 in sorc/checkout.sh commit 8a7d50ef804b1afaa305c35b9360ce075202b882 Author: fanglin.yang Date: Sat Jun 29 19:11:19 2019 +0000 update model tag to gfs.v16_preCCPP_20190610_v1.0.1, in which io/FV3GFS_io.F90 was updated to restore warm-restart reproducibility capability commit 02704a3f0312383919adda7245300a5a60b0ecd8 Author: Walter.Kolczynski Date: Fri Jun 28 16:21:46 2019 +0000 Remove CDUMP requirement to use wave coupling Previously, the needed operations for wave coupling were only completed if CDUMP="gfs", even if cplwav=".true.". The GEFS uses a CDUMP="gdas", so this requirement had to be removed for the GEFS to operate correctly. Change-Id: I50964a7fc064f439b6c879fb199d910b3f7eadfc Refs: #58418 commit 4d8f71e61683a63d2a3b8b25cafca7ef79d2c7ff Author: russ.treadon Date: Fri Jun 28 13:50:58 2019 +0000 Redmine issue #65358 Remove sorc/gettrk.fd and gettrk references in sorc/link_fv3gfs.sh. commit a3033a05458b47548d404f82148423edda4946cd Merge: 6cfec865 6ca55564 Author: Walter.Kolczynski Date: Thu Jun 27 19:38:01 2019 +0000 Merge branch 'master' into lgan_fv3_ww3 Change-Id: I31ee2e1a55de3db34de2dbcb029224729c3c9074 commit a558e4a0e73cd55c90d04bb463b5ba85c12cad33 Author: russ.treadon Date: Thu Jun 27 19:21:08 2019 +0000 Redmine issue #65358 update checkout.sh to check out GFS v16 DA development branch, DA_GFSv16 commit 6cfec865ebb0bcad8f81ebea21bbe4d06531ac99 Author: Walter.Kolczynski Date: Thu Jun 27 19:15:50 2019 +0000 Update checkout script to get combined FV3-WW3-GSDCHEM app Replaces the FV3-only checkout with one that clones the combined FV3-WW3-GSDCHEM app, which can be run in any combination of those components. This unified app is required to run coupled forecasts. The app will build correctly in the normal way using build_fv3.sh (and build_all.sh). 
Post is also reverted to the mainline project rather than the gtg version which has more restricted access. NOTES ON APP VERSION: 1) App does not build successfully on WCOSS-Cray due to a module dependency issue that will be sorted out later. 2) As committed, the WW3 build requires a path of 79 characters or less. This will be addressed in a future NEMS commit. For now, you can fix this manually by updating L19 of fv3gfs.fd/NEMS/src/incmake/component_WW3.mk to a larger number before building. Change-Id: I6c6d000650dc4ee8b2bbc9997195c3dfa2690520 Refs: #58418 commit a682a673e5a920b629d18d0a9c167144fbbcae91 Author: catherine.thomas Date: Wed Jun 26 11:16:05 2019 -0400 Redmine Issue #65358 Bugfix for jobs/rocoto/getic.sh. The date IF statement to handle pre-nemsio analysis files should use "-ge" instead of "-gt". commit 368141ca93c5c45af6fb0c3d541cc96ed0bd669b Author: russ.treadon Date: Wed Jun 26 13:45:22 2019 +0000 merge parm/config/config.fcst (ae08fe77) and sorc/checkout.sh (5bd7d817) from gfsv16fyang3 to feature/gfsv16. 
refs #65358 commit 5bd7d817ae18259acfd70dde98df9e4c798afc1f Author: fanglin.yang Date: Wed Jun 26 02:48:38 2019 +0000 use model branch gfs.v16_preCCPP_20190610_v1.0.0 which contains Helin's fix to FV3GFS_io.F90 commit ae08fe77a25a1d6c6d57057c263dc44b33ad5478 Author: fanglin.yang Date: Wed Jun 26 00:08:26 2019 +0000 change lheatstrg=T if lsm=1 and lheatstrg=F if lsm=2 commit 027af835e3dcb212f0487d37cc518c8c82f43631 Author: russ.treadon Date: Tue Jun 25 21:50:59 2019 +0000 refs #65358 merge scripts/exglobal_fcst_nemsfv3gfs.sh from gfsv16fyang3 at 57a51f8c to feature/gfsv16 commit 513eb3a9cbcd38e3c8c2d60303edbd06c0ade2a3 Merge: 57a51f8c 0637ec7a Author: fanglin.yang Date: Tue Jun 25 20:05:02 2019 +0000 Merge branch 'feature/gfsv16' of gerrit:global-workflow into gfsv16fyang3 commit 57a51f8c5f2b71b2c7b913612e2331ad2bf866b2 Author: fanglin.yang Date: Tue Jun 25 20:03:58 2019 +0000 set default optveg=1 in exglobal_fcst_nemsfv3gfs.sh to turn off dynamic vegetation commit 0637ec7a39e3163e77caedafcdc25f175e8f25d2 Author: russ.treadon Date: Tue Jun 25 18:28:45 2019 +0000 feature/gfsv16: add link to redmine issue refs #65358 Commit the following changes to checkout.sh: * add "-e" option to trap execution errors * remove ProdGSI branch EXP-locfix-io checkout (use master) * remove commented out lines commit e57878705dc8356e5a2b774543701e151a3efb69 Author: fanglin.yang Date: Tue Jun 25 17:59:51 2019 +0000 update exglobal_fcst_nemsfv3gfs.sh to pass landice from config.fcst; set lheatstrg=F if lsm=2 commit 2550fafc92738e1bb5bf526a45b4e06d8894e640 Author: fanglin.yang Date: Tue Jun 25 03:32:20 2019 +0000 change default value of TKE in all field_table for sa-tke-edmf pbl scheme from 1.0E30 to 0.0. Otherwise, the model segfaults for warm restart. 
Update config.fcst to turn on sa-tke-edmf as default commit 19bf5fdc859bb658c208c1496aab875eb926713d Author: fanglin.yang Date: Tue Jun 25 03:21:01 2019 +0000 update scripts/exgfs_nawips.sh.ecf to add more output for WPC -- a fix from gfs.v15.1 after implementation commit 6c14ce46e88c0d7b2e62480311defcc35f069045 Author: fanglin.yang Date: Sun Jun 16 18:08:07 2019 +0000 change lheatstrg=T commit 39476f9b7f6be32a6e48be8860fee227cd2a25c9 Author: fanglin.yang Date: Sun Jun 16 17:58:33 2019 +0000 set default satmedmf=F in config.fcst. Model segfaults with satmedmf=T for warm restart commit 5a13af015c969270eda37a640c69d44e13ab7435 Author: fanglin.yang Date: Fri Jun 14 05:44:06 2019 +0000 update module_base.theia, loading esmf/8.0.0bs21 needs module impi/5.1.2.109 commit 0e70011d6e982bdf8b4bc414511d9d28adfa641f Author: fanglin.yang Date: Thu Jun 13 04:52:50 2019 +0000 update config.fcst for running model with sa-tke-edmf pbl scheme commit c93e259b3b7394b3e4bbb4bbf1dcb1039151f207 Author: fanglin.yang Date: Thu Jun 13 04:46:31 2019 +0000 add field_table_gfdl_satmedmf field_table_thompson_satmedmf field_table_wsm6_satmedmf field_table_zhaocarr_satmedmf for running the model with prognostic subgrid scale turbulent kinetic energy in sa-tke-edmf commit fecff06eed919c8bcce0cf5cdfc27c22e012da03 Merge: 94614d3f 149a95e3 Author: fanglin.yang Date: Wed Jun 12 19:49:20 2019 +0000 Merge branch 'master-v16' of gerrit:global-workflow into gfsv16fyang3 commit 94614d3f95ac951d0f15c763e0371b5ba036d43f Author: fanglin.yang Date: Wed Jun 12 19:48:43 2019 +0000 update workflow to include new physics options: radiation, NoahMP, sa-TKEEDMF and UGWP commit 149a95e35097ad82b7f74c8092134307807648a7 Author: Lin.Gan Date: Wed Jun 12 16:16:20 2019 +0000 Merge nemsio_chgdate change redmine #64864 commit 77f8b292b1880815d7a8971dd686fa8e87b11619 Author: Lin.Gan Date: Thu Jun 6 14:10:03 2019 +0000 Add missing line in config.fv3 commit 29295e75838cec71b4a1bc538c3da78ed6732ed1 Author: Lin.Gan Date: Thu Jun 
6 13:35:18 2019 +0000 Apply changes from rev 2fd3204c60b8ed8282ea0dc96f19f6aa24e27f7f and 5518e751d9c6105795e958a3c02175bf1aaf2dd5 commit 1248a869bdd49300cdfd5e47068dd861e801bd6b Merge: 5a4da48e 55f4cc28 Author: Lin.Gan Date: Wed Jun 5 20:23:30 2019 +0000 Merge branch 'feature/iau' into master-v16 Merge feature/iau into master-v16 to take care of vlab issue: 64420 - Add optional 4DIAU to global-workflow Rev. merge 2fd3204c - merge gfsv16fyang2 into feature/iau Rev. merge 5518e751 - update checkout.sh to use EMC_post:master commit 5a4da48eaf45ee62a5456f527a7c224c3940d7cc Author: Lin.Gan Date: Thu May 30 13:43:49 2019 +0000 Remove extra ufs_utils related scripts from ush commit beea5bf73b7c647bb9ce2e62d442454253e3e498 Author: Lin.Gan Date: Wed May 29 15:32:46 2019 +0000 Create FV3 GFS v16 master commit 55f4cc28bbc9d0a8588546c9ac975adc553a343e Author: russ.treadon Date: Tue May 28 20:50:57 2019 +0000 Correct logic for historical ozinfo files. Add correct logic to config.anal.iau commit 4e8e0597844c0308dfa5d2591383e5c3c293c0e5 Merge: 751c6dd6 d919db63 Author: russ.treadon Date: Fri May 24 20:36:49 2019 +0000 Merge branch d919db63 'master' into feature/iau commit 751c6dd61f2c2b990a48fe030dc8c69613ce666d Author: russ.treadon Date: Thu May 23 12:51:23 2019 +0000 Add logic to config.anal to set OZINFO for retrospective parallels commit 4c846c7168dd4b758e5a5b27b06c41f441f754ff Merge: f671b8b2 219fe342 Author: russ.treadon Date: Wed May 22 13:26:15 2019 +0000 Merge branch 'gfsv16fyang' into feature/iau commit f671b8b280cedc0f77ff15bf0f317c62e7d11879 Merge: 0fb73681 2acb5d31 Author: russ.treadon Date: Tue May 21 13:55:22 2019 +0000 Merge 2acb5d31 master into feature/iau commit 219fe3426436525f44b4dd4fd2251a328f748887 Author: fanglin.yang Date: Fri May 17 16:23:22 2019 +0000 update post.sh to check if the master file for a particular forecast hour exists, then not keep rerunning those hours when the POST job is resubmitted. 
This is useful for running extended long forecasts commit 0fb73681321cbc6d7ece3e518edea4fbef82db8c Author: russ.treadon Date: Mon May 13 18:10:02 2019 +0000 Add DA executables to link_fv3gfs.sh commit b083d3568d58865f57b39c45b3f4be2bc0bc982f Merge: b46d0166 475967bc Author: fanglin.yang Date: Mon May 13 02:54:19 2019 +0000 Merge branch 'master' of gerrit:global-workflow into gfsv16fyang Master was updated for using SLURM instead of MOAB job scheduler on Theia/Jet commit b46d016624087eedbd44a503582724607cd7806c Author: fanglin.yang Date: Mon May 13 02:44:24 2019 +0000 Changes to be committed: modified: ../../jobs/rocoto/arch.sh add an option SAVEFCSTNEMSIO. if SAVEFCSTNEMSIO=NO, nemsio files are not archived to HPSS save online grib2 1-deg pgb files instead of grib1 1-deg files modified: ../../jobs/rocoto/vrfy.sh save online grib2 0.2-deg pgb files instead of grib1 0.2-deg files modified: ../../scripts/exglobal_fcst_nemsfv3gfs.sh set warm_restart=.true. if RERUN=YES for forecast-only experiment. Otherwise, the model fails to restart as it keeps looking for gfs_cntrl.nc for cold start cases. 
modified: ../../sorc/checkout.sh checkout UPP branch gfsv16_fyang for processing 127-L GFS commit db173335481f6145e36668f29bf207ef954b1f4c Merge: c71cefaa 475967bc Author: jswhit Date: Sat May 11 19:03:50 2019 +0000 Merge branch 'master' into feature/iau commit c71cefaaecee38b8759924edabdadd32778c5565 Author: jswhit Date: Wed May 8 02:08:59 2019 +0000 changes to enable slurm from slurm_beta_sync branch commit 9af1480dee2351b540e873f1daf9d26928ec472d Author: lin.gan Date: Thu May 2 17:55:16 2019 +0000 Update resource to use dynamically calculated lowres on Theia include expdir commit ced016bb8b949350f92730bb52ca74cd3b8d9ad2 Author: lin.gan Date: Thu May 2 17:50:27 2019 +0000 Update resource to use dynamically calculated lowres on Theia commit 76f877c57b8f965779725a48059f6a1dc83290fa Author: lin.gan Date: Wed May 1 19:03:13 2019 +0000 Coupled ww3 tested with FV3 GFS forecast certified commit 56c686d03a420bf7216aafb86a572a054fcecdc0 Author: lin.gan Date: Wed May 1 14:06:18 2019 +0000 Testing include ICE data commit c0afa1b1887ae9f474436a98e3b36c1a1c73954d Author: Catherine Thomas Date: Wed May 1 13:50:14 2019 +0000 VLab Issue #60800, feature/iau (C Thomas): Made vcoord more flexible in gaussian_sfcanl. 
commit ebbf3c0902578b3647d4f110c168e0e87abe24a9 Author: Jeffrey.S.Whitaker Date: Tue Apr 30 11:54:03 2019 -0400 back out some changes from L127_updates merge commit 06441ca025bed60257de1cebaf23ed637db4530b Merge: ce2298dd b439bc7f Author: Jeffrey.S.Whitaker Date: Tue Apr 30 11:13:39 2019 -0400 Merge remote-tracking branch 'origin/L127_updates' into feature/iau commit ce2298dd3893adbfddd7dd857cab6712bd35af0b Author: jswhit Date: Mon Apr 29 18:40:02 2019 +0000 don't try to archive abias files if lobsdiag_forenkf = true commit 199e61c3fa124d1610bcb32cd5adcae8216aad5f Author: jswhit Date: Tue Apr 23 21:33:57 2019 +0000 add missing FNTSFC definition commit eeb6dd7f40174b0a847c816f80741ace704305a5 Merge: c7d0d246 8619f479 Author: jswhit Date: Tue Apr 23 20:17:01 2019 +0000 Merge branch 'master' into feature/iau commit b439bc7f5e3f63d59812f0fdc0a716d0d33966e1 Author: Catherine Thomas Date: Mon Apr 22 16:36:10 2019 +0000 VLab Issue #60800 (L127_updates): Removed references to *C hyblev file. commit c24de7af395edcf9cc9f341ee758d829d747b24f Author: lin.gan Date: Mon Apr 22 14:15:17 2019 +0000 First try see email for record commit e789501ca90f72d7ef121cee8023bb340c0b3416 Author: lin.gan Date: Mon Apr 22 14:12:40 2019 +0000 First try see email for record commit c7d0d2462e31744111b7198b2a4889c0a4a8084b Author: jswhit Date: Sat Apr 20 00:58:10 2019 +0000 update commit fa3385490ebbacc051cc97921bf642c4e47d504c Author: jswhit Date: Sat Apr 20 00:57:03 2019 +0000 add more skeb namelist parameters commit e4a085bb7e41d3978d9fe25c31a27fc457adcb6b Merge: d8e19fa6 82505618 Author: Catherine Thomas Date: Thu Apr 18 18:36:19 2019 +0000 L127_updates: Merge master commit f4296d5ae5dc392a6ad3596ed4352905745d13a6 Merge: b0f2a5ef 193f8837 Author: jswhit Date: Wed Apr 17 22:19:19 2019 +0000 Merge branch 'master' into feature/iau commit b0f2a5efd1da1dd3f8874e44a00384383b64f47c Author: jswhit Date: Wed Apr 17 15:00:57 2019 +0000 update for IAU commit 5f2add7980332d44b6715724081a73f6b4989b41 
Author: jswhit Date: Sun Apr 14 17:53:42 2019 +0000 update commit 84377670b492a18db153394fa37c660c27cf0615 Author: jswhit Date: Sun Apr 14 17:51:40 2019 +0000 update commit 62d02250e3eec721cfabdda62f5b988013dd44fb Author: jswhit Date: Sun Apr 14 17:50:18 2019 +0000 update commit 67520861e91a319ca34c075cd22033ac57a4e843 Author: jswhit Date: Sun Apr 14 17:47:27 2019 +0000 update commit ba4d815e8d41019ef047d5d302bf77e4feb64398 Author: jswhit Date: Sun Apr 14 17:45:54 2019 +0000 update commit 1a3fb9785463ffaa20b44400d8360dd500034f87 Author: jswhit Date: Sun Apr 14 17:44:20 2019 +0000 new file commit d28e57d88a64f17c1e86af5dca8c6f911ea87053 Author: jswhit Date: Sun Apr 14 17:42:48 2019 +0000 add output_1st_timestep so it works with NEMSFV3gfs master commit 1068601b73c59a4cf05b8939f371dc8c3e724de6 Author: jswhit Date: Sun Apr 14 17:37:26 2019 +0000 update commit 00240bfe5648b00764a65ffd94a6bdc0f06129e9 Author: jswhit Date: Mon Apr 8 19:38:17 2019 +0000 update commit b4fcf4d83e543af407bca3cf47e18baefc3c724d Author: jswhit Date: Sun Apr 7 12:39:52 2019 +0000 turn on netcdf diag files commit 46b94eaea38cfc027a5701ddaf6ccfb4689aa018 Author: jswhit Date: Thu Apr 4 19:02:07 2019 +0000 config for IAU commit ae727bef51666e7b48bfe9f8cb8a0ced782d1e8b Author: jswhit Date: Thu Apr 4 03:50:12 2019 +0000 run eobs at both gfs and gdas commit 2490ecdbfa80e0b85cab650c011769e6caff3364 Author: jswhit Date: Thu Apr 4 03:47:11 2019 +0000 update to run EnKF for gfs and gdas commit 8cb5d67982b052d3a693bfda6f18db860e6ed459 Author: jswhit Date: Tue Apr 2 16:19:07 2019 +0000 add parameters back in atmos_model_nml commit ca2f9c56badd8f92bb97a25234ab55fb86cced68 Author: fanglin.yang Date: Tue Apr 2 14:41:05 2019 +0000 update gsi tag from fv3da.v1.0.42 to fv3da.v1.0.43 to remove a dead link in Ozomon commit a6927f431ae5743fb5abe38d7e1629500a67535d Author: jswhit Date: Tue Apr 2 03:30:18 2019 +0000 update fv3gfs to 1.0.18 commit 06029ac87fdb5ab6699d40b9a3012912c5a0389a Author: jswhit Date: Tue Apr 2 
03:29:51 2019 +0000 more bugfixes commit dbad0c88855ff9fd06cf90a70d0ee112a065febf Author: fanglin.yang Date: Tue Apr 2 00:57:27 2019 +0000 update POC for DA in release note commit f8f108f4a7030a3a3619fec52d6d6d8d961fab65 Author: fanglin.yang Date: Tue Apr 2 00:46:18 2019 +0000 merge NCO's back to q2fy19_nco branch. Update release notes commit 602db4a60218b0e6d47333009e5e1150400eda91 Author: jswhit Date: Mon Apr 1 02:16:17 2019 +0000 fix bug introduced into non-iau warm start forecasts commit 6eac887a585a2513ca21013db2c0948be908402c Author: jswhit Date: Sun Mar 31 14:54:03 2019 +0000 update commit f256a51ac8a6a171f976be5548ca6b842e3b7532 Author: jswhit Date: Sun Mar 31 13:22:10 2019 +0000 more bug fixes commit bde9520b80c199ed34d3cb98bce87db48f1ddcbd Author: jswhit Date: Sat Mar 30 23:22:07 2019 +0000 fix IAU cold start initial date in model_configure commit 22c89f93c1326ae9a83e22bb1d9ad7b027478c83 Author: jswhit Date: Sat Mar 30 19:40:04 2019 +0000 update commit c403d981f6f4ae58700967beaa2baa9d7453316d Author: jswhit Date: Sat Mar 30 18:50:48 2019 +0000 fix typo commit 0b43f7dfd6894ed205c537593a399859f8f8ad70 Author: jswhit Date: Sat Mar 30 16:51:33 2019 +0000 don't change date if coldstart commit cbbb49fb1b60f7bdde5f1bb66be79c86233b3122 Author: russ.treadon Date: Fri Mar 29 21:18:02 2019 +0000 Update DA tag to fv3da.v1.0.42 commit 4786300bfb97c599ac0a3d080e104f38034f0866 Author: jswhit Date: Fri Mar 29 15:55:17 2019 +0000 make restart interval for IAU 3 hours, so restarts are written at beginning and middle of window commit 0c64c804276d6ca29e701f439020cddbc248b723 Author: jswhit Date: Fri Mar 29 15:40:48 2019 +0000 make sure restarts at middle of window are saved when IAU is on commit 23a45dea9fd33480ab250b37e4d75d10500726cb Author: fanglin.yang Date: Fri Mar 29 14:30:17 2019 +0000 Update model tag to nemsfv3gfs_beta_v1.0.18 to 1) correct a bug in precip units in the computation of frozen precipitation flag (srflag), 2) write fields that are continuously accumulated in 
model integration in restart files so that after a restart their accumulated values can be read in. (FV3 Issue #61788) commit 4624e6749b29d59a0cecffb6d856c26d5be8a566 Author: jswhit Date: Fri Mar 29 01:47:51 2019 +0000 updates for IAU commit 944d81a8a194d7a0138fddab4cd685d15c905c47 Author: jswhit Date: Fri Mar 29 01:47:33 2019 +0000 update commit 1538b4f829255c1a3a82a52c16951072963c6624 Author: jswhit Date: Fri Mar 29 01:46:20 2019 +0000 use EXP-locfix and fix_l127 for gsi, and new tag for NEMSfv3gfs commit d9b5538244fee0876185fa90f0b40b1869dc2619 Author: fanglin.yang Date: Thu Mar 28 20:46:52 2019 +0000 update release note commit abff7d1d77b94b28aca48b996644df6e3fa4ccda Author: fanglin.yang Date: Thu Mar 28 20:45:36 2019 +0000 replace current ecflow/def files with NCO's copy commit 8718363623fe6999b65fd5d971dc93976d078757 Author: jswhit Date: Thu Mar 28 17:50:31 2019 +0000 params for IAU commit a0c7b84ff3868366570a20244b68a12274dec659 Author: jswhit Date: Thu Mar 28 17:42:48 2019 +0000 update commit 836d506585ef645add9dd5b4dc25416ac0672eda Author: jswhit Date: Thu Mar 28 17:06:25 2019 +0000 fix model_configure so start date is previous analysis time when IAU is on. commit 40793165c610c7c8b870be4d1822826e623a890b Author: jswhit Date: Thu Mar 28 15:15:43 2019 +0000 fix IAU commit 3f2dcce15a780a60e180a723dc9f5371f060eed8 Author: jswhit Date: Wed Mar 27 20:21:29 2019 +0000 testing iau commit dfd76c033e4430dcfc0046ca7384ad930e183769 Author: jswhit Date: Tue Mar 26 23:01:05 2019 +0000 use EXP-locfix branch for GSI commit 69c06ea44c639fd4716001edead49b5e2613a03d Author: fanglin.yang Date: Tue Mar 26 02:25:15 2019 +0000 Squashed commit of the following from branch q2fy19_nco_rst Add restart capability so a long GFS forecast can be rerun from the end or the failing point of the last attempt. modified: jobs/JGLOBAL_FORECAST, parm/config/config.base.nco.static parm/config/config.fcst, and scripts/exglobal_fcst_nemsfv3gfs.sh. 
restart_interval_gfs is used to control the frequency of writing out restart ICs, which are saved under ROTDIR for emc parallels and NWGES for NCO production. The exglobal_fcst_nemsfv3gfs.sh script has been modified to automatically detect whether the model should execute as a cold start, a warm start, or a rerun. If it is a rerun, the script will look for saved ICs that are restart_interval_gfs hours back from the last ending point. Use 8x24 instead of 12x16 layout in config.fv3 for C768 -- Matt Pyle indicated this will actually speed up a 16-day forecast by about 2 minutes per his test. Update to model tag nemsfv3gfs_beta_v1.0.17 to address restart I/O issues #60879 commit e01d1df6a81f7998574fa1145e05955f93ecf14d Author: russ.treadon Date: Mon Mar 25 23:16:05 2019 +0000 Update DA tag to fv3da.v1.0.41 (Q3FY19 GDAS observation upgrade) commit 5e49a47ccff59dd40be0cf18a3e7f4fcff0a115b Author: Rahul Mahajan Date: Mon Mar 25 22:02:32 2019 +0000 add config.resources.C96 commit eefaf95e6f6f4511021b83d9b8a43e2f544ba26f Author: jswhit Date: Mon Mar 25 12:37:48 2019 +0000 fix typo commit 5c5c1deb627589f59b666ec78de92db18827abae Author: jswhit Date: Mon Mar 25 02:41:00 2019 +0000 C96 LONA should be 190, not 192 (to match berror file) commit 411406246e5d4d6fa45af4d3f1bff66d8479964a Author: jswhit Date: Mon Mar 25 02:40:42 2019 +0000 .f90, not .f commit 0fdc76ce6f0149a7f62dc013850228bc6e0578cd Merge: c345a00d 8c563c17 Author: jswhit Date: Sun Mar 24 17:10:55 2019 +0000 Merge branch 'feature/iau' of gerrit:global-workflow into feature/iau commit c345a00d79e2d8d3cb7c9a72621b6ed905a9e165 Author: jswhit Date: Sun Mar 24 17:10:36 2019 +0000 fix building of namelist, specification of cycle time commit 8c563c17ea09c96d2c5b407555851c4df7c5a9a3 Author: Rahul Mahajan Date: Fri Mar 22 02:52:36 2019 +0000 fix modules issue commit bbd6e5e8e2a0d306dc352d229e8c6332c7540c7d Author: Rahul Mahajan Date: Fri Mar 22 01:46:14 2019 +0000 fix file permissions commit 28187c8e79abf10065b44cc69f68feb2cd5a2d44 
Author: Rahul Mahajan Date: Fri Mar 22 01:45:34 2019 +0000 bugfix in setup_expt.py commit b75a1d7f06196c1b89bcb08698bb0fce589e5510 Author: Rahul Mahajan Date: Thu Mar 21 21:22:06 2019 -0400 bugfix for DOIAU and DOIAU_ENKF Change-Id: Ie7e2c96773ceb59b7f9f4e7340b09a0210ccb1e4 commit d8e19fa60758aa550bc5924b7554dd85cd982eee Author: Catherine Thomas Date: Thu Mar 14 18:08:13 2019 +0000 VLab Issue #60800, L127_updates (C. Thomas): Added nlayers change for eobs/eomg steps in addition to anal. commit 5595fcca56e116526b32fe917a7e788864e7f388 Author: Catherine Thomas Date: Wed Mar 13 13:29:28 2019 +0000 VLab Issue #60800, L127_updates (C. Thomas): Bug fix for config.ecen commit 89f14fd6a9e91b2350455044181697e67a9e087e Author: Catherine Thomas Date: Tue Mar 12 13:08:45 2019 +0000 VLab Issue #60800, L127_updates (C. Thomas): Make global_chgres_driver more flexible to accept different SIGLEVEL commit 0535d189d831e2f0a979b2f9f2085b279b35ab7b Author: Catherine Thomas Date: Mon Mar 11 21:51:22 2019 +0000 VLab Issue #60800, L127_updates (C. Thomas): Fixed typo in config.fv3 commit e820ab2f46da9c123648b370cb71e356b058343b Author: Rahul Mahajan Date: Mon Mar 11 14:52:39 2019 +0000 add utility to change date and forecast hour in nemsio file. 
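The cold/warm/rerun detection described for exglobal_fcst_nemsfv3gfs.sh above can be sketched as follows. This is an illustrative sketch, not the actual script: apart from RERUN, warm_start, and restart_interval_gfs, the variable names and example values are stand-ins.

```shell
#!/bin/sh
# Illustrative sketch of the start-mode detection: a rerun resumes from
# the restart set written restart_interval_gfs hours before the point
# where the previous attempt ended.
RERUN=${RERUN:-"NO"}
warm_start=${warm_start:-".false."}
restart_interval_gfs=12      # hours between restart writes (example)
last_fhr=120                 # hour reached by the failed attempt (example)

if [ "$RERUN" = "YES" ]; then
  rerun_fhr=$(( last_fhr - restart_interval_gfs ))
  mode="rerun from forecast hour $rerun_fhr"
elif [ "$warm_start" = ".true." ]; then
  mode="warm start"
else
  mode="cold start"
fi
echo "$mode"
```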
commit 326da6173e8f6c430e1883e723f3c94b378bb829 Author: Rahul Mahajan Date: Mon Mar 11 10:38:36 2019 -0400 set restart interval = 3 at start of IAU expt Change-Id: I937c7aa96ef6421c0270c31e906eda1d6e4bb840 commit 3d1f592f1a95d0b9828e60baff409307b2d5d905 Merge: aafd573b 595d44c9 Author: Rahul Mahajan Date: Fri Mar 8 16:17:00 2019 -0500 Merge branch 'master' of gerrit:global-workflow into feature/iau Change-Id: Ide0c4c5dc13314de8f0eb19554b5895fc3e24a2e commit aafd573bc2c30acf5ef8dbf245acc6b6302369d7 Author: Rahul Mahajan Date: Fri Mar 8 16:16:40 2019 -0500 add IAU capability in the forecast script Change-Id: I024d73ba585ffb034f73e8207bbe036a712a51c2 commit aebf15574209889549dd7b0f496b63c51773dffa Merge: 292ef67b 595d44c9 Author: Catherine Thomas Date: Fri Mar 8 17:48:26 2019 +0000 Merge branch 'master' into L127_updates Adding in new model changes from potential v15 implementation commit 292ef67b5d883a6b1bafdc6e8653118b922d4623 Author: Catherine Thomas Date: Thu Mar 7 18:19:44 2019 +0000 VLab Issue #60800, L127_updates (C Thomas): Change hyblev file for fits commit 61b9da080d146fdc16c63aabdb4734ad5cbce8b9 Author: fanglin.yang Date: Tue Mar 5 18:59:56 2019 +0000 minor updates to resource usage in config.vrfy, config.resources and config.post on computers other than Dell to be consistent with the master repository commit 21007eae8d4660ef87c94f4388d0518a1071b945 Author: catherine.thomas Date: Mon Mar 4 09:44:10 2019 -0500 VLab Issue #60800, L127_updates (C. 
Thomas): Change GSI fix directory in checkout.sh for L127 commit b71c6b7eb24179e381d8abe8740bb5ca66175b3b Author: Catherine Thomas Date: Thu Feb 28 19:33:07 2019 +0000 VLab Issue #60800 (L127_updates): Point to correct hyblev file in fix_am commit 7b4bc5041ca71f0ae7f69dd9dae72eb0652bec24 Author: Catherine Thomas Date: Thu Feb 28 18:52:05 2019 +0000 VLab Issue #60800 (L127_updates): Level dependent changes to config files commit d2b0c40f58b0996dd9ec39c3fac54af132942f8b Author: Catherine Thomas Date: Thu Feb 28 18:24:48 2019 +0000 VLab Issue #60800: Allowed for externally specified SIGLEVEL file in ush/gaussian_sfcanl.sh commit 3b2993f3871467707427dc65deb8c97c7942c178 Merge: 6e7ecef60 70ae08c85 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Thu Apr 9 17:02:49 2020 -0400 Merge pull request #56 from NOAA-EMC/feature/dev-v15.2.10 Issue #52 - GFSv15.2.10 updates for develop from RFC 6652 commit f5c4aff96c9cd6a94d7565cc5a530c233cd531c6 Merge: 7b9735456 dae3332af Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Thu Apr 9 17:02:35 2020 -0400 Merge pull request #55 from NOAA-EMC/feature/ops-v15.2.10 Issue #52 - updates for GFSv15.2.10 from RFC 6652 commit 70ae08c85a9e39226f583210c88bdf526f50a395 Author: kate.friedman Date: Thu Apr 9 20:17:03 2020 +0000 Issue #52 - GFSv15.2.10 updates for develop from RFC 6652 commit dae3332afdd0219eae3900165cde2efcb5ea62fd Author: kate.friedman Date: Thu Apr 9 20:09:57 2020 +0000 Issue #52 - updates for GFSv15.2.10 from RFC 6652 commit 7b97354567d2c50adc6b48ac2c9371b3638c28e8 Merge: 7eb0e8205 9311dfa5d Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Thu Apr 9 15:17:21 2020 -0400 Merge pull request #54 from NOAA-EMC/feature/i46ops Issue #46 - syndat path and prod module updates for operations branch commit 9311dfa5d65fa46a222574f037f1711de0cf0224 Author: kate.friedman Date: Thu Apr 9 18:34:39 2020 +0000 Issue #46 - syndat path and prod module updates for 
operations branch - update prod_envir module to prod_envir/1.1.0 - update syndat paths - necessary updates to chgres-related scripts to get them working for testing - update default ACCOUNT to GFS-DEV commit 6e7ecef6070985248444d74650512722f4e79069 Merge: 0e0dc873c 69e7eada4 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Thu Apr 9 13:55:59 2020 -0400 Merge pull request #53 from NOAA-EMC/feature/i46dev Issue #46 - syndat compath updates and prod modules commit 69e7eada432995041a4fd8b449a8e900dcbc43ce Author: kate.friedman Date: Thu Apr 9 17:29:42 2020 +0000 Issue #46 - reverted JGLOBAL_TROPCY_QC_RELOC change, added DO_WAVE with default into config.vrfy, and organized verification switch comments in config.base.emc.dyn commit 81f6114e96f5cd0d55a02306f4ca37ab5c18fb33 Author: Kate.Friedman Date: Thu Apr 2 18:18:30 2020 +0000 Issue #46 - fix to verification switches and fix to COMINsyn path on Hera commit a847accd7e89b3f60e662fa05c633b0efdfa5332 Author: Judy.K.Henderson Date: Tue Mar 31 23:29:26 2020 +0000 merge 26Mar20 develop branch into gmtb_ccpp_hera Squashed commit of the following: commit 0e0dc873ceff26f9666fc40287ad2457171719c7 Merge: b133700a 95a63432 Author: Kate Friedman Date: Thu Mar 26 10:54:44 2020 -0400 Merge pull request #42 from NOAA-EMC/hotfix/viewer Issue #41 - Update PRODUTIL paths for WCOSS in viewer commit 95a63432f3fbdd8745d0617089485cdc332f57e7 Author: kate.friedman Date: Thu Mar 26 14:43:44 2020 +0000 Issue #41 - Update PRODUTIL paths for WCOSS in viewer commit b133700a4a65fe5334e5b5522a8c6f83bf0bbd0c Author: kate.friedman Date: Thu Mar 26 13:01:38 2020 +0000 Issue #21 - correct syntax for machine if-blocks in setup_expt scripts commit f4ee5098988106037528650ba8a15922150b70af Author: kate.friedman Date: Wed Mar 25 20:52:49 2020 +0000 Issue #21 - remove unneeded logic from partition check commit 4acc80f6037a4d0fd43a64e9178e7aadbd6162f5 Merge: accb6f4b fb1c79f4 Author: Hang-Lei-NOAA 
<44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Wed Mar 25 15:22:51 2020 -0400 Merge pull request #40 from NOAA-EMC/port2wcoss3p5 Issue #21 - Add support for WCOSS phase 3.5 commit fb1c79f4b7136e09c9d35d9d05303116036f58c9 Author: kate.friedman Date: Wed Mar 25 16:06:54 2020 +0000 Issue #21 - updated WCOSS phase 3.5 queues commit 08a535492952d863f860977f0ab0da4bc79bc931 Author: kate.friedman Date: Wed Mar 25 14:58:14 2020 +0000 Issue #21 - add phase 3.5 support - add partition option to setup scripts - remove machine if-blocks from config.base and add variable population to setup_expt*py scripts - add phase 3.5 ppn value to WCOSS_DELL_P3 env and config.resources files commit accb6f4b919871221e8037d5f154b157d88d1102 Merge: 057b2a82 8b51b56f Author: Kate Friedman Date: Mon Mar 23 14:01:30 2020 -0400 Merge pull request #39 from NOAA-EMC/feature/verif-tag Issue #38 - update EMC_verif-global pointer from VLab to GitHub commit 8b51b56f84289d1c01863b61421b21eccd22939c Author: kate.friedman Date: Mon Mar 23 17:40:18 2020 +0000 Issue #38 - update EMC_verif-global pointer from VLab to GitHub commit 057b2a82fa43f7bc36c3e757ca7d48fb86b9541c Merge: 0377d20f 622167d5 Author: Kate Friedman Date: Wed Mar 11 12:00:58 2020 -0400 Merge pull request #29 from NOAA-EMC/feature/manage_externals Issue #3 - Introduce manage_externals as replacement for checkout.sh commit 622167d5fb3322921a1702639ebccb42da1f5e1b Author: kate.friedman Date: Fri Mar 6 18:20:31 2020 +0000 Issue #3 - added explicit config flag example for checkout_externals in README and blurb about this replacing checkout.sh commit e83b90d50999f64ed5208d94a8cceb8179c9395f Author: kate.friedman Date: Fri Mar 6 17:00:15 2020 +0000 Issue #3 - remove prod_util and grib_util sections from build_all.sh, removed elsewhere already commit 8699b46aa1797e8dd29edff1d4bd3b511ad5cb1c Author: kate.friedman Date: Fri Mar 6 16:30:54 2020 +0000 Issue #3 - updated README with new manic version commit 
e602cd3d536b55b86b04285aedd5781d8d3a9f82 Author: kate.friedman Date: Fri Mar 6 16:27:57 2020 +0000 Issue #3 - updated link_fv3gfs.sh to adjust wafs links commit 830c73f430d70cc516dea419a8969c6fd9fc0910 Author: kate.friedman Date: Fri Mar 6 15:21:45 2020 +0000 Issue #3 - update EMC_verif-global tag in Externals.cfg after sync with develop commit 40084e67810d21366d0e9af6d9937c29aa4965ad Merge: f662fffa 0377d20f Author: kate.friedman Date: Fri Mar 6 15:18:52 2020 +0000 Issue #3 - Merge branch 'develop' into feature/manage_externals commit 0377d20f3d019f77a47fc9860d6146fd3c8e5d94 Merge: 1b359dbe 25524675 Author: Kate Friedman Date: Thu Mar 5 08:43:16 2020 -0500 Merge pull request #28 from NOAA-EMC/feature/metplus2 Issue #8 - add switch for MET+ jobs commit 25524675a63e59829655bbd9a09abc4dca246357 Author: kate.friedman Date: Thu Mar 5 13:31:02 2020 +0000 Issue #8 - add switch for MET+ jobs commit 1b359dbeb31b94382619dfc9c67e77fffe46aaa0 Merge: 0359d342 31bb7d32 Author: Kate Friedman Date: Wed Mar 4 10:19:36 2020 -0500 Merge pull request #26 from NOAA-EMC/feature/metplus Feature/metplus - refactored MET+ jobs to resolve timing issues commit 31bb7d32181ca84229c3c3374226bbd37784ddc4 Merge: eb73e520 0359d342 Author: Mallory Row Date: Wed Feb 19 15:24:42 2020 +0000 Merge branch 'develop' into feature/metplus commit f662fffa25a99617828e4322bf789978cf523248 Author: kate.friedman Date: Fri Feb 14 15:57:05 2020 +0000 Issue #3 - Updated README with new manic tag v1.1.7 commit e3196a84a0ecd2b54d59abfdc9184622a9c605ca Author: Kate Friedman Date: Thu Feb 13 15:59:06 2020 -0500 Update README.md commit e46b175d8a309010e421ccd54e6d6eb083af3579 Merge: 4bd0e203 0359d342 Author: Kate.Friedman Date: Thu Feb 13 20:38:04 2020 +0000 Issue #3 - sync merge with develop branch commit 0359d3425a8710e7b696b94456ec8e54e9a2fd9f Merge: 1d9a1f00 bd00cb98 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Thu Feb 13 09:53:59 2020 -0500 Merge pull request #19 from 
NOAA-EMC/feature/remove_theia Feature/remove theia commit eb73e520716215c3f11cc4cdfce3831408221766 Author: Mallory Row Date: Fri Feb 7 14:04:37 2020 +0000 Update EMC_verif-global checkout to verif_global_v1.5.0 commit bd00cb9812c5fb400ba4399d183b2198b8e80372 Author: Kate.Friedman Date: Fri Feb 7 13:41:05 2020 +0000 Issue #4 - bug fix in getic.sh for v15 commit 1c85197d7a1beb34f2e3a52969d631d42003e6eb Merge: 67dae409 1d9a1f00 Author: Kate.Friedman Date: Fri Feb 7 13:27:18 2020 +0000 Issue # 4 - Sync merge branch 'develop' into feature/remove_theia commit 1d9a1f00b73cb3852d352e9a41a15651a99fb656 Merge: 3ed9267b bdbecaa7 Author: Kate Friedman Date: Fri Feb 7 08:11:21 2020 -0500 Merge pull request #18 from lgannoaa/exception_handling Exception handling commit 4bd0e20300cc2a79e79433b2ec8cdb15c8f01c9e Author: Kate Friedman Date: Thu Feb 6 11:55:31 2020 -0500 Update README.md commit d9ea1acab54f65a987b32d56587dfd1b6bcd037c Author: Kate.Friedman Date: Thu Feb 6 16:03:11 2020 +0000 Issue #3 - reduce hashes down to minimum 8 characters commit bdbecaa7220f2462cc75e802570845809ebcfc75 Author: Lin.Gan Date: Wed Jan 29 15:15:52 2020 +0000 Display exception handling message for individual package with location of the log file commit b64fd5ff43f88bd5f2d27b0a16fb803eac8aff8c Merge: cf008631 3ed9267b Author: kate.friedman Date: Tue Jan 28 15:24:37 2020 +0000 Merge branch 'develop' into feature/manage_externals commit cf0086311daaf62ee33df010946d0a0ddc5bc400 Author: kate.friedman Date: Tue Jan 28 15:23:14 2020 +0000 Issue #3 - remove copy of manage_externals under util and add README.md file commit c12e87987113fa6f4b543bc0dae61d4259703c03 Author: Lin.Gan Date: Mon Jan 27 19:38:19 2020 +0000 Implement exception handling in build_all script commit bfc7bb0b237d4cf4240551aa8b9c5d025724ac16 Author: Kate.Friedman Date: Mon Jan 27 19:06:17 2020 +0000 Issue #3 - initial add of manage_externals and needed Externals.cfg. 
Also added .gitignore file and removed scripts/files associated with no-longer-used prod_util and grib_util builds. commit 4d5713d3983c6cc6a8e497e892760752e09f15a0 Author: Lin.Gan Date: Fri Jan 24 15:51:28 2020 +0000 Testing github commit commit 786806f3cd5d858615f3f74dec78891a2189ee79 Author: Mallory Row Date: Fri Jan 24 15:04:16 2020 +0000 Missed file format updates in a few places in config.metp commit d0a3b53c8117676c351c50287d4583951c94d42c Author: Lin.Gan Date: Fri Jan 24 14:30:29 2020 +0000 init commit for exception handling branch commit c11dfef0f7fb8da2866e6a022ac0ff60044766a7 Author: Mallory Row Date: Fri Jan 24 14:07:29 2020 +0000 Update EMC_verif-global tag to verif_global_v1.4.1 commit 0ea809c208ce606c957eed4d346a3828d8186010 Author: Mallory Row Date: Thu Jan 23 13:29:56 2020 +0000 Update file format variable in config.metp of online archive files commit c0d7179f34837e40db9ccb1b941ad9f312283a6e Author: Mallory Row Date: Tue Jan 21 16:37:16 2020 +0000 Add updated env machine files for gfsmetp commit 82e690717d72c7b021c637270108f4bacfb6816d Author: Mallory Row Date: Tue Jan 21 16:29:23 2020 +0000 Update config.resources for gfsmetp commit 72e8adf1c8a8859786cbbcf1b976640d73c5c867 Author: Mallory Row Date: Tue Jan 21 16:19:31 2020 +0000 Update EMC_verif-global tag checkout to 1.4.0 commit 6872f79f3f9052377ff863da1bbac482548ee0ce Author: Mallory Row Date: Tue Jan 21 16:14:25 2020 +0000 Add rocoto METplus job script commit 9c94156670bd810561bd2a699648afb946511ea9 Author: Mallory Row Date: Tue Jan 21 16:09:13 2020 +0000 Changes to setup_workflow.py for gfsmetp metatask commit 3ed9267b2f540694e957ee33a746f00857a5a1a2 Author: kate.friedman Date: Tue Jan 14 19:32:36 2020 +0000 Issue #10 - mid-year update to bufr station list (develop) commit 1915aa921dfe2ef17799599e5a3084547caa3ca2 Author: kate.friedman Date: Fri Jan 10 18:36:32 2020 +0000 Issue #8 - pulled in config.metp and modifications to two setup scripts commit 67dae40974485e7ffef4713209142301e5e4ba9e 
Merge: f78eb1b4 091f4ba1 Author: kate.friedman Date: Wed Jan 8 20:15:45 2020 +0000 Merge branch 'develop' into feature/remove_theia commit f78eb1b4228927a0937b5de2098ba2cece6a4aca Author: kate.friedman Date: Wed Jan 8 20:13:08 2020 +0000 Issue #4 - removed references to Theia and Theia scripts commit 091f4ba1d04f1600e352f2fe090ae9af0880c95d Author: kate.friedman Date: Wed Jan 8 19:37:45 2020 +0000 Issue #7 - missed update to gdas transfer file from GFSv15.2.5 updates commit 3fd4bcfa0bb5e133774b5ad64aaf45b42074c05c Author: kate.friedman Date: Tue Dec 17 19:12:28 2019 +0000 GitHub Issue #2 - GFSv15.2.6 obsproc version update, earc bug fix, and tracker path update commit 530795269fd678f7ee6c7354d3fe674b8c46a458 Author: Kate.Friedman Date: Wed Dec 11 20:47:44 2019 +0000 HOTFIX - VLab Issue #72346 - fix to rocoto_viewer on Hera commit def5de038ba0e2b09af6f9be20e4b21cc09cefa0 Author: kate.friedman Date: Mon Mar 30 19:11:59 2020 +0000 Issue #46 - update prod_envir to v1.1.0 throughout commit cbe3b00dc72a31c9f6c951333738b41b2682396a Author: kate.friedman Date: Mon Mar 30 17:51:46 2020 +0000 Issue #46 - updates to prod_util module on Cray and syndat paths throughout commit adcfca84a7a688a02acf5a885e88836a54ea1f52 Author: Hang-Lei-NOAA <44901908+Hang-Lei-NOAA@users.noreply.github.com> Date: Mon Mar 16 12:27:21 2020 -0400 Update checkout.sh gsi git add DA_GFS_v15.3 commit 7eb0e820580dd5381ced7f745b3c24e85ef6218b Merge: 8b177b687 9939d8b4a Author: Kate Friedman Date: Mon Mar 16 09:35:22 2020 -0400 Merge pull request #35 from NOAA-EMC/feature/operations_gda Issue #34 - update GDA DMPDIR paths and associated scripts. Issue #36 - Update DA tag in operations branch for v15.2.9. 
commit 9939d8b4aa123dc85f8277b6f7c37e3b1400f882 Author: kate.friedman Date: Mon Mar 16 13:11:08 2020 +0000 Issue #36 - update DA tag for v15.2.9 commit 338957e3a2fbb344c799caf2910af8dd5c47ab4e Author: kate.friedman Date: Fri Mar 13 17:11:04 2020 +0000 Issue #34 - update GDA DMPDIR paths and associated scripts commit 8b177b687ba0e263ba1fe08b2a44c5e398003c29 Author: kate.friedman Date: Fri Feb 28 15:37:15 2020 +0000 Issue #11 - update obsproc_global version to v3.2.5 commit 6397f5efe5aca1c071d0600f624fa5bdfe491d3c Author: Judy.K.Henderson Date: Wed Feb 5 19:26:37 2020 +0000 - added links for Thompson lookup tables - corrected setting of gPDY and gcyc in forecast_predet.sh script commit 4a8ac824b95fc500ccf85d0ca5b8b77796eade52 Author: kate.friedman Date: Fri Jan 17 21:04:04 2020 +0000 Issue #11 - update obsproc_global version to v3.2.4 commit 5815c4ddd07d82e62d713ef8f8fe7d4bdbbb6a4a Author: Judy.K.Henderson Date: Fri Jan 17 00:21:14 2020 +0000 updated forecast configuration commit c884ab9de4579abe085315490cfa68fdc0ea365b Author: Judy.K.Henderson Date: Thu Jan 16 00:33:09 2020 +0000 - new exglobal_fcst_nemsfv3gfs.sh script (merged from coupled_crow) - moved module commands from aeroic.sh to module_base.hera - copy namelist to output directory commit e39382806915083d9c17126ab585c8d58117d95d Author: kate.friedman Date: Tue Jan 14 19:21:16 2020 +0000 Issue #10 - mid-year update to bufr station list commit 2ab0ed07b053b8fabde216877cfef64f1ad2d9f5 Author: Judy.K.Henderson Date: Mon Jan 13 20:59:25 2020 +0000 - make default CCPP suite GSD_v0 - define convective options based on CCPP_SUITE commit 943cfdc07a10358d634648bead46b1caf5f652b5 Author: Judy.K.Henderson Date: Mon Jan 13 20:34:23 2020 +0000 - added changes for creating one config.fcst and config.base.emc.dyn file (for now, still need to define CCPP_SUITE in config.base) - made config.base.nco.static executable - removed fcst and base.emc.dyn files for v15 - made aeroic.sh a Korn shell script commit 
80e7ea69634d8122e1e6084371e1130dda4e90c7 Author: Judy.K.Henderson Date: Mon Jan 13 19:57:21 2020 +0000 corrected syntax for ttendlim variable commit 598ac7c68e7ea064d8edca9d0966e38f4ab6126d Author: Judy.K.Henderson Date: Mon Jan 13 19:28:42 2020 +0000 * add Lin Gan's scripts commit 76fa32f8895119805fbe670924bb850d8938b485 Author: kate.friedman Date: Wed Jan 8 19:40:44 2020 +0000 Issue #7 - update to EMC_post gtg tag for GFSv15.2.7 commit e6f52d73b3f68dac81117402ff1a8219a23ef287 Author: kate.friedman Date: Wed Jan 8 19:28:47 2020 +0000 Issue #7 - missed change from v15.2.5 to gdas transfer parm file commit b1dcb4fcaf68bfe1e491e3d79bba6fe830509746 Author: kate.friedman Date: Tue Dec 17 15:51:31 2019 +0000 GitHub Issue #2 - GFSv15.2.6 changes commit 04409166f0bb5f3f02d3d61adb2902581ddedddf Author: Judy.K.Henderson Date: Wed Dec 11 01:10:42 2019 +0000 - added aeroic task dependency for gfsfcst - added new python script without aeroic task - updated testemc.sh to use python script without aeroic task commit 1e64f8450a0c8d16775948ae58879fad0ae03bda Merge: 97e9a3bee e4b6b7d35 Author: Judy.K.Henderson Date: Tue Dec 10 23:33:32 2019 +0000 Merge branch 'develop' into gmtb_ccpp_hera modified: jobs/JGFS_CYCLONE_TRACKER modified: jobs/rocoto/getic.sh modified: parm/config/config.resources modified: parm/config/config.vrfy modified: ush/syndat_qctropcy.sh commit 97e9a3beee7a042a5176bf067e480a55c84961fd Author: Judy.K.Henderson Date: Tue Dec 10 23:03:50 2019 +0000 - adding changes to correct setting of ictype when generating FV3 ICs commit 5fcb12ae25a694090b67286bdb517474c27a8c07 Author: Judy.K.Henderson Date: Mon Dec 9 16:52:52 2019 +0000 -- modify checkout.sh to add new directory, sorc/aeroconv commit 6b5a136ed7e481233f33aec33dc79365d9b25acd Author: Judy.K.Henderson Date: Thu Dec 5 17:24:00 2019 +0000 add 'git submodule sync' command to checkout.sh for fv3gfs_ccpp.fd commit 2d555e0905da47f2be5b6aa471237def5dc75b5c Author: kate.friedman Date: Wed Dec 4 20:03:19 2019 +0000 VLab 
Issue #71995 - GFSv15.2.5 commit 83533877c5b6dc341879c9c25fd516ff07f92c90 Author: Judy.K.Henderson Date: Wed Dec 4 19:01:30 2019 +0000 - define satmedmf earlier in config.fcst for MYNN - add new experiment setup scripts commit 130d35445ffea6e323461441d7f2aee3fc19fc8b Author: Judy.K.Henderson Date: Wed Dec 4 18:51:58 2019 +0000 updates for running GFDL MP with CCPP commit de706fff189032c5dcfd6ce41761c53a0fc39169 Author: Judy.K.Henderson Date: Wed Dec 4 01:03:34 2019 +0000 for now, comment out top 2 lines in diag tables for GSD; otherwise, get invalid file format error commit 432d28a7c8b491bc0f5617725d0770af0fada9ad Author: Judy.K.Henderson Date: Wed Dec 4 00:47:58 2019 +0000 merge in changes from develop branch -- GFSv15.2.3 ncep_post tag update -- GFSv15.2.4 change commit ae98aab7cc233a2f95c72b6ceafbb9b6043fc44a Author: Judy.K.Henderson Date: Wed Dec 4 00:19:28 2019 +0000 set RUCLSM with other land surface model options commit 2c37fefa4dc309588d2c92bfe5ffeaaa58e19942 Author: Judy.K.Henderson Date: Tue Dec 3 23:38:42 2019 +0000 - updated aeroic.sh for hera - temporary fix for fv3ic.sh to create FV3 ICs using FV3GFS nemsio files - moved convective options to config.fcst* files - updated links for diag and field tables for GSD - add CCPP executable to link_fv3gfs.sh commit 8a3cbc88fe64e6a36f123eb3299ecb257c939e47 Author: Judy.K.Henderson Date: Tue Dec 3 01:05:30 2019 +0000 updated forecast configuration files commit c2de8c3c65c6a781e1703962e5863e8bae5c0e49 Author: Judy.K.Henderson Date: Mon Dec 2 23:27:07 2019 +0000 updated config.base.emc.dyn files commit 099497054749c805cd6831c5ee8db333d9cd4acb Author: Judy.K.Henderson Date: Mon Dec 2 23:07:45 2019 +0000 update build_fv3.sh for compiling CCPP version commit fea6dd857b707f00e98b320d056583169748105b Author: Judy.K.Henderson Date: Mon Dec 2 22:00:00 2019 +0000 add specific version for dtc/develop branch in checkout.sh commit 4327dd1d1a4aa4e265dee406e47ea13b4f71379c Author: Judy.K.Henderson Date: Mon Dec 2 
21:38:43 2019 +0000

    corrected directory name and added logfile in checkout.sh

commit 4e0e83bae82a2ba3d4962892cc5dd91308f837ac
Author: kate.friedman
Date:   Mon Dec 2 21:29:20 2019 +0000

    VLab Issue #71881 - GFSv15.2.4 changes

commit 538ed2178af8ce8ffd626eae8b3e2c15d577cde5
Author: Judy.K.Henderson
Date:   Mon Dec 2 21:04:53 2019 +0000

    update checkout.sh to use dtc/develop branch from NCAR:ufs-weather-model repository

commit 86109d8875c81d7f7a5c04743f644a3509a02cc7
Merge: 6746e2d5b 6afa503dd
Author: Judy.K.Henderson
Date:   Mon Dec 2 19:11:21 2019 +0000

    Merge branch 'port2hera' into gmtb_ccpp_hera

commit 90b50694567e02bf640899eec1f640da567c3de5
Author: kate.friedman
Date:   Mon Dec 2 18:09:14 2019 +0000

    VLab Issue #71878 - GFSv15.2.3 ncep_post_gtg tag update

commit 6afa503ddde3b4b8c4171d55ba26327ab727a219
Author: Kate.Friedman
Date:   Mon Dec 2 15:59:21 2019 +0000

    VLab Issue #67188 - fix to env/WCOSS_C.env

commit b1b9b80071cb7d14c1d3ae60066b5ae87e35fcea
Author: Kate.Friedman
Date:   Mon Dec 2 15:51:10 2019 +0000

    VLab Issue #67188 - updated ProdGSI revision and resource settings

commit 6746e2d5b0293bc54f0487aac7f9bff3d8736b95
Author: Judy.K.Henderson
Date:   Tue Nov 26 20:24:23 2019 +0000

    corrected version of FV3 to check out (nemsfv3_gfsv15.2.1)

commit 2af0cfea83b72338588a5ae426fc647497ad4090
Merge: cc8a5b97c d2f877f81
Author: Judy.K.Henderson
Date:   Tue Nov 26 19:04:57 2019 +0000

    Merge branch 'develop' into gmtb_ccpp
    d2f877f - Thu Nov 14 19:34:51 2019

commit cc8a5b97c3464a8a59c3b53b78a70b4fce06cff2
Author: Judy.K.Henderson
Date:   Tue Nov 26 18:36:17 2019 +0000

    - rename *GSDsuite files to original names (config.base.emc.dyn, config.fcst, exglobal_fcst_nemsfv3gfs.sh)
    - moved convective options from config.fcst to config.base
    - created separate _v15 files

commit bdb5df57b272e5c3805a17e024afeef640821668
Author: Judy.K.Henderson
Date:   Wed Nov 20 17:47:17 2019 +0000

    add link so fv3gfs.fd points to fv3gfs_ccpp.fd checkout

commit 8caefc93292b4a07f89a4d6771f744971c8b574c
Author: Judy.K.Henderson
Date:   Wed Nov 20 17:45:26 2019 +0000

    added CCPP-FV3 checkout to checkout.sh

commit 59d2bd90a77a1b802991316b284ea63b0ba1b714
Author: Kate.Friedman
Date:   Mon Nov 18 19:00:13 2019 +0000

    VLab Issue #67188 - updated GSI checkout revision and turned on wafs build for all machines

commit dd46a855df388ee3b9306df5921300b9c2710e3a
Merge: b295567ef d2f877f81
Author: Kate.Friedman
Date:   Mon Nov 18 15:01:09 2019 +0000

    VLab Issue #67188 - sync merge with develop branch after v15.2 commits

commit 8eff68539f782439b6dde8ce58c7e7e4845f41f1
Author: kate.friedman
Date:   Thu Nov 14 19:31:08 2019 +0000

    VLab Issue #71238 - GFSv15.2.2 changes

commit e82ed9af2d6f5c0ae95c02ff29009962a1077d66
Author: kate.friedman
Date:   Wed Nov 6 17:08:36 2019 +0000

    VLab Issue #66132 - GFS v15.2.1 changes

commit 193100e641a4ad80b8299e33a1dd39098414679f
Author: kate.friedman
Date:   Wed Aug 28 15:48:31 2019 +0000

    HOTFIX - VLab Issue #67884 - fix syntax error in workflow_utils.py for non-slurm QUEUE_ARCH firstcyc setting

commit b303bca1a935ffc744dd15c8e86e650711f3e0fe
Author: kate.friedman
Date:   Tue Aug 27 19:45:07 2019 +0000

    VLab Issue #67744 - GFSv15.1.4 nwprod changes

commit 584a241d387ede9537de5d96773e174a9c2f76d1
Author: Kate.Friedman
Date:   Mon Aug 12 19:27:56 2019 +0000

    Theia bug fixes - VLab Issue #66187

commit 7cd7634f7110c5f5197c77541094074b1e3b6e22
Author: kate.friedman
Date:   Fri Aug 9 13:00:26 2019 +0000

    HOTFIX - VLab Issue #67072 - fixing NEMSfv3gfs clone command

commit e1dca1bc2d6b1a278067a73c16f1a6e10761d465
Author: Judy.K.Henderson
Date:   Wed Aug 7 21:28:25 2019 +0000

    updated fcst configuration files

commit e2b1933dac989334aa2e9549a541b3648d911dc4
Author: Judy.K.Henderson
Date:   Tue Jul 23 18:30:58 2019 +0000

    - updated lsm=3 for RUC LSM in config.fcst_GSDsuite
    - changed FHCYCLE=0 for RUC LSM in config.base.emc.dyn_GSDsuite

commit dd34b433bf8e50bfdfcfe5d855ea368328613395
Author: kate.friedman
Date:   Tue Jul 23 15:26:49 2019 +0000

    VLab Issue #66290 - GFSv15.1.3 changes

commit
e5d8caf8b71da807c409eaa55b7f3fa00234ed33
Author: Judy.K.Henderson
Date:   Mon Jul 22 16:34:37 2019 +0000

    - add changes needed to run CCPP version of NEMSFV3GFS for GFDL MP or GSDSuite (GF, MYNN, RUC LSM, Thompson MP)
    -- change compilation options in build_fv3.sh (assumes CCPP suites suite_FV3_GFS_v15.xml and suite_FV3GFS_GSD_v0.xml)
    -- change executable name to global_fvgfs_ccpp.x
    -- add convective variables, imfdeepcnv and imfshalcnv, to config.base.emc.dyn
    -- add ccpp_suite variable to atmos_model_nml portion of namelist
    -- created new exglobal forecast scripts with namelist options
    -- define suite definition file (CCPP_SUITE) in config.fcst
    -- add new aeroic task to workflow for Thompson microphysics
    -- modify checkout.sh to add new directory, sorc/aeroconv (read README.md to extract files)
    -- change nstf_name in namelist from 2,0,0,0,0 to 2,1,1,0,5 (config.nsst settings)

commit 4ba0ff91557370a64b00c56e132a03e58f546459
Author: Judy.K.Henderson
Date:   Mon Jul 22 13:40:19 2019 +0000

    update to 19Jul2019 develop branch

    Squashed commit of the following:

commit f0c7afe3eb53bc9998d9bcc905f849ebe4b83549
Author: kate.friedman
Date:   Tue Jul 16 18:23:31 2019 +0000

    VLab Issue #66082 - GFSv15.1.2 nwprod changes

commit 1652d227586e07fd9b68869d875278d5fea3043e
Author: kate.friedman
Date:   Mon Jul 8 17:22:26 2019 +0000

    VLab Issue #65754 - merge final GFSv15.1.1 into master

commit 6ca55564f2c1d2fe1f9dbb08f7be34faae0e9cb9
Author: kate.friedman
Date:   Tue Jun 11 13:24:26 2019 +0000

    VLab Issue #64845 - update UFS_utils to v1.0.0 tag

commit ad03fe5a48273f1f2490c2df4c8dabc0f5ed9cbb
Author: kate.friedman
Date:   Mon Jun 10 16:00:47 2019 +0000

    VLab issue #64274 - missing ozinfo files in checkout.sh added

commit 18341fcfc2372efa821249d885187219d438049e
Author: Lin.Gan
Date:   Fri Jun 7 14:20:35 2019 +0000

    Patch missing lowres fix file, disable vrfy jobs from first half cycle and clean up extra files

commit d919db638ac4f83604648d571ef36e4798928561
Author: lin.gan
Date:   Wed May 22 14:33:59 2019 +0000

    VLab Issue #64141 Merge ufs_utils into master

commit 2acb5d3104359cd157bb7c14cf81c9187fd5a84a
Author: kate.friedman
Date:   Wed May 15 15:42:03 2019 +0000

    VLab Issue #58894 - SLURM updates for free-forecast mode

commit 475967bcfdc3698a5b0dde7c17166a4939760592
Author: Kate.Friedman
Date:   Fri May 10 19:36:13 2019 +0000

    VLab Issue #58894 - changes for SLURM on R&D machines

commit 8619f4794a277a40ffc227a5254d084870584cf8
Author: fanglin.yang
Date:   Tue Apr 23 19:29:54 2019 +0000

    update link_fv3gfs.sh to add correct directory paths for making soft links of post and wafs source directories

commit 839a8164143dc1a04bb1f47b2107f29ef0905a3e
Author: fanglin.yang
Date:   Mon Apr 22 21:45:54 2019 +0000

    script update per NCO's request; add track files to archive list

commit 8250561827dee4dea7a7612b1a8d7f582b3053ec
Merge: a1af4f8 0749708
Author: fanglin.yang
Date:   Wed Apr 17 00:01:27 2019 -0400

    Merge branch 'master' of gerrit:global-workflow

commit a1af4f88946e610a4cd92b268bf01c44e99c80f2
Author: fanglin.yang
Date:   Wed Apr 17 00:00:51 2019 -0400

    redefine SIGLEVEL in ush/global_chgres_driver.sh to allow it to be sent in from parent driver script

commit 0749708b6ffe33b5c1ebc75a178894e6c23bd512
Author: George Gayno
Date:   Tue Apr 16 20:01:29 2019 +0000

    VLab issue #40471. Add new parallel version of chgres based on ESMF regridding. Contains the same surface pressure adjustment, vertical interpolation and surface initialization as the serial version of chgres. Can ingest tiled FV3 data, FV3 nemsio data, GFS nemsio data, and GFS sigio/sfcio data.

    Squashed commit of the following:

commit c656e6279e2c5d715f3403f47a8be3398e3932a7
Merge: b400492 04f0e75
Author: George Gayno
Date:   Tue Apr 16 19:57:42 2019 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit b400492d4910154a60e73d681b920956502b6a2e
Author: George Gayno
Date:   Tue Apr 16 19:56:51 2019 +0000

    chgres_cube branch: This commit references #40471.
    Move chgres_cube program to the sorc directory under chgres_cube.fd.

    Change-Id: I49f02aca17e79d262fd353f66293f3ebe4644b8b

commit cc1c409c2aae9347546e15e1b40b0af1ed6ab61f
Author: George Gayno
Date:   Wed Apr 3 13:30:46 2019 +0000

    chgres_cube branch: This commit references #40471.
    Add threads to routine VINTG. Add OMP_STACKSIZE variable to Cray run script.

    Change-Id: Ibb3d10256d367c7a4520b0feeeba0e9d1a207141

commit 8fe3c48409c437be71f4fb4bd87e0b09f70c53b5
Author: George Gayno
Date:   Tue Apr 2 17:37:18 2019 +0000

    chgres_cube branch: This commit references #40471.
    Fix bug in search routine by checking original field (stored in field_save) instead of the adjusted field. Add sea ice default logic from original serial version of chgres. Both updates change results. Add threading to search routine.

    Change-Id: I46278241828a9020851530594d75e3362712353f

commit 0ba7dcff2c14128a0792c30a5585a3d5da0754f5
Author: George Gayno
Date:   Thu Mar 7 18:42:25 2019 +0000

    chgres_cube branch: This commit references #40471.
    Update Theia config files and run scripts for new paths.

    Change-Id: Ic244b4eabff9ac5c4ebe6e23f6e7845af65799e2

commit ccc1c1e541d27c4334f57ce731c92009b25fefb3
Author: George Gayno
Date:   Thu Mar 7 18:35:05 2019 +0000

    chgres_cube branch: This commit references #40471.
    Add new Theia config file for running with spectral gfs sigio/sfcio files.

    Change-Id: Ice67f73de40b82943ed6469f7a4f7f7477d669ff

commit 614113255549be1c9abfc6acc5e8e0cb04dd1201
Author: George Gayno
Date:   Thu Mar 7 14:40:12 2019 +0000

    chgres_cube branch: This commit references #40471.
    Updates for compiling and running on Cray. Remove compilation option for WCOSS phase 1/2. Machine will be retired this year.

    Change-Id: I7a9d2c0e5c6e6b63908798eda829c443f80c1648

commit 3e02203853376d1f0504ad618934b8c0adebc660
Author: George Gayno
Date:   Wed Mar 6 21:31:08 2019 +0000

    chgres_cube branch: This commit references #40471.
    Update paths in Dell config files.

    Change-Id: I6b82400ec53cdce2602c08a59059848b9adc8dd4

commit 595919780c34ae1f0798b62131be37cab26204ea
Author: George Gayno
Date:   Wed Mar 6 16:33:01 2019 +0000

    chgres_cube branch: This commit references #40471.
    Numerous updates so program can ingest/process spectral gfs data in the 'old' sigio/sfcio format. Correct units error when reading snow depth from fv3 tiled warm restart files and fv3 tiled history files. Program expects snow depth in millimeters.

    Change-Id: I332468857c20e48c645b038469db3848fafd3a37

commit 778af6b5eae7c728dd4598736bf88ac43f98cddd
Author: George Gayno
Date:   Wed Feb 20 13:26:21 2019 +0000

    Revert "chgres_cube for warm restart"

    This reverts commit 00c978c1c119850ec7de6d4dcc3426567717ecea.

commit 2f920bd440216524adc211b1850784b6ab0293ee
Merge: 257f8aa 00c978c
Author: Scott
Date:   Tue Feb 19 20:51:52 2019 +0000

    Merge branch 'chgres_cube_warm_restart' into chgres_cube

commit 00c978c1c119850ec7de6d4dcc3426567717ecea
Author: Scott
Date:   Tue Feb 19 20:49:39 2019 +0000

    chgres_cube for warm restart

commit 257f8aa7e6386f860db5d74697fbd9daf0e2e67f
Author: George Gayno
Date:   Wed Feb 13 15:56:14 2019 +0000

    chgres_cube branch: This commit references #40471.
    When using fv3 global gaussian nemsio files, convert snow depth to millimeters and roughness length to centimeters to be consistent with the fv3 tiled history and restart files.

    Change-Id: I721f8fd38f0dbe528883af1cd3d62808345930d0

commit d8256f42759b509a05e685cb81dfdba73eb0d939
Author: George Gayno
Date:   Wed Feb 13 15:09:58 2019 +0000

    chgres_cube branch: This commit references #40471.
    New option to process spectral gfs gaussian nemsio files.

    Change-Id: Ie387df4daa40c770d6adbd2fcc46e8ada3889d49

commit 65e5949113e79c9c875ffcce3d9b8e6f3038dc26
Author: George Gayno
Date:   Mon Feb 11 18:51:47 2019 +0000

    chgres_cube branch: This commit references #40471.
    Add script for running on Theia using slurm.
    Change-Id: I66ed45f8389cd4887db763317186c773aaba95e4

commit 448d6bd1b9378bfd9d01e59db6df38913d1ab1c9
Author: George Gayno
Date:   Fri Feb 8 20:02:43 2019 +0000

    chgres_cube branch: This commit references #40471.
    static_data.F90 - Updates for new gridgen_sfc file naming convention.

    Change-Id: I386f9a0f7ae2bc443634420b64abd8a7017d1b27

commit 0d76a9d7c1c3dfda78c5e4269407dab08a0ec697
Merge: ba27618 cc32f99
Author: George Gayno
Date:   Fri Jan 25 19:15:35 2019 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit ba276186b9113cb0e3adbe71475300c4c1738d9a
Author: George Gayno
Date:   Fri Nov 16 22:08:02 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update theia config files for tracers.

    Change-Id: Ibcafbc82dc4c6fff58b56cfa163d101abd66cdb5

commit bc7b201febc3613f0c4e5c394f54117147b0f768
Author: George Gayno
Date:   Fri Nov 16 21:14:25 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update Dell config files to include tracers.

    Change-Id: Ie30bb81dd10811e16d39e9e919b1c32590e20b8e

commit 3375b682269240d05ede7c568d12ac5e13d2d4a3
Author: George Gayno
Date:   Fri Nov 16 19:39:47 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add namelist option for users to select which tracers to process. Update cray config files accordingly.

    Change-Id: Ibeffdb44ab550ab3f5514bca2153a35bb5dc9051

commit d9d4408e4949871de57a4b956312b55c64c1afd7
Author: George Gayno
Date:   Fri Nov 16 18:10:56 2018 +0000

    chgres_cube branch: This commit references #40471.
    Numerous updates so the program can process a user-selected set of tracers at run time.

    Change-Id: I8679b5a327f457fb745927dfc7ca11926a2258ce

commit 9b9c92d87aec005c23d22a1b416cf86d3bd9d297
Author: George Gayno
Date:   Tue Oct 30 17:43:16 2018 +0000

    chgres_cube branch: This commit references #40471.
    Minor fix to two error messages.

    Change-Id: I423986d9ab7f62c4da79850d3352c5f0b2190fb2

commit 0b889ad6328ef0beb7a83feb57563e8e9856b895
Author: George Gayno
Date:   Tue Oct 30 12:26:30 2018 +0000

    chgres_cube branch: This commit references #40471.
    Remove hardwired atm/surface file names in routine "define_input_grid_gaussian". Improve error handling in that routine.

    Change-Id: I18d894792a44ae4b101891d7155a397b8f5959cd

commit 092bc27ffdd37562ccd91372cfaff91c79f5794a
Author: George Gayno
Date:   Thu Oct 25 15:00:19 2018 +0000

    chgres_cube branch: This commit references #40471.
    Horizontally interpolate surface pressure assuming a standard lapse rate (per recommendation of Phil Pegion). This reduces model initialization shock near steep terrain.

    Change-Id: I9748acde579605a76fc387124c11c31c7dfa4474

commit eebdf594c1792330fde03239d068accd72576df6
Author: George Gayno
Date:   Tue Oct 9 20:16:20 2018 +0000

    chgres_cube branch: This commit references #40471.
    Standardize error handling.

    Change-Id: I861798e99b3d1c7c116e184dc04e3fc782de8347

commit d07ec3468703143b6ca1880cec8b9d78dfe34dba
Author: George Gayno
Date:   Fri Oct 5 17:14:15 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add a basic prolog to the top of each module.

    Change-Id: I47741d78197c2f8f8153ef6256235bae896981b6

commit 780029c6a6d0d3ec628e6d85cceb480840b2968b
Author: George Gayno
Date:   Thu Oct 4 16:33:46 2018 +0000

    chgres_cube branch: This commit references #40471.
    Cleanup of variable declarations. General cleanup.

    Change-Id: I978c4d6162aa1eaad7efd8523518bc05f394e690

commit 9dbeb5a4bf20d7580717b41bf17d875ff27dd18c
Author: George Gayno
Date:   Wed Oct 3 13:48:42 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add global attribute to atmospheric and boundary files that specifies which input data source was used. Update theia config files to explicitly define all input source files.
    Change-Id: Ifdcc176db26c7070a1ff550d2aed78b2fdde096e

commit 9626a74d9dfe55a9df514eff83121a028963df3f
Author: George Gayno
Date:   Tue Oct 2 18:11:48 2018 +0000

    chgres_cube branch: This commit references #40471.
    Remove hard-wired sfc/nsst input file names. Update config files for cray accordingly.

    Change-Id: I5bbac6304f375635d9e5ff5620a883917c06eb5c

commit bfd4c4e4e04aa19b5d6eb1a163231def60c26e65
Author: George Gayno
Date:   Tue Oct 2 16:56:03 2018 +0000

    chgres_cube branch: This commit references #40471.
    Remove hard-wired input grid atmospheric file names. Update Dell config files accordingly.

    Change-Id: I6f6cbec751acea32d516e0a67f4dafe97cd99c2c

commit 1727ae775e2da4c3de1a1fff2a453dcd30e4273d
Author: George Gayno
Date:   Fri Sep 28 19:28:52 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add comments to input_data.F90.

    Change-Id: Ic22435948bcdca68840aad989c3b942948880525

commit 3eb8afaf720af694793e3fb25eb062afb7da6017
Author: George Gayno
Date:   Fri Sep 28 13:57:20 2018 +0000

    chgres_cube branch: This commit references #40471.
    Updates for compiling and running on Cray. Point to beta v8.0.0 of the esmf library.

    Change-Id: I2ec308278481c9fa9758af3db8d7ef031b779644

commit 58e0e214e4be5d8d40a3cc6a3e813fe80f8f4aeb
Author: George Gayno
Date:   Thu Sep 27 19:45:32 2018 +0000

    chgres_cube branch: This commit references #40471.
    Updates for compiling on Dell using a local copy of beta v8.0.0 of esmf.

    Change-Id: I21c367827f5a8788db8de635ea283f536a3fbdd0

commit 0631e3830b65854229fac8dccc148b13279cbac8
Author: George Gayno
Date:   Thu Sep 27 15:06:19 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update Theia build to use esmf beta v8.0.0. That version corrects a problem with FieldScatter and FieldGather for large array sizes (such as 3-d t1534 fields). Add extrapolation method NEAREST_STOD to the atmospheric call to RegridStore. That eliminates problems with unmapped points encountered for certain grid configurations (Ex: T1534 gaussian to C1152). New routine "read_input_atm_gaussian_file" for reading fv3gfs gaussian history files in nemsio format. Remove obsolete namelist entries from all run config files.

    Change-Id: Ibf12ec0102e621cf94761548075f982d765c5f6e

commit 07160a501362413ca83698bedcba97c6e9a642bd
Author: George Gayno
Date:   Tue Sep 25 17:33:24 2018 +0000

    chgres_cube branch: This commit references #40471.
    Replace logical 'restart_file' - which determined whether the program was to ingest tiled history or restart files - with 'input_type'. The latter is set as 'history'/'restart'/'gaussian' for tiled history/tiled restart/gaussian history files. Add routine (read_input_nst_gausian_file) to read nst data from gaussian history file to input_data.F90.

    Change-Id: I577ba3f319d2c0ea1d99598cf55cc8147bcadb97

commit 7cf64d52594e8e30338cd1f8bae424b81289125a
Author: George Gayno
Date:   Mon Sep 24 13:32:47 2018 +0000

    chgres_cube branch: This commit references #40471.
    Updates for running on Cray.

    Change-Id: Ib4e42021fd31619c4d60064cf0320cfdbd587414

commit 68be11f25c06eab4eb70f0573bddb67a8601b0fa
Author: George Gayno
Date:   Mon Sep 24 12:40:57 2018 +0000

    chgres_cube branch: This commit references #40471.
    Move read of terrain from model_grid.F90 to input_data.F90. This allows the program to use the terrain from the atmospheric files when processing the atmospheric fields. This terrain should be more consistent with the atm fields than what is in the orog files. This change will also improve flexibility as the number of input data sources increases. For example, the gaussian data has no orog files, so terrain must be read in from the restart files.

    Change-Id: Iff532b85096ec48afd3ca7e1658ab66c86737dc5

commit a3c5fc0edabe9d02ce3b69d22c136bb71ca44e0f
Author: George Gayno
Date:   Fri Sep 21 13:23:31 2018 +0000

    chgres_cube branch: This commit references #40471.
    input_data.F90 - Place 2-d to 3-d wind conversion in its own routine.
    Change-Id: Iad7c8460fbcbc9e9d4cbbd2c5529331c9604ebda

commit d8b8c675e1e9ab386eb7cc5058e2f7e6f368304b
Author: George Gayno
Date:   Fri Sep 21 12:16:01 2018 +0000

    chgres_cube branch: This commit references #40471.
    Change variable name 'levp_input' to 'levp1_input'.

    Change-Id: Iaa91b691d8747b6906ecd10ddb0f61d40855d5e7

commit d9382ae01c4b0289047c6390a1c234d6ef8fb7e1
Author: George Gayno
Date:   Tue Sep 18 17:51:43 2018 +0000

    chgres_cube branch: This commit references #40471.
    Logic to process surface fields from an fv3 gaussian nemsio file.

    Change-Id: I486960805c2c8dd4020438aba11ed8504fac3739

commit 253834853c3c5e0e76257c9402135bfe0de64c9f
Author: George Gayno
Date:   Fri Sep 14 19:31:02 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add some logic to ingest fv3gfs gaussian nemsio files. Currently, only tiled netcdf files may be ingested.

    Change-Id: Ibc8606fcb63f9c955e636e0200bc8b634155b559

commit 754e2f8161b60a0b7c8bea1f3c58a07afbbcc65d
Author: George Gayno
Date:   Fri Sep 7 20:40:48 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add script and sample configuration files for running on Dell.

    Change-Id: I6999f9a9ad43b4d62d72e181c431e85dbd48899c

commit fa7d51a9795f0e85ece97655434233b2f3433ac0
Author: George Gayno
Date:   Fri Sep 7 14:48:24 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update regional boundary condition logic to include a blending halo located within the computational grid. Halo indices are now defined with respect to the computational grid instead of the whole grid (computational plus lateral halo).

    Change-Id: I4429659172403c135c16df530e592a80e7912eab

commit 94d1f288ab2994f851862338e06625c96aaa915a
Author: George Gayno
Date:   Tue Sep 4 13:48:36 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update atmospheric write routine to write each tile on its own mpi task. Previously, all tiles were written sequentially on task 0.

    Change-Id: Ica45f3320970c105f2cda42f1becd83a311f784e

commit d18199b2bd07a22976fd04181cda179990a5bebd
Author: George Gayno
Date:   Fri Aug 31 18:42:02 2018 +0000

    chgres_cube branch: This commit references #40471.
    Remove all 'goto' statements.

    Change-Id: I3a66a5254df2d3e5f3cb6bf12f44d96ead3b9f96

commit 17863752b96a772ee5cd6bfd9c96b904a77f95f5
Author: George Gayno
Date:   Fri Aug 31 14:40:24 2018 +0000

    chgres_cube branch: This commit references #40471.
    Read input history file tiles in parallel instead of sequentially.

    Change-Id: Ifa29876014fe516249c788d3f2f81f543ffda171

commit 9cba10d151de587b6b63644f77bf74b073cb16c8
Author: George Gayno
Date:   Thu Aug 30 20:22:06 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update routine "read_input_atm_restart_data" to read each tile on its own mpi task (or pet). Previously, each file was read sequentially on task 0. Tests on theia showed results do not change, and wall clock time is reduced.

    Change-Id: I30899db560be58586871dc886283f0766957bef3

commit 4a59accf411686b26128d93e34ad63d9892facff
Author: George Gayno
Date:   Wed Aug 29 20:06:44 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add config files for running the following transforms on Theia: (1) C768 L64 to C768 L91; (2) C768 L64 to C1152 L91. Both transforms ran successfully. Remove some diagnostic print.

    Change-Id: I698d92a016010ad9bc01171ecfe015802275b2ee

commit fd3736a88cf1913ce6f9896f734be819f354d9f3
Author: George Gayno
Date:   Wed Aug 15 15:27:16 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add option to read in a weight file for part of the atmospheric interpolation. If the weight file is not available, then FieldRegridStore is called as before. Using the weight file reduces wall clock time slightly and does not change results.

    Change-Id: I956bb094315f723840c62a7ee3dd2faeadba70c1

commit f3bb2e3ed2d77e53c6da98c6241c62491316d88f
Author: George Gayno
Date:   Tue Aug 14 18:39:15 2018 +0000

    chgres_cube branch: This commit references #40471.
    Remove extra calls to FieldRegridStore. Results do not change. Wall clock time reduced slightly.

    Change-Id: Icc93054d844752c5299bdc64c4c117c291b86a09

commit fbb566f9143d1b3a686f351d95352d1de4a9489e
Author: George Gayno
Date:   Mon Aug 6 20:28:20 2018 +0000

    chgres_cube branch: This commit references #40471.
    Remove subroutine 'flip' and replace with F90 statements to reverse arrays in the 'z' direction. The latter is much faster.

    Change-Id: I1a405cc48c7e1981c908e64c77e0d65e0e875caa

commit 963246d6061c4dd49001cc226e9df082931b5037
Author: George Gayno
Date:   Fri Aug 3 20:12:02 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add two new tests for Theia.

    Change-Id: I2fbb5a33ca236bf289eb643b0d361b1fe46b3419

commit 339b732caad6aa3c19fc0c533799d90978b3b16c
Author: George Gayno
Date:   Fri Jun 22 19:20:19 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update 'make.sh' to build on WCOSS Phase 1/2.

    Change-Id: I9f6e0fd0154c2874be61af85a8e6b16946f70704

commit cbbb9644a5ef56acecd2cfc03e805414e5e67035
Author: George Gayno
Date:   Wed Jun 20 20:14:32 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add 3-d temperature and delta-p to target grid atmospheric netcdf file.

    Change-Id: I505df46a097c772b2cb0c792e18a0db66f9d0d20

commit dcaaba32b355e93720331fe5c0a3ee17d2c64d60
Merge: d868f97 eb1a299
Author: George Gayno
Date:   Fri Jun 15 13:36:09 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit d868f9737108d583451971bd7f7ac66d9792244a
Author: George Gayno
Date:   Fri Jun 15 13:32:07 2018 +0000

    chgres_cube branch: This commit references #40471.
    When ingesting 'restart' files, get model top pressure from the "fv_core.res.nc" file.

    Change-Id: Ib9856ceb7743ce8c9be6692128e4616c1c7d4e76

commit bce294fc9b507cdaacdda89af652fda786746d49
Author: George Gayno
Date:   Thu Jun 14 17:59:41 2018 +0000

    chgres_cube branch: This commit references #40471.
    New namelist variable 'restart_file' to control whether input files are 'restart' or 'history' files.

    Change-Id: I969651711fa83c038574093447a7b8d4c05c41ba

commit 73ddc22b528362edc7b5f5a05de94301bc00e9ba
Author: George Gayno
Date:   Wed Jun 13 21:27:21 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add logic to process GFDL microphysics tracers.

    Change-Id: Iad9e15cc6477ecfc01120400826c0cf6e660a1ce

commit 2a82e038a21fc6de82189362133d92a329ad101f
Author: George
Date:   Mon Jun 11 19:09:45 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update "make.sh" for building on Dell.

    Change-Id: Iba4915467e315ebc83d4fed7e838391681a84a5f

commit 9f9d9531ae37ca81608892b50d390d5c3513ce1d
Author: George
Date:   Fri Jun 8 20:22:51 2018 +0000

    chgres_cube branch: This commit references #40471.
    Preliminary modifications to read input atmospheric fields from 'restart' files. Currently, program only ingests atmospheric 'history' files.

    Change-Id: Iac0d3cf7672d108653ddb9da620b8cf68b361877

commit 94658f59fbb7d9d21bf773d1f8cd0a71dc485e92
Author: George
Date:   Thu Jun 7 20:57:44 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update to ingest tiled "restart" surface files. Previously, only tiled "history" files could be ingested.
    1) New routine to read tiled restart surface files - "read_input_sfc_restart_data"
    2) Rename existing read routine to "read_input_sfc_history_data".
    3) Add logic to determine if surface file is a 'restart' or 'history' file. Logic checks for 'xaxis_1' in the header. If it exists, a 'restart' file is assumed.

    Change-Id: I0ab138d165d7f2646440acc07ee5691c726a1e85

commit 822ccffc37ee4f81398ebfdcfbbad8e39d946149
Author: George Gayno
Date:   Mon May 21 18:09:17 2018 +0000

    chgres_cube branch: This commit references #40471.
    New routine "write_fv3_atm_bndy_data_netcdf" that outputs an atmospheric lateral boundary file. Supports stand-alone regional grids.
    Change-Id: Ic6a63a4d915ba7fd870e9c151ff274f90df0a059

commit d11d7cd2d534a3ea2756928a00423d7ca5c68236
Author: George
Date:   Wed May 16 20:47:04 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add "extrapMethod" argument to atmospheric regridding. This prevents a random glitch when interpolating winds from the center of the grid box to the box edges. Add halo removal logic for atmospheric file. Fix bug in halo removal logic for surface file.

    Change-Id: Idbee39ded541db7cb11dadc4b488759a736ce7b4

commit 325b729346b4303d2c2d142d6685ae5cb2e5b167
Merge: 3e46722 cebba5c
Author: George
Date:   Fri May 11 14:21:49 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit 3e467223e7ce3d23b3e0a8a8f47d5e233fa3c69a
Merge: 00921a5 ac03c87
Author: George Gayno
Date:   Thu May 3 19:55:50 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit 00921a50e7c7dffe3bc0194e1a9b93fcbf87cde5
Author: George Gayno
Date:   Mon Apr 30 17:33:17 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update to use v7.1 of ESMF instead of a beta snapshot.

    modified: chgres_cube/sorc/make.sh

    Change-Id: I0fc15198c3abb2c28ee758df8dc37bd3c008924c

commit 930b87fd892009906db444e44fefd602248d883e
Author: George Gayno
Date:   Tue Apr 24 13:55:34 2018 +0000

    chgres_cube branch: This commit references #40471.
    New routine "write_fv3_atm_header_netcdf" to write the "gfs_ctrl.nc" header file, which contains tracer and vcoord information. Modify to read input surface data from tiled model history files.

    Change-Id: I39f0d573c2cae3b17f4758973d7bf002dcd7afbf

commit 937997b65ee77b09ac69151db9388f0c107d38b7
Merge: ca61546 0e9d2d1
Author: George Gayno
Date:   Fri Apr 20 17:36:18 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit ca61546139daa5e51a409fa6a538503c10dd5742
Author: George Gayno
Date:   Fri Apr 13 13:32:40 2018 +0000

    chgres_cube branch: This commit references #40471.
    New routine 'vintg' for vertically interpolating between the input and target hyb-sigma levels. This routine is taken from the GFS CHGRES code, which assumes the lowest model level is at index '1'. This is opposite the fv3 convention. Therefore, add a vertical 'flip' of the fv3 data after the read and before the write of the atmospheric files. New routine 'compute_zh' to compute heights. To save memory, reduce the number of 3-d arrays during write of atmospheric file.

    Change-Id: Ia256b6d348d277119ab1e23da6d00f3653dd8eeb

commit 6c95bf7246b74edef319814a13f3667b1cdcb6d7
Author: George
Date:   Mon Apr 9 20:52:51 2018 +0000

    chgres_cube branch: This commit references #40471.
    New routine 'newpr1' which computes 3-d pressure based on 'ak' and 'bk'. Add esmf fields to hold data on target grid before vertical interpolation (denoted by 'b4adj' in the variable name).

    Change-Id: Ic9c9175f268f124c0d63498a9e771fe679456782

commit 4c5b40a11daf499c55f9000224c08dc3589d7d98
Merge: ebee537 6482117
Author: George
Date:   Mon Apr 9 17:36:57 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit ebee53765fe252dec7517d332771ef7a847670cf
Author: George
Date:   Mon Apr 9 17:22:27 2018 +0000

    chgres_cube branch: This commit references #40471.
    Several updates to the atmospheric data processing:
    1) Add horizontal interpolation of 3-d temperature.
    2) Read input data from the tiled model history files instead of the coldstart files from CHGRES. These files store the winds unstaggered (at the center of the grid box).
    3) Add read of target grid 'vcoord' file.
    4) New routine 'newps' to adjust surface pressure to new terrain.
    Change-Id: I653dbc741f785b42ce6fba51333530dd873bd9ef

commit 181f712adb0310f098b1f2c1ce2d6b870fd81a0b
Author: George
Date:   Wed Apr 4 14:02:30 2018 +0000

    chgres_cube branch: This commit references #40471.
    Add read of vertical coordinate file to get 'ak' and 'bk' for the target grid.

    Change-Id: I3ba9fbc8e5205e3c5ef6a04da955d52094197ce9

commit 25c8b572cd74aae0d826ed805ce75b47acad95e8
Merge: 7e00515 7b3afc8
Author: George
Date:   Tue Apr 3 12:16:15 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit 7e00515e2e1069d491f359c1eee9a63f84000732
Merge: c2ff052 8725547
Author: George
Date:   Tue Mar 27 19:48:54 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit c2ff0525532b1ed078c4e6bb258dd46f85e40ff6
Merge: 8fd71e9 0c7e545
Author: George Gayno
Date:   Thu Mar 22 14:20:13 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit 8fd71e907923b2df457d8cb80550e4a0ced1ea2d
Author: George Gayno
Date:   Tue Mar 20 18:50:13 2018 +0000

    chgres_cube branch: This commit references #40471.
    Update theia build to point to official esmf version 7.1.0. Add "module use" statement for locating NCEPLIBS.

    Change-Id: If06900baa3c70b9713a3d15b25d2a89fd569d5a8

commit 5a61108655e8f253fe1cddabc3cc9dd92f2fcf04
Merge: f944749 7fdcd04
Author: George
Date:   Mon Mar 19 18:18:58 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit f944749861cc6ecfcb8439c38525f42184d2eda1
Merge: a47764d 3169078
Author: George Gayno
Date:   Mon Mar 5 13:45:28 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit a47764d197d896b3aced87a390571ac7b9e3bb9a
Merge: 944f618 ecf67b6
Author: George Gayno
Date:   Thu Feb 22 18:24:50 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit 944f6180e7500e6238685e8ae51d4d1a68c9f919
Merge: 73c0c43 03b9b56
Author: George
Date:   Tue Feb 20 13:56:03 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit 73c0c436bd431dd2b7bb88e87046d42e69bb1d80
Merge: f697860 6b2de36
Author: George
Date:   Mon Feb 12 13:36:35 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

commit f697860ab83ff8bcec2383dbe8d4cee86de1b84c
Author: George Gayno
Date:   Thu Feb 8 21:46:24 2018 +0000

    chgres_cube branch: This commit references #40471.
    Minor script changes related to recent master merge.

    Change-Id: Ibd0d61e3d7da3be6675195d0d8c3b4def38f70d4

commit 5024394887414c4f65e6921419ac8e1550108be7
Merge: 72778cb 2b9a059
Author: George Gayno
Date:   Thu Feb 8 21:33:29 2018 +0000

    chgres_cube branch: This commit references #40471.
    Merge branch 'master' into chgres_cube

    Change-Id: Iadb058b872c1d578acf9895a1321923a8bd71c73

commit 72778cb00dac9f97cfdb5a6085025c80cccaee91
Author: George Gayno
Date:   Mon Jan 29 19:32:42 2018 +0000

    chgres_cube branch: This commit references #40471.
    To ensure bit identical results for varying task counts, the following modifications were done: (1) The argument "isrctermprocessing=1" was added to all calls to ESMF_FieldRegridStore; (2) The argument "termorderflag=ESMF_TERMORDER_SRCSEQ" was added to all calls to ESMF_FieldRegrid.

    Change-Id: I354cbd94b5c9e63a8e5635dfb2a2a32fe91426e3

commit e263d0e6d5fce3f24cd0f15ae849be412fff161f
Author: George Gayno
Date:   Mon Jan 8 21:04:00 2018 +0000

    chgres_cube branch: This commit references #40471.
    Move all interpolations of surface fields into their own routine ("interp").

    Change-Id: I1c1ffd972566092e17f9850bbfdfce2b3965abe2

commit 574bb6f72998099b7aacc8a15220de2637ba727b
Author: George Gayno
Date:   Mon Jan 8 19:33:52 2018 +0000

    chgres_cube branch: This commit references #40471.
Move processing of nst fields to surface.F90 to ensure consistency between TREF and SST. This was a problem with the OPS version of CHGRES (see issue #44638). Remove the now-obsolete routine nst.F90. Change-Id: Ifb7bf631ef75cdf2af8eb0ff3f77e236996224db commit bdb63dd83f7bf23fe198650fc84cc631428d7d6f Author: George Gayno Date: Thu Jan 4 19:46:18 2018 +0000 chgres_cube branch: This commit references #40471. Change default value of SST to be the same latitude-dependent guess as that used for TREF. Change-Id: If2d0f1f13d4f7bb318c11003eb062c53609f1d92 commit 735e35ba6efcafe94db47540416b64578d13bb97 Author: George Gayno Date: Thu Jan 4 14:38:19 2018 +0000 chgres_cube branch: This commit references #40471. Simplify driver. General cleanup. Change-Id: I1b227f5f909ec42e4a2a664ccd324df0a2a19da8 commit 8f7e672cdcd40795afa721dbd7f6347ac876ab1f Merge: 9743273 3073c50 Author: George Gayno Date: Wed Jan 3 17:29:16 2018 +0000 chgres_cube branch: This commit references #40471. Merge branch 'master' into chgres_cube commit 9743273c09dfb4b4d8931fb0583198d57274dd23 Author: George Gayno Date: Wed Jan 3 17:18:43 2018 +0000 chgres_cube branch: This commit references issue #40471. Add interpolation of u/v winds (on the staggered grid). The method is based on the vector interpolation in the model's write component (by Jun Wang): (1) convert from 2-d cartesian components to 3-d. (2) Horizontally interpolate the 3-d components to the target grid. (3) Convert from the 3-d components back to 2-d. Change-Id: If5110c8c20db8c5ad110f74a9f219e97799705f8 commit 1d90bf06dde893c5f28873fb016f874dc9dfc7e0 Merge: 04e2f1e 7bf492a Author: George Gayno Date: Thu Dec 21 19:16:33 2017 +0000 chgres_cube branch: This commit references #40471. Merge branch 'master' into chgres_cube commit 04e2f1ec01dd032fd60893d54ba280759737521f Author: George Gayno Date: Thu Dec 21 19:12:01 2017 +0000 chgres_cube branch: This commit references #40471. Add read of latitude and longitude at the 's' edge and store as ESMF fields.
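The three-step vector interpolation described in commit 9743273 (convert 2-d wind components to 3-d Cartesian, horizontally interpolate each Cartesian component, convert back) can be sketched as follows. The function names and the radians convention for lat/lon are assumptions for illustration, not the actual chgres_cube code; the projection is exactly invertible at a given point, which is the property that makes the round trip safe:

```python
import numpy as np

def sph_to_cart(u, v, lat, lon):
    """Project eastward (u) / northward (v) wind components onto fixed 3-d
    Cartesian axes so they can be interpolated like scalars. lat/lon in radians."""
    cx = -u * np.sin(lon) - v * np.sin(lat) * np.cos(lon)
    cy =  u * np.cos(lon) - v * np.sin(lat) * np.sin(lon)
    cz =  v * np.cos(lat)
    return cx, cy, cz

def cart_to_sph(cx, cy, cz, lat, lon):
    """Recover eastward/northward components from the (interpolated)
    Cartesian components at the target-grid lat/lon."""
    u = -cx * np.sin(lon) + cy * np.cos(lon)
    v = (-cx * np.sin(lat) * np.cos(lon)
         - cy * np.sin(lat) * np.sin(lon)
         + cz * np.cos(lat))
    return u, v
```

In the real workflow the horizontal interpolation of cx, cy, cz happens between the two calls; at a single point the two conversions simply invert each other.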
Lat/lon is required to interpolate the staggered winds. Change-Id: I1f5d2a26522547a8887942b386175f65c6de9767 commit df04cc2fa792e0ed0b271e4bd395a400d80f3993 Author: George Gayno Date: Wed Dec 20 20:19:32 2017 +0000 chgres_cube branch: This commit references #40471. Rename routine "get_model_mask" to "get_model_mask_terrain" and remove read of lat/lons from the orography file. New routine "get_model_latlons" to read lat/lon from the 'grid' files. The 'grid' files have lat/lon on the staggered grid, which will be needed when interpolating winds. The orography files only have lat/lon on the 'A' grid. Change-Id: If291bb2f0b1e2294b52e7775347f756cdf68c6c5 commit e76e7f236a520428e35c2d9f79b18dea2fd45fe5 Author: George Gayno Date: Wed Dec 20 14:55:19 2017 +0000 chgres_cube branch: This commit references #40471. Switch from ESMF 7.1.0bs39 to 7.1.0bs44 (Theia). The latest version corrects a bug in GridCreateMosaic for stand-alone nests. Change-Id: I50b271be7fbfabc14a51f4cc848308c3d8143d70 commit 56de1e7b5c857da69a63fe9473f6e3a4f01e35a4 Author: George Gayno Date: Fri Dec 15 18:41:50 2017 +0000 chgres_cube branch: This commit references #40471. Replace all remaining F77 netcdf functions with their F90 counterparts. Change-Id: Ieaea4b5485079b44d3c49e49556b2e2eba56b7f6 commit 23ece3b7d201be3c017d890576bc65c452e5810a Author: George Gayno Date: Fri Dec 15 16:02:32 2017 +0000 chgres_cube branch: This commit references #40471. Point (on Theia) to an updated ESMF v7.1.0bs39 library (was recompiled with -precise flag). Update read/write of atmospheric fields to use F90 versions of NetCDF functions. Change-Id: Iccd7b96a91b7d4d498847d00fec7679420852f87 commit 0eb38c145d8935b644ac014e65432a8e93f0f4fc Author: George Gayno Date: Thu Dec 14 21:43:47 2017 +0000 chgres_cube branch: This commit references #40471. Update Cray build to use ESMF v7.1.0bs39.
Change-Id: I5cc221fb2be80ddfc4e55cae183b64f671b97553 commit 7f6369774f00955df5543c8a61212f59a52c0373 Author: George Gayno Date: Thu Dec 14 21:16:02 2017 +0000 chgres_cube branch: This commit references #40471. Update ESMF error handling per Gerhard's suggestion. Change-Id: I29b882c178f6595f8d371d6bddb8b080a00f88d7 commit bb2bccc4832080366bf49d67a1dbaa63881bf189 Author: George Gayno Date: Wed Dec 13 21:00:39 2017 +0000 chgres_cube branch: This commit references #40471. Update to use v7.1.0bs39 of the ESMF library, which is currently only available on Theia. Change-Id: I4be45d0dc3e5c9646c6ff52d57b27639665c4b93 commit 1976a12346150a8a6093105089820637f4f5037c Merge: e362248 8585699 Author: George Gayno Date: Wed Dec 13 17:23:00 2017 +0000 chgres_cube branch: This commit references #40471. Merge branch 'master' into chgres_cube commit e362248b338582f8c6f75e8ffacf328738a0a30c Author: George Gayno Date: Mon Nov 6 15:16:09 2017 +0000 This commit references #40471 Initial commit of the CHGRES cube-to-cube program to the chgres_cube branch. This initial version only processes surface and NSST fields. There are some hooks for processing the atmospheric fields. Change-Id: I33fa9e96637817dbe36e180e98b3b0c5e55b6945 commit 04f0e75ea80a4b9f49480d8d6fc979bc72a2899e Author: fanglin.yang Date: Fri Apr 12 20:18:54 2019 +0000 remove sourcing ~/.bash_profile in run_gfsmos_master.sh.dell. Otherwise the program crashes on Dell commit 1c4de6bbd89e3fb9180f868471b26fd4975f50dc Author: fanglin.yang Date: Mon Apr 8 01:59:58 2019 +0000 for Issue#62220, update earc.sh to not overwrite enkfgdas_grp for the first cycle commit 71af83f6f0e0a5d577161e324be4db6d770a4a21 Author: fanglin.yang Date: Wed Apr 3 19:38:51 2019 +0000 Merge GFS.v15.1.0 implementation branch q2fy19_nco to the master * Update model tag to nemsfv3gfs_beta_v1.0.18 to add restart capability of running a GFS long forecast from the end point or a failing point of the last attempt.
restart_interval_gfs is used to control the frequency of writing out restart ICs, which are saved under ROTDIR for EMC parallels and NWGES for NCO production. The exglobal_fcst_nemsfv3gfs.sh script has been modified to automatically detect whether the model should execute as a cold start, a warm start, or a rerun. If it is a rerun, the script will look for saved ICs that are restart_interval_gfs hours back from the last ending point. * Correct a bug in precip units in the computation of the frozen precipitation flag (srflag). * Write fields that are continuously accumulated during model integration into restart files so that after a restart their accumulated values can be read in. (FV3 Issue #61788) * Use 8x24 instead of 12x16 layout in config.fv3 for C768 * Address restart I/O issues #60879 * Update gsi tag to fv3da.v1.0.43 (Q3FY19 GDAS observation upgrade). * Merge NCO's changes to q2fy19_nco branch and then to the master. * Update GFS.v15 release notes. * Replace current ecflow/def files with NCO's copies. Changes to be committed: renamed: docs/Release_Notes.gfs.v15.0.0.txt -> docs/Release_Notes.gfs.v15.1.0.txt renamed: docs/Release_Notes.gfs_downstream.v15.0.0.txt -> docs/Release_Notes.gfs_downstream.v15.1.0.txt renamed: ecflow/ecf/defs/prod00.def -> ecflow/ecf/defs/para00_gfs_FV3.def renamed: ecflow/ecf/defs/prod06.def -> ecflow/ecf/defs/para06_gfs_FV3.def renamed: ecflow/ecf/defs/prod12.def -> ecflow/ecf/defs/para12_gfs_FV3.def renamed: ecflow/ecf/defs/prod18.def -> ecflow/ecf/defs/para18_gfs_FV3.def new file: ecflow/ecf/defs/para00_gdas_FV3.def new file: ecflow/ecf/defs/para06_gdas_FV3.def new file: ecflow/ecf/defs/para12_gdas_FV3.def new file: ecflow/ecf/defs/para18_gdas_FV3.def modified: driver/product/run_JGFS_AWIPS_G2_dell.sh_00 modified: gempak/ush/gfs_meta_ak.sh modified: gempak/ush/gfs_meta_us.sh modified: jobs/JGFS_AWIPS_G2 modified: jobs/JGLOBAL_FORECAST modified: parm/config/config.base.nco.static modified: parm/config/config.fcst modified: parm/config/config.fv3
modified: parm/config/config.resources modified: scripts/exglobal_fcst_nemsfv3gfs.sh modified: sorc/checkout.sh commit 193f8837907742eac835b235c534b3275eb15fd4 Author: George Gayno Date: Tue Mar 12 20:32:47 2019 +0000 Vlab issue #59733. Add interface block for routine "write_fv3_data_netcdf" to the chgres.f90 driver. An interface block is required when optional arguments are used. This routine has one: NSST_OUTPUT. Squashed commit of the following: commit 43af677678427218bfc689450e2a8e893cb81cb2 Merge: 09406ee5 595d44c9 Author: George Gayno Date: Tue Mar 12 18:25:09 2019 +0000 chgres_wrterr branch: This commit references #59733. Merge branch 'master' into chgres_wrterr commit 09406ee5a6be3eb2e9b96e84f460b0b5b743f9bb Author: George Gayno Date: Tue Feb 5 21:39:25 2019 +0000 chgres_wrterr branch: This commit references #59733. Add interface block for routine "write_fv3_data_netcdf" to the chgres.f90 driver. Change-Id: Ic30af24b2c2704f1915ab93eeffbe0716a462e73 Change-Id: I19516cd1ad77aa62b3b311745ac63be139947fc2 commit 01382fd9a92ada3574e9c1ea12dfe546adcdca36 Author: kate.friedman Date: Tue Jul 16 18:02:01 2019 +0000 VLab Issues #66082 - GFSv15.1.2 nwprod changes commit 111043640ad8977a192abdc7d1163f039db85391 Author: kate.friedman Date: Fri Jun 28 18:19:39 2019 +0000 Final nwprod/gfs.v15.1.1 changes commit afb70c44180be65eed1ac99a503af56ca1a2118e Author: fanglin.yang Date: Thu May 16 15:57:43 2019 +0000 update checkout.sh to point to gfs_wafs.v5.0.9, in which parm/wafs/wafs_gcip_gfs.cfg was updated to use both GOES-15 and GOES-17, whichever is available, not exclusively one or the other.
GOES-17 will go live on 20190602 commit f0880d93d30f890f9455c03c48521579765566eb Author: fanglin.yang Date: Fri May 3 20:46:19 2019 +0000 bring a few minor changes NCO made in gfs.v15.1.0.1 back to branch q2fy19_nco commit 6117ccaecf82f2d3fec39e6a04f627e7cc425789 Author: fanglin.yang Date: Tue Apr 23 19:25:31 2019 +0000 update link_fv3gfs.sh to include correct path for soft links of post and wafs source code commit b4c5f62a461c899222718acb9b54b27db9b465c7 Author: fanglin.yang Date: Mon Apr 22 20:58:55 2019 +0000 per NCO's request, add symbolic links under ./sorc to individual source programs of fregrid and wafs. Also update the path of RSTDIR in JGLOBAL_FORECAST commit b70a9ec3284fd4a3a5531a8d735c24259150c39f Author: Judy K. Henderson Date: Tue Mar 5 15:31:49 2019 -0700 * create branch for GMTB from 05Mar2019 global-workflow master, 595d44c9 * add changes needed to run CCPP version of NEMSFV3GFS -- add convective variables, imfdeepcnv and imfshalcnv, to config.base.emc.dyn -- define suite definition file (CCPP_SUITE) in config.fcst -- copy suite definition file to run directory -- add ccpp_suite variable to atmos_model_nml portion of namelist NOTE: this assumes sorc/fv3gfs.fd is pointing to the CCPP version of NEMSFV3GFS --- .github/ISSUE_TEMPLATE/feature_request.md | 40 + .github/ISSUE_TEMPLATE/fix_file.md | 24 + .github/ISSUE_TEMPLATE/production_update.md | 31 + .github/scripts/build_docs.sh | 31 + .github/workflows/docs.yaml | 51 + .github/workflows/linters.yaml | 64 + .github/workflows/pynorms.yaml | 24 + .github/workflows/pytests.yaml | 36 + .gitignore | 116 +- .pycodestyle | 6 + .shellcheckrc | 16 + Externals.cfg | 79 +- LICENSE.md | 157 + README.md | 74 +- ci/cases/C96C48_hybatmDA.yaml | 15 + ci/cases/C96_atm3DVar.yaml | 14 + ci/platforms/hera.sh | 7 + ci/platforms/orion.sh | 11 + ci/scripts/check_ci.sh | 115 + ci/scripts/clone-build_ci.sh | 122 + ci/scripts/create_experiment.py | 108 + ci/scripts/driver.sh | 136 + ci/scripts/pygw | 1 + ci/scripts/run_ci.sh 
| 71 + docs/Makefile | 25 + docs/make.bat | 35 + docs/note_fixfield.txt | 2 + docs/requirements.txt | 2 + docs/source/_static/GFS_v16_flowchart.png | Bin 0 -> 121683 bytes docs/source/_static/custom.css | 19 + docs/source/_static/fv3_rocoto_view.png | Bin 0 -> 161304 bytes docs/source/_static/theme_overrides.css | 9 + docs/source/clone.rst | 153 + docs/source/components.rst | 106 + docs/source/conf.py | 111 + docs/source/configure.rst | 59 + docs/source/development.rst | 198 + docs/source/errors_faq.rst | 45 + docs/source/hpc.rst | 125 + docs/source/index.rst | 39 + docs/source/init.rst | 573 + docs/source/jobs.rst | 89 + docs/source/monitor_rocoto.rst | 136 + docs/source/output.rst | 20 + docs/source/run.rst | 16 + docs/source/setup.rst | 301 + docs/source/start.rst | 48 + docs/source/view.rst | 46 + .../analysis/recenter/jenkfgdas_sfc.ecf | 2 +- .../jgfs_atmos_wafs_blending_0p25.ecf | 3 +- .../grib2_wafs/jgfs_atmos_wafs_grib2.ecf | 5 +- .../grib2_wafs/jgfs_atmos_wafs_grib2_0p25.ecf | 5 +- .../post_processing/jgfs_atmos_wafs_gcip.ecf | 1 + env/CONTAINER.env | 38 + env/HERA.env | 319 +- env/JET.env | 300 +- env/ORION.env | 317 +- env/S4.env | 271 + env/WCOSS2.env | 308 + env/gfs.ver | 22 - gempak/ush/gempak_gfs_f00_gif.sh | 2 +- jobs/JGDAS_ATMOS_ANALYSIS_DIAG | 119 +- jobs/JGDAS_ATMOS_CHGRES_FORENKF | 96 +- jobs/JGDAS_ATMOS_GEMPAK | 95 +- jobs/JGDAS_ATMOS_GEMPAK_META_NCDC | 79 +- jobs/JGDAS_ATMOS_GLDAS | 85 + jobs/JGDAS_ATMOS_VERFOZN | 86 + jobs/JGDAS_ATMOS_VERFRAD | 97 + jobs/JGDAS_ATMOS_VMINMON | 74 + jobs/JGDAS_ENKF_ARCHIVE | 44 + jobs/JGDAS_ENKF_DIAG | 169 +- jobs/JGDAS_ENKF_ECEN | 118 +- jobs/JGDAS_ENKF_FCST | 102 +- jobs/JGDAS_ENKF_POST | 73 +- jobs/JGDAS_ENKF_SELECT_OBS | 189 +- jobs/JGDAS_ENKF_SFC | 118 +- jobs/JGDAS_ENKF_UPDATE | 95 +- jobs/JGDAS_FIT2OBS | 88 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT | 45 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY | 44 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT | 58 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST | 47 + 
jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP | 59 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN | 39 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY | 53 + jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG | 65 +- jobs/JGFS_ATMOS_AWIPS_G2 | 66 +- jobs/JGFS_ATMOS_CYCLONE_GENESIS | 109 +- jobs/JGFS_ATMOS_CYCLONE_TRACKER | 120 +- jobs/JGFS_ATMOS_FBWIND | 61 +- jobs/JGFS_ATMOS_FSU_GENESIS | 99 +- jobs/JGFS_ATMOS_GEMPAK | 174 +- jobs/JGFS_ATMOS_GEMPAK_META | 71 +- jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF | 69 +- jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC | 79 +- jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS | 102 +- jobs/JGFS_ATMOS_POSTSND | 93 +- jobs/JGFS_ATMOS_VMINMON | 73 + jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE | 56 + jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE | 49 + jobs/JGLOBAL_AERO_ANALYSIS_RUN | 35 + jobs/JGLOBAL_ARCHIVE | 52 + jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE | 48 + jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE | 44 + jobs/JGLOBAL_ATMENS_ANALYSIS_RUN | 35 + jobs/JGLOBAL_ATMOS_ANALYSIS | 169 +- jobs/JGLOBAL_ATMOS_ANALYSIS_CALC | 140 +- jobs/JGLOBAL_ATMOS_EMCSFC_SFC_PREP | 74 +- jobs/JGLOBAL_ATMOS_POST | 122 + jobs/JGLOBAL_ATMOS_POST_MANAGER | 82 +- jobs/JGLOBAL_ATMOS_SFCANL | 110 +- jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC | 92 +- jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE | 58 + jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE | 55 + jobs/JGLOBAL_ATM_ANALYSIS_RUN | 37 + jobs/JGLOBAL_FORECAST | 167 +- jobs/JGLOBAL_LAND_ANALYSIS_FINALIZE | 50 + jobs/JGLOBAL_LAND_ANALYSIS_INITIALIZE | 43 + jobs/JGLOBAL_LAND_ANALYSIS_RUN | 44 + jobs/JGLOBAL_WAVE_GEMPAK | 46 +- jobs/JGLOBAL_WAVE_INIT | 75 +- jobs/JGLOBAL_WAVE_POST_BNDPNT | 84 +- jobs/JGLOBAL_WAVE_POST_BNDPNTBLL | 81 +- jobs/JGLOBAL_WAVE_POST_PNT | 79 +- jobs/JGLOBAL_WAVE_POST_SBS | 84 +- jobs/JGLOBAL_WAVE_PRDGEN_BULLS | 47 +- jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED | 47 +- jobs/JGLOBAL_WAVE_PREP | 94 +- jobs/rocoto/aeroanlfinal.sh | 23 + jobs/rocoto/aeroanlinit.sh | 24 + jobs/rocoto/aeroanlrun.sh | 24 + jobs/rocoto/anal.sh | 11 +- jobs/rocoto/analcalc.sh | 13 +- jobs/rocoto/analdiag.sh | 13 +- jobs/rocoto/arch.sh | 407 
+- jobs/rocoto/atmanlfinal.sh | 23 + jobs/rocoto/atmanlinit.sh | 24 + jobs/rocoto/atmanlrun.sh | 24 + jobs/rocoto/atmensanlfinal.sh | 23 + jobs/rocoto/atmensanlinit.sh | 24 + jobs/rocoto/atmensanlrun.sh | 24 + jobs/rocoto/awips.sh | 139 +- jobs/rocoto/coupled_ic.sh | 156 +- jobs/rocoto/earc.sh | 227 +- jobs/rocoto/ecen.sh | 23 +- jobs/rocoto/echgres.sh | 11 +- jobs/rocoto/ediag.sh | 13 +- jobs/rocoto/efcs.sh | 30 +- jobs/rocoto/eobs.sh | 13 +- jobs/rocoto/eomg.sh | 17 - jobs/rocoto/epos.sh | 27 +- jobs/rocoto/esfc.sh | 13 +- jobs/rocoto/eupd.sh | 13 +- jobs/rocoto/fcst.sh | 48 +- jobs/rocoto/fit2obs.sh | 23 + jobs/rocoto/gempak.sh | 72 +- jobs/rocoto/getic.sh | 91 +- jobs/rocoto/gldas.sh | 3 + jobs/rocoto/landanlfinal.sh | 23 + jobs/rocoto/landanlinit.sh | 24 + jobs/rocoto/landanlrun.sh | 24 + jobs/rocoto/metp.sh | 64 +- jobs/rocoto/ocnanalbmat.sh | 19 + jobs/rocoto/ocnanalchkpt.sh | 18 + jobs/rocoto/ocnanalpost.sh | 18 + jobs/rocoto/ocnanalprep.sh | 19 + jobs/rocoto/ocnanalrun.sh | 18 + jobs/rocoto/ocnanalvrfy.sh | 19 + jobs/rocoto/ocnpost.sh | 203 +- jobs/rocoto/post.sh | 24 +- jobs/rocoto/postsnd.sh | 12 +- jobs/rocoto/prep.sh | 132 +- jobs/rocoto/sfcanl.sh | 13 +- jobs/rocoto/wafs.sh | 45 +- jobs/rocoto/wafsblending.sh | 35 +- jobs/rocoto/wafsblending0p25.sh | 35 +- jobs/rocoto/wafsgcip.sh | 40 +- jobs/rocoto/wafsgrib2.sh | 35 +- jobs/rocoto/wafsgrib20p25.sh | 34 +- jobs/rocoto/waveawipsbulls.sh | 29 +- jobs/rocoto/waveawipsgridded.sh | 33 +- jobs/rocoto/wavegempak.sh | 25 +- jobs/rocoto/waveinit.sh | 13 +- jobs/rocoto/wavepostbndpnt.sh | 13 +- jobs/rocoto/wavepostbndpntbll.sh | 13 +- jobs/rocoto/wavepostpnt.sh | 13 +- jobs/rocoto/wavepostsbs.sh | 18 +- jobs/rocoto/waveprep.sh | 13 +- modulefiles/module-setup.csh.inc | 15 +- modulefiles/module-setup.sh.inc | 14 +- modulefiles/module_base.hera.lua | 32 +- modulefiles/module_base.jet.lua | 59 +- modulefiles/module_base.orion.lua | 34 +- modulefiles/module_base.s4.lua | 37 + modulefiles/module_base.wcoss2.lua | 40 
+ modulefiles/module_gwci.hera.lua | 15 + modulefiles/module_gwci.orion.lua | 21 + modulefiles/module_gwsetup.hera.lua | 13 + modulefiles/module_gwsetup.orion.lua | 17 + parm/config/config.aero | 9 + parm/config/config.aeroanl | 24 + parm/config/config.aeroanlfinal | 10 + parm/config/config.aeroanlinit | 10 + parm/config/config.aeroanlrun | 11 + parm/config/config.aerosol_init | 2 +- parm/config/config.anal | 249 +- parm/config/config.analcalc | 4 + parm/config/config.analdiag | 0 parm/config/config.arch | 0 parm/config/config.atmanl | 24 + parm/config/config.atmanlfinal | 10 + parm/config/config.atmanlinit | 10 + parm/config/config.atmanlrun | 11 + parm/config/config.atmensanl | 22 + parm/config/config.atmensanlfinal | 10 + parm/config/config.atmensanlinit | 10 + parm/config/config.atmensanlrun | 11 + parm/config/config.awips | 0 parm/config/config.base.nco.static | 30 +- parm/config/config.com | 92 + parm/config/config.coupled_ic | 39 +- parm/config/config.defaults.s2sw | 24 +- parm/config/config.earc | 0 parm/config/config.ecen | 0 parm/config/config.echgres | 0 parm/config/config.ediag | 0 parm/config/config.efcs | 52 +- parm/config/config.eobs | 0 parm/config/config.epos | 0 parm/config/config.esfc | 0 parm/config/config.eupd | 0 parm/config/config.fcst | 264 +- parm/config/config.fit2obs | 23 + parm/config/config.fv3 | 193 - parm/config/config.fv3.nco.static | 31 +- parm/config/config.gempak | 2 - parm/config/config.getic | 0 parm/config/config.gldas | 2 +- parm/config/config.ice | 5 +- parm/config/config.init | 1 + parm/config/config.landanl | 23 + parm/config/config.landanlfinal | 10 + parm/config/config.landanlinit | 10 + parm/config/config.landanlrun | 11 + parm/config/config.metp | 30 +- parm/config/config.nsst | 0 parm/config/config.ocn | 30 +- parm/config/config.ocnanal | 32 + parm/config/config.ocnanalbmat | 11 + parm/config/config.ocnanalchkpt | 11 + parm/config/config.ocnanalpost | 10 + parm/config/config.ocnanalprep | 10 + 
parm/config/config.ocnanalrun | 11 + parm/config/config.ocnanalvrfy | 10 + parm/config/config.ocnpost | 0 parm/config/config.post | 19 +- parm/config/config.postsnd | 0 parm/config/config.prep | 9 +- parm/config/config.prepbufr | 19 - parm/config/config.resources | 1007 +- parm/config/config.resources.nco.static | 354 + parm/config/config.ufs | 370 + parm/config/config.vrfy | 117 +- parm/config/config.wafs | 0 parm/config/config.wafsblending | 0 parm/config/config.wafsblending0p25 | 0 parm/config/config.wafsgcip | 0 parm/config/config.wafsgrib2 | 0 parm/config/config.wafsgrib20p25 | 0 parm/config/config.wave | 6 +- parm/config/config.waveawipsbulls | 3 - parm/config/config.waveawipsgridded | 3 - parm/config/config.wavegempak | 3 - parm/config/config.waveinit | 0 parm/config/config.wavepostbndpnt | 3 - parm/config/config.wavepostbndpntbll | 0 parm/config/config.wavepostpnt | 0 parm/config/config.wavepostsbs | 0 parm/config/config.waveprep | 0 parm/config/yaml/defaults.yaml | 19 + parm/mom6/MOM_input_template_025 | 36 +- parm/mom6/MOM_input_template_050 | 38 +- parm/mom6/MOM_input_template_100 | 34 +- parm/mom6/MOM_input_template_500 | 541 + parm/parm_fv3diag/diag_table | 84 + parm/parm_fv3diag/diag_table_da | 11 + parm/parm_fv3diag/diag_table_history | 89 - parm/parm_fv3diag/field_table_gfdl_progsigma | 42 + .../field_table_gfdl_satmedmf_progsigma | 47 + .../field_table_thompson_noaero_tke_progsigma | 70 + parm/parm_fv3diag/field_table_wsm6_progsigma | 38 + .../field_table_wsm6_satmedmf_progsigma | 43 + .../field_table_zhaocarr_progsigma | 21 + .../field_table_zhaocarr_satmedmf_progsigma | 26 + parm/parm_gdas/aero_crtm_coeff.yaml | 13 + parm/parm_gdas/aero_jedi_fix.yaml | 11 + parm/parm_gdas/aeroanl_inc_vars.yaml | 1 + parm/parm_gdas/atm_crtm_coeff.yaml | 178 + parm/parm_gdas/atm_jedi_fix.yaml | 7 + parm/parm_gdas/atmanl_inc_vars.yaml | 1 + parm/ufs/fix/gfs/atmos.fixed_files.yaml | 85 + parm/ufs/fix/gfs/land.fixed_files.yaml | 58 + 
parm/ufs/fix/gfs/ocean.fixed_files.yaml | 10 + parm/wave/bull_awips_gfswave | 496 + parm/wave/grib2_gfswave.ao_9km.f000 | 16 + parm/wave/grib2_gfswave.ao_9km.f003 | 16 + parm/wave/grib2_gfswave.ao_9km.f006 | 16 + parm/wave/grib2_gfswave.ao_9km.f009 | 16 + parm/wave/grib2_gfswave.ao_9km.f012 | 16 + parm/wave/grib2_gfswave.ao_9km.f015 | 16 + parm/wave/grib2_gfswave.ao_9km.f018 | 16 + parm/wave/grib2_gfswave.ao_9km.f021 | 16 + parm/wave/grib2_gfswave.ao_9km.f024 | 16 + parm/wave/grib2_gfswave.ao_9km.f027 | 16 + parm/wave/grib2_gfswave.ao_9km.f030 | 16 + parm/wave/grib2_gfswave.ao_9km.f033 | 16 + parm/wave/grib2_gfswave.ao_9km.f036 | 16 + parm/wave/grib2_gfswave.ao_9km.f039 | 16 + parm/wave/grib2_gfswave.ao_9km.f042 | 16 + parm/wave/grib2_gfswave.ao_9km.f045 | 16 + parm/wave/grib2_gfswave.ao_9km.f048 | 16 + parm/wave/grib2_gfswave.ao_9km.f051 | 16 + parm/wave/grib2_gfswave.ao_9km.f054 | 16 + parm/wave/grib2_gfswave.ao_9km.f057 | 16 + parm/wave/grib2_gfswave.ao_9km.f060 | 16 + parm/wave/grib2_gfswave.ao_9km.f063 | 16 + parm/wave/grib2_gfswave.ao_9km.f066 | 16 + parm/wave/grib2_gfswave.ao_9km.f069 | 16 + parm/wave/grib2_gfswave.ao_9km.f072 | 16 + parm/wave/grib2_gfswave.ao_9km.f078 | 16 + parm/wave/grib2_gfswave.ao_9km.f084 | 16 + parm/wave/grib2_gfswave.ao_9km.f090 | 16 + parm/wave/grib2_gfswave.ao_9km.f096 | 16 + parm/wave/grib2_gfswave.ao_9km.f102 | 16 + parm/wave/grib2_gfswave.ao_9km.f108 | 16 + parm/wave/grib2_gfswave.ao_9km.f114 | 16 + parm/wave/grib2_gfswave.ao_9km.f120 | 16 + parm/wave/grib2_gfswave.ao_9km.f126 | 16 + parm/wave/grib2_gfswave.ao_9km.f132 | 16 + parm/wave/grib2_gfswave.ao_9km.f138 | 16 + parm/wave/grib2_gfswave.ao_9km.f144 | 16 + parm/wave/grib2_gfswave.ao_9km.f150 | 16 + parm/wave/grib2_gfswave.ao_9km.f156 | 16 + parm/wave/grib2_gfswave.ao_9km.f162 | 16 + parm/wave/grib2_gfswave.ao_9km.f168 | 16 + parm/wave/grib2_gfswave.ao_9km.f174 | 16 + parm/wave/grib2_gfswave.ao_9km.f180 | 16 + parm/wave/grib2_gfswave.at_10m.f000 | 16 + 
parm/wave/grib2_gfswave.at_10m.f003 | 16 + parm/wave/grib2_gfswave.at_10m.f006 | 16 + parm/wave/grib2_gfswave.at_10m.f009 | 16 + parm/wave/grib2_gfswave.at_10m.f012 | 16 + parm/wave/grib2_gfswave.at_10m.f015 | 16 + parm/wave/grib2_gfswave.at_10m.f018 | 16 + parm/wave/grib2_gfswave.at_10m.f021 | 16 + parm/wave/grib2_gfswave.at_10m.f024 | 16 + parm/wave/grib2_gfswave.at_10m.f027 | 16 + parm/wave/grib2_gfswave.at_10m.f030 | 16 + parm/wave/grib2_gfswave.at_10m.f033 | 16 + parm/wave/grib2_gfswave.at_10m.f036 | 16 + parm/wave/grib2_gfswave.at_10m.f039 | 16 + parm/wave/grib2_gfswave.at_10m.f042 | 16 + parm/wave/grib2_gfswave.at_10m.f045 | 16 + parm/wave/grib2_gfswave.at_10m.f048 | 16 + parm/wave/grib2_gfswave.at_10m.f051 | 16 + parm/wave/grib2_gfswave.at_10m.f054 | 16 + parm/wave/grib2_gfswave.at_10m.f057 | 16 + parm/wave/grib2_gfswave.at_10m.f060 | 16 + parm/wave/grib2_gfswave.at_10m.f063 | 16 + parm/wave/grib2_gfswave.at_10m.f066 | 16 + parm/wave/grib2_gfswave.at_10m.f069 | 16 + parm/wave/grib2_gfswave.at_10m.f072 | 16 + parm/wave/grib2_gfswave.at_10m.f078 | 16 + parm/wave/grib2_gfswave.at_10m.f084 | 16 + parm/wave/grib2_gfswave.at_10m.f090 | 16 + parm/wave/grib2_gfswave.at_10m.f096 | 16 + parm/wave/grib2_gfswave.at_10m.f102 | 16 + parm/wave/grib2_gfswave.at_10m.f108 | 16 + parm/wave/grib2_gfswave.at_10m.f114 | 16 + parm/wave/grib2_gfswave.at_10m.f120 | 16 + parm/wave/grib2_gfswave.at_10m.f126 | 16 + parm/wave/grib2_gfswave.at_10m.f132 | 16 + parm/wave/grib2_gfswave.at_10m.f138 | 16 + parm/wave/grib2_gfswave.at_10m.f144 | 16 + parm/wave/grib2_gfswave.at_10m.f150 | 16 + parm/wave/grib2_gfswave.at_10m.f156 | 16 + parm/wave/grib2_gfswave.at_10m.f162 | 16 + parm/wave/grib2_gfswave.at_10m.f168 | 16 + parm/wave/grib2_gfswave.at_10m.f174 | 16 + parm/wave/grib2_gfswave.at_10m.f180 | 16 + parm/wave/grib2_gfswave.ep_10m.f000 | 16 + parm/wave/grib2_gfswave.ep_10m.f003 | 16 + parm/wave/grib2_gfswave.ep_10m.f006 | 16 + parm/wave/grib2_gfswave.ep_10m.f009 | 16 + 
parm/wave/grib2_gfswave.ep_10m.f012 | 16 + parm/wave/grib2_gfswave.ep_10m.f015 | 16 + parm/wave/grib2_gfswave.ep_10m.f018 | 16 + parm/wave/grib2_gfswave.ep_10m.f021 | 16 + parm/wave/grib2_gfswave.ep_10m.f024 | 16 + parm/wave/grib2_gfswave.ep_10m.f027 | 16 + parm/wave/grib2_gfswave.ep_10m.f030 | 16 + parm/wave/grib2_gfswave.ep_10m.f033 | 16 + parm/wave/grib2_gfswave.ep_10m.f036 | 16 + parm/wave/grib2_gfswave.ep_10m.f039 | 16 + parm/wave/grib2_gfswave.ep_10m.f042 | 16 + parm/wave/grib2_gfswave.ep_10m.f045 | 16 + parm/wave/grib2_gfswave.ep_10m.f048 | 16 + parm/wave/grib2_gfswave.ep_10m.f051 | 16 + parm/wave/grib2_gfswave.ep_10m.f054 | 16 + parm/wave/grib2_gfswave.ep_10m.f057 | 16 + parm/wave/grib2_gfswave.ep_10m.f060 | 16 + parm/wave/grib2_gfswave.ep_10m.f063 | 16 + parm/wave/grib2_gfswave.ep_10m.f066 | 16 + parm/wave/grib2_gfswave.ep_10m.f069 | 16 + parm/wave/grib2_gfswave.ep_10m.f072 | 16 + parm/wave/grib2_gfswave.ep_10m.f078 | 16 + parm/wave/grib2_gfswave.ep_10m.f084 | 16 + parm/wave/grib2_gfswave.ep_10m.f090 | 16 + parm/wave/grib2_gfswave.ep_10m.f096 | 16 + parm/wave/grib2_gfswave.ep_10m.f102 | 16 + parm/wave/grib2_gfswave.ep_10m.f108 | 16 + parm/wave/grib2_gfswave.ep_10m.f114 | 16 + parm/wave/grib2_gfswave.ep_10m.f120 | 16 + parm/wave/grib2_gfswave.ep_10m.f126 | 16 + parm/wave/grib2_gfswave.ep_10m.f132 | 16 + parm/wave/grib2_gfswave.ep_10m.f138 | 16 + parm/wave/grib2_gfswave.ep_10m.f144 | 16 + parm/wave/grib2_gfswave.ep_10m.f150 | 16 + parm/wave/grib2_gfswave.ep_10m.f156 | 16 + parm/wave/grib2_gfswave.ep_10m.f162 | 16 + parm/wave/grib2_gfswave.ep_10m.f168 | 16 + parm/wave/grib2_gfswave.ep_10m.f174 | 16 + parm/wave/grib2_gfswave.ep_10m.f180 | 16 + parm/wave/grib2_gfswave.glo_30m.f000 | 16 + parm/wave/grib2_gfswave.glo_30m.f003 | 16 + parm/wave/grib2_gfswave.glo_30m.f006 | 16 + parm/wave/grib2_gfswave.glo_30m.f009 | 16 + parm/wave/grib2_gfswave.glo_30m.f012 | 16 + parm/wave/grib2_gfswave.glo_30m.f015 | 16 + parm/wave/grib2_gfswave.glo_30m.f018 | 16 + 
 parm/wave/grib2_gfswave.glo_30m.f021 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f024 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f027 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f030 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f033 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f036 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f039 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f042 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f045 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f048 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f051 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f054 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f057 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f060 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f063 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f066 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f069 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f072 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f078 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f084 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f090 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f096 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f102 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f108 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f114 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f120 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f126 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f132 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f138 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f144 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f150 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f156 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f162 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f168 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f174 | 16 +
 parm/wave/grib2_gfswave.glo_30m.f180 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f000 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f003 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f006 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f009 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f012 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f015 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f018 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f021 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f024 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f027 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f030 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f033 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f036 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f039 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f042 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f045 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f048 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f051 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f054 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f057 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f060 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f063 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f066 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f069 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f072 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f078 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f084 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f090 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f096 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f102 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f108 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f114 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f120 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f126 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f132 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f138 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f144 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f150 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f156 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f162 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f168 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f174 | 16 +
 parm/wave/grib2_gfswave.wc_10m.f180 | 16 +
 parm/wave/wave_gfs.buoys | 615 +-
 scripts/exgdas_atmos_chgres_forenkf.sh | 47 +-
 scripts/exgdas_atmos_gempak_gif_ncdc.sh | 2 -
 scripts/exgdas_atmos_gldas.sh | 332 +
 scripts/exgdas_atmos_nawips.sh | 51 +-
 scripts/exgdas_atmos_post.sh | 335 +
 scripts/exgdas_atmos_verfozn.sh | 85 +
 scripts/exgdas_atmos_verfrad.sh | 212 +
 scripts/exgdas_atmos_vminmon.sh | 113 +
 scripts/exgdas_enkf_earc.sh | 304 +
 scripts/exgdas_enkf_ecen.sh | 111 +-
 scripts/exgdas_enkf_fcst.sh | 70 +-
 scripts/exgdas_enkf_post.sh | 40 +-
 scripts/exgdas_enkf_select_obs.sh | 7 -
 scripts/exgdas_enkf_sfc.sh | 103 +-
 scripts/exgdas_enkf_update.sh | 98 +-
 scripts/exgfs_aero_init_aerosol.py | 406 +-
 scripts/exgfs_atmos_awips_20km_1p0deg.sh | 284 +-
 scripts/exgfs_atmos_fbwind.sh | 4 +-
 scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh | 12 +-
 scripts/exgfs_atmos_gempak_meta.sh | 10 +-
 scripts/exgfs_atmos_goes_nawips.sh | 7 +-
 scripts/exgfs_atmos_grib2_special_npoess.sh | 255 +-
 scripts/exgfs_atmos_grib_awips.sh | 133 +-
 scripts/exgfs_atmos_nawips.sh | 257 +-
 scripts/exgfs_atmos_post.sh | 513 +
 scripts/exgfs_atmos_postsnd.sh | 85 +-
 scripts/exgfs_atmos_vminmon.sh | 116 +
 scripts/exgfs_wave_init.sh | 54 +-
 scripts/exgfs_wave_nawips.sh | 19 +-
 scripts/exgfs_wave_post_gridded_sbs.sh | 65 +-
 scripts/exgfs_wave_post_pnt.sh | 97 +-
 scripts/exgfs_wave_prdgen_bulls.sh | 79 +-
 scripts/exgfs_wave_prdgen_gridded.sh | 44 +-
 scripts/exgfs_wave_prep.sh | 85 +-
 scripts/exglobal_aero_analysis_finalize.py | 25 +
 scripts/exglobal_aero_analysis_initialize.py | 25 +
 scripts/exglobal_aero_analysis_run.py | 23 +
 scripts/exglobal_archive.sh | 481 +
 scripts/exglobal_atm_analysis_finalize.py | 25 +
 scripts/exglobal_atm_analysis_initialize.py | 25 +
 scripts/exglobal_atm_analysis_run.py | 23 +
 scripts/exglobal_atmens_analysis_finalize.py | 25 +
 .../exglobal_atmens_analysis_initialize.py | 25 +
 scripts/exglobal_atmens_analysis_run.py | 23 +
 scripts/exglobal_atmos_analysis.sh | 853 +-
 scripts/exglobal_atmos_analysis_calc.sh | 64 +-
 scripts/exglobal_atmos_sfcanl.sh | 140 +-
 scripts/exglobal_atmos_tropcy_qc_reloc.sh | 56 +-
 scripts/exglobal_diag.sh | 22 +-
 scripts/exglobal_forecast.py | 27 +
 scripts/exglobal_forecast.sh | 63 +-
 scripts/run_reg2grb2.sh | 37 +-
 scripts/run_regrid.sh | 31 +-
 sorc/build_all.sh | 291 +-
 sorc/build_gdas.sh | 41 +-
 sorc/build_gfs_util.sh | 21 -
 sorc/build_gfs_utils.sh | 45 +
 sorc/build_gfs_wafs.sh | 6 +-
 sorc/build_gldas.sh | 6 +-
 sorc/build_gsi_enkf.sh | 10 +-
 sorc/build_gsi_monitor.sh | 10 +-
 sorc/build_gsi_utils.sh | 10 +-
 sorc/build_ufs.sh | 30 +-
 sorc/build_ufs_utils.sh | 8 +-
 sorc/build_upp.sh | 15 +-
 sorc/build_ww3prepost.sh | 86 +-
 sorc/checkout.sh | 107 +-
 sorc/enkf_chgres_recenter.fd/.gitignore | 3 -
 sorc/enkf_chgres_recenter.fd/driver.f90 | 65 -
 sorc/enkf_chgres_recenter.fd/input_data.f90 | 383 -
 sorc/enkf_chgres_recenter.fd/interp.f90 | 552 -
 sorc/enkf_chgres_recenter.fd/output_data.f90 | 396 -
 sorc/enkf_chgres_recenter.fd/setup.f90 | 53 -
 sorc/enkf_chgres_recenter.fd/utils.f90 | 783 -
 sorc/enkf_chgres_recenter_nc.fd/driver.f90 | 67 -
 .../enkf_chgres_recenter_nc.fd/input_data.f90 | 345 -
 sorc/enkf_chgres_recenter_nc.fd/interp.f90 | 582 -
 .../output_data.f90 | 288 -
 sorc/enkf_chgres_recenter_nc.fd/setup.f90 | 55 -
 sorc/enkf_chgres_recenter_nc.fd/utils.f90 | 736 -
 sorc/fbwndgfs.fd/fbwndgfs.f | 969 --
 sorc/fv3nc2nemsio.fd/0readme | 23 -
 sorc/fv3nc2nemsio.fd/constants.f90 | 314 -
 sorc/fv3nc2nemsio.fd/fv3_main.f90 | 215 -
 sorc/fv3nc2nemsio.fd/fv3_module.f90 | 372 -
 sorc/fv3nc2nemsio.fd/kinds.f90 | 107 -
 sorc/gaussian_sfcanl.fd/.gitignore | 3 -
 sorc/gaussian_sfcanl.fd/gaussian_sfcanl.f90 | 2093 ---
 sorc/gaussian_sfcanl.fd/weight_gen/README | 23 -
 .../weight_gen/run.theia.sh | 152 -
 .../weight_gen/scrip.fd/scrip.f90 | 350 -
 sorc/gfs_bufr.fd/bfrhdr.f | 174 -
 sorc/gfs_bufr.fd/bfrize.f | 241 -
 sorc/gfs_bufr.fd/buff.f | 92 -
 sorc/gfs_bufr.fd/calpreciptype.f | 1616 ---
 sorc/gfs_bufr.fd/calwxt_gfs_baldwin.f | 294 -
 sorc/gfs_bufr.fd/calwxt_gfs_ramer.f | 364 -
 sorc/gfs_bufr.fd/funcphys.f | 2899 ----
 sorc/gfs_bufr.fd/gfsbufr.f | 276 -
 sorc/gfs_bufr.fd/gslp.f | 92 -
 sorc/gfs_bufr.fd/lcl.f | 45 -
 sorc/gfs_bufr.fd/machine.f | 15 -
 sorc/gfs_bufr.fd/meteorg.f | 1326 --
 sorc/gfs_bufr.fd/modstuff1.f | 75 -
 sorc/gfs_bufr.fd/mstadb.f | 49 -
 sorc/gfs_bufr.fd/newsig1.f | 65 -
 sorc/gfs_bufr.fd/physcons.f | 40 -
 sorc/gfs_bufr.fd/read_nemsio.f | 55 -
 sorc/gfs_bufr.fd/read_netcdf.f | 55 -
 sorc/gfs_bufr.fd/read_netcdf_p.f | 113 -
 sorc/gfs_bufr.fd/rsearch.f | 145 -
 sorc/gfs_bufr.fd/svp.f | 34 -
 sorc/gfs_bufr.fd/tdew.f | 54 -
 sorc/gfs_bufr.fd/terp3.f | 124 -
 sorc/gfs_bufr.fd/vintg.f | 111 -
 sorc/gfs_build.cfg | 4 +-
 sorc/link_workflow.sh | 512 +-
 sorc/machine-setup.sh | 142 -
 sorc/ncl.setup | 6 +-
 sorc/partial_build.sh | 123 +-
 sorc/regrid_nemsio.fd/constants.f90 | 314 -
 sorc/regrid_nemsio.fd/fv3_interface.f90 | 779 -
 sorc/regrid_nemsio.fd/gfs_nems_interface.f90 | 595 -
 .../interpolation_interface.f90 | 335 -
 sorc/regrid_nemsio.fd/kinds.f90 | 107 -
 sorc/regrid_nemsio.fd/main.f90 | 92 -
 sorc/regrid_nemsio.fd/mpi_interface.f90 | 89 -
 sorc/regrid_nemsio.fd/namelist_def.f90 | 181 -
 sorc/regrid_nemsio.fd/netcdfio_interface.f90 | 592 -
 sorc/regrid_nemsio.fd/physcons.f90 | 77 -
 .../regrid_nemsio_interface.f90 | 50 -
 sorc/regrid_nemsio.fd/variable_interface.f90 | 66 -
 sorc/supvit.fd/supvit_main.f | 865 --
 sorc/supvit.fd/supvit_modules.f | 52 -
 sorc/syndat_getjtbul.fd/getjtbul.f | 248 -
 sorc/syndat_maksynrc.fd/maksynrc.f | 472 -
 sorc/syndat_qctropcy.fd/qctropcy.f | 12099 ----------------
 sorc/tave.fd/tave.f | 1083 --
 sorc/tocsbufr.fd/tocsbufr.f | 272 -
 sorc/vint.fd/vint.f | 1239 --
 test/diff_grib_files.py | 8 +-
 ush/calcanl_gfs.py | 346 +-
 ush/calcinc_gfs.py | 129 +-
 ush/compare_f90nml.py | 107 +
 ush/detect_machine.sh | 73 +
 ush/drive_makeprepbufr.sh | 142 -
 ush/forecast_det.sh | 56 +-
 ush/forecast_postdet.sh | 607 +-
 ush/forecast_predet.sh | 113 +-
 ush/fv3gfs_downstream_nems.sh | 68 +-
 ush/fv3gfs_dwn_nems.sh | 6 +-
 ush/fv3gfs_nc2nemsio.sh | 73 -
 ush/fv3gfs_regrid_nemsio.sh | 119 -
 ush/fv3gfs_remap.sh | 5 +-
 ush/gaussian_sfcanl.sh | 58 +-
 ush/gfs_bfr2gpk.sh | 45 +-
 ush/gfs_bufr.sh | 96 +-
 ush/gfs_bufr_netcdf.sh | 16 +-
 ush/gfs_post.sh | 416 +
 ush/gfs_sndp.sh | 30 +-
 ush/gfs_truncate_enkf.sh | 2 +-
 ush/gldas_forcing.sh | 118 +
 ush/gldas_get_data.sh | 76 +
 ush/gldas_liscrd.sh | 46 +
 ush/gldas_process_data.sh | 34 +
 ush/global_extrkr.sh | 1697 ---
 ush/gsi_utils.py | 38 +-
 ush/hpssarch_gen.sh | 843 +-
 ush/inter_flux.sh | 22 +-
 ush/jjob_header.sh | 115 +
 ush/load_fv3gfs_modules.sh | 18 +-
 ush/load_ufsda_modules.sh | 90 +
 ush/merge_fv3_aerosol_tile.py | 4 +-
 ush/minmon_xtrct_costs.pl | 231 +
 ush/minmon_xtrct_gnorms.pl | 442 +
 ush/minmon_xtrct_reduct.pl | 89 +
 ush/mod_icec.sh | 2 +-
 ush/module-setup.sh | 107 +
 ush/nems.configure.atm.IN | 8 +-
 ush/nems.configure.atm_aero.IN | 5 +-
 ush/nems.configure.blocked_atm_wav.IN | 7 +-
 ush/nems.configure.cpld.IN | 7 +-
 ush/nems.configure.cpld_aero_outerwave.IN | 148 +
 ush/nems.configure.cpld_aero_wave.IN | 9 +-
 ush/nems.configure.cpld_outerwave.IN | 136 +
 ush/nems.configure.cpld_wave.IN | 8 +-
 ush/nems.configure.leapfrog_atm_wav.IN | 7 +-
 ush/nems_configure.sh | 187 +-
 ush/ocnpost.ncl | 5 +-
 ush/ozn_xtrct.sh | 261 +
 ush/parsing_model_configure_FV3.sh | 26 +-
 ush/parsing_namelists_CICE.sh | 98 +-
 ush/parsing_namelists_FV3.sh | 6 +-
 ush/parsing_namelists_MOM6.sh | 123 +-
 ush/parsing_namelists_WW3.sh | 4 +-
 ush/preamble.sh | 157 +-
 ush/python/pygfs/__init__.py | 0
 ush/python/pygfs/task/__init__.py | 0
 ush/python/pygfs/task/aero_analysis.py | 304 +
 ush/python/pygfs/task/analysis.py | 201 +
 ush/python/pygfs/task/atm_analysis.py | 435 +
 ush/python/pygfs/task/atmens_analysis.py | 351 +
 ush/python/pygfs/task/gfs_forecast.py | 35 +
 ush/python/pygfs/ufswm/__init__.py | 0
 ush/python/pygfs/ufswm/gfs.py | 20 +
 ush/python/pygfs/ufswm/ufs.py | 58 +
 ush/python/pygw/.gitignore | 139 +
 ush/python/pygw/README.md | 36 +
 ush/python/pygw/setup.cfg | 62 +
 ush/python/pygw/setup.py | 4 +
 ush/python/pygw/src/pygw/__init__.py | 8 +
 ush/python/pygw/src/pygw/attrdict.py | 171 +
 ush/python/pygw/src/pygw/configuration.py | 179 +
 ush/python/pygw/src/pygw/exceptions.py | 87 +
 ush/python/pygw/src/pygw/executable.py | 354 +
 ush/python/pygw/src/pygw/file_utils.py | 73 +
 ush/python/pygw/src/pygw/fsutils.py | 73 +
 ush/python/pygw/src/pygw/jinja.py | 228 +
 ush/python/pygw/src/pygw/logger.py | 275 +
 ush/python/pygw/src/pygw/task.py | 93 +
 ush/python/pygw/src/pygw/template.py | 191 +
 ush/python/pygw/src/pygw/timetools.py | 291 +
 ush/python/pygw/src/pygw/yaml_file.py | 208 +
 ush/python/pygw/src/tests/__init__.py | 0
 .../pygw/src/tests/test_configuration.py | 172 +
 ush/python/pygw/src/tests/test_exceptions.py | 35 +
 ush/python/pygw/src/tests/test_executable.py | 60 +
 ush/python/pygw/src/tests/test_file_utils.py | 66 +
 ush/python/pygw/src/tests/test_jinja.py | 37 +
 ush/python/pygw/src/tests/test_logger.py | 67 +
 ush/python/pygw/src/tests/test_template.py | 147 +
 ush/python/pygw/src/tests/test_timetools.py | 76 +
 ush/python/pygw/src/tests/test_yaml_file.py | 97 +
 ush/radmon_diag_ck.sh | 175 +
 ush/radmon_err_rpt.sh | 194 +
 ush/radmon_verf_angle.sh | 235 +
 ush/radmon_verf_bcoef.sh | 233 +
 ush/radmon_verf_bcor.sh | 226 +
 ush/radmon_verf_time.sh | 567 +
 ush/rstprod.sh | 19 +
 ush/scale_dec.sh | 2 +-
 ush/syndat_getjtbul.sh | 52 +-
 ush/syndat_qctropcy.sh | 64 +-
 ush/trim_rh.sh | 2 +-
 ush/tropcy_relocate.sh | 105 +-
 ush/tropcy_relocate_extrkr.sh | 86 +-
 ush/wave_grib2_sbs.sh | 278 +-
 ush/wave_grid_interp_sbs.sh | 32 +-
 ush/wave_grid_moddef.sh | 14 +-
 ush/wave_outp_cat.sh | 10 +-
 ush/wave_outp_spec.sh | 22 +-
 ush/wave_prnc_cur.sh | 21 +-
 ush/wave_prnc_ice.sh | 38 +-
 ush/wave_tar.sh | 36 +-
 util/modulefiles/gfs_util.hera | 28 -
 util/sorc/compile_gfs_util_wcoss.sh | 49 -
 .../mkgfsawps.fd/compile_mkgfsawps_wcoss.sh | 28 -
 util/sorc/mkgfsawps.fd/makefile | 53 -
 util/sorc/mkgfsawps.fd/makefile.hera | 53 -
 util/sorc/mkgfsawps.fd/mkgfsawps.f | 511 -
 .../overgridid.fd/compile_overgridid_wcoss.sh | 35 -
 util/sorc/overgridid.fd/makefile | 8 -
 util/sorc/overgridid.fd/overgridid.f | 59 -
 util/sorc/overgridid.fd/sample.script | 13 -
 util/sorc/rdbfmsua.fd/MAPFILE | 4045 ------
 util/sorc/rdbfmsua.fd/README | 2 -
 util/sorc/rdbfmsua.fd/README.new | 10 -
 .../rdbfmsua.fd/compile_rdbfmsua_wcoss.sh | 35 -
 util/sorc/rdbfmsua.fd/makefile | 84 -
 util/sorc/rdbfmsua.fd/makefile.hera | 88 -
 util/sorc/rdbfmsua.fd/rdbfmsua.f | 398 -
 util/sorc/rdbfmsua.fd/rdbfmsua.f_org | 397 -
 util/sorc/webtitle.fd/README | 9 -
 .../webtitle.fd/compile_webtitle_wcoss.sh | 35 -
 util/sorc/webtitle.fd/makefile | 37 -
 util/sorc/webtitle.fd/webtitle.f | 147 -
 util/ush/finddate.sh | 163 -
 util/ush/make_NTC_file.pl | 119 -
 util/ush/make_ntc_bull.pl | 250 -
 util/ush/make_tif.sh | 45 -
 util/ush/month_name.sh | 112 -
 versions/fix.ver | 23 +
 workflow/applications.py | 101 +-
 workflow/ecFlow/ecflow_definitions.py | 38 +-
 workflow/ecFlow/ecflow_setup.py | 6 +-
 workflow/hosts.py | 24 +-
 workflow/hosts/container.yaml | 23 +
 workflow/hosts/hera_emc.yaml | 42 +-
 workflow/hosts/hera_gsl.yaml | 42 +-
 workflow/hosts/jet_emc.yaml | 23 +
 workflow/hosts/jet_gsl.yaml | 42 +-
 workflow/hosts/orion_emc.yaml | 42 +-
 workflow/hosts/orion_gsl.yaml | 42 +-
 workflow/hosts/s4.yaml | 23 +
 workflow/hosts/wcoss2.yaml | 23 +
 workflow/pygw | 1 +
 workflow/rocoto/rocoto.py | 18 +-
 workflow/rocoto/workflow_tasks_emc.py | 529 +-
 workflow/rocoto/workflow_tasks_gsl.py | 553 +-
 workflow/rocoto/workflow_xml_emc.py | 58 +-
 workflow/rocoto/workflow_xml_gsl.py | 58 +-
 workflow/rocoto_viewer.py | 48 +-
 workflow/setup_expt.py | 296 +-
 workflow/setup_xml.py | 17 +-
 workflow/test_configuration.py | 10 +-
 825 files changed, 33581 insertions(+), 57647 deletions(-)
 create mode 100644 .github/ISSUE_TEMPLATE/feature_request.md
 create mode 100644 .github/ISSUE_TEMPLATE/fix_file.md
 create mode 100644 .github/ISSUE_TEMPLATE/production_update.md
 create mode 100755 .github/scripts/build_docs.sh
 create mode 100644 .github/workflows/docs.yaml
 create mode 100644 .github/workflows/linters.yaml
 create mode 100644 .github/workflows/pynorms.yaml
 create mode 100644 .github/workflows/pytests.yaml
 create mode 100644 .pycodestyle
 create mode 100644 .shellcheckrc
 create mode 100644 LICENSE.md
 create mode 100644 ci/cases/C96C48_hybatmDA.yaml
 create mode 100644 ci/cases/C96_atm3DVar.yaml
 create mode 100644 ci/platforms/hera.sh
 create mode 100644 ci/platforms/orion.sh
 create mode 100755 ci/scripts/check_ci.sh
 create mode 100755 ci/scripts/clone-build_ci.sh
 create mode 100755 ci/scripts/create_experiment.py
 create mode 100755 ci/scripts/driver.sh
 create mode 120000 ci/scripts/pygw
 create mode 100755 ci/scripts/run_ci.sh
 create mode 100644 docs/Makefile
 create mode 100644 docs/make.bat
 create mode 100644 docs/requirements.txt
 create mode 100644 docs/source/_static/GFS_v16_flowchart.png
 create mode 100644 docs/source/_static/custom.css
 create mode 100644 docs/source/_static/fv3_rocoto_view.png
 create mode 100644 docs/source/_static/theme_overrides.css
 create mode 100644 docs/source/clone.rst
 create mode 100644 docs/source/components.rst
 create mode 100644 docs/source/conf.py
 create mode 100644 docs/source/configure.rst
 create mode 100644 docs/source/development.rst
 create mode 100644 docs/source/errors_faq.rst
 create mode 100644 docs/source/hpc.rst
 create mode 100644 docs/source/index.rst
 create mode 100644 docs/source/init.rst
 create mode 100644 docs/source/jobs.rst
 create mode 100644 docs/source/monitor_rocoto.rst
 create mode 100644 docs/source/output.rst
 create mode 100644 docs/source/run.rst
 create mode 100644 docs/source/setup.rst
 create mode 100644 docs/source/start.rst
 create mode 100644 docs/source/view.rst
 create mode 100755 env/CONTAINER.env
 create mode 100755 env/S4.env
 create mode 100755 env/WCOSS2.env
 delete mode 100644 env/gfs.ver
 create mode 100755 jobs/JGDAS_ATMOS_GLDAS
 create mode 100755 jobs/JGDAS_ATMOS_VERFOZN
 create mode 100755 jobs/JGDAS_ATMOS_VERFRAD
 create mode 100755 jobs/JGDAS_ATMOS_VMINMON
 create mode 100755 jobs/JGDAS_ENKF_ARCHIVE
 create mode 100755 jobs/JGDAS_FIT2OBS
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN
 create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY
 create mode 100755 jobs/JGFS_ATMOS_VMINMON
 create mode 100755 jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE
 create mode 100755 jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE
 create mode 100755 jobs/JGLOBAL_AERO_ANALYSIS_RUN
 create mode 100755 jobs/JGLOBAL_ARCHIVE
 create mode 100755 jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE
 create mode 100755 jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE
 create mode 100755 jobs/JGLOBAL_ATMENS_ANALYSIS_RUN
 create mode 100755 jobs/JGLOBAL_ATMOS_POST
 create mode 100755 jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE
 create mode 100755 jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE
 create mode 100755 jobs/JGLOBAL_ATM_ANALYSIS_RUN
 create mode 100755 jobs/JGLOBAL_LAND_ANALYSIS_FINALIZE
 create mode 100755 jobs/JGLOBAL_LAND_ANALYSIS_INITIALIZE
 create mode 100755 jobs/JGLOBAL_LAND_ANALYSIS_RUN
 create mode 100755 jobs/rocoto/aeroanlfinal.sh
 create mode 100755 jobs/rocoto/aeroanlinit.sh
 create mode 100755 jobs/rocoto/aeroanlrun.sh
 create mode 100755 jobs/rocoto/atmanlfinal.sh
 create mode 100755 jobs/rocoto/atmanlinit.sh
 create mode 100755 jobs/rocoto/atmanlrun.sh
 create mode 100755 jobs/rocoto/atmensanlfinal.sh
 create mode 100755 jobs/rocoto/atmensanlinit.sh
 create mode 100755 jobs/rocoto/atmensanlrun.sh
 delete mode 100755 jobs/rocoto/eomg.sh
 create mode 100755 jobs/rocoto/fit2obs.sh
 create mode 100755 jobs/rocoto/landanlfinal.sh
 create mode 100755 jobs/rocoto/landanlinit.sh
 create mode 100755 jobs/rocoto/landanlrun.sh
 create mode 100755 jobs/rocoto/ocnanalbmat.sh
 create mode 100755 jobs/rocoto/ocnanalchkpt.sh
 create mode 100755 jobs/rocoto/ocnanalpost.sh
 create mode 100755 jobs/rocoto/ocnanalprep.sh
 create mode 100755 jobs/rocoto/ocnanalrun.sh
 create mode 100755 jobs/rocoto/ocnanalvrfy.sh
 create mode 100644 modulefiles/module_base.s4.lua
 create mode 100644 modulefiles/module_base.wcoss2.lua
 create mode 100644 modulefiles/module_gwci.hera.lua
 create mode 100644 modulefiles/module_gwci.orion.lua
 create mode 100644 modulefiles/module_gwsetup.hera.lua
 create mode 100644 modulefiles/module_gwsetup.orion.lua
 create mode 100644 parm/config/config.aeroanl
 create mode 100644 parm/config/config.aeroanlfinal
 create mode 100644 parm/config/config.aeroanlinit
 create mode 100644 parm/config/config.aeroanlrun
 mode change 100755 => 100644 parm/config/config.anal
 mode change 100755 => 100644 parm/config/config.analcalc
 mode change 100755 => 100644 parm/config/config.analdiag
 mode change 100755 => 100644 parm/config/config.arch
 create mode 100644 parm/config/config.atmanl
 create mode 100644 parm/config/config.atmanlfinal
 create mode 100644 parm/config/config.atmanlinit
 create mode 100644 parm/config/config.atmanlrun
 create mode 100755 parm/config/config.atmensanl
 create mode 100755 parm/config/config.atmensanlfinal
 create mode 100755 parm/config/config.atmensanlinit
 create mode 100755 parm/config/config.atmensanlrun
 mode change 100755 => 100644 parm/config/config.awips
 mode change 100755 => 100644 parm/config/config.base.nco.static
 create mode 100644 parm/config/config.com
 mode change 100755 => 100644 parm/config/config.coupled_ic
 mode change 100755 => 100644 parm/config/config.earc
 mode change 100755 => 100644 parm/config/config.ecen
 mode change 100755 => 100644 parm/config/config.echgres
 mode change 100755 => 100644 parm/config/config.ediag
 mode change 100755 => 100644 parm/config/config.efcs
 mode change 100755 => 100644 parm/config/config.eobs
 mode change 100755 => 100644 parm/config/config.epos
 mode change 100755 => 100644 parm/config/config.esfc
 mode change 100755 => 100644 parm/config/config.eupd
 mode change 100755 => 100644 parm/config/config.fcst
 create mode 100644 parm/config/config.fit2obs
 delete mode 100755 parm/config/config.fv3
 mode change 100755 => 100644 parm/config/config.fv3.nco.static
 mode change 100755 => 100644 parm/config/config.gempak
 mode change 100755 => 100644 parm/config/config.getic
 mode change 100755 => 100644 parm/config/config.gldas
 mode change 100755 => 100644 parm/config/config.init
 create mode 100755 parm/config/config.landanl
 create mode 100755 parm/config/config.landanlfinal
 create mode 100755 parm/config/config.landanlinit
 create mode 100755 parm/config/config.landanlrun
 mode change 100755 => 100644 parm/config/config.metp
 mode change 100755 => 100644 parm/config/config.nsst
 create mode 100644 parm/config/config.ocnanal
 create mode 100644 parm/config/config.ocnanalbmat
 create mode 100644 parm/config/config.ocnanalchkpt
 create mode 100644 parm/config/config.ocnanalpost
 create mode 100644 parm/config/config.ocnanalprep
 create mode 100644 parm/config/config.ocnanalrun
 create mode 100644 parm/config/config.ocnanalvrfy
 mode change 100755 => 100644 parm/config/config.ocnpost
 mode change 100755 => 100644 parm/config/config.post
 mode change 100755 => 100644 parm/config/config.postsnd
 mode change 100755 => 100644 parm/config/config.prep
 delete mode 100755 parm/config/config.prepbufr
 mode change 100755 => 100644 parm/config/config.resources
 create mode 100644 parm/config/config.resources.nco.static
 create mode 100644 parm/config/config.ufs
 mode change 100755 => 100644 parm/config/config.vrfy
 mode change 100755 => 100644 parm/config/config.wafs
 mode change 100755 => 100644 parm/config/config.wafsblending
 mode change 100755 => 100644 parm/config/config.wafsblending0p25
 mode change 100755 => 100644 parm/config/config.wafsgcip
 mode change 100755 => 100644 parm/config/config.wafsgrib2
 mode change 100755 => 100644 parm/config/config.wafsgrib20p25
 mode change 100755 => 100644 parm/config/config.wave
 mode change 100755 => 100644 parm/config/config.waveawipsbulls
 mode change 100755 => 100644 parm/config/config.waveawipsgridded
 mode change 100755 => 100644 parm/config/config.wavegempak
 mode change 100755 => 100644 parm/config/config.waveinit
 mode change 100755 => 100644 parm/config/config.wavepostbndpnt
 mode change 100755 => 100644 parm/config/config.wavepostbndpntbll
 mode change 100755 => 100644 parm/config/config.wavepostpnt
 mode change 100755 => 100644 parm/config/config.wavepostsbs
 mode change 100755 => 100644 parm/config/config.waveprep
 create mode 100644 parm/config/yaml/defaults.yaml
 create mode 100644 parm/mom6/MOM_input_template_500
 delete mode 100644 parm/parm_fv3diag/diag_table_history
 create mode 100644 parm/parm_fv3diag/field_table_gfdl_progsigma
 create mode 100644 parm/parm_fv3diag/field_table_gfdl_satmedmf_progsigma
 create mode 100644 parm/parm_fv3diag/field_table_thompson_noaero_tke_progsigma
 create mode 100644 parm/parm_fv3diag/field_table_wsm6_progsigma
 create mode 100644 parm/parm_fv3diag/field_table_wsm6_satmedmf_progsigma
 create mode 100644 parm/parm_fv3diag/field_table_zhaocarr_progsigma
 create mode 100644 parm/parm_fv3diag/field_table_zhaocarr_satmedmf_progsigma
 create mode 100644 parm/parm_gdas/aero_crtm_coeff.yaml
 create mode 100644 parm/parm_gdas/aero_jedi_fix.yaml
 create mode 100644 parm/parm_gdas/aeroanl_inc_vars.yaml
 create mode 100644 parm/parm_gdas/atm_crtm_coeff.yaml
 create mode 100644 parm/parm_gdas/atm_jedi_fix.yaml
 create mode 100644 parm/parm_gdas/atmanl_inc_vars.yaml
 create mode 100644 parm/ufs/fix/gfs/atmos.fixed_files.yaml
 create mode 100644 parm/ufs/fix/gfs/land.fixed_files.yaml
 create mode 100644 parm/ufs/fix/gfs/ocean.fixed_files.yaml
 create mode 100644 parm/wave/bull_awips_gfswave
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f000
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f003
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f006
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f009
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f012
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f015
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f018
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f021
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f024
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f027
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f030
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f033
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f036
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f039
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f042
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f045
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f048
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f051
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f054
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f057
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f060
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f063
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f066
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f069
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f072
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f078
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f084
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f090
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f096
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f102
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f108
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f114
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f120
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f126
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f132
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f138
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f144
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f150
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f156
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f162
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f168
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f174
 create mode 100644 parm/wave/grib2_gfswave.ao_9km.f180
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f000
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f003
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f006
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f009
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f012
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f015
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f018
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f021
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f024
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f027
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f030
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f033
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f036
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f039
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f042
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f045
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f048
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f051
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f054
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f057
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f060
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f063
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f066
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f069
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f072
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f078
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f084
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f090
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f096
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f102
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f108
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f114
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f120
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f126
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f132
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f138
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f144
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f150
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f156
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f162
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f168
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f174
 create mode 100644 parm/wave/grib2_gfswave.at_10m.f180
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f000
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f003
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f006
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f009
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f012
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f015
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f018
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f021
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f024
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f027
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f030
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f033
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f036
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f039
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f042
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f045
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f048
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f051
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f054
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f057
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f060
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f063
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f066
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f069
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f072
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f078
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f084
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f090
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f096
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f102
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f108
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f114
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f120
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f126
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f132
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f138
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f144
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f150
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f156
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f162
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f168
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f174
 create mode 100644 parm/wave/grib2_gfswave.ep_10m.f180
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f000
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f003
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f006
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f009
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f012
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f015
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f018
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f021
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f024
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f027
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f030
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f033
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f036
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f039
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f042
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f045
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f048
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f051
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f054
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f057
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f060
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f063
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f066
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f069
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f072
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f078
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f084
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f090
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f096
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f102
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f108
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f114
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f120
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f126
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f132
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f138
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f144
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f150
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f156
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f162
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f168
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f174
 create mode 100644 parm/wave/grib2_gfswave.glo_30m.f180
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f000
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f003
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f006
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f009
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f012
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f015
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f018
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f021
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f024
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f027
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f030
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f033
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f036
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f039
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f042
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f045
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f048
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f051
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f054
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f057
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f060
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f063
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f066
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f069
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f072
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f078
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f084
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f090
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f096
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f102
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f108
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f114
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f120
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f126
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f132
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f138
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f144
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f150
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f156
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f162
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f168
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f174
 create mode 100644 parm/wave/grib2_gfswave.wc_10m.f180
 mode change 100755 => 120000 parm/wave/wave_gfs.buoys
 create mode 100755 scripts/exgdas_atmos_gldas.sh
 create mode 100755 scripts/exgdas_atmos_post.sh
 create mode 100755 scripts/exgdas_atmos_verfozn.sh
 create mode 100755 scripts/exgdas_atmos_verfrad.sh
 create mode 100755 scripts/exgdas_atmos_vminmon.sh
 create mode 100755 scripts/exgdas_enkf_earc.sh
 create mode 100755 scripts/exgfs_atmos_post.sh
 create mode 100755 scripts/exgfs_atmos_vminmon.sh
 create mode 100755 scripts/exglobal_aero_analysis_finalize.py
 create mode 100755 scripts/exglobal_aero_analysis_initialize.py
 create mode 100755 scripts/exglobal_aero_analysis_run.py
 create mode 100755 scripts/exglobal_archive.sh
 create mode 100755 scripts/exglobal_atm_analysis_finalize.py
 create mode 100755 scripts/exglobal_atm_analysis_initialize.py
 create mode 100755 scripts/exglobal_atm_analysis_run.py
 create mode 100755 scripts/exglobal_atmens_analysis_finalize.py
 create mode 100755 scripts/exglobal_atmens_analysis_initialize.py
 create mode 100755 scripts/exglobal_atmens_analysis_run.py
 create mode 100755 scripts/exglobal_forecast.py
 delete mode 100755 sorc/build_gfs_util.sh
 create mode 100755 sorc/build_gfs_utils.sh
 delete mode 100644 sorc/enkf_chgres_recenter.fd/.gitignore
 delete mode 100644 sorc/enkf_chgres_recenter.fd/driver.f90
 delete mode 100644 sorc/enkf_chgres_recenter.fd/input_data.f90
 delete mode 100644 sorc/enkf_chgres_recenter.fd/interp.f90
 delete mode 100644 sorc/enkf_chgres_recenter.fd/output_data.f90
 delete mode 100644 sorc/enkf_chgres_recenter.fd/setup.f90
 delete mode 100644 sorc/enkf_chgres_recenter.fd/utils.f90
 delete mode 100644 sorc/enkf_chgres_recenter_nc.fd/driver.f90
 delete mode 100644 sorc/enkf_chgres_recenter_nc.fd/input_data.f90
 delete mode 100644 sorc/enkf_chgres_recenter_nc.fd/interp.f90
 delete mode 100644 sorc/enkf_chgres_recenter_nc.fd/output_data.f90
 delete mode 100644 sorc/enkf_chgres_recenter_nc.fd/setup.f90
 delete mode 100644 sorc/enkf_chgres_recenter_nc.fd/utils.f90
 delete mode 100644 sorc/fbwndgfs.fd/fbwndgfs.f
 delete mode 100644 sorc/fv3nc2nemsio.fd/0readme
 delete mode 100644 sorc/fv3nc2nemsio.fd/constants.f90
 delete mode 100644 sorc/fv3nc2nemsio.fd/fv3_main.f90
 delete mode 100644 sorc/fv3nc2nemsio.fd/fv3_module.f90
 delete mode 100644 sorc/fv3nc2nemsio.fd/kinds.f90
 delete mode 100644 sorc/gaussian_sfcanl.fd/.gitignore
 delete mode 100644 sorc/gaussian_sfcanl.fd/gaussian_sfcanl.f90
 delete mode 100644 sorc/gaussian_sfcanl.fd/weight_gen/README
 delete mode 100755 sorc/gaussian_sfcanl.fd/weight_gen/run.theia.sh
 delete mode 100644 sorc/gaussian_sfcanl.fd/weight_gen/scrip.fd/scrip.f90
 delete mode 100644 sorc/gfs_bufr.fd/bfrhdr.f
 delete mode 100644 sorc/gfs_bufr.fd/bfrize.f
 delete mode 100644 sorc/gfs_bufr.fd/buff.f
 delete mode 100644 sorc/gfs_bufr.fd/calpreciptype.f
 delete mode 100644 sorc/gfs_bufr.fd/calwxt_gfs_baldwin.f
 delete mode 100644 sorc/gfs_bufr.fd/calwxt_gfs_ramer.f
 delete mode 100644 sorc/gfs_bufr.fd/funcphys.f
 delete mode 100644 sorc/gfs_bufr.fd/gfsbufr.f
 delete mode 100644 sorc/gfs_bufr.fd/gslp.f
 delete mode 100644 sorc/gfs_bufr.fd/lcl.f
 delete mode 100644 sorc/gfs_bufr.fd/machine.f
 delete mode 100644 sorc/gfs_bufr.fd/meteorg.f
 delete mode 100644 sorc/gfs_bufr.fd/modstuff1.f
 delete mode 100644 sorc/gfs_bufr.fd/mstadb.f
 delete mode 100644 sorc/gfs_bufr.fd/newsig1.f
 delete mode 100644 sorc/gfs_bufr.fd/physcons.f
 delete mode 100644 sorc/gfs_bufr.fd/read_nemsio.f
 delete mode 100644 sorc/gfs_bufr.fd/read_netcdf.f
 delete mode 100644 sorc/gfs_bufr.fd/read_netcdf_p.f
 delete mode 100644 sorc/gfs_bufr.fd/rsearch.f
 delete mode 100644 sorc/gfs_bufr.fd/svp.f
 delete mode 100644 sorc/gfs_bufr.fd/tdew.f
 delete mode 100644 sorc/gfs_bufr.fd/terp3.f
 delete mode 100644 sorc/gfs_bufr.fd/vintg.f
 delete mode 100644 sorc/machine-setup.sh
 delete mode 100644 sorc/regrid_nemsio.fd/constants.f90
 delete mode 100644 sorc/regrid_nemsio.fd/fv3_interface.f90
 delete mode 100644 sorc/regrid_nemsio.fd/gfs_nems_interface.f90
 delete mode 100644 sorc/regrid_nemsio.fd/interpolation_interface.f90
 delete mode 100644 sorc/regrid_nemsio.fd/kinds.f90
 delete mode 100644 sorc/regrid_nemsio.fd/main.f90
 delete mode 100644 sorc/regrid_nemsio.fd/mpi_interface.f90
 delete mode 100644 sorc/regrid_nemsio.fd/namelist_def.f90
 delete mode 100644 sorc/regrid_nemsio.fd/netcdfio_interface.f90
 delete mode 100644 sorc/regrid_nemsio.fd/physcons.f90
 delete mode 100644 sorc/regrid_nemsio.fd/regrid_nemsio_interface.f90
 delete mode 100644 sorc/regrid_nemsio.fd/variable_interface.f90
 delete mode 100644 sorc/supvit.fd/supvit_main.f
 delete mode 100644 sorc/supvit.fd/supvit_modules.f
 delete mode 100644 sorc/syndat_getjtbul.fd/getjtbul.f
 delete mode 100644 sorc/syndat_maksynrc.fd/maksynrc.f
 delete mode 100644
sorc/syndat_qctropcy.fd/qctropcy.f delete mode 100644 sorc/tave.fd/tave.f delete mode 100644 sorc/tocsbufr.fd/tocsbufr.f delete mode 100644 sorc/vint.fd/vint.f create mode 100755 ush/compare_f90nml.py create mode 100755 ush/detect_machine.sh delete mode 100755 ush/drive_makeprepbufr.sh delete mode 100755 ush/fv3gfs_nc2nemsio.sh delete mode 100755 ush/fv3gfs_regrid_nemsio.sh create mode 100755 ush/gfs_post.sh create mode 100755 ush/gldas_forcing.sh create mode 100755 ush/gldas_get_data.sh create mode 100755 ush/gldas_liscrd.sh create mode 100755 ush/gldas_process_data.sh delete mode 100755 ush/global_extrkr.sh create mode 100644 ush/jjob_header.sh create mode 100755 ush/load_ufsda_modules.sh create mode 100755 ush/minmon_xtrct_costs.pl create mode 100755 ush/minmon_xtrct_gnorms.pl create mode 100755 ush/minmon_xtrct_reduct.pl create mode 100755 ush/module-setup.sh create mode 100644 ush/nems.configure.cpld_aero_outerwave.IN create mode 100644 ush/nems.configure.cpld_outerwave.IN create mode 100755 ush/ozn_xtrct.sh mode change 100755 => 100644 ush/preamble.sh create mode 100644 ush/python/pygfs/__init__.py create mode 100644 ush/python/pygfs/task/__init__.py create mode 100644 ush/python/pygfs/task/aero_analysis.py create mode 100644 ush/python/pygfs/task/analysis.py create mode 100644 ush/python/pygfs/task/atm_analysis.py create mode 100644 ush/python/pygfs/task/atmens_analysis.py create mode 100644 ush/python/pygfs/task/gfs_forecast.py create mode 100644 ush/python/pygfs/ufswm/__init__.py create mode 100644 ush/python/pygfs/ufswm/gfs.py create mode 100644 ush/python/pygfs/ufswm/ufs.py create mode 100644 ush/python/pygw/.gitignore create mode 100644 ush/python/pygw/README.md create mode 100644 ush/python/pygw/setup.cfg create mode 100644 ush/python/pygw/setup.py create mode 100644 ush/python/pygw/src/pygw/__init__.py create mode 100644 ush/python/pygw/src/pygw/attrdict.py create mode 100644 ush/python/pygw/src/pygw/configuration.py create mode 100644 
ush/python/pygw/src/pygw/exceptions.py create mode 100644 ush/python/pygw/src/pygw/executable.py create mode 100644 ush/python/pygw/src/pygw/file_utils.py create mode 100644 ush/python/pygw/src/pygw/fsutils.py create mode 100644 ush/python/pygw/src/pygw/jinja.py create mode 100644 ush/python/pygw/src/pygw/logger.py create mode 100644 ush/python/pygw/src/pygw/task.py create mode 100644 ush/python/pygw/src/pygw/template.py create mode 100644 ush/python/pygw/src/pygw/timetools.py create mode 100644 ush/python/pygw/src/pygw/yaml_file.py create mode 100644 ush/python/pygw/src/tests/__init__.py create mode 100644 ush/python/pygw/src/tests/test_configuration.py create mode 100644 ush/python/pygw/src/tests/test_exceptions.py create mode 100644 ush/python/pygw/src/tests/test_executable.py create mode 100644 ush/python/pygw/src/tests/test_file_utils.py create mode 100644 ush/python/pygw/src/tests/test_jinja.py create mode 100644 ush/python/pygw/src/tests/test_logger.py create mode 100644 ush/python/pygw/src/tests/test_template.py create mode 100644 ush/python/pygw/src/tests/test_timetools.py create mode 100644 ush/python/pygw/src/tests/test_yaml_file.py create mode 100755 ush/radmon_diag_ck.sh create mode 100755 ush/radmon_err_rpt.sh create mode 100755 ush/radmon_verf_angle.sh create mode 100755 ush/radmon_verf_bcoef.sh create mode 100755 ush/radmon_verf_bcor.sh create mode 100755 ush/radmon_verf_time.sh create mode 100755 ush/rstprod.sh delete mode 100644 util/modulefiles/gfs_util.hera delete mode 100755 util/sorc/compile_gfs_util_wcoss.sh delete mode 100755 util/sorc/mkgfsawps.fd/compile_mkgfsawps_wcoss.sh delete mode 100755 util/sorc/mkgfsawps.fd/makefile delete mode 100755 util/sorc/mkgfsawps.fd/makefile.hera delete mode 100755 util/sorc/mkgfsawps.fd/mkgfsawps.f delete mode 100755 util/sorc/overgridid.fd/compile_overgridid_wcoss.sh delete mode 100755 util/sorc/overgridid.fd/makefile delete mode 100755 util/sorc/overgridid.fd/overgridid.f delete mode 100755 
util/sorc/overgridid.fd/sample.script delete mode 100755 util/sorc/rdbfmsua.fd/MAPFILE delete mode 100755 util/sorc/rdbfmsua.fd/README delete mode 100755 util/sorc/rdbfmsua.fd/README.new delete mode 100755 util/sorc/rdbfmsua.fd/compile_rdbfmsua_wcoss.sh delete mode 100755 util/sorc/rdbfmsua.fd/makefile delete mode 100755 util/sorc/rdbfmsua.fd/makefile.hera delete mode 100755 util/sorc/rdbfmsua.fd/rdbfmsua.f delete mode 100755 util/sorc/rdbfmsua.fd/rdbfmsua.f_org delete mode 100755 util/sorc/webtitle.fd/README delete mode 100755 util/sorc/webtitle.fd/compile_webtitle_wcoss.sh delete mode 100755 util/sorc/webtitle.fd/makefile delete mode 100755 util/sorc/webtitle.fd/webtitle.f delete mode 100755 util/ush/finddate.sh delete mode 100755 util/ush/make_NTC_file.pl delete mode 100755 util/ush/make_ntc_bull.pl delete mode 100755 util/ush/make_tif.sh delete mode 100755 util/ush/month_name.sh create mode 100644 versions/fix.ver create mode 100644 workflow/hosts/container.yaml create mode 100644 workflow/hosts/jet_emc.yaml create mode 100644 workflow/hosts/s4.yaml create mode 100644 workflow/hosts/wcoss2.yaml create mode 120000 workflow/pygw diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md new file mode 100644 index 00000000000..c9b72628dd4 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -0,0 +1,40 @@ +--- +name: Feature Request +about: Use this template for requesting new features +title: +labels: feature +assignees: + +--- + + + + + + +**Description** + + + +**Requirements** + + +**Acceptance Criteria (Definition of Done)** + + +**(Optional): Suggest A Solution** + diff --git a/.github/ISSUE_TEMPLATE/fix_file.md b/.github/ISSUE_TEMPLATE/fix_file.md new file mode 100644 index 00000000000..1e05f0c9df9 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/fix_file.md @@ -0,0 +1,24 @@ +--- +name: Fix File Update +about: Use this template for adding, updating, or removing fix files from global dataset +title: +labels: Fix 
Files +assignees: + - KateFriedman-NOAA + - WalterKolczynski-NOAA + +--- + +**Description** + + + + + + +**Tasks** + +- [ ] Discuss needs with global-workflow developer assigned to request. +- [ ] Add/update/remove fix file(s) in fix sets on supported platforms (global-workflow assignee task). +- [ ] Update "Fix File Management" spreadsheet (https://docs.google.com/spreadsheets/d/1BeIvcz6TO3If4YCqkUK-oz_kGS9q2wTjwLS-BBemSEY/edit?usp=sharing). +- [ ] Make related workflow/component updates. diff --git a/.github/ISSUE_TEMPLATE/production_update.md b/.github/ISSUE_TEMPLATE/production_update.md new file mode 100644 index 00000000000..fd517d3d0a7 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/production_update.md @@ -0,0 +1,31 @@ +--- +name: Production Update +about: Use this template for operational production updates +title: +labels: production update +assignees: + - KateFriedman-NOAA + +--- + +**Description** + + + + +**Workflow Changes** + + + +**Tasks** +- [ ] Create release branch +- [ ] Make workflow changes for upgrade in release branch (add additional checklist items as needed) +- [ ] Create release notes +- [ ] Cut hand-off tag for CDF +- [ ] Submit CDF to NCO +- [ ] Implementation into operations complete +- [ ] Merge release branch into operational branch +- [ ] Cut version tag from operational branch +- [ ] Release new version tag +- [ ] Announce to users +- [ ] Update Read-The-Docs operations status version in develop diff --git a/.github/scripts/build_docs.sh b/.github/scripts/build_docs.sh new file mode 100755 index 00000000000..7fb6701da2a --- /dev/null +++ b/.github/scripts/build_docs.sh @@ -0,0 +1,31 @@ +#! 
/bin/bash + +set -eux + +# path to docs directory relative to top level of repository +# $GITHUB_WORKSPACE is set if the actions/checkout@v3 action is run first + +cwd=$(pwd) +DOCS_DIR="${GITHUB_WORKSPACE}/docs" + +# run Make to build the documentation and return to previous directory +cd "${DOCS_DIR}" +make clean html +cd "${cwd}" + +# copy HTML output into directory to create an artifact +mkdir -p artifact/documentation +cp -R "${DOCS_DIR}/build/html/." artifact/documentation + +# check if the warnings.log file is not empty +# Copy it into the artifact and documentation directories +# so it will be available in the artifacts +warning_file="${DOCS_DIR}/build/warnings.log" +if [[ -s ${warning_file} ]]; then + cp -r "${DOCS_DIR}/build/warnings.log" artifact/doc_warnings.log + cp artifact/doc_warnings.log artifact/documentation + echo "Warnings were encountered while building documentation." + echo "========== Begin warnings ==========" + cat artifact/doc_warnings.log + echo "=========== End warnings ===========" +fi diff --git a/.github/workflows/docs.yaml b/.github/workflows/docs.yaml new file mode 100644 index 00000000000..ae083a3c0bf --- /dev/null +++ b/.github/workflows/docs.yaml @@ -0,0 +1,51 @@ +name: Build and Deploy Documentation +on: + push: + branches: + - develop + - feature/* + - main/* + - bugfix/* + - release/* + paths: + - docs/** + pull_request: + types: [opened, reopened, synchronize] + +jobs: + documentation: + runs-on: ubuntu-latest + name: Build and deploy documentation + + steps: + - name: Setup Python + uses: actions/setup-python@v4 + with: + python-version: "3.9" + + - name: Install (upgrade) python dependencies + run: | + pip install --upgrade pip sphinx sphinx-gallery sphinx_rtd_theme sphinxcontrib-bibtex + + - name: Checkout + uses: actions/checkout@v3 + + - name: Build documentation + run: | + ./.github/scripts/build_docs.sh + + - name: Upload documentation (on success) + uses: actions/upload-artifact@v3 + if: always() + with: + name: 
documentation + path: artifact/documentation + + - name: Upload warnings (on failure) + uses: actions/upload-artifact@v3 + if: failure() + with: + name: documentation_warnings.log + path: artifact/doc_warnings.log + if-no-files-found: ignore + diff --git a/.github/workflows/linters.yaml b/.github/workflows/linters.yaml new file mode 100644 index 00000000000..488b6a1407e --- /dev/null +++ b/.github/workflows/linters.yaml @@ -0,0 +1,64 @@ +# +name: shellnorms +on: + pull_request: + +permissions: + contents: read + +defaults: + run: + shell: bash -o pipefail {0} + +jobs: + lint-shell: + runs-on: ubuntu-latest + + permissions: + security-events: write + + steps: + - name: Checkout code + uses: actions/checkout@v3 + with: + fetch-depth: 0 + + - id: ShellCheck + name: Lint shell scripts + uses: redhat-plumbers-in-action/differential-shellcheck@v4 + with: + token: ${{ secrets.GITHUB_TOKEN }} + + - if: ${{ always() }} + name: Upload artifact with ShellCheck defects in SARIF format + uses: actions/upload-artifact@v3 + with: + name: Differential ShellCheck SARIF + path: ${{ steps.ShellCheck.outputs.sarif }} + + # lint-python: + # runs-on: ubuntu-latest + + # permissions: + # security-events: write + + # steps: + # - name: Checkout code + # uses: actions/checkout@v3 + + # - id: VCS_Diff_Lint + # name: Lint python scripts + # uses: fedora-copr/vcs-diff-lint-action@v1 + + # - if: ${{ always() }} + # name: Upload artifact with detected defects in SARIF format + # uses: actions/upload-artifact@v3 + # with: + # name: VCS Diff Lint SARIF + # path: ${{ steps.VCS_Diff_Lint.outputs.sarif }} + + # - if: ${{ failure() }} + # name: Upload SARIF to GitHub using github/codeql-action/upload-sarif + # uses: github/codeql-action/upload-sarif@v2 + # with: + # sarif_file: ${{ steps.VCS_Diff_Lint.outputs.sarif }} diff --git a/.github/workflows/pynorms.yaml b/.github/workflows/pynorms.yaml new file mode 100644 index 00000000000..7f823f83181 --- /dev/null +++ b/.github/workflows/pynorms.yaml @@ 
-0,0 +1,24 @@ +name: pynorms +on: [push, pull_request] + +jobs: + check_norms: + runs-on: ubuntu-latest + name: Check Python coding norms with pycodestyle + + steps: + + - name: Install dependencies + run: | + pip install --upgrade pip + pip install pycodestyle + + - name: Checkout + uses: actions/checkout@v3 + with: + path: global-workflow + + - name: Run pycodestyle + run: | + cd $GITHUB_WORKSPACE/global-workflow + pycodestyle -v --config ./.pycodestyle --exclude='.git,.github' ./ diff --git a/.github/workflows/pytests.yaml b/.github/workflows/pytests.yaml new file mode 100644 index 00000000000..f15a776c0f1 --- /dev/null +++ b/.github/workflows/pytests.yaml @@ -0,0 +1,36 @@ +name: pytests +on: [push, pull_request] + +jobs: + run_pytests: + runs-on: ubuntu-latest + name: Install pygw and run tests with pytests + strategy: + max-parallel: 1 + matrix: + python: ["3.7", "3.8", "3.9", "3.10"] + + steps: + - name: Setup Python + uses: actions/setup-python@v4 + with: + python-version: ${{ matrix.python }} + + - name: Install (upgrade) python dependencies + run: | + pip install --upgrade pip + + - name: Checkout + uses: actions/checkout@v3 + with: + path: global-workflow + + - name: Install pygw + run: | + cd $GITHUB_WORKSPACE/global-workflow/ush/python/pygw + pip install .[dev] + + - name: Run pytests + run: | + cd $GITHUB_WORKSPACE/global-workflow/ush/python/pygw + pytest -v src/tests diff --git a/.gitignore b/.gitignore index df54a098920..e73b9f2e05b 100644 --- a/.gitignore +++ b/.gitignore @@ -5,9 +5,15 @@ __pycache__ *.[aox] *.mod *.sw[a-p] +._* .DS_Store +#nohup.out - some users do not want this to be a part of .gitignore. 
TODO: review against best practices .idea/ +.vscode/ +# Ignore editor generated backup files +#------------------------------------- +*~ # Ignore folders #------------------- exec/ @@ -17,8 +23,23 @@ install*/ # Ignore fix directory symlinks #------------------------------ fix/0readme -fix/fix_* -fix/gdas/ +fix/aer +fix/am +fix/chem +fix/cice +fix/cpl +fix/datm +fix/gdas +fix/gldas +fix/gsi +fix/lut +fix/mom6 +fix/orog +fix/reg2grb2 +fix/sfc_climo +fix/ugwd +fix/verif +fix/wave fix/wafs # Ignore parm file symlinks @@ -75,71 +96,12 @@ parm/wafs #-------------------------------------------- sorc/*log sorc/logs -sorc/ufs_model.fd -sorc/gfs_post.fd -sorc/gfs_wafs.fd -sorc/gldas.fd -sorc/gsi_enkf.fd -sorc/gsi.fd -sorc/enkf.fd -sorc/gdas.cd -sorc/gsi_utils.fd -sorc/gsi_monitor.fd -sorc/ufs_utils.fd -sorc/verif-global.fd - -# Ignore sorc symlinks -#--------------------- -sorc/calc_analysis.fd -sorc/calc_increment_ens.fd -sorc/calc_increment_ens_ncio.fd -sorc/emcsfc_ice_blend.fd -sorc/emcsfc_snow2mdl.fd -sorc/fregrid.fd -sorc/gdas2gldas.fd -sorc/getsfcensmeanp.fd -sorc/getsigensmeanp_smooth.fd -sorc/getsigensstatp.fd -sorc/gfs_ncep_post.fd -sorc/gldas2gdas.fd -sorc/gldas_forcing.fd -sorc/gldas_model.fd -sorc/gldas_post.fd -sorc/gldas_rst.fd -sorc/global_chgres.fd -sorc/global_cycle.fd -sorc/global_enkf.fd -sorc/global_gsi.fd -sorc/interp_inc.fd -sorc/make_hgrid.fd -sorc/make_solo_mosaic.fd -sorc/ncdiag_cat.fd -sorc/nst_tf_chg.fd -sorc/oznmon_horiz.fd -sorc/oznmon_time.fd -sorc/radmon_angle.fd -sorc/radmon_bcoef.fd -sorc/radmon_bcor.fd -sorc/radmon_time.fd -sorc/recentersigp.fd -sorc/upp.fd -sorc/wafs_awc_wafavn.fd -sorc/wafs_blending.fd -sorc/wafs_blending_0p25.fd -sorc/wafs_cnvgrib2.fd -sorc/wafs_gcip.fd -sorc/wafs_grib2_0p25.fd -sorc/wafs_makewafs.fd -sorc/wafs_setmissing.fd +sorc/*.cd +sorc/*.fd # Ignore scripts from externals #------------------------------ # jobs symlinks -jobs/JGDAS_ATMOS_GLDAS -jobs/JGDAS_ATMOS_VERFOZN -jobs/JGDAS_ATMOS_VERFRAD 
-jobs/JGDAS_ATMOS_VMINMON -jobs/JGFS_ATMOS_VMINMON jobs/JGFS_ATMOS_WAFS jobs/JGFS_ATMOS_WAFS_BLENDING jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 @@ -148,11 +110,6 @@ jobs/JGFS_ATMOS_WAFS_GRIB2 jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 # scripts symlinks scripts/exemcsfc_global_sfc_prep.sh -scripts/exgdas_atmos_gldas.sh -scripts/exgdas_atmos_verfozn.sh -scripts/exgdas_atmos_verfrad.sh -scripts/exgdas_atmos_vminmon.sh -scripts/exgfs_atmos_vminmon.sh scripts/exgfs_atmos_wafs_blending.sh scripts/exgfs_atmos_wafs_blending_0p25.sh scripts/exgfs_atmos_wafs_gcip.sh @@ -168,29 +125,18 @@ ush/fv3gfs_driver_grid.sh ush/fv3gfs_filter_topo.sh ush/fv3gfs_make_grid.sh ush/fv3gfs_make_orog.sh -ush/gldas_archive.sh -ush/gldas_forcing.sh -ush/gldas_get_data.sh -ush/gldas_liscrd.sh -ush/gldas_post.sh -ush/gldas_process_data.sh ush/global_chgres.sh ush/global_chgres_driver.sh ush/global_cycle.sh ush/global_cycle_driver.sh -ush/minmon_xtrct_costs.pl -ush/minmon_xtrct_gnorms.pl -ush/minmon_xtrct_reduct.pl +ush/jediinc2fv3.py ush/mkwfsgbl.sh -ush/ozn_xtrct.sh -ush/radmon_ck_stdout.sh -ush/radmon_err_rpt.sh -ush/radmon_verf_angle.sh -ush/radmon_verf_bcoef.sh -ush/radmon_verf_bcor.sh -ush/radmon_verf_time.sh ush/ufsda -ush/rstprod.sh ush/wafs_blending.sh ush/wafs_grib2.regrid.sh ush/wafs_intdsk.sh +ush/finddate.sh +ush/make_NTC_file.pl +ush/make_ntc_bull.pl +ush/make_tif.sh +ush/month_name.sh diff --git a/.pycodestyle b/.pycodestyle new file mode 100644 index 00000000000..8bd18fa9d71 --- /dev/null +++ b/.pycodestyle @@ -0,0 +1,6 @@ +[pycodestyle] +count = False +ignore = E402,W504 +max-line-length = 160 +statistics = True +exclude = Experimental diff --git a/.shellcheckrc b/.shellcheckrc new file mode 100644 index 00000000000..6d540ba17fa --- /dev/null +++ b/.shellcheckrc @@ -0,0 +1,16 @@ +# Global settings for Shellcheck (https://github.com/koalaman/shellcheck) +enable=all + +external-sources=false + +# Disable variable referenced but not assigned +disable=SC2154 + +# Disable following non-constant source 
+disable=SC1090 + +# Disable non-existent binary +disable=SC1091 + +# Disable -p -m only applies to deepest directory +disable=SC2174 diff --git a/Externals.cfg b/Externals.cfg index 0725d254897..e78cd2838e0 100644 --- a/Externals.cfg +++ b/Externals.cfg @@ -1,59 +1,72 @@ # External sub-modules of global-workflow -[FV3GFS] -hash = 9350745855aebe0790813e0ed2ba5ad680e3f75c -local_path = sorc/fv3gfs.fd +[UFS] +tag = 2247060 +local_path = sorc/ufs_model.fd repo_url = https://github.com/ufs-community/ufs-weather-model.git protocol = git required = True -[GSI] -hash = 9c1fc15d42573b398037319bbf8d5143ad126fb6 -local_path = sorc/gsi.fd -repo_url = https://github.com/NOAA-EMC/GSI.git -protocol = git -required = True - -[GLDAS] -tag = gldas_gfsv16_release.v1.15.0 -local_path = sorc/gldas.fd -repo_url = https://github.com/NOAA-EMC/GLDAS.git +[gfs-utils] +hash = 0b8ff56 +local_path = sorc/gfs_utils.fd +repo_url = https://github.com/NOAA-EMC/gfs-utils protocol = git required = True -[UPP] -#No externals setting = .gitmodules will be invoked for CMakeModules and comupp/src/lib/crtm2 submodules -hash = ff42e0227d6100285d4179a2572b700fd5a959cb -local_path = sorc/gfs_post.fd -repo_url = https://github.com/NOAA-EMC/UPP.git -protocol = git -required = True - -[UFS_UTILS] -tag = ufs_utils_1_8_0 +[UFS-Utils] +hash = 5b67e4d local_path = sorc/ufs_utils.fd repo_url = https://github.com/ufs-community/UFS_UTILS.git protocol = git required = True [EMC_verif-global] -tag = verif_global_v2.5.2 +tag = c267780 local_path = sorc/verif-global.fd repo_url = https://github.com/NOAA-EMC/EMC_verif-global.git protocol = git required = True -[EMC_gfs_wafs] -hash = c2a29a67d9432b4d6fba99eac7797b81d05202b6 -local_path = sorc/gfs_wafs.fd -repo_url = https://github.com/NOAA-EMC/EMC_gfs_wafs.git +[GSI-EnKF] +hash = 113e307 +local_path = sorc/gsi_enkf.fd +repo_url = https://github.com/NOAA-EMC/GSI.git +protocol = git +required = False + +[GSI-Utils] +hash = 322cc7b +local_path = sorc/gsi_utils.fd +repo_url = 
https://github.com/NOAA-EMC/GSI-utils.git +protocol = git +required = False + +[GSI-Monitor] +hash = 45783e3 +local_path = sorc/gsi_monitor.fd +repo_url = https://github.com/NOAA-EMC/GSI-monitor.git protocol = git required = False -[aeroconv] -hash = 24f6ddc -local_path = sorc/aeroconv.fd -repo_url = https://github.com/NCAR/aeroconv.git +[GDASApp] +hash = aaf7caa +local_path = sorc/gdas.cd +repo_url = https://github.com/NOAA-EMC/GDASApp.git +protocol = git +required = False + +[GLDAS] +tag = fd8ba62 +local_path = sorc/gldas.fd +repo_url = https://github.com/NOAA-EMC/GLDAS.git +protocol = git +required = False + +[EMC-gfs_wafs] +hash = 014a0b8 +local_path = sorc/gfs_wafs.fd +repo_url = https://github.com/NOAA-EMC/EMC_gfs_wafs.git protocol = git required = False diff --git a/LICENSE.md b/LICENSE.md new file mode 100644 index 00000000000..0927556b544 --- /dev/null +++ b/LICENSE.md @@ -0,0 +1,157 @@ +### GNU LESSER GENERAL PUBLIC LICENSE + +Version 3, 29 June 2007 + +Copyright (C) 2007 Free Software Foundation, Inc. + + +Everyone is permitted to copy and distribute verbatim copies of this +license document, but changing it is not allowed. + +This version of the GNU Lesser General Public License incorporates the +terms and conditions of version 3 of the GNU General Public License, +supplemented by the additional permissions listed below. + +#### 0. Additional Definitions. + +As used herein, "this License" refers to version 3 of the GNU Lesser +General Public License, and the "GNU GPL" refers to version 3 of the +GNU General Public License. + +"The Library" refers to a covered work governed by this License, other +than an Application or a Combined Work as defined below. + +An "Application" is any work that makes use of an interface provided +by the Library, but which is not otherwise based on the Library. +Defining a subclass of a class defined by the Library is deemed a mode +of using an interface provided by the Library. 
+ +A "Combined Work" is a work produced by combining or linking an +Application with the Library. The particular version of the Library +with which the Combined Work was made is also called the "Linked +Version". + +The "Minimal Corresponding Source" for a Combined Work means the +Corresponding Source for the Combined Work, excluding any source code +for portions of the Combined Work that, considered in isolation, are +based on the Application, and not on the Linked Version. + +The "Corresponding Application Code" for a Combined Work means the +object code and/or source code for the Application, including any data +and utility programs needed for reproducing the Combined Work from the +Application, but excluding the System Libraries of the Combined Work. + +#### 1. Exception to Section 3 of the GNU GPL. + +You may convey a covered work under sections 3 and 4 of this License +without being bound by section 3 of the GNU GPL. + +#### 2. Conveying Modified Versions. + +If you modify a copy of the Library, and, in your modifications, a +facility refers to a function or data to be supplied by an Application +that uses the facility (other than as an argument passed when the +facility is invoked), then you may convey a copy of the modified +version: + +- a) under this License, provided that you make a good faith effort + to ensure that, in the event an Application does not supply the + function or data, the facility still operates, and performs + whatever part of its purpose remains meaningful, or +- b) under the GNU GPL, with none of the additional permissions of + this License applicable to that copy. + +#### 3. Object Code Incorporating Material from Library Header Files. + +The object code form of an Application may incorporate material from a +header file that is part of the Library. 
You may convey such object +code under terms of your choice, provided that, if the incorporated +material is not limited to numerical parameters, data structure +layouts and accessors, or small macros, inline functions and templates +(ten or fewer lines in length), you do both of the following: + +- a) Give prominent notice with each copy of the object code that + the Library is used in it and that the Library and its use are + covered by this License. +- b) Accompany the object code with a copy of the GNU GPL and this + license document. + +#### 4. Combined Works. + +You may convey a Combined Work under terms of your choice that, taken +together, effectively do not restrict modification of the portions of +the Library contained in the Combined Work and reverse engineering for +debugging such modifications, if you also do each of the following: + +- a) Give prominent notice with each copy of the Combined Work that + the Library is used in it and that the Library and its use are + covered by this License. +- b) Accompany the Combined Work with a copy of the GNU GPL and this + license document. +- c) For a Combined Work that displays copyright notices during + execution, include the copyright notice for the Library among + these notices, as well as a reference directing the user to the + copies of the GNU GPL and this license document. +- d) Do one of the following: + - 0) Convey the Minimal Corresponding Source under the terms of + this License, and the Corresponding Application Code in a form + suitable for, and under terms that permit, the user to + recombine or relink the Application with a modified version of + the Linked Version to produce a modified Combined Work, in the + manner specified by section 6 of the GNU GPL for conveying + Corresponding Source. + - 1) Use a suitable shared library mechanism for linking with + the Library. 
A suitable mechanism is one that (a) uses at run + time a copy of the Library already present on the user's + computer system, and (b) will operate properly with a modified + version of the Library that is interface-compatible with the + Linked Version. +- e) Provide Installation Information, but only if you would + otherwise be required to provide such information under section 6 + of the GNU GPL, and only to the extent that such information is + necessary to install and execute a modified version of the + Combined Work produced by recombining or relinking the Application + with a modified version of the Linked Version. (If you use option + 4d0, the Installation Information must accompany the Minimal + Corresponding Source and Corresponding Application Code. If you + use option 4d1, you must provide the Installation Information in + the manner specified by section 6 of the GNU GPL for conveying + Corresponding Source.) + +#### 5. Combined Libraries. + +You may place library facilities that are a work based on the Library +side by side in a single library together with other library +facilities that are not Applications and are not covered by this +License, and convey such a combined library under terms of your +choice, if you do both of the following: + +- a) Accompany the combined library with a copy of the same work + based on the Library, uncombined with any other library + facilities, conveyed under the terms of this License. +- b) Give prominent notice with the combined library that part of it + is a work based on the Library, and explaining where to find the + accompanying uncombined form of the same work. + +#### 6. Revised Versions of the GNU Lesser General Public License. + +The Free Software Foundation may publish revised and/or new versions +of the GNU Lesser General Public License from time to time. Such new +versions will be similar in spirit to the present version, but may +differ in detail to address new problems or concerns. 
+ +Each version is given a distinguishing version number. If the Library +as you received it specifies that a certain numbered version of the +GNU Lesser General Public License "or any later version" applies to +it, you have the option of following the terms and conditions either +of that published version or of any later version published by the +Free Software Foundation. If the Library as you received it does not +specify a version number of the GNU Lesser General Public License, you +may choose any version of the GNU Lesser General Public License ever +published by the Free Software Foundation. + +If the Library as you received it specifies that a proxy can decide +whether future versions of the GNU Lesser General Public License shall +apply, that proxy's public statement of acceptance of any version is +permanent authorization for you to choose that version for the +Library. diff --git a/README.md b/README.md index c89aa7275b4..465b0529fac 100644 --- a/README.md +++ b/README.md @@ -1,54 +1,40 @@ -# global-workflow -Global Superstructure/Workflow currently supporting the Finite-Volume on a Cubed-Sphere Global Forecast System (FV3GFS) - -The global-workflow depends on the following prerequisities to be available on the system: - -* workload management platform / scheduler - LSF or SLURM -* workflow manager - ROCOTO (https://github.com/christopherwharrop/rocoto) -* modules - NCEPLIBS (various), esmf v8.0.0bs48, hdf5, intel/ips v18, impi v18, wgrib2, netcdf v4.7.0, hpss, gempak (see module files under /modulefiles for additional details) - -The global-workflow current supports the following machines: +[![Read The Docs Status](https://readthedocs.org/projects/global-workflow/badge/?badge=latest)](http://global-workflow.readthedocs.io/) +[![shellnorms](https://github.com/NOAA-EMC/global-workflow/actions/workflows/linters.yaml/badge.svg)](https://github.com/NOAA-EMC/global-workflow/actions/workflows/linters.yaml) 
+[![pynorms](https://github.com/NOAA-EMC/global-workflow/actions/workflows/pynorms.yaml/badge.svg)](https://github.com/NOAA-EMC/global-workflow/actions/workflows/pynorms.yaml) +[![pytests](https://github.com/NOAA-EMC/global-workflow/actions/workflows/pytests.yaml/badge.svg)](https://github.com/NOAA-EMC/global-workflow/actions/workflows/pytests.yaml) -* WCOSS-Dell -* WCOSS-Cray -* Hera -* Orion - -Quick-start instructions are below. Full instructions are available in the [wiki](https://github.com/NOAA-EMC/global-workflow/wiki/Run-Global-Workflow) - -## Build global-workflow: +# global-workflow +Global Workflow currently supporting the Global Forecast System (GFS) with the [UFS-weather-model](https://github.com/ufs-community/ufs-weather-model) and [GSI](https://github.com/NOAA-EMC/GSI)-based Data Assimilation System. -### 1. Check out components +The `global-workflow` depends on the following prerequisites to be available on the system: -While in /sorc folder: -``` -$ sh checkout.sh -``` +* Workflow Engine - [Rocoto](https://github.com/christopherwharrop/rocoto) and [ecFlow](https://github.com/ecmwf/ecflow) (for NWS Operations) +* Compiler - Intel Compiler Suite +* Software - NCEPLIBS (various), ESMF, HDF5, NetCDF, and a host of other software (see module files under /modulefiles for additional details) -### 2. Build components +The `global-workflow` currently supports the following tier-1 machines: -While in /sorc folder: +* NOAA RDHPCS - Hera +* MSU HPC - Orion +* NOAA's operational HPC - WCOSS2 -``` -$ sh build_all.sh -``` +Additionally, the following tier-2 machine is supported: +* SSEC at Univ.
of Wisconsin - S4 (Note that S2S+ experiments are not fully supported) -Or use an available option: -``` -build_all.sh [-a UFS_app][-c build_config][-h][-v] - -a UFS_app: - Build a specific UFS app instead of the default - -c build_config: - Selectively build based on the provided config instead of the default config - -h: - Print usage message and exit - -v: - Run all scripts in verbose mode -``` +Documentation (in progress) is available [here](https://global-workflow.readthedocs.io/en/latest/). -### 3. Link components +# Disclaimer -While in /sorc folder: +The United States Department of Commerce (DOC) GitHub project code is provided +on an "as is" basis and the user assumes responsibility for its use. DOC has +relinquished control of the information and no longer has responsibility to +protect the integrity, confidentiality, or availability of the information. Any +claims against the Department of Commerce stemming from the use of its GitHub +project will be governed by all applicable Federal law. Any reference to +specific commercial products, processes, or services by service mark, +trademark, manufacturer, or otherwise, does not constitute or imply their +endorsement, recommendation or favoring by the Department of Commerce. The +Department of Commerce seal and logo, or the seal and logo of a DOC bureau, +shall not be used in any manner to imply endorsement of any commercial product +or activity by DOC or the United States Government. -$ sh link_workflow.sh emc $MACHINE -...where $MACHINE is "dell", "cray", "hera", or "orion". 
diff --git a/ci/cases/C96C48_hybatmDA.yaml b/ci/cases/C96C48_hybatmDA.yaml new file mode 100644 index 00000000000..9efce409009 --- /dev/null +++ b/ci/cases/C96C48_hybatmDA.yaml @@ -0,0 +1,15 @@ +experiment: + mode: cycled + +arguments: + app: ATM + resdet: 96 + resens: 48 + comrot: ${RUNTESTS}/${pslot}/COMROT + expdir: ${RUNTESTS}/${pslot}/EXPDIR + icsdir: ${ICSDIR_ROOT}/C96C48 + idate: 2021122018 + edate: 2021122200 + nens: 2 + gfs_cyc: 1 + start: cold diff --git a/ci/cases/C96_atm3DVar.yaml b/ci/cases/C96_atm3DVar.yaml new file mode 100644 index 00000000000..1648432e091 --- /dev/null +++ b/ci/cases/C96_atm3DVar.yaml @@ -0,0 +1,14 @@ +experiment: + mode: cycled + +arguments: + app: ATM + resdet: 96 + comrot: ${RUNTESTS}/${pslot}/COMROT + expdir: ${RUNTESTS}/${pslot}/EXPDIR + icsdir: ${ICSDIR_ROOT}/C96C48 + idate: 2021122018 + edate: 2021122100 + nens: 0 + gfs_cyc: 1 + start: cold diff --git a/ci/platforms/hera.sh b/ci/platforms/hera.sh new file mode 100644 index 00000000000..35fe7bca912 --- /dev/null +++ b/ci/platforms/hera.sh @@ -0,0 +1,7 @@ +#!/usr/bin/bash +export GFS_CI_ROOT=/scratch1/NCEPDEV/global/Terry.McGuinness/GFS_CI_ROOT +export SLURM_ACCOUNT=fv3-cpu +export SALLOC_ACCOUNT="${SLURM_ACCOUNT}" +export SBATCH_ACCOUNT="${SLURM_ACCOUNT}" +export SLURM_QOS=debug +export ICSDIR_ROOT="/scratch1/NCEPDEV/global/glopara/data/ICSDIR" diff --git a/ci/platforms/orion.sh b/ci/platforms/orion.sh new file mode 100644 index 00000000000..7d69a3b276e --- /dev/null +++ b/ci/platforms/orion.sh @@ -0,0 +1,11 @@ +#!/usr/bin/bash + +export GFS_CI_ROOT=/work2/noaa/global/mterry/GFS_CI_ROOT +export ICSDIR_ROOT=/work/noaa/global/glopara/data/ICSDIR +export SLURM_ACCOUNT=fv3-cpu +export SALLOC_ACCOUNT=${SLURM_ACCOUNT} +export SBATCH_ACCOUNT=${SLURM_ACCOUNT} +export SLURM_QOS=debug +export SLURM_EXCLUSIVE=user +export OMP_NUM_THREADS=1 +ulimit -s unlimited diff --git a/ci/scripts/check_ci.sh b/ci/scripts/check_ci.sh new file mode 100755 index 00000000000..aa48e9f8946 --- /dev/null 
+++ b/ci/scripts/check_ci.sh @@ -0,0 +1,115 @@ +#!/bin/bash +set -eux +##################################################################################### +# +# Script description: BASH script for checking for cases in a given PR and +# running rocotostat on each to determine if the experiment has +# succeeded or failed. This script is intended +# to run from within a cron job in the CI Managers account +# Abstract TODO +##################################################################################### + +HOMEgfs="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." >/dev/null 2>&1 && pwd )" +scriptname=$(basename "${BASH_SOURCE[0]}") +echo "Begin ${scriptname} at $(date -u)" || true +export PS4='+ $(basename ${BASH_SOURCE})[${LINENO}]' + +GH=${HOME}/bin/gh +REPO_URL=${REPO_URL:-"https://github.com/NOAA-EMC/global-workflow.git"} + +######################################################################### +# Set up runtime environment variables for accounts on supported machines +######################################################################### + +source "${HOMEgfs}/ush/detect_machine.sh" +case ${MACHINE_ID} in + hera | orion) + echo "Running Automated Testing on ${MACHINE_ID}" + source "${HOMEgfs}/ci/platforms/${MACHINE_ID}.sh" + ;; + *) + echo "Unsupported platform. Exiting with error." + exit 1 + ;; +esac +set +x +source "${HOMEgfs}/ush/module-setup.sh" +module use "${HOMEgfs}/modulefiles" +module load "module_gwsetup.${MACHINE_ID}" +module list +set -x +rocotostat=$(which rocotostat) +if [[ -z ${rocotostat} ]]; then + echo "rocotostat not found on system" + exit 1 +else + echo "rocotostat being used from ${rocotostat}" +fi + +pr_list_file="open_pr_list" + +if [[ -s "${GFS_CI_ROOT}/${pr_list_file}" ]]; then + pr_list=$(cat "${GFS_CI_ROOT}/${pr_list_file}") +else + echo "no PRs to process .. 
exit" + exit 0 +fi + +############################################################# +# Loop through all PRs in the PR list and look for experiments in +# the RUNTESTS dir, and for each one run rocotostat on them +############################################################# + +for pr in ${pr_list}; do + id=$("${GH}" pr view "${pr}" --repo "${REPO_URL}" --json id --jq '.id') + echo "Processing Pull Request #${pr} and looking for cases" + pr_dir="${GFS_CI_ROOT}/PR/${pr}" + + # If there is no RUNTESTS dir for this PR then cases have not been made yet + if [[ ! -d "${pr_dir}/RUNTESTS" ]]; then + continue + fi + num_cases=$(find "${pr_dir}/RUNTESTS" -mindepth 1 -maxdepth 1 -type d | wc -l) || true + + # Check for PR success when ${pr_dir}/RUNTESTS is void of subfolders + # since all successful ones were previously removed + if [[ "${num_cases}" -eq 0 ]] && [[ -d "${pr_dir}/RUNTESTS" ]]; then + "${GH}" pr edit --repo "${REPO_URL}" "${pr}" --remove-label "CI-${MACHINE_ID^}-Running" --add-label "CI-${MACHINE_ID^}-Passed" + "${GH}" pr comment "${pr}" --repo "${REPO_URL}" --body-file "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + sed -i "/${pr}/d" "${GFS_CI_ROOT}/${pr_list_file}" + # Completely remove the PR and its cloned repo on success of all cases + rm -Rf "${pr_dir}" + continue + fi + + for cases in "${pr_dir}/RUNTESTS/"*; do + pslot=$(basename "${cases}") + xml="${pr_dir}/RUNTESTS/${pslot}/EXPDIR/${pslot}/${pslot}.xml" + db="${pr_dir}/RUNTESTS/${pslot}/EXPDIR/${pslot}/${pslot}.db" + rocoto_stat_output=$("${rocotostat}" -w "${xml}" -d "${db}" -s | grep -v CYCLE) || true + num_cycles=$(echo "${rocoto_stat_output}" | wc -l) || true + num_done=$(echo "${rocoto_stat_output}" | grep -c Done) || true + num_succeeded=$("${rocotostat}" -w "${xml}" -d "${db}" -a | grep -c SUCCEEDED) || true + echo "${pslot} Total Cycles: ${num_cycles} number done: ${num_done}" || true + num_failed=$("${rocotostat}" -w "${xml}" -d "${db}" -a | grep -c -E 'FAIL|DEAD') || true + if [[ ${num_failed} -ne 0 ]]; 
then + { + echo "Experiment ${pslot} Terminated: *FAILED*" + echo "Experiment ${pslot} Terminated with ${num_failed} tasks failed at $(date)" || true + } >> "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + "${GH}" pr edit --repo "${REPO_URL}" "${pr}" --remove-label "CI-${MACHINE_ID^}-Running" --add-label "CI-${MACHINE_ID^}-Failed" + "${GH}" pr comment "${pr}" --repo "${REPO_URL}" --body-file "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + sed -i "/${pr}/d" "${GFS_CI_ROOT}/${pr_list_file}" + fi + if [[ "${num_done}" -eq "${num_cycles}" ]]; then + { + echo "Experiment ${pslot} completed: *SUCCESS*" + echo "Experiment ${pslot} Completed at $(date)" || true + echo -n "with ${num_succeeded} successfully completed jobs" || true + } >> "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + "${GH}" pr comment "${pr}" --repo "${REPO_URL}" --body-file "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + # Remove experiment cases that completed successfully + rm -Rf "${pr_dir}/RUNTESTS/${pslot}" + fi + done +done diff --git a/ci/scripts/clone-build_ci.sh b/ci/scripts/clone-build_ci.sh new file mode 100755 index 00000000000..022cc443784 --- /dev/null +++ b/ci/scripts/clone-build_ci.sh @@ -0,0 +1,122 @@ +#!/bin/bash +set -eux + +##################################################################### +# Usage and arguments for specifying the clone directory +##################################################################### +usage() { + set +x + echo + echo "Usage: $0 -p -d -o -h" + echo + echo " -p PR number to clone and build" + echo " -d Full path of where to clone and build the PR" + echo " -o Full path to output message file detailing results of CI tests" + echo " -h display this message and quit" + echo + exit 1 +} + +################################################################ +while getopts "p:d:o:h" opt; do + case ${opt} in + p) + PR=${OPTARG} + ;; + d) + repodir=${OPTARG} + ;; + o) + outfile=${OPTARG} + ;; + h|\?|:) + usage + ;; + *) + echo "Unrecognized option" + usage + exit + ;; + esac +done + +cd 
"${repodir}" || exit 1 +# clone copy of repo +if [[ -d global-workflow ]]; then + rm -Rf global-workflow +fi + +git clone "${REPO_URL}" +cd global-workflow || exit 1 + +pr_state=$(gh pr view "${PR}" --json state --jq '.state') +if [[ "${pr_state}" != "OPEN" ]]; then + title=$(gh pr view "${PR}" --json title --jq '.title') + echo "PR ${title} is no longer open, state is ${pr_state} ... quitting" + exit 1 +fi + +# checkout pull request +"${GH}" pr checkout "${PR}" --repo "${REPO_URL}" +HOMEgfs="${PWD}" +source "${HOMEgfs}/ush/detect_machine.sh" + +#################################################################### +# start output file +{ + echo "Automated global-workflow Testing Results:" + echo '```' + echo "Machine: ${MACHINE_ID^}" + echo "Start: $(date) on $(hostname)" || true + echo "---------------------------------------------------" +} >> "${outfile}" +###################################################################### + +# get commit hash +commit=$(git log --pretty=format:'%h' -n 1) +echo "${commit}" > "../commit" + +# run checkout script +cd sorc || exit 1 +set +e +./checkout.sh -c -g -u &>> log.checkout +checkout_status=$? +if [[ ${checkout_status} != 0 ]]; then + { + echo "Checkout: *FAILED*" + echo "Checkout: Failed at $(date)" || true + echo "Checkout: see output at ${PWD}/log.checkout" + } >> "${outfile}" + exit "${checkout_status}" +else + { + echo "Checkout: *SUCCESS*" + echo "Checkout: Completed at $(date)" || true + } >> "${outfile}" +fi + +# build full cycle +source "${HOMEgfs}/ush/module-setup.sh" +export BUILD_JOBS=8 +rm -rf log.build +./build_all.sh &>> log.build +build_status=$? 
+ +if [[ ${build_status} != 0 ]]; then + { + echo "Build: *FAILED*" + echo "Build: Failed at $(date)" || true + echo "Build: see output at ${PWD}/log.build" + } >> "${outfile}" + exit "${build_status}" +else + { + echo "Build: *SUCCESS*" + echo "Build: Completed at $(date)" || true + } >> "${outfile}" +fi + +./link_workflow.sh + +echo "check/build/link test completed" +exit "${build_status}" diff --git a/ci/scripts/create_experiment.py b/ci/scripts/create_experiment.py new file mode 100755 index 00000000000..ce95714d486 --- /dev/null +++ b/ci/scripts/create_experiment.py @@ -0,0 +1,108 @@ +#!/usr/bin/env python3 + +""" +Basic python script to create an experiment directory on the fly from a given + +yaml file for the arguments to the two scripts below in ${HOMEgfs}/workflow + +where ${HOMEgfs} is specified within the input yaml file. + + ${HOMEgfs}/workflow/setup_expt.py + ${HOMEgfs}/workflow/setup_xml.py + +The yaml file simply contains the arguments for these two scripts. +After this script runs these two, the user will have an experiment ready for launching + +Output +------ + +Functionally, an experiment is set up as a result of running the two scripts described above +with an error code of 0 upon success. 
+""" + +import sys +import socket +from pathlib import Path + +from pygw.yaml_file import YAMLFile +from pygw.logger import Logger +from pygw.executable import Executable + +from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter + +logger = Logger(level='DEBUG', colored_log=True) + + +def input_args(): + """ + Method to collect user arguments for `create_experiment.py` + + Input + ----- + + A single key-value argument: --yaml + + Description + ----------- + + A full path to a YAML file with the following format with required sections: experiment, arguments + + experiment: + mode: + used to hold the only required positional argument to setup_expt.py + + arguments: + holds all the remaining key-value pairs for all requisite arguments documented for setup_expt.py + Note: the argument pslot is derived from the basename of the yaml file itself + + Returns + ------- + + args: Namespace + + Namespace with the value of the file path to a yaml file from the key yaml + """ + + description = """Single argument as a yaml file containing the + key value pairs as arguments to setup_expt.py + """ + + parser = ArgumentParser(description=description, + formatter_class=ArgumentDefaultsHelpFormatter) + + parser.add_argument('--yaml', help='yaml configuration file per experiment', type=str, required=True) + parser.add_argument('--dir', help='full path to top level of repo of global-workflow', type=str, required=True) + + args = parser.parse_args() + return args + + +if __name__ == '__main__': + + user_inputs = input_args() + setup_expt_args = YAMLFile(path=user_inputs.yaml) + + HOMEgfs = user_inputs.dir + pslot = Path(user_inputs.yaml).stem + mode = setup_expt_args.experiment.mode + + setup_expt_cmd = Executable(Path.absolute(Path.joinpath(Path(HOMEgfs), 'workflow', 'setup_expt.py'))) + setup_expt_cmd.add_default_arg(mode) + + for conf, value in setup_expt_args.arguments.items(): + setup_expt_cmd.add_default_arg(f'--{conf}') + 
setup_expt_cmd.add_default_arg(str(value)) + + setup_expt_cmd.add_default_arg('--pslot') + setup_expt_cmd.add_default_arg(pslot) + + logger.info(f'Run command: {setup_expt_cmd.command}') + setup_expt_cmd(output='stdout_expt', error='stderr_expt') + + setup_xml_cmd = Executable(Path.absolute(Path.joinpath(Path(HOMEgfs), 'workflow', 'setup_xml.py'))) + expdir = Path.absolute(Path.joinpath(Path(setup_expt_args.arguments.expdir), Path(pslot))) + setup_xml_cmd.add_default_arg(str(expdir)) + + logger.info(f'Run command: {setup_xml_cmd.command}') + setup_xml_cmd(output='stdout_setupxml', error='stderr_setupxml') diff --git a/ci/scripts/driver.sh b/ci/scripts/driver.sh new file mode 100755 index 00000000000..0bd90db36c4 --- /dev/null +++ b/ci/scripts/driver.sh @@ -0,0 +1,136 @@ +#!/bin/bash +set -eux + +##################################################################################### +# +# Script description: Top level driver script for checking whether a PR is +# ready for CI regression testing +# +# Abstract: +# +# This script uses GitHub CLI to check for Pull Requests with CI-Ready-${machine} tags on the +# development branch for the global-workflow repo. It then stages test directories per +# PR number and calls clone-build_ci.sh to perform a clone and full build from ${HOMEgfs}/sorc +# of the PR. It is then ready to run a suite of regression tests with various +# configurations with run_tests.py. 
+####################################################################################### + +################################################################# +# TODO using static build for GitHub CLI until fixed in HPC-Stack +################################################################# +export GH=${HOME}/bin/gh +export REPO_URL=${REPO_URL:-"https://github.com/NOAA-EMC/global-workflow.git"} + +################################################################ +# Set up the relative paths to scripts and PS4 for better logging +################################################################ +HOMEgfs="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." >/dev/null 2>&1 && pwd )" +scriptname=$(basename "${BASH_SOURCE[0]}") +echo "Begin ${scriptname} at $(date -u)" || true +export PS4='+ $(basename ${BASH_SOURCE})[${LINENO}]' + +######################################################################### +# Set up runtime environment variables for accounts on supported machines +######################################################################### + +source "${HOMEgfs}/ush/detect_machine.sh" +case ${MACHINE_ID} in + hera | orion) + echo "Running Automated Testing on ${MACHINE_ID}" + source "${HOMEgfs}/ci/platforms/${MACHINE_ID}.sh" + ;; + *) + echo "Unsupported platform. Exiting with error." 
+ exit 1 + ;; +esac + +###################################################### +# setup runtime env for correct python install and git +###################################################### +set +x +source "${HOMEgfs}/ush/module-setup.sh" +module use "${HOMEgfs}/modulefiles" +module load "module_gwsetup.${MACHINE_ID}" +set -x + +############################################################ +# query repo and get list of open PRs with label CI-{machine}-Ready +############################################################ +pr_list_file="open_pr_list" +touch "${GFS_CI_ROOT}/${pr_list_file}" +list=$(${GH} pr list --repo "${REPO_URL}" --label "CI-${MACHINE_ID^}-Ready" --state "open") +list=$(echo "${list}" | awk '{print $1;}' >> "${GFS_CI_ROOT}/${pr_list_file}") + +if [[ -s "${GFS_CI_ROOT}/${pr_list_file}" ]]; then + pr_list=$(cat "${GFS_CI_ROOT}/${pr_list_file}") +else + echo "no PRs to process .. exit" + exit 0 +fi + +############################################################# +# Loop through all open PRs +# Clone, checkout, build, and create a set of cases for each +############################################################# + +for pr in ${pr_list}; do + + "${GH}" pr edit --repo "${REPO_URL}" "${pr}" --remove-label "CI-${MACHINE_ID^}-Ready" --add-label "CI-${MACHINE_ID^}-Building" + echo "Processing Pull Request #${pr}" + pr_dir="${GFS_CI_ROOT}/PR/${pr}" + mkdir -p "${pr_dir}" + # call clone-build_ci to clone and build PR + id=$("${GH}" pr view "${pr}" --repo "${REPO_URL}" --json id --jq '.id') + set +e + "${HOMEgfs}/ci/scripts/clone-build_ci.sh" -p "${pr}" -d "${pr_dir}" -o "${pr_dir}/output_${id}" + ci_status=$? 
+ set -e + if [[ ${ci_status} -eq 0 ]]; then + # setup space to put an experiment + # export RUNTESTS for yaml case files to pickup + export RUNTESTS="${pr_dir}/RUNTESTS" + rm -Rf "${pr_dir:?}/RUNTESTS/"* + + ############################################################# + # loop over every yaml file in ${HOMEgfs}/ci/cases + # and create a run directory for each one for this PR + ############################################################# + for yaml_config in "${HOMEgfs}/ci/cases/"*.yaml; do + pslot=$(basename "${yaml_config}" .yaml) || true + export pslot + set +e + "${HOMEgfs}/ci/scripts/create_experiment.py" --yaml "${HOMEgfs}/ci/cases/${pslot}.yaml" --dir "${pr_dir}/global-workflow" + ci_status=$? + set -e + if [[ ${ci_status} -eq 0 ]]; then + { + echo "Created experiment: *SUCCESS*" + echo "Case setup: Completed at $(date) for experiment ${pslot}" || true + } >> "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + "${GH}" pr edit --repo "${REPO_URL}" "${pr}" --remove-label "CI-${MACHINE_ID^}-Building" --add-label "CI-${MACHINE_ID^}-Running" + else + { + echo "Failed to create experiment: *FAIL* ${pslot}" + echo "Experiment setup: failed at $(date) for experiment ${pslot}" || true + } >> "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + "${GH}" pr edit "${pr}" --repo "${REPO_URL}" --remove-label "CI-${MACHINE_ID^}-Building" --add-label "CI-${MACHINE_ID^}-Failed" + fi + done + + else + { + echo "Failed on cloning and building global-workflow PR: ${pr}" + echo "CI on ${MACHINE_ID^} failed to build on $(date) for repo ${REPO_URL}" || true + } >> "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + "${GH}" pr edit "${pr}" --repo "${REPO_URL}" --remove-label "CI-${MACHINE_ID^}-Building" --add-label "CI-${MACHINE_ID^}-Failed" + fi + "${GH}" pr comment "${pr}" --repo "${REPO_URL}" --body-file "${GFS_CI_ROOT}/PR/${pr}/output_${id}" + +done # looping over each open and labeled PR + +########################################## +# scrub working directory for older files 
+########################################## +# +#find "${GFS_CI_ROOT}/PR/*" -maxdepth 1 -mtime +3 -exec rm -rf {} \; diff --git a/ci/scripts/pygw b/ci/scripts/pygw new file mode 120000 index 00000000000..77d784f6ca5 --- /dev/null +++ b/ci/scripts/pygw @@ -0,0 +1 @@ +../../ush/python/pygw/src/pygw \ No newline at end of file diff --git a/ci/scripts/run_ci.sh b/ci/scripts/run_ci.sh new file mode 100755 index 00000000000..c79ea06e77e --- /dev/null +++ b/ci/scripts/run_ci.sh @@ -0,0 +1,71 @@ +#!/bin/bash +set -eux + +##################################################################################### +# +# Script description: BASH script for checking for cases in a given PR and +# simply running rocotorun on each. This script is intended +# to run from within a cron job in the CI Managers account +# Abstract TODO +##################################################################################### + +HOMEgfs="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." >/dev/null 2>&1 && pwd )" +scriptname=$(basename "${BASH_SOURCE[0]}") +echo "Begin ${scriptname} at $(date -u)" || true +export PS4='+ $(basename ${BASH_SOURCE})[${LINENO}]' + +######################################################################### +# Set up runtime environment variables for accounts on supported machines +######################################################################### + +source "${HOMEgfs}/ush/detect_machine.sh" +case ${MACHINE_ID} in + hera | orion) + echo "Running Automated Testing on ${MACHINE_ID}" + source "${HOMEgfs}/ci/platforms/${MACHINE_ID}.sh" + ;; + *) + echo "Unsupported platform. Exiting with error." 
+ exit 1 + ;; +esac +set +x +source "${HOMEgfs}/ush/module-setup.sh" +module use "${HOMEgfs}/modulefiles" +module load "module_gwsetup.${MACHINE_ID}" +module list +set -eux +rocotorun=$(which rocotorun) +if [[ -n ${rocotorun} ]]; then + echo "rocotorun being used from ${rocotorun}" +else + echo "rocotorun not found on system" + exit 1 +fi + +pr_list_file="open_pr_list" + +if [[ -s "${GFS_CI_ROOT}/${pr_list_file}" ]]; then + pr_list=$(cat "${GFS_CI_ROOT}/${pr_list_file}") +else + echo "no PRs to process .. exit" + exit 0 +fi + +############################################################# +# Loop through all PRs in the PR list and look for experiments in +# the RUNTESTS dir, and for each one run rocotorun on them +############################################################# + +for pr in ${pr_list}; do + echo "Processing Pull Request #${pr} and looking for cases" + pr_dir="${GFS_CI_ROOT}/PR/${pr}" + for cases in "${pr_dir}/RUNTESTS/"*; do + pslot=$(basename "${cases}") + xml="${pr_dir}/RUNTESTS/${pslot}/EXPDIR/${pslot}/${pslot}.xml" + db="${pr_dir}/RUNTESTS/${pslot}/EXPDIR/${pslot}/${pslot}.db" + echo "Running: ${rocotorun} -v 10 -w ${xml} -d ${db}" + "${rocotorun}" -v 10 -w "${xml}" -d "${db}" + done +done + diff --git a/docs/Makefile b/docs/Makefile new file mode 100644 index 00000000000..72173f32a73 --- /dev/null +++ b/docs/Makefile @@ -0,0 +1,25 @@ +# Minimal makefile for Sphinx documentation +# + +# You can set these variables from the command line, and also +# from the environment for the first two. +SPHINXOPTS ?= +SPHINXBUILD ?= sphinx-build +SOURCEDIR = source +BUILDDIR = build + +# Put it first so that "make" without argument is like "make help". +help: + @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) + +# Sphinx doesn't know to clean out the debris from sphinx-gallery +clean: + rm -rf $(BUILDDIR)/* + +.PHONY: help Makefile + +# Catch-all target: route all unknown targets to Sphinx using the new +# "make mode" option. 
$(O) is meant as a shortcut for $(SPHINXOPTS). +%: Makefile + [ -d $(BUILDDIR) ] || mkdir -p $(BUILDDIR) + @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) -w "$(BUILDDIR)/warnings.log" diff --git a/docs/make.bat b/docs/make.bat new file mode 100644 index 00000000000..6247f7e2317 --- /dev/null +++ b/docs/make.bat @@ -0,0 +1,35 @@ +@ECHO OFF + +pushd %~dp0 + +REM Command file for Sphinx documentation + +if "%SPHINXBUILD%" == "" ( + set SPHINXBUILD=sphinx-build +) +set SOURCEDIR=source +set BUILDDIR=build + +if "%1" == "" goto help + +%SPHINXBUILD% >NUL 2>NUL +if errorlevel 9009 ( + echo. + echo.The 'sphinx-build' command was not found. Make sure you have Sphinx + echo.installed, then set the SPHINXBUILD environment variable to point + echo.to the full path of the 'sphinx-build' executable. Alternatively you + echo.may add the Sphinx directory to PATH. + echo. + echo.If you don't have Sphinx installed, grab it from + echo.http://sphinx-doc.org/ + exit /b 1 +) + +%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% +goto end + +:help +%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% + +:end +popd diff --git a/docs/note_fixfield.txt b/docs/note_fixfield.txt index 3b22de5e139..af2539e48a9 100644 --- a/docs/note_fixfield.txt +++ b/docs/note_fixfield.txt @@ -4,6 +4,8 @@ They are saved locally on all platforms Hera: /scratch1/NCEPDEV/global/glopara/fix Orion: /work/noaa/global/glopara/fix +Jet: /mnt/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix +S4: /data/prod/glopara/fix ------------------------------------------------------------------------------ 09/28/2018 diff --git a/docs/requirements.txt b/docs/requirements.txt new file mode 100644 index 00000000000..9c7258463b8 --- /dev/null +++ b/docs/requirements.txt @@ -0,0 +1,2 @@ +sphinxcontrib-bibtex +sphinx_rtd_theme diff --git a/docs/source/_static/GFS_v16_flowchart.png b/docs/source/_static/GFS_v16_flowchart.png new file mode 100644 index 
0000000000000000000000000000000000000000..963c915768cb92c747b13cb870ff5f2d213604da
GIT binary patch
literal 121683
[base85-encoded binary data for docs/source/_static/GFS_v16_flowchart.png omitted]
zDs{4(E7SGXru(N(SOqVMm>zBfiK}*llk0qY@#T-%$!nmttJu`PEq0jKs+n7I^7Esp zX}5XkzplAZwKD5zGuWoIB;L`Yl(A~ZJKZB=)9LMB+)@f$#j8;MVmtGWm6Nwcy0TtY zN8{`n#zPFm)C+{)mYaSt|5g(PSQ)f2-<*~fb6OH{b8hiZ4_#ws8)zK*y(oOv?K5U| zsh&kz*FvPHs$|8&i&Tf~du>!ACr@4gMy5`Fap%tsAwHozu!?`L3W+aHx>H@;Hnsz) ze;Xg6G`)e*HU^wzYz5t!c2FZ0;<)MtyUnCAH26zvFT1iab4h5ZbIs;cV}h*34GEVo zGfH)+ujXNUs(uAtCe#K7#}0RUp$H%I8pm29(HXLat-@2yk8A=mn(aT!ziSuQf_%*P zZz}GeBN=S-wz$;=j(!P_|~oJ;7}`R31oj1L&Pz zomgZtPFk>C5A`fL5l1^Aiy=q?Pd!nJU6@u;--uF7Oea^@^r=M3N4u+_D3-+hDx|b} zx|N~|u8n{IMh_Tqad-1L?Z{!co_A#!dEOXBeLXK{FVBV^nuG0J*Om%WjiR7(x+a%P z{9($}&bi=33j4)n^_+Z!O>D3nq_U@u{0)eqH)B&t71dg)CwAXE`m+ zgjlS0c!XikE_7b8^lAY;$&+$XwK$d4U39&?T7sz-=HBuHPLhDIdeO%;^$om=7%-XY zU~LfSj<&>bH;tR3BdzI#?n))Vg={+Fw*Zp4FDaGp%g~VZTqR)qmKkUbNp9A5$!G81 zcZ&U4a%V!qyex}=W=lVeGx<$NMucbxQL2;3%>L!B9qOP~(%@L9jBGKu(qwm24%lUK z7>zc?>Co@(R?lCuM8t7vM5XZVlUA%Grv0kYT&E7z>k1cz>FSnyP;-DNI+*Q3%N`J!ZyL$A!lRNLxW3Bj%q{8M2$^l7_f{z&DE_g7M{KZ4q zTym@Esibx?lKFMK=?J_IVWCX<090z)twCxk4cJ35Pf$5FO<$p16nUY6$9t+phaV*; zO@$g52`|3WLiY6Xe-3Ez=}8AcejP_%B7y_b`!iN;DOxl4p~K`t_{)*}ps8HohTiA@ zGF-r#P2DhJda3P!J|TO3}_t5&amJu}Zv;;p2Uc z=ax)LEdOl=BMh=FiM8s3!yYldcMd$`ue$61g5Ic_J{kjY{)VGlz`{l0?K)<;T2VFK zv&jh~k0K6&K&nkgRggx7<~}K-T@o2C&ElQD4}-3D+;-I=YTR71@PJr)J0IpYQWM9AVA5Kxx*{^m^!p zXrj#xbN)SPSGX_YGbrL|cNXATjyR$E-GTU}a}>9PwsNH@mWt*+>}^#P;DbY(1dvol zz&>$nj<3w8c(~@6i*1e}w=mqAeCacG#QRrjKdV_uNhHbhR0TKUo(S)yfYNY~cz80`8gKEXNjwjnL_UgPZ5`{spQ z1|}ty_ctVwj<<=&@7!abPwp{(ipKh#2Egjb0t-H)ckoKhOI%@ax}c2BgOfwEm50k1 ze5~*c$o;jqrXeRLW>HjWhXKs3Q=3_4{pEuGJlV+gEn5Jrg&RX*W@rhelhYE3Y4Y#S zeqVmMbj8HEkoA2R@XHycig~mDKi@paJt(s^=ot%HxG?yyU@J_*7=X-5x=oHHStyBK zB!0}w;Pi6BW-_^;?vyE`;2}S#dF@XBfWzjY6?fx`A`>x{=TPkMx1y(cc04W7L?>5Z z-`R&9jt|?cr?TKW!~@I9Aj^Yp?LH%ftP_UcO%D>N-gQKw(q zZ5sh46cbv)@poaN#ucU(HdRRcheQ6|EITcod2lW>JmOws`G|bp78ePWL3QxOYEnF{ zVewu!d;BM_hU@cJOdOEwvw?EPfy%ji)P_;0^&C#dKIZ{WPzxi&CcE~Wr+IFae7+ZI zG33YQ)ftl8asDERzK|5raOo7FQdo!D^%bH$qmVPIw)=vur2K=3gf`J-?$_SA&3K`? 
z@qXf~Sk3`6TklQ_=bVt&g}!rqaJTagShz6Tq&%f#^Ijr`S-Ct*L4>8c9i88*zE1&# z1}W+{vvAeVt+e!Aw57IU0hk8^I1XEXPIk(#KHd1~aaRyigF0Y2eE*dh*rP!{?^%q3 z`ty!%&Ep+fI#9|f38&|x#M1{2m=w%cdnFimEOQ>3e`tR`QRtSjT3E{kYFS$Onb%M1 z89_BPgdf^QYtKPqX*Hg2@Im%F?9^UY;V>mHlkTowt13oiXIka}7j59&@1Vfa=2z2CU-ZQgrx--T|0%}>j$gsBFZ@0xGv zYYJOq6spl$)i1KC$d5|LA4zzd_NfV{&zCXwQ#-76SbV0Oz~99D>H%Lj60r&A<~iUFYjc`?jFX)DhL%x~_qa~v z$_p0C*xLwRZW)uC5!1@|-Xq{at3h;vAVs~w86_+0^jaC?DYK|M)o{nhN1nYK*G*I{ z9Ic=jE#*93s~wh8F_q09@6b_Tm2A+R>28=sLC zOC!~w?{@<*U`@c3Xyms{eJdpx8&YBOgCvgQfUoK2nO==cFbNltLoAzVc##52+&}ZC z3)%Pkjih>hm2E#X@+-q?(T45bx8@>!OMGQf=dUD%{$r`dAHUR5REk+@zUrnTxX}r< zOXc(8W`xIaE0JdVg?vaGhvfi_YA8zKzFrT^Ft3^jCYZ#~0P+j5fvM}@DEXS32GdQE zrF!PQMrpz5wpRCwd(Rt#wC;wF-w`^9oA_g^_w9ZaXbpFGhAZrQjWpYP-xBLC<;Fdmv^>8~?Y;IbwF~HXgSYoh9~;c1UbK_p9h%L! zNPfgL5?}chgcO;6N!2LPf?kLjt(9OleMHqr&?q~^&$YGh;Tp+A>DHAc6746 zE@N%x?nO4)|B?cv0!>MR;iUBu@cN3$VD71M>$>1W%pfL@X2E@4ufwcwgFSGLn07qE zc1}Fx3bjnZ1mIBpC|=Y#ve>Fo#A94jBoz>0x;Y+xf8UXAv2sqd{l!Hi+SXjmJ%a3P z**jZl`zgiJt8#2P5tR-Hk7N(;wZZ3tw$m@aE1)>6Cy34GeH*)8PFx4sVcxXm6o&_3TxPENRm9dk?P0< zp?qYb0wa+;P}t|a4%ZeSKS(}Ar?hz}-=#Wa39UY>L?PX5D(>3gI5tfLJ`5lE-+~S^ zs!e!F7g{zsQaNvDlJzRqfcbTtgNAn;uE)V|AR{3XRrSIc`+@C7xm`4We4@7f8Lb~>liaegEsM`u5tk9)BaZ)-DLp_ z-Oi@L)x%Z&`+?!}Ay}A$B+ljOyy19(4w++}bP(=VRy-FuLh{^#nO3wcMJG^|c7fNl zL^i{qBW!Mq9*zI$tjGPAW5W}?7Wv4XrI$Ih2CkEb;{_Svr$ZRgCV5lXM+1B$A2Ys- z=}$@-&~2hAD}t#NJ2|e6`%>8+m^kl*Bu%QchkW8~TKH%O>j}>I4w!=YoZ=b1f08K= z;DKyfzP$@|Qi;o$NwfIyvR|iq;;I*PWK|EZ+ckDq;j~G7e5%+%@%G$n+2!N^L^F0; zg87?@zX959r%Eb>8(^oI!DsW%11_f+c-4HVCDbR0u2W0U|Bwex8pgumaLa0pOvCnB zOV?V`EVo(-0|%imoms`MJ!)Pf-UXEmsfi1JJaY2pHIiWdWtq<#RY62J# zyknl4i2gY-jFnNT$(?bp0RXl)yeZ_D`;IzauQgS$a8CLz=%TAPJLZAdPi zbkd~w!^|NrWeUx~yUyDCb4F6RdIQiFjal?vG5fJTBc7+1@SOO3Oke&x9@QDY-K=bH z!Ps)zp!GfPS9!P@iEeAuH9sV8A6INP#6n&%=u+3i||G&K?(OlZ$bl~($x4@Wfxe>wZkJ6W_yYM zUZb<0qzVZ&xRQt&eg-??hrU+S^3EF)RL0$g%{0EFs$18XTLvQ7I&TzXUuIne(0$3$ zW=e0D`pnNUK5(`QlnC`;({vmNvZDssYXA`6HM4rAGh+6p{%8iA!m4EuyGHKUq=`R5 
zYdBGcXipkox-S^%>DvdUM7fKzhF+cJBnna_U%vgnGsarbmo+*AKar}0cLW9pzxUfi z@jjubAO#)qxKyn<$ER3pi?UT#3R&0^D=~s{UJK@auwyH(2tD5`872wTR;fV8#qS{g zCeXL@`;IBf5?U)SF{n@PQ`Oyhkq39Gj6oew{92sHL@8mK&t9|E5-R5D$)z=2bZh6; zXWXi{S-f)8u194bslCju{wVYHyBkuC^rTTwe^Y(YHFw_s!Up4Fup4 zyfhqJF6Hjor`wgZD}>dLcX4Tby%}<8szCP@;Wds2GuFb13#CoJl=PgFkf?5$q*JBx zEzlNJ`_YMF;YeqA{5+&({RJZ~>I_+`xjwvlx`U@slqx}y?DapIi1Y(Cu4X%xA+6AK zeNBRID8!f!cC5C0Z)g~+5q<5%M32Q6sXT0a$kb>1hPTDp22 z`;?WUR@-%+gy(jZ;p>@inESK^C%9>ji&jE5y)D~v4Y{}Lr#6h#Xn|+LDYS&R%0$$b zO{=Wv9*i2^?C7zd|Lwv|g5%Y4w;H$7W1nL@L!Kf@m?u;*(f{M~Qjerg+UuQKGvLtD z@8~~W?EecnjQU2nXIrUAZkQq}KnhRu8s2HQohWP|(V$tVOvYT_hIlzf;ls7dlVJ)p z1~$H#2s|HL)THrdi1CvoAiY5YPYXdP_DP<*4O;d4hx$idnJ;Jmi~O*rMe)^!4%`VC zE>`BNFgbm~6hXYSc~jUDJ?7fFdx>(2F~hCBWxJe+!`XO2t=_r~>{A+_^P|{Q5Melw z2Uag8MB!|yECEZMq@)D53geCtw&o&r-WZ5w$Q6iIr{yB|a^>j!&RQoYG&@>>eNb+9 zLk7&<_$l-1)e7x$j+1F?Gde<8!(PiT4|a{Oj^Eq6m`wl4H}RU8^V~|5SdYvCT>sSSe`rX4PxY)bUl?CFPOkmPk&49cKH%#>QMgr zfn&wCcZ}=CoID*|JwtGF$``-(Bvm@~RJYD2zSFpAaA>PRI<3SfC)^}9Pu*N#bJU!P zwbwwo?n*x+SxV|(*sc_{!BMtR@zY+{mN(k@{b5xl@|vHPKp4-D{uJ73 z#1c=FQx-CoU0G-;zdd(38S%~=*I z02#cAbwDVbzNtEWU}Aly<1v03)P)yicxl82=YVx6q zFwr~HbfC(f!kQ>@Zk3j33PVu$3xa5l$jx1q90*h75Haa*_ykPFbi008E0$(OCWRl< z&m~kIpV^YaS=LDZ3FgeY`20z(>K-(7#qGJlUQAE8@T3vG4iSS_ufMIFM~N1u&!&!{ z+{K*36-XbR7VHy&xHN3(Z$b}Ib7YvChT-{ke6Wz1+cQk%n*#jvwi);QJ&5)iysOgy z?9VnA*~r1$FL~b zA<2p1dL+s(puj7gL+kCH%nT=GQ?2ARV zV#$}{$H{!%#kBwT7K7M7oXLt)`5bljfG`w3!1~tGG90LW{~HH1pyAuyZIRarlP@0W zDraO{kMwyt0asxyLcdxJV^=#wokN?dPL_Oc0JS8=FTLv^!S&+Z5*0AUm?qZB zZ|nqH0Tdx%P%5Zxizr|8pe-E&DKx&0?q- z0{HH=Vr8>fju&158~o8xB%Q`L**+o86k+eV%39nRQ}dP=`usS(oi+_MZv5F=$gb;a z1z3s@*}*mh8a#(()7aQQ?g_GuEV_y!oH^ebNEK;Lm=SllyydpO6W{jTfHJ@FHOK8h z(2;*yr4u~LSBeCVeSPH)+kq?0d$_9IdwYI?2$SNRKhC)BECL%%Ts7O&i1yy?UjOdGJN)EG zC!KLzf{T4H4g-BcS&=S4qxoJ#AeZyQ?mt;~p-dLHkZ(KQf{__Qn7p7BS zXkqBFVxK1)s=f6dueZ8+pT@xNPPM8rv64h}EEQWAny(!TM_l2Vt}gA$#)OIAA>PET zw#d`3_xYcM&VA;#_%ejULCO9_6|NISlscNo+31@8BGmZ&3WT4Dtxj)dd=$@tNKy7( 
z;Au)==U7gWv`^d@QFW8tD?wn*@BfC`X$vSPnAPik^7lGEXI(9wweK}cxtN;R!Q5&I z{+M%g0vl}ltyZWtAS|i+wS3!-kh8;@PGdPeM|D<%3fz*5B8+PKIZoAKGhwb(u~#FV z>R(!iT*JJWd4OTDVv^ja^Y>cgt&ID9LbOwS1G;~@5F`|jsaOd4 zF4NGzchnS2xQca!$m&^c6Lo(VS|SjOg8I9T+Xgm0S1h( z21>n&1&4$cOgwLT)SsHk^hIXuMDVa>((YDq(izC6r@2q1d(o~bCc5D4T5xuNT+mn zcO%ju-Q6XyY53MgJ;(FLz4!Ysd#*L-7-Nnn#!|&|P`}*aj(=|O{R@p6=M*Y8h-c>_ zf5J3=W7f1Uu1K!egUMlgRWkI@`!Oc$^@P@B=##VQD))Mh>mb_Ql}WU%Lmd`(I?jWL zKe}VUVS!;Zn^B?-7uM^{n+M$`F<`&~4hpb|F3vn}R)8cf2w`SSK;RMo8A zj}AO4YY?wu{Vn#-9eTyqQWL>@*~PKUS%v?x`{*-nw*qX4Fx_{zeGf2&Bb*VP#1nt{ z@T3>Y#6G2+?2?>Wgo{e%Ji*xwoMFDjt4GM|j2zzxAJ~E5B-27QGv4u`?x!L8ae3Ze z5Q4{)i6%2+l4-WHWK#gO<3WVUAKO@i=a5Cm!$kuGBZTD;c#O*AqHCSf&4BiUk3lp@ z_(cm-xymk*&ZIBQ4D&a|G#v*_QMMK*bh;1rcD*aJddeWzl(#>3ksz_z<}od3*5azw zZrj!>vh`wT->TUYC49UcAt_;|tKe<3YVxt$pr&;sk>m2nPAH&+UKgE(>f!A<_0dmI zV>2?h-Qe@GJzWMwpD9FIL~o{++!kuRZ;v2)+`a061~Uew$JX@;*ln6IlbUX(JLQ3? zo?a*)Yx#}ZrJ6V$U&9MR*PCxnK#;i_7{=J)HN)UDBo#DzZ8ac83lY`dS4m;aK1E^J zx3_2#(R8gKSMyTL@W*n}bQAqT#SaC1$~Hs6X^0R24*wp&^z}{wVRyrREgnrT1`5NQ z>GyY-zPh+b`4IB$yD1g92}j@ttA!@>s)vvY)|hye4qBD#?V4*dNpA=ByM|nQdk-XR zsv0G=N<{VG+C%IzrB;#iS?KXgEE6RF;v(qorkmYQW}Fl;hWOX46-r`1hwxjqy{2eL z<#fN4sMg5ezw1CeOYenpt$%%Dkl$UZWp$p}9D3eE`5gATpF?#QJX1aygadABnb1n< zeB2Cq^kJhExDT#ym17{J7#`}Ge6M=ge`lRoA zl`*RGqIl zl}htord0Nn&UG-oYt$rT-66e<*{ZeRMA0d+bQZU9wXEMec`C#9-R_VmDS|tqc@MvIF(wvj#dzoDEt(6wmylc!Jo%ymB35j(kgyq%VkiR zyR8^4o9re_*wXFnVfhoOzBln&D_Ofrxsx*EPwftcD}Zr6Xr)MoV+0;Hy0P~PbUsVs zktDVGN8H@&&wrQkabzqmPmU2K(H?SDbHo=dAM}qYHpA!T3Q)E0h+FO?xM-QXvY9bX!=i?P;rtxX4&<-gOi_-!Guj*O|w z^jOWxQgmA+30KP?sY#|77hgtN(#$x}HYw}nHe4R#6bcXXwpNygBVDO#GP(bey~R07&=hxJ5Mkytb(*RE@wDYTbIC&wY2 zP?XMN?xK@m!Da%T+*bozowza(H~WY7p*&mPa>*#2;IpQ6zi_~GgLho&=mfT2 zwi>-I{nA0cUev482|5E+)dI7rSx)qq`s^Fw zai%t=H5T@{!Fb2rwP;MiOvT7x(BiiJp1Ck`+8JJlsn>!#$Hb&AzFo%roMyevN`I0m z0Olh_|2Twu0r#(r{+DoBX#m1SgA>TsEGC;?*W;;@Tuot3T~0e_a%Xsiw{PYg z4wb6_k51Q7YQO{(1RWaAG|i#JXrpTr9Q3jL8^VFu4*u)r#{Rvz29)$$yQE%jnzks z4QZW?GEnFoz7eu%z~S>>Je`=VGBZ9q 
z*|u(3?yNm3;V&U2{bw-82ax>PWC6Ao!9Om*Bk!l9;Gap@^B{VjH9jg|-%{}d>6kP^ z6UX#zk2cU27A?`Lmm(q0U82i!`a$dQYdc{TJZ=OWRw2Ak=irm*f3!9j&^2WGXT#ES zCqMm`b9O7U;8e_XhbX@_s;1-E|bjOC!NXxL=K;cMIk?y-4f{H60N zfN2D1)`mX|$Nw0!f(_4Kn+Hs$9#e8@T(k%&6^6b1Tx~JRZq|Q07z*|r($>*!6_06@RS`%aM9@cZxHbl!GtH>^W^ zLj$k_Sa<1X9CkD6B76cL1E4*2nLqbmH45pT7(p7|KcU6FjQvNu5eJY^1b9y5&I*NU zt>yPM2=uslxVeyA@5wI9&82TYVGG2-@F^GEQ#S z;%(ZRgT5HZoxso6wAq8@$ z+{=?kJ_F1@5QBtJm?o>}f!QRB`Eo^rqF4NGUM-bQ1kaMI)k-Iac>0IUeoDE=QL7?MDPxf>rn8` zKQK1lkc<};Hfoc!SnV+*R|j)-IH8WZZ6(Kq+QEqfvlDY=I{4MsWWkO-d&$}+k}>(u z+e+}GlM!0T)h0b_l+yJ8M~#b9#RDC$XDOZ*2}|x*=k~jubl{B?gG8r)Vt{*ti3*na zE;&8KQ}gZ(Rh>A+Cxj%Q3+%t|@%H-PLC!Y&W1rVy`#dbIY1q|y{5l_n|09Y+(;)(_ zYF&ZT!hE^`{t}d60JA5)tpTQNX6hO1cq{J$*3`3??+QjM*y>j=H(nme#5n~bw!0a* z-?D-9d@b0*K@<#3^R#NfN0)j!es|vQZK4{e*fGg+L>nyv&;RNHKMN#nQih6=+M%tZ zZ;Js^DfzD!FundUb&g^xtpbDIEFjjTHl1)k{nT9(>{^?DsXM6N+-T|pf2?)n5oP-Q zT|&xrF($+N+Rw*E0vXAp^s;x)U8G9KK9WWwl~^)=^Z(L+?g>m%@h;uVHO(O}Bi5{W zQ6NsTWUiq}3a9aCgwV49U!XQGcy1J(6VvlwYp-L||JFv~ha1WpQ=xCSb%fS`S7k^@ z{~Z+j-M@ok22NaH4>mEZV#{COkJ&SNGxb z7*>PzKkd6L-Py{oDuTM_{Q<=aVuVlXEh2)51){uF+YT=vVgZUL##5E|brSuT%Qp~z zo1e5ggCD9M+96?MNuP)nx$T-V6RYjDjdx#WeB1z%a_T z5QnkQk5pSGPr$KC^@~K%hIA0NPhh2lba$}jRg${ELA`+4Tc>y5W$^H!K>7BqY=^|) zW~KS}yhv6N-ND(hD?a-boiIivvFN43aI(iz70C;L>>OZ4#~EdYor3qges)~kBJx3r z+{tEebpTh3OZ;P?(b-@_z^8r1e7>9z46Z}|?$1^Bd-`feW&A4Yd?fp@TKcX;oR~xji zFT3l2@b)~0vBSQ!Bf?Iaaui>bbvy7YcC?#}z*TaNd)e+PE*x?6wH2T;tvm<#grk$002Q-jA7Bc^I)G)#yLNo97Sc~CEy z2{5m52gL7Ao>N#X43`TPGvlU7oF*=YhI_WhlR7p$7hPf3oB~YJOv_Jw^og|J6elO$ zMbjIoRQ$GK5;8I5 z6!-|u5dRzV3_OZvAX%P2fk(IKCH1@}zn5_-3rhft8%yAtN8&FmB%t~6(RaRc*b%j^c>3CE;1U4)a5qC>@>R3 zY;B!rs!`TM`-6|5zo$XHg{vBh?R@P9Cb&=iZsVac>fxxR$Kd!uO;TaqHYNCL@ZF0a zE{nCYlkw)&0S#3e_2Dlsx~fGI6+UilXk+{lDIY0EU;z|Z@QX9rm+sWiu$P@z@WP2a znt$n+Wcp!CJO4=m8vVfyK73(!k3JxX92LMcdNlq)r~fl=vQmk(omb%Pw-t10m6HdW zE)3EE#SeHJv*?1__$UVXaeDRA<<6-O$Bf{P)`ofRQ$lAxS->4QIY2~azaj=ZTeR}9 zK#9cFQus}k23*w>w`?NI+qDEI&CiFciluUj27B(a&(+IXU3V6Umf-9&?i?s0ew`Gi 
zbOwUHSD=#$2A@w9O(gw55@c3tsj<#q3|F+8|7`31 zq<9o`7t>~OGy8EME>+4Of%3JBp~>-@j22KHV{=dTlj;>;Kb43X0QX^cG0$Z@i<~>- zWkg{_+4`<3Ir)yZ9im&V_2VQs1aqG8-(cpyDJ|n1FI8uBSn8xzo!EEp+5Bw`^{w`Ms%G;5Z|$Ml;HlL$j=Ex59?L-Y z1zIiflf&(!XWJS}(VCZ+XxRgHayni!Rcm{hczwLOtv$5&bc`W}zE?1%pg|=&gP4$; zcJkrLR%7Q~owy_=MVAcDf%CT^=6z-q$u8ztjfP~_J*=a;wNjF)FOHs=-N~K@$_^U- z7R%?VD~17jBT!`Seb(f}-N_f-p27zVVH69l=nagw{#$%k4Yg?q7e~rX`sgH=z!}oK zQjHDYqOcJl*=oKK18QoTFEFu(cyD7b&=OA+1m+{OX%DU@53vGE*53~T(+g9&v>AY* z8bsVaF2K~h+_2|?Gy{AYAenU%EB7pifSeeRf#)Z0je6!MI@yJ6@}KQ?&+KRD>;kUp zkQa{(fbjLaiRwfn{+bX2`lfq#5gx}*`j7gr-!Jbo`7y3AG!;kKKa0{mc^~OxTV_Dao5k3dWa(cd}+WS5g z6TGs=bB0ko$JNq4c+@01kqnrQ;(|MOlHFyGzYF97C0YE&zwG5%llEuEsl}VO2SWO9 zL@zD*fw6`_Bj7@O8gx3#I$mQhZB43JpcG8sw58ZC+tHgrYhiWpcKcgD07CJ)@VoOT z%z;5(^#Q;@4swbIfKsh7M%C?-XZ@CT*j0GKP7&gsC}e^x3wD9%9d!r*`Q$_YsECxy z&>JW~w`+*R)OsGmT?erFH5MbE#Qhn-iIlM*eB~$>krb-slU)v#4g!ArCaOK>*`Z}l zDHf0{N=rqvpDY%S3L1gSfzVLuIN z8>odaNk9uij#%Vrj%zxT9>EzS5ko(?Zsyr>_~?kqd5ZLGJ|kfy*nw}Kxt<#*?>mMo zeD6769u_{q@3o3u409JPfiQM6-1bCvRi=4F)%%|gZrgm{>k}=u`#Wp0RZW$51Ej!T zbG46Qr8-~-2mqRE;L&fn0?1!%t_O7sxfHW{Du*V+K_nL81*~<_VHRse*Vnr<{3Yx3 zGeIk5hAM#R?uo~C=s^cFie=Ze7IAmp;2E(~IK4`pZ$GF)>4RR5-9W!=O|Y)7rAyy) zy%#+vwoHik133dk(+Hfv7^BdeU3Wu<9j?CQPjD0ukMpjAR8;Nns4Oh^;CA*|+Z+MB zkkGiN!w!r!jss8<6a6!dd>H-oO&Srp>p|6lxmGZgA1`AKn2UlxgBmc|1&l(5LmEmV z%wVs(h-%S^9d?=8(|?ksj|07?coK(H>K%RnFd%tM6v2y_!=KJv>ykS!UGs|0N)P+yGE9+|S$MKbQ#u+Ug z5VCrz4;;i!?8AS!SmAkS5FEUGy&l$LOCk_bPC|99NLG< zd7jvj(7rAEjK~PgeckhL?proE!ko8YUcDgKL7uHXIxfr&1kO9wIz^aa4Mp|Gs*XnXd64(`6^uU*`l8DLhb>$Os;*Iu}`OGFGBZo`QGOUY?)*#tB-@WG4j4=MG_luE$B0HW!9 zX^cAi2~B!7L3b%aX|k3MIS7`xpIuCQqG^qf8uNaEX=pCrb?aX`{S^M?aY6zvBdI4kJY}Zm&Rk?P57EOR)wci zFte*tWpKCQw-5kFr(pkM>R1Jsz6)w}Wgc{bOno0c-K6hwZTH&IQ7I(s!e2TK;|nVU z22t4ER~juEWmN5hBMiEan~Azdq970fP;pL&MQgi@n&+if{IU7(ukPmL8I_hHokqM; zreRMx+wmVC;(^I5rSBixTaTLm;|l!QT<{Qf!5JOamaJ(u z0@>XTu-T<6TX%*Bye*l}%h-7}YE~0%am8Vf#6&QJgoPdm2#I@>i@)+t-Av7S{QNP^ zV{&pD@&GSgsAoP8=1%L@8^q^4$21-Li(GCm^0HkF@>~t-He}9DP7*zm&uXjl^YTr% 
z6_m#&a9m)-p!w$-HHxO(3vVU*`Q*=j~1-C=ppKz z%8OO=-87w79fl|}5-lH*%OGn9%z+wN9d?Ii!$^V=7kv;yCt@+t4yG|tT48EhT$A9k z4ojp9ecZ?+(Ht3kc;5;Wt`KX$N9tw21gqWnSLBc^OSLUPhk3bqxi>JR;@g(>cv?ZZ+rOOEa-g5PL%T#xW4|V|ha% z(?M`axk`7e#@hK|He3k-|20Ocpg&B7TP?VG+fsxxw1}h?gU$=NVM0|K34P`}_gAVH zTUHw{zG5q8Fb}l8W8!j{)+-+K6bqgvFlZ29fB3vhtZM3@lp|Yy<2XK$bpwx40DFJd zWQcSNel-4xhxK3+g#rHcGs&}SAHoVF#x|X4yAzis9E7GXb^R5^G}_%;5yxbYM$?%! z)#hIHN@mb_mAW2#!+|-SGcNwNNeAmbsj zErX6(MLGSq&2P)~gmy;ew1((q8)Q=!XgF@wk!(jidzOjy81%M<{1Is*sYnsRU@v%EIwTR4rXfMdm@>9(6;pnHhzPl8qJZeWzXq_fU1!^dn{uS5n}U z8^RqPCQJtoK!4}wQOK1W*M_CUU5&il?ec-M(zS8N+l1ER!B1{Ay8lZA z{%Ol2aD+ms8bQSEW&a=skz8(khqTB-11Myh{N+0ug)eaG3(OyS;8ma!$XYRgg^On< zN8cC=!QSsj-@!N?JO*nYYFdXaB264d)%YTVzEL}5e~fsq<_hs;l0tUUu3dwP?B4WZmcTzu&YWA}|( zZPK!7c(&<={gtv7N$Rzmo+>Px$pfQ)LOl(Khus@MWsE3?!S6_b@0Wj99!YIl|M02@ zQ|qI$0uhcuLKngXaAN=1veh~$_@k_0gO!shjgCR^%y}0pce*3?k*maK8j|v(&ykDr zVzK*l$2Q-kcTjZ7Q1fAGQXjwS3^rY&Q5c$Sx?*^honk?!XP~X15;}^utdH+E^5AO1j!OMzRfP8fk=h}#mV18tTlLVLwO#qG{M!8FO( z@Su+o?7BP`X9pe{?XYYUDk;E?r<$e^qX7)d;HAq6efrG z>Gt&g01jaf) zL$Av8EFC2+hmt`-);1?JA5$wGO)Yo7G-G&@Rl4DI!w@$jeCLiQQ`4C$Khj@NiO|Ur zsCZ`9nFm!QPbC+YCCgEVN~nLGG29awuM(}Z-p>MAnHX3Z3P}MCSvyt6XFa!Us{e;F zg4{*kG!bv|tr_YnW}ID3+^&!LmKIQT>AbMpVN?W@M0WhQ%4>y+Cec&uKi$-l`V?o4 zQF*d$;&c(y!klNPkh8q%3K^dDxM5rNvv1wbQu)+;%l4`>$au*QmE}_X*{{gNYuRQs zN~TG;L5UslHaCI&Z;QnMi@j!;vIK7j=L+Sz9r4V7iN%0|HJ_@APoebSa6(J;+t8e& z=xn~y*lbZvo0BxFbiC10iE8w=VbtLbxb^n4)@${vUUuo9HT=&5 z?m0bxlI&pJoL=*FzvgrwKQJ1dwC(mlP9YXpqcOfY?J}<$Dr?jtutUZU#(}=RIm3h| zMiH6KmKvUA>c2Q!2%Ui5#nsIFjBC-2mfoz`cILQ;==aqb8f6voraNSH=;j^lm5BFu z#~=6 z8*HL zbeVFQK|7(u&o%36td!edOa&)_W8s+eaJWLp2S*KWq!^|T-lt>d>I82uJ1Xa>MXp=A z%%y7K1G#kGhD>qe?{W%bH^4RRvKY(|R5Ou>lPbwDJd4?+!8NsjI5b;3ouy*Zp!9H| z9k?jG;jG@Xqt+d5DLUm93~urjV9tDpaT@5!Xz$lYXE8X*l0QjMaTi5+1c(1+Rs%nRx1iMp3Eo z#qUtYFve-Yp%LbL++QPF8?=U6_4R?-UB~C#H?toNK~TmNk!T!|FSRSGrQ)#KqStz3 zR)@97U}o|LEqGO+TqA!(={Cy`KF!kTeBsAyhqT|U56_F8M_*93m9&#{^~!v7RQIBD z>zi*lft^&5o+6YF4lN8YkTWcz)ME@Ca!7S25z;$Z87W_KgS~>ollxfnK5>J^BL~vb 
zgpZ9bPx#>ZDX&W9Jxid#l(Qn!?fo|GRXr%amCPbd2w?}$m#KmK7q!+QP@s49g~J1; z^KO2a3I}tqqGnLSwwZsUYv_^c&*VYGZQ*MvN=o$JOTjOm6o=aa`bmyRP3`a1uYKis zFL&g3zmoyk8kwt^S{qv7pRnd<9!W6rZe^GwnI7ye;O5SdAYv+WYcRxG8=wiDNZn>^ z2L!4%^e*h~bPx69ZQxN*lEO!Z`+xyP)(1Iim~UHTZdzeDnd2_en5rYdR^_uL<}U=f z&3WBBeLrhyTs<)GxAhyoDZbVpVylHm);@-ueF=JrDOI-i6;&6lgPcI!*Npb0UWL3q z)6>}un>8DFBBs*EDc=n_Nbf5PC(o}RturTf+m)k{2hE_TPZ1an2f;4jS4cBNpg~{n z6hmL0jIJ)0i)~jbJ-ly4{_`r2Wo6tBu~)F}o9kO`=tO+nmZ|qYHzKj*NVC4{%^Kl5 z;aT7Vmsvb1cl81bt4O|>E&R9msRS$)gTy_YSgi@#agVIJRR48L{?R0Fqu<{Pi6k|u z7L)98t<*%t+??;X*~c8fMH=Z0kZwv~Wfcba>6LCGVGDAgQTYGv@PPkSV`gEz{v z>2?=EtUIlO@JTCv!NSrHl4tuMQfQWbMl%WyB18TP|!D=AQ z2>!<6i&$cwQv{&pD1jP4o2BD0OXypy2qkm1<+kUpj;oRNIKpDp>5+#oTrmWGm$C5? zW|egOgUzjM*pnsyF8k&Nc+R{Cd1Q_i-v6&9{6aaJIU)xdukEah8yeE=rV_7ET)K;a z?*D8r7*RVwYm9&H!@=4;KAJ@M*W~#}*MA?;?F-OI#18e?qP)Mx{!=FY6WIa5w<}%qR~F7zl&<>u zGkp1!8650dy=}u#$a_*KaCh$8hPY_ijQVCU856U?)zJ&&K2`&cnSk;q;;OV_8PJ=M zk!c0XhA8i5c-uvY`}4#IBLrsR??vM90fyV!YE>&I!*b5@vuiSd)hfZib&3R2qpiA`Iz};hochSQ)KX_wGje z9gDDcJ#QqJ9PYaCIxqAF6Inl;asN)MUFqxZA|SgdCqgDW6m$L`rLBybKqg%46R|Lm zT341#R0)z1USl!a&nGR{MTIxvuQVM`rl9J`w@MS`7tFfijG3s2XM6z~-HO94bQJFA zGr$>o-Zj2mr1XI+5r;NR?EIH9_j7aOW1KE3JvT!Ks$%AGEnZ%tU~DMM)=y!~@7zPN zOW=<=j^v3{8=M{0iSE3dIYMJTw_|#3C?I?ZiMf; zC|bIHFUeiZ3SB8Z3N^M9DefYte`RaUy>|J*Ds`6k6K{KDBPx1N^kMC5psf%eI?2m8 z+=5B5ZvGp^^CyFt;3Y$uj#{V3&c6xK!Uc`jxLN^1D zLzoHkjbScYHt}MzpkZCs?R9^Y_~%4Gfc^%WbYA9*Yiy75=wAp_&3KL`^FQQ4ZMKPT zWhD8R0-U&be#Jj>$Sj{L7D4Vpp;#xyRcC%7@)BkI3cQ~^C2lhS*M4%hBK^M0Ueh(1 z$d<%tEGxaq0$&4ndsXDh@n9~6Fl9vUx_SL52G!UBW>&Hi8XNhCn7hLYt}g4TSVTY}8?%^3?VOKoJMVb=2%!U?giY^Gx}HG{Q#9@ZW+ z?y~s1s<{&OPt?}fwjXT{N3I9iN?AvHeA?uDbMk0&b6D%4h=2O10ba+KY4S*vxLM&l zbJ*hAvDDx))L$%yHc<{vfa0u*An-4+_3LkrykVNZNL0qdH`^`#Zk&GBep~O_>UpYt zKx@T0y32?|7VgI4t=yo;gttKSxzt^@uO33JF3vJxGwFq=D1u5R(RebjGj1t`9L*z7t_Q zt=GEaS3CN8d}&LkRV4`>($uGkzL&8fbDLC`hT2)yxc zt=f%_dOa`Wfe({X&#sE?tT5sR9Lr_)4mDLbW2Tp#lUbBos8X8+om+djWGA~^e!tCs z)Z*;yyxcU{46emreNQy~&{@;&ZfuaEWmv<0hwBUbTUX{`+rtNH?mv`$_`cx+w-x^# 
zaVoRkZmDB2Gim;Z(aoRtEIbHwzw;S8XZlFEEg=se)t<^|Jgu;1q97q=#{v(XjB&P0 zx=A20WuzgLlMK+=fd7@TF3E)iOBPD+0sd$vb0fh-6ZeDN@K9Y6A-%Ic;?zJp=VF(4!&>Zlc{@KqX(3OEYJR?3o$6bq8BFlI4n zxvt19p+ELmLQ`MT2r{6yU?gT&O4KtcFOhD~a!JJ3ZaUOj`%dVnN4ge|X!w2dD_IWOS1SsCY<|1j&KcLlaY>f&{RI4d$ zGkqT7tcz3cU=!3f$&V>XHp`H;`n6JQTfxQ{kM(J=k|m~T{9zZH%O!ft%*Yh-!6^R% zlg@e}^XQZ+6blxl{Yrf0K%T&AO3z*bW53OYWnp+q^1SUcy|O^!!cEKQrG34uTh_#r zqX$f0ULY#?112l)m5iuC28a2hrxSBN@2l`d}t`?mHE#u5cK9#`WPM#T*6U zSt|bKn%h1-XY@8P=#bk_4QM-_OG=e4^mBXj-460sMU4PUuPQrcIV*(@)LGn<3mT;& z8WDw~HfZOC%Cn2_`+cOZLv;Ju_TS(7?yjVTk8F5*u$|%{K?vqO9{K zyd^X1F$-8aD!+FRDoJ5>r_aLOsAVyNS;O^)95c(!_ZxZN@)i5egibrHs^7}P#S2HH z=5(p4AG!+8OJ$-R?u+d2z% zd8%F7J@N@!Xk)FKFGuiYIszaH6j0V;KW7$|&dcvrK4e4G^pyj5Y~{E?OmtoZ4U+x+5yBDefyl3fmSnhcQ5Cpn^r5yy_hADHu}|5zW7I^ zhoB~~PRyx55k;hLVV&%yW#hnx6j2mB`h-ss&11jP>5JkY1AYNIkzArgFt%d2A$7Y9 zuoAqpn+R-v$Z=vl$nE%XGoAEm)w7KrbT?K66-_ZkF7+L4CD+?YLER~oO_WNm9d}Gv zjaH#h{FtNBVn`D7OcC157x>QJDT$h~HHSH2>L5RdAF!YIa!4{(OM@Rd8wBr-oVqG# zYpp)}fodsXO~wTa1tizj;0S6I_bdYeTMq`_2R3#@fwVV8&$waOvQVG>a^>4R>8R9-x1oT7=J)RZ}uGmt)^I^|6FX#vK1Ff*?=s{Kh z33emfP@v^cJ@#rGv&O$646X$8gh(=mnrI;3L|30wK7H-eCmS4^t((^;wC@%&<#UR< z3Ixu38)6t=KwZ+6-37*(yDnTS_qh-5&MTiUb$Z@CNk5Us8hqScR|f^<1tl&dpg_X< z8Xdd%&O{kX&r8g&HU98u)G|ywSQ8`M1evjH>C`mW9?v}5B=?Q$OTDpP`2JPl#78aK zy||hUBfC=~p7|eEtN@bX@H5mp+oRm~z!B+t+ubA^4dGMVeXkS)%qt`9V*Cvcqby+k z!?7(e`Zz_W+Ilg$qMmazaMgRkF;PGePd~MYY^=0!ut>C)_V$D5W9ONf9 z`3j@krA{zTH}d{^+zM=_6Y1~24A9+nZ$#+N#;R$uDXcGr&E*_Dq?AhNx|pT{6O8OU z_#4N3L^rAb$iU4QNY#NLZuPtZMbs`HD9Md>$yg1r;2CR`1j=uY1fqTd&s7&3YC# zri;YW#-LWtMoIUeEPz5^@H9D=;idknP$OALbL?B{vmR)CUf|pBDR(}c-5XqEtS{{s z-Iw53HyMOj2R0a0v;EuIY(g(`0w;(@^bk1AHB?AFUE%GU-w=JE#l5^)v2dMDO=$2N zg>jmz3s55Do87ehN%3L;p$(MaNra;ufD+rfeEEs~Y97-v8TuO($m^>tk5KqH(CG|2 zZux~4Em9JF>9NHNv<D(vj ziXY>4kn~a|k>B=^V!HIs-DPd=po421ZiTr1Wv?k(!x3Ad$L=$=vsD4+&Wl6!J4_ub z`s);@{qU2+DvFckqvp|~_7ucwy@Q+7X5#6J9dtiCPUyc-d8A!+gh%&Jq6}!sNz6Pv zaWy%M(#1GHP$+R#-&2zsK8(Vg&%44ZWQXD2r3pg=mn-!eV3)M zU|8-r#$KH=q_{ 
zXcUjTm#?nCaGSl1p&q|zt2SGh*uPFh%s;-W%?r~#PWA>N{82gVs@l^@Rljm_uQhnn z=b{)5u}X|9A}Swrp!>uiI_NQltBi+?bXqi=*kK~QO4&TdJar8UWjBPOn@o62_TAxq zVE9k2Yqr9=F0M)!Y7K7!+`Bn+AXj%6m4d=eGx|sb>JF^#h08e3UPGfu^GdVsfze>{ zB$H$e{HCQR3&=F)rxeRe;-34&X!V@nb++Rx*o66!|s*;GpDsueE@ZZ!_kX6L&;KOtVdOQ*D|J1ZH#l|N;6R) zid@xTKBN!_h|b5iW%C=V%%`ggO8SCs8i4bV<|3l%y{3d#R64)Jly5|#R48edz9g3<}tXkHe zS~&X@c-O6LOz7*=3Cy$MbT$sNuxE2JT*9_Qyp#ChfU08iU&!kJk(reys_$GsGmvZU zoe5lsj#2*1P=D*;(_|A2kYbyoqV@Z?R}>cLd^xq6PG20tWFIRQ71W}Vn1r;`EV0%( zJb*n1IM#Af00$sddFnRkQsHC-|EmK4pq;FARiF$J{t6un+o?WM-5CA1c;QKgLn;<; z@S(47ub}f(-X<)&?G#Q|Y@iZV)Rb)D3;-HZC#`hS{d$}bQBno!3hLihlUk!$JCR7T zPNq@(MrhX3;q4}uGUrqR5e5G^N=|I4x??vQHCEZ{0R~u+(QWKI79)BpNJn{X>*aEW z_NG*~3(QR#&TkRYO|$f0m-{7A{VT^UO=KA4yyN$#XXIkJx@y~n^q#`BHj5z5_%930!l!42Hn#Y5Q@xEG zH8E*!v|)>{KFSbq&5TrlD^Xh9vm8s0Kw!*!s2o5w2MxWVOfD}g72yogOj&d37DUx23iU$g#?X9jdN-`8Ac?KodSy2p z-v7RP`_B^5y?X$%-z-Wx&{1;H3LRdQpj7f%oh8QjV$<1p)pBkq1m{gIdUUkuMOQDFMV=VRcB_ zINCFb2b<~PAxB*lZb|?9*>N({5t3Vh^f}DeCgk{E<&R&YKSVE-K?(LlOvIZ)of*`k zb3KyCQ_5~AG4A*4-wHFlGaE8+fZRsM(q2J$6GaKEJ)4X=(8O6%zRNBmO#AQMlmXoR zThwF!%kR-rwOvkhFJBE~ygw$d*SWd~hhQe3^|23{v`YEH9Og7kn=JXj)3#Ej7`;}p zcl7?*6O7P!mE#mcoKW(a4#qn2*NQi>D)&CEzkCX=h0=m*-(^6QI)D9mB&ugaw3=fP zqnUb{6jywyOR+$@>K7)MN$Gp}5%AC;H;hBvXvo?-k5d4xuwNfR%d;^=BD!yah|9~i zCSNCRw^PP-w~`&z_$`l9+*4)Yc8FDFVadERTRDqvPqx~;HW5DTAU4u@3D2gIZUZK~ z`K3=C>d^KN-HQv>Td*{IQgny{+@UN#0Y`&GR~X53rJ$eC>@Ao>Z`R;9KlaHWTk;Y85RM)s}IY`EjNy_2gvzOpkk&{#}^{F>)sex8zf~R9xArBN?2O81{ zSlO9CW7p_*^3TTwW52_z*jA98MEA@j0{k?wyAK!(;Y0Qwn`t~{4EGzTnMobPiU64X zTUR}j2$Wj*8i&1rnX3b$6?O{Z`Clsf88@=j65v1~S;gj_lmgpq4$;wQfo5KonT(Y! 
zYoNC4b3hz>mBxT$ueA4R?KR0_W_?$Oom&4ZfAz+&BQ!aTPM^IEev$qi60DZ#YVWJC zSOHVe&0OkKpEk6a#{-}sF&u&n-&6C_c>xZ^pnW5jDe*>^C|~*n)?9-_y)rVMz0!$k z)(qjjA?CRBsjW&q?jo94!a~4eg)k{vNiZ{bC#BBk^&{AnrI8y*X^Hey*S!YfN zIP2D9`-1ENE8J3U$%g;7g$6D;j^PFV4qTP0<&W$|r zn)^$pgPY%>WAgQfD*+sZCWb&+EZI`@MapzKrcKd>*e@YRNy0^GYvu-gV~kqs^#Q(U z`>0{LkJD)Cl*cWCLy?MfLg(AF&+I=NKq5d({nL>LGYMEIE6bC109RSpP^oqz>_y?U zS0LO)24DTr;r*{CaZ;A&B7Gnc97o1RpxZZ`M2u9q2*5K%WGS`bA_KqJcz&I@5qRpo*JTCGe!N@B3C`v2oxvcg@9=ZH{xyww;839+$Ys zL-TYf+?E~zzTBie4Fm|C+{JYdNPZ`)b==b!$M<0(b11>tg9dnj@#lVd$s~t`VY1DW zc|7FmYpl%8_)OVsX{ft3M(8|^spl^JD$VmwjU;VHH4eYS^9cznPkH&{;PCR?EV#s+qhTP+{BLTHoh zdxVyfflg4QPcsS^n%BY1%(Kq*45~jaeQ%%FOvUTl=fHEt9E?=YyBU0c#eY7ALSe5d zzCg;bG^%quqz&VlsYUzC42pKI%`E*aRMv4H;7Z`x+i5gqX6|2~OqbB^s|9fIFkq!q z!WyFybC-qcyC@D^2`Ife%UYXhd*E}!n{kFAI!))~ulA;eFK1M!3LrGqubT%F${Db= z@KtuTZb$Vt5y;t98J!4ML_hoj;nsxyxg9`J`0fcZ83p{Z*fHWNP@*3bbdG@N6OsWw z3ATn-Qk^&PG~EYX9%ux}r6!QO-`rUrciyQHnI=tuf)QuLBLPww=rfkO!a4qKGwyw` zBhZ#Gohtun!Jkzqu=gpQ7sc295~oWu2lD%^N0?L}GaZ)Fyx@&`uBr5a79=+c16fM5 zp1jRrjyt`1HAahk$q$A3->b~4*{I`Zss$7k=?K;k}X_XyfKH^%7btzz>eh!MIu26^_?N{sc>z)3$g{&o@ zVu}^S4^UnV$QS7n_@K&VzRslaJ15wn*2?v6#g)?iXqg9M(Sw*%0pf@;E`W= z(`i4syB(z*qH}w-0Cl=cP`Vb+*2M%M#ie_s2)T89?OGt_r*GnRq6s?CyX?Q6;rF*p zjnwjID3L^n%SbY#T%&Xk;=YNceLSeph#L-k2pSUzw{g;R5#bJV4-YSRbhM$pYX9jk zQPA=1^sg#Sgs)3=#$&VyS4t!}Nq;slKf?$RfE3@z=u;&Ozxvxz@W;+tZ}YUACz&9m zSclm~(^hF3h-IVN!mY;TO)N)jGYl=355g$aI|!{(F26?znV5M(9R9aedIb9Cc9uo< z8$ob>G%|$0CuDMZO0Ql&x47~2A8*LX@4b-ga7eHrW>~8`^Ulw1no0Rpa;^(KDPW&j z+^dHDUXlCuP$CEyb*SuJ*Um0h*x=uH;}*aW>fwCGp6W4Rg>WRja09#JqSs(?7Cgf1 z#1FrG=3@o^-wk=jZNk*9bKqdm1{7OK!9|1Z-m*-M)xn??7KDP&ABkdz>^}t< zz(jLV5E=!I&~OSUx$#qTbMHbktL#G&C-yFMDSLUnqo?)pmSRei1b-alH9Q2AzcbJZ-T}J)9@4B(HPFT5%Z3f!b z)SYX{Q=^gQedBVullR90u-elb&Ch%VRC8F8*Vq%W!m<7I%8DTSM0U%tztZd}p@#eG zPOIim?KhT+HFG^Ur1zA7q7f=dzZ&rnx<>7B_L89M?m7wka1i2Uw`c2fZyqlG8-v^g zyCNhCgw4Lk_KSmB!E|F^;Wv-dZ?2fJ@7LzF`j*sz-tMgnZC48$B?r`edmf?;AeXEf zNF$*!qM!}WhmgnA>JXyhKJmexsQfY|k2}C+E7auNACy#2HxBZYZur_M-|s{P$QC;{ 
zvmtse7Gr_q2N6^V24ABgYvabyMJv0aJ+2lFkOpvU2|Q!Gzs>~gin}E@{Z5*$%})G^ z)==45G!jMgQqziAPhPxbqtvt>xkpOv+zHULG~@u8g_d=;h@cqu0_TYuZg^y5x-I%T zwMVsYd_~=_twhndMT)glZ8B!#sjgYJl)FbdF`v^#f0!C;6PaaF7@zn@s>_6YYDiB+ zT+A*iE-OG(*em7zCA9@JH2=E5ZIzRf-T@IM0{1w^k2Yf`UxaeOExu`|=nki=AsKTz z=ryseZ%15l&DNn`wd0#Fs8*zG#cuXeZp~pHT(@QukwNz%Q*?*BrZQlJ3njzh1Budv*pNOyCKrIO9!p%`Q2tqHn zve&AZGXn6SIcDA@uI=cyG8g%ZOr z#a6##)gK386DUL8JE6t~JW8gNt+-)2d=P?EWHWwqdZ3t}U@K0**BEp#o@_-;@;^e- zse(w)8o4@Aa0b1D)NGF&`53>L9J^?0Zy{ou%cfhPB< z7A8OE&*H2pFTqrs(x)WzS0yZk*CPj~p~_R+mp}3NEjr{We&bKbB-cp-!zR*iR!NA7 zo6^j@eI|*lL2t4uLn5?5tCUhH{GMa;+e7auAX*g|IXk|%&J*;G45?^s>KbVfl?Sn# zz$itML4lT*Cw0)!%{yG;Wl(_wlKxnh%27aH>Y(GyM69}7pwU1EUBg%PXUt?5xYfR&JOgYE~1Mp61wa+omR5AqYN40zfKPsV4#82cqE^M3k3(!VE!wl z|09Oud)t257KiLxAwPxfpz!)tJBjZ!s#nKqwV#rRPjD?X!DvfoOGtVBQ*;Wsqn7XD+_4a~WzOgxmP7%P`` zs$*Ta0S&UTfwPSs)WE@SQx9|BI}?x6oNms1kHiWMDD;p#nIo`k{&0Pz-E!?XzO{Md z4wGYog1*}zFS~Yw;%^|}s|;JF-vv3_kA-$cW-Tu%>Yotr4h$+ie%$EBq{-&P8^)jR zbh9B#o_+KF%eDMtmhjiXzg%t;bng|0WaO=am8MCXuz-xONMr~WITfhdxv~hG89CxO z3)Nl3q6b_xzBj8*a){;&aMW-#WngX;cl+K}|0mLREYI^eZXHfPSOtOx)<61Bp0;Eg>Df+F>IOX+jO058lj@%Ru?F;u-w6 zgk|yH3Uj#jR9@M%l@<8ojeVzv^Qie67V4I~uj)*&-s8S0bf#4G@g+gpigvJ*%_9?w zw0T8Ri(n%Za)wJ(4o+j-Dtp}9}bh%E2LBrCecw(QEEOAQw$4l&JzWriC;D zeVRtQGoACH;xwXVXh+GE&)%1w+UA>OYCkeBlLk&X2JV2$Y?V~RvQq+g;fgGVcy4gZ z1Ijcf`7A}>Oqnc7wWk~xBMUQg@TYSq*WQ_<#>8b#qS*NoG|b#d$$ccqfj*-^5fX)A z5o1Hn3X7w;&gd`}fZGXnZy`o)5RP;T zG@g?#UQ0Hh8b^>Uw?}5(?M$&x6-NMS)Vs3iQTXm0Ku$5hX+Co1b4Z+0dl)Do&GRV0 z1=)XMand)V6Ds4%?q9c^XeJd+Ct5(_E^B`@uHeU1VaL_38B+>;CsQBuK zDVhy8!wtTs0;E)Bb|N|LvqJv0ie=t85Yk)OUAHvJej^p8_w@(x&p~Vxhi+uKJgqS1 zMk!}&f4Rwj&ZpCY5gI%xQ#6>7CL&k5+hn#g&Ql9FZ)qWF9eiDP<;NA(Z*i;)8Q0D= z9m|fe@U>_-f+`1vNWa1H1luN&wRE9te%F&VII#%%B+=rzkJ_uVg~zHA7$y*Ov(2zN zy70z(1nwZ@MEk9|)3Bu|#CN4^T-TUsA*Un%Aa`_#3REIY|pg6nL!fN-FV zlwc-f6$LQA6LnY+4sdpk<4D zTPk;}c_D8~Phia{m=BlI1B3NR2YQD^RBW6uOi*plG?mCu^CSHQ&4_#j`?J2R2eqA?kqna@V;;UNHvy^fz^=b@ZHfe#8_|KiOzKR 
zA{Q6pp*%RTu(G9POwGTh_J0tWC^+sTFmSz5ET{X8B@mbUm3~9dFksFX7Z$I614V@%luGFFMK}}bm0!&9 zw2ZJO_|#d*9tfBe%qHI;6MQQR6BMStr5rF9xTJ(cK|oM0eSh8CP-(og=IIy#DrOoJ z$qjs1FG&RY0U^z^+hd!b2q)R!8n#e8kQ&~E8*HG>_*NxiTLI|o8*^m%zpKrwk*Ukl zADX_Z4hc0{xIn#NXX=Y#q5`)WuT%Cuk93^=4V7SL&w)p*N;?W-aM|;+jFgX7%j{~a zo{MgtV_r8>TDB3m1(Fz86wS8rsu2}(hntglbDp4jjRNiHbv8;5*#fHc@)Yy?J}t77B<(NIXsow}^)T zQlhw!ZjQ7~Li+Q-+~UqZr%#=DN7}<(W=fO;ClGQGGzL(;fZ2$_FiI?Zw}Z4OM)5RCevrqsGRb zBWqA5F|6TB4^R7)Oe;wFSOC#9>(-m1A^M77=DTi9_nUKX*IDo?h<=HpyC7gSq_ukk zUBEJd5`=KvOUe=mq)x#|t|1I{8qZ+29qHRc&0{75mx@nkoC2@adz_>HxdIazqt)QgY9*I$%4elEGCMN41Erz5VkT>D�- z#YgMlFS35DAbF)Ga0eKY`!U)9us6?>_6XHcLYEEp0%^wM2UbS(`xHEhFlJV>R313H zi{_u>^GI+RXz*9;7@Vpu;x8683*7xxdI<@A;pz4Zw9Cf&!RPFbwU+YoU~ca5(vRa8 ztu=9Wt&*rUwOf$oy`x#z+{Y7inUORzxPOdJGuq-ZtAMN##Ha20I5JO@8Mn8*he8_k z?0-=>pkoZIDENTa2^bmQZv-d0*@STmGkBGm=TKn5pUNraWv$Zt`*2v~xsBw6hXQxG z259-I(iq7A23e-{?19B4%l!D=G6blXdy$x*Gu}v}nZ=id%yv+Bc2zGja;jx7@x(b# z9+y6fVz<%~r7K6P~`Q9w}h!}g++LxR6eOk!MuDL4m({Vv9V_=ft{~UER~0O)eetk zNJg7g8xtSx*rpFQDSoemMF?g8W~+=rxewxs({viL#!-(`nI;Y6*d{0cJH7JyNJ5MF zm<724B}8-qyTW&?KF|?+DCz_w_LoQHsM!u|KH8s2v0^>1osNMOp1;enY23n`?#M1Mr2O9QTvX(+VHs}GoOj9b1R+;56w=l3Gvppv+vLy*px1-L4lwfr5hv4hnh0D zmDFnPCsY_{cJ6$*t2TZ+i@{X%QXEz@cU;97;aPK=&Etq5@W1&(dQ;`oA6@eIF|< zc$PT${ErcD`l4{zEop_J`A#i8~bnCyDjcVDMUDo6;f8d~6-@WpBf8})U2XK5_UTq;Jd`hY{%qKCg^9#T@9SnD$s z<;Z6LOO!4K);Yfa$6wT_Oyp?*A7*O}f~r8Tx^O@Rsz(8h(EN`0=C7arHQyxD!7g}1 zx59I9*~bKOS>%Uz5l zVqQP$Fyu0OfoQqd7V*HQ_Uo#0gn4peLNkGk1#5F!237I>oJsrn*1A-;2Z?FP&+fYS zbCmuiaDMtUQ21Q^$hnWTp3p*R9Oy;HLVJ;bpl_?NRT?WXLReJ4fNJ7?HjA#F8*Nto z=USc|9Dd)h zx#3>?x_Z8R4vWvQTMKB2Y!1XpX{TK=Kmvjsy*(Guh5--E3;zyBAB{5@s!{T!uuUTg zuf%O&nD7GaDH!K0(BK4OzC<_V7pc#cPNP)yc1!;juEZ_1zakzg{K=F|smj*{>XoK) ze4j67Uzd-d8cBS0jJJ3I!RGGE?c08GAIk#A}?2ow8nyV;J2V# za%}9uY9jPC01IY$m`Tb!2v_ru#?=Yr@&pID|GJ=36^roSM#+0E59XOb#Xv<)Tu|GF z@b*a-Duf6%7+XqQL1@en6V7orKs^e<|1 z*|-Ay0q_RHh3}yb84OtqV6HXVzjb?DUMj4LG&$EpRi^DNov=g|d#?rIte661;m7tw zLd^P~R|@<^YWnvlM~GFE%~ptEtKFu&+7KrHM=A2!n&G7U0ZleqJO?jnZcO8CG{8PO 
z@Tisr&2%6T3*|=qsM+|v*o-jAuot-$j`^qB=I>8GDx4`pNzmJ)6>Wl`t4K2`?s;-$ zC&J`kW;jc)8LoB2kE-iD5&+|~8Dy^s6X#qvan&#!GW}Z`p+EnTjO=cWEsUDi>PBa` z=IE2-U#;{z9cBYHu?<`i;7X9PmnX*JrCWu9GBpMVG$g;9eOKA-ceseZI{EnQm1ZGE zX~H{$t1!`f!?4u3QMNxF4kOW&W_@K%!hx%AEjX!c^!5hc~Q!tD<0x2&w8hfH3!`2V@= z7R0hb^rEaY15B8~zYcGbH%>+9Kd~N6leIO6fp1F3du?^m2j^D&_GDLYksDs1d8zB{0|vKd~(pb3VT}7HRGv zZa-?b2{&!m)q>M>^f2-C8A>Ggv#6jLJ~*F{_mY@->8M=^x7`G}&lSB8z`U~l*E8&# z3DPn9J7;|Ml{+s@nZZmE#vox8Ntr$q>HKGX>h2U_LtNx<0~R&s{wHfuuKlqYjxFCsR*!{|+F!fCmdIMrV0; z&ga0Fv+Ve%xPM)?qAmfiwf(i2(a8&liL%C49t;#$Z~D^lakySHNQC&M?IWJ&2X5g5|x%Zq=qAwE5D9Mmb3X|wty4ntm5|8o)F z6n;NISikQYih5TWl6v}KKjkh_=HK^$7jSvHO`u8R;8Io}y2S+&^RJ9qyRXC-?s8(< zbauMKQd(k6W_C!F|G<+MDt~aHyNVhOzwXR_WB*O%{c5sKAjrto-zTe7!pPOIL7OGkrT=@ScR*ypaaxKs_3HiJ%<6G>k(j;mY5%X!L3s5=>! z1FW0iNY49S9%LwH*)^DUnjQqwOF=wQXz^&iA=OqTkFf1T=8lpyfWGoe*SV4%!(~*F zrp2ufD;wzRd-(CB0-wUv{9G;9JV%>Lq#$reYGc|ntD`n-KiFeU+0#Gsn;_P$8{xra zdtdpq<+@rg$0nGYzFEHzaRP2?i1sZf7K)vm8skagvD zo1d~LV$A4NJ@i_+>_qYjz|+4@sq8hch-F=K<2?=SQ9rHTZ%KXjXvW=dV$rZQdf8?1 zY-8zV&b5p|N1QZeyD^RppZOvkyliasx2E)92p~8c{qC24yYDHo+GAwut`!y&eCsvCwMwJ|;D`}m zDdBbK8bAX@nNl@xA}52G-sUK@z5Z40{pDy(CHzofe|c;004T`bhv?5hg=M@%j0HNG z)J*TL5h>izGRvzO7ei)pUXD8e!}b`^kWeY2@i&sZzmX3t9yFa{=Eh1w~9Ez2dcQG0^_Q?X5`3i?Gz{ zuq<{aMOQ2FhGKOKG!{k8adGAxVaWYSC= z88VUCO!&Mv9}+pL#m&YjmQIKOlVTC5qrzClK9_nZPOM0Mj#WRtRtoj0dq=i+MXd@& zkZqk+FZFfS#%F*S&g%b27_R4M=}6O=st}C5^fcs&HXQXHQ9F^F9JJlZAA6HwuuXD) zT3$R5?Dthoq18Gc{AfWX+dI6BcF7;(6c1Eq$9^q z=4(I7#UnPDj+|C{fuZ+Ig|*&`ThkNnpC62)8hbI=Dy>`5AOM!FK}n}mRga z1GG>n^Nk@Qcl}-e{D1>eaKNm>GZEKUNKS6tNiX@bxB|XsY9pTQVphK)oDan3aQssT zw54Zxncig}bZR?q6JCO*0GkV~Pz7q$?sLx7)B7u41a{U$Pn-(T-$m1+mddSl2Kn3n zK^Df!M9`$?G}hbVc6bX3s90G)r1VJjXel-}i9uj9nb?E2?51a~YnaU3PuJJBAuxP; z>L(JQO6f7}lfU-FE*;EAUoZ6$iwTVXkTNR@pDSq!lGbhPFdPt_eu8kW zHxaT>Y<0=qR9`QR+Mlutzk^*@er+`e$R@y)Wulvt2R7{uz=^<&)N(gWRL6`aaKb!s zVz!G~l=*RHQ(zCLoG1O`-xLVQi=%RFu9wP`@?cJrv*p6#UiT5A1s>jIXz*ld{MzHR z0-*=BJH(u;VQB8Rr8D{;-3zVR6kP7!GjWyI{{XWRm|9zdSaxKnRbhLyY13$@zmGmi 
zA2}_c#-uz3m&hfxQ2-2jTzMaQNKo;+Ur~Lt?BF9|Z?RJ9qdS8$ri@(tf7IHmBXI@3 zzY9I~>+31V*U4F4M`aple-kV$T!vz=ct8c^vrU6DtYY@&H;x1|Por1GU6>@ys_b8B zxE~d+TS2+}A0_TpcKCp}Ern$&di*~}%@vIcdS|d;SmdyIUhR+K_A0%G<9`;BEoN9E zQ&hBFdFS6Eu386XV=bHw(P-Je{E5TC=RskMHC}RxCon@PlOO)WvEY9{`L7>P{Kf`s zJfTC*m;=VItp|P5HIrBO6-(Ow5LxWQTvm?AD5GPF|5Zp~hEw*`t)+5S$(dCNrL{4J zKfb&buX<2m6@J#uYNbcM{SD`N!Q}`*C zBq(6eG$9k>wp@2Ivm?`MuBL~yLjIekTDw;kN_8)~@2Og5atEay(cx<#ruD4nj9Cah z_f~V-v-YfPvlhG%ZJoSe7+Cw9?qfmaJ9+A(#&K@%St)(*VHOndUeVCx(#iEJYpH{q zO66Om@2J{E=PwIPGztXz!_yte+`_ew20f7Ky~sbbd4D{#!7_8!Q+a9=iFS2D10*!5 zf+OIPl|+D@ms%esf!-TLBE`0L6uy4(jX+(}Vi-}W!t5HM(}lCE(>5>|>rHq?r#&+Q zmA%=V5jI>Tx3~1#qR4dv4PMSkD`wr{`tIv6Ohc#%7bab`A$nb<_^H?j&7Nf%hCvds z#=N*RvCx~U#{2P0z$@)ret|Rs8kegVs1hWaO7=5&ee%Ej<6^%$x=Zk+w!oO_1;FIZ z)xh0IcmWNc=j8mbE83!Wf#!XiY}U#if3LJeN{Pg3?^!e3_zH7U<$y^F`1p=Ly`Odc zH(x`jRu@>Sw#=#yx)r6wT<`Zp0w#&&D$k~V^yd?)clg9dWrmx6At_- zZ=sf@E44ChZT)T0(-G(m`h)`ACpc0P|IXm?X_I^0pUaB;03)sRE!p!-JGa1iN1p4* zTM0-+(A|5qnim0bMi$f=l0HfI^A$c@!lD`~xqVPIlXkEZ=gwb**VRAZ31%wei$wRBgp?Iz}JOmAau0 z&Fud3$YibswD+v~HrNa3#m#|HNkv@*qB$crx6SU0M5LYNxl`s60FR+*xl9>6qi@+P zsuo*VGP%$0PY)D`nDaXIiv5w+EwmW#<6B6SgxJ3-4;<)z_g=rmHA*cRRptz(y;Iwr zwe0cB!M0$WFEWaHj91qkR3&)EjBeCka_IZp>i2Te^Q*YjsZ55dlFh5^yLrNV1*aU# ztjvYtL%AJ;X6&L7nKJ24!QvQXDF#BL_G>gVc;KFg_TU%UEFN(>pXL?s)~{9MR~>9| zl8y~gvLV56SR6irDg%2OyW&8#*zQN4#kPS7RXz1k&n7rE-f~+1J_$v@>)61vN~YXx zRxi1?#%M>)k~;`O_LUNRP=(&YlY1=R5qn0>VS}6`f4T+sQNhO-87k$4W;nI4CO@8= zJ)BaKElv$2Grx#xdrL#NUT|7Z*s#`}crGTJ)qmTJ!2S-7bTWfc;Ledf$K3{vzA}+6 zx4xXgqhh03vuSdKO|K;$%kypHi*EOmA%AO@Qu*5MaiVd@XJx`LDy44PPfiEiy+(se z^DxY?sN`5bJRYCoGD^rg%8N8lujlN)GDZa#fQNIYtxH9iDb7>n(Ua49ANq{#-RaPe z&WSKKo!(JkyrE9KT(Rcl`7&EFVQ_SR@6>|+UlaFLPi9N8hU00G{7XdOf?*xCbhJPn))kdPW2fr7A>dU+MH z28|E+Zl#m8ig9G9P&dC?#CQatYZs@cl0jF)B6n*vODGFtLx1l_98}?it;%9s1JSx| z2K96^rK?Txt*=mHx#AaXv49yE}B_5EJg^zn{B`7^eMU%FMfSw5C8hbyV5bT&e#$o9L$47VQX!-M9@q$ zQqBU}O)k5grr;_^iyasY?z(ou2t_s^JE=TR@*Q@aVdA}W3n1cr1OifjxF@br3Y41*wt)jY&ME0t{T34CkIDY 
zEk!*ze6Y{PFqjhD_?7(VdL2*@r<}Z3T^~7d!C;GIATirJw!zFB<2c{H5mWWu#mFLa zi{Cx>I?pd~RCeE^(iWxLycQ4YX7ytp3Ch!F*m!(rL9dxH$Ccd`&>oyOV73&CFk)G1 zVJE&Vo%unCKLtulXscT6mVg>jhB{6gnT56FnKpIf)E#j6IoysrW=?}T^_!cpV||`+zd_RI7&|ST**5nDM!a=ThUhHQ>Z@oehaUP8q9i5eOO0b1Xw_E3BlckLH!I z>e|^jC^~zlC!lnrj|9<|GT@fy7+7-Cpm#ofX(_BXqksXY6C?3Xe}e5wbptAa^B3Tn zNP&$7lzqsUwVG@aV{Nf=rX|4{PP=$dmF+J4u_sFvk<{L9P$ZnBAocE1ucb;hwxlDw zXCp)wk>;;3HGoD1aalSTsUeFvo)o?uOAG{-HBY$WO@JnEqUFwbBx&%=oP zK#~U?9s%Lr!@I)#2=~Yyz`pYIcypT%0bA5BAc>UpZZhdzA^L*(I~&y#->Yk<5qnF{ zk2+2E_T$!vmlEEszl(o-#<5W}Dw_^IL{m}G#U`tYNJm`GZ5mDE4`tbu6FCj3 zopk4yzwF$IjlaglI)TF4OE<(_9E>#8^peAIGMJD0z@3gqu?#cHKZ)p&ldDIC(-mgU zlQ_~g`h2^gI93i{b9BE}5Ed`*mL_pgp@t^{)z+P+(B@-fM&}Q`Fud*x1kfjYQ7S7;<4G56uCEq! zxu=@LYO71o3}_Tkdo>cB9Acdm%dLI9fqT?^;7Bo%?`5)?#!DpfJ5BjFViRQFblurL z`Ha|`*wOo*N2B@d_Qlv1H@@Z}BE2`^i6;wn*LsbqufZgB+yT*9OoK*ydze9TNuauG zOHi!yOZ5Zy^d#yr|8~AHE;{Xcafp1LqHja~3$oI|sWS~uWnoX(59xz5LOu|IYMKdF zsXc;G9@#8RrFgaQ4(tW^cdW2C?z(Nbn+D7|VSd9@Sy0L2Csl(vo%U5+ZXqb@i;&mL za9;l+HR*~&baAvWpshuc7gS^V$6~)>?Yg!hRj(LbyX|%i|K?_EWyePfnznI`!B65C zFmcCaIpLz2u{yVMR8N=>td=MVk;<_QVC}I0@ zpNC@X&x~FqE#ScC5@8_Xq6)Ya!sove(o*oWn`=dS_T=SC+)9a16Pd@sdWTf%bMJ#Z zatGw09e*8FW3GG`c#1Fs>-tkGj}IL?k9VhQW!_oM zWr%N@PFW7pd_dsabq+foZ8FO@R!;YWh`WN1=aTn}G5QbCy6xdHnXMzZjbFsSDzu=W zefRXf$~{@l~3wHsw#6QX#o{UcCU72Z>q7C=}rP5BR+H)hZR!YaE z!%qmW?#V!+augb*BwrmQO*kqa&Ou1f&b!-m=xOk?p2m(|>-|hm6Epe*rgi6ueE`x_ zU#n@nj@0xkx+OJVDD>UzqOPh) zY0QsARKr^yDdKLP-6KWP`M@+gFRSCZM}yB|oI&I=?{)uzZ)U!tN6TR>SuOGrj}S#S zU71=SUpDOFP=O2aCxpmfqcWjeEU}n_wn;q-sZjVpCPf*G{-J?o;)lv3M+la;kDJW z=1X%#@kXZ744wq*Y3-B~&8-6DDb>ntZ_xF5pU-{HAQC}Ki0r(Q^J1u-Zq~Q>#nA53 zfKA++sHm&B8SDG~b72*kM9%FKaM5ydAg;toADLWPv$qq zD(X$+o0Qn{7n142{ZeZj;&+Z021^I;N=(?3P5(NV{zy{|JXdkhy)!0IG>nl2x)Ahf zkG1O`y1H}#zsv1|BdVU#Ni%3xFob4>e+z;>*fOJdMjJvtSXT66v&D&O$Dv@f+jMHo zCW(QXUY0*z@p_;<=ZC^TpEjvn7yjcwZ*}Zk2ag%F+MW0<$57DQ%M=h(?lwvvO03?b zEzD3W3Yj}48r?}7*&kE>w9z||>zwH{U1CnH>+duA#J|K$ql;gY97Du)3&eP+@rcB0 
zijCQ@&6V?w+i$(uEx+80uZ4>f|J$BTxnHo4wfYN`TfOHmbO4bf(ppM`&IfN{w)_~y zzb?=E8PVzc`}aZ|jx}~d`f1oQbRd?G4W;al6H)wMd2r&8_c$D_T`<8f2bYeCX=ZJV z3hO>lFJSba+(YA**h*(LNegj3BrYsB?MPWZo#5>|9cAt25^!X4`C1l~@be$sG zYdhi2%w13y)%0Vx*xtD3M8JG=uLRFpUZD*Wwbb^O@pxTAV1p=W&EzIpGDgL`c7}mxt#;Y*%IQv zS(`{dha;L)`@vvp3dU%`I`nenvvmaIegOeprP;lXZ>wWNNadlzGt1 ztuueraT9ZIX2reWZS!@rm3spfFFA3%-|_u8@wtZFAR+zjm1wuUK7R%WOm5N9EM<7r zPY_aShB`?vVmk6RRn38<7<|lxYp9%Q_vR)Vw?e4Bh#H>jC@c>jWFak~=rz)}tvGo5 zN7aZLDe)%hPvK`DiS;DEHy3#|Qt~iU*_`l0&6%C-5QE?BRcaDuj?A98BS+iW5P|k) zJR2cP2FwXG&g75me2q0@`!)lMODaFgRk6Y%GEp$~Nn9(3V8~iBTDVC{y{gmkY=>)_ z<#YJKxX`1u6StSpdl{35(;|$5Pav0oC;wTOH9?1-U^eWK`F7jlC^ix)>?5|lIB&k! zAq)4)8kblg1Zz6%-Y6Q~6?>G+?|oY`6W!H}3$Eq=NSngU^y7G~;6vS=xMwWZ^asj~ z``yYHG7kqi-}|U@4LeYj8T4qi7|`O>334AsQK@k-&uVs@RR-CtkJ<^*RO&wrezddA z2LSPZJpIKRCkSl~@nxd$dv*K?=y>I3@b$=7Sq;z&wP%s3V2ZklZBk)&_7~c+mhMs% zf?skFf5jfB2AAI>{zI%8anm>k&AZ=D zOuFtzglz zd`2qNYJF>LdKcd#*sd_X&}8^^{(*D6hfCS(2S`KKnloPhOy=*k3vTT$<%|$K?ccUH6eZw*WlOr^)o$k|+AlcQ?PgI{&^MqC{UcEINIF~reKO0u{ z5%3KbK$n7XN*Fa@WsiflOuMAg6Nw7Jx660&?s;8djd`1l*yM%HY(^&9zL6~ zFe`c+dZ+K?*+hQ1Q=8bmkOf*roXCB$uBzvgxGk2ritFqKZ=H-Tu0=9F`hD|6kigub zwieH7F>1`Ccs5PyLG#DPiI8s-{_Xae27OS4*K`I1Y&X3*_oHQ31z=O^ae3yKp4_J+ z{NG>W+pqNbj3WE!{55{HX2NoXEodFU;0n%~b9 z+$F5LkOS-XrETdp+d+Rvrp${b>@S^ix>VkBqeNv6q|M{UiU#isolIjg-psXqER^My z?kh;#IWgAZ1%u3U-IGZjK6wY=a<@9?up2ozgS{+~Fkn=EhTASm?jY$jA4JBMFG#bf?SGpL@%Cw9c zpsG}mWVz@N?5|J{$?A1Z5F^D)RzjJQnrrXAFm~rkGm0PZXfQiF#cK=nGA}wdYE)(O z5_oy-039Z7?$mMhXCl;jLk*)+BV}*~smJ?Br{-&x@#68Rwf!p%LDF27+NX}?YJpyu zFhLpZBYIHqxlya1@#a3SCa@0yXb0~b!On#`60=00P`O=OV`t5>>ulX%1) zENa~8bu*Aat~^@HDKgis#BEOegkCA)L-S z=E%yM1KQv{G)^y$8>uKBd}P2l>$xgiS%cLHr$9aT5FJ2MyFUL16s7PTI(4RT^sZ{QbDn_GbUGh``hOKtBv1BE1Gp8DSZ9!o!w zJM{XVo&S|(9+9x1!%#Q1Pq3Y5hqxSv-{-h>4GdoI5jfr7`sO^TI1|pJSs~bSS%rZ+ z1kP`&#F%#y#;p4;Gocdk3OuNaUqfk(f7$PkDyac;h_PlsY}ITvAzoLYQj7)v_j2|x zh*ew&6*(M2svYnI(nj_SvawBw^q}xEkdTCdM@(kFW2SKnyUo+BW<-_H7B|~fM|k6OA3D_w@qJ=tmzl21Qf?_K0on^SKLZ-RLLg2y<)}O)N$OG{!+z#HUk& 
zENf}q3!8PQfg6%&s~qB%bnd(6EcU~QEDqltp4iQ-DmN~-QhSgj=c(JU4Zjbu2$^SG zeYE|qV*j}LHn&b6Jj=qTz2&JBZiP+?HRN7SWZgMHQ2C?RMdP1+4_7oADW+zz(1@%! zoS|lMbIHF!P}s+Pu2~Z5UB!6(cP8NfZ`sRqNVA?UNDm?8r#?1ciI2Dah$_;MN0%JPljqamU`bw?l zfGprnVmM^#cy8L37?q)#z9TfWd`cx#SmPs;vQK$0s3WBoAIY!^WvrOrSVYDDO{z@i z>@`Z$B1rD!N5yE>wTkpi&b%}KRVZCaC0h}6KL=4vT%rs~pyJd?ME3Hs#mIwjxjYhK zF~L>4&}(~pohxd+kZ2*YYS-nD687{7#rJyid(US4($St=RkCzETq;6H1urYUC+7f4 z?Kj^6<@7AA>3(1LVH=Qz6aD(YQqq@H+lcGCaT+yQ6M1@`(fJiNUEIu_)~&m1K_^mxI>8c2C^~X zOz$n5tK`{4`OU|J4|luTy!9dTv2wMXDstn4huVcPFaAImqsYzB^S8e<^R+K4#I`LciOd!x4(I>AQ~+7V>feU;mKw+i}j@~qV^ zPLKET*$!aST_@=-DOJozaSt%hx64FlgF4ftR6h_nNE|junl8GZs{~^Ot!pvkkma3- zpUVa(px+`xYJ5QJ6@SyygPHiGO`p5>^OEVxCp;}ctYDOWPiQqz51cOvtrj}L-SYNu z*;;ddag4+yaev^I{q(YL!+wb$f`Tp{Ssl{kW)n%t)|aRy_aGHARnyY#3BubU81hKE ze&VaV8tTjjA1Vxws+pIKxhN{_JvN57Vb@YZp4e5sPBY=II_Sus{R*of5sMhjDz-qF zO}skGksk^YgF)Ovg8T(;q+D|MoG>|heU*lrlx~nt>6Y$p_^quT z&+*>#-h03AAK3eO)-%^!bB;O2SSCxYk+;KAg)au9eZg73^N+J090hODYQxY2=;MW# zfA-Wr3SrtN)YW%^rEoayZ%F7D2>jP+{1sj^A+;tvv3=dAEX@4DcqAA?%ogsq`vyOQ%)4u$vYvB( z4t5PkYTYUBu1=)BfWX+?t7x>`s!t>XtJcV__AZUkOhGpRpI^jS?235Vj2S&!&CVB_ zoviR_&~{>ad~v7|5zR?ot3|~6Ic+|?itGPNhG14Ht?BsiP$@%mRN)rNYrQ`~-`^EA zqSTF-uU0M7?)hZ0**v?ixgRYm=$_Zo_&N3#Q&&;X#>3>;4S}OXqb~GD|67-0ma9J5 zY!sHHK5?Tp_nv69(8)!K1#<+QuiCWA#6E#Vy$WI)kS0gOvm!YdTH?2!jVtc`@UhET z`<4K!$H{V$+bzc-ugVOph4;SDODCN05h}JZF{OWVz#=c|Mkcm)-2zZ*SJx zoi~_Kd$YoKw=#s9@9lJLyyH!g2^NpStF`>&SiRe(P3Z6uLC1dqMFG(JA9p$^AU4zw zvU{TbNU>IHKbNc%v7DyQU}bxq*;6o*$5N~K^I+<-#7ZuaI2XUOg^(f#Ro_etKLYtO6TfGs= zjuUbFzb>Bm{_6ZI>%GP=5Y&S;zXoVXF+2dP{_g?!M4AFz&{umJMEXZ!EWO=~Ew?5UDz1)+WVEKgC$SwT@l8&#+qPE>gu6ehd2x`G+x5*=%Yn!6r! 
zMZ3DRmR2iVE8zzqrkz810(0vQHd}j=cIgE!&wtL3B7= zKeuG_I9mT&R(YZikFv*&e?ss7Kcj3#n^^n!c2tGtSvV?#J|}-5M^#TChYO;#AA$>J zxpJkfQb_;H_Wty6j&~U2kJ%FlBG{lN_&B2e_}cO8HIbM}0mTLWgve=^-#B^^NfMrV zO8fHp)Om9>JGU$rZ?kbgU%_zev37FP6-)}YKe(ln&IsxI>+BtQ(xx5wm&Zp0QA22I z4h+HK(aT~UELmU4E?wIR_W4qL+}u9=cv!?dCD>(~^0{j_WD9vMY)#@_r*rtYF#3;E zA{%nKlFp)>sZdi+d6$784`myh*SiKKs5!qN^vFjp{zp`~{7Y1Q)!mIv-CvuT3>CHr zxMw}8Jn(fhtojzKRi?C{Ywu6o{0Dd>=8rwdNNffQCA-|yA%o8HFT}r*_>XP&9(z`6 zF2p4)#d0%1FFLP0!$W~5s%W~lU{y}9YocFv<3Bv^3X3ZoL+WSf@^h6ii0 z@KB8Gkw3f(nxp1FmMyrQv?i-)7`XK}D=g{jHvS$A|8?w&8P9w>VKbeXIJV~2!bhMG z_*y0PWy7lg-WKej>vkP(JTu;!>b+|8eVK`h{aGnr`LzoiijzT>mX$|_1HCCmf_H?Wm;Dk2iNX?3fu4RuY5lsY%MOY45)qVt!nd$ zY#Km|`#F<{q@iYGKu+TLi+}KUb7u(;iSuoF?*_SlgaRFM0rmF}ePGWr=%H*!zxPr~ zjXrri8k4D+&DFf8hjF8j?*SlMC?=^h4Ou+GEZKiLU7&~83ZDK&PqurC4}Ry%`nwlS zRwx(M{A?by?{+4I_N?$~)t=y8nIO0v(VqnoVs_1_T?7lVDAmuu*1pyUFn5bv>Awb5 zKg)YDSW~{R9()A;FP}jfkMAGffX-v1?7`b0!aACMuH^>d3sOV0wsUQ3uKmj1p9TM2 zt}A%GPyUc@#rf+(xYPx-Ygv5bcq&d?9$(^mof<3t8D14$<*~NbG_79Hd@-cvxc|;F z{RRobl>Ki0YMx)hJySc16B$aED>U_o_;=V=wI!3%Z2<-(Ea4LC zRIcOdng|ITxAy}*N~%v{0XbU|Bef2S(z~ykl zqXy?zblzd48WVdh-ICPK0941FQrkC*w@7~XD>MYqUu37)L)Y2|aJrk!1b^&G(4Y?u z196!vUMZkii6IwU5TU;V-bC9*y(=6S_f?7qkY`QuZm>CZ(ZmOqN+_h6-f!14>b?OT z5wFy_*G}{hV!!MnyfN>!eaa`J%2SIJ3p`HFfgS3PqF{CDYx-sx3m_z*ek1XG*}0q z*F6W9I$sc+9_CYSj2mRG4()M^jeZuLc2;?cS@naQVgx6r0Q=v?fy4pWsdWwsB44&{ zp9yNIW-;TtZlTksR1RCj&yVWPrn6<2v|fbYdk3k3cc2!%-)#B3pau6dg z2J%#c?JYUENih}R@}cG4Fyw7G7)3{zEnT+R>oax5s*#82BU(;%HOH$A1khs^eWxs^ zRblaT*xQ-(RlGT>NkG3*QUN;d>y0C=g?QhqIs+I$TQ+#j7mPZ8wq@u(`2SkF(c99Z ze{KGCG;ITYbgsr88s3hUF4fkr{^YJU1W5>+|9!ehE5>Rt2p#L`5LH9@J2_Q?GoP~` z642`;q8R31HY#r@l4ZnfIrQ?G@5K)f85tDc5+06i=_fD}awStt;l!vmyK;{Bd@~xHwD6#}ET~2W;}zQf+vBT4 zM2>Z9AT{fEO{)HZO^f{V>F-*MLe))I<5}5BV0fi%WlAe|IA^Ykrx0Q3S!a@cQ0(Y|LYL|F)m8%jUKW){1T5;)^o%Y>%Iu-?GTH| zJ@(RA>+z7uWozjP;mNu$yMW*KGrtgF@b*Wn1Onpc-+9snjgPvl+dfMnNnx3gQB<0% zmj?h}48O`PZ)!w7{9k8m5+cdr#B1WJRx<&{>$PL&K4mq^N5Zz=Q=^mM^ocmOVT>=e 
z{rR3^v(T}-uK(!2ejX}{gtadCw0qm9mkYo4lRu<%8cqXAM=&2}3X_g;ek^FV_A{(~ z8iI>op3eON@`5@bop~-0U3>Q6qT^Rg{{ZL{hYdF zZ<*M5cqgCLDqyZHT|}NrAk@_;;4w;7jr(8a+mF%#Fwu-W#_b+4^e;|bmmBB*nTnGv z>CHaY{8>da_nv_2RFQB;@_Hl4{>gH(s~6lcM+%cF+4AGhV{Y>ZpWI!L|Ey#L=yM+B zn*uL_7$>f2PuKqHkgnm^9sc6I!oRhVwM9U>*{{v>v^=a-Mh7r)|2&*zDh7Ry>2`{2 zF6Nm91T^CME-C0)D3P~IFSg{z7$1cfH{GxAx-i)(4a=Gy!7+yOLCdB8+GT&f9k>{S zgFo`E!_(TT?zp*s27_nm!XFF-bINX;n_8=gOLeaF?4xHb2}AGEL_TF!-l{pByuKCg zdh06fM>%x5tpmz@Dd7|L(Q0r@cm!Y9rTtZ?>D!y312pDmNuik&f}5w3_KHG83?4wL zEZ2@S5OK1jukCV1jWV`!tSWr=O!5XG>|FVnT49CWOI|~N$>leHBU*a;4nmHk#n9Nv z^e4N4gnwJTm&}=q1Wv&Ndp;mj{}s~0TffnqA1S9lm2KiQ0d4x~H8MU_FaeBf2uh8H zv&ZmNHR9c48sgm+tjGNL1di)N87MT0g*yt6K4gD;n$5d$^UO^=KcxT@O_L~s`Vil3 zW#PRcwWA#zslW~gI-%jrd7X3U4qgbq`Hog-)-K_lfve^>gsQ5t@#HrvXj`23Nk6?I zx}#4q08p`m-2uED%0J=lA=MgI!`h^o|eGBrQt#30YQ< zcrU*G+g?w5XDpjE%qOG+idl`@6Ve)K?B<($f6q7J(b2M2QGQjzWDKz6@S4b`Id{AJ zwd|C{Y9#@O>7828vzb$k%}*SxSUr`NG#h0i=>i*N_t!MJH!FOyz$n3xnht9RV2weC zw2S+Ote4NJXG4V}AJ3Hlj(+hp7$|VtwF@c(Fng@^im!#mBaoOqiEY_AlfQP05jwKY zXz~>&P>I`g%(H>Ua3_9^DV?YLjv{u+Qi({|3eO zB|u_0e%5kd>#jcC2h~8B07B3{|Iu>tDHf@j|7vzRI}IFjX{T+Y?Uk3Ss9Gzdd>UH4 z9b!E7*({?<6c)7CfQzxXMnlF&H2jLI&m`^atxALm*L7C@Z^?SwB3mMMb?~D3k7TX* zkUPECBsQM?pBc{aZ$ODERQW*Hl+X*u4xWCu4jzL6VQin*OW`?sNh$HBbP6l)4?n3*3lBu_u+adP+h=I

ZIE?#Ptj-_)Y7LXH7i^Bq< z5GSj&wCLGy6?fh!F@K;ld(nymhn)UMsQ+t9C|SNE@UCj>d zU@$1x>5P~G%aqddhZLu5BRn%DT5*|MwucxLi9QXXKZ=2WWX?}0KL$39YZcr4u1|6< zZ%HtRRD;pzzJ1fsJ!z6Tch+57$@ixahzI1Mgs4)ta9KJmALSO*m~+x@4*Rho^)kNh zIjdqRMZln~#?gR|%pKoEXjM06a#Ga0j#yodUPOo0i3M1xIz;bLd8y-Ut_6VD*tn;}dA3X{V_R_* zqT(RFeBWRsl04KFM0IVC$5)J20*LDLKLxnqmmmAujogWQWrMWT(xWPk+xut1 zC{cqH8#;_Qw6@RP=5+>@jk4Nf<|k6L-?0a<{a~(1nXiv1UjtF#BN%mZi_0W@R=w@{`Bd)LeHXS5Iu|%VWL1WS}aU)81>fJ zb85x-1LsnRVmW?@y#c2}ko@=5`*OW>!1lWe46a&~U_tuxBr>EeZrOHmK{kM0TGT9P z>b9~f-&z!T&8FL7`fu;sFuLPN&d}-$QYOd(< z*tctZAvzXe{tE(i=Aw>3L*tFEd-j9+PBv2-cNZvRFsfTtP8t9t^6^1hT>e|C@-ICC zzX;BNE>cC5p`D%|(0Vn937yr_qO?=bwj)TJwi=2ZsOI%C;b<<0ZYVCh%Peq@$J;@j zF12O;ZJVvG$nJg_l|BbvI!Gd?rY|;7rILP7>F!EYyS*}z>2$HNX z@GL3iX3tqbrfAGBd*QhY)GYrwS>AgsBcMYoWNDpX|BERH`U(fur`57LH$txh!enzg zgKGJ)2IBC2`xJ^;p$3r)f|)HnRg&<*_w`T*1L9J(sIR7hOpWNga+Ew?`16cMf>8P_ zk}urs#V?8K@Y3*xZx~EJa$Mb-iLSrR-tJ>D!KS{4e(j05SD1m!7Sv7ct9qc>y(s43 z9#L35TaZh#3?G{Jye8*dpy2UsW}IBh1f#8`BsD5u;;EA!iEP&-bA5F41DtRkka zw_Y*X%LYzC%D9>@vQJUHKN7bzTt#h$9L1JxxU=W1_eqzW5w|z~&VxpGJeF9xl6)`= z66=%)7B%MPQP}tP3SP|q-@o`n#;beymI$5m1C&LOMJu=W#z*Yoe!tK1ejrTdM{h^u z*hrE}pe2e?t7msSPAtDP*JPeai@TY(kz>r8m#A)&3EB~Rg)XLjvh*cP&{!EDezdC* zbDCNFosR(nXInkdDX!+G@_CO0EutE-CBsuHFn9jhTRnV?kP{ijoBty6c8jJnQmWPn z@^<3KT%{9c#^4YD*K-Fxsd)c)IVW(Vt$2^HAL-VSWANXtY7S!DuT+;g=YI1IytGyP z@Y>NTR|Y{b-fHvskUexSMyb+0N$_{lcxQ)ryeV~<37=U#16+j1TguBg-sBi6agEFl zNWMCfk85f{M*wThcU_m{el;O;di$i!Bu~*^a61U!&z=wzQQ}7PdehP|)CpL0B?B8$ zXPMoRTu+-H`>L$nqemPeOECXgf&O}AFXmi=0br&2AOY8_~|1$OLd3^H$l|S6PG2FH3jlPc8aW zd>OUz6i8hY>=93e*LNaY8IgX{P9>fRYYzh_L`<=?*OyHoDHKE4l$qD;^NzjMvWi>hI+IM&USbP{60?v!C`lcB{LHKp2 z)jvo)AuhW*800*5S@@-AP1H(T^~uZCFn`@9>Z@J%EN!owFYW;h4x1ieQl-(rhadm@X>zISKdyYzUGR$YuG8;QA1OMNkd-J85aj~4 z3~KIFu?FSYxhj2*TeAM@JI))*!|@kPH8P{G7~Y0;gOPLClfviC%E|hyLtZ_x1>d)L z&Bq=F&f>VPJ)86HL>>+W&q2xJtDNG`;=xED49W*d2yqY`3Cl;x`p1gHmEjpd#RPTt zh$ci6c*(e4X{sF~34~^sQdVZJubVQPG^VjOOkHaUH*~owbMblP+vZdK?E9wcd{36- 
zTG4H)28a2>#)$Ka7Bhiso}`*0V_)sWhO&0Q1iX+>edPGqxmP`4{$YXy9i$LTX0H~6uE-L)V;~-}@sIM3p5X#J6`W5T!`lny~cmYh= zogWF9<^|cZ+^+=a+YY$$0Mxn#pMOXRjLLcl*{`FxTaK&PNuM!>I*j#nxoRU9%nwU%=Zr(`=KK1#Rbk3>`cI!QDBPp?88 zr1L32`{yf>#_2vfdUNSaH5d-yxvS>ly)Oh zaL!1C_s9{y4rTuDjEjGzC~bS)mRf@UfBG1TKH?S^idw>#PD5$p#6x*{ytY8k&%)zF z3RYT*xlE_x#;+w}BdL{m2CMtg?to+Zzjm18GX@B|f4A9rQo|6YlOrnM1{K{-ecYh+U5Aix@P604}L$`TD zl1!mMcU{)Ri0D62sA!=Oli=EjZUW{v?bcs~LZ7X7h2NIHn_~)KsmEv4r?_P!Q2PJv z0}3nil}YI^Px}`Ruvo2lfOw$)FwQ<1@M|~x{+dniF3TmWm1+wq2j|S_S*^nGuI>qQ zWC6f*%LEM0Na16gr_5R}Y;JS}}-<5NSOOm<9UdF2X3Ze1f&~PU3vruUjKF59X z9_pm3?VLOYB)?FnBazFylx4>AIG-g2*WXUf+Sitaci~QyexUqnSUYsIp(G9sQbh9A z&Rm=>6{U3CLZjczyr6z6G;L#oQ?hit!mB9M^UNmHY2ktzyHbEhP|^-p&?>6jX0Sx` zW(Harb<%N3OhU3CP#!)Ls@n6mq8KA+rLFe*GM%-yn47X67J72J7P1rQaapY>Mud(Qby#Mj z=@ZT)P3s@14$>B={&LQxV#3pC5SRcP`<;%Z?sk4Y$!MgUmXl5qB z7{^Z6AHHE%<`DlRSngtYZl2-|RrFv?bL#y22-oPgZ(YDfD^g`&2-f^RO@0m67 z#G)oqT-gf4dvM?KTHuz^V8`c0KFEk(!-@TtMHwL9G`-iWjUliw*urHtX*ygp;OL7^ z#dU6sVKTI68aqrSh<7_050gIiC>+brc;-?o(+^}zSpRsesBw#}(L#g{VWY!UjPaVK9XLL2P zRGZb27cG0l6ekNNKsXo$wPdZa0ENe>Wh+pR8!P_;v)@FyXz*s7>Y}`7Rm$mvx%$l_ zWz%7Xv=McohR-3CMLj19rkG_z3P(Q)EQkPYTBrz8<8+FsTqDMKO1%jLe3eLSz^L3WI4%p^*pAI#qmhr`%Yg7oN)gOSN^R2pgfkW-0qvhf6hBRAW9)HVR3b_h{dKQK@=jj&h6h;oNb*{*Rj6rLZ~lm!>ll$Wl%c|ju%yfzPGH^SqI>L6FO$EP}jLyG2~ zT=^cLO`rOu#%j-(=B{rSU3{CU4cu)Ey|Ko#mE1Q1m*tZ{2VsC9uuNvN3D>o&12Ti} zdmC|G>`^0xpwdSFUF+7%M6$*EoR+W5b*>mvb!tTi3ieANsIF+ipXMX-C)3^#Fp0r# z6SJ48!jJsFYR(fJOW{IvCEn$I4VxUPD}kh!y(&ke(n`6?!ZKL&L1$1DPTMIA{OHzm{k0pvOpBXxOI#ObT2enHZ<}hK5!92*{Nq8sNY+%x{-I=S61!s%M~| z0@+CGJvgU?K4xXlBE3MxOyC`fpYE{Hq}!G3!Tz0uY@*zryWOc~+aLKYMq4-#us8*P zaAC-OL~R}ok`f4B-8V7+X(mS84ettJqMZVb1KnjuNH;**0}S^nAor7I6>vpmyQPq4N=G(5(r9-aGgX393+fWc82)pD~2Uz0e^HMl*QlC&b zYQNEQK37q-y)sc`xp=6v_CKAn)%x}Q!Qt%4p*W3idK~cPXAkF0+|vq*Dm5S2MN#E$ z6hq?q_K4_npF6@$lVGmc5ebxEPV$yL$WvBC{6T13WiMI?HyZ!X6! 
zbI2$|NuYX#`6wJv1RDWQnBGl!hb41jpZ@k6btEOjJV<^asT{?XIE`E;-Y|ZyLjWg< z#2Qu_51Gi5q&3HMtTXNuQ-{E3T0mfyUN;Kn5$vIlkuAU5UhTbE+*s|`*(4`y0s@F6 zy)4u%B()6(APNcIr*XjNvE6(%HcUtANrLzaP#uD7jK5tAutk5kF76GhJ(`g(4$bU$ zfX@0=5>H>EJ-PBx^Xc-vg%gHN6xY8Z2<8YF1BfJuvZ2Ldy>wVButywejPvdVR%fte z8FAoLvqKoJnE_XPrHF31aqD`SrO9o0MpR9PLT69#&ri)8N; zDNoO|2f!by1zn?XhY6#xNII0}mO7*o!PW2RmIvRj&hM`*phyRFom`C2ZQoZP9%-x! z_R;)RhaHhsHGW-d>0G32ZOfxU5ZwB909D%~GP}LrqOn=(4a?#_ECE-+eJI984)4CP zg|hd`DHzlXjXj3~>Ek{TrTuk<9w1kIn6Zftsiw8O<&&&ckdIREOa@R$y;@iPHv%&}j&`IS?_;yGY5F8zQWl3LyY+BK zf@y;`!#C4aOSbbQL0n>*NL6lLKgJ7oE`uZw%2z*flE7-fC}rQh+V@Fgw+Wz^fOO0U zhJ#$YS8#o5 zmU@>i{L7a6IXZVMX5H=mSxY+S*DnRVWl!VO^NvWCI>fkMr#^znYPTW5NnGcEt32)x zv6DK|Mq;W3<^jdUAE}Infi3$|XP_ReiE8!~4INAu3Q*M7#{!-Mb1_6cwitBXD>xqt zfV`HU`-H<{H(n%mG-av_xQrE%Htj7LaB&2awu*AiUiRj^B)_hRUa8;G#kE1a^=%a; znd3CDPZXUwsPj29=hPAC(`1ZigX}#I1e%C{Uvdi zuSGFu`0JN=4~Zu(Jz!pa8`4{l`BtPo#})gegPl{V8kVJpji`zV_(Xw_mW&c%T(cNM zgwA{oflmhyo2-5UpuZcA>jA}I%8vU=+r)phZ`JyZ5f|YWEB!BrHX9QZN7B-@<+Z$U z_c`bFNii7%AR!?M8@wl{jNv-(7Y#_v2-K`ZSn1NFrn|NvI_`VM7>)af`srn9962Mw z9QbwJTPX0{BVk?o{`E76Y9^uM?45fB3q3uCy3DL01DMjmWFmn?&l>zwaMngGqF_3? 
zP%{jT=sj=zv~)JTaAiNyCOWJ@`B`@Dqh2d#?Jjp;LEB?RdY6lDk%o-VtHCxpaaF3y z&JJ}wuR}AwC`H;AaTDZX$KeLlTLNvM&?*l7kRUHyeR{9Sp|xuC{%N&dk>NK_Ffd-q zubP&2-~8Nrfu8(r#Gcusr`=f?ms+5!fHTzJ8jP(R&HO%;m5?+$(#uRYMp}%Rj{pq4 zdSt{^K%29n7((UL{DNL4G!{uK;h)9C4~htp$QY61U(M(+#np1{z7hR*tyFdFA?G@D zb%f|m*v%b!L_U7E(rL_mS3SO9OGNfn_h*;S^#h0B1)osr%7`|J>2vQwl?xuc$uszK z9TTPzc=91Z^2**uv#91;l|u#1mG$V2Vvq3NtJSNGf0X2eJ_T1`=r^r2f8rupk#sog z7Lrzjjop*r-F%H#S^!9WM-nW|KJv2Fbf4YIh?CQuqB$?0dy`?>uE=W(xt_*uRvS^% z58!3DAq76Am8nI-+{x__c)5AE#C+`%O!ST%IX_`@X7^wRs?DmU1C`LIZ-%#Y>-~?a zi%amKrS^y@mL9bT-|~@X9ENVS(*dL2O|on^6R@amRcqB?o&(i-{z9Aj4kDiruM$)Y z#d7q`$>R#l%hygzX0ya?3HGmOtrFcF<8G0CPx@7txmnJ9pI&*-Psr>`^euzJ1zpdk z_nb6F7)h!({GLm;s-NxZSYSiMR7o@XT7NaJ59_n)(d%g&+tuppIwKb4FeFhwjzpwUoMR!?|CU$J^VAE(l3R}iC- z1wxg!QP9A9C}%dplB206L@|3@!@*tzFSCFR)Gu`4R&A}UdQ3JmD7XjZK4Bf+Ix$W;3kzjzf*WrXrtbOEMAdB6W6{B?kw`EYxw0k0 z`vS};Uw_E$O7Lq4w+o6|)02dxFZs4hXF%9;xHj-kxiQ@xW1n(l@J@a8X_}RymY@Rg z(%YDKb_i&wBW|lqBad;x$dX?Q6D8L1|@T9+N>=uebNSvUn6l`5W;Xb+`68J zq9hhP=aQJjv~bO~1H9GIXjNZSe{sIF<6SNYG+<|n1FAvmQpj9#UmH`mYAJ9i9&V~O z8$V*956^k)(N8MG<_5Dz6b%=PUu=V;K<{Qv5 zM}EV56b6^C5$N~O8^g`5PdI$Y10!KC$YIcU>T_xOIR|1g`iVsg79<&B-bic_ZC&cX zxQ`pLGyB3MxpH(N|3IUxB_XXytHJOXKJG&cL!&TFhc_0UBIRax1Qc>FBp?EfUDXee zE$bz$Iqn76WCq&oNp>PewWbQ_gGma;J=rafMO7Zy|=R*RB3puY*KG_0&SM_*y_?sIb4_zr&4~ z5)fh;F0^~^-vUPNAf0-G1jRpatfe=Q5HvX9*3C0C`UuGKSuc9#`w{-Jh-ZdMIRvM6 z`EOH!#{GFFb^46M6qUIsnV^PhM?>|;_F=zD!j%N?$V;77djac?AhQpl(h(J4!O?6)z zn_A$>B#yus8bD(6|sj{9F|6{sbg6SaDU ztGXumtNFlu{09b}{93PEw@vAJriiuB?xDW~(9{eq_fZbbHu5$Ny7lXCS}Nq5kA>b? 
z(bn}t*G-~nVXb)^Yts8!Pso`1r>+Rya|V~rqk0;SZ-b0tvW^Ty@Ffaa)8RzMD@ir5 zR+^ANo?8!CO4n@S85088s~ZYl(`<~CL^h|XOrA141z5s%W^hb4P=fi6iejx5zoHTW zcL%ZEAevrW&ub|x7`^HiF{+gac$a0hr?T%3ed1}|hY>M_|F*RE2t{Wv7#fypVlaJG`f2Ra?S$=8u-6>~9n-O)PO zp5j#oGJAy)N?bv4o`I!l*)yQHQaLhp6g3;aiZ>oArS}2XHuI=M3=9kg40i!mBtu_@ zo~>Ep8%Dh=s| zg`Pa=#GXbfcSp^;Kl%}dP{z>E&lB{WH1X8*=$<0}ki#F_6{o*S&$a{U0lJhx@5$M- zthEB)eGy_aSr3k;LV(?oz@k)wIAo|>+<=7Mka~7(aR-UI*Kj@1;>ruOtmZ>`?_qnr zo5Mk`;L?DS`8Ibjas3h4J_(FO6zasUiD~G61VDD)kPR?H<_3__ z!OXo|SmSDX#l^Cda*4`Gfq_?=dK?BpPbfZeJ+%bH zxfTx4qOII7Ws1w5Y9Lo7*&^D%0ev&o_?1%7tl9L;Efd48z zxe#5D{i~B5GFFeuBPu|<14uDFt4v7^@ET~j?CWvoT#v9Ak%ieIk{B5ZeDCQn)CXdg z7k>7kj+$^G!cfeiTKfh;%4;X6Yx0!*)#85O3k@l6dac$d#Nf={^V3$&U*!-rHboQLx=OBs4y_qAX|edML@mV@IV)wB)-V6Mm=zQJ3MzOBVF6jAIu8 z@m)O8T{wV7{iBbA+f#lO8x|)bH~ao+7;El}p|u!5q?grLNPK+KJ9DOrV@%0-%tU$A zjvrB%ZGE2}Tr`+|3e1dgH^y@jyHGfYKJ}kf%AMC2H16xenLd}X%_Y>CJaCMyZZlBkCERgX2b#PpJhQi-Q9y(IG;A?Ar+ochSS0gk4 z|JK23+6CVaBr|2~9`+B>=t5Q;|1!LLAmL7XS(gubc)rrut7?Wo!=qb&^tV6O zr2((*8A_lwXgWW2T_Y)MEY<*m{WJe0X4?1VD_T1?hmW`+LTOg+x<+&8*)-iY%&lpv&Z7VL{M*W zv8*gBZ{9B>qFNxkW-nhQ`*P@tXrBVmtZ>L3^W48n-o(3-bTom9OQz zWRF&Q_@pj_2$A1DI?4bs=)?oJUi>-S2WrenRzvY_1P*u9hkmpmeXEI9?~n&57DN$3 z?ZZ$FzkLqoJk1_0UQ-^rjNN3eG?f2r)O*ZTa-1g>e9UAW+2>pusV)2!-OCb6Qisva zPqDQtR=@)Yy6Q=!*--i?G9}RS$6Kxj6zL*GDWd#p2o;Y?qHVCe9pMtr=JL7=fsl72 zLk%^WCvS=qiG`D2$aY}=vpfg2q9#Xqs&YcFc#_d@X_k5;f4vEv#uGPv!fUe_+}Tex zGcjHcWp2pbN^%}UF;E;~GT>1jB2N&z(y5vudz5WdX7^&;N{RaVOMkCzu~=FUfrQLg zZwQs|+nQuxi&U{7@3Qx%Hs0w{nBm9cvGh{LzJJ+F{okHZgPteROz&;YO(D77b)g)ygswU~Y z1R{KEB%)8{T|qHJ z!DV@hfQN75-37GA?YOVIHWX3*sPkt?*hnTU=L^xq&A5wiZRB5j6V*@HkqM3KPP`!P z@f-WR)G1$#D<5=G>gC)k+(ZN=K0|5AjsDNq3@kmD~N~XDZFiF;>=!Tqip+L z6Cdqt)#82Xa`d%v5bkNX z%bk@0w`rtPXd^12DN{bGSdBC_Uz~oSpvW5 z#@rVORn5))wc;3;>X zRvw1M;}PRRGom`f4(xiDSDw$E8_VnIJY11~A0gMB$ZRU)+)?@Dz{b6frr$)kKNKQS zuy#DWkAh1@#r|_S#QyF8@2WuZ@86eJ&N8Zf!$w3P2CCHp&(k^HcS0BIdoEOr3U^MZVGxBRo{g)n+!(>LCm>A=P z(}D);da28pX`zWs{RKXW+FqI6`UqI#>09Ul8SAF_b)1Q7#AeUU%YYTByr9G#sAGvx 
z8c!qDz(LO{JW8TW)<$H@UN(cJtb$Skgmv6sKKt%yBSAn! zx;c=5(EVOS&!@=}h;Nj99w{qS4a&pA0t ztn#u<^($5Kdp@dCA_NKsgBeHKS!4-F45?S7qpbS+ZqrHC+q0w(yWA4{sJFn z#TtfDPiaJM?VnZZU3JKwQSYgM!`+L~`x)$R^!R_kH9w1D^A&Cru2iq)>;f8v!t2dL zgF+^^LojtchCI72$OrUWMQVcP_6ZDG(o$-vC#-A?XBhx}lJ<7mD^EmVe(4->wew85 zM3nMT{_~AqPp@=4!=%lI@E3$YzE6W`mNcHFzo)n|-tF)Aw1w{3VrdgDtphH_juVEB znnuX#jrM6MBxYOzxj^KnIWn^nPHl*fCtz>pq~gpY80{LEvaAc7@Tv=>=nZSHE-%$S zB@uS^=SU>3X~NxnF}xx6S>X5*ci#v7;u^H-ovqvS%c|MUd?SjPlE)b zc7biIHK*jxtsdnAa_;12o*vWbXSq(6k+4weBN#4c#mOkI5>SvD+LQts+aR#?ppM{a z2Ui-68*)B%SwM6zl+L)^XMoR~jDhNP{{XWyHe3FTV5n%C11@GH2sP@lUFA-(787nG zFWTI6T}8hL?NOa7H79o^McRpT-j|o7I&YJv#(dXc-fgyKluM*VGy;GC&QTe_>C93| z=2?)h)~Ec;%PvO7Ex2-L*jn{2$F|n;}AwQTO6j zI^xI6b(}#`xNFsa3vXYwsEc-_eWn!rqfQfh2ZdY6DC?TSr1_P^YwgPQs zH7}pfzAQ0WsFVBw$i?_a8*U{CirIV6+t@f}DciAfcHVxK4Djxm9*{=(xVB=*ix#%b=OIT&zGQgeYi4fLNt;9ECV;Z2+0#ZaiH_Bi?_k3pA(LZ(L1TvyluCHq7LCq0G-mhbjqTl zqF@)=UVNQmHO|kk`FsoqVR&f;0zf#nTojDnx^s-^dpU-=$q*L0P z|9;4%*V}Y|IiEEA;PA}8|BCRAYU(~YuSIHm1SV?*>P{`tB{5xqX5Z9%R#Tq$^hQ*A z!su($%Y^DZNaUxsv28h=$PPxSucDq9bcpX50z4S--^PkQe0L8W5%H1ATbhRqcRZyd zXx@n;SbwusowOa&D$2{eJc~<;j$d7MPB(DUchZ*?>FJ1jGijYwnJz4gvU0&)boGIq zW#c+ah8&I=n@USrwuF;Xu9l}n*0|EcT;hZIA}8FAGqoWTy8_5JY7VFvrfwpR*WzqX zw@23?@wIuXFb7mCoYAj1$lp~34p&~^&C-=t@+?dk4Ka4+W9;3N%IZpisLbmHKUgPw zbavm77+g|z^W1p_t89y$cO$f!OJ1kvtW}<5O2u1rsiu)BPqc`KAbz*)J%iyE`MviN z+ckj+il-Hm227Nn2v*XCkA0fx=Hf@}!gt#?$Y`nXwTsrce5`LSTzBqvdt_y;x}I3x zzi>xU4MP_)>t8{uw7}g!AMB!F*%-)T-!@|R{mDF8fgOwLz(xt>XSN2wmFUdNzN|VLU_A zTBiJg$2OD$b+7(dp$1G{TM{l@cRN(mv>F+bBmbs*{?=`>b~H4#V})|K;qst+z8-YZ zX{R*a$n-j1M8OQ)JIE1Rg$ciX?Xud&cl;2@1g!JhNl0 z(5!tML@Y1ua`Vt-0hc$00w{cEJ6qvXW_7nQ^j^z)TjUOhH|SBRKZ0SYLP?gQYYdS% ztu`MeQ~B!@L0X=^a5QI-0ojeuQNclZZ){3WjdYBqOShNTO}H z(uZ2@C$ryi2Zf_*^e(QiaEEn!QupljI>>=jGZMI@Xjq#<|BK&(YtAaSfEr{CmxO*6 zrNHsWTjQ*!)ky!ZZ=EpSN&0$(JN(dkmrWNTmZwi0CEA@mZuZc1zVssXmZFow?1_&( zq4wh(wV-k`TXWuB+&Dhf7hMr?bC|?;!k9U>KK1NxmNVnoQ@H!JH#P=3ekS3MWp&1} z+7G^Z%D;H6<1Rl3@hFQ2uv_-Cx*QUaQ=nM=f1G`FK$Tnb?xv(mQc3}l?(PO@kPcx3 
zA|Ne|Qqm>Tor)mc9h+`YTDp-g>ALU6IG%I9@BZ%n{l8Ed4Xu7@K99?~K{PcZOepz`$B_XklYYT-K@>R{ zv{zjMbxNL=$Kx-ODiKM`6E@EiJ9xP#( z;b(!%ue`-c=M;w1ly?Uo43P7G_4c}a(J3UTEQN;_B+GKlKBZ!XnGNseVa;v|Z#1qv zIoMt2gMV zF7_W8VK6L`r|ws&2c$|(27W@N!9X?W271w08MOWTVO`F~D5dIgH{vlvZKs&7XhRLZ zt{zrx&Fk783^+Xvt9A#{rS{_a2LAx9fiI+-?yOmIFbn~jaI{|gDdA05DcIqD5Uja! zb!lKbLdlodlyrb$5Nn^y!f|xvBDbc;%XFU|b<}XR%S^cn>4$bPkp~KXd4~_eUW`R+ zy6b6~J(&AQvZY%3VXA1dxKk+{%-QH$cenIVI>Ng#;4Rb2_mgCb$lougpAR6l6=syD zMLENu&X*g)kPuh9&sAG_T7C-TgIV1-Ix^|Yim}70e3{CG-z!EU0EveSMaW_C)LIoh z>Ao-3P)0Uq)aElf*l78|@G-kUwMgNmfaQgC0v6r z*0YtU1;JMn!-!#i#UF)g{d?}vdGVjzlUhi0O#1B1R-{Hsm!sKW;2>difBrk9BG;`S z10Qk@*8zrk-Oi0xyVZZ@dTNg~Po`qnY~<4;gjHiA^0Nwfrq6)yx9x+4DVk6u zVn`R_+7kF`XE{JhzmIkxsp&y`uU|pmcT%o|Ay6CrRe!WhpMGV`P(Q4ebB7Z%KZ>tX*$dWN>z-w8I9^0$+3~bl0V7rG8BWVGp<~ z?s8+oVZOATtNG3O$VGb|@XpkR=ZL7_D;VQyIeqe(~}omBdxLheKcchir$;LLOlFJhQn!#QzC!Pe)cgt>b&XOrX-y~hcDG8hiA!=8sKeHF z%Q3XU*A4>>hnqRw?{J~6C6;aHThdJy&jZ1Ri87bH>aPwMuT2jxSC#E9^iip+HPnl~ zHp6OpLI?{{z{{F8bdN51fEzQ~EmFjaKuPI7q%>VcPFo3G^W_B&wilA&P8ic_>I z`#Ra5=^>9wqezkd-+opY6}(pMUgU?PmzO)f1%}BN2BYCa?7W7ZWYy}&XuqQr2W>q= zggwY3ef5Zq$BLeG+T#l)6wy;pChIW9Go2PQkx}O-Alny@RJOu1x(oP^$d`|0zRa(~ z#eE~H#UwJ8IV7^4vsi#1&;1K2Y)R(JQ|{swIifVcT2lN(%+w$>#VJ|DDueGxSy##R zYYoSRtra;b)n+XhKit65vjd}Qx5D;~h59cH2Al-Lc2Y4Yc%08qABHO;WlKcFqr!Ml z;!d_zmd_Nko9~c^;_RXRtMLV$5k)y~t63zdDDN)iioX1CwYTnu^( zmt3iuxmcl?q3NaY%X`@)n$30B=Z{0kpm=psKR)YNjbDKi7ufdlK62h^&tRuh{lnDt z5ov)bt9Lvnvq~OkZv`sShK>icj6Su}zMf+V#SuCQGYu!gVz!$dpkPEr9kx?9vJ5rFO+{~JE_4}eU&Lqv%R(~xLe^& zS}3{QMn#cv9C6%q;1}O$=A#85jLUvSQG2n>p+wSvd;5s=cp&@zo9Eq4)+gFuoKJCe zf}Br3NVm-%-V`spIvm+A&NgBylLbzz4<@q=Y3#Vjs1@jk6xLZU#X9Vi-M`tP_G(_8 zBD!ay1!9+XX|~V!-#XddygU)%l5Lt%l}rcMNnN}&s;V&y*+sS0?ax<2O%Cq_r4=WW z)K|^ttkS7435Cv7WaDNH*r{i)zF8@QFkwZ zq#h+sQ&p09JneI{WJlI*Q*b#&?m3xnIRdUaYpbDeKNnn+oO*SkqW3mg#%RuO(9Clp z^d5U~{@2nFfZ0a%^y^ouN7Fesm^msj+g%Sg4IP1Oaz{l3xNSMy^;Of0d6IOfQNjMm z%q~o7zmU8bCY$%e_=AK1K&~~SqPesTzt9G(82@)G 
zf+Xzm+C=_w$h*?gPM(C~pB+%?PPWqiz$9_kqpuaMzmc_nLmD(pH3)_nihz;&bC_9~bUvs#OV|?GOk0vS=Bz3x%n}Vw!L)TYT zwed~orqkjCz10QgZ|4_LL`N9(eF$@jF3|h$ScbM_^1RPX{Dl5G*L?r&Usyq)VDuE! zYHx8of|0-yu(*H!2q}(1H=-qKt>Bp6pw7@fc$d@V4R+P;;+oOr*2>~B|5V6wbbT}i zaOwM_{VOz54`C&8tLu>M2 zCHGVM74Mu562sc@RH4UqLl&WiV+VW=u%{ylN^Kh?y#F*96kXpo92L;0edi0VYuxBxOilHX?5|az(JQuPO{5Hc{q!RI z0}n&2daWLn4vQRs3aB`1rJ(2WEIw;uzZ2U&6!1Hwya#t+3mH92k}K5;HRJij=mIql zJM3$cyBje?8eYlj?#xKs0V0W||2}Y(3kHjTJ67J^_Z%%Z^RfV5`uyeL7=o?r?4{Az zlCH${8%Y{8rQO=K!Lyb2%#^)ZF+xhrww-yqvi2IKYnFp8p{o+JkwxnW_UCBn&TbxyiA%U|IOY&!W*T0@fZL_|3A((<#Pm<$G+-VaLwlYRCEY(=c~L6 zMTxOpQ0mgpqVJDTd~`P23_Fn|kr|GwvL2k1+a5N)WD&}z^bvG0p>zs>AmB~q zX$C-^I_!^NYdQZfANm!-_k{kn0m)oY&iViKDCkfI_#Z&>2PZXz$w&O#dBTU7x5Dyp z?fnzOeiKhXZSV%DSCU~p8Y+16FwY0bmH}dhUhf3(C<4fT2=f25Ba%93@YNX;tI<<9 zN&yp1Rh6r^Mwuc(56}5MbdBW_$|1MJ@@|27kQB#`U64tPQ3X zixN9nLQf!v|0clyS~f6L-6Z7K^Sn1fo1tMu(IHV!!72TcMw7BBE1wi#^oxWx5P*&u zp$ZiJi%c%K|HEUj^FgV{Eu^Jo4_{Q!lH)fGX{hBt{T#S?@vR_p^Lx^|y+7IwH5spK zTo%TE`)xfuhzzZE*nQbYFCjZGbXhd$Mi@knTMRfEBcK+VqE(eX-m4>^#w2>^@p=Uh zHjS6D)couah{`mn;M;QuPw%yB;6xnL`}9Gb*2_;RS`HZnU6G79?Nlq>bg}x@O3&na zi1$hVtK$vlxwG zZX=v@ag#Y5{^-Y-RJqF;dvF$;y#rCzpX^j9@3*lmc39TVJPE*9R_C`18hX43sQbuK9p7w=I2xZrr&MwBu4q!J^`EU;W>3?mwd zuXN&M=Cm(3n5b8qJ}0&4&fa4@SG1sz)%Uc*d-0CnK=I4gY*HHz^-kFcHVrhW1gY=s zJozi%Ih8d_L3rde|MM(trqq;m+W$tDrsV_xfWM&;UVYskM7eQM-_&~_fXZ8t z-p18OnAG(hiY)=%0v6@nCi60ks^Ye<-cZhP7tLZId1X>>soy4y4Lc`Mb@ILOr zb>fno8pDIi2Gas|LBQ?NaPsD7vi{0o_F}pm@Z-O=!fM^Zx}qV>pU;B5VWX2KZH;%t z7yB{eGadJwie7Ynr1Z0a#4k?#wJ|Ld*`)|BMv)cG<#*6e*K##>`lP3lA+WY6o|%Q_DEhryiz zg9Ced8`e@<#M2uP24UVBBwvyR93(l7f9~6_udi=Py#)m+C-#4HT&e>woZ+F$!sAxxpx4MzC#Vn!a-? 
z_6&~y)34|cNNZH(1F$&!KLha(cZ*oPy1Hmz>kQXjq?$_=W}clalYVVG??t7sV#C}> zbM3n5YcMSxfDz=G#%I!PA(gsxfl9z&9^tF$eYyN-SjH9c2S?H>nbo@(nyr<27>hu`(k1 zl{ZIPFL_UT$jJuzfE*wQ$?}h+bT4L}A64EGP=tP%R8(8tS2DN&6+bXiGNdQ2E=RQ> zvAWp3F5lO3)cgH z*ku@hEREhNZ@y8*Lx91PI!Xm?G(yh>(*4BnEP~nTL~XxrVo*jN-@5#%x!6R@cuDGF1w6a1p}``U1|H z!xd5)+{(=|R>d|^FFvglB&a6&FIXLXjN==h$3ie^56H6J=ddcC^8ZCQ6 zVrQ4)Qr&u9_D4L3t_(%^!DgGUKZZ?BcDYE^3aa@m@+s{rpyzx&;6n+YdCl+Yg&i>X zf|T!_=5||xH-Xl`x+4ne{AVE6|5t1L!tQ7@CSpj(+*aLLP83=5oa89p{6}XNs-Vb26FK+P*NX&wLPIS-ISJjS{ynBU z1233%wmizrMi%_NzS^F_$UT*xF9xkjeyPVm1$%=3)ZhNWS7gimMdWODa__N+R`>Ed z2V-LD<1D+rT>B`XyE@1XbsZ8T-z-pU3;yF0eycsOTr!zthm14>b$L7ltfYM<^RYZ!+K~2- z64<{J7#3gbcYaEB!GYAHI6#q#ut~Tpyy=C`12B!l`<<20z1B{hU}9oC?0amdl_I$w z(yXkzD|HMjTeqxUS<4~4C?}F`0&=bypgT0|!#^?(xWS3f_`*%`0pdySFUbV(=1=9W z>B04)dWepBYsQ&ye=;R}Kyjv

G^s`tdaA3W@Q znGrras?36(KmC`BR1_~4x*EH?=2`%`| zuT*ko*-Ntgj(|fz z785Ch3nBY!VU`m39rZsrbF1$--vhK=I7>um%BdJMMgo>n`U`eoudeQutvj6$PJ*i& zokZo_o#vqmybe627fiqI>{REIG!#^_4zxK;c?R!SzsWv8CRwn}5PgG4{yLsOKlisk zh<4?D z70(;??Oj3I?pAC5q4Z|inug7-@e3E;K~%Oye-r_?D4u>)&&EJ}XYaMI+2LJD znWxjk8sYt)coQO>7eT5Gc|`;Kg^V#d%Vmt+M3fF=rv%1? zaKSN*nRJC7g_m>OA18omsM^$ZN&4&C^OPQ~KbQ8V6q*7d7X9!J3^h<5L(PaHE?rDW zC4XXZLkShBLy8Cc=u_5EdoL<2bhqtx!&nzmpS21s^eIg|`?O7S8Px?V^}>k2JdwFC z6^U6kABSpf6y$h;O!E3gQcgPQ=*u8bW8I%?VRJJUp9_m^K2L{b_{{3)%np2-?QbX| z{~eH2pzeyH0Sd1@CytGd9F2UG;QCE*ZdV(4bxVFLBf~o1txHIEs(B~Us!Fl$1Rdjg zoF7GGN)8fHC^bJ!M4UFXB_%q~k2Ow?2)qC>3yp-I8^5^U?`&|fXuA8&nijfW23EvP zPO?_Q!r^94((Wr-IbOQtz7&dYZ9_a0%q^pg%m{vUl6zB+$yqf!3Zogs*GuK?&i`d zPE?F0+d$C|$Fwsz>V$YLzBU7=edc73Gn{4h%F4qcx+^9!rG=f#xn!gnUNI8|tehKZ zvtlb#Nz!;Ty#QS;hEIp5Tf!9b_QzF)hhVrrFz#uGHK9YeHYbB?LRZT#%8slKQrcwu zHi=5ztYU?O=45r&&mF>GSAz5-^}~e`SQ4?Zwtmql0qgYHfxH}8-+dM+w1e~SxW|mO(&-+sL;&3KNA?JX8hcpO zd5daRc%nR!)~O2!oPxw`J#Uz6R7QD8+z%nim(FYE4ixmZAdQU&sVU{FWULmt+?21~ z!xoKt`QJ-kT)ZH*3&HqaF%m&mkGaQrR~zK>N_v!ghYHe~*5Fr5#4#z?A7uu?NM!QG z|JkHLa;L8~z3f@{THbcaFu`CA@Tsx@m;l!$c}%+eqQf^f<~9S(aKiH zEb!;!25}*p`UwoNjGJ`lM|sq;aRbb3rUPc?by~iHw{)Z!_B2A(`HLskou06;;9z{8 zLW1EF&Z-fU+dUkD4KYZNW(M^k>%X!i%!;ckL1NV6a96bzj|VC`&x?zRb`Z&~w-lCH z%*vw@V*?@he{R|%7Br%%`wBA!b8-FP{T+ zz%`O8CKOyG!gUTab5c3?yUTzyvd(JaPOZ|Y^JV13k?TAp<1?Gp7S5+5GEy^I9H_p- z9)fxF{+bBbvPWhF<(QK)3UU&AJ41i^l(*sMBmT0Jak;-v!Bd39+ug$DUIhjGrOE); z&wr?xpd12#mjlStayWK1QUZiu;g+LLN{1<_{qeg}!0T9FlFfZqePq`aLmjXp6!=@} ziB?9iWf*|DemSJN7{le$mUS%NmPS8tk>$?n#F^#JVV0Vu=ay56?5{k@hW}dwZT8&` zlPs+ND)I8#GPBob)Esv>^s2wT>j=HA-x)XCEsv8?$}Y-@3$OoNgJt%jH7(!f6XpYj z?KVjqe8`uj`7G_l;Tit+$W-IYb2Q2%C!B+))gSDQvZ+jZnPl;3DH6+bAH|aF*pwL3 z6kkzBab;=~?DxmEE{-ncwXyGpRh5g=q+jP{D-hf|eqbO-N|4~1DIUJ1;8mx>t={hY zv6U1l1H=1myL$TE7N35;e8oMNf~Chz<(gCI8KU_o5m%;Z}x7mlpup!oH-W^H-!a}gJM2DO zoL~8h|BBCV`@Uc$dNQRf9wS9+EnCYFQ?u8}r~h$*EaK_=!sB%T47lu*Xm{;*$sRE& z%@}nbw!cSgFICG3765O0@h(GqHzDFpCiK}BB0bo)H!RE|@}vwL6i 
zUoxJFQL{bkdhoomVXg~KG@1WZqyd?7ksjy%n(W+O(c4e3cy{RYBP>H}<@aUHeTW>2 zqZw7x<4-&5ucDa{_Efvi+ImLpmkzUAfk}vR1Qtv{n%ip_H64cr{MGM0uNt-dri~^x z+>E{;%(ZA>lGdwEbC#EO#nZiwU=oB3U7+4!Zt8zREdux)P<;6SZ~H|6p)K3~1m&oq z^TS}xy-kVz(lZ7wN7yOhTD|ykmG4_G1G1jJff>wO#r>z7oxlKM_U(g;rNGIrX&4BY z?}0bH)jk?6L@th;Rv7ITQ&HFqOh0_H59cG`zHu(`WUkh#yA{>O?qg`9prk=kWz4~2 z7&L$GM$J&ON%c#s0n}w$DZN5V-tNO!Mq4jb>iW=e#|t!2>F5#mP(O)XPFNMJAFir- zUtW|Bs}Ml`&=E9%;4*q}Z%|5STJTt^Wbbv-i0L4~Kq6&SyywS?EFmw43Y<&-$y_@B)UfvZnXPTf@>Abd76IO*ntV=C0Lt4kosj#-!Y-%cAbmy}TNVYrkFy zk4Tp!Lvxp^{R(GU*yMgBg=?cfvwwtG6-4T@8su!c^PyN9bW=JnSwc?h*FvCkgen87 zV@2eNu5^g9Y1DqW+GnmLHmX3xnfrJSJ8jgygCB6^3?wj``+(-iN&SS~=2I%O=KN2C z-0X{d3box&;^5a`P`};36LA-kbx44-!5@yCtei%EZuNPjSK{rsfu4VG^PLh&#!B!T^T=4QJQ**9hH*D^JwR_$gzB82REv^&H?wqHT^2i`^i)b|`SAb+-8enNIdi_r zax1ZDmi#1n4t>UQsEyj6ncH#)BCNA;6?1TcY zYW~rpmnUKXFjLF)$zfUmM3%;0VI7(McN4%zt4qe?B%^>Cp844;T8+*Hwl1pU>})C6 zd)-uBGV~)#rOK&ttDbX*yba1va7Ru51nVN3=izZ=EX1;C5t}O}o`YSLGi(-|r<2d% z#|TTf=3<+ql|oyl1b<^-3p)`;{#joefk+IGtJQ!r3>kxZR}ETpz5xARq!zJwK~xZf zbpxVG_(wNJz&Q{@n_=UTQLShAt$$E`4$>VF5kQ{jZC=OvwiduZ;Tozx* z*g{C;Zg(KnWB+tk%;D)CNSq`oK&*+yfpsMUcn`{lX2de5pbwV3H0UAz90h8JX7_Wk70S#bYXAHaL#nMjnA*c9&f{4b|(vpxv#lmUv`>j>m~+C;n`$xk_&k* zdr>KIAUT%5(J^2+=nO_(1nL@ftUL=nf|!9zj#Gi$9I8qB=Int(jmgEck*i$)<&pA4 zfSUSkjHk&9W}*IEGfJSu2iCIhel58ym$C|DoMoEe;V^r{7YV)`}1$2E?5@lq6ovDHc(IE8lp3TvH)8fGfi%N1Da3CGsd$JkAS?qy$&3>`Zw8Ut>UQ+dhv4n2aBai(#u7SeW z<+#c+H51v3^czb#-m01k9V)pxc(@$|g`;1{Y#Um#wv`=QT@G{+rkTxr3Mb^X%AjIQ zu6Eq5M&5-3Dht?&7|suT`4How><@O?;1Dx}nM}|uC=RmZbyoM{b^7ayJtV8J4;=xh z6JizNs$%AH+<@vM1YpjOR*7YmIcI(NP09I2&~E?hfk-k1F?auBMSmSr20PuaNnLjU zXe!QgP~VR;=g)bJ0tU){{7}c?k#f1-nv+&PWY{;sj^y;u91e zyOFs}&L_nGqp+*={QF!9BY{ebdDHb)XiyN~uifi!lW>i|Mago8m^mOBM@8 zDr3wUyl@V08<3IUq=X(jJ%b?uMA5%P?S3dg8-k0~f;r*$8eGGb&C z2~mIU&U*hk|2GmsF%S2JC;0raUAhy@JB|LNp``&*4D3!W?;e^C$_z)mV`^xMyp*BE z6)yI+scV>`A7dHYc*n5I*PLU3GT(m|{CN(Y*FGeT;B=BP5;nd{^k4wS*yRc6zUvh+ ztNWKbteg9cQqvi|Hy?RyccO%8cCxCPn($AmglVmTxr@vx$QO>S6QEx;u!i1=(2+r} 
zD&WAH_j}^pnTGHMTYJ%rV@Ket7rT;QfR?*2z&L`@biT>r>D1gkbLOU#C_8uRxf^YH zpv}`bp0%)Bk9)%7J0q*hWoxwQe9k?isR(XHR}De?Hv<6zK(8C_VcIm+faf_umKC3^ z_VjE$>DT@9T<+l-L&^_=_Lom{aaMNLd9~zIs}c)k#w>vMCruq*+hvNMT(nW%+LOpV0oM#bep43nk|iTjeU)D@(9*FA4@|n)8Y6MsO0`fJsWjar#ti;gP#gGRM5rTW=0%Xz?%R@O znTHe=C>&PZc@yI7Qx=d%Iz0C&4iw*dEe3hVqP%6EsI3re=XW@;2f3!x70&(ESv*mpp}@ zgry(QJ=t zdq%H8@In+>#&>};6y@uu*+Q8Pdo49hs?(*6%uiBZBcOx;o#j%lY$gNpuFsif9v(kE z?c-4z)*$^9iVH~}Lg#D)nyA~Ck}!fU5!~sAoz}VE929(YoaC9iGB#Cd^nU*_s#+@f zJ5TS+%~WA&!^?(3HQs}BvR9kBx|l4h@RZieJ|Q{N!0h6%=baI2+klDcUA$oSn{3EC zWczOy@d!fobJ;?d>dyv}{R)xo!&ES4=kI3qhCmzaE7hl{1(mDZ?n?lW540d=tlJ{O z{wX;D?QRCQA$3!`+TA-xxuh^hIm4Tfa|6k}v2pTfv4j$i!4=03;0+|7tlmA)BLBFT z@}PE7y)H$_*rTYA5@D|m-~F;@1D@qw9TNC0Z5ydj_HvixtlZJ%Ld)jVg9IR<0#WYF zB=Cn--!KS054v#uYF<(qH`;7`NB4fF9xx(kU)?`W5m3-t7YqrTx z)(ZgU5&)wOqJY-{4XB(G9RV0SN`E}CeQ`enm^O@9VAWRBuYQLIqgB+E{aAmT`G==azZ5F-9p$Ax~EVz!r3FjL4h%=5%{bO6q|`BqD4{Y z(p`#cJzT@=mt`CKn3Z0zzFjstgXP-j+kWN|b0E&{^*2o^Zp7;LnHgFpDjSbni!tM* zc5ls_bxlQ-d0;~HqJ7S#*9sStSaiN6DGa!gQ(bb-Fv|S2>^lV_+{Tk#AG05=HNTnO zUmBYW614OPK;FcC7x%j3`^=G1sKEm%ae*=PH@JCE`h%O0sMiMQ-IOdN-p$TGMg||b z6DaQ3xL~?xM<~<0H2T-i|ls7KqWbAwYcC1?6z*`48eG`Fnv=GT-Ur<1?at4l(G@tG zrVaK1N11Td+lq3SjdftzzjR@#PhuNhre%n(NppA{CeB$u-FVy>5>+QBpy~aPCzu zfa3ZJ`#(?WZ4R0GFe=!=VoB?@$z{FnjQ-@bYLm~aMRl#H0xi3P#UYwI4iz_A6KaFo zO2rTPX#nOn?}SIFRNh{SR-POJw5huAC@4?1h;L{tGxdDjqKU8{w1NIs8s@d@2L8EX z!jlW}(~z_5RR>zHtN>*rYA@D@nggPK%B=>0Mo0Av4@XRCfLq%U=)FqL*~5bS-bJkH zQDUy57%wm|2t1m~5cNy?)c5{G?{FEI01dzC#Juo5$oiBN<|Hhz~Vxw{#Y8g z27&JLEbpz;gLavF#;hljuLra)qrXj?#v3FjzYiuK#ENivH!taD=5Y?8>VYF?-1My~ z11}iiVtNNPac*+44y_0*9K=|2;7_pmkas_au{;(sHl%H9)(vB6<7v@~?!9c_gU(P? 
z&3&B>1V|cj+D-=iXLh6^KRy&$=lS-sOgJGn}^p*;^kc+)lz$*U^qh^$K zBAuIwb_%FNAM)~=8FjvmAAo+;`A^}6tYo-Ix@eOARcakDfqKhN>kaSwv65&NriQ8~ zVw>Va(&yk*Y01Rj@1HEJOoei+llz0B_0<@-5L8^|;uCPWP{B12e1VFWiq;K}oK;9x zS1Cey8(*?AyCh~+`%bjIThSnO8+vEK`}*Fv!H(NpuP#$8RYM2XH5=xDyWC8N(Rv45 z$C0ri{c`K{Q=nm@qeXY`9VNzL+JdCCfF(RO<^p7WQYyiQ-{24KlM$|&l zffyjAqQ$=n7OB9CF7!t;$74RWCq5&AXId=2Dd63WmQLlUkHzZTv!2x{qFs4f zvK?=r-{*Za$~yB>5y_^HI~JCRqGs^3Ovj5Z;F_BZ?M!)ub2J!!V4SnmO2HedqKkCrY z@V5#9R6Tg)_LH9A0Ak+CQ3d5}LvzF=BF~el2Q~dbvKe|!hTjqy!YD~pWaEPiG<0WU z{uGfh0NrDK2`b0cqm%>Hzh5#wuvVzp3osrWNl-xRkF~Z2W%=Br1RE@9W?N>)WZmew zHb;`t0txxZd3;yvkC>+gLIj4#zuN&aCeAD(w5bvr%;$L_vlM{D2-Hq?mX=e9c42v@ z^$qwJXNXA@uXRYjsz)sa0Rkyb0-H%WAbW3~%M&ye@z{Jy3L+`GuWbs5ZIBB!)rMbH z!aiSwzo+nfKrHz4_XlBlB7xtZAl^!zq8|^O+*{CE>q+jje64Kg$L`MJH77BgcI{%))DZ1`YJiLK>}gnstfdHiQDL8uPyf(!X_ za6Z{Q3pRBybEwmr+7vZhH)C5rn6BqtoK-nuxwVzTk3 zF!M!8X{=yG!X8|bW}dQssP@>v--;pl5cvtHvJCA14|2;cU;GmiU9chghv$>ny#Q+} zfNj`rC|Qgp!fX2~v+0*r)@|l$I##y$J*bnR679~!P4g$g_T9Vz?SWR`IOLR9!97*c z>0HWFRD^p%qsdax$^=xiBmPsfv-U>@9`e@en6MMQfK`mX5R-gz_i7$9R9tlOW~eo(%|@x15QD%|z*Y2h z*-Eiig;_a`K5)gJqg!zn;~NB^Xb^oXP5ob1T;H_ zmNUI4e(0bB?OO^V;|^J2MrT``c?NXRIAFM5e~8|anyek+IIOVKF*Dyg7jiW5#dd;A z7XPc~9{h;mZ@SkXCRB$Z34uO$J243(FoGa&JMa2yVEUK4$p8(U-)i@MG;jjR$6xl* z!hw|R04v6?ZwAV*k1GEfTlv2|=3U7C@=--=?<^P}f`+5}`wykt;BBACYJ3IBzwzR_ zC40AB6da^K%>?%!m9$CJC5g$Sfo z5`Z5rPywO&80I-)2kD;B?_HW_S**Z7YRwb*k6)#V0AdRC4}qzk<_oo-Hev8;4g9Mk z?r#fKZ4xGM3$As%cvwF^j=}8ebI*TY$ffFM{|>aHV_yx?m;6^5wx6ZB1$M$6=%s(P za$?#sj3eQCe0~T174S7=`1+}z9c+N;80wM**P;j90Pq7z(Qp_Tykr zvlAL>k<1|&!=c|L+`m@*!P9eMlUS%_8~AQ}11!*c4D4;$uii-Ll7~pK9=ZQ}i=?+s z0vZ~z$&-J%p7tMByjy~Sgu?Kb{wB7l4;>9p-#SnVUfgt0~ zSpti(OGDmllz}R@BZKd*SLZh>4FQ;*mfh5@8=nklFbm}r$|y*TVB& ztRHsd5iq*Qinc^{c^$%JSkV+v-)J#@%_MaGO){j4`c`@`oZvqg=Q=G~OuSdyvk z7uVz-H|_;gS4}rFOsO8{)V*g^7biDQUter2*;nV zr?d5`f6NXxJW35R-4ihDO9zhYHPmNXMqv!!1R$`qO5E|dkt7)3zMno$@CKtb;AHFU zjX!0Y;6>s@>B_6%IDL)1KK41~WD*KAm|zxoYJYZd%f^cxIdcs1Sc=0z;j|Cuedc?G 
[base85-encoded binary patch data omitted]

literal 0
HcmV?d00001

diff --git a/docs/source/_static/custom.css b/docs/source/_static/custom.css
new file mode 100644
index 00000000000..85a59fca399
--- /dev/null
+++ b/docs/source/_static/custom.css
@@ -0,0 +1,19 @@
+@import "default.css";
+
+div.admonition-todo {
+border-top: 2px solid red;
+border-bottom: 2px solid red;
+border-left: 2px solid red;
+border-right: 2px solid red;
+background-color: #ff6347
+}
+
+p.admonition-title {
+    display: offline;
+}
+
+/*p.first.admonition-title {
+background-color: #aa6347;
+width: 100%;
+}
+*/
diff --git a/docs/source/_static/fv3_rocoto_view.png b/docs/source/_static/fv3_rocoto_view.png
new file mode 100644
index 0000000000000000000000000000000000000000..02265122fe82aaad00119c2d85b6c082709f3ced
GIT binary patch
literal 161304
[base85-encoded binary patch data omitted]
zJfk!N@v=rIY{)r{Q~u8hN9q5qqyeLV@P8>e%wnB8_ZW*TVe@~gpRIzbAwB>`y9O)t z6Di>P8N7?;(}EDGMVQ}j8jn(6LkbU)9QE5GJhtY5;zOj!3}+>SVgzFwk}Oh$wWvz{ zhYC$mKYPq_NyPF&5#}jKDiIMe5j)6IA{u%l2Kef@4?mdr#Ujb1!V2tnt5M`5Y@D$h zo)~0ib{W4NBftCR<|c#JYxVwQaIV;IxnmuNcTz|#;uZ#3MGYOQ%Dm_`pSGzgery)0r${f#0CBDUob?(!=o*KGnvME z#=_Dd9;<|!sQSrT{@GY8gF(NnBk+5*0W+RkzcWKJ3o1dBB`40U9>$nz3Z^Xkj{n1fDnO_Y2pBNBAj|LwczXClm zw|jVanAXreH-~b1{o*MIp`fY?L1sFY#pCnIt;2sfmP{W_z?YSgk&&G(N7acj%pyXF zNDy8aj6e_8y|}d%is}if#tZYB$+xr?8$ryAZxWRI76dlMs=zn^42XzjVo+OKd0p4f z=IG4*5k^#9+~vKOqrGRx+*H5P+SiBfN_BSM#8dAQqw7Lm<@fI!+@%&(SLYDNWfH(H zSE+Qr-u5-y26<7CnX8L9BN{VtZ%Lbj8rSAAp@RwCzaGAqgFd>)bf~|dm?=0H3f+`W z5+`s>6B^}3?kIpsH-*+1?Vjv2)IFyt^Q<#sgaH-AZG15sezh zWIcUZZRCo^cXGSkm)Pp^N=%GKlth8|*_v+%dbY+53YYdE>o)r|qIXk2aw8eV@wTye zZe<#z-Rfr4NSXuxb)(v+Yoq4EUo5l1 zS5N6Eudz{9l(%x1nRqj+4g)_`GWm3}(mHqXOxCPw)+TqEow>c&pY+E2#~H=eSaF~Z z3rL_?pL7 zIEDjr^ua3;Eh$c=+PxJI@_5?Wp0p2PFte1;b6^ZXgRP@-BSi9kF!5=Oxe~d)e>to6 zDWrbd5mt8)DfgE0Lyu!7A1P=o=?K@wB2PiA3R79Tts|GE=J`?26eZ0dyDKBEJ1CA{ zXV8TY&HyElU-ZLPQpg_YhU*yQ{*9BIjLzmK^v{k{{mM%GTPt-1aXFb^$H?qOL~iSl zc;Klr*v743OH-z#bEw*0)^yq#E7!@9vGMBj^<+y$OJX`w?KgR-r$+szIxU%QcN(jF z@~s1`kqcNW7}dqv?;kq^w#QW;G>4CKVATeGpW~IDaz}Xo`L^joOOwwj<1*!fZM3_5 z?M3r>zn)M$kE$vwH+sG}q$Epw=yvhXB|uo?BXEW5*E`ACp-}~%CSnBPq z7>pizUHAUNyFXiUdHrKn;g^?3%)5mMuGFFZD>?>2^V77tH9E|q#tPFGpFRDY>s9pZ zm@eN`tlG`vlht6pQ5bEbRVZ?uZ89S_8E_KJFh!RR?vw2LNY33F-(lQ50f+&)8{tZ^ z)ip`?c`nIX+d6RuxuogyxOT?M7B}{K4%jIKk1H! 
zpj+>qb#3`HnyaFT1`DsNS_i3abxZneSIo%vyxBKZ&2s-$MSVArze6I+{w8WRiPhX& zblV*?w#oB_*#*KpC4TB)s{U2WsxvaDIK*w|`HZ@Fdh=ES-QLWdyzywLdfDkgg0b19 zT;}qp{zsA4bzAdp5!lyXFh|ozJ*^(vgITcfqwOGkVoRUmbd z$61ldvS^#1YXn`R$DSsP9&=5knRIQ3x91W%uTvyCThu+Be-co+AyVK8j8@kqeaV?n zeCw(X_e+x&(FTUTycNYqS0xVWM!tT(IwN3`GW%Tav0bEX0R5X8_4`0rTVG`Hc=UF^ zzuZBMK-?Cl7Ihp?2=P!{wXCHjJA7{7^}R~BT|YP_w9-_QaGZrtz~9w|bas&iL9RfE zqUs%};`-xEKk|gz!q%+^ihrru$xfYa4k8J9I%>5ab{d5~jl?(kd~$rgI$Z?>1jLg< zEi9-ZlqSY9#nYw7{dAk0pI=N?SunSxhKI*>-#?mkxDO*<2bbpX-h0BRqh^6{!nTU{&ux~Kq3>D`f4PF)dxEif+*&+_YrPeJgQe>2H0 zQCU1#&F|L1290-8MGmf9>ilkfo!CYIYpOPQr~kkofri1ts)DAx8x^<6JqX;1b*W+u zr4}2maT%IMLlD;u$B41S#2q@c`c|KIb6-Vc&Fo+!v|YZiORqtOAhV2`jd@fU<33ij z=Wbr2p5CT?<9#J(!-!O@)22jXoI6$q`aij-Cs&0=qojxk+0UQHfRkxWXRnnMW#Igw zVcVn51nAz5GVxHzM@Hg(!mn|^ho!OM;bwRQ$jzrnJQ{n|L4->QF+$GaY<>OW#Y#<~ zd|GBkMoOwW8A)se9}y9+A2~hy9laLbUjAK-C+KStRc#rWsTTc-sy3UMiLGSG^vR2R zFcS%#qwnU!bXehYxXdURQX`F~3U#B*w^R_3HH{2POpEd%Epg)Hkk_w`tXWzTGG&Yo&7g8jt`xm<5$&h#Wn<+Q#~d6~*Q0J;JQM92W#yENS2|Me_B6BO z49hl~)LvW@ZR6{FMfeGfd`Lp71gGZDs`I;STrwkYknFcpmaJZdU9#?|SPw6Jf4~nK zS2tf-RTb#Fb)`8hJm8bhw$B$LU71>h`Lq^XlU0K6-0$7AZ6GSh0VbU5MJ=mRdr^ zm_ak2cUhE-nzZhw^vgk;V6V&%0wEj+vh#K{B;A(o{qE3PflSv3m})%tDZ2qij%uVO zw2TbP0%N`HRRiB)|2xz8PPeFY%F4>BA?bBG2vIh}Z}x^zygRC@ssc>ugYR-Tf^`$j zv2p4g_{_~XAU4AJ*oOvESn5*-FT*EAadD*F%MP!$>={Asj4wpt*>SOFd&k(^>wl!O z-}$BBjfL`dx)lZE#aiecF`9gKonc)x{u!&W&onYgaXCDko4Mcuodp;BHD-rI;js0d zD?D;o#muHFZZXAc&osiT2vrctiZ=R7>@H65aIa4h?mTD`A(RZ)F*L~>e`}4q8$+(Y z=W*HYR)h&$<)PgBHi1=G4e)8Fl>m1i?)>jkK0?pd5^FAI~^N}rf+w?c#EgFJ2;qAqfr@SHDv0| z@_afo^j_)`aHK^|MFTT;s*R--yBd|zS|xj{mg;+2v`%+^70w!c)NV2_-XqaTz=?9; z@*F<8mJBbUE&gC5#oc;(9-di+g6Mp2coB;l6wUNG0PG*edd?A0U~aAaS${IyJA0!P ztJ?o{jgv<=pyQ^Ih!X#2S5igZcrOwKAqv;siG==X{d0MPOZPoqJ+<{NW9gLb1NhD6 zW1woseSG`Hy|wv*Y_^g-2o_Z5k8I~U$6*Zwv3T7=?`|A(&V z=iTX*SmMjnCT?qj_JF0Q1J9+Pl6+t7*rKEE&sJxSnzcauOeSx^#JC`CfZYd6LMjrZ5$9>oL4^o@Uv2hZLU0t5>wC49ox2l!sJy_!A zENt%rI`(hJs_Kb~%-o$K6D@Lhdi`iTGM-xpTAK~0r|0=OF6)Wo{)^*%ynFz_Gr@9p 
zbEZWN=d5f)?D7_rp1zmp57K6m^=kUwWzb#O<@!F-6%)YM%-ym%*s7xSxU{00Opxdh zY9_F<`5}kEZ4{Ypk>qe?EKFrGt73$K%kyw656(fUYwh;%P%fcQA5is9e_52#N`Fo) z`Y#f`86c*n=K2k)tgOr=5J0EhW`$a;TCHQ0#!exZ#cj6@gmgJyI+w}i(s0wG5RrWgS4zG@Du=F+p&CM7z7>w z<4y@?)(VEt)!x&2uo?E_B96EGI%cYtX12Il{SXDS0u76>Rd<>jP`h5Z!23J46BT63 z*jyQL)N?Q@lGjo6H&Vw%Lq40wy*k;(q*IpH$uga7g?&IHqKF!adBd!gS|3 zl7w0AoqqX3xK!bB18Qg)l%N9B?w4@vKX4iqQc%Gm^n)l?$?}W^@gFSTP|Txgf+`70 zbi%o-Fp|;`M^x~ew6YMJ94R?$02Jv2E6|5Q2`$+!p`iVpRgWef(UHuAH-?G5}0oGXYDog>WVFis!eWbez5_sHn zJo(oMqA+-5(hmkmJ*f=8s637Y3$yaO5SEGs4v9Yn&GSLZPiW(Yi&P$AeV8sSU zn!(v{<~p{bz`WT$r}L%#KQzx9)CHWC<>lqmpK$5LJp^(a)}1Il1&T_iCsFIk*GNhF z`wSMfX-M_cp=sv=kKwk zUDXs8%$d{@=%t0_7`I+`cUE53`b`1zauE=u?EP)zIeHys>>Kv2(5R;$FccGh#cwt? zZ(OmOI523aq|`s|%;(_JOOq5E8-_G8XxI zk6c6W&zu%8yx9#z)yf=P*#RlRwpbKD@D`mU?@f=5gpi*rjMGiqL~BDcI;Zc71oYAm zhBV0k*@y3~n1;SbN@2z-R)s!SI9eIPM^OYf)`tE=ZbWD@b~3DDnsFdQZq&6*104Ccz+$ap)mI;?}e`=|*4@1vW&lqF#6vB%*-t&~7M}T3ub8 zoE)CWL}$HnFdVK`bleJ(9P)x4&Q1j5VOn?_Wy3r3*;sdW#~OSweKL0ZlwXzUiBS#AKpUFoT*+Szy{^mPytY<~BpE{btnz&8l9TibY;d6|>#Rd4=1pWM{9qvRNi9p|xw9+%*8{=Y>ROaO4QA@Jw5$)bP-G611 z*%0Qll80hpZFIpT6mLBPk!H`s6g_u=iFd;W=Bpt&sx;ipr{bccJwt_svjs%3pQ))m z;b{do>!|Zd5qQLbcPb0eUh{n+_Uyx^it4?S?Q_#(b5nZP{i-t?xnVW+&0Idl z`n+O(dPYZ)|IeQ15Al;PON?I6i4}Z^H56nsC&ak6wy==E09Y3c^1#aJI$ef?zc`e3 zTrqx9;?Q&x;dr@FnV12$-06c9c9%K{bw*1Pwb5V~rj57Uv@tJnetuW8cXgGMgoLEtDK0*~{m%*T7N)JeTZhIi&#ktn z`B~bo*0>o5ifp-=$IA;+Zg}z4?qz&!EKz`E%?GMW9UJW5y7xn-$b_{MIvUb!|EFfb z%gH*zV)X%}KhXk-w+PqAPK@ZT5uaS_IH4{F=5H0~ZMncPSSW3Vi_5%^X<^{b@v9uF zH7o%WCKVedU>vC&teD}u}hXTEC81RzBj731B8xt66Z{w#buq|zmYB^p}Fa!Tn&G-i>Sd!Ax(zc>$ zq(eVvdJ1$FCDC;mL1wgutfbKX4DZ61d^}pQ#;DSwY>*M67foW-EV2pr6sA!$rX^vT zFWXrImnJ_Urnx+#98~2i`Mu^fbMt&2mM~-rPjl~x!m4n0Ri)w6HZWRyB`T%49fp%B zAZt&GXPtkj{AtKU#+ERJBz~T1{a^%*5XX)?Ns-12^*^;1Pnxcr0Yl*<)VQXi;$csA z1H~o&0F_$#`LIR7IEbAA_v(iTt{uUsOakYb`VAcg|Sd?jdbwRj|Lvw z=-)vzXV1%&np9@1SBB-e;^q z|Dba`x2n0~b$eg@xOQAoAcv2ISDRL%-n zN9TFNRm8fUE^{KKP7n&8c&6#ZuzJYvhs|U-aGOfc*t3E%4!s8tcP;tN=N`G2CN)VT 
zxsJ$tcrh57r&iRn;Pl0gak+R9QDSv3L06z@TUKLv>W;Cq>I9wu;YMIHoQ;D+mpEg^Cg}UYjEI2bQe0Ncr7G~6moemCLkBWj;YhTM zh9jsG{*5A23?5F(e)9?`sZhBwD)M}Sh%)n*nG--^NN zZV#Ow$+2&ZP<3X(sRtEc1<+noNNHOVlijPzdE!8M7TrAe3=xEt1 zJS$qsw8-+jrc%vr*FSHDFV#6v!OL#cy6gv~1w6N``_b>H_&#T-X=?JhUK3JMmby93 zy**t$U2dd93gzYHO=j`5Tdy@$yWC&`rCh?eKqo_=>@o6@a{Akm3`CM5_-aEp41Rwu z92bo%WtA>cGR4rra*a0-FNe2voKi))XB27K|uv=d-q^E(m#Ps@CD={0x7xL$Wk_~?0NIj(p##bmmj~*T z!7->ohpf+C9i6zB*u<^Tf(?PMnwijyEHNMP_YGle3$?-|uE0>*Rh0Cjt$CYAXFN#L zWp{eT4Yl3uiGmf&Y-WZQA|`kCF_M+pSbl(o|JbA(OvHs(ofufx!a^5m#91GM`YL;l z?S{|rC8klYjSnv(UuO~9aEocJSx`eWq*8g(9nbgi z>!Wc$omuY08%(du=gC250TM#7Mb+9Qbsr;$PavaLW8e}mwGSOI0*MkoMLLTAR z=#EC=m+F1eHMr&wyag#ZA(m7;EKQHQpIc&_+x>!2!{uupr; z{}LYWE}^2+1HsL$=HDu~%*(f7@CKx#(!>Jb*d9is<;f`N&MipJd_Y)di8 z?4Y(p5dP%) z8n3uaa~(kdZ6Vnim13BYVO@0f^cm(=_+!p0IjUmQZQ0m%ttsSmVvjDlWCvaGF6- z=`wZ<7yiIr*C2)GcrA_Uj$cN**WG)Yo~{}G&J##)_%p!9QW!l$5Qz1oDD*$j5m$XV z8hI&a(rbBoG5yo`wg@@B$Ym9!U|+X<*)uvdm-u9=tt@~S+@_py0ShaAon->xzfaFa z5xl+Jw;Ek55?|rd5xxB8`RYCTK0RI$ zve+>r@9pZ66_b+{6qqXAGI%@z_)*?fGr@3n7tX@Uh$}=C>RTEg7y>nuvQrSnlocae z-S!or=jQM{W_sisYA)g&_E3v(F`D;^$4(v^Bd=u&Xt++CG|cT~1+|cMi|;a^qTYsN zzwVqZI6kd`Zph@*!U)=zM4S#X*zD~o$@rVTT2hz3*1Is036@)S5=Yj%YonaVIBuYz z$PUP=Q5awR$BOeu@4=Cq`&SKIJG-fia@xY;fkEh*a|74rf2XA2 zN^-fbuB5l1;nIu)7gbcW_J(#Q8K$I>l=j4WVqyezpi8=CL^O*Si=;A85>Zkz^3ql1 z0B`j-ODGFyt*_md?d~VrLrGM}>kJeh9o{`(t7y1?4X1tBW(+=^2*<`2o<1m-x;|eY zl8k>9ba7InWf?(XaSV`gT&MIGo1o42wJQpF8PBXG+uMhph;VRRYehc(Y)OL@mDCdB$`V_}J0OM|nvN>npLhuzgg3QnGPfoJ1>{HdjhROYB=PzU#VQ6%=&2^oLZu8{j?XJgYI(zI_rT)e^+Fv(#dtdUvT-*%=OZ%)j4 zruCK|90&~jEIil|D*l@hv2>6)@EbVWOc@Mo{~#k)F)=FmBYayJ>-%3-i z-u$s5V@o-MReKzaSL%l>-nM2+iU(s2@6CMN#d&kyoF?o|P3RW5jEI6+hjSb-V1eoB z>FYIHz($)>_}D>}dObNUZKSlTZWqlQYCU7}RtVSj(mFkbZ_-*X^HbED~{U6TA_~w3d;r;CbIZ(4&tPBJqpe@-+-k>uiejLJ)0m*l%3- zyVTaXoK{Ju463bO=I$e9aE#FlBz%>o|W(Cf=M7 zr4yovyz6Q^YaU}Q5Q5*85_5`(ltib1b8IGIzrU+y5?yTD#pAlg4lQ zD1US!;_l?bh}5!4m88HCVtO^|>ojIZ2KS0zu@)UiLa z3G#I^B}aRcd^pvaFK3EvOt!Q{z7ksq$vAT0M0Yfi{jHdD@#5wbQNsbyt}wYwofSJe 
z+>GpLmKa%xRJ?289*WlNaG-hU|3n-wvCK>hC~3FH%6p2fQ+-0w^%_r_@WOo{t&fS= z-sc5mr&FIWRG1r=C1odV@FG}07eL;!2FiuWn1-m+QbC88X1cQ@RmZ$wnSL%i74FAx z;=wrLV^CG&^0Oay;aKB1s+Oxsk6d5FmHGSOtxY?k>kr$2M>oFK(z`UWY9o`BOF`923Aj>TUX+*G zKp`^H$g^fk6}kBrqtU1UkMc7>2_5wpri;bewc&EKR=}0~HGTZN-o@;Vq@p^s3i{xV z922%18s&(@CeOItL0kzln}&yixQI%k&Om7e#?9;jS`p)X;!h=Rg)H}<`+*BJNZ?l7 zVf-Qaxw*CcxJ)}eAyb2gU6tiyQ2=|pGgMovqJ+imo6!+OwdFM9 zu>6bN_KsE~wd)n_@F1>`6U(=(0TiORc(738Dyhged%Mz zsD!pwf(f>|D^XH_s@`0ryi{-&_$Dm`6zEw`&#Ne}PN$WAJ!aLPRpb~6v??(7pzF+6 zgrV_@t?Ju0hNh_WCC9f1WwP|om@nF>*c8%gSjwKHpKae*I%NZ}(g>NXQ~3)|Tnu$P z!n`!2+VSb8IvNi8ORGOs0~t7K#8lJmgA;bJj08WjV~TxZ z^{eqvgmb!%kfQT2=>P!x;^Yi82)5c1qs_-r5)U^z=E-h*u{o{Hh0Xb-tbmx!406*t zQ+L_uyu!la9&Y+ah$QECKOHo5XOyOTxDNr%lxv3}h;s!7-B;zX^WaLO5a<1juVC)} zk#`CGB7}mbavE}z2f6oP7L6l3%$wt1?{ue}wm102@sLQ9;abYZ12kz6umcsX3UppQy=Y_>UG_vSxbqP`mEAp zGATh1OZtcm7AWFiG-Z}(E4!&UUZ5sIXqGxahe<9wNXzP^wi#YMaY@QV$}lO_mP-fC zDrxsvBeS<9i&}?PURszMLUKCr2@q~Z(;EkE=8HvlJpr>Ldb{-&@^DoZ8|sT(6#}B_ z-q#g52obNk=jZ1)HZmhTc|3<@DOyMHkp+!Ue#C!cx6kr4N=~oS5-dHUxDQQ7M+Z1e%{#}J*xTW zk2Z4L>6pV}c1Q80Z&A8-Ir4A+(mDe8m2J>dYx(GO1xEnqobTVQSv?Ke?*}^+`E_Jj z2~6AOCjPz9vy4S0&=i7Y0oGfcYIp2Y2^>m3)>x6lGHbFn#jM8XqVsK{yz};vJ|^q} z%`rdaTQ<0hiot$2xYlxMvRprn^q~Nrb{jzltoeLiiD0i#8?lis2fs``da+p^EW;$f z>29E8h&$68ECGAVBg*fk_T0Jofa7DF$k}Vxm3@YeZGK09JOzsI^uwS61;wSv2Re?@s`r5WcvY-pmKhrlc)Wie z1%RRuGR%ug-$X)lsVL2_&s4B4`{bIxZ+6-z;5!FRWpOj7*NoDz(a>mF>73by5|uZl z;c2#KTJp`1e%+E3>gzLOksRUqL2U$8{8@Bd&Fvz3bd>II z7#6|;8*2Gu)^KBlac2vnn<5a0Wtx~0Qvj; z>Ph?PUWhAy7=4zBV3$wbc4)D0zz=IR*o&Yb_0Qq;gW8%opCy174L=dq30+oc4>O8# zIgip zBf? 
zRZlTd8g4(AY&DiqbR4OmDW3NEYK$x(^GnviYQpwVN@9ix4=`JG>7XxW{<%N(d{~2e z_m3a;<$!5g8QHnxU^q)wLz=EG8P#=&4QE4~LGey_Ls(d_Cg#eKmWDAaI{^yRU}km&467%5k;f zdaSGhS@zhUb3X?gLI70Jor8tu{)~~{1PbPuW78l~azyC1TID7AEwAE!F`>4tH|R!L zX>ZF6HoDt#H?1faz^9mFbBDqjn#sYytDlkFSIk`TylyYH8i4}K%&mN$o!m;lX4 zAhR^^a9+wsD^1X^vQ;-cUWer^e_Z>@HOtxCRNgWIk8Tha84S32ir&LbN0o{GV2$6Y z9{^29%xpWHU0&0#+tRA%KKQhvyFi(V-~Ns1(|wRSyL^H81P7$J_Xd2{dp(Ut&ah@_ zVLX&+zC|YvGg3f0E<%T%fR|>l%$wsJvNN`-^xUXw1kh@D`i@e{KNqEZU<3e9F9lUC zsGA8A3JOz0L&J4)?Y2L@wh;mX0%7ip+&Z)`_%C!_2|9`p5W$P%m%mD@$7PNni;e;q z7VaOE)VCTe;diO^r*i~)dU}lZ&vyEjSRd3{yvmrHs`MlV{dqK>NRf11D;8hx&0M=x1lALvu~??crn*OaVblVG;; zXB@C8f9#jYRtt`g!^qeifQ~u1{$%0MQF6B#P$t}~x46iDAKK*ci5U|l?t-kqxP-Gy z@^2D&Bm!iEI?m}Gx5%w=!|1{h?-gxXoJ*ogrkV5JJ#c%ku`7cl@67k(ZglLy7f(G8 zFn3g@(aPJX&qlH0?2f>vOo0Rr;<8?F2^qqP2^kO<%t~3JXOgQ>6*5xLeNC)|G2~uT z38T4BX>hE7VdKK3&gg4CX#+jhe?E0`vCM^9=2i^N>Qb=pzWSBtgJMRhcj~j7T*H)7 zb2uAKJ$S@*`fPzZlL#h9llK8zFnvQeYTMi|T%p3EjOB8`1*iJKW8daT69c75_EwZD z^_THyXGD7O1rZLQ0sg1vgLeS4r0z8Y>I-k+Q@&^pX9 zu7vwnI_TFmKn(jGOXqkQb=E^88cUUYu!!ndn0)!C8Q!nnP5beA`@UOG9yL_1<>WZg zM3;3F#S>!wm3*5%J%|uHMuuP#Y(87g`<2@~O|5jSjBkgR-G}e>nU8A-AT?tc9)p!y zBJ-EYo$+8nP5lSb`0(|V6LMj%OR+vZ6ky!^)3-XUHo1vR98&nwl5O5*Ya<@Qx-B;$ ziDzPa;C8)?0GXv7-SgJ%i1e;NwvQxy-$;P|0RBqXr;pKhQoor=oJYMsv2>1qD_8EhP!Lh$Xc*49=$RvEzKr-=2H_I`$NsXRn7 zN&C0xf6Cn7{W@6Fw*g<@C$C3FM&y5@&H%-a1o(!9LA$xxZv{O1YUv{VPw5VLhA!s= z>Wht$@q6vg-AkUYpC2%0O-+rs+|W_+1>wI<2mifVxc%=S#XF#;=C_DQpk!efI4T;N z{@=bGGqaF!hiRLCdxidU2K2kl!P)npCZ_-Cd%8(|`2*jn`v%+jT`$x<0PqM^=iVW5 zA^MP!herU|ETRm-*KTL=wY+0QB@6P+mYUnG-Y1R*j5`H%nqRCvJIF@*!ru;r?@K~N(y zMel2gNHAF`U5NARMHU-u!7LH@s8jpbQKx9A*LU0c$%`H_GR0!b4TnVi^7uJGr4R1;VxQ zW7k=EFk$l@(0~^f-5LikoZrVLvR)pTi@1CkB9eq<aSs?Hs)Llj$U&1^6ZpC6Yxc~w zWXh?nWD6T@VO1}5Xd-W2Ah?0l*)D%qcbAl57dD$F%3&lSu-DF#JRAu4*?-OXXfe%3 zN$Xj_-lkh=-vE=(W2r^g>BTtY@1#GTudq0F-0e#|5>wARDs#!$=NSxav!p)RF`7BT zLIo~5p3h!*JXYKvFV5AsOWp~(12Lc~71OB~WGBCF_snUqHtS-*k)f=1>s{}H;TJ+J z7vYJ@FksLq{j@3LLJ}1`Y+oc`!Gp1Ep0O>MOQ6gYK_5IQQ;)1QCzcBJ2DV(1 
z39n(<^v#HyJ;q|DfV&bHB(i!SQZ)UOIN^CssIq9atVO@`!=j88`(jR8V0tzr>9?75 zwn+L@$n~5Z2@@QUcJSw!ALEWK-$dGDUq=u4H84|lvkgpD2og&nS1DD_(7kSGLE_#p zm~5km^j9PzYyF4MhesMp>6LG+Ve(C9<8b^@&)ztd0v%kL(3kod;40kG#LP_^V?{sB z*9ifGnT+XCz^odpmwG;b#&E&{N3xfoh+M1h}#?8L1&;7@95G9W$ zA744x5QcUwvzgHv8(5Ul4veWGLi;XdeTe54$;+d5Fq@ho9^`H>)%KkU2a8XD^>AVcY*^q2JP0uD@ciOov~5O%@KDC%e(fA2JltoqZbk~m=2HJO3Jj)uc>3D&)KXiv zNohF3QkD2|CruI)tQuRy8BZiNxaL5mzEahz|tKv zM1rVHZWhw%q&L$+_%7Uz(ZQp&FxSbf0xu*YrKs*ZKC81INnIw#=1| z>}}}xbFBX01}ByVZnHuum8oumtdz1olp5!sI6yr7PdxBMg2Z#D?~lMwl{0prFuYdAzs#6*v4l?Qz?oY&6Apk@&+JlQqyh z)*h@1#GNVzIBqbg&GgB(WaW`nK9NKOcosinRQ2S{ppxt+Ooeb4PD*(?3;8+z^%JuV*9H3)>DGYbdyG8DN_)^(g2`s()^e&&+CX&ut=oWcBrFFUxU1{q zbRReGZw7oOqp(bQcUd|hlS6d0!i3diYL~#i-tP$S=G5o$8||S2lVd0frK;di=%Zu( z&AFI--7_j(0?sLwO>QHQk|o{Och5uFxH!qdHw?P`7lqB$Rx|?~*_eI)&+vOY{UO@^0DnMZ|aGu+rK{+qRjK~FmYf%Id@K=^^6H(xCE zf|Zdu-7==ytH<^YtO6kzM|&fL=tfFQlwB%o%}v&Fc9ZGWL6}C)g2!%`FO^S`lHQ@S zvZR?BOVzDxn@xz?AH&o6wQKRlzvjnn?mE#R>0jdXJw*fo4EyKJS*sQ+DJ_3q^IT`V z)rF2^e1)PjL(|wsM}2Ob1&d6~e)*Ig_29iZj@HnR@>&X2=3`jDLh#`$EpA_VV~Vu@ zvc&ObAz}e*$G3qWEJK!+z0Px$uy=Bqy_y!8G3A#F#iH1BlfSQ7jBCN)5xFVp4Z6-m z2XmD5USL@+5AP?_)r|eRHqYy%KE#p@1MO&W2{k!gRF{*;bLNN^v@ML&H$Rrs;EKLk zk_(Qjg^cI%>_L8skr|>s&ZCH{%U6ARr9kT6(WA7KNn<&6IL4O+Lm$vzhD~E!qO_a| zq8835EPC2_V!pK9N;2ejZV{N>seusP{0x$j-`Tl&^yW)7{=##H*urX6Z{?10=MEOXvI<8b+2&wTFNp|a$ybLl_yG!AfP#Bvd* zpYXPSlI^I`C;ffSV$gE$QJQB=07Dy!8uM~39CZs+`ZVi7rXsY#73y8$-#vFIYS5Wn zoIo+~am;>(->K=L(D0S_Q?}LdAyOrnE-a;FO~cuX_dW0DlqYVM^mm<7BrnHB2lbbG z+*a$hNUZqP(vpPkZa8Fxws6uVBm&G%GfD=P?CwRt=qLUpGThp4&4=e1FE23{FOg7O z!k5n#=*li_u2QM_sJr)dNAIN{lcKbiF07%f>uV}`&K`>!Y(NTv+~q+c?3ob=ACv z4ss;7(2C-DOV&W)>$f-@2yk8aya1pv%LGoNzS6zT@#=3pj#0X5ozr%t-e}4x^c`lq zsc9_%r9eyWb!98M{ux`a(mQH*cAd+;z^KnrCNH;s*{Dk_f;vmWC)0IFbh=QrT3J5xK~_WMKpC{us8f%V7d6b;q=o^|K;}{k2Y(V=xoyHuyE{Y zwzFOJxQ%?h9&@+)cI_*h%*R+7Q@iGZo!rKPV%^B#W2)e}uhbbY$Bb?OjPa=&)laoup&iX2-T|+fK)}-LY-kwrxAP zbZ%%)lp;wZDG@C-H#j>} zOd@Y?=B-@kDVwgq4bK<&F_fVh3vZpokgjZZkwF5K!Rtaw_X50^8v&At8q2}2ZK5`h 
zu^Z4H@DLPapo#zORfwhv5dEJZK5ORGL9y00m~dtuA$Z<>y4KwjpsD!!Wqa2 z?25VoWnaw% z6=vnl^;nfr+e>oK#5$4>mnbkzDI2-0?8mSl92bbCv7ZZuX{nSJr_ZP31zv^avYr~_ z7IMS${`&SQ&b(fS=T(-HHD%o5p$9>0^uqSjgxQBR&aWOQj$xiq{ARI~!Yq!cswg9n zYa4{{w$t;=C}l~fi3LkIGPzb;H4l> z{hfv5dhs`GRbD2YBCW9~EJNtR)rcJ@%RgbRsECTStGN?*3ccT?zqfby_2KN`;K1G8 zotc@ru&{6f1Q!?AGyx4A-NM46xS$}XrU+OE&gfylVlt=Ro~o0RXi12(P3`7rK?}k; z4TPQ)UTRmuKG4A5AJ3VkXhD~Z`*>fNI@h67DvU$`d|(O~e|4FJ>+SE&kmvuDS?R_Y z%PlU+N-93*&SA_+dpPas@PVk4Ab7~w7kwfc1y?xd8@1n4mb|^kAswJeEX0n`+Ag+# z^C7iuyYJJ5!!@1!?JB-pcqRo4c|`v;oW3SA28Ut&5mnQ46?m|)1X!$Gp$WQn%g8Qvb&o=A zkT7y=jl?K>u>SxbxjYG;T?c!UDMR?aem6Ae4NWVUAFiYr6krb{KKi2GN=T%DD4$z; zytfpdy7Jy4ZP#Rt&a5Y@CpgWM87i?IYIpp^4CcO8Cf2!$OgI1VkdVTlxH7nE9_zAJ z1mT@oo0XV#<>{P)jE$v8+EpPLWEL_kFIYYc}YYbYZ34j)z?t)kVXNw?5*!PIMUY4@mp!NqxvX9 z7+ZA6=j+T$9-=k!8oSrKT6%kWjiS^US@dNZ`jf|rxx}q;S=xH*1%vp4%{e5xgU+Qt zS7qo;Y&bAKzIX!-GjtVV)0mp`0%H+G}>iG-mj+yr4Mv<68sI3=Oe+t<(~>9SX{82X+n6!l>AlXh+Z}Y_i6ga z9|{FX0-O4@p#qa_ zje=1M-5!*_b*#iCoyiqN&$>6)zmMFfgr@aZyJhKz~G z-#F!xj`r}2$hU;edm*brAt)YDc8pc|!pL-w;KLxTyMFfc(N@G(BF%6us2PD-N2h&azoOI*{mN>z3_mHUg}lH(4t;~_+N>dj1b z!Ns@Y{7?U|HL3RzmujF9d?Mx58uoKi_Et%j0(eTRGKQZ!#<~~1iwN<5aU;o_>|IQc zAA<~W1cQH11{4-9SY7*V@S6UmY&`#vZ@@;X?U9lGl{@76b*DiRj~M7!ppDYKXtTHP z-CLG7G`_w4+YlwKsP6F1B7RPSf6j4DRRMD&s}s^2T^okNoh02`!=!+a3o+O1^nf^} zTK(HrPDd`}*YVs$LPQsgDR3%35)u)TJ*%^v(9ixp#&Yps)454(`~Ffj-Sf;P)(q?m z#1<1EBK}&^J z=QMVU_MqCZBEL$UPra6`JQm$sLkk#2M(pHNqP7oQ`w}!Jo(|L%CfB8m+-B_E{p=&B z__AWNG&4%&DBPA z^D}yf)V6Rq8dmHYU9iq!p^pw273RTfkdlzm|a!W-L zBw#R8A{4?ToKzW^Y`N9Z)KxXNOYO4+t`;s%gDwGhS}Bg)lgOLgjSlXW^3}KD-eU z9UUtD<5{RZ%m;RTD5_h5GfNKIV>3pmq0smPXR8cE8?oMp8REi+Zj z)4hm2=~JgTc*#BcQ@FD;)0MYA`)1CW?B17 z&g%HqX*rlXrU0d%p0Uj1Lv9>%B?+tV{nhC*FNU~jWqgE(-XqFaYg0T&_UWG~zfiy^ zm$Eh@({vW-W$M;a-YWH`+rxl$Pbux+Sx{< zv@aXgm(gR>##6ek!dh=BIT0{*Lp@TJi?@J!yL9G&y!;8LEtwmlN+-9=1s;oFRq%_L zNuo+3#H2`yx3{~bYl5kqalpGf;u&!3&Rc!`M<~TLr z153kMzaL}kBt(86P-G_FJs56?;%AC~`Pp^W4#(~FqLyzffb(y=qmyTRVxq6FPeNQ= 
zQc@C#P=JPkNe2rK4%Vbe?%Tn@#3Uvm$;14oIkMYfS|0v9(Ph<^ko(uTyrf%xnvRiC zIayO5;Aa{>%q=0UE#M5So%w!ty7OcT(*A?0`j*#Gt@L2oOl!?WUenYxldO3F|6l88 zw@XG4jHN+PNhvWS11CrT5st466@r3-V*h~Hc-Zgo0;0*sk)-!9^E z7|ype?f|=;s6xTt-vOB+tNDxFK&-RCf*G8Vn&qxljS2efaz(H zgUM4{8+3e#y`%T_;=sVRlltz#hlQO}ml@^P@L$tir0+R*YblP0_`qZZaziC`gp69Ew3_MJmQ&NUsi%-S2u3Ydr0_4Aty_4@wN&kv zvPsorz5^Vwd4`&=-y_Jq$9#C9Erw!<&+>M$jxFL}#$1-QZ*Yf}PivuGy2KwBbd)Qp z6B2EGBJ&hjRhE&n&$ld3k5MgK3-cpyimaT&FSHE!SDxgl&$ux zHQDHF%c9(uDqL;(WNK{n9~Bf*cs#2gXn9TgQDhEEx-`4|WJ-rIZ&A$0!%I7;+C;ep zRBiv>uee^DIuSv;PY8*=tCq3GXi%a>)>&eWDKmGKOc`6xy&)Ol^z9|?>}VVIka=VLk3AGUkP z>Oh=|@8NI2b-8tk^hh?GE8S!FAiYUFs8#7(2ZjtCovo%Ua=9h+6I<#76dSlRC2#SB ziRYPbw^%5i-PA*b+q)Gw;GUSfaP3*T&}_>LBi-*6XkC-!lo5O{_Y-;Sy1;p9`x=&` z;EUd$3%Iy$XLQr=Gnw>cr!F@t65OPMD~TW;l&C!7c?q^Rj?m|d}u zL#oENtk#crxV3WDl{?v(^ChPb0V!+`kGMSZEgnzO6coW=RHs=T#%oX5y!tUD8X}YP zM*TV7lN(kj6yc0vHxI9*x3S z`Z_o`n3jfRnzQMHg#Y~EA}eidjXZAwEC4obNt5u~AvTU>N#7hg`FNdVKA6bzwcv-QJXl@qxo&OC4aRY0}P+u4=1yfG-> z`S*8Y6a&S?PKH?}Vj_H)0RFB@>^LXtZ-k7xeLyhJ*<}##uy5qm^IWA4tOC88Sm};T z*N2cZ4VY{Op2z7onyw^f!wv5)>jKTSb>o{~Pj@Dt(BE6enV(Wv(e8K;fAO%zmN7eB zKIOM9W(AQi?Y2(Br|5;6{2gb*M~aiYn)m9xt)A$})^au7X554s1qyiIs21GAZ3uq} z`7Blj^;zby-AtayD=ihaEpj0J4n4(&2XEQNUHNGz+J>CAofftpMinQIbf+wS- z?BkP`j%sYQwN!osf&@VQo@I|E=VtOZ;+`(UevK=JEhpC$6%{>tk@p?%8|KHm9hsA? 
z1W*tuD+(&ayUqY)&1xHsM_G~kIY}F=-hG~p_Fk=mqN_-{h&fd_#fA9?KbC~YmaL%R z;Xga8-lOJ(s>+2fbPv+V`n?qM7IQ%AyH+pwleqPY-paPXYzkEbrN3;HEx&`=mRX#U zH%`K@kCioUb?+~*HqydNC7?DVu`b?O*xEV0R&*>m;NrOdu=l3Evf?j^t*J>(6@Q)B z-~sb-|CadDP~AMZ`^+d>g_hhyH5GuciN)*#$^zO{f|t}6uY_Ir;k-YcIMYJwjj)aZ z?^_Y6W-yL#L5sNdceBBiqDcWCOc;tlD0XN{)P7+7D8;(}mfISif%!0JT9Fc03}gsEmxB$(^O zOSrEX=|K0Tf$r~u_-}n!lLDW8KswdpebbUNbp0V$%``XVZ66-d0H#*jP$}T%+7p@v?Y1d&jH3u8dfBd#!NmnqH1Mj zrGBV!4xF)!Ov^Kpdx`dAN{{r^{IiU6^myD}EqyX-CMPmWa@Z&@mS5glgGr2yjXOF- zy;$e{BmqBq4yV+LScUS@~#^-6I)bQLB&1^d`^xvB0t8~dx|uc zgsnfR)ma()rS&_0-2+4wtu?8hC#OH z)HGskjiJ#gJ{-~NRHi%OK9#_=+=60vu1RX!uLsdZbhNoJyM8U7pHWW5Df|k!KCVIG z7jDosr1@J#Yx1|c`7Gqnp6IJ1N)Y-H7U(dTh z;3ZzlYM=6?&Q4x8InDz{@nZym%-`qT9L(j3R7R?|=~-OXmWN2bzGY3B?=q(CFWWp? zYpR%=I)+N95eE`qVKz%;byf_|`Q7ih!EYReVI(pG9$+Ym2d9+aEOvVn#ZxKj zuVD<+&$G%w4V($S*auRym05ddE2c;~6>u3$cwNK&0r;VD&mJ;ZTBNDuyWECpB8yQ) ztF9d_BO^g*S>J zauQI-vqPqLK2NPTxt}DX8lSbr;B`tt+a8g9CZAEEU6-mD8pq!7a}Zgfpg7S`8$s`o zq$s78?({kzRp`tat6J+<+Js|m%J5bttq)2O2^gelm>GrOOO-=uy%V!{K)~yFH66wW zdz8{ZTHwl3?DMpJ+R^d=vBqSQa>5F$#NqJjs)trDKDFKp=EOEN#ZoK1W7IiNPAueR zyPFEI$D3$flAY2v9t|@#hIlCXor-D%j(8-9bO#dDbIIu$7hgkRTCS3#vc-n(PFoVT z{HNXTuD~u!JV!Z6J^lSyUDdEO2coqP%-98FVfx3>0XHnWSJ&>IGg2*FJBj2)MPJ!I zXi*qWviQOImIGHL7##1Ab-rkXZ234OpM>A-FYT@`-}hmn^geC4Or`+=!Cb#+rJV{- z1cZkO0}sI}ZZ9itgFYch^_?}s)FP22(2tYMa1|R;-SlhqhKY(3BeeuIOyW^b0XB>0?TDSkWRzoXnfJGW=;(13l(Ha5vLwiFFtTny7`mZU=b_cMEP z8RLC_{1RbicrENv2i^D*mJSYd_nY5-(^nY()-L@jp^=dj2V==6j_yuQ4dvxj0X?Sh zKxe&diK1y68X8(sQj&Qx63zJ!s-~c3xJc51IAh zFB3~ZZ$$~Ji%DV4vSTknL6k}#=~6+@x{jG*ZZ;HE(X=y2G?o?h&Z(9!EO=5((eGUGIT z*;GsY2i0Xsr&=ns!kWsoq2gInNsbx!_f-*!coGF26^$_)U(}ew;c2LlwW8otEWeTC z2J?!AIXI829|1#H&s=!kLgY*k;WP{(atcJzLl!?2GiZj2zDH4bi(&gwIqdrnm@P#W z8z%o)`@6Tu!|OW!;}run$|E*7F{2?8lkyexIFiFRX{ike0ds_?&B&}M)4eyf1=%(QJZ%+;K(lm~zKe=67CQO}o7ps7Z8Y&;DjBL;9!I{_&?O ztBBpgFBKWbjwjo+DEs9yqWrSRWH<9qQ>9CtW4xLj)W|EI^I7Jq_GFEv&<#hNpBQkkA#D1(rS z$bdqEO%=Z1po83R?K0pWFKrTIwtG#OtVaw7=Qn~Y8w5n5I^r_=9CXgj;GA%RjKTkAwvHwtRwZfH`VD)|H8{Qn-y}`J 
zJV$?FJjzfR(tWY2=TtB^0c?e9O__RKP&cQq9YxzQ4>jnWFtRo&vHg1;|2dWCCE%H0 zV_}h@c+Lu$fs_uf9^&R)V&`tpT2iZO9p!kS76^%r(&MKTG|+*{n0Fa(?CF|NaXhL* zmz-&Pe0wH+?k`Y5ast|vGygPYC6&Kn+F(7Z-Lj_{6s60%;MU4p_K7OUt zA3E2Xbne!wup9Oy`^BY{ga(~N2lt4b{cs*09&}oBEsEuX3duj|p&OeOZ(0T^?ucnc zCSD`Gd8~79<^HL8w8r0fylaR3VpzZ^3)v~RhQJT~fv$UW0dYela`2F$-(g%G=KS3Q zlH3$^45&8XEi?dho{46rTn8hVS7l9x-{sy2@%&qJeR&TI)lEb&YRgZF1e)L z;p6-O(ku{5V#;p4Ik#w3F%{ynF5QG?!zVyWbxgLQfpKY$9_#pQ2nPFWuf<#-HP@(S zH>9xs)c+M?U07*OzHQrndUel6PgNP1{WZBEE4vl{sq$s~ginB(m=g9m&&&F7Sai4> zo>7}nLDiFs7~WU&C#)jf3;&5Kt{`J`baleto5MzV zRmoXgiL0s)_lGSoayDX1>TZ2I@x{#;!;3fkaR!$SuqTIRBg9A}cB`3MG<(G^Tp76UWw!3{8ie_Vx^s z{F;?>AmTkNEG#OD`NP;SN>J1hDRaWV5k~z9Ic}$MN>k(%d7{JH%Ay{C2vWw9tBZ8Dslj; z=SuPp&rve{d_8gXLvm5WQM87;UtLtZ%U){{vz=6<-nQRFpneJ1UM4mD$fGHx0+_BE zgR!2!x3oc^7~E&pxykzos$q7^E~as~)P@m$=K7VaOmuXqbme|pus}AC?1Ztwx?!J5 z*DBs-6>?jj6?N|j)#m9Xzl0a5QRwM)YDCS-w_pPMAHU?kkV$U>0%~ zU+5;f9#R{%udQ}|{V;xi>7Yyn_&%-Y&{wviG@HQ@sdAD8XkO15-#!W(RG^xvE?bqfGXfW!fzQ+2O;6+#>L7eCnvYrY)6DE zC@SjZv$nRjv^@8?PxB@+koul-i54tc#JZKw8dwaoKhk(Si zSFWXHYllYA<|2E^JvW1XL=W#hYrh0TEpL_r+T+lc3LtrA)*uTxyP(=d|UVf<=1fe%1IOb=eYxcpqu{c>gmZy zz}q>sjDvjOgnQ;|)QM`RyHp9Ars|A8tS3*a*z>tMW7m*^70=++seu z-cQ}6ZEO&L7HP%60zt8H3@PBh1;!UuLqiI_8nEpe6$KwDf(!?ggU*$q zHMApSK*v&KeqFjo)#m9sXNG|MGXJQAHQM#P`@P-SLfKNY_F;6@76CHH3G{0)0dci6 z2ZXCnRb?6t&waMiKVH<}>w`Xb1W02h9BwcKfOL@>S9*i9g=} zjZ;Gp+}y01M^jmxNXSaYII7HNnl#l{mw2_-Xp*I@H&~`NzGVT$LrX}nS$nLf)Vv2v zFw_SBf++iaFo8J@WaM~YkA^23=$|1Y^96x{3BSL06lf9jGytMi_hD7LuI^urD$@SF z?^Kf?lua3%=+i|FD~L+Zgs10w#MbGzgtV8#He(DyY|WCf)l!VTf;5fRzZ(5&5Emii zGFCdj+Yi*UkI+wZIo4QKxTyG#?=w!vG|XF<>Bm{7w#T*X&|mIp_`M+ySz@l5sJ-<5 zp;`}MJCo00^sPP9Z#)?QAM4$qMZ^L} zgr{FLq_L70G`z3GuwrqK?rUEo>cd~Qkv55#EN!EaZETKE-QD|@37wH@Z!{hWA>yIr zQ=P2tad`>WYSispi5Q9-ZOfD`Rl;Ey4{o&Y2tmjG6wnrDl2P6!i;Q=ClAOgZ7k&M0 z9_z=oiA*DYb5|xqS?{Rckx#$_U5O037`A6tEMD~CLXUmRt)>Aox9&8|)Zg`|T`svr zLM3(xU^1{ye+vCf7hDAicQgak5mh-(&(cSG-JkXCufTJ{L2vN4(;21Gr6$9Hc(YIk zc=Ltzz3rmef;}Au^n)Ip^tI=?Hgb|w#i&)>2UEobKUgyUB)J`G~ 
zv^}z*rLK=m;MBLkvM?YLUzj2y{-7L7Swe%}WMm<(%@f_?;BmCTHc(SZQcxXWdIo=)H{|7bbr}GOAk96D$y&sl zdfk#kNl|?|_D#gZ0GF4x3BcH921^ar&}zREDDj@>xN|RY_#p_6*Wpq0-3GM1G_33C zLaogh&NH1ov(nr7!j=KVR~s~X)8UP^1tZky9nE1gj*&lf)(I&WVTBF01H;Qc?-QCt zTQEKJkIiTBZ8q`6D88W;I8JE7EkE>4Ny(^Z%~R7)prmHG?$P-7mb(k!O#^N4N%ExK zZFBUCtGp8wiPn2(nmWNh%3us|V z&w#uRQZ%m8o)2#MZq`Lo&lcd)+S6rSrqX0)CBTtKVVsVXR1!VLInMok?H-Iu0T00; z!isix6&B*1Hfe5hQk2A3+00Qylm*hWpF6>>ZFxezKsa=M+>`!DlNOSa;Tr*yCJq%b zOzYUmbfssRtSJOd{$e@4txv~Sg`#}$4F^qG4`1Z5O}x^KoQiksY-MUUCWLyXllWH) zNqz0;);_%>IDFoymz5IIFAiEK&o{aFy}JVZ%HX(Kq(&tAAN761<~3)~l*2qrySJWj zJt6T+c5Zw!LAGi|W<9F|?Y$v?4~x2}=;s;x4gh549H?k9`FgzYPN<2RYDRv%j6u?K zGYkE#6weJWD+7+ielWD_MR!1LCnZdJ{n&{y)LxEf)cn=wR$>Q8xGO~ zHY7aD^+&UUL?vI6*ik3ZX&-`)hn$BC3$+egY$Kj~OrBs_S(G%n=tTckwx4@HLpGyf z6)>L|nR`_j&Eg*|K#kf%VxGTn>vfdoSq_AC*5kUaalMK#^c&3vH)3p?gn6zOOh!lg z!)#g1x4824S62sV9#dRw&d~5^iC993j?O||V-cYXc0OZ$?D|`m*J-zGO0l^(NgPohZQBN;x}8IEunlnRPasFbx}rNzm4wJ=yD)}F zbH?G=tm0%(LjAYO_?DONnO}6G5;)C59nRGOMs@EddPN=e#|q2C?oF$S-1_P`Gf{eQ zV{Um}2>v47Q9)p>XFvt!PqZ5<(cZoPPO=M)nw|v7^$}c& zlk*a_ZU@u>XU915^9wL_?uxMNf}59p#g3TOQr4lX1e^Xb9X+|%JMdjk#Oa2Rw)OaB zNdH1H0&-Rs8;Kj5;i!Y%)V`RE4Oiwn#-Q89V@yJ}?T8JjKzRp;lZ1+&lN-=M(~;HQVdF?`K90IgF|=y!RCx_?i6micYH+ z4V2cd)?K#9CjH8e??E(2_z@Gto+t0`AYR4z9Y1cSUgD8GT*i&? 
z|0^@mhkG$9&jMK@Fr zRxb4-1}eIu&^L3CCNOex$aUW(kRJJ5B;qEMLit!JXW4YkHpSQ0BI;y*;>F0s&Pua^ zj-NF~vV{%V31GyzkJfNF2Ut_o!YD6eOY~4}%rPwg6&WoNhhe8^ zwh}#`#bYqJ&K9gYRDZEw#q@-99Z!q8qnGM`65Jn|s0i3bOT0vJE6aYIp#lgjEFrXT z8`fc_59n^IIUr*VUbP5x4Y&^+ZqmuXNU)ygdwuNA-Y^6z;Tf>%S~D z#^H7hx^Zm1N@>e4c@fDHcfBsahZ)$Eo!zph52%9 zoerZ!8KM`{STqFy}~oO4_&OD1Vm<;?C#oFnzCmj{||))iT2u} z78fx44 zVhrD}Yul(RfEZB#P*b~6(zUSgIGH6z3s+KB7Ac-P*xyeB1Lg&oup%dnkM{S0dbH60 zmG*=DUjSYI`3`{9_Z(wF&3AZRW!;$+0C=uDEZzaGlL004l2`o=1DYJX(QJ=TG!#xmrgjdjDEu3Z!2A z10;Uvh1rtmSPEPw7DE#5-`*P1e+U#g=8Dp>^S5{Qmg3IJolh?dwXz?X2qMQwiGQfJ z;lZQ9s)?hLo{Wr@w;tZSD<*|T3X)n4h-eGzzcbFPmSDM@=;wa7i2Gm&)5N>i5!oEB zNH#LiQl2=n44wZcM0SZa?5bFV5LPIsv@7k9f=H?+g=AL5^wFS8(hZX$ep z$?q2sScFxYXpIgKKc71)JkXT?)l7zU+B`CK6{?BM=7>6U34hz524pJ}5dU}w?yb>1 z;bwPb1Va9J-hLEGz(fS4MOg6MjIa2}L51UIhaKBOxInSM`qtBH35uu)Z+7b>@^kL;SXl}ekhvT-(_JwQhQhmyiLbLQC73V? z3aTy*hB;WB4wQeLSLS2EaCat>m_xgYv7w!x z+a%L=K7(-j6#HCXHnDP}dujFiL zpL8@2qwyNFOnnt(#U^Dy04oGlRZro$Gw+Y$oKt_L}b9}mCMFaDi_=B?YsojxBixcXNhW@Z^R8g5< zi^Ve|YSx>jqfD*cxicHkoH0(KiJV_UX0=zL=_1<<~LQSCIjA}8~?-S6hDD)C}kptCzj zNA7~8hSPMqAB}IviH(|pu%1uFr33|4zKEcrsMt5qN%}p{#PvWvVn;ty?JR1;<8rZ( zEeWXga(9Yqx7|GnZ{~E-=~*4A3k~5knvO`tJZk#91FV&WoP-r=RoK1^2?YatoCd@6 zhGwv6uyn(d#uUiObtQ=>Bjn7W262QEME5l~iZuvHiddOqy7U;5Eb-6o0E;30bk&vg zg7wyPUL(7ON8zE1X)iSbhf|l${;IebW=A z-sygvGP`_;CM>;;HG@^NZbq)a$VltG$XVL^p07T;E@TU(ula6S_~!cLW?o!Siupb7 zXN-a!);5;I8#CrW&G}^>qKobn^SR`v-A{||_nY0L$s^y8^yE6gGPEzZ^=Hr7G7dlo zkW+{Y@MQqRgIoi;k-vH*bQu9QwqDN?-BO|{;G3?WgFb;cx7gN;>Hs=$fCiuuDsFqt z6AsV{ct>EN1e}4}le+Ty8G`^EK4XR*TdAU-;}eNF3F3{UpQ`7zzg6YIBcRL^>Eb@%h_mj8|3 z;KXR@b=c@|56@FwGS6>sJ#!cVr|jlGJim!@_u%ym+$v+apURHPz-G65qB}3!rlqCO zQc`})LIrN#{67P6RGg@Xzq1y>1)7K&cN2e7wGB!;=NakY4;u4^g&@tA-|!b%I;n-+ z{Yvb2HwuJBl_pB@q^>Swzke&)-^FCurRkgMD<wkJgsR8{fmU(2d z*DChJZf-n0n7G0IO{pT#At}yL8Xy*|bKjfb)wJH&I>o*G#&#^s`D?<1STJ8ytKqnc zAwR3<@oU`&W`)mKtX}ohc%PjoWgX7B?}i#u)_pFb6+Adt-Z(5$T(ar0lW9K)Y+vad z;N&Oe=E&&OL#&rw zOr~S8*(N{ixzA`USRr?I7kDA5;53i(LAOpZANM|n*iTQ{BL@}UKTISqpK8+gu!Bav 
zMLtX?5xU+ml$0r{@Vj(noCbCVo4DWv1*bp-4HQ~7f;z*=mngak2Z;D8Ktl|UhZ37zz@cI0f+jWoSe>1Z6;ZHc~CGB5fOHF_J5W$ z|GEKf_i?yI$hkP(h{2=HfbD>po91lM&S!%YQKLj#U!K#$zJAnYTN2cL z+dZy+awjcRJS5~rIP-ZKDV@?SGXD8`@EEIg$JpZ|_H=4Y*)ay2Da+-%fragLkrrvx zrMBHMYAirI7~-leGv@Vt+?i|ba;^V|HuKPx3LG_e8|0R&?eLb1IQFo&4d)c1I&Jj! zZVs5rC%Y3xEhxbIVy>?zB)_gMFM5Z(r-$=NiwF{ohzC|$t}tQ+Fy4}BMd!X5?WGvL zd|Mz<$ZrC_n7P_qx8y%@O=hWvoRi}9CB@?A#otaUE#2Z(R1k;r9Lb+w zi?TmF=^D|0iUq!C(!{;Z=|TXQphS#-Z>PoBtL&ZV7SKg&rlti5LBK%y4KmhKc2*9A9;8A$=4Ngz zr>sM%D#p%%->~a0-#_|AZ&wR(6_xy|WxSjQIxi*zGSLA+|h7Sq~`q>cTW3s=Bi7gh$wb=9@1;aC#WpZ7 zoQ)kOeG?rsp2Vo*{K*0;_c#p4gWNr$5Cn=@6EOwx{MVA7W`eeeeSAS8p2|6@lFM{| zknCsz8DJ=%N5rMp9x_0VJTdF%b>W z_0na-KUrF41OWY4l5u<N(z$HtMa zI{$g}2=&Y!?=x7h5}b-HkN&-P-& zo5c+2x2!H7fGwUcDgbAo!(2y{TWc$%_EXpa-)V`t()8X~XQ z6|i;#^h#aMSUn~aiWE5EzvK5DENDV&>?4j>T4j0r^0kClz&|ZGoOk;1wc}s+`+osL z=;-L^z-+@Gh|$qeU4VKi&g7e)b-Y88R-N;b~aykFPZ4aXY+U&o!#ee(mwHg5Xo8YkiC!^5K# z6yy>d3=Q!`9NJE(Rk772Wp0VL{u^Yi@ydC{0^D5b;CE4p;uUh{`yw;+1ySJc%<8t!xJJtU&9FdT)W#DtNNPH<9iwas5f zUXv%=I*&Si#cjI@OAZou;*>QCEuDmv`KlYL+jpmILR}0rjTf6;K7L}Jcv_%dz?B&v ztiF7YNv9Qrx^Mpxfya7kW2*alkyPKq&W6~0yL}pW6y-H>N$}-ALv1*9qR_2 z{7;B=N5sqg|E65xJ}I#d%=SSkQ@N{M+U&H5K*gl*>VOsa35-=0$G zEOO*tV=soZL!&gwJMq)4mfxWkm*m=C#LuBl)*KmjgmjSAD53q1$xxt;1!`5P@DMtI zkVXig9beaX!q}l@H}m@iSSnCI_&;p@V{~NS*S!x19lK+vW83MtW81cE+eyc^ZQHhO z+fJV9&-ecSUOn$p85uQBrOw%Vt-0sC)*kmP=BL8;+i}R$aKfSiUVgmyW^+%XD8>8n z3@%l(-988Z}(DOwu4?I`$utcCzjryIfHd1L=Zb9x@-r!CCeCO|Y`OxkYajjZ!t106GLuNVUq2D?{b7T~`Z61K-bgbl3_RrJ#bxTx$QT`kZiNqyae-kC?IJTJxM4nNk3KK<8|GCgY;YA z*Xv1Y$xj|-uop=96On=hotH}x&mR{1F}jIkDrT5@3eHsOPIVG(KDPl7JAh6ZkP57x zc!!e%2(H_tQc_X~_&mS;4dz_#_$*>6{D4BcyU4;{5XWUNmV8WI3<|-m)eatEM%@P1 zGnBhW11fJ)1)#)rws}36s;uM84gmM?_)U zrk)hr?a4JAy>ZprK2uUVm3FFFZwv^fxh2TlBNRPeT)EDmj)TdkT|Hly;=|R`+}IGB zuO_+wGmyBw_v?wTLjmvutG)-9`4R8w|%?3d*@V+?r> z!PX7BE9z{0om!RHTN~-=C+5Pu_s#B`1@=dtLi*TvzK(X3bYG>$ zJSi8r$@6w4XuuJ!IGpTD#y z<()D*HN=Zno7f#>%})lP<3dPX>P3<%+kHWG$HwYFB|Jd1rI9F}Ak+^E*Dj&)a5=JK 
zDG)GV0XW$}2!}%%tv4uZaG?l7{Nt`WJ^U*fC6gljC;?Spf|<2^PHS0+_EIF_)LuO) zshNs=23n8%8hZW_AR$9_*IXTJSx+zgFY&Iu;#fshUt2RrpowA2(VMRu^RD3gMFhX+ zxA%&02SBrTlYqpnch3iU#xcf(-oL|GB!1mQ)5{ymZ~pj4YzoZ=dUABOifsF= zxbt#vK!o%MFd=kW?ExUryH`)>AVL;dIXO8+MSx7)=r(=p-++!1JG#3L^lyF50}V|p zDuBFI)1{Qse9_uF6`W?`EnBAtxdj8@`XLx!@DHe%0om38w&?%aoCzdwJmxk_em8|3C44<|Xjk!qhY&DT(!8(Ete0 zhL*a+z-$kf!*Kiy1;hLPBbyF%PsHi!-Ue|z40Qqde2E3~CRp|YsRQ2RMZBiIzTNdU zFD_0V5cB%a-6!rqKt~rdYFJiUN-w8{=^yJ=;J;+;myv&fUq8BW+EzEhQWZ)6 zzssxCv$n>=!_$_R2Ot^b0VAkYRabE6nVAplTXEorM@GmQ7?7e^{_mXSpLK134EVpR zqXP-kR!E%sM3ccUq@dG?|I!`bQpm9H{S2&Hu-VvP=#<1NhnBurf1^)ST}BZUoAzw` zo*UATXx=6?sWLIgn10Z)Ebk}C``cWb{PgXXs2M7i|4!FEF$0uC$a4|0d7zI`WZ(mJ z5mPx!NI*64?SpWO2p|u4(wB5|U^Cy@d^TRfwx+>>3-T}8oPr-7WBb#E)6T{0BG(ja zOdei}RHKAV)FN0k>px)#kp|sAOxWZ$Q-&Q%{F02~k2gU$H<#5(`drS<%xkQ_OMBiF z1@KVXb1lBz3ECjvx`VQgtILHKhjq5>Z^CUucoEu&k|QmCKfeb_GDg3csv>OFn__&V zJ3UW-f0ES7Lg}i@yckuzepz|QGmR|!A4T}9IG#j`s_G{fdD~ z1=^!*8cXkHOi1t~m90Ka4z@?{R&wyD1s3gmIJZ#|N}hRj|C@Nuan}y@Ia%5fJ}$!g zjr^Wii6fY2%45(Qc?CYUaZ#M^5(J6g3-~gQ09lnh?SCa^)`(det=8|S!zrDQPtb4a z#my$`v<&K0!>iNn^iTa)f$!|*%jWCDvXP!^kuxootCeO4rJPb$K`Cw5``ugYmsPoO zG`fdjHP_hu^3p%V3JF}3S=aF$sPQ_LOd1;jn*Enja)b9U>vm4xZrym~mgZY^20Ojn zkqzBBo_0G*%VzcMsbvQGMpWOAJ~%6A7X|PrsOjhoK{jNT)oK>-A;It}la=Kam8Sl0F6eSDJ2 zZA{>Lr9aaT{#btM*fw@hKHhI{o|d_{nfEDKBH=@XM^Co7Y0Iq$)zZpY+yz68tu&t- zpIij-O3fr1)Td193tta%p)%D(=r0xxf113dm&KDog;S-)7v9rFaghpbcgNt#!cqrp#|E2_J;&OA@>HC6C5L@u)##)fP4CYFUs_~ z>d@=|NC`ae-!A#JBM%L9s5@^-Z_bXvg_kVzqn*z8%-z1aOA14|3;PKBD=3roOQ`C8 zGb&IAwVUwajrL|q21fm?+w;?ydzl~0KKna*RnG7vJb+B?DbkV~3o2gTeG9f)XX@s*3K1#hGI#MYpxc|}Ykqc4~J9=A?WPMA&=rCY3eNPnIa zZbxpL*kUq9_WjSq%jBlN!ri9p$9LPylD5z6ZZ!h3cJQ~))uNk<9_ivaT$5sgGxVlo zc|J36a3)YJ{Icp@y0f5B?N*KG2WQr&i<3eB6w}rZOf2n~N#a!Fm87j8U|$?AfQ?nI z7@gXTgIU4*`e`-iYJ=*A*Vff(+aP;;f{(W?+g^8>+cQ!ViClMax3jB`sU#aFE5~92 z*>l#vUlveD6qW&kV>z4Ui*A zjXN#|7uGl-I)X1z%>rl@4_^}q>;puEv9DYJ|0b0!0g@`YKbOnM=KGK=9^-hi?lj%9 zL#?QDw05_jI$d_*65DQ=4%}Nj8k}p5g@`=I&aym;l>6DZVUb}ZpQk03 
z*ilyjZOp_%{mlZfMnnp!;=K|sgYk-0`2hozYA9)bi3)0Q1u{f|!pWCrMRAIsSvJ_e ztu^B-c&L8_OF21}!Q7xTg!YEM^R}7_`cOVCQDP zd2^U9ycTN@XK;T_>N>y^<$beL*zuVIrqKBaOg=5;1!-wKp+#So{K1qGr?~(ywAU6H zCe?Od+|JPrj=RrT%;`f=X$fpu^tr%fq}ke)9#5AGKc7wXb-^6)_}mZ{=lv#wmvOrx zI4TX@X4(V&TL{d-YH8sAn@4AQMSf?a%8aEB<}+kcWk73mU%Sx!3`5t1C_s8yPWRUB zV7>e|Hhrok;l@HjrI9ZAFq&pr^nS-mEg4~aAU+ch6@y;Rxol39j{F^`?t`XmD?~xV z{$7;Gx@PeWDa34u4K!djB6-3F>+dO748NC$ zkuiN65pi*GK$R2~6_xkzbQuFsj8f3~WEU>S{)3qDpp;cS?ZkrK$6Q7eREUC04a7Ir z=H~JzEmg@FArgs*%=f*cYsX*rqqLR8&?M`~#G|@S|CBXj)_}PTeZgUr&bfY=PvQ=F z#-;XhyfOn{ZC#TayVKs3qFJvy7Bw|JspdILkKQ6f9x#WXTIM6tJ~C5OBd9PoK$Z!< zle4KoSxZS36pyZE$BzexTJ2yim##XqGS*9sbZWM~FwyYliWr@r`g*#rqy(GW0BTe! z#5`^Jee~fM(%6~lfoP9O+1OOnQC3%1MtOXsgDryaaf{S2G!8QlQ*OEuIX*l^TGT_` z93aD+xyuW^G(BQ5rTBIF_)fcTREW#v(C=cZlShsfpca=5o-L-N6yx)5{dBj}Es>Ks zubP#aq&0S!)J0-nU}T8ueUw}MjmlUDZO^=s+-h!?`mEVxRxGGM8zt zVW62Ej7wep%BZOl(Nl@eCzBn?Z+Pez#yR4#eA9DB*fR>oQRpiUVQ#*YseRSBm_vYn zaOjuDnNt9hP~cw1*h7GTcpo0=|ICy2Lbi+!0FK6TQ4eqEGJB=a*5F-^=pLhzo za11e+V~_Q;ygFfd7~nu|ZEj9++`HS>p#Ww196b)BRgv^yl)|jP_uzAgl|gRyV`#>C z3HOLR7bbb43ToK(u+2mAH87Yw$Ssx`|u;VOY5D-cM z32xU-i;E5d9bL89ys`9c!!Kl@kx}==t&NP-jAgC2R`S06Jn!oxD|twu;+DW_@gy`i z!}YKZ=^C7o4-&u5iXaVs&!6GiHt(0ZeqqTfG16hjteKJuIZQQ3oy`4QFkZm2f*dDqe=!sHLjpb99v<`rUgLtXYUR4lKqo5cdOiM~6Um`Ef#veh)S}sT z@cZfF!@#`TTEZt~D_(Y~7Js#nlneD;6Idp(jFeMK$ThGGnJ_dj>(Hrq!9+>L4mi`w z+&b5S*B#+t7}Gr1%SVetx-aHyTNsla!%tl9d6V&P0u1w(lU;i>_&zu=2RM+XEe9-0 z_jv0>54_|Zs8_|!2vrRkEJDN1Aa{<;TfL7*!g9jh0W<6fM$lX$blF)qV)OM=yoNkw zc@-agtL{jxBc*s(S8F9!oj6t^%8zFYEhSX%+2a&cyjv^(|2kUqkcy_k3OD4luUDh> z*X9{orMV>)vJcVoK74NF8P>RDVBfMl(=tV`-%VQc+b3rjSPl+T$}oM?)GU3l zinb0-QtUp06@mdQ`m!;REyb;qbg1@aMWjRFE!bJGLX&e7bzyf|bT}Oq6XKZQLE&z0 zwQ3FOSvf>?v>&SyLNs0zvhj> zokJ&ec0T$_TJyBV?ydq-_VT%I+tP;n^@~=*G^D{&qf~W|uXHY2K3PdgLlm^gnMDKM zZ*8l{8x$`YKLD{>z5xWu@tN5nWnhV%8KugogletGqvc^9iQ31sqAI>JH7CZyqJ7Ya zN8Dv=b)yOy={@z~YKR46pX0VS_nZ;{)Qh^69BUj;yWVRGuZ93sO-m8%pZn8U)a|U ze)54fu&-00+qh-#%YQFwfdghyxwO`uY&Baet79&ehsQI~tvV+rDo26--+iC@kh|TEwy1ItO#@tj& 
zELjzm5I;7XEex!#X%(r4Du8e21PI^}Ri9pM-N@VVw&a}EGM~3OSr^=Q7S_{qvqIm# z?c}pV3gR!FaK~~c%7s5)oI#J?f=#H-(QLlE8=UYYl zaxIL&>+S{De_r2EfF2%B*x>Oz3EO<@{~x2(`My}+Yq%6NxV*d&6$ON{F+OYV9#o|I zIJx(k)JTNRMT3R~^t|fdNENfgL>mK}-=XDYi7d6(`+GUndFI=LbyutV1$W_bC+jM6 z5g0=X3aLL&S3?8A=PsD;KlLhv zt>G6G3hbouV*cVhJo}NWMW=J{XQzYf6e)43QBe6Ub2;P4?DMr(XCMDjS^t1F<-@?z zNI0TUQybLb`f1N*GLcYP(H6d@UIEPr?uC<n(C}*{IcA^HZ(V1gIhC_H=c4hhw7oPKj-<}==`diQHoic zrM;VxlbktaLb(fLRF~~TFOz*Vio`M|umarmmQ(=5P;b-6L9K+$UiTCUVmI;18TjYh zqmmwo5Xsq6HgEhAHZww&+5QXn;k!QJfy4P7$&Ai84Hn#LN^}urI{OS}r zb9`bPx^umut-iKPGM%7T#3101_dFo+2fBd$Bx=JF z#u)ilQ27SCQ*Fn510Zg$VEJfQ+;BQZL7m}&zVX1M-2WDVFG+#0O$DUK`0A7Swt_AM zP|}{Tfh7!hji0BTA$++dDY;{HFkqbA3A$uWSmW{EJoYF$KV;5ktEtM$#G93ga$`Q# zxLKc7L#PF02@b89q1#(1DjMo+wYc6dF1K=0?NJS^t-YNsmKzqJy8hPo_VyN8X8At= zeqIrUM!&3voaIuGy);&90i=+!kb+90l!bW-)xm-CU8o6)~Xl zoBi3^#oAq;_$}RyBncDugiUo<8#{8BXn3DI!))HM(Ln^~)R=6-gVe(5N)>C;fk|?# z1mc{MQ=_7)|EkDty1A2>HNTlxeq={@QZQxvs$W5ESJU4B0K)>~j*=5=1tb@4qs~&g zR@Vs|sKk*)%F2iXOOe)J8Apu^mcz*l*Yhgj&c7mWEav)9*vJCI)IjI$_0h9<7I_6d zTEB<_Tw_@h+r{e$%?Ne$9FzV$eBORsRh)9oyzIsfs9aWdd_k^A2^AG3VVPdBam6t_ z4+rL)-D;=EMW-%Y|4AYmi#25rOMNved(pBUW>H$|p zA*-E9L?cbz2)yDxF9TR3iHMv`GZW?om11))v4o~QbAk$(@eq19ql_k z3Nom&h-MV1l19o%$n7H?nTvAFw+og8wD>8x&rMM*~?PGbhSy8;0$p(rWo z<_skH8;HzQH=Rk=2())|V+~|1yoN*%2yCR*4`(57Mjm!-WHjq##ozTR~nvhf`?opFsU5u#mz`z!SbC4lAN76V>fwrhLPG?t#Cj;Pi8{y*gIz#~ut& zd=VPK_pg^r4Ax>KEhu&_e+=H89Z+!ccsCw~Z@Yi}*VjG<21Z~=h>Ef@pk!QHS^^Z1 zFd%?NUN#m$cClyA0H9v3#1;M%#R=Tuej4Y8ojM(|0)4P!qx=RH9_LN}2LcKGW>=&<0uCwS7x?#r*$MNR z$d*=X!{@-!@(={y!1pSGoY5&7QQ54D=H?k zo4+|EdoLq$?uR%Wlo3lt_adCLku#a6YutogtKq~Zn|QKjj|AyK3r<;POpnzP{+>zm z@z}Q^3r!SoLc?2Q2-p5YJvD-;VytqNeS+#IG@umKfOA|`BMXE=@U{1p`}wQ3BiNYF zh?1fn^#IkRdWFz?-Quya(lD?%T4HD{%^>U zT@9@8bfW^~O5Iq$LF;ray7V}j`dGXeC>dQ2FZW*Kgu&qvFY4QhxWujn8KY_&sc{-3 zqZ;P)*!WtK&b|^#Anw}h7|f-OC!qwNRfQmP`xnuTjkJ2w@|ZdJ3{Z|&LXdf z&uR`qwdLA;ljMT{Fz{ZKyZQwPde%iGbZuB+i$zAV)DcF9SiT$_^8R2&%v$ z_&izEg~tgRYH@w*qlG0l7^zR9{NILkQ3=zWbgJ##2m{^e48D>;x+evx9Uk6Ucz71x 
zTp$)?V+1PZEE#oMbvzfthpMG>MwgSY36Sq(By@MpH0zA!qbtRJzl>QO`S!c6_2T)u z1ls_Q@4pmB%Jk)o2GUjOlrDX+bquohYFE^RZy3FrmlI0$IY)b~!rvATaMP=0(kbX|cRj(jl6wM&D z_&6zH&W`j+PyzZY{lZ265vy2^5s ztl-Ult+SK$(c1X>6^mPA>O(M8Wt5MDX2pqxal#WSTDumy4hqVqjtq`d*(>cG0`>kS z1ZpToCH*~0D*{>*xSd3&k-l#AwlAuy+f!{}Oc4fLwCdTB(Vpp4Bk5)G6*}mx zV^TmHU=#G;8kN_4cZ=iRPAX+IWO?sm67T|7?&6Y1#1ybh9S(GX%D;Y*sx`eDl?ftI z?bWrELvQ%_COtqwXMKLg9n{n7Q=N`xcJ`?_6oR`}yD{i&p(qj+7>}{TXfPWss5hI~ z;y@%8!}_)-wozHlz*+M*(#x`LFCX-oI?YhG@rGubKaj!`3D?;)Dxz|+tZV zg1gyYBjWsbS>!Ig*V_Jit*d`lG-YFL*oEIgl8Uma*-Yq&hu%pPer}S>^A;5$Wo{|; z(!_Ga}8YK>+umI{}Q;k<@${%5<6lyKO`L9nJtHgAJaB(pMw%w#{Y752YljZ5Y zzP?#;HjX#i)STjcNh{)a6%7f!%Lk`hh*D?~jf&D(UV*%(BPa>s@JLUBgUSuo$-?XB zz9p7U5YiUTn(x&MFLJa{vsJ4tj?YnN&))eON}8MdXJTHo2Kss;a!9SehKV?lWb(+)iU+2BpCDJt*0~hu?%EK(D#G9{S~7F`Avaio?% z<>wQzqb#K!iA6$-_Q`YQieP)q-ngn6o*3*P&(FYH%b7=hLvuMR=S0vf>21Rg=U6gg zL$Pt4=2hn#>pg&LBv%(rcahZ`Kul)jpgX-7GMb%mka{@{W(g0uyb5ARNwduB5*IZ2 z(9#ntGB-Fj4l^qqDlIXx^uZUjA=Vx;VU;EO z=3?e=2zD6b$fVelv*BU7A8PfnhNrJB(ZpWf?nrC>nj>@f{A^%*@qln!cvu+HReK-s zi#IIj&!0b?oVeY9&&!wx9S^2*P*b8GP?lm*cQukjP}E(o*G&aY|Mz8;7pna221O|z zSAmI`9|*|8-pe(-ar#;dqySo)n#vrP)HLMeg7LV7I++!Q1eV04!EVk>o_bLY=;JO& zAdNd}?m57Sy5&R$Y`JiRXjYv!*_ZHjcQ0=4QM-eZOTva>xLCDGbK-V zKMz?@3moX6*mz0r>_tbe+TAFn1DFBNnVce?QLelT$%CY+X~}nGhFbinUR5?+k$;Aw zAQCGZim-y0ptd$Y5$x=rM3w{d@EQVsFKhi)f+7}I^zHm*lQsskA2~%G#0RP~sZ1u* znOw;mmbSWups7H78KyI5xUvyxyim}-IW2+2SQd_jwSiQfsWiJ zE_LRumzM<6OrT5EZh|S!Ji&hvc*3jq`UEgM9W2fFu7EEm!Q>#g$TuPpOrZ`J#$J*JxB` z7S!2ylrfP=7iv5es$t~{LQsB9r6J_&js96 z1tlYtdwOXm~{HcDTeo+8AZP(4ja{)l(ksELtaz7jAmEi5S%z6i2Vz_$6v{eBvs zSx`;W;g3IPzv&YgEeYReup^mV%$kCNP~OGstiqo9Jv`4q%$_D6nUp6Gx&G_j+Ja6k zwEbA1jwwI&0yy09Er%iLIbLkqJZ@JVD{7oCmI4?bbugBelKB_rQ3j7^rw``(Y!(E=jG%V6!ZY&g*{B zb7nazPZ%lUT$+Hi^i~?fh4N+ z!icnA^rq+E2#sK^hqWW?fv!le`KrKK5Nrq^1owdB7%eXO2GJ41)8I$JjbGe+AZT#U z`ft$HCeJF{w9Iw!D=$2SDM!Ey=>qPy)uoS5Y$wp7&U|bjv01vdf~$s$4IE&0lu!Ri z7q|*PSkg+Ufoi-Cu}N;wwQCULR8xAh*W4P@JhvKf=zzq-X8q)?H43rYmI$ioqaKng7|9zkH#6Ar+o#}C4(^2zrD{CZx`kBzl(%_S 
zSujH$PiDVnOJTr^MjI>iFL%c=t8{^ohXXC67+H65%NjI@Z&?(A>~UPPs8S}huk zJq|VO)R93BO=k8ze5W3cF zKiZI!EAP_=a}0hei??TU&^+RO6H-o#Ns%L`vKjlcGI8-XekS`~-Cc}rv8M*%<%{=! zyZuS~Z)fLwu;$4Ur0WTpaDuQVUWf%^zk<7K0_y4G3YH-4;QwzAr$i}MP0Hz9sSz8_ zQJmY7`srB<`bzp&E)oM@75vLqz&a9PqR06rOp;`*k47BQzB!omTC~n~L$n0ixA9qs z5h82Y2V5KJD`((>z|)q@L%4ZN%rXcrV8G(syEask0O1S^1df5H58`ZBSj)fS$Bg5n zOpnW$)S#v+Fg6G_Id--8laS4XTFRv$E5)pD&5?&eDD1{XN~i07LnDFD@X{6kLG)-R zWOCQNJG9jRuVXu9cXrd$S;m6%>9s`*#j9X{^`R^(mOjdMn|b%e2{=OtWcFbg8fD9iHBwTn9LkoIZPOByOJ2Mm!DP zOjBAd4XcalHQ=eI1*r5UD~IKJdaj=z-zHYRDj9{%1H*I=AB9vZbUHoI&PP5}q);+H zCUi%y@78B!Lb@vu|L3dJjxH=T0wmR(oF4l_5a#FR0zt#V!svmV%p4t=Jq;diZ}m#C z?p+@bJPzJZx|oA(h6AGmRo^=V1qIr&>bcaQ?4VDJb}SHXm*`~0aJP>2>~?@R=$y%+ zXhFA1eIa{^WGDVCOm%x4qR30(8j{j7hMlYznz6B-{T#^J#S2siP1f(>Z_y7-(%^3( zpY_G%W&G=&sKcy3d})MQn=6aNRekIMSk4447b!01x2dGsH)~!Vjc32~$<~i?3;lOO zK(_p;ue;)S+56^DZaMU7iYmXn<5*ueKJ?`)#R4h`6Z8wh%=+(j5u~T~ho2?Cb+$D0DybceVxNRR@50d6X^UfFLC-dB=&X^9V?2 zDbECYt^i$jaWz6fQH|u>XI#%v&G6wuorANf!}Q^x&<~V0uC$K(RP+Tw5hC5Q#aWy=zTQoLql<}di1F)<<7?Z?ckpAQ>@_@ z+h^ZD@v1W=9@=+r+dNMED_mPfG2Wfw{xr#mxT2z_hRMn8_CmN+YqOir;!QjeMqD$0 zT~T4)HWJdBnU&3CXuby_>?~w*mQUBWtQU5g4^sY51iSVX=p`mj5C)hqDl04d`~apM z3`A7aAkO%a4I1|&yP3FNq>w&qH7ALKmYRqM@Sl(@^37 zCO!lJOG!<3bYw}Y-0ngMKH{Z3O>LnO+{mhuzVcje6>X;g-8WetGp`M)WB-;BdO4e= z^~g(wzuM~Z2%r@f9wIhEN7Amuk7G?$LqhF7$0BS{S zEL5q`hUiAFN{WjX;FCt~)njMcW@w=Y@|Iq|Iksk}v<-A0=?yyK9PiMZJ)ITPtxLg$ z{7+7J7Lc?{h>ZnqB+ryahbO0?=;bpvFKcRIjcwKN44O;W_Y6+>o5?>W4k95!2VvqlgF>(M4m7S9z&s;{J$e{q-WPw|z==rF3eiaR zfv2myk&89qP1ADgzS#jB>;6~mBI1q~a`G47D81!3LSF2Smd*2L6c7kH5xwT!=2*-Q zZ?7M^9xzq->5Yak&O?|cnwR#>m8(EMel%w*volbtmdG$OP!iw)T{Df6IlH^HN;Ne% zH!s9dkOwwJcYKNq|EK2y%7 zJ%x3O2aUjN_~CRtANmG-4|M=nlif=&t|s`L(GGX$*_+uFHPsIIhS@q4Vu|M(dpt6V z5q_K7t3g8am&cH_%e!@B-6V#Mg>1{P&OHF~(>pybdFM?||OsQI5meUd{B&Xqf;kj2B5G7qi0n@I_I}JosGBdIQ0; zSgxS8%$9&2RiM|!*r1si2OHZC3&``G>GD}!$`EO1m?mEqc zP{4$+Zxmxz11WWc)v?2oZFBKK-NW6p3!*uJ>nAbi+m3@@tm$ei2M}WIkTMi6PJj_T 
z7G9KS`H5{%=cD-72)$v#yE+`sZ`Yc})~b}7Ia-w9x~kWl-T(;oM)*Jf9D7-L8#Y~Y;$EBCo1UIbNtj_ z-snNd9pwUe0w$RUMq3cEj04^=OJS-FLH!!BbwQ~ZT4MYdwO;hZjLI-o^#Q+H&2^qD z6aFLRm@ptZPbREc_^n$(dkOt9noF796Vdl~-dYOlf2$-Q)cC=|%*B=7zdK{fyzA@l z4+7ia{wOXe7ywu%zs~1S#APAaUS!DLqRi@_+|LF(ui%4U&R{JCxm2?M76kUkzy1s~ z>t^B@T3$@El^&c^dC`%Sg0l;G4cOn>h2HSWnP@m~_^))z*=`sDmQnqHTNMY^_%x#6X~=&L_FDo0f}T*tkrI+ zK4BHQ#yMbUI+G_wJUu^)P`SiTVc3=@g1vBN z6^nDe7(V``;=D0*hU9%k#;HM>X#fA}1hhYYYEZ!fG+%K==0fR^R0>@ASD3zQS{B0Jl zFE>28(Y8rE;pP6EHn@M)i`sngu$@Yze|4-u`{wQTtO_2$l2Qdpr0BIY+hJj51uYv0G|IAX_6Ln;suiS9We#sy z40KCE{pBNUXy9gdlBW-g6;#P!HS|)}|1Kc$vN>y+nR=g*_VJWaX%55;Gu$8F(_vM> z)Ode(38u#@+->{MfXSL1wev3*2B9210@8|FDqe)&XRi>f1YjPTst46o`tAKam zXk3mJi}fn2u5K^h)RU=XIVP;9%4u5t^w016^|na?4R{$~d2?gluPc~IA1j?jotO(q zk^gqp*>b?NK>Un>%d?dmo`^%eo;4GV=3%HW?4Lw^EOc?8%1HQO-_k@rLsV)QMYl6z z_GWs!)3+&g88hLPPK}L=)T>oI=7y5O>I_-IytE{FuwY4lb6P95IM7&c-hHC!=2cGJWfe%S8808a7PSHzw^2zb!GGljn*RqU~GHq~hR`FFP2!bJtAPuFESS)c6jwill}XHFHF5|bt- zmJKsIstx8vaZsd*1=V&geM&k(?>cGP(nhyL^k_uGU7uJg34SIA&QrKV|k9OtsBax zVtgOTG4g(c$J=E8t3o?HaTxnJ)h2pXJ`V8@nM%Lywu>A zz>!q03tqP{fEJ(Yea>>gdp`4Ym4Dh2P0gK+d@dYs))|sp9`OIP_=%bF+}zmg?Cq5m zavVTC>j$i*wFv+<6FM}%Op7Z_rS#KbV_|7)3s54>mcMSOFkXqDb@{rP`{)7}@!Oia zMk&g1287O-I=XXLcOqY|BPyvVC`-FZK6L6cFS975_?tNxrA)NJ=kJ$Vn%ovGC!#7D zYC>vH2JNFq46i@bXelxeV~#8>Eb3d2PRmDFk>2ZQn!8FV%3*QP(cvydi@=^9u-xp_ zr|~`_CH9${osB;CRn2Y4t7t{Ip(;@9QQtl(z{Ja9=(iahU4KTM%x-iz{%J32ahS8| zOPT!(=KbvQc_H@lCT^}e_4>u8(~F2Zlqh>!)$ALWR;JbfNOVi$vC3!_+81yNglr_Df zLp0j3)qj~u3pnn`Ne8(d1q=XTVq{cQQUc69u;mjIg94XVP-tv$zXN&7PW+d?*XjR@ z!8+`vp1qf|jX-=H&l>qray*Cy{YNA^r6MVAW$}Qvvt^{jPLf1?`ghL)0l%6trP2;> zxdj{KM#Lns!H>`4m@>CF_UB3mHYT;Y%(AAc8O8BRt6vL7N>*SI+sDcHiQ1*4B9Zk8 zhW%@q9)SoZG!oGsrKRGW6CfPi_YD(1+Y!`%l$52zQmQyG4d{iOSQcyASO8G?1d6kU zM|IJZ7&099LKJods{{DY-2kfjNeJtP#>^$iV?VEv*Bl2t&2};lOB5bPkApduSW1nbgYwlLeO@&Q#f`&p&5!xYO`uwZcJ+a&RtLFiqllatE3P3cMv=NrRG9V#$wtH22#uI;2`+omh?fGgpJH z;xFPbr_ie4ktEvZ@eTKcw#s-yQbOfWW7D%u_o}IL9e4%3lp0j-^C?d*HuVj86vgi6 
z@+b+~nk(#ou9^WA$oMcXkmaSNsqt|OOG^|$bP+(I^z-94Hh!N9tnx#P;U_@c91IPEuP>Rx6EcRq;kC9AYqwsdW+-dcA z@od8oMn1wi9MtywD9a?OSjw5aAMWuf*U^0bd}q}onr+nJ4e;7qM%)d-&*ZSZZL1Cc z|FQLs(RH?8w0F?3vC$@tZQHhu9XpMkG`5{Iwr$&uZQE@4?zYeWjC00#zlVF@`?}Uz z^Ec(yM7rsY!?v;By2pU$wqirC4=aB7bR9OU z;W)hltZLwA2#Y$XQN9l6?cNTwMC4zhR@2_@oN%Sk(l7VX-*mw%#XwU!OzLuoFP}sELS~2O@mLJeW1TxY1j9d?N*M9Ubq4K^pBY;C zQb^SORX=ht*P>@l>0i~2l;szFIQ3#IKI_-KwVpW}e8NOOHz7u|hWAD6sn|*M(cVWC z)~~1!#U`34kB88PYi;0ei>}sN4<5fF_H!<+aE?;&Q%tp$No7wepOD2zPhUT6D zRl{P_CtQOzj~3*RN%0*UugGcNKqENcr*GePWnXq)YaX{89G!p(slZdCFzWrzu>_L@OzS4hRGi@N*66ARBQ>w1O2YmNkM8|p{|`}$ZU zzaw&b!<8WC?l?gnJ~UqRle^zP{_Kw--m?Iq10A!mH2|&F^wWU$%7Im0AYDHcz+$=i zoC|g57v#02vmR2p+ET~bf3M`;ph;<%njLZ^hMBXMQXHFLUcd2K|kESC2`S#5{+ zCULJlj53t@{ZhPj>9>@v;}07}6d~_nAg~Hgk93ebO^`PgN+`SkLYqp5o0K^Ldl{L| zX8Z0Po9Bm?Q7?F;$Ql*yz0&TM_ZY^T_M}McNt{@}*13Ad8S~);1ERn&-;&7(92ux{ zj#XNxVsbdptu>plwpdUKTT=s2tA(qpL!l!XoZmIaA7-Cx-d1@quu&M9xY_8irX`?$ zhd`uv9Kgj7U@%yNFW5{VR->fzn8GD=tAgwkQJS)OjT8bS4azlVa-2DZOi@`#45948x$D4Thorjo&{&@F#OHk#uXa{Lf;rubwV2!-t>Rj&(1Bu<+ZR zGj_2OWK4DkpVZ?;Q63_)l^JPf(&2w(@L`ikTs44n``Svwr}LA}&$5ihSwt;6;_5p;=) zux%|OT_$lrf;{Vcaj&^P=v=H&r%HXwBM{N!Unc6UBe&e^*54qdwG6=*$ z?#<05w?-xs8pXv$PR63ho%a0RQ%~_?1^nn_V#BCv^6qZl%D*Dp+2pUpo$*b>vCFX-CYz# zvva%zd0}y68;ksCaA-S?chN;Fs9xu@5^dzvNL!VJy=e3DkVgCK9cK@l5L%^>QMJhe}(|n&+rQW2fsOm29Bt*-=UZHTSqF{~Qv>np+ z;lN<9mB+n{?;4Xpfw?;!um3W|tW8sX`Ye1yesJItn(So6j@MsfvsY6(6#8f*lZ%WO zPu~=F6RJjFJ9DeJ^5MI|Vphx+Nmy`Jv1_Bh95IJtnpOhRp3T?I_}e>U{#YirFJ~?U zMMytNk7l8&!tPRy1@pg)E{*EIcWSSH8ig&D%7=@bD`0R?SPO!1$XM&_!1`e1gL>Hz zRm^NS>>gQpCM9I;7bGL2dSk&;U0f*QX>s3!z<58rLygB@u!9~z->y5{QR4o@enVXn z)|)Q;X>F5n;`cCic~)yDEjM%LoFa-^*uI=gh3+1+<$G+%;lS@bU;4CCx=XFYv%UJ8 zsom1i6VV*wOyhj0*)jZiyE4Ez;l-LN+kUuSH@1;Lm+QNz&}Z`D@@|VF;;>cY49t}^ z&MP&X&P=J-K6gW02Ow#Ptit&9;0M66vJzmxGuqRW;Q5n6W7`=DifTKWo;vEHIyS&7 zg+fCXg~Yl~A+IK&2Lr=P>_LHef`8ieTjnNUPo9b-NnVZ}J!yL)BN9lkMteIqkaEU}B>fT-AMfuA+|{Wv-)Q@+bI2sA z^o2(DrLx>*wqxy*go&3O59BeDi)@!jEmiZ^YU8qR?^on}qZ-FMdk_gOzg0&e?YT_B 
zts|@8uFp3pLkOS)vPwVtYX!AKK#;+k{<$XvgQKF*u}@DC*!_0Ro8ZaIBd2}Mh=6f;qKxOj?!rD~;HdCAhSl#5C*8CB=J zTs^d$20nu#@^6egR!>63d<^-C6$%{%E2DGCIQB`X*bQ4?kq4&qF>&+_YLS3s> z51Ngegn&Rr=UX}c0_Q0T47j$BI&5K*@<44ZE^Vh~> zR&UIoNqKElb1ne80K#pNKmlW5IskurEV>cEZ>DDkIvPNK?$SFk=}37_soUR^CqmcM z4T;^D!%T!U3jDz?2pmpdgv^LhNKwN{FpyS}2C#O`W>w*#X9w{-3L-qF*EI~R;o4}V zSJ}Y0Rj^o-f_fo(TcvV8uVi#O)5z_CIQNQ?Db~TZRjly#XAUi?#}UAhqP~#FEq8*g z?}-?oy?U6Tgmd2${}X%9*6V(U$?ttFlrH5ycd0;$4rd5>UP%KcT_3?AGd}RnJ;Tvc zN5gvw&@M_K7krLXF0QmiDrh<#1V`iSo0DfDWJr!cHB+};$3#LwQ@3LyX6*=Jc8(8g zA8MK@eSrK40}sE><6k$AztPQp{>O~o|IM+9JOgy4ba_ot=y;htyoo^2PLsXh{WZGC zmP7#Ij+1B^x zRblQ`hdsYZL^>Pp84UCg!dQ4X^pDZG>CA-CK!{hh`O5Ax&*1VNkAcm=>KVdqTOqVv z895Fank56j&UxC~g4Q)BYn*&UG1K#@w*|DIA%C1Z8Z{&oY<#ej%jH@jw;p?FRN5s-g!mow`+RG7>2>=Fz!fg!6^p#L@k_hi^sCE zvUomSKW&R-NIhwdWZ>sZ@rGg;balK&%c#%t3PPHb8XZjxXaK>=gI^>oNc(;a)a-lQ zAc23|I(LZ!j=r23Q>Ub)~kW#ehw+an)vQOHVOKx7aO4#r9*mgBsH2)YI4X4jkiFc zO3<|VvxYnGQ6ax4W2a=WF1D0MF|hNPPmINEFsx7X<=}v;@@Q!bfYXYH>-cKWZN8OL zEv=Z^%5|Gaq3>`uET4=t;I6vC7(Lsip%hOJIjun=t!jb=g7H)<6ha5uzPp)geN*6E zk59b{biYcxDAIj`*tEWN`9)P>GAtQmJGO&@qKIBFA-u;kTzYCAHXwEotMGJ2Rcbi= znbrVYj4TD@(C&~wcSrm0>;Hi#}j1bZd&<3wxl1qrmD*fnOsL^`(In%JJvl#+wp#EtU*=dgZX~_JnM*qZ6~dhuLRVHs62bwLGrjoZFGn}#XCR%Os|W!rwCX%pZ_(QQFQJI{M4Br6cehPG^>3QPt==KP z_e80N$1#Ne71zQ(>(iGff8Yev1B^B)c9+u~z95=5zToh7(b)dRPq%-}5(-Et{)LgR zRXFd!qWs&qa6i#Lo>^^t(_Oo`=YsVOY^vkA)?EP}z^Ra-6nW;L{OZcNQ>R3o%;P`S z1%>#oL5q+NGhy;=0=4nHFW;^ zb~hHr$!{7K>oC8dLvE zJTbUJO2?R+Q3vkSn?lec2aG~$%w{Ae?l<4_k1Q$Ar<%60PPUnPYK>Wy{xvUbZEcR^ zT0XDiaU&@q3sY7a;>TUSo+I{TsnO?;>#~>VbSx}NB2^(a8Pf8T7s+@e^ms?rFje^# zB#s(e=_@?-l4g3QZlxuW94l{P4!21lvaV%PeFB1O<4f;(etv$@q>}yU$<-Guj9a#e zw52TQMHLLS8T7q@P|D&Ei9%j4zqWHy5+1jALY}P8xJ1Y1ofE|+V8G40=*F?HR)lcH zgPYVf#IM6gaf+mradv^mKL#B&f9J{I&qQ~Y5;p~vbPMW@{;k!H-aHc#I#SxoA7!m% zwJ}!}%D{IRe_vL%k2ix%Dm3EmfUbI_r|b9YJ=(iKNCgip$kW*9c=;}AL6Clr{}Gtp zab*}b6a^sP|j1fEE_Q>Yr=`JLdwjN`UuZyQxK{KC5k@}j|-ZC zs4(J@y~kh!JYiyu%L5QpW|_r=B#z5G*M2h`9E8qzvt&~ 
z5iQQPh&V7$BoOH3bM?LN;j(oOTshVOzGySrZH{xmgT#9)aig-lo>F2L#viVfiVD{4 zX18!l(x*rR!7W_?>;2ndLqn;T_H74~mLMBw1^CHCw*_F6=o z8NB#wpY(!%LBti7`JmJcisG)1y?JwLnAXOtkcOhmf-1XPf7U{2uTeK?GnU*`f4!MF zi*Y`U?i6?-VS@>IgT?03S*-Vx(Z_gqFCbpvC3rs{eLT?22q?%XAriE7yK&2Njd@t! z$O12_v%1VN^|09-7B&c$bXk+YTwy}rL=4iGSd?9430$-h$5f4ZQ$*5TRMwnW%H=8b zBd}G5)Ea}<9G8I$3?c|E!%aoufPrH{C->VBC&yQ5m^UT~gkwj66*fVwU=xj$q)bh> zfTf8E6UZUQLcvW*hpF)D-8+gi0VASe-Ix%M1z4uWvcfSLdi8{58!%caIgFc?mI;z^ zQo)!>qknC89(?`^sZpbRUjwhNH07`Gny;ag z`ubjI^CJUm>EKFFeDxS9hrHN)xU7$82a8Bg_pkKOi4r0^y|=dqbVC8VwrzfIZ(cZJ zQqqIHy*quK*-2U%hA>6C=c?}wOvLkv|B#WUlh})WZ@RU7Bz!f?vxZRXqpb_Ma--!~ zaa-y)c;heyw7SK0P`Zf1hO$W!K;(4lAcZ6}A>v{~Zj+kKrwqx)GjVT@jYhqkIL>lt zKfz9+-y77lfMa?8hjzWk?fS`w_b==j#vX_~s}3)sq$pgua}D_#Z41d0sjV|*D%%f^#L>Wt*y_% zUG2cBb1x{lBp|!8l9rRR6>0_zw^9=C)3xeOW7(>W{tM~a<4IEA)gr%0bCF`M7F}im z)U~g27`n=5K{y|i0ot2TF63nTttD{K)F8409)XV%uF zi)OVnH96tCyQ=iMI)QP0&RwV?$I*drbsIgVUoBU-ip4pa4TT84U)jG-9xH+{_91+= zZ#wf*DkVbFqliRzdy>pKmWvmP{_y&ItoTnP%EBt<-rfX`oJr#)a^wTR;UK{wEi5cT zO5$2S#}!_FvuoqD(k1z)pQkf%%!1>|F5Qie-H?G6oaN0}^$Ry$&1b&*JE&YkuTUf$ z|1&!K9>J0GZg1}F7-(o{P$p7-(XUbOR|a~TZO3(&A!3-lpnft0>jAS<+uH;7u>a3@ z0nOyWy3Ft>K5DBho>z2L_Bsj?!NKec~%NUgy+DqK=Pd3m?ZSIJHO?d%7s zIhW}~FGK4b1`__xuFJIus&0;4*z*(5>bkHJUDX>4lG`g#@%SSUY^kfOYd3!v0P@E~ zL_&&+iok4Lq~Tw`f^rl3wt=5n;7+Yv^O!%WuY!^uM{KB$9~;eU^G-dDvsIX$ZSq+n zr1x3%;s+`*ZRw}A_p2OU)&(S2_aY?9XM-|00CUZoXNcrg(v=g+aNEq4`w34rwJ%li za+a>`l-v?O4fi-D9-o?0yu`SjCafIR;ASp8}B3~Dk;Z#nG+KP96jK&8lNby?50@h3WdvuF%Ma^ ze)Z^cxu|xA$|SFFwn>Mw-pX&88-fD3sabIW$Ef-7h>DRlr$O|#jhz_wa$9lyf2sz$l zlBw4j(Vj47L-7m1{KcRQvremJ`z@s;-gb4eR>DX7rGdW1Nn=d$A1_rHjD`%p* zJmu?kcgz|@mW-h4@bt7%zYj$msJ1!}dRJr^kOs~nXH91f!Scdg)rs#l`o5i3%Gy*c zyKF7+xr{Gt7aC*kcJ$hc6h@a-;Rj8Ey=+%nK0f_!(Lko~THJEx>tcC>29FB5|FpRt zu)f$J*M>boJv6aYhcC5MZ#I3ij+cKF{IJZo_51C8t+lnPkl1`+yXdJUlDT__ld19L z_gl49YIN4-NBcUYOd5Q=KEDbMeWYohy*HoCrYo!T-R;Z#*GVz)h_LP>64h=1> z?4R>0z?3d|EHT0A5q4T6h8dX2z?0un*qq&rLdgswoPwhoRDKt12lV>-a%L<-vwY2O 
zs4b7P%#M6aa|SIf>=VzJfx0#qR#ucsTpnHwkF>)cc*YquiU^=A1+WqyN-$!RWQUH9 zQk9mKMNA>LsnJsVwF$7W6=Il`7MIm7pXC&;a75$B!MHynTD> z(tsD4-7ukZ|lYgf2vXV``N^kCQFD&sxxs0-`_$KqiipMZ%saU%gwf`Td%m6PWv#53A={&HtSV? z6l!O;%TW(En&zdv`x@3?e8=D{l%j&&tsQD78_K>Uh_c#~b4*84{iTwym!`_nSsX+2 zPH{Mol`m{uc);ps!DTk_{rxSv=xJ+tjSLZC$Zj#R_qjxdbev62{c*_c2DLjhI|k5a z9eQ=j)zeclVLUl&K=(7$+)7`m1LO2aetAP8JB6u!<{811v+T5 zH6Q=>s2~&P9c#t~h~l{(*gsC%W^GjYSX4WLr=@E?yaaQOm0}>GmfMXD_*KG%>PSsb z|9z8hxsD8;7?l@kEh6|@u1>{~#tJS`e6|hQkVld-1%>Dgx%Ww5yMqnpPFf?JC zJ{6UWtVbAPuKsexA?9VXRFp77>(8&XIA3+vBIYY2dSX?(`_LfN+mer; zru8t5=1OX78e$lT^QlWIflWSVQ71FkwaL zBEh_oVM#;XW*M18676m?j})a8MMt2WLR!J_-sN+xoGoGdcOMw3fmw#5xS0y>i!!z2 znBjO2d>lGG=mIv?z+~@Dlw}5t7WU{(`>&l~iAic@r}YzHas~1tS6L+eMMb&aK66alpV+-u+`gQF#A`#ZyjD{ z=YmsE!KLA!B?n0AC3;+t-qTM{Hb0UWQfi)GIQ(Bayx{l45~Q|``&8MBrSuuBPhl2T zYMVLtcqGQB9nyqZJA>|}AB|6%g{sK?X>~L0X(22vvI*o6B#dzpoxl3B`0P|@<@Wj1PZNnvB48UOUWf2KVf>?y zd_sYkN%+E_n0+8)q@zStMRXcWA2wPVa>0ace9zU1`na>v%*@fzf;R?P7TUn9tj$|< zDd|bOQB%BoP-hj>&MYP1tDvQQPP-}8frAn)6Twu1J&{7+Bw$Llc?mfQtYj9-~%v?P>dWb2BWioitTmq%ag>t+4` z$xF6h^G-h_$qOJ7h~JJ~nD<%SupoyVd*VdA*vA)IcgBhkbfYHsU9iO?bzAI`e9=5h zdBc}nu6pC2ZJJv`DNf1RRM60qJKlDAGK_fv%eA3t*~oKhMI?3uCvwkO;&dR&3{j&? 
zm)>?ULFtmJ8;@;f$J-vWvx_^bK%RH-Y`X4_TQ~}lW%u{b=j2>@B#rVoM zr!$L+o4s|d!eXX$CPD-V6AcfVYFF%bnza2^#}o@~*Zgf^F%@?&(LKF%Ti<=DWCYeE z=}`8DQ#0|6uE&>uMxX27^z`)5h=^6?Ug=vQFSj$N7a_fTjr{MqB!ifk3jcHC%R5_RGPqP8x@6|m40PBSNdM~|JH9F?ShWhn!$6d5tCOp(e-iEn}hHS|KHzrI{DMVj=rtfC?#IeGCf#Y;))2MP}E9}xiuqp z_S!e+E4#hcdW?G_-{|dFI`Pot`Qo@L>rvncS8u>TTQc zI-SQ#KF3*ydp8Bl5a;N4n<3M}XlRK`NFIw2;3D`Nl7kqtXo_#H~Z#t}3$UB7{=9}jcr!XWb-gxZd>tE`}O`XtBP1WKJ}?SiW98 zj$z7S4|S6nK;QrE8s}Hondlf*#b@%!rN?gjro9F?N^7Or^+w0-adf|m*b=xuEeuB9 z>~$bLl_NBTXYVX0$l{SwN@#MBxYlLI!p?qoGXVtwkxx8mW-jjIdivVpAQrVrU0iOC zqIsYpYo6cYxjjFnuihr4pb;;+5(fjagn%FCd47~ymqR9G+qc4yVgyxE5w2#q_P*<5 zSm3T+Sz?(Zw8TfTmfTdiNsd{%a|p!>jF{3eOU3>BLFQJVCj-jsDj!VU%PffU}B+u1TJWrDyMOH7)^s$XWQoof$NgRzH@03=!WMom*EZHua*e& z>J)|}wk2H-;OTK;vw3{fVk@Zn$!=a0%1B15*?HLw%GV{twfJ`TvY!Qo0;6~~vf+cr z7h${e62GgjtutoX<_2_D+&rn53l5HL3-}?@;akBG-8O#V!@mkIJQmzET*B}`{9ukF z66@oQ2ooRCa6!2kR35SwPe!rtEWtmbf;ihQG}KFOm~` z$ud8$tuf#X&bSFWX0@h&ZMt}YiK3toRUVV?rmI{|yw19XCKNzgY70obA`gr}9X99q z^*-n@BKQ`R{9%L{`^R_UdTX2-v5F#xh#1Xc=rjqbM~2FbW;}wFkFp@^>;F#hJv~AKBgU(iU*8de&l3w8F=Y+O$gjzMOk>$+EfQ zvEnU0@4SY@3Q~X#$`!>uz|>t=z1X)~MKuXMbhsP*?y6n*FcHuTZdom(`i^1DY?;&; z#~%3W3!j6L%@V@+^ZLUOXI}Y=DENN@zyQ)|nyQ)_>g_cwRJKJRC~Yzo1%<@_3VNUL z+@Rg@?Oan?kxo*0o}t`m_rABQatp?1-6O}+P)6}d!dIW)-){{GX#;giRHqb@O&-2; zW<$c}l|_G5iE zsRmcGv=V&w72evf-PQNL(%<5rbtb3+pVV1{c*+!2$uP&D3eVFzE|_NY%%~rHs4+s9 zPzF1tJih0tEaIWkFYXNKA6yz*yr)HGvF1Q9AtLsCfe$hM5af_93%EW$W1QxZB#(2{ zA80q>;|Yob+D$;ion!xNA;JGGY(i;GNZO{3Mz=_g{vt)U?J>H*!mMz_JX4Y&nKzAt zyW%$5iQZaTSgJYNH)Mq70BTc6|B0tEE;qvc6|e-U0>CL_$IeP|{NC7e3)2Tp%$f)xcVhcqz~+ z7;w9E#52KvlHfjc()!#QE@8@SIPn47^3MA`(WL@q2IrhOBV}PAC>R6Jg$rRdCX7)+r+c6?uqt8?-JA9R##OSVE^J?ZoAt^FN zDjw|)VK36Q%3RDzoR$)8pNrzJ@QG5DSc2A){zU!rN|N1FNhLg!lCm39-neP{$x#{- zBK&}r*IxTV&aTh#w4+nXYCWIqnds=sv{frA76qk!$Z^$%6~jjB#LS@1SE?aR7g; zR{DKPHJf7!`Y%flEGku|{Cqqr%gTt>Tmt{6D)jfeGmd@wQxxU~lSk+==#m~5e{2R& z9Y3k?E+uedwJd_+<05<+|HQAV5TWwGOws05b)Y2=K+I0ybthq#v-vd}>}EgMcU}GF 
zx;_NVn19+~@Ej+Di+4*xTRos!aXF~z^MN0_PcLrnV^d-wx&&mF@?>*ABB+dL#|1`W zp7Zq=vu@)6f*ac07$)j%`N%&fDi>6*?4xD`k@VnBmdd2{_7Hi&X~peiZuKC3s@j_i z9b(dXx91B)L~O)V`$L&NCp6t1c)QyOLxBoLRVDj1nTp&fgKIt;E9a%L6GESM#q4|E23 zC5MCP&*ghpfkxk9QR!aY7z=$bPs}-;ax|RpQeWs+e|w*-{B~#W^=Xl#`|!LY1rRY{ z-Sep#e?B}DXp_VaF=Wb=G^+?m`e`#v(e8@ycL(< zloN|;3tyT6>~lR%ap3V`D%*~kgJ4_({hy2c4Kqt4otTtNG;vSDXN2`vg7grsPey)O zM0J*&w%aZVhqueB-@cf0f11qPm7_XSS1%@}$e9}+rGFbw>ud31qr-g|n|w@ukyh>D zF1eNc=sZPU)}l7MttIawGN@ZnZ|c0`x^Zy_ZPcgvU(uSHx;zk$2}D=J!@__)LPDP3 zpRUNiCnh%5?fygs7|(IO+SK%tK`!Ydo#SYJf%vB&5N1P4D$Iwu)<88<4u{=dEx-~VWZnw&f)IhkzD1p*WZt6>d~jqQF9L6%(y|8rFN9*8TW+hSIavoE4n<%Yg*Dw6l||9aDDe&ZraoZ;gr60pt#GAMz?H;7QC3 z4fDQd7Z*cA0@E5885x1Gk6?c<8=Jqo3;%xo`&dd^nr^FOXpcTHeHj=<0F;;nvKt$d zBnt%3HMO<1b#DGySa&%(|sx`Pt*SKX^QwI@+bGWc1uGg_f2mJnP6G0 zBzdu13TNGYPi73%5*Zt`kf>YkEZkQ&NaR4YqIDQ9e5vwhFrx@enzV1={4(zKdS=`J zCG1W!Ssyc0dDnTiD+)Z69EKnnm&+_ zM`tXe?QH~Vm#VPv8vzXtc3(()vJ*O+-JVzcH9gT4;DW}x8KQ*X64T1UEA=?3w7Fy~ zi$it$mrIssp$Xxwn6IyX zGspgq6)vQxAo6u@Cbi~oFg{&Oi-wS^hza@IswB8sL}aqB*QaKyyk5*OsG{JBB~)6N z-{&@?w*vRFc{%rXisEo=sMb>07#uQ#CcuPxVPT0KkK7nsZTPyX+ZZ7SJ*#3t)BWVI zrn7`@r?dSuHffpW`_*(4S#(-v`JPff>Iq0q+@Dr{eqtVh$?Sp*4)V6FsGOBkE0_5S zuD}5_bd&1A^MG2a74ll&?=`Y0+>3cJiFVu#3v`7)q5#gZ{0@sV$7+vp6v%1>k7ZIQ zJb%iG9q*1|R#BXsGOplY{d>=M{I7x0vL(1|HpKqG;leb8y@Lb(Zpd+jQS#u=a1X_a zGFgvT!3Dc0iHCXOay~`CO7Hb!)O)Jp%$y$X;6vn-V zl^}1IQdQ-S%~@X>r!LX2AE7q(fo!yVqrMa;q6-Iem@kBSf-m9Bp(LFS7dl}j)?gJS zA8X!zEm8b?%8o&&Qs-TG!I!7^6|)47@r*!hoVG!rAX;I%Y$y=@hngoH;hE!b0byyY zTB557h$fCMsWu{q7bmIAhIrkxx0R$0Q)+H8Q8`B8a60@&SrIVelyqKQ)zql){TpxSV!*w^V*~x{+z*&9 z(`j*x0{0*>O0%`hEYnl+$(X3|^J-SVNK9W6XKRiu%RjTpD;+x{HMQ@A5rYPpK?WTtc5@`_&=lWsa;#{UkZ=`rBP zADF%w<>x0TArbnA!_IDR{{p#W#pkf`OFOQ7R#v7pS;ze<07%RHTAfJSr)W9C+dBRwm5Kb~) zH*5)Rj@XcxklD#A{-j=*k@6ET305g_bZ0*H+Ga*RYeP~cfXf$v4&kP&!vd@fW|Mx= z5D}YtL70s4-hzO9+(`%vD3`A*OSSte4vuTmanXHk0+&}xv(BAgn0^MtT)spuw)0ZP z#cb1_s#e zr&FLwOF83YE7q_TU))R=n6EjDVy5D0^V{6|l0#?8UZmAz9LXJHeAqjD>M^w)Y^1vK 
z*8_yG=AV(WH)vmXNkg+HVX<;%=GWBZWKt{?0N0sC?SH#%4RDaF_U4CH0aMcTl(anU z7g-J0hRv`s$aQ$L!@wq+s@`a&`v%;9^c>)zU>=xXlb*0YAXc<^`I#FP2j_%MYyEHw zo-s$~`~+~`4}EB8r@)Xy=O;6PuDk;EQ1X?{r|_=BTI`o7Kf=XFoq{wa7wdsJ=+rse zQuD=~EZ;cZcGGMZFoIjcrk6&%>mJdD&tp%y@a9t+Xn>BkrfRH$$E~}oaQ|OhMWfo1G(BBcA%o!J)W&ZNmwh@C)GF0FK$lO zn^_2VqQAb>pNZM^^E({h4#_DpKc8-XEMu;?;I(eLEU0&KvHo}S!=>c_`4`WcVZ_Yp z;;gy6AL-`S8@W8fb2Gv$mlyM-XNKCz*|Y4NweyLA=X_j7L$-I04$i-YayOT9MD2ku zO~`2aiNvB?t1J-R)AWAcw<^oz2JjKvA0)Llz%05>xbA+Z8qLf%1qdM9>3Fr=HP1li z-lyZ<^gZTZwYmBR2>P7*`V;tDMI?}vpefRH$x3j#zs;ZE|M4Bp+TwPU-=S4ks01M{ zjM`!xn=(PK8e~Vpe=Ux{jxN!lkz#ttu&ikp-5F2dZMj}w@Tc!)s75JKJF$)24^~f0 zlL{*v8Ijs<=^m5+ka9}#u~iaMvYh12BHYji-gD`zOe^idm~M1}F1<~_o@%^FjdT{s zHc3qIN~u}{g`;1sB(v!GM@IH^0@Lm|KsdXsP!rguLILg6|7%nIftoUK$5(bXa7C#E zyS$<=koIM-%I~kYnwgG{F{cQ&>hWD|hG4X&NOqGF)AbjI>fDTmK|VJi zEL1{Sr9M*46O&rS+G%8HdWl@gFR0eNH6lApSxU9Ay)Y~SNq(Y8+@5JHoa=hUdT+fA zEyL7$r7z1uq@#zD*@lOgo#{}eruFSxFM{Qep?EA$0YBJ6WiH&J<^DW+#f+k7WOsHM z>%7nS?@28Q7o6eMLE9GM8B}~OrWPkh=>2}f6^16`7b_6z?Oe=(ut9B@AKjBK)(M) zOiYJ=b+=xKyNMzm43-9o5?IcU+=^fdmIm6`JHQyQ7aPjp>)u?RHmSA{p$CdvUyl-7 z!R&tO>9CYgBFH&9ez#O#%Se( zsP5ZhwI5>a&8^B|+saU+F z`frBv-V*=cy#k1$PK^B3-)Pm@iG+NPiUZ5QImdoOruI0@k5tSZLYl@c$R0VywnfUWzrx1@?1LuoS0Qa-dsF;|mAF@lFtThwxq zMyMgD4Z`!<-aMBLXP#mzZ?Mp}(sViPuWJL4$Oryvc}xMK{7v{EtIOSU&gcBvxvpIG zK{92KzC4HpeMH(M|`1)HvYf)}dR@EPos*R636PNevH<@FM zn92cEfBO6qON;tx$>nRD9H8ogk&u#0VU7l$Ka1)L{Fm0aAzJ6ga6_ zc)*Rd%V*VaCj>Y(LIkF z*M9_m@vmKVN)VhdJ4KmUjv!g!WmsHXxiZDu9J~= z*cQmb>ZMR~U48KIikD@@r0mRVqT{ld$Uzuz-Q*>(ts>RudeY_GHC}FXGifdw%a0lY z;h&xV*x_X(%IUj8agobvCf%9ns6F@mrEQ^>8&-TjtK%%(m3kl9;oMZ2F(1Lxx7zZ3 zDASC4W+B3MdskKHD9p0XiF}TWX|<(|sSm-49M399>KFPcS0)+I7oLLpYcG-9k*4M4 z+2s64ec&OLAOn3@DyN}=g_ZRID=MzU8ehd*k5KkuHw4Xql(;peQo~0_bWG#Q_XT3S)VK7j>_24=?3~~ z6*Ot}x&B>V4{2wxI`-B#X(w#uDfuMy7xo5S`cJCLGgIYNh3)E0(J$P;3A&uI!Un-# zN)6ZBIpl2lCW*)#o_~Z;rXrl=!D>}kw#s{NsfT_Z`u~{v%cwZIwhI@<3GNmk5D4z> z?(XjH65QQ2xVw9BcXxMpcW4}T=XuX}_GS$DRihbQ)z!7uJ?FfpTp?NxU1BBP>kis| 
za><6ot~yeDnFr2qJ|SJ9tR$=I&(Rvy~PfvN({?opok1K)~`3)**$S7Bz~q6ZD8FZeluH>8(zorQ2N0C#C5dektUFP;4&j6iM822HrtS{}O+68o;`>f>%)W zY}eb%mo%|P5n!oV*^)XrC?g{A8DI!%wGFZHZI`rR@e2W%-Vd*0s1sUqxgiy#xx4b- zLzO6f*epNy1Vqg*MP=R1R?NC!z^GJt=WQjlDlxNgRhkO+0BpHL1eX<$znO(NsYcNS zzrMa|HQP?+@a0hd+;jAISMsf?_$ep)f9_2fU%)Zq%UZQ||Wl)ae@>?STSBzEijY-uOd(G@PwTfLKh#uq7}UqhFK9+`Mq z_%tRr$U9N1P4mNRy_;+&hJq7Q426y0>PWvYc&=rN*mqgG8K!-S)V+ z>EZVJ!z{tfjv4Us&E$qgY>s8std=o8e=tn3V@LlB%)!bVu%C|gRS?!xldT^)Ue~phhkw_(X#iR%6q(4hF-y+ zFH)VBfC%S)-Zj?7_oOBkC4QPl+(|KXy>>9qd7sRW)ugY~UOK#T34 zTD^{0{APh0gWhX0c0zWA+-&x#P&IJvo28HTmQo;ufPU}sRvxxKUFlA;VtQaT0J-R4PAvsB9ET&q>-D+~;Ah7pc zZK?wz5b*+)EmH$x;H1!3Y1=aIKX@frK3S?gpIT9z=N$kJ{W92pr`E&h5Eubdk(b|y z$sw|2Ci?+fP4lbY)}fWg-K&}*4ZNjHA@-^kln3xQ(U;CihlKwp=(*wQ&8RDZc0>eB zVEw-BHGu91+aKKmKK1iZa)|K&d5H%#Iz0j^aolRcKU1M1wt#g7h z8v{O^obB~DT%6Yt3L$_GHz#{1SI36X!E3@YcZCX#f~TEcjJS;~fK6F=JG6RLubFom zO1!rrBN>x^VlR;w_3@i!G7k5qU*#WH>Ha2{0^Z;lu?tM7HihLT!A^y+raKsx{S2=N z%Z99UR;^IMPOf?*`7@keHH19xZz%=ymYTWK>47Ke>GTntZ9=8^*KyHq@VdC#omoi-|F&1=w(%0=(^X+_`=7w#Xq8 zR!OE$P?{rXVCl%m>6@{qwZ`+kXIIJpbE& z+!l!|S)XZBX{8{v;F;hW?u!y9puTgZafev60$f}Qui^UeIaT5?s9iWX$2vFg1MwC~E35|^+c*{x5ZKdfONa^9YpOOWF*FpthFiP-T3X2x{FcrR&?Sp^Y$=2J z`AMaH86;w9@DNg!MWY%)GaZN55HOOq;GQiP92IvzB}LaNA`vIhbrrMJN5n#9|B^>8 z&B{~+nRMrK-qF+oPM*MWlYM@FcAis4TjiWA3V zHVp!LM)XRWs`iPh4G;ExL9c{+{3jh3M}$F?LkNbWOMJ5>G(qTZ39Z2ZFiTvBI1H#9 zRX)MRMs+wj_=c`MX1p*dpw-m=^YNg605>)#CSPznKIp^#caa$#q6uD(S1cBntcw=w zSW^M*sjZe`wj%Ajz!q3A?UHET+n^{S&XO)Eqtg-&4vA;jNLl=>z86UOfLedK*QZXl0LsqycZutGgcTq`VblsVt%z|>xw}f7; z6GCBRe0|Y)J#m@RmoOgChZU^eYYcDmtVqLsmE{m#FiPO3gpIdIYqkN%L*sFU4ka^1 z^&ZenUwUVS{Wb*o*KZ(*sY!RbmBz6tK^F7rD0Lu5dur&r;i=V zjqZSJ+23!d6jlDZj<1kasXQJJg^wkYzAFc16cWH-$XQXc~5Y=f_7DH`D~1x`oE+TKCk=)!-~ zN=%^e_Apcp3fEB+I=ojNIoFsJeb)BEy-!lJZ0YAFmDo_4+usYi-2)3yvw@>Bgz7KY zC2FYg<^X)Io|v3of|6b#^BRaJB(PYCopyY}hHndGf%xaWbHVxr1a8=1OR|tm-aR_c zk?9DZYf!ATP*TnQ>&-n&!TK@K>`EZ~6fLz#7dIeN=RNK}%qbz3pN!u0nvtnjf*`@> zcOtP`>fop|VVAT)SD21iS8jPypAc41VD*Vcz;?xYYW|Afe0}V&A@zL0yp}8A9ej2Q 
zZM9jfJl@9cX@~OWyxg7O6eldFtLFSpqhA}H3^Zg-rkQ_p!dHN_tKE}- zc=-J>3IXhfzklR~4DCkPI#p8S5p!8ziV7LEnBg>xr?+@3AbcftMSF;Nv3$1HIF0is0`j)J=y3gp% z#nH<^$K=%s$AN{U-(F*XaaIEQGYGEM;L)O;90Do(;q$)(d}5YY5-x1h>KD=9v(Jg^~$4_~pjy6i}q9=rWp9-?r6zmnm0t?l4e-Y~9M zm%w;>u?WrK_N|n>Za}jB`IVH%__#Xb>2LGUP{_b@1Emb<)}}^+C&!N8HNl%L>5&6f z5QI6?wZU_^PFL6P*XY6GU3!K%ld$%NpJvvqJai46?p!3gJ8>_w%F#!3%j|Lvb7Mfn zvbK8r>>B(=^&zv=(Ny?I04H_)T|-_s21VhvXeaydsQ2QlGgF(-yo$-zN+XPsn($aZ zl|OF=s}9+*=`?5bI-3nA&AXEP>)sK-Cdhj^dV+E;A_2$K`5cqFmyh~luW7us5Qn=P z#qW6g;k?UlfkNY+7z(gs|%p`2@gZGvOVwiQecunwGRDjd^=meURhV zjhG!V*k7hbTQV{-w(A<1nhr~LE&ygr*MXT(ZTXpYY@Ps7pc+8Isz0E(bBWNg3>Ko6DDjCc6c=2lnh>=%3L4 zZQ#Q;1Q!hCujaWhB3FMx5R1s!U5;shE(rSK?1R%_aVV1L@L||1spn% z-(-VMQ2RHd%~XKJ@e*!hQu~Vpe91wDG|*jVoe}3UZ}>a9K9wK=&E?Tq%~yyzSbhv6r78nPse!Ioz`;RI_`BPDd=`QVyzVYD}*+AI=N=Chc>(Xd+^0&rgqH+}@57*#JUg zW>;58`1vO?36&Q*K7KKupM1fEI^DpeJ^kXzWUU5Sx(dD9C?1N}0#(Xr4)K$-0F!qs zW`LIl>-u2&rwVqh0dv|S|DPltqwAq&YPdIGRmRo(kJF~TI(XARZ~!NNIZXr{Azc4A zv-gl$H;dY)%M-%)*0A-Mq;E#AWZmNJLP8p^_Ydz-+6r@tlaD5jIa|YM57bTr`|Wm78{W#yNN})^ za;?~5QRTfr#WBY|aqZExQ(}#;xBI>BRPkkmPwtw4bwpSj{aVY1OQz=Q#l*4iWKoL{ zlLFU=@7BTZ{7zvlw}{Ancut0hiQz3djYzBeadHRF*S&w&YIz^9Sg_AZ!~Pibo<;7H z*1~o-IOdGoY^`CW{LW1HGpkg7zjJfF=4cYo*ObaOanqAmFz^x_dn%#Th!65|tQTM)_ldEw*Q1!ap z)~KjX@U&Td`|n{@P=W+~^uLAz(pYvT;P`bnu0O9 zdJ}o{f1-S*#~Gdf6`_R=oeE?Hg#rH7w9 zrPYt2xUYurwnFRzrqhs;EqAdir&D}c2bo*De}Uz5hDmB#l1ww)};^y;4i*Sa_% zt4E7(rzJx1IQ94OHQ*G{uF{Nzl1L(P<3BG#bAY86rX~DJ442I!O|2Bop_e>OGRwKCO(rsJ(aH6`g*`N_2mts}^3fa5=fuSs z2V1sTCKzr|-T)CTludr=G3%oRP$a;Y$V2x#ZqzHzSZBv4(l!r}^nJqYXcJy)c1hBa z2{_<@!=!u9yLh?W$OqV+>J*n(-WZt9Uc3OyGPyIUPsnO+fiXYka9^6NZkHU>?b|M` zPEq(PDw+_uO*RXlaQ*=3$?0b6_e8(M#QUpg(WBz&sHt7|8jzC}&JaX8)vk&NDfl_} zi@F}NeiBttF&sOAWL-^!@pb6&=BAy+G=^-Nmd$VJHGY3Mds^=!5M|3Z*V?jvKyi#O z?E>sFvG{JmN{+=fQ(gRFJ$#*dli4?@l6@(ZXmIv#++na0w<*w=v?74z%#F0C7ym)Y z)V8D_nGx7;oO12^9HXJ#L)c+MzR|UM0(o z)L@^uyIHB!!ohz_yx@}a#d{2}fV~$ImEhsNoldz~t!{)b6b`(xNA|(g9*1r!5Pxfz z2LYh&tV$7Xgi6yp(38hN(*-vkaqUKdoS6xTqu>N9CxgK_Ne~0O$4P^PYRbzkf!ex~ 
zQi6Rhu#XigH)MSL{BArGSuf%qhLklvK=n@oSS`Y(ws(MGogV31O-}+4a!Bu>p0||h z-5eQDYj{u4I7wJ&VRPKp$H(|~xI23N7-e;T_5nZtlYyp_wsRt;_lVCjqYFlLd3KZS z7sbn~rvGc7{`b}_aoNt6W}E6?+{asyWuqw_)a!>q?KU|H+eD+Gn^8fIU^s6!%LYqL zF9i_M-Yw$=UR=;uI2l}N8+2V->LqBl*YC>k%ZoN8ku(!W$Tzg5h^{nVxijj4O z-KJa})3f!y`OXE~Mm-T0lvKrikViy0gHl!zXw~HxXWn2OD#g3J-fZD_an1X$x>2dj zt;iU!HWf=c`+n5+FE2keSESMIzsAE5fWjuxlF~|8?Jh3drE@vmd@H4!ppc(C@C_)8 zWMo-%qK8@0&dFM(7IQP)6SqWI7{V`Qkkq?Cb$ieye9Ql-XdNBCFE%DAB_S!f)YKiDpYsa;gS9+DZ@Fw7_KE{G+JI+30Gtkf-Mr^P;bZ6XCq}I7 z#h&L6;!m25_j@r!9qFoSTG@8{_nr^f)g{%7d;@yuwY5?bZo+QfZw5Q>mzB@^>$^!E z-rI4K8fY@CzVhsawSeBig08OaA6>MZzw=qH(kM)A*}@;W;g3S!uj3~2Ylnolrr|Kc zBETg$%{?F@h68C4D$pRUwFq6&o$hqXknBf>MDv?6g3s~E{ln`8#J`;wU)4||x58Kx zeIkJjs6+f4@1(y(;QiAN=f_zyf7@<12Fqwi%D_U5N_6?SRn;Rv15tN-?s*m&-Wkks z4BrDMeT9vO+}z(@ceA|I)OqoanT#13QmR0tYSs%s;@8(rq)Rfofpg%Rr3`uw)|Gd$ zwiS7(&T(ZvC&d47nqrb)`#(KvtWsZVD1%{QI=+1fZQ9@OTlP2Xi z$9F}VRt|NML2WdIhWljVhum~to1>pv>5f1JXhGhy#PMl8j+yySX}NjnuZ(3w|zr>Yk7)KG#|Oj=@3P z;5lHJ$47V4bG$%1NV2HmTlRewaEi95vSP)OL`m{fw096jz{=4G?S~cl^yb%_p5%VY zu5P22s;q5BMq_KOuC#>F%+Ak06!j@j?ajLrj^x7_PS}`Nd?>37J}>t-tt45~W-^3h zi3dx}PIIq7zaGM4T{`IsXb5~-t9{^EbhLuoHf~nG=@JGT>&~jC?y+g9`GYxU(HApD ztLyLh`p02J3a|JuZ{wn~44kva?YrUdnLpiX(%5JB>#hfh?PnoEi&A>seQhgR-_y&} zOZ^^?)VJmio|&rlG*iurOIu>PF3w91Y}Dg3XDtmxgZQY3=J-89uo=Om)gh#91I*sF z1cJbBo(SPDSdsm|8U@%u_o4lChSy*A49Nq00EEZbC$A9Tb@M&1rZ>xIa|r6m2*&5P zo09&sT0O}$`lKMQwq9RbHOX~jeUua|6XRfkq@0;^in7mk52SUvSn?_dBim{W$%(28 zytm?KGR#FsETdzL;dIF2Rf^>e5q-QpB|u8-Z8gKV{RIQF!07>f@o(kJPc8gEQy@Nk3Sf8%Yo-jgmrUvue=`3}b`5wLo881BqEZiJW?@uI28m}@j&!p1RMULi%He^nqw zq*1orXNBV5bLobjJIqUb;R^Erfr(V=wJ^HdDPNB7SM#p$n5wwUt6NLLRe7(gt2|gQ z7;aQRMj$cE_nO4@>5Nse7|@7p(buas%jT=laF2*X3N_h{Y8*BjDAq=K_g9Ipz$c61 zjFeXYx`y2QWwS}g6tt3LlOo}qb^L-Z!(*pPXle$h<3m)YYdg_-i^Pz+N}lddi>J+N zRZ7D9q2iMWiG93Ra!JELfr`8AjVj|P14&Y@qg5-#<;rZN=H5F?!OUJ zTsZRYkB9z--|SkWUTZCAA4k8y;xLe<>yh)YgB+THbEN5Mx)QNrQf9Bg?=zg`6x30O#nl2pKc zawI1TI^CYqt{rE8c7OG@2awV-*=${es(tySX@LI1fCeztNOt<`6-v4&VZk48f1Zyt 
zO8%&?*`F|QmS`uc{kg%3wlt;K{W1<}s9HE3NO!{mrXr9#f+9UwGuS7!BTbUZgEazr zW6k#M6iF2x)L}B05*{xFj7dyj7v_ovG7Y+iI<*}Y@8w6~DOdqXPc&r2Q{cE#FFfL( zQb+~d9vu>SMyLyPhX}VF=&KlLsvvFicUFI1tsX+`2v9D%z+pV(XM9#W`vuM$FN|kV zX#E%)8N%O8ZViaiu3)mys(LhSF5@M*o80!uZypAXUKvyK1toS}<)<~=SxdVXA8sS&5ZJVI=pr#I=CmL#mtv| zFe(lU4h$7jm8tXc=CKu@oHQ4QDDzxTn zcrThtGlWc88-f_CFdELiNDN4mz zu#S_KQ@A79-6-=bRU2yhH!KiIsTo+sx;Va^B~&!FR*l+!9D)IpERzr^qe3#LL^KU& z4+UbeB&Ko152<6x_m~q5e~JP^?oJ{F3~Dvr@5W+u8eIZl@yFIHThKm_e=K$uL8FUJ z^#~UIRz-mtHZFCW%VR&<5Q!vRKjq{58eSxG#@Lt}I$AP? zGOHffrc%nkt~=XZ%t*1@tTgZQvUGD;i#%@CD6i#TjO58cDz=xR#2s9@yP=OL=>Cmw z6~ZC4oNO$d!*0*e9u7l?1S?bZg`E;El$g4+aSNWbQBKyH>T6VxnmnA}I;1)9Fh-+7 zx7P$6ytu?VbNF~nW6yGgob$mHQ?r!;@jXIOkEx;2_$2qb@)6u*{15Y`UuE$r+kLk} z99aJ32JSLlu&Cj9X@dd%nCqyU21aAZ#J5<2cbJT*Etdz&`ZTzq@km$L}14sd@&^>5t& zsD1l>=${66(-fkH-qH@4IR^=68px}%V~gGL4A0yEreH)r3Hg8m3Tf7Rq~vD?@_^sd z!NC6stL_35^@`C#0$Ssps|-qx6ZkUjx65=hk2%wUm{Q!jvteK#3RkWdDCwR{3CoP> zXJ^t|#NG6i*<#~h3vIKXfJ3rQbg5zIIwkb>9`phhRN>(~@~ z{1n+JvfJN6%VQnt%iLUHJFADxEOB|7Ed@n6m!5HsYM0556DP=?@jyvVeUCzA#x5SN zRNA;Y{1MALWy^-Z1$KK2RO{N^Zu1KZjTID7l=G-DrI7#i55=p9iXJUisL9v(2+TcP zZMM%B%McR-gE}Sr7Ow{|z8)*n(SMmqWWwaYNM%xJh>8Ccr&IYt&JWAtRY|#6_$1}( zDnSfjJ!=M3%W2(kltC?ifm#4<2cPFBr>7#KqL^4%xY*eCtsB4=LIHj?H7v?GAm7{D z+xz+T74lyM;r7wL@box8kh!YH>-P+wW&F4{4XZEp)$%uBUg+O1@DLFLLqms`I}@Nj zwz49gC(nt%UnCiIb&<^cp=f#ngeQC(@T8FXrWt)dZwKvcO-4Tc)YbU>KjYVfW$ts3 z@8H0I8@H~MRHS`wULGhEuqov1EC`eGoUwEn$gFV!U$A;@O=bVR*BAWd5Bag45EoZ# zxk$+v@LUt%0ZeH?>_z_nocCb=r8vo;{!33h=kp_W{-2WgZ}mL!e>qR#={)*@LyV*q z>!y6X$hE8Af+%LAy$~cK?xdaGV9~S{J;@k$Z*HW5MdKqzTquNe!~!H}pt+0~_67M- z>=wOP%{?|?2dT!e!orfODD}X+WB9BDZp^n}BAV&sLA8quzd~v)&>sySSUl=nA|#di zcXNA}ynt|dlETBk#N;E+rT9YHA*JDo>RJfA6wJR@nE%r1ym7ATkfSo*xqT2uC|+td zbII@E3{S=7XF9cyhX|UC8o0QvhB3`d&#xPZ@09q_kOfzju7aY3B<9C+XVH2=#j^_7O`&liCMlsH*Z|bM z(AygDT;N>2l8KNB(q28RF`j{g5MWT5Gm92aUS8hM|KY*ue}j&uH?D^{;la>$ zw-t$|4qJ1DM6_uyH;x_3)`#xIn9EEGaGb|Qbjk7alG$TB9y@DLe(lvIAHU-Ky2q2- z#XLt}Zn0`nNwqn>=CJ}sw$54{Q!xJiwYtx?=_9mO_3z`>2Tm!U`5Gqt3L#jp%d-cJ 
z)Nr0xTQ(oC1cgcZC=x~fOLJtVl@UEec16JwAhB^t!j@zxG z`dPZajdDLR_cur%=F4fQis_Zf7iWD0*sRVo!hQ+k?m2*)QKy%vCMlCr(5bi(Ki_Oh zGY0oq?$TZ}z~|&NkrZ{1QeS-Tm&9BIZla@`=Gg&$<|B5C=LTM>$g(kI#o_=@zDf?6~S zpnl>h`~Zbg4v0j$F(&<+2o-*_wg6%T4RUK}PXmR_Ruji7?hbPfZ7l{OLMt8xdb)Cs zA)ofG(o*0~=LXga8+BDcpB-|e8@i5ye-w7b9^?auJtd%oB2XZ3hX6!T;2F&iLjW9` z4GhIG7XZ|YBv0AfT2mP!$F2MW-154&fzkWJVow<9M}NnX^;+QWMN_NXfgrQZE$6A) zq_kW7#_ZJE)0wt8ufHDM|9R!#4ShBgjXO#RRO_AmH#Rng+2=6l&7SOFxOWJA$Y>cF z2YbEiMhb8>7S*$N9lNa}*3sN}Id?T=yHWbJ7=+M9h`t zb*8GoQ5*V(Yw4(JJFtmvPFDvie6m)FDwN&%0PZw;u%i_*+02P96W1e`p10F}TVp77 zSl6b=5(Q?hY6BNsIGFc@jiuN0>sX_5>FM+eJo;N-4NRZhPIsc#@7@gaw{t|0?qEf4 zqoq_6W20=$s=mlMK`bxOBI90_9Q(JlUygrQueaqUesYDe6N^8OP) z?6PT`;k>J`4p-J&s>c&Oe}%;-<*u$SSlMk?cj9YX7TOGi<{$6gQ=_fNv2f5uw93bY z&zKX{B-rXihPC;A?OJiHqm;9Z%GItbR(-7mlOC@7@O^i_7VrpPO?+w7XjbHXDcT4D z$EHANxy@Zceoz2)0O95Wp9=+50!i26U(lUl#8Ln24-d(F81NR(Z}M#>39`)BKO;C- z5?Z~E--xBu2SPVBz@IV~@Vx~ST5-{z2A8=R>cfJ70Z~ZxB1qR++na(gX3%9+Qr-C*a1`!ezm3&TD z^837YzWe>Ne;RX5o*RTy$qVrMwR_a<&C}TgRn5?QA6XwV-qbNM(m!P{qbgALip)mL z{gcl!>#f|1j#B@P<%W?-_xQZxamDD_N>npqN&XeDo8l~zk!}!U>TEg1x34T}6b;HO z6t6{x7F&Drnz>)#Bkl3N#n^vh8r7 zmnM*WM&zBq3_m05+V`{ab48ZTHJxvk3&^I`e?WTJ$yxfi9}3U^qzX;JZ;6FuNe1 zA5Mn82q%_NlQr0zIS8O=K>@=HVKb?mN30UzK?(Q-$#s_}Figdqn)I-ChZsQMo}z*W zQ;Fp7j(I0CvJ#Pgr>|DG1nC(=IE*}?^Fjjh3wb2j0F4nlR4r!=`Ku=~<}51ndh2>{ ziKH6&3riA!85Mc1h93dR2-V-EQ%wNiF?IdmjbgGZgVLKf*+O_!4_i;MrfGPaH9`;l zTkjk_jLVcllYV@5=0o`Ud+pFIe^91-)l$#b{ZY|6Ql%T875Cg&b_vj0Q*oZmN4w4W zf!oFPv`xp;I|0AS5ZQ@2YgaW;CH0xYZ1zf_j9tiun75cj$Z6jd}h&(k!B_b7xC2hKnwCowqgqE0vnm_}}nf8o=w8Psaw) zO_cBTLkHGv!8L-aBR&g%*uGpB0~5IfER?j|iU*dI2A|?b)(xJk2S{+yVnlJl!>&83 zQXM|qs9>vzqT~;`7ot+ontHz|QXEe#Ree#0u-$-|?DU}>E*AS;(5tX@j6(m}Z}N@c z=zo60Zz;($;ynB`!hWUdMQ41*ASV7vw#3@k+Dg)eBR+dm=qFLoA(Wdy1d~jS2yA- z(BzLsEdp)X-r;s<$nRN}#iH)%TRy+=Y8ZJAPZP}M;$h?A%jMdLDtmJfq*h;mB=d`qQi=AFe}&`G;C_EWv~C=B)~z zUwIUlT*N9~9dR=`G0VIAWegwS<>gYFAp+n2*%GsXQj-Q-!}s*2QF~*%m74AT;)635 zhEx53M!Xyc(KPAZI-4mi%^t%|&BPwd2 
z5Dy<5O&Q@c^j{|m^)pDKSHIDr!VjGh|E{)N_U|1p(f#J^@@^3E?OwsPA%6ZZJ0KjX zKMJa$!VE8Cfbs7yo>JutNMS|Ax~8V4q9P#UQF?LQu(tfSUDj2=zliuV)=qUH6)Sug z@9$qleqUYd3mZg&6I6)3f|HYiybafW`IXiAai0f2aUgRW%Tx+&)&!uEU|nTIzm=jU zs5j)FMF2FL@&L}VNB1SeQ=5=VP*(rQnU$dn`=P8*=BVhSIrS4-|HZ$~j(> zs@-1X8Z?v$IJ;}Y`dV75+XK@DFOG6oNh0&vgnvTfNVPcWwnb%Nv$ir~vSy_Y5(uPO z+8`~Y*sJ{9{;Vy%SV)&bqU1Bb_ZDGZ#1h8Kyp0YW-m_D^giqNgPf9V=E5<=hfI4Yj zQnp{UCP`*jl9f$=s4GNVZJ9eby>xRU;q#2sAg=CHKsf`mv8kq>kdWiMzaO&#AA%?2 zVLGpHoW4%&?CJ0K&VJI98&Fr=7^C&2eN~r-_q+e<@z>Em=O-)=LO}55L(iqhF3ka5 z-oE&;2ue)ZleL^%v8Rj^%B?aN~=-XsRifUC^ZbYca;X+cO zVYahHu_g?HiPB5my@HJn=OIp-lmB@9-3{m-5`j^trw#T5-r@r{16EGi6N09tJAz^z zkJEg<`0V+3xx}i5f(p?v5Y*Cx=f*}y zRa8}3+1TKqfp&G6>8-lD?00*@fAcDju+08=iOjm(z~TM++LIHLa|ZXoIGo!jC2uRM z=TE9CLRy`dcLUQD(x`PctwzB>?oz`X^CYLw>NifC&x!AiT*WBL)t6N^MY+@Z;Z>Ozsp`kG%7|lz99b7MQI?a@> zY8%a@#iDXOivdf@mUG<$&*`Njk=!1`qhD6gjC&@0Q_XT(82RkW8*SMZd#A(bKCv(r zW=K{7c!^2M4QG~D9Run0{r1x}Vwwh80SY+`6)BT*2l|^$(fv2QXA;A=$|8xu&;DIq zJQIy!bYo{s*E(<~dnvMq2@NZ6nxK4>`30&9%bA#eckibz|5#dbxy{8=m7rRU4VZ-B zU*C~97|U7gb~h7L09bp`d0W(!2SOmfaKkMG3YheXg)LkTl9 zoW4C$w-Y2^7u!Xg9fv>2svCqN>P?MAa7d-FiD2k)I4>4rwNh!C|E`QGEeCT@YH(I> z{WQDgT?v!)oqDQ=z1kYX%JMNeVt1&Dy=uydoz;r#BtB8-)G4c!LEoy($mjh?qVu9?wON6j9b5i=}Oq${s|-EVZF6Uvkjz z?e`T-l1`>W;Lr_288BrT7Khs{%L6(v zpF$^Q7O^hRxQQe6=pb9%4VG(d&Q^tCON)!O2oDpwIyyRqm2=s+keODnApFGoIaX`y z$H#-Tl?buql&o%)Oj|`J^tJ%DeNXJgj=d`+-IN~cAD7JMW_o7cqL%{PQt)54t_?dU zW4iMn9z5*re7lI-0s36j)`P|mT2A*TeL1r8F7Pf!zB>Bfna$emnUNpav|7Bo%_Oyc zSt49N;csVm$i`we=KM9hCD$@VBNV=x5f8`D&>tj=lc5^0zdf@NH*$g7B}% z*YE8KyqY4}&ap7v)<y#aRNZ-B^*fv&tM63CtG(=| zOeM=a-S|IpgbN6W-2R0`2?mR-uGJF*GtZ0;i+w(iCO0@ z!Fo~4%DoyZDJCzE^!o!h-alvAsJSs8Sr?sZug4Tr1Z^F=wv?xgpeoi~&l~k`<6RyG zriNjZ)Gh|0`h5hnx(r#@d(qz;o!@%TpH3rf=Y3y~n82;et0dV`8=kZ|i^_M8cnVZE z&DY3`#T)4@&hjr;_~*OKa=)}4c_!~J1{$>hyu`!Z=`qh9AHR;)p=3M;<{nE5k-~f@ zPF&9#okmfa7D{Manw{V1CQiG+E`~yA-tchMO{&Xh4X>U&GuIRwF+_%+Di62O!+5Fs zFlX;On0)jauOOISKpWqS=7;xbMjY&DKR&U%7;k*XB*$oj(#Y3sF&_)ySU0vMY%24; 
zjugEY%kue38_6gNHO*s7x^U%i)Z|y8OOsM+**0$%_HLC+a2F!>m9I?}o~afR<*nsb z6(dy@V_p`PSd>?}Elt!S#`1mO@(nf6Qn-9%yldnxHKa)X#L9j2mA!CN8X;k9M7)|n z-TJRc$h7uhbJ6A@3dV?>&* zez&U4DFV5{JOQ#jrKP0>1O!46@&72k?hV5E_fAYy=H~~DAEswyl$4aP0koCmWk5m9 zA*>e@Yr#Izm>bBbGlzLx2x-rS^wy!g9v~D15Yc^MPupcER9+cMi$OrV4p!FIK3|S= zCW7IC!+u*^TO$>v)EPJskekoh^2MGXKw_nd%<3m1$iJW5(T-M_&fIh`5ReYG69g^E z7^BrjTla3hfM(-vx38wF|Bl+}EUrVPa~c^PWng6$wWM6nt*MEHM5W-4g@ZiPpm zhXvv%3WDYV3gR~2qF618ZUw9|mc~$VPUIZZ2EK$>?*ILpnQ#0okpI2tb82mUJrELQ zx!LwOUm~Z~V4hb|K}FB)`C6{O62S%X-7u_wXEJ1rNrWxS-;-uM;x)K5~h-* z(#AOgb~o|%rROLh?QEFh7R!!d7_zAjo6AL?Dx%!YVSfek$XYOP%P4ep4mx@s$YueM zc}En)M`+ee53PQ;kauo8m8e}-FIg;e&;+>R*Q7aeazf^0juyACD@T?u+^5V-o-#?o zkh>CRdRmlIV40B(YftUKOf@K2nJh1__6pWicMvk{+PPdww;{TmuKqaubw4&b@CKW8 z^qM|<$>^h8(?(Wyj5-$t2L9hW?;zC={r9dp*wd!NY{vH<)3^Q5>*Z)v($Q@$c2u2GzkP>il+|Ou z8`sFRH(K-ER^VX9{$*M8o;i?hpix<9stF!5+T$W6z0|-;@jrNw7LGN6etNWcgVh+; zQO(-rnjTVy);eFAcr#e<)!`~>_dS}+awh7wlv2xCgzoE0MsPfPD+^49GK!X8w&5~q zTh7W*lfPoV=u&4^bPK3BB)nz-SYk~NZ+N_d9m>kw;9Z_PE*@NIsMX2?Wn5?D^C`P% zHH~K5?aQN+9u&*h|6ThU1QwQ!(V6*qDq7m=swxv)Nhztg-ku&HzrYW$LH_R(l^br) z*{1SK?1w~qR>s?o@yMsJ_p!uML*!CP&YCv6%fasaM*FFp?&*3OHi`_>*K1pFJ=ydf z-|v^H`3sLdG^MY*-QrxmABWtNBn?cg9HOFn zw>$19RY}CLECnmgg2SjsoD5XJ{i3IOIn`IWi^GNFg&r9t4FWYagK5ZG6~dY`u-bYi zUq`cv1dx2hsY0ixdM~9WB(0&|s27>#7VxG?@Qt8tL!a&dHA8mQDQ z^uG(U;~hM_pVuAxgU52Y8edUq6-r1*$arL8;$kKb{IXTS&l31tMx&7AXdu22WauQ# z6^Yi=F=64USGiWw+K1l2GpwfcR4D#z&Fa&Xpo&EC*5Z}!63}b3u+w=A;Co?xn)05M z%cM!Af+E3C8kfaDQcRP=q%5aQ4x!0bt6i8JlGpht)YVWE!QlIa;N(Pk{^FSQ$lJK! 
z`LKt>a37&?Me0gMsaJ9J(Y7%yNd`by0R*fU6#R6&Z8NvIWH9?YuKqxoKYpXrc$vIV zsLp!4o!IX+I-1N3yL^+&JY}+^XF7^{kONm5_8HiEynk^lel(w6X`W-d%VwEeb?5*3 z%yE~|{ri~$31|o`y5AU#ku6ZcZI9z^_kH%bXhK7XYK&hq7Qk{1$XRaMmn^P-c2p&0}D>2xdKLdFW_Ma^dYNhxpLyYi%B zeW%#m%Hpk}j7dGV9}LUPm!8!t9p}y7Tq@U@A8))Vwbs8N<-S@ob-GRD%+=Lb0O3a& zPlJqz9#!f@%DEdBjilpbnifrpsMQ6f)uT!ZzLSoYp>A@QZoca=JmUF(v)*FO&n$~_ zczrk`>rPHL^p5wI2(1Y)8(1Rhn0D3pdkY%EzEIwHxMaH=ZeAOpZDN(@5!aFiC=OD* z`Dk*w4oPdD(w`P9x|eJ=qlGTsSg57i=MQ9+H9q$pQ>qn0cv$A2g8nS{@;Fy5Bb(vL z4R7iBn6^8Pi(2Di0_Lk{vV}_2zgrTK5lgMRVBidREH`axEmLoL#{L5Rclvrf5u{Hb zL3Xl=iZ;o%rizjJjlOL5K#Gu6-l}cxtA*iQlq``|7MHRfC}ACPn3jQn&B~`7HlwRN z)|edUX5=VBR>Y(w{56Z@x9JRC=d%S21*A}|IF^d9pB^rqHWk&P7^+LV6)CN9wmgMq zJk2K?8rNGzWHl#qHC~T5PB)v3hN9-?X|qfoTn%=U3MxuQAvHNBh4^imPy0(*?Jk$^ zCX;yJF^q_tE;qSTa@l;gMv|lm85vE}bPGQi^11MX%DdCuYD?1#r##2JXSX~mp%%M3 zB{Z%ku?F1?k@A1#WN2tEDM*C~OCor@WoTvw{CEwMQ-=4lF zyw3Q+Q~kLR5KBZXU=+;17baPV zDP<)Fg-(!&s9>p-S;GB4Wc_7STwT`$iY6gIg1bv_cXxLS?vS9t-5r9vyL;o%xVyW% zySu~bB+vWZJI>iZ_`&F2yZ745=B%1koIC|(wV}h{#Ry6kt-zsbDBU=wd63};t#BW= z-Pu}1qb0FuIz+OaBqh}aZ9V+?u6pBUaaqy}QbrS6ASQ=}kFBbLxQZiMV&Y8w)D~5x zDWH$V%KrV@6;MzBfnHJ_6M~iONbO@Pl`asw@5R!T$&j}osd|mjw)~#f z(o{emABKH_M$zbG_*x&D#QDv=mG}2> z>9MnwGhTgWbjElHm?~!dePCQ=Jg7rA3O=l1-Tc;}p`meoqFwd|su)bn&C$}J-eI9Y zDuJeU3Y3`JimI+d1$*o9d^`U2ExNS)$tlh>YhO=k{`92y?me@ApW^K@mO52C_uY5r zJ&Cv2upwIQ0lo5HTB2~Sk5)_i6e$#f>Lvcs5#EUeXFme2Ra7%xiw1z$8A4LItK-Wt?&uS9vv$ms3y7{e?3?Fa{!n4z{e<;CtiZ-^ zRHtnLG&-CiHCe;0a&G%}$+}w_U{8w-qLOmU4fM|jA8)zO8$Fk4!$=eYl$EG8OjHsR zz5}op&O_#8TTOj;|>=W*&HBr2Nv>qA zZNKcP!w{Z!>tz;1By}BqZB_Lzv$=d@r|zW}!C6+$w56^X+fJR7SyGBK90WBa700YiX)G-W*q8_>7l6tNyU)qalAv=2$y;U}ONI6d$MwCHAr<_?*s=>ARfjEE9gEn83rqL)RSsWZRTwzESofNH%ciOrFBQecL zv%~3%PW#@h9jzR0@0?)q%~!VeCY9cfnWed*Qh(FU>HP`~;GEg2LT=P^(QmO?)Y3Luh;EjKt$V6 ziKbHxp3OCv18`=(TRrH3Djy(QT>d~(UkOZ6O>?EHX_e4?WpzzKU3BZxJ4#yNX|{V7 zz-GeMM-i2a9Fl>3?Sd;;-xh2^jK0H+NM`lo|mg5l9t8{pM5;MN);$Vw>d$N+{)xQ?4dgq>- zw>|)BkC6Or7oY0x?v6|@V`h5V?dAYPa8B^?@#*lkmzR-|(bwOenNiA12kG*fOqUC} 
zz$?VXqbkV!ETi~yZtzPXsF9LyUyG%Q$J;|^*`28D%b62wj#bx`)Lvg}OZE2|(2@Lm z4|nqkt{Xy{nVueD8JTd4iTQc407!Veq~zpDP+sa!IR4=r)|B(~qLDTs0ix_2e*ruB zo)f)hNH~1DT6;8Bm(WsE!;yIsl}8QjPlMi(Fs@tWAS;26~P~dMzgN8D9h>K z7tVh{dYm!o#PgnLdNA4|>Z6Q;&IR`wPV7ni$b8c&{omhoj(v3W@PiNZh>3K%Q_q)B zFd`x%0~6)iT(pT&Zk8y2+xh#+4Y2+7t&+%vPYJcKBbad1@>(T(Pm!KvS= zyM7HA>i2yAJ+Gtj-(4;;KOwvFeRnQqcrH>ehr`)>lIh^#m>%<#JUU*WCShr)S)8cU z+JlN3@0ZuGa1C2=q2GGA-2REZUq3(n!yFOIm1y6J+wsM@r!zg%an_LfYtx?g{251^ zYn=69PJzw7_s07qvZ#rt#@S+mO&pmEvji{X}Y<$BTFNwd=QLbucK zd4D86>>^P{+TgtoV8hZ!bs}nTm15McL4}t$R*a z9E>yVcQyK5{iSJ!2#+q6;WZJ%4YvdU(AewIY9mcc$QF-u!DLFT@Nvoou90h; zo+OOmxU>E^Yp{v+dP;rzc}oq}#yMlLqgt0=Sm+vdt=_gaHL0HJ*Iq8)w7Tbk-e9|D z5PsD1#JP#Lg}PgD*6S~uvcV8>(fhUx07%~4@tQu(*%`rc*MD9PWw&bHC>n|p3dMMRQ_xj0Hok{PYYJ6XuY0ua*SH|gC#r(~>F;hT9){w{S z*%o#5V+J%E1h@lT!>ypQzaDN8z|errL%QDZx-e(u^}P5MKv#+Z8FIwaSGXG6jocUc zOlgfilpa8doPGA;eZ1_g(<#p=khmDqF>}G6l!li(giKfbk)6$`IG1DlBGY)Ka^Bh96s+JRfuilBOuR?eQ(|! zPBXg!zdhG^-<9`T6)gY^QK6rqI9X02X|}sQv}c07wrSg49p!eplM+Z<^yUwo4!^#5 za(dRc09yiT*UJ(TT1^L6hZt`zXxdX!^ky7BZ1XR2I#5OO*p(GzxIQ*u?2h%O;nA|D z)34s>4V*^~Ux=7$D37;k2YYm5IGr8Uu>YtscH%ir5DrcZR@wh~{YqaWc6)Hh>QbD! zPOrJtYl>8Lf1OG^lT9%+Q1|Xj0(^b>nJzNsaX902p`d$mv3oF?Nr!1RsHT`6AWic2 ziuNphVop6%e#%!^hR7>T1H^ra&%~cvHGX^?ebsZWmkS_dGnlZFC{BeJu$9F#Fy;Ns zL8$43^!Fus-{Tn?mI{J^Bvn<_`T6-8P^yMy07NgAS5_7Qtu|JgEHp#kIE%Aq55S-G zu&|VJ&{2m3-}HnmaU}c4P#%xDhu8@H$m;l`wJarf^3T%4cpUI56)M*hF%(v3EN5O? 
zb3(!MY3<``%^9ec5~F{1jBFfG+u(AuFc8i!u14b(A%CWngvsE@VuBO?+;1sToLw{# zg^Cnp)t^JYy+g_w)X<}*Q9~g=fI=S?oDtxTeSslpj6A&0;ANGwa>`rf4h9xQU>br6 zML_-0Qm>~VS?FM=>VuGxB_{Qc)REs-o457`KPV5=w2PDoXj!M|2J z1#8pR(}pV6sOoB149hX%W@VahJd!iIw5q?)e}A5-lQE^B5zD2ZlPg!P{XXB3n>1%s zkk1iD8fI%eMnOwQvbW?O*DTkB6QmWcXQ%&@MEA{TMSwn(w(Dg(AyE?q~2Ala9EyQcqDya zDsd#I4DmgK;PM--^`2+*1Qhtj+qK-#>UaxJJT0E{l;=$V>Y0-e9Jk4IcV4>9Z0ti` zvm7U^Zi`JR5>)i6(ME=Iu+)HR>OjWgAe?$w0yfcR9v#l zbv6S~%t;u-F|Yv9kMj1)Yka$q8KrdYXQV@hLXuXD$tCw`GX_&|YdnS~lC$j-)qk%v zY$QVyRu!7#n%Qw9BNmGY{U#`0X=ZpvgE?jTd|3HVZG+PL0Eg3>9!)?oVg-y><5n-0 zH@Q25BOG1}*xdEH5Qms zG%i{iPTCzj(7Cs3&Y!N=BDZVUk?_>#gv?zTW)<+bBwUw`zIpLc3gN77*fHsb1ISBC zYxxZa%ay?24gBhrigK+uDyyd z%)eitqt%}+#DbD7mZyWizigc56j&TgG}9?UeinnIlDU2V1*M}n$$T~G(HF56_d#s>1{Hv|HLf!QEQsEbvqnW#ny z`OCXCTDTDs0#hHWuwe>PY4a?i_~vorEU`EtXIs1~N$5*vLC0)g1_92Bh?*!!q#^(@ z6YE5LNLuwbloGCn&vspx(oG$ocf{%PM?xM5z6}mA{l@$jE^qSDMghecq6LkK z00DEQ+wMDp!c$9wM;p1ryJ8WmiR4vGRgjXBFgFb$9Ru^X;OhIxU9yD4mTp|bpSodr zP-4gVmwP^6_Y+xcUo8}n{D(h~eUA}hjBVx-q(4b%a^gkma;F?(BoYcT8)>8OQ-%_m zOqK5o)AB<)u^dV^oF80+Cs$7+&=iO#Y^e)4=TXCoi z*|C4&nWiNa%`FsNt8#gD6vfmH1zfik3*kdZQUX$Fgf)Ih$wXW|&wYrWWe1H;4cnCF zk@r(DHt-Al9AVxzkmljc>}JK00&Hz<>8&EvwF(wPWPjKg2c2;G6$HzaQ4A__2ymj9 zt1Gb)BUacRr^PWKumMR-K5f428VOGF5{c;3R1})S$^4?My?8{`^oedH_>y8{s=sSQ z4h9VjA1Oko$z;`qzc>dhD!JFnLl8L;9U0;%iUiQKpOcL=EOeeQPiaqK?YPc)@>t%% zyanAQcYQ^#U_BYNhAqT)4Y3;8B?~%Ef3up}yySIJ%E<2^f6H*QQ|?a_CyUFnZ@!IIY|$pKaT&_`=?4=w~=ebnJcg+##2 z^mzhrKZ1vihU(5Ehv{Znt&CBJjGf$?sh1Y*XzwEKgz#{~@GgEn2A#)r+uHg;%^LX-91H0w04 zTZO{|HJAM7+s9(7M93b9^Z0CrV$K`2QSnHDCE>k1XMwu7bklSUjk|WI1YYM%8#D^a z4=1KCy3)7^v|2J$`g0B~Y;inoz2f6!Z*z8ir>_L(vY-A-QTMs=W+!%6u@VM6oMzkw z6gb9u2<0+`7~F28ex>7LfuQ(cnWOCnZEQu+Y|XA-V=QG`Q{}p`M+?BPNz-bXklWHW z+A8RBb_<4awn+$j*eT!w%`OR>pWdJvt<1$G^bsSsHsT>R}qvN zje+jO$n3o94j&os?9NFUiY5;d21`JU2l4LXSzB?K2xT>9jlrLtKlkh_>=kVZhv*8d z2usFvcb?LOQjl0WA~7~J1__TSmDnF7bd*C!45Q`9g^rfT$J%`T-Yp!zd$voC6BBEZ1Aj$%Hp7;i5&dMU5yN!Kc!TMNMQ6hAtw2hi@Ue|PXAAuX=lNyvMY#Dhg-w=?sB7m4Lc| 
zW2=tJrbXS=!=0McGXE}Xs)}FYB52zv?toC!ouv6=)d%m@?DH1lw4*|}Ph zl~&bJbM%O1)l7U%{}N1E^T_zP zQ-Sf?Ewl*9O_A*K&l>cH$5Fde!|7=|M5wFX10CU8efp!hns2WvuwErP!jrMhlN>av zQ4i?1#k%d=INSXC3_mIt{g~f=v5G2dtB$+eu@QTpPdTDq2I?RB&O2q$Pmfz&F390F z!Dss5c=X+pk$C|Y@9+bKlX)!o(VNHjXf8FG`ZA~nJg-yey#6r zm*x_DO?WS8Vbu!rCVY-p7|(Tn{gIfKr|gcK2bL)LmWL;Yr{;q4Wlz#+XAA}dNt>A( zw;&f%`}u^^W*0XYM$=V`y!jsNBSn4#e= z$TO}`E;HgY_?y&>M)>Pf>R>{}B{_)q0=m2dY^<%FZw}-^iA-DDt5&!Bg@9BxD?%X$ z2bLpR!cSmewabfF&hgTG)dXupMHJAu*!7ECa_dxs!SNH{Nld0##X09>#VdkB(0ERW zsPO3c<0Gp$GElH^Nt$G6=on}}y@Gay{83cQC@Gm9-kxA_6m{MKsc-?L1upfCj2$!1 ziJ672!Hk?c7E4@nQ>|Y}Q!^2(oIt}#!gB^%T3R?X&J`_#d9@Iv+`C;u9?H9$F^5ni zx5mZ>_C*XkTg7~Mx1$C2@JpTg>xFzbj%Y=S%^4%#h$=+L3SaHOnLP}AzD8^qHZ*-Z zu{^N=(V#N}&?Z7)m!;M}fb|+KVVJ(GU~gN{)Wlg+WxdTf*1_P^0n6me(-qiZ79rnU z$?B;)m+ncb$CA(Z}R#;Hb#|p}_6ABp_75MuvaI*V5 zF9RsR`WO7?z9S2wHMQ*xhN6|%tWr>#cgD%sXyt@mG)z)?sPrL-xmc&Qd^#-#j^g2U zMFqPVyME4-7k$$ikta=?b8awy9>UKtp%W=ayxzAO>rK$hdmPQgkCs%z#ACCJ?T79Gk^N* z21629s`}=vP8ZMH?@X8VJ?x6xYAUY%3Cr$|M+9}VrUJh}Fs+gItE-7aM+F;x3LSL3?*DzhHks~applOOK z33ZU&uM(IPfXQMA)vmt5B&+JW5#yx2L)&(Q@gXngl-Vakx!Hc~1|+lVLickT#GwOS zYn1q7zt`%8jM#bmg`j)~6BG)4RFvb1KXXb2*uzL~WFiW-1X-LOVPIeqB4P8S&#Wp+ ze<6}HVkFC;8_FD~l(svbZ&*-D z7&h>ECT78ZY8bPMJ^?L@cusfdpaGGoN?npv3h71uBQ%s0(qvPtMl4c(9NHEoWwHK` zZyCBYH$<$lD%EO>NE9(P!Bbr-(TXI{Lc}F{dM$o`*>j*L@QzG|Hx@i!?(91L&X~a< z%?iO%&dfvcaY31kW@Zrr@);HHmaGD#Y6V9-1{%_u#mH_JBk;*cI~t$!-&$4L1g zJO6iQK{$0EEc|$4D@|SKdu1-}L+i#~ks@lXx)sSUTV&zTu0IEgM@Gb35Muc*2zqS&T+(G-?gAxnhTA&_u-t-0~LeUjMNiiZm^Zw3eXBEy_$0NGq^bkv9)Q!*t z3)clZ&u4dv#p98z`wLzVSN7Uc*ZiC**|L+aF~3Rw{P?)0a&0*-CNPgg2E};V$47j3 z?^nV&rBQP?joSG`zLox>n;H1j?go;u&}vbs{^-~zC(~T5!W@ZRU3OI0L)z=clG@!4 zx;~F7$zoCE#;dbgyZ-Gn#6}9q=Ggt2udHQMQ~>g|YpQI-_c##ar~-MluC+~LOD9>( z%KPRhbAm_ps!~E(>b!3?IJu(H=uFpU1eoQhRhpMGU;N3))oQ5)Q{*>$+D%U7jWVyzl!?`@sb>AQm^nUEk+Y)lsh5lGPZ?)H2O2CT=prQiN zCu7HYc&N&X3{7I1M8P06W>41Jz;`XJ2*=&>E~KW(%DXejPT-etZWIS3;a@--)9rUggIR(IPPv6CvZWaD@;N#5vwhoSh zKzw`^EyZjWh}R4gzAL08gZN;cq7Ra+5M6Rto^6%tcN}wc35A^IqUj#&ilVs4R*Z9` 
z{xMq4B@!=AGvxJhf0`J;!!=FohkIIMNz7eQTnLbKri+s%TdwVyiwCc-AkmR+dMwR~ z>*tNEX#y1vr!8@csykpWY(Ac1-+aiISS~lbe;XwP`|TnTIRq(#wUWTTQ?Twe7G{-_ z0#w*HYRjT4>hh}48g=!Zt!dIYoA#x|Rc0~Q*0!a)UD_=fjBFr@=gw9M1Mcp_TWYtu zbL`5?%Q(O;UZe@gl!e6_W=?wL7O>*Ex4u2SO{5bqJRwVm>=uEG{f) zue&K|i~fG~e7Xk{czqnk;(gRxOa6sJO8o7|>S3)%Rk}yV<#eZ6Yk-W73|DX88KwmH z^o=eswhs2|<754Ymcd)^L+l(q{qeLH|MJ93>|nbK9Cx1su2?N1Uk~&bMk`_TKDq_z7@VB!WWR682UP3f*k;$A$SWM<1sQX z4EOju(l0U;)zyRF_I7rXl9Gywi>v6czt~5FchyK6vltm~lJjq+T{o@G)YB z7EG79LGI2ZX+p_KDl00VF6oe*`W#3IIn<)=8t(3@m>z+MD}>WdsDY}=v8rln@+vAC z8XEH2cPEo}eVaVOjcr?*ui3#QyW@&_dgbdCqLjC>r4MfTZ&YoXHGN+0iBGz9ubsLc zRJg5{eygTqXM8sPcz`ws(w*m4VUx0rjOMicqho={-W2$BmBiGvbk%~qg0)q^=^|18 zIfC1fz)L`Q>X?p~wC=~zpB=>QEIBZ>)A}pCTGV2+Mz+JoO`%`6AzDO$BhBNFoZiJ? zT~{1;z{L`s4s8R`P!*Pfb8N+;T==uZxR@2l9()Yb5hd1nG7u{BJovZRBNKhTJ*s7 zN(x$+ah`b>+{uzuyQOm59+0r_O*#f(G2xraws?$xco^etVPstYb2G?&egc-imc=1h;N>+g^)zWV`W0)A7y0_U7Zqv3-Kg!&A8!RzY%ls*!dn@srqpGl`vVSZS$=t(mJ4$E*nx=u7-(;giIs_p zF9*R*U%d!_VWS7m)lZ2)u8nX*QmwY$hP6*A#Vy6CR%jeeKnq}pOJa`#UQ}yulFS4o zbgI}qNXp<iiJ;jByu+e3J6PAcE-oSk|_2e-VEjpF{4 z!YR60Sg)aWTOi1MZKq6RsGdanjYP1)R_)6%{F~v-geD!?APg8(Qi;bd&52S zjphp5O%kw}o<{Nq^HmXQ(`;-(S6|KQJA7_5E5vpZfmA{*V5ch98Lt+RDqUP!NL77v z^RE+Y42>K;U6~d2&$(GL&0Xm=oDR9)UT4!beNSd8Yj-%!aV1LTkle@Zv7eanu4v^r za<=-3i)~C+3%qtRK?eYiY}}yDl$GQ7(mnN!%hi87iO@FTx&e%ON`uEZd*o6hR`;Jq zvyZ{$U6_tUTUS+)Io~nuec$ZA>nbM4ygBy(zpf6F=xyA@uXVecFsS}rdRi*}6xQHx zHG-%{Ees3$DREq<*m&|YnW=rJxhxGTqR43F84LY+O%ID_qaVH2v-{!8(9hH# zi?gkXPcQk}pR9Pa2X$eO*vUF5m`vyWnWJ~JwZ-t^(|kOkV}*=M`DB+F?C-g^jNiKB zz8MJ+3FP(`jRqPK?gGo3wk{G#`jHMTqDBhdE=C_LZ^UW$T&|*Pmm~Muw%Fw!T*rEN zePKg~uJKP72Vg#Fa>}W_&i1HaDGjD8O0fl?O}L#C+K(Djt1~RERUYkj^YB4l*#+99 zA>W+ROBt}&>vv|fXG|qjz)o+v`5MP&a%C7e>TOB;J zG|O~auXgxzTe-_#rJAUYu$WDrRU?C1cH;1EGAxF1t%G*$bQK6K&@iHOFRn|4YWK9V z!D(%GUSTXE;hI(J&G49J9?4Zdr{yGkJ))!U&bZwdi{O|NUc0oAr32|;Gft&buixHG z#_m2Z{bi_hJcs)Ebq6Dp{c(S1Vgh-|g@=cO{~>?@%@?b+n;i!S2SI$caGrT3#M!u1 z3`9gML`3BHI2n1-xv_FZNF48B4 
zn|{S7SZRoLD(i3Q%MDMlW@CY?GaXX<=nd$*;P^-iA9=zz{PvH()&Ut59zMEQV+;bk zfDqgIlbKO4_B7KPMR1V{*u19Xug5c?jcMkva=Qr51qs~ zVczj_P=KRytwA2LcG7)o>y+helUlgqAO^9q>g{G*kp3Dsqqm0?C1+}FT$BB?aRl{4 zOMOwv{qbTp;bX*$w&!1l=L13!5f?w#mJC|dru`8An@oIzT`NN|8!jM8NNfTEM}p2nEhwqKgYK?iAQn>(9yzqitK_Cg$SUTVG6^9uLFp>UQG_0OkKBxBObloH0Uo0v@nk4%GzVD_zzNh zKNn+v`dVzlgl{s@Su-|OP zCP`K{s0midoRUJ4p;3Hy|B`)>9<24!gJFymEG7)9cR43VsVz#7Pat#-KMnLSUc2=}c^zpj=#Z!@UP`Hy{oq zU6uK}dVcuuEq+PE-CPGmJq_#zow1CtzZW87g4}TrqGv_@fI7uH!mfn#>V;QH#{uGB zgx6NP?eet$=6bezeq*=<7&AsF->hh~S?$a(2Ib&@A*0Bo$QcDuAwPC4Afd>g2%^CE zX(uIoB8FV7G3<})dBT=2Cpb|HijU@#4H_)-$|8%!!&mt-`6=5DCy8HIf(rjL<;OQ- zQ;Mu7L@C%!jJOeYea3E**2>{l3cZ11i7JXIX`_6@l}UMS%%;9y#Ie>eFlqjt!pFie z!6R5M5Ko~-@S$6H7z2qh#oPt0!FoF*W3_5#q(Q72PeXja*FQZ5q12NTD-eVwO&!4y ziMJA~OX6Zyohiz9Cath+TEw5;?+NDKRdPC z5qaGAN<^exA+d#DeFGGK`8eI-W@7qK0C#9qLY=*(mDbJ|7 zUHzBhAkJtD8)k@hldwY)IBqKqb@3n|i8yJ{YaTh0zalh+0S+MEpAmGUG_eC(U(j?} zNg9-ImtO2}lg^DS@T*eV4t-+ur4A<48z_;`uh0+E5uTyuIF^cyA*V)@Ali@P1JtaX zB90{C(SS<|ZV<43H+Jjb-6BWktWk3;_!QHEx@`B#W!<^N?TD8@*FnL<&n`CmRoeQQ!hzS6-s!5 zi7|y7COCvENF$=1)f6Z&Abt`-F)4V~0s|}TgJv1cIhZ(&J0^^7^(j32qPwRjy)##J z*{7a2oUEr=U^${xdbFWB`kLqAv2ZQ6XEo`x- zul|o}y*p9)F)l$zSB?YvDvbte=ye}kQth8A!Z0Fc)gMp%9QCRy5eHjMR|O3QXh<;y zDBtgx<>6_EBmA%X+YO`sa16dXf`1$A?hT|jK`YlJR+>JLUVg@ZcBn(fjnO}0exBF& zxg%F)gw`c3C)oVdNYjV#)%IRwFw<%{V7vOvMtL^h!LloUl@qR9|9~@pII!FbFA~2n zqWeVu;PTreLDr)a@_%cE_{ARS&5)7|2g(!#QjJb`CtAzNjB|*slMTXMBu&eB?%54z z^fYsguLt+1EjtQ3r+TBCJn$ik&H=mrgg(lGFSu-0UPL~YL*ZYc?QXA9PI*~qm?P#o zD?xxP+8VK%cQbZbEWI`R17$pwA-z;sbFt0AdE4ZY^J!KTucrwLYCNSrv??UJRorM1 zxCcz_;cWn!V$Fi7_2jCs(!nM4N%8^)3{Q7dS6UK@2qk#W_exUip3&g4DLZzx^O3EX zg1%8%;dUKDyr{mt-BFBu-<^<7)Nh=Ozcx=sGn}y zKuKm4_j@}pTPl@XPa#O+NlK$aEiFgO-}S*nViE2QdbOCiJ)Y4hjWWRR+1y8Ti@7hG zLTFG8{mRLJ^kNHjVW$0Y-OVfpE7s zm&?@b-;)ITEu_pJ1XZ>8ZxFkMPGM`g#$iyGKOH;#|8}5H3cQo+=yW;# zy8%t0l6UDBKKxFk?jj0LM48aR&gIOEj#N~9EI$^-5Jty4Terb>MEdAd#l!rle>(;a zCxQo$AnvVwc5wtseuK|*Z44QF6deUw=01njvZV1W2TsVA}Ixw zQ!e|VJLiaF>h~<%%|bnW`YBD*yE7~}i 
zV6GZ$8TAd5gWiK;LyM^LOPxQVY0qfCW#>SMw_DI=^Z>zsU$qC`s=oz)(j!-k9qnA| z1b+<|M+uHZM@JmO$j{L~{mnd5^lMTiUH#JeBfvdu^b#`Qphy^zB_R<5-N!SnbTP#_ zdH`fSuLm^PK|;#w%af05_RE3tkk^}O(;DL8{{Mi6tUqwE3mMjq9q~ZY?qs%AwXtrqqsUMrR8_&M!uY z%RF`76XQxC&Y4Y*Z8keIYWLRiva0VJn~4jDlN<<=yuMDtOc|1n+C$A39l_5kx8&6AyDM*D z*L84LG6Ys?G)yAOt0C%bjKu zt@@|dTU##o_pi_cP`JKF`qfz>)`AR(A;=F%U_`z9*4py&#&2*#K^LOZj5|DiSHc}`$MC(0*;6#DCjgdrMic)NBP%U?2?u&6{ zhO|5V4+lhqZp0)*pc9ljRQT2-sfMBs>06Byt0F_pF=!_v!qgiy+BLw$n1+je+POHb zC~Z6Wix^>SHeZeu6|a}-tKF5EL}`1RfEqD8pC*vJ)SUIyG6wdKM$Vt>fw1Xh#OU~!jL%WKN#>_)hZQY|LSTJ zgbrq+VX~uEgyx2oY8(3#XC~O?fcKlsCip*ejKq^+uWcj{usFBrBC%r7tW<|YEIE%f z(E!_GcG$d--OvEO8;ShLlG%b-N z{byN;ScKNuXYBDVo|ta;S`|X|G2O`u`>I;iX}Q8E2)WrQ_8?ZnT5jMFO-wVd5-WM| zv@dbx40QLj>{n(T^{kPtt1OZJ7|34I-$4{{3Y{EC0XuBxQiS;iC#YjXK;!RSWkUJ~ zf~W$Q34 zW2&fMo+npqOn*2t#)$2C>dG|!-{|4Ik#7G7-2t1LQY;3BhlLEapdx@lCoK7J9xwPX!c za|^0m{+GmdF0{6Ww$HOA@_529C9*)ywpi#Vqh20hTG73mQ|ud4_Y8-R@+aB%!-9{?%_{O`&_5PuZ*3Jndd1p@!|3}0SXR~P7KP5-aX@82iA z1wjRYz98L;|6MP}|3lsX{AYs?g7l`1;9n)(wV3o4sXN+1f$&~THo0qCQK@)T94lFi>MgF|21>Y3^aWrg^PSlQU*c4$&Po53ZcnT{3tr6BfO>wC17bV|94B+ zK)+kR|M&Q#qAC%K!~+#2_KNYCYN)3TCw-7mKqe|uKa=JBuU-;#D|{Z%hT^r74oON>mEa} z(0BSWshutQG*Ep!xi=iTAeZ1$_={oorlAPa8Hmv7C!&WCm4Xp{d~)_{L@S3MlZ@@F zVdy}VmhTeKce`6AY=PPC3k^%ek1w>sKMA4GqJv=c|L+~{T!T*U3d;Co9Ghp- z9}z%A^B@Na!K@lYeX@=L4*Klh?86`YZ4F+aYz)99;kwQud!S2S&tYXARtWZYj!UwlRXWD+)Hx{`gHmijwTuCxVZ-Zi8`qMzzY*e2q8Q!NN$DFk>fi8QiSFiBguCO$3+9VA7<`;bRs$o08z3UfdTu5(F4+8SkD?nsB2+K?AjL|9LNS4;oSpP}5n z;eQ1LhN4AuP&@2YF*}u?sba)|9=?nw8l7J^dQY#zM;vbWdQrYn2ogFg3AYSPl)Mmz zEaD|gH3sG}RK@M2Y5bBSv543^2P*~j+keyITj0@=>CHWeI|kxj+}vpU2xw|zV>W1& z`}lkWg%lkciUfb-(V?~Xg^HT=u;8fy2FH}w<0^hPN+&zvv@n-U`yzFM{)9%gi4;=iO%5h>da7(SBF*aNGk|yA9R4i!B-{0xB%4wWeeuXhh#o)>`r1#@N76q}2pVu)U z7bm16JTi#!%67}FKwSb+0)@qFHi3aS5C@tL$K-4eLv-B&)G?b4rEXfv4kqc?irxdF z{hbbB$P6JzVOJqV4M8s#l*uEu{|{wv85P&kZVMBG5G;6bcXyXS8VMHMA<($H(;XlL zcXxMphY;M|oyOh0;p^=4?sLv}$Nh8bM~|_(msZ!Rs`=D><_uk_*Ae%x4-J><{Hx*h 
zPZ6d5PrtTsxjPn>JotYAtsfTjKU2a~nRr|$s$|1sLU13XTT|rpLmXjoV(1Ew76FSM z)?^<`Sa!COzYk-p?;oMC=+`);Jy>)aA$J}bA6;!tk$R;n?QX=hw%ey$hPnQ4tmpdv z{_K{N{#5`WYWraR47ZJo67yhoh!jo|;Rm{}VIkQ+lj&g0Y+}G6j7|W<&g!bFWk9F) zeC$-pR&pB&Lmimh(|X*ckf7RT+;2rq@)X)3CUO|`mBcdqOd@XWL3A=O%~)?(V)j36 z!4EbVUNyj223;Gp3=B0Wg|=bHFnbMHQ2U?GyeLwKdGa^yfOq%^)3PQceDlmrswV1l zd4FPte+duXcCMglQ!uT@>n|z-?P>!Bv~2S^HT79psu@}0k_wvZ4ML31vaGj~y%Zhx2VU0N#_P`ZT{NUsN@|NK{xdIuX9P(nk~CiKtutE5H-g0pEot1;uS zc(IiHH3Z;;erL!8Z}dRdYa-a$*|W)U@6uI_ZWncIaH4}}dzJhoQLaKZH9eV;KYR!Z z4GWVS9cBpX(MIAT4r1IfIZpr@cB665u(6xUvYWx$6dkks^s7#{|6L!@5kg5(y$c|HUS?bv37#7@1>RSv?yX7UJ^ zQL~W62Q&$+|5S#E#n2AE%$JF4<1k$iLpC=zzh`xg09?ZU%oCoj65x|HA1YYrytTYj z-4s<04{s8RW)Anhle4h1b-rHhbjnXrvwQsy^HS+#qpHBQT$^)gXr!lSpf@olg+cIkf)4MwjpJ=O@N7rd|oAzS`fNq@PHDVvR7 zhBod07uJ}MvDDmabh{@|GzSrLh%1Mft%Rz_;inVf0_dpfj_wd>?)k_q(9 z^Oh|J#VN1FVGNqj`_etd+9SJ>=)(X>w4Ei<6AxEZMbdformxViXg^H?|Cp2xSU#0^ zFlh*H*1cOD6-m$=Il944=Xwgj@goekwYk*Lvbj;M$^~`!+fIFuhx6uCQX>Q#(=rRM zDH7AZlitEYXBov^L4KnqfrOKI@elZNt|Kij5&Q4?&JQb?1n{^%bHY--9D&C}`R~k} z96w}*3}G&A6HCkw;tHluywYc;AAB5^qa>_yiN_Y!I}?-?hH_z$g`LJ`vcLtODHJ<3 zObo$pmt;Vi1PD3Wr^7*EFnFL;ql#%PQ-z`OI$7eyyqSjp?A5R|WnH7b1g(f^hO+@CF5f|-QrwYu5w zj;6s@(<)K+_xJnHF1ClZwv2e^YVCPP`0K{wkPEkQCSb#z{Lm;(-$~;3*PF?#VWJh& zmd zX)KpRF^35en1F=BQoZUY41eVLvluCn`o8n0*`h=slVcF3myU`n@v>@FK&m$Tmu~o7 ztaRQd-^&hliu_FhzhQ(%W8~Hpx&LIXFL1rlswwW{7fE^CwLg){Nl8&P-*nJ}j_w;A zLf$8!$9L*}M8B}7zZ5)-CBFZkf641#TB5|dh#$d*0XG#aHn5wh9`mbReR_I=WhY;$)5{Ta^$e-vQNY28ivJI`GaNq=G1pp~MMFJ(zT^W^ z{8FnhfCMaHovwIXyaRTvAJpW7J%c^$Y~#8k++JdPL`;+EvRzt?%`POF79$I`l(zKH*v zS^Uf1v*uH%=2lL;!;^{^yzqz|e?5m<*kA-7i2EGG1*G_G^ps7veU4s5w?K~amHt@x zGKK3~NcM-AuKG0h1>2)tn-#e#?*+fe5rZ%~Pk3yu(Ypo5kg+EgYNWVSu}{|-oMSWi zDb>&XdWLItq|fs`+F_^68g9%P%W*lsWZ$Wa`x4C97-bu-dhhLyE(Z8fP!e+u%h^8! 
zK=(t#C@r{0fxXTZsE9b=j{~BoHGGZ;$%P#1fy=R6E%K~8b3tk^()!J{p4d&d>rjl| zgX2u|dHc=opdF6`l>x}*ErV{e3ip0cuFaI|aHD}P7j5z8K$w0+`UsfH9+MOCYW_U+ zyDM}zl6y++hApemvtX96d9;!3D!FDeMnE!L$D`2fGM~0&-?F`uG`|$d6Sa51&5{d0 zZEVMDCXYP`Bw@Z+cbT}gvLNbL?B-5>0j5LKCr7(;;r476x+82hLx!io1sg|!@6W2{ zU(db?6x40eePa7Es|^FWAIr4rs_W{Q=%Q|GH|=<}8*JKO_#$67;DR_8Hcw(hb?{My zy|u|95(G6BerV-s8#Z(Gc~(qW;W?&@eb${mFVIbkJ|);Q`9WzC%dL|RsdE;ZwH5jl zg!kvl{Lo7%ifG~6`iKJsS#6~hfE1AOwzO}8H)*meJsDA|OiQ;{i7hAJ$6Ij@DmLzm zfEF;fd##ivi??sQz)RST<1@YA!Y7ML!#xf#eP03}R%I|f?DQgu&kM&EkI zB({v{tS>H!%4pW>1s8JG+UtI5JY2uX3iXfNT_n=+&JP_2Wb*{ZyOTsOD9WgzSyoKc zV|oX8T}Wpz%bC%?r^3cQByQt3YcMJ&5Ph5~5;@Wo@RN4BaQ%EKadT{5l=WjB!%SP? zIw$Az0Ef$ncv;*@Af0fR!2VZyOy3kS8zNN(OS~%=T~_4@lP)$_QH3`+klns#?g}Bu z1HPiI>C#k@D-TibQk$v-JkjE1&t505Wfx7k9&Z`AqmwB4f>mvTqbgr!aG-t4(p`|e z)bt6~{~c5b4Hw77G44NRhOj}XbFxh5+PpiqXOPAvXhkCQj)c3am6}9C~8LNjnLUr{HxhAY2J6{1`8!% zG;PQ5Jp;n6eT#}Qoman?a=agJV1WZ62Hajt3PsI23v3{)r~dW>rmK|u@mqv?U-zU< z;F!YUt|ji;qhrNOGE{oIJEB+ui6KB(jYw8*39rPg_Go&caC851SQ}AtQqTEt9JJTH zFQnpl$>8?xU7joSNU4a^~|p7YWl(puTrg)&Pgk@x`v0Br>8Y?7(3u1GU8W)`Qjr> z46I)U{@ll0$q?Hk$%G946xs;CflA}ZYR`5IvTT_8W`|lm3TPO3Dwh@a-5b~DX(kAJ zr#>vEm^;+SXeA}r+nOqAU(~ADG-#>nR)h>So^n62PVV(A^skg~`-_xhO*_jT9XC7I znpNcJX{(ePELBL-V5U{$!n7-H$9fBDCnomu^xb&XY@9(h{~*2O3Lqb-`ajm*m zWiX`5{8c}vO|HR4tuYPJ_JdqkdN}IdFL5=%vKYrBh&w+2Sju_&(ADK&HBWu5{Qz*U zT6PtQOBA=+N}hx_XMM`*=z|;Q?lChB5Xr(2$(TB@XS6D=jU+rlm&3Oc~K z_HgT$^eZ8QOmZtl6vLf*FQ1fy&alU4?@kr}ExB`@8(jA85YHxyHbR6Y*WqT+{z3_S zL26sG`%eFxmG%|=G{=k`0vs2{yw?`~AVfiY0LLsmgt-9xP*liHX6*)Vynpk^ZB_{IXgm;JW(;*lUu4CExM6E8{D=8ExJFPK2qQK^~qJXct0P_41oin z+NQSV+y^1%D;6Xjg`x^`>?*{^Rvt^5nwcpZJ(t<|X*W<$7mmpZS~}*!;=G#gf7)v4!82?XtA99zfsjH<{G!=$Y;zM!=W=YB#qb7|c1nPy2rQi5e53J;b;V<842FX`P^W!iX}PRG24qX+1N zXdJfb3r*L7Zt0QXo35wJLC)8|4^*RuDOqZ4#CzS`QBNl>cKm<#uc74`CET1pxb*g?6D=Lk z=Copqv^<~c8?5;|2;XX-Lh`e;?2pIuXR;ES-9a6T1^P(AUj^#6AJ$c(=hLB8 zZj0sMpeTG+@2={BHdTks3heo-6PrS<#{-A0^5c!FcU%_UeJn{F2HziN3(=>yM3}}W 
zcc1Quigm&4uc+7#M9j|B#85Y&H^qYF?@aE+bis|~BUA`aFzoSl~cZMY&o?Y)vGhKqdejn5kw&v%(I%e}I zRv5#jd(jhagPv{c7!@r~+VwUNNFM53XjbZ{`^vLcA9onWffwq)Ole_b&y6SFtCl9X zP&;hihc?~6UBS{1-v`dZqg}tZ4Zi6TL%T~RE%9ZLDS+~NEg{Z@tEk?$WxDGN9kmG> zZJ1K3{kj6ShurjAneODwT4iKo?=>**OOsgxXGe+v=YAf$+XDFl>xZ$>DG-x(oq+)j zQ}LGcL3|x3p{5yJXTJD-$%fxeqwvCBPg=W6T|_?J(WhLTLR$sN1<%LVMBF5Dlt8#y zVC8`!JugiVbaacUj<2+U4$tB86PaG}g13k%oWW-3E^NMFS(ao?A(n^_%I4PY+G|X}{2BKps^+pT0wglMlh! zPIlk!kR6>$V%wH68^0BH5nSunJL#}IXlf}ZU7ndths*6=I6+@Gpd92npH>{dBFOmw z|JYM*Hx&M8+}R6h#V^zA$_KZ(#b(iSG{%E(|NQ!WZLah1^|j1x;0h8fZLPs%4Dg`s zFYif^Qtvz#jDlOCVX~$c<8fYbI=BX3291grw7DGr{i%y=ibj@j+eW_ksrYLK#qrtB z4YoE52T$}w?nujnkbbp`>8Q-kWkA=RagF3nHRqH#yK2Xv+-HT9A8)Enb(HzJNA=yq z>m92-o=mrjLcml{LnNfO#v7M%zu%T5?^?ufv1#AURS39{1mSWLIL#pw3Eo^r?c7%6 z_omJhd6xWk*qZdG@YHtZR_S61{sW$RLDg;-@IMbvCxZH^Ju2;XoC`RS=jl85@NeM13G6!T&zVrs=nPE#R}^Uwd#60v3p}zS%x%gZlSGX?{FO|ZyLh5w^hm&w%^=Xa48qAJ+8}D8YiATb9f_9HYbnE;eJrT~m z*Fg9mJi%pU$J;{*9PI4$40+(aRv7rJ0`?`dW~m2Tz>+W?H+w=Z9PZ9u!;#<4 zr??P`fd9zXsV!(lcW{5xGlbG{w>mW-h_#P4M>qEaB7CgO^R~RHNa%F0egh57`r(Fj zQo8V_3nwdled2mbj9HMNiP=TE;V81sN79xQPzT-OdZm9=DBgC4G-fvA zA@W!Pu2drR3s2whG|9nzJQjCMAolvLagp@wc(i3xF?&sHiw66qc=b8f>n9GramqRS1%x8}5$&2;o$JaDT4z-k!2i-d!~I z;m@{uNIbK99vV6-AuY0%ZElcjVljX5rM>{>oZoR#jxf=&@O-)pnFA?={8G6$J z9hQZlX%-i0l~f_kjS$;tUvGlW4&1?Oz*4+Givv+kHreeE5nrC4-OCvIeLMIc4^MqP zN($l?B4*As>$;7Ql;nm=XdlOb4Z}Ufbmj3;0>FeleZ-`*m5JHeH|^S0p|~6J#Q@=A zXdlxpbkMGE93Wevplq6IJFd?B2QOVoFR*JCSV6SWiQKGcCKaUhqQ!Q-cQk$!$a z?~>>DdIT6b{lQYX$>l`WJvyY(iA#7J9W=ApV+jfuM9mm$J%a~tTIZNkme0F=5g(=Z zb5b@19tleo?JRY6OZB&OV8mUtTRNfJZ6A*21Y5l$8ZQYH4ZB%exDqlvH&BOpeVc^Y z*+FH;sX2l6gYEK)`t@=uoi?$O?$aO8p4%=@X>zpbKDi%~2G>llLM}+5-U9)$V*_ha z*t#6EU=pu}#BLfyMIU+Vd!<44?^{^Bbo@qu@l*5Ect}IqS>r|vMjc7WGeqqXly|?k zZrC9%Wk-}F{dq(0mWOHeQ;K*1rn;Pf2l3+Nw#*LB@3o0~gy8`}&X<<{fbJGv?voO3 zMkk5j-6&OYF!qwy4>h^8;r%5%A(?`@d(-O!u6;u^_h`DOfpwMa97KL|aWn7ZT!JXB+ITjbi5?jDHBcETjT)u<*9ncj+qd-Ws zKT%rwmPjy*Sn`l>?Qrm^^R)&L#VH5oz$d;U5^i-(HI9x>{D^3(}$TlAoz 
z%tqI)xk?IYyb84fu?67lF}&8&wusf;PsU5=j?Joq!nGR(hYnXt-Ch0WIb*2zUMG9# z+8`2-+wO7-n({S}D%oD)tWa>sdueJv)|cmf&%!kj8b^$aM?*iP%GvX9m7Ags3a?(U z7W~Ba{dC(R1ADDRm>*;K)P-=ZrEJJ`m07p2dsG)i56Tg?79~{Upp|i^t1ED*7@i>2elHsBn_9%4J zy}xGbnvN@Dz87&T!m*lh?&DE4D!wQQeK1{yDm=Op?UgDiDb1IgDum6J=`Y_EbQ%Ek zZU#hH){QvQ1#=jy!h%zPDUY~*@=>Vzon>JyV^{hj#3^}7(1UJ;kZ0*nJf>T}-IvJW zp)40;miFo#UtBop2Be#!^$ZvKie_rMbC(q6F}ceK`!Elw zbSi?6kT2v9Ow0_Ps-4Pa_qoy05{EeCEJMk~sng(Wcf4UL`iJ6bLJteF&hh*TnN#$1 z{dp7p=iP|CNXP(6c=RBpFuTLLYdV31uu;51OiFgfSVYxbx&bLq&uP>4TtlgD!-4jT zJ;AQ(PJWqAQ`*`qTS8}JY!xIXYgHBQO``B+$AD<~NAP76^oDXJDhFiF?A)HtM0wEamWdm`aNb0`AEogFzt@{wn$_SsJgxHtW(?t( zt*eqtGFj`Ye%*p;(26`xT`R6;gx9tx?z7rvt1Zs#rNa2z)?v*SC2LPndwGG?>ZbC~ z6{-foj4QZBMiReblQ+)p!N?h7^O-c01r&_ru2SBwO};=`kJsAu!-?6-p{cB-I<*XbqJExQ@$TF*fS_f3!mfjPT`U#x zv79|b(YUi%93WeJB&LsmWMOl&-ukq~qM}@3<8~B#ErNN0#xWf(#oO~^<&IKh;q;%J zy0amuaH|9`FXPDYL7k7kPzZVxlYe5(U|lJ}-TiRCK2UXbu1nTZuIEIDtTqV|C<6rq z!d`m8nh^nbFHgj^FSgSxUN@vAlMBnUi=&i}IO&1crIlY)+rS762cM}AzC`E>&E(Zs z2Y8~V-0I4`+!Et<)Eggb#-_Xk)3?E75isfggwL0>N&-{>64YGH$~-YQ33|evF-_GM zA8P9!8ya}8Kf!`o87s10lanR~JZtsKmqRaGHZ$5o3}M1pUWX6*-ok0@jtdTt;8Wc@ zrga6-t;Sk!mT8DGF8vTKQ3`sv_0Ui{PJ4F_KTf$YQXk6EqT=Q9XdO+tiOF`{H~O+B zcR-z*eEBGDL^tHsys)?~ibZ1siE^H`L+}jzSo(6vv&O3B#nGrm4{kc5zmn3dWnMsf zD^|hjbVYOf2N4f)gG+D7CnUHm7F2lD&7{H26a5j^2O(+dan-HtKn z^I=tp@sqibJC6eVv6Ri)$cXLr5y0_lg4G9xFE;sY@tzU2`!fv>H~ZHjsWR<1-!wik z)gi-HLah!IJlMJA^4rU4c^a~t%hC}66Us;>LlqhPc2F#~kgOsBv*qcr0B?)p=dFm0 zb`hcsJ<@qm4Y|C$_Fm|SMvHto^*K?GQ4>j9zgE~MFGc8PokI3Q%?+JA^k|Q9IbWxxdYHF*-2JHVF0pK| z@kuA7X3n^^REVk&#rJ?;A>ubme;>|pI9>-Nh;tFgVN5(9dI(E68JtJGF?iY^=CiTE zM9>*(ld5m1b4tJK`7ul5a@wT&?QE`n#BZ)#Z`~GXm1VzE(4}zh@jcHsJJn(JEDLxX zE%m0jyvSRzc(zCWa&nEmcFH%~usGUt>>97G|O+`?33<$WI^Zyi}!T;#Wh3JT`;B^R|~$ zcVxNbPdndT$>X_Y7(s2^xwW7iptuWe6xlWUqK1}Us@)tFQa8?7% zVTtND%)9Z7Y_bCkhIwY}WJgH}2}+1pw2f7VbrB(dt(GrKZ-D4YRzzkL)X!}dvK~n3 z8f*4U{2quXO?Z$y(T6g6&yD$j*9;_tKRzY3xyeBW3RdDbw!>96>csWdhzT1mGRNRB zoi~J=oQn!vpBj&+kq+^)K4%b^7hV9}P@r)XUYIC%Hl* 
z_|`~Dzn?N23fc`RgtXL}%OPoN>w0ksnvJ}aD`}am%EZfkG8;M4&{TKk+1Sf3%i{a; zVs*QaDg+{??Ff8WEbikO^X@SBtgo~(nsn`4#qL^koIbU&)4j1c`eSQ?HizzIUC5km zs@%Zs2(>w!BZ_rbM`hFlD;a&*KQn%WiN}uK)QSFIIUAt=_o*^Y|Kv8+Tx^kbQdWl9K!k}65&zncj`4(al#nvxoF(giv^C~g>7^Yg#Jve(6TkLsJ11(rT5~CK zZ`Ia6f4;Hjs#9+hvfYuAZw=JjsIGc`-U|mAksKnt5F+cuuwM=sOc-AQ3ormJ4Tvq$Gc+(88KP#Lw~@!e_Ms=JMpH>e$p-PYe%%Xq6M~%NGfs7G*<=aQQoAnr7v6k&EuV{o>tg@e z*5In~8lat(1QX+^OHk(Uu%zVT1zD#OLa$_t-1gefgg0p&JFigKQk82)H9bSCowE2L z<6?)yEM?mDsa3%?LwExuWi|1q>cOXjoxPZE3o|b=(Plk?E zl$=nT6|f)k#)pZ0ob%)^TkZpoLD*^8BHr~-SWK5l7|9eM}^X z{bjyJ^o%tv3k}fr9O2vv{_*ZAn+HJGOpM{+)3zeRqR`QqE25s3aHnnddP$m@9!Dr3 z)M?`JaX0XTRpk^GA^kUPdxn$Ydjd@@_&0(yiG4&*P`x6%ayq|ed3~N+g;JDzmf+0} z)T0(@dXs(?k&@sNGMZ;L0#0MAmZ!|MGg-itUKb7WNpRY`SfRVMatITGTj-u;mc)!u z3#RLRP<+|<{`g|U*!z#(z8ebHqgBX(iyxS|wggvPb z3)S19ge5lD7^Pl-bzM=xoMEh3~fY;9Cv8U&%imWTAp-6LGhtkFt2Pbllkwu z)A0D9>)dKiuuh_du#Fa9f8u%L_{$sC&c2&Nq!1? zI`hvRhjBB73tsXzrexciFOS=wo-r~_cR9M`A_Ef8V&tKCW679In8(b(*|PLAdWYR$ z;0vpoSYitK&2E>2)AH;wLdATw3d8aEuQJIDsg^Ln1m2#|ND#4|#AZh$nEK%Dy<2w` z0sriXk8_~x%+Os-L3Fir*51NM@^+jP|6O3gmJ(zmEYF*~geW8yqD&5MQQO`*F&L?UchZ+UU8AWM z)WawAzJ$fC>kCNoo9I3p>;`cikQF--u|Zn)M+;_UQA`yK@@sw*!mGqMLQe~k%K{U*v>I_c0|KXLKzyQY)H}A?K zl_q=WJdripoQY2JGJGIO4dPur`(Q@4_@=F+wz$>(!b}jO$6@^GdM$KD86XFFj^c<_cncsER(Pd*9Dm1wW=FnVLUxlFFq2Y zjR0D;r>4@5I=0ql`C6RJj0wd>-ekI3$nBRzMVx+cW7cHT6EWLQ1~>jeT!%~HCTj5v zC$TIoGc+@wYjOr$6h&qVfxXWx+q4$K&(nCSU3cr7T+8!T8)=($vgtxKn@xXN_Tdz( zI0NiW9)J{{OAJe1xktS3;7-YGZ3rlD`d}R_tmJ;{gzlzDL%4YqSpb4-{FXR|kbeh@31p`Eg42evKU-mfd>wSKW-1=wEi| zkDaM(KnN@ii&QnLDgIY@IO<VgLYL(Ny+n8RHE zbrHCJQWy)?Y;403S1C5`i;6OeYH=EKURUZc)mzw6ceHLy#pat`578IR_mFeE-2WU! 
zn&E3PLQ!B#A5@(GUUl2U)RHX>`|tj|X6+VQ$}&AGJt-k-D#3!B|2aSuCmDb8_Ws1O zXj;h*gX7gd=l;8|t~Locf$cw;d!mkZM^bg0fygO;bK6MXRj6~RsuiCapcat-XOFP; z(J2o-E+Fhg%Sz%~jkd6gSkGC9J#O(}Y)F`xK-s@E;*lAIYvZ_3O^0c$UZMZo1xy{_ z<$_vc<_JYNchQAxPQsN%P|oNz?N}#=?@X%_#Zm3 zE#5b|HKbwO3~!M0ZFf5kKVk2$Y+gd5l+RPH`raUyDn1}Uv1L-(Md8SX@hb1Vkq}}qC2;8J(TGG6xHe7pEx z{#vlht$V*!nYxA7Q&i&yVqE}5261Dg zo<^ruo>n60#*}B#{@nl1Lz&;1j$X=sf6>_=!^QMc5up&@iO&Z}8Eb-!_2n}LqZ=t> zNO-PMQi!}JKSlq`d+qO2YW}^?kFMl?A=sl_`+YpJj_Aycs&6lQaG6iih5ID}xXcrB zfE-o>=vf%_@30uR`=bqLKLN9Vu-iWg!TmX;W#5X~LV$!*7%io+{M2N>`EL_|BNgTH zCb_e|yIP?G#DEc)@P{Y@pb}$=4$9sL^m}Ib=V%mdX|7yMLPuSVF zgr58_y;EQm^hpZeAsx_LPL{r)lw{)g8B)pvk`_e9wXb=ZsjcG2-=a;h{-g{hOKo?| zevvYI8`0*O5@W0RDut}f4Nz%h9UU8dsTGoVmV8>;U^f@7VgBu_gGJ+cWgeL04@0+Q z@%9dnOw=M4U}N(hZKz6`Gy5!M=0;DM>LuWBvx+8iz36j?L?#!Tf8NhtEusHvzW>g|Byy~gTC=hrvStWBusQ((FsHy9W)TSo~;`qG^ zmc>)y-`xn;J8jm8j+Fm}&BPU{ssMvW*!L8^wS+V0VmqeMJd@O@jShPvVQ3 z6x*9Y6q2YmsRAw(n%eXP#P>r&Jm_QN*a4bh!IP^T18TS|>?_`z zhF{gt5xc8CwMe-2)5En|zs4-itr`xpM-ZlmC7{|}lU%|$U{~@(!t*Y7dM07+N{j>> z&POX|qz>yt9~7z@It8xA-5T5-M<(xt8tHs^J{0Dk3L|h--k5Wk9ln*`p7tt#*Y52u zxc}w_liEdc6!x;1CC@M3d=j*o%w_3V>8h!We93ZC_#i*t_~`U5cKE2#`mz`0Wa6{B za?ztrD?&_F59#7NxK*b9Es`5byhSnAWB+z6(o|G&!n3^Z7?==hWU^4~$4>Rv9Tx@L z@7uUIUrBxuWUFB0sYjwf5tVJ5^gri$Uq?h{HCVJP1@HCGN7qEE0b{$8ZyYo4Hz{u^B*YEuS(xii*905bUrr@Q>*BdO5BS&QN*t zVZ}K+GYHSeMuY?OO=f5k0uVTyt$J-pO-B=3WS5*bCW3-+x)qUuNS(nmd6MPXcD-FL zBK)I_q|SDaJ9hlS1+~7y#J$^h6kcR@l8P*P9LRmz-Ew3&aA)BLM|Bj-L_#Ox`=&xd z?hLFx&wC-O@LRfcZ`-&4tGf0nd${b>-!mn@82$`ked`faDP-#J@&3t>XWtg-nII7D ztKzKXyEeBk-9RK7`sT)YmG0NnVWPM+7q5YmKlMS`r@{;6?~OV!664I>FATbozU<8h zihV5jY)=5V1aQK1XbN6$&SONHYc{R}qUf4pWJs~G(Z_-X*_;NkTw>jbQ#~79#Z`l( z7KexBoo*fCKY@Q|`MZl9^I?G_R6aG?=j_eH z3^YOk425II*#iyZXqZ|)eZ*)eBM(%|7Uf+I%&isAFE)CMQP8o+?2I?Cobjj@3rz;p?HSk>(;t<>S;iK}Y&h zO)G#h@f0jI^*tzO5aTD6v%lE;H& z^16n{|BR=<#{3DN{C2v@;1RQGy>IwdZuz6vORa!okL#AF$!m^;t>7EFgCChpKj#=V zI^R>G_tK=13V=QnhBE%4HAgxRyqj~-#k2R1nrPpb;k!8RLeyD+4b9b*-Rh-g0Q75m 
z;)F&i@{f|Av%|2W>Obn$d{l`2^5xM8kBYK@Y0r|-zcEB(vs+V-&?`x^+%VAJS(U>M1bE12Z|F-tRs3S!>13ghr$x8KI(=1j!(Fu!w=KQHmv%f1Ax8D z<+Zg80<=qF?C6^Jkwisqv|c0OdKRQn-?|y?J_p`m`4Ay@#%IWfVuy#vpAR>F@T!w+f?)D|35#z+)j3J6AQVDJ}43rvk!+j{ANKj?1eaGM@ z)(xFR4+^3>8Purc5dE$h8Og}ANr{j45-k7oRDFUsFo#r3@yqMzZ>45?TYH(InzF1e z$yR2Py(~m3TfS7`Qi50hq-vn-*QZBrcpa~KCK7`fmWFZSG|no{{m13X_3byz#~4Fg zWA2;`z!%0rZeR=v*bKC~0flE@Ac0Td7aGhQ%HHPi3&9dg4jw`9V!fgrJF;Y-Z+3Pc zO0p-V_@_YiSm*2R-mu*>9zvv){VP`Un~SpDYBLtXaMb(X8Ew{fw@oc9AH=h}mBNRQ z8Kf8y9V5_Bl8b#;bnw-=Y^$Q6h%Hq(j?#Jxq6xxqApI_*u3W zal~!7F03I2p>0o50G0T&Wy-73bx-aa{$Kjo*(BHKM!4_eMSQF_iQiQU$5yM|PCy*I z7W&DFCT2v80>gft4{3EnYLOB9kf#P*^9*qD=-%N>Zs!;PhEOGK{X2tW<&~_~zwv+3 zF`zmdQjmW&D48A@Jd26PgZ)7vp|Cr#XxX<{F|hmNC*l)kDMmJ@uPT`$9}2%28$>8= zn9u~p5=}`F_#8E&A0^rcu}V#I>n?=-@YWBKTrTvKCf?M<4A0>jBkmseg0vsX;kif$#E~MPSoGj>D*h(BWpr#Soz@w+&sjF<;RU#T}_jftp9ph=g zOnA}@$SkX7s*~X}BD+U8y+Yxn^WThoL+3h+=}(93VlHL$%|pQ0BJz9)q)I=DA78`e;HI&pnDq@s3~e9jFYsa{YAVQ`?5mGWz8U#`&ALLqXI zC-wFB0YAaeN5w))F!89~8sFFu>w$^-VmX&bk=3&+49AJ8NzPrIZiaZ*hSV%h#+RZF`V6Vw8snE_d)w@m z#ha9l5+3zses2)N-{|XC3e`_dO!AZ_FhiSm6(`)Uxq|r03-8_08FVC3evjg>>>Bkk zrGA-XWzX&5ysB0FH1QQnbkX3q$pdwFnZ%ZS6uML(`G}0Dxc_u$iav+~NG8%j&RMmW z;@&4MuGYz!&o|Uj5ICN#6!nW%)Wv}FN(xupBF4gqusZ4`6?de4K8^ZUDt+`C7L@~S zMKu5#+~6(-v|+mk5tMG-{*%9XS4e8>IH3;5-Tc=u%Px z4_X%#2qLGl4+1+;WhmvLTm*-n2a76vCXWoy{2m=S3=TX33lmC1Ni9x<)ob-(-h_K)w_ ze|P_x<5+8YX05L3uCD5;^IVorE!qmCZaqk*Q-M*yHb5i7`=UF#5IrI~#r82s&gLR8 zCxc^iL@p=+%BR;^a8C zIMZ_kAN!`l_vRhRhK2LLv7Cw&>&wLZF3OHqV-slI1QOV<6A$K2v6rxpYv-d&oYOII z@Rf_-XL@h{_vXyw;AqpHQlBF#yt4Q+%*! 
zr*g1uy-_r;o)Q0vBl)m|R}P{J3(XBbhb~X}FKY(^30bR!-i~t2g_hUW=zyk_Om_&M zd=5f*{)k&u&(PQ=?&@OdAKp`U%te6!%K$_1SK}d@Bo&XHwh^%$A15PC3Z&USx=5-B z;$>zt!TM(7hJ|pXY97kA4a%T_Ijn#)Z3OAlxG4_ZDVO{|n1#=ZCl49=+ITfi+?k?V zo^U=xr(Z^E@WU6m%4{DEyJo%`Bug$P7*9>hCbPC_z7JgT;%z;~XoL7`5~>|`WL70L z7k0SFUsqhLGJz0XRBsybb4O$t5;GdAw1Fa!_FDmEyX$vh=P!JD2Qg!+UJV7)}}A2WV0h(qGm` zEHS*9ik~%=Z*O-D;BQ~38;oGUBYUD`caHuom{Mb|m+-(~wzr~B?Ihi)A-HMu07~7% z;GGv6<p$)?NEZ5%E4fGAvgBA+s0tXka1vX6hgYQ2zYN zN_qItH18>{C@n${-I%l7uNw%i;4nRJPXYBr7AlzP)kYU}-Pq|e_y98H;1&nMt7 z4>CReDz-J)WkfVWRW(0!yhzd1P%FwvJ@QwAe8@0QI5_!f@=k6*X_ki;BX1M*z0a8K zJ97R{(!yE~Y%_T~elp)FWkG!M%Pl{gxtld?T!}I<#^s?0YhK>DjG!3hYw9z5S!y`i z($x&+;#dX|3CYXmxrms#-)Hjc@TdLe$_D-a%o zvlgL{w56(DxEW__$gra#u5SCu`W`TlUt^lrhplFhRweS4UBe~e$VN2g`Ddh}*~#Hu z?G8B?+7HnVNQwBrW7OR?9#)*eq#vtJ@XH$cEuqE;>y@g351tT#k!F#UWO{BD>!Hcp z!0B;9xo>FR9jQSiAYryr=aL&Psz%`@G`=CFwcOvyrZb8l7+a00!IY(cQAT2kE3Gw6 zLT|jgw;bx?@1~>&%QKy9P!^nU`|k-j5AXOjCT`BV3y7qsLs`V5($$isqbuzX>X=J< zto~z2<=mp5F#A=J@K)ZE71|IpE3K-HPr^lY{lVwzddV;G>Nt_o>y{ZRPYS`=@A0#E z6r9bK-53r`fHXPBc$qj6ih+7#x@4yfJK48*T<~$}Les_wH2mX~`m7Px!};*m*=v_CY=?(S?m97N2GUPMiTS7%*vv<}mOTmcDp+l(P`|WTjd!OJxA5J+3 zm@hW`2@F*qP}>cV%NeD7W+aUdum|0pm6U1D9j!-Tma{L=3OhP#Y%iL8Uc+-IxrzWy zmQUvR&5Cd(mz#Bx5+n;(dK(<1Mh!r9AItJG_!e%qol?vV%yES<>j`9y%)^%$-Tt>4 z_c%Mu*?#gJNK(seHRZCprivPyB}I<;f<(LKVp`Cmk$raGgEcC6Z@YH?M#J*Qc@x%W zg2!ZuMwpRG6e|YZZ4{_=k{1*zB-4FJH{9OG%LH4U5mS6Fc3bM-RHi0929ggnLitRt z-NIh0!4P`F5;LTL=88B!{E8<3np8i-xnyO$6=(ujL2-oXtR^I#jh%O&sjTHn4?6do zkN?V-t)j?i87`uR8cLrKvptnuVAZ#l$R8BGWM?g+nDy4^{BxMa@Om z!peG}f1Fb;Zo$ZeEfvO1Z{;UWnmYmaX!6Sk1L-6iQleNWgKijD;Cpqa4fz7l-UWT{ zKfbLkx%p67&&?}|@VMJ^H~cdhP0qL`yl_uK+?K#zJXl&7Q;~y7d)Y9GhAfNRvk!RD zOKlyAe^uWF_#-o(i~RkMhd5>AzVg9W31v6IJ}l#?FBqDwK7qBAp5`ao`u)a6B@=Pt zhG9{${KXhNibO?&Pv@4O4y$HHWHgz=JERe12sI7!1aOqUSIP&Imxb@E{ota?D?ks% zj!^lv>d!qER+@3Nxm%3lug)|uhi!T`Rk5N=thj*M(Hex07Aahz8xztX#yo>`InX|I zhREqgG(&4Z_{XY)h9GBwp6HQZT7iNqCa7pNCxv(&50x2yU87c1Ae=x-nAtd?iTco= 
zWiz$@S1OGcV_nC499A*{9nde+$7+tODA~t-*W=efA?4QIoJo!zu;IBGv}$^fh)!D& z*eorm0{7`G95QrboDALCJhJrYg+8ADhVF5Q&|6I%E zP9ry~cqZu;Z~SLvAQ<~oq~ScYlWCMcJKfCA=M<%#3CxJ1o+m$)p;3umaRzh*P5wAM zP2NbeBpeQwfR=_#8ty4wL5_KjsCpj4Z;@jN`OP>grVonY{X_#_LSs=)ED3l67h7nt zOqaS4bT5cj_YT*kQdNCZ3usHvoXea);K{K@o5|yG*_U>wS#_+HVF|HwIX$?MyJ8F* zz1z6aC614Ut4uU)C#Wm55-kd3=ZH~8JEGlOvZ=QUn_s?(B0#-;dZ!bh_4nx*k?ku zl~%>M03D9pmvXpO$0nQDeZ(%raXD5vEEUhd7%%Ov2A53e#^1a`n~46-R1-LofH&Dm z4XmM9VBblO3^sU_kY13BD56WiZ8iCvHWjb`wMnusK})PTR(XoHwp&G&wLc}q+cD7a zk0YfJ1N1?wNn~2wTx-$>>TZ)e>tL3n`tgt`Z(;OKbh`15g#j5Qy@EatrHdNtI5sDP zvy_!9T~SAb7K38wL=+qfm(apmZxusCJGzvYAP32&nUv^^AyVw90i^}ru&tq~;Oh>z zKG#ba3WjTX(w{%W!?|#n<7l3r$(sNpzC1&1nY$d+BHEZ*>S>foFX9K$7bXcRkR5gu zVw#Q5ZPUZVI+p@930q%*1^}}s=e?v4)iSSMpoT%gD&-**A<}sbfGs89In>vDsO&w7gMJ3aNWde1%C*AiZu?QIf zk3v3`%}8;FxW&K~%zSSPPUjacZ$i!l%X6;T{xBl71DK?|z?oKlE%|E;EQN7_K);0b zmETSo3JL~6pJqJL+7pz%*=91SwG@S!+97ZrLuo*HIRAcXEPNj6ZnxEbg=*zyeETDD z;Vl%XLgJ+yj;7B{c_horjNF{rt3|3}k(tv(1#>LNbvT9I^|ILdMH71(6}vJ^tL^uF z+=uX~IA!Wp5ca&o_i`Ap4%aUaY3N$at1BrS7op!OnB2Nw83^u(zTjPw0VVj{moYpx z5pOGk6FL3o2rx=|UfBMo<7@t=$oy&f4qiyN{o7t03 z&yWv8(ac@s^Rz5Gpj)L-Lw-`(=`Jn(wqSi7Ull!}j*sm-RivS~off93eypL-(9ou2 zJ~7TCM9YR|>!z1R@kBvJbH_DBIaYspFUBATA1($Dej>J~6V7Lc~kV{-KBBo-Q+-=0h$H zmGufyOTgY$URm)9*OmKUBP#)uLDktNRs6V8*uk=vHM&SCVk2S_=al-zPOKtY+Swr$ z4s0!-a1FA?JV%crFu|~EZ3iXqaK0o=hDw?i^{Cp1MTAb~3^GUx zI0ZNvo`|zLoRxlq;($#MtpDvS`^5KPHj2;{-9LcC>H;z_O%9(QFYs0*7jR`0r0@q$ zUxO`jy@%?p!_X1`fT$&A`l?-V3TLU>Y#7&T5-J3K3t$0?b}~iCNe%@{C1n{()F~6d z=?ksPbnL`r2yd&x@Vsky%bet~FpotV1i^*WWIsa5Ww8zoKN%Nblo21g(sU7zIGj^5 ztBNOnBu;@D8V15jSJsExYv(^Ag$5*WkqOo=3G?BK*ea)0E<{WzUnjX2L<|qk=q|T$ zA`3dy*OSsW(_@M>}Xkz?{;6jXmmQnm=j!B6__)YFE%f#t$|8?hJ}7=^OR3iN)E%nZ66{nZ)tH?*A4jBj;+B-9V>a#n@w9yb>dS-go$|`X$n-6YkJce$zosM)gkJp{!r^(_9+21fw?dJmM=j`+Min6d!Yq%nv zZ}7_r%jiGQa{stzEh)CkolzUQYtcR0JYLpA41#;xgStI{GwaRsz@Hz`{V7f<8w02EWyvL4nnq?*ssu`$vp(kk+AT9 zxomgmLKS>AcNL_rDWCz8V8rihCU2cA+WXyg@gk`}^Y>RQ%7&kSrc}v&zZb(jIj0*P 
zSjB&=*JP^8WdP0i@O!=rptN(gCL>a>t~6+rj<7KHvNUM6D653-_yX^SSpv)zDC(L!c~#lQw2Yi zU~pD*KySJVcamHi{2`Z~;33~QNH>B;SiLoizaq}qpprtTShF%FH~+>KzbJ^}`3NqQ z&Ul%?A-qIVJ4cR5_@1DOksEHY`XQf8fNWvWn++>Ir-J6(R8FOEKAZ<-KUI)`l|}i- zO#a^tM?=KS1M}gy&}KUS_rX=bWXh<38=Qv{|djAM>&r~usN(v3vf%6@ruGpFzJDsgrhJF66&i?uR=|(2Hxil zck8P+_pcYz4U78EEiXA6*WNiBUiN7S{D#kd>stg=`WTVhtAI1DIOndL&4DJkvgH*y|D^um#z%NzO(ikvlr$_4c}yfMxY(_en_ICWnz<&gZlml&f5 zH)JZbELis9?L#7C!vInJOmP@`35zi!jN!8V*WdT>7k&Q>Fj23J#nkt!H(Evh7hlf) z7a_RxgyLR_S_5*~tld;k^Ibx2>&O1YN!xbOEDxH)tD19{t?y{WM|{IkJ33+0sz`jj zfV` zn~R?ubh%)zXf8FXhr!3ArjKI!7I)qEZ8A6fO8bsBsBk{dsnyWnmNp4W22_fUDYMK7 zDT&7o6JT9`9qi27L`t%YKk=(zFxcmkLHGZ`RV!~zTU-DAlz^mdIk_ZxV)HZHF8`YF zx#(xv+;_x@YW{cJR#UV!nc&Tck`3R%SNdlk+zX+%C*tU9&*+}7<3ewjalecqx7CYP zdM2OO5xE`K-j1#Av{aol*Wg!!+ha9QVB)pfYPrF+d^x+Gl%9yz5Xg69RsNA z6!r2)kqxodHXf}OYS+CU^L;uBduU>|&GsRcB^WZpR9aB=(9L|+P&_rb!uTc|w8tDm zJKQAdq%rw%!!jZPA<&P)SH78==S?$KQ}&cfn2OLEPMi`0NvI!bVFWP{N&Hyn>GPu` zI^pZkwqvq=|0Gl@Eg$_Iji05)9nJZeikcLMBQwJ;wfrW39iB5f`eQXA37UO<=cFKv zJ@gD50*ps-!ms}R%Hr%pT@^yJQ-<~AL_K3uvOHxj*+y)s4H=3az~xM1j?DfmlxB?H3kCzI4&XUuKDZK3_hbuNYKr*YwEvSBJ{uv|JY9 zo{2@@JFA&c6MTLmBQJj_TB{=Y+SyOLow7q!@nRVHnlygM@-ql2cA~DD&@=Db1WbV4 zQa&q>)IQJ}`R`@T7x8S>pMkKWo0Jl6%6sq-su7z0#sUy-C7Szx;m22`}XZK4FcD8aqghVefLl7!F`p|Nu z=t)gnUMh*CT7;3%yLyq-G*no4 zZpSmZQyFz6s{?vuhcZhPg=U*QdnWmL$M5aU(B-b<{9eGpNLUBAS>K#tU|v?Dt_19V}Syf+Z?!*A~D27 ztI#QMP+pmFT?Z|mnR=P$`;t$iPf6|@rH5;hz0P$5s~>=6VNg~uLCCxL_`F{!b!uUu z+|f)uCZ^6>d?iev^)1R-8!Znwlyj9-7|jZ8$Qz`Zp`oVWoBzpNc4x57MBTy`uAN1= zaSmw1<(JT!te@@-Y8N8cPy=UU+LIc2_}1Ah4pD?y_wr%or~os(sM`gdG66f6!EjQ?OjLot-27SuoF9RD=` zH~5ABgBcD51xw)gf8z=J0}*uJPe%W3xuC=GhT^2a*F;hRwnwQ>px}X;`{4Klv!Day z8duL?wwg{v?EY;I@?5^)_nIHJmSB9YBUBAiz~^<3JP+RDu{_fp?63X{@oB893i7Ei zR*l~f5<1`(7(y#4tf)8*iyfK#3#n@~JG)=$?;WT?`=Q%hbCks7lLwKRp!eJ= zg+G#Va0M=|0=Mg8yYwx%@hH1aetc5F{%h4O+Ub!8?qXj09O5gu(+q#28stoL|E10J zB^yk#&qF`BHfT?qCqX?%TC&yE59l)G=5WkH2)?-9a9nkB%Ih8glniS@`t1+;E&VON z+4Hu|nZpjgerUr6FmS|K$bS4kfXq*+Z`w9Tx0BbQ4*D!70bh<%$=hE;(b5II`}U~b 
zApBYDfP~EmG3Olo1_meEYdH*PU=k@o=ViK)jbERA_=6{;8Wg+wGHbMj6>boDafy$! z)3sSru%5N=Ez~lN2KS(dh~Ic}n4pd3@mQ-v0{!VRK$aCb?!G%{sRLhqJB~-0_kMasi5DR;B@lyf5HprcHVp&PoB!#L_`2Li5nv@VdYkdWs zFU8{H@y)Zp6VQN&?ww=B%<#qlq|U2x%XauoaT!N4u*8Py(YxgDSZqgHefMJg!vKjL zQ8Y=|!h+%qv4T6tum6!$^ndmFRn9LuLy#Xpd*wtgXZP&Ug+1+6=e;3iM>DmL)am@X zq!8pmGuaC~Z0TBCRquUK?PJurHxROV_SK`=XXm&RVU_E-e>Ca3d2VEsXu7`y(ru6K z#riR~KI-9eGuR#o@ns0D=&T+ceA9LqWoT#stfPseq>tNs`!5{p`Mn(Mt$+HCJxX%c zk<#b#_Y@-c=9|^51RH=5`^=b5AJxv+2O)#B23f$mC|HtA{uiAZS=b9eSeU~VL)Mlf z)ah6VqwTH~!sQ*|&Fk`K4c7LPGKfaC+`LcS+So z(X-HUKD&b3b!6>4EprD|Io^cE?cyvJnl^tOzXskZX=HBY>3a6Ds3FYRH5hINrGVF| zZN6Z@jvKsNEqCh2MT%z9+~@zTO`B7ERfiU-_Rd)wfu3T1I79e3T$Z3PnlD1N6?f!9 z@LD47i+5eijf?IOVWFqzV3hslbKcmfYj3kTU!&*yNVi={ao^FJ>c8J6_5(;gz06JB z7bBvGxclyn?tk9?-P+&T0Pk$v4|lonJ8mk=W(oH>40WA>f&OQo#=+_T(q{Mvh@CeV z$Kcv^RUdB-JTQAZ-AsVuUS7!|(TH83*C8A1=XK8Z;t%;dZd@O+oQc^-EQkhO>#uUk z6^o;np0B5W`1K}!w$FK^A$II|CK!w%p`s53)Jc{7DfAQD|1FhueLIgDd;N*)BtuYc zA^hvxTVH2i-vz9D=5QBt{HsZjK&x1sIlLTQw63BWTN&QKL;~o zko~`PBe22xnNvEh<#uZcgoDTS25tS1G5EZ%c}AngQXT0$+f8*o)HPDN1^avyNdeUj zKra5= z+|hawJ_hsUACumO8y{WAev?%xU;?8~$|@d`&Qj&VPDoOt(FJABa-XAyLY^&)pxdJ= zNP!PmQnjbS-bcWLZ5|NmGN_Q(y70cY_nVvQ!F<*ER99~Y@d+!(v6TK*OTDi`H!bN- z2GDPQ;z*l24LiJVYPxhoW&wU}bdbcQ@dzrQILV%82SXQiD(HV}r~edTtoD?0JIfU- z>79z?Was2xm0tz$plkE{`vU_Z=JB<}0hGZ2AHTXVAirxFY)kQ0(kazno8|YI4kO7= zwnU$!gK{wur3)QHi&8qn?J3uqg7xUD%S}lI$#gz^hZC2Y5VI+hhsU(@-R8TAN&b_J z8KlzlQ7Odf4PibBa}2+arEg+V#9&4L|6eT#TX$6GI)8zA^#*NB)=!ur5e9XP$D*#| zw=Li5;T~n7=@ql$9CciXu-(wu=C74UyFBwz`i+4IY`Xl`3Vy+0Ol1r#B=E>M$e9=g?G!oY_KvZ(V-UCL{Cuh^%)v_8TRsgXU5N z?!j|BeXomYChoY^muRw~$R$<(ppRyp;Ra$LvW?T(6VOSBgW_k3IOw~mPmcQPdj8y2 z8FQWeHk;6uz5C1}L`1N8{#rK}YBnLMv%XW4$P=@&hB7i zak?0Q-UKA56UTC1Bf-T^_ki80&%7;gbr zW}REQf{MDvo4q^pyNG{{+YkLKg_ZSt2T-zuu{XpNvcHF{aZVB}D(HS?en)i|v9;fE zSJF$0U+~$(c_nj~7=#pncK>|3Z3JHWpkV8LC@KB~>eD7{=sqi0+t65Bx(VqEKqhFt zFF|c;ee8S(a*C~<4gqtxG^Dn>A5SJOxBKy~T%Wj9?LLIE3e|8E--vBfbWm;FNRKeq zqDEx+-h~n?JXd6K2NIzN+@6;@VBtm{>i;ukdv}e+>I| 
zBZon}))uL}X=lJ82h-)%cI9#0^`fx-o>oljE{$#H9jBZ6Sadc!Kks(y#+ay+H^zJm z2O$O2#FG{s8j}$`v=qO3fBAUjl%T{j>->pr4QWd4f3F6)(>evk-TvTlHG#=S9z4#D z-bYfu*R#pI^A%Fg4)A}l^#6IwTt5K6^~b8iHIX66!z9@6&2!YPT6?mF&mA`b@^lY| z(A|2_~w6qkylGsxS6p|{IXqbV21{T z?AN`4$lbta*i9dcg^iGCMF&o<03r=(Spee5;|@1H1UXLFkYtf=%-HH!r~*iJ%`vC3_DB%L{t5 zs{0kG16xNbt`e3={ePs`EfM-~hfJ714T5{`d-jok_rLohVF#%9KL5%Z>i@aRhyLfE z{%r>q-T&vAp`f1d|Nl4#|F63ZRCIJ?>tVh#g2j>wO$a*bDkz||!9W?Zu=_sUkcov( z9k)2|KRrLs+Wh+vc!;!mv;TN{GpcvcQ2u-g-<G!|%mp?~keqU;9ie{5Y|SQrp|_n=BPl?>l| z|EC#t=brv|u--2SgP_9(|GoA8@rwK(f9-!R2n)gf-yC*OQ7+C+&2gNWJgleX+(dXqj?5^m703b|N5XxpdN^mXq!b!4taj_b&#vQq8ok5l$q>SCm~|8|d)l z=n~|=&O~6ft?y$RybS+>aw-5{mHipn&?D%gdQba+rEC9XO@vZ>A-ne0bR0#2tt1XLX0zfz7bVicsB8GNt=?q7-JURP807y#K|L6RNwQncMn^-DINK z=%lq^XKwLCRZ{DJl}ioZyBgNi8+0hQ+!iIq zNIB@9nKpT))0d#Ox4ooftg3PU8e^7lF|hBvT+f@e(f4qNEN7mqL-o7u39CqMHFSrl znJafC+c2!_zx|-PpPTvcoL92$Wj#u7vA*@HDn9DkmAif`2Mu+BP8`!mve30SH+}}y zy%y1ih22Skz>`X0 zrNu_s$pyQVZFD+E@Tw)`W=towd3h90xhw95kK{87DJQzE00S$%6 zk~Pd%W{38{sRA0RBVffw;T%Lb5eDcedw^-f{`Ne6sdI8gcAd<6@=v{Ov&~_-&;@`A z!Zho!b70``{Ws}-$IlZ#PxB*eAt1GbyNte>KpTk%NBtKq$31Mo*ZAAjTU7glvG}c% zI->*c75q<2QsPSlls`3^cZo=H(>8vI&y*44@Fwt#0^fEh{+YPQn5#NAvdzkOQQ5Y1 zhb-4)O{p77Omw9X0A*cbY)0(RP%JyRalVHj@z@{*c(qVtvp>93WkeB{Zc5roVzsv_?1V*DHE{U}1Q146_R+7EE$MiiEnmH723ZZLJisGovmUQo zfP6f*X-TgbMRcb*%&(8{b^QFS{b0S8o^#M=|GEn0m^XQUk>fXB8p@mfBc_@vsPL!e zeFyIPA|o!MK2~Cod%RaPxlZ@6_fWLLXu7StIb_6c_U-c&q|jvAsP8jC`h?CziJ13R zb2r98<#*?Lp4!>t?UmFyR`8c9yLj?n&Xx{U0(uX?*~4PgjIY>#JZ6WMXvlvGZrjo& zqqFH}Uu`O|j<4U#cUi#7!A{7XB~4c&Mf^&rk(G0@Q2T0!^|ndbB9YX!*U)=-wO{mR#UA>$+dp!wI( zq|czQ;M%ZpJL$I%6ODP+rP^wgQHHxw$?EWo9}OZ)85fx~7TR6q$ps9~Bitk^VI%Ax z6x!X}elsKiuE*9B$6s6sRXn4sIPG5V`kp}@&PSmsb7rdB#f3%e7STl}S%~8)Su;x= zCR+VfNKIht@_d|r+DgmX!JI5_y-DT zu$7cs0(SmSl98f-zv1aW%oXwto!KO_*K{2}FZR4+{b7;nRoJcXqO=y8^?V-WKV6hD zD%YCvVknUm>Ji2=M~mYWo*>jUSMd-yw0LB)oQ!R*$OVns`wBF_Uu49eqrC zX1}=Iw^x2_Z@TFoX0@>FNbPWKz4)Emy(Q{CDxke;sH3YKH8PJd17#P{(*qMp@TIDF 
z&K6Sryk2+C6_=Ts?xz$*q`bo#I?Hyz4m9J)>3EytfC68F<2Sdm55DtDr>|osJ-;G# zAAF?D2rGrw7QOr6>Vc1``9CFnc2j?wF4U_W8WN%n+}K zdZbyH$G4GjIZ~fFFL;AnUEcfKjm~1Ud)<5d^ov;;9y}D~`;MEY6yt3S;#1lS()H(3 zz`*J=<iUq?Bvp_oMZ<@>gEp z*0z$}=Y>SH%HCeF5|y@DTSoq(9I5^1gBL5bXyMn!{FrltI{kAT)euq3Heg%hlkqV8 z3V07GHWWEHzJrh1PyN0zZ7=7T7KN01lI26zB%|N$P9^=0y}SG5uAES^5775*H6qc_ z0qA<*U~}DBW_@M}o;7S+2EMJjpXbDzrGtSFgU_jD2qVcToD(GzBqgV>7b;4V76BW- zUL8;GLX&8}|EzwYFGmnr+u9mNKJYrc5lOJ~6PdA-*Hg$DMo=LM+1mF?h zcT=DVMrI1DubM&bVX9mTGvCB#Ud4tv-LGx6_ZpdH`^dWwAvbgV;MBkS{Fu=tk7&eP zJ11JCNgjj_-i*L#O}~*{EWfuzf&im9c^%@|=PKg}!!;g=!7T&s_YWD>Dan4sc%B=u z`7bKwgzIbiPd}_GZoSYd-@YeRH;~~+S{b4jG4_P?=k)E)#{|7j5%3&ClYOs=b*!`y zw0z7e6>)cOy5E_4_@DwF*K#f<5w{YmZ}f2Ydlg6#?GB_luUnB23+Ielp!T6M^_iP$ zoQ3$Fbrtr9sfK?I>GPK+=>)!Xbuw6=<<7@1V4K<%XV56&l~2Ut5ld=J9$3RaoRevNtCdY2*EjU)Wj_wge>iN`@ zAn(j^qoz}cm=C%=_gkCL2i0IY2HKlSPCtF88m1J4i@BHPhCB*?z%Z7Y}xQ7WU(K= zVo^2vn@K;aFZg$l^lh`+cwjk&{m82mD61OS2tEff`an*sHUJ!ITJ%El8<+hXlrzG1 za9lQ|T(l=C@$jI`aS0j+-QvIV%K)XFUZEUetTN%5+=8FG9j#4W=+^e`;KkI9>W*)I zX9Ad+1+z}pH(1CVMt2v|u!YqYEs{fd4vpE|}PDaS}3^?)>_nSk}^TynMj zMYVC+$?;FFEs6O}dhgc24l#=}Z}h4M@~nT{-Nr0Y2{vP3C|d1b_iYNvuUR^g*rgVnimz2XA}Cf5~ri3D7}BP0QAn%1AN*O zJW$TQ92vNu)4JZ zD~UdrkUHdnkwl6atEYD5ues$dyyl#?Dps)=Z~h`Ju9VMNb-KEa($mbF%>fQGUqPy~ z(ISb%3}(y}*qaO%soB#**{HmND;@V=WnQN&7G9GXPQy1Ed^ikZx!s6Mx6IB0nPku@ zuDp*7j3baOghr1S9t$icf6LKvug|K6UQ11?N4h6aXHC;C>ntjA{gw;LwSNL=u4dh5 z?M`oBsLaynyj^sp_{IZT?xM!S4q`0>>h^0j>%qFSR#r!uo!f2Gz{#G$au>g*qrFQKtAGC`)uwMC zNX%&t!)aytpz-H4X+KMY(D%DdSM>~SXGPlvXTL7oRpPcup; zaHGnWvcvDI&+hbYd+QH+n~zL*Yt)tf*1J?VEN*8(KRWwL2e~@mcARDQwDY#nOQ27V zX;cRiJ9cS)EPsIKwi#uMtKHRCGa(t*#gk_e=syLQ%!O98?I@!dm^QMX`z^BV2Arf8 zSWHRYE@+(-S$WgGO`lf$IdOZXWkRhOBvRwGwEiPKatBPWs%?E0I}Uryy{ZM(vyF=;IGwlNhIGdDzM$ zy?q;p_)qprLDaS&=Ip<|k@LgdbV*HMLCu#l4e-OYz5U7J5rhn>zgU@hH$7$>+Zf&e zrw(VpL1M;aM_?nKHIDxQ8PfRb1M7guC%7_s@BK0m>vu)vwLAAbl0(7svze7gHrPLPZBps8q~9M4bfnF6%6%ohg0c+j94ulWxc@eNM0muH(O42cp!qpCjR?w)&XgRh$1#w_-ZN1e@=2bC7o( 
z-dA>;01m8YW|g9;@6qj29~SAI+ARi!o^U39Eo}*Z-m2gHo-NP@T7*-`W|h{ujzH_P z_j& z{2k@@=$qKJ#~2@9&)sQ@V495TEQ_kEcRDScuEwuUXnZz1>mpUe*T z3?)SGhHGDRhe6vdTo0xZ*!bX^o zZnW^QqD1JCNn!NlJ1O`_%dKq{GP+npOt;FU_h|=AGX~s#?mR8&G@C$kGS+%E{Tb07 zM&b&Y=|jq+00=p}m#0}Fwu0uEdGVbuND?UxfIZHBw+Bw04<~Mc8d*Y*%VpF(;bE1Y zzJNXEG3Sz1c5%zqtjnpH76uedA?UIlHM&oG+b`tzz!9~VkKwV3=4T-YcvQZRUHlk> zPONT!XW;Q)Vs#l)3|elvT3ZgYmEN;s9>*T~*&&y@E?JLn5zw|T0}^`M_=5K``HT>k zO)a!23Q?e_Uc24AQ_4%R(z393X=!hbI!}EB_+gkoQ7t31$UgpHto4HjWD1Hg#VkF9 zyU)JW<)O6TtnZuKV-nO%(tr{NS3@Vtv22mdC z`HBy7Z_N4J^IIIoSxl!g2WK{H%r+#5OThOhX^=LNr4XZGk?(?T85(thQ5!dKO7&#&+w1rVmP6T(z|qtP z=ym^TwIaQ%DN>8B$Y;;lhg*XGOb0rOy@|xTHtO_VPchHKR}41I?89Va<15&2|3zsit@W|OhVN-z&2t; z#7yiODX?i_APU7XW|K=5{}7DymIo}i4oR(DcixjvaGhmtKoV1B7Yh~@2c2aAZ0Xa(!J@ zHmg&Xb~yx-&Q-THf9;cNNDbCwt)9vz3Al;g|5ILjiW(a>Jw? z;AOfWWIoO>{}w@kSH!E_Got{$*Pg|sH7$p1!fo{yaoi#Vp1A-wt>|`%=3v;{^E@Bi zr+xdJjxk$r&cS26eNbkcGqI(o1HVW?iM{67 zY{8y5I8KIF8vPTp*4-AdN+(2cK z7wtb0eC;{p#6N%88?W6SVP;-qTknU3O1+G?8G%n%K&z4CEz!Di8=1%wK(b4Hg#Vf< zSAog>BJtqe7+>wL%Uy{tRvZVCVQSvPu?hPwVG$gb0AKXW&Zg5bL}TR8B%%w@Az^tj z;aek$DM8-V!6Mp0M5*rRmNm7mtJ6aNoV`o9?tPb1J|5P|C&RU^Hsuu3#XVy%Q?Mv} zpM)Sz)1zec`YL&(5uBUG0>DH}T&iqLDl)KJ2m17fy6dk#4{KF^WS+n%M+o0Oz0m7U z8U6FCFrpx3Z4io7MU$yLhD+jC8dZXr#r^(%=kpqA`GP6YTfYq{&_m?A_-;Pau;uHW zxX7hL4n%JIli$kk>!?*V(|xfp9<_c{z^2(8iy@)BX*bB7otzJ+(u&L?USVAw|a zO!(HA?a&gEdZz1vSk0oP6i;QLrF7XFhc!4pchi|nRUzv)%bCXsA}H5IDJ>e)<7U=4 z_q!fq<|5L&yc$gZ(CJy7Cq_%hhm57mCOgR38C6`K_*4;x6&QNV!JP(0+M>S7VCMZW z77k*NuMw-)Yq?yXW@f94!UPG`7H!s<4|<(pd)Jk%iOeYpX=v%-g1Y=<&l4p~kWp@z zpxX4AxUU_){a@VuRZv`A)V7ThBnj?rA!y_7?$EfqyKCbR92$3bcZc9E!QCymTX0GK ze!lnptG?QY`)Jo*N2{u<`=G}?=bCF=W85QHl2+imP(XV@ys0!HKw+d$z?cWWmdh9# zR2jAZpN=#Gqy_|{x#*_BlY>*>yc`kFQ8F)uaD95gu-_mlGtVEs^~&p-EV1X%&JIBX zxskZX?&Q|K8#jJefeVBN+#T;Fdmni=&mB;^Ui^zCq87H}&^_Q?o^E2?5P-Vsv7E97 ziDo@Zp4nn|lXmaJ$f+yCG1Dw9uiImCJie+`XougCXjID9u|-%ox1DwIX3v2M)9F!0 zTNLNT#}T)))Q9hFrwPra#GF&f++&0&bl!?!51T_*fKZf}hi62T- zd5_Ij^+#aFSOgx|>EcQiaHk0lPcOLr8vQR3uWhf}en&Ly?bhe4Ek0|d;l|o|b>0ny 
zI&Sz30~f;%Zt+Lgs?FvPH zeN%&NjiUj5q3?m7hV&0h=~?o*I{T=dJeEHzEJdUQ zrwo^|hJxw=L{ht@jK2kRPAVwOigGB;;7311_HrM@#Pq!?SXh-j3B3UoQ~Hqnm`B3GqmcP3_<3D)ATFqwQK+)}LqQa1XD%s+#U%!r_y1 z3!kOphydF&+}CB-9qz5cZ&u@4>wm1O-g8M5_PK^b+O!ey zkPpvSLaAmP{I`4`5qf``yX~3Ow7Bx`QzNjwVH2|+q6J=2$3(^+KRT}sMLR04+-bP) zNlW3!?{WF=C2IpAIpFKoU!Zc;7wp_&%fg?lEvF&sv>91tvZ2U?l6NKv^tq??k;Esl zrEIcMjOJtUkx&23PBH)eJ!329V_aKp3ce5DTrBzgw;LXp_qbo1gs*QspfFN#sic(l z)MrO`ua@~WquW;`+q&2b&}+Dta=nca3%5r{pi(^U$}9WU@3V%3-d5X{v?3noc9-Qp zBWE2&efav6GN0J}u;y$m)-vl{9KZph-I;34eU;TnYFfBg_0*-L>P7@63dwcOL+ z*^tzsR2&bSL7pXcw)zLZKW$$si z_{`I^91@p}$8@3NMZVG!)AI#Sq+syb8+7Se;>Ihg>3Z6B;ma}YXi?6GiR8WqQ3|Gf zI^ECZ-7}-)pX%-t?lRpyZrXolaj}Dow4IS>PV2LVEsM7jTzl^5DpES#&U_5~!et+E zcbz$rI~8eLX5t%?EEuAvebFC+ihcgMq}6?m%7V)GxXHI2kMU;KXrcF4m>fjWWL0{% z%#>=%1fA~LItDC9^GWJo$}c(g+)wv@{1v|~yQ{Ghm2Y}6KsO(8yMyE-#aBZ^U1oxx z5SS2#r)O)fn>_Axu?u6OcfFi=wq&)?qc)u`8Gl9 zxz8VK7%%^+3K<-G{5}`wp`Ivj#mAWE{YLus!ZnMU5z3RAxKgrw-7ETX_H!*(wCC=j z!jyV@(%;gyVwM=P8&Bm>x&`^9efuNOWw_|Pi%N=R`0dxpPnCcZPtF}q*v#wwHj4PJ zGNGp>-Mqwb-Ckc0AA##y>64##OJyl<5ZCpyX~iMQ3CD|Op8es}uxMN_*)#_b!r8pK zArO^GYOL$)hjhnV+s^A?xZS^lWyFB-zoO(#(_fLkDP*6{Sx!w#XA9~KdMEksea6cp zg(wTe*2;gX4mSv|Q z58#3fm)jOWuzeH_59c#Im(PUegeY~MA5lpwokBt@$yu5$4e>jMaIU@e-Z9K>oLkVjh}i^;?(y$9 zlY23=CMH7&J-zcTpBytAen^`@(=6#SDkkPaBy4$~l&5Oojwva$NB|Y<>&p(0XjU|O7`U}Ts z0%G!Fq6g$M$-4-W>=Zn^txZ3BH=fiqA-^?M8St&U-Klsi6<~%l&TdmQUO*6UBL0xD z>ll*v5-=JpjZNuF?s2>mVHRtdiV&U?yB1G#@s+_#F|Omz>)>~qBBE51%t<8^{YLuJ z6yNc>4x`?r;60^nAuoXsyOD>ls z2SX8-FWz$^kBU?km2>KOZk4 z2XyLQFMFRg|Ccp~#J|Y>Ya*j2zhn_69QV2v|72OlX7H(Ulf*Lx4bmBiNc1^;p7u`9 zX~V~$>6-Xt!KB<(xnrGew5{pZ+C5x+AEzgXuL5u; zGkxj8L`5U~q9-?hONHM1dmtWS6aBC6&iD7VrOu3*W!_)?)0PEI-WxvFRu;#%CB&5L zo1Sj7DW@%mYWU}CiBj6fo4nhoQPmd=@v{zM$C5CAPd5pWUu1qmIU}`v!#{*UlD|hK z1IzMT)D;Ef_{WIrd?4qLn`M>$^maDSsP)#)*?t*};Lx|ERK#P~yB3Zjt6-X#f*HoV zrL3!LSjE->g{*pm{-w_=fW03iAn2xqhmn4gl?@!ET8m*C2-@a~ni1|S+H<+YrRXTK z{Zo5)4(Kx7AMVdG-jkH&YVGyKGoSsd6SiHss7W`$2J&~$e;jFB>7DuO$Kj*CIHwYD zP=$H$DS5K&#hQL#7rFe!<8SnRl&Yye$x>Vh)Hchx 
zUa4H0d+nu(>c%37xOSXXoLB|VC_aydlWfn|ZbH^_ZXxK3)*<}t?qbbod?sh)rsBpJ zybhu7E`xsW@eh|UeAbQ_+MwmE`z&s4I^LPRV2UOQPcOiz<3T#EVf>-VUQ(iKiNw3f zAa@iTWuD|9h>l3wMk?OE-c!>krbdHBq8uT$Et^&4{KH6eyciV3? zK@m0S78iZ6LD-CrAdpUdyK<6%?3;@Ozw58*fE)accJ{?CZPr1l*r$u3(SQo8TuLcY zaw1YUA_76l^mxDzSx(n2Qw)#q+zwksiG%0n(Rm!EZ%m$Pl6Q>3KP}7U8iXP%shONQ z!{;V$;dS2kk$yV`z^e|LQybr-n@R}{S5-;>(32g@Nwsf-q=mRG0*kpluTpyN#N68A zGFyHdjx=m;b+ig}M!1Q`)~>xwS?_I-D?6s2XR@IYr8Rsb-;9kEi$r3zmvmo?HFpg< zZIB~dqa^`nw0z&}=Ub3bWHK+Ut9cx_?eC=?zqvW7nUF#h+hdK_o``Taw=a(Jti)bk z3VA?bWdG@XgR7}1eM^i(=kwbku79~2ZwNwJEN)L7JPsx~Z6UTnw{dk|O6=mM1EX|U z`TMC`WSa(+i>61@xE(nR$33ez&|Y52w{4TC>tnvuy{qbfkQGfoK^0luH*Jcz3<$RB zxrP7?dGQEeHtGIwjmDur<=#G!2sz80Rlc;JTL5;P6iJyT8E+7Y=?K7p()J74sy9Nk zMs{-0n@f_Jaof{#KKcxO0RPkBiFNHmAoHMQwYQ1w`-u_`H1IKSC6ukA4krq6G+a^l zubRUM%#1V$6P}32PGPcHziw2Jm{uR5>`g|jaPT=JO{j%#nBA4$YkTFb;O3XEp%?2k zg)4Pf$jp=i{u3HgDx|)spoo`yp*p7S-R=Ai-j(V#va?|KM3PA{BjB6Z8UH=W;I;1S zL0}rWie=2Md2CYn#O+cZP1+i4NR8LaoMAX>QN_bxMz_bGiKu0(E)MwI&Z~+oLXtio zKAGgew01qgtCPfE@9Pr_xtS;OoxT2_#4M-1jNa&gLkwC92D5m}OE5Ai2J2)*M|2HlGFm?l^Ju!0f5exu?RnHB+D(exK}xbtcIe?j$&u7}Aku>b47GbGqHr|=$+6B^Y zTJ5!@>e&6QYK&srdDXv%+art+c;wV*=;b>|#9~&Rw?N(XJfQlohkeL7GdyHQXK*Cn zq1U}!rz6$YDaiKxI-Z?Ew-cF=$4-6@Kl_)yIN)0FWA0Jg z;u^&INfODsj~Ur&AnoxGgPNZ2TlrX_2aB30LkXn`V?*!4=I^bJI(&|AS_uvr-3np+ zu1aKMI&&te5byfQP;+u|LpqJGKM7MCTwVqT3}EC(9hqL&3doji#jbUIPAx|CJHn6YRA&4vxfn zQknO-?0@;+Z{_!KO8@UIr2JUuzez#=@Adz01;_utTTH!PQqnLy!g#6N;QrOZ7Vgas z%XV&A{ZDBkGgW9AZcF_}Fr2Pw5yh`7gnh;$$DtCjNaO;#rTYSNxR{G3e*(l*9#oPO zO)dO(rLM4mQ|n9Oo1csupHSIR5w0i#k?vxnNs!d!y?U*b*wC7wQx9sP^i&yiXhHN@ z1kJTSHEqp-bL%frT$fdL-|{kb=?jtc^gzKQz5ut*E zM79!cnLU(xzjk;E2SC?We2y7jo6#*VaF<&g0Zoq3Jg$9Zz)szyie)RoF`x+a?Pf{z zW5X+Nt)2vk{Yu-7Wrm6?(v1{AT?z@7>AzqzN+(5mBiX)O;71!p3Qj_K^uK*9dW04> zbSi633+^P}H)em82u|Unq3inwtGDoX02HJ4R$MGG8~g=Ic*Oo{ocG#8>g9uH>)5vl zw-YjyhdKX8ETp3$cOY~xHRrcnOHC1^YXbD&TO8<#lgaI2PvKaTqOYLi=Xm@H=L!r@mWvHAmHyom-eF`{Y zh$R5Cy|kh%QWGJ#fSiwa_$Ym3zRS&-?sM`{5W=D3Fjk>>0sp+Xd=B+Y9=TbDNQ2ek 
zXEAdcGjDb8-0f&?WkqyL1PjARcPWu#3URWtc((HFC61YPDrqbFt^Pf#xHHWVz93Rc z_hFJ#eDo1P5VrTy^y60B?2~80lRIdJ<%&?lM^aKnw?sNQv`!hrwphGYN3vNHv2xDW zZxCMHHhTXee{;3cqwX8oH@^*i2DuVYaZBsGoZG(!;-!nG2o5F#VAo@S5vlG&tS8Z1 z@iVbWC9^^_QjD#%cZIjc&l+~(FDG)2!N9%4slHElVbCi)8rSt$f(#s$gU z#O0!J)0iSRGVE6#;#JXY=UhCoy)`-`tP|wMDI~s^YK4iNUwjI6b@<3x`ih4nuP^K;dMo539xUZu!zsHiBoFjQn1q|?(1uK9DL8E39B z23(Y_c__vXC(JRIfC-N*Qui)smi?pk2ZuIzMVl*yThq=+NhI~lPbu3KW!D)}Zl8dy zfEp828z&Z_K*}LOc!UgS6al(%3D(={y0ZW5?jSZh!@trTau8(ti$>&Y6)@!PYFII+A9cr|6l+9^Hk}*E*vEyU5>J2l4GX1J`{gzuPj0XN{jVBTir`3w=gI?+^E5kPF0Y2AuoijPRL=nt?qzUNna3_Rzq*glHX;}j*V~`NF zUrqVO6MK{RE7;@U@wLwo%7H#D^<~;$Ge)?lQDl>}nB=+`oN8OtH@?qGslT2*4Hb-* z%csuz!fKK(*`ue}dHu@7cv_SO>ZK(BW(I^(@I9gBC5Je_Ts9&IFu+Kp9p*#JGTKq) z6)#>#rur4JnUu13*!gsKxGyp@@m~G}FDD{lb)>rK0_SJ1UmUf-B^^bByqzUZuhqRp z1Y}bBs&Gq7);BFShZ9U4xf0#Weaa*)Z%Nwt;L3`SC4Kn4CSE=o9Xw?_zrzh3i4x)j zGW4%0jOQr4<6s;GIxJ*n9sB19Uf|`A*p5@3;ZL9ONu1;fAKm_Jm0k_JPm|ts$2!Kk z{gXKXn!D*!D?TFNB37tms{W}e58~21jB1roDT_xRRLm?~m`zeM*3GhwHJfV{dWcXG z(3*17BW4+w{jOoHs;40%5L46+Rwr!tvz(Eo!^=yFKgc=6)rNSyS-qOUV`aj+j-V%w zEWbt_09ih%IM0%N$3;gc>(-z4Ml<1Dt5fDG2TL#kf3Uw09ErFCJwVYB^2m`xXJ?z?VyCou6uH*YS@R?C=xBrprv zV!vkgAuJ@1f!2SXKNs{=aceU-9n35?3H2uVZP9&gkuWw>15>IAt9eXZMGg zvE)zL#+XtPWrZQ^dIr})Mx)!Q75=>r3N2#Gl={88yHxc~E8A~$u~?z~vIstMI*Bci z9?Y2_ty|Dc<2MgnLZ}sNg}R3}LvnWGxQR^ICvR3pOXXgU`aCIvefD+IXw0Z$TD6eB zC}5mdY+a9YMk}uCY#ZJx9ql=B#ZZ+K!MJ(0C3P0l$^zKfzY=xXt@&k{$*& zPq&E*+tT+4U4!{nU7=}Ag@6LZaU0TLjjs*c%;yKqhQ*Z>UHz((qI6W20=_+ls*w(!wMeAsP90=-fQ|RE zi1Hp-7i|a(d!%C06Z*(gnP^R4*gq{<-i1CNY)Nkn*Y0a$eBr2y@|&Z(X_2rtO9l_S z8StEibVSOUH)wXy!&?8gv8uyyBLYWlYY#fy)H6|l&nWmRHDA43QmG7ZXj{`t7uDPh zHG-w(Y~m!q6dPTJ1c`d((my_ti`jN{xk?5&lvABL;%j|&(?Xf*m+)3iPWT(~uny9w zM!>FKlL^9eHVPp5vAQaCdWASpS)adNzPM&i@n(2a6HB>7xF8uMa zCSo3&SU2Pe(!%8y6E#gII3dgFXC3m3-3PoMPT z5U9J?ivodlXu(ef|K=CHj-X8w6K4b7#+#QfzPVLVPdM4wWV56Eljw9>BL*r<&=>N7 zQ_obod}+ueaNxePS_|0EAsg#qE|%bD#!KejYY`I{yCrNfzhQJrhV{D%8vB8}#@>rm zE@+|v0GYouRmU8haxn<;*-{CM%wJ(f8#18Pev|tqocIh9gx+$ 
zs%vTbWE54G=ldi8;J?KNdMEi-zttL$Qys0w6)tjC%*yKYGI}a7_(~nCHUBO0bkJf| zaE)Cj* z3JHGUi-cB$1ueTCt1d9ArK7pI+~_IxULDbN;WU>*1Ln?|jE^z0rXscrKN4lZlQmiA zx`V3ykc+Ma6R%u@6#Fj=fbVuQaeQ;B0bQi%2KwW^>wA0Bx{^F>3ML;IkQ$=gz;lp5 zHT3~||DQsn^XFWr9;Esk3lcF$;leYJxn>!yf+VdQsUa4BU%m=qJvTB;4GZo3xd0}d zANztepD(j6{K&$0{%kI?@m4ck)aa8BT9uQ({j{P5vOjXGi-*#>OG^oOAh2r39Mj4I zQI9Xj^g8XfWji9J^n(jK)+SA3poacxckiUg=>w44>MnEy>l#vl#AZ- z3H}6D`;nIVujxhZN=@XHHx~^iNrkgTbH3-c+*nMp`s1B*4Mk$-)qYIn^b_GuGX5g? zi(h|hiu__VT|;N)JbQ!Dx6=^T7#ORmf7aWst>pY`F3#GT>ysa>B+Wv8LSnxS89gOx z#C}n;YowBzjEj-i`|t(TmNW+XowGs$gp_!>p($4^W3X2?BJt{)zyd(m!#|fwnr``E zVZ%O(u4X@2#jW%25q5H<8HtWIfvRh$hVs&P9uZLWV+dxMKqp8`?Bg&$Ziw-&N!2^x z7zKod^RH4EyWxoxeP13Bq9U6b!oIGbjs02E4COv34{(0${MS*#Hh>+I!9-0 zuA_Y|;MYz@svVVrP4MB%%dOkFcPd`^1RJN#8^|v|x1-4En`+in1!~0rTV`MGB3^tu z+wE)VSR3=^)7Hy&+HDB4sNReprT&C1UqEZI4@KwFHMde~sbr|;p;t8=-N13(1cSq_ z=;R!H1^!0sC4e4K*S@MVkZRQ)jb1|;Y?j((lB~ZTr4TU?4Dmshp{kAkN&Va!%)$nT zj!!AoX196wIaka|yTVBUnINMt9d?e367!q-cAlzR8`5V6DJMNv1wvxI=2Yx*i!Se0 znfVw(klr^N^s9&k7Gqrr=UltO#?N5{P`FsNs04TFC0XU zN^5?2Dd^*JOmTcM&U4gxma(!4Nk8EHOh>&tLVTt0c2FDgjE0fPFZAZw^!fhyc%a%I#e_CLoOMBat#_LL`8-IVgner z0muws){3zq7~@RKaAhQMk>df)${mi@MfLbo8A&G*Om@>!MS`qLw%r$Hm1Cw2zJq`q z)rQ-9fcAxZQLMmG(&`^8dCR-GX*ibohj$dGh5ED6z(%dAOO$f&2yoE7=wg3=)^5(f zlhuBG-Q_w(Z468f;=_rlz^|q{KuK6ILrToD4G=jE4O6CbLYZZ(y;`Id{;ANNk!?6c zGOPgV-PB+`n9H!C*itq}wMdeRnF@gY65t61_6|5p$yb=&Va019a@(WR7SOU^Q?Aaw zJzJxKA3Va=P1#qwP?7?Dtv(0oL1!ebCu`~%+X5KI99&p4bbH;Y@HDuCe|H8Wel|qh zcOUn&-OU%$*3=)bjh>St#u_}dWso+MDp8N%LZmh5cBv{lTzH(-EWlgjnqsUpVzQO9 zhELpM!%7w=rRAJW`zbWNCO{c#c-K7BNfDRxHK2R_*zwdr|E-IW!aFVGtz1*l>uceK zo+^qEAYxP_Gn~2+7%@Z@v+vTFf>~UDRo8d?Hm>k;Z4D4}0s}PT)_KjzKshw8k1n?k+b0s+H zpafl1Nb!N3n%+1`l!9+gt>3i2ymX>5vpXiSia$M7;+likssbc=ohguWmqN$GPK*BA zWH+UpJ1!5L{x-r9*h8N-kn3hlwfD_ZT;r)LiWu<5M_&5$+~@;?V4w}c+D&7_IC-Bm zC}}9b@Gvb*rD++kS<~-h)E~}YmMrZ`wga?a;ifn{!&~%J;p0QvHt9=vyu3Jg(=x`t z>fO&zi({^+rRRESm0yO%AZ-N>sp#nITleoL{&N28Gw`bWc`!A2`c4c*8l41R)%%*n 
zA(7emr})pzuJ78=HRhh|+E%oi_4^oO*^8xJ;CU6q@y@1Sn~;rkuzUa?c?UU~(AOTx zVh(fp#{)GNprrUp@vNkdmu6%fGU!8Q>zS*SfLvl$f58p%7WUf_9ho*4u_ zv+3Gb79aV(wg@980{=hD_)F=og7jI2G-saVrq-ovyxozrQ|lA*-^0-eu8y)s;0$KQ zJ%&Show6`TfV0^6_+sjt7cBqA&A$aGI zi<<)JpJh$0>*Z2)R6VSY#z^hD{+zUJ*_2KPhyP!j#Uf_Po@BG1j;64x%-WvuP2_c% zm&!)A4XIaJID(|T?HWc1!;h&|ZZ`()agCVSHyGP*T~t2WUHP9{Q_q505KS8p(wmmg zvn7u4{*}XaHF>RdEOFyl@H&GGC>f1)kd*OL(@V2gLv8{YD@a=4hjD%0h|-~!2WoXFg>1V$~J=G6K6XB3aof;ubdvHYmGXT>EA z>G@p1)oL|Li{-5J*>MkR@<=ueBv8%Ox|%c_Bso_f30X<1rbZbR$-9)g+rp`>6;Apc zyV(Sstn}FJb$jlmzu@1)(mPmYS@GJ4irZ{t3}d`c(E%(jVn*dM^IslAouRpE^rE1L zJ?j0m(;^db6I#L?|3^6KPO8&q?qkG2>_@VeaEJ3r=1M2YK@XVy&xVAi+V>G05prvy zIx8u3D>zCY!T_fEGew)iIi|BB6mY^_%TWz^o66J0_a{-+R*26{J3xOg`|M=T_+Fic zq>({4=5Xl(`XGb0s+D?80hA$j=}&w0MAd}V0!6GZQjfHXS|(-Y702CGGk@nh%!LBM_iF+!vTRD0 zZ7&BlztAO!b3g}Q1$1OxZm0v?@B>MRN)l*t)|89Ih7>x~7jdbo+7`vqRpX>lVVdYT zUqc5E5x&~L7{UKSfo)2|ldwUG00NjUlW9;B@-YO1)`_%HKoRAy z%xdK58*~N?ro|=f2;{NMv?lWDn=Rvp~*L@j*H(~sJklv1lXP4jq#Cvd6u$&Kl#4Ro5E=HH4?a` zZO|zc+LMq_V=JXdQdJ|=$i@+PWfV-pNnL!!uVU!q*6r3IY&AYjZcWdfv~RyBfGA@m zZ>+)EizmkC*9`-HFQ;%&)pB#uFjl{8;KE@51N*y+23U;+djlqcQXY+dKYrSRYj)vY zY559(kGZ1+eYkWo-u^GcaUh1f-8WW)8I0%6daKG<`{dd~>PWy0E?W&Bhxd}I9K0Wj zKuW5iY6&-#vMMC;9oVF-9$-c|5elG3k#M#%MVd-9j{wQ{{4$K(9P!1$BFUaH81k0DfOQP@VH+k7p>8QIjJnm+eOPHmKsW$Lg zuH!H0utJp+A>72tO1n0Gie+0(msGtvlQUieE0c>+)5@ZvvqL8aU!?{XPsJbRBPwuM zb`eBrF?F@x!AC1>YZNl?eO zwwdo=H&?E?|+2zOGh+Y9Yy9VoV=(AeqgLZ1TH-Hc$Bvh3K_px{RXFtY^Xb(-* z1RgVg&Xa!OK8;Ks4$iCz8Xl(W`0o%TCEscxOa~7od~G@vmDk$A?V$Lf?6)oGyE&(3H;m&sudx<(@>SRRlR=n0Yj~jH`VKd5 zcX7cwMu8s+Cxp>R&GC{WK&c^tR@k}e2^7^Di|e*^Xsi83Z6`Go;hri0EVE3|KLm?Q zI-)%RWTo4=zfh522fp7X&X>xcdhVcY!|DRt7|Hp<)7fyWr#7HpQFKk`syx%!AwxH6 z;P~z)$3OmKqWlU#cLR&kvuGM9CAMek`#sfm4z3F}2}@%BGrL4yC@7MksRHVBmAJ*L zpvU9dKAn0|AI!=4g8__T1gre(`jpZ*@RTv>q_+AfXXUcK_8$%NEQ7lXFc;*v0n1&X z3mQ#-a^)gJ)!)+k;?Z~G`vq-eHG~5wHfnxU|Jc^hQayLOg;+A?7X8wZdSZ+KPB3}F zyw7*Sb-9Pyp(1qoNUY~3MjiJvT`Ind6=FC=#>eQzRzj?Kti}y)AjUi=>Oq5Jsi0=t zdfa!FK7hy7{{?#cz8T;AAKd)EtT&kG|7*W3Sq)h 
zaH`YUt?iW&-?$^{)g5!V%(j;iK(qln^N12jj(0j=%(^LQxap0rSCBn6}_ zDxT%SnyMptN)b5(#baa*pee9-@!#J;c@`9-*O5{bs;lTmP`7|{oJggjB*KRRXMhn( z@n*Y4ZynT%o04+3>EZZ5IhO%Jw58;$RP9dUa=rCs)yIDJBWtTG+$mz2!gQ5s-lyHG zk`k-S_u;HZb)mEUal zSc5(}lXYSujtJNP5PulYK8}xqz9&Vo_nRG!yjE8T$xJwR+8OlPZd)V&@ur4r*W!eZ zZ~J*)&j3CF{(S%E_{CPRa|=DxJ*UAGwxHDh2a4R|U)q(ZEyC*b_m4H~|D%&1B>cQK zI;Rf~)lVt1F8asrQTt(uKuWvNdNCqK(o)3QG;H>D2-y3Y7gAd`t=rnEgQsUzO|#lz zL-qn4&_XLvr>2!@Pd6q0M7mKx(%N)1Y+Lg?^W$ zaV#HAv?ys0($Gx==?QTXHnle3?M*{KaQg~$+dSdtj2Y1WWP-aC@KZA5Ed4P;yd74MzjG*CjlBUiLM-389=F_Y^Xr|GHwWmx) zgR*)4#ywBNKU8DA1yU^iz0#6zS;c(x6O(Rd=NK*Qg~D4tq3ztd7x$f)ZIqeDY^0sF zn%F#WY4D_V-t8T;%tXCVY%+ReeyeLX9(kjH-FS(QL~O_EdM2Eb%S_rgm!gh$FQr+S z+kpy)K4nYiQB`w)Q0ih64>*^@)c{ETU9KgN@5rkoM9ne47*oQhg(!L!*;?*)9-nKX zW>mF}9yej5@*aD}`~6HkT%F7`!Ky-#f$;W=nv(BtQ~dHj;w>n@^gA6-RS08B9CzM% z*ribN&A}Y7*REt5@+lWN*gcpZ);wJZTPvJSYWURVCQ%d>buQUYVi${q)+7>UcZKHV zibh|h2&`s1I`Sv+G%sfK5wwYZmyQrH$K>dmSiIQHThLw?xf19q&Q{qLKzXsSave)qrSx=uKY8GobyKI(9f2}B2`hE z{uct4^rms`8+to<6L#-?Cn*6n%Aj(lMH=J~eghr1^6wR-8eserj!ZtGwY3lGq)mS+ zx4h~zD^tavKXd>~<#cUW8g(T$B}KIjhB7i^0aES@1(F-81`v&~=|A=g^U%zIL5VOc z8(-wrOsa(vW#`SOX%brFJ1(@maQrFc&DE@FzqhJ4%yi5GEMW7@0RNqXE96V=WEj=j zUKh8cc21(xtexCU8asyM;9Uaw8fKBUJ{v}EceSgTZ_ac1)#Mcs6okQ(2-M(osg!oO ze^uvixj3Au_mrp_;sTcr32<<0tRt#u*Fcczeex^}R(_Cq+oig12*mLTJ<|`EwqaF@ zgYg{ho=)C5tAq(Us)D0St122;DH2la(HSXep+FXx=uyTd9g!9i7Ym_}Gl`#~v|d-& z(q38OUJ>3}Ru5;jHUDM)=Bl}8y|TEeU(~vj8>S9jjqD@sRx^d06)C80Pi{_{SYK0| z3(&Ildm&lIHR_Ov?wL=VOBFV6sNgV@kE^_6`G6eI|5RCF=rEBOBhbJjoU*be(HQ^o z2^p1b6c;=0t2MUTQqm+K^ZyTumO`MY+It6VU;}PQX+eKI(-771<2m^cD}*c3Xgj_Mh@0shk390S?oxA_)Y41#GwYeh5KVso(D z=5Mz9C`oPv_jLBw3Bur*z^kE-s?A5Iu#LB@d!525TNn8bk*N zCI>#$koNz#NQ1S+D^BVZ4#_2t`9KF2L`6x&RxNg_uRY=wG>eb9RN80>SDtIZ&lY+( zjzTY-!l}Z3I67_pE5erBdR45o=UG<-)@_njWawRz30W9x+ZJE9%zt6X7E(Vc@u=iU zBnsgWEKqM-Cq6z`Bo(8mr7m_o< zR})C2kuv2;mE)tD-saJ2!#w>FO{BDTmH9Oamze^Fv2li{O=sA!0|sa|VuMD85rAbn zSah?*TP_rPCgi2ArZB(duo3ix0)KN!3A$BGbC4S_g2NX~V*}btJ5(uP0Z1dNcn~>M 
diff --git a/docs/source/_static/theme_overrides.css b/docs/source/_static/theme_overrides.css new file mode 100644 index 00000000000..9713e89ab26 --- /dev/null +++ b/docs/source/_static/theme_overrides.css @@ -0,0 +1,9 @@ +/* !important prevents the common CSS stylesheets from overriding this CSS since on RTD they are loaded after this stylesheet */ + +.wy-nav-content { + max-width: 100% !important; +} + +.wy-table-responsive table td { + white-space:
normal !important; +} diff --git a/docs/source/clone.rst b/docs/source/clone.rst new file mode 100644 index 00000000000..c31968ec2e3 --- /dev/null +++ b/docs/source/clone.rst @@ -0,0 +1,153 @@ +=============================== +Clone and build Global Workflow +=============================== + +^^^^^^^^^^^^^^^^^^ +Quick Instructions +^^^^^^^^^^^^^^^^^^ + +Quick clone/build/link instructions (more detailed instructions below). + +.. note:: + Here we are making the assumption that you are using the workflow to run an experiment and so are working from the authoritative repository. If you are using a development branch then follow the instructions in :doc:`development.rst`. Once you do that you can follow the instructions here with the only difference being the repository/fork you are cloning from. + +For forecast-only (coupled or uncoupled): + +:: + + git clone https://github.com/NOAA-EMC/global-workflow.git + cd global-workflow/sorc + ./checkout.sh + ./build_all.sh + ./link_workflow.sh + +For cycled (w/ data assimilation): + +:: + + git clone https://github.com/NOAA-EMC/global-workflow.git + cd global-workflow/sorc + ./checkout.sh -g + ./build_all.sh + ./link_workflow.sh + +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Clone workflow and component repositories +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +******** +Workflow +******** + +There are several ways to clone repositories from GitHub. Below we describe how to clone the global-workflow using either the ssh or https methods. **The ssh method is highly preferred and recommended.** + +ssh method (using a password protected SSH key): + +:: + + git clone git@github.com:NOAA-EMC/global-workflow.git + +.. 
note:: + When using ssh methods you need to make sure that your GitHub account is configured for the computer from which you are accessing the repository (See `this link `_) + +https method: + +:: + + git clone https://github.com/NOAA-EMC/global-workflow.git + +Check what you just cloned (by default you will have only the develop branch): + +:: + + cd global-workflow + git branch + * develop + +You now have a cloned copy of the global-workflow git repository. To checkout a branch or tag in your clone: + +:: + + git checkout BRANCH_NAME + +.. note:: + Branch must already exist. If it does not you need to make a new branch using the ``-b`` flag: + +:: + + git checkout -b BRANCH_NAME + +The ``checkout`` command will checkout BRANCH_NAME and switch your clone to that branch. Example: + +:: + + git checkout my_branch + git branch + * my_branch + develop + +********** +Components +********** + +Once you have cloned the workflow repository it's time to checkout/clone its components. The components will be checked out under the ``/sorc`` folder via a script called checkout.sh. Run the script with no arguments for forecast-only: + +:: + + cd sorc + ./checkout.sh + +Or with the ``-g`` switch to include data assimilation (GSI) for cycling: + +:: + + cd sorc + ./checkout.sh -g + +If wishing to run with the operational GTG UPP and WAFS (only for select users) provide the ``-o`` flag with checkout.sh: + +:: + + ./checkout.sh -o + +Each component cloned via checkout.sh will have a log (``/sorc/logs/checkout-COMPONENT.log``). Check the screen output and logs for clone errors. + +^^^^^^^^^^^^^^^^ +Build components +^^^^^^^^^^^^^^^^ + +Under the ``/sorc`` folder is a script to build all components called ``build_all.sh``. 
After running checkout.sh, run this script to build all component codes: + +:: + + ./build_all.sh [-a UFS_app][-c build_config][-h][-v] + -a UFS_app: + Build a specific UFS app instead of the default + -c build_config: + Selectively build based on the provided config instead of the default config + -h: + Print usage message and exit + -v: + Run all scripts in verbose mode + +A partial build option is also available via two methods: + + a) modify gfs_build.cfg config file to disable/enable particular builds and then rerun build_all.sh + + b) run individual build scripts also available in ``/sorc`` folder for each component or group of codes + +^^^^^^^^^^^^^^^ +Link components +^^^^^^^^^^^^^^^ + +At runtime the global-workflow needs all pieces in place within the main superstructure. To establish this, a link script is run to create symlinks from the top-level folders down to component files checked out in ``/sorc`` folders. + +After running the checkout and build scripts, run the link script: + +:: + + ./link_workflow.sh [-o] + +Where: + ``-o``: Run in operations (NCO) mode. This creates copies instead of using symlinks and is generally only used by NCO during installation into production. + diff --git a/docs/source/components.rst b/docs/source/components.rst new file mode 100644 index 00000000000..3ebd575a823 --- /dev/null +++ b/docs/source/components.rst @@ -0,0 +1,106 @@ +########################### +Global Workflow Components +########################### + +The global-workflow is a combination of several components working together to prepare, analyze, produce, and post-process forecast data. + +The major components of the system are: + +* Workflow +* Pre-processing +* Analysis +* Forecast +* Post-processing +* Verification + +The Global Workflow repository contains the workflow and script layers. After running the checkout script, the code and additional offline scripts for the analysis, forecast, and post-processing components will be present.
Any non-workflow component is known as a sub-module. All of the sub-modules of the system reside in their respective repositories on GitHub. The global-workflow sub-modules are obtained by running the checkout script found under the /sorc folder. + +====================== +Component repositories +====================== + +Components checked out via sorc/checkout.sh: + +* **GFS UTILS** (https://github.com/ufs-community/gfs_utils): Utility codes needed by Global Workflow to run the GFS configuration +* **UFS-Weather-Model** (https://github.com/ufs-community/ufs-weather-model): This is the core model used by the Global-Workflow to provide forecasts. The UFS-weather-model repository is an umbrella repository consisting of coupled earth system components that are all checked out when we check out the code at the top level of the repository +* **GSI** (https://github.com/NOAA-EMC/GSI): This is the core code base for atmospheric Data Assimilation +* **GSI UTILS** (https://github.com/NOAA-EMC/GSI-Utils): Utility codes needed by GSI to create analyses +* **GSI Monitor** (https://github.com/NOAA-EMC/GSI-Monitor): These tools monitor the GSI package's data assimilation, detecting and reporting missing data sources, low observation counts, and high penalty values +* **GLDAS** (https://github.com/NOAA-EMC/GLDAS): Code base for Land Data Assimilation +* **GDAS** (https://github.com/NOAA-EMC/GDASApp): JEDI-based Data Assimilation system. This system is currently being developed for marine Data Assimilation and in time will replace GSI for atmospheric data assimilation as well +* **UFS UTILS** (https://github.com/ufs-community/UFS_UTILS): Utility codes needed for UFS-weather-model +* **Verif global** (https://github.com/NOAA-EMC/EMC_verif-global): Verification package to evaluate GFS parallels. It uses MET and METplus.
At this moment the verification package is limited to providing atmospheric metrics only. +* **GFS WAFS** (https://github.com/NOAA-EMC/EMC_gfs_wafs): Additional post-processing products for aircraft + +.. note:: + When running the system in forecast-only mode the Data Assimilation components are not needed and are hence not checked out. + +===================== +External dependencies +===================== + +^^^^^^^^^ +Libraries +^^^^^^^^^ + +All the libraries that are needed to run the end-to-end Global Workflow are built using a package manager. Currently these are served via HPC-STACK but will soon be available via SPACK-STACK. These libraries are already available on supported NOAA HPC platforms. + +Find information on official installations of HPC-STACK here: + +https://github.com/NOAA-EMC/hpc-stack/wiki/Official-Installations + +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Observation data (OBSPROC/prep) +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +**** +Data +**** + +Observation data, also known as dump data, is prepared in production and then archived in a global dump archive (GDA) for use when running cycled experiments. The GDA (identified as ``$DMPDIR`` in the workflow) is available on supported platforms and the workflow system knows where to find the data. + +* Hera: /scratch1/NCEPDEV/global/glopara/dump +* Orion: /work/noaa/rstprod/dump +* Jet: /mnt/lfs4/HFIP/hfv3gfs/glopara/dump +* WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/dump +* S4: /data/prod/glopara/dump + +----------------------------- +Global Dump Archive Structure +----------------------------- + +The global dump archive (GDA) mimics the structure of its production source: ``DMPDIR/CDUMP.PDY/[CC/atmos/]FILES`` + +The ``CDUMP`` is either gdas, gfs, or rtofs. All three contain production output for each day (``PDY``). The gdas and gfs folders are further broken into cycle (``CC``) and component (``atmos``).
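To make the archive layout concrete, the path pieces above can be assembled in shell. This is only an illustrative sketch: ``DMPDIR`` is the Hera location listed above, while the date (``PDY``) and cycle (``CC``) are hypothetical values.

```shell
# Assemble a GDA dump path of the form DMPDIR/CDUMP.PDY/CC/atmos
DMPDIR=/scratch1/NCEPDEV/global/glopara/dump   # Hera GDA location (from the list above)
CDUMP=gdas       # one of: gdas, gfs, rtofs
PDY=20230501     # hypothetical day
CC=00            # hypothetical cycle (gdas/gfs only)

dump_dir="${DMPDIR}/${CDUMP}.${PDY}/${CC}/atmos"
echo "${dump_dir}"
# -> /scratch1/NCEPDEV/global/glopara/dump/gdas.20230501/00/atmos
```

An rtofs path omits the bracketed cycle/component pieces, i.e. ``${DMPDIR}/rtofs.${PDY}``.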
+ +The GDA also contains special versions of some datasets and experimental data that is being evaluated ahead of implementation into production. The following subfolder suffixes exist: + ++--------+------------------------------------------------------------------------------------------------------+ +| SUFFIX | WHAT | ++========+======================================================================================================+ +| nr | Non-restricted versions of restricted files in production. Produced in production. Restricted data is | +| | fully stripped from files. These files remain as is. | ++--------+------------------------------------------------------------------------------------------------------+ +| ur | Un-restricted versions of restricted files in production. Produced and archived on a 48-hour delay. | +| | Some restricted datasets are unrestricted. Data amounts: restricted > un-restricted > non-restricted | ++--------+------------------------------------------------------------------------------------------------------+ +| x | Experimental global datasets being evaluated for production. Dates and types vary depending on | +| | upcoming global upgrades. | ++--------+------------------------------------------------------------------------------------------------------+ +| y | Similar to "x" but only used when there is a duplicate experimental file in the x subfolder with the | +| | same name. These files will be different from both the production versions (if that exists already) | +| | and the x versions. This suffix is rarely used. | ++--------+------------------------------------------------------------------------------------------------------+ +| p | Pre-production copy of full dump dataset, as produced by NCO during final 30-day parallel ahead of | +| | implementation. Not always archived.
| ++--------+------------------------------------------------------------------------------------------------------+ + +*************** +Data processing +*************** + +Upstream of the global-workflow is the collection, quality control, and packaging of observed weather. The handling of that data is done by the OBSPROC group codes and scripts. The global-workflow uses two packages from OBSPROC to run its prep step to prepare observation (dump) data for use by the analysis system: + +1. https://github.com/NOAA-EMC/obsproc +2. https://github.com/NOAA-EMC/prepobs + +Package versions and locations on supported platforms are set in the global-workflow system configs, modulefiles, and version files. diff --git a/docs/source/conf.py b/docs/source/conf.py new file mode 100644 index 00000000000..c0f9ca572a7 --- /dev/null +++ b/docs/source/conf.py @@ -0,0 +1,111 @@ +# Configuration file for the Sphinx documentation builder. +# +# This file only contains a selection of the most common options. For a full +# list see the documentation: +# https://www.sphinx-doc.org/en/master/usage/configuration.html + +# -- Path setup -------------------------------------------------------------- + +# If extensions (or modules to document with autodoc) are in another directory, +# add these directories to sys.path here. If the directory is relative to the +# documentation root, use os.path.abspath to make it absolute, like shown here. 
+# +import os +import sys +sys.path.insert(0, os.path.abspath('.')) + + +# -- Project information ----------------------------------------------------- + +project = 'Global-workflow' +copyright = '2023, Kate Friedman, Walter Kolczynski, Rahul Mahajan, Lin Gan, Arun Chawla' +author = 'Kate Friedman, Walter Kolczynski, Rahul Mahajan, Lin Gan, Arun Chawla' + +# The full version, including alpha/beta/rc tags +release = '0.1' + + +# -- General configuration --------------------------------------------------- + +# Add any Sphinx extension module names here, as strings. They can be +# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom +# ones. +extensions = [ + 'sphinx.ext.autodoc', + 'sphinx.ext.doctest', + 'sphinx.ext.intersphinx', + 'sphinx.ext.todo', + 'sphinx.ext.coverage', + 'sphinx.ext.mathjax', + 'sphinx.ext.ifconfig', + 'sphinx.ext.viewcode', + 'sphinx.ext.githubpages', + 'sphinx.ext.napoleon', + 'sphinxcontrib.bibtex' +] + +bibtex_bibfiles = ['references.bib'] + +# Add any paths that contain templates here, relative to this directory. +templates_path = ['_templates'] + +# The suffix(es) of source filenames. +# You can specify multiple suffix as a list of string: +# +# source_suffix = ['.rst', '.md'] +source_suffix = '.rst' + +# The master toctree document. +master_doc = 'index' + +# List of patterns, relative to source directory, that match files and +# directories to ignore when looking for source files. +# This pattern also affects html_static_path and html_extra_path. +exclude_patterns = [] + +# The name of the Pygments (syntax highlighting) style to use. +pygments_style = 'sphinx' + + +# -- Options for HTML output ------------------------------------------------- + +# The theme to use for HTML and HTML Help pages. See the documentation for +# a list of builtin themes. +# +html_theme = 'sphinx_rtd_theme' +html_theme_path = ["_themes", ] + +# Theme options are theme-specific and customize the look and feel of a theme +# further. 
For a list of options available for each theme, see the +# documentation. +# +# html_theme_options = {} +html_theme_options = {"body_max_width": "none"} + +# Add any paths that contain custom static files (such as style sheets) here, +# relative to this directory. They are copied after the builtin static files, +# so a file named "default.css" will overwrite the builtin "default.css". +html_static_path = ['_static'] +html_context = {} + + +def setup(app): + app.add_css_file('custom.css') # may also be a URL + app.add_css_file('theme_overrides.css') # may also be a URL + + +# Custom sidebar templates, must be a dictionary that maps document names +# to template names. +# +# The default sidebars (for documents that don't match any pattern) are +# defined by theme itself. Builtin themes are using these templates by +# default: ``['localtoc.html', 'relations.html', 'sourcelink.html', +# 'searchbox.html']``. +# +# html_sidebars = {} + + +# -- Options for HTMLHelp output --------------------------------------------- + +# Output file base name for HTML help builder. +htmlhelp_basename = 'Global-Workflow' diff --git a/docs/source/configure.rst b/docs/source/configure.rst new file mode 100644 index 00000000000..284297459d4 --- /dev/null +++ b/docs/source/configure.rst @@ -0,0 +1,59 @@ +============= +Configure Run +============= + +The global-workflow configs contain switches that change how the system runs. Many defaults are set initially. Users wishing to run with different settings should adjust their $EXPDIR configs and then rerun the ``setup_xml.py`` script since some configuration settings/switches change the workflow/xml ("Adjusts XML" column value is "YES").
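As a sketch of that edit-then-regenerate cycle (the directory, file, and switch below are stand-ins for illustration; in a real experiment you would edit the configs under your own ``$EXPDIR``, as created by ``setup_expt.py``):

```shell
# Stand-in experiment directory, for illustration only
EXPDIR=/tmp/expdir_demo
mkdir -p "${EXPDIR}"
printf 'export DO_BUFRSND="NO"\n' > "${EXPDIR}/config.base"

# Flip a switch (here via sed; editing the file by hand works just as well)
sed -i 's/DO_BUFRSND="NO"/DO_BUFRSND="YES"/' "${EXPDIR}/config.base"
grep DO_BUFRSND "${EXPDIR}/config.base"
# -> export DO_BUFRSND="YES"

# DO_BUFRSND has "Adjusts XML" = YES, so regenerate the workflow XML afterwards:
# ./setup_xml.py $EXPDIR
```

Switches whose "Adjusts XML" column is "NO" take effect without rerunning ``setup_xml.py``.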
+ ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| Switch | What | Default | Adjusts XML | More Details | ++================+==============================+===============+=============+===================================================+ +| APP | Model application | ATM | YES | See case block in config.base for options | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DOIAU | Enable 4DIAU for control | YES | NO | Turned off for cold-start first half cycle | +| | with 3 increments | | | | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DOHYBVAR | Run EnKF | YES | YES | Don't recommend turning off | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DONST | Run NSST | YES | NO | If YES, turns on NSST in anal/fcst steps, and | +| | | | | turns off rtgsst | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_AWIPS | Run jobs to produce AWIPS | NO | YES | downstream processing, ops only | +| | products | | | | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_BUFRSND | Run job to produce BUFR | NO | YES | downstream processing | +| | sounding products | | | | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_GEMPAK | Run job to produce GEMPAK | NO | YES | downstream processing, ops only | +| | products | | | | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_GLDAS | Run GLDAS to
spin up land | YES | YES | Spins up for 84hrs if sflux files not available | +| | ICs | | | | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_VRFY | Run vrfy job | NO | YES | Whether to include vrfy job (GSI monitoring, | +| | | | | tracker, VSDB, fit2obs) | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_METP | Run METplus jobs | YES | YES | One cycle spinup | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| EXP_WARM_START | Is experiment starting warm | .false. | NO | Impacts IAU settings for initial cycle. Can also | +| | (.true.) or cold (.false)? | | | be set when running ``setup_expt.py`` script with | +| | | | | the ``--start`` flag (e.g. ``--start warm``) | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| HPSSARCH | Archive to HPSS | NO | Possibly | Whether to save output to tarballs on HPSS | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| LOCALARCH | Archive to a local directory | NO | Possibly | Instead of archiving data to HPSS, archive to a | +| | | | | local directory, specified by ATARDIR. If | +| | | | | LOCALARCH=YES, then HPSSARCH must =NO. Changing | +| | | | | HPSSARCH from YES to NO will adjust the XML. | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| QUILTING | Use I/O quilting | .true. | NO | If .true.
choose OUTPUT_GRID as cubed_sphere_grid | +| | | | | in netcdf or gaussian_grid | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| RETRO | Use retrospective parallel | NO | NO | Default of NO will tell getic job to pull from | +| | for ICs | | | production tapes. | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| WAFSF | Run jobs to produce WAFS | NO | YES | downstream processing, ops only | +| | products | | | | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ +| WRITE_DOPOST | Run inline post | .true. | NO | If .true. produces master post output in forecast | +| | | | | job | ++----------------+------------------------------+---------------+-------------+---------------------------------------------------+ diff --git a/docs/source/development.rst b/docs/source/development.rst new file mode 100644 index 00000000000..6c7711bfe1c --- /dev/null +++ b/docs/source/development.rst @@ -0,0 +1,198 @@ +################################### +Contributing to the Global Workflow +################################### + +This section is devoted to developers who wish to contribute to the Global Workflow repository. + +.. _managers: + +============= +Code managers +============= + + * Kate Friedman - @KateFriedman-NOAA / kate.friedman@noaa.gov + * Walter Kolczynski - @WalterKolczynski-NOAA / walter.kolczynski@noaa.gov + +.. _development: + +======================== +Where to do development? 
+======================== + + * In authoritative (main) repository: + + - Work for upcoming implementation (who: members of global-workflow-developers team) + - Major new features or port work (who: generally code managers and/or members of global-workflow-developers team) + + * In a fork: + + - Everything and everyone else + - How do I fork this repository? See the following GitHub documentation on forking repos: https://help.github.com/en/github/getting-started-with-github/fork-a-repo + +.. _protected: + +================== +Protected branches +================== + +The following global-workflow branches are protected by the code management team: + +* develop (HEAD) +* dev/gfs.v16 (kept aligned with current production, as well as ingests bug fixes and updates between release branches) + +These protected branches require the following to accept changes: + + 1. a pull request with at least 1 reviewer sign-off + 2. a code manager to perform the commit + +Other authoritative repository branches may also be protected at the request of members of the global-workflow-developers team. + +.. _howto: + +============================================= +How to get changes into develop (HEAD) branch +============================================= + +The following steps should be followed in order to make changes to the develop branch of global-workflow. Communication with the code managers throughout the process is encouraged. + + #. Issue - Open issue to document changes. Reference this issue in commits to your branches (e.g. ``git commit -m "Issue #23 - blah changes for what-not code"``) Click `here `__ to open a new global-workflow issue. + #. GitFlow - Follow `GitFlow `_ procedures for development (branch names, forking vs branching, etc.). Read more `here `__ about GitFlow at EMC. + #. To fork or not to fork? - If not working within authoritative repository create a fork of the authoritative repository. Read more `here `__ about forking in GitHub. + #. 
Branch - Create branch in either authoritative repository or fork of authoritative repository. See the `Where to do development? `_ section for how to determine where. Follow GitFlow conventions when creating branch. + #. Development - Perform and test changes in branch. Document work in issue and mention issue number in commit messages to link your work to the issue. See `Commit Messages `_ section below. Depending on changes the code manager may request or perform additional pre-commit tests. + #. Pull request - When ready to merge changes back to develop branch, the lead developer should initiate a pull request (PR) of your branch (either fork or not) into the develop branch. Read `here `__ about pull requests in GitHub. Provide some information about the PR in the proper field, add at least one reviewer to the PR and assign the PR to a code manager. + #. Complete - When review and testing is complete the code manager will complete the pull request and subsequent merge/commit. + #. Cleanup - When complete the lead developer should delete the branch and close the issue. "Closing keywords" can be used in the PR to automatically close associated issues. + +.. _development-tools: + +================= +Development Tools +================= + +See the ``/test`` folder in global-workflow for available development and testing tools. + +---------------- +Comparison Tools +---------------- + +There are several scripts to compare output between two experiments (e.g. control and test). See scripts under ``/test`` folder and read `README` there for information on how to use them. + +.. _code-standards: + +============== +Code standards +============== + +All scripts should be in either bash or python 3. + +We have adopted the `Google style guide `_ for shell scripts and `PEP-8 `_ for python. Python code should additionally have docstrings following `numpy style `_. + +All new code after 2022 Sep 1 will be required to meet these standards. 
We will slowly be updating existing scripts to comply with the standards. We are also in the process of adding GitHub actions to automatically lint code submitted for PRs. + +.. _commit-standards: + +======================== +Commit message standards +======================== + +**ALL** commits must follow best practices for commit messages: https://chris.beams.io/posts/git-commit/ + + * Separate subject from body with a blank line + * Limit the subject line to 50 characters + * Capitalize the subject line + * Do not end the subject line with a period + * Use the `imperative mood `_ in the subject line + * Wrap the body at 72 characters + * Use the body to explain what and why vs. how + * The final line of the commit message should include tags to relevant issues (e.g. ``Refs: #217, #300``) + +Here is the example commit message from the article linked above; it includes descriptions of what would be in each part of the commit message for guidance: + +:: + + Summarize changes in around 50 characters or less + + More detailed explanatory text, if necessary. Wrap it to about 72 + characters or so. In some contexts, the first line is treated as the + subject of the commit and the rest of the text as the body. The + blank line separating the summary from the body is critical (unless + you omit the body entirely); various tools like `log`, `shortlog` + and `rebase` can get confused if you run the two together. + + Explain the problem that this commit is solving. Focus on why you + are making this change as opposed to how (the code explains that). + Are there side effects or other unintuitive consequences of this + change? Here's the place to explain them. + + Further paragraphs come after blank lines. 
+ + - Bullet points are okay, too + + - Typically a hyphen or asterisk is used for the bullet, preceded + by a single space, with blank lines in between, but conventions + vary here + + If you use an issue tracker, put references to them at the bottom, + like this: + + Resolves: #123 + See also: #456, #789 + +A detailed commit message is very useful for documenting changes. + +.. _sync: + +================================================== +How to sync fork with the authoritative repository +================================================== + +As development in the main authoritative repository moves forward, you will need to sync your fork branches to stay up to date. Below is an example of how to sync your fork copy of a branch with the authoritative repository copy. The branch name for the example will be "feature/my_new_thing". Click `here `__ for documentation on syncing forks. + +1. Clone your fork and check out the branch that needs syncing: + +:: + + git clone https://github.com/JoeSchmo-NOAA/global-workflow.git ./fork + cd fork + git checkout feature/my_new_thing + +2. Add the upstream info to your clone so it knows where to merge from. The term "upstream" refers to the authoritative repository from which the fork was created. + +:: + + git remote add upstream https://github.com/NOAA-EMC/global-workflow.git + +3. Fetch the upstream information into your clone: + +:: + + git fetch upstream + +Later on you can update your fork's remote information by running the following command: + +:: + + git remote update + +4. Merge the upstream copy of ``feature/my_new_thing`` into your branch: + +:: + + git merge upstream/feature/my_new_thing + +5. Resolve any conflicts, then run the needed ``git add`` and ``git commit`` commands to finish the conflict resolution. + +6. Push the merged copy back up to your fork (origin): + +:: + + git push origin feature/my_new_thing + +Done! 
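The remote setup in steps 1-3 can be condensed into a small self-contained sketch. The snippet below initializes a throwaway local repository to stand in for the fork clone, so it runs anywhere without network access (``git remote add`` only records URLs, it does not contact them; ``JoeSchmo-NOAA`` is the placeholder fork owner from the example above):

```shell
# Throwaway repo standing in for your fork clone; nothing is fetched here.
demo=$(mktemp -d)
cd "$demo"
git init -q
git remote add origin   https://github.com/JoeSchmo-NOAA/global-workflow.git
git remote add upstream https://github.com/NOAA-EMC/global-workflow.git
git remote -v   # verify both remotes before fetching/merging/pushing
```

In a real fork clone, ``origin`` is created automatically by ``git clone``, so only the ``upstream`` remote needs to be added; ``git remote -v`` is a quick way to confirm where fetches and pushes will go.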
+ +Moving forward you'll want to perform the "remote update" command regularly to update the metadata for the remote/upstream repository in your fork (e.g. to pull in metadata for branches made in the authoritative repo after you forked it). + +:: + + git remote update diff --git a/docs/source/errors_faq.rst b/docs/source/errors_faq.rst new file mode 100644 index 00000000000..2660a01e609 --- /dev/null +++ b/docs/source/errors_faq.rst @@ -0,0 +1,45 @@ +============================== +Common Errors and Known Issues +============================== + +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Error: "ImportError" message when running setup script +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Example:: + + $ ./setup_xml.py /path/to/your/experiment/directory + /usr/bin/env: python3: No such file or directory + +**Cause:** Missing python module in your environment + +**Solution:** Load a python module ("module load python") and retry the setup script. + +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Error: curses default colors when running viewer +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Example:: + + $ ./rocoto_viewer.py -d blah.db -w blah.xml + Traceback (most recent call last): + File "./rocoto_viewer.py", line 2376, in + curses.wrapper(main) + File "/contrib/anaconda/anaconda2/4.4.0/lib/python2.7/curses/wrapper.py", line 43, in wrapper + return func(stdscr, *args, **kwds) + File "./rocoto_viewer.py", line 1202, in main + curses.use_default_colors() + _curses.error: use_default_colors() returned ERR + +**Cause:** Wrong TERM setting for curses + +**Solution:** Set TERM to "xterm" (bash: export TERM=xterm ; csh/tcsh: setenv TERM xterm) + +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Issue: Directory name change for EnKF folder in COMROT +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +**Issue:** The EnKF COMROT folders were renamed during the GFS v15 development process to remove the period between "enkf" and "gdas": enkf.gdas.$PDY → 
enkfgdas.$PDY + +**Fix:** Older tarballs on HPSS will have the older directory name with the period between 'enkf' and 'gdas'. Make sure to rename the folder to 'enkfgdas.$PDY' after obtaining it. This is only an issue for the initial cycle. + diff --git a/docs/source/hpc.rst b/docs/source/hpc.rst new file mode 100644 index 00000000000..da54f295212 --- /dev/null +++ b/docs/source/hpc.rst @@ -0,0 +1,125 @@ +##################### +HPC Settings and Help +##################### + +Running the GFS configurations (or almost any global workflow configuration except the coarsest) is a resource-intensive exercise. This page discusses recommended HPC environment settings and contact information in case you need assistance from a particular HPC helpdesk. While most of the documentation is based on supported NOAA platforms, much of the guidance here should apply to other platforms as well. + +================================ +Experiment troubleshooting help +================================ + +Users may email Kate Friedman (kate.friedman@noaa.gov) with questions or requests for troubleshooting assistance with their global-workflow experiments/parallels on supported platforms. For troubleshooting, please provide a brief description of the issue(s) and include relevant error messages and/or paths to logs for failed jobs. + +Any issues related to HPC/machine problems, which are unrelated to the workflow itself, should go to the appropriate HPC helpdesk. + +============= +HPC helpdesks +============= + +* WCOSS2: hpc.wcoss2-help@noaa.gov +* Hera: rdhpcs.hera.help@noaa.gov +* Orion: rdhpcs.orion.help@noaa.gov +* HPSS: rdhpcs.hpss.help@noaa.gov +* Gaea: oar.gfdl.help@noaa.gov +* S4: david.huber@noaa.gov +* Jet: rdhpcs.jet.help@noaa.gov + +====================== +Restricted data access +====================== + +The GFS system ingests dump data files that contain global observation data. 
A number of these dump files contain restricted data, which means those files come with an extra level of permissions called restricted or ‘rstprod’. Users who wish to run cycled GFS experiments, which both utilize restricted observation data and produce output containing restricted data, will need to gain rstprod group access. + +NOTE: Only non-restricted data is available on S4. + +To request rstprod access, follow (a) and/or (b) below: + +a) If you need restricted data access on WCOSS2, read the details about restricted data and fill out the form here: + +https://www.nco.ncep.noaa.gov/sib/restricted_data/restricted_data_sib/ + +b) If you need restricted data access on RDHPCS systems: go to the AIM system, click on "Request new access to a project", select the rstprod project, provide justification for the needed access, and submit the request: + +https://aim.rdhpcs.noaa.gov/ + +==================================== +Optimizing the global workflow on S4 +==================================== + +The S4 cluster is relatively small, so optimizations are recommended to improve cycled runtimes. Please contact David Huber (david.huber@noaa.gov) if you are planning on running a cycled experiment on this system to obtain optimized configuration files. + +============ +Git settings +============ + +^^^^^^ +Merges +^^^^^^ + +Use the following command to have merge commits include the one-line description of all the commits being merged (up to 200). You only need to do this once on each machine; it will be saved to your git settings:: + + git config --global merge.log 200 + +Use the ``--no-ff`` option to ensure there is always a merge commit, even when a fast-forward is possible. Exception: if the merge contains only a single commit, it can be applied as a fast-forward. + +For any merge with multiple commits, a short synopsis of the merge should appear between the title and the list of commit titles added by merge.log. 
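The two settings above can be exercised end-to-end in a scratch directory. The sketch below builds a throwaway repository (repo, branch, and file names are invented for the demo) and sets ``merge.log`` locally rather than with ``--global`` so your real settings are untouched:

```shell
set -e
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.name  "Demo User"
git config user.email "demo@example.com"
git config merge.log 200            # local equivalent of the --global setting above
echo base > file.txt
git add file.txt && git commit -qm "Add base file"
git checkout -qb feature/my_new_thing
echo change >> file.txt
git add file.txt && git commit -qm "Update file in feature branch"
git checkout -q -                   # back to the starting branch
git merge --no-ff --no-edit feature/my_new_thing
git log --merges --oneline          # the --no-ff merge commit is recorded
```

Without ``--no-ff`` this merge would have fast-forwarded and ``git log --merges`` would show nothing; with it, the branch history keeps an explicit merge commit, and ``merge.log`` fills its message with the titles of the merged commits.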
+ +^^^^^^^ +Version +^^^^^^^ + +It is advised to use Git v2+ when available. At the time of writing this documentation, the default Git clients on the different machines were as noted in the table below. It is recommended that you check the default modules before loading the recommended ones: + ++---------+----------+---------------------------------------+ +| Machine | Default | Recommended | ++---------+----------+---------------------------------------+ +| Hera | v2.18.0 | default | ++---------+----------+---------------------------------------+ +| Orion | v1.8.3.1 | **module load git/2.28.0** | ++---------+----------+---------------------------------------+ +| Jet | v2.18.0 | default | ++---------+----------+---------------------------------------+ +| WCOSS2 | v2.26.2 | default or **module load git/2.29.0** | ++---------+----------+---------------------------------------+ +| S4 | v1.8.3.1 | **module load git/2.30.0** | ++---------+----------+---------------------------------------+ + +^^^^^^^^^^^^^ +Output format +^^^^^^^^^^^^^ + +For proper display of Git command output (e.g. ``git branch`` and ``git diff``), type the following once per machine: + +:: + + git config --global core.pager 'less -FRX' + +If the manage_externals utility fails with the following error, apply the fix shown:: + + Error: fatal: ssh variant 'simple' does not support setting port + Fix: git config --global ssh.variant ssh + +======================================== +Stacksize on R&Ds (Hera, Orion, Jet, S4) +======================================== + +Some GFS components, like the UPP, need an unlimited stacksize. Add the following setting to your appropriate .*rc file to support these components: + +csh:: + + limit stacksize unlimited + +sh/bash/ksh:: + + ulimit -s unlimited + +========================================= +Forecast hangs due to issue with ssh-keys +========================================= + +Did you generate your ssh-keys with a passphrase? If so, remake them without one. 
To test this, try ssh-ing to a different login node; you should be able to do so without being prompted for your passphrase. + +Is your public key in the authorized_keys file? If not, add it:: + + cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys + diff --git a/docs/source/index.rst b/docs/source/index.rst new file mode 100644 index 00000000000..e254a83fa2c --- /dev/null +++ b/docs/source/index.rst @@ -0,0 +1,39 @@ + +############### +Global Workflow +############### + +**Global-workflow** is the end-to-end workflow designed to run global configurations of the UFS weather model for medium-range weather forecasting. It supports both development and operational implementations. It currently supports the Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS) configurations. + +====== +Status +====== + +* State of develop (HEAD) branch: GFSv17+ development +* State of operations (dev/gfs.v16 branch): GFS v16.3.7 `tag: [gfs.v16.3.7] `_ + +============= +Code managers +============= + +* Kate Friedman - @KateFriedman-NOAA / kate.friedman@noaa.gov +* Walter Kolczynski - @WalterKolczynski-NOAA / walter.kolczynski@noaa.gov + +============= +Announcements +============= + +General updates: NOAA employees and affiliates can join the gfs-announce distribution list to get updates on the GFS and global-workflow. Contact Kate Friedman (kate.friedman@noaa.gov) and Walter Kolczynski (walter.kolczynski@noaa.gov) to get added to the list or removed from it. + +GitHub updates: Users should adjust their "Watch" settings for this repo so they receive notifications as they'd like to. Find the "Watch" or "Unwatch" button towards the top right of the `authoritative global-workflow repository page `_ and click it to adjust how you watch the repo. + +.. 
toctree:: + :numbered: + :maxdepth: 3 + + development.rst + components.rst + jobs.rst + hpc.rst + output.rst + run.rst diff --git a/docs/source/init.rst b/docs/source/init.rst new file mode 100644 index 00000000000..b065af23735 --- /dev/null +++ b/docs/source/init.rst @@ -0,0 +1,573 @@ +================== +Initial Conditions +================== + +There are two types of initial conditions for the global-workflow: + +#. Warm start: these ICs are taken directly from either the GFS in production or an experiment "warmed" up (at least one cycle in). +#. Cold start: any ICs converted to a new resolution or grid (e.g. C768 -> C384). These ICs are often prepared by chgres_cube (change resolution utility). + +Most users will initiate their experiments with cold start ICs unless running high resolution (C768 deterministic with C384 EnKF) for a date with warm starts available. It is `not recommended` to run high resolution unless required or as part of final testing. + +Atmosphere Resolutions: + +* C48 = 2 degree ≈ 200km +* C96 = 1 degree ≈ 100km +* C192 = 1/2 degree ≈ 50km +* C384 = 1/4 degree ≈ 25km +* C768 = 1/8 degree ≈ 13km +* C1152 ≈ 9km +* C3072 ≈ 3km + +Supported atmosphere resolutions in global-workflow: C48, C96, C192, C384, C768 + +Ocean Resolutions: + +* mx500 = 5 degree +* mx100 = 1 degree +* mx050 = 1/2 degree +* mx025 = 1/4 degree + +Supported ocean resolutions in global-workflow: mx500, mx100 + +^^^^^^^^^^^^^^^^^^^^^^^^^ +Staged Initial Conditions +^^^^^^^^^^^^^^^^^^^^^^^^^ + +* :ref:`Cycled ATM-only` +* :ref:`Cycled ATM w/ Coupled (S2S) model` +* :ref:`Prototype` + +.. 
_staged_ics_cycled_atmonly: + +*************** +Cycled ATM-only +*************** + +Cold-start atmosphere-only cycled C96 deterministic C48 enkf (80 members) ICs are available in the following locations on supported platforms: + +:: + + Hera: /scratch1/NCEPDEV/global/glopara/data/ICSDIR/C96C48 + Orion: /work/noaa/global/glopara/data/ICSDIR/C96C48 + WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/data/ICSDIR/C96C48 + +Start date = 2021122018 + +:: + + -bash-4.2$ tree /scratch1/NCEPDEV/global/glopara/data/ICSDIR/C96C48/ + |-- enkfgdas.20211220 + | `-- 18 + | |-- mem### (where ### = 001 -> 080) + | | `-- atmos + | | `-- INPUT + | | |-- gfs_ctrl.nc + | | |-- gfs_data.tile1.nc + | | |-- gfs_data.tile2.nc + | | |-- gfs_data.tile3.nc + | | |-- gfs_data.tile4.nc + | | |-- gfs_data.tile5.nc + | | |-- gfs_data.tile6.nc + | | |-- sfc_data.tile1.nc + | | |-- sfc_data.tile2.nc + | | |-- sfc_data.tile3.nc + | | |-- sfc_data.tile4.nc + | | |-- sfc_data.tile5.nc + | | `-- sfc_data.tile6.nc + `-- gdas.20211220 + `-- 18 + `-- atmos + |-- INPUT + | |-- gfs_ctrl.nc + | |-- gfs_data.tile1.nc + | |-- gfs_data.tile2.nc + | |-- gfs_data.tile3.nc + | |-- gfs_data.tile4.nc + | |-- gfs_data.tile5.nc + | |-- gfs_data.tile6.nc + | |-- sfc_data.tile1.nc + | |-- sfc_data.tile2.nc + | |-- sfc_data.tile3.nc + | |-- sfc_data.tile4.nc + | |-- sfc_data.tile5.nc + | `-- sfc_data.tile6.nc + |-- gdas.t18z.abias + |-- gdas.t18z.abias_air + |-- gdas.t18z.abias_pc + `-- gdas.t18z.radstat + +.. 
_staged_ics_cycled_coupled: + +********************************* +Cycled ATM w/ Coupled (S2S) model +********************************* + +Warm-start cycled w/ coupled (S2S) model C48 atmosphere 5 degree ocean/ice ICs are available in the following locations on supported platforms: + +:: + + Hera: /scratch1/NCEPDEV/global/glopara/data/ICSDIR/C48mx500 + Orion: /work/noaa/global/glopara/data/ICSDIR/C48mx500 + WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/data/ICSDIR/C48mx500 + +Start date = 2021032312 + +:: + + -bash-4.2$ tree /scratch1/NCEPDEV/global/glopara/data/ICSDIR/C48mx500 + `-- gdas.20210323 + |-- 06 + | |-- atmos + | | `-- RESTART + | | |-- 20210323.120000.ca_data.tile1.nc + | | |-- 20210323.120000.ca_data.tile2.nc + | | |-- 20210323.120000.ca_data.tile3.nc + | | |-- 20210323.120000.ca_data.tile4.nc + | | |-- 20210323.120000.ca_data.tile5.nc + | | |-- 20210323.120000.ca_data.tile6.nc + | | |-- 20210323.120000.coupler.res + | | |-- 20210323.120000.fv_core.res.nc + | | |-- 20210323.120000.fv_core.res.tile1.nc + | | |-- 20210323.120000.fv_core.res.tile2.nc + | | |-- 20210323.120000.fv_core.res.tile3.nc + | | |-- 20210323.120000.fv_core.res.tile4.nc + | | |-- 20210323.120000.fv_core.res.tile5.nc + | | |-- 20210323.120000.fv_core.res.tile6.nc + | | |-- 20210323.120000.fv_srf_wnd.res.tile1.nc + | | |-- 20210323.120000.fv_srf_wnd.res.tile2.nc + | | |-- 20210323.120000.fv_srf_wnd.res.tile3.nc + | | |-- 20210323.120000.fv_srf_wnd.res.tile4.nc + | | |-- 20210323.120000.fv_srf_wnd.res.tile5.nc + | | |-- 20210323.120000.fv_srf_wnd.res.tile6.nc + | | |-- 20210323.120000.fv_tracer.res.tile1.nc + | | |-- 20210323.120000.fv_tracer.res.tile2.nc + | | |-- 20210323.120000.fv_tracer.res.tile3.nc + | | |-- 20210323.120000.fv_tracer.res.tile4.nc + | | |-- 20210323.120000.fv_tracer.res.tile5.nc + | | |-- 20210323.120000.fv_tracer.res.tile6.nc + | | |-- 20210323.120000.phy_data.tile1.nc + | | |-- 20210323.120000.phy_data.tile2.nc + | | |-- 20210323.120000.phy_data.tile3.nc + | | |-- 
20210323.120000.phy_data.tile4.nc + | | |-- 20210323.120000.phy_data.tile5.nc + | | |-- 20210323.120000.phy_data.tile6.nc + | | |-- 20210323.120000.sfc_data.tile1.nc + | | |-- 20210323.120000.sfc_data.tile2.nc + | | |-- 20210323.120000.sfc_data.tile3.nc + | | |-- 20210323.120000.sfc_data.tile4.nc + | | |-- 20210323.120000.sfc_data.tile5.nc + | | `-- 20210323.120000.sfc_data.tile6.nc + | |-- ice + | | `-- RESTART + | | `-- 20210323.120000.cice_model.res.nc + | |-- med + | | `-- RESTART + | | `-- 20210323.120000.ufs.cpld.cpl.r.nc + | `-- ocean + | `-- RESTART + | `-- 20210323.120000.MOM.res.nc + `-- 12 + |-- atmos + | |-- gdas.t12z.abias + | |-- gdas.t12z.abias_air + | |-- gdas.t12z.abias_int + | |-- gdas.t12z.abias_pc + | `-- gdas.t12z.radstat + `-- ocean + `-- gdas.t12z.ocninc.nc + +.. _staged_ics_prototype: + +********* +Prototype +********* + +Forecast-only P8 prototype initial conditions are made available to users on supported platforms in the following locations: + +:: + + WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/IC/COUPLED + HERA: /scratch1/NCEPDEV/climate/role.ufscpara/IC + ORION: /work/noaa/global/glopara/data/ICSDIR/prototype_ICs + JET: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/ICSDIR/prototype_ICs + S4: /data/prod/glopara/coupled_ICs + +These locations are known within the workflow via paths set in ``parm/config/config.coupled_ic``. + +^^^^^^^^^^^^^^^^^^^^^^^^^^ +Prepare Initial Conditions +^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. _automated-generation: + +******************** +Automated Generation +******************** + +.. _cycled: + +----------- +Cycled mode +----------- + +Not yet supported. See :ref:`Manual Generation` section below for how to create your ICs yourself (outside of workflow). + +.. _forecastonly-coupled: + +--------------------- +Forecast-only coupled +--------------------- +Coupled initial conditions are currently only generated offline and copied prior to the forecast run. 
Prototype initial conditions will automatically be used when setting up an experiment as an S2SW app, there is no need to do anything additional. Copies of initial conditions from the prototype runs are currently maintained on Hera, Orion, Jet, and WCOSS2. The locations used are determined by ``parm/config/config.coupled_ic``. If you need prototype ICs on another machine, please contact Walter (Walter.Kolczynski@noaa.gov). + +.. _forecastonly-atmonly: + +----------------------------- +Forecast-only mode (atm-only) +----------------------------- + +Forecast-only mode in global workflow includes ``getic`` and ``init`` jobs for the gfs suite. The ``getic`` job pulls inputs for ``chgres_cube`` (init job) or warm start ICs into your ``ROTDIR/COMROT``. The ``init`` job then ingests those files to produce initial conditions for your experiment. + +Users on machines without HPSS access (e.g. Orion) need to perform the ``getic`` step manually and stage inputs for the ``init`` job. The table below lists the needed files for ``init`` and where to place them in your ``ROTDIR``. + +Note for table: yyyy=year; mm=month; dd=day; hh=cycle + +Operations/production output location on HPSS: /NCEPPROD/hpssprod/runhistory/rh ``yyyy``/``yyyymm``/``yyyymmdd``/ + ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| Source | Files | Tarball name | Where in ROTDIR | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v12 ops | gfs.t. ``hh`` z.sanl | com_gfs_prod_gfs. ``yyyymmddhh`` .anl.tar | gfs. ``yyyymmdd`` /``hh`` | +| | | | | +| | gfs.t. ``hh`` z.sfcanl | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v13 ops | gfs.t. 
``hh`` z.sanl | com2_gfs_prod_gfs. ``yyyymmddhh`` .anl.tar | gfs. ``yyyymmdd`` /``hh`` | +| | | | | +| | gfs.t. ``hh`` z.sfcanl | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v14 ops | gfs.t. ``hh`` z.atmanl.nemsio | gpfs_hps_nco_ops_com_gfs_prod_gfs. ``yyyymmddhh`` .anl.tar | gfs. ``yyyymmdd`` /``hh`` | +| | | | | +| | gfs.t. ``hh`` z.sfcanl.nemsio | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v15 ops | gfs.t. ``hh`` z.atmanl.nemsio | gpfs_dell1_nco_ops_com_gfs_prod_gfs. ``yyyymmdd`` _ ``hh`` .gfs_nemsioa.tar | gfs. ``yyyymmdd`` /``hh`` | +| | | | | +| pre-2020022600 | gfs.t. ``hh`` z.sfcanl.nemsio | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v15 ops | gfs.t. ``hh`` z.atmanl.nemsio | com_gfs_prod_gfs. ``yyyymmdd`` _ ``hh`` .gfs_nemsioa.tar | gfs. ``yyyymmdd`` /``hh`` | +| | | | | +| | gfs.t. ``hh`` z.sfcanl.nemsio | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v16 retro | gfs.t. ``hh`` z.atmanl.nc | gfs_netcdfa.tar* | gfs. ``yyyymmdd`` /``hh``/atmos| +| | | | | +| | gfs.t. ``hh`` z.sfcanl.nc | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v16.0[1] ops | gfs.t. ``hh`` z.atmanl.nc | com_gfs_prod_gfs. ``yyyymmdd`` _ ``hh`` .gfs_nca.tar | gfs. ``yyyymmdd`` /``hh``/atmos| +| | | | | +| | gfs.t. 
``hh`` z.sfcanl.nc | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ +| v16.2[3]+ ops | gfs.t. ``hh`` z.atmanl.nc | com_gfs_ ``gfs_ver`` _gfs. ``yyyymmdd`` _ ``hh`` .gfs_nca.tar | gfs. ``yyyymmdd`` /``hh``/atmos| +| | | | | +| | gfs.t. ``hh`` z.sfcanl.nc | | | ++----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ + +For HPSS path, see retrospective table in :ref:`pre-production parallel section ` below + +.. _manual-generation: + +***************** +Manual Generation +***************** + +.. note:: + Initial conditions cannot be generated on S4. These must be generated on another supported platform then pushed to S4. If you do not have access to a supported system or need assistance, please contact David Huber (david.huber@noaa.gov). + +.. _coldstarts: + +The following information is for users needing to generate cold-start initial conditions for a cycled experiment that will run at a different resolution or layer amount than the operational GFS (C768C384L127). + +The ``chgres_cube`` code is available from the `UFS_UTILS repository `_ on GitHub and can be used to convert GFS ICs to a different resolution or number of layers. Users may clone the develop/HEAD branch or the same version used by global-workflow develop (found in ``sorc/checkout.sh``). The ``chgres_cube`` code/scripts currently support the following GFS inputs: + +* pre-GFSv14 +* GFSv14 +* GFSv15 +* GFSv16 + +Users can use the copy of UFS_UTILS that is already cloned and built within their global-workflow clone or clone/build it separately: + +Within a built/linked global-workflow clone: + +:: + + cd sorc/ufs_utils.fd/util/gdas_init + +Clone and build separately: + +1. 
Clone UFS_UTILS: + +:: + + git clone --recursive https://github.com/NOAA-EMC/UFS_UTILS.git + +Then switch to a different tag or use the default branch (develop). + +2. Build UFS_UTILS: + +:: + + sh build_all.sh + cd fix + sh link_fixdirs.sh emc $MACHINE + +where ``$MACHINE`` is ``wcoss2``, ``hera``, or ``jet``. + +.. note:: + UFS-UTILS builds on Orion but due to the lack of HPSS access on Orion the ``gdas_init`` utility is not supported there. + +3. Configure your conversion: + +:: + + cd util/gdas_init + vi config + +Read the doc block at the top of the config and adjust the variables to meet your needs (e.g. ``yy, mm, dd, hh`` for ``SDATE``). + +Most users will want to adjust the following ``config`` settings for the current system design: + +#. EXTRACT_DATA=YES (to pull original ICs to convert off HPSS) +#. RUN_CHGRES=YES (to run chgres_cube on the original ICs pulled off HPSS) +#. LEVS=128 (for the L127 GFS) + +4. Submit conversion script: + +:: + + ./driver.$MACHINE.sh + +where ``$MACHINE`` is currently ``wcoss2``, ``hera`` or ``jet``. Additional options will be available as support for other machines expands. + +.. note:: + UFS-UTILS builds on Orion but due to lack of HPSS access there is no ``gdas_init`` driver for Orion nor support to pull initial conditions from HPSS for the ``gdas_init`` utility. + +Several small jobs will be submitted: + + - 1 job to pull inputs off HPSS + - 1 or 2 jobs to run ``chgres_cube`` (1 for the deterministic/hires ICs and 1 for the EnKF ensemble members) + +The chgres jobs will have a dependency on the data-pull jobs and will wait to run until all data-pull jobs have completed. + +5. Check output: + +In the config you will have defined an output folder called ``$OUTDIR``. The converted output will be found there, including the needed abias and radstat initial condition files (if CDUMP=gdas). 
The files will be in the needed directory structure for the global-workflow system, therefore a user can move the contents of their ``$OUTDIR`` directly into their ``$ROTDIR/$COMROT``. + +Please report bugs to George Gayno (george.gayno@noaa.gov) and Kate Friedman (kate.friedman@noaa.gov). + +.. _warmstarts-prod: + +***************************** +Warm starts (from production) +***************************** + +Output and warm start initial conditions from the operational GFS (FV3GFS) are saved on HPSS. Users can pull these warm start initial conditions from tape for their use in running operational resolution experiments. + +See production output in the following location on HPSS: + +``/NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD`` + +Example location for January 2nd 2023: + +``/NCEPPROD/hpssprod/runhistory/rh2023/202301/20230102`` + +Example listing for January 2nd 2023 00z (2023010200) production tarballs: + +:: + + -bash-4.2$ hpsstar dir /NCEPPROD/hpssprod/runhistory/rh2023/202301/20230102 | grep gfs | grep _00. 
| grep -v idx + [connecting to hpsscore1.fairmont.rdhpcs.noaa.gov/1217] + -rw-r----- 1 nwprod rstprod 34824086016 Jan 4 03:31 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas.tar + -rw-r--r-- 1 nwprod prod 219779890688 Jan 4 04:04 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp1.tar + -rw-r--r-- 1 nwprod prod 219779921408 Jan 4 04:13 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp2.tar + -rw-r--r-- 1 nwprod prod 219775624192 Jan 4 04:23 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp3.tar + -rw-r--r-- 1 nwprod prod 219779726848 Jan 4 04:33 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp4.tar + -rw-r--r-- 1 nwprod prod 219777990656 Jan 4 04:42 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp5.tar + -rw-r--r-- 1 nwprod prod 219780963328 Jan 4 04:52 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp6.tar + -rw-r--r-- 1 nwprod prod 219775471104 Jan 4 05:02 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp7.tar + -rw-r--r-- 1 nwprod prod 219779499008 Jan 4 05:11 com_gfs_v16.3_enkfgdas.20230102_00.enkfgdas_restart_grp8.tar + -rw-r----- 1 nwprod rstprod 2287770624 Jan 4 02:07 com_gfs_v16.3_gdas.20230102_00.gdas.tar + -rw-r--r-- 1 nwprod prod 1026611200 Jan 4 02:07 com_gfs_v16.3_gdas.20230102_00.gdas_flux.tar + -rw-r--r-- 1 nwprod prod 91233038336 Jan 4 02:16 com_gfs_v16.3_gdas.20230102_00.gdas_nc.tar + -rw-r--r-- 1 nwprod prod 10865070592 Jan 4 02:08 com_gfs_v16.3_gdas.20230102_00.gdas_pgrb2.tar + -rw-r----- 1 nwprod rstprod 69913956352 Jan 4 02:11 com_gfs_v16.3_gdas.20230102_00.gdas_restart.tar + -rw-r--r-- 1 nwprod prod 18200814080 Jan 4 02:17 com_gfs_v16.3_gdas.20230102_00.gdaswave_keep.tar + -rw-r--r-- 1 nwprod prod 5493360128 Jan 4 02:18 com_gfs_v16.3_gfs.20230102_00.gfs.tar + -rw-r--r-- 1 nwprod prod 62501531648 Jan 4 02:21 com_gfs_v16.3_gfs.20230102_00.gfs_flux.tar + -rw-r--r-- 1 nwprod prod 121786191360 Jan 4 02:41 com_gfs_v16.3_gfs.20230102_00.gfs_nca.tar + -rw-r--r-- 1 nwprod prod 130729495040 Jan 4 02:48 
com_gfs_v16.3_gfs.20230102_00.gfs_ncb.tar + -rw-r--r-- 1 nwprod prod 138344908800 Jan 4 02:29 com_gfs_v16.3_gfs.20230102_00.gfs_pgrb2.tar + -rw-r--r-- 1 nwprod prod 59804635136 Jan 4 02:32 com_gfs_v16.3_gfs.20230102_00.gfs_pgrb2b.tar + -rw-r--r-- 1 nwprod prod 25095460864 Jan 4 02:34 com_gfs_v16.3_gfs.20230102_00.gfs_restart.tar + -rw-r--r-- 1 nwprod prod 21573020160 Jan 4 02:49 com_gfs_v16.3_gfs.20230102_00.gfswave_output.tar + -rw-r--r-- 1 nwprod prod 32850422784 Jan 4 02:51 com_gfs_v16.3_gfs.20230102_00.gfswave_raw.tar + -rw-r----- 1 nwprod rstprod 7419548160 Jan 4 05:15 com_obsproc_v1.1_gfs.20230102_00.obsproc_gfs.tar + +The warm starts and other output from production are at C768 deterministic and C384 EnKF. The warm start files must be converted to your desired resolution(s) using ``chgres_cube`` if you wish to run a different resolution. If you are running a C768C384L127 experiment you can use them as is. + +------------------------------------------------------------------------------------------ +What files should you pull for starting a new experiment with warm starts from production? +------------------------------------------------------------------------------------------ + +That depends on what mode you want to run -- forecast-only or cycled. Whichever mode, navigate to the top of your ``COMROT`` and pull the entirety of the tarball(s) listed below for your mode. The files within the tarball are already in the ``$CDUMP.$PDY/$CYC/$ATMOS`` folder format expected by the system. + +For forecast-only there are two tarballs to pull + +1. File #1 (for starting cycle SDATE): + +:: + + /NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD/com_gfs_vGFSVER_gfs.YYYYMMDD_CC.gfs_restart.tar + +...where ``GFSVER`` is the version of the GFS (e.g. "16.3"). + +2. File #2 (for prior cycle GDATE=SDATE-06): + +:: + + /NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD/com_gfs_vGFSVER_gdas.YYYYMMDD_CC.gdas_restart.tar + +...where ``GFSVER`` is the version of the GFS (e.g. 
"16.3"). + +For cycled mode there are 18 tarballs to pull (9 for SDATE and 9 for GDATE (SDATE-06)): + +:: + + HPSS path: /NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD/ + +Tarballs per cycle: + +:: + + com_gfs_vGFSVER_gdas.YYYYMMDD_CC.gdas_restart.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp1.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp2.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp3.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp4.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp5.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp6.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp7.tar + com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp8.tar + +Go to the top of your ``COMROT/ROTDIR`` and pull the contents of all tarballs there. The tarballs already contain the needed directory structure. + +.. _warmstarts-preprod-parallels: + +******************************************* +Warm starts (from pre-production parallels) +******************************************* + +The most recent pre-implementation parallel series was for GFS v16 (implemented March 2021). For the prior v15 (Q2FY19) series, see the additional table below. + +* **What resolution are warm-starts available for?** Warm-start ICs are saved at the resolution the model was run at (C768/C384) and can only be used to run at the same resolution combination. If you need to run a different resolution you will need to make your own cold-start ICs. See the cold start section above. +* **What dates have warm-start files saved?** Unfortunately the frequency changed enough during the runs that it’s not easy to provide a definitive list. +* **What files?** All warm-starts are saved in separate tarballs which include “restart” in the name. You need to pull the entirety of each tarball; all files included in the restart tarballs are needed. 
+* **Where are these tarballs?** See below for the location on HPSS for each v16 pre-implementation parallel. +* **What tarballs do I need to grab for my experiment?** Tarballs from two cycles are required. The tarballs are listed below, where $CDATE is your starting cycle and $GDATE is one cycle prior. + + - Forecast-only + + ../$CDATE/gfs_restarta.tar + + ../$GDATE/gdas_restartb.tar + - Cycled w/EnKF + + ../$CDATE/gdas_restarta.tar + + ../$CDATE/enkfgdas_restarta_grp##.tar (where ## is 01 through 08) (note, older tarballs may include a period between enkf and gdas: "enkf.gdas") + + ../$GDATE/gdas_restartb.tar + + ../$GDATE/enkfgdas_restartb_grp##.tar (where ## is 01 through 08) (note, older tarballs may include a period between enkf and gdas: "enkf.gdas") + +* **Where do I put the warm-start initial conditions?** Extraction should occur right inside your COMROT. You may need to rename the enkf folder (enkf.gdas.$PDY -> enkfgdas.$PDY). + +Due to a recent change in the dycore, you may also need an additional offline step to fix the checksum of the NetCDF files for warm start. See the :ref:`Fix netcdf checksum section <gfsv17-checksum>`. + +.. 
_retrospective: + +-------------------------------------------------------------- +GFSv16 (March 2021) Pre-Implementation Parallel HPSS Locations +-------------------------------------------------------------- + ++-----------------------------+---------------+--------------------------------------------------+ +| Time Period | Parallel Name | Archive Location on HPSS | +| | | PREFIX=/NCEPDEV/emc-global/5year/emc.glopara | ++-----------------------------+---------------+--------------------------------------------------+ +| 2019050106 ~ 2019060100 | v16retro0e | $PREFIX/WCOSS_D/gfsv16/v16retro0e/``yyyymmddhh`` | ++-----------------------------+---------------+--------------------------------------------------+ +| 2019060106 ~ 2019083118 | v16retro1e | $PREFIX/WCOSS_D/gfsv16/v16retro1e/``yyyymmddhh`` | ++-----------------------------+---------------+--------------------------------------------------+ +| 2019090100 ~ 2019110918 | v16retro2e | $PREFIX/WCOSS_D/gfsv16/v16retro2e/``yyyymmddhh`` | ++-----------------------------+---------------+--------------------------------------------------+ +| 2019111000 ~ 2020122200 | v16rt2 | $PREFIX/WCOSS_D/gfsv16/v16rt2/``yyyymmddhh`` | ++-----------------------------+---------------+--------------------------------------------------+ +| 2020122206 ~ implementation | v16rt2n | $PREFIX/WCOSS_D/gfsv16/v16rt2n/``yyyymmddhh`` | ++-----------------------------+---------------+--------------------------------------------------+ + +---------------------------------------------------------- +GFSv15 (Q2FY19) Pre-Implementation Parallel HPSS Locations +---------------------------------------------------------- + ++---------------------+-----------------+-----------------------------------------------------------+ +| Time Period | Parallel Name | Archive Location on HPSS | +| | | PREFIX=/NCEPDEV/emc-global/5year | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20180525 - 20190612 | 
prfv3rt1 | $PREFIX/emc.glopara/WCOSS_C/Q2FY19/prfv3rt1 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20171125 - 20170831 | fv3q2fy19retro1 | $PREFIX/Fanglin.Yang/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro1 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20170525 - 20170625 | fv3q2fy19retro2 | $PREFIX/emc.glopara/WCOSS_C/Q2FY19/fv3q2fy19retro2 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20170802 - 20171130 | fv3q2fy19retro2 | $PREFIX/Fanglin.Yang/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro2 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20161125 - 20170531 | fv3q2fy19retro3 | $PREFIX/Fanglin.Yang/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro3 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20160817 - 20161130 | fv3q2fy19retro4 | $PREFIX/emc.glopara/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro4 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20160522 - 20160825 | fv3q2fy19retro4 | $PREFIX/emc.glopara/WCOSS_C/Q2FY19/fv3q2fy19retro4 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20151125 - 20160531 | fv3q2fy19retro5 | $PREFIX/emc.glopara/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro5 | ++---------------------+-----------------+-----------------------------------------------------------+ +| 20150503 - 20151130 | fv3q2fy19retro6 | $PREFIX/emc.glopara/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro6 | ++---------------------+-----------------+-----------------------------------------------------------+ + +.. 
_gfsv17-warmstarts: + +*************************************** +Using pre-GFSv17 warm starts for GFSv17 +*************************************** + +If a user wishes to run a high-res (C768C384L127) GFSv17 experiment using warm starts from the operational GFSv16 (or older), they must process the initial condition files before use. See details below in the :ref:`Fix netcdf checksum section <gfsv17-checksum>`. + +.. _gfsv17-checksum: + +------------------------- +Fix NetCDF checksum issue +------------------------- + +Due to a recent change in UFS, the setting to bypass the data verification no longer works, so you may also need an additional offline step to delete the checksum of the NetCDF files for warm start: + +On RDHPCS: + +:: + + module load nco/4.9.3 + +On WCOSS2: + +:: + + module load intel/19.1.3.304 + module load netcdf/4.7.4 + module load udunits/2.2.28 + module load gsl/2.7 + module load nco/4.7.9 + +And then on all platforms: + +:: + + cd $COMROT + for f in $(find ./ -name "*tile*.nc"); do echo "$f"; ncatted -a checksum,,d,, "$f"; done diff --git a/docs/source/jobs.rst b/docs/source/jobs.rst new file mode 100644 index 00000000000..ae7e1cd68a6 --- /dev/null +++ b/docs/source/jobs.rst @@ -0,0 +1,89 @@ +################# +GFS Configuration +################# + +.. figure:: _static/GFS_v16_flowchart.png + + Schematic flow chart for GFS v16 in operations + +The sequence of jobs that are run for an end-to-end (analysis+forecast+post processing+verification) GFS configuration using the Global Workflow is shown above. The system utilizes a collection of scripts that perform the tasks for each step. + +For any cycle the system consists of two suites -- the "gdas" suite which provides the initial guess fields, and the "gfs" suite which creates the initial conditions and forecast of the system. 
As with the operational system, the gdas suite runs for each cycle (00, 06, 12, and 18 UTC); however, to save time and space in experiments, the gfs suite (right side of the diagram) is initially set up to run for only the 00 UTC cycle (see the "run GFS this cycle?" portion of the diagram). The option to run the GFS for all four cycles is available (see the ``gfs_cyc`` variable in the configuration file). + +An experimental run is different from operations in the following ways: + +* Workflow manager: operations utilizes `ecFlow `__, while development currently utilizes `ROCOTO `__. Note: experiments can also be run using ecFlow on platforms with ecFlow servers established. + +* The dump step is not run as it has already been completed during the real-time production runs and dump data is available in the global dump archive on supported machines. + +* Additional steps in experimental mode: + + - verification (vrfy) + + - archive (arch) + +Downstream jobs (e.g. awips, gempak, etc.) are not included in the diagram. Those jobs are not normally run in developmental tests. + +============================= +Jobs in the GFS Configuration +============================= ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| JOB NAME | PURPOSE | ++===================+=======================================================================================================================+ +| anal | Runs the analysis. 1) Runs the atmospheric analysis (global_gsi) to produce analysis increments; 2) Update surface | +| | guess file via global_cycle to create surface analysis on tiles. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| analcalc | Adds the analysis increments to previous cycle’s forecasts to produce atmospheric analysis files. Produces surface | +| | analysis file on Gaussian grid. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| analdiag | Creates netCDF diagnostic files containing observation values, innovation (O-F), error, quality control, as well as | +| | other analysis-related quantities (cnvstat, radstat, ozstat files). | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| arch | Archives select files from the deterministic model and cleans up older data. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| earcN/eamn | Archival script for EnKF: 1) Write select EnKF output to HPSS; 2) Copy select files to online archive; 3) Clean up | +| | EnKF temporary run directories; 4) Remove "old" EnKF files from rotating directory. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| ecenN/ecmn | Recenter ensemble members around hi-res deterministic analysis. GFS v16 recenters ensemble member analysis | +| | increments. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| echgres | Runs chgres on full-resolution forecast for EnKF recentering (ecen). | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| ediag | Same as analdiag but for ensemble members. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| efcsN/efmn | Run 9 hour forecast for each ensemble member. There are 80 ensemble members. 
Each efcs job sequentially processes 8 | +| | ensemble members, so there are 10 efcs jobs in total. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| eobs | Data selection for EnKF update (eupd). | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| eposN/epmn | Generate ensemble mean atmospheric and surface forecast files. The ensemble spread is also computed for atmospheric | +| | forecast files. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| esfc | Generate ensemble surface analyses on tiles. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| eupd | Perform EnKF update (i.e., generate ensemble member analyses). | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| fcst | Runs the forecast (with or without one-way waves). | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| gldas | Runs the Global Land Data Assimilation System (GLDAS). | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| metpN | Runs MET/METplus verification via EMC_verif-global. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| prep | Runs the data preprocessing prior to the analysis (storm relocation if needed and generation of prepbufr file). 
| ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| postN | Runs the post processor. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| vrfy | Runs the verification tasks. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| waveinit | Runs wave initialization step. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| waveprep | Runs wave prep step. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| wavepostsbs | Runs wave post-processing side-by-side. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| wavepostbndpnt | Runs wave post-processing for boundary points. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| wavepostbndpntbll | Runs wave post-processing for boundary points bulletins. | ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ +| wavepostpnt | Runs wave post-processing for points. 
| ++-------------------+-----------------------------------------------------------------------------------------------------------------------+ diff --git a/docs/source/monitor_rocoto.rst b/docs/source/monitor_rocoto.rst new file mode 100644 index 00000000000..1b8b2a38360 --- /dev/null +++ b/docs/source/monitor_rocoto.rst @@ -0,0 +1,136 @@ +================== +Monitor ROCOTO Run +================== + +Click `here `__ to view full rocoto documentation on GitHub + + +^^^^^^^^^^^^^^^^^^ +Using command line +^^^^^^^^^^^^^^^^^^ + +You can use Rocoto commands with arguments to check the status of your experiment. + +Start or continue a run: + +:: + + rocotorun -d /path/to/workflow/database/file -w /path/to/workflow/xml/file + +Check the status of the workflow: + +:: + + rocotostat -d /path/to/workflow/database/file -w /path/to/workflow/xml/file [-c YYYYMMDDCCmm,[YYYYMMDDCCmm,...]] [-t taskname,[taskname,...]] [-s] [-T] + +.. note:: + YYYYMMDDCCmm = YearMonthDayCycleMinute ...where mm/Minute is ’00’ for all cycles currently. + +Check the status of a job: + +:: + + rocotocheck -d /path/to/workflow/database/file -w /path/to/workflow/xml/file -c YYYYMMDDCCmm -t taskname + +Force a task to run (ignores dependencies - USE CAREFULLY!): + +:: + + rocotoboot -d /path/to/workflow/database/file -w /path/to/workflow/xml/file -c YYYYMMDDCCmm -t taskname + +Rerun task(s): + +:: + + rocotorewind -d /path/to/workflow/database/file -w /path/to/workflow/xml/file -c YYYYMMDDCCmm -t taskname + + (If job is currently queued or running rocoto will kill the job. Run rocotorun afterwards to fire off rewound task.) + +Set a task to complete (overwrites current state): + +:: + + rocotocomplete -d /path/to/workflow/database/file -w /path/to/workflow/xml/file -c YYYYMMDDCCmm -t taskname + +(Will not kill queued or running job, only update status.) + +Several dates and task names may be specified in the same command by adding more -c and -t options. However, lists are not allowed. 
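Every rocoto command above takes the same ``-d``/``-w`` argument pair. The sketch below is a hypothetical convenience helper (``rocoto_args`` and the example ``EXPDIR``/``PSLOT`` values are assumptions, not part of the workflow), assuming the database and xml file are named after your PSLOT and live in your EXPDIR as described in the setup documentation:

```shell
# Hypothetical helper -- builds the -d/-w argument pair used by every
# rocoto command above. EXPDIR and PSLOT values are placeholders.
EXPDIR=/some_safe_disk_area/Joe.Schmo/expdir/test
PSLOT=test

rocoto_args() {
    # Database and XML file live in EXPDIR and are named after PSLOT
    echo "-d ${EXPDIR}/${PSLOT}.db -w ${EXPDIR}/${PSLOT}.xml"
}

# Example: check overall status, then inspect one task of one cycle:
#   rocotostat  $(rocoto_args)
#   rocotocheck $(rocoto_args) -c 202001010000 -t taskname
echo "rocotostat $(rocoto_args)"
```

With such a helper, ``rocotorun $(rocoto_args)`` and ``rocotostat $(rocoto_args)`` stay consistent if the experiment paths ever change.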
+ +^^^^^^^^^^^^^^^^^ +Use ROCOTO viewer +^^^^^^^^^^^^^^^^^ + +An alternative approach is to use a GUI that was designed to assist with monitoring global workflow experiments that use ROCOTO. It can be found under the ``workflow`` folder in global-workflow. + +***** +Usage +***** + +:: + + ./rocoto_viewer.py -d /path/to/workflow/database/file -w /path/to/workflow/xml/file + +.. note:: + Note 1: Terminal/window must be wide enough to display all experiment information columns, viewer will complain if not. + + Note 2: The viewer requires the full path to the database and xml files if you are not in your EXPDIR when you invoke it. + + Note 3: Only ``TERM=xterm`` is supported. You may wish to create a shell function to switch automatically if you are in a different terminal: + + Bash example: + + :: + + function rv { + oldterm=${TERM}; + export TERM='xterm'; + ${PATH_TO_VIEWER}/rocoto_viewer.py $@; + export TERM=${oldterm}; + } + +********************* +What the viewer shows +********************* + + .. figure:: _static/fv3_rocoto_view.png + + Sample output from Rocoto viewer + +The figure above shows a sample output from a Rocoto viewer for a running experiment. Where: + + * First column: cycle (YYYYMMDDCCmm, YYYY=year, MM=month, DD=day, CC=cycle hour, mm=minute) + * Second column: task name (a "<" symbol indicates a group/meta-task, click "x" when meta-task is selected to expand/collapse) + * Third column: job ID from scheduler + * Fourth column: job state (QUEUED, RUNNING, SUCCEEDED, FAILED, or DEAD) + * Fifth column: exit code (0 if all ended well) + * Sixth column: number of tries/attempts to run job (0 when not yet run or just rewound, 1 when run once successfully, 2+ for multiple tries up to max try value where job is considered DEAD) + * Seventh column: job duration in seconds + +************************** +How to navigate the viewer +************************** + +The rocoto viewer accepts both mouse and keyboard inputs. 
Click “h” for help menu and more options. + +Available viewer commands:: + + c = get information on selected job + r = rewind (rerun) selected job, group, or cycle + R = run rocotorun + b = boot (forcibly run) selected job or group + -> = right arrow key, advance viewer forward to next cycle + <- = left arrow key, advance viewer backward to previous cycle + Q = quit/exit viewer + +Advanced features: + + * Select multiple tasks at once + + - Click “Enter” on a task to select it, click on other tasks or use the up/down arrows to move to other tasks and click “Enter” to select them as well. + - When you next choose “r” for rewinding the pop-up window will now ask if you are sure you want to rewind all those selected tasks. + + * Rewind entire group or cycle + + - Group - While group/metatask is collapsed (<) click “r” to rewind whole group/metatask. + - Cycle - Use up arrow to move selector up past the first task until the entire left column is highlighted. Click “r” and the entire cycle will be rewound. + diff --git a/docs/source/output.rst b/docs/source/output.rst new file mode 100644 index 00000000000..5ccbbb0fc18 --- /dev/null +++ b/docs/source/output.rst @@ -0,0 +1,20 @@ +############### +Plotting Output +############### + +=============== +Analysis output +=============== + +The `GSI Monitor `_ repository contains a monitoring package called **RadMon**. This package reads the information on the radiances contained in the radstat files, such as quality control flags and departure statistics, and produces a webpage with many plots such as time series of data counts for a particular instrument. You can also directly compare two different experiments with this tool. If there are quantities that you are interested in but the RadMon package is not plotting them for you, you can use the existing RadMon code as a guide for how to read them and plot them yourself. The radstat files contain a wealth of information. 
+ +The RadMon package can be found under the ``src/Radiance_Monitor`` folder within the `GSI Monitor`_. If checked out under global-workflow you will find it under ``gsi_monitor.fd/src/Radiance_Monitor``. + +If you have questions or issues getting the package to work for you, please contact the developer of RadMon: Ed Safford (edward.safford@noaa.gov). + +=============== +Forecast output +=============== + +This section will be updated when we have some basic plotting utilities using EMCPY. + diff --git a/docs/source/run.rst b/docs/source/run.rst new file mode 100644 index 00000000000..56728d32822 --- /dev/null +++ b/docs/source/run.rst @@ -0,0 +1,16 @@ +################### +Run Global Workflow +################### + +Here we will show how you can run an experiment using the Global Workflow. The Global Workflow is regularly evolving and the underlying UFS-weather-model that it drives can run many different configurations, so this part of the document will be updated regularly. The workflow as it is configured today can be run as forecast-only or cycled (forecast+Data Assimilation). Since cycled mode requires a number of Data Assimilation supporting repositories to be checked out, the instructions for the two modes from the initial checkout stage will be slightly different. Apart from these two, there is a third mode that is rarely used in development and is primarily for operational use. This mode switches on specialized post processing needed by the aviation industry. Since the files associated with this mode are restricted, only select users will have need and/or ability to run in this mode. + +.. 
toctree:: + + clone.rst + init.rst + setup.rst + configure.rst + start.rst + monitor_rocoto.rst + view.rst + errors_faq.rst diff --git a/docs/source/setup.rst b/docs/source/setup.rst new file mode 100644 index 00000000000..eb13b4b6f36 --- /dev/null +++ b/docs/source/setup.rst @@ -0,0 +1,301 @@ +================ +Experiment Setup +================ + +Global workflow uses a set of scripts to help configure and set up the drivers (also referred to as the Workflow Manager) that run the end-to-end system. While we currently use a `ROCOTO `__ based system, which is documented here, an `ecFlow `__ based system is also under development and will be introduced to the Global Workflow when it is mature. To run the setup scripts, make sure you have a copy of ``python3`` with ``numpy`` available. The easiest way to guarantee this is to load python from the `official hpc-stack installation `_ for the machine you are on: + +.. list-table:: Python Module Load Commands + :widths: 25 120 + :header-rows: 1 + + * - **MACHINE** + - **COMMAND(S)** + * - Hera + - :: + + module use -a /contrib/anaconda/modulefiles + module load anaconda/anaconda3-5.3.1 + * - Orion + - :: + + module load python/3.7.5 + * - WCOSS2 + - :: + + module load python/3.8.6 + * - S4 + - :: + + module load miniconda/3.8-s4 + + * - Jet + - :: + + module use /mnt/lfs4/HFIP/hfv3gfs/role.epic/miniconda3/modulefiles + module load miniconda3/4.12.0 + conda activate ufswm + +If running with Rocoto, make sure to have a Rocoto module loaded before running the setup scripts: + +.. 
list-table:: ROCOTO Module Load Commands + :widths: 25 120 + :header-rows: 1 + + * - **MACHINE** + - **COMMAND(S)** + * - Hera + - :: + + module load rocoto/1.3.3 + * - Orion + - :: + + module load contrib + module load rocoto/1.3.3 + * - WCOSS2 + - :: + + module use /apps/ops/test/nco/modulefiles/ + module load core/rocoto/1.3.5 + * - S4 + - :: + + module load rocoto/1.3.4 + * - Jet + - :: + + module load rocoto/1.3.3 + +^^^^^^^^^^^^^^^^^^^^^^^^ +Forecast-only experiment +^^^^^^^^^^^^^^^^^^^^^^^^ + +Scripts that will be used: + + * ``workflow/setup_expt.py`` + * ``workflow/setup_xml.py`` + +*************************************** +Step 1: Run experiment generator script +*************************************** + +The following command examples include variables for reference, but users should pass explicit values rather than environment variables when submitting the commands. Exporting variables like EXPDIR to your environment causes an error when the python scripts run. Please explicitly include the argument inputs when running both setup scripts: + +:: + + cd workflow + ./setup_expt.py forecast-only --idate $IDATE --edate $EDATE [--app $APP] [--start $START] [--gfs_cyc $GFS_CYC] [--resdet $RESDET] + [--pslot $PSLOT] [--configdir $CONFIGDIR] [--comrot $COMROT] [--expdir $EXPDIR] + +where: + + * ``forecast-only`` is the first positional argument that instructs the setup script to produce an experiment directory for forecast-only experiments. 
+ * ``$APP`` is the target application, one of: + + - ATM: atmosphere-only [default] + - ATMW: atm-wave + - ATMA: atm-aerosols + - S2S: atm-ocean-ice + - S2SW: atm-ocean-ice-wave + - S2SWA: atm-ocean-ice-wave-aerosols + + * ``$START`` is the start type (warm or cold [default]) + * ``$IDATE`` is the initial start date of your run (first cycle CDATE, YYYYMMDDCC) + * ``$EDATE`` is the ending date of your run (YYYYMMDDCC) and is the last cycle that will complete + * ``$PSLOT`` is the name of your experiment [default: test] + * ``$CONFIGDIR`` is the path to the ``/config`` folder under the copy of the system you're using [default: $TOP_OF_CLONE/parm/config/] + * ``$RESDET`` is the FV3 resolution (i.e. 768 for C768) [default: 384] + * ``$GFS_CYC`` is the forecast frequency (0 = none, 1 = 00z only [default], 2 = 00z & 12z, 4 = all cycles) + * ``$COMROT`` is the path to your experiment output directory. DO NOT include PSLOT folder at end of path, it’ll be built for you. [default: $HOME (but do not use default due to limited space in home directories normally, provide a path to a larger scratch space)] + * ``$EXPDIR`` is the path to your experiment directory where your configs will be placed and where you will find your workflow monitoring files (i.e. rocoto database and xml file). DO NOT include PSLOT folder at end of path, it will be built for you. 
[default: $HOME] + +Examples: + +Atm-only: + +:: + + cd workflow + ./setup_expt.py forecast-only --pslot test --idate 2020010100 --edate 2020010118 --resdet 384 --gfs_cyc 4 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir + +Coupled: + +:: + + cd workflow + ./setup_expt.py forecast-only --app S2SW --pslot coupled_test --idate 2013040100 --edate 2013040100 --resdet 384 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir + +Coupled with aerosols: + +:: + + cd workflow + ./setup_expt.py forecast-only --app S2SWA --pslot coupled_test --idate 2013040100 --edate 2013040100 --resdet 384 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir + +**************************************** +Step 2: Set user and experiment settings +**************************************** + +Go to your EXPDIR and check/change the following variables within your config.base now before running the next script: + + * ACCOUNT + * HOMEDIR + * STMP + * PTMP + * ARCDIR (location on disk for online archive used by verification system) + * HPSSARCH (YES turns on archival) + * HPSS_PROJECT (project on HPSS if archiving) + * ATARDIR (location on HPSS if archiving) + +Some of those variables will be found within a machine-specific if-block, so make sure to change the correct ones for the machine you'll be running on. + +Now is also the time to change any other variables/settings you wish to change in config.base or other configs. `Do that now.` Once done making changes to the configs in your EXPDIR, go back to your clone to run the second setup script. See :doc:`configure` for more information on configuring your run. + +************************************* +Step 3: Run workflow generator script +************************************* + +This step sets up the files needed by the Workflow Manager/Driver. 
At this moment only ROCOTO configurations are generated: + +:: + + ./setup_xml.py $EXPDIR/$PSLOT + +Example: + +:: + + ./setup_xml.py /some_safe_disk_area/Joe.Schmo/expdir/test + +Additional options for setting up Rocoto are available via ``setup_xml.py -h``; they allow users to change the number of failed tries, the number of concurrent cycles and tasks, as well as Rocoto's verbosity level. + +**************************************** +Step 4: Confirm files from setup scripts +**************************************** + +You will now have a rocoto xml file in your EXPDIR ($PSLOT.xml) and a crontab file generated for your use. Rocoto uses CRON as the scheduler. If you do not have a crontab file you may not have had the rocoto module loaded. To fix this, load a rocoto module and then rerun the ``setup_xml.py`` script. Follow directions for setting up the rocoto cron on the platform the experiment is going to run on. + +^^^^^^^^^^^^^^^^^ +Cycled experiment +^^^^^^^^^^^^^^^^^ + +Scripts that will be used: + + * ``workflow/setup_expt.py`` + * ``workflow/setup_xml.py`` + +*************************************** +Step 1: Run experiment generator script +*************************************** + +The following command examples include variables for reference, but users should pass explicit values rather than environment variables when submitting the commands. Exporting variables like EXPDIR to your environment causes an error when the python scripts run. Please explicitly include the argument inputs when running both setup scripts: + +:: + + cd workflow + ./setup_expt.py cycled --idate $IDATE --edate $EDATE [--app $APP] [--start $START] [--gfs_cyc $GFS_CYC] + [--resdet $RESDET] [--resens $RESENS] [--nens $NENS] [--cdump $CDUMP] + [--pslot $PSLOT] [--configdir $CONFIGDIR] [--comrot $COMROT] [--expdir $EXPDIR] [--icsdir $ICSDIR] + +where: + + * ``cycled`` is the first positional argument that instructs the setup script to produce an experiment directory for cycled experiments. 
+ * ``$APP`` is the target application, one of: + + - ATM: atmosphere-only [default] + - ATMW: atm-wave + + * ``$IDATE`` is the initial start date of your run (first cycle CDATE, YYYYMMDDCC) + * ``$EDATE`` is the ending date of your run (YYYYMMDDCC) and is the last cycle that will complete + * ``$START`` is the start type (warm or cold [default]) + * ``$GFS_CYC`` is the forecast frequency (0 = none, 1 = 00z only [default], 2 = 00z & 12z, 4 = all cycles) + * ``$RESDET`` is the FV3 resolution of the deterministic forecast [default: 384] + * ``$RESENS`` is the FV3 resolution of the ensemble (EnKF) forecast [default: 192] + * ``$NENS`` is the number of ensemble members [default: 20] + * ``$CDUMP`` is the starting phase [default: gdas] + * ``$PSLOT`` is the name of your experiment [default: test] + * ``$CONFIGDIR`` is the path to the config folder under the copy of the system you're using [default: $TOP_OF_CLONE/parm/config/] + * ``$COMROT`` is the path to your experiment output directory. DO NOT include PSLOT folder at end of path, it’ll be built for you. [default: $HOME] + * ``$EXPDIR`` is the path to your experiment directory where your configs will be placed and where you will find your workflow monitoring files (i.e. rocoto database and xml file). DO NOT include PSLOT folder at end of path, it will be built for you. [default: $HOME] + * ``$ICSDIR`` is the path to the ICs for your run if generated separately. [default: None] + +.. 
[#] More Coupled configurations in cycled mode are currently under development and not yet available + +Example: + +:: + + cd workflow + ./setup_expt.py cycled --pslot test --configdir /home/Joe.Schmo/git/global-workflow/parm/config --idate 2020010100 --edate 2020010118 --comrot /some_large_disk_area/Joe.Schmo/comrot --expdir /some_safe_disk_area/Joe.Schmo/expdir --resdet 384 --resens 192 --nens 80 --gfs_cyc 4 + +Example ``setup_expt.py`` on Orion: + +:: + + Orion-login-3$ ./setup_expt.py cycled --pslot test --idate 2022010118 --edate 2022010200 --resdet 192 --resens 96 --nens 80 --comrot /work/noaa/stmp/jschmo/comrot --expdir /work/noaa/global/jschmo/expdir + EDITED: /work/noaa/global/jschmo/expdir/test/config.base as per user input. + EDITED: /work/noaa/global/jschmo/expdir/test/config.aeroanl as per user input. + EDITED: /work/noaa/global/jschmo/expdir/test/config.ocnanal as per user input. + +The message about the config.base.default is telling you that you are free to delete it if you wish but it’s not necessary to remove. Your resulting config.base was generated from config.base.default and the default one is there for your information. + +What happens if I run ``setup_expt.py`` again for an experiment that already exists? + +:: + + Orion-login-3$ ./setup_expt.py cycled --pslot test --idate 2022010118 --edate 2022010200 --resdet 192 --resens 96 --nens 80 --comrot /work/noaa/stmp/jschmo/comrot --expdir /work/noaa/global/jschmo/expdir + + directory already exists in /work/noaa/stmp/jschmo/comrot/test + + Do you wish to over-write [y/N]: y + + directory already exists in /work/noaa/global/jschmo/expdir/test + + Do you wish to over-write [y/N]: y + EDITED: /work/noaa/global/jschmo/expdir/test/config.base as per user input. + EDITED: /work/noaa/global/jschmo/expdir/test/config.aeroanl as per user input. + EDITED: /work/noaa/global/jschmo/expdir/test/config.ocnanal as per user input. + +Your ``COMROT`` and ``EXPDIR`` will be deleted and remade. 
Be careful with this! + +**************************************** +Step 2: Set user and experiment settings +**************************************** + +Go to your EXPDIR and check/change the following variables within your config.base before running the next script: + + * ACCOUNT + * HOMEDIR + * STMP + * PTMP + * ARCDIR (location on disk for online archive used by verification system) + * HPSSARCH (YES turns on archival) + * HPSS_PROJECT (project on HPSS if archiving) + * ATARDIR (location on HPSS if archiving) + +Some of these variables are found within a machine-specific if-block, so make sure to change the correct ones for the machine you'll be running on. + +Now is also the time to change any other variables/settings you wish to change in config.base or other configs. *Do that now.* Once you are done making changes to the configs in your EXPDIR, go back to your clone to run the second setup script. See :doc:`configure` for more information on configuring your run. + + +************************************* +Step 3: Run workflow generator script +************************************* + +This step sets up the files needed by the Workflow Manager/Driver. At this moment, only Rocoto configurations are generated: + +:: + + ./setup_xml.py $EXPDIR/$PSLOT + +Example: + +:: + + ./setup_xml.py /some_safe_disk_area/Joe.Schmo/expdir/test + +**************************************** +Step 4: Confirm files from setup scripts +**************************************** + +You will now have a Rocoto XML file ($PSLOT.xml) and a crontab file generated in your EXPDIR. Rocoto is driven by cron. If you do not have a crontab file, you may not have had the rocoto module loaded; to fix this, load a rocoto module and rerun the ``setup_xml.py`` script. Follow the directions for setting up the rocoto cron on the platform where the experiment will run. 
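As a quick sanity check for Step 4, a minimal sketch follows (the ``check_expdir_files`` helper is hypothetical, not part of the workflow; substitute your own EXPDIR and PSLOT) that confirms ``setup_xml.py`` produced both files:

```shell
#!/usr/bin/env bash
# Hypothetical helper: verify that setup_xml.py produced the rocoto XML
# and crontab files for a given EXPDIR/PSLOT pair.
check_expdir_files() {
  local expdir=$1 pslot=$2 f missing=0
  for f in "${pslot}.xml" "${pslot}.crontab"; do
    if [[ -f "${expdir}/${pslot}/${f}" ]]; then
      echo "found ${f}"
    else
      echo "missing ${f} -- load a rocoto module and rerun setup_xml.py"
      missing=1
    fi
  done
  return "${missing}"
}
```

For example, ``check_expdir_files /some_safe_disk_area/Joe.Schmo/expdir test`` reports on ``test.xml`` and ``test.crontab`` and returns nonzero if either is absent.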
diff --git a/docs/source/start.rst b/docs/source/start.rst new file mode 100644 index 00000000000..957971e637f --- /dev/null +++ b/docs/source/start.rst @@ -0,0 +1,48 @@ +============== +Start your run +============== + +Make sure a rocoto module is loaded: ``module load rocoto`` + +If needed, check for available rocoto modules on the machine: ``module avail rocoto`` or ``module spider rocoto`` + +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Start your run from within your EXPDIR +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +:: + + rocotorun -d $PSLOT.db -w $PSLOT.xml + +The first jobs of your run should now be queued or already running (depending on machine traffic). How exciting! + +You'll now have a "logs" folder in both your COMROT and EXPDIR. The EXPDIR log folder contains workflow log files (e.g. rocoto command results) and the COMROT log folder contains logs for each job (previously known as dayfiles). + +^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Set up your experiment cron +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. note:: + Orion currently only supports cron on Orion-login-1. Cron support for other login nodes is coming in the future. + +:: + + crontab -e + +or + +:: + + crontab $PSLOT.crontab + +.. warning:: + + The ``crontab $PSLOT.crontab`` command will overwrite the existing crontab file on your login node. If running multiple crons, we recommend editing the crontab file with the ``crontab -e`` command. + +Check your crontab settings:: + + crontab -l + +Crontab uses the following format:: + + */5 * * * * /path/to/rocotorun -w /path/to/workflow/definition/file -d /path/to/workflow/database/file diff --git a/docs/source/view.rst b/docs/source/view.rst new file mode 100644 index 00000000000..3093755e9a4 --- /dev/null +++ b/docs/source/view.rst @@ -0,0 +1,46 @@ +====================== +View Experiment output +====================== + +The output from your run will be found in the ``COMROT/ROTDIR`` you established. This is also where you placed your initial conditions. 
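A cycled ROTDIR accumulates one dated directory per cycle day. As a small sketch (the ``list_cycle_dirs`` helper is hypothetical, not part of the workflow), those per-cycle directories can be listed in sorted order:

```shell
#!/usr/bin/env bash
# Hypothetical helper: list the dated per-cycle directories in a ROTDIR
# (gfs.YYYYMMDD, gdas.YYYYMMDD, enkfgdas.YYYYMMDD), skipping logs/ etc.
list_cycle_dirs() {
  local rotdir=$1 d
  for d in "${rotdir}"/gfs.* "${rotdir}"/gdas.* "${rotdir}"/enkfgdas.*; do
    # Unmatched globs stay literal, so keep only real directories.
    [[ -d "${d}" ]] && basename "${d}"
  done | sort
}
```

For example, ``list_cycle_dirs /scratch1/NCEPDEV/stmp4/Joe.Schmo/comrot/testcyc192`` would print the ``enkfgdas.*``, ``gdas.*``, and ``gfs.*`` folders shown in the cycled layout below while omitting ``logs`` and ``vrfyarch``.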
Within your ``COMROT`` you will have the following directory structure (based on the type of experiment you run): + +^^^^^^^^^^^^^ +Forecast-only +^^^^^^^^^^^^^ + +:: + + gfs.YYYYMMDD/CC/atmos <- contains deterministic long forecast gfs inputs/outputs (atmosphere) + gfs.YYYYMMDD/CC/wave <- contains deterministic long forecast gfs inputs/outputs (wave) + logs/ <- logs for each cycle in the run + vrfyarch/ <- contains files related to verification and archival + +^^^^^^ +Cycled +^^^^^^ + +:: + + enkfgdas.YYYYMMDD/CC/mem###/atmos <- contains EnKF inputs/outputs for each cycle and each member + gdas.YYYYMMDD/CC/atmos <- contains deterministic gdas inputs/outputs (atmosphere) + gdas.YYYYMMDD/CC/wave <- contains deterministic gdas inputs/outputs (wave) + gfs.YYYYMMDD/CC/atmos <- contains deterministic long forecast gfs inputs/outputs (atmosphere) + gfs.YYYYMMDD/CC/wave <- contains deterministic long forecast gfs inputs/outputs (wave) + logs/ <- logs for each cycle in the run + vrfyarch/ <- contains files related to verification and archival + +Here is an example ``COMROT`` for a cycled run as it may look several cycles in (note the archival steps remove older cycle folders as the run progresses): + +:: + + -bash-4.2$ ll /scratch1/NCEPDEV/stmp4/Joe.Schmo/comrot/testcyc192 + total 88 + drwxr-sr-x 4 Joe.Schmo stmp 4096 Oct 22 04:50 enkfgdas.20190529 + drwxr-sr-x 4 Joe.Schmo stmp 4096 Oct 22 07:20 enkfgdas.20190530 + drwxr-sr-x 6 Joe.Schmo stmp 4096 Oct 22 03:15 gdas.20190529 + drwxr-sr-x 4 Joe.Schmo stmp 4096 Oct 22 07:15 gdas.20190530 + drwxr-sr-x 6 Joe.Schmo stmp 4096 Oct 22 03:15 gfs.20190529 + drwxr-sr-x 4 Joe.Schmo stmp 4096 Oct 22 07:15 gfs.20190530 + drwxr-sr-x 120 Joe.Schmo stmp 12288 Oct 22 07:15 logs + drwxr-sr-x 13 Joe.Schmo stmp 4096 Oct 22 07:07 vrfyarch + diff --git a/ecf/scripts/enkfgdas/analysis/recenter/jenkfgdas_sfc.ecf b/ecf/scripts/enkfgdas/analysis/recenter/jenkfgdas_sfc.ecf index 6928765c1f3..39d4ec2e8d6 100755 --- 
a/ecf/scripts/enkfgdas/analysis/recenter/jenkfgdas_sfc.ecf +++ b/ecf/scripts/enkfgdas/analysis/recenter/jenkfgdas_sfc.ecf @@ -4,7 +4,7 @@ #PBS -q %QUEUE% #PBS -A %PROJ%-%PROJENVIR% #PBS -l walltime=00:06:00 -#PBS -l select=1:mpiprocs=80:ompthreads=1:ncpus=80:mem=60GB +#PBS -l select=1:mpiprocs=80:ompthreads=1:ncpus=80:mem=80GB #PBS -l place=vscatter #PBS -l debug=true diff --git a/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_blending_0p25.ecf b/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_blending_0p25.ecf index f0ce0daacf5..83647d9c154 100755 --- a/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_blending_0p25.ecf +++ b/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_blending_0p25.ecf @@ -4,7 +4,7 @@ #PBS -q %QUEUE% #PBS -A %PROJ%-%PROJENVIR% #PBS -l walltime=00:30:00 -#PBS -l select=1:mpiprocs=1:ompthreads=1:ncpus=1:mem=1GB +#PBS -l select=1:mpiprocs=1:ompthreads=1:ncpus=1:mem=15GB #PBS -l place=vscatter #PBS -l debug=true @@ -35,6 +35,7 @@ module list ############################################################# export cyc=%CYC% export cycle=t%CYC%z +export ICAO2023=no ############################################################ # CALL executable job script here diff --git a/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2.ecf b/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2.ecf index 1c271e57b85..25d4fc37aa5 100755 --- a/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2.ecf +++ b/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2.ecf @@ -4,7 +4,7 @@ #PBS -q %QUEUE% #PBS -A %PROJ%-%PROJENVIR% #PBS -l walltime=00:30:00 -#PBS -l select=1:mpiprocs=1:ompthreads=1:ncpus=1:mem=5GB +#PBS -l select=1:mpiprocs=18:ompthreads=1:ncpus=18:mem=80GB #PBS -l place=vscatter #PBS -l debug=true @@ -28,6 +28,7 @@ module load cray-pals/${cray_pals_ver} module load libjpeg/${libjpeg_ver} module load grib_util/${grib_util_ver} 
module load wgrib2/${wgrib2_ver} +module load cfp/${cfp_ver} module list @@ -36,6 +37,8 @@ module list ############################################################# export cyc=%CYC% export cycle=t%CYC%z +export USE_CFP=YES +export ICAO2023=no ############################################################ # CALL executable job script here diff --git a/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2_0p25.ecf b/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2_0p25.ecf index ce78889886d..9beac6f13a3 100755 --- a/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2_0p25.ecf +++ b/ecf/scripts/gfs/atmos/post_processing/grib2_wafs/jgfs_atmos_wafs_grib2_0p25.ecf @@ -4,7 +4,7 @@ #PBS -q %QUEUE% #PBS -A %PROJ%-%PROJENVIR% #PBS -l walltime=00:30:00 -#PBS -l select=1:mpiprocs=1:ompthreads=1:ncpus=1:mem=1GB +#PBS -l select=1:mpiprocs=11:ompthreads=1:ncpus=11:mem=80GB #PBS -l place=vscatter #PBS -l debug=true @@ -28,6 +28,7 @@ module load cray-pals/${cray_pals_ver} module load libjpeg/${libjpeg_ver} module load grib_util/${grib_util_ver} module load wgrib2/${wgrib2_ver} +module load cfp/${cfp_ver} module list @@ -36,6 +37,8 @@ module list ############################################################# export cyc=%CYC% export cycle=t%CYC%z +export USE_CFP=YES +export ICAO2023=no ############################################################ # CALL executable job script here diff --git a/ecf/scripts/gfs/atmos/post_processing/jgfs_atmos_wafs_gcip.ecf b/ecf/scripts/gfs/atmos/post_processing/jgfs_atmos_wafs_gcip.ecf index cf5d893b6e4..00a87f3948c 100755 --- a/ecf/scripts/gfs/atmos/post_processing/jgfs_atmos_wafs_gcip.ecf +++ b/ecf/scripts/gfs/atmos/post_processing/jgfs_atmos_wafs_gcip.ecf @@ -40,6 +40,7 @@ module list export cyc=%CYC% export cycle=t%CYC%z export USE_CFP=YES +export ICAO2023=no ############################################################ # CALL executable job script here diff --git a/env/CONTAINER.env 
b/env/CONTAINER.env new file mode 100755 index 00000000000..4f85ae56de1 --- /dev/null +++ b/env/CONTAINER.env @@ -0,0 +1,38 @@ +#! /usr/bin/env bash + +if [[ $# -ne 1 ]]; then + + echo "Must specify an input argument to set runtime environment variables!" + echo "argument can be any one of the following:" + echo "atmanlrun atmensanlrun aeroanlrun landanlrun" + echo "anal sfcanl fcst post vrfy metp" + echo "eobs eupd ecen efcs epos" + echo "postsnd awips gempak" + exit 1 + +fi + +step=$1 + +export npe_node_max=40 +export launcher="mpirun" +export mpmd_opt="--multi-prog" + +# Configure MPI environment +export MPI_BUFS_PER_PROC=2048 +export MPI_BUFS_PER_HOST=2048 +export MPI_GROUP_MAX=256 +export MPI_MEMMAP_OFF=1 +export MP_STDOUTMODE="ORDERED" +export KMP_AFFINITY=scatter +export OMP_STACKSIZE=2048000 +export NTHSTACK=1024000000 + +ulimit -s unlimited +ulimit -a + + +if [ "${step}" = "ocnanalrun" ]; then + export NTHREADS_OCNANAL=1 + export APRUN_OCNANAL="${launcher} -n 2" +fi diff --git a/env/HERA.env b/env/HERA.env index cb4121e8397..3ebbcfae6ba 100755 --- a/env/HERA.env +++ b/env/HERA.env @@ -1,10 +1,10 @@ #! /usr/bin/env bash -if [ $# -ne 1 ]; then +if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" 
echo "argument can be any one of the following:" - echo "atmanalrun atmensanalrun" + echo "atmanlrun atmensanlrun aeroanlrun landanlrun" echo "anal sfcanl fcst post vrfy metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -17,6 +17,7 @@ step=$1 export npe_node_max=40 #JKHexport launcher="srun -l --export=ALL" export launcher="srun -l --epilog=/apps/local/bin/report-mem --export=ALL" +export mpmd_opt="--multi-prog --output=${step}.%J.%t.out" # Configure MPI environment #export I_MPI_ADJUST_ALLREDUCE=5 @@ -32,247 +33,283 @@ export NTHSTACK=1024000000 ulimit -s unlimited ulimit -a -export job=${PBS_JOBNAME:-$step} -export jobid=${job}.${PBS_JOBID:-$$} +if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then - -if [ $step = "prep" -o $step = "prepbufr" ]; then - - nth_max=$(($npe_node_max / $npe_node_prep)) + nth_max=$((npe_node_max / npe_node_prep)) export POE="NO" export BACK="NO" export sys_tp="HERA" + export launcher_PREP="srun" -elif [ $step = "waveinit" -o $step = "waveprep" -o $step = "wavepostsbs" -o $step = "wavepostbndpnt" -o $step = "wavepostbndpntbll" -o $step = "wavepostpnt" ]; then +elif [[ "${step}" = "waveinit" ]] || [[ "${step}" = "waveprep" ]] || [[ "${step}" = "wavepostsbs" ]] || [[ "${step}" = "wavepostbndpnt" ]] || [[ "${step}" = "wavepostbndpntbll" ]] || [[ "${step}" = "wavepostpnt" ]]; then - export mpmd="--multi-prog" export CFP_MP="YES" - if [ $step = "waveprep" ]; then export MP_PULSE=0 ; fi + if [[ "${step}" = "waveprep" ]]; then export MP_PULSE=0 ; fi export wavempexec=${launcher} - export wave_mpmd=${mpmd} + export wave_mpmd=${mpmd_opt} -elif [ $step = "atmanalrun" ]; then +elif [[ "${step}" = "atmanlrun" ]]; then - export CFP_MP=${CFP_MP:-"YES"} - export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + nth_max=$((npe_node_max / npe_node_atmanlrun)) - nth_max=$(($npe_node_max / $npe_node_atmanalrun)) + export NTHREADS_ATMANL=${nth_atmanlrun:-${nth_max}} + [[ ${NTHREADS_ATMANL} -gt 
${nth_max} ]] && export NTHREADS_ATMANL=${nth_max} + export APRUN_ATMANL="${launcher} -n ${npe_atmanlrun}" - export NTHREADS_ATMANAL=${nth_atmanalrun:-$nth_max} - [[ $NTHREADS_ATMANAL -gt $nth_max ]] && export NTHREADS_ATMANAL=$nth_max - export APRUN_ATMANAL="$launcher -n $npe_atmanalrun" +elif [[ "${step}" = "atmensanlrun" ]]; then -elif [ $step = "atmensanalrun" ]; then + nth_max=$((npe_node_max / npe_node_atmensanlrun)) - export CFP_MP=${CFP_MP:-"YES"} - export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export NTHREADS_ATMENSANL=${nth_atmensanlrun:-${nth_max}} + [[ ${NTHREADS_ATMENSANL} -gt ${nth_max} ]] && export NTHREADS_ATMENSANL=${nth_max} + export APRUN_ATMENSANL="${launcher} -n ${npe_atmensanlrun}" + +elif [[ "${step}" = "aeroanlrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_aeroanlrun)) + + export NTHREADS_AEROANL=${nth_aeroanlrun:-${nth_max}} + [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} + export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" + +elif [[ "${step}" = "landanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_landanlrun)) + + export NTHREADS_LANDANL=${nth_landanlrun:-${nth_max}} + [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} + export APRUN_LANDANL="${launcher} -n ${npe_landanlrun}" + +elif [[ "${step}" = "ocnanalbmat" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd --multi-prog" + + nth_max=$((npe_node_max / npe_node_ocnanalbmat)) + + export NTHREADS_OCNANAL=${nth_ocnanalbmat:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalbmat}" + +elif [[ "${step}" = "ocnanalrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd --multi-prog" + + nth_max=$((npe_node_max / npe_node_ocnanalrun)) + + export NTHREADS_OCNANAL=${nth_ocnanalrun:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt 
${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalrun}" + +elif [[ "${step}" = "ocnanalchkpt" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd --multi-prog" - nth_max=$(($npe_node_max / $npe_node_atmensanalrun)) + nth_max=$((npe_node_max / npe_node_ocnanalchkpt)) - export NTHREADS_ATMENSANAL=${nth_atmensanalrun:-$nth_max} - [[ $NTHREADS_ATMENSANAL -gt $nth_max ]] && export NTHREADS_ATMENSANAL=$nth_max - export APRUN_ATMENSANAL="$launcher -n $npe_atmensanalrun" + export NTHREADS_OCNANAL=${nth_ocnanalchkpt:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalchkpt}" -elif [ $step = "anal" ]; then +elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then export MKL_NUM_THREADS=4 export MKL_CBWR=AUTO export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - nth_max=$(($npe_node_max / $npe_node_anal)) + nth_max=$((npe_node_max / npe_node_anal)) - export NTHREADS_GSI=${nth_anal:-$nth_max} - [[ $NTHREADS_GSI -gt $nth_max ]] && export NTHREADS_GSI=$nth_max - export APRUN_GSI="$launcher" + export NTHREADS_GSI=${nth_anal:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_anal}}" export NTHREADS_CALCINC=${nth_calcinc:-1} - [[ $NTHREADS_CALCINC -gt $nth_max ]] && export NTHREADS_CALCINC=$nth_max - export APRUN_CALCINC="$launcher" + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} \$ncmd" export NTHREADS_CYCLE=${nth_cycle:-12} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} npe_cycle=${ntiles:-6} - export APRUN_CYCLE="$launcher -n 
$npe_cycle" + export APRUN_CYCLE="${launcher} -n ${npe_cycle}" export NTHREADS_GAUSFCANL=1 npe_gausfcanl=${npe_gausfcanl:-1} - export APRUN_GAUSFCANL="$launcher -n $npe_gausfcanl" + export APRUN_GAUSFCANL="${launcher} -n ${npe_gausfcanl}" -elif [ $step = "sfcanl" ]; then - nth_max=$(($npe_node_max / $npe_node_sfcanl)) +elif [[ "${step}" = "sfcanl" ]]; then + + nth_max=$((npe_node_max / npe_node_sfcanl)) export NTHREADS_CYCLE=${nth_sfcanl:-14} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} npe_sfcanl=${ntiles:-6} - export APRUN_CYCLE="$launcher -n $npe_sfcanl" + export APRUN_CYCLE="${launcher} -n ${npe_sfcanl}" + +elif [[ "${step}" = "gldas" ]]; then -elif [ $step = "gldas" ]; then + export USE_CFP="NO" + export CFP_MP="YES" - nth_max=$(($npe_node_max / $npe_node_gldas)) + nth_max=$((npe_node_max / npe_node_gldas)) - export NTHREADS_GLDAS=${nth_gldas:-$nth_max} - [[ $NTHREADS_GLDAS -gt $nth_max ]] && export NTHREADS_GLDAS=$nth_max - export APRUN_GLDAS="$launcher -n $npe_gldas" + export NTHREADS_GLDAS=${nth_gldas:-${nth_max}} + [[ ${NTHREADS_GLDAS} -gt ${nth_max} ]] && export NTHREADS_GLDAS=${nth_max} + export APRUN_GLDAS="${launcher} -n ${npe_gldas}" export NTHREADS_GAUSSIAN=${nth_gaussian:-1} - [[ $NTHREADS_GAUSSIAN -gt $nth_max ]] && export NTHREADS_GAUSSIAN=$nth_max - export APRUN_GAUSSIAN="$launcher -n $npe_gaussian" + [[ ${NTHREADS_GAUSSIAN} -gt ${nth_max} ]] && export NTHREADS_GAUSSIAN=${nth_max} + export APRUN_GAUSSIAN="${launcher} -n ${npe_gaussian}" # Must run data processing with exactly the number of tasks as time # periods being processed. 
- npe_gldas_data_proc=$(($gldas_spinup_hours + 12)) - export APRUN_GLDAS_DATA_PROC="$launcher -n $npe_gldas_data_proc --multi-prog" + npe_gldas_data_proc=$((gldas_spinup_hours + 12)) + export APRUN_GLDAS_DATA_PROC="${launcher} -n ${npe_gldas_data_proc} ${mpmd_opt}" -elif [ $step = "eobs" ]; then +elif [[ "${step}" = "eobs" ]]; then export MKL_NUM_THREADS=4 export MKL_CBWR=AUTO - nth_max=$(($npe_node_max / $npe_node_eobs)) + nth_max=$((npe_node_max / npe_node_eobs)) - export NTHREADS_GSI=${nth_eobs:-$nth_max} - [[ $NTHREADS_GSI -gt $nth_max ]] && export NTHREADS_GSI=$nth_max - export APRUN_GSI="$launcher" + export NTHREADS_GSI=${nth_eobs:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_eobs}}" export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" -elif [ $step = "eupd" ]; then +elif [[ "${step}" = "eupd" ]]; then - nth_max=$(($npe_node_max / $npe_node_eupd)) + nth_max=$((npe_node_max / npe_node_eupd)) - export NTHREADS_ENKF=${nth_eupd:-$nth_max} - [[ $NTHREADS_ENKF -gt $nth_max ]] && export NTHREADS_ENKF=$nth_max - export APRUN_ENKF="$launcher" + export NTHREADS_ENKF=${nth_eupd:-${nth_max}} + [[ ${NTHREADS_ENKF} -gt ${nth_max} ]] && export NTHREADS_ENKF=${nth_max} + export APRUN_ENKF="${launcher} -n ${npe_enkf:-${npe_eupd}}" export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" -elif [ $step = "fcst" ]; then +elif [[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then - #PEs and PEs/node can differ for GFS and GDAS forecasts if threading differs - if [[ $CDUMP == "gfs" ]]; then - npe_fcst=$npe_fcst_gfs - npe_node_fcst=$npe_node_fcst_gfs - nth_fv3=$nth_fv3_gfs + if [[ "${CDUMP}" =~ "gfs" ]]; then + nprocs="npe_${step}_gfs" + 
ppn="npe_node_${step}_gfs" || ppn="npe_node_${step}" + else + nprocs="npe_${step}" + ppn="npe_node_${step}" fi + (( nnodes = (${!nprocs}+${!ppn}-1)/${!ppn} )) + (( ntasks = nnodes*${!ppn} )) + # With ESMF threading, the model wants to use the full node + export APRUN_UFS="${launcher} -n ${ntasks}" + unset nprocs ppn nnodes ntasks - nth_max=$(($npe_node_max / $npe_node_fcst)) - - export NTHREADS_FV3=${nth_fv3:-$nth_max} - [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max - export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -n $npe_fcst" - - export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} - [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max - export APRUN_REGRID_NEMSIO="$launcher" - - export NTHREADS_REMAP=${nth_remap:-2} - [[ $NTHREADS_REMAP -gt $nth_max ]] && export NTHREADS_REMAP=$nth_max - export APRUN_REMAP="$launcher" - export I_MPI_DAPL_UD="enable" - -elif [ $step = "efcs" ]; then - - nth_max=$(($npe_node_max / $npe_node_efcs)) - - export NTHREADS_FV3=${nth_efcs:-$nth_max} - [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max - export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -n $npe_efcs" - - export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} - [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max - export APRUN_REGRID_NEMSIO="$launcher $LEVS" - -elif [ $step = "post" ]; then +elif [[ "${step}" = "post" ]]; then - nth_max=$(($npe_node_max / $npe_node_post)) + nth_max=$((npe_node_max / npe_node_post)) export NTHREADS_NP=${nth_np:-1} - [[ $NTHREADS_NP -gt $nth_max ]] && export NTHREADS_NP=$nth_max - export APRUN_NP="$launcher" + [[ ${NTHREADS_NP} -gt ${nth_max} ]] && export NTHREADS_NP=${nth_max} + export APRUN_NP="${launcher} -n ${npe_post}" export NTHREADS_DWN=${nth_dwn:-1} - [[ $NTHREADS_DWN -gt $nth_max ]] && export NTHREADS_DWN=$nth_max - export APRUN_DWN="$launcher" + [[ ${NTHREADS_DWN} -gt ${nth_max} ]] && export 
NTHREADS_DWN=${nth_max} + export APRUN_DWN="${launcher} -n ${npe_dwn}" -elif [ $step = "ecen" ]; then +elif [[ "${step}" = "ecen" ]]; then - nth_max=$(($npe_node_max / $npe_node_ecen)) + nth_max=$((npe_node_max / npe_node_ecen)) - export NTHREADS_ECEN=${nth_ecen:-$nth_max} - [[ $NTHREADS_ECEN -gt $nth_max ]] && export NTHREADS_ECEN=$nth_max - export APRUN_ECEN="$launcher" + export NTHREADS_ECEN=${nth_ecen:-${nth_max}} + [[ ${NTHREADS_ECEN} -gt ${nth_max} ]] && export NTHREADS_ECEN=${nth_max} + export APRUN_ECEN="${launcher} -n ${npe_ecen}" export NTHREADS_CHGRES=${nth_chgres:-12} - [[ $NTHREADS_CHGRES -gt $npe_node_max ]] && export NTHREADS_CHGRES=$npe_node_max + [[ ${NTHREADS_CHGRES} -gt ${npe_node_max} ]] && export NTHREADS_CHGRES=${npe_node_max} export APRUN_CHGRES="time" export NTHREADS_CALCINC=${nth_calcinc:-1} - [[ $NTHREADS_CALCINC -gt $nth_max ]] && export NTHREADS_CALCINC=$nth_max - export APRUN_CALCINC="$launcher" + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} -n ${npe_ecen}" -elif [ $step = "esfc" ]; then +elif [[ "${step}" = "esfc" ]]; then - nth_max=$(($npe_node_max / $npe_node_esfc)) + nth_max=$((npe_node_max / npe_node_esfc)) - export NTHREADS_ESFC=${nth_esfc:-$nth_max} - [[ $NTHREADS_ESFC -gt $nth_max ]] && export NTHREADS_ESFC=$nth_max - export APRUN_ESFC="$launcher -n $npe_esfc" + export NTHREADS_ESFC=${nth_esfc:-${nth_max}} + [[ ${NTHREADS_ESFC} -gt ${nth_max} ]] && export NTHREADS_ESFC=${nth_max} + export APRUN_ESFC="${launcher} -n ${npe_esfc}" export NTHREADS_CYCLE=${nth_cycle:-14} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max - export APRUN_CYCLE="$launcher -n $npe_esfc" + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + export APRUN_CYCLE="${launcher} -n ${npe_esfc}" -elif [ $step = "epos" ]; then +elif [[ "${step}" = "epos" ]]; then - nth_max=$(($npe_node_max / $npe_node_epos)) + 
nth_max=$((npe_node_max / npe_node_epos)) - export NTHREADS_EPOS=${nth_epos:-$nth_max} - [[ $NTHREADS_EPOS -gt $nth_max ]] && export NTHREADS_EPOS=$nth_max - export APRUN_EPOS="$launcher" + export NTHREADS_EPOS=${nth_epos:-${nth_max}} + [[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max} + export APRUN_EPOS="${launcher} -n ${npe_epos}" -elif [ $step = "init" ]; then +elif [[ "${step}" = "init" ]]; then - export APRUN="$launcher" + export APRUN="${launcher} -n ${npe_init}" -elif [ $step = "postsnd" ]; then +elif [[ "${step}" = "postsnd" ]]; then - nth_max=$(($npe_node_max / $npe_node_postsnd)) + export CFP_MP="YES" + + nth_max=$((npe_node_max / npe_node_postsnd)) export NTHREADS_POSTSND=${nth_postsnd:-1} - [[ $NTHREADS_POSTSND -gt $nth_max ]] && export NTHREADS_POSTSND=$nth_max - export APRUN_POSTSND="$launcher" + [[ ${NTHREADS_POSTSND} -gt ${nth_max} ]] && export NTHREADS_POSTSND=${nth_max} + export APRUN_POSTSND="${launcher} -n ${npe_postsnd}" export NTHREADS_POSTSNDCFP=${nth_postsndcfp:-1} - [[ $NTHREADS_POSTSNDCFP -gt $nth_max ]] && export NTHREADS_POSTSNDCFP=$nth_max - export APRUN_POSTSNDCFP="$launcher" + [[ ${NTHREADS_POSTSNDCFP} -gt ${nth_max} ]] && export NTHREADS_POSTSNDCFP=${nth_max} + export APRUN_POSTSNDCFP="${launcher} -n ${npe_postsndcfp} ${mpmd_opt}" -elif [ $step = "awips" ]; then +elif [[ "${step}" = "awips" ]]; then - nth_max=$(($npe_node_max / $npe_node_awips)) + nth_max=$((npe_node_max / npe_node_awips)) export NTHREADS_AWIPS=${nth_awips:-2} - [[ $NTHREADS_AWIPS -gt $nth_max ]] && export NTHREADS_AWIPS=$nth_max - export APRUN_AWIPSCFP="$launcher -n $npe_awips --multi-prog" + [[ ${NTHREADS_AWIPS} -gt ${nth_max} ]] && export NTHREADS_AWIPS=${nth_max} + export APRUN_AWIPSCFP="${launcher} -n ${npe_awips} ${mpmd_opt}" + +elif [[ "${step}" = "gempak" ]]; then -elif [ $step = "gempak" ]; then + export CFP_MP="YES" + + if [[ ${CDUMP} == "gfs" ]]; then + npe_gempak=${npe_gempak_gfs} + npe_node_gempak=${npe_node_gempak_gfs} + fi - 
nth_max=$(($npe_node_max / $npe_node_gempak)) + nth_max=$((npe_node_max / npe_node_gempak)) export NTHREADS_GEMPAK=${nth_gempak:-1} - [[ $NTHREADS_GEMPAK -gt $nth_max ]] && export NTHREADS_GEMPAK=$nth_max - export APRUN="$launcher -n $npe_gempak --multi-prog" + [[ ${NTHREADS_GEMPAK} -gt ${nth_max} ]] && export NTHREADS_GEMPAK=${nth_max} + export APRUN="${launcher} -n ${npe_gempak} ${mpmd_opt}" + + +elif [[ "${step}" = "fit2obs" ]]; then + + nth_max=$((npe_node_max / npe_node_fit2obs)) + + export NTHREADS_FIT2OBS=${nth_fit2obs:-1} + [[ ${NTHREADS_FIT2OBS} -gt ${nth_max} ]] && export NTHREADS_FIT2OBS=${nth_max} + export MPIRUN="${launcher} -n ${npe_fit2obs}" + fi diff --git a/env/JET.env b/env/JET.env index fb834840e32..11533cb7878 100755 --- a/env/JET.env +++ b/env/JET.env @@ -1,10 +1,10 @@ #! /usr/bin/env bash -if [ $# -ne 1 ]; then +if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanalrun atmensanalrun" + echo "atmanlrun atmensanlrun aeroanlrun landanlrun" echo "anal sfcanl fcst post vrfy metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -14,15 +14,15 @@ fi step=$1 -if [[ "$PARTITION_BATCH" = "xjet" ]]; then +if [[ "${PARTITION_BATCH}" = "xjet" ]]; then export npe_node_max=24 -elif [[ "$PARTITION_BATCH" = "vjet" || "$PARTITION_BATCH" = "sjet" ]]; then +elif [[ "${PARTITION_BATCH}" = "vjet" ]]; then export npe_node_max=16 -elif [[ "$PARTITION_BATCH" = "kjet" ]]; then +elif [[ "${PARTITION_BATCH}" = "kjet" ]]; then export npe_node_max=40 fi -#JKHexport launcher="srun -l --export=ALL" export launcher="srun -l --epilog=/apps/local/bin/report-mem --export=ALL" +export mpmd_opt="--multi-prog --output=${step}.%J.%t.out" # Configure MPI environment export OMP_STACKSIZE=2048000 @@ -31,246 +31,238 @@ export NTHSTACK=1024000000 ulimit -s unlimited ulimit -a -export job=${PBS_JOBNAME:-$step} -export jobid=${job}.${PBS_JOBID:-$$} +if [[ "${step}" = 
"prep" ]] || [[ "${step}" = "prepbufr" ]]; then -if [ $step = "prep" -o $step = "prepbufr" ]; then - - nth_max=$(($npe_node_max / $npe_node_prep)) + nth_max=$((npe_node_max / npe_node_prep)) export POE="NO" export BACK="NO" export sys_tp="JET" + export launcher_PREP="srun" -elif [ $step = "waveinit" -o $step = "waveprep" -o $step = "wavepostsbs" -o $step = "wavepostbndpnt" -o $step = "wavepostbndpntbll" -o $step = "wavepostpnt" ]; then +elif [[ "${step}" = "waveinit" ]] || [[ "${step}" = "waveprep" ]] || [[ "${step}" = "wavepostsbs" ]] || [[ "${step}" = "wavepostbndpnt" ]] || [[ "${step}" = "wavepostbndpntbll" ]] || [[ "${step}" = "wavepostpnt" ]]; then - export mpmd="--multi-prog" export CFP_MP="YES" - if [ $step = "waveprep" ]; then export MP_PULSE=0 ; fi + if [[ "${step}" = "waveprep" ]]; then export MP_PULSE=0 ; fi export wavempexec=${launcher} - export wave_mpmd=${mpmd} + export wave_mpmd=${mpmd_opt} -elif [ $step = "atmanalrun" ]; then +elif [[ "${step}" = "atmanlrun" ]]; then - export CFP_MP=${CFP_MP:-"YES"} - export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + nth_max=$((npe_node_max / npe_node_atmanlrun)) - nth_max=$(($npe_node_max / $npe_node_atmanalrun)) + export NTHREADS_ATMANL=${nth_atmanlrun:-${nth_max}} + [[ ${NTHREADS_ATMANL} -gt ${nth_max} ]] && export NTHREADS_ATMANL=${nth_max} + export APRUN_ATMANL="${launcher} -n ${npe_atmanlrun}" - export NTHREADS_ATMANAL=${nth_atmanalrun:-$nth_max} - [[ $NTHREADS_ATMANAL -gt $nth_max ]] && export NTHREADS_ATMANAL=$nth_max - export APRUN_ATMANAL="$launcher -n $npe_atmanalrun" +elif [[ "${step}" = "atmensanlrun" ]]; then -elif [ $step = "atmensanalrun" ]; then + nth_max=$((npe_node_max / npe_node_atmensanlrun)) - export CFP_MP=${CFP_MP:-"YES"} - export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export NTHREADS_ATMENSANL=${nth_atmensanlrun:-${nth_max}} + [[ ${NTHREADS_ATMENSANL} -gt ${nth_max} ]] && export NTHREADS_ATMENSANL=${nth_max} + 
export APRUN_ATMENSANL="${launcher} -n ${npe_atmensanlrun}" + +elif [[ "${step}" = "aeroanlrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_aeroanlrun)) + + export NTHREADS_AEROANL=${nth_aeroanlrun:-${nth_max}} + [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} + export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" + +elif [[ "${step}" = "landanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_landanlrun)) + + export NTHREADS_LANDANL=${nth_landanlrun:-${nth_max}} + [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} + export APRUN_LANDANL="${launcher} -n ${npe_landanlrun}" + +elif [[ "${step}" = "ocnanalbmat" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_ocnanalbmat)) + + export NTHREADS_OCNANAL=${nth_ocnanalbmat:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalbmat}" + +elif [[ "${step}" = "ocnanalrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - nth_max=$(($npe_node_max / $npe_node_atmensanalrun)) + nth_max=$((npe_node_max / npe_node_ocnanalrun)) - export NTHREADS_ATMENSANAL=${nth_atmensanalrun:-$nth_max} - [[ $NTHREADS_ATMENSANAL -gt $nth_max ]] && export NTHREADS_ATMENSANAL=$nth_max - export APRUN_ATMENSANAL="$launcher -n $npe_atmensanalrun" + export NTHREADS_OCNANAL=${nth_ocnanalrun:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalrun}" -elif [ $step = "anal" ]; then +elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then export MKL_NUM_THREADS=4 export MKL_CBWR=AUTO export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - 
nth_max=$(($npe_node_max / $npe_node_anal)) + nth_max=$((npe_node_max / npe_node_anal)) - export NTHREADS_GSI=${nth_anal:-$nth_max} - [[ $NTHREADS_GSI -gt $nth_max ]] && export NTHREADS_GSI=$nth_max - export APRUN_GSI="$launcher" + export NTHREADS_GSI=${nth_anal:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_anal}}" export NTHREADS_CALCINC=${nth_calcinc:-1} - [[ $NTHREADS_CALCINC -gt $nth_max ]] && export NTHREADS_CALCINC=$nth_max - export APRUN_CALCINC="$launcher" + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} \$ncmd" export NTHREADS_CYCLE=${nth_cycle:-12} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} npe_cycle=${ntiles:-6} - export APRUN_CYCLE="$launcher -n $npe_cycle" - + export APRUN_CYCLE="${launcher} -n ${npe_cycle}" export NTHREADS_GAUSFCANL=1 npe_gausfcanl=${npe_gausfcanl:-1} - export APRUN_GAUSFCANL="$launcher -n $npe_gausfcanl" + export APRUN_GAUSFCANL="${launcher} -n ${npe_gausfcanl}" -elif [ $step = "sfcanl" ]; then - nth_max=$(($npe_node_max / $npe_node_sfcanl)) +elif [[ "${step}" = "sfcanl" ]]; then + nth_max=$((npe_node_max / npe_node_sfcanl)) export NTHREADS_CYCLE=${nth_sfcanl:-14} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} npe_sfcanl=${ntiles:-6} - export APRUN_CYCLE="$launcher -n $npe_sfcanl" - -elif [ $step = "gldas" ]; then - - nth_max=$(($npe_node_max / $npe_node_gldas)) - - export NTHREADS_GLDAS=${nth_gldas:-$nth_max} - [[ $NTHREADS_GLDAS -gt $nth_max ]] && export NTHREADS_GLDAS=$nth_max - export APRUN_GLDAS="$launcher -n $npe_gldas" - - export NTHREADS_GAUSSIAN=${nth_gaussian:-1} - [[ $NTHREADS_GAUSSIAN -gt $nth_max ]] && export 
NTHREADS_GAUSSIAN=$nth_max - export APRUN_GAUSSIAN="$launcher -n $npe_gaussian" + export APRUN_CYCLE="${launcher} -n ${npe_sfcanl}" -# Must run data processing with exactly the number of tasks as time -# periods being processed. +elif [[ "${step}" = "gldas" ]]; then - npe_gldas_data_proc=$(($gldas_spinup_hours + 12)) - export APRUN_GLDAS_DATA_PROC="$launcher -n $npe_gldas_data_proc --multi-prog" + echo "WARNING: ${step} is not enabled on ${machine}!" -elif [ $step = "eobs" ]; then +elif [[ "${step}" = "eobs" ]]; then export MKL_NUM_THREADS=4 export MKL_CBWR=AUTO - nth_max=$(($npe_node_max / $npe_node_eobs)) + nth_max=$((npe_node_max / npe_node_eobs)) - export NTHREADS_GSI=${nth_eobs:-$nth_max} - [[ $NTHREADS_GSI -gt $nth_max ]] && export NTHREADS_GSI=$nth_max - export APRUN_GSI="$launcher" + export NTHREADS_GSI=${nth_eobs:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_eobs}}" export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" -elif [ $step = "eupd" ]; then +elif [[ "${step}" = "eupd" ]]; then - nth_max=$(($npe_node_max / $npe_node_eupd)) + nth_max=$((npe_node_max / npe_node_eupd)) - export NTHREADS_ENKF=${nth_eupd:-$nth_max} - [[ $NTHREADS_ENKF -gt $nth_max ]] && export NTHREADS_ENKF=$nth_max - export APRUN_ENKF="$launcher" + export NTHREADS_ENKF=${nth_eupd:-${nth_max}} + [[ ${NTHREADS_ENKF} -gt ${nth_max} ]] && export NTHREADS_ENKF=${nth_max} + export APRUN_ENKF="${launcher} -n ${npe_enkf:-${npe_eupd}}" export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" -elif [ $step = "fcst" ]; then +elif [[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then - #PEs and PEs/node can differ for GFS and GDAS forecasts if threading differs - if [[ $CDUMP 
== "gfs" ]]; then - npe_fcst=$npe_fcst_gfs - npe_node_fcst=$npe_node_fcst_gfs - nth_fv3=$nth_fv3_gfs + if [[ "${CDUMP}" =~ "gfs" ]]; then + nprocs="npe_${step}_gfs" + ppn="npe_node_${step}_gfs" || ppn="npe_node_${step}" + else + nprocs="npe_${step}" + ppn="npe_node_${step}" fi + (( nnodes = (${!nprocs}+${!ppn}-1)/${!ppn} )) + (( ntasks = nnodes*${!ppn} )) + # With ESMF threading, the model wants to use the full node + export APRUN_UFS="${launcher} -n ${ntasks}" + unset nprocs ppn nnodes ntasks - nth_max=$(($npe_node_max / $npe_node_fcst)) - - export NTHREADS_FV3=${nth_fv3:-$nth_max} - [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max - export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -n $npe_fcst" - - export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} - [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max - export APRUN_REGRID_NEMSIO="$launcher" - - export NTHREADS_REMAP=${nth_remap:-2} - [[ $NTHREADS_REMAP -gt $nth_max ]] && export NTHREADS_REMAP=$nth_max - export APRUN_REMAP="$launcher" - export I_MPI_DAPL_UD="enable" - -elif [ $step = "efcs" ]; then +elif [[ "${step}" = "post" ]]; then - nth_max=$(($npe_node_max / $npe_node_efcs)) - - export NTHREADS_FV3=${nth_efcs:-$nth_max} - [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max - export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -n $npe_efcs" - - export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} - [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max - export APRUN_REGRID_NEMSIO="$launcher $LEVS" - -elif [ $step = "post" ]; then - - nth_max=$(($npe_node_max / $npe_node_post)) + nth_max=$((npe_node_max / npe_node_post)) export NTHREADS_NP=${nth_np:-1} - [[ $NTHREADS_NP -gt $nth_max ]] && export NTHREADS_NP=$nth_max - export APRUN_NP="$launcher --epilog=/apps/local/bin/report-mem" ## JKH + [[ ${NTHREADS_NP} -gt ${nth_max} ]] && export NTHREADS_NP=${nth_max} + export 
APRUN_NP="${launcher} -n ${npe_post}" export NTHREADS_DWN=${nth_dwn:-1} - [[ $NTHREADS_DWN -gt $nth_max ]] && export NTHREADS_DWN=$nth_max - export APRUN_DWN="$launcher --epilog=/apps/local/bin/report-mem" ## JKH + [[ ${NTHREADS_DWN} -gt ${nth_max} ]] && export NTHREADS_DWN=${nth_max} + export APRUN_DWN="${launcher} -n ${npe_dwn}" -elif [ $step = "ecen" ]; then +elif [[ "${step}" = "ecen" ]]; then - nth_max=$(($npe_node_max / $npe_node_ecen)) + nth_max=$((npe_node_max / npe_node_ecen)) - export NTHREADS_ECEN=${nth_ecen:-$nth_max} - [[ $NTHREADS_ECEN -gt $nth_max ]] && export NTHREADS_ECEN=$nth_max - export APRUN_ECEN="$launcher" + export NTHREADS_ECEN=${nth_ecen:-${nth_max}} + [[ ${NTHREADS_ECEN} -gt ${nth_max} ]] && export NTHREADS_ECEN=${nth_max} + export APRUN_ECEN="${launcher} -n ${npe_ecen}" export NTHREADS_CHGRES=${nth_chgres:-12} - [[ $NTHREADS_CHGRES -gt $npe_node_max ]] && export NTHREADS_CHGRES=$npe_node_max + [[ ${NTHREADS_CHGRES} -gt ${npe_node_max} ]] && export NTHREADS_CHGRES=${npe_node_max} export APRUN_CHGRES="time" export NTHREADS_CALCINC=${nth_calcinc:-1} - [[ $NTHREADS_CALCINC -gt $nth_max ]] && export NTHREADS_CALCINC=$nth_max - export APRUN_CALCINC="$launcher" + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} -n ${npe_ecen}" -elif [ $step = "esfc" ]; then +elif [[ "${step}" = "esfc" ]]; then - nth_max=$(($npe_node_max / $npe_node_esfc)) + nth_max=$((npe_node_max / npe_node_esfc)) - export NTHREADS_ESFC=${nth_esfc:-$nth_max} - [[ $NTHREADS_ESFC -gt $nth_max ]] && export NTHREADS_ESFC=$nth_max - export APRUN_ESFC="$launcher -n $npe_esfc" + export NTHREADS_ESFC=${nth_esfc:-${nth_max}} + [[ ${NTHREADS_ESFC} -gt ${nth_max} ]] && export NTHREADS_ESFC=${nth_max} + export APRUN_ESFC="${launcher} -n ${npe_esfc}" export NTHREADS_CYCLE=${nth_cycle:-14} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max - export APRUN_CYCLE="$launcher -n $npe_esfc" + [[ 
${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + export APRUN_CYCLE="${launcher} -n ${npe_esfc}" -elif [ $step = "epos" ]; then +elif [[ "${step}" = "epos" ]]; then - nth_max=$(($npe_node_max / $npe_node_epos)) + nth_max=$((npe_node_max / npe_node_epos)) - export NTHREADS_EPOS=${nth_epos:-$nth_max} - [[ $NTHREADS_EPOS -gt $nth_max ]] && export NTHREADS_EPOS=$nth_max - export APRUN_EPOS="$launcher" + export NTHREADS_EPOS=${nth_epos:-${nth_max}} + [[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max} + export APRUN_EPOS="${launcher} -n ${npe_epos}" -elif [ $step = "init" ]; then +elif [[ "${step}" = "init" ]]; then - export APRUN="$launcher" + export APRUN="${launcher} -n ${npe_init}" -elif [ $step = "postsnd" ]; then +elif [[ "${step}" = "postsnd" ]]; then - nth_max=$(($npe_node_max / $npe_node_postsnd)) + export CFP_MP="YES" + + nth_max=$((npe_node_max / npe_node_postsnd)) export NTHREADS_POSTSND=${nth_postsnd:-1} - [[ $NTHREADS_POSTSND -gt $nth_max ]] && export NTHREADS_POSTSND=$nth_max - export APRUN_POSTSND="$launcher" + [[ ${NTHREADS_POSTSND} -gt ${nth_max} ]] && export NTHREADS_POSTSND=${nth_max} + export APRUN_POSTSND="${launcher} -n ${npe_postsnd}" export NTHREADS_POSTSNDCFP=${nth_postsndcfp:-1} - [[ $NTHREADS_POSTSNDCFP -gt $nth_max ]] && export NTHREADS_POSTSNDCFP=$nth_max - export APRUN_POSTSNDCFP="$launcher" + [[ ${NTHREADS_POSTSNDCFP} -gt ${nth_max} ]] && export NTHREADS_POSTSNDCFP=${nth_max} + export APRUN_POSTSNDCFP="${launcher} -n ${npe_postsndcfp} ${mpmd_opt}" + +elif [[ "${step}" = "awips" ]]; then + + echo "WARNING: ${step} is not enabled on ${machine}!" -elif [ $step = "awips" ]; then +elif [[ "${step}" = "gempak" ]]; then - nth_max=$(($npe_node_max / $npe_node_awips)) + echo "WARNING: ${step} is not enabled on ${machine}!" 
- export NTHREADS_AWIPS=${nth_awips:-2} - [[ $NTHREADS_AWIPS -gt $nth_max ]] && export NTHREADS_AWIPS=$nth_max - export APRUN_AWIPSCFP="$launcher -n $npe_awips --multi-prog" +elif [[ "${step}" = "fit2obs" ]]; then -elif [ $step = "gempak" ]; then + nth_max=$((npe_node_max / npe_node_fit2obs)) - nth_max=$(($npe_node_max / $npe_node_gempak)) + export NTHREADS_FIT2OBS=${nth_fit2obs:-1} + [[ ${NTHREADS_FIT2OBS} -gt ${nth_max} ]] && export NTHREADS_FIT2OBS=${nth_max} + export MPIRUN="${launcher} -n ${npe_fit2obs}" - export NTHREADS_GEMPAK=${nth_gempak:-1} - [[ $NTHREADS_GEMPAK -gt $nth_max ]] && export NTHREADS_GEMPAK=$nth_max - export APRUN="$launcher -n $npe_gempak --multi-prog" fi diff --git a/env/ORION.env b/env/ORION.env index bef0661f476..466a115b90a 100755 --- a/env/ORION.env +++ b/env/ORION.env @@ -1,10 +1,10 @@ #! /usr/bin/env bash -if [ $# -ne 1 ]; then +if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanalrun atmensanalrun" + echo "atmanlrun atmensanlrun aeroanlrun landanlrun" echo "anal sfcanl fcst post vrfy metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -16,6 +16,7 @@ step=$1 export npe_node_max=40 export launcher="srun -l --export=ALL" +export mpmd_opt="--multi-prog --output=${step}.%J.%t.out" # Configure MPI environment export MPI_BUFS_PER_PROC=2048 @@ -31,245 +32,281 @@ export NTHSTACK=1024000000 ulimit -s unlimited ulimit -a -export job=${PBS_JOBNAME:-$step} -export jobid=${job}.${PBS_JOBID:-$$} +if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then -if [ $step = "prep" -o $step = "prepbufr" ]; then - - nth_max=$(($npe_node_max / $npe_node_prep)) + nth_max=$((npe_node_max / npe_node_prep)) export POE="NO" export BACK=${BACK:-"YES"} export sys_tp="ORION" + export launcher_PREP="srun" -elif [ $step = "waveinit" -o $step = "waveprep" -o $step = "wavepostsbs" -o $step = "wavepostbndpnt" -o $step = "wavepostpnt" ]; 
then +elif [[ "${step}" = "waveinit" ]] || [[ "${step}" = "waveprep" ]] || [[ "${step}" = "wavepostsbs" ]] || \ + [[ "${step}" = "wavepostbndpnt" ]] || [[ "${step}" = "wavepostpnt" ]] || [[ "${step}" == "wavepostbndpntbll" ]]; then - export mpmd="--multi-prog" export CFP_MP="YES" - if [ $step = "waveprep" ]; then export MP_PULSE=0 ; fi + if [[ "${step}" = "waveprep" ]]; then export MP_PULSE=0 ; fi export wavempexec=${launcher} - export wave_mpmd=${mpmd} + export wave_mpmd=${mpmd_opt} -elif [ $step = "atmanalrun" ]; then +elif [[ "${step}" = "atmanlrun" ]]; then - export CFP_MP=${CFP_MP:-"YES"} - export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + nth_max=$((npe_node_max / npe_node_atmanlrun)) - nth_max=$(($npe_node_max / $npe_node_atmanalrun)) + export NTHREADS_ATMANL=${nth_atmanlrun:-${nth_max}} + [[ ${NTHREADS_ATMANL} -gt ${nth_max} ]] && export NTHREADS_ATMANL=${nth_max} + export APRUN_ATMANL="${launcher} -n ${npe_atmanlrun}" - export NTHREADS_ATMANAL=${nth_atmanalrun:-$nth_max} - [[ $NTHREADS_ATMANAL -gt $nth_max ]] && export NTHREADS_ATMANAL=$nth_max - export APRUN_ATMANAL="$launcher -n $npe_atmanalrun" +elif [[ "${step}" = "atmensanlrun" ]]; then -elif [ $step = "atmensanalrun" ]; then + nth_max=$((npe_node_max / npe_node_atmensanlrun)) - export CFP_MP=${CFP_MP:-"YES"} - export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export NTHREADS_ATMENSANL=${nth_atmensanlrun:-${nth_max}} + [[ ${NTHREADS_ATMENSANL} -gt ${nth_max} ]] && export NTHREADS_ATMENSANL=${nth_max} + export APRUN_ATMENSANL="${launcher} -n ${npe_atmensanlrun}" + +elif [[ "${step}" = "aeroanlrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_aeroanlrun)) + + export NTHREADS_AEROANL=${nth_aeroanlrun:-${nth_max}} + [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} + export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" + +elif [[ "${step}" = 
"landanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_landanlrun)) + + export NTHREADS_LANDANL=${nth_landanlrun:-${nth_max}} + [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} + export APRUN_LANDANL="${launcher} -n ${npe_landanlrun}" + +elif [[ "${step}" = "ocnanalbmat" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_ocnanalbmat)) + + export NTHREADS_OCNANAL=${nth_ocnanalbmat:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalbmat}" + +elif [[ "${step}" = "ocnanalrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_ocnanalrun)) + + export NTHREADS_OCNANAL=${nth_ocnanalrun:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalrun}" + +elif [[ "${step}" = "ocnanalchkpt" ]]; then - nth_max=$(($npe_node_max / $npe_node_atmensanalrun)) + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - export NTHREADS_ATMENSANAL=${nth_atmensanalrun:-$nth_max} - [[ $NTHREADS_ATMENSANAL -gt $nth_max ]] && export NTHREADS_ATMENSANAL=$nth_max - export APRUN_ATMENSANAL="$launcher -n $npe_atmensanalrun" + nth_max=$((npe_node_max / npe_node_ocnanalchkpt)) -elif [ $step = "anal" ]; then + export NTHREADS_OCNANAL=${nth_ocnanalchkpt:-${nth_max}} + [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalchkpt}" + +elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then export MKL_NUM_THREADS=4 export MKL_CBWR=AUTO export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - nth_max=$(($npe_node_max / $npe_node_anal)) + nth_max=$((npe_node_max / npe_node_anal)) - 
export NTHREADS_GSI=${nth_anal:-$nth_max} - [[ $NTHREADS_GSI -gt $nth_max ]] && export NTHREADS_GSI=$nth_max - export APRUN_GSI="$launcher" + export NTHREADS_GSI=${nth_anal:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_anal}}" export NTHREADS_CALCINC=${nth_calcinc:-1} - [[ $NTHREADS_CALCINC -gt $nth_max ]] && export NTHREADS_CALCINC=$nth_max - export APRUN_CALCINC="$launcher" + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} \$ncmd" export NTHREADS_CYCLE=${nth_cycle:-12} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} npe_cycle=${ntiles:-6} - export APRUN_CYCLE="$launcher -n $npe_cycle" + export APRUN_CYCLE="${launcher} -n ${npe_cycle}" export NTHREADS_GAUSFCANL=1 npe_gausfcanl=${npe_gausfcanl:-1} - export APRUN_GAUSFCANL="$launcher -n $npe_gausfcanl" + export APRUN_GAUSFCANL="${launcher} -n ${npe_gausfcanl}" -elif [ $step = "sfcanl" ]; then - nth_max=$(($npe_node_max / $npe_node_sfcanl)) +elif [[ "${step}" = "sfcanl" ]]; then + nth_max=$((npe_node_max / npe_node_sfcanl)) export NTHREADS_CYCLE=${nth_sfcanl:-14} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} npe_sfcanl=${ntiles:-6} - export APRUN_CYCLE="$launcher -n $npe_sfcanl" + export APRUN_CYCLE="${launcher} -n ${npe_sfcanl}" + +elif [[ "${step}" = "gldas" ]]; then -elif [ $step = "gldas" ]; then + export USE_CFP="NO" - nth_max=$(($npe_node_max / $npe_node_gldas)) + nth_max=$((npe_node_max / npe_node_gldas)) - export NTHREADS_GLDAS=${nth_gldas:-$nth_max} - [[ $NTHREADS_GLDAS -gt $nth_max ]] && export NTHREADS_GLDAS=$nth_max - export APRUN_GLDAS="$launcher -n $npe_gldas" + export NTHREADS_GLDAS=${nth_gldas:-${nth_max}} + 
[[ ${NTHREADS_GLDAS} -gt ${nth_max} ]] && export NTHREADS_GLDAS=${nth_max} + export APRUN_GLDAS="${launcher} -n ${npe_gldas}" export NTHREADS_GAUSSIAN=${nth_gaussian:-1} - [[ $NTHREADS_GAUSSIAN -gt $nth_max ]] && export NTHREADS_GAUSSIAN=$nth_max - export APRUN_GAUSSIAN="$launcher -n $npe_gaussian" + [[ ${NTHREADS_GAUSSIAN} -gt ${nth_max} ]] && export NTHREADS_GAUSSIAN=${nth_max} + export APRUN_GAUSSIAN="${launcher} -n ${npe_gaussian}" # Must run data processing with exactly the number of tasks as time # periods being processed. - npe_gldas_data_proc=$(($gldas_spinup_hours + 12)) - export APRUN_GLDAS_DATA_PROC="$launcher -n $npe_gldas_data_proc --multi-prog" + npe_gldas_data_proc=$((gldas_spinup_hours + 12)) + export APRUN_GLDAS_DATA_PROC="${launcher} -n ${npe_gldas_data_proc} ${mpmd_opt}" -elif [ $step = "eobs" ]; then +elif [[ "${step}" = "eobs" ]]; then export MKL_NUM_THREADS=4 export MKL_CBWR=AUTO export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - nth_max=$(($npe_node_max / $npe_node_eobs)) + nth_max=$((npe_node_max / npe_node_eobs)) - export NTHREADS_GSI=${nth_eobs:-$nth_max} - [[ $NTHREADS_GSI -gt $nth_max ]] && export NTHREADS_GSI=$nth_max - export APRUN_GSI="$launcher" + export NTHREADS_GSI=${nth_eobs:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_eobs}}" -elif [ $step = "eupd" ]; then +elif [[ "${step}" = "eupd" ]]; then export CFP_MP=${CFP_MP:-"YES"} export USE_CFP=${USE_CFP:-"YES"} - export APRUNCFP="$launcher -n \$ncmd --multi-prog" + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" - nth_max=$(($npe_node_max / $npe_node_eupd)) + nth_max=$((npe_node_max / npe_node_eupd)) - export NTHREADS_ENKF=${nth_eupd:-$nth_max} - [[ $NTHREADS_ENKF -gt $nth_max ]] && export NTHREADS_ENKF=$nth_max - export APRUN_ENKF="$launcher" + export 
NTHREADS_ENKF=${nth_eupd:-${nth_max}} + [[ ${NTHREADS_ENKF} -gt ${nth_max} ]] && export NTHREADS_ENKF=${nth_max} + export APRUN_ENKF="${launcher} -n ${npe_enkf:-${npe_eupd}}" -elif [ $step = "fcst" ]; then +elif [[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then - #PEs and PEs/node can differ for GFS and GDAS forecasts if threading differs - if [[ $CDUMP == "gfs" ]]; then - npe_fcst=$npe_fcst_gfs - npe_node_fcst=$npe_node_fcst_gfs - nth_fv3=$nth_fv3_gfs + export OMP_STACKSIZE=512M + if [[ "${CDUMP}" =~ "gfs" ]]; then + nprocs="npe_${step}_gfs" + ppn="npe_node_${step}_gfs" || ppn="npe_node_${step}" + else + nprocs="npe_${step}" + ppn="npe_node_${step}" fi + (( nnodes = (${!nprocs}+${!ppn}-1)/${!ppn} )) + (( ntasks = nnodes*${!ppn} )) + # With ESMF threading, the model wants to use the full node + export APRUN_UFS="${launcher} -n ${ntasks}" + unset nprocs ppn nnodes ntasks - nth_max=$(($npe_node_max / $npe_node_fcst)) - - export NTHREADS_FV3=${nth_fv3:-$nth_max} - [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max - export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -n $npe_fcst" - - export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} - [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max - export APRUN_REGRID_NEMSIO="$launcher" - - export NTHREADS_REMAP=${nth_remap:-2} - [[ $NTHREADS_REMAP -gt $nth_max ]] && export NTHREADS_REMAP=$nth_max - export APRUN_REMAP="$launcher" - export I_MPI_DAPL_UD="enable" - -elif [ $step = "efcs" ]; then - - nth_max=$(($npe_node_max / $npe_node_efcs)) - - export NTHREADS_FV3=${nth_efcs:-$nth_max} - [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max - export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -n $npe_efcs" - - export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} - [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max - export APRUN_REGRID_NEMSIO="$launcher $LEVS" - -elif [ $step = "post" ]; then 
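The new `fcst`/`efcs` blocks size the job with bash indirect expansion (`${!nprocs}`, `${!ppn}`) and integer ceiling division, so the rank count is rounded up to whole nodes before building `APRUN_UFS`. Below is a standalone sketch of that arithmetic; the variable names mirror the workflow's conventions but the values are illustrative only. Note the sketch uses an explicit `-z` test for the per-node fallback, since a plain `ppn=a || ppn=b` never takes its `||` branch (an assignment always returns success).

```shell
#!/usr/bin/env bash
# Sketch of the fcst/efcs node/task arithmetic. Values are illustrative.
set -u

npe_fcst_gfs=322        # total MPI ranks requested for the GFS forecast
npe_node_fcst_gfs=24    # ranks per node

step="fcst"
CDUMP="gfs"

if [[ "${CDUMP}" =~ "gfs" ]]; then
  nprocs="npe_${step}_gfs"
  ppn="npe_node_${step}_gfs"
  # Assignments always succeed, so test the indirected variable explicitly
  # instead of chaining the fallback with ||.
  [[ -z "${!ppn+0}" ]] && ppn="npe_node_${step}"
else
  nprocs="npe_${step}"
  ppn="npe_node_${step}"
fi

# Ceiling division: round the rank count up to whole nodes, then fill them.
(( nnodes = (${!nprocs} + ${!ppn} - 1) / ${!ppn} ))
(( ntasks = nnodes * ${!ppn} ))

echo "nnodes=${nnodes} ntasks=${ntasks}"   # 322 ranks at 24/node -> 14 nodes, 336 tasks
```

With ESMF-managed threading the launcher is handed `ntasks` (a full-node multiple) rather than the raw rank count, which is why the blocks export only `APRUN_UFS` and unset the scratch variables afterwards.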
+elif [[ "${step}" = "post" ]]; then - nth_max=$(($npe_node_max / $npe_node_post)) + nth_max=$((npe_node_max / npe_node_post)) export NTHREADS_NP=${nth_np:-1} - [[ $NTHREADS_NP -gt $nth_max ]] && export NTHREADS_NP=$nth_max - export APRUN_NP="$launcher" + [[ ${NTHREADS_NP} -gt ${nth_max} ]] && export NTHREADS_NP=${nth_max} + export APRUN_NP="${launcher} -n ${npe_post}" export NTHREADS_DWN=${nth_dwn:-1} - [[ $NTHREADS_DWN -gt $nth_max ]] && export NTHREADS_DWN=$nth_max - export APRUN_DWN="$launcher" + [[ ${NTHREADS_DWN} -gt ${nth_max} ]] && export NTHREADS_DWN=${nth_max} + export APRUN_DWN="${launcher} -n ${npe_dwn}" -elif [ $step = "ecen" ]; then +elif [[ "${step}" = "ecen" ]]; then - nth_max=$(($npe_node_max / $npe_node_ecen)) + nth_max=$((npe_node_max / npe_node_ecen)) - export NTHREADS_ECEN=${nth_ecen:-$nth_max} - [[ $NTHREADS_ECEN -gt $nth_max ]] && export NTHREADS_ECEN=$nth_max - export APRUN_ECEN="$launcher" + export NTHREADS_ECEN=${nth_ecen:-${nth_max}} + [[ ${NTHREADS_ECEN} -gt ${nth_max} ]] && export NTHREADS_ECEN=${nth_max} + export APRUN_ECEN="${launcher} -n ${npe_ecen}" export NTHREADS_CHGRES=${nth_chgres:-12} - [[ $NTHREADS_CHGRES -gt $npe_node_max ]] && export NTHREADS_CHGRES=$npe_node_max + [[ ${NTHREADS_CHGRES} -gt ${npe_node_max} ]] && export NTHREADS_CHGRES=${npe_node_max} export APRUN_CHGRES="time" export NTHREADS_CALCINC=${nth_calcinc:-1} - [[ $NTHREADS_CALCINC -gt $nth_max ]] && export NTHREADS_CALCINC=$nth_max - export APRUN_CALCINC="$launcher" + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} -n ${npe_ecen}" -elif [ $step = "esfc" ]; then +elif [[ "${step}" = "esfc" ]]; then - nth_max=$(($npe_node_max / $npe_node_esfc)) + nth_max=$((npe_node_max / npe_node_esfc)) - export NTHREADS_ESFC=${nth_esfc:-$nth_max} - [[ $NTHREADS_ESFC -gt $nth_max ]] && export NTHREADS_ESFC=$nth_max - export APRUN_ESFC="$launcher -n $npe_esfc" + export NTHREADS_ESFC=${nth_esfc:-${nth_max}} + [[ 
${NTHREADS_ESFC} -gt ${nth_max} ]] && export NTHREADS_ESFC=${nth_max} + export APRUN_ESFC="${launcher} -n ${npe_esfc}" export NTHREADS_CYCLE=${nth_cycle:-14} - [[ $NTHREADS_CYCLE -gt $npe_node_max ]] && export NTHREADS_CYCLE=$npe_node_max - export APRUN_CYCLE="$launcher -n $npe_esfc" + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + export APRUN_CYCLE="${launcher} -n ${npe_esfc}" -elif [ $step = "epos" ]; then +elif [[ "${step}" = "epos" ]]; then - nth_max=$(($npe_node_max / $npe_node_epos)) + nth_max=$((npe_node_max / npe_node_epos)) - export NTHREADS_EPOS=${nth_epos:-$nth_max} - [[ $NTHREADS_EPOS -gt $nth_max ]] && export NTHREADS_EPOS=$nth_max - export APRUN_EPOS="$launcher" + export NTHREADS_EPOS=${nth_epos:-${nth_max}} + [[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max} + export APRUN_EPOS="${launcher} -n ${npe_epos}" -elif [ $step = "init" ]; then +elif [[ "${step}" = "init" ]]; then - export APRUN="$launcher" + export APRUN="${launcher} -n ${npe_init}" -elif [ $step = "postsnd" ]; then +elif [[ "${step}" = "postsnd" ]]; then - nth_max=$(($npe_node_max / $npe_node_postsnd)) + export CFP_MP="YES" + + nth_max=$((npe_node_max / npe_node_postsnd)) export NTHREADS_POSTSND=${nth_postsnd:-1} - [[ $NTHREADS_POSTSND -gt $nth_max ]] && export NTHREADS_POSTSND=$nth_max - export APRUN_POSTSND="$launcher" + [[ ${NTHREADS_POSTSND} -gt ${nth_max} ]] && export NTHREADS_POSTSND=${nth_max} + export APRUN_POSTSND="${launcher} -n ${npe_postsnd}" export NTHREADS_POSTSNDCFP=${nth_postsndcfp:-1} - [[ $NTHREADS_POSTSNDCFP -gt $nth_max ]] && export NTHREADS_POSTSNDCFP=$nth_max - export APRUN_POSTSNDCFP="$launcher" + [[ ${NTHREADS_POSTSNDCFP} -gt ${nth_max} ]] && export NTHREADS_POSTSNDCFP=${nth_max} + export APRUN_POSTSNDCFP="${launcher} -n ${npe_postsndcfp} ${mpmd_opt}" -elif [ $step = "awips" ]; then +elif [[ "${step}" = "awips" ]]; then - nth_max=$(($npe_node_max / $npe_node_awips)) + nth_max=$((npe_node_max / 
npe_node_awips)) export NTHREADS_AWIPS=${nth_awips:-2} - [[ $NTHREADS_AWIPS -gt $nth_max ]] && export NTHREADS_AWIPS=$nth_max - export APRUN_AWIPSCFP="$launcher -n $npe_awips --multi-prog" + [[ ${NTHREADS_AWIPS} -gt ${nth_max} ]] && export NTHREADS_AWIPS=${nth_max} + export APRUN_AWIPSCFP="${launcher} -n ${npe_awips} ${mpmd_opt}" + +elif [[ "${step}" = "gempak" ]]; then + + export CFP_MP="YES" -elif [ $step = "gempak" ]; then + if [[ ${CDUMP} == "gfs" ]]; then + npe_gempak=${npe_gempak_gfs} + npe_node_gempak=${npe_node_gempak_gfs} + fi - nth_max=$(($npe_node_max / $npe_node_gempak)) + nth_max=$((npe_node_max / npe_node_gempak)) export NTHREADS_GEMPAK=${nth_gempak:-1} - [[ $NTHREADS_GEMPAK -gt $nth_max ]] && export NTHREADS_GEMPAK=$nth_max - export APRUN="$launcher -n $npe_gempak --multi-prog" + [[ ${NTHREADS_GEMPAK} -gt ${nth_max} ]] && export NTHREADS_GEMPAK=${nth_max} + export APRUN="${launcher} -n ${npe_gempak} ${mpmd_opt}" + +elif [[ "${step}" = "fit2obs" ]]; then + + nth_max=$((npe_node_max / npe_node_fit2obs)) + + export NTHREADS_FIT2OBS=${nth_fit2obs:-1} + [[ ${NTHREADS_FIT2OBS} -gt ${nth_max} ]] && export NTHREADS_FIT2OBS=${nth_max} + export MPIRUN="${launcher} -n ${npe_fit2obs}" + fi diff --git a/env/S4.env b/env/S4.env new file mode 100755 index 00000000000..fef8ac562fb --- /dev/null +++ b/env/S4.env @@ -0,0 +1,271 @@ +#! /usr/bin/env bash + +if [[ $# -ne 1 ]]; then + + echo "Must specify an input argument to set runtime environment variables!" 
+ echo "argument can be any one of the following:" + echo "atmanlrun atmensanlrun aeroanlrun landanlrun" + echo "anal sfcanl fcst post vrfy metp" + echo "eobs eupd ecen efcs epos" + echo "postsnd awips gempak" + exit 1 + +fi + +step=$1 +PARTITION_BATCH=${PARTITION_BATCH:-"s4"} + +if [[ ${PARTITION_BATCH} = "s4" ]]; then + export npe_node_max=32 +elif [[ ${PARTITION_BATCH} = "ivy" ]]; then + export npe_node_max=20 +fi +export launcher="srun -l --export=ALL" +export mpmd_opt="--multi-prog --output=${step}.%J.%t.out" + +# Configure MPI environment +export OMP_STACKSIZE=2048000 +export NTHSTACK=1024000000 + +ulimit -s unlimited +ulimit -a + +if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then + + nth_max=$((npe_node_max / npe_node_prep)) + + export POE="NO" + export BACK="NO" + export sys_tp="S4" + export launcher_PREP="srun" + +elif [[ "${step}" = "waveinit" ]] || [[ "${step}" = "waveprep" ]] || [[ "${step}" = "wavepostsbs" ]] || [[ "${step}" = "wavepostbndpnt" ]] || [[ "${step}" = "wavepostbndpntbll" ]] || [[ "${step}" = "wavepostpnt" ]]; then + + export CFP_MP="YES" + if [[ "${step}" = "waveprep" ]]; then export MP_PULSE=0 ; fi + export wavempexec=${launcher} + export wave_mpmd=${mpmd_opt} + +elif [[ "${step}" = "atmanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_atmanlrun)) + + export NTHREADS_ATMANL=${nth_atmanlrun:-${nth_max}} + [[ ${NTHREADS_ATMANL} -gt ${nth_max} ]] && export NTHREADS_ATMANL=${nth_max} + export APRUN_ATMANL="${launcher} -n ${npe_atmanlrun}" + +elif [[ "${step}" = "atmensanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_atmensanlrun)) + + export NTHREADS_ATMENSANL=${nth_atmensanlrun:-${nth_max}} + [[ ${NTHREADS_ATMENSANL} -gt ${nth_max} ]] && export NTHREADS_ATMENSANL=${nth_max} + export APRUN_ATMENSANL="${launcher} -n ${npe_atmensanlrun}" + +elif [[ "${step}" = "aeroanlrun" ]]; then + + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_aeroanlrun)) + + export 
NTHREADS_AEROANL=${nth_aeroanlrun:-${nth_max}} + [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} + export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" + +elif [[ "${step}" = "landanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_landanlrun)) + + export NTHREADS_LANDANL=${nth_landanlrun:-${nth_max}} + [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} + export APRUN_LANDANL="${launcher} -n ${npe_landanlrun}" + +elif [[ "${step}" = "ocnanalbmat" ]]; then + echo "WARNING: ${step} is not enabled on S4!" + +elif [[ "${step}" = "ocnanalrun" ]]; then + echo "WARNING: ${step} is not enabled on S4!" + +elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then + + export MKL_NUM_THREADS=4 + export MKL_CBWR=AUTO + + export CFP_MP=${CFP_MP:-"YES"} + export USE_CFP=${USE_CFP:-"YES"} + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_anal)) + + export NTHREADS_GSI=${nth_anal:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_anal}}" + + export NTHREADS_CALCINC=${nth_calcinc:-1} + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} \$ncmd" + + export NTHREADS_CYCLE=${nth_cycle:-12} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + npe_cycle=${ntiles:-6} + export APRUN_CYCLE="${launcher} -n ${npe_cycle}" + + + export NTHREADS_GAUSFCANL=1 + npe_gausfcanl=${npe_gausfcanl:-1} + export APRUN_GAUSFCANL="${launcher} -n ${npe_gausfcanl}" + +elif [[ "${step}" = "sfcanl" ]]; then + nth_max=$((npe_node_max / npe_node_sfcanl)) + + export NTHREADS_CYCLE=${nth_sfcanl:-14} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + npe_sfcanl=${ntiles:-6} + export APRUN_CYCLE="${launcher} -n ${npe_sfcanl}" + +elif [[ "${step}" = "gldas" ]]; then + + export 
USE_CFP="NO" + export CFP_MP="YES" + + nth_max=$((npe_node_max / npe_node_gldas)) + + export NTHREADS_GLDAS=${nth_gldas:-${nth_max}} + [[ ${NTHREADS_GLDAS} -gt ${nth_max} ]] && export NTHREADS_GLDAS=${nth_max} + export APRUN_GLDAS="${launcher} -n ${npe_gldas}" + + export NTHREADS_GAUSSIAN=${nth_gaussian:-1} + [[ ${NTHREADS_GAUSSIAN} -gt ${nth_max} ]] && export NTHREADS_GAUSSIAN=${nth_max} + export APRUN_GAUSSIAN="${launcher} -n ${npe_gaussian}" + +# Must run data processing with exactly the number of tasks as time +# periods being processed. + + npe_gldas_data_proc=$((gldas_spinup_hours + 12)) + export APRUN_GLDAS_DATA_PROC="${launcher} -n ${npe_gldas_data_proc} ${mpmd_opt}" + +elif [[ "${step}" = "eobs" ]]; then + + export MKL_NUM_THREADS=4 + export MKL_CBWR=AUTO + + nth_max=$((npe_node_max / npe_node_eobs)) + + export NTHREADS_GSI=${nth_eobs:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_eobs}}" + + export CFP_MP=${CFP_MP:-"YES"} + export USE_CFP=${USE_CFP:-"YES"} + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + +elif [[ "${step}" = "eupd" ]]; then + + nth_max=$((npe_node_max / npe_node_eupd)) + + export NTHREADS_ENKF=${nth_eupd:-${nth_max}} + [[ ${NTHREADS_ENKF} -gt ${nth_max} ]] && export NTHREADS_ENKF=${nth_max} + export APRUN_ENKF="${launcher} -n ${npe_enkf:-${npe_eupd}}" + + export CFP_MP=${CFP_MP:-"YES"} + export USE_CFP=${USE_CFP:-"YES"} + export APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}" + +elif [[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then + + if [[ "${CDUMP}" =~ "gfs" ]]; then + nprocs="npe_${step}_gfs" + ppn="npe_node_${step}_gfs" || ppn="npe_node_${step}" + else + nprocs="npe_${step}" + ppn="npe_node_${step}" + fi + (( nnodes = (${!nprocs}+${!ppn}-1)/${!ppn} )) + (( ntasks = nnodes*${!ppn} )) + # With ESMF threading, the model wants to use the full node + export APRUN_UFS="${launcher} -n ${ntasks}" + unset nprocs ppn nnodes ntasks + +elif 
[[ "${step}" = "post" ]]; then + + nth_max=$((npe_node_max / npe_node_post)) + + export NTHREADS_NP=${nth_np:-1} + [[ ${NTHREADS_NP} -gt ${nth_max} ]] && export NTHREADS_NP=${nth_max} + export APRUN_NP="${launcher} -n ${npe_post}" + + export NTHREADS_DWN=${nth_dwn:-1} + [[ ${NTHREADS_DWN} -gt ${nth_max} ]] && export NTHREADS_DWN=${nth_max} + export APRUN_DWN="${launcher} -n ${npe_dwn}" + +elif [[ "${step}" = "ecen" ]]; then + + nth_max=$((npe_node_max / npe_node_ecen)) + + export NTHREADS_ECEN=${nth_ecen:-${nth_max}} + [[ ${NTHREADS_ECEN} -gt ${nth_max} ]] && export NTHREADS_ECEN=${nth_max} + export APRUN_ECEN="${launcher} -n ${npe_ecen}" + + export NTHREADS_CHGRES=${nth_chgres:-12} + [[ ${NTHREADS_CHGRES} -gt ${npe_node_max} ]] && export NTHREADS_CHGRES=${npe_node_max} + export APRUN_CHGRES="time" + + export NTHREADS_CALCINC=${nth_calcinc:-1} + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} -n ${npe_ecen}" + +elif [[ "${step}" = "esfc" ]]; then + + nth_max=$((npe_node_max / npe_node_esfc)) + + export NTHREADS_ESFC=${nth_esfc:-${nth_max}} + [[ ${NTHREADS_ESFC} -gt ${nth_max} ]] && export NTHREADS_ESFC=${nth_max} + export APRUN_ESFC="${launcher} -n ${npe_esfc}" + + export NTHREADS_CYCLE=${nth_cycle:-14} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + export APRUN_CYCLE="${launcher} -n ${npe_esfc}" + +elif [[ "${step}" = "epos" ]]; then + + nth_max=$((npe_node_max / npe_node_epos)) + + export NTHREADS_EPOS=${nth_epos:-${nth_max}} + [[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max} + export APRUN_EPOS="${launcher} -n ${npe_epos}" + +elif [[ "${step}" = "init" ]]; then + + export APRUN="${launcher} -n ${npe_init}" + +elif [[ "${step}" = "postsnd" ]]; then + + export CFP_MP="YES" + + nth_max=$((npe_node_max / npe_node_postsnd)) + + export NTHREADS_POSTSND=${nth_postsnd:-1} + [[ ${NTHREADS_POSTSND} -gt ${nth_max} ]] && export 
NTHREADS_POSTSND=${nth_max} + export APRUN_POSTSND="${launcher} -n ${npe_postsnd}" + + export NTHREADS_POSTSNDCFP=${nth_postsndcfp:-1} + [[ ${NTHREADS_POSTSNDCFP} -gt ${nth_max} ]] && export NTHREADS_POSTSNDCFP=${nth_max} + export APRUN_POSTSNDCFP="${launcher} -n ${npe_postsndcfp} ${mpmd_opt}" + +elif [[ "${step}" = "awips" ]]; then + + echo "WARNING: ${step} is not enabled on S4!" + +elif [[ "${step}" = "gempak" ]]; then + + echo "WARNING: ${step} is not enabled on S4!" + +elif [[ "${step}" = "fit2obs" ]]; then + + nth_max=$((npe_node_max / npe_node_fit2obs)) + + export NTHREADS_FIT2OBS=${nth_fit2obs:-1} + [[ ${NTHREADS_FIT2OBS} -gt ${nth_max} ]] && export NTHREADS_FIT2OBS=${nth_max} + export MPIRUN="${launcher} -n ${npe_fit2obs}" + +fi diff --git a/env/WCOSS2.env b/env/WCOSS2.env new file mode 100755 index 00000000000..6d4acf5d19c --- /dev/null +++ b/env/WCOSS2.env @@ -0,0 +1,308 @@ +#! /usr/bin/env bash + +if [[ $# -ne 1 ]]; then + + echo "Must specify an input argument to set runtime environment variables!" 
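An editorial aside on the `APRUNCFP` lines in the env files above: `\$ncmd` is escaped so the launcher command is stored as an unexpanded template and filled in later with `eval` (the gempak j-job further below does this via `$(eval echo ...)`). A minimal standalone sketch, with a hypothetical task count:

```shell
launcher="mpiexec -l"
mpmd_opt="--cpu-bind verbose,core cfp"
# The backslash keeps $ncmd literal at assignment time
APRUNCFP="${launcher} -n \$ncmd ${mpmd_opt}"

ncmd=4                               # hypothetical: known only at run time
cmd=$(eval echo "${APRUNCFP}")       # eval re-expands the stored template
echo "${cmd}"                        # mpiexec -l -n 4 --cpu-bind verbose,core cfp
```

The deferred expansion lets one env file define the launcher shape while each ex-script supplies its own command count.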
+ echo "argument can be any one of the following:" + echo "atmanlrun atmensanlrun aeroanlrun landanlrun" + echo "anal sfcanl fcst post vrfy metp" + echo "eobs eupd ecen esfc efcs epos" + echo "postsnd awips gempak" + exit 1 + +fi + +step=$1 + +# WCOSS2 information +export launcher="mpiexec -l" +export mpmd_opt="--cpu-bind verbose,core cfp" + +export npe_node_max=128 + +if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then + + nth_max=$((npe_node_max / npe_node_prep)) + + export POE=${POE:-"YES"} + export BACK=${BACK:-"off"} + export sys_tp="wcoss2" + export launcher_PREP="mpiexec" + +elif [[ "${step}" = "waveinit" ]] || [[ "${step}" = "waveprep" ]] || [[ "${step}" = "wavepostsbs" ]] || [[ "${step}" = "wavepostbndpnt" ]] || [[ "${step}" = "wavepostbndpntbll" ]] || [[ "${step}" = "wavepostpnt" ]]; then + + if [[ "${step}" = "waveprep" ]] && [[ "${CDUMP}" = "gfs" ]]; then export NTASKS=${NTASKS_gfs} ; fi + export wavempexec="${launcher} -np" + export wave_mpmd=${mpmd_opt} + +elif [[ "${step}" = "atmanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_atmanlrun)) + + export NTHREADS_ATMANL=${nth_atmanlrun:-${nth_max}} + [[ ${NTHREADS_ATMANL} -gt ${nth_max} ]] && export NTHREADS_ATMANL=${nth_max} + export APRUN_ATMANL="${launcher} -n ${npe_atmanlrun}" + +elif [[ "${step}" = "atmensanlrun" ]]; then + + nth_max=$((npe_node_max / npe_node_atmensanlrun)) + + export NTHREADS_ATMENSANL=${nth_atmensanlrun:-${nth_max}} + [[ ${NTHREADS_ATMENSANL} -gt ${nth_max} ]] && export NTHREADS_ATMENSANL=${nth_max} + export APRUN_ATMENSANL="${launcher} -n ${npe_atmensanlrun}" + +elif [[ "${step}" = "aeroanlrun" ]]; then + + export APRUNCFP="${launcher} -np \$ncmd ${mpmd_opt}" + + nth_max=$((npe_node_max / npe_node_aeroanlrun)) + + export NTHREADS_AEROANL=${nth_aeroanlrun:-${nth_max}} + [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} + export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" + +elif [[ "${step}" = "landanlrun" ]]; then + + 
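Nearly every compute step in these env files repeats the same clamp idiom: derive the most OpenMP threads that fit per MPI rank, default the requested count to that, then cap it. A standalone sketch with made-up values (none of these numbers come from an actual config):

```shell
npe_node_max=128          # hypothetical cores per node
npe_node_atmanlrun=16     # hypothetical MPI ranks per node for this step
nth_atmanlrun=32          # hypothetical requested threads per rank

nth_max=$((npe_node_max / npe_node_atmanlrun))   # 8 threads fit per rank
NTHREADS_ATMANL=${nth_atmanlrun:-${nth_max}}     # default to nth_max if unset
# Clamp so ranks x threads never oversubscribes the node
[[ ${NTHREADS_ATMANL} -gt ${nth_max} ]] && NTHREADS_ATMANL=${nth_max}
echo "${NTHREADS_ATMANL}"                        # 8
```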
nth_max=$((npe_node_max / npe_node_landanlrun)) + + export NTHREADS_LANDANL=${nth_landanlrun:-${nth_max}} + [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} + export APRUN_LANDANL="${launcher} -n ${npe_landanlrun}" + +elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then + + export OMP_PLACES=cores + export OMP_STACKSIZE=1G + export FI_OFI_RXM_SAR_LIMIT=3145728 + + if [[ "${step}" = "analcalc" ]]; then + export MPICH_MPIIO_HINTS="*:romio_cb_write=disable" + fi + + nth_max=$((npe_node_max / npe_node_anal)) + + export NTHREADS_GSI=${nth_anal:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_anal}} -ppn ${npe_node_anal} --cpu-bind depth --depth ${NTHREADS_GSI}" + + export NTHREADS_CALCINC=${nth_calcinc:-1} + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} \$ncmd" + + export NTHREADS_CYCLE=${nth_cycle:-14} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + npe_cycle=${ntiles:-6} + export APRUN_CYCLE="${launcher} -n ${npe_cycle} -ppn ${npe_node_cycle} --cpu-bind depth --depth ${NTHREADS_CYCLE}" + + export NTHREADS_GAUSFCANL=1 + npe_gausfcanl=${npe_gausfcanl:-1} + export APRUN_GAUSFCANL="${launcher} -n ${npe_gausfcanl}" + + export NTHREADS_CHGRES=${nth_echgres:-14} + [[ ${NTHREADS_CHGRES} -gt ${npe_node_max} ]] && export NTHREADS_CHGRES=${npe_node_max} + export APRUN_CHGRES="" + + export CFP_MP=${CFP_MP:-"NO"} + export USE_CFP=${USE_CFP:-"YES"} + export APRUNCFP="${launcher} -np \$ncmd ${mpmd_opt}" + +elif [[ "${step}" = "sfcanl" ]]; then + + nth_max=$((npe_node_max / npe_node_sfcanl)) + + export NTHREADS_CYCLE=${nth_sfcanl:-14} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + npe_sfcanl=${ntiles:-6} + export APRUN_CYCLE="${launcher} -n ${npe_sfcanl}" + +elif [[ "${step}" = "gldas" ]]; then + + 
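The fcst/efcs blocks in these env files size the job with integer ceiling division and bash indirect expansion (`${!var}`), rounding the task count up to whole nodes. A sketch with hypothetical counts:

```shell
step="fcst"
npe_fcst=300              # hypothetical total MPI tasks requested
npe_node_fcst=24          # hypothetical tasks per node

nprocs="npe_${step}"      # name of the variable holding the task count
ppn="npe_node_${step}"    # name of the variable holding tasks-per-node
# ${!name} expands the variable whose name is stored in $name
(( nnodes = (${!nprocs} + ${!ppn} - 1) / ${!ppn} ))   # ceil(300/24) = 13
(( ntasks = nnodes * ${!ppn} ))                        # 13 * 24 = 312
echo "${nnodes} ${ntasks}"
```

Rounding up to a full node matters because, with ESMF threading, the model claims every core on each node it lands on.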
export USE_CFP="YES" + export CFP_MP="NO" + + nth_max=$((npe_node_max / npe_node_gldas)) + + export NTHREADS_GLDAS=${nth_gldas:-${nth_max}} + [[ ${NTHREADS_GLDAS} -gt ${nth_max} ]] && export NTHREADS_GLDAS=${nth_max} + export APRUN_GLDAS="${launcher} -n ${npe_gldas} -ppn ${npe_node_gldas} --cpu-bind depth --depth ${NTHREADS_GLDAS}" + + export NTHREADS_GAUSSIAN=${nth_gaussian:-1} + [[ ${NTHREADS_GAUSSIAN} -gt ${nth_max} ]] && export NTHREADS_GAUSSIAN=${nth_max} + export APRUN_GAUSSIAN="${launcher} -n ${npe_gaussian} -ppn ${npe_node_gaussian} --cpu-bind depth --depth ${NTHREADS_GAUSSIAN}" + + # Data processing must run with exactly as many tasks as there are + # time periods being processed. + export USE_CFP=${USE_CFP:-"YES"} + npe_gldas_data_proc=$((gldas_spinup_hours + 12)) + export APRUN_GLDAS_DATA_PROC="${launcher} -np ${npe_gldas_data_proc} ${mpmd_opt}" + +elif [[ "${step}" = "eobs" ]]; then + + export OMP_PLACES=cores + export OMP_STACKSIZE=1G + export FI_OFI_RXM_SAR_LIMIT=3145728 + + nth_max=$((npe_node_max / npe_node_eobs)) + + export NTHREADS_GSI=${nth_eobs:-${nth_max}} + [[ ${NTHREADS_GSI} -gt ${nth_max} ]] && export NTHREADS_GSI=${nth_max} + export APRUN_GSI="${launcher} -n ${npe_gsi:-${npe_eobs}} -ppn ${npe_node_eobs} --cpu-bind depth --depth ${NTHREADS_GSI}" + + export CFP_MP=${CFP_MP:-"NO"} + export USE_CFP=${USE_CFP:-"YES"} + export APRUNCFP="${launcher} -np \$ncmd ${mpmd_opt}" + +elif [[ "${step}" = "eupd" ]]; then + + export OMP_PLACES=cores + export OMP_STACKSIZE=2G + export MPICH_COLL_OPT_OFF=1 + export FI_OFI_RXM_SAR_LIMIT=3145728 + + nth_max=$((npe_node_max / npe_node_eupd)) + + export NTHREADS_ENKF=${nth_eupd:-${nth_max}} + [[ ${NTHREADS_ENKF} -gt ${nth_max} ]] && export NTHREADS_ENKF=${nth_max} + export APRUN_ENKF="${launcher} -n ${npe_enkf:-${npe_eupd}} -ppn ${npe_node_eupd} --cpu-bind depth --depth ${NTHREADS_ENKF}" + + export CFP_MP=${CFP_MP:-"NO"} + export USE_CFP=${USE_CFP:-"YES"} + export APRUNCFP="${launcher} -np \$ncmd ${mpmd_opt}" + +elif 
[[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then + + if [[ "${CDUMP}" =~ "gfs" ]]; then + nprocs="npe_${step}_gfs" + ppn="npe_node_${step}_gfs"; [[ -n "${!ppn+0}" ]] || ppn="npe_node_${step}" + else + nprocs="npe_${step}" + ppn="npe_node_${step}" + fi + (( nnodes = (${!nprocs}+${!ppn}-1)/${!ppn} )) + (( ntasks = nnodes*${!ppn} )) + # With ESMF threading, the model wants to use the full node + export APRUN_UFS="${launcher} -n ${ntasks} -ppn ${!ppn} --cpu-bind depth --depth 1" + unset nprocs ppn nnodes ntasks + + # TODO: Why are fcst and efcs so different on WCOSS2? + # TODO: Compare these with the ufs-weather-model regression test job card at: + # https://github.com/ufs-community/ufs-weather-model/blob/develop/tests/fv3_conf/fv3_qsub.IN_wcoss2 + export FI_OFI_RXM_RX_SIZE=40000 + export FI_OFI_RXM_TX_SIZE=40000 + if [[ "${step}" = "fcst" ]]; then + export OMP_PLACES=cores + export OMP_STACKSIZE=2048M + elif [[ "${step}" = "efcs" ]]; then + export MPICH_MPIIO_HINTS="*:romio_cb_write=disable" + export FI_OFI_RXM_SAR_LIMIT=3145728 + fi + +elif [[ "${step}" = "post" ]]; then + + nth_max=$((npe_node_max / npe_node_post)) + + export NTHREADS_NP=${nth_np:-1} + [[ ${NTHREADS_NP} -gt ${nth_max} ]] && export NTHREADS_NP=${nth_max} + export APRUN_NP="${launcher} -n ${npe_np:-${npe_post}} -ppn ${npe_node_post} --cpu-bind depth --depth ${NTHREADS_NP}" + + export NTHREADS_DWN=${nth_dwn:-1} + [[ ${NTHREADS_DWN} -gt ${nth_max} ]] && export NTHREADS_DWN=${nth_max} + export APRUN_DWN="${launcher} -np ${npe_dwn} ${mpmd_opt}" + +elif [[ "${step}" = "ecen" ]]; then + + nth_max=$((npe_node_max / npe_node_ecen)) + + export NTHREADS_ECEN=${nth_ecen:-${nth_max}} + [[ ${NTHREADS_ECEN} -gt ${nth_max} ]] && export NTHREADS_ECEN=${nth_max} + export APRUN_ECEN="${launcher} -n ${npe_ecen} -ppn ${npe_node_ecen} --cpu-bind depth --depth ${NTHREADS_ECEN}" + + export NTHREADS_CHGRES=${nth_chgres:-14} + [[ ${NTHREADS_CHGRES} -gt ${npe_node_max} ]] && export NTHREADS_CHGRES=${npe_node_max} + export 
APRUN_CHGRES="time" + + export NTHREADS_CALCINC=${nth_calcinc:-1} + [[ ${NTHREADS_CALCINC} -gt ${nth_max} ]] && export NTHREADS_CALCINC=${nth_max} + export APRUN_CALCINC="${launcher} -n ${npe_ecen}" + + export NTHREADS_CYCLE=${nth_cycle:-14} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + export APRUN_CYCLE="${launcher} -n ${npe_ecen} -ppn ${npe_node_cycle} --cpu-bind depth --depth ${NTHREADS_CYCLE}" + +elif [[ "${step}" = "esfc" ]]; then + + nth_max=$((npe_node_max / npe_node_esfc)) + + export NTHREADS_ESFC=${nth_esfc:-${nth_max}} + [[ ${NTHREADS_ESFC} -gt ${nth_max} ]] && export NTHREADS_ESFC=${nth_max} + export APRUN_ESFC="${launcher} -n ${npe_esfc} -ppn ${npe_node_esfc} --cpu-bind depth --depth ${NTHREADS_ESFC}" + + export NTHREADS_CYCLE=${nth_cycle:-14} + [[ ${NTHREADS_CYCLE} -gt ${npe_node_max} ]] && export NTHREADS_CYCLE=${npe_node_max} + export APRUN_CYCLE="${launcher} -n ${npe_esfc} -ppn ${npe_node_cycle} --cpu-bind depth --depth ${NTHREADS_CYCLE}" + +elif [[ "${step}" = "epos" ]]; then + + nth_max=$((npe_node_max / npe_node_epos)) + + export NTHREADS_EPOS=${nth_epos:-${nth_max}} + [[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max} + export APRUN_EPOS="${launcher} -n ${npe_epos} -ppn ${npe_node_epos} --cpu-bind depth --depth ${NTHREADS_EPOS}" + +elif [[ "${step}" = "init" ]]; then + + export APRUN="${launcher}" + +elif [[ "${step}" = "postsnd" ]]; then + + export MPICH_MPIIO_HINTS_DISPLAY=1 + export OMP_NUM_THREADS=1 + + nth_max=$((npe_node_max / npe_node_postsnd)) + + export NTHREADS_POSTSND=${nth_postsnd:-1} + [[ ${NTHREADS_POSTSND} -gt ${nth_max} ]] && export NTHREADS_POSTSND=${nth_max} + export APRUN_POSTSND="${launcher} -n ${npe_postsnd} --depth=${NTHREADS_POSTSND} --cpu-bind depth" + + export NTHREADS_POSTSNDCFP=${nth_postsndcfp:-1} + [[ ${NTHREADS_POSTSNDCFP} -gt ${nth_max} ]] && export NTHREADS_POSTSNDCFP=${nth_max} + export APRUN_POSTSNDCFP="${launcher} -np ${npe_postsndcfp} 
${mpmd_opt}" + +elif [[ "${step}" = "awips" ]]; then + + nth_max=$((npe_node_max / npe_node_awips)) + + export NTHREADS_AWIPS=${nth_awips:-2} + [[ ${NTHREADS_AWIPS} -gt ${nth_max} ]] && export NTHREADS_AWIPS=${nth_max} + export APRUN_AWIPSCFP="${launcher} -np ${npe_awips} ${mpmd_opt}" + +elif [[ "${step}" = "gempak" ]]; then + + if [[ ${CDUMP} == "gfs" ]]; then + npe_gempak=${npe_gempak_gfs} + npe_node_gempak=${npe_node_gempak_gfs} + fi + + nth_max=$((npe_node_max / npe_node_gempak)) + + export NTHREADS_GEMPAK=${nth_gempak:-1} + [[ ${NTHREADS_GEMPAK} -gt ${nth_max} ]] && export NTHREADS_GEMPAK=${nth_max} + export APRUN_GEMPAKCFP="${launcher} -np ${npe_gempak} ${mpmd_opt}" + +elif [[ "${step}" = "fit2obs" ]]; then + + nth_max=$((npe_node_max / npe_node_fit2obs)) + + export NTHREADS_FIT2OBS=${nth_fit2obs:-1} + [[ ${NTHREADS_FIT2OBS} -gt ${nth_max} ]] && export NTHREADS_FIT2OBS=${nth_max} + export MPIRUN="${launcher} -np ${npe_fit2obs}" + +elif [[ "${step}" = "waveawipsbulls" ]]; then + + unset PERL5LIB + +elif [[ "${step}" = "wafsgrib2" ]] || [[ "${step}" = "wafsgrib20p25" ]]; then + + export USE_CFP=${USE_CFP:-"YES"} + +fi diff --git a/env/gfs.ver b/env/gfs.ver deleted file mode 100644 index a8f32bd2893..00000000000 --- a/env/gfs.ver +++ /dev/null @@ -1,22 +0,0 @@ -export gfs_ver=v15.0.0 - -export crtm_ver=2.3.0 -export hwrf_ver=v11.0.0 -export g2tmpl_ver=1.4.0 - -export grib_util_ver=1.1.0 -export util_shared_ver=1.0.6 -export cfp_intel_sandybridge_ver=1.1.0 -export iobuf_ver=2.0.7 -export ESMF_intel_sandybridge_ver=3_1_0rp5 -export ESMF_intel_haswell_ver=3_1_0rp5 -export gempak_ver=7.3.3 -export old_gempak_ver=6.32.0 -export NCL_gnu_sandybridge_ver=6.3.0 -export ncarg_intel_sandybridge_ver=6.1.0 -export dumpjb_ver=5.1.0 - -## FOLLOWING are used by JGDAS_TROPC -export obsproc_dump_ver=v4.0.0 -export obsproc_shared_bufr_dumplist_ver=v1.5.0 - diff --git a/gempak/ush/gempak_gfs_f00_gif.sh b/gempak/ush/gempak_gfs_f00_gif.sh index 172cb687a2e..2a7cca5c9f6 100755 --- 
a/gempak/ush/gempak_gfs_f00_gif.sh +++ b/gempak/ush/gempak_gfs_f00_gif.sh @@ -593,7 +593,7 @@ if [ $SENDCOM = YES ]; then export input=${COMOUT}/${hgttmp500dev} export HEADER=YES export OUTPATH=$DATA/gfs_500_hgt_tmp_nh_anl_${cyc}.tif - ${UTILgfs}/ush/make_tif.sh + ${USHgfs}/make_tif.sh fi msg=" GEMPAK_GIF ${fhr} hour completed normally" diff --git a/jobs/JGDAS_ATMOS_ANALYSIS_DIAG b/jobs/JGDAS_ATMOS_ANALYSIS_DIAG index 4b2728e13f4..6ad5c8f31b5 100755 --- a/jobs/JGDAS_ATMOS_ANALYSIS_DIAG +++ b/jobs/JGDAS_ATMOS_ANALYSIS_DIAG @@ -1,118 +1,41 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base anal analdiag" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env anal -status=$? -[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} - -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. 
./PDY - - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "anal" -c "base anal analdiag" ############################################## # Set variables used in the script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gfs"}} -export COMPONENT=${COMPONENT:-atmos} -export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} +export CDUMP="${RUN/enkf}" +export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} ############################################## # Begin JOB SPECIFIC work ############################################## -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -${assim_freq} ${PDY}${cyc}) +# shellcheck disable= +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP="gdas" +export GDUMP_ENS="enkf${GDUMP}" export OPREFIX="${CDUMP}.t${cyc}z." export GPREFIX="${GDUMP}.t${gcyc}z." -export APREFIX="${CDUMP}.t${cyc}z." 
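The modernized j-job above derives `gPDY`/`gcyc` with bash substring expansion instead of piping through `cut`. A quick sketch using an arbitrary sample datestamp (the real value comes from `$NDATE`):

```shell
GDATE="2023051718"        # sample YYYYMMDDHH; normally produced by $NDATE
gPDY=${GDATE:0:8}         # characters 0-7: the date, 20230517
gcyc=${GDATE:8:2}         # characters 8-9: the cycle hour, 18
echo "${gPDY} ${gcyc}"
```

Substring expansion avoids two subshell forks per variable, which adds up across the many j-jobs touched by this patch.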
-export GSUFFIX=${GSUFFIX:-$SUFFIX} -export ASUFFIX=${ASUFFIX:-$SUFFIX} - - -if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then - export COMIN=${COMIN:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMOUT=${COMOUT:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT} -else - export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -fi -mkdir -m 775 -p $COMOUT -# COMIN_GES and COMIN_GES_ENS are used in script -export COMIN_GES="$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT" - - -export ATMGES="$COMIN_GES/${GPREFIX}atmf006${GSUFFIX}" -if [ ! -f $ATMGES ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES = $ATMGES" - exit 1 -fi - - -if [ $DOHYBVAR = "YES" ]; then - export ATMGES_ENSMEAN="$COMIN_GES_ENS/${GPREFIX}atmf006.ensmean$GSUFFIX" - if [ ! -f $ATMGES_ENSMEAN ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = $ATMGES_ENSMEAN" - exit 2 - fi -fi +export APREFIX="${RUN}.t${cyc}z." +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS +mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}" ############################################################### # Run relevant script -${ANALDIAGSH:-$SCRgfs/exglobal_diag.sh} +${ANALDIAGSH:-${SCRgfs}/exglobal_diag.sh} status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## @@ -122,15 +45,15 @@ status=$? 
############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGDAS_ATMOS_CHGRES_FORENKF b/jobs/JGDAS_ATMOS_CHGRES_FORENKF index d2268df7672..1bbed535869 100755 --- a/jobs/JGDAS_ATMOS_CHGRES_FORENKF +++ b/jobs/JGDAS_ATMOS_CHGRES_FORENKF @@ -1,62 +1,13 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base anal echgres" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env anal -status=$? -[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} - -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. 
./PDY - - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "anal" -c "base anal echgres" ############################################## # Set variables used in the script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gfs"}} -export COMPONENT=${COMPONENT:-atmos} +export CDUMP=${RUN/enkf} export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} @@ -64,40 +15,17 @@ export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} # Begin JOB SPECIFIC work ############################################## -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} - -export OPREFIX="${CDUMP}.t${cyc}z." -export GPREFIX="${GDUMP}.t${gcyc}z." export APREFIX="${CDUMP}.t${cyc}z." -export GSUFFIX=${GSUFFIX:-$SUFFIX} -export ASUFFIX=${ASUFFIX:-$SUFFIX} +export APREFIX_ENS="${RUN}.t${cyc}z." 
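The `CDUMP=${RUN/enkf}` assignment above uses bash pattern substitution with an empty replacement, which simply deletes the first match; a sketch with sample values:

```shell
RUN="enkfgdas"
CDUMP="${RUN/enkf}"       # delete the first "enkf" -> "gdas"
echo "${CDUMP}"

RUN="gfs"                 # a RUN without the prefix passes through unchanged
echo "${RUN/enkf}"        # gfs
```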
- -if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then - export COMIN=${COMIN:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMOUT=${COMOUT:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMOUT_ENS=${COMOUT_ENS:-$ROTDIR/enkfgdas.$PDY/$cyc/$COMPONENT} - export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT} -else - export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMOUT_ENS="$ROTDIR/enkfgdas.$PDY/$cyc/$COMPONENT" - export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -fi -mkdir -m 775 -p $COMOUT -# COMIN_GES and COMIN_GES_ENS are used in script -export COMIN_GES="$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT" +RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY +MEMDIR="mem001" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY_MEM:COM_ATMOS_HISTORY_TMPL ############################################################### # Run relevant script -${CHGRESFCSTSH:-$SCRgfs/exgdas_atmos_chgres_forenkf.sh} +${CHGRESFCSTSH:-${SCRgfs}/exgdas_atmos_chgres_forenkf.sh} status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## @@ -107,15 +35,15 @@ status=$? 
############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGDAS_ATMOS_GEMPAK b/jobs/JGDAS_ATMOS_GEMPAK index 88654324501..f0131ffb948 100755 --- a/jobs/JGDAS_ATMOS_GEMPAK +++ b/jobs/JGDAS_ATMOS_GEMPAK @@ -1,41 +1,20 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak" -c "base gempak" -############################################ -# GDAS GEMPAK PRODUCT GENERATION -############################################ - -########################################################## -# obtain unique process id (pid) and make temp directory -########################################################## -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -########################################### -# Run setpdy and initialize PDY variables -########################################### -setpdy.sh -. 
PDY +# TODO (#1219) This j-job is not part of the rocoto suite ################################ # Set up the HOME directory -################################ -export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} -export EXECgfs=${EXECgfs:-$HOMEgfs/exec} -export PARMgfs=${PARMgfs:-$HOMEgfs/parm} -export PARMwmo=${PARMwmo:-$HOMEgfs/parm/wmo} -export PARMproduct=${PARMproduct:-$HOMEgfs/parm/product} -export FIXgempak=${FIXgempak:-$HOMEgfs/gempak/fix} -export USHgempak=${USHgempak:-$HOMEgfs/gempak/ush} -export SRCgfs=${SRCgfs:-$HOMEgfs/scripts} -export UTILgfs=${UTILgfs:-$HOMEgfs/util} +export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} +export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} +export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} +export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} +export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} +export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} +export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} +export UTILgfs=${UTILgfs:-${HOMEgfs}/util} ############################################ # Set up model and cycle specific variables @@ -48,73 +27,75 @@ export GRIB=pgrb2f export EXT="" export DBN_ALERT_TYPE=GDAS_GEMPAK +export SENDCOM=${SENDCOM:-NO} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} ################################### # Specify NET and RUN Name and model #################################### -export NET=${NET:-gfs} -export RUN=${RUN:-gdas} export model=${model:-gdas} -export COMPONENT=${COMPONENT:-atmos} ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT/gempak} +for grid in 0p25 0p50 1p00; do + GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GRIB_${grid}:COM_ATMOS_GRIB_TMPL" +done -if [ $SENDCOM = YES ] ; then - mkdir -m 775 -p $COMOUT -fi +for grid in 
1p00 0p25; do + prod_dir="COM_ATMOS_GEMPAK_${grid}" + GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_${grid}:COM_ATMOS_GEMPAK_TMPL" -export pgmout=OUTPUT.$$ + if [[ ${SENDCOM} == YES && ! -d "${!prod_dir}" ]] ; then + mkdir -m 775 -p "${!prod_dir}" + fi +done -if [ -f $DATA/poescrip ]; then - rm $DATA/poescript +# TODO: These actions belong in an ex-script not a j-job +if [[ -f poescript ]]; then + rm -f poescript fi ######################################################## # Execute the script. -echo "$SRCgfs/exgdas_atmos_nawips.sh gdas 009 GDAS_GEMPAK " >> poescript +echo "${SRCgfs}/exgdas_atmos_nawips.sh gdas 009 GDAS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" >> poescript ######################################################## ######################################################## # Execute the script for quater-degree grib -echo "$SRCgfs/exgdas_atmos_nawips.sh gdas_0p25 009 GDAS_GEMPAK " >>poescript +echo "${SRCgfs}/exgdas_atmos_nawips.sh gdas_0p25 009 GDAS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" >> poescript ######################################################## cat poescript -chmod 775 $DATA/poescript +chmod 775 ${DATA}/poescript export MP_PGMMODEL=mpmd -export MP_CMDFILE=$DATA/poescript +export MP_CMDFILE=${DATA}/poescript -ntasks=${NTASKS_GEMPAK:-$(cat $DATA/poescript | wc -l)} +ntasks=${NTASKS_GEMPAK:-$(cat ${DATA}/poescript | wc -l)} ptile=${PTILE_GEMPAK:-4} threads=${NTHREADS_GEMPAK:-1} -export OMP_NUM_THREADS=$threads -APRUN="mpirun -n $ntasks cfp " +export OMP_NUM_THREADS=${threads} +APRUN="mpiexec -l -np ${ntasks} --cpu-bind verbose,core cfp" -APRUN_GEMPAKCFP=${APRUN_GEMPAKCFP:-$APRUN} -APRUNCFP=$(eval echo $APRUN_GEMPAKCFP) +APRUN_GEMPAKCFP=${APRUN_GEMPAKCFP:-${APRUN}} -$APRUNCFP $DATA/poescript +${APRUN_GEMPAKCFP} ${DATA}/poescript export err=$?; err_chk ############################################ # print exec I/O output ############################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + 
cat ${pgmout} fi ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ] ; then - rm -rf $DATA +if [ "${KEEPDATA}" != "YES" ] ; then + rm -rf ${DATA} fi diff --git a/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC b/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC index ffb46db0f92..beadb7ccf85 100755 --- a/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC +++ b/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC @@ -1,54 +1,37 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - ############################################ # GDAS GEMPAK META NCDC PRODUCT GENERATION ############################################ -########################################################## -# obtain unique process id (pid) and make temp directory -########################################################## -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} +# TODO (#1222) This j-job is not part of the rocoto suite -########################################### -# Run setpdy and initialize PDY variables -########################################### -setpdy.sh -. 
PDY +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak_meta" -c "base gempak" ################################ # Set up the HOME directory ################################ -export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} -export EXECgfs=${EXECgfs:-$HOMEgfs/exec} -export PARMgfs=${PARMgfs:-$HOMEgfs/parm} -export PARMwmo=${PARMwmo:-$HOMEgfs/parm/wmo} -export PARMproduct=${PARMproduct:-$HOMEgfs/parm/product} -export FIXgempak=${FIXgempak:-$HOMEgfs/gempak/fix} -export USHgempak=${USHgempak:-$HOMEgfs/gempak/ush} -export SRCgfs=${SRCgfs:-$HOMEgfs/scripts} -export UTILgfs=${UTILgfs:-$HOMEgfs/util} +export HOMEgfs=${HOMEgfs:-${PACKAGEROOT}/gfs.${gfs_ver}} +export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} +export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} +export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} +export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} +export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} +export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} +export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} +export UTILgfs=${UTILgfs:-${HOMEgfs}/util} # # Now set up GEMPAK/NTRANS environment # -cp $FIXgempak/datatype.tbl datatype.tbl +cp ${FIXgempak}/datatype.tbl datatype.tbl ################################### # Specify NET and RUN Name and model #################################### -export NET=${NET:-gfs} -export RUN=${RUN:-gdas} -export COMPONENT=${COMPONENT:-atmos} +export COMPONENT="atmos" export MODEL=GDAS export GRID_NAME=gdas export fend=09 @@ -67,19 +50,19 @@ export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT/gempak} -export COMINgdas=${COMINgdas:-$(compath.py ${NET}/${envir}/${RUN})} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT/gempak/meta} -export 
COMOUTncdc=${COMOUTncdc:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT} +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}/gempak} +export COMINgdas=${COMINgdas:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}} +export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}/gempak/meta} +export COMOUTncdc=${COMOUTncdc:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}} -export COMINukmet=${COMINukmet:-$(compath.py nawips/prod/ukmet)} -export COMINecmwf=${COMINecmwf:-$(compath.py ecmwf/prod/ecmwf)} +export COMINukmet=${COMINukmet:-$(compath.py ${envir}/ukmet/${ukmet_ver})/ukmet} +export COMINecmwf=${COMINecmwf:-$(compath.py ${envir}/ecmwf/${ecmwf_ver})/ecmwf} export COMOUTukmet=${COMOUT} export COMOUTecmwf=${COMOUT} -if [ $SENDCOM = YES ] ; then - mkdir -m 775 -p $COMOUT $COMOUTncdc $COMOUTukmet $COMOUTecmwf +if [ ${SENDCOM} = YES ] ; then + mkdir -m 775 -p ${COMOUT} ${COMOUTncdc} ${COMOUTukmet} ${COMOUTecmwf} fi export pgmout=OUTPUT.$$ @@ -87,10 +70,10 @@ export pgmout=OUTPUT.$$ ######################################################## # Execute the script. -$USHgempak/gdas_meta_na.sh -$USHgempak/gdas_ecmwf_meta_ver.sh -$USHgempak/gdas_meta_loop.sh -$USHgempak/gdas_ukmet_meta_ver.sh +${USHgempak}/gdas_meta_na.sh +${USHgempak}/gdas_ecmwf_meta_ver.sh +${USHgempak}/gdas_meta_loop.sh +${USHgempak}/gdas_ukmet_meta_ver.sh export err=$?; err_chk ######################################################## @@ -100,21 +83,21 @@ export err=$?; err_chk ######################################################## # Execute the script. 
-$SRCgfs/exgdas_atmos_gempak_gif_ncdc.sh
+${SRCgfs}/exgdas_atmos_gempak_gif_ncdc.sh
 export err=$?; err_chk
 ########################################################
 ############################################
 # print exec I/O output
 ############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 ###################################
 # Remove temp directories
 ###################################
-if [ "$KEEPDATA" != "YES" ] ; then
-  rm -rf $DATA
+if [ "${KEEPDATA}" != "YES" ] ; then
+  rm -rf ${DATA}
 fi
diff --git a/jobs/JGDAS_ATMOS_GLDAS b/jobs/JGDAS_ATMOS_GLDAS
new file mode 100755
index 00000000000..dee6b4c9e35
--- /dev/null
+++ b/jobs/JGDAS_ATMOS_GLDAS
@@ -0,0 +1,85 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs:?}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "gldas" -c "base gldas"
+
+if [[ "${cyc:?}" -ne "${gldas_cyc:?}" ]]; then
+  echo "GLDAS only runs for ${gldas_cyc} cycle; Skip GLDAS step for cycle ${cyc}"
+  rm -Rf "${DATA}"
+  exit 0
+fi
+
+# Use ":-" (default-if-unset-or-null), not "-:", so the arithmetic below gets 72
+gldas_spinup_hours=${gldas_spinup_hours:-72}
+xtime=$((gldas_spinup_hours+12))
+if [[ "${CDATE}" -le "$(${NDATE:?} +"${xtime}" "${SDATE:?}")" ]]; then
+  echo "GLDAS needs fluxes as forcing from cycles in previous ${xtime} hours"
+  echo "starting from ${SDATE}. This gldas cycle is skipped"
+  rm -Rf "${DATA}"
+  exit 0
+fi
+
+
+##############################################
+# Set variables used in the exglobal script
+##############################################
+export CDATE=${CDATE:-${PDY}${cyc}}
+export CDUMP=${CDUMP:-${RUN:-"gdas"}}
+export COMPONENT="atmos"
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+export gldas_ver=${gldas_ver:-v2.3.0}
+export HOMEgldas=${HOMEgldas:-${HOMEgfs}}
+export FIXgldas=${FIXgldas:-${HOMEgldas}/fix/gldas}
+export PARMgldas=${PARMgldas:-${HOMEgldas}/parm/gldas}
+export EXECgldas=${EXECgldas:-${HOMEgldas}/exec}
+export USHgldas=${USHgldas:-${HOMEgldas}/ush}
+export PARA_CONFIG=${HOMEgfs}/parm/config/config.gldas
+
+if [[ "${RUN_ENVIR}" = "nco" ]]; then
+  export COMIN=${COMIN:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
+  export COMOUT=${COMOUT:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
+else
+  export COMIN="${ROTDIR}/${CDUMP}.${PDY}/${cyc}/atmos"
+  export COMOUT="${ROTDIR}/${CDUMP}.${PDY}/${cyc}/atmos"
+fi
+if [[ ! -d ${COMOUT} ]]; then
+  mkdir -p "${COMOUT}"
+  chmod 775 "${COMOUT}"
+fi
+
+export COMINgdas=${COMINgdas:-${ROTDIR}}
+export DCOMIN=${DCOMIN:-${DCOMROOT:-"/lfs/h1/ops/prod/dcom"}}
+
+export model=${model:-noah}
+# Uppercase the model name via command substitution (a quoted "| tr" would be a literal string)
+export MODEL=${MODEL:-$(echo "${model}" | tr 'a-z' 'A-Z')}
+
+
+###############################################################
+# Run relevant exglobal script
+
+${GLDASSH:-${HOMEgldas}/scripts/exgdas_atmos_gldas.sh}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Remove the Temporary working directory
+##########################################
+cd "${DATAROOT}" || exit 1
+[[ ${KEEPDATA:?} = "NO" ]] && rm -rf "${DATA}"
+
+exit 0
+
diff --git a/jobs/JGDAS_ATMOS_VERFOZN b/jobs/JGDAS_ATMOS_VERFOZN
new file mode 100755
index 00000000000..deccc0b28ee
--- /dev/null
+++ b/jobs/JGDAS_ATMOS_VERFOZN
@@ -0,0 +1,86 @@
+#! /usr/bin/env bash
+
+#############################################################
+# Set up environment for GDAS Ozone Monitor job
+#############################################################
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy"
+
+export OZNMON_SUFFIX=${OZNMON_SUFFIX:-${NET}}
+
+#---------------------------------------------
+# Specify Execution Areas
+#
+export HOMEgfs_ozn=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
+export HOMEgdas_ozn=${HOMEgfs_ozn:-${NWROOT}/gfs.${gfs_ver}}
+export PARMgdas_ozn=${PARMgfs_ozn:-${HOMEgfs_ozn}/parm/mon}
+export SCRgdas_ozn=${SCRgfs_ozn:-${HOMEgfs_ozn}/scripts}
+export FIXgdas_ozn=${FIXgfs_ozn:-${HOMEgfs_ozn}/fix/gdas}
+
+export HOMEoznmon=${HOMEoznmon:-${HOMEgfs_ozn}}
+export EXECoznmon=${EXECoznmon:-${HOMEoznmon}/exec}
+export FIXoznmon=${FIXoznmon:-${HOMEoznmon}/fix}
+export USHoznmon=${USHoznmon:-${HOMEoznmon}/ush}
+
+
+#-----------------------------------
+# source the parm file
+#
+. ${PARMgdas_ozn}/gdas_oznmon.parm
+
+
+#############################################
+# determine PDY and cyc for previous cycle
+#############################################
+
+pdate=$(${NDATE} -6 ${PDY}${cyc})
+echo "pdate = ${pdate}"
+
+export P_PDY=${pdate:0:8}
+export p_cyc=${pdate:8:2}
+
+#---------------------------------------------
+# OZN_TANKDIR - WHERE OUTPUT DATA WILL RESIDE
+#
+export OZN_TANKDIR=${OZN_TANKDIR:-$(compath.py ${envir}/${NET}/${gfs_ver})}
+export TANKverf_ozn=${TANKverf_ozn:-${OZN_TANKDIR}/${RUN}.${PDY}/${cyc}/atmos/oznmon}
+export TANKverf_oznM1=${TANKverf_oznM1:-${OZN_TANKDIR}/${RUN}.${P_PDY}/${p_cyc}/atmos/oznmon}
+
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS
+
+if [[ ! -d ${TANKverf_ozn} ]]; then
+  mkdir -p -m 775 ${TANKverf_ozn}
+fi
+
+#---------------------------------------
+# set up validation file
+#
+if [[ ${VALIDATE_DATA} -eq 1 ]]; then
+  export ozn_val_file=${ozn_val_file:-${FIXgdas_ozn}/gdas_oznmon_base.tar}
+fi
+
+#---------------------------------------
+# Set necessary environment variables
+#
+export OZN_AREA=${OZN_AREA:-glb}
+export oznstat=${oznstat:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.oznstat}
+
+
+#-------------------------------------------------------
+# Execute the script.
+#
+${OZNMONSH:-${SCRgdas_ozn}/exgdas_atmos_verfozn.sh} ${PDY} ${cyc}
+err=$?
+[[ ${err} -ne 0 ]] && exit ${err}
+
+
+################################
+# Remove the Working Directory
+################################
+KEEPDATA=${KEEPDATA:-NO}
+cd ${DATAROOT}
+if [ ${KEEPDATA} = NO ] ; then
+  rm -rf ${DATA}
+fi
+
+exit 0
diff --git a/jobs/JGDAS_ATMOS_VERFRAD b/jobs/JGDAS_ATMOS_VERFRAD
new file mode 100755
index 00000000000..42e112c74f9
--- /dev/null
+++ b/jobs/JGDAS_ATMOS_VERFRAD
@@ -0,0 +1,97 @@
+#! /usr/bin/env bash
+
+#############################################################
+# Set up environment for GDAS Radiance Monitor job
+#############################################################
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy"
+
+export COMPONENT="atmos"
+
+export RAD_DATA_IN=${DATA}
+
+export RADMON_SUFFIX=${RADMON_SUFFIX:-${RUN}}
+export CYCLE_INTERVAL=${CYCLE_INTERVAL:-6}
+
+mkdir -p ${RAD_DATA_IN}
+cd ${RAD_DATA_IN}
+
+##############################################
+# Specify Execution Areas
+##############################################
+export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
+export SCRgfs=${SCRgfs:-${HOMEgfs}/scripts}
+
+export FIXgdas=${FIXgdas:-${HOMEgfs}/fix/gdas}
+export PARMmon=${PARMmon:-${HOMEgfs}/parm/mon}
+
+export HOMEradmon=${HOMEradmon:-${HOMEgfs}}
+export EXECradmon=${EXECradmon:-${HOMEradmon}/exec}
+export FIXradmon=${FIXradmon:-${FIXgfs}}
+export USHradmon=${USHradmon:-${HOMEradmon}/ush}
+
+
+###################################
+# source the parm file
+###################################
+parm_file=${parm_file:-${PARMmon}/da_mon.parm}
+. ${parm_file}
+
+
+#############################################
+# determine PDY and cyc for previous cycle
+#############################################
+
+pdate=$(${NDATE} -6 ${PDY}${cyc})
+echo "pdate = ${pdate}"
+
+export P_PDY=${pdate:0:8}
+export p_cyc=${pdate:8:2}
+
+#############################################
+# COMOUT - WHERE GSI OUTPUT RESIDES
+# TANKverf - WHERE OUTPUT DATA WILL RESIDE
+#############################################
+export TANKverf=${TANKverf:-$(compath.py ${envir}/${NET}/${gfs_ver})}
+export TANKverf_rad=${TANKverf_rad:-${TANKverf}/${RUN}.${PDY}/${cyc}/atmos/radmon}
+export TANKverf_radM1=${TANKverf_radM1:-${TANKverf}/${RUN}.${P_PDY}/${p_cyc}/atmos/radmon}
+
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS
+
+mkdir -p -m 775 ${TANKverf_rad}
+
+########################################
+# Set necessary environment variables
+########################################
+export RAD_AREA=${RAD_AREA:-glb}
+
+export biascr=${biascr:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.abias}
+export radstat=${radstat:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.radstat}
+
+echo " "
+echo "JOB HAS STARTED"
+echo " "
+
+
+########################################################
+# Execute the script.
+${RADMONSH:-${SCRgfs}/exgdas_atmos_verfrad.sh} ${PDY} ${cyc}
+err=$?
+
+if [[ ${err} -ne 0 ]] ; then
+  exit ${err}
+else
+  echo " "
+  echo "JOB HAS COMPLETED NORMALLY"
+  echo " "
+fi
+
+################################
+# Remove the Working Directory
+################################
+KEEPDATA=${KEEPDATA:-YES}
+cd ${DATAROOT}
+if [ ${KEEPDATA} = NO ] ; then
+  rm -rf ${RAD_DATA_IN}
+fi
+
diff --git a/jobs/JGDAS_ATMOS_VMINMON b/jobs/JGDAS_ATMOS_VMINMON
new file mode 100755
index 00000000000..3f9c0d856f8
--- /dev/null
+++ b/jobs/JGDAS_ATMOS_VMINMON
@@ -0,0 +1,74 @@
+#! /usr/bin/env bash
+
+###########################################################
+# GDAS Minimization Monitor (MinMon) job
+###########################################################
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy"
+
+###########################################################
+# obtain unique process id (pid) and make temp directories
+###########################################################
+export MINMON_SUFFIX=${MINMON_SUFFIX:-${NET}}
+export m_job=${m_job:-${MINMON_SUFFIX}_mmDE}
+
+
+##############################################
+# Specify Package Areas
+##############################################
+export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
+export SCRgfs=${SCRgfs:-${HOMEgfs}/scripts}
+
+export M_FIXgdas=${M_FIXgdas:-${HOMEgfs}/fix/gdas}
+
+export HOMEminmon=${HOMEminmon:-${HOMEgfs}}
+export EXECminmon=${EXECminmon:-${HOMEminmon}/exec}
+export USHminmon=${USHminmon:-${HOMEminmon}/ush}
+
+
+#############################################
+# determine PDY and cyc for previous cycle
+#############################################
+
+pdate=$(${NDATE} -6 ${PDY}${cyc})
+echo "pdate = ${pdate}"
+
+export P_PDY=${pdate:0:8}
+export p_cyc=${pdate:8:2}
+
+
+#############################################
+# TANKverf - WHERE OUTPUT DATA WILL RESIDE
+#############################################
+export M_TANKverf=${M_TANKverf:-${COM_IN}/${RUN}.${PDY}/${cyc}/atmos/minmon}
+export M_TANKverfM1=${M_TANKverfM1:-${COM_IN}/${RUN}.${P_PDY}/${p_cyc}/atmos/minmon}
+
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS
+
+mkdir -p -m 775 ${M_TANKverf}
+
+
+########################################
+# Set necessary environment variables
+########################################
+export CYCLE_INTERVAL=6
+export gsistat=${gsistat:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.gsistat}
+
+
+########################################################
+# Execute the script.
+${GMONSH:-${SCRgfs}/exgdas_atmos_vminmon.sh} ${PDY} ${cyc}
+err=$?
+[[ ${err} -ne 0 ]] && exit ${err}
+
+
+################################
+# Remove the Working Directory
+################################
+KEEPDATA=${KEEPDATA:-NO}
+cd ${DATAROOT}
+if [ ${KEEPDATA} = NO ] ; then
+  rm -rf ${DATA}
+fi
+
+exit 0
diff --git a/jobs/JGDAS_ENKF_ARCHIVE b/jobs/JGDAS_ENKF_ARCHIVE
new file mode 100755
index 00000000000..37f4e17b9b8
--- /dev/null
+++ b/jobs/JGDAS_ENKF_ARCHIVE
@@ -0,0 +1,44 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "earc" -c "base earc"
+
+
+##############################################
+# Set variables used in the script
+##############################################
+export CDUMP=${RUN/enkf}
+
+YMD=${PDY} HH=${cyc} generate_com -rx COM_TOP
+MEMDIR="ensstat" YMD=${PDY} HH=${cyc} generate_com -rx \
+  COM_ATMOS_ANALYSIS_ENSSTAT:COM_ATMOS_ANALYSIS_TMPL
+
+###############################################################
+# Run archive script
+###############################################################
+
+"${SCRgfs}/exgdas_enkf_earc.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+###############################################################
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+
+##########################################
+# Remove the Temporary working directory
+##########################################
+cd "${DATAROOT}" || (echo "${DATAROOT} does not exist. ABORT!"; exit 1)
+[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}"
+
+exit 0
diff --git a/jobs/JGDAS_ENKF_DIAG b/jobs/JGDAS_ENKF_DIAG
index 5ce8d86b78d..e2684fded24 100755
--- a/jobs/JGDAS_ENKF_DIAG
+++ b/jobs/JGDAS_ENKF_DIAG
@@ -1,154 +1,105 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-configs="base anal eobs analdiag ediag"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env eobs
-status=$?
-[[ $status -ne 0 ]] && exit $status
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "eobs" -c "base anal eobs analdiag ediag"
 ##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
+# Set variables used in the script
 ##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
+export CDUMP="${RUN/enkf}"
+export MAKE_NSSTBUFR=${MAKE_NSSTBUFR:-"NO"}
+export MAKE_ACFTBUFR=${MAKE_ACFTBUFR:-"NO"}
 ##############################################
-# Determine Job Output Name on System
+# Begin JOB SPECIFIC work
 ##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+# Ignore possible spelling error (nothing is misspelled)
+# shellcheck disable=SC2153
+GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}")
+# shellcheck disable=
+export gPDY=${GDATE:0:8}
+export gcyc=${GDATE:8:2}
+export GDUMP="gdas"
+export GDUMP_ENS="enkf${GDUMP}"
+export CASE=${CASE_ENKF}
-##############################################
-# Set variables used in the script
-##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gdas"}}
-export COMPONENT=${COMPONENT:-atmos}
+export OPREFIX="${CDUMP}.t${cyc}z."
+export APREFIX="${RUN}.t${cyc}z."
+export GPREFIX="${GDUMP_ENS}.t${gcyc}z."
+GPREFIX_DET="${GDUMP}.t${gcyc}z."
+RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS
+MEMDIR="ensstat" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS
-##############################################
-# Begin JOB SPECIFIC work
-##############################################
-
-GDATE=$($NDATE -$assim_freq $CDATE)
-gPDY=$(echo $GDATE | cut -c1-8)
-gcyc=$(echo $GDATE | cut -c9-10)
-GDUMP=${GDUMP:-"gdas"}
-
-export CASE=$CASE_ENKF
-export CDUMP_OBS=${CDUMP_OBS:-$CDUMP}
-
-export OPREFIX="${CDUMP_OBS}.t${cyc}z."
-export APREFIX="${CDUMP}.t${cyc}z."
-export GPREFIX="${GDUMP}.t${gcyc}z."
-export GSUFFIX="${GSUFFIX:-".ensmean${SUFFIX}"}"
-export ASUFFIX="${ASUFFIX:-"${SUFFIX}"}"
-
-if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then
-  export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT}
-  export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT}
-else
-  export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-  export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT"
-fi
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_OBS_PREV:COM_OBS_TMPL \
+  COM_ATMOS_ANALYSIS_DET_PREV:COM_ATMOS_ANALYSIS_TMPL
-# COMIN_GES, COMIN_ANL COMIN_GES_ENS, and COMOUT are used in script
-COMIN_GES_CTL="$ROTDIR/gdas.$gPDY/$gcyc/$COMPONENT"
-export COMIN_ANL="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT"
-export COMIN_GES=$COMIN_GES_ENS
-export COMOUT="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT"
+MEMDIR="ensstat" RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL
-export ATMGES_ENSMEAN="$COMIN_GES_ENS/${GPREFIX}atmf006$GSUFFIX"
-if [ ! -f $ATMGES_ENSMEAN ]; then
-  echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = $ATMGES_ENSMEAN"
+export ATMGES_ENSMEAN="${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf006.ensmean.nc"
+if [ ! -f ${ATMGES_ENSMEAN} ]; then
+  echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = ${ATMGES_ENSMEAN}"
   exit 1
 fi
-
 # Link observational data
-export PREPQC="$COMIN_OBS/${OPREFIX}prepbufr"
-if [ ! -f $PREPQC ]; then
-  echo "WARNING: Global PREPBUFR FILE $PREPQC MISSING"
+export PREPQC="${COM_OBS}/${OPREFIX}prepbufr"
+if [[ ! -f ${PREPQC} ]]; then
+  echo "WARNING: Global PREPBUFR FILE ${PREPQC} MISSING"
+fi
+export TCVITL="${COM_OBS}/${OPREFIX}syndata.tcvitals.tm00"
+if [[ ${DONST} = "YES" ]]; then
+  export NSSTBF="${COM_OBS}/${OPREFIX}nsstbufr"
 fi
-export PREPQCPF="$COMIN_OBS/${OPREFIX}prepbufr.acft_profiles"
-export TCVITL="$COMIN_ANL/${OPREFIX}syndata.tcvitals.tm00"
-[[ $DONST = "YES" ]] && export NSSTBF="$COMIN_OBS/${OPREFIX}nsstbufr"
+export PREPQCPF="${COM_OBS}/${OPREFIX}prepbufr.acft_profiles"
 # Guess Bias correction coefficients related to control
-export GBIAS=${COMIN_GES_CTL}/${GPREFIX}abias
-export GBIASPC=${COMIN_GES_CTL}/${GPREFIX}abias_pc
-export GBIASAIR=${COMIN_GES_CTL}/${GPREFIX}abias_air
-export GRADSTAT=${COMIN_GES_CTL}/${GPREFIX}radstat
+export GBIAS=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}abias
+export GBIASPC=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}abias_pc
+export GBIASAIR=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}abias_air
+export GRADSTAT=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}radstat
 # Bias correction coefficients related to ensemble mean
-export ABIAS="$COMOUT/${APREFIX}abias.ensmean"
-export ABIASPC="$COMOUT/${APREFIX}abias_pc.ensmean"
-export ABIASAIR="$COMOUT/${APREFIX}abias_air.ensmean"
-export ABIASe="$COMOUT/${APREFIX}abias_int.ensmean"
+export ABIAS="${COM_ATMOS_ANALYSIS}/${APREFIX}abias.ensmean"
+export ABIASPC="${COM_ATMOS_ANALYSIS}/${APREFIX}abias_pc.ensmean"
+export ABIASAIR="${COM_ATMOS_ANALYSIS}/${APREFIX}abias_air.ensmean"
+export ABIASe="${COM_ATMOS_ANALYSIS}/${APREFIX}abias_int.ensmean"
 # Diagnostics related to ensemble mean
-export GSISTAT="$COMOUT/${APREFIX}gsistat.ensmean"
-export CNVSTAT="$COMOUT/${APREFIX}cnvstat.ensmean"
-export OZNSTAT="$COMOUT/${APREFIX}oznstat.ensmean"
-export RADSTAT="$COMOUT/${APREFIX}radstat.ensmean"
+export GSISTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}gsistat.ensmean"
+export CNVSTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}cnvstat.ensmean"
+export OZNSTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}oznstat.ensmean"
+export RADSTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}radstat.ensmean"
 # Select observations based on ensemble mean
 export RUN_SELECT="YES"
 export USE_SELECT="NO"
-export SELECT_OBS="$COMOUT/${APREFIX}obsinput.ensmean"
+export SELECT_OBS="${COM_ATMOS_ANALYSIS}/${APREFIX}obsinput.ensmean"
 export DIAG_SUFFIX="_ensmean"
 export DIAG_COMPRESS="NO"
 # GSI namelist options specific to eobs
-export SETUP_INVOBS="passive_bc=.false.,$SETUP_INVOBS"
+export SETUP_INVOBS="passive_bc=.false.,${SETUP_INVOBS}"
 # Ensure clean stat tarballs for ensemble mean
-for fstat in $CNVSTAT $OZNSTAT $RADSTAT; do
-  [[ -f $fstat ]] && rm -f $fstat
+for fstat in ${CNVSTAT} ${OZNSTAT} ${RADSTAT}; do
+  [[ -f ${fstat} ]] && rm -f ${fstat}
 done
 ###############################################################
 # Run relevant script
-${ANALDIAGSH:-$SCRgfs/exglobal_diag.sh}
+${ANALDIAGSH:-${SCRgfs}/exglobal_diag.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 ##############################################
@@ -158,15 +109,15 @@ status=$?
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [[ -e "${pgmout}" ]] ; then
+  cat ${pgmout}
 fi
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 exit 0
diff --git a/jobs/JGDAS_ENKF_ECEN b/jobs/JGDAS_ENKF_ECEN
index 1e7a51b5ae7..9c2f09a0dc9 100755
--- a/jobs/JGDAS_ENKF_ECEN
+++ b/jobs/JGDAS_ENKF_ECEN
@@ -1,110 +1,50 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-configs="base ecen"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env ecen
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "ecen" -c "base ecen"
 ##############################################
 # Set variables used in the script
 ##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gdas"}}
-export COMPONENT=${COMPONENT:-atmos}
-
+export CDUMP="${RUN/enkf}"
 ##############################################
 # Begin JOB SPECIFIC work
 ##############################################
+# Ignore possible spelling error (nothing is misspelled)
+# shellcheck disable=SC2153
+GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}")
+# shellcheck disable=
+export gPDY=${GDATE:0:8}
+export gcyc=${GDATE:8:2}
+export GDUMP="gdas"
+export GDUMP_ENS="enkf${GDUMP}"
-GDATE=$($NDATE -$assim_freq $CDATE)
-gPDY=$(echo $GDATE | cut -c1-8)
-gcyc=$(echo $GDATE | cut -c9-10)
-GDUMP=${GDUMP:-"gdas"}
-
-export CASE=$CASE_ENKF
-
-
-EUPD_CYC=$(echo ${EUPD_CYC:-"gdas"} | tr a-z A-Z)
-if [ $EUPD_CYC = "GFS" ]; then
-  CDUMP_ENKF="gfs"
-else
-  CDUMP_ENKF=$CDUMP
-fi
+export CASE=${CASE_ENKF}
 export OPREFIX="${CDUMP}.t${cyc}z."
 export APREFIX="${CDUMP}.t${cyc}z."
-export APREFIX_ENKF="${CDUMP_ENKF}.t${cyc}z."
-export GPREFIX="${CDUMP}.t${gcyc}z."
-export GSUFFIX=${GSUFFIX:-$SUFFIX}
-export ASUFFIX=${ASUFFIX:-$SUFFIX}
-
-if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then
-  export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT}
-  export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT}
-else
-  export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-  export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT"
-fi
+export APREFIX_ENS="${RUN}.t${cyc}z."
+export GPREFIX="${GDUMP}.t${gcyc}z."
+export GPREFIX_ENS="${GDUMP_ENS}.t${gcyc}z."
+
+RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -rx \
+  COM_ATMOS_ANALYSIS_DET:COM_ATMOS_ANALYSIS_TMPL
+
+MEMDIR="ensstat" YMD=${PDY} HH=${cyc} generate_com -rx \
+  COM_ATMOS_ANALYSIS_STAT:COM_ATMOS_ANALYSIS_TMPL
-# COMIN, COMIN_ENS and COMIN_GES_ENS are used in script
-export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-export COMIN_ENS="$ROTDIR/enkf$CDUMP_ENKF.$PDY/$cyc/$COMPONENT"
-export COMOUT_ENS="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT"
-export COMIN_GES_ENS="$ROTDIR/enkf$CDUMP.$gPDY/$gcyc/$COMPONENT"
+MEMDIR="ensstat" RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_ATMOS_HISTORY_STAT_PREV:COM_ATMOS_HISTORY_TMPL
 ###############################################################
 # Run relevant script
-${ENKFRECENSH:-$SCRgfs/exgdas_enkf_ecen.sh}
+${ENKFRECENSH:-${SCRgfs}/exgdas_enkf_ecen.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 ##############################################
@@ -114,15 +54,15 @@ status=$?
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [[ -e "${pgmout}" ]] ; then
+  cat ${pgmout}
 fi
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 exit 0
diff --git a/jobs/JGDAS_ENKF_FCST b/jobs/JGDAS_ENKF_FCST
index 68b3a532045..77c8b7c0593 100755
--- a/jobs/JGDAS_ENKF_FCST
+++ b/jobs/JGDAS_ENKF_FCST
@@ -1,80 +1,33 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-configs="base fcst efcs"
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env efcs
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "efcs" -c "base fcst efcs"
 ##############################################
 # Set variables used in the script
 ##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gdas"}}
-export COMPONENT=${COMPONENT:-atmos}
-
+export CDUMP=${RUN/enkf}
+export rCDUMP="enkfgdas"
 ##############################################
 # Begin JOB SPECIFIC work
 ##############################################
-export CASE=$CASE_ENKF
+export CASE=${CASE_ENKF}
-# COMOUT is used in script
-export COMOUT="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT"
+YMD=${PDY} HH=${cyc} generate_com -rx COM_TOP
 # Forecast length for EnKF forecast
-export FHMIN=$FHMIN_ENKF
-export FHOUT=$FHOUT_ENKF
-export FHMAX=$FHMAX_ENKF
-
+export FHMIN=${FHMIN_ENKF}
+export FHOUT=${FHOUT_ENKF}
+export FHMAX=${FHMAX_ENKF}
 # Get ENSBEG/ENSEND from ENSGRP and NMEM_EFCSGRP
+if [[ $CDUMP == "gfs" ]]; then
+  export NMEM_EFCSGRP=${NMEM_EFCSGRP_GFS:-${NMEM_EFCSGRP:-1}}
+fi
 export ENSEND=$((NMEM_EFCSGRP * 10#${ENSGRP}))
 export ENSBEG=$((ENSEND - NMEM_EFCSGRP + 1))
@@ -82,21 +35,21 @@ export ENSBEG=$((ENSEND - NMEM_EFCSGRP + 1))
 ###############################################################
 # Run relevant script
-${ENKFFCSTSH:-$SCRgfs/exgdas_enkf_fcst.sh}
+${ENKFFCSTSH:-${SCRgfs}/exgdas_enkf_fcst.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 # Double check the status of members in ENSGRP
-EFCSGRP=$COMOUT/efcs.grp${ENSGRP}
+EFCSGRP="${COM_TOP}/efcs.grp${ENSGRP}"
 npass=0
-if [ -f $EFCSGRP ]; then
-  npass=$(grep "PASS" $EFCSGRP | wc -l)
+if [ -f ${EFCSGRP} ]; then
+  npass=$(grep "PASS" ${EFCSGRP} | wc -l)
 fi
-echo "$npass/$NMEM_EFCSGRP members successfull in efcs.grp$ENSGRP"
-if [ $npass -ne $NMEM_EFCSGRP ]; then
-  echo "FATAL ERROR: Failed members in group $ENSGRP, ABORT!"
-  cat $EFCSGRP
+echo "${npass}/${NMEM_EFCSGRP} members successful in efcs.grp${ENSGRP}"
+if [ ${npass} -ne ${NMEM_EFCSGRP} ]; then
+  echo "FATAL ERROR: Failed members in group ${ENSGRP}, ABORT!"
+  cat ${EFCSGRP}
   exit 99
 fi
@@ -104,8 +57,8 @@ fi
 ##############################################
 # Send Alerts
 ##############################################
-if [ $SENDDBN = YES ] ; then
-  $DBNROOT/bin/dbn_alert MODEL ENKF1_MSC_fcsstat $job $EFCSGRP
+if [ ${SENDDBN} = YES ] ; then
+  ${DBNROOT}/bin/dbn_alert MODEL ENKF1_MSC_fcsstat ${job} ${EFCSGRP}
 fi
@@ -116,15 +69,14 @@ fi
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
-
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 exit 0
diff --git a/jobs/JGDAS_ENKF_POST b/jobs/JGDAS_ENKF_POST
index dcc6335e449..0f7039d6142 100755
--- a/jobs/JGDAS_ENKF_POST
+++ b/jobs/JGDAS_ENKF_POST
@@ -1,61 +1,13 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-configs="base epos"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env epos
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "epos" -c "base epos"
 ##############################################
 # Set variables used in the script
 ##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gdas"}}
-export COMPONENT=${COMPONENT:-atmos}
+export CDUMP=${RUN/enkf}
 ##############################################
@@ -63,12 +15,7 @@ export COMPONENT=${COMPONENT:-atmos}
 ##############################################
 export GFS_NCIO=${GFS_NCIO:-"YES"}
-export PREFIX="${CDUMP}.t${cyc}z."
- -# COMIN, COMOUT are used in script -export COMIN="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT" -export COMOUT="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT" - +export PREFIX="${RUN}.t${cyc}z." export LEVS=$((LEVS-1)) @@ -76,9 +23,9 @@ export LEVS=$((LEVS-1)) ############################################################### # Run relevant script -${ENKFPOSTSH:-$SCRgfs/exgdas_enkf_post.sh} +${ENKFPOSTSH:-${SCRgfs}/exgdas_enkf_post.sh} status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## @@ -88,15 +35,15 @@ status=$? ############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGDAS_ENKF_SELECT_OBS b/jobs/JGDAS_ENKF_SELECT_OBS index 92bd78b04c0..7c02512989b 100755 --- a/jobs/JGDAS_ENKF_SELECT_OBS +++ b/jobs/JGDAS_ENKF_SELECT_OBS @@ -1,163 +1,130 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base anal eobs" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env eobs -status=$? 
-[[ $status -ne 0 ]] && exit $status +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "eobs" -c "base anal eobs" ############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables +# Set variables used in the script ############################################## -export cycle="t${cyc}z" -setpdy.sh -. ./PDY +export CDUMP=${RUN/enkf} +export MAKE_NSSTBUFR=${MAKE_NSSTBUFR:-"NO"} +export MAKE_ACFTBUFR=${MAKE_ACFTBUFR:-"NO"} ############################################## -# Determine Job Output Name on System +# Begin JOB SPECIFIC work ############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -${assim_freq} ${PDY}${cyc}) +# shellcheck disable= +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP="gdas" +export GDUMP_ENS="enkf${GDUMP}" +export OPREFIX="${CDUMP}.t${cyc}z." +export APREFIX="${RUN}.t${cyc}z." +export GPREFIX="${GDUMP_ENS}.t${gcyc}z." +APREFIX_DET="${CDUMP}.t${cyc}z." +GPREFIX_DET="${GDUMP}.t${gcyc}z." 
-############################################## -# Set variables used in the script -############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gdas"}} -export COMPONENT=${COMPONENT:-atmos} +export GSUFFIX=".ensmean.nc" +# Generate COM variables from templates +RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS +MEMDIR='ensstat' YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS +declare -rx COM_ATMOS_ANALYSIS_ENS="${COM_ATMOS_ANALYSIS}" -############################################## -# Begin JOB SPECIFIC work -############################################## +RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -r COM_ATMOS_ANALYSIS_DET:COM_ATMOS_ANALYSIS_TMPL -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} - -export CASE=$CASE_ENKF -export CDUMP_OBS=${CDUMP_OBS:-$CDUMP} - -export OPREFIX="${CDUMP_OBS}.t${cyc}z." -export APREFIX="${CDUMP}.t${cyc}z." -export GPREFIX="${GDUMP}.t${gcyc}z." 
-export GSUFFIX="${GSUFFIX:-".ensmean${SUFFIX}"}" -export ASUFFIX="${ASUFFIX:-"${SUFFIX}"}" - -if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then - export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT} -else - export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -fi +MEMDIR='ensstat' RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_ATMOS_ANALYSIS_PREV:COM_ATMOS_ANALYSIS_TMPL \ + COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL \ -# COMIN_GES, COMIN_ANL COMIN_GES_ENS, and COMOUT are used in script -COMIN_GES_CTL="$ROTDIR/gdas.$gPDY/$gcyc/$COMPONENT" -export COMIN_ANL="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT" -export COMIN_GES=$COMIN_GES_ENS -export COMOUT="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT" +RUN="${GDUMP}" YMD=${gPDY} HH=${gcyc} generate_com -r COM_ATMOS_ANALYSIS_DET_PREV:COM_ATMOS_ANALYSIS_TMPL +mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}" -export ATMGES_ENSMEAN="$COMIN_GES_ENS/${GPREFIX}atmf006$GSUFFIX" -if [ ! -f $ATMGES_ENSMEAN ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = $ATMGES_ENSMEAN" +export ATMGES_ENSMEAN="${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf006${GSUFFIX}" +if [[ ! -f ${ATMGES_ENSMEAN} ]]; then + echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = ${ATMGES_ENSMEAN}" exit 1 fi -export LEVS=$($NCDUMP -h $ATMGES_ENSMEAN | grep -i "pfull" | head -1 | awk -F" = " '{print $2}' | awk -F" " '{print $1}') # get LEVS +# Ignore masking of chained commands and possible misspelling warning +# shellcheck disable=SC2153,SC2312 +LEVS=$(${NCDUMP} -h "${ATMGES_ENSMEAN}" | grep -i "pfull" | head -1 | awk -F" = " '{print $2}' | awk -F" " '{print $1}') # get LEVS +# shellcheck disable= status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit "${status}" +export LEVS # Link observational data -export PREPQC="$COMIN_OBS/${OPREFIX}prepbufr" -if [ ! -f $PREPQC ]; then - echo "WARNING: Global PREPBUFR FILE $PREPQC MISSING" +export PREPQC="${COM_OBS}/${OPREFIX}prepbufr" +if [[ ! -f ${PREPQC} ]]; then + echo "WARNING: Global PREPBUFR FILE ${PREPQC} MISSING" fi -export PREPQCPF="$COMIN_OBS/${OPREFIX}prepbufr.acft_profiles" -export TCVITL="$COMIN_ANL/${OPREFIX}syndata.tcvitals.tm00" -[[ $DONST = "YES" ]] && export NSSTBF="$COMIN_OBS/${OPREFIX}nsstbufr" +export TCVITL="${COM_OBS}/${APREFIX_DET}syndata.tcvitals.tm00" +if [[ ${DONST} = "YES" ]]; then + export NSSTBF="${COM_OBS}/${OPREFIX}nsstbufr" +fi +export PREPQCPF="${COM_OBS}/${OPREFIX}prepbufr.acft_profiles" + +# Deterministic analysis and increment files +export SFCANL="${COM_ATMOS_ANALYSIS_DET}/${APREFIX_DET}sfcanl.nc" +export DTFANL="${COM_ATMOS_ANALYSIS_DET}/${APREFIX_DET}dtfanl.nc" +export ATMANL="${COM_ATMOS_ANALYSIS_DET}/${APREFIX_DET}atmanl.nc" +export ATMINC="${COM_ATMOS_ANALYSIS_DET}/${APREFIX_DET}atminc.nc" # Guess Bias correction coefficients related to control -export GBIAS=${COMIN_GES_CTL}/${GPREFIX}abias -export GBIASPC=${COMIN_GES_CTL}/${GPREFIX}abias_pc -export GBIASAIR=${COMIN_GES_CTL}/${GPREFIX}abias_air -export GRADSTAT=${COMIN_GES_CTL}/${GPREFIX}radstat +export GBIAS=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}abias +export GBIASPC=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}abias_pc +export GBIASAIR=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}abias_air +export GRADSTAT=${COM_ATMOS_ANALYSIS_DET_PREV}/${GPREFIX_DET}radstat # Bias correction coefficients related to ensemble mean -export ABIAS="$COMOUT/${APREFIX}abias.ensmean" -export ABIASPC="$COMOUT/${APREFIX}abias_pc.ensmean" -export ABIASAIR="$COMOUT/${APREFIX}abias_air.ensmean" -export ABIASe="$COMOUT/${APREFIX}abias_int.ensmean" +export ABIAS="${COM_ATMOS_ANALYSIS}/${APREFIX}abias.ensmean" +export 
ABIASPC="${COM_ATMOS_ANALYSIS}/${APREFIX}abias_pc.ensmean" +export ABIASAIR="${COM_ATMOS_ANALYSIS}/${APREFIX}abias_air.ensmean" +export ABIASe="${COM_ATMOS_ANALYSIS}/${APREFIX}abias_int.ensmean" # Diagnostics related to ensemble mean -export GSISTAT="$COMOUT/${APREFIX}gsistat.ensmean" -export CNVSTAT="$COMOUT/${APREFIX}cnvstat.ensmean" -export OZNSTAT="$COMOUT/${APREFIX}oznstat.ensmean" -export RADSTAT="$COMOUT/${APREFIX}radstat.ensmean" +export GSISTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}gsistat.ensmean" +export CNVSTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}cnvstat.ensmean" +export OZNSTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}oznstat.ensmean" +export RADSTAT="${COM_ATMOS_ANALYSIS}/${APREFIX}radstat.ensmean" # Select observations based on ensemble mean export RUN_SELECT="YES" export USE_SELECT="NO" -export SELECT_OBS="$COMOUT/${APREFIX}obsinput.ensmean" +export SELECT_OBS="${COM_ATMOS_ANALYSIS}/${APREFIX}obsinput.ensmean" export DIAG_SUFFIX="_ensmean" # GSI namelist options specific to eobs -export SETUP_INVOBS="passive_bc=.false.,$SETUP_INVOBS" +export SETUP_INVOBS="passive_bc=.false.,${SETUP_INVOBS}" # Ensure clean stat tarballs for ensemble mean -for fstat in $CNVSTAT $OZNSTAT $RADSTAT; do - [[ -f $fstat ]] && rm -f $fstat +for fstat in ${CNVSTAT} ${OZNSTAT} ${RADSTAT}; do + [[ -f ${fstat} ]] && rm -f ${fstat} done ############################################################### # Run relevant script -${INVOBSSH:-$SCRgfs/exgdas_enkf_select_obs.sh} +${INVOBSSH:-${SCRgfs}/exgdas_enkf_select_obs.sh} status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## # Send Alerts ############################################## -if [ $SENDDBN = YES ] ; then - $DBNROOT/bin/dbn_alert MODEL ENKF1_MSC_gsistat $job $GSISTAT +if [[ ${SENDDBN} = YES ]] ; then + ${DBNROOT}/bin/dbn_alert MODEL ENKF1_MSC_gsistat ${job} ${GSISTAT} fi @@ -168,15 +135,15 @@ fi ############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGDAS_ENKF_SFC b/jobs/JGDAS_ENKF_SFC index 54f196234af..9e6196fbd7e 100755 --- a/jobs/JGDAS_ENKF_SFC +++ b/jobs/JGDAS_ENKF_SFC @@ -1,111 +1,51 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base esfc" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env esfc -status=$? 
-[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. ./PDY - - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "esfc" -c "base esfc" ############################################## # Set variables used in the script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gdas"}} -export COMPONENT=${COMPONENT:-atmos} - +export CDUMP="${RUN/enkf}" ############################################## # Begin JOB SPECIFIC work ############################################## +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") +# shellcheck disable= +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP="gdas" +export GDUMP_ENS="enkf${GDUMP}" -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} - -export CASE=$CASE_ENKF - +export OPREFIX="${CDUMP}.t${cyc}z." +export GPREFIX="${GDUMP}.t${gcyc}z." +export APREFIX="${CDUMP}.t${cyc}z." -EUPD_CYC=$(echo ${EUPD_CYC:-"gdas"} | tr a-z A-Z) -if [ $EUPD_CYC = "GFS" ]; then - CDUMP_ENKF="gfs" -else - CDUMP_ENKF=$CDUMP -fi +export CASE=${CASE_ENKF} export OPREFIX="${CDUMP}.t${cyc}z." export APREFIX="${CDUMP}.t${cyc}z." 
-export APREFIX_ENKF="${CDUMP_ENKF}.t${cyc}z." -export GPREFIX="${CDUMP}.t${gcyc}z." -export GSUFFIX=${GSUFFIX:-$SUFFIX} -export ASUFFIX=${ASUFFIX:-$SUFFIX} - -if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then - export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT} -else - export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -fi +export APREFIX_ENS="${RUN}.t${cyc}z." +export GPREFIX="${GDUMP}.t${gcyc}z." +export GPREFIX_ENS="${GDUMP_ENS}.t${gcyc}z." -# COMIN, COMIN_ENS and COMIN_GES_ENS are used in script -export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export COMIN_GES="$ROTDIR/$CDUMP.$gPDY/$gcyc/$COMPONENT" -export COMIN_ENS="$ROTDIR/enkf$CDUMP_ENKF.$PDY/$cyc/$COMPONENT" -export COMOUT_ENS="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT" -export COMIN_GES_ENS="$ROTDIR/enkf$CDUMP.$gPDY/$gcyc/$COMPONENT" +RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS \ + COM_ATMOS_ANALYSIS_DET:COM_ATMOS_ANALYSIS_TMPL +RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_OBS_PREV:COM_OBS_TMPL \ + COM_ATMOS_ANALYSIS_DET_PREV:COM_ATMOS_ANALYSIS_TMPL ############################################################### # Run relevant script -${ENKFRESFCSH:-$SCRgfs/exgdas_enkf_sfc.sh} +${ENKFRESFCSH:-${SCRgfs}/exgdas_enkf_sfc.sh} status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## @@ -115,15 +55,15 @@ status=$? 
############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGDAS_ENKF_UPDATE b/jobs/JGDAS_ENKF_UPDATE index dafd9b13f26..10505291655 100755 --- a/jobs/JGDAS_ENKF_UPDATE +++ b/jobs/JGDAS_ENKF_UPDATE @@ -1,95 +1,50 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base anal eupd" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env eupd -status=$? -[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. 
./PDY - - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "eupd" -c "base anal eupd" ############################################## # Set variables used in the script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gdas"}} -export COMPONENT=${COMPONENT:-atmos} +export CDUMP="${RUN/enkf}" ############################################## # Begin JOB SPECIFIC work ############################################## +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") +# shellcheck disable= +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP="gdas" +export GDUMP_ENS="enkf${GDUMP}" -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) - -export APREFIX="${CDUMP}.t${cyc}z." -export GPREFIX="gdas.t${gcyc}z." -export ASUFFIX=${ASUFFIX:-$SUFFIX} -export GSUFFIX=${GSUFFIX:-$SUFFIX} +export APREFIX="${RUN}.t${cyc}z." +export GPREFIX="${GDUMP_ENS}.t${gcyc}z." +MEMDIR="ensstat" YMD=${PDY} HH=${cyc} generate_com -rx \ + COM_ATMOS_ANALYSIS_STAT:COM_ATMOS_ANALYSIS_TMPL -# COMIN_GES_ENS and COMOUT_ANL_ENS are used in script -export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT" -export COMOUT_ANL_ENS="$ROTDIR/enkf$CDUMP.$PDY/$cyc/$COMPONENT" +MEMDIR="ensstat" RUN="enkfgdas" YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_ATMOS_HISTORY_STAT_PREV:COM_ATMOS_HISTORY_TMPL ############################################################### # Run relevant script -${ENKFUPDSH:-$SCRgfs/exgdas_enkf_update.sh} +${ENKFUPDSH:-${SCRgfs}/exgdas_enkf_update.sh} status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## # Send Alerts ############################################## -if [ $SENDDBN = YES ] ; then - $DBNROOT/bin/dbn_alert MODEL ENKF1_MSC_enkfstat $job $COMOUT_ANL_ENS/${APREFIX}enkfstat +if [ ${SENDDBN} = YES ] ; then + "${DBNROOT}/bin/dbn_alert" "MODEL" "ENKF1_MSC_enkfstat" "${job}" "${COM_ATMOS_ANALYSIS_STAT}/${APREFIX}enkfstat" fi @@ -100,15 +55,15 @@ fi ############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGDAS_FIT2OBS b/jobs/JGDAS_FIT2OBS new file mode 100755 index 00000000000..d673845404b --- /dev/null +++ b/jobs/JGDAS_FIT2OBS @@ -0,0 +1,88 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "fit2obs" -c "base fit2obs" + + +############################################## +# Set variables used in the script +############################################## + +export CDUMP=${RUN/enkf} + +# Ignore spelling warning; nothing is misspelled +# shellcheck disable=SC2153 +CDATE=$(${NDATE} -"${VBACKUP_FITS}" "${PDY}${cyc}") # set CDATE to lookback cycle for use in fit2obs package +export CDATE +vday=${CDATE:0:8} +vcyc=${CDATE:8:2} + +export COM_INA=${ROTDIR}/gdas.${vday}/${vcyc}/atmos +# We want to defer variable expansion, so ignore warning about single quotes +# shellcheck disable=SC2016 +export COM_INF='$ROTDIR/vrfyarch/gfs.$fdy/$fzz' +export COM_PRP=${ROTDIR}/gdas.${vday}/${vcyc}/obs + +export PRPI=${COM_PRP}/${RUN}.t${vcyc}z.prepbufr +export sig1=${COM_INA}/${RUN}.t${vcyc}z.atmanl.nc +export sfc1=${COM_INA}/${RUN}.t${vcyc}z.atmanl.nc +export CNVS=${COM_INA}/${RUN}.t${vcyc}z.cnvstat + +export OUTPUT_FILETYPE=${OUTPUT_FILETYPE:-netcdf} + +export FIT_DIR=${ARCDIR}/fits +[[ ! -d "${FIT_DIR}" ]] && mkdir -p "${FIT_DIR}" +export HORZ_DIR=${ARCDIR}/horiz +[[ ! -d "${HORZ_DIR}" ]] && mkdir -p "${HORZ_DIR}" +export COMLOX=${DATA}/fitx +[[ ! -d "${COMLOX}" ]] && mkdir -p "${COMLOX}" + +echo "echo err_chk">"${DATA}"/err_chk; chmod 755 "${DATA}"/err_chk +echo "echo postmsg">"${DATA}"/postmsg; chmod 755 "${DATA}"/postmsg + +############################################## +# Check spinup and available inputs +############################################## + +# Ignore spelling warning; nothing is misspelled +# shellcheck disable=SC2153 +if [[ ${CDATE} -gt ${SDATE} ]]; then + for file in ${PRPI} ${sig1} ${sfc1} ${CNVS}; do + if [[ ! 
-f "${file}" ]]; then + echo "FATAL ERROR: FILE MISSING: ${file}" + exit 1 + fi + done + + ############################################## + # RUN FIT2OBS VERIFICATION + ############################################## + + "${SCRIPTSfit2obs}/excfs_gdas_vrfyfits.sh" + status=$? + [[ ${status} -ne 0 ]] && exit "${status}" + + ############################################## + # End JOB SPECIFIC work + ############################################## + + ############################################## + # Final processing + ############################################## + if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" + fi + +else + + echo "Too early for FIT2OBS to run. Exiting." + +fi + +########################################## +# Remove the Temporary working directory +########################################## +cd "${DATAROOT}" || (echo "FATAL ERROR: ${DATAROOT} does not exist. ABORT!"; exit 1) +[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}" + +exit 0 diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT new file mode 100755 index 00000000000..613de589d2c --- /dev/null +++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT @@ -0,0 +1,45 @@ +#!/bin/bash +export STRICT="NO" +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" + +export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}" +source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun" + + +############################################## +# Set variables used in the script +############################################## + + +############################################## +# Begin JOB SPECIFIC work +############################################## + +export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean} + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_bmat.sh} +${EXSCRIPT} +status=$? 
+[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +########################################## +# Do not remove the Temporary working directory (do this in POST) +########################################## +cd "${DATAROOT}" || exit 1 + +exit 0 diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY new file mode 100755 index 00000000000..c85b5c886b8 --- /dev/null +++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY @@ -0,0 +1,44 @@ +#!/bin/bash +export STRICT="NO" +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" +export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}" +source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun" + + +############################################## +# Set variables used in the script +############################################## + + +############################################## +# Begin JOB SPECIFIC work +############################################## + +export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean} + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_bmat_vrfy.sh} +${EXSCRIPT} +status=$? 
+[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +########################################## +# Do not remove the Temporary working directory (do this in POST) +########################################## +cd "${DATAROOT}" || exit 1 + +exit 0 diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT new file mode 100755 index 00000000000..f157488a595 --- /dev/null +++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT @@ -0,0 +1,58 @@ +#!/bin/bash +export STRICT="NO" +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" +export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}" +source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalchkpt" -c "base ocnanal ocnanalchkpt" + + +############################################## +# Set variables used in the script +############################################## +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours") +export GDATE +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP=${GDUMP:-"gdas"} + +export GPREFIX="${GDUMP}.t${gcyc}z." +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +export APREFIX="${CDUMP}.t${cyc}z." 
+
+# Generate COM variables from templates
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ATMOS_HISTORY:COM_ATMOS_HISTORY_TMPL
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ATMOS_ANALYSIS:COM_ATMOS_ANALYSIS_TMPL
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_chkpt.sh}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Do not remove the Temporary working directory (do this in POST)
+##########################################
+cd "${DATAROOT}" || exit 1
+
+exit 0
diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST
new file mode 100755
index 00000000000..9b3467feec3
--- /dev/null
+++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST
@@ -0,0 +1,47 @@
+#!/bin/bash
+export STRICT="NO"
+source "${HOMEgfs}/ush/preamble.sh"
+export WIPE_DATA="NO"
+DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalpost" -c "base ocnanalpost"
+
+
+##############################################
+# Set variables used in the script
+##############################################
+export CDUMP=${CDUMP:-${RUN:-"gfs"}}
+export CDATE=${CDATE:-${PDY}${cyc}}
+export GDUMP=${GDUMP:-"gdas"}
+
+# Generate COM variables from templates
+RUN=${GDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OCEAN_ANALYSIS:COM_OCEAN_ANALYSIS_TMPL
+RUN=${GDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_ICE_RESTART
+
+mkdir -p "${COM_OCEAN_ANALYSIS}"
+mkdir -p "${COM_ICE_RESTART}"
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+# Add UFSDA to PYTHONPATH
+ufsdaPATH="${HOMEgfs}/sorc/gdas.cd/ush/"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${ufsdaPATH}"
+export PYTHONPATH
+
+###############################################################
+# Run relevant script
+###############################################################
+
+EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_post.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##########################################
+# Remove the Temporary working directory
+##########################################
+cd "${DATAROOT}" || exit 1
+[[ "${KEEPDATA}" = "NO" ]] && rm -rf "${DATA}"
+
+exit 0
diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP
new file mode 100755
index 00000000000..c4843c596d4
--- /dev/null
+++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP
@@ -0,0 +1,59 @@
+#!/bin/bash
+export STRICT="NO"
+source "${HOMEgfs}/ush/preamble.sh"
+export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalprep" -c "base ocnanal ocnanalprep"
+
+
+##############################################
+# Set variables used in the script
+##############################################
+export CDUMP=${CDUMP:-${RUN:-"gfs"}}
+# Ignore possible spelling error (nothing is misspelled)
+# shellcheck disable=SC2153
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+export GDATE
+export gPDY=${GDATE:0:8}
+export gcyc=${GDATE:8:2}
+export GDUMP=${GDUMP:-"gdas"}
+
+export OPREFIX="${CDUMP}.t${cyc}z."
+export GPREFIX="${GDUMP}.t${gcyc}z."
+export APREFIX="${CDUMP}.t${cyc}z."
+
+# Generate COM variables from templates
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_OCEAN_HISTORY:COM_OCEAN_HISTORY_TMPL
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ICE_HISTORY:COM_ICE_HISTORY_TMPL
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ICE_RESTART:COM_ICE_RESTART_TMPL
+
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+# Add UFSDA to PYTHONPATH
+ufsdaPATH="${HOMEgfs}/sorc/gdas.cd/ush/"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${ufsdaPATH}"
+export PYTHONPATH
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_prep.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+exit 0
diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN
new file mode 100755
index 00000000000..c89f8b273bc
--- /dev/null
+++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN
@@ -0,0 +1,39 @@
+#!/bin/bash
+export STRICT="NO"
+source "${HOMEgfs}/ush/preamble.sh"
+export WIPE_DATA="NO"
+export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun"
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean}
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_run.sh}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Do not remove the Temporary working directory (do this in POST)
+##########################################
+cd "${DATAROOT}" || exit 1
+
+exit 0
diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY
new file mode 100755
index 00000000000..86aac5fdaac
--- /dev/null
+++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY
@@ -0,0 +1,53 @@
+#!/bin/bash
+export STRICT="NO"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalprep" -c "base ocnanal ocnanalprep"
+
+
+##############################################
+# Set variables used in the script
+##############################################
+export CDUMP=${CDUMP:-${RUN:-"gfs"}}
+export GDUMP=${GDUMP:-"gdas"}
+# Ignore possible spelling error (nothing is misspelled)
+# shellcheck disable=SC2153
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+export gPDY=${GDATE:0:8}
+export gcyc=${GDATE:8:2}
+
+RUN=${GDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OCEAN_ANALYSIS
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_OCEAN_HISTORY:COM_OCEAN_HISTORY_TMPL
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx COM_ICE_HISTORY:COM_ICE_HISTORY_TMPL
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+# Add UFSDA to PYTHONPATH
+export PYTHONPATH=${HOMEgfs}/sorc/gdas.cd/ush/:${HOMEgfs}/sorc/gdas.cd/ush/eva:${PYTHONPATH}
+
+###############################################################
+# Run relevant script
+
+
+EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_vrfy.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Do not remove the Temporary working directory (do this in POST)
+##########################################
+cd "${DATAROOT}" || exit 1
+
+exit 0
diff --git a/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG b/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG
index 2528013e39f..0119bc7f2d9 100755
--- a/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG
+++ b/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG
@@ -1,62 +1,39 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "awips" -c "base awips"
 
 export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1}
 
-###########################################
-# GFS_AWIPS_20KM AWIPS PRODUCT GENERATION
-###########################################
-
-#########################################################
-# obtain unique process id (pid) and make temp directory
-#########################################################
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-######################################
-# Set up the cycle variable
-######################################
-export cycle=${cycle:-t${cyc}z}
-
-###########################################
-# Run setpdy and initialize PDY variables
-###########################################
-setpdy.sh
-. PDY
-
 ################################
 # Set up the HOME directory
 ################################
-export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
-export USHgfs=${USHgfs:-$HOMEgfs/ush}
-export EXECgfs=${EXECgfs:-$HOMEgfs/exec}
-export PARMgfs=${PARMgfs:-$HOMEgfs/parm}
-export PARMwmo=${PARMwmo:-$HOMEgfs/parm/wmo}
-export PARMproduct=${PARMproduct:-$HOMEgfs/parm/product}
-export FIXgfs=${FIXgfs:-$HOMEgfs/fix}
+export HOMEgfs=${HOMEgfs:-${PACKAGEROOT}/gfs.${gfs_ver}}
+export USHgfs=${USHgfs:-${HOMEgfs}/ush}
+export EXECgfs=${EXECgfs:-${HOMEgfs}/exec}
+export PARMgfs=${PARMgfs:-${HOMEgfs}/parm}
+export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo}
+export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product}
+export FIXgfs=${FIXgfs:-${HOMEgfs}/fix}
 
 ###################################
 # Specify NET and RUN Name and model
 ####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
 export model=${model:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
+export COMPONENT="atmos"
 
 ##############################################
 # Define COM directories
 ##############################################
-export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT}
-export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT}
-export COMOUTwmo=${COMOUTwmo:-${COMOUT}/wmo}
-
 export SENDDBN=${SENDDBN:-NO}
+export SENDAWIP=${SENDAWIP:-NO}
 export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
 
-if [ $SENDCOM = YES ] ; then
-  mkdir -m 775 -p $COMOUT $COMOUTwmo
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_WMO
+GRID="0p25" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GRIB_0p25:COM_ATMOS_GRIB_TMPL
+
+if [[ ${SENDCOM} == "YES" && ! -d "${COM_ATMOS_WMO}" ]] ; then
+  mkdir -m 775 -p "${COM_ATMOS_WMO}"
 fi
 
 export pgmout=OUTPUT.$$
@@ -70,21 +47,21 @@ export pgmout=OUTPUT.$$
 ########################################################
 # Execute the script.
-$HOMEgfs/scripts/exgfs_atmos_awips_20km_1p0deg.sh $fcsthrs
+${HOMEgfs}/scripts/exgfs_atmos_awips_20km_1p0deg.sh ${fcsthrs}
 export err=$?; err_chk
 ########################################################
 
 ############################################
 # print exec I/O output
 ############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ###################################
 # Remove temp directories
 ###################################
-if [ "$KEEPDATA" != "YES" ] ; then
-  rm -rf $DATA
+if [ "${KEEPDATA}" != "YES" ] ; then
+  rm -rf ${DATA}
 fi
diff --git a/jobs/JGFS_ATMOS_AWIPS_G2 b/jobs/JGFS_ATMOS_AWIPS_G2
index 9dd2fdca636..94151fbd726 100755
--- a/jobs/JGFS_ATMOS_AWIPS_G2
+++ b/jobs/JGFS_ATMOS_AWIPS_G2
@@ -1,63 +1,43 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1}
-
 ########################################
 # GFS_AWIPS_G2 AWIPS PRODUCT GENERATION
 ########################################
-##########################################################
-# obtain unique process id (pid) and make temp directory
-##########################################################
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-######################################
-# Set up the cycle variable
-######################################
-export cycle=${cycle:-t${cyc}z}
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "awips" -c "base awips"
 
-###########################################
-# Run setpdy and initialize PDY variables
-###########################################
-setpdy.sh
-. PDY
+export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1}
 
 ################################
 # Set up the HOME directory
 ################################
-export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
-export USHgfs=${USHgfs:-$HOMEgfs/ush}
-export EXECgfs=${EXECgfs:-$HOMEgfs/exec}
-export PARMgfs=${PARMgfs:-$HOMEgfs/parm}
-export PARMwmo=${PARMwmo:-$HOMEgfs/parm/wmo}
-export PARMproduct=${PARMproduct:-$HOMEgfs/parm/product}
-export FIXgfs=${FIXgfs:-$HOMEgfs/fix}
-export UTILgfs=${UTILgfs:-$HOMEgfs/util}
+export USHgfs=${USHgfs:-${HOMEgfs}/ush}
+export EXECgfs=${EXECgfs:-${HOMEgfs}/exec}
+export PARMgfs=${PARMgfs:-${HOMEgfs}/parm}
+export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo}
+export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product}
+export FIXgfs=${FIXgfs:-${HOMEgfs}/fix}
+export UTILgfs=${UTILgfs:-${HOMEgfs}/util}
 
 ###################################
 # Specify NET and RUN Name and model
 ####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
 export model=${model:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
+export COMPONENT="atmos"
 
 ##############################################
 # Define COM directories
 ##############################################
-export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT}
-export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT}
-export COMOUTwmo=${COMOUTwmo:-${COMOUT}/wmo}
-
 export SENDDBN=${SENDDBN:-NO}
+export SENDAWIP=${SENDAWIP:-NO}
 export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
 
-if [ $SENDCOM = YES ] ; then
-  mkdir -m 775 -p $COMOUT $COMOUTwmo
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_WMO
+GRID="0p25" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GRIB_0p25:COM_ATMOS_GRIB_TMPL
+
+if [[ ${SENDCOM} == "YES" && ! -d "${COM_ATMOS_WMO}" ]] ; then
+  mkdir -m 775 -p "${COM_ATMOS_WMO}"
 fi
 
 export pgmout=OUTPUT.$$
@@ -67,21 +47,21 @@ export pgmout=OUTPUT.$$
 # Execute the script.
 #########################################################
 mkdir -m 775 awips_g1
-cd $DATA/awips_g1
-$HOMEgfs/scripts/exgfs_atmos_grib_awips.sh $fcsthrs
+cd ${DATA}/awips_g1
+${HOMEgfs}/scripts/exgfs_atmos_grib_awips.sh ${fcsthrs}
 export err=$?; err_chk
 
 ############################################
 # print exec I/O output
 ############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ###################################
 # Remove temp directories
 ###################################
-if [ "$KEEPDATA" != "YES" ] ; then
-  rm -rf $DATA
+if [ "${KEEPDATA}" != "YES" ] ; then
+  rm -rf ${DATA}
 fi
diff --git a/jobs/JGFS_ATMOS_CYCLONE_GENESIS b/jobs/JGFS_ATMOS_CYCLONE_GENESIS
index 79d43ebb1ed..85e4bf76515 100755
--- a/jobs/JGFS_ATMOS_CYCLONE_GENESIS
+++ b/jobs/JGFS_ATMOS_CYCLONE_GENESIS
@@ -1,73 +1,16 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-configs="base vrfy"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env vrfy
-status=$?
-[[ $status -ne 0 ]] && exit $status
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy"
 
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-if [ $RUN_ENVIR = "nco" ]; then
-  export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-else
-  export job="gfs_cyclone_genesis"
-  export DATA="$DATAROOT/${job}$$"
-  [[ -d $DATA ]] && rm -rf $DATA
-fi
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-####################################
-# Specify NET and RUN Name and model
-####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+# TODO (#1220) Determine if this is still needed
+export RUN_ENVIR=${RUN_ENVIR:-"nco"}
 
 ##############################################
 # Set variables used in the exglobal script
 ##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gfs"}}
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
-fi
-export cmodel=$CDUMP
+export cmodel=${RUN}
 
 ####################################
 # SENDCOM - Copy Files From TMPDIR to $COMOUT
@@ -80,27 +23,33 @@ export SENDECF=${SENDECF:-NO}
 ####################################
 # Specify Execution Areas
 ####################################
-export HOMEens_tracker=${HOMEens_tracker:-${NWROOT:?}/ens_tracker.${ens_tracker_ver}}
-export EXECens_tracker=${EXECens_tracker:-$HOMEens_tracker/exec}
-export FIXens_tracker=${FIXens_tracker:-$HOMEens_tracker/fix}
-export USHens_tracker=${USHens_tracker:-$HOMEens_tracker/ush}
-export SCRIPTens_tracker=${SCRIPTens_tracker:-$HOMEens_tracker/scripts}
+export HOMEens_tracker=${HOMEens_tracker:-${PACKAGEROOT}/ens_tracker.${ens_tracker_ver}}
+export EXECens_tracker=${EXECens_tracker:-${HOMEens_tracker}/exec}
+export FIXens_tracker=${FIXens_tracker:-${HOMEens_tracker}/fix}
+export USHens_tracker=${USHens_tracker:-${HOMEens_tracker}/ush}
+export SCRIPTens_tracker=${SCRIPTens_tracker:-${HOMEens_tracker}/scripts}
 
 ##############################################
 # Define COM directories
 ##############################################
-export COMIN=${ROTDIR}/${RUN}.${PDY}/${cyc}/$COMPONENT
-export gfsdir=${COMIN}
-export COMINgfs=${COMIN}
-export COMOUT=${ROTDIR}/${RUN}.${PDY}/${cyc}/$COMPONENT
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GENESIS
+YMD=${PDY} HH=${cyc} GRID="0p25" generate_com -rx COM_ATMOS_GRIB_0p25:COM_ATMOS_GRIB_TMPL
+
+# The following variables are used by the tracker scripts which are outside
+# of global-workflow and therefore can't be standardized at this time
+export COMIN=${COM_ATMOS_GRIB_0p25}
+export gfsdir=${COM_ATMOS_GRIB_0p25}
+export COMINgfs=${COM_ATMOS_GRIB_0p25}
+
+export COMINgenvit=${COM_ATMOS_GENESIS}
+export COMOUTgenvit=${COM_ATMOS_GENESIS}
+export COMOUT=${COM_ATMOS_GENESIS}
 
-export JYYYY=$(echo ${PDY} | cut -c1-4)
-export COMINgenvit=${COMINgenvit:-${COMOUT}/genesis_vital_${JYYYY}}
-export COMOUTgenvit=${COMOUTgenvit:-${COMOUT}/genesis_vital_${JYYYY}}
+export COMINsyn=${COMINsyn:-$(compath.py "${envir}/com/gfs/${gfs_ver}")/syndat}
 
-export COMINsyn=${COMINsyn:-$(compath.py gfs/prod/syndat)}
+mkdir -m 775 -p "${COMOUTgenvit}"
 
-mkdir -m 775 -p $COMOUTgenvit
+export JYYYY=${PDY:0:4}
 
 ##############################################
 # Run relevant script
@@ -112,15 +61,15 @@ export err=$?; err_chk
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGFS_ATMOS_CYCLONE_TRACKER b/jobs/JGFS_ATMOS_CYCLONE_TRACKER
index 4b05ea0b802..3aa3c6f5f4e 100755
--- a/jobs/JGFS_ATMOS_CYCLONE_TRACKER
+++ b/jobs/JGFS_ATMOS_CYCLONE_TRACKER
@@ -1,74 +1,20 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy"
 
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-configs="base vrfy"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env vrfy
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-if [ $RUN_ENVIR = "nco" ]; then
-  export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-else
-  export job="gfs_cyclone_tracker"
-  export DATA="$DATAROOT/${job}$$"
-  [[ -d $DATA ]] && rm -rf $DATA
-fi
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-####################################
-# Specify NET and RUN Name and model
-####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
+# TODO (#1220) Determine if this is still needed
+export RUN_ENVIR=${RUN_ENVIR:-"nco"}
 
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+export COMPONENT="atmos"
 
 ##############################################
 # Set variables used in the exglobal script
 ##############################################
 export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gfs"}}
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
-fi
+export CDUMP=${RUN/enkf}
 
 ####################################
@@ -82,32 +28,40 @@ export SENDECF=${SENDECF:-NO}
 ####################################
 # Specify Execution Areas
 ####################################
-export HOMEens_tracker=${HOMEens_tracker:-${NWROOT:?}/ens_tracker.${ens_tracker_ver}}
-export EXECens_tracker=${EXECens_tracker:-$HOMEens_tracker/exec}
-export FIXens_tracker=${FIXens_tracker:-$HOMEens_tracker/fix}
-export USHens_tracker=${USHens_tracker:-$HOMEens_tracker/ush}
+export HOMEens_tracker=${HOMEens_tracker:-${PACKAGEROOT}/ens_tracker.${ens_tracker_ver}}
+export EXECens_tracker=${EXECens_tracker:-${HOMEens_tracker}/exec}
+export FIXens_tracker=${FIXens_tracker:-${HOMEens_tracker}/fix}
+export USHens_tracker=${USHens_tracker:-${HOMEens_tracker}/ush}
 
 ##############################################
 # Define COM and Data directories
 ##############################################
-export COMIN=${ROTDIR}/${RUN}.${PDY}/${cyc}/$COMPONENT
-export COMINgfs=${COMIN}
-export gfsdir=${COMINgfs}
-export COMINgdas=${COMIN}
-export gdasdir=${COMINgdas}
-export COMOUT=${ROTDIR}/${RUN}.${PDY}/${cyc}/$COMPONENT
-export COMINsyn=${COMINsyn:-$(compath.py arch/prod/syndat)}
-
-if [ $RUN_ENVIR = "nco" ]; then
-  export COMOUThur=${COMROOTp1:?}/hur/${envir}/global
-  export COMOUTatcf=${COMROOTp1:?}/nhc/${envir}/atcf
-  mkdir -m 775 -p $COMOUThur $COMOUTatcf
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_TRACK COM_ATMOS_GENESIS
+YMD=${PDY} HH=${cyc} GRID="0p25" generate_com -rx COM_ATMOS_GRIB_0p25:COM_ATMOS_GRIB_TMPL
+
+if [[ ! -d "${COM_ATMOS_TRACK}" ]]; then mkdir -p "${COM_ATMOS_TRACK}"; fi
+
+# The following variables are used by the tracker scripts which are outside
+# of global-workflow and therefore can't be standardized at this time
+export COMINgfs=${COM_ATMOS_GRIB_0p25}
+export gfsdir=${COM_ATMOS_GRIB_0p25}
+export COMINgdas=${COM_ATMOS_GRIB_0p25}
+export gdasdir=${COM_ATMOS_GRIB_0p25}
+export COMOUT=${COM_ATMOS_TRACK}
+export COMINsyn=${COMINsyn:-$(compath.py ${envir}/com/gfs/${gfs_ver})/syndat}
+
+export COMINgenvit=${COM_ATMOS_GENESIS}
+
+if [ ${RUN_ENVIR} = "nco" ]; then
+  export COMOUThur=${COMROOTp1}/hur/${envir}/global
+  export COMOUTatcf=${COMROOTp1}/nhc/${envir}/atcf
+  mkdir -m 775 -p ${COMOUThur} ${COMOUTatcf}
 else
 # export COMOUThur=$COMOUT
 # export COMOUTatcf=$COMOUT
-  export COMOUThur=$DATA
-  export COMOUTatcf=$DATA
+  export COMOUThur=${DATA}
+  export COMOUTatcf=${DATA}
 fi
 
 ##############################################
@@ -117,7 +71,7 @@ fi
 #############################################################
 # Execute the script
 export pert="p01"
-export cmodel=$CDUMP
+export cmodel=${CDUMP}
 export loopnum=1
 
 #-----------input data checking -----------------
@@ -143,15 +97,15 @@ export err=$?; err_chk
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGFS_ATMOS_FBWIND b/jobs/JGFS_ATMOS_FBWIND
index 42e459dd0bc..f4b94442e8c 100755
--- a/jobs/JGFS_ATMOS_FBWIND
+++ b/jobs/JGFS_ATMOS_FBWIND
@@ -1,83 +1,62 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+# TODO (#1221) This job is not part of the rocoto suite
 
 ############################################
 # GFS FBWIND PRODUCT GENERATION
 ############################################
-
-###########################################################
-# obtain unique process id (pid) and make temp directory
-###########################################################
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-######################################
-# Set up the cycle variable
-######################################
-export cycle=${cycle:-t${cyc}z}
-
-###########################################
-# Run setpdy and initialize PDY variables
-###########################################
-setpdy.sh
-. PDY
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "fbwind" -c "base"
 
 ################################
 # Set up the HOME directory
 ################################
-export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
-export USHgfs=${USHgfs:-$HOMEgfs/ush}
-export EXECgfs=${EXECgfs:-$HOMEgfs/exec}
-export PARMgfs=${PARMgfs:-$HOMEgfs/parm}
-export PARMwmo=${PARMwmo:-$HOMEgfs/parm/wmo}
-export PARMproduct=${PARMproduct:-$HOMEgfs/parm/product}
-export FIXgfs=${FIXgfs:-$HOMEgfs/fix}
-export UTILgfs=${UTILgfs:-$HOMEgfs/util}
+export USHgfs=${USHgfs:-${HOMEgfs}/ush}
+export EXECgfs=${EXECgfs:-${HOMEgfs}/exec}
+export PARMgfs=${PARMgfs:-${HOMEgfs}/parm}
+export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo}
+export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product}
+export FIXgfs=${FIXgfs:-${HOMEgfs}/fix}
+export UTILgfs=${UTILgfs:-${HOMEgfs}/util}
 
 ###################################
 # Specify NET and RUN Name and model
 ####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
 export model=${model:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
+export COMPONENT="atmos"
 
 ##############################################
 # Define COM directories
 ##############################################
-export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT}
-export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT}
+export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}}
+export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}}
 export COMOUTwmo=${COMOUTwmo:-${COMOUT}/wmo}
 
 export SENDDBN=${SENDDBN:-NO}
 export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
 
-if [ $SENDCOM = YES ] ; then
-  mkdir -m 775 -p $COMOUT $COMOUTwmo
+if [ ${SENDCOM} = YES ] ; then
+  mkdir -m 775 -p ${COMOUT} ${COMOUTwmo}
 fi
 
-export pgmout=OUTPUT.$$
-
 ########################################################
 # Execute the script.
-$HOMEgfs/scripts/exgfs_atmos_fbwind.sh
+${HOMEgfs}/scripts/exgfs_atmos_fbwind.sh
 export err=$?;err_chk
 ########################################################
 
 ############################################
 # print exec I/O output
 ############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ###################################
 # Remove temp directories
 ###################################
-if [ "$KEEPDATA" != "YES" ] ; then
-  rm -rf $DATA
+if [ "${KEEPDATA}" != "YES" ] ; then
+  rm -rf ${DATA}
 fi
diff --git a/jobs/JGFS_ATMOS_FSU_GENESIS b/jobs/JGFS_ATMOS_FSU_GENESIS
index eb3069bfcb9..e5fd5ff3c3c 100755
--- a/jobs/JGFS_ATMOS_FSU_GENESIS
+++ b/jobs/JGFS_ATMOS_FSU_GENESIS
@@ -1,65 +1,11 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy"
 
 export RUN_ENVIR=${RUN_ENVIR:-"nco"}
 
-#############################
-# Source relevant config files
-#############################
-configs="base vrfy"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-##exit
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env vrfy
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-if [ $RUN_ENVIR = "nco" ]; then
-  export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-else
-  export job="gfs_fsu_genesis"
-  export DATA="$DATAROOT/${job}$$"
-  [[ -d $DATA ]] && rm -rf $DATA
-fi
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-####################################
-# Specify NET and RUN Name and model
-####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+export COMPONENT="atmos"
 
 ##############################################
@@ -67,9 +13,6 @@ export pgmerr=errfile
 ##############################################
 export CDATE=${CDATE:-${PDY}${cyc}}
 export CDUMP=${CDUMP:-${RUN:-"gfs"}}
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
-fi
 
 ####################################
@@ -83,12 +26,12 @@ export SENDECF=${SENDECF:-NO}
 ####################################
 # Specify Execution Areas
 ####################################
-export HOMEens_tracker=${HOMEens_tracker:-${NWROOT:?}/ens_tracker.${ens_tracker_ver}}
-export EXECens_tracker=${EXECens_tracker:-$HOMEens_tracker/exec}
-export FIXens_tracker=${FIXens_tracker:-$HOMEens_tracker/fix}
-export USHens_tracker=${USHens_tracker:-$HOMEens_tracker/ush}
-export SCRIPTens_tracker=${SCRIPTens_tracker:-$HOMEens_tracker/scripts}
-export BINens_tracker=${BINens_tracker:-$HOMEens_tracker/ush/FSUgenesisPY/bin}
+export HOMEens_tracker=${HOMEens_tracker:-${PACKAGEROOT}/ens_tracker.${ens_tracker_ver}}
+export EXECens_tracker=${EXECens_tracker:-${HOMEens_tracker}/exec}
+export FIXens_tracker=${FIXens_tracker:-${HOMEens_tracker}/fix}
+export USHens_tracker=${USHens_tracker:-${HOMEens_tracker}/ush}
+export SCRIPTens_tracker=${SCRIPTens_tracker:-${HOMEens_tracker}/scripts}
+export BINens_tracker=${BINens_tracker:-${HOMEens_tracker}/ush/FSUgenesisPY/bin}
 export PYTHONPATH=${USHens_tracker}/FSUgenesisPY:${PYTHONPATH}
 
 ##############################################
@@ -101,17 +44,17 @@ export gfsdir=${ROTDIR}
 export COMINgdas=${COMIN}
 export gdasdir=${COMINgdas}
 export COMOUT=${ROTDIR}/${RUN}.${PDY}/${cyc}/${COMPONENT}
-export COMINsyn=${COMINsyn:-$(compath.py arch/prod/syndat)}
+export COMINsyn=${COMINsyn:-$(compath.py ${envir}/com/gfs/${gfs_ver})/syndat}
 
-if [ $RUN_ENVIR = "nco" ]; then
-  export COMOUThur=${COMROOTp1:?}/hur/${envir}/global
-  export COMOUTatcf=${COMROOTp1:?}/nhc/${envir}/atcf
-  mkdir -m 775 -p $COMOUThur $COMOUTatcf
+if [ ${RUN_ENVIR} = "nco" ]; then
+  export COMOUThur=${COMROOTp1}/hur/${envir}/global
+  export COMOUTatcf=${COMROOTp1}/nhc/${envir}/atcf
+  mkdir -m 775 -p ${COMOUThur} ${COMOUTatcf}
 else
-# export COMOUThur=$COMOUT
+# export COMOUThur=$COMOUT
 # export COMOUTatcf=$COMOUT
-  export COMOUThur=$DATA
-  export COMOUTatcf=$DATA
+  export COMOUThur=${DATA}
+  export COMOUTatcf=${DATA}
 fi
 
 ##############################################
@@ -126,15 +69,15 @@ export err=$?; err_chk
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGFS_ATMOS_GEMPAK b/jobs/JGFS_ATMOS_GEMPAK
index 502bb96a7a8..161f0e08831 100755
--- a/jobs/JGFS_ATMOS_GEMPAK
+++ b/jobs/JGFS_ATMOS_GEMPAK
@@ -1,58 +1,18 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak" -c "base gempak"
 
-############################################
-# GFS GEMPAK PRODUCT GENERATION
-############################################
-
-#############################
-# Source relevant config files
-#############################
-configs="base gempak"
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env gempak
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-##########################################################
-# obtain unique process id (pid) and make temp directory
-##########################################################
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-######################################
-# Set up the cycle variable
-######################################
-export cycle=${cycle:-t${cyc}z}
-
-###########################################
-# Run setpdy and initialize PDY variables
-###########################################
-setpdy.sh
-. PDY
 
 ################################
 # Set up the HOME directory
 ################################
-export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
-export EXECgfs=${EXECgfs:-$HOMEgfs/exec}
-export PARMgfs=${PARMgfs:-$HOMEgfs/parm}
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-export FIXgempak=${FIXgempak:-$HOMEgfs/gempak/fix}
-export USHgempak=${USHgempak:-$HOMEgfs/gempak/ush}
-export SRCgfs=${SRCgfs:-$HOMEgfs/scripts}
+export EXECgfs=${EXECgfs:-${HOMEgfs}/exec}
+export PARMgfs=${PARMgfs:-${HOMEgfs}/parm}
+export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config}
+export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix}
+export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush}
+export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts}
 
 # For half-degree P Grib files
 export DO_HD_PGRB=${DO_HD_PGRB:-YES}
@@ -70,114 +30,124 @@ export DBN_ALERT_TYPE=${DBN_ALERT_TYPE:-GFS_GEMPAK}
 ###################################
 # Specify NET and RUN Name and model
 ####################################
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
 export model=${model:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
 
 ##############################################
 # Define COM directories
 ##############################################
-export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT}
-export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT/gempak}
-
 export SENDDBN=${SENDDBN:-NO}
 export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
 
-if [ $SENDCOM = YES ] ; then
-  mkdir -m 775 -p $COMOUT
-fi
+for grid in 0p25 0p50 1p00; do
+  GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GRIB_${grid}:COM_ATMOS_GRIB_TMPL"
+done
+
+for grid in 1p00 0p50 0p25 40km 35km_atl 35km_pac; do
+  prod_dir="COM_ATMOS_GEMPAK_${grid}"
+  GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_${grid}:COM_ATMOS_GEMPAK_TMPL"
-export pgmout=OUTPUT.$$
+  if [[ ${SENDCOM} == YES && ! -d "${!prod_dir}" ]] ; then
+    mkdir -m 775 -p "${!prod_dir}"
+  fi
+done
+# TODO: These actions belong in an ex-script not a j-job
+if [[ -f poescript ]]; then
+  rm -f poescript
+fi
-rm -f poescript
+ocean_domain_max=180
+if (( ocean_domain_max > FHMAX_GFS )); then
+  ocean_domain_max=${FHMAX_GFS}
+fi
 
 #################################################################
 # Execute the script for the 384 hour 1 degree grib
 ##################################################################
-echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs 384 GFS_GEMPAK &> $DATA/gfs_1p0.$$.1 " >>poescript
-echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs 384 GFS_GEMPAK &> $DATA/gfs_1p0.$$.2 " >>poescript
-echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs 384 GFS_GEMPAK &> $DATA/gfs_1p0.$$.3 " >>poescript
-echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs 384 GFS_GEMPAK &> $DATA/gfs_1p0.$$.4 " >>poescript
-echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs 384 GFS_GEMPAK &> $DATA/gfs_1p0.$$.5 " >>poescript
-echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs 384 GFS_GEMPAK &> $DATA/gfs_1p0.$$.6 " >>poescript
+echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.1 " >> poescript
+echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.2 " >> poescript
+echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.3 " >> poescript
+echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.4 " >> poescript
+echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.5 " >> poescript
+echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.6 " >> poescript
 
 #################################################################
 # Execute the script for the half-degree grib
################################################################## -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p50 384 GFS_GEMPAK &> $DATA/gfs_0p5.$$.1 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p50 384 GFS_GEMPAK &> $DATA/gfs_0p5.$$.2 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p50 384 GFS_GEMPAK &> $DATA/gfs_0p5.$$.3 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p50 384 GFS_GEMPAK &> $DATA/gfs_0p5.$$.4 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p50 384 GFS_GEMPAK &> $DATA/gfs_0p5.$$.5 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p50 384 GFS_GEMPAK &> $DATA/gfs_0p5.$$.6 " >>poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.1 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.2 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.3 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.4 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.5 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.6 " >> poescript ################################################################# # Execute the script for the quater-degree grib #################################################################### -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.1 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.2 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> 
$DATA/gfs_0p25.$$.3 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.4 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.5 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.6 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.7 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.8 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.9 " >> poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs_0p25 384 GFS_GEMPAK &> $DATA/gfs_0p25.$$.10 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.1 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.2 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.3 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.4 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.5 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.6 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.7 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.8 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> 
${DATA}/gfs_0p25.$$.9 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.10 " >> poescript #################################################################### # Execute the script to create the 35km Pacific grids for OPC ##################################################################### -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs35_pac 180 GFS_GEMPAK_WWB &> $DATA/gfs35_pac.$$.1 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs35_pac 180 GFS_GEMPAK_WWB &> $DATA/gfs35_pac.$$.2 " >>poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_pac ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_pac} &> ${DATA}/gfs35_pac.$$.1 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_pac ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_pac} &> ${DATA}/gfs35_pac.$$.2 " >> poescript #################################################################### # Execute the script to create the 35km Atlantic grids for OPC ##################################################################### -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs35_atl 180 GFS_GEMPAK_WWB &> $DATA/gfs35_atl.$$.1 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs35_atl 180 GFS_GEMPAK_WWB &> $DATA/gfs35_atl.$$.2 " >>poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_atl ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_atl} &> ${DATA}/gfs35_atl.$$.1 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_atl ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_atl} &> ${DATA}/gfs35_atl.$$.2 " >> poescript ##################################################################### # Execute the script to create the 40km grids for HPC ###################################################################### -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs40 180 GFS_GEMPAK_WWB &> $DATA/gfs40.$$.1 " >>poescript -echo "time $SRCgfs/exgfs_atmos_nawips.sh gfs40 
180 GFS_GEMPAK_WWB &> $DATA/gfs40.$$.2 " >>poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs40 ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_40km} &> ${DATA}/gfs40.$$.1 " >> poescript +echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs40 ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_40km} &> ${DATA}/gfs40.$$.2 " >> poescript -# Add task number to the MPMD script -nl -n ln -v 0 poescript > poescript.new -mv poescript.new poescript +if [[ ${CFP_MP:-"NO"} == "YES" ]]; then + # Add task number to the MPMD script + nl -n ln -v 0 poescript > poescript.new + mv poescript.new poescript +fi cat poescript -chmod 775 $DATA/poescript +chmod 775 ${DATA}/poescript export MP_PGMMODEL=mpmd -export MP_CMDFILE=$DATA/poescript +export MP_CMDFILE=${DATA}/poescript -ntasks=${NTASKS_GEMPAK:-$(cat $DATA/poescript | wc -l)} +ntasks=$(cat ${DATA}/poescript | wc -l) ptile=${PTILE_GEMPAK:-4} threads=${NTHREADS_GEMPAK:-1} -export OMP_NUM_THREADS=$threads -APRUN=${APRUN:-"mpirun -n $ntasks cfp "} +export OMP_NUM_THREADS=${threads} +APRUN=${APRUN:-"mpiexec -l -np ${ntasks} --cpu-bind verbose,core cfp"} -APRUN_GEMPAKCFP=${APRUN_GEMPAKCFP:-$APRUN} -APRUNCFP=$(eval echo $APRUN_GEMPAKCFP) +APRUN_GEMPAKCFP=${APRUN_GEMPAKCFP:-${APRUN}} +APRUNCFP=${APRUN_GEMPAKCFP} -$APRUNCFP $DATA/poescript +${APRUNCFP} ${DATA}/poescript export err=$?; err_chk ############################################ # print exec I/O output ############################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ] ; then - rm -rf $DATA +if [ "${KEEPDATA}" != "YES" ] ; then + rm -rf ${DATA} fi diff --git a/jobs/JGFS_ATMOS_GEMPAK_META b/jobs/JGFS_ATMOS_GEMPAK_META index 9d6683a521b..0a9f5bdd902 100755 --- a/jobs/JGFS_ATMOS_GEMPAK_META +++ b/jobs/JGFS_ATMOS_GEMPAK_META @@ -1,51 +1,34 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +# TODO (#1222) This job is not part of the rocoto suite ############################################ # GFS GEMPAK META PRODUCT GENERATION ############################################ +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak_meta" -c "base" -export LAUNCH_MODE=MPI ############################################### # Set MP variables ############################################### +export LAUNCH_MODE=MPI export OMP_NUM_THREADS=1 export MP_LABELIO=yes export MP_PULSE=0 export MP_DEBUG_NOTIMEOUT=yes -########################################################## -# obtain unique process id (pid) and make temp directory -########################################################## -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -########################################### -# Run setpdy and initialize PDY variables -########################################### -setpdy.sh -.
PDY - ################################ # Set up the HOME directory ################################ -export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} -export EXECgfs=${EXECgfs:-$HOMEgfs/exec} -export PARMgfs=${PARMgfs:-$HOMEgfs/parm} -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -export FIXgempak=${FIXgempak:-$HOMEgfs/gempak/fix} -export USHgempak=${USHgempak:-$HOMEgfs/gempak/ush} -export SRCgfs=${SRCgfs:-$HOMEgfs/scripts} +export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} +export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} +export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} +export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} +export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} +export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} -cp $FIXgempak/datatype.tbl datatype.tbl +cp ${FIXgempak}/datatype.tbl datatype.tbl ############################################# #set the fcst hrs for all the cycles @@ -57,10 +40,8 @@ export fhinc=12 ################################### # Specify NET and RUN Name and model #################################### -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} export model=${model:-gfs} -export COMPONENT=${COMPONENT:-atmos} +export COMPONENT="atmos" ############################################## # Set up model and cycle specific variables @@ -70,41 +51,39 @@ export DBN_ALERT_TYPE=GFS_METAFILE ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT/gempak} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT/gempak/meta} -export COMINgempak=${COMINgempak:-${COMROOT}/${NET}/${envir}} +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}/gempak} +export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}/gempak/meta} +export COMINgempak=${COMINgempak:-$(compath.py ${envir}/${NET}/${gfs_ver})} -export 
COMINukmet=${COMINukmet:-$(compath.py nawips/prod/ukmet)} -export COMINecmwf=${COMINecmwf:-$(compath.py ecmwf/prod/ecmwf)} -export COMINnam=${COMINnam:-$(compath.py nam/prod/nam)} +export COMINukmet=${COMINukmet:-$(compath.py ${envir}/ukmet/${ukmet_ver})/ukmet} +export COMINecmwf=${COMINecmwf:-$(compath.py ${envir}/ecmwf/${ecmwf_ver})/ecmwf} +export COMINnam=${COMINnam:-$(compath.py ${envir}/nam/${nam_ver})/nam} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -if [ $SENDCOM = YES ] ; then - mkdir -m 775 -p $COMOUT +if [ ${SENDCOM} = YES ] ; then + mkdir -m 775 -p ${COMOUT} fi -export pgmout=OUTPUT.$$ - ######################################################## # Execute the script. -$SRCgfs/exgfs_atmos_gempak_meta.sh +${SRCgfs}/exgfs_atmos_gempak_meta.sh export err=$?; err_chk ######################################################## ############################################ # print exec I/O output ############################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ] ; then - rm -rf $DATA +if [ "${KEEPDATA}" != "YES" ] ; then + rm -rf ${DATA} fi diff --git a/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF b/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF index 4b8a04e6a94..cc9d445965b 100755 --- a/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF +++ b/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF @@ -1,51 +1,35 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +# TODO (#1222) This job is not part of the rocoto suite ############################################ # GFS GEMPAK NCDC PRODUCT GENERATION ############################################ +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak_gif" -c "base" -########################################################## -# obtain unique process id (pid) and make temp directory -########################################################## -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -########################################### -# Run setpdy and initialize PDY variables -########################################### -setpdy.sh -. PDY ################################ # Set up the HOME directory ################################ -export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} -export EXECgfs=${EXECgfs:-$HOMEgfs/exec} -export PARMgfs=${PARMgfs:-$HOMEgfs/parm} -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -export FIXgfs=${FIXgfs:-$HOMEgfs/gempak/fix} -export USHgempak=${USHgempak:-$HOMEgfs/gempak/ush} -export SRCgfs=${SRCgfs:-$HOMEgfs/scripts} -export UTILgfs=${UTILgfs:-$HOMEgfs/util} +export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} +export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} +export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} +export FIXgfs=${FIXgfs:-${HOMEgfs}/gempak/fix} +export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} +export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} +export UTILgfs=${UTILgfs:-${HOMEgfs}/util} ###################################### # Set up the GEMPAK directory ####################################### -export HOMEgempak=${HOMEgempak:-$HOMEgfs/gempak} -export FIXgempak=${FIXgempak:-$HOMEgempak/fix} -export USHgempak=${USHgempak:-$HOMEgempak/ush} +export HOMEgempak=${HOMEgempak:-${HOMEgfs}/gempak} +export 
FIXgempak=${FIXgempak:-${HOMEgempak}/fix} +export USHgempak=${USHgempak:-${HOMEgempak}/ush} export MP_PULSE=0 export MP_TIMEOUT=2000 -export cycle=t${cyc}z + # # Set up model and cycle specific variables @@ -60,24 +44,23 @@ export fstart=00 ################################### # Specify NET and RUN Name and model #################################### -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} export model=${model:-gfs} -export COMPONENT=${COMPONENT:-atmos} +export COMPONENT="atmos" ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT/gempak} -export COMINgfs=${COMINgfs:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT} +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}/gempak} +export COMINgfs=${COMINgfs:-$(compath.py ${envir}/${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}} +export COMINobsproc=${COMINobsproc:-$(compath.py ${envir}/obsproc/${obsproc_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}} +export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}} export COMOUTwmo=${COMOUTwmo:-${COMOUT}/wmo} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -if [ $SENDCOM = YES ] ; then - mkdir -m 775 -p $COMOUT $COMOUTwmo +if [ ${SENDCOM} = YES ] ; then + mkdir -m 775 -p ${COMOUT} ${COMOUTwmo} fi export pgmout=OUTPUT.$$ @@ -85,21 +68,21 @@ export pgmout=OUTPUT.$$ ######################################################## # Execute the script. 
-$SRCgfs/exgfs_atmos_gempak_gif_ncdc_skew_t.sh +${SRCgfs}/exgfs_atmos_gempak_gif_ncdc_skew_t.sh export err=$?; err_chk ######################################################## ############################################ # print exec I/O output ############################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ] ; then - rm -rf $DATA +if [ "${KEEPDATA}" != "YES" ] ; then + rm -rf ${DATA} fi diff --git a/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC b/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC index d8d05b27f2c..a1c2518a445 100755 --- a/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC +++ b/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC @@ -1,39 +1,23 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +# TODO (#1222) This job is not part of the rocoto suite ############################################ # GFS_PGRB2_SPEC_GEMPAK PRODUCT GENERATION ############################################ +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak_spec" -c "base" -######################################################### -# obtain unique process id (pid) and make temp directory -######################################################### -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -########################################### -# Run setpdy and initialize PDY variables -########################################### -setpdy.sh -. 
PDY ################################ # Set up the HOME directory ################################ -export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} -export EXECgfs=${EXECgfs:-$HOMEgfs/exec} -export PARMgfs=${PARMgfs:-$HOMEgfs/parm} -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -export FIXgempak=${FIXgempak:-$HOMEgfs/gempak/fix} -export USHgempak=${USHgempak:-$HOMEgfs/gempak/ush} -export SRCgfs=${SRCgfs:-$HOMEgfs/scripts} +export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} +export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} +export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} +export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} +export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} +export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} # For half-degree P Grib files #export DO_HD_PGRB=YES @@ -41,9 +25,7 @@ export SRCgfs=${SRCgfs:-$HOMEgfs/scripts} ################################### # Specify NET and RUN Name and model #################################### -export NET=gfs -export RUN=gfs_goessim -export COMPONENT=${COMPONENT:-atmos} +export COMPONENT="atmos" export finc=3 export model=gfs export EXT="" @@ -51,27 +33,30 @@ export EXT="" ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${NET}.${PDY})/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${NET}.${PDY}/${cyc}/$COMPONENT/gempak} +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}} +export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${NET}.${PDY})/${cyc}/${COMPONENT}/gempak} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -if [ $SENDCOM = YES ] ; then - mkdir -m 775 -p $COMOUT +if [ ${SENDCOM} = YES ] ; then + mkdir -m 775 -p ${COMOUT} fi -export DATA_HOLD=$DATA +# TODO - Assess what is going on with overwriting $DATA here (#1224) + +export DATA_HOLD=${DATA} 
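The JGFS_ATMOS_GEMPAK_PGRB2_SPEC job above stashes the working directory in `DATA_HOLD`, re-points `${DATA}` at per-product subdirectories, and restores the parent afterwards (the overwriting is flagged in TODO #1224). A minimal sketch of that pattern, with `run_product` as a hypothetical stand-in for the ex-script invocation:

```shell
#!/usr/bin/env bash
# Sketch of the DATA_HOLD save/run/restore pattern, under the assumption
# that run_product stands in for ${SRCgfs}/exgfs_atmos_goes_nawips.sh.
set -eu

run_product() {
  # Stand-in for the real ex-script; it runs inside the per-product ${DATA}.
  echo "running in $(pwd)"
}

DATA=$(mktemp -d)
export DATA_HOLD="${DATA}"     # remember the parent working directory

for subjob in SPECIAL SPECIAL221; do
  export DATA="${DATA_HOLD}/${subjob}"   # per-product scratch space
  mkdir -p "${DATA}"
  cd "${DATA}"
  run_product
done

cd "${DATA_HOLD}"              # restore the parent, as the j-job does
```

Because both invocations share one parent, a final `rm -rf` of `DATA_HOLD` cleans up everything, which is the behavior the KEEPDATA block at the end of the j-job relies on.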
################################################################# # Execute the script for the regular grib ################################################################# -export DATA=$DATA_HOLD/SPECIAL -mkdir -p $DATA -cd $DATA +export DATA=${DATA_HOLD}/SPECIAL +mkdir -p ${DATA} +cd ${DATA} export DBN_ALERT_TYPE=GFS_GOESSIM_GEMPAK +export RUN2=gfs_goessim export GRIB=goessimpgrb2.0p25.f export EXT=" " export fend=180 @@ -82,17 +67,17 @@ echo "RUNS the Program" ######################################################## # Execute the script. -$SRCgfs/exgfs_atmos_goes_nawips.sh +${SRCgfs}/exgfs_atmos_goes_nawips.sh ################################################################# # Execute the script for the 221 grib -export DATA=$DATA_HOLD/SPECIAL221 -mkdir -p $DATA -cd $DATA +export DATA=${DATA_HOLD}/SPECIAL221 +mkdir -p ${DATA} +cd ${DATA} export DBN_ALERT_TYPE=GFS_GOESSIM221_GEMPAK -export RUN=gfs_goessim221 +export RUN2=gfs_goessim221 export GRIB=goessimpgrb2f export EXT=".grd221" export fend=180 @@ -103,12 +88,12 @@ echo "RUNS the Program" ######################################################## # Execute the script. 
-$SRCgfs/exgfs_atmos_goes_nawips.sh +${SRCgfs}/exgfs_atmos_goes_nawips.sh export err=$?; err_chk ######################################################## echo "end of program" -cd $DATA_HOLD +cd ${DATA_HOLD} echo "######################################" echo " SPECIAL.OUT " echo "######################################" @@ -116,14 +101,14 @@ echo "######################################" ############################################ # print exec I/O output ############################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ] ; then - rm -rf $DATA +if [ "${KEEPDATA}" != "YES" ] ; then + rm -rf "${DATA}" fi diff --git a/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS b/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS index 8ae1170800d..48b13c3d9e0 100755 --- a/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS +++ b/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS @@ -1,68 +1,46 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1} +# TODO (#1225) This job is not part of the rocoto suite ############################################ # GFS PGRB2_SPECIAL_POST PRODUCT GENERATION ############################################ +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "npoess" -c "base" -########################################################## -# obtain unique process id (pid) and make temp directory -########################################################## -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -########################################### -# Run setpdy and initialize PDY variables -########################################### -setpdy.sh -. 
PDY +export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1} ################################ # Set up the HOME directory ################################ -export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} -export USHgfs=${USHgfs:-$HOMEgfs/ush} -export EXECgfs=${EXECgfs:-$HOMEgfs/exec} -export PARMgfs=${PARMgfs:-$HOMEgfs/parm} -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -export PARMwmo=${PARMwmo:-$HOMEgfs/parm/wmo} -export PARMproduct=${PARMproduct:-$HOMEgfs/parm/product} -export FIXgfs=${FIXgfs:-$HOMEgfs/fix} +export USHgfs=${USHgfs:-${HOMEgfs}/ush} +export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} +export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} +export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} +export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} +export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} +export FIXgfs=${FIXgfs:-${HOMEgfs}/fix} ################################### # Specify NET and RUN Name and model #################################### -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} export model=${model:-gfs} -export COMPONENT=${COMPONENT:-atmos} ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT} -export COMOUTwmo=${COMOUTwmo:-${COMOUT}/wmo} - export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -if [ $SENDCOM = YES ] ; then - mkdir -m 775 -p $COMOUT $COMOUTwmo -fi +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GOES +GRID="0p50" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GRIB_0p50:COM_ATMOS_GRIB_TMPL -export pgmout=OUTPUT.$$ +if [[ ${SENDCOM} == "YES" ]]; then + mkdir -m 775 -p "${COM_ATMOS_GOES}" +fi +# TODO - This should be in the ex-script (#1226) #################################### # Specify Forecast Hour Range @@ -71,11 +49,6 @@ export SHOUR=000 export FHOUR=180 export FHINC=003 
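The restart check that follows keys off a 13-character stamp in the control file (PDY, cycle, and last completed forecast hour) and was converted from `cut` pipelines to bash substring expansion. A hedged sketch of that parsing with a made-up stamp value; the `10#` base prefix guards zero-padded hours against octal interpretation:

```shell
#!/usr/bin/env bash
# Sketch of the restart bookkeeping, assuming the control file holds a
# 13-character stamp: PDY (8 chars) + cycle (2) + last forecast hour (3).
set -eu

modelrecvy="2023051800012"        # hypothetical stamp: 20230518, t00z, f012

recvy_pdy="${modelrecvy:0:8}"     # replaces: cut -c1-8
recvy_cyc="${modelrecvy:8:2}"     # replaces: cut -c9-10
recvy_shour="${modelrecvy:10:3}"  # replaces: cut -c11-13

FHINC=3
SHOUR=0

# Resume one increment past the last completed hour, if further along.
NEW_SHOUR=$(( 10#${recvy_shour} + FHINC ))
if (( NEW_SHOUR >= SHOUR )); then
  SHOUR=${NEW_SHOUR}
fi

echo "restarting at PDY=${recvy_pdy} cycle=t${recvy_cyc}z SHOUR=${SHOUR}"
```

Note that `${var:offset:length}` clamps at end-of-string, so over-long lengths return the same characters as the `cut` ranges they replace, but stating the exact length keeps the intent obvious.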
-####################################### -# Specify Restart File Name to Key Off -####################################### -restart_file=$COMIN/${RUN}.t${cyc}z.special.grb2if - #################################### # Specify Timeout Behavior of Post # @@ -90,48 +63,41 @@ export SLEEP_INT=5 #################################### # Check if this is a restart #################################### -if test -f $COMIN/$RUN.t${cyc}z.control.goessimpgrb2 -then - modelrecvy=$(cat < $COMIN/$RUN.t${cyc}z.control.goessimpgrb) - recvy_pdy=$(echo $modelrecvy | cut -c1-8) - recvy_cyc=$(echo $modelrecvy | cut -c9-10) - recvy_shour=$(echo $modelrecvy | cut -c11-13) - - if test $RERUN = "NO" - then - NEW_SHOUR=$(expr $recvy_shour + $FHINC) - if test $NEW_SHOUR -ge $SHOUR - then - export SHOUR=$NEW_SHOUR +if [[ -f "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.control.goessimpgrb2" ]]; then + modelrecvy=$(cat < "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.control.goessimpgrb") + recvy_cyc="${modelrecvy:8:2}" + recvy_shour="${modelrecvy:10:13}" + + if [[ ${RERUN} == "NO" ]]; then + NEW_SHOUR=$(( recvy_shour + FHINC )) + if (( NEW_SHOUR >= SHOUR )); then + export SHOUR=${NEW_SHOUR} fi - if test $recvy_shour -ge $FHOUR - then - msg="Forecast Pgrb Generation Already Completed to $FHOUR" - postmsg "$jlogfile" "$msg" + if (( recvy_shour >= FHOUR )); then + echo "Forecast Pgrb Generation Already Completed to ${FHOUR}" else - msg="Starting: PDY=$PDY cycle=t${recvy_cyc}z SHOUR=$SHOUR ." 
- postmsg "$jlogfile" "$msg" + echo "Starting: PDY=${PDY} cycle=t${recvy_cyc}z SHOUR=${SHOUR}" fi fi fi ############################################################# # Execute the script -$HOMEgfs/scripts/exgfs_atmos_grib2_special_npoess.sh +"${HOMEgfs}/scripts/exgfs_atmos_grib2_special_npoess.sh" export err=$?;err_chk ############################################################# ############################################ # print exec I/O output ############################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ] ; then - rm -rf $DATA +if [[ "${KEEPDATA}" != "YES" ]] ; then + rm -rf "${DATA}" fi diff --git a/jobs/JGFS_ATMOS_POSTSND b/jobs/JGFS_ATMOS_POSTSND index 013e6d16486..2318d70e313 100755 --- a/jobs/JGFS_ATMOS_POSTSND +++ b/jobs/JGFS_ATMOS_POSTSND @@ -1,62 +1,13 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -configs="base postsnd" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env postsnd -status=$? 
-[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. ./PDY - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "postsnd" -c "base postsnd" ############################################## # Set variables used in the exglobal script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gfs"}} -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir -fi +export CDUMP=${RUN/enkf} ######################################## @@ -72,31 +23,31 @@ export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} # Set up the source directories ################################### -export HOMEbufrsnd=${HOMEbufrsnd:-$HOMEgfs} -export EXECbufrsnd=${EXECbufrsnd:-$HOMEbufrsnd/exec} -export FIXbufrsnd=${FIXbufrsnd:-$HOMEbufrsnd/fix/product} -export PARMbufrsnd=${PARMbufrsnd:-$HOMEbufrsnd/parm/product} -export USHbufrsnd=${USHbufrsnd:-$HOMEbufrsnd/ush} -export SCRbufrsnd=${SCRbufrsnd:-$HOMEbufrsnd/scripts} +export HOMEbufrsnd=${HOMEbufrsnd:-${HOMEgfs}} +export EXECbufrsnd=${EXECbufrsnd:-${HOMEbufrsnd}/exec} +export FIXbufrsnd=${FIXbufrsnd:-${HOMEbufrsnd}/fix/product} +export PARMbufrsnd=${PARMbufrsnd:-${HOMEbufrsnd}/parm/product} +export USHbufrsnd=${USHbufrsnd:-${HOMEbufrsnd}/ush} +export SCRbufrsnd=${SCRbufrsnd:-${HOMEbufrsnd}/scripts} ############################## # 
Define COM Directories ############################## -export COMIN=${COMIN:-$ROTDIR/${CDUMP}.${PDY}/${cyc}/atmos} -export COMOUT=${COMOUT:-$ROTDIR/${CDUMP}.${PDY}/${cyc}/atmos} -export pcom=${pcom:-${COMOUT}/wmo} -export COMAWP=${COMAWP:-${COMOUT}/gempak} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -[[ ! -d $COMOUT ]] && mkdir -p $COMOUT -[[ ! -d $pcom ]] && mkdir -p $pcom -[[ ! -d $COMAWP ]] && mkdir -p $COMAWP + +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY COM_ATMOS_BUFR \ + COM_ATMOS_WMO COM_ATMOS_GEMPAK + +[[ ! -d ${COM_ATMOS_BUFR} ]] && mkdir -p "${COM_ATMOS_BUFR}" +[[ ! -d ${COM_ATMOS_GEMPAK} ]] && mkdir -p "${COM_ATMOS_GEMPAK}" +[[ ! -d ${COM_ATMOS_WMO} ]] && mkdir -p "${COM_ATMOS_WMO}" ######################################################## # Execute the script. -$SCRbufrsnd/exgfs_atmos_postsnd.sh +${SCRbufrsnd}/exgfs_atmos_postsnd.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## @@ -106,15 +57,15 @@ status=$? ############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [ -e "${pgmout}" ] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGFS_ATMOS_VMINMON b/jobs/JGFS_ATMOS_VMINMON new file mode 100755 index 00000000000..a7300b4dd35 --- /dev/null +++ b/jobs/JGFS_ATMOS_VMINMON @@ -0,0 +1,73 @@ +#! 
/usr/bin/env bash + +########################################################### +# GFS Minimization Monitor (MinMon) job +########################################################### +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "vrfy" -c "base vrfy" + +########################################################### +# obtain unique process id (pid) and make temp directories +########################################################### +export MINMON_SUFFIX=${MINMON_SUFFIX:-GFS} +export m_job=${m_job:-${MINMON_SUFFIX}_mmDE} + + +############################################## +# Specify Package Areas +############################################## +export SCRgfs=${SCRgfs:-${HOMEgfs}/scripts} +export M_FIXgfs=${M_FIXgfs:-${HOMEgfs}/fix/product} + +export HOMEminmon=${HOMEminmon:-${HOMEgfs}} +export EXECminmon=${EXECminmon:-${HOMEminmon}/exec} +export USHminmon=${USHminmon:-${HOMEminmon}/ush} + + +############################################# +# determine PDY and cyc for previous cycle +############################################# + +pdate=$(${NDATE} -6 ${PDY}${cyc}) +echo "pdate = ${pdate}" + +export P_PDY=${pdate:0:8} +export p_cyc=${pdate:8:2} + + +############################################# +# TANKverf - WHERE OUTPUT DATA WILL RESIDE +############################################# +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS +YMD=${P_PDY} HH=${p_cyc} generate_com -rx COM_ATMOS_ANALYSIS_PREV:COM_ATMOS_ANALYSIS_TMPL + +M_TANKverf=${M_TANKverf:-${COM_ATMOS_ANALYSIS}/minmon} +export M_TANKverfM1=${M_TANKverfM1:-${COM_ATMOS_ANALYSIS_PREV}/minmon} + +mkdir -p -m 775 ${M_TANKverf} + +######################################## +# Set necessary environment variables +######################################## +export CYCLE_INTERVAL=6 +export gsistat=${gsistat:-${COM_ATMOS_ANALYSIS}/gfs.t${cyc}z.gsistat} + + +######################################################## +# Execute the script. 
+${GMONSH:-${SCRgfs}/exgfs_atmos_vminmon.sh} ${PDY} ${cyc} +err=$? +[[ ${err} -ne 0 ]] && exit ${err} + + +################################ +# Remove the Working Directory +################################ +KEEPDATA=${KEEPDATA:-NO} +cd ${DATAROOT} + +if [ ${KEEPDATA} = NO ] ; then + rm -rf ${DATA} +fi + + diff --git a/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE b/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE new file mode 100755 index 00000000000..065ebe8d0a8 --- /dev/null +++ b/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE @@ -0,0 +1,56 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" +export DATA=${DATA:-${DATAROOT}/${RUN}aeroanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "aeroanlfinal" -c "base aeroanl aeroanlfinal" + +############################################## +# Set variables used in the script +############################################## +# shellcheck disable=SC2153 +GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours") +gPDY=${GDATE:0:8} +gcyc=${GDATE:8:2} +GDUMP="gdas" + + +############################################## +# Begin JOB SPECIFIC work +############################################## + +# Generate COM variables from templates +YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_CHEM_ANALYSIS + +RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_CHEM_ANALYSIS_PREV:COM_CHEM_ANALYSIS_TMPL \ + COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL + +mkdir -m 775 -p "${COM_CHEM_ANALYSIS}" + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASAEROFINALPY:-${HOMEgfs}/scripts/exglobal_aero_analysis_finalize.py} +${EXSCRIPT} +status=$? 
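The previous-cycle bookkeeping that recurs in the new J-jobs above (JGLOBAL_AERO_ANALYSIS_FINALIZE, JGLOBAL_AERO_ANALYSIS_INITIALIZE, JGLOBAL_ATMENS_ANALYSIS_INITIALIZE) can be exercised standalone. A minimal sketch, assuming GNU `date` and illustrative `PDY`/`cyc`/`assim_freq` values not taken from the patch:

```shell
# Derive the previous cycle (GDATE) from the current PDY/cyc, as the
# J-jobs above do. The values below are illustrative only; in the real
# jobs PDY, cyc, and assim_freq come from the workflow environment
# (hence the "shellcheck disable=SC2153" annotations in the hunks).
PDY=20230501
cyc=00
assim_freq=6

GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
gPDY=${GDATE:0:8}   # first 8 chars: previous cycle's YYYYMMDD
gcyc=${GDATE:8:2}   # next 2 chars: previous cycle's hour (HH)
```

The substring expansions `${GDATE:0:8}`/`${GDATE:8:2}` replace the older `echo | cut -c1-8` pipelines that the diff removes.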
+[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +########################################## +# Remove the Temporary working directory +########################################## +cd "${DATAROOT}" || exit 1 +[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}" + +exit 0 diff --git a/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE new file mode 100755 index 00000000000..2f8c222e182 --- /dev/null +++ b/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE @@ -0,0 +1,49 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +export DATA=${DATA:-${DATAROOT}/${RUN}aeroanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "aeroanlinit" -c "base aeroanl aeroanlinit" + +############################################## +# Set variables used in the script +############################################## +# shellcheck disable=SC2153 +GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours") +gPDY=${GDATE:0:8} +gcyc=${GDATE:8:2} +GDUMP="gdas" + + +############################################## +# Begin JOB SPECIFIC work +############################################## + +# Generate COM variables from templates +YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_CHEM_ANALYSIS + +RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_CHEM_ANALYSIS_PREV:COM_CHEM_ANALYSIS_TMPL \ + COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL + +mkdir -m 775 -p "${COM_CHEM_ANALYSIS}" + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASAEROINITPY:-${HOMEgfs}/scripts/exglobal_aero_analysis_initialize.py} +${EXSCRIPT} +status=$? 
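The `generate_com` calls in these hunks expand COM path templates into exported variables (optionally under an overridden `RUN`/`YMD`/`HH`). The real implementation lives in the workflow's ush/ scripts; the following toy expander only illustrates the idea, and the function name `gen_com`, the template string, and the paths are assumptions, not the actual API:

```shell
# Toy stand-in for generate_com: expand a COM template into an
# exported variable. NOT the real workflow implementation.
ROTDIR=/scratch/comroot
COM_CHEM_ANALYSIS_TMPL='${ROTDIR}/${RUN}.${YMD}/${HH}/chem/analysis'

gen_com() {
  # gen_com VARNAME TEMPLATE: evaluate TEMPLATE with the caller's
  # RUN/YMD/HH in scope and export the result as VARNAME.
  local name=$1 tmpl=$2
  eval "export ${name}=\"${tmpl}\""
}

# Assignments with no command word persist in the current shell,
# mimicking the "YMD=${PDY} HH=${cyc} generate_com ..." call style.
RUN=gdas YMD=20230501 HH=06
gen_com COM_CHEM_ANALYSIS "${COM_CHEM_ANALYSIS_TMPL}"
```

The `PREV:..._TMPL` form seen in the hunks (e.g. `COM_CHEM_ANALYSIS_PREV:COM_CHEM_ANALYSIS_TMPL`) names the output variable separately from the template so the same template can be expanded for both the current and previous cycle.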
+[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +exit 0 diff --git a/jobs/JGLOBAL_AERO_ANALYSIS_RUN b/jobs/JGLOBAL_AERO_ANALYSIS_RUN new file mode 100755 index 00000000000..853909dc03f --- /dev/null +++ b/jobs/JGLOBAL_AERO_ANALYSIS_RUN @@ -0,0 +1,35 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" +export DATA=${DATA:-${DATAROOT}/${RUN}aeroanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "aeroanlrun" -c "base aeroanl aeroanlrun" + +############################################## +# Set variables used in the script +############################################## + +############################################## +# Begin JOB SPECIFIC work +############################################## + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASAERORUNSH:-${HOMEgfs}/scripts/exglobal_aero_analysis_run.py} +${EXSCRIPT} +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +exit 0 diff --git a/jobs/JGLOBAL_ARCHIVE b/jobs/JGLOBAL_ARCHIVE new file mode 100755 index 00000000000..2d2f8c8814e --- /dev/null +++ b/jobs/JGLOBAL_ARCHIVE @@ -0,0 +1,52 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "arch" -c "base arch" + + +############################################## +# Set variables used in the script +############################################## +export CDUMP=${RUN/enkf} + +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS COM_ATMOS_BUFR COM_ATMOS_GEMPAK \ + COM_ATMOS_GENESIS COM_ATMOS_HISTORY COM_ATMOS_INPUT COM_ATMOS_MASTER COM_ATMOS_RESTART \ + COM_ATMOS_TRACK COM_ATMOS_WAFS COM_ATMOS_WMO \ + COM_CHEM_HISTORY \ + COM_ICE_HISTORY COM_ICE_INPUT \ + COM_OBS COM_TOP \ + COM_OCEAN_DAILY COM_OCEAN_HISTORY COM_OCEAN_INPUT COM_OCEAN_XSECT \ + COM_WAVE_GRID COM_WAVE_HISTORY COM_WAVE_STATION + +for grid in "0p25" "0p50" "1p00"; do + YMD=${PDY} HH=${cyc} GRID=${grid} generate_com -rx "COM_ATMOS_GRIB_${grid}:COM_ATMOS_GRIB_TMPL" + YMD=${PDY} HH=${cyc} GRID=${grid} generate_com -rx "COM_OCEAN_GRIB_${grid}:COM_OCEAN_GRIB_TMPL" +done + +############################################################### +# Run archive script +############################################################### + +${GLOBALARCHIVESH:-${SCRgfs}/exglobal_archive.sh} +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + + +########################################## +# Remove the Temporary working directory +########################################## +cd "${DATAROOT}" || (echo "${DATAROOT} does not exist. ABORT!"; exit 1) +[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}" + +exit 0 diff --git a/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE b/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE new file mode 100755 index 00000000000..37a49e0ae04 --- /dev/null +++ b/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE @@ -0,0 +1,48 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" +export DATA=${DATA:-${DATAROOT}/${RUN}atmensanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "atmensanlfinal" -c "base atmensanl atmensanlfinal" + +############################################## +# Set variables used in the script +############################################## +GDUMP="gdas" +GDUMP_ENS="enkf${GDUMP}" + +############################################## +# Begin JOB SPECIFIC work +############################################## +# Generate COM variable from template +MEMDIR='ensstat' RUN=${GDUMP_ENS} YMD=${PDY} HH=${cyc} generate_com -rx \ + COM_ATMOS_ANALYSIS_ENS:COM_ATMOS_ANALYSIS_TMPL + +mkdir -m 755 -p "${COM_ATMOS_ANALYSIS_ENS}" + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASATMENSFINALPY:-${HOMEgfs}/scripts/exglobal_atmens_analysis_finalize.py} +${EXSCRIPT} +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +########################################## +# Remove the Temporary working directory +########################################## +cd "${DATAROOT}" || ( echo "FATAL ERROR: ${DATAROOT} does not exist, ABORT!"; exit 1 ) +[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}" + +exit 0 diff --git a/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE new file mode 100755 index 00000000000..246502cdfa9 --- /dev/null +++ b/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE @@ -0,0 +1,44 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +export DATA=${DATA:-${DATAROOT}/${RUN}atmensanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "atmensanlinit" -c "base atmensanl atmensanlinit" + +############################################## +# Set variables used in the script +############################################## +# shellcheck disable=SC2153 +GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours") +gPDY=${GDATE:0:8} +gcyc=${GDATE:8:2} +GDUMP="gdas" + +############################################## +# Begin JOB SPECIFIC work +############################################## +# Generate COM variables from templates +RUN=${GDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS + +RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_ATMOS_ANALYSIS_PREV:COM_ATMOS_ANALYSIS_TMPL + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASATMENSINITPY:-${HOMEgfs}/scripts/exglobal_atmens_analysis_initialize.py} +${EXSCRIPT} +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +exit 0 diff --git a/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN b/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN new file mode 100755 index 00000000000..0d10c76b052 --- /dev/null +++ b/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN @@ -0,0 +1,35 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +export WIPE_DATA="NO" +export DATA=${DATA:-${DATAROOT}/${RUN}atmensanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "atmensanlrun" -c "base atmensanl atmensanlrun" + +############################################## +# Set variables used in the script +############################################## + +############################################## +# Begin JOB SPECIFIC work +############################################## + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASATMENSRUNSH:-${HOMEgfs}/scripts/exglobal_atmens_analysis_run.py} +${EXSCRIPT} +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +exit 0 diff --git a/jobs/JGLOBAL_ATMOS_ANALYSIS b/jobs/JGLOBAL_ATMOS_ANALYSIS index df1f4ab474d..9e5850bfc38 100755 --- a/jobs/JGLOBAL_ATMOS_ANALYSIS +++ b/jobs/JGLOBAL_ATMOS_ANALYSIS @@ -1,156 +1,103 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base anal" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env anal -status=$? 
-[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} - -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. ./PDY - - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "anal" -c "base anal" ############################################## # Set variables used in the script ############################################## export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gfs"}} -export COMPONENT=${COMPONENT:-atmos} -export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} +export CDUMP=${RUN/enkf} +export COMPONENT="atmos" +export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} +export MAKE_NSSTBUFR=${MAKE_NSSTBUFR:-"NO"} +export MAKE_ACFTBUFR=${MAKE_ACFTBUFR:-"NO"} ############################################## # Begin JOB SPECIFIC work ############################################## -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} +GDATE=$(${NDATE} -${assim_freq} ${PDY}${cyc}) +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP="gdas" +export GDUMP_ENS="enkf${GDUMP}" export OPREFIX="${CDUMP}.t${cyc}z." export GPREFIX="${GDUMP}.t${gcyc}z." export APREFIX="${CDUMP}.t${cyc}z." 
-export GSUFFIX=${GSUFFIX:-$SUFFIX} -export ASUFFIX=${ASUFFIX:-$SUFFIX} - - -if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then - export COMIN=${COMIN:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMOUT=${COMOUT:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT} -else - export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -fi -mkdir -m 775 -p $COMOUT -# COMIN_GES and COMIN_GES_ENS are used in script -export COMIN_GES="$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT" +export GPREFIX_ENS="${GDUMP_ENS}.t${gcyc}z." + +# Generate COM variables from templates +YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_ATMOS_ANALYSIS +RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_ATMOS_ANALYSIS_PREV:COM_ATMOS_ANALYSIS_TMPL \ + COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL -export ATMGES="$COMIN_GES/${GPREFIX}atmf006${GSUFFIX}" -if [ ! -f $ATMGES ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES = $ATMGES" +MEMDIR='ensstat' RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_ATMOS_HISTORY_ENS_PREV:COM_ATMOS_HISTORY_TMPL + +mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}" + +export ATMGES="${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf006.nc" +if [ ! -f ${ATMGES} ]; then + echo "FATAL ERROR: FILE MISSING: ATMGES = ${ATMGES}" exit 1 fi - # Get LEVS -if [ ${GSUFFIX} = ".nc" ]; then - export LEVS=$($NCLEN $ATMGES pfull) - status=$? -else - export LEVS=$($NEMSIOGET $ATMGES dimz | awk '{print $2}') - status=$? -fi -[[ $status -ne 0 ]] && exit $status +export LEVS=$(${NCLEN} ${ATMGES} pfull) +status=$? 
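Several of the rewritten jobs derive `CDUMP` with the pattern substitution `${RUN/enkf}`; a quick check of what that expansion does:

```shell
# ${RUN/enkf} deletes the first occurrence of "enkf" from RUN, so
# ensemble RUNs map to their deterministic counterpart while
# deterministic RUNs pass through unchanged.
RUN=enkfgdas
CDUMP=${RUN/enkf}

RUN_DET=gfs
CDUMP_DET=${RUN_DET/enkf}
```

This replaces the old `CDUMP=${CDUMP:-${RUN:-"gfs"}}` defaulting, tying `CDUMP` directly to `RUN`.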
+[[ ${status} -ne 0 ]] && exit ${status} -if [ $DOHYBVAR = "YES" ]; then - export ATMGES_ENSMEAN="$COMIN_GES_ENS/${GPREFIX}atmf006.ensmean$GSUFFIX" - if [ ! -f $ATMGES_ENSMEAN ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = $ATMGES_ENSMEAN" +if [ ${DOHYBVAR} = "YES" ]; then + export ATMGES_ENSMEAN="${COM_ATMOS_HISTORY_ENS_PREV}/${GPREFIX_ENS}atmf006.ensmean.nc" + if [ ! -f ${ATMGES_ENSMEAN} ]; then + echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = ${ATMGES_ENSMEAN}" exit 2 fi fi # Link observational data -export PREPQC="${COMIN_OBS}/${OPREFIX}prepbufr" -if [ ! -f $PREPQC ]; then - echo "WARNING: Global PREPBUFR FILE $PREPQC MISSING" +export PREPQC="${COM_OBS}/${OPREFIX}prepbufr" +if [[ ! -f ${PREPQC} ]]; then + echo "WARNING: Global PREPBUFR FILE ${PREPQC} MISSING" fi -export PREPQCPF="${COMIN_OBS}/${OPREFIX}prepbufr.acft_profiles" -export TCVITL="${COMOUT}/${OPREFIX}syndata.tcvitals.tm00" -[[ $DONST = "YES" ]] && export NSSTBF="${COMIN_OBS}/${OPREFIX}nsstbufr" - +export TCVITL="${COM_OBS}/${OPREFIX}syndata.tcvitals.tm00" +if [[ ${DONST} = "YES" ]]; then + if [[ ${MAKE_NSSTBUFR} == "YES" ]]; then + export NSSTBF="${COM_OBS}/${OPREFIX}nsstbufr" + fi +fi +export PREPQCPF="${COM_OBS}/${OPREFIX}prepbufr.acft_profiles" -# Copy fix file for obsproc -if [ $RUN = "gfs" ]; then - mkdir -p $ROTDIR/fix - cp $FIXgsi/prepobs_errtable.global $ROTDIR/fix/ +# Copy fix file for obsproc # TODO: Why is this necessary? +if [[ ${RUN} = "gfs" ]]; then + mkdir -p ${ROTDIR}/fix + cp ${FIXgsi}/prepobs_errtable.global ${ROTDIR}/fix/ fi ############################################################### # Run relevant script -${ANALYSISSH:-$SCRgfs/exglobal_atmos_analysis.sh} +${ANALYSISSH:-${SCRgfs}/exglobal_atmos_analysis.sh} status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## # Send Alerts ############################################## -if [ $SENDDBN = YES -a $RUN = gdas ] ; then - $DBNROOT/bin/dbn_alert MODEL GDAS_MSC_abias $job $COMOUT/${APREFIX}abias - $DBNROOT/bin/dbn_alert MODEL GDAS_MSC_abias_pc $job $COMOUT/${APREFIX}abias_pc - $DBNROOT/bin/dbn_alert MODEL GDAS_MSC_abias_air $job $COMOUT/${APREFIX}abias_air +if [ ${SENDDBN} = YES -a ${RUN} = gdas ] ; then + ${DBNROOT}/bin/dbn_alert MODEL GDAS_MSC_abias ${job} ${COM_ATMOS_ANALYSIS}/${APREFIX}abias + ${DBNROOT}/bin/dbn_alert MODEL GDAS_MSC_abias_pc ${job} ${COM_ATMOS_ANALYSIS}/${APREFIX}abias_pc + ${DBNROOT}/bin/dbn_alert MODEL GDAS_MSC_abias_air ${job} ${COM_ATMOS_ANALYSIS}/${APREFIX}abias_air fi @@ -161,15 +108,15 @@ fi ############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_ATMOS_ANALYSIS_CALC b/jobs/JGLOBAL_ATMOS_ANALYSIS_CALC index 39438e32b7c..65a571a9748 100755 --- a/jobs/JGLOBAL_ATMOS_ANALYSIS_CALC +++ b/jobs/JGLOBAL_ATMOS_ANALYSIS_CALC @@ -1,134 +1,64 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -configs="base anal analcalc" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? 
- [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env anal -status=$? -[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} - -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. ./PDY - - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "analcalc" -c "base anal analcalc" ############################################## # Set variables used in the script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gfs"}} -export COMPONENT=${COMPONENT:-atmos} -export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} +export CDUMP="${RUN/enkf}" +export DO_CALC_ANALYSIS=${DO_CALC_ANALYSIS:-"YES"} ############################################## # Begin JOB SPECIFIC work ############################################## - -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") +# shellcheck disable= +export gPDY=${GDATE:0:8} +export gcyc=${GDATE:8:2} +export GDUMP="gdas" +export GDUMP_ENS="enkf${GDUMP}" export OPREFIX="${CDUMP}.t${cyc}z." 
export GPREFIX="${GDUMP}.t${gcyc}z." -export APREFIX="${CDUMP}.t${cyc}z." -export GSUFFIX=${GSUFFIX:-$SUFFIX} -export ASUFFIX=${ASUFFIX:-$SUFFIX} - - -if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then - export COMIN=${COMIN:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMOUT=${COMOUT:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_OBS=${COMIN_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$RUN.$PDY/$cyc/$COMPONENT} - export COMIN_GES_OBS=${COMIN_GES_OBS:-$(compath.py ${envir}/obsproc/${obsproc_ver})/$GDUMP.$gPDY/$gcyc/$COMPONENT} -else - export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_OBS="$DMPDIR/$CDUMP.$PDY/$cyc/$COMPONENT" - export COMIN_GES_OBS="$DMPDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -fi -mkdir -m 775 -p $COMOUT -# COMIN_GES and COMIN_GES_ENS are used in script -export COMIN_GES="$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" -export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT" +export APREFIX="${RUN}.t${cyc}z." +export GPREFIX_ENS="${GDUMP_ENS}.t${gcyc}z." +RUN=${CDUMP} YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS -export ATMGES="$COMIN_GES/${GPREFIX}atmf006${GSUFFIX}" -if [ ! -f $ATMGES ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES = $ATMGES" - exit 1 -fi +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS COM_ATMOS_RESTART +RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_OBS_PREV:COM_OBS_TMPL \ + COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL -# Get LEVS -if [ ${GSUFFIX} = ".nc" ]; then - export LEVS=$($NCLEN $ATMGES pfull) - status=$? -else - export LEVS=$($NEMSIOGET $ATMGES dimz | awk '{print $2}') - status=$? -fi -[[ $status -ne 0 ]] && exit $status - -if [ $DOHYBVAR = "YES" ]; then - export ATMGES_ENSMEAN="$COMIN_GES_ENS/${GPREFIX}atmf006.ensmean$GSUFFIX" - if [ ! -f $ATMGES_ENSMEAN ]; then - echo "FATAL ERROR: FILE MISSING: ATMGES_ENSMEAN = $ATMGES_ENSMEAN" - exit 2 - fi +export ATMGES="${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf006.nc" +if [ ! 
-f ${ATMGES} ]; then + echo "FATAL ERROR: FILE MISSING: ATMGES = ${ATMGES}" + exit 1 fi +# Get LEVS +export LEVS=$(${NCLEN} ${ATMGES} pfull) +status=$? +[[ ${status} -ne 0 ]] && exit ${status} + -# Generate Gaussian surface analysis +# Generate Gaussian surface analysis # TODO: Should this be removed now that sfcanl is its own job? export DOGAUSFCANL=${DOGAUSFCANL:-"YES"} ############################################################### # Run relevant script -${ANALCALCSH:-$SCRgfs/exglobal_atmos_analysis_calc.sh} +${ANALCALCSH:-${SCRgfs}/exglobal_atmos_analysis_calc.sh} status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################## @@ -138,15 +68,15 @@ status=$? ############################################## # Final processing ############################################## -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_ATMOS_EMCSFC_SFC_PREP b/jobs/JGLOBAL_ATMOS_EMCSFC_SFC_PREP index c0aab4e9215..fdaca082405 100755 --- a/jobs/JGLOBAL_ATMOS_EMCSFC_SFC_PREP +++ b/jobs/JGLOBAL_ATMOS_EMCSFC_SFC_PREP @@ -1,46 +1,10 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "sfc_prep" -c "base" export RUN_ENVIR=${RUN_ENVIR:-"nco"} -############################# -# Source relevant config files -############################# -configs="base" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? 
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
-
 export SENDDBN=${SENDDBN:-NO}
 export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
@@ -49,32 +13,32 @@ export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
 ##############################################
 CDATE=${CDATE:-${PDY}${cyc}}
-GDATE=$($NDATE -06 $CDATE)
-PDY_m6hrs=$(echo $GDATE | cut -c1-8)
-cyc_m6hrs=$(echo $GDATE | cut -c9-10)
+GDATE=$(${NDATE} -06 ${CDATE})
+PDY_m6hrs=$(echo ${GDATE} | cut -c1-8)
+cyc_m6hrs=$(echo ${GDATE} | cut -c9-10)
 export cycle_m6hrs=t${cyc_m6hrs}z
 
-export COMPONENT=${COMPONENT:-atmos}
-export COMOUT=${COMOUT:-${COMROOT}/$NET/$envir/$RUN.$PDY/$cyc/$COMPONENT}
+export COMPONENT="atmos"
+export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}}
 
-export COMINgfs=${COMINgfs:-$(compath.py $NET/$envir/$RUN.$PDY)/$cyc/$COMPONENT}
-export COMINgfs_m6hrs=${COMINgfs_m6hrs:-$(compath.py $NET/$envir/$RUN.$PDY_m6hrs)/$cyc_m6hrs/$COMPONENT}
+export COMINobsproc=${COMINobsproc:-$(compath.py ${envir}/obsproc/${obsproc_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}}
+export COMIN_m6hrs=${COMIN_m6hrs:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY_m6hrs}/${cyc_m6hrs}/${COMPONENT}}
 
-export IMS_FILE=${COMINgfs}/${RUN}.${cycle}.imssnow96.grib2
-export FIVE_MIN_ICE_FILE=${COMINgfs}/${RUN}.${cycle}.seaice.5min.grib2
-export AFWA_NH_FILE=${COMINgfs}/${RUN}.${cycle}.NPR.SNWN.SP.S1200.MESH16.grb
-export AFWA_SH_FILE=${COMINgfs}/${RUN}.${cycle}.NPR.SNWS.SP.S1200.MESH16.grb
+export IMS_FILE=${COMINobsproc}/${RUN}.${cycle}.imssnow96.grib2
+export FIVE_MIN_ICE_FILE=${COMINobsproc}/${RUN}.${cycle}.seaice.5min.grib2
+export AFWA_NH_FILE=${COMINobsproc}/${RUN}.${cycle}.NPR.SNWN.SP.S1200.MESH16.grb
+export AFWA_SH_FILE=${COMINobsproc}/${RUN}.${cycle}.NPR.SNWS.SP.S1200.MESH16.grb
 export BLENDED_ICE_FILE=${BLENDED_ICE_FILE:-${RUN}.${cycle}.seaice.5min.blend.grb}
-export BLENDED_ICE_FILE_m6hrs=${BLENDED_ICE_FILE_m6hrs:-${COMINgfs_m6hrs}/${RUN}.${cycle_m6hrs}.seaice.5min.blend.grb}
+export BLENDED_ICE_FILE_m6hrs=${BLENDED_ICE_FILE_m6hrs:-${COMIN_m6hrs}/${RUN}.${cycle_m6hrs}.seaice.5min.blend.grb}
 
 ###############################################################
 # Run relevant script
 ###############################################################
-${EMCSFCPREPSH:-$SCRgfs/exemcsfc_global_sfc_prep.sh}
+${EMCSFCPREPSH:-${SCRgfs}/exemcsfc_global_sfc_prep.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ##############################################
 # End JOB SPECIFIC work
@@ -84,14 +48,14 @@ status=$?
 # Final processing
 ##############################################
 if [ -e ${pgmout} ]; then
-  cat $pgmout
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGLOBAL_ATMOS_POST b/jobs/JGLOBAL_ATMOS_POST
new file mode 100755
index 00000000000..d636be4f303
--- /dev/null
+++ b/jobs/JGLOBAL_ATMOS_POST
@@ -0,0 +1,122 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "post" -c "base post"
+
+
+####################################
+# Specify version numbers
+####################################
+export crtm_ver=${post_crtm_ver:-v2.2.6}
+export gfs_ver=${gfs_ver:-v15.0.0}
+export hwrf_ver=${hwrf_ver:-v11.0.5}
+export g2tmpl_ver=${g2tmpl_ver:-v1.5.0}
+
+##############################################
+# Set variables used in the exglobal script
+##############################################
+export CDUMP=${RUN/enkf}
+
+
+##############################################
+# TODO: Remove this egregious HACK
+##############################################
+if [[ "${SDATE:-}" = "${PDY}${cyc}" ]]; then
+  if [[ ${post_times} = "anl" ]]; then
+    echo "No offline post-processing in the first half cycle for analysis"
+    exit 0
+  fi
+fi
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+export APRUNP=${APRUN:-${APRUN_NP}}
+export RERUN=${RERUN:-NO}
+export HOMECRTM=${HOMECRTM:-${PACKAGEROOT}/lib/crtm/${crtm_ver}}
+export FIXCRTM=${CRTM_FIX:-${HOMECRTM}/fix}
+export PARMpost=${PARMpost:-${HOMEgfs}/parm/post}
+export INLINE_POST=${WRITE_DOPOST:-".false."}
+
+# Construct COM variables from templates
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_RESTART COM_ATMOS_ANALYSIS COM_ATMOS_HISTORY COM_ATMOS_MASTER
+if [[ ! -d ${COM_ATMOS_MASTER} ]]; then mkdir -m 775 -p "${COM_ATMOS_MASTER}"; fi
+
+if [[ ${GOESF} == "YES" ]]; then
+  YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GOES
+  if [[ ! -d ${COM_ATMOS_GOES} ]]; then mkdir -m 775 -p "${COM_ATMOS_GOES}"; fi
+fi
+
+if [[ ${WAFSF} == "YES" ]]; then
+  YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_WAFS
+  if [[ ! -d ${COM_ATMOS_WAFS} ]]; then mkdir -m 775 -p "${COM_ATMOS_WAFS}"; fi
+fi
+
+for grid in '0p25' '0p50' '1p00'; do
+  prod_dir="COM_ATMOS_GRIB_${grid}"
+  GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "${prod_dir}:COM_ATMOS_GRIB_TMPL"
+  if [[ ! -d "${prod_dir}" ]]; then mkdir -m 775 -p "${!prod_dir}"; fi
+done
+
+if [ "${RUN}" = gfs ];then
+  export FHOUT_PGB=${FHOUT_GFS:-3} #Output frequency of gfs pgb file at 1.0 and 0.5 deg.
+fi
+if [ "${RUN}" = gdas ]; then
+  export IGEN_GFS="gfs_avn"
+  export IGEN_ANL="anal_gfs"
+  export IGEN_FCST="gfs_avn"
+  export IGEN_GDAS_ANL="anal_gdas"
+  export FHOUT_PGB=${FHOUT:-1} #Output frequency of gfs pgb file at 1.0 and 0.5 deg.
+fi
+
+if [ "${GRIBVERSION}" = grib2 ]; then
+  export IGEN_ANL="anal_gfs"
+  export IGEN_FCST="gfs_avn"
+  export IGEN_GFS="gfs_avn"
+fi
+
+#######################################
+# Specify Restart File Name to Key Off
+#######################################
+# TODO Improve the name of this variable
+export restart_file=${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.atm.logf
+
+####################################
+# Specify Timeout Behavior of Post
+#
+# SLEEP_TIME - Amount of time to wait for
+#              a restart file before exiting
+# SLEEP_INT - Amount of time to wait between
+#             checking for restart files
+####################################
+export SLEEP_TIME=900
+export SLEEP_INT=5
+
+
+###############################################################
+# Run relevant exglobal script
+
+"${HOMEgfs}/scripts/ex${RUN}_atmos_post.sh"
+status=$?
+(( status != 0 )) && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [ -e "${pgmout}" ]; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Remove the Temporary working directory
+##########################################
+cd "${DATAROOT}" || exit 1
+[[ "${KEEPDATA:-NO}" = "NO" ]] && rm -rf "${DATA}"
+
+
+exit 0
diff --git a/jobs/JGLOBAL_ATMOS_POST_MANAGER b/jobs/JGLOBAL_ATMOS_POST_MANAGER
index b931a7aa90e..1d82537dcab 100755
--- a/jobs/JGLOBAL_ATMOS_POST_MANAGER
+++ b/jobs/JGLOBAL_ATMOS_POST_MANAGER
@@ -1,61 +1,17 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+# TODO (#1227) This job is not used in the rocoto suite
 
-########################################
-# GFS post manager
-########################################
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "post" -c "base post"
 
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-configs="base post"
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env post
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-set -xue
-# #### 07/30/1999 ###################
-# SET SHELL PROCESSING VARIABLES
-# ###################################
-export PS4='$SECONDS + '
-date
 
 ####################################
 # Specify NET and RUN Name and model
 ####################################
 export NET=${NET:-gfs}
 export RUN=${RUN:-gfs}
-export COMPONENT=${COMPONENT:-atmos}
-
-####################################
-# obtain unique process id (pid) and make temp directories
-####################################
-export pid=${pid:-$$}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir $DATA
-cd $DATA
-
-####################################
-# Determine Job Output Name on System
-####################################
-export outid="LL$job"
-export jobid="${outid}.o${pid}"
-export pgmout="OUTPUT.${pid}"
 
 ####################################
 # Specify version numbers
@@ -65,38 +21,24 @@ export gfs_ver=${gfs_ver:-v15.0.0}
 ####################################
 # Specify Execution Areas
 ####################################
-export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}}
-export EXECgfs=${HOMEgfs:-$HOMEgfs/exec}
-export FIXgfs=${HOMEgfs:-$HOMEgfs/fix}
-export PARMgfs=${HOMEgfs:-$HOMEgfs/parm}
-export USHgfs=${HOMEgfs:-$HOMEgfs/ush}
+export HOMEgfs=${HOMEgfs:-${PACKAGEROOT}/gfs.${gfs_ver}}
+export EXECgfs=${HOMEgfs:-${HOMEgfs}/exec}
+export FIXgfs=${HOMEgfs:-${HOMEgfs}/fix}
+export PARMgfs=${HOMEgfs:-${HOMEgfs}/parm}
+export USHgfs=${HOMEgfs:-${HOMEgfs}/ush}
 
 ###########################
 # Set up EXT variable
 ###########################
 export EXT_FCST=NO
 
-###################################
-# Set up the UTILITIES
-###################################
-# export HOMEutil=${HOMEutil:-/nw${envir}/util.${util_ver}}
-# export utilscript=${utilscript:-$HOMEutil/ush}
-# export utilexec=${utilexec:-$HOMEutil/exec}
-
-###########################################
-# Run setpdy and initialize PDY variables
-###########################################
-export cycle=t${cyc}z
-setpdy.sh
-. ./PDY
-
-export ROTDIR=${ROTDIR:-${COMROOT:?}/$NET/$envir}
-export COMIN=${COMIN:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT}
-export COMOUT=${COMOUT:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT}
+export ROTDIR=${ROTDIR:-${COMROOT:?}/${NET}/${envir}}
+export COMIN=${COMIN:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
+export COMOUT=${COMOUT:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
 
 ########################################################
 # Execute the script.
-$HOMEgfs/scripts/exglobal_atmos_pmgr.sh
+${HOMEgfs}/scripts/exglobal_atmos_pmgr.sh
 ########################################################
diff --git a/jobs/JGLOBAL_ATMOS_SFCANL b/jobs/JGLOBAL_ATMOS_SFCANL
index 7d0e70782b3..07a6570c74a 100755
--- a/jobs/JGLOBAL_ATMOS_SFCANL
+++ b/jobs/JGLOBAL_ATMOS_SFCANL
@@ -1,111 +1,45 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-configs="base sfcanl"
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env sfcanl
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "sfcanl" -c "base sfcanl"
 
 ##############################################
 # Set variables used in the script
 ##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gfs"}}
-export COMPONENT=${COMPONENT:-atmos}
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
+export CDUMP="${RUN/enkf}"
+if [[ ${RUN_ENVIR} = "nco" ]]; then
+  export ROTDIR=${COMROOT:?}/${NET}/${envir}
 fi
 
 ##############################################
 # Begin JOB SPECIFIC work
 ##############################################
-
-GDATE=$($NDATE -$assim_freq $CDATE)
-gPDY=$(echo $GDATE | cut -c1-8)
-gcyc=$(echo $GDATE | cut -c9-10)
-GDUMP=${GDUMP:-"gdas"}
+# Ignore possible spelling error (nothing is misspelled)
+# shellcheck disable=SC2153
+GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}")
+# shellcheck disable=
+gPDY=${GDATE:0:8}
+gcyc=${GDATE:8:2}
+export GDUMP="gdas"
 
 export OPREFIX="${CDUMP}.t${cyc}z."
 export GPREFIX="${GDUMP}.t${gcyc}z."
 export APREFIX="${CDUMP}.t${cyc}z."
-export GSUFFIX=${GSUFFIX:-$SUFFIX}
-export ASUFFIX=${ASUFFIX:-$SUFFIX}
-
-
-if [ $RUN_ENVIR = "nco" -o ${ROTDIR_DUMP:-NO} = "YES" ]; then
-  export COMIN=${COMIN:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT}
-  export COMOUT=${COMOUT:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT}
-  export COMIN_OBS=${COMIN_OBS:-$ROTDIR/$RUN.$PDY/$cyc/$COMPONENT}
-  export COMIN_GES_OBS=${COMIN_GES_OBS:-$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT}
-else
-  export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-  export COMIN_OBS="$DMPDIR/$CDATE/$CDUMP"
-  export COMIN_GES_OBS="$DMPDIR/$GDATE/$GDUMP"
-fi
-mkdir -m 775 -p $COMOUT
-# COMIN_GES and COMIN_GES_ENS are used in script
-export COMIN_GES="$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT"
-export COMIN_GES_ENS="$ROTDIR/enkfgdas.$gPDY/$gcyc/$COMPONENT"
+YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_ATMOS_ANALYSIS COM_ATMOS_RESTART
 
-export ATMGES="$COMIN_GES/${GPREFIX}atmf006${GSUFFIX}"
-if [ ! -f $ATMGES ]; then
-  echo "FATAL ERROR: FILE MISSING: ATMGES = $ATMGES"
-  exit 1
-fi
-
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_OBS_PREV:COM_OBS_TMPL \
+  COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL
 
 ###############################################################
 # Run relevant script
-${SFCANALSH:-$SCRgfs/exglobal_atmos_sfcanl.sh}
+${SFCANALSH:-${SCRgfs}/exglobal_atmos_sfcanl.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ##############################################
@@ -115,15 +49,15 @@ status=$?
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [[ -e "${pgmout}" ]] ; then
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC b/jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC
index 5496861e5f5..d5e48348516 100755
--- a/jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC
+++ b/jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC
@@ -1,64 +1,17 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "prep" -c "base prep"
 
+# TODO (#1220) Evaluate if this is still needed
 export RUN_ENVIR=${RUN_ENVIR:-"nco"}
 
-#############################
-# Source relevant config files
-#############################
-configs="base prep"
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env prep
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-
-##############################################
-# Obtain unique process id (pid) and make temp directory
-##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
-
-
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
-
 ##############################################
 # Set variables used in the exglobal script
 ##############################################
 export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gfs"}}
-export COMPONENT=${COMPONENT:-atmos}
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
-fi
+export CDUMP=${RUN/enkf}
 
 ##############################################
@@ -68,32 +21,21 @@ fi
 export PROCESS_TROPCY=${PROCESS_TROPCY:-YES}   # Turn on tropical cyclone tcvitals QC proc. if YES
 export DO_RELOCATE=${DO_RELOCATE:-NO}          # Turn on tropical cyclone relocation proc. if YES
-
 export tmmark=tm00
-if [ $RUN_ENVIR = "nco" ]; then
-  export ARCHSYND=$COMROOTp3/gfs/${envir}/syndat   # this location is unique, do not change
-else
-  export ARCHSYND=${ROTDIR}/syndat
-fi
-if [ ! -d ${ARCHSYND} ]; then mkdir -p $ARCHSYND; fi
-
-export HOMENHCp1=${HOMENHCp1:-/gpfs/?p1/nhc/save/guidance/storm-data/ncep}
-export HOMENHC=${HOMENHC:-/gpfs/dell2/nhc/save/guidance/storm-data/ncep}
+export ARCHSYND=${ROTDIR}/syndat   # this location is unique, do not change
+if [ ! -d ${ARCHSYND} ]; then mkdir -p ${ARCHSYND}; fi
 
-# JY export TANK_TROPCY=${TANK_TROPCY:-${DCOMROOT}/${envir}}   # path to tropical cyclone record database
-export TANK_TROPCY=${TANK_TROPCY:-${DCOMROOT}/prod}   # path to tropical cyclone record database
+export HOMENHC=${HOMENHC:-/lfs/h1/ops/prod/dcom/nhc/atcf/ncep}
+export TANK_TROPCY=${TANK_TROPCY:-${DCOMROOT}}   # path to tropical cyclone record database
 
 ##############################################
 # Define COM directories
 ##############################################
-export COMIN=${ROTDIR}/${RUN}.${PDY}/${cyc}/$COMPONENT
-export COMOUT=${ROTDIR}/${RUN}.${PDY}/${cyc}/$COMPONENT
-if [ ! -d ${COMOUT} ]; then mkdir -p $COMOUT; fi
-#export COMINgdas=${ROTDIR}/gdas.${PDY}/${cyc}
-#export COMINgfs=${ROTDIR}/gfs.${PDY}/${cyc}
+generate_com COM_OBS
+if [[ ! -d "${COM_OBS}" ]]; then mkdir -p "${COM_OBS}"; fi
 
-export CRES=$(echo $CASE | cut -c2-)
+export CRES=$(echo ${CASE} | cut -c2-)
 export LATB=$((CRES*2))
 export LONB=$((CRES*4))
 export BKGFREQ=1             # for hourly relocation
@@ -103,23 +45,23 @@ export BKGFREQ=1             # for hourly relocation
 # Run relevant script
 ##############################################
 
-${TROPCYQCRELOSH:-$SCRgfs/exglobal_atmos_tropcy_qc_reloc.sh}
+${TROPCYQCRELOSH:-${SCRgfs}/exglobal_atmos_tropcy_qc_reloc.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE b/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE
new file mode 100755
index 00000000000..c0bc56f6e23
--- /dev/null
+++ b/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE
@@ -0,0 +1,58 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+export WIPE_DATA="NO"
+export DATA=${DATA:-${DATAROOT}/${RUN}atmanl_${cyc}}
+source "${HOMEgfs}/ush/jjob_header.sh" -e "atmanlfinal" -c "base atmanl atmanlfinal"
+
+##############################################
+# Set variables used in the script
+##############################################
+# shellcheck disable=SC2153
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+gPDY=${GDATE:0:8}
+gcyc=${GDATE:8:2}
+GDUMP="gdas"
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+# Generate COM variables from templates
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS
+
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_ATMOS_ANALYSIS_PREV:COM_ATMOS_ANALYSIS_TMPL \
+  COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL \
+  COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL
+
+mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}"
+
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASATMFINALPY:-${HOMEgfs}/scripts/exglobal_atm_analysis_finalize.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Remove the Temporary working directory
+##########################################
+cd "${DATAROOT}" || ( echo "FATAL ERROR: ${DATAROOT} does not exist, ABORT!"; exit 1 )
+[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}"
+
+exit 0
diff --git a/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE
new file mode 100755
index 00000000000..2d794fb8462
--- /dev/null
+++ b/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE
@@ -0,0 +1,55 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+export DATA=${DATA:-${DATAROOT}/${RUN}atmanl_${cyc}}
+source "${HOMEgfs}/ush/jjob_header.sh" -e "atmanlinit" -c "base atmanl atmanlinit"
+
+##############################################
+# Set variables used in the script
+##############################################
+# shellcheck disable=SC2153
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+gPDY=${GDATE:0:8}
+gcyc=${GDATE:8:2}
+GDUMP="gdas"
+GDUMP_ENS="enkf${GDUMP}"
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+# Generate COM variables from templates
+YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_ATMOS_ANALYSIS
+
+RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_ATMOS_ANALYSIS_PREV:COM_ATMOS_ANALYSIS_TMPL \
+  COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL \
+  COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL
+
+MEMDIR='ensstat' RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -rx \
+  COM_ATMOS_HISTORY_ENS_PREV:COM_ATMOS_HISTORY_TMPL
+
+mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}"
+
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASATMINITPY:-${HOMEgfs}/scripts/exglobal_atm_analysis_initialize.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+exit 0
diff --git a/jobs/JGLOBAL_ATM_ANALYSIS_RUN b/jobs/JGLOBAL_ATM_ANALYSIS_RUN
new file mode 100755
index 00000000000..bbfdbe4a1fc
--- /dev/null
+++ b/jobs/JGLOBAL_ATM_ANALYSIS_RUN
@@ -0,0 +1,37 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+export WIPE_DATA="NO"
+export DATA=${DATA:-${DATAROOT}/${RUN}atmanl_${cyc}}
+source "${HOMEgfs}/ush/jjob_header.sh" -e "atmanlrun" -c "base atmanl atmanlrun"
+
+##############################################
+# Set variables used in the script
+##############################################
+
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASATMRUNSH:-${HOMEgfs}/scripts/exglobal_atm_analysis_run.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+exit 0
diff --git a/jobs/JGLOBAL_FORECAST b/jobs/JGLOBAL_FORECAST
index 40e8f46051a..5be44a8c97f 100755
--- a/jobs/JGLOBAL_FORECAST
+++ b/jobs/JGLOBAL_FORECAST
@@ -1,143 +1,78 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#--------------------------------
-if [ $RUN_ENVIR = "emc" ]; then
-#--------------------------------
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "fcst" -c "base fcst"
 
 ##############################################
-# Set variables used in the exglobal script
+# Set variables used in the script
 ##############################################
-export CDATE=${CDATE:-${PDY}${cyc}}
-export CDUMP=${CDUMP:-${RUN:-"gfs"}}
-
-#############################
-# Source relevant config files
-#############################
-configs="base fcst"
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env fcst
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-#--------------------------------
-fi
-#--------------------------------
+export CDUMP=${RUN/enkf}
 
 ##############################################
-# Obtain unique process id (pid) and make temp directory
+# Begin JOB SPECIFIC work
 ##############################################
-export pid=${pid:-$$}
-export outid=${outid:-"LL$job"}
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-##############################################
-# Run setpdy and initialize PDY variables
-##############################################
-export cycle="t${cyc}z"
-setpdy.sh
-. ./PDY
+# Restart conditions for GFS cycle come from GDAS
+rCDUMP=${CDUMP}
+[[ ${CDUMP} = "gfs" ]] && export rCDUMP="gdas"
 
+# Forecast length for GFS forecast
+if [ ${CDUMP} = "gfs" ]; then
+  export FHMAX=${FHMAX_GFS}
+  export FHOUT=${FHOUT_GFS}
+  export FHMAX_HF=${FHMAX_HF_GFS}
+  export FHOUT_HF=${FHOUT_HF_GFS}
+else
+  export FHMAX_HF=0
+  export FHOUT_HF=0
+fi
 
-##############################################
-# Determine Job Output Name on System
-##############################################
-export pgmout="OUTPUT.${pid}"
-export pgmerr=errfile
+# Ignore possible spelling error (nothing is misspelled)
+# shellcheck disable=SC2153
+GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}")
+# shellcheck disable=
+declare -x gPDY="${GDATE:0:8}"
+declare -x gcyc="${GDATE:8:2}"
 
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
-  export RSTDIR=${GESROOT:?}/$envir
-fi
+# Construct COM variables from templates (see config.com)
+YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_RESTART COM_ATMOS_INPUT COM_ATMOS_ANALYSIS \
+  COM_ATMOS_HISTORY COM_ATMOS_MASTER COM_TOP
+RUN=${rCDUMP} YMD="${gPDY}" HH="${gcyc}" generate_com -rx \
+  COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL
 
-#--------------------------------
-if [ $RUN_ENVIR = "nco" ]; then
-#--------------------------------
-
-#############################
-# Source relevant config files
-#############################
-configs="base fcst"
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-# Source additional configs
-if [ ${DO_WAVE:-"NO"} = "YES" ]; then
-  configs="wave"
-  for config in $configs; do
-    . $config_path/config.$config
-    status=$?
-    [[ $status -ne 0 ]] && exit $status
-  done
+if [[ ${DO_WAVE} == "YES" ]]; then
+  YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_RESTART COM_WAVE_PREP COM_WAVE_HISTORY
+  RUN=${rCDUMP} YMD="${gPDY}" HH="${gcyc}" generate_com -rx \
+    COM_WAVE_RESTART_PREV:COM_WAVE_RESTART_TMPL
+  declare -rx RUNwave="${RUN}wave"
 fi
 
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env fcst
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-#--------------------------------
+if [[ ${DO_OCN} == "YES" ]]; then
+  YMD=${PDY} HH=${cyc} generate_com -rx COM_MED_RESTART COM_OCEAN_RESTART COM_OCEAN_INPUT \
+    COM_OCEAN_HISTORY COM_OCEAN_ANALYSIS
+  RUN=${CDUMP} YMD="${gPDY}" HH="${gcyc}" generate_com -rx \
+    COM_OCEAN_RESTART_PREV:COM_OCEAN_RESTART_TMPL
 fi
-#--------------------------------
-
-# Set wave variables
-if [ ${DO_WAVE:-"NO"} = "YES" ]; then
-  # WAVE component directory
-  export CDUMPwave=${CDUMPwave:-${CDUMP}wave}
-  export COMINwave=${COMINwave:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/wave}
-  export COMOUTwave=${COMOUTwave:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/wave}
+if [[ ${DO_ICE} == "YES" ]]; then
+  YMD=${PDY} HH=${cyc} generate_com -rx COM_ICE_HISTORY COM_ICE_INPUT COM_ICE_RESTART
+  RUN=${CDUMP} YMD="${gPDY}" HH="${gcyc}" generate_com -rx \
+    COM_ICE_RESTART_PREV:COM_ICE_RESTART_TMPL
 fi
 
-##############################################
-# Begin JOB SPECIFIC work
-##############################################
-
-# Restart conditions for GFS cycle come from GDAS
-rCDUMP=$CDUMP
-[[ $CDUMP = "gfs" ]] && export rCDUMP="gdas"
-
-# Forecast length for GFS forecast
-if [ $CDUMP = "gfs" ]; then
-  export FHMAX=$FHMAX_GFS
-  export FHOUT=$FHOUT_GFS
-  export FHMAX_HF=$FHMAX_HF_GFS
-  export FHOUT_HF=$FHOUT_HF_GFS
-else
-  export FHMAX_HF=0
-  export FHOUT_HF=0
+if [[ ${DO_AERO} == "YES" ]]; then
+  YMD=${PDY} HH=${cyc} generate_com -rx COM_CHEM_HISTORY
 fi
 
 ###############################################################
 # Run relevant exglobal script
-${FORECASTSH:-$SCRgfs/exglobal_forecast.sh}
+${FORECASTSH:-${SCRgfs}/exglobal_forecast.sh}
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ##############################################
@@ -147,15 +82,15 @@ status=$?
 ##############################################
 # Final processing
 ##############################################
-if [ -e "$pgmout" ] ; then
-  cat $pgmout
+if [ -e "${pgmout}" ] ; then
+  cat ${pgmout}
 fi
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd ${DATAROOT}
+[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA}
 
 exit 0
diff --git a/jobs/JGLOBAL_LAND_ANALYSIS_FINALIZE b/jobs/JGLOBAL_LAND_ANALYSIS_FINALIZE
new file mode 100755
index 00000000000..39b20ba6627
--- /dev/null
+++ b/jobs/JGLOBAL_LAND_ANALYSIS_FINALIZE
@@ -0,0 +1,50 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+export WIPE_DATA="NO"
+export DATA=${DATA:-${DATAROOT}/${RUN}landanl_${cyc}}
+source "${HOMEgfs}/ush/jjob_header.sh" -e "landanlfinal" -c "base landanl landanlfinal"
+
+##############################################
+# Set variables used in the script
+##############################################
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+GDUMP="gdas"
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+export COMOUT=${COMOUT:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
+mkdir -p "${COMOUT}"
+
+# COMIN_GES and COMIN_GES_ENS are used in script
+export COMIN_GES="${ROTDIR}/${GDUMP}.${GDATE:0:8}/${GDATE:8:2}/atmos"
+export COMIN_GES_ENS="${ROTDIR}/enkf${GDUMP}.${GDATE:0:8}/${GDATE:8:2}/atmos"
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASLANDFINALPY:-${HOMEgfs}/scripts/exglobal_land_analysis_finalize.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+##########################################
+# Remove the Temporary working directory
+##########################################
+cd "${DATAROOT}" || ( echo "FATAL ERROR: ${DATAROOT} does not exist, ABORT!"; exit 1 )
+[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}"
+
+exit 0
diff --git a/jobs/JGLOBAL_LAND_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_LAND_ANALYSIS_INITIALIZE
new file mode 100755
index 00000000000..5f71d58a00c
--- /dev/null
+++ b/jobs/JGLOBAL_LAND_ANALYSIS_INITIALIZE
@@ -0,0 +1,43 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+export DATA=${DATA:-${DATAROOT}/${RUN}landanl_${cyc}}
+source "${HOMEgfs}/ush/jjob_header.sh" -e "landanlinit" -c "base landanl landanlinit"
+
+##############################################
+# Set variables used in the script
+##############################################
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+GDUMP="gdas"
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+export COMOUT=${COMOUT:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
+mkdir -p "${COMOUT}"
+
+# COMIN_GES and COMIN_GES_ENS are used in script
+export COMIN_GES="${ROTDIR}/${GDUMP}.${GDATE:0:8}/${GDATE:8:2}/atmos"
+export COMIN_GES_ENS="${ROTDIR}/enkf${GDUMP}.${GDATE:0:8}/${GDATE:8:2}/atmos"
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASLANDINITPY:-${HOMEgfs}/scripts/exglobal_land_analysis_initialize.py}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+exit 0
diff --git a/jobs/JGLOBAL_LAND_ANALYSIS_RUN b/jobs/JGLOBAL_LAND_ANALYSIS_RUN
new file mode 100755
index 00000000000..d489b603f43
--- /dev/null
+++ b/jobs/JGLOBAL_LAND_ANALYSIS_RUN
@@ -0,0 +1,44 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+export WIPE_DATA="NO"
+export DATA=${DATA:-${DATAROOT}/${RUN}landanl_${cyc}}
+source "${HOMEgfs}/ush/jjob_header.sh" -e "landanlrun" -c "base landanl landanlrun"
+
+##############################################
+# Set variables used in the script
+##############################################
+GDATE=$(date +%Y%m%d%H -d "${PDY} ${cyc} - ${assim_freq} hours")
+GDUMP="gdas"
+
+##############################################
+# Begin JOB SPECIFIC work
+##############################################
+
+export COMOUT=${COMOUT:-${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos}
+mkdir -p "${COMOUT}"
+
+# COMIN_GES and COMIN_GES_ENS are used in script
+export COMIN_GES="${ROTDIR}/${GDUMP}.${GDATE:0:8}/${GDATE:8:2}/atmos"
+export COMIN_GES_ENS="${ROTDIR}/enkf${GDUMP}.${GDATE:0:8}/${GDATE:8:2}/atmos"
+
+###############################################################
+# Run relevant script
+
+EXSCRIPT=${GDASLANDRUNSH:-${HOMEgfs}/scripts/exglobal_land_analysis_run.sh}
+${EXSCRIPT}
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+##############################################
+# End JOB SPECIFIC work
+##############################################
+
+##############################################
+# Final processing
+##############################################
+if [[ -e "${pgmout}" ]] ; then
+  cat "${pgmout}"
+fi
+
+exit 0
diff --git a/jobs/JGLOBAL_WAVE_GEMPAK b/jobs/JGLOBAL_WAVE_GEMPAK
index 591dcff3930..b7c97ce5713 100755
--- a/jobs/JGLOBAL_WAVE_GEMPAK
+++ b/jobs/JGLOBAL_WAVE_GEMPAK
@@ -1,60 +1,34 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
-
-# JY - 10/29, move the block in the front, otherwise PDY is not defined for COMIN
-export DATA=${DATA:-${DATAROOT}/${jobid:?}}
-mkdir -p $DATA
-cd $DATA
-
-
-######################################
-# Set up the cycle variable
-######################################
-export cycle=${cycle:-t${cyc}z}
-
-setpdy.sh
-. PDY
-
-#
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
-export COMPONENT=${COMPONENT:-wave}
-export machine=${machine:-WCOSS2}
-export HOMEgfs=${HOMEgfs:-$(dirname $(dirname $0))}
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "wavegempak" -c "base wave wavegempak"
 
 # Add default errchk = err_chk
 export errchk=${errchk:-err_chk}
 
 ###################################
 # Set COM Paths
-export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT}
-export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT/gempak}
-#export pid=$$
-export pgmout="OUTPUT.$$"
-
+###################################
 export DBN_ALERT_TYPE=GFS_WAVE_GEMPAK
 export SENDCOM=${SENDCOM:-YES}
 export SENDDBN=${SENDDBN:-YES}
 export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}
+YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_GRID COM_WAVE_GEMPAK
 
-if [ $SENDCOM = YES ] ; then
-  mkdir -m 775 -p $COMOUT
-fi
-
+if [[ ! -d ${COM_WAVE_GEMPAK} ]]; then mkdir -p "${COM_WAVE_GEMPAK}"; fi
 
 ########################################################
 # Execute the script.
 ${HOMEgfs}/scripts/exgfs_wave_nawips.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 ###################################
 
 # Remove temp directories
-if [ "$KEEPDATA" != "YES" ]; then
-  cd $DATAROOT
-  rm -rf $DATA
+cd ${DATAROOT}
+if [ "${KEEPDATA}" != "YES" ]; then
+  rm -rf ${DATA}
 fi
-
 exit 0
diff --git a/jobs/JGLOBAL_WAVE_INIT b/jobs/JGLOBAL_WAVE_INIT
index 013dff7e707..49fccad66f5 100755
--- a/jobs/JGLOBAL_WAVE_INIT
+++ b/jobs/JGLOBAL_WAVE_INIT
@@ -1,82 +1,39 @@
 #! /usr/bin/env bash
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+source "${HOMEgfs}/ush/jjob_header.sh" -e "waveinit" -c "base wave waveinit"
 
-export RUN_ENVIR=${RUN_ENVIR:-"nco"}
-
-#############################
-# Source relevant config files
-#############################
-configs="base wave waveinit"
-export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config}
-config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config}
-for config in $configs; do
-  . $config_path/config.$config
-  status=$?
-  [[ $status -ne 0 ]] && exit $status
-done
-
-##########################################
-# Source machine runtime environment
-##########################################
-. $HOMEgfs/env/${machine}.env waveinit
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-# PATH for working directory
-export NET=${NET:-gfs}
-export RUN=${RUN:-gfs}
-export COMPONENT=${COMPONENT:-wave}
 
 # Add default errchk = err_chk
 export errchk=${errchk:-err_chk}
 
-# Create and go to DATA directory
-export DATA=${DATA:-${DATAROOT:?}/${jobid}}
-mkdir -p $DATA
-cd $DATA
-
-cyc=${cyc:-00}
-export cycle=${cycle:-t${cyc}z}
-
-# Set PDY
-setpdy.sh
-. PDY
-
-export pgmout=OUTPUT.$$
-
 export MP_PULSE=0
 
 # Path to HOME Directory
-export FIXwave=${FIXwave:-$HOMEgfs/fix/fix_wave_${NET}}
-export PARMwave=${PARMwave:-$HOMEgfs/parm/wave}
-export USHwave=${USHwave:-$HOMEgfs/ush}
-export EXECwave=${EXECwave:-$HOMEgfs/exec}
+export FIXwave=${FIXwave:-${HOMEgfs}/fix/fix_wave_${NET}}
+export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave}
+export USHwave=${USHwave:-${HOMEgfs}/ush}
+export EXECwave=${EXECwave:-${HOMEgfs}/exec}
 
-# Set COM Paths and GETGES environment
-if [ $RUN_ENVIR = "nco" ]; then
-  export ROTDIR=${COMROOT:?}/$NET/$envir
-fi
-export COMIN=${COMIN:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT}
-export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT}
-[[ !
-d $COMOUT ]] && mkdir -m 775 -p $COMOUT +# Set COM Paths +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP -if [ $SENDCOM = YES ]; then - mkdir -p $COMOUT/rundata +if [ ${SENDCOM} = YES ]; then + mkdir -m 775 -p ${COM_WAVE_PREP} fi # Set mpi serial command -export wavempexec=${launcher:-"mpirun -n"} -export wave_mpmd=${mpmd:-"cfp"} +export wavempexec=${wavempexec:-"mpirun -n"} +export wave_mpmd=${wave_mpmd:-"cfp"} -# Execute the Script -$HOMEgfs/scripts/exgfs_wave_init.sh +# Execute the Script +${HOMEgfs}/scripts/exgfs_wave_init.sh ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_WAVE_POST_BNDPNT b/jobs/JGLOBAL_WAVE_POST_BNDPNT index 0821a9fdaf8..9016d624d77 100755 --- a/jobs/JGLOBAL_WAVE_POST_BNDPNT +++ b/jobs/JGLOBAL_WAVE_POST_BNDPNT @@ -1,92 +1,42 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -configs="base wave wavepostsbs wavepostbndpnt" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env wavepostbndpnt -status=$? 
-[[ $status -ne 0 ]] && exit $status - -# PATH for working directory -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} - -export HOMEgefs=${HOMEgefs:-$NWROOT/$NET.${gefs_ver}} -export HOMEgfs=${HOMEgfs:-$NWROOT/$NET.${gfs_ver}} +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "wavepostbndpnt" -c "base wave wavepostsbs wavepostbndpnt" # Add default errchk = err_chk export errchk=${errchk:-err_chk} -# Create and go to DATA directory -export DATA=${DATA:-${DATAROOT:?}/${jobid}} -mkdir -p $DATA -cd $DATA - -export cyc=${cyc:-00} -export cycle=${cycle:-t${cyc}z} - -# Set PDY -setpdy.sh -. PDY - -export CDATE=$PDY$cyc - -export pgmout=OUTPUT.$$ - export MP_PULSE=0 # Path to HOME Directory -export FIXwave=${FIXwave:-$HOMEgfs/fix/fix_wave_${NET}} -export PARMwave=${PARMwave:-$HOMEgfs/parm/wave} -export USHwave=${USHwave:-$HOMEgfs/ush} -export EXECwave=${EXECwave:-$HOMEgfs/exec} +export FIXwave=${FIXwave:-${HOMEgfs}/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} +export USHwave=${USHwave:-${HOMEgfs}/ush} +export EXECwave=${EXECwave:-${HOMEgfs}/exec} # Set COM Paths and GETGES environment -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir -fi -export COMIN=${COMIN:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} - -mkdir -p $COMOUT/station +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_STATION +if [[ ! 
-d ${COM_WAVE_STATION} ]]; then mkdir -p "${COM_WAVE_STATION}"; fi -# Set wave model ID tag to include member number +# Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic membTAG='p' if [ "${waveMEMB}" == "00" ]; then membTAG='c'; fi export membTAG -export WAV_MOD_TAG=${CDUMP}wave${waveMEMB} +export WAV_MOD_TAG=${RUN}wave${waveMEMB} export CFP_VERBOSE=1 -export FHMAX_WAV_PNT=180 -if [ $FHMAX_WAV -lt $FHMAX_WAV_PNT ] ; then export FHMAX_WAV_IBP=$FHMAX_WAV ; fi +export FHMAX_WAV_PNT=${FHMAX_WAV_IBP} export DOSPC_WAV='YES' # Spectral post export DOBLL_WAV='NO' # Bulletin post -export DOBNDPNT_WAV='YES' #not boundary points +export DOBNDPNT_WAV='YES' # Do boundary points -# Execute the Script -$HOMEgfs/scripts/exgfs_wave_post_pnt.sh +# Execute the Script +${HOMEgfs}/scripts/exgfs_wave_post_pnt.sh err=$? -if [ $err -ne 0 ]; then +if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GWES_POST failed!" exit ${err} fi @@ -94,8 +44,8 @@ fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL b/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL index 404ab14d9ef..c193a28cf7b 100755 --- a/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL +++ b/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL @@ -1,95 +1,46 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "wavepostbndpntbll" -c "base wave wavepostsbs wavepostbndpntbll" -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -configs="base wave wavepostsbs wavepostbndpnt" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . 
$config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env wavepostbndpntbll -status=$? -[[ $status -ne 0 ]] && exit $status - -# PATH for working directory -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} - -export HOMEgefs=${HOMEgefs:-$NWROOT/$NET.${gefs_ver}} -export HOMEgfs=${HOMEgfs:-$NWROOT/$NET.${gfs_ver}} +export COMPONENT="wave" # Add default errchk = err_chk export errchk=${errchk:-err_chk} -# Create and go to DATA directory -export DATA=${DATA:-${DATAROOT:?}/${jobid}} -mkdir -p $DATA -cd $DATA - -export cyc=${cyc:-00} -export cycle=${cycle:-t${cyc}z} - -# Set PDY -setpdy.sh -. PDY - -export CDATE=$PDY$cyc - -export pgmout=OUTPUT.$$ +export CDATE=${PDY}${cyc} export MP_PULSE=0 # Path to HOME Directory -export FIXwave=${FIXwave:-$HOMEgfs/fix/fix_wave_${NET}} -export PARMwave=${PARMwave:-$HOMEgfs/parm/wave} -export USHwave=${USHwave:-$HOMEgfs/ush} -export EXECwave=${EXECwave:-$HOMEgfs/exec} +export FIXwave=${FIXwave:-${HOMEgfs}/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} +export USHwave=${USHwave:-${HOMEgfs}/ush} +export EXECwave=${EXECwave:-${HOMEgfs}/exec} # Set COM Paths and GETGES environment -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir -fi -export COMIN=${COMIN:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} - +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_STATION -mkdir -p $COMOUT/station +if [[ ! 
-d ${COM_WAVE_STATION} ]]; then mkdir -p "${COM_WAVE_STATION}"; fi - -# Set wave model ID tag to include member number -# if ensemble; waveMEMB var empty in deterministic # Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic membTAG='p' if [ "${waveMEMB}" == "00" ]; then membTAG='c'; fi export membTAG -export WAV_MOD_TAG=${CDUMP}wave${waveMEMB} +export WAV_MOD_TAG=${RUN}wave${waveMEMB} export CFP_VERBOSE=1 -export FHMAX_WAV_PNT=180 -if [ $FHMAX_WAV -lt $FHMAX_WAV_PNT ] ; then export FHMAX_WAV_IBP=$FHMAX_WAV ; fi +export FHMAX_WAV_PNT=${FHMAX_WAV_IBP} export DOSPC_WAV='NO' # Spectral post export DOBLL_WAV='YES' # Bulletin post export DOBNDPNT_WAV='YES' #boundary points # Execute the Script -$HOMEgfs/scripts/exgfs_wave_post_pnt.sh +${HOMEgfs}/scripts/exgfs_wave_post_pnt.sh err=$? -if [ $err -ne 0 ]; then +if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GFS_WAVE_POST_PNT failed!" exit ${err} fi @@ -97,8 +48,8 @@ fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_WAVE_POST_PNT b/jobs/JGLOBAL_WAVE_POST_PNT index acde66e7a58..3ee1d56eefb 100755 --- a/jobs/JGLOBAL_WAVE_POST_PNT +++ b/jobs/JGLOBAL_WAVE_POST_PNT @@ -1,81 +1,30 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -configs="base wave wavepostsbs wavepostpnt" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-${NWROOT:-}/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? 
- [[ $status -ne 0 ]] && exit $status -done - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env wavepostpnt -status=$? -[[ $status -ne 0 ]] && exit $status - -# PATH for working directory -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} - -export HOMEgefs=${HOMEgefs:-${NWROOT:-}/$NET.${gefs_ver:-}} -export HOMEgfs=${HOMEgfs:-${NWROOT:-}/$NET.${gfs_ver}} +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "wavepostpnt" -c "base wave wavepostsbs wavepostpnt" # Add default errchk = err_chk export errchk=${errchk:-err_chk} -# Create and go to DATA directory -export DATA=${DATA:-${DATAROOT:?}/${jobid}} -mkdir -p $DATA -cd $DATA - -export cyc=${cyc:-00} -export cycle=${cycle:-t${cyc}z} - -# Set PDY -setpdy.sh -. ./PDY - -export CDATE=$PDY$cyc - -export pgmout=OUTPUT.$$ - export MP_PULSE=0 # Path to HOME Directory -export FIXwave=${FIXwave:-$HOMEgfs/fix/fix_wave_${NET}} -export PARMwave=${PARMwave:-$HOMEgfs/parm/wave} -export USHwave=${USHwave:-$HOMEgfs/ush} -export EXECwave=${EXECwave:-$HOMEgfs/exec} +export FIXwave=${FIXwave:-${HOMEgfs}/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} +export USHwave=${USHwave:-${HOMEgfs}/ush} +export EXECwave=${EXECwave:-${HOMEgfs}/exec} # Set COM Paths and GETGES environment -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir -fi -export COMIN=${COMIN:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} - -mkdir -p $COMOUT/station +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_STATION +if [[ ! 
-d ${COM_WAVE_STATION} ]]; then mkdir -p "${COM_WAVE_STATION}"; fi -# Set wave model ID tag to include member number -# if ensemble; waveMEMB var empty in deterministic # Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic membTAG='p' if [ "${waveMEMB}" == "00" ]; then membTAG='c'; fi export membTAG -export WAV_MOD_TAG=${CDUMP}wave${waveMEMB} +export WAV_MOD_TAG=${RUN}wave${waveMEMB} export CFP_VERBOSE=1 @@ -85,10 +34,10 @@ export DOBLL_WAV='YES' # Bulletin post export DOBNDPNT_WAV='NO' #not boundary points -# Execute the Script -$HOMEgfs/scripts/exgfs_wave_post_pnt.sh +# Execute the Script +${HOMEgfs}/scripts/exgfs_wave_post_pnt.sh err=$? -if [ $err -ne 0 ]; then +if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GWES_POST failed!" exit ${err} fi @@ -96,8 +45,8 @@ fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_WAVE_POST_SBS b/jobs/JGLOBAL_WAVE_POST_SBS index 868cf8b2424..47e7063db44 100755 --- a/jobs/JGLOBAL_WAVE_POST_SBS +++ b/jobs/JGLOBAL_WAVE_POST_SBS @@ -1,101 +1,49 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -configs="base wave wavepostsbs" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-${NWROOT:-}/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env wavepostsbs -status=$? 
-[[ $status -ne 0 ]] && exit $status - -# PATH for working directory -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} - -export HOMEgefs=${HOMEgefs:-${NWROOT:-}/$NET.${gefs_ver:-}} -export HOMEgfs=${HOMEgfs:-${NWROOT:-}/$NET.${gfs_ver}} +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "wavepostsbs" -c "base wave wavepostsbs" # Add default errchk = err_chk export errchk=${errchk:-err_chk} -# Create and go to DATA directory -export DATA=${DATA:-${DATAROOT:?}/${jobid}} -mkdir -p $DATA -cd $DATA - -export cyc=${cyc:-00} -export cycle=${cycle:-t${cyc}z} - -# Set PDY -setpdy.sh -. ./PDY - -export CDATE=$PDY$cyc - -export pgmout=OUTPUT.$$ - export MP_PULSE=0 # Path to HOME Directory -export FIXwave=${FIXwave:-$HOMEgfs/fix/fix_wave_${NET}} -export PARMwave=${PARMwave:-$HOMEgfs/parm/wave} -export USHwave=${USHwave:-$HOMEgfs/ush} -export EXECwave=${EXECwave:-$HOMEgfs/exec} +export FIXwave=${FIXwave:-${HOMEgfs}/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} +export USHwave=${USHwave:-${HOMEgfs}/ush} +export EXECwave=${EXECwave:-${HOMEgfs}/exec} # Set COM Paths and GETGES environment -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir -fi -export COMIN=${COMIN:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} - -export COMINice=${COMINice:-${COMROOTp2:-${COMROOT}}/omb/prod} -export COMINwnd=${COMINwnd:-${COMROOT}/gfs/prod} -export COMIN_WAV_CUR=${COMIN_WAV_CUR:-${COMROOTp2:-${COMROOT}}/rtofs/prod} +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_GRID -mkdir -p $COMOUT/gridded +mkdir -p "${COM_WAVE_GRID}" -# Set wave model ID tag to include member number +# Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic # Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic membTAG='p' if [ 
"${waveMEMB}" == "00" ]; then membTAG='c'; fi export membTAG -export WAV_MOD_TAG=${CDUMP}wave${waveMEMB} +export WAV_MOD_TAG=${RUN}wave${waveMEMB} export CFP_VERBOSE=1 -# Execute the Script -$HOMEgfs/scripts/exgfs_wave_post_gridded_sbs.sh +# Execute the Script +${HOMEgfs}/scripts/exgfs_wave_post_gridded_sbs.sh err=$? -if [ $err -ne 0 ]; then +if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GWES_POST failed!" - exit $err + exit ${err} fi ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/JGLOBAL_WAVE_PRDGEN_BULLS b/jobs/JGLOBAL_WAVE_PRDGEN_BULLS index 617217dfac9..794258e756e 100755 --- a/jobs/JGLOBAL_WAVE_PRDGEN_BULLS +++ b/jobs/JGLOBAL_WAVE_PRDGEN_BULLS @@ -1,59 +1,36 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -# Set PDY - setpdy.sh - . 
PDY - -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} -export HOMEgfs=${HOMEgfs:-$(dirname $(dirname $0))} # parent directory of current job card +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "waveawipsbulls" -c "base wave waveawipsbulls" # Add default errchk = err_chk export errchk=${errchk:-err_chk} ################################### # Set COM Paths -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT} -export PCOM=${PCOM:-${COMOUT}/wmo} - +################################### export SENDCOM=${SENDCOM:-YES} export SENDDBN_NTC=${SENDDBN_NTC:-YES} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_STATION COM_WAVE_WMO -if [ $SENDCOM = YES ]; then - mkdir -p $COMOUT $PCOM -fi - +if [[ ! -d ${COM_WAVE_WMO} ]]; then mkdir -p "${COM_WAVE_WMO}"; fi ################################### -# Execute the Script +# Execute the Script -$HOMEgfs/scripts/exgfs_wave_prdgen_bulls.sh +${HOMEgfs}/scripts/exgfs_wave_prdgen_bulls.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + ################################### # Remove temp directories - -if [ "$KEEPDATA" != "YES" ]; then - cd $DATAROOT - rm -rf $DATA +cd ${DATAROOT} +if [ "${KEEPDATA}" != "YES" ]; then + rm -rf ${DATA} fi exit 0 - diff --git a/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED b/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED index 45cea6d4e2c..a2134461da2 100755 --- a/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED +++ b/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED @@ -1,25 +1,7 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export DATA=${DATA:-${DATAROOT}/${jobid:?}} -mkdir -p $DATA -cd $DATA - -###################################### -# Set up the cycle variable -###################################### -export cycle=${cycle:-t${cyc}z} - -# Set PDY - setpdy.sh - . PDY - -# PATH for working directory -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} -export HOMEgfs=${HOMEgfs:-$(dirname $(dirname $0))} # parent directory of current job card +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "waveawipsgridded" -c "base wave waveawipsgridded" # Add default errchk = err_chk export errchk=${errchk:-err_chk} @@ -27,33 +9,32 @@ export errchk=${errchk:-err_chk} ################################### # Set COM Paths ################################### -export COMIN=${COMIN:-$(compath.py ${NET}/${envir}/${RUN}.${PDY})/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${COMROOT}/${NET}/${envir}/${RUN}.${PDY}/${cyc}/$COMPONENT} -export PCOM=${PCOM:-${COMOUT}/wmo} - - export SENDCOM=${SENDCOM:-YES} export SENDDBN_NTC=${SENDDBN_NTC:-YES} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} +YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_GRID COM_WAVE_WMO -if [ $SENDCOM = YES ]; then - mkdir -p $COMOUT $PCOM +if [[ ! -d ${COM_WAVE_WMO} ]]; then mkdir -p "${COM_WAVE_WMO}"; fi + +if [ ${SENDCOM} = YES ]; then + mkdir -p "${COM_WAVE_WMO}" fi ################################### -# Execute the Script +# Execute the Script ################################### -$HOMEgfs/scripts/exgfs_wave_prdgen_gridded.sh +${HOMEgfs}/scripts/exgfs_wave_prdgen_gridded.sh status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + ################################### # Remove temp directories ################################### -if [ "$KEEPDATA" != "YES" ]; then - cd $DATAROOT - rm -rf $DATA +cd ${DATAROOT} +if [ "${KEEPDATA}" != "YES" ]; then + rm -rf ${DATA} fi diff --git a/jobs/JGLOBAL_WAVE_PREP b/jobs/JGLOBAL_WAVE_PREP index 5878e36444f..5ff48d886c6 100755 --- a/jobs/JGLOBAL_WAVE_PREP +++ b/jobs/JGLOBAL_WAVE_PREP @@ -1,53 +1,15 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -export RUN_ENVIR=${RUN_ENVIR:-"nco"} - -############################# -# Source relevant config files -############################# -configs="base wave waveprep" -export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env waveprep -status=$? -[[ $status -ne 0 ]] && exit $status - -# PATH for working directory -export NET=${NET:-gfs} -export RUN=${RUN:-gfs} -export COMPONENT=${COMPONENT:-wave} - -export HOMEgfs=${HOMEgfs:-$NWROOT/gfs.${gfs_ver}} +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "waveprep" -c "base wave waveprep" # Add default errchk = err_chk export errchk=${errchk:-err_chk} -# Create and go to DATA directory -export DATA=${DATA:-${DATAROOT:?}/${jobid}} -mkdir -p $DATA -cd $DATA +export CDUMP=${RUN/enkf} -cyc=${cyc:-00} -export cycle=${cycle:-t${cyc}z} - -# Set PDY -setpdy.sh -. 
./PDY # Set rtofs PDY -export RPDY=$PDY - -export pgmout=OUTPUT.$$ +export RPDY=${PDY} export MP_PULSE=0 @@ -55,50 +17,24 @@ export MP_PULSE=0 export CDO=${CDO_ROOT}/bin/cdo # Path to HOME Directory -export FIXwave=${FIXwave:-$HOMEgfs/fix/fix_wave_${NET}} -export PARMwave=${PARMwave:-$HOMEgfs/parm/wave} -export USHwave=${USHwave:-$HOMEgfs/ush} -export EXECwave=${EXECwave:-$HOMEgfs/exec} +export FIXwave=${FIXwave:-${HOMEgfs}/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} +export USHwave=${USHwave:-${HOMEgfs}/ush} +export EXECwave=${EXECwave:-${HOMEgfs}/exec} # Set COM Paths and GETGES environment -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir -fi -export COMIN=${COMIN:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} -export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/$COMPONENT} -[[ ! -d $COMOUT ]] && mkdir -m 775 -p $COMOUT - -if [ $RUN_ENVIR = "nco" ]; then - export COMIN_WAV_ICE=${COMIN_WAV_ICE:-$(compath.py gfs/prod)}/${CDUMP}.${PDY}/${cyc}/atmos - export COMIN_WAV_RTOFS=${COMIN_WAV_RTOFS:-$(compath.py ${WAVECUR_DID}/prod)} -else - if [ $WW3CURINP = "YES" ]; then - if [ ! -d $DMPDIR/${WAVECUR_DID}.${RPDY} ]; then export RPDY=$($NDATE -24 ${PDY}00 | cut -c1-8); fi - if [ ! -L $ROTDIR/${WAVECUR_DID}.${RPDY} ]; then # Check if symlink already exists in ROTDIR - $NLN $DMPDIR/${WAVECUR_DID}.${RPDY} $ROTDIR/${WAVECUR_DID}.${RPDY} - fi - BRPDY=$($NDATE -24 ${RPDY}00 | cut -c1-8) - if [ ! -L $ROTDIR/${WAVECUR_DID}.${BRPDY} ]; then # Check if symlink already exists in ROTDIR - $NLN $DMPDIR/${WAVECUR_DID}.${BRPDY} $ROTDIR/${WAVECUR_DID}.${BRPDY} - fi - export COMIN_WAV_RTOFS=${COMIN_WAV_RTOFS:-$ROTDIR} - fi - if [ $WW3ICEINP = "YES" ]; then - if [ ! 
-L $ROTDIR/${CDUMP}.${PDY}/${cyc}/atmos/${WAVICEFILE} ]; then # Check if symlink already exists in ROTDIR - $NLN $DMPDIR/$CDUMP.${PDY}/$cyc/atmos/${WAVICEFILE} $ROTDIR/$CDUMP.${PDY}/$cyc/atmos/${WAVICEFILE} - fi - export COMIN_WAV_ICE=${COMIN_WAV_ICE:-$ROTDIR/$RUN.$PDY/$cyc/atmos} - fi -fi +YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_WAVE_PREP +generate_com -rx COM_RTOFS +[[ ! -d ${COM_WAVE_PREP} ]] && mkdir -m 775 -p "${COM_WAVE_PREP}" -# Execute the Script -$HOMEgfs/scripts/exgfs_wave_prep.sh +# Execute the Script +${HOMEgfs}/scripts/exgfs_wave_prep.sh ########################################## # Remove the Temporary working directory ########################################## -cd $DATAROOT -[[ $KEEPDATA = "NO" ]] && rm -rf $DATA +cd ${DATAROOT} +[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} exit 0 diff --git a/jobs/rocoto/aeroanlfinal.sh b/jobs/rocoto/aeroanlfinal.sh new file mode 100755 index 00000000000..8f5a445de49 --- /dev/null +++ b/jobs/rocoto/aeroanlfinal.sh @@ -0,0 +1,23 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +export job="aeroanlfinal" +export jobid="${job}.$$" + +############################################################### +# setup python path for workflow utilities and tasks +pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src" +PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}" +export PYTHONPATH +############################################################### +# Execute the JJOB +"${HOMEgfs}/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE" +status=$? +exit "${status}" diff --git a/jobs/rocoto/aeroanlinit.sh b/jobs/rocoto/aeroanlinit.sh new file mode 100755 index 00000000000..4e3d32ff9f6 --- /dev/null +++ b/jobs/rocoto/aeroanlinit.sh @@ -0,0 +1,24 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +export job="aeroanlinit" +export jobid="${job}.$$" + +############################################################### +# setup python path for workflow utilities and tasks +pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src" +PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}" +export PYTHONPATH + +############################################################### +# Execute the JJOB +"${HOMEgfs}/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE" +status=$? +exit "${status}" diff --git a/jobs/rocoto/aeroanlrun.sh b/jobs/rocoto/aeroanlrun.sh new file mode 100755 index 00000000000..0ec2fb84376 --- /dev/null +++ b/jobs/rocoto/aeroanlrun.sh @@ -0,0 +1,24 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +export job="aeroanlrun" +export jobid="${job}.$$" + +############################################################### +# setup python path for workflow utilities and tasks +pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src" +PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}" +export PYTHONPATH + +############################################################### +# Execute the JJOB +"${HOMEgfs}/jobs/JGLOBAL_AERO_ANALYSIS_RUN" +status=$? +exit "${status}" diff --git a/jobs/rocoto/anal.sh b/jobs/rocoto/anal.sh index cd7fdc932a1..d99152ef19a 100755 --- a/jobs/rocoto/anal.sh +++ b/jobs/rocoto/anal.sh @@ -1,16 +1,19 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="anal" +export jobid="${job}.$$" ############################################################### # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_ATMOS_ANALYSIS +${HOMEgfs}/jobs/JGLOBAL_ATMOS_ANALYSIS status=$? diff --git a/jobs/rocoto/analcalc.sh b/jobs/rocoto/analcalc.sh index d80756cfc76..2e669b0163a 100755 --- a/jobs/rocoto/analcalc.sh +++ b/jobs/rocoto/analcalc.sh @@ -1,17 +1,20 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="analcalc" +export jobid="${job}.$$" ############################################################### # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_ATMOS_ANALYSIS_CALC +${HOMEgfs}/jobs/JGLOBAL_ATMOS_ANALYSIS_CALC status=$? -exit $status +exit ${status} diff --git a/jobs/rocoto/analdiag.sh b/jobs/rocoto/analdiag.sh index f9d97360c65..cd6e1113f0a 100755 --- a/jobs/rocoto/analdiag.sh +++ b/jobs/rocoto/analdiag.sh @@ -1,17 +1,20 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="analdiag" +export jobid="${job}.$$" ############################################################### # Execute the JJOB -$HOMEgfs/jobs/JGDAS_ATMOS_ANALYSIS_DIAG +${HOMEgfs}/jobs/JGDAS_ATMOS_ANALYSIS_DIAG status=$? -exit $status +exit ${status} diff --git a/jobs/rocoto/arch.sh b/jobs/rocoto/arch.sh index c9441b5a755..2f62d8b3547 100755 --- a/jobs/rocoto/arch.sh +++ b/jobs/rocoto/arch.sh @@ -1,411 +1,20 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -############################################################### -## Abstract: -## Archive driver script -## RUN_ENVIR : runtime environment (emc | nco) -## HOMEgfs : /full/path/to/workflow -## EXPDIR : /full/path/to/config/files -## CDATE : current analysis date (YYYYMMDDHH) -## CDUMP : cycle name (gdas / gfs) -## PDY : current date (YYYYMMDD) -## cyc : current cycle (HH) -############################################################### +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. "${HOMEgfs}"/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status - -############################################################### -# Source relevant configs -configs="base arch" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done - -# ICS are restarts and always lag INC by $assim_freq hours -ARCHINC_CYC=$ARCH_CYC -ARCHICS_CYC=$((ARCH_CYC-assim_freq)) -if [ $ARCHICS_CYC -lt 0 ]; then - ARCHICS_CYC=$((ARCHICS_CYC+24)) -fi - -# CURRENT CYCLE -APREFIX="${CDUMP}.t${cyc}z." 
-ASUFFIX=${ASUFFIX:-$SUFFIX} - -if [ $ASUFFIX = ".nc" ]; then - format="netcdf" -else - format="nemsio" -fi - +[[ ${status} -ne 0 ]] && exit "${status}" -# Realtime parallels run GFS MOS on 1 day delay -# If realtime parallel, back up CDATE_MOS one day -CDATE_MOS=$CDATE -if [ $REALTIME = "YES" ]; then - CDATE_MOS=$($NDATE -24 $CDATE) -fi -PDY_MOS=$(echo $CDATE_MOS | cut -c1-8) +export job="arch" +export jobid="${job}.$$" ############################################################### -# Archive online for verification and diagnostics -############################################################### - -COMIN=${COMINatmos:-"$ROTDIR/$CDUMP.$PDY/$cyc/atmos"} -cd $COMIN - -source "${HOMEgfs}/ush/file_utils.sh" - -[[ ! -d $ARCDIR ]] && mkdir -p $ARCDIR -nb_copy ${APREFIX}gsistat $ARCDIR/gsistat.${CDUMP}.${CDATE} -nb_copy ${APREFIX}pgrb2.1p00.anl $ARCDIR/pgbanl.${CDUMP}.${CDATE}.grib2 - -# Archive 1 degree forecast GRIB2 files for verification -if [ $CDUMP = "gfs" ]; then - fhmax=$FHMAX_GFS - fhr=0 - while [ $fhr -le $fhmax ]; do - fhr2=$(printf %02i $fhr) - fhr3=$(printf %03i $fhr) - nb_copy ${APREFIX}pgrb2.1p00.f$fhr3 $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 - fhr=$((10#$fhr + 10#$FHOUT_GFS )) - done -fi -if [ $CDUMP = "gdas" ]; then - flist="000 003 006 009" - for fhr in $flist; do - fname=${APREFIX}pgrb2.1p00.f${fhr} - fhr2=$(printf %02i $((10#$fhr))) - nb_copy $fname $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 - done -fi - -if [ -s avno.t${cyc}z.cyclone.trackatcfunix ]; then - PLSOT4=$(echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]') - cat avno.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE - cat avnop.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE -fi - -if [ $CDUMP = "gdas" -a -s gdas.t${cyc}z.cyclone.trackatcfunix ]; then - PLSOT4=$(echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]') - cat gdas.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > 
${ARCDIR}/atcfunix.${CDUMP}.$CDATE - cat gdasp.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE -fi - -if [ $CDUMP = "gfs" ]; then - nb_copy storms.gfso.atcf_gen.$CDATE ${ARCDIR}/. - nb_copy storms.gfso.atcf_gen.altg.$CDATE ${ARCDIR}/. - nb_copy trak.gfso.atcfunix.$CDATE ${ARCDIR}/. - nb_copy trak.gfso.atcfunix.altg.$CDATE ${ARCDIR}/. - - mkdir -p ${ARCDIR}/tracker.$CDATE/$CDUMP - blist="epac natl" - for basin in $blist; do - if [[ -f $basin ]]; then - cp -rp $basin ${ARCDIR}/tracker.$CDATE/$CDUMP - fi - done -fi - -# Archive required gaussian gfs forecast files for Fit2Obs -if [ $CDUMP = "gfs" -a $FITSARC = "YES" ]; then - VFYARC=${VFYARC:-$ROTDIR/vrfyarch} - [[ ! -d $VFYARC ]] && mkdir -p $VFYARC - mkdir -p $VFYARC/${CDUMP}.$PDY/$cyc - prefix=${CDUMP}.t${cyc}z - fhmax=${FHMAX_FITS:-$FHMAX_GFS} - fhr=0 - while [[ $fhr -le $fhmax ]]; do - fhr3=$(printf %03i $fhr) - sfcfile=${prefix}.sfcf${fhr3}${ASUFFIX} - sigfile=${prefix}.atmf${fhr3}${ASUFFIX} - nb_copy $sfcfile $VFYARC/${CDUMP}.$PDY/$cyc/ - nb_copy $sigfile $VFYARC/${CDUMP}.$PDY/$cyc/ - (( fhr = 10#$fhr + 6 )) - done -fi - - -############################################################### -# Archive data either to HPSS or locally -if [[ $HPSSARCH = "YES" || $LOCALARCH = "YES" ]]; then -############################################################### - -# --set the archiving command and create local directories, if necessary -TARCMD="htar" -if [[ $LOCALARCH = "YES" ]]; then - TARCMD="tar" - [ ! -d $ATARDIR/$CDATE ] && mkdir -p $ATARDIR/$CDATE - [ ! 
-d $ATARDIR/$CDATE_MOS -a -d $ROTDIR/gfsmos.$PDY_MOS -a $cyc -eq 18 ] && mkdir -p $ATARDIR/$CDATE_MOS -fi - -#--determine when to save ICs for warm start and forecast-only runs -SAVEWARMICA="NO" -SAVEWARMICB="NO" -SAVEFCSTIC="NO" -firstday=$($NDATE +24 $SDATE) -mm=$(echo $CDATE|cut -c 5-6) -dd=$(echo $CDATE|cut -c 7-8) -nday=$(( (10#$mm-1)*30+10#$dd )) -mod=$(($nday % $ARCH_WARMICFREQ)) -if [ $CDATE -eq $firstday -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi -if [ $CDATE -eq $firstday -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi -if [ $mod -eq 0 -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi -if [ $mod -eq 0 -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi - -if [ $ARCHICS_CYC -eq 18 ]; then - nday1=$((nday+1)) - mod1=$(($nday1 % $ARCH_WARMICFREQ)) - if [ $mod1 -eq 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi - if [ $mod1 -ne 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="NO" ; fi - if [ $CDATE -eq $SDATE -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi -fi - -mod=$(($nday % $ARCH_FCSTICFREQ)) -if [ $mod -eq 0 -o $CDATE -eq $firstday ]; then SAVEFCSTIC="YES" ; fi - - -ARCH_LIST="$COMIN/archlist" -[[ -d $ARCH_LIST ]] && rm -rf $ARCH_LIST -mkdir -p $ARCH_LIST -cd $ARCH_LIST - -$HOMEgfs/ush/hpssarch_gen.sh $CDUMP +# Execute the JJOB +"${HOMEgfs}"/jobs/JGLOBAL_ARCHIVE status=$? -if [ $status -ne 0 ]; then - echo "$HOMEgfs/ush/hpssarch_gen.sh $CDUMP failed, ABORT!" 
- exit $status -fi - -cd $ROTDIR - -if [ $CDUMP = "gfs" ]; then - - targrp_list="gfsa gfsb" - - if [ ${ARCH_GAUSSIAN:-"NO"} = "YES" ]; then - targrp_list="$targrp_list gfs_flux gfs_${format}b gfs_pgrb2b" - if [ $MODE = "cycled" ]; then - targrp_list="$targrp_list gfs_${format}a" - fi - fi - - if [ $DO_WAVE = "YES" -a "$WAVE_CDUMP" != "gdas" ]; then - targrp_list="$targrp_list gfswave" - fi - - if [ $DO_OCN = "YES" ]; then - targrp_list="$targrp_list ocn_ice_grib2_0p5 ocn_ice_grib2_0p25 ocn_2D ocn_3D ocn_xsect ocn_daily wavocn gfs_flux_1p00" - fi - - if [ $DO_ICE = "YES" ]; then - targrp_list="$targrp_list ice" - fi - - # Aerosols - if [ $DO_AERO = "YES" ]; then - for targrp in chem; do - htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar $(cat $ARCH_LIST/${targrp}.txt) - status=$? - if [ $status -ne 0 -a $CDATE -ge $firstday ]; then - echo "HTAR $CDATE ${targrp}.tar failed" - exit $status - fi - done - fi - - #for restarts - if [ $SAVEFCSTIC = "YES" ]; then - targrp_list="$targrp_list gfs_restarta" - fi - - #for downstream products - if [ $DO_BUFRSND = "YES" -o $WAFSF = "YES" ]; then - targrp_list="$targrp_list gfs_downstream" - fi - - #--save mdl gfsmos output from all cycles in the 18Z archive directory - if [ -d gfsmos.$PDY_MOS -a $cyc -eq 18 ]; then - set +e - $TARCMD -P -cvf $ATARDIR/$CDATE_MOS/gfsmos.tar ./gfsmos.$PDY_MOS - status=$? 
- if [ $status -ne 0 -a $CDATE -ge $firstday ]; then - echo "$(echo $TARCMD | tr 'a-z' 'A-Z') $CDATE gfsmos.tar failed" - exit $status - fi - ${ERR_EXIT_ON:-set -e} - fi -elif [ $CDUMP = "gdas" ]; then - - targrp_list="gdas" - - #gdaswave - if [ $DO_WAVE = "YES" ]; then - targrp_list="$targrp_list gdaswave" - fi - - if [ $SAVEWARMICA = "YES" -o $SAVEFCSTIC = "YES" ]; then - targrp_list="$targrp_list gdas_restarta" - - if [ $DO_WAVE = "YES" ]; then - targrp_list="$targrp_list gdaswave_restart" - fi - fi - - if [ $SAVEWARMICB = "YES" -o $SAVEFCSTIC = "YES" ]; then - targrp_list="$targrp_list gdas_restartb" - fi -fi - -# Turn on extended globbing options -shopt -s extglob -for targrp in $targrp_list; do - set +e - $TARCMD -P -cvf $ATARDIR/$CDATE/${targrp}.tar $(cat $ARCH_LIST/${targrp}.txt) - status=$? - if [ $status -ne 0 -a $CDATE -ge $firstday ]; then - echo "$(echo $TARCMD | tr 'a-z' 'A-Z') $CDATE ${targrp}.tar failed" - exit $status - fi - ${ERR_EXIT_ON:-set -e} -done -# Turn extended globbing back off -shopt -u extglob - -############################################################### -fi ##end of HPSS archive -############################################################### - - - -############################################################### -# Clean up previous cycles; various depths -# PRIOR CYCLE: Leave the prior cycle alone -GDATE=$($NDATE -$assim_freq $CDATE) - -# PREVIOUS to the PRIOR CYCLE -GDATE=$($NDATE -$assim_freq $GDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) - -# Remove the TMPDIR directory -COMIN="$RUNDIR/$GDATE" -[[ -d $COMIN ]] && rm -rf $COMIN - -if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then - exit 0 -fi - -# Step back every assim_freq hours and remove old rotating directories -# for successful cycles (defaults from 24h to 120h). If GLDAS is -# active, retain files needed by GLDAS update. 
Independent of GLDAS, -# retain files needed by Fit2Obs -DO_GLDAS=${DO_GLDAS:-"NO"} -GDATEEND=$($NDATE -${RMOLDEND:-24} $CDATE) -GDATE=$($NDATE -${RMOLDSTD:-120} $CDATE) -GLDAS_DATE=$($NDATE -96 $CDATE) -RTOFS_DATE=$($NDATE -48 $CDATE) -while [ $GDATE -le $GDATEEND ]; do - gPDY=$(echo $GDATE | cut -c1-8) - gcyc=$(echo $GDATE | cut -c9-10) - COMIN="$ROTDIR/${CDUMP}.$gPDY/$gcyc/atmos" - COMINwave="$ROTDIR/${CDUMP}.$gPDY/$gcyc/wave" - COMINrtofs="$ROTDIR/rtofs.$gPDY" - if [ -d $COMIN ]; then - rocotolog="$EXPDIR/logs/${GDATE}.log" - if [ -f $rocotolog ]; then - testend=$(tail -n 1 $rocotolog | grep "This cycle is complete: Success") - rc=$? - if [ $rc -eq 0 ]; then - if [ -d $COMINwave ]; then rm -rf $COMINwave ; fi - if [ -d $COMINrtofs -a $GDATE -lt $RTOFS_DATE ]; then rm -rf $COMINrtofs ; fi - if [ $CDUMP != "gdas" -o $DO_GLDAS = "NO" -o $GDATE -lt $GLDAS_DATE ]; then - if [ $CDUMP = "gdas" ]; then - for file in $(ls $COMIN |grep -v prepbufr |grep -v cnvstat |grep -v atmanl.nc); do - rm -rf $COMIN/$file - done - else - rm -rf $COMIN - fi - else - if [ $DO_GLDAS = "YES" ]; then - for file in $(ls $COMIN |grep -v sflux |grep -v RESTART |grep -v prepbufr |grep -v cnvstat |grep -v atmanl.nc); do - rm -rf $COMIN/$file - done - for file in $(ls $COMIN/RESTART |grep -v sfcanl ); do - rm -rf $COMIN/RESTART/$file - done - else - for file in $(ls $COMIN |grep -v prepbufr |grep -v cnvstat |grep -v atmanl.nc); do - rm -rf $COMIN/$file - done - fi - fi - fi - fi - fi - - # Remove any empty directories - if [ -d $COMIN ]; then - [[ ! "$(ls -A $COMIN)" ]] && rm -rf $COMIN - fi - - if [ -d $COMINwave ]; then - [[ ! 
"$(ls -A $COMINwave)" ]] && rm -rf $COMINwave - fi - - # Remove mdl gfsmos directory - if [ $CDUMP = "gfs" ]; then - COMIN="$ROTDIR/gfsmos.$gPDY" - if [ -d $COMIN -a $GDATE -lt $CDATE_MOS ]; then rm -rf $COMIN ; fi - fi - - GDATE=$($NDATE +$assim_freq $GDATE) -done - -# Remove archived gaussian files used for Fit2Obs in $VFYARC that are -# $FHMAX_FITS plus a delta before $CDATE. Touch existing archived -# gaussian files to prevent the files from being removed by automatic -# scrubber present on some machines. - -if [ $CDUMP = "gfs" ]; then - fhmax=$((FHMAX_FITS+36)) - RDATE=$($NDATE -$fhmax $CDATE) - rPDY=$(echo $RDATE | cut -c1-8) - COMIN="$VFYARC/$CDUMP.$rPDY" - [[ -d $COMIN ]] && rm -rf $COMIN - - TDATE=$($NDATE -$FHMAX_FITS $CDATE) - while [ $TDATE -lt $CDATE ]; do - tPDY=$(echo $TDATE | cut -c1-8) - tcyc=$(echo $TDATE | cut -c9-10) - TDIR=$VFYARC/$CDUMP.$tPDY/$tcyc - [[ -d $TDIR ]] && touch $TDIR/* - TDATE=$($NDATE +6 $TDATE) - done -fi - -# Remove $CDUMP.$rPDY for the older of GDATE or RDATE -GDATE=$($NDATE -${RMOLDSTD:-120} $CDATE) -fhmax=$FHMAX_GFS -RDATE=$($NDATE -$fhmax $CDATE) -if [ $GDATE -lt $RDATE ]; then - RDATE=$GDATE -fi -rPDY=$(echo $RDATE | cut -c1-8) -COMIN="$ROTDIR/$CDUMP.$rPDY" -[[ -d $COMIN ]] && rm -rf $COMIN - - -############################################################### -exit 0 +exit "${status}" diff --git a/jobs/rocoto/atmanlfinal.sh b/jobs/rocoto/atmanlfinal.sh new file mode 100755 index 00000000000..3c75c52cb08 --- /dev/null +++ b/jobs/rocoto/atmanlfinal.sh @@ -0,0 +1,23 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" +status=$? 
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="atmanlfinal"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/atmanlinit.sh b/jobs/rocoto/atmanlinit.sh
new file mode 100755
index 00000000000..7bb2587f0ba
--- /dev/null
+++ b/jobs/rocoto/atmanlinit.sh
@@ -0,0 +1,24 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="atmanlinit"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/atmanlrun.sh b/jobs/rocoto/atmanlrun.sh
new file mode 100755
index 00000000000..aad80e0b06e
--- /dev/null
+++ b/jobs/rocoto/atmanlrun.sh
@@ -0,0 +1,24 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="atmanlrun"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_ATM_ANALYSIS_RUN"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/atmensanlfinal.sh b/jobs/rocoto/atmensanlfinal.sh
new file mode 100755
index 00000000000..838e9712f87
--- /dev/null
+++ b/jobs/rocoto/atmensanlfinal.sh
@@ -0,0 +1,23 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="atmensanlfinal"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/atmensanlinit.sh b/jobs/rocoto/atmensanlinit.sh
new file mode 100755
index 00000000000..0ab78a1083a
--- /dev/null
+++ b/jobs/rocoto/atmensanlinit.sh
@@ -0,0 +1,24 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="atmensanlinit"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/atmensanlrun.sh b/jobs/rocoto/atmensanlrun.sh
new file mode 100755
index 00000000000..91efdb37686
--- /dev/null
+++ b/jobs/rocoto/atmensanlrun.sh
@@ -0,0 +1,24 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="atmensanlrun"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/awips.sh b/jobs/rocoto/awips.sh
index f8e5646aa67..f9289255f9c 100755
--- a/jobs/rocoto/awips.sh
+++ b/jobs/rocoto/awips.sh
@@ -1,6 +1,6 @@
 #!
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### ## Abstract: @@ -15,138 +15,61 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### ############################################################### -echo -echo "=============== BEGIN TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +# Source FV3GFS workflow modules +source "${HOMEgfs}/ush/load_fv3gfs_modules.sh" status=$? -[[ $status -ne 0 ]] && exit $status +(( status != 0 )) && exit "${status}" +export job="awips" +export jobid="${job}.$$" -############################################################### -echo -echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" -configs="base awips" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done +# TODO (#1228) - This script is doing more than just calling a j-job +# Also, this forces us to call the config files here instead of the j-job +source "${HOMEgfs}/ush/jjob_header.sh" -e "awips" -c "base awips" -fhrlst=$(echo $FHRLST | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') - -############################################################### -echo -echo "=============== BEGIN TO SOURCE MACHINE RUNTIME ENVIRONMENT ===============" -. $BASE_ENV/${machine}.env awips -status=$? 
-[[ $status -ne 0 ]] && exit $status +fhrlst=$(echo ${FHRLST} | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') ############################################################### -export COMPONENT=${COMPONENT:-atmos} -export CDATEm1=$($NDATE -24 $CDATE) -export PDYm1=$(echo $CDATEm1 | cut -c1-8) - -export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export DATAROOT="$RUNDIR/$CDATE/$CDUMP/awips$FHRGRP" -[[ -d $DATAROOT ]] && rm -rf $DATAROOT -mkdir -p $DATAROOT - ################################################################################ echo echo "=============== BEGIN AWIPS ===============" -export SENDCOM="YES" -export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export PCOM="$COMOUT/wmo" -export jlogfile="$ROTDIR/logs/$CDATE/jgfs_awips.log" - -SLEEP_TIME=1800 -SLEEP_INT=5 -SLEEP_LOOP_MAX=$(expr $SLEEP_TIME / $SLEEP_INT) -for fhr in $fhrlst; do - - if [ $fhr -gt $FHMAX_GFS ]; then - echo "Nothing to process for FHR = $fhr, cycle" +for fhr in ${fhrlst}; do + if (( fhr > FHMAX_GFS )); then + echo "Nothing to process for FHR = ${fhr}, cycle" continue fi fhmin=0 fhmax=84 - if [ $fhr -ge $fhmin -a $fhr -le $fhmax ] ; then - if [[ $(expr $fhr % 3) -eq 0 ]]; then - fhr3=$(printf %03d $((10#$fhr))) - -# Check for input file existence. If not present, sleep -# Loop SLEEP_LOOP_MAX times. Abort if not found. 
- ic=1 - while [[ $ic -le $SLEEP_LOOP_MAX ]]; do - if [ -s $COMOUT/$CDUMP.t${cyc}z.pgrb2b.0p25.f${fhr3}.idx ]; then - break - else - ic=$(expr $ic + 1) - sleep $SLEEP_INT - fi - if [ $ic -eq $SLEEP_LOOP_MAX ]; then - echo "***FATAL ERROR*** $COMOUT/$CDUMP.t${cyc}z.pgrb2b.0p25.f${fhr3}.idx NOT available" - export err=9 - err_chk - fi - done - - export fcsthrs=$fhr3 - export job="jgfs_awips_f${fcsthrs}_20km_${cyc}" - export DATA="${DATAROOT}/$job" - $AWIPS20SH - fi - - if [[ $(expr $fhr % 6) -eq 0 ]]; then - export job="jgfs_awips_f${fcsthrs}_${cyc}" - export DATA="${DATAROOT}/$job" - $AWIPSG2SH - fi + if (( fhr >= fhmin && fhr <= fhmax )); then + if ((fhr % 3 == 0)); then + fhr3=$(printf %03d $((10#${fhr}))) + export fcsthrs=${fhr3} + ${AWIPS20SH} + fi + + if ((fhr % 6 == 0)); then + ${AWIPSG2SH} + fi fi fhmin=90 fhmax=240 - if [ $fhr -ge $fhmin -a $fhr -le $fhmax ]; then - - if [[ $(expr $fhr % 6) -eq 0 ]]; then - fhr3=$(printf %03i $fhr) - -# Check for input file existence. If not present, sleep -# Loop SLEEP_LOOP_MAX times. Abort if not found. 
- ic=1 - while [[ $ic -le $SLEEP_LOOP_MAX ]]; do - if [ -s $COMOUT/$CDUMP.t${cyc}z.pgrb2b.0p25.f${fhr3}.idx ]; then - break - else - ic=$(expr $ic + 1) - sleep $SLEEP_INT - fi - if [ $ic -eq $SLEEP_LOOP_MAX ]; then - echo "***FATAL ERROR*** $COMOUT/$CDUMP.t${cyc}z.pgrb2b.0p25.f${fhr3}.idx NOT available" - export err=9 - err_chk - fi - done - - export fcsthrs=$fhr3 - export job="jgfs_awips_f${fcsthrs}_20km_${cyc}" - export DATA="${DATAROOT}/$job" - $AWIPS20SH - - export job="jgfs_awips_f${fcsthrs}_${cyc}" - export DATA="${DATAROOT}/$job" - $AWIPSG2SH - fi + if (( fhr >= fhmin && fhr <= fhmax )); then + if ((fhr % 6 == 0)); then + fhr3=$(printf %03i $((10#${fhr}))) + export fcsthrs=${fhr3} + ${AWIPS20SH} + ${AWIPSG2SH} + fi fi done ############################################################### # Force Exit out cleanly -if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi - +if [[ ${KEEPDATA:-"NO"} == "NO" ]] ; then rm -rf "${DATA}" ; fi exit 0 diff --git a/jobs/rocoto/coupled_ic.sh b/jobs/rocoto/coupled_ic.sh index 1be2a216b5c..bedb7e5272f 100755 --- a/jobs/rocoto/coupled_ic.sh +++ b/jobs/rocoto/coupled_ic.sh @@ -4,11 +4,9 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### ## Abstract: -## Create FV3 initial conditions from GFS intitial conditions -## RUN_ENVIR : runtime environment (emc | nco) +## Copy initial conditions from BASE_CPLIC to ROTDIR for coupled forecast-only runs ## HOMEgfs : /full/path/to/workflow ## EXPDIR : /full/path/to/config/files -## CDATE : current date (YYYYMMDDHH) ## CDUMP : cycle name (gdas / gfs) ## PDY : current date (YYYYMMDD) ## cyc : current cycle (HH) @@ -16,101 +14,123 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} err=0 ############################################################### # Source relevant configs configs="base coupled_ic wave" -for config in $configs; do - . $EXPDIR/config.${config} +for config in ${configs}; do + . ${EXPDIR}/config.${config} status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} done ############################################################### # Source machine runtime environment -. $BASE_ENV/${machine}.env config.coupled_ic +. ${BASE_ENV}/${machine}.env config.coupled_ic status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} -# Create ICSDIR if needed -[[ ! -d $ICSDIR/$CDATE ]] && mkdir -p $ICSDIR/$CDATE -[[ ! -d $ICSDIR/$CDATE/atmos ]] && mkdir -p $ICSDIR/$CDATE/atmos -[[ ! -d $ICSDIR/$CDATE/ocn ]] && mkdir -p $ICSDIR/$CDATE/ocn -[[ ! -d $ICSDIR/$CDATE/ice ]] && mkdir -p $ICSDIR/$CDATE/ice +############################################################### +# Locally scoped variables and functions +GDATE=$(date -d "${PDY} ${cyc} - ${assim_freq} hours" +%Y%m%d%H) +gPDY="${GDATE:0:8}" +gcyc="${GDATE:8:2}" + +error_message(){ + echo "FATAL ERROR: Unable to copy ${1} to ${2} (Error code ${3})" +} -if [ $ICERES = '025' ]; then - ICERESdec="0.25" -fi -if [ $ICERES = '050' ]; then - ICERESdec="0.50" -fi +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_INPUT COM_ICE_RESTART COM_WAVE_RESTART +YMD=${gPDY} HH=${gcyc} generate_com -rx COM_OCEAN_RESTART -# Setup ATM initial condition files -cp -r $BASE_CPLIC/$CPL_ATMIC/$CDATE/$CDUMP/* $ICSDIR/$CDATE/atmos/ +############################################################### +# Start staging + +# Stage the FV3 initial conditions to ROTDIR (cold start) +ATMdir="${COM_ATMOS_INPUT}" +[[ ! 
-d "${ATMdir}" ]] && mkdir -p "${ATMdir}" +source="${BASE_CPLIC}/${CPL_ATMIC}/${PDY}${cyc}/${CDUMP}/${CASE}/INPUT/gfs_ctrl.nc" +target="${ATMdir}/gfs_ctrl.nc" +${NCP} "${source}" "${target}" rc=$? -if [[ $rc -ne 0 ]] ; then - echo "FATAL: Unable to copy $BASE_CPLIC/$CPL_ATMIC/$CDATE/$CDUMP/* to $ICSDIR/$CDATE/atmos/ (Error code $rc)" -fi +[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}" err=$((err + rc)) +for ftype in gfs_data sfc_data; do + for tt in $(seq 1 6); do + source="${BASE_CPLIC}/${CPL_ATMIC}/${PDY}${cyc}/${CDUMP}/${CASE}/INPUT/${ftype}.tile${tt}.nc" + target="${ATMdir}/${ftype}.tile${tt}.nc" + ${NCP} "${source}" "${target}" + rc=$? + [[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}" + err=$((err + rc)) + done +done - -# Setup Ocean IC files -cp -r $BASE_CPLIC/$CPL_OCNIC/$CDATE/ocn/$OCNRES/MOM*.nc $ICSDIR/$CDATE/ocn/ +# Stage ocean initial conditions to ROTDIR (warm start) +OCNdir="${COM_OCEAN_RESTART}" +[[ ! -d "${OCNdir}" ]] && mkdir -p "${OCNdir}" +source="${BASE_CPLIC}/${CPL_OCNIC}/${PDY}${cyc}/ocn/${OCNRES}/MOM.res.nc" +target="${OCNdir}/${PDY}.${cyc}0000.MOM.res.nc" +${NCP} "${source}" "${target}" rc=$? -if [[ $rc -ne 0 ]] ; then - echo "FATAL: Unable to copy $BASE_CPLIC/$CPL_OCNIC/$CDATE/ocn/$OCNRES/MOM*.nc to $ICSDIR/$CDATE/ocn/ (Error code $rc)" -fi +[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}" err=$((err + rc)) - -#Setup Ice IC files -cp $BASE_CPLIC/$CPL_ICEIC/$CDATE/ice/$ICERES/cice5_model_${ICERESdec}.res_$CDATE.nc $ICSDIR/$CDATE/ice/cice_model_${ICERESdec}.res_$CDATE.nc +case $OCNRES in + "025") + for nn in $(seq 1 4); do + source="${BASE_CPLIC}/${CPL_OCNIC}/${PDY}${cyc}/ocn/${OCNRES}/MOM.res_${nn}.nc" + if [[ -f "${source}" ]]; then + target="${OCNdir}/${PDY}.${cyc}0000.MOM.res_${nn}.nc" + ${NCP} "${source}" "${target}" + rc=$? 
+ [[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}" + err=$((err + rc)) + fi + done + ;; + *) + echo "FATAL ERROR: Unsupported ocean resolution ${OCNRES}" + rc=1 + err=$((err + rc)) + ;; +esac + +# Stage ice initial conditions to ROTDIR (cold start as these are SIS2 generated) +ICEdir="${COM_ICE_RESTART}" +[[ ! -d "${ICEdir}" ]] && mkdir -p "${ICEdir}" +ICERESdec=$(echo "${ICERES}" | awk '{printf "%0.2f", $1/100}') +source="${BASE_CPLIC}/${CPL_ICEIC}/${PDY}${cyc}/ice/${ICERES}/cice5_model_${ICERESdec}.res_${PDY}${cyc}.nc" +target="${ICEdir}/${PDY}.${cyc}0000.cice_model.res.nc" +${NCP} "${source}" "${target}" rc=$? -if [[ $rc -ne 0 ]] ; then - echo "FATAL: Unable to copy $BASE_CPLIC/$CPL_ICEIC/$CDATE/ice/$ICERES/cice5_model_${ICERESdec}.res_$CDATE.nc to $ICSDIR/$CDATE/ice/cice_model_${ICERESdec}.res_$CDATE.nc (Error code $rc)" -fi +[[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}" err=$((err + rc)) -if [ $DO_WAVE = "YES" ]; then - [[ ! -d $ICSDIR/$CDATE/wav ]] && mkdir -p $ICSDIR/$CDATE/wav - for grdID in $waveGRD - do - cp $BASE_CPLIC/$CPL_WAVIC/$CDATE/wav/$grdID/*restart.$grdID $ICSDIR/$CDATE/wav/ +# Stage the WW3 initial conditions to ROTDIR (warm start; TODO: these should be placed in $RUN.$gPDY/$gcyc) +if [[ "${DO_WAVE}" = "YES" ]]; then + WAVdir="${COM_WAVE_RESTART}" + [[ ! -d "${WAVdir}" ]] && mkdir -p "${WAVdir}" + for grdID in ${waveGRD}; do # TODO: check if this is a bash array; if so adjust + source="${BASE_CPLIC}/${CPL_WAVIC}/${PDY}${cyc}/wav/${grdID}/${PDY}.${cyc}0000.restart.${grdID}" + target="${WAVdir}/${PDY}.${cyc}0000.restart.${grdID}" + ${NCP} "${source}" "${target}" rc=$? 
- if [[ $rc -ne 0 ]] ; then - echo "FATAL: Unable to copy $BASE_CPLIC/$CPL_WAVIC/$CDATE/wav/$grdID/*restart.$grdID to $ICSDIR/$CDATE/wav/ (Error code $rc)" - fi + [[ ${rc} -ne 0 ]] && error_message "${source}" "${target}" "${rc}" err=$((err + rc)) done fi -# Stage the FV3 initial conditions to ROTDIR -export OUTDIR="$ICSDIR/$CDATE/atmos/$CASE/INPUT" -COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/atmos" -[[ ! -d $COMOUT ]] && mkdir -p $COMOUT -cd $COMOUT || exit 99 -rm -rf INPUT -$NLN $OUTDIR . - -#Stage the WW3 initial conditions to ROTDIR -if [ $DO_WAVE = "YES" ]; then - export OUTDIRw="$ICSDIR/$CDATE/wav" - COMOUTw="$ROTDIR/$CDUMP.$PDY/$cyc/wave/restart" - [[ ! -d $COMOUTw ]] && mkdir -p $COMOUTw - cd $COMOUTw || exit 99 - $NLN $OUTDIRw/* . +############################################################### +# Check for errors and exit if any of the above failed +if [[ "${err}" -ne 0 ]] ; then + echo "FATAL ERROR: Unable to copy ICs from ${BASE_CPLIC} to ${ROTDIR}; ABORT!" + exit "${err}" fi -if [[ $err -ne 0 ]] ; then - echo "Fatal Error: ICs are not properly set-up" - exit $err -fi - ############################################################## # Exit cleanly - - exit 0 diff --git a/jobs/rocoto/earc.sh b/jobs/rocoto/earc.sh index 8b80b4b9e88..c4c73416988 100755 --- a/jobs/rocoto/earc.sh +++ b/jobs/rocoto/earc.sh @@ -1,229 +1,20 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" - -############################################################### -## Abstract: -## Ensemble archive driver script -## RUN_ENVIR : runtime environment (emc | nco) -## HOMEgfs : /full/path/to/workflow -## EXPDIR : /full/path/to/config/files -## CDATE : current analysis date (YYYYMMDDHH) -## PDY : current date (YYYYMMDD) -## cyc : current cycle (HH) -## CDUMP : cycle name (gdas / gfs) -## ENSGRP : ensemble sub-group to archive (0, 1, 2, ...) 
-############################################################### +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh -status=$? -[[ $status -ne 0 ]] && exit $status - -############################################################### -# Source relevant configs -configs="base earc" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done - -export COMPONENT=${COMPONENT:-atmos} - -n=$((ENSGRP)) - -# ICS are restarts and always lag INC by $assim_freq hours. -EARCINC_CYC=$ARCH_CYC -EARCICS_CYC=$((ARCH_CYC-assim_freq)) -if [ $EARCICS_CYC -lt 0 ]; then - EARCICS_CYC=$((EARCICS_CYC+24)) -fi - -# EnKF update in GFS, GDAS or both -CDUMP_ENKF=$(echo ${EUPD_CYC:-"gdas"} | tr a-z A-Z) - -ARCH_LIST="$ROTDIR/enkf${CDUMP}.$PDY/$cyc/$COMPONENT/earc$ENSGRP" -[[ -d $ARCH_LIST ]] && rm -rf $ARCH_LIST -mkdir -p $ARCH_LIST -cd $ARCH_LIST - -$HOMEgfs/ush/hpssarch_gen.sh enkf${CDUMP} +. "${HOMEgfs}/ush/load_fv3gfs_modules.sh" status=$? -if [ $status -ne 0 ]; then - echo "$HOMEgfs/ush/hpssarch_gen.sh enkf${CDUMP} failed, ABORT!" - exit $status -fi - -cd $ROTDIR - -source "${HOMEgfs}/ush/file_utils.sh" - -################################################################### -# ENSGRP > 0 archives a group of ensemble members -firstday=$($NDATE +24 $SDATE) -if [[ $ENSGRP -gt 0 ]] && [[ $HPSSARCH = "YES" || $LOCALARCH = "YES" ]]; then - -#--set the archiving command and create local directories, if necessary - TARCMD="htar" - if [[ $LOCALARCH = "YES" ]]; then - TARCMD="tar" - [ ! 
-d $ATARDIR/$CDATE ] && mkdir -p $ATARDIR/$CDATE - fi - -#--determine when to save ICs for warm start - SAVEWARMICA="NO" - SAVEWARMICB="NO" - mm=$(echo $CDATE|cut -c 5-6) - dd=$(echo $CDATE|cut -c 7-8) - nday=$(( (mm-1)*30+dd )) - mod=$(($nday % $ARCH_WARMICFREQ)) - if [ $CDATE -eq $firstday -a $cyc -eq $EARCINC_CYC ]; then SAVEWARMICA="YES" ; fi - if [ $CDATE -eq $firstday -a $cyc -eq $EARCICS_CYC ]; then SAVEWARMICB="YES" ; fi - if [ $mod -eq 0 -a $cyc -eq $EARCINC_CYC ]; then SAVEWARMICA="YES" ; fi - if [ $mod -eq 0 -a $cyc -eq $EARCICS_CYC ]; then SAVEWARMICB="YES" ; fi - - if [ $EARCICS_CYC -eq 18 ]; then - nday1=$((nday+1)) - mod1=$(($nday1 % $ARCH_WARMICFREQ)) - if [ $mod1 -eq 0 -a $cyc -eq $EARCICS_CYC ] ; then SAVEWARMICB="YES" ; fi - if [ $mod1 -ne 0 -a $cyc -eq $EARCICS_CYC ] ; then SAVEWARMICB="NO" ; fi - if [ $CDATE -eq $SDATE -a $cyc -eq $EARCICS_CYC ] ; then SAVEWARMICB="YES" ; fi - fi - - if [ $CDATE -gt $SDATE ]; then # Don't run for first half cycle - - $TARCMD -P -cvf $ATARDIR/$CDATE/enkf${CDUMP}_grp${ENSGRP}.tar $(cat $ARCH_LIST/enkf${CDUMP}_grp${n}.txt) - status=$? - if [ $status -ne 0 -a $CDATE -ge $firstday ]; then - echo "$(echo $TARCMD | tr 'a-z' 'A-Z') $CDATE enkf${CDUMP}_grp${ENSGRP}.tar failed" - exit $status - fi - - if [ $SAVEWARMICA = "YES" -a $cyc -eq $EARCINC_CYC ]; then - $TARCMD -P -cvf $ATARDIR/$CDATE/enkf${CDUMP}_restarta_grp${ENSGRP}.tar $(cat $ARCH_LIST/enkf${CDUMP}_restarta_grp${n}.txt) - status=$? - if [ $status -ne 0 ]; then - echo "$(echo $TARCMD | tr 'a-z' 'A-Z') $CDATE enkf${CDUMP}_restarta_grp${ENSGRP}.tar failed" - exit $status - fi - fi - - if [ $SAVEWARMICB = "YES" -a $cyc -eq $EARCICS_CYC ]; then - $TARCMD -P -cvf $ATARDIR/$CDATE/enkf${CDUMP}_restartb_grp${ENSGRP}.tar $(cat $ARCH_LIST/enkf${CDUMP}_restartb_grp${n}.txt) - status=$? 
-            if [ $status -ne 0 ]; then
-                echo "$(echo $TARCMD | tr 'a-z' 'A-Z') $CDATE enkf${CDUMP}_restartb_grp${ENSGRP}.tar failed"
-                exit $status
-            fi
-        fi
-
-    fi # CDATE>SDATE
+[[ ${status} -ne 0 ]] && exit "${status}"
 
-fi
-
-
-###################################################################
-# ENSGRP 0 archives ensemble means and copy data to online archive
-if [ $ENSGRP -eq 0 ]; then
-
-    if [[ $HPSSARCH = "YES" || $LOCALARCH = "YES" ]]; then
-
-#--set the archiving command and create local directories, if necessary
-        TARCMD="htar"
-        if [[ $LOCALARCH = "YES" ]]; then
-            TARCMD="tar"
-            [ ! -d $ATARDIR/$CDATE ] && mkdir -p $ATARDIR/$CDATE
-        fi
-
-        set +e
-        $TARCMD -P -cvf $ATARDIR/$CDATE/enkf${CDUMP}.tar $(cat $ARCH_LIST/enkf${CDUMP}.txt)
-        status=$?
-        if [ $status -ne 0 -a $CDATE -ge $firstday ]; then
-            echo "$(echo $TARCMD | tr 'a-z' 'A-Z') $CDATE enkf${CDUMP}.tar failed"
-            exit $status
-        fi
-        ${ERR_EXIT_ON:-set -eu}
-    fi
-
-    #-- Archive online for verification and diagnostics
-    [[ ! -d $ARCDIR ]] && mkdir -p $ARCDIR
-    cd $ARCDIR
-
-    nb_copy $ROTDIR/enkf${CDUMP}.$PDY/$cyc/$COMPONENT/${CDUMP}.t${cyc}z.enkfstat enkfstat.${CDUMP}.$CDATE
-    nb_copy $ROTDIR/enkf${CDUMP}.$PDY/$cyc/$COMPONENT/${CDUMP}.t${cyc}z.gsistat.ensmean gsistat.${CDUMP}.${CDATE}.ensmean
-
-    if [ $CDUMP_ENKF != "GDAS" ]; then
-        nb_copy $ROTDIR/enkfgfs.$PDY/$cyc/$COMPONENT/${CDUMP}.t${cyc}z.enkfstat enkfstat.gfs.$CDATE
-        nb_copy $ROTDIR/enkfgfs.$PDY/$cyc/$COMPONENT/${CDUMP}.t${cyc}z.gsistat.ensmean gsistat.gfs.${CDATE}.ensmean
-    fi
-
-fi
-
-
-if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then
-    exit 0
-fi
-
-###############################################################
-# ENSGRP 0 also does clean-up
-if [ $ENSGRP -eq 0 ]; then
-
-    # Start start and end dates to remove
-    GDATEEND=$($NDATE -${RMOLDEND_ENKF:-24} $CDATE)
-    GDATE=$($NDATE -${RMOLDSTD_ENKF:-120} $CDATE)
-    while [ $GDATE -le $GDATEEND ]; do
-
-        gPDY=$(echo $GDATE | cut -c1-8)
-        gcyc=$(echo $GDATE | cut -c9-10)
-
-        # Loop over GDAS and GFS EnKF directories separately.
-        clist="gdas gfs"
-        for ctype in $clist; do
-            COMIN_ENS="$ROTDIR/enkf$ctype.$gPDY/$gcyc/$COMPONENT"
-            if [ -d $COMIN_ENS ]; then
-                rocotolog="$EXPDIR/logs/${GDATE}.log"
-                if [ -f $rocotolog ]; then
-                    testend=$(tail -n 1 $rocotolog | grep "This cycle is complete: Success")
-                    rc=$?
-                    if [ $rc -eq 0 ]; then
-                        # Retain f006.ens files. Remove everything else
-                        for file in $(ls $COMIN_ENS | grep -v f006.ens); do
-                            rm -rf $COMIN_ENS/$file
-                        done
-                    fi
-                fi
-            fi
-
-            # Remove empty directories
-            if [ -d $COMIN_ENS ] ; then
-                [[ ! "$(ls -A $COMIN_ENS)" ]] && rm -rf $COMIN_ENS
-            fi
-        done
-
-        # Advance to next cycle
-        GDATE=$($NDATE +$assim_freq $GDATE)
-
-    done
-
-fi
-
-# Remove enkf*.$rPDY for the older of GDATE or RDATE
-GDATE=$($NDATE -${RMOLDSTD_ENKF:-120} $CDATE)
-fhmax=$FHMAX_GFS
-RDATE=$($NDATE -$fhmax $CDATE)
-if [ $GDATE -lt $RDATE ]; then
-    RDATE=$GDATE
-fi
-rPDY=$(echo $RDATE | cut -c1-8)
-clist="gdas gfs"
-for ctype in $clist; do
-    COMIN="$ROTDIR/enkf$ctype.$rPDY"
-    [[ -d $COMIN ]] && rm -rf $COMIN
-done
+export job="earc"
+export jobid="${job}.$$"
 
 ###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGDAS_ENKF_ARCHIVE"
+status=$?
 
-exit 0
+exit "${status}"
diff --git a/jobs/rocoto/ecen.sh b/jobs/rocoto/ecen.sh
index dd4a8ac8a69..744956b1ff4 100755
--- a/jobs/rocoto/ecen.sh
+++ b/jobs/rocoto/ecen.sh
@@ -1,26 +1,27 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ###############################################################
 # Loop over groups to Execute the JJOB
-fhrlst=$(echo $FHRLST | sed -e 's/_/ /g; s/f/ /g; s/,/ /g')
-for fhr in $fhrlst; do
+fhrlst=$(echo ${FHRLST} | sed -e 's/_/ /g; s/f/ /g; s/,/ /g')
+for fhr in ${fhrlst}; do
 
-    export FHMIN_ECEN=$fhr
-    export FHMAX_ECEN=$fhr
-    export FHOUT_ECEN=$fhr
-    export job=ecen${fhr}
+    export FHMIN_ECEN=${fhr}
+    export FHMAX_ECEN=${fhr}
+    export FHOUT_ECEN=${fhr}
+    export job=ecen
+    export jobid="${job}.$$"
 
-    $HOMEgfs/jobs/JGDAS_ENKF_ECEN
+    ${HOMEgfs}/jobs/JGDAS_ENKF_ECEN
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit ${status}
 
 done
 
diff --git a/jobs/rocoto/echgres.sh b/jobs/rocoto/echgres.sh
index 3171388f6ad..5779a91f06a 100755
--- a/jobs/rocoto/echgres.sh
+++ b/jobs/rocoto/echgres.sh
@@ -1,16 +1,19 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+
+export job="echgres"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ATMOS_CHGRES_FORENKF
+${HOMEgfs}/jobs/JGDAS_ATMOS_CHGRES_FORENKF
 status=$?
 
diff --git a/jobs/rocoto/ediag.sh b/jobs/rocoto/ediag.sh
index b09a7f49637..8462edf296d 100755
--- a/jobs/rocoto/ediag.sh
+++ b/jobs/rocoto/ediag.sh
@@ -1,17 +1,20 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+
+export job="ediag"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ENKF_DIAG
+${HOMEgfs}/jobs/JGDAS_ENKF_DIAG
 status=$?
 
-exit $status
+exit ${status}
diff --git a/jobs/rocoto/efcs.sh b/jobs/rocoto/efcs.sh
index 4454ad6c8b7..46a25ac759f 100755
--- a/jobs/rocoto/efcs.sh
+++ b/jobs/rocoto/efcs.sh
@@ -1,16 +1,34 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
-status=$?
-[[ $status -ne 0 ]] && exit $status
+#. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
+#status=$?
+#[[ ${status} -ne 0 ]] && exit ${status}
+
+# TODO: clean this up
+source "${HOMEgfs}/ush/detect_machine.sh"
+set +x
+source "${HOMEgfs}/ush/module-setup.sh"
+module use "${HOMEgfs}/sorc/ufs_model.fd/tests"
+module load modules.ufs_model.lua
+# Workflow needs utilities from prod_util (setPDY.sh, ndate, etc.)
+module load prod_util
+if [[ "${MACHINE_ID}" = "wcoss2" ]]; then
+    module load cray-pals
+fi
+module list
+unset MACHINE_ID
+set_trace
+
+export job="efcs"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ENKF_FCST
+${HOMEgfs}/jobs/JGDAS_ENKF_FCST
 status=$?
 
-exit $status
+exit ${status}
diff --git a/jobs/rocoto/eobs.sh b/jobs/rocoto/eobs.sh
index f6dc275578d..95fa42cb087 100755
--- a/jobs/rocoto/eobs.sh
+++ b/jobs/rocoto/eobs.sh
@@ -1,17 +1,20 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+
+export job="eobs"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ENKF_SELECT_OBS
+${HOMEgfs}/jobs/JGDAS_ENKF_SELECT_OBS
 status=$?
 
-exit $status
+exit ${status}
diff --git a/jobs/rocoto/eomg.sh b/jobs/rocoto/eomg.sh
deleted file mode 100755
index de981c02bb4..00000000000
--- a/jobs/rocoto/eomg.sh
+++ /dev/null
@@ -1,17 +0,0 @@
-#! /usr/bin/env bash
-
-source "$HOMEgfs/ush/preamble.sh"
-
-###############################################################
-# Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
-status=$?
-[[ $status -ne 0 ]] && exit $status
-
-###############################################################
-# Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ENKF_INNOVATE_OBS
-status=$?
-
-
-exit $status
diff --git a/jobs/rocoto/epos.sh b/jobs/rocoto/epos.sh
index 1039b8ab20c..d1f890a9309 100755
--- a/jobs/rocoto/epos.sh
+++ b/jobs/rocoto/epos.sh
@@ -1,27 +1,28 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
+export job="epos"
+export jobid="${job}.$$"
+
 ###############################################################
 # Loop over groups to Execute the JJOB
-fhrlst=$(echo $FHRLST | sed -e 's/_/ /g; s/f/ /g; s/,/ /g')
+fhrlst=$(echo ${FHRLST} | sed -e 's/_/ /g; s/f/ /g; s/,/ /g')
 
-for fhr in $fhrlst; do
-
-    export FHMIN_EPOS=$fhr
-    export FHMAX_EPOS=$fhr
-    export FHOUT_EPOS=$fhr
-    export job=epos${fhr}
-
-    $HOMEgfs/jobs/JGDAS_ENKF_POST
+for fhr in ${fhrlst}; do
+
+    export FHMIN_EPOS=${fhr}
+    export FHMAX_EPOS=${fhr}
+    export FHOUT_EPOS=${fhr}
+    ${HOMEgfs}/jobs/JGDAS_ENKF_POST
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit ${status}
 
 done
 
diff --git a/jobs/rocoto/esfc.sh b/jobs/rocoto/esfc.sh
index d830c59c50e..85f44151c9c 100755
--- a/jobs/rocoto/esfc.sh
+++ b/jobs/rocoto/esfc.sh
@@ -1,17 +1,20 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+
+export job="esfc"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ENKF_SFC
+${HOMEgfs}/jobs/JGDAS_ENKF_SFC
 status=$?
 
-exit $status
+exit ${status}
diff --git a/jobs/rocoto/eupd.sh b/jobs/rocoto/eupd.sh
index d202c45aef3..3ed028f87ae 100755
--- a/jobs/rocoto/eupd.sh
+++ b/jobs/rocoto/eupd.sh
@@ -1,17 +1,20 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+
+export job="eupd"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGDAS_ENKF_UPDATE
+${HOMEgfs}/jobs/JGDAS_ENKF_UPDATE
 status=$?
 
-exit $status
+exit ${status}
diff --git a/jobs/rocoto/fcst.sh b/jobs/rocoto/fcst.sh
index d59872c60ca..512bee127f3 100755
--- a/jobs/rocoto/fcst.sh
+++ b/jobs/rocoto/fcst.sh
@@ -1,17 +1,53 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
-status=$?
-[[ $status -ne 0 ]] && exit $status
+#. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
+#status=$?
+#[[ ${status} -ne 0 ]] && exit ${status}
+
+# TODO: clean this up
+source "${HOMEgfs}/ush/detect_machine.sh"
+set +x
+source "${HOMEgfs}/ush/module-setup.sh"
+module use "${HOMEgfs}/sorc/ufs_model.fd/tests"
+module load modules.ufs_model.lua
+module load prod_util
+if [[ "${MACHINE_ID}" = "wcoss2" ]]; then
+    module load cray-pals
+fi
+if [[ "${MACHINE_ID}" = "hera" ]]; then
+    module use "/scratch2/NCEPDEV/ensemble/save/Walter.Kolczynski/modulefiles/core"
+    module load "miniconda3/4.6.14"
+    module load "gfs_workflow/1.0.0"
+# TODO: orion and wcoss2 will be uncommented when they are ready. This comment block will be removed in the next PR
+#elif [[ "${MACHINE_ID}" = "orion" ]]; then
+#    module use "/home/rmahajan/opt/global-workflow/modulefiles/core"
+#    module load "python/3.7.5"
+#    module load "gfs_workflow/1.0.0"
+#elif [[ "${MACHINE_ID}" = "wcoss2" ]]; then
+#    module load "python/3.7.5"
+fi
+module list
+unset MACHINE_ID
+set_trace
+
+###############################################################
+# exglobal_forecast.py requires the following in PYTHONPATH
+# This will be moved to a module load when ready
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src:${HOMEgfs}/ush/python/pygfs"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+export job="fcst"
+export jobid="${job}.$$"
 
 ###############################################################
 # Execute the JJOB
-$HOMEgfs/jobs/JGLOBAL_FORECAST
+${HOMEgfs}/jobs/JGLOBAL_FORECAST
 status=$?
 
-exit $status
+exit ${status}
diff --git a/jobs/rocoto/fit2obs.sh b/jobs/rocoto/fit2obs.sh
new file mode 100755
index 00000000000..d991234fbe7
--- /dev/null
+++ b/jobs/rocoto/fit2obs.sh
@@ -0,0 +1,23 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+echo
+echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ==============="
+. "${HOMEgfs}/ush/load_fv3gfs_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="fit2obs"
+export jobid="${job}.$$"
+
+###############################################################
+echo
+echo "=============== START TO RUN FIT2OBS ==============="
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGDAS_FIT2OBS"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+exit 0
diff --git a/jobs/rocoto/gempak.sh b/jobs/rocoto/gempak.sh
index 5b7f43ce47c..14950535c84 100755
--- a/jobs/rocoto/gempak.sh
+++ b/jobs/rocoto/gempak.sh
@@ -1,73 +1,17 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
-## Abstract:
-## Inline gempak driver script
-## RUN_ENVIR : runtime environment (emc | nco)
-## HOMEgfs : /full/path/to/workflow
-## EXPDIR : /full/path/to/config/files
-## CDATE : current analysis date (YYYYMMDDHH)
-## CDUMP : cycle name (gdas / gfs)
-## PDY : current date (YYYYMMDD)
-## cyc : current cycle (HH)
-###############################################################
-
-###############################################################
-echo
-echo "=============== BEGIN TO SOURCE FV3GFS WORKFLOW MODULES ==============="
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. "${HOMEgfs}/ush/load_fv3gfs_modules.sh"
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+export job="gempak"
+export jobid="${job}.$$"
 
-###############################################################
-echo
-echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ==============="
-configs="base gempak"
-for config in $configs; do
-    . $EXPDIR/config.${config}
-    status=$?
-    [[ $status -ne 0 ]] && exit $status
-done
-
+# Execute the JJOB
+${HOMEgfs}/jobs/JGFS_ATMOS_GEMPAK
 
-###############################################################
-echo
-echo "=============== BEGIN TO SOURCE MACHINE RUNTIME ENVIRONMENT ==============="
-. $BASE_ENV/${machine}.env gempak
 status=$?
-[[ $status -ne 0 ]] && exit $status
-
-###############################################################
-export COMPONENT=${COMPONENT:-atmos}
-export CDATEm1=$($NDATE -24 $CDATE)
-export PDYm1=$(echo $CDATEm1 | cut -c1-8)
-
-export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-export DATAROOT="$RUNDIR/$CDATE/$CDUMP/gempak"
-[[ -d $DATAROOT ]] && rm -rf $DATAROOT
-mkdir -p $DATAROOT
-
-
-################################################################################
-echo
-echo "=============== BEGIN GEMPAK ==============="
-export job="jgfs_gempak_${cyc}"
-export jlogfile="$ROTDIR/logs/$CDATE/$job.log"
-export DATA="${DATAROOT}/$job"
-export SENDCOM="YES"
-export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT/gempak"
-export FIXgfs="" # set blank so that GEMPAKSH defaults FIXgfs to HOMEgfs/gempak/fix
-export USHgfs="" # set blank so that GEMPAKSH defaults FIXgfs to HOMEgfs/gempak/ush
-
-$GEMPAKSH
-
-
-###############################################################
-# Force Exit out cleanly
-if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi
-
-
-exit 0
+exit ${status}
diff --git a/jobs/rocoto/getic.sh b/jobs/rocoto/getic.sh
index 4c4f98cf552..96093ec8bea 100755
--- a/jobs/rocoto/getic.sh
+++ b/jobs/rocoto/getic.sh
@@ -1,6 +1,6 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 ## Abstract:
@@ -16,40 +16,40 @@ source "$HOMEgfs/ush/preamble.sh"
 
 ###############################################################
 # Source FV3GFS workflow modules
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. "${HOMEgfs}/ush/load_fv3gfs_modules.sh"
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit "${status}"
 
 ###############################################################
 # Source relevant configs
 configs="base getic init"
-for config in $configs; do
-    . $EXPDIR/config.${config}
+for config in ${configs}; do
+    . "${EXPDIR}/config.${config}"
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit "${status}"
 done
 
 ###############################################################
 # Source machine runtime environment
-. $BASE_ENV/${machine}.env getic
+. ${BASE_ENV}/${machine}.env getic
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit "${status}"
 
 ###############################################################
 # Set script and dependency variables
-export yy=$(echo $CDATE | cut -c1-4)
-export mm=$(echo $CDATE | cut -c5-6)
-export dd=$(echo $CDATE | cut -c7-8)
-export hh=${cyc:-$(echo $CDATE | cut -c9-10)}
-export GDATE=$($NDATE -${assim_freq:-"06"} $CDATE)
-export gyy=$(echo $GDATE | cut -c1-4)
-export gmm=$(echo $GDATE | cut -c5-6)
-export gdd=$(echo $GDATE | cut -c7-8)
-export ghh=$(echo $GDATE | cut -c9-10)
+export yy="$(echo ${CDATE} | cut -c1-4)"
+export mm="$(echo ${CDATE} | cut -c5-6)"
+export dd="$(echo ${CDATE} | cut -c7-8)"
+export hh="${cyc:-$(echo ${CDATE} | cut -c9-10)}"
+export GDATE="$(${NDATE} -${assim_freq:-"06"} ${CDATE})"
+export gyy="$(echo ${GDATE} | cut -c1-4)"
+export gmm="$(echo ${GDATE} | cut -c5-6)"
+export gdd="$(echo ${GDATE} | cut -c7-8)"
+export ghh="$(echo ${GDATE} | cut -c9-10)"
 
 export DATA=${DATA:-${DATAROOT}/getic}
-export EXTRACT_DIR=${DATA:-$EXTRACT_DIR}
+export EXTRACT_DIR=${DATA:-${EXTRACT_DIR}}
 export PRODHPSSDIR=${PRODHPSSDIR:-/NCEPPROD/hpssprod/runhistory}
 export COMPONENT="atmos"
 export gfs_ver=${gfs_ver:-"v16"}
@@ -57,10 +57,9 @@ export OPS_RES=${OPS_RES:-"C768"}
 export GETICSH=${GETICSH:-${GDASINIT_DIR}/get_v16.data.sh}
 
 # Create ROTDIR/EXTRACT_DIR
-if [ ! -d $ROTDIR ]; then mkdir -p $ROTDIR ; fi
-if [ ! -d $EXTRACT_DIR ]; then mkdir -p $EXTRACT_DIR ; fi
-cd $EXTRACT_DIR
-
+if [[ ! -d ${ROTDIR} ]]; then mkdir -p "${ROTDIR}" ; fi
+if [[ ! -d ${EXTRACT_DIR} ]]; then mkdir -p "${EXTRACT_DIR}" ; fi
+cd "${EXTRACT_DIR}"
 
 # JKH: tarball name changes effective 00Z 22 Jun 2022 to "v16.2" instead of "prod"
 if [ $yy$mm$dd$hh -ge 2022062700 ]; then
   version="v16.2"
@@ -69,33 +68,33 @@ else
 fi
 
 # Check version, cold/warm start, and resolution
-if [[ $gfs_ver = "v16" && $EXP_WARM_START = ".true." && $CASE = $OPS_RES ]]; then # Pull warm start ICs - no chgres
+if [[ ${gfs_ver} = "v16" && ${EXP_WARM_START} = ".true." && ${CASE} = ${OPS_RES} ]]; then # Pull warm start ICs - no chgres
 
   # Pull RESTART files off HPSS
-  if [ ${RETRO:-"NO"} = "YES" ]; then # Retrospective parallel input
+  if [[ ${RETRO:-"NO"} = "YES" ]]; then # Retrospective parallel input
 
     # Pull prior cycle restart files
     htar -xvf ${HPSSDIR}/${GDATE}/gdas_restartb.tar
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit "${status}"
 
     # Pull current cycle restart files
     htar -xvf ${HPSSDIR}/${CDATE}/gfs_restarta.tar
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit "${status}"
 
     # Pull IAU increment files
     htar -xvf ${HPSSDIR}/${CDATE}/gfs_netcdfa.tar
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit "${status}"
 
   else # Opertional input - warm starts
 
-    cd $ROTDIR
+    cd "${ROTDIR}"
 
     # Pull CDATE gfs restart tarball
-    htar -xvf ${PRODHPSSDIR}/rh${yy}/${yy}${mm}/${yy}${mm}${dd}/com_gfs_${version}_gfs.${yy}${mm}${dd}_${hh}.gfs_restart.tar
+    htar -xvf ${PRODHPSSDIR}/rh${yy}/${yy}${mm}/${yy}${mm}${dd}/com_gfs_prod_gfs.${yy}${mm}${dd}_${hh}.gfs_restart.tar
     # Pull GDATE gdas restart tarball
-    htar -xvf ${PRODHPSSDIR}/rh${gyy}/${gyy}${gmm}/${gyy}${gmm}${gdd}/com_gfs_${version}_gdas.${gyy}${gmm}${gdd}_${ghh}.gdas_restart.tar
+    htar -xvf ${PRODHPSSDIR}/rh${gyy}/${gyy}${gmm}/${gyy}${gmm}${gdd}/com_gfs_prod_gdas.${gyy}${gmm}${gdd}_${ghh}.gdas_restart.tar
   fi
 
 else # Pull chgres cube inputs for cold start IC generation
@@ -103,7 +102,7 @@ else # Pull chgres cube inputs for cold start IC generation
 
   # Run UFS_UTILS GETICSH
   sh ${GETICSH} ${CDUMP}
   status=$?
-  [[ $status -ne 0 ]] && exit $status
+  [[ ${status} -ne 0 ]] && exit "${status}"
 
 fi
 
@@ -113,39 +112,39 @@ if [[ -d ${ROTDIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/${COMPONENT} ]]; then
 fi
 mkdir -p "${ROTDIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/${COMPONENT}"
 
-if [ $gfs_ver = v16 -a $RETRO = "YES" ]; then
+if [ ${gfs_ver} = v16 -a ${RETRO} = "YES" ]; then
   mv ${EXTRACT_DIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/* ${ROTDIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/${COMPONENT}
 else
   mv ${EXTRACT_DIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/* ${ROTDIR}/${CDUMP}.${yy}${mm}${dd}/${hh}
 fi
 
 # Pull pgbanl file for verification/archival - v14+
-if [ $gfs_ver = v14 -o $gfs_ver = v15 -o $gfs_ver = v16 ]; then
+if [ ${gfs_ver} = v14 -o ${gfs_ver} = v15 -o ${gfs_ver} = v16 ]; then
   for grid in 0p25 0p50 1p00
   do
     file=gfs.t${hh}z.pgrb2.${grid}.anl
 
-    if [ $gfs_ver = v14 ]; then # v14 production source
+    if [[ ${gfs_ver} = v14 ]]; then # v14 production source
 
-      cd $ROTDIR/${CDUMP}.${yy}${mm}${dd}/${hh}/${COMPONENT}
+      cd "${ROTDIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/${COMPONENT}"
       export tarball="gpfs_hps_nco_ops_com_gfs_prod_gfs.${yy}${mm}${dd}${hh}.pgrb2_${grid}.tar"
       htar -xvf ${PRODHPSSDIR}/rh${yy}/${yy}${mm}/${yy}${mm}${dd}/${tarball} ./${file}
 
-    elif [ $gfs_ver = v15 ]; then # v15 production source
+    elif [[ ${gfs_ver} = v15 ]]; then # v15 production source
 
-      cd $EXTRACT_DIR
+      cd "${EXTRACT_DIR}"
       export tarball="com_gfs_prod_gfs.${yy}${mm}${dd}_${hh}.gfs_pgrb2.tar"
       htar -xvf ${PRODHPSSDIR}/rh${yy}/${yy}${mm}/${yy}${mm}${dd}/${tarball} ./${CDUMP}.${yy}${mm}${dd}/${hh}/${file}
       mv ${EXTRACT_DIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/${file} ${ROTDIR}/${CDUMP}.${yy}${mm}${dd}/${hh}/${COMPONENT}/${file}
 
-    elif [ $gfs_ver = v16 ]; then # v16 - determine RETRO or production source next
+    elif [[ ${gfs_ver} = v16 ]]; then # v16 - determine RETRO or production source next
 
-      if [ $RETRO = "YES" ]; then # Retrospective parallel source
+      if [[ ${RETRO} = "YES" ]]; then # Retrospective parallel source
 
-        cd $EXTRACT_DIR
-        if [ $grid = "0p25" ]; then # anl file spread across multiple tarballs
+        cd ${EXTRACT_DIR}
+        if [[ ${grid} = "0p25" ]]; then # anl file spread across multiple tarballs
           export tarball="gfsa.tar"
-        elif [ $grid = "0p50" -o $grid = "1p00" ]; then
+        elif [ ${grid} = "0p50" -o ${grid} = "1p00" ]; then
           export tarball="gfsb.tar"
         fi
        htar -xvf ${HPSSDIR}/${yy}${mm}${dd}${hh}/${tarball} ./${CDUMP}.${yy}${mm}${dd}/${hh}/${file}
@@ -153,8 +152,8 @@ if [ $gfs_ver = v14 -o $gfs_ver = v15 -o $gfs_ver = v16 ]; then
 
       else # Production source
 
-        cd $ROTDIR
-        export tarball="com_gfs_${version}_gfs.${yy}${mm}${dd}_${hh}.gfs_pgrb2.tar"
+        cd "${ROTDIR}"
+        export tarball="com_gfs_prod_gfs.${yy}${mm}${dd}_${hh}.gfs_pgrb2.tar"
        htar -xvf ${PRODHPSSDIR}/rh${yy}/${yy}${mm}/${yy}${mm}${dd}/${tarball} ./${CDUMP}.${yy}${mm}${dd}/${hh}/atmos/${file}
 
       fi # RETRO vs production
 
@@ -166,8 +165,8 @@ fi # v14-v16 pgrb anl file pull
 
 ##########################################
 # Remove the Temporary working directory
 ##########################################
-cd $DATAROOT
-[[ $KEEPDATA = "NO" ]] && rm -rf $DATA
+cd "${DATAROOT}"
+[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}"
 
 ###############################################################
 # Exit out cleanly
diff --git a/jobs/rocoto/gldas.sh b/jobs/rocoto/gldas.sh
index db16dd883f1..8d8bb903bb0 100755
--- a/jobs/rocoto/gldas.sh
+++ b/jobs/rocoto/gldas.sh
@@ -8,6 +8,9 @@ source "$HOMEgfs/ush/preamble.sh"
 status=$?
 [[ $status -ne 0 ]] && exit $status
 
+export job="gldas"
+export jobid="${job}.$$"
+
 ###############################################################
 # Execute the JJOB. GLDAS only runs once per day.
 
diff --git a/jobs/rocoto/landanlfinal.sh b/jobs/rocoto/landanlfinal.sh
new file mode 100755
index 00000000000..a6fa48c679d
--- /dev/null
+++ b/jobs/rocoto/landanlfinal.sh
@@ -0,0 +1,23 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="landanlfinal"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_LAND_ANALYSIS_FINALIZE"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/landanlinit.sh b/jobs/rocoto/landanlinit.sh
new file mode 100755
index 00000000000..e9c0b2d7a2e
--- /dev/null
+++ b/jobs/rocoto/landanlinit.sh
@@ -0,0 +1,24 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="landanlinit"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_LAND_ANALYSIS_INITIALIZE"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/landanlrun.sh b/jobs/rocoto/landanlrun.sh
new file mode 100755
index 00000000000..3f306a32be8
--- /dev/null
+++ b/jobs/rocoto/landanlrun.sh
@@ -0,0 +1,24 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="landanlrun"
+export jobid="${job}.$$"
+
+###############################################################
+# setup python path for workflow utilities and tasks
+pygwPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/pygw/src"
+PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${pygwPATH}"
+export PYTHONPATH
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}/jobs/JGLOBAL_LAND_ANALYSIS_RUN"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/metp.sh b/jobs/rocoto/metp.sh
index 80138b90265..82254a04354 100755
--- a/jobs/rocoto/metp.sh
+++ b/jobs/rocoto/metp.sh
@@ -1,6 +1,6 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 ## Abstract:
@@ -19,62 +19,78 @@ source "$HOMEgfs/ush/preamble.sh"
 ###############################################################
 echo
 echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ==============="
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
+export job="metp${METPCASE}"
+export jobid="${job}.$$"
+
+##############################################
+# make temp directory
+##############################################
+export DATA=${DATA:-${DATAROOT}/${jobid}}
+mkdir -p ${DATA}
+cd ${DATA}
+
+
+##############################################
+# Run setpdy and initialize PDY variables
+##############################################
+export cycle="t${cyc}z"
+setpdy.sh
+. ./PDY
 
 ###############################################################
 echo
 echo "=============== START TO SOURCE RELEVANT CONFIGS ==============="
 configs="base metp"
-for config in $configs; do
-    . $EXPDIR/config.${config}
+for config in ${configs}; do
+    . ${EXPDIR}/config.${config}
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit ${status}
 done
 
 ###############################################################
 echo
 echo "=============== START TO SOURCE MACHINE RUNTIME ENVIRONMENT ==============="
-. $BASE_ENV/${machine}.env metp
+. ${BASE_ENV}/${machine}.env metp
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ###############################################################
-export COMPONENT=${COMPONENT:-atmos}
-export VDATE="$(echo $($NDATE -${VRFYBACK_HRS} $CDATE) | cut -c1-8)"
-
-export pid=${pid:-$$}
-export jobid=${job}.${pid}
-export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT"
-export DATAROOT="$RUNDIR/$CDATE/$CDUMP/metp.${jobid}"
-[[ -d $DATAROOT ]] && rm -rf $DATAROOT
-mkdir -p $DATAROOT
+export COMPONENT="atmos"
+export VDATE="$(echo $(${NDATE} -${VRFYBACK_HRS} ${CDATE}) | cut -c1-8)"
+export COMIN="${ROTDIR}/${CDUMP}.${PDY}/${cyc}/${COMPONENT}"
+# TODO: This should not be permitted as DATAROOT is set at the job-card level.
+# TODO: DATAROOT is being used as DATA in metp jobs. This should be rectified in metp.
+# TODO: The temporary directory is DATA and is created at the top of the J-Job.
+# TODO: remove this line
+export DATAROOT=${DATA}
 
 ###############################################################
 echo
 echo "=============== START TO RUN METPLUS VERIFICATION ==============="
-if [ $CDUMP = "gfs" ]; then
+if [ ${CDUMP} = "gfs" ]; then
 
-    if [ $RUN_GRID2GRID_STEP1 = "YES" -o $RUN_GRID2OBS_STEP1 = "YES" -o $RUN_PRECIP_STEP1 = "YES" ]; then
+    if [ ${RUN_GRID2GRID_STEP1} = "YES" -o ${RUN_GRID2OBS_STEP1} = "YES" -o ${RUN_PRECIP_STEP1} = "YES" ]; then
 
-        $VERIF_GLOBALSH
+        ${VERIF_GLOBALSH}
         status=$?
-        [[ $status -ne 0 ]] && exit $status
-        [[ $status -eq 0 ]] && echo "Succesfully ran $VERIF_GLOBALSH"
+        [[ ${status} -ne 0 ]] && exit ${status}
+        [[ ${status} -eq 0 ]] && echo "Succesfully ran ${VERIF_GLOBALSH}"
     fi
 fi
 
-if [ $CDUMP = "gdas" ]; then
+if [ ${CDUMP} = "gdas" ]; then
     echo "METplus verification currently not supported for CDUMP=${CDUMP}"
 fi
 
 ###############################################################
 # Force Exit out cleanly
-if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi
+if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf ${DATAROOT} ; fi # TODO: This should be $DATA
 
 exit 0
diff --git a/jobs/rocoto/ocnanalbmat.sh b/jobs/rocoto/ocnanalbmat.sh
new file mode 100755
index 00000000000..e62db9115ad
--- /dev/null
+++ b/jobs/rocoto/ocnanalbmat.sh
@@ -0,0 +1,19 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ "${status}" -ne 0 ]] && exit "${status}"
+
+export job="ocnanalbmat"
+export jobid="${job}.$$"
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}"/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT
+echo "BMAT gets run here"
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/ocnanalchkpt.sh b/jobs/rocoto/ocnanalchkpt.sh
new file mode 100755
index 00000000000..ae98bc8e882
--- /dev/null
+++ b/jobs/rocoto/ocnanalchkpt.sh
@@ -0,0 +1,18 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="ocnanalchkpt"
+export jobid="${job}.$$"
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}"/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/ocnanalpost.sh b/jobs/rocoto/ocnanalpost.sh
new file mode 100755
index 00000000000..b99a4e05caf
--- /dev/null
+++ b/jobs/rocoto/ocnanalpost.sh
@@ -0,0 +1,18 @@
+#! /usr/bin/env bash
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}"
+
+export job="ocnanalpost"
+export jobid="${job}.$$"
+
+###############################################################
+# Execute the JJOB
+"${HOMEgfs}"/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST
+status=$?
+exit "${status}"
diff --git a/jobs/rocoto/ocnanalprep.sh b/jobs/rocoto/ocnanalprep.sh
new file mode 100755
index 00000000000..3830fe1c391
--- /dev/null
+++ b/jobs/rocoto/ocnanalprep.sh
@@ -0,0 +1,19 @@
+#! /usr/bin/env bash
+
+export STRICT="NO"
+source "${HOMEgfs}/ush/preamble.sh"
+
+###############################################################
+# Source UFSDA workflow modules
+. "${HOMEgfs}/ush/load_ufsda_modules.sh"
+status=$?
+[[ ${status} -ne 0 ]] && exit "${status}" + +export job="ocnanalprep" +export jobid="${job}.$$" + +############################################################### +# Execute the JJOB +"${HOMEgfs}"/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP +status=$? +exit "${status}" diff --git a/jobs/rocoto/ocnanalrun.sh b/jobs/rocoto/ocnanalrun.sh new file mode 100755 index 00000000000..5f998af9894 --- /dev/null +++ b/jobs/rocoto/ocnanalrun.sh @@ -0,0 +1,18 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +export job="ocnanalrun" +export jobid="${job}.$$" + +############################################################### +# Execute the JJOB +"${HOMEgfs}"/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN +status=$? +exit "${status}" diff --git a/jobs/rocoto/ocnanalvrfy.sh b/jobs/rocoto/ocnanalvrfy.sh new file mode 100755 index 00000000000..d8e9bbb8050 --- /dev/null +++ b/jobs/rocoto/ocnanalvrfy.sh @@ -0,0 +1,19 @@ +#! /usr/bin/env bash + +export STRICT="NO" +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" --eva +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +export job="ocnanalvrfy" +export jobid="${job}.$$" + +############################################################### +# Execute the JJOB +"${HOMEgfs}/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY" +status=$? +exit "${status}" diff --git a/jobs/rocoto/ocnpost.sh b/jobs/rocoto/ocnpost.sh index 0f6413ec43e..ee8da061f28 100755 --- a/jobs/rocoto/ocnpost.sh +++ b/jobs/rocoto/ocnpost.sh @@ -1,159 +1,120 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### -## CICE5/MOM6 post driver script +## CICE5/MOM6 post driver script ## FHRGRP : forecast hour group to post-process (e.g. 0, 1, 2 ...) ## FHRLST : forecast hourlist to be post-process (e.g. anl, f000, f000_f001_f002, ...) ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +source "${HOMEgfs}/ush/load_fv3gfs_modules.sh" status=$? -[[ $status -ne 0 ]] && exit $status - -############################# -# Source relevant config files -############################# -configs="base ocnpost" -config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} -for config in $configs; do - . $config_path/config.$config - status=$? - [[ $status -ne 0 ]] && exit $status -done - - -########################################## -# Source machine runtime environment -########################################## -. $HOMEgfs/env/${machine}.env ocnpost -status=$? -[[ $status -ne 0 ]] && exit $status - - -############################################## -# Obtain unique process id (pid) and make temp directory -############################################## -export job=${job:-"ocnpost"} -export pid=${pid:-$$} -export outid=${outid:-"LL$job"} -export jobid=${jobid:-"${outid}.o${pid}"} - -export DATAROOT="$RUNDIR/$CDATE/$CDUMP" -[[ ! -d $DATAROOT ]] && mkdir -p $DATAROOT - -export DATA="$DATAROOT/${job}.${pid}" -# DATA dir not used for now. - -[[ -d $DATA ]] && rm -rf $DATA -mkdir -p $DATA -cd $DATA - -############################################## -# Run setpdy and initialize PDY variables -############################################## -export cycle="t${cyc}z" -setpdy.sh -. 
./PDY - -############################################## -# Define the Log File directory -############################################## -export jlogfile=${jlogfile:-$COMROOT/logs/jlogfiles/jlogfile.${job}.${pid}} - -############################################## -# Determine Job Output Name on System -############################################## -export pgmout="OUTPUT.${pid}" -export pgmerr=errfile +(( status != 0 )) && exit "${status}" +export job="ocnpost" +export jobid="${job}.$$" +source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnpost" -c "base ocnpost" ############################################## # Set variables used in the exglobal script ############################################## -export CDATE=${CDATE:-${PDY}${cyc}} -export CDUMP=${CDUMP:-${RUN:-"gfs"}} -if [ $RUN_ENVIR = "nco" ]; then - export ROTDIR=${COMROOT:?}/$NET/$envir +export CDUMP=${RUN/enkf} +if [[ ${RUN_ENVIR} = "nco" ]]; then + export ROTDIR=${COMROOT:?}/${NET}/${envir} fi ############################################## # Begin JOB SPECIFIC work ############################################## -[[ ! -d $COMOUTocean ]] && mkdir -p $COMOUTocean -[[ ! -d $COMOUTice ]] && mkdir -p $COMOUTice +YMD=${PDY} HH=${cyc} generate_com -rx COM_OCEAN_HISTORY COM_OCEAN_2D COM_OCEAN_3D \ + COM_OCEAN_XSECT COM_ICE_HISTORY -fhrlst=$(echo $FHRLST | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') +for grid in "0p50" "0p25"; do + YMD=${PDY} HH=${cyc} GRID=${grid} generate_com -rx "COM_OCEAN_GRIB_${grid}:COM_OCEAN_GRIB_TMPL" +done + +for outdir in COM_OCEAN_2D COM_OCEAN_3D COM_OCEAN_XSECT COM_OCEAN_GRIB_0p25 COM_OCEAN_GRIB_0p50; do + if [[ ! 
-d "${!outdir}" ]]; then + mkdir -p "${!outdir}" + fi +done + +fhrlst=$(echo ${FHRLST} | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') export OMP_NUM_THREADS=1 export ENSMEM=${ENSMEM:-01} -export IDATE=$CDATE - -for fhr in $fhrlst; do - export fhr=$fhr - VDATE=$($NDATE $fhr $IDATE) - # Regrid the MOM6 and CICE5 output from tripolar to regular grid via NCL - # This can take .25 degree input and convert to .5 degree - other opts avail - # The regrid scripts use CDATE for the current day, restore it to IDATE afterwards - export CDATE=$VDATE - cd $DATA - if [ $fhr -gt 0 ]; then - export MOM6REGRID=${MOM6REGRID:-$HOMEgfs} - $MOM6REGRID/scripts/run_regrid.sh - status=$? - [[ $status -ne 0 ]] && exit $status - - # Convert the netcdf files to grib2 - export executable=$MOM6REGRID/exec/reg2grb2.x - $MOM6REGRID/scripts/run_reg2grb2.sh - status=$? - [[ $status -ne 0 ]] && exit $status - - - #break up ocn netcdf into multiple files: - if [ -f $COMOUTocean/ocn_2D_$VDATE.$ENSMEM.$IDATE.nc ]; then - echo "File $COMOUTocean/ocn_2D_$VDATE.$ENSMEM.$IDATE.nc already exists" +export IDATE=${PDY}${cyc} + +for fhr in ${fhrlst}; do + export fhr=${fhr} + # Ignore possible spelling error (nothing is misspelled) + # shellcheck disable=SC2153 + VDATE=$(${NDATE} "${fhr}" "${IDATE}") + # shellcheck disable= + declare -x VDATE + cd "${DATA}" || exit 2 + if (( fhr > 0 )); then + # TODO: This portion calls NCL scripts that are deprecated (see Issue #923) + if [[ "${MAKE_OCN_GRIB:-YES}" == "YES" ]]; then + export MOM6REGRID=${MOM6REGRID:-${HOMEgfs}} + "${MOM6REGRID}/scripts/run_regrid.sh" + status=$? + [[ ${status} -ne 0 ]] && exit "${status}" + + # Convert the netcdf files to grib2 + export executable=${MOM6REGRID}/exec/reg2grb2.x + "${MOM6REGRID}/scripts/run_reg2grb2.sh" + status=$? 
+ [[ ${status} -ne 0 ]] && exit "${status}" + ${NMV} "ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25.grb2" "${COM_OCEAN_GRIB_0p25}/" + ${NMV} "ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p5x0p5.grb2" "${COM_OCEAN_GRIB_0p50}/" + fi + + #break up ocn netcdf into multiple files: + if [[ -f "${COM_OCEAN_2D}/ocn_2D_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then + echo "File ${COM_OCEAN_2D}/ocn_2D_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" else - ncks -x -v vo,uo,so,temp $COMOUTocean/ocn$VDATE.$ENSMEM.$IDATE.nc $COMOUTocean/ocn_2D_$VDATE.$ENSMEM.$IDATE.nc + ncks -x -v vo,uo,so,temp \ + "${COM_OCEAN_HISTORY}/ocn${VDATE}.${ENSMEM}.${IDATE}.nc" \ + "${COM_OCEAN_2D}/ocn_2D_${VDATE}.${ENSMEM}.${IDATE}.nc" status=$? - [[ $status -ne 0 ]] && exit $status - fi - if [ -f $COMOUTocean/ocn_3D_$VDATE.$ENSMEM.$IDATE.nc ]; then - echo "File $COMOUTocean/ocn_3D_$VDATE.$ENSMEM.$IDATE.nc already exists" - else - ncks -x -v Heat_PmE,LW,LwLatSens,MLD_003,MLD_0125,SSH,SSS,SST,SSU,SSV,SW,cos_rot,ePBL,evap,fprec,frazil,latent,lprec,lrunoff,sensible,sin_rot,speed,taux,tauy,wet_c,wet_u,wet_v $COMOUTocean/ocn$VDATE.$ENSMEM.$IDATE.nc $COMOUTocean/ocn_3D_$VDATE.$ENSMEM.$IDATE.nc + [[ ${status} -ne 0 ]] && exit "${status}" + fi + if [[ -f "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then + echo "File ${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" + else + ncks -x -v Heat_PmE,LW,LwLatSens,MLD_003,MLD_0125,SSH,SSS,SST,SSU,SSV,SW,cos_rot,ePBL,evap,fprec,frazil,latent,lprec,lrunoff,sensible,sin_rot,speed,taux,tauy,wet_c,wet_u,wet_v \ + "${COM_OCEAN_HISTORY}/ocn${VDATE}.${ENSMEM}.${IDATE}.nc" \ + "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" status=$? 
- [[ $status -ne 0 ]] && exit $status - fi - if [ -f $COMOUTocean/ocn-temp-EQ_$VDATE.$ENSMEM.$IDATE.nc ]; then - echo "File $COMOUTocean/ocn-temp-EQ_$VDATE.$ENSMEM.$IDATE.nc already exists" - else - ncks -v temp -d yh,503 -d xh,-299.92,60.03 $COMOUTocean/ocn_3D_$VDATE.$ENSMEM.$IDATE.nc $COMOUTocean/ocn-temp-EQ_$VDATE.$ENSMEM.$IDATE.nc + [[ ${status} -ne 0 ]] && exit "${status}" + fi + if [[ -f "${COM_OCEAN_XSECT}/ocn-temp-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then + echo "File ${COM_OCEAN_XSECT}/ocn-temp-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" + else + ncks -v temp -d yh,503 -d xh,-299.92,60.03 \ + "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" \ + "${COM_OCEAN_XSECT}/ocn-temp-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" status=$? - [[ $status -ne 0 ]] && exit $status - fi - if [ -f $COMOUTocean/ocn-uo-EQ_$VDATE.$ENSMEM.$IDATE.nc ]; then - echo "File $COMOUTocean/ocn-uo-EQ_$VDATE.$ENSMEM.$IDATE.nc already exists" - else - ncks -v uo -d yh,503 -d xh,-299.92,60.03 $COMOUTocean/ocn_3D_$VDATE.$ENSMEM.$IDATE.nc $COMOUTocean/ocn-uo-EQ_$VDATE.$ENSMEM.$IDATE.nc + [[ ${status} -ne 0 ]] && exit "${status}" + fi + if [[ -f "${COM_OCEAN_XSECT}/ocn-uo-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then + echo "File ${COM_OCEAN_XSECT}/ocn-uo-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" + else + ncks -v uo -d yh,503 -d xh,-299.92,60.03 \ + "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" \ + "${COM_OCEAN_XSECT}/ocn-uo-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit "${status}" fi fi - done -# Restore CDATE to what is expected -export CDATE=$IDATE -$NMV ocn_ice*.grb2 $COMOUTocean/ -status=$? 
-[[ $status -ne 0 ]] && exit $status - # clean up working folder -if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATA ; fi +if [[ ${KEEPDATA:-"NO"} = "NO" ]] ; then rm -rf "${DATA}" ; fi ############################################################### # Exit out cleanly diff --git a/jobs/rocoto/post.sh b/jobs/rocoto/post.sh index b32e8c511d7..e84b2b7b716 100755 --- a/jobs/rocoto/post.sh +++ b/jobs/rocoto/post.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### ## NCEP post driver script @@ -9,27 +9,25 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} -export COMPONENT=${COMPONENT:-atmos} +export job="post" +export jobid="${job}.$$" -if [ $FHRGRP = 'anl' ]; then +if [ ${FHRGRP} = 'anl' ]; then fhrlst="anl" - restart_file=$ROTDIR/${CDUMP}.${PDY}/${cyc}/$COMPONENT/${CDUMP}.t${cyc}z.atm else - fhrlst=$(echo $FHRLST | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') - restart_file=$ROTDIR/${CDUMP}.${PDY}/${cyc}/$COMPONENT/${CDUMP}.t${cyc}z.logf + fhrlst=$(echo ${FHRLST} | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') fi - #--------------------------------------------------------------- -for fhr in $fhrlst; do - export post_times=$fhr - $HOMEgfs/jobs/JGLOBAL_ATMOS_NCEPPOST +for fhr in ${fhrlst}; do + export post_times=${fhr} + ${HOMEgfs}/jobs/JGLOBAL_ATMOS_POST status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} done exit 0 diff --git a/jobs/rocoto/postsnd.sh b/jobs/rocoto/postsnd.sh index fadfaa6d9ed..bc274361dbe 100755 --- a/jobs/rocoto/postsnd.sh +++ b/jobs/rocoto/postsnd.sh @@ -1,20 +1,22 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} +export job="postsnd" +export jobid="${job}.$$" ############################################################### # Execute the JJOB -$HOMEgfs/jobs/JGFS_ATMOS_POSTSND +${HOMEgfs}/jobs/JGFS_ATMOS_POSTSND status=$? -exit $status +exit ${status} diff --git a/jobs/rocoto/prep.sh b/jobs/rocoto/prep.sh index 7d22adc7aae..826dec5ae7a 100755 --- a/jobs/rocoto/prep.sh +++ b/jobs/rocoto/prep.sh @@ -1,56 +1,60 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} ############################################################### -# Source relevant configs -configs="base prep prepbufr" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done +export job="prep" +export jobid="${job}.$$" +source "${HOMEgfs}/ush/jjob_header.sh" -e "prep" -c "base prep" -############################################################### -# Source machine runtime environment -. $BASE_ENV/${machine}.env prep -status=$? 
-[[ $status -ne 0 ]] && exit $status +export CDUMP="${RUN/enkf}" ############################################################### # Set script and dependency variables -export COMPONENT=${COMPONENT:-atmos} +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") +# shellcheck disable= +gPDY=${GDATE:0:8} +gcyc=${GDATE:8:2} +GDUMP="gdas" + export OPREFIX="${CDUMP}.t${cyc}z." -export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -[[ ! -d $COMOUT ]] && mkdir -p $COMOUT + +YMD=${PDY} HH=${cyc} DUMP=${CDUMP} generate_com -rx COM_OBS COM_OBSDMP + +RUN=${GDUMP} DUMP=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + COM_OBS_PREV:COM_OBS_TMPL \ + COM_OBSDMP_PREV:COM_OBSDMP_TMPL + +export MAKE_PREPBUFR=${MAKE_PREPBUFR:-"YES"} +if [[ ! -d "${COM_OBS}" ]]; then mkdir -p "${COM_OBS}"; fi ############################################################### # If ROTDIR_DUMP=YES, copy dump files to rotdir -if [ $ROTDIR_DUMP = "YES" ]; then - $HOMEgfs/ush/getdump.sh $CDATE $CDUMP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT} $COMOUT +if [[ ${ROTDIR_DUMP} = "YES" ]]; then + "${HOMEgfs}/ush/getdump.sh" "${PDY}${cyc}" "${CDUMP}" "${COM_OBSDMP}" "${COM_OBS}" status=$? - [[ $status -ne 0 ]] && exit $status - -# Ensure previous cycle gdas dumps are available (used by cycle & downstream) - GDATE=$($NDATE -$assim_freq $CDATE) - gPDY=$(echo $GDATE | cut -c1-8) - gcyc=$(echo $GDATE | cut -c9-10) - GDUMP=gdas - gCOMOUT="$ROTDIR/$GDUMP.$gPDY/$gcyc/$COMPONENT" - if [ ! -s $gCOMOUT/$GDUMP.t${gcyc}z.updated.status.tm00.bufr_d ]; then - $HOMEgfs/ush/getdump.sh $GDATE $GDUMP $DMPDIR/${GDUMP}${DUMP_SUFFIX}.${gPDY}/${gcyc}/${COMPONENT} $gCOMOUT + [[ ${status} -ne 0 ]] && exit ${status} + + # Ensure previous cycle gdas dumps are available (used by cycle & downstream) + if [[ ! 
-s "${COM_OBS_PREV}/${GDUMP}.t${gcyc}z.updated.status.tm00.bufr_d" ]]; then + "${HOMEgfs}/ush/getdump.sh" "${GDATE}" "${GDUMP}" "${COM_OBSDMP_PREV}" "${COM_OBS_PREV}" status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} fi - + # exception handling to ensure no dead link + [[ $(find ${COM_OBS} -xtype l | wc -l) -ge 1 ]] && exit 9 + [[ $(find ${COM_OBS_PREV} -xtype l | wc -l) -ge 1 ]] && exit 9 fi + ############################################################### ############################################################### @@ -58,71 +62,71 @@ fi # copy files from operational syndata directory to a local directory. # Otherwise, copy existing tcvital data from globaldump. -if [ $PROCESS_TROPCY = "YES" ]; then +if [[ ${PROCESS_TROPCY} = "YES" ]]; then export COMINsyn=${COMINsyn:-$(compath.py gfs/prod/syndat)} - if [ $RUN_ENVIR != "nco" ]; then + if [[ ${RUN_ENVIR} != "nco" ]]; then export ARCHSYND=${ROTDIR}/syndat - if [ ! -d ${ARCHSYND} ]; then mkdir -p $ARCHSYND; fi - if [ ! -s $ARCHSYND/syndat_akavit ]; then + if [[ ! -d ${ARCHSYND} ]]; then mkdir -p ${ARCHSYND}; fi + if [[ ! -s ${ARCHSYND}/syndat_akavit ]]; then for file in syndat_akavit syndat_dateck syndat_stmcat.scr syndat_stmcat syndat_sthisto syndat_sthista ; do - cp $COMINsyn/$file $ARCHSYND/. + cp ${COMINsyn}/${file} ${ARCHSYND}/. done fi fi - [[ $ROTDIR_DUMP = "YES" ]] && rm $COMOUT${CDUMP}.t${cyc}z.syndata.tcvitals.tm00 + if [[ ${ROTDIR_DUMP} = "YES" ]]; then rm "${COM_OBS}/${CDUMP}.t${cyc}z.syndata.tcvitals.tm00"; fi - $HOMEgfs/jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC + "${HOMEgfs}/jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC" status=$? 
- [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} else - [[ $ROTDIR_DUMP = "NO" ]] && cp $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}/${CDUMP}.t${cyc}z.syndata.tcvitals.tm00 $COMOUT/ + if [[ ${ROTDIR_DUMP} = "NO" ]]; then cp "${COM_OBSDMP}/${CDUMP}.t${cyc}z.syndata.tcvitals.tm00" "${COM_OBS}/"; fi fi ############################################################### # Generate prepbufr files from dumps or copy from OPS -if [ $DO_MAKEPREPBUFR = "YES" ]; then - if [ $ROTDIR_DUMP = "YES" ]; then - rm $COMOUT/${OPREFIX}prepbufr - rm $COMOUT/${OPREFIX}prepbufr.acft_profiles - rm $COMOUT/${OPREFIX}nsstbufr +if [[ ${MAKE_PREPBUFR} = "YES" ]]; then + if [[ ${ROTDIR_DUMP} = "YES" ]]; then + rm -f "${COM_OBS}/${OPREFIX}prepbufr" + rm -f "${COM_OBS}/${OPREFIX}prepbufr.acft_profiles" + rm -f "${COM_OBS}/${OPREFIX}nsstbufr" fi export job="j${CDUMP}_prep_${cyc}" - export DATAROOT="$RUNDIR/$CDATE/$CDUMP/prepbufr" - #export COMIN=${COMIN:-$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT} - export COMIN=${COMIN:-$ROTDIR} - export COMINgdas=${COMINgdas:-$ROTDIR/gdas.$PDY/$cyc/$COMPONENT} - export COMINgfs=${COMINgfs:-$ROTDIR/gfs.$PDY/$cyc/$COMPONENT} - if [ $ROTDIR_DUMP = "NO" ]; then - COMIN_OBS=${COMIN_OBS:-$DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}} - export COMSP=${COMSP:-$COMIN_OBS/$CDUMP.t${cyc}z.} + export DATAROOT="${RUNDIR}/${CDATE}/${CDUMP}/prepbufr" + export COMIN=${COM_OBS} + export COMOUT=${COM_OBS} + RUN="gdas" YMD=${PDY} HH=${cyc} generate_com -rx COMINgdas:COM_ATMOS_HISTORY_TMPL + RUN="gfs" YMD=${PDY} HH=${cyc} generate_com -rx COMINgfs:COM_ATMOS_HISTORY_TMPL + if [[ ${ROTDIR_DUMP} = "NO" ]]; then + export COMSP=${COMSP:-"${COM_OBSDMP}/${CDUMP}.t${cyc}z."} else - export COMSP=${COMSP:-$ROTDIR/${CDUMP}.${PDY}/${cyc}/$COMPONENT/$CDUMP.t${cyc}z.} + export COMSP=${COMSP:-"${COM_OBS}/${CDUMP}.t${cyc}z."} fi + export COMSP=${COMSP:-${COMIN_OBS}/${CDUMP}.t${cyc}z.} # Disable creating NSSTBUFR if desired, copy from DMPDIR instead - 
if [[ ${DO_MAKE_NSSTBUFR:-"NO"} = "NO" ]]; then + if [[ ${MAKE_NSSTBUFR:-"NO"} = "NO" ]]; then export MAKE_NSSTBUFR="NO" fi - $HOMEobsproc_network/jobs/JGLOBAL_PREP + "${HOMEobsproc}/jobs/JOBSPROC_GLOBAL_PREP" status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} # If creating NSSTBUFR was disabled, copy from DMPDIR if appropriate. - if [[ ${DO_MAKE_NSSTBUFR:-"NO"} = "NO" ]]; then - [[ $DONST = "YES" ]] && $NCP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}/${OPREFIX}nsstbufr $COMOUT/${OPREFIX}nsstbufr + if [[ ${MAKE_NSSTBUFR:-"NO"} = "NO" ]]; then + if [[ ${DONST} = "YES" ]]; then ${NCP} "${COM_OBSDMP}/${OPREFIX}nsstbufr" "${COM_OBS}/${OPREFIX}nsstbufr"; fi fi else - if [ $ROTDIR_DUMP = "NO" ]; then - $NCP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}/${OPREFIX}prepbufr $COMOUT/${OPREFIX}prepbufr - $NCP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}/${OPREFIX}prepbufr.acft_profiles $COMOUT/${OPREFIX}prepbufr.acft_profiles - [[ $DONST = "YES" ]] && $NCP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}/${OPREFIX}nsstbufr $COMOUT/${OPREFIX}nsstbufr + if [[ ${ROTDIR_DUMP} = "NO" ]]; then + ${NCP} "${COM_OBSDMP}/${OPREFIX}prepbufr" "${COM_OBS}/${OPREFIX}prepbufr" + ${NCP} "${COM_OBSDMP}/${OPREFIX}prepbufr.acft_profiles" "${COM_OBS}/${OPREFIX}prepbufr.acft_profiles" + if [[ ${DONST} = "YES" ]]; then ${NCP} "${COM_OBSDMP}/${OPREFIX}nsstbufr" "${COM_OBS}/${OPREFIX}nsstbufr"; fi fi fi diff --git a/jobs/rocoto/sfcanl.sh b/jobs/rocoto/sfcanl.sh index 7b9812f37be..44f93ee0c31 100755 --- a/jobs/rocoto/sfcanl.sh +++ b/jobs/rocoto/sfcanl.sh @@ -1,17 +1,20 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### # Source FV3GFS workflow modules -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="sfcanl" +export jobid="${job}.$$" ############################################################### # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_ATMOS_SFCANL +${HOMEgfs}/jobs/JGLOBAL_ATMOS_SFCANL status=$? -exit $status +exit ${status} diff --git a/jobs/rocoto/wafs.sh b/jobs/rocoto/wafs.sh index 8aab955cc8a..59d1ede1397 100755 --- a/jobs/rocoto/wafs.sh +++ b/jobs/rocoto/wafs.sh @@ -1,59 +1,46 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} +export job="wafs" +export jobid="${job}.$$" + +############################################################### ############################################################### +# TODO: sourcing configs should be in the j-job echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" configs="base wafs" -for config in $configs; do - . $EXPDIR/config.${config} +for config in ${configs}; do + . ${EXPDIR}/config.${config} status=$? 
- [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} done ############################################################### -export pid=${pid:-$$} -export jobid=${job}.${pid} -export DATAROOT="$RUNDIR/$CDATE/$CDUMP/wafs.$jobid" -[[ -d $DATAROOT ]] && rm -rf $DATAROOT -mkdir -p $DATAROOT - -export DATA="${DATAROOT}/$job" - -############################################################### echo echo "=============== START TO RUN WAFS ===============" # Loop through fcsthrs hr=0 -while [ $hr -le 120 ]; do +while [ ${hr} -le 120 ]; do - if [ $hr -le 100 ]; then - export fcsthrs="$(printf "%02d" $(( 10#$hr )) )" - else - export fcsthrs=$hr - fi + export fcsthrs=$(printf "%03d" ${hr}) # Execute the JJOB - $HOMEgfs/jobs/JGFS_ATMOS_WAFS + ${HOMEgfs}/jobs/JGFS_ATMOS_WAFS status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} - hr=$(expr $hr + 6) + hr=$(expr ${hr} + 6) done -############################################################### -# Force Exit out cleanly -if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi - - exit 0 diff --git a/jobs/rocoto/wafsblending.sh b/jobs/rocoto/wafsblending.sh index 2793986e80c..e16e8fa2b30 100755 --- a/jobs/rocoto/wafsblending.sh +++ b/jobs/rocoto/wafsblending.sh @@ -1,43 +1,38 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? 
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
+
+export job="wafsblending"
+export jobid="${job}.$$"
 
 ###############################################################
+# TODO: sourcing configs should be in the j-job
 echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ==============="
 configs="base wafsblending"
-for config in $configs; do
-    . $EXPDIR/config.${config}
+for config in ${configs}; do
+    . ${EXPDIR}/config.${config}
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit ${status}
 done
 
-###############################################################
-
-export DATAROOT="$RUNDIR/$CDATE/$CDUMP/wafsblending"
-[[ -d $DATAROOT ]] && rm -rf $DATAROOT
-mkdir -p $DATAROOT
-
-export pid=${pid:-$$}
-export jobid=${job}.${pid}
-export DATA="${DATAROOT}/$job"
+# TODO: Missing sourcing of machine runtime environment
 
 ###############################################################
+
 echo
 echo "=============== START TO RUN WAFSBLENDING ==============="
 
 # Execute the JJOB
-$HOMEgfs/jobs/JGFS_ATMOS_WAFS_BLENDING
+${HOMEgfs}/jobs/JGFS_ATMOS_WAFS_BLENDING
 status=$?
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ###############################################################
-# Force Exit out cleanly
-if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi
-
-exit $status
+exit 0
diff --git a/jobs/rocoto/wafsblending0p25.sh b/jobs/rocoto/wafsblending0p25.sh
index fb06284f558..11788baf4d4 100755
--- a/jobs/rocoto/wafsblending0p25.sh
+++ b/jobs/rocoto/wafsblending0p25.sh
@@ -1,43 +1,38 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
+
+export job="wafsblending0p25"
+export jobid="${job}.$$"
 
 ###############################################################
 echo
 echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ==============="
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ###############################################################
+# TODO: sourcing configs should be in the j-job
 echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ==============="
 configs="base wafsblending0p25"
-for config in $configs; do
-    . $EXPDIR/config.${config}
+for config in ${configs}; do
+    . ${EXPDIR}/config.${config}
     status=$?
-    [[ $status -ne 0 ]] && exit $status
+    [[ ${status} -ne 0 ]] && exit ${status}
 done
 
-###############################################################
-
-export DATAROOT="$RUNDIR/$CDATE/$CDUMP/wafsblending0p25"
-[[ -d $DATAROOT ]] && rm -rf $DATAROOT
-mkdir -p $DATAROOT
-
-export pid=${pid:-$$}
-export jobid=${job}.${pid}
-export DATA="${DATAROOT}/$job"
+# TODO: Missing sourcing of machine runtime environment
 
 ###############################################################
 echo
 echo "=============== START TO RUN WAFSBLENDING0P25 ==============="
 
 # Execute the JJOB
-$HOMEgfs/jobs/JGFS_ATMOS_WAFS_BLENDING_0P25
+${HOMEgfs}/jobs/JGFS_ATMOS_WAFS_BLENDING_0P25
 status=$?
+[[ ${status} -ne 0 ]] && exit ${status}
 
 ###############################################################
-# Force Exit out cleanly
-if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi
-
-exit $status
+exit 0
diff --git a/jobs/rocoto/wafsgcip.sh b/jobs/rocoto/wafsgcip.sh
index f3e98a03dac..36b2b491d7b 100755
--- a/jobs/rocoto/wafsgcip.sh
+++ b/jobs/rocoto/wafsgcip.sh
@@ -1,43 +1,45 @@
 #! /usr/bin/env bash
 
-source "$HOMEgfs/ush/preamble.sh"
+source "${HOMEgfs}/ush/preamble.sh"
 
 ###############################################################
 echo
 echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ==============="
-. $HOMEgfs/ush/load_fv3gfs_modules.sh
+. ${HOMEgfs}/ush/load_fv3gfs_modules.sh
 status=$?
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} +export job="wafsgcip" +export jobid="${job}.$$" + +# ############################################################### +# TODO: sourcing configs should be in the j-job echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" configs="base wafsgcip" -for config in $configs; do - . $EXPDIR/config.${config} +for config in ${configs}; do + . ${EXPDIR}/config.${config} status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} done -############################################################### - -export DATAROOT="$RUNDIR/$CDATE/$CDUMP/wafsgcip" -[[ -d $DATAROOT ]] && rm -rf $DATAROOT -mkdir -p $DATAROOT +########################################## +# Source machine runtime environment +########################################## +. ${HOMEgfs}/env/${machine}.env wafsgcip +status=$? +[[ ${status} -ne 0 ]] && exit ${status} -export pid=${pid:-$$} -export jobid=${job}.${pid} -export DATA="${DATAROOT}/$job" +############################################################### ############################################################### echo echo "=============== START TO RUN WAFSGCIP ===============" # Execute the JJOB -$HOMEgfs/jobs/JGFS_ATMOS_WAFS_GCIP +${HOMEgfs}/jobs/JGFS_ATMOS_WAFS_GCIP status=$? +[[ ${status} -ne 0 ]] && exit ${status} ############################################################### -# Force Exit out cleanly -if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi - -exit $status +exit 0 diff --git a/jobs/rocoto/wafsgrib2.sh b/jobs/rocoto/wafsgrib2.sh index c7dbead30da..a2903e5aa25 100755 --- a/jobs/rocoto/wafsgrib2.sh +++ b/jobs/rocoto/wafsgrib2.sh @@ -1,43 +1,38 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. 
$HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="wafsgrib2" +export jobid=${job}.$$ ############################################################### +# TODO: Sourcing configs should be done in the j-job echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" configs="base wafsgrib2" -for config in $configs; do - . $EXPDIR/config.${config} +for config in ${configs}; do + . ${EXPDIR}/config.${config} status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} done -############################################################### - -export DATAROOT="$RUNDIR/$CDATE/$CDUMP/wafsgrib2" -[[ -d $DATAROOT ]] && rm -rf $DATAROOT -mkdir -p $DATAROOT - -export pid=${pid:-$$} -export jobid=${job}.${pid} -export DATA="${DATAROOT}/$job" +# TODO: Missing sourcing of $MACHINE.env ############################################################### + echo echo "=============== START TO RUN WAFSGRIB2 ===============" # Execute the JJOB -$HOMEgfs/jobs/JGFS_ATMOS_WAFS_GRIB2 +${HOMEgfs}/jobs/JGFS_ATMOS_WAFS_GRIB2 status=$? +[[ ${status} -ne 0 ]] && exit ${status} ############################################################### -# Force Exit out cleanly -if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi - -exit $status +exit 0 diff --git a/jobs/rocoto/wafsgrib20p25.sh b/jobs/rocoto/wafsgrib20p25.sh index e99ee210d91..585ca23524c 100755 --- a/jobs/rocoto/wafsgrib20p25.sh +++ b/jobs/rocoto/wafsgrib20p25.sh @@ -1,43 +1,37 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="wafsgrib20p25" +export jobid="${job}.$$" ############################################################### +# TODO: sourcing configs should be in the j-job echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" configs="base wafsgrib20p25" -for config in $configs; do - . $EXPDIR/config.${config} +for config in ${configs}; do + . ${EXPDIR}/config.${config} status=$? - [[ $status -ne 0 ]] && exit $status + [[ ${status} -ne 0 ]] && exit ${status} done -############################################################### - -export DATAROOT="$RUNDIR/$CDATE/$CDUMP/wafsgrib20p25" -[[ -d $DATAROOT ]] && rm -rf $DATAROOT -mkdir -p $DATAROOT - -export pid=${pid:-$$} -export jobid=${job}.${pid} -export DATA="${DATAROOT}/$job" +# TODO: missing sourcing $MACHINE.env ############################################################### echo echo "=============== START TO RUN WAFSGRIB20p25 ===============" # Execute the JJOB -$HOMEgfs/jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 +${HOMEgfs}/jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 status=$? +[[ ${status} -ne 0 ]] && exit ${status} ############################################################### -# Force Exit out cleanly -if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi - -exit $status +exit 0 diff --git a/jobs/rocoto/waveawipsbulls.sh b/jobs/rocoto/waveawipsbulls.sh index 1e1e1cd4e20..4b6d6e1e82a 100755 --- a/jobs/rocoto/waveawipsbulls.sh +++ b/jobs/rocoto/waveawipsbulls.sh @@ -3,34 +3,15 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### -echo -echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +# Source FV3GFS workflow modules +source ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} -############################################################### -echo -echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" -configs="base waveawipsbulls" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done - -############################################################### -echo -echo "=============== BEGIN TO SOURCE MACHINE RUNTIME ENVIRONMENT ===============" -. $BASE_ENV/${machine}.env waveawipsbulls -status=$? -[[ $status -ne 0 ]] && exit $status - -export DBNROOT=/dev/null +export job="waveawipsbulls" +export jobid="${job}.$$" ############################################################### -echo -echo "=============== START TO RUN WAVE PRDGEN BULLS ===============" # Execute the JJOB $HOMEgfs/jobs/JGLOBAL_WAVE_PRDGEN_BULLS status=$? diff --git a/jobs/rocoto/waveawipsgridded.sh b/jobs/rocoto/waveawipsgridded.sh index 3627ba62c4d..c10f2f39fd5 100755 --- a/jobs/rocoto/waveawipsgridded.sh +++ b/jobs/rocoto/waveawipsgridded.sh @@ -3,37 +3,18 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### -echo -echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +# Source FV3GFS workflow modules +source ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} -############################################################### -echo -echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" -configs="base waveawipsgridded" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done - -############################################################### -echo -echo "=============== BEGIN TO SOURCE MACHINE RUNTIME ENVIRONMENT ===============" -. 
$BASE_ENV/${machine}.env waveawipsgridded -status=$? -[[ $status -ne 0 ]] && exit $status - -export DBNROOT=/dev/null +export job="waveawipsgridded" +export jobid="${job}.$$" ############################################################### -echo -echo "=============== START TO RUN WAVE PRDGEN GRIDDED ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED +${HOMEgfs}/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED status=$? -exit $status +exit ${status} diff --git a/jobs/rocoto/wavegempak.sh b/jobs/rocoto/wavegempak.sh index d4cf1667fc6..58fbcdcc5b0 100755 --- a/jobs/rocoto/wavegempak.sh +++ b/jobs/rocoto/wavegempak.sh @@ -3,35 +3,16 @@ source "$HOMEgfs/ush/preamble.sh" ############################################################### -echo -echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +source $HOMEgfs/ush/load_fv3gfs_modules.sh status=$? [[ $status -ne 0 ]] && exit $status -############################################################### -echo -echo "=============== BEGIN TO SOURCE RELEVANT CONFIGS ===============" -configs="base wavegempak" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done - -############################################################### -echo -echo "=============== BEGIN TO SOURCE MACHINE RUNTIME ENVIRONMENT ===============" -. $BASE_ENV/${machine}.env wavegempak -status=$? -[[ $status -ne 0 ]] && exit $status +export job="wavegempak" +export jobid="${job}.$$" ############################################################### -echo -echo "=============== START TO RUN WAVE GEMPAK ===============" # Execute the JJOB $HOMEgfs/jobs/JGLOBAL_WAVE_GEMPAK status=$? - exit $status diff --git a/jobs/rocoto/waveinit.sh b/jobs/rocoto/waveinit.sh index 5995b85302a..d0c3f499292 100755 --- a/jobs/rocoto/waveinit.sh +++ b/jobs/rocoto/waveinit.sh @@ -1,20 +1,23 @@ #!
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="waveinit" +export jobid="${job}.$$" ############################################################### echo echo "=============== START TO RUN WAVE INIT ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_INIT +${HOMEgfs}/jobs/JGLOBAL_WAVE_INIT status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} exit 0 diff --git a/jobs/rocoto/wavepostbndpnt.sh b/jobs/rocoto/wavepostbndpnt.sh index fe0e2a07233..5d264983567 100755 --- a/jobs/rocoto/wavepostbndpnt.sh +++ b/jobs/rocoto/wavepostbndpnt.sh @@ -1,20 +1,23 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="wavepostbndpnt" +export jobid="${job}.$$" ############################################################### echo echo "=============== START TO RUN WAVE_POST_BNDPNT ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_POST_BNDPNT +${HOMEgfs}/jobs/JGLOBAL_WAVE_POST_BNDPNT status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} exit 0 diff --git a/jobs/rocoto/wavepostbndpntbll.sh b/jobs/rocoto/wavepostbndpntbll.sh index cea3c0bc6b2..ce4f9e6b2d5 100755 --- a/jobs/rocoto/wavepostbndpntbll.sh +++ b/jobs/rocoto/wavepostbndpntbll.sh @@ -1,20 +1,23 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="wavepostbndpntbll" +export jobid="${job}.$$" ############################################################### echo echo "=============== START TO RUN WAVE_POST_BNDPNT ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL +${HOMEgfs}/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} exit 0 diff --git a/jobs/rocoto/wavepostpnt.sh b/jobs/rocoto/wavepostpnt.sh index 1b1d8c97650..9efb755decd 100755 --- a/jobs/rocoto/wavepostpnt.sh +++ b/jobs/rocoto/wavepostpnt.sh @@ -1,20 +1,23 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="wavepostpnt" +export jobid="${job}.$$" ############################################################### echo echo "=============== START TO RUN WAVE_POST_PNT ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_POST_PNT +${HOMEgfs}/jobs/JGLOBAL_WAVE_POST_PNT status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} exit 0 diff --git a/jobs/rocoto/wavepostsbs.sh b/jobs/rocoto/wavepostsbs.sh index fb4fdfbd8b6..e4bea0bc344 100755 --- a/jobs/rocoto/wavepostsbs.sh +++ b/jobs/rocoto/wavepostsbs.sh @@ -1,20 +1,20 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### -echo -echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +# Source FV3GFS workflow modules +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="wavepostsbs" +export jobid="${job}.$$" ############################################################### -echo -echo "=============== START TO RUN WAVE POST_SBS ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_POST_SBS +${HOMEgfs}/jobs/JGLOBAL_WAVE_POST_SBS status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} exit 0 diff --git a/jobs/rocoto/waveprep.sh b/jobs/rocoto/waveprep.sh index c55c8526d98..0cbafde87ea 100755 --- a/jobs/rocoto/waveprep.sh +++ b/jobs/rocoto/waveprep.sh @@ -1,20 +1,23 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" ############################################################### echo echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" -. $HOMEgfs/ush/load_fv3gfs_modules.sh +. ${HOMEgfs}/ush/load_fv3gfs_modules.sh status=$? -[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} + +export job="waveprep" +export jobid="${job}.$$" ############################################################### echo echo "=============== START TO RUN WAVE PREP ===============" # Execute the JJOB -$HOMEgfs/jobs/JGLOBAL_WAVE_PREP +${HOMEgfs}/jobs/JGLOBAL_WAVE_PREP status=$? 
-[[ $status -ne 0 ]] && exit $status +[[ ${status} -ne 0 ]] && exit ${status} exit 0 diff --git a/modulefiles/module-setup.csh.inc b/modulefiles/module-setup.csh.inc index e5cad465706..e8219424f6f 100644 --- a/modulefiles/module-setup.csh.inc +++ b/modulefiles/module-setup.csh.inc @@ -2,7 +2,13 @@ set __ms_shell=csh eval "if ( -d / ) set __ms_shell=tcsh" -if ( { test -d /lfs3 } ) then +if ( { test -d /lfs/f1 } ) then + # We are on NOAA Cactus or Dogwood + if ( ! { module help >& /dev/null } ) then + source /usr/share/lmod/lmod/init/$__ms_shell + fi + module reset +else if ( { test -d /lfs3 } ) then if ( ! { module help >& /dev/null } ) then source /apps/lmod/lmod/init/$__ms_shell endif @@ -19,11 +25,12 @@ else if ( { test -d /work } ) then source /apps/lmod/init/$__ms_shell endif module purge -else if ( { test -d /jetmon } ) then - # We are on NOAA Jet +else if ( { test -d /data/prod } ) then + # We are on SSEC S4 if ( ! { module help >& /dev/null } ) then - source /apps/lmod/lmod/init/$__ms_shell + source /usr/share/lmod/lmod/init/$__ms_shell endif + source /etc/profile module purge else if ( { test -d /glade } ) then # We are on NCAR Yellowstone diff --git a/modulefiles/module-setup.sh.inc b/modulefiles/module-setup.sh.inc index b55643719cc..e5322cbb2c1 100644 --- a/modulefiles/module-setup.sh.inc +++ b/modulefiles/module-setup.sh.inc @@ -16,7 +16,13 @@ else __ms_shell=sh fi -if [[ -d /lfs3 ]] ; then +if [[ -d /lfs/f1 ]] ; then + # We are on NOAA Cactus or Dogwood + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /usr/share/lmod/lmod/init/$__ms_shell + fi + module reset +elif [[ -d /mnt/lfs1 ]] ; then # We are on NOAA Jet if ( ! eval module help > /dev/null 2>&1 ) ; then source /apps/lmod/lmod/init/$__ms_shell @@ -80,6 +86,12 @@ elif [[ -d /lustre && -d /ncrc ]] ; then source /etc/profile unset __ms_source_etc_profile fi +elif [[ -d /data/prod ]] ; then + # We are on SSEC's S4 + if ( ! 
eval module help > /dev/null 2>&1 ) ; then + source /usr/share/lmod/lmod/init/$__ms_shell + fi + module purge else echo WARNING: UNKNOWN PLATFORM 1>&2 fi diff --git a/modulefiles/module_base.hera.lua b/modulefiles/module_base.hera.lua index 36cb672eb0a..13f08d3cd97 100644 --- a/modulefiles/module_base.hera.lua +++ b/modulefiles/module_base.hera.lua @@ -9,42 +9,30 @@ load(pathJoin("hpc-intel", "18.0.5.274")) load(pathJoin("hpc-impi", "2018.0.4")) load(pathJoin("hpss", "hpss")) -load(pathJoin("nco", "4.9.1")) load(pathJoin("gempak", "7.4.2")) load(pathJoin("ncl", "6.6.2")) - -load(pathJoin("prod_util", "1.2.2")) -load(pathJoin("grib_util", "1.2.2")) - -load(pathJoin("crtm", "2.3.0")) -setenv("CRTM_FIX","/scratch2/NCEPDEV/nwprod/NCEPLIBS/fix/crtm_v2.3.0") - load(pathJoin("jasper", "2.0.25")) -load(pathJoin("zlib", "1.2.11")) load(pathJoin("png", "1.6.35")) +load(pathJoin("cdo", "1.9.5")) +load(pathJoin("R", "3.5.0")) load(pathJoin("hdf5", "1.10.6")) load(pathJoin("netcdf", "4.7.4")) -load(pathJoin("pio", "2.5.2")) -load(pathJoin("esmf", "8.2.1b04")) -load(pathJoin("fms", "2021.03")) -load(pathJoin("bacio", "2.4.1")) -load(pathJoin("g2", "3.4.2")) +load(pathJoin("nco", "4.9.1")) +load(pathJoin("prod_util", "1.2.2")) +load(pathJoin("grib_util", "1.2.2")) load(pathJoin("g2tmpl", "1.10.0")) -load(pathJoin("ip", "3.3.3")) -load(pathJoin("nemsio", "2.5.2")) -load(pathJoin("sp", "2.3.3")) -load(pathJoin("w3emc", "2.7.3")) -load(pathJoin("w3nco", "2.4.1")) load(pathJoin("ncdiag", "1.0.0")) - +load(pathJoin("crtm", "2.4.0")) load(pathJoin("wgrib2", "2.0.8")) setenv("WGRIB2","wgrib2") -load(pathJoin("cdo", "1.9.5")) +prepend_path("MODULEPATH", pathJoin("/scratch1/NCEPDEV/global/glopara/git/prepobs/feature-GFSv17_com_reorg/modulefiles")) +load(pathJoin("prepobs", "1.0.1")) -load(pathJoin("R", "3.5.0")) +prepend_path("MODULEPATH", pathJoin("/scratch1/NCEPDEV/global/glopara/git/Fit2Obs/v1.0.0/modulefiles")) +load(pathJoin("fit2obs", "1.0.0")) -- Temporary until official hpc-stack 
is updated prepend_path("MODULEPATH", "/scratch2/NCEPDEV/ensemble/save/Walter.Kolczynski/hpc-stack/modulefiles/stack") diff --git a/modulefiles/module_base.jet.lua b/modulefiles/module_base.jet.lua index 625f27421e4..dd0e87f730e 100644 --- a/modulefiles/module_base.jet.lua +++ b/modulefiles/module_base.jet.lua @@ -2,56 +2,51 @@ help([[ Load environment to run GFS on Jet ]]) -prepend_path("MODULEPATH", "/lfs4/HFIP/hfv3gfs/nwprod/hpc-stack/libs/modulefiles/stack") +prepend_path("MODULEPATH", "/lfs4/HFIP/hfv3gfs/role.epic/hpc-stack/libs/intel-18.0.5.274/modulefiles/stack") -load(pathJoin("hpc", "1.1.0")) +load(pathJoin("hpc", "1.2.0")) load(pathJoin("hpc-intel", "18.0.5.274")) load(pathJoin("hpc-impi", "2018.4.274")) +load(pathJoin("cmake", "3.20.1")) -load(pathJoin("hpss", "hpss")) -load(pathJoin("nco", "4.9.1")) +load("hpss") load(pathJoin("gempak", "7.4.2")) load(pathJoin("ncl", "6.6.2")) - -load(pathJoin("prod_util", "1.2.2")) -load(pathJoin("grib_util", "1.2.2")) - -load(pathJoin("crtm", "2.3.0")) -setenv("CRTM_FIX","/lfs4/HFIP/hfv3gfs/nwprod/NCEPLIBS/fix/crtm_v2.3.0") load(pathJoin("jasper", "2.0.25")) -load(pathJoin("zlib", "1.2.11")) -load(pathJoin("png", "1.6.35")) +load(pathJoin("libpng", "1.6.35")) +load(pathJoin("cdo", "1.9.5")) +load(pathJoin("R", "4.0.2")) load(pathJoin("hdf5", "1.10.6")) load(pathJoin("netcdf", "4.7.4")) -load(pathJoin("pio", "2.5.2")) -load(pathJoin("esmf", "8.2.1b04")) -load(pathJoin("fms", "2021.03")) -load(pathJoin("bacio", "2.4.1")) -load(pathJoin("g2", "3.4.2")) +load(pathJoin("nco", "4.9.1")) +load(pathJoin("prod_util", "1.2.2")) +load(pathJoin("grib_util", "1.2.2")) load(pathJoin("g2tmpl", "1.10.0")) -load(pathJoin("ip", "3.3.3")) -load(pathJoin("nemsio", "2.5.2")) -load(pathJoin("sp", "2.3.3")) -load(pathJoin("w3emc", "2.7.3")) -load(pathJoin("w3nco", "2.4.1")) load(pathJoin("ncdiag", "1.0.0")) - +load(pathJoin("crtm", "2.4.0")) load(pathJoin("wgrib2", "2.0.8")) setenv("WGRIB2","wgrib2") -load(pathJoin("cdo", "1.9.5")) 
+prepend_path("MODULEPATH", pathJoin("/lfs4/HFIP/hfv3gfs/glopara/git/prepobs/v1.0.1/modulefiles")) +load(pathJoin("prepobs", "1.0.1")) -load(pathJoin("R", "3.5.0")) +prepend_path("MODULEPATH", "/contrib/anaconda/modulefiles") +load(pathJoin("anaconda", "5.3.1")) + +prepend_path("MODULEPATH", pathJoin("/lfs4/HFIP/hfv3gfs/glopara/git/prepobs/feature-GFSv17_com_reorg/modulefiles")) +load(pathJoin("prepobs", "1.0.1")) +prepend_path("MODULEPATH", pathJoin("/lfs4/HFIP/hfv3gfs/glopara/git/Fit2Obs/v1.0.0/modulefiles")) +load(pathJoin("fit2obs", "1.0.0")) -- Temporary until official hpc-stack is updated -prepend_path("MODULEPATH", "/lfs1/NESDIS/nesdis-rdo2/David.Huber/save/hpc-stack/modulefiles/stack/") -load(pathJoin("hpc", "1.2.0")) -load(pathJoin("hpc-intel", "18.0.5.274")) -load(pathJoin("hpc-miniconda3", "4.6.14")) -load(pathJoin("ufswm", "1.0.0")) -load(pathJoin("met", "9.1")) -load(pathJoin("metplus", "3.1")) +-- prepend_path("MODULEPATH", "/lfs1/NESDIS/nesdis-rdo2/David.Huber/save/hpc-stack/modulefiles/stack/") +-- load(pathJoin("hpc", "1.2.0")) +-- load(pathJoin("hpc-intel", "18.0.5.274")) +-- load(pathJoin("hpc-miniconda3", "4.6.14")) +-- load(pathJoin("ufswm", "1.0.0")) +-- load(pathJoin("met", "9.1")) +-- load(pathJoin("metplus", "3.1")) whatis("Description: GFS run environment") diff --git a/modulefiles/module_base.orion.lua b/modulefiles/module_base.orion.lua index 22a47644198..d5cdb5b00c1 100644 --- a/modulefiles/module_base.orion.lua +++ b/modulefiles/module_base.orion.lua @@ -2,50 +2,36 @@ help([[ Load environment to run GFS on Orion ]]) -prepend_path("MODULEPATH", "/apps/contrib/NCEP/libs/hpc-stack/modulefiles/stack") +prepend_path("MODULEPATH", "/apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/stack") load(pathJoin("hpc", "1.1.0")) load(pathJoin("hpc-intel", "2018.4")) load(pathJoin("hpc-impi", "2018.4")) -load(pathJoin("nco", "4.8.1")) load(pathJoin("gempak", "7.5.1")) load(pathJoin("ncl", "6.6.2")) - -load(pathJoin("prod_util", "1.2.2")) 
-load(pathJoin("grib_util", "1.2.2")) - -load(pathJoin("crtm", "2.3.0")) -setenv("CRTM_FIX","/apps/contrib/NCEPLIBS/orion/fix/crtm_v2.3.0") - load(pathJoin("jasper", "2.0.25")) load(pathJoin("zlib", "1.2.11")) load(pathJoin("png", "1.6.35")) +load(pathJoin("cdo", "1.9.5")) load(pathJoin("hdf5", "1.10.6")) load(pathJoin("netcdf", "4.7.4")) -load(pathJoin("pio", "2.5.2")) -load(pathJoin("esmf", "8.2.1b04")) -load(pathJoin("fms", "2021.03")) -load(pathJoin("bacio", "2.4.1")) -load(pathJoin("g2", "3.4.2")) +load(pathJoin("nco", "4.8.1")) +load(pathJoin("prod_util", "1.2.2")) +load(pathJoin("grib_util", "1.2.2")) load(pathJoin("g2tmpl", "1.10.0")) -load(pathJoin("ip", "3.3.3")) -load(pathJoin("nemsio", "2.5.2")) -load(pathJoin("sp", "2.3.3")) -load(pathJoin("w3emc", "2.7.3")) -load(pathJoin("w3nco", "2.4.1")) load(pathJoin("ncdiag", "1.0.0")) - +load(pathJoin("crtm", "2.4.0")) load(pathJoin("wgrib2", "2.0.8")) setenv("WGRIB2","wgrib2") -load("contrib") -load(pathJoin("rocoto", "1.3.3")) -load(pathJoin("slurm", "19.05.3-2")) +prepend_path("MODULEPATH", pathJoin("/work/noaa/global/glopara/git/prepobs/feature-GFSv17_com_reorg/modulefiles")) +load(pathJoin("prepobs", "1.0.1")) -load(pathJoin("cdo", "1.9.5")) +prepend_path("MODULEPATH", pathJoin("/work/noaa/global/glopara/git/Fit2Obs/v1.0.0/modulefiles")) +load(pathJoin("fit2obs", "1.0.0")) -- Temporary until official hpc-stack is updated prepend_path("MODULEPATH", "/work2/noaa/global/wkolczyn/save/hpc-stack/modulefiles/stack") diff --git a/modulefiles/module_base.s4.lua b/modulefiles/module_base.s4.lua new file mode 100644 index 00000000000..5bd0f1d6fb6 --- /dev/null +++ b/modulefiles/module_base.s4.lua @@ -0,0 +1,37 @@ +help([[ +Load environment to run GFS on S4 +]]) + +load("license_intel") +prepend_path("MODULEPATH", "/data/prod/hpc-stack/modulefiles/stack") + +load(pathJoin("hpc", "1.1.0")) +load(pathJoin("hpc-intel", "18.0.4")) +load(pathJoin("hpc-impi", "18.0.4")) + +load(pathJoin("miniconda", "3.8-s4")) 
+load(pathJoin("ncl", "6.4.0-precompiled")) +load(pathJoin("cdo", "1.9.8")) +load(pathJoin("jasper", "2.0.25")) +load(pathJoin("zlib", "1.2.11")) +load(pathJoin("png", "1.6.35")) + +load(pathJoin("hdf5", "1.10.6")) +load(pathJoin("netcdf", "4.7.4")) + +load(pathJoin("nco", "4.9.3")) +load(pathJoin("prod_util", "1.2.2")) +load(pathJoin("grib_util", "1.2.2")) +load(pathJoin("g2tmpl", "1.10.0")) +load(pathJoin("ncdiag", "1.0.0")) +load(pathJoin("crtm", "2.4.0")) +load(pathJoin("wgrib2", "2.0.8")) +setenv("WGRIB2","wgrib2") + +prepend_path("MODULEPATH", pathJoin("/data/prod/glopara/git/prepobs/feature-GFSv17_com_reorg/modulefiles")) +load(pathJoin("prepobs", "1.0.1")) + +prepend_path("MODULEPATH", pathJoin("/data/prod/glopara/git/Fit2Obs/v1.0.0/modulefiles")) +load(pathJoin("fit2obs", "1.0.0")) + +whatis("Description: GFS run environment") diff --git a/modulefiles/module_base.wcoss2.lua b/modulefiles/module_base.wcoss2.lua new file mode 100644 index 00000000000..a6f3f4679c5 --- /dev/null +++ b/modulefiles/module_base.wcoss2.lua @@ -0,0 +1,40 @@ +help([[ +Load environment to run GFS on WCOSS2 +]]) + +load(pathJoin("PrgEnv-intel", "8.1.0")) +load(pathJoin("craype", "2.7.13")) +load(pathJoin("intel", "19.1.3.304")) +load(pathJoin("cray-mpich", "8.1.9")) +load(pathJoin("cray-pals", "1.0.17")) +load(pathJoin("cfp", "2.0.4")) +setenv("USE_CFP","YES") + +load(pathJoin("python", "3.8.6")) +load(pathJoin("gempak", "7.14.1")) +load(pathJoin("perl", "5.32.0")) +load(pathJoin("libjpeg", "9c")) +load(pathJoin("udunits", "2.2.28")) +load(pathJoin("gsl", "2.7")) +load(pathJoin("cdo", "1.9.8")) + +load(pathJoin("hdf5", "1.10.6")) +load(pathJoin("netcdf", "4.7.4")) + +load(pathJoin("nco", "4.7.9")) +load(pathJoin("prod_util", "2.0.9")) +load(pathJoin("grib_util", "1.2.3")) +load(pathJoin("bufr_dump", "1.0.0")) +load(pathJoin("util_shared", "1.4.0")) +load(pathJoin("g2tmpl", "1.9.1")) +load(pathJoin("ncdiag", "1.0.0")) +load(pathJoin("crtm", "2.4.0")) +load(pathJoin("wgrib2", "2.0.7")) 
+ +prepend_path("MODULEPATH", pathJoin("/lfs/h2/emc/global/save/emc.global/git/prepobs/feature-GFSv17_com_reorg/modulefiles")) +load(pathJoin("prepobs", "1.0.1")) + +prepend_path("MODULEPATH", pathJoin("/lfs/h2/emc/global/save/emc.global/git/Fit2Obs/v1.0.0/modulefiles")) +load(pathJoin("fit2obs", "1.0.0")) + +whatis("Description: GFS run environment") diff --git a/modulefiles/module_gwci.hera.lua b/modulefiles/module_gwci.hera.lua new file mode 100644 index 00000000000..f4b62a5fd2f --- /dev/null +++ b/modulefiles/module_gwci.hera.lua @@ -0,0 +1,15 @@ +help([[ +Load environment to run GFS workflow setup scripts on Hera +]]) + +prepend_path("MODULEPATH", "/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/modulefiles/stack") + +load(pathJoin("hpc", "1.1.0")) +load(pathJoin("hpc-intel", "18.0.5.274")) +load(pathJoin("hpc-impi", "2018.0.4")) + +load(pathJoin("netcdf","4.7.4")) +load(pathJoin("nccmp","1.8.7.0")) +load(pathJoin("wgrib2", "2.0.8")) + +whatis("Description: GFS run setup CI environment") diff --git a/modulefiles/module_gwci.orion.lua b/modulefiles/module_gwci.orion.lua new file mode 100644 index 00000000000..779e80a4543 --- /dev/null +++ b/modulefiles/module_gwci.orion.lua @@ -0,0 +1,21 @@ +help([[ +Load environment to run GFS workflow ci scripts on Orion +]]) + +prepend_path("MODULEPATH", "/apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/stack") + +load(pathJoin("hpc", "1.1.0")) +load(pathJoin("hpc-intel", "2018.4")) +load(pathJoin("hpc-impi", "2018.4")) +load(pathJoin("netcdf","4.7.4")) +load(pathJoin("nccmp","1.8.7.0")) +load(pathJoin("contrib","0.1")) +load(pathJoin("wgrib2","3.0.2")) + +prepend_path("MODULEPATH", "/work2/noaa/global/wkolczyn/save/hpc-stack/modulefiles/stack") +load(pathJoin("hpc", "1.2.0")) +load(pathJoin("hpc-intel", "2018.4")) +load(pathJoin("hpc-miniconda3", "4.6.14")) +load(pathJoin("gfs_workflow", "1.0.0")) + +whatis("Description: GFS run ci top-level scripts environment") diff --git a/modulefiles/module_gwsetup.hera.lua
b/modulefiles/module_gwsetup.hera.lua new file mode 100644 index 00000000000..a07b32b6a6d --- /dev/null +++ b/modulefiles/module_gwsetup.hera.lua @@ -0,0 +1,13 @@ +help([[ +Load environment to run GFS workflow setup scripts on Hera +]]) + +load(pathJoin("rocoto")) + +-- Temporary until official hpc-stack is updated +prepend_path("MODULEPATH", "/scratch2/NCEPDEV/ensemble/save/Walter.Kolczynski/hpc-stack/modulefiles/stack") +load(pathJoin("hpc", "1.2.0")) +load(pathJoin("hpc-miniconda3", "4.6.14")) +load(pathJoin("gfs_workflow", "1.0.0")) + +whatis("Description: GFS run setup environment") diff --git a/modulefiles/module_gwsetup.orion.lua b/modulefiles/module_gwsetup.orion.lua new file mode 100644 index 00000000000..37f3187fb46 --- /dev/null +++ b/modulefiles/module_gwsetup.orion.lua @@ -0,0 +1,17 @@ +help([[ +Load environment to run GFS workflow ci scripts on Orion +]]) + +-- Temporary until official hpc-stack is updated + +prepend_path("MODULEPATH", "/apps/modulefiles/core") +load(pathJoin("contrib","0.1")) +load(pathJoin("rocoto","1.3.3")) +load(pathJoin("git","2.28.0")) + +prepend_path("MODULEPATH", "/work2/noaa/global/wkolczyn/save/hpc-stack/modulefiles/stack") +load(pathJoin("hpc", "1.2.0")) +load(pathJoin("hpc-miniconda3", "4.6.14")) +load(pathJoin("gfs_workflow", "1.0.0")) + +whatis("Description: GFS run ci top-level scripts environment") diff --git a/parm/config/config.aero b/parm/config/config.aero index 3aeb33790ee..1cb3bf5679d 100644 --- a/parm/config/config.aero +++ b/parm/config/config.aero @@ -13,6 +13,15 @@ case $machine in "ORION") AERO_INPUTS_DIR="/work2/noaa/global/wkolczyn/noscrub/global-workflow/gocart_emissions" ;; + "S4") + AERO_INPUTS_DIR="/data/prod/glopara/gocart_emissions" + ;; + "WCOSS2") + AERO_INPUTS_DIR="/lfs/h2/emc/global/noscrub/emc.global/data/gocart_emissions" + ;; + "JET") + AERO_INPUTS_DIR="/lfs4/HFIP/hfv3gfs/glopara/data/gocart_emissions" + ;; *) echo "FATAL ERROR: Machine $machine unsupported for aerosols" exit 2 diff --git
a/parm/config/config.aeroanl b/parm/config/config.aeroanl new file mode 100644 index 00000000000..41d63f85490 --- /dev/null +++ b/parm/config/config.aeroanl @@ -0,0 +1,24 @@ +#!/bin/bash -x + +########## config.aeroanl ########## +# configuration common to all aero analysis tasks + +echo "BEGIN: config.aeroanl" + +export CASE_ANL=${CASE} +export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/aero/obs/config/ +export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/aero/obs/lists/gdas_aero_prototype.yaml +export AEROVARYAML=${HOMEgfs}/sorc/gdas.cd/parm/aero/variational/3dvar_gfs_aero.yaml +export STATICB_TYPE='identity' +export BERROR_YAML=${HOMEgfs}/sorc/gdas.cd/parm/aero/berror/staticb_${STATICB_TYPE}.yaml +export FV3JEDI_FIX=${HOMEgfs}/fix/gdas +export BERROR_DATA_DIR=${FV3JEDI_FIX}/bump/aero/${CASE_ANL}/ +export BERROR_DATE="20160630.000000" + +export io_layout_x=@IO_LAYOUT_X@ +export io_layout_y=@IO_LAYOUT_Y@ + +export JEDIEXE=${HOMEgfs}/exec/fv3jedi_var.x +export crtm_VERSION="2.3.0" + +echo "END: config.aeroanl" diff --git a/parm/config/config.aeroanlfinal b/parm/config/config.aeroanlfinal new file mode 100644 index 00000000000..230ec5205a9 --- /dev/null +++ b/parm/config/config.aeroanlfinal @@ -0,0 +1,10 @@ +#!/bin/bash -x + +########## config.aeroanlfinal ########## +# Post Aero Analysis specific + +echo "BEGIN: config.aeroanlfinal" + +# Get task specific resources +. $EXPDIR/config.resources aeroanlfinal +echo "END: config.aeroanlfinal" diff --git a/parm/config/config.aeroanlinit b/parm/config/config.aeroanlinit new file mode 100644 index 00000000000..72175b8d0cc --- /dev/null +++ b/parm/config/config.aeroanlinit @@ -0,0 +1,10 @@ +#!/bin/bash -x + +########## config.aeroanlinit ########## +# Pre Aero Analysis specific + +echo "BEGIN: config.aeroanlinit" + +# Get task specific resources +. 
$EXPDIR/config.resources aeroanlinit +echo "END: config.aeroanlinit" diff --git a/parm/config/config.aeroanlrun b/parm/config/config.aeroanlrun new file mode 100644 index 00000000000..da13df28316 --- /dev/null +++ b/parm/config/config.aeroanlrun @@ -0,0 +1,11 @@ +#!/bin/bash -x + +########## config.aeroanlrun ########## +# Aerosol Analysis specific + +echo "BEGIN: config.aeroanlrun" + +# Get task specific resources +. $EXPDIR/config.resources aeroanlrun + +echo "END: config.aeroanlrun" diff --git a/parm/config/config.aerosol_init b/parm/config/config.aerosol_init index 9cd640a6513..0e586e02315 100644 --- a/parm/config/config.aerosol_init +++ b/parm/config/config.aerosol_init @@ -7,4 +7,4 @@ echo "BEGIN: config.aerosol_init" # Get task specific resources source $EXPDIR/config.resources aerosol_init -echo "END: config.aerosol_init" +echo "END: config.aerosol_init" \ No newline at end of file diff --git a/parm/config/config.anal b/parm/config/config.anal old mode 100755 new mode 100644 index 6d3a48c82eb..018bab95979 --- a/parm/config/config.anal +++ b/parm/config/config.anal @@ -6,44 +6,35 @@ echo "BEGIN: config.anal" # Get task specific resources -. $EXPDIR/config.resources anal +. ${EXPDIR}/config.resources anal -if [ $DONST = "YES" ]; then - . $EXPDIR/config.nsst +if [[ ${DONST} = "YES" ]]; then + . ${EXPDIR}/config.nsst fi -if [[ "$CDATE" = "$FDATE" && $EXP_WARM_START = ".false." ]]; then # Cold starting - export USE_RADSTAT="NO" +if [[ "${CDUMP}" = "gfs" ]] ; then + export USE_RADSTAT="NO" # This can be only used when bias correction is not-zero. + export GENDIAG="NO" + export SETUP='diag_rad=.false.,diag_pcp=.false.,diag_conv=.false.,diag_ozone=.false.,write_diag(3)=.false.,niter(2)=100,' + export DIAG_TARBALL="YES" fi -if [[ "$CDUMP" = "gfs" ]] ; then - export USE_RADSTAT="NO" # This can be only used when bias correction is not-zero. 
- export GENDIAG="NO" - export SETUP='diag_rad=.false.,diag_pcp=.false.,diag_conv=.false.,diag_ozone=.false.,write_diag(3)=.false.,niter(2)=100,' - export DIAG_TARBALL="NO" -fi - -export npe_gsi=$npe_anal +export npe_gsi=${npe_anal} -if [[ "$CDUMP" == "gfs" ]] ; then - export npe_gsi=$npe_anal_gfs - export nth_anal=$nth_anal_gfs +if [[ "${CDUMP}" == "gfs" ]] ; then + export npe_gsi=${npe_anal_gfs} + export nth_anal=${nth_anal_gfs} fi # Set parameters specific to L127 -if [ $LEVS = "128" ]; then - export GRIDOPTS="nlayers(63)=1,nlayers(64)=1," - export SETUP="gpstop=55,nsig_ext=56,${SETUP:-}" +if [[ ${LEVS} = "128" ]]; then + export GRIDOPTS="nlayers(63)=1,nlayers(64)=1," + export SETUP="gpstop=55,nsig_ext=56,${SETUP:-}" fi # Set namelist option for LETKF export lobsdiag_forenkf=".false." # anal does not need to write out jacobians - # set to .true. in config.eobs and config.eupd - -if [ $OUTPUT_FILE = "nemsio" ]; then - export DO_CALC_INCREMENT="YES" - export DO_CALC_ANALYSIS="NO" -fi + # set to .true. 
in config.eobs and config.eupd # Do not process the following datasets export GSNDBF=${GSNDBF:-/dev/null} @@ -54,124 +45,102 @@ export AMSR2BF=${AMSR2BF:-/dev/null} # Set default values for info files and observation error # NOTE: Remember to set PRVT in config.prep as OBERROR is set below -export CONVINFO=$FIXgsi/global_convinfo.txt -export OZINFO=$FIXgsi/global_ozinfo.txt -export SATINFO=$FIXgsi/global_satinfo.txt -export OBERROR=$FIXgsi/prepobs_errtable.global +export CONVINFO=${FIXgsi}/global_convinfo.txt +export OZINFO=${FIXgsi}/global_ozinfo.txt +export SATINFO=${FIXgsi}/global_satinfo.txt +export OBERROR=${FIXgsi}/prepobs_errtable.global # Use experimental dumps in EMC GFS v16 parallels -if [[ $RUN_ENVIR == "emc" ]]; then - export ABIBF="/dev/null" - if [[ "$CDATE" -ge "2019022800" ]] ; then - export ABIBF="$DMPDIR/${CDUMP}x.${PDY}/${cyc}/atmos/${CDUMP}.t${cyc}z.gsrcsr.tm00.bufr_d" - if [[ "$CDATE" -ge "2019111000" && "$CDATE" -le "2020052612" ]]; then - export ABIBF="$DMPDIR/${CDUMP}y.${PDY}/${cyc}/atmos/${CDUMP}.t${cyc}z.gsrcsr.tm00.bufr_d" - fi - fi - - export AHIBF="/dev/null" - if [[ "$CDATE" -ge "2019042300" ]]; then - export AHIBF="$DMPDIR/${CDUMP}x.${PDY}/${cyc}/atmos/${CDUMP}.t${cyc}z.ahicsr.tm00.bufr_d" - fi - - export HDOB=$DMPDIR/${CDUMP}x.${PDY}/${cyc}/atmos/${CDUMP}.t${cyc}z.hdob.tm00.bufr_d - - # Use dumps from NCO GFS v16 parallel - if [[ "$CDATE" -ge "2020103012" ]]; then - export ABIBF="" - export AHIBF="" - export HDOB="" - fi - - # Set info files and prepobs.errtable.global for GFS v16 retrospective parallels - if [[ "$CDATE" -ge "2019021900" && "$CDATE" -lt "2019110706" ]]; then - export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2019021900 - export OBERROR=$FIXgsi/gfsv16_historical/prepobs_errtable.global.2019021900 - fi - - # Place GOES-15 AMVs in monitor, assimilate GOES-17 AMVs, assimilate KOMPSAT-5 gps - if [[ "$CDATE" -ge "2019110706" && "$CDATE" -lt "2020040718" ]]; then - export 
CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2019110706 - export OBERROR=$FIXgsi/gfsv16_historical/prepobs_errtable.global.2019110706 - fi - - # Assimilate 135 (T) & 235 (uv) Canadian AMDAR observations - if [[ "$CDATE" -ge "2020040718" && "$CDATE" -lt "2020052612" ]]; then - export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2020040718 - export OBERROR=$FIXgsi/gfsv16_historical/prepobs_errtable.global.2020040718 - fi - - # Assimilate COSMIC-2 - if [[ "$CDATE" -ge "2020052612" && "$CDATE" -lt "2020082412" ]]; then - export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2020052612 - export OBERROR=$FIXgsi/gfsv16_historical/prepobs_errtable.global.2020040718 - fi - - # Assimilate HDOB - if [[ "$CDATE" -ge "2020082412" && "$CDATE" -lt "2020091612" ]]; then - export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2020082412 - fi - - # Assimilate Metop-C GNSSRO - if [[ "$CDATE" -ge "2020091612" && "$CDATE" -lt "2021031712" ]]; then - export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2020091612 - fi - - # Assimilate DO-2 GeoOptics - if [[ "$CDATE" -ge "2021031712" && "$CDATE" -lt "2021091612" ]]; then - export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2021031712 - fi - - # NOTE: - # As of 2021110312, gfsv16_historical/global_convinfo.txt.2021110312 is - # identical to ../global_convinfo.txt. Thus, the logic below is not - # needed at this time. 
- # Assimilate COSMIC-2 GPS - # if [[ "$CDATE" -ge "2021110312" && "$CDATE" -lt "YYYYMMDDHH" ]]; then - # export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2021110312 - # fi - - # Turn off assmilation of OMPS during period of bad data - if [[ "$CDATE" -ge "2020011600" && "$CDATE" -lt "2020011806" ]]; then - export OZINFO=$FIXgsi/gfsv16_historical/global_ozinfo.txt.2020011600 - fi - - - # Set satinfo for start of GFS v16 parallels - if [[ "$CDATE" -ge "2019021900" && "$CDATE" -lt "2019110706" ]]; then - export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2019021900 - fi - - # Turn on assimilation of Metop-C AMSUA and MHS - if [[ "$CDATE" -ge "2019110706" && "$CDATE" -lt "2020022012" ]]; then - export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2019110706 - fi - - # Turn off assimilation of Metop-A MHS - if [[ "$CDATE" -ge "2020022012" && "$CDATE" -lt "2021052118" ]]; then - export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2020022012 - fi - - # Turn off assimilation of S-NPP CrIS - if [[ "$CDATE" -ge "2021052118" && "$CDATE" -lt "2021092206" ]]; then - export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2021052118 - fi - - # Turn off assimilation of MetOp-A IASI - if [[ "$CDATE" -ge "2021092206" && "$CDATE" -lt "2021102612" ]]; then - export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2021092206 - fi - - # NOTE: - # As of 2021110312, gfsv16_historical/global_satinfo.txt.2021110312 is - # identical to ../global_satinfo.txt. 
Thus, the logic below is not - # needed at this time - # - # Turn off assmilation of all Metop-A MHS - # if [[ "$CDATE" -ge "2021110312" && "$CDATE" -lt "YYYYMMDDHH" ]]; then - # export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2021110312 - # fi +if [[ ${RUN_ENVIR} == "emc" ]]; then + # Set info files and prepobs.errtable.global for GFS v16 retrospective parallels + if [[ "${CDATE}" -ge "2019021900" && "${CDATE}" -lt "2019110706" ]]; then + export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2019021900 + export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2019021900 + fi + + # Place GOES-15 AMVs in monitor, assimilate GOES-17 AMVs, assimilate KOMPSAT-5 gps + if [[ "${CDATE}" -ge "2019110706" && "${CDATE}" -lt "2020040718" ]]; then + export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2019110706 + export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2019110706 + fi + + # Assimilate 135 (T) & 235 (uv) Canadian AMDAR observations + if [[ "${CDATE}" -ge "2020040718" && "${CDATE}" -lt "2020052612" ]]; then + export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020040718 + export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2020040718 + fi + + # Assimilate COSMIC-2 + if [[ "${CDATE}" -ge "2020052612" && "${CDATE}" -lt "2020082412" ]]; then + export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020052612 + export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2020040718 + fi + + # Assimilate HDOB + if [[ "${CDATE}" -ge "2020082412" && "${CDATE}" -lt "2020091612" ]]; then + export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020082412 + fi + + # Assimilate Metop-C GNSSRO + if [[ "${CDATE}" -ge "2020091612" && "${CDATE}" -lt "2021031712" ]]; then + export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020091612 + fi + + # Assimilate DO-2 GeoOptics + if [[ "${CDATE}" -ge "2021031712" && "${CDATE}" -lt "2021091612" ]]; then + export 
CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2021031712 + fi + + # NOTE: + # As of 2021110312, gfsv16_historical/global_convinfo.txt.2021110312 is + # identical to ../global_convinfo.txt. Thus, the logic below is not + # needed at this time. + # Assimilate COSMIC-2 GPS + # if [[ "$CDATE" -ge "2021110312" && "$CDATE" -lt "YYYYMMDDHH" ]]; then + # export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2021110312 + # fi + + # Turn off assimilation of OMPS during period of bad data + if [[ "${CDATE}" -ge "2020011600" && "${CDATE}" -lt "2020011806" ]]; then + export OZINFO=${FIXgsi}/gfsv16_historical/global_ozinfo.txt.2020011600 + fi + + + # Set satinfo for start of GFS v16 parallels + if [[ "${CDATE}" -ge "2019021900" && "${CDATE}" -lt "2019110706" ]]; then + export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2019021900 + fi + + # Turn on assimilation of Metop-C AMSUA and MHS + if [[ "${CDATE}" -ge "2019110706" && "${CDATE}" -lt "2020022012" ]]; then + export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2019110706 + fi + + # Turn off assimilation of Metop-A MHS + if [[ "${CDATE}" -ge "2020022012" && "${CDATE}" -lt "2021052118" ]]; then + export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2020022012 + fi + + # Turn off assimilation of S-NPP CrIS + if [[ "${CDATE}" -ge "2021052118" && "${CDATE}" -lt "2021092206" ]]; then + export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2021052118 + fi + + # Turn off assimilation of MetOp-A IASI + if [[ "${CDATE}" -ge "2021092206" && "${CDATE}" -lt "2021102612" ]]; then + export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2021092206 + fi + + # NOTE: + # As of 2021110312, gfsv16_historical/global_satinfo.txt.2021110312 is + # identical to ../global_satinfo.txt. 
Thus, the logic below is not + # needed at this time + # + # Turn off assimilation of all Metop-A MHS + # if [[ "$CDATE" -ge "2021110312" && "$CDATE" -lt "YYYYMMDDHH" ]]; then + # export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2021110312 + # fi fi echo "END: config.anal" diff --git a/parm/config/config.analcalc b/parm/config/config.analcalc old mode 100755 new mode 100644 index c02aafc2c34..9405114ecc6 --- a/parm/config/config.analcalc +++ b/parm/config/config.analcalc @@ -8,4 +8,8 @@ echo "BEGIN: config.analcalc" # Get task specific resources . $EXPDIR/config.resources analcalc +if [[ "$CDUMP" == "gfs" ]]; then + export nth_echgres=$nth_echgres_gfs +fi + echo "END: config.analcalc" diff --git a/parm/config/config.analdiag b/parm/config/config.analdiag old mode 100755 new mode 100644 diff --git a/parm/config/config.arch b/parm/config/config.arch old mode 100755 new mode 100644 diff --git a/parm/config/config.atmanl b/parm/config/config.atmanl new file mode 100644 index 00000000000..c0cd9e6733d --- /dev/null +++ b/parm/config/config.atmanl @@ -0,0 +1,24 @@ +#! 
/usr/bin/env bash + +########## config.atmanl ########## +# configuration common to all atm var analysis tasks + +echo "BEGIN: config.atmanl" + +export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/config/ +export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/lists/gdas_prototype_3d.yaml +export ATMVARYAML=${HOMEgfs}/sorc/gdas.cd/parm/atm/variational/3dvar_dripcg.yaml +export STATICB_TYPE="gsibec" +export BERROR_YAML=${HOMEgfs}/sorc/gdas.cd/parm/atm/berror/staticb_${STATICB_TYPE}.yaml +export INTERP_METHOD='barycentric' + +export layout_x=1 +export layout_y=1 + +export io_layout_x=1 +export io_layout_y=1 + +export JEDIEXE=${HOMEgfs}/exec/fv3jedi_var.x +export crtm_VERSION="2.3.0" + +echo "END: config.atmanl" diff --git a/parm/config/config.atmanlfinal b/parm/config/config.atmanlfinal new file mode 100644 index 00000000000..a6b714f7fcb --- /dev/null +++ b/parm/config/config.atmanlfinal @@ -0,0 +1,10 @@ +#! /usr/bin/env bash + +########## config.atmanlfinal ########## +# Post Atm Var Analysis specific + +echo "BEGIN: config.atmanlfinal" + +# Get task specific resources +. "${EXPDIR}/config.resources" atmanlfinal +echo "END: config.atmanlfinal" diff --git a/parm/config/config.atmanlinit b/parm/config/config.atmanlinit new file mode 100644 index 00000000000..bc95ef4962a --- /dev/null +++ b/parm/config/config.atmanlinit @@ -0,0 +1,10 @@ +#! /usr/bin/env bash + +########## config.atmanlinit ########## +# Pre Atm Var Analysis specific + +echo "BEGIN: config.atmanlinit" + +# Get task specific resources +. "${EXPDIR}/config.resources" atmanlinit +echo "END: config.atmanlinit" diff --git a/parm/config/config.atmanlrun b/parm/config/config.atmanlrun new file mode 100644 index 00000000000..68b76157186 --- /dev/null +++ b/parm/config/config.atmanlrun @@ -0,0 +1,11 @@ +#! /usr/bin/env bash + +########## config.atmanlrun ########## +# Atm Var Analysis specific + +echo "BEGIN: config.atmanlrun" + +# Get task specific resources +. 
"${EXPDIR}/config.resources" atmanlrun + +echo "END: config.atmanlrun" diff --git a/parm/config/config.atmensanl b/parm/config/config.atmensanl new file mode 100755 index 00000000000..4d945ea717e --- /dev/null +++ b/parm/config/config.atmensanl @@ -0,0 +1,22 @@ +#! /usr/bin/env bash + +########## config.atmensanl ########## +# configuration common to all atm ens analysis tasks + +echo "BEGIN: config.atmensanl" + +export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/config/ +export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/lists/lgetkf_prototype.yaml +export ATMENSYAML=${HOMEgfs}/sorc/gdas.cd/parm/atm/lgetkf/lgetkf.yaml +export INTERP_METHOD='barycentric' + +export layout_x=1 +export layout_y=1 + +export io_layout_x=1 +export io_layout_y=1 + +export JEDIEXE=${HOMEgfs}/exec/fv3jedi_letkf.x +export crtm_VERSION="2.3.0" + +echo "END: config.atmensanl" diff --git a/parm/config/config.atmensanlfinal b/parm/config/config.atmensanlfinal new file mode 100755 index 00000000000..5d8ec458c38 --- /dev/null +++ b/parm/config/config.atmensanlfinal @@ -0,0 +1,10 @@ +#! /usr/bin/env bash + +########## config.atmensanlfinal ########## +# Post Atm Ens Analysis specific + +echo "BEGIN: config.atmensanlfinal" + +# Get task specific resources +. "${EXPDIR}/config.resources" atmensanlfinal +echo "END: config.atmensanlfinal" diff --git a/parm/config/config.atmensanlinit b/parm/config/config.atmensanlinit new file mode 100755 index 00000000000..34429023bbb --- /dev/null +++ b/parm/config/config.atmensanlinit @@ -0,0 +1,10 @@ +#! /usr/bin/env bash + +########## config.atmensanlinit ########## +# Pre Atm Ens Analysis specific + +echo "BEGIN: config.atmensanlinit" + +# Get task specific resources +. "${EXPDIR}/config.resources" atmensanlinit +echo "END: config.atmensanlinit" diff --git a/parm/config/config.atmensanlrun b/parm/config/config.atmensanlrun new file mode 100755 index 00000000000..01f211a17a0 --- /dev/null +++ b/parm/config/config.atmensanlrun @@ -0,0 +1,11 @@ +#! 
/usr/bin/env bash + +########## config.atmensanlrun ########## +# Atm Ens Analysis specific + +echo "BEGIN: config.atmensanlrun" + +# Get task specific resources +. "${EXPDIR}/config.resources" atmensanlrun + +echo "END: config.atmensanlrun" diff --git a/parm/config/config.awips b/parm/config/config.awips old mode 100755 new mode 100644 diff --git a/parm/config/config.base.nco.static b/parm/config/config.base.nco.static old mode 100755 new mode 100644 index a94f0be8631..e3702852f4a --- a/parm/config/config.base.nco.static +++ b/parm/config/config.base.nco.static @@ -31,7 +31,7 @@ export SCRgfs=$HOMEgfs/scripts # GLOBAL static environment parameters -export NWPROD="/gpfs/dell1/nco/ops/nwprod" +export PACKAGEROOT="/lfs/h1/ops/prod/packages" export RTMFIX=$CRTM_FIX # Machine specific paths used everywhere @@ -62,12 +62,10 @@ export REALTIME="YES" # CLEAR #################################################### # Build paths relative to $HOMEgfs -export FIXgsi="$HOMEgfs/fix/fix_gsi" +export FIXgsi="$HOMEgfs/fix/gsi" export HOMEfv3gfs="$HOMEgfs/sorc/fv3gfs.fd" export HOMEpost="$HOMEgfs" -export HOMEobsproc_prep="$NWPROD/obsproc_prep.v5.5.0" -export HOMEobsproc_network="$NWPROD/obsproc_global.v3.4.2" -export HOMEobsproc_global=$HOMEobsproc_network +export HOMEobsproc="/lfs/h1/ops/prod/packages/obsproc.v1.1.2" # CONVENIENT utility scripts and other environment parameters export NCP="/bin/cp -p" @@ -89,7 +87,7 @@ export EDATE=2039123100 export assim_freq=6 export PSLOT="test" export EXPDIR="$EXPDIR" -export ROTDIR="$ROTDIR" +export ROTDIR="$(compath.py ${envir}/${NET}/${gfs_ver})" export ROTDIR_DUMP="YES" export DUMP_SUFFIX="" export RUNDIR="$DATAROOT" @@ -166,25 +164,12 @@ export restart_interval_gfs=12 # I/O QUILTING, true--use Write Component; false--use GFDL FMS # if quilting=true, choose OUTPUT_GRID as cubed_sphere_grid in netcdf or gaussian_grid -# if gaussian_grid, set OUTPUT_FILE for nemsio or netcdf # WRITE_DOPOST=true, use inline POST export QUILTING=".true." 
export OUTPUT_GRID="gaussian_grid" -export OUTPUT_FILE="netcdf" -export WRITE_DOPOST=".true." +export WRITE_DOPOST=".true." # WRITE_DOPOST=true, use inline POST export WRITE_NSFLIP=".true." -# suffix options depending on file format -if [ $OUTPUT_FILE = "netcdf" ]; then - export SUFFIX=".nc" - export NEMSIO_IN=".false." - export NETCDF_IN=".true." -else - export SUFFIX=".nemsio" - export NEMSIO_IN=".true." - export NETCDF_IN=".false." -fi - # IAU related parameters export DOIAU="YES" # Enable 4DIAU for control with 3 increments export IAUFHRS="3,6,9" @@ -245,11 +230,6 @@ export nst_anl=.true. # Analysis increments to zero in CALCINCEXEC export INCREMENTS_TO_ZERO="'liq_wat_inc','icmr_inc'" -if [ $OUTPUT_FILE = "nemsio" ]; then - export DO_CALC_INCREMENT="YES" - export DO_CALC_ANALYSIS="NO" -fi - # Stratospheric increments to zero export INCVARS_ZERO_STRAT="'sphum_inc','liq_wat_inc','icmr_inc'" export INCVARS_EFOLD="5" diff --git a/parm/config/config.com b/parm/config/config.com new file mode 100644 index 00000000000..40cba6da5a3 --- /dev/null +++ b/parm/config/config.com @@ -0,0 +1,92 @@ +# shellcheck shell=bash +# Ignore shellcheck warnings about variables not being expanded; this is what we want +# shellcheck disable=SC2016 +echo "BEGIN: config.com" + +# These are just templates. All templates must use single quotations so variable +# expansion does not occur when this file is sourced. Substitution happens later +# during runtime. It is recommended to use the helper function `generate_com()`, +# to do this substitution, which is defined in `ush/preamble.sh`. 
+# +# Syntax for generate_com(): +# generate_com [-rx] $var1[:$tmpl1] [$var2[:$tmpl2]] [...]] +# +# options: +# -r: Make variable read-only (same as `declare -r`) +# -x: Mark variable for export (same as `declare -x`) +# var1, var2, etc: Variable names whose values will be generated from a template +# and declared +# tmpl1, tmpl2, etc: Specify the template to use (default is "${var}_TMPL") +# +# Examples: +# # Current cycle and RUN +# YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS +# +# # Previous cycle and gdas +# RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ +# COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL +# +# # Current cycle and COM for first member +# MEMDIR='mem001' YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY +# + +# +# If any restart, input, or analysis template is updated, `setup_expt.py.fill_COMROT_cycled()` +# must correspondingly be updated to match. +# +if [[ "${RUN_ENVIR:-emc}" == "nco" ]]; then + COM_OBS_TMPL=$(compath.py "${envir}/obsproc/${obsproc_ver}")'/${RUN}.${YMD}/${HH}/atmos' + COM_RTOFS_TMPL=$(compath.py "${envir}/${WAVECUR_DID}/${rtofs_ver}") +else + COM_OBS_TMPL='${ROTDIR}/${RUN}.${YMD}/${HH}/obs' + COM_RTOFS_TMPL='${DMPDIR}' +fi +declare -rx COM_OBS_TMPL COM_RTOFS_TMPL +declare -rx COM_OBSDMP_TMPL='${DMPDIR}/${DUMP}${DUMP_SUFFIX}.${YMD}/${HH}/atmos' + +COM_BASE='${ROTDIR}/${RUN}.${YMD}/${HH}/${MEMDIR}' + +declare -rx COM_TOP_TMPL='${ROTDIR}/${RUN}.${YMD}/${HH}' + +declare -rx COM_ATMOS_INPUT_TMPL=${COM_BASE}'/model_data/atmos/input' +declare -rx COM_ATMOS_RESTART_TMPL=${COM_BASE}'/model_data/atmos/restart' +declare -rx COM_ATMOS_ANALYSIS_TMPL=${COM_BASE}'/analysis/atmos' +declare -rx COM_ATMOS_HISTORY_TMPL=${COM_BASE}'/model_data/atmos/history' +declare -rx COM_ATMOS_MASTER_TMPL=${COM_BASE}'/model_data/atmos/master' +declare -rx COM_ATMOS_GRIB_TMPL=${COM_BASE}'/products/atmos/grib2/${GRID}' +declare -rx COM_ATMOS_BUFR_TMPL=${COM_BASE}'/products/atmos/bufr' +declare -rx 
COM_ATMOS_GEMPAK_TMPL=${COM_BASE}'/products/atmos/gempak/${GRID}' +declare -rx COM_ATMOS_GENESIS_TMPL=${COM_BASE}'/products/atmos/cyclone/genesis_vital' +declare -rx COM_ATMOS_TRACK_TMPL=${COM_BASE}'/products/atmos/cyclone/tracks' +declare -rx COM_ATMOS_GOES_TMPL=${COM_BASE}'/products/atmos/goes_sim' +declare -rx COM_ATMOS_IMAGERY_TMPL=${COM_BASE}'/products/atmos/imagery' +declare -rx COM_ATMOS_MINMON_TMPL=${COM_BASE}'/products/atmos/minmon' +declare -rx COM_ATMOS_WAFS_TMPL=${COM_BASE}'/products/atmos/wafs' +declare -rx COM_ATMOS_WMO_TMPL=${COM_BASE}'/products/atmos/wmo' + +declare -rx COM_WAVE_RESTART_TMPL=${COM_BASE}'/model_data/wave/restart' +declare -rx COM_WAVE_PREP_TMPL=${COM_BASE}'/model_data/wave/prep' +declare -rx COM_WAVE_HISTORY_TMPL=${COM_BASE}'/model_data/wave/history' +declare -rx COM_WAVE_GRID_TMPL=${COM_BASE}'/products/wave/gridded' +declare -rx COM_WAVE_STATION_TMPL=${COM_BASE}'/products/wave/station' +declare -rx COM_WAVE_GEMPAK_TMPL=${COM_BASE}'/products/wave/gempak' +declare -rx COM_WAVE_WMO_TMPL=${COM_BASE}'/products/wave/wmo' + +declare -rx COM_OCEAN_HISTORY_TMPL=${COM_BASE}'/model_data/ocean/history' +declare -rx COM_OCEAN_RESTART_TMPL=${COM_BASE}'/model_data/ocean/restart' +declare -rx COM_OCEAN_INPUT_TMPL=${COM_BASE}'/model_data/ocean/input' +declare -rx COM_OCEAN_ANALYSIS_TMPL=${COM_BASE}'/analysis/ocean' +declare -rx COM_OCEAN_2D_TMPL=${COM_BASE}'/products/ocean/2D' +declare -rx COM_OCEAN_3D_TMPL=${COM_BASE}'/products/ocean/3D' +declare -rx COM_OCEAN_DAILY_TMPL=${COM_BASE}'/products/ocean/daily' +declare -rx COM_OCEAN_XSECT_TMPL=${COM_BASE}'/products/ocean/xsect' +declare -rx COM_OCEAN_GRIB_TMPL=${COM_BASE}'/products/ocean/grib2/${GRID}' + +declare -rx COM_ICE_INPUT_TMPL=${COM_BASE}'/model_data/ice/input' +declare -rx COM_ICE_HISTORY_TMPL=${COM_BASE}'/model_data/ice/history' +declare -rx COM_ICE_RESTART_TMPL=${COM_BASE}'/model_data/ice/restart' + +declare -rx COM_CHEM_HISTORY_TMPL=${COM_BASE}'/model_data/chem/history' +declare -rx 
COM_CHEM_ANALYSIS_TMPL=${COM_BASE}'/analysis/chem' + +declare -rx COM_MED_RESTART_TMPL=${COM_BASE}'/model_data/med/restart' diff --git a/parm/config/config.coupled_ic b/parm/config/config.coupled_ic old mode 100755 new mode 100644 index 0df82591d94..50fab283b59 --- a/parm/config/config.coupled_ic +++ b/parm/config/config.coupled_ic @@ -5,18 +5,39 @@ echo "BEGIN: config.coupled_ic" # Get task specific resources -source $EXPDIR/config.resources coupled_ic +source ${EXPDIR}/config.resources coupled_ic -if [[ "$machine" == "HERA" ]]; then +if [[ "${machine}" == "WCOSS2" ]]; then + export BASE_CPLIC="/lfs/h2/emc/couple/noscrub/Jiande.Wang/IC" +elif [[ "${machine}" == "HERA" ]]; then export BASE_CPLIC="/scratch1/NCEPDEV/climate/role.ufscpara/IC" -elif [[ "$machine" == "ORION" ]]; then - export BASE_CPLIC="/work/noaa/global/wkolczyn/noscrub/global-workflow/IC" +elif [[ "${machine}" == "ORION" ]]; then + export BASE_CPLIC="/work/noaa/global/glopara/data/ICSDIR/prototype_ICs" +elif [[ "${machine}" == "S4" ]]; then + export BASE_CPLIC="/data/prod/glopara/coupled_ICs" +elif [[ "${machine}" == "JET" ]]; then + export BASE_CPLIC="/mnt/lfs4/HFIP/hfv3gfs/glopara/data/ICSDIR/prototype_ICs" fi -export CPL_ATMIC=GEFS-NoahMP-aerosols-p8c -export CPL_ICEIC=CPC -export CPL_OCNIC=CPC3Dvar -export CPL_WAVIC=GEFSwave20210528v2 -export CPL_DATM=CDEPS_DATM + +case "${CASE}" in + "C384") + #C384 and P8 ICs + export CPL_ATMIC=GEFS-NoahMP-aerosols-p8c + export CPL_ICEIC=CPC + export CPL_OCNIC=CPC3Dvar + export CPL_WAVIC=GEFSwave20210528v2 + ;; + "C768") + export CPL_ATMIC=HR1 + export CPL_ICEIC=HR1 + export CPL_OCNIC=HR1 + export CPL_WAVIC=HR1 + ;; + *) + echo "Unrecognized case: ${CASE}" + exit 1 + ;; +esac echo "END: config.coupled_ic" diff --git a/parm/config/config.defaults.s2sw b/parm/config/config.defaults.s2sw index 5032a998ad5..7f751e02bc9 100644 --- a/parm/config/config.defaults.s2sw +++ b/parm/config/config.defaults.s2sw @@ -3,7 +3,6 @@ # Empty variables must include a space otherwise 
they will be overwritten # config.base -# CASE=C384 FHMAX_GFS_00=48 FHMAX_GFS_06=48 FHMAX_GFS_12=48 @@ -15,28 +14,15 @@ FHOUT_HF_GFS=-1 min_seaice="1.0e-6" use_cice_alb=".true." -# config.fv3 -DELTIM=300 -layout_x_gfs=8 -layout_y_gfs=8 -WRITE_GROUP_GFS=1 -WRTTASK_PER_GROUP_GFS=24 -#The settings below will result in S2SWA running 35 days under 8 hours wallclock on hera -#layout_x_gfs=24 -#layout_y_gfs=16 -#WRTTASK_PER_GROUP_GFS=86 -WRTIOBUF="32M" -MEDPETS=300 - # config.wave -waveGRD='gwes_30m' -waveinterpGRD=' ' -waveuoutpGRD='gwes_30m' -MESH_WAV='mesh.gwes_30m.nc' +waveGRD='mx025' +waveinterpGRD='reg025' +waveuoutpGRD='mx025' +MESH_WAV='mesh.mx025.nc' waveesmfGRD=' ' -wavepostGRD='gwes_30m' +wavepostGRD=' ' waveGRDN="1" waveGRDG="10" USE_WAV_RMP="NO" diff --git a/parm/config/config.earc b/parm/config/config.earc old mode 100755 new mode 100644 diff --git a/parm/config/config.ecen b/parm/config/config.ecen old mode 100755 new mode 100644 diff --git a/parm/config/config.echgres b/parm/config/config.echgres old mode 100755 new mode 100644 diff --git a/parm/config/config.ediag b/parm/config/config.ediag old mode 100755 new mode 100644 diff --git a/parm/config/config.efcs b/parm/config/config.efcs old mode 100755 new mode 100644 index 2e8979faf07..a9b410e416c --- a/parm/config/config.efcs +++ b/parm/config/config.efcs @@ -5,29 +5,43 @@ echo "BEGIN: config.efcs" +# TODO: the _ENKF counterparts need to be defined in config.base +export DO_AERO=${DO_AERO_ENKF:-"NO"} +export DO_OCN=${DO_OCN_ENKF:-"NO"} +export DO_ICE=${DO_ICE_ENKF:-"NO"} +export DO_WAVE=${DO_WAVE_ENKF:-"NO"} + +# TODO: Possibly need OCNRES_ENKF, ICERES_ENKF, WAVRES_ENKF too +if [[ ${DO_OCN} == "YES" ]]; then + case "$CASE_ENKF" in + "C48") export OCNRES=500;; + "C96") export OCNRES=100;; + "C192") export OCNRES=050;; + "C384") export OCNRES=025;; + "C768") export OCNRES=025;; + *) export OCNRES=025;; + esac +fi +[[ ${DO_ICE} == "YES" ]] && export ICERES=$OCNRES +[[ ${DO_WAVE} == "YES" ]] && export 
waveGRD=${waveGRD_ENKF:-$waveGRD} # TODO: will we run waves with a different resolution in the ensemble? + # Source model specific information that is resolution dependent -. $EXPDIR/config.fv3 $CASE_ENKF +string="--fv3 $CASE_ENKF" +[[ ${DO_OCN} == "YES" ]] && string="$string --mom6 $OCNRES" +[[ ${DO_ICE} == "YES" ]] && string="$string --cice6 $ICERES" +[[ ${DO_WAVE} == "YES" ]] && string="$string --ww3 ${waveGRD// /;}" +source $EXPDIR/config.ufs ${string} # Get task specific resources . $EXPDIR/config.resources efcs -export npe_fv3=$npe_efcs - -if [ $QUILTING = ".true." ]; then - export npe_fv3=$(echo " $npe_fv3 + $WRITE_GROUP * $WRTTASK_PER_GROUP" | bc) - export npe_efcs=$npe_fv3 -fi - -# Only use serial I/O for ensemble on Hera and Orion (lustre?) -case $machine in - "HERA" | "ORION") - export OUTPUT_FILETYPE_ATM="netcdf" - export OUTPUT_FILETYPE_SFC="netcdf" - ;; -esac +# Use serial I/O for ensemble (lustre?) +export OUTPUT_FILETYPE_ATM="netcdf" +export OUTPUT_FILETYPE_SFC="netcdf" # Number of enkf members per fcst job export NMEM_EFCSGRP=2 +export NMEM_EFCSGRP_GFS=1 export RERUN_EFCSGRP="NO" # Turn off inline UPP for EnKF forecast @@ -53,11 +67,7 @@ export SPPT_LOGIT=".true." export SPPT_SFCLIMIT=".true." if [ $QUILTING = ".true." -a $OUTPUT_GRID = "gaussian_grid" ]; then - if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]] ; then - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da_gsl" - else - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da" - fi + export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da" else export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da_orig" fi @@ -75,7 +85,7 @@ export cplwav=.false. 
# ocean model resolution case "$CASE_ENKF" in - "C48") export OCNRES=400;; + "C48") export OCNRES=500;; "C96") export OCNRES=100;; "C192") export OCNRES=050;; "C384") export OCNRES=025;; diff --git a/parm/config/config.eobs b/parm/config/config.eobs old mode 100755 new mode 100644 diff --git a/parm/config/config.epos b/parm/config/config.epos old mode 100755 new mode 100644 diff --git a/parm/config/config.esfc b/parm/config/config.esfc old mode 100755 new mode 100644 diff --git a/parm/config/config.eupd b/parm/config/config.eupd old mode 100755 new mode 100644 diff --git a/parm/config/config.fcst b/parm/config/config.fcst old mode 100755 new mode 100644 index c1f8324a942..2a57647644d --- a/parm/config/config.fcst +++ b/parm/config/config.fcst @@ -5,17 +5,19 @@ echo "BEGIN: config.fcst" -# set -eu - -# Source model specific information that is resolution dependent -. $EXPDIR/config.fv3 $CASE - # Turn off waves if not used for this CDUMP case $WAVE_CDUMP in - both | $CDUMP ) ;; # Don't change + both | ${CDUMP/enkf} ) ;; # Don't change *) DO_WAVE="NO" ;; # Turn waves off esac +# Source model specific information that is resolution dependent +string="--fv3 $CASE" +[[ ${DO_OCN} == "YES" ]] && string="$string --mom6 $OCNRES" +[[ ${DO_ICE} == "YES" ]] && string="$string --cice6 $ICERES" +[[ ${DO_WAVE} == "YES" ]] && string="$string --ww3 ${waveGRD// /;}" +source $EXPDIR/config.ufs ${string} + # Source component configs if necessary for component in WAVE OCN ICE AERO; do control="DO_${component}" @@ -29,7 +31,7 @@ done export domains_stack_size="16000000" -if [ $DONST = "YES" ]; then +if [[ "$DONST" = "YES" ]]; then . $EXPDIR/config.nsst fi @@ -39,13 +41,8 @@ export esmf_logkind="ESMF_LOGKIND_MULTI_ON_ERROR" #Options: ESMF_LOGKIND_MULTI_O ####################################################################### # COUPLING COMPONENTS -export OCN_model="mom6" -export ICE_model="cice6" -export WAV_model="ww3" -export CHM_model="gocart" # cpl defaults - export cpl=".false." 
export cplflx=".false." export cplice=".false." @@ -54,31 +51,27 @@ export cplwav=".false." # cpl changes based on APP -if [ $DO_COUPLED = "YES" ]; then +if [[ "$DO_COUPLED" = "YES" ]]; then export cpl=".true." fi -if [ $DO_AERO = "YES" ]; then +if [[ "$DO_AERO" = "YES" ]]; then export cplchm=".true." fi -if [ $DO_ICE = "YES" ]; then +if [[ "$DO_ICE" = "YES" ]]; then export cplice=".true." export cplflx=".true." fi -if [ $DO_OCN = "YES" ]; then +if [[ "$DO_OCN" = "YES" ]]; then export cplflx=".true." fi -if [ $DO_WAVE = "YES" ]; then +if [[ "$DO_WAVE" = "YES" ]]; then export cplwav=".true." fi -####################################################################### -# COUPLING COMPONENTS -export use_coldstart=".false." - - ####################################################################### export FORECASTSH="$HOMEgfs/scripts/exglobal_forecast.sh" +#export FORECASTSH="$HOMEgfs/scripts/exglobal_forecast.py" # Temp. while this is worked on export FCSTEXECDIR="$HOMEgfs/exec" export FCSTEXEC="ufs_model.x" @@ -93,12 +86,12 @@ export h2o_phys=".true." # Options of stratosphere O3 physics reaction coefficients export new_o3forc="YES" -export gwd_opt=2 +export gwd_opt=2 # --GFS.v16 uGWD.v0, used for suite FV3_GFS_v16 and UFS p6 etc # do_ugwp=T: use unified CGWD and OGWD, and turbulent orographic form drag (TOFD) # do_ugwp=F: use unified CGWD but old OGWD, TOFD is not uded. -if [ $gwd_opt -eq 1 ]; then +if [[ "$gwd_opt" -eq 1 ]]; then export knob_ugwp_version=0 export do_ugwp=".false." export do_tofd=".false." @@ -107,7 +100,7 @@ fi # -- uGWD.v1, for suite FV3_GFS_v17 and FV3_GFS_v17p8b etc -if [ $gwd_opt -eq 2 ]; then +if [[ "$gwd_opt" -eq 2 ]]; then #--used for UFS p7 and p8a #export knob_ugwp_version=1 @@ -121,8 +114,8 @@ if [ $gwd_opt -eq 2 ]; then #export do_gsl_drag_ss=".true." #export do_gsl_drag_tofd=".true." #export do_ugwp_v1_orog_only=".false." - - #--used for UFS p8 + + #--used for UFS p8 export knob_ugwp_version=0 export do_ugwp=".false." 
export do_tofd=".false." @@ -144,63 +137,33 @@ fi export tau=10.0 export rf_cutoff=7.5e2 export d2_bg_k1=0.20 -### JKH -if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" ]] ; then - export d2_bg_k2=0.15 ### JKH - 10dec - export dz_min=2 - export dt_inner=40. ### JKH - 10dec -else - export d2_bg_k2=0.04 - export dz_min=6 -fi -if [ $LEVS = "128" ]; then export n_sponge=42; fi #127 layer -if [ $LEVS = "65" ]; then - if [ "CCPP_SUITE" = "FV3_RAP_cires_ugwp" -o "CCPP_SUITE" = "FV3_RAP_noah_sfcdiff_unified_ugwp" ]; then - export n_sponge=23 - else - export n_sponge=42 - fi -fi -if [ $LEVS = "128" -a "$CDUMP" = "gdas" ]; then +export d2_bg_k2=0.04 +export dz_min=6 +export n_sponge=42 +if [[ "${LEVS}" = "128" && "${CDUMP}" =~ "gdas" ]]; then export tau=5.0 export rf_cutoff=1.0e3 export d2_bg_k1=0.20 export d2_bg_k2=0.0 fi -# PBL/turbulence schemes +# PBL/turbulance schemes export hybedmf=".false." -if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_GFS_v16_mynn" || "$CCPP_SUITE" == "FV3_GFS_v17_p8_mynn" || "$CCPP_SUITE" == "FV3_GFS_v17_p8_gf_mynn" ]] ; then - export satmedmf=".false." - export isatmedmf=0 - if [[ "$CCPP_SUITE" == "FV3_GFS_v17_p8_mynn" || "$CCPP_SUITE" == "FV3_GFS_v17_p8_gf_mynn" ]] ; then - export shal_cnv=".false." - else - export shal_cnv=".true." - fi - export do_mynnedmf=".true." - if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" ]] ; then - export do_mynnsfclay=".true." - else - export do_mynnsfclay=".false." - fi - export icloud_bl=1 - export bl_mynn_tkeadvect=.true. - export bl_mynn_edmf=1 - export bl_mynn_edmf_mom=1 -else - export satmedmf=".true." - export isatmedmf=1 -fi - +export satmedmf=".true." +export isatmedmf=1 tbf="" -if [ $satmedmf = ".true." ]; then tbf="_satmedmf" ; fi +if [[ "$satmedmf" = ".true." ]]; then tbf="_satmedmf" ; fi -# Radiation options +#Convection schemes +export progsigma=".true." 
+tbp="" +if [[ "$progsigma" = ".true." ]]; then tbp="_progsigma" ; fi + +# Radiation options export IAER=1011 ; #spectral band mapping method for aerosol optical properties -export iovr_lw=3 ; #de-correlation length cloud overlap method (Barker, 2008) -export iovr_sw=3 ; #de-correlation length cloud overlap method (Barker, 2008) -export iovr=3 ; #de-correlation length cloud overlap method (Barker, 2008) +export iovr_lw=3 ; #de-correlation length cloud overlap method (Barker, 2008) +export iovr_sw=3 ; #de-correlation length cloud overlap method (Barker, 2008) +export iovr=3 ; #de-correlation length cloud overlap method (Barker, 2008) export icliq_sw=2 ; #cloud optical coeffs from AER's newer version v3.9-v4.0 for hu and stamnes export isubc_sw=2 export isubc_lw=2 @@ -215,99 +178,47 @@ export doGP_lwscat=.false. export iopt_sfc="3" export iopt_trs="2" -# Convection Options: 2-SASAS, 3-GF -if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_GFS_v17_p8_gf_mynn" ]] ; then - export imfdeepcnv=3 - export imfshalcnv=-1 ## JKH - no shallow GF -elif [[ "$CCPP_SUITE" == "FV3_GFS_v16_gf" || "$CCPP_SUITE" == "FV3_GFS_v17_p8_gf" ]] ; then - export imfdeepcnv=3 - export imfshalcnv=3 -else - export imfdeepcnv=2 - if [[ "$CCPP_SUITE" == "FV3_GFS_v17_p8_mynn" ]] ; then - export imfshalcnv=-1 - else - export imfshalcnv=2 - fi -fi - # Microphysics configuration export dnats=0 export cal_pre=".true." export do_sat_adj=".false." export random_clds=".true."
-if [ $imp_physics -eq 99 ]; then # ZhaoCarr +if [[ "$imp_physics" -eq 99 ]]; then # ZhaoCarr export ncld=1 - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_zhaocarr${tbf}" + export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_zhaocarr${tbf}${tbp}" export nwat=2 -elif [ $imp_physics -eq 6 ]; then # WSM6 +elif [[ "$imp_physics" -eq 6 ]]; then # WSM6 export ncld=2 - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_wsm6${tbf}" + export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_wsm6${tbf}${tbp}" export nwat=6 -elif [ $imp_physics -eq 8 ]; then # Thompson +elif [[ "$imp_physics" -eq 8 ]]; then # Thompson export ncld=2 - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_thompson_noaero_tke" + export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_thompson_noaero_tke${tbp}" export nwat=6 + export cal_pre=".false." export random_clds=".false." export effr_in=".true." + export ltaerosol=".false." + export lradar=".false." export ttendlim="-999" + export dt_inner=$((DELTIM/2)) + export sedi_semi=.true. + if [[ "$sedi_semi" = .true. ]]; then export dt_inner=$DELTIM ; fi + export decfl=10 + + export hord_mt_nh_nonmono=5 + export hord_xx_nh_nonmono=5 + export vtdm4_nh_nonmono=0.02 + export nord=2 export dddmp=0.1 export d4_bg=0.12 - if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_GFS_v16_thompson" ]] ; then - export ncld=5 - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_thompson_aero_tke" - export ltaerosol=.true. - export lradar=.true. - - ## GSL namelist changes - export vtdm4_nh_nonmono=0.03 ### JKH - 10dec - export nord=3 ### JKH - 10dec - export dt_inner=40. 
### JKH - 10dec - if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]] ; then - export k_split=6 - export n_split=2 - fi - - export kord_tm=-11 ### JKH - 10dec - export kord_mt=11 ### JKH - 10dec - export kord_wz=11 ### JKH - 10dec - export kord_tr=11 ### JKH - 10dec - export d_con_nonmono=0.5 ### JKH - 10dec - export hord_mt_nh_nonmono=6 ### JKH - 10dec - export hord_xx_nh_nonmono=6 ### JKH - 10dec - else - export ncld=2 - if [[ "$CCPP_SUITE" == "FV3_GFS_v17_p8_mynn" || "$CCPP_SUITE" == "FV3_GFS_v17_p8_gf_mynn" ]] ; then - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_thompson_aero_tke" - export ltaerosol=".true." - export sedi_semi=.false. ## JKH - 14sep - export decfl=8 ## JKH - 14sep - else - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_thompson_noaero_tke" - export ltaerosol=".false." - export sedi_semi=.true. ## JKH - 14sep - export decfl=10 ## JKH - 14sep - fi - export lradar=".false." - export dt_inner=$((DELTIM/2)) - if [ $sedi_semi = .true. ]; then export dt_inner=$DELTIM ; fi - export hord_mt_nh_nonmono=5 - export hord_xx_nh_nonmono=5 - export vtdm4_nh_nonmono=0.02 - export nord=2 - fi - -elif [ $imp_physics -eq 11 ]; then # GFDL +elif [[ "$imp_physics" -eq 11 ]]; then # GFDL export ncld=5 - if [[ "$CCPP_SUITE" == "FV3_GFS_v16_mynn" ]] ; then - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_gfdl_satmedmf" - else - export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_gfdl${tbf}" - fi + export FIELD_TABLE="$HOMEgfs/parm/parm_fv3diag/field_table_gfdl${tbf}${tbp}" export nwat=6 export dnats=1 export cal_pre=".false." @@ -335,13 +246,10 @@ export DO_SKEB=${DO_SKEB:-"NO"} export DO_SHUM=${DO_SHUM:-"NO"} export DO_LAND_PERT=${DO_LAND_PERT:-"NO"} export DO_CA=${DO_CA:-"YES"} -export DO_OCN_SPPT=${DO_OCN_SPPT:-"NO"} -export DO_OCN_PERT_EPBL=${DO_OCN_PERT_EPBL:-"NO"} #coupling settings -export FRAC_GRID=".true." 
export cplmode="nems_frac" -if [ $FRAC_GRID = ".false." ]; then +if [[ "${FRAC_GRID:-".true."}" = ".false." ]]; then export cplmode="nems_orig" fi export psm_bc="1" @@ -360,7 +268,7 @@ export FSICS="0" export ideflate=1 export nbits=14 export ishuffle=0 -# compression for RESTART files written by FMS +# compression for RESTART files written by FMS export shuffle=1 export deflate_level=1 @@ -368,58 +276,40 @@ export deflate_level=1 # Disable the use of coupler.res; get model start time from model_configure export USE_COUPLER_RES="NO" -if [[ "$CDUMP" == "gdas" ]] ; then # GDAS cycle specific parameters +if [[ "$CDUMP" =~ "gdas" ]] ; then # GDAS cycle specific parameters # Variables used in DA cycling - if [ $QUILTING = ".true." -a $OUTPUT_GRID = "gaussian_grid" ]; then - if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]] ; then - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da_gsl" - else - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da" - fi - else - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da_orig" - fi + export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_da" - # Write restart files, where $number is current model start time. + # Write restart files, where $number is current model start time. # restart_interval: $number - # number=0, writes out restart files at the end of forecast. + # number=0, writes out restart files at the end of forecast. # number>0, writes out restart files at the frequency of $number and at the end of forecast. # restart_interval: "$number -1" # writes out restart files only once at $number forecast hour. # restart_interval: "$number1 $number2 $number3 ..." 
- # writes out restart file at the specified forecast hours + # writes out restart file at the specified forecast hours export restart_interval=${restart_interval:-6} # For IAU, write restarts at beginning of window also - if [ $DOIAU = "YES" ]; then + if [[ "$DOIAU" = "YES" ]]; then export restart_interval="3 6" fi # Choose coupling with wave - if [ $DO_WAVE = "YES" ]; then export cplwav=".true." ; fi + if [[ "$DO_WAVE" = "YES" ]]; then export cplwav=".true." ; fi # Turn on dry mass adjustment in GDAS export adjust_dry_mass=".true." -elif [[ "$CDUMP" == "gfs" ]] ; then # GFS cycle specific parameters +elif [[ "$CDUMP" =~ "gfs" ]] ; then # GFS cycle specific parameters # Write more variables to output - if [ $QUILTING = ".true." -a $OUTPUT_GRID = "gaussian_grid" ]; then - if [ $CCPP_SUITE = "FV3_RAP_cires_ugwp" ]; then - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_gsl_ruc" - elif [ $CCPP_SUITE = "FV3_RAP_noah_sfcdiff_unified_ugwp" ]; then - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_gsl" - else - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table" - fi - else - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_orig" - fi + export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table" - # Write gfs restart files to rerun fcst from any break point + # Write gfs restart files to rerun fcst from any break point export restart_interval_gfs=${restart_interval_gfs:-0} - if [ $restart_interval_gfs -le 0 ]; then + if [[ "$restart_interval_gfs" -le 0 ]]; then export restart_interval="$FHMAX_GFS" else rst_list="" @@ -433,15 +323,15 @@ elif [[ "$CDUMP" == "gfs" ]] ; then # GFS cycle specific parameters export restart_interval="$rst_list" fi - if [ $DO_AERO = "YES" ]; then + if [[ "$DO_AERO" = "YES" ]]; then # Make sure a restart file is written at the cadence time if [[ ! 
"${restart_interval[*]}" =~ "$STEP_GFS" ]]; then export restart_interval="$STEP_GFS $restart_interval" fi fi - + # Choose coupling with wave - if [ $DO_WAVE = "YES" -a "$WAVE_CDUMP" != "gdas" ]; then + if [[ "$DO_WAVE" = "YES" && "$WAVE_CDUMP" != "gdas" ]]; then export cplwav=".true." fi @@ -449,7 +339,7 @@ elif [[ "$CDUMP" == "gfs" ]] ; then # GFS cycle specific parameters export adjust_dry_mass=".false." # Write each restart file in 16 small files to save time - if [ $CASE = C768 ]; then + if [[ "$CASE" = C768 ]]; then export io_layout="4,4" else export io_layout="1,1" @@ -457,11 +347,7 @@ elif [[ "$CDUMP" == "gfs" ]] ; then # GFS cycle specific parameters fi -if [[ $DO_COUPLED = "YES" ]] ; then # coupled model - export DIAG_TABLE="$HOMEgfs/parm/parm_fv3diag/diag_table_cpl" -fi - -if [ $DO_AERO = "YES" ]; then # temporary settings for aerosol coupling +if [[ "$DO_AERO" = "YES" ]]; then # temporary settings for aerosol coupling export AERO_DIAG_TABLE="${AERO_DIAG_TABLE:-$HOMEgfs/parm/parm_fv3diag/diag_table.aero}" export AERO_FIELD_TABLE="${AERO_FIELD_TABLE:-$HOMEgfs/parm/parm_fv3diag/field_table.aero}" export AERO_EMIS_FIRE=$( echo "${AERO_EMIS_FIRE:-none}" | awk '{ print tolower($1) }' ) diff --git a/parm/config/config.fit2obs b/parm/config/config.fit2obs new file mode 100644 index 00000000000..46baaa9e459 --- /dev/null +++ b/parm/config/config.fit2obs @@ -0,0 +1,23 @@ +#! /usr/bin/env bash + +########## config.fit2obs ########## +# Fit to Observations + +echo "BEGIN: config.fit2obs" + +# Get task specific resources +. "${EXPDIR}/config.resources" fit2obs + +export PRVT=${HOMEgfs}/fix/gsi/prepobs_errtable.global +export HYBLEVS=${HOMEgfs}/fix/am/global_hyblev.l${LEVS}.txt + +export VBACKUP_FITS=24 +export OUTPUT_FILETYPE="netcdf" +export CONVNETC="YES" +export ACPROFit="YES" + +if [[ ${netcdf_diag:-".false."} = ".true." 
]]; then + export CONVNETC="YES" +fi + +echo "END: config.fit2obs" diff --git a/parm/config/config.fv3 b/parm/config/config.fv3 deleted file mode 100755 index cae220b0c13..00000000000 --- a/parm/config/config.fv3 +++ /dev/null @@ -1,193 +0,0 @@ -#! /usr/bin/env bash - -########## config.fv3 ########## -# FV3 model resolution specific parameters -# e.g. time-step, processor layout, physics and dynamics parameters -# This config sets default variables for FV3 for a given resolution -# User can over-ride after sourcing this config file - -if [ $# -ne 1 ]; then - - echo "Must specify an input resolution argument to set variables!" - echo "argument can be any one of the following:" - echo "C48 C96 C192 C384 C768 C1152 C3072" - exit 1 - -fi - -case_in=$1 - -echo "BEGIN: config.fv3" - - -if [[ "$machine" = "JET" ]]; then - if [[ "$PARTITION_BATCH" = "xjet" ]]; then - export npe_node_max=24 - elif [[ "$PARTITION_BATCH" = "vjet" || "$PARTITION_BATCH" = "sjet" ]]; then - export npe_node_max=16 - elif [[ "$PARTITION_BATCH" = "kjet" ]]; then - export npe_node_max=40 - fi -elif [[ "$machine" = "HERA" ]]; then - export npe_node_max=40 -elif [[ "$machine" = "ORION" ]]; then - export npe_node_max=40 -fi - -# (Standard) Model resolution dependent variables -case $case_in in - "C48") - export DELTIM=450 - export layout_x=3 - export layout_y=2 - export layout_x_gfs=3 - export layout_y_gfs=2 - export nth_fv3=1 - export nth_fv3_gfs=1 - export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - if [[ "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]]; then export cdmbgwd="1.0,1.0,1.0,1.0"; fi - export WRITE_GROUP=1 - export WRTTASK_PER_GROUP=$npe_node_max - export WRITE_GROUP_GFS=1 - export WRTTASK_PER_GROUP_GFS=$npe_node_max - export WRTIOBUF="1M" - ;; - "C96") - export DELTIM=450 - export layout_x=6 - export layout_y=4 - export layout_x_gfs=6 - export layout_y_gfs=4 - export nth_fv3=1 - export nth_fv3_gfs=1 - export cdmbgwd="0.14,1.8,1.0,1.0" # mountain 
blocking, ogwd, cgwd, cgwd src scaling - if [[ "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]]; then export cdmbgwd="1.0,1.0,1.0,1.0"; fi - export WRITE_GROUP=1 - export WRTTASK_PER_GROUP=$npe_node_max - export WRITE_GROUP_GFS=1 - export WRTTASK_PER_GROUP_GFS=$npe_node_max - export WRTIOBUF="4M" - export n_split=6 - ;; - "C192") - export DELTIM=450 - export layout_x=4 - export layout_y=6 - export layout_x_gfs=4 - export layout_y_gfs=6 - export nth_fv3=2 - export nth_fv3_gfs=2 - export cdmbgwd="0.23,1.5,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - export WRITE_GROUP=1 - export WRTTASK_PER_GROUP=$npe_node_max - export WRITE_GROUP_GFS=2 - export WRTTASK_PER_GROUP_GFS=$npe_node_max - export WRTIOBUF="8M" - ;; - "C384") - export DELTIM=${DELTIM:-300} - export layout_x=6 - export layout_y=8 - export layout_x_gfs=${layout_x_gfs:-8} - export layout_y_gfs=${layout_y_gfs:-12} - export nth_fv3=2 - export nth_fv3_gfs=${nth_fv3_gfs:-2} - export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - export WRITE_GROUP=1 - export WRTTASK_PER_GROUP=$npe_node_max - export WRITE_GROUP_GFS=${WRITE_GROUP_GFS:-2} - export WRTTASK_PER_GROUP_GFS=${WRTTASK_PER_GROUP_GFS:-$npe_node_max} - export WRTIOBUF=${WRTIOBUF:-"16M"} - ;; - "C768") - if [[ "$CCPP_SUITE" == "FV3_RAP_cires_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" || "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]] ; then - if [ $LEVS = "128" ]; then - export DELTIM=120 - else - #JKHexport DELTIM=225 - export DELTIM=180 - fi - else - if [[ "$CCPP_SUITE" == "FV3_GFS_v16_mynn" ]] ; then - export DELTIM=100 - else - export DELTIM=150 - fi - fi - export layout_x=8 - export layout_y=12 - #JKHexport layout_x_gfs=16 - export layout_x_gfs=12 ## JKH - export layout_y_gfs=12 - export nth_fv3=4 - export nth_fv3_gfs=4 - export cdmbgwd="4.0,0.15,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - if [[ "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_unified_ugwp" ]]; then export 
cdmbgwd="4.0,0.15,1.0,1.0"; fi - if [[ "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]]; then export cdmbgwd="1.0,1.0,1.0,1.0"; fi - export WRITE_GROUP=2 - export WRTTASK_PER_GROUP=$(echo "2*$npe_node_max" |bc) - export WRITE_GROUP_GFS=2 ## JKH - export WRTTASK_PER_GROUP_GFS=$(echo "2*$npe_node_max" |bc) - export WRTIOBUF="32M" - ;; - "C1152") - export DELTIM=120 - export layout_x=8 - export layout_y=16 - export layout_x_gfs=8 - export layout_y_gfs=16 - export nth_fv3=4 - export nth_fv3_gfs=4 - export cdmbgwd="4.0,0.10,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - if [[ "$CCPP_SUITE" == "FV3_RAP_noah_sfcdiff_ugwpv1" ]]; then export cdmbgwd="1.0,1.0,1.0,1.0"; fi - export WRITE_GROUP=4 - export WRTTASK_PER_GROUP=$(echo "2*$npe_node_max" |bc) - export WRITE_GROUP_GFS=4 - export WRTTASK_PER_GROUP_GFS=$(echo "2*$npe_node_max" |bc) - export WRTIOBUF="48M" - ;; - "C3072") - export DELTIM=90 - export layout_x=16 - export layout_y=32 - export layout_x_gfs=16 - export layout_y_gfs=32 - export nth_fv3=4 - export nth_fv3_gfs=4 - export cdmbgwd="4.0,0.05,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - export WRITE_GROUP=4 - export WRTTASK_PER_GROUP=$(echo "3*$npe_node_max" |bc) - export WRITE_GROUP_GFS=4 - export WRTTASK_PER_GROUP_GFS=$(echo "3*$npe_node_max" |bc) - export WRTIOBUF="64M" - ;; - *) - echo "grid $case_in not supported, ABORT!" 
- exit 1 - ;; -esac - -# Calculate chunksize based on resolution -export RESTILE=$(echo $case_in |cut -c2-) -export ichunk2d=$((4*RESTILE)) -export jchunk2d=$((2*RESTILE)) -export ichunk3d=$((4*RESTILE)) -export jchunk3d=$((2*RESTILE)) -export kchunk3d=1 - -# Determine whether to use parallel NetCDF based on resolution -case $case_in in - "C48" | "C96" | "C192") - export OUTPUT_FILETYPE_ATM="netcdf" - export OUTPUT_FILETYPE_SFC="netcdf" - ;; - "C384" | "C768" | "C1152" | "C3072") - if [[ "$machine" = "JET" ]]; then - export OUTPUT_FILETYPE_ATM="netcdf" ## JKH - else - export OUTPUT_FILETYPE_ATM="netcdf_parallel" - fi - export OUTPUT_FILETYPE_SFC="netcdf_parallel" - ;; -esac -echo "END: config.fv3" diff --git a/parm/config/config.fv3.nco.static b/parm/config/config.fv3.nco.static old mode 100755 new mode 100644 index 9181ca88e97..dc60b2ef032 --- a/parm/config/config.fv3.nco.static +++ b/parm/config/config.fv3.nco.static @@ -19,15 +19,7 @@ case_in=$1 echo "BEGIN: config.fv3" - -if [[ "$machine" = "JET" ]]; then - export npe_node_max=24 -elif [[ "$machine" = "HERA" ]]; then - export npe_node_max=40 -elif [[ "$machine" = "ORION" ]]; then - export npe_node_max=40 -fi - +export npe_node_max=128 # (Standard) Model resolution dependent variables case $case_in in @@ -80,7 +72,7 @@ case $case_in in export WRTIOBUF="8M" ;; "C384") - export DELTIM=240 + export DELTIM=200 export layout_x=8 export layout_y=8 export layout_x_gfs=6 @@ -89,26 +81,27 @@ case $case_in in export npe_wav_gfs=35 export nth_fv3=1 export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling - export WRITE_GROUP=1 - export WRTTASK_PER_GROUP=$npe_node_max - export WRITE_GROUP_GFS=2 - export WRTTASK_PER_GROUP_GFS=$npe_node_max + export WRITE_GROUP=2 + export WRTTASK_PER_GROUP=64 + export WRITE_GROUP_GFS=1 + export WRTTASK_PER_GROUP_GFS=64 export WRTIOBUF="16M" ;; "C768") export DELTIM=150 export layout_x=8 export layout_y=12 - export layout_x_gfs=16 + export layout_x_gfs=12 export 
layout_y_gfs=24 export npe_wav=140 - export npe_wav_gfs=630 - export nth_fv3=4 + export npe_wav_gfs=448 + export nth_fv3=3 + export nth_fv3_gfs=5 export cdmbgwd="4.0,0.15,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=2 - export WRTTASK_PER_GROUP=$(echo "2*$npe_node_max" |bc) + export WRTTASK_PER_GROUP=64 export WRITE_GROUP_GFS=8 - export WRTTASK_PER_GROUP_GFS=$(echo "2*$npe_node_max" |bc) + export WRTTASK_PER_GROUP_GFS=64 export WRTIOBUF="32M" ;; "C1152") diff --git a/parm/config/config.gempak b/parm/config/config.gempak old mode 100755 new mode 100644 index a2b5ecbaf58..791770ba4af --- a/parm/config/config.gempak +++ b/parm/config/config.gempak @@ -8,6 +8,4 @@ echo "BEGIN: config.gempak" # Get task specific resources . $EXPDIR/config.resources gempak -export GEMPAKSH=$HOMEgfs/jobs/JGFS_ATMOS_GEMPAK - echo "END: config.gempak" diff --git a/parm/config/config.getic b/parm/config/config.getic old mode 100755 new mode 100644 diff --git a/parm/config/config.gldas b/parm/config/config.gldas old mode 100755 new mode 100644 index 8d503d03683..c51829d9fc1 --- a/parm/config/config.gldas +++ b/parm/config/config.gldas @@ -11,6 +11,6 @@ echo "BEGIN: config.gldas" export GLDASSH=$HOMEgfs/scripts/exgdas_atmos_gldas.sh export gldas_spinup_hours=72 export CPCGAUGE=$DMPDIR -export FINDDATE=$HOMEgfs/util/ush/finddate.sh +export FINDDATE=$USHgfs/finddate.sh echo "END: config.gldas" diff --git a/parm/config/config.ice b/parm/config/config.ice index 3a6916600f1..7bc1f809663 100644 --- a/parm/config/config.ice +++ b/parm/config/config.ice @@ -1,4 +1,5 @@ #! 
/usr/bin/env bash -export NX_GLB="1440" -export NY_GLB="1080" +echo "BEGIN: config.ice" + +echo "END: config.ice" diff --git a/parm/config/config.init b/parm/config/config.init old mode 100755 new mode 100644 index 2301b1cdc1f..3e016fb2483 --- a/parm/config/config.init +++ b/parm/config/config.init @@ -16,6 +16,7 @@ export GDASINIT_DIR=${UFS_DIR}/util/gdas_init export CRES_HIRES=$CASE export CRES_ENKF=$CASE_ENKF +export FRAC_ORO="yes" export RUNICSH=${GDASINIT_DIR}/run_v16.chgres.sh if [ "${RETRO:-"NO"}" = "YES" ] || [ "$CDUMP" = "gdas" ]; then diff --git a/parm/config/config.landanl b/parm/config/config.landanl new file mode 100755 index 00000000000..89bb8a4b7bf --- /dev/null +++ b/parm/config/config.landanl @@ -0,0 +1,23 @@ +#! /usr/bin/env bash + +########## config.landanl ########## +# configuration common to all land analysis tasks + +echo "BEGIN: config.landanl" + +obs_list_name=gdas_land_adpsfc_only.yaml +if [[ "${cyc}" == "18" ]]; then + obs_list_name=gdas_land_prototype.yaml +fi + +export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/land/obs/config/ +export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/land/obs/lists/${obs_list_name} +export LANDVARYAML=${HOMEgfs}/sorc/gdas.cd/parm/land/letkfoi/letkfoi.yaml +export FV3JEDI_FIX=${HOMEgfs}/fix/gdas + +export io_layout_x=@IO_LAYOUT_X@ +export io_layout_y=@IO_LAYOUT_Y@ + +export JEDIEXE=${HOMEgfs}/exec/fv3jedi_letkf.x + +echo "END: config.landanl" diff --git a/parm/config/config.landanlfinal b/parm/config/config.landanlfinal new file mode 100755 index 00000000000..242089325a4 --- /dev/null +++ b/parm/config/config.landanlfinal @@ -0,0 +1,10 @@ +#! /usr/bin/env bash + +########## config.landanlfinal ########## +# Post Land Analysis specific + +echo "BEGIN: config.landanlfinal" + +# Get task specific resources +. 
"${EXPDIR}/config.resources" landanlfinal +echo "END: config.landanlfinal" diff --git a/parm/config/config.landanlinit b/parm/config/config.landanlinit new file mode 100755 index 00000000000..62054525c88 --- /dev/null +++ b/parm/config/config.landanlinit @@ -0,0 +1,10 @@ +#! /usr/bin/env bash + +########## config.landanlinit ########## +# Pre Land Analysis specific + +echo "BEGIN: config.landanlinit" + +# Get task specific resources +. "${EXPDIR}/config.resources" landanlinit +echo "END: config.landanlinit" diff --git a/parm/config/config.landanlrun b/parm/config/config.landanlrun new file mode 100755 index 00000000000..0f44011c1d7 --- /dev/null +++ b/parm/config/config.landanlrun @@ -0,0 +1,11 @@ +#! /usr/bin/env bash + +########## config.landanlrun ########## +# Land Analysis specific + +echo "BEGIN: config.landanlrun" + +# Get task specific resources +. "${EXPDIR}/config.resources" landanlrun + +echo "END: config.landanlrun" diff --git a/parm/config/config.metp b/parm/config/config.metp old mode 100755 new mode 100644 index 4be7151ffa6..c90903f6a55 --- a/parm/config/config.metp +++ b/parm/config/config.metp @@ -6,7 +6,7 @@ echo "BEGIN: config.metp" # Get task specific resources -. $EXPDIR/config.resources metp +. 
"${EXPDIR}/config.resources" metp export RUN_GRID2GRID_STEP1="YES" # Run grid-to-grid verification using METplus export RUN_GRID2OBS_STEP1="YES" # Run grid-to-obs verification using METplus @@ -18,15 +18,15 @@ export RUN_PRECIP_STEP1="YES" # Run precip verification using METplus #---------------------------------------------------------- ## EMC_VERIF_GLOBAL SETTINGS export HOMEverif_global=${HOMEgfs}/sorc/verif-global.fd -export VERIF_GLOBALSH=$HOMEverif_global/ush/run_verif_global_in_global_workflow.sh +export VERIF_GLOBALSH=${HOMEverif_global}/ush/run_verif_global_in_global_workflow.sh ## INPUT DATA SETTINGS -export model=$PSLOT +export model=${PSLOT} export model_file_format="pgbf{lead?fmt=%2H}.${CDUMP}.{init?fmt=%Y%m%d%H}.grib2" -export model_hpss_dir=$ATARDIR/.. +export model_hpss_dir=${ATARDIR}/.. export get_data_from_hpss="NO" export hpss_walltime="10" ## OUTPUT SETTINGS -export model_stat_dir=$ARCDIR/.. +export model_stat_dir=${ARCDIR}/.. export make_met_data_by="VALID" export SENDMETVIEWER="NO" ## DATE SETTINGS @@ -39,20 +39,20 @@ export log_MET_output_to_METplus="yes" export g2g1_type_list="anom pres sfc" export g2g1_anom_truth_name="self_anl" export g2g1_anom_truth_file_format="pgbanl.${CDUMP}.{valid?fmt=%Y%m%d%H}.grib2" -export g2g1_anom_fhr_min=$FHMIN_GFS -export g2g1_anom_fhr_max=$FHMAX_GFS +export g2g1_anom_fhr_min=${FHMIN_GFS} +export g2g1_anom_fhr_max=${FHMAX_GFS} export g2g1_anom_grid="G002" export g2g1_anom_gather_by="VSDB" export g2g1_pres_truth_name="self_anl" export g2g1_pres_truth_file_format="pgbanl.${CDUMP}.{valid?fmt=%Y%m%d%H}.grib2" -export g2g1_pres_fhr_min=$FHMIN_GFS -export g2g1_pres_fhr_max=$FHMAX_GFS +export g2g1_pres_fhr_min=${FHMIN_GFS} +export g2g1_pres_fhr_max=${FHMAX_GFS} export g2g1_pres_grid="G002" export g2g1_pres_gather_by="VSDB" export g2g1_sfc_truth_name="self_f00" export g2g1_sfc_truth_file_format="pgbf00.${CDUMP}.{valid?fmt=%Y%m%d%H}.grib2" -export g2g1_sfc_fhr_min=$FHMIN_GFS -export g2g1_sfc_fhr_max=$FHMAX_GFS +export 
g2g1_sfc_fhr_min=${FHMIN_GFS} +export g2g1_sfc_fhr_max=${FHMAX_GFS} export g2g1_sfc_grid="G002" export g2g1_sfc_gather_by="VSDB" export g2g1_mv_database_name="mv_${PSLOT}_grid2grid_metplus" @@ -62,19 +62,19 @@ export g2g1_mv_database_desc="Grid-to-grid METplus data for global workflow expe export g2o1_type_list="upper_air conus_sfc" export g2o1_upper_air_msg_type_list="ADPUPA" export g2o1_upper_air_vhr_list="00 06 12 18" -export g2o1_upper_air_fhr_min=$FHMIN_GFS +export g2o1_upper_air_fhr_min=${FHMIN_GFS} export g2o1_upper_air_fhr_max="240" export g2o1_upper_air_grid="G003" export g2o1_upper_air_gather_by="VSDB" export g2o1_conus_sfc_msg_type_list="ONLYSF ADPUPA" export g2o1_conus_sfc_vhr_list="00 03 06 09 12 15 18 21" -export g2o1_conus_sfc_fhr_min=$FHMIN_GFS +export g2o1_conus_sfc_fhr_min=${FHMIN_GFS} export g2o1_conus_sfc_fhr_max="240" export g2o1_conus_sfc_grid="G104" export g2o1_conus_sfc_gather_by="VSDB" export g2o1_polar_sfc_msg_type_list="IABP" export g2o1_polar_sfc_vhr_list="00 03 06 09 12 15 18 21" -export g2o1_polar_sfc_fhr_min=$FHMIN_GFS +export g2o1_polar_sfc_fhr_min=${FHMIN_GFS} export g2o1_polar_sfc_fhr_max="240" export g2o1_polar_sfc_grid="G219" export g2o1_polar_sfc_gather_by="VSDB" @@ -87,7 +87,7 @@ export precip1_type_list="ccpa_accum24hr" export precip1_ccpa_accum24hr_model_bucket="06" export precip1_ccpa_accum24hr_model_var="APCP" export precip1_ccpa_accum24hr_model_file_format="pgbf{lead?fmt=%2H}.${CDUMP}.{init?fmt=%Y%m%d%H}.grib2" -export precip1_ccpa_accum24hr_fhr_min=$FHMIN_GFS +export precip1_ccpa_accum24hr_fhr_min=${FHMIN_GFS} export precip1_ccpa_accum24hr_fhr_max="180" export precip1_ccpa_accum24hr_grid="G211" export precip1_ccpa_accum24hr_gather_by="VSDB" diff --git a/parm/config/config.nsst b/parm/config/config.nsst old mode 100755 new mode 100644 diff --git a/parm/config/config.ocn b/parm/config/config.ocn index 1675713e7c3..7d14e3dd52f 100644 --- a/parm/config/config.ocn +++ b/parm/config/config.ocn @@ -1,11 +1,23 @@ #! 
/usr/bin/env bash -# OCNRES is currently being set in config.base -# case "$CASE" in -# "C48") export OCNRES=400;; -# "C96") export OCNRES=100;; -# "C192") export OCNRES=050;; -# "C384") export OCNRES=025;; -# "C768") export OCNRES=025;; -# *) export OCNRES=025;; -# esac +echo "BEGIN: config.ocn" + +# MOM_input template to use +export MOM_INPUT="MOM_input_template_${OCNRES}" + +export DO_OCN_SPPT="NO" # In MOM_input, this variable determines OCN_SPPT (OCN_SPPT = True|False) +export DO_OCN_PERT_EPBL="NO" # In MOM_input, this variable determines PERT_EPBL (PERT_EPBL = True|False) + +# Templated variables in MOM_input_template +export MOM6_USE_LI2016="True" # set to False for restart reproducibility +export MOM6_THERMO_SPAN="False" +export MOM6_ALLOW_LANDMASK_CHANGES="False" + +if [[ "${DO_JEDIOCNVAR}" == "YES" ]]; then + export ODA_INCUPD="True" +else + export ODA_INCUPD="False" +fi +export ODA_INCUPD_NHOURS="3.0" # In MOM_input, this is the time interval for applying the increment + +echo "END: config.ocn" diff --git a/parm/config/config.ocnanal b/parm/config/config.ocnanal new file mode 100644 index 00000000000..36519c7f354 --- /dev/null +++ b/parm/config/config.ocnanal @@ -0,0 +1,32 @@ +#!/bin/bash + +########## config.ocnanal ########## +# configuration common to all ocean analysis tasks + +echo "BEGIN: config.ocnanal" + +export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/soca/obs/config +export OBS_LIST=@SOCA_OBS_LIST@ +[[ -n "${OBS_LIST}" ]] || export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/soca/obs/obs_list.yaml +export OBS_YAML=${OBS_LIST} +export FV3JEDI_STAGE_YAML=${HOMEgfs}/sorc/gdas.cd/test/soca/testinput/dumy.yaml +export SOCA_INPUT_FIX_DIR=@SOCA_INPUT_FIX_DIR@ +export SOCA_VARS=tocn,socn,ssh +export SABER_BLOCKS_YAML=@SABER_BLOCKS_YAML@ +export SOCA_NINNER=@SOCA_NINNER@ +export CASE_ANL=@CASE_ANL@ +export DOMAIN_STACK_SIZE=116640000 #TODO: Make the stack size resolution dependent +export JEDI_BIN=${HOMEgfs}/sorc/gdas.cd/build/bin + +# R2D2 +export
R2D2_OBS_DB=shared +export R2D2_OBS_DUMP=@R2D2_OBS_DUMP@ +export R2D2_OBS_SRC=@R2D2_OBS_SRC@ +export R2D2_OBS_WINDOW=24 # TODO: Check if the R2D2 sampling DB window is still needed +export COMIN_OBS=@COMIN_OBS@ + +# NICAS +export NICAS_RESOL=@NICAS_RESOL@ +export NICAS_GRID_SIZE=@NICAS_GRID_SIZE@ + +echo "END: config.ocnanal" diff --git a/parm/config/config.ocnanalbmat b/parm/config/config.ocnanalbmat new file mode 100644 index 00000000000..024da5f51bc --- /dev/null +++ b/parm/config/config.ocnanalbmat @@ -0,0 +1,11 @@ +#!/bin/bash + +########## config.ocnanalbmat ########## +# Ocn Analysis specific + +echo "BEGIN: config.ocnanalbmat" + +# Get task specific resources +. "${EXPDIR}/config.resources" ocnanalbmat + +echo "END: config.ocnanalbmat" diff --git a/parm/config/config.ocnanalchkpt b/parm/config/config.ocnanalchkpt new file mode 100644 index 00000000000..c059fdba427 --- /dev/null +++ b/parm/config/config.ocnanalchkpt @@ -0,0 +1,11 @@ +#!/bin/bash + +########## config.ocnanalchkpt ########## +# Ocn Analysis specific + +echo "BEGIN: config.ocnanalchkpt" + +# Get task specific resources +. "${EXPDIR}/config.resources" ocnanalchkpt + +echo "END: config.ocnanalchkpt" diff --git a/parm/config/config.ocnanalpost b/parm/config/config.ocnanalpost new file mode 100644 index 00000000000..bc4d945865c --- /dev/null +++ b/parm/config/config.ocnanalpost @@ -0,0 +1,10 @@ +#!/bin/bash + +########## config.ocnanalpost ########## +# Post Ocn Analysis specific + +echo "BEGIN: config.ocnanalpost" + +# Get task specific resources +. "${EXPDIR}/config.resources" ocnanalpost +echo "END: config.ocnanalpost" diff --git a/parm/config/config.ocnanalprep b/parm/config/config.ocnanalprep new file mode 100644 index 00000000000..225eb089c34 --- /dev/null +++ b/parm/config/config.ocnanalprep @@ -0,0 +1,10 @@ +#!/bin/bash + +########## config.ocnanalprep ########## +# Pre Ocn Analysis specific + +echo "BEGIN: config.ocnanalprep" + +# Get task specific resources +. 
"${EXPDIR}/config.resources" ocnanalprep +echo "END: config.ocnanalprep" diff --git a/parm/config/config.ocnanalrun b/parm/config/config.ocnanalrun new file mode 100644 index 00000000000..5345b6c6843 --- /dev/null +++ b/parm/config/config.ocnanalrun @@ -0,0 +1,11 @@ +#!/bin/bash + +########## config.ocnanalrun ########## +# Ocn Analysis specific + +echo "BEGIN: config.ocnanalrun" + +# Get task specific resources +. "${EXPDIR}/config.resources" ocnanalrun + +echo "END: config.ocnanalrun" diff --git a/parm/config/config.ocnanalvrfy b/parm/config/config.ocnanalvrfy new file mode 100644 index 00000000000..4eda451853c --- /dev/null +++ b/parm/config/config.ocnanalvrfy @@ -0,0 +1,10 @@ +#!/bin/bash + +########## config.ocnanalvrfy ########## +# Pre Ocn Analysis specific + +echo "BEGIN: config.ocnanalvrfy" + +# Get task specific resources +. "${EXPDIR}/config.resources" ocnanalvrfy +echo "END: config.ocnanalvrfy" diff --git a/parm/config/config.ocnpost b/parm/config/config.ocnpost old mode 100755 new mode 100644 diff --git a/parm/config/config.post b/parm/config/config.post old mode 100755 new mode 100644 index 501c527105b..71b0fe79d27 --- a/parm/config/config.post +++ b/parm/config/config.post @@ -8,24 +8,18 @@ echo "BEGIN: config.post" # Get task specific resources . $EXPDIR/config.resources post -# Convert nemsio files to grib files using post job -#------------------------------------------- - # No. 
of concurrent post jobs [0 implies sequential] export NPOSTGRP=42 export OUTTYP=4 -export MODEL_OUT_FORM=binarynemsiompiio -if [ $OUTPUT_FILE = "netcdf" ]; then - export MODEL_OUT_FORM=netcdfpara -fi +export MODEL_OUT_FORM=netcdfpara -# Post driver job that calls gfs_nceppost.sh and downstream jobs -export POSTJJOBSH="$HOMEpost/jobs/JGLOBAL_NCEPPOST" +# Post driver job that calls gfs_post.sh and downstream jobs +export POSTJJOBSH="$HOMEpost/jobs/JGLOBAL_POST" export GFSDOWNSH="$HOMEpost/ush/fv3gfs_downstream_nems.sh" export GFSDWNSH="$HOMEpost/ush/fv3gfs_dwn_nems.sh" -export POSTGPSH="$HOMEpost/ush/gfs_nceppost.sh" -export POSTGPEXEC="$HOMEpost/exec/gfs_ncep_post" +export POSTGPSH="$HOMEpost/ush/gfs_post.sh" +export POSTGPEXEC="$HOMEpost/exec/upp.x" export GOESF=NO # goes image export FLXF=YES # grib2 flux file written by post @@ -33,8 +27,7 @@ export npe_postgp=$npe_post export nth_postgp=1 export GFS_DOWNSTREAM="YES" -#JKHexport downset=2 -export downset=1 ## JKH (removes creation of pgrb2b files) +export downset=2 export npe_dwn=24 export GRIBVERSION='grib2' diff --git a/parm/config/config.postsnd b/parm/config/config.postsnd old mode 100755 new mode 100644 diff --git a/parm/config/config.prep b/parm/config/config.prep old mode 100755 new mode 100644 index ac172bf5b87..3e1cf8c32f2 --- a/parm/config/config.prep +++ b/parm/config/config.prep @@ -8,7 +8,8 @@ echo "BEGIN: config.prep" # Get task specific resources . 
$EXPDIR/config.resources prep -export DO_MAKEPREPBUFR="YES" # if NO, will copy prepbufr from globaldump +export MAKE_PREPBUFR="YES" # if NO, will copy prepbufr from globaldump +export cdate10=${PDY}${cyc} # Relocation and syndata QC export PROCESS_TROPCY=${PROCESS_TROPCY:-NO} @@ -17,11 +18,12 @@ export DO_RELOCATE="NO" export TROPCYQCRELOSH="$HOMEgfs/scripts/exglobal_atmos_tropcy_qc_reloc.sh" export SENDCOM=YES -export COMINsyn=${COMINsyn:-${COMROOT}/gfs/prod/syndat} +export COMINtcvital=${COMINtcvital:-${DMPDIR}/${CDUMP}.${PDY}/${cyc}/atmos} +export COMINsyn=${COMINsyn:-$(compath.py ${envir}/com/gfs/${gfs_ver})/syndat} export HOMERELO=$HOMEgfs export EXECRELO=${HOMERELO}/exec -export FIXRELO=${HOMERELO}/fix/fix_am +export FIXRELO=${HOMERELO}/fix/am export USHRELO=${HOMERELO}/ush # Adjust observation error for GFS v16 parallels @@ -63,6 +65,5 @@ if [[ "$CDATE" -ge "2020102200" ]]; then else export DTYPS_nsst='sfcshp dbuoyb mbuoyb tesac bathy trkob' fi -export DO_MAKE_NSSTBUFR="NO" # if NO, will copy nsstbufr from globaldump echo "END: config.prep" diff --git a/parm/config/config.prepbufr b/parm/config/config.prepbufr deleted file mode 100755 index 2d6ececc5bb..00000000000 --- a/parm/config/config.prepbufr +++ /dev/null @@ -1,19 +0,0 @@ -#! /usr/bin/env bash - -########## config.prepbufr ########## -# PREPBUFR specific configuration - -echo "BEGIN: config.prepbufr" - -# Get task specific resources -. $EXPDIR/config.resources prepbufr - -# Set variables - -if [ $machine = "HERA" ]; then - export GESROOT=/scratch1/NCEPDEV/rstprod -elif [ $machine = "ORION" ]; then - export GESROOT=/dev/null -fi - -echo "END: config.prepbufr" diff --git a/parm/config/config.resources b/parm/config/config.resources old mode 100755 new mode 100644 index 731b57462b7..1e5b747982d --- a/parm/config/config.resources +++ b/parm/config/config.resources @@ -4,20 +4,23 @@ # Set resource information for job tasks # e.g. walltime, node, cores per node, memory etc. 
-if [ $# -ne 1 ]; then +if [[ $# -ne 1 ]]; then echo "Must specify an input task argument to set resource variables!" echo "argument can be any one of the following:" echo "getic init coupled_ic aerosol_init" - echo "atmanalprep atmanalrun atmanalpost" - echo "atmensanalprep atmensanalrun atmensanalpost" - echo "anal sfcanl analcalc analdiag gldas fcst post vrfy metp arch echgres" + echo "atmanlinit atmanlrun atmanlfinal" + echo "atmensanlinit atmensanlrun atmensanlfinal" + echo "landanlinit landanlrun landanlfinal" + echo "aeroanlinit aeroanlrun aeroanlfinal" + echo "anal sfcanl analcalc analdiag gldas fcst post vrfy fit2obs metp arch echgres" echo "eobs ediag eomg eupd ecen esfc efcs epos earc" echo "init_chem mom6ic ocnpost" echo "waveinit waveprep wavepostsbs wavepostbndpnt wavepostbndpntbll wavepostpnt" echo "wavegempak waveawipsbulls waveawipsgridded" echo "postsnd awips gempak" echo "wafs wafsgrib2 wafsblending wafsgrib20p25 wafsblending0p25 wafsgcip" + echo "ocnanalprep ocnanalbmat ocnanalrun ocnanalchkpt ocnanalpost ocnanalvrfy" exit 1 fi @@ -26,340 +29,681 @@ step=$1 echo "BEGIN: config.resources" -if [[ "$machine" = "JET" ]]; then - if [[ "$PARTITION_BATCH" = "xjet" ]]; then +if [[ "${machine}" = "WCOSS2" ]]; then + export npe_node_max=128 +elif [[ "${machine}" = "JET" ]]; then + if [[ ${PARTITION_BATCH} = "xjet" ]]; then export npe_node_max=24 - elif [[ "$PARTITION_BATCH" = "vjet" || "$PARTITION_BATCH" = "sjet" ]]; then + elif [[ ${PARTITION_BATCH} = "vjet" || ${PARTITION_BATCH} = "sjet" ]]; then export npe_node_max=16 - elif [[ "$PARTITION_BATCH" = "kjet" ]]; then + elif [[ ${PARTITION_BATCH} = "kjet" ]]; then export npe_node_max=40 fi -elif [[ "$machine" = "HERA" ]]; then +elif [[ ${machine} = "HERA" ]]; then export npe_node_max=40 -elif [[ "$machine" = "ORION" ]]; then +elif [[ ${machine} = "S4" ]]; then + if [[ ${PARTITION_BATCH} = "s4" ]]; then + export npe_node_max=32 + elif [[ ${PARTITION_BATCH} = "ivy" ]]; then + export npe_node_max=20 + fi 
+elif [[ ${machine} = "ORION" ]]; then export npe_node_max=40 fi -if [ $step = "prep" -o $step = "prepbufr" ]; then - eval "export wtime_$step='00:45:00'" - eval "export npe_$step=4" - eval "export npe_node_$step=2" - eval "export nth_$step=1" - eval "export memory_$step=40G" +if [[ ${step} = "prep" ]]; then + export wtime_prep='00:30:00' + export npe_prep=4 + export npe_node_prep=2 + export nth_prep=1 + if [[ "${machine}" = "WCOSS2" ]]; then + export is_exclusive=True + else + export memory_prep="40G" + fi -elif [ $step = "aerosol_init" ]; then +elif [[ "${step}" = "aerosol_init" ]]; then export wtime_aerosol_init="00:05:00" export npe_aerosol_init=1 export nth_aerosol_init=1 - export npe_node_aerosol_init=$(echo "$npe_node_max / $nth_aerosol_init" | bc) + npe_node_aerosol_init=$(echo "${npe_node_max} / ${nth_aerosol_init}" | bc) + export npe_node_aerosol_init export NTASKS=${npe_aerosol_init} export memory_aerosol_init="6G" -elif [ $step = "waveinit" ]; then +elif [[ ${step} = "waveinit" ]]; then export wtime_waveinit="00:10:00" export npe_waveinit=12 export nth_waveinit=1 - export npe_node_waveinit=$(echo "$npe_node_max / $nth_waveinit" | bc) + npe_node_waveinit=$(echo "${npe_node_max} / ${nth_waveinit}" | bc) + export npe_node_waveinit export NTASKS=${npe_waveinit} + export memory_waveinit="2GB" -elif [ $step = "waveprep" ]; then +elif [[ ${step} = "waveprep" ]]; then - export wtime_waveprep="00:30:00" - export npe_waveprep=65 + export wtime_waveprep="00:10:00" + export npe_waveprep=5 + export npe_waveprep_gfs=65 export nth_waveprep=1 - export npe_node_waveprep=$(echo "$npe_node_max / $nth_waveprep" | bc) + export nth_waveprep_gfs=1 + npe_node_waveprep=$(echo "${npe_node_max} / ${nth_waveprep}" | bc) + export npe_node_waveprep + npe_node_waveprep_gfs=$(echo "${npe_node_max} / ${nth_waveprep_gfs}" | bc) + export npe_node_waveprep_gfs export NTASKS=${npe_waveprep} + export NTASKS_gfs=${npe_waveprep_gfs} + export memory_waveprep="100GB" + export 
memory_waveprep_gfs="150GB" -elif [ $step = "wavepostsbs" ]; then +elif [[ ${step} = "wavepostsbs" ]]; then - export wtime_wavepostsbs="06:00:00" - export npe_wavepostsbs=10 + export wtime_wavepostsbs="00:20:00" + export wtime_wavepostsbs_gfs="03:00:00" + export npe_wavepostsbs=8 export nth_wavepostsbs=1 - export npe_node_wavepostsbs=$(echo "$npe_node_max / $nth_wavepostsbs" | bc) + npe_node_wavepostsbs=$(echo "${npe_node_max} / ${nth_wavepostsbs}" | bc) + export npe_node_wavepostsbs export NTASKS=${npe_wavepostsbs} + export memory_wavepostsbs="10GB" + export memory_wavepostsbs_gfs="10GB" -elif [ $step = "wavepostbndpnt" ]; then +elif [[ ${step} = "wavepostbndpnt" ]]; then - export wtime_wavepostbndpnt="02:00:00" - export npe_wavepostbndpnt=280 + export wtime_wavepostbndpnt="01:00:00" + export npe_wavepostbndpnt=240 export nth_wavepostbndpnt=1 - export npe_node_wavepostbndpnt=$(echo "$npe_node_max / $nth_wavepostbndpnt" | bc) + npe_node_wavepostbndpnt=$(echo "${npe_node_max} / ${nth_wavepostbndpnt}" | bc) + export npe_node_wavepostbndpnt export NTASKS=${npe_wavepostbndpnt} + export is_exclusive=True -elif [ $step = "wavepostbndpntbll" ]; then +elif [[ ${step} = "wavepostbndpntbll" ]]; then export wtime_wavepostbndpntbll="01:00:00" - export npe_wavepostbndpntbll=280 + export npe_wavepostbndpntbll=448 export nth_wavepostbndpntbll=1 - export npe_node_wavepostbndpntbll=$(echo "$npe_node_max / $nth_wavepostbndpntbll" | bc) + npe_node_wavepostbndpntbll=$(echo "${npe_node_max} / ${nth_wavepostbndpntbll}" | bc) + export npe_node_wavepostbndpntbll export NTASKS=${npe_wavepostbndpntbll} + export is_exclusive=True -elif [ $step = "wavepostpnt" ]; then +elif [[ ${step} = "wavepostpnt" ]]; then - export wtime_wavepostpnt="02:00:00" - export npe_wavepostpnt=280 + export wtime_wavepostpnt="01:30:00" + export npe_wavepostpnt=200 export nth_wavepostpnt=1 - export npe_node_wavepostpnt=$(echo "$npe_node_max / $nth_wavepostpnt" | bc) + npe_node_wavepostpnt=$(echo "${npe_node_max} / 
${nth_wavepostpnt}" | bc) + export npe_node_wavepostpnt export NTASKS=${npe_wavepostpnt} + export is_exclusive=True -elif [ $step = "wavegempak" ]; then +elif [[ ${step} = "wavegempak" ]]; then - export wtime_wavegempak="01:00:00" - export npe_wavegempak=$npe_node_max + export wtime_wavegempak="02:00:00" + export npe_wavegempak=1 export nth_wavegempak=1 - export npe_node_wavegempak=$(echo "$npe_node_max / $nth_wavegempak" | bc) + npe_node_wavegempak=$(echo "${npe_node_max} / ${nth_wavegempak}" | bc) + export npe_node_wavegempak export NTASKS=${npe_wavegempak} + export memory_wavegempak="1GB" -elif [ $step = "waveawipsbulls" ]; then +elif [[ ${step} = "waveawipsbulls" ]]; then - export wtime_waveawipsbulls="00:30:00" - export npe_waveawipsbulls=$npe_node_max + export wtime_waveawipsbulls="00:20:00" + export npe_waveawipsbulls=1 export nth_waveawipsbulls=1 - export npe_node_waveawipsbulls=$(echo "$npe_node_max / $nth_waveawipsbulls" | bc) + npe_node_waveawipsbulls=$(echo "${npe_node_max} / ${nth_waveawipsbulls}" | bc) + export npe_node_waveawipsbulls export NTASKS=${npe_waveawipsbulls} + export is_exclusive=True -elif [ $step = "waveawipsgridded" ]; then +elif [[ ${step} = "waveawipsgridded" ]]; then - export wtime_waveawipsgridded="00:30:00" - export npe_waveawipsgridded=$npe_node_max + export wtime_waveawipsgridded="02:00:00" + export npe_waveawipsgridded=1 export nth_waveawipsgridded=1 - export npe_node_waveawipsgridded=$(echo "$npe_node_max / $nth_waveawipsgridded" | bc) + npe_node_waveawipsgridded=$(echo "${npe_node_max} / ${nth_waveawipsgridded}" | bc) + export npe_node_waveawipsgridded export NTASKS=${npe_waveawipsgridded} + export memory_waveawipsgridded_gfs="1GB" -elif [ $step = "atmanalprep" ]; then +elif [[ "${step}" = "atmanlinit" ]]; then - export wtime_atmanalprep="00:10:00" - export npe_atmanalprep=1 - export nth_atmanalprep=1 - export npe_node_atmanalprep=$(echo "$npe_node_max / $nth_atmanalprep" | bc) - export memory_atmanalprep="3072M" + export 
wtime_atmanlinit="00:10:00" + export npe_atmanlinit=1 + export nth_atmanlinit=1 + npe_node_atmanlinit=$(echo "${npe_node_max} / ${nth_atmanlinit}" | bc) + export npe_node_atmanlinit + export memory_atmanlinit="3072M" -elif [ $step = "atmanalrun" ]; then +elif [[ "${step}" = "atmanlrun" ]]; then # make below case dependent later export layout_x=1 export layout_y=1 - export wtime_atmanalrun="00:30:00" - export npe_atmanalrun=$(echo "$layout_x * $layout_y * 6" | bc) - export npe_atmanalrun_gfs=$(echo "$layout_x * $layout_y * 6" | bc) - export nth_atmanalrun=1 - export nth_atmanalrun_gfs=$nth_atmanalrun - export native_atmanalrun="--exclusive" - export npe_node_atmanalrun=$(echo "$npe_node_max / $nth_atmanalrun" | bc) - -elif [ $step = "atmanalpost" ]; then - - export wtime_atmanalpost="00:30:00" - export npe_atmanalpost=$npe_node_max - export nth_atmanalpost=1 - export npe_node_atmanalpost=$(echo "$npe_node_max / $nth_atmanalpost" | bc) - -elif [ $step = "anal" ]; then + export wtime_atmanlrun="00:30:00" + npe_atmanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_atmanlrun + npe_atmanlrun_gfs=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_atmanlrun_gfs + export nth_atmanlrun=1 + export nth_atmanlrun_gfs=${nth_atmanlrun} + npe_node_atmanlrun=$(echo "${npe_node_max} / ${nth_atmanlrun}" | bc) + export npe_node_atmanlrun + export is_exclusive=True + +elif [[ "${step}" = "atmanlfinal" ]]; then + + export wtime_atmanlfinal="00:30:00" + export npe_atmanlfinal=${npe_node_max} + export nth_atmanlfinal=1 + npe_node_atmanlfinal=$(echo "${npe_node_max} / ${nth_atmanlfinal}" | bc) + export npe_node_atmanlfinal + export is_exclusive=True + +elif [[ "${step}" = "landanlinit" || "${step}" = "landanlrun" || "${step}" = "landanlfinal" ]]; then + # below lines are for creating JEDI YAML + case ${CASE} in + C768) + layout_x=6 + layout_y=6 + ;; + C384) + layout_x=5 + layout_y=5 + ;; + C192 | C96 | C48) + layout_x=1 + layout_y=1 + ;; + *) + echo "FATAL ERROR: 
Resolution not supported for land analysis" + exit 1 + esac + + export layout_x + export layout_y + + if [[ "${step}" = "landanlinit" || "${step}" = "landanlfinal" ]]; then + declare -x "wtime_${step}"="00:10:00" + declare -x "npe_${step}"=1 + declare -x "nth_${step}"=1 + temp_stepname="nth_${step}" + declare -x "npe_node_${step}"="$(echo "${npe_node_max} / ${!temp_stepname}" | bc)" + declare -x "memory_${step}"="3072M" + elif [[ "${step}" = "landanlrun" ]]; then + export wtime_landanlrun="00:30:00" + npe_landanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_landanlrun + export nth_landanlrun=1 + npe_node_landanlrun=$(echo "${npe_node_max} / ${nth_landanlrun}" | bc) + export npe_node_landanlrun + export is_exclusive=True + fi - export wtime_anal="01:00:00" - export npe_anal=1000 +elif [[ "${step}" = "aeroanlinit" ]]; then + + # below lines are for creating JEDI YAML + case ${CASE} in + C768) + layout_x=6 + layout_y=6 + ;; + C384) + layout_x=5 + layout_y=5 + ;; + C192 | C96 | C48) + layout_x=8 + layout_y=8 + ;; + *) + echo "FATAL ERROR: Resolution not supported for aerosol analysis" + exit 1 + esac + + export layout_x + export layout_y + + export wtime_aeroanlinit="00:10:00" + export npe_aeroanlinit=1 + export nth_aeroanlinit=1 + npe_node_aeroanlinit=$(echo "${npe_node_max} / ${nth_aeroanlinit}" | bc) + export npe_node_aeroanlinit + export memory_aeroanlinit="3072M" + +elif [[ "${step}" = "aeroanlrun" ]]; then + + case ${CASE} in + C768) + layout_x=6 + layout_y=6 + ;; + C384) + layout_x=5 + layout_y=5 + ;; + C192 | C96 | C48) + layout_x=8 + layout_y=8 + ;; + *) + echo "FATAL ERROR: Resolution ${CASE} is not supported, ABORT!"
+ exit 1 + esac + + export layout_x + export layout_y + + export wtime_aeroanlrun="00:30:00" + npe_aeroanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_aeroanlrun + npe_aeroanlrun_gfs=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_aeroanlrun_gfs + export nth_aeroanlrun=1 + export nth_aeroanlrun_gfs=1 + npe_node_aeroanlrun=$(echo "${npe_node_max} / ${nth_aeroanlrun}" | bc) + export npe_node_aeroanlrun + export is_exclusive=True + +elif [[ "${step}" = "aeroanlfinal" ]]; then + + export wtime_aeroanlfinal="00:10:00" + export npe_aeroanlfinal=1 + export nth_aeroanlfinal=1 + npe_node_aeroanlfinal=$(echo "${npe_node_max} / ${nth_aeroanlfinal}" | bc) + export npe_node_aeroanlfinal + export memory_aeroanlfinal="3072M" + +elif [[ "${step}" = "ocnanalprep" ]]; then + + export wtime_ocnanalprep="00:10:00" + export npe_ocnanalprep=1 + export nth_ocnanalprep=1 + npe_node_ocnanalprep=$(echo "${npe_node_max} / ${nth_ocnanalprep}" | bc) + export npe_node_ocnanalprep + export memory_ocnanalprep="24GB" + +elif [[ "${step}" = "ocnanalbmat" ]]; then + npes=16 + case ${CASE} in + C384) + npes=480 + ;; + C48) + npes=16 + ;; + *) + echo "FATAL: Resolution not supported" + exit 1 + esac + + export wtime_ocnanalbmat="00:30:00" + export npe_ocnanalbmat=${npes} + export nth_ocnanalbmat=1 + export is_exclusive=True + npe_node_ocnanalbmat=$(echo "${npe_node_max} / ${nth_ocnanalbmat}" | bc) + export npe_node_ocnanalbmat + +elif [[ "${step}" = "ocnanalrun" ]]; then + npes=16 + case ${CASE} in + C384) + npes=480 + ;; + C48) + npes=16 + ;; + *) + echo "FATAL: Resolution not supported" + exit 1 + esac + + export wtime_ocnanalrun="00:30:00" + export npe_ocnanalrun=${npes} + export nth_ocnanalrun=1 + export is_exclusive=True + npe_node_ocnanalrun=$(echo "${npe_node_max} / ${nth_ocnanalrun}" | bc) + export npe_node_ocnanalrun + +elif [[ "${step}" = "ocnanalchkpt" ]]; then + + export wtime_ocnanalchkpt="00:10:00" + export npe_ocnanalchkpt=1 + export nth_ocnanalchkpt=1 + 
npe_node_ocnanalchkpt=$(echo "${npe_node_max} / ${nth_ocnanalchkpt}" | bc) + export npe_node_ocnanalchkpt + case ${CASE} in + C384) + export memory_ocnanalchkpt="128GB" + ;; + C48) + export memory_ocnanalchkpt="32GB" + ;; + *) + echo "FATAL: Resolution not supported'" + exit 1 + esac + +elif [[ "${step}" = "ocnanalpost" ]]; then + + export wtime_ocnanalpost="00:30:00" + export npe_ocnanalpost=${npe_node_max} + export nth_ocnanalpost=1 + npe_node_ocnanalpost=$(echo "${npe_node_max} / ${nth_ocnanalpost}" | bc) + export npe_node_ocnanalpost + +elif [[ "${step}" = "ocnanalvrfy" ]]; then + + export wtime_ocnanalvrfy="00:35:00" + export npe_ocnanalvrfy=1 + export nth_ocnanalvrfy=1 + npe_node_ocnanalvrfy=$(echo "${npe_node_max} / ${nth_ocnanalvrfy}" | bc) + export npe_node_ocnanalvrfy + export memory_ocnanalvrfy="24GB" + +elif [[ ${step} = "anal" ]]; then + + export wtime_anal="00:50:00" + export wtime_anal_gfs="00:40:00" + export npe_anal=780 export nth_anal=5 - export npe_anal_gfs=1000 - if [ $CASE = "C384" ]; then - export npe_anal=400 - export npe_anal_gfs=400 + export npe_anal_gfs=825 + export nth_anal_gfs=5 + if [[ "${machine}" = "WCOSS2" ]]; then + export nth_anal=8 + export nth_anal_gfs=8 + fi + if [[ ${CASE} = "C384" ]]; then + export npe_anal=160 + export npe_anal_gfs=160 + export nth_anal=10 + export nth_anal_gfs=10 + if [[ ${machine} = "S4" ]]; then + #On the S4-s4 partition, this is accomplished by increasing the task + #count to a multiple of 32 + if [[ ${PARTITION_BATCH} = "s4" ]]; then + export npe_anal=416 + export npe_anal_gfs=416 + fi + #S4 is small, so run this task with just 1 thread + export nth_anal=1 + export nth_anal_gfs=1 + export wtime_anal="02:00:00" + fi fi - if [ $CASE = "C192" -o $CASE = "C96" -o $CASE = "C48" ]; then + if [[ ${CASE} = "C192" || ${CASE} = "C96" || ${CASE} = "C48" ]]; then export npe_anal=84 export npe_anal_gfs=84 + if [[ ${machine} = "S4" ]]; then + export nth_anal=4 + export nth_anal_gfs=4 + #Adjust job count for S4 + if [[ 
${PARTITION_BATCH} = "s4" ]]; then + export npe_anal=88 + export npe_anal_gfs=88 + elif [[ ${PARTITION_BATCH} = "ivy" ]]; then + export npe_anal=90 + export npe_anal_gfs=90 + fi + fi fi - export nth_anal_gfs=$nth_anal - export npe_node_anal=$(echo "$npe_node_max / $nth_anal" | bc) - export nth_cycle=$nth_anal + npe_node_anal=$(echo "${npe_node_max} / ${nth_anal}" | bc) + export npe_node_anal + export nth_cycle=${nth_anal} + npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) + export npe_node_cycle + export is_exclusive=True -elif [ $step = "analcalc" ]; then +elif [[ ${step} = "analcalc" ]]; then export wtime_analcalc="00:10:00" export npe_analcalc=127 + export ntasks="${npe_analcalc}" export nth_analcalc=1 - export npe_node_analcalc=$npe_node_max + export nth_echgres=4 + export nth_echgres_gfs=12 + npe_node_analcalc=$(echo "${npe_node_max} / ${nth_analcalc}" | bc) + export npe_node_analcalc + export is_exclusive=True -elif [ $step = "analdiag" ]; then +elif [[ ${step} = "analdiag" ]]; then - export wtime_analdiag="00:10:00" - export npe_analdiag=112 + export wtime_analdiag="00:15:00" + export npe_analdiag=96 # Should be at least twice npe_ediag export nth_analdiag=1 - export npe_node_analdiag=$npe_node_max + npe_node_analdiag=$(echo "${npe_node_max} / ${nth_analdiag}" | bc) + export npe_node_analdiag + export memory_analdiag="48GB" -elif [ $step = "sfcanl" ]; then +elif [[ ${step} = "sfcanl" ]]; then export wtime_sfcanl="00:10:00" export npe_sfcanl=6 export nth_sfcanl=1 - export npe_node_sfcanl=$(echo "$npe_node_max / $nth_sfcanl" | bc) + npe_node_sfcanl=$(echo "${npe_node_max} / ${nth_sfcanl}" | bc) + export npe_node_sfcanl + export is_exclusive=True -elif [ $step = "gldas" ]; then +elif [[ ${step} = "gldas" ]]; then export wtime_gldas="00:10:00" - export npe_gldas=96 + export npe_gldas=112 export nth_gldas=1 - export npe_node_gldas=$npe_node_max + npe_node_gldas=$(echo "${npe_node_max} / ${nth_gldas}" | bc) + export npe_node_gldas export 
npe_gaussian=96 export nth_gaussian=1 - export npe_node_gaussian=24 + npe_node_gaussian=$(echo "${npe_node_max} / ${nth_gaussian}" | bc) + export npe_node_gaussian + export is_exclusive=True -elif [ $step = "fcst" ]; then +elif [[ "${step}" = "fcst" || "${step}" = "efcs" ]]; then - export wtime_fcst="00:30:00" - if [ $CASE = "C768" ]; then - export wtime_fcst_gfs="06:00:00" - elif [ $CASE = "C384" ]; then - export wtime_fcst_gfs="06:00:00" - else - export wtime_fcst_gfs="03:00:00" + export is_exclusive=True + + if [[ "${step}" = "fcst" ]]; then + _CDUMP_LIST=${CDUMP:-"gdas gfs"} + elif [[ "${step}" = "efcs" ]]; then + _CDUMP_LIST=${CDUMP:-"enkfgdas enkfgfs"} fi # During workflow creation, we need resources for all CDUMPs and CDUMP is undefined - CDUMP_LIST=${CDUMP:-"gdas gfs"} - for CDUMP in $CDUMP_LIST; do - if [[ "$CDUMP" == "gfs" ]]; then - export layout_x=$layout_x_gfs - export layout_y=$layout_y_gfs - export WRITE_GROUP=$WRITE_GROUP_GFS - export WRTTASK_PER_GROUP=$WRTTASK_PER_GROUP_GFS + for _CDUMP in ${_CDUMP_LIST}; do + if [[ "${_CDUMP}" =~ "gfs" ]]; then + export layout_x=${layout_x_gfs} + export layout_y=${layout_y_gfs} + export WRITE_GROUP=${WRITE_GROUP_GFS} + export WRTTASK_PER_GROUP_PER_THREAD=${WRTTASK_PER_GROUP_PER_THREAD_GFS} + ntasks_fv3=${ntasks_fv3_gfs} + ntasks_quilt=${ntasks_quilt_gfs} + nthreads_fv3=${nthreads_fv3_gfs} fi - (( ATMPETS = layout_x * layout_y * 6 )) - - # Mediator only uses the atm model PETS or less - export MEDPETS=${MEDPETS:-ATMPETS} + # PETS for the atmosphere dycore + (( FV3PETS = ntasks_fv3 * nthreads_fv3 )) + echo "FV3 using (nthreads, PETS) = (${nthreads_fv3}, ${FV3PETS})" - if [[ $DO_AERO == "YES" ]]; then - # Aerosol model only uses the atm model PETS - export CHMPETS=$ATMPETS - # Aerosol model runs on same PETs as ATM, so don't add to $NTASKS_TOT + # PETS for quilting + if [[ "${QUILTING:-}" = ".true." 
]]; then + (( QUILTPETS = ntasks_quilt * nthreads_fv3 )) + (( WRTTASK_PER_GROUP = WRTTASK_PER_GROUP_PER_THREAD )) + export WRTTASK_PER_GROUP + else + QUILTPETS=0 fi - - # If using in-line post, add the write tasks to the ATMPETS - if [[ $QUILTING == ".true." ]]; then - (( ATMPETS = ATMPETS + WRITE_GROUP * WRTTASK_PER_GROUP )) + echo "QUILT using (nthreads, PETS) = (${nthreads_fv3}, ${QUILTPETS})" + + # Total PETS for the atmosphere component + ATMTHREADS=${nthreads_fv3} + (( ATMPETS = FV3PETS + QUILTPETS )) + export ATMPETS ATMTHREADS + echo "FV3ATM using (nthreads, PETS) = (${ATMTHREADS}, ${ATMPETS})" + + # Total PETS for the coupled model (starting w/ the atmosphere) + NTASKS_TOT=${ATMPETS} + + # The mediator PETS can overlap with other components; usually they land on the atmosphere tasks. + # However, it is suggested to limit the mediator PETS to 300, as more can cause slow performance. + # See https://docs.google.com/document/d/1bKpi-52t5jIfv2tuNHmQkYUe3hkKsiG_DG_s6Mnukog/edit + # TODO: Update reference when moved to ufs-weather-model RTD + MEDTHREADS=${nthreads_mediator:-1} + MEDPETS=${MEDPETS:-ATMPETS} + [[ "${MEDPETS}" -gt 300 ]] && MEDPETS=300 + export MEDPETS MEDTHREADS + echo "MEDIATOR using (threads, PETS) = (${MEDTHREADS}, ${MEDPETS})" + + if [[ "${DO_AERO}" = "YES" ]]; then + # GOCART shares the same grid and forecast tasks as FV3 (do not add write grid component tasks).
+ (( CHMTHREADS = ATMTHREADS )) + (( CHMPETS = FV3PETS )) + # Do not add to NTASKS_TOT + export CHMPETS CHMTHREADS + echo "GOCART using (threads, PETS) = (${CHMTHREADS}, ${CHMPETS})" fi - export ATMPETS - NTASKS_TOT=$ATMPETS - - export nth_fcst=${nth_fv3:-2} - export nth_fcst_gfs=${nth_fv3_gfs:-2} - - export npe_node_fcst=$(echo "$npe_node_max / $nth_fcst" | bc) - export npe_node_fcst_gfs=$(echo "$npe_node_max / $nth_fcst_gfs" | bc) - - if [[ $DO_WAVE == "YES" ]]; then - case $waveGRD in - 'gnh_10m aoc_9km gsh_15m') export WAVPETS=140 ;; - 'gwes_30m') export WAVPETS=100 ;; - 'mx050') export WAVPETS=240 ;; - 'mx025') export WAVPETS=80 ;; - *) - echo "FATAL: Number of PEs not defined for wave grid '$waveGRD'" - echo " Please add an entry to config.resources within fcst for this grid" - exit 3 - esac - (( NTASKS_TOT = NTASKS_TOT + WAVPETS )) + + if [[ "${DO_WAVE}" = "YES" ]]; then + (( WAVPETS = ntasks_ww3 * nthreads_ww3 )) + (( WAVTHREADS = nthreads_ww3 )) + export WAVPETS WAVTHREADS + echo "WW3 using (threads, PETS) = (${WAVTHREADS}, ${WAVPETS})" + (( NTASKS_TOT = NTASKS_TOT + WAVPETS )) fi - if [[ $DO_OCN == "YES" ]]; then - case $OCNRES in - # Except for 025, these are guesses for now - 100) export OCNPETS=20 ;; - 050) export OCNPETS=60 ;; - 025) export OCNPETS=220 ;; - *) - echo "FATAL: Number of PEs not defined for ocean resolution '$OCNRES'" - echo " Please add an entry to config.resources within fcst for this resolution" - exit 3 - esac - (( NTASKS_TOT = NTASKS_TOT + OCNPETS )) + if [[ "${DO_OCN}" = "YES" ]]; then + (( OCNPETS = ntasks_mom6 * nthreads_mom6 )) + (( OCNTHREADS = nthreads_mom6 )) + export OCNPETS OCNTHREADS + echo "MOM6 using (threads, PETS) = (${OCNTHREADS}, ${OCNPETS})" + (( NTASKS_TOT = NTASKS_TOT + OCNPETS )) fi - if [[ $DO_ICE == "YES" ]]; then - case $ICERES in - # Except for 025, these are guesses for now - 100) export ICEPETS=10 ;; - 050) export ICEPETS=30 ;; - 025) export ICEPETS=120 ;; - *) - echo "FATAL: Number of PEs not defined for 
ice resolution '$ICERES'" - echo " Please add an entry to config.resources within fcst for this resolution" - exit 3 - esac - (( NTASKS_TOT = NTASKS_TOT + ICEPETS )) + if [[ "${DO_ICE}" = "YES" ]]; then + (( ICEPETS = ntasks_cice6 * nthreads_cice6 )) + (( ICETHREADS = nthreads_cice6 )) + export ICEPETS ICETHREADS + echo "CICE6 using (threads, PETS) = (${ICETHREADS}, ${ICEPETS})" + (( NTASKS_TOT = NTASKS_TOT + ICEPETS )) fi - if [[ $CDUMP == "gfs" ]]; then - export npe_fcst_gfs=$NTASKS_TOT + echo "Total PETS for ${_CDUMP} = ${NTASKS_TOT}" + + if [[ "${_CDUMP}" =~ "gfs" ]]; then + declare -x "npe_${step}_gfs"="${NTASKS_TOT}" + declare -x "nth_${step}_gfs"=1 # ESMF handles threading for the UFS-weather-model + declare -x "npe_node_${step}_gfs"="${npe_node_max}" else - export npe_fcst=$NTASKS_TOT + declare -x "npe_${step}"="${NTASKS_TOT}" + declare -x "nth_${step}"=1 # ESMF handles threading for the UFS-weather-model + declare -x "npe_node_${step}"="${npe_node_max}" fi + done -elif [ $step = "ocnpost" ]; then + case "${CASE}" in + "C48" | "C96" | "C192") + declare -x "wtime_${step}"="00:30:00" + declare -x "wtime_${step}_gfs"="03:00:00" + ;; + "C384" | "C768" | "C1152") + declare -x "wtime_${step}"="01:00:00" + declare -x "wtime_${step}_gfs"="06:00:00" + ;; + *) + echo "FATAL ERROR: Resolution ${CASE} not supported in ${step}" + exit 1 + ;; + esac + + unset _CDUMP _CDUMP_LIST + unset NTASKS_TOT + +elif [[ ${step} = "ocnpost" ]]; then export wtime_ocnpost="00:30:00" export npe_ocnpost=1 export npe_node_ocnpost=1 export nth_ocnpost=1 export memory_ocnpost="96G" + if [[ ${machine} == "JET" ]]; then + # JET only has 88GB of requestable memory per node + # so a second node is required to meet the requirement + npe_ocnpost=2 + fi -elif [ $step = "post" ]; then +elif [[ ${step} = "post" ]]; then - export wtime_post="02:00:00" - export wtime_post_gfs="06:00:00" - export memory_post="50G" - export npe_post=112 + export wtime_post="00:12:00" + export wtime_post_gfs="01:00:00" + 
export npe_post=126 + res=$(echo "${CASE}" | cut -c2-) + if (( npe_post > res )); then + export npe_post=${res} + fi export nth_post=1 - export npe_node_post=12 - export npe_node_dwn=$npe_node_max + export npe_node_post=${npe_post} + export npe_node_post_gfs=${npe_post} + export npe_node_dwn=${npe_node_max} + if [[ "${npe_node_post}" -gt "${npe_node_max}" ]]; then export npe_node_post=${npe_node_max} ; fi + if [[ "${npe_node_post_gfs}" -gt "${npe_node_max}" ]]; then export npe_node_post_gfs=${npe_node_max} ; fi + export is_exclusive=True -elif [ $step = "wafs" ]; then +elif [[ ${step} = "wafs" ]]; then export wtime_wafs="00:30:00" export npe_wafs=1 - export npe_node_wafs=1 + export npe_node_wafs=${npe_wafs} export nth_wafs=1 + export memory_wafs="1GB" -elif [ $step = "wafsgcip" ]; then +elif [[ ${step} = "wafsgcip" ]]; then export wtime_wafsgcip="00:30:00" export npe_wafsgcip=2 - export npe_node_wafsgcip=1 export nth_wafsgcip=1 + export npe_node_wafsgcip=1 + export memory_wafsgcip="50GB" -elif [ $step = "wafsgrib2" ]; then +elif [[ ${step} = "wafsgrib2" ]]; then export wtime_wafsgrib2="00:30:00" - export npe_wafsgrib2=1 - export npe_node_wafsgrib2=1 + export npe_wafsgrib2=18 export nth_wafsgrib2=1 + npe_node_wafsgrib2=$(echo "${npe_node_max} / ${nth_wafsgrib2}" | bc) + export npe_node_wafsgrib2 + export memory_wafsgrib2="80GB" -elif [ $step = "wafsblending" ]; then +elif [[ ${step} = "wafsblending" ]]; then export wtime_wafsblending="00:30:00" export npe_wafsblending=1 - export npe_node_wafsblending=1 export nth_wafsblending=1 + npe_node_wafsblending=$(echo "${npe_node_max} / ${nth_wafsblending}" | bc) + export npe_node_wafsblending + export memory_wafsblending="15GB" -elif [ $step = "wafsgrib20p25" ]; then +elif [[ ${step} = "wafsgrib20p25" ]]; then export wtime_wafsgrib20p25="00:30:00" - export npe_wafsgrib20p25=1 - export npe_node_wafsgrib20p25=1 + export npe_wafsgrib20p25=11 export nth_wafsgrib20p25=1 + npe_node_wafsgrib20p25=$(echo "${npe_node_max} / 
${nth_wafsgrib20p25}" | bc) + export npe_node_wafsgrib20p25 + export memory_wafsgrib20p25="80GB" -elif [ $step = "wafsblending0p25" ]; then +elif [[ ${step} = "wafsblending0p25" ]]; then export wtime_wafsblending0p25="00:30:00" export npe_wafsblending0p25=1 - export npe_node_wafsblending0p25=1 export nth_wafsblending0p25=1 + npe_node_wafsblending0p25=$(echo "${npe_node_max} / ${nth_wafsblending0p25}" | bc) + export npe_node_wafsblending0p25 + export memory_wafsblending0p25="15GB" -elif [ $step = "vrfy" ]; then +elif [[ ${step} = "vrfy" ]]; then export wtime_vrfy="03:00:00" export wtime_vrfy_gfs="06:00:00" @@ -368,11 +712,21 @@ elif [ $step = "vrfy" ]; then export npe_node_vrfy=1 export npe_vrfy_gfs=1 export npe_node_vrfy_gfs=1 - if [[ "$machine" == "HERA" ]]; then + if [[ ${machine} == "HERA" ]]; then export memory_vrfy="16384M" fi + export is_exclusive=True -elif [ $step = "metp" ]; then +elif [[ "${step}" = "fit2obs" ]]; then + + export wtime_fit2obs="00:20:00" + export npe_fit2obs=3 + export nth_fit2obs=1 + export npe_node_fit2obs=1 + export memory_fit2obs="20G" + if [[ ${machine} == "WCOSS2" ]]; then export npe_node_fit2obs=3 ; fi + +elif [[ "${step}" = "metp" ]]; then export nth_metp=1 export wtime_metp="03:00:00" @@ -381,200 +735,235 @@ elif [ $step = "metp" ]; then export wtime_metp_gfs="06:00:00" export npe_metp_gfs=4 export npe_node_metp_gfs=4 + export is_exclusive=True -elif [ $step = "echgres" ]; then +elif [[ ${step} = "echgres" ]]; then export wtime_echgres="00:10:00" export npe_echgres=3 - export nth_echgres=$npe_node_max + export nth_echgres=${npe_node_max} export npe_node_echgres=1 + if [[ "${machine}" = "WCOSS2" ]]; then + export memory_echgres="200GB" + fi -elif [ $step = "init" ]; then +elif [[ ${step} = "init" ]]; then export wtime_init="00:30:00" export npe_init=24 export nth_init=1 export npe_node_init=6 - if [ $machine = "JET" ]; then - export memory_init="50G" - else - export memory_init="70G" - fi + export memory_init="70G" -elif [ $step = 
"init_chem" ]; then +elif [[ ${step} = "init_chem" ]]; then export wtime_init_chem="00:30:00" export npe_init_chem=1 export npe_node_init_chem=1 + export is_exclusive=True -elif [ $step = "mom6ic" ]; then +elif [[ ${step} = "mom6ic" ]]; then export wtime_mom6ic="00:30:00" export npe_mom6ic=24 export npe_node_mom6ic=24 + export is_exclusive=True -elif [ $step = "arch" -o $step = "earc" -o $step = "getic" ]; then +elif [[ ${step} = "arch" || ${step} = "earc" || ${step} = "getic" ]]; then - eval "export wtime_$step='06:00:00'" - eval "export npe_$step=1" - eval "export npe_node_$step=1" - eval "export nth_$step=1" - eval "export memory_$step=2048M" + eval "export wtime_${step}='06:00:00'" + eval "export npe_${step}=1" + eval "export npe_node_${step}=1" + eval "export nth_${step}=1" + eval "export memory_${step}=4096M" + if [[ "${machine}" = "WCOSS2" ]]; then + eval "export memory_${step}=50GB" + fi -elif [ $step = "coupled_ic" ]; then +elif [[ ${step} = "coupled_ic" ]]; then export wtime_coupled_ic="00:15:00" export npe_coupled_ic=1 export npe_node_coupled_ic=1 export nth_coupled_ic=1 + export is_exclusive=True -elif [ $step = "atmensanalprep" ]; then +elif [[ "${step}" = "atmensanlinit" ]]; then - export wtime_atmensanalprep="00:10:00" - export npe_atmensanalprep=1 - export nth_atmensanalprep=1 - export npe_node_atmensanalprep=$(echo "$npe_node_max / $nth_atmensanalprep" | bc) + export wtime_atmensanlinit="00:10:00" + export npe_atmensanlinit=1 + export nth_atmensanlinit=1 + npe_node_atmensanlinit=$(echo "${npe_node_max} / ${nth_atmensanlinit}" | bc) + export npe_node_atmensanlinit + export memory_atmensanlinit="3072M" -elif [ $step = "atmensanalrun" ]; then +elif [[ "${step}" = "atmensanlrun" ]]; then # make below case dependent later - export layout_x=2 - export layout_y=3 - - export wtime_atmensanalrun="00:30:00" - export npe_atmensanalrun=$(echo "$layout_x * $layout_y * 6" | bc) - export npe_atmensanalrun_gfs=$(echo "$layout_x * $layout_y * 6" | bc) - export 
nth_atmensanalrun=1 - export nth_atmensanalrun_gfs=$nth_atmensanalrun - export native_atmensanalrun="--exclusive" - export npe_node_atmensanalrun=$(echo "$npe_node_max / $nth_atmensanalrun" | bc) - -elif [ $step = "atmensanalpost" ]; then - - export wtime_atmensanalpost="00:30:00" - export npe_atmensanalpost=$npe_node_max - export nth_atmensanalpost=1 - export npe_node_atmensanalpost=$(echo "$npe_node_max / $nth_atmensanalpost" | bc) - -elif [ $step = "eobs" -o $step = "eomg" ]; then + export layout_x=1 + export layout_y=1 - export wtime_eobs="00:45:00" + export wtime_atmensanlrun="00:30:00" + npe_atmensanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_atmensanlrun + npe_atmensanlrun_gfs=$(echo "${layout_x} * ${layout_y} * 6" | bc) + export npe_atmensanlrun_gfs + export nth_atmensanlrun=1 + export nth_atmensanlrun_gfs=${nth_atmensanlrun} + npe_node_atmensanlrun=$(echo "${npe_node_max} / ${nth_atmensanlrun}" | bc) + export npe_node_atmensanlrun + export is_exclusive=True + +elif [[ "${step}" = "atmensanlfinal" ]]; then + + export wtime_atmensanlfinal="00:30:00" + export npe_atmensanlfinal=${npe_node_max} + export nth_atmensanlfinal=1 + npe_node_atmensanlfinal=$(echo "${npe_node_max} / ${nth_atmensanlfinal}" | bc) + export npe_node_atmensanlfinal + export is_exclusive=True + +elif [[ ${step} = "eobs" || ${step} = "eomg" ]]; then + + export wtime_eobs="00:15:00" export wtime_eomg="01:00:00" - if [ $CASE = "C768" ]; then + if [[ ${CASE} = "C768" ]]; then export npe_eobs=200 - elif [ $CASE = "C384" ]; then + elif [[ ${CASE} = "C384" ]]; then export npe_eobs=100 - elif [ $CASE = "C192" ]; then + elif [[ ${CASE} = "C192" || ${CASE} = "C96" || ${CASE} = "C48" ]]; then export npe_eobs=40 - elif [ $CASE = "C96" -o $CASE = "C48" ]; then - export npe_eobs=20 fi - export npe_eomg=$npe_eobs + export npe_eomg=${npe_eobs} export nth_eobs=2 - export nth_eomg=$nth_eobs - export npe_node_eobs=$(echo "$npe_node_max / $nth_eobs" | bc) - export 
npe_node_eomg=$npe_node_eobs + export nth_eomg=${nth_eobs} + npe_node_eobs=$(echo "${npe_node_max} / ${nth_eobs}" | bc) + export npe_node_eobs + export npe_node_eomg=${npe_node_eobs} + export is_exclusive=True + #The number of tasks and cores used must be the same for eobs + #For S4, this is accomplished by running 10 tasks/node + if [[ ${machine} = "S4" ]]; then + export npe_node_eobs=10 + fi -elif [ $step = "ediag" ]; then +elif [[ ${step} = "ediag" ]]; then - export wtime_ediag="00:06:00" - export npe_ediag=56 + export wtime_ediag="00:15:00" + export npe_ediag=48 export nth_ediag=1 - export npe_node_ediag=$npe_node_max + npe_node_ediag=$(echo "${npe_node_max} / ${nth_ediag}" | bc) + export npe_node_ediag + export memory_ediag="30GB" -elif [ $step = "eupd" ]; then +elif [[ ${step} = "eupd" ]]; then export wtime_eupd="00:30:00" - if [ $CASE = "C768" ]; then + if [[ ${CASE} = "C768" ]]; then export npe_eupd=480 export nth_eupd=6 - if [[ "$machine" = "HERA" ]]; then - export npe_eupd=150 - export nth_eupd=40 + if [[ "${machine}" = "WCOSS2" ]]; then + export npe_eupd=315 + export nth_eupd=14 fi - elif [ $CASE = "C384" ]; then + elif [[ ${CASE} = "C384" ]]; then export npe_eupd=270 export nth_eupd=2 - if [[ "$machine" = "HERA" ]]; then - export npe_eupd=100 - export nth_eupd=40 + if [[ "${machine}" = "WCOSS2" ]]; then + export npe_eupd=315 + export nth_eupd=14 + elif [[ "${machine}" = "HERA" || "${machine}" = "JET" ]]; then + export nth_eupd=8 + elif [[ ${machine} = "S4" ]]; then + export npe_eupd=160 + export nth_eupd=2 fi - elif [ $CASE = "C192" -o $CASE = "C96" -o $CASE = "C48" ]; then + elif [[ ${CASE} = "C192" || ${CASE} = "C96" || ${CASE} = "C48" ]]; then export npe_eupd=42 export nth_eupd=2 - if [[ "$machine" = "HERA" ]]; then - export npe_eupd=40 - export nth_eupd=40 + if [[ "${machine}" = "HERA" || "${machine}" = "JET" ]]; then + export nth_eupd=4 fi fi - export npe_node_eupd=$(echo "$npe_node_max / $nth_eupd" | bc) + npe_node_eupd=$(echo "${npe_node_max} / 
${nth_eupd}" | bc) + export npe_node_eupd + export is_exclusive=True -elif [ $step = "ecen" ]; then +elif [[ ${step} = "ecen" ]]; then export wtime_ecen="00:10:00" export npe_ecen=80 - export nth_ecen=6 - if [ $CASE = "C384" -o $CASE = "C192" -o $CASE = "C96" -o $CASE = "C48" ]; then export nth_ecen=2; fi - export npe_node_ecen=$(echo "$npe_node_max / $nth_ecen" | bc) - export nth_cycle=$nth_ecen - -elif [ $step = "esfc" ]; then + export nth_ecen=4 + if [[ "${machine}" = "HERA" ]]; then export nth_ecen=6; fi + if [[ ${CASE} = "C384" || ${CASE} = "C192" || ${CASE} = "C96" || ${CASE} = "C48" ]]; then export nth_ecen=2; fi + npe_node_ecen=$(echo "${npe_node_max} / ${nth_ecen}" | bc) + export npe_node_ecen + export nth_cycle=${nth_ecen} + npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) + export npe_node_cycle + export is_exclusive=True + +elif [[ ${step} = "esfc" ]]; then export wtime_esfc="00:06:00" export npe_esfc=80 - export npe_node_esfc=$npe_node_max export nth_esfc=1 - export nth_cycle=$nth_esfc - -elif [ $step = "efcs" ]; then - - if [ $CASE = "C768" ]; then - export wtime_efcs="01:00:00" - else - export wtime_efcs="00:40:00" - fi - export npe_efcs=$(echo "$layout_x * $layout_y * 6" | bc) - export nth_efcs=${nth_fv3:-2} - export npe_node_efcs=$(echo "$npe_node_max / $nth_efcs" | bc) + npe_node_esfc=$(echo "${npe_node_max} / ${nth_esfc}" | bc) + export npe_node_esfc + export nth_cycle=${nth_esfc} + npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) + export npe_node_cycle + export memory_esfc="80GB" -elif [ $step = "epos" ]; then +elif [[ ${step} = "epos" ]]; then export wtime_epos="00:15:00" export npe_epos=80 - export nth_epos=6 - export npe_node_epos=$(echo "$npe_node_max / $nth_epos" | bc) + export nth_epos=4 + if [[ "${machine}" == "HERA" ]]; then + export nth_epos=6 + fi + npe_node_epos=$(echo "${npe_node_max} / ${nth_epos}" | bc) + export npe_node_epos + export is_exclusive=True -elif [ $step = "postsnd" ]; then +elif [[ ${step} = 
"postsnd" ]]; then export wtime_postsnd="02:00:00" export npe_postsnd=40 - export nth_postsnd=1 - export npe_node_postsnd=5 + export nth_postsnd=8 + export npe_node_postsnd=10 export npe_postsndcfp=9 - export npe_node_postsndcfp=3 - if [ $OUTPUT_FILE == "nemsio" ]; then - export npe_postsnd=13 - export npe_node_postsnd=4 + export npe_node_postsndcfp=1 + postsnd_req_cores=$(echo "${npe_node_postsnd} * ${nth_postsnd}" | bc) + if [[ ${postsnd_req_cores} -gt "${npe_node_max}" ]]; then + npe_node_postsnd=$(echo "${npe_node_max} / ${nth_postsnd}" | bc) + export npe_node_postsnd fi - if [[ "$machine" = "HERA" ]]; then export npe_node_postsnd=2; fi + export is_exclusive=True -elif [ $step = "awips" ]; then +elif [[ ${step} = "awips" ]]; then export wtime_awips="03:30:00" - export npe_awips=4 - export npe_node_awips=4 - export nth_awips=2 - -elif [ $step = "gempak" ]; then - - export wtime_gempak="02:00:00" - export npe_gempak=28 - export npe_node_gempak=4 - export nth_gempak=3 + export npe_awips=1 + export npe_node_awips=1 + export nth_awips=1 + export memory_awips="3GB" + +elif [[ ${step} = "gempak" ]]; then + + export wtime_gempak="03:00:00" + export npe_gempak=2 + export npe_gempak_gfs=28 + export npe_node_gempak=2 + export npe_node_gempak_gfs=28 + export nth_gempak=1 + export memory_gempak="4GB" + export memory_gempak_gfs="2GB" else - echo "Invalid step = $step, ABORT!" + echo "Invalid step = ${step}, ABORT!" exit 2 fi diff --git a/parm/config/config.resources.nco.static b/parm/config/config.resources.nco.static new file mode 100644 index 00000000000..e6cd2ef73eb --- /dev/null +++ b/parm/config/config.resources.nco.static @@ -0,0 +1,354 @@ +#! /usr/bin/env bash + +########## config.resources ########## +# Set resource information for job tasks +# e.g. walltime, node, cores per node, memory etc. + +if [ $# -ne 1 ]; then + + echo "Must specify an input task argument to set resource variables!" 
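The hunks above repeatedly convert `export var=$(echo "... | bc")` into a two-step assign-then-export. A minimal sketch of that pattern (values are hypothetical, chosen to match the C768 `eupd` case in this patch):

```shell
#!/usr/bin/env bash
# Assign first, then export on its own line: `export var=$(cmd)` masks the
# command's exit status, which ShellCheck flags as SC2155. The patch keeps
# `bc`, but plain shell integer arithmetic gives the same floor-division
# result without spawning a subprocess.
npe_node_max=128   # hypothetical cores per node (WCOSS2 in this patch)
nth_eupd=14        # hypothetical threads per eupd task

npe_node_eupd=$(( npe_node_max / nth_eupd ))
export npe_node_eupd

echo "${npe_node_eupd}"
```

Integer division floors, so 128 cores at 14 threads per task yields 9 tasks per node, matching what the `bc` pipeline would compute.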
+ echo "argument can be any one of the following:" + echo "anal analcalc analdiag gldas fcst post vrfy metp arch echgres" + echo "eobs ediag eomg eupd ecen esfc efcs epos earc" + echo "waveinit waveprep wavepostsbs wavepostbndpnt wavepostbndpntbll wavepostpnt" + echo "wavegempak waveawipsbulls waveawipsgridded" + echo "postsnd awips gempak" + echo "wafs wafsgrib2 wafsblending wafsgrib20p25 wafsblending0p25 wafsgcip" + exit 1 + +fi + +step=$1 + +echo "BEGIN: config.resources" + +export npe_node_max=128 + +if [ $step = "prep" -o $step = "prepbufr" ]; then + + eval "export wtime_$step='00:45:00'" + eval "export npe_$step=4" + eval "export npe_node_$step=2" + eval "export nth_$step=1" + +elif [ $step = "waveinit" ]; then + + export wtime_waveinit="00:10:00" + export npe_waveinit=11 + export nth_waveinit=1 + export npe_node_waveinit=$npe_waveinit + export NTASKS=$npe_waveinit + export memory_waveinit="2GB" + +elif [ $step = "waveprep" ]; then + + export wtime_waveprep="00:10:00" + export npe_waveprep=5 + export npe_waveprep_gfs=65 + export nth_waveprep=1 + export npe_node_waveprep=$npe_waveprep + export npe_node_waveprep_gfs=$npe_waveprep_gfs + export memory_waveprep="100GB" + export memory_waveprep_gfs="220GB" + export NTASKS=$npe_waveprep + export NTASKS_gfs=$npe_waveprep_gfs + +elif [ $step = "wavepostsbs" ]; then + + export wtime_wavepostsbs="00:20:00" + export wtime_wavepostsbs_gfs="03:00:00" + export npe_wavepostsbs=8 + export nth_wavepostsbs=1 + export npe_node_wavepostsbs=$npe_wavepostsbs + export memory_wavepostsbs="10GB" + export memory_wavepostsbs_gfs="40GB" + export NTASKS=$npe_wavepostsbs + +elif [ $step = "wavepostbndpnt" ]; then + + export wtime_wavepostbndpnt="01:00:00" + export npe_wavepostbndpnt=240 + export nth_wavepostbndpnt=1 + export npe_node_wavepostbndpnt=80 + export NTASKS=$npe_wavepostbndpnt + +elif [ $step = "wavepostbndpntbll" ]; then + + export wtime_wavepostbndpntbll="01:00:00" + export npe_wavepostbndpntbll=448 + export 
nth_wavepostbndpntbll=1 + export npe_node_wavepostbndpntbll=112 + export NTASKS=$npe_wavepostbndpntbll + +elif [ $step = "wavepostpnt" ]; then + + export wtime_wavepostpnt="01:30:00" + export npe_wavepostpnt=200 + export nth_wavepostpnt=1 + export npe_node_wavepostpnt=50 + export NTASKS=$npe_wavepostpnt + +elif [ $step = "wavegempak" ]; then + + export wtime_wavegempak="02:00:00" + export npe_wavegempak=1 + export nth_wavegempak=1 + export npe_node_wavegempak=$npe_wavegempak + export NTASKS=$npe_wavegempak + export memory_wavegempak="10GB" + +elif [ $step = "waveawipsbulls" ]; then + + export wtime_waveawipsbulls="00:20:00" + export npe_waveawipsbulls=1 + export nth_waveawipsbulls=1 + export npe_node_waveawipsbulls=$(echo "$npe_node_max / $nth_waveawipsbulls" | bc) + export NTASKS=$npe_waveawipsbulls + +elif [ $step = "waveawipsgridded" ]; then + + export wtime_waveawipsgridded="02:00:00" + export npe_waveawipsgridded=1 + export nth_waveawipsgridded=1 + export npe_node_waveawipsgridded=$(echo "$npe_node_max / $nth_waveawipsgridded" | bc) + export NTASKS=$npe_waveawipsgridded + export memory_waveawipsgridded_gfs="2GB" + +elif [ $step = "anal" ]; then + + export wtime_anal="00:50:00" + export wtime_anal_gfs="00:40:00" + export npe_anal=780 + export nth_anal=8 + export npe_anal_gfs=825 + export nth_anal_gfs=8 + export npe_node_anal=15 + export nth_cycle=$npe_node_max + export npe_node_cycle=$(echo "$npe_node_max / $nth_cycle" | bc) + +elif [ $step = "analcalc" ]; then + + export wtime_analcalc="00:10:00" + export npe_analcalc=127 + export ntasks=$npe_analcalc + export nth_analcalc=1 + export nth_echgres=4 + export nth_echgres_gfs=12 + export npe_node_analcalc=$npe_node_max + +elif [ $step = "analdiag" ]; then + + export wtime_analdiag="00:10:00" + export npe_analdiag=96 # Should be at least twice npe_ediag + export nth_analdiag=1 + export npe_node_analdiag=$npe_analdiag + export memory_analdiag="48GB" + +elif [ $step = "gldas" ]; then + + export wtime_gldas="00:10:00" 
+ export npe_gldas=112 + export nth_gldas=1 + export npe_node_gldas=$npe_gldas + export npe_gaussian=96 + export nth_gaussian=1 + export npe_node_gaussian=$(echo "$npe_node_max / $nth_gaussian" | bc) + +elif [ $step = "fcst" ]; then + + export wtime_fcst="01:30:00" + export wtime_fcst_gfs="02:30:00" + export npe_fcst=$(echo "$layout_x * $layout_y * 6" | bc) + export npe_fcst_gfs=$(echo "$layout_x_gfs * $layout_y_gfs * 6" | bc) + export nth_fcst=${nth_fv3:-2} + export nth_fcst_gfs=${nth_fv3_gfs:-2} + export npe_node_fcst=32 + export npe_node_fcst_gfs=24 + +elif [ $step = "post" ]; then + + export wtime_post="00:12:00" + export wtime_post_gfs="01:00:00" + export npe_post=126 + export nth_post=1 + export npe_node_post=$npe_post + export npe_node_post_gfs=$npe_post + export npe_node_dwn=$npe_node_max + +elif [ $step = "wafs" ]; then + + export wtime_wafs="00:30:00" + export npe_wafs=1 + export npe_node_wafs=$npe_wafs + export nth_wafs=1 + export memory_wafs="5GB" + +elif [ $step = "wafsgcip" ]; then + + export wtime_wafsgcip="00:30:00" + export npe_wafsgcip=2 + export npe_node_wafsgcip=$npe_wafsgcip + export nth_wafsgcip=1 + export memory_wafsgcip="50GB" + +elif [ $step = "wafsgrib2" ]; then + + export wtime_wafsgrib2="00:30:00" + export npe_wafsgrib2=18 + export npe_node_wafsgrib2=$npe_wafsgrib2 + export nth_wafsgrib2=1 + export memory_wafsgrib2="80GB" + +elif [ $step = "wafsblending" ]; then + + export wtime_wafsblending="00:30:00" + export npe_wafsblending=1 + export npe_node_wafsblending=$npe_wafsblending + export nth_wafsblending=1 + export memory_wafsblending="1GB" + +elif [ $step = "wafsgrib20p25" ]; then + + export wtime_wafsgrib20p25="00:30:00" + export npe_wafsgrib20p25=11 + export npe_node_wafsgrib20p25=$npe_wafsgrib20p25 + export nth_wafsgrib20p25=1 + export memory_wafsgrib20p25="80GB" + +elif [ $step = "wafsblending0p25" ]; then + + export wtime_wafsblending0p25="00:30:00" + export npe_wafsblending0p25=1 + export 
npe_node_wafsblending0p25=$npe_wafsblending0p25 + export nth_wafsblending0p25=1 + export memory_wafsblending0p25="15GB" + +elif [ $step = "vrfy" ]; then + + export wtime_vrfy="03:00:00" + export wtime_vrfy_gfs="06:00:00" + export npe_vrfy=3 + export nth_vrfy=1 + export npe_node_vrfy=1 + export npe_vrfy_gfs=1 + export npe_node_vrfy_gfs=1 + +elif [ $step = "metp" ]; then + + export nth_metp=1 + export wtime_metp="03:00:00" + export npe_metp=4 + export npe_node_metp=4 + export wtime_metp_gfs="06:00:00" + export npe_metp_gfs=4 + export npe_node_metp_gfs=4 + +elif [ $step = "echgres" ]; then + + export wtime_echgres="00:10:00" + export npe_echgres=3 + export nth_echgres=1 + export npe_node_echgres=3 + export memory_echgres="200GB" + +elif [ $step = "arch" -o $step = "earc" -o $step = "getic" ]; then + + eval "export wtime_$step='06:00:00'" + eval "export npe_$step=1" + eval "export npe_node_$step=1" + eval "export nth_$step=1" + eval "export memory_$step=50GB" + +elif [ $step = "eobs" -o $step = "eomg" ]; then + + + export wtime_eobs="00:10:00" + export wtime_eomg="01:00:00" + export npe_eobs=480 + export nth_eobs=3 + export npe_node_eobs=40 + +elif [ $step = "ediag" ]; then + + export wtime_ediag="00:06:00" + export npe_ediag=48 + export nth_ediag=1 + export npe_node_ediag=$npe_node_max + export memory_ediag="28GB" + +elif [ $step = "eupd" ]; then + + export wtime_eupd="00:30:00" + export npe_eupd=315 + export nth_eupd=14 + export npe_node_eupd=$(echo "$npe_node_max / $nth_eupd" | bc) + +elif [ $step = "ecen" ]; then + + export wtime_ecen="00:10:00" + export npe_ecen=80 + export nth_ecen=4 + export npe_node_ecen=$(echo "$npe_node_max / $nth_ecen" | bc) + export nth_cycle=$nth_ecen + export npe_node_cycle=$(echo "$npe_node_max / $nth_cycle" | bc) + +elif [ $step = "esfc" ]; then + + export wtime_esfc="00:06:00" + export npe_esfc=80 + export npe_node_esfc=$npe_esfc + export nth_esfc=1 + export nth_cycle=$nth_esfc + export npe_node_cycle=$(echo "$npe_node_max / 
$nth_cycle" | bc) + export memory_esfc="80GB" + +elif [ $step = "efcs" ]; then + + export wtime_efcs="00:40:00" + export npe_efcs=$(echo "$layout_x * $layout_y * 6" | bc) + export nth_efcs=${nth_fv3:-2} + export npe_node_efcs=$(echo "$npe_node_max / $nth_efcs" | bc) + +elif [ $step = "epos" ]; then + + export wtime_epos="00:15:00" + export npe_epos=80 + export nth_epos=4 + export npe_node_epos=$(echo "$npe_node_max / $nth_epos" | bc) + +elif [ $step = "postsnd" ]; then + + export wtime_postsnd="02:00:00" + export npe_postsnd=40 + export nth_postsnd=8 + export npe_node_postsnd=10 + export npe_postsndcfp=9 + export npe_node_postsndcfp=1 + +elif [ $step = "awips" ]; then + + export wtime_awips="03:30:00" + export npe_awips=1 + export npe_node_awips=1 + export nth_awips=1 + export memory_awips="10GB" + +elif [ $step = "gempak" ]; then + + export wtime_gempak="03:00:00" + export npe_gempak=2 + export npe_gempak_gfs=28 + export npe_node_gempak=2 + export npe_node_gempak_gfs=28 + export nth_gempak=1 + export memory_gempak="20GB" + export memory_gempak_gfs="200GB" + +else + + echo "Invalid step = $step, ABORT!" + exit 2 + +fi + +echo "END: config.resources" diff --git a/parm/config/config.ufs b/parm/config/config.ufs new file mode 100644 index 00000000000..a96ba126e29 --- /dev/null +++ b/parm/config/config.ufs @@ -0,0 +1,370 @@ +#! /usr/bin/env bash + +########## config.ufs ########## +# UFS model resolution specific parameters +# e.g. time-step, processor layout, physics and dynamics parameters +# This config sets default variables for FV3, MOM6, CICE6 for their resolutions +# User can over-ride after sourcing this config file + +echo "BEGIN: config.ufs" + +if [ $# -le 1 ]; then + + echo "Must specify an input resolution argument to set variables!" 
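The `arch`/`earc`/`getic` blocks in both resource files splice the step name into the variable name at runtime via `eval`. A hedged sketch of how that expands, with `declare -x` shown as an eval-free alternative (the step name here is illustrative):

```shell
#!/usr/bin/env bash
# The eval string is built first, so ${step} is substituted before the
# shell re-parses it; wtime_arch, npe_arch, etc. come into existence.
step="arch"   # hypothetical step argument
eval "export wtime_${step}='06:00:00'"
eval "export npe_${step}=1"

# declare -x achieves the same dynamic name without eval's quoting pitfalls.
declare -x "memory_${step}=50GB"

echo "${wtime_arch} ${npe_arch} ${memory_arch}"
```

This is why the files can share one block across three steps: the variable names are not fixed until the step argument is known.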
+ echo "argument can be any one of the following:" + echo "--fv3 C48|C96|C192|C384|C768|C1152|C3072" + echo "--mom6 500|100|025" + echo "--cice6 500|100|025" + echo "--ww3 gnh_10m;aoc_9km;gsh_15m|gwes_30m|mx050|mx025" + + exit 1 + +fi + +# Initialize +skip_mom6=true +skip_cice6=true +skip_ww3=true +skip_mediator=true + +# Loop through named arguments +while [[ $# -gt 0 ]]; do + key="$1" + case "${key}" in + "--fv3") + fv3_res="$2" + ;; + "--mom6") + mom6_res="$2" + skip_mom6=false + ;; + "--cice6") + cice6_res="$2" + skip_cice6=false + ;; + "--ww3") + ww3_res="$2" + skip_ww3=false + ;; + *) # unknown option + echo "FATAL ERROR: Unknown option: ${key}, ABORT!" + exit 1 + ;; + esac + shift + shift +done + +# Mediator is required if any of the non-ATM components are used +if [[ "${skip_mom6}" == "false" ]] || [[ "${skip_cice6}" == "false" ]] || [[ "${skip_ww3}" == "false" ]]; then + skip_mediator=false +fi + +case "${machine}" in + "WCOSS2") + npe_node_max=128 + ;; + "HERA" | "ORION") + npe_node_max=40 + ;; + "JET") + case "${PARTITION_BATCH}" in + "xjet") + npe_node_max=24 + ;; + "vjet" | "sjet") + npe_node_max=16 + ;; + "kjet") + npe_node_max=40 + ;; + *) + echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" + exit 1 + ;; + esac + ;; + "S4") + case "${PARTITION_BATCH}" in + "s4") + npe_node_max=32 + ;; + "ivy") + npe_node_max=20 + ;; + *) + echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" 
+ exit 1 + ;; + esac + ;; +esac +export npe_node_max + +# (Standard) Model resolution dependent variables +case "${fv3_res}" in + "C48") + export DELTIM=1200 + export layout_x=1 + export layout_y=1 + export layout_x_gfs=1 + export layout_y_gfs=1 + export nthreads_fv3=1 + export nthreads_fv3_gfs=1 + export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=1 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=1 + export WRITE_GROUP_GFS=1 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=1 + ;; + "C96") + export DELTIM=600 + export layout_x=2 + export layout_y=2 + export layout_x_gfs=2 + export layout_y_gfs=2 + export nthreads_fv3=1 + export nthreads_fv3_gfs=1 + export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=1 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=1 + export WRITE_GROUP_GFS=1 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=1 + ;; + "C192") + export DELTIM=450 + export layout_x=4 + export layout_y=6 + export layout_x_gfs=4 + export layout_y_gfs=6 + export nthreads_fv3=1 + export nthreads_fv3_gfs=2 + export cdmbgwd="0.23,1.5,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=1 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10 + export WRITE_GROUP_GFS=2 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=5 + ;; + "C384") + export DELTIM=300 + export layout_x=6 + export layout_y=8 + export layout_x_gfs=8 + export layout_y_gfs=8 + export nthreads_fv3=1 + export nthreads_fv3_gfs=2 + export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=2 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=8 + export WRITE_GROUP_GFS=2 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=4 + ;; + "C768") + export DELTIM=150 + export layout_x=8 + export layout_y=12 + export layout_x_gfs=12 + export layout_y_gfs=16 + export nthreads_fv3=4 + export nthreads_fv3_gfs=4 + export cdmbgwd="4.0,0.15,1.0,1.0" # 
mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=2 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10 + export WRITE_GROUP_GFS=4 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10 + ;; + "C1152") + export DELTIM=120 + export layout_x=8 + export layout_y=16 + export layout_x_gfs=8 + export layout_y_gfs=16 + export nthreads_fv3=4 + export nthreads_fv3_gfs=4 + export cdmbgwd="4.0,0.10,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=4 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10 # TODO: refine these numbers when a case is available + export WRITE_GROUP_GFS=4 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10 # TODO: refine these numbers when a case is available + ;; + "C3072") + export DELTIM=90 + export layout_x=16 + export layout_y=32 + export layout_x_gfs=16 + export layout_y_gfs=32 + export nthreads_fv3=4 + export nthreads_fv3_gfs=4 + export cdmbgwd="4.0,0.05,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling + export WRITE_GROUP=4 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10 # TODO: refine these numbers when a case is available + export WRITE_GROUP_GFS=4 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10 # TODO: refine these numbers when a case is available + ;; + *) + echo "FATAL ERROR: Unsupported FV3 resolution = ${fv3_res}, ABORT!" 
+ exit 1 + ;; +esac + +(( WRTTASK_PER_GROUP_PER_THREAD = WRTTASK_PER_GROUP_PER_THREAD_PER_TILE * 6 )) +(( WRTTASK_PER_GROUP_PER_THREAD_GFS = WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS * 6 )) +export WRTTASK_PER_GROUP_PER_THREAD +export WRTTASK_PER_GROUP_PER_THREAD_GFS + +(( ntasks_fv3 = layout_x * layout_y * 6 )) +(( ntasks_fv3_gfs = layout_x_gfs * layout_y_gfs * 6 )) +export ntasks_fv3 +export ntasks_fv3_gfs + +(( ntasks_quilt = WRITE_GROUP * WRTTASK_PER_GROUP_PER_THREAD )) +(( ntasks_quilt_gfs = WRITE_GROUP_GFS * WRTTASK_PER_GROUP_PER_THREAD_GFS )) +export ntasks_quilt +export ntasks_quilt_gfs + +# Determine whether to use parallel NetCDF based on resolution +case ${fv3_res} in + "C48" | "C96" | "C192" | "C384") + OUTPUT_FILETYPE_ATM="netcdf" + OUTPUT_FILETYPE_SFC="netcdf" + ;; + "C768" | "C1152" | "C3072") + OUTPUT_FILETYPE_ATM="netcdf_parallel" + OUTPUT_FILETYPE_SFC="netcdf_parallel" + ;; +esac +export OUTPUT_FILETYPE_ATM OUTPUT_FILETYPE_SFC + +# Mediator specific settings +if [[ "${skip_mediator}" == "false" ]]; then + export nthreads_mediator=${nthreads_fv3} # Use same threads as FV3 +fi + +# MOM6 specific settings +if [[ "${skip_mom6}" == "false" ]]; then + nthreads_mom6=1 + case "${mom6_res}" in + "500") + ntasks_mom6=8 + OCNTIM=3600 + NX_GLB=72 + NY_GLB=35 + DT_DYNAM_MOM6='3600' + DT_THERM_MOM6='3600' + FRUNOFF="" + CHLCLIM="seawifs_1998-2006_smoothed_2X.nc" + MOM6_RESTART_SETTING='r' + MOM6_RIVER_RUNOFF='False' + ;; + "100") + ntasks_mom6=20 + OCNTIM=3600 + NX_GLB=360 + NY_GLB=320 + DT_DYNAM_MOM6='1800' + DT_THERM_MOM6='3600' + FRUNOFF="" + CHLCLIM="seawifs_1998-2006_smoothed_2X.nc" + MOM6_RESTART_SETTING='n' + MOM6_RIVER_RUNOFF='False' + ;; + "50") + ntasks_mom6=60 + OCNTIM=3600 + NX_GLB=720 + NY_GLB=576 + DT_DYNAM_MOM6='1800' + DT_THERM_MOM6='3600' + FRUNOFF="runoff.daitren.clim.${NX_GLB}x${NY_GLB}.v20180328.nc" + CHLCLIM="seawifs-clim-1997-2010.${NX_GLB}x${NY_GLB}.v20180328.nc" + MOM6_RESTART_SETTING='n' + MOM6_RIVER_RUNOFF='True' + ;; + "025") + 
ntasks_mom6=220 + OCNTIM=1800 + NX_GLB=1440 + NY_GLB=1080 + DT_DYNAM_MOM6='900' + DT_THERM_MOM6='1800' + FRUNOFF="runoff.daitren.clim.${NX_GLB}x${NY_GLB}.v20180328.nc" + CHLCLIM="seawifs-clim-1997-2010.${NX_GLB}x${NY_GLB}.v20180328.nc" + MOM6_RIVER_RUNOFF='True' + MOM6_RESTART_SETTING="r" + ;; + *) + echo "FATAL ERROR: Unsupported MOM6 resolution = ${mom6_res}, ABORT!" + exit 1 + ;; + esac + export nthreads_mom6 ntasks_mom6 + export OCNTIM + export NX_GLB NY_GLB + export DT_DYNAM_MOM6 DT_THERM_MOM6 + export FRUNOFF + export CHLCLIM + export MOM6_RIVER_RUNOFF + export MOM6_RESTART_SETTING +fi + +# CICE6 specific settings +if [[ "${skip_cice6}" == "false" ]]; then + # Ensure we sourced the MOM6 section + if [[ "${skip_mom6}" == "true" ]]; then + echo "FATAL ERROR: CICE6 cannot be configured without MOM6, ABORT!" + exit 1 + fi + nthreads_cice6=${nthreads_mom6} # CICE6 needs to run on same threads as MOM6 + case "${cice6_res}" in + "500") + ntasks_cice6=4 + cice6_processor_shape="slenderX1" + ;; + "100") + ntasks_cice6=10 + cice6_processor_shape="slenderX2" + ;; + "050") + ntasks_cice6=30 + cice6_processor_shape="slenderX2" + ;; + "025") + ntasks_cice6=120 + cice6_processor_shape="slenderX2" + ;; + *) + echo "FATAL ERROR: Unsupported CICE6 resolution = ${cice6_res}, ABORT!" + exit 1 + ;; + esac + # NX_GLB and NY_GLB are set in the MOM6 section above + # CICE6 runs on the same domain decomposition as MOM6 + export nthreads_cice6 ntasks_cice6 + export cice6_processor_shape +fi + +# WW3 specific settings +if [[ "${skip_ww3}" == "false" ]]; then + nthreads_ww3=2 + case "${ww3_res}" in + "gnh_10m;aoc_9km;gsh_15m") + ntasks_ww3=140 + ;; + "gwes_30m") + ntasks_ww3=100 + ;; + "mx050") + ntasks_ww3=240 + ;; + "mx025") + ntasks_ww3=80 + ;; + *) + echo "FATAL ERROR: Unsupported WW3 resolution = ${ww3_res}, ABORT!" 
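The quilting arithmetic added to `config.ufs` above multiplies the per-tile write-task count by 6 before sizing the groups; per the commit message, restart quilting requires write-group sizes divisible by the number of tiles. A sketch using the C768 values from this patch:

```shell
#!/usr/bin/env bash
# Write-grid ("quilt") task computation: each write group must cover all
# six cubed-sphere tiles, so the per-tile count is scaled by 6 first.
WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10   # C768 value in this patch
WRITE_GROUP=2                              # C768 value in this patch

(( WRTTASK_PER_GROUP_PER_THREAD = WRTTASK_PER_GROUP_PER_THREAD_PER_TILE * 6 ))
(( ntasks_quilt = WRITE_GROUP * WRTTASK_PER_GROUP_PER_THREAD ))
export WRTTASK_PER_GROUP_PER_THREAD ntasks_quilt

echo "${ntasks_quilt}"   # 2 groups x 60 tasks per group
```

So C768 reserves 120 write tasks on top of the `layout_x * layout_y * 6` compute tasks.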
+ exit 1 + ;; + esac + export ntasks_ww3 nthreads_ww3 +fi + +echo "END: config.ufs" diff --git a/parm/config/config.vrfy b/parm/config/config.vrfy old mode 100755 new mode 100644 index f2249e9cfeb..c277e8e963a --- a/parm/config/config.vrfy +++ b/parm/config/config.vrfy @@ -6,14 +6,10 @@ echo "BEGIN: config.vrfy" # Get task specific resources -. $EXPDIR/config.resources vrfy +. "${EXPDIR}/config.resources" vrfy -export VDUMP="gfs" # Verifying dump -export CDUMPFCST="gdas" # Fit-to-obs with GDAS/GFS prepbufr export CDFNL="gdas" # Scores verification against GDAS/GFS analysis - export MKPGB4PRCP="YES" # Make 0.25-deg pgb files in ARCDIR for precip verification -export VRFYFITS="YES" # Fit to observations export VRFYRAD="YES" # Radiance data assimilation monitoring export VRFYOZN="YES" # Ozone data assimilation monitoring export VRFYMINMON="YES" # GSI minimization monitoring @@ -22,73 +18,44 @@ export VRFYGENESIS="YES" # Cyclone genesis verification export VRFYFSU="NO" # Cyclone genesis verification (FSU) export RUNMOS="NO" # whether to run entire MOS package -#------------------------------------------------- -# Fit to Observations -#------------------------------------------------- - -if [ $VRFYFITS = "YES" ]; then - - export fit_ver="newm.1.3" - export fitdir="$BASE_GIT/verif/global/Fit2Obs/${fit_ver}/batrun" - export PRVT=$HOMEgfs/fix/fix_gsi/prepobs_errtable.global - export HYBLEVS=$HOMEgfs/fix/fix_am/global_hyblev.l${LEVS}.txt - export CUE2RUN=$QUEUE - - export VBACKUP_FITS=24 - - export CONVNETC="NO" - if [ ${netcdf_diag:-".false."} = ".true." 
]; then - export CONVNETC="YES" - fi - - if [ $machine = "HERA" ]; then - export PREPQFITSH="$fitdir/subfits_hera_slurm" - elif [ $machine = "ORION" ]; then - export PREPQFITSH="$fitdir/subfits_orion_netcdf" - else - echo "Fit2Obs NOT supported on this machine" - fi - -fi - - #---------------------------------------------------------- # Minimization, Radiance and Ozone Monitoring #---------------------------------------------------------- -if [ $VRFYRAD = "YES" -o $VRFYMINMON = "YES" -o $VRFYOZN = "YES" ]; then +if [[ ${VRFYRAD} = "YES" || ${VRFYMINMON} = "YES" || ${VRFYOZN} = "YES" ]]; then export envir="para" + export COM_IN=${ROTDIR} # Radiance Monitoring - if [[ "$VRFYRAD" == "YES" && "$CDUMP" == "$CDFNL" ]] ; then + if [[ "${VRFYRAD}" == "YES" && "${RUN}" == "${CDFNL}" ]] ; then - export RADMON_SUFFIX=$PSLOT - export TANKverf="$NOSCRUB/monitor/radmon" - export VRFYRADSH="$HOMEgfs/jobs/JGDAS_ATMOS_VERFRAD" + export RADMON_SUFFIX=${PSLOT} + export TANKverf="${NOSCRUB}/monitor/radmon" + export VRFYRADSH="${HOMEgfs}/jobs/JGDAS_ATMOS_VERFRAD" fi # Minimization Monitoring - if [[ "$VRFYMINMON" = "YES" ]] ; then - - export MINMON_SUFFIX=$PSLOT - export M_TANKverf="$NOSCRUB/monitor/minmon" - if [[ "$CDUMP" = "gdas" ]] ; then - export VRFYMINSH="$HOMEgfs/jobs/JGDAS_ATMOS_VMINMON" - elif [[ "$CDUMP" = "gfs" ]] ; then - export VRFYMINSH="$HOMEgfs/jobs/JGFS_ATMOS_VMINMON" + if [[ "${VRFYMINMON}" = "YES" ]] ; then + + export MINMON_SUFFIX=${PSLOT} + export M_TANKverf="${NOSCRUB}/monitor/minmon" + if [[ "${RUN}" = "gdas" ]] ; then + export VRFYMINSH="${HOMEgfs}/jobs/JGDAS_ATMOS_VMINMON" + elif [[ "${RUN}" = "gfs" ]] ; then + export VRFYMINSH="${HOMEgfs}/jobs/JGFS_ATMOS_VMINMON" fi fi # Ozone Monitoring - if [[ "$VRFYOZN" == "YES" && "$CDUMP" == "$CDFNL" ]] ; then + if [[ "${VRFYOZN}" == "YES" && "${RUN}" == "${CDFNL}" ]] ; then - export HOMEgfs_ozn="$HOMEgfs" - export OZNMON_SUFFIX=$PSLOT - export TANKverf_ozn="$NOSCRUB/monitor/oznmon" - export 
VRFYOZNSH="$HOMEgfs/jobs/JGDAS_ATMOS_VERFOZN" + export HOMEgfs_ozn="${HOMEgfs}" + export OZNMON_SUFFIX=${PSLOT} + export TANKverf_ozn="${NOSCRUB}/monitor/oznmon" + export VRFYOZNSH="${HOMEgfs}/jobs/JGDAS_ATMOS_VERFOZN" fi @@ -99,53 +66,45 @@ fi # Cyclone genesis and cyclone track verification #------------------------------------------------- -export ens_tracker_ver=v1.1.15.4 -export HOMEens_tracker=$BASE_GIT/TC_tracker/TC_tracker.${ens_tracker_ver} -## JKH -if [ $machine = "JET" ] ; then - export HOMEens_tracker=$HOMEgfs/sorc/ens_tracker.${ens_tracker_ver} -fi +export ens_tracker_ver=feature-GFSv17_com_reorg # TODO - temporary ahead of new tag/version +export HOMEens_tracker=$BASE_GIT/TC_tracker/${ens_tracker_ver} -if [ "$VRFYTRAK" = "YES" ]; then +if [[ "${VRFYTRAK}" = "YES" ]]; then - export TRACKERSH="$HOMEgfs/jobs/JGFS_ATMOS_CYCLONE_TRACKER" - if [ "$CDUMP" = "gdas" ]; then + export TRACKERSH="${HOMEgfs}/jobs/JGFS_ATMOS_CYCLONE_TRACKER" + COMINsyn=${COMINsyn:-$(compath.py "${envir}"/com/gfs/"${gfs_ver}")/syndat} + export COMINsyn + if [[ "${RUN}" = "gdas" ]]; then export FHOUT_CYCLONE=3 - export FHMAX_CYCLONE=$FHMAX + export FHMAX_CYCLONE=${FHMAX} else export FHOUT_CYCLONE=6 - export FHMAX_CYCLONE=$(( FHMAX_GFS<240 ? FHMAX_GFS : 240 )) - fi - ## JKH - if [ $machine = "JET" ]; then - export COMINsyn=${COMINsyn:-/mnt/lfs4/HFIP/hwrf-data/hwrf-input/SYNDAT-PLUS} - else - export COMINsyn=${COMINsyn:-${COMROOT}/gfs/prod/syndat} + FHMAX_CYCLONE=$(( FHMAX_GFS<240 ? 
FHMAX_GFS : 240 )) + export FHMAX_CYCLONE fi fi -if [[ "$VRFYGENESIS" == "YES" && "$CDUMP" == "gfs" ]]; then +if [[ "${VRFYGENESIS}" == "YES" && "${RUN}" == "gfs" ]]; then - export GENESISSH="$HOMEgfs/jobs/JGFS_ATMOS_CYCLONE_GENESIS" + export GENESISSH="${HOMEgfs}/jobs/JGFS_ATMOS_CYCLONE_GENESIS" fi -if [[ "$VRFYFSU" == "YES" && "$CDUMP" == "gfs" ]]; then +if [[ "${VRFYFSU}" == "YES" && "${RUN}" == "gfs" ]]; then - export GENESISFSU="$HOMEgfs/jobs/JGFS_ATMOS_FSU_GENESIS" + export GENESISFSU="${HOMEgfs}/jobs/JGFS_ATMOS_FSU_GENESIS" fi -if [[ "$RUNMOS" == "YES" && "$CDUMP" == "gfs" ]]; then +if [[ "${RUNMOS}" == "YES" && "${RUN}" == "gfs" ]]; then - if [ $machine = "HERA" ] ; then - export RUNGFSMOSSH="$HOMEgfs/scripts/run_gfsmos_master.sh.hera" + if [[ "${machine}" = "HERA" ]] ; then + export RUNGFSMOSSH="${HOMEgfs}/scripts/run_gfsmos_master.sh.hera" else - echo "WARNING: MOS package is not enabled on $machine!" + echo "WARNING: MOS package is not enabled on ${machine}!" export RUNMOS="NO" export RUNGFSMOSSH="" fi fi - echo "END: config.vrfy" diff --git a/parm/config/config.wafs b/parm/config/config.wafs old mode 100755 new mode 100644 diff --git a/parm/config/config.wafsblending b/parm/config/config.wafsblending old mode 100755 new mode 100644 diff --git a/parm/config/config.wafsblending0p25 b/parm/config/config.wafsblending0p25 old mode 100755 new mode 100644 diff --git a/parm/config/config.wafsgcip b/parm/config/config.wafsgcip old mode 100755 new mode 100644 diff --git a/parm/config/config.wafsgrib2 b/parm/config/config.wafsgrib2 old mode 100755 new mode 100644 diff --git a/parm/config/config.wafsgrib20p25 b/parm/config/config.wafsgrib20p25 old mode 100755 new mode 100644 diff --git a/parm/config/config.wave b/parm/config/config.wave old mode 100755 new mode 100644 index f69adda3ec4..658c4b40ae4 --- a/parm/config/config.wave +++ b/parm/config/config.wave @@ -11,7 +11,7 @@ echo "BEGIN: config.wave" export wave_sys_ver=v1.0.0 export EXECwave="$HOMEgfs/exec" 
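The hunk above caps the tracker forecast length with bash's C-style arithmetic ternary, and exports the variable on a separate line from the assignment. A minimal, self-contained sketch of that pattern (the `FHMAX_GFS` value here is illustrative only, not taken from any experiment config):

```shell
#!/usr/bin/env bash
set -eu

# Illustrative value; in the workflow FHMAX_GFS comes from the config files.
FHMAX_GFS=384

# Bash arithmetic expansion supports a C-style ternary, so the 240-hour cap
# for the GFS cyclone tracker reads as one expression.
FHMAX_CYCLONE=$(( FHMAX_GFS < 240 ? FHMAX_GFS : 240 ))

# Assigning first and exporting separately (instead of `export VAR=$((...))`)
# matches the style adopted in the hunk above.
export FHMAX_CYCLONE

echo "${FHMAX_CYCLONE}"   # prints 240
```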
-export FIXwave="$HOMEgfs/fix/fix_wave" +export FIXwave="$HOMEgfs/fix/wave" export PARMwave="$HOMEgfs/parm/wave" export USHwave="$HOMEgfs/ush" @@ -19,7 +19,7 @@ export USHwave="$HOMEgfs/ush" # Some others are also used across the workflow in wave component scripts # General runtime labels -export CDUMPwave="${CDUMP}wave" +export CDUMPwave="${RUN}wave" # In GFS/GDAS, restart files are generated/read from gdas runs export CDUMPRSTwave="gdas" @@ -58,6 +58,8 @@ export FHMIN_WAV=${FHMIN_WAV:-0} export FHOUT_WAV=${FHOUT_WAV:-3} export FHMAX_HF_WAV=${FHMAX_HF_WAV:-120} export FHOUT_HF_WAV=${FHOUT_HF_WAV:-1} +export FHMAX_WAV_IBP=180 +if (( FHMAX_WAV < FHMAX_WAV_IBP )); then export FHMAX_WAV_IBP=${FHMAX_GFS} ; fi # gridded and point output rate export DTFLD_WAV=$(expr $FHOUT_HF_WAV \* 3600) diff --git a/parm/config/config.waveawipsbulls b/parm/config/config.waveawipsbulls old mode 100755 new mode 100644 index e3748e9cd11..fd21869355b --- a/parm/config/config.waveawipsbulls +++ b/parm/config/config.waveawipsbulls @@ -10,8 +10,5 @@ echo "BEGIN: config.waveawipsbulls" export DBNROOT=/dev/null export SENDCOM="YES" -export COMPONENT=${COMPONENT:-wave} -export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" echo "END: config.waveawipsbulls" diff --git a/parm/config/config.waveawipsgridded b/parm/config/config.waveawipsgridded old mode 100755 new mode 100644 index e84352558ed..6896ec8bd20 --- a/parm/config/config.waveawipsgridded +++ b/parm/config/config.waveawipsgridded @@ -10,8 +10,5 @@ echo "BEGIN: config.waveawipsgridded" export DBNROOT=/dev/null export SENDCOM="YES" -export COMPONENT=${COMPONENT:-wave} -export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" echo "END: config.waveawipsgridded" diff --git a/parm/config/config.wavegempak b/parm/config/config.wavegempak old mode 100755 new mode 100644 index 66af59f2a46..da76c364cea --- a/parm/config/config.wavegempak +++ 
b/parm/config/config.wavegempak @@ -9,8 +9,5 @@ echo "BEGIN: config.wavegempak" . $EXPDIR/config.resources wavegempak export SENDCOM="YES" -export COMPONENT=${COMPONENT:-wave} -export COMIN="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT" -export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc/$COMPONENT/gempak" echo "END: config.wavegempak" diff --git a/parm/config/config.waveinit b/parm/config/config.waveinit old mode 100755 new mode 100644 diff --git a/parm/config/config.wavepostbndpnt b/parm/config/config.wavepostbndpnt old mode 100755 new mode 100644 index eaa1626e628..dfeddc79b27 --- a/parm/config/config.wavepostbndpnt +++ b/parm/config/config.wavepostbndpnt @@ -8,7 +8,4 @@ echo "BEGIN: config.wavepostbndpnt" # Get task specific resources . $EXPDIR/config.resources wavepostbndpnt -export FHMAX_WAV_IBP=180 -if [[ "$FHMAX_GFS" -lt "$FHMAX_WAV_IBP" ]] ; then export FHMAX_WAV_IBP=$FHMAX_GFS ; fi - echo "END: config.wavepostbndpnt" diff --git a/parm/config/config.wavepostbndpntbll b/parm/config/config.wavepostbndpntbll old mode 100755 new mode 100644 diff --git a/parm/config/config.wavepostpnt b/parm/config/config.wavepostpnt old mode 100755 new mode 100644 diff --git a/parm/config/config.wavepostsbs b/parm/config/config.wavepostsbs old mode 100755 new mode 100644 diff --git a/parm/config/config.waveprep b/parm/config/config.waveprep old mode 100755 new mode 100644 diff --git a/parm/config/yaml/defaults.yaml b/parm/config/yaml/defaults.yaml new file mode 100644 index 00000000000..4c3817ef01d --- /dev/null +++ b/parm/config/yaml/defaults.yaml @@ -0,0 +1,19 @@ +aeroanl: + IO_LAYOUT_X: 1 + IO_LAYOUT_Y: 1 + +landanl: + IO_LAYOUT_X: 1 + IO_LAYOUT_Y: 1 + +ocnanal: + SOCA_INPUT_FIX_DIR: '/scratch2/NCEPDEV/ocean/Guillaume.Vernieres/data/static/72x35x25' + CASE_ANL: 'C48' + SOCA_OBS_LIST: '' + COMIN_OBS: '/scratch2/NCEPDEV/marineda/r2d2' + SABER_BLOCKS_YAML: '' + SOCA_NINNER: 50 + R2D2_OBS_SRC: 'gdas_marine' + R2D2_OBS_DUMP: 's2s_v1' + NICAS_RESOL: 1 + NICAS_GRID_SIZE: 15000 diff --git 
a/parm/mom6/MOM_input_template_025 b/parm/mom6/MOM_input_template_025 index 3abbf2191bd..6c0779f426b 100644 --- a/parm/mom6/MOM_input_template_025 +++ b/parm/mom6/MOM_input_template_025 @@ -6,7 +6,6 @@ ! This MOM_input file typically contains only the non-default values that are needed to reproduce this example. ! A full list of parameters for this example can be found in the corresponding MOM_parameter_doc.all file ! which is generated by the model at run-time. - ! === module MOM_domains === TRIPOLAR_N = True ! [Boolean] default = False ! Use tripolar connectivity at the northern edge of the domain. With @@ -406,7 +405,7 @@ GILL_EQUATORIAL_LD = True ! [Boolean] default = False ! radius, otherwise, if false, use Pedlosky's definition. These definitions ! differ by a factor of 2 in front of the beta term in the denominator. Gill's ! is the more appropriate definition. -INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True +INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True ! If true, use a more robust estimate of the first mode wave speed as the ! starting point for iterations. @@ -510,6 +509,9 @@ USE_LAND_MASK_FOR_HVISC = False ! [Boolean] default = False HMIX_FIXED = 0.5 ! [m] ! The prescribed depth over which the near-surface viscosity and diffusivity are ! elevated when the bulk mixed layer is not used. +KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 + ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. + ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 ! The maximum velocity allowed before the velocity components are truncated. @@ -731,7 +733,7 @@ NSTAR = 0.06 ! [nondim] default = 0.2 ! The portion of the buoyant potential energy imparted by surface fluxes that is ! available to drive entrainment at the base of mixed layer when that energy is ! positive. -EPBL_MLD_BISECTION = True ! [Boolean] default = False +EPBL_MLD_BISECTION = True ! 
[Boolean] default = False ! If true, use bisection with the iterative determination of the self-consistent ! mixed layer depth. Otherwise use the false position after a maximum and ! minimum bound have been evaluated and the returned value or bisection before @@ -833,6 +835,30 @@ ENERGYSAVEDAYS = 1.00 ! [days] default = 1.0 ! === module ocean_model_init === +! === module MOM_oda_incupd === +ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False + ! If true, oda incremental updates will be applied + ! everywhere in the domain. +ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. + +ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" + ! The name of the potential temperature inc. variable in + ! ODA_INCUPD_FILE. +ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" + ! The name of the salinity inc. variable in + ! ODA_INCUPD_FILE. +ODA_THK_VAR = "h" ! default = "h" + ! The name of the int. depth inc. variable in + ! ODA_INCUPD_FILE. +ODA_INCUPD_UV = false ! +!ODA_UINC_VAR = "u" ! default = "u_inc" + ! The name of the zonal vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +!ODA_VINC_VAR = "v" ! default = "v_inc" + ! The name of the meridional vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 + ! === module MOM_surface_forcing === OCEAN_SURFACE_STAGGER = "A" ! default = "C" ! A case-insensitive character string to indicate the @@ -868,8 +894,8 @@ LIQUID_RUNOFF_FROM_DATA = @[MOM6_RIVER_RUNOFF] ! [Boolean] default = False ! the data_table using the component name 'OCN'. ! === module ocean_stochastics === DO_SPPT = @[DO_OCN_SPPT] ! [Boolean] default = False - ! If true perturb the diabatic tendencies in MOM_diabadic_driver -PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False + ! If true perturb the diabatic tendencies in MOM_diabatic_driver +PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False ! If true perturb the KE dissipation and destruction in MOM_energetic_PBL ! 
=== module MOM_restart === RESTART_CHECKSUMS_REQUIRED = False diff --git a/parm/mom6/MOM_input_template_050 b/parm/mom6/MOM_input_template_050 index 4e703a4bfdb..4c39198c026 100644 --- a/parm/mom6/MOM_input_template_050 +++ b/parm/mom6/MOM_input_template_050 @@ -6,7 +6,6 @@ ! This MOM_input file typically contains only the non-default values that are needed to reproduce this example. ! A full list of parameters for this example can be found in the corresponding MOM_parameter_doc.all file ! which is generated by the model at run-time. - ! === module MOM_domains === TRIPOLAR_N = True ! [Boolean] default = False ! Use tripolar connectivity at the northern edge of the domain. With @@ -419,7 +418,7 @@ GILL_EQUATORIAL_LD = True ! [Boolean] default = False ! radius, otherwise, if false, use Pedlosky's definition. These definitions ! differ by a factor of 2 in front of the beta term in the denominator. Gill's ! is the more appropriate definition. -INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True +INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True ! If true, use a more robust estimate of the first mode wave speed as the ! starting point for iterations. @@ -540,6 +539,9 @@ USE_LAND_MASK_FOR_HVISC = False ! [Boolean] default = False HMIX_FIXED = 0.5 ! [m] ! The prescribed depth over which the near-surface viscosity and diffusivity are ! elevated when the bulk mixed layer is not used. +KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 + ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. + ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 ! The maximum velocity allowed before the velocity components are truncated. @@ -757,7 +759,7 @@ MSTAR2_COEF1 = 0.29 ! [nondim] default = 0.3 MSTAR2_COEF2 = 0.152 ! [nondim] default = 0.085 ! Coefficient in computing mstar when only rotation limits the total mixing ! (used if EPBL_MSTAR_SCHEME = OM4) -EPBL_MLD_BISECTION = True ! 
[Boolean] default = False +EPBL_MLD_BISECTION = True ! [Boolean] default = False ! If true, use bisection with the iterative determination of the self-consistent ! mixed layer depth. Otherwise use the false position after a maximum and ! minimum bound have been evaluated and the returned value or bisection before @@ -858,8 +860,32 @@ USE_NEUTRAL_DIFFUSION = True ! [Boolean] default = False ! If true, enables the neutral diffusion module. ! === module ocean_model_init === - RESTART_CHECKSUMS_REQUIRED = False + +! === module MOM_oda_incupd === +ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False + ! If true, oda incremental updates will be applied + ! everywhere in the domain. +ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. + +ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" + ! The name of the potential temperature inc. variable in + ! ODA_INCUPD_FILE. +ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" + ! The name of the salinity inc. variable in + ! ODA_INCUPD_FILE. +ODA_THK_VAR = "h" ! default = "h" + ! The name of the int. depth inc. variable in + ! ODA_INCUPD_FILE. +ODA_INCUPD_UV = false ! +!ODA_UINC_VAR = "u" ! default = "u_inc" + ! The name of the zonal vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +!ODA_VINC_VAR = "v" ! default = "v_inc" + ! The name of the meridional vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 + ! === module MOM_lateral_boundary_diffusion === ! This module implements lateral diffusion of tracers near boundaries @@ -913,8 +939,8 @@ LIQUID_RUNOFF_FROM_DATA = @[MOM6_RIVER_RUNOFF] ! [Boolean] default = False ! the data_table using the component name 'OCN'. ! === module ocean_stochastics === DO_SPPT = @[DO_OCN_SPPT] ! [Boolean] default = False - ! If true perturb the diabatic tendencies in MOM_diabadic_driver -PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False + ! 
If true perturb the diabatic tendencies in MOM_diabatic_driver +PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False ! If true perturb the KE dissipation and destruction in MOM_energetic_PBL ! === module MOM_restart === diff --git a/parm/mom6/MOM_input_template_100 b/parm/mom6/MOM_input_template_100 index 1aedf9e7f19..8b616ad27fd 100644 --- a/parm/mom6/MOM_input_template_100 +++ b/parm/mom6/MOM_input_template_100 @@ -1,5 +1,4 @@ ! This file was written by the model and records all non-layout or debugging parameters used at run-time. - ! === module MOM === ! === module MOM_unit_scaling === @@ -76,7 +75,7 @@ SAVE_INITIAL_CONDS = False ! [Boolean] default = False ! If true, write the initial conditions to a file given by IC_OUTPUT_FILE. ! === module MOM_oda_incupd === -ODA_INCUPD = @[MOM_IAU] ! [Boolean] default = False +ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False ! If true, oda incremental updates will be applied ! everywhere in the domain. ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. @@ -97,7 +96,7 @@ ODA_UINC_VAR = "u_inc" ! default = "u_inc" ODA_VINC_VAR = "v_inc" ! default = "v_inc" ! The name of the meridional vel. inc. variable in ! ODA_INCUPD_UV_FILE. -ODA_INCUPD_NHOURS = @[MOM_IAU_HRS] ! default=3.0 +ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 ! Number of hours for full update (0=direct insertion). ! === module MOM_domains === @@ -430,7 +429,7 @@ VISC_RES_FN_POWER = 2 ! [nondim] default = 100 ! used, although even integers are more efficient to calculate. Setting this ! greater than 100 results in a step-function being used. This function affects ! lateral viscosity, Kh, and not KhTh. -INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True +INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True ! If true, use a more robust estimate of the first mode wave speed as the ! starting point for iterations. @@ -531,6 +530,9 @@ USE_KH_BG_2D = True ! 
[Boolean] default = False HMIX_FIXED = 0.5 ! [m] ! The prescribed depth over which the near-surface viscosity and diffusivity are ! elevated when the bulk mixed layer is not used. +KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 + ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. + ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 ! The maximum velocity allowed before the velocity components are truncated. @@ -829,6 +831,28 @@ ENERGYSAVEDAYS = 0.25 ! [days] default = 1.0 ! other globally summed diagnostics. ! === module ocean_model_init === +ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False + ! If true, oda incremental updates will be applied + ! everywhere in the domain. +ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. + +ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" + ! The name of the potential temperature inc. variable in + ! ODA_INCUPD_FILE. +ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" + ! The name of the salinity inc. variable in + ! ODA_INCUPD_FILE. +ODA_THK_VAR = "h" ! default = "h" + ! The name of the int. depth inc. variable in + ! ODA_INCUPD_FILE. +ODA_INCUPD_UV = false ! +!ODA_UINC_VAR = "u" ! default = "u_inc" + ! The name of the zonal vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +!ODA_VINC_VAR = "v" ! default = "v_inc" + ! The name of the meridional vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 ! === module MOM_surface_forcing === OCEAN_SURFACE_STAGGER = "A" ! default = "C" @@ -856,7 +880,7 @@ FIX_USTAR_GUSTLESS_BUG = False ! [Boolean] default = True ! velocity ! === module ocean_stochastics === DO_SPPT = @[DO_OCN_SPPT] ! [Boolean] default = False - ! If true perturb the diabatic tendencies in MOM_diabadic_driver + ! If true perturb the diabatic tendencies in MOM_diabatic_driver PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False ! 
If true perturb the KE dissipation and destruction in MOM_energetic_PBL diff --git a/parm/mom6/MOM_input_template_500 b/parm/mom6/MOM_input_template_500 new file mode 100644 index 00000000000..5a378caeb0e --- /dev/null +++ b/parm/mom6/MOM_input_template_500 @@ -0,0 +1,541 @@ +! This file was written by the model and records the non-default parameters used at run-time. +! === module MOM === + +! === module MOM_unit_scaling === +! Parameters for doing unit scaling of variables. +USE_REGRIDDING = True ! [Boolean] default = False + ! If True, use the ALE algorithm (regridding/remapping). If False, use the + ! layered isopycnal algorithm. +THICKNESSDIFFUSE = True ! [Boolean] default = False + ! If true, interface heights are diffused with a coefficient of KHTH. +THICKNESSDIFFUSE_FIRST = True ! [Boolean] default = False + ! If true, do thickness diffusion before dynamics. This is only used if + ! THICKNESSDIFFUSE is true. +DT = @[DT_DYNAM_MOM6] ! [s] + ! The (baroclinic) dynamics time step. The time-step that is actually used will + ! be an integer fraction of the forcing time-step (DT_FORCING in ocean-only mode + ! or the coupling timestep in coupled mode.) +DT_THERM = @[DT_THERM_MOM6] ! [s] default = 1800.0 + ! The thermodynamic and tracer advection time step. Ideally DT_THERM should be + ! an integer multiple of DT and less than the forcing or coupling time-step, + ! unless THERMO_SPANS_COUPLING is true, in which case DT_THERM can be an integer + ! multiple of the coupling timestep. By default DT_THERM is set to DT. +THERMO_SPANS_COUPLING = @[MOM6_THERMO_SPAN] ! [Boolean] default = False + ! If true, the MOM will take thermodynamic and tracer timesteps that can be + ! longer than the coupling timestep. The actual thermodynamic timestep that is + ! used in this case is the largest integer multiple of the coupling timestep + ! that is less than or equal to DT_THERM. +HFREEZE = 20.0 ! [m] default = -1.0 + ! If HFREEZE > 0, melt potential will be computed. 
The actual depth + ! over which melt potential is computed will be min(HFREEZE, OBLD) + ! where OBLD is the boundary layer depth. If HFREEZE <= 0 (default) + ! melt potential will not be computed. +FRAZIL = True ! [Boolean] default = False + ! If true, water freezes if it gets too cold, and the accumulated heat deficit + ! is returned in the surface state. FRAZIL is only used if + ! ENABLE_THERMODYNAMICS is true. +BOUND_SALINITY = True ! [Boolean] default = False + ! If true, limit salinity to being positive. (The sea-ice model may ask for more + ! salt than is available and drive the salinity negative otherwise.) + +! === module MOM_domains === +TRIPOLAR_N = True ! [Boolean] default = False + ! Use tripolar connectivity at the northern edge of the domain. With + ! TRIPOLAR_N, NIGLOBAL must be even. +NIGLOBAL = @[NX_GLB] ! + ! The total number of thickness grid points in the x-direction in the physical + ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. +NJGLOBAL = @[NY_GLB] ! + ! The total number of thickness grid points in the y-direction in the physical + ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. + +! === module MOM_hor_index === +! Sets the horizontal array index types. + +! === module MOM_fixed_initialization === +INPUTDIR = "INPUT" ! default = "." + ! The directory in which input files are found. + +! === module MOM_grid_init === +GRID_CONFIG = "mosaic" ! + ! A character string that determines the method for defining the horizontal + ! grid. Current options are: + ! mosaic - read the grid from a mosaic (supergrid) + ! file set by GRID_FILE. + ! cartesian - use a (flat) Cartesian grid. + ! spherical - use a simple spherical grid. + ! mercator - use a Mercator spherical grid. +GRID_FILE = "ocean_hgrid.nc" ! + ! Name of the file from which to read horizontal grid data. +GRID_ROTATION_ANGLE_BUGS = False ! [Boolean] default = True + ! If true, use an older algorithm to calculate the sine and + ! 
cosines needed to rotate between grid-oriented directions and + ! true north and east. Differences arise at the tripolar fold +USE_TRIPOLAR_GEOLONB_BUG = False ! [Boolean] default = True + ! If true, use older code that incorrectly sets the longitude in some points + ! along the tripolar fold to be off by 360 degrees. +TOPO_CONFIG = "file" ! + ! This specifies how bathymetry is specified: + ! file - read bathymetric information from the file + ! specified by (TOPO_FILE). + ! flat - flat bottom set to MAXIMUM_DEPTH. + ! bowl - an analytically specified bowl-shaped basin + ! ranging between MAXIMUM_DEPTH and MINIMUM_DEPTH. + ! spoon - a similar shape to 'bowl', but with a vertical + ! wall at the southern face. + ! halfpipe - a zonally uniform channel with a half-sine + ! profile in the meridional direction. + ! bbuilder - build topography from list of functions. + ! benchmark - use the benchmark test case topography. + ! Neverworld - use the Neverworld test case topography. + ! DOME - use a slope and channel configuration for the + ! DOME sill-overflow test case. + ! ISOMIP - use a slope and channel configuration for the + ! ISOMIP test case. + ! DOME2D - use a shelf and slope configuration for the + ! DOME2D gravity current/overflow test case. + ! Kelvin - flat but with rotated land mask. + ! seamount - Gaussian bump for spontaneous motion test case. + ! dumbbell - Sloshing channel with reservoirs on both ends. + ! shelfwave - exponential slope for shelfwave test case. + ! Phillips - ACC-like idealized topography used in the Phillips config. + ! dense - Denmark Strait-like dense water formation and overflow. + ! USER - call a user modified routine. +TOPO_FILE = "ocean_topog.nc" ! default = "topog.nc" + ! The file from which the bathymetry is read. +!MAXIMUM_DEPTH = 5801.341919389728 ! [m] + ! The (diagnosed) maximum depth of the ocean. +MINIMUM_DEPTH = 10.0 ! [m] default = 0.0 + ! If MASKING_DEPTH is unspecified, then anything shallower than MINIMUM_DEPTH is + !
assumed to be land and all fluxes are masked out. If MASKING_DEPTH is + ! specified, then all depths shallower than MINIMUM_DEPTH but deeper than + ! MASKING_DEPTH are rounded to MINIMUM_DEPTH. + +! === module MOM_open_boundary === +! Controls where open boundaries are located, what kind of boundary condition to impose, and what data to apply, +! if any. +MASKING_DEPTH = 0.0 ! [m] default = -9999.0 + ! The depth below which to mask points as land points, for which all fluxes are + ! zeroed out. MASKING_DEPTH is ignored if negative. + +! === module MOM_verticalGrid === +! Parameters providing information about the vertical grid. +NK = 25 ! [nondim] + ! The number of model layers. + +! === module MOM_tracer_registry === + +! === module MOM_EOS === +TFREEZE_FORM = "MILLERO_78" ! default = "LINEAR" + ! TFREEZE_FORM determines which expression should be used for the freezing + ! point. Currently, the valid choices are "LINEAR", "MILLERO_78", "TEOS10" + +! === module MOM_restart === +RESTART_CHECKSUMS_REQUIRED = False +! === module MOM_tracer_flow_control === + +! === module MOM_coord_initialization === +COORD_CONFIG = "file" ! default = "none" + ! This specifies how layers are to be defined: + ! ALE or none - used to avoid defining layers in ALE mode + ! file - read coordinate information from the file + ! specified by (COORD_FILE). + ! BFB - Custom coords for buoyancy-forced basin case + ! based on SST_S, T_BOT and DRHO_DT. + ! linear - linear based on interfaces not layers + ! layer_ref - linear based on layer densities + ! ts_ref - use reference temperature and salinity + ! ts_range - use range of temperature and salinity + ! (T_REF and S_REF) to determine surface density + ! and GINT calculate internal densities. + ! gprime - use reference density (RHO_0) for surface + ! density and GINT calculate internal densities. + ! ts_profile - use temperature and salinity profiles + ! (read from COORD_FILE) to set layer densities. + ! USER - call a user modified routine. 
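Entries in these MOM_input templates follow a `NAME = value ! comment` layout. As a sketch of how a single value could be pulled out for inspection (the helper name `get_mom6_param` and the sample file are inventions for illustration, not part of the workflow):

```shell
#!/usr/bin/env bash
set -eu

# Hypothetical helper: extract one parameter's value from a MOM_input-style
# file, where entries look like `NAME = value ! comment`.
get_mom6_param() {
  local file=$1 name=$2
  # Keep everything between '=' and the '!' comment, then trim trailing blanks.
  sed -n "s/^${name}[[:space:]]*=[[:space:]]*\([^!]*\).*/\1/p" "${file}" \
    | sed 's/[[:space:]]*$//'
}

# Small sample file for demonstration.
cat > MOM_input.sample <<'EOF'
NK = 25 ! [nondim]
COORD_FILE = "layer_coord25.nc" !
EOF

get_mom6_param MOM_input.sample NK           # prints: 25
get_mom6_param MOM_input.sample COORD_FILE   # prints: "layer_coord25.nc"
```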
+COORD_FILE = "layer_coord25.nc" ! + ! The file from which the coordinate densities are read. +REGRIDDING_COORDINATE_MODE = "HYCOM1" ! default = "LAYER" + ! Coordinate mode for vertical regridding. Choose among the following + ! possibilities: LAYER - Isopycnal or stacked shallow water layers + ! ZSTAR, Z* - stretched geopotential z* + ! SIGMA_SHELF_ZSTAR - stretched geopotential z* ignoring shelf + ! SIGMA - terrain following coordinates + ! RHO - continuous isopycnal + ! HYCOM1 - HyCOM-like hybrid coordinate + ! SLIGHT - stretched coordinates above continuous isopycnal + ! ADAPTIVE - optimize for smooth neutral density surfaces +BOUNDARY_EXTRAPOLATION = True ! [Boolean] default = False + ! When defined, a proper high-order reconstruction scheme is used within + ! boundary cells rather than PCM. E.g., if PPM is used for remapping, a PPM + ! reconstruction will also be used within boundary cells. +ALE_COORDINATE_CONFIG = "HYBRID:hycom1_25.nc,sigma2,FNC1:5,4000,4.5,.01" ! default = "UNIFORM" + ! Determines how to specify the coordinate + ! resolution. Valid options are: + ! PARAM - use the vector-parameter ALE_RESOLUTION + ! UNIFORM[:N] - uniformly distributed + ! FILE:string - read from a file. The string specifies + ! the filename and variable name, separated + ! by a comma or space, e.g. FILE:lev.nc,dz + ! or FILE:lev.nc,interfaces=zw + ! WOA09[:N] - the WOA09 vertical grid (approximately) + ! FNC1:string - FNC1:dz_min,H_total,power,precision + ! HYBRID:string - read from a file. The string specifies + ! the filename and two variable names, separated + ! by a comma or space, for sigma-2 and dz. e.g. + ! HYBRID:vgrid.nc,sigma2,dz +!ALE_RESOLUTION = 2*5.0, 5.01, 5.07, 5.25, 5.68, 6.55, 8.1, 10.66, 14.620000000000001, 20.450000000000003, 28.73, 40.1, 55.32, 75.23, 100.8, 133.09, 173.26, 222.62, 282.56, 354.62, 440.47, 541.87, 660.76, 799.1800000000001 ! [m] + ! The distribution of vertical resolution for the target + ! grid used for Eulerian-like coordinates. 
For example, + ! in z-coordinate mode, the parameter is a list of level + ! thicknesses (in m). In sigma-coordinate mode, the list + ! is of non-dimensional fractions of the water column. +!TARGET_DENSITIES = 1010.0, 1020.843017578125, 1027.0274658203125, 1029.279541015625, 1030.862548828125, 1032.1572265625, 1033.27978515625, 1034.251953125, 1034.850830078125, 1035.28857421875, 1035.651123046875, 1035.967529296875, 1036.2410888671875, 1036.473876953125, 1036.6800537109375, 1036.8525390625, 1036.9417724609375, 1037.0052490234375, 1037.057373046875, 1037.1065673828125, 1037.15576171875, 1037.2060546875, 1037.26416015625, 1037.3388671875, 1037.4749755859375, 1038.0 ! [m] + ! HYBRID target densities for interfaces +REGRID_COMPRESSIBILITY_FRACTION = 0.01 ! [not defined] default = 0.0 + ! When interpolating potential density profiles we can add + ! some artificial compressibility solely to make homogeneous + ! regions appear stratified. +MAXIMUM_INT_DEPTH_CONFIG = "FNC1:5,8000.0,1.0,.125" ! default = "NONE" + ! Determines how to specify the maximum interface depths. + ! Valid options are: + ! NONE - there are no maximum interface depths + ! PARAM - use the vector-parameter MAXIMUM_INTERFACE_DEPTHS + ! FILE:string - read from a file. The string specifies + ! the filename and variable name, separated + ! by a comma or space, e.g. FILE:lev.nc,Z + ! FNC1:string - FNC1:dz_min,H_total,power,precision +!MAXIMUM_INT_DEPTHS = 0.0, 5.0, 36.25, 93.75, 177.5, 287.5, 423.75, 586.25, 775.0, 990.0, 1231.25, 1498.75, 1792.5, 2112.5, 2458.75, 2831.25, 3230.0, 3655.0, 4106.25, 4583.75, 5087.5, 5617.5, 6173.75, 6756.25, 7365.0, 8000.0 ! [m] + ! The list of maximum depths for each interface. +MAX_LAYER_THICKNESS_CONFIG = "FNC1:400,31000.0,0.1,.01" ! default = "NONE" + ! Determines how to specify the maximum layer thicknesses. + ! Valid options are: + ! NONE - there are no maximum layer thicknesses + ! PARAM - use the vector-parameter MAX_LAYER_THICKNESS + ! FILE:string - read from a file. 
The string specifies + ! the filename and variable name, separated + ! by a comma or space, e.g. FILE:lev.nc,Z + ! FNC1:string - FNC1:dz_min,H_total,power,precision +!MAX_LAYER_THICKNESS = 400.0, 1094.2, 1144.02, 1174.81, 1197.42, 1215.4099999999999, 1230.42, 1243.3200000000002, 1254.65, 1264.78, 1273.94, 1282.31, 1290.02, 1297.17, 1303.85, 1310.1, 1316.0, 1321.5700000000002, 1326.85, 1331.87, 1336.67, 1341.25, 1345.6399999999999, 1349.85, 1353.88 ! [m] + ! The list of maximum thickness for each layer. +REMAPPING_SCHEME = "PPM_H4" ! default = "PLM" + ! This sets the reconstruction scheme used for vertical remapping for all + ! variables. It can be one of the following schemes: PCM (1st-order + ! accurate) + ! PLM (2nd-order accurate) + ! PPM_H4 (3rd-order accurate) + ! PPM_IH4 (3rd-order accurate) + ! PQM_IH4IH3 (4th-order accurate) + ! PQM_IH6IH5 (5th-order accurate) + +! === module MOM_grid === +! Parameters providing information about the lateral grid. + +! === module MOM_state_initialization === +INIT_LAYERS_FROM_Z_FILE = True ! [Boolean] default = False + ! If true, initialize the layer thicknesses, temperatures, and salinities from a + ! Z-space file on a latitude-longitude grid. + +! === module MOM_initialize_layers_from_Z === +TEMP_SALT_Z_INIT_FILE = "" ! default = "temp_salt_z.nc" + ! The name of the z-space input file used to initialize + ! temperatures (T) and salinities (S). If T and S are not + ! in the same file, TEMP_Z_INIT_FILE and SALT_Z_INIT_FILE + ! must be set. +TEMP_Z_INIT_FILE = "woa18_decav_t00_01.nc" ! default = "" + ! The name of the z-space input file used to initialize + ! temperatures, only. +SALT_Z_INIT_FILE = "woa18_decav_s00_01.nc" ! default = "" + ! The name of the z-space input file used to initialize + ! salinities, only. +Z_INIT_FILE_PTEMP_VAR = "t_an" ! default = "ptemp" + ! The name of the potential temperature variable in + ! TEMP_Z_INIT_FILE. +Z_INIT_FILE_SALT_VAR = "s_an" ! default = "salt" + !
The name of the salinity variable in + ! SALT_Z_INIT_FILE. +Z_INIT_ALE_REMAPPING = True ! [Boolean] default = False + ! If True, then remap straight to model coordinate from file. + +! === module MOM_diag_mediator === + +! === module MOM_MEKE === +USE_MEKE = True ! [Boolean] default = False + ! If true, turns on the MEKE scheme which calculates a sub-grid mesoscale eddy + ! kinetic energy budget. + +! === module MOM_lateral_mixing_coeffs === +USE_VARIABLE_MIXING = True ! [Boolean] default = False + ! If true, the variable mixing code will be called. This allows diagnostics to + ! be created even if the scheme is not used. If KHTR_SLOPE_CFF>0 or + ! KhTh_Slope_Cff>0, this is set to true regardless of what is in the parameter + ! file. +! === module MOM_set_visc === +CHANNEL_DRAG = True ! [Boolean] default = False + ! If true, the bottom drag is exerted directly on each layer proportional to the + ! fraction of the bottom it overlies. +HBBL = 10.0 ! [m] + ! The thickness of a bottom boundary layer with a viscosity of KVBBL if + ! BOTTOMDRAGLAW is not defined, or the thickness over which near-bottom + ! velocities are averaged for the drag law if BOTTOMDRAGLAW is defined but + ! LINEAR_DRAG is not. +KV = 1.0E-04 ! [m2 s-1] + ! The background kinematic viscosity in the interior. The molecular value, ~1e-6 + ! m2 s-1, may be used. + +! === module MOM_continuity === + +! === module MOM_continuity_PPM === + +! === module MOM_CoriolisAdv === +CORIOLIS_SCHEME = "SADOURNY75_ENSTRO" ! default = "SADOURNY75_ENERGY" + ! CORIOLIS_SCHEME selects the discretization for the Coriolis terms. Valid + ! values are: + ! SADOURNY75_ENERGY - Sadourny, 1975; energy cons. + ! ARAKAWA_HSU90 - Arakawa & Hsu, 1990 + ! SADOURNY75_ENSTRO - Sadourny, 1975; enstrophy cons. + ! ARAKAWA_LAMB81 - Arakawa & Lamb, 1981; En. + Enst. + ! ARAKAWA_LAMB_BLEND - A blend of Arakawa & Lamb with + ! Arakawa & Hsu and Sadourny energy +BOUND_CORIOLIS = True ! [Boolean] default = False + ! 
If true, the Coriolis terms at u-points are bounded by the four estimates of + ! (f+rv)v from the four neighboring v-points, and similarly at v-points. This + ! option would have no effect on the SADOURNY Coriolis scheme if it were + ! possible to use centered difference thickness fluxes. + +! === module MOM_PressureForce === + +! === module MOM_PressureForce_AFV === +MASS_WEIGHT_IN_PRESSURE_GRADIENT = True ! [Boolean] default = False + ! If true, use mass weighting when interpolating T/S for integrals near the + ! bathymetry in AFV pressure gradient calculations. + +! === module MOM_hor_visc === +LAPLACIAN = True ! [Boolean] default = False + ! If true, use a Laplacian horizontal viscosity. +KH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 + ! The velocity scale which is multiplied by the grid spacing to calculate the + ! Laplacian viscosity. The final viscosity is the largest of this scaled + ! viscosity, the Smagorinsky and Leith viscosities, and KH. +KH_SIN_LAT = 2000.0 ! [m2 s-1] default = 0.0 + ! The amplitude of a latitudinally-dependent background viscosity of the form + ! KH_SIN_LAT*(SIN(LAT)**KH_PWR_OF_SINE). +SMAGORINSKY_KH = True ! [Boolean] default = False + ! If true, use a Smagorinsky nonlinear eddy viscosity. +SMAG_LAP_CONST = 0.15 ! [nondim] default = 0.0 + ! The nondimensional Laplacian Smagorinsky constant, often 0.15. +AH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 + ! The velocity scale which is multiplied by the cube of the grid spacing to + ! calculate the biharmonic viscosity. The final viscosity is the largest of this + ! scaled viscosity, the Smagorinsky and Leith viscosities, and AH. +SMAGORINSKY_AH = True ! [Boolean] default = False + ! If true, use a biharmonic Smagorinsky nonlinear eddy viscosity. +SMAG_BI_CONST = 0.06 ! [nondim] default = 0.0 + ! The nondimensional biharmonic Smagorinsky constant, typically 0.015 - 0.06. +USE_LAND_MASK_FOR_HVISC = True ! [Boolean] default = False + ! 
If true, use the land mask for the computation of thicknesses at velocity + ! locations. This eliminates the dependence on arbitrary values over land or + ! outside of the domain. + +! === module MOM_vert_friction === +HMIX_FIXED = 0.5 ! [m] + ! The prescribed depth over which the near-surface viscosity and diffusivity are + ! elevated when the bulk mixed layer is not used. +KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 + ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. + ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. +MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 + ! The maximum velocity allowed before the velocity components are truncated. + +! === module MOM_barotropic === +BOUND_BT_CORRECTION = True ! [Boolean] default = False + ! If true, the corrective pseudo mass-fluxes into the barotropic solver are + ! limited to values that require less than maxCFL_BT_cont to be accommodated. +BT_PROJECT_VELOCITY = True ! [Boolean] default = False + ! If true, step the barotropic velocity first and project out the velocity + ! tendency by 1+BEBT when calculating the transport. The default (false) is to + ! use a predictor continuity step to find the pressure field, and then to do a + ! corrector continuity step using a weighted average of the old and new + ! velocities, with weights of (1-BEBT) and BEBT. +DYNAMIC_SURFACE_PRESSURE = False ! [Boolean] default = False + ! If true, add a dynamic pressure due to a viscous ice shelf, for instance. +BEBT = 0.2 ! [nondim] default = 0.1 + ! BEBT determines whether the barotropic time stepping uses the forward-backward + ! time-stepping scheme or a backward Euler scheme. BEBT is valid in the range + ! from 0 (for a forward-backward treatment of nonrotating gravity waves) to 1 + ! (for a backward Euler treatment). In practice, BEBT must be greater than about + ! 0.05. +DTBT = -0.9 ! [s or nondim] default = -0.98 + ! The barotropic time step, in s. 
DTBT is only used with the split explicit time + ! stepping. To set the time step automatically based on the maximum stable value + ! use 0, or a negative value gives the fraction of the stable value. Setting + ! DTBT to 0 is the same as setting it to -0.98. The value of DTBT that will + ! actually be used is an integer fraction of DT, rounding down. + +! === module MOM_mixed_layer_restrat === +MIXEDLAYER_RESTRAT = False ! [Boolean] default = False + ! If true, a density-gradient dependent re-stratifying flow is imposed in the + ! mixed layer. Can be used in ALE mode without restriction but in layer mode can + ! only be used if BULKMIXEDLAYER is true. +FOX_KEMPER_ML_RESTRAT_COEF = 60.0 ! [nondim] default = 0.0 + ! A nondimensional coefficient that is proportional to the ratio of the + ! deformation radius to the dominant lengthscale of the submesoscale mixed layer + ! instabilities, times the minimum of the ratio of the mesoscale eddy kinetic + ! energy to the large-scale geostrophic kinetic energy or 1 plus the square of + ! the grid spacing over the deformation radius, as detailed by Fox-Kemper et al. + ! (2010) +MLE_FRONT_LENGTH = 200.0 ! [m] default = 0.0 + ! If non-zero, is the frontal-length scale used to calculate the upscaling of + ! buoyancy gradients that is otherwise represented by the parameter + ! FOX_KEMPER_ML_RESTRAT_COEF. If MLE_FRONT_LENGTH is non-zero, it is recommended + ! to set FOX_KEMPER_ML_RESTRAT_COEF=1.0. +MLE_USE_PBL_MLD = True ! [Boolean] default = False + ! If true, the MLE parameterization will use the mixed-layer depth provided by + ! the active PBL parameterization. If false, MLE will estimate a MLD based on a + ! density difference with the surface using the parameter MLE_DENSITY_DIFF. +MLE_MLD_DECAY_TIME = 2.592E+06 ! [s] default = 0.0 + ! The time-scale for a running-mean filter applied to the mixed-layer depth used + ! in the MLE restratification parameterization. When the MLD deepens below the + ! 
current running-mean, the running-mean is instantaneously set to the current + ! MLD. + +! === module MOM_diabatic_driver === +! The following parameters are used for diabatic processes. +ENERGETICS_SFC_PBL = True ! [Boolean] default = False + ! If true, use an implied energetics planetary boundary layer scheme to + ! determine the diffusivity and viscosity in the surface boundary layer. +EPBL_IS_ADDITIVE = False ! [Boolean] default = True + ! If true, the diffusivity from ePBL is added to all other diffusivities. + ! Otherwise, the larger of the kappa-shear and ePBL diffusivities is used. + +! === module MOM_CVMix_KPP === +! This is the MOM wrapper to CVMix:KPP +! See http://cvmix.github.io/ + +! === module MOM_tidal_mixing === +! Vertical Tidal Mixing Parameterization + +! === module MOM_CVMix_conv === +! Parameterization of enhanced mixing due to convection via CVMix + +! === module MOM_set_diffusivity === + +! === module MOM_bkgnd_mixing === +! Adding static vertical background mixing coefficients +KD = 1.5E-05 ! [m2 s-1] default = 0.0 + ! The background diapycnal diffusivity of density in the interior. Zero or the + ! molecular value, ~1e-7 m2 s-1, may be used. +KD_MIN = 2.0E-06 ! [m2 s-1] default = 2.0E-07 + ! The minimum diapycnal diffusivity. +HENYEY_IGW_BACKGROUND = True ! [Boolean] default = False + ! If true, use a latitude-dependent scaling for the near surface background + ! diffusivity, as described in Harrison & Hallberg, JPO 2008. + +! === module MOM_kappa_shear === +! Parameterization of shear-driven turbulence following Jackson, Hallberg and Legg, JPO 2008 +USE_JACKSON_PARAM = True ! [Boolean] default = False + ! If true, use the Jackson-Hallberg-Legg (JPO 2008) shear mixing + ! parameterization. +MAX_RINO_IT = 25 ! [nondim] default = 50 + ! The maximum number of iterations that may be used to estimate the Richardson + ! number driven mixing. + +! === module MOM_CVMix_shear === +! 
Parameterization of shear-driven turbulence via CVMix (various options) + +! === module MOM_CVMix_ddiff === +! Parameterization of mixing due to double diffusion processes via CVMix + +! === module MOM_diabatic_aux === +! The following parameters are used for auxiliary diabatic processes. + +! === module MOM_energetic_PBL === +EPBL_USTAR_MIN = 1.45842E-18 ! [m s-1] + ! The (tiny) minimum friction velocity used within the ePBL code, derived from + ! OMEGA and ANGSTROM. +USE_LA_LI2016 = @[MOM6_USE_LI2016] ! [nondim] default = False + ! A logical to use the Li et al. 2016 (submitted) formula to determine the + ! Langmuir number. +USE_WAVES = @[MOM6_USE_WAVES] ! [Boolean] default = False + ! If true, enables surface wave modules. + +! === module MOM_regularize_layers === + +! === module MOM_opacity === + +! === module MOM_tracer_advect === +TRACER_ADVECTION_SCHEME = "PPM:H3" ! default = "PLM" + ! The horizontal transport scheme for tracers: + ! PLM - Piecewise Linear Method + ! PPM:H3 - Piecewise Parabolic Method (Huynh 3rd order) + ! PPM - Piecewise Parabolic Method (Colella-Woodward) + +! === module MOM_tracer_hor_diff === +KHTR = 50.0 ! [m2 s-1] default = 0.0 + ! The background along-isopycnal tracer diffusivity. +CHECK_DIFFUSIVE_CFL = True ! [Boolean] default = False + ! If true, use enough iterations of the diffusion to ensure that the diffusive + ! equivalent of the CFL limit is not violated. If false, always use the greater + ! of 1 or MAX_TR_DIFFUSION_CFL iterations. +MAX_TR_DIFFUSION_CFL = 2.0 ! [nondim] default = -1.0 + ! If positive, locally limit the along-isopycnal tracer diffusivity to keep the + ! diffusive CFL locally at or below this value. The number of diffusive + ! iterations is often this value or the next greater integer. + +! === module MOM_neutral_diffusion === +! This module implements neutral diffusion of tracers +USE_NEUTRAL_DIFFUSION = True ! [Boolean] default = False + ! If true, enables the neutral diffusion module. + +! 
=== module MOM_sum_output === +MAXTRUNC = 1000 ! [truncations save_interval-1] default = 0 + ! The run will be stopped, and the day set to a very large value if the velocity + ! is truncated more than MAXTRUNC times between energy saves. Set MAXTRUNC to 0 + ! to stop if there is any truncation of velocities. + +! === module ocean_model_init === + +! === module MOM_oda_incupd === +ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False + ! If true, oda incremental updates will be applied + ! everywhere in the domain. +ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. + +ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" + ! The name of the potential temperature inc. variable in + ! ODA_INCUPD_FILE. +ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" + ! The name of the salinity inc. variable in + ! ODA_INCUPD_FILE. +ODA_THK_VAR = "h" ! default = "h" + ! The name of the int. depth inc. variable in + ! ODA_INCUPD_FILE. +ODA_INCUPD_UV = false ! +!ODA_UINC_VAR = "u" ! default = "u_inc" + ! The name of the zonal vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +!ODA_VINC_VAR = "v" ! default = "v_inc" + ! The name of the meridional vel. inc. variable in + ! ODA_INCUPD_UV_FILE. +ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 + +! === module MOM_surface_forcing === +OCEAN_SURFACE_STAGGER = "A" ! default = "C" + ! A case-insensitive character string to indicate the + ! staggering of the surface velocity field that is + ! returned to the coupler. Valid values include + ! 'A', 'B', or 'C'. + +MAX_P_SURF = 0.0 ! [Pa] default = -1.0 + ! The maximum surface pressure that can be exerted by the atmosphere and + ! floating sea-ice or ice shelves. This is needed because the FMS coupling + ! structure does not limit the water that can be frozen out of the ocean and the + ! ice-ocean heat fluxes are treated explicitly. No limit is applied if a + ! negative value is used. +WIND_STAGGER = "A" ! default = "C" + ! 
A case-insensitive character string to indicate the + ! staggering of the input wind stress field. Valid + ! values are 'A', 'B', or 'C'. +! === module MOM_restart === + +! === module MOM_file_parser === diff --git a/parm/parm_fv3diag/diag_table b/parm/parm_fv3diag/diag_table index bcd9a882e4e..37421f8a4f4 100644 --- a/parm/parm_fv3diag/diag_table +++ b/parm/parm_fv3diag/diag_table @@ -1,6 +1,83 @@ "fv3_history", 0, "hours", 1, "hours", "time" "fv3_history2d", 0, "hours", 1, "hours", "time" +"ocn%4yr%2mo%2dy%2hr", 6, "hours", 1, "hours", "time", 6, "hours", "1901 1 1 0 0 0" +"ocn_daily%4yr%2mo%2dy", 1, "days", 1, "days", "time", 1, "days", "1901 1 1 0 0 0" +############## +# Ocean fields +############## +# static fields +"ocean_model", "geolon", "geolon", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolat", "geolat", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolon_c", "geolon_c", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolat_c", "geolat_c", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolon_u", "geolon_u", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolat_u", "geolat_u", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolon_v", "geolon_v", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "geolat_v", "geolat_v", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +# "ocean_model", "depth_ocean", "depth_ocean", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +# "ocean_model", "wet", "wet", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "wet_c", "wet_c", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "wet_u", "wet_u", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "wet_v", "wet_v", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "sin_rot", "sin_rot", "ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 +"ocean_model", "cos_rot", "cos_rot", 
"ocn%4yr%2mo%2dy%2hr", "all", .false., "none", 2 + +# ocean output TSUV and others +"ocean_model", "SSH", "SSH", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "SST", "SST", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "SSS", "SSS", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "speed", "speed", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "SSU", "SSU", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "SSV", "SSV", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "frazil", "frazil", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "ePBL_h_ML", "ePBL", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "MLD_003", "MLD_003", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model", "MLD_0125", "MLD_0125", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 + +# Z-Space Fields Provided for CMIP6 (CMOR Names): +"ocean_model_z", "uo", "uo", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model_z", "vo", "vo", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model_z", "so", "so", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 +"ocean_model_z", "temp", "temp", "ocn%4yr%2mo%2dy%2hr", "all", .true., "none", 2 + +# forcing +"ocean_model", "taux", "taux", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "tauy", "tauy", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "latent", "latent", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "sensible", "sensible", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "SW", "SW", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "LW", "LW", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "evap", "evap", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "lprec", "lprec", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "lrunoff", "lrunoff", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +# "ocean_model", 
"frunoff", "frunoff", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "fprec", "fprec", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "LwLatSens", "LwLatSens", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 +"ocean_model", "Heat_PmE", "Heat_PmE", "ocn%4yr%2mo%2dy%2hr","all",.true.,"none",2 + +# Daily fields +"ocean_model", "geolon", "geolon", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolat", "geolat", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolon_c", "geolon_c", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolat_c", "geolat_c", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolon_u", "geolon_u", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolat_u", "geolat_u", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolon_v", "geolon_v", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "geolat_v", "geolat_v", "ocn_daily%4yr%2mo%2dy", "all", .false., "none", 2 +"ocean_model", "SST", "sst", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "latent", "latent", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "sensible", "sensible", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "SW", "SW", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "LW", "LW", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "evap", "evap", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "lprec", "lprec", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "taux", "taux", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 +"ocean_model", "tauy", "tauy", "ocn_daily%4yr%2mo%2dy", "all", .true., "none", 2 + +################### +# Atmosphere fields +################### "gfs_dyn", "ucomp", "ugrd", "fv3_history", "all", .false., "none", 2 "gfs_dyn", "vcomp", "vgrd", "fv3_history", "all", .false., "none", 2 
"gfs_dyn", "sphum", "spfh", "fv3_history", "all", .false., "none", 2 @@ -23,6 +100,13 @@ "gfs_dyn", "hs", "hgtsfc", "fv3_history", "all", .false., "none", 2 "gfs_phys", "cldfra", "cldfra", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "frzr", "frzr", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "frzrb", "frzrb", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "frozr", "frozr", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "frozrb", "frozrb", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "tsnowp", "tsnowp", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "tsnowpb", "tsnowpb", "fv3_history2d", "all", .false., "none", 2 +"gfs_phys", "rhonewsn", "rhonewsn", "fv3_history2d", "all", .false., "none", 2 "gfs_phys", "ALBDO_ave", "albdo_ave", "fv3_history2d", "all", .false., "none", 2 "gfs_phys", "cnvprcp_ave", "cprat_ave", "fv3_history2d", "all", .false., "none", 2 "gfs_phys", "cnvprcpb_ave", "cpratb_ave", "fv3_history2d", "all", .false., "none", 2 diff --git a/parm/parm_fv3diag/diag_table_da b/parm/parm_fv3diag/diag_table_da index d0407482374..cdcc36ee578 100644 --- a/parm/parm_fv3diag/diag_table_da +++ b/parm/parm_fv3diag/diag_table_da @@ -1,5 +1,16 @@ "fv3_history", 0, "hours", 1, "hours", "time" "fv3_history2d", 0, "hours", 1, "hours", "time" +"ocn_da%4yr%2mo%2dy%2hr", 1, "hours", 1, "hours", "time", 1, "hours" + +"ocean_model", "geolon", "geolon", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "geolat", "geolat", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "SSH", "ave_ssh", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "MLD_0125", "MLD", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "u", "u", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "v", "v", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "h", "h", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "salt", "Salt", 
"ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 +"ocean_model", "temp", "Temp", "ocn_da%4yr%2mo%2dy%2hr", "all", "none", "none", 2 "gfs_dyn", "ucomp", "ugrd", "fv3_history", "all", .false., "none", 2 "gfs_dyn", "vcomp", "vgrd", "fv3_history", "all", .false., "none", 2 diff --git a/parm/parm_fv3diag/diag_table_history b/parm/parm_fv3diag/diag_table_history deleted file mode 100644 index 9a5766c27c3..00000000000 --- a/parm/parm_fv3diag/diag_table_history +++ /dev/null @@ -1,89 +0,0 @@ -#"atmos_static", -1, "hours", 1, "hours", "time" -"fv3_history", 0, "hours", 1, "hours", "time" -"fv3_history2d", 0, "hours", 1, "hours", "time" -# -# static data -# "dynamics", "pk", "pk", "atmos_static", "all", .false., "none", 2 -# "dynamics", "bk", "bk", "atmos_static", "all", .false., "none", 2 -# "dynamics", "hyam", "hyam", "atmos_static", "all", .false., "none", 2 -# "dynamics", "hybm", "hybm", "atmos_static", "all", .false., "none", 2 -# "dynamics", "zsurf", "zsurf", "atmos_static", "all", .false., "none", 2 -# -# history files -"gfs_dyn", "ucomp", "ucomp", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "vcomp", "vcomp", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "sphum", "sphum", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "temp", "temp", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "liq_wat", "liq_wat", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "o3mr", "o3mr", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "delp", "delp", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "pfhy", "hypres", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "pfnh", "nhpres", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "w", "vvel", "fv3_history", "all", .false., "none", 2 -"gfs_dyn", "delz", "delz", "fv3_history", "all", .false., "none", 2 -# -"gfs_sfc" "hgtsfc" "hgtsfc" "fv3_history2d" "all" .false. "none" 2 -"gfs_phys" "psurf" "pressfc" "fv3_history2d" "all" .false. 
"none" 2 -"gfs_phys" "u10m" "u10m" "fv3_history2d" "all" .false. "none" 2 -"gfs_phys" "v10m" "v10m" "fv3_history2d" "all" .false. "none" 2 -"gfs_phys" "soilm" "soilm" "fv3_history2d" "all" .false. "none" 2 -"gfs_phys" "cnvprcp" "cnvprcp" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "tprcp" "tprcp" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "weasd" "weasd" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "f10m" "f10m" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "q2m" "q2m" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "t2m" "t2m" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "tsfc" "tsfc" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "vtype" "vtype" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "stype" "stype" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "slmsksfc" "slmsk" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "vfracsfc" "vfrac" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "zorlsfc" "zorl" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "uustar" "uustar" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilt1" "soilt1" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilt2" "soilt2" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilt3" "soilt3" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilt4" "soilt4" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilw1" "soilw1" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilw2" "soilw2" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilw3" "soilw3" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "soilw4" "soilw4" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "slc_1" "slc_1" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "slc_2" "slc_2" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "slc_3" "slc_3" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "slc_4" "slc_4" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "slope" "slope" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "alnsf" "alnsf" "fv3_history2d" "all" .false. 
"none" 2 -"gfs_sfc" "alnwf" "alnwf" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "alvsf" "alvsf" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "alvwf" "alvwf" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "canopy" "canopy" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "facsf" "facsf" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "facwf" "facwf" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "ffhh" "ffhh" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "ffmm" "ffmm" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "fice" "fice" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "hice" "hice" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "snoalb" "snoalb" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "shdmax" "shdmax" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "shdmin" "shdmin" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "snowd" "snowd" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "tg3" "tg3" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "tisfc" "tisfc" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "tref" "tref" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "z_c" "z_c" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "c_0" "c_0" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "c_d" "c_d" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "w_0" "w_0" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "w_d" "w_d" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xt" "xt" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xz" "xz" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "dt_cool" "dt_cool" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xs" "xs" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xu" "xu" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xv" "xv" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xtts" "xtts" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "xzts" "xzts" "fv3_history2d" "all" .false. "none" 2 -"gfs_sfc" "d_conv" "d_conv" "fv3_history2d" "all" .false. 
"none" 2 -"gfs_sfc" "qrain" "qrain" "fv3_history2d" "all" .false. "none" 2 - diff --git a/parm/parm_fv3diag/field_table_gfdl_progsigma b/parm/parm_fv3diag/field_table_gfdl_progsigma new file mode 100644 index 00000000000..f7668455da6 --- /dev/null +++ b/parm/parm_fv3diag/field_table_gfdl_progsigma @@ -0,0 +1,42 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "rainwat" + "longname", "rain mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "ice_wat" + "longname", "cloud ice mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "snowwat" + "longname", "snow mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "graupel" + "longname", "graupel mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ozone mixing ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognotsitc sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / +# non-prognostic cloud amount + "TRACER", "atmos_mod", "cld_amt" + "longname", "cloud amount" + "units", "1" + "profile_type", "fixed", "surface_value=1.e30" / diff --git a/parm/parm_fv3diag/field_table_gfdl_satmedmf_progsigma b/parm/parm_fv3diag/field_table_gfdl_satmedmf_progsigma new file mode 100644 index 00000000000..edc5389839e --- /dev/null +++ 
b/parm/parm_fv3diag/field_table_gfdl_satmedmf_progsigma @@ -0,0 +1,47 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "rainwat" + "longname", "rain mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "ice_wat" + "longname", "cloud ice mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "snowwat" + "longname", "snow mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "graupel" + "longname", "graupel mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ozone mixing ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic subgrid scale turbulent kinetic energy + "TRACER", "atmos_mod", "sgs_tke" + "longname", "subgrid scale turbulent kinetic energy" + "units", "m2/s2" + "profile_type", "fixed", "surface_value=0.0" / +# prognostic sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / +# non-prognostic cloud amount + "TRACER", "atmos_mod", "cld_amt" + "longname", "cloud amount" + "units", "1" + "profile_type", "fixed", "surface_value=1.e30" / diff --git a/parm/parm_fv3diag/field_table_thompson_noaero_tke_progsigma b/parm/parm_fv3diag/field_table_thompson_noaero_tke_progsigma new file mode 100644 index 00000000000..f424eb0d215 --- /dev/null +++ 
b/parm/parm_fv3diag/field_table_thompson_noaero_tke_progsigma @@ -0,0 +1,70 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ice water mixing ratio + "TRACER", "atmos_mod", "ice_wat" + "longname", "cloud ice mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic rain water mixing ratio + "TRACER", "atmos_mod", "rainwat" + "longname", "rain water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic snow water mixing ratio + "TRACER", "atmos_mod", "snowwat" + "longname", "snow water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic graupel water mixing ratio + "TRACER", "atmos_mod", "graupel" + "longname", "graupel mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic cloud water number concentration - not for non-aerosol runs +# "TRACER", "atmos_mod", "water_nc" +# "longname", "cloud liquid water number concentration" +# "units", "/kg" +# "profile_type", "fixed", "surface_value=0.0" / +# prognostic cloud ice number concentration + "TRACER", "atmos_mod", "ice_nc" + "longname", "cloud ice water number concentration" + "units", "/kg" + "profile_type", "fixed", "surface_value=0.0" / +# prognostic rain number concentration + "TRACER", "atmos_mod", "rain_nc" + "longname", "rain number concentration" + "units", "/kg" + "profile_type", "fixed", "surface_value=0.0" / +# prognostic ozone mixing ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", 
"surface_value=1.e30" / +# water- and ice-friendly aerosols (Thompson) - not for non-aerosol runs +# "TRACER", "atmos_mod", "liq_aero" +# "longname", "water-friendly aerosol number concentration" +# "units", "/kg" +# "profile_type", "fixed", "surface_value=0.0" / +# "TRACER", "atmos_mod", "ice_aero" +# "longname", "ice-friendly aerosol number concentration" +# "units", "/kg" +# "profile_type", "fixed", "surface_value=0.0" / +# prognostic subgrid scale turbulent kinetic energy + "TRACER", "atmos_mod", "sgs_tke" + "longname", "subgrid scale turbulent kinetic energy" + "units", "m2/s2" + "profile_type", "fixed", "surface_value=0.0" / +# prognotsitc sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / \ No newline at end of file diff --git a/parm/parm_fv3diag/field_table_wsm6_progsigma b/parm/parm_fv3diag/field_table_wsm6_progsigma new file mode 100644 index 00000000000..3bc52e12963 --- /dev/null +++ b/parm/parm_fv3diag/field_table_wsm6_progsigma @@ -0,0 +1,38 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "ice_wat" + "longname", "ice water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "rainwat" + "longname", "rain water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "snowwat" + "longname", "snow water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "graupel" + "longname", 
"graupel mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ozone mixing ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / diff --git a/parm/parm_fv3diag/field_table_wsm6_satmedmf_progsigma b/parm/parm_fv3diag/field_table_wsm6_satmedmf_progsigma new file mode 100644 index 00000000000..a73d13dbbf5 --- /dev/null +++ b/parm/parm_fv3diag/field_table_wsm6_satmedmf_progsigma @@ -0,0 +1,43 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "ice_wat" + "longname", "ice water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic rain water mixing ratio + "TRACER", "atmos_mod", "rainwat" + "longname", "rain water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "snowwat" + "longname", "snow water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / + "TRACER", "atmos_mod", "graupel" + "longname", "graupel mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ozone mixing ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic subgrid scale turbulent kinetic energy + "TRACER", "atmos_mod", "sgs_tke" +
"longname", "subgrid scale turbulent kinetic energy" + "units", "m2/s2" + "profile_type", "fixed", "surface_value=0.0" / +# prognostic sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / diff --git a/parm/parm_fv3diag/field_table_zhaocarr_progsigma b/parm/parm_fv3diag/field_table_zhaocarr_progsigma new file mode 100644 index 00000000000..9a1a1abf5d8 --- /dev/null +++ b/parm/parm_fv3diag/field_table_zhaocarr_progsigma @@ -0,0 +1,21 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ozone mixing ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / diff --git a/parm/parm_fv3diag/field_table_zhaocarr_satmedmf_progsigma b/parm/parm_fv3diag/field_table_zhaocarr_satmedmf_progsigma new file mode 100644 index 00000000000..5b29a4375da --- /dev/null +++ b/parm/parm_fv3diag/field_table_zhaocarr_satmedmf_progsigma @@ -0,0 +1,26 @@ +# added by FRE: sphum must be present in atmos +# specific humidity for moist runs + "TRACER", "atmos_mod", "sphum" + "longname", "specific humidity" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=3.e-6" / +# prognostic cloud water mixing ratio + "TRACER", "atmos_mod", "liq_wat" + "longname", "cloud water mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic ozone mixing
ratio tracer + "TRACER", "atmos_mod", "o3mr" + "longname", "ozone mixing ratio" + "units", "kg/kg" + "profile_type", "fixed", "surface_value=1.e30" / +# prognostic subgrid scale turbulent kinetic energy + "TRACER", "atmos_mod", "sgs_tke" + "longname", "subgrid scale turbulent kinetic energy" + "units", "m2/s2" + "profile_type", "fixed", "surface_value=0.0" / +# prognostic sigmab tracer + "TRACER", "atmos_mod", "sigmab" + "longname", "sigma fraction" + "units", "fraction" + "profile_type", "fixed", "surface_value=0.0" / \ No newline at end of file diff --git a/parm/parm_gdas/aero_crtm_coeff.yaml b/parm/parm_gdas/aero_crtm_coeff.yaml new file mode 100644 index 00000000000..d310ff6d319 --- /dev/null +++ b/parm/parm_gdas/aero_crtm_coeff.yaml @@ -0,0 +1,13 @@ +mkdir: +- $(DATA)/crtm/ +copy: +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/AerosolCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/CloudCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/v.viirs-m_npp.SpcCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/v.viirs-m_npp.TauCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/v.viirs-m_j1.SpcCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/v.viirs-m_j1.TauCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/NPOESS.VISice.EmisCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/NPOESS.VISland.EmisCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/NPOESS.VISsnow.EmisCoeff.bin, $(DATA)/crtm/] +- [$(FV3JEDI_FIX)/crtm/$(crtm_VERSION)/NPOESS.VISwater.EmisCoeff.bin, $(DATA)/crtm/] diff --git a/parm/parm_gdas/aero_jedi_fix.yaml b/parm/parm_gdas/aero_jedi_fix.yaml new file mode 100644 index 00000000000..31ece4ff8ff --- /dev/null +++ b/parm/parm_gdas/aero_jedi_fix.yaml @@ -0,0 +1,11 @@ +mkdir: +- !ENV ${DATA}/fv3jedi +copy: +- - !ENV ${FV3JEDI_FIX}/fv3jedi/fv3files/akbk$(npz).nc4 + - !ENV ${DATA}/fv3jedi/akbk.nc4 +- - !ENV
${FV3JEDI_FIX}/fv3jedi/fv3files/fmsmpp.nml + - !ENV ${DATA}/fv3jedi/fmsmpp.nml +- - !ENV ${FV3JEDI_FIX}/fv3jedi/fv3files/field_table_gfdl + - !ENV ${DATA}/fv3jedi/field_table +- - !ENV ${FV3JEDI_FIX}/fv3jedi/fieldmetadata/gfs-aerosol.yaml + - !ENV ${DATA}/fv3jedi/gfs-restart.yaml diff --git a/parm/parm_gdas/aeroanl_inc_vars.yaml b/parm/parm_gdas/aeroanl_inc_vars.yaml new file mode 100644 index 00000000000..298373d6e28 --- /dev/null +++ b/parm/parm_gdas/aeroanl_inc_vars.yaml @@ -0,0 +1 @@ +incvars: ['dust1', 'dust2', 'dust3', 'dust4', 'dust5', 'seas1', 'seas2', 'seas3', 'seas4', 'so4', 'oc1', 'oc2', 'bc1', 'bc2'] diff --git a/parm/parm_gdas/atm_crtm_coeff.yaml b/parm/parm_gdas/atm_crtm_coeff.yaml new file mode 100644 index 00000000000..8e8d433b067 --- /dev/null +++ b/parm/parm_gdas/atm_crtm_coeff.yaml @@ -0,0 +1,178 @@ +mkdir: +- $(DATA)/crtm +copy: +# Emissivity files +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.VISice.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.VISland.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.VISsnow.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.VISwater.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.IRice.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.IRland.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/NPOESS.IRsnow.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/Nalli.IRwater.EmisCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/FASTEM6.MWwater.EmisCoeff.bin, $(DATA)/crtm] +# Aerosol and Cloud files +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/AerosolCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/CloudCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/CloudCoeff.GFDLFV3.-109z-1.bin, $(DATA)/crtm] +# Satellite_Sensor specific Tau and Spc 
coefficient files +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/abi_g16.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/abi_g16.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/abi_g17.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/abi_g17.TauCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/abi_g18.SpcCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/abi_g18.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ahi_himawari8.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ahi_himawari8.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ahi_himawari9.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ahi_himawari9.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/airs_aqua.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/airs_aqua.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsr2_gcom-w1.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsr2_gcom-w1.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsre_aqua.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsre_aqua.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_aqua.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_aqua.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-a.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-a.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-b.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-b.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-c.SpcCoeff.bin, $(DATA)/crtm] +- 
[$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-c.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_n15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_n15.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_n18.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_n18.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_n19.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_n19.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsub_n17.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsub_n17.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/atms_n20.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/atms_n20.TauCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/atms_n21.SpcCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/atms_n21.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/atms_npp.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/atms_npp.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_metop-a.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_metop-a.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_metop-b.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_metop-b.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_metop-c.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_metop-c.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_n18.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_n18.TauCoeff.bin, $(DATA)/crtm] +- 
[$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_n19.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/avhrr3_n19.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/cris-fsr_n20.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/cris-fsr_n20.TauCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/cris-fsr_n21.SpcCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/cris-fsr_n21.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/cris-fsr_npp.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/cris-fsr_npp.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/gmi_gpm.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/gmi_gpm.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs3_n17.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs3_n17.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs4_metop-a.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs4_metop-a.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs4_metop-b.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs4_metop-b.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs4_n19.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/hirs4_n19.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/iasi_metop-a.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/iasi_metop-a.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/iasi_metop-b.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/iasi_metop-b.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/iasi_metop-c.SpcCoeff.bin, $(DATA)/crtm] +- 
[$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/iasi_metop-c.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g11.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g11.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g12.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g12.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g13.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g13.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g14.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g14.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/imgr_g15.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_metop-a.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_metop-a.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_metop-b.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_metop-b.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_metop-c.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_metop-c.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_n18.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_n18.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_n19.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/mhs_n19.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/saphir_meghat.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/saphir_meghat.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m08.SpcCoeff.bin, $(DATA)/crtm] +- 
[$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m08.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m09.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m09.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m10.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m10.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m11.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/seviri_m11.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g11.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g11.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g12.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g12.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g13.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g13.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g14.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g14.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD1_g15.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g11.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g11.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g12.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g12.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g13.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g13.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g14.SpcCoeff.bin, 
$(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g14.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD2_g15.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g11.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g11.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g12.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g12.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g13.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g13.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g14.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g14.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD3_g15.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g11.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g11.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g12.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g12.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g13.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g13.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g14.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g14.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/sndrD4_g15.TauCoeff.bin, $(DATA)/crtm] +- 
[$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmi_f15.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmi_f15.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f16.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f16.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f17.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f17.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f18.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f18.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f19.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f19.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f20.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/ssmis_f20.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/viirs-m_j1.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/viirs-m_j1.TauCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/viirs-m_j2.SpcCoeff.bin, $(DATA)/crtm] +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/viirs-m_j2.TauCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/viirs-m_npp.SpcCoeff.bin, $(DATA)/crtm] +- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/viirs-m_npp.TauCoeff.bin, $(DATA)/crtm] +# Special Spc files +##- [$(HOMEgfs)/fix/gdas/crtm/$(crtm_VERSION)/amsua_metop-a_v2.SpcCoeff.bin, $(DATA)/crtm] diff --git a/parm/parm_gdas/atm_jedi_fix.yaml b/parm/parm_gdas/atm_jedi_fix.yaml new file mode 100644 index 00000000000..07b0fe49f14 --- /dev/null +++ b/parm/parm_gdas/atm_jedi_fix.yaml @@ -0,0 +1,7 @@ +mkdir: +- $(DATA)/fv3jedi +copy: +- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/akbk$(npz).nc4, $(DATA)/fv3jedi/akbk.nc4] +- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/fmsmpp.nml, 
$(DATA)/fv3jedi/fmsmpp.nml] +- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/field_table_gfdl, $(DATA)/fv3jedi/field_table] +- [$(HOMEgfs)/fix/gdas/fv3jedi/fieldmetadata/gfs-restart.yaml, $(DATA)/fv3jedi/gfs-restart.yaml] diff --git a/parm/parm_gdas/atmanl_inc_vars.yaml b/parm/parm_gdas/atmanl_inc_vars.yaml new file mode 100644 index 00000000000..cb6718ce9f4 --- /dev/null +++ b/parm/parm_gdas/atmanl_inc_vars.yaml @@ -0,0 +1 @@ +incvars: ['ua', 'va', 't', 'sphum', 'liq_wat', 'ice_wat', 'o3mr'] diff --git a/parm/ufs/fix/gfs/atmos.fixed_files.yaml b/parm/ufs/fix/gfs/atmos.fixed_files.yaml new file mode 100644 index 00000000000..cc82f7a2531 --- /dev/null +++ b/parm/ufs/fix/gfs/atmos.fixed_files.yaml @@ -0,0 +1,85 @@ +copy: + # Atmosphere mosaic file linked as the grid_spec file (atm only) + - [$(FIX_orog)/$(atm_res)/$(atm_res)_mosaic.nc, $(DATA)/INPUT/grid_spec.nc] + + # Atmosphere grid tile files + - [$(FIX_orog)/$(atm_res)/$(atm_res)_grid.tile1.nc, $(DATA)/INPUT/] + - [$(FIX_orog)/$(atm_res)/$(atm_res)_grid.tile2.nc, $(DATA)/INPUT/] + - [$(FIX_orog)/$(atm_res)/$(atm_res)_grid.tile3.nc, $(DATA)/INPUT/] + - [$(FIX_orog)/$(atm_res)/$(atm_res)_grid.tile4.nc, $(DATA)/INPUT/] + - [$(FIX_orog)/$(atm_res)/$(atm_res)_grid.tile5.nc, $(DATA)/INPUT/] + - [$(FIX_orog)/$(atm_res)/$(atm_res)_grid.tile6.nc, $(DATA)/INPUT/] + + # oro_data_ls and oro_data_ss files from FIX_ugwd + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ls.tile1.nc, $(DATA)/INPUT/oro_data_ls.tile1.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ls.tile2.nc, $(DATA)/INPUT/oro_data_ls.tile2.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ls.tile3.nc, $(DATA)/INPUT/oro_data_ls.tile3.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ls.tile4.nc, $(DATA)/INPUT/oro_data_ls.tile4.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ls.tile5.nc, $(DATA)/INPUT/oro_data_ls.tile5.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ls.tile6.nc, $(DATA)/INPUT/oro_data_ls.tile6.nc] + - 
[$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ss.tile1.nc, $(DATA)/INPUT/oro_data_ss.tile1.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ss.tile2.nc, $(DATA)/INPUT/oro_data_ss.tile2.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ss.tile3.nc, $(DATA)/INPUT/oro_data_ss.tile3.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ss.tile4.nc, $(DATA)/INPUT/oro_data_ss.tile4.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ss.tile5.nc, $(DATA)/INPUT/oro_data_ss.tile5.nc] + - [$(FIX_ugwd)/$(atm_res)/$(atm_res)_oro_data_ss.tile6.nc, $(DATA)/INPUT/oro_data_ss.tile6.nc] + + # GWD?? + - [$(FIX_ugwd)/ugwp_limb_tau.nc, $(DATA)/ugwp_limb_tau.nc] + + # CO2 climatology + - [$(FIX_am)/co2monthlycyc.txt, $(DATA)/co2monthlycyc.txt] + - [$(FIX_am)/global_co2historicaldata_glob.txt, $(DATA)/co2historicaldata_glob.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2009.txt, $(DATA)/co2historicaldata_2009.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2010.txt, $(DATA)/co2historicaldata_2010.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2011.txt, $(DATA)/co2historicaldata_2011.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2012.txt, $(DATA)/co2historicaldata_2012.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2013.txt, $(DATA)/co2historicaldata_2013.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2014.txt, $(DATA)/co2historicaldata_2014.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2015.txt, $(DATA)/co2historicaldata_2015.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2016.txt, $(DATA)/co2historicaldata_2016.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2017.txt, $(DATA)/co2historicaldata_2017.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2018.txt, $(DATA)/co2historicaldata_2018.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2019.txt, $(DATA)/co2historicaldata_2019.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2020.txt, 
$(DATA)/co2historicaldata_2020.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2021.txt, $(DATA)/co2historicaldata_2021.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2022.txt, $(DATA)/co2historicaldata_2022.txt] + - [$(FIX_am)/fix_co2_proj/global_co2historicaldata_2023.txt, $(DATA)/co2historicaldata_2023.txt] + + # FIX_am files + - [$(FIX_am)/global_climaeropac_global.txt, $(DATA)/aerosol.dat] + - [$(FIX_am)/ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77, $(DATA)/global_o3prdlos.f77] + - [$(FIX_am)/global_h2o_pltc.f77, $(DATA)/global_h2oprdlos.f77] + - [$(FIX_am)/global_glacier.2x2.grb, $(DATA)/global_glacier.2x2.grb] + - [$(FIX_am)/global_maxice.2x2.grb, $(DATA)/global_maxice.2x2.grb] + - [$(FIX_am)/global_snoclim.1.875.grb, $(DATA)/global_snoclim.1.875.grb] + - [$(FIX_am)/global_slmask.t1534.3072.1536.grb, $(DATA)/global_slmask.t1534.3072.1536.grb] + - [$(FIX_am)/global_soilmgldas.statsgo.t1534.3072.1536.grb, $(DATA)/global_soilmgldas.statsgo.t1534.3072.1536.grb] + - [$(FIX_am)/global_solarconstant_noaa_an.txt, $(DATA)/solarconstant_noaa_an.txt] + - [$(FIX_am)/global_sfc_emissivity_idx.txt, $(DATA)/sfc_emissivity_idx.txt] + - [$(FIX_am)/RTGSST.1982.2012.monthly.clim.grb, $(DATA)/RTGSST.1982.2012.monthly.clim.grb] + - [$(FIX_am)/IMS-NIC.blended.ice.monthly.clim.grb, $(DATA)/IMS-NIC.blended.ice.monthly.clim.grb] + + # MERRA2 Aerosol Climatology + - [$(FIX_aer)/merra2.aerclim.2003-2014.m01.nc, $(DATA)/aeroclim.m01.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m02.nc, $(DATA)/aeroclim.m02.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m03.nc, $(DATA)/aeroclim.m03.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m04.nc, $(DATA)/aeroclim.m04.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m05.nc, $(DATA)/aeroclim.m05.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m06.nc, $(DATA)/aeroclim.m06.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m07.nc, $(DATA)/aeroclim.m07.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m08.nc, $(DATA)/aeroclim.m08.nc] + - 
[$(FIX_aer)/merra2.aerclim.2003-2014.m09.nc, $(DATA)/aeroclim.m09.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m10.nc, $(DATA)/aeroclim.m10.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m11.nc, $(DATA)/aeroclim.m11.nc] + - [$(FIX_aer)/merra2.aerclim.2003-2014.m12.nc, $(DATA)/aeroclim.m12.nc] + + # Optical depth + - [$(FIX_lut)/optics_BC.v1_3.dat, $(DATA)/optics_BC.dat] + - [$(FIX_lut)/optics_DU.v15_3.dat, $(DATA)/optics_DU.dat] + - [$(FIX_lut)/optics_OC.v1_3.dat, $(DATA)/optics_OC.dat] + - [$(FIX_lut)/optics_SS.v3_3.dat, $(DATA)/optics_SS.dat] + - [$(FIX_lut)/optics_SU.v1_3.dat, $(DATA)/optics_SU.dat] + + # fd_nems.yaml file + - [$(HOMEgfs)/sorc/ufs_model.fd/tests/parm/fd_nems.yaml, $(DATA)/] diff --git a/parm/ufs/fix/gfs/land.fixed_files.yaml b/parm/ufs/fix/gfs/land.fixed_files.yaml new file mode 100644 index 00000000000..ab93ff27a63 --- /dev/null +++ b/parm/ufs/fix/gfs/land.fixed_files.yaml @@ -0,0 +1,58 @@ +copy: + + # Files from FIX_orog/C??.mx??_frac/fix_sfc + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).facsf.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).facsf.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).facsf.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).facsf.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).facsf.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).facsf.tile6.nc, $(DATA)/] + + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).maximum_snow_albedo.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).maximum_snow_albedo.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).maximum_snow_albedo.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).maximum_snow_albedo.tile4.nc, $(DATA)/] + - 
[$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).maximum_snow_albedo.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).maximum_snow_albedo.tile6.nc, $(DATA)/] + + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).slope_type.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).slope_type.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).slope_type.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).slope_type.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).slope_type.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).slope_type.tile6.nc, $(DATA)/] + + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).snowfree_albedo.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).snowfree_albedo.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).snowfree_albedo.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).snowfree_albedo.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).snowfree_albedo.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).snowfree_albedo.tile6.nc, $(DATA)/] + + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).soil_type.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).soil_type.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).soil_type.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).soil_type.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).soil_type.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).soil_type.tile6.nc, $(DATA)/] + + - 
[$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).substrate_temperature.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).substrate_temperature.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).substrate_temperature.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).substrate_temperature.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).substrate_temperature.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).substrate_temperature.tile6.nc, $(DATA)/] + + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_greenness.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_greenness.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_greenness.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_greenness.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_greenness.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_greenness.tile6.nc, $(DATA)/] + + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_type.tile1.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_type.tile2.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_type.tile3.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_type.tile4.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_type.tile5.nc, $(DATA)/] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/fix_sfc/$(atm_res).vegetation_type.tile6.nc, $(DATA)/] diff --git a/parm/ufs/fix/gfs/ocean.fixed_files.yaml b/parm/ufs/fix/gfs/ocean.fixed_files.yaml new file mode 100644 
index 00000000000..801f070c49a --- /dev/null +++ b/parm/ufs/fix/gfs/ocean.fixed_files.yaml @@ -0,0 +1,10 @@ +copy: + + # Orography data tile files + # The following are for "frac_grid = .true." + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/oro_$(atm_res).mx$(ocn_res).tile1.nc, $(DATA)/INPUT/oro_data.tile1.nc] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/oro_$(atm_res).mx$(ocn_res).tile2.nc, $(DATA)/INPUT/oro_data.tile2.nc] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/oro_$(atm_res).mx$(ocn_res).tile3.nc, $(DATA)/INPUT/oro_data.tile3.nc] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/oro_$(atm_res).mx$(ocn_res).tile4.nc, $(DATA)/INPUT/oro_data.tile4.nc] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/oro_$(atm_res).mx$(ocn_res).tile5.nc, $(DATA)/INPUT/oro_data.tile5.nc] + - [$(FIX_orog)/$(atm_res).mx$(ocn_res)_frac/oro_$(atm_res).mx$(ocn_res).tile6.nc, $(DATA)/INPUT/oro_data.tile6.nc] diff --git a/parm/wave/bull_awips_gfswave b/parm/wave/bull_awips_gfswave new file mode 100644 index 00000000000..87aa19fe48f --- /dev/null +++ b/parm/wave/bull_awips_gfswave @@ -0,0 +1,496 @@ +# Gulf of Alaska (AG) Spectral data (4) near S/SW Alaska Anchorage (8) +export b46001=AGGA48_KWBJ_OSBM01 +export b46066=AGGA48_KWBJ_OSBM02 +export b46061=AGGA48_KWBJ_OSBM03 +export b46075=AGGA48_KWBJ_OSBM04 +export b46076=AGGA48_KWBJ_OSBM05 +export b46078=AGGA48_KWBJ_OSBM06 +export b46106=AGGA48_KWBJ_OSBM07 +export b46080=AGGA48_KWBJ_OSBM08 +export b46108=AGGA48_KWBJ_OSBM09 +export b46021=AGGA48_KWBJ_OSBM10 +export b46060=AGGA48_KWBJ_OSBM11 +export b46077=AGGA48_KWBJ_OSBM12 +export b46079=AGGA48_KWBJ_OSBM13 +export b46105=AGGA48_KWBJ_OSBM14 +export b46107=AGGA48_KWBJ_OSBM15 +export b46265=AGGA48_KWBJ_OSBM16 +# Gulf of Alaska (AG) Spectral data (4) near Alaska Panhandle and NBC (7) +export b46004=AGGA47_KWBJ_OSBM01 +export b46184=AGGA47_KWBJ_OSBM02 +export b46082=AGGA47_KWBJ_OSBM03 +export b46083=AGGA47_KWBJ_OSBM04 +export b46084=AGGA47_KWBJ_OSBM05 +export b46085=AGGA47_KWBJ_OSBM06 +export 
b46205=AGGA47_KWBJ_OSBM07 +export b46145=AGGA47_KWBJ_OSBM08 +export b46147=AGGA47_KWBJ_OSBM09 +export b46183=AGGA47_KWBJ_OSBM10 +export b46185=AGGA47_KWBJ_OSBM11 +export b46204=AGGA47_KWBJ_OSBM12 +export b46207=AGGA47_KWBJ_OSBM13 +export b46208=AGGA47_KWBJ_OSBM14 +export b46138=AGGA47_KWBJ_OSBM15 +# Eastern Pacific (PZ) spectral data (4) near Pacific states and SBC (6) +export b46002=AGPZ46_KWBJ_OSBM01 +export b46006=AGPZ46_KWBJ_OSBM02 +export b46059=AGPZ46_KWBJ_OSBM03 +export b46011=AGPZ46_KWBJ_OSBM04 +export b46012=AGPZ46_KWBJ_OSBM05 +export b46013=AGPZ46_KWBJ_OSBM06 +export b46014=AGPZ46_KWBJ_OSBM07 +export b46022=AGPZ46_KWBJ_OSBM08 +export b46023=AGPZ46_KWBJ_OSBM09 +export b46026=AGPZ46_KWBJ_OSBM10 +export b46027=AGPZ46_KWBJ_OSBM11 +export b46015=AGPZ46_KWBJ_OSBM12 +export b46025=AGPZ46_KWBJ_OSBM13 +export b46028=AGPZ46_KWBJ_OSBM14 +export b46030=AGPZ46_KWBJ_OSBM15 +export b46042=AGPZ46_KWBJ_OSBM16 +export b46047=AGPZ46_KWBJ_OSBM17 +export b46050=AGPZ46_KWBJ_OSBM18 +export b46053=AGPZ46_KWBJ_OSBM19 +export b46054=AGPZ46_KWBJ_OSBM20 +export b46062=AGPZ46_KWBJ_OSBM21 +export b46063=AGPZ46_KWBJ_OSBM22 +export b46069=AGPZ46_KWBJ_OSBM23 +export b46086=AGPZ46_KWBJ_OSBM24 +export b46089=AGPZ46_KWBJ_OSBM25 +export b46213=AGPZ46_KWBJ_OSBM26 +export b46214=AGPZ46_KWBJ_OSBM27 +export b46216=AGPZ46_KWBJ_OSBM28 +export b46217=AGPZ46_KWBJ_OSBM29 +export b46218=AGPZ46_KWBJ_OSBM30 +export b46219=AGPZ46_KWBJ_OSBM31 +export b46221=AGPZ46_KWBJ_OSBM32 +export b46222=AGPZ46_KWBJ_OSBM33 +export b46223=AGPZ46_KWBJ_OSBM34 +export b46224=AGPZ46_KWBJ_OSBM35 +export b46225=AGPZ46_KWBJ_OSBM36 +export b46227=AGPZ46_KWBJ_OSBM37 +export b46229=AGPZ46_KWBJ_OSBM38 +export b46231=AGPZ46_KWBJ_OSBM39 +export b46232=AGPZ46_KWBJ_OSBM40 +export b46215=AGPZ46_KWBJ_OSBM41 +export b46236=AGPZ46_KWBJ_OSBM42 +export b46237=AGPZ46_KWBJ_OSBM43 +export b46238=AGPZ46_KWBJ_OSBM44 +export b46239=AGPZ46_KWBJ_OSBM45 +export b46240=AGPZ46_KWBJ_OSBM46 +export b46243=AGPZ46_KWBJ_OSBM47 +export 
b46244=AGPZ46_KWBJ_OSBM48 +export b46246=AGPZ46_KWBJ_OSBM49 +export b46248=AGPZ46_KWBJ_OSBM50 +export b46024=AGPZ46_KWBJ_OSBM51 +export b46091=AGPZ46_KWBJ_OSBM52 +export b46092=AGPZ46_KWBJ_OSBM53 +export b46093=AGPZ46_KWBJ_OSBM54 +export b46094=AGPZ46_KWBJ_OSBM55 +export b46097=AGPZ46_KWBJ_OSBM56 +export b46098=AGPZ46_KWBJ_OSBM57 +export b46114=AGPZ46_KWBJ_OSBM58 +export b46212=AGPZ46_KWBJ_OSBM59 +export b46226=AGPZ46_KWBJ_OSBM60 +export b46233=AGPZ46_KWBJ_OSBM61 +export b46235=AGPZ46_KWBJ_OSBM62 +export b46242=AGPZ46_KWBJ_OSBM63 +export b46247=AGPZ46_KWBJ_OSBM64 +export b46249=AGPZ46_KWBJ_OSBM65 +export b46250=AGPZ46_KWBJ_OSBM66 +export b46251=AGPZ46_KWBJ_OSBM67 +export b46252=AGPZ46_KWBJ_OSBM68 +export b46253=AGPZ46_KWBJ_OSBM69 +export b46254=AGPZ46_KWBJ_OSBM70 +export b46255=AGPZ46_KWBJ_OSBM71 +export b46256=AGPZ46_KWBJ_OSBM72 +export b46257=AGPZ46_KWBJ_OSBM73 +export b46258=AGPZ46_KWBJ_OSBM74 +export b46259=AGPZ46_KWBJ_OSBM75 +export b46262=AGPZ46_KWBJ_OSBM76 +# Eastern Pacific (PZ) spectral data (4) near Alaska Panhandle and NBC (7) +export b46005=AGPZ47_KWBJ_OSBM01 +export b46036=AGPZ47_KWBJ_OSBM02 +export b46132=AGPZ47_KWBJ_OSBM03 +export b46206=AGPZ47_KWBJ_OSBM04 +export b46029=AGPZ47_KWBJ_OSBM05 +export b46041=AGPZ47_KWBJ_OSBM06 +export b46087=AGPZ47_KWBJ_OSBM07 +export b46211=AGPZ47_KWBJ_OSBM08 +export b46088=AGPZ47_KWBJ_OSBM09 +export b46096=AGPZ47_KWBJ_OSBM10 +export b46099=AGPZ47_KWBJ_OSBM11 +export b46100=AGPZ47_KWBJ_OSBM12 +export b46119=AGPZ47_KWBJ_OSBM13 +export b46127=AGPZ47_KWBJ_OSBM14 +export b46139=AGPZ47_KWBJ_OSBM15 +export b46264=AGPZ47_KWBJ_OSBM16 +# North Pacific and Bering Sea (PN) spectra (4) near S/SW Alaska Anchorage (8) +export b46035=AGPN48_KWBJ_OSBM01 +export b46070=AGPN48_KWBJ_OSBM02 +export b46073=AGPN48_KWBJ_OSBM03 +export b46071=AGPN48_KWBJ_OSBM04 +export b46072=AGPN48_KWBJ_OSBM05 +export b46020=AGPN48_KWBJ_OSBM06 +# Hawaiian waters (HW) spectra (4) in Pacific Ocean and Pacific Isles (0) +export b51001=AGHW40_KWBJ_OSBM01 +export
b51002=AGHW40_KWBJ_OSBM02 +export b51003=AGHW40_KWBJ_OSBM03 +export b51004=AGHW40_KWBJ_OSBM04 +export b51201=AGHW40_KWBJ_OSBM05 +export b51202=AGHW40_KWBJ_OSBM06 +export b51000=AGHW40_KWBJ_OSBM07 +export b51100=AGHW40_KWBJ_OSBM08 +export b51101=AGHW40_KWBJ_OSBM09 +export b51203=AGHW40_KWBJ_OSBM10 +export b51204=AGHW40_KWBJ_OSBM11 +export b51205=AGHW40_KWBJ_OSBM12 +export b51206=AGHW40_KWBJ_OSBM13 +export b51207=AGHW40_KWBJ_OSBM14 +export b51028=AGHW40_KWBJ_OSBM15 +export b51200=AGHW40_KWBJ_OSBM16 +export b51208=AGHW40_KWBJ_OSBM17 +export b51209=AGHW40_KWBJ_OSBM18 +export b51210=AGHW40_KWBJ_OSBM19 +export b52212=AGHW40_KWBJ_OSBM20 +export b51211=AGHW40_KWBJ_OSBM21 +export b51212=AGHW40_KWBJ_OSBM22 +export b51213=AGHW40_KWBJ_OSBM23 +# Western Pacific (PW) spectra (4) in Pacific Ocean and Pacific Isles (0) +export b52200=AGPW40_KWBJ_OSBM01 +export b22101=AGPW40_KWBJ_OSBM02 +export b22102=AGPW40_KWBJ_OSBM03 +export b22103=AGPW40_KWBJ_OSBM04 +export b22104=AGPW40_KWBJ_OSBM05 +export b22105=AGPW40_KWBJ_OSBM06 +export b52201=AGPW40_KWBJ_OSBM07 +export b52202=AGPW40_KWBJ_OSBM08 +export b52211=AGPW40_KWBJ_OSBM09 +export b21178=AGPW40_KWBJ_OSBM10 +export b21229=AGPW40_KWBJ_OSBM11 +export b22108=AGPW40_KWBJ_OSBM12 +export b22184=AGPW40_KWBJ_OSBM13 +export b22185=AGPW40_KWBJ_OSBM14 +export b22186=AGPW40_KWBJ_OSBM15 +export b22187=AGPW40_KWBJ_OSBM16 +export b22188=AGPW40_KWBJ_OSBM17 +export b22189=AGPW40_KWBJ_OSBM18 +export b22190=AGPW40_KWBJ_OSBM19 +# South Pacific (PS) in Pacific Ocean and Pacific Isles (0) +export b55020=AGPS40_KWBJ_OSBM01 +export b55033=AGPS40_KWBJ_OSBM02 +export b55035=AGPS40_KWBJ_OSBM03 +export b55039=AGPS40_KWBJ_OSBM04 +# Gulf of Mexico (GX) spectra (4) south from NC and Puerto Rico (2) +export b42001=AGGX42_KWBJ_OSBM01 +export b42002=AGGX42_KWBJ_OSBM02 +export b42003=AGGX42_KWBJ_OSBM03 +export b42007=AGGX42_KWBJ_OSBM04 +export b42019=AGGX42_KWBJ_OSBM05 +export b42020=AGGX42_KWBJ_OSBM06 +export b42035=AGGX42_KWBJ_OSBM07 +export b42036=AGGX42_KWBJ_OSBM08 
+export b42039=AGGX42_KWBJ_OSBM09 +export b42040=AGGX42_KWBJ_OSBM10 +export b42041=AGGX42_KWBJ_OSBM11 +export b42038=AGGX42_KWBJ_OSBM12 +export b42055=AGGX42_KWBJ_OSBM13 +export b42099=AGGX42_KWBJ_OSBM14 +export b42012=AGGX42_KWBJ_OSBM15 +export b42887=AGGX42_KWBJ_OSBM16 +export b42013=AGGX42_KWBJ_OSBM17 +export b42014=AGGX42_KWBJ_OSBM18 +export b42021=AGGX42_KWBJ_OSBM19 +export b42022=AGGX42_KWBJ_OSBM20 +export b42023=AGGX42_KWBJ_OSBM21 +export b42043=AGGX42_KWBJ_OSBM22 +export b42044=AGGX42_KWBJ_OSBM23 +export b42045=AGGX42_KWBJ_OSBM24 +export b42046=AGGX42_KWBJ_OSBM25 +export b42047=AGGX42_KWBJ_OSBM26 +export b42067=AGGX42_KWBJ_OSBM27 +export b42097=AGGX42_KWBJ_OSBM28 +export b42098=AGGX42_KWBJ_OSBM29 +export b42360=AGGX42_KWBJ_OSBM30 +export b42361=AGGX42_KWBJ_OSBM31 +export b42362=AGGX42_KWBJ_OSBM32 +export b42363=AGGX42_KWBJ_OSBM33 +export b42364=AGGX42_KWBJ_OSBM34 +export b42365=AGGX42_KWBJ_OSBM35 +export b42369=AGGX42_KWBJ_OSBM36 +export b42370=AGGX42_KWBJ_OSBM37 +export b42374=AGGX42_KWBJ_OSBM38 +export b42375=AGGX42_KWBJ_OSBM39 +export b42376=AGGX42_KWBJ_OSBM40 +export b42390=AGGX42_KWBJ_OSBM41 +export b42392=AGGX42_KWBJ_OSBM42 +export b42394=AGGX42_KWBJ_OSBM43 +export b42395=AGGX42_KWBJ_OSBM44 +# Caribbean Sea (CA) spectra (4) south from NC and Puerto Rico (2) +export b42056=AGCA42_KWBJ_OSBM01 +export b42057=AGCA42_KWBJ_OSBM02 +export b42058=AGCA42_KWBJ_OSBM03 +export b42080=AGCA42_KWBJ_OSBM04 +export b42059=AGCA42_KWBJ_OSBM05 +export b32012=AGCA42_KWBJ_OSBM06 +export b42060=AGCA42_KWBJ_OSBM07 +export b41194=AGCA42_KWBJ_OSBM08 +export b42085=AGCA42_KWBJ_OSBM09 +export b42089=AGCA42_KWBJ_OSBM10 +export b41052=AGCA42_KWBJ_OSBM11 +export b41051=AGCA42_KWBJ_OSBM12 +export b41056=AGCA42_KWBJ_OSBM13 +export b41115=AGCA42_KWBJ_OSBM14 +export b41117=AGCA42_KWBJ_OSBM15 +export b42079=AGCA42_KWBJ_OSBM16 +export b42086=AGCA42_KWBJ_OSBM17 +export b42095=AGCA42_KWBJ_OSBM18 +# Western Atlantic (NT) spectra (4) south from NC and Puerto Rico (2) +export 
b41001=AGNT42_KWBJ_OSBM01 +export b41002=AGNT42_KWBJ_OSBM02 +export b41004=AGNT42_KWBJ_OSBM03 +export b41008=AGNT42_KWBJ_OSBM04 +export b41009=AGNT42_KWBJ_OSBM05 +export b41010=AGNT42_KWBJ_OSBM06 +export b41012=AGNT42_KWBJ_OSBM07 +export b41013=AGNT42_KWBJ_OSBM08 +export b41025=AGNT42_KWBJ_OSBM09 +export b41035=AGNT42_KWBJ_OSBM10 +export b41036=AGNT42_KWBJ_OSBM11 +export b41043=AGNT42_KWBJ_OSBM12 +export b41046=AGNT42_KWBJ_OSBM13 +export b41047=AGNT42_KWBJ_OSBM14 +export b41048=AGNT42_KWBJ_OSBM15 +export b41112=AGNT42_KWBJ_OSBM16 +export b41113=AGNT42_KWBJ_OSBM17 +export b41114=AGNT42_KWBJ_OSBM18 +export b44014=AGNT42_KWBJ_OSBM19 +export b41037=AGNT42_KWBJ_OSBM20 +export b41038=AGNT42_KWBJ_OSBM21 +export b41049=AGNT42_KWBJ_OSBM22 +export b41044=AGNT42_KWBJ_OSBM23 +export b41109=AGNT42_KWBJ_OSBM24 +export b41110=AGNT42_KWBJ_OSBM25 +export b41111=AGNT42_KWBJ_OSBM26 +export b41053=AGNT42_KWBJ_OSBM27 +export b41058=AGNT42_KWBJ_OSBM28 +export b41024=AGNT42_KWBJ_OSBM29 +export b41027=AGNT42_KWBJ_OSBM30 +export b41029=AGNT42_KWBJ_OSBM31 +export b41030=AGNT42_KWBJ_OSBM32 +export b41033=AGNT42_KWBJ_OSBM33 +export b41061=AGNT42_KWBJ_OSBM34 +export b41062=AGNT42_KWBJ_OSBM35 +export b41063=AGNT42_KWBJ_OSBM36 +export b41064=AGNT42_KWBJ_OSBM37 +export b41108=AGNT42_KWBJ_OSBM38 +export b41159=AGNT42_KWBJ_OSBM39 +export b44056=AGNT42_KWBJ_OSBM40 +# Western Atlantic (NT) spectra (4) NE states north of VA (1) +export b44138=AGNT41_KWBJ_OSBM01 +export b44011=AGNT41_KWBJ_OSBM02 +export b44141=AGNT41_KWBJ_OSBM03 +export b44142=AGNT41_KWBJ_OSBM04 +export bWRB07=AGNT41_KWBJ_OSBM05 +export b44137=AGNT41_KWBJ_OSBM06 +export b44139=AGNT41_KWBJ_OSBM07 +export b44140=AGNT41_KWBJ_OSBM08 +export b44150=AGNT41_KWBJ_OSBM09 +export b44004=AGNT41_KWBJ_OSBM10 +export b44005=AGNT41_KWBJ_OSBM11 +export b44008=AGNT41_KWBJ_OSBM12 +export b44009=AGNT41_KWBJ_OSBM13 +export b44017=AGNT41_KWBJ_OSBM14 +export b44018=AGNT41_KWBJ_OSBM15 +export b44025=AGNT41_KWBJ_OSBM16 +export b44070=AGNT41_KWBJ_OSBM17 
+export b44024=AGNT41_KWBJ_OSBM18 +export b44027=AGNT41_KWBJ_OSBM19 +export b44037=AGNT41_KWBJ_OSBM20 +export b44038=AGNT41_KWBJ_OSBM21 +export b44251=AGNT41_KWBJ_OSBM22 +export b44255=AGNT41_KWBJ_OSBM23 +export b44099=AGNT41_KWBJ_OSBM24 +export b44100=AGNT41_KWBJ_OSBM25 +export b44066=AGNT41_KWBJ_OSBM26 +export b44093=AGNT41_KWBJ_OSBM27 +export b44095=AGNT41_KWBJ_OSBM28 +export b44096=AGNT41_KWBJ_OSBM29 +export b44097=AGNT41_KWBJ_OSBM30 +export b44098=AGNT41_KWBJ_OSBM31 +export b44007=AGNT41_KWBJ_OSBM32 +export b44013=AGNT41_KWBJ_OSBM33 +export b44020=AGNT41_KWBJ_OSBM34 +export b44029=AGNT41_KWBJ_OSBM35 +export b44030=AGNT41_KWBJ_OSBM36 +export b44031=AGNT41_KWBJ_OSBM37 +export b44032=AGNT41_KWBJ_OSBM38 +export b44033=AGNT41_KWBJ_OSBM39 +export b44034=AGNT41_KWBJ_OSBM40 +export b44039=AGNT41_KWBJ_OSBM41 +export b44040=AGNT41_KWBJ_OSBM42 +export b44043=AGNT41_KWBJ_OSBM43 +export b44054=AGNT41_KWBJ_OSBM44 +export b44055=AGNT41_KWBJ_OSBM45 +export b44058=AGNT41_KWBJ_OSBM46 +export b44060=AGNT41_KWBJ_OSBM47 +export b44061=AGNT41_KWBJ_OSBM48 +export b44062=AGNT41_KWBJ_OSBM49 +export b44063=AGNT41_KWBJ_OSBM50 +export b44064=AGNT41_KWBJ_OSBM51 +export b44065=AGNT41_KWBJ_OSBM52 +export b44072=AGNT41_KWBJ_OSBM53 +export b44089=AGNT41_KWBJ_OSBM54 +export b44090=AGNT41_KWBJ_OSBM55 +export b44091=AGNT41_KWBJ_OSBM56 +export b44092=AGNT41_KWBJ_OSBM57 +export b44094=AGNT41_KWBJ_OSBM58 +export b44172=AGNT41_KWBJ_OSBM59 +export b44235=AGNT41_KWBJ_OSBM60 +export b44087=AGNT41_KWBJ_OSBM61 +# Western Atlantic (NT) spectra (4) near South America (3) +export b31201=AGNT43_KWBJ_OSBM01 +export b31052=AGNT43_KWBJ_OSBM02 +export b31260=AGNT43_KWBJ_OSBM03 +export b31374=AGNT43_KWBJ_OSBM04 +export b31051=AGNT43_KWBJ_OSBM05 +export b31053=AGNT43_KWBJ_OSBM06 +export b31375=AGNT43_KWBJ_OSBM07 +# Tropical Belt (XT) spectra (4) near South America (3) +export b41040=AGXT43_KWBJ_OSBM01 +export b41041=AGXT43_KWBJ_OSBM02 +export b41100=AGXT43_KWBJ_OSBM03 +export b41101=AGXT43_KWBJ_OSBM04 +export 
b41060=AGXT43_KWBJ_OSBM05 +export b42087=AGXT43_KWBJ_OSBM06 +export b42088=AGXT43_KWBJ_OSBM07 +# Tropical Belt (XT) spectra (4) in Pacific Ocean and Pacific Isles (0) +export b43010=AGXT40_KWBJ_OSBM01 +export b52009=AGXT40_KWBJ_OSBM02 +# Eastern Atlantic (ET) spectra (3) near Europe (3) +export b62001=AGET43_KWBJ_OSBM01 +export b62002=AGET43_KWBJ_OSBM02 +export b62029=AGET43_KWBJ_OSBM03 +export b62023=AGET43_KWBJ_OSBM04 +export b62052=AGET43_KWBJ_OSBM05 +export b62081=AGET43_KWBJ_OSBM06 +export b62090=AGET43_KWBJ_OSBM07 +export b62091=AGET43_KWBJ_OSBM08 +export b62092=AGET43_KWBJ_OSBM09 +export b62093=AGET43_KWBJ_OSBM10 +export b62094=AGET43_KWBJ_OSBM11 +export b62095=AGET43_KWBJ_OSBM12 +export b62103=AGET43_KWBJ_OSBM13 +export b62105=AGET43_KWBJ_OSBM14 +export b62106=AGET43_KWBJ_OSBM15 +export b62107=AGET43_KWBJ_OSBM16 +export b62108=AGET43_KWBJ_OSBM17 +export b62163=AGET43_KWBJ_OSBM18 +export b62301=AGET43_KWBJ_OSBM19 +export b62303=AGET43_KWBJ_OSBM20 +export b62305=AGET43_KWBJ_OSBM21 +export b62170=AGET43_KWBJ_OSBM22 +export b64045=AGET43_KWBJ_OSBM23 +export b64046=AGET43_KWBJ_OSBM24 +export bTFGSK=AGET43_KWBJ_OSBM25 +export bTFHFN=AGET43_KWBJ_OSBM26 +export bTFSRT=AGET43_KWBJ_OSBM27 +export bLF3F=AGET43_KWBJ_OSBM28 +export b62026=AGET43_KWBJ_OSBM29 +export b62109=AGET43_KWBJ_OSBM30 +export b62111=AGET43_KWBJ_OSBM31 +export b62112=AGET43_KWBJ_OSBM32 +export b62116=AGET43_KWBJ_OSBM33 +export b62117=AGET43_KWBJ_OSBM34 +export b62119=AGET43_KWBJ_OSBM35 +export b62128=AGET43_KWBJ_OSBM36 +export b62132=AGET43_KWBJ_OSBM37 +export b62133=AGET43_KWBJ_OSBM38 +export b62142=AGET43_KWBJ_OSBM39 +export b62143=AGET43_KWBJ_OSBM40 +export b62144=AGET43_KWBJ_OSBM41 +export b62145=AGET43_KWBJ_OSBM42 +export b62152=AGET43_KWBJ_OSBM43 +export b62162=AGET43_KWBJ_OSBM44 +export b62164=AGET43_KWBJ_OSBM45 +export b62304=AGET43_KWBJ_OSBM46 +export b63055=AGET43_KWBJ_OSBM47 +export b63056=AGET43_KWBJ_OSBM48 +export b63057=AGET43_KWBJ_OSBM49 +export b63103=AGET43_KWBJ_OSBM50 +export 
b63108=AGET43_KWBJ_OSBM51 +export b63110=AGET43_KWBJ_OSBM52 +export b63112=AGET43_KWBJ_OSBM53 +export b63113=AGET43_KWBJ_OSBM54 +export b63115=AGET43_KWBJ_OSBM55 +export bLF3J=AGET43_KWBJ_OSBM56 +export bLF4B=AGET43_KWBJ_OSBM57 +export bLF4H=AGET43_KWBJ_OSBM58 +export bLF4C=AGET43_KWBJ_OSBM59 +export bLF5U=AGET43_KWBJ_OSBM60 +export bEURO=AGET43_KWBJ_OSBM61 +export bK13=AGET43_KWBJ_OSBM62 +export b62024=AGET43_KWBJ_OSBM63 +export b62082=AGET43_KWBJ_OSBM64 +export b62084=AGET43_KWBJ_OSBM65 +export b62085=AGET43_KWBJ_OSBM66 +export b13130=AGET43_KWBJ_OSBM67 +export b13131=AGET43_KWBJ_OSBM68 +export b62118=AGET43_KWBJ_OSBM69 +export b62146=AGET43_KWBJ_OSBM70 +export bBSH01=AGET43_KWBJ_OSBM71 +export bBSH02=AGET43_KWBJ_OSBM72 +export bBSH03=AGET43_KWBJ_OSBM73 +export bBSH04=AGET43_KWBJ_OSBM74 +export bBSH05=AGET43_KWBJ_OSBM75 +# Arctic Ocean (AC) spectra (4) non-descript (3) +export bTFBLK=AGAC43_KWBJ_OSBM01 +export bTFGRS=AGAC43_KWBJ_OSBM02 +export bTFKGR=AGAC43_KWBJ_OSBM03 +export bLF3N=AGAC43_KWBJ_OSBM04 +export bLF5T=AGAC43_KWBJ_OSBM05 +export bLDWR=AGAC43_KWBJ_OSBM06 +export b3FYT=AGAC43_KWBJ_OSBM07 +export bLFB1=AGAC43_KWBJ_OSBM08 +export bLFB2=AGAC43_KWBJ_OSBM09 +export b64071=AGAC43_KWBJ_OSBM10 +export b48012=AGAC43_KWBJ_OSBM11 +export b48114=AGAC43_KWBJ_OSBM12 +export b48211=AGAC43_KWBJ_OSBM13 +export b48212=AGAC43_KWBJ_OSBM14 +export b48213=AGAC43_KWBJ_OSBM15 +export b48214=AGAC43_KWBJ_OSBM16 +export b48216=AGAC43_KWBJ_OSBM17 +# Indian Ocean (IO) spectra (4) non-descript (5) +export b23092=AGIO45_KWBJ_OSBM01 +export b23093=AGIO45_KWBJ_OSBM02 +export b23094=AGIO45_KWBJ_OSBM03 +export b23096=AGIO45_KWBJ_OSBM04 +export b23097=AGIO45_KWBJ_OSBM05 +export b23098=AGIO45_KWBJ_OSBM06 +export b23099=AGIO45_KWBJ_OSBM07 +export b23100=AGIO45_KWBJ_OSBM08 +export b23101=AGIO45_KWBJ_OSBM09 +export b23168=AGIO45_KWBJ_OSBM10 +export b23169=AGIO45_KWBJ_OSBM11 +export b23170=AGIO45_KWBJ_OSBM12 +export b23172=AGIO45_KWBJ_OSBM13 +export b23173=AGIO45_KWBJ_OSBM14 +export
b23174=AGIO45_KWBJ_OSBM15 +export b56002=AGIO45_KWBJ_OSBM16 +export b56005=AGIO45_KWBJ_OSBM17 +export b56006=AGIO45_KWBJ_OSBM18 +export b56007=AGIO45_KWBJ_OSBM19 +export bAGULHAS_FA=AGIO45_KWBJ_OSBM20 +export b56010=AGIO45_KWBJ_OSBM21 +export b56012=AGIO45_KWBJ_OSBM22 +export b23167=AGIO45_KWBJ_OSBM23 +export b23171=AGIO45_KWBJ_OSBM24 +export b23451=AGIO45_KWBJ_OSBM25 +export b23455=AGIO45_KWBJ_OSBM26 +export b23456=AGIO45_KWBJ_OSBM27 +export b23491=AGIO45_KWBJ_OSBM28 +export b23492=AGIO45_KWBJ_OSBM29 +export b23493=AGIO45_KWBJ_OSBM30 +export b23494=AGIO45_KWBJ_OSBM31 +export b23495=AGIO45_KWBJ_OSBM32 diff --git a/parm/wave/grib2_gfswave.ao_9km.f000 b/parm/wave/grib2_gfswave.ao_9km.f000 new file mode 100644 index 00000000000..bd8c07adfa4 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f000 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTA88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTA88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTA88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTA88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTA88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTA88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTA88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTA88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f003 b/parm/wave/grib2_gfswave.ao_9km.f003 new file mode 100644 index 00000000000..02a8fae550b --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f003 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTB88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTB88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTB88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTB88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTB88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTB88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTB88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTB88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f006 b/parm/wave/grib2_gfswave.ao_9km.f006 new file mode 100644 index 00000000000..9166dac9aa0 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f006 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTC88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTC88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTC88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTC88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTC88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTC88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTC88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTC88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f009 b/parm/wave/grib2_gfswave.ao_9km.f009 new file mode 100644 index 00000000000..ad03ea47035 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f009 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTD88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTD88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTD88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTD88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTD88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTD88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTD88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTD88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f012 b/parm/wave/grib2_gfswave.ao_9km.f012 new file mode 100644 index 00000000000..b7e1b8f637e --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f012 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTE88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTE88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTE88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTE88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTE88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTE88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTE88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTE88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f015 b/parm/wave/grib2_gfswave.ao_9km.f015 new file mode 100644 index 00000000000..bebde1b7248 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f015 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTF88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTF88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTF88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTF88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTF88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTF88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTF88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTF88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f018 b/parm/wave/grib2_gfswave.ao_9km.f018 new file mode 100644 index 00000000000..98e94ed3ff1 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f018 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTG88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTG88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTG88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTG88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTG88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTG88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTG88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTG88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f021 b/parm/wave/grib2_gfswave.ao_9km.f021 new file mode 100644 index 00000000000..eaedce9ea68 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f021 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTH88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTH88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTH88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTH88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTH88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTH88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTH88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTH88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f024 b/parm/wave/grib2_gfswave.ao_9km.f024 new file mode 100644 index 00000000000..64dfd856b15 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f024 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f027 b/parm/wave/grib2_gfswave.ao_9km.f027 new file mode 100644 index 00000000000..080077a2de7 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f027 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f030 b/parm/wave/grib2_gfswave.ao_9km.f030 new file mode 100644 index 00000000000..fc7a3a350ee --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f030 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f033 b/parm/wave/grib2_gfswave.ao_9km.f033 new file mode 100644 index 00000000000..505911229f1 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f033 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f036 b/parm/wave/grib2_gfswave.ao_9km.f036 new file mode 100644 index 00000000000..56a5e0e2f29 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f036 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f039 b/parm/wave/grib2_gfswave.ao_9km.f039 new file mode 100644 index 00000000000..0693f2bc402 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f039 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f042 b/parm/wave/grib2_gfswave.ao_9km.f042 new file mode 100644 index 00000000000..cac1f66a6c9 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f042 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f045 b/parm/wave/grib2_gfswave.ao_9km.f045 new file mode 100644 index 00000000000..f9a99d13bf3 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f045 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f048 b/parm/wave/grib2_gfswave.ao_9km.f048 new file mode 100644 index 00000000000..b570ab7c4dc --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f048 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f051 b/parm/wave/grib2_gfswave.ao_9km.f051 new file mode 100644 index 00000000000..9c700657ca8 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f051 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f054 b/parm/wave/grib2_gfswave.ao_9km.f054 new file mode 100644 index 00000000000..4043a5e515c --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f054 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f057 b/parm/wave/grib2_gfswave.ao_9km.f057 new file mode 100644 index 00000000000..50f40538e47 --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f057 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f060 b/parm/wave/grib2_gfswave.ao_9km.f060 new file mode 100644 index 00000000000..e696f31665d --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f060 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f063 b/parm/wave/grib2_gfswave.ao_9km.f063 new file mode 100644 index 00000000000..c03ee50a2eb --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f063 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ao_9km.f066 b/parm/wave/grib2_gfswave.ao_9km.f066 new file mode 100644 index 00000000000..842ebdac19f --- /dev/null +++ b/parm/wave/grib2_gfswave.ao_9km.f066 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOTY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f069 b/parm/wave/grib2_gfswave.ao_9km.f069
new file mode 100644
index 00000000000..2c44dd2bc80
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f069
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f072 b/parm/wave/grib2_gfswave.ao_9km.f072
new file mode 100644
index 00000000000..eb75a8f5af6
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f072
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f078 b/parm/wave/grib2_gfswave.ao_9km.f078
new file mode 100644
index 00000000000..c938a909e08
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f078
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f084 b/parm/wave/grib2_gfswave.ao_9km.f084
new file mode 100644
index 00000000000..9f11fc5c18d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f084
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f090 b/parm/wave/grib2_gfswave.ao_9km.f090
new file mode 100644
index 00000000000..f3c52a2171a
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f090
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f096 b/parm/wave/grib2_gfswave.ao_9km.f096
new file mode 100644
index 00000000000..df9f5793cd9
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f096
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f102 b/parm/wave/grib2_gfswave.ao_9km.f102
new file mode 100644
index 00000000000..1558071b8f7
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f102
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f108 b/parm/wave/grib2_gfswave.ao_9km.f108
new file mode 100644
index 00000000000..41543b4d86b
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f108
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f114 b/parm/wave/grib2_gfswave.ao_9km.f114
new file mode 100644
index 00000000000..d42dcb3da6b
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f114
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f120 b/parm/wave/grib2_gfswave.ao_9km.f120
new file mode 100644
index 00000000000..5b0b3538c3e
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f120
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f126 b/parm/wave/grib2_gfswave.ao_9km.f126
new file mode 100644
index 00000000000..148f9a9a12d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f126
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f132 b/parm/wave/grib2_gfswave.ao_9km.f132
new file mode 100644
index 00000000000..9daea35eecc
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f132
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f138 b/parm/wave/grib2_gfswave.ao_9km.f138
new file mode 100644
index 00000000000..0b29e8706dc
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f138
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f144 b/parm/wave/grib2_gfswave.ao_9km.f144
new file mode 100644
index 00000000000..240f35b7ea1
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f144
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f150 b/parm/wave/grib2_gfswave.ao_9km.f150
new file mode 100644
index 00000000000..25d79d2de04
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f150
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f156 b/parm/wave/grib2_gfswave.ao_9km.f156
new file mode 100644
index 00000000000..3f9f9e7cb77
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f156
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f162 b/parm/wave/grib2_gfswave.ao_9km.f162
new file mode 100644
index 00000000000..9948e9d8106
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f162
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f168 b/parm/wave/grib2_gfswave.ao_9km.f168
new file mode 100644
index 00000000000..97a45485322
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f168
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f174 b/parm/wave/grib2_gfswave.ao_9km.f174
new file mode 100644
index 00000000000..ebc56be0aed
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f174
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ao_9km.f180 b/parm/wave/grib2_gfswave.ao_9km.f180
new file mode 100644
index 00000000000..527ec3a7605
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ao_9km.f180
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQTW88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERTW88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' UGRD Surface ',WMOHEAD='EATW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' VGRD Surface ',WMOHEAD='EBTW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECTW88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJTW88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKTW88
KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELTW88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOTW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMTW88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYTW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENTW88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPTW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPTW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f000 b/parm/wave/grib2_gfswave.at_10m.f000 new file mode 100644 index 00000000000..d477dab5b54 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f000 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBA88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBA88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBA88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBA88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBA88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBA88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBA88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBA88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f003 b/parm/wave/grib2_gfswave.at_10m.f003 new file mode 100644 index 00000000000..de559c52593 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f003 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBB88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBB88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBB88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBB88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBB88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBB88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBB88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBB88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f006 b/parm/wave/grib2_gfswave.at_10m.f006 new file mode 100644 index 00000000000..083706ac703 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f006 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBC88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBC88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBC88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBC88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBC88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBC88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBC88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBC88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f009 b/parm/wave/grib2_gfswave.at_10m.f009 new file mode 100644 index 00000000000..a9edacbcc9d --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f009 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBD88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBD88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBD88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBD88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBD88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBD88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBD88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBD88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f012 b/parm/wave/grib2_gfswave.at_10m.f012 new file mode 100644 index 00000000000..5a99330ec06 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f012 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBE88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBE88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBE88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBE88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBE88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBE88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBE88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBE88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f015 b/parm/wave/grib2_gfswave.at_10m.f015 new file mode 100644 index 00000000000..ca140775e78 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f015 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBF88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBF88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBF88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBF88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBF88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBF88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBF88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBF88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f018 b/parm/wave/grib2_gfswave.at_10m.f018 new file mode 100644 index 00000000000..edad98ca56f --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f018 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBG88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBG88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBG88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBG88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBG88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBG88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBG88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBG88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f021 b/parm/wave/grib2_gfswave.at_10m.f021 new file mode 100644 index 00000000000..fcd2c2ce535 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f021 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBH88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBH88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBH88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBH88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBH88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBH88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBH88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBH88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f024 b/parm/wave/grib2_gfswave.at_10m.f024 new file mode 100644 index 00000000000..45b9d8a4ecf --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f024 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f027 b/parm/wave/grib2_gfswave.at_10m.f027 new file mode 100644 index 00000000000..689bf6af9b1 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f027 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f030 b/parm/wave/grib2_gfswave.at_10m.f030 new file mode 100644 index 00000000000..a6ced8144d7 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f030 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f033 b/parm/wave/grib2_gfswave.at_10m.f033 new file mode 100644 index 00000000000..0a8fc7537cc --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f033 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f036 b/parm/wave/grib2_gfswave.at_10m.f036 new file mode 100644 index 00000000000..e886e1578d1 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f036 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f039 b/parm/wave/grib2_gfswave.at_10m.f039 new file mode 100644 index 00000000000..30f98c8455b --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f039 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f042 b/parm/wave/grib2_gfswave.at_10m.f042 new file mode 100644 index 00000000000..a46567d18e9 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f042 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f045 b/parm/wave/grib2_gfswave.at_10m.f045 new file mode 100644 index 00000000000..b7e34b3160b --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f045 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f048 b/parm/wave/grib2_gfswave.at_10m.f048 new file mode 100644 index 00000000000..8590d97c8f2 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f048 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f051 b/parm/wave/grib2_gfswave.at_10m.f051 new file mode 100644 index 00000000000..4facc855762 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f051 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f054 b/parm/wave/grib2_gfswave.at_10m.f054 new file mode 100644 index 00000000000..56b4b166fcf --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f054 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f057 b/parm/wave/grib2_gfswave.at_10m.f057 new file mode 100644 index 00000000000..62f017e4bb1 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f057 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f060 b/parm/wave/grib2_gfswave.at_10m.f060 new file mode 100644 index 00000000000..1d36770e68e --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f060 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f063 b/parm/wave/grib2_gfswave.at_10m.f063 new file mode 100644 index 00000000000..9bf847403e9 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f063 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f066 b/parm/wave/grib2_gfswave.at_10m.f066 new file mode 100644 index 00000000000..45276d44c1b --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f066 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f069 b/parm/wave/grib2_gfswave.at_10m.f069 new file mode 100644 index 00000000000..8b729955bce --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f069 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f072 b/parm/wave/grib2_gfswave.at_10m.f072 new file mode 100644 index 00000000000..1434f76cada --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f072 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f078 b/parm/wave/grib2_gfswave.at_10m.f078 new file mode 100644 index 00000000000..5d2e7d8bc8f --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f078 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f084 b/parm/wave/grib2_gfswave.at_10m.f084 new file mode 100644 index 00000000000..7b3b2aa7313 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f084 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f090 b/parm/wave/grib2_gfswave.at_10m.f090 new file mode 100644 index 00000000000..8ba15ede537 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f090 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f096 b/parm/wave/grib2_gfswave.at_10m.f096 new file mode 100644 index 00000000000..cc07a2d6f5a --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f096 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f102 b/parm/wave/grib2_gfswave.at_10m.f102 new file mode 100644 index 00000000000..220c5180a02 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f102 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPBQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f108 b/parm/wave/grib2_gfswave.at_10m.f108 new file mode 100644 index 00000000000..d84639c5c15 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f108 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. 
Of Data ',WMOHEAD='EYBZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f114 b/parm/wave/grib2_gfswave.at_10m.f114 new file mode 100644 index 00000000000..8503d62ca07 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f114 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. 
Of Data ',WMOHEAD='EYBZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f120 b/parm/wave/grib2_gfswave.at_10m.f120 new file mode 100644 index 00000000000..9a4331916f3 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f120 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOBR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f126 b/parm/wave/grib2_gfswave.at_10m.f126 new file mode 100644 index 00000000000..83b01f5cdd3 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f126 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOBR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f132 b/parm/wave/grib2_gfswave.at_10m.f132 new file mode 100644 index 00000000000..5ac9fc52771 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f132 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
SWELL Order Seq. Of Data ',WMOHEAD='EOBS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f138 b/parm/wave/grib2_gfswave.at_10m.f138 new file mode 100644 index 00000000000..bcc102965df --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f138 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 138 1 0 1 255 0 0 / 
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f144 b/parm/wave/grib2_gfswave.at_10m.f144 new file mode 100644 index 00000000000..144487aad38 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f144 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 144 
1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f150 b/parm/wave/grib2_gfswave.at_10m.f150 new file mode 100644 index 00000000000..d5d68ae51d9 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f150 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBT88 KWBJ',PDTN= 0 ,PDT= 0 5 
2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f156 b/parm/wave/grib2_gfswave.at_10m.f156 new file mode 100644 index 00000000000..05552c9575f --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f156 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBU88 
KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f162 b/parm/wave/grib2_gfswave.at_10m.f162 new file mode 100644 index 00000000000..5aab798d360 --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f162 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface 
',WMOHEAD='ELBU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f168 b/parm/wave/grib2_gfswave.at_10m.f168 new file mode 100644 index 00000000000..2d660fcc97f --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f168 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS 
DESC=' WVHGT Surface ',WMOHEAD='ELBV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f174 b/parm/wave/grib2_gfswave.at_10m.f174 new file mode 100644 index 00000000000..1acd3d8d5ed --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f174 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 174 1 0 1 255 
0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.at_10m.f180 b/parm/wave/grib2_gfswave.at_10m.f180 new file mode 100644 index 00000000000..7166559be0f --- /dev/null +++ b/parm/wave/grib2_gfswave.at_10m.f180 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQBW88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERBW88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EABW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBBW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECBW88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJBW88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKBW88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 
1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELBW88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOBW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMBW88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYBW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENBW88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPBW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f000 b/parm/wave/grib2_gfswave.ep_10m.f000 new file mode 100644 index 00000000000..f8d065cf4be --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f000 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDA88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDA88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDA88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDA88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDA88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 
0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDA88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDA88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDA88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f003 b/parm/wave/grib2_gfswave.ep_10m.f003 new file mode 100644 index 00000000000..115803fd635 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f003 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDB88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDB88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDB88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDB88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDB88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 3 1 0 1 
255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDB88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDB88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDB88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f006 b/parm/wave/grib2_gfswave.ep_10m.f006 new file mode 100644 index 00000000000..065d4288c81 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f006 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDC88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDC88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDC88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDC88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDC88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 6 1 0 1 255 0 0 / 
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDC88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDC88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDC88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f009 b/parm/wave/grib2_gfswave.ep_10m.f009 new file mode 100644 index 00000000000..d80dc1b7d3d --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f009 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDD88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDD88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDD88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDD88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDD88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS 
DESC=' WVHGT Surface ',WMOHEAD='ELDD88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDD88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDD88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f012 b/parm/wave/grib2_gfswave.ep_10m.f012 new file mode 100644 index 00000000000..cc3e77a1da7 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f012 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDE88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDE88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDE88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDE88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDE88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDE88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDE88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDE88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f015 b/parm/wave/grib2_gfswave.ep_10m.f015 new file mode 100644 index 00000000000..c6d3895bb74 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f015 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDF88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDF88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDF88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDF88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDF88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDF88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDF88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDF88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f018 b/parm/wave/grib2_gfswave.ep_10m.f018 new file mode 100644 index 00000000000..52088366076 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f018 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDG88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDG88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDG88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDG88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDG88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDG88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDG88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDG88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f021 b/parm/wave/grib2_gfswave.ep_10m.f021 new file mode 100644 index 00000000000..92e9cd6082c --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f021 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDH88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDH88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDH88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDH88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDH88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDH88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDH88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDH88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f024 b/parm/wave/grib2_gfswave.ep_10m.f024 new file mode 100644 index 00000000000..a92bba3c82a --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f024 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f027 b/parm/wave/grib2_gfswave.ep_10m.f027 new file mode 100644 index 00000000000..d4061202460 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f027 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f030 b/parm/wave/grib2_gfswave.ep_10m.f030 new file mode 100644 index 00000000000..ddd98764704 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f030 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f033 b/parm/wave/grib2_gfswave.ep_10m.f033 new file mode 100644 index 00000000000..17b366b5260 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f033 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 33 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f036 b/parm/wave/grib2_gfswave.ep_10m.f036 new file mode 100644 index 00000000000..dc07f4c40cc --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f036 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 36 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f039 b/parm/wave/grib2_gfswave.ep_10m.f039 new file mode 100644 index 00000000000..cac056faca1 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f039 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 39 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f042 b/parm/wave/grib2_gfswave.ep_10m.f042 new file mode 100644 index 00000000000..26e25bda578 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f042 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 42 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f045 b/parm/wave/grib2_gfswave.ep_10m.f045 new file mode 100644 index 00000000000..1de3d4f4082 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f045 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 45 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f048 b/parm/wave/grib2_gfswave.ep_10m.f048 new file mode 100644 index 00000000000..085c0ef3a00 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f048 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 48 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f051 b/parm/wave/grib2_gfswave.ep_10m.f051 new file mode 100644 index 00000000000..e5ad1dba9dd --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f051 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 51 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f054 b/parm/wave/grib2_gfswave.ep_10m.f054 new file mode 100644 index 00000000000..a3f52e7d27d --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f054 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f057 b/parm/wave/grib2_gfswave.ep_10m.f057 new file mode 100644 index 00000000000..3899e478239 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f057 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f060 b/parm/wave/grib2_gfswave.ep_10m.f060 new file mode 100644 index 00000000000..a28c9990425 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f060 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' 
WVHGT Surface ',WMOHEAD='ELDN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f063 b/parm/wave/grib2_gfswave.ep_10m.f063
new file mode 100644
index 00000000000..f13e736383f
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f063
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f066 b/parm/wave/grib2_gfswave.ep_10m.f066
new file mode 100644
index 00000000000..f598f767a0d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f066
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f069 b/parm/wave/grib2_gfswave.ep_10m.f069
new file mode 100644
index 00000000000..3a05f77135e
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f069
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f072 b/parm/wave/grib2_gfswave.ep_10m.f072
new file mode 100644
index 00000000000..482076b5c8d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f072
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f078 b/parm/wave/grib2_gfswave.ep_10m.f078
new file mode 100644
index 00000000000..8b1170193fa
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f078
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f084 b/parm/wave/grib2_gfswave.ep_10m.f084
new file mode 100644
index 00000000000..e566a3a375d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f084
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f090 b/parm/wave/grib2_gfswave.ep_10m.f090
new file mode 100644
index 00000000000..5a16bed734b
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f090
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f096 b/parm/wave/grib2_gfswave.ep_10m.f096
new file mode 100644
index 00000000000..7810dd8b217
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f096
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f102 b/parm/wave/grib2_gfswave.ep_10m.f102
new file mode 100644
index 00000000000..7e8bdf4ab2c
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f102
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f108 b/parm/wave/grib2_gfswave.ep_10m.f108
new file mode 100644
index 00000000000..0844a51d9b4
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f108
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 108 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f114 b/parm/wave/grib2_gfswave.ep_10m.f114
new file mode 100644
index 00000000000..c53b21d622d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f114
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 114 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f120 b/parm/wave/grib2_gfswave.ep_10m.f120
new file mode 100644
index 00000000000..caa597569b4
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f120
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f126 b/parm/wave/grib2_gfswave.ep_10m.f126
new file mode 100644
index 00000000000..c2bf8697f29
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f126
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f132 b/parm/wave/grib2_gfswave.ep_10m.f132
new file mode 100644
index 00000000000..f6021aaae17
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f132
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f138 b/parm/wave/grib2_gfswave.ep_10m.f138
new file mode 100644
index 00000000000..303f65efd65
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f138
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f144 b/parm/wave/grib2_gfswave.ep_10m.f144
new file mode 100644
index 00000000000..713fd1ce1a6
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f144
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f150 b/parm/wave/grib2_gfswave.ep_10m.f150
new file mode 100644
index 00000000000..35cd044bc53
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f150
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f156 b/parm/wave/grib2_gfswave.ep_10m.f156
new file mode 100644
index 00000000000..a61f769843a
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f156
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.ep_10m.f162 b/parm/wave/grib2_gfswave.ep_10m.f162
new file mode 100644
index 00000000000..71eb7d15018
--- /dev/null
+++ b/parm/wave/grib2_gfswave.ep_10m.f162
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC='
SWELL Order Seq. Of Data ',WMOHEAD='EODU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f168 b/parm/wave/grib2_gfswave.ep_10m.f168 new file mode 100644 index 00000000000..343a165fa9e --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f168 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 168 1 0 1 255 0 0 / 
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f174 b/parm/wave/grib2_gfswave.ep_10m.f174 new file mode 100644 index 00000000000..cf57aea1450 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f174 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 174 
1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.ep_10m.f180 b/parm/wave/grib2_gfswave.ep_10m.f180 new file mode 100644 index 00000000000..7ce0873b6f7 --- /dev/null +++ b/parm/wave/grib2_gfswave.ep_10m.f180 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQDW88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERDW88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EADW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBDW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECDW88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJDW88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKDW88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELDW88 KWBJ',PDTN= 0 ,PDT= 0 5 
2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EODW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMDW88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYDW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENDW88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPDW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f000 b/parm/wave/grib2_gfswave.glo_30m.f000 new file mode 100644 index 00000000000..66ff96c8035 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f000 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAA88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAA88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAA88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAA88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAA88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAA88 KWBJ',PDTN= 0 
,PDT= 0 5 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAA88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAA88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 0 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f003 b/parm/wave/grib2_gfswave.glo_30m.f003 new file mode 100644 index 00000000000..9b5200fe14e --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f003 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAB88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAB88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAB88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAB88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAB88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAB88 KWBJ',PDTN= 0 ,PDT= 0 5 2 
0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAB88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAB88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 3 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f006 b/parm/wave/grib2_gfswave.glo_30m.f006 new file mode 100644 index 00000000000..b8ea82ce76b --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f006 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAC88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAC88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAC88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAC88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAC88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAC88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 6 
1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAC88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAC88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 6 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f009 b/parm/wave/grib2_gfswave.glo_30m.f009 new file mode 100644 index 00000000000..57b88e5db66 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f009 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAD88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAD88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAD88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAD88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAD88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAD88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 9 1 0 1 255 0 
0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAD88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAD88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 9 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f012 b/parm/wave/grib2_gfswave.glo_30m.f012 new file mode 100644 index 00000000000..3e6c098b84a --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f012 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAE88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAE88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAE88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAE88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAE88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAE88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 12 1 0 1 255 0 0 / 
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAE88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAE88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 12 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f015 b/parm/wave/grib2_gfswave.glo_30m.f015 new file mode 100644 index 00000000000..28c2420b30c --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f015 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAF88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAF88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAF88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAF88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAF88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAF88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 15 1 0 1 255 0 
0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAF88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAF88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 15 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f018 b/parm/wave/grib2_gfswave.glo_30m.f018 new file mode 100644 index 00000000000..a6ded38ecf8 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f018 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAG88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAG88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAG88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAG88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAG88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAG88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 18 1 0 1 
255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAG88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAG88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 18 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f021 b/parm/wave/grib2_gfswave.glo_30m.f021 new file mode 100644 index 00000000000..ddaaad80f59 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f021 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAH88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAH88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAH88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAH88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAH88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAH88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 21 1 
0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAH88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAH88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 21 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f024 b/parm/wave/grib2_gfswave.glo_30m.f024 new file mode 100644 index 00000000000..f08b5122728 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f024 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 
24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 24 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f027 b/parm/wave/grib2_gfswave.glo_30m.f027 new file mode 100644 index 00000000000..926f7db8379 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f027 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 
0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 27 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f030 b/parm/wave/grib2_gfswave.glo_30m.f030 new file mode 100644 index 00000000000..4799f6dff4a --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f030 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 30 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 
11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f033 b/parm/wave/grib2_gfswave.glo_30m.f033
new file mode 100644
index 00000000000..87f867858f4
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f033
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f036 b/parm/wave/grib2_gfswave.glo_30m.f036
new file mode 100644
index 00000000000..c030fe540f9
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f036
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f039 b/parm/wave/grib2_gfswave.glo_30m.f039
new file mode 100644
index 00000000000..af21e75e8e3
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f039
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f042 b/parm/wave/grib2_gfswave.glo_30m.f042
new file mode 100644
index 00000000000..6c2ed1db8c3
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f042
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f045 b/parm/wave/grib2_gfswave.glo_30m.f045
new file mode 100644
index 00000000000..e9af7c48d3f
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f045
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f048 b/parm/wave/grib2_gfswave.glo_30m.f048
new file mode 100644
index 00000000000..8e6f08ceda4
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f048
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f051 b/parm/wave/grib2_gfswave.glo_30m.f051
new file mode 100644
index 00000000000..7cf17bee6cf
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f051
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f054 b/parm/wave/grib2_gfswave.glo_30m.f054
new file mode 100644
index 00000000000..83230fbcb61
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f054
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f057 b/parm/wave/grib2_gfswave.glo_30m.f057
new file mode 100644
index 00000000000..a16252d1dc1
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f057
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 57 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f060 b/parm/wave/grib2_gfswave.glo_30m.f060
new file mode 100644
index 00000000000..8657aaef613
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f060
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 60 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f063 b/parm/wave/grib2_gfswave.glo_30m.f063
new file mode 100644
index 00000000000..10e770b94eb
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f063
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 63 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f066 b/parm/wave/grib2_gfswave.glo_30m.f066
new file mode 100644
index 00000000000..942497d603f
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f066
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 66 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f069 b/parm/wave/grib2_gfswave.glo_30m.f069
new file mode 100644
index 00000000000..839d3fb3929
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f069
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 69 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f072 b/parm/wave/grib2_gfswave.glo_30m.f072
new file mode 100644
index 00000000000..ea2af78e812
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f072
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 72 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f078 b/parm/wave/grib2_gfswave.glo_30m.f078
new file mode 100644
index 00000000000..3021da6a373
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f078
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 78 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f084 b/parm/wave/grib2_gfswave.glo_30m.f084
new file mode 100644
index 00000000000..4f6ebc8ff07
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f084
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 84 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f090 b/parm/wave/grib2_gfswave.glo_30m.f090
new file mode 100644
index 00000000000..0045375fb20
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f090
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 90 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f096 b/parm/wave/grib2_gfswave.glo_30m.f096
new file mode 100644
index 00000000000..28cd75597d3
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f096
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 96 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.glo_30m.f102 b/parm/wave/grib2_gfswave.glo_30m.f102
new file mode 100644
index 00000000000..b4528fae64d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.glo_30m.f102
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 102 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq.
Of Data ',WMOHEAD='EPAQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f108 b/parm/wave/grib2_gfswave.glo_30m.f108 new file mode 100644 index 00000000000..f34717ec732 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f108 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. 
Of Data ',WMOHEAD='EPAZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f114 b/parm/wave/grib2_gfswave.glo_30m.f114 new file mode 100644 index 00000000000..d595cb13d9e --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f114 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. 
Of Data ',WMOHEAD='EYAZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f120 b/parm/wave/grib2_gfswave.glo_30m.f120 new file mode 100644 index 00000000000..cd13fb4123d --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f120 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. 
Of Data ',WMOHEAD='EYAR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f126 b/parm/wave/grib2_gfswave.glo_30m.f126 new file mode 100644 index 00000000000..44e08675dac --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f126 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOAR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 126 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f132 b/parm/wave/grib2_gfswave.glo_30m.f132 new file mode 100644 index 00000000000..5268404dee1 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f132 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. 
Of Data ',WMOHEAD='EOAS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 132 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f138 b/parm/wave/grib2_gfswave.glo_30m.f138 new file mode 100644 index 00000000000..fa38b3221ed --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f138 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS 
DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 138 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f144 b/parm/wave/grib2_gfswave.glo_30m.f144 new file mode 100644 index 00000000000..d002662383d --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f144 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 144 1 0 1 
255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 144 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f150 b/parm/wave/grib2_gfswave.glo_30m.f150 new file mode 100644 index 00000000000..390306be21b --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f150 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 
11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 150 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f156 b/parm/wave/grib2_gfswave.glo_30m.f156 new file mode 100644 index 00000000000..4cd17d276d6 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f156 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAU88 
KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 156 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f162 b/parm/wave/grib2_gfswave.glo_30m.f162 new file mode 100644 index 00000000000..5d24d74ccec --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f162 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT 
Surface ',WMOHEAD='ELAU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 162 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f168 b/parm/wave/grib2_gfswave.glo_30m.f168 new file mode 100644 index 00000000000..f9d51588522 --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f168 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 168 1 0 1 255 0 0 / 
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 168 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f174 b/parm/wave/grib2_gfswave.glo_30m.f174 new file mode 100644 index 00000000000..dc7577d1e2a --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f174 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 
174 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 174 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.glo_30m.f180 b/parm/wave/grib2_gfswave.glo_30m.f180 new file mode 100644 index 00000000000..9b94c0282fb --- /dev/null +++ b/parm/wave/grib2_gfswave.glo_30m.f180 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQAW88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERAW88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EAAW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBAW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECAW88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJAW88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 180 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKAW88 KWBJ',PDTN= 0 
,PDT= 0 10 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELAW88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOAW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMAW88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYAW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENAW88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPAW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f000 b/parm/wave/grib2_gfswave.wc_10m.f000
new file mode 100644
index 00000000000..de854de5fcd
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f000
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCA88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCA88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCA88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCA88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCA88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCA88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCA88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCA88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 0 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCA88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCA88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 0 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCA88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 0 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCA88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 0 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f003 b/parm/wave/grib2_gfswave.wc_10m.f003
new file mode 100644
index 00000000000..617e9e7b4f3
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f003
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCB88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCB88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCB88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCB88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCB88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCB88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCB88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCB88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 3 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCB88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCB88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 3 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCB88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 3 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCB88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 3 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f006 b/parm/wave/grib2_gfswave.wc_10m.f006
new file mode 100644
index 00000000000..9ce20a63b1c
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f006
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCC88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCC88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCC88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCC88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCC88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCC88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCC88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCC88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 6 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCC88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCC88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 6 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCC88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 6 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCC88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 6 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f009 b/parm/wave/grib2_gfswave.wc_10m.f009
new file mode 100644
index 00000000000..07b584002d6
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f009
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCD88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCD88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCD88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCD88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCD88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCD88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCD88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCD88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 9 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCD88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCD88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 9 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCD88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 9 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCD88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 9 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f012 b/parm/wave/grib2_gfswave.wc_10m.f012
new file mode 100644
index 00000000000..6a1c38ef685
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f012
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCE88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCE88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCE88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCE88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCE88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCE88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCE88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCE88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 12 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCE88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCE88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 12 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCE88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 12 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCE88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 12 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f015 b/parm/wave/grib2_gfswave.wc_10m.f015
new file mode 100644
index 00000000000..0b3333560bd
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f015
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCF88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCF88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCF88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCF88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCF88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCF88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCF88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCF88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 15 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCF88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCF88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 15 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCF88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 15 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCF88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 15 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f018 b/parm/wave/grib2_gfswave.wc_10m.f018
new file mode 100644
index 00000000000..404773d954a
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f018
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCG88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCG88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCG88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCG88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCG88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCG88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCG88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCG88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 18 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCG88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCG88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 18 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCG88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 18 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCG88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 18 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f021 b/parm/wave/grib2_gfswave.wc_10m.f021
new file mode 100644
index 00000000000..06c297e2759
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f021
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCH88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCH88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCH88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCH88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCH88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCH88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCH88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCH88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 21 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCH88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCH88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 21 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCH88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 21 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCH88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 21 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f024 b/parm/wave/grib2_gfswave.wc_10m.f024
new file mode 100644
index 00000000000..28e4cfa9047
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f024
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 24 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 24 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 24 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 24 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f027 b/parm/wave/grib2_gfswave.wc_10m.f027
new file mode 100644
index 00000000000..2f2ddf1d1ba
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f027
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCI88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCI88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCI88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCI88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCI88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCI88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCI88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCI88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 27 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCI88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCI88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 27 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCI88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 27 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCI88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 27 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f030 b/parm/wave/grib2_gfswave.wc_10m.f030
new file mode 100644
index 00000000000..d0725e80b4e
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f030
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 30 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 30 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 30 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 30 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f033 b/parm/wave/grib2_gfswave.wc_10m.f033
new file mode 100644
index 00000000000..f89ed37542a
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f033
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCJ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCJ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCJ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCJ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCJ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCJ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCJ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCJ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 33 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCJ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCJ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 33 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCJ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 33 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCJ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 33 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f036 b/parm/wave/grib2_gfswave.wc_10m.f036
new file mode 100644
index 00000000000..88f84d150a5
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f036
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 36 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 36 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 36 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 36 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f039 b/parm/wave/grib2_gfswave.wc_10m.f039
new file mode 100644
index 00000000000..9883f8ad982
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f039
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCK88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCK88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCK88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCK88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCK88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCK88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCK88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCK88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 39 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCK88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCK88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 39 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCK88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 39 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCK88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 39 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f042 b/parm/wave/grib2_gfswave.wc_10m.f042
new file mode 100644
index 00000000000..499279984a0
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f042
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 42 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 42 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 42 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 42 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f045 b/parm/wave/grib2_gfswave.wc_10m.f045
new file mode 100644
index 00000000000..8ac60c51e82
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f045
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCL88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCL88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCL88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCL88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCL88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCL88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCL88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCL88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 45 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCL88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCL88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 45 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCL88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 45 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCL88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 45 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f048 b/parm/wave/grib2_gfswave.wc_10m.f048
new file mode 100644
index 00000000000..7da32742dc6
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f048
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 48 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 48 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 48 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 48 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f051 b/parm/wave/grib2_gfswave.wc_10m.f051
new file mode 100644
index 00000000000..fe2762bc94c
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f051
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCM88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCM88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCM88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCM88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCM88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCM88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCM88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCM88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 51 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCM88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCM88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 51 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCM88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 51 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCM88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 51 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f054 b/parm/wave/grib2_gfswave.wc_10m.f054
new file mode 100644
index 00000000000..d1c9d07a656
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f054
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 54 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 54 1 0 1 255 0
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 54 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 54 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 54 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f057 b/parm/wave/grib2_gfswave.wc_10m.f057 new file mode 100644 index 00000000000..d03780335df --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f057 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCX88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCX88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCX88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCX88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCX88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCX88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 57 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCX88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCX88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCX88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCX88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 57 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCX88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 57 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCX88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 57 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f060 b/parm/wave/grib2_gfswave.wc_10m.f060 new file mode 100644 index 00000000000..8f87f58223f --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f060 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 60 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 60 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 60 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 60 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 60 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f063 b/parm/wave/grib2_gfswave.wc_10m.f063 new file mode 100644 index 00000000000..bc5ce486210 --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f063 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCN88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCN88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCN88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCN88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCN88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCN88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 63 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCN88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCN88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 63 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCN88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCN88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 63 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCN88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 63 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCN88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 63 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f066 b/parm/wave/grib2_gfswave.wc_10m.f066 new file mode 100644 index 00000000000..33e35d5003e --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f066 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 66 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 66 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 66 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 66 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 66 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f069 b/parm/wave/grib2_gfswave.wc_10m.f069 new file mode 100644 index 00000000000..dddb78d455b --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f069 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCY88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCY88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCY88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCY88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCY88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCY88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 69 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCY88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCY88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 69 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCY88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCY88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 69 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCY88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 69 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCY88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 69 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f072 b/parm/wave/grib2_gfswave.wc_10m.f072 new file mode 100644 index 00000000000..39476255100 --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f072 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 72 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 72 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 72 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 72 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 72 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f078 b/parm/wave/grib2_gfswave.wc_10m.f078 new file mode 100644 index 00000000000..52d71b8ea0b --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f078 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCO88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCO88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCO88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCO88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCO88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCO88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 78 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCO88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCO88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 78 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCO88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCO88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 78 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCO88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 78 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCO88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 78 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f084 b/parm/wave/grib2_gfswave.wc_10m.f084 new file mode 100644 index 00000000000..e534f1c3083 --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f084 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 84 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 84 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 84 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 84 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 84 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f090 b/parm/wave/grib2_gfswave.wc_10m.f090 new file mode 100644 index 00000000000..0b2a1e51984 --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f090 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCP88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCP88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCP88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCP88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCP88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCP88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 90 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCP88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCP88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 90 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCP88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCP88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 90 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCP88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 90 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCP88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 90 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f096 b/parm/wave/grib2_gfswave.wc_10m.f096 new file mode 100644 index 00000000000..0e54d388482 --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f096 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 96 1 0 1 255 0 
0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 96 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 96 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 96 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 96 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f102 b/parm/wave/grib2_gfswave.wc_10m.f102 new file mode 100644 index 00000000000..0d5f302fa1e --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f102 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCQ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCQ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCQ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCQ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCQ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCQ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 102 1 0 
1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCQ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCQ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 102 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCQ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCQ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 102 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCQ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 102 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCQ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 102 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f108 b/parm/wave/grib2_gfswave.wc_10m.f108 new file mode 100644 index 00000000000..50ff238485b --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f108 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCZ88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 
11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 108 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 108 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 108 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 108 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f114 b/parm/wave/grib2_gfswave.wc_10m.f114 new file mode 100644 index 00000000000..244e459484b --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f114 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCZ88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCZ88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCZ88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCZ88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCZ88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCZ88 KWBJ',PDTN= 
0 ,PDT= 0 10 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCZ88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCZ88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 114 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCZ88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCZ88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 114 241 0 2 255 0 0 / +&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCZ88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 114 1 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 1 255 0 0 / +&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCZ88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 114 241 0 2 255 0 0 / diff --git a/parm/wave/grib2_gfswave.wc_10m.f120 b/parm/wave/grib2_gfswave.wc_10m.f120 new file mode 100644 index 00000000000..9b29cdc0d15 --- /dev/null +++ b/parm/wave/grib2_gfswave.wc_10m.f120 @@ -0,0 +1,16 @@ +&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 120 1 0 1 255 0 0 / +&GRIBIDS DESC=' DIRPW Surface 
',WMOHEAD='EKCR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 120 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 120 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 120 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 120 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f126 b/parm/wave/grib2_gfswave.wc_10m.f126
new file mode 100644
index 00000000000..7b67a325604
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f126
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCR88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCR88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCR88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCR88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCR88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCR88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCR88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCR88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 126 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCR88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCR88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 126 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCR88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 126 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCR88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 126 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f132 b/parm/wave/grib2_gfswave.wc_10m.f132
new file mode 100644
index 00000000000..783bfaf0e26
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f132
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 132 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 132 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 132 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 132 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f138 b/parm/wave/grib2_gfswave.wc_10m.f138
new file mode 100644
index 00000000000..292160e70fb
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f138
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCS88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCS88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCS88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCS88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCS88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCS88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCS88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCS88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 138 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCS88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCS88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 138 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCS88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 138 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCS88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 138 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f144 b/parm/wave/grib2_gfswave.wc_10m.f144
new file mode 100644
index 00000000000..ccfd82dd78d
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f144
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 144 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 144 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 144 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 144 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f150 b/parm/wave/grib2_gfswave.wc_10m.f150
new file mode 100644
index 00000000000..8a4891b48b5
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f150
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCT88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCT88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCT88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCT88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCT88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCT88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCT88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCT88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 150 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCT88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCT88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 150 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCT88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 150 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCT88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 150 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f156 b/parm/wave/grib2_gfswave.wc_10m.f156
new file mode 100644
index 00000000000..a581cbe253c
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f156
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 156 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 156 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 156 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 156 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f162 b/parm/wave/grib2_gfswave.wc_10m.f162
new file mode 100644
index 00000000000..c54e1289dd6
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f162
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCU88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCU88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCU88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCU88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCU88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCU88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCU88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCU88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 162 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCU88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCU88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 162 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCU88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 162 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCU88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 162 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f168 b/parm/wave/grib2_gfswave.wc_10m.f168
new file mode 100644
index 00000000000..6bd248c568c
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f168
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 168 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 168 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 168 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 168 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f174 b/parm/wave/grib2_gfswave.wc_10m.f174
new file mode 100644
index 00000000000..bd1894388cb
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f174
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCV88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCV88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCV88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCV88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCV88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCV88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCV88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCV88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 174 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCV88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCV88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 174 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCV88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 174 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCV88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 174 241 0 2 255 0 0 /
diff --git a/parm/wave/grib2_gfswave.wc_10m.f180 b/parm/wave/grib2_gfswave.wc_10m.f180
new file mode 100644
index 00000000000..4c8cb145de2
--- /dev/null
+++ b/parm/wave/grib2_gfswave.wc_10m.f180
@@ -0,0 +1,16 @@
+&GRIBIDS DESC=' WIND Surface ',WMOHEAD='EQCW88 KWBJ',PDTN= 0 ,PDT= 2 1 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WDIR Surface ',WMOHEAD='ERCW88 KWBJ',PDTN= 0 ,PDT= 2 0 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' U GRD Surface ',WMOHEAD='EACW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 2 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' V GRD Surface ',WMOHEAD='EBCW88 KWBJ',EXTRACT=.true.,PDTN= 0 ,PDT= 2 3 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' HTSGW Surface ',WMOHEAD='ECCW88 KWBJ',PDTN= 0 ,PDT= 0 3 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' PERPW Surface ',WMOHEAD='EJCW88 KWBJ',PDTN= 0 ,PDT= 0 11 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' DIRPW Surface ',WMOHEAD='EKCW88 KWBJ',PDTN= 0 ,PDT= 0 10 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' WVHGT Surface ',WMOHEAD='ELCW88 KWBJ',PDTN= 0 ,PDT= 0 5 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWELL Order Seq. Of Data ',WMOHEAD='EOCW88 KWBJ',PDTN= 0 ,PDT= 0 8 2 0 11 0 0 1 180 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVPER Surface ',WMOHEAD='EMCW88 KWBJ',PDTN= 0 ,PDT= 0 6 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWPER Order Seq. Of Data ',WMOHEAD='EYCW88 KWBJ',PDTN= 0 ,PDT= 0 9 2 0 11 0 0 1 180 241 0 2 255 0 0 /
+&GRIBIDS DESC=' WVDIR Surface ',WMOHEAD='ENCW88 KWBJ',PDTN= 0 ,PDT= 0 4 2 0 11 0 0 1 180 1 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 1 255 0 0 /
+&GRIBIDS DESC=' SWDIR Order Seq. Of Data ',WMOHEAD='EPCW88 KWBJ',PDTN= 0 ,PDT= 0 7 2 0 11 0 0 1 180 241 0 2 255 0 0 /
diff --git a/parm/wave/wave_gfs.buoys b/parm/wave/wave_gfs.buoys
deleted file mode 100755
index 0d2f4ab11ec..00000000000
--- a/parm/wave/wave_gfs.buoys
+++ /dev/null
@@ -1,614 +0,0 @@
-$
-$ Global output point data file for multi-grid wave model
-$
-$ Key to data in file:
-$
-$ LON Longitude, east positive
-$ LAT Latitude
-$ NAME Output point name C*10, no blanks in name allowed
-$ AH Anemometer height, dummy value for none-data points
-$ TYPE Buoy type indicator, used for plotting and postprocessing
-$ DAT Data point
-$ NBY 'Non-NWS Virtual buoy'
-$ SOURCE Source of data point
-$ ENCAN Environment Canada
-$ GOMOOS Gulf of Maine OOS
-$ IDT Irish Department of Transportation
-$ METFR Meteo France
-$ NCEP Boundary and other data points
-$ NDBC National Data Buoy Center
-$ PRIV Private and incidental data sources
-$ SCRIPPS Scripps
-$ UKMO UK Met Office
-$ PDES Puertos del Estados
-$ SHOM Service Hydrographique et Oceanographique de la Marine
-$ OCNOR Fugro Oceanor
-$ WHOI Woods Hole Oceanographic Institute
-$ SKOREA South Korea
-$ MVEW Ministerie van Verkeer en Waterstaat
-$ CORMP Coastal Ocean Research and Monitoring Program
-$ DIMAR Direccion General Maritima (Columbia)
-$ BP British Petroleum
-$ SCALE Scale indicator for plotting of locations on map
-$ Point will only be plotted if SCALE =< DX in our
-$ GrADS scripts, DX is width of plot in logitude
-$
-$ Notes:
-$
-$ - The '$' at the first position identifies comments for WAVEWATCH III
-$ input.
-$ - The first three data columns are used by the forecats code, the other
-$ are used by postprocessing scripts.
-$
-$ LON LAT NAME AH TYPE SOURCE SCALE
-$ ---------------------------------------------------------
-$
-$ AWIPS Data section (most actual observational sites)
-$ AWIPS code indicated prior and after each AWIPS section
-$
-$AGGA48
-$ Gulf of Alaska (AG) Spectral data (4) near S/SW Alaska Anchorage (8)
- -148.02 56.31 '46001 ' 5.0 DAT NDBC 360
- -154.98 52.70 '46066 ' 5.0 DAT NDBC 360
- -146.83 60.22 '46061 ' 5.0 DAT NDBC 90
- -160.81 53.93 '46075 ' 5.0 DAT NDBC 360
- -148.00 59.50 '46076 ' 5.0 DAT NDBC 360
- -152.45 56.05 '46078 ' 5.0 DAT NDBC 360
- -152.09 59.76 '46106 ' 999 DAT NDBC 75
- -150.00 58.00 '46080 ' 5.0 DAT NDBC 360
- -151.829 59.597 '46108 ' 5.0 DAT NDBC 45
- -160.000 57.700 '46021 ' 999.0 DAT NDBC 45
- -146.805 60.584 '46060 ' 5.0 DAT NDBC 45
- -154.175 57.910 '46077 ' 5.0 DAT NDBC 45
- -152.230 59.050 '46079 ' 4.9 DAT NDBC 45
- -152.233 59.049 '46105 ' 2.0 DAT NDBC 45
- -147.992 59.925 '46107 ' 2.0 DAT NDBC 45
- -165.475 64.473 '46265 ' 2.0 DAT NDBC 45
-$AGGA48
-$
-$AGGA47
-$ Gulf of Alaska (AG) Spectral data (4) near Alaska Panhandle and NBC (7)
- -136.10 50.93 '46004 ' 5.0 DAT ENCAN 360
- -138.85 53.91 '46184 ' 5.0 DAT ENCAN 360
- -143.42 59.69 '46082 ' 5.0 DAT NDBC 360
- -138.00 58.25 '46083 ' 5.0 DAT NDBC 360
- -136.16 56.59 '46084 ' 5.0 DAT NDBC 360
- -142.56 56.85 '46085 ' 5.0 DAT NDBC 360
- -134.28 54.16 '46205 ' 5.0 DAT ENCAN 45
- -132.45 54.38 '46145 ' 5.0 DAT ENCAN 45
- -131.22 51.83 '46147 ' 5.0 DAT ENCAN 90
- -131.10 53.62 '46183 ' 5.0 DAT ENCAN 45
- -129.81 52.42 '46185 ' 5.0 DAT ENCAN 45
- -128.75 51.37 '46204 ' 5.0 DAT ENCAN 45
- -129.92 50.87 '46207 ' 5.0 DAT ENCAN 45
- -132.68 52.52 '46208 ' 5.0 DAT ENCAN 45
- -129.795 52.437 '46138 ' 999.0 DAT NDBC 45
-$AGGA47
-$
-$AGPZ46
-$ Eastern Pacific (PZ) spectral data (4) near Pacific states and SBC (6)
- -130.27 42.60 '46002 ' 5.0 DAT NDBC 360
- -137.48 40.80 '46006 ' 5.0 DAT NDBC 360
- -130.00 37.98 '46059 ' 5.0 DAT NDBC 360
- -120.87 34.88 '46011 ' 5.0 DAT NDBC 15
- -122.88 37.36 '46012 ' 5.0 DAT NDBC 45
- -123.32 38.23 '46013 ' 5.0 DAT NDBC 25
- -123.97 39.22 '46014 ' 5.0 DAT NDBC 45
- -124.54 40.78 '46022 ' 5.0 DAT NDBC 25
- -120.97 34.71 '46023 ' 10.0 DAT NDBC 45
- -122.82 37.75 '46026 ' 5.0 DAT NDBC 25
- -124.38 41.85 '46027 ' 5.0 DAT NDBC 45
- -124.85 42.75 '46015 ' 5.0 DAT NDBC 45
- -119.08 33.75 '46025 ' 5.0 DAT NDBC 45
- -121.89 35.74 '46028 ' 5.0 DAT NDBC 45
- -124.53 40.42 '46030 ' 5.0 DAT NDBC 15
- -122.42 36.75 '46042 ' 5.0 DAT NDBC 45
- -119.53 32.43 '46047 ' 5.0 DAT NDBC 45
- -124.53 44.62 '46050 ' 5.0 DAT NDBC 45
- -119.85 34.24 '46053 ' 5.0 DAT NDBC 45
- -120.45 34.27 '46054 ' 10.0 DAT NDBC 25
- -121.01 35.10 '46062 ' 5.0 DAT NDBC 45
- -120.70 34.27 '46063 ' 5.0 DAT NDBC 45
- -120.20 33.65 '46069 ' 5.0 DAT NDBC 45
- -118.00 32.50 '46086 ' 5.0 DAT NDBC 45
- -125.77 45.88 '46089 ' 5.0 DAT NDBC 45
- -124.74 40.29 '46213 ' 999. DAT SCRIPPS 25
- -123.465 37.9403 '46214 ' 999. DAT SCRIPPS 45
- -119.80 34.33 '46216 ' 999. DAT SCRIPPS 15
- -119.43 34.17 '46217 ' 999. DAT SCRIPPS 15
- -120.78 34.45 '46218 ' 999. DAT SCRIPPS 25
- -119.88 33.22 '46219 ' 999. DAT SCRIPPS 45
- -118.641 33.8599 '46221 ' 999. DAT SCRIPPS 15
- -118.32 33.62 '46222 ' 999. DAT SCRIPPS 15
- -117.77 33.46 '46223 ' 999. DAT SCRIPPS 15
- -117.47 33.18 '46224 ' 999. DAT SCRIPPS 15
- -117.39 32.93 '46225 ' 999. DAT SCRIPPS 15
- -117.44 32.63 '46227 ' 999. DAT SCRIPPS 15
- -124.55 43.77 '46229 ' 999. DAT SCRIPPS 25
- -117.37 32.75 '46231 ' 999. DAT SCRIPPS 15
- -117.425 32.517 '46232 ' 999. DAT SCRIPPS 15
- -120.86 35.20 '46215 ' 999. DAT SCRIPPS 45
- -121.95 36.76 '46236 ' 999. DAT SCRIPPS 15
- -122.634 37.787 '46237 ' 999. DAT SCRIPPS 15
- -119.47 33.40 '46238 ' 999. DAT SCRIPPS 15
- -122.10 36.34 '46239 ' 999. DAT SCRIPPS 15
- -121.91 36.62 '46240 ' 999. DAT SCRIPPS 15
- -124.13 46.22 '46243 ' 999. DAT SCRIPPS 45
- -124.36 40.89 '46244 ' 999. DAT SCRIPPS 45
- -145.20 50.033 '46246 ' 999. DAT SCRIPPS 45
- -124.644 46.133 '46248 ' 999. DAT SCRIPPS 45
- -119.200 33.000 '46024 ' 10.0 DAT NDBC 45
- -121.899 36.835 '46091 ' 4.0 DAT NDBC 45
- -122.030 36.750 '46092 ' 4.0 DAT NDBC 45
- -122.410 36.690 '46093 ' 4.0 DAT NDBC 45
- -124.300 44.642 '46094 ' 3.0 DAT NDBC 45
- -124.304 44.639 '46097 ' 4.5 DAT NDBC 45
- -124.956 44.381 '46098 ' 4.5 DAT NDBC 45
- -122.33 36.685 '46114 ' 999.0 DAT NDBC 45
- -124.313 40.753 '46212 ' 999.0 DAT NDBC 45
- -117.353 32.848 '46226 ' 999.0 DAT NDBC 45
- -117.320 32.936 '46233 ' 3.0 DAT NDBC 45
- -117.167 32.572 '46235 ' 999.0 DAT NDBC 45
- -117.439 33.220 '46242 ' 999.0 DAT NDBC 45
- -122.833 37.753 '46247 ' 999.0 DAT NDBC 45
- -119.708 33.821 '46249 ' 999.0 DAT NDBC 45
- -119.090 34.034 '46250 ' 999.0 DAT NDBC 45
- -119.564 33.769 '46251 ' 999.0 DAT NDBC 45
- -119.257 33.953 '46252 ' 999.0 DAT NDBC 45
- -118.181 33.576 '46253 ' 999.0 DAT NDBC 45
- -117.267 32.868 '46254 ' 999.0 DAT NDBC 45
- -119.651 33.400 '46255 ' 999.0 DAT NDBC 45
- -118.201 33.700 '46256 ' 999.0 DAT NDBC 45
- -120.766 34.439 '46257 ' 999.0 DAT NDBC 45
- -117.500 32.750 '46258 ' 999.0 DAT NDBC 45
- -121.497 34.767 '46259 ' 999.0 DAT NDBC 45
- -119.004 33.704 '46262 ' 999.0 DAT NDBC 45
-$AGPZ46
-$
-$AGPZ47
-$ Eastern Pacific (PZ) spectral data (4) near Alaska Panhandle and NBC (7)
- -131.02 46.05 '46005 ' 5.0 DAT NDBC 360
- -133.94 48.35 '46036 ' 5.0 DAT ENCAN 360
- -127.93 49.74 '46132 ' 5.0 DAT ENCAN 90
- -126.00 48.84 '46206 ' 5.0 DAT ENCAN 45
- -124.51 46.12 '46029 ' 5.0 DAT NDBC 45
- -124.75 47.34 '46041 ' 5.0 DAT NDBC 45
- -124.73 48.49 '46087 ' 5.0 DAT NDBC 45
- -124.24 46.86 '46211 ' 999. DAT SCRIPPS 25
- -123.165 48.334 '46088 ' 5.0 DAT NDBC 45
- -124.127 46.173 '46096 ' 3.0 DAT NDBC 45
- -124.566 46.986 '46099 ' 4.5 DAT NDBC 45
- -124.972 46.851 '46100 ' 4.5 DAT NDBC 45
- -124.950 47.967 '46119 ' 3.7 DAT NDBC 45
- -124.063 46.215 '46127 ' 3.0 DAT NDBC 45
- -126.010 48.844 '46139 ' 999.0 DAT NDBC 45
- -151.700 57.480 '46264 ' 999.0 DAT NDBC 45
-$AGPZ47
-$
-$AGPN48
-$ North Pacific and Behring Sea (PN) spectra (4) near S/SW Alaska Anchorage (8)
- -177.58 57.05 '46035 ' 10.0 DAT NDBC 360
- 175.28 55.00 '46070 ' 5.0 DAT NDBC 360
- -172.03 54.94 '46073 ' 10.0 DAT NDBC 360
- 179.05 51.16 '46071 ' 5.0 DAT NDBC 360
- -171.73 52.25 '46072 ' 5.0 DAT NDBC 360
- -168.000 55.883 '46020 ' 999.0 DAT NDBC 360
-$AGPN48
-$
-$AGHW40
-$ Hawaiian waters (HW) spectra (4) in Pacific Ocean and Pacific Isles (0)
- -162.21 23.43 '51001 ' 5.0 DAT NDBC 360
- -157.78 17.19 '51002 ' 5.0 DAT NDBC 360
- -160.82 19.22 '51003 ' 5.0 DAT NDBC 360
- -152.48 17.52 '51004 ' 5.0 DAT NDBC 360
- -158.12 21.67 '51201 ' 999. DAT SCRIPPS 11
- -157.68 21.42 '51202 ' 999. DAT SCRIPPS 11
- -154.06 23.55 '51000 ' 5.0 DAT NDBC 11
- -153.90 23.56 '51100 ' 5.0 DAT NDBC 11
- -162.06 24.32 '51101 ' 5.0 DAT NDBC 11
- -157.00 20.79 '51203 ' 999. DAT SCRIPPS 11
- -158.12 21.28 '51204 ' 999. DAT SCRIPPS 11
- -156.42 21.02 '51205 ' 999. DAT SCRIPPS 11
- -154.97 19.78 '51206 ' 999. DAT SCRIPPS 11
- -157.75 21.48 '51207 ' 999. DAT SCRIPPS 11
- -153.87 0.02 '51028 ' 5.0 DAT NDBC 11
- -158.303 21.096 '51200 ' 999.0 DAT NDBC 11
- -159.574 22.285 '51208 ' 999. DAT SCRIPPS 11
- -170.5 -14.273 '51209 ' 999.0 DAT NDBC 360
- -157.756 21.477 '51210 ' 999.0 DAT NDBC 11
- 134.670 7.692 '52212 ' 999.0 DAT NDBC 360
- -157.959 21.297 '51211 ' 999.0 DAT NDBC 360
- -158.150 21.323 '51212 ' 999.0 DAT NDBC 360
- -157.003 20.750 '51213 ' 999.0 DAT NDBC 360
-$AGHW40
-$
-$AGPW40
-$ Western Pacific (PW) spectra (4) in Pacific Ocean and Pacific Isles (0)
- 144.79 13.35 '52200 ' 999. DAT SCRIPPS 360
- 126.02 37.23 '22101 ' 999. DAT SKOREA 100
- 125.77 34.80 '22102 ' 999. DAT SKOREA 100
- 127.50 34.00 '22103 ' 999. DAT SKOREA 100
- 128.90 34.77 '22104 ' 999. DAT SKOREA 100
- 130.00 37.53 '22105 ' 999. DAT SKOREA 100
- 171.391 7.038 '52201 ' 999. DAT SCRIPPS 360
- 144.80 13.68 '52202 ' 999. DAT SCRIPPS 360
- 145.66 15.27 '52211 ' 999. DAT SCRIPPS 360
- 133.62 33.19 '21178 ' 999. DAT WMO 360
- 131.11 37.46 '21229 ' 999. DAT WMO 360
- 125.75 36.25 '22108 ' 999. DAT WMO 360
- 126.14 33.79 '22184 ' 999. DAT WMO 360
- 125.43 37.09 '22185 ' 999. DAT WMO 360
- 125.81 35.66 '22186 ' 999. DAT WMO 360
- 127.02 33.13 '22187 ' 999. DAT WMO 360
- 128.23 34.39 '22188 ' 999. DAT WMO 360
- 129.84 35.35 '22189 ' 999. DAT WMO 360
- 129.87 36.91 '22190 ' 999. DAT WMO 360
-$AGPW40
-$
-$AGPS40
-$ South Pacific (PS) in Pacific Ocean and Pacific Isles (0)
- 150.18 -37.29 '55020 ' 999. DAT UNKNOWN 50
- 151.07 -23.31 '55033 ' 999. DAT UNKNOWN 50
- 153.63 -27.49 '55035 ' 999. DAT UNKNOWN 50
- 148.19 -38.60 '55039 ' 999. DAT UNKNOWN 50
-$AGPS40
-$
-$AGGX42
-$ Gulf of Mexico (GX) spectra (4) south from NC and Puerto Rico (2)
- -89.67 25.90 '42001 ' 10.0 DAT NDBC 360
- -94.42 25.17 '42002 ' 10.0 DAT NDBC 360
- -85.94 26.07 '42003 ' 10.0 DAT NDBC 360
- -88.77 30.09 '42007 ' 5.0 DAT NDBC 90
- -95.36 27.91 '42019 ' 5.0 DAT NDBC 90
- -96.70 26.94 '42020 ' 5.0 DAT NDBC 90
- -94.40 29.22 '42035 ' 5.0 DAT NDBC 90
- -84.52 28.50 '42036 ' 5.0 DAT NDBC 90
- -86.02 28.79 '42039 ' 5.0 DAT NDBC 90
- -88.21 29.18 '42040 ' 5.0 DAT NDBC 90
- -90.46 27.50 '42041 ' 5.0 DAT NDBC 90
- -92.55 27.42 '42038 ' 5.0 DAT NDBC 90
- -94.05 22.01 '42055 ' 10.0 DAT NDBC 360
- -84.275 27.348 '42099 ' 999. DAT SCRIPPS 100
- -87.55 30.06 '42012 ' 5.0 DAT NDBC 90
- -88.49 28.19 '42887 ' 48.2 DAT BP 90
- -82.924 27.173 '42013 ' 3.1 DAT NDBC 90
- -82.220 25.254 '42014 ' 2.8 DAT NDBC 90
- -83.306 28.311 '42021 ' 2.8 DAT NDBC 90
- -83.741 27.504 '42022 ' 3.1 DAT NDBC 90
- -83.086 26.010 '42023 ' 3.1 DAT NDBC 90
- -94.899 28.982 '42043 ' 3.4 DAT NDBC 90
- -97.051 26.191 '42044 ' 3.4 DAT NDBC 90
- -96.500 26.217 '42045 ' 3.4 DAT NDBC 90
- -94.037 27.890 '42046 ' 3.4 DAT NDBC 90
- -93.597 27.896 '42047 ' 3.4 DAT NDBC 90
- -88.647 30.042 '42067 ' 5.0 DAT NDBC 90
- -83.650 25.700 '42097 ' 999.0 DAT NDBC 90
- -82.931 27.589 '42098 ' 999.0 DAT NDBC 90
- -90.471 26.672 '42360 ' 3.0 DAT NDBC 90
- -92.490 27.550 '42361 ' 122.0 DAT NDBC 90
- -90.648 27.795 '42362 ' 122.0 DAT NDBC 90
- -89.220 28.160 '42363 ' 122.0 DAT NDBC 90
- -88.090 29.060 '42364 ' 122.0 DAT NDBC 90
- -89.120 28.200 '42365 ' 122.0 DAT NDBC 90
- -90.283 27.207 '42369 ' 60.4 DAT NDBC 90
- -90.536 27.322 '42370 ' 78.7 DAT NDBC 90
- -88.056 28.866 '42374 ' 61.0 DAT NDBC 90
- -88.289 28.521 '42375 ' 61.0 DAT NDBC 90
- -87.944 29.108 '42376 ' 61.0 DAT NDBC 90
- -94.898 26.129 '42390 ' 61.0 DAT NDBC 90
- -90.027 27.196 '42392 ' 100.0 DAT NDBC 90
- -89.240 28.157 '42394 ' 100.0 DAT NDBC 90
- -90.792 26.404 '42395 ' 3.0 DAT NDBC 90
-$AGGX42
-$
-$AGCA42
-$ Caribbean Sea (CA) spectra (4) south from NC and Puerto Rico (2)
- -85.06 19.87 '42056 ' 10.0 DAT NDBC 360
- -81.50 16.83 '42057 ' 10.0 DAT NDBC 360
- -75.06 15.09 '42058 ' 10.0 DAT NDBC 360
- -81.95 24.39 '42080 ' 999. DAT NDBC 45
- -67.50 15.01 '42059 ' 5.0 DAT NDBC 360
- -85.38 -19.62 '32012' 999. DAT WHOI 360
- -63.50 16.50 '42060 ' 5.0 DAT NDBC 360
- -74.681 11.161 '41194 ' 999.0 DAT NDBC 90
- -66.524 17.860 '42085 ' 4.0 DAT NDBC 90
- -80.061 19.699 '42089 ' 3.4 DAT NDBC 90
- -64.763 18.251 '41052 ' 4.0 DAT NDBC 90
- -65.004 18.257 '41051 ' 4.0 DAT NDBC 90
- -65.457 18.260 '41056 ' 4.0 DAT NDBC 90
- -67.280 18.379 '41115 ' 999.0 DAT NDBC 90
- -81.080 30.000 '41117 ' 999.0 DAT NDBC 90
- -81.244 24.535 '42079 ' 999.0 DAT NDBC 90
- -75.042 36.000 '42086 ' 999.0 DAT NDBC 90
- -81.967 24.407 '42095 ' 999.0 DAT NDBC 90
-$AGCA42
-$
-$AGNT42
-$ Western Atlantic (NT) spectra (4) south from NC and Puerto Rico (2)
- -72.66 34.68 '41001 ' 5.0 DAT NDBC 360
- -75.36 32.32 '41002 ' 5.0 DAT NDBC 360
- -79.09 32.50 '41004 ' 5.0 DAT NDBC 360
- -80.87 31.40 '41008 ' 5.0 DAT NDBC 360
- -80.17 28.50 '41009 ' 5.0 DAT NDBC 80
- -78.47 28.95 '41010 ' 5.0 DAT NDBC 80
- -80.60 30.00 '41012 ' 5.0 DAT NDBC 80
- -77.74 33.44 '41013 ' 5.0 DAT NDBC 80
- -75.40 35.01 '41025 ' 5.0 DAT NDBC 80
- -77.28 34.48 '41035 ' 5.0 DAT NDBC 80
- -76.95 34.21 '41036 ' 5.0 DAT NDBC 80
- -65.01 20.99 '41043 ' 5.0 DAT NDBC 90
- -70.99 24.00 '41046 ' 5.0 DAT NDBC 90
- -71.49 27.47 '41047 ' 10.0 DAT NDBC 90
- -69.65 31.98 '41048 ' 10.0 DAT NDBC 90
- -81.292 30.709 '41112 ' 999. DAT SCRIPPS 30
- -80.53 28.40 '41113 ' 999. DAT SCRIPPS 30
- -80.22 27.55 '41114 ' 999. DAT SCRIPPS 30
- -74.84 36.61 '44014 ' 5.0 DAT NDBC 90
- -77.36 33.99 '41037 ' 3.0 DAT CORMP 80
- -77.72 34.14 '41038 ' 3.0 DAT CORMP 80
- -63.00 27.50 '41049 ' 5.0 DAT NDBC 90
- -58.69 21.65 '41044 ' 5.0 DAT NDBC 90
- -77.30 34.48 '41109 ' 3.0 DAT CORMP 80
- -77.71 34.14 '41110 ' 3.0 DAT CORMP 80
- -67.28 18.38 '41111 ' 3.0 DAT CORMP 80
- -66.099 18.474 '41053 ' 5.0 DAT NDBC 80
- -65.157 18.476 '41058 ' 5.0 DAT NDBC 80
- -78.484 33.837 '41024 ' 3.0 DAT NDBC 80
- -78.137 33.302 '41027 ' 3.0 DAT NDBC 80
- -79.624 32.803 '41029 ' 3.0 DAT NDBC 80
- -79.340 32.520 '41030 ' 3.0 DAT NDBC 80
- -80.410 32.279 '41033 ' 3.0 DAT NDBC 80
- -38.000 24.581 '41061 ' 2.7 DAT NDBC 80
- -75.095 35.778 '41062 ' 3.5 DAT NDBC 80
- -75.941 34.782 '41063 ' 3.5 DAT NDBC 80
- -76.949 34.207 '41064 ' 3.0 DAT NDBC 80
- -78.015 33.721 '41108 ' 999.0 DAT NDBC 80
- -76.948 34.210 '41159 ' 999.0 DAT NDBC 80
- -75.714 36.200 '44056 ' 999.0 DAT NDBC 80
-$AGNT42
-$
-$AGNT41
-$ Western Atlantic (NT) spectra (4) NE states north of VA (1)
- -53.62 44.26 '44138 ' 5.0 DAT ENCAN 360
- -66.58 41.11 '44011 ' 5.0 DAT NDBC 360
- -58.00 43.00 '44141 ' 5.0 DAT ENCAN 360
- -64.02 42.50 '44142 ' 5.0 DAT ENCAN 360
- -48.01 46.77 'WRB07 ' 10.0 DAT PRIV 360
- -62.00 42.26 '44137 ' 5.0 DAT ENCAN 360
- -57.08 44.26 '44139 ' 5.0 DAT ENCAN 360
- -51.74 43.75 '44140 ' 5.0 DAT ENCAN 360
- -64.01 42.50 '44150 ' 5.0 DAT ENCAN 360
- -70.43 38.48 '44004 ' 5.0 DAT NDBC 90
- -69.16 43.19 '44005 ' 5.0 DAT NDBC 90
- -69.43 40.50 '44008 ' 5.0 DAT NDBC 90
- -74.70 38.46 '44009 ' 5.0 DAT NDBC 90
- -72.10 40.70 '44017 ' 5.0 DAT NDBC 80
- -69.29 41.26 '44018 ' 5.0 DAT NDBC 80
- -73.17 40.25 '44025 ' 5.0 DAT NDBC 80
- -71.01 41.38 '44070 ' 999. DAT NDBC 60
- -65.93 42.31 '44024 ' 4.0 DAT GOMOOS 80
- -67.31 44.27 '44027 ' 5.0 DAT NDBC 80
- -67.88 43.49 '44037 ' 4.0 DAT GOMOOS 80
- -66.55 43.62 '44038 ' 4.0 DAT GOMOOS 80
- -53.39 46.44 '44251 ' 5.0 DAT ENCAN 80
- -57.35 47.28 '44255 ' 5.0 DAT ENCAN 80
- -75.720 36.915 '44099 ' 999. DAT SCRIPPS 90
- -75.59 36.26 '44100 ' 999. DAT SCRIPPS 90
- -72.60 39.58 '44066 ' 5.0 DAT NDBC 80
- -75.492 36.872 '44093 ' 999. DAT SCRIPPS 80
- -75.33 35.75 '44095 ' 999. DAT SCRIPPS 80
- -75.809 37.023 '44096 ' 999. DAT SCRIPPS 80
- -71.126 40.967 '44097 ' 999. DAT SCRIPPS 80
- -70.17 42.80 '44098 ' 999. DAT SCRIPPS 80
- -70.141 43.525 '44007 ' 5.0 DAT NDBC 80
- -70.651 42.346 '44013 ' 5.0 DAT NDBC 80
- -70.186 41.439 '44020 ' 5.0 DAT NDBC 80
- -70.566 42.523 '44029 ' 4.0 DAT NDBC 80
- -70.428 43.181 '44030 ' 4.0 DAT NDBC 80
- -70.060 43.570 '44031 ' 4.0 DAT NDBC 80
- -69.355 43.716 '44032 ' 4.0 DAT NDBC 80
- -68.998 44.055 '44033 ' 4.0 DAT NDBC 80
- -68.109 44.106 '44034 ' 4.0 DAT NDBC 80
- -72.655 41.138 '44039 ' 3.5 DAT NDBC 80
- -73.580 40.956 '44040 ' 3.5 DAT NDBC 80
- -76.391 39.152 '44043 ' 3.0 DAT NDBC 80
- -75.183 38.883 '44054 ' 999.0 DAT NDBC 80
- -75.256 39.122 '44055 ' 999.0 DAT NDBC 80
- -76.257 37.567 '44058 ' 3.0 DAT NDBC 80
- -72.067 41.263 '44060 ' 3.5 DAT NDBC 80
- -77.036 38.788 '44061 ' 2.0 DAT NDBC 80
- -76.415 38.556 '44062 ' 3.0 DAT NDBC 80
- -76.448 38.963 '44063 ' 3.0 DAT NDBC 80
- -76.087 36.998 '44064 ' 3.0 DAT NDBC 80
- -73.703 40.369 '44065 ' 5.0 DAT NDBC 80
- -76.266 37.201 '44072 ' 3.0 DAT NDBC 80
- -75.334 37.757 '44089 ' 999.0 DAT NDBC 80
- -70.329 41.840 '44090 ' 999.0 DAT NDBC 80
- -73.77 39.77 '44091 ' 999.0 DAT NDBC 80
- -70.632 42.942 '44092 ' 999.0 DAT NDBC 80
- -73.106 40.585 '44094 ' 999.0 DAT NDBC 80
- -63.408 44.500 '44172 ' 999.0 DAT NDBC 360
- -57.341 47.263 '44235 ' 999.0 DAT NDBC 360
- -76.149 37.024 '44087 ' 999.0 DAT NDBC 360
-$AGNT41
-$
-$AGNT43
-$ Western Atlantic (NT) spectra (4) near South America (3)
- -48.13 -27.70 '31201 ' 999. DAT SCRIPPS 180
- -34.567 -8.15 '31052 ' 999. DAT PNBOIA 180
- -43.088 -23.031 '31260 ' 999. DAT PNBOIA 180
- -47.367 -28.5 '31374 ' 999. DAT PNBOIA 180
- -44.933 -25.283 '31051 ' 999. DAT PNBOIA 180
- -51.353 -32.595 '31053 ' 999. DAT PNBOIA 180
- -49.81 -31.52 '31375 ' 999.
DAT WMO 360 -$AGNT43 -$ -$AGXT43 -$ Tropical Belt (XT) spectra (4) near South America (3) - -53.08 14.55 '41040 ' 5.0 DAT NDBC 360 - -46.00 14.53 '41041 ' 5.0 DAT NDBC 360 - -57.90 15.90 '41100 ' 5.0 DAT METFR 360 - -56.20 14.60 '41101 ' 5.0 DAT METFR 360 - -50.949 14.754 '41060 ' 2.7 DAT NDBC 360 - -60.848 11.185 '42087 ' 3.4 DAT NDBC 360 - -60.521 11.301 '42088 ' 3.4 DAT NDBC 360 -$AGXT43 -$ -$AGXT40 -$ Tropical Belt (XT) spectra (4) in Pacific Ocean and Pacific Isles (0) - -125.032 10.051 '43010 ' 3.5 DAT NDBC 360 - -144.668 13.729 '52009 ' 5.0 DAT NDBC 360 -$AGXT40 -$ -$AGET43 -$ Eastern Atlantic (ET) spectra (3) near Europe (3) - -5.00 45.20 '62001 ' 3.0 DAT UKMO 360 - -20.00 41.60 '62002 ' 999. DAT UNKNOWN 360 - -12.40 48.70 '62029 ' 3.0 DAT UKMO 360 - -7.90 51.40 '62023 ' 999. DAT UNKNOWN 360 - -5.60 48.50 '62052 ' 999. DAT METFR 100 - -13.30 51.00 '62081 ' 3.0 DAT UKMO 360 - -11.20 53.13 '62090 ' 4.5 DAT IDT 100 - -5.42 53.47 '62091 ' 4.5 DAT IDT 60 - -10.55 51.22 '62092 ' 4.5 DAT IDT 100 - -9.07 54.67 '62093 ' 4.5 DAT IDT 60 - -6.70 51.69 '62094 ' 4.5 DAT IDT 60 - -15.92 53.06 '62095 ' 4.5 DAT IDT 100 - -2.90 49.90 '62103 ' 14.0 DAT UKMO 360 - -12.36 54.54 '62105 ' 3.0 DAT UKMO 360 - -9.90 57.00 '62106 ' 4.5 DAT UKMO 360 - -6.10 50.10 '62107 ' 14.0 DAT UKMO 360 - -19.50 53.50 '62108 ' 3.0 DAT UKMO 360 - -8.50 47.50 '62163 ' 3.0 DAT UKMO 360 - -4.70 52.30 '62301 ' 3.0 DAT UKMO 25 - -5.10 51.60 '62303 ' 3.0 DAT UKMO 25 - 0.00 50.40 '62305 ' 14.0 DAT UKMO 25 - 2.00 51.40 '62170 ' 999.0 DAT UKMO 25 - -11.40 59.10 '64045 ' 3.0 DAT UKMO 360 - -4.50 60.70 '64046 ' 3.0 DAT UKMO 360 - -23.10 64.05 'TFGSK ' 999. DAT UNKNOWN 60 - -15.20 64.00 'TFHFN ' 999. DAT UNKNOWN 60 - -20.35 63.00 'TFSRT ' 999. DAT UNKNOWN 60 - 7.80 64.30 'LF3F ' 999. DAT UNKNOWN 360 - 1.10 55.30 '62026 ' 999. DAT UNKNOWN 360 - 0.00 57.00 '62109 ' 999. DAT UNKNOWN 25 - 0.40 58.10 '62111 ' 999. DAT UNKNOWN 25 - 1.30 58.70 '62112 ' 999. DAT UNKNOWN 25 - 1.40 57.70 '62116 ' 999. 
DAT UNKNOWN 360 - 0.00 57.90 '62117 ' 999. DAT UNKNOWN 15 - 2.00 57.00 '62119 ' 999. DAT UNKNOWN 25 - 1.40 58.70 '62128 ' 999. DAT UNKNOWN 25 - 2.00 56.40 '62132 ' 999. DAT UNKNOWN 25 - 1.00 57.10 '62133 ' 999. DAT UNKNOWN 15 - 2.10 53.00 '62142 ' 999. DAT PRIV 30 - 1.80 57.70 '62143 ' 999. DAT UNKNOWN 25 - 1.70 53.40 '62144 ' 999. DAT PRIV 45 - 2.80 53.10 '62145 ' 999. DAT PRIV 360 - 1.80 57.00 '62152 ' 999. DAT UNKNOWN 25 - 0.50 57.40 '62162 ' 999. DAT UNKNOWN 25 - 0.50 57.20 '62164 ' 999. DAT PRIV 15 - 1.90 51.10 '62304 ' 14.0 DAT UKMO 25 - 1.70 60.60 '63055 ' 999. DAT UNKNOWN 25 - 1.60 59.50 '63056 ' 999. DAT UNKNOWN 25 - 1.50 59.20 '63057 ' 999. DAT UNKNOWN 360 - 1.10 61.20 '63103 ' 999. DAT UNKNOWN 15 - 1.70 60.80 '63108 ' 999. DAT UNKNOWN 15 - 1.50 59.50 '63110 ' 999. DAT PRIV 15 - 1.00 61.10 '63112 ' 999. DAT PRIV 360 - 1.70 61.00 '63113 ' 999. DAT PRIV 100 - 1.30 61.60 '63115 ' 999. DAT PRIV 25 - 2.30 61.20 'LF3J ' 999. DAT UNKNOWN 25 - 3.70 60.60 'LF4B ' 999. DAT UNKNOWN 360 - 2.20 59.60 'LF4H ' 999. DAT UNKNOWN 25 - 1.90 58.40 'LF4C ' 999. DAT UNKNOWN 25 - 3.20 56.50 'LF5U ' 999. DAT UNKNOWN 60 - 3.28 51.99 'EURO ' 999. DAT MVEW 60 - 3.22 53.22 'K13 ' 999. DAT MVEW 25 - -3.03 43.63 '62024 ' 999. DAT PDES 25 - -7.62 44.07 '62082 ' 999. DAT PDES 25 - -9.40 42.12 '62084 ' 999. DAT PDES 25 - -6.97 36.48 '62085 ' 999. DAT PDES 25 - -15.82 28.18 '13130 ' 999. DAT PDES 25 - -16.58 28.00 '13131 ' 999. DAT PDES 25 - 0.90 57.70 '62118 ' 999. DAT UNKNOWN 15 - 2.10 57.10 '62146 ' 999. DAT UNKNOWN 25 - 6.33 55.00 'BSH01 ' 999. DAT UNKNOWN 60 - 7.89 54.16 'BSH02 ' 999. DAT UNKNOWN 60 - 8.12 54.00 'BSH03 ' 999. DAT UNKNOWN 60 - 6.58 54.00 'BSH04 ' 999. DAT UNKNOWN 60 - 8.22 54.92 'BSH05 ' 999. DAT UNKNOWN 60 -$AGET43 -$ -$AGAC43 -$ Arctic Ocean (AC) spectra (4) non-descript (3) - -25.00 65.69 'TFBLK ' 999. DAT UNKNOWN 60 - -18.20 66.50 'TFGRS ' 999. DAT UNKNOWN 60 - -13.50 65.65 'TFKGR ' 999. DAT UNKNOWN 60 - 7.30 65.30 'LF3N ' 999. 
DAT UNKNOWN 60 - 8.10 66.00 'LF5T ' 999. DAT UNKNOWN 360 - 2.00 66.00 'LDWR ' 999. DAT UNKNOWN 360 - 21.10 71.60 '3FYT ' 999. DAT UNKNOWN 360 - 15.50 73.50 'LFB1 ' 999. DAT OCNOR 360 - 30.00 74.00 'LFB2 ' 999. DAT OCNOR 360 - -9.26 68.48 '64071 ' 999. DAT UNKNOWN 60 - -166.071 70.025 '48012 ' 3.0 DAT NDBC 360 - -169.454 65.011 '48114 ' 999.0 DAT NDBC 360 - -146.040 70.370 '48211 ' 999.0 DAT NDBC 360 - -150.279 70.874 '48212 ' 999.0 DAT NDBC 360 - -164.133 71.502 '48213 ' 999.0 DAT NDBC 360 - -165.248 70.872 '48214 ' 999.0 DAT NDBC 360 - -167.952 71.758 '48216 ' 999.0 DAT NDBC 360 -$AGAC43 -$ -$AGIO45 -$ Indian Ocean (I) spectra (4) non-descript (5) - 72.49 17.02 '23092 ' 999. DAT UNKNOWN 20 - 73.75 15.40 '23093 ' 999. DAT UNKNOWN 120 - 74.50 12.94 '23094 ' 999. DAT UNKNOWN 120 - 80.39 13.19 '23096 ' 999. DAT UNKNOWN 120 - 69.24 15.47 '23097 ' 999. DAT UNKNOWN 360 - 72.51 10.65 '23098 ' 999. DAT UNKNOWN 360 - 90.74 12.14 '23099 ' 999. DAT UNKNOWN 360 - 87.56 18.35 '23100 ' 999. DAT UNKNOWN 120 - 83.27 13.97 '23101 ' 999. DAT UNKNOWN 360 - 87.50 15.00 '23168 ' 999. DAT UNKNOWN 360 - 90.14 18.13 '23169 ' 999. DAT UNKNOWN 360 - 72.66 8.33 '23170 ' 999. DAT UNKNOWN 360 - 72.00 12.50 '23172 ' 999. DAT UNKNOWN 360 - 78.57 8.21 '23173 ' 999. DAT UNKNOWN 120 - 81.53 11.57 '23174 ' 999. DAT UNKNOWN 360 - 116.14 -19.59 '56002 ' 999. DAT UNKNOWN 120 - 115.40 -32.11 '56005 ' 999. DAT UNKNOWN 50 - 114.78 -33.36 '56006 ' 999. DAT UNKNOWN 120 - 114.94 -21.41 '56007 ' 999. DAT UNKNOWN 50 - 22.17 -34.97 'AGULHAS_FA' 10.0 DAT PRIV 360 - 121.90 -34.00 '56010 ' 999. DAT UNKNOWN 50 - 114.10 -21.70 '56012 ' 999. DAT UNKNOWN 50 - 85.00 12.60 '23167 ' 999. DAT UNKNOWN 360 - 70.00 11.02 '23171 ' 999. DAT UNKNOWN 360 - 91.66 10.52 '23451 ' 999. DAT UNKNOWN 120 - 89.04 10.97 '23455 ' 999. DAT UNKNOWN 120 - 86.98 9.99 '23456 ' 999. DAT UNKNOWN 120 - 70.10 5.16 '23491 ' 999. DAT UNKNOWN 120 - 68.08 13.89 '23492 ' 999. DAT UNKNOWN 120 - 66.98 11.12 '23493 ' 999. 
DAT UNKNOWN 120 - 75.00 6.46 '23494 ' 999. DAT UNKNOWN 120 - 68.97 7.13 '23495 ' 999. DAT UNKNOWN 120 -$AGIO45 -$ -$ END of AWIPS Section -$ -$ South America DAT - -77.50 6.26 '32488 ' 999. DAT DIMAR 45 - -77.74 3.52 '32487 ' 999. DAT DIMAR 45 - -72.22 12.35 '41193 ' 999. DAT DIMAR 120 -$ Japanese buoys DAT -$ South Korean buoys DAT - 129.78 36.35 '22106 ' 999. DAT SKOREA 100 - 126.33 33.00 '22107 ' 999. DAT SKOREA 100 -$ Africa DAT - 57.70 -20.45 'MAUR01 ' 999. DAT WMO 360 - 57.75 -20.10 'MAUR02 ' 999. DAT WMO 360 -$ End of multi_1 buoy file -$ - 0.00 0.00 'STOPSTRING' 999. XXX NCEP 0 diff --git a/parm/wave/wave_gfs.buoys b/parm/wave/wave_gfs.buoys new file mode 120000 index 00000000000..6f47adefac7 --- /dev/null +++ b/parm/wave/wave_gfs.buoys @@ -0,0 +1 @@ +wave_gfs.buoys.full \ No newline at end of file diff --git a/scripts/exgdas_atmos_chgres_forenkf.sh b/scripts/exgdas_atmos_chgres_forenkf.sh index afc7cc9f5e1..25d034ef476 100755 --- a/scripts/exgdas_atmos_chgres_forenkf.sh +++ b/scripts/exgdas_atmos_chgres_forenkf.sh @@ -21,11 +21,11 @@ source "$HOMEgfs/ush/preamble.sh" # Directories. 
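The hunk below swaps hard-coded paths for `${VAR:-default}` expansions (e.g. `FIXgsm=${FIXgsm:-$HOMEgfs/fix/am}`, `CDUMP=${CDUMP:-"enkfgdas"}`). A minimal sketch of that idiom, using illustrative (not operational) values:

```shell
#!/usr/bin/env bash
# Editor's sketch, not part of the patch: ${VAR:-default} supplies a
# fallback when VAR is unset/empty; ${VAR:?} aborts instead. The paths
# below are hypothetical stand-ins for the workflow's real defaults.
unset HOMEgfs FIXgsm CDUMP DATA   # make the demo deterministic

HOMEgfs=${HOMEgfs:-/opt/gfs}          # plain default
FIXgsm=${FIXgsm:-${HOMEgfs}/fix/am}   # default derived from another variable
CDUMP=${CDUMP:-"enkfgdas"}            # string default, as in the hunk below
DATA=${DATA:-/tmp/work.$$}
: "${DATA:?}"                         # would abort here if DATA were empty

echo "FIXgsm=${FIXgsm} CDUMP=${CDUMP}"
```

Callers that already export one of these variables override the default; otherwise the script is self-configuring.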
pwd=$(pwd) -export FIXgsm=${FIXgsm:-$HOMEgfs/fix/fix_am} +export FIXgsm=${FIXgsm:-$HOMEgfs/fix/am} # Base variables CDATE=${CDATE:-"2001010100"} -CDUMP=${CDUMP:-"gdas"} +CDUMP=${CDUMP:-"enkfgdas"} GDUMP=${GDUMP:-"gdas"} # Derived base variables @@ -63,24 +63,24 @@ SIGLEVEL=${SIGLEVEL:-${FIXgsm}/global_hyblev.l${LEVS}.txt} # forecast files APREFIX=${APREFIX:-""} -ASUFFIX=${ASUFFIX:-$SUFFIX} +APREFIX_ENS=${APREFIX_ENS:-""} # at full resolution -ATMF03=${ATMF03:-${COMOUT}/${APREFIX}atmf003${ASUFFIX}} -ATMF04=${ATMF04:-${COMOUT}/${APREFIX}atmf004${ASUFFIX}} -ATMF05=${ATMF05:-${COMOUT}/${APREFIX}atmf005${ASUFFIX}} -ATMF06=${ATMF06:-${COMOUT}/${APREFIX}atmf006${ASUFFIX}} -ATMF07=${ATMF07:-${COMOUT}/${APREFIX}atmf007${ASUFFIX}} -ATMF08=${ATMF08:-${COMOUT}/${APREFIX}atmf008${ASUFFIX}} -ATMF09=${ATMF09:-${COMOUT}/${APREFIX}atmf009${ASUFFIX}} +ATMF03=${ATMF03:-${COM_ATMOS_HISTORY}/${APREFIX}atmf003.nc} +ATMF04=${ATMF04:-${COM_ATMOS_HISTORY}/${APREFIX}atmf004.nc} +ATMF05=${ATMF05:-${COM_ATMOS_HISTORY}/${APREFIX}atmf005.nc} +ATMF06=${ATMF06:-${COM_ATMOS_HISTORY}/${APREFIX}atmf006.nc} +ATMF07=${ATMF07:-${COM_ATMOS_HISTORY}/${APREFIX}atmf007.nc} +ATMF08=${ATMF08:-${COM_ATMOS_HISTORY}/${APREFIX}atmf008.nc} +ATMF09=${ATMF09:-${COM_ATMOS_HISTORY}/${APREFIX}atmf009.nc} # at ensemble resolution -ATMF03ENS=${ATMF03ENS:-${COMOUT}/${APREFIX}atmf003.ensres${ASUFFIX}} -ATMF04ENS=${ATMF04ENS:-${COMOUT}/${APREFIX}atmf004.ensres${ASUFFIX}} -ATMF05ENS=${ATMF05ENS:-${COMOUT}/${APREFIX}atmf005.ensres${ASUFFIX}} -ATMF06ENS=${ATMF06ENS:-${COMOUT}/${APREFIX}atmf006.ensres${ASUFFIX}} -ATMF07ENS=${ATMF07ENS:-${COMOUT}/${APREFIX}atmf007.ensres${ASUFFIX}} -ATMF08ENS=${ATMF08ENS:-${COMOUT}/${APREFIX}atmf008.ensres${ASUFFIX}} -ATMF09ENS=${ATMF09ENS:-${COMOUT}/${APREFIX}atmf009.ensres${ASUFFIX}} -ATMFCST_ENSRES=${ATMFCST_ENSRES:-${COMOUT_ENS}/mem001/${APREFIX}atmf006${ASUFFIX}} +ATMF03ENS=${ATMF03ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf003.ensres.nc} 
+ATMF04ENS=${ATMF04ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf004.ensres.nc} +ATMF05ENS=${ATMF05ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf005.ensres.nc} +ATMF06ENS=${ATMF06ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf006.ensres.nc} +ATMF07ENS=${ATMF07ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf007.ensres.nc} +ATMF08ENS=${ATMF08ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf008.ensres.nc} +ATMF09ENS=${ATMF09ENS:-${COM_ATMOS_HISTORY}/${APREFIX}atmf009.ensres.nc} +ATMFCST_ENSRES=${ATMFCST_ENSRES:-${COM_ATMOS_HISTORY_MEM}/${APREFIX_ENS}atmf006.nc} # Set script / GSI control parameters DOHYBVAR=${DOHYBVAR:-"NO"} @@ -102,16 +102,7 @@ fi ################################################################################ ################################################################################ -# Preprocessing -mkdata=NO -if [ ! -d $DATA ]; then - mkdata=YES - mkdir -p $DATA -fi - -cd $DATA || exit 99 -############################################################## # get resolution information LONB_ENKF=${LONB_ENKF:-$($NCLEN $ATMFCST_ENSRES grid_xt)} # get LONB_ENKF LATB_ENKF=${LATB_ENKF:-$($NCLEN $ATMFCST_ENSRES grid_yt)} # get LATB_ENFK @@ -196,7 +187,5 @@ fi ################################################################################ # Postprocessing cd $pwd -[[ $mkdata = "YES" ]] && rm -rf $DATA - exit $err diff --git a/scripts/exgdas_atmos_gempak_gif_ncdc.sh b/scripts/exgdas_atmos_gempak_gif_ncdc.sh index 3671d5511fa..63a7475a0e5 100755 --- a/scripts/exgdas_atmos_gempak_gif_ncdc.sh +++ b/scripts/exgdas_atmos_gempak_gif_ncdc.sh @@ -50,8 +50,6 @@ then $USHgempak/gempak_${RUN}_f${fhr}_gif.sh if [ ! 
-f $USHgempak/gempak_${RUN}_f${fhr}_gif.sh ] ; then echo "WARNING: $USHgempak/gempak_${RUN}_f${fhr}_gif.sh FILE is missing" - msg=" $USHgempak/gempak_${RUN}_f${fhr}_gif.sh file is missing " - postmsg "jlogfile" "$msg" fi fi diff --git a/scripts/exgdas_atmos_gldas.sh b/scripts/exgdas_atmos_gldas.sh new file mode 100755 index 00000000000..ba56e323aa2 --- /dev/null +++ b/scripts/exgdas_atmos_gldas.sh @@ -0,0 +1,332 @@ +#! /usr/bin/env bash + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: exgdas_atmos_gldas.sh +# Script description: Runs the global land analysis +# +################################################################################ + +source "${HOMEgfs:?}/ush/preamble.sh" + +################################# +# Set up UTILITIES +################################# +export FINDDATE=${FINDDATE:-/apps/ops/prod/nco/core/prod_util.v2.0.13/ush/finddate.sh} +export utilexec=${utilexec:-/apps/ops/prod/libs/intel/19.1.3.304/grib_util/1.2.3/bin} +export CNVGRIB=${CNVGRIB:-${utilexec}/cnvgrib} +export WGRIB=${WGRIB:-${utilexec}/wgrib} +export WGRIB2=${WGRIB2:-/apps/ops/prod/libs/intel/19.1.3.304/wgrib2/2.0.7/bin/wgrib2} +export COPYGB=${COPYGB:-${utilexec}/copygb} +export NDATE=${NDATE:-/apps/ops/prod/nco/core/prod_util.v2.0.13/exec/ndate} +export DCOMIN=${DCOMIN:-${DCOMROOT:-"/lfs/h1/ops/prod/dcom"}} +export CPCGAUGE=${CPCGAUGE:-/lfs/h2/emc/global/noscrub/emc.global/dump} +export COMINgdas=${COMINgdas:-${ROTDIR}} +export OFFLINE_GLDAS=${OFFLINE_GLDAS:-"NO"} +export ERRSCRIPT=${ERRSCRIPT:-"eval [[ ${err} = 0 ]]"} + + +################################# +# Set up the running environment +################################# +export USE_CFP=${USE_CFP:-"NO"} +export assim_freq=${assim_freq:-6} +export gldas_spinup_hours=${gldas_spinup_hours:-72} + +# Local date variables +gldas_cdate=${CDATE:?} +gldas_eymd=$(echo "${gldas_cdate}" |cut -c 1-8) +gldas_ecyc=$(echo "${gldas_cdate}" 
|cut -c 9-10) +gldas_sdate=$(${NDATE} -"${gldas_spinup_hours}" "${CDATE}") +gldas_symd=$(echo "${gldas_sdate}" |cut -c 1-8) +gldas_scyc=$(echo "${gldas_sdate}" |cut -c 9-10) + +iau_cdate=${CDATE} +if [[ "${DOIAU:?}" = "YES" ]]; then + IAU_OFFSET=${IAU_OFFSET:-0} + IAUHALH=$((IAU_OFFSET/2)) + iau_cdate=$(${NDATE} -"${IAUHALH}" "${CDATE}") +fi +iau_eymd=$(echo "${iau_cdate}" |cut -c 1-8) +iau_ecyc=$(echo "${iau_cdate}" |cut -c 9-10) +echo "GLDAS runs from ${gldas_sdate} to ${iau_cdate}" + +CASE=${CASE:-C768} +res=$(echo "${CASE}" |cut -c2-5) +JCAP=$((2*res-2)) +nlat=$((2*res)) +nlon=$((4*res)) + +export USHgldas=${USHgldas:?} +export FIXgldas=${FIXgldas:-${HOMEgfs}/fix/gldas} +export topodir=${topodir:-${HOMEgfs}/fix/orog/${CASE}} + +DATA=${DATA:-${pwd}/gldastmp$$} +mkdata=NO +if [[ ! -d "${DATA}" ]]; then + mkdata=YES + mkdir -p "${DATA}" +fi +cd "${DATA}" || exit 1 +export RUNDIR=${DATA} + + +################################# +GDAS=${RUNDIR}/force +mkdir -p "${GDAS}" + +input1=${COMINgdas}/gdas.${gldas_symd}/${gldas_scyc}/atmos/RESTART +input2=${COMINgdas}/gdas.${gldas_eymd}/${gldas_ecyc}/atmos/RESTART +[[ -d ${RUNDIR} ]] && rm -fr "${RUNDIR}/FIX" +[[ -f ${RUNDIR}/LIS ]] && rm -fr "${RUNDIR}/LIS" +[[ -d ${RUNDIR}/input ]] && rm -fr "${RUNDIR}/input" +mkdir -p "${RUNDIR}/input" +ln -fs "${GDAS}" "${RUNDIR}/input/GDAS" +ln -fs "${EXECgldas:?}/gldas_model" "${RUNDIR}/LIS" + +# Set FIXgldas subfolder +ln -fs "${FIXgldas}/frac_grid/FIX_T${JCAP}" "${RUNDIR}/FIX" + +#--------------------------------------------------------------- +### 1) Get gdas 6-tile netcdf restart file and gdas forcing data +#--------------------------------------------------------------- + +"${USHgldas}/gldas_get_data.sh" "${gldas_sdate}" "${gldas_cdate}" +export err=$? 
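The `DATA`/`mkdata` bookkeeping above follows a common ex-script pattern: create the work directory only when it does not already exist, remember that fact, and remove the directory on the way out only in that case. A minimal sketch (directory name illustrative):

```shell
#!/usr/bin/env bash
# Editor's sketch of the mkdata pattern in exgdas_atmos_gldas.sh:
# only delete DATA at the end if this run created it.
DATA=${DATA:-./gldastmp.$$}
mkdata=NO
if [[ ! -d "${DATA}" ]]; then
  mkdata=YES
  mkdir -p "${DATA}"
fi

# ... the job body would cd "${DATA}" and run here ...

if [[ "${mkdata}" = "YES" ]]; then rm -rf "${DATA}"; fi
```

This keeps a caller-supplied, pre-existing work directory intact while still cleaning up scratch space the script made for itself.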
+${ERRSCRIPT} || exit 2 + +#--------------------------------------------------------------- +### 2) Get CPC daily precip and temporally disaggregated +#--------------------------------------------------------------- + +"${USHgldas}/gldas_forcing.sh" "${gldas_symd}" "${gldas_eymd}" +export err=$? +${ERRSCRIPT} || exit 3 + +# spatially disaggregated + +if [[ "${JCAP}" -eq 1534 ]]; then + gds='255 4 3072 1536 89909 0 128 -89909 -117 117 768 0 0 0 0 0 0 0 0 0 255 0 0 0 0 0' +elif [[ "${JCAP}" -eq 766 ]]; then + gds='255 4 1536 768 89821 0 128 -89821 -234 234 384 0 0 0 0 0 0 0 0 0 255 0 0 0 0 0' +elif [[ "${JCAP}" -eq 382 ]]; then + gds='255 4 768 384 89641 0 128 -89641 -469 469 192 0 0 0 0 0 0 0 0 0 255 0 0 0 0 0' +elif [[ "${JCAP}" -eq 190 ]]; then + gds='255 4 384 192 89284 0 128 -89284 -938 938 96 0 0 0 0 0 0 0 0 0 255 0 0 0 0 0' +else + echo "JCAP=${JCAP} not supported, exit" + export err=4 + ${ERRSCRIPT} || exit 4 +fi + +echo "${JCAP}" +echo "${gds}" +ymdpre=$(sh "${FINDDATE}" "${gldas_symd}" d-1) +ymdend=$(sh "${FINDDATE}" "${gldas_eymd}" d-2) +ymd=${ymdpre} + +if [[ "${USE_CFP}" = "YES" ]] ; then + rm -f ./cfile + touch ./cfile +fi + +while [[ "${ymd}" -le "${ymdend}" ]]; do + if [[ "${ymd}" -ne "${ymdpre}" ]]; then + if [[ "${USE_CFP}" = "YES" ]] ; then + echo "${COPYGB} -i3 '-g${gds}' -x ${GDAS}/cpc.${ymd}/precip.gldas.${ymd}00 ${RUNDIR}/cmap.gdas.${ymd}00" >> ./cfile + echo "${COPYGB} -i3 '-g${gds}' -x ${GDAS}/cpc.${ymd}/precip.gldas.${ymd}06 ${RUNDIR}/cmap.gdas.${ymd}06" >> ./cfile + else + ${COPYGB} -i3 -g"${gds}" -x "${GDAS}/cpc.${ymd}/precip.gldas.${ymd}00" "${RUNDIR}/cmap.gdas.${ymd}00" + ${COPYGB} -i3 -g"${gds}" -x "${GDAS}/cpc.${ymd}/precip.gldas.${ymd}06" "${RUNDIR}/cmap.gdas.${ymd}06" + fi + fi + if [[ "${ymd}" -ne "${ymdend}" ]]; then + if [[ "${USE_CFP}" = "YES" ]] ; then + echo "${COPYGB} -i3 '-g${gds}' -x ${GDAS}/cpc.${ymd}/precip.gldas.${ymd}12 ${RUNDIR}/cmap.gdas.${ymd}12" >> ./cfile + echo "${COPYGB} -i3 '-g${gds}' -x
${GDAS}/cpc.${ymd}/precip.gldas.${ymd}18 ${RUNDIR}/cmap.gdas.${ymd}18" >> ./cfile + else + ${COPYGB} -i3 -g"${gds}" -x "${GDAS}/cpc.${ymd}/precip.gldas.${ymd}12" "${RUNDIR}/cmap.gdas.${ymd}12" + ${COPYGB} -i3 -g"${gds}" -x "${GDAS}/cpc.${ymd}/precip.gldas.${ymd}18" "${RUNDIR}/cmap.gdas.${ymd}18" + fi + fi + ymd=$(sh "${FINDDATE}" "${ymd}" d+1) +done + +if [[ "${USE_CFP}" = "YES" ]] ; then + ${APRUN_GLDAS_DATA_PROC:?} ./cfile +fi + +# create configure file +"${USHgldas}/gldas_liscrd.sh" "${gldas_sdate}" "${iau_cdate}" "${JCAP}" +export err=$? +${ERRSCRIPT} || exit 4 + + +#--------------------------------------------------------------- +### 3) Produce initials noah.rst from 6-tile gdas restart files +#--------------------------------------------------------------- +rm -f fort.41 fort.141 fort.11 fort.12 + +# 3a) create gdas2gldas input file + +cat >> fort.141 << EOF + &config + data_dir_input_grid="${input1}" + sfc_files_input_grid="${gldas_symd}.${gldas_scyc}0000.sfcanl_data.tile1.nc","${gldas_symd}.${gldas_scyc}0000.sfcanl_data.tile2.nc","${gldas_symd}.${gldas_scyc}0000.sfcanl_data.tile3.nc","${gldas_symd}.${gldas_scyc}0000.sfcanl_data.tile4.nc","${gldas_symd}.${gldas_scyc}0000.sfcanl_data.tile5.nc","${gldas_symd}.${gldas_scyc}0000.sfcanl_data.tile6.nc" + mosaic_file_input_grid="${CASE}_mosaic.nc" + orog_dir_input_grid="${topodir}/" + orog_files_input_grid="${CASE}_oro_data.tile1.nc","${CASE}_oro_data.tile2.nc","${CASE}_oro_data.tile3.nc","${CASE}_oro_data.tile4.nc","${CASE}_oro_data.tile5.nc","${CASE}_oro_data.tile6.nc" + i_target=${nlon} + j_target=${nlat} + model="${model:?}" + / +EOF +cp fort.141 fort.41 + + +# 3b) Use gdas2gldas to generate nemsio file + +export OMP_NUM_THREADS=1 +export pgm=gdas2gldas +# shellcheck disable=SC1091 +. prep_step +# shellcheck disable= +${APRUN_GAUSSIAN:?} "${EXECgldas}/gdas2gldas" 1>&1 2>&2 +export err=$? 
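The `USE_CFP` branch above defers each `copygb` invocation into a "command file" that an MPMD launcher (`${APRUN_GLDAS_DATA_PROC}`) then runs in parallel, instead of executing the commands serially inline. A minimal sketch of that pattern, with the launcher stubbed by plain `bash` and `echo` standing in for `copygb`:

```shell
#!/usr/bin/env bash
# Editor's sketch of the USE_CFP command-file pattern. In the real job
# each appended line is a COPYGB command and the file is handed to an
# MPMD launcher; here both are stand-ins.
USE_CFP=${USE_CFP:-"YES"}
rm -f ./cfile
touch ./cfile

for hh in 00 06 12 18; do
  if [[ "${USE_CFP}" = "YES" ]]; then
    echo "echo processed hour ${hh}" >> ./cfile   # defer for parallel launch
  else
    echo "processed hour ${hh}"                   # run inline
  fi
done

nlines=$(wc -l < ./cfile)
bash ./cfile > cfile.log      # stand-in for ${APRUN_GLDAS_DATA_PROC} ./cfile
nproc=$(wc -l < cfile.log)
rm -f ./cfile cfile.log
```

The payoff is that independent per-period regrids can be fanned out across ranks rather than serialized in the script.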
+${ERRSCRIPT} || exit 5 + + +# 3c)gldas_rst to generate noah.rst + +sfcanl=sfc.gaussian.nemsio +ln -fs "FIX/lmask_gfs_T${JCAP}.bfsa" fort.11 +ln -fs "${sfcanl}" fort.12 +export pgm=gldas_rst +# shellcheck disable=SC1091 +. prep_step +# shellcheck disable= +"${EXECgldas}/gldas_rst" 1>&1 2>&2 +export err=$? +${ERRSCRIPT} || exit 6 + +mv "${sfcanl}" "${sfcanl}.${gldas_symd}" + + +#--------------------------------------------------------------- +### 4) run noah/noahmp model +#--------------------------------------------------------------- +export pgm=LIS +# shellcheck disable=SC1091 +. prep_step +# shellcheck disable= +${APRUN_GLDAS:?} ./LIS 1>&1 2>&2 +export err=$? +${ERRSCRIPT} || exit 7 + + +#--------------------------------------------------------------- +### 5) using gdas2gldas to generate nemsio file for gldas_eymd +### use gldas_post to replace soil moisture and temperature +### use gldas2gdas to produce 6-tile restart file +#--------------------------------------------------------------- +rm -f fort.41 fort.241 fort.42 + +# 5a) create input file for gdas2gldas + +cat >> fort.241 << EOF + &config + data_dir_input_grid="${input2}" + sfc_files_input_grid="${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile1.nc","${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile2.nc","${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile3.nc","${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile4.nc","${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile5.nc","${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile6.nc" + mosaic_file_input_grid="${CASE}_mosaic.nc" + orog_dir_input_grid="${topodir}/" + orog_files_input_grid="${CASE}_oro_data.tile1.nc","${CASE}_oro_data.tile2.nc","${CASE}_oro_data.tile3.nc","${CASE}_oro_data.tile4.nc","${CASE}_oro_data.tile5.nc","${CASE}_oro_data.tile6.nc" + i_target=${nlon} + j_target=${nlat} + model="${model:?}" + / +EOF +cp fort.241 fort.41 + +# 5b) use gdas2gldas to produce nemsio file + +export OMP_NUM_THREADS=1 +export pgm=gdas2gldas +# shellcheck disable=SC1091 +. 
prep_step +# shellcheck disable= +${APRUN_GAUSSIAN} "${EXECgldas}/gdas2gldas" 1>&1 2>&2 +export err=$? +${ERRSCRIPT} || exit 8 + + +# 5c) use gldas_post to replace soil moisture and temperature + +yyyy=$(echo "${iau_eymd}" | cut -c1-4) +gbin=${RUNDIR}/EXP901/NOAH/${yyyy}/${iau_eymd}/LIS.E901.${iau_eymd}${iau_ecyc}.NOAHgbin +sfcanl=sfc.gaussian.nemsio +rm -rf fort.11 fort.12 +ln -fs "${gbin}" fort.11 +ln -fs "${sfcanl}" fort.12 + +export pgm=gldas_post +# shellcheck disable=SC1091 +. prep_step +# shellcheck disable= +"${EXECgldas}/gldas_post" 1>&1 2>&2 +export err=$? +${ERRSCRIPT} || exit 9 + +cp fort.22 ./gldas.nemsio +mv fort.22 "${sfcanl}.gldas" + + +# 5d) use gldas2gdas to create 6-tile restart tiles + +cat >> fort.42 << EOF + &config + orog_dir_gdas_grid="${topodir}/" + mosaic_file_gdas_grid="${CASE}_mosaic.nc" + / +EOF + +# copy/link gdas netcdf tiles +k=1; while [[ "${k}" -le 6 ]]; do + cp "${input2}/${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile${k}.nc" "./sfc_data.tile${k}.nc" + k=$((k+1)) +done + +# copy soil type +ln -fs "FIX/stype_gfs_T${JCAP}.bfsa" "stype_gfs_T${JCAP}.bfsa" + +export OMP_NUM_THREADS=1 +export pgm=gldas2gdas +# shellcheck disable=SC1091 +. prep_step +# shellcheck disable= +${APRUN_GAUSSIAN} "${EXECgldas}/gldas2gdas" 1>&1 2>&2 +export err=$? +${ERRSCRIPT} || exit 10 + + +# 5e) archive gldas results + +if [[ "${OFFLINE_GLDAS}" = "YES" ]]; then + "${USHgldas}/gldas_archive.sh" "${gldas_symd}" "${gldas_eymd}" + export err=$? 
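Every executable step above follows the same error-handling convention: run the program, capture `err=$?`, then let `${ERRSCRIPT} || exit N` decide whether to continue. A minimal sketch of that convention (the `run_step` helper is hypothetical; note the `\${err}` here is escaped so it is evaluated when `ERRSCRIPT` runs, whereas the script assigns the default via `${ERRSCRIPT:-...}`):

```shell
#!/usr/bin/env bash
# Editor's sketch of the err/ERRSCRIPT idiom used in exgdas_atmos_gldas.sh.
ERRSCRIPT="eval [[ \${err} = 0 ]]"   # succeeds only when err is 0

run_step() {
  "$@"              # run the step
  export err=$?     # capture its status, as the script does
  ${ERRSCRIPT}      # evaluate the check; its status is returned
}

run_step true  && step1=ok || step1=fail
run_step false && step2=ok || step2=fail
echo "step1=${step1} step2=${step2}"   # prints "step1=ok step2=fail"
```

Keeping the check in a variable lets operations swap in a site-specific `err_chk` without editing each step.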
+ ${ERRSCRIPT} || exit 11 +else + k=1; while [[ "${k}" -le 6 ]]; do + mv "${input2}/${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile${k}.nc" "${input2}/${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile${k}.nc_bfgldas" + cp "sfc_data.tile${k}.nc" "${input2}/${iau_eymd}.${iau_ecyc}0000.sfcanl_data.tile${k}.nc" + k=$((k+1)) + done +fi + + +#------------------------------------------------------------------ +# Clean up before leaving +if [[ "${mkdata}" = "YES" ]]; then rm -rf "${DATA}"; fi + +exit "${err}" + diff --git a/scripts/exgdas_atmos_nawips.sh b/scripts/exgdas_atmos_nawips.sh index 4836065aa72..725cb0223f3 100755 --- a/scripts/exgdas_atmos_nawips.sh +++ b/scripts/exgdas_atmos_nawips.sh @@ -13,11 +13,12 @@ source "$HOMEgfs/ush/preamble.sh" "${2}" cd $DATA -RUN=$1 +RUN2=$1 fend=$2 DBN_ALERT_TYPE=$3 +destination=$4 -DATA_RUN=$DATA/$RUN +DATA_RUN=$DATA/$RUN2 mkdir -p $DATA_RUN cd $DATA_RUN @@ -75,24 +76,20 @@ while [ $fhcnt -le $fend ] ; do fhr3=$(printf "%03d" $fhcnt) - GEMGRD=${RUN}_${PDY}${cyc}f${fhr3} + GEMGRD=${RUN2}_${PDY}${cyc}f${fhr3} - if [ $RUN = "gdas_0p25" ]; then - export GRIBIN=$COMIN/${model}.${cycle}.pgrb2.0p25.f${fhr} - if [ ! -f $GRIBIN ] ; then - echo "WARNING: $GRIBIN FILE is missing" - msg=" $GRIBIN file is missing " - postmsg "$jlogfile" "$msg" + if [[ ${RUN2} = "gdas_0p25" ]]; then + export GRIBIN=${COM_ATMOS_GRIB_0p25}/${model}.${cycle}.pgrb2.0p25.f${fhr} + if [[ ! -f ${GRIBIN} ]] ; then + echo "WARNING: ${GRIBIN} FILE is missing" fi - GRIBIN_chk=$COMIN/${model}.${cycle}.pgrb2.0p25.f${fhr}.idx + GRIBIN_chk=${COM_ATMOS_GRIB_0p25}/${model}.${cycle}.pgrb2.0p25.f${fhr}.idx else - export GRIBIN=$COMIN/${model}.${cycle}.pgrb2.1p00.f${fhr} - if [ ! -f $GRIBIN ] ; then - echo "WARNING: $GRIBIN FILE is missing" - msg=" $GRIBIN file is missing " - postmsg "$jlogfile" "$msg" + export GRIBIN=${COM_ATMOS_GRIB_1p00}/${model}.${cycle}.pgrb2.1p00.f${fhr} + if [[ !
-f ${GRIBIN} ]] ; then + echo "WARNING: ${GRIBIN} FILE is missing" fi - GRIBIN_chk=$COMIN/${model}.${cycle}.pgrb2.1p00.f${fhr}.idx + GRIBIN_chk=${COM_ATMOS_GRIB_1p00}/${model}.${cycle}.pgrb2.1p00.f${fhr}.idx fi icnt=1 @@ -102,15 +99,13 @@ while [ $fhcnt -le $fend ] ; do sleep 5 break else - msg="The process is waiting ... ${GRIBIN_chk} file to proceed." - postmsg "${jlogfile}" "$msg" + echo "The process is waiting ... ${GRIBIN_chk} file to proceed." sleep 20 let "icnt=icnt+1" fi if [ $icnt -ge $maxtries ] then - msg="ABORTING: after 1 hour of waiting for ${GRIBIN_chk} file at F$fhr to end." - postmsg "${jlogfile}" "$msg" + echo "ABORTING: after 1 hour of waiting for ${GRIBIN_chk} file at F$fhr to end." export err=7 ; err_chk exit $err fi @@ -141,17 +136,17 @@ EOF export err=$?;err_chk if [ $SENDCOM = "YES" ] ; then - cp $GEMGRD $COMOUT/.$GEMGRD + cp "${GEMGRD}" "${destination}/.${GEMGRD}" export err=$? - if [[ $err -ne 0 ]] ; then - echo " File $GEMGRD does not exist." - exit $err + if [[ ${err} -ne 0 ]] ; then + echo " File ${GEMGRD} does not exist." + exit "${err}" fi - mv $COMOUT/.$GEMGRD $COMOUT/$GEMGRD - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/$GEMGRD + mv "${destination}/.${GEMGRD}" "${destination}/${GEMGRD}" + if [[ ${SENDDBN} = "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${destination}/${GEMGRD}" else echo "##### DBN_ALERT_TYPE is: ${DBN_ALERT_TYPE} #####" fi diff --git a/scripts/exgdas_atmos_post.sh b/scripts/exgdas_atmos_post.sh new file mode 100755 index 00000000000..c49be8b0b89 --- /dev/null +++ b/scripts/exgdas_atmos_post.sh @@ -0,0 +1,335 @@ +#! 
/usr/bin/env bash + +##################################################################### +# echo "-----------------------------------------------------" +# echo " exgdas_nceppost.sh" +# echo " Sep 07 - Chuang - Modified script to run unified post" +# echo " July 14 - Carlis - Changed to 0.25 deg grib2 master file" +# echo " Feb 16 - Lin - Modify to use Vertical Structure" +# echo " Aug 17 - Meng - Modify to use 3-digit forecast hour naming" +# echo " master and flux files" +# echo " Dec 17 - Meng - Link sfc data file to flxfile " +# echo " since fv3gfs does not output sfc files any more." +# echo " Dec 17 - Meng - Add fv3gfs_downstream_nems.sh for pgb processing " +# echo " and remove writing data file to /nwges" +# echo " Jan 18 - Meng - For EE2 standard, move IDRT POSTGPVARS setting" +# echo " from j-job script." +# echo " Feb 18 - Meng - Removed legacy setting for generating grib1 data" +# echo " and reading sigio model outputs." +# echo " Aug 20 - Meng - Remove .ecf extension per EE2 review." +# echo " Sep 20 - Meng - Update clean up files per EE2 review." +# echo " Mar 21 - Meng - Update POSTGRB2TBL default setting." +# echo " Oct 21 - Meng - Remove jlogfile for wcoss2 transition." +# echo " Feb 22 - Lin - Exception handling if anl input not found." +# echo "-----------------------------------------------------" +##################################################################### + +source "${HOMEgfs}/ush/preamble.sh" + +cd "${DATA}" || exit 1 + +export POSTGPSH=${POSTGPSH:-${USHgfs}/gfs_post.sh} +export GFSDOWNSH=${GFSDOWNSH:-${USHgfs}/fv3gfs_downstream_nems.sh} +export GFSDWNSH=${GFSDWNSH:-${USHgfs}/fv3gfs_dwn_nems.sh} +export TRIMRH=${TRIMRH:-${USHgfs}/trim_rh.sh} +export MODICEC=${MODICEC:-${USHgfs}/mod_icec.sh} +export INLINE_POST=${INLINE_POST:-".false."} + +############################################################ +# Define Variables: +# ----------------- +# fhr is the current forecast hour.
+# SLEEP_TIME is the number of seconds to sleep before exiting with error. +# SLEEP_INT is the number of seconds to sleep between restart file checks. +# restart_file is the name of the file to key off of to kick off post. +############################################################ + +export IO=${LONB:-1440} +export JO=${LATB:-721} +# specify default model output format: 3 for sigio and 4 +# for nemsio +export OUTTYP=${OUTTYP:-4} +export PREFIX=${PREFIX:-${RUN}.t${cyc}z.} +export machine=${machine:-WCOSS2} + +########################### +# Specify Output layers +########################### +export POSTGPVARS="KPO=57,PO=1000.,975.,950.,925.,900.,875.,850.,825.,800.,775.,750.,725.,700.,675.,650.,625.,600.,575.,550.,525.,500.,475.,450.,425.,400.,375.,350.,325.,300.,275.,250.,225.,200.,175.,150.,125.,100.,70.,50.,40.,30.,20.,15.,10.,7.,5.,3.,2.,1.,0.7,0.4,0.2,0.1,0.07,0.04,0.02,0.01,rdaod=.true.," + +########################################################## +# Specify variable to directly output pgrb2 files for GDAS/GFS +########################################################## +export IDRT=${IDRT:-0} # IDRT=0 is setting for outputting grib files on lat/lon grid + +############################################################ +# Post Analysis Files before starting the Forecast Post +############################################################ +# Chuang: modify to process analysis when post_times is 00 +stime="$(echo "${post_times}" | cut -c1-3)" +export stime +export loganl="${COM_ATMOS_ANALYSIS}/${PREFIX}atmanl.nc" + +if [[ "${stime}" = "anl" ]]; then + if [[ -f "${loganl}" ]]; then + # add new environmental variables for running new ncep post + # Validation date + + export VDATE=${PDY}${cyc} + + # set outtyp to 1 because we need to run chgres in the post before model start running chgres + # otherwise set to 0, then chgres will not be executed in global_nceppost.sh + + export OUTTYP=${OUTTYP:-4} + + # specify output file name from chgres which is input file name to
nceppost + # if model already runs gfs io, make sure GFSOUT is linked to the gfsio file + # new imported variable for global_nceppost.sh + + export GFSOUT=${RUN}.${cycle}.gfsioanl + + # specify smaller control file for GDAS because GDAS does not + # produce flux file, the default will be /nwprod/parm/gfs_cntrl.parm + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + export POSTGRB2TBL=${POSTGRB2TBL:-${g2tmpl_ROOT}/share/params_grib2_tbl_new} + export PostFlatFile=${PostFlatFile:-${PARMpost}/postxconfig-NT-GFS-ANL.txt} + export CTLFILE=${PARMpost}/postcntrl_gfs_anl.xml + fi + + [[ -f flxfile ]] && rm flxfile ; [[ -f nemsfile ]] && rm nemsfile + + ln -fs "${COM_ATMOS_ANALYSIS}/${PREFIX}atmanl.nc" nemsfile + export NEMSINP=nemsfile + ln -fs "${COM_ATMOS_ANALYSIS}/${PREFIX}sfcanl.nc" flxfile + export FLXINP=flxfile + export PGBOUT=pgbfile + export PGIOUT=pgifile + export PGBOUT2=pgbfile.grib2 + export PGIOUT2=pgifile.grib2.idx + export IGEN="${IGEN_ANL}" + export FILTER=0 + + # specify fhr even for analysis because postgp uses it + # export fhr=00 + + ${POSTGPSH} + export err=$?; err_chk + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + mv "${PGBOUT}" "${PGBOUT2}" + + # Process pgb files + export FH=-1 + export downset=${downset:-1} + ${GFSDOWNSH} + export err=$?; err_chk + fi + + if [[ "${SENDCOM}" = 'YES' ]]; then + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + MASTERANL=${PREFIX}master.grb2anl + ##########XXW According to Boi, the Fortran index should use *if${fhr}; the wgrib index uses .idx + #MASTERANLIDX=${RUN}.${cycle}.master.grb2${fhr3}.idx + MASTERANLIDX=${PREFIX}master.grb2ianl + cp "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERANL}" + ${GRB2INDEX} "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERANLIDX}" + fi + + if [[ "${SENDDBN}" = 'YES' ]]; then + run="$(echo "${RUN}" | tr '[:lower:]' '[:upper:]')" + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_MSC_sfcanl" "${job}" "${COM_ATMOS_ANALYSIS}/${PREFIX}sfcanl.nc" + "${DBNROOT}/bin/dbn_alert" MODEL 
"${run}_SA" "${job}" "${COM_ATMOS_ANALYSIS}/${PREFIX}atmanl.nc" + "${DBNROOT}/bin/dbn_alert" MODEL "GDAS_PGA_GB2" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.anl" + "${DBNROOT}/bin/dbn_alert" MODEL "GDAS_PGA_GB2_WIDX" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.anl.idx" + fi + fi + fi + rm pgbfile.grib2 + else + #### atmanl file not found; fail the job + echo " *** FATAL ERROR: No model anl file output " + export err=9 + err_chk + fi +else ## stime is not anl + SLEEP_LOOP_MAX=$(( SLEEP_TIME / SLEEP_INT )) + + ############################################################ + # Loop Through the Post Forecast Files + ############################################################ + + for fhr in ${post_times}; do + # Enforce decimal math expressions + d_fhr=$((10#${fhr})) + ############################### + # Start Looping for the + # existence of the restart files + ############################### + export pgm="postcheck" + ic=1 + while (( ic <= SLEEP_LOOP_MAX )); do + if [[ -f "${restart_file}${fhr}.txt" ]]; then + break + else + ic=$(( ic + 1 )) + sleep "${SLEEP_INT}" + fi + ############################### + # If we reach this point assume + # fcst job never reached restart + # period and error exit + ############################### + if (( ic == SLEEP_LOOP_MAX )); then + echo " *** FATAL ERROR: No model output for f${fhr} " + export err=9 + err_chk + fi + done + + ############################### + # Link the model history files + # used as input for the post + ############################### + [[ -f flxfile ]] && rm flxfile + [[ -f nemsfile ]] && rm nemsfile + ln -sf "${COM_ATMOS_HISTORY}/${PREFIX}atmf${fhr}.nc" nemsfile + export NEMSINP=nemsfile + ln -sf "${COM_ATMOS_HISTORY}/${PREFIX}sfcf${fhr}.nc" flxfile + export FLXINP=flxfile + + if (( d_fhr > 0 )); then + export IGEN=${IGEN_FCST} + else + export IGEN=${IGEN_ANL} + fi + + # add new environment variables for running new ncep post + # Validation date + + # No shellcheck, NDATE is not a 
typo + # shellcheck disable=SC2153 + VDATE="$(${NDATE} "+${fhr}" "${PDY}${cyc}")" + # shellcheck disable= + export VDATE + + # set to 3 to output lat/lon grid + + export OUTTYP=${OUTTYP:-4} + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + export POSTGRB2TBL="${POSTGRB2TBL:-${g2tmpl_ROOT}/share/params_grib2_tbl_new}" + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS.txt" + if [[ "${RUN}" = gfs ]]; then + export IGEN="${IGEN_GFS}" + if (( d_fhr > 0 )); then export IGEN="${IGEN_FCST}" ; fi + else + export IGEN="${IGEN_GDAS_ANL}" + if (( d_fhr > 0 )); then export IGEN="${IGEN_FCST}" ; fi + fi + if [[ "${RUN}" = gfs ]]; then + if (( d_fhr == 0 )); then + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-F00.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_f00.xml" + else + export CTLFILE="${CTLFILEGFS:-${PARMpost}/postcntrl_gfs.xml}" + fi + else + if (( d_fhr == 0 )); then + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-F00.txt" + export CTLFILE="${CTLFILEGFS:-${PARMpost}/postcntrl_gfs_f00.xml}" + else + export CTLFILE="${CTLFILEGFS:-${PARMpost}/postcntrl_gfs.xml}" + fi + fi + fi + + export FLXIOUT=flxifile + export PGBOUT=pgbfile + export PGIOUT=pgifile + export PGBOUT2=pgbfile.grib2 + export PGIOUT2=pgifile.grib2.idx + export FILTER=0 + export fhr3=${fhr} + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + MASTERFHR=${PREFIX}master.grb2f${fhr} + MASTERFHRIDX=${PREFIX}master.grb2if${fhr} + fi + + if [[ "${INLINE_POST}" = ".false." 
]]; then + ${POSTGPSH} + else + cp "${COM_ATMOS_MASTER}/${MASTERFHR}" "${PGBOUT}" + fi + export err=$?; err_chk + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + mv "${PGBOUT}" "${PGBOUT2}" + fi + + # Process pgb files + export FH=$(( 10#${fhr} + 0 )) + export downset=${downset:-1} + ${GFSDOWNSH} + export err=$?; err_chk + + if [[ "${SENDDBN}" = "YES" ]]; then + run="$(echo "${RUN}" | tr '[:lower:]' '[:upper:]')" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_PGB2_0P25" "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_PGB2_0P25_WIDX" "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.f${fhr}.idx" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_PGB_GB2" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_PGB_GB2_WIDX" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr}.idx" + fi + + + if [[ "${SENDCOM}" = 'YES' ]]; then + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + if [[ "${INLINE_POST}" = ".false." ]]; then + cp "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERFHR}" + fi + ${GRB2INDEX} "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERFHRIDX}" + fi + + # Model generated flux files will be in nemsio after FY17 upgrade + # use post to generate Grib2 flux files + + if (( OUTTYP == 4 )) ; then + export NEMSINP=${COM_ATMOS_HISTORY}/${PREFIX}atmf${fhr}.nc + export FLXINP=${COM_ATMOS_HISTORY}/${PREFIX}sfcf${fhr}.nc + if (( d_fhr == 0 )); then + export PostFlatFile=${PARMpost}/postxconfig-NT-GFS-FLUX-F00.txt + export CTLFILE=${PARMpost}/postcntrl_gfs_flux_f00.xml + else + export PostFlatFile=${PARMpost}/postxconfig-NT-GFS-FLUX.txt + export CTLFILE=${PARMpost}/postcntrl_gfs_flux.xml + fi + export PGBOUT=fluxfile + export FILTER=0 + FLUXFL=${PREFIX}sfluxgrbf${fhr}.grib2 + FLUXFLIDX=${PREFIX}sfluxgrbf${fhr}.grib2.idx + + if [[ "${INLINE_POST}" = ".false." 
]]; then + ${POSTGPSH} + export err=$?; err_chk + mv fluxfile "${COM_ATMOS_MASTER}/${FLUXFL}" + fi + ${WGRIB2} -s "${COM_ATMOS_MASTER}/${FLUXFL}" > "${COM_ATMOS_MASTER}/${FLUXFLIDX}" + fi + + if [[ "${SENDDBN}" = 'YES' ]] && [[ "${RUN}" = 'gdas' ]] && (( d_fhr % 3 == 0 )); then + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_SF" "${job}" "${COM_ATMOS_HISTORY}/${PREFIX}atmf${fhr}.nc" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_BF" "${job}" "${COM_ATMOS_HISTORY}/${PREFIX}sfcf${fhr}.nc" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_SGB_GB2" "${job}" "${COM_ATMOS_MASTER}/${PREFIX}sfluxgrbf${fhr}.grib2" + "${DBNROOT}/bin/dbn_alert" MODEL "${run}_SGB_GB2_WIDX" "${job}" "${COM_ATMOS_MASTER}/${PREFIX}sfluxgrbf${fhr}.grib2.idx" + fi + fi + + [[ -f pgbfile.grib2 ]] && rm pgbfile.grib2 + [[ -f flxfile ]] && rm flxfile + done +fi ## end_if_times + +exit 0 + +################## END OF SCRIPT ####################### diff --git a/scripts/exgdas_atmos_verfozn.sh b/scripts/exgdas_atmos_verfozn.sh new file mode 100755 index 00000000000..aa686284bec --- /dev/null +++ b/scripts/exgdas_atmos_verfozn.sh @@ -0,0 +1,85 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +# exgdas_vrfyozn.sh +# +# This script runs the data extract/validation portion of the Ozone Monitor +# (OznMon) DA package. 
+# +################################################################################ +err=0 + +#------------------------------------------------------------------------------- +# Set environment +# +export RUN_ENVIR=${RUN_ENVIR:-nco} +export NET=${NET:-gfs} +export RUN=${RUN:-gdas} +export envir=${envir:-prod} + +# Other variables +export SATYPE_FILE=${SATYPE_FILE:-$FIXgdas_ozn/gdas_oznmon_satype.txt} +export PDATE=${PDY}${cyc} +export DO_DATA_RPT=${DO_DATA_RPT:-1} +export NCP=${NCP:-/bin/cp} + + +#----------------------------------------------------------------- +# ensure work and TANK dirs exist, verify oznstat is available +# +export OZN_WORK_DIR=${OZN_WORK_DIR:-$(pwd)} + +if [[ ! -d ${OZN_WORK_DIR} ]]; then + mkdir $OZN_WORK_DIR +fi +cd $OZN_WORK_DIR + +if [[ ! -d ${TANKverf_ozn} ]]; then + mkdir -p $TANKverf_ozn +fi + +if [[ -s ${oznstat} ]]; then + echo ${oznstat} is available +fi + + + +data_available=0 + +if [[ -s ${oznstat} ]]; then + data_available=1 + + #------------------------------------------------------------------ + # Copy data files to the local data directory. + # Untar oznstat file. + #------------------------------------------------------------------ + + $NCP $oznstat ./oznstat.$PDATE + + tar -xvf oznstat.$PDATE + rm oznstat.$PDATE + + netcdf=0 + count=$(ls diag* | grep ".nc4" | wc -l) + if [ $count -gt 0 ] ; then + netcdf=1 + for filenc4 in $(ls diag*nc4.gz); do + file=$(echo $filenc4 | cut -d'.' -f1-2).gz + mv $filenc4 $file + done + fi + + export OZNMON_NETCDF=${netcdf} + + ${HOMEoznmon}/ush/ozn_xtrct.sh + err=$? + +else + # oznstat file not found + err=1 +fi + +exit ${err} + diff --git a/scripts/exgdas_atmos_verfrad.sh b/scripts/exgdas_atmos_verfrad.sh new file mode 100755 index 00000000000..5306fbbdbab --- /dev/null +++ b/scripts/exgdas_atmos_verfrad.sh @@ -0,0 +1,212 @@ +#! 
/usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: exgdas_vrfyrad.sh +# Script description: Runs data extract/validation for global radiance diag data +# +# Author: Ed Safford Org: NP23 Date: 2012-01-18 +# +# Abstract: This script runs the data extract/validation portion of the +# RadMon package. +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +################################################################################ + +export VERBOSE=${VERBOSE:-YES} + +export RUN_ENVIR=${RUN_ENVIR:-nco} +export NET=${NET:-gfs} +export RUN=${RUN:-gdas} +export envir=${envir:-prod} + +# Filenames +biascr=${biascr:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.abias} +radstat=${radstat:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.radstat} +satype_file=${satype_file:-${FIXgdas}/gdas_radmon_satype.txt} + +# Other variables +export RAD_AREA=${RAD_AREA:-glb} +export MAKE_CTL=${MAKE_CTL:-1} +export MAKE_DATA=${MAKE_DATA:-1} +export USE_ANL=${USE_ANL:-1} +export PDATE=${PDY}${cyc} +export DO_DIAG_RPT=${DO_DIAG_RPT:-1} +export DO_DATA_RPT=${DO_DATA_RPT:-1} +export NCP=${NCP:-/bin/cp} + +########################################################################### +# ensure TANK dir exists, verify radstat and biascr are available +# +if [[ ! -d ${TANKverf_rad} ]]; then + mkdir -p $TANKverf_rad +fi + +if [[ "$VERBOSE" = "YES" ]]; then + if [[ -s ${radstat} ]]; then + echo ${radstat} is available + fi + if [[ -s ${biascr} ]]; then + echo ${biascr} is available + fi +fi +##################################################################### + +data_available=0 +if [[ -s ${radstat} && -s ${biascr} ]]; then + data_available=1 + + #------------------------------------------------------------------ + # Copy data files to the local data directory. + # Untar radstat file. 
+ #------------------------------------------------------------------ + + $NCP $biascr ./biascr.$PDATE + $NCP $radstat ./radstat.$PDATE + + tar -xvf radstat.$PDATE + rm radstat.$PDATE + + #------------------------------------------------------------------ + # SATYPE is the list of expected satellite/instrument sources + # in the radstat file. It should be stored in the $TANKverf + # directory. If it isn't there then use the $FIXgdas copy. In all + # cases write it back out to the radmon.$PDY directory. Add any + # new sources to the list before writing back out. + #------------------------------------------------------------------ + + radstat_satype=$(ls d*ges* | awk -F_ '{ print $2 "_" $3 }') + if [[ "$VERBOSE" = "YES" ]]; then + echo $radstat_satype + fi + + echo satype_file = $satype_file + + #------------------------------------------------------------------ + # Get previous cycle's date, and look for the satype_file. Using + # the previous cycle will get us the previous day's directory if + # the cycle being processed is 00z. + #------------------------------------------------------------------ + if [[ $cyc = "00" ]]; then + use_tankdir=${TANKverf_radM1} + else + use_tankdir=${TANKverf_rad} + fi + + echo satype_file = $satype_file + export SATYPE=$(cat ${satype_file}) + + + #------------------------------------------------------------- + # Update the SATYPE if any new sat/instrument was + # found in $radstat_satype. Write the SATYPE contents back + # to $TANKverf/radmon.$PDY. + #------------------------------------------------------------- + satype_changes=0 + new_satype=$SATYPE + for type in ${radstat_satype}; do + test=$(echo $SATYPE | grep $type | wc -l) + + if [[ $test -eq 0 ]]; then + if [[ "$VERBOSE" = "YES" ]]; then + echo "Found $type in radstat file but not in SATYPE list. Adding it now." 
+ fi + satype_changes=1 + new_satype="$new_satype $type" + fi + done + + + #------------------------------------------------------------------ + # Rename the diag files and uncompress + #------------------------------------------------------------------ + netcdf=0 + + for type in ${SATYPE}; do + + if [[ netcdf -eq 0 && -e diag_${type}_ges.${PDATE}.nc4.${Z} ]]; then + netcdf=1 + fi + + if [[ $(find . -maxdepth 1 -type f -name "diag_${type}_ges.${PDATE}*.${Z}" | wc -l) -gt 0 ]]; then + mv diag_${type}_ges.${PDATE}*.${Z} ${type}.${Z} + ${UNCOMPRESS} ./${type}.${Z} + else + echo "WARNING: diag_${type}_ges.${PDATE}*.${Z} not available, skipping" + fi + + if [[ $USE_ANL -eq 1 ]]; then + if [[ $(find . -maxdepth 1 -type f -name "diag_${type}_anl.${PDATE}*.${Z}" | wc -l) -gt 0 ]]; then + mv diag_${type}_anl.${PDATE}*.${Z} ${type}_anl.${Z} + ${UNCOMPRESS} ./${type}_anl.${Z} + else + echo "WARNING: diag_${type}_anl.${PDATE}*.${Z} not available, skipping" + fi + fi + done + + export RADMON_NETCDF=$netcdf + + + #------------------------------------------------------------------ + # Run the child scripts. + #------------------------------------------------------------------ + ${USHradmon}/radmon_verf_angle.sh ${PDATE} + rc_angle=$? + + ${USHradmon}/radmon_verf_bcoef.sh ${PDATE} + rc_bcoef=$? + + ${USHradmon}/radmon_verf_bcor.sh "${PDATE}" + rc_bcor=$? + + ${USHradmon}/radmon_verf_time.sh "${PDATE}" + rc_time=$? + + #-------------------------------------- + # optionally run clean_tankdir script + # + if [[ ${CLEAN_TANKVERF:-0} -eq 1 ]]; then + "${USHradmon}/clean_tankdir.sh" glb 60 + rc_clean_tankdir=$? 
+ echo "rc_clean_tankdir = $rc_clean_tankdir" + fi + +fi + + + +##################################################################### +# Postprocessing + +err=0 +if [[ ${data_available} -ne 1 ]]; then + err=1 +elif [[ $rc_angle -ne 0 ]]; then + err=$rc_angle +elif [[ $rc_bcoef -ne 0 ]]; then + err=$rc_bcoef +elif [[ $rc_bcor -ne 0 ]]; then + err=$rc_bcor +elif [[ $rc_time -ne 0 ]]; then + err=$rc_time +fi + +##################################################################### +# Restrict select sensors and satellites +export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} +rlist="saphir" +for rtype in $rlist; do + if compgen -G "$TANKverf_rad/*${rtype}*" > /dev/null; then + ${CHGRP_CMD} "${TANKverf_rad}"/*${rtype}* + fi +done + +exit ${err} + diff --git a/scripts/exgdas_atmos_vminmon.sh b/scripts/exgdas_atmos_vminmon.sh new file mode 100755 index 00000000000..2a22fcb0b67 --- /dev/null +++ b/scripts/exgdas_atmos_vminmon.sh @@ -0,0 +1,113 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: exgdas_vrfminmon.sh +# Script description: Runs data extract/validation for GSI normalization diag data +# +# Author: Ed Safford Org: NP23 Date: 2015-04-10 +# +# Abstract: This script runs the data extract/validation portion of the +# MinMon package. 
+# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +################################################################################ + + +######################################## +# Set environment +######################################## +export RUN_ENVIR=${RUN_ENVIR:-nco} +export NET=${NET:-gfs} +export RUN=${RUN:-gdas} +export envir=${envir:-prod} + +######################################## +# Directories +######################################## +export DATA=${DATA:-$(pwd)} + + +######################################## +# Filenames +######################################## +gsistat=${gsistat:-${COM_ATMOS_ANALYSIS}/gdas.t${cyc}z.gsistat} +export mm_gnormfile=${gnormfile:-${M_FIXgdas}/gdas_minmon_gnorm.txt} +export mm_costfile=${costfile:-${M_FIXgdas}/gdas_minmon_cost.txt} + +######################################## +# Other variables +######################################## +export MINMON_SUFFIX=${MINMON_SUFFIX:-GDAS} +export PDATE=${PDY}${cyc} +export NCP=${NCP:-/bin/cp} +export pgm=exgdas_vrfminmon.sh + +if [[ ! -d ${DATA} ]]; then + mkdir $DATA +fi +cd $DATA + +###################################################################### + +data_available=0 + +if [[ -s ${gsistat} ]]; then + + data_available=1 + + #----------------------------------------------------------------------- + # Copy the $MINMON_SUFFIX.gnorm_data.txt file to the working directory + # It's ok if it doesn't exist; we'll create a new one if needed. + # + # Note: The logic below is to accommodate two different data storage + # methods. Some parallels (and formerly ops) dump all MinMon data for + # a given day in the same directory (if condition). Ops now separates + # data into ${cyc} subdirectories (elif condition). 
+ #----------------------------------------------------------------------- + if [[ -s ${M_TANKverf}/gnorm_data.txt ]]; then + $NCP ${M_TANKverf}/gnorm_data.txt gnorm_data.txt + elif [[ -s ${M_TANKverfM1}/gnorm_data.txt ]]; then + $NCP ${M_TANKverfM1}/gnorm_data.txt gnorm_data.txt + fi + + + #------------------------------------------------------------------ + # Run the child scripts. + #------------------------------------------------------------------ + ${USHminmon}/minmon_xtrct_costs.pl ${MINMON_SUFFIX} ${PDY} ${cyc} ${gsistat} dummy + rc_costs=$? + echo "rc_costs = $rc_costs" + + ${USHminmon}/minmon_xtrct_gnorms.pl ${MINMON_SUFFIX} ${PDY} ${cyc} ${gsistat} dummy + rc_gnorms=$? + echo "rc_gnorms = $rc_gnorms" + + ${USHminmon}/minmon_xtrct_reduct.pl ${MINMON_SUFFIX} ${PDY} ${cyc} ${gsistat} dummy + rc_reduct=$? + echo "rc_reduct = $rc_reduct" + +fi + +##################################################################### +# Postprocessing + +err=0 +if [[ ${data_available} -ne 1 ]]; then + err=1 +elif [[ $rc_costs -ne 0 ]]; then + err=$rc_costs +elif [[ $rc_gnorms -ne 0 ]]; then + err=$rc_gnorms +elif [[ $rc_reduct -ne 0 ]]; then + err=$rc_reduct +fi + +exit ${err} + diff --git a/scripts/exgdas_enkf_earc.sh b/scripts/exgdas_enkf_earc.sh new file mode 100755 index 00000000000..79e43d1a17b --- /dev/null +++ b/scripts/exgdas_enkf_earc.sh @@ -0,0 +1,304 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################## +# Begin JOB SPECIFIC work +############################################## +export n=$((10#${ENSGRP})) +export CDUMP_ENKF="${EUPD_CYC:-"gdas"}" + +export ARCH_LIST="${COM_TOP}/earc${ENSGRP}" + +# ICS are restarts and always lag INC by $assim_freq hours. 
+EARCINC_CYC=${ARCH_CYC} +EARCICS_CYC=$((ARCH_CYC-assim_freq)) +if [ "${EARCICS_CYC}" -lt 0 ]; then + EARCICS_CYC=$((EARCICS_CYC+24)) +fi + +[[ -d ${ARCH_LIST} ]] && rm -rf "${ARCH_LIST}" +mkdir -p "${ARCH_LIST}" +cd "${ARCH_LIST}" || exit 2 + +"${HOMEgfs}/ush/hpssarch_gen.sh" "${RUN}" +status=$? +if [ "${status}" -ne 0 ]; then + echo "${HOMEgfs}/ush/hpssarch_gen.sh ${RUN} failed, ABORT!" + exit "${status}" +fi + +cd "${ROTDIR}" || exit 2 + +source "${HOMEgfs}/ush/file_utils.sh" + +################################################################### +# ENSGRP > 0 archives a group of ensemble members +firstday=$(${NDATE} +24 "${SDATE}") +if (( 10#${ENSGRP} > 0 )) && [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then + +#--set the archiving command and create local directories, if necessary + TARCMD="htar" + if [[ ${LOCALARCH} = "YES" ]]; then + TARCMD="tar" + if [[ ! -d "${ATARDIR}/${PDY}${cyc}" ]]; then mkdir -p "${ATARDIR}/${PDY}${cyc}"; fi + fi + +#--determine when to save ICs for warm start + SAVEWARMICA="NO" + SAVEWARMICB="NO" + mm="${PDY:4:2}" + dd="${PDY:6:2}" + nday=$(( (10#${mm}-1)*30+10#${dd} )) + mod=$((nday % ARCH_WARMICFREQ)) + if [ "${PDY}${cyc}" -eq "${firstday}" ] && [ "${cyc}" -eq "${EARCINC_CYC}" ]; then SAVEWARMICA="YES" ; fi + if [ "${PDY}${cyc}" -eq "${firstday}" ] && [ "${cyc}" -eq "${EARCICS_CYC}" ]; then SAVEWARMICB="YES" ; fi + if [ "${mod}" -eq 0 ] && [ "${cyc}" -eq "${EARCINC_CYC}" ]; then SAVEWARMICA="YES" ; fi + if [ "${mod}" -eq 0 ] && [ "${cyc}" -eq "${EARCICS_CYC}" ]; then SAVEWARMICB="YES" ; fi + + if [ "${EARCICS_CYC}" -eq 18 ]; then + nday1=$((nday+1)) + mod1=$((nday1 % ARCH_WARMICFREQ)) + if [ "${mod1}" -eq 0 ] && [ "${cyc}" -eq "${EARCICS_CYC}" ] ; then SAVEWARMICB="YES" ; fi + if [ "${mod1}" -ne 0 ] && [ "${cyc}" -eq "${EARCICS_CYC}" ] ; then SAVEWARMICB="NO" ; fi + if [ "${PDY}${cyc}" -eq "${SDATE}" ] && [ "${cyc}" -eq "${EARCICS_CYC}" ] ; then SAVEWARMICB="YES" ; fi + fi + + if [ "${PDY}${cyc}" -gt "${SDATE}" ]; then 
# Don't run for first half cycle + + ${TARCMD} -P -cvf "${ATARDIR}/${PDY}${cyc}/${RUN}_grp${ENSGRP}.tar" $(cat "${ARCH_LIST}/${RUN}_grp${n}.txt") + status=$? + if [ "${status}" -ne 0 ] && [ "${PDY}${cyc}" -ge "${firstday}" ]; then + echo "FATAL ERROR: ${TARCMD} ${PDY}${cyc} ${RUN}_grp${ENSGRP}.tar failed" + exit "${status}" + fi + + if [ "${SAVEWARMICA}" = "YES" ] && [ "${cyc}" -eq "${EARCINC_CYC}" ]; then + ${TARCMD} -P -cvf "${ATARDIR}/${PDY}${cyc}/${RUN}_restarta_grp${ENSGRP}.tar" $(cat "${ARCH_LIST}/${RUN}_restarta_grp${n}.txt") + status=$? + if [ "${status}" -ne 0 ]; then + echo "FATAL ERROR: ${TARCMD} ${PDY}${cyc} ${RUN}_restarta_grp${ENSGRP}.tar failed" + exit "${status}" + fi + fi + + if [ "${SAVEWARMICB}" = "YES" ] && [ "${cyc}" -eq "${EARCICS_CYC}" ]; then + ${TARCMD} -P -cvf "${ATARDIR}/${PDY}${cyc}/${RUN}_restartb_grp${ENSGRP}.tar" $(cat "${ARCH_LIST}/${RUN}_restartb_grp${n}.txt") + status=$? + if [ "${status}" -ne 0 ]; then + echo "FATAL ERROR: ${TARCMD} ${PDY}${cyc} ${RUN}_restartb_grp${ENSGRP}.tar failed" + exit "${status}" + fi + fi + + fi # CDATE>SDATE + +fi + + +################################################################### +# ENSGRP 0 archives ensemble means and copy data to online archive +if [ "${ENSGRP}" -eq 0 ]; then + + if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then + + #--set the archiving command and create local directories, if necessary + TARCMD="htar" + HSICMD="hsi" + if [[ ${LOCALARCH} = "YES" ]]; then + TARCMD="tar" + HSICMD="" + if [[ ! -d "${ATARDIR}/${PDY}${cyc}" ]]; then mkdir -p "${ATARDIR}/${PDY}${cyc}"; fi + fi + + set +e + ${TARCMD} -P -cvf "${ATARDIR}/${PDY}${cyc}/${RUN}.tar" $(cat "${ARCH_LIST}/${RUN}.txt") + status=$? 
+ ${HSICMD} chgrp rstprod "${ATARDIR}/${PDY}${cyc}/${RUN}.tar" + ${HSICMD} chmod 640 "${ATARDIR}/${PDY}${cyc}/${RUN}.tar" + if (( status != 0 && ${PDY}${cyc} >= firstday )); then + echo "FATAL ERROR: ${TARCMD} ${PDY}${cyc} ${RUN}.tar failed" + exit "${status}" + fi + set_strict + fi + + #-- Archive online for verification and diagnostics + [[ ! -d ${ARCDIR} ]] && mkdir -p "${ARCDIR}" + cd "${ARCDIR}" || exit 2 + + nb_copy "${COM_ATMOS_ANALYSIS_ENSSTAT}/${RUN}.t${cyc}z.enkfstat" \ + "enkfstat.${RUN}.${PDY}${cyc}" + nb_copy "${COM_ATMOS_ANALYSIS_ENSSTAT}/${RUN}.t${cyc}z.gsistat.ensmean" \ + "gsistat.${RUN}.${PDY}${cyc}.ensmean" +fi + + +if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then + exit 0 +fi + +############################################################### +# ENSGRP 0 also does clean-up +############################################################### +if [[ "${ENSGRP}" -eq 0 ]]; then + function remove_files() { + # TODO: move this to a new location + local directory=$1 + shift + if [[ ! 
-d ${directory} ]]; then + echo "No directory ${directory} to remove files from, skipping" + return + fi + local exclude_list="" + if (($# > 0)); then + exclude_list=$* + fi + local file_list + declare -a file_list + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + readarray -t file_list < <(find -L "${directory}" -type f) + if (( ${#file_list[@]} == 0 )); then return; fi + for exclude in ${exclude_list}; do + echo "Excluding ${exclude}" + declare -a file_list_old=("${file_list[@]}") + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + readarray -t file_list < <(printf -- '%s\n' "${file_list_old[@]}" | grep -v "${exclude}") + if (( ${#file_list[@]} == 0 )); then return; fi + done + + for file in "${file_list[@]}"; do + rm -f "${file}" + done + # Remove directory if empty + rmdir "${directory}" || true + } + + # Set start and end dates to remove + GDATEEND=$(${NDATE} -"${RMOLDEND_ENKF:-24}" "${PDY}${cyc}") + GDATE=$(${NDATE} -"${RMOLDSTD_ENKF:-120}" "${PDY}${cyc}") + + while [ "${GDATE}" -le "${GDATEEND}" ]; do + + gPDY="${GDATE:0:8}" + gcyc="${GDATE:8:2}" + + if [[ -d ${COM_TOP} ]]; then + rocotolog="${EXPDIR}/logs/${GDATE}.log" + if [[ -f "${rocotolog}" ]]; then + set +e + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + testend=$(tail -n 1 "${rocotolog}" | grep "This cycle is complete: Success") + rc=$? 
+ set_strict + if [ "${rc}" -eq 0 ]; then + case ${CDUMP} in + gdas) nmem=${NMEM_ENKF};; + gfs) nmem=${NMEM_EFCS};; + *) + echo "FATAL ERROR: Unknown CDUMP ${CDUMP} during cleanup" + exit 10 + ;; + esac + + readarray -t memlist < <(seq --format="mem%03g" 1 "${nmem}") + memlist+=("ensstat") + + for mem in "${memlist[@]}"; do + # Atmos + exclude_list="f006.ens" + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + templates=$(compgen -A variable | grep 'COM_ATMOS_.*_TMPL') + for template in ${templates}; do + MEMDIR="${mem}" YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Wave + exclude_list="" + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + templates=$(compgen -A variable | grep 'COM_WAVE_.*_TMPL') + for template in ${templates}; do + MEMDIR="${mem}" YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Ocean + exclude_list="" + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + templates=$(compgen -A variable | grep 'COM_OCEAN_.*_TMPL') + for template in ${templates}; do + MEMDIR="${mem}" YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Ice + exclude_list="" + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + templates=$(compgen -A variable | grep 'COM_ICE_.*_TMPL') + for template in ${templates}; do + MEMDIR="${mem}" YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Aerosols (GOCART) + exclude_list="" + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + templates=$(compgen -A variable | grep 'COM_CHEM_.*_TMPL') + for template 
in ${templates}; do + MEMDIR="${mem}" YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Mediator + exclude_list="" + # Suppress warnings about chained commands suppressing exit codes + # shellcheck disable=SC2312 + templates=$(compgen -A variable | grep 'COM_MED_.*_TMPL') + for template in ${templates}; do + MEMDIR="${mem}" YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + done + fi + fi + fi + + # Remove any empty directories + YMD=${gPDY} HH=${gcyc} generate_com target_dir:COM_TOP_TMPL + target_dir="${ROTDIR:?}/${RUN}.${gPDY}/${gcyc}/" + if [[ -d ${target_dir} ]]; then + find "${target_dir}" -empty -type d -delete + fi + + # Advance to next cycle + GDATE=$(${NDATE} +"${assim_freq}" "${GDATE}") + done +fi + +# Remove enkf*.$rPDY for the older of GDATE or RDATE +GDATE=$(${NDATE} -"${RMOLDSTD_ENKF:-120}" "${PDY}${cyc}") +fhmax=${FHMAX_GFS} +RDATE=$(${NDATE} -"${fhmax}" "${PDY}${cyc}") +if [ "${GDATE}" -lt "${RDATE}" ]; then + RDATE=${GDATE} +fi +rPDY=$(echo "${RDATE}" | cut -c1-8) +clist="enkfgdas enkfgfs" +for ctype in ${clist}; do + COMIN="${ROTDIR}/${ctype}.${rPDY}" + [[ -d ${COMIN} ]] && rm -rf "${COMIN}" +done + +############################################################### + + +exit 0 diff --git a/scripts/exgdas_enkf_ecen.sh b/scripts/exgdas_enkf_ecen.sh index 91e7483be9f..3d54cf001f6 100755 --- a/scripts/exgdas_enkf_ecen.sh +++ b/scripts/exgdas_enkf_ecen.sh @@ -24,6 +24,7 @@ pwd=$(pwd) # Base variables CDATE=${CDATE:-"2010010100"} +CDUMP=${CDUMP:-"gdas"} DONST=${DONST:-"NO"} export CASE=${CASE:-384} ntiles=${ntiles:-6} @@ -31,7 +32,6 @@ ntiles=${ntiles:-6} # Utilities NCP=${NCP:-"/bin/cp -p"} NLN=${NLN:-"/bin/ln -sf"} -NEMSIOGET=${NEMSIOGET:-${NWPROD}/exec/nemsio_get} NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} # Scripts @@ -47,10 +47,9 @@ CALCINCNCEXEC=${CALCINCEXEC:-$HOMEgfs/exec/calc_increment_ens_ncio.x} 
OPREFIX=${OPREFIX:-""} OSUFFIX=${OSUFFIX:-""} APREFIX=${APREFIX:-""} -APREFIX_ENKF=${APREFIX_ENKF:-$APREFIX} -ASUFFIX=${ASUFFIX:-$SUFFIX} +APREFIX_ENS=${APREFIX_ENS:-$APREFIX} GPREFIX=${GPREFIX:-""} -GSUFFIX=${GSUFFIX:-$SUFFIX} +GPREFIX_ENS=${GPREFIX_ENS:-$GPREFIX} # Variables NMEM_ENKF=${NMEM_ENKF:-80} @@ -61,8 +60,11 @@ FHMIN=${FHMIN_ECEN:-3} FHMAX=${FHMAX_ECEN:-9} FHOUT=${FHOUT_ECEN:-3} FHSFC=${FHSFC_ECEN:-$FHMIN} -DO_CALC_INCREMENT=${DO_CALC_INCREMENT:-"NO"} - +if [ $CDUMP = "enkfgfs" ]; then + DO_CALC_INCREMENT=${DO_CALC_INCREMENT_ENKF_GFS:-"NO"} +else + DO_CALC_INCREMENT=${DO_CALC_INCREMENT:-"NO"} +fi # global_chgres stuff CHGRESNEMS=${CHGRESNEMS:-$HOMEgfs/exec/enkf_chgres_recenter.x} @@ -75,8 +77,8 @@ CYCLESH=${CYCLESH:-$HOMEgfs/ush/global_cycle.sh} export CYCLEXEC=${CYCLEXEC:-$HOMEgfs/exec/global_cycle} APRUN_CYCLE=${APRUN_CYCLE:-${APRUN:-""}} NTHREADS_CYCLE=${NTHREADS_CYCLE:-${NTHREADS:-1}} -export FIXfv3=${FIXfv3:-$HOMEgfs/fix/fix_fv3_gmted2010} -export FIXgsm=${FIXgsm:-$HOMEgfs/fix/fix_am} +export FIXfv3=${FIXfv3:-$HOMEgfs/fix/orog} +export FIXgsm=${FIXgsm:-$HOMEgfs/fix/am} export CYCLVARS=${CYCLVARS:-"FSNOL=-2.,FSNOS=99999.,"} export FHOUR=${FHOUR:-0} export DELTSFC=${DELTSFC:-6} @@ -108,32 +110,39 @@ for FHR in $(seq $FHMIN $FHOUT $FHMAX); do for imem in $(seq 1 $NMEM_ENKF); do memchar="mem"$(printf %03i $imem) - $NLN $COMIN_GES_ENS/$memchar/${GPREFIX}atmf00${FHR}${ENKF_SUFFIX}$GSUFFIX ./atmges_$memchar + + MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com -x \ + COM_ATMOS_ANALYSIS_MEM:COM_ATMOS_ANALYSIS_TMPL + + MEMDIR=${memchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -x \ + COM_ATMOS_HISTORY_MEM_PREV:COM_ATMOS_HISTORY_TMPL + + ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX_ENS}atmf00${FHR}${ENKF_SUFFIX}.nc" "./atmges_${memchar}" if [ $DO_CALC_INCREMENT = "YES" ]; then if [ $FHR -eq 6 ]; then - $NLN $COMIN_ENS/$memchar/${APREFIX_ENKF}atmanl$ASUFFIX ./atmanl_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}atmanl.nc" 
"./atmanl_${memchar}" else - $NLN $COMIN_ENS/$memchar/${APREFIX_ENKF}atma00${FHR}$ASUFFIX ./atmanl_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}atma00${FHR}.nc" "./atmanl_${memchar}" fi fi - mkdir -p $COMOUT_ENS/$memchar + mkdir -p "${COM_ATMOS_ANALYSIS_MEM}" if [ $FHR -eq 6 ]; then - $NLN $COMOUT_ENS/$memchar/${APREFIX}atminc.nc ./atminc_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}atminc.nc" "./atminc_${memchar}" else - $NLN $COMOUT_ENS/$memchar/${APREFIX}atmi00${FHR}.nc ./atminc_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}atmi00${FHR}.nc" "./atminc_${memchar}" fi if [[ $RECENTER_ENKF = "YES" ]]; then if [ $DO_CALC_INCREMENT = "YES" ]; then if [ $FHR -eq 6 ]; then - $NLN $COMOUT_ENS/$memchar/${APREFIX}ratmanl$ASUFFIX ./ratmanl_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}ratmanl.nc" "./ratmanl_${memchar}" else - $NLN $COMOUT_ENS/$memchar/${APREFIX}ratma00${FHR}$ASUFFIX ./ratmanl_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}ratma00${FHR}.nc" "./ratmanl_${memchar}" fi else if [ $FHR -eq 6 ]; then - $NLN $COMOUT_ENS/$memchar/${APREFIX}ratminc$ASUFFIX ./ratminc_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}ratminc.nc" "./ratminc_${memchar}" else - $NLN $COMOUT_ENS/$memchar/${APREFIX}ratmi00${FHR}$ASUFFIX ./ratminc_$memchar + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}ratmi00${FHR}.nc" "./ratminc_${memchar}" fi fi fi @@ -142,9 +151,9 @@ done if [ $DO_CALC_INCREMENT = "YES" ]; then # Link ensemble mean analysis if [ $FHR -eq 6 ]; then - $NLN $COMIN_ENS/${APREFIX_ENKF}atmanl.ensmean$ASUFFIX ./atmanl_ensmean + ${NLN} "${COM_ATMOS_ANALYSIS_STAT}/${APREFIX_ENS}atmanl.ensmean.nc" "./atmanl_ensmean" else - $NLN $COMIN_ENS/${APREFIX_ENKF}atma00${FHR}.ensmean$ASUFFIX ./atmanl_ensmean + ${NLN} "${COM_ATMOS_ANALYSIS_STAT}/${APREFIX_ENS}atma00${FHR}.ensmean.nc" "./atmanl_ensmean" fi # Compute ensemble mean analysis @@ -162,9 +171,9 @@ if [ $DO_CALC_INCREMENT = "YES" ]; then else # Link 
ensemble mean increment if [ $FHR -eq 6 ]; then - $NLN $COMIN_ENS/${APREFIX_ENKF}atminc.ensmean$ASUFFIX ./atminc_ensmean + ${NLN} "${COM_ATMOS_ANALYSIS_STAT}/${APREFIX_ENS}atminc.ensmean.nc" "./atminc_ensmean" else - $NLN $COMIN_ENS/${APREFIX_ENKF}atmi00${FHR}.ensmean$ASUFFIX ./atminc_ensmean + ${NLN} "${COM_ATMOS_ANALYSIS_STAT}/${APREFIX_ENS}atmi00${FHR}.ensmean.nc" "./atminc_ensmean" fi # Compute ensemble mean increment @@ -181,8 +190,8 @@ else export err=$?; err_chk # If available, link to ensemble mean guess. Otherwise, compute ensemble mean guess - if [ -s $COMIN_GES_ENS/${GPREFIX}atmf00${FHR}.ensmean$GSUFFIX ]; then - $NLN $COMIN_GES_ENS/${GPREFIX}atmf00${FHR}.ensmean$GSUFFIX ./atmges_ensmean + if [[ -s "${COM_ATMOS_HISTORY_STAT_PREV}/${GPREFIX_ENS}atmf00${FHR}.ensmean.nc" ]]; then + ${NLN} "${COM_ATMOS_HISTORY_STAT_PREV}/${GPREFIX_ENS}atmf00${FHR}.ensmean.nc" "./atmges_ensmean" else DATAPATH="./" ATMGESNAME="atmges" @@ -198,23 +207,16 @@ else fi fi -if [ ${SUFFIX} = ".nc" ]; then - if [ $DO_CALC_INCREMENT = "YES" ]; then - LONB_ENKF=${LONB_ENKF:-$($NCLEN atmanl_ensmean grid_xt)} # get LONB - LATB_ENKF=${LATB_ENKF:-$($NCLEN atmanl_ensmean grid_yt)} # get LATB - LEVS_ENKF=${LEVS_ENKF:-$($NCLEN atmanl_ensmean pfull)} # get LEVS - else - LONB_ENKF=${LONB_ENKF:-$($NCLEN atminc_ensmean lon)} # get LONB - LATB_ENKF=${LATB_ENKF:-$($NCLEN atminc_ensmean lat)} # get LATB - LEVS_ENKF=${LEVS_ENKF:-$($NCLEN atminc_ensmean lev)} # get LEVS - fi - JCAP_ENKF=${JCAP_ENKF:--9999} # there is no jcap in these files +if [ $DO_CALC_INCREMENT = "YES" ]; then + LONB_ENKF=${LONB_ENKF:-$($NCLEN atmanl_ensmean grid_xt)} # get LONB + LATB_ENKF=${LATB_ENKF:-$($NCLEN atmanl_ensmean grid_yt)} # get LATB + LEVS_ENKF=${LEVS_ENKF:-$($NCLEN atmanl_ensmean pfull)} # get LEVS else - LONB_ENKF=${LONB_ENKF:-$($NEMSIOGET atmanl_ensmean dimx | awk '{print $2}')} - LATB_ENKF=${LATB_ENKF:-$($NEMSIOGET atmanl_ensmean dimy | awk '{print $2}')} - LEVS_ENKF=${LEVS_ENKF:-$($NEMSIOGET atmanl_ensmean dimz | 
awk '{print $2}')} - JCAP_ENKF=${JCAP_ENKF:-$($NEMSIOGET atmanl_ensmean jcap | awk '{print $2}')} + LONB_ENKF=${LONB_ENKF:-$($NCLEN atminc_ensmean lon)} # get LONB + LATB_ENKF=${LATB_ENKF:-$($NCLEN atminc_ensmean lat)} # get LATB + LEVS_ENKF=${LEVS_ENKF:-$($NCLEN atminc_ensmean lev)} # get LEVS fi +JCAP_ENKF=${JCAP_ENKF:--9999} # there is no jcap in these files [ $JCAP_ENKF -eq -9999 -a $LATB_ENKF -ne -9999 ] && JCAP_ENKF=$((LATB_ENKF-2)) [ $LONB_ENKF -eq -9999 -o $LATB_ENKF -eq -9999 -o $LEVS_ENKF -eq -9999 -o $JCAP_ENKF -eq -9999 ] && exit -9999 @@ -224,11 +226,11 @@ if [ $RECENTER_ENKF = "YES" ]; then # GSI EnVar analysis if [ $FHR -eq 6 ]; then - ATMANL_GSI=$COMIN/${APREFIX}atmanl$ASUFFIX - ATMANL_GSI_ENSRES=$COMIN/${APREFIX}atmanl.ensres$ASUFFIX + ATMANL_GSI="${COM_ATMOS_ANALYSIS_DET}/${APREFIX}atmanl.nc" + ATMANL_GSI_ENSRES="${COM_ATMOS_ANALYSIS_DET}/${APREFIX}atmanl.ensres.nc" else - ATMANL_GSI=$COMIN/${APREFIX}atma00${FHR}$ASUFFIX - ATMANL_GSI_ENSRES=$COMIN/${APREFIX}atma00${FHR}.ensres$ASUFFIX + ATMANL_GSI="${COM_ATMOS_ANALYSIS_DET}/${APREFIX}atma00${FHR}.nc" + ATMANL_GSI_ENSRES="${COM_ATMOS_ANALYSIS_DET}/${APREFIX}atma00${FHR}.ensres.nc" fi # if we already have a ensemble resolution GSI analysis then just link to it @@ -241,15 +243,9 @@ if [ $RECENTER_ENKF = "YES" ]; then $NLN $ATMANL_GSI atmanl_gsi $NLN $ATMANL_GSI_ENSRES atmanl_gsi_ensres SIGLEVEL=${SIGLEVEL:-${FIXgsm}/global_hyblev.l${LEVS}.txt} - if [ ${SUFFIX} = ".nc" ]; then - $NLN $CHGRESNC chgres.x - chgresnml=chgres_nc_gauss.nml - nmltitle=chgres - else - $NLN $CHGRESNEMS chgres.x - chgresnml=fort.43 - nmltitle=nam - fi + $NLN $CHGRESNC chgres.x + chgresnml=chgres_nc_gauss.nml + nmltitle=chgres export OMP_NUM_THREADS=$NTHREADS_CHGRES @@ -326,18 +322,13 @@ if [ $DO_CALC_INCREMENT = "YES" ]; then fi export OMP_NUM_THREADS=$NTHREADS_CALCINC - if [ ${SUFFIX} = ".nc" ]; then - CALCINCEXEC=$CALCINCNCEXEC - else - CALCINCEXEC=$CALCINCNEMSEXEC - fi + CALCINCEXEC=$CALCINCNCEXEC export pgm=$CALCINCEXEC . 
prep_step $NCP $CALCINCEXEC $DATA - - rm calc_increment.nml + [[ -f calc_increment.nml ]] && rm calc_increment.nml cat > calc_increment.nml << EOF &setup datapath = './' @@ -367,4 +358,4 @@ cd $pwd [[ $mkdata = "YES" ]] && rm -rf $DATA -exit $err +exit ${err} diff --git a/scripts/exgdas_enkf_fcst.sh b/scripts/exgdas_enkf_fcst.sh index cd796818870..fae4dde7f3e 100755 --- a/scripts/exgdas_enkf_fcst.sh +++ b/scripts/exgdas_enkf_fcst.sh @@ -21,9 +21,8 @@ source "$HOMEgfs/ush/preamble.sh" # Directories. -pwd=$(pwd) export FIX_DIR=${FIX_DIR:-$HOMEgfs/fix} -export FIX_AM=${FIX_AM:-$FIX_DIR/fix_am} +export FIX_AM=${FIX_AM:-$FIX_DIR/am} # Utilities export NCP=${NCP:-"/bin/cp -p"} @@ -46,10 +45,6 @@ export FCSTEXEC=${FCSTEXEC:-fv3gfs.x} export PARM_FV3DIAG=${PARM_FV3DIAG:-$HOMEgfs/parm/parm_fv3diag} export DIAG_TABLE=${DIAG_TABLE_ENKF:-${DIAG_TABLE:-$PARM_FV3DIAG/diag_table_da}} -# Cycling and forecast hour specific parameters -export CDATE=${CDATE:-"2001010100"} -export CDUMP=${CDUMP:-"gdas"} - # Re-run failed members, or entire group RERUN_EFCSGRP=${RERUN_EFCSGRP:-"YES"} @@ -60,23 +55,15 @@ export PREFIX_ATMINC=${PREFIX_ATMINC:-""} # Ops related stuff SENDECF=${SENDECF:-"NO"} SENDDBN=${SENDDBN:-"NO"} -GSUFFIX=${GSUFFIX:-$SUFFIX} ################################################################################ # Preprocessing -mkdata=NO -if [ ! 
-d $DATA ]; then - mkdata=YES - mkdir -p $DATA -fi cd $DATA || exit 99 DATATOP=$DATA ################################################################################ # Set output data -cymd=$(echo $CDATE | cut -c1-8) -chh=$(echo $CDATE | cut -c9-10) -EFCSGRP=$COMOUT/efcs.grp${ENSGRP} +EFCSGRP="${COM_TOP}/efcs.grp${ENSGRP}" if [ -f $EFCSGRP ]; then if [ $RERUN_EFCSGRP = "YES" ]; then rm -f $EFCSGRP @@ -108,10 +95,16 @@ export LEVS=${LEVS_ENKF:-${LEVS:-64}} # nggps_diag_nml export FHOUT=${FHOUT_ENKF:-3} - +if [[ ${RUN} == "enkfgfs" ]]; then + export FHOUT=${FHOUT_ENKF_GFS:-${FHOUT_ENKF:-${FHOUT:-3}}} +fi # model_configure export DELTIM=${DELTIM_ENKF:-${DELTIM:-225}} export FHMAX=${FHMAX_ENKF:-9} +if [[ ${RUN} == "enkfgfs" ]]; then + export FHMAX=${FHMAX_ENKF_GFS:-${FHMAX_ENKF:-${FHMAX}}} +fi + export restart_interval=${restart_interval_ENKF:-${restart_interval:-6}} # gfs_physics_nml @@ -132,9 +125,11 @@ if [ $RECENTER_ENKF = "YES" ]; then export PREFIX_ATMINC="r" fi -# APRUN for different executables -export APRUN_FV3=${APRUN_FV3:-${APRUN:-""}} -export NTHREADS_FV3=${NTHREADS_FV3:-${NTHREADS:-1}} +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") +declare -x gPDY="${GDATE:0:8}" +declare -x gcyc="${GDATE:8:2}" ################################################################################ # Run forecast for ensemble member @@ -144,7 +139,7 @@ for imem in $(seq $ENSBEG $ENSEND); do cd $DATATOP cmem=$(printf %03i $imem) - memchar="mem$cmem" + memchar="mem${cmem}" echo "Processing MEMBER: $cmem" @@ -156,12 +151,40 @@ for imem in $(seq $ENSBEG $ENSEND); do [[ $memstat -eq 1 ]] && skip_mem="YES" fi + # Construct COM variables from templates (see config.com) + # Can't make these read-only because we are looping over members + MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_ATMOS_RESTART COM_ATMOS_INPUT COM_ATMOS_ANALYSIS \ + COM_ATMOS_HISTORY COM_ATMOS_MASTER + +
RUN=${rCDUMP} MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL + + if [[ ${DO_WAVE} == "YES" ]]; then + MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_WAVE_RESTART COM_WAVE_PREP COM_WAVE_HISTORY + RUN=${rCDUMP} MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_WAVE_RESTART_PREV:COM_WAVE_RESTART_TMPL + fi + + if [[ ${DO_OCN} == "YES" ]]; then + MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_MED_RESTART COM_OCEAN_RESTART \ + COM_OCEAN_INPUT COM_OCEAN_HISTORY COM_OCEAN_ANALYSIS + RUN=${rCDUMP} MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_OCEAN_RESTART_PREV:COM_OCEAN_RESTART_TMPL + fi + + if [[ ${DO_ICE} == "YES" ]]; then + MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_ICE_HISTORY COM_ICE_INPUT COM_ICE_RESTART + RUN=${rCDUMP} MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_ICE_RESTART_PREV:COM_ICE_RESTART_TMPL + fi + + if [[ ${DO_AERO} == "YES" ]]; then + MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_CHEM_HISTORY + fi + + if [ $skip_mem = "NO" ]; then ra=0 export MEMBER=$imem - export DATA=$DATATOP/$memchar + export DATA="${DATATOP}/${memchar}" if [ -d $DATA ]; then rm -rf $DATA; fi mkdir -p $DATA $FORECASTSH @@ -181,7 +204,7 @@ for imem in $(seq $ENSBEG $ENSEND); do while [ $fhr -le $FHMAX ]; do FH3=$(printf %03i $fhr) if [ $(expr $fhr % 3) -eq 0 ]; then - $DBNROOT/bin/dbn_alert MODEL GFS_ENKF $job $COMOUT/$memchar/${CDUMP}.t${cyc}z.sfcf${FH3}${GSUFFIX} + "${DBNROOT}/bin/dbn_alert" MODEL GFS_ENKF "${job}" "${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.sfcf${FH3}.nc" fi fhr=$((fhr+FHOUT)) done @@ -221,8 +244,5 @@ export err=$rc; err_chk ################################################################################ # Postprocessing -cd $pwd -[[ $mkdata = "YES" ]] && rm -rf $DATATOP - exit $err diff --git a/scripts/exgdas_enkf_post.sh b/scripts/exgdas_enkf_post.sh index 2ef2895d199..0bdea980ecd 100755 --- 
a/scripts/exgdas_enkf_post.sh +++ b/scripts/exgdas_enkf_post.sh @@ -42,23 +42,19 @@ GETSFCENSMEANEXEC=${GETSFCENSMEANEXEC:-$HOMEgfs/exec/getsfcensmeanp.x} # Other variables. PREFIX=${PREFIX:-""} -SUFFIX=${SUFFIX:-""} FHMIN=${FHMIN_EPOS:-3} FHMAX=${FHMAX_EPOS:-9} FHOUT=${FHOUT_EPOS:-3} + +if [[ $CDUMP == "gfs" ]]; then + NMEM_ENKF=${NMEM_EFCS:-${NMEM_ENKF:-30}} +fi NMEM_ENKF=${NMEM_ENKF:-80} SMOOTH_ENKF=${SMOOTH_ENKF:-"NO"} ENKF_SPREAD=${ENKF_SPREAD:-"NO"} ################################################################################ # Preprocessing -mkdata=NO -if [ ! -d $DATA ]; then - mkdata=YES - mkdir -p $DATA -fi -cd $DATA || exit 99 - ENKF_SUFFIX="s" [[ $SMOOTH_ENKF = "NO" ]] && ENKF_SUFFIX="" @@ -72,26 +68,32 @@ export OMP_NUM_THREADS=$NTHREADS_EPOS ################################################################################ # Forecast ensemble member files for imem in $(seq 1 $NMEM_ENKF); do - memchar="mem"$(printf %03i $imem) + memchar="mem"$(printf %03i "${imem}") + MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com -x COM_ATMOS_HISTORY:COM_ATMOS_HISTORY_TMPL + for fhr in $(seq $FHMIN $FHOUT $FHMAX); do fhrchar=$(printf %03i $fhr) - $NLN $COMIN/$memchar/${PREFIX}sfcf$fhrchar${SUFFIX} sfcf${fhrchar}_$memchar - $NLN $COMIN/$memchar/${PREFIX}atmf$fhrchar${SUFFIX} atmf${fhrchar}_$memchar + ${NLN} "${COM_ATMOS_HISTORY}/${PREFIX}sfcf${fhrchar}.nc" "sfcf${fhrchar}_${memchar}" + ${NLN} "${COM_ATMOS_HISTORY}/${PREFIX}atmf${fhrchar}.nc" "atmf${fhrchar}_${memchar}" done done # Forecast ensemble mean and smoothed files +MEMDIR="ensstat" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY_STAT:COM_ATMOS_HISTORY_TMPL +if [[ ! 
-d "${COM_ATMOS_HISTORY_STAT}" ]]; then mkdir -p "${COM_ATMOS_HISTORY_STAT}"; fi + for fhr in $(seq $FHMIN $FHOUT $FHMAX); do fhrchar=$(printf %03i $fhr) - $NLN $COMOUT/${PREFIX}sfcf${fhrchar}.ensmean${SUFFIX} sfcf${fhrchar}.ensmean - $NLN $COMOUT/${PREFIX}atmf${fhrchar}.ensmean${SUFFIX} atmf${fhrchar}.ensmean + ${NLN} "${COM_ATMOS_HISTORY_STAT}/${PREFIX}sfcf${fhrchar}.ensmean.nc" "sfcf${fhrchar}.ensmean" + ${NLN} "${COM_ATMOS_HISTORY_STAT}/${PREFIX}atmf${fhrchar}.ensmean.nc" "atmf${fhrchar}.ensmean" if [ $SMOOTH_ENKF = "YES" ]; then for imem in $(seq 1 $NMEM_ENKF); do - memchar="mem"$(printf %03i $imem) - $NLN $COMOUT/$memchar/${PREFIX}atmf${fhrchar}${ENKF_SUFFIX}${SUFFIX} atmf${fhrchar}${ENKF_SUFFIX}_$memchar + memchar="mem"$(printf %03i "${imem}") + MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_ATMOS_HISTORY + ${NLN} "${COM_ATMOS_HISTORY}/${PREFIX}atmf${fhrchar}${ENKF_SUFFIX}.nc" "atmf${fhrchar}${ENKF_SUFFIX}_${memchar}" done fi - [[ $ENKF_SPREAD = "YES" ]] && $NLN $COMOUT/${PREFIX}atmf${fhrchar}.ensspread${SUFFIX} atmf${fhrchar}.ensspread + [[ $ENKF_SPREAD = "YES" ]] && ${NLN} "${COM_ATMOS_HISTORY_STAT}/${PREFIX}atmf${fhrchar}.ensspread.nc" "atmf${fhrchar}.ensspread" done ################################################################################ @@ -132,7 +134,7 @@ if [ $SMOOTH_ENKF = "YES" ]; then echo WARNING! 
no smoothed ensemble member for fhour = $fhrchar >&2 for imem in $(seq 1 $NMEM_ENKF); do memchar="mem"$(printf %03i $imem) - $NCP atmf${fhrchar}_$memchar atmf${fhrchar}${ENKF_SUFFIX}_$memchar + ${NCP} "atmf${fhrchar}_${memchar}" "atmf${fhrchar}${ENKF_SUFFIX}_${memchar}" done fi done @@ -146,7 +148,7 @@ if [ $SENDDBN = "YES" ]; then fhrchar=$(printf %03i $fhr) if [ $(expr $fhr % 3) -eq 0 ]; then if [ -s ./sfcf${fhrchar}.ensmean ]; then - $DBNROOT/bin/dbn_alert MODEL GFS_ENKF $job $COMOUT/${PREFIX}sfcf${fhrchar}.ensmean${SUFFIX} + ${DBNROOT}/bin/dbn_alert "MODEL" "GFS_ENKF" "${job}" "${COM_ATMOS_HISTORY_STAT}/${PREFIX}sfcf${fhrchar}.ensmean.nc" fi fi done @@ -156,7 +158,5 @@ fi ################################################################################ # Postprocessing cd $pwd -[[ $mkdata = "YES" ]] && rm -rf $DATA - exit $err diff --git a/scripts/exgdas_enkf_select_obs.sh b/scripts/exgdas_enkf_select_obs.sh index 92e1fd8c60d..2ad624bcdb4 100755 --- a/scripts/exgdas_enkf_select_obs.sh +++ b/scripts/exgdas_enkf_select_obs.sh @@ -28,14 +28,9 @@ export NLN=${NLN:-"/bin/ln -sf"} # Scripts. ANALYSISSH=${ANALYSISSH:-$HOMEgfs/scripts/exglobal_atmos_analysis.sh} -# Prefix and Suffix Variables. -export APREFIX=${APREFIX:-""} -export ASUFFIX=${ASUFFIX:-$SUFFIX} - # Select obs export RUN_SELECT=${RUN_SELECT:-"YES"} export USE_SELECT=${USE_SELECT:-"NO"} -export SELECT_OBS=${SELECT_OBS:-$COMOUT/${APREFIX}obsinput} # Observation Operator GSI namelist initialization SETUP_INVOBS=${SETUP_INVOBS:-""} @@ -62,8 +57,6 @@ if [ ! -d $DATA ]; then fi cd $DATA || exit 8 -[[ ! 
-d $COMOUT ]] && mkdir -p $COMOUT - ################################################################################ # ObsInput file from ensemble mean rm -f obs*input* diff --git a/scripts/exgdas_enkf_sfc.sh b/scripts/exgdas_enkf_sfc.sh index 4589a593561..d74306a8b3d 100755 --- a/scripts/exgdas_enkf_sfc.sh +++ b/scripts/exgdas_enkf_sfc.sh @@ -23,7 +23,6 @@ source "$HOMEgfs/ush/preamble.sh" pwd=$(pwd) # Base variables -CDATE=${CDATE:-"2010010100"} DONST=${DONST:-"NO"} DOSFCANL_ENKF=${DOSFCANL_ENKF:-"YES"} export CASE=${CASE:-384} @@ -32,7 +31,6 @@ ntiles=${ntiles:-6} # Utilities NCP=${NCP:-"/bin/cp -p"} NLN=${NLN:-"/bin/ln -sf"} -NEMSIOGET=${NEMSIOGET:-${NWPROD}/exec/nemsio_get} NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} # Scripts @@ -43,10 +41,9 @@ NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} OPREFIX=${OPREFIX:-""} OSUFFIX=${OSUFFIX:-""} APREFIX=${APREFIX:-""} -APREFIX_ENKF=${APREFIX_ENKF:-$APREFIX} -ASUFFIX=${ASUFFIX:-$SUFFIX} +APREFIX_ENS=${APREFIX_ENS:-$APREFIX} GPREFIX=${GPREFIX:-""} -GSUFFIX=${GSUFFIX:-$SUFFIX} +GPREFIX_ENS=${GPREFIX_ENS:-${GPREFIX}} # Variables NMEM_ENKF=${NMEM_ENKF:-80} @@ -57,8 +54,8 @@ CYCLESH=${CYCLESH:-$HOMEgfs/ush/global_cycle.sh} export CYCLEXEC=${CYCLEXEC:-$HOMEgfs/exec/global_cycle} APRUN_CYCLE=${APRUN_CYCLE:-${APRUN:-""}} NTHREADS_CYCLE=${NTHREADS_CYCLE:-${NTHREADS:-1}} -export FIXfv3=${FIXfv3:-$HOMEgfs/fix/fix_fv3_gmted2010} -export FIXgsm=${FIXgsm:-$HOMEgfs/fix/fix_am} +export FIXfv3=${FIXfv3:-$HOMEgfs/fix/orog} +export FIXgsm=${FIXgsm:-$HOMEgfs/fix/am} export CYCLVARS=${CYCLVARS:-"FSNOL=-2.,FSNOS=99999.,"} export FHOUR=${FHOUR:-0} export DELTSFC=${DELTSFC:-6} @@ -80,34 +77,30 @@ cd $DATA || exit 99 ################################################################################ # Update surface fields in the FV3 restart's using global_cycle. 
-PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) - -GDATE=$($NDATE -$assim_freq $CDATE) -gPDY=$(echo $GDATE | cut -c1-8) -gcyc=$(echo $GDATE | cut -c9-10) -GDUMP=${GDUMP:-"gdas"} - -BDATE=$($NDATE -3 $CDATE) -bPDY=$(echo $BDATE | cut -c1-8) -bcyc=$(echo $BDATE | cut -c9-10) +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +BDATE=$(${NDATE} -3 "${PDY}${cyc}") +bPDY=${BDATE:0:8} +bcyc=${BDATE:8:2} # Get dimension information based on CASE -res=$(echo $CASE | cut -c2-) +res=${CASE:1} JCAP_CASE=$((res*2-2)) LATB_CASE=$((res*2)) LONB_CASE=$((res*4)) # Global cycle requires these files export FNTSFA=${FNTSFA:-' '} -export FNACNA=${FNACNA:-$COMIN/${OPREFIX}seaice.5min.blend.grb} -export FNSNOA=${FNSNOA:-$COMIN/${OPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} -[[ ! -f $FNSNOA ]] && export FNSNOA="$COMIN/${OPREFIX}snogrb_t1534.3072.1536" -FNSNOG=${FNSNOG:-$COMIN_GES/${GPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} -[[ ! -f $FNSNOG ]] && FNSNOG="$COMIN_GES/${GPREFIX}snogrb_t1534.3072.1536" +export FNACNA=${FNACNA:-${COM_OBS}/${OPREFIX}seaice.5min.blend.grb} +export FNSNOA=${FNSNOA:-${COM_OBS}/${OPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} +[[ ! -f $FNSNOA ]] && export FNSNOA="${COM_OBS}/${OPREFIX}snogrb_t1534.3072.1536" +FNSNOG=${FNSNOG:-${COM_OBS_PREV}/${GPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} +[[ !
-f $FNSNOG ]] && FNSNOG="${COM_OBS_PREV}/${GPREFIX}snogrb_t1534.3072.1536" # Set CYCLVARS by checking grib date of current snogrb vs that of prev cycle if [ ${RUN_GETGES:-"NO"} = "YES" ]; then + # Ignore possible spelling error (nothing is misspelled) + # shellcheck disable=SC2153 snoprv=$($GETGESSH -q -t snogrb_$JCAP_CASE -e $gesenvir -n $GDUMP -v $GDATE) else snoprv=${snoprv:-$FNSNOG} @@ -123,7 +116,7 @@ else fi if [ $DONST = "YES" ]; then - export NST_FILE=${NST_FILE:-$COMIN/${APREFIX}dtfanl.nc} + export NST_FILE=${NST_FILE:-${COM_ATMOS_ANALYSIS_DET}/${APREFIX}dtfanl.nc} else export NST_FILE="NULL" fi @@ -145,16 +138,26 @@ if [ $DOIAU = "YES" ]; then cmem=$(printf %03i $imem) memchar="mem$cmem" - [[ $TILE_NUM -eq 1 ]] && mkdir -p $COMOUT_ENS/$memchar/RESTART + MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com \ + COM_ATMOS_RESTART_MEM:COM_ATMOS_RESTART_TMPL + + MEMDIR=${memchar} RUN="enkfgdas" YMD=${gPDY} HH=${gcyc} generate_com \ + COM_ATMOS_RESTART_MEM_PREV:COM_ATMOS_RESTART_TMPL + + [[ ${TILE_NUM} -eq 1 ]] && mkdir -p "${COM_ATMOS_RESTART_MEM}" - $NLN $COMIN_GES_ENS/$memchar/RESTART/$bPDY.${bcyc}0000.sfc_data.tile${n}.nc $DATA/fnbgsi.$cmem - $NLN $COMOUT_ENS/$memchar/RESTART/$bPDY.${bcyc}0000.sfcanl_data.tile${n}.nc $DATA/fnbgso.$cmem - $NLN $FIXfv3/$CASE/${CASE}_grid.tile${n}.nc $DATA/fngrid.$cmem - $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile${n}.nc $DATA/fnorog.$cmem + ${NCP} "${COM_ATMOS_RESTART_MEM_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" \ + "${COM_ATMOS_RESTART_MEM}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" + ${NLN} "${COM_ATMOS_RESTART_MEM_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" \ + "${DATA}/fnbgsi.${cmem}" + ${NLN} "${COM_ATMOS_RESTART_MEM}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" \ + "${DATA}/fnbgso.${cmem}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.${cmem}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_oro_data.tile${n}.nc" "${DATA}/fnorog.${cmem}" done - $CYCLESH + CDATE="${PDY}${cyc}" ${CYCLESH} export 
err=$?; err_chk done @@ -162,28 +165,38 @@ if [ $DOIAU = "YES" ]; then fi if [ $DOSFCANL_ENKF = "YES" ]; then - for n in $(seq 1 $ntiles); do + for n in $(seq 1 $ntiles); do - export TILE_NUM=$n + export TILE_NUM=$n - for imem in $(seq 1 $NMEM_ENKF); do + for imem in $(seq 1 $NMEM_ENKF); do - cmem=$(printf %03i $imem) - memchar="mem$cmem" + cmem=$(printf %03i $imem) + memchar="mem$cmem" - [[ $TILE_NUM -eq 1 ]] && mkdir -p $COMOUT_ENS/$memchar/RESTART + MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com \ + COM_ATMOS_RESTART_MEM:COM_ATMOS_RESTART_TMPL - $NLN $COMIN_GES_ENS/$memchar/RESTART/$PDY.${cyc}0000.sfc_data.tile${n}.nc $DATA/fnbgsi.$cmem - $NLN $COMOUT_ENS/$memchar/RESTART/$PDY.${cyc}0000.sfcanl_data.tile${n}.nc $DATA/fnbgso.$cmem - $NLN $FIXfv3/$CASE/${CASE}_grid.tile${n}.nc $DATA/fngrid.$cmem - $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile${n}.nc $DATA/fnorog.$cmem + RUN="${GDUMP_ENS}" MEMDIR=${memchar} YMD=${gPDY} HH=${gcyc} generate_com \ + COM_ATMOS_RESTART_MEM_PREV:COM_ATMOS_RESTART_TMPL - done + [[ ${TILE_NUM} -eq 1 ]] && mkdir -p "${COM_ATMOS_RESTART_MEM}" - $CYCLESH - export err=$?; err_chk + ${NCP} "${COM_ATMOS_RESTART_MEM_PREV}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" \ + "${COM_ATMOS_RESTART_MEM}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" + ${NLN} "${COM_ATMOS_RESTART_MEM_PREV}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" \ + "${DATA}/fnbgsi.${cmem}" + ${NLN} "${COM_ATMOS_RESTART_MEM}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" \ + "${DATA}/fnbgso.${cmem}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.${cmem}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_oro_data.tile${n}.nc" "${DATA}/fnorog.${cmem}" + + done - done + CDATE="${PDY}${cyc}" ${CYCLESH} + export err=$?; err_chk + + done fi ################################################################################ diff --git a/scripts/exgdas_enkf_update.sh b/scripts/exgdas_enkf_update.sh index 422b2e54e2c..dfe00effd96 100755 --- a/scripts/exgdas_enkf_update.sh +++ 
b/scripts/exgdas_enkf_update.sh @@ -25,7 +25,6 @@ pwd=$(pwd) # Utilities NCP=${NCP:-"/bin/cp -p"} NLN=${NLN:-"/bin/ln -sf"} -NEMSIOGET=${NEMSIOGET:-$NWPROD/utils/exec/nemsio_get} NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} USE_CFP=${USE_CFP:-"NO"} CFP_MP=${CFP_MP:-"NO"} @@ -42,12 +41,11 @@ ENKFEXEC=${ENKFEXEC:-$HOMEgfs/exec/enkf.x} # Cycling and forecast hour specific parameters CDATE=${CDATE:-"2001010100"} +CDUMP=${CDUMP:-"gdas"} # Filenames. GPREFIX=${GPREFIX:-""} -GSUFFIX=${GSUFFIX:-$SUFFIX} APREFIX=${APREFIX:-""} -ASUFFIX=${ASUFFIX:-$SUFFIX} SMOOTH_ENKF=${SMOOTH_ENKF:-"YES"} @@ -84,35 +82,30 @@ cnvw_option=${cnvw_option:-".false."} netcdf_diag=${netcdf_diag:-".true."} modelspace_vloc=${modelspace_vloc:-".false."} # if true, 'vlocal_eig.dat' is needed IAUFHRS_ENKF=${IAUFHRS_ENKF:-6} -DO_CALC_INCREMENT=${DO_CALC_INCREMENT:-"NO"} +if [ $CDUMP = "enkfgfs" ]; then + DO_CALC_INCREMENT=${DO_CALC_INCREMENT_ENKF_GFS:-"NO"} +else + DO_CALC_INCREMENT=${DO_CALC_INCREMENT:-"NO"} +fi INCREMENTS_TO_ZERO=${INCREMENTS_TO_ZERO:-"'NONE'"} ################################################################################ -ATMGES_ENSMEAN=$COMIN_GES_ENS/${GPREFIX}atmf006.ensmean${GSUFFIX} -if [ $SUFFIX = ".nc" ]; then - LONB_ENKF=${LONB_ENKF:-$($NCLEN $ATMGES_ENSMEAN grid_xt)} # get LONB_ENKF - LATB_ENKF=${LATB_ENKF:-$($NCLEN $ATMGES_ENSMEAN grid_yt)} # get LATB_ENFK - LEVS_ENKF=${LEVS_ENKF:-$($NCLEN $ATMGES_ENSMEAN pfull)} # get LEVS_ENFK - use_gfs_ncio=".true." - use_gfs_nemsio=".false." - paranc=${paranc:-".true."} - if [ $DO_CALC_INCREMENT = "YES" ]; then - write_fv3_incr=".false." - else - write_fv3_incr=".true." 
- WRITE_INCR_ZERO="incvars_to_zero= $INCREMENTS_TO_ZERO," - fi + +ATMGES_ENSMEAN="${COM_ATMOS_HISTORY_STAT_PREV}/${GPREFIX}atmf006.ensmean.nc" +LONB_ENKF=${LONB_ENKF:-$($NCLEN $ATMGES_ENSMEAN grid_xt)} # get LONB_ENKF +LATB_ENKF=${LATB_ENKF:-$($NCLEN $ATMGES_ENSMEAN grid_yt)} # get LATB_ENKF +LEVS_ENKF=${LEVS_ENKF:-$($NCLEN $ATMGES_ENSMEAN pfull)} # get LEVS_ENKF +use_gfs_ncio=".true." +use_gfs_nemsio=".false." +paranc=${paranc:-".true."} +WRITE_INCR_ZERO="incvars_to_zero= $INCREMENTS_TO_ZERO," +if [ $DO_CALC_INCREMENT = "YES" ]; then + write_fv3_incr=".false." else - LEVS_ENKF=${LEVS_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN dimz | awk '{print $2}')} - LATB_ENKF=${LATB_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN dimy | awk '{print $2}')} - LONB_ENKF=${LONB_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN dimx | awk '{print $2}')} - use_gfs_ncio=".false." - use_gfs_nemsio=".true." - paranc=${paranc:-".false."} + write_fv3_incr=".true." fi LATA_ENKF=${LATA_ENKF:-$LATB_ENKF} LONA_ENKF=${LONA_ENKF:-$LONB_ENKF} - SATANGL=${SATANGL:-${FIXgsi}/global_satangbias.txt} SATINFO=${SATINFO:-${FIXgsi}/global_satinfo.txt} CONVINFO=${CONVINFO:-${FIXgsi}/global_convinfo.txt} @@ -121,7 +114,6 @@ SCANINFO=${SCANINFO:-${FIXgsi}/global_scaninfo.txt} HYBENSINFO=${HYBENSINFO:-${FIXgsi}/global_hybens_info.l${LEVS_ENKF}.txt} ANAVINFO=${ANAVINFO:-${FIXgsi}/global_anavinfo.l${LEVS_ENKF}.txt} VLOCALEIG=${VLOCALEIG:-${FIXgsi}/vlocal_eig_l${LEVS_ENKF}.dat} - ENKF_SUFFIX="s" [[ $SMOOTH_ENKF = "NO" ]] && ENKF_SUFFIX="" @@ -146,23 +138,23 @@ $NLN $ANAVINFO anavinfo $NLN $VLOCALEIG vlocal_eig.dat # Bias correction coefficients based on the ensemble mean -$NLN $COMOUT_ANL_ENS/$GBIASe satbias_in +${NLN} "${COM_ATMOS_ANALYSIS_STAT}/${GBIASe}" "satbias_in" ################################################################################ if [ $USE_CFP = "YES" ]; then [[ -f $DATA/untar.sh ]] && rm $DATA/untar.sh [[ -f $DATA/mp_untar.sh ]] && rm $DATA/mp_untar.sh - set +x cat > $DATA/untar.sh << EOFuntar #!/bin/sh memchar=\$1
+COM_ATMOS_ANALYSIS=\$2 flist="$CNVSTAT $OZNSTAT $RADSTAT" for ftype in \$flist; do if [ \$memchar = "ensmean" ]; then - fname=$COMOUT_ANL_ENS/\${ftype}.ensmean + fname=\${COM_ATMOS_ANALYSIS}/\${ftype}.ensmean else - fname=$COMOUT_ANL_ENS/\$memchar/\$ftype + fname=\${COM_ATMOS_ANALYSIS}/\${ftype} fi tar -xvf \$fname done @@ -175,49 +167,62 @@ fi flist="$CNVSTAT $OZNSTAT $RADSTAT" if [ $USE_CFP = "YES" ]; then - echo "$nm $DATA/untar.sh ensmean" | tee -a $DATA/mp_untar.sh + echo "${nm} ${DATA}/untar.sh ensmean ${COM_ATMOS_ANALYSIS_STAT}" | tee -a "${DATA}/mp_untar.sh" if [ ${CFP_MP:-"NO"} = "YES" ]; then nm=$((nm+1)) fi else for ftype in $flist; do - fname=$COMOUT_ANL_ENS/${ftype}.ensmean + fname="${COM_ATMOS_ANALYSIS_STAT}/${ftype}.ensmean" tar -xvf $fname done fi nfhrs=$(echo $IAUFHRS_ENKF | sed 's/,/ /g') for imem in $(seq 1 $NMEM_ENKF); do memchar="mem"$(printf %03i $imem) + + MEMDIR=${memchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -x \ + COM_ATMOS_HISTORY_MEM_PREV:COM_ATMOS_HISTORY_TMPL + + MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com -x \ + COM_ATMOS_ANALYSIS_MEM:COM_ATMOS_ANALYSIS_TMPL + if [ $lobsdiag_forenkf = ".false." ]; then if [ $USE_CFP = "YES" ]; then - echo "$nm $DATA/untar.sh $memchar" | tee -a $DATA/mp_untar.sh + echo "${nm} ${DATA}/untar.sh ${memchar} ${COM_ATMOS_ANALYSIS_MEM}" | tee -a "${DATA}/mp_untar.sh" if [ ${CFP_MP:-"NO"} = "YES" ]; then nm=$((nm+1)) fi else for ftype in $flist; do - fname=$COMOUT_ANL_ENS/$memchar/$ftype + fname="${COM_ATMOS_ANALYSIS_MEM}/${ftype}" tar -xvf $fname done fi fi - mkdir -p $COMOUT_ANL_ENS/$memchar + mkdir -p "${COM_ATMOS_ANALYSIS_MEM}" for FHR in $nfhrs; do - $NLN $COMIN_GES_ENS/$memchar/${GPREFIX}atmf00${FHR}${ENKF_SUFFIX}${GSUFFIX} sfg_${CDATE}_fhr0${FHR}_${memchar} + ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX}atmf00${FHR}${ENKF_SUFFIX}.nc" \ + "sfg_${PDY}${cyc}_fhr0${FHR}_${memchar}" if [ $cnvw_option = ".true." 
]; then - $NLN $COMIN_GES_ENS/$memchar/${GPREFIX}sfcf00${FHR}${GSUFFIX} sfgsfc_${CDATE}_fhr0${FHR}_${memchar} + ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX}sfcf00${FHR}.nc" \ + "sfgsfc_${PDY}${cyc}_fhr0${FHR}_${memchar}" fi if [ $FHR -eq 6 ]; then if [ $DO_CALC_INCREMENT = "YES" ]; then - $NLN $COMOUT_ANL_ENS/$memchar/${APREFIX}atmanl${ASUFFIX} sanl_${CDATE}_fhr0${FHR}_${memchar} + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX}atmanl.nc" \ + "sanl_${PDY}${cyc}_fhr0${FHR}_${memchar}" else - $NLN $COMOUT_ANL_ENS/$memchar/${APREFIX}atminc${ASUFFIX} incr_${CDATE}_fhr0${FHR}_${memchar} + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX}atminc.nc" \ + "incr_${PDY}${cyc}_fhr0${FHR}_${memchar}" fi else if [ $DO_CALC_INCREMENT = "YES" ]; then - $NLN $COMOUT_ANL_ENS/$memchar/${APREFIX}atma00${FHR}${ASUFFIX} sanl_${CDATE}_fhr0${FHR}_${memchar} + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX}atma00${FHR}.nc" \ + "sanl_${PDY}${cyc}_fhr0${FHR}_${memchar}" else - $NLN $COMOUT_ANL_ENS/$memchar/${APREFIX}atmi00${FHR}${ASUFFIX} incr_${CDATE}_fhr0${FHR}_${memchar} + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX}atmi00${FHR}.nc" \ + "incr_${PDY}${cyc}_fhr0${FHR}_${memchar}" fi fi done @@ -225,9 +230,12 @@ done # Ensemble mean guess for FHR in $nfhrs; do - $NLN $COMIN_GES_ENS/${GPREFIX}atmf00${FHR}.ensmean${GSUFFIX} sfg_${CDATE}_fhr0${FHR}_ensmean + + ${NLN} "${COM_ATMOS_HISTORY_STAT_PREV}/${GPREFIX}atmf00${FHR}.ensmean.nc" \ + "sfg_${PDY}${cyc}_fhr0${FHR}_ensmean" if [ $cnvw_option = ".true." 
]; then - $NLN $COMIN_GES_ENS/${GPREFIX}sfcf00${FHR}.ensmean${GSUFFIX} sfgsfc_${CDATE}_fhr0${FHR}_ensmean + ${NLN} "${COM_ATMOS_HISTORY_STAT_PREV}/${GPREFIX}sfcf00${FHR}.ensmean.nc" \ + "sfgsfc_${PDY}${cyc}_fhr0${FHR}_ensmean" fi done @@ -246,7 +254,7 @@ fi # Create global_enkf namelist cat > enkf.nml << EOFnml &nam_enkf - datestring="$CDATE",datapath="$DATA/", + datestring="${PDY}${cyc}",datapath="$DATA/", analpertwtnh=${analpertwt},analpertwtsh=${analpertwt},analpertwttr=${analpertwt}, covinflatemax=1.e2,covinflatemin=1,pseudo_rh=.true.,iassim_order=0, corrlengthnh=${corrlength},corrlengthsh=${corrlength},corrlengthtr=${corrlength}, @@ -387,7 +395,7 @@ $APRUN_ENKF ${DATA}/$(basename $ENKFEXEC) 1>stdout 2>stderr export err=$?; err_chk # Cat runtime output files. -cat stdout stderr > $COMOUT_ANL_ENS/$ENKFSTAT +cat stdout stderr > "${COM_ATMOS_ANALYSIS_STAT}/${ENKFSTAT}" ################################################################################ # Postprocessing diff --git a/scripts/exgfs_aero_init_aerosol.py b/scripts/exgfs_aero_init_aerosol.py index ecd968ce2f3..db5e462f649 100755 --- a/scripts/exgfs_aero_init_aerosol.py +++ b/scripts/exgfs_aero_init_aerosol.py @@ -4,7 +4,7 @@ 'script'-level control of the aerosol init job. Reads environment variables, determines the atmospheric IC files and most recent available -restart files, then calls the script that merges the tracers from the restart files into +restart files, then calls the script that merges the tracers from the restart files into the IC files. 
INPUTS @@ -22,7 +22,7 @@ Additionally, the following data files are used: - Tiled atmospheric initial conditions that follow the naming pattern determined by `atm_base_pattern` and `atm_file_pattern` -- Restart files from a previous cycle that fit the pattern determined by restart_base_pattern and restart_file_pattern, +- Restart files from a previous cycle that fit the pattern determined by restart_base_pattern and restart_file_pattern, tracer_file_pattern, and dycore_file_pattern - A static file containing a list of tracers from the restart files to be added to the IC files, determined by `tracer_list_file_pattern` @@ -64,223 +64,229 @@ def main() -> None: - # Read in environment variables and make sure they exist - cdate = get_env_var("CDATE") - incr = int(get_env_var('STEP_GFS')) - fcst_length = int(get_env_var('FHMAX_GFS')) - cdump = get_env_var("CDUMP") - rot_dir = get_env_var("ROTDIR") - ush_gfs = get_env_var("USHgfs") - parm_gfs = get_env_var("PARMgfs") + # Read in environment variables and make sure they exist + cdate = get_env_var("CDATE") + incr = int(get_env_var('STEP_GFS')) + fcst_length = int(get_env_var('FHMAX_GFS')) + cdump = get_env_var("CDUMP") + rot_dir = get_env_var("ROTDIR") + ush_gfs = get_env_var("USHgfs") + parm_gfs = get_env_var("PARMgfs") - # os.chdir(data) + # os.chdir(data) - merge_script = merge_script_pattern.format(ush_gfs=ush_gfs) - tracer_list_file = tracer_list_file_pattern.format(parm_gfs=parm_gfs) + merge_script = merge_script_pattern.format(ush_gfs=ush_gfs) + tracer_list_file = tracer_list_file_pattern.format(parm_gfs=parm_gfs) - time = datetime.strptime(cdate, "%Y%m%d%H") - atm_source_path = time.strftime(atm_base_pattern.format(**locals())) + time = datetime.strptime(cdate, "%Y%m%d%H") + atm_source_path = time.strftime(atm_base_pattern.format(**locals())) - if(debug): - for var in ['merge_script', 'tracer_list_file', 'atm_source_path']: - print(f'{var} = {f"{var}"}') + if (debug): + for var in ['merge_script', 
'tracer_list_file', 'atm_source_path']: + print(f'{var} = {locals()[var]}') - atm_files, ctrl_files = get_atm_files(atm_source_path) - tracer_files, rest_files, core_files = get_restart_files(time, incr, max_lookback, fcst_length, rot_dir, cdump) + atm_files, ctrl_files = get_atm_files(atm_source_path) + tracer_files, rest_files, core_files = get_restart_files(time, incr, max_lookback, fcst_length, rot_dir, cdump) - if (tracer_files is not None): - merge_tracers(merge_script, atm_files, tracer_files, rest_files, core_files[0], ctrl_files[0], tracer_list_file) + if (tracer_files is not None): + merge_tracers(merge_script, atm_files, tracer_files, rest_files, core_files[0], ctrl_files[0], tracer_list_file) - return + return def get_env_var(varname: str, fail_on_missing: bool = True) -> str: - ''' - Retrieve environment variable and exit or print warning if not defined - - Parameters - ---------- - varname : str - Environment variable to read - fail_on_missing : bool, optional - Whether to fail (if True) or print warning (False) if environment variable is not defined (default: True) - - Returns - ---------- - str - Value of the named variable - - Raises - ---------- - RuntimeError - If fail_on_missing is True and environment variable is not defined - - ''' - if(debug): - print(f'Trying to read envvar {varname}') - - var = os.environ.get(varname) - if(var is None): - if(fail_on_missing is True): - raise RuntimeError(f'Environment variable {varname} not set') - else: - print(f"WARNING: Environment variable {varname} not set, continuing using None") - if(debug): - print(f'\tValue: {var}') - return(var) + ''' + Retrieve environment variable and exit or print warning if not defined + + Parameters + ---------- + varname : str + Environment variable to read + fail_on_missing : bool, optional + Whether to fail (if True) or print warning (False) if environment variable is not defined (default: True) + + Returns + ---------- + str + Value of the named variable + + Raises + 
---------- + RuntimeError + If fail_on_missing is True and environment variable is not defined + + ''' + if (debug): + print(f'Trying to read envvar {varname}') + + var = os.environ.get(varname) + if (var is None): + if (fail_on_missing is True): + raise RuntimeError(f'Environment variable {varname} not set') + else: + print(f"WARNING: Environment variable {varname} not set, continuing using None") + if (debug): + print(f'\tValue: {var}') + return (var) def get_atm_files(path: str) -> typing.List[typing.List[str]]: - ''' - Checks whether all atmospheric IC files exist in the given location and returns a list - of the filenames. - - Parameters - ---------- - path : str - Location where atmospheric IC files should exist - - Returns - ---------- - list of str - List of the full paths to each of the atmospheric files - - Raises - ---------- - IOError - If fail_on_missing is True and environment variable is not defined - - ''' - print(f'Checking for atm files in {path}') - - file_list = [] - for file_pattern in atm_file_pattern, atm_ctrl_pattern: - files = list(map(lambda tile: file_pattern.format(tile=tile, path=path), tiles)) - for file_name in files: - if(debug): - print(f"\tChecking for {file_name}") - if(not os.path.isfile(file_name)): - raise IOError(f"Atmosphere file {file_name} not found") - elif(debug): - print(f"\t\tFound {file_name}") - file_list = file_list + [files] - return file_list + ''' + Checks whether all atmospheric IC files exist in the given location and returns a list + of the filenames. 
+ + Parameters + ---------- + path : str + Location where atmospheric IC files should exist + + Returns + ---------- + list of str + List of the full paths to each of the atmospheric files + + Raises + ---------- + IOError + If any atmospheric IC file is not found + + ''' + print(f'Checking for atm files in {path}') + + file_list = [] + for file_pattern in atm_file_pattern, atm_ctrl_pattern: + files = list(map(lambda tile: file_pattern.format(tile=tile, path=path), tiles)) + for file_name in files: + if (debug): + print(f"\tChecking for {file_name}") + if (not os.path.isfile(file_name)): + raise IOError(f"Atmosphere file {file_name} not found") + elif (debug): + print(f"\t\tFound {file_name}") + file_list = file_list + [files] + return file_list def get_restart_files(time: datetime, incr: int, max_lookback: int, fcst_length: int, rot_dir: str, cdump: str) -> typing.List[typing.List[str]]: - ''' - Determines the last cycle where all the necessary restart files are available. Ideally the immediate previous cycle - - Parameters - ---------- - time : datetime - Initial time for the current forecast - incr : int - Forecast cadence in hours - max_lookback : int - Maximum number of cycles to look back before failing - fcst_length : int - Length of forecast in hours - rot_dir : str - Path to the ROTDIR (COM) directory - cdump : str - CDUMP of current forecast portion (currently should always be 'gfs') - - Returns - ---------- - list of str - Full pathnames of all restart files needed from previous cycle (fv_core and fv_tracer files) - If all needed files aren't found within lookback period, An array of three None is returned instead. 
- - ''' - print(f"Looking for restart tracer files in {rot_dir}") - for lookback in map(lambda i: incr * (i + 1), range(max_lookback)): - if(lookback > fcst_length): - # Trying to look back farther than the length of a forecast - break - elif(lookback == fcst_length): - # Restart files at the end of the cycle don't have a timestamp - timestamp = "" - else: - timestamp = time.strftime("%Y%m%d.%H0000.") - - last_time = time - timedelta(hours=lookback) - - if(debug): - print(f"\tChecking {last_time}") - file_list = [] - file_base = last_time.strftime(restart_base_pattern.format(**locals())) - - for file_pattern in tracer_file_pattern, restart_file_pattern, dycore_file_pattern: - files = list(map(lambda tile: file_pattern.format(timestamp=timestamp, file_base=file_base, tile=tile), tiles)) - if(debug): - print(f"\t\tLooking for files {files} in directory {file_base}") - file_list = file_list + [files] - - found = all([os.path.isfile(file) for file in files for files in file_list]) - - if(found): - break - else: - print(last_time.strftime("Restart files not found for %Y%m%d_%H")) - - if(found): - return file_list - else: - print("WARNING: Unable to find restart files, will use zero fields") - return [ None, None, None ] + ''' + Determines the last cycle where all the necessary restart files are available. Ideally this is the immediate previous cycle. + + Parameters + ---------- + time : datetime + Initial time for the current forecast + incr : int + Forecast cadence in hours + max_lookback : int + Maximum number of cycles to look back before failing + fcst_length : int + Length of forecast in hours + rot_dir : str + Path to the ROTDIR (COM) directory + cdump : str + CDUMP of current forecast portion (currently should always be 'gfs') + + Returns + ---------- + list of str + Full pathnames of all restart files needed from previous cycle (fv_core and fv_tracer files) + If all needed files aren't found within lookback period, an array of three None values is returned instead. 
+ + ''' + print(f"Looking for restart tracer files in {rot_dir}") + for lookback in map(lambda i: incr * (i + 1), range(max_lookback)): + if (lookback > fcst_length): + # Trying to look back farther than the length of a forecast + break + elif (lookback == fcst_length): + # Restart files at the end of the cycle don't have a timestamp + timestamp = "" + else: + timestamp = time.strftime("%Y%m%d.%H0000.") + + last_time = time - timedelta(hours=lookback) + + if (debug): + print(f"\tChecking {last_time}") + file_list = [] + file_base = last_time.strftime(restart_base_pattern.format(**locals())) + + for file_pattern in tracer_file_pattern, restart_file_pattern, dycore_file_pattern: + files = list(map(lambda tile: file_pattern.format(timestamp=timestamp, file_base=file_base, tile=tile), tiles)) + if (debug): + print(f"\t\tLooking for files {files} in directory {file_base}") + file_list = file_list + [files] + + found = all(os.path.isfile(file) for files in file_list for file in files) + + if (found): + break + else: + print(last_time.strftime("Restart files not found for %Y%m%d_%H")) + + if (found): + return file_list + else: + print("WARNING: Unable to find restart files, will use zero fields") + return [None, None, None] # Merge tracer data into atmospheric data -def merge_tracers(merge_script: str, atm_files: typing.List[str], tracer_files: typing.List[str], rest_files: typing.List[str], core_file: str, ctrl_file: str, tracer_list_file: str) -> None: - ''' - Call the merger script to merge the tracers into the atmospheric IC files. Merged file is written to a temp file - which then overwrites the original upon successful completion of the script. 
- - Parameters - ---------- - merge_script : str - Full path to the merge script - atm_files : list of str - List of paths to atmospheric IC files - tracer_files : list of str - List of paths to tracer restart files - rest_files : list of str - List of paths to dycore tile restart files - core_file : str - Path of dycore restart file - ctrl_file : str - Path of control file - tracer_list_file : str - Full path to the file listing the tracer variables to add - - Returns - ---------- - None - - Raises - ---------- - ValueError - If `atm_files`, `tracer_files`, and `rest_files` are not all the same length - CalledProcessError - If merge script exits with a non-zero error - - ''' - print("Merging tracers") - if(len(atm_files) != len(tracer_files)): - raise ValueError("Atmosphere file list and tracer file list are not the same length") - - if(len(atm_files) != len(rest_files)): - raise ValueError("Atmosphere file list and dycore file list are not the same length") - - for atm_file, tracer_file, rest_file in zip(atm_files, tracer_files, rest_files): - if debug: - print(f"\tMerging tracers from {tracer_file} into {atm_file}") - temp_file = f'{atm_file}.tmp' - subprocess.run([merge_script, atm_file, tracer_file, core_file, ctrl_file, rest_file, tracer_list_file, temp_file], check=True) - os.replace(temp_file, atm_file) +def merge_tracers(merge_script: str, + atm_files: typing.List[str], + tracer_files: typing.List[str], + rest_files: typing.List[str], + core_file: str, + ctrl_file: str, + tracer_list_file: str) -> None: + ''' + Call the merger script to merge the tracers into the atmospheric IC files. Merged file is written to a temp file + which then overwrites the original upon successful completion of the script. 
+ + Parameters + ---------- + merge_script : str + Full path to the merge script + atm_files : list of str + List of paths to atmospheric IC files + tracer_files : list of str + List of paths to tracer restart files + rest_files : list of str + List of paths to dycore tile restart files + core_file : str + Path of dycore restart file + ctrl_file : str + Path of control file + tracer_list_file : str + Full path to the file listing the tracer variables to add + + Returns + ---------- + None + + Raises + ---------- + ValueError + If `atm_files`, `tracer_files`, and `rest_files` are not all the same length + CalledProcessError + If merge script exits with a non-zero error + + ''' + print("Merging tracers") + if (len(atm_files) != len(tracer_files)): + raise ValueError("Atmosphere file list and tracer file list are not the same length") + + if (len(atm_files) != len(rest_files)): + raise ValueError("Atmosphere file list and dycore file list are not the same length") + + for atm_file, tracer_file, rest_file in zip(atm_files, tracer_files, rest_files): + if debug: + print(f"\tMerging tracers from {tracer_file} into {atm_file}") + temp_file = f'{atm_file}.tmp' + subprocess.run([merge_script, atm_file, tracer_file, core_file, ctrl_file, rest_file, tracer_list_file, temp_file], check=True) + os.replace(temp_file, atm_file) if __name__ == "__main__": - main() - exit(0) + main() + exit(0) diff --git a/scripts/exgfs_atmos_awips_20km_1p0deg.sh b/scripts/exgfs_atmos_awips_20km_1p0deg.sh index 3f9f84f2372..0f9868a5069 100755 --- a/scripts/exgfs_atmos_awips_20km_1p0deg.sh +++ b/scripts/exgfs_atmos_awips_20km_1p0deg.sh @@ -19,47 +19,39 @@ # echo " " ############################################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" fcsthrs="$1" num=$# -job_name=$(echo $job|sed 's/[jpt]gfs/gfs/') +job_name=${job/[jpt]gfs/gfs} -if test "$num" -ge 1 -then +if (( num != 1 )); then echo "" - echo " Appropriate number of 
arguments were passed" + echo " FATAL ERROR: Incorrect number of arguments " echo "" -else echo "" - echo " Number of arguments were not passed " - echo "" - echo "" - echo "Usage: $0 \$fcsthrs (3 digits) " + echo "Usage: $0 \${fcsthrs} (3 digits) " echo "" exit 16 fi -cd $DATA +cd "${DATA}" || exit 2 ############################################### # Wait for the availability of the pgrb file ############################################### icnt=1 -while [ $icnt -lt 1000 ] -do - if [ -s $COMIN/${RUN}.${cycle}.pgrb2b.0p25.f$fcsthrs.idx ] - then - break - fi - - sleep 10 - icnt=$((icnt + 1)) - if [ $icnt -ge 180 ] - then - msg="ABORTING after 30 min of waiting for the GFS pgrb2 file!" - err_exit $msg - fi +while (( icnt < 1000 )); do + if [[ -s "${COM_ATMOS_GRIB_0p25}/${RUN}.${cycle}.pgrb2b.0p25.f${fcsthrs}.idx" ]]; then + break + fi + + sleep 10 + icnt=$((icnt + 1)) + if (( icnt >= 180 )); then + msg="FATAL ERROR: No GFS pgrb2 file after 30 min of waiting" + err_exit "${msg}" + fi done ######################################## @@ -74,7 +66,7 @@ echo "#######################################" echo " Process GRIB AWIP GRIB2 PRODUCTS " echo "#######################################" echo " " -${TRACE_ON:-set -x} +set_trace # Set type of Interpolation for WGRIB2 export opt1=' -set_grib_type same -new_grid_winds earth ' @@ -94,160 +86,184 @@ export SCALEDEC=${SCALDEC:-$USHgfs/scale_dec.sh} # Process GFS GRIB AWIP PRODUCTS IN GRIB2 # ############################################################### -cp $COMIN/gfs.t${cyc}z.pgrb2.0p25.f${fcsthrs} tmpfile2${fcsthrs} -cp $COMIN/gfs.t${cyc}z.pgrb2b.0p25.f${fcsthrs} tmpfile2b${fcsthrs} -cat tmpfile2${fcsthrs} tmpfile2b${fcsthrs} > tmpfile${fcsthrs} -$WGRIB2 tmpfile${fcsthrs} | grep -F -f $PARMproduct/gfs_awips_parmlist_g2 | $WGRIB2 -i -grib masterfile tmpfile${fcsthrs} +cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2.0p25.f${fcsthrs}" "tmpfile2${fcsthrs}" +cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2b.0p25.f${fcsthrs}" 
"tmpfile2b${fcsthrs}" +cat "tmpfile2${fcsthrs}" "tmpfile2b${fcsthrs}" > "tmpfile${fcsthrs}" +${WGRIB2} "tmpfile${fcsthrs}" | grep -F -f "${PARMproduct}/gfs_awips_parmlist_g2" | \ + ${WGRIB2} -i -grib masterfile "tmpfile${fcsthrs}" export err=$? -if [[ $err -ne 0 ]] ; then +if [[ $err -ne 0 ]]; then echo " FATAL ERROR: masterfile does not exist." exit $err fi -$WGRIB2 masterfile -match ":PWAT:entire atmosphere" -grib gfs_pwat.grb -$WGRIB2 masterfile | grep -v ":PWAT:entire atmosphere" | $WGRIB2 -i -grib temp_gfs masterfile +${WGRIB2} masterfile -match ":PWAT:entire atmosphere" -grib gfs_pwat.grb +${WGRIB2} masterfile | grep -v ":PWAT:entire atmosphere" | ${WGRIB2} -i -grib temp_gfs masterfile ################################################################## # Process to change PWAT from level 200 to 10 (Entire Atmosphere) # in product definition template (PDT) 4.0 ################################################################## -$WGRIB2 gfs_pwat.grb -set_byte 4 23 10 -grib gfs_pwat_levels_10.grb +${WGRIB2} gfs_pwat.grb -set_byte 4 23 10 -grib gfs_pwat_levels_10.grb export err=$?; err_chk cat temp_gfs gfs_pwat_levels_10.grb > tmp_masterfile -for GRID in conus ak prico pac 003 -do - case $GRID in - conus) - # Grid 20km_conus - CONUS - 20 km Quadruple Resolution (Lambert Conformal) - # export grid_20km_conus="30 6 0 0 0 0 0 0 369 257 12190000 226541000 8 25000000 265000000 20318000 20318000 0 64 25000000 25000000 0 0" - # $COPYGB2 -g "$grid_20km_conus" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} - - export gridconus="lambert:265.0:25.0:25.0 226.541:369:20318.0 12.19:257:20318.0" - $WGRIB2 tmp_masterfile $opt1uv $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $gridconus awps_file_f${fcsthrs}_${GRID} - ;; - ak) - # Grid 20km_ak - Alaska - Double Resolution (Polar Stereographic) - # Redefined grid 217 for Alaska region - # export grid_20km_ak="20 6 0 0 0 0 0 0 277 213 30000000 187000000 8 60000000 225000000 22500000 22500000 0 64" - # 
$COPYGB2 -g "$grid_20km_ak" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} - - export gridak="nps:210.0:60.0 170.0:277:22500 35.0:225:22500" - $WGRIB2 tmp_masterfile $opt1uv $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $gridak awps_file_f${fcsthrs}_${GRID} - ;; - prico) - # Grid 20km_prico - 0.25 degree Lat/Lon grid for Puerto Rico (20km) - # export grid_20km_prico="0 6 0 0 0 0 0 0 275 205 0 0 50750000 271750000 48 -250000 340250000 250000 250000 0" - # $COPYGB2 -g "$grid_20km_prico" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} - - export gridprico="latlon 271.75:275:0.25 50.75:205:-0.25" - $WGRIB2 tmp_masterfile $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $gridprico awps_file_f${fcsthrs}_${GRID} - ;; - pac) - # Grid 20km_pac - 20 km Mercator grid for Pacific Region - # export grid_20km_pac="10 6 0 0 0 0 0 0 837 692 -45000000 110000000 48 20000000 65720000 270000000 64 0 20000000 20000000" - # NEW export grid_20km_pac="10 6 0 0 0 0 0 0 837 725 -45000000 110000000 48 20000000 65734500 270000000 64 0 20000000 20000000" - # $COPYGB2 -g "$grid_20km_pac" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} - - export gridpac="mercator:20.0 110.0:837:20000:270.0 -45.0:725:20000:65.7345" - $WGRIB2 tmp_masterfile $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $gridpac awps_file_f${fcsthrs}_${GRID} - ;; - 003) - ###################################################################### - # Process GFS GRIB AWIP 1.0 DEGREE (GRID 003) PRODUCTS IN GRIB2 # - ###################################################################### - export grid003="latlon 0:360:1.0 90:181:-1.0" - $WGRIB2 tmp_masterfile $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $grid003 awps_file_f${fcsthrs}_${GRID} - ;; +for GRID in conus ak prico pac 003; do + # shellcheck disable=SC2086 + case ${GRID} in + conus) + # Grid 20km_conus - CONUS - 20 km Quadruple Resolution (Lambert Conformal) + # export 
grid_20km_conus="30 6 0 0 0 0 0 0 369 257 12190000 226541000 8 25000000 265000000 20318000 20318000 0 64 25000000 25000000 0 0" + # $COPYGB2 -g "$grid_20km_conus" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} + + export gridconus="lambert:265.0:25.0:25.0 226.541:369:20318.0 12.19:257:20318.0" + ${WGRIB2} tmp_masterfile ${opt1uv} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${gridconus} "awps_file_f${fcsthrs}_${GRID}" + ;; + ak) + # Grid 20km_ak - Alaska - Double Resolution (Polar Stereographic) + # Redefined grid 217 for Alaska region + # export grid_20km_ak="20 6 0 0 0 0 0 0 277 213 30000000 187000000 8 60000000 225000000 22500000 22500000 0 64" + # $COPYGB2 -g "$grid_20km_ak" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} + + export gridak="nps:210.0:60.0 170.0:277:22500 35.0:225:22500" + ${WGRIB2} tmp_masterfile ${opt1uv} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${gridak} "awps_file_f${fcsthrs}_${GRID}" + ;; + prico) + # Grid 20km_prico - 0.25 degree Lat/Lon grid for Puerto Rico (20km) + # export grid_20km_prico="0 6 0 0 0 0 0 0 275 205 0 0 50750000 271750000 48 -250000 340250000 250000 250000 0" + # $COPYGB2 -g "$grid_20km_prico" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} + + export gridprico="latlon 271.75:275:0.25 50.75:205:-0.25" + ${WGRIB2} tmp_masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${gridprico} "awps_file_f${fcsthrs}_${GRID}" + ;; + pac) + # Grid 20km_pac - 20 km Mercator grid for Pacific Region + # export grid_20km_pac="10 6 0 0 0 0 0 0 837 692 -45000000 110000000 48 20000000 65720000 270000000 64 0 20000000 20000000" + # NEW export grid_20km_pac="10 6 0 0 0 0 0 0 837 725 -45000000 110000000 48 20000000 65734500 270000000 64 0 20000000 20000000" + # $COPYGB2 -g "$grid_20km_pac" -i0 -x tmp_masterfile awps_file_f${fcsthrs}_${GRID} + + export gridpac="mercator:20.0 110.0:837:20000:270.0 
-45.0:725:20000:65.7345" + ${WGRIB2} tmp_masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${gridpac} "awps_file_f${fcsthrs}_${GRID}" + ;; + 003) + ###################################################################### + # Process GFS GRIB AWIP 1.0 DEGREE (GRID 003) PRODUCTS IN GRIB2 # + ###################################################################### + export grid003="latlon 0:360:1.0 90:181:-1.0" + ${WGRIB2} tmp_masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${grid003} "awps_file_f${fcsthrs}_${GRID}" + ;; + *) + echo "FATAL ERROR: Unknown output grid ${GRID}" + exit 2 + ;; esac - $TRIMRH awps_file_f${fcsthrs}_${GRID} - $SCALEDEC awps_file_f${fcsthrs}_${GRID} - $GRB2INDEX awps_file_f${fcsthrs}_${GRID} awps_file_fi${fcsthrs}_${GRID} - -########################################################################### -# Checking fields in awps_file_f${fcsthrs}_${GRID} file -# before TOCGRIB2 adding WMO headers for AWIPS products. -# -# NOTE: numparm is the total of fields in grib2_awpgfs_20km_conusf000 file -########################################################################### -numparm=247 -numrec=$( $WGRIB2 awps_file_f${fcsthrs}_${GRID} | wc -l ) - -if [ $numrec -lt $numparm ] -then - msg="ABORTING : awps_file_f${fcsthrs}_${GRID} file is missing fields for AWIPS !" - err_exit $msg -fi + # shellcheck disable= + ${TRIMRH} "awps_file_f${fcsthrs}_${GRID}" + ${SCALEDEC} "awps_file_f${fcsthrs}_${GRID}" + ${GRB2INDEX} "awps_file_f${fcsthrs}_${GRID}" "awps_file_fi${fcsthrs}_${GRID}" + + ########################################################################### + # Checking fields in awps_file_f${fcsthrs}_${GRID} file + # before TOCGRIB2 adding WMO headers for AWIPS products. 
+ # + # NOTE: numparm is the total number of fields in grib2_awpgfs_20km_conusf000 file + ########################################################################### + numparm=247 + numrec=$( ${WGRIB2} "awps_file_f${fcsthrs}_${GRID}" | wc -l ) + + if (( numrec < numparm )); then + msg="FATAL ERROR: awps_file_f${fcsthrs}_${GRID} file is missing fields for AWIPS !" + err_exit "${msg}" || exit 10 + fi -# Processing AWIPS GRIB2 grids with WMO headers + # Processing AWIPS GRIB2 grids with WMO headers pgm=tocgrib2 export pgm; prep_step startmsg - if [ $GRID = "003" -a $(expr ${fcsthrs} % 6) -eq 0 ] ; then - export FORT11=awps_file_f${fcsthrs}_${GRID} - export FORT31=awps_file_fi${fcsthrs}_${GRID} - export FORT51=grib2.awpgfs${fcsthrs}.${GRID} + if [[ ${GRID} = "003" && $(( fcsthrs % 6 )) == 0 ]]; then + export FORT11="awps_file_f${fcsthrs}_${GRID}" + export FORT31="awps_file_fi${fcsthrs}_${GRID}" + export FORT51="grib2.awpgfs${fcsthrs}.${GRID}" + + cp "${PARMwmo}/grib2_awpgfs${fcsthrs}.${GRID}" "parm_list" + if [[ ${DO_WAVE} != "YES" ]]; then + # Remove wave field if not running wave model + grep -vw "5WAVH" "parm_list" > "parm_list_temp" + mv "parm_list_temp" "parm_list" + fi - $TOCGRIB2 < $PARMwmo/grib2_awpgfs${fcsthrs}.${GRID} >> $pgmout 2> errfile + ${TOCGRIB2} < "parm_list" >> "${pgmout}" 2> errfile export err=$?; err_chk echo " error from tocgrib2=",$err - if [ $SENDCOM = "YES" ] ; then + if [[ ${SENDCOM} == "YES" ]]; then ############################## - # Post Files to ${COMOUTwmo} + # Post Files to ${COM_ATMOS_WMO} ############################## - mv grib2.awpgfs${fcsthrs}.${GRID} ${COMOUTwmo}/grib2.awpgfs${fcsthrs}.${GRID}.gfs_awips_f${fcsthrs}_1p0deg_${cyc} + mv "grib2.awpgfs${fcsthrs}.${GRID}" \ + "${COM_ATMOS_WMO}/grib2.awpgfs${fcsthrs}.${GRID}.gfs_awips_f${fcsthrs}_1p0deg_${cyc}" ############################## # Distribute Data ############################## - if [ "$SENDDBN" = 'YES' -o "$SENDAWIP" = 'YES' ] ; then - $DBNROOT/bin/dbn_alert NTC_LOW $NET $job 
${COMOUTwmo}/grib2.awpgfs${fcsthrs}.${GRID}.gfs_awips_f${fcsthrs}_1p0deg_${cyc} + if [[ "${SENDDBN}" == 'YES' || "${SENDAWIP}" == 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" NTC_LOW "${NET}" "${job}" \ + "${COM_ATMOS_WMO}/grib2.awpgfs${fcsthrs}.${GRID}.gfs_awips_f${fcsthrs}_1p0deg_${cyc}" else - msg="File ${COMOUTwmo}/grib2.awpgfs${fcsthrs}.${GRID}.gfs_awips_f${fcsthrs}_1p0deg_${cyc} not posted to db_net." - postmsg "$jlogfile" "$msg" + echo "File ${COM_ATMOS_WMO}/grib2.awpgfs${fcsthrs}.${GRID}.gfs_awips_f${fcsthrs}_1p0deg_${cyc} not posted to db_net." fi fi - elif [ $GRID != "003" ] ; then - export FORT11=awps_file_f${fcsthrs}_${GRID} - export FORT31=awps_file_fi${fcsthrs}_${GRID} - export FORT51=grib2.awpgfs_20km_${GRID}_f${fcsthrs} + elif [[ ${GRID} != "003" ]]; then + export FORT11="awps_file_f${fcsthrs}_${GRID}" + export FORT31="awps_file_fi${fcsthrs}_${GRID}" + export FORT51="grib2.awpgfs_20km_${GRID}_f${fcsthrs}" + + cp "${PARMwmo}/grib2_awpgfs_20km_${GRID}f${fcsthrs}" "parm_list" + if [[ ${DO_WAVE} != "YES" ]]; then + # Remove wave field if not running wave model + grep -vw "5WAVH" "parm_list" > "parm_list_temp" + mv "parm_list_temp" "parm_list" + fi - $TOCGRIB2 < $PARMwmo/grib2_awpgfs_20km_${GRID}f${fcsthrs} >> $pgmout 2> errfile - export err=$? 
;err_chk - echo " error from tocgrib2=",$err + ${TOCGRIB2} < "parm_list" >> "${pgmout}" 2> errfile + export err=$?; err_chk || exit "${err}" - if [ $SENDCOM = "YES" ] ; then + if [[ ${SENDCOM} = "YES" ]]; then - ############################## - # Post Files to ${COMOUTwmo} - ############################## + ############################## + # Post Files to ${COM_ATMOS_WMO} + ############################## - mv grib2.awpgfs_20km_${GRID}_f${fcsthrs} ${COMOUTwmo}/grib2.awpgfs_20km_${GRID}_f${fcsthrs}.$job_name + mv "grib2.awpgfs_20km_${GRID}_f${fcsthrs}" \ + "${COM_ATMOS_WMO}/grib2.awpgfs_20km_${GRID}_f${fcsthrs}.${job_name}" - ############################## - # Distribute Data - ############################## + ############################## + # Distribute Data + ############################## - if [ "$SENDDBN" = 'YES' -o "$SENDAWIP" = 'YES' ] ; then - $DBNROOT/bin/dbn_alert NTC_LOW $NET $job ${COMOUTwmo}/grib2.awpgfs_20km_${GRID}_f${fcsthrs}.$job_name - else - msg="File ${COMOUTwmo}/grib2.awpgfs_20km_${GRID}_f${fcsthrs}.$job_name not posted to db_net." - postmsg "$jlogfile" "$msg" + if [[ "${SENDDBN}" = 'YES' || "${SENDAWIP}" = 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" NTC_LOW "${NET}" "${job}" \ + "${COM_ATMOS_WMO}/grib2.awpgfs_20km_${GRID}_f${fcsthrs}.${job_name}" + else + echo "File ${COM_ATMOS_WMO}/grib2.awpgfs_20km_${GRID}_f${fcsthrs}.${job_name} not posted to db_net." + fi fi - fi fi - msg="Awip Processing ${fcsthrs} hour completed normally" - postmsg "$jlogfile" "$msg" + echo "Awip Processing ${fcsthrs} hour completed normally" done -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]]; then + cat "${pgmout}" fi diff --git a/scripts/exgfs_atmos_fbwind.sh b/scripts/exgfs_atmos_fbwind.sh index a4ecd248f00..e7d0ff3d824 100755 --- a/scripts/exgfs_atmos_fbwind.sh +++ b/scripts/exgfs_atmos_fbwind.sh @@ -31,7 +31,7 @@ echo " Process Bulletins of forecast winds and temps for Hawaii " echo " and 15 sites outside of the Hawaiian Islands. 
" echo "#############################################################" echo " " -${TRACE_ON:-set -x} +set_trace export pgm=bulls_fbwndgfs . prep_step @@ -79,7 +79,7 @@ fi if test "$SENDDBN" = 'YES' then # make_ntc_bull.pl WMOBH NONE KWNO NONE tran.fbwnd_pacific ${COMOUTwmo}/tran.fbwnd_pacific.$job_name - ${UTILgfs}/ush/make_ntc_bull.pl WMOBH NONE KWNO NONE tran.fbwnd_pacific ${COMOUTwmo}/tran.fbwnd_pacific.$job_name + ${USHgfs}/make_ntc_bull.pl WMOBH NONE KWNO NONE tran.fbwnd_pacific ${COMOUTwmo}/tran.fbwnd_pacific.$job_name fi ##################################################################### diff --git a/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh b/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh index 394c5c30d88..64562daeedf 100755 --- a/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh +++ b/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh @@ -29,25 +29,23 @@ then while [ $icnt -lt 1000 ] do if [ -r ${COMIN}/${RUN}_${PDY}${cyc}f0${fhr} ] ; then - sleep 5 + sleep 5 break else - msg="The process is waiting ... ${GRIBFILE} file to proceed." - postmsg "${jlogfile}" "$msg" + echo "The process is waiting ... ${GRIBFILE} file to proceed." sleep 20 let "icnt=icnt+1" fi if [ $icnt -ge $maxtries ] then - msg="ABORTING: after 1 hour of waiting for ${GRIBFILE} file at F$fhr to end." - postmsg "${jlogfile}" "$msg" + echo "ABORTING: after 1 hour of waiting for ${GRIBFILE} file at F$fhr to end." export err=7 ; err_chk exit $err fi done cp ${COMIN}/${RUN}_${PDY}${cyc}f0${fhr} gem_grids${fhr}.gem - + # if [ $cyc -eq 00 -o $cyc -eq 12 ] #then $USHgempak/gempak_${RUN}_f${fhr}_gif.sh @@ -77,7 +75,7 @@ export RSHPDY=$(echo $PDY | cut -c5-)$(echo $PDY | cut -c3-4) cp $HOMEgfs/gempak/dictionaries/sonde.land.tbl . cp $HOMEgfs/gempak/dictionaries/metar.tbl . sort -k 2n,2 metar.tbl > metar_stnm.tbl -cp $COMINgfs/${model}.$cycle.adpupa.tm00.bufr_d fort.40 +cp $COMINobsproc/${model}.$cycle.adpupa.tm00.bufr_d fort.40 export err=$? 
if [[ $err -ne 0 ]] ; then echo " File ${model}.$cycle.adpupa.tm00.bufr_d does not exist." diff --git a/scripts/exgfs_atmos_gempak_meta.sh b/scripts/exgfs_atmos_gempak_meta.sh index cb64138c612..04f4f1fc5c0 100755 --- a/scripts/exgfs_atmos_gempak_meta.sh +++ b/scripts/exgfs_atmos_gempak_meta.sh @@ -27,7 +27,7 @@ do_all=0 #loop through and process needed forecast hours while [ $fhr -le $fhend ] do - # + # # First check to see if this is a rerun. If so make all Meta files if [ $fhr -gt 126 -a $first_time -eq 0 ] ; then do_all=1 @@ -51,8 +51,7 @@ do fi if [ $icnt -ge $maxtries ] then - msg="ABORTING after 1 hour of waiting for gempak grid F$fhr to end." - postmsg "${jlogfile}" "$msg" + echo "ABORTING after 1 hour of waiting for gempak grid F$fhr to end." export err=7 ; err_chk exit $err fi @@ -104,7 +103,7 @@ do # If this is the final fcst hour, alert the # file to all centers. -# +# if [ 10#$fhr -ge $fhend ] ; then export DBN_ALERT_TYPE=GFS_METAFILE_LAST fi @@ -112,12 +111,11 @@ do export fend=$fhr sleep 20 -# mpirun.lsf ntasks=${NTASKS_META:-$(cat $DATA/poescript | wc -l)} ptile=${PTILE_META:-4} threads=${NTHREADS_META:-1} export OMP_NUM_THREADS=$threads - APRUN="mpirun -n $ntasks cfp " + APRUN="mpiexec -l -n $ntasks -ppn $ntasks --cpu-bind verbose,core cfp" APRUN_METACFP=${APRUN_METACFP:-$APRUN} APRUNCFP=$(eval echo $APRUN_METACFP) diff --git a/scripts/exgfs_atmos_goes_nawips.sh b/scripts/exgfs_atmos_goes_nawips.sh index 7aae2e143cb..76ae0672801 100755 --- a/scripts/exgfs_atmos_goes_nawips.sh +++ b/scripts/exgfs_atmos_goes_nawips.sh @@ -25,7 +25,7 @@ cp $FIXgempak/g2vcrdncep1.tbl g2vcrdncep1.tbl NAGRIB=$GEMEXE/nagrib2 # -entry=$(grep "^$RUN " $NAGRIB_TABLE | awk 'index($1,"#") != 1 {print $0}') +entry=$(grep "^$RUN2 " $NAGRIB_TABLE | awk 'index($1,"#") != 1 {print $0}') if [ "$entry" != "" ] ; then cpyfil=$(echo $entry | awk 'BEGIN {FS="|"} {print $2}') @@ -56,7 +56,7 @@ while [ $fhcnt -le $fend ] ; do fhr3=$(printf "%03d" $fhcnt) 
GRIBIN=$COMIN/${model}.${cycle}.${GRIB}${fhr}${EXT} - GEMGRD=${RUN}_${PDY}${cyc}f${fhr3} + GEMGRD=${RUN2}_${PDY}${cyc}f${fhr3} GRIBIN_chk=$GRIBIN @@ -71,8 +71,7 @@ while [ $fhcnt -le $fend ] ; do fi if [ $icnt -ge $maxtries ] then - msg="ABORTING after 1 hour of waiting for F$fhr to end." - postmsg "${jlogfile}" "$msg" + echo "ABORTING after 1 hour of waiting for F$fhr to end." export err=7 ; err_chk exit $err fi diff --git a/scripts/exgfs_atmos_grib2_special_npoess.sh b/scripts/exgfs_atmos_grib2_special_npoess.sh index ad24bf64354..4009a8e66a6 100755 --- a/scripts/exgfs_atmos_grib2_special_npoess.sh +++ b/scripts/exgfs_atmos_grib2_special_npoess.sh @@ -47,66 +47,62 @@ SLEEP_LOOP_MAX=$(expr $SLEEP_TIME / $SLEEP_INT) ############################################################################## export SHOUR=000 export FHOUR=024 -export fhr=$(printf "%03d" $SHOUR) ############################################################ # Loop Through the Post Forecast Files ############################################################ -while test 10#$fhr -le $FHOUR -do - - ############################### - # Start Looping for the - # existence of the restart files - ############################### - export pgm="postcheck" - ic=1 - while [ $ic -le $SLEEP_LOOP_MAX ] - do - if test -f $COMIN/gfs.t${cyc}z.pgrb2b.0p50.f${fhr}.idx - then - break - else - ic=$(expr $ic + 1) - sleep $SLEEP_INT - fi - ############################### - # If we reach this point assume - # fcst job never reached restart - # period and error exit - ############################### - if [ $ic -eq $SLEEP_LOOP_MAX ] - then - export err=9 - err_chk - fi - done - -###################################################################### -# Process Global NPOESS 0.50 GFS GRID PRODUCTS IN GRIB2 F000 - F024 # -###################################################################### - paramlist=${PARMproduct}/global_npoess_paramlist_g2 - cp $COMIN/gfs.t${cyc}z.pgrb2.0p50.f${fhr} tmpfile2 - cp 
$COMIN/gfs.t${cyc}z.pgrb2b.0p50.f${fhr} tmpfile2b - cat tmpfile2 tmpfile2b > tmpfile - $WGRIB2 tmpfile | grep -F -f $paramlist | $WGRIB2 -i -grib pgb2file tmpfile - export err=$?; err_chk - - if test $SENDCOM = "YES" - then - cp pgb2file $COMOUT/${RUN}.${cycle}.pgrb2f${fhr}.npoess - - if test $SENDDBN = "YES" - then - $DBNROOT/bin/dbn_alert MODEL GFS_PGBNPOESS $job $COMOUT/${RUN}.${cycle}.pgrb2f${fhr}.npoess - else - msg="File ${RUN}.${cycle}.pgrb2f${fhr}.npoess not posted to db_net." - postmsg "$msg" - fi - echo "$PDY$cyc$fhr" > $COMOUT/${RUN}.t${cyc}z.control.halfdeg.npoess - fi - rm tmpfile pgb2file - export fhr=$(printf "%03d" $(expr $fhr + $FHINC)) +for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do + + fhr3=$(printf "%03d" "${fhr}") + + ############################### + # Start Looping for the + # existence of the restart files + ############################### + export pgm="postcheck" + ic=1 + while (( ic <= SLEEP_LOOP_MAX )); do + if [[ -f "${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2b.0p50.f${fhr3}.idx" ]]; then + break + else + ic=$((ic + 1)) + sleep "${SLEEP_INT}" + fi + ############################### + # If we reach this point assume + # fcst job never reached restart + # period and error exit + ############################### + if (( ic == SLEEP_LOOP_MAX )); then + echo "FATAL ERROR: 0p50 grib file not available after max sleep time" + export err=9 + err_chk || exit "${err}" + fi + done + + ###################################################################### + # Process Global NPOESS 0.50 GFS GRID PRODUCTS IN GRIB2 F000 - F024 # + ###################################################################### + paramlist=${PARMproduct}/global_npoess_paramlist_g2 + cp "${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2.0p50.f${fhr3}" tmpfile2 + cp "${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2b.0p50.f${fhr3}" tmpfile2b + cat tmpfile2 tmpfile2b > tmpfile + ${WGRIB2} tmpfile | grep -F -f ${paramlist} | ${WGRIB2} -i -grib pgb2file tmpfile + export 
err=$?; err_chk + + if [[ ${SENDCOM} == "YES" ]]; then + cp pgb2file "${COM_ATMOS_GOES}/${RUN}.${cycle}.pgrb2f${fhr3}.npoess" + + if [[ ${SENDDBN} == "YES" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGBNPOESS "${job}" \ + "${COM_ATMOS_GOES}/${RUN}.${cycle}.pgrb2f${fhr3}.npoess" + else + msg="File ${RUN}.${cycle}.pgrb2f${fhr3}.npoess not posted to db_net." + postmsg "${msg}" || echo "${msg}" + fi + echo "${PDY}${cyc}${fhr3}" > "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.control.halfdeg.npoess" + fi + rm tmpfile pgb2file done @@ -115,89 +111,80 @@ done ################################################################ export SHOUR=000 export FHOUR=180 -export fhr=$(printf "%03d" $SHOUR) ################################# # Process GFS PGRB2_SPECIAL_POST ################################# -while test 10#$fhr -le $FHOUR -do - ############################### - # Start Looping for the - # existence of the restart files - ############################### - set +x - export pgm="postcheck" - ic=1 - while [ $ic -le $SLEEP_LOOP_MAX ] - do - if test -f $restart_file$fhr - then - break - else - ic=$(expr $ic + 1) - sleep $SLEEP_INT - fi - ############################### - # If we reach this point assume - # fcst job never reached restart - # period and error exit - ############################### - if [ $ic -eq $SLEEP_LOOP_MAX ] - then - export err=9 - err_chk - fi - done - ${TRACE_ON:-set -x} - - ############################### - # Put restart files into /nwges - # for backup to start Model Fcst - ############################### - - cp $COMIN/${RUN}.t${cyc}z.special.grb2f$fhr masterfile - -# $COPYGB2 -g "0 6 0 0 0 0 0 0 360 181 0 0 90000000 0 48 -90000000 359000000 1000000 1000000 0" -i1,1 -x masterfile pgb2file - -# export grid1p0="latlon 0:360:1.0 90:181:-1.0" - export grid0p25="latlon 0:1440:0.25 90:721:-0.25" - $WGRIB2 masterfile $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $grid0p25 pgb2file - -# creating higher resolution goes files for US centers -# 
$COPYGB2 -g "30 6 0 0 0 0 0 0 349 277 1000000 214500000 8 50000000 253000000 32463000 32463000 0 64 50000000 50000000 0 0" -i1,1 -x masterfile pgb2file2 - - export gridconus="lambert:253.0:50.0:50.0 214.5:349:32463.0 1.0:277:32463.0" - $WGRIB2 masterfile $opt1uv $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid $gridconus pgb2file2 - - $WGRIB2 pgb2file -s > pgb2ifile - - if test $SENDCOM = "YES" - then - - cp pgb2file $COMOUT/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr} - cp pgb2ifile $COMOUT/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr}.idx - - cp pgb2file2 $COMOUT/${RUN}.${cycle}.goessimpgrb2f${fhr}.grd221 - - if test $SENDDBN = "YES" - then - $DBNROOT/bin/dbn_alert MODEL GFS_GOESSIMPGB2_0P25 $job $COMOUT/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr} - $DBNROOT/bin/dbn_alert MODEL GFS_GOESSIMPGB2_0P25_WIDX $job $COMOUT/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr}.idx - $DBNROOT/bin/dbn_alert MODEL GFS_GOESSIMGRD221_PGB2 $job $COMOUT/${RUN}.${cycle}.goessimpgrb2f${fhr}.grd221 - fi - - echo "$PDY$cyc$fhr" > $COMOUT/${RUN}.t${cyc}z.control.goessimpgrb - fi - rm pgb2file2 pgb2ifile - - if test "$SENDECF" = 'YES' - then - export fhour=$(expr ${fhr} % 6 ) - fi - - export fhr=$(printf "%03d" $(expr $fhr + $FHINC)) +for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do + + fhr3=$(printf "%03d" "${fhr}") + + ############################### + # Start Looping for the + # existence of the restart files + ############################### + set +x + export pgm="postcheck" + ic=1 + while (( ic <= SLEEP_LOOP_MAX )); do + if [[ -f "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.special.grb2if${fhr3}.idx" ]]; then + break + else + ic=$((ic + 1)) + sleep "${SLEEP_INT}" + fi + ############################### + # If we reach this point assume + # fcst job never reached restart + # period and error exit + ############################### + if (( ic == SLEEP_LOOP_MAX )); then + echo "FATAL ERROR: Special goes grib file not available after max sleep time" + export err=9 + 
err_chk || exit "${err}" + fi + done + set_trace + ############################### + # Put restart files into /nwges + # for backup to start Model Fcst + ############################### + cp "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.special.grb2if${fhr3}" masterfile + export grid0p25="latlon 0:1440:0.25 90:721:-0.25" + ${WGRIB2} masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${grid0p25} pgb2file + + export gridconus="lambert:253.0:50.0:50.0 214.5:349:32463.0 1.0:277:32463.0" + ${WGRIB2} masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ + ${opt27} ${opt28} -new_grid ${gridconus} pgb2file2 + + ${WGRIB2} pgb2file -s > pgb2ifile + + if [[ ${SENDCOM} == "YES" ]]; then + + cp pgb2file "${COM_ATMOS_GOES}/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr3}" + cp pgb2ifile "${COM_ATMOS_GOES}/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr3}.idx" + cp pgb2file2 "${COM_ATMOS_GOES}/${RUN}.${cycle}.goessimpgrb2f${fhr3}.grd221" + + if [[ ${SENDDBN} == "YES" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_GOESSIMPGB2_0P25 "${job}" \ + "${COM_ATMOS_GOES}/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr3}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_GOESSIMPGB2_0P25_WIDX "${job}" \ + "${COM_ATMOS_GOES}/${RUN}.${cycle}.goessimpgrb2.0p25.f${fhr3}.idx" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_GOESSIMGRD221_PGB2 "${job}" \ + "${COM_ATMOS_GOES}/${RUN}.${cycle}.goessimpgrb2f${fhr3}.grd221" + fi + + echo "${PDY}${cyc}${fhr3}" > "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.control.goessimpgrb" + fi + rm pgb2file2 pgb2ifile + + if [[ ${SENDECF} == "YES" ]]; then + # TODO Does this even do anything? 
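The `for (( ... ))` loop introduced in this hunk replaces the old `expr`-based while loop: the bounds are normalized through `10#` and the padded name is derived inside the loop. A standalone sketch with illustrative SHOUR/FHOUR/FHINC values:

```shell
#!/usr/bin/env bash
set -euo pipefail

# C-style arithmetic loop over forecast hours, as the rewritten scripts do.
SHOUR=000
FHOUR=024
FHINC=12

for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do
  # fhr is a plain integer; fhr3 is the zero-padded form used in file names.
  fhr3=$(printf "%03d" "${fhr}")
  echo "processing f${fhr3}"   # processing f000 / f012 / f024
done
```

Unlike the old pattern, no subshells are spawned per iteration and the loop counter stays a plain integer, which is why the padded `fhr3` has to be rebuilt each pass.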
+ export fhour=$(( fhr % 6 )) + fi + done diff --git a/scripts/exgfs_atmos_grib_awips.sh b/scripts/exgfs_atmos_grib_awips.sh index 5252d71983e..f10508626f7 100755 --- a/scripts/exgfs_atmos_grib_awips.sh +++ b/scripts/exgfs_atmos_grib_awips.sh @@ -21,57 +21,46 @@ # echo " FEB 2019 - Removed grid 225" ##################################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" fcsthrs="$1" num=$# -job_name=$(echo $job|sed 's/[jpt]gfs/gfs/') +job_name=${job/[jpt]gfs/gfs} -fcsthrs=$(printf "%03d" $fcsthrs) - -export SCALEDEC=${SCALDEC:-$USHgfs/scale_dec.sh} - -if test "$num" -ge 1 -then - echo "" - echo " Appropriate number of arguments were passed" - echo "" -else +if (( num != 1 )); then echo "" - echo " FATAL ERROR: Number of arguments were not passed." + echo " FATAL ERROR: Incorrect number of arguments " echo "" echo "" - echo "Usage: $0 \$fcsthrs (3-digit) " + echo "Usage: $0 \${fcsthrs} (3 digits) " echo "" exit 16 fi -cd $DATA/awips_g1 +cd "${DATA}" || exit 2 + +fcsthrs=$(printf "%03d" "${fcsthrs}") + +export SCALEDEC=${SCALDEC:-${USHgfs}/scale_dec.sh} + +cd ${DATA}/awips_g1 || exit 2 ############################################### # Wait for the availability of the pgrb file ############################################### icnt=1 -while [ $icnt -lt 1000 ] -do - if [ -s $COMIN/${RUN}.${cycle}.pgrb2b.0p25.f${fcsthrs}.idx ] - then - break - fi - - sleep 10 - icnt=$((icnt + 1)) - if [ $icnt -ge 180 ] - then - msg="ABORTING after 30 min of waiting for the pgrb file!" - err_exit $msg - fi -done +while (( icnt < 1000 )); do + if [[ -s "${COM_ATMOS_GRIB_0p25}/${RUN}.${cycle}.pgrb2b.0p25.f${fcsthrs}.idx" ]]; then + break + fi -######################################## -msg="HAS BEGUN!" 
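The `job_name=${job/[jpt]gfs/gfs}` change above (replacing the old `echo $job | sed` pipeline) uses bash pattern substitution: the first match of the pattern is rewritten in-place with no external processes. A sketch with a hypothetical job name:

```shell
#!/usr/bin/env bash
set -euo pipefail

# ${var/pattern/replacement} rewrites the first match of the glob pattern.
# The job name below is illustrative, not a real production job.
job="jgfs_atmos_awips_f012"
job_name=${job/[jpt]gfs/gfs}
echo "${job_name}"   # gfs_atmos_awips_f012
```

The glob `[jpt]gfs` matches `jgfs`, `pgfs`, or `tgfs`, so the same one-liner normalizes all three job-name prefixes.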
-postmsg "$jlogfile" "$msg" -######################################## + sleep 10 + icnt=$((icnt + 1)) + if (( icnt >= 180 )); then + msg="FATAL ERROR: No GFS pgrb2 file after 30 min of waiting" + err_exit "${msg}" + fi +done echo " ------------------------------------------" echo " BEGIN MAKING GFS GRIB1 AWIPS PRODUCTS" @@ -83,67 +72,65 @@ echo "###############################################" echo " Process GFS GRIB1 AWIP PRODUCTS (211) " echo "###############################################" echo " " -${TRACE_ON:-set -x} +set_trace - cp $COMIN/gfs.t${cyc}z.pgrb2.0p25.f${fcsthrs} tmpfile2 - cp $COMIN/gfs.t${cyc}z.pgrb2b.0p25.f${fcsthrs} tmpfile2b - cat tmpfile2 tmpfile2b > tmpfile - $WGRIB2 tmpfile | grep -F -f $PARMproduct/gfs_awips_parmlist_g2 | $WGRIB2 -i -grib masterfile tmpfile - $SCALEDEC masterfile - $CNVGRIB -g21 masterfile masterfile.grib1 +cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2.0p25.f${fcsthrs}" "tmpfile2" +cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2b.0p25.f${fcsthrs}" "tmpfile2b" +cat tmpfile2 tmpfile2b > tmpfile +${WGRIB2} tmpfile | grep -F -f "${PARMproduct}/gfs_awips_parmlist_g2" | \ + ${WGRIB2} -i -grib masterfile tmpfile +${SCALEDEC} masterfile +${CNVGRIB} -g21 masterfile masterfile.grib1 - ln -s masterfile.grib1 fort.11 +ln -s masterfile.grib1 fort.11 -# $OVERGRIDID << EOF - ${UTILgfs}/exec/overgridid << EOF +"${HOMEgfs}/exec/overgridid.x" << EOF 255 EOF - mv fort.51 master.grbf${fcsthrs} - rm fort.11 +mv fort.51 "master.grbf${fcsthrs}" +rm fort.11 - $GRBINDEX master.grbf${fcsthrs} master.grbif${fcsthrs} +${GRBINDEX} "master.grbf${fcsthrs}" "master.grbif${fcsthrs}" ############################################################### # Process GFS GRIB1 AWIP GRIDS 211 PRODUCTS ############################################################### - executable=mkgfsawps - DBNALERT_TYPE=GRIB_LOW +DBNALERT_TYPE=GRIB_LOW - startmsg +startmsg # GRID=211 out to 240 hours: - export GRID=211 - export FORT11=master.grbf${fcsthrs} - export 
FORT31=master.grbif${fcsthrs} - export FORT51=xtrn.awpgfs${fcsthrs}.${GRID} +export GRID=211 +export FORT11="master.grbf${fcsthrs}" +export FORT31="master.grbif${fcsthrs}" +export FORT51="xtrn.awpgfs${fcsthrs}.${GRID}" # $MKGFSAWPS < $PARMwmo/grib_awpgfs${fcsthrs}.${GRID} parm=KWBC >> $pgmout 2>errfile - ${UTILgfs}/exec/mkgfsawps < $PARMwmo/grib_awpgfs${fcsthrs}.${GRID} parm=KWBC >> $pgmout 2>errfile - export err=$?; err_chk - ############################## - # Post Files to ${COMOUTwmo} - ############################## +"${HOMEgfs}/exec/mkgfsawps.x" < "${PARMwmo}/grib_awpgfs${fcsthrs}.${GRID}" parm=KWBC >> "${pgmout}" 2>errfile +export err=$?; err_chk +############################## +# Post Files to ${COM_ATMOS_WMO} +############################## - if test "$SENDCOM" = 'YES' - then - cp xtrn.awpgfs${fcsthrs}.${GRID} ${COMOUTwmo}/xtrn.awpgfs${fcsthrs}.${GRID}.$job_name +if [[ "${SENDCOM}" = 'YES' ]]; then + cp "xtrn.awpgfs${fcsthrs}.${GRID}" "${COM_ATMOS_WMO}/xtrn.awpgfs${fcsthrs}.${GRID}.${job_name}" - ############################## - # Distribute Data - ############################## + ############################## + # Distribute Data + ############################## - if [ "$SENDDBN" = 'YES' -o "$SENDAWIP" = 'YES' ] ; then - $DBNROOT/bin/dbn_alert $DBNALERT_TYPE $NET $job ${COMOUTwmo}/xtrn.awpgfs${fcsthrs}.${GRID}.$job_name - else - msg="File $output_grb.$job_name not posted to db_net." - postmsg "$jlogfile" "$msg" - fi + if [[ "${SENDDBN}" == 'YES' || "${SENDAWIP}" == 'YES' ]] ; then + "${DBNROOT}/bin/dbn_alert" "${DBNALERT_TYPE}" "${NET}" "${job}" \ + "${COM_ATMOS_WMO}/xtrn.awpgfs${fcsthrs}.${GRID}.${job_name}" + else + echo "File ${output_grb}.${job_name} not posted to db_net." 
fi +fi -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat ${pgmout} fi ############################################################################### diff --git a/scripts/exgfs_atmos_nawips.sh b/scripts/exgfs_atmos_nawips.sh index 5b751735067..07b0ca8b3f6 100755 --- a/scripts/exgfs_atmos_nawips.sh +++ b/scripts/exgfs_atmos_nawips.sh @@ -18,16 +18,17 @@ source "$HOMEgfs/ush/preamble.sh" "${2}" export ILPOST=${ILPOST:-1} cd $DATA -RUN=$1 +RUN2=$1 fend=$2 DBN_ALERT_TYPE=$3 +destination=${4} -DATA_RUN=$DATA/$RUN +DATA_RUN=$DATA/$RUN2 mkdir -p $DATA_RUN cd $DATA_RUN # -NAGRIB=$GEMEXE/nagrib2_nc +NAGRIB=$GEMEXE/nagrib2 # cpyfil=gds @@ -44,139 +45,131 @@ maxtries=360 fhcnt=$fstart while [ $fhcnt -le $fend ] ; do -if [[ $(mkdir lock.${fhcnt}) == 0 ]] ; then - cd lock.$fhcnt - cp $FIXgempak/g2varswmo2.tbl g2varswmo2.tbl - cp $FIXgempak/g2vcrdwmo2.tbl g2vcrdwmo2.tbl - cp $FIXgempak/g2varsncep1.tbl g2varsncep1.tbl - cp $FIXgempak/g2vcrdncep1.tbl g2vcrdncep1.tbl - - fhr=$(printf "%03d" $fhcnt) - fhcnt3=$(expr $fhr % 3) - - fhr3=$(printf "%03d" $fhcnt) - - GEMGRD=${RUN}_${PDY}${cyc}f${fhr3} - -# Set type of Interpolation for WGRIB2 - export opt1=' -set_grib_type same -new_grid_winds earth ' - export opt1uv=' -set_grib_type same -new_grid_winds grid ' - export opt21=' -new_grid_interpolation bilinear -if ' - export opt22=":(CSNOW|CRAIN|CFRZR|CICEP|ICSEV):" - export opt23=' -new_grid_interpolation neighbor -fi ' - export opt24=' -set_bitmap 1 -set_grib_max_bits 16 -if ' - export opt25=":(APCP|ACPCP|PRATE|CPRAT):" - export opt26=' -set_grib_max_bits 25 -fi -if ' - export opt27=":(APCP|ACPCP|PRATE|CPRAT|DZDT):" - export opt28=' -new_grid_interpolation budget -fi ' - export TRIMRH=$HOMEgfs/ush/trim_rh.sh - - if [ $RUN = "gfs_0p50" ]; then - export GRIBIN=$COMIN/${model}.${cycle}.pgrb2.0p50.f${fhr} - GRIBIN_chk=$COMIN/${model}.${cycle}.pgrb2.0p50.f${fhr}.idx - elif [ $RUN = "gfs_0p25" -o $RUN = "gdas_0p25" -o $RUN = "gfs35_atl" -o $RUN = "gfs35_pac" -o 
$RUN = "gfs40" ]; then - export GRIBIN=$COMIN/${model}.${cycle}.pgrb2.0p25.f${fhr} - GRIBIN_chk=$COMIN/${model}.${cycle}.pgrb2.0p25.f${fhr}.idx - else - export GRIBIN=$COMIN/${model}.${cycle}.pgrb2.1p00.f${fhr} - GRIBIN_chk=$COMIN/${model}.${cycle}.pgrb2.1p00.f${fhr}.idx - fi - - icnt=1 - while [ $icnt -lt 1000 ] - do - if [ -r $GRIBIN_chk ] ; then - sleep 5 - break - else - msg="The process is waiting ... ${GRIBIN_chk} file to proceed." - postmsg "${jlogfile}" "$msg" - sleep 10 - let "icnt=icnt+1" - fi - if [ $icnt -ge $maxtries ] - then - msg="ABORTING: after 1 hour of waiting for ${GRIBIN_chk} file at F$fhr to end." - postmsg "${jlogfile}" "$msg" - export err=7 ; err_chk - exit $err - fi - done - -case $RUN in - gfs35_pac) -# $COPYGB2 -g "0 6 0 0 0 0 0 0 416 186 0 0 75125000 130000000 48 17000000 260000000 312000 312000 0" -x $GRIBIN grib$fhr -# NEW define gfs35_pac="0 6 0 0 0 0 0 0 416 186 0 -1 75125000 130000000 48 17405000 259480000 312000 312000 0" -# $COPYGB2 -g "0 6 0 0 0 0 0 0 416 186 0 -1 75125000 130000000 48 17405000 259480000 312000 312000 0" -x $GRIBIN grib$fhr - - export gfs35_pac='latlon 130.0:416:0.312 75.125:186:-0.312' - $WGRIB2 $GRIBIN $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid ${gfs35_pac} grib$fhr - $TRIMRH grib$fhr - ;; - gfs35_atl) -# $COPYGB2 -g "0 6 0 0 0 0 0 0 480 242 0 0 75125000 230000000 48 -500000 20000000 312000 312000 0" -x $GRIBIN grib$fhr -# NEW define gfs35_atl="0 6 0 0 0 0 0 0 480 242 0 -1 75125000 230000000 48 -67000 19448000 312000 312000 0" -# $COPYGB2 -g "0 6 0 0 0 0 0 0 480 242 0 -1 75125000 230000000 48 -67000 19448000 312000 312000 0" -x $GRIBIN grib$fhr - - export gfs35_atl='latlon 230.0:480:0.312 75.125:242:-0.312' - $WGRIB2 $GRIBIN $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid ${gfs35_atl} grib$fhr - $TRIMRH grib$fhr - ;; - gfs40) -# $COPYGB2 -g "30 6 0 0 0 0 0 0 185 129 12190000 226541000 8 25000000 265000000 40635000 40635000 0 64 25000000 25000000 0 0" -x 
$GRIBIN grib$fhr - - export gfs40='lambert:265.0:25.0:25.0 226.541:185:40635.0 12.19:129:40635.0' - $WGRIB2 $GRIBIN $opt1uv $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid ${gfs40} grib$fhr - $TRIMRH grib$fhr - ;; - *) - cp $GRIBIN grib$fhr -esac - - export pgm="nagrib2 F$fhr" - startmsg - - $NAGRIB << EOF - GBFILE = grib$fhr - INDXFL = - GDOUTF = $GEMGRD - PROJ = $proj - GRDAREA = $grdarea - KXKY = $kxky - MAXGRD = $maxgrd - CPYFIL = $cpyfil - GAREA = $garea - OUTPUT = $output - GBTBLS = $gbtbls - GBDIAG = - PDSEXT = $pdsext - l - r + if mkdir "lock.${fhcnt}" ; then + cd lock.$fhcnt + cp $FIXgempak/g2varswmo2.tbl g2varswmo2.tbl + cp $FIXgempak/g2vcrdwmo2.tbl g2vcrdwmo2.tbl + cp $FIXgempak/g2varsncep1.tbl g2varsncep1.tbl + cp $FIXgempak/g2vcrdncep1.tbl g2vcrdncep1.tbl + + fhr=$(printf "%03d" "${fhcnt}") + + GEMGRD=${RUN2}_${PDY}${cyc}f${fhr} + + # Set type of Interpolation for WGRIB2 + export opt1=' -set_grib_type same -new_grid_winds earth ' + export opt1uv=' -set_grib_type same -new_grid_winds grid ' + export opt21=' -new_grid_interpolation bilinear -if ' + export opt22=":(CSNOW|CRAIN|CFRZR|CICEP|ICSEV):" + export opt23=' -new_grid_interpolation neighbor -fi ' + export opt24=' -set_bitmap 1 -set_grib_max_bits 16 -if ' + export opt25=":(APCP|ACPCP|PRATE|CPRAT):" + export opt26=' -set_grib_max_bits 25 -fi -if ' + export opt27=":(APCP|ACPCP|PRATE|CPRAT|DZDT):" + export opt28=' -new_grid_interpolation budget -fi ' + export TRIMRH=$HOMEgfs/ush/trim_rh.sh + + case ${RUN2} in + # TODO: Why aren't we interpolating from the 0p25 grids for 35-km and 40-km? + 'gfs_0p50' | 'gfs_0p25') res=${RUN2: -4};; + *) res="1p00";; + esac + + source_var="COM_ATMOS_GRIB_${res}" + export GRIBIN="${!source_var}/${model}.${cycle}.pgrb2.${res}.f${fhr}" + GRIBIN_chk="${!source_var}/${model}.${cycle}.pgrb2.${res}.f${fhr}.idx" + + icnt=1 + while [ $icnt -lt 1000 ]; do + if [ -r $GRIBIN_chk ] ; then + sleep 5 + break + else + echo "The process is waiting ... 
${GRIBIN_chk} file to proceed." + sleep 10 + let "icnt=icnt+1" + fi + if [ $icnt -ge $maxtries ]; then + echo "ABORTING: after 1 hour of waiting for ${GRIBIN_chk} file at F$fhr to end." + export err=7 ; err_chk + exit $err + fi + done + + case $RUN2 in + gfs35_pac) + # $COPYGB2 -g "0 6 0 0 0 0 0 0 416 186 0 0 75125000 130000000 48 17000000 260000000 312000 312000 0" -x $GRIBIN grib$fhr + # NEW define gfs35_pac="0 6 0 0 0 0 0 0 416 186 0 -1 75125000 130000000 48 17405000 259480000 312000 312000 0" + # $COPYGB2 -g "0 6 0 0 0 0 0 0 416 186 0 -1 75125000 130000000 48 17405000 259480000 312000 312000 0" -x $GRIBIN grib$fhr + + export gfs35_pac='latlon 130.0:416:0.312 75.125:186:-0.312' + $WGRIB2 $GRIBIN $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid ${gfs35_pac} grib$fhr + $TRIMRH grib$fhr + ;; + gfs35_atl) + # $COPYGB2 -g "0 6 0 0 0 0 0 0 480 242 0 0 75125000 230000000 48 -500000 20000000 312000 312000 0" -x $GRIBIN grib$fhr + # NEW define gfs35_atl="0 6 0 0 0 0 0 0 480 242 0 -1 75125000 230000000 48 -67000 19448000 312000 312000 0" + # $COPYGB2 -g "0 6 0 0 0 0 0 0 480 242 0 -1 75125000 230000000 48 -67000 19448000 312000 312000 0" -x $GRIBIN grib$fhr + + export gfs35_atl='latlon 230.0:480:0.312 75.125:242:-0.312' + $WGRIB2 $GRIBIN $opt1 $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid ${gfs35_atl} grib$fhr + $TRIMRH grib$fhr + ;; + gfs40) + # $COPYGB2 -g "30 6 0 0 0 0 0 0 185 129 12190000 226541000 8 25000000 265000000 40635000 40635000 0 64 25000000 25000000 0 0" -x $GRIBIN grib$fhr + + export gfs40='lambert:265.0:25.0:25.0 226.541:185:40635.0 12.19:129:40635.0' + $WGRIB2 $GRIBIN $opt1uv $opt21 $opt22 $opt23 $opt24 $opt25 $opt26 $opt27 $opt28 -new_grid ${gfs40} grib$fhr + $TRIMRH grib$fhr + ;; + *) + cp $GRIBIN grib$fhr + esac + + export pgm="nagrib2 F$fhr" + startmsg + + $NAGRIB << EOF + GBFILE = grib$fhr + INDXFL = + GDOUTF = $GEMGRD + PROJ = $proj + GRDAREA = $grdarea + KXKY = $kxky + MAXGRD = $maxgrd + CPYFIL = $cpyfil + 
GAREA = $garea + OUTPUT = $output + GBTBLS = $gbtbls + GBDIAG = + PDSEXT = $pdsext + l + r EOF - export err=$?;err_chk - - if [ $SENDCOM = "YES" ] ; then - cpfs $GEMGRD $COMOUT/$GEMGRD - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/$GEMGRD - fi - fi - cd $DATA_RUN -else + export err=$?;err_chk + + if [[ ${SENDCOM} == "YES" ]] ; then + cpfs "${GEMGRD}" "${destination}/${GEMGRD}" + if [[ ${SENDDBN} == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${destination}/${GEMGRD}" + fi + fi + cd $DATA_RUN + else if [ $fhcnt -ge 240 ] ; then - if [ $fhcnt -lt 276 -a $RUN = "gfs_0p50" ] ; then - let fhcnt=fhcnt+6 - else - let fhcnt=fhcnt+12 - fi - elif [ $fhcnt -lt 120 -a $RUN = "gfs_0p25" ] ; then -#### let fhcnt=fhcnt+1 - let fhcnt=fhcnt+$ILPOST + if [ $fhcnt -lt 276 -a $RUN2 = "gfs_0p50" ] ; then + let fhcnt=fhcnt+6 + else + let fhcnt=fhcnt+12 + fi + elif [ $fhcnt -lt 120 -a $RUN2 = "gfs_0p25" ] ; then + #### let fhcnt=fhcnt+1 + let fhcnt=fhcnt+$ILPOST else - let fhcnt=fhcnt+finc + fhcnt=$((ILPOST > finc ? fhcnt+ILPOST : fhcnt+finc )) fi -fi + fi done $GEMEXE/gpend diff --git a/scripts/exgfs_atmos_post.sh b/scripts/exgfs_atmos_post.sh new file mode 100755 index 00000000000..40bde0f731c --- /dev/null +++ b/scripts/exgfs_atmos_post.sh @@ -0,0 +1,513 @@ +#! /usr/bin/env bash + +##################################################################### +# echo "-----------------------------------------------------" +# echo " exgfs_nceppost.sh" +# echo " Apr 99 - Michaud - Generated to post global forecast" +# echo " Mar 03 - Zhu - Add post for 0.5x0.5 degree" +# echo " Nov 03 - Gilbert - Modified from exglobal_post.sh.sms" +# echo " to run only one master post job." 
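The new increment logic above, `fhcnt=$((ILPOST > finc ? fhcnt+ILPOST : fhcnt+finc ))`, relies on the C-style conditional operator that bash supports inside arithmetic expansion: advance by whichever of the two increments is larger. A sketch with illustrative values:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Ternary inside $(( )): condition ? value-if-true : value-if-false.
# The values below are illustrative, not production settings.
ILPOST=3
finc=1
fhcnt=120
fhcnt=$(( ILPOST > finc ? fhcnt + ILPOST : fhcnt + finc ))
echo "${fhcnt}"   # 123
```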
+# echo " Jan 07 - Cooke - Add DBNet Alert for Master files" +# echo " May 07 - Chuang - Modified scripts to run unified post" +# echo " Feb 10 - Carlis - Add 12-hr accum precip bucket at f192" +# echo " Jun 12 - Wang - Add option for grb2" +# echo " Jul 14 - Carlis - Add 0.25 deg master " +# echo " Mar 17 - F Yang - Modified for running fv3gfs" +# echo " Aug 17 - Meng - Add flags for turning on/off flx, gtg " +# echo " and satellite look like file creation" +# echo " and use 3-digit forecast hour naming" +# echo " post output files" +# echo " Dec 17 - Meng - Link sfc data file to flxfile " +# echo " since fv3gfs does not output sfc files any more." +# echo " Dec 17 - Meng - Add fv3gfs_downstream_nems.sh for pgb processing " +# echo " Jan 18 - Meng - Add flag PGBF for truning on/off pgb processing. " +# echo " Jan 18 - Meng - For EE2 standard, move IDRT POSTGPVARS setting" +# echo " from j-job script." +# echo " Feb 18 - Meng - Removed legacy setting for generating grib1 data" +# echo " and reading sigio model outputs." +# echo " Aug 20 - Meng - Remove .ecf extentsion per EE2 review." +# echo " Sep 20 - Meng - Update clean up files per EE2 review." +# echo " Dec 20 - Meng - Add alert for special data file." +# echo " Mar 21 - Meng - Update POSTGRB2TBL default setting." +# echo " Jun 21 - Mao - Instead of err_chk, catch err and print out" +# echo " WAFS failure warnings to avoid job crashing" +# echo " Oct 21 - Meng - Remove jlogfile for wcoss2 transition." +# echo " Feb 22 - Lin - Exception handling if anl input not found." 
+# echo "-----------------------------------------------------" +##################################################################### + +source "${HOMEgfs}/ush/preamble.sh" + +cd "${DATA}" || exit 1 + +export POSTGPSH=${POSTGPSH:-${USHgfs}/gfs_post.sh} +export GFSDOWNSH=${GFSDOWNSH:-${USHgfs}/fv3gfs_downstream_nems.sh} +export GFSDOWNSHF=${GFSDOWNSHF:-${USHgfs}/inter_flux.sh} +export GFSDWNSH=${GFSDWNSH:-${USHgfs}/fv3gfs_dwn_nems.sh} +export TRIMRH=${TRIMRH:-${USHgfs}/trim_rh.sh} +export MODICEC=${MODICEC:-${USHgfs}/mod_icec.sh} +export INLINE_POST=${INLINE_POST:-".false."} + +############################################################ +# Define Variables: +# ----------------- +# FH is the current forecast hour. +# SLEEP_TIME is the number of seconds to sleep before exiting with error. +# SLEEP_INT is the number of seconds to sleep between restrt file checks. +# restart_file is the name of the file to key off of to kick off post. +############################################################ +export IO=${LONB:-1440} +export JO=${LATB:-721} +export OUTTYP=${OUTTYP:-4} +export FLXF=${FLXF:-"YES"} +export FLXGF=${FLXGF:-"YES"} +export GOESF=${GOESF:-"YES"} +export WAFSF=${WAFSF:-"NO"} +export PGBF=${PGBF:-"YES"} +export TCYC=${TCYC:-".t${cyc}z."} +export PREFIX=${PREFIX:-${RUN}${TCYC}} +export machine=${machine:-WCOSS2} + +########################### +# Specify Output layers +########################### +export POSTGPVARS="KPO=57,PO=1000.,975.,950.,925.,900.,875.,850.,825.,800.,775.,750.,725.,700.,675.,650.,625.,600.,575.,550.,525.,500.,475.,450.,425.,400.,375.,350.,325.,300.,275.,250.,225.,200.,175.,150.,125.,100.,70.,50.,40.,30.,20.,15.,10.,7.,5.,3.,2.,1.,0.7,0.4,0.2,0.1,0.07,0.04,0.02,0.01," + +########################################################## +# Specify variable to directly output pgrb2 files for GDAS/GFS +########################################################## +export IDRT=${IDRT:-0} # IDRT=0 is setting for outputting grib files on lat/lon grid + 
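The `${!source_var}` construct used earlier in the exgfs_atmos_nawips.sh hunk (building the name `COM_ATMOS_GRIB_${res}` and then dereferencing it) is bash indirect expansion: the value of one variable is treated as the name of another. A sketch with a hypothetical path:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Indirect expansion: ${!name} expands the variable whose name is stored
# in "name". The COM path below is illustrative.
COM_ATMOS_GRIB_0p50="/com/gfs/grib/0p50"
res="0p50"
source_var="COM_ATMOS_GRIB_${res}"
echo "${!source_var}"   # /com/gfs/grib/0p50
```

This lets one code path serve every resolution: only `res` changes, and the matching `COM_ATMOS_GRIB_*` variable is picked up automatically.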
+############################################################ +# Post Analysis Files before starting the Forecast Post +############################################################ +# Process analysis when post_times is 00 +stime="$(echo "${post_times}" | cut -c1-3)" +export stime +export loganl="${COM_ATMOS_ANALYSIS}/${PREFIX}atmanl.nc" + +if [[ "${stime}" = "anl" ]]; then + if [[ -f "${loganl}" ]]; then + # add new environmental variables for running new ncep post + # Validation date + export VDATE=${PDY}${cyc} + # specify output file name from chgres which is input file name to nceppost + # if model already runs gfs io, make sure GFSOUT is linked to the gfsio file + # new imported variable for global_nceppost.sh + export GFSOUT=${PREFIX}gfsioanl + + # specify smaller control file for GDAS because GDAS does not + # produce flux file, the default will be /nwprod/parm/gfs_cntrl.parm + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + # use grib2 nomonic table in product g2tmpl directory as default + export POSTGRB2TBL=${POSTGRB2TBL:-${g2tmpl_ROOT}/share/params_grib2_tbl_new} + export PostFlatFile=${PostFlatFile:-${PARMpost}/postxconfig-NT-GFS-ANL.txt} + export CTLFILE=${PARMpost}/postcntrl_gfs_anl.xml + fi + + [[ -f flxfile ]] && rm flxfile ; [[ -f nemsfile ]] && rm nemsfile + ln -fs "${COM_ATMOS_ANALYSIS}/${PREFIX}atmanl.nc" nemsfile + export NEMSINP=nemsfile + ln -fs "${COM_ATMOS_ANALYSIS}/${PREFIX}sfcanl.nc" flxfile + export FLXINP=flxfile + + export PGBOUT=pgbfile + export PGIOUT=pgifile + export PGBOUT2=pgbfile.grib2 + export PGIOUT2=pgifile.grib2.idx + export IGEN=${IGEN_ANL} + export FILTER=0 + + ${POSTGPSH} + export err=$?; err_chk + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + mv "${PGBOUT}" "${PGBOUT2}" + fi + + # Process pgb files + if [[ "${PGBF}" = 'YES' ]]; then + export FH=-1 + export downset=${downset:-2} + ${GFSDOWNSH} + export err=$?; err_chk + fi + + if [[ "${SENDCOM}" = 'YES' ]]; then + export fhr3=anl + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + 
MASTERANL=${PREFIX}master.grb2${fhr3} + MASTERANLIDX=${PREFIX}master.grb2i${fhr3} + cp "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERANL}" + ${GRB2INDEX} "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERANLIDX}" + fi + + if [[ "${SENDDBN}" = 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_MSC_sfcanl "${job}" "${COM_ATMOS_ANALYSIS}/${PREFIX}sfcanl.nc" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_SA "${job}" "${COM_ATMOS_ANALYSIS}/${PREFIX}atmanl.nc" + if [[ "${PGBF}" = 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P25 "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.anl" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P25_WIDX "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.anl.idx" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P25 "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.anl" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P25_WIDX "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.anl.idx" + + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P5 "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.anl" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P5_WIDX "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.anl.idx" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P5 "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.anl" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P5_WIDX "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.anl.idx" + + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_1P0 "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.anl" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_1P0_WIDX "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.anl.idx" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_1P0 "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.anl" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_1P0_WIDX "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.anl.idx" + fi + fi + fi + [[ -f pgbfile.grib2 ]] && rm pgbfile.grib2 + # ecflow_client --event release_pgrb2_anl + + ########################## WAFS U/V/T analysis start 
########################## + # U/V/T on ICAO standard atmospheric pressure levels for WAFS verification + if [[ "${WAFSF}" = "YES" ]]; then + if [[ "${RUN}" = "gfs" && "${GRIBVERSION}" = 'grib2' ]]; then + export OUTTYP=${OUTTYP:-4} + + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-WAFS-ANL.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_wafs_anl.xml" + + export PGBOUT=wafsfile + export PGIOUT=wafsifile + + ${POSTGPSH} + export err=$? + if (( err != 0 )); then + echo " *** GFS POST WARNING: WAFS output failed for analysis, err=${err}" + else + # WAFS package doesn't process this part. + # Need to be saved for WAFS U/V/T verification, + # resolution higher than WAFS 1.25 deg for future compatibility + wafsgrid="latlon 0:1440:0.25 90:721:-0.25" + ${WGRIB2} "${PGBOUT}" -set_grib_type same -new_grid_winds earth \ + -new_grid_interpolation bilinear -set_bitmap 1 \ + -new_grid ${wafsgrid} "${PGBOUT}.tmp" + + if [[ "${SENDCOM}" = "YES" ]]; then + cp "${PGBOUT}.tmp" "${COM_ATMOS_WAFS}/${PREFIX}wafs.0p25.anl" + ${WGRIB2} -s "${PGBOUT}.tmp" > "${COM_ATMOS_WAFS}/${PREFIX}wafs.0p25.anl.idx" + + # if [ $SENDDBN = YES ]; then + # $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_GB2 $job $COMOUT/${PREFIX}wafs.0p25.anl + # $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_GB2__WIDX $job $COMOUT/${PREFIX}wafs.0p25.anl.idx + # fi + fi + rm "${PGBOUT}" "${PGBOUT}.tmp" + fi + fi + fi + ########################## WAFS U/V/T analysis end ########################## + else + #### atmanl file not found need failing job + echo " *** FATAL ERROR: No model anl file output " + export err=9 + err_chk + fi +else ## not_anl if_stime + SLEEP_LOOP_MAX=$(( SLEEP_TIME / SLEEP_INT )) + + ############################################################ + # Loop Through the Post Forecast Files + ############################################################ + + for fhr in ${post_times}; do + echo "Start processing fhr=${post_times}" + ############################### + # Start Looping for the + # existence of the restart 
files + ############################### + export pgm="postcheck" + ic=1 + while (( ic <= SLEEP_LOOP_MAX )); do + if [[ -f "${restart_file}${fhr}.txt" ]]; then + break + else + ic=$(( ic + 1 )) + sleep "${SLEEP_INT}" + fi + ############################### + # If we reach this point assume + # fcst job never reached restart + # period and error exit + ############################### + if (( ic == SLEEP_LOOP_MAX )); then + echo " *** FATAL ERROR: No model output for f${fhr} " + export err=9 + err_chk + fi + done + + ############################### + # Put restart files into /nwges + # for backup to start Model Fcst + ############################### + [[ -f flxfile ]] && rm flxfile ; [[ -f nemsfile ]] && rm nemsfile + ln -fs "${COM_ATMOS_HISTORY}/${PREFIX}atmf${fhr}.nc" nemsfile + export NEMSINP=nemsfile + ln -fs "${COM_ATMOS_HISTORY}/${PREFIX}sfcf${fhr}.nc" flxfile + export FLXINP=flxfile + + if (( fhr > 0 )); then + export IGEN=${IGEN_FCST} + else + export IGEN=${IGEN_ANL} + fi + + # No shellcheck, NDATE is not a typo + # shellcheck disable=SC2153 + VDATE="$(${NDATE} "+${fhr}" "${PDY}${cyc}")" + # shellcheck disable= + export VDATE + export OUTTYP=${OUTTYP:-4} + export GFSOUT="${PREFIX}gfsio${fhr}" + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + export POSTGRB2TBL="${POSTGRB2TBL:-${g2tmpl_ROOT}/share/params_grib2_tbl_new}" + export PostFlatFile="${PostFlatFile:-${PARMpost}/postxconfig-NT-GFS.txt}" + + if [[ "${RUN}" = "gfs" ]]; then + export IGEN=${IGEN_GFS} + if (( fhr > 0 )); then export IGEN=${IGEN_FCST} ; fi + else + export IGEN=${IGEN_GDAS_ANL} + if (( fhr > 0 )); then export IGEN=${IGEN_FCST} ; fi + fi + if [[ "${RUN}" = "gfs" ]]; then + if (( fhr == 0 )); then + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-F00.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_f00.xml" + else + export CTLFILE="${CTLFILEGFS:-${PARMpost}/postcntrl_gfs.xml}" + fi + else + if (( fhr == 0 )); then + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-F00.txt" + export 
CTLFILE="${CTLFILEGFS:-${PARMpost}/postcntrl_gfs_f00.xml}" + else + export CTLFILE="${CTLFILEGFS:-${PARMpost}/postcntrl_gfs.xml}" + fi + fi + fi + + export FLXIOUT=flxifile + export PGBOUT=pgbfile + export PGIOUT=pgifile + export PGBOUT2=pgbfile.grib2 + export PGIOUT2=pgifile.grib2.idx + export FILTER=0 + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + MASTERFL=${PREFIX}master.grb2f${fhr} + MASTERFLIDX=${PREFIX}master.grb2if${fhr} + fi + + if [[ "${INLINE_POST}" = ".false." ]]; then + ${POSTGPSH} + else + cp -p "${COM_ATMOS_MASTER}/${MASTERFL}" "${PGBOUT}" + fi + export err=$?; err_chk + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + mv "${PGBOUT}" "${PGBOUT2}" + fi + + # Process pgb files + if [[ "${PGBF}" = 'YES' ]]; then + export FH=$(( 10#${fhr} + 0 )) + export downset=${downset:-2} + ${GFSDOWNSH} + export err=$?; err_chk + fi + + if [[ "${SENDCOM}" = "YES" ]]; then + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + if [[ "${INLINE_POST}" = ".false." ]]; then + cp "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERFL}" + fi + ${GRB2INDEX} "${PGBOUT2}" "${COM_ATMOS_MASTER}/${MASTERFLIDX}" + fi + + if [[ "${SENDDBN}" = 'YES' ]]; then + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + if [[ "${PGBF}" = 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P25 "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P25_WIDX "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.f${fhr}.idx" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P25 "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P25_WIDX "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.f${fhr}.idx" + + if [[ -s "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.f${fhr}" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P5 "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_0P5_WIDX "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.f${fhr}.idx" + 
"${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P5 "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_0P5_WIDX "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.f${fhr}.idx" + fi + + if [[ -s "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr}" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_1P0 "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2_1P0_WIDX "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr}.idx" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_1P0 "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.f${fhr}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGB2B_1P0_WIDX "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.f${fhr}.idx" + fi + fi + fi + fi + + export fhr + "${USHgfs}/gfs_transfer.sh" + fi + [[ -f pgbfile.grib2 ]] && rm pgbfile.grib2 + + + # use post to generate GFS Grib2 Flux file as model generated Flux file + # will be in nemsio format after FY17 upgrade. + if (( OUTTYP == 4 )) && [[ "${FLXF}" == "YES" ]]; then + if (( fhr == 0 )); then + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-FLUX-F00.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_flux_f00.xml" + else + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-FLUX.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_flux.xml" + fi + export PGBOUT=fluxfile + export FILTER=0 + export FLUXFL=${PREFIX}sfluxgrbf${fhr}.grib2 + FLUXFLIDX=${PREFIX}sfluxgrbf${fhr}.grib2.idx + + if [[ "${INLINE_POST}" = ".false." 
]]; then + ${POSTGPSH} + export err=$?; err_chk + mv fluxfile "${COM_ATMOS_MASTER}/${FLUXFL}" + fi + ${WGRIB2} -s "${COM_ATMOS_MASTER}/${FLUXFL}" > "${COM_ATMOS_MASTER}/${FLUXFLIDX}" + + #Add extra flux.1p00 file for coupled + if [[ "${FLXGF}" = 'YES' ]]; then + export FH=$(( 10#${fhr} + 0 )) + ${GFSDOWNSHF} + export err=$?; err_chk + fi + + if [[ "${SENDDBN}" = 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_SGB_GB2 "${job}" "${COM_ATMOS_MASTER}/${FLUXFL}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_SGB_GB2_WIDX "${job}" "${COM_ATMOS_MASTER}/${FLUXFLIDX}" + fi + fi + + # process satellite look alike separately so that master pgb gets out in time + # set outtyp to 2 because master post already generates gfs io files + if [[ "${GOESF}" = "YES" ]]; then + export OUTTYP=${OUTTYP:-4} + + # specify output file name from chgres which is input file name to nceppost + # if model already runs gfs io, make sure GFSOUT is linked to the gfsio file + # new imported variable for global_post.sh + + export GFSOUT=${PREFIX}gfsio${fhr} + + # link satellite coefficients files, use hwrf version as ops crtm 2.0.5 + # does not new coefficient files used by post + export FIXCRTM="${FIXCRTM:-${CRTM_FIX}}" + "${USHgfs}/link_crtm_fix.sh" "${FIXCRTM}" + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-GOES.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_goes.xml" + fi + export FLXINP=flxfile + export FLXIOUT=flxifile + export PGBOUT=goesfile + export PGIOUT=goesifile + export FILTER=0 + export IO=0 + export JO=0 + export IGEN=0 + + if [[ "${NET}" = "gfs" ]]; then + ${POSTGPSH} + export err=$?; err_chk + fi + + if [[ "${GRIBVERSION}" = 'grib2' ]]; then + SPECIALFL="${PREFIX}special.grb2" + SPECIALFLIDX="${PREFIX}special.grb2i" + fi + fhr3=${fhr} + + if [[ "${SENDCOM}" = "YES" ]]; then + # echo "$PDY$cyc$pad$fhr" > $COMOUT/${RUN}.t${cyc}z.master.control + + mv goesfile "${COM_ATMOS_GOES}/${SPECIALFL}f${fhr}" + mv goesifile 
"${COM_ATMOS_GOES}/${SPECIALFLIDX}f${fhr}" + + if [[ "${SENDDBN}" = "YES" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_SPECIAL_GB2 "${job}" "${COM_ATMOS_GOES}/${SPECIALFL}f${fhr}" + fi + fi + fi + # end of satellite processing + + ########################## WAFS start ########################## + # Generate WAFS products on ICAO standard level. + # Do not need to be sent out to public, WAFS package will process the data. + if [[ "${WAFSF}" = "YES" ]] && (( 10#${fhr} <= 120 )); then + if [[ "${RUN}" = gfs && "${GRIBVERSION}" = 'grib2' ]]; then + export OUTTYP=${OUTTYP:-4} + + # Extend WAFS icing and gtg up to 120 hours + export PostFlatFile="${PARMpost}/postxconfig-NT-GFS-WAFS.txt" + export CTLFILE="${PARMpost}/postcntrl_gfs_wafs.xml" + + # gtg has its own configurations + cp "${PARMpost}/gtg.config.gfs" gtg.config + cp "${PARMpost}/gtg_imprintings.txt" gtg_imprintings.txt + + export PGBOUT=wafsfile + export PGIOUT=wafsifile + + # WAFS data is processed: + # hourly if fhr<=24 + # every 3 forecast hour if 24 ../${RUN}.${cycle}.bufrsnd.tar.gz -cd $DATA +cd "${COM_ATMOS_BUFR}" || exit 2 +tar -cf - . | /usr/bin/gzip > "${RUN}.${cycle}.bufrsnd.tar.gz" +cd "${DATA}" || exit 2 ######################################## # Send the single tar file to OSO ######################################## -if test "$SENDDBN" = 'YES' -then - $DBNROOT/bin/dbn_alert MODEL GFS_BUFRSND_TAR $job \ - $COMOUT/${RUN}.${cycle}.bufrsnd.tar.gz +if [[ "${SENDDBN}" == 'YES' ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_BUFRSND_TAR "${job}" \ + "${COM_ATMOS_BUFR}/${RUN}.${cycle}.bufrsnd.tar.gz" fi ######################################## # Create Regional Collectives of BUFR data and # add appropriate WMO Headers. 
######################################## -collect=' 1 2 3 4 5 6 7 8 9' -if [ $machine == "HERA" -o $machine == "JET" ]; then -for m in ${collect} -do -sh $USHbufrsnd/gfs_sndp.sh $m -done - -################################################ -# Convert the bufr soundings into GEMPAK files -################################################ -sh $USHbufrsnd/gfs_bfr2gpk.sh - -else rm -rf poe_col -for m in ${collect} -do -echo "sh $USHbufrsnd/gfs_sndp.sh $m " >> poe_col +for (( m = 1; m <10 ; m++ )); do + echo "sh ${USHbufrsnd}/gfs_sndp.sh ${m} " >> poe_col done -mv poe_col cmdfile +if [[ ${CFP_MP:-"NO"} == "YES" ]]; then + nl -n ln -v 0 poe_col > cmdfile +else + mv poe_col cmdfile +fi cat cmdfile chmod +x cmdfile ${APRUN_POSTSNDCFP} cmdfile -sh $USHbufrsnd/gfs_bfr2gpk.sh -fi +sh "${USHbufrsnd}/gfs_bfr2gpk.sh" ############## END OF SCRIPT ####################### diff --git a/scripts/exgfs_atmos_vminmon.sh b/scripts/exgfs_atmos_vminmon.sh new file mode 100755 index 00000000000..a1346d5f9e9 --- /dev/null +++ b/scripts/exgfs_atmos_vminmon.sh @@ -0,0 +1,116 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: exgfs_vrfminmon.sh +# Script description: Runs data extract/validation for GSI normalization diag data +# +# Author: Ed Safford Org: NP23 Date: 2015-04-10 +# +# Abstract: This script runs the data extract/validation portion of the +# MinMon package. 
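The postsnd hunk above replaces the HERA/JET serial loop with a single command file that works for both MPMD modes: when `CFP_MP=YES` the launcher expects each line prefixed with a zero-based rank number, which `nl -n ln -v 0` supplies. A minimal standalone sketch of that pattern (the `./gfs_sndp.sh` path is illustrative, not the workflow's real `$USHbufrsnd` location):

```shell
#!/usr/bin/env bash
# Build a command file of independent tasks, one per line.
rm -f poe_col
for (( m = 1; m < 10; m++ )); do
  echo "sh ./gfs_sndp.sh ${m}" >> poe_col
done

# CFP multi-prog mode needs 0-based rank numbers prepended to each line;
# plain CFP consumes the file as-is.
if [[ "${CFP_MP:-NO}" == "YES" ]]; then
  nl -n ln -v 0 poe_col > cmdfile   # "0  sh ./gfs_sndp.sh 1", "1  sh ...", ...
else
  mv poe_col cmdfile
fi
chmod +x cmdfile
```

In the patch, `${APRUN_POSTSNDCFP} cmdfile` then launches all nine collectives concurrently instead of the old one-at-a-time loop.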
+#
+# Condition codes
+# 0 - no problem encountered
+# >0 - some problem encountered
+#
+################################################################################
+
+
+########################################
+# Set environment
+########################################
+export RUN_ENVIR=${RUN_ENVIR:-nco}
+export NET=${NET:-gfs}
+export RUN=${RUN:-gfs}
+export envir=${envir:-prod}
+
+########################################
+# Command line arguments
+########################################
+export PDY=${1:-${PDY:?}}
+export cyc=${2:-${cyc:?}}
+
+########################################
+# Directories
+########################################
+export DATA=${DATA:-$(pwd)}
+
+
+########################################
+# Filenames
+########################################
+gsistat=${gsistat:-${COM_ATMOS_ANALYSIS}/gfs.t${cyc}z.gsistat}
+export mm_gnormfile=${gnormfile:-${M_FIXgfs}/gfs_minmon_gnorm.txt}
+export mm_costfile=${costfile:-${M_FIXgfs}/gfs_minmon_cost.txt}
+
+########################################
+# Other variables
+########################################
+export MINMON_SUFFIX=${MINMON_SUFFIX:-GFS}
+export PDATE=${PDY}${cyc}
+export NCP=${NCP:-/bin/cp}
+export pgm=exgfs_vrfminmon.sh
+
+
+
+if [[ ! -d ${DATA} ]]; then
+  mkdir "${DATA}"
+fi
+cd "${DATA}" || exit 1
+
+######################################################################
+
+data_available=0
+
+if [[ -s ${gsistat} ]]; then
+
+  data_available=1
+
+  #------------------------------------------------------------------
+  # Copy the $MINMON_SUFFIX.gnorm_data.txt file to the working directory
+  # It's ok if it doesn't exist; we'll create a new one if needed.
+  #------------------------------------------------------------------
+  if [[ -s ${M_TANKverf}/gnorm_data.txt ]]; then
+    ${NCP} "${M_TANKverf}/gnorm_data.txt" gnorm_data.txt
+  elif [[ -s ${M_TANKverfM1}/gnorm_data.txt ]]; then
+    ${NCP} "${M_TANKverfM1}/gnorm_data.txt" gnorm_data.txt
+  fi
+
+
+  #------------------------------------------------------------------
+  # Run the child scripts.
+  #------------------------------------------------------------------
+  ${USHminmon}/minmon_xtrct_costs.pl ${MINMON_SUFFIX} ${PDY} ${cyc} ${gsistat} dummy
+  rc_costs=$?
+  echo "rc_costs = $rc_costs"
+
+  ${USHminmon}/minmon_xtrct_gnorms.pl ${MINMON_SUFFIX} ${PDY} ${cyc} ${gsistat} dummy
+  rc_gnorms=$?
+  echo "rc_gnorms = $rc_gnorms"
+
+  ${USHminmon}/minmon_xtrct_reduct.pl ${MINMON_SUFFIX} ${PDY} ${cyc} ${gsistat} dummy
+  rc_reduct=$?
+  echo "rc_reduct = $rc_reduct"
+
+fi
+
+#####################################################################
+# Postprocessing
+
+err=0
+if [[ ${data_available} -ne 1 ]]; then
+  err=1
+elif [[ $rc_costs -ne 0 ]]; then
+  err=$rc_costs
+elif [[ $rc_gnorms -ne 0 ]]; then
+  err=$rc_gnorms
+elif [[ $rc_reduct -ne 0 ]]; then
+  err=$rc_reduct
+fi
+
+exit ${err}
+
diff --git a/scripts/exgfs_wave_init.sh b/scripts/exgfs_wave_init.sh
index 31c39fd52a7..2be224d1dac 100755
--- a/scripts/exgfs_wave_init.sh
+++ b/scripts/exgfs_wave_init.sh
@@ -26,7 +26,7 @@
 # --------------------------------------------------------------------------- #
 # 0.
Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" # 0.a Basic modes of operation @@ -40,11 +40,11 @@ source "$HOMEgfs/ush/preamble.sh" echo ' *** MWW3 INIT CONFIG SCRIPT ***' echo ' ********************************' echo ' Initial configuration script' - echo " Model identifier : ${CDUMP}wave" + echo " Model identifier : ${RUN}wave" echo ' ' echo "Starting at : $(date)" echo ' ' - ${TRACE_ON:-set -x} + set_trace # Script will run only if pre-defined NTASKS # The actual work is distributed over these tasks. @@ -58,7 +58,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo " Script set to run with $NTASKS tasks " echo ' ' - ${TRACE_ON:-set -x} + set_trace # --------------------------------------------------------------------------- # @@ -68,7 +68,7 @@ source "$HOMEgfs/ush/preamble.sh" echo 'Preparing input files :' echo '-----------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace # 1.a Model definition files @@ -82,32 +82,30 @@ source "$HOMEgfs/ush/preamble.sh" array=($WAVECUR_FID $WAVEICE_FID $WAVEWND_FID $waveuoutpGRD $waveGRD $waveesmfGRD $wavepostGRD $waveinterpGRD) grdALL=$(printf "%s\n" "${array[@]}" | sort -u | tr '\n' ' ') - for grdID in ${grdALL} - do - if [ -f "$COMIN/rundata/${CDUMP}wave.mod_def.${grdID}" ] - then + for grdID in ${grdALL}; do + if [[ -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" ]]; then set +x - echo " Mod def file for $grdID found in ${COMIN}/rundata. copying ...." - ${TRACE_ON:-set -x} - cp $COMIN/rundata/${CDUMP}wave.mod_def.${grdID} mod_def.$grdID + echo " Mod def file for ${grdID} found in ${COM_WAVE_PREP}. copying ...." + set_trace + cp "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" "mod_def.${grdID}" else set +x - echo " Mod def file for $grdID not found in ${COMIN}/rundata. Setting up to generate ..." + echo " Mod def file for ${grdID} not found in ${COM_WAVE_PREP}. Setting up to generate ..." 
echo ' ' - ${TRACE_ON:-set -x} - if [ -f $PARMwave/ww3_grid.inp.$grdID ] + set_trace + if [ -f $FIXwave/ww3_grid.inp.$grdID ] then - cp $PARMwave/ww3_grid.inp.$grdID ww3_grid.inp.$grdID + cp $FIXwave/ww3_grid.inp.$grdID ww3_grid.inp.$grdID fi if [ -f ww3_grid.inp.$grdID ] then set +x echo ' ' - echo " ww3_grid.inp.$grdID copied ($PARMwave/ww3_grid.inp.$grdID)." + echo " ww3_grid.inp.$grdID copied ($FIXwave/ww3_grid.inp.$grdID)." echo ' ' - ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -116,11 +114,11 @@ source "$HOMEgfs/ush/preamble.sh" echo '*********************************************************** ' echo " grdID = $grdID" echo ' ' - ${TRACE_ON:-set -x} + set_trace err=2;export err;${errchk} fi - [[ ! -d $COMOUT/rundata ]] && mkdir -m 775 -p $COMOUT/rundata + [[ ! -d "${COM_WAVE_PREP}" ]] && mkdir -m 775 -p "${COM_WAVE_PREP}" if [ ${CFP_MP:-"NO"} = "YES" ]; then echo "$nmoddef $USHwave/wave_grid_moddef.sh $grdID > $grdID.out 2>&1" >> cmdfile else @@ -141,7 +139,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo " Generating $nmoddef mod def files" echo ' ' - ${TRACE_ON:-set -x} + set_trace # Set number of processes for mpmd wavenproc=$(wc -l cmdfile | awk '{print $1}') @@ -154,7 +152,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Executing the mod_def command file at : $(date)" echo ' ------------------------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace if [ "$NTASKS" -gt '1' ] then if [ ${CFP_MP:-"NO"} = "YES" ]; then @@ -177,22 +175,20 @@ source "$HOMEgfs/ush/preamble.sh" echo '********************************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace fi fi # 1.a.3 File check - for grdID in ${grdALL} - do - if [ -f ${COMOUT}/rundata/${CDUMP}wave.mod_def.$grdID ] - then + for grdID in ${grdALL}; do + if [[ -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" ]]; then set +x echo ' ' echo " mod_def.$grdID succesfully created/copied " echo ' ' - ${TRACE_ON:-set -x} + set_trace else set +x 
echo ' ' @@ -202,7 +198,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " grdID = $grdID" echo ' ' sed "s/^/$grdID.out : /g" $grdID.out - ${TRACE_ON:-set -x} + set_trace err=3;export err;${errchk} fi done diff --git a/scripts/exgfs_wave_nawips.sh b/scripts/exgfs_wave_nawips.sh index 8d41578d7e1..09d23ec6853 100755 --- a/scripts/exgfs_wave_nawips.sh +++ b/scripts/exgfs_wave_nawips.sh @@ -14,8 +14,8 @@ source "$HOMEgfs/ush/preamble.sh" #export grids=${grids:-'glo_30m at_10m ep_10m wc_10m ao_9km'} #Interpolated grids -export grids=${grids:-'glo_10m gso_15m ao_9km'} #Native grids -export RUNwave=${RUNwave:-${RUN}${COMPONENT}} +export grids=${grids:-'glo_30m'} #Native grids +export RUNwave=${RUNwave:-${RUN}wave} export fstart=${fstart:-0} export FHMAX_WAV=${FHMAX_WAV:-180} #180 Total of hours to process export FHMAX_HF_WAV=${FHMAX_HF_WAV:-72} @@ -71,7 +71,7 @@ while [ $fhcnt -le $FHMAX_WAV ]; do *) gridIDin= grdIDout= ;; esac - GRIBIN=$COMIN/gridded/$RUNwave.$cycle.$grdIDin.f${fhr}.grib2 + GRIBIN="${COM_WAVE_GRID}/${RUNwave}.${cycle}.${grdIDin}.f${fhr}.grib2" GRIBIN_chk=$GRIBIN.idx icnt=1 @@ -84,14 +84,13 @@ while [ $fhcnt -le $FHMAX_WAV ]; do fi if [ $icnt -ge $maxtries ]; then msg="ABORTING after 5 minutes of waiting for $GRIBIN." - postmsg "$jlogfile" "$msg" echo ' ' echo '**************************** ' echo '*** ERROR : NO GRIB FILE *** ' echo '**************************** ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace echo "$RUNwave $grdID ${fhr} prdgen $date $cycle : GRIB file missing." >> $wavelog err=1;export err;${errchk} || exit ${err} fi @@ -104,7 +103,6 @@ while [ $fhcnt -le $FHMAX_WAV ]; do OK=$? 
if [ "$OK" != '0' ]; then msg="ABNORMAL EXIT: ERROR IN interpolation the global grid" - postmsg "$jlogfile" "$msg" #set +x echo ' ' echo '************************************************************* ' @@ -112,7 +110,7 @@ while [ $fhcnt -le $FHMAX_WAV ]; do echo '************************************************************* ' echo ' ' echo $msg - #${TRACE_ON:-set -x} + #set_trace echo "$RUNwave $grdID prdgen $date $cycle : error in grbindex." >> $wavelog err=2;export err;err_chk else @@ -160,12 +158,11 @@ while [ $fhcnt -le $FHMAX_WAV ]; do fi if [ $SENDCOM = "YES" ] ; then - cpfs $GEMGRD $COMOUT/$GEMGRD + cpfs "${GEMGRD}" "${COM_WAVE_GEMPAK}/${GEMGRD}" if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/$GEMGRD + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" "${COM_WAVE_GEMPAK}/${GEMGRD}" else - echo "##### DBN_ALERT is: MODEL ${DBN_ALERT_TYPE} $job $COMOUT/$GEMGRD#####" + echo "##### DBN_ALERT is: MODEL ${DBN_ALERT_TYPE} ${job} ${COM_WAVE_GEMPAK}/${GEMGRD}#####" fi fi rm grib_$grid diff --git a/scripts/exgfs_wave_post_gridded_sbs.sh b/scripts/exgfs_wave_post_gridded_sbs.sh index b602ba3a0e4..76e2d6d1daa 100755 --- a/scripts/exgfs_wave_post_gridded_sbs.sh +++ b/scripts/exgfs_wave_post_gridded_sbs.sh @@ -36,7 +36,7 @@ source "$HOMEgfs/ush/preamble.sh" # Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic - export WAV_MOD_TAG=${CDUMP}wave${waveMEMB} + export WAV_MOD_TAG=${RUN}wave${waveMEMB} cd $DATA @@ -51,7 +51,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "Starting at : $(date)" echo '-------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace # Script will run only if pre-defined NTASKS # The actual work is distributed over these tasks. 
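The GRIB-wait logic in the hunk above (poll `$GRIBIN_chk` once per interval, abort after `maxtries`) recurs across these wave scripts. A hedged, generalized sketch of that wait-with-timeout idiom (`wait_for_file` is a hypothetical helper name, not a function defined in the workflow):

```shell
#!/usr/bin/env bash
# Poll for a non-empty file; give up after maxtries intervals.
# Returns 0 once the file exists and is non-empty, 1 on timeout.
wait_for_file() {
  local file=$1 maxtries=${2:-60} interval=${3:-5} icnt=1
  while [[ ! -s "${file}" ]]; do
    if (( icnt >= maxtries )); then
      echo "FATAL: ${file} never appeared after ${maxtries} tries" >&2
      return 1
    fi
    sleep "${interval}"
    icnt=$(( icnt + 1 ))
  done
  return 0
}
```

The scripts here follow up a timeout with `err=N; export err; ${errchk}` so the job fails through the standard error-check hook rather than a bare `exit`.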
@@ -81,11 +81,8 @@ source "$HOMEgfs/ush/preamble.sh" echo " Interpolated grids : $waveinterpGRD" echo " Post-process grids : $wavepostGRD" echo ' ' - ${TRACE_ON:-set -x} + set_trace - -# 0.c.3 Define CDATE_POST - export CDATE_POST=${CDATE} export FHRUN=0 # --------------------------------------------------------------------------- # @@ -100,42 +97,38 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo 'Preparing input files :' echo '-----------------------' - ${TRACE_ON:-set -x} + set_trace # 1.a Model definition files and output files (set up using poe) # 1.a.1 Copy model definition files - for grdID in $waveGRD $wavepostGRD $waveinterpGRD - do - if [ -f "$COMIN/rundata/${CDUMP}wave.mod_def.${grdID}" ] - then + for grdID in ${waveGRD} ${wavepostGRD} ${waveinterpGRD}; do + if [[ -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" ]]; then set +x - echo " Mod def file for $grdID found in ${COMIN}/rundata. copying ...." - ${TRACE_ON:-set -x} + echo " Mod def file for ${grdID} found in ${COM_WAVE_PREP}. copying ...." + set_trace - cp -f $COMIN/rundata/${CDUMP}wave.mod_def.${grdID} mod_def.$grdID + cp -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" "mod_def.${grdID}" fi done # 1.a.2 Check that model definition files exist - for grdID in $waveGRD $wavepostGRD $waveinterpGRD - do - if [ ! -f mod_def.$grdID ] - then + for grdID in ${waveGRD} ${wavepostGRD} ${waveinterpGRD}; do + if [[ ! -f "mod_def.${grdID}" ]]; then set +x echo ' ' echo '*************************************************** ' echo " FATAL ERROR : NO MOD_DEF FILE mod_def.$grdID " echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=2; export err;${errchk} exit $err DOGRB_WAV='NO' else set +x echo "File mod_def.$grdID found. Syncing to all nodes ..." - ${TRACE_ON:-set -x} + set_trace fi done @@ -155,7 +148,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo " ${intGRD}_interp.inp.tmpl copied. Syncing to all nodes ..." 
- ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -163,8 +156,8 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** ERROR : NO TEMPLATE FOR GRINT INPUT FILE *** ' echo '*********************************************** ' echo ' ' - ${TRACE_ON:-set -x} - echo "$WAV_MOD_TAG post $date $cycle : GRINT template file missing." + set_trace + echo "${WAV_MOD_TAG} post ${PDY} ${cycle} : GRINT template file missing." exit_code=1 DOGRI_WAV='NO' fi @@ -184,7 +177,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo " ww3_grib2.${grbGRD}.inp.tmpl copied. Syncing to all nodes ..." - ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -192,7 +185,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "*** ERROR : NO TEMPLATE FOR ${grbGRD} GRIB INPUT FILE *** " echo '*********************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit_code=2 DOGRB_WAV='NO' fi @@ -211,7 +204,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Sufficient data for GRID interpolation : $DOGRI_WAV" echo " Sufficient data for GRIB files : $DOGRB_WAV" echo ' ' - ${TRACE_ON:-set -x} + set_trace # --------------------------------------------------------------------------- # # 2. 
Make consolidated grib2 file for side-by-side grids and interpolate @@ -221,7 +214,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' Making command file for sbs grib2 and GRID Interpolation ' - ${TRACE_ON:-set -x} + set_trace # 1.a.2 Loop over forecast time to generate post files # When executed side-by-side, serial mode (cfp when run after the fcst step) @@ -241,7 +234,7 @@ source "$HOMEgfs/ush/preamble.sh" iwaitmax=120 # Maximum loop cycles for waiting until wave component output file is ready (fails after max) while [ $fhr -le $FHMAX_WAV ]; do - ymdh=$($NDATE $fhr $CDATE) + ymdh=$($NDATE $fhr ${PDY}${cyc}) YMD=$(echo $ymdh | cut -c1-8) HMS="$(echo $ymdh | cut -c9-10)0000" YMDHMS=${YMD}${HMS} @@ -265,15 +258,15 @@ source "$HOMEgfs/ush/preamble.sh" then iwait=0 for wavGRD in ${waveGRD} ; do - gfile=$COMIN/rundata/${WAV_MOD_TAG}.out_grd.${wavGRD}.${YMD}.${HMS} + gfile=${COM_WAVE_HISTORY}/${WAV_MOD_TAG}.out_grd.${wavGRD}.${YMD}.${HMS} while [ ! -s ${gfile} ]; do sleep 10; let iwait=iwait+1; done if [ $iwait -eq $iwaitmax ]; then echo '*************************************************** ' echo " FATAL ERROR : NO RAW FIELD OUTPUT FILE out_grd.$grdID " echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} - echo "$WAV_MOD_TAG post $grdID $date $cycle : field output missing." + set_trace + echo "${WAV_MOD_TAG} post ${grdID} ${PDY} ${cycle} : field output missing." 
err=3; export err;${errchk} exit $err fi @@ -367,7 +360,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Executing the grib2_sbs scripts at : $(date)" echo ' ------------------------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace if [ "$wavenproc" -gt '1' ] then @@ -392,7 +385,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=4; export err;${errchk} exit $err fi @@ -407,8 +400,8 @@ source "$HOMEgfs/ush/preamble.sh" # Check if grib2 file created ENSTAG="" if [ ${waveMEMB} ]; then ENSTAG=".${membTAG}${waveMEMB}" ; fi - gribchk=${CDUMP}wave.${cycle}${ENSTAG}.${GRDNAME}.${GRDRES}.f${FH3}.grib2 - if [ ! -s ${COMOUT}/gridded/${gribchk} ]; then + gribchk="${RUN}wave.${cycle}${ENSTAG}.${GRDNAME}.${GRDRES}.f${FH3}.grib2" + if [ ! -s ${COM_WAVE_GRID}/${gribchk} ]; then set +x echo ' ' echo '********************************************' @@ -416,7 +409,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '********************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=5; export err;${errchk} exit $err fi diff --git a/scripts/exgfs_wave_post_pnt.sh b/scripts/exgfs_wave_post_pnt.sh index cf42db0bb42..a7aa9575641 100755 --- a/scripts/exgfs_wave_post_pnt.sh +++ b/scripts/exgfs_wave_post_pnt.sh @@ -54,7 +54,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "Starting at : $(date)" echo '-------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace # Script will run only if pre-defined NTASKS # The actual work is distributed over these tasks. @@ -91,7 +91,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '-------------------' echo " Output points : $waveuoutpGRD" echo ' ' - ${TRACE_ON:-set -x} + set_trace # --------------------------------------------------------------------------- # # 1. 
Get files that are used by most child scripts @@ -102,7 +102,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo 'Preparing input files :' echo '-----------------------' - ${TRACE_ON:-set -x} + set_trace # 1.a Model definition files and output files (set up using poe) @@ -112,20 +112,18 @@ source "$HOMEgfs/ush/preamble.sh" touch cmdfile chmod 744 cmdfile - ${TRACE_ON:-set -x} + set_trace # Copy model definition files iloop=0 - for grdID in $waveuoutpGRD - do - if [ -f "$COMIN/rundata/${CDUMP}wave.mod_def.${grdID}" ] - then + for grdID in ${waveuoutpGRD}; do + if [[ -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" ]]; then set +x - echo " Mod def file for $grdID found in ${COMIN}/rundata. copying ...." - ${TRACE_ON:-set -x} + echo " Mod def file for ${grdID} found in ${COM_WAVE_PREP}. copying ...." + set_trace - cp -f $COMIN/rundata/${CDUMP}wave.mod_def.${grdID} mod_def.$grdID - iloop=$(expr $iloop + 1) + cp -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" "mod_def.${grdID}" + iloop=$((iloop + 1)) fi done @@ -139,13 +137,13 @@ source "$HOMEgfs/ush/preamble.sh" echo " FATAL ERROR : NO MOD_DEF FILE mod_def.$grdID " echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=2; export err;${errchk} exit $err else set +x echo "File mod_def.$grdID found. Syncing to all nodes ..." - ${TRACE_ON:-set -x} + set_trace fi done @@ -169,7 +167,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo " buoy.loc and buoy.ibp copied and processed ($PARMwave/wave_${NET}.buoys)." - ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -177,7 +175,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' FATAL ERROR : NO BUOY LOCATION FILE ' echo '************************************* ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=3; export err;${errchk} exit $err DOSPC_WAV='NO' @@ -195,7 +193,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo " ww3_outp_spec.inp.tmpl copied. Syncing to all grids ..." 
- ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -203,7 +201,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** ERROR : NO TEMPLATE FOR SPEC INPUT FILE *** ' echo '*********************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit_code=3 DOSPC_WAV='NO' DOBLL_WAV='NO' @@ -218,7 +216,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo " ww3_outp_bull.inp.tmpl copied. Syncing to all nodes ..." - ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -226,7 +224,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** ERROR : NO TEMPLATE FOR BULLETIN INPUT FILE *** ' echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit_code=4 DOBLL_WAV='NO' fi @@ -235,29 +233,28 @@ source "$HOMEgfs/ush/preamble.sh" if [ "$DOSPC_WAV" = 'YES' ] || [ "$DOBLL_WAV" = 'YES' ] then - ymdh=$($NDATE -${WAVHINDH} $CDATE) - tstart="$(echo $ymdh | cut -c1-8) $(echo $ymdh | cut -c9-10)0000" + ymdh=$(${NDATE} -"${WAVHINDH}" "${PDY}${cyc}") + tstart="${ymdh:0:8} ${ymdh:8:2}0000" dtspec=3600. 
# default time step (not used here) - sed -e "s/TIME/$tstart/g" \ - -e "s/DT/$dtspec/g" \ + sed -e "s/TIME/${tstart}/g" \ + -e "s/DT/${dtspec}/g" \ -e "s/POINT/1/g" \ -e "s/ITYPE/0/g" \ -e "s/FORMAT/F/g" \ ww3_outp_spec.inp.tmpl > ww3_outp.inp ln -s mod_def.$waveuoutpGRD mod_def.ww3 - YMD=$(echo $CDATE | cut -c1-8) - HMS="$(echo $CDATE | cut -c9-10)0000" - if [ -f $COMIN/rundata/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} ] - then - ln -s $COMIN/rundata/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} ./out_pnt.${waveuoutpGRD} + HMS="${cyc}0000" + if [[ -f "${COM_WAVE_HISTORY}/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${PDY}.${HMS}" ]]; then + ln -s "${COM_WAVE_HISTORY}/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${PDY}.${HMS}" \ + "./out_pnt.${waveuoutpGRD}" else echo '*************************************************** ' - echo " FATAL ERROR : NO RAW POINT OUTPUT FILE out_pnt.${waveuoutpGRD}.${YMD}.${HMS} " + echo " FATAL ERROR : NO RAW POINT OUTPUT FILE out_pnt.${waveuoutpGRD}.${PDY}.${HMS} " echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} - echo "$WAV_MOD_TAG post $waveuoutpGRD $CDATE $cycle : field output missing." + set_trace + echo "${WAV_MOD_TAG} post ${waveuoutpGRD} ${PDY}${cyc} ${cycle} : field output missing." err=4; export err;${errchk} fi @@ -280,7 +277,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' cat buoy_tmp.loc echo "$WAV_MOD_TAG post $date $cycle : buoy log file failed to be created." - ${TRACE_ON:-set -x} + set_trace err=5;export err;${errchk} DOSPC_WAV='NO' DOBLL_WAV='NO' @@ -303,7 +300,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo 'Buoy log file created. Syncing to all nodes ...' 
- ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -311,7 +308,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** ERROR : NO BUOY LOG FILE CREATED *** ' echo '**************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=6;export err;${errchk} DOSPC_WAV='NO' DOBLL_WAV='NO' @@ -331,7 +328,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Sufficient data for bulletins : $DOBLL_WAV ($Nb points)" echo " Boundary points : $DOBNDPNT_WAV" echo ' ' - ${TRACE_ON:-set -x} + set_trace # --------------------------------------------------------------------------- # # 2. Make files for processing boundary points @@ -340,7 +337,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' Making command file for wave post points ' - ${TRACE_ON:-set -x} + set_trace rm -f cmdfile touch cmdfile @@ -351,11 +348,11 @@ source "$HOMEgfs/ush/preamble.sh" while [ $fhr -le $FHMAX_WAV_PNT ]; do echo " Creating the wave point scripts at : $(date)" - ymdh=$($NDATE $fhr $CDATE) - YMD=$(echo $ymdh | cut -c1-8) - HMS="$(echo $ymdh | cut -c9-10)0000" + ymdh=$($NDATE "${fhr}" "${PDY}${cyc}") + YMD=${ymdh:0:8} + HMS="${ymdh:8:2}0000" YMDHMS=${YMD}${HMS} - FH3=$(printf %03i $fhr) + FH3=$(printf %03i ${fhr}) rm -f tmpcmdfile.${FH3} touch tmpcmdfile.${FH3} @@ -367,14 +364,14 @@ source "$HOMEgfs/ush/preamble.sh" export BULLDATA=${DATA}/output_$YMDHMS cp $DATA/mod_def.${waveuoutpGRD} mod_def.${waveuoutpGRD} - pfile=$COMIN/rundata/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} + pfile="${COM_WAVE_HISTORY}/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS}" if [ -f ${pfile} ] then ln -fs ${pfile} ./out_pnt.${waveuoutpGRD} else echo " FATAL ERROR : NO RAW POINT OUTPUT FILE out_pnt.$waveuoutpGRD.${YMD}.${HMS} " echo ' ' - ${TRACE_ON:-set -x} + set_trace err=7; export err;${errchk} exit $err fi @@ -468,7 +465,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Executing the wave point scripts at : $(date)" echo ' ------------------------------------' echo ' ' - ${TRACE_ON:-set -x} + 
set_trace if [ "$wavenproc" -gt '1' ] then @@ -493,7 +490,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=8; export err;${errchk} exit $err fi @@ -560,7 +557,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Executing the boundary point cat script at : $(date)" echo ' ------------------------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace if [ "$wavenproc" -gt '1' ] then @@ -585,7 +582,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=9; export err;${errchk} exit $err fi @@ -604,7 +601,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo ' Making command file for taring all point output files.' - ${TRACE_ON:-set -x} + set_trace # 6.b Spectral data files @@ -662,7 +659,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Executing the wave_tar scripts at : $(date)" echo ' ------------------------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace if [ "$wavenproc" -gt '1' ] then @@ -687,7 +684,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=10; export err;${errchk} exit $err fi diff --git a/scripts/exgfs_wave_prdgen_bulls.sh b/scripts/exgfs_wave_prdgen_bulls.sh index 10bdee523b0..e75df8dfd17 100755 --- a/scripts/exgfs_wave_prdgen_bulls.sh +++ b/scripts/exgfs_wave_prdgen_bulls.sh @@ -23,13 +23,13 @@ source "$HOMEgfs/ush/preamble.sh" # 0.a Basic modes of operation # PATH for working and home directories - export RUNwave=${RUNwave:-${RUN}${COMPONENT}} + export RUNwave=${RUNwave:-${RUN}wave} export envir=${envir:-ops} export cyc=${cyc:-00} export cycle=${cycle:-t${cyc}z} export pgmout=OUTPUT.$$ export DATA=${DATA:-${DATAROOT:?}/${job}.$$} - #export CODEwave=${CODEwave:-${NWROOT}/${NET}_code.${wave_code_ver}/${code_pkg}} + #export 
CODEwave=${CODEwave:-${PACKAGEROOT}/${NET}_code.${wave_code_ver}/${code_pkg}} export EXECwave=${EXECwave:-$HOMEgfs/exec} export FIXwave=${FIXwave:-$HOMEgfs/fix} export PARMwave=${PARMwave:-$HOMEgfs/parm/parm_wave} @@ -54,20 +54,19 @@ source "$HOMEgfs/ush/preamble.sh" echo "Starting at : $(date)" echo ' ' echo ' ' - ${TRACE_ON:-set -x} + set_trace # 1. Get necessary files set +x - echo " Copying bulletins from $COMIN" - ${TRACE_ON:-set -x} + echo " Copying bulletins from ${COM_WAVE_STATION}" + set_trace # 1.a Link the input file and untar it - BullIn=$COMIN/station/${RUNwave}.$cycle.cbull_tar + BullIn="${COM_WAVE_STATION}/${RUNwave}.${cycle}.cbull_tar" if [ -f $BullIn ]; then cp $BullIn cbull.tar else - msg="ABNORMAL EXIT: NO BULLETIN TAR FILE" - postmsg "$jlogfile" "$msg" + echo "ABNORMAL EXIT: NO BULLETIN TAR FILE" set +x echo ' ' echo '************************************ ' @@ -75,7 +74,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '************************************ ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace msg="FATAL ERROR ${RUNwave} prdgen $date $cycle : bulletin tar missing." echo $msg >> $wavelog export err=1; ${errchk} @@ -84,18 +83,17 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Untarring bulletins ..." - ${TRACE_ON:-set -x} + set_trace tar -xf cbull.tar OK=$? if [ "$OK" = '0' ]; then set +x echo " Unpacking successfull ..." - ${TRACE_ON:-set -x} + set_trace rm -f cbull.tar else - msg="ABNORMAL EXIT: ERROR IN BULLETIN UNTAR" - postmsg "$jlogfile" "$msg" + echo "ABNORMAL EXIT: ERROR IN BULLETIN UNTAR" set +x echo ' ' echo '****************************************** ' @@ -103,7 +101,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '****************************************** ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace echo "${RUNwave} prdgen $date $cycle : bulletin untar error." 
>> $wavelog err=2;export err;err_chk exit $err @@ -113,7 +111,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' Nb=$(ls -1 *.cbull | wc -l)' Nb=$(ls -1 *.cbull | wc -l) - ${TRACE_ON:-set -x} + set_trace echo ' ' echo " Number of bulletin files : $Nb" echo ' --------------------------' @@ -123,7 +121,6 @@ source "$HOMEgfs/ush/preamble.sh" cp $PARMwave/bull_awips_gfswave awipsbull.data else msg="ABNORMAL EXIT: NO AWIPS BULLETIN HEADER DATA FILE" - postmsg "$jlogfile" "$msg" set +x echo ' ' echo '******************************************* ' @@ -131,7 +128,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '******************************************* ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace echo "${RUNwave} prdgen $date $cycle : Bulletin header data file missing." >> $wavelog err=3;export err;err_chk exit $err @@ -144,13 +141,12 @@ source "$HOMEgfs/ush/preamble.sh" echo ' Sourcing data file with header info ...' # 2.b Set up environment variables - ${TRACE_ON:-set -x} + set_trace . awipsbull.data # 2.c Generate list of bulletins to process echo ' Generating buoy list ...' - echo 'bulls=$(sed -e 's/export b//g' -e 's/=/ /' awipsbull.data | grep -v "#" |awk '{ print $1}')' - bulls=$(sed -e 's/export b//g' -e 's/=/ /' awipsbull.data | grep -v "#" |awk '{ print $1}') + bulls=$(sed -e 's/export b//g' -e 's/=/ /' awipsbull.data | grep -v "#" |awk '{print $1}') # 2.d Looping over buoys running formbul echo ' Looping over buoys ... \n' @@ -162,7 +158,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Processing $bull ($headr $oname) ..." if [ -z "$headr" ] || [ ! -s $fname ]; then - ${TRACE_ON:-set -x} + set_trace msg="ABNORMAL EXIT: MISSING BULLETING INFO" set +x echo ' ' @@ -171,23 +167,22 @@ source "$HOMEgfs/ush/preamble.sh" echo '******************************************** ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace echo "${RUNwave} prdgen $date $cycle : Missing bulletin data." 
>> $wavelog err=4;export err;err_chk exit $err fi - ${TRACE_ON:-set -x} + set_trace - formbul.pl -d $headr -f $fname -j $job -m ${RUNwave} \ - -p $PCOM -s NO -o $oname > formbul.out 2>&1 + formbul.pl -d "${headr}" -f "${fname}" -j "${job}" -m "${RUNwave}" \ + -p "${COM_WAVE_WMO}" -s "NO" -o "${oname}" > formbul.out 2>&1 OK=$? if [ "$OK" != '0' ] || [ ! -f $oname ]; then - ${TRACE_ON:-set -x} + set_trace cat formbul.out msg="ABNORMAL EXIT: ERROR IN formbul" - postmsg "$jlogfile" "$msg" set +x echo ' ' echo '************************************** ' @@ -195,7 +190,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '************************************** ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace echo "${RUNwave} prdgen $date $cycle : error in formbul." >> $wavelog err=5;export err;err_chk exit $err @@ -206,20 +201,20 @@ source "$HOMEgfs/ush/preamble.sh" done # 3. Send output files to the proper destination - ${TRACE_ON:-set -x} - if [ "$SENDCOM" = YES ]; then - cp awipsbull.$cycle.${RUNwave} $PCOM/awipsbull.$cycle.${RUNwave} - if [ "$SENDDBN_NTC" = YES ]; then - make_ntc_bull.pl WMOBH NONE KWBC NONE $DATA/awipsbull.$cycle.${RUNwave} $PCOM/awipsbull.$cycle.${RUNwave} - else - if [ "${envir}" = "para" ] || [ "${envir}" = "test" ] || [ "${envir}" = "dev" ]; then - echo "Making NTC bulletin for parallel environment, but do not alert." - ${TRACE_ON:-set -x} - (export SENDDBN=NO; make_ntc_bull.pl WMOBH NONE KWBC NONE \ - $DATA/awipsbull.$cycle.${RUNwave} $PCOM/awipsbull.$cycle.${RUNwave}) - fi - fi - fi +set_trace +if [ "$SENDCOM" = YES ]; then + cp "awipsbull.${cycle}.${RUNwave}" "${COM_WAVE_WMO}/awipsbull.${cycle}.${RUNwave}" + if [ "$SENDDBN_NTC" = YES ]; then + make_ntc_bull.pl "WMOBH" "NONE" "KWBC" "NONE" "${DATA}/awipsbull.${cycle}.${RUNwave}" \ + "${COM_WAVE_WMO}/awipsbull.${cycle}.${RUNwave}" + else + if [ "${envir}" = "para" ] || [ "${envir}" = "test" ] || [ "${envir}" = "dev" ]; then + echo "Making NTC bulletin for parallel environment, but do not alert." 
+ (export SENDDBN=NO; make_ntc_bull.pl "WMOBH" "NONE" "KWBC" "NONE" \ + "${DATA}/awipsbull.${cycle}.${RUNwave}" "${COM_WAVE_WMO}/awipsbull.${cycle}.${RUNwave}") + fi + fi +fi # --------------------------------------------------------------------------- # # 4. Clean up diff --git a/scripts/exgfs_wave_prdgen_gridded.sh b/scripts/exgfs_wave_prdgen_gridded.sh index b56fb15819f..de7f2c4974c 100755 --- a/scripts/exgfs_wave_prdgen_gridded.sh +++ b/scripts/exgfs_wave_prdgen_gridded.sh @@ -23,7 +23,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.a Basic modes of operation - export RUNwave=${RUNwave:-${RUN}${COMPONENT}} + export RUNwave=${RUNwave:-${RUN}wave} export envir=${envir:-ops} export fstart=${fstart:-0} export FHMAX_WAV=${FHMAX_WAV:-180} #180 Total of hours to process @@ -40,14 +40,13 @@ source "$HOMEgfs/ush/preamble.sh" export DATA=${DATA:-${DATAROOT:?}/${job}.$$} mkdir -p $DATA cd $DATA - export wavelog=${DATA}/${COMPONENTwave}_prdggridded.log + export wavelog=${DATA}/${RUNwave}_prdggridded.log - postmsg "$jlogfile" "HAS BEGUN on $(hostname)" - msg="Starting MWW3 GRIDDED PRODUCTS SCRIPT" - postmsg "$jlogfile" "$msg" + echo "Starting MWW3 GRIDDED PRODUCTS SCRIPT" # Output grids - grids=${grids:-ao_9km at_10m ep_10m wc_10m glo_30m} -# grids=${grids:-ak_10m at_10m ep_10m wc_10m glo_30m} + # grids=${grids:-ao_9km at_10m ep_10m wc_10m glo_30m} +grids=${grids:-ak_10m at_10m ep_10m wc_10m glo_30m} +# export grids=${wavepostGRD} maxtries=${maxtries:-720} # 0.b Date and time stuff export date=$PDY @@ -63,14 +62,14 @@ source "$HOMEgfs/ush/preamble.sh" echo " AWIPS grib fields" echo " Wave Grids : $grids" echo ' ' - ${TRACE_ON:-set -x} + set_trace # --------------------------------------------------------------------------- # # 1. 
Get necessary files echo ' ' echo 'Preparing input files :' echo '-----------------------' - ${TRACE_ON:-set -x} + set_trace #======================================================================= ASWELL=(SWELL1 SWELL2) # Indices of HS from partitions @@ -99,7 +98,7 @@ source "$HOMEgfs/ush/preamble.sh" esac # - GRIBIN=$COMIN/gridded/$RUNwave.$cycle.$grdID.f${fhr}.grib2 + GRIBIN="${COM_WAVE_GRID}/${RUNwave}.${cycle}.${grdID}.f${fhr}.grib2" GRIBIN_chk=$GRIBIN.idx icnt=1 @@ -113,14 +112,13 @@ source "$HOMEgfs/ush/preamble.sh" fi if [ $icnt -ge $maxtries ]; then msg="ABNORMAL EXIT: NO GRIB FILE FOR GRID $GRIBIN" - postmsg "$jlogfile" "$msg" echo ' ' echo '**************************** ' echo '*** ERROR : NO GRIB FILE *** ' echo '**************************** ' echo ' ' echo $msg - ${TRACE_ON:-set -x} + set_trace echo "$RUNwave $grdID ${fhr} prdgen $date $cycle : GRIB file missing." >> $wavelog err=1;export err;${errchk} || exit ${err} fi @@ -177,19 +175,18 @@ source "$HOMEgfs/ush/preamble.sh" # 2.a.1 Set up for tocgrib2 echo " Do set up for tocgrib2." - ${TRACE_ON:-set -x} + set_trace #AWIPSGRB=awipsgrib.$grdID.f${fhr} AWIPSGRB=awipsgrib # 2.a.2 Make GRIB index echo " Make GRIB index for tocgrib2." - ${TRACE_ON:-set -x} + set_trace $GRB2INDEX gribfile.$grdID.f${fhr} gribindex.$grdID.f${fhr} OK=$? if [ "$OK" != '0' ] then msg="ABNORMAL EXIT: ERROR IN grb2index MWW3 for grid $grdID" - postmsg "$jlogfile" "$msg" #set +x echo ' ' echo '******************************************** ' @@ -197,7 +194,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '******************************************** ' echo ' ' echo $msg - #${TRACE_ON:-set -x} + #set_trace echo "$RUNwave $grdID prdgen $date $cycle : error in grbindex." >> $wavelog err=4;export err;err_chk fi @@ -205,7 +202,7 @@ source "$HOMEgfs/ush/preamble.sh" # 2.a.3 Run AWIPS GRIB packing program tocgrib2 echo " Run tocgrib2" - ${TRACE_ON:-set -x} + set_trace export pgm=tocgrib2 export pgmout=tocgrib2.out . 
prep_step @@ -219,7 +216,6 @@ source "$HOMEgfs/ush/preamble.sh" if [ "$OK" != '0' ]; then cat tocgrib2.out msg="ABNORMAL EXIT: ERROR IN tocgrib2" - postmsg "$jlogfile" "$msg" #set +x echo ' ' echo '*************************************** ' @@ -227,7 +223,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*************************************** ' echo ' ' echo $msg - #${TRACE_ON:-set -x} + #set_trace echo "$RUNwave prdgen $date $cycle : error in tocgrib2." >> $wavelog err=5;export err;err_chk else @@ -236,21 +232,21 @@ source "$HOMEgfs/ush/preamble.sh" # 2.a.7 Get the AWIPS grib bulletin out ... #set +x echo " Get awips GRIB bulletins out ..." - #${TRACE_ON:-set -x} + #set_trace if [ "$SENDCOM" = 'YES' ] then #set +x echo " Saving $AWIPSGRB.$grdOut.f${fhr} as grib2.$cycle.awipsww3_${grdID}.f${fhr}" - echo " in $PCOM" - #${TRACE_ON:-set -x} - cp $AWIPSGRB.$grdID.f${fhr} $PCOM/grib2.$cycle.f${fhr}.awipsww3_${grdOut} + echo " in ${COM_WAVE_WMO}" + #set_trace + cp "${AWIPSGRB}.${grdID}.f${fhr}" "${COM_WAVE_WMO}/grib2.${cycle}.f${fhr}.awipsww3_${grdOut}" #set +x fi if [ "$SENDDBN" = 'YES' ] then echo " Sending $AWIPSGRB.$grdID.f${fhr} to DBRUN." 
- $DBNROOT/bin/dbn_alert GRIB_LOW $RUN $job $PCOM/grib2.$cycle.f${fhr}.awipsww3_${grdOut} + "${DBNROOT}/bin/dbn_alert" GRIB_LOW "${RUN}" "${job}" "${COM_WAVE_WMO}/grib2.${cycle}.f${fhr}.awipsww3_${grdOut}" fi rm -f $AWIPSGRB.$grdID.f${fhr} tocgrib2.out done # For grids diff --git a/scripts/exgfs_wave_prep.sh b/scripts/exgfs_wave_prep.sh index f3ecf388beb..be006c1c85f 100755 --- a/scripts/exgfs_wave_prep.sh +++ b/scripts/exgfs_wave_prep.sh @@ -46,7 +46,7 @@ source "$HOMEgfs/ush/preamble.sh" # Set wave model ID tag to include member number # if ensemble; waveMEMB var empty in deterministic - export WAV_MOD_TAG=${CDUMP}wave${waveMEMB} + export WAV_MOD_TAG=${RUN}wave${waveMEMB} cd $DATA mkdir outtmp @@ -64,14 +64,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo "Starting at : $(date)" echo ' ' - ${TRACE_ON:-set -x} - - if [ "$INDRUN" = 'no' ] - then - FHMAX_WAV=${FHMAX_WAV:-3} - else - FHMAX_WAV=${FHMAX_WAV:-384} - fi + set_trace # 0.b Date and time stuff @@ -136,7 +129,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " starting time : $time_beg" echo " ending time : $time_end" echo ' ' - ${TRACE_ON:-set -x} + set_trace # Script will run only if pre-defined NTASKS # The actual work is distributed over these tasks. 
@@ -153,7 +146,7 @@ source "$HOMEgfs/ush/preamble.sh" echo 'Preparing input files :' echo '-----------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace # 1.a Model definition files @@ -161,20 +154,20 @@ source "$HOMEgfs/ush/preamble.sh" touch cmdfile grdINP='' - if [ "${WW3ATMINP}" = 'YES' ]; then grdINP="${grdINP} $WAVEWND_FID" ; fi - if [ "${WW3ICEINP}" = 'YES' ]; then grdINP="${grdINP} $WAVEICE_FID" ; fi - if [ "${WW3CURINP}" = 'YES' ]; then grdINP="${grdINP} $WAVECUR_FID" ; fi + if [ "${WW3ATMINP}" = 'YES' ]; then grdINP="${grdINP} $WAVEWND_FID" ; fi + if [ "${WW3ICEINP}" = 'YES' ]; then grdINP="${grdINP} $WAVEICE_FID" ; fi + if [ "${WW3CURINP}" = 'YES' ]; then grdINP="${grdINP} $WAVECUR_FID" ; fi ifile=1 for grdID in $grdINP $waveGRD do - if [ -f "$COMIN/rundata/${CDUMP}wave.mod_def.${grdID}" ] + if [ -f "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" ] then set +x - echo " Mod def file for $grdID found in ${COMIN}/rundata. copying ...." - ${TRACE_ON:-set -x} - cp $COMIN/rundata/${CDUMP}wave.mod_def.${grdID} mod_def.$grdID + echo " Mod def file for $grdID found in ${COM_WAVE_PREP}. copying ...." + set_trace + cp ${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID} mod_def.$grdID else set +x @@ -185,15 +178,15 @@ source "$HOMEgfs/ush/preamble.sh" echo " grdID = $grdID" echo ' ' echo "FATAL ERROR: NO MODEL DEFINITION FILE" - ${TRACE_ON:-set -x} + set_trace err=2;export err;${errchk} fi done # 1.b Netcdf Preprocessor template files - if [ "$WW3ATMINP" = 'YES' ]; then itype="$itype wind" ; fi - if [ "$WW3ICEINP" = 'YES' ]; then itype="$itype ice" ; fi - if [ "$WW3CURINP" = 'YES' ]; then itype="$itype cur" ; fi + if [[ "${WW3ATMINP}" == 'YES' ]]; then itype="${itype:-} wind" ; fi + if [[ "${WW3ICEINP}" == 'YES' ]]; then itype="${itype:-} ice" ; fi + if [[ "${WW3CURINP}" == 'YES' ]]; then itype="${itype:-} cur" ; fi for type in $itype do @@ -225,7 +218,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo " ww3_prnc.${type}.$grdID.inp.tmpl copied ($PARMwave)." 
echo ' ' - ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -236,7 +229,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo "ABNORMAL EXIT: NO FILE $file" echo ' ' - ${TRACE_ON:-set -x} + set_trace err=4;export err;${errchk} fi done @@ -265,7 +258,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' sed "s/^/wave_prnc_ice.out : /g" wave_prnc_ice.out echo ' ' - ${TRACE_ON:-set -x} + set_trace err=5;export err;${errchk} else mv -f wave_prnc_ice.out $DATA/outtmp @@ -273,7 +266,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo ' Ice field unpacking successful.' echo ' ' - ${TRACE_ON:-set -x} + set_trace fi else echo ' ' @@ -295,7 +288,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : Not set-up to preprocess wind *** ' echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=6;export err;${errchk} fi @@ -313,7 +306,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' echo ' Concatenate binary current fields ...' echo ' ' - ${TRACE_ON:-set -x} + set_trace # Prepare files for cfp process rm -f cmdfile @@ -326,22 +319,22 @@ source "$HOMEgfs/ush/preamble.sh" export RPDY=$($NDATE -24 ${RPDY}00 | cut -c1-8) fi #Set the first time for RTOFS files to be the beginning time of simulation - ymdh_rtofs=$ymdh_beg + ymdh_rtofs=$ymdh_beg if [ "$FHMAX_WAV_CUR" -le 72 ]; then - rtofsfile1=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f024_prog.nc - rtofsfile2=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f048_prog.nc - rtofsfile3=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f072_prog.nc + rtofsfile1="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f024_prog.nc" + rtofsfile2="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f048_prog.nc" + rtofsfile3="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f072_prog.nc" if [ ! -f $rtofsfile1 ] || [ ! -f $rtofsfile2 ] || [ ! 
-f $rtofsfile3 ]; then #Needed current files are not available, so use RTOFS from previous day export RPDY=$($NDATE -24 ${RPDY}00 | cut -c1-8) fi else - rtofsfile1=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f096_prog.nc - rtofsfile2=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f120_prog.nc - rtofsfile3=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f144_prog.nc - rtofsfile4=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f168_prog.nc - rtofsfile5=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f192_prog.nc + rtofsfile1="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f096_prog.nc" + rtofsfile2="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f120_prog.nc" + rtofsfile3="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f144_prog.nc" + rtofsfile4="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f168_prog.nc" + rtofsfile5="${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_f192_prog.nc" if [ ! -f $rtofsfile1 ] || [ ! -f $rtofsfile2 ] || [ ! -f $rtofsfile3 ] || [ ! -f $rtofsfile4 ] || [ ! 
-f $rtofsfile5 ]; then #Needed current files are not available, so use RTOFS from previous day @@ -349,8 +342,6 @@ source "$HOMEgfs/ush/preamble.sh" fi fi - export COMIN_WAV_CUR=$COMIN_WAV_RTOFS/${WAVECUR_DID}.${RPDY} - ymdh_end_rtofs=$($NDATE ${FHMAX_WAV_CUR} ${RPDY}00) if [ "$ymdh_end" -lt "$ymdh_end_rtofs" ]; then ymdh_end_rtofs=$ymdh_end @@ -369,8 +360,8 @@ source "$HOMEgfs/ush/preamble.sh" fhr_rtofs=$(${NHOUR} ${ymdh_rtofs} ${RPDY}00) fh3_rtofs=$(printf "%03d" "${fhr_rtofs#0}") - curfile1h=${COMIN_WAV_CUR}/rtofs_glo_2ds_${fext}${fh3_rtofs}_prog.nc - curfile3h=${COMIN_WAV_CUR}/rtofs_glo_2ds_${fext}${fh3_rtofs}_prog.nc + curfile1h=${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_${fext}${fh3_rtofs}_prog.nc + curfile3h=${COM_RTOFS}/${WAVECUR_DID}.${RPDY}/rtofs_glo_2ds_${fext}${fh3_rtofs}_prog.nc if [ -s ${curfile1h} ] && [ "${FLGHF}" = "T" ] ; then curfile=${curfile1h} @@ -390,7 +381,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "*** FATAL ERROR: NO CUR FILE $curfile *** " echo '************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace echo "FATAL ERROR - NO CURRENT FILE (RTOFS)" err=11;export err;${errchk} exit $err @@ -423,7 +414,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Executing the curr prnc cmdfile at : $(date)" echo ' ------------------------------------' echo ' ' - ${TRACE_ON:-set -x} + set_trace if [ $wavenproc -gt '1' ] then @@ -448,7 +439,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '********************************************' echo ' See Details Below ' echo ' ' - ${TRACE_ON:-set -x} + set_trace fi files=$(ls ${WAVECUR_DID}.* 2> /dev/null) @@ -462,7 +453,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '******************************************** ' echo ' ' echo "ABNORMAL EXIT: NO ${WAVECUR_FID}.* FILES FOUND" - ${TRACE_ON:-set -x} + set_trace err=11;export err;${errchk} fi @@ -474,7 +465,7 @@ source "$HOMEgfs/ush/preamble.sh" cat $file >> cur.${WAVECUR_FID} done - cp -f cur.${WAVECUR_FID} 
${COMOUT}/rundata/${CDUMP}wave.${WAVECUR_FID}.$cycle.cur + cp -f cur.${WAVECUR_FID} ${COM_WAVE_PREP}/${RUN}wave.${WAVECUR_FID}.$cycle.cur else echo ' ' @@ -494,6 +485,4 @@ source "$HOMEgfs/ush/preamble.sh" # 4. Ending output -exit $err - # End of MWW3 preprocessor script ------------------------------------------- # diff --git a/scripts/exglobal_aero_analysis_finalize.py b/scripts/exglobal_aero_analysis_finalize.py new file mode 100755 index 00000000000..1a0c1d75a5a --- /dev/null +++ b/scripts/exglobal_aero_analysis_finalize.py @@ -0,0 +1,25 @@ +#!/usr/bin/env python3 +# exgdas_global_aero_analysis_finalize.py +# This script creates an AerosolAnalysis class +# and runs the finalize method +# which perform post-processing and clean up activities +# for a global aerosol variational analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.aero_analysis import AerosolAnalysis + + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the aerosol analysis task + AeroAnl = AerosolAnalysis(config) + AeroAnl.finalize() diff --git a/scripts/exglobal_aero_analysis_initialize.py b/scripts/exglobal_aero_analysis_initialize.py new file mode 100755 index 00000000000..bf0c61c8b9f --- /dev/null +++ b/scripts/exglobal_aero_analysis_initialize.py @@ -0,0 +1,25 @@ +#!/usr/bin/env python3 +# exgdas_global_aero_analysis_initialize.py +# This script creates an AerosolAnalysis class +# and runs the initialize method +# which create and stage the runtime directory +# and create the YAML configuration +# for a global aerosol variational analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.aero_analysis import AerosolAnalysis + +# Initialize root logger 
+logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the aerosol analysis task + AeroAnl = AerosolAnalysis(config) + AeroAnl.initialize() diff --git a/scripts/exglobal_aero_analysis_run.py b/scripts/exglobal_aero_analysis_run.py new file mode 100755 index 00000000000..f4ab0e67ffc --- /dev/null +++ b/scripts/exglobal_aero_analysis_run.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python3 +# exgdas_global_aero_analysis_run.py +# This script creates an AerosolAnalysis object +# and runs the execute method +# which executes the global aerosol variational analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.aero_analysis import AerosolAnalysis + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the aerosol analysis task + AeroAnl = AerosolAnalysis(config) + AeroAnl.execute() diff --git a/scripts/exglobal_archive.sh b/scripts/exglobal_archive.sh new file mode 100755 index 00000000000..2204799067c --- /dev/null +++ b/scripts/exglobal_archive.sh @@ -0,0 +1,481 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################## +# Begin JOB SPECIFIC work +############################################## + +# ICS are restarts and always lag INC by $assim_freq hours +ARCHINC_CYC=${ARCH_CYC} +ARCHICS_CYC=$((ARCH_CYC-assim_freq)) +if [ "${ARCHICS_CYC}" -lt 0 ]; then + ARCHICS_CYC=$((ARCHICS_CYC+24)) +fi + +# CURRENT CYCLE +APREFIX="${RUN}.t${cyc}z." 
+ +# Realtime parallels run GFS MOS on 1 day delay +# If realtime parallel, back up CDATE_MOS one day +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +CDATE_MOS=${PDY}${cyc} +if [ "${REALTIME}" = "YES" ]; then + CDATE_MOS=$(${NDATE} -24 "${PDY}${cyc}") +fi +PDY_MOS="${CDATE_MOS:0:8}" + +############################################################### +# Archive online for verification and diagnostics +############################################################### +source "${HOMEgfs}/ush/file_utils.sh" + +[[ ! -d ${ARCDIR} ]] && mkdir -p "${ARCDIR}" +nb_copy "${COM_ATMOS_ANALYSIS}/${APREFIX}gsistat" "${ARCDIR}/gsistat.${RUN}.${PDY}${cyc}" +nb_copy "${COM_ATMOS_GRIB_1p00}/${APREFIX}pgrb2.1p00.anl" "${ARCDIR}/pgbanl.${RUN}.${PDY}${cyc}.grib2" + +# Archive 1 degree forecast GRIB2 files for verification +if [[ "${RUN}" == "gfs" ]]; then + fhmax=${FHMAX_GFS} + fhr=0 + while [ "${fhr}" -le "${fhmax}" ]; do + fhr2=$(printf %02i "${fhr}") + fhr3=$(printf %03i "${fhr}") + nb_copy "${COM_ATMOS_GRIB_1p00}/${APREFIX}pgrb2.1p00.f${fhr3}" "${ARCDIR}/pgbf${fhr2}.${RUN}.${PDY}${cyc}.grib2" + fhr=$((10#${fhr} + 10#${FHOUT_GFS} )) + done +fi +if [[ "${RUN}" == "gdas" ]]; then + flist="000 003 006 009" + for fhr in ${flist}; do + fname="${COM_ATMOS_GRIB_1p00}/${APREFIX}pgrb2.1p00.f${fhr}" + # TODO Shouldn't the archived files also use three-digit tags? 
+ fhr2=$(printf %02i $((10#${fhr}))) + nb_copy "${fname}" "${ARCDIR}/pgbf${fhr2}.${RUN}.${PDY}${cyc}.grib2" + done +fi + +if [[ -s "${COM_ATMOS_TRACK}/avno.t${cyc}z.cyclone.trackatcfunix" ]]; then + # shellcheck disable=2153 + PSLOT4=${PSLOT:0:4} + # shellcheck disable= + PSLOT4=${PSLOT4^^} + sed "s:AVNO:${PSLOT4}:g" < "${COM_ATMOS_TRACK}/avno.t${cyc}z.cyclone.trackatcfunix" \ + > "${ARCDIR}/atcfunix.${RUN}.${PDY}${cyc}" + sed "s:AVNO:${PSLOT4}:g" < "${COM_ATMOS_TRACK}/avnop.t${cyc}z.cyclone.trackatcfunix" \ + > "${ARCDIR}/atcfunixp.${RUN}.${PDY}${cyc}" +fi + +if [[ "${RUN}" == "gdas" ]] && [[ -s "${COM_ATMOS_TRACK}/gdas.t${cyc}z.cyclone.trackatcfunix" ]]; then + # shellcheck disable=2153 + PSLOT4=${PSLOT:0:4} + # shellcheck disable= + PSLOT4=${PSLOT4^^} + sed "s:AVNO:${PSLOT4}:g" < "${COM_ATMOS_TRACK}/gdas.t${cyc}z.cyclone.trackatcfunix" \ + > "${ARCDIR}/atcfunix.${RUN}.${PDY}${cyc}" + sed "s:AVNO:${PSLOT4}:g" < "${COM_ATMOS_TRACK}/gdasp.t${cyc}z.cyclone.trackatcfunix" \ + > "${ARCDIR}/atcfunixp.${RUN}.${PDY}${cyc}" +fi + +if [ "${RUN}" = "gfs" ]; then + nb_copy "${COM_ATMOS_GENESIS}/storms.gfso.atcf_gen.${PDY}${cyc}" "${ARCDIR}/." + nb_copy "${COM_ATMOS_GENESIS}/storms.gfso.atcf_gen.altg.${PDY}${cyc}" "${ARCDIR}/." + nb_copy "${COM_ATMOS_TRACK}/trak.gfso.atcfunix.${PDY}${cyc}" "${ARCDIR}/." + nb_copy "${COM_ATMOS_TRACK}/trak.gfso.atcfunix.altg.${PDY}${cyc}" "${ARCDIR}/." + + mkdir -p "${ARCDIR}/tracker.${PDY}${cyc}/${RUN}" + blist="epac natl" + for basin in ${blist}; do + if [[ -f ${basin} ]]; then + cp -rp "${COM_ATMOS_TRACK}/${basin}" "${ARCDIR}/tracker.${PDY}${cyc}/${RUN}" + fi + done +fi + +# Archive required gaussian gfs forecast files for Fit2Obs +if [[ "${RUN}" == "gfs" ]] && [[ "${FITSARC}" = "YES" ]]; then + VFYARC=${VFYARC:-${ROTDIR}/vrfyarch} + [[ ! 
-d ${VFYARC} ]] && mkdir -p "${VFYARC}" + mkdir -p "${VFYARC}/${RUN}.${PDY}/${cyc}" + prefix="${RUN}.t${cyc}z" + fhmax=${FHMAX_FITS:-${FHMAX_GFS}} + fhr=0 + while [[ ${fhr} -le ${fhmax} ]]; do + fhr3=$(printf %03i "${fhr}") + sfcfile="${COM_ATMOS_MASTER}/${prefix}.sfcf${fhr3}.nc" + sigfile="${COM_ATMOS_MASTER}/${prefix}.atmf${fhr3}.nc" + nb_copy "${sfcfile}" "${VFYARC}/${RUN}.${PDY}/${cyc}/" + nb_copy "${sigfile}" "${VFYARC}/${RUN}.${PDY}/${cyc}/" + (( fhr = 10#${fhr} + 6 )) + done +fi + + +############################################################### +# Archive data either to HPSS or locally +if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then +############################################################### + + # --set the archiving command and create local directories, if necessary + TARCMD="htar" + HSICMD="hsi" + if [[ ${LOCALARCH} = "YES" ]]; then + TARCMD="tar" + HSICMD='' + [[ ! -d "${ATARDIR}/${PDY}${cyc}" ]] && mkdir -p "${ATARDIR}/${PDY}${cyc}" + [[ ! -d "${ATARDIR}/${CDATE_MOS}" ]] && [[ -d "${ROTDIR}/gfsmos.${PDY_MOS}" ]] && [[ "${cyc}" -eq 18 ]] && mkdir -p "${ATARDIR}/${CDATE_MOS}" + fi + + #--determine when to save ICs for warm start and forecast-only runs + SAVEWARMICA="NO" + SAVEWARMICB="NO" + SAVEFCSTIC="NO" + firstday=$(${NDATE} +24 "${SDATE}") + mm="${PDY:2:2}" + dd="${PDY:4:2}" + # TODO: This math yields multiple dates sharing the same nday + nday=$(( (10#${mm}-1)*30+10#${dd} )) + mod=$((nday % ARCH_WARMICFREQ)) + if [[ "${PDY}${cyc}" -eq "${firstday}" ]] && [[ "${cyc}" -eq "${ARCHINC_CYC}" ]]; then SAVEWARMICA="YES" ; fi + if [[ "${PDY}${cyc}" -eq "${firstday}" ]] && [[ "${cyc}" -eq "${ARCHICS_CYC}" ]]; then SAVEWARMICB="YES" ; fi + if [[ "${mod}" -eq 0 ]] && [[ "${cyc}" -eq "${ARCHINC_CYC}" ]]; then SAVEWARMICA="YES" ; fi + if [[ "${mod}" -eq 0 ]] && [[ "${cyc}" -eq "${ARCHICS_CYC}" ]]; then SAVEWARMICB="YES" ; fi + + if [[ "${ARCHICS_CYC}" -eq 18 ]]; then + nday1=$((nday+1)) + mod1=$((nday1 % ARCH_WARMICFREQ)) + if [[ "${mod1}" -eq 0 
]] && [[ "${cyc}" -eq "${ARCHICS_CYC}" ]] ; then SAVEWARMICB="YES" ; fi + if [[ "${mod1}" -ne 0 ]] && [[ "${cyc}" -eq "${ARCHICS_CYC}" ]] ; then SAVEWARMICB="NO" ; fi + if [[ "${PDY}${cyc}" -eq "${SDATE}" ]] && [[ "${cyc}" -eq "${ARCHICS_CYC}" ]] ; then SAVEWARMICB="YES" ; fi + fi + + mod=$((nday % ARCH_FCSTICFREQ)) + if [[ "${mod}" -eq 0 ]] || [[ "${PDY}${cyc}" -eq "${firstday}" ]]; then SAVEFCSTIC="YES" ; fi + + + ARCH_LIST="${DATA}/archlist" + [[ -d ${ARCH_LIST} ]] && rm -rf "${ARCH_LIST}" + mkdir -p "${ARCH_LIST}" + cd "${ARCH_LIST}" || exit 2 + + "${HOMEgfs}/ush/hpssarch_gen.sh" "${RUN}" + status=$? + if [ "${status}" -ne 0 ]; then + echo "${HOMEgfs}/ush/hpssarch_gen.sh ${RUN} failed, ABORT!" + exit "${status}" + fi + + cd "${ROTDIR}" || exit 2 + + if [[ "${RUN}" = "gfs" ]]; then + + targrp_list="gfsa gfsb" + + if [ "${ARCH_GAUSSIAN:-"NO"}" = "YES" ]; then + targrp_list="${targrp_list} gfs_flux gfs_netcdfb gfs_pgrb2b" + if [ "${MODE}" = "cycled" ]; then + targrp_list="${targrp_list} gfs_netcdfa" + fi + fi + + if [ "${DO_WAVE}" = "YES" ]; then + targrp_list="${targrp_list} gfswave" + fi + + if [ "${DO_OCN}" = "YES" ]; then + targrp_list="${targrp_list} ocn_ice_grib2_0p5 ocn_ice_grib2_0p25 ocn_2D ocn_3D ocn_xsect ocn_daily gfs_flux_1p00" + fi + + if [ "${DO_ICE}" = "YES" ]; then + targrp_list="${targrp_list} ice" + fi + + # Aerosols + if [ "${DO_AERO}" = "YES" ]; then + for targrp in chem; do + # TODO: Why is this tar being done here instead of being added to the list? + ${TARCMD} -P -cvf "${ATARDIR}/${PDY}${cyc}/${targrp}.tar" $(cat "${ARCH_LIST}/${targrp}.txt") + status=$? 
+ if [[ "${status}" -ne 0 ]] && [[ "${PDY}${cyc}" -ge "${firstday}" ]]; then + echo "${TARCMD^^} ${PDY}${cyc} ${targrp}.tar failed" + exit "${status}" + fi + done + fi + + #for restarts + if [ "${SAVEFCSTIC}" = "YES" ]; then + targrp_list="${targrp_list} gfs_restarta" + fi + + #for downstream products + if [ "${DO_BUFRSND}" = "YES" ] || [ "${WAFSF}" = "YES" ]; then + targrp_list="${targrp_list} gfs_downstream" + fi + + #--save mdl gfsmos output from all cycles in the 18Z archive directory + if [[ -d "gfsmos.${PDY_MOS}" ]] && [[ "${cyc}" -eq 18 ]]; then + set +e + # TODO: Why is this tar being done here instead of being added to the list? + ${TARCMD} -P -cvf "${ATARDIR}/${CDATE_MOS}/gfsmos.tar" "./gfsmos.${PDY_MOS}" + status=$? + if [[ "${status}" -ne 0 ]] && [[ "${PDY}${cyc}" -ge "${firstday}" ]]; then + echo "${TARCMD^^} ${PDY}${cyc} gfsmos.tar failed" + exit "${status}" + fi + set_strict + fi + elif [[ "${RUN}" = "gdas" ]]; then + + targrp_list="gdas" + + #gdaswave + if [ "${DO_WAVE}" = "YES" ]; then + targrp_list="${targrp_list} gdaswave" + fi + + #gdasocean + if [ "${DO_OCN}" = "YES" ]; then + targrp_list="${targrp_list} gdasocean" + fi + + #gdasice + if [ "${DO_ICE}" = "YES" ]; then + targrp_list="${targrp_list} gdasice" + fi + + if [ "${SAVEWARMICA}" = "YES" ] || [ "${SAVEFCSTIC}" = "YES" ]; then + targrp_list="${targrp_list} gdas_restarta" + if [ "${DO_WAVE}" = "YES" ]; then targrp_list="${targrp_list} gdaswave_restart"; fi + if [ "${DO_OCN}" = "YES" ]; then targrp_list="${targrp_list} gdasocean_restart"; fi + if [ "${DO_ICE}" = "YES" ]; then targrp_list="${targrp_list} gdasice_restart"; fi + fi + + if [ "${SAVEWARMICB}" = "YES" ] || [ "${SAVEFCSTIC}" = "YES" ]; then + targrp_list="${targrp_list} gdas_restartb" + fi + fi + + # Turn on extended globbing options + shopt -s extglob + for targrp in ${targrp_list}; do + set +e + ${TARCMD} -P -cvf "${ATARDIR}/${PDY}${cyc}/${targrp}.tar" $(cat "${ARCH_LIST}/${targrp}.txt") + status=$?
+ case ${targrp} in + 'gdas'|'gdas_restarta') + ${HSICMD} chgrp rstprod "${ATARDIR}/${CDATE}/${targrp}.tar" + ${HSICMD} chmod 640 "${ATARDIR}/${CDATE}/${targrp}.tar" + ;; + *) ;; + esac + if [ "${status}" -ne 0 ] && [ "${PDY}${cyc}" -ge "${firstday}" ]; then + echo "FATAL ERROR: ${TARCMD} ${PDY}${cyc} ${targrp}.tar failed" + exit "${status}" + fi + set_strict + done + # Turn extended globbing back off + shopt -u extglob + +############################################################### +fi ##end of HPSS archive +############################################################### + + + +############################################################### +# Clean up previous cycles; various depths +# PRIOR CYCLE: Leave the prior cycle alone +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") + +# PREVIOUS to the PRIOR CYCLE +GDATE=$(${NDATE} -"${assim_freq}" "${GDATE}") +gPDY="${GDATE:0:8}" +gcyc="${GDATE:8:2}" + +# Remove the TMPDIR directory +# TODO Only prepbufr is currently using this directory, and all jobs should be +# cleaning up after themselves anyway +COMIN="${DATAROOT}/${GDATE}" +[[ -d ${COMIN} ]] && rm -rf "${COMIN}" + +if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then + exit 0 +fi + +# Step back every assim_freq hours and remove old rotating directories +# for successful cycles (defaults from 24h to 120h). If GLDAS is +# active, retain files needed by GLDAS update. Independent of GLDAS, +# retain files needed by Fit2Obs +# TODO: This whole section needs to be revamped to remove marine component +# directories and not look at the rocoto log. +DO_GLDAS=${DO_GLDAS:-"NO"} +GDATEEND=$(${NDATE} -"${RMOLDEND:-24}" "${PDY}${cyc}") +GDATE=$(${NDATE} -"${RMOLDSTD:-120}" "${PDY}${cyc}") +GLDAS_DATE=$(${NDATE} -96 "${PDY}${cyc}") +RTOFS_DATE=$(${NDATE} -48 "${PDY}${cyc}") +function remove_files() { + # TODO: move this to a new location + local directory=$1 + shift + if [[ ! 
-d ${directory} ]]; then + echo "No directory ${directory} to remove files from, skipping" + return + fi + local exclude_list="" + if (($# > 0)); then + exclude_list=$* + fi + local file_list + declare -a file_list + readarray -t file_list < <(find -L "${directory}" -type f) + if (( ${#file_list[@]} == 0 )); then return; fi + # echo "Number of files to remove before exclusions: ${#file_list[@]}" + for exclude in ${exclude_list}; do + echo "Excluding ${exclude}" + declare -a file_list_old=("${file_list[@]}") + readarray -t file_list < <(printf -- '%s\n' "${file_list_old[@]}" | grep -v "${exclude}") + # echo "Number of files to remove after exclusion: ${#file_list[@]}" + if (( ${#file_list[@]} == 0 )); then return; fi + done + # echo "Number of files to remove after exclusions: ${#file_list[@]}" + + for file in "${file_list[@]}"; do + rm -f "${file}" + done + # Remove directory if empty + rmdir "${directory}" || true +} + +while [ "${GDATE}" -le "${GDATEEND}" ]; do + gPDY="${GDATE:0:8}" + gcyc="${GDATE:8:2}" + COMINrtofs="${ROTDIR}/rtofs.${gPDY}" + if [ -d "${COM_TOP}" ]; then + rocotolog="${EXPDIR}/logs/${GDATE}.log" + if [ -f "${rocotolog}" ]; then + set +e + testend=$(tail -n 1 "${rocotolog}" | grep "This cycle is complete: Success") + rc=$?
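The `remove_files` helper above prunes its candidate list by re-reading it through `grep -v` once per exclude pattern. A self-contained sketch of that filtering step, using made-up file names (note `readarray -t` is used so entries carry no trailing newlines, which `rm` would otherwise fail to match):

```shell
# Stand-in candidate list (illustrative paths only).
file_list=("obs/prepbufr.t00z" "obs/satwnd.tm00.bufr_d" "obs/adpsfc.tm00.bufr_d")

# Drop every entry matching the exclude pattern, as remove_files does.
exclude="prepbufr"
file_list_old=("${file_list[@]}")
readarray -t file_list < <(printf -- '%s\n' "${file_list_old[@]}" | grep -v "${exclude}")

printf '%s\n' "${file_list[@]}"   # only satwnd and adpsfc survive
```

Each pass rebuilds the array from the filtered stream, so multiple exclude patterns simply chain: whatever survives one `grep -v` feeds the next.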
+ set_strict + + if [ "${rc}" -eq 0 ]; then + # Obs + exclude_list="prepbufr" + templates="COM_OBS" + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Atmos + exclude_list="cnvstat atmanl.nc" + if [[ ${DO_GLDAS} == "YES" ]] && [[ ${RUN} =~ "gdas" ]] && [[ "${GDATE}" -ge "${GLDAS_DATE}" ]]; then + exclude_list="${exclude_list} sflux sfcanl" + fi + templates=$(compgen -A variable | grep 'COM_ATMOS_.*_TMPL') + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Wave + exclude_list="" + templates=$(compgen -A variable | grep 'COM_WAVE_.*_TMPL') + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Ocean + exclude_list="" + templates=$(compgen -A variable | grep 'COM_OCEAN_.*_TMPL') + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Ice + exclude_list="" + templates=$(compgen -A variable | grep 'COM_ICE_.*_TMPL') + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Aerosols (GOCART) + exclude_list="" + templates=$(compgen -A variable | grep 'COM_CHEM_.*_TMPL') + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + # Mediator + exclude_list="" + templates=$(compgen -A variable | grep 'COM_MED_.*_TMPL') + for template in ${templates}; do + YMD="${gPDY}" HH="${gcyc}" generate_com "directory:${template}" + remove_files "${directory}" "${exclude_list[@]}" + done + + if [ -d "${COMINrtofs}" ] && [ "${GDATE}" -lt 
"${RTOFS_DATE}" ]; then rm -rf "${COMINrtofs}" ; fi + fi + fi + fi + + # Remove mdl gfsmos directory + if [ "${RUN}" = "gfs" ]; then + COMIN="${ROTDIR}/gfsmos.${gPDY}" + if [ -d "${COMIN}" ] && [ "${GDATE}" -lt "${CDATE_MOS}" ]; then rm -rf "${COMIN}" ; fi + fi + + # Remove any empty directories + target_dir="${ROTDIR:?}/${RUN}.${gPDY}/${gcyc}/" + if [[ -d ${target_dir} ]]; then + find "${target_dir}" -empty -type d -delete + fi + + GDATE=$(${NDATE} +"${assim_freq}" "${GDATE}") +done + +# Remove archived gaussian files used for Fit2Obs in $VFYARC that are +# $FHMAX_FITS plus a delta before $CDATE. Touch existing archived +# gaussian files to prevent the files from being removed by automatic +# scrubber present on some machines. + +if [ "${RUN}" = "gfs" ]; then + fhmax=$((FHMAX_FITS+36)) + RDATE=$(${NDATE} -"${fhmax}" "${PDY}${cyc}") + rPDY=$(echo "${RDATE}" | cut -c1-8) + COMIN="${VFYARC}/${RUN}.${rPDY}" + [[ -d ${COMIN} ]] && rm -rf "${COMIN}" + + TDATE=$(${NDATE} -"${FHMAX_FITS}" "${PDY}${cyc}") + while [ "${TDATE}" -lt "${PDY}${cyc}" ]; do + tPDY=$(echo "${TDATE}" | cut -c1-8) + tcyc=$(echo "${TDATE}" | cut -c9-10) + TDIR=${VFYARC}/${RUN}.${tPDY}/${tcyc} + [[ -d ${TDIR} ]] && touch "${TDIR}"/* + TDATE=$(${NDATE} +6 "${TDATE}") + done +fi + +# Remove $RUN.$rPDY for the older of GDATE or RDATE +GDATE=$(${NDATE} -"${RMOLDSTD:-120}" "${PDY}${cyc}") +fhmax=${FHMAX_GFS} +RDATE=$(${NDATE} -"${fhmax}" "${PDY}${cyc}") +if [ "${GDATE}" -lt "${RDATE}" ]; then + RDATE=${GDATE} +fi +rPDY=$(echo "${RDATE}" | cut -c1-8) +COMIN="${ROTDIR}/${RUN}.${rPDY}" +[[ -d ${COMIN} ]] && rm -rf "${COMIN}" + + +############################################################### + + +exit 0 diff --git a/scripts/exglobal_atm_analysis_finalize.py b/scripts/exglobal_atm_analysis_finalize.py new file mode 100755 index 00000000000..cd6938e210a --- /dev/null +++ b/scripts/exglobal_atm_analysis_finalize.py @@ -0,0 +1,25 @@ +#!/usr/bin/env python3 +# exgdas_global_atm_analysis_finalize.py +# This script 
creates an AtmAnalysis class +# and runs the finalize method +# which performs post-processing and clean-up activities +# for a global atm variational analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.atm_analysis import AtmAnalysis + + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the atm analysis task + AtmAnl = AtmAnalysis(config) + AtmAnl.finalize() diff --git a/scripts/exglobal_atm_analysis_initialize.py b/scripts/exglobal_atm_analysis_initialize.py new file mode 100755 index 00000000000..b003d98c005 --- /dev/null +++ b/scripts/exglobal_atm_analysis_initialize.py @@ -0,0 +1,25 @@ +#!/usr/bin/env python3 +# exgdas_global_atm_analysis_initialize.py +# This script creates an AtmAnalysis class +# and runs the initialize method +# which creates and stages the runtime directory +# and creates the YAML configuration +# for a global atm variational analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.atm_analysis import AtmAnalysis + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the atm analysis task + AtmAnl = AtmAnalysis(config) + AtmAnl.initialize() diff --git a/scripts/exglobal_atm_analysis_run.py b/scripts/exglobal_atm_analysis_run.py new file mode 100755 index 00000000000..e1f44208c92 --- /dev/null +++ b/scripts/exglobal_atm_analysis_run.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python3 +# exgdas_global_atm_analysis_run.py +# This script creates an AtmAnalysis object +# and runs the execute method +# which executes the
global atm variational analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.atm_analysis import AtmAnalysis + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the atm analysis task + AtmAnl = AtmAnalysis(config) + AtmAnl.execute() diff --git a/scripts/exglobal_atmens_analysis_finalize.py b/scripts/exglobal_atmens_analysis_finalize.py new file mode 100755 index 00000000000..5271c5c486e --- /dev/null +++ b/scripts/exglobal_atmens_analysis_finalize.py @@ -0,0 +1,25 @@ +#!/usr/bin/env python3 +# exgdas_global_atmens_analysis_finalize.py +# This script creates an AtmEnsAnalysis class +# and runs the finalize method +# which performs post-processing and clean-up activities +# for a global atm local ensemble analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.atmens_analysis import AtmEnsAnalysis + + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the atmens analysis task + AtmEnsAnl = AtmEnsAnalysis(config) + AtmEnsAnl.finalize() diff --git a/scripts/exglobal_atmens_analysis_initialize.py b/scripts/exglobal_atmens_analysis_initialize.py new file mode 100755 index 00000000000..97326ddf3de --- /dev/null +++ b/scripts/exglobal_atmens_analysis_initialize.py @@ -0,0 +1,25 @@ +#!/usr/bin/env python3 +# exgdas_global_atmens_analysis_initialize.py +# This script creates an AtmEnsAnalysis class +# and runs the initialize method +# which creates and stages the runtime directory +# and creates the YAML configuration +# for a global atm
local ensemble analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.atmens_analysis import AtmEnsAnalysis + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the atmens analysis task + AtmEnsAnl = AtmEnsAnalysis(config) + AtmEnsAnl.initialize() diff --git a/scripts/exglobal_atmens_analysis_run.py b/scripts/exglobal_atmens_analysis_run.py new file mode 100755 index 00000000000..2de95e850da --- /dev/null +++ b/scripts/exglobal_atmens_analysis_run.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python3 +# exgdas_global_atmens_analysis_run.py +# This script creates an AtmEnsAnalysis object +# and runs the execute method +# which executes the global atm local ensemble analysis +import os + +from pygw.logger import Logger +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.atmens_analysis import AtmEnsAnalysis + +# Initialize root logger +logger = Logger(level='DEBUG', colored_log=True) + + +if __name__ == '__main__': + + # Take configuration from environment and cast it as python dictionary + config = cast_strdict_as_dtypedict(os.environ) + + # Instantiate the atmens analysis task + AtmEnsAnl = AtmEnsAnalysis(config) + AtmEnsAnl.execute() diff --git a/scripts/exglobal_atmos_analysis.sh b/scripts/exglobal_atmos_analysis.sh index 7970b9b3d8a..ccedbdabda6 100755 --- a/scripts/exglobal_atmos_analysis.sh +++ b/scripts/exglobal_atmos_analysis.sh @@ -19,7 +19,7 @@ # Set environment. -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" # Directories. 
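The derived base variables just below split `CDATE` (a YYYYMMDDHH string) into date and cycle fields with `cut`. A quick sketch of those two slices with an illustrative date:

```shell
CDATE=2023051800                      # YYYYMMDDHH, example value only
PDY=$(echo ${CDATE} | cut -c1-8)      # date portion, as in the script
cyc=$(echo ${CDATE} | cut -c9-10)     # cycle hour, as in the script

echo "${PDY} ${cyc}"                  # -> 20230518 00
```

The same decomposition (characters 1-8 for the date, 9-10 for the hour) is applied again to `BDATE` for `bPDY`/`bcyc`.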
pwd=$(pwd) @@ -30,20 +30,19 @@ CDUMP=${CDUMP:-"gdas"} GDUMP=${GDUMP:-"gdas"} # Derived base variables -GDATE=$($NDATE -$assim_freq $CDATE) -BDATE=$($NDATE -3 $CDATE) -PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) -bPDY=$(echo $BDATE | cut -c1-8) -bcyc=$(echo $BDATE | cut -c9-10) +GDATE=$(${NDATE} -${assim_freq} ${CDATE}) +BDATE=$(${NDATE} -3 ${CDATE}) +PDY=$(echo ${CDATE} | cut -c1-8) +cyc=$(echo ${CDATE} | cut -c9-10) +bPDY=$(echo ${BDATE} | cut -c1-8) +bcyc=$(echo ${BDATE} | cut -c9-10) # Utilities export NCP=${NCP:-"/bin/cp"} export NMV=${NMV:-"/bin/mv"} export NLN=${NLN:-"/bin/ln -sf"} export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} -export NEMSIOGET=${NEMSIOGET:-${NWPROD}/exec/nemsio_get} -export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} +export NCLEN=${NCLEN:-${HOMEgfs}/ush/getncdimlen} COMPRESS=${COMPRESS:-gzip} UNCOMPRESS=${UNCOMPRESS:-gunzip} APRUNCFP=${APRUNCFP:-""} @@ -69,20 +68,19 @@ DOIAU=${DOIAU:-"NO"} export IAUFHRS=${IAUFHRS:-"6"} # Dependent Scripts and Executables -GSIEXEC=${GSIEXEC:-$HOMEgfs/exec/gsi.x} +GSIEXEC=${GSIEXEC:-${HOMEgfs}/exec/gsi.x} export NTHREADS_CALCINC=${NTHREADS_CALCINC:-1} export APRUN_CALCINC=${APRUN_CALCINC:-${APRUN:-""}} export APRUN_CALCANL=${APRUN_CALCANL:-${APRUN:-""}} export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}} -export CALCINCEXEC=${CALCINCEXEC:-$HOMEgfs/exec/calc_increment_ens.x} -export CALCINCNCEXEC=${CALCINCNCEXEC:-$HOMEgfs/exec/calc_increment_ens_ncio.x} -export CALCANLEXEC=${CALCANLEXEC:-$HOMEgfs/exec/calc_analysis.x} -export CHGRESNCEXEC=${CHGRESNCEXEC:-$HOMEgfs/exec/enkf_chgres_recenter_nc.x} -export CHGRESINCEXEC=${CHGRESINCEXEC:-$HOMEgfs/exec/interp_inc.x} -CHGRESEXEC=${CHGRESEXEC:-$HOMEgfs/exec/enkf_chgres_recenter.x} +export CALCINCEXEC=${CALCINCEXEC:-${HOMEgfs}/exec/calc_increment_ens.x} +export CALCINCNCEXEC=${CALCINCNCEXEC:-${HOMEgfs}/exec/calc_increment_ens_ncio.x} +export CALCANLEXEC=${CALCANLEXEC:-${HOMEgfs}/exec/calc_analysis.x} +export 
CHGRESNCEXEC=${CHGRESNCEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter_nc.x} +export CHGRESINCEXEC=${CHGRESINCEXEC:-${HOMEgfs}/exec/interp_inc.x} +CHGRESEXEC=${CHGRESEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter.x} export NTHREADS_CHGRES=${NTHREADS_CHGRES:-24} -CALCINCPY=${CALCINCPY:-$HOMEgfs/ush/calcinc_gfs.py} -CALCANLPY=${CALCANLPY:-$HOMEgfs/ush/calcanl_gfs.py} +CALCINCPY=${CALCINCPY:-${HOMEgfs}/ush/calcinc_gfs.py} # OPS flags RUN=${RUN:-""} @@ -90,121 +88,120 @@ SENDECF=${SENDECF:-"NO"} SENDDBN=${SENDDBN:-"NO"} RUN_GETGES=${RUN_GETGES:-"NO"} GETGESSH=${GETGESSH:-"getges.sh"} -export gesenvir=${gesenvir:-$envir} +export gesenvir=${gesenvir:-${envir}} # Observations OPREFIX=${OPREFIX:-""} OSUFFIX=${OSUFFIX:-""} -PREPQC=${PREPQC:-${COMIN_OBS}/${OPREFIX}prepbufr${OSUFFIX}} -PREPQCPF=${PREPQCPF:-${COMIN_OBS}/${OPREFIX}prepbufr.acft_profiles${OSUFFIX}} -NSSTBF=${NSSTBF:-${COMIN_OBS}/${OPREFIX}nsstbufr${OSUFFIX}} -SATWND=${SATWND:-${COMIN_OBS}/${OPREFIX}satwnd.tm00.bufr_d${OSUFFIX}} -OSCATBF=${OSCATBF:-${COMIN_OBS}/${OPREFIX}oscatw.tm00.bufr_d${OSUFFIX}} -RAPIDSCATBF=${RAPIDSCATBF:-${COMIN_OBS}/${OPREFIX}rapidscatw.tm00.bufr_d${OSUFFIX}} -GSNDBF=${GSNDBF:-${COMIN_OBS}/${OPREFIX}goesnd.tm00.bufr_d${OSUFFIX}} -GSNDBF1=${GSNDBF1:-${COMIN_OBS}/${OPREFIX}goesfv.tm00.bufr_d${OSUFFIX}} -B1HRS2=${B1HRS2:-${COMIN_OBS}/${OPREFIX}1bhrs2.tm00.bufr_d${OSUFFIX}} -B1MSU=${B1MSU:-${COMIN_OBS}/${OPREFIX}1bmsu.tm00.bufr_d${OSUFFIX}} -B1HRS3=${B1HRS3:-${COMIN_OBS}/${OPREFIX}1bhrs3.tm00.bufr_d${OSUFFIX}} -B1HRS4=${B1HRS4:-${COMIN_OBS}/${OPREFIX}1bhrs4.tm00.bufr_d${OSUFFIX}} -B1AMUA=${B1AMUA:-${COMIN_OBS}/${OPREFIX}1bamua.tm00.bufr_d${OSUFFIX}} -B1AMUB=${B1AMUB:-${COMIN_OBS}/${OPREFIX}1bamub.tm00.bufr_d${OSUFFIX}} -B1MHS=${B1MHS:-${COMIN_OBS}/${OPREFIX}1bmhs.tm00.bufr_d${OSUFFIX}} -ESHRS3=${ESHRS3:-${COMIN_OBS}/${OPREFIX}eshrs3.tm00.bufr_d${OSUFFIX}} -ESAMUA=${ESAMUA:-${COMIN_OBS}/${OPREFIX}esamua.tm00.bufr_d${OSUFFIX}} -ESAMUB=${ESAMUB:-${COMIN_OBS}/${OPREFIX}esamub.tm00.bufr_d${OSUFFIX}} 
-ESMHS=${ESMHS:-${COMIN_OBS}/${OPREFIX}esmhs.tm00.bufr_d${OSUFFIX}} -HRS3DB=${HRS3DB:-${COMIN_OBS}/${OPREFIX}hrs3db.tm00.bufr_d${OSUFFIX}} -AMUADB=${AMUADB:-${COMIN_OBS}/${OPREFIX}amuadb.tm00.bufr_d${OSUFFIX}} -AMUBDB=${AMUBDB:-${COMIN_OBS}/${OPREFIX}amubdb.tm00.bufr_d${OSUFFIX}} -MHSDB=${MHSDB:-${COMIN_OBS}/${OPREFIX}mhsdb.tm00.bufr_d${OSUFFIX}} -AIRSBF=${AIRSBF:-${COMIN_OBS}/${OPREFIX}airsev.tm00.bufr_d${OSUFFIX}} -IASIBF=${IASIBF:-${COMIN_OBS}/${OPREFIX}mtiasi.tm00.bufr_d${OSUFFIX}} -ESIASI=${ESIASI:-${COMIN_OBS}/${OPREFIX}esiasi.tm00.bufr_d${OSUFFIX}} -IASIDB=${IASIDB:-${COMIN_OBS}/${OPREFIX}iasidb.tm00.bufr_d${OSUFFIX}} -AMSREBF=${AMSREBF:-${COMIN_OBS}/${OPREFIX}amsre.tm00.bufr_d${OSUFFIX}} -AMSR2BF=${AMSR2BF:-${COMIN_OBS}/${OPREFIX}amsr2.tm00.bufr_d${OSUFFIX}} -GMI1CRBF=${GMI1CRBF:-${COMIN_OBS}/${OPREFIX}gmi1cr.tm00.bufr_d${OSUFFIX}} -SAPHIRBF=${SAPHIRBF:-${COMIN_OBS}/${OPREFIX}saphir.tm00.bufr_d${OSUFFIX}} -SEVIRIBF=${SEVIRIBF:-${COMIN_OBS}/${OPREFIX}sevcsr.tm00.bufr_d${OSUFFIX}} -AHIBF=${AHIBF:-${COMIN_OBS}/${OPREFIX}ahicsr.tm00.bufr_d${OSUFFIX}} -SSTVIIRS=${SSTVIIRS:-${COMIN_OBS}/${OPREFIX}sstvcw.tm00.bufr_d${OSUFFIX}} -ABIBF=${ABIBF:-${COMIN_OBS}/${OPREFIX}gsrcsr.tm00.bufr_d${OSUFFIX}} -CRISBF=${CRISBF:-${COMIN_OBS}/${OPREFIX}cris.tm00.bufr_d${OSUFFIX}} -ESCRIS=${ESCRIS:-${COMIN_OBS}/${OPREFIX}escris.tm00.bufr_d${OSUFFIX}} -CRISDB=${CRISDB:-${COMIN_OBS}/${OPREFIX}crisdb.tm00.bufr_d${OSUFFIX}} -CRISFSBF=${CRISFSBF:-${COMIN_OBS}/${OPREFIX}crisf4.tm00.bufr_d${OSUFFIX}} -ESCRISFS=${ESCRISFS:-${COMIN_OBS}/${OPREFIX}escrsf.tm00.bufr_d${OSUFFIX}} -CRISFSDB=${CRISFSDB:-${COMIN_OBS}/${OPREFIX}crsfdb.tm00.bufr_d${OSUFFIX}} -ATMSBF=${ATMSBF:-${COMIN_OBS}/${OPREFIX}atms.tm00.bufr_d${OSUFFIX}} -ESATMS=${ESATMS:-${COMIN_OBS}/${OPREFIX}esatms.tm00.bufr_d${OSUFFIX}} -ATMSDB=${ATMSDB:-${COMIN_OBS}/${OPREFIX}atmsdb.tm00.bufr_d${OSUFFIX}} -SSMITBF=${SSMITBF:-${COMIN_OBS}/${OPREFIX}ssmit.tm00.bufr_d${OSUFFIX}} 
-SSMISBF=${SSMISBF:-${COMIN_OBS}/${OPREFIX}ssmisu.tm00.bufr_d${OSUFFIX}} -SBUVBF=${SBUVBF:-${COMIN_OBS}/${OPREFIX}osbuv8.tm00.bufr_d${OSUFFIX}} -OMPSNPBF=${OMPSNPBF:-${COMIN_OBS}/${OPREFIX}ompsn8.tm00.bufr_d${OSUFFIX}} -OMPSTCBF=${OMPSTCBF:-${COMIN_OBS}/${OPREFIX}ompst8.tm00.bufr_d${OSUFFIX}} -OMPSLPBF=${OMPSLPBF:-${COMIN_OBS}/${OPREFIX}ompslp.tm00.bufr_d${OSUFFIX}} -GOMEBF=${GOMEBF:-${COMIN_OBS}/${OPREFIX}gome.tm00.bufr_d${OSUFFIX}} -OMIBF=${OMIBF:-${COMIN_OBS}/${OPREFIX}omi.tm00.bufr_d${OSUFFIX}} -MLSBF=${MLSBF:-${COMIN_OBS}/${OPREFIX}mls.tm00.bufr_d${OSUFFIX}} -SMIPCP=${SMIPCP:-${COMIN_OBS}/${OPREFIX}spssmi.tm00.bufr_d${OSUFFIX}} -TMIPCP=${TMIPCP:-${COMIN_OBS}/${OPREFIX}sptrmm.tm00.bufr_d${OSUFFIX}} -GPSROBF=${GPSROBF:-${COMIN_OBS}/${OPREFIX}gpsro.tm00.bufr_d${OSUFFIX}} -TCVITL=${TCVITL:-${COMIN_OBS}/${OPREFIX}syndata.tcvitals.tm00} -B1AVHAM=${B1AVHAM:-${COMIN_OBS}/${OPREFIX}avcsam.tm00.bufr_d${OSUFFIX}} -B1AVHPM=${B1AVHPM:-${COMIN_OBS}/${OPREFIX}avcspm.tm00.bufr_d${OSUFFIX}} -HDOB=${HDOB:-${COMIN_OBS}/${OPREFIX}hdob.tm00.bufr_d${OSUFFIX}} +PREPQC=${PREPQC:-${COM_OBS}/${OPREFIX}prepbufr${OSUFFIX}} +PREPQCPF=${PREPQCPF:-${COM_OBS}/${OPREFIX}prepbufr.acft_profiles${OSUFFIX}} +NSSTBF=${NSSTBF:-${COM_OBS}/${OPREFIX}nsstbufr${OSUFFIX}} +SATWND=${SATWND:-${COM_OBS}/${OPREFIX}satwnd.tm00.bufr_d${OSUFFIX}} +OSCATBF=${OSCATBF:-${COM_OBS}/${OPREFIX}oscatw.tm00.bufr_d${OSUFFIX}} +RAPIDSCATBF=${RAPIDSCATBF:-${COM_OBS}/${OPREFIX}rapidscatw.tm00.bufr_d${OSUFFIX}} +GSNDBF=${GSNDBF:-${COM_OBS}/${OPREFIX}goesnd.tm00.bufr_d${OSUFFIX}} +GSNDBF1=${GSNDBF1:-${COM_OBS}/${OPREFIX}goesfv.tm00.bufr_d${OSUFFIX}} +B1HRS2=${B1HRS2:-${COM_OBS}/${OPREFIX}1bhrs2.tm00.bufr_d${OSUFFIX}} +B1MSU=${B1MSU:-${COM_OBS}/${OPREFIX}1bmsu.tm00.bufr_d${OSUFFIX}} +B1HRS3=${B1HRS3:-${COM_OBS}/${OPREFIX}1bhrs3.tm00.bufr_d${OSUFFIX}} +B1HRS4=${B1HRS4:-${COM_OBS}/${OPREFIX}1bhrs4.tm00.bufr_d${OSUFFIX}} +B1AMUA=${B1AMUA:-${COM_OBS}/${OPREFIX}1bamua.tm00.bufr_d${OSUFFIX}} 
+B1AMUB=${B1AMUB:-${COM_OBS}/${OPREFIX}1bamub.tm00.bufr_d${OSUFFIX}} +B1MHS=${B1MHS:-${COM_OBS}/${OPREFIX}1bmhs.tm00.bufr_d${OSUFFIX}} +ESHRS3=${ESHRS3:-${COM_OBS}/${OPREFIX}eshrs3.tm00.bufr_d${OSUFFIX}} +ESAMUA=${ESAMUA:-${COM_OBS}/${OPREFIX}esamua.tm00.bufr_d${OSUFFIX}} +ESAMUB=${ESAMUB:-${COM_OBS}/${OPREFIX}esamub.tm00.bufr_d${OSUFFIX}} +ESMHS=${ESMHS:-${COM_OBS}/${OPREFIX}esmhs.tm00.bufr_d${OSUFFIX}} +HRS3DB=${HRS3DB:-${COM_OBS}/${OPREFIX}hrs3db.tm00.bufr_d${OSUFFIX}} +AMUADB=${AMUADB:-${COM_OBS}/${OPREFIX}amuadb.tm00.bufr_d${OSUFFIX}} +AMUBDB=${AMUBDB:-${COM_OBS}/${OPREFIX}amubdb.tm00.bufr_d${OSUFFIX}} +MHSDB=${MHSDB:-${COM_OBS}/${OPREFIX}mhsdb.tm00.bufr_d${OSUFFIX}} +AIRSBF=${AIRSBF:-${COM_OBS}/${OPREFIX}airsev.tm00.bufr_d${OSUFFIX}} +IASIBF=${IASIBF:-${COM_OBS}/${OPREFIX}mtiasi.tm00.bufr_d${OSUFFIX}} +ESIASI=${ESIASI:-${COM_OBS}/${OPREFIX}esiasi.tm00.bufr_d${OSUFFIX}} +IASIDB=${IASIDB:-${COM_OBS}/${OPREFIX}iasidb.tm00.bufr_d${OSUFFIX}} +AMSREBF=${AMSREBF:-${COM_OBS}/${OPREFIX}amsre.tm00.bufr_d${OSUFFIX}} +AMSR2BF=${AMSR2BF:-${COM_OBS}/${OPREFIX}amsr2.tm00.bufr_d${OSUFFIX}} +GMI1CRBF=${GMI1CRBF:-${COM_OBS}/${OPREFIX}gmi1cr.tm00.bufr_d${OSUFFIX}} # GMI temporarily disabled due to array overflow. 
+SAPHIRBF=${SAPHIRBF:-${COM_OBS}/${OPREFIX}saphir.tm00.bufr_d${OSUFFIX}} +SEVIRIBF=${SEVIRIBF:-${COM_OBS}/${OPREFIX}sevcsr.tm00.bufr_d${OSUFFIX}} +AHIBF=${AHIBF:-${COM_OBS}/${OPREFIX}ahicsr.tm00.bufr_d${OSUFFIX}} +SSTVIIRS=${SSTVIIRS:-${COM_OBS}/${OPREFIX}sstvcw.tm00.bufr_d${OSUFFIX}} +ABIBF=${ABIBF:-${COM_OBS}/${OPREFIX}gsrcsr.tm00.bufr_d${OSUFFIX}} +CRISBF=${CRISBF:-${COM_OBS}/${OPREFIX}cris.tm00.bufr_d${OSUFFIX}} +ESCRIS=${ESCRIS:-${COM_OBS}/${OPREFIX}escris.tm00.bufr_d${OSUFFIX}} +CRISDB=${CRISDB:-${COM_OBS}/${OPREFIX}crisdb.tm00.bufr_d${OSUFFIX}} +CRISFSBF=${CRISFSBF:-${COM_OBS}/${OPREFIX}crisf4.tm00.bufr_d${OSUFFIX}} +ESCRISFS=${ESCRISFS:-${COM_OBS}/${OPREFIX}escrsf.tm00.bufr_d${OSUFFIX}} +CRISFSDB=${CRISFSDB:-${COM_OBS}/${OPREFIX}crsfdb.tm00.bufr_d${OSUFFIX}} +ATMSBF=${ATMSBF:-${COM_OBS}/${OPREFIX}atms.tm00.bufr_d${OSUFFIX}} +ESATMS=${ESATMS:-${COM_OBS}/${OPREFIX}esatms.tm00.bufr_d${OSUFFIX}} +ATMSDB=${ATMSDB:-${COM_OBS}/${OPREFIX}atmsdb.tm00.bufr_d${OSUFFIX}} +SSMITBF=${SSMITBF:-${COM_OBS}/${OPREFIX}ssmit.tm00.bufr_d${OSUFFIX}} +SSMISBF=${SSMISBF:-${COM_OBS}/${OPREFIX}ssmisu.tm00.bufr_d${OSUFFIX}} +SBUVBF=${SBUVBF:-${COM_OBS}/${OPREFIX}osbuv8.tm00.bufr_d${OSUFFIX}} +OMPSNPBF=${OMPSNPBF:-${COM_OBS}/${OPREFIX}ompsn8.tm00.bufr_d${OSUFFIX}} +OMPSTCBF=${OMPSTCBF:-${COM_OBS}/${OPREFIX}ompst8.tm00.bufr_d${OSUFFIX}} +OMPSLPBF=${OMPSLPBF:-${COM_OBS}/${OPREFIX}ompslp.tm00.bufr_d${OSUFFIX}} +GOMEBF=${GOMEBF:-${COM_OBS}/${OPREFIX}gome.tm00.bufr_d${OSUFFIX}} +OMIBF=${OMIBF:-${COM_OBS}/${OPREFIX}omi.tm00.bufr_d${OSUFFIX}} +MLSBF=${MLSBF:-${COM_OBS}/${OPREFIX}mls.tm00.bufr_d${OSUFFIX}} +SMIPCP=${SMIPCP:-${COM_OBS}/${OPREFIX}spssmi.tm00.bufr_d${OSUFFIX}} +TMIPCP=${TMIPCP:-${COM_OBS}/${OPREFIX}sptrmm.tm00.bufr_d${OSUFFIX}} +GPSROBF=${GPSROBF:-${COM_OBS}/${OPREFIX}gpsro.tm00.bufr_d${OSUFFIX}} +TCVITL=${TCVITL:-${COM_OBS}/${OPREFIX}syndata.tcvitals.tm00} +B1AVHAM=${B1AVHAM:-${COM_OBS}/${OPREFIX}avcsam.tm00.bufr_d${OSUFFIX}} 
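Every observation path in this hunk follows bash's `${VAR:-default}` idiom, so a value exported by the calling job wins over the `COM_OBS`-based default. A small sketch of that precedence (all paths here are illustrative, not real COM locations):

```shell
COM_OBS=/com/gfs/obs
OPREFIX="gdas.t00z."

# Unset variable: the expansion falls back to the COM_OBS-based default.
unset SATWND
SATWND=${SATWND:-${COM_OBS}/${OPREFIX}satwnd.tm00.bufr_d}
echo "${SATWND}"     # -> /com/gfs/obs/gdas.t00z.satwnd.tm00.bufr_d

# Preset variable: the default is ignored entirely.
HDOB=/override/hdob.bufr
HDOB=${HDOB:-${COM_OBS}/${OPREFIX}hdob.tm00.bufr_d}
echo "${HDOB}"       # -> /override/hdob.bufr
```

This is why the COMIN_OBS to COM_OBS rename must touch every default: the fallback path is baked into each assignment rather than computed once.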
+B1AVHPM=${B1AVHPM:-${COM_OBS}/${OPREFIX}avcspm.tm00.bufr_d${OSUFFIX}} +HDOB=${HDOB:-${COM_OBS}/${OPREFIX}hdob.tm00.bufr_d${OSUFFIX}} # Guess files GPREFIX=${GPREFIX:-""} -GSUFFIX=${GSUFFIX:-$SUFFIX} -SFCG03=${SFCG03:-${COMIN_GES}/${GPREFIX}sfcf003${GSUFFIX}} -SFCG04=${SFCG04:-${COMIN_GES}/${GPREFIX}sfcf004${GSUFFIX}} -SFCG05=${SFCG05:-${COMIN_GES}/${GPREFIX}sfcf005${GSUFFIX}} -SFCGES=${SFCGES:-${COMIN_GES}/${GPREFIX}sfcf006${GSUFFIX}} -SFCG07=${SFCG07:-${COMIN_GES}/${GPREFIX}sfcf007${GSUFFIX}} -SFCG08=${SFCG08:-${COMIN_GES}/${GPREFIX}sfcf008${GSUFFIX}} -SFCG09=${SFCG09:-${COMIN_GES}/${GPREFIX}sfcf009${GSUFFIX}} -ATMG03=${ATMG03:-${COMIN_GES}/${GPREFIX}atmf003${GSUFFIX}} -ATMG04=${ATMG04:-${COMIN_GES}/${GPREFIX}atmf004${GSUFFIX}} -ATMG05=${ATMG05:-${COMIN_GES}/${GPREFIX}atmf005${GSUFFIX}} -ATMGES=${ATMGES:-${COMIN_GES}/${GPREFIX}atmf006${GSUFFIX}} -ATMG07=${ATMG07:-${COMIN_GES}/${GPREFIX}atmf007${GSUFFIX}} -ATMG08=${ATMG08:-${COMIN_GES}/${GPREFIX}atmf008${GSUFFIX}} -ATMG09=${ATMG09:-${COMIN_GES}/${GPREFIX}atmf009${GSUFFIX}} -GBIAS=${GBIAS:-${COMIN_GES}/${GPREFIX}abias} -GBIASPC=${GBIASPC:-${COMIN_GES}/${GPREFIX}abias_pc} -GBIASAIR=${GBIASAIR:-${COMIN_GES}/${GPREFIX}abias_air} -GRADSTAT=${GRADSTAT:-${COMIN_GES}/${GPREFIX}radstat} +GSUFFIX=${GSUFFIX:-".nc"} +SFCG03=${SFCG03:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf003${GSUFFIX}} +SFCG04=${SFCG04:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf004${GSUFFIX}} +SFCG05=${SFCG05:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf005${GSUFFIX}} +SFCGES=${SFCGES:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf006${GSUFFIX}} +SFCG07=${SFCG07:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf007${GSUFFIX}} +SFCG08=${SFCG08:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf008${GSUFFIX}} +SFCG09=${SFCG09:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}sfcf009${GSUFFIX}} +ATMG03=${ATMG03:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf003${GSUFFIX}} +ATMG04=${ATMG04:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf004${GSUFFIX}} 
+ATMG05=${ATMG05:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf005${GSUFFIX}} +ATMGES=${ATMGES:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf006${GSUFFIX}} +ATMG07=${ATMG07:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf007${GSUFFIX}} +ATMG08=${ATMG08:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf008${GSUFFIX}} +ATMG09=${ATMG09:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf009${GSUFFIX}} +GBIAS=${GBIAS:-${COM_ATMOS_ANALYSIS_PREV}/${GPREFIX}abias} +GBIASPC=${GBIASPC:-${COM_ATMOS_ANALYSIS_PREV}/${GPREFIX}abias_pc} +GBIASAIR=${GBIASAIR:-${COM_ATMOS_ANALYSIS_PREV}/${GPREFIX}abias_air} +GRADSTAT=${GRADSTAT:-${COM_ATMOS_ANALYSIS_PREV}/${GPREFIX}radstat} # Analysis files export APREFIX=${APREFIX:-""} -export ASUFFIX=${ASUFFIX:-$SUFFIX} -SFCANL=${SFCANL:-${COMOUT}/${APREFIX}sfcanl${ASUFFIX}} -DTFANL=${DTFANL:-${COMOUT}/${APREFIX}dtfanl.nc} -ATMANL=${ATMANL:-${COMOUT}/${APREFIX}atmanl${ASUFFIX}} -ABIAS=${ABIAS:-${COMOUT}/${APREFIX}abias} -ABIASPC=${ABIASPC:-${COMOUT}/${APREFIX}abias_pc} -ABIASAIR=${ABIASAIR:-${COMOUT}/${APREFIX}abias_air} -ABIASe=${ABIASe:-${COMOUT}/${APREFIX}abias_int} -RADSTAT=${RADSTAT:-${COMOUT}/${APREFIX}radstat} -GSISTAT=${GSISTAT:-${COMOUT}/${APREFIX}gsistat} -PCPSTAT=${PCPSTAT:-${COMOUT}/${APREFIX}pcpstat} -CNVSTAT=${CNVSTAT:-${COMOUT}/${APREFIX}cnvstat} -OZNSTAT=${OZNSTAT:-${COMOUT}/${APREFIX}oznstat} +SFCANL=${SFCANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}sfcanl.nc} +DTFANL=${DTFANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}dtfanl.nc} +ATMANL=${ATMANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmanl.nc} +ABIAS=${ABIAS:-${COM_ATMOS_ANALYSIS}/${APREFIX}abias} +ABIASPC=${ABIASPC:-${COM_ATMOS_ANALYSIS}/${APREFIX}abias_pc} +ABIASAIR=${ABIASAIR:-${COM_ATMOS_ANALYSIS}/${APREFIX}abias_air} +ABIASe=${ABIASe:-${COM_ATMOS_ANALYSIS}/${APREFIX}abias_int} +RADSTAT=${RADSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}radstat} +GSISTAT=${GSISTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}gsistat} +PCPSTAT=${PCPSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}pcpstat} +CNVSTAT=${CNVSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}cnvstat} 
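Further down, this script derives spectral and grid dimensions from `CASE` (e.g. C384) and a model timestep from `JCAP_A`. The arithmetic can be worked through for one resolution; C384 is chosen purely as an example, and `JCAP_A` is assumed equal to `JCAP_CASE` here:

```shell
CASE=C384                          # example cubed-sphere resolution
res=$(echo ${CASE} | cut -c2-)     # strip the leading "C" -> 384
JCAP_CASE=$((res*2-2))             # spectral truncation
LATB_CASE=$((res*2))               # Gaussian latitudes
LONB_CASE=$((res*4))               # Gaussian longitudes
DELTIM=$((3600/(JCAP_CASE/20)))    # timestep, assuming JCAP_A = JCAP_CASE

echo "${JCAP_CASE} ${LATB_CASE} ${LONB_CASE} ${DELTIM}"
```

Note the integer division: 766/20 truncates to 38, so the C384 timestep comes out as 3600/38 = 94 seconds rather than a round number.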
+OZNSTAT=${OZNSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}oznstat} # Increment files -ATMINC=${ATMINC:-${COMOUT}/${APREFIX}atminc.nc} +ATMINC=${ATMINC:-${COM_ATMOS_ANALYSIS}/${APREFIX}atminc.nc} # Obs diag RUN_SELECT=${RUN_SELECT:-"NO"} USE_SELECT=${USE_SELECT:-"NO"} USE_RADSTAT=${USE_RADSTAT:-"YES"} -SELECT_OBS=${SELECT_OBS:-${COMOUT}/${APREFIX}obsinput} +SELECT_OBS=${SELECT_OBS:-${COM_ATMOS_ANALYSIS}/${APREFIX}obsinput} GENDIAG=${GENDIAG:-"YES"} DIAG_SUFFIX=${DIAG_SUFFIX:-""} -if [ $netcdf_diag = ".true." ] ; then +if [ ${netcdf_diag} = ".true." ] ; then DIAG_SUFFIX="${DIAG_SUFFIX}.nc4" fi DIAG_COMPRESS=${DIAG_COMPRESS:-"YES"} @@ -212,10 +209,10 @@ DIAG_TARBALL=${DIAG_TARBALL:-"YES"} USE_CFP=${USE_CFP:-"NO"} CFP_MP=${CFP_MP:-"NO"} nm="" -if [ $CFP_MP = "YES" ]; then +if [ ${CFP_MP} = "YES" ]; then nm=0 fi -DIAG_DIR=${DIAG_DIR:-${COMOUT}/gsidiags} +DIAG_DIR=${DIAG_DIR:-${COM_ATMOS_ANALYSIS}/gsidiags} # Set script / GSI control parameters DOHYBVAR=${DOHYBVAR:-"NO"} @@ -237,87 +234,62 @@ export INCREMENTS_TO_ZERO=${INCREMENTS_TO_ZERO:-"'NONE'"} USE_CORRELATED_OBERRS=${USE_CORRELATED_OBERRS:-"YES"} # Get header information from Guess files -if [ ${SUFFIX} = ".nc" ]; then - LONB=${LONB:-$($NCLEN $ATMGES grid_xt)} # get LONB - LATB=${LATB:-$($NCLEN $ATMGES grid_yt)} # get LATB - LEVS=${LEVS:-$($NCLEN $ATMGES pfull)} # get LEVS - JCAP=${JCAP:--9999} # there is no jcap in these files -else - LONB=${LONB:-$($NEMSIOGET $ATMGES dimx | grep -i "dimx" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get LONB - LATB=${LATB:-$($NEMSIOGET $ATMGES dimy | grep -i "dimy" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get LATB - LEVS=${LEVS:-$($NEMSIOGET $ATMGES dimz | grep -i "dimz" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get LEVS - JCAP=${JCAP:-$($NEMSIOGET $ATMGES jcap | grep -i "jcap" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get JCAP -fi -[ $JCAP -eq -9999 -a $LATB -ne -9999 ] && JCAP=$((LATB-2)) -[ $LONB -eq -9999 -o $LATB -eq -9999 -o 
$LEVS -eq -9999 -o $JCAP -eq -9999 ] && exit -9999 +LONB=${LONB:-$(${NCLEN} ${ATMGES} grid_xt)} # get LONB +LATB=${LATB:-$(${NCLEN} ${ATMGES} grid_yt)} # get LATB +LEVS=${LEVS:-$(${NCLEN} ${ATMGES} pfull)} # get LEVS +JCAP=${JCAP:--9999} # there is no jcap in these files +[ ${JCAP} -eq -9999 -a ${LATB} -ne -9999 ] && JCAP=$((LATB-2)) +[ ${LONB} -eq -9999 -o ${LATB} -eq -9999 -o ${LEVS} -eq -9999 -o ${JCAP} -eq -9999 ] && exit -9999 # Get header information from Ensemble Guess files -if [ $DOHYBVAR = "YES" ]; then - SFCGES_ENSMEAN=${SFCGES_ENSMEAN:-${COMIN_GES_ENS}/${GPREFIX}sfcf006.ensmean${GSUFFIX}} - export ATMGES_ENSMEAN=${ATMGES_ENSMEAN:-${COMIN_GES_ENS}/${GPREFIX}atmf006.ensmean${GSUFFIX}} - if [ ${SUFFIX} = ".nc" ]; then - LONB_ENKF=${LONB_ENKF:-$($NCLEN $ATMGES_ENSMEAN grid_xt)} # get LONB_ENKF - LATB_ENKF=${LATB_ENKF:-$($NCLEN $ATMGES_ENSMEAN grid_yt)} # get LATB_ENFK - LEVS_ENKF=${LEVS_ENKF:-$($NCLEN $ATMGES_ENSMEAN pfull)} # get LATB_ENFK - JCAP_ENKF=${JCAP_ENKF:--9999} # again, no jcap in the netcdf files - else - LONB_ENKF=${LONB_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN dimx | grep -i "dimx" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get LONB_ENKF - LATB_ENKF=${LATB_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN dimy | grep -i "dimy" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get LATB_ENKF - LEVS_ENKF=${LEVS_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN dimz | grep -i "dimz" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get LEVS_ENKF - JCAP_ENKF=${JCAP_ENKF:-$($NEMSIOGET $ATMGES_ENSMEAN jcap | grep -i "jcap" | awk -F"= " '{print $2}' | awk -F" " '{print $1}')} # 'get JCAP_ENKF - fi - NLON_ENKF=${NLON_ENKF:-$LONB_ENKF} - NLAT_ENKF=${NLAT_ENKF:-$(($LATB_ENKF+2))} - [ $JCAP_ENKF -eq -9999 -a $LATB_ENKF -ne -9999 ] && JCAP_ENKF=$((LATB_ENKF-2)) - [ $LONB_ENKF -eq -9999 -o $LATB_ENKF -eq -9999 -o $LEVS_ENKF -eq -9999 -o $JCAP_ENKF -eq -9999 ] && exit -9999 +if [ ${DOHYBVAR} = "YES" ]; then + 
SFCGES_ENSMEAN=${SFCGES_ENSMEAN:-${COM_ATMOS_HISTORY_ENS_PREV}/${GPREFIX_ENS}sfcf006.ensmean.nc} + export ATMGES_ENSMEAN=${ATMGES_ENSMEAN:-${COM_ATMOS_HISTORY_ENS_PREV}/${GPREFIX_ENS}atmf006.ensmean.nc} + LONB_ENKF=${LONB_ENKF:-$(${NCLEN} ${ATMGES_ENSMEAN} grid_xt)} # get LONB_ENKF + LATB_ENKF=${LATB_ENKF:-$(${NCLEN} ${ATMGES_ENSMEAN} grid_yt)} # get LATB_ENKF + LEVS_ENKF=${LEVS_ENKF:-$(${NCLEN} ${ATMGES_ENSMEAN} pfull)} # get LEVS_ENKF + JCAP_ENKF=${JCAP_ENKF:--9999} # again, no jcap in the netcdf files + NLON_ENKF=${NLON_ENKF:-${LONB_ENKF}} + NLAT_ENKF=${NLAT_ENKF:-$((${LATB_ENKF}+2))} + [ ${JCAP_ENKF} -eq -9999 -a ${LATB_ENKF} -ne -9999 ] && JCAP_ENKF=$((LATB_ENKF-2)) + [ ${LONB_ENKF} -eq -9999 -o ${LATB_ENKF} -eq -9999 -o ${LEVS_ENKF} -eq -9999 -o ${JCAP_ENKF} -eq -9999 ] && exit -9999 else LONB_ENKF=0 # just for if statement later fi # Get dimension information based on CASE -res=$(echo $CASE | cut -c2-) +res=$(echo ${CASE} | cut -c2-) JCAP_CASE=$((res*2-2)) LATB_CASE=$((res*2)) LONB_CASE=$((res*4)) # Set analysis resolution information -if [ $DOHYBVAR = "YES" ]; then - JCAP_A=${JCAP_A:-${JCAP_ENKF:-$JCAP}} - LONA=${LONA:-${LONB_ENKF:-$LONB}} - LATA=${LATA:-${LATB_ENKF:-$LATB}} +if [ ${DOHYBVAR} = "YES" ]; then + JCAP_A=${JCAP_A:-${JCAP_ENKF:-${JCAP}}} + LONA=${LONA:-${LONB_ENKF:-${LONB}}} + LATA=${LATA:-${LATB_ENKF:-${LATB}}} else - JCAP_A=${JCAP_A:-$JCAP} - LONA=${LONA:-$LONB} - LATA=${LATA:-$LATB} + JCAP_A=${JCAP_A:-${JCAP}} + LONA=${LONA:-${LONB}} + LATA=${LATA:-${LATB}} fi -NLON_A=${NLON_A:-$LONA} -NLAT_A=${NLAT_A:-$(($LATA+2))} - -DELTIM=${DELTIM:-$((3600/($JCAP_A/20)))} +NLON_A=${NLON_A:-${LONA}} +NLAT_A=${NLAT_A:-$((${LATA}+2))} -# logic for netCDF I/O -if [ ${SUFFIX} = ".nc" ]; then - # GSI namelist options to use netCDF background - use_gfs_nemsio=".false." - use_gfs_ncio=".true." -else - # GSI namelist options to use NEMSIO background - use_gfs_nemsio=".true." - use_gfs_ncio=".false." 
-fi +DELTIM=${DELTIM:-$((3600/(${JCAP_A}/20)))} # determine if writing or calculating increment -if [ $DO_CALC_INCREMENT = "YES" ]; then +if [ ${DO_CALC_INCREMENT} = "YES" ]; then write_fv3_increment=".false." else write_fv3_increment=".true." - WRITE_INCR_ZERO="incvars_to_zero= $INCREMENTS_TO_ZERO," - WRITE_ZERO_STRAT="incvars_zero_strat= $INCVARS_ZERO_STRAT," - WRITE_STRAT_EFOLD="incvars_efold= $INCVARS_EFOLD," + WRITE_INCR_ZERO="incvars_to_zero= ${INCREMENTS_TO_ZERO}," + WRITE_ZERO_STRAT="incvars_zero_strat= ${INCVARS_ZERO_STRAT}," + WRITE_STRAT_EFOLD="incvars_efold= ${INCVARS_EFOLD}," fi # GSI Fix files -RTMFIX=${RTMFIX:-${CRTM_FIX}} +RTMFIX=${CRTM_FIX} BERROR=${BERROR:-${FIXgsi}/Big_Endian/global_berror.l${LEVS}y${NLAT_A}.f77} SATANGL=${SATANGL:-${FIXgsi}/global_satangbias.txt} SATINFO=${SATINFO:-${FIXgsi}/global_satinfo.txt} @@ -353,7 +325,7 @@ NST=${NST:-""} #uGSI Namelist parameters lrun_subdirs=${lrun_subdirs:-".true."} -if [ $DOHYBVAR = "YES" ]; then +if [ ${DOHYBVAR} = "YES" ]; then l_hyb_ens=.true. export l4densvar=${l4densvar:-".false."} export lwrite4danl=${lwrite4danl:-".false."} @@ -364,62 +336,62 @@ else fi # Set 4D-EnVar specific variables -if [ $DOHYBVAR = "YES" -a $l4densvar = ".true." -a $lwrite4danl = ".true." ]; then - ATMA03=${ATMA03:-${COMOUT}/${APREFIX}atma003${ASUFFIX}} - ATMI03=${ATMI03:-${COMOUT}/${APREFIX}atmi003.nc} - ATMA04=${ATMA04:-${COMOUT}/${APREFIX}atma004${ASUFFIX}} - ATMI04=${ATMI04:-${COMOUT}/${APREFIX}atmi004.nc} - ATMA05=${ATMA05:-${COMOUT}/${APREFIX}atma005${ASUFFIX}} - ATMI05=${ATMI05:-${COMOUT}/${APREFIX}atmi005.nc} - ATMA07=${ATMA07:-${COMOUT}/${APREFIX}atma007${ASUFFIX}} - ATMI07=${ATMI07:-${COMOUT}/${APREFIX}atmi007.nc} - ATMA08=${ATMA08:-${COMOUT}/${APREFIX}atma008${ASUFFIX}} - ATMI08=${ATMI08:-${COMOUT}/${APREFIX}atmi008.nc} - ATMA09=${ATMA09:-${COMOUT}/${APREFIX}atma009${ASUFFIX}} - ATMI09=${ATMI09:-${COMOUT}/${APREFIX}atmi009.nc} +if [ ${DOHYBVAR} = "YES" -a ${l4densvar} = ".true." -a ${lwrite4danl} = ".true." 
]; then + ATMA03=${ATMA03:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma003.nc} + ATMI03=${ATMI03:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi003.nc} + ATMA04=${ATMA04:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma004.nc} + ATMI04=${ATMI04:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi004.nc} + ATMA05=${ATMA05:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma005.nc} + ATMI05=${ATMI05:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi005.nc} + ATMA07=${ATMA07:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma007.nc} + ATMI07=${ATMI07:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi007.nc} + ATMA08=${ATMA08:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma008.nc} + ATMI08=${ATMI08:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi008.nc} + ATMA09=${ATMA09:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma009.nc} + ATMI09=${ATMI09:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi009.nc} fi ################################################################################ # Preprocessing mkdata=NO -if [ ! -d $DATA ]; then +if [ ! -d ${DATA} ]; then mkdata=YES - mkdir -p $DATA + mkdir -p ${DATA} fi -cd $DATA || exit 99 +cd ${DATA} || exit 99 ############################################################## # Fixed files -$NLN $BERROR berror_stats -$NLN $SATANGL satbias_angle -$NLN $SATINFO satinfo -$NLN $RADCLOUDINFO cloudy_radiance_info.txt -$NLN $ATMSFILTER atms_beamwidth.txt -$NLN $ANAVINFO anavinfo -$NLN $CONVINFO convinfo -$NLN $vqcdat vqctp001.dat -$NLN $INSITUINFO insituinfo -$NLN $OZINFO ozinfo -$NLN $PCPINFO pcpinfo -$NLN $AEROINFO aeroinfo -$NLN $SCANINFO scaninfo -$NLN $HYBENSINFO hybens_info -$NLN $OBERROR errtable +${NLN} ${BERROR} berror_stats +${NLN} ${SATANGL} satbias_angle +${NLN} ${SATINFO} satinfo +${NLN} ${RADCLOUDINFO} cloudy_radiance_info.txt +${NLN} ${ATMSFILTER} atms_beamwidth.txt +${NLN} ${ANAVINFO} anavinfo +${NLN} ${CONVINFO} convinfo +${NLN} ${vqcdat} vqctp001.dat +${NLN} ${INSITUINFO} insituinfo +${NLN} ${OZINFO} ozinfo +${NLN} ${PCPINFO} pcpinfo +${NLN} ${AEROINFO} aeroinfo +${NLN} ${SCANINFO} scaninfo +${NLN} ${HYBENSINFO} hybens_info +${NLN} ${OBERROR} errtable #If using 
correlated error, link to the covariance files -if [ $USE_CORRELATED_OBERRS == "YES" ]; then - if grep -q "Rcov" $ANAVINFO ; then +if [ ${USE_CORRELATED_OBERRS} == "YES" ]; then + if grep -q "Rcov" ${ANAVINFO} ; then if ls ${FIXgsi}/Rcov* 1> /dev/null 2>&1; then - $NLN ${FIXgsi}/Rcov* $DATA + ${NLN} ${FIXgsi}/Rcov* ${DATA} echo "using correlated obs error" else echo "FATAL ERROR: Satellite error covariance files (Rcov) are missing." - echo "Check for the required Rcov files in " $ANAVINFO + echo "Check for the required Rcov files in " ${ANAVINFO} exit 1 fi else - echo "FATAL ERROR: Satellite error covariance info missing in " $ANAVINFO + echo "FATAL ERROR: Satellite error covariance info missing in " ${ANAVINFO} exit 1 fi @@ -436,140 +408,138 @@ fi # CRTM Spectral and Transmittance coefficients mkdir -p crtm_coeffs for file in $(awk '{if($1!~"!"){print $1}}' satinfo | sort | uniq); do - $NLN $RTMFIX/${file}.SpcCoeff.bin ./crtm_coeffs/${file}.SpcCoeff.bin - $NLN $RTMFIX/${file}.TauCoeff.bin ./crtm_coeffs/${file}.TauCoeff.bin + ${NLN} ${RTMFIX}/${file}.SpcCoeff.bin ./crtm_coeffs/${file}.SpcCoeff.bin + ${NLN} ${RTMFIX}/${file}.TauCoeff.bin ./crtm_coeffs/${file}.TauCoeff.bin done -$NLN $RTMFIX/amsua_metop-a_v2.SpcCoeff.bin ./crtm_coeffs/amsua_metop-a_v2.SpcCoeff.bin - -$NLN $RTMFIX/Nalli.IRwater.EmisCoeff.bin ./crtm_coeffs/Nalli.IRwater.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.IRice.EmisCoeff.bin ./crtm_coeffs/NPOESS.IRice.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.IRland.EmisCoeff.bin ./crtm_coeffs/NPOESS.IRland.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.IRsnow.EmisCoeff.bin ./crtm_coeffs/NPOESS.IRsnow.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.VISice.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISice.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.VISland.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISland.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.VISsnow.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISsnow.EmisCoeff.bin -$NLN $RTMFIX/NPOESS.VISwater.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISwater.EmisCoeff.bin -$NLN 
$RTMFIX/FASTEM6.MWwater.EmisCoeff.bin ./crtm_coeffs/FASTEM6.MWwater.EmisCoeff.bin -$NLN $RTMFIX/AerosolCoeff.bin ./crtm_coeffs/AerosolCoeff.bin -$NLN $RTMFIX/CloudCoeff.bin ./crtm_coeffs/CloudCoeff.bin +${NLN} ${RTMFIX}/amsua_metop-a_v2.SpcCoeff.bin ./crtm_coeffs/amsua_metop-a_v2.SpcCoeff.bin + +${NLN} ${RTMFIX}/Nalli.IRwater.EmisCoeff.bin ./crtm_coeffs/Nalli.IRwater.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.IRice.EmisCoeff.bin ./crtm_coeffs/NPOESS.IRice.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.IRland.EmisCoeff.bin ./crtm_coeffs/NPOESS.IRland.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.IRsnow.EmisCoeff.bin ./crtm_coeffs/NPOESS.IRsnow.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.VISice.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISice.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.VISland.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISland.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.VISsnow.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISsnow.EmisCoeff.bin +${NLN} ${RTMFIX}/NPOESS.VISwater.EmisCoeff.bin ./crtm_coeffs/NPOESS.VISwater.EmisCoeff.bin +${NLN} ${RTMFIX}/FASTEM6.MWwater.EmisCoeff.bin ./crtm_coeffs/FASTEM6.MWwater.EmisCoeff.bin +${NLN} ${RTMFIX}/AerosolCoeff.bin ./crtm_coeffs/AerosolCoeff.bin +${NLN} ${RTMFIX}/CloudCoeff.bin ./crtm_coeffs/CloudCoeff.bin #$NLN $RTMFIX/CloudCoeff.GFDLFV3.-109z-1.bin ./crtm_coeffs/CloudCoeff.bin ############################################################## # Observational data -$NLN $PREPQC prepbufr -$NLN $PREPQCPF prepbufr_profl -$NLN $SATWND satwndbufr -$NLN $OSCATBF oscatbufr -$NLN $RAPIDSCATBF rapidscatbufr -$NLN $GSNDBF gsndrbufr -$NLN $GSNDBF1 gsnd1bufr -$NLN $B1HRS2 hirs2bufr -$NLN $B1MSU msubufr -$NLN $B1HRS3 hirs3bufr -$NLN $B1HRS4 hirs4bufr -$NLN $B1AMUA amsuabufr -$NLN $B1AMUB amsubbufr -$NLN $B1MHS mhsbufr -$NLN $ESHRS3 hirs3bufrears -$NLN $ESAMUA amsuabufrears -$NLN $ESAMUB amsubbufrears +${NLN} ${PREPQC} prepbufr +${NLN} ${PREPQCPF} prepbufr_profl +${NLN} ${SATWND} satwndbufr +${NLN} ${OSCATBF} oscatbufr +${NLN} ${RAPIDSCATBF} rapidscatbufr +${NLN} ${GSNDBF} 
gsndrbufr +${NLN} ${GSNDBF1} gsnd1bufr +${NLN} ${B1HRS2} hirs2bufr +${NLN} ${B1MSU} msubufr +${NLN} ${B1HRS3} hirs3bufr +${NLN} ${B1HRS4} hirs4bufr +${NLN} ${B1AMUA} amsuabufr +${NLN} ${B1AMUB} amsubbufr +${NLN} ${B1MHS} mhsbufr +${NLN} ${ESHRS3} hirs3bufrears +${NLN} ${ESAMUA} amsuabufrears +${NLN} ${ESAMUB} amsubbufrears #$NLN $ESMHS mhsbufrears -$NLN $HRS3DB hirs3bufr_db -$NLN $AMUADB amsuabufr_db -$NLN $AMUBDB amsubbufr_db +${NLN} ${HRS3DB} hirs3bufr_db +${NLN} ${AMUADB} amsuabufr_db +${NLN} ${AMUBDB} amsubbufr_db #$NLN $MHSDB mhsbufr_db -$NLN $SBUVBF sbuvbufr -$NLN $OMPSNPBF ompsnpbufr -$NLN $OMPSLPBF ompslpbufr -$NLN $OMPSTCBF ompstcbufr -$NLN $GOMEBF gomebufr -$NLN $OMIBF omibufr -$NLN $MLSBF mlsbufr -$NLN $SMIPCP ssmirrbufr -$NLN $TMIPCP tmirrbufr -$NLN $AIRSBF airsbufr -$NLN $IASIBF iasibufr -$NLN $ESIASI iasibufrears -$NLN $IASIDB iasibufr_db -$NLN $AMSREBF amsrebufr -$NLN $AMSR2BF amsr2bufr -$NLN $GMI1CRBF gmibufr -$NLN $SAPHIRBF saphirbufr -$NLN $SEVIRIBF seviribufr -$NLN $CRISBF crisbufr -$NLN $ESCRIS crisbufrears -$NLN $CRISDB crisbufr_db -$NLN $CRISFSBF crisfsbufr -$NLN $ESCRISFS crisfsbufrears -$NLN $CRISFSDB crisfsbufr_db -$NLN $ATMSBF atmsbufr -$NLN $ESATMS atmsbufrears -$NLN $ATMSDB atmsbufr_db -$NLN $SSMITBF ssmitbufr -$NLN $SSMISBF ssmisbufr -$NLN $GPSROBF gpsrobufr -$NLN $TCVITL tcvitl -$NLN $B1AVHAM avhambufr -$NLN $B1AVHPM avhpmbufr -$NLN $AHIBF ahibufr -$NLN $ABIBF abibufr -$NLN $HDOB hdobbufr -$NLN $SSTVIIRS sstviirs - -[[ $DONST = "YES" ]] && $NLN $NSSTBF nsstbufr +${NLN} ${SBUVBF} sbuvbufr +${NLN} ${OMPSNPBF} ompsnpbufr +${NLN} ${OMPSLPBF} ompslpbufr +${NLN} ${OMPSTCBF} ompstcbufr +${NLN} ${GOMEBF} gomebufr +${NLN} ${OMIBF} omibufr +${NLN} ${MLSBF} mlsbufr +${NLN} ${SMIPCP} ssmirrbufr +${NLN} ${TMIPCP} tmirrbufr +${NLN} ${AIRSBF} airsbufr +${NLN} ${IASIBF} iasibufr +${NLN} ${ESIASI} iasibufrears +${NLN} ${IASIDB} iasibufr_db +${NLN} ${AMSREBF} amsrebufr +${NLN} ${AMSR2BF} amsr2bufr +#${NLN} ${GMI1CRBF} gmibufr # GMI temporarily disabled 
due to array overflow. +${NLN} ${SAPHIRBF} saphirbufr +${NLN} ${SEVIRIBF} seviribufr +${NLN} ${CRISBF} crisbufr +${NLN} ${ESCRIS} crisbufrears +${NLN} ${CRISDB} crisbufr_db +${NLN} ${CRISFSBF} crisfsbufr +${NLN} ${ESCRISFS} crisfsbufrears +${NLN} ${CRISFSDB} crisfsbufr_db +${NLN} ${ATMSBF} atmsbufr +${NLN} ${ESATMS} atmsbufrears +${NLN} ${ATMSDB} atmsbufr_db +${NLN} ${SSMITBF} ssmitbufr +${NLN} ${SSMISBF} ssmisbufr +${NLN} ${GPSROBF} gpsrobufr +${NLN} ${TCVITL} tcvitl +${NLN} ${B1AVHAM} avhambufr +${NLN} ${B1AVHPM} avhpmbufr +${NLN} ${AHIBF} ahibufr +${NLN} ${ABIBF} abibufr +${NLN} ${HDOB} hdobbufr +${NLN} ${SSTVIIRS} sstviirs + +[[ ${DONST} = "YES" ]] && ${NLN} ${NSSTBF} nsstbufr ############################################################## # Required bias guess files -$NLN $GBIAS satbias_in -$NLN $GBIASPC satbias_pc -$NLN $GBIASAIR aircftbias_in -$NLN $GRADSTAT radstat.gdas +${NLN} ${GBIAS} satbias_in +${NLN} ${GBIASPC} satbias_pc +${NLN} ${GBIASAIR} aircftbias_in +${NLN} ${GRADSTAT} radstat.gdas ############################################################## # Required model guess files -$NLN $ATMG03 sigf03 -$NLN $ATMGES sigf06 -$NLN $ATMG09 sigf09 +${NLN} ${ATMG03} sigf03 +${NLN} ${ATMGES} sigf06 +${NLN} ${ATMG09} sigf09 -$NLN $SFCG03 sfcf03 -$NLN $SFCGES sfcf06 -$NLN $SFCG09 sfcf09 +${NLN} ${SFCG03} sfcf03 +${NLN} ${SFCGES} sfcf06 +${NLN} ${SFCG09} sfcf09 -# Link hourly backgrounds (if present) -if [ -f $ATMG04 -a -f $ATMG05 -a -f $ATMG07 -a -f $ATMG08 ]; then - nhr_obsbin=1 -fi +[[ -f ${ATMG04} ]] && ${NLN} ${ATMG04} sigf04 +[[ -f ${ATMG05} ]] && ${NLN} ${ATMG05} sigf05 +[[ -f ${ATMG07} ]] && ${NLN} ${ATMG07} sigf07 +[[ -f ${ATMG08} ]] && ${NLN} ${ATMG08} sigf08 -[[ -f $ATMG04 ]] && $NLN $ATMG04 sigf04 -[[ -f $ATMG05 ]] && $NLN $ATMG05 sigf05 -[[ -f $ATMG07 ]] && $NLN $ATMG07 sigf07 -[[ -f $ATMG08 ]] && $NLN $ATMG08 sigf08 +[[ -f ${SFCG04} ]] && ${NLN} ${SFCG04} sfcf04 +[[ -f ${SFCG05} ]] && ${NLN} ${SFCG05} sfcf05 +[[ -f ${SFCG07} ]] && ${NLN} ${SFCG07} 
sfcf07 +[[ -f ${SFCG08} ]] && ${NLN} ${SFCG08} sfcf08 -[[ -f $SFCG04 ]] && $NLN $SFCG04 sfcf04 -[[ -f $SFCG05 ]] && $NLN $SFCG05 sfcf05 -[[ -f $SFCG07 ]] && $NLN $SFCG07 sfcf07 -[[ -f $SFCG08 ]] && $NLN $SFCG08 sfcf08 - -if [ $DOHYBVAR = "YES" ]; then +if [ ${DOHYBVAR} = "YES" ]; then # Link ensemble members mkdir -p ensemble_data ENKF_SUFFIX="s" - [[ $SMOOTH_ENKF = "NO" ]] && ENKF_SUFFIX="" + [[ ${SMOOTH_ENKF} = "NO" ]] && ENKF_SUFFIX="" fhrs="06" - if [ $l4densvar = ".true." ]; then + if [ ${l4densvar} = ".true." ]; then fhrs="03 04 05 06 07 08 09" + nhr_obsbin=1 fi - for imem in $(seq 1 $NMEM_ENKF); do - memchar="mem"$(printf %03i $imem) - for fhr in $fhrs; do - $NLN ${COMIN_GES_ENS}/$memchar/${GPREFIX}atmf0${fhr}${ENKF_SUFFIX}${GSUFFIX} ./ensemble_data/sigf${fhr}_ens_$memchar - if [ $cnvw_option = ".true." ]; then - $NLN ${COMIN_GES_ENS}/$memchar/${GPREFIX}sfcf0${fhr}${GSUFFIX} ./ensemble_data/sfcf${fhr}_ens_$memchar + for imem in $(seq 1 ${NMEM_ENKF}); do + memchar="mem$(printf %03i "${imem}")" + MEMDIR=${memchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com COM_ATMOS_HISTORY + + for fhr in ${fhrs}; do + ${NLN} ${COM_ATMOS_HISTORY}/${GPREFIX_ENS}atmf0${fhr}${ENKF_SUFFIX}.nc ./ensemble_data/sigf${fhr}_ens_${memchar} + if [ ${cnvw_option} = ".true." ]; then + ${NLN} ${COM_ATMOS_HISTORY}/${GPREFIX_ENS}sfcf0${fhr}.nc ./ensemble_data/sfcf${fhr}_ens_${memchar} fi done done @@ -579,11 +549,11 @@ fi ############################################################## # Handle inconsistent surface mask between background, ensemble and analysis grids # This needs re-visiting in the context of NSST; especially references to JCAP* -if [ $JCAP -ne $JCAP_A ]; then - if [ $DOHYBVAR = "YES" -a $JCAP_A = $JCAP_ENKF ]; then - if [ -e $SFCGES_ENSMEAN ]; then +if [ ${JCAP} -ne ${JCAP_A} ]; then + if [ ${DOHYBVAR} = "YES" -a ${JCAP_A} = ${JCAP_ENKF} ]; then + if [ -e ${SFCGES_ENSMEAN} ]; then USE_READIN_ANL_SFCMASK=.true. 
- $NLN $SFCGES_ENSMEAN sfcf06_anlgrid + ${NLN} ${SFCGES_ENSMEAN} sfcf06_anlgrid else echo "Warning: Inconsistent sfc mask between analysis and ensemble grids, GSI will interpolate" fi @@ -595,108 +565,108 @@ fi ############################################################## # Diagnostic files # if requested, link GSI diagnostic file directories for use later -if [ $GENDIAG = "YES" ] ; then - if [ $lrun_subdirs = ".true." ] ; then - if [ -d $DIAG_DIR ]; then - rm -rf $DIAG_DIR +if [ ${GENDIAG} = "YES" ] ; then + if [ ${lrun_subdirs} = ".true." ] ; then + if [ -d ${DIAG_DIR} ]; then + rm -rf ${DIAG_DIR} fi - npe_m1="$(($npe_gsi-1))" - for pe in $(seq 0 $npe_m1); do - pedir="dir."$(printf %04i $pe) - mkdir -p $DIAG_DIR/$pedir - $NLN $DIAG_DIR/$pedir $pedir + npe_m1="$((${npe_gsi}-1))" + for pe in $(seq 0 ${npe_m1}); do + pedir="dir."$(printf %04i ${pe}) + mkdir -p ${DIAG_DIR}/${pedir} + ${NLN} ${DIAG_DIR}/${pedir} ${pedir} done else - err_exit "FATAL ERROR: lrun_subdirs must be true. lrun_subdirs=$lrun_subdirs" + err_exit "FATAL ERROR: lrun_subdirs must be true. lrun_subdirs=${lrun_subdirs}" fi fi ############################################################## # Output files -$NLN $ATMANL siganl -$NLN $ATMINC siginc.nc -if [ $DOHYBVAR = "YES" -a $l4densvar = ".true." -a $lwrite4danl = ".true." ]; then - $NLN $ATMA03 siga03 - $NLN $ATMI03 sigi03.nc - $NLN $ATMA04 siga04 - $NLN $ATMI04 sigi04.nc - $NLN $ATMA05 siga05 - $NLN $ATMI05 sigi05.nc - $NLN $ATMA07 siga07 - $NLN $ATMI07 sigi07.nc - $NLN $ATMA08 siga08 - $NLN $ATMI08 sigi08.nc - $NLN $ATMA09 siga09 - $NLN $ATMI09 sigi09.nc +${NLN} ${ATMANL} siganl +${NLN} ${ATMINC} siginc.nc +if [ ${DOHYBVAR} = "YES" -a ${l4densvar} = ".true." -a ${lwrite4danl} = ".true." 
]; then + ${NLN} ${ATMA03} siga03 + ${NLN} ${ATMI03} sigi03.nc + ${NLN} ${ATMA04} siga04 + ${NLN} ${ATMI04} sigi04.nc + ${NLN} ${ATMA05} siga05 + ${NLN} ${ATMI05} sigi05.nc + ${NLN} ${ATMA07} siga07 + ${NLN} ${ATMI07} sigi07.nc + ${NLN} ${ATMA08} siga08 + ${NLN} ${ATMI08} sigi08.nc + ${NLN} ${ATMA09} siga09 + ${NLN} ${ATMI09} sigi09.nc fi -$NLN $ABIAS satbias_out -$NLN $ABIASPC satbias_pc.out -$NLN $ABIASAIR aircftbias_out +${NLN} ${ABIAS} satbias_out +${NLN} ${ABIASPC} satbias_pc.out +${NLN} ${ABIASAIR} aircftbias_out -if [ $DONST = "YES" ]; then - $NLN $DTFANL dtfanl +if [ ${DONST} = "YES" ]; then + ${NLN} ${DTFANL} dtfanl fi # If requested, link (and if tarred, de-tar obsinput.tar) into obs_input.* files -if [ $USE_SELECT = "YES" ]; then +if [ ${USE_SELECT} = "YES" ]; then rm obs_input.* - nl=$(file $SELECT_OBS | cut -d: -f2 | grep tar | wc -l) - if [ $nl -eq 1 ]; then + nl=$(file ${SELECT_OBS} | cut -d: -f2 | grep tar | wc -l) + if [ ${nl} -eq 1 ]; then rm obsinput.tar - $NLN $SELECT_OBS obsinput.tar + ${NLN} ${SELECT_OBS} obsinput.tar tar -xvf obsinput.tar rm obsinput.tar else - for filetop in $(ls $SELECT_OBS/obs_input.*); do - fileloc=$(basename $filetop) - $NLN $filetop $fileloc + for filetop in $(ls ${SELECT_OBS}/obs_input.*); do + fileloc=$(basename ${filetop}) + ${NLN} ${filetop} ${fileloc} done fi fi ############################################################## # If requested, copy and de-tar guess radstat file -if [ $USE_RADSTAT = "YES" ]; then - if [ $USE_CFP = "YES" ]; then - [[ -f $DATA/unzip.sh ]] && rm $DATA/unzip.sh - [[ -f $DATA/mp_unzip.sh ]] && rm $DATA/mp_unzip.sh - cat > $DATA/unzip.sh << EOFunzip +if [ ${USE_RADSTAT} = "YES" ]; then + if [ ${USE_CFP} = "YES" ]; then + [[ -f ${DATA}/unzip.sh ]] && rm ${DATA}/unzip.sh + [[ -f ${DATA}/mp_unzip.sh ]] && rm ${DATA}/mp_unzip.sh + cat > ${DATA}/unzip.sh << EOFunzip #!/bin/sh diag_file=\$1 diag_suffix=\$2 fname=\$(echo \$diag_file | cut -d'.' -f1) fdate=\$(echo \$diag_file | cut -d'.' 
-f2) - $UNCOMPRESS \$diag_file + ${UNCOMPRESS} \$diag_file fnameges=\$(echo \$fname | sed 's/_ges//g') - $NMV \$fname.\$fdate\$diag_suffix \$fnameges + ${NMV} \$fname.\$fdate\$diag_suffix \$fnameges EOFunzip - chmod 755 $DATA/unzip.sh + chmod 755 ${DATA}/unzip.sh fi listdiag=$(tar xvf radstat.gdas | cut -d' ' -f2 | grep _ges) - for type in $listdiag; do - diag_file=$(echo $type | cut -d',' -f1) - if [ $USE_CFP = "YES" ] ; then - echo "$nm $DATA/unzip.sh $diag_file $DIAG_SUFFIX" | tee -a $DATA/mp_unzip.sh + for type in ${listdiag}; do + diag_file=$(echo ${type} | cut -d',' -f1) + if [ ${USE_CFP} = "YES" ] ; then + echo "${nm} ${DATA}/unzip.sh ${diag_file} ${DIAG_SUFFIX}" | tee -a ${DATA}/mp_unzip.sh if [ ${CFP_MP:-"NO"} = "YES" ]; then nm=$((nm+1)) fi else - fname=$(echo $diag_file | cut -d'.' -f1) - date=$(echo $diag_file | cut -d'.' -f2) - $UNCOMPRESS $diag_file - fnameges=$(echo $fname|sed 's/_ges//g') - $NMV $fname.$date$DIAG_SUFFIX $fnameges + fname=$(echo ${diag_file} | cut -d'.' -f1) + date=$(echo ${diag_file} | cut -d'.' -f2) + ${UNCOMPRESS} ${diag_file} + fnameges=$(echo ${fname}|sed 's/_ges//g') + ${NMV} ${fname}.${date}${DIAG_SUFFIX} ${fnameges} fi done - if [ $USE_CFP = "YES" ] ; then - chmod 755 $DATA/mp_unzip.sh - ncmd=$(cat $DATA/mp_unzip.sh | wc -l) - if [ $ncmd -gt 0 ]; then + if [ ${USE_CFP} = "YES" ] ; then + chmod 755 ${DATA}/mp_unzip.sh + ncmd=$(cat ${DATA}/mp_unzip.sh | wc -l) + if [ ${ncmd} -gt 0 ]; then ncmd_max=$((ncmd < npe_node_max ? 
ncmd : npe_node_max)) - APRUNCFP_UNZIP=$(eval echo $APRUNCFP) - $APRUNCFP_UNZIP $DATA/mp_unzip.sh + APRUNCFP_UNZIP=$(eval echo ${APRUNCFP}) + ${APRUNCFP_UNZIP} ${DATA}/mp_unzip.sh export err=$?; err_chk fi fi @@ -704,18 +674,18 @@ fi # if [ $USE_RADSTAT = "YES" ] ############################################################## # GSI Namelist options -if [ $DOHYBVAR = "YES" ]; then - HYBRID_ENSEMBLE="n_ens=$NMEM_ENKF,jcap_ens=$JCAP_ENKF,nlat_ens=$NLAT_ENKF,nlon_ens=$NLON_ENKF,jcap_ens_test=$JCAP_ENKF,$HYBRID_ENSEMBLE" - if [ $l4densvar = ".true." ]; then - SETUP="niter(1)=50,niter(2)=150,niter_no_qc(1)=25,niter_no_qc(2)=0,thin4d=.true.,ens_nstarthr=3,l4densvar=$l4densvar,lwrite4danl=$lwrite4danl,$SETUP" - JCOPTS="ljc4tlevs=.true.,$JCOPTS" - STRONGOPTS="tlnmc_option=3,$STRONGOPTS" - OBSQC="c_varqc=0.04,$OBSQC" +if [ ${DOHYBVAR} = "YES" ]; then + HYBRID_ENSEMBLE="n_ens=${NMEM_ENKF},jcap_ens=${JCAP_ENKF},nlat_ens=${NLAT_ENKF},nlon_ens=${NLON_ENKF},jcap_ens_test=${JCAP_ENKF},${HYBRID_ENSEMBLE}" + if [ ${l4densvar} = ".true." 
]; then + SETUP="niter(1)=50,niter(2)=150,niter_no_qc(1)=25,niter_no_qc(2)=0,thin4d=.true.,ens_nstarthr=3,l4densvar=${l4densvar},lwrite4danl=${lwrite4danl},${SETUP}" + JCOPTS="ljc4tlevs=.true.,${JCOPTS}" + STRONGOPTS="tlnmc_option=3,${STRONGOPTS}" + OBSQC="c_varqc=0.04,${OBSQC}" fi fi -if [ $DONST = "YES" ]; then - NST="nstinfo=$NSTINFO,fac_dtl=$FAC_DTL,fac_tsl=$FAC_TSL,zsea1=$ZSEA1,zsea2=$ZSEA2,$NST" +if [ ${DONST} = "YES" ]; then + NST="nstinfo=${NSTINFO},fac_dtl=${FAC_DTL},fac_tsl=${FAC_TSL},zsea1=${ZSEA1},zsea2=${ZSEA2},${NST}" fi ############################################################## @@ -727,33 +697,33 @@ cat > gsiparm.anl << EOF niter_no_qc(1)=50,niter_no_qc(2)=0, write_diag(1)=.true.,write_diag(2)=.false.,write_diag(3)=.true., qoption=2, - gencode=${IGEN:-0},deltim=$DELTIM, + gencode=${IGEN:-0},deltim=${DELTIM}, factqmin=0.5,factqmax=0.0002, iguess=-1, - tzr_qc=$TZR_QC, + tzr_qc=${TZR_QC}, oneobtest=.false.,retrieval=.false.,l_foto=.false., use_pbl=.false.,use_compress=.true.,nsig_ext=12,gpstop=50.,commgpstop=45.,commgpserrinf=1.0, - use_gfs_nemsio=${use_gfs_nemsio},use_gfs_ncio=${use_gfs_ncio},sfcnst_comb=.true., + use_gfs_nemsio=.false.,use_gfs_ncio=.true.,sfcnst_comb=.true., use_readin_anl_sfcmask=${USE_READIN_ANL_SFCMASK}, - lrun_subdirs=$lrun_subdirs, + lrun_subdirs=${lrun_subdirs}, crtm_coeffs_path='./crtm_coeffs/', newpc4pred=.true.,adp_anglebc=.true.,angord=4,passive_bc=.true.,use_edges=.false., diag_precon=.true.,step_start=1.e-3,emiss_bc=.true.,nhr_obsbin=${nhr_obsbin:-3}, - cwoption=3,imp_physics=$imp_physics,lupp=$lupp,cnvw_option=$cnvw_option,cao_check=${cao_check}, - netcdf_diag=$netcdf_diag,binary_diag=$binary_diag, - lobsdiag_forenkf=$lobsdiag_forenkf, - write_fv3_incr=$write_fv3_increment, + cwoption=3,imp_physics=${imp_physics},lupp=${lupp},cnvw_option=${cnvw_option},cao_check=${cao_check}, + netcdf_diag=${netcdf_diag},binary_diag=${binary_diag}, + lobsdiag_forenkf=${lobsdiag_forenkf}, + write_fv3_incr=${write_fv3_increment}, 
nhr_anal=${IAUFHRS}, ta2tb=${ta2tb}, - $WRITE_INCR_ZERO - $WRITE_ZERO_STRAT - $WRITE_STRAT_EFOLD - $SETUP + ${WRITE_INCR_ZERO} + ${WRITE_ZERO_STRAT} + ${WRITE_STRAT_EFOLD} + ${SETUP} / &GRIDOPTS - JCAP_B=$JCAP,JCAP=$JCAP_A,NLAT=$NLAT_A,NLON=$NLON_A,nsig=$LEVS, + JCAP_B=${JCAP},JCAP=${JCAP_A},NLAT=${NLAT_A},NLON=${NLON_A},nsig=${LEVS}, regional=.false.,nlayers(63)=3,nlayers(64)=6, - $GRIDOPTS + ${GRIDOPTS} / &BKGERR vs=0.7, @@ -763,30 +733,30 @@ cat > gsiparm.anl << EOF bkgv_flowdep=.true.,bkgv_rewgtfct=1.5, bkgv_write=.false., cwcoveqqcov=.false., - $BKGVERR + ${BKGVERR} / &ANBKGERR anisotropic=.false., - $ANBKGERR + ${ANBKGERR} / &JCOPTS ljcdfi=.false.,alphajc=0.0,ljcpdry=.true.,bamp_jcpdry=5.0e7, - $JCOPTS + ${JCOPTS} / &STRONGOPTS tlnmc_option=2,nstrong=1,nvmodes_keep=8,period_max=6.,period_width=1.5, - $STRONGOPTS + ${STRONGOPTS} / &OBSQC dfact=0.75,dfact1=3.0,noiqc=.true.,oberrflg=.false.,c_varqc=0.02, use_poq7=.true.,qc_noirjaco3_pole=.true.,vqc=.false.,nvqc=.true., aircraft_t_bc=.true.,biaspredt=1.0e5,upd_aircraft=.true.,cleanup_tail=.true., tcp_width=70.0,tcp_ermax=7.35, - $OBSQC + ${OBSQC} / &OBS_INPUT dmesh(1)=145.0,dmesh(2)=150.0,dmesh(3)=100.0,dmesh(4)=25.0,time_window_max=3.0, - $OBSINPUT + ${OBSINPUT} / OBS_INPUT:: ! 
dfile dtype dplat dsis dval dthin dsfcalc @@ -898,7 +868,6 @@ OBS_INPUT:: iasibufr iasi metop-c iasi_metop-c 0.0 1 1 sstviirs viirs-m npp viirs-m_npp 0.0 4 0 sstviirs viirs-m j1 viirs-m_j1 0.0 4 0 - abibufr abi g18 abi_g18 0.0 1 0 ahibufr ahi himawari9 ahi_himawari9 0.0 1 0 atmsbufr atms n21 atms_n21 0.0 1 1 crisfsbufr cris-fsr n21 cris-fsr_n21 0.0 1 0 @@ -908,37 +877,37 @@ OBS_INPUT:: gomebufr gome metop-c gome_metop-c 0.0 2 0 :: &SUPEROB_RADAR - $SUPERRAD + ${SUPERRAD} / &LAG_DATA - $LAGDATA + ${LAGDATA} / &HYBRID_ENSEMBLE - l_hyb_ens=$l_hyb_ens, + l_hyb_ens=${l_hyb_ens}, generate_ens=.false., beta_s0=0.125,readin_beta=.false., s_ens_h=800.,s_ens_v=-0.8,readin_localization=.true., aniso_a_en=.false.,oz_univ_static=.false.,uv_hyb_ens=.true., ensemble_path='./ensemble_data/', ens_fast_read=.true., - $HYBRID_ENSEMBLE + ${HYBRID_ENSEMBLE} / &RAPIDREFRESH_CLDSURF dfi_radar_latent_heat_time_period=30.0, - $RAPIDREFRESH_CLDSURF + ${RAPIDREFRESH_CLDSURF} / &CHEM - $CHEM + ${CHEM} / &SINGLEOB_TEST maginnov=0.1,magoberr=0.1,oneob_type='t', - oblat=45.,oblon=180.,obpres=1000.,obdattim=$CDATE, + oblat=45.,oblon=180.,obpres=1000.,obdattim=${CDATE}, obhourset=0., - $SINGLEOB + ${SINGLEOB} / &NST - nst_gsi=$NST_GSI, - $NST + nst_gsi=${NST_GSI}, + ${NST} / EOF cat gsiparm.anl @@ -946,20 +915,20 @@ cat gsiparm.anl ############################################################## # Run gsi analysis -export OMP_NUM_THREADS=$NTHREADS_GSI -export pgm=$GSIEXEC +export OMP_NUM_THREADS=${NTHREADS_GSI} +export pgm=${GSIEXEC} . 
prep_step -$NCP $GSIEXEC $DATA -$APRUN_GSI ${DATA}/$(basename $GSIEXEC) 1>&1 2>&2 +${NCP} ${GSIEXEC} ${DATA} +${APRUN_GSI} ${DATA}/$(basename ${GSIEXEC}) 1>&1 2>&2 export err=$?; err_chk ############################################################## # If full analysis field written, calculate analysis increment # here before releasing FV3 forecast -if [ $DO_CALC_INCREMENT = "YES" ]; then - $CALCINCPY +if [ ${DO_CALC_INCREMENT} = "YES" ]; then + ${CALCINCPY} export err=$?; err_chk fi @@ -967,52 +936,52 @@ fi ############################################################## # For eupd if [ -s satbias_out.int ]; then - $NCP satbias_out.int $ABIASe + ${NCP} satbias_out.int ${ABIASe} else - $NCP satbias_in $ABIASe + ${NCP} satbias_in ${ABIASe} fi # Cat runtime output files. -cat fort.2* > $GSISTAT +cat fort.2* > ${GSISTAT} # If requested, create obsinput tarball from obs_input.* files -if [ $RUN_SELECT = "YES" ]; then +if [ ${RUN_SELECT} = "YES" ]; then echo $(date) START tar obs_input >&2 [[ -s obsinput.tar ]] && rm obsinput.tar - $NLN $SELECT_OBS obsinput.tar + ${NLN} ${SELECT_OBS} obsinput.tar ${CHGRP_CMD} obs_input.* tar -cvf obsinput.tar obs_input.* - chmod 750 $SELECT_OBS - ${CHGRP_CMD} $SELECT_OBS + chmod 750 ${SELECT_OBS} + ${CHGRP_CMD} ${SELECT_OBS} rm obsinput.tar echo $(date) END tar obs_input >&2 fi ################################################################################ # Send alerts -if [ $SENDDBN = "YES" ]; then - if [ $RUN = "gfs" ]; then - $DBNROOT/bin/dbn_alert MODEL GFS_abias $job $ABIAS +if [ ${SENDDBN} = "YES" ]; then + if [ ${RUN} = "gfs" ]; then + ${DBNROOT}/bin/dbn_alert MODEL GFS_abias ${job} ${ABIAS} fi fi ################################################################################ # Postprocessing -cd $pwd -[[ $mkdata = "YES" ]] && rm -rf $DATA +cd ${pwd} +[[ ${mkdata} = "YES" ]] && rm -rf ${DATA} ############################################################## # Add this statement to release the forecast job once the # atmopsheric 
analysis and updated surface RESTARTS are # available. Do not release forecast when RUN=enkf ############################################################## -if [ $SENDECF = "YES" -a "$RUN" != "enkf" ]; then +if [ ${SENDECF} = "YES" -a "${RUN}" != "enkf" ]; then ecflow_client --event release_fcst fi -echo "$CDUMP $CDATE atminc done at $(date)" > $COMOUT/${APREFIX}loginc.txt +echo "${CDUMP} ${CDATE} atminc done at $(date)" > ${COM_ATMOS_ANALYSIS}/${APREFIX}loginc.txt ################################################################################ -exit $err +exit ${err} ################################################################################ diff --git a/scripts/exglobal_atmos_analysis_calc.sh b/scripts/exglobal_atmos_analysis_calc.sh index 2fa44c16b43..b353d3c52b1 100755 --- a/scripts/exglobal_atmos_analysis_calc.sh +++ b/scripts/exglobal_atmos_analysis_calc.sh @@ -23,27 +23,17 @@ source "$HOMEgfs/ush/preamble.sh" # Directories. pwd=$(pwd) -export FIXgsm=${FIXgsm:-$HOMEgfs/fix/fix_am} +export FIXgsm=${FIXgsm:-$HOMEgfs/fix/am} # Base variables -CDATE=${CDATE:-"2001010100"} CDUMP=${CDUMP:-"gdas"} GDUMP=${GDUMP:-"gdas"} -# Derived base variables -GDATE=$($NDATE -$assim_freq $CDATE) -BDATE=$($NDATE -3 $CDATE) -PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) -bPDY=$(echo $BDATE | cut -c1-8) -bcyc=$(echo $BDATE | cut -c9-10) - # Utilities export NCP=${NCP:-"/bin/cp"} export NMV=${NMV:-"/bin/mv"} export NLN=${NLN:-"/bin/ln -sf"} export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} -export NEMSIOGET=${NEMSIOGET:-${NWPROD}/exec/nemsio_get} export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} COMPRESS=${COMPRESS:-gzip} UNCOMPRESS=${UNCOMPRESS:-gunzip} @@ -72,7 +62,7 @@ CALCANLPY=${CALCANLPY:-$HOMEgfs/ush/calcanl_gfs.py} DOGAUSFCANL=${DOGAUSFCANL-"NO"} GAUSFCANLSH=${GAUSFCANLSH:-$HOMEgfs/ush/gaussian_sfcanl.sh} -export GAUSFCANLEXE=${GAUSFCANLEXE:-$HOMEgfs/exec/gaussian_sfcanl.exe} +export 
GAUSFCANLEXE=${GAUSFCANLEXE:-$HOMEgfs/exec/gaussian_sfcanl.x} NTHREADS_GAUSFCANL=${NTHREADS_GAUSFCANL:-1} APRUN_GAUSFCANL=${APRUN_GAUSFCANL:-${APRUN:-""}} @@ -83,24 +73,22 @@ SENDDBN=${SENDDBN:-"NO"} # Guess files GPREFIX=${GPREFIX:-""} -GSUFFIX=${GSUFFIX:-$SUFFIX} -ATMG03=${ATMG03:-${COMIN_GES}/${GPREFIX}atmf003${GSUFFIX}} -ATMG04=${ATMG04:-${COMIN_GES}/${GPREFIX}atmf004${GSUFFIX}} -ATMG05=${ATMG05:-${COMIN_GES}/${GPREFIX}atmf005${GSUFFIX}} -ATMGES=${ATMGES:-${COMIN_GES}/${GPREFIX}atmf006${GSUFFIX}} -ATMG07=${ATMG07:-${COMIN_GES}/${GPREFIX}atmf007${GSUFFIX}} -ATMG08=${ATMG08:-${COMIN_GES}/${GPREFIX}atmf008${GSUFFIX}} -ATMG09=${ATMG09:-${COMIN_GES}/${GPREFIX}atmf009${GSUFFIX}} +ATMG03=${ATMG03:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf003.nc} +ATMG04=${ATMG04:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf004.nc} +ATMG05=${ATMG05:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf005.nc} +ATMGES=${ATMGES:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf006.nc} +ATMG07=${ATMG07:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf007.nc} +ATMG08=${ATMG08:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf008.nc} +ATMG09=${ATMG09:-${COM_ATMOS_HISTORY_PREV}/${GPREFIX}atmf009.nc} # Analysis files export APREFIX=${APREFIX:-""} -export ASUFFIX=${ASUFFIX:-$SUFFIX} -SFCANL=${SFCANL:-${COMOUT}/${APREFIX}sfcanl${ASUFFIX}} -DTFANL=${DTFANL:-${COMOUT}/${APREFIX}dtfanl.nc} -ATMANL=${ATMANL:-${COMOUT}/${APREFIX}atmanl${ASUFFIX}} +SFCANL=${SFCANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}sfcanl.nc} +DTFANL=${DTFANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}dtfanl.nc} +ATMANL=${ATMANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmanl.nc} # Increment files -ATMINC=${ATMINC:-${COMOUT}/${APREFIX}atminc.nc} +ATMINC=${ATMINC:-${COM_ATMOS_ANALYSIS}/${APREFIX}atminc.nc} # Set script / GSI control parameters DOHYBVAR=${DOHYBVAR:-"NO"} @@ -117,18 +105,18 @@ fi # Set 4D-EnVar specific variables if [ $DOHYBVAR = "YES" -a $l4densvar = ".true." -a $lwrite4danl = ".true." 
]; then - ATMA03=${ATMA03:-${COMOUT}/${APREFIX}atma003${ASUFFIX}} - ATMI03=${ATMI03:-${COMOUT}/${APREFIX}atmi003.nc} - ATMA04=${ATMA04:-${COMOUT}/${APREFIX}atma004${ASUFFIX}} - ATMI04=${ATMI04:-${COMOUT}/${APREFIX}atmi004.nc} - ATMA05=${ATMA05:-${COMOUT}/${APREFIX}atma005${ASUFFIX}} - ATMI05=${ATMI05:-${COMOUT}/${APREFIX}atmi005.nc} - ATMA07=${ATMA07:-${COMOUT}/${APREFIX}atma007${ASUFFIX}} - ATMI07=${ATMI07:-${COMOUT}/${APREFIX}atmi007.nc} - ATMA08=${ATMA08:-${COMOUT}/${APREFIX}atma008${ASUFFIX}} - ATMI08=${ATMI08:-${COMOUT}/${APREFIX}atmi008.nc} - ATMA09=${ATMA09:-${COMOUT}/${APREFIX}atma009${ASUFFIX}} - ATMI09=${ATMI09:-${COMOUT}/${APREFIX}atmi009.nc} + ATMA03=${ATMA03:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma003.nc} + ATMI03=${ATMI03:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi003.nc} + ATMA04=${ATMA04:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma004.nc} + ATMI04=${ATMI04:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi004.nc} + ATMA05=${ATMA05:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma005.nc} + ATMI05=${ATMI05:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi005.nc} + ATMA07=${ATMA07:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma007.nc} + ATMI07=${ATMI07:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi007.nc} + ATMA08=${ATMA08:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma008.nc} + ATMI08=${ATMI08:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi008.nc} + ATMA09=${ATMA09:-${COM_ATMOS_ANALYSIS}/${APREFIX}atma009.nc} + ATMI09=${ATMI09:-${COM_ATMOS_ANALYSIS}/${APREFIX}atmi009.nc} fi ################################################################################ @@ -197,7 +185,7 @@ if [ $DOGAUSFCANL = "YES" ]; then export err=$?; err_chk fi -echo "$CDUMP $CDATE atmanl and sfcanl done at $(date)" > $COMOUT/${APREFIX}loganl.txt +echo "${CDUMP} ${PDY}${cyc} atmanl and sfcanl done at $(date)" > "${COM_ATMOS_ANALYSIS}/${APREFIX}loganl.txt" ################################################################################ # Postprocessing diff --git a/scripts/exglobal_atmos_sfcanl.sh b/scripts/exglobal_atmos_sfcanl.sh index 899e0ae84ae..65c9b26250a 100755 
--- a/scripts/exglobal_atmos_sfcanl.sh +++ b/scripts/exglobal_atmos_sfcanl.sh @@ -19,30 +19,24 @@ # Set environment. -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" # Directories. pwd=$(pwd) -# Base variables -CDATE=${CDATE:-"2001010100"} -CDUMP=${CDUMP:-"gdas"} -GDUMP=${GDUMP:-"gdas"} - # Derived base variables -GDATE=$($NDATE -$assim_freq $CDATE) -BDATE=$($NDATE -3 $CDATE) -PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) -bPDY=$(echo $BDATE | cut -c1-8) -bcyc=$(echo $BDATE | cut -c9-10) +# Ignore possible spelling error (nothing is misspelled) +# shellcheck disable=SC2153 +GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}") +BDATE=$(${NDATE} -3 "${PDY}${cyc}") +bPDY=${BDATE:0:8} +bcyc=${BDATE:8:2} # Utilities export NCP=${NCP:-"/bin/cp"} export NMV=${NMV:-"/bin/mv"} export NLN=${NLN:-"/bin/ln -sf"} export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} -export NEMSIOGET=${NEMSIOGET:-${NWPROD}/exec/nemsio_get} export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} COMPRESS=${COMPRESS:-gzip} UNCOMPRESS=${UNCOMPRESS:-gunzip} @@ -53,16 +47,16 @@ DOIAU=${DOIAU:-"NO"} export IAUFHRS=${IAUFHRS:-"6"} # Surface cycle related parameters -CYCLESH=${CYCLESH:-$HOMEgfs/ush/global_cycle.sh} -export CYCLEXEC=${CYCLEXEC:-$HOMEgfs/exec/global_cycle} +CYCLESH=${CYCLESH:-${HOMEgfs}/ush/global_cycle.sh} +export CYCLEXEC=${CYCLEXEC:-${HOMEgfs}/exec/global_cycle} NTHREADS_CYCLE=${NTHREADS_CYCLE:-24} APRUN_CYCLE=${APRUN_CYCLE:-${APRUN:-""}} export SNOW_NUDGE_COEFF=${SNOW_NUDGE_COEFF:-'-2.'} export CYCLVARS=${CYCLVARS:-""} export FHOUR=${FHOUR:-0} export DELTSFC=${DELTSFC:-6} -export FIXgsm=${FIXgsm:-$HOMEgfs/fix/fix_am} -export FIXfv3=${FIXfv3:-$HOMEgfs/fix/fix_fv3_gmted2010} +export FIXgsm=${FIXgsm:-${HOMEgfs}/fix/am} +export FIXfv3=${FIXfv3:-${HOMEgfs}/fix/orog} # FV3 specific info (required for global_cycle) export CASE=${CASE:-"C384"} @@ -78,15 +72,15 @@ export APRUN_CALCINC=${APRUN_CALCINC:-${APRUN:-""}} export 
APRUN_CALCANL=${APRUN_CALCANL:-${APRUN:-""}} export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}} -export CALCANLEXEC=${CALCANLEXEC:-$HOMEgfs/exec/calc_analysis.x} -export CHGRESNCEXEC=${CHGRESNCEXEC:-$HOMEgfs/exec/enkf_chgres_recenter_nc.x} -export CHGRESINCEXEC=${CHGRESINCEXEC:-$HOMEgfs/exec/interp_inc.x} +export CALCANLEXEC=${CALCANLEXEC:-${HOMEgfs}/exec/calc_analysis.x} +export CHGRESNCEXEC=${CHGRESNCEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter_nc.x} +export CHGRESINCEXEC=${CHGRESINCEXEC:-${HOMEgfs}/exec/interp_inc.x} export NTHREADS_CHGRES=${NTHREADS_CHGRES:-1} -CALCINCPY=${CALCINCPY:-$HOMEgfs/ush/calcinc_gfs.py} -CALCANLPY=${CALCANLPY:-$HOMEgfs/ush/calcanl_gfs.py} +CALCINCPY=${CALCINCPY:-${HOMEgfs}/ush/calcinc_gfs.py} +CALCANLPY=${CALCANLPY:-${HOMEgfs}/ush/calcanl_gfs.py} export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}} -CHGRESEXEC=${CHGRESEXEC:-$HOMEgfs/exec/enkf_chgres_recenter.x} +CHGRESEXEC=${CHGRESEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter.x} # OPS flags RUN=${RUN:-""} @@ -94,7 +88,7 @@ SENDECF=${SENDECF:-"NO"} SENDDBN=${SENDDBN:-"NO"} RUN_GETGES=${RUN_GETGES:-"NO"} GETGESSH=${GETGESSH:-"getges.sh"} -export gesenvir=${gesenvir:-$envir} +export gesenvir=${gesenvir:-${envir}} # Observations OPREFIX=${OPREFIX:-""} @@ -105,10 +99,10 @@ GPREFIX=${GPREFIX:-""} # Analysis files export APREFIX=${APREFIX:-""} -DTFANL=${DTFANL:-${COMOUT}/${APREFIX}dtfanl.nc} +DTFANL=${DTFANL:-${COM_ATMOS_ANALYSIS}/${APREFIX}dtfanl.nc} # Get dimension information based on CASE -res=$(echo $CASE | cut -c2-) +res=$(echo ${CASE} | cut -c2-) JCAP_CASE=$((res*2-2)) LATB_CASE=$((res*2)) LONB_CASE=$((res*4)) @@ -116,16 +110,16 @@ LONB_CASE=$((res*4)) ################################################################################ # Preprocessing mkdata=NO -if [ ! -d $DATA ]; then +if [[ ! 
-d ${DATA} ]]; then mkdata=YES - mkdir -p $DATA + mkdir -p ${DATA} fi -cd $DATA || exit 99 +cd ${DATA} || exit 99 -if [ $DONST = "YES" ]; then - export NSSTBF="${COMOUT}/${OPREFIX}nsstbufr" - $NLN $NSSTBF nsstbufr +if [[ ${DONST} = "YES" ]]; then + export NSSTBF="${COM_OBS}/${OPREFIX}nsstbufr" + ${NLN} ${NSSTBF} nsstbufr fi @@ -135,87 +129,91 @@ fi ############################################################## # Output files -if [ $DONST = "YES" ]; then - $NLN $DTFANL dtfanl +if [[ ${DONST} = "YES" ]]; then + ${NLN} ${DTFANL} dtfanl fi ############################################################## # Update surface fields in the FV3 restart's using global_cycle -mkdir -p $COMOUT/RESTART +mkdir -p "${COM_ATMOS_RESTART}" # Global cycle requires these files -export FNTSFA=${FNTSFA:-$COMIN_OBS/${OPREFIX}rtgssthr.grb} -export FNACNA=${FNACNA:-$COMIN_OBS/${OPREFIX}seaice.5min.blend.grb} -export FNSNOA=${FNSNOA:-$COMIN_OBS/${OPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} -[[ ! -f $FNSNOA ]] && export FNSNOA="$COMIN_OBS/${OPREFIX}snogrb_t1534.3072.1536" -FNSNOG=${FNSNOG:-$COMIN_GES_OBS/${GPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} -[[ ! -f $FNSNOG ]] && FNSNOG="$COMIN_GES_OBS/${GPREFIX}snogrb_t1534.3072.1536" +export FNTSFA=${FNTSFA:-${COM_OBS}/${OPREFIX}rtgssthr.grb} +export FNACNA=${FNACNA:-${COM_OBS}/${OPREFIX}seaice.5min.blend.grb} +export FNSNOA=${FNSNOA:-${COM_OBS}/${OPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} +[[ ! -f ${FNSNOA} ]] && export FNSNOA="${COM_OBS}/${OPREFIX}snogrb_t1534.3072.1536" +FNSNOG=${FNSNOG:-${COM_OBS_PREV}/${GPREFIX}snogrb_t${JCAP_CASE}.${LONB_CASE}.${LATB_CASE}} +[[ ! 
-f ${FNSNOG} ]] && FNSNOG="${COM_OBS_PREV}/${GPREFIX}snogrb_t1534.3072.1536" # Set CYCLVARS by checking grib date of current snogrb vs that of prev cycle -if [ $RUN_GETGES = "YES" ]; then - snoprv=$($GETGESSH -q -t snogrb_$JCAP_CASE -e $gesenvir -n $GDUMP -v $GDATE) +if [[ ${RUN_GETGES} = "YES" ]]; then + snoprv=$(${GETGESSH} -q -t snogrb_${JCAP_CASE} -e ${gesenvir} -n ${GDUMP} -v ${GDATE}) else - snoprv=${snoprv:-$FNSNOG} + snoprv=${snoprv:-${FNSNOG}} fi -if [ $($WGRIB -4yr $FNSNOA 2>/dev/null | grep -i snowc | awk -F: '{print $3}' | awk -F= '{print $2}') -le \ - $($WGRIB -4yr $snoprv 2>/dev/null | grep -i snowc | awk -F: '{print $3}' | awk -F= '{print $2}') ] ; then +if [[ $(${WGRIB} -4yr ${FNSNOA} 2>/dev/null | grep -i snowc | awk -F: '{print $3}' | awk -F= '{print $2}') -le \ + $(${WGRIB} -4yr ${snoprv} 2>/dev/null | grep -i snowc | awk -F: '{print $3}' | awk -F= '{print $2}') ]] ; then export FNSNOA=" " export CYCLVARS="FSNOL=99999.,FSNOS=99999.," else export SNOW_NUDGE_COEFF=${SNOW_NUDGE_COEFF:-0.} - export CYCLVARS="FSNOL=${SNOW_NUDGE_COEFF},$CYCLVARS" + export CYCLVARS="FSNOL=${SNOW_NUDGE_COEFF},${CYCLVARS}" fi -if [ $DONST = "YES" ]; then - export NST_FILE=${GSI_FILE:-$COMOUT/${APREFIX}dtfanl.nc} +if [[ ${DONST} = "YES" ]]; then + export NST_FILE=${GSI_FILE:-${COM_ATMOS_ANALYSIS}/${APREFIX}dtfanl.nc} else export NST_FILE="NULL" fi -if [ $DOIAU = "YES" ]; then +if [[ ${DOIAU} = "YES" ]]; then # update surface restarts at the beginning of the window, if IAU # For now assume/hold dtfanl.nc valid at beginning of window - for n in $(seq 1 $ntiles); do - $NLN $COMIN_GES/RESTART/$bPDY.${bcyc}0000.sfc_data.tile${n}.nc $DATA/fnbgsi.00$n - $NLN $COMOUT/RESTART/$bPDY.${bcyc}0000.sfcanl_data.tile${n}.nc $DATA/fnbgso.00$n - $NLN $FIXfv3/$CASE/${CASE}_grid.tile${n}.nc $DATA/fngrid.00$n - $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile${n}.nc $DATA/fnorog.00$n + for n in $(seq 1 ${ntiles}); do + ${NCP} "${COM_ATMOS_RESTART_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" \ + 
"${COM_ATMOS_RESTART}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" + ${NLN} "${COM_ATMOS_RESTART_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" "${DATA}/fnbgsi.00${n}" + ${NLN} "${COM_ATMOS_RESTART}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" "${DATA}/fnbgso.00${n}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.00${n}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_oro_data.tile${n}.nc" "${DATA}/fnorog.00${n}" done - export APRUNCY=$APRUN_CYCLE - export OMP_NUM_THREADS_CY=$NTHREADS_CYCLE - export MAX_TASKS_CY=$ntiles + export APRUNCY=${APRUN_CYCLE} + export OMP_NUM_THREADS_CY=${NTHREADS_CYCLE} + export MAX_TASKS_CY=${ntiles} - $CYCLESH + CDATE="${PDY}${cyc}" ${CYCLESH} export err=$?; err_chk fi # Update surface restarts at middle of window -for n in $(seq 1 $ntiles); do - $NLN $COMIN_GES/RESTART/$PDY.${cyc}0000.sfc_data.tile${n}.nc $DATA/fnbgsi.00$n - $NLN $COMOUT/RESTART/$PDY.${cyc}0000.sfcanl_data.tile${n}.nc $DATA/fnbgso.00$n - $NLN $FIXfv3/$CASE/${CASE}_grid.tile${n}.nc $DATA/fngrid.00$n - $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile${n}.nc $DATA/fnorog.00$n +for n in $(seq 1 ${ntiles}); do + ${NCP} "${COM_ATMOS_RESTART_PREV}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" \ + "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" + ${NLN} "${COM_ATMOS_RESTART_PREV}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" "${DATA}/fnbgsi.00${n}" + ${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" "${DATA}/fnbgso.00${n}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.00${n}" + ${NLN} "${FIXfv3}/${CASE}/${CASE}_oro_data.tile${n}.nc" "${DATA}/fnorog.00${n}" done -export APRUNCY=$APRUN_CYCLE -export OMP_NUM_THREADS_CY=$NTHREADS_CYCLE -export MAX_TASKS_CY=$ntiles +export APRUNCY=${APRUN_CYCLE} +export OMP_NUM_THREADS_CY=${NTHREADS_CYCLE} +export MAX_TASKS_CY=${ntiles} -$CYCLESH +CDATE="${PDY}${cyc}" ${CYCLESH} export err=$?; err_chk ################################################################################ # 
Postprocessing -cd $pwd -[[ $mkdata = "YES" ]] && rm -rf $DATA +cd ${pwd} +[[ ${mkdata} = "YES" ]] && rm -rf ${DATA} ################################################################################ -exit $err +exit ${err} ################################################################################ diff --git a/scripts/exglobal_atmos_tropcy_qc_reloc.sh b/scripts/exglobal_atmos_tropcy_qc_reloc.sh index 6f96d7cfb43..380441a6c9d 100755 --- a/scripts/exglobal_atmos_tropcy_qc_reloc.sh +++ b/scripts/exglobal_atmos_tropcy_qc_reloc.sh @@ -17,8 +17,6 @@ cd $DATA cat break > $pgmout -export COMSP=$COMOUT/${RUN}.${cycle}. - tmhr=$(echo $tmmark|cut -c3-4) cdate10=$( ${NDATE:?} -$tmhr $PDY$cyc) @@ -28,21 +26,17 @@ tmmark_uc=$(echo $tmmark | tr [a-z] [A-Z]) iflag=0 if [ $RUN = ndas ]; then if [ $DO_RELOCATE = NO ]; then - msg="CENTER PROCESSING TIME FOR NDAS TROPICAL CYCLONE QC IS $cdate10" - postmsg "$jlogfile" "$msg" - msg="Output tcvitals files will be copied forward in time to proper \ + echo "CENTER PROCESSING TIME FOR NDAS TROPICAL CYCLONE QC IS $cdate10" + echo "Output tcvitals files will be copied forward in time to proper \ output file directory path locations" - postmsg "$jlogfile" "$msg" iflag=1 else - msg="CENTER PROCESSING TIME FOR $tmmark_uc NDAS TROPICAL CYCLONE \ + echo "CENTER PROCESSING TIME FOR $tmmark_uc NDAS TROPICAL CYCLONE \ RELOCATION IS $cdate10" - postmsg "$jlogfile" "$msg" fi else - msg="CENTER PROCESSING TIME FOR $tmmark_uc $NET_uc TROPICAL CYCLONE QC/\ + echo "CENTER PROCESSING TIME FOR $tmmark_uc $NET_uc TROPICAL CYCLONE QC/\ RELOCATION IS $cdate10" - postmsg "$jlogfile" "$msg" fi @@ -59,13 +53,12 @@ if [ "$PROCESS_TROPCY" = 'YES' ]; then ${USHSYND:-$HOMEgfs/ush}/syndat_qctropcy.sh $cdate10 errsc=$? if [ "$errsc" -ne '0' ]; then - msg="syndat_qctropcy.sh failed. exit" - postmsg "$jlogfile" "$msg" + echo "syndat_qctropcy.sh failed. 
exit" exit $errsc fi - cd $COMOUT + cd "${COM_OBS}" || exit 1 pwd ls -ltr *syndata* cd $ARCHSYND @@ -84,10 +77,10 @@ else # don't want to wipe out these files) # - [ ! -s ${COMSP}syndata.tcvitals.$tmmark ] && \ - cp /dev/null ${COMSP}syndata.tcvitals.$tmmark - [ ! -s ${COMSP}jtwc-fnoc.tcvitals.$tmmark ] && \ - cp /dev/null ${COMSP}jtwc-fnoc.tcvitals.$tmmark + [ ! -s "${COM_OBS}/${RUN}.t${cyc}z.syndata.tcvitals.${tmmark}" ] && \ + cp "/dev/null" "${COM_OBS}/${RUN}.t${cyc}z.syndata.tcvitals.${tmmark}" + [ ! -s "${COM_OBS}/${RUN}.t${cyc}z.jtwc-fnoc.tcvitals.${tmmark}" ] && \ + cp "/dev/null" "${COM_OBS}/${RUN}.t${cyc}z.jtwc-fnoc.tcvitals.${tmmark}" # endif loop $PROCESS_TROPCY fi @@ -115,25 +108,25 @@ if [ "$DO_RELOCATE" = 'YES' ]; then [ $RUN = gfs -o $RUN = gdas -o $NET = cfs ] && qual_last="" if [ $BKGFREQ -eq 1 ]; then - [ -s sgm3prep ] && cp sgm3prep ${COMSP}sgm3prep${qual_last} - [ -s sgm2prep ] && cp sgm2prep ${COMSP}sgm2prep${qual_last} - [ -s sgm1prep ] && cp sgm1prep ${COMSP}sgm1prep${qual_last} - [ -s sgesprep ] && cp sgesprep ${COMSP}sgesprep${qual_last} - [ -s sgp1prep ] && cp sgp1prep ${COMSP}sgp1prep${qual_last} - [ -s sgp2prep ] && cp sgp2prep ${COMSP}sgp2prep${qual_last} - [ -s sgp3prep ] && cp sgp3prep ${COMSP}sgp3prep${qual_last} + if [[ -s sgm3prep ]]; then cp "sgm3prep" "${COM_OBS}/${RUN}.t${cyc}z.sgm3prep${qual_last}"; fi + if [[ -s sgm2prep ]]; then cp "sgm2prep" "${COM_OBS}/${RUN}.t${cyc}z.sgm2prep${qual_last}"; fi + if [[ -s sgm1prep ]]; then cp "sgm1prep" "${COM_OBS}/${RUN}.t${cyc}z.sgm1prep${qual_last}"; fi + if [[ -s sgesprep ]]; then cp "sgesprep" "${COM_OBS}/${RUN}.t${cyc}z.sgesprep${qual_last}"; fi + if [[ -s sgp1prep ]]; then cp "sgp1prep" "${COM_OBS}/${RUN}.t${cyc}z.sgp1prep${qual_last}"; fi + if [[ -s sgp2prep ]]; then cp "sgp2prep" "${COM_OBS}/${RUN}.t${cyc}z.sgp2prep${qual_last}"; fi + if [[ -s sgp3prep ]]; then cp "sgp3prep" "${COM_OBS}/${RUN}.t${cyc}z.sgp3prep${qual_last}"; fi elif [ $BKGFREQ -eq 3 ]; then - [ -s sgm3prep ] && cp 
sgm3prep ${COMSP}sgm3prep${qual_last} - [ -s sgesprep ] && cp sgesprep ${COMSP}sgesprep${qual_last} - [ -s sgp3prep ] && cp sgp3prep ${COMSP}sgp3prep${qual_last} + if [[ -s sgm3prep ]]; then cp "sgm3prep" "${COM_OBS}/${RUN}.t${cyc}z.sgm3prep${qual_last}"; fi + if [[ -s sgesprep ]]; then cp "sgesprep" "${COM_OBS}/${RUN}.t${cyc}z.sgesprep${qual_last}"; fi + if [[ -s sgp3prep ]]; then cp "sgp3prep" "${COM_OBS}/${RUN}.t${cyc}z.sgp3prep${qual_last}"; fi fi -# The existence of ${COMSP}tropcy_relocation_status.$tmmark file will tell the +# The existence of ${COM_OBS}/${RUN}.t${cyc}z.tropcy_relocation_status.$tmmark file will tell the # subsequent PREP processing that RELOCATION processing occurred, if this file # does not already exist at this point, echo "RECORDS PROCESSED" into it to # further tell PREP processing that records were processed by relocation and # the global sigma guess was modified by tropical cyclone relocation -# Note: If ${COMSP}tropcy_relocation_status.$tmmark already exists at this +# Note: If ${COM_OBS}/${RUN}.t${cyc}z.tropcy_relocation_status.$tmmark already exists at this # point it means that it contains the string "NO RECORDS to process" # and was created by the child script tropcy_relocate.sh because records # were not processed by relocation and the global sigma guess was NOT @@ -141,8 +134,9 @@ if [ "$DO_RELOCATE" = 'YES' ]; then # were found in the relocation step) # ---------------------------------------------------------------------------- - [ ! -s ${COMSP}tropcy_relocation_status.$tmmark ] && \ - echo "RECORDS PROCESSED" > ${COMSP}tropcy_relocation_status.$tmmark + if [[ ! 
-s "${COM_OBS}/${RUN}.t${cyc}z.tropcy_relocation_status.${tmmark}" ]]; then + echo "RECORDS PROCESSED" > "${COM_OBS}/${RUN}.t${cyc}z.tropcy_relocation_status.${tmmark}" + fi # endif loop $DO_RELOCATE fi diff --git a/scripts/exglobal_diag.sh b/scripts/exglobal_diag.sh index 0423a9fc703..3aa1093fad1 100755 --- a/scripts/exglobal_diag.sh +++ b/scripts/exglobal_diag.sh @@ -25,24 +25,15 @@ source "$HOMEgfs/ush/preamble.sh" pwd=$(pwd) # Base variables -CDATE=${CDATE:-"2001010100"} +CDATE="${PDY}${cyc}" CDUMP=${CDUMP:-"gdas"} GDUMP=${GDUMP:-"gdas"} -# Derived base variables -GDATE=$($NDATE -$assim_freq $CDATE) -BDATE=$($NDATE -3 $CDATE) -PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) -bPDY=$(echo $BDATE | cut -c1-8) -bcyc=$(echo $BDATE | cut -c9-10) - # Utilities export NCP=${NCP:-"/bin/cp"} export NMV=${NMV:-"/bin/mv"} export NLN=${NLN:-"/bin/ln -sf"} export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} -export NEMSIOGET=${NEMSIOGET:-${NWPROD}/exec/nemsio_get} export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} export CATEXEC=${CATEXEC:-$ncdiag_ROOT/bin/ncdiag_cat_serial.x} COMPRESS=${COMPRESS:-gzip} @@ -62,11 +53,10 @@ SENDDBN=${SENDDBN:-"NO"} # Analysis files export APREFIX=${APREFIX:-""} -export ASUFFIX=${ASUFFIX:-$SUFFIX} -RADSTAT=${RADSTAT:-${COMOUT}/${APREFIX}radstat} -PCPSTAT=${PCPSTAT:-${COMOUT}/${APREFIX}pcpstat} -CNVSTAT=${CNVSTAT:-${COMOUT}/${APREFIX}cnvstat} -OZNSTAT=${OZNSTAT:-${COMOUT}/${APREFIX}oznstat} +RADSTAT=${RADSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}radstat} +PCPSTAT=${PCPSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}pcpstat} +CNVSTAT=${CNVSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}cnvstat} +OZNSTAT=${OZNSTAT:-${COM_ATMOS_ANALYSIS}/${APREFIX}oznstat} # Remove stat file if file already exists [[ -s $RADSTAT ]] && rm -f $RADSTAT @@ -88,7 +78,7 @@ nm="" if [ $CFP_MP = "YES" ]; then nm=0 fi -DIAG_DIR=${DIAG_DIR:-${COMOUT}/gsidiags} +DIAG_DIR=${DIAG_DIR:-${COM_ATMOS_ANALYSIS}/gsidiags} REMOVE_DIAG_DIR=${REMOVE_DIAG_DIR:-"NO"} # Set script / GSI 
control parameters diff --git a/scripts/exglobal_forecast.py b/scripts/exglobal_forecast.py new file mode 100755 index 00000000000..2b21934bfac --- /dev/null +++ b/scripts/exglobal_forecast.py @@ -0,0 +1,27 @@ +#!/usr/bin/env python3 + +import os + +from pygw.logger import Logger, logit +from pygw.yaml_file import save_as_yaml +from pygw.configuration import cast_strdict_as_dtypedict +from pygfs.task.gfs_forecast import GFSForecast + +# initialize root logger +logger = Logger(level=os.environ.get("LOGGING_LEVEL"), colored_log=True) + + +@logit(logger) +def main(): + + # instantiate the forecast + config = cast_strdict_as_dtypedict(os.environ) + save_as_yaml(config, f'{config.EXPDIR}/fcst.yaml') # Temporarily save the input to the Forecast + + fcst = GFSForecast(config) + fcst.initialize() + fcst.configure() + + +if __name__ == '__main__': + main() diff --git a/scripts/exglobal_forecast.sh b/scripts/exglobal_forecast.sh index 3f2ad87cafd..d86691d5ecc 100755 --- a/scripts/exglobal_forecast.sh +++ b/scripts/exglobal_forecast.sh @@ -77,9 +77,9 @@ # Main body starts here ####################### -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" -SCRIPTDIR=$(dirname $(readlink -f "$0") )/../ush +SCRIPTDIR="${HOMEgfs}/ush" echo "MAIN: environment loaded for $machine platform,Current Script locates in $SCRIPTDIR." # include all subroutines. Executions later. @@ -123,18 +123,14 @@ common_predet echo $RUN case $RUN in 'data') DATM_predet;; - 'gfs') FV3_GFS_predet;; - 'gdas') FV3_GFS_predet;; - 'gefs') FV3_GEFS_predet;; + *gfs | *gdas | 'gefs') FV3_GFS_predet;; esac [[ $cplflx = .true. ]] && MOM6_predet [[ $cplwav = .true. ]] && WW3_predet [[ $cplice = .true. ]] && CICE_predet case $RUN in - 'gfs') FV3_GFS_det;; - 'gdas') FV3_GFS_det;; - 'gefs') FV3_GEFS_det;; + *gfs | *gdas | 'gefs') FV3_GFS_det;; esac #no run type determination for data atmosphere [[ $cplflx = .true. ]] && MOM6_det [[ $cplwav = .true. 
]] && WW3_det @@ -146,9 +142,7 @@ echo "MAIN: Post-determination set up of run type" echo $RUN case $RUN in 'data') DATM_postdet;; - 'gfs') FV3_GFS_postdet;; - 'gdas') FV3_GFS_postdet;; - 'gefs') FV3_GEFS_postdet;; + *gfs | *gdas | 'gefs') FV3_GFS_postdet;; esac #no post determination set up for data atmosphere [[ $cplflx = .true. ]] && MOM6_postdet [[ $cplwav = .true. ]] && WW3_postdet @@ -159,10 +153,8 @@ echo "MAIN: Post-determination set up of run type finished" echo "MAIN: Writing name lists and model configuration" case $RUN in 'data') DATM_nml;; - 'gfs') FV3_GFS_nml;; - 'gdas') FV3_GFS_nml;; - 'gefs') FV3_GEFS_nml;; -esac #no namelist for data atmosphere + *gfs | *gdas | 'gefs') FV3_GFS_nml;; +esac [[ $cplflx = .true. ]] && MOM6_nml [[ $cplwav = .true. ]] && WW3_nml [[ $cplice = .true. ]] && CICE_nml @@ -170,9 +162,7 @@ esac #no namelist for data atmosphere case $RUN in 'data') DATM_model_configure;; - 'gfs') FV3_model_configure;; - 'gdas') FV3_model_configure;; - 'gefs') FV3_model_configure;; + *gfs | *gdas | 'gefs') FV3_model_configure;; esac echo "MAIN: Name lists and model configuration written" @@ -188,31 +178,20 @@ if [ $esmf_profile ]; then export ESMF_RUNTIME_PROFILE_OUTPUT=SUMMARY fi -if [ $machine != 'sandbox' ]; then - $NCP $FCSTEXECDIR/$FCSTEXEC $DATA/. - export OMP_NUM_THREADS=$NTHREADS_FV3 - $APRUN_FV3 $DATA/$FCSTEXEC 1>&1 2>&2 - export ERR=$? - export err=$ERR - $ERRSCRIPT || exit $err -else - echo "MAIN: mpirun launch here" -fi +$NCP $FCSTEXECDIR/$FCSTEXEC $DATA/. +$APRUN_UFS $DATA/$FCSTEXEC 1>&1 2>&2 +export ERR=$? +export err=$ERR +$ERRSCRIPT || exit $err -if [ $machine != 'sandbox' ]; then - case $RUN in - 'data') data_out_Data_ATM;; - 'gfs') data_out_GFS;; - 'gdas') data_out_GFS;; - 'gefs') data_out_GEFS;; - esac - [[ $cplflx = .true. ]] && MOM6_out - [[ $cplwav = .true. ]] && WW3_out - [[ $cplice = .true. ]] && CICE_out - [[ $esmf_profile = .true. 
]] && CPL_out -else - echo "MAIN: Running on sandbox mode, no output linking" -fi +case $RUN in + 'data') data_out_Data_ATM;; + *gfs | *gdas | 'gefs') data_out_GFS;; +esac +[[ $cplflx = .true. ]] && MOM6_out +[[ $cplwav = .true. ]] && WW3_out +[[ $cplice = .true. ]] && CICE_out +[[ $esmf_profile = .true. ]] && CPL_out echo "MAIN: Output copied to COMROT" #------------------------------------------------------------------ diff --git a/scripts/run_reg2grb2.sh b/scripts/run_reg2grb2.sh index 2284088f47e..ab2c80043ed 100755 --- a/scripts/run_reg2grb2.sh +++ b/scripts/run_reg2grb2.sh @@ -1,11 +1,11 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" #requires grib_util module -MOM6REGRID=${MOM6REGRID:-$HOMEgfs} -export mask_file=$MOM6REGRID/fix/fix_reg2grb2/mask.0p25x0p25.grb2 +MOM6REGRID=${MOM6REGRID:-${HOMEgfs}} +export mask_file="${MOM6REGRID}/fix/reg2grb2/mask.0p25x0p25.grb2" # offline testing: #export DATA= @@ -14,25 +14,25 @@ export mask_file=$MOM6REGRID/fix/fix_reg2grb2/mask.0p25x0p25.grb2 #export outfile=$DATA/DATA0p5/out/ocnh2012010106.01.2012010100.grb2 # # workflow testing: -export icefile=icer${CDATE}.${ENSMEM}.${IDATE}_0p25x0p25_CICE.nc -export ocnfile=ocnr${CDATE}.${ENSMEM}.${IDATE}_0p25x0p25_MOM6.nc -export outfile=ocn_ice${CDATE}.${ENSMEM}.${IDATE}_0p25x0p25.grb2 -export outfile0p5=ocn_ice${CDATE}.${ENSMEM}.${IDATE}_0p5x0p5.grb2 +export icefile="icer${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25_CICE.nc" +export ocnfile="ocnr${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25_MOM6.nc" +export outfile="ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25.grb2" +export outfile0p5="ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p5x0p5.grb2" export mfcstcpl=${mfcstcpl:-1} export IGEN_OCNP=${IGEN_OCNP:-197} # PT This is the forecast date -export year=$(echo $CDATE | cut -c1-4) -export month=$(echo $CDATE | cut -c5-6) -export day=$(echo $CDATE | cut -c7-8) -export hour=$(echo $CDATE | cut -c9-10) +export year=${VDATE:0:4} +export month=${VDATE:4:2} +export 
day=${VDATE:6:2} +export hour=${VDATE:8:2} # PT This is the initialization date -export syear=$(echo $IDATE | cut -c1-4) -export smonth=$(echo $IDATE | cut -c5-6) -export sday=$(echo $IDATE | cut -c7-8) -export shour=$(echo $IDATE | cut -c9-10) +export syear=${IDATE:0:4} +export smonth=${IDATE:4:2} +export sday=${IDATE:6:2} +export shour=${IDATE:8:2} # PT Need to get this from above - could be 6 or 1 hour export hh_inc_ocn=6 @@ -63,11 +63,10 @@ export flatn=90. export flonw=0.0 export flone=359.75 -ln -sf $mask_file ./iceocnpost.g2 -$executable > reg2grb2.$CDATE.$IDATE.out +ln -sf "${mask_file}" ./iceocnpost.g2 +${executable} > "reg2grb2.${VDATE}.${IDATE}.out" # interpolated from 0p25 to 0p5 grid grid2p05="0 6 0 0 0 0 0 0 720 361 0 0 90000000 0 48 -90000000 359500000 500000 500000 0" -#### $NWPROD/util/exec/copygb2 -g "${grid2p05}" -i0 -x $outfile $outfile0p5 -$COPYGB2 -g "${grid2p05}" -i0 -x $outfile $outfile0p5 +${COPYGB2} -g "${grid2p05}" -i0 -x "${outfile}" "${outfile0p5}" diff --git a/scripts/run_regrid.sh b/scripts/run_regrid.sh index 6d18eeb6930..103e9a759e5 100755 --- a/scripts/run_regrid.sh +++ b/scripts/run_regrid.sh @@ -1,26 +1,27 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" -MOM6REGRID=${MOM6REGRID:-$HOMEgfs} -export EXEC_DIR=$MOM6REGRID/exec -export USH_DIR=$MOM6REGRID/ush -export COMOUTocean=$COMOUTocean -export COMOUTice=$COMOUTice -export IDATE=$IDATE -export ENSMEM=$ENSMEM -export FHR=$fhr -export DATA=$DATA -export FIXreg2grb2=$FIXreg2grb2 +MOM6REGRID="${MOM6REGRID:-${HOMEgfs}}" +export EXEC_DIR="${MOM6REGRID}/exec" +export USH_DIR="${MOM6REGRID}/ush" +export COMOUTocean="${COM_OCEAN_HISTORY}" +export COMOUTice="${COM_ICE_HISTORY}" +export IDATE="${IDATE}" +export VDATE="${VDATE}" +export ENSMEM="${ENSMEM}" +export FHR="${fhr}" +export DATA="${DATA}" +export FIXreg2grb2="${FIXreg2grb2}" ###### DO NOT MODIFY BELOW UNLESS YOU KNOW WHAT YOU ARE DOING ####### #Need NCL module to be loaded: -echo $NCARG_ROOT -export NCL=$NCARG_ROOT/bin/ncl +echo "${NCARG_ROOT}" +export NCL="${NCARG_ROOT}/bin/ncl" ls -alrt -$NCL $USH_DIR/icepost.ncl -$NCL $USH_DIR/ocnpost.ncl +${NCL} "${USH_DIR}/icepost.ncl" +${NCL} "${USH_DIR}/ocnpost.ncl" ##################################################################### diff --git a/sorc/build_all.sh b/sorc/build_all.sh index e3802855cd4..af15be7b1ff 100755 --- a/sorc/build_all.sh +++ b/sorc/build_all.sh @@ -16,7 +16,7 @@ function _usage() { Builds all of the global-workflow components by calling the individual build scripts in sequence. 
-Usage: $BASH_SOURCE [-a UFS_app][-c build_config][-h][-v] +Usage: ${BASH_SOURCE[0]} [-a UFS_app][-c build_config][-h][-v] -a UFS_app: Build a specific UFS app instead of the default -c build_config: @@ -29,9 +29,13 @@ EOF exit 1 } +script_dir=$(cd "$(dirname "${BASH_SOURCE[0]}")" &> /dev/null && pwd) +cd "${script_dir}" || exit 1 + _build_ufs_opt="" _ops_opt="" _verbose_opt="" +_partial_opt="" # Reset option counter in case this script is sourced OPTIND=1 while getopts ":a:c:hov" option; do @@ -40,14 +44,13 @@ while getopts ":a:c:hov" option; do c) _partial_opt+="-c ${OPTARG} ";; h) _usage;; o) _ops_opt+="-o";; - # s) _build_ufs_opt+="-s ${OPTARG} ";; v) _verbose_opt="-v";; - \?) - echo "[$BASH_SOURCE]: Unrecognized option: ${option}" + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage ;; - :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" usage ;; esac @@ -55,15 +58,14 @@ done shift $((OPTIND-1)) -build_dir=$(pwd) -logs_dir=$build_dir/logs -if [ ! -d $logs_dir ]; then +logs_dir="${script_dir}/logs" +if [[ ! -d "${logs_dir}" ]]; then echo "Creating logs folder" - mkdir $logs_dir + mkdir "${logs_dir}" || exit 1 fi # Check final exec folder exists -if [ ! -d "../exec" ]; then +if [[ ! -d "../exec" ]]; then echo "Creating ../exec folder" mkdir ../exec fi @@ -71,69 +73,101 @@ fi #------------------------------------ # GET MACHINE #------------------------------------ -target="" -source ./machine-setup.sh > /dev/null 2>&1 +export COMPILER="intel" +source gfs_utils.fd/ush/detect_machine.sh +source gfs_utils.fd/ush/module-setup.sh +if [[ -z "${MACHINE_ID}" ]]; then + echo "FATAL: Unable to determine target machine" + exit 1 +fi #------------------------------------ # INCLUDE PARTIAL BUILD #------------------------------------ -source ./partial_build.sh $_verbose_opt $_partial_opt +# Turn off some shellcheck warnings because we want to have +# variables with multiple arguments. 
+# shellcheck disable=SC2086,SC2248 +source ./partial_build.sh ${_verbose_opt} ${_partial_opt} +# shellcheck disable= -if [ $target = jet ]; then - Build_gsi=false - Build_gldas=false - Build_gfs_util=false - Build_ww3_prepost=false +# Disable gldas on Jet +if [[ ${MACHINE_ID} =~ jet.* ]]; then + Build_gldas="false" fi #------------------------------------ # Exception Handling Init #------------------------------------ +# Disable shellcheck warning about single quotes not being substituted. +# shellcheck disable=SC2016 ERRSCRIPT=${ERRSCRIPT:-'eval [[ $err = 0 ]]'} +# shellcheck disable= err=0 +#------------------------------------ +# build gfs_utils +#------------------------------------ +if [[ ${Build_gfs_utils} == 'true' ]]; then + echo " .... Building gfs_utils .... " + # shellcheck disable=SC2086,SC2248 + ./build_gfs_utils.sh ${_verbose_opt} > "${logs_dir}/build_gfs_utils.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building gfs_utils." + echo "The log file is in ${logs_dir}/build_gfs_utils.log" + fi + err=$((err + rc)) +fi + #------------------------------------ # build WW3 pre & post execs #------------------------------------ -$Build_ww3_prepost && { +if [[ ${Build_ww3_prepost} == "true" ]]; then echo " .... Building WW3 pre and post execs .... " - ./build_ww3prepost.sh ${_verbose_opt} ${_build_ufs_opt} > $logs_dir/build_ww3_prepost.log 2>&1 + # shellcheck disable=SC2086,SC2248 + ./build_ww3prepost.sh ${_verbose_opt} ${_build_ufs_opt} > "${logs_dir}/build_ww3_prepost.log" 2>&1 + # shellcheck disable= rc=$? - if [[ $rc -ne 0 ]] ; then + if (( rc != 0 )) ; then echo "Fatal error in building WW3 pre/post processing." 
- echo "The log file is in $logs_dir/build_ww3_prepost.log" + echo "The log file is in ${logs_dir}/build_ww3_prepost.log" fi - ((err+=$rc)) -} + err=$((err + rc)) +fi #------------------------------------ # build forecast model #------------------------------------ -$Build_ufs_model && { +if [[ ${Build_ufs_model} == 'true' ]]; then echo " .... Building forecast model .... " - ./build_ufs.sh $_verbose_opt ${_build_ufs_opt} > $logs_dir/build_ufs.log 2>&1 + # shellcheck disable=SC2086,SC2248 + ./build_ufs.sh ${_verbose_opt} ${_build_ufs_opt} > "${logs_dir}/build_ufs.log" 2>&1 + # shellcheck disable= rc=$? - if [[ $rc -ne 0 ]] ; then + if (( rc != 0 )) ; then echo "Fatal error in building UFS model." - echo "The log file is in $logs_dir/build_ufs.log" + echo "The log file is in ${logs_dir}/build_ufs.log" fi - ((err+=$rc)) -} + err=$((err + rc)) +fi #------------------------------------ # build GSI and EnKF - optional checkout #------------------------------------ -if [ -d gsi_enkf.fd ]; then - $Build_gsi_enkf && { - echo " .... Building gsi and enkf .... " - ./build_gsi_enkf.sh $_ops_opt $_verbose_opt > $logs_dir/build_gsi_enkf.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building gsi_enkf." - echo "The log file is in $logs_dir/build_gsi_enkf.log" +if [[ -d gsi_enkf.fd ]]; then + if [[ ${Build_gsi_enkf} == 'true' ]]; then + echo " .... Building gsi and enkf .... " + # shellcheck disable=SC2086,SC2248 + ./build_gsi_enkf.sh ${_ops_opt} ${_verbose_opt} > "${logs_dir}/build_gsi_enkf.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building gsi_enkf." + echo "The log file is in ${logs_dir}/build_gsi_enkf.log" + fi + err=$((err + rc)) fi - ((err+=$rc)) -} else echo " .... Skip building gsi and enkf .... " fi @@ -141,17 +175,19 @@ fi #------------------------------------ # build gsi utilities #------------------------------------ -if [ -d gsi_utils.fd ]; then - $Build_gsi_utils && { - echo " .... 
Building gsi utilities .... " - ./build_gsi_utils.sh $_ops_opt $_verbose_opt > $logs_dir/build_gsi_utils.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building gsi utilities." - echo "The log file is in $logs_dir/build_gsi_utils.log" +if [[ -d gsi_utils.fd ]]; then + if [[ ${Build_gsi_utils} == 'true' ]]; then + echo " .... Building gsi utilities .... " + # shellcheck disable=SC2086,SC2248 + ./build_gsi_utils.sh ${_ops_opt} ${_verbose_opt} > "${logs_dir}/build_gsi_utils.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building gsi utilities." + echo "The log file is in ${logs_dir}/build_gsi_utils.log" + fi + err=$((err + rc)) fi - ((err+=$rc)) -} else echo " .... Skip building gsi utilities .... " fi @@ -159,17 +195,19 @@ fi #------------------------------------ # build gdas - optional checkout #------------------------------------ -if [ -d gdas.cd ]; then - $Build_gdas && { - echo " .... Building GDASApp .... " - ./build_gdas.sh $_verbose_opt > $logs_dir/build_gdas.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building GDASApp." - echo "The log file is in $logs_dir/build_gdas.log" +if [[ -d gdas.cd ]]; then + if [[ ${Build_gdas} == 'true' ]]; then + echo " .... Building GDASApp .... " + # shellcheck disable=SC2086,SC2248 + ./build_gdas.sh ${_verbose_opt} > "${logs_dir}/build_gdas.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building GDASApp." + echo "The log file is in ${logs_dir}/build_gdas.log" + fi + err=$((err + rc)) fi - ((err+=$rc)) -} else echo " .... Skip building GDASApp .... " fi @@ -177,17 +215,19 @@ fi #------------------------------------ # build gsi monitor #------------------------------------ -if [ -d gsi_monitor.fd ]; then - $Build_gsi_monitor && { - echo " .... Building gsi monitor .... " - ./build_gsi_monitor.sh $_ops_opt $_verbose_opt > $logs_dir/build_gsi_monitor.log 2>&1 - rc=$? 
- if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building gsi monitor." - echo "The log file is in $logs_dir/build_gsi_monitor.log" +if [[ -d gsi_monitor.fd ]]; then + if [[ ${Build_gsi_monitor} == 'true' ]]; then + echo " .... Building gsi monitor .... " + # shellcheck disable=SC2086,SC2248 + ./build_gsi_monitor.sh ${_ops_opt} ${_verbose_opt} > "${logs_dir}/build_gsi_monitor.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building gsi monitor." + echo "The log file is in ${logs_dir}/build_gsi_monitor.log" + fi + err=$((err + rc)) fi - ((err+=$rc)) -} else echo " .... Skip building gsi monitor .... " fi @@ -195,45 +235,51 @@ fi #------------------------------------ # build UPP #------------------------------------ -$Build_upp && { +if [[ ${Build_upp} == 'true' ]]; then echo " .... Building UPP .... " - ./build_upp.sh $_ops_opt $_verbose_opt > $logs_dir/build_upp.log 2>&1 + # shellcheck disable=SC2086,SC2248 + ./build_upp.sh ${_ops_opt} ${_verbose_opt} > "${logs_dir}/build_upp.log" 2>&1 + # shellcheck disable= rc=$? - if [[ $rc -ne 0 ]] ; then + if (( rc != 0 )) ; then echo "Fatal error in building UPP." - echo "The log file is in $logs_dir/build_upp.log" + echo "The log file is in ${logs_dir}/build_upp.log" fi - ((err+=$rc)) -} + err=$((err + rc)) +fi #------------------------------------ # build ufs_utils #------------------------------------ -$Build_ufs_utils && { +if [[ ${Build_ufs_utils} == 'true' ]]; then echo " .... Building ufs_utils .... " - ./build_ufs_utils.sh $_verbose_opt > $logs_dir/build_ufs_utils.log 2>&1 + # shellcheck disable=SC2086,SC2248 + ./build_ufs_utils.sh ${_verbose_opt} > "${logs_dir}/build_ufs_utils.log" 2>&1 + # shellcheck disable= rc=$? - if [[ $rc -ne 0 ]] ; then + if (( rc != 0 )) ; then echo "Fatal error in building ufs_utils." 
- echo "The log file is in $logs_dir/build_ufs_utils.log" + echo "The log file is in ${logs_dir}/build_ufs_utils.log" fi - ((err+=$rc)) -} + err=$((err + rc)) +fi #------------------------------------ # build gldas #------------------------------------ -if [ -d gldas.fd ]; then - $Build_gldas && { - echo " .... Building gldas .... " - ./build_gldas.sh $_verbose_opt > $logs_dir/build_gldas.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building gldas." - echo "The log file is in $logs_dir/build_gldas.log" +if [[ -d gldas.fd ]]; then + if [[ ${Build_gldas} == 'true' ]]; then + echo " .... Building gldas .... " + # shellcheck disable=SC2086,SC2248 + ./build_gldas.sh ${_verbose_opt} > "${logs_dir}/build_gldas.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building gldas." + echo "The log file is in ${logs_dir}/build_gldas.log" + fi + err=$((err + rc)) fi - ((err+=$rc)) -} else echo " .... Skip building gldas .... " fi @@ -241,52 +287,31 @@ fi #------------------------------------ # build gfs_wafs - optional checkout #------------------------------------ -if [ -d gfs_wafs.fd ]; then - $Build_gfs_wafs && { - echo " .... Building gfs_wafs .... " - ./build_gfs_wafs.sh $_verbose_opt > $logs_dir/build_gfs_wafs.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building gfs_wafs." - echo "The log file is in $logs_dir/build_gfs_wafs.log" +if [[ -d gfs_wafs.fd ]]; then + if [[ ${Build_gfs_wafs} == 'true' ]]; then + echo " .... Building gfs_wafs .... " + # shellcheck disable=SC2086,SC2248 + ./build_gfs_wafs.sh ${_verbose_opt} > "${logs_dir}/build_gfs_wafs.log" 2>&1 + # shellcheck disable= + rc=$? + if (( rc != 0 )) ; then + echo "Fatal error in building gfs_wafs." 
+ echo "The log file is in ${logs_dir}/build_gfs_wafs.log" + fi + err=$((err + rc)) fi - ((err+=$rc)) -} fi -#------------------------------------ -# build workflow_utils -#------------------------------------ -$Build_workflow_utils && { - echo " .... Building workflow_utils .... " - target=$target ./build_workflow_utils.sh $_verbose_opt > $logs_dir/build_workflow_utils.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building workflow_utils." - echo "The log file is in $logs_dir/build_workflow_utils.log" - fi - ((err+=$rc)) -} - -#------------------------------------ -# build gfs_util -#------------------------------------ -$Build_gfs_util && { - echo " .... Building gfs_util .... " - ./build_gfs_util.sh $_verbose_opt > $logs_dir/build_gfs_util.log 2>&1 - rc=$? - if [[ $rc -ne 0 ]] ; then - echo "Fatal error in building gfs_util." - echo "The log file is in $logs_dir/build_gfs_util.log" - fi - ((err+=$rc)) -} - #------------------------------------ # Exception Handling #------------------------------------ -[[ $err -ne 0 ]] && echo "FATAL BUILD ERROR: Please check the log file for detail, ABORT!" -$ERRSCRIPT || exit $err +if (( err != 0 )); then + cat << EOF +BUILD ERROR: One or more components failed to build + Check the associated build log(s) for details. +EOF + ${ERRSCRIPT} || exit "${err}" +fi echo;echo " .... Build system finished .... " diff --git a/sorc/build_gdas.sh b/sorc/build_gdas.sh index f9238c9ab06..39cf5ac9a72 100755 --- a/sorc/build_gdas.sh +++ b/sorc/build_gdas.sh @@ -1,26 +1,29 @@ #! 
/usr/bin/env bash set -eux -source ./machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) +OPTIND=1 +while getopts ":dov" option; do + case "${option}" in + d) export BUILD_TYPE="DEBUG";; + v) export BUILD_VERBOSE="YES";; + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" + usage + ;; + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" + usage + ;; + esac +done +shift $((OPTIND-1)) -export BUILD_TARGET=$target +# TODO: GDASApp does not presently handle BUILD_TYPE -# use more build jobs if on NOAA HPC -build_jobs=4 -case "${target}" in - hera|orion) - build_jobs=10 - ;; -esac - -# Check final exec folder exists -if [ ! -d "../exec" ]; then - mkdir ../exec -fi - -cd gdas.cd -BUILD_JOBS=$build_jobs ./build.sh -t $BUILD_TARGET +BUILD_TYPE=${BUILD_TYPE:-"Release"} \ +BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ +BUILD_JOBS="${BUILD_JOBS:-8}" \ +WORKFLOW_BUILD="ON" \ +./gdas.cd/build.sh exit - diff --git a/sorc/build_gfs_util.sh b/sorc/build_gfs_util.sh deleted file mode 100755 index 675d1c9609e..00000000000 --- a/sorc/build_gfs_util.sh +++ /dev/null @@ -1,21 +0,0 @@ -#! /usr/bin/env bash -set -eux - -source ./machine-setup.sh > /dev/null 2>&1 -export dir=$( pwd ) - -cd ../util/sorc - -# Check for gfs_util folders exist -if [ ! -d "./mkgfsawps.fd" ]; then - echo " " - echo " GFS_UTIL folders DO NOT exist " - echo " " - exit -fi - -echo "" -echo " Building ... Executables for GFS_UTILITIES " -echo "" - -source ./compile_gfs_util_wcoss.sh diff --git a/sorc/build_gfs_utils.sh b/sorc/build_gfs_utils.sh new file mode 100755 index 00000000000..2a7a6112393 --- /dev/null +++ b/sorc/build_gfs_utils.sh @@ -0,0 +1,45 @@ +#! /usr/bin/env bash +set -eux + +function usage() { + cat << EOF +Builds the GFS utility programs. 
+ +Usage: ${BASH_SOURCE[0]} [-d][-h][-v] + -d: + Build with debug options + -h: + Print this help message and exit + -v: + Turn on verbose output +EOF + exit 1 +} + +cwd=$(pwd) + +OPTIND=1 +while getopts ":dvh" option; do + case "${option}" in + d) export BUILD_TYPE="DEBUG";; + v) export BUILD_VERBOSE="YES";; + h) + usage + ;; + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" + usage + ;; + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" + usage + ;; + esac +done +shift $((OPTIND-1)) + +BUILD_TYPE=${BUILD_TYPE:-"Release"} \ +BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ +"${cwd}/gfs_utils.fd/ush/build.sh" + +exit diff --git a/sorc/build_gfs_wafs.sh b/sorc/build_gfs_wafs.sh index 7ddde2d6789..cbbf6ec9507 100755 --- a/sorc/build_gfs_wafs.sh +++ b/sorc/build_gfs_wafs.sh @@ -1,11 +1,11 @@ #! /usr/bin/env bash set -eux -source ./machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) +script_dir=$(dirname "${BASH_SOURCE[0]}") +cd "${script_dir}" || exit 1 # Check final exec folder exists -if [ ! -d "../exec" ]; then +if [[ ! -d "../exec" ]]; then mkdir ../exec fi diff --git a/sorc/build_gldas.sh b/sorc/build_gldas.sh index 635c2bee174..05963b93482 100755 --- a/sorc/build_gldas.sh +++ b/sorc/build_gldas.sh @@ -1,11 +1,11 @@ #! /usr/bin/env bash set -eux -source ./machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) +script_dir=$(dirname "${BASH_SOURCE[0]}") +cd "${script_dir}" || exit 1 # Check final exec folder exists -if [ ! -d "../exec" ]; then +if [[ ! -d "../exec" ]]; then mkdir ../exec fi diff --git a/sorc/build_gsi_enkf.sh b/sorc/build_gsi_enkf.sh index 1adb80061b3..671c3d6205b 100755 --- a/sorc/build_gsi_enkf.sh +++ b/sorc/build_gsi_enkf.sh @@ -1,20 +1,18 @@ #! /usr/bin/env bash set -eux -cwd=$(pwd) - OPTIND=1 while getopts ":dov" option; do case "${option}" in d) export BUILD_TYPE="DEBUG";; o) _ops="YES";; v) export BUILD_VERBOSE="YES";; - \?) 
- echo "[$BASH_SOURCE]: Unrecognized option: ${option}" + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage ;; - :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" usage ;; esac diff --git a/sorc/build_gsi_monitor.sh b/sorc/build_gsi_monitor.sh index 0fa78044d8c..ec3645e52fd 100755 --- a/sorc/build_gsi_monitor.sh +++ b/sorc/build_gsi_monitor.sh @@ -9,12 +9,12 @@ while getopts ":dov" option; do d) export BUILD_TYPE="DEBUG";; o) _ops="YES";; v) export BUILD_VERBOSE="YES";; - \?) - echo "[$BASH_SOURCE]: Unrecognized option: ${option}" + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage ;; - :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" usage ;; esac @@ -23,6 +23,6 @@ shift $((OPTIND-1)) BUILD_TYPE=${BUILD_TYPE:-"Release"} \ BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ -${cwd}/gsi_monitor.fd/ush/build.sh +"${cwd}/gsi_monitor.fd/ush/build.sh" exit diff --git a/sorc/build_gsi_utils.sh b/sorc/build_gsi_utils.sh index bc579300d1d..bcbc110cf6b 100755 --- a/sorc/build_gsi_utils.sh +++ b/sorc/build_gsi_utils.sh @@ -9,12 +9,12 @@ while getopts ":dov" option; do d) export BUILD_TYPE="DEBUG";; o) _ops="YES";; # TODO - unused; remove? v) export BUILD_VERBOSE="YES";; - \?) 
- echo "[$BASH_SOURCE]: Unrecognized option: ${option}" + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage ;; - :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" usage ;; esac @@ -24,6 +24,6 @@ shift $((OPTIND-1)) BUILD_TYPE=${BUILD_TYPE:-"Release"} \ BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ UTIL_OPTS="-DBUILD_UTIL_ENKF_GFS=ON -DBUILD_UTIL_NCIO=ON" \ -${cwd}/gsi_utils.fd/ush/build.sh +"${cwd}/gsi_utils.fd/ush/build.sh" exit diff --git a/sorc/build_ufs.sh b/sorc/build_ufs.sh index 9cba966caab..130476f3a85 100755 --- a/sorc/build_ufs.sh +++ b/sorc/build_ufs.sh @@ -6,34 +6,36 @@ cwd=$(pwd) # Default settings APP="S2SWA" CCPP_SUITES="FV3_GFS_v16,FV3_GFS_v16_ugwpv1,FV3_GFS_v17_p8,FV3_RAP_noah_sfcdiff_unified_ugwp,FV3_GFS_v17_p8,FV3_GFS_v17_p8_mynn,FV3_GFS_v17_p8_gf_mynn" -#JKHCCPP_SUITES="FV3_GFS_v16,FV3_GFS_v16_ugwpv1,FV3_GFS_v17_p8,FV3_GFS_v16_coupled_nsstNoahmpUGWPv1,FV3_GFS_v17_coupled_p8" +#JKHCCPP_SUITES="FV3_GFS_v16,FV3_GFS_v16_no_nsst,FV3_GFS_v16_ugwpv1,FV3_GFS_v17_p8,FV3_GFS_v16_coupled_nsstNoahmpUGWPv1,FV3_GFS_v17_coupled_p8" +export RT_COMPILER="intel" +source "${cwd}/ufs_model.fd/tests/detect_machine.sh" +source "${cwd}/ufs_model.fd/tests/module-setup.sh" while getopts ":da:v" option; do case "${option}" in - d) BUILD_TYPE="Debug";; + d) BUILD_TYPE="DEBUG";; a) APP="${OPTARG}" ;; - v) BUILD_VERBOSE="YES";; - \?) 
- echo "[$BASH_SOURCE]: Unrecognized option: ${option}" - ;; + v) export BUILD_VERBOSE="YES";; :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" + ;; + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" ;; esac done -cd $cwd/ufs_model.fd +cd "${cwd}/ufs_model.fd" -export RT_COMPILER="intel" -source $cwd/ufs_model.fd/tests/detect_machine.sh -MAKE_OPT="-DAPP=${APP} -DCCPP_SUITES=${CCPP_SUITES}" +MAKE_OPT="-DAPP=${APP} -D32BIT=ON -DCCPP_SUITES=${CCPP_SUITES}" [[ ${BUILD_TYPE:-"Release"} = "DEBUG" ]] && MAKE_OPT+=" -DDEBUG=ON" COMPILE_NR=0 CLEAN_BEFORE=YES CLEAN_AFTER=NO -./tests/compile.sh $MACHINE_ID "$MAKE_OPT" $COMPILE_NR $CLEAN_BEFORE $CLEAN_AFTER -mv ./tests/fv3_${COMPILE_NR}.exe ./tests/ufs_model.x -mv ./tests/modules.fv3_${COMPILE_NR} ./tests/modules.ufs_model +./tests/compile.sh "${MACHINE_ID}" "${MAKE_OPT}" "${COMPILE_NR}" "${CLEAN_BEFORE}" "${CLEAN_AFTER}" +mv "./tests/fv3_${COMPILE_NR}.exe" ./tests/ufs_model.x +mv "./tests/modules.fv3_${COMPILE_NR}.lua" ./tests/modules.ufs_model.lua +cp "./modulefiles/ufs_common.lua" ./tests/ufs_common.lua exit 0 diff --git a/sorc/build_ufs_utils.sh b/sorc/build_ufs_utils.sh index 480dda9b89f..5e2edf0737f 100755 --- a/sorc/build_ufs_utils.sh +++ b/sorc/build_ufs_utils.sh @@ -1,12 +1,10 @@ #! /usr/bin/env bash set -eux -source ./machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) +script_dir=$(dirname "${BASH_SOURCE[0]}") +cd "${script_dir}/ufs_utils.fd" || exit 1 -cd ufs_utils.fd - -./build_all.sh +CMAKE_OPTS="-DGFS=ON" ./build_all.sh exit diff --git a/sorc/build_upp.sh b/sorc/build_upp.sh index 4db0321e2d8..67460487a64 100755 --- a/sorc/build_upp.sh +++ b/sorc/build_upp.sh @@ -1,8 +1,8 @@ #! 
/usr/bin/env bash set -eux -source ./machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) +script_dir=$(dirname "${BASH_SOURCE[0]}") +cd "${script_dir}" || exit 1 OPTIND=1 _opts="" @@ -11,12 +11,12 @@ while getopts ":dov" option; do d) export BUILD_TYPE="DEBUG";; o) _opts+="-g ";; v) _opts+="-v ";; - \?) - echo "[$BASH_SOURCE]: Unrecognized option: ${option}" + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage ;; - :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" usage ;; esac @@ -24,9 +24,10 @@ done shift $((OPTIND-1)) # Check final exec folder exists -if [ ! -d "../exec" ]; then +if [[ ! -d "../exec" ]]; then mkdir ../exec fi cd ufs_model.fd/FV3/upp/tests +# shellcheck disable=SC2086 ./compile_upp.sh ${_opts} diff --git a/sorc/build_ww3prepost.sh b/sorc/build_ww3prepost.sh index d7017c0e138..bf78e7b2ac0 100755 --- a/sorc/build_ww3prepost.sh +++ b/sorc/build_ww3prepost.sh @@ -1,47 +1,57 @@ -#!/bin/sh +#! /usr/bin/env bash set -x +script_dir=$(dirname "${BASH_SOURCE[0]}") +cd "${script_dir}" || exit 1 + +export RT_COMPILER="intel" +source "${script_dir}/ufs_model.fd/tests/detect_machine.sh" +source "${script_dir}/ufs_model.fd/tests/module-setup.sh" + # Default settings APP="S2SWA" while getopts "a:v" option; do case "${option}" in a) APP="${OPTARG}" ;; - v) BUILD_VERBOSE="YES";; + v) export BUILD_VERBOSE="YES";; + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" + usage + ;; *) - echo "Unrecognized option: ${1}" - exit 1 + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" + usage ;; esac done # Determine which switch to use -if [ APP == ATMW ]; then - ww3switch=model/esmf/switch +if [[ "${APP}" == "ATMW" ]]; then + ww3switch="model/esmf/switch" else - ww3switch=model/bin/switch_meshcap + ww3switch="model/bin/switch_meshcap" fi # Check final exec folder exists -if [ ! -d "../exec" ]; then +if [[ ! 
-d "../exec" ]]; then mkdir ../exec fi -finalexecdir=$( pwd -P )/../exec +finalexecdir="$( pwd -P )/../exec" #Determine machine and load modules set +x -source ./machine-setup.sh > /dev/null 2>&1 - -module use ../modulefiles -module load modulefile.ww3.$target +module use "${script_dir}/ufs_model.fd/modulefiles" +module load "ufs_${MACHINE_ID}" set -x #Set WW3 directory, switch, prep and post exes -cd ufs_model.fd/WW3 -export WW3_DIR=$( pwd -P ) +cd ufs_model.fd/WW3 || exit 1 +WW3_DIR=$( pwd -P ) +export WW3_DIR export SWITCHFILE="${WW3_DIR}/${ww3switch}" # Build exes for prep jobs and post jobs: @@ -49,12 +59,12 @@ prep_exes="ww3_grid ww3_prep ww3_prnc ww3_grid" post_exes="ww3_outp ww3_outf ww3_outp ww3_gint ww3_ounf ww3_ounp ww3_grib" #create build directory: -path_build=$WW3_DIR/build_SHRD -mkdir -p $path_build -cd $path_build +path_build="${WW3_DIR}/build_SHRD" +mkdir -p "${path_build}" || exit 1 +cd "${path_build}" || exit 1 echo "Forcing a SHRD build" -echo $(cat ${SWITCHFILE}) > ${path_build}/tempswitch +cat "${SWITCHFILE}" > "${path_build}/tempswitch" sed -e "s/DIST/SHRD/g"\ -e "s/OMPG / /g"\ @@ -64,44 +74,44 @@ sed -e "s/DIST/SHRD/g"\ -e "s/B4B / /g"\ -e "s/PDLIB / /g"\ -e "s/NOGRB/NCEP2/g"\ - ${path_build}/tempswitch > ${path_build}/switch -rm ${path_build}/tempswitch + "${path_build}/tempswitch" > "${path_build}/switch" +rm "${path_build}/tempswitch" -echo "Switch file is $path_build/switch with switches:" -cat $path_build/switch +echo "Switch file is ${path_build}/switch with switches:" +cat "${path_build}/switch" #Build executables: -cmake $WW3_DIR -DSWITCH=$path_build/switch -DCMAKE_INSTALL_PREFIX=install +cmake "${WW3_DIR}" -DSWITCH="${path_build}/switch" -DCMAKE_INSTALL_PREFIX=install rc=$? -if [[ $rc -ne 0 ]] ; then +if (( rc != 0 )); then echo "Fatal error in cmake." - exit $rc + exit "${rc}" fi make -j 8 rc=$? -if [[ $rc -ne 0 ]] ; then +if (( rc != 0 )); then echo "Fatal error in make." 
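The WW3 build above checks `rc=$?` immediately after each step (`cmake`, `make`, the copy loop) and exits with that code on failure. A standalone sketch of the same capture-then-check pattern, using a hypothetical `run_step` helper and `:` as a stand-in for the real commands:

```shell
#! /usr/bin/env bash
# Sketch of the step-check pattern: capture $? right after each build
# step, report a labeled error, and propagate the failing code.
# "run_step" is a hypothetical helper, not part of the workflow scripts.

run_step() {
  local label=$1; shift
  "$@"
  local rc=$?   # $? is expanded before "local" runs, so rc gets "$@"'s status
  if (( rc != 0 )); then
    echo "Fatal error in ${label}."
    return "${rc}"
  fi
}

run_step "cmake" :   # stand-in for: cmake "${WW3_DIR}" -DSWITCH=... 
run_step "make" :    # stand-in for: make -j 8
echo "all steps passed"
```

Capturing `$?` into a variable on the very next line matters: any intervening command (even an `echo`) would overwrite the status being tested.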
- exit $rc + exit "${rc}" fi make install -if [[ $rc -ne 0 ]] ; then +if (( rc != 0 )); then echo "Fatal error in make install." - exit $rc + exit "${rc}" fi # Copy to top-level exe directory -for prog in $prep_exes $post_exes; do - cp $path_build/install/bin/$prog $finalexecdir/ +for prog in ${prep_exes} ${post_exes}; do + cp "${path_build}/install/bin/${prog}" "${finalexecdir}/" rc=$? - if [[ $rc -ne 0 ]] ; then - echo "FATAL: Unable to copy $path_build/$prog to $finalexecdir (Error code $rc)" - exit $rc + if (( rc != 0 )); then + echo "FATAL: Unable to copy ${path_build}/${prog} to ${finalexecdir} (Error code ${rc})" + exit "${rc}" fi done #clean-up build directory: -echo "executables are in $finalexecdir" -echo "cleaning up $path_build" -rm -rf $path_build +echo "executables are in ${finalexecdir}" +echo "cleaning up ${path_build}" +rm -rf "${path_build}" exit 0 diff --git a/sorc/checkout.sh b/sorc/checkout.sh index 234864d1ebc..3a37912fe09 100755 --- a/sorc/checkout.sh +++ b/sorc/checkout.sh @@ -1,4 +1,5 @@ -#!/bin/sh +#! /usr/bin/env bash + set +x set -u @@ -9,7 +10,7 @@ Clones and checks out external components necessary for cloning and just check out the requested version (unless -c option is used). -Usage: $BASH_SOURCE [-c][-h][-m ufs_hash][-o] +Usage: ${BASH_SOURCE[0]} [-c][-h][-m ufs_hash][-o] -c: Create a fresh clone (delete existing directories) -h: @@ -18,7 +19,7 @@ Usage: $BASH_SOURCE [-c][-h][-m ufs_hash][-o] Check out this UFS hash instead of the default -o: Check out operational-only code (GTG and WAFS) - -g: + -g: Check out GSI for GSI-based DA -u: Check out GDASApp for UFS-based DA @@ -49,8 +50,9 @@ function checkout() { dir="$1" remote="$2" version="$3" + recursive=${4:-"YES"} - name=$(echo ${dir} | cut -d '.' -f 1) + name=$(echo "${dir}" | cut -d '.' 
-f 1) echo "Performing checkout of ${name}" logfile="${logdir:-$(pwd)}/checkout_${name}.log" @@ -59,10 +61,10 @@ function checkout() { rm "${logfile}" fi - cd "${topdir}" - if [[ -d "${dir}" && $CLEAN == "YES" ]]; then + cd "${topdir}" || exit 1 + if [[ -d "${dir}" && ${CLEAN} == "YES" ]]; then echo "|-- Removing existing clone in ${dir}" - rm -Rf "$dir" + rm -Rf "${dir}" fi if [[ ! -d "${dir}" ]]; then echo "|-- Cloning from ${remote} into ${dir}" @@ -73,10 +75,10 @@ function checkout() { echo return "${status}" fi - cd "${dir}" + cd "${dir}" || exit 1 else # Fetch any updates from server - cd "${dir}" + cd "${dir}" || exit 1 echo "|-- Fetching updates from ${remote}" git fetch fi @@ -88,13 +90,15 @@ function checkout() { echo return "${status}" fi - git submodule update --init --recursive >> "${logfile}" 2>&1 - echo "|-- Updating submodules (if any)" - status=$? - if ((status > 0)); then - echo " WARNING: Error while updating submodules of ${name}" - echo - return "${status}" + if [[ "${recursive}" == "YES" ]]; then + echo "|-- Updating submodules (if any)" + git submodule update --init --recursive >> "${logfile}" 2>&1 + status=$? 
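The `checkout()` change above adds an optional fourth argument, `recursive`, defaulting to `"YES"`, so callers can skip submodule updates. A sketch of just that decision logic, with a hypothetical `plan_submodules` function that prints the chosen action instead of invoking git:

```shell
#! /usr/bin/env bash
# Sketch of checkout()'s new optional-recursion argument: submodule
# recursion defaults to "YES" and is skipped only when the caller
# passes "NO" (as the GSI checkout now does). "plan_submodules" is a
# hypothetical stand-in that prints the decision rather than running git.

plan_submodules() {
  local recursive=${1:-"YES"}   # same ${4:-"YES"} default as checkout()
  if [[ "${recursive}" == "YES" ]]; then
    echo "git submodule update --init --recursive"
  else
    echo "skip submodules"
  fi
}

plan_submodules        # default: recurse
plan_submodules "NO"   # explicit opt-out
```

Defaulting with `${4:-"YES"}` keeps every existing three-argument call site working unchanged while letting new callers opt out.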
+ if ((status > 0)); then + echo " WARNING: Error while updating submodules of ${name}" + echo + return "${status}" + fi fi echo return 0 @@ -102,29 +106,27 @@ function checkout() { # Set defaults for variables toggled by options export CLEAN="NO" -CHECKOUT_GSI="NO" -CHECKOUT_GDAS="NO" +checkout_gsi="NO" +checkout_gdas="NO" checkout_gtg="NO" checkout_wafs="NO" checkout_aeroconv="NO" -#JKHufs_model_hash="HFIP2022-08-17" -ufs_model_hash="Prototype-P8" # Parse command line arguments while getopts ":chgum:o" option; do - case $option in + case ${option} in c) - echo "Recieved -c flag, will delete any existing directories and start clean" + echo "Received -c flag, will delete any existing directories and start clean" export CLEAN="YES" ;; g) - echo "Receieved -g flag for optional checkout of GSI-based DA" - CHECKOUT_GSI="YES" + echo "Received -g flag for optional checkout of GSI-based DA" + checkout_gsi="YES" ;; h) usage;; u) echo "Received -u flag for optional checkout of UFS-based DA" - CHECKOUT_GDAS="YES" + checkout_gdas="YES" ;; o) echo "Received -o flag for optional checkout of operational-only codes" @@ -132,58 +134,65 @@ while getopts ":chgum:o" option; do checkout_wafs="YES" ;; m) - echo "Received -m flag with argument, will check out ufs-weather-model hash $OPTARG instead of default" - ufs_model_hash=$OPTARG + echo "Received -m flag with argument, will check out ufs-weather-model hash ${OPTARG} instead of default" + ufs_model_hash=${OPTARG} ;; :) - echo "option -$OPTARG needs an argument" + echo "option -${OPTARG} needs an argument" usage ;; *) - echo "invalid option -$OPTARG, exiting..." + echo "invalid option -${OPTARG}, exiting..." 
usage ;; esac done shift $((OPTIND-1)) -export topdir=$(pwd) +topdir=$(cd "$(dirname "${BASH_SOURCE[0]}")" &> /dev/null && pwd) +export topdir export logdir="${topdir}/logs" -mkdir -p ${logdir} +mkdir -p "${logdir}" # The checkout version should always be a specific commit (hash or tag), not a branch errs=0 -#JKHcheckout "ufs_model.fd" "https://github.com/NOAA-GSL/ufs-weather-model" "${ufs_model_hash}"; errs=$((errs + $?)) -checkout "ufs_model.fd" "https://github.com/ufs-community/ufs-weather-model" "${ufs_model_hash}"; errs=$((errs + $?)) -if [[ -d ufs_model.fd_gsl ]]; then - rsync -avx ufs_model.fd_gsl/ ufs_model.fd/ ## copy over GSL changes not in UFS repository -fi -#JKHcheckout "ufs_utils.fd" "https://github.com/ufs-community/UFS_UTILS.git" "a2b0817" ; errs=$((errs + $?)) -checkout "ufs_utils.fd" "https://github.com/ufs-community/UFS_UTILS.git" "ufs_utils_1_8_0" ; errs=$((errs + $?)) +checkout "gfs_utils.fd" "https://github.com/NOAA-EMC/gfs-utils" "0b8ff56" ; errs=$((errs + $?)) + +checkout "ufs_utils.fd" "https://github.com/ufs-community/UFS_UTILS.git" "4e673bf" ; errs=$((errs + $?)) +## JKH if [[ -d ufs_utils.fd_gsl ]]; then rsync -avx ufs_utils.fd_gsl/ ufs_utils.fd/ ## copy over GSL changes not in UFS_UTILS repository fi -checkout "verif-global.fd" "https://github.com/NOAA-EMC/EMC_verif-global.git" "c267780" ; errs=$((errs + $?)) +## JKH + +checkout "ufs_model.fd" "https://github.com/ufs-community/ufs-weather-model" "${ufs_model_hash:-2247060}" ; errs=$((errs + $?)) +## JKH +if [[ -d ufs_model.fd_gsl ]]; then + rsync -avx ufs_model.fd_gsl/ ufs_model.fd/ ## copy over GSL changes not in UFS repository +fi +## JKH + +checkout "verif-global.fd" "https://github.com/NOAA-EMC/EMC_verif-global.git" "c267780" ; errs=$((errs + $?)) -if [[ $CHECKOUT_GSI == "YES" ]]; then - checkout "gsi_enkf.fd" "https://github.com/NOAA-EMC/GSI.git" "67f5ab4"; errs=$((errs + $?)) +if [[ ${checkout_gsi} == "YES" ]]; then + checkout "gsi_enkf.fd" "https://github.com/NOAA-EMC/GSI.git"

"113e307" "NO"; errs=$((errs + $?)) fi -if [[ $CHECKOUT_GDAS == "YES" ]]; then - checkout "gdas.cd" "https://github.com/NOAA-EMC/GDASApp.git" "5952c9d"; errs=$((errs + $?)) +if [[ ${checkout_gdas} == "YES" ]]; then + checkout "gdas.cd" "https://github.com/NOAA-EMC/GDASApp.git" "aaf7caa"; errs=$((errs + $?)) fi -if [[ $CHECKOUT_GSI == "YES" || $CHECKOUT_GDAS == "YES" ]]; then +if [[ ${checkout_gsi} == "YES" || ${checkout_gdas} == "YES" ]]; then checkout "gsi_utils.fd" "https://github.com/NOAA-EMC/GSI-Utils.git" "322cc7b"; errs=$((errs + $?)) - checkout "gsi_monitor.fd" "https://github.com/NOAA-EMC/GSI-Monitor.git" "acf8870"; errs=$((errs + $?)) + checkout "gsi_monitor.fd" "https://github.com/NOAA-EMC/GSI-Monitor.git" "45783e3"; errs=$((errs + $?)) checkout "gldas.fd" "https://github.com/NOAA-EMC/GLDAS.git" "fd8ba62"; errs=$((errs + $?)) fi -if [[ $checkout_wafs == "YES" ]]; then +if [[ ${checkout_wafs} == "YES" ]]; then checkout "gfs_wafs.fd" "https://github.com/NOAA-EMC/EMC_gfs_wafs.git" "014a0b8"; errs=$((errs + $?)) fi -if [[ $checkout_gtg == "YES" ]]; then +if [[ ${checkout_gtg} == "YES" ]]; then ################################################################################ # checkout_gtg ## yes: The gtg code at NCAR private repository is available for ops. GFS only. @@ -192,7 +201,7 @@ if [[ $checkout_gtg == "YES" ]]; then ################################################################################ echo "Checking out GTG extension for UPP" - cd "${topdir}/ufs_model.fd/FV3/upp" + cd "${topdir}/ufs_model.fd/FV3/upp" || exit 1 logfile="${logdir}/checkout_gtg.log" git -c submodule."post_gtg.fd".update=checkout submodule update --init --recursive >> "${logfile}" 2>&1 status=$? 
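The checkout calls above all follow the shape `checkout ... ; errs=$((errs + $?))`, so the script can finish all checkouts and warn once at the end. A standalone sketch of that accounting, using a hypothetical `fake_checkout` stand-in for the real `checkout()` function:

```shell
#! /usr/bin/env bash
# Sketch of checkout.sh's error accounting: each call's exit status is
# added to "errs" immediately via $?, so the script can report how many
# components failed instead of aborting on the first failure.
# "fake_checkout" is a hypothetical stand-in for checkout().

fake_checkout() { return "$1"; }

errs=0
fake_checkout 0; errs=$((errs + $?))   # success contributes 0
fake_checkout 1; errs=$((errs + $?))   # each failure contributes its code
fake_checkout 1; errs=$((errs + $?))

if (( errs > 0 )); then
  echo "WARNING: One or more errors encountered during checkout process"
fi
echo "errs=${errs}"
```

Note this sketch, like the original, assumes the script does not run under `set -e`; with errexit enabled, the first failing `checkout` would abort before `errs` was updated.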
@@ -211,4 +220,4 @@ if (( errs > 0 )); then echo "WARNING: One or more errors encountered during checkout process, please check logs before building" fi echo -exit $errs +exit "${errs}" diff --git a/sorc/enkf_chgres_recenter.fd/.gitignore b/sorc/enkf_chgres_recenter.fd/.gitignore deleted file mode 100644 index 544aec4c425..00000000000 --- a/sorc/enkf_chgres_recenter.fd/.gitignore +++ /dev/null @@ -1,3 +0,0 @@ -*.exe -*.o -*.mod diff --git a/sorc/enkf_chgres_recenter.fd/driver.f90 b/sorc/enkf_chgres_recenter.fd/driver.f90 deleted file mode 100644 index 02a138ae8f8..00000000000 --- a/sorc/enkf_chgres_recenter.fd/driver.f90 +++ /dev/null @@ -1,65 +0,0 @@ - program recenter - - use setup, only : program_setup - use interp, only : gaus_to_gaus, adjust_for_terrain - use input_data, only : read_input_data, & - read_vcoord_info - use output_data, only : set_output_grid, write_output_data - - implicit none - - call w3tagb('CHGRES_RECENTER',2018,0179,0055,'NP20') - - print*,"STARTING PROGRAM" - -!-------------------------------------------------------- -! Read configuration namelist. -!-------------------------------------------------------- - - call program_setup - -!-------------------------------------------------------- -! Read input grid data -!-------------------------------------------------------- - - call read_input_data - -!-------------------------------------------------------- -! Read vertical coordinate info -!-------------------------------------------------------- - - call read_vcoord_info - -!-------------------------------------------------------- -! Get output grid specs -!-------------------------------------------------------- - - call set_output_grid - -!-------------------------------------------------------- -! Interpolate data to output grid -!-------------------------------------------------------- - - call gaus_to_gaus - -!-------------------------------------------------------- -! Adjust output fields for differences between -! 
interpolated and external terrain. -!-------------------------------------------------------- - - call adjust_for_terrain - -!-------------------------------------------------------- -! Write output data to file. -!-------------------------------------------------------- - - call write_output_data - - print* - print*,"PROGRAM FINISHED NORMALLY!" - - call w3tage('CHGRES_RECENTER') - - stop - - end program recenter diff --git a/sorc/enkf_chgres_recenter.fd/input_data.f90 b/sorc/enkf_chgres_recenter.fd/input_data.f90 deleted file mode 100644 index 704aa58c8dd..00000000000 --- a/sorc/enkf_chgres_recenter.fd/input_data.f90 +++ /dev/null @@ -1,383 +0,0 @@ - module input_data - - use nemsio_module - use utils - use setup - - implicit none - - private - - integer, public :: idvc, idsl, idvm, nvcoord - integer, public :: ntrac, ncldt,icldamt - integer, public :: ij_input, kgds_input(200) - integer(nemsio_intkind), public :: i_input, j_input, lev - integer(nemsio_intkind), public :: idate(7) - - logical, public :: gfdl_mp - - real, allocatable, public :: vcoord(:,:) - real, allocatable, public :: clwmr_input(:,:) - real, allocatable, public :: dzdt_input(:,:) - real, allocatable, public :: grle_input(:,:) - real, allocatable, public :: cldamt_input(:,:) - real, allocatable, public :: hgt_input(:) - real, allocatable, public :: icmr_input(:,:) - real, allocatable, public :: o3mr_input(:,:) - real, allocatable, public :: rwmr_input(:,:) - real, allocatable, public :: sfcp_input(:) - real, allocatable, public :: snmr_input(:,:) - real, allocatable, public :: spfh_input(:,:) - real, allocatable, public :: tmp_input(:,:) - real, allocatable, public :: ugrd_input(:,:) - real, allocatable, public :: vgrd_input(:,:) - - public :: read_input_data - public :: read_vcoord_info - - contains - - subroutine read_input_data - -!------------------------------------------------------------------------------------- -! Read input grid data from a nemsio file. 
-!------------------------------------------------------------------------------------- - - implicit none - - character(len=20) :: vlevtyp, vname - character(len=50), allocatable :: recname(:) - - integer(nemsio_intkind) :: vlev, iret, idum, nrec - integer :: n - - real(nemsio_realkind), allocatable :: dummy(:) - - type(nemsio_gfile) :: gfile - - call nemsio_init(iret) - - print* - print*,"OPEN INPUT FILE: ",trim(input_file) - call nemsio_open(gfile, input_file, "read", iret=iret) - if (iret /= 0) then - print*,"FATAL ERROR OPENING FILE: ",trim(input_file) - print*,"IRET IS: ", iret - call errexit(2) - endif - - print*,"GET INPUT FILE HEADER" - call nemsio_getfilehead(gfile, iret=iret, nrec=nrec, idate=idate, & - dimx=i_input, dimy=j_input, dimz=lev) - if (iret /= 0) goto 67 - - print*,'DIMENSIONS OF DATA ARE: ', i_input, j_input, lev - print*,'DATE OF DATA IS: ', idate - - ij_input = i_input * j_input - - allocate(recname(nrec)) - - call nemsio_getfilehead(gfile, iret=iret, recname=recname) - if (iret /= 0) goto 67 - - gfdl_mp = .false. ! Zhao-Carr MP - do n = 1, nrec - if (trim(recname(n)) == "icmr") then - gfdl_mp = .true. ! GFDL MP - exit - endif - enddo - - icldamt = 0 - do n = 1, nrec - if (trim(recname(n)) == "cld_amt") then - icldamt = 1 ! 3D cloud amount present - exit - endif - enddo - - call nemsio_getfilehead(gfile, iret=iret, idvc=idum) - if (iret /= 0) goto 67 - idvc = idum - print*,'IDVC IS: ', idvc - - call nemsio_getfilehead(gfile, iret=iret, idsl=idum) - if (iret /= 0) goto 67 - idsl = idum - print*,'IDSL IS: ', idsl - - call nemsio_getfilehead(gfile, iret=iret, idvm=idum) - if (iret /= 0) goto 67 - idvm = idum - print*,'IDVM IS: ', idvm - - if (gfdl_mp) then - ntrac = 7 + icldamt - ncldt = 5 - else - ntrac = 3 - ncldt = 1 - endif - - allocate(dummy(ij_input)) - - ! 
figure out the sign of delz - print*,"READ DELZ FOR SIGN CHECK" - vlev = 1 - vlevtyp = "mid layer" - vname = "delz" - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - if ( sum(dummy) > 0 ) then - flipdelz = .false. - print*,"DELZ IS POSITIVE" - else - flipdelz = .true. - print*,"DELZ IS NEGATIVE" - end if - - print* - print*,"READ SURFACE PRESSURE" - vlev = 1 - vlevtyp = "sfc" - vname = "pres" - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - - allocate(sfcp_input(ij_input)) - sfcp_input = dummy - print*,'MAX/MIN SURFACE PRESSURE: ',maxval(sfcp_input), minval(sfcp_input) - - print* - print*,"READ SURFACE HEIGHT" - vlev = 1 - vlevtyp = "sfc" - vname = "hgt" - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - - allocate(hgt_input(ij_input)) - hgt_input = dummy - print*,'MAX/MIN SURFACE HEIGHT: ',maxval(hgt_input), minval(hgt_input) - - print* - print*,"READ U WIND" - vname = "ugrd" - vlevtyp = "mid layer" - allocate(ugrd_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - ugrd_input(:,vlev) = dummy - print*,'MAX/MIN U WIND AT LEVEL ',vlev, "IS: ", maxval(ugrd_input(:,vlev)), minval(ugrd_input(:,vlev)) - enddo - - print* - print*,"READ V WIND" - vname = "vgrd" - vlevtyp = "mid layer" - allocate(vgrd_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - vgrd_input(:,vlev) = dummy - print*,'MAX/MIN V WIND AT LEVEL ', vlev, "IS: ", maxval(vgrd_input(:,vlev)), minval(vgrd_input(:,vlev)) - enddo - - print* - print*,"READ TEMPERATURE" - vname = "tmp" - vlevtyp = "mid layer" - allocate(tmp_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - tmp_input(:,vlev) = dummy(:) - print*,'MAX/MIN TEMPERATURE AT 
LEVEL ', vlev, 'IS: ', maxval(tmp_input(:,vlev)), minval(tmp_input(:,vlev)) - enddo - - print* - print*,"READ SPECIFIC HUMIDITY" - vname = "spfh" - vlevtyp = "mid layer" - allocate(spfh_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - spfh_input(:,vlev) = dummy - print*,'MAX/MIN SPECIFIC HUMIDITY AT LEVEL ', vlev, 'IS: ', maxval(spfh_input(:,vlev)), minval(spfh_input(:,vlev)) - enddo - - print* - print*,"READ CLOUD LIQUID WATER" - vname = "clwmr" - vlevtyp = "mid layer" - allocate(clwmr_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - clwmr_input(:,vlev) = dummy - print*,'MAX/MIN CLOUD LIQUID WATER AT LEVEL ', vlev, 'IS: ', maxval(clwmr_input(:,vlev)), minval(clwmr_input(:,vlev)) - enddo - - print* - print*,"READ OZONE" - vname = "o3mr" - vlevtyp = "mid layer" - allocate(o3mr_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - o3mr_input(:,vlev) = dummy - print*,'MAX/MIN OZONE AT LEVEL ', vlev, 'IS: ', maxval(o3mr_input(:,vlev)), minval(o3mr_input(:,vlev)) - enddo - - print* - print*,"READ DZDT" - vname = "dzdt" - vlevtyp = "mid layer" - allocate(dzdt_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - dzdt_input(:,vlev) = dummy - print*,'MAX/MIN DZDT AT LEVEL ', vlev, 'IS: ', maxval(dzdt_input(:,vlev)), minval(dzdt_input(:,vlev)) - enddo - - if (gfdl_mp) then - - print* - print*,"READ RWMR" - vname = "rwmr" - vlevtyp = "mid layer" - allocate(rwmr_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - rwmr_input(:,vlev) = dummy - print*,'MAX/MIN RWMR AT LEVEL ', vlev, 'IS: ', maxval(rwmr_input(:,vlev)), minval(rwmr_input(:,vlev)) - enddo - - print* - 
print*,"READ ICMR" - vname = "icmr" - vlevtyp = "mid layer" - allocate(icmr_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - icmr_input(:,vlev) = dummy - print*,'MAX/MIN ICMR AT LEVEL ', vlev, 'IS: ', maxval(icmr_input(:,vlev)), minval(icmr_input(:,vlev)) - enddo - - print* - print*,"READ SNMR" - vname = "snmr" - vlevtyp = "mid layer" - allocate(snmr_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - snmr_input(:,vlev) = dummy - print*,'MAX/MIN SNMR AT LEVEL ', vlev, 'IS: ', maxval(snmr_input(:,vlev)), minval(snmr_input(:,vlev)) - enddo - - print* - print*,"READ GRLE" - vname = "grle" - vlevtyp = "mid layer" - allocate(grle_input(ij_input,lev)) - do vlev = 1, lev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - grle_input(:,vlev) = dummy - print*,'MAX/MIN GRLE AT LEVEL ', vlev, 'IS: ', maxval(grle_input(:,vlev)), minval(grle_input(:,vlev)) - enddo - - if (icldamt == 1) then - print* - print*,"READ CLD_AMT" - vname = "cld_amt" - vlevtyp = "mid layer" - allocate(cldamt_input(ij_input,lev)) - do vlev = 1, lev - write(6,*) 'read ',vname,' on ',vlev - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) goto 67 - cldamt_input(:,vlev) = dummy - print*,'MAX/MIN CLD_AMT AT LEVEL ', vlev, 'IS: ', maxval(cldamt_input(:,vlev)), minval(cldamt_input(:,vlev)) - enddo - endif - - endif - - deallocate(dummy) - - print*,"CLOSE FILE" - call nemsio_close(gfile, iret=iret) - - call nemsio_finalize() - -!--------------------------------------------------------------------------------------- -! Set the grib 1 grid description array need by the NCEP IPOLATES library. 
-!--------------------------------------------------------------------------------------- - - call calc_kgds(i_input, j_input, kgds_input) - - return - - 67 continue - - print*,"FATAL ERROR READING FILE: ", trim(input_file) - print*,"IRET IS: ", iret - call errexit(3) - - end subroutine read_input_data - - subroutine read_vcoord_info - -!--------------------------------------------------------------------------------- -! Read vertical coordinate information. -!--------------------------------------------------------------------------------- - - implicit none - - integer :: istat, levs_vcoord, n, k - - print* - print*,"OPEN VERTICAL COORD FILE: ", trim(vcoord_file) - open(14, file=trim(vcoord_file), form='formatted', iostat=istat) - if (istat /= 0) then - print*,"FATAL ERROR OPENING FILE. ISTAT IS: ", istat - call errexit(4) - endif - - read(14, *, iostat=istat) nvcoord, levs_vcoord - if (istat /= 0) then - print*,"FATAL ERROR READING FILE HEADER. ISTAT IS: ",istat - call errexit(5) - endif - -!--------------------------------------------------------------------------------- -! The last value in the file is not used for the fv3 core. Only read the first -! (lev + 1) values. -!--------------------------------------------------------------------------------- - - allocate(vcoord(lev+1, nvcoord)) - read(14, *, iostat=istat) ((vcoord(n,k), k=1,nvcoord), n=1,lev+1) - if (istat /= 0) then - print*,"FATAL ERROR READING FILE. 
ISTAT IS: ",istat - call errexit(6) - endif - - print* - do k = 1, (lev+1) - print*,'VCOORD FOR LEV ', k, 'IS: ', vcoord(k,:) - enddo - - close(14) - - end subroutine read_vcoord_info - - end module input_data diff --git a/sorc/enkf_chgres_recenter.fd/interp.f90 b/sorc/enkf_chgres_recenter.fd/interp.f90 deleted file mode 100644 index bb2afedbc37..00000000000 --- a/sorc/enkf_chgres_recenter.fd/interp.f90 +++ /dev/null @@ -1,552 +0,0 @@ - module interp - - use nemsio_module - - implicit none - - private - - real, allocatable :: sfcp_b4_adj_output(:) - real, allocatable :: clwmr_b4_adj_output(:,:) - real, allocatable :: dzdt_b4_adj_output(:,:) - real, allocatable :: grle_b4_adj_output(:,:) - real, allocatable :: cldamt_b4_adj_output(:,:) - real, allocatable :: icmr_b4_adj_output(:,:) - real, allocatable :: o3mr_b4_adj_output(:,:) - real, allocatable :: rwmr_b4_adj_output(:,:) - real, allocatable :: snmr_b4_adj_output(:,:) - real, allocatable :: spfh_b4_adj_output(:,:) - real, allocatable :: tmp_b4_adj_output(:,:) - real, allocatable :: ugrd_b4_adj_output(:,:) - real, allocatable :: vgrd_b4_adj_output(:,:) - - public :: adjust_for_terrain - public :: gaus_to_gaus - - contains - - subroutine adjust_for_terrain - -!--------------------------------------------------------------------------------- -! Adjust fields based on differences between the interpolated and external -! terrain. -!--------------------------------------------------------------------------------- - - use input_data - use output_data - use utils - use setup - - implicit none - - integer :: k - - real, allocatable :: pres_b4_adj_output(:,:) - real, allocatable :: pres_output(:,:) - real, allocatable :: q_b4_adj_output(:,:,:), q_output(:,:,:) - -!--------------------------------------------------------------------------------- -! First, compute the mid-layer pressure using the interpolated surface pressure. 
-!--------------------------------------------------------------------------------- - - allocate(pres_b4_adj_output(ij_output,lev)) - pres_b4_adj_output = 0.0 - - print* - print*,"COMPUTE MID-LAYER PRESSURE FROM INTERPOLATED SURFACE PRESSURE." - call newpr1(ij_output, lev, idvc, idsl, nvcoord, vcoord, & - sfcp_b4_adj_output, pres_b4_adj_output) - -!print*,'after newpr1, pres b4 adj: ', pres_b4_adj_output(ij_output/2,:) - -!--------------------------------------------------------------------------------- -! Adjust surface pressure based on differences between interpolated and -! grid terrain. -!--------------------------------------------------------------------------------- - - allocate(sfcp_output(ij_output)) - sfcp_output = 0.0 - - print*,"ADJUST SURFACE PRESSURE BASED ON TERRAIN DIFFERENCES" - call newps(hgt_output, sfcp_b4_adj_output, ij_output, & - lev, pres_b4_adj_output, tmp_b4_adj_output, & - spfh_b4_adj_output, hgt_external_output, sfcp_output) - -!print*,'after newps ',sfcp_b4_adj_output(ij_output/2),sfcp_output(ij_output/2) - - deallocate(sfcp_b4_adj_output) - -!--------------------------------------------------------------------------------- -! Recompute mid-layer pressure based on the adjusted surface pressure. -!--------------------------------------------------------------------------------- - - allocate(pres_output(ij_output, lev)) - pres_output = 0.0 - - allocate(dpres_output(ij_output, lev)) - dpres_output = 0.0 - - print*,"RECOMPUTE MID-LAYER PRESSURE." - call newpr1(ij_output, lev, idvc, idsl, nvcoord, vcoord, & - sfcp_output, pres_output, dpres_output) - -!do k = 1, lev -! print*,'after newpr1 ',pres_b4_adj_output(ij_output/2,k),pres_output(ij_output/2,k), dpres_output(ij_output/2,k) -!enddo - -!--------------------------------------------------------------------------------- -! Vertically interpolate from the pre-adjusted to the adjusted mid-layer -! pressures. 
-!--------------------------------------------------------------------------------- - - allocate(q_b4_adj_output(ij_output,lev,ntrac)) - q_b4_adj_output(:,:,1) = spfh_b4_adj_output(:,:) - q_b4_adj_output(:,:,2) = o3mr_b4_adj_output(:,:) - q_b4_adj_output(:,:,3) = clwmr_b4_adj_output(:,:) - if (gfdl_mp) then - q_b4_adj_output(:,:,4) = rwmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,5) = icmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,6) = snmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,7) = grle_b4_adj_output(:,:) - if (icldamt == 1) q_b4_adj_output(:,:,8) = cldamt_b4_adj_output(:,:) - endif - - allocate(q_output(ij_output,lev,ntrac)) - q_output = 0.0 - - allocate(dzdt_output(ij_output,lev)) - dzdt_output = 0.0 - - allocate(ugrd_output(ij_output,lev)) - ugrd_output=0.0 - - allocate(vgrd_output(ij_output,lev)) - vgrd_output=0.0 - - allocate(tmp_output(ij_output,lev)) - tmp_output=0.0 - - print*,"VERTICALLY INTERPOLATE TO NEW PRESSURE LEVELS" - call vintg(ij_output, lev, lev, ntrac, pres_b4_adj_output, & - ugrd_b4_adj_output, vgrd_b4_adj_output, tmp_b4_adj_output, q_b4_adj_output, & - dzdt_b4_adj_output, pres_output, ugrd_output, vgrd_output, tmp_output, & - q_output, dzdt_output) - - deallocate (dzdt_b4_adj_output, q_b4_adj_output) - deallocate (pres_b4_adj_output, pres_output) - - allocate(spfh_output(ij_output,lev)) - spfh_output = q_output(:,:,1) - allocate(o3mr_output(ij_output,lev)) - o3mr_output = q_output(:,:,2) - allocate(clwmr_output(ij_output,lev)) - clwmr_output = q_output(:,:,3) - if (gfdl_mp) then - allocate(rwmr_output(ij_output,lev)) - rwmr_output = q_output(:,:,4) - allocate(icmr_output(ij_output,lev)) - icmr_output = q_output(:,:,5) - allocate(snmr_output(ij_output,lev)) - snmr_output = q_output(:,:,6) - allocate(grle_output(ij_output,lev)) - grle_output = q_output(:,:,7) - if (icldamt == 1) then - allocate(cldamt_output(ij_output,lev)) - cldamt_output = q_output(:,:,8) - endif - endif - - deallocate(q_output) - -!do k = 1, lev -!print*,'after vintg tmp 
',tmp_b4_adj_output(ij_output/2,k),tmp_output(ij_output/2,k) -!enddo - - deallocate(tmp_b4_adj_output) - -!do k = 1, lev -!print*,'after vintg u ',ugrd_b4_adj_output(ij_output/2,k),ugrd_output(ij_output/2,k) -!enddo - - deallocate(ugrd_b4_adj_output) - -!do k = 1, lev -!print*,'after vintg v ',vgrd_b4_adj_output(ij_output/2,k),vgrd_output(ij_output/2,k) -!enddo - - deallocate(vgrd_b4_adj_output) - -!do k = 1, lev -!print*,'after vintg spfh ',spfh_b4_adj_output(ij_output/2,k),spfh_output(ij_output/2,k) -!enddo - - deallocate(spfh_b4_adj_output) - -!do k = 1, lev -!print*,'after vintg o3 ',o3mr_b4_adj_output(ij_output/2,k),o3mr_output(ij_output/2,k) -!enddo - - deallocate(o3mr_b4_adj_output) - -!do k = 1, lev -!print*,'after vintg clw ',clwmr_b4_adj_output(ij_output/2,k),clwmr_output(ij_output/2,k) -!enddo - - deallocate(clwmr_b4_adj_output) - - if (gfdl_mp) then - -! do k = 1, lev -! print*,'after vintg rw ',rwmr_b4_adj_output(ij_output/2,k),rwmr_output(ij_output/2,k) -! enddo - - deallocate(rwmr_b4_adj_output) - -! do k = 1, lev -! print*,'after vintg ic ',icmr_b4_adj_output(ij_output/2,k),icmr_output(ij_output/2,k) -! enddo - - deallocate(icmr_b4_adj_output) - -! do k = 1, lev -! print*,'after vintg sn ',snmr_b4_adj_output(ij_output/2,k),snmr_output(ij_output/2,k) -! enddo - - deallocate(snmr_b4_adj_output) - -! do k = 1, lev -! print*,'after vintg grle ',grle_b4_adj_output(ij_output/2,k),grle_output(ij_output/2,k) -! enddo - - deallocate(grle_b4_adj_output) - - if (icldamt == 1) then -! do k = 1, lev -! print*,'after vintg cld_amt ',cldamt_b4_adj_output(ij_output/2,k),cldamt_output(ij_output/2,k) -! 
enddo - - deallocate(cldamt_b4_adj_output) - endif - - - endif - - allocate(delz_output(ij_output, lev)) - delz_output = 0.0 - - call compute_delz(ij_output, lev, vcoord(:,1), vcoord(:,2), sfcp_output, hgt_output, & - tmp_output, spfh_output, delz_output, flipdelz) - - deallocate(hgt_output) - - end subroutine adjust_for_terrain - - subroutine gaus_to_gaus - -!---------------------------------------------------------------------------------- -! Interpolate data from the input to output grid using IPOLATES library. -!---------------------------------------------------------------------------------- - - use output_data - use input_data - use setup - - implicit none - - integer :: ip, ipopt(20) - integer :: num_fields - integer :: iret, numpts - integer, allocatable :: ibi(:), ibo(:) - - logical*1, allocatable :: bitmap_input(:,:), bitmap_output(:,:) - - real, allocatable :: data_input(:,:) - real, allocatable :: data_output(:,:), crot(:), srot(:) - - print* - print*,'INTERPOLATE DATA TO OUTPUT GRID' - - ip = 0 ! bilinear - ipopt = 0 - -!---------------------------------------------------------------------------------- -! Do 2-D fields first -!---------------------------------------------------------------------------------- - - num_fields = 1 - - allocate(ibi(num_fields)) - ibi = 0 ! no bitmap - allocate(ibo(num_fields)) - ibo = 0 ! no bitmap - - allocate(bitmap_input(ij_input,num_fields)) - bitmap_input = .true. - allocate(bitmap_output(ij_output,num_fields)) - bitmap_output = .true. - - allocate(rlat_output(ij_output)) - rlat_output = 0.0 - allocate(rlon_output(ij_output)) - rlon_output = 0.0 - -!---------------- -! 
Surface height -!---------------- - - allocate(data_input(ij_input,num_fields)) - data_input(:,num_fields) = hgt_input(:) - deallocate(hgt_input) - - allocate(data_output(ij_output,num_fields)) - data_output = 0 - - print*,"INTERPOLATE SURFACE HEIGHT" - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, data_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - data_output, iret) - if (iret /= 0) goto 89 - - allocate(hgt_output(ij_output)) - hgt_output = data_output(:,num_fields) - -!------------------ -! surface pressure -!------------------ - - data_input(:,num_fields) = sfcp_input(:) - deallocate(sfcp_input) - - print*,"INTERPOLATE SURFACE PRESSURE" - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, data_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - data_output, iret) - if (iret /= 0) goto 89 - - allocate(sfcp_b4_adj_output(ij_output)) - sfcp_b4_adj_output = data_output(:,num_fields) - - deallocate(ibi, ibo, bitmap_input, bitmap_output, data_input, data_output) - -!---------------------------------------------------------------------------------- -! 3d scalars -!---------------------------------------------------------------------------------- - - num_fields = lev - - allocate(ibi(num_fields)) - ibi = 0 ! no bitmap - allocate(ibo(num_fields)) - ibo = 0 ! no bitmap - - allocate(bitmap_input(ij_input,num_fields)) - bitmap_input = .true. - allocate(bitmap_output(ij_output,num_fields)) - bitmap_output = .true. - -!------------- -! 
Temperature -!------------- - - allocate(tmp_b4_adj_output(ij_output,num_fields)) - tmp_b4_adj_output = 0 - - print*,'INTERPOLATE TEMPERATURE' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, tmp_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - tmp_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(tmp_input) - -!-------------------- -! Cloud liquid water -!-------------------- - - allocate(clwmr_b4_adj_output(ij_output,num_fields)) - clwmr_b4_adj_output = 0 - - print*,'INTERPOLATE CLOUD LIQUID WATER' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, clwmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - clwmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(clwmr_input) - -!-------------------- -! Specific humidity -!-------------------- - - allocate(spfh_b4_adj_output(ij_output,num_fields)) - spfh_b4_adj_output = 0 - - print*,'INTERPOLATE SPECIFIC HUMIDITY' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, spfh_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - spfh_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(spfh_input) - -!----------- -! Ozone -!----------- - - allocate(o3mr_b4_adj_output(ij_output,num_fields)) - o3mr_b4_adj_output = 0 - - print*,'INTERPOLATE OZONE' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, o3mr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - o3mr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(o3mr_input) - -!----------- -! 
DZDT -!----------- - - allocate(dzdt_b4_adj_output(ij_output,num_fields)) - dzdt_b4_adj_output = 0 - - print*,'INTERPOLATE DZDT' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, dzdt_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - dzdt_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(dzdt_input) - -!---------------------------------------------------------------------------------- -! Interpolate additional 3-d scalars for GFDL microphysics. -!---------------------------------------------------------------------------------- - - if (gfdl_mp) then - -!------------- -! Rain water -!------------- - - allocate(rwmr_b4_adj_output(ij_output,num_fields)) - rwmr_b4_adj_output = 0 - - print*,'INTERPOLATE RWMR' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, rwmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - rwmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(rwmr_input) - -!------------- -! Snow water -!------------- - - allocate(snmr_b4_adj_output(ij_output,num_fields)) - snmr_b4_adj_output = 0 - - print*,'INTERPOLATE SNMR' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, snmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - snmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(snmr_input) - -!------------- -! Ice water -!------------- - - allocate(icmr_b4_adj_output(ij_output,num_fields)) - icmr_b4_adj_output = 0 - - print*,'INTERPOLATE ICMR' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, icmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - icmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(icmr_input) - -!------------- -! 
Graupel -!------------- - - allocate(grle_b4_adj_output(ij_output,num_fields)) - grle_b4_adj_output = 0 - - print*,'INTERPOLATE GRLE' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, grle_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - grle_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(grle_input) - -!--------------------------- -! Cloud amount (if present) -!--------------------------- - - if (icldamt == 1) then - allocate(cldamt_b4_adj_output(ij_output,num_fields)) - cldamt_b4_adj_output = 0 - - print*,'INTERPOLATE CLD_AMT' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, cldamt_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - cldamt_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(cldamt_input) - endif - - - endif - -!---------------------------------------------------------------------------------- -! 3d u/v winds -!---------------------------------------------------------------------------------- - - allocate(crot(ij_output), srot(ij_output)) - crot = 0. - srot = 0. - - allocate(ugrd_b4_adj_output(ij_output,num_fields)) - ugrd_b4_adj_output = 0 - allocate(vgrd_b4_adj_output(ij_output,num_fields)) - vgrd_b4_adj_output = 0 - - print*,'INTERPOLATE WINDS' - call ipolatev(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, ugrd_input, vgrd_input, & - numpts, rlat_output, rlon_output, crot, srot, ibo, bitmap_output, & - ugrd_b4_adj_output, vgrd_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate (ugrd_input, vgrd_input) - deallocate (crot, srot) - deallocate (ibi, ibo, bitmap_input, bitmap_output) - - return - - 89 continue - print*,"FATAL ERROR IN IPOLATES. 
IRET IS: ", iret - call errexit(23) - - end subroutine gaus_to_gaus - - end module interp diff --git a/sorc/enkf_chgres_recenter.fd/output_data.f90 b/sorc/enkf_chgres_recenter.fd/output_data.f90 deleted file mode 100644 index 36063d3a061..00000000000 --- a/sorc/enkf_chgres_recenter.fd/output_data.f90 +++ /dev/null @@ -1,396 +0,0 @@ - module output_data - - use nemsio_module - - implicit none - - private - - integer, public :: kgds_output(200) - -! data on the output grid. - real, allocatable, public :: hgt_output(:) ! interpolated from input grid - real, allocatable, public :: hgt_external_output(:) - real, allocatable, public :: sfcp_output(:) - real, allocatable, public :: tmp_output(:,:) - real, allocatable, public :: clwmr_output(:,:) - real, allocatable, public :: delz_output(:,:) - real, allocatable, public :: dpres_output(:,:) - real, allocatable, public :: dzdt_output(:,:) - real, allocatable, public :: o3mr_output(:,:) - real, allocatable, public :: spfh_output(:,:) - real, allocatable, public :: ugrd_output(:,:) - real, allocatable, public :: vgrd_output(:,:) - real, allocatable, public :: rwmr_output(:,:) - real, allocatable, public :: icmr_output(:,:) - real, allocatable, public :: snmr_output(:,:) - real, allocatable, public :: grle_output(:,:) - real, allocatable, public :: cldamt_output(:,:) - real, allocatable, public :: rlat_output(:) - real, allocatable, public :: rlon_output(:) - - public :: set_output_grid - public :: write_output_data - - character(len=50), allocatable :: recname(:) - character(len=50), allocatable :: reclevtyp(:) - - integer(nemsio_intkind) :: nrec - integer(nemsio_intkind), allocatable :: reclev(:) - - real(nemsio_realkind), allocatable :: vcoord_header(:,:,:) - real(nemsio_realkind), allocatable :: lat(:), lon(:) - - contains - - subroutine set_output_grid - -!------------------------------------------------------------------- -! Set grid specs on the output grid. 
-!------------------------------------------------------------------- - - use setup - use input_data - use utils - - implicit none - - character(len=20) :: vlevtyp, vname - - integer(nemsio_intkind) :: vlev - integer :: iret - - real(nemsio_realkind), allocatable :: dummy(:) - - type(nemsio_gfile) :: gfile - - print* - print*,"OUTPUT GRID I/J DIMENSIONS: ", i_output, j_output - -!------------------------------------------------------------------- -! Set the grib 1 grid description section, which is needed -! by the IPOLATES library. -!------------------------------------------------------------------- - - kgds_output = 0 - - call calc_kgds(i_output, j_output, kgds_output) - -!------------------------------------------------------------------- -! Read the terrain on the output grid. To ensure exact match, -! read it from an existing enkf nemsio restart file. -!------------------------------------------------------------------- - - call nemsio_init(iret) - - print* - print*,"OPEN OUTPUT GRID TERRAIN FILE: ", trim(terrain_file) - call nemsio_open(gfile, terrain_file, "read", iret=iret) - if (iret /= 0) then - print*,"FATAL ERROR OPENING FILE: ",trim(terrain_file) - print*,"IRET IS: ", iret - call errexit(50) - endif - - allocate(dummy(ij_output)) - allocate(hgt_external_output(ij_output)) - - print* - print*,"READ SURFACE HEIGHT" - vlev = 1 - vlevtyp = "sfc" - vname = "hgt" - call nemsio_readrecv(gfile, vname, vlevtyp, vlev, dummy, 0, iret) - if (iret /= 0) then - print*,"FATAL ERROR READING FILE: ",trim(terrain_file) - print*,"IRET IS: ", iret - call errexit(51) - endif - - hgt_external_output = dummy - - deallocate(dummy) - - call nemsio_close(gfile, iret=iret) - - call nemsio_finalize() - - end subroutine set_output_grid - - subroutine write_output_data - -!------------------------------------------------------------------- -! Write output grid data to a nemsio file. 
-!------------------------------------------------------------------- - - use input_data - use setup - - implicit none - - character(len=5) :: gaction - - integer :: n, iret - - real(nemsio_realkind), allocatable :: dummy(:) - - type(nemsio_gfile) :: gfile - -!------------------------------------------------------------------- -! Set up some header info. -!------------------------------------------------------------------- - - call header_set - -!------------------------------------------------------------------- -! Open and write file. -!------------------------------------------------------------------- - - call nemsio_init(iret) - - gaction="write" - - print* - print*,'OPEN OUTPUT FILE: ',trim(output_file) - call nemsio_open(gfile, output_file, gaction, iret=iret, gdatatype="bin4", & - nmeta=8, modelname="FV3GFS", nrec=nrec, & - idate=idate, dimx=i_output, & - dimy=j_output, dimz=lev, ntrac=ntrac, & - ncldt=ncldt, idvc=idvc, idsl=idsl, idvm=idvm, & - idrt=4, recname=recname, reclevtyp=reclevtyp, & - reclev=reclev,vcoord=vcoord_header, & - lat=lat, lon=lon) - if (iret/=0) then - print*,"FATAL ERROR OPENING FILE. 
IRET IS: ", iret - call errexit(9) - endif - - deallocate(lon, lat, recname, reclevtyp, reclev, vcoord_header) - - allocate(dummy(i_output*j_output)) - - print*,"WRITE SURFACE HEIGHT" - dummy = hgt_external_output - call nemsio_writerecv(gfile, "hgt", "sfc", 1, dummy, iret=iret) - if (iret/=0) goto 88 - deallocate(hgt_external_output) - - print*,"WRITE SURFACE PRESSURE" - dummy = sfcp_output - call nemsio_writerecv(gfile, "pres", "sfc", 1, dummy, iret=iret) - if (iret/=0) goto 88 - deallocate(sfcp_output) - - print*,"WRITE TEMPERATURE" - do n = 1, lev - dummy = tmp_output(:,n) - call nemsio_writerecv(gfile, "tmp", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(tmp_output) - - print*,"WRITE CLOUD LIQUID WATER" - do n = 1, lev - dummy = clwmr_output(:,n) - call nemsio_writerecv(gfile, "clwmr", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(clwmr_output) - - print*,"WRITE SPECIFIC HUMIDITY" - do n = 1, lev - dummy = spfh_output(:,n) - call nemsio_writerecv(gfile, "spfh", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(spfh_output) - - print*,"WRITE OZONE" - do n = 1, lev - dummy = o3mr_output(:,n) - call nemsio_writerecv(gfile, "o3mr", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(o3mr_output) - - print*,"WRITE U-WINDS" - do n = 1, lev - dummy = ugrd_output(:,n) - call nemsio_writerecv(gfile, "ugrd", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(ugrd_output) - - print*,"WRITE V-WINDS" - do n = 1, lev - dummy = vgrd_output(:,n) - call nemsio_writerecv(gfile, "vgrd", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(vgrd_output) - - print*,"WRITE DZDT" - do n = 1, lev - dummy = dzdt_output(:,n) - call nemsio_writerecv(gfile, "dzdt", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(dzdt_output) - - print*,"WRITE DPRES" - do n = 1, lev - dummy = dpres_output(:,n) 
- call nemsio_writerecv(gfile, "dpres", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(dpres_output) - - print*,"WRITE DELZ" - do n = 1, lev - dummy = delz_output(:,n) - call nemsio_writerecv(gfile, "delz", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(delz_output) - - if (gfdl_mp) then - - print*,"WRITE RAIN WATER" - do n = 1, lev - dummy = rwmr_output(:,n) - call nemsio_writerecv(gfile, "rwmr", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(rwmr_output) - - print*,"WRITE SNOW WATER" - do n = 1, lev - dummy = snmr_output(:,n) - call nemsio_writerecv(gfile, "snmr", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(snmr_output) - - print*,"WRITE ICE WATER" - do n = 1, lev - dummy = icmr_output(:,n) - call nemsio_writerecv(gfile, "icmr", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(icmr_output) - - print*,"WRITE GRAUPEL" - do n = 1, lev - dummy = grle_output(:,n) - call nemsio_writerecv(gfile, "grle", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(grle_output) - - if (icldamt == 1) then - print*,"WRITE CLD_AMT" - do n = 1, lev - dummy = cldamt_output(:,n) - call nemsio_writerecv(gfile, "cld_amt", "mid layer", n, dummy, iret=iret) - if (iret/=0) goto 88 - enddo - deallocate(cldamt_output) - endif - - - endif - - deallocate(dummy) - - call nemsio_close(gfile, iret=iret) - - call nemsio_finalize() - - return - - 88 continue - print*,"FATAL ERROR WRITING FILE. IRET IS: ", iret - call errexit(10) - - end subroutine write_output_data - - subroutine header_set - -!------------------------------------------------------------------- -! Set header information for the output nemsio file. 
-!------------------------------------------------------------------- - - use input_data - use setup - - implicit none - - character(len=8) :: fields(9) - character(len=8) :: fields_gfdl_mp(5) - - integer :: count, l, n - -! Fields common to Zhao-Carr and GFDL microphysics - data fields /'ugrd', 'vgrd', 'dzdt', 'dpres', 'delz', & - 'tmp', 'spfh', 'clwmr', 'o3mr'/ - -! Fields for GFDL microphysics - data fields_gfdl_mp /'rwmr', 'icmr', 'snmr', 'grle', 'cld_amt'/ - - print* - print*,"SET HEADER INFO FOR OUTPUT FILE." - - if (gfdl_mp) then - nrec = ((13+icldamt) * lev) + 2 - else - nrec = (9 * lev) + 2 - endif - - allocate(recname(nrec)) - allocate(reclev(nrec)) - allocate(reclevtyp(nrec)) - - count = 0 - do n = 1, 9 - do l = 1, lev - count = count + 1 - recname(count) = fields(n) - reclev(count) = l - reclevtyp(count) = "mid layer" - enddo - enddo - - if (gfdl_mp) then - do n = 1, 4 + icldamt - do l = 1, lev - count = count + 1 - recname(count) = fields_gfdl_mp(n) - reclev(count) = l - reclevtyp(count) = "mid layer" - enddo - enddo - endif - - recname(nrec-1) = "pres" - reclev(nrec-1) = 1 - reclevtyp(nrec-1) = "sfc" - - recname(nrec) = "hgt" - reclev(nrec) = 1 - reclevtyp(nrec) = "sfc" - - allocate(vcoord_header(lev+1,3,2)) - vcoord_header = 0.0 - vcoord_header(:,1,1) = vcoord(:,1) - vcoord_header(:,2,1) = vcoord(:,2) - - allocate(lat(ij_output), lon(ij_output)) - - lat = rlat_output - lon = rlon_output - - deallocate(rlat_output, rlon_output) - - end subroutine header_set - - end module output_data diff --git a/sorc/enkf_chgres_recenter.fd/setup.f90 b/sorc/enkf_chgres_recenter.fd/setup.f90 deleted file mode 100644 index c2c2dc450e8..00000000000 --- a/sorc/enkf_chgres_recenter.fd/setup.f90 +++ /dev/null @@ -1,53 +0,0 @@ - module setup - - use nemsio_module - - implicit none - - private - - character(len=300), public :: input_file - character(len=300), public :: output_file - character(len=300), public :: terrain_file - character(len=300), public :: vcoord_file - - 
integer(nemsio_intkind), public :: i_output - integer(nemsio_intkind), public :: j_output - integer , public :: ij_output - logical, public :: flipdelz - - public :: program_setup - - contains - - subroutine program_setup - - implicit none - - integer :: istat - - namelist /nam_setup/ i_output, j_output, input_file, output_file, & - terrain_file, vcoord_file - - print* - print*,"OPEN SETUP NAMELIST." - open(43, file="./fort.43", iostat=istat) - if (istat /= 0) then - print*,"FATAL ERROR OPENING NAMELIST FILE. ISTAT IS: ",istat - call errexit(30) - endif - - print*,"READ SETUP NAMELIST." - read(43, nml=nam_setup, iostat=istat) - if (istat /= 0) then - print*,"FATAL ERROR READING NAMELIST FILE. ISTAT IS: ",istat - call errexit(31) - endif - - ij_output = i_output * j_output - - close(43) - - end subroutine program_setup - - end module setup diff --git a/sorc/enkf_chgres_recenter.fd/utils.f90 b/sorc/enkf_chgres_recenter.fd/utils.f90 deleted file mode 100644 index e09c75b0188..00000000000 --- a/sorc/enkf_chgres_recenter.fd/utils.f90 +++ /dev/null @@ -1,783 +0,0 @@ - module utils - - private - - public :: calc_kgds - public :: newps - public :: newpr1 - public :: vintg - public :: compute_delz - - contains - - subroutine compute_delz(ijm, levp, ak_in, bk_in, ps, zs, t, sphum, delz, flipsign) - - implicit none - integer, intent(in):: levp, ijm - real, intent(in), dimension(levp+1):: ak_in, bk_in - real, intent(in), dimension(ijm):: ps, zs - real, intent(in), dimension(ijm,levp):: t - real, intent(in), dimension(ijm,levp):: sphum - real, intent(out), dimension(ijm,levp):: delz - logical, intent(in) :: flipsign -! Local: - real, dimension(ijm,levp+1):: zh - real, dimension(ijm,levp+1):: pe0, pn0 - real, dimension(levp+1) :: ak, bk - integer i,k - real, parameter :: GRAV = 9.80665 - real, parameter :: RDGAS = 287.05 - real, parameter :: RVGAS = 461.50 - real :: zvir - real:: grd - - print*,"COMPUTE LAYER THICKNESS." - - grd = grav/rdgas - zvir = rvgas/rdgas - 1. 
- ak = ak_in - bk = bk_in - ak(levp+1) = max(1.e-9, ak(levp+1)) - - do i=1, ijm - pe0(i,levp+1) = ak(levp+1) - pn0(i,levp+1) = log(pe0(i,levp+1)) - enddo - - do k=levp,1, -1 - do i=1,ijm - pe0(i,k) = ak(k) + bk(k)*ps(i) - pn0(i,k) = log(pe0(i,k)) - enddo - enddo - - do i = 1, ijm - zh(i,1) = zs(i) - enddo - - do k = 2, levp+1 - do i = 1, ijm - zh(i,k) = zh(i,k-1)+t(i,k-1)*(1.+zvir*sphum(i,k-1))* & - (pn0(i,k-1)-pn0(i,k))/grd - enddo - enddo - - do k = 1, levp - do i = 1, ijm - if (flipsign) then - delz(i,k) = zh(i,k) - zh(i,k+1) - else - delz(i,k) = zh(i,k+1) - zh(i,k) - end if - enddo - enddo - - end subroutine compute_delz - - subroutine calc_kgds(idim, jdim, kgds) - - use nemsio_module - - implicit none - - integer(nemsio_intkind), intent(in) :: idim, jdim - - integer, intent(out) :: kgds(200) - - kgds = 0 - kgds(1) = 4 ! OCT 6 - TYPE OF GRID (GAUSSIAN) - kgds(2) = idim ! OCT 7-8 - # PTS ON LATITUDE CIRCLE - kgds(3) = jdim ! OCT 9-10 - # PTS ON LONGITUDE CIRCLE - kgds(4) = 90000 ! OCT 11-13 - LAT OF ORIGIN - kgds(5) = 0 ! OCT 14-16 - LON OF ORIGIN - kgds(6) = 128 ! OCT 17 - RESOLUTION FLAG - kgds(7) = -90000 ! OCT 18-20 - LAT OF EXTREME POINT - kgds(8) = nint(-360000./idim) ! OCT 21-23 - LON OF EXTREME POINT - kgds(9) = nint((360.0 / float(idim))*1000.0) - ! OCT 24-25 - LONGITUDE DIRECTION INCR. - kgds(10) = jdim/2 ! OCT 26-27 - NUMBER OF CIRCLES POLE TO EQUATOR - kgds(12) = 255 ! OCT 29 - RESERVED - kgds(20) = 255 ! OCT 5 - NOT USED, SET TO 255 - - end subroutine calc_kgds - - SUBROUTINE NEWPS(ZS,PS,IM,KM,P,T,Q,ZSNEW,PSNEW) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: NEWPS COMPUTE NEW SURFACE PRESSURE -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 92-10-31 -! -! ABSTRACT: COMPUTES A NEW SURFACE PRESSURE GIVEN A NEW OROGRAPHY. -! THE NEW PRESSURE IS COMPUTED ASSUMING A HYDROSTATIC BALANCE -! AND A CONSTANT TEMPERATURE LAPSE RATE. BELOW GROUND, THE -! LAPSE RATE IS ASSUMED TO BE -6.5 K/KM. -! -! PROGRAM HISTORY LOG: -! 91-10-31 MARK IREDELL -! -! 
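For reference, the hydrostatic integration that `compute_delz` (deleted above) performs per grid point can be sketched in Python. This is an illustrative single-column translation, not the removed code itself; the surface-up ordering of `ak`/`bk` and of the layer arrays is an assumption of the sketch.

```python
import numpy as np

GRAV, RDGAS, RVGAS = 9.80665, 287.05, 461.50

def layer_thickness(ak, bk, ps, zs, t, sphum):
    """Hydrostatic layer thickness (m) for one column: integrate the
    hypsometric equation upward from the surface using virtual temperature.
    Index 0 of ak/bk is assumed to be the surface interface; t/sphum are
    layer values ordered surface-up."""
    grd = GRAV / RDGAS
    zvir = RVGAS / RDGAS - 1.0
    pe = ak + bk * ps                    # interface pressures
    pe[-1] = max(pe[-1], 1.0e-9)         # keep the model top strictly positive
    pn = np.log(pe)
    zh = np.empty_like(pe)
    zh[0] = zs                           # surface height
    for k in range(1, len(pe)):
        # hypsometric equation with virtual temperature t*(1 + zvir*q)
        zh[k] = zh[k-1] + t[k-1] * (1.0 + zvir * sphum[k-1]) * (pn[k-1] - pn[k]) / grd
    return zh[1:] - zh[:-1]              # positive thickness (flipsign=.false. sense)
```

For an isothermal dry layer this reduces to the familiar (Rd*T/g)*ln(p_bottom/p_top).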
USAGE: CALL NEWPS(ZS,PS,IM,KM,P,T,Q,ZSNEW,PSNEW) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF POINTS TO COMPUTE -! ZS REAL (IM) OLD OROGRAPHY (M) -! PS REAL (IM) OLD SURFACE PRESSURE (PA) -! KM INTEGER NUMBER OF LEVELS -! P REAL (IM,KM) PRESSURES (PA) -! T REAL (IM,KM) TEMPERATURES (K) -! Q REAL (IM,KM) SPECIFIC HUMIDITIES (KG/KG) -! ZSNEW REAL (IM) NEW OROGRAPHY (M) -! OUTPUT ARGUMENT LIST: -! PSNEW REAL (IM) NEW SURFACE PRESSURE (PA) -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - REAL ZS(IM),PS(IM),P(IM,KM),T(IM,KM),Q(IM,KM) - REAL ZSNEW(IM),PSNEW(IM) - PARAMETER(BETA=-6.5E-3,EPSILON=1.E-9) - PARAMETER(G=9.80665,RD=287.05,RV=461.50) - PARAMETER(GOR=G/RD,FV=RV/RD-1.) - REAL ZU(IM) - FTV(AT,AQ)=AT*(1+FV*AQ) - FGAM(APU,ATVU,APD,ATVD)=-GOR*LOG(ATVD/ATVU)/LOG(APD/APU) - FZ0(AP,ATV,AZD,APD)=AZD+ATV/GOR*LOG(APD/AP) - FZ1(AP,ATV,AZD,APD,AGAM)=AZD-ATV/AGAM*((APD/AP)**(-AGAM/GOR)-1) - FP0(AZ,AZU,APU,ATVU)=APU*EXP(-GOR/ATVU*(AZ-AZU)) - FP1(AZ,AZU,APU,ATVU,AGAM)=APU*(1+AGAM/ATVU*(AZ-AZU))**(-GOR/AGAM) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE SURFACE PRESSURE BELOW THE ORIGINAL GROUND - LS=0 - K=1 - GAMMA=BETA - DO I=1,IM - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - ZU(I)=FZ1(PU,TVU,ZS(I),PS(I),GAMMA) - IF(ZSNEW(I).LE.ZU(I)) THEN - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - IF(ABS(GAMMA).GT.EPSILON) THEN - PSNEW(I)=FP1(ZSNEW(I),ZU(I),PU,TVU,GAMMA) - ELSE - PSNEW(I)=FP0(ZSNEW(I),ZU(I),PU,TVU) - ENDIF - ELSE - PSNEW(I)=0 - LS=LS+1 - ENDIF -! endif - ENDDO -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! 
COMPUTE SURFACE PRESSURE ABOVE THE ORIGINAL GROUND - DO K=2,KM - IF(LS.GT.0) THEN - DO I=1,IM - IF(PSNEW(I).EQ.0) THEN - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - PD=P(I,K-1) - TVD=FTV(T(I,K-1),Q(I,K-1)) - GAMMA=FGAM(PU,TVU,PD,TVD) - IF(ABS(GAMMA).GT.EPSILON) THEN - ZU(I)=FZ1(PU,TVU,ZU(I),PD,GAMMA) - ELSE - ZU(I)=FZ0(PU,TVU,ZU(I),PD) - ENDIF - IF(ZSNEW(I).LE.ZU(I)) THEN - IF(ABS(GAMMA).GT.EPSILON) THEN - PSNEW(I)=FP1(ZSNEW(I),ZU(I),PU,TVU,GAMMA) - ELSE - PSNEW(I)=FP0(ZSNEW(I),ZU(I),PU,TVU) - ENDIF - LS=LS-1 - ENDIF - ENDIF - ENDDO - ENDIF - ENDDO -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE SURFACE PRESSURE OVER THE TOP - IF(LS.GT.0) THEN - K=KM - GAMMA=0 - DO I=1,IM - IF(PSNEW(I).EQ.0) THEN - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - PSNEW(I)=FP0(ZSNEW(I),ZU(I),PU,TVU) - ENDIF - ENDDO - ENDIF - END SUBROUTINE NEWPS - - SUBROUTINE NEWPR1(IM,KM,IDVC,IDSL,NVCOORD,VCOORD, & - PS,PM,DP) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: NEWPR1 COMPUTE MODEL PRESSURES -! PRGMMR: JUANG ORG: W/NMC23 DATE: 2005-04-11 -! PRGMMR: Fanglin Yang ORG: W/NMC23 DATE: 2006-11-28 -! PRGMMR: S. Moorthi ORG: NCEP/EMC DATE: 2006-12-12 -! PRGMMR: S. Moorthi ORG: NCEP/EMC DATE: 2007-01-02 -! -! ABSTRACT: COMPUTE MODEL PRESSURES. -! -! PROGRAM HISTORY LOG: -! 2005-04-11 HANN_MING HENRY JUANG hybrid sigma, sigma-p, and sigma- -! -! USAGE: CALL NEWPR1(IM,IX,KM,KMP,IDVC,IDSL,NVCOORD,VCOORD,PP,TP,QP,P -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF POINTS TO COMPUTE -! KM INTEGER NUMBER OF LEVELS -! IDVC INTEGER VERTICAL COORDINATE ID -! (1 FOR SIGMA AND 2 FOR HYBRID) -! IDSL INTEGER TYPE OF SIGMA STRUCTURE -! (1 FOR PHILLIPS OR 2 FOR MEAN) -! NVCOORD INTEGER NUMBER OF VERTICAL COORDINATES -! VCOORD REAL (KM+1,NVCOORD) VERTICAL COORDINATE VALUES -! FOR IDVC=1, NVCOORD=1: SIGMA INTERFACE -! FOR IDVC=2, NVCOORD=2: HYBRID INTERFACE A AND B -! FOR IDVC=3, NVCOORD=3: JUANG GENERAL HYBRID INTERFACE -! AK REAL (KM+1) HYBRID INTERFACE A -! 
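The below-ground branch of NEWPS (the `FTV`/`FZ1`/`FP1` statement functions) reduces to a short closed-form calculation. A minimal Python sketch of that common case, assuming the new surface lies below the lowest model level; function and argument names here are illustrative:

```python
G, RD, RV = 9.80665, 287.05, 461.50
GOR = G / RD
FV = RV / RD - 1.0

def new_surface_pressure(zs, ps, p1, t1, q1, zs_new, gamma=-6.5e-3):
    """Surface pressure on a new orography below the lowest model level:
    hydrostatic balance with a constant -6.5 K/km lapse rate."""
    tv = t1 * (1.0 + FV * q1)                                 # FTV: virtual temperature
    # FZ1: height of the lowest model level above the old surface
    z1 = zs - tv / gamma * ((ps / p1) ** (-gamma / GOR) - 1.0)
    # FP1: pressure at the new surface height
    return p1 * (1.0 + gamma / tv * (zs_new - z1)) ** (-GOR / gamma)
```

Because FP1 is the analytic inverse of FZ1, evaluating at the old orography recovers the old surface pressure, and a raised surface yields a lower pressure.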
BK REAL (KM+1) HYBRID INTERFACE B -! PS REAL (IX) SURFACE PRESSURE (PA) -! OUTPUT ARGUMENT LIST: -! PM REAL (IX,KM) MID-LAYER PRESSURE (PA) -! DP REAL (IX,KM) LAYER DELTA PRESSURE (PA) -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - IMPLICIT NONE - - INTEGER, INTENT(IN) :: IM, KM, NVCOORD, IDVC, IDSL - - REAL, INTENT(IN) :: VCOORD(KM+1,NVCOORD) - REAL, INTENT(IN) :: PS(IM) - - REAL, INTENT(OUT) :: PM(IM,KM) - REAL, OPTIONAL, INTENT(OUT) :: DP(IM,KM) - - REAL, PARAMETER :: RD=287.05, RV=461.50, CP=1004.6, & - ROCP=RD/CP, ROCP1=ROCP+1, ROCPR=1/ROCP, & - FV=RV/RD-1. - - INTEGER :: I, K - - REAL :: AK(KM+1), BK(KM+1), PI(IM,KM+1) - - IF(IDVC.EQ.2) THEN - DO K=1,KM+1 - AK(K) = VCOORD(K,1) - BK(K) = VCOORD(K,2) - PI(1:IM,K) = AK(K) + BK(K)*PS(1:IM) - ENDDO - ELSE - print*,'routine only works for idvc 2' - stop - ENDIF - - IF(IDSL.EQ.2) THEN - DO K=1,KM - PM(1:IM,K) = (PI(1:IM,K)+PI(1:IM,K+1))/2 - ENDDO - ELSE - DO K=1,KM - PM(1:IM,K) = ((PI(1:IM,K)**ROCP1-PI(1:IM,K+1)**ROCP1)/ & - (ROCP1*(PI(1:IM,K)-PI(1:IM,K+1))))**ROCPR - ENDDO - ENDIF - - IF(PRESENT(DP))THEN - DO K=1,KM - DO I=1,IM - DP(I,K) = PI(I,K) - PI(I,K+1) - ENDDO - ENDDO - ENDIF - - END SUBROUTINE NEWPR1 - - SUBROUTINE TERP3(IM,IXZ1,IXQ1,IXZ2,IXQ2,NM,NXQ1,NXQ2, & - KM1,KXZ1,KXQ1,Z1,Q1,KM2,KXZ2,KXQ2,Z2,Q2,J2) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: TERP3 CUBICALLY INTERPOLATE IN ONE DIMENSION -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 98-05-01 -! -! ABSTRACT: INTERPOLATE FIELD(S) IN ONE DIMENSION ALONG THE COLUMN(S). -! THE INTERPOLATION IS CUBIC LAGRANGIAN WITH A MONOTONIC CONSTRAINT -! IN THE CENTER OF THE DOMAIN. IN THE OUTER INTERVALS IT IS LINEAR. -! OUTSIDE THE DOMAIN, FIELDS ARE HELD CONSTANT. -! -! PROGRAM HISTORY LOG: -! 98-05-01 MARK IREDELL -! 1999-01-04 IREDELL USE ESSL SEARCH -! -! USAGE: CALL TERP3(IM,IXZ1,IXQ1,IXZ2,IXQ2,NM,NXQ1,NXQ2, -! & KM1,KXZ1,KXQ1,Z1,Q1,KM2,KXZ2,KXQ2,Z2,Q2,J2) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF COLUMNS -! IXZ1 INTEGER COLUMN SKIP NUMBER FOR Z1 -! 
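The pressure calculation in NEWPR1's supported IDVC=2 branch can be sketched compactly. This is an illustrative translation; the surface-to-top ordering of `ak`/`bk` is an assumption of the sketch:

```python
import numpy as np

def hybrid_pressures(ak, bk, ps, phillips=False):
    """Interface-derived mid-layer and delta pressures for a hybrid a + b*ps
    coordinate (NEWPR1, IDVC=2). phillips=False is the IDSL=2 arithmetic
    mean; phillips=True the IDSL=1 Phillips mid-layer pressure. ak/bk run
    from the surface interface (index 0) to the top."""
    pi = ak[:, None] + bk[:, None] * ps[None, :]      # (km+1, npts) interfaces
    if phillips:
        rocp = 287.05 / 1004.6                        # R/cp
        r1 = rocp + 1.0
        pm = ((pi[:-1]**r1 - pi[1:]**r1) / (r1 * (pi[:-1] - pi[1:]))) ** (1.0 / rocp)
    else:
        pm = 0.5 * (pi[:-1] + pi[1:])
    dp = pi[:-1] - pi[1:]                             # positive when ordered surface-up
    return pm, dp
```

Either way the mid-layer pressure lies between its two bounding interfaces.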
IXQ1 INTEGER COLUMN SKIP NUMBER FOR Q1 -! IXZ2 INTEGER COLUMN SKIP NUMBER FOR Z2 -! IXQ2 INTEGER COLUMN SKIP NUMBER FOR Q2 -! NM INTEGER NUMBER OF FIELDS PER COLUMN -! NXQ1 INTEGER FIELD SKIP NUMBER FOR Q1 -! NXQ2 INTEGER FIELD SKIP NUMBER FOR Q2 -! KM1 INTEGER NUMBER OF INPUT POINTS -! KXZ1 INTEGER POINT SKIP NUMBER FOR Z1 -! KXQ1 INTEGER POINT SKIP NUMBER FOR Q1 -! Z1 REAL (1+(IM-1)*IXZ1+(KM1-1)*KXZ1) -! INPUT COORDINATE VALUES IN WHICH TO INTERPOLATE -! (Z1 MUST BE STRICTLY MONOTONIC IN EITHER DIRECTION) -! Q1 REAL (1+(IM-1)*IXQ1+(KM1-1)*KXQ1+(NM-1)*NXQ1) -! INPUT FIELDS TO INTERPOLATE -! KM2 INTEGER NUMBER OF OUTPUT POINTS -! KXZ2 INTEGER POINT SKIP NUMBER FOR Z2 -! KXQ2 INTEGER POINT SKIP NUMBER FOR Q2 -! Z2 REAL (1+(IM-1)*IXZ2+(KM2-1)*KXZ2) -! OUTPUT COORDINATE VALUES TO WHICH TO INTERPOLATE -! (Z2 NEED NOT BE MONOTONIC) -! -! OUTPUT ARGUMENT LIST: -! Q2 REAL (1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) -! OUTPUT INTERPOLATED FIELDS -! J2 REAL (1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) -! OUTPUT INTERPOLATED FIELDS CHANGE WRT Z2 -! -! SUBPROGRAMS CALLED: -! RSEARCH SEARCH FOR A SURROUNDING REAL INTERVAL -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - IMPLICIT NONE - INTEGER IM,IXZ1,IXQ1,IXZ2,IXQ2,NM,NXQ1,NXQ2 - INTEGER KM1,KXZ1,KXQ1,KM2,KXZ2,KXQ2 - INTEGER I,K1,K2,N - REAL Z1(1+(IM-1)*IXZ1+(KM1-1)*KXZ1) - REAL Q1(1+(IM-1)*IXQ1+(KM1-1)*KXQ1+(NM-1)*NXQ1) - REAL Z2(1+(IM-1)*IXZ2+(KM2-1)*KXZ2) - REAL Q2(1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) - REAL J2(1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) - REAL FFA(IM),FFB(IM),FFC(IM),FFD(IM) - REAL GGA(IM),GGB(IM),GGC(IM),GGD(IM) - INTEGER K1S(IM,KM2) - REAL Z1A,Z1B,Z1C,Z1D,Q1A,Q1B,Q1C,Q1D,Z2S,Q2S,J2S -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! FIND THE SURROUNDING INPUT INTERVAL FOR EACH OUTPUT POINT. - CALL RSEARCH(IM,KM1,IXZ1,KXZ1,Z1,KM2,IXZ2,KXZ2,Z2,1,IM,K1S) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! 
GENERALLY INTERPOLATE CUBICALLY WITH MONOTONIC CONSTRAINT -! FROM TWO NEAREST INPUT POINTS ON EITHER SIDE OF THE OUTPUT POINT, -! BUT WITHIN THE TWO EDGE INTERVALS INTERPOLATE LINEARLY. -! KEEP THE OUTPUT FIELDS CONSTANT OUTSIDE THE INPUT DOMAIN. - -!!$OMP PARALLEL DO DEFAULT(PRIVATE) SHARED(IM,IXZ1,IXQ1,IXZ2), & -!!$OMP& SHARED(IXQ2,NM,NXQ1,NXQ2,KM1,KXZ1,KXQ1,Z1,Q1,KM2,KXZ2), & -!!$OMP& SHARED(KXQ2,Z2,Q2,J2,K1S) - - DO K2=1,KM2 - DO I=1,IM - K1=K1S(I,K2) - IF(K1.EQ.1.OR.K1.EQ.KM1-1) THEN - Z2S=Z2(1+(I-1)*IXZ2+(K2-1)*KXZ2) - Z1A=Z1(1+(I-1)*IXZ1+(K1-1)*KXZ1) - Z1B=Z1(1+(I-1)*IXZ1+(K1+0)*KXZ1) - FFA(I)=(Z2S-Z1B)/(Z1A-Z1B) - FFB(I)=(Z2S-Z1A)/(Z1B-Z1A) - GGA(I)=1/(Z1A-Z1B) - GGB(I)=1/(Z1B-Z1A) - ELSEIF(K1.GT.1.AND.K1.LT.KM1-1) THEN - Z2S=Z2(1+(I-1)*IXZ2+(K2-1)*KXZ2) - Z1A=Z1(1+(I-1)*IXZ1+(K1-2)*KXZ1) - Z1B=Z1(1+(I-1)*IXZ1+(K1-1)*KXZ1) - Z1C=Z1(1+(I-1)*IXZ1+(K1+0)*KXZ1) - Z1D=Z1(1+(I-1)*IXZ1+(K1+1)*KXZ1) - FFA(I)=(Z2S-Z1B)/(Z1A-Z1B)* & - (Z2S-Z1C)/(Z1A-Z1C)* & - (Z2S-Z1D)/(Z1A-Z1D) - FFB(I)=(Z2S-Z1A)/(Z1B-Z1A)* & - (Z2S-Z1C)/(Z1B-Z1C)* & - (Z2S-Z1D)/(Z1B-Z1D) - FFC(I)=(Z2S-Z1A)/(Z1C-Z1A)* & - (Z2S-Z1B)/(Z1C-Z1B)* & - (Z2S-Z1D)/(Z1C-Z1D) - FFD(I)=(Z2S-Z1A)/(Z1D-Z1A)* & - (Z2S-Z1B)/(Z1D-Z1B)* & - (Z2S-Z1C)/(Z1D-Z1C) - GGA(I)= 1/(Z1A-Z1B)* & - (Z2S-Z1C)/(Z1A-Z1C)* & - (Z2S-Z1D)/(Z1A-Z1D)+ & - (Z2S-Z1B)/(Z1A-Z1B)* & - 1/(Z1A-Z1C)* & - (Z2S-Z1D)/(Z1A-Z1D)+ & - (Z2S-Z1B)/(Z1A-Z1B)* & - (Z2S-Z1C)/(Z1A-Z1C)* & - 1/(Z1A-Z1D) - GGB(I)= 1/(Z1B-Z1A)* & - (Z2S-Z1C)/(Z1B-Z1C)* & - (Z2S-Z1D)/(Z1B-Z1D)+ & - (Z2S-Z1A)/(Z1B-Z1A)* & - 1/(Z1B-Z1C)* & - (Z2S-Z1D)/(Z1B-Z1D)+ & - (Z2S-Z1A)/(Z1B-Z1A)* & - (Z2S-Z1C)/(Z1B-Z1C)* & - 1/(Z1B-Z1D) - GGC(I)= 1/(Z1C-Z1A)* & - (Z2S-Z1B)/(Z1C-Z1B)* & - (Z2S-Z1D)/(Z1C-Z1D)+ & - (Z2S-Z1A)/(Z1C-Z1A)* & - 1/(Z1C-Z1B)* & - (Z2S-Z1D)/(Z1C-Z1D)+ & - (Z2S-Z1A)/(Z1C-Z1A)* & - (Z2S-Z1B)/(Z1C-Z1B)* & - 1/(Z1C-Z1D) - GGD(I)= 1/(Z1D-Z1A)* & - (Z2S-Z1B)/(Z1D-Z1B)* & - (Z2S-Z1C)/(Z1D-Z1C)+ & - (Z2S-Z1A)/(Z1D-Z1A)* & - 1/(Z1D-Z1B)* & - (Z2S-Z1C)/(Z1D-Z1C)+ & - 
(Z2S-Z1A)/(Z1D-Z1A)* & - (Z2S-Z1B)/(Z1D-Z1B)* & - 1/(Z1D-Z1C) - ENDIF - ENDDO -! INTERPOLATE. - DO N=1,NM - DO I=1,IM - K1=K1S(I,K2) - IF(K1.EQ.0) THEN - Q2S=Q1(1+(I-1)*IXQ1+(N-1)*NXQ1) - J2S=0 - ELSEIF(K1.EQ.KM1) THEN - Q2S=Q1(1+(I-1)*IXQ1+(KM1-1)*KXQ1+(N-1)*NXQ1) - J2S=0 - ELSEIF(K1.EQ.1.OR.K1.EQ.KM1-1) THEN - Q1A=Q1(1+(I-1)*IXQ1+(K1-1)*KXQ1+(N-1)*NXQ1) - Q1B=Q1(1+(I-1)*IXQ1+(K1+0)*KXQ1+(N-1)*NXQ1) - Q2S=FFA(I)*Q1A+FFB(I)*Q1B - J2S=GGA(I)*Q1A+GGB(I)*Q1B - ELSE - Q1A=Q1(1+(I-1)*IXQ1+(K1-2)*KXQ1+(N-1)*NXQ1) - Q1B=Q1(1+(I-1)*IXQ1+(K1-1)*KXQ1+(N-1)*NXQ1) - Q1C=Q1(1+(I-1)*IXQ1+(K1+0)*KXQ1+(N-1)*NXQ1) - Q1D=Q1(1+(I-1)*IXQ1+(K1+1)*KXQ1+(N-1)*NXQ1) - Q2S=FFA(I)*Q1A+FFB(I)*Q1B+FFC(I)*Q1C+FFD(I)*Q1D - J2S=GGA(I)*Q1A+GGB(I)*Q1B+GGC(I)*Q1C+GGD(I)*Q1D - IF(Q2S.LT.MIN(Q1B,Q1C)) THEN - Q2S=MIN(Q1B,Q1C) - J2S=0 - ELSEIF(Q2S.GT.MAX(Q1B,Q1C)) THEN - Q2S=MAX(Q1B,Q1C) - J2S=0 - ENDIF - ENDIF - Q2(1+(I-1)*IXQ2+(K2-1)*KXQ2+(N-1)*NXQ2)=Q2S - J2(1+(I-1)*IXQ2+(K2-1)*KXQ2+(N-1)*NXQ2)=J2S - ENDDO - ENDDO - ENDDO -!!$OMP END PARALLEL DO - - END SUBROUTINE TERP3 - - SUBROUTINE RSEARCH(IM,KM1,IXZ1,KXZ1,Z1,KM2,IXZ2,KXZ2,Z2,IXL2,KXL2,& - L2) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: RSEARCH SEARCH FOR A SURROUNDING REAL INTERVAL -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 98-05-01 -! -! ABSTRACT: THIS SUBPROGRAM SEARCHES MONOTONIC SEQUENCES OF REAL NUMBERS -! FOR INTERVALS THAT SURROUND A GIVEN SEARCH SET OF REAL NUMBERS. -! THE SEQUENCES MAY BE MONOTONIC IN EITHER DIRECTION; THE REAL NUMBERS -! MAY BE SINGLE OR DOUBLE PRECISION; THE INPUT SEQUENCES AND SETS -! AND THE OUTPUT LOCATIONS MAY BE ARBITRARILY DIMENSIONED. -! -! PROGRAM HISTORY LOG: -! 1999-01-05 MARK IREDELL -! -! USAGE: CALL RSEARCH(IM,KM1,IXZ1,KXZ1,Z1,KM2,IXZ2,KXZ2,Z2,IXL2,KXL2, -! & L2) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF SEQUENCES TO SEARCH -! KM1 INTEGER NUMBER OF POINTS IN EACH SEQUENCE -! IXZ1 INTEGER SEQUENCE SKIP NUMBER FOR Z1 -! KXZ1 INTEGER POINT SKIP NUMBER FOR Z1 -! 
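The interpolation scheme TERP3 implements (cubic Lagrangian in the interior with a monotonic clamp, linear in the edge intervals, constant outside the domain) can be illustrated for a single column. A minimal sketch assuming an ascending coordinate with at least four input levels; the skip-number generality of the Fortran is omitted:

```python
import numpy as np

def lagrange4_interp(z1, q1, z2):
    """Interpolate q1(z1) to points z2: 4-point Lagrange in the interior with
    a monotonic clamp between the two central points, linear in the two edge
    intervals, constant outside the domain (the TERP3 scheme, one column)."""
    km1 = len(z1)
    q2 = np.empty_like(z2)
    for j, z in enumerate(z2):
        # interval index L: z1[L-1] <= z < z1[L]; 0 below, km1 at/above the end
        L = np.searchsorted(z1, z, side='right')
        if L == 0:
            q2[j] = q1[0]                        # below domain: hold constant
        elif L == km1:
            q2[j] = q1[-1]                       # above domain: hold constant
        elif L == 1 or L == km1 - 1:             # edge intervals: linear
            a, b = L - 1, L
            w = (z - z1[a]) / (z1[b] - z1[a])
            q2[j] = (1 - w) * q1[a] + w * q1[b]
        else:                                    # interior: cubic Lagrange
            idx = [L - 2, L - 1, L, L + 1]
            val = 0.0
            for i in idx:
                w = 1.0
                for k in idx:
                    if k != i:
                        w *= (z - z1[k]) / (z1[i] - z1[k])
                val += w * q1[i]
            lo, hi = sorted((q1[L - 1], q1[L]))  # monotonic clamp
            q2[j] = min(max(val, lo), hi)
    return q2
```

On linear data the scheme is exact, and the clamp guarantees no new extrema between the two surrounding input points.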
Z1 REAL (1+(IM-1)*IXZ1+(KM1-1)*KXZ1) -! SEQUENCE VALUES TO SEARCH -! (Z1 MUST BE MONOTONIC IN EITHER DIRECTION) -! KM2 INTEGER NUMBER OF POINTS TO SEARCH FOR -! IN EACH RESPECTIVE SEQUENCE -! IXZ2 INTEGER SEQUENCE SKIP NUMBER FOR Z2 -! KXZ2 INTEGER POINT SKIP NUMBER FOR Z2 -! Z2 REAL (1+(IM-1)*IXZ2+(KM2-1)*KXZ2) -! SET OF VALUES TO SEARCH FOR -! (Z2 NEED NOT BE MONOTONIC) -! IXL2 INTEGER SEQUENCE SKIP NUMBER FOR L2 -! KXL2 INTEGER POINT SKIP NUMBER FOR L2 -! -! OUTPUT ARGUMENT LIST: -! L2 INTEGER (1+(IM-1)*IXL2+(KM2-1)*KXL2) -! INTERVAL LOCATIONS HAVING VALUES FROM 0 TO KM1 -! (Z2 WILL BE BETWEEN Z1(L2) AND Z1(L2+1)) -! -! SUBPROGRAMS CALLED: -! SBSRCH ESSL BINARY SEARCH -! DBSRCH ESSL BINARY SEARCH -! -! REMARKS: -! IF THE ARRAY Z1 IS DIMENSIONED (IM,KM1), THEN THE SKIP NUMBERS ARE -! IXZ1=1 AND KXZ1=IM; IF IT IS DIMENSIONED (KM1,IM), THEN THE SKIP -! NUMBERS ARE IXZ1=KM1 AND KXZ1=1; IF IT IS DIMENSIONED (IM,JM,KM1), -! THEN THE SKIP NUMBERS ARE IXZ1=1 AND KXZ1=IM*JM; ETCETERA. -! SIMILAR EXAMPLES APPLY TO THE SKIP NUMBERS FOR Z2 AND L2. -! -! RETURNED VALUES OF 0 OR KM1 INDICATE THAT THE GIVEN SEARCH VALUE -! IS OUTSIDE THE RANGE OF THE SEQUENCE. -! -! IF A SEARCH VALUE IS IDENTICAL TO ONE OF THE SEQUENCE VALUES -! THEN THE LOCATION RETURNED POINTS TO THE IDENTICAL VALUE. -! IF THE SEQUENCE IS NOT STRICTLY MONOTONIC AND A SEARCH VALUE IS -! IDENTICAL TO MORE THAN ONE OF THE SEQUENCE VALUES, THEN THE -! LOCATION RETURNED MAY POINT TO ANY OF THE IDENTICAL VALUES. -! -! TO BE EXACT, FOR EACH I FROM 1 TO IM AND FOR EACH K FROM 1 TO KM2, -! Z=Z2(1+(I-1)*IXZ2+(K-1)*KXZ2) IS THE SEARCH VALUE AND -! L=L2(1+(I-1)*IXL2+(K-1)*KXL2) IS THE LOCATION RETURNED. -! IF L=0, THEN Z IS LESS THAN THE START POINT Z1(1+(I-1)*IXZ1) -! FOR ASCENDING SEQUENCES (OR GREATER THAN FOR DESCENDING SEQUENCES). -! IF L=KM1, THEN Z IS GREATER THAN OR EQUAL TO THE END POINT -! Z1(1+(I-1)*IXZ1+(KM1-1)*KXZ1) FOR ASCENDING SEQUENCES -! (OR LESS THAN OR EQUAL TO FOR DESCENDING SEQUENCES). -! 
OTHERWISE Z IS BETWEEN THE VALUES Z1(1+(I-1)*IXZ1+(L-1)*KXZ1) AND -! Z1(1+(I-1)*IXZ1+(L-0)*KXZ1) AND MAY EQUAL THE FORMER. -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ -! IMPLICIT NONE -! INTEGER,INTENT(IN):: IM,KM1,IXZ1,KXZ1,KM2,IXZ2,KXZ2,IXL2,KXL2 -! REAL,INTENT(IN):: Z1(1+(IM-1)*IXZ1+(KM1-1)*KXZ1) -! REAL,INTENT(IN):: Z2(1+(IM-1)*IXZ2+(KM2-1)*KXZ2) -! INTEGER,INTENT(OUT):: L2(1+(IM-1)*IXL2+(KM2-1)*KXL2) -! INTEGER(4) INCX,N,INCY,M,INDX(KM2),RC(KM2),IOPT -! INTEGER I,K2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! FIND THE SURROUNDING INPUT INTERVAL FOR EACH OUTPUT POINT. -! DO I=1,IM -! IF(Z1(1+(I-1)*IXZ1).LE.Z1(1+(I-1)*IXZ1+(KM1-1)*KXZ1)) THEN -! INPUT COORDINATE IS MONOTONICALLY ASCENDING. -! INCX=KXZ2 -! N=KM2 -! INCY=KXZ1 -! M=KM1 -! IOPT=1 -! IF(DIGITS(1.).LT.DIGITS(1._8)) THEN -! CALL SBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ELSE -! CALL DBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ENDIF -! DO K2=1,KM2 -! L2(1+(I-1)*IXL2+(K2-1)*KXL2)=INDX(K2)-RC(K2) -! ENDDO -! ELSE -! INPUT COORDINATE IS MONOTONICALLY DESCENDING. -! INCX=KXZ2 -! N=KM2 -! INCY=-KXZ1 -! M=KM1 -! IOPT=0 -! IF(DIGITS(1.).LT.DIGITS(1._8)) THEN -! CALL SBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ELSE -! CALL DBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ENDIF -! DO K2=1,KM2 -! L2(1+(I-1)*IXL2+(K2-1)*KXL2)=KM1+1-INDX(K2) -! ENDDO -! ENDIF -! ENDDO -! - IMPLICIT NONE - INTEGER,INTENT(IN):: IM,KM1,IXZ1,KXZ1,KM2,IXZ2,KXZ2,IXL2,KXL2 - REAL,INTENT(IN):: Z1(1+(IM-1)*IXZ1+(KM1-1)*KXZ1) - REAL,INTENT(IN):: Z2(1+(IM-1)*IXZ2+(KM2-1)*KXZ2) - INTEGER,INTENT(OUT):: L2(1+(IM-1)*IXL2+(KM2-1)*KXL2) - INTEGER I,K2,L - REAL Z -!C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -!C FIND THE SURROUNDING INPUT INTERVAL FOR EACH OUTPUT POINT. 
- DO I=1,IM - IF(Z1(1+(I-1)*IXZ1).LE.Z1(1+(I-1)*IXZ1+(KM1-1)*KXZ1)) THEN -!C INPUT COORDINATE IS MONOTONICALLY ASCENDING. - DO K2=1,KM2 - Z=Z2(1+(I-1)*IXZ2+(K2-1)*KXZ2) - L=0 - DO - IF(Z.LT.Z1(1+(I-1)*IXZ1+L*KXZ1)) EXIT - L=L+1 - IF(L.EQ.KM1) EXIT - ENDDO - L2(1+(I-1)*IXL2+(K2-1)*KXL2)=L - ENDDO - ELSE -!C INPUT COORDINATE IS MONOTONICALLY DESCENDING. - DO K2=1,KM2 - Z=Z2(1+(I-1)*IXZ2+(K2-1)*KXZ2) - L=0 - DO - IF(Z.GT.Z1(1+(I-1)*IXZ1+L*KXZ1)) EXIT - L=L+1 - IF(L.EQ.KM1) EXIT - ENDDO - L2(1+(I-1)*IXL2+(K2-1)*KXL2)=L - ENDDO - ENDIF - ENDDO - - END SUBROUTINE RSEARCH - - SUBROUTINE VINTG(IM,KM1,KM2,NT,P1,U1,V1,T1,Q1,W1,P2, & - U2,V2,T2,Q2,W2) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: VINTG VERTICALLY INTERPOLATE UPPER-AIR FIELDS -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 92-10-31 -! -! ABSTRACT: VERTICALLY INTERPOLATE UPPER-AIR FIELDS. -! WIND, TEMPERATURE, HUMIDITY AND OTHER TRACERS ARE INTERPOLATED. -! THE INTERPOLATION IS CUBIC LAGRANGIAN IN LOG PRESSURE -! WITH A MONOTONIC CONSTRAINT IN THE CENTER OF THE DOMAIN. -! IN THE OUTER INTERVALS IT IS LINEAR IN LOG PRESSURE. -! OUTSIDE THE DOMAIN, FIELDS ARE GENERALLY HELD CONSTANT, -! EXCEPT FOR TEMPERATURE AND HUMIDITY BELOW THE INPUT DOMAIN, -! WHERE THE TEMPERATURE LAPSE RATE IS HELD FIXED AT -6.5 K/KM AND -! THE RELATIVE HUMIDITY IS HELD CONSTANT. -! -! PROGRAM HISTORY LOG: -! 91-10-31 MARK IREDELL -! -! USAGE: CALL VINTG(IM,KM1,KM2,NT,P1,U1,V1,T1,Q1,P2, -! & U2,V2,T2,Q2) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF POINTS TO COMPUTE -! KM1 INTEGER NUMBER OF INPUT LEVELS -! KM2 INTEGER NUMBER OF OUTPUT LEVELS -! NT INTEGER NUMBER OF TRACERS -! P1 REAL (IM,KM1) INPUT PRESSURES -! ORDERED FROM BOTTOM TO TOP OF ATMOSPHERE -! U1 REAL (IM,KM1) INPUT ZONAL WIND -! V1 REAL (IM,KM1) INPUT MERIDIONAL WIND -! T1 REAL (IM,KM1) INPUT TEMPERATURE (K) -! Q1 REAL (IM,KM1,NT) INPUT TRACERS (HUMIDITY FIRST) -! P2 REAL (IM,KM2) OUTPUT PRESSURES -! OUTPUT ARGUMENT LIST: -! U2 REAL (IM,KM2) OUTPUT ZONAL WIND -! 
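RSEARCH's ascending-sequence branch (the active linear scan; the ESSL binary-search variant is commented out in the source) has simple semantics that a few lines of Python make concrete. An illustrative sketch, one sequence at a time:

```python
def rsearch(z1, z2):
    """Interval location for each z in z2 against the ascending sequence z1:
    returns L in 0..len(z1), with L=0 below the sequence, L=len(z1) at or
    above its end, and otherwise z1[L-1] <= z < z1[L] (so a search value
    identical to a sequence value points at that value)."""
    locs = []
    for z in z2:
        L = 0
        while L < len(z1) and z >= z1[L]:
            L += 1
        locs.append(L)
    return locs
```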
V2 REAL (IM,KM2) OUTPUT MERIDIONAL WIND -! T2 REAL (IM,KM2) OUTPUT TEMPERATURE (K) -! Q2 REAL (IM,KM2,NT) OUTPUT TRACERS (HUMIDITY FIRST) -! -! SUBPROGRAMS CALLED: -! TERP3 CUBICALLY INTERPOLATE IN ONE DIMENSION -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - IMPLICIT NONE - - INTEGER, INTENT(IN) :: IM, KM1, KM2, NT - - REAL, INTENT(IN) :: P1(IM,KM1),U1(IM,KM1),V1(IM,KM1) - REAL, INTENT(IN) :: T1(IM,KM1),Q1(IM,KM1,NT) - REAL, INTENT(IN) :: W1(IM,KM1),P2(IM,KM2) - REAL, INTENT(OUT) :: U2(IM,KM2),V2(IM,KM2) - REAL, INTENT(OUT) :: T2(IM,KM2),Q2(IM,KM2,NT) - REAL, INTENT(OUT) :: W2(IM,KM2) - - REAL, PARAMETER :: DLTDZ=-6.5E-3*287.05/9.80665 - REAL, PARAMETER :: DLPVDRT=-2.5E6/461.50 - - INTEGER :: I, K, N - - REAL :: DZ - REAL,ALLOCATABLE :: Z1(:,:),Z2(:,:) - REAL,ALLOCATABLE :: C1(:,:,:),C2(:,:,:),J2(:,:,:) - - ALLOCATE (Z1(IM+1,KM1),Z2(IM+1,KM2)) - ALLOCATE (C1(IM+1,KM1,4+NT),C2(IM+1,KM2,4+NT),J2(IM+1,KM2,4+NT)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE LOG PRESSURE INTERPOLATING COORDINATE -! AND COPY INPUT WIND, TEMPERATURE, HUMIDITY AND OTHER TRACERS -!$OMP PARALLEL DO PRIVATE(K,I) - DO K=1,KM1 - DO I=1,IM - Z1(I,K) = -LOG(P1(I,K)) - C1(I,K,1) = U1(I,K) - C1(I,K,2) = V1(I,K) - C1(I,K,3) = W1(I,K) - C1(I,K,4) = T1(I,K) - C1(I,K,5) = Q1(I,K,1) - ENDDO - ENDDO -!$OMP END PARALLEL DO - DO N=2,NT - DO K=1,KM1 - DO I=1,IM - C1(I,K,4+N) = Q1(I,K,N) - ENDDO - ENDDO - ENDDO -!$OMP PARALLEL DO PRIVATE(K,I) - DO K=1,KM2 - DO I=1,IM - Z2(I,K) = -LOG(P2(I,K)) - ENDDO - ENDDO -!$OMP END PARALLEL DO -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! PERFORM LAGRANGIAN ONE-DIMENSIONAL INTERPOLATION -! THAT IS 4TH-ORDER IN INTERIOR, 2ND-ORDER IN OUTSIDE INTERVALS -! AND 1ST-ORDER FOR EXTRAPOLATION. - CALL TERP3(IM,1,1,1,1,4+NT,(IM+1)*KM1,(IM+1)*KM2, & - KM1,IM+1,IM+1,Z1,C1,KM2,IM+1,IM+1,Z2,C2,J2) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! 
COPY OUTPUT WIND, TEMPERATURE, HUMIDITY AND OTHER TRACERS -! EXCEPT BELOW THE INPUT DOMAIN, LET TEMPERATURE INCREASE WITH A FIXED -! LAPSE RATE AND LET THE RELATIVE HUMIDITY REMAIN CONSTANT. - DO K=1,KM2 - DO I=1,IM - U2(I,K)=C2(I,K,1) - V2(I,K)=C2(I,K,2) - W2(I,K)=C2(I,K,3) - DZ=Z2(I,K)-Z1(I,1) - IF(DZ.GE.0) THEN - T2(I,K)=C2(I,K,4) - Q2(I,K,1)=C2(I,K,5) - ELSE - T2(I,K)=T1(I,1)*EXP(DLTDZ*DZ) - Q2(I,K,1)=Q1(I,1,1)*EXP(DLPVDRT*(1/T2(I,K)-1/T1(I,1))-DZ) - ENDIF - ENDDO - ENDDO - DO N=2,NT - DO K=1,KM2 - DO I=1,IM - Q2(I,K,N)=C2(I,K,4+N) - ENDDO - ENDDO - ENDDO - DEALLOCATE (Z1,Z2,C1,C2,J2) - END SUBROUTINE VINTG - end module utils diff --git a/sorc/enkf_chgres_recenter_nc.fd/driver.f90 b/sorc/enkf_chgres_recenter_nc.fd/driver.f90 deleted file mode 100644 index 1ec7c70f034..00000000000 --- a/sorc/enkf_chgres_recenter_nc.fd/driver.f90 +++ /dev/null @@ -1,67 +0,0 @@ -!!! based on chgres_recenter -!!! cory.r.martin@noaa.gov 2019-09-27 - program regrid - - use setup, only : program_setup - use interp, only : gaus_to_gaus, adjust_for_terrain - use input_data, only : read_input_data, & - read_vcoord_info - use output_data, only : set_output_grid, write_output_data - - implicit none - - call w3tagb('ENKF_CHGRES_RECENTER_NCIO',2019,0270,0085,'NP20') - - print*,"STARTING PROGRAM" - -!-------------------------------------------------------- -! Read configuration namelist. -!-------------------------------------------------------- - - call program_setup - -!-------------------------------------------------------- -! Read input grid data -!-------------------------------------------------------- - - call read_input_data - -!-------------------------------------------------------- -! Read vertical coordinate info -!-------------------------------------------------------- - - call read_vcoord_info - -!-------------------------------------------------------- -! 
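The distinctive part of VINTG is its treatment below the input domain: in the z = -ln(p) coordinate, temperature follows a fixed -6.5 K/km lapse rate and specific humidity is rescaled so relative humidity stays constant. A minimal sketch of that final ELSE branch, with illustrative names:

```python
import math

DLTDZ = -6.5e-3 * 287.05 / 9.80665   # dT/dz in z = -ln(p) for a -6.5 K/km lapse rate
DLPVDRT = -2.5e6 / 461.50            # -L/Rv: Clausius-Clapeyron factor, constant RH

def extrapolate_below(t_bot, q_bot, dz):
    """Temperature and specific humidity below the input domain, where
    dz = z_out - z_bottom < 0 in the z = -ln(p) coordinate."""
    t2 = t_bot * math.exp(DLTDZ * dz)
    q2 = q_bot * math.exp(DLPVDRT * (1.0 / t2 - 1.0 / t_bot) - dz)
    return t2, q2
```

Below the domain (dz < 0, i.e. higher pressure), temperature increases and the saturation scaling raises specific humidity along with it.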
Get output grid specs -!-------------------------------------------------------- - - call set_output_grid - -!-------------------------------------------------------- -! Interpolate data to output grid -!-------------------------------------------------------- - - call gaus_to_gaus - -!-------------------------------------------------------- -! Adjust output fields for differences between -! interpolated and external terrain. -!-------------------------------------------------------- - - call adjust_for_terrain - -!-------------------------------------------------------- -! Write output data to file. -!-------------------------------------------------------- - - call write_output_data - - print* - print*,"PROGRAM FINISHED NORMALLY!" - - call w3tage('ENKF_CHGRES_RECENTER_NCIO') - - stop - - end program regrid diff --git a/sorc/enkf_chgres_recenter_nc.fd/input_data.f90 b/sorc/enkf_chgres_recenter_nc.fd/input_data.f90 deleted file mode 100644 index b77fe26b3e6..00000000000 --- a/sorc/enkf_chgres_recenter_nc.fd/input_data.f90 +++ /dev/null @@ -1,345 +0,0 @@ - module input_data - - use utils - use setup - use module_ncio - - implicit none - - private - - integer, public :: idvc, idsl, idvm, nvcoord - integer, public :: nvcoord_input, ntrac, ncldt - integer, public :: ij_input, kgds_input(200) - integer, public :: i_input, j_input, lev, lev_output - integer, public :: idate(6) - integer, public :: icldamt, iicmr, & - idelz,idpres,idzdt, & - irwmr,isnmr,igrle - - - real, allocatable, public :: vcoord(:,:) - real, allocatable, public :: vcoord_input(:,:) - real, allocatable, public :: clwmr_input(:,:) - real, allocatable, public :: dzdt_input(:,:) - real, allocatable, public :: grle_input(:,:) - real, allocatable, public :: cldamt_input(:,:) - real, allocatable, public :: hgt_input(:) - real, allocatable, public :: icmr_input(:,:) - real, allocatable, public :: o3mr_input(:,:) - real, allocatable, public :: rwmr_input(:,:) - real, allocatable, public :: sfcp_input(:) - 
real, allocatable, public :: snmr_input(:,:) - real, allocatable, public :: spfh_input(:,:) - real, allocatable, public :: tmp_input(:,:) - real, allocatable, public :: ugrd_input(:,:) - real, allocatable, public :: vgrd_input(:,:) - real :: missing_value=1.e30 - - public :: read_input_data - public :: read_vcoord_info - - contains - - subroutine read_input_data - -!------------------------------------------------------------------------------------- -! Read input grid data from a netcdf file. -!------------------------------------------------------------------------------------- - - implicit none - - integer :: vlev,rvlev - type(Dataset) :: indset - type(Dimension) :: ncdim - real, allocatable :: work2d(:,:),work3d(:,:,:) - integer iret, k, kk - real, allocatable :: ak(:), bk(:) - - ! hard code these values that are the same for GFS - idvc=2 - idsl=1 - idvm=1 - ntrac = 8 - ncldt = 5 - - print* - print*,"OPEN INPUT FILE: ",trim(input_file) - indset = open_dataset(input_file) - - print*,"GET INPUT FILE HEADER" - ncdim = get_dim(indset, 'grid_xt'); i_input = ncdim%len - ncdim = get_dim(indset, 'grid_yt'); j_input = ncdim%len - ncdim = get_dim(indset, 'pfull'); lev = ncdim%len - idate = get_idate_from_time_units(indset) - - print*,'DIMENSIONS OF DATA ARE: ', i_input, j_input, lev - print*,'DATE OF DATA IS: ', idate - - ij_input = i_input * j_input - - call read_attribute(indset, 'ak', ak) - call read_attribute(indset, 'bk', bk) - - nvcoord_input = 2 - allocate(vcoord_input(lev+1,nvcoord_input)) - do k = 1, lev+1 - kk = lev+2-k - vcoord_input(k,1) = ak(kk) - vcoord_input(k,2) = bk(kk) - print*,'VCOORD OF INPUT DATA ',k,vcoord_input(k,:) - enddo - - deallocate(ak, bk) - - print* - print*,"READ SURFACE PRESSURE" - call read_vardata(indset, 'pressfc', work2d) - - allocate(sfcp_input(ij_input)) - sfcp_input = reshape(work2d,(/ij_input/)) - print*,'MAX/MIN SURFACE PRESSURE: ',maxval(sfcp_input), minval(sfcp_input) - - print* - print*,"READ SURFACE HEIGHT" - call 
read_vardata(indset, 'hgtsfc', work2d) - - allocate(hgt_input(ij_input)) - hgt_input = reshape(work2d,(/ij_input/)) - print*,'MAX/MIN SURFACE HEIGHT: ',maxval(hgt_input), minval(hgt_input) - - print* - print*,"READ U WIND" - allocate(ugrd_input(ij_input,lev)) - call read_vardata(indset, 'ugrd', work3d) - do vlev = 1, lev - rvlev = lev+1-vlev - ugrd_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN U WIND AT LEVEL ',vlev, "IS: ", maxval(ugrd_input(:,vlev)), minval(ugrd_input(:,vlev)) - enddo - - print* - print*,"READ V WIND" - allocate(vgrd_input(ij_input,lev)) - call read_vardata(indset, 'vgrd', work3d) - do vlev = 1, lev - rvlev = lev+1-vlev - vgrd_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN V WIND AT LEVEL ', vlev, "IS: ", maxval(vgrd_input(:,vlev)), minval(vgrd_input(:,vlev)) - enddo - - print* - print*,"READ TEMPERATURE" - allocate(tmp_input(ij_input,lev)) - call read_vardata(indset, 'tmp', work3d) - do vlev = 1, lev - rvlev = lev+1-vlev - tmp_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN TEMPERATURE AT LEVEL ', vlev, 'IS: ', maxval(tmp_input(:,vlev)), minval(tmp_input(:,vlev)) - enddo - - print* - print*,"READ SPECIFIC HUMIDITY" - allocate(spfh_input(ij_input,lev)) - call read_vardata(indset, 'spfh', work3d) - do vlev = 1, lev - rvlev = lev+1-vlev - spfh_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN SPECIFIC HUMIDITY AT LEVEL ', vlev, 'IS: ', maxval(spfh_input(:,vlev)), minval(spfh_input(:,vlev)) - enddo - - print* - print*,"READ CLOUD LIQUID WATER" - allocate(clwmr_input(ij_input,lev)) - call read_vardata(indset, 'clwmr', work3d) - do vlev = 1, lev - rvlev = lev+1-vlev - clwmr_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN CLOUD LIQUID WATER AT LEVEL ', vlev, 'IS: ', maxval(clwmr_input(:,vlev)), minval(clwmr_input(:,vlev)) - enddo - - print* - print*,"READ OZONE" - allocate(o3mr_input(ij_input,lev)) - call 
read_vardata(indset, 'o3mr', work3d) - do vlev = 1, lev - rvlev = lev+1-vlev - o3mr_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN OZONE AT LEVEL ', vlev, 'IS: ', maxval(o3mr_input(:,vlev)), minval(o3mr_input(:,vlev)) - enddo - - print* - print*,"READ DZDT" - allocate(dzdt_input(ij_input,lev)) - call read_vardata(indset, 'dzdt', work3d, errcode=iret) - if (iret == 0) then - do vlev = 1, lev - rvlev = lev+1-vlev - dzdt_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN DZDT AT LEVEL ', vlev, 'IS: ', maxval(dzdt_input(:,vlev)), minval(dzdt_input(:,vlev)) - enddo - idzdt = 1 - else - dzdt_input = missing_value - print*,'DZDT NOT IN INPUT FILE' - idzdt = 0 - endif - - - print* - print*,"READ RWMR" - allocate(rwmr_input(ij_input,lev)) - call read_vardata(indset, 'rwmr', work3d, errcode=iret) - if (iret == 0) then - do vlev = 1, lev - rvlev = lev+1-vlev - rwmr_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN RWMR AT LEVEL ', vlev, 'IS: ', maxval(rwmr_input(:,vlev)), minval(rwmr_input(:,vlev)) - enddo - irwmr = 1 - else - rwmr_input = missing_value - print*,'RWMR NOT IN INPUT FILE' - irwmr = 0 - endif - - print* - print*,"READ ICMR" - allocate(icmr_input(ij_input,lev)) - call read_vardata(indset, 'icmr', work3d, errcode=iret) - if (iret == 0) then - do vlev = 1, lev - rvlev = lev+1-vlev - icmr_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN ICMR AT LEVEL ', vlev, 'IS: ', maxval(icmr_input(:,vlev)), minval(icmr_input(:,vlev)) - enddo - iicmr = 1 - else - icmr_input = missing_value - print*,'ICMR NOT IN INPUT FILE' - iicmr = 0 - endif - - print* - print*,"READ SNMR" - allocate(snmr_input(ij_input,lev)) - call read_vardata(indset, 'snmr', work3d, errcode=iret) - if (iret == 0) then - do vlev = 1, lev - rvlev = lev+1-vlev - snmr_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN SNMR AT LEVEL ', vlev, 'IS: ', maxval(snmr_input(:,vlev)), 
minval(snmr_input(:,vlev)) - enddo - isnmr = 1 - else - snmr_input = missing_value - print*,'SNMR NOT IN INPUT FILE' - isnmr = 0 - endif - - print* - print*,"READ GRLE" - allocate(grle_input(ij_input,lev)) - call read_vardata(indset, 'grle', work3d, errcode=iret) - if (iret == 0) then - do vlev = 1, lev - rvlev = lev+1-vlev - grle_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN GRLE AT LEVEL ', vlev, 'IS: ', maxval(grle_input(:,vlev)), minval(grle_input(:,vlev)) - enddo - igrle = 1 - else - grle_input = missing_value - print*,'GRLE NOT IN INPUT FILE' - igrle = 0 - endif - - print* - print*,"READ CLD_AMT" - allocate(cldamt_input(ij_input,lev)) - if (cld_amt) then - call read_vardata(indset, 'cld_amt', work3d, errcode=iret) - if (iret == 0) then - do vlev = 1, lev - rvlev = lev+1-vlev - cldamt_input(:,vlev) = reshape(work3d(:,:,rvlev),(/ij_input/)) - print*,'MAX/MIN CLD_AMT AT LEVEL ', vlev, 'IS: ', maxval(cldamt_input(:,vlev)), minval(cldamt_input(:,vlev)) - enddo - icldamt = 1 - else - cldamt_input = missing_value - print*,'CLDAMT NOT IN INPUT FILE' - icldamt = 0 - endif - else - cldamt_input = missing_value - print*,'CLDAMT NOT READ - CLD_AMT NAMELIST OPTION NOT SET TO TRUE' - icldamt = 0 - end if - - call read_vardata(indset, 'dpres', work3d, errcode=iret) - if (iret == 0) then - idpres = 1 - else - idpres = 0 - endif - call read_vardata(indset, 'delz', work3d, errcode=iret) - if (iret == 0) then - idelz = 1 - else - idelz = 0 - endif - - print*,"CLOSE FILE" - call close_dataset(indset) - deallocate(work2d,work3d) - -!--------------------------------------------------------------------------------------- -! Set the grib 1 grid description array needed by the NCEP IPOLATES library. 
-!--------------------------------------------------------------------------------------- - - call calc_kgds(i_input, j_input, kgds_input) - - return - - end subroutine read_input_data - - subroutine read_vcoord_info - -!--------------------------------------------------------------------------------- -! Read vertical coordinate information. -!--------------------------------------------------------------------------------- - - implicit none - - integer :: istat, n, k, k2 - - real, allocatable :: ak(:), bk(:) - - type(Dataset) :: refdset - - print* - print*,"READ OUTPUT VERT COORDINATE INFO FROM REFERENCE FILE: ",trim(ref_file) - - refdset = open_dataset(ref_file) - call read_attribute(refdset, 'ak', ak) - call read_attribute(refdset, 'bk', bk) - call close_dataset(refdset) - - lev_output = size(bk) - 1 - - nvcoord=2 - allocate(vcoord(lev_output+1, nvcoord)) - - do k = 1, (lev_output+1) - k2 = lev_output+2 - k - vcoord(k,1) = ak(k2) - vcoord(k,2) = bk(k2) - print*,'VCOORD OF OUTPUT GRID ',k,vcoord(k,:) - enddo - - deallocate (ak, bk) - - end subroutine read_vcoord_info - - end module input_data diff --git a/sorc/enkf_chgres_recenter_nc.fd/interp.f90 b/sorc/enkf_chgres_recenter_nc.fd/interp.f90 deleted file mode 100644 index 291e8ef0d34..00000000000 --- a/sorc/enkf_chgres_recenter_nc.fd/interp.f90 +++ /dev/null @@ -1,582 +0,0 @@ - module interp - - implicit none - - private - - real, allocatable :: sfcp_b4_adj_output(:) - real, allocatable :: clwmr_b4_adj_output(:,:) - real, allocatable :: dzdt_b4_adj_output(:,:) - real, allocatable :: grle_b4_adj_output(:,:) - real, allocatable :: cldamt_b4_adj_output(:,:) - real, allocatable :: icmr_b4_adj_output(:,:) - real, allocatable :: o3mr_b4_adj_output(:,:) - real, allocatable :: rwmr_b4_adj_output(:,:) - real, allocatable :: snmr_b4_adj_output(:,:) - real, allocatable :: spfh_b4_adj_output(:,:) - real, allocatable :: tmp_b4_adj_output(:,:) - real, allocatable :: ugrd_b4_adj_output(:,:) - real, allocatable :: 
vgrd_b4_adj_output(:,:) - - public :: adjust_for_terrain - public :: gaus_to_gaus - - contains - - subroutine adjust_for_terrain - -!--------------------------------------------------------------------------------- -! Adjust fields based on differences between the interpolated and external -! terrain. -!--------------------------------------------------------------------------------- - - use input_data - use output_data - use utils - use setup - - implicit none - - integer :: k - - real, allocatable :: pres_b4_adj_output(:,:) - real, allocatable :: pres_output(:,:) - real, allocatable :: q_b4_adj_output(:,:,:), q_output(:,:,:) - -!--------------------------------------------------------------------------------- -! First, compute the mid-layer pressure using the interpolated surface pressure. -!--------------------------------------------------------------------------------- - - allocate(pres_b4_adj_output(ij_output,lev)) - pres_b4_adj_output = 0.0 - - print*,'before newpr1, sfcp b4 adj: ', sfcp_b4_adj_output(ij_output/2) - - print* - print*,"COMPUTE MID-LAYER PRESSURE FROM INTERPOLATED SURFACE PRESSURE." - call newpr1(ij_output, lev, idvc, idsl, nvcoord_input, vcoord_input, & - sfcp_b4_adj_output, pres_b4_adj_output) - - print*,'after newpr1, pres b4 adj: ', pres_b4_adj_output(ij_output/2,:) - -!--------------------------------------------------------------------------------- -! Adjust surface pressure based on differences between interpolated and -! grid terrain. 
-!--------------------------------------------------------------------------------- - - allocate(sfcp_output(ij_output)) - sfcp_output = 0.0 - - print*,"ADJUST SURFACE PRESSURE BASED ON TERRAIN DIFFERENCES" - call newps(hgt_output, sfcp_b4_adj_output, ij_output, & - lev, pres_b4_adj_output, tmp_b4_adj_output, & - spfh_b4_adj_output, hgt_external_output, sfcp_output) - - print*,'after newps ',sfcp_b4_adj_output(ij_output/2),sfcp_output(ij_output/2) - - deallocate(sfcp_b4_adj_output) - -!--------------------------------------------------------------------------------- -! Recompute mid-layer pressure based on the adjusted surface pressure. -!--------------------------------------------------------------------------------- - - allocate(pres_output(ij_output, lev_output)) - pres_output = 0.0 - - allocate(dpres_output(ij_output, lev_output)) - dpres_output = 0.0 - - print*,'before newpr1 ',sfcp_output(ij_output/2) - print*,'before newpr1 ',idvc,idsl,nvcoord,vcoord - - print*,"RECOMPUTE MID-LAYER PRESSURE." - call newpr1(ij_output, lev_output, idvc, idsl, nvcoord, vcoord, & - sfcp_output, pres_output, dpres_output) - - do k = 1, lev_output - print*,'after newpr1 ',pres_output(ij_output/2,k), dpres_output(ij_output/2,k) - enddo - -!--------------------------------------------------------------------------------- -! Vertically interpolate from the pre-adjusted to the adjusted mid-layer -! pressures. 
-!--------------------------------------------------------------------------------- - - allocate(q_b4_adj_output(ij_output,lev,ntrac)) - q_b4_adj_output(:,:,1) = spfh_b4_adj_output(:,:) - q_b4_adj_output(:,:,2) = o3mr_b4_adj_output(:,:) - q_b4_adj_output(:,:,3) = clwmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,4) = rwmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,5) = icmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,6) = snmr_b4_adj_output(:,:) - q_b4_adj_output(:,:,7) = grle_b4_adj_output(:,:) - q_b4_adj_output(:,:,8) = cldamt_b4_adj_output(:,:) - - allocate(q_output(ij_output,lev_output,ntrac)) - q_output = 0.0 - - allocate(dzdt_output(ij_output,lev_output)) - dzdt_output = 0.0 - - allocate(ugrd_output(ij_output,lev_output)) - ugrd_output=0.0 - - allocate(vgrd_output(ij_output,lev_output)) - vgrd_output=0.0 - - allocate(tmp_output(ij_output,lev_output)) - tmp_output=0.0 - - print*,"VERTICALLY INTERPOLATE TO NEW PRESSURE LEVELS" - call vintg(ij_output, lev, lev_output, ntrac, pres_b4_adj_output, & - ugrd_b4_adj_output, vgrd_b4_adj_output, tmp_b4_adj_output, q_b4_adj_output, & - dzdt_b4_adj_output, pres_output, ugrd_output, vgrd_output, tmp_output, & - q_output, dzdt_output) - - deallocate (dzdt_b4_adj_output, q_b4_adj_output) -!deallocate (pres_b4_adj_output, pres_output) - - allocate(spfh_output(ij_output,lev_output)) - spfh_output = q_output(:,:,1) - allocate(o3mr_output(ij_output,lev_output)) - o3mr_output = q_output(:,:,2) - allocate(clwmr_output(ij_output,lev_output)) - clwmr_output = q_output(:,:,3) - allocate(rwmr_output(ij_output,lev_output)) - rwmr_output = q_output(:,:,4) - allocate(icmr_output(ij_output,lev_output)) - icmr_output = q_output(:,:,5) - allocate(snmr_output(ij_output,lev_output)) - snmr_output = q_output(:,:,6) - allocate(grle_output(ij_output,lev_output)) - grle_output = q_output(:,:,7) - allocate(cldamt_output(ij_output,lev_output)) - cldamt_output = q_output(:,:,8) - - deallocate(q_output) - - do k = 1, lev - print*,'after vintg tmp b4 
',tmp_b4_adj_output(ij_output/2,k), pres_b4_adj_output(ij_output/2,k) - enddo - do k = 1, lev_output - print*,'after vintg tmp ',tmp_output(ij_output/2,k),pres_output(ij_output/2,k) - enddo - - deallocate(tmp_b4_adj_output) - - deallocate(ugrd_b4_adj_output) - - deallocate(vgrd_b4_adj_output) - - deallocate(spfh_b4_adj_output) - - deallocate(o3mr_b4_adj_output) - - deallocate(clwmr_b4_adj_output) - - deallocate(rwmr_b4_adj_output) - - deallocate(icmr_b4_adj_output) - - deallocate(snmr_b4_adj_output) - - deallocate(grle_b4_adj_output) - - deallocate(cldamt_b4_adj_output) - - allocate(delz_output(ij_output, lev_output)) - delz_output = 0.0 - - call compute_delz(ij_output, lev_output, vcoord(:,1), vcoord(:,2), sfcp_output, hgt_output, & - tmp_output, spfh_output, delz_output) - - do k = 1, lev_output - print*,'after compute_delz ',delz_output(ij_output/2,k) - enddo - - deallocate(hgt_output) - - end subroutine adjust_for_terrain - - subroutine gaus_to_gaus - -!---------------------------------------------------------------------------------- -! Interpolate data from the input to output grid using IPOLATES library. -!---------------------------------------------------------------------------------- - - use output_data - use input_data - use setup - - implicit none - - integer :: ip, ipopt(20), i - integer :: num_fields - integer :: iret, numpts - integer, allocatable :: ibi(:), ibo(:) - - logical*1, allocatable :: bitmap_input(:,:), bitmap_output(:,:) - logical :: same_grid - - real, allocatable :: data_input(:,:) - real, allocatable :: data_output(:,:), crot(:), srot(:) - - same_grid=.true. - do i = 1, 11 - if (kgds_input(i) /= kgds_output(i)) then - same_grid=.false. - exit - endif - enddo - - if (same_grid) then - - print* - print*,'INPUT AND OUTPUT GRIDS ARE THE SAME.' - print*,'NO HORIZ INTERPOLATION REQUIRED.' 
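The deleted `gaus_to_gaus` routine above skips horizontal interpolation when the first 11 entries of the input and output GRIB-1 grid description (KGDS) arrays agree. A minimal Python sketch of that same-grid decision (the array names and the 11-element comparison come from the Fortran; the numeric descriptor values below are illustrative, not taken from a real grid):

```python
def grids_match(kgds_input, kgds_output, n=11):
    """Return True when the first n GRIB-1 GDS entries agree,
    i.e. horizontal interpolation can be skipped and the input
    fields are simply copied to the output arrays."""
    return all(a == b for a, b in zip(kgds_input[:n], kgds_output[:n]))

# Identical Gaussian-grid descriptors: copy, don't interpolate.
gdsa = [4, 1152, 576, 90000, 0, 128, -90000, -313, 313, 288, 0]
print(grids_match(gdsa, gdsa))  # True
# A different output resolution forces the ipolates branch.
gdsb = [4, 512, 256, 90000, 0, 128, -90000, -703, 703, 128, 0]
print(grids_match(gdsa, gdsb))  # False
```

The Fortran exits the comparison loop on the first mismatch; `all()` over a generator short-circuits the same way.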
- - allocate(hgt_output(ij_output)) - hgt_output = hgt_input - deallocate(hgt_input) - - allocate(sfcp_b4_adj_output(ij_output)) - sfcp_b4_adj_output = sfcp_input - deallocate(sfcp_input) - - allocate(tmp_b4_adj_output(ij_output,lev)) - tmp_b4_adj_output = tmp_input - deallocate(tmp_input) - - allocate(clwmr_b4_adj_output(ij_output,lev)) - clwmr_b4_adj_output = clwmr_input - deallocate(clwmr_input) - - allocate(spfh_b4_adj_output(ij_output,lev)) - spfh_b4_adj_output = spfh_input - deallocate(spfh_input) - - allocate(o3mr_b4_adj_output(ij_output,lev)) - o3mr_b4_adj_output = o3mr_input - deallocate(o3mr_input) - - allocate(dzdt_b4_adj_output(ij_output,lev)) - dzdt_b4_adj_output = dzdt_input - deallocate(dzdt_input) - - allocate(rwmr_b4_adj_output(ij_output,lev)) - rwmr_b4_adj_output = rwmr_input - deallocate(rwmr_input) - - allocate(snmr_b4_adj_output(ij_output,lev)) - snmr_b4_adj_output = snmr_input - deallocate(snmr_input) - - allocate(icmr_b4_adj_output(ij_output,lev)) - icmr_b4_adj_output = icmr_input - deallocate(icmr_input) - - allocate(grle_b4_adj_output(ij_output,lev)) - grle_b4_adj_output = grle_input - deallocate(grle_input) - - allocate(cldamt_b4_adj_output(ij_output,lev)) - cldamt_b4_adj_output = cldamt_input - deallocate(cldamt_input) - - allocate(ugrd_b4_adj_output(ij_output,lev)) - ugrd_b4_adj_output = ugrd_input - deallocate(ugrd_input) - - allocate(vgrd_b4_adj_output(ij_output,lev)) - vgrd_b4_adj_output = vgrd_input - deallocate(vgrd_input) - - else - - print* - print*,'INTERPOLATE DATA TO OUTPUT GRID' - - - ip = 0 ! bilinear - ipopt = 0 - -!---------------------------------------------------------------------------------- -! Do 2-D fields first -!---------------------------------------------------------------------------------- - - num_fields = 1 - - allocate(ibi(num_fields)) - ibi = 0 ! no bitmap - allocate(ibo(num_fields)) - ibo = 0 ! no bitmap - - allocate(bitmap_input(ij_input,num_fields)) - bitmap_input = .true. 
- allocate(bitmap_output(ij_output,num_fields)) - bitmap_output = .true. - - allocate(rlat_output(ij_output)) - rlat_output = 0.0 - allocate(rlon_output(ij_output)) - rlon_output = 0.0 - -!---------------- -! Surface height -!---------------- - - allocate(data_input(ij_input,num_fields)) - data_input(:,num_fields) = hgt_input(:) - deallocate(hgt_input) - - allocate(data_output(ij_output,num_fields)) - data_output = 0 - - print*,"INTERPOLATE SURFACE HEIGHT" - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, data_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - data_output, iret) - if (iret /= 0) goto 89 - - allocate(hgt_output(ij_output)) - hgt_output = data_output(:,num_fields) - -!------------------ -! surface pressure -!------------------ - - data_input(:,num_fields) = sfcp_input(:) - deallocate(sfcp_input) - - print*,"INTERPOLATE SURFACE PRESSURE" - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, data_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - data_output, iret) - if (iret /= 0) goto 89 - - allocate(sfcp_b4_adj_output(ij_output)) - sfcp_b4_adj_output = data_output(:,num_fields) - - deallocate(ibi, ibo, bitmap_input, bitmap_output, data_input, data_output) - -!---------------------------------------------------------------------------------- -! 3d scalars -!---------------------------------------------------------------------------------- - - num_fields = lev - - allocate(ibi(num_fields)) - ibi = 0 ! no bitmap - allocate(ibo(num_fields)) - ibo = 0 ! no bitmap - - allocate(bitmap_input(ij_input,num_fields)) - bitmap_input = .true. - allocate(bitmap_output(ij_output,num_fields)) - bitmap_output = .true. - -!------------- -! 
Temperature -!------------- - - allocate(tmp_b4_adj_output(ij_output,num_fields)) - tmp_b4_adj_output = 0 - - print*,'INTERPOLATE TEMPERATURE' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, tmp_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - tmp_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(tmp_input) - -!-------------------- -! Cloud liquid water -!-------------------- - - allocate(clwmr_b4_adj_output(ij_output,num_fields)) - clwmr_b4_adj_output = 0 - - print*,'INTERPOLATE CLOUD LIQUID WATER' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, clwmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - clwmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(clwmr_input) - -!-------------------- -! Specific humidity -!-------------------- - - allocate(spfh_b4_adj_output(ij_output,num_fields)) - spfh_b4_adj_output = 0 - - print*,'INTERPOLATE SPECIFIC HUMIDITY' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, spfh_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - spfh_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(spfh_input) - -!----------- -! Ozone -!----------- - - allocate(o3mr_b4_adj_output(ij_output,num_fields)) - o3mr_b4_adj_output = 0 - - print*,'INTERPOLATE OZONE' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, o3mr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - o3mr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(o3mr_input) - -!----------- -! 
DZDT -!----------- - - allocate(dzdt_b4_adj_output(ij_output,num_fields)) - dzdt_b4_adj_output = 0 - - print*,'INTERPOLATE DZDT' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, dzdt_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - dzdt_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(dzdt_input) - -!---------------------------------------------------------------------------------- -! Interpolate additional 3-d scalars for GFDL microphysics. -!---------------------------------------------------------------------------------- - - -!------------- -! Rain water -!------------- - - allocate(rwmr_b4_adj_output(ij_output,num_fields)) - rwmr_b4_adj_output = 0 - - print*,'INTERPOLATE RWMR' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, rwmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - rwmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(rwmr_input) - -!------------- -! Snow water -!------------- - - allocate(snmr_b4_adj_output(ij_output,num_fields)) - snmr_b4_adj_output = 0 - - print*,'INTERPOLATE SNMR' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, snmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - snmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(snmr_input) - -!------------- -! Ice water -!------------- - - allocate(icmr_b4_adj_output(ij_output,num_fields)) - icmr_b4_adj_output = 0 - - print*,'INTERPOLATE ICMR' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, icmr_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - icmr_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(icmr_input) - -!------------- -! 
Graupel -!------------- - - allocate(grle_b4_adj_output(ij_output,num_fields)) - grle_b4_adj_output = 0 - - print*,'INTERPOLATE GRLE' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, grle_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - grle_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(grle_input) - - -!--------------------------- -! Cloud amount -!--------------------------- - - allocate(cldamt_b4_adj_output(ij_output,num_fields)) - cldamt_b4_adj_output = 0 - - print*,'INTERPOLATE CLD_AMT' - call ipolates(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, cldamt_input, & - numpts, rlat_output, rlon_output, ibo, bitmap_output, & - cldamt_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate(cldamt_input) - - - -!---------------------------------------------------------------------------------- -! 3d u/v winds -!---------------------------------------------------------------------------------- - - allocate(crot(ij_output), srot(ij_output)) - crot = 0. - srot = 0. - - allocate(ugrd_b4_adj_output(ij_output,num_fields)) - ugrd_b4_adj_output = 0 - allocate(vgrd_b4_adj_output(ij_output,num_fields)) - vgrd_b4_adj_output = 0 - - print*,'INTERPOLATE WINDS' - call ipolatev(ip, ipopt, kgds_input, kgds_output, ij_input, ij_output,& - num_fields, ibi, bitmap_input, ugrd_input, vgrd_input, & - numpts, rlat_output, rlon_output, crot, srot, ibo, bitmap_output, & - ugrd_b4_adj_output, vgrd_b4_adj_output, iret) - if (iret /= 0) goto 89 - - deallocate (ugrd_input, vgrd_input) - deallocate (crot, srot) - deallocate (ibi, ibo, bitmap_input, bitmap_output) - - endif - - return - - 89 continue - print*,"FATAL ERROR IN IPOLATES. 
IRET IS: ", iret - call errexit(23) - - end subroutine gaus_to_gaus - - end module interp diff --git a/sorc/enkf_chgres_recenter_nc.fd/output_data.f90 b/sorc/enkf_chgres_recenter_nc.fd/output_data.f90 deleted file mode 100644 index 00b39fc7c8f..00000000000 --- a/sorc/enkf_chgres_recenter_nc.fd/output_data.f90 +++ /dev/null @@ -1,288 +0,0 @@ - module output_data - - use module_ncio - - implicit none - - private - - integer, public :: kgds_output(200) - -! data on the output grid. - real, allocatable, public :: hgt_output(:) ! interpolated from input grid - real, allocatable, public :: hgt_external_output(:) - real, allocatable, public :: sfcp_output(:) - real, allocatable, public :: tmp_output(:,:) - real, allocatable, public :: clwmr_output(:,:) - real, allocatable, public :: delz_output(:,:) - real, allocatable, public :: dpres_output(:,:) - real, allocatable, public :: dzdt_output(:,:) - real, allocatable, public :: o3mr_output(:,:) - real, allocatable, public :: spfh_output(:,:) - real, allocatable, public :: ugrd_output(:,:) - real, allocatable, public :: vgrd_output(:,:) - real, allocatable, public :: rwmr_output(:,:) - real, allocatable, public :: icmr_output(:,:) - real, allocatable, public :: snmr_output(:,:) - real, allocatable, public :: grle_output(:,:) - real, allocatable, public :: cldamt_output(:,:) - real, allocatable, public :: rlat_output(:) - real, allocatable, public :: rlon_output(:) - - public :: set_output_grid - public :: write_output_data - type(Dataset) :: indset, outdset - - - contains - - subroutine set_output_grid - -!------------------------------------------------------------------- -! Set grid specs on the output grid. 
-!------------------------------------------------------------------- - - use setup - use input_data - use utils - - implicit none - - - type(Dataset) :: indset - real, allocatable :: work2d(:,:) - - - - print* - print*,"OUTPUT GRID I/J DIMENSIONS: ", i_output, j_output - -!------------------------------------------------------------------- -! Set the grib 1 grid description section, which is needed -! by the IPOLATES library. -!------------------------------------------------------------------- - - kgds_output = 0 - - call calc_kgds(i_output, j_output, kgds_output) - -!------------------------------------------------------------------- -! Read the terrain on the output grid. To ensure exact match, -! read it from an existing netcdf file. -!------------------------------------------------------------------- - - print* - print*,"OPEN OUTPUT GRID TERRAIN FILE: ", trim(terrain_file) - indset = open_dataset(terrain_file) - - allocate(hgt_external_output(ij_output)) - - print* - print*,"READ SURFACE HEIGHT" - call read_vardata(indset, 'hgtsfc', work2d) - - hgt_external_output = reshape(work2d,(/ij_output/)) - - call close_dataset(indset) - - end subroutine set_output_grid - - subroutine write_output_data - -!------------------------------------------------------------------- -! Write output grid data to a netcdf file. -!------------------------------------------------------------------- - - use input_data - use setup - - implicit none - - integer :: n,nrev - real, allocatable, dimension (:,:) :: out2d - real, allocatable, dimension (:,:,:) :: out3d - -!------------------------------------------------------------------- -! Set up some header info. -!------------------------------------------------------------------- - - call header_set - -!------------------------------------------------------------------- -! Open and write file. -!------------------------------------------------------------------- -! 
TODO: note there can be compression applied to this output file if necessary -! see how it's done in the GSI EnKF for example - - - print* - print*,'OPEN OUTPUT FILE: ',trim(output_file) - allocate(out2d(i_output,j_output)) - allocate(out3d(i_output,j_output,lev_output)) - - print*,"WRITE SURFACE HEIGHT" - out2d = reshape(hgt_external_output, (/i_output,j_output/)) - call write_vardata(outdset, 'hgtsfc', out2d) - deallocate(hgt_external_output) - - print*,"WRITE SURFACE PRESSURE" - out2d = reshape(sfcp_output, (/i_output,j_output/)) - call write_vardata(outdset, 'pressfc', out2d) - deallocate(sfcp_output) - - print*,"WRITE TEMPERATURE" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(tmp_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'tmp', out3d) - deallocate(tmp_output) - - print*,"WRITE CLOUD LIQUID WATER" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(clwmr_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'clwmr', out3d) - deallocate(clwmr_output) - - print*,"WRITE SPECIFIC HUMIDITY" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(spfh_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'spfh', out3d) - deallocate(spfh_output) - - print*,"WRITE OZONE" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(o3mr_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'o3mr', out3d) - deallocate(o3mr_output) - - print*,"WRITE U-WINDS" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(ugrd_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'ugrd', out3d) - deallocate(ugrd_output) - - print*,"WRITE V-WINDS" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(vgrd_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'vgrd', out3d) - deallocate(vgrd_output) - - if (idzdt == 1) then - print*,"WRITE 
DZDT" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(dzdt_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'dzdt', out3d) - deallocate(dzdt_output) - endif - - if (idpres == 1) then - print*,"WRITE DPRES" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(dpres_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'dpres', out3d) - endif - deallocate(dpres_output) - - if (idelz == 1) then - print*,"WRITE DELZ" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(delz_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'delz', out3d) - endif - deallocate(delz_output) - - if (irwmr == 1) then - print*,"WRITE RAIN WATER" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(rwmr_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'rwmr', out3d) - deallocate(rwmr_output) - endif - - if (isnmr == 1) then - print*,"WRITE SNOW WATER" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(snmr_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'snmr', out3d) - deallocate(snmr_output) - endif - - if (iicmr == 1) then - print*,"WRITE ICE WATER" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(icmr_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'icmr', out3d) - deallocate(icmr_output) - endif - - if (igrle == 1) then - print*,"WRITE GRAUPEL" - do n=1,lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(grle_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'grle', out3d) - deallocate(grle_output) - endif - - if (icldamt == 1) then - print*,"WRITE CLD_AMT" - do n = 1, lev_output - nrev = lev_output+1-n - out3d(:,:,n) = reshape(cldamt_output(:,nrev), (/i_output,j_output/)) - end do - call write_vardata(outdset, 'cld_amt', out3d) - deallocate(cldamt_output) - endif - - - 
deallocate(out2d,out3d) - - return - - end subroutine write_output_data - - subroutine header_set - -!------------------------------------------------------------------- -! copy dimensions and metadata to the output file from the -! input terrain (output res) file -!------------------------------------------------------------------- - - use input_data - use setup - - implicit none - - print* - print*,"SET HEADER INFO FOR OUTPUT FILE." - - indset = open_dataset(ref_file) - outdset = create_dataset(output_file, indset, nocompress=.true.) - - end subroutine header_set - - end module output_data diff --git a/sorc/enkf_chgres_recenter_nc.fd/setup.f90 b/sorc/enkf_chgres_recenter_nc.fd/setup.f90 deleted file mode 100644 index ee9956ae03d..00000000000 --- a/sorc/enkf_chgres_recenter_nc.fd/setup.f90 +++ /dev/null @@ -1,55 +0,0 @@ - module setup - - implicit none - - private - - character(len=300), public :: input_file - character(len=300), public :: output_file - character(len=300), public :: terrain_file - character(len=300), public :: ref_file - - integer, public :: i_output - integer, public :: j_output - integer , public :: ij_output - logical, public :: cld_amt - - public :: program_setup - - contains - - subroutine program_setup - - implicit none - - integer :: istat - character(len=500) :: filenamelist - - namelist /chgres_setup/ i_output, j_output, input_file, output_file, & - terrain_file, cld_amt, ref_file - - cld_amt = .false. ! default option - - print* - call getarg(1,filenamelist) - print*,"OPEN SETUP NAMELIST ",trim(filenamelist) - open(43, file=filenamelist, iostat=istat) - if (istat /= 0) then - print*,"FATAL ERROR OPENING NAMELIST FILE. ISTAT IS: ",istat - stop - endif - - print*,"READ SETUP NAMELIST." - read(43, nml=chgres_setup, iostat=istat) - if (istat /= 0) then - print*,"FATAL ERROR READING NAMELIST FILE. 
ISTAT IS: ",istat - stop - endif - - ij_output = i_output * j_output - - close(43) - - end subroutine program_setup - - end module setup diff --git a/sorc/enkf_chgres_recenter_nc.fd/utils.f90 b/sorc/enkf_chgres_recenter_nc.fd/utils.f90 deleted file mode 100644 index 3fa0910606c..00000000000 --- a/sorc/enkf_chgres_recenter_nc.fd/utils.f90 +++ /dev/null @@ -1,736 +0,0 @@ - module utils - - private - - public :: calc_kgds - public :: newps - public :: newpr1 - public :: vintg - public :: compute_delz - - contains - - subroutine compute_delz(ijm, levp, ak_in, bk_in, ps, zs, t, sphum, delz) - - implicit none - integer, intent(in):: levp, ijm - real, intent(in), dimension(levp+1):: ak_in, bk_in - real, intent(in), dimension(ijm):: ps, zs - real, intent(in), dimension(ijm,levp):: t - real, intent(in), dimension(ijm,levp):: sphum - real, intent(out), dimension(ijm,levp):: delz -! Local: - real, dimension(ijm,levp+1):: zh - real, dimension(ijm,levp+1):: pe0, pn0 - real, dimension(levp+1) :: ak, bk - integer i,k - real, parameter :: GRAV = 9.80665 - real, parameter :: RDGAS = 287.05 - real, parameter :: RVGAS = 461.50 - real :: zvir - real:: grd - - print*,"COMPUTE LAYER THICKNESS." - - grd = grav/rdgas - zvir = rvgas/rdgas - 1. - ak = ak_in - bk = bk_in - ak(levp+1) = max(1.e-9, ak(levp+1)) - - do i=1, ijm - pe0(i,levp+1) = ak(levp+1) - pn0(i,levp+1) = log(pe0(i,levp+1)) - enddo - - do k=levp,1, -1 - do i=1,ijm - pe0(i,k) = ak(k) + bk(k)*ps(i) - pn0(i,k) = log(pe0(i,k)) - enddo - enddo - - do i = 1, ijm - zh(i,1) = zs(i) - enddo - - do k = 2, levp+1 - do i = 1, ijm - zh(i,k) = zh(i,k-1)+t(i,k-1)*(1.+zvir*sphum(i,k-1))* & - (pn0(i,k-1)-pn0(i,k))/grd - enddo - enddo - - do k = 1, levp - do i = 1, ijm - delz(i,k) = zh(i,k) - zh(i,k+1) - enddo - enddo - - end subroutine compute_delz - - subroutine calc_kgds(idim, jdim, kgds) - - implicit none - - integer, intent(in) :: idim, jdim - - integer, intent(out) :: kgds(200) - - kgds = 0 - kgds(1) = 4 ! 
OCT 6 - TYPE OF GRID (GAUSSIAN) - kgds(2) = idim ! OCT 7-8 - # PTS ON LATITUDE CIRCLE - kgds(3) = jdim ! OCT 9-10 - # PTS ON LONGITUDE CIRCLE - kgds(4) = 90000 ! OCT 11-13 - LAT OF ORIGIN - kgds(5) = 0 ! OCT 14-16 - LON OF ORIGIN - kgds(6) = 128 ! OCT 17 - RESOLUTION FLAG - kgds(7) = -90000 ! OCT 18-20 - LAT OF EXTREME POINT - kgds(8) = nint(-360000./idim) ! OCT 21-23 - LON OF EXTREME POINT - kgds(9) = nint((360.0 / float(idim))*1000.0) - ! OCT 24-25 - LONGITUDE DIRECTION INCR. - kgds(10) = jdim/2 ! OCT 26-27 - NUMBER OF CIRCLES POLE TO EQUATOR - kgds(12) = 255 ! OCT 29 - RESERVED - kgds(20) = 255 ! OCT 5 - NOT USED, SET TO 255 - - end subroutine calc_kgds - - SUBROUTINE NEWPS(ZS,PS,IM,KM,P,T,Q,ZSNEW,PSNEW) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: NEWPS COMPUTE NEW SURFACE PRESSURE -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 92-10-31 -! -! ABSTRACT: COMPUTES A NEW SURFACE PRESSURE GIVEN A NEW OROGRAPHY. -! THE NEW PRESSURE IS COMPUTED ASSUMING A HYDROSTATIC BALANCE -! AND A CONSTANT TEMPERATURE LAPSE RATE. BELOW GROUND, THE -! LAPSE RATE IS ASSUMED TO BE -6.5 K/KM. -! -! PROGRAM HISTORY LOG: -! 91-10-31 MARK IREDELL -! -! USAGE: CALL NEWPS(ZS,PS,IM,KM,P,T,Q,ZSNEW,PSNEW) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF POINTS TO COMPUTE -! ZS REAL (IM) OLD OROGRAPHY (M) -! PS REAL (IM) OLD SURFACE PRESSURE (PA) -! KM INTEGER NUMBER OF LEVELS -! P REAL (IM,KM) PRESSURES (PA) -! T REAL (IM,KM) TEMPERATURES (K) -! Q REAL (IM,KM) SPECIFIC HUMIDITIES (KG/KG) -! ZSNEW REAL (IM) NEW OROGRAPHY (M) -! OUTPUT ARGUMENT LIST: -! PSNEW REAL (IM) NEW SURFACE PRESSURE (PA) -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - REAL ZS(IM),PS(IM),P(IM,KM),T(IM,KM),Q(IM,KM) - REAL ZSNEW(IM),PSNEW(IM) - PARAMETER(BETA=-6.5E-3,EPSILON=1.E-9) - PARAMETER(G=9.80665,RD=287.05,RV=461.50) - PARAMETER(GOR=G/RD,FV=RV/RD-1.) 
- REAL ZU(IM) - FTV(AT,AQ)=AT*(1+FV*AQ) - FGAM(APU,ATVU,APD,ATVD)=-GOR*LOG(ATVD/ATVU)/LOG(APD/APU) - FZ0(AP,ATV,AZD,APD)=AZD+ATV/GOR*LOG(APD/AP) - FZ1(AP,ATV,AZD,APD,AGAM)=AZD-ATV/AGAM*((APD/AP)**(-AGAM/GOR)-1) - FP0(AZ,AZU,APU,ATVU)=APU*EXP(-GOR/ATVU*(AZ-AZU)) - FP1(AZ,AZU,APU,ATVU,AGAM)=APU*(1+AGAM/ATVU*(AZ-AZU))**(-GOR/AGAM) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE SURFACE PRESSURE BELOW THE ORIGINAL GROUND - LS=0 - K=1 - GAMMA=BETA - DO I=1,IM - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - ZU(I)=FZ1(PU,TVU,ZS(I),PS(I),GAMMA) - IF(ZSNEW(I).LE.ZU(I)) THEN - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - IF(ABS(GAMMA).GT.EPSILON) THEN - PSNEW(I)=FP1(ZSNEW(I),ZU(I),PU,TVU,GAMMA) - ELSE - PSNEW(I)=FP0(ZSNEW(I),ZU(I),PU,TVU) - ENDIF - ELSE - PSNEW(I)=0 - LS=LS+1 - ENDIF -! endif - ENDDO -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE SURFACE PRESSURE ABOVE THE ORIGINAL GROUND - DO K=2,KM - IF(LS.GT.0) THEN - DO I=1,IM - IF(PSNEW(I).EQ.0) THEN - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - PD=P(I,K-1) - TVD=FTV(T(I,K-1),Q(I,K-1)) - GAMMA=FGAM(PU,TVU,PD,TVD) - IF(ABS(GAMMA).GT.EPSILON) THEN - ZU(I)=FZ1(PU,TVU,ZU(I),PD,GAMMA) - ELSE - ZU(I)=FZ0(PU,TVU,ZU(I),PD) - ENDIF - IF(ZSNEW(I).LE.ZU(I)) THEN - IF(ABS(GAMMA).GT.EPSILON) THEN - PSNEW(I)=FP1(ZSNEW(I),ZU(I),PU,TVU,GAMMA) - ELSE - PSNEW(I)=FP0(ZSNEW(I),ZU(I),PU,TVU) - ENDIF - LS=LS-1 - ENDIF - ENDIF - ENDDO - ENDIF - ENDDO -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE SURFACE PRESSURE OVER THE TOP - IF(LS.GT.0) THEN - K=KM - GAMMA=0 - DO I=1,IM - IF(PSNEW(I).EQ.0) THEN - PU=P(I,K) - TVU=FTV(T(I,K),Q(I,K)) - PSNEW(I)=FP0(ZSNEW(I),ZU(I),PU,TVU) - ENDIF - ENDDO - ENDIF - END SUBROUTINE NEWPS - - SUBROUTINE NEWPR1(IM,KM,IDVC,IDSL,NVCOORD,VCOORD, & - PS,PM,DP) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: NEWPR1 COMPUTE MODEL PRESSURES -! PRGMMR: JUANG ORG: W/NMC23 DATE: 2005-04-11 -! 
PRGMMR: Fanglin Yang ORG: W/NMC23 DATE: 2006-11-28 -! PRGMMR: S. Moorthi ORG: NCEP/EMC DATE: 2006-12-12 -! PRGMMR: S. Moorthi ORG: NCEP/EMC DATE: 2007-01-02 -! -! ABSTRACT: COMPUTE MODEL PRESSURES. -! -! PROGRAM HISTORY LOG: -! 2005-04-11 HANN_MING HENRY JUANG hybrid sigma, sigma-p, and sigma- -! -! USAGE: CALL NEWPR1(IM,IX,KM,KMP,IDVC,IDSL,NVCOORD,VCOORD,PP,TP,QP,P -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF POINTS TO COMPUTE -! KM INTEGER NUMBER OF LEVELS -! IDVC INTEGER VERTICAL COORDINATE ID -! (1 FOR SIGMA AND 2 FOR HYBRID) -! IDSL INTEGER TYPE OF SIGMA STRUCTURE -! (1 FOR PHILLIPS OR 2 FOR MEAN) -! NVCOORD INTEGER NUMBER OF VERTICAL COORDINATES -! VCOORD REAL (KM+1,NVCOORD) VERTICAL COORDINATE VALUES -! FOR IDVC=1, NVCOORD=1: SIGMA INTERFACE -! FOR IDVC=2, NVCOORD=2: HYBRID INTERFACE A AND B -! FOR IDVC=3, NVCOORD=3: JUANG GENERAL HYBRID INTERFACE -! AK REAL (KM+1) HYBRID INTERFACE A -! BK REAL (KM+1) HYBRID INTERFACE B -! PS REAL (IX) SURFACE PRESSURE (PA) -! OUTPUT ARGUMENT LIST: -! PM REAL (IX,KM) MID-LAYER PRESSURE (PA) -! DP REAL (IX,KM) LAYER DELTA PRESSURE (PA) -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - IMPLICIT NONE - - INTEGER, INTENT(IN) :: IM, KM, NVCOORD, IDVC, IDSL - - REAL, INTENT(IN) :: VCOORD(KM+1,NVCOORD) - REAL, INTENT(IN) :: PS(IM) - - REAL, INTENT(OUT) :: PM(IM,KM) - REAL, OPTIONAL, INTENT(OUT) :: DP(IM,KM) - - REAL, PARAMETER :: RD=287.05, RV=461.50, CP=1004.6, & - ROCP=RD/CP, ROCP1=ROCP+1, ROCPR=1/ROCP, & - FV=RV/RD-1. 
- - INTEGER :: I, K - - REAL :: AK(KM+1), BK(KM+1), PI(IM,KM+1) - - IF(IDVC.EQ.2) THEN - DO K=1,KM+1 - AK(K) = VCOORD(K,1) - BK(K) = VCOORD(K,2) - PI(:,K) = AK(K) + BK(K)*PS(:) - ENDDO - ELSE - print*,'routine only works for idvc 2' - stop - ENDIF - - IF(IDSL.EQ.2) THEN - DO K=1,KM - PM(1:IM,K) = (PI(1:IM,K)+PI(1:IM,K+1))/2 - ENDDO - ELSE - DO K=1,KM - PM(1:IM,K) = ((PI(1:IM,K)**ROCP1-PI(1:IM,K+1)**ROCP1)/ & - (ROCP1*(PI(1:IM,K)-PI(1:IM,K+1))))**ROCPR - ENDDO - ENDIF - - IF(PRESENT(DP))THEN - DO K=1,KM - DO I=1,IM - DP(I,K) = PI(I,K) - PI(I,K+1) - ENDDO - ENDDO - ENDIF - - END SUBROUTINE NEWPR1 - - SUBROUTINE RSEARCH(KM1,Z1,KM2,Z2,L2) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: RSEARCH SEARCH FOR A SURROUNDING REAL INTERVAL -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 98-05-01 -! -! ABSTRACT: THIS SUBPROGRAM SEARCHES MONOTONIC SEQUENCES OF REAL NUMBERS -! FOR INTERVALS THAT SURROUND A GIVEN SEARCH SET OF REAL NUMBERS. -! THE SEQUENCES MAY BE MONOTONIC IN EITHER DIRECTION; THE REAL NUMBERS -! MAY BE SINGLE OR DOUBLE PRECISION; THE INPUT SEQUENCES AND SETS -! AND THE OUTPUT LOCATIONS MAY BE ARBITRARILY DIMENSIONED. -! -! PROGRAM HISTORY LOG: -! 1999-01-05 MARK IREDELL -! 2022-06-16 CONVERT TO SINGLE COLUMN PROCESSING. GDIT -! -! USAGE: CALL RSEARCH(KM1,Z1,KM2,Z2,L2) -! -! INPUT ARGUMENT LIST: -! KM1 INTEGER NUMBER OF POINTS IN THE SEQUENCE. -! Z1 REAL SEQUENCE VALUES TO SEARCH. -! (Z1 MUST BE MONOTONIC IN EITHER DIRECTION) -! KM2 INTEGER NUMBER OF POINTS TO SEARCH FOR -! IN THE SEQUENCE -! Z2 REAL SET OF VALUES TO SEARCH FOR. -! (Z2 NEED NOT BE MONOTONIC) -! -! OUTPUT ARGUMENT LIST: -! L2 INTEGER INTERVAL LOCATIONS HAVING VALUES FROM -! 0 TO KM1. (Z2 WILL BE BETWEEN Z1(L2) AND Z1(L2+1)) -! -! REMARKS: -! -! RETURNED VALUES OF 0 OR KM1 INDICATE THAT THE GIVEN SEARCH VALUE -! IS OUTSIDE THE RANGE OF THE SEQUENCE. -! -! IF A SEARCH VALUE IS IDENTICAL TO ONE OF THE SEQUENCE VALUES -! THEN THE LOCATION RETURNED POINTS TO THE IDENTICAL VALUE. -! 
IF THE SEQUENCE IS NOT STRICTLY MONOTONIC AND A SEARCH VALUE IS -! IDENTICAL TO MORE THAN ONE OF THE SEQUENCE VALUES, THEN THE -! LOCATION RETURNED MAY POINT TO ANY OF THE IDENTICAL VALUES. -! -! TO BE EXACT, FOR K FROM 1 TO KM2, Z=Z2(K) IS THE SEARCH VALUE -! AND L=L2(K) IS THE LOCATION RETURNED. -! -! IF L=0, THEN Z IS LESS THAN THE START POINT Z1(1) -! FOR ASCENDING SEQUENCES (OR GREATER THAN FOR DESCENDING SEQUENCES). -! IF L=KM1, THEN Z IS GREATER THAN OR EQUAL TO THE END POINT -! Z1(KM1) FOR ASCENDING SEQUENCES (OR LESS THAN OR EQUAL -! TO FOR DESCENDING SEQUENCES). OTHERWISE Z IS BETWEEN -! THE VALUES Z1(L) AND Z1(1+L) AND MAY EQUAL THE FORMER. -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - IMPLICIT NONE - INTEGER,INTENT(IN):: KM1,KM2 - REAL,INTENT(IN):: Z1(KM1) - REAL,INTENT(IN):: Z2(KM2) - INTEGER,INTENT(OUT):: L2(KM2) - INTEGER K2,L - REAL Z -!C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -!C FIND THE SURROUNDING INPUT INTERVAL FOR EACH OUTPUT POINT. - - IF(Z1(1).LE.Z1(km1)) THEN -!C INPUT COORDINATE IS MONOTONICALLY ASCENDING. - DO K2=1,KM2 - Z=Z2(K2) - L=0 - DO - IF(Z.LT.Z1(L+1)) EXIT - L=L+1 - IF(L.EQ.KM1) EXIT - ENDDO - L2(K2)=L - ENDDO - ELSE -!C INPUT COORDINATE IS MONOTONICALLY DESCENDING. - DO K2=1,KM2 - Z=Z2(K2) - L=0 - DO - IF(Z.GT.Z1(L+1)) EXIT - L=L+1 - IF(L.EQ.KM1) EXIT - ENDDO - L2(K2)=L - ENDDO - ENDIF - - END SUBROUTINE RSEARCH - - SUBROUTINE VINTG(IM,KM1,KM2,NT,P1,U1,V1,T1,Q1,W1,P2, & - U2,V2,T2,Q2,W2) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: VINTG VERTICALLY INTERPOLATE UPPER-AIR FIELDS -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 92-10-31 -! -! ABSTRACT: VERTICALLY INTERPOLATE UPPER-AIR FIELDS. -! WIND, TEMPERATURE, HUMIDITY AND OTHER TRACERS ARE INTERPOLATED. -! THE INTERPOLATION IS CUBIC LAGRANGIAN IN LOG PRESSURE -! WITH A MONOTONIC CONSTRAINT IN THE CENTER OF THE DOMAIN. -! IN THE OUTER INTERVALS IT IS LINEAR IN LOG PRESSURE. -! OUTSIDE THE DOMAIN, FIELDS ARE GENERALLY HELD CONSTANT, -! 
EXCEPT FOR TEMPERATURE AND HUMIDITY BELOW THE INPUT DOMAIN, -! WHERE THE TEMPERATURE LAPSE RATE IS HELD FIXED AT -6.5 K/KM AND -! THE RELATIVE HUMIDITY IS HELD CONSTANT. -! -! PROGRAM HISTORY LOG: -! 91-10-31 MARK IREDELL -! -! USAGE: CALL VINTG(IM,KM1,KM2,NT,P1,U1,V1,T1,Q1,P2, -! & U2,V2,T2,Q2) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF POINTS TO COMPUTE -! KM1 INTEGER NUMBER OF INPUT LEVELS -! KM2 INTEGER NUMBER OF OUTPUT LEVELS -! NT INTEGER NUMBER OF TRACERS -! P1 REAL (IM,KM1) INPUT PRESSURES -! ORDERED FROM BOTTOM TO TOP OF ATMOSPHERE -! U1 REAL (IM,KM1) INPUT ZONAL WIND -! V1 REAL (IM,KM1) INPUT MERIDIONAL WIND -! T1 REAL (IM,KM1) INPUT TEMPERATURE (K) -! Q1 REAL (IM,KM1,NT) INPUT TRACERS (HUMIDITY FIRST) -! P2 REAL (IM,KM2) OUTPUT PRESSURES -! OUTPUT ARGUMENT LIST: -! U2 REAL (IM,KM2) OUTPUT ZONAL WIND -! V2 REAL (IM,KM2) OUTPUT MERIDIONAL WIND -! T2 REAL (IM,KM2) OUTPUT TEMPERATURE (K) -! Q2 REAL (IM,KM2,NT) OUTPUT TRACERS (HUMIDITY FIRST) -! -! SUBPROGRAMS CALLED: -! TERP3 CUBICALLY INTERPOLATE IN ONE DIMENSION -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! -!C$$$ - IMPLICIT NONE - - - INTEGER, INTENT(IN) :: IM, KM1, KM2, NT - - REAL, INTENT(IN) :: P1(IM,KM1),U1(IM,KM1),V1(IM,KM1) - REAL, INTENT(IN) :: T1(IM,KM1),Q1(IM,KM1,NT) - REAL, INTENT(IN) :: W1(IM,KM1),P2(IM,KM2) - REAL, INTENT(OUT) :: U2(IM,KM2),V2(IM,KM2) - REAL, INTENT(OUT) :: T2(IM,KM2),Q2(IM,KM2,NT) - REAL, INTENT(OUT) :: W2(IM,KM2) - - REAL, PARAMETER :: DLTDZ=-6.5E-3*287.05/9.80665 - REAL, PARAMETER :: DLPVDRT=-2.5E6/461.50 - - INTEGER :: I, K, N - - REAL :: DZ - REAL,ALLOCATABLE :: Z1(:,:),Z2(:,:) - REAL,ALLOCATABLE :: C1(:,:,:),C2(:,:,:),J2(:,:,:) - - ALLOCATE (Z1(IM+1,KM1),Z2(IM+1,KM2)) - ALLOCATE (C1(IM+1,KM1,4+NT),C2(IM+1,KM2,4+NT),J2(IM+1,KM2,4+NT)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COMPUTE LOG PRESSURE INTERPOLATING COORDINATE -! 
AND COPY INPUT WIND, TEMPERATURE, HUMIDITY AND OTHER TRACERS -!$OMP PARALLEL DO PRIVATE(K,I) - DO K=1,KM1 - DO I=1,IM - Z1(I,K) = -LOG(P1(I,K)) - C1(I,K,1) = U1(I,K) - C1(I,K,2) = V1(I,K) - C1(I,K,3) = W1(I,K) - C1(I,K,4) = T1(I,K) - C1(I,K,5) = Q1(I,K,1) - ENDDO - ENDDO -!$OMP END PARALLEL DO - DO N=2,NT - DO K=1,KM1 - DO I=1,IM - C1(I,K,4+N) = Q1(I,K,N) - ENDDO - ENDDO - ENDDO -!$OMP PARALLEL DO PRIVATE(K,I) - DO K=1,KM2 - DO I=1,IM - Z2(I,K) = -LOG(P2(I,K)) - ENDDO - ENDDO -!$OMP END PARALLEL DO -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! PERFORM LAGRANGIAN ONE-DIMENSIONAL INTERPOLATION -! THAT IS 4TH-ORDER IN INTERIOR, 2ND-ORDER IN OUTSIDE INTERVALS -! AND 1ST-ORDER FOR EXTRAPOLATION. - CALL TERP3(IM,1,1,1,1,4+NT,(IM+1)*KM1,(IM+1)*KM2, & - KM1,IM+1,IM+1,Z1,C1,KM2,IM+1,IM+1,Z2,C2,J2) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! COPY OUTPUT WIND, TEMPERATURE, HUMIDITY AND OTHER TRACERS -! EXCEPT BELOW THE INPUT DOMAIN, LET TEMPERATURE INCREASE WITH A FIXED -! LAPSE RATE AND LET THE RELATIVE HUMIDITY REMAIN CONSTANT. - DO K=1,KM2 - DO I=1,IM - U2(I,K)=C2(I,K,1) - V2(I,K)=C2(I,K,2) - W2(I,K)=C2(I,K,3) - DZ=Z2(I,K)-Z1(I,1) - IF(DZ.GE.0) THEN - T2(I,K)=C2(I,K,4) - Q2(I,K,1)=C2(I,K,5) - ELSE - T2(I,K)=T1(I,1)*EXP(DLTDZ*DZ) - Q2(I,K,1)=Q1(I,1,1)*EXP(DLPVDRT*(1/T2(I,K)-1/T1(I,1))-DZ) - ENDIF - ENDDO - ENDDO - DO N=2,NT - DO K=1,KM2 - DO I=1,IM - Q2(I,K,N)=C2(I,K,4+N) - ENDDO - ENDDO - ENDDO - DEALLOCATE (Z1,Z2,C1,C2,J2) - END SUBROUTINE VINTG - - - SUBROUTINE TERP3(IM,IXZ1,IXQ1,IXZ2,IXQ2,NM,NXQ1,NXQ2, & - KM1,KXZ1,KXQ1,Z1,Q1,KM2,KXZ2,KXQ2,Z2,Q2,J2) -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! -! SUBPROGRAM: TERP3 CUBICALLY INTERPOLATE IN ONE DIMENSION -! PRGMMR: IREDELL ORG: W/NMC23 DATE: 98-05-01 -! -! ABSTRACT: INTERPOLATE FIELD(S) IN ONE DIMENSION ALONG THE COLUMN(S). -! THE INTERPOLATION IS CUBIC LAGRANGIAN WITH A MONOTONIC CONSTRAINT -! IN THE CENTER OF THE DOMAIN. IN THE OUTER INTERVALS IT IS LINEAR. -! 
OUTSIDE THE DOMAIN, FIELDS ARE HELD CONSTANT. -! -! PROGRAM HISTORY LOG: -! 98-05-01 MARK IREDELL -! 1999-01-04 IREDELL USE ESSL SEARCH -! -! USAGE: CALL TERP3(IM,IXZ1,IXQ1,IXZ2,IXQ2,NM,NXQ1,NXQ2, -! & KM1,KXZ1,KXQ1,Z1,Q1,KM2,KXZ2,KXQ2,Z2,Q2,J2) -! INPUT ARGUMENT LIST: -! IM INTEGER NUMBER OF COLUMNS -! IXZ1 INTEGER COLUMN SKIP NUMBER FOR Z1 -! IXQ1 INTEGER COLUMN SKIP NUMBER FOR Q1 -! IXZ2 INTEGER COLUMN SKIP NUMBER FOR Z2 -! IXQ2 INTEGER COLUMN SKIP NUMBER FOR Q2 -! NM INTEGER NUMBER OF FIELDS PER COLUMN -! NXQ1 INTEGER FIELD SKIP NUMBER FOR Q1 -! NXQ2 INTEGER FIELD SKIP NUMBER FOR Q2 -! KM1 INTEGER NUMBER OF INPUT POINTS -! KXZ1 INTEGER POINT SKIP NUMBER FOR Z1 -! KXQ1 INTEGER POINT SKIP NUMBER FOR Q1 -! Z1 REAL (1+(IM-1)*IXZ1+(KM1-1)*KXZ1) -! INPUT COORDINATE VALUES IN WHICH TO INTERPOLATE -! (Z1 MUST BE STRICTLY MONOTONIC IN EITHER DIRECTION) -! Q1 REAL (1+(IM-1)*IXQ1+(KM1-1)*KXQ1+(NM-1)*NXQ1) -! INPUT FIELDS TO INTERPOLATE -! KM2 INTEGER NUMBER OF OUTPUT POINTS -! KXZ2 INTEGER POINT SKIP NUMBER FOR Z2 -! KXQ2 INTEGER POINT SKIP NUMBER FOR Q2 -! Z2 REAL (1+(IM-1)*IXZ2+(KM2-1)*KXZ2) -! OUTPUT COORDINATE VALUES TO WHICH TO INTERPOLATE -! (Z2 NEED NOT BE MONOTONIC) -! -! OUTPUT ARGUMENT LIST: -! Q2 REAL (1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) -! OUTPUT INTERPOLATED FIELDS -! J2 REAL (1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) -! OUTPUT INTERPOLATED FIELDS CHANGE WRT Z2 -! -! SUBPROGRAMS CALLED: -! RSEARCH SEARCH FOR A SURROUNDING REAL INTERVAL -! -! ATTRIBUTES: -! LANGUAGE: FORTRAN -! 
-!C$$$ - - IMPLICIT NONE - INTEGER, intent(in) :: IM,IXZ1,IXQ1,IXZ2,IXQ2,NM,NXQ1,NXQ2 - INTEGER, intent(in) :: KM1,KXZ1,KXQ1,KM2,KXZ2,KXQ2 - - REAL, intent(in) :: Z1(1+(IM-1)*IXZ1+(KM1-1)*KXZ1) - REAL, intent(in) :: Q1(1+(IM-1)*IXQ1+(KM1-1)*KXQ1+(NM-1)*NXQ1) - REAL, intent(in) :: Z2(1+(IM-1)*IXZ2+(KM2-1)*KXZ2) - REAL, intent(inout) :: Q2(1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) - REAL, intent(inout) :: J2(1+(IM-1)*IXQ2+(KM2-1)*KXQ2+(NM-1)*NXQ2) - INTEGER I,K1,K2,N - REAL FFA(IM),FFB(IM),FFC(IM),FFD(IM) - REAL GGA(IM),GGB(IM),GGC(IM),GGD(IM) - INTEGER K1S(IM,KM2) - INTEGER,allocatable :: kpz(:) - REAL, allocatable :: zpz1(:), zpz2(:) - REAL Z1A,Z1B,Z1C,Z1D,Q1A,Q1B,Q1C,Q1D,Z2S,Q2S,J2S -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - allocate(kpz(km2)) - allocate(zpz1(km1), zpz2(km2)) - -! FIND THE SURROUNDING INPUT INTERVAL FOR EACH OUTPUT POINT. -! PJJ/GDIT - RSEARCH modified to search single vertical column -! copy column to temp arrays before call to RSEARCH - do i=1,im - do k2=1,km1 - zpz1(k2)=z1(1+(I-1)*IXZ1+(k2-1)*KXZ1) - enddo - do k2=1,km2 - zpz2(k2)=z2(1+(I-1)*IXZ2+(k2-1)*KXZ2) - enddo - - CALL RSEARCH(KM1,zpz1,KM2,zpz2,kpz) - - do k2=1,km2 - k1s(i,k2)=kpz(k2) - enddo - enddo - - deallocate(kpz) - deallocate(zpz1,zpz2) - -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! GENERALLY INTERPOLATE CUBICALLY WITH MONOTONIC CONSTRAINT -! FROM TWO NEAREST INPUT POINTS ON EITHER SIDE OF THE OUTPUT POINT, -! BUT WITHIN THE TWO EDGE INTERVALS INTERPOLATE LINEARLY. -! KEEP THE OUTPUT FIELDS CONSTANT OUTSIDE THE INPUT DOMAIN. 
- -!!$OMP PARALLEL DO DEFAULT(PRIVATE) SHARED(IM,IXZ1,IXQ1,IXZ2), & -!!$OMP& SHARED(IXQ2,NM,NXQ1,NXQ2,KM1,KXZ1,KXQ1,Z1,Q1,KM2,KXZ2), & -!!$OMP& SHARED(KXQ2,Z2,Q2,J2,K1S) - - DO K2=1,KM2 - DO I=1,IM - K1=K1S(I,K2) - IF(K1.EQ.1.OR.K1.EQ.KM1-1) THEN - Z2S=Z2(1+(I-1)*IXZ2+(K2-1)*KXZ2) - Z1A=Z1(1+(I-1)*IXZ1+(K1-1)*KXZ1) - Z1B=Z1(1+(I-1)*IXZ1+(K1+0)*KXZ1) - FFA(I)=(Z2S-Z1B)/(Z1A-Z1B) - FFB(I)=(Z2S-Z1A)/(Z1B-Z1A) - GGA(I)=1/(Z1A-Z1B) - GGB(I)=1/(Z1B-Z1A) - ELSEIF(K1.GT.1.AND.K1.LT.KM1-1) THEN - Z2S=Z2(1+(I-1)*IXZ2+(K2-1)*KXZ2) - Z1A=Z1(1+(I-1)*IXZ1+(K1-2)*KXZ1) - Z1B=Z1(1+(I-1)*IXZ1+(K1-1)*KXZ1) - Z1C=Z1(1+(I-1)*IXZ1+(K1+0)*KXZ1) - Z1D=Z1(1+(I-1)*IXZ1+(K1+1)*KXZ1) - FFA(I)=(Z2S-Z1B)/(Z1A-Z1B)* & - (Z2S-Z1C)/(Z1A-Z1C)* & - (Z2S-Z1D)/(Z1A-Z1D) - FFB(I)=(Z2S-Z1A)/(Z1B-Z1A)* & - (Z2S-Z1C)/(Z1B-Z1C)* & - (Z2S-Z1D)/(Z1B-Z1D) - FFC(I)=(Z2S-Z1A)/(Z1C-Z1A)* & - (Z2S-Z1B)/(Z1C-Z1B)* & - (Z2S-Z1D)/(Z1C-Z1D) - FFD(I)=(Z2S-Z1A)/(Z1D-Z1A)* & - (Z2S-Z1B)/(Z1D-Z1B)* & - (Z2S-Z1C)/(Z1D-Z1C) - GGA(I)= 1/(Z1A-Z1B)* & - (Z2S-Z1C)/(Z1A-Z1C)* & - (Z2S-Z1D)/(Z1A-Z1D)+ & - (Z2S-Z1B)/(Z1A-Z1B)* & - 1/(Z1A-Z1C)* & - (Z2S-Z1D)/(Z1A-Z1D)+ & - (Z2S-Z1B)/(Z1A-Z1B)* & - (Z2S-Z1C)/(Z1A-Z1C)* & - 1/(Z1A-Z1D) - GGB(I)= 1/(Z1B-Z1A)* & - (Z2S-Z1C)/(Z1B-Z1C)* & - (Z2S-Z1D)/(Z1B-Z1D)+ & - (Z2S-Z1A)/(Z1B-Z1A)* & - 1/(Z1B-Z1C)* & - (Z2S-Z1D)/(Z1B-Z1D)+ & - (Z2S-Z1A)/(Z1B-Z1A)* & - (Z2S-Z1C)/(Z1B-Z1C)* & - 1/(Z1B-Z1D) - GGC(I)= 1/(Z1C-Z1A)* & - (Z2S-Z1B)/(Z1C-Z1B)* & - (Z2S-Z1D)/(Z1C-Z1D)+ & - (Z2S-Z1A)/(Z1C-Z1A)* & - 1/(Z1C-Z1B)* & - (Z2S-Z1D)/(Z1C-Z1D)+ & - (Z2S-Z1A)/(Z1C-Z1A)* & - (Z2S-Z1B)/(Z1C-Z1B)* & - 1/(Z1C-Z1D) - GGD(I)= 1/(Z1D-Z1A)* & - (Z2S-Z1B)/(Z1D-Z1B)* & - (Z2S-Z1C)/(Z1D-Z1C)+ & - (Z2S-Z1A)/(Z1D-Z1A)* & - 1/(Z1D-Z1B)* & - (Z2S-Z1C)/(Z1D-Z1C)+ & - (Z2S-Z1A)/(Z1D-Z1A)* & - (Z2S-Z1B)/(Z1D-Z1B)* & - 1/(Z1D-Z1C) - ENDIF - ENDDO -! INTERPOLATE. 
- DO N=1,NM - DO I=1,IM - K1=K1S(I,K2) - IF(K1.EQ.0) THEN - Q2S=Q1(1+(I-1)*IXQ1+(N-1)*NXQ1) - J2S=0 - ELSEIF(K1.EQ.KM1) THEN - Q2S=Q1(1+(I-1)*IXQ1+(KM1-1)*KXQ1+(N-1)*NXQ1) - J2S=0 - ELSEIF(K1.EQ.1.OR.K1.EQ.KM1-1) THEN - Q1A=Q1(1+(I-1)*IXQ1+(K1-1)*KXQ1+(N-1)*NXQ1) - Q1B=Q1(1+(I-1)*IXQ1+(K1+0)*KXQ1+(N-1)*NXQ1) - Q2S=FFA(I)*Q1A+FFB(I)*Q1B - J2S=GGA(I)*Q1A+GGB(I)*Q1B - ELSE - Q1A=Q1(1+(I-1)*IXQ1+(K1-2)*KXQ1+(N-1)*NXQ1) - Q1B=Q1(1+(I-1)*IXQ1+(K1-1)*KXQ1+(N-1)*NXQ1) - Q1C=Q1(1+(I-1)*IXQ1+(K1+0)*KXQ1+(N-1)*NXQ1) - Q1D=Q1(1+(I-1)*IXQ1+(K1+1)*KXQ1+(N-1)*NXQ1) - Q2S=FFA(I)*Q1A+FFB(I)*Q1B+FFC(I)*Q1C+FFD(I)*Q1D - J2S=GGA(I)*Q1A+GGB(I)*Q1B+GGC(I)*Q1C+GGD(I)*Q1D - IF(Q2S.LT.MIN(Q1B,Q1C)) THEN - Q2S=MIN(Q1B,Q1C) - J2S=0 - ELSEIF(Q2S.GT.MAX(Q1B,Q1C)) THEN - Q2S=MAX(Q1B,Q1C) - J2S=0 - ENDIF - ENDIF - Q2(1+(I-1)*IXQ2+(K2-1)*KXQ2+(N-1)*NXQ2)=Q2S - J2(1+(I-1)*IXQ2+(K2-1)*KXQ2+(N-1)*NXQ2)=J2S - ENDDO - ENDDO - ENDDO -!!$OMP END PARALLEL DO - - END SUBROUTINE TERP3 - end module utils diff --git a/sorc/fbwndgfs.fd/fbwndgfs.f b/sorc/fbwndgfs.fd/fbwndgfs.f deleted file mode 100644 index ce7505fd1b3..00000000000 --- a/sorc/fbwndgfs.fd/fbwndgfs.f +++ /dev/null @@ -1,969 +0,0 @@ -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C . . . . -C MAIN PROGRAM: FBWNDGFS -C PRGMMR: VUONG ORG: NP11 DATE: 2005-08-03 -C -C ABSTRACT: THIS PROGRAM CREATES BULLETINS OF FORECAST WINDS AND -C TEMPS FOR UP TO 15 LEVELS FOR THE PACIFIC REGION. -C THE PRIMARY (RMP) IS RUN. THE PROGRAM IS SET UP TO RUN 4 TIMES PER -C DAY (T00Z, T06Z, T12Z AND T18Z). -C EACH BULLETIN OF A SET REPRESENTS A 6, 12 OR 24 HR FCST. -C THE PROGRAM GENERATES THE FOLLOWING BULLETINS: -C FBOC31, FBOC33, FBOC35, FBOC37, FBOC38, FBOC39 -C THE STATION FILE (FBWNDGFS.STNLIST) IS KEYED TO INDICATE WHICH BULLETIN -C EACH STATION BELONGS IN. THE WIND DIRECTION (TENS OF DEGREES), WIND SPEED -C (KNOTS) & TEMPERATURE (CELSIUS) IN THE FORM (DDff+TT) FOR EACH STATION -C AND LEVEL APPEAR IN THE BULLETIN.
WHERE DD IS THE WIND DIRECTION, -C ff IS THE WIND SPEED, AND TT IS THE TEMPERATURE -C THE FORECAST INPUT DATA IS GFS GLOBAL LAT/LON GRID 128 (0.313 DEGREE) -C FORECAST FILES U,V,& T FIELDS, 15 LEVELS: 1000', 1500', 2000', 3000', -C 6000', 9000', 12000', 15000' + 500, 400, 300, 250, 200, 150 AND 100MB -C -C THE INPUT STATION RECORD FOR EACH STATION CONTAINS STN ELEVATION -C AND LATITUDE/LONGITUDE POSITION. -C -C PROGRAM HISTORY LOG: -C 1986-01-03 CAVANAUGH -C 2004-06-29 VUONG MODIFIED THE PROGRAM TO WORK WITH GFS DATA AND -C RUN 4 TIMES PER DAY (T00Z,T06Z,T12Z AND T18Z). -C 2005-08-03 VUONG CHANGED THE FOR USE TIMES SPECIFIED ON WIND AND -C TEMPERATURE ALOFT 6 AND 12 HOUR FORECAST BULLETINS -C 2007-07-03 VUONG CHANGED NO. OF POINTS FOR GFS GLOBAL GAUSSIAN -C LAT/LON GRID 128 -C 2010-05-26 VUONG CHANGED NO. OF POINTS FOR GFS (T574) GAUSSIAN -C LAT/LON GRID -C 2012-08-16 VUONG MODIFIED VARIABLES NNPOS AND CHANGED -C VARIABLE ENVVAR TO CHARACTER*6 -C 2016-05-16 VUONG MODIFIED CODE TO USE MODULE GDSWZD_MOD IN IP.v3.0.0 -C -C USAGE: -C INPUT FILES: -C FORT.05 FBWNDGFS.STNLIST STATION DIRECTORY -C -C - GFS (T574) GLOBAL GAUSSIAN LAT/LON GRID (0.205 DEGREE) -C DIMENSIONS 1760 x 880 = 1548800 -C FORT.11 /COM/GFS/PROD/GFS.${PDY}/GFS.${CYCLE}.MASTER.GRBF06 -C FORT.12 /COM/GFS/PROD/GFS.${PDY}/GFS.${CYCLE}.MASTER.GRBF12 -C FORT.13 /COM/GFS/PROD/GFS.${PDY}/GFS.${CYCLE}.MASTER.GRBF24 -C - GFS INDEX FILES FOR GRIB GRID 128: -C FORT.31 /COM/GFS/PROD/GFS.${PDY}/GFS.${CYCLE}.MASTER.GRBIF06 -C FORT.32 /COM/GFS/PROD/GFS.${PDY}/GFS.${CYCLE}.MASTER.GRBIF12 -C FORT.33 /COM/GFS/PROD/GFS.${PDY}/GFS.${CYCLE}.MASTER.GRBIF24 -C -C WHERE PDY = YYYYMMDD, YYYY IS THE YEAR, MM IS THE MONTH, -C DD IS THE DAY OF THE MONTH -C AND -C CYCLE = T00Z, T06Z, T12Z, T18Z -C -C OUTPUT FILES: -C FORT.06 ERROR MESSAGES -C FORT.51 BULLETIN RECORDS FOR TRANSMISSION -C -C SUBPROGRAMS CALLED: (LIST ALL CALLED FROM ANYWHERE IN CODES) -C LIBRARY: -C W3AI15 WXAI19 W3FC05 W3FI01 -C GETGB (FOR GRIB FILES) -C W3FT01 
W3TAGE XMOVEX XSTORE W3UTCDAT -C -C EXIT STATES: -C COND = 110 STN DIRECTORY READ ERR (CONSOLE MSG) -C 1050 NO DATA (FIELD ID IS PRINTED)(FT06 + CONSOLE) -C 1060 NO DATA (FIELD ID IS PRINTED)(FT06 + CONSOLE) -C 1070 NO DATA (FIELD ID IS PRINTED)(FT06 + CONSOLE) -C 1080 NO DATA (FIELD ID IS PRINTED)(FT06 + CONSOLE) -C 1090 NO DATA (FIELD ID IS PRINTED)(FT06 + CONSOLE) -C ALL ARE FATAL -C PLUS W3LIB SUB-RTN RETURN CODES -C -C ATTRIBUTES: -C LANGUAGE: F90 FORTRAN -C MACHINE: IBM WCOSS -C -C$$$ -C - use gdswzd_mod - PARAMETER (NPTS=1548800) - PARAMETER (MAXSTN=800) - PARAMETER (IMAX=1760,JMAX=880) -C - REAL ALAT(MAXSTN),ALON(MAXSTN) - REAL ISTN(MAXSTN),JSTN(MAXSTN) - REAL ERAS(3),FHOUR,FILL - REAL RFLD(NPTS),RINTRP(IMAX,JMAX) - REAL XPTS(NPTS),YPTS(NPTS),RLON(NPTS),RLAT(NPTS) -C -C...MAX NR STNS FOR READ-END SHOULD BE GT ACTUAL NR OF STNS ON STN FILE - INTEGER IELEV(MAXSTN),IRAS(3),KTAU(3) - INTEGER JTIME(8),NDATE(8),MDATE(8) - INTEGER JGDS(100),KGDS(200),JREW,KBYTES - INTEGER KPDS(27),MPDS(27),KREW - INTEGER KSTNU(MAXSTN,15) - INTEGER LMTLWR(2),LMTUPR(2),NTTT -C...NPOS(ITIVE) IS TRANSMISSION SIGN 7C MASK FOR TEMP - INTEGER ICKYR,ICKMO,ICKDAY,ICKHR - INTEGER KSTNV(MAXSTN,15),KSTNT(MAXSTN,15) - INTEGER IDWD1H(3),IDWD2H(3) - INTEGER IDWD1P(3),IDWD2P(3) - INTEGER IDWD2(15),NHGTP(15) -C -C...S,L,T,B ARE SUBSCRIPTS FOR SEQ NR OF STATION, LEVEL, TAU, BULLETIN -C... 
B IS COUNT OF BULTNS WITHIN TAU, BB IS COUNT WITHIN RUN -C - INTEGER S,L,T,B, BB -C - CHARACTER*6 NHGT6(15), AWIPSID(6) - CHARACTER*1 BSTART,BEND - CHARACTER*1 BULTN(1280) - CHARACTER*1 SPACE(1280) - CHARACTER*1 ETBETX,ETB,ETX,ICK,ICKX - CHARACTER*1 INDIC(MAXSTN),LF,MINUS - CHARACTER*1 MUSES(MAXSTN) - CHARACTER*1 SPC80(80),TSRCE,TMODE,TFLAG - CHARACTER*3 CRCRLF - CHARACTER*4 ITRTIM,STNID(MAXSTN),IVALDA - CHARACTER*1 NNPOS - CHARACTER*4 NFDHDG(6),NCATNR(6),NVALTM(12) - CHARACTER*9 NUSETM(12) -C - CHARACTER*8 IBLANK,IBSDA,IBSTI,ITRDA - CHARACTER*8 ITEMP(MAXSTN,15),IWIND(MAXSTN,15) - CHARACTER*8 NFILE,NTTT4,RF06,RF12,RF24 - CHARACTER*6 ENVVAR - CHARACTER*80 FILEB,FILEI,SPCS,FILEO -C - CHARACTER*86 LINE73 - CHARACTER*40 LN73A,NBUL1 - CHARACTER*46 LN73B - CHARACTER*84 NBULHD - CHARACTER*34 NBUL2 - CHARACTER*32 NBASHD - CHARACTER*60 NVALHD -C - LOGICAL ENDBUL,KBMS(NPTS) -C - EQUIVALENCE (ICK,ICKX) - EQUIVALENCE (RFLD(1),RINTRP(1,1)) - EQUIVALENCE (NBULHD(1:1),NBUL1(1:1)) - EQUIVALENCE (NBULHD(41:41),NBUL2(1:1)) - EQUIVALENCE (LINE73(1:1),LN73A(1:1)) - EQUIVALENCE (LINE73(41:41),LN73B(1:1)) - EQUIVALENCE (SPCS,SPC80) - EQUIVALENCE (NTTT,NTTT4(1:1)) -C - DATA INDEX /1/ - DATA NCYCLK/ 0 / - DATA LIN / 0 / - DATA FHOUR /24.0/ - DATA KTAU /06,12,24/ - DATA LMTLWR/1,11/ - DATA LMTUPR/10,15/ - DATA IDWD1H/ 33, 34, 11/ - DATA IDWD2H/ 103, 103, 103/ - - DATA IDWD1P/ 33, 34, 11/ - DATA IDWD2P/ 100, 100, 100/ - - DATA IDWD2 / 305, 457, 610, 914, - 1 1829, 2743, 3658, 4572, - 2 500, 400, 300, 250, - 3 200, 150, 100/ - - DATA NHGT6 /'1000 ','1500 ','2000 ','3000 ', - 1 '6000 ','9000 ','12000 ','15000 ', - 2 '18000 ','24000 ','30000 ','34000 ', - 3 '39000 ','45000 ','53000'/ - DATA NHGTP /5,5,6,6,6,6,6,6,6,6,5,5,5,5,5/ - DATA BSTART/'B'/ - DATA BEND /'E'/ - DATA ETB /'>'/ - DATA ETX /'%'/ - DATA MINUS /'-'/ - DATA SPC80 /80*' '/ - DATA CRCRLF/'<<@'/ - DATA IBLANK/' '/ - DATA AWIPSID / 'FD1OC1','FD8OC7','FD3OC3', - 1 'FD9OC8','FD5OC5','FD0OC9'/ - DATA NFDHDG/ - 1 
'OC31','OC37','OC33','OC38','OC35','OC39'/ - DATA NCATNR/ - 1 '1377','5980','1378','5981','1379','5982'/ - DATA NVALTM/ - 1 '0600','1200','0000','1200','1800','0600', - 2 '1800','0000','1200','0000','0600','1800'/ - DATA NUSETM/ - 1 '0200-0900','0900-1800','1800-0600', - 2 '0800-1500','1500-0000','0000-1200', - 3 '1400-2100','2100-0600','0600-1800', - 4 '2000-0300','0300-1200','1200-0000'/ -C - DATA RF06 /'6 HOURS '/ - DATA RF12 /'12 HOURS'/ - DATA RF24 /'24 HOURS'/ - DATA LN73A /' '/ - DATA LN73B /' <<@^^^'/ - DATA NBUL1 / - 1 '''10 PFB '/ - DATA NBUL2/ - 1 'FB KWNO <<@^^^ <<@$'/ - DATA NBASHD/'DATA BASED ON Z <<@@^^^'/ - DATA NVALHD/ - 1 'VALID Z FOR USE - Z. TEMPS NEG ABV 24000<<@@^'/ -C -C - NNPOS = CHAR(124) - LUGO = 51 - CALL W3TAGB('FBWNDGFS',2012,0184,0184,'NP11') - ENVVAR='FORT ' - WRITE(ENVVAR(5:6),FMT='(I2)') LUGO - CALL GETENV(ENVVAR,FILEO) -C - OPEN(LUGO,FILE=FILEO,ACCESS='DIRECT',RECL=1281) - IREC=1 -C...GET COMPUTER DATE-TIME & SAVE FOR DATA DATE VERIFICATION - CALL W3UTCDAT(JTIME) -C -C...READ AND STORE STATION LIST FROM UNIT 5 -C...INDIC = INDICATOR BEGIN, OR END, BULTN ('B' OR 'E') -C...MUSES = USED IN MULTIPLE BULTNS (FOR SAME TAU) IF '+' -C - DO 25 I = 1, MAXSTN - READ(5,10,ERR=109,END=130) INDIC(I),MUSES(I),STNID(I), - & IELEV(I),ALAT(I),ALON(I) - 25 CONTINUE -C -C/////////////////////////////////////////////////////////////////// - 10 FORMAT(A1,A1,A4,1X,I5,1X,F6.2,1X,F7.2) -C -C...ERROR - 109 CONTINUE - CALL W3TAGE('FBWNDGFS') - PRINT *,'STATION LIST READ ERROR' - CALL ERREXIT (110) -C//////////////////////////////////////////////////////////////////// -C - 130 CONTINUE -C -C CONVERT THE LAT/LONG COORDINATES OF STATION TO LAMBERT -C CONFORMAL PROJECTION I,J COORDINATES FOR GRID 221 -C - NRSTNS = I-1 - WRITE(6,'(A19,1X,I0)') ' NO. OF STATIONS = ',NRSTNS -C -C...END READ. 
COUNT OF STATIONS STORED -C -C...GET EXEC PARMS -C...PARM FIELD TAKEN OUT, NEXT 4 VALUES HARD WIRED - TMODE = 'M' - TSRCE = 'R' - TFLAG = 'P' - PRINT *,'SOURCE=',TSRCE,' MODE=',TMODE,' FLAG=',TFLAG -C -C********************************************************************** -C -C...READ PACKED DATA, UNPACK, INTERPOLATE, STORE IN STATION ARRAYS, -C... CREATE BULTN HDGS, INSERT STATION IN BULTNS, & WRITE BULTNS. -C - BB = 0 -C -C...BEGIN TAU -C - DO 7000 ITAU=1, 3 -C - WRITE(6,'(A6,1X,I0)') ' ITAU=',ITAU - T = ITAU -C -C SELECT FILE FOR TAU PERIOD (PRIMARY RUN) -C - IF (KTAU(ITAU).EQ.6) THEN - NFILE = RF06 - LUGB = 11 - LUGI = 31 - ELSE IF (KTAU(ITAU).EQ.12) THEN - NFILE = RF12 - LUGB = 12 - LUGI = 32 - ELSE - NFILE = RF24 - LUGB = 13 - LUGI = 33 - ENDIF -C - WRITE(ENVVAR(5:6),FMT='(I2)') LUGB - CALL GETENV(ENVVAR,FILEB) - CALL BAOPENR(LUGB,FILEB,IRET) - WRITE(ENVVAR(5:6),FMT='(I2)') LUGI - CALL GETENV(ENVVAR,FILEI) - CALL BAOPENR(LUGI,FILEI,IRET) - PRINT 1025,NFILE, FILEB, FILEI - 1025 FORMAT('NFILE= ',A8,2X,'GRIB FILE= ',A55,'INDEX FILE= ',A55) -C -C.................................. - DO 2450 ITYP=1,3 -C -C... SEE O.N. 388 FOR FILE ID COMPOSITION -C - DO 2400 L=1,15 -C -C...USE SOME OF THE VALUES IN THE PDS TO GET RECORD -C -C MPDS = -1 SETS ARRAY MPDS TO -1 -C MPDS(3) = GRID IDENTIFICATION (PDS BYTE 7) -C MPDS(5) = INDICATOR OF PARAMETER (PDS BYTE 9) -C MPDS(6) = INDICATOR OF TYPE OF LEVEL OR LAYER (PDS BYTE 10) -C MPDS(7) = HGT,PRES,ETC. OF LEVEL OR LAYER (PDS BYTE 11,12) -C MPDS(14) = P1 - PERIOD OF TIME (PDS BYTE 19) -C VALUES NOT SET TO -1 ARE USED TO FIND RECORD -C - JREW = 0 - KREW = 0 - MPDS = -1 -C -C MPDS(3) = -1 - IF (L.LE.8) THEN - MPDS(5) = IDWD1H(ITYP) -C... HEIGHT ABOVE MEAN SEA LEVEL GPML - MPDS(6) = IDWD2H(ITYP) - ELSE - MPDS(5) = IDWD1P(ITYP) -C... PRESSURE IN HectoPascals (hPa) ISBL - MPDS(6) = IDWD2P(ITYP) - ENDIF - MPDS(7) = IDWD2(L) - MPDS(14) = KTAU(ITAU) -C -C... THE FILE ID COMPLETED. -C PRINT *,MPDS -C... GET THE DATA FIELD. 
-C - CALL GETGB(LUGB,LUGI,NPTS,JREW,MPDS,JGDS, - & KBYTES,KREW,KPDS,KGDS,KBMS,RFLD,IRET) -C WRITE(*,119) KPDS -119 FORMAT( 1X, 'MAIN: KPDS:', 3(/1X,10(I5,2X) ) ) - -C -C/////////////////////////////////////////////////////////////////////// -C...ERROR - IF (IRET.NE.0) THEN - write(*,120) (MPDS(I),I=3,14) -120 format(1x,' MPDS = ',12(I0,1x)) - WRITE(6,'(A9,1X,I0)') ' IRET = ',IRET - IF (IRET.EQ.96) THEN - PRINT *,'ERROR READING INDEX FILE' - CALL W3TAGE('FBWNDGFS') - CALL ERREXIT (1050) - ELSE IF (IRET.EQ.97) THEN - PRINT *,'ERROR READING GRIB FILE' - CALL W3TAGE('FBWNDGFS') - CALL ERREXIT (1060) - ELSE IF (IRET.EQ.98) THEN - PRINT *,'NUMBER OF DATA POINT GREATER', - * ' THAN NPTS' - CALL W3TAGE('FBWNDGFS') - CALL ERREXIT (1070) - ELSE IF (IRET.EQ.99) THEN - PRINT *,'RECORD REQUESTED NOT FOUND' - CALL W3TAGE('FBWNDGFS') - CALL ERREXIT (1080) - ELSE - PRINT *,'GETGB-W3FI63 GRIB UNPACKER', - * ' RETURN CODE' - CALL W3TAGE('FBWNDGFS') - CALL ERREXIT (1090) - END IF - ENDIF -C -C...GET DATE-TIME FOR LATER BULTN HDG PROCESSING -C - ICKYR = KPDS(8) + 2000 - ICKMO = KPDS(9) - ICKDAY= KPDS(10) - ICKHR = KPDS(11) * 100 - IF (ICKHR.EQ.0000) ICYC=1 - IF (ICKHR.EQ.0600) ICYC=2 - IF (ICKHR.EQ.1200) ICYC=3 - IF (ICKHR.EQ.1800) ICYC=4 - IBSTIM=ICKHR -C -C...GET NEXT DAY - FOR VALID DAY AND 12Z AND 18Z BACKUP TRAN DAY -C...UPDATE TO NEXT DAY - NHOUR=ICKHR*.01 - CALL W3MOVDAT((/0.,FHOUR,0.,0.,0./), - & (/ICKYR,ICKMO,ICKDAY,0,NHOUR,0,0,0/),NDATE) - CALL W3MOVDAT((/0.,FHOUR,0.,0.,0./),NDATE,MDATE) -C -C...12Z, 18Z CYCLE,BACKUP RUN,24HR FCST: VALID DAY IS DAY-AFTER-NEXT -C...NEXT DAY-OF-MONTH NOW STORED IN 'NDATE(3)' -C...NEXT DAY PLUS 1 IN 'MDATE(3)' -C -C CONVERT EARTH COORDINATES OF STATION TO GRID COORDINATES - DO 110 J = 1,NRSTNS -C CALL GDSWIZ(KGDS,-1,1,FILL,XPTS(J),YPTS(J), -C & ALON(J),ALAT(J),IRET,0,DUM,DUM) - CALL GDSWZD(KGDS,-1,1,FILL,XPTS(J),YPTS(J), - & ALON(J),ALAT(J),IRET) - ISTN(J) = XPTS(J) - JSTN(J) = YPTS(J) -C PRINT 111,STNID(J),ALAT(J),ALON(J),ISTN(J),JSTN(J) - 111 
FORMAT (3X,A3,2(2X,F8.2),2(2X,F8.3)) - 110 CONTINUE -C -C...CONVERT DATA TO CONVENTIONAL UNITS: -C... WIND FROM METERS/SEC TO KNOTS (2 DIGITS), -C WIND DIRECTION IN TENS OF DEGREES (2 DIGITS), -C AND TEMP FROM K TO CELSIUS (2 DIGITS) -C - DO 1500 I=1,NPTS -C - IF (ITYP.EQ.3) THEN - RFLD(I)=RFLD(I)-273.15 - ELSE - RFLD(I)=RFLD(I)*1.94254 - ENDIF -C - 1500 CONTINUE -C - DO 2300 S=1,NRSTNS -C -C INTERPOLATE GRIDPOINT DATA TO STATION. -C - CALL W3FT01(ISTN(S),JSTN(S),RINTRP,X,IMAX,JMAX,NCYCLK,LIN) -C WRITE(6,830) STNID(S),ISTN(S),JSTN(S),X -830 FORMAT(1X,'STN-ID = ', A4,3X,'SI,SJ = ', 2(F5.1,2X), 1X, - A 'X = ', F10.0) -C -C...INTERPOLATION COMPLETE FOR THIS STATION -C -C...CONVERT WIND, U AND V TO INTEGER -C - IF (ITYP.EQ.1) THEN - KSTNU(S,L)=X*100.0 - ELSE IF (ITYP.EQ.2) THEN - KSTNV(S,L)=X*100.0 -C...CONVERT TEMP TO I*2 - ELSE IF (ITYP.EQ.3) THEN - KSTNT(S,L)=X*100.0 - ENDIF -C - 2300 CONTINUE -C...END OF STATION LOOP -C................................... -C - 2400 CONTINUE -C...END OF LEVEL LOOP -C................................... -C - 2450 CONTINUE -C...END OF DATA TYPE LOOP -C................................... -C -C...INTERPOLATED DATA FOR ALL STATIONS,1 TAU, NOW ARRAYED IN KSTNU-V-T. -C*********************************************************************** -C -C...CONVERT WIND COMPONENTS TO DIRECTION AND SPEED -C -C................................. -C...BEGIN STATION -C - DO 3900 S=1,NRSTNS -C................................. 
- DO 3750 L=1,15 -C -C...PUT U & V WIND COMPONENTS IN I*4 WORK AREA - IRAS(1)=KSTNU(S,L) - IRAS(2)=KSTNV(S,L) -C...FLOAT U & V - ERAS(1)=FLOAT(IRAS(1))*.01 - ERAS(2)=FLOAT(IRAS(2))*.01 -C -C...CONVERT TO WIND DIRECTION & SPEED -C - CALL W3FC05(ERAS(1),ERAS(2),DD,SS) -C -C...WITH DIR & SPEED IN WORK AREA, PLACE TEMPERATURE -TT- IN WORK - IRAS(3)=KSTNT(S,L) - TT=FLOAT(IRAS(3))*.01 -C -C...DIRECTION, SPEED & TEMP ALL REQUIRE ADDITIONAL TREATMENT TO -C MEET REQUIREMENTS OF BULLETIN FORMAT -C - NDDD=(DD+5.0)/10.0 -C...WIND DIRECTION ROUNDED TO NEAREST 10 DEGREES -C -C...THERE IS A POSSIBILITY WIND DIRECTION NOT IN RANGE 1-36 - - IF ((NDDD.GT.36).OR.(NDDD.LE.0)) THEN - NDDD = MOD(NDDD, 36) - IF (NDDD.LE.0) NDDD = NDDD + 36 - ENDIF - NSSS=SS+0.5 -C -C...WIND SPEED ROUNDED TO NEAREST KNOT -C...FOR SPEED, KEEP UNITS AND TENS ONLY, WIND SPEEDS OF 100 -C THROUGH 199 KNOTS ARE INDICATED BY SUBTRACTING 100 FROM -C THE SPEED AND ADDING 50 TO DIRECTION. -C -C...WIND SPEEDS GREATER THAN 199 KNOTS ARE INDICATED AS A -C FORECAST SPEED OF 199 KNOTS AND ADDING 50 TO DIRECTION. -C - IF (NSSS.GT.199) THEN - NSSS=99 - NDDD=NDDD+50 -C...SPEED GT 99 AND LE 199 KNOTS - ELSE IF (NSSS.GT.99) THEN - NSSS=NSSS-100 - NDDD=NDDD+50 -C -C...SPEED LT 5 KNOTS (CONSIDERED CALM) AND EXPRESSED BY "9900" - ELSE IF (NSSS.LT.5) THEN - NSSS=0 - NDDD=99 - ENDIF -C -C...COMBINE DIR & SPEED IN ONE WORD I*4 - NDDSS=(NDDD*100)+NSSS -C -C...STORE IN ASCII IN LEVEL ARRAY, WIND FOR ONE STATION - CALL W3AI15(NDDSS,IWIND(S,L),1,4,MINUS) -C -C...TEMP NEXT. IF POSITIVE ROUND TO NEAREST DEGREE, CONV TO ASCII - NTTT = TT - IF (TT.LE.-0.5) NTTT = TT - 0.5 - IF (TT.GE.0.5) NTTT = TT + 0.5 - CALL W3AI15(NTTT,NTTT,1,3,MINUS) - IF (TT.GT.-0.5) NTTT4(1:1) = NNPOS(1:1) - -C...SIGN & 2 DIGITS OF TEMP NOW IN ASCII IN LEFT 3 BYTES OF NTTT -C - ITEMP(S,L)(1:3) = NTTT4(1:3) -C - 3750 CONTINUE -C...END LEVEL (WIND CONVERSION) -C.................................
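The FD-bulletin encoding rules spelled out in the comments above (direction rounded to tens of degrees and folded into 1-36, a +50 flag on direction for speeds of 100-199 kt, speeds above 199 kt capped, and winds under 5 kt coded as calm "9900") can be sketched in Python. The function name and the plain float arguments are illustrative, not part of the original Fortran:

```python
def encode_fd_wind(dd_deg, ss_kt):
    """Encode wind direction (deg) and speed (kt) as a 4-digit FD code."""
    nddd = int((dd_deg + 5.0) / 10.0)   # direction rounded to nearest 10 deg
    if nddd > 36 or nddd <= 0:          # fold direction back into 1-36
        nddd = nddd % 36
        if nddd <= 0:
            nddd += 36
    nsss = int(ss_kt + 0.5)             # speed rounded to nearest knot
    if nsss > 199:                      # >199 kt: report as 199, flag +50
        nsss = 99
        nddd += 50
    elif nsss > 99:                     # 100-199 kt: subtract 100, flag +50
        nsss -= 100
        nddd += 50
    elif nsss < 5:                      # under 5 kt is considered calm: "9900"
        nsss = 0
        nddd = 99
    return nddd * 100 + nsss            # combine direction and speed
```

For example, 270 deg at 30 kt encodes as 2730, while 150 kt winds pick up the +50 direction flag.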
-C -C...AT END OF LVL LOOP FOR ONE STATION, ALL WIND & TEMP DATA IS ARRAYED, -C... IN ASCII, IN IWIND (4 CHARACTER DIR & SPEED) AND ITEMP (3 CHAR -C... INCL SIGN FOR 1ST 10 LVLS, 2 CHAR WITH NO SIGN FOR 5 UPPER LVLS) -C ABOVE 24,000 FEET, THE SIGN IS OMITTED SINCE TEMPERATURES ARE NEGATIVE. -C -C...BEFORE INSERTING INTO BULTN, TEMPS FOR LVLS OTHER THAN 3000' -C... WHICH ARE LESS THAN 2500' ABOVE STATION MUST BE ELIMINATED. -C... (TEMPS FOR 3000' ARE NOT TRANSMITTED) -C...WINDS ARE BLANKED FOR LVLS LESS THAN 1500' ABOVE STATION. -C - IF (IELEV(S).GT.9500) ITEMP(S,7) = IBLANK - IF (IELEV(S).GT.6500) ITEMP(S,6) = IBLANK - IF (IELEV(S).GT.3500) ITEMP(S,5) = IBLANK - ITEMP(S,4)=IBLANK - ITEMP(S,3)=IBLANK - ITEMP(S,2)=IBLANK - ITEMP(S,1)=IBLANK -C - IF (IELEV(S).GT.10500) IWIND(S,7) = IBLANK - IF (IELEV(S).GT.7500) IWIND(S,6) = IBLANK - IF (IELEV(S).GT.4500) IWIND(S,5) = IBLANK - IF (IELEV(S).GT.1500) IWIND(S,4) = IBLANK - -C...DATA FOR 1 STATION, 15 LVLS, 1 TAU NOW READY FOR BULTN LINE -C - 3900 CONTINUE -C...END STATION (WIND CONVERSION) -C -C...DATA FOR ALL STATIONS, ONE TAU, NOW READY FOR BULTN INSERTION -C********************************************************************** -C********************************************************************* -C -C...BULLETIN CREATION -C...REACH THIS POINT ONCE PER TAU -C...B IS BULTN CNT FOR TAU, BB CUMULATIVE BULTN CNT FOR RUN, -C... S IS SEQ NR OF STN. -C... (NOT NEEDED FOR U.S. WHICH IS SET AT #1.) - B = 0 - S = 0 - ENDBUL = .FALSE. -C - DO 6900 J = 1,2 -C....................................................................... -C -C...UPDATE STATION COUNTER -C - 4150 S = S + 1 -C - ICKX=INDIC(S) - IF (ICK(1:1).EQ.BSTART(1:1)) THEN - -C...GO TO START, OR CONTINUE, BULTN -C -C...BEGIN BULLETIN -C -C - B = B + 1 - BB = BB + 1 -C*********************************************************************** -C -C...PROCESS DATE-TIME FOR HEADINGS -C - IF (BB.EQ.1) THEN -C............................... 
-C...ONE TIME ENTRIES -C -C...TRAN HDGS - ITRDAY=JTIME(3) - IBSDAY=ICKDAY - WRITE(ITRTIM(1:4),'(2(I2.2))') JTIME(5), JTIME(6) -C - IF (TMODE.EQ.'T') THEN -C...BACKUP - IF (ICYC.EQ.3.OR.ICYC.EQ.4) THEN -C...TRAN DAY WILL BE NEXT DAY FOR 12Z, 18Z CYCLE BACKUP - ITRDAY=NDATE(3) - IF (ICYC.EQ.4.AND.T.EQ.3) IVLDAY=MDATE(3) - ENDIF - ENDIF -C...END TRAN BACKUP DAY-HOUR -C -C...PLACE TRAN & BASE DAY-HOUR IN HDGS - CALL W3AI15(ITRDAY,ITRDA,1,2,MINUS) - CALL W3AI15(IBSDAY,IBSDA,1,2,MINUS) - CALL W3AI15(IBSTIM,IBSTI,1,4,MINUS) -C - NBUL2(13:14) = ITRDA(1:2) - NBUL2(15:18) = ITRTIM(1:4) -C - NBASHD(15:16) = IBSDA(1:2) - NBASHD(17:20) = IBSTI(1:4) - ENDIF - -C **************************************************************** -C **************************************************************** -C IF REQUIRED TO INDICATE THE SOURCE FOR THESE FD BULLETINS -C REMOVE THE COMMENT STATUS FROM THE NEXT TWO LINES -C **************************************************************** -C **************************************************************** -C -C...END ONE-TIME ENTRIES -C............................ -C -C...BLANK OUT CONTROL DATE AFTER 1ST BULTN - IF (BB.EQ.2) NBULHD(13:20) = SPCS(1:8) -C -C...CATALOG NUMBER (AND 'P' OR 'B' FOR PRIMARY OR BACKUP RUN) - NBULHD(8:8) = TFLAG - NBULHD(4:7) = NCATNR(BB)(1:4) - NBULHD(43:46) = NFDHDG(BB)(1:4) -C -C INSERT AWIPS ID INTO BULLETIN HEADER -C - NBUL2(25:30) = AWIPSID(BB)(1:6) - - -C...END CATALOG NR -C -C...END TRAN HDGS -C..................................................................... 
-C -C...VALID-USE HDGS - IF (TMODE.EQ.'T') THEN - -C...BACKUP DAY-HOURS WILL BE SAME AS PRIMARY RUN OF OPPOSITE CYCLE - IVLDAY=NDATE(3) - IF (ICYC.EQ.1.AND.T.EQ.1) IVLDAY=IBSDAY - IF (ICYC.EQ.4.AND.T.EQ.3) IVLDAY=MDATE(3) -C -C...SET POINTER OPPOSITE (USE WITH T -RELATIVE TAU- TO SET HOURS) - IF (ICYC.EQ.1) KCYC=2 - IF (ICYC.EQ.3) KCYC=1 - ELSE - IVLDAY=IBSDAY - IF (T.EQ.3) IVLDAY=NDATE(3) - IF (ICYC.EQ.3.AND.T.EQ.2) IVLDAY=NDATE(3) - IF (ICYC.EQ.4) IVLDAY=NDATE(3) - ENDIF - -C...END BACKUP DAY-HOUR. -C -C...CONVERT TO ASCII AND PLACE IN HDGS - CALL W3AI15(IVLDAY,IVALDA,1,2,MINUS) - NVALHD(7:8) = IVALDA(1:2) - IITAU = ITAU - IF (ICYC.EQ.2) IITAU = ITAU + 3 - IF (ICYC.EQ.3) IITAU = ITAU + 6 - IF (ICYC.EQ.4) IITAU = ITAU + 9 - NVALHD(9:12) = NVALTM(IITAU)(1:4) - NVALHD(25:33) = NUSETM(IITAU)(1:9) -C -C...END VALID-USE HDGS -C -C...MOVE WORK HDGS TO BULTN O/P (TRAN, BASE, VALID, HEIGHT HDGS) - NEXT=0 - CALL WXAI19(NBULHD,74,BULTN,1280,NEXT) -C PRINT *,(NBULHD(L:L),L=41,70) - CALL WXAI19(NBASHD,28,BULTN,1280,NEXT) -C PRINT *,(NBASHD(L:L),L=1,25) - CALL WXAI19(NVALHD,60,BULTN,1280,NEXT) -C PRINT *, (NVALHD(L:L),L=1,55) - LINE73(1:73) = SPCS(1:73) - LINE73(1:2) = 'FT' - NPOS1 = 5 - DO 4500 N = LMTLWR(J), LMTUPR(J) - IF (N.LE.3) THEN - NPOS1 = NPOS1 - ELSE IF (N.EQ.4) THEN - NPOS1 = NPOS1 - 1 - ELSE IF ((N.GE.5).AND.(N.LE.6)) THEN - NPOS1 = NPOS1 + 2 - ELSE IF ((N.EQ.7).OR.(N.EQ.11)) THEN - NPOS1 = NPOS1 + 1 - ELSE IF (N.GT.7) THEN - NPOS1 = NPOS1 + 2 - ENDIF - NPOS2 = NPOS1 + 4 - LINE73(NPOS1:NPOS2) = NHGT6(N)(1:5) - NPOS1 = NPOS1 + NHGTP(N) - 4500 CONTINUE - -C PRINT *,(LINE73(II:II),II=1,NPOS2) - CALL WXAI19(LINE73,NPOS2,BULTN,1280,NEXT) - CALL WXAI19(CRCRLF,3,BULTN,1280,NEXT) - ENDIF -C -C...BULLETIN HDGS FOR ONE BULTN COMPLETE IN O/P AREA -C -C*********************************************************************** -C -C...CONTINUE BULTN, INSERTING DATA LINES. 
-C - NPOS1 = 5 - LINE73(1:73) = SPCS(1:73) - LINE73(1:1) = '$' - LINE73( 2: 5) = STNID(S)(1:4) - DO 5300 M = LMTLWR(J), LMTUPR(J) - NPOS1 = NPOS1 + 1 - NPOS2 = NPOS1 + 4 - LINE73(NPOS1:NPOS2) = IWIND(S,M)(1:4) - NPOS1 = NPOS1 + 4 - IF ((M.GT.4).AND.(M.LE.10))THEN - NPOS2 = NPOS1 + 2 - LINE73(NPOS1:NPOS2) = ITEMP(S,M)(1:3) - NPOS1 = NPOS1 + 3 - END IF - IF (M.GT.10) THEN - NPOS2 = NPOS1 + 1 - LINE73(NPOS1:NPOS2) = ITEMP(S,M)(2:3) - NPOS1 = NPOS1 + 2 - END IF - 5300 CONTINUE -C PRINT *,(LINE73(II:II),II=2,NPOS2) -C...NXTSAV HOLDS BYTE COUNT IN O/P BULTN FOR RESTORING WXAI19 'NEXT' -C... FIELD SO THAT WHEN 'NEXT' IS RETURNED AS -1, AN ADDITIONAL -C... LINEFEED AND/OR ETB OR ETX CAN BE INSERTED -C - IF (NEXT.GE.1207) THEN - CALL WXAI19 (ETB,1,BULTN,1280,NEXT) - LF = CHAR(10) - do ii=1,next - space(index) = bultn(ii) - if (index .eq. 1280) then - WRITE(51,REC=IREC) space, LF - IREC=IREC + 1 - index = 0 - do kk = 1,1280 - space(kk) = ' ' - enddo - endif - index = index + 1 - enddo -C WRITE(51) BULTN, LF - NEXT = 0 - ENDIF - CALL WXAI19(LINE73,NPOS2,BULTN,1280,NEXT) - CALL WXAI19(CRCRLF,3,BULTN,1280,NEXT) -C -C...AFTER LINE STORED IN O/P, GO TO CHECK BULTN END -C -C................................... -C -C...CHECK FOR LAST STN OF BULTN - IF (ICK(1:1).NE.BEND(1:1)) GO TO 4150 -C -C...END BULLETIN. SET UP RETURN FOR NEXT STN AFTER WRITE O/P. -C...SAVE SEQ NR OF LAST STN FOR SUBSEQUENT SEARCH FOR STNS -C - NXTSAV = NEXT - ENDBUL = .TRUE. -C*********************************************************************** -C -C...OUTPUT SECTION -C - NEXT = NXTSAV - ETBETX = ETB - IF (ENDBUL) ETBETX=ETX -C...END OF TRANSMIT BLOCK, OR END OF TRANSMISSION -C - CALL WXAI19(ETBETX,1,BULTN,1280,NEXT) -C -C...OUTPUT TO HOLD FILES - LF = CHAR(10) - do ii = 1,next - space(index) = bultn(ii) - if (index .eq. 1280) then - WRITE(51,REC=IREC) space, LF - IREC=IREC + 1 - index = 0 - do kk = 1,1280 - space(kk) = ' ' - enddo - endif - index = index + 1 - enddo -C -C...TRAN. 
-C -C NEXT=0 - ENDBUL=.FALSE. -C -C...RETURN TO START NEW BULTN, OR CONTINUE LINE FOR WHICH THERE WAS -C... INSUFFICIENT SPACE IN BLOCK JUST WRITTEN -C - 6900 CONTINUE -C -C*********************************************************************** - 7000 CONTINUE -C...END TAU LOOP -C -C...FT51 IS TRANSMISSION FILE -C END FILE 51 -C REWIND 51 - if (index .gt. 0) then - WRITE(51,REC=IREC) space, LF - IREC=IREC+1 - endif - KRET = 0 - - CALL W3TAGE('FBWNDGFS') - STOP - END - - SUBROUTINE WXAI19(LINE, L, NBLK, N, NEXT) -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: WXAI19 LINE BLOCKER SUBROUTINE -C AUTHOR: ALLARD, R. ORG: W342 DATE: 01 FEB 74 -C -C ABSTRACT: FILLS A RECORD BLOCK WITH LOGICAL RECORDS OR LINES -C OF INFORMATION. -C -C PROGRAM HISTORY LOG: -C 74-02-01 BOB ALLARD -C 90-09-15 R.E.JONES CONVERT FROM IBM370 ASSEMBLER TO MICROSOFT -C FORTRAN 5.0 -C 90-10-07 R.E.JONES CONVERT TO SUN FORTRAN 1.3 -C 91-07-20 R.E.JONES CONVERT TO SiliconGraphics 3.3 FORTRAN 77 -C 93-03-29 R.E.JONES ADD SAVE STATEMENT -C 94-04-22 R.E.JONES ADD XMOVEX AND XSTORE TO MOVE AND -C STORE CHARACTER DATA FASTER ON THE CRAY -C 96-07-18 R.E.JONES CHANGE EBCDIC FILL TO ASCII FILL -C 96-11-18 R.E.JONES CHANGE NAME W3AI19 TO WXAI19 -C -C USAGE: CALL WXAI19 (LINE, L, NBLK, N, NEXT) -C INPUT ARGUMENT LIST: -C LINE - ARRAY ADDRESS OF LOGICAL RECORD TO BE BLOCKED -C L - NUMBER OF CHARACTERS IN LINE TO BE BLOCKED -C N - MAXIMUM CHARACTER SIZE OF NBLK -C NEXT - FLAG, INITIALIZED TO 0 -C -C OUTPUT ARGUMENT LIST: -C NBLK - BLOCK FILLED WITH LOGICAL RECORDS -C NEXT - CHARACTER COUNT, ERROR INDICATOR -C -C EXIT STATES: -C NEXT = -1 LINE WILL NOT FIT INTO REMAINDER OF BLOCK; -C OTHERWISE, NEXT IS SET TO (NEXT + L) -C NEXT = -2 N IS ZERO OR LESS -C NEXT = -3 L IS ZERO OR LESS -C -C EXTERNAL REFERENCES: XMOVEX XSTORE -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C -C$$$ -C -C METHOD: -C -C THE USER MUST SET NEXT = 0 EACH TIME NBLK IS TO BE FILLED WITH -C LOGICAL RECORDS. 
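The WXAI19 contract documented above (NEXT accumulates the byte count; -1 means the line will not fit and the caller must dispose of the block; -2 and -3 flag a non-positive N or L; the block is blank-filled on the first record) can be mirrored in a short Python sketch. This is a transliteration of the documented behavior, not an existing API; the block is modeled as a list of single characters:

```python
def wxai19(line, l, nblk, n, next_):
    """Pack a logical record of length l into fixed-size block nblk (n chars).

    Returns the updated NEXT count, or a negative flag as documented:
    -1 line will not fit, -2 N <= 0, -3 L <= 0.
    """
    if next_ < 0:
        return next_               # block already flagged; caller must reset
    if n <= 0:
        return -2
    if l <= 0:
        return -3
    if l + next_ > n:
        return -1                  # caller flushes the block, then retries
    if next_ == 0:
        nblk[:] = [' '] * n        # first record: blank-fill the whole block
    nblk[next_:next_ + l] = list(line[:l])
    return next_ + l
```

Note that, as the comments warn, the record that triggers -1 is not blocked; the caller must resubmit it after resetting NEXT to 0.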
-C -C WXAI19 WILL THEN MOVE THE LINE OF INFORMATION INTO NBLK, STORE -C BLANK CHARACTERS IN THE REMAINDER OF THE BLOCK, AND SET NEXT = NEXT -C + L. -C -C EACH TIME WXAI19 IS ENTERED, ONE LINE IS BLOCKED AND NEXT INCRE- -C MENTED UNTIL A LINE WILL NOT FIT THE REMAINDER OF THE BLOCK. THEN -C WXAI19 WILL SET NEXT = -1 AS A FLAG FOR THE USER TO DISPOSE OF THE -C BLOCK. THE USER SHOULD BE AWARE THAT THE LAST LOGICAL RECORD WAS NOT -C BLOCKED. -C - INTEGER L - INTEGER N - INTEGER NEXT - INTEGER WBLANK -C - CHARACTER * 1 LINE(*) - CHARACTER * 1 NBLK(*) - CHARACTER * 1 BLANK -C - SAVE -C - DATA WBLANK/Z'2020202020202020'/ -C DATA WBLANK/Z''/ -C -C TEST VALUE OF NEXT. -C - IF (NEXT.LT.0) THEN - RETURN -C -C TEST N FOR ZERO OR LESS -C - ELSE IF (N.LE.0) THEN - NEXT = -2 - RETURN -C -C TEST L FOR ZERO OR LESS -C - ELSE IF (L.LE.0) THEN - NEXT = -3 - RETURN -C -C TEST TO SEE IF LINE WILL FIT IN BLOCK. -C - ELSE IF ((L + NEXT).GT.N) THEN - NEXT = -1 - RETURN -C -C FILL BLOCK WITH BLANK CHARACTERS IF NEXT EQUAL ZERO. -C BLANK IS ASCII BLANK, 20 HEX, OR 32 DECIMAL -C - ELSE IF (NEXT.EQ.0) THEN - CALL W3FI01(LW) - IWORDS = N / LW - CALL XSTORE(NBLK,WBLANK,IWORDS) - IF (MOD(N,LW).NE.0) THEN - NWORDS = IWORDS * LW - IBYTES = N - NWORDS - DO I = 1,IBYTES - NBLK(NWORDS+I) = CHAR(32) - END DO - END IF - END IF -C -C MOVE LINE INTO BLOCK. -C - CALL XMOVEX(NBLK(NEXT+1),LINE,L) -C -C ADJUST VALUE OF NEXT. -C - NEXT = NEXT + L -C - RETURN -C - END diff --git a/sorc/fv3nc2nemsio.fd/0readme b/sorc/fv3nc2nemsio.fd/0readme deleted file mode 100644 index 7be2fbcd341..00000000000 --- a/sorc/fv3nc2nemsio.fd/0readme +++ /dev/null @@ -1,23 +0,0 @@ -The first version of this program was provided by Jeff Whitaker and Philip Pegion from ESRL. -Fanglin Yang has subsequently made a few revisions. - -10/20/2016, Fanglin Yang -Note that FV3 lat-lon grids are located at the center of each grid box, -start from south to north, and from east to west.
-For example, for a 0.5-deg uniform grid, -nlon=720, nlat=360 -X(1,1)=[0.25E,89.75S] -X(nlon,nlat)=[359.75E,89.75N] - -write out nemsio, S->N is reversed to N->S to follow NCEP convention - -12/18/2016 Fanglin Yang -updated to handle output of any frequency and any accumulation bucket - - -01/10/2017 Fanglin Yang -updated to handle both hydrostatic and nonhydrostatic cases. They have different output numbers and variable names. - -10/07/2017 Fanglin Yang -In FV3 tic26 branch which includes the lastest Write Component, hgtsfc has been defined as [m] instead of [gpm]. -The scaling by 1/grav in fv3nc2nemsio.fd needs to be removed. diff --git a/sorc/fv3nc2nemsio.fd/constants.f90 b/sorc/fv3nc2nemsio.fd/constants.f90 deleted file mode 100644 index c0a066eec0f..00000000000 --- a/sorc/fv3nc2nemsio.fd/constants.f90 +++ /dev/null @@ -1,314 +0,0 @@ -! this module was extracted from the GSI version operational -! at NCEP in Dec. 2007. -module constants -!$$$ module documentation block -! . . . . -! module: constants -! prgmmr: treadon org: np23 date: 2003-09-25 -! -! abstract: This module contains the definition of various constants -! used in the gsi code -! -! program history log: -! 2003-09-25 treadon - original code -! 2004-03-02 treadon - allow global and regional constants to differ -! 2004-06-16 treadon - update documentation -! 2004-10-28 treadon - replace parameter tiny=1.e-12 with tiny_r_kind -! and tiny_single -! 2004-11-16 treadon - add huge_single, huge_r_kind parameters -! 2005-01-27 cucurull - add ione -! 2005-08-24 derber - move cg_term to constants from qcmod -! 2006-03-07 treadon - add rd_over_cp_mass -! 2006-05-18 treadon - add huge_i_kind -! 2006-06-06 su - add var-qc wgtlim, change value to 0.25 (ECMWF) -! 2006-07-28 derber - add r1000 -! -! Subroutines Included: -! sub init_constants - compute derived constants, set regional/global constants -! -! Variable Definitions: -! see below -! -! attributes: -! language: f90 -! machine: ibm RS/6000 SP -! 
-!$$$ - use kinds, only: r_single,r_kind,i_kind - implicit none - -! Declare constants - integer(i_kind) izero,ione - real(r_kind) rearth,grav,omega,rd,rv,cp,cv,cvap,cliq - real(r_kind) csol,hvap,hfus,psat,t0c,ttp,jcal,cp_mass,cg_term - real(r_kind) fv,deg2rad,rad2deg,pi,tiny_r_kind,huge_r_kind,huge_i_kind - real(r_kind) ozcon,rozcon,tpwcon,rd_over_g,rd_over_cp,g_over_rd - real(r_kind) amsua_clw_d1,amsua_clw_d2,constoz,zero,one,two,four - real(r_kind) one_tenth,quarter,three,five,rd_over_cp_mass, gamma - real(r_kind) rearth_equator,stndrd_atmos_ps,r1000 - real(r_kind) semi_major_axis,semi_minor_axis,n_a,n_b - real(r_kind) eccentricity,grav_polar,grav_ratio - real(r_kind) grav_equator,earth_omega,grav_constant - real(r_kind) flattening,eccentricity_linear,somigliana - real(r_kind) dldt,dldti,hsub,psatk,tmix,xa,xai,xb,xbi - real(r_kind) eps,epsm1,omeps,wgtlim - real(r_kind) elocp,cpr,el2orc,cclimit,climit,epsq - real(r_kind) pcpeff0,pcpeff1,pcpeff2,pcpeff3,rcp,c0,delta - real(r_kind) h1000,factor1,factor2,rhcbot,rhctop,dx_max,dx_min,dx_inv - real(r_kind) h300,half,cmr,cws,ke2,row,rrow - real(r_single) zero_single,tiny_single,huge_single - real(r_single) rmw_mean_distance, roic_mean_distance - logical :: constants_initialized = .true. - - -! Define constants common to global and regional applications -! name value description units -! ---- ----- ----------- ----- - parameter(rearth_equator= 6.37813662e6_r_kind) ! equatorial earth radius (m) - parameter(omega = 7.2921e-5_r_kind) ! angular velocity of earth (1/s) - parameter(cp = 1.0046e+3_r_kind) ! specific heat of air @pressure (J/kg/K) - parameter(cvap = 1.8460e+3_r_kind) ! specific heat of h2o vapor (J/kg/K) - parameter(csol = 2.1060e+3_r_kind) ! specific heat of solid h2o (ice)(J/kg/K) - parameter(hvap = 2.5000e+6_r_kind) ! latent heat of h2o condensation (J/kg) - parameter(hfus = 3.3358e+5_r_kind) ! latent heat of h2o fusion (J/kg) - parameter(psat = 6.1078e+2_r_kind) ! 
pressure at h2o triple point (Pa) - parameter(t0c = 2.7315e+2_r_kind) ! temperature at zero celsius (K) - parameter(ttp = 2.7316e+2_r_kind) ! temperature at h2o triple point (K) - parameter(jcal = 4.1855e+0_r_kind) ! joules per calorie () - parameter(stndrd_atmos_ps = 1013.25e2_r_kind) ! 1976 US standard atmosphere ps (Pa) - -! Numeric constants - parameter(izero = 0) - parameter(ione = 1) - parameter(zero_single = 0.0_r_single) - parameter(zero = 0.0_r_kind) - parameter(one_tenth = 0.10_r_kind) - parameter(quarter= 0.25_r_kind) - parameter(one = 1.0_r_kind) - parameter(two = 2.0_r_kind) - parameter(three = 3.0_r_kind) - parameter(four = 4.0_r_kind) - parameter(five = 5.0_r_kind) - parameter(r1000 = 1000.0_r_kind) - -! Constants for gps refractivity - parameter(n_a=77.6_r_kind) !K/mb - parameter(n_b=3.73e+5_r_kind) !K^2/mb - -! Parameters below from WGS-84 model software inside GPS receivers. - parameter(semi_major_axis = 6378.1370e3_r_kind) ! (m) - parameter(semi_minor_axis = 6356.7523142e3_r_kind) ! (m) - parameter(grav_polar = 9.8321849378_r_kind) ! (m/s2) - parameter(grav_equator = 9.7803253359_r_kind) ! (m/s2) - parameter(earth_omega = 7.292115e-5_r_kind) ! (rad/s) - parameter(grav_constant = 3.986004418e14_r_kind) ! (m3/s2) - -! Derived geophysical constants - parameter(flattening = (semi_major_axis-semi_minor_axis)/semi_major_axis)!() - parameter(somigliana = & - (semi_minor_axis/semi_major_axis) * (grav_polar/grav_equator) - one)!() - parameter(grav_ratio = (earth_omega*earth_omega * & - semi_major_axis*semi_major_axis * semi_minor_axis) / grav_constant) !() - -! Derived thermodynamic constants - parameter ( dldti = cvap-csol ) - parameter ( hsub = hvap+hfus ) - parameter ( psatk = psat*0.001_r_kind ) - parameter ( tmix = ttp-20._r_kind ) - parameter ( elocp = hvap/cp ) - parameter ( rcp = one/cp ) - -! 
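The derived geophysical quantities in the parameter statements above follow directly from the primary WGS-84 constants. A quick Python check (values copied from the parameters above; the eccentricity expressions anticipate the ones computed later in init_constants_derived) confirms, for instance, that the derived flattening matches the familiar WGS-84 value of about 1/298.257:

```python
import math

# Primary WGS-84 constants, copied from the parameter statements above
semi_major_axis = 6378.1370e3      # m
semi_minor_axis = 6356.7523142e3   # m
grav_polar      = 9.8321849378     # m/s^2
grav_equator    = 9.7803253359     # m/s^2
earth_omega     = 7.292115e-5      # rad/s
grav_constant   = 3.986004418e14   # m^3/s^2

# Derived quantities, exactly as in the module
flattening = (semi_major_axis - semi_minor_axis) / semi_major_axis
somigliana = (semi_minor_axis / semi_major_axis) * (grav_polar / grav_equator) - 1.0
grav_ratio = (earth_omega**2 * semi_major_axis**2 * semi_minor_axis) / grav_constant

# Linear eccentricity, as computed later in init_constants_derived
eccentricity_linear = math.sqrt(semi_major_axis**2 - semi_minor_axis**2)
eccentricity = eccentricity_linear / semi_major_axis
```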
Constants used in GFS moist physics - parameter ( h300 = 300._r_kind ) - parameter ( half = 0.5_r_kind ) - parameter ( cclimit = 0.001_r_kind ) - parameter ( climit = 1.e-20_r_kind) - parameter ( epsq = 2.e-12_r_kind ) - parameter ( h1000 = 1000.0_r_kind) - parameter ( rhcbot=0.85_r_kind ) - parameter ( rhctop=0.85_r_kind ) - parameter ( dx_max=-8.8818363_r_kind ) - parameter ( dx_min=-5.2574954_r_kind ) - parameter ( dx_inv=one/(dx_max-dx_min) ) - parameter ( c0=0.002_r_kind ) - parameter ( delta=0.6077338_r_kind ) - parameter ( pcpeff0=1.591_r_kind ) - parameter ( pcpeff1=-0.639_r_kind ) - parameter ( pcpeff2=0.0953_r_kind ) - parameter ( pcpeff3=-0.00496_r_kind ) - parameter ( cmr = one/0.0003_r_kind ) - parameter ( cws = 0.025_r_kind ) - parameter ( ke2 = 0.00002_r_kind ) - parameter ( row = 1000._r_kind ) - parameter ( rrow = one/row ) - -! Constant used to process ozone - parameter ( constoz = 604229.0_r_kind) - -! Constants used in cloud liquid water correction for AMSU-A -! brightness temperatures - parameter ( amsua_clw_d1 = 0.754_r_kind ) - parameter ( amsua_clw_d2 = -2.265_r_kind ) - -! Constants used for variational qc - parameter ( wgtlim = 0.25_r_kind) ! Cutoff weight for concluding that obs has been - ! rejected by nonlinear qc. This limit is arbitrary - ! and DOES NOT affect nonlinear qc. It only affects - ! the printout which "counts" the number of obs that - ! "fail" nonlinear qc. Observations counted as failing - ! nonlinear qc are still assimilated. Their weight - ! relative to other observations is reduced. Changing - ! wgtlim does not alter the analysis, only - ! the nonlinear qc data "count" - -! Constants describing the Extended Best-Track Reanalysis [Demuth et -! al., 2008] tropical cyclone (TC) distance for regions relative to TC -! 
track position; units are in kilometers - - parameter (rmw_mean_distance = 64.5479412) - parameter (roic_mean_distance = 338.319656) - -contains - subroutine init_constants_derived -!$$$ subprogram documentation block -! . . . . -! subprogram: init_constants_derived set derived constants -! prgmmr: treadon org: np23 date: 2004-12-02 -! -! abstract: This routine sets derived constants -! -! program history log: -! 2004-12-02 treadon -! 2005-03-03 treadon - add implicit none -! -! input argument list: -! -! output argument list: -! -! attributes: -! language: f90 -! machine: ibm rs/6000 sp -! -!$$$ - implicit none - -! Trigonometric constants - pi = acos(-one) - deg2rad = pi/180.0_r_kind - rad2deg = one/deg2rad - cg_term = (sqrt(two*pi))/two ! constant for variational qc - tiny_r_kind = tiny(zero) - huge_r_kind = huge(zero) - tiny_single = tiny(zero_single) - huge_single = huge(zero_single) - huge_i_kind = huge(izero) - -! Geophysical parameters used in conversion of geopotential to -! geometric height - eccentricity_linear = sqrt(semi_major_axis**2 - semi_minor_axis**2) - eccentricity = eccentricity_linear / semi_major_axis - constants_initialized = .true. - - return - end subroutine init_constants_derived - - subroutine init_constants(regional) -!$$$ subprogram documentation block -! . . . . -! subprogram: init_constants set regional or global constants -! prgmmr: treadon org: np23 date: 2004-03-02 -! -! abstract: This routine sets constants specific to regional or global -! applications of the gsi -! -! program history log: -! 2004-03-02 treadon -! 2004-06-16 treadon, documentation -! 2004-10-28 treadon - use intrinsic TINY function to set value -! for smallest machine representable positive -! number -! 2004-12-03 treadon - move derived constants to init_constants_derived -! 2005-03-03 treadon - add implicit none -! -! input argument list: -! regional - if .true., set regional gsi constants; -! otherwise (.false.), use global constants -! -! 
output argument list: -! -! attributes: -! language: f90 -! machine: ibm rs/6000 sp -! -!$$$ - implicit none - logical regional - real(r_kind) reradius,g,r_d,r_v,cliq_wrf - - gamma = 0.0065 - -! Define regional constants here - if (regional) then - -! Name given to WRF constants - reradius = one/6370.e03_r_kind - g = 9.81_r_kind - r_d = 287.04_r_kind - r_v = 461.6_r_kind - cliq_wrf = 4190.0_r_kind - cp_mass = 1004.67_r_kind - -! Transfer WRF constants into unified GSI constants - rearth = one/reradius - grav = g - rd = r_d - rv = r_v - cv = cp-r_d - cliq = cliq_wrf - rd_over_cp_mass = rd / cp_mass - -! Define global constants here - else - rearth = 6.3712e+6_r_kind - grav = 9.80665e+0_r_kind - rd = 2.8705e+2_r_kind - rv = 4.6150e+2_r_kind - cv = 7.1760e+2_r_kind - cliq = 4.1855e+3_r_kind - cp_mass= zero - rd_over_cp_mass = zero - endif - - -! Now define derived constants which depend on constants -! which differ between global and regional applications. - -! Constants related to ozone assimilation - ozcon = grav*21.4e-9_r_kind - rozcon= one/ozcon - -! Constant used in vertical integral for precipitable water - tpwcon = 100.0_r_kind/grav - -! Derived atmospheric constants - fv = rv/rd-one ! 
used in virtual temperature equation - dldt = cvap-cliq - xa = -(dldt/rv) - xai = -(dldti/rv) - xb = xa+hvap/(rv*ttp) - xbi = xai+hsub/(rv*ttp) - eps = rd/rv - epsm1 = rd/rv-one - omeps = one-eps - factor1 = (cvap-cliq)/rv - factor2 = hvap/rv-factor1*t0c - cpr = cp*rd - el2orc = hvap*hvap/(rv*cp) - rd_over_g = rd/grav - rd_over_cp = rd/cp - g_over_rd = grav/rd - - return - end subroutine init_constants - -end module constants diff --git a/sorc/fv3nc2nemsio.fd/fv3_main.f90 b/sorc/fv3nc2nemsio.fd/fv3_main.f90 deleted file mode 100644 index 48c7440b14e..00000000000 --- a/sorc/fv3nc2nemsio.fd/fv3_main.f90 +++ /dev/null @@ -1,215 +0,0 @@ -program fv3_main - use fv3_module - use netcdf - use nemsio_module - implicit none - - type(nemsio_gfile) :: gfile - type(nemsio_meta) :: meta_nemsio - integer,parameter :: nvar2d=48 - character(nemsio_charkind) :: name2d(nvar2d) - integer :: nvar3d - character(nemsio_charkind), allocatable :: name3din(:), name3dout(:) - character(nemsio_charkind) :: varname,levtype - character(len=300) :: inpath,outpath - character(len=100) :: infile2d,infile3d,outfile - character(len=10) :: analdate, cfhour - character(len=5) :: cfhr,cfhzh - character(len=2) :: nhcase - real , allocatable :: lons(:),lats(:),tmp2d(:,:), tmp2dx(:,:) - real*8,allocatable :: tmp1d(:),tmp1dx(:),fhours(:) - real*4 :: fhour - integer :: fhzh, nhcas - - integer :: ii,i,j,k,ncid2d,ncid3d,ifhr,nlevs,nlons,nlats,ntimes,nargs,iargc,YYYY,MM,DD,HH,stat,varid - - data name2d /'ALBDOsfc','CPRATsfc','PRATEsfc','DLWRFsfc','ULWRFsfc','DSWRFsfc','USWRFsfc','DSWRFtoa','USWRFtoa',& - 'ULWRFtoa','GFLUXsfc','HGTsfc','HPBLsfc',& - 'ICECsfc','SLMSKsfc','LHTFLsfc','SHTFLsfc','PRESsfc','PWATclm','SOILM','SOILW1','SOILW2','SOILW3','SOILW4','SPFH2m',& - 'SOILT1','SOILT2','SOILT3','SOILT4','TMP2m','TMPsfc','UGWDsfc','VGWDsfc','UFLXsfc','VFLXsfc','UGRD10m','VGRD10m',& - 'WEASDsfc','SNODsfc','ZORLsfc','VFRACsfc','F10Msfc','VTYPEsfc','STYPEsfc',& - 'TCDCclm', 'TCDChcl', 'TCDCmcl', 'TCDClcl'/ - - 
!===================================================================== - - ! read in from command line - nargs=iargc() - IF (nargs .NE. 10) THEN - print*,'usage fv3_interface analdate ifhr fhzh fhour inpath infile2d infile3d outpath,outfile,nhcase' - STOP 1 - ENDIF - call getarg(1,analdate) - call getarg(2,cfhr) - call getarg(3,cfhzh) - call getarg(4,cfhour) - call getarg(5,inpath) - call getarg(6,infile2d) - call getarg(7,infile3d) - call getarg(8,outpath) - call getarg(9,outfile) - call getarg(10,nhcase) -! print*,analdate,cfhr,cfhzh,cfhour,inpath,infile2d,infile3d,outpath,outfile,nhcase - - read(nhcase,'(i2.1)') nhcas - read(cfhr,'(i5.1)') ifhr - read(cfhzh,'(i5.1)') fhzh - read(cfhour,*) fhour - read(analdate(1:4),'(i4)') YYYY - read(analdate(5:6),'(i2)') MM - read(analdate(7:8),'(i2)') DD - read(analdate(9:10),'(i2)') HH - print*,"ifhr,fhzh,fhour,analdate ",ifhr,fhzh,fhour,analdate - - if (nhcas == 0 ) then !non-hydrostatic case - nvar3d=9 - allocate (name3din(nvar3d), name3dout(nvar3d)) - name3din=(/'ucomp ','vcomp ','temp ','sphum ','o3mr ','nhpres','w ','clwmr ','delp '/) - name3dout=(/'ugrd ','vgrd ','tmp ','spfh ','o3mr ','pres ','vvel ','clwmr','dpres'/) - else - nvar3d=8 - allocate (name3din(nvar3d), name3dout(nvar3d)) - name3din=(/'ucomp ','vcomp ','temp ','sphum ','o3mr ','hypres','clwmr ','delp '/) - name3dout=(/'ugrd ','vgrd ','tmp ','spfh ','o3mr ','pres ','clwmr','dpres'/) - endif - - ! open netcdf files - print*,'reading',trim(inpath)//'/'//trim(infile2d) - stat = nf90_open(trim(inpath)//'/'//trim(infile2d),NF90_NOWRITE, ncid2d) - if (stat .NE.0) print*,stat - print*,'reading',trim(inpath)//'/'//trim(infile3d) - stat = nf90_open(trim(inpath)//'/'//trim(infile3d),NF90_NOWRITE, ncid3d) - if (stat .NE.0) print*,stat - ! get dimesions - - stat = nf90_inq_dimid(ncid2d,'time',varid) - if (stat .NE.0) print*,stat,varid - if (stat .NE. 0) STOP 1 - stat = nf90_inquire_dimension(ncid2d,varid,len=ntimes) - if (stat .NE.0) print*,stat,ntimes - if (stat .NE. 
0) STOP 1 - allocate(fhours(ntimes)) - stat = nf90_inq_varid(ncid2d,'time',varid) - if (stat .NE. 0) STOP 1 - stat = nf90_get_var(ncid2d,varid,fhours) - if (stat .NE.0) print*,stat,fhours - if (stat .NE. 0) STOP 1 - - stat = nf90_inq_dimid(ncid3d,'grid_xt',varid) - if (stat .NE.0) print*,stat,varid - if (stat .NE. 0) STOP 1 - stat = nf90_inquire_dimension(ncid3d,varid,len=nlons) - if (stat .NE.0) print*,stat,nlons - if (stat .NE. 0) STOP 1 - allocate(lons(nlons)) - allocate(tmp1d(nlons)) - stat = nf90_inq_varid(ncid3d,'grid_xt',varid) - if (stat .NE. 0) STOP 1 - stat = nf90_get_var(ncid3d,varid,tmp1d) - if (stat .NE.0) print*,stat - if (stat .NE. 0) STOP 1 - - lons=real(tmp1d,kind=4) - !print*,lons(1),lons(3072) - deallocate(tmp1d) - - stat = nf90_inq_dimid(ncid3d,'grid_yt',varid) - if (stat .NE.0) print*,stat - if (stat .NE. 0) STOP 1 - stat = nf90_inquire_dimension(ncid3d,varid,len=nlats) - if (stat .NE.0) print*,stat - if (stat .NE. 0) STOP 1 - allocate(lats(nlats)) - allocate(tmp1d(nlats)) - allocate(tmp1dx(nlats)) - stat = nf90_inq_varid(ncid3d,'grid_yt',varid) - stat = nf90_get_var(ncid3d,varid,tmp1dx,start=(/1/),count=(/nlats/)) - if (stat .NE.0) print*,stat - if (stat .NE. 0) STOP 1 - do j=1,nlats - tmp1d(j)=tmp1dx(nlats-j+1) - enddo - lats=real(tmp1d,kind=4) - print*,"lats_beg, lats_end",lats(1),lats(nlats) - deallocate(tmp1d, tmp1dx) - - stat = nf90_inq_dimid(ncid3d,'pfull',varid) - if (stat .NE.0) print*,stat - if (stat .NE. 0) STOP 1 - stat = nf90_inquire_dimension(ncid3d,varid,len=nlevs) - if (stat .NE.0) print*,stat - if (stat .NE. 0) STOP 1 - - call define_nemsio_meta(meta_nemsio,nlons,nlats,nlevs,nvar2d,nvar3d,lons,lats) - - allocate (tmp2d(nlons,nlats)) - allocate (tmp2dx(nlons,nlats)) - - meta_nemsio%idate(1)=YYYY - meta_nemsio%idate(2)=MM - meta_nemsio%idate(3)=DD - meta_nemsio%idate(4)=HH - - meta_nemsio%varrval(1)=float(fhzh) -! if (ifhr.EQ.0) then -! meta_nemsio%varrval(1)=0.0 -! else -! meta_nemsio%varrval(1)=(ifhr-1.0)*6.0 -! endif - - ! 
read in data - meta_nemsio%nfhour= fhours(ifhr) - meta_nemsio%fhour= fhours(ifhr) - print*,fhours(ifhr),ifhr,'calling netcdf read' -!--for ifhr=1, fhours=dt but fhour=00 if diag is determined by FHOUT - if (fhour .ne. fhours(ifhr) .and. ifhr.gt.1 )then - print*, 'requested ',fhour, ' not equal to fhours(ifhr) ', fhours(ifhr) - print*, 'abort ! ' - stop 1 - endif - - call nems_write_init(outpath,outfile,meta_nemsio,gfile) -! read in all of the 2d variables and write out - print*,'calling write',meta_nemsio%rlat_min,meta_nemsio%rlat_max - print*,'lats',minval(meta_nemsio%lat),maxval(meta_nemsio%lat) - print *,'loop over 2d variables' - DO i=1,nvar2d - print *,i,trim(name2d(i)) - call fv3_netcdf_read_2d(ncid2d,ifhr,meta_nemsio,name2d(i),tmp2dx) - do ii=1,nlons - do j=1,nlats - tmp2d(ii,j)=tmp2dx(ii,nlats-j+1) - enddo - enddo - call nems_write(gfile,meta_nemsio%recname(i),meta_nemsio%reclevtyp(i),meta_nemsio%reclev(i), & - nlons*nlats,tmp2d,stat) - ENDDO - levtype='mid layer' -! loop through 3d fields - print *,'loop over 3d variables' - DO i=1,nvar3d - print*,i,trim(name3din(i)) - DO k=1,nlevs -! print*,k - call fv3_netcdf_read_3d(ncid3d,ifhr,meta_nemsio,name3din(i),k,tmp2dx) - do ii=1,nlons - do j=1,nlats - tmp2d(ii,j)=tmp2dx(ii,nlats-j+1) - enddo - enddo - call nems_write(gfile,name3dout(i),levtype,nlevs-k+1,nlons*nlats,tmp2d(:,:),stat) - IF (stat .NE. 0) then - print*,'error writing ,named3dout(i)',stat - STOP 1 - ENDIF - ENDDO - ENDDO - - call nemsio_close(gfile,iret=stat) - stat = nf90_close(ncid2d) - stat = nf90_close(ncid3d) - - deallocate(tmp2dx,tmp2d) - deallocate(name3din,name3dout) - - stop -end program fv3_main diff --git a/sorc/fv3nc2nemsio.fd/fv3_module.f90 b/sorc/fv3nc2nemsio.fd/fv3_module.f90 deleted file mode 100644 index 8d161acfcfa..00000000000 --- a/sorc/fv3nc2nemsio.fd/fv3_module.f90 +++ /dev/null @@ -1,372 +0,0 @@ -module fv3_module - - - !======================================================================= - - ! 
Define associated modules and subroutines - - !----------------------------------------------------------------------- - use netcdf - use constants - use kinds - use nemsio_module - - type nemsio_meta - character(nemsio_charkind), dimension(:), allocatable :: recname - character(nemsio_charkind), dimension(:), allocatable :: reclevtyp - character(16), dimension(:), allocatable :: variname - character(16), dimension(:), allocatable :: varrname - character(16), dimension(:), allocatable :: varr8name - character(16), dimension(:), allocatable :: aryiname - character(16), dimension(:), allocatable :: aryr8name - character(nemsio_charkind8) :: gdatatype - character(nemsio_charkind8) :: modelname - real(nemsio_realkind) :: rlon_min - real(nemsio_realkind) :: rlon_max - real(nemsio_realkind) :: rlat_min - real(nemsio_realkind) :: rlat_max - real(nemsio_realkind), dimension(:), allocatable :: lon - real(nemsio_realkind), dimension(:), allocatable :: lat - real(nemsio_realkind), dimension(:), allocatable :: varrval - integer(nemsio_intkind), dimension(:,:), allocatable :: aryival - integer(nemsio_intkind), dimension(:), allocatable :: reclev - integer(nemsio_intkind), dimension(:), allocatable :: varival - integer(nemsio_intkind), dimension(:), allocatable :: aryilen - integer(nemsio_intkind), dimension(:), allocatable :: aryr8len - integer(nemsio_intkind) :: idate(7) - integer(nemsio_intkind) :: version - integer(nemsio_intkind) :: nreo_vc - integer(nemsio_intkind) :: nrec - integer(nemsio_intkind) :: nmeta - integer(nemsio_intkind) :: nmetavari - integer(nemsio_intkind) :: nmetaaryi - integer(nemsio_intkind) :: nmetavarr - integer(nemsio_intkind) :: nfhour - integer(nemsio_intkind) :: nfminute - integer(nemsio_intkind) :: nfsecondn - integer(nemsio_intkind) :: nfsecondd - integer(nemsio_intkind) :: dimx - integer(nemsio_intkind) :: dimy - integer(nemsio_intkind) :: dimz - integer(nemsio_intkind) :: nframe - integer(nemsio_intkind) :: nsoil - integer(nemsio_intkind) :: 
ntrac - integer(nemsio_intkind) :: ncldt - integer(nemsio_intkind) :: idvc - integer(nemsio_intkind) :: idsl - integer(nemsio_intkind) :: idvm - integer(nemsio_intkind) :: idrt - integer(nemsio_intkind) :: fhour - - end type nemsio_meta ! type nemsio_meta - contains -!----------------------------------------------------------------------- - subroutine fv3_netcdf_read_2d(ncid2d,ifhr,meta_nemsio,varname,data2d) - - implicit none - type(nemsio_meta) :: meta_nemsio - integer :: ncid2d - integer :: ifhr,varid,stat - real :: data2d(meta_nemsio%dimx,meta_nemsio%dimy) - character(nemsio_charkind) :: varname - - ! loop through 2d data - stat = nf90_inq_varid(ncid2d,trim(varname),varid) - !print*,stat,varid,trim(varname) - stat = nf90_get_var(ncid2d,varid,data2d,start=(/1,1,ifhr/),count=(/meta_nemsio%dimx,meta_nemsio%dimy,1/)) - IF (stat .NE. 0 ) THEN - print*,'error reading ',varname - STOP - ENDIF - -end subroutine fv3_netcdf_read_2d -!----------------------------------------------------------------------- - - subroutine fv3_netcdf_read_3d(ncid3d,ifhr,meta_nemsio,varname,k,data2d) - - implicit none - - type(nemsio_meta) :: meta_nemsio - integer :: ncid3d - integer :: k - integer :: ifhr,varid,stat - character(nemsio_charkind) :: varname - !real :: data3d(meta_nemsio%dimx,meta_nemsio%dimy,meta_nemsio%dimz) - real :: data2d(meta_nemsio%dimx,meta_nemsio%dimy) - - - stat = nf90_inq_varid(ncid3d,trim(varname),varid) - !print*,stat,varname,varid - !stat = nf90_get_var(ncid3d,varid,data3d,start=(/1,1,1,ifhr/),count=(/meta_nemsio%dimx,meta_nemsio%dimy,meta_nemsio%dimz,1/)) - stat = nf90_get_var(ncid3d,varid,data2d,start=(/1,1,k,ifhr/),count=(/meta_nemsio%dimx,meta_nemsio%dimy,1,1/)) - - IF (stat .NE. 
0 ) THEN - print*,'error reading ',varname - STOP - ENDIF - -end subroutine fv3_netcdf_read_3d -!----------------------------------------------------------------------- - - subroutine define_nemsio_meta(meta_nemsio,nlons,nlats,nlevs,nvar2d,nvar3d,lons,lats) - implicit none - type(nemsio_meta) :: meta_nemsio - integer :: nlons,nlats,nlevs,i,j,k,nvar2d,nvar3d - integer*8 :: ct - real :: lons(nlons),lats(nlats) -! local - - meta_nemsio%idate(1:6) = 0 - meta_nemsio%idate(7) = 1 - meta_nemsio%modelname = 'GFS' - meta_nemsio%version = 198410 - meta_nemsio%nrec = nvar2d + nlevs*nvar3d - meta_nemsio%nmeta = 8 - meta_nemsio%nmetavari = 3 - meta_nemsio%nmetavarr = 1 - meta_nemsio%nmetaaryi = 1 - meta_nemsio%dimx = nlons - meta_nemsio%dimy = nlats - meta_nemsio%dimz = nlevs - meta_nemsio%rlon_min = minval(lons) - meta_nemsio%rlon_max = maxval(lons) - meta_nemsio%rlat_min = minval(lats) - meta_nemsio%rlat_max = maxval(lats) - meta_nemsio%nsoil = 4 - meta_nemsio%nframe = 0 - meta_nemsio%nfminute = 0 - meta_nemsio%nfsecondn = 0 - meta_nemsio%nfsecondd = 1 - meta_nemsio%ntrac = 3 - meta_nemsio%idrt = 0 - meta_nemsio%ncldt = 3 - meta_nemsio%idvc = 2 - - - allocate(meta_nemsio%recname(meta_nemsio%nrec)) - allocate(meta_nemsio%reclevtyp(meta_nemsio%nrec)) - allocate(meta_nemsio%reclev(meta_nemsio%nrec)) - allocate(meta_nemsio%variname(meta_nemsio%nmetavari)) - allocate(meta_nemsio%varival(meta_nemsio%nmetavari)) - allocate(meta_nemsio%aryiname(meta_nemsio%nmetavari)) - allocate(meta_nemsio%aryilen(meta_nemsio%nmetavari)) - allocate(meta_nemsio%varrname(meta_nemsio%nmetavarr)) - allocate(meta_nemsio%varrval(meta_nemsio%nmetavarr)) - allocate(meta_nemsio%lon(nlons*nlats)) - allocate(meta_nemsio%lat(nlons*nlats)) - - meta_nemsio%varrname(1)='zhour' - meta_nemsio%variname(1)='cu_physics' - meta_nemsio%varival(1)=4 - meta_nemsio%variname(2)='mp_physics' - meta_nemsio%varival(2)=1000 - meta_nemsio%variname(3)='IVEGSRC' - meta_nemsio%varival(3)=2 - ct=1 - DO j=1,nlats - DO i=1,nlons - 
meta_nemsio%lon(ct) = lons(i) - meta_nemsio%lat(ct) = lats(j) - ct=ct+1 - ENDDO - ENDDO - - meta_nemsio%aryilen(1) = nlats/2 - meta_nemsio%aryiname(1) = 'lpl' - meta_nemsio%reclev(:)=1 - meta_nemsio%recname(1) = 'albdo_ave' - meta_nemsio%reclevtyp(1) = 'sfc' - meta_nemsio%recname(2) = 'cprat_ave' - meta_nemsio%reclevtyp(2) = 'sfc' - meta_nemsio%recname(3) = 'prate_ave' - meta_nemsio%reclevtyp(3) = 'sfc' - meta_nemsio%recname(4) = 'dlwrf_ave' - meta_nemsio%reclevtyp(4) = 'sfc' - meta_nemsio%recname(5) = 'ulwrf_ave' - meta_nemsio%reclevtyp(5) = 'sfc' - meta_nemsio%recname(6) = 'dswrf_ave' - meta_nemsio%reclevtyp(6) = 'sfc' - meta_nemsio%recname(7) = 'uswrf_ave' - meta_nemsio%reclevtyp(7) = 'sfc' - meta_nemsio%recname(8) = 'dswrf_ave' - meta_nemsio%reclevtyp(8) = 'nom. top' - meta_nemsio%recname(9) = 'uswrf_ave' - meta_nemsio%reclevtyp(9) = 'nom. top' - meta_nemsio%recname(10) = 'ulwrf_ave' - meta_nemsio%reclevtyp(10) = 'nom. top' - meta_nemsio%recname(11) = 'gflux_ave' - meta_nemsio%reclevtyp(11) = 'sfc' - meta_nemsio%recname(12) = 'hgt' - meta_nemsio%reclevtyp(12) = 'sfc' - meta_nemsio%recname(13) = 'hpbl' - meta_nemsio%reclevtyp(13) = 'sfc' - meta_nemsio%recname(14) = 'icec' - meta_nemsio%reclevtyp(14) = 'sfc' - meta_nemsio%recname(15) = 'land' - meta_nemsio%reclevtyp(15) = 'sfc' - meta_nemsio%recname(16) = 'lhtfl_ave' - meta_nemsio%reclevtyp(16) = 'sfc' - meta_nemsio%recname(17) = 'shtfl_ave' - meta_nemsio%reclevtyp(17) = 'sfc' - meta_nemsio%recname(18) = 'pres' - meta_nemsio%reclevtyp(18) = 'sfc' - meta_nemsio%recname(19) = 'pwat' - meta_nemsio%reclevtyp(19) = 'atmos col' - meta_nemsio%recname(20) = 'soilm' - meta_nemsio%reclevtyp(20) = '0-200 cm down' - meta_nemsio%recname(21) = 'soilw' - meta_nemsio%reclevtyp(21) = '0-10 cm down' - meta_nemsio%recname(22) = 'soilw' - meta_nemsio%reclevtyp(22) = '10-40 cm down' - meta_nemsio%recname(23) = 'soilw' - meta_nemsio%reclevtyp(23) = '40-100 cm down' - meta_nemsio%recname(24) = 'soilw' - meta_nemsio%reclevtyp(24) = 
'100-200 cm down' - meta_nemsio%recname(25) = 'spfh' - meta_nemsio%reclevtyp(25) = '2 m above gnd' - meta_nemsio%recname(26) = 'tmp' - meta_nemsio%reclevtyp(26) = '0-10 cm down' - meta_nemsio%recname(27) = 'tmp' - meta_nemsio%reclevtyp(27) = '10-40 cm down' - meta_nemsio%recname(28) = 'tmp' - meta_nemsio%reclevtyp(28) = '40-100 cm down' - meta_nemsio%recname(29) = 'tmp' - meta_nemsio%reclevtyp(29) = '100-200 cm down' - meta_nemsio%recname(30) = 'tmp' - meta_nemsio%reclevtyp(30) = '2 m above gnd' - meta_nemsio%recname(31) = 'tmp' - meta_nemsio%reclevtyp(31) = 'sfc' - meta_nemsio%recname(32) = 'ugwd' - meta_nemsio%reclevtyp(32) = 'sfc' - meta_nemsio%recname(33) = 'vgwd' - meta_nemsio%reclevtyp(33) = 'sfc' - meta_nemsio%recname(34) = 'uflx_ave' - meta_nemsio%reclevtyp(34) = 'sfc' - meta_nemsio%recname(35) = 'vflx_ave' - meta_nemsio%reclevtyp(35) = 'sfc' - meta_nemsio%recname(36) = 'ugrd' - meta_nemsio%reclevtyp(36) = '10 m above gnd' - meta_nemsio%recname(37) = 'vgrd' - meta_nemsio%reclevtyp(37) = '10 m above gnd' - meta_nemsio%recname(38) = 'weasd' - meta_nemsio%reclevtyp(38) = 'sfc' - meta_nemsio%recname(39) = 'snod' - meta_nemsio%reclevtyp(39) = 'sfc' - meta_nemsio%recname(40) = 'zorl' - meta_nemsio%reclevtyp(40) = 'sfc' - meta_nemsio%recname(41) = 'vfrac' - meta_nemsio%reclevtyp(41) = 'sfc' - meta_nemsio%recname(42) = 'f10m' - meta_nemsio%reclevtyp(42) = 'sfc' - meta_nemsio%recname(43) = 'vtype' - meta_nemsio%reclevtyp(43) = 'sfc' - meta_nemsio%recname(44) = 'stype' - meta_nemsio%reclevtyp(44) = 'sfc' - meta_nemsio%recname(45) = 'tcdc_ave' - meta_nemsio%reclevtyp(45) = 'atmos col' - meta_nemsio%recname(46) = 'tcdc_ave' - meta_nemsio%reclevtyp(46) = 'high cld lay' - meta_nemsio%recname(47) = 'tcdc_ave' - meta_nemsio%reclevtyp(47) = 'mid cld lay' - meta_nemsio%recname(48) = 'tcdc_ave' - meta_nemsio%reclevtyp(48) = 'low cld lay' -! 
loop through 3d variables - DO k = 1, nlevs - meta_nemsio%recname(k+nvar2d) = 'ugrd' - meta_nemsio%reclevtyp(k+nvar2d) = 'mid layer' - meta_nemsio%reclev(k+nvar2d) = k - meta_nemsio%recname(k+nvar2d+nlevs) = 'vgrd' - meta_nemsio%reclevtyp(k+nvar2d+nlevs) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs) = k - meta_nemsio%recname(k+nvar2d+nlevs*2) = 'tmp' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*2) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*2) = k - meta_nemsio%recname(k+nvar2d+nlevs*3) = 'spfh' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*3) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*3) = k - meta_nemsio%recname(k+nvar2d+nlevs*4) = 'o3mr' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*4) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*4) = k - meta_nemsio%recname(k+nvar2d+nlevs*5) = 'pres' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*5) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*5) = k - meta_nemsio%recname(k+nvar2d+nlevs*6) = 'clwmr' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*6) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*6) = k - meta_nemsio%recname(k+nvar2d+nlevs*7) = 'dpres' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*7) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*7) = k - if (nvar3d == 9) then - meta_nemsio%recname(k+nvar2d+nlevs*8) = 'vvel' - meta_nemsio%reclevtyp(k+nvar2d+nlevs*8) = 'mid layer' - meta_nemsio%reclev(k+nvar2d+nlevs*8) = k - endif - ENDDO - - end subroutine define_nemsio_meta - - subroutine nems_write_init(datapath,filename_base,meta_nemsio,gfile) - - - implicit none - - type(nemsio_meta) :: meta_nemsio - character(len=200) :: datapath - character(len=100) :: filename_base - character(len=400) :: filename - type(nemsio_gfile) :: gfile - integer :: nemsio_iret - integer :: i, j, k - - write(filename,500) trim(datapath)//'/'//trim(filename_base) -500 format(a,i3.3) - print*,trim(filename) - call nemsio_init(iret=nemsio_iret) - print*,'iret=',nemsio_iret - !gfile%gtype = 'NEMSIO' - meta_nemsio%gdatatype = 'bin4' - call 
nemsio_open(gfile,trim(filename),'write', & - & iret=nemsio_iret, & - & modelname=trim(meta_nemsio%modelname), & - & version=meta_nemsio%version,gdatatype=meta_nemsio%gdatatype, & - & dimx=meta_nemsio%dimx,dimy=meta_nemsio%dimy, & - & dimz=meta_nemsio%dimz,rlon_min=meta_nemsio%rlon_min, & - & rlon_max=meta_nemsio%rlon_max,rlat_min=meta_nemsio%rlat_min, & - & rlat_max=meta_nemsio%rlat_max, & - & lon=meta_nemsio%lon,lat=meta_nemsio%lat, & - & idate=meta_nemsio%idate,nrec=meta_nemsio%nrec, & - & nframe=meta_nemsio%nframe,idrt=meta_nemsio%idrt,ncldt= & - & meta_nemsio%ncldt,idvc=meta_nemsio%idvc, & - & nfhour=meta_nemsio%nfhour,nfminute=meta_nemsio%nfminute, & - & nfsecondn=meta_nemsio%nfsecondn,nmeta=meta_nemsio%nmeta, & - & nfsecondd=meta_nemsio%nfsecondd,extrameta=.true., & - & nmetaaryi=meta_nemsio%nmetaaryi,recname=meta_nemsio%recname, & - & nmetavari=meta_nemsio%nmetavari,variname=meta_nemsio%variname, & - & varival=meta_nemsio%varival,varrval=meta_nemsio%varrval, & - & nmetavarr=meta_nemsio%nmetavarr,varrname=meta_nemsio%varrname, & - & reclevtyp=meta_nemsio%reclevtyp, & - & reclev=meta_nemsio%reclev,aryiname=meta_nemsio%aryiname, & - & aryilen=meta_nemsio%aryilen) - print*,'iret=',nemsio_iret - end subroutine nems_write_init - - -!------------------------------------------------------ - subroutine nems_write(gfile,recname,reclevtyp,level,dimx,data2d,iret) - - implicit none - type(nemsio_gfile) :: gfile - integer :: iret,level,dimx - real :: data2d(dimx) - character(nemsio_charkind) :: recname, reclevtyp - - call nemsio_writerecv(gfile,recname,levtyp=reclevtyp,lev=level,data=data2d,iret=iret) - if (iret.NE.0) then - print*,'error writing',recname,level,iret - STOP - ENDIF - - end subroutine nems_write - - -end module fv3_module diff --git a/sorc/fv3nc2nemsio.fd/kinds.f90 b/sorc/fv3nc2nemsio.fd/kinds.f90 deleted file mode 100644 index b3378bfccf6..00000000000 --- a/sorc/fv3nc2nemsio.fd/kinds.f90 +++ /dev/null @@ -1,107 +0,0 @@ -! 
this module was extracted from the GSI version operational -! at NCEP in Dec. 2007. -module kinds -!$$$ module documentation block -! . . . . -! module: kinds -! prgmmr: treadon org: np23 date: 2004-08-15 -! -! abstract: Module to hold specification kinds for variable declaration. -! This module is based on (copied from) Paul vanDelst's -! type_kinds module found in the community radiative transfer -! model -! -! module history log: -! 2004-08-15 treadon -! -! Subroutines Included: -! -! Functions Included: -! -! remarks: -! The numerical data types defined in this module are: -! i_byte - specification kind for byte (1-byte) integer variable -! i_short - specification kind for short (2-byte) integer variable -! i_long - specification kind for long (4-byte) integer variable -! i_llong - specification kind for double long (8-byte) integer variable -! r_single - specification kind for single precision (4-byte) real variable -! r_double - specification kind for double precision (8-byte) real variable -! r_quad - specification kind for quad precision (16-byte) real variable -! -! i_kind - generic specification kind for default integer -! r_kind - generic specification kind for default floating point -! -! -! attributes: -! language: f90 -! machine: ibm RS/6000 SP -! -!$$$ end documentation block - implicit none - private - -! Integer type definitions below - -! Integer types - integer, parameter, public :: i_byte = selected_int_kind(1) ! byte integer - integer, parameter, public :: i_short = selected_int_kind(4) ! short integer - integer, parameter, public :: i_long = selected_int_kind(8) ! long integer - integer, parameter, private :: llong_t = selected_int_kind(16) ! llong integer - integer, parameter, public :: i_llong = max( llong_t, i_long ) - -! 
Expected 8-bit byte sizes of the integer kinds - integer, parameter, public :: num_bytes_for_i_byte = 1 - integer, parameter, public :: num_bytes_for_i_short = 2 - integer, parameter, public :: num_bytes_for_i_long = 4 - integer, parameter, public :: num_bytes_for_i_llong = 8 - -! Define arrays for default definition - integer, parameter, private :: num_i_kinds = 4 - integer, parameter, dimension( num_i_kinds ), private :: integer_types = (/ & - i_byte, i_short, i_long, i_llong /) - integer, parameter, dimension( num_i_kinds ), private :: integer_byte_sizes = (/ & - num_bytes_for_i_byte, num_bytes_for_i_short, & - num_bytes_for_i_long, num_bytes_for_i_llong /) - -! Default values -! **** CHANGE THE FOLLOWING TO CHANGE THE DEFAULT INTEGER TYPE KIND *** - integer, parameter, private :: default_integer = 3 ! 1=byte, - ! 2=short, - ! 3=long, - ! 4=llong - integer, parameter, public :: i_kind = integer_types( default_integer ) - integer, parameter, public :: num_bytes_for_i_kind = & - integer_byte_sizes( default_integer ) - - -! Real definitions below - -! Real types - integer, parameter, public :: r_single = selected_real_kind(6) ! single precision - integer, parameter, public :: r_double = selected_real_kind(15) ! double precision - integer, parameter, private :: quad_t = selected_real_kind(20) ! quad precision - integer, parameter, public :: r_quad = max( quad_t, r_double ) - -! Expected 8-bit byte sizes of the real kinds - integer, parameter, public :: num_bytes_for_r_single = 4 - integer, parameter, public :: num_bytes_for_r_double = 8 - integer, parameter, public :: num_bytes_for_r_quad = 16 - -! Define arrays for default definition - integer, parameter, private :: num_r_kinds = 3 - integer, parameter, dimension( num_r_kinds ), private :: real_kinds = (/ & - r_single, r_double, r_quad /) - integer, parameter, dimension( num_r_kinds ), private :: real_byte_sizes = (/ & - num_bytes_for_r_single, num_bytes_for_r_double, & - num_bytes_for_r_quad /) - -! 
Default values -! **** CHANGE THE FOLLOWING TO CHANGE THE DEFAULT REAL TYPE KIND *** - integer, parameter, private :: default_real = 1 ! 1=single, - ! 2=double, - ! 3=quad - integer, parameter, public :: r_kind = real_kinds( default_real ) - integer, parameter, public :: num_bytes_for_r_kind = & - real_byte_sizes( default_real ) - -end module kinds diff --git a/sorc/gaussian_sfcanl.fd/.gitignore b/sorc/gaussian_sfcanl.fd/.gitignore deleted file mode 100644 index 0a4391755c3..00000000000 --- a/sorc/gaussian_sfcanl.fd/.gitignore +++ /dev/null @@ -1,3 +0,0 @@ -*.o -*.mod -*.exe diff --git a/sorc/gaussian_sfcanl.fd/gaussian_sfcanl.f90 b/sorc/gaussian_sfcanl.fd/gaussian_sfcanl.f90 deleted file mode 100644 index acce575cd70..00000000000 --- a/sorc/gaussian_sfcanl.fd/gaussian_sfcanl.f90 +++ /dev/null @@ -1,2093 +0,0 @@ -!------------------------------------------------------------------ -! -! Read in surface and nst data on the cubed-sphere grid, -! interpolate it to the gaussian grid, and output the result -! to a nemsio or netcdf file. To not process nst data, -! set flag 'donst' to 'no'. To process nst, set to 'yes'. -! To output gaussian file in netcdf, set netcdf_out=.true. -! Otherwise, nemsio format will be output. -! -! Input files: -! ------------ -! weights.nc Interpolation weights. netcdf format -! anal.tile[1-6].nc fv3 surface restart files -! orog.tile[1-6].nc fv3 orography files -! fort.41 namelist Configuration namelist -! vcoord.txt Vertical coordinate definition file -! (ascii) -! -! Output files: -! ------------- -! sfc.gaussian.analysis.file surface data on gaussian grid - -! nemsio or netcdf. -! -! Namelist variables: -! ------------------- -! yy/mm/dd/hh year/month/day/hour of data. -! i/jgaus i/j dimension of gaussian grid. -! donst When 'no' do not process nst data. -! When 'yes' process nst data. -! netcdf_out When 'true', output gaussian file in -! netcdf. Otherwise output nemsio format. -! -! 2018-Jan-30 Gayno Initial version -! 
2019-Oct-30 Gayno Option to output gaussian analysis file -! in netcdf. -! -!------------------------------------------------------------------ - - module io - - use nemsio_module - - implicit none - - character(len=3) :: donst - - integer, parameter :: num_tiles = 6 - - integer :: itile, jtile, igaus, jgaus - - integer(nemsio_intkind) :: idate(8) - - type :: sfc_data -! surface variables - real, allocatable :: alvsf(:) - real, allocatable :: alvwf(:) - real, allocatable :: alnsf(:) - real, allocatable :: alnwf(:) - real, allocatable :: canopy(:) - real, allocatable :: facsf(:) - real, allocatable :: facwf(:) - real, allocatable :: ffhh(:) - real, allocatable :: ffmm(:) - real, allocatable :: fice(:) - real, allocatable :: f10m(:) - real, allocatable :: hice(:) - real, allocatable :: q2m(:) - real, allocatable :: orog(:) - real, allocatable :: sheleg(:) - real, allocatable :: slmask(:) - real, allocatable :: shdmax(:) - real, allocatable :: shdmin(:) - real, allocatable :: slope(:) - real, allocatable :: srflag(:) - real, allocatable :: snoalb(:) - real, allocatable :: snwdph(:) - real, allocatable :: stype(:) - real, allocatable :: t2m(:) - real, allocatable :: tprcp(:) - real, allocatable :: tisfc(:) - real, allocatable :: tsea(:) - real, allocatable :: tg3(:) - real, allocatable :: uustar(:) - real, allocatable :: vfrac(:) - real, allocatable :: vtype(:) - real, allocatable :: zorl(:) - real, allocatable :: slc(:,:) - real, allocatable :: smc(:,:) - real, allocatable :: stc(:,:) -! 
nst variables - real, allocatable :: c0(:) - real, allocatable :: cd(:) - real, allocatable :: dconv(:) - real, allocatable :: dtcool(:) - real, allocatable :: land(:) - real, allocatable :: qrain(:) - real, allocatable :: tref(:) - real, allocatable :: w0(:) - real, allocatable :: wd(:) - real, allocatable :: xs(:) - real, allocatable :: xt(:) - real, allocatable :: xtts(:) - real, allocatable :: xu(:) - real, allocatable :: xv(:) - real, allocatable :: xz(:) - real, allocatable :: xzts(:) - real, allocatable :: zc(:) - end type sfc_data - - type(sfc_data) :: tile_data, gaussian_data - - end module io - -!------------------------------------------------------------------------------ -! Main program -!------------------------------------------------------------------------------ - - program main - - use netcdf - use io - - implicit none - - character(len=12) :: weightfile - - integer :: i, error, ncid, id_ns, n_s - integer :: id_col, id_row, id_s, n - integer :: yy, mm, dd, hh - integer, allocatable :: col(:), row(:) - - logical :: netcdf_out - - real(kind=8), allocatable :: s(:) - - namelist /setup/ yy, mm, dd, hh, igaus, jgaus, donst, netcdf_out - - call w3tagb('GAUSSIAN_SFCANL',2018,0179,0055,'NP20') - - print*,"- BEGIN EXECUTION" - - netcdf_out = .true. - - donst = 'no' - - print* - print*,"- READ SETUP NAMELIST" - open(41, file="./fort.41") - read(41, nml=setup, iostat=error) - if (error /= 0) then - print*,"** FATAL ERROR READING NAMELIST. ISTAT IS: ", error - call errexit(56) - endif - close (41) - - idate = 0 - idate(1) = yy - idate(2) = mm - idate(3) = dd - idate(4) = hh - -!------------------------------------------------------------------------------ -! Read interpolation weight file. 
-!------------------------------------------------------------------------------ - - print* - print*,"- READ INTERPOLATION WEIGHT FILE" - - weightfile = "./weights.nc" - - error=nf90_open(trim(weightfile),nf90_nowrite,ncid) - call netcdf_err(error, 'OPENING weights.nc' ) - - error=nf90_inq_dimid(ncid, 'n_s', id_ns) - call netcdf_err(error, 'READING n_s id' ) - error=nf90_inquire_dimension(ncid,id_ns,len=n_s) - call netcdf_err(error, 'READING n_s' ) - - allocate(col(n_s)) - error=nf90_inq_varid(ncid, 'col', id_col) - call netcdf_err(error, 'READING col id' ) - error=nf90_get_var(ncid, id_col, col) - call netcdf_err(error, 'READING col' ) - - allocate(row(n_s)) - error=nf90_inq_varid(ncid, 'row', id_row) - call netcdf_err(error, 'READING row id' ) - error=nf90_get_var(ncid, id_row, row) - call netcdf_err(error, 'READING row' ) - - allocate(s(n_s)) - error=nf90_inq_varid(ncid, 'S', id_s) - call netcdf_err(error, 'READING s id' ) - error=nf90_get_var(ncid, id_s, s) - call netcdf_err(error, 'READING s' ) - - error = nf90_close(ncid) - -!------------------------------------------------------------------------------ -! Read the tiled analysis data. -!------------------------------------------------------------------------------ - - call read_data_anl - -!------------------------------------------------------------------------------ -! Interpolate tiled data to gaussian grid. -!------------------------------------------------------------------------------ - - allocate(gaussian_data%orog(igaus*jgaus)) ! 
sfc - allocate(gaussian_data%t2m(igaus*jgaus)) - allocate(gaussian_data%tisfc(igaus*jgaus)) - allocate(gaussian_data%q2m(igaus*jgaus)) - allocate(gaussian_data%stype(igaus*jgaus)) - allocate(gaussian_data%snwdph(igaus*jgaus)) - allocate(gaussian_data%slope(igaus*jgaus)) - allocate(gaussian_data%shdmax(igaus*jgaus)) - allocate(gaussian_data%shdmin(igaus*jgaus)) - allocate(gaussian_data%snoalb(igaus*jgaus)) - allocate(gaussian_data%slmask(igaus*jgaus)) - allocate(gaussian_data%tg3(igaus*jgaus)) - allocate(gaussian_data%alvsf(igaus*jgaus)) - allocate(gaussian_data%alvwf(igaus*jgaus)) - allocate(gaussian_data%alnsf(igaus*jgaus)) - allocate(gaussian_data%alnwf(igaus*jgaus)) - allocate(gaussian_data%facsf(igaus*jgaus)) - allocate(gaussian_data%facwf(igaus*jgaus)) - allocate(gaussian_data%ffhh(igaus*jgaus)) - allocate(gaussian_data%ffmm(igaus*jgaus)) - allocate(gaussian_data%sheleg(igaus*jgaus)) - allocate(gaussian_data%canopy(igaus*jgaus)) - allocate(gaussian_data%vfrac(igaus*jgaus)) - allocate(gaussian_data%vtype(igaus*jgaus)) - allocate(gaussian_data%zorl(igaus*jgaus)) - allocate(gaussian_data%tsea(igaus*jgaus)) - allocate(gaussian_data%f10m(igaus*jgaus)) - allocate(gaussian_data%tprcp(igaus*jgaus)) - allocate(gaussian_data%uustar(igaus*jgaus)) - allocate(gaussian_data%fice(igaus*jgaus)) - allocate(gaussian_data%hice(igaus*jgaus)) - allocate(gaussian_data%srflag(igaus*jgaus)) - allocate(gaussian_data%slc(igaus*jgaus,4)) - allocate(gaussian_data%smc(igaus*jgaus,4)) - allocate(gaussian_data%stc(igaus*jgaus,4)) - - if (trim(donst) == "yes" .or. trim(donst) == "YES") then - allocate(gaussian_data%c0(igaus*jgaus)) ! 
nst - allocate(gaussian_data%cd(igaus*jgaus)) - allocate(gaussian_data%dconv(igaus*jgaus)) - allocate(gaussian_data%dtcool(igaus*jgaus)) - allocate(gaussian_data%land(igaus*jgaus)) - allocate(gaussian_data%qrain(igaus*jgaus)) - allocate(gaussian_data%tref(igaus*jgaus)) - allocate(gaussian_data%w0(igaus*jgaus)) - allocate(gaussian_data%wd(igaus*jgaus)) - allocate(gaussian_data%xs(igaus*jgaus)) - allocate(gaussian_data%xt(igaus*jgaus)) - allocate(gaussian_data%xtts(igaus*jgaus)) - allocate(gaussian_data%xu(igaus*jgaus)) - allocate(gaussian_data%xv(igaus*jgaus)) - allocate(gaussian_data%xz(igaus*jgaus)) - allocate(gaussian_data%xzts(igaus*jgaus)) - allocate(gaussian_data%zc(igaus*jgaus)) - endif - - do i = 1, n_s - gaussian_data%orog(row(i)) = gaussian_data%orog(row(i)) + s(i)*tile_data%orog(col(i)) - gaussian_data%t2m(row(i)) = gaussian_data%t2m(row(i)) + s(i)*tile_data%t2m(col(i)) - gaussian_data%tisfc(row(i)) = gaussian_data%tisfc(row(i)) + s(i)*tile_data%tisfc(col(i)) - gaussian_data%q2m(row(i)) = gaussian_data%q2m(row(i)) + s(i)*tile_data%q2m(col(i)) - gaussian_data%stype(row(i)) = gaussian_data%stype(row(i)) + s(i)*tile_data%stype(col(i)) - gaussian_data%snwdph(row(i)) = gaussian_data%snwdph(row(i)) + s(i)*tile_data%snwdph(col(i)) - gaussian_data%slope(row(i)) = gaussian_data%slope(row(i)) + s(i)*tile_data%slope(col(i)) - gaussian_data%shdmax(row(i)) = gaussian_data%shdmax(row(i)) + s(i)*tile_data%shdmax(col(i)) - gaussian_data%shdmin(row(i)) = gaussian_data%shdmin(row(i)) + s(i)*tile_data%shdmin(col(i)) - gaussian_data%slmask(row(i)) = gaussian_data%slmask(row(i)) + s(i)*tile_data%slmask(col(i)) - gaussian_data%tg3(row(i)) = gaussian_data%tg3(row(i)) + s(i)*tile_data%tg3(col(i)) - gaussian_data%alvsf(row(i)) = gaussian_data%alvsf(row(i)) + s(i)*tile_data%alvsf(col(i)) - gaussian_data%alvwf(row(i)) = gaussian_data%alvwf(row(i)) + s(i)*tile_data%alvwf(col(i)) - gaussian_data%alnsf(row(i)) = gaussian_data%alnsf(row(i)) + s(i)*tile_data%alnsf(col(i)) - 
gaussian_data%alnwf(row(i)) = gaussian_data%alnwf(row(i)) + s(i)*tile_data%alnwf(col(i)) - gaussian_data%sheleg(row(i)) = gaussian_data%sheleg(row(i)) + s(i)*tile_data%sheleg(col(i)) - gaussian_data%canopy(row(i)) = gaussian_data%canopy(row(i)) + s(i)*tile_data%canopy(col(i)) - gaussian_data%vfrac(row(i)) = gaussian_data%vfrac(row(i)) + s(i)*tile_data%vfrac(col(i)) - gaussian_data%zorl(row(i)) = gaussian_data%zorl(row(i)) + s(i)*tile_data%zorl(col(i)) - gaussian_data%tsea(row(i)) = gaussian_data%tsea(row(i)) + s(i)*tile_data%tsea(col(i)) - gaussian_data%f10m(row(i)) = gaussian_data%f10m(row(i)) + s(i)*tile_data%f10m(col(i)) - gaussian_data%vtype(row(i)) = gaussian_data%vtype(row(i)) + s(i)*tile_data%vtype(col(i)) - gaussian_data%tprcp(row(i)) = gaussian_data%tprcp(row(i)) + s(i)*tile_data%tprcp(col(i)) - gaussian_data%facsf(row(i)) = gaussian_data%facsf(row(i)) + s(i)*tile_data%facsf(col(i)) - gaussian_data%facwf(row(i)) = gaussian_data%facwf(row(i)) + s(i)*tile_data%facwf(col(i)) - gaussian_data%ffhh(row(i)) = gaussian_data%ffhh(row(i)) + s(i)*tile_data%ffhh(col(i)) - gaussian_data%ffmm(row(i)) = gaussian_data%ffmm(row(i)) + s(i)*tile_data%ffmm(col(i)) - gaussian_data%uustar(row(i)) = gaussian_data%uustar(row(i)) + s(i)*tile_data%uustar(col(i)) - gaussian_data%fice(row(i)) = gaussian_data%fice(row(i)) + s(i)*tile_data%fice(col(i)) - gaussian_data%hice(row(i)) = gaussian_data%hice(row(i)) + s(i)*tile_data%hice(col(i)) - gaussian_data%snoalb(row(i)) = gaussian_data%snoalb(row(i)) + s(i)*tile_data%snoalb(col(i)) - gaussian_data%srflag(row(i)) = gaussian_data%srflag(row(i)) + s(i)*tile_data%srflag(col(i)) - if (trim(donst) == "yes" .or. 
trim(donst) == "YES") then - gaussian_data%c0(row(i)) = gaussian_data%c0(row(i)) + s(i)*tile_data%c0(col(i)) - gaussian_data%cd(row(i)) = gaussian_data%cd(row(i)) + s(i)*tile_data%cd(col(i)) - gaussian_data%dconv(row(i)) = gaussian_data%dconv(row(i)) + s(i)*tile_data%dconv(col(i)) - gaussian_data%dtcool(row(i)) = gaussian_data%dtcool(row(i)) + s(i)*tile_data%dtcool(col(i)) - gaussian_data%qrain(row(i)) = gaussian_data%qrain(row(i)) + s(i)*tile_data%qrain(col(i)) - gaussian_data%tref(row(i)) = gaussian_data%tref(row(i)) + s(i)*tile_data%tref(col(i)) - gaussian_data%w0(row(i)) = gaussian_data%w0(row(i)) + s(i)*tile_data%w0(col(i)) - gaussian_data%wd(row(i)) = gaussian_data%wd(row(i)) + s(i)*tile_data%wd(col(i)) - gaussian_data%xs(row(i)) = gaussian_data%xs(row(i)) + s(i)*tile_data%xs(col(i)) - gaussian_data%xt(row(i)) = gaussian_data%xt(row(i)) + s(i)*tile_data%xt(col(i)) - gaussian_data%xtts(row(i)) = gaussian_data%xtts(row(i)) + s(i)*tile_data%xtts(col(i)) - gaussian_data%xu(row(i)) = gaussian_data%xu(row(i)) + s(i)*tile_data%xu(col(i)) - gaussian_data%xv(row(i)) = gaussian_data%xv(row(i)) + s(i)*tile_data%xv(col(i)) - gaussian_data%xz(row(i)) = gaussian_data%xz(row(i)) + s(i)*tile_data%xz(col(i)) - gaussian_data%xzts(row(i)) = gaussian_data%xzts(row(i)) + s(i)*tile_data%xzts(col(i)) - gaussian_data%zc(row(i)) = gaussian_data%zc(row(i)) + s(i)*tile_data%zc(col(i)) - endif - do n = 1, 4 - gaussian_data%slc(row(i),n) = gaussian_data%slc(row(i),n) + s(i)*tile_data%slc(col(i),n) - gaussian_data%smc(row(i),n) = gaussian_data%smc(row(i),n) + s(i)*tile_data%smc(col(i),n) - gaussian_data%stc(row(i),n) = gaussian_data%stc(row(i),n) + s(i)*tile_data%stc(col(i),n) - enddo - enddo - - deallocate(col, row, s) - - deallocate(tile_data%orog) - deallocate(tile_data%t2m) - deallocate(tile_data%tisfc) - deallocate(tile_data%q2m) - deallocate(tile_data%stype) - deallocate(tile_data%snwdph) - deallocate(tile_data%slope) - deallocate(tile_data%shdmax) - deallocate(tile_data%shdmin) - 
deallocate(tile_data%snoalb) - deallocate(tile_data%slmask) - deallocate(tile_data%tg3) - deallocate(tile_data%alvsf) - deallocate(tile_data%alvwf) - deallocate(tile_data%alnsf) - deallocate(tile_data%alnwf) - deallocate(tile_data%facsf) - deallocate(tile_data%facwf) - deallocate(tile_data%ffhh) - deallocate(tile_data%ffmm) - deallocate(tile_data%sheleg) - deallocate(tile_data%canopy) - deallocate(tile_data%vfrac) - deallocate(tile_data%vtype) - deallocate(tile_data%zorl) - deallocate(tile_data%tsea) - deallocate(tile_data%f10m) - deallocate(tile_data%tprcp) - deallocate(tile_data%uustar) - deallocate(tile_data%fice) - deallocate(tile_data%hice) - deallocate(tile_data%srflag) - deallocate(tile_data%slc) - deallocate(tile_data%smc) - deallocate(tile_data%stc) - - if (trim(donst) == "yes" .or. trim(donst) == "YES") then - deallocate(tile_data%c0) - deallocate(tile_data%cd) - deallocate(tile_data%dconv) - deallocate(tile_data%dtcool) - deallocate(tile_data%qrain) - deallocate(tile_data%tref) - deallocate(tile_data%w0) - deallocate(tile_data%wd) - deallocate(tile_data%xs) - deallocate(tile_data%xt) - deallocate(tile_data%xtts) - deallocate(tile_data%xu) - deallocate(tile_data%xv) - deallocate(tile_data%xz) - deallocate(tile_data%xzts) - deallocate(tile_data%zc) - endif - -!------------------------------------------------------------------------------ -! Write gaussian data to either netcdf or nemsio file. 
-!------------------------------------------------------------------------------
-
- if (netcdf_out) then
-   call write_sfc_data_netcdf
- else
-   call write_sfc_data_nemsio
- endif
-
- deallocate(gaussian_data%orog)
- deallocate(gaussian_data%t2m)
- deallocate(gaussian_data%tisfc)
- deallocate(gaussian_data%q2m)
- deallocate(gaussian_data%stype)
- deallocate(gaussian_data%snwdph)
- deallocate(gaussian_data%slope)
- deallocate(gaussian_data%shdmax)
- deallocate(gaussian_data%shdmin)
- deallocate(gaussian_data%snoalb)
- deallocate(gaussian_data%slmask)
- deallocate(gaussian_data%tg3)
- deallocate(gaussian_data%alvsf)
- deallocate(gaussian_data%alvwf)
- deallocate(gaussian_data%alnsf)
- deallocate(gaussian_data%alnwf)
- deallocate(gaussian_data%facsf)
- deallocate(gaussian_data%facwf)
- deallocate(gaussian_data%ffhh)
- deallocate(gaussian_data%ffmm)
- deallocate(gaussian_data%sheleg)
- deallocate(gaussian_data%canopy)
- deallocate(gaussian_data%vfrac)
- deallocate(gaussian_data%vtype)
- deallocate(gaussian_data%zorl)
- deallocate(gaussian_data%tsea)
- deallocate(gaussian_data%f10m)
- deallocate(gaussian_data%tprcp)
- deallocate(gaussian_data%uustar)
- deallocate(gaussian_data%fice)
- deallocate(gaussian_data%hice)
- deallocate(gaussian_data%srflag)
- deallocate(gaussian_data%slc)
- deallocate(gaussian_data%smc)
- deallocate(gaussian_data%stc)
-
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
-   deallocate(gaussian_data%c0)
-   deallocate(gaussian_data%cd)
-   deallocate(gaussian_data%dconv)
-   deallocate(gaussian_data%dtcool)
-   deallocate(gaussian_data%land)
-   deallocate(gaussian_data%qrain)
-   deallocate(gaussian_data%tref)
-   deallocate(gaussian_data%w0)
-   deallocate(gaussian_data%wd)
-   deallocate(gaussian_data%xs)
-   deallocate(gaussian_data%xt)
-   deallocate(gaussian_data%xtts)
-   deallocate(gaussian_data%xu)
-   deallocate(gaussian_data%xv)
-   deallocate(gaussian_data%xz)
-   deallocate(gaussian_data%xzts)
-   deallocate(gaussian_data%zc)
- endif
-
- print*
- print*,'- NORMAL TERMINATION'
-
- call w3tage('GAUSSIAN_SFCANL')
-
- end program main
-
-!-------------------------------------------------------------------------------------------
-! Write gaussian surface data to netcdf file.
-!-------------------------------------------------------------------------------------------
-
- subroutine write_sfc_data_netcdf
-
- use netcdf
- use io
-
- implicit none
-
- character(len=50) :: outfile
- character(len=31) :: date_string
- character(len=4) :: year
- character(len=2) :: mon, day, hour
-
- integer :: header_buffer_val = 16384
- integer :: i, error, ncid, dim_xt, dim_yt, dim_time
- integer :: id_xt, id_yt, id_lon, id_lat, id_time
- integer :: n
-
-! noah variables
- integer, parameter :: num_noah=44
- character(len=30) :: noah_var(num_noah)
- character(len=70) :: noah_name(num_noah)
- character(len=30) :: noah_units(num_noah)
-
-! nst variables
- integer, parameter :: num_nst=16
- character(len=30) :: nst_var(num_nst)
- character(len=70) :: nst_name(num_nst)
- character(len=30) :: nst_units(num_nst)
-
-! variables to be output
- integer :: num_vars
- character(len=30), allocatable :: var(:)
- character(len=70), allocatable :: name(:)
- character(len=30), allocatable :: units(:)
- integer, allocatable :: id_var(:)
-
- real, parameter :: missing = 9.99e20
-
- real(kind=4), allocatable :: dummy(:,:), slat(:), wlat(:)
-
-! define noah fields
-
- data noah_var /"alnsf", &
-     "alnwf", &
-     "alvsf", &
-     "alvwf", &
-     "cnwat", &
-     "crain",&
-     "f10m", &
-     "facsf", &
-     "facwf", &
-     "ffhh", &
-     "ffmm", &
-     "fricv", &
-     "icec", &
-     "icetk", &
-     "land", &
-     "orog", &
-     "sfcr", &
-     "shdmax", &
-     "shdmin", &
-     "sltyp", &
-     "snoalb", &
-     "snod", &
-     "soill1", &
-     "soill2", &
-     "soill3", &
-     "soill4", &
-     "soilt1", &
-     "soilt2", &
-     "soilt3", &
-     "soilt4", &
-     "soilw1", &
-     "soilw2", &
-     "soilw3", &
-     "soilw4", &
-     "sotyp", &
-     "spfh2m", &
-     "tg3" , &
-     "tisfc", &
-     "tmp2m", &
-     "tmpsfc", &
-     "tprcp", &
-     "veg", &
-     "vtype", &
-     "weasd" /
-
- data noah_name /"mean nir albedo with strong cosz dependency", &
-     "mean nir albedo with weak cosz dependency", &
-     "mean vis albedo with strong cosz dependency", &
-     "mean vis albedo with weak cosz dependency", &
-     "canopy water (cnwat in gfs data)" , &
-     "instantaneous categorical rain", &
-     "10-meter wind speed divided by lowest model wind speed", &
-     "fractional coverage with strong cosz dependency", &
-     "fractional coverage with weak cosz dependency", &
-     "fh parameter from PBL scheme" , &
-     "fm parameter from PBL scheme" , &
-     "uustar surface frictional wind", &
-     "surface ice concentration (ice=1; no ice=0)", &
-     "sea ice thickness (icetk in gfs_data)", &
-     "sea-land-ice mask (0-sea, 1-land, 2-ice)", &
-     "surface geopotential height", &
-     "surface roughness", &
-     "maximum fractional coverage of green vegetation", &
-     "minimum fractional coverage of green vegetation", &
-     "surface slope type" , &
-     "maximum snow albedo in fraction", &
-     "surface snow depth", &
-     "liquid soil moisture at layer-1", &
-     "liquid soil moisture at layer-2", &
-     "liquid soil moisture at layer-3", &
-     "liquid soil moisture at layer-4", &
-     "soil temperature 0-10cm", &
-     "soil temperature 10-40cm", &
-     "soil temperature 40-100cm", &
-     "soil temperature 100-200cm", &
-     "volumetric soil moisture 0-10cm", &
-     "volumetric soil moisture 10-40cm", &
-     "volumetric soil moisture 40-100cm", &
-     "volumetric soil moisture 100-200cm", &
-     "soil type in integer", &
-     "2m specific humidity" , &
-     "deep soil temperature" , &
-     "surface temperature over ice fraction", &
-     "2m temperature", &
-     "surface temperature", &
-     "total precipitation" , &
-     "vegetation fraction", &
-     "vegetation type in integer", &
-     "surface snow water equivalent" /
-
- data noah_units /"%", &
-     "%", &
-     "%", &
-     "%", &
-     "XXX", &
-     "number", &
-     "N/A", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "fraction", &
-     "XXX", &
-     "numerical", &
-     "gpm", &
-     "m", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "m", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "XXX", &
-     "K", &
-     "K", &
-     "K", &
-     "K", &
-     "fraction", &
-     "fraction", &
-     "fraction", &
-     "fraction", &
-     "number", &
-     "kg/kg", &
-     "K", &
-     "K", &
-     "K", &
-     "K", &
-     "kg/m**2", &
-     "fraction", &
-     "number" , &
-     "kg/m**2" /
-
-! define nst fields
-
- data nst_var /"c0", &
-     "cd", &
-     "dconv", &
-     "dtcool", &
-     "qrain", &
-     "tref", &
-     "w0", &
-     "wd", &
-     "xs", &
-     "xt", &
-     "xtts", &
-     "xu", &
-     "xv", &
-     "xz", &
-     "xzts", &
-     "zc" /
-
- data nst_name /"nsst coefficient1 to calculate d(tz)/d(ts)", &
-     "nsst coefficient2 to calculate d(tz)/d(ts)", &
-     "nsst thickness of free convection layer", &
-     "nsst sub-layer cooling amount", &
-     "nsst sensible heat flux due to rainfall", &
-     "nsst reference or foundation temperature", &
-     "nsst coefficient3 to calculate d(tz)/d(ts)", &
-     "nsst coefficient4 to calculate d(tz)/d(ts)", &
-     "nsst salinity content in diurnal thermocline layer", &
-     "nsst heat content in diurnal thermocline layer", &
-     "nsst d(xt)/d(ts)", &
-     "nsst u-current content in diurnal thermocline layer", &
-     "nsst v-current content in diurnal thermocline layer", &
-     "nsst diurnal thermocline layer thickness", &
-     "nsst d(xt)/d(ts)", &
-     "nsst sub-layer cooling thickness"/
-
- data nst_units /"numerical", &
-     "n/a", &
-     "m", &
-     "k", &
-     "w/m2", &
-     "K", &
-     "n/a", &
-     "n/a", &
-     "n/a", &
-     "k*m", &
-     "m", &
-     "m2/s", &
-     "m2/s", &
-     "m", &
-     "m/k", &
-     "m"/
-
- outfile = "./sfc.gaussian.analysis.file"
-
- print*,"- WRITE SURFACE DATA TO NETCDF FILE: ", trim(outfile)
-
- error = nf90_create(outfile, cmode=IOR(IOR(NF90_CLOBBER,NF90_NETCDF4),NF90_CLASSIC_MODEL), ncid=ncid)
- call netcdf_err(error, 'CREATING NETCDF FILE')
-
-! dimensions
-
- error = nf90_def_dim(ncid, 'grid_xt', igaus, dim_xt)
- call netcdf_err(error, 'DEFINING GRID_XT DIMENSION')
-
- error = nf90_def_dim(ncid, 'grid_yt', jgaus, dim_yt)
- call netcdf_err(error, 'DEFINING GRID_YT DIMENSION')
-
- error = nf90_def_dim(ncid, 'time', 1, dim_time)
- call netcdf_err(error, 'DEFINING TIME DIMENSION')
-
-! global attributes
-
- error = nf90_put_att(ncid, nf90_global, 'nsoil', 4)
- call netcdf_err(error, 'DEFINING NSOIL ATTRIBUTE')
-
- error = nf90_put_att(ncid, nf90_global, 'source', "FV3GFS")
- call netcdf_err(error, 'DEFINING SOURCE ATTRIBUTE')
-
- error = nf90_put_att(ncid, nf90_global, 'grid', "gaussian")
- call netcdf_err(error, 'DEFINING GRID ATTRIBUTE')
-
- error = nf90_put_att(ncid, nf90_global, 'im', igaus)
- call netcdf_err(error, 'DEFINING IM ATTRIBUTE')
-
- error = nf90_put_att(ncid, nf90_global, 'jm', jgaus)
- call netcdf_err(error, 'DEFINING JM ATTRIBUTE')
-
-! variables
-
-! grid_xt
-
- error = nf90_def_var(ncid, 'grid_xt', NF90_DOUBLE, dim_xt, id_xt)
- call netcdf_err(error, 'DEFINING GRID_XT')
-
- error = nf90_put_att(ncid, id_xt, "cartesian_axis", "X")
- call netcdf_err(error, 'DEFINING GRID_XT ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_xt, "long_name", "T-cell longitude")
- call netcdf_err(error, 'DEFINING GRID_XT ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_xt, "units", "degrees_E")
- call netcdf_err(error, 'DEFINING GRID_XT ATTRIBUTE')
-
-! lon
-
- error = nf90_def_var(ncid, 'lon', NF90_DOUBLE, (/dim_xt,dim_yt/), id_lon)
- call netcdf_err(error, 'DEFINING LON')
-
- error = nf90_put_att(ncid, id_lon, "long_name", "T-cell longitude")
- call netcdf_err(error, 'DEFINING LON ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_lon, "units", "degrees_E")
- call netcdf_err(error, 'DEFINING LON ATTRIBUTE')
-
-! grid_yt
-
- error = nf90_def_var(ncid, 'grid_yt', NF90_DOUBLE, dim_yt, id_yt)
- call netcdf_err(error, 'DEFINING GRID_YT')
-
- error = nf90_put_att(ncid, id_yt, "cartesian_axis", "Y")
- call netcdf_err(error, 'DEFINING GRID_YT ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_yt, "long_name", "T-cell latitude")
- call netcdf_err(error, 'DEFINING GRID_YT ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_yt, "units", "degrees_N")
- call netcdf_err(error, 'DEFINING GRID_YT ATTRIBUTE')
-
-! lat
-
- error = nf90_def_var(ncid, 'lat', NF90_DOUBLE, (/dim_xt,dim_yt/), id_lat)
- call netcdf_err(error, 'DEFINING LAT')
-
- error = nf90_put_att(ncid, id_lat, "long_name", "T-cell latitude")
- call netcdf_err(error, 'DEFINING LAT ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_lat, "units", "degrees_N")
- call netcdf_err(error, 'DEFINING LAT ATTRIBUTE')
-
-! time
-
- error = nf90_def_var(ncid, 'time', NF90_DOUBLE, dim_time, id_time)
- call netcdf_err(error, 'DEFINING TIME')
-
- error = nf90_put_att(ncid, id_time, "long_name", "time")
- call netcdf_err(error, 'DEFINING TIME ATTRIBUTE')
-
- write(year, "(i4)") idate(1)
- write(mon, "(i2.2)") idate(2)
- write(day, "(i2.2)") idate(3)
- write(hour, "(i2.2)") idate(4)
-
- date_string="hours since " // year // "-" // mon // "-" // day // " " // hour // ":00:00"
-
- error = nf90_put_att(ncid, id_time, "units", date_string)
- call netcdf_err(error, 'DEFINING TIME ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_time, "cartesian_axis", "T")
- call netcdf_err(error, 'DEFINING TIME ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_time, "calendar_type", "JULIAN")
- call netcdf_err(error, 'DEFINING TIME ATTRIBUTE')
-
- error = nf90_put_att(ncid, id_time, "calendar", "JULIAN")
- call netcdf_err(error, 'DEFINING TIME ATTRIBUTE')
-
-!-------------------------------------------------------------------------------------------
-! Determine what variables to output (noah, or noah plus nst).
-!-------------------------------------------------------------------------------------------
-
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
-   num_vars = num_noah + num_nst
- else
-   num_vars = num_noah
- endif
-
- allocate(var(num_vars))
- allocate(name(num_vars))
- allocate(units(num_vars))
- allocate(id_var(num_vars))
-
- var(1:num_noah) = noah_var
- name(1:num_noah) = noah_name
- units(1:num_noah) = noah_units
-
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
-   do n = 1, num_nst
-     var(n+num_noah) = nst_var(n)
-     name(n+num_noah) = nst_name(n)
-     units(n+num_noah) = nst_units(n)
-   enddo
- endif
-
-!-------------------------------------------------------------------------------------------
-! Define variables in netcdf file.
-!-------------------------------------------------------------------------------------------
-
- do n = 1, num_vars
-
-   print*,'- DEFINE VARIABLE ',trim(var(n))
-   error = nf90_def_var(ncid, trim(var(n)), NF90_FLOAT, (/dim_xt,dim_yt,dim_time/), id_var(n))
-   call netcdf_err(error, 'DEFINING variable')
-   error = nf90_def_var_deflate(ncid, id_var(n), 1, 1, 1)
-   call netcdf_err(error, 'DEFINING variable with compression')
-
-   error = nf90_put_att(ncid, id_var(n), "long_name", trim(name(n)))
-   call netcdf_err(error, 'DEFINING name ATTRIBUTE')
-
-   error = nf90_put_att(ncid, id_var(n), "units", trim(units(n)))
-   call netcdf_err(error, 'DEFINING units ATTRIBUTE')
-
-   error = nf90_put_att(ncid, id_var(n), "missing", missing)
-   call netcdf_err(error, 'DEFINING missing ATTRIBUTE')
-
-   error = nf90_put_att(ncid, id_var(n), "cell_methods", "time: point")
-   call netcdf_err(error, 'DEFINING cell method ATTRIBUTE')
-
-   error = nf90_put_att(ncid, id_var(n), "output_file", "sfc")
-   call netcdf_err(error, 'DEFINING out file ATTRIBUTE')
-
- enddo
-
-! end variable defs
-
- error = nf90_enddef(ncid, header_buffer_val,4,0,4)
- call netcdf_err(error, 'DEFINING HEADER')
-
-!-------------------------------------------------------------------------------------------
-! Write variables to netcdf file.
-!-------------------------------------------------------------------------------------------
-
- allocate(dummy(igaus,jgaus))
- do i = 1, igaus
-   dummy(i,:) = real((i-1),4) * 360.0_4 / real(igaus,4)
- enddo
-
- error = nf90_put_var(ncid, id_xt, dummy(:,1))
- call netcdf_err(error, 'WRITING GRID_XT')
-
- error = nf90_put_var(ncid, id_lon, dummy)
- call netcdf_err(error, 'WRITING LON')
-
- allocate(slat(jgaus))
- allocate(wlat(jgaus))
- call splat(4, jgaus, slat, wlat)
-
- do i = (jgaus/2+1), jgaus
-   dummy(:,i) = 90.0 - (acos(slat(i)) * 180.0 / (4.0*atan(1.0)))
- enddo
-
- do i = 1, (jgaus/2)
-   dummy(:,i) = -(dummy(:,(jgaus-i+1)))
- enddo
-
- deallocate(slat, wlat)
-
- error = nf90_put_var(ncid, id_yt, dummy(1,:))
- call netcdf_err(error, 'WRITING GRID_YT')
-
- error = nf90_put_var(ncid, id_lat, dummy)
- call netcdf_err(error, 'WRITING LAT')
-
- error = nf90_put_var(ncid, id_time, 0)
- call netcdf_err(error, 'WRITING TIME')
-
- do n = 1, num_vars
-   print*,'- WRITE VARIABLE ',trim(var(n))
-   call get_netcdf_var(var(n), dummy)
-   error = nf90_put_var(ncid, id_var(n), dummy, start=(/1,1,1/), count=(/igaus,jgaus,1/))
-   call netcdf_err(error, 'WRITING variable')
- enddo
-
- deallocate (dummy)
-
- error = nf90_close(ncid)
-
- end subroutine write_sfc_data_netcdf
-
-!-------------------------------------------------------------------------------------------
-! Retrieve variable based on its netcdf identifier.
-!-------------------------------------------------------------------------------------------
-
- subroutine get_netcdf_var(var, dummy)
-
- use io
-
- implicit none
-
- character(len=*), intent(in) :: var
-
- real(kind=4), intent(out) :: dummy(igaus,jgaus)
-
- select case (var)
-   case ('alnsf')
-     dummy = reshape(gaussian_data%alnsf, (/igaus,jgaus/))
-   case ('alnwf')
-     dummy = reshape(gaussian_data%alnwf, (/igaus,jgaus/))
-   case ('alvsf')
-     dummy = reshape(gaussian_data%alvsf, (/igaus,jgaus/))
-   case ('alvwf')
-     dummy = reshape(gaussian_data%alvwf, (/igaus,jgaus/))
-   case ('cnwat')
-     dummy = reshape(gaussian_data%canopy, (/igaus,jgaus/))
-   case ('f10m')
-     dummy = reshape(gaussian_data%f10m, (/igaus,jgaus/))
-   case ('facsf')
-     dummy = reshape(gaussian_data%facsf, (/igaus,jgaus/))
-   case ('facwf')
-     dummy = reshape(gaussian_data%facwf, (/igaus,jgaus/))
-   case ('ffhh')
-     dummy = reshape(gaussian_data%ffhh, (/igaus,jgaus/))
-   case ('ffmm')
-     dummy = reshape(gaussian_data%ffmm, (/igaus,jgaus/))
-   case ('fricv')
-     dummy = reshape(gaussian_data%uustar, (/igaus,jgaus/))
-   case ('land')
-     dummy = reshape(gaussian_data%slmask, (/igaus,jgaus/))
-   case ('orog')
-     dummy = reshape(gaussian_data%orog, (/igaus,jgaus/))
-   case ('sltyp')
-     dummy = reshape(gaussian_data%slope, (/igaus,jgaus/))
-   case ('icec')
-     dummy = reshape(gaussian_data%fice, (/igaus,jgaus/))
-   case ('icetk')
-     dummy = reshape(gaussian_data%hice, (/igaus,jgaus/))
-   case ('snoalb')
-     dummy = reshape(gaussian_data%snoalb, (/igaus,jgaus/))
-   case ('shdmin')
-     dummy = reshape(gaussian_data%shdmin, (/igaus,jgaus/))
-   case ('shdmax')
-     dummy = reshape(gaussian_data%shdmax, (/igaus,jgaus/))
-   case ('snod')
-     dummy = reshape(gaussian_data%snwdph, (/igaus,jgaus/)) / 1000.0
-   case ('weasd')
-     dummy = reshape(gaussian_data%sheleg, (/igaus,jgaus/))
-   case ('veg')
-     dummy = reshape(gaussian_data%vfrac, (/igaus,jgaus/)) * 100.0
-   case ('sfcr')
-     dummy = reshape(gaussian_data%zorl, (/igaus,jgaus/)) / 100.0
-   case ('crain')
-     dummy = reshape(gaussian_data%srflag, (/igaus,jgaus/))
-   case ('sotyp')
-     dummy = reshape(gaussian_data%stype, (/igaus,jgaus/))
-   case ('spfh2m')
-     dummy = reshape(gaussian_data%q2m, (/igaus,jgaus/))
-   case ('tmp2m')
-     dummy = reshape(gaussian_data%t2m, (/igaus,jgaus/))
-   case ('tmpsfc')
-     dummy = reshape(gaussian_data%tsea, (/igaus,jgaus/))
-   case ('tg3')
-     dummy = reshape(gaussian_data%tg3, (/igaus,jgaus/))
-   case ('tisfc')
-     dummy = reshape(gaussian_data%tisfc, (/igaus,jgaus/))
-   case ('tprcp')
-     dummy = reshape(gaussian_data%tprcp, (/igaus,jgaus/))
-   case ('vtype')
-     dummy = reshape(gaussian_data%vtype, (/igaus,jgaus/))
-   case ('soill1')
-     dummy = reshape(gaussian_data%slc(:,1), (/igaus,jgaus/))
-     where (dummy > 0.99) dummy = 0.0 ! replace flag value at water/landice
-   case ('soill2')
-     dummy = reshape(gaussian_data%slc(:,2), (/igaus,jgaus/))
-     where (dummy > 0.99) dummy = 0.0 ! replace flag value at water/landice
-   case ('soill3')
-     dummy = reshape(gaussian_data%slc(:,3), (/igaus,jgaus/))
-     where (dummy > 0.99) dummy = 0.0 ! replace flag value at water/landice
-   case ('soill4')
-     dummy = reshape(gaussian_data%slc(:,4), (/igaus,jgaus/))
-     where (dummy > 0.99) dummy = 0.0 ! replace flag value at water/landice
-   case ('soilt1')
-     dummy = reshape(gaussian_data%stc(:,1), (/igaus,jgaus/))
-   case ('soilt2')
-     dummy = reshape(gaussian_data%stc(:,2), (/igaus,jgaus/))
-   case ('soilt3')
-     dummy = reshape(gaussian_data%stc(:,3), (/igaus,jgaus/))
-   case ('soilt4')
-     dummy = reshape(gaussian_data%stc(:,4), (/igaus,jgaus/))
-   case ('soilw1')
-     dummy = reshape(gaussian_data%smc(:,1), (/igaus,jgaus/))
-   case ('soilw2')
-     dummy = reshape(gaussian_data%smc(:,2), (/igaus,jgaus/))
-   case ('soilw3')
-     dummy = reshape(gaussian_data%smc(:,3), (/igaus,jgaus/))
-   case ('soilw4')
-     dummy = reshape(gaussian_data%smc(:,4), (/igaus,jgaus/))
-   case ('c0')
-     dummy = reshape(gaussian_data%c0, (/igaus,jgaus/))
-   case ('cd')
-     dummy = reshape(gaussian_data%cd, (/igaus,jgaus/))
-   case ('dconv')
-     dummy = reshape(gaussian_data%dconv, (/igaus,jgaus/))
-   case ('dtcool')
-     dummy = reshape(gaussian_data%dtcool, (/igaus,jgaus/))
-   case ('qrain')
-     dummy = reshape(gaussian_data%qrain, (/igaus,jgaus/))
-   case ('tref')
-     dummy = reshape(gaussian_data%tref, (/igaus,jgaus/))
-   case ('w0')
-     dummy = reshape(gaussian_data%w0, (/igaus,jgaus/))
-   case ('wd')
-     dummy = reshape(gaussian_data%wd, (/igaus,jgaus/))
-   case ('xs')
-     dummy = reshape(gaussian_data%xs, (/igaus,jgaus/))
-   case ('xt')
-     dummy = reshape(gaussian_data%xt, (/igaus,jgaus/))
-   case ('xtts')
-     dummy = reshape(gaussian_data%xtts, (/igaus,jgaus/))
-   case ('xu')
-     dummy = reshape(gaussian_data%xu, (/igaus,jgaus/))
-   case ('xv')
-     dummy = reshape(gaussian_data%xv, (/igaus,jgaus/))
-   case ('xz')
-     dummy = reshape(gaussian_data%xz, (/igaus,jgaus/))
-   case ('xzts')
-     dummy = reshape(gaussian_data%xzts, (/igaus,jgaus/))
-   case ('zc')
-     dummy = reshape(gaussian_data%zc, (/igaus,jgaus/))
-   case default
-     print*,'- FATAL ERROR: UNKNOWN VAR IN GET_VAR: ', var
-     call errexit(67)
- end select
-
- end subroutine get_netcdf_var
-
-!-------------------------------------------------------------------------------------------
-! Write gaussian surface data to nemsio file.
-!-------------------------------------------------------------------------------------------
-
- subroutine write_sfc_data_nemsio
-
- use nemsio_module
- use io
-
- implicit none
-
- integer(nemsio_intkind), parameter :: nrec_all=60
- integer(nemsio_intkind), parameter :: nmetaaryi=1
- integer(nemsio_intkind), parameter :: nmetavari=4
- integer(nemsio_intkind), parameter :: nmetavarr=1
- integer(nemsio_intkind), parameter :: nmetavarc=2
-
- character(nemsio_charkind) :: recname_all(nrec_all)
- character(nemsio_charkind) :: reclevtyp_all(nrec_all)
- character(nemsio_charkind) :: aryiname(nmetaaryi)
- character(nemsio_charkind) :: variname(nmetavari)
- character(nemsio_charkind) :: varrname(nmetavarr)
- character(nemsio_charkind) :: varcname(nmetavarc)
- character(nemsio_charkind) :: varcval(nmetavarc)
- character(nemsio_charkind), allocatable :: recname(:)
- character(nemsio_charkind), allocatable :: reclevtyp(:)
-
- integer(nemsio_intkind) :: iret, version, nrec
- integer(nemsio_intkind) :: reclev_all(nrec_all)
- integer(nemsio_intkind) :: aryival(jgaus,nmetaaryi)
- integer(nemsio_intkind) :: aryilen(nmetaaryi)
- integer(nemsio_intkind) :: varival(nmetavari)
- integer :: i, k, n, nvcoord, levs_vcoord
- integer(nemsio_intkind), allocatable :: reclev(:)
-
- real(nemsio_realkind), allocatable :: the_data(:)
- real(nemsio_realkind) :: varrval(nmetavarr)
- real(nemsio_realkind), allocatable :: lat(:), lon(:)
- real(kind=4), allocatable :: dummy(:,:), slat(:), wlat(:)
- real(nemsio_realkind), allocatable :: vcoord(:,:,:)
-
- type(nemsio_gfile) :: gfileo
-
- data recname_all /'alnsf', 'alnwf', 'alvsf', 'alvwf', &
-     'cnwat', 'crain', 'f10m', 'facsf', &
-     'facwf', 'ffhh', 'ffmm', 'fricv', &
-     'icec', 'icetk', 'land', 'orog', &
-     'snoalb', 'sfcr', 'shdmax', 'shdmin', &
-     'soill', 'soill', 'soill', 'soill', &
-     'sltyp', 'soilw', 'soilw', 'soilw', &
-     'soilw', 'snod', 'sotyp', 'spfh', &
-     'tmp', 'tmp', 'tmp', 'tmp', &
-     'tg3', 'ti', 'tmp', 'tmp', &
-     'tprcp', 'veg', 'vtype', 'weasd', &
-     'c0', 'cd', 'dconv', 'dtcool', &
-     'qrain', 'tref', &
-     'w0', 'wd', 'xs', 'xt', &
-     'xtts', 'xu', 'xv', 'xz', &
-     'xzts', 'zc'/
-
- data reclevtyp_all /'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', '10 m above gnd', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     '0-10 cm down', '10-40 cm down', '40-100 cm down', '100-200 cm down', &
-     'sfc', '0-10 cm down', '10-40 cm down', '40-100 cm down', &
-     '100-200 cm down', 'sfc', 'sfc', '2 m above gnd', &
-     '0-10 cm down', '10-40 cm down', '40-100 cm down', '100-200 cm down', &
-     'sfc', 'sfc', '2 m above gnd', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc', 'sfc', 'sfc', 'sfc', &
-     'sfc'/
-
- data reclev_all /1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, &
-     1, 1, 1, 1, 1, 1, 1/
-
- data aryiname /'lpl'/
-
- data variname /'fhzero', 'ncld', 'nsoil', 'imp_physics'/
-
- data varival /6, 5, 4, 11/
-
- data varrname /'dtp'/
-
- data varrval /225.0/
-
- data varcname /"y-direction", "z-direction"/
-
- data varcval /"north2south", "bottom2top"/
-
- version = 200809
-
- aryival = igaus ! reduced grid definition
- aryilen = jgaus
-
- allocate(dummy(igaus,jgaus))
- do i = 1, igaus
-   dummy(i,:) = float(i-1) * 360.0 / float(igaus)
- enddo
-
- allocate(lon(igaus*jgaus))
- lon = reshape (dummy, (/igaus*jgaus/) )
-
-! Call 4-byte version of splib to match latitudes in history files.
-
- allocate(slat(jgaus))
- allocate(wlat(jgaus))
- call splat(4, jgaus, slat, wlat)
-
- do i = (jgaus/2+1), jgaus
-   dummy(:,i) = 90.0 - (acos(slat(i)) * 180.0 / (4.0*atan(1.0)))
- enddo
-
- do i = 1, (jgaus/2)
-   dummy(:,i) = -(dummy(:,(jgaus-i+1)))
- enddo
-
- deallocate(slat, wlat)
-
- allocate(lat(igaus*jgaus))
- lat = reshape (dummy, (/igaus*jgaus/) )
-
- deallocate(dummy)
-
- print*
- print*, "- OPEN VCOORD FILE."
- open(14, file="vcoord.txt", form='formatted', iostat=iret)
- if (iret /= 0) goto 43
-
- print*, "- READ VCOORD FILE."
- read(14, *, iostat=iret) nvcoord, levs_vcoord
- if (iret /= 0) goto 43
-
- allocate(vcoord(levs_vcoord,3,2))
- vcoord = 0.0
- read(14, *, iostat=iret) ((vcoord(n,k,1), k=1,nvcoord), n=1,levs_vcoord)
- if (iret /= 0) goto 43
-
- close (14)
-
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
-   nrec = nrec_all
-   allocate(recname(nrec))
-   recname = recname_all
-   allocate(reclevtyp(nrec))
-   reclevtyp = reclevtyp_all
-   allocate(reclev(nrec))
-   reclev = reclev_all
- else
-   nrec = 44
-   allocate(recname(nrec))
-   recname = recname_all(1:nrec)
-   allocate(reclevtyp(nrec))
-   reclevtyp = reclevtyp_all(1:nrec)
-   allocate(reclev(nrec))
-   reclev = reclev_all(1:nrec)
- endif
-
- call nemsio_init(iret=iret)
-
- print*
- print*,"- OPEN GAUSSIAN NEMSIO SURFACE FILE"
-
- call nemsio_open(gfileo, "sfc.gaussian.analysis.file", 'write', &
-     modelname="FV3GFS", gdatatype="bin4", version=version, &
-     nmeta=8, nrec=nrec, dimx=igaus, dimy=jgaus, dimz=(levs_vcoord-1), &
-     nframe=0, nsoil=4, ntrac=8, jcap=-9999, &
-     ncldt=5, idvc=-9999, idsl=-9999, idvm=-9999, &
-     idrt=4, lat=lat, lon=lon, vcoord=vcoord, &
-     nfhour=0, nfminute=0, nfsecondn=0, &
-     nfsecondd=1, nfday=0, idate=idate, &
-     recname=recname, reclevtyp=reclevtyp, &
-     reclev=reclev, extrameta=.true., &
-     nmetavari=nmetavari, variname=variname, varival=varival, &
-     nmetavarr=nmetavarr, varrname=varrname, varrval=varrval, &
-     nmetavarc=nmetavarc, varcname=varcname, varcval=varcval, &
-     nmetaaryi=nmetaaryi, aryiname=aryiname, &
-     aryival=aryival, aryilen=aryilen, iret=iret)
- if (iret /= 0) goto 44
-
- deallocate (lat, lon, vcoord, recname, reclevtyp, reclev)
-
- allocate(the_data(igaus*jgaus))
-
- print*,"- WRITE GAUSSIAN NEMSIO SURFACE FILE"
-
- print*,"- WRITE ALNSF"
- the_data = gaussian_data%alnsf
- call nemsio_writerec(gfileo, 1, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE ALNWF"
- the_data = gaussian_data%alnwf
- call nemsio_writerec(gfileo, 2, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE ALVSF"
- the_data = gaussian_data%alvsf
- call nemsio_writerec(gfileo, 3, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE ALVWF"
- the_data = gaussian_data%alvwf
- call nemsio_writerec(gfileo, 4, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE CANOPY"
- the_data = gaussian_data%canopy
- call nemsio_writerec(gfileo, 5, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE CRAIN (SRFLAG)"
- the_data = gaussian_data%srflag
- call nemsio_writerec(gfileo, 6, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE F10M"
- the_data = gaussian_data%f10m
- call nemsio_writerec(gfileo, 7, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE FACSF"
- the_data = gaussian_data%facsf
- call nemsio_writerec(gfileo, 8, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE FACWF"
- the_data = gaussian_data%facwf
- call nemsio_writerec(gfileo, 9, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE FFHH"
- the_data = gaussian_data%ffhh
- call nemsio_writerec(gfileo, 10, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE FFMM"
- the_data = gaussian_data%ffmm
- call nemsio_writerec(gfileo, 11, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE UUSTAR"
- the_data = gaussian_data%uustar
- call nemsio_writerec(gfileo, 12, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE FICE"
- the_data = gaussian_data%fice
- call nemsio_writerec(gfileo, 13, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE HICE"
- the_data = gaussian_data%hice
- call nemsio_writerec(gfileo, 14, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SLMSK"
- the_data = gaussian_data%slmask
- call nemsio_writerec(gfileo, 15, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE OROG"
- the_data = gaussian_data%orog
- call nemsio_writerec(gfileo, 16, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SNOALB"
- the_data = gaussian_data%snoalb
- call nemsio_writerec(gfileo, 17, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE ZORL"
- the_data = gaussian_data%zorl * 0.01 ! meters
- call nemsio_writerec(gfileo, 18, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SHDMAX"
- the_data = gaussian_data%shdmax
- call nemsio_writerec(gfileo, 19, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SHDMIN"
- the_data = gaussian_data%shdmin
- call nemsio_writerec(gfileo, 20, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SLC"
- the_data = gaussian_data%slc(:,1)
- call nemsio_writerec(gfileo, 21, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%slc(:,2)
- call nemsio_writerec(gfileo, 22, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%slc(:,3)
- call nemsio_writerec(gfileo, 23, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%slc(:,4)
- call nemsio_writerec(gfileo, 24, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SLOPE"
- the_data = gaussian_data%slope
- call nemsio_writerec(gfileo, 25, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SMC"
- the_data = gaussian_data%smc(:,1)
- call nemsio_writerec(gfileo, 26, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%smc(:,2)
- call nemsio_writerec(gfileo, 27, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%smc(:,3)
- call nemsio_writerec(gfileo, 28, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%smc(:,4)
- call nemsio_writerec(gfileo, 29, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SNWDPH"
- the_data = gaussian_data%snwdph * 0.001 ! meters
- call nemsio_writerec(gfileo, 30, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE STYPE"
- the_data = gaussian_data%stype
- call nemsio_writerec(gfileo, 31, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE Q2M"
- the_data = gaussian_data%q2m
- call nemsio_writerec(gfileo, 32, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE STC"
- the_data = gaussian_data%stc(:,1)
- call nemsio_writerec(gfileo, 33, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%stc(:,2)
- call nemsio_writerec(gfileo, 34, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%stc(:,3)
- call nemsio_writerec(gfileo, 35, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- the_data = gaussian_data%stc(:,4)
- call nemsio_writerec(gfileo, 36, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE TG3"
- the_data = gaussian_data%tg3
- call nemsio_writerec(gfileo, 37, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE TISFC"
- the_data = gaussian_data%tisfc
- call nemsio_writerec(gfileo, 38, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE T2M"
- the_data = gaussian_data%t2m
- call nemsio_writerec(gfileo, 39, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE TSEA"
- the_data = gaussian_data%tsea
- call nemsio_writerec(gfileo, 40, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE TPRCP"
- the_data = gaussian_data%tprcp
- call nemsio_writerec(gfileo, 41, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE VFRAC"
- the_data = gaussian_data%vfrac * 100.0 ! whole percent
- call nemsio_writerec(gfileo, 42, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE VTYPE"
- the_data = gaussian_data%vtype
- call nemsio_writerec(gfileo, 43, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- print*,"- WRITE SHELEG"
- the_data = gaussian_data%sheleg
- call nemsio_writerec(gfileo, 44, the_data, iret=iret)
- if (iret /= 0) goto 44
-
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
-
-   print*,"- WRITE C0"
-   the_data = gaussian_data%c0
-   call nemsio_writerec(gfileo, 45, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE CD"
-   the_data = gaussian_data%cd
-   call nemsio_writerec(gfileo, 46, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE DCONV"
-   the_data = gaussian_data%dconv
-   call nemsio_writerec(gfileo, 47, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE DTCOOL"
-   the_data = gaussian_data%dtcool
-   call nemsio_writerec(gfileo, 48, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE QRAIN"
-   the_data = gaussian_data%qrain
-   call nemsio_writerec(gfileo, 49, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE TREF"
-   the_data = gaussian_data%tref
-   call nemsio_writerec(gfileo, 50, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE W0"
-   the_data = gaussian_data%w0
-   call nemsio_writerec(gfileo, 51, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE WD"
-   the_data = gaussian_data%wd
-   call nemsio_writerec(gfileo, 52, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XS"
-   the_data = gaussian_data%xs
-   call nemsio_writerec(gfileo, 53, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XT"
-   the_data = gaussian_data%xt
-   call nemsio_writerec(gfileo, 54, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XTTS"
-   the_data = gaussian_data%xtts
-   call nemsio_writerec(gfileo, 55, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XU"
-   the_data = gaussian_data%xu
-   call nemsio_writerec(gfileo, 56, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XV"
-   the_data = gaussian_data%xv
-   call nemsio_writerec(gfileo, 57, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XZ"
-   the_data = gaussian_data%xz
-   call nemsio_writerec(gfileo, 58, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE XZTS"
-   the_data = gaussian_data%xzts
-   call nemsio_writerec(gfileo, 59, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
-   print*,"- WRITE ZC"
-   the_data = gaussian_data%zc
-   call nemsio_writerec(gfileo, 60, the_data, iret=iret)
-   if (iret /= 0) goto 44
-
- endif
-
- call nemsio_close(gfileo,iret=iret)
-
- call nemsio_finalize()
-
- deallocate(the_data)
-
- return
-
- 43 continue
- print*,"- ** FATAL ERROR OPENING/READING VCOORD FILE."
- print*,"- IRET IS: ", iret
- call errexit(17)
- stop
-
- 44 continue
- print*,"- ** FATAL ERROR WRITING GAUSSIAN NEMSIO FILE."
- print*,"- IRET IS: ", iret
- call errexit(15)
- stop
-
- end subroutine write_sfc_data_nemsio
-
-!-------------------------------------------------------------------------------------------
-! Read tile data.
-!-------------------------------------------------------------------------------------------
-
- subroutine read_data_anl
-
- use netcdf
- use io
-
- implicit none
-
- integer :: ijtile, id_dim, id_var
- integer :: error, tile, ncid
- integer :: istart, iend
-
- real(kind=8), allocatable :: dummy(:,:), dummy3d(:,:,:)
-
-!-------------------------------------------------------------------------------------------
-! Get tile dimensions from the first analysis file.
-!-------------------------------------------------------------------------------------------
-
- error=nf90_open("./anal.tile1.nc",nf90_nowrite,ncid)
- error=nf90_inq_dimid(ncid, 'xaxis_1', id_dim)
- call netcdf_err(error, 'READING xaxis_1' )
- error=nf90_inquire_dimension(ncid,id_dim,len=itile)
- call netcdf_err(error, 'READING xaxis_1' )
-
- error=nf90_inq_dimid(ncid, 'yaxis_1', id_dim)
- call netcdf_err(error, 'READING yaxis_1' )
- error=nf90_inquire_dimension(ncid,id_dim,len=jtile)
- call netcdf_err(error, 'READING yaxis_1' )
-
- error = nf90_close(ncid)
-
- ijtile = itile*jtile
-
- allocate(dummy(itile,jtile))
- allocate(dummy3d(itile,jtile,4))
-
- allocate(tile_data%orog(ijtile*num_tiles))
- allocate(tile_data%canopy(ijtile*num_tiles))
- allocate(tile_data%slmask(ijtile*num_tiles))
- allocate(tile_data%tg3(ijtile*num_tiles))
- allocate(tile_data%alvsf(ijtile*num_tiles))
- allocate(tile_data%alvwf(ijtile*num_tiles))
- allocate(tile_data%alnsf(ijtile*num_tiles))
- allocate(tile_data%alnwf(ijtile*num_tiles))
- allocate(tile_data%facsf(ijtile*num_tiles))
- allocate(tile_data%facwf(ijtile*num_tiles))
- allocate(tile_data%ffhh(ijtile*num_tiles))
- allocate(tile_data%ffmm(ijtile*num_tiles))
- allocate(tile_data%fice(ijtile*num_tiles))
- allocate(tile_data%hice(ijtile*num_tiles))
- allocate(tile_data%sheleg(ijtile*num_tiles))
- allocate(tile_data%stype(ijtile*num_tiles))
- allocate(tile_data%vfrac(ijtile*num_tiles))
- allocate(tile_data%vtype(ijtile*num_tiles))
- allocate(tile_data%zorl(ijtile*num_tiles))
- allocate(tile_data%tsea(ijtile*num_tiles))
- allocate(tile_data%f10m(ijtile*num_tiles))
- allocate(tile_data%q2m(ijtile*num_tiles))
- allocate(tile_data%shdmax(ijtile*num_tiles))
- allocate(tile_data%shdmin(ijtile*num_tiles))
- allocate(tile_data%slope(ijtile*num_tiles))
- allocate(tile_data%snoalb(ijtile*num_tiles))
- allocate(tile_data%srflag(ijtile*num_tiles))
- allocate(tile_data%snwdph(ijtile*num_tiles))
- allocate(tile_data%t2m(ijtile*num_tiles))
- allocate(tile_data%tisfc(ijtile*num_tiles))
- allocate(tile_data%tprcp(ijtile*num_tiles))
- allocate(tile_data%uustar(ijtile*num_tiles))
- allocate(tile_data%slc(ijtile*num_tiles,4))
- allocate(tile_data%smc(ijtile*num_tiles,4))
- allocate(tile_data%stc(ijtile*num_tiles,4))
-! nst
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
- allocate(tile_data%c0(ijtile*num_tiles))
- allocate(tile_data%cd(ijtile*num_tiles))
- allocate(tile_data%dconv(ijtile*num_tiles))
- allocate(tile_data%dtcool(ijtile*num_tiles))
- allocate(tile_data%land(ijtile*num_tiles))
- allocate(tile_data%qrain(ijtile*num_tiles))
- allocate(tile_data%tref(ijtile*num_tiles))
- allocate(tile_data%w0(ijtile*num_tiles))
- allocate(tile_data%wd(ijtile*num_tiles))
- allocate(tile_data%xs(ijtile*num_tiles))
- allocate(tile_data%xt(ijtile*num_tiles))
- allocate(tile_data%xtts(ijtile*num_tiles))
- allocate(tile_data%xu(ijtile*num_tiles))
- allocate(tile_data%xv(ijtile*num_tiles))
- allocate(tile_data%xz(ijtile*num_tiles))
- allocate(tile_data%xzts(ijtile*num_tiles))
- allocate(tile_data%zc(ijtile*num_tiles))
- endif
-
- do tile = 1, 6
-
- print*
- print*, "- READ INPUT SFC DATA FOR TILE: ", tile
-
- istart = (ijtile) * (tile-1) + 1
- iend = istart + ijtile - 1
-
- if (tile==1) error=nf90_open("./anal.tile1.nc",nf90_nowrite,ncid)
- if (tile==2) error=nf90_open("./anal.tile2.nc",nf90_nowrite,ncid)
- if (tile==3) error=nf90_open("./anal.tile3.nc",nf90_nowrite,ncid)
- if (tile==4) error=nf90_open("./anal.tile4.nc",nf90_nowrite,ncid)
- if (tile==5) error=nf90_open("./anal.tile5.nc",nf90_nowrite,ncid)
- if (tile==6) error=nf90_open("./anal.tile6.nc",nf90_nowrite,ncid)
-
- call netcdf_err(error, 'OPENING FILE' )
-
- error=nf90_inq_varid(ncid, "slmsk", id_var)
- call netcdf_err(error, 'READING slmsk ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING slmsk' )
- print*,'- SLMSK: ',maxval(dummy),minval(dummy)
- tile_data%slmask(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "tsea", id_var)
- call netcdf_err(error, 'READING tsea ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING tsea' )
- print*,'- TSEA: ',maxval(dummy),minval(dummy)
- tile_data%tsea(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "sheleg", id_var)
- call netcdf_err(error, 'READING sheleg ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING sheleg' )
- print*,'- SHELEG: ',maxval(dummy),minval(dummy)
- tile_data%sheleg(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "tg3", id_var)
- call netcdf_err(error, 'READING tg3 ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING tg3' )
- print*,'- TG3: ',maxval(dummy),minval(dummy)
- tile_data%tg3(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "zorl", id_var)
- call netcdf_err(error, 'READING zorl ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING zorl' )
- print*,'- ZORL: ',maxval(dummy),minval(dummy)
- tile_data%zorl(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "alvsf", id_var)
- call netcdf_err(error, 'READING alvsf ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING alvsf' )
- print*,'- ALVSF: ',maxval(dummy),minval(dummy)
- tile_data%alvsf(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "alvwf", id_var)
- call netcdf_err(error, 'READING alvwf ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING alvwf' )
- print*,'- ALVWF: ',maxval(dummy),minval(dummy)
- tile_data%alvwf(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "alnsf", id_var)
- call netcdf_err(error, 'READING alnsf ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING alnsf' )
- print*,'- ALNSF: ',maxval(dummy),minval(dummy)
- tile_data%alnsf(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "alnwf", id_var)
- call netcdf_err(error, 'READING alnwf ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING alnwf' )
- print*,'- ALNWF: ',maxval(dummy),minval(dummy)
- tile_data%alnwf(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "facsf", id_var)
- call netcdf_err(error, 'READING facsf ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING facsf' )
- print*,'- FACSF: ',maxval(dummy),minval(dummy)
- tile_data%facsf(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "facwf", id_var)
- call netcdf_err(error, 'READING facwf ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING facwf' )
- print*,'- FACWF: ',maxval(dummy),minval(dummy)
- tile_data%facwf(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "vfrac", id_var)
- call netcdf_err(error, 'READING vfrac ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING vfrac' )
- print*,'- VFRAC: ',maxval(dummy),minval(dummy)
- tile_data%vfrac(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "canopy", id_var)
- call netcdf_err(error, 'READING canopy ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING canopy' )
- print*,'- CANOPY: ',maxval(dummy),minval(dummy)
- tile_data%canopy(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "f10m", id_var)
- call netcdf_err(error, 'READING f10m ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING f10m' )
- print*,'- F10M: ',maxval(dummy),minval(dummy)
- tile_data%f10m(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "t2m", id_var)
- call netcdf_err(error, 'READING t2m ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING t2m' )
- print*,'- T2M: ',maxval(dummy),minval(dummy)
- tile_data%t2m(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "q2m", id_var)
- call netcdf_err(error, 'READING q2m ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING q2m' )
- print*,'- Q2M: ',maxval(dummy),minval(dummy)
- tile_data%q2m(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "vtype", id_var)
- call netcdf_err(error, 'READING vtype ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING vtype' )
- print*,'- VTYPE: ',maxval(dummy),minval(dummy)
- tile_data%vtype(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "stype", id_var)
- call netcdf_err(error, 'READING stype ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING stype' )
- print*,'- STYPE: ',maxval(dummy),minval(dummy)
- tile_data%stype(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "uustar", id_var)
- call netcdf_err(error, 'READING uustar ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING uustar' )
- print*,'- UUSTAR: ',maxval(dummy),minval(dummy)
- tile_data%uustar(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "ffmm", id_var)
- call netcdf_err(error, 'READING ffmm ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING ffmm' )
- print*,'- FFMM: ',maxval(dummy),minval(dummy)
- tile_data%ffmm(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "ffhh", id_var)
- call netcdf_err(error, 'READING ffhh ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING ffhh' )
- print*,'- FFHH: ',maxval(dummy),minval(dummy)
- tile_data%ffhh(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "hice", id_var)
- call netcdf_err(error, 'READING hice ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING hice' )
- print*,'- HICE: ',maxval(dummy),minval(dummy)
- tile_data%hice(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "fice", id_var)
- call netcdf_err(error, 'READING fice ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING fice' )
- print*,'- FICE: ',maxval(dummy),minval(dummy)
- tile_data%fice(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "tisfc", id_var)
- call netcdf_err(error, 'READING tisfc ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING tisfc' )
- print*,'- TISFC: ',maxval(dummy),minval(dummy)
- tile_data%tisfc(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "tprcp", id_var)
- call netcdf_err(error, 'READING tprcp ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING tprcp' )
- print*,'- TPRCP: ',maxval(dummy),minval(dummy)
- tile_data%tprcp(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "srflag", id_var)
- call netcdf_err(error, 'READING srflag ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING srfalg' )
- print*,'- SRFLAG: ',maxval(dummy),minval(dummy)
- tile_data%srflag(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "snwdph", id_var)
- call netcdf_err(error, 'READING snwdph ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING snwdph' )
- print*,'- SNWDPH: ',maxval(dummy),minval(dummy)
- tile_data%snwdph(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "shdmin", id_var)
- call netcdf_err(error, 'READING shdmin ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING shdmin' )
- print*,'- SHDMIN: ',maxval(dummy),minval(dummy)
- tile_data%shdmin(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "shdmax", id_var)
- call netcdf_err(error, 'READING shdmax ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING shdmax' )
- print*,'- SHDMAX: ',maxval(dummy),minval(dummy)
- tile_data%shdmax(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "slope", id_var)
- call netcdf_err(error, 'READING slope ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING slope' )
- print*,'- SLOPE: ',maxval(dummy),minval(dummy)
- tile_data%slope(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "snoalb", id_var)
- call netcdf_err(error, 'READING snoalb ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING snoalb' )
- print*,'- SNOALB: ',maxval(dummy),minval(dummy)
- tile_data%snoalb(istart:iend) = reshape(dummy, (/ijtile/))
-
- if (trim(donst) == "yes" .or. trim(donst) == "YES") then
-
- error=nf90_inq_varid(ncid, "c_0", id_var)
- call netcdf_err(error, 'READING c_0 ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING c_0' )
- print*,'- C_0: ',maxval(dummy),minval(dummy)
- tile_data%c0(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "c_d", id_var)
- call netcdf_err(error, 'READING c_d ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING c_d' )
- print*,'- C_D: ',maxval(dummy),minval(dummy)
- tile_data%cd(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "d_conv", id_var)
- call netcdf_err(error, 'READING d_conv ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING d_conv' )
- print*,'- D_CONV: ',maxval(dummy),minval(dummy)
- tile_data%dconv(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "dt_cool", id_var)
- call netcdf_err(error, 'READING dt_cool ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING dt_cool' )
- print*,'- DT_COOL: ',maxval(dummy),minval(dummy)
- tile_data%dtcool(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "qrain", id_var)
- call netcdf_err(error, 'READING qrain ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING qrain' )
- print*,'- QRAIN: ',maxval(dummy),minval(dummy)
- tile_data%qrain(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "tref", id_var)
- call netcdf_err(error, 'READING tref ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING tref' )
- print*,'- TREF: ',maxval(dummy),minval(dummy)
- tile_data%tref(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "w_0", id_var)
- call netcdf_err(error, 'READING w_0 ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING w_0' )
- print*,'- W_0: ',maxval(dummy),minval(dummy)
- tile_data%w0(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "w_d", id_var)
- call netcdf_err(error, 'READING w_d ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING w_d' )
- print*,'- W_D: ',maxval(dummy),minval(dummy)
- tile_data%wd(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xs", id_var)
- call netcdf_err(error, 'READING xs ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xs' )
- print*,'- XS: ',maxval(dummy),minval(dummy)
- tile_data%xs(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xt", id_var)
- call netcdf_err(error, 'READING xt ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xt' )
- print*,'- XT: ',maxval(dummy),minval(dummy)
- tile_data%xt(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xtts", id_var)
- call netcdf_err(error, 'READING xtts ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xtts' )
- print*,'- XTTS: ',maxval(dummy),minval(dummy)
- tile_data%xtts(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xzts", id_var)
- call netcdf_err(error, 'READING xzts ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xzts' )
- print*,'- XZTS: ',maxval(dummy),minval(dummy)
- tile_data%xzts(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xu", id_var)
- call netcdf_err(error, 'READING xu ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xu' )
- print*,'- XU: ',maxval(dummy),minval(dummy)
- tile_data%xu(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xv", id_var)
- call netcdf_err(error, 'READING xv ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xv' )
- print*,'- XV: ',maxval(dummy),minval(dummy)
- tile_data%xv(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "xz", id_var)
- call netcdf_err(error, 'READING xz ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING xz' )
- print*,'- XZ: ',maxval(dummy),minval(dummy)
- tile_data%xz(istart:iend) = reshape(dummy, (/ijtile/))
-
- error=nf90_inq_varid(ncid, "z_c", id_var)
- call netcdf_err(error, 'READING z_c ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING z_c' )
- print*,'- Z_C: ',maxval(dummy),minval(dummy)
- tile_data%zc(istart:iend) = reshape(dummy, (/ijtile/))
-
- endif ! nst fields
-
- error=nf90_inq_varid(ncid, "smc", id_var)
- call netcdf_err(error, 'READING smc ID' )
- error=nf90_get_var(ncid, id_var, dummy3d)
- call netcdf_err(error, 'READING smc' )
- print*,'- SMC: ',maxval(dummy3d),minval(dummy3d)
- tile_data%smc(istart:iend,1:4) = reshape(dummy3d, (/ijtile,4/))
-
- error=nf90_inq_varid(ncid, "stc", id_var)
- call netcdf_err(error, 'READING stc ID' )
- error=nf90_get_var(ncid, id_var, dummy3d)
- call netcdf_err(error, 'READING stc' )
- print*,'- STC: ',maxval(dummy3d),minval(dummy3d)
- tile_data%stc(istart:iend,1:4) = reshape(dummy3d, (/ijtile,4/))
-
- error=nf90_inq_varid(ncid, "slc", id_var)
- call netcdf_err(error, 'READING slc ID' )
- error=nf90_get_var(ncid, id_var, dummy3d)
- call netcdf_err(error, 'READING slc' )
- print*,'- SLC: ',maxval(dummy3d),minval(dummy3d)
- tile_data%slc(istart:iend,1:4) = reshape(dummy3d, (/ijtile,4/))
-
- error = nf90_close(ncid)
-
- print*
- print*, "- READ INPUT OROG DATA FOR TILE: ",tile
-
- if (tile==1) error=nf90_open("./orog.tile1.nc",nf90_nowrite,ncid)
- if (tile==2) error=nf90_open("./orog.tile2.nc",nf90_nowrite,ncid)
- if (tile==3) error=nf90_open("./orog.tile3.nc",nf90_nowrite,ncid)
- if (tile==4) error=nf90_open("./orog.tile4.nc",nf90_nowrite,ncid)
- if (tile==5) error=nf90_open("./orog.tile5.nc",nf90_nowrite,ncid)
- if (tile==6) error=nf90_open("./orog.tile6.nc",nf90_nowrite,ncid)
-
- call netcdf_err(error, 'OPENING FILE' )
-
- error=nf90_inq_varid(ncid, "orog_raw", id_var)
- call netcdf_err(error, 'READING orog_raw ID' )
- error=nf90_get_var(ncid, id_var, dummy)
- call netcdf_err(error, 'READING orog_raw' )
- print*,'- OROG: ',maxval(dummy),minval(dummy)
- tile_data%orog(istart:iend) = reshape(dummy, (/ijtile/))
-
- error = nf90_close(ncid)
-
- enddo
-
- deallocate (dummy, dummy3d)
-
- end subroutine read_data_anl
-
-!-------------------------------------------------------------------------------------------
-! Netcdf error routine.
-!-------------------------------------------------------------------------------------------
-
- subroutine netcdf_err(err, string)
-
- use netcdf
-
- implicit none
-
- character(len=*), intent(in) :: string
- integer, intent(in) :: err
-
- character(len=256) :: errmsg
-
- if( err.eq.nf90_noerr )return
-
- errmsg = nf90_strerror(err)
- print*,''
- print*,'** FATAL ERROR: ', trim(string), ': ', trim(errmsg)
- print*,'STOP.'
- call errexit(22)
-
- return
- end subroutine netcdf_err
diff --git a/sorc/gaussian_sfcanl.fd/weight_gen/README b/sorc/gaussian_sfcanl.fd/weight_gen/README
deleted file mode 100644
index 10294dfc33c..00000000000
--- a/sorc/gaussian_sfcanl.fd/weight_gen/README
+++ /dev/null
@@ -1,23 +0,0 @@
-Creates the ESMF integration weight files to transform from cubed-sphere grids
-to comparable (in resolution) global gaussian grids.
-
-First, compile the program that creates the 'scrip' files for the
-global gaussian grids. For each resolution, two grids are created:
-one normal grid and one grid with two extra rows for the N/S poles.
-To compile, cd to ./scrip.fd and type 'make.sh'. Currently, only
-compiles/runs on Theia.
-
-Then, run the 'run.theia.ksh' script for the resolution desired.
-Script first calls the 'scrip' program, then calls ESMF utility
-'RegridWeightGen' to create the interpolation weight files.
-
-Weight files for the following transforms are created:
-
-C48 => 192x94 and 192x96 gaussian
-C96 => 384x192 and 384x194 gaussian
-C128 => 512x256 and 512x258 gaussian
-C192 => 768x384 and 768x386 gaussian
-C384 => 1536x768 and 1536x770 gaussian
-C768 => 3072x1536 and 3072x1538 gaussian
-C1152 => 4608x2304 and 4608x2406 gaussian
-C3072 => 12288x6144 and 12288x6146 gaussian
diff --git a/sorc/gaussian_sfcanl.fd/weight_gen/run.theia.sh b/sorc/gaussian_sfcanl.fd/weight_gen/run.theia.sh
deleted file mode 100755
index afcd0f18eca..00000000000
--- a/sorc/gaussian_sfcanl.fd/weight_gen/run.theia.sh
+++ /dev/null
@@ -1,152 +0,0 @@
-#!/bin/sh
-
-#------------------------------------------------------------------------
-# Run the "RegridWeightGen" step on Theia to create interpolation
-# weight files to transform from cubed-sphere tiles to global
-# gaussian.
-#
-# First, create the 'scrip' files for the gaussian grids. Two
-# grids are created - the normal gaussian grid, and one with
-# two extra rows at the N/S poles. The program to create the
-# script files is in ./scrip.fd. To compile, type 'make.sh'.
-# Then, run the RegridWeightGen step to create the interpolation
-# weight files.
-#------------------------------------------------------------------------
-
-#PBS -l nodes=1:ppn=1
-#PBS -l walltime=0:30:00
-#PBS -A fv3-cpu
-#PBS -q debug
-#PBS -N fv3_wgtgen
-#PBS -o ./log
-#PBS -e ./log
-
-set -x
-
-CRES=C48 # run with one mpi task
-#CRES=C96 # run with one mpi task
-#CRES=C128 # run with one mpi task
-#CRES=C192 # run with one mpi task
-#CRES=C384 # run with one mpi task
-#CRES=C768 # run with 4 mpi tasks
-#CRES=C1152 # run with 8 mpi tasks
-#CRES=C3072 # run on two nodes, 8 tasks per node
-
-WORK=/scratch3/NCEPDEV/stmp1/$LOGNAME/weight_gen
-rm -fr $WORK
-mkdir -p $WORK
-cd $WORK
-
-source /apps/lmod/lmod/init/sh
-module purge
-module load intel/15.1.133
-module load impi/5.0.1.035
-module use /scratch4/NCEPDEV/nems/noscrub/emc.nemspara/soft/modulefiles
-module load esmf/7.1.0r
-module load netcdf/4.3.0
-module load hdf5/1.8.14
-
-#------------------------------------------------------------------------
-# The RegridWeightGen program.
-#------------------------------------------------------------------------
-
-RWG=/scratch4/NCEPDEV/nems/noscrub/emc.nemspara/soft/esmf/7.1.0r/bin/ESMF_RegridWeightGen
-
-#------------------------------------------------------------------------
-# Path to the 'mosaic' and 'grid' files for each cubed-sphere
-# resolution.
-#------------------------------------------------------------------------
-
-FIX_DIR=/scratch4/NCEPDEV/global/save/glopara/svn/fv3gfs/fix/fix_fv3_gmted2010/$CRES
-
-#------------------------------------------------------------------------
-# Create 'scrip' files for two gaussian grids. One normal grid
-# and one with two extra rows at the N/S poles.
-#------------------------------------------------------------------------
-
-${PBS_O_WORKDIR}/scrip.fd/scrip.exe $CRES
-
-if [[ $? -ne 0 ]]; then
- echo "ERROR CREATING SCRIP FILE"
- exit 2
-fi
-
-#------------------------------------------------------------------------
-# Create weight files.
-#------------------------------------------------------------------------
-
-case $CRES in
- "C48" )
- LONB="192"
- LATB="94"
- LATB2="96"
- ;;
- "C96" )
- LONB="384"
- LATB="192"
- LATB2="194"
- ;;
- "C128" )
- LONB="512"
- LATB="256"
- LATB2="258"
- ;;
- "C192" )
- LONB="768"
- LATB="384"
- LATB2="386"
- ;;
- "C384" )
- LONB="1536"
- LATB="768"
- LATB2="770"
- ;;
- "C768" )
- LONB="3072"
- LATB="1536"
- LATB2="1538"
- ;;
- "C1152" )
- LONB="4608"
- LATB="2304"
- LATB2="2306"
- ;;
- "C3072" )
- LONB="12288"
- LATB="6144"
- LATB2="6146"
- ;;
- * )
- echo "GRID NOT SUPPORTED"
- exit 3
- ;;
-esac
-
-np=$PBS_NP
-
-mpirun -np $np $RWG -d ./gaussian.${LONB}.${LATB}.nc -s $FIX_DIR/${CRES}_mosaic.nc \
-  -w fv3_SCRIP_${CRES}_GRIDSPEC_lon${LONB}_lat${LATB}.gaussian.neareststod.nc \
-  -m neareststod --64bit_offset --tilefile_path $FIX_DIR
-
-mpirun -np $np $RWG -d ./gaussian.${LONB}.${LATB}.nc -s $FIX_DIR/${CRES}_mosaic.nc \
-  -w fv3_SCRIP_${CRES}_GRIDSPEC_lon${LONB}_lat${LATB}.gaussian.bilinear.nc \
-  -m bilinear --64bit_offset --tilefile_path $FIX_DIR
-
-mpirun -np $np $RWG -d ./gaussian.${LONB}.${LATB2}.nc -s $FIX_DIR/${CRES}_mosaic.nc \
-  -w fv3_SCRIP_${CRES}_GRIDSPEC_lon${LONB}_lat${LATB2}.gaussian.neareststod.nc \
-  -m neareststod --64bit_offset --tilefile_path $FIX_DIR
-
-#------------------------------------------------------------------------
-# Could not get this C3072 bilinear option to work. This grid is
-# so big we are pushing the limits of the utility.
-#------------------------------------------------------------------------
-
-if [[ $CRES == "C3072" ]]; then
- exit 0
-fi
-
-mpirun -np $np $RWG -d ./gaussian.${LONB}.${LATB2}.nc -s $FIX_DIR/${CRES}_mosaic.nc \
-  -w fv3_SCRIP_${CRES}_GRIDSPEC_lon${LONB}_lat${LATB2}.gaussian.bilinear.nc \
-  -m bilinear --64bit_offset --tilefile_path $FIX_DIR
-
-exit
diff --git a/sorc/gaussian_sfcanl.fd/weight_gen/scrip.fd/scrip.f90 b/sorc/gaussian_sfcanl.fd/weight_gen/scrip.fd/scrip.f90
deleted file mode 100644
index 5c4d2a42214..00000000000
--- a/sorc/gaussian_sfcanl.fd/weight_gen/scrip.fd/scrip.f90
+++ /dev/null
@@ -1,350 +0,0 @@
- program scrip
-
-!----------------------------------------------------------------------
-! Create "scrip" files that describes a gaussian grid.
-! Two files are created: the normal gaussian grid and one with
-! two extra rows for the N/S poles.
-!----------------------------------------------------------------------
-
- implicit none
-
- character(len=128) :: outfile
- character(len=20) :: title
- character(len=5) :: idim_ch, jdim_ch, jdimp_ch
- character(len=6) :: cres
-
- integer :: header_buffer_val = 16384
- integer :: fsize=65536, inital = 0
- integer :: error, ncid
- integer :: i, j, idim, jdim, ijdim
- integer :: jdimp
- integer :: dim_size, dim_corners, dim_rank
- integer :: id_dims, id_center_lat, id_center_lon
- integer :: id_imask, id_corner_lat, id_corner_lon
- integer :: num_corners = 4
- integer :: rank = 2
- integer(kind=4), allocatable :: mask(:)
-
- real(kind=8) :: corner_lon_src
- real(kind=8) :: dx_src, lat_edge
- real(kind=8), allocatable :: lats(:,:), lons(:,:), dum1d(:)
- real(kind=8), allocatable :: dum2d(:,:), latsp(:,:), lonsp(:,:)
- real(kind=8), allocatable :: lats_corner(:,:,:), lons_corner(:,:,:)
- real(kind=8), allocatable :: latsp_corner(:,:,:), lonsp_corner(:,:,:)
- real(kind=8), allocatable :: slat(:), wlat(:)
-
- include "netcdf.inc"
-
- call getarg(1, cres)
-
- select case (trim(cres))
- case ("c48","C48")
- idim = 192 ! cres * 4
- jdim = 94 ! cres * 2
- jdimp = 96 ! include two rows for the poles
- idim_ch = "192"
- jdim_ch = "94"
- jdimp_ch = "96"
- case ("c96","C96")
- idim = 384 ! cres * 4
- jdim = 192 ! cres * 2
- jdimp = 194 ! include two rows for the poles
- idim_ch = "384"
- jdim_ch = "192"
- jdimp_ch = "194"
- case ("c128","C128")
- idim = 512 ! cres * 4
- jdim = 256 ! cres * 2
- jdimp = 258 ! include two rows for the poles
- idim_ch = "512"
- jdim_ch = "256"
- jdimp_ch = "258"
- case ("c192","C192")
- idim = 768 ! cres * 4
- jdim = 384 ! cres * 2
- jdimp = 386 ! include two rows for the poles
- idim_ch = "768"
- jdim_ch = "384"
- jdimp_ch = "386"
- case ("c384","C384")
- idim = 1536 ! cres * 4
- jdim = 768 ! cres * 2
- jdimp = 770 ! include two rows for the poles
- idim_ch = "1536"
- jdim_ch = "768"
- jdimp_ch = "770"
- case ("c768","C768")
- idim = 3072 ! cres * 4
- jdim = 1536 ! cres * 2
- jdimp = 1538 ! include two rows for the poles
- idim_ch = "3072"
- jdim_ch = "1536"
- jdimp_ch = "1538"
- case ("c1152","C1152")
- idim = 4608 ! cres * 4
- jdim = 2304 ! cres * 2
- jdimp = 2306 ! include two rows for the poles
- idim_ch = "4608"
- jdim_ch = "2304"
- jdimp_ch = "2306"
- case ("c3072","C3072")
- idim = 12288 ! cres * 4
- jdim = 6144 ! cres * 2
- jdimp = 6146 ! include two rows for the poles
- idim_ch = "12288"
- jdim_ch = "6144"
- jdimp_ch = "6146"
- case default
- print*,'- Resolution not supported ', trim(cres)
- stop 3
- end select
-
- corner_lon_src = 0.0
- dx_src = 360.0 / float(idim)
- ijdim = idim*jdim
-
- allocate(slat(jdim))
- allocate(wlat(jdim))
-
- call splat(4, jdim, slat, wlat)
-
- allocate(lats(idim,jdim))
- allocate(lats_corner(num_corners,idim,jdim))
- allocate(lons(idim,jdim))
- allocate(lons_corner(num_corners,idim,jdim))
-
- do j = 1, jdim
- lats(:,j) = 90.0 - (acos(slat(j))* 180.0 / (4.*atan(1.)))
- enddo
-
- deallocate(slat, wlat)
-
-!----------------------------------------------------------------
-! First, output file without poles.
-!----------------------------------------------------------------
-
-!----------------------------------------------------------------
-! Set corners in counter-clockwise order
-!
-! 2     1
-!
-!    C
-!
-! 3     4
-!----------------------------------------------------------------
-
- lats_corner(1,:,1) = 90.0
- lats_corner(2,:,1) = 90.0
-
- lats_corner(3,:,jdim) = -90.0
- lats_corner(4,:,jdim) = -90.0
-
- do j = 1, jdim - 1
- lat_edge = (lats(1,j) + lats(1,j+1)) / 2.0
- lats_corner(3,:,j) = lat_edge
- lats_corner(4,:,j) = lat_edge
- lats_corner(1,:,j+1) = lat_edge
- lats_corner(2,:,j+1) = lat_edge
- enddo
-
- do i = 1, idim
- lons(i,:) = corner_lon_src + float(i-1)*dx_src
- lons_corner(1,i,:) = lons(i,:) + (dx_src*0.5)
- lons_corner(2,i,:) = lons(i,:) - (dx_src*0.5)
- lons_corner(3,i,:) = lons(i,:) - (dx_src*0.5)
- lons_corner(4,i,:) = lons(i,:) + (dx_src*0.5)
- enddo
-
- i = 1
- j = 1
- print*,'center ',lats(i,j),lons(i,j)
- print*,'corner 1 ',lats_corner(1,i,j),lons_corner(1,i,j)
- print*,'corner 2 ',lats_corner(2,i,j),lons_corner(2,i,j)
- print*,'corner 3 ',lats_corner(3,i,j),lons_corner(3,i,j)
- print*,'corner 4 ',lats_corner(4,i,j),lons_corner(4,i,j)
-
- i = 1
- j = 2
- print*,'center ',lats(i,j),lons(i,j)
- print*,'corner 1 ',lats_corner(1,i,j),lons_corner(1,i,j)
- print*,'corner 2 ',lats_corner(2,i,j),lons_corner(2,i,j)
- print*,'corner 3 ',lats_corner(3,i,j),lons_corner(3,i,j)
- print*,'corner 4 ',lats_corner(4,i,j),lons_corner(4,i,j)
-
- i = 1
- j = jdim
- print*,'center ',lats(i,j),lons(i,j)
- print*,'corner 1 ',lats_corner(1,i,j),lons_corner(1,i,j)
- print*,'corner 2 ',lats_corner(2,i,j),lons_corner(2,i,j)
- print*,'corner 3 ',lats_corner(3,i,j),lons_corner(3,i,j)
- print*,'corner 4 ',lats_corner(4,i,j),lons_corner(4,i,j)
-
- i = 1
- j = jdim-1
- print*,'center ',lats(i,j),lons(i,j)
- print*,'corner 1 ',lats_corner(1,i,j),lons_corner(1,i,j)
- print*,'corner 2 ',lats_corner(2,i,j),lons_corner(2,i,j)
- print*,'corner 3 ',lats_corner(3,i,j),lons_corner(3,i,j)
- print*,'corner 4 ',lats_corner(4,i,j),lons_corner(4,i,j)
-
- allocate(mask(ijdim))
- mask = 1
-
-! output file without pole.
-
- outfile = " "
- outfile = "./gaussian." // trim(idim_ch) // "." // trim(jdim_ch) // ".nc"
- title = " "
- title = "gaussian." // trim(idim_ch) // "." // trim(jdim_ch)
-
-!--- open the file
- error = NF__CREATE(outfile, IOR(NF_NETCDF4,NF_CLASSIC_MODEL), inital, fsize, ncid)
- print*, 'error after open ', error
-
-!--- define dimension
- error = nf_def_dim(ncid, 'grid_size', ijdim, dim_size)
- error = nf_def_dim(ncid, 'grid_corners', num_corners, dim_corners)
- error = nf_def_dim(ncid, 'grid_rank', rank, dim_rank)
-
-!--- define field
- error = nf_def_var(ncid, 'grid_dims', NF_INT, 1, (/dim_rank/), id_dims)
- error = nf_def_var(ncid, 'grid_center_lat', NF_DOUBLE, 1, (/dim_size/), id_center_lat)
- error = nf_put_att_text(ncid, id_center_lat, "units", 7, "degrees")
- error = nf_def_var(ncid, 'grid_center_lon', NF_DOUBLE, 1, (/dim_size/), id_center_lon)
- error = nf_put_att_text(ncid, id_center_lon, "units", 7, "degrees")
- error = nf_def_var(ncid, 'grid_imask', NF_INT, 1, (/dim_size/), id_imask)
- error = nf_put_att_text(ncid, id_imask, "units", 8, "unitless")
- error = nf_def_var(ncid, 'grid_corner_lat', NF_DOUBLE, 2, (/dim_corners,dim_size/), id_corner_lat)
- error = nf_put_att_text(ncid, id_corner_lat, "units", 7, "degrees")
- error = nf_def_var(ncid, 'grid_corner_lon', NF_DOUBLE, 2, (/dim_corners,dim_size/), id_corner_lon)
- error = nf_put_att_text(ncid, id_corner_lon, "units", 7, "degrees")
- error = nf_put_att_text(ncid, NF_GLOBAL, "title", 20, trim(title))
- error = nf__enddef(ncid, header_buffer_val,4,0,4)
-
-!--- set fields
- error = nf_put_var_int( ncid, id_dims, (/idim,jdim/))
-
- allocate(dum1d(ijdim))
- dum1d = reshape(lats, (/ijdim/))
- error = nf_put_var_double( ncid, id_center_lat, dum1d)
- dum1d = reshape(lons, (/ijdim/))
- error = nf_put_var_double( ncid, id_center_lon, dum1d)
- deallocate(dum1d)
-
- error = nf_put_var_int(
ncid, id_imask, mask) - deallocate(mask) - - allocate(dum2d(num_corners,ijdim)) - dum2d = reshape (lats_corner, (/num_corners,ijdim/)) - error = nf_put_var_double( ncid, id_corner_lat, dum2d) - - dum2d = reshape (lons_corner, (/num_corners,ijdim/)) - error = nf_put_var_double( ncid, id_corner_lon, dum2d) - deallocate(dum2d) - - error = nf_close(ncid) - -!---------------------------------------------------------------- -! output file with poles. -!---------------------------------------------------------------- - - outfile = " " - outfile = "./gaussian." // trim(idim_ch) // "." // trim(jdimp_ch) // ".nc" - title = " " - title = "gaussian." // trim(idim_ch) // "." // trim(jdimp_ch) - - ijdim = idim*jdimp - - allocate(latsp(idim,jdimp)) - allocate(lonsp(idim,jdimp)) - - do j = 2, jdim+1 - latsp(:,j) = lats(:,j-1) - lonsp(:,j) = lons(:,j-1) - enddo - - latsp(:,1) = 90.0_8 - lonsp(:,1) = 0.0_8 - - latsp(:,jdimp) = -90.0_8 - lonsp(:,jdimp) = 0.0_8 - - deallocate(lats, lons) - - allocate(latsp_corner(num_corners,idim,jdimp)) - allocate(lonsp_corner(num_corners,idim,jdimp)) - - latsp_corner(:,:,1) = 89.5_8 - latsp_corner(:,:,jdimp) = -89.5_8 - - lonsp_corner(1,:,1) = 0.0_8 - lonsp_corner(2,:,1) = 90.0_8 - lonsp_corner(3,:,1) = 180.0_8 - lonsp_corner(4,:,1) = 270.0_8 - - lonsp_corner(1,:,jdimp) = 0.0_8 - lonsp_corner(2,:,jdimp) = 90.0_8 - lonsp_corner(3,:,jdimp) = 180.0_8 - lonsp_corner(4,:,jdimp) = 270.0_8 - - do j = 2, jdim+1 - latsp_corner(:,:,j) = lats_corner(:,:,j-1) - lonsp_corner(:,:,j) = lons_corner(:,:,j-1) - enddo - - deallocate(lats_corner, lons_corner) - -!--- open the file - error = NF__CREATE(outfile, IOR(NF_NETCDF4,NF_CLASSIC_MODEL), inital, fsize, ncid) - print*, 'error after open ', error - -!--- define dimension - error = nf_def_dim(ncid, 'grid_size', ijdim, dim_size) - error = nf_def_dim(ncid, 'grid_corners', num_corners, dim_corners) - error = nf_def_dim(ncid, 'grid_rank', rank, dim_rank) - -!--- define field - error = nf_def_var(ncid, 'grid_dims', 
NF_INT, 1, (/dim_rank/), id_dims) - error = nf_def_var(ncid, 'grid_center_lat', NF_DOUBLE, 1, (/dim_size/), id_center_lat) - error = nf_put_att_text(ncid, id_center_lat, "units", 7, "degrees") - error = nf_def_var(ncid, 'grid_center_lon', NF_DOUBLE, 1, (/dim_size/), id_center_lon) - error = nf_put_att_text(ncid, id_center_lon, "units", 7, "degrees") - error = nf_def_var(ncid, 'grid_imask', NF_INT, 1, (/dim_size/), id_imask) - error = nf_put_att_text(ncid, id_imask, "units", 8, "unitless") - error = nf_def_var(ncid, 'grid_corner_lat', NF_DOUBLE, 2, (/dim_corners,dim_size/), id_corner_lat) - error = nf_put_att_text(ncid, id_corner_lat, "units", 7, "degrees") - error = nf_def_var(ncid, 'grid_corner_lon', NF_DOUBLE, 2, (/dim_corners,dim_size/), id_corner_lon) - error = nf_put_att_text(ncid, id_corner_lon, "units", 7, "degrees") - error = nf_put_att_text(ncid, NF_GLOBAL, "title", 20, trim(title)) - error = nf__enddef(ncid, header_buffer_val,4,0,4) - -!--- set fields - error = nf_put_var_int( ncid, id_dims, (/idim,jdimp/)) - - allocate(dum1d(ijdim)) - dum1d = reshape(latsp, (/ijdim/)) - error = nf_put_var_double( ncid, id_center_lat, dum1d) - dum1d = reshape(lonsp, (/ijdim/)) - error = nf_put_var_double( ncid, id_center_lon, dum1d) - deallocate(dum1d) - - allocate(mask(ijdim)) - mask = 1 - error = nf_put_var_int( ncid, id_imask, mask) - deallocate(mask) - - allocate(dum2d(num_corners,ijdim)) - dum2d = reshape (latsp_corner, (/num_corners,ijdim/)) - print*,'lat corner check ',maxval(dum2d),minval(dum2d) - error = nf_put_var_double( ncid, id_corner_lat, dum2d) - deallocate(latsp_corner) - - dum2d = reshape (lonsp_corner, (/num_corners,ijdim/)) - error = nf_put_var_double( ncid, id_corner_lon, dum2d) - deallocate(dum2d, lonsp_corner) - - error = nf_close(ncid) - - print*,'- DONE.' 
- - end program scrip diff --git a/sorc/gfs_bufr.fd/bfrhdr.f b/sorc/gfs_bufr.fd/bfrhdr.f deleted file mode 100644 index 8bab3043bc6..00000000000 --- a/sorc/gfs_bufr.fd/bfrhdr.f +++ /dev/null @@ -1,174 +0,0 @@ - SUBROUTINE BFRHDR ( luntbl, cseqn, prfflg, clist, np, iret ) -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: PROGRAM NAME (up to 20 characters) -C PRGMMR: YOUR NAME ORG: W/NMCXX DATE: YY-MM-DD -C -C ABSTRACT: START ABSTRACT HERE AND INDENT TO COLUMN 5 ON THE -C FOLLOWING LINES. PLEASE PROVIDE A BRIEF DESCRIPTION OF -C WHAT THE SUBPROGRAM DOES. -C -C PROGRAM HISTORY LOG: -C YY-MM-DD ORIGINAL PROGRAMMER'S NAME HERE -C YY-MM-DD MODIFIER1 DESCRIPTION OF CHANGE -C YY-MM-DD MODIFIER2 DESCRIPTION OF CHANGE -C -C USAGE: CALL PROGRAM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C LANGUAGE: INDICATE EXTENSIONS, COMPILER OPTIONS -C MACHINE: IBM SP -C -C$$$ -C*********************************************************************** -C* BFRHDR -C* -C* This subroutine reads a Jack Woollen BUFR encoding table file to -C* get the string of parameters to be written. This subroutine is -C* given the sequence nmemonic and returns the list associated with it. -C* This list is a character string and is used as the last input to -C* UFBINT. 
-C* -C* -C* BFRHDR ( LUNTBL, CSEQN, PRFFLG, CLIST, NP, IRET ) -C* -C* Input parameters: -C* LUNTBL INTEGER Unit number of BUFR Table file -C* CSEQN CHAR* Sequence mnemonic -C* PRFFLG LOGICAL Flag for profile parms -C* = .true. for multi-level parms -C* -C* Output parameters: -C* CLIST CHAR* String of parm names -C* NP INTEGER Number of parm names in string -C* IRET INTEGER Return code -C* 0 = normal return -C* -1 = Improper table file -C* -2 = Sequence not found -C** -C* Log: -C* K. Brill/NMC 05/94 -C*********************************************************************** -C* - CHARACTER*(*) cseqn, clist - LOGICAL prfflg -C* - LOGICAL found - CHARACTER*80 sbuf -C -C* Set starting column number of parameter list in the table. -C - DATA istart / 14 / -C----------------------------------------------------------------------- - iret = 0 -C -C* Count the number of lines to end of file (used to reposition -C* pointer to original line at the end). -C - found = .true. - lcnt = 1 - DO WHILE ( found ) - READ ( luntbl, 1000, IOSTAT=ios ) sbuf -1000 FORMAT (A) - IF ( ios .ne. 0 ) THEN - found = .false. - ELSE - lcnt = lcnt + 1 - END IF - END DO -C -C* Read from the file for positioning. -C - REWIND luntbl - found = .false. - DO WHILE ( .not. found ) - READ (luntbl, 1000, IOSTAT=ios ) sbuf - IF ( ios .ne. 0 ) THEN - iret = -1 - RETURN - END IF - iq1 = INDEX ( sbuf, '| REFERENCE' ) - iq2 = INDEX ( sbuf, '| UNITS' ) - iq = iq1 * iq2 - IF ( iq .ne. 0 ) found = .true. - END DO -C -C* Get length of sequence mnemonic string. -C - lc = LEN ( cseqn ) - DO WHILE ( cseqn ( lc:lc ) .eq. ' ' ) - lc = lc-1 - END DO -C -C* Start searching backward for the sequence mnemonic. -C - found = .false. - lenc=0 - DO WHILE ( .not. found ) - BACKSPACE luntbl - READ ( luntbl, 1000, IOSTAT=ios ) sbuf - IF ( ios .ne. 0 .or. sbuf (1:2) .eq. '.-' ) THEN - iret = -2 - RETURN - END IF - BACKSPACE luntbl - iq = INDEX ( sbuf ( 1:14 ), cseqn ( 1:lc ) ) - IF ( iq .ne. 0 ) THEN - found = .true. 
-C -C* Find the last character of last parameter. -C - i = 79 - DO WHILE ( sbuf ( i:i ) .eq. ' ' ) - i = i - 1 - END DO - clist = ' ' - clist = sbuf ( istart:i ) -C -C* Count the number of entries in CLIST. -C - lenc = i - istart + 1 - nspcs = 0 - np = 0 - DO j = 1, lenc - IF ( clist ( j:j ) .eq. ' ' ) nspcs = nspcs + 1 - END DO - np = nspcs + 1 -C -C* Handle profile sequence. -C - IF ( prfflg ) THEN -C sbuf = cseqn ( 1:lc ) // '^ ' // clist ( 1:lenc ) - sbuf = clist ( 1:lenc ) - clist = sbuf - END IF - END IF - END DO -C -C* Reposition file to original record. -C - found = .true. - DO WHILE ( found ) - READ ( luntbl, 1000, IOSTAT=ios ) sbuf - IF ( ios .ne. 0 ) found = .false. - END DO - DO k = 1, lcnt - BACKSPACE luntbl - END DO -C* - RETURN - END diff --git a/sorc/gfs_bufr.fd/bfrize.f b/sorc/gfs_bufr.fd/bfrize.f deleted file mode 100644 index 1183c62f34d..00000000000 --- a/sorc/gfs_bufr.fd/bfrize.f +++ /dev/null @@ -1,241 +0,0 @@ - SUBROUTINE BFRIZE ( luntbl, lunbfr, sbset, iyr, imn, idy, ihr, - + seqnam, seqflg, nseq, lvlwise, data, nlvl, - + clist, npp, wrkd, iret ) -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: PROGRAM NAME (up to 20 characters) -C PRGMMR: YOUR NAME ORG: W/NMCXX DATE: YY-MM-DD -C -C ABSTRACT: START ABSTRACT HERE AND INDENT TO COLUMN 5 ON THE -C FOLLOWING LINES. PLEASE PROVIDE A BRIEF DESCRIPTION OF -C WHAT THE SUBPROGRAM DOES. -C -C PROGRAM HISTORY LOG: -C YY-MM-DD ORIGINAL PROGRAMMER'S NAME HERE -C YY-MM-DD MODIFIER1 DESCRIPTION OF CHANGE -C YY-MM-DD MODIFIER2 DESCRIPTION OF CHANGE -C -C USAGE: CALL PROGRAM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. 
-C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C LANGUAGE: INDICATE EXTENSIONS, COMPILER OPTIONS -C MACHINE: IBM SP -C -C$$$ -C*********************************************************************** -C* BFRIZE -C* -C* This subroutine calls Jack Woollen's BUFR encoding routines to -C* write a BUFR message to an output file. SBSET is the Mnemonic -C* for the TABLE A entry associated with this message. It appears -C* in the table referenced by LUNTBL. If LUNTBL = 0, the output -C* BUFR file is closed. -C* -C* The data in the array DATA are ordered according to the individual -C* elements of the Sequences given in SEQNAM. The contents of SEQNAM -C* and SEQFLG and, consequently of DATA, are determined by the BUFR -C* table file referenced by LUNTBL. Each entry in SEQNAM has a list of -C* parameters associated with it in the table. This list is read from -C* the table and the number of parameters is determined. This -C* information is stored in CLIST and NPP for future calls to BFRIZE. -C* If the parameters associated with the entry in SEQNAM exist on NLVL -C* levels, the corresponding array element of SEQFLG must be .true.; -C* otherwise, it is .false. -C* -C* Profile data in array DATA may be stored such that contiguous -C* elements are values of different parameters on the same level -C* (parameter-wise storage) or the same parameter on different levels -C* (level-wise storage). If LVLWISE=.false. parameter-wise storage -C* is assumed; otherwise, LVLWISE=.true. and level-wise storage is -C* assumed. 
-C* -C* The example below shows the contents of SEQNAM, SEQFLG, and DATA -C* for a case when NLVL=3, LVLWISE=.true., and the table file has the -C* following entries for the Mnemonic Sequences: -C* -C* MNEMONIC | SEQUENCE -C* -C* MODELOUT | HDR {PROF} SFC -C* HDR | RLAT RLON -C* PROF | PRES TMPK -C* SFC | PMSL PRCP -C* -C* SEQNAM and SEQFLG have the following assigned entries: -C* -C* INDEX SEQNAM SEQFLG -C* 1 HDR .false. -C* 2 PROF .true. -C* 3 SFC .false. -C* -C* DATA must contain the following values in this order: -C* -C* DATA (1) = rlat DATA (6) = tmpk (1) -C* DATA (2) = rlon DATA (7) = tmpk (2) -C* DATA (3) = pres (1) DATA (8) = tmpk (3) -C* DATA (4) = pres (2) DATA (9) = pmsl -C* DATA (5) = pres (3) DATA (10) = prcp -C* -C* The lower-case names above signify numerical values of the -C* parameters. The values of multiple level parameters are stored -C* contiguously. -C* -C* To add a new output parameter, update the table file by adding the -C* Mnemonic for the parameter to an existing Sequence or by adding -C* a new Sequence. If a new Sequence has been added, SEQNAM and -C* SEQFLG must be updated accordingly. In any case, the new output -C* parameter value must be placed in the correct position within the -C* array DATA. -C* -C* CLIST contains the lists of parameter names for each element of -C* SEQNAM. If CLIST (1) is blank, BFRHDR is called with SEQNAM and -C* SEQFLG as inputs to load the names of the parameters into CLIST; -C* otherwise, the names in CLIST are used. For every element of -C* SEQNAM there is a corresponding element of CLIST. For each element -C* of CLIST, there is a corresponding element of NPP giving the number -C* of parameter names in the list. -C* -C* DATA (i) = 10.E+10 is the missing value. -C* -C* WRKD is a scratch array and should be dimensioned the same size as -C* data. WRKD is not used if LVLWISE=.false. 
-C* -C* BFRIZE ( LUNTBL, LUNBFR, SBSET, IYR, IMN, IDY, IHR, -C* SEQNAM, SEQFLG, NSEQ, LVLWISE, DATA, NLVL, CLIST, NPP, -C* WRKD, IRET ) -C* -C* Input parameters: -C* LUNTBL INTEGER Unit number of BUFR Table file -C* LUNBFR INTEGER Unit number of BUFR data file -C* SBSET CHAR* BUFR subset name -C* IYR INTEGER 4-digit year -C* IMN INTEGER 2-digit month -C* IDY INTEGER 2-digit day -C* IHR INTEGER 2-digit cycle hour -C* SEQNAM (NSEQ) CHAR* Mnemonic Sequence names -C* SEQFLG (NSEQ) LOGICAL Multi-level flag -C* NSEQ INTEGER Number of Sequence names & flags -C* LVLWISE LOGICAL Level-wise profile data flag -C* DATA (*) REAL Data array -C* NLVL INTEGER Number of levels -C* -C* Input and Output parameters: -C* CLIST (NSEQ) CHAR* Parameter name lists -C* NPP (NSEQ) INTEGER Number of parameter names -C* -C* Output parameters: -C* WRKD (*) REAL Array of reordered profile data -C* IRET INTEGER Return code -C* 0 = normal return -C** -C* Log: -C* K. Brill/NMC 05/94 -C* K. Brill/NMC 06/94 Added LVLWISE, CLIST, NPP, WRKD -C 98-08-28 ROZWODOSKI MADE CHANGES FOR Y2K COMPLIANCE. -C*********************************************************************** - REAL*8 data (*) - INTEGER npp (*), nlvl (*) - CHARACTER*(*) seqnam (*), sbset - LOGICAL seqflg (*), lvlwise - CHARACTER*(*) clist (*) - REAL*8 wrkd (*) -C----------------------------------------------------------------------- - iret = 0 -c print*,'Bufriz.f is creating bufr file' - -C -C* Close BUFR file if LUNTBL = 0. -C - IF ( luntbl .eq. 0 ) THEN - CALL CLOSBF ( lunbfr ) - RETURN - END IF -C -C* Check the status of the output BUFR file. -C - CALL STATUS ( lunbfr, lun, iopn, imm ) - IF ( iopn .eq. 0 ) THEN - CALL SETBLOCK(1) - CALL OPENBF ( lunbfr, 'OUT', luntbl ) - CALL DATELEN ( 10 ) - END IF -C -C* Open a new message. -C - idate = iyr * 1000000 + imn * 10000 + idy * 100 + ihr -c print *, 'Bufriz idate = ', idate - CALL OPENMB ( lunbfr, sbset, idate ) -C -C* Create the parameter name lists if CLIST (1) is blank. 
-C -c print *, 'clist (1) = ', clist(1) -c print *, 'npp (1) = ', npp(1) -c print *, 'seqnam (1) = ', seqnam(1) -c print *, 'seqflg (1) = ', seqflg(1) -c print *, 'nseq = ', nseq - IF ( clist (1) .eq. ' ' ) THEN - DO is = 1, nseq - CALL BFRHDR ( luntbl, seqnam (is), seqflg (is), - + clist (is), npp (is), iret ) - IF ( iret .ne. 0 ) RETURN - END DO - END IF -C -C* Load the sequences. -C - idpntr = 1 - indxlv = 0 - DO is = 1, nseq - np = npp (is) - IF ( seqflg (is) ) THEN - indxlv = indxlv + 1 - IF ( lvlwise ) THEN -C -C* This is level-wise multi-level data. -C - istrt = idpntr - indx = 0 - DO k = 1, nlvl (indxlv) - DO ip = 1, np - indx = indx + 1 - wrkd ( indx ) = - + data ( istrt + (ip-1) * nlvl (indxlv) ) - END DO - istrt = istrt + 1 - END DO - CALL UFBINT ( lunbfr, wrkd, np, nlvl (indxlv), - + irtrn, clist (is) ) - ELSE -C -C* This is parameter-wise multi-level data. -C - CALL UFBINT ( lunbfr, data (idpntr), np, - + nlvl (indxlv), irtrn, clist (is) ) - END IF - idpntr = idpntr + np * nlvl (indxlv) - ELSE -C -C* This is single-level data. -C - CALL UFBINT ( lunbfr, data (idpntr), - + np, 1, irtrn, clist (is) ) - idpntr = idpntr + np - END IF - END DO - CALL WRITSB ( lunbfr ) -C* - RETURN - END diff --git a/sorc/gfs_bufr.fd/buff.f b/sorc/gfs_bufr.fd/buff.f deleted file mode 100644 index 5441fbf5a84..00000000000 --- a/sorc/gfs_bufr.fd/buff.f +++ /dev/null @@ -1,92 +0,0 @@ - subroutine buff(nint1,nend1,nint3,nend3,npoint,idate,jdate,levs, - & dird,lss,istat,sbset,seqflg,clist,npp,wrkd) - character*150 dird, fnbufr, fmto -!! 
integer nint, nend, npoint, idate(4), levs, jdate - integer nint1, nend1, nint3, nend3 - integer npoint, idate(4), levs, jdate - real*8 data(6*levs+25), wrkd(1) - integer idtln, nf, nfile, np - integer lss, istat(npoint), ios - CHARACTER*150 FILESEQ - CHARACTER*8 SBSET - LOGICAL SEQFLG(4) - CHARACTER*80 CLIST(4) - INTEGER NPP(4) - CHARACTER*8 SEQNAM(4) - FMTO = '(A,".",I6.6,".",I10)' - idtln = 8 - nfile = 20 -C print *, 'inside buff.f nint1,nend1,nint3,nend3,jdate=' -C print *, nint1,nend1,nint3,nend3,jdate - do nf = 0, nend1, nint1 - nfile = nfile + 1 - rewind nfile - enddo - do nf = nend1+nint3, nend3, nint3 - nfile = nfile + 1 - rewind nfile - enddo - do np = 1, npoint -C OPEN BUFR OUTPUT FILE. - write(fnbufr,fmto) dird(1:lss),istat(np),jdate - print *, ' fnbufr =', fnbufr - open(unit=19,file=fnbufr,form='unformatted', - & status='new', iostat=ios) - IF ( ios .ne. 0 ) THEN - WRITE (6,*) ' CANNOT open ', 19 - STOP - END IF - CALL OPENBF ( 19, 'OUT', 1 ) - nfile = 20 - do nf = 0, nend1, nint1 - nfile = nfile + 1 - read(nfile) data - if(np.eq.1) then - print *, ' creating bufr file for np, nfile =', - & np, nfile - endif -CC WRITE DATA MESSAGE TO BUFR OUTPUT FILE. -CC LUNTBL=-9 BECAUSE BUFR TABLE FILE NOT USED HERE. -CC SEQNAM=XXXXXX BECAUSE MNEMONIC SEQUENCE NAMES NOT USED HERE. - CALL BFRIZE ( -9, 19, SBSET, - & idate(4), iDATE(2), - & iDATE(3), iDATE(1), - & 'XXXXXX', SEQFLG, 4, .FALSE., DATA, levs, - & CLIST, NPP, WRKD, IRET ) - IF ( IRET .NE. 0 ) THEN - PRINT *,' BFRIZE FAILED ' - ENDIF -c 300 continue - enddo -C 3hourly output starts here -!! print *, 'buff.f nfile,nend1+nint3,nend3,nint3=' -!! print *, nfile,nend1+nint3,nend3,nint3 - do nf = nend1+nint3, nend3, nint3 - nfile = nfile + 1 - read(nfile) data - if(np.eq.1) then - print *, ' creating bufr file for np, nfile =', - & np, nfile - endif -C print *, 'read2 in fort(nfile) =', nfile -CC WRITE DATA MESSAGE TO BUFR OUTPUT FILE. -CC LUNTBL=-9 BECAUSE BUFR TABLE FILE NOT USED HERE. 
-CC SEQNAM=XXXXXX BECAUSE MNEMONIC SEQUENCE NAMES NOT USED HERE. - CALL BFRIZE ( -9, 19, SBSET, - & idate(4), iDATE(2), - & iDATE(3), iDATE(1), - & 'XXXXXX', SEQFLG, 4, .FALSE., DATA, levs, - & CLIST, NPP, WRKD, IRET ) - IF ( IRET .NE. 0 ) THEN - PRINT *,' BFRIZE FAILED ' - ENDIF - enddo - CALL BFRIZE ( 0, 19, SBSET, - & IDATE(4), IDATE(2), - & IDATE(3), IDATE(1), - & 'XXXXXX', SEQFLG, 4, .FALSE., DATA, levs, - & CLIST, NPP, WRKD, IRET ) - call closbf(19) - enddo - return - end diff --git a/sorc/gfs_bufr.fd/calpreciptype.f b/sorc/gfs_bufr.fd/calpreciptype.f deleted file mode 100644 index 23072313376..00000000000 --- a/sorc/gfs_bufr.fd/calpreciptype.f +++ /dev/null @@ -1,1616 +0,0 @@ -SUBROUTINE CALPRECIPTYPE(kdt,nrcm,im,ix,lm,lp1,randomno, & - xlat,xlon, & - gt0,gq0,prsl,prsi,PREC, & !input - phii,n3dfercld,TSKIN,SR,phy_f3d, & !input - DOMR,DOMZR,DOMIP,DOMS) !output -! SUBROUTINE CALPRECIPTYPE(nrcm,randomno,im,lm,lp1,T,Q,PMID,PINT,PREC, & !input -! ZINT,n3dfercld,TSKIN,SR,F_RimeF, & !input -! DOMR,DOMZR,DOMIP,DOMS) !output -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! . . . -! SUBPROGRAM: CALPRECIPTYPE COMPUTE DOMINANT PRECIP TYPE -! PRGRMMR: CHUANG ORG: W/NP2 DATE: 2008-05-28 -! -! -! ABSTRACT: -! THIS ROUTINE COMPUTES PRECIPITATION TYPE. -! . It is adopted from post but was made into a column to used by GFS model -! -! -! use vrbls3d -! use vrbls2d -! use soil -! use masks -! use params_mod -! use ctlblk_mod -! use rqstfld_mod - USE FUNCPHYS, ONLY : gfuncphys,fpvs,ftdp,fpkap,ftlcl,stma,fthe - USE PHYSCONS -!- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - implicit none -! -! INCLUDE "mpif.h" -! -! IN NGM SUBROUTINE OUTPUT WE FIND THE FOLLOWING COMMENT. -! "IF THE FOLLOWING THRESHOLD VALUES ARE CHANGED, CONTACT -! TDL/SYNOPTIC-SCALE TECHNIQUES BRANCH (PAUL DALLAVALLE -! AND JOHN JENSENIUS). THEY MAY BE USING IT IN ONE OF -! THEIR PACKING CODES." THE THRESHOLD VALUE IS 0.01 INCH -! OR 2.54E-4 METER. PRECIPITATION VALUES LESS THAN THIS -! 
THRESHOLD ARE SET TO MINUS ONE TIMES THIS THRESHOLD. - - real,PARAMETER :: PTHRESH = 0.0 -! -! SET CELCIUS TO KELVIN AND SECOND TO HOUR CONVERSION. - integer,PARAMETER :: NALG = 5 -! -! DECLARE VARIABLES. -! - integer,intent(in) :: kdt,nrcm,im,ix,lm,lp1,n3dfercld - real,intent(in) :: xlat(im),xlon(im) - real,dimension(im),intent(in) :: PREC,SR,TSKIN - real,intent(in) :: randomno(ix,nrcm) - real,dimension(ix,LM),intent(in) :: gt0,gq0,prsl,phy_f3d - real,dimension(ix,lp1),intent(in) :: prsi,phii - real,dimension(im),intent(out) :: DOMR,DOMZR,DOMIP,DOMS - INTEGER :: IWX1,IWX4,IWX5 - REAL :: IWX2,IWX3 - REAL :: ES,QC - REAL :: SLEET(NALG),RAIN(NALG),FREEZR(NALG),SNOW(NALG) - real,dimension(LM) :: T,Q,PMID,F_RimeF - real,dimension(lp1) :: pint,zint - REAL, ALLOCATABLE :: RH(:) - REAL(kind=kind_phys), ALLOCATABLE :: TD8(:) - integer :: I,IWX,ISNO,IIP,IZR,IRAIN,k,k1 - real :: time_vert,time_ncep,time_ramer,time_bourg,time_revised,& - time_dominant,btim,timef - real(kind=kind_phys) :: pv8,pr8,pk8,tr8,tdpd8,tlcl8,thelcl8 - real(kind=kind_phys) :: qwet8,t8(lm) - real(kind=kind_phys),allocatable :: twet8(:) - -! convert geopotential to height -! do l=1,lp1 -! zint(l)=zint(l)/con_g -! end do -! DON'T FORGET TO FLIP 3D ARRAYS AROUND BECAUSE GFS COUNTS FROM BOTTOM UP - - ALLOCATE ( RH(LM),TD8(LM),TWET8(LM) ) - -! Create look up table - call gfuncphys - - time_vert = 0. - time_ncep = 0. - time_ramer = 0. - time_bourg = 0. - time_revised = 0. - - do i=1,im -! print *, 'in calprecip xlat/xlon=', xlat(im),xlon(im),'levs=',lm - do k=1,lm - k1 = lm-k+1 - t8(k1) = gt0(i,k) - q(k1) = gq0(i,k) - pmid(k1) = prsl(i,k) - f_rimef(k1) = phy_f3d(i,k) - pv8 = pmid(k1)*q(k1)/(con_eps-con_epsm1*q(k1)) - td8(k1) = ftdp(pv8) - tdpd8 = t8(k1)-td8(k1) - if(pmid(k1)>=50000.)then ! only compute twet below 500mb to save time - if(tdpd8.gt.0.) 
then - pr8 = pmid(k1) - tr8 = t8(k1) - pk8 = fpkap(pr8) - tlcl8 = ftlcl(tr8,tdpd8) - thelcl8 = fthe(tlcl8,pk8*tlcl8/tr8) - call stma(thelcl8,pk8,twet8(k1),qwet8) - else - twet8(k1)=t8(k1) - endif - endif - ES = FPVS(T8(k1)) - ES = MIN(ES,PMID(k1)) - QC = CON_EPS*ES/(PMID(k1)+CON_EPSM1*ES) - RH(k1) = MAX(con_epsq,Q(k1))/QC - k1 = lp1-k+1 - pint(k1) = prsi(i,k) - zint(k1) = phii(i,k) !height in meters - enddo - pint(1) = prsi(i,lp1) - zint(1) = phii(i,lp1) - -! print*,'debug in calpreciptype: i,im,lm,lp1,xlon,xlat,prec,tskin,sr,nrcm,randomno,n3dfercld ', & -! i,im,lm,lp1,xlon(i)*57.29578,xlat(i)*57.29578,prec(i),tskin(i),sr(i), & -! nrcm,randomno(i,1:nrcm),n3dfercld -! do l=1,lm -! print*,'debug in calpreciptype: l,t,q,p,pint,z,twet', & -! l,t(l),q(l), & -! pmid(l),pint(l),zint(l),twet(l) -! end do -! print*,'debug in calpreciptype: lp1,pint,z ', lp1,pint(lp1),zint(lp1) -! end if -! end debug print statement - - CALL CALWXT(lm,lp1,T8(1),Q(1),PMID(1),PINT(1),PREC(i), & - PTHRESH,con_fvirt,con_rog,con_epsq, & - ZINT(1),IWX1,TWET8(1)) - IWX = IWX1 - ISNO = MOD(IWX,2) - IIP = MOD(IWX,4)/2 - IZR = MOD(IWX,8)/4 - IRAIN = IWX/8 - SNOW(1) = ISNO*1.0 - SLEET(1) = IIP*1.0 - FREEZR(1) = IZR*1.0 - RAIN(1) = IRAIN*1.0 -! print *, 'inside calprecip after calwxt iwx =',iwx -! DOMINANT PRECIPITATION TYPE -!GSM IF DOMINANT PRECIP TYPE IS REQUESTED, 4 MORE ALGORITHMS -!GSM WILL BE CALLED. THE TALLIES ARE THEN SUMMED IN -!GSM CALWXT_DOMINANT - - -! write(0,*)' i=',i,' lm=',lm,' lp1=',lp1,' T=',T(1),q(1),pmid(1) & -! &,' pint=',pint(1),' prec=',prec(i),' pthresh=',pthresh - - CALL CALWXT_RAMER(lm,lp1,T8(1),Q(1),PMID(1),RH(1),TD8(1), & - PINT(1),PREC(i),PTHRESH,IWX2) -! - IWX = NINT(IWX2) - ISNO = MOD(IWX,2) - IIP = MOD(IWX,4)/2 - IZR = MOD(IWX,8)/4 - IRAIN = IWX/8 - SNOW(2) = ISNO*1.0 - SLEET(2) = IIP*1.0 - FREEZR(2) = IZR*1.0 - RAIN(2) = IRAIN*1.0 -! print *, 'inside calprecip after ramer iwx=',iwx -! 
BOURGOUIN ALGORITHM - CALL CALWXT_BOURG(LM,LP1,randomno(i,1),con_g,PTHRESH, & - & T8(1),Q(1),PMID(1),PINT(1),PREC(i),ZINT(1),IWX3) - -! - IWX = NINT(IWX3) - ISNO = MOD(IWX,2) - IIP = MOD(IWX,4)/2 - IZR = MOD(IWX,8)/4 - IRAIN = IWX/8 - SNOW(3) = ISNO*1.0 - SLEET(3) = IIP*1.0 - FREEZR(3) = IZR*1.0 - RAIN(3) = IRAIN*1.0 -! print *, 'inside calprecip after bourg iwx=',iwx - -! REVISED NCEP ALGORITHM - CALL CALWXT_REVISED(LM,LP1,T8(1),Q(1),PMID(1),PINT(1),PREC(i),PTHRESH, & - con_fvirt,con_rog,con_epsq,ZINT(1),TWET8(1),IWX4) - -! - IWX = IWX4 - ISNO = MOD(IWX,2) - IIP = MOD(IWX,4)/2 - IZR = MOD(IWX,8)/4 - IRAIN = IWX/8 - SNOW(4) = ISNO*1.0 - SLEET(4) = IIP*1.0 - FREEZR(4) = IZR*1.0 - RAIN(4) = IRAIN*1.0 -! print *, 'inside calprecip after revised iwx=',iwx -! EXPLICIT ALGORITHM (UNDER 18 NOT ADMITTED WITHOUT PARENT -! OR GUARDIAN) - - IF(n3dfercld == 3) then ! Ferrier's scheme - CALL CALWXT_EXPLICIT(LM,PTHRESH, & - TSKIN(i),PREC(i),SR(i),F_RimeF(1),IWX5) - else - IWX5 = 0 - endif -! - IWX = IWX5 - ISNO = MOD(IWX,2) - IIP = MOD(IWX,4)/2 - IZR = MOD(IWX,8)/4 - IRAIN = IWX/8 - SNOW(5) = ISNO*1.0 - SLEET(5) = IIP*1.0 - FREEZR(5) = IZR*1.0 - RAIN(5) = IRAIN*1.0 -! - CALL CALWXT_DOMINANT(NALG,PREC(i),PTHRESH,RAIN(1),FREEZR(1),SLEET(1), & - SNOW(1),DOMR(i),DOMZR(i),DOMIP(i),DOMS(i)) - -! if (DOMS(i).eq.1.) then -! print *, 'Found SNOW at xlat/xlon',xlat,xlon -! elseif (DOMR(i).eq.1.) then -! print *, 'Found RAIN at xlat/xlon',xlat,xlon -! elseif(DOMZR(i).eq.1.) then -! print *, 'Found FREEZING RAIN at xlat/xlon',xlat,xlon -! elseif(DOMIP(i).eq.1.) then -! print *, 'Found ICE at xlat/xlon',xlat,xlon -! endif -! print *, 'In calpre DOMS,DOMR,DOMZR,DOMIP =', int(DOMS),int(DOMR),int(DOMZR),int(DOMIP) - - enddo ! end loop for i - - DEALLOCATE (TWET8,RH,TD8) - RETURN - END -! -!&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& -! - SUBROUTINE CALWXT(lm,lp1,T,Q,PMID,PINT,PREC, & - PTHRESH,D608,ROG,EPSQ, & - ZINT,IWX,TWET) -! -! FILE: CALWXT.f -! 
WRITTEN: 11 NOVEMBER 1993, MICHAEL BALDWIN -! REVISIONS: -! 30 SEPT 1994-SETUP NEW DECISION TREE (M BALDWIN) -! 12 JUNE 1998-CONVERSION TO 2-D (T BLACK) -! 01-10-25 H CHUANG - MODIFIED TO PROCESS HYBRID MODEL OUTPUT -! 02-01-15 MIKE BALDWIN - WRF VERSION -! -! -! ROUTINE TO COMPUTE PRECIPITATION TYPE USING A DECISION TREE -! APPROACH THAT USES VARIABLES SUCH AS INTEGRATED WET BULB TEMP -! BELOW FREEZING AND LOWEST LAYER TEMPERATURE -! -! SEE BALDWIN AND CONTORNO PREPRINT FROM 13TH WEATHER ANALYSIS -! AND FORECASTING CONFERENCE FOR MORE DETAILS -! (OR BALDWIN ET AL, 10TH NWP CONFERENCE PREPRINT) -! -! use params_mod -! use ctlblk_mod -!- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - implicit none -! -! INPUT: -! T,Q,PMID,HTM,LMH,PREC,ZINT -! - integer,intent(in):: lm,lp1 -! real,intent(in):: pthresh - real,dimension(LM),intent(in) :: Q,PMID - real*8,dimension(LM),intent(in) :: T,TWET - real,dimension(LP1),intent(in) :: ZINT,PINT - integer,intent(out) :: IWX - real,intent(in) :: PREC,PTHRESH,D608,ROG,EPSQ -! real,intent(out) :: ZWET - - -! OUTPUT: -! IWX - INSTANTANEOUS WEATHER TYPE. -! ACTS LIKE A 4 BIT BINARY -! 1111 = RAIN/FREEZING RAIN/ICE PELLETS/SNOW -! WHERE THE ONE'S DIGIT IS FOR SNOW -! THE TWO'S DIGIT IS FOR ICE PELLETS -! THE FOUR'S DIGIT IS FOR FREEZING RAIN -! AND THE EIGHT'S DIGIT IS FOR RAIN -! -! INTERNAL: -! -! REAL, ALLOCATABLE :: TWET(:) - real, parameter :: D00=0.0 - integer KARR,LICEE - real TCOLD,TWARM - -! SUBROUTINES CALLED: -! WETBULB -! -! -! INITIALIZE WEATHER TYPE ARRAY TO ZERO (IE, OFF). -! WE DO THIS SINCE WE WANT IWX TO REPRESENT THE -! INSTANTANEOUS WEATHER TYPE ON RETURN. -! -! -! ALLOCATE LOCAL STORAGE -! - - integer L,LICE,IWRML,IFRZL - real PSFCK,TDCHK,A,TDKL,TDPRE,TLMHK,TWRMK,AREAS8,AREAP4, & - SURFW,SURFC,DZKL,AREA1,PINTK1,PINTK2,PM150,PKL,TKL,QKL - -! ALLOCATE ( TWET(LM) ) -! -!!$omp parallel do - IWX = 0 -! ZWET=SPVAL -! -!!$omp parallel do -!!$omp& private(a,pkl,psfck,qkl,tdchk,tdkl,tdpre,tkl) - -! -! 
SKIP THIS POINT IF NO PRECIP THIS TIME STEP -! - IF (PREC.LE.PTHRESH) GOTO 800 -! -! FIND COLDEST AND WARMEST TEMPS IN SATURATED LAYER BETWEEN -! 70 MB ABOVE GROUND AND 500 MB -! ALSO FIND HIGHEST SATURATED LAYER IN THAT RANGE -! -!meb - PSFCK=PINT(LM+1) -!meb - TDCHK=2.0 - 760 TCOLD=T(LM) - TWARM=T(LM) - LICEE=LM -! - DO 775 L=1,LM - QKL=Q(L) - QKL=MAX(EPSQ,QKL) - TKL=T(L) - PKL=PMID(L) -! -! SKIP PAST THIS IF THE LAYER IS NOT BETWEEN 70 MB ABOVE GROUND -! AND 500 MB -! - IF (PKL.LT.50000.0.OR.PKL.GT.PSFCK-7000.0) GOTO 775 - A=LOG(QKL*PKL/(6.1078*(0.378*QKL+0.622))) - TDKL=(237.3*A)/(17.269-A)+273.15 - TDPRE=TKL-TDKL - IF (TDPRE.LT.TDCHK.AND.TKL.LT.TCOLD) TCOLD=TKL - IF (TDPRE.LT.TDCHK.AND.TKL.GT.TWARM) TWARM=TKL - IF (TDPRE.LT.TDCHK.AND.L.LT.LICEE) LICEE=L - 775 CONTINUE -! -! IF NO SAT LAYER AT DEW POINT DEP=TDCHK, INCREASE TDCHK -! AND START AGAIN (BUT DON'T MAKE TDCHK > 6) -! - IF (TCOLD==T(LM).AND.TDCHK<6.0) THEN - TDCHK=TDCHK+2.0 - GOTO 760 - ENDIF - 800 CONTINUE -! -! LOWEST LAYER T -! - KARR=0 - IF (PREC.LE.PTHRESH) GOTO 850 - TLMHK=T(LM) -! -! DECISION TREE TIME -! - IF (TCOLD>269.15) THEN - IF (TLMHK.LE.273.15) THEN -! TURN ON THE FLAG FOR -! FREEZING RAIN = 4 -! IF ITS NOT ON ALREADY -! IZR=MOD(IWX(I,J),8)/4 -! IF (IZR.LT.1) IWX(I,J)=IWX(I,J)+4 - IWX=IWX+4 - GOTO 850 - ELSE -! TURN ON THE FLAG FOR -! RAIN = 8 -! IF ITS NOT ON ALREADY -! IRAIN=IWX(I,J)/8 -! IF (IRAIN.LT.1) IWX(I,J)=IWX(I,J)+8 - IWX=IWX+8 - GOTO 850 - ENDIF - ENDIF - KARR=1 - 850 CONTINUE -! -! COMPUTE WET BULB ONLY AT POINTS THAT NEED IT -! -! CALL WETBULB(lm,T,Q,PMID,KARR,TWET) -! CALL WETFRZLVL(TWET,ZWET) -! -!!$omp parallel do -!!$omp& private(area1,areap4,areas8,dzkl,ifrzl,iwrml,lice, -!!$omp& lmhk,pintk1,pintk2,pm150,psfck,surfc,surfw, -!!$omp& tlmhk,twrmk) - - IF(KARR.GT.0)THEN - LICE=LICEE -!meb - PSFCK=PINT(LM+1) -!meb - TLMHK=T(LM) - TWRMK=TWARM -! -! TWET AREA VARIABLES -! CALCULATE ONLY WHAT IS NEEDED -! FROM GROUND TO 150 MB ABOVE SURFACE -! FROM GROUND TO TCOLD LAYER -! 
AND FROM GROUND TO 1ST LAYER WHERE WET BULB T < 0.0 -! -! PINTK1 IS THE PRESSURE AT THE BOTTOM OF THE LAYER -! PINTK2 IS THE PRESSURE AT THE TOP OF THE LAYER -! -! AREAP4 IS THE AREA OF TWET ABOVE -4 C BELOW HIGHEST SAT LYR -! - AREAS8=D00 - AREAP4=D00 - SURFW =D00 - SURFC =D00 -! - DO 1945 L=LM,LICE,-1 - DZKL=ZINT(L)-ZINT(L+1) - AREA1=(TWET(L)-269.15)*DZKL - IF (TWET(L).GE.269.15) AREAP4=AREAP4+AREA1 - 1945 CONTINUE -! - IF (AREAP4.LT.3000.0) THEN -! TURN ON THE FLAG FOR -! SNOW = 1 -! IF ITS NOT ON ALREADY -! ISNO=MOD(IWX(I,J),2) -! IF (ISNO.LT.1) IWX(I,J)=IWX(I,J)+1 - IWX=IWX+1 - GO TO 1900 - ENDIF -! -! AREAS8 IS THE NET AREA OF TWET W.R.T. FREEZING IN LOWEST 150MB -! - PINTK1=PSFCK - PM150=PSFCK-15000. -! - DO 1955 L=LM,1,-1 - PINTK2=PINT(L) - IF(PINTK1.LT.PM150)GO TO 1950 - DZKL=ZINT(L)-ZINT(L+1) -! -! SUM PARTIAL LAYER IF IN 150 MB AGL LAYER -! - IF(PINTK2.LT.PM150) & - DZKL=T(L)*(Q(L)*D608+1.0)*ROG*LOG(PINTK1/PM150) - AREA1=(TWET(L)-273.15)*DZKL - AREAS8=AREAS8+AREA1 - 1950 PINTK1=PINTK2 - 1955 CONTINUE -! -! SURFW IS THE AREA OF TWET ABOVE FREEZING BETWEEN THE GROUND -! AND THE FIRST LAYER ABOVE GROUND BELOW FREEZING -! SURFC IS THE AREA OF TWET BELOW FREEZING BETWEEN THE GROUND -! AND THE WARMEST SAT LAYER -! - IFRZL=0 - IWRML=0 -! - DO 2050 L=LM,1,-1 - IF (IFRZL.EQ.0.AND.T(L).LT.273.15) IFRZL=1 - IF (IWRML.EQ.0.AND.T(L).GE.TWRMK) IWRML=1 -! - IF (IWRML.EQ.0.OR.IFRZL.EQ.0) THEN -! if(pmid(l) < 50000.)print*,'need twet above 500mb' - DZKL=ZINT(L)-ZINT(L+1) - AREA1=(TWET(L)-273.15)*DZKL - IF(IFRZL.EQ.0.AND.TWET(L).GE.273.15)SURFW=SURFW+AREA1 - IF(IWRML.EQ.0.AND.TWET(L).LE.273.15)SURFC=SURFC+AREA1 - ENDIF - 2050 CONTINUE - IF(SURFC.LT.-3000.0.OR. & - (AREAS8.LT.-3000.0.AND.SURFW.LT.50.0)) THEN -! TURN ON THE FLAG FOR -! ICE PELLETS = 2 -! IF ITS NOT ON ALREADY -! IIP=MOD(IWX(I,J),4)/2 -! IF (IIP.LT.1) IWX(I,J)=IWX(I,J)+2 - IWX=IWX+2 - GOTO 1900 - ENDIF -! - IF(TLMHK.LT.273.15) THEN -! TURN ON THE FLAG FOR -! FREEZING RAIN = 4 -! IF ITS NOT ON ALREADY -! 
IZR=MOD(IWX(K),8)/4 -! IF (IZR.LT.1) IWX(K)=IWX(K)+4 - IWX=IWX+4 - ELSE -! TURN ON THE FLAG FOR -! RAIN = 8 -! IF ITS NOT ON ALREADY -! IRAIN=IWX(K)/8 -! IF (IRAIN.LT.1) IWX(K)=IWX(K)+8 - IWX=IWX+8 - ENDIF - ENDIF - 1900 CONTINUE -!--------------------------------------------------------- -! DEALLOCATE (TWET) - - RETURN - END -! -! -!ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc -! -! DoPhase is a subroutine written and provided by Jim Ramer at NOAA/FSL -! -! Ramer, J, 1993: An empirical technique for diagnosing precipitation -! type from model output. Preprints, 5th Conf. on Aviation -! Weather Systems, Vienna, VA, Amer. Meteor. Soc., 227-230. -! -! CODE ADAPTED FOR WRF POST 24 AUGUST 2005 G MANIKIN -!ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc -! - SUBROUTINE CALWXT_RAMER(lm,lp1, & - T,Q,PMID,RH,TD,PINT,PREC,PTHRESH,PTYP) - -! SUBROUTINE dophase(pq, ! input pressure sounding mb -! + t, ! input temperature sounding K -! + pmid, ! input pressure -! + pint, ! input interface pressure -! + q, ! input spec humidityfraction -! + lmh, ! input number of levels in sounding -! + prec, ! input amount of precipitation -! + ptyp) ! output(2) phase 2=Rain, 3=Frzg, 4=Solid, -! 6=IP JC 9/16/99 -! use params_mod -! use CTLBLK_mod -!- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - implicit none -! - real,PARAMETER :: twice=266.55,rhprcp=0.80,deltag=1.02, & - & emelt=0.045,rlim=0.04,slim=0.85 - real,PARAMETER :: twmelt=273.15,tz=273.15,efac=1.0 ! specify in params now -! - INTEGER*4 i, k1, lll, k2, toodry -! - REAL xxx ,mye, icefrac - integer,intent(in) :: lm,lp1 - real,DIMENSION(LM),intent(in) :: Q,PMID,RH - real*8,DIMENSION(LM),intent(in) :: T,TD - real,DIMENSION(LP1),intent(in) :: PINT - real,intent(in) :: PREC,PTHRESH - real,intent(out) :: PTYP -! - real,DIMENSION(LM) :: TQ,PQ,RHQ - real,DIMENSION(LM) :: TWQ -! 
- integer J,L,LEV,ii - real RHMAX,TWMAX,PTOP,dpdrh,twtop,rhtop,wgt1,wgt2, & - rhavg,dtavg,dpk,ptw,pbot -! real b,qtmp,rate,qc - real,external :: xmytw -! -! Initialize. - icefrac = -9999. -! - - PTYP = 0 - DO L = 1,LM - LEV = LP1 - L -! P(L)=PMID(L) -! QC=PQ0/P(L) * EXP(A2*(T(L)-A3)/(T(L)-A4)) -!GSM forcing Q (QTMP) to be positive to deal with negative Q values -! causing problems later in this subroutine -! QTMP=MAX(H1M12,Q(L)) -! RHQTMP(LEV)=QTMP/QC - RHQ(LEV) = RH(L) - PQ(LEV) = PMID(L) * 0.01 - TQ(LEV) = T(L) - enddo - - -! -! SKIP THIS POINT IF NO PRECIP THIS TIME STEP -! - IF (PREC <= PTHRESH) return - -! -!CC RATE RESTRICTION REMOVED BY JOHN CORTINAS 3/16/99 -! -! Construct wet-bulb sounding, locate generating level. - twmax = -999.0 - rhmax = 0.0 - k1 = 0 ! top of precip generating layer - k2 = 0 ! layer of maximum rh -! - IF (rhq(1) < rhprcp) THEN - toodry = 1 - ELSE - toodry = 0 - END IF -! - pbot = pq(1) -! NQ=LM - DO L = 1, lm -! xxx = tdofesat(esat(tq(L))*rhq(L)) - xxx = td(l) !HC: use TD consistent with GFS ice physics - if (xxx < -500.) return - twq(L) = xmytw(tq(L),xxx,pq(L)) - twmax = max(twq(L),twmax) - IF (pq(L) >= 400.0) THEN - IF (rhq(L) > rhmax) THEN - rhmax = rhq(L) - k2 = L - END IF -! - IF (L /= 1) THEN - IF (rhq(L) >= rhprcp .or. toodry == 0) THEN - IF (toodry /= 0) THEN - dpdrh = log(pq(L)/pq(L-1)) / (rhq(L)-RHQ(L-1)) - pbot = exp(log(pq(L))+(rhprcp-rhq(L))*dpdrh) -! - ptw = pq(L) - toodry = 0 - ELSE IF (rhq(L)>= rhprcp) THEN - ptw = pq(L) - ELSE - toodry = 1 - dpdrh = log(pq(L)/pq(L-1)) / (rhq(L)-rhq(L-1)) - ptw = exp(log(pq(L))+(rhprcp-rhq(L))*dpdrh) - -!lin dpdrh = (Pq(i)-Pq(i-1))/(Rhq(i)-Rhq(i-1)) -!lin ptw = Pq(i)+(rhprcp-Rhq(i))*dpdrh -! - END IF -! - IF (pbot/ptw >= deltag) THEN -!lin If (pbot-ptw.lt.deltag) Goto 2003 - k1 = L - ptop = ptw - END IF - END IF - END IF - END IF - enddo -! -! Gross checks for liquid and solid precip which dont require generating level. -! - IF (twq(1) >= 273.15+2.0) THEN - ptyp = 8 ! 
liquid - icefrac = 0.0 - return - END IF -! - IF (twmax <= twice) THEN - icefrac = 1.0 - ptyp = 1 ! solid - return - END IF -! -! Check to see if we had no success with locating a generating level. -! - IF (k1 == 0) return -! - IF (ptop == pq(k1)) THEN - twtop = twq(k1) - rhtop = rhq(k1) - k2 = k1 - k1 = k1 - 1 - ELSE - k2 = k1 - k1 = k1 - 1 - wgt1 = log(ptop/pq(k2)) / log(pq(k1)/pq(k2)) - wgt2 = 1.0 - wgt1 - twtop = twq(k1) * wgt1 + twq(k2) * wgt2 - rhtop = rhq(k1) * wgt1 + rhq(k2) * wgt2 - END IF -! -! Calculate temp and wet-bulb ranges below precip generating level. - DO L = 1, k1 - twmax = max(twq(l),twmax) - enddo -! -! Gross check for solid precip, initialize ice fraction. -! IF (i.eq.1.and.j.eq.1) WRITE (*,*) 'twmax=',twmax,twice,'twtop=',twtop - - IF (twtop <= twice) THEN - icefrac = 1.0 - IF (twmax <= twmelt) THEN ! gross check for solid precip. - ptyp = 1 ! solid precip - return - END IF - lll = 0 - ELSE - icefrac = 0.0 - lll = 1 - END IF -! -! Loop downward through sounding from highest precip generating level. - 30 CONTINUE -! - IF (icefrac >= 1.0) THEN ! starting as all ice - IF (twq(k1) < twmelt) GO TO 40 ! cannot commence melting - IF (twq(k1) == twtop) GO TO 40 ! both equal twmelt, nothing h - wgt1 = (twmelt-twq(k1)) / (twtop-twq(k1)) - rhavg = rhq(k1) + wgt1 * (rhtop-rhq(k1)) * 0.5 - dtavg = (twmelt-twq(k1)) * 0.5 - dpk = wgt1 * log(pq(k1)/ptop) !lin dpk=wgt1*(Pq(k1)-Ptop) -! mye=emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - ELSE IF (icefrac <= 0.0) THEN ! starting as all liquid - lll = 1 -! Goto 1020 - IF (twq(k1) > twice) GO TO 40 ! cannot commence freezing - IF (twq(k1) == twtop) THEN - wgt1 = 0.5 - ELSE - wgt1 = (twice-twq(k1)) / (twtop-twq(k1)) - END IF - rhavg = rhq(k1) + wgt1 * (rhtop-rhq(k1)) * 0.5 - dtavg = twmelt - (twq(k1)+twice) * 0.5 - dpk = wgt1 * log(pq(k1)/ptop) !lin dpk=wgt1*(Pq(k1)-Ptop) -! 
mye = emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - ELSE IF ((twq(k1) <= twmelt).and.(twq(k1) < twmelt)) THEN ! mix - rhavg = (rhq(k1)+rhtop) * 0.5 - dtavg = twmelt - (twq(k1)+twtop) * 0.5 - dpk = log(pq(k1)/ptop) !lin dpk=Pq(k1)-Ptop -! mye = emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - ELSE ! mix where Tw curve crosses twmelt in layer - IF (twq(k1) == twtop) GO TO 40 ! both equal twmelt, nothing h - wgt1 = (twmelt-twq(k1)) / (twtop-twq(k1)) - wgt2 = 1.0 - wgt1 - rhavg = rhtop + wgt2 * (rhq(k1)-rhtop) * 0.5 - dtavg = (twmelt-twtop) * 0.5 - dpk = wgt2 * log(pq(k1)/ptop) !lin dpk=wgt2*(Pq(k1)-Ptop) -! mye = emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - icefrac = min(1.0,max(icefrac,0.0)) - IF (icefrac <= 0.0) THEN -! Goto 1020 - IF (twq(k1) > twice) GO TO 40 ! cannot commence freezin - wgt1 = (twice-twq(k1)) / (twtop-twq(k1)) - dtavg = twmelt - (twq(k1)+twice) * 0.5 - ELSE - dtavg = (twmelt-twq(k1)) * 0.5 - END IF - rhavg = rhq(k1) + wgt1 * (rhtop-rhq(k1)) * 0.5 - dpk = wgt1 * log(pq(k1)/ptop) !lin dpk=wgt1*(Pq(k1)-Ptop) -! mye = emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - END IF -! - icefrac = min(1.0,max(icefrac,0.0)) - -! IF (i.eq.1.and.j.eq.1) WRITE (*,*) 'NEW ICEFRAC:', icefrac, icefrac -! -! Get next level down if there is one, loop back. - 40 continue - IF (k1 > 1) THEN - twtop = twq(k1) - ptop = pq(k1) - rhtop = rhq(k1) - k1 = k1 - 1 - GO TO 30 - END IF -! -! Determine precip type based on snow fraction and surface wet-bulb. -! - IF (icefrac >= slim) THEN - IF (lll /= 0) THEN - ptyp = 2 ! Ice Pellets JC 9/16/99 - ELSE - ptyp = 1 ! Snow - END IF - ELSE IF (icefrac <= rlim) THEN - IF (twq(1).lt.tz) THEN - ptyp = 4 ! Freezing Precip - ELSE - ptyp = 8 ! 
Rain - END IF - ELSE - IF (twq(1) < tz) THEN -!GSM not sure what to do when 'mix' is predicted; In previous -!GSM versions of this code for which I had to have an answer, -!GSM I chose sleet. Here, though, since we have 4 other -!GSM algorithms to provide an answer, I will not declare a -!GSM type from the Ramer in this situation and allow the -!GSM other algorithms to make the call. - - ptyp = 0 ! don't know -! ptyp = 5 ! Mix - ELSE -! ptyp = 5 ! Mix - ptyp = 0 ! don't know - END IF - END IF - - RETURN -! - END -! -! -!-------------------------------------------------------------------------- -! REAL*4 FUNCTION mytw(t,td,p) - FUNCTION xmytw(t,td,p) -! - IMPLICIT NONE -! - INTEGER*4 cflag, l -! REAL*4 f, c0, c1, c2, k, kd, kw, ew, t, td, p, ed, fp, s, & - REAL f, c0, c1, c2, k, kd, kw, ew, t, td, p, ed, fp, s, & - & de, xmytw - DATA f, c0, c1, c2 /0.0006355, 26.66082, 0.0091379024, 6106.3960/ -! -! - xmytw = (t+td) / 2 - IF (td.ge.t) RETURN -! - IF (t.lt.100.0) THEN - k = t + 273.15 - kd = td + 273.15 - IF (kd.ge.k) RETURN - cflag = 1 - ELSE - k = t - kd = td - cflag = 0 - END IF -! - ed = c0 - c1 * kd - c2 / kd - IF (ed.lt.-14.0.or.ed.gt.7.0) RETURN - ed = exp(ed) - ew = c0 - c1 * k - c2 / k - IF (ew.lt.-14.0.or.ew.gt.7.0) RETURN - ew = exp(ew) - fp = p * f - s = (ew-ed) / (k-kd) - kw = (k*fp+kd*s) / (fp+s) -! - DO 10 l = 1, 5 - ew = c0 - c1 * kw - c2 / kw - IF (ew.lt.-14.0.or.ew.gt.7.0) RETURN - ew = exp(ew) - de = fp * (k-kw) + ed - ew - IF (abs(de/ew).lt.1E-5) GO TO 20 - s = ew * (c1-c2/(kw*kw)) - fp - kw = kw - de / s - 10 CONTINUE - 20 CONTINUE -! -! print *, 'kw ', kw - IF (cflag.ne.0) THEN - xmytw = kw - 273.15 - ELSE - xmytw = kw - END IF -! - RETURN - END -! -! -!$$$ Subprogram documentation block -! -! Subprogram: calwxt_bourg Calculate precipitation type (Bourgouin) -! Prgmmr: Baldwin Org: np22 Date: 1999-07-06 -! -! Abstract: This routine computes precipitation type -! using a decision tree approach that uses the so-called -! 
"energy method" of Bourgouin of AES (Canada) 1992 -! -! Program history log: -! 1999-07-06 M Baldwin -! 1999-09-20 M Baldwin make more consistent with bourgouin (1992) -! 2005-08-24 G Manikin added to wrf post -! 2007-06-19 M Iredell mersenne twister, best practices -! 2008-03-03 G Manikin added checks to prevent stratospheric warming -! episodes from being seen as "warm" layers -! impacting precip type -! -! Usage: call calwxt_bourg(im,jm,jsta_2l,jend_2u,jsta,jend,lm,lp1, & -! & iseed,g,pthresh, & -! & t,q,pmid,pint,lmh,prec,zint,ptype) -! Input argument list: -! im integer i dimension -! jm integer j dimension -! jsta_2l integer j dimension start point (including haloes) -! jend_2u integer j dimension end point (including haloes) -! jsta integer j dimension start point (excluding haloes) -! jend integer j dimension end point (excluding haloes) -! lm integer k dimension -! lp1 integer k dimension plus 1 -! iseed integer random number seed -! g real gravity (m/s**2) -! pthresh real precipitation threshold (m) -! t real(im,jsta_2l:jend_2u,lm) mid layer temp (K) -! q real(im,jsta_2l:jend_2u,lm) specific humidity (kg/kg) -! pmid real(im,jsta_2l:jend_2u,lm) mid layer pressure (Pa) -! pint real(im,jsta_2l:jend_2u,lp1) interface pressure (Pa) -! lmh real(im,jsta_2l:jend_2u) max number of layers -! prec real(im,jsta_2l:jend_2u) precipitation (m) -! zint real(im,jsta_2l:jend_2u,lp1) interface height (m) -! Output argument list: -! ptype real(im,jm) instantaneous weather type () -! acts like a 4 bit binary -! 1111 = rain/freezing rain/ice pellets/snow -! where the one's digit is for snow -! the two's digit is for ice pellets -! the four's digit is for freezing rain -! and the eight's digit is for rain -! in other words... -! ptype=1 snow -! ptype=2 ice pellets/mix with ice pellets -! ptype=4 freezing rain/mix with freezing rain -! ptype=8 rain -! -! Modules used: -! mersenne_twister pseudo-random number generator -! -! Subprograms called: -! 
random_number pseudo-random number generator -! -! Attributes: -! Language: Fortran 90 -! -! Remarks: vertical order of arrays must be layer 1 = top -! and layer lmh = bottom -! -!$$$ - subroutine calwxt_bourg(lm,lp1,rn,g,pthresh, & - & t,q,pmid,pint,prec,zint,ptype) -! use mersenne_twister - implicit none -! -! input: - integer,intent(in):: lm,lp1 -! integer,intent(in):: iseed - real,intent(in):: g,pthresh,rn - real*8,intent(in):: t(lm) - real,intent(in):: q(lm) - real,intent(in):: pmid(lm) - real,intent(in):: pint(lp1) - real,intent(in):: prec - real,intent(in):: zint(lp1) -! -! output: - real,intent(out):: ptype -! - integer ifrzl,iwrml,l,lhiwrm - real pintk1,areane,tlmhk,areape,pintk2,surfw,area1,dzkl,psfck -! -! initialize weather type array to zero (ie, off). -! we do this since we want ptype to represent the -! instantaneous weather type on return. -! -!!$omp parallel do - - ptype = 0 - -! -! call random_number(rn,iseed) -! -!!$omp parallel do -!!$omp& private(a,tlmhk,iwrml,psfck,lhiwrm,pintk1,pintk2,area1, -!!$omp& areape,dzkl,surfw,r1,r2) - - psfck=pint(lm+1) -! -! skip this point if no precip this time step -! - if (prec.le.pthresh) return -! find the depth of the warm layer based at the surface -! this will be the cut off point between computing -! the surface based warm air and the warm air aloft -! -! -! lowest layer t -! - tlmhk = t(lm) - iwrml = lm + 1 - if (tlmhk.ge.273.15) then - do l = lm, 2, -1 - if (t(l).ge.273.15.and.t(l-1).lt.273.15.and. & - & iwrml.eq.lm+1) iwrml = l - end do - end if -! -! now find the highest above freezing level -! - lhiwrm = lm + 1 - do l = lm, 1, -1 -! gsm added 250 mb check to prevent stratospheric warming situations -! from counting as warm layers aloft - if (t(l).ge.273.15 .and. pmid(l).gt.25000.) lhiwrm = l - end do - -! energy variables -! surfw is the positive energy between the ground -! and the first sub-freezing layer above ground -! areane is the negative energy between the ground -! 
and the highest layer above ground -! that is above freezing -! areape is the positive energy "aloft" -! which is the warm energy not based at the ground -! (the total warm energy = surfw + areape) -! -! pintk1 is the pressure at the bottom of the layer -! pintk2 is the pressure at the top of the layer -! dzkl is the thickness of the layer -! ifrzl is a flag that tells us if we have hit -! a below freezing layer -! - pintk1 = psfck - ifrzl = 0 - areane = 0.0 - areape = 0.0 - surfw = 0.0 - - do l = lm, 1, -1 - if (ifrzl.eq.0.and.t(l).le.273.15) ifrzl = 1 - pintk2=pint(l) - dzkl=zint(l)-zint(l+1) - area1 = log(t(l)/273.15) * g * dzkl - if (t(l).ge.273.15.and. pmid(l).gt.25000.) then - if (l.lt.iwrml) areape = areape + area1 - if (l.ge.iwrml) surfw = surfw + area1 - else - if (l.gt.lhiwrm) areane = areane + abs(area1) - end if - pintk1 = pintk2 - end do - -! -! decision tree time -! - if (areape.lt.2.0) then -! very little or no positive energy aloft, check for -! positive energy just above the surface to determine rain vs. snow - if (surfw.lt.5.6) then -! not enough positive energy just above the surface -! snow = 1 - ptype = 1 - else if (surfw.gt.13.2) then -! enough positive energy just above the surface -! rain = 8 - ptype = 8 - else -! transition zone, assume equally likely rain/snow -! picking a random number, if <=0.5 snow - if (rn.le.0.5) then -! snow = 1 - ptype = 1 - else -! rain = 8 - ptype = 8 - end if - end if -! - else -! some positive energy aloft, check for enough negative energy -! to freeze and make ice pellets to determine ip vs. zr - if (areane.gt.66.0+0.66*areape) then -! enough negative area to make ip, -! now need to check if there is enough positive energy -! just above the surface to melt ip to make rain - if (surfw.lt.5.6) then -! not enough energy at the surface to melt ip -! ice pellets = 2 - ptype = 2 - else if (surfw.gt.13.2) then -! enough energy at the surface to melt ip -! rain = 8 - ptype = 8 - else -! 
transition zone, assume equally likely ip/rain -! picking a random number, if <=0.5 ip - if (rn.le.0.5) then -! ice pellets = 2 - ptype = 2 - else -! rain = 8 - ptype = 8 - end if - end if - else if (areane.lt.46.0+0.66*areape) then -! not enough negative energy to refreeze, check surface temp -! to determine rain vs. zr - if (tlmhk.lt.273.15) then -! freezing rain = 4 - ptype = 4 - else -! rain = 8 - ptype = 8 - end if - else -! transition zone, assume equally likely ip/zr -! picking a random number, if <=0.5 ip - if (rn.le.0.5) then -! still need to check positive energy -! just above the surface to melt ip vs. rain - if (surfw.lt.5.6) then -! ice pellets = 2 - ptype = 2 - else if (surfw.gt.13.2) then -! rain = 8 - ptype = 8 - else -! transition zone, assume equally likely ip/rain -! picking a random number, if <=0.5 ip - if (rn.le.0.25) then -! ice pellets = 2 - ptype = 2 - else -! rain = 8 - ptype = 8 - end if - end if - else -! not enough negative energy to refreeze, check surface temp -! to determine rain vs. zr - if (tlmhk.lt.273.15) then -! freezing rain = 4 - ptype = 4 - else -! rain = 8 - ptype = 8 - end if - end if - end if - end if -! end do -! end do - return - end -! -! - SUBROUTINE CALWXT_REVISED(LM,LP1,T,Q,PMID,PINT,PREC, & - PTHRESH,D608,ROG,EPSQ, & - & ZINT,TWET,IWX) -! -! FILE: CALWXT.f -! WRITTEN: 11 NOVEMBER 1993, MICHAEL BALDWIN -! REVISIONS: -! 30 SEPT 1994-SETUP NEW DECISION TREE (M BALDWIN) -! 12 JUNE 1998-CONVERSION TO 2-D (T BLACK) -! 01-10-25 H CHUANG - MODIFIED TO PROCESS HYBRID MODEL OUTPUT -! 02-01-15 MIKE BALDWIN - WRF VERSION -! 05-07-07 BINBIN ZHOU - ADD PREC FOR RSM -! 05-08-24 GEOFF MANIKIN - MODIFIED THE AREA REQUIREMENTS -! TO MAKE AN ALTERNATE ALGORITHM -! -! -! ROUTINE TO COMPUTE PRECIPITATION TYPE USING A DECISION TREE -! APPROACH THAT USES VARIABLES SUCH AS INTEGRATED WET BULB TEMP -! BELOW FREEZING AND LOWEST LAYER TEMPERATURE -! -! SEE BALDWIN AND CONTORNO PREPRINT FROM 13TH WEATHER ANALYSIS -! 
AND FORECASTING CONFERENCE FOR MORE DETAILS -! (OR BALDWIN ET AL, 10TH NWP CONFERENCE PREPRINT) -! -! SINCE THE ORIGINAL VERSION OF THE ALGORITHM HAS A HIGH BIAS -! FOR FREEZING RAIN AND SLEET, THE GOAL IS TO BALANCE THAT BIAS -! WITH A VERSION MORE LIKELY TO PREDICT SNOW -! -! use params_mod -! use ctlblk_mod -!- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - implicit none -! -! LIST OF VARIABLES NEEDED -! PARAMETERS: -! D608,ROG,H1,D00 -!HC PARAMETER(D608=0.608,ROG=287.04/9.8,H1=1.0,D00=0.0) -! -! INPUT: -! T,Q,PMID,HTM,LMH,PREC,ZINT - integer,intent(in):: lm,lp1 - REAL,dimension(LM),intent(in) :: Q,PMID - REAL*8,dimension(LM),intent(in) :: T,TWET - REAL,dimension(LP1),intent(in) :: PINT,ZINT - REAL,intent(in) :: PREC,PTHRESH,D608,ROG,EPSQ -! OUTPUT: -! IWX - INSTANTANEOUS WEATHER TYPE. -! ACTS LIKE A 4 BIT BINARY -! 1111 = RAIN/FREEZING RAIN/ICE PELLETS/SNOW -! WHERE THE ONE'S DIGIT IS FOR SNOW -! THE TWO'S DIGIT IS FOR ICE PELLETS -! THE FOUR'S DIGIT IS FOR FREEZING RAIN -! AND THE EIGHT'S DIGIT IS FOR RAIN - integer, intent(out) :: IWX -! INTERNAL: -! - real, parameter :: D00=0.0 - integer KARR,LICEE - real TCOLD,TWARM -! - integer L,LMHK,LICE,IWRML,IFRZL - real PSFCK,TDCHK,A,TDKL,TDPRE,TLMHK,TWRMK,AREAS8,AREAP4,AREA1, & - SURFW,SURFC,DZKL,PINTK1,PINTK2,PM150,QKL,TKL,PKL,AREA0, & - AREAP0 - -! SUBROUTINES CALLED: -! WETBULB -! -! -! INITIALIZE WEATHER TYPE ARRAY TO ZERO (IE, OFF). -! WE DO THIS SINCE WE WANT IWX TO REPRESENT THE -! INSTANTANEOUS WEATHER TYPE ON RETURN. -! -! -! ALLOCATE LOCAL STORAGE -! -! -!!$omp parallel do - IWX = 0 - -!!$omp parallel do -!!$omp& private(a,lmhk,pkl,psfck,qkl,tdchk,tdkl,tdpre,tkl) - - LMHK=LM -! -! SKIP THIS POINT IF NO PRECIP THIS TIME STEP -! - IF (PREC.LE.PTHRESH) GOTO 800 -! -! FIND COLDEST AND WARMEST TEMPS IN SATURATED LAYER BETWEEN -! 70 MB ABOVE GROUND AND 500 MB -! ALSO FIND HIGHEST SATURATED LAYER IN THAT RANGE -! 
-!meb - PSFCK=PINT(LP1) -!meb - TDCHK=2.0 - 760 TCOLD=T(LMHK) - TWARM=T(LMHK) - LICEE=LMHK -! - DO 775 L=1,LMHK - QKL=Q(L) - QKL=MAX(EPSQ,QKL) - TKL=T(L) - PKL=PMID(L) -! -! SKIP PAST THIS IF THE LAYER IS NOT BETWEEN 70 MB ABOVE GROUND -! AND 500 MB -! - IF (PKL.LT.50000.0.OR.PKL.GT.PSFCK-7000.0) GOTO 775 - A=LOG(QKL*PKL/(6.1078*(0.378*QKL+0.622))) - TDKL=(237.3*A)/(17.269-A)+273.15 - TDPRE=TKL-TDKL - IF (TDPRE.LT.TDCHK.AND.TKL.LT.TCOLD) TCOLD=TKL - IF (TDPRE.LT.TDCHK.AND.TKL.GT.TWARM) TWARM=TKL - IF (TDPRE.LT.TDCHK.AND.L.LT.LICEE) LICEE=L - 775 CONTINUE -! -! IF NO SAT LAYER AT DEW POINT DEP=TDCHK, INCREASE TDCHK -! AND START AGAIN (BUT DON'T MAKE TDCHK > 6) -! - IF (TCOLD.EQ.T(LMHK).AND.TDCHK.LT.6.0) THEN - TDCHK=TDCHK+2.0 - GOTO 760 - ENDIF - 800 CONTINUE -! -! LOWEST LAYER T -! - KARR=0 - IF (PREC.LE.PTHRESH) GOTO 850 - LMHK=LM - TLMHK=T(LMHK) -! -! DECISION TREE TIME -! - IF (TCOLD.GT.269.15) THEN - IF (TLMHK.LE.273.15) THEN -! TURN ON THE FLAG FOR -! FREEZING RAIN = 4 -! IF ITS NOT ON ALREADY -! IZR=MOD(IWX,8)/4 -! IF (IZR.LT.1) IWX=IWX+4 - IWX=IWX+4 - GOTO 850 - ELSE -! TURN ON THE FLAG FOR -! RAIN = 8 -! IF ITS NOT ON ALREADY -! IRAIN=IWX/8 -! IF (IRAIN.LT.1) IWX=IWX+8 - IWX=IWX+8 - GOTO 850 - ENDIF - ENDIF - KARR=1 - 850 CONTINUE -! -!!$omp parallel do -!!$omp& private(area1,areap4,areap0,areas8,dzkl,ifrzl,iwrml,lice, -!!$omp& lmhk,pintk1,pintk2,pm150,psfck,surfc,surfw, -!!$omp& tlmhk,twrmk) - - IF(KARR.GT.0)THEN - LMHK=LM - LICE=LICEE -!meb - PSFCK=PINT(LP1) -!meb - TLMHK=T(LMHK) - TWRMK=TWARM -! -! TWET AREA VARIABLES -! CALCULATE ONLY WHAT IS NEEDED -! FROM GROUND TO 150 MB ABOVE SURFACE -! FROM GROUND TO TCOLD LAYER -! AND FROM GROUND TO 1ST LAYER WHERE WET BULB T < 0.0 -! -! PINTK1 IS THE PRESSURE AT THE BOTTOM OF THE LAYER -! PINTK2 IS THE PRESSURE AT THE TOP OF THE LAYER -! -! AREAP4 IS THE AREA OF TWET ABOVE -4 C BELOW HIGHEST SAT LYR -! AREAP0 IS THE AREA OF TWET ABOVE 0 C BELOW HIGHEST SAT LYR -! 
- AREAS8=D00 - AREAP4=D00 - AREAP0=D00 - SURFW =D00 - SURFC =D00 - -! - DO 1945 L=LMHK,LICE,-1 - DZKL=ZINT(L)-ZINT(L+1) - AREA1=(TWET(L)-269.15)*DZKL - AREA0=(TWET(L)-273.15)*DZKL - IF (TWET(L).GE.269.15) AREAP4=AREAP4+AREA1 - IF (TWET(L).GE.273.15) AREAP0=AREAP0+AREA0 - 1945 CONTINUE -! -! IF (AREAP4.LT.3000.0) THEN -! TURN ON THE FLAG FOR -! SNOW = 1 -! IF ITS NOT ON ALREADY -! ISNO=MOD(IWX,2) -! IF (ISNO.LT.1) IWX=IWX+1 -! IWX=IWX+1 -! GO TO 1900 -! ENDIF - IF (AREAP0.LT.350.0) THEN -! TURN ON THE FLAG FOR -! SNOW = 1 - IWX=IWX+1 - GOTO 1900 - ENDIF -! -! AREAS8 IS THE NET AREA OF TWET W.R.T. FREEZING IN LOWEST 150MB -! - PINTK1=PSFCK - PM150=PSFCK-15000. -! - DO 1955 L=LMHK,1,-1 - PINTK2=PINT(L) - IF(PINTK1.LT.PM150)GO TO 1950 - DZKL=ZINT(L)-ZINT(L+1) -! -! SUM PARTIAL LAYER IF IN 150 MB AGL LAYER -! - IF(PINTK2.LT.PM150) & - DZKL=T(L)*(Q(L)*D608+1.0)*ROG* & - LOG(PINTK1/PM150) - AREA1=(TWET(L)-273.15)*DZKL - AREAS8=AREAS8+AREA1 - 1950 PINTK1=PINTK2 - 1955 CONTINUE -! -! SURFW IS THE AREA OF TWET ABOVE FREEZING BETWEEN THE GROUND -! AND THE FIRST LAYER ABOVE GROUND BELOW FREEZING -! SURFC IS THE AREA OF TWET BELOW FREEZING BETWEEN THE GROUND -! AND THE WARMEST SAT LAYER -! - IFRZL=0 - IWRML=0 -! - DO 2050 L=LMHK,1,-1 - IF (IFRZL.EQ.0.AND.T(L).LT.273.15) IFRZL=1 - IF (IWRML.EQ.0.AND.T(L).GE.TWRMK) IWRML=1 -! - IF (IWRML.EQ.0.OR.IFRZL.EQ.0) THEN -! if(pmid(l) .lt. 50000.)print*,'twet needed above 500mb' - DZKL=ZINT(L)-ZINT(L+1) - AREA1=(TWET(L)-273.15)*DZKL - IF(IFRZL.EQ.0.AND.TWET(L).GE.273.15)SURFW=SURFW+AREA1 - IF(IWRML.EQ.0.AND.TWET(L).LE.273.15)SURFC=SURFC+AREA1 - ENDIF - 2050 CONTINUE - IF(SURFC.LT.-3000.0.OR. & - & (AREAS8.LT.-3000.0.AND.SURFW.LT.50.0)) THEN -! TURN ON THE FLAG FOR -! ICE PELLETS = 2 -! IF ITS NOT ON ALREADY -! IIP=MOD(IWX,4)/2 -! IF (IIP.LT.1) IWX=IWX+2 - IWX=IWX+2 - GOTO 1900 - ENDIF -! - IF(TLMHK.LT.273.15) THEN -! TURN ON THE FLAG FOR -! FREEZING RAIN = 4 -! IF ITS NOT ON ALREADY -! IZR=MOD(IWX(K),8)/4 -! 
IF (IZR.LT.1) IWX(K)=IWX(K)+4 - IWX=IWX+4 - ELSE -! TURN ON THE FLAG FOR -! RAIN = 8 -! IF ITS NOT ON ALREADY -! IRAIN=IWX(K)/8 -! IF (IRAIN.LT.1) IWX(K)=IWX(K)+8 - IWX=IWX+8 - ENDIF - ENDIF - 1900 CONTINUE -! print *, 'revised check ', IWX(500,800) - - RETURN - END -! -! - SUBROUTINE CALWXT_EXPLICIT(LM,PTHRESH,TSKIN,PREC,SR,F_RIMEF,IWX) -! -! FILE: CALWXT.f -! WRITTEN: 24 AUGUST 2005, G MANIKIN and B FERRIER -! -! ROUTINE TO COMPUTE PRECIPITATION TYPE USING EXPLICIT FIELDS -! FROM THE MODEL MICROPHYSICS - -! use params_mod -! use ctlblk_mod -!- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - implicit none -! -! LIST OF VARIABLES NEEDED -! PARAMETERS: -! -! INPUT: - integer, intent(in):: lm - real,intent(in):: TSKIN, PREC, SR,PTHRESH - REAL,intent(in):: F_RimeF(LM) - integer,intent(out) :: IWX - real SNOW -! real PSFC -! -! ALLOCATE LOCAL STORAGE -! -!!$omp parallel do - IWX = 0 - -!GSM THE RSM IS CURRENTLY INCOMPATIBLE WITH THIS ROUTINE -!GSM ACCORDING TO B FERRIER, THERE MAY BE A WAY TO WRITE -!GSM A VERSION OF THIS ALGORITHM TO WORK WITH THE RSM -!GSM MICROPHYSICS, BUT IT DOESN'T EXIST AT THIS TIME -!!$omp parallel do -!!$omp& private(psfc,tskin) - -! SKIP THIS POINT IF NO PRECIP THIS TIME STEP -! - IF (PREC.LE.PTHRESH) GOTO 800 -! -! A SNOW RATIO LESS THAN 0.5 ELIMINATES SNOW AND SLEET -! USE THE SKIN TEMPERATURE TO DISTINGUISH RAIN FROM FREEZING RAIN -! NOTE THAT 2-M TEMPERATURE MAY BE A BETTER CHOICE IF THE MODEL -! HAS A COLD BIAS FOR SKIN TEMPERATURE -! - IF (SR.LT.0.5) THEN -! SURFACE (SKIN) POTENTIAL TEMPERATURE AND TEMPERATURE. -! PSFC=PMID(LM) -! TSKIN=THS*(PSFC/P1000)**CAPA - - IF (TSKIN.LT.273.15) THEN -! FREEZING RAIN = 4 - IWX=IWX+4 - ELSE -! RAIN = 8 - IWX=IWX+8 - ENDIF - ELSE -! -! DISTINGUISH SNOW FROM SLEET WITH THE RIME FACTOR -! - IF(F_RimeF(LM).GE.10) THEN -! SLEET = 2 - IWX=IWX+2 - ELSE - SNOW = 1 - IWX=IWX+1 - ENDIF - ENDIF - 800 CONTINUE - 810 RETURN - END -! -! 
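The `calwxt_bourg` routine above reduces to a handful of energy thresholds once `areape` (warm energy aloft), `surfw` (warm energy at the surface), and `areane` (refreezing energy) have been integrated. As an illustrative sketch only (Python, not part of this patch; the function name and signature are invented here), its decision tree can be written as:

```python
def bourgouin_ptype(areape, surfw, areane, tlmhk, rn):
    """Decision thresholds of the Bourgouin (1992) energy method,
    transcribed from the Fortran above.
      areape - positive (warm) energy aloft
      surfw  - positive energy based at the surface
      areane - negative energy between ground and highest warm layer
      tlmhk  - lowest-layer temperature (K)
      rn     - random draw in [0,1) used to break transition-zone ties
    Returns the IWX-style code: 1 snow, 2 ice pellets, 4 freezing rain, 8 rain.
    """
    if areape < 2.0:
        # Little or no warm energy aloft: rain vs. snow near the surface.
        if surfw < 5.6:
            return 1
        if surfw > 13.2:
            return 8
        return 1 if rn <= 0.5 else 8
    # Warm layer aloft: is there enough refreezing energy for ice pellets?
    if areane > 66.0 + 0.66 * areape:
        # Enough negative area to refreeze; can the surface layer re-melt it?
        if surfw < 5.6:
            return 2
        if surfw > 13.2:
            return 8
        return 2 if rn <= 0.5 else 8
    if areane < 46.0 + 0.66 * areape:
        # Not enough refreezing energy: freezing rain vs. rain on surface temp.
        return 4 if tlmhk < 273.15 else 8
    # Transition zone between ice pellets and freezing rain.
    if rn <= 0.5:
        if surfw < 5.6:
            return 2
        if surfw > 13.2:
            return 8
        return 2 if rn <= 0.25 else 8
    return 4 if tlmhk < 273.15 else 8
```

As in the Fortran, a single random number `rn` (passed into `calwxt_bourg` as an argument) is reused for every transition-zone check rather than being redrawn.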
- SUBROUTINE CALWXT_DOMINANT(NALG,PREC,PTHRESH,RAIN,FREEZR,SLEET,SNOW, & - & DOMR,DOMZR,DOMIP,DOMS) -! -! WRITTEN: 24 AUGUST 2005, G MANIKIN -! -! THIS ROUTINE TAKES THE PRECIP TYPE SOLUTIONS FROM DIFFERENT -! ALGORITHMS AND SUMS THEM UP TO GIVE A DOMINANT TYPE -! -! use params_mod -! use ctlblk_mod -!- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - implicit none -! -! INPUT: - integer,intent(in) :: NALG - REAL, intent(in) :: PREC,PTHRESH - real,intent(out) :: DOMS,DOMR,DOMZR,DOMIP - real,DIMENSION(NALG),intent(in) :: RAIN,SNOW,SLEET,FREEZR - integer L - real TOTSN,TOTIP,TOTR,TOTZR -!-------------------------------------------------------------------------- -! write(6,*) 'into dominant' -!!$omp parallel do - DOMR = 0. - DOMS = 0. - DOMZR = 0. - DOMIP = 0. -! -!!$omp parallel do -!!$omp& private(totsn,totip,totr,totzr) -! SKIP THIS POINT IF NO PRECIP THIS TIME STEP - IF (PREC.LE.PTHRESH) GOTO 800 - TOTSN = 0. - TOTIP = 0. - TOTR = 0. - TOTZR = 0. -! LOOP OVER THE NUMBER OF DIFFERENT ALGORITHMS THAT ARE USED - DO 820 L = 1, NALG - IF (RAIN(L).GT. 0) THEN - TOTR = TOTR + 1 - GOTO 830 - ENDIF - - IF (SNOW(L).GT. 0) THEN - TOTSN = TOTSN + 1 - GOTO 830 - ENDIF - - IF (SLEET(L).GT. 0) THEN - TOTIP = TOTIP + 1 - GOTO 830 - ENDIF - - IF (FREEZR(L).GT. 0) THEN - TOTZR = TOTZR + 1 - GOTO 830 - ENDIF - 830 CONTINUE - 820 CONTINUE -! print *, 'Calprecip Total Rain, snow, sleet, freeze= ', & -! TOTR,TOTSN,TOTIP,TOTZR - -! TIES ARE BROKEN TO FAVOR THE MOST DANGEROUS FORM OF PRECIP -! FREEZING RAIN > SNOW > SLEET > RAIN - IF (TOTSN .GT. TOTIP) THEN - IF (TOTSN .GT. TOTZR) THEN - IF (TOTSN .GE. TOTR) THEN - DOMS = 1 - GOTO 800 - ELSE - DOMR = 1 - GOTO 800 - ENDIF - ELSE IF (TOTZR .GE. TOTR) THEN - DOMZR = 1 - GOTO 800 - ELSE - DOMR = 1 - GOTO 800 - ENDIF - ELSE IF (TOTIP .GT. TOTZR) THEN - IF (TOTIP .GE. TOTR) THEN - DOMIP = 1 - GOTO 800 - ELSE - DOMR = 1 - GOTO 800 - ENDIF - ELSE IF (TOTZR .GE. 
TOTR) THEN - DOMZR = 1 - GOTO 800 - ELSE - DOMR = 1 - GOTO 800 - ENDIF - 800 CONTINUE - RETURN - END - - - - - diff --git a/sorc/gfs_bufr.fd/calwxt_gfs_baldwin.f b/sorc/gfs_bufr.fd/calwxt_gfs_baldwin.f deleted file mode 100644 index 217dbbcc0c3..00000000000 --- a/sorc/gfs_bufr.fd/calwxt_gfs_baldwin.f +++ /dev/null @@ -1,294 +0,0 @@ - SUBROUTINE CALWXT(T,Q,td,twet,P,PINT,LMH,IWX,nd) -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: PROGRAM NAME (up to 20 characters) -C PRGMMR: YOUR NAME ORG: W/NMCXX DATE: YY-MM-DD -C -C ABSTRACT: START ABSTRACT HERE AND INDENT TO COLUMN 5 ON THE -C FOLLOWING LINES. PLEASE PROVIDE A BRIEF DESCRIPTION OF -C WHAT THE SUBPROGRAM DOES. -C -C PROGRAM HISTORY LOG: -C YY-MM-DD ORIGINAL PROGRAMMER'S NAME HERE -C YY-MM-DD MODIFIER1 DESCRIPTION OF CHANGE -C YY-MM-DD MODIFIER2 DESCRIPTION OF CHANGE -C -C USAGE: CALL PROGRAM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. 
-C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C LANGUAGE: INDICATE EXTENSIONS, COMPILER OPTIONS -C MACHINE: IBM SP -C -C$$$ -C -C FILE: CALWXT.f -C WRITTEN: 11 NOVEMBER 1993, MICHAEL BALDWIN -C REVISIONS: 4 April 94 - 1-d version intended for obs soundings -C 16 Sept 94 - compute all variables for possible -C future decsion tree modifications -C 14 Oct 94 - clean up 1-d version, use new -C decision tree -C -C ROUTINE TO COMPUTE PRECIPITATION TYPE USING A DECISION TREE -C APPROACH THAT USES VARIABLES SUCH AS INTEGRATED WET BULB TEMP -C BELOW FREEZING AND LOWEST LAYER TEMPERATURE -C -C SEE BALDWIN AND CONTORNO PREPRINT FROM 13TH WEATHER ANALYSIS -C AND FORECASTING CONFERENCE FOR MORE DETAILS -C - PARAMETER (LM=99) - PARAMETER (H1M12=1.E-12) -C -C LIST OF VARIABLES NEEDED -C PARAMETERS: -C D608,ROG,H1,D00 - PARAMETER(D608=0.608,ROG=287.04/9.8,H1=1.0,D00=0.0) -C -C INPUT: -C T,Q,td,twet,P,PINT,LMH -C -C T - Mid layer temp (K) -C Q - Mid layer spec hum (g/g) -C TD - Mid layer dew point temp (K) -C TWET - Mid layer wet bulb temp (K) -C P - Mid layer pressure (Pa) (linear average of interfacial -C pressures in log P) -C PINT - Interfacial pressure (Pa) -C LMH - Number of layers -c nd - 0 .. no print 1 .. print -C+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -C NOTE: VERTICAL ORDER OF ARRAYS MUST BE LAYER 1 = TOP -C ---- . -C . -C . -C LAYER LMH = BOTTOM -C (JUST LIKE IN THE ETA MODEL) -C+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -C -C INTERNAL: -C -C -C OUTPUT: -C IWX - INSTANTANEOUS WEATHER TYPE. 
-C ACTS LIKE A 4 BIT BINARY -C 1111 = RAIN/FREEZING RAIN/ICE PELLETS/SNOW -C WHERE THE ONE'S DIGIT IS FOR SNOW -C THE TWO'S DIGIT IS FOR ICE PELLETS -C THE FOUR'S DIGIT IS FOR FREEZING RAIN -C AND THE EIGHT'S DIGIT IS FOR RAIN -C -C------------------------------------------------------------- -C IN OTHER WORDS... -C -C IWX=1 SNOW -C IWX=2 ICE PELLETS/MIX WITH ICE PELLETS -C IWX=4 FREEZING RAIN/MIX WITH FREEZING RAIN -C IWX=8 RAIN -C------------------------------------------------------------- -C -C -C SUBROUTINES CALLED: -C WETBLB -C -C -C INITIALIZE WEATHER TYPE ARRAY TO ZERO (IE, OFF). -C WE DO THIS SINCE WE WANT IWX TO REPRESENT THE -C INSTANTANEOUS WEATHER TYPE ON RETURN. -C - DIMENSION T(LM+1),Q(LM),P(LM),PINT(LM+1),TWET(LM),TD(LM) -C - IWX = 0 - AREAS8=D00 - AREAN8=D00 - AREAPI=D00 - AREAP4=D00 - SURFW =D00 - SURFC =D00 -C -C NUMBER OF LEVELS -C - LMHK=LMH -C -C COMPUTE DEW POINTS, -C FIND COLDEST TEMP IN SATURATED LAYER BETWEEN -C 70 MB ABOVE GROUND AND 500 MB, -C AND FIND THE HIGHEST SAT LAYER, 'TIS THE ICE NUCL LEVEL. 
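For reference, the IWX flag word documented in the deleted header acts as a 4-bit mask (one's digit snow, two's ice pellets, four's freezing rain, eight's rain). A minimal Python sketch of that encoding — the helper name and constants are illustrative, not from the Fortran source:

```python
# Bit layout of IWX as documented in the deleted CALWXT header.
# Constant and function names here are illustrative only.
IWX_SNOW = 1           # one's digit
IWX_ICE_PELLETS = 2    # two's digit
IWX_FREEZING_RAIN = 4  # four's digit
IWX_RAIN = 8           # eight's digit

def decode_iwx(iwx: int) -> list:
    """Return the precipitation types encoded in a 4-bit IWX flag word."""
    names = [(IWX_SNOW, "snow"), (IWX_ICE_PELLETS, "ice pellets"),
             (IWX_FREEZING_RAIN, "freezing rain"), (IWX_RAIN, "rain")]
    return [name for bit, name in names if iwx & bit]
```

In practice the deleted routine only ever sets a single bit (1, 2, 4, or 8), but the header notes the word is designed so mixed types could be OR'd together.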
-C -C - PSFCK=PINT(LMHK+1) - TDCHK=2.0 - 1960 TCOLD=T(LMHK) - TWARM=T(LMHK) - LICE=LMHK - DO 1915 L=1,LMHK - QKL=Q(L) - QKL=AMAX1(H1M12,QKL) - TKL=T(L) - PKL=P(L) - tdkl = td(l) -C -C SKIP PAST THIS IF THE LAYER IS NOT BETWEEN 70 MB ABOVE GROUND -C AND 500 MB -C - IF (PKL.LT.50000.0.OR.PKL.GT.PSFCK-7000.0) GOTO 1915 - TDPRE=TKL-TDKL -C -C ALSO FIND THE HIGHEST SAT LAYER-USE FOR AREAPI,AREAP4 -C - IF (TDPRE.LT.TDCHK.AND.P(L).LT.P(LICE)) LICE=L - IF (TDPRE.LT.TDCHK.AND.TKL.GT.TWARM) TWARM=TKL - IF (TDPRE.LT.TDCHK.AND.TKL.LT.TCOLD) TCOLD=TKL - 1915 CONTINUE -C -C IF WE DONT HAVE A LAYER WITH DEW POINT DEP OF TDCHK OR LESS -C INCREASE TDCHK (UP TO 6 MAX) -C - IF (TCOLD.EQ.T(LMHK+1).AND.TDCHK.LT.6.0) THEN - TDCHK=TDCHK+2.0 - GOTO 1960 - ENDIF -C -C LOWEST LAYER T -C - TLMHK=T(LMHK+1) -C -C TWET AREA VARIABLES -C FROM GROUND TO 150 MB ABOVE SURFACE -C FROM GROUND TO TCOLD LAYER -C FROM GROUND TO 1ST LAYER WHERE T < 0.0 -C FROM GROUND TO TWARM LAYER -C -C PINTK1 IS THE PRESSURE AT THE BOTTOM OF THE LAYER -C PINTK2 IS THE PRESSURE AT THE TOP OF THE LAYER -C -C AREAPI IS THE AREA OF TWET ABOVE FREEZING BELOW TCOLD LYR -C AREAP4 IS THE AREA OF TWET ABOVE -4 C BELOW TCOLD LYR -C - PINTK1=PSFCK - DO 1945 L=LMHK,LICE,-1 - PINTK2=PINT(L) - DZKL=T(L)*(Q(L)*D608+H1)*ROG* - 1 ALOG(PINTK1/PINTK2) - AREA1=(TWET(L)-273.15)*DZKL - AREA2=(TWET(L)-269.15)*DZKL - IF (TWET(L).GE.273.15) AREAPI=AREAPI+AREA1 - IF (TWET(L).GE.269.15) AREAP4=AREAP4+AREA2 - PINTK1=PINTK2 - 1945 CONTINUE -C -C AREAS8 IS THE NET AREA OF TWET W.R.T. FREEZING IN LOWEST 150MB -C AREAN8 IS THE NET AREA OF TWET < FREEZING IN LOWEST 150MB -C - PINTK1=PSFCK - PM150=PSFCK-15000. - DO 1955 L=LMHK,1,-1 - PINTK2=PINT(L) - IF (PINTK1.LT.PM150) GOTO 1950 - DZKL=T(L)*(Q(L)*D608+H1)*ROG* - 1 ALOG(PINTK1/PINTK2) -C -C SUM PARTIAL LAYER IF IN 150 MB AGL LAYER -C - IF (PINTK2.LT.PM150) - & DZKL=T(L)*(Q(L)*D608+H1)*ROG* - 1 ALOG(PINTK1/PM150) - AREA1=(TWET(L)-273.15)*DZKL - AREAS8=AREAS8+AREA1 - IF(AREA1.LT.0.) 
AREAN8=AREAN8+AREA1 - 1950 PINTK1=PINTK2 - 1955 CONTINUE -C -C SURFW IS THE AREA OF TWET ABOVE FREEZING BETWEEN THE GROUND -C AND THE FIRST LAYER ABOVE GROUND BELOW FREEZING -C SURFC IS THE AREA OF TWET BELOW FREEZING BETWEEN THE GROUND -C AND THE TWARM LAYER -C - PINTK1=PSFCK - IFRZL=0 - IWRML=0 - DO 2050 L=LMHK,1,-1 - IF (IFRZL.EQ.0.AND.T(L).LE.273.15) IFRZL=1 - IF (IWRML.EQ.0.AND.T(L).GE.TWARM) IWRML=1 - PINTK2=PINT(L) - DZKL=T(L)*(Q(L)*D608+H1)*ROG* - 1 ALOG(PINTK1/PINTK2) - AREA1=(TWET(L)-273.15)*DZKL - IF (IFRZL.EQ.0) THEN - IF (TWET(L).GE.273.15) SURFW=SURFW+AREA1 - ENDIF - IF (IWRML.EQ.0) THEN - IF (TWET(L).LE.273.15) SURFC=SURFC+AREA1 - ENDIF - PINTK1=PINTK2 - 2050 CONTINUE -C -C DECISION TREE TIME -C - if(nd.eq.1) then - print *, ' tcold =', tcold - print *, ' tlmhk =', tlmhk - print *, ' areap4 =', areap4 - print *, ' areas8 =', areas8 - print *, ' surfw =', surfw - print *, ' surfc =', surfc -c print *, ' temp= ' -c print *, (t(k),k=1,lmhk+1) -c print *, ' tdew =' -c print *, (td(k),k=1,lmhk) -c print *, ' twet =' -c print *, (twet(k),k=1,lmhk) - endif - IF (TCOLD.GT.269.15) THEN - IF (TLMHK.LE.273.15) THEN -C TURN ON THE FLAG FOR -C FREEZING RAIN = 4 - IWX=4 - GOTO 1900 - ELSE -C TURN ON THE FLAG FOR -C RAIN = 8 - IWX=8 - GOTO 1900 - ENDIF - ENDIF -C - IF (AREAP4.LT.3000.0) THEN -C TURN ON THE FLAG FOR -C SNOW = 1 - IWX=1 - GOTO 1900 - ENDIF -C - IF (SURFC.LE.-3000.0.OR. 
- & (AREAS8.LE.-3000.0.AND.SURFW.LT.50.0)) THEN -C TURN ON THE FLAG FOR -C ICE PELLETS = 2 - IWX=2 - GOTO 1900 - ENDIF - IF (TLMHK.LT.273.15) THEN -C TURN ON THE FLAG FOR -C FREEZING RAIN = 4 - IWX=4 - ELSE -C TURN ON THE FLAG FOR -C RAIN = 8 - IWX=8 - ENDIF - 1900 CONTINUE - RETURN - END diff --git a/sorc/gfs_bufr.fd/calwxt_gfs_ramer.f b/sorc/gfs_bufr.fd/calwxt_gfs_ramer.f deleted file mode 100644 index 1faabf6214a..00000000000 --- a/sorc/gfs_bufr.fd/calwxt_gfs_ramer.f +++ /dev/null @@ -1,364 +0,0 @@ -Cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc -C -C DoPhase is a subroutine written and provided by Jim Ramer at NOAA/FSL -C -C Ramer, J, 1993: An empirical technique for diagnosing precipitation -C type from model output. Preprints, 5th Conf. on Aviation -C Weather Systems, Vienna, VA, Amer. Meteor. Soc., 227-230. -C -Cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc -C - SUBROUTINE CALWXT1(pq,tq,qq,twq,tdq,nq,lm,ppt,ptyp,trace) -c SUBROUTINE dophase(pq, ! input pressure sounding mb -c + tq, ! input temperature sounding K -c + pq, | input pressure -c + qq, ! input spec humidityfraction -c + twq, ! input wet-bulb temperature -c + nq, ! input number of levels in sounding -c + twq, ! output wet-bulb sounding K -c + icefrac, ! output ice fraction -c + ptyp) ! output(2) phase 2=Rain, 3=Frzg, 4=Solid, -C 6=IP JC 9/16/99 - LOGICAL trace -c PARAMETER (trace = .false.) 
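The Baldwin decision tree removed above reduces to four checks once the wet-bulb area integrals are in hand. A Python transcription, kept for reference since the Fortran is being deleted — variable names and thresholds follow the deleted source:

```python
# Transcription of the decision tree in the deleted CALWXT (Baldwin)
# routine. Inputs are the quantities the Fortran computes beforehand:
#   tcold  - coldest temperature in a saturated layer (K)
#   tlmhk  - lowest-layer temperature (K)
#   areap4 - wet-bulb area above -4 C below the tcold layer
#   areas8 - net wet-bulb area w.r.t. freezing in the lowest 150 mb
#   surfw  - above-freezing wet-bulb area near the surface
#   surfc  - below-freezing wet-bulb area below the warm layer
def calwxt_baldwin(tcold, tlmhk, areap4, areas8, surfw, surfc):
    """Return IWX: 1=snow, 2=ice pellets, 4=freezing rain, 8=rain."""
    if tcold > 269.15:                      # no layer cold enough to nucleate ice
        return 4 if tlmhk <= 273.15 else 8
    if areap4 < 3000.0:                     # too little warm area: snow survives
        return 1
    if surfc <= -3000.0 or (areas8 <= -3000.0 and surfw < 50.0):
        return 2                            # deep cold layer refreezes melt
    return 4 if tlmhk < 273.15 else 8
```

The `.LE.` vs `.LT.` comparisons on `tlmhk` differ between the two branches in the original, and that distinction is preserved here.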
- PARAMETER (A2=17.2693882,A3=273.16,A4=35.86,PQ0=379.90516) - PARAMETER (G=9.80665,CP=1004.686,RCP=0.2857141,LECP=1572.5) - PARAMETER (twice=266.55,rhprcp=0.80,deltag=1.02,prcpmin=0.3, - * emelt=0.045,rlim=0.04,slim=0.85) - PARAMETER (twmelt=273.15,tz=273.15,efac=1.0,PTHRES=0.25) -c pthres is in unit of mm and is equivalent to .01 inch -C - INTEGER*4 i, k1, lll, k2, toodry, iflag, nq -C - INTEGER ptyp -C - REAL rcp, flg, flag, xxx, pq(lm), tq(lm), twq(lm), rhq(lm), mye, - * qq(lm), icefrac, tqtmp(lm), pqtmp(lm), rhqtmp(lm) - * ,twtmp(lm),qqtmp(lm),tdqtmp(lm),tdq(lm) -C - COMMON /flagflg/ flag, flg - DATA iflag / -9/ -C -C Initialize. - IF (trace) print *, '******* NEW STATION ******' - IF (trace) print *, 'Twmelt,Twice,rhprcp,Emelt' - IF (trace) print *, twmelt, twice, rhprcp, emelt - icefrac = flag - ptyp = 0 -c IF (PPT.LE.PTHRES) RETURN -C -C GSM compute RH, convert pressure to mb, and reverse order - - DO 88 i = 1, nq - LEV=NQ-I+1 -c QC=PQ0/PQ(I) * EXP(A2*(TQ(I)-A3)/(TQ(I)-A4)) - call svp(qc,es,pq(i),tq(i)) - RHQTMP(LEV)=QQ(I)/QC - PQTMP(LEV)=PQ(I)/100. - TQTMP(LEV)=TQ(I) - TWTMP(LEV)=TWQ(I) - QQTMP(LEV)=QQ(I) - TDQTMP(LEV)=TDQ(I) - 88 CONTINUE - - do 92 i=1,nq - TQ(I)=TQTMP(I) - PQ(I)=PQTMP(I) - RHQ(I)=RHQTMP(I) - TWQ(I)=TWTMP(I) - QQ(I)=QQTMP(I) - TDQ(I)=TDQTMP(I) - 92 continue - - -C See if there was too little precip reported. -C -CCC RATE RESTRICTION REMOVED BY JOHN CORTINAS 3/16/99 -C -C Construct wet-bulb sounding, locate generating level. - twmax = -999.0 - rhmax = 0.0 - k1 = 0 ! top of precip generating layer - k2 = 0 ! layer of maximum rh -C - IF (trace) WRITE (20,*) 'rhq(1)', rhq(1) - IF (rhq(1).lt.rhprcp) THEN - toodry = 1 - ELSE - toodry = 0 - END IF -C -C toodry=((Rhq(1).lt.rhprcp).and.1) - pbot = pq(1) - DO 10 i = 1, nq -c xxx = tdofesat(esat(tq(i))*rhq(i)) -c call tdew(xxx,tq(i),qq(i),pq(i)*100.) 
- xxx = tdq(i) - IF (trace) print *, 'T,Rh,Td,P,nq ', tq(i), rhq(i), xxx, - + pq(i), nq -c twq(i) = xmytw(tq(i),xxx,pq(i)) - IF (trace) print *, 'Twq(i),i ', twq(i), i - twmax = amax1(twq(i),twmax) - IF (trace) print *, 'Tw,Rh,P ', twq(i) - 273.15, rhq(i), - + pq(i) - IF (pq(i).ge.400.0) THEN - IF (rhq(i).gt.rhmax) THEN - rhmax = rhq(i) - k2 = i - IF (trace) print *, 'rhmax,k2,i', rhmax, k2, i - END IF -C - IF (i.ne.1) THEN - IF (trace) print *, 'ME: toodry,i', toodry, i - IF (rhq(i).ge.rhprcp.or.toodry.eq.0) THEN - IF (toodry.ne.0) THEN - dpdrh = alog(pq(i)/pq(i-1)) / (rhq(i)- - + rhq(i-1)) - pbot = exp(alog(pq(i))+(rhprcp-rhq(i))*dpdrh) -C -Clin dpdrh=(Pq(i)-Pq(i-1))/(Rhq(i)-Rhq(i-1)) -Clin pbot=Pq(i)+(rhprcp-Rhq(i))*dpdrh - ptw = pq(i) - toodry = 0 - IF (trace) print *, 'dpdrh,pbot,rhprcp-rhq - +(i),i,ptw, toodry', dpdrh, pbot, rhprcp - rhq(i), i, ptw, - + toodry - ELSE IF (rhq(i).ge.rhprcp) THEN - ptw = pq(i) - IF (trace) print *, 'HERE1: ptw,toodry', - + ptw, toodry - ELSE - toodry = 1 - dpdrh = alog(pq(i)/pq(i-1)) / (rhq(i)- - + rhq(i-1)) - ptw = exp(alog(pq(i))+(rhprcp-rhq(i))*dpdrh) - IF (trace) print *, - + 'HERE2:dpdrh,pbot,i,ptw,toodry', dpdrh, - + pbot, i, ptw, toodry -Clin dpdrh=(Pq(i)-Pq(i-1))/(Rhq(i)-Rhq(i-1)) -Clin ptw=Pq(i)+(rhprcp-Rhq(i))*dpdrh -C - END IF -C - IF (trace) print *, 'HERE3:pbot,ptw,deltag', - + pbot, ptw, deltag - IF (pbot/ptw.ge.deltag) THEN -Clin If (pbot-ptw.lt.deltag) Goto 2003 - k1 = i - ptop = ptw - END IF - END IF - END IF - END IF -C - 10 CONTINUE -C -C Gross checks for liquid and solid precip which dont require generating level. -C -c print *, 'twq1 ', twq(1) - IF (twq(1).ge.273.15+2.0) THEN - ptyp = 8 ! liquid - IF (trace) PRINT *, 'liquid' - icefrac = 0.0 - RETURN - END IF -C - print *, 'twmax ', twmax - IF (twmax.le.twice) THEN - icefrac = 1.0 - ptyp = 1 ! solid - RETURN - END IF -C -C Check to see if we had no success with locating a generating level. 
-C - IF (trace) print *, 'HERE6: k1,ptyp', k1, ptyp - IF (k1.eq.0) THEN - rate = flag - RETURN - END IF -C - IF (ptop.eq.pq(k1)) THEN - twtop = twq(k1) - rhtop = rhq(k1) - k2 = k1 - k1 = k1 - 1 - ELSE - k2 = k1 - k1 = k1 - 1 - wgt1 = alog(ptop/pq(k2)) / alog(pq(k1)/pq(k2)) -Clin wgt1=(ptop-Pq(k2))/(Pq(k1)-Pq(k2)) - wgt2 = 1.0 - wgt1 - twtop = twq(k1) * wgt1 + twq(k2) * wgt2 - rhtop = rhq(k1) * wgt1 + rhq(k2) * wgt2 - END IF -C - IF (trace) print *, - + 'HERE7: ptop,k1,pq(k1),twtop,rhtop,k2,wgt1, wgt2', ptop, - + k1, pq(k1), twtop, rhtop, k2, wgt1, wgt2 -C -C Calculate temp and wet-bulb ranges below precip generating level. - DO 20 i = 1, k1 - twmax = amax1(twq(i),twmax) - 20 CONTINUE -C -C Gross check for solid precip, initialize ice fraction. - IF (trace) print *, twmax - IF (twtop.le.twice) THEN - icefrac = 1.0 - IF (twmax.le.twmelt) THEN ! gross check for solid precip. - IF (trace) PRINT *, 'solid' - ptyp = 1 ! solid precip - RETURN - END IF - lll = 0 - ELSE - icefrac = 0.0 - lll = 1 - END IF -C -C Loop downward through sounding from highest precip generating level. - 30 CONTINUE -C - IF (trace) PRINT *, ptop, twtop - 273.15, icefrac - IF (trace) print *, 'P,Tw,frac,twq(k1)', ptop, twtop - 273.15, - + icefrac, twq(k1) - IF (icefrac.ge.1.0) THEN ! starting as all ice - IF (trace) print *, 'ICEFRAC=1', icefrac - print *, 'twq twmwelt twtop ', twq(k1), twmelt, twtop - IF (twq(k1).lt.twmelt) GO TO 40 ! cannot commence melting - IF (twq(k1).eq.twtop) GO TO 40 ! both equal twmelt, nothing h - wgt1 = (twmelt-twq(k1)) / (twtop-twq(k1)) - rhavg = rhq(k1) + wgt1 * (rhtop-rhq(k1)) / 2 - dtavg = (twmelt-twq(k1)) / 2 - dpk = wgt1 * alog(pq(k1)/ptop) !lin dpk=wgt1*(Pq(k1)-Ptop) -C mye=emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - IF (trace) print *, - + 'HERE8: wgt1,rhavg,dtavg,dpk,mye,icefrac', wgt1, rhavg, - + dtavg, dpk, mye, icefrac - ELSE IF (icefrac.le.0.0) THEN ! 
starting as all liquid - IF (trace) print *, 'HERE9: twtop,twq(k1),k1,lll', twtop, - + twq(k1), k1, lll - lll = 1 -C If (Twq(k1).le.Twice) icefrac=1.0 ! autoconvert -C Goto 1020 - IF (twq(k1).gt.twice) GO TO 40 ! cannot commence freezing - IF (twq(k1).eq.twtop) THEN - wgt1 = 0.5 - ELSE - wgt1 = (twice-twq(k1)) / (twtop-twq(k1)) - END IF - rhavg = rhq(k1) + wgt1 * (rhtop-rhq(k1)) / 2 - dtavg = twmelt - (twq(k1)+twice) / 2 - dpk = wgt1 * alog(pq(k1)/ptop) !lin dpk=wgt1*(Pq(k1)-Ptop) -C mye=emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - IF (trace) print *, 'HERE10: wgt1,rhtop,rhq(k1),dtavg', - + wgt1, rhtop, rhq(k1), dtavg - ELSE IF ((twq(k1).le.twmelt).and.(twq(k1).lt.twmelt)) THEN ! mix - rhavg = (rhq(k1)+rhtop) / 2 - dtavg = twmelt - (twq(k1)+twtop) / 2 - dpk = alog(pq(k1)/ptop) !lin dpk=Pq(k1)-Ptop -C mye=emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - - IF (trace) print *, 'HERE11: twq(K1),twtop', twq(k1), - + twtop - ELSE ! mix where Tw curve crosses twmelt in layer - IF (twq(k1).eq.twtop) GO TO 40 ! both equal twmelt, nothing h - wgt1 = (twmelt-twq(k1)) / (twtop-twq(k1)) - wgt2 = 1.0 - wgt1 - rhavg = rhtop + wgt2 * (rhq(k1)-rhtop) / 2 - dtavg = (twmelt-twtop) / 2 - dpk = wgt2 * alog(pq(k1)/ptop) !lin dpk=wgt2*(Pq(k1)-Ptop) -C mye=emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - icefrac = amin1(1.0,amax1(icefrac,0.0)) - IF (trace) print *, 'HERE12: twq(k1),twtop,icefrac,wgt1,wg - +t2,rhavg,rhtop,rhq(k1),dtavg,k1', twq(k1), twtop, icefrac, wgt1, - + wgt2, rhavg, rhtop, rhq(k1), dtavg, k1 - IF (icefrac.le.0.0) THEN -C If (Twq(k1).le.Twice) icefrac=1.0 ! autoconvert -C Goto 1020 - IF (twq(k1).gt.twice) GO TO 40 ! 
cannot commence freezin - wgt1 = (twice-twq(k1)) / (twtop-twq(k1)) - dtavg = twmelt - (twq(k1)+twice) / 2 - IF (trace) WRITE (20,*) 'IN IF' - ELSE - dtavg = (twmelt-twq(k1)) / 2 - IF (trace) WRITE (20,*) 'IN ELSE' - END IF - IF (trace) print *, 'NEW ICE FRAC CALC' - rhavg = rhq(k1) + wgt1 * (rhtop-rhq(k1)) / 2 - dpk = wgt1 * alog(pq(k1)/ptop) !lin dpk=wgt1*(Pq(k1)-Ptop) -C mye=emelt*(1.0-(1.0-Rhavg)*efac) - mye = emelt * rhavg ** efac - icefrac = icefrac + dpk * dtavg / mye - IF (trace) print *, 'HERE13: icefrac,k1,dtavg,rhavg', - + icefrac, k1, dtavg, rhavg - END IF -C - icefrac = amin1(1.0,amax1(icefrac,0.0)) - IF (trace) print *, 'NEW ICEFRAC:', icefrac, icefrac -C -C Get next level down if there is one, loop back. - 40 IF (k1.gt.1) THEN - IF (trace) WRITE (20,*) 'LOOPING BACK' - twtop = twq(k1) - ptop = pq(k1) - rhtop = rhq(k1) - k1 = k1 - 1 - GO TO 30 - END IF -C -C -C Determine precip type based on snow fraction and surface wet-bulb. -C If (trace) Print *,Pq(k1),Twq(k1)-273.15,icefrac -C - IF (trace) print *, 'P,Tw,frac,lll', pq(k1), twq(k2) - 273.15, - + icefrac, lll -C -c print *, 'icefrac ', icefrac - IF (icefrac.ge.slim) THEN - IF (lll.ne.0) THEN - ptyp = 2 ! Ice Pellets JC 9/16/99 - IF (trace) print *, 'frozen' - ELSE - ptyp = 1 ! Snow - print *, 'snow' - IF (trace) print *, 'snow' - END IF - ELSE IF (icefrac.le.rlim) THEN - IF (twq(1).lt.tz) THEN - print *, 'aha! frz' - ptyp = 4 ! Freezing Precip - IF (trace) print *, 'freezing' - ELSE - ptyp = 8 ! Rain - print *, 'rain' - IF (trace) print *, 'liquid' - END IF - ELSE - IF (trace) print *, 'Mix' - IF (twq(1).lt.tz) THEN - IF (trace) print *, 'freezing' -cGSM not sure what to do when 'mix' is predicted; I chose sleet as -cGSK a shaky best option - - ptyp = 2 ! Ice Pellets -c ptyp = 5 ! Mix - ELSE -c ptyp = 5 ! Mix - ptyp = 2 ! 
Ice Pellets - END IF - END IF - IF (trace) print *, "Returned ptyp is:ptyp,lll ", ptyp, lll - IF (trace) print *, "Returned icefrac is: ", icefrac - RETURN -C - END diff --git a/sorc/gfs_bufr.fd/funcphys.f b/sorc/gfs_bufr.fd/funcphys.f deleted file mode 100644 index fd30d1568f3..00000000000 --- a/sorc/gfs_bufr.fd/funcphys.f +++ /dev/null @@ -1,2899 +0,0 @@ -!------------------------------------------------------------------------------- -module funcphys -!$$$ Module Documentation Block -! -! Module: funcphys API for basic thermodynamic physics -! Author: Iredell Org: W/NX23 Date: 1999-03-01 -! -! Abstract: This module provides an Application Program Interface -! for computing basic thermodynamic physics functions, in particular -! (1) saturation vapor pressure as a function of temperature, -! (2) dewpoint temperature as a function of vapor pressure, -! (3) equivalent potential temperature as a function of temperature -! and scaled pressure to the kappa power, -! (4) temperature and specific humidity along a moist adiabat -! as functions of equivalent potential temperature and -! scaled pressure to the kappa power, -! (5) scaled pressure to the kappa power as a function of pressure, and -! (6) temperature at the lifting condensation level as a function -! of temperature and dewpoint depression. -! The entry points required to set up lookup tables start with a "g". -! All the other entry points are functions starting with an "f" or -! are subroutines starting with an "s". These other functions and -! subroutines are elemental; that is, they return a scalar if they -! are passed only scalars, but they return an array if they are passed -! an array. These other functions and subroutines can be inlined, too. -! -! Program History Log: -! 1999-03-01 Mark Iredell -! 1999-10-15 Mark Iredell SI unit for pressure (Pascals) -! 2001-02-26 Mark Iredell Ice phase changes of Hong and Moorthi -! -! Public Variables: -! 
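The final classification step of the deleted Ramer routine maps the diagnosed ice fraction and the surface wet-bulb temperature to a precipitation-type code. A sketch for reference — the thresholds (`slim`, `rlim`, `tz`) are the deleted source's parameters, and `melted_aloft` stands in for the routine's `lll` flag (an assumed reading of its role):

```python
# Sketch of the icefrac -> ptyp mapping at the end of the deleted
# CALWXT1 (Ramer) routine. Thresholds come from its PARAMETER block;
# `melted_aloft` is an interpretive name for the lll flag.
SLIM, RLIM, TZ = 0.85, 0.04, 273.15

def ramer_ptyp(icefrac, tw_sfc, melted_aloft):
    """Return ptyp: 1=snow, 2=ice pellets, 4=freezing precip, 8=rain."""
    if icefrac >= SLIM:                     # mostly ice
        return 2 if melted_aloft else 1     # refrozen aloft -> pellets, else snow
    if icefrac <= RLIM:                     # mostly liquid
        return 4 if tw_sfc < TZ else 8      # freezing at a sub-freezing surface
    return 2                                # mixed phase: sleet chosen as the
                                            # "shaky best option" per the comments
```

The mixed-phase fallback mirrors the GSM comment in the deleted code, which admits ice pellets were chosen arbitrarily when a "mix" is diagnosed.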
krealfp Integer parameter kind or length of reals (=kind_phys) -! -! Public Subprograms: -! gpvsl Compute saturation vapor pressure over liquid table -! -! fpvsl Elementally compute saturation vapor pressure over liquid -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! fpvslq Elementally compute saturation vapor pressure over liquid -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! fpvslx Elementally compute saturation vapor pressure over liquid -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! gpvsi Compute saturation vapor pressure over ice table -! -! fpvsi Elementally compute saturation vapor pressure over ice -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! fpvsiq Elementally compute saturation vapor pressure over ice -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! fpvsix Elementally compute saturation vapor pressure over ice -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! gpvs Compute saturation vapor pressure table -! -! fpvs Elementally compute saturation vapor pressure -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! fpvsq Elementally compute saturation vapor pressure -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! fpvsx Elementally compute saturation vapor pressure -! function result Real(krealfp) saturation vapor pressure in Pascals -! t Real(krealfp) temperature in Kelvin -! -! gtdpl Compute dewpoint temperature over liquid table -! -! ftdpl Elementally compute dewpoint temperature over liquid -! 
function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdplq Elementally compute dewpoint temperature over liquid -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdplx Elementally compute dewpoint temperature over liquid -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdplxg Elementally compute dewpoint temperature over liquid -! function result Real(krealfp) dewpoint temperature in Kelvin -! t Real(krealfp) guess dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! gtdpi Compute dewpoint temperature table over ice -! -! ftdpi Elementally compute dewpoint temperature over ice -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdpiq Elementally compute dewpoint temperature over ice -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdpix Elementally compute dewpoint temperature over ice -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdpixg Elementally compute dewpoint temperature over ice -! function result Real(krealfp) dewpoint temperature in Kelvin -! t Real(krealfp) guess dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! gtdp Compute dewpoint temperature table -! -! ftdp Elementally compute dewpoint temperature -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdpq Elementally compute dewpoint temperature -! function result Real(krealfp) dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! ftdpx Elementally compute dewpoint temperature -! function result Real(krealfp) dewpoint temperature in Kelvin -! 
pv Real(krealfp) vapor pressure in Pascals -! -! ftdpxg Elementally compute dewpoint temperature -! function result Real(krealfp) dewpoint temperature in Kelvin -! t Real(krealfp) guess dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! gthe Compute equivalent potential temperature table -! -! fthe Elementally compute equivalent potential temperature -! function result Real(krealfp) equivalent potential temperature in Kelvin -! t Real(krealfp) LCL temperature in Kelvin -! pk Real(krealfp) LCL pressure over 1e5 Pa to the kappa power -! -! ftheq Elementally compute equivalent potential temperature -! function result Real(krealfp) equivalent potential temperature in Kelvin -! t Real(krealfp) LCL temperature in Kelvin -! pk Real(krealfp) LCL pressure over 1e5 Pa to the kappa power -! -! fthex Elementally compute equivalent potential temperature -! function result Real(krealfp) equivalent potential temperature in Kelvin -! t Real(krealfp) LCL temperature in Kelvin -! pk Real(krealfp) LCL pressure over 1e5 Pa to the kappa power -! -! gtma Compute moist adiabat tables -! -! stma Elementally compute moist adiabat temperature and moisture -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! stmaq Elementally compute moist adiabat temperature and moisture -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! stmax Elementally compute moist adiabat temperature and moisture -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! tma Real(krealfp) parcel temperature in Kelvin -! 
qma Real(krealfp) parcel specific humidity in kg/kg -! -! stmaxg Elementally compute moist adiabat temperature and moisture -! tg Real(krealfp) guess parcel temperature in Kelvin -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! gpkap Compute pressure to the kappa table -! -! fpkap Elementally raise pressure to the kappa power. -! function result Real(krealfp) p over 1e5 Pa to the kappa power -! p Real(krealfp) pressure in Pascals -! -! fpkapq Elementally raise pressure to the kappa power. -! function result Real(krealfp) p over 1e5 Pa to the kappa power -! p Real(krealfp) pressure in Pascals -! -! fpkapo Elementally raise pressure to the kappa power. -! function result Real(krealfp) p over 1e5 Pa to the kappa power -! p Real(krealfp) surface pressure in Pascals -! -! fpkapx Elementally raise pressure to the kappa power. -! function result Real(krealfp) p over 1e5 Pa to the kappa power -! p Real(krealfp) pressure in Pascals -! -! grkap Compute pressure to the 1/kappa table -! -! frkap Elementally raise pressure to the 1/kappa power. -! function result Real(krealfp) pressure in Pascals -! pkap Real(krealfp) p over 1e5 Pa to the 1/kappa power -! -! frkapq Elementally raise pressure to the kappa power. -! function result Real(krealfp) pressure in Pascals -! pkap Real(krealfp) p over 1e5 Pa to the kappa power -! -! frkapx Elementally raise pressure to the kappa power. -! function result Real(krealfp) pressure in Pascals -! pkap Real(krealfp) p over 1e5 Pa to the kappa power -! -! gtlcl Compute LCL temperature table -! -! ftlcl Elementally compute LCL temperature. -! function result Real(krealfp) temperature at the LCL in Kelvin -! t Real(krealfp) temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! ftlclq Elementally compute LCL temperature. -! 
function result Real(krealfp) temperature at the LCL in Kelvin -! t Real(krealfp) temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! ftlclo Elementally compute LCL temperature. -! function result Real(krealfp) temperature at the LCL in Kelvin -! t Real(krealfp) temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! ftlclx Elementally compute LCL temperature. -! function result Real(krealfp) temperature at the LCL in Kelvin -! t Real(krealfp) temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! gfuncphys Compute all physics function tables -! -! Attributes: -! Language: Fortran 90 -! -!$$$ - use machine,only:kind_phys - use physcons - implicit none - private -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! Public Variables -! integer,public,parameter:: krealfp=selected_real_kind(15,45) - integer,public,parameter:: krealfp=kind_phys -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! Private Variables - real(krealfp),parameter:: psatb=con_psat*1.e-5 - integer,parameter:: nxpvsl=7501 - real(krealfp) c1xpvsl,c2xpvsl,tbpvsl(nxpvsl) - integer,parameter:: nxpvsi=7501 - real(krealfp) c1xpvsi,c2xpvsi,tbpvsi(nxpvsi) - integer,parameter:: nxpvs=7501 - real(krealfp) c1xpvs,c2xpvs,tbpvs(nxpvs) - integer,parameter:: nxtdpl=5001 - real(krealfp) c1xtdpl,c2xtdpl,tbtdpl(nxtdpl) - integer,parameter:: nxtdpi=5001 - real(krealfp) c1xtdpi,c2xtdpi,tbtdpi(nxtdpi) - integer,parameter:: nxtdp=5001 - real(krealfp) c1xtdp,c2xtdp,tbtdp(nxtdp) - integer,parameter:: nxthe=241,nythe=151 - real(krealfp) c1xthe,c2xthe,c1ythe,c2ythe,tbthe(nxthe,nythe) - integer,parameter:: nxma=151,nyma=121 - real(krealfp) c1xma,c2xma,c1yma,c2yma,tbtma(nxma,nyma),tbqma(nxma,nyma) -! 
integer,parameter:: nxpkap=5501 - integer,parameter:: nxpkap=11001 - real(krealfp) c1xpkap,c2xpkap,tbpkap(nxpkap) - integer,parameter:: nxrkap=5501 - real(krealfp) c1xrkap,c2xrkap,tbrkap(nxrkap) - integer,parameter:: nxtlcl=151,nytlcl=61 - real(krealfp) c1xtlcl,c2xtlcl,c1ytlcl,c2ytlcl,tbtlcl(nxtlcl,nytlcl) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -! Public Subprograms - public gpvsl,fpvsl,fpvslq,fpvslx - public gpvsi,fpvsi,fpvsiq,fpvsix - public gpvs,fpvs,fpvsq,fpvsx - public gtdpl,ftdpl,ftdplq,ftdplx,ftdplxg - public gtdpi,ftdpi,ftdpiq,ftdpix,ftdpixg - public gtdp,ftdp,ftdpq,ftdpx,ftdpxg - public gthe,fthe,ftheq,fthex - public gtma,stma,stmaq,stmax,stmaxg - public gpkap,fpkap,fpkapq,fpkapo,fpkapx - public grkap,frkap,frkapq,frkapx - public gtlcl,ftlcl,ftlclq,ftlclo,ftlclx - public gfuncphys -contains -!------------------------------------------------------------------------------- - subroutine gpvsl -!$$$ Subprogram Documentation Block -! -! Subprogram: gpvsl Compute saturation vapor pressure table over liquid -! Author: N Phillips W/NMC2X2 Date: 30 dec 82 -! -! Abstract: Computes saturation vapor pressure table as a function of -! temperature for the table lookup function fpvsl. -! Exact saturation vapor pressures are calculated in subprogram fpvslx. -! The current implementation computes a table with a length -! of 7501 for temperatures ranging from 180. to 330. Kelvin. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: call gpvsl -! -! Subprograms called: -! (fpvslx) inlinable function to compute saturation vapor pressure -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,x,t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=180.0_krealfp - xmax=330.0_krealfp - xinc=(xmax-xmin)/(nxpvsl-1) -! 
c1xpvsl=1.-xmin/xinc - c2xpvsl=1./xinc - c1xpvsl=1.-xmin*c2xpvsl - do jx=1,nxpvsl - x=xmin+(jx-1)*xinc - t=x - tbpvsl(jx)=fpvslx(t) - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function fpvsl(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvsl Compute saturation vapor pressure over liquid -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute saturation vapor pressure from the temperature. -! A linear interpolation is done between values in a lookup table -! computed in gpvsl. See documentation for fpvslx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is almost 6 decimal places. -! On the Cray, fpvsl is about 4 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: pvsl=fpvsl(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvsl Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvsl - real(krealfp),intent(in):: t - integer jx - real(krealfp) xj -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpvsl+c2xpvsl*t,1._krealfp),real(nxpvsl,krealfp)) - jx=min(xj,nxpvsl-1._krealfp) - fpvsl=tbpvsl(jx)+(xj-jx)*(tbpvsl(jx+1)-tbpvsl(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpvslq(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvslq Compute saturation vapor pressure over liquid -! 
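The `gpvsl`/`fpvsl` pair being deleted follows a generic pattern: precompute an evenly spaced table over `[xmin, xmax]`, then answer queries by clamped linear interpolation using the precomputed coefficients `c2 = 1/xinc` and `c1 = 1 - xmin*c2`. A language-neutral sketch of that pattern (generic helper names, not the module's API):

```python
# Generic sketch of the table-lookup pattern used by gpvsl/fpvsl.
# Indices are kept 1-based internally to match the Fortran logic.
def build_table(f, xmin, xmax, n):
    """Tabulate f at n evenly spaced points; return lookup coefficients."""
    xinc = (xmax - xmin) / (n - 1)
    c2 = 1.0 / xinc
    c1 = 1.0 - xmin * c2
    tb = [f(xmin + j * xinc) for j in range(n)]
    return c1, c2, tb

def lookup(c1, c2, tb, t):
    """Clamped linear interpolation, as in fpvsl."""
    xj = min(max(c1 + c2 * t, 1.0), float(len(tb)))  # clamp to table range
    jx = min(int(xj), len(tb) - 1)                   # 1-based bracket index
    return tb[jx - 1] + (xj - jx) * (tb[jx] - tb[jx - 1])
```

Out-of-range inputs are reset to the table extrema, exactly the behavior the `fpvsl` documentation block describes; the deleted module builds a 7501-entry table over 180–330 K this way.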
Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute saturation vapor pressure from the temperature. -! A quadratic interpolation is done between values in a lookup table -! computed in gpvsl. See documentation for fpvslx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is almost 9 decimal places. -! On the Cray, fpvslq is about 3 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! -! Usage: pvsl=fpvslq(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvslq Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvslq - real(krealfp),intent(in):: t - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpvsl+c2xpvsl*t,1._krealfp),real(nxpvsl,krealfp)) - jx=min(max(nint(xj),2),nxpvsl-1) - dxj=xj-jx - fj1=tbpvsl(jx-1) - fj2=tbpvsl(jx) - fj3=tbpvsl(jx+1) - fpvslq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpvslx(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvslx Compute saturation vapor pressure over liquid -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute saturation vapor pressure from temperature. -! The water model assumes a perfect gas, constant specific heats -! for gas and liquid, and neglects the volume of the liquid. -! The model does account for the variation of the latent heat -! of condensation with temperature. 
The ice option is not included. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formula -! pvsl=con_psat*(tr**xa)*exp(xb*(1.-tr)) -! where tr is ttp/t and other values are physical constants. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! -! Usage: pvsl=fpvslx(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvslx Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvslx - real(krealfp),intent(in):: t - real(krealfp),parameter:: dldt=con_cvap-con_cliq - real(krealfp),parameter:: heat=con_hvap - real(krealfp),parameter:: xpona=-dldt/con_rv - real(krealfp),parameter:: xponb=-dldt/con_rv+heat/(con_rv*con_ttp) - real(krealfp) tr -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tr=con_ttp/t - fpvslx=con_psat*(tr**xpona)*exp(xponb*(1.-tr)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gpvsi -!$$$ Subprogram Documentation Block -! -! Subprogram: gpvsi Compute saturation vapor pressure table over ice -! Author: N Phillips W/NMC2X2 Date: 30 dec 82 -! -! Abstract: Computes saturation vapor pressure table as a function of -! temperature for the table lookup function fpvsi. -! Exact saturation vapor pressures are calculated in subprogram fpvsix. -! The current implementation computes a table with a length -! of 7501 for temperatures ranging from 180. to 330. Kelvin. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: call gpvsi -! -! Subprograms called: -! 
(fpvsix) inlinable function to compute saturation vapor pressure -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,x,t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=180.0_krealfp - xmax=330.0_krealfp - xinc=(xmax-xmin)/(nxpvsi-1) -! c1xpvsi=1.-xmin/xinc - c2xpvsi=1./xinc - c1xpvsi=1.-xmin*c2xpvsi - do jx=1,nxpvsi - x=xmin+(jx-1)*xinc - t=x - tbpvsi(jx)=fpvsix(t) - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function fpvsi(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvsi Compute saturation vapor pressure over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute saturation vapor pressure from the temperature. -! A linear interpolation is done between values in a lookup table -! computed in gpvsi. See documentation for fpvsix for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is almost 6 decimal places. -! On the Cray, fpvsi is about 4 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: pvsi=fpvsi(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvsi Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvsi - real(krealfp),intent(in):: t - integer jx - real(krealfp) xj -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpvsi+c2xpvsi*t,1._krealfp),real(nxpvsi,krealfp)) - jx=min(xj,nxpvsi-1._krealfp) - fpvsi=tbpvsi(jx)+(xj-jx)*(tbpvsi(jx+1)-tbpvsi(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpvsiq(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvsiq Compute saturation vapor pressure over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute saturation vapor pressure from the temperature. -! A quadratic interpolation is done between values in a lookup table -! computed in gpvsi. See documentation for fpvsix for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is almost 9 decimal places. -! On the Cray, fpvsiq is about 3 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: pvsi=fpvsiq(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvsiq Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvsiq - real(krealfp),intent(in):: t - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpvsi+c2xpvsi*t,1._krealfp),real(nxpvsi,krealfp)) - jx=min(max(nint(xj),2),nxpvsi-1) - dxj=xj-jx - fj1=tbpvsi(jx-1) - fj2=tbpvsi(jx) - fj3=tbpvsi(jx+1) - fpvsiq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpvsix(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvsix Compute saturation vapor pressure over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute saturation vapor pressure from temperature. -! The water model assumes a perfect gas, constant specific heats -! for gas and ice, and neglects the volume of the ice. -! The model does account for the variation of the latent heat -! of condensation with temperature. The liquid option is not included. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formula -! pvsi=con_psat*(tr**xa)*exp(xb*(1.-tr)) -! where tr is ttp/t and other values are physical constants. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: pvsi=fpvsix(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvsix Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvsix - real(krealfp),intent(in):: t - real(krealfp),parameter:: dldt=con_cvap-con_csol - real(krealfp),parameter:: heat=con_hvap+con_hfus - real(krealfp),parameter:: xpona=-dldt/con_rv - real(krealfp),parameter:: xponb=-dldt/con_rv+heat/(con_rv*con_ttp) - real(krealfp) tr -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tr=con_ttp/t - fpvsix=con_psat*(tr**xpona)*exp(xponb*(1.-tr)) -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gpvs -!$$$ Subprogram Documentation Block -! -! Subprogram: gpvs Compute saturation vapor pressure table -! Author: N Phillips W/NMC2X2 Date: 30 dec 82 -! -! Abstract: Computes saturation vapor pressure table as a function of -! temperature for the table lookup function fpvs. -! Exact saturation vapor pressures are calculated in subprogram fpvsx. -! The current implementation computes a table with a length -! of 7501 for temperatures ranging from 180. to 330. Kelvin. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: call gpvs -! -! Subprograms called: -! (fpvsx) inlinable function to compute saturation vapor pressure -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,x,t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=180.0_krealfp - xmax=330.0_krealfp - xinc=(xmax-xmin)/(nxpvs-1) -! c1xpvs=1.-xmin/xinc - c2xpvs=1./xinc - c1xpvs=1.-xmin*c2xpvs - do jx=1,nxpvs - x=xmin+(jx-1)*xinc - t=x - tbpvs(jx)=fpvsx(t) - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function fpvs(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvs Compute saturation vapor pressure -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute saturation vapor pressure from the temperature. -! A linear interpolation is done between values in a lookup table -! computed in gpvs. See documentation for fpvsx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is almost 6 decimal places. 
-! On the Cray, fpvs is about 4 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: pvs=fpvs(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvs Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvs - real(krealfp),intent(in):: t - integer jx - real(krealfp) xj -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpvs+c2xpvs*t,1._krealfp),real(nxpvs,krealfp)) - jx=min(xj,nxpvs-1._krealfp) - fpvs=tbpvs(jx)+(xj-jx)*(tbpvs(jx+1)-tbpvs(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpvsq(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvsq Compute saturation vapor pressure -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute saturation vapor pressure from the temperature. -! A quadratic interpolation is done between values in a lookup table -! computed in gpvs. See documentation for fpvsx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is almost 9 decimal places. -! On the Cray, fpvsq is about 3 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: pvs=fpvsq(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! 
fpvsq Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvsq - real(krealfp),intent(in):: t - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpvs+c2xpvs*t,1._krealfp),real(nxpvs,krealfp)) - jx=min(max(nint(xj),2),nxpvs-1) - dxj=xj-jx - fj1=tbpvs(jx-1) - fj2=tbpvs(jx) - fj3=tbpvs(jx+1) - fpvsq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpvsx(t) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpvsx Compute saturation vapor pressure -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute saturation vapor pressure from temperature. -! The saturation vapor pressure over either liquid or ice is computed -! over liquid for temperatures above the triple point, -! over ice for temperatures 20 degrees below the triple point, -! and a linear combination of the two for temperatures in between. -! The water model assumes a perfect gas, constant specific heats -! for gas, liquid and ice, and neglects the volume of the condensate. -! The model does account for the variation of the latent heat -! of condensation and sublimation with temperature. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formula -! pvsl=con_psat*(tr**xa)*exp(xb*(1.-tr)) -! where tr is ttp/t and other values are physical constants. -! The reference for this computation is Emanuel (1994), pages 116-117. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -!
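The blended liquid/ice computation that fpvsx documents above (over liquid above the triple point, over ice more than 20 K below it, and a linear weight in between) can be sketched in Python; the constant values are assumptions patterned on NCEP's physcons, not taken from this patch:

```python
import math

# Assumed NCEP-style constants for this sketch
con_ttp, con_psat, con_rv = 273.16, 611.21, 461.50
con_cvap, con_cliq, con_csol = 1846.0, 4190.0, 2106.0
con_hvap, con_hfus = 2.501e6, 3.3358e5

def fpvsx(t):
    """Exact pvs: liquid above ttp, ice below ttp-20, linear blend in between."""
    tliq, tice = con_ttp, con_ttp - 20.0
    dldtl = con_cvap - con_cliq                      # liquid branch exponents
    xponal = -dldtl / con_rv
    xponbl = xponal + con_hvap / (con_rv * con_ttp)
    dldti = con_cvap - con_csol                      # ice branch exponents
    xponai = -dldti / con_rv
    xponbi = xponai + (con_hvap + con_hfus) / (con_rv * con_ttp)
    tr = con_ttp / t
    pvl = con_psat * tr**xponal * math.exp(xponbl * (1.0 - tr))
    pvi = con_psat * tr**xponai * math.exp(xponbi * (1.0 - tr))
    if t >= tliq:
        return pvl
    if t < tice:
        return pvi
    w = (t - tice) / (tliq - tice)   # 0 at the ice edge, 1 at the triple point
    return w * pvl + (1.0 - w) * pvi
```

At t = con_ttp the weight collapses to the pure liquid branch and the formula returns con_psat exactly, since tr = 1.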
Usage: pvs=fpvsx(t) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! -! Output argument list: -! fpvsx Real(krealfp) saturation vapor pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpvsx - real(krealfp),intent(in):: t - real(krealfp),parameter:: tliq=con_ttp - real(krealfp),parameter:: tice=con_ttp-20.0 - real(krealfp),parameter:: dldtl=con_cvap-con_cliq - real(krealfp),parameter:: heatl=con_hvap - real(krealfp),parameter:: xponal=-dldtl/con_rv - real(krealfp),parameter:: xponbl=-dldtl/con_rv+heatl/(con_rv*con_ttp) - real(krealfp),parameter:: dldti=con_cvap-con_csol - real(krealfp),parameter:: heati=con_hvap+con_hfus - real(krealfp),parameter:: xponai=-dldti/con_rv - real(krealfp),parameter:: xponbi=-dldti/con_rv+heati/(con_rv*con_ttp) - real(krealfp) tr,w,pvl,pvi -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tr=con_ttp/t - if(t.ge.tliq) then - fpvsx=con_psat*(tr**xponal)*exp(xponbl*(1.-tr)) - elseif(t.lt.tice) then - fpvsx=con_psat*(tr**xponai)*exp(xponbi*(1.-tr)) - else - w=(t-tice)/(tliq-tice) - pvl=con_psat*(tr**xponal)*exp(xponbl*(1.-tr)) - pvi=con_psat*(tr**xponai)*exp(xponbi*(1.-tr)) - fpvsx=w*pvl+(1.-w)*pvi - endif -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gtdpl -!$$$ Subprogram Documentation Block -! -! Subprogram: gtdpl Compute dewpoint temperature over liquid table -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature table as a function of -! vapor pressure for inlinable function ftdpl. -! Exact dewpoint temperatures are calculated in subprogram ftdplxg. -! The current implementation computes a table with a length -! of 5001 for vapor pressures ranging from 1 to 10001 Pascals -! giving a dewpoint temperature range of 208 to 319 Kelvin. -! -! 
Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: call gtdpl -! -! Subprograms called: -! (ftdplxg) inlinable function to compute dewpoint temperature over liquid -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,t,x,pv -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=1 - xmax=10001 - xinc=(xmax-xmin)/(nxtdpl-1) - c1xtdpl=1.-xmin/xinc - c2xtdpl=1./xinc - t=208.0 - do jx=1,nxtdpl - x=xmin+(jx-1)*xinc - pv=x - t=ftdplxg(t,pv) - tbtdpl(jx)=t - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function ftdpl(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpl Compute dewpoint temperature over liquid -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature from vapor pressure. -! A linear interpolation is done between values in a lookup table -! computed in gtdpl. See documentation for ftdplxg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.0005 Kelvin -! for dewpoint temperatures greater than 250 Kelvin, -! but decreases to 0.02 Kelvin for a dewpoint around 230 Kelvin. -! On the Cray, ftdpl is about 75 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: tdpl=ftdpl(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpl Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! 
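All of the *q variants (fpvslq, fpvsiq, fpvsq, ftdplq, ftdpiq, ftdpq) share one three-point quadratic interpolation stencil around the nearest table node. A standalone Python sketch of that stencil:

```python
def quad3(fj1, fj2, fj3, dxj):
    """Quadratic interpolation through three equally spaced table values
    fj1, fj2, fj3 (at offsets -1, 0, +1 from the nearest node); dxj is the
    signed fractional offset from the middle node, as in the Fortran *q code."""
    return (((fj3 + fj1) / 2 - fj2) * dxj + (fj3 - fj1) / 2) * dxj + fj2
```

The stencil is exact for any quadratic through the three nodes, which is why the *q functions gain roughly three decimal places of accuracy over the corresponding linear lookups.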
-!$$$ - implicit none - real(krealfp) ftdpl - real(krealfp),intent(in):: pv - integer jx - real(krealfp) xj -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtdpl+c2xtdpl*pv,1._krealfp),real(nxtdpl,krealfp)) - jx=min(xj,nxtdpl-1._krealfp) - ftdpl=tbtdpl(jx)+(xj-jx)*(tbtdpl(jx+1)-tbtdpl(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdplq(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdplq Compute dewpoint temperature over liquid -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature from vapor pressure. -! A quadratic interpolation is done between values in a lookup table -! computed in gtdpl. See documentation for ftdplxg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.00001 Kelvin -! for dewpoint temperatures greater than 250 Kelvin, -! but decreases to 0.002 Kelvin for a dewpoint around 230 Kelvin. -! On the Cray, ftdplq is about 60 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! -! Usage: tdpl=ftdplq(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdplq Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdplq - real(krealfp),intent(in):: pv - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtdpl+c2xtdpl*pv,1._krealfp),real(nxtdpl,krealfp)) - jx=min(max(nint(xj),2),nxtdpl-1) - dxj=xj-jx - fj1=tbtdpl(jx-1) - fj2=tbtdpl(jx) - fj3=tbtdpl(jx+1) - ftdplq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdplx(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdplx Compute dewpoint temperature over liquid -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute dewpoint temperature from vapor pressure. -! An approximate dewpoint temperature for function ftdplxg -! is obtained using ftdpl, so gtdpl must already have been called. -! See documentation for ftdplxg for details. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! -! Usage: tdpl=ftdplx(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdplx Real(krealfp) dewpoint temperature in Kelvin -! -! Subprograms called: -! (ftdpl) inlinable function to compute dewpoint temperature over liquid -! (ftdplxg) inlinable function to compute dewpoint temperature over liquid -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdplx - real(krealfp),intent(in):: pv - real(krealfp) tg -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tg=ftdpl(pv) - ftdplx=ftdplxg(tg,pv) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdplxg(tg,pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdplxg Compute dewpoint temperature over liquid -!
Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute dewpoint temperature from vapor pressure. -! A guess dewpoint temperature must be provided. -! The water model assumes a perfect gas, constant specific heats -! for gas and liquid, and neglects the volume of the liquid. -! The model does account for the variation of the latent heat -! of condensation with temperature. The ice option is not included. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formula -! pvs=con_psat*(tr**xa)*exp(xb*(1.-tr)) -! where tr is ttp/t and other values are physical constants. -! The formula is inverted by iterating Newtonian approximations -! for each pvs until t is found to within 1.e-6 Kelvin. -! This function can be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! -! Usage: tdpl=ftdplxg(tg,pv) -! -! Input argument list: -! tg Real(krealfp) guess dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdplxg Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdplxg - real(krealfp),intent(in):: tg,pv - real(krealfp),parameter:: terrm=1.e-6 - real(krealfp),parameter:: dldt=con_cvap-con_cliq - real(krealfp),parameter:: heat=con_hvap - real(krealfp),parameter:: xpona=-dldt/con_rv - real(krealfp),parameter:: xponb=-dldt/con_rv+heat/(con_rv*con_ttp) - real(krealfp) t,tr,pvt,el,dpvt,terr - integer i -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - t=tg - do i=1,100 - tr=con_ttp/t - pvt=con_psat*(tr**xpona)*exp(xponb*(1.-tr)) - el=heat+dldt*(t-con_ttp) - dpvt=el*pvt/(con_rv*t**2) - terr=(pvt-pv)/dpvt - t=t-terr - if(abs(terr).le.terrm) exit - enddo - ftdplxg=t -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gtdpi -!$$$ Subprogram Documentation Block -! -! Subprogram: gtdpi Compute dewpoint temperature over ice table -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature table as a function of -! vapor pressure for inlinable function ftdpi. -! Exact dewpoint temperatures are calculated in subprogram ftdpixg. -! The current implementation computes a table with a length -! of 5001 for vapor pressures ranging from 0.1 to 1000.1 Pascals -! giving a dewpoint temperature range of 197 to 279 Kelvin. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: call gtdpi -! -! Subprograms called: -! (ftdpixg) inlinable function to compute dewpoint temperature over ice -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,t,x,pv -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=0.1 - xmax=1000.1 - xinc=(xmax-xmin)/(nxtdpi-1) - c1xtdpi=1.-xmin/xinc - c2xtdpi=1./xinc - t=197.0 - do jx=1,nxtdpi - x=xmin+(jx-1)*xinc - pv=x - t=ftdpixg(t,pv) - tbtdpi(jx)=t - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function ftdpi(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpi Compute dewpoint temperature over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature from vapor pressure. -! A linear interpolation is done between values in a lookup table -! computed in gtdpi. See documentation for ftdpixg for details. -! Input values outside table range are reset to table extrema. 
-! The interpolation accuracy is better than 0.0005 Kelvin -! for dewpoint temperatures greater than 250 Kelvin, -! but decreases to 0.02 Kelvin for a dewpoint around 230 Kelvin. -! On the Cray, ftdpi is about 75 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdpi=ftdpi(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpi Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdpi - real(krealfp),intent(in):: pv - integer jx - real(krealfp) xj -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtdpi+c2xtdpi*pv,1._krealfp),real(nxtdpi,krealfp)) - jx=min(xj,nxtdpi-1._krealfp) - ftdpi=tbtdpi(jx)+(xj-jx)*(tbtdpi(jx+1)-tbtdpi(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdpiq(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpiq Compute dewpoint temperature over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature from vapor pressure. -! A quadratic interpolation is done between values in a lookup table -! computed in gtdpi. See documentation for ftdpixg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.00001 Kelvin -! for dewpoint temperatures greater than 250 Kelvin, -! but decreases to 0.002 Kelvin for a dewpoint around 230 Kelvin. -! On the Cray, ftdpiq is about 60 times faster than exact calculation. -!
This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdpi=ftdpiq(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpiq Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdpiq - real(krealfp),intent(in):: pv - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtdpi+c2xtdpi*pv,1._krealfp),real(nxtdpi,krealfp)) - jx=min(max(nint(xj),2),nxtdpi-1) - dxj=xj-jx - fj1=tbtdpi(jx-1) - fj2=tbtdpi(jx) - fj3=tbtdpi(jx+1) - ftdpiq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdpix(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpix Compute dewpoint temperature over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute dewpoint temperature from vapor pressure. -! An approximate dewpoint temperature for function ftdpixg -! is obtained using ftdpi, so gtdpi must already have been called. -! See documentation for ftdpixg for details. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdpi=ftdpix(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpix Real(krealfp) dewpoint temperature in Kelvin -! -! Subprograms called: -! (ftdpi) inlinable function to compute dewpoint temperature over ice -!
(ftdpixg) inlinable function to compute dewpoint temperature over ice -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdpix - real(krealfp),intent(in):: pv - real(krealfp) tg -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tg=ftdpi(pv) - ftdpix=ftdpixg(tg,pv) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdpixg(tg,pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpixg Compute dewpoint temperature over ice -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute dewpoint temperature from vapor pressure. -! A guess dewpoint temperature must be provided. -! The water model assumes a perfect gas, constant specific heats -! for gas and ice, and neglects the volume of the ice. -! The model does account for the variation of the latent heat -! of sublimation with temperature. The liquid option is not included. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formula -! pvs=con_psat*(tr**xa)*exp(xb*(1.-tr)) -! where tr is ttp/t and other values are physical constants. -! The formula is inverted by iterating Newtonian approximations -! for each pvs until t is found to within 1.e-6 Kelvin. -! This function can be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdpi=ftdpixg(tg,pv) -! -! Input argument list: -! tg Real(krealfp) guess dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpixg Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! 
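The Newton inversion that ftdpixg documents above (iterate t until pvs(t) matches the given pv, to within 1.e-6 Kelvin) can be sketched in Python; the constant values are assumptions patterned on NCEP's physcons, not taken from this patch:

```python
import math

# Assumed NCEP-style constants (ice phase) for this sketch
con_ttp, con_psat, con_rv = 273.16, 611.21, 461.50
con_cvap, con_csol = 1846.0, 2106.0
con_hvap, con_hfus = 2.501e6, 3.3358e5

def ftdpixg(tg, pv):
    """Invert pvs(t) = pv for t by Newton iteration, starting from guess tg."""
    dldt = con_cvap - con_csol
    heat = con_hvap + con_hfus
    xpona = -dldt / con_rv
    xponb = xpona + heat / (con_rv * con_ttp)
    t = tg
    for _ in range(100):
        tr = con_ttp / t
        pvt = con_psat * tr**xpona * math.exp(xponb * (1.0 - tr))
        el = heat + dldt * (t - con_ttp)       # latent heat at temperature t
        dpvt = el * pvt / (con_rv * t**2)      # d(pvs)/dt from Clausius-Clapeyron
        terr = (pvt - pv) / dpvt               # Newton step in temperature
        t -= terr
        if abs(terr) <= 1.0e-6:
            break
    return t
```

Each step divides the pressure error by the local Clausius-Clapeyron slope, so from the table-based first guess the iteration typically converges in a handful of steps.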
-!$$$ - implicit none - real(krealfp) ftdpixg - real(krealfp),intent(in):: tg,pv - real(krealfp),parameter:: terrm=1.e-6 - real(krealfp),parameter:: dldt=con_cvap-con_csol - real(krealfp),parameter:: heat=con_hvap+con_hfus - real(krealfp),parameter:: xpona=-dldt/con_rv - real(krealfp),parameter:: xponb=-dldt/con_rv+heat/(con_rv*con_ttp) - real(krealfp) t,tr,pvt,el,dpvt,terr - integer i -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - t=tg - do i=1,100 - tr=con_ttp/t - pvt=con_psat*(tr**xpona)*exp(xponb*(1.-tr)) - el=heat+dldt*(t-con_ttp) - dpvt=el*pvt/(con_rv*t**2) - terr=(pvt-pv)/dpvt - t=t-terr - if(abs(terr).le.terrm) exit - enddo - ftdpixg=t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gtdp -!$$$ Subprogram Documentation Block -! -! Subprogram: gtdp Compute dewpoint temperature table -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature table as a function of -! vapor pressure for inlinable function ftdp. -! Exact dewpoint temperatures are calculated in subprogram ftdpxg. -! The current implementation computes a table with a length -! of 5001 for vapor pressures ranging from 0.5 to 10000.5 Pascals -! giving a dewpoint temperature range of 208 to 319 Kelvin. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: call gtdp -! -! Subprograms called: -! (ftdpxg) inlinable function to compute dewpoint temperature -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,t,x,pv -!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=0.5 - xmax=10000.5 - xinc=(xmax-xmin)/(nxtdp-1) - c1xtdp=1.-xmin/xinc - c2xtdp=1./xinc - t=208.0 - do jx=1,nxtdp - x=xmin+(jx-1)*xinc - pv=x - t=ftdpxg(t,pv) - tbtdp(jx)=t - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function ftdp(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdp Compute dewpoint temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature from vapor pressure. -! A linear interpolation is done between values in a lookup table -! computed in gtdp. See documentation for ftdpxg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.0005 Kelvin -! for dewpoint temperatures greater than 250 Kelvin, -! but decreases to 0.02 Kelvin for a dewpoint around 230 Kelvin. -! On the Cray, ftdp is about 75 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdp=ftdp(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdp Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdp - real(krealfp),intent(in):: pv - integer jx - real(krealfp) xj -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtdp+c2xtdp*pv,1._krealfp),real(nxtdp,krealfp)) - jx=min(xj,nxtdp-1._krealfp) - ftdp=tbtdp(jx)+(xj-jx)*(tbtdp(jx+1)-tbtdp(jx)) -! 
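The gtdp/ftdp pair above illustrates the module's general pattern: tabulate the exact function once on a uniform grid, then answer queries with a clamped linear interpolation driven by two precomputed index coefficients. A minimal Python sketch of that pattern (all names here are illustrative, not part of the Fortran module):

```python
# Sketch of the gtdp/ftdp table-lookup pattern: precompute exact values on a
# uniform grid, then interpolate linearly with indices clamped to the table.
# All names are illustrative, not part of the Fortran module.

def build_table(f_exact, xmin, xmax, n):
    """Tabulate f_exact at n evenly spaced points; return the table plus the
    c1/c2 coefficients that map an argument x to a 1-based fractional index."""
    xinc = (xmax - xmin) / (n - 1)
    c1 = 1.0 - xmin / xinc          # same role as c1xtdp
    c2 = 1.0 / xinc                 # same role as c2xtdp
    table = [f_exact(xmin + j * xinc) for j in range(n)]
    return table, c1, c2

def lookup_linear(table, c1, c2, x):
    """Clamped linear interpolation, mirroring ftdp's index arithmetic."""
    n = len(table)
    xj = min(max(c1 + c2 * x, 1.0), float(n))   # clamp to table range
    jx = min(int(xj), n - 1)                    # left node, truncated like ftdp
    return table[jx - 1] + (xj - jx) * (table[jx] - table[jx - 1])
```

A linear function is reproduced exactly, which is a quick sanity check on the index arithmetic and the clamping at both table ends.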
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdpq(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpq Compute dewpoint temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute dewpoint temperature from vapor pressure. -! A quadratic interpolation is done between values in a lookup table -! computed in gtdp. See documentation for ftdpxg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.00001 Kelvin -! for dewpoint temperatures greater than 250 Kelvin, -! but decreases to 0.002 Kelvin for a dewpoint around 230 Kelvin. -! On the Cray, ftdpq is about 60 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdp=ftdpq(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpq Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdpq - real(krealfp),intent(in):: pv - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtdp+c2xtdp*pv,1._krealfp),real(nxtdp,krealfp)) - jx=min(max(nint(xj),2),nxtdp-1) - dxj=xj-jx - fj1=tbtdp(jx-1) - fj2=tbtdp(jx) - fj3=tbtdp(jx+1) - ftdpq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -!
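The Horner-form stencil in ftdpq, (((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2, is the Lagrange quadratic through the three table nodes at dxj = -1, 0, +1 (the same stencil reappears in the biquadratic routines). A small Python check that the stencil reproduces any quadratic exactly (function names are illustrative):

```python
# The three-point stencil used by ftdpq and the biquadratic routines is the
# Lagrange quadratic through nodes dxj = -1, 0, +1, written in Horner form.
# Illustrative code, not part of the Fortran module.

def quad3(f1, f2, f3, dx):
    """Quadratic interpolation through (-1, f1), (0, f2), (+1, f3)."""
    return (((f3 + f1) / 2 - f2) * dx + (f3 - f1) / 2) * dx + f2
```

For q(x) = a*x**2 + b*x + c sampled at -1, 0, +1, the stencil recovers a = (f3+f1)/2 - f2 and b = (f3-f1)/2, so any quadratic is interpolated without error.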
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdpx(pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpx Compute dewpoint temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute dewpoint temperature from vapor pressure. -! An approximate dewpoint temperature for function ftdpxg -! is obtained using ftdp so gtdp must be already called. -! See documentation for ftdpxg for details. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdp=ftdpx(pv) -! -! Input argument list: -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpx Real(krealfp) dewpoint temperature in Kelvin -! -! Subprograms called: -! (ftdp) inlinable function to compute dewpoint temperature -! (ftdpxg) inlinable function to compute dewpoint temperature -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftdpx - real(krealfp),intent(in):: pv - real(krealfp) tg -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tg=ftdp(pv) - ftdpx=ftdpxg(tg,pv) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftdpxg(tg,pv) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftdpxg Compute dewpoint temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute dewpoint temperature from vapor pressure. -! A guess dewpoint temperature must be provided. -! The saturation vapor pressure over either liquid or ice is computed -! over liquid for temperatures above the triple point, -!
over ice for temperatures more than 20 degrees below the triple point, -! and a linear combination of the two for temperatures in between. -! The water model assumes a perfect gas, constant specific heats -! for gas, liquid and ice, and neglects the volume of the condensate. -! The model does account for the variation of the latent heat -! of condensation and sublimation with temperature. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formula -! pvsl=con_psat*(tr**xa)*exp(xb*(1.-tr)) -! where tr is ttp/t and other values are physical constants. -! The reference for this decision is Emanuel (1994), pages 116-117. -! The formula is inverted by iterating Newtonian approximations -! for each pvs until t is found to within 1.e-6 Kelvin. -! This function can be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! 2001-02-26 Iredell ice phase -! -! Usage: tdp=ftdpxg(tg,pv) -! -! Input argument list: -! tg Real(krealfp) guess dewpoint temperature in Kelvin -! pv Real(krealfp) vapor pressure in Pascals -! -! Output argument list: -! ftdpxg Real(krealfp) dewpoint temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -!
-!$$$ - implicit none - real(krealfp) ftdpxg - real(krealfp),intent(in):: tg,pv - real(krealfp),parameter:: terrm=1.e-6 - real(krealfp),parameter:: tliq=con_ttp - real(krealfp),parameter:: tice=con_ttp-20.0 - real(krealfp),parameter:: dldtl=con_cvap-con_cliq - real(krealfp),parameter:: heatl=con_hvap - real(krealfp),parameter:: xponal=-dldtl/con_rv - real(krealfp),parameter:: xponbl=-dldtl/con_rv+heatl/(con_rv*con_ttp) - real(krealfp),parameter:: dldti=con_cvap-con_csol - real(krealfp),parameter:: heati=con_hvap+con_hfus - real(krealfp),parameter:: xponai=-dldti/con_rv - real(krealfp),parameter:: xponbi=-dldti/con_rv+heati/(con_rv*con_ttp) - real(krealfp) t,tr,w,pvtl,pvti,pvt,ell,eli,el,dpvt,terr - integer i -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - t=tg - do i=1,100 - tr=con_ttp/t - if(t.ge.tliq) then - pvt=con_psat*(tr**xponal)*exp(xponbl*(1.-tr)) - el=heatl+dldtl*(t-con_ttp) - dpvt=el*pvt/(con_rv*t**2) - elseif(t.lt.tice) then - pvt=con_psat*(tr**xponai)*exp(xponbi*(1.-tr)) - el=heati+dldti*(t-con_ttp) - dpvt=el*pvt/(con_rv*t**2) - else - w=(t-tice)/(tliq-tice) - pvtl=con_psat*(tr**xponal)*exp(xponbl*(1.-tr)) - pvti=con_psat*(tr**xponai)*exp(xponbi*(1.-tr)) - pvt=w*pvtl+(1.-w)*pvti - ell=heatl+dldtl*(t-con_ttp) - eli=heati+dldti*(t-con_ttp) - dpvt=(w*ell*pvtl+(1.-w)*eli*pvti)/(con_rv*t**2) - endif - terr=(pvt-pv)/dpvt - t=t-terr - if(abs(terr).le.terrm) exit - enddo - ftdpxg=t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gthe -!$$$ Subprogram Documentation Block -! -! Subprogram: gthe Compute equivalent potential temperature table -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute equivalent potential temperature table -! as a function of LCL temperature and pressure over 1e5 Pa -! to the kappa power for function fthe. -! 
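The Newton loop in ftdpxg can be sketched in Python for the liquid branch alone: invert pvs(t) = pv using dpvs/dt = el*pvs/(rv*t**2). The constant values below are representative of the GFS physcons parameters (an assumption, not taken from this file), and the function names are illustrative:

```python
import math

# Newton inversion of the integrated Clausius-Clapeyron relation, as in the
# liquid (t >= tliq) branch of ftdpxg. Constants are representative values of
# the GFS physcons parameters (assumption, not taken from this file).

TTP, PSAT, RV = 273.16, 610.78, 461.50
CVAP, CLIQ, HVAP = 1846.0, 4185.5, 2.5e6
DLDT = CVAP - CLIQ                      # plays the role of dldtl
XPA = -DLDT / RV                        # plays the role of xponal
XPB = -DLDT / RV + HVAP / (RV * TTP)    # plays the role of xponbl

def pvs_liquid(t):
    """Saturation vapor pressure over liquid, pvs = psat*tr**xa*exp(xb*(1-tr))."""
    tr = TTP / t
    return PSAT * tr**XPA * math.exp(XPB * (1.0 - tr))

def dewpoint(pv, tg=273.0, tol=1e-6):
    """Newton iteration for the dewpoint, mirroring ftdpxg's update step."""
    t = tg
    for _ in range(100):
        pvt = pvs_liquid(t)
        el = HVAP + DLDT * (t - TTP)        # temperature-dependent latent heat
        dpvt = el * pvt / (RV * t * t)      # d(pvs)/dt from Clausius-Clapeyron
        terr = (pvt - pv) / dpvt            # Newton step in temperature
        t -= terr
        if abs(terr) <= tol:
            break
    return t
```

Because the stopping test is on the temperature increment terr, the converged t matches the exact root to roughly the same 1.e-6 Kelvin tolerance the Fortran uses.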
Equivalent potential temperatures are calculated in subprogram fthex. -! The current implementation computes a table with a first dimension -! of 241 for temperatures ranging from 183.16 to 303.16 Kelvin -! and a second dimension of 151 for pressure over 1e5 Pa -! to the kappa power ranging from 0.04**rocp to 1.10**rocp. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: call gthe -! -! Subprograms called: -! (fthex) inlinable function to compute equiv. pot. temperature -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx,jy - real(krealfp) xmin,xmax,ymin,ymax,xinc,yinc,x,y,pk,t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=con_ttp-90._krealfp - xmax=con_ttp+30._krealfp - ymin=0.04_krealfp**con_rocp - ymax=1.10_krealfp**con_rocp - xinc=(xmax-xmin)/(nxthe-1) - c1xthe=1.-xmin/xinc - c2xthe=1./xinc - yinc=(ymax-ymin)/(nythe-1) - c1ythe=1.-ymin/yinc - c2ythe=1./yinc - do jy=1,nythe - y=ymin+(jy-1)*yinc - pk=y - do jx=1,nxthe - x=xmin+(jx-1)*xinc - t=x - tbthe(jx,jy)=fthex(t,pk) - enddo - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function fthe(t,pk) -!$$$ Subprogram Documentation Block -! -! Subprogram: fthe Compute equivalent potential temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute equivalent potential temperature at the LCL -! from temperature and pressure over 1e5 Pa to the kappa power. -! A bilinear interpolation is done between values in a lookup table -! computed in gthe. See documentation for fthex for details. -! Input values outside table range are reset to table extrema, -! except zero is returned for too cold or high LCLs. -! The interpolation accuracy is better than 0.01 Kelvin. -!
On the Cray, fthe is almost 6 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: the=fthe(t,pk) -! -! Input argument list: -! t Real(krealfp) LCL temperature in Kelvin -! pk Real(krealfp) LCL pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! fthe Real(krealfp) equivalent potential temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fthe - real(krealfp),intent(in):: t,pk - integer jx,jy - real(krealfp) xj,yj,ftx1,ftx2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(c1xthe+c2xthe*t,real(nxthe,krealfp)) - yj=min(c1ythe+c2ythe*pk,real(nythe,krealfp)) - if(xj.ge.1..and.yj.ge.1.) then - jx=min(xj,nxthe-1._krealfp) - jy=min(yj,nythe-1._krealfp) - ftx1=tbthe(jx,jy)+(xj-jx)*(tbthe(jx+1,jy)-tbthe(jx,jy)) - ftx2=tbthe(jx,jy+1)+(xj-jx)*(tbthe(jx+1,jy+1)-tbthe(jx,jy+1)) - fthe=ftx1+(yj-jy)*(ftx2-ftx1) - else - fthe=0. - endif -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftheq(t,pk) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftheq Compute equivalent potential temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute equivalent potential temperature at the LCL -! from temperature and pressure over 1e5 Pa to the kappa power. -! A biquadratic interpolation is done between values in a lookup table -! computed in gthe. See documentation for fthex for details. -! Input values outside table range are reset to table extrema, -! except zero is returned for too cold or high LCLs. -! The interpolation accuracy is better than 0.0002 Kelvin. -!
On the Cray, ftheq is almost 3 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! -! Usage: the=ftheq(t,pk) -! -! Input argument list: -! t Real(krealfp) LCL temperature in Kelvin -! pk Real(krealfp) LCL pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! ftheq Real(krealfp) equivalent potential temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftheq - real(krealfp),intent(in):: t,pk - integer jx,jy - real(krealfp) xj,yj,dxj,dyj - real(krealfp) ft11,ft12,ft13,ft21,ft22,ft23,ft31,ft32,ft33 - real(krealfp) ftx1,ftx2,ftx3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(c1xthe+c2xthe*t,real(nxthe,krealfp)) - yj=min(c1ythe+c2ythe*pk,real(nythe,krealfp)) - if(xj.ge.1..and.yj.ge.1.) then - jx=min(max(nint(xj),2),nxthe-1) - jy=min(max(nint(yj),2),nythe-1) - dxj=xj-jx - dyj=yj-jy - ft11=tbthe(jx-1,jy-1) - ft12=tbthe(jx-1,jy) - ft13=tbthe(jx-1,jy+1) - ft21=tbthe(jx,jy-1) - ft22=tbthe(jx,jy) - ft23=tbthe(jx,jy+1) - ft31=tbthe(jx+1,jy-1) - ft32=tbthe(jx+1,jy) - ft33=tbthe(jx+1,jy+1) - ftx1=(((ft31+ft11)/2-ft21)*dxj+(ft31-ft11)/2)*dxj+ft21 - ftx2=(((ft32+ft12)/2-ft22)*dxj+(ft32-ft12)/2)*dxj+ft22 - ftx3=(((ft33+ft13)/2-ft23)*dxj+(ft33-ft13)/2)*dxj+ft23 - ftheq=(((ftx3+ftx1)/2-ftx2)*dyj+(ftx3-ftx1)/2)*dyj+ftx2 - else - ftheq=0. - endif -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- -! elemental function fthex(t,pk) - function fthex(t,pk) -!$$$ Subprogram Documentation Block -! -! Subprogram: fthex Compute equivalent potential temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! 
Abstract: Exactly compute equivalent potential temperature at the LCL -! from temperature and pressure over 1e5 Pa to the kappa power. -! Equivalent potential temperature is constant for a saturated parcel -! rising adiabatically up a moist adiabat when the heat and mass -! of the condensed water are neglected. Ice is also neglected. -! The formula for equivalent potential temperature (Holton) is -! the=t*(pd**(-rocp))*exp(el*eps*pv/(cp*t*pd)) -! where t is the temperature, pv is the saturated vapor pressure, -! pd is the dry pressure p-pv, el is the temperature dependent -! latent heat of condensation hvap+dldt*(t-ttp), and other values -! are physical constants defined in parameter statements in the code. -! Zero is returned if the input values make saturation impossible. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! -! Usage: the=fthex(t,pk) -! -! Input argument list: -! t Real(krealfp) LCL temperature in Kelvin -! pk Real(krealfp) LCL pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! fthex Real(krealfp) equivalent potential temperature in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fthex - real(krealfp),intent(in):: t,pk - real(krealfp) p,tr,pv,pd,el,expo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - p=pk**con_cpor - tr=con_ttp/t - pv=psatb*(tr**con_xpona)*exp(con_xponb*(1.-tr)) - pd=p-pv - if(pd.gt.pv) then - el=con_hvap+con_dldt*(t-con_ttp) - expo=el*con_eps*pv/(con_cp*t*pd) - fthex=t*pd**(-con_rocp)*exp(expo) - else - fthex=0. - endif -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gtma -!$$$ Subprogram Documentation Block -! -! 
Subprogram: gtma Compute moist adiabat tables -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute temperature and specific humidity tables -! as a function of equivalent potential temperature and -! pressure over 1e5 Pa to the kappa power for subprogram stma. -! Exact parcel temperatures are calculated in subprogram stmaxg. -! The current implementation computes a table with a first dimension -! of 151 for equivalent potential temperatures ranging from 200 to 500 -! Kelvin and a second dimension of 121 for pressure over 1e5 Pa -! to the kappa power ranging from 0.01**rocp to 1.10**rocp. -! -! Program History Log: -! 91-05-07 Iredell -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: call gtma -! -! Subprograms called: -! (stmaxg) inlinable subprogram to compute parcel temperature -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx,jy - real(krealfp) xmin,xmax,ymin,ymax,xinc,yinc,x,y,pk,the,t,q,tg -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=200._krealfp - xmax=500._krealfp - ymin=0.01_krealfp**con_rocp - ymax=1.10_krealfp**con_rocp - xinc=(xmax-xmin)/(nxma-1) - c1xma=1.-xmin/xinc - c2xma=1./xinc - yinc=(ymax-ymin)/(nyma-1) - c1yma=1.-ymin/yinc - c2yma=1./yinc - do jy=1,nyma - y=ymin+(jy-1)*yinc - pk=y - tg=xmin*y - do jx=1,nxma - x=xmin+(jx-1)*xinc - the=x - call stmaxg(tg,the,pk,t,q) - tbtma(jx,jy)=t - tbqma(jx,jy)=q - tg=t - enddo - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental subroutine stma(the,pk,tma,qma) -!$$$ Subprogram Documentation Block -! -! Subprogram: stma Compute moist adiabat temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute temperature and specific humidity of a parcel -! lifted up a moist adiabat from equivalent potential temperature -! 
at the LCL and pressure over 1e5 Pa to the kappa power. -! Bilinear interpolations are done between values in a lookup table -! computed in gtma. See documentation for stmaxg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.01 Kelvin -! and 5.e-6 kg/kg for temperature and humidity, respectively. -! On the Cray, stma is about 35 times faster than exact calculation. -! This subprogram should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell expand table -! 1999-03-01 Iredell f90 module -! -! Usage: call stma(the,pk,tma,qma) -! -! Input argument list: -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp),intent(in):: the,pk - real(krealfp),intent(out):: tma,qma - integer jx,jy - real(krealfp) xj,yj,ftx1,ftx2,qx1,qx2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xma+c2xma*the,1._krealfp),real(nxma,krealfp)) - yj=min(max(c1yma+c2yma*pk,1._krealfp),real(nyma,krealfp)) - jx=min(xj,nxma-1._krealfp) - jy=min(yj,nyma-1._krealfp) - ftx1=tbtma(jx,jy)+(xj-jx)*(tbtma(jx+1,jy)-tbtma(jx,jy)) - ftx2=tbtma(jx,jy+1)+(xj-jx)*(tbtma(jx+1,jy+1)-tbtma(jx,jy+1)) - tma=ftx1+(yj-jy)*(ftx2-ftx1) - qx1=tbqma(jx,jy)+(xj-jx)*(tbqma(jx+1,jy)-tbqma(jx,jy)) - qx2=tbqma(jx,jy+1)+(xj-jx)*(tbqma(jx+1,jy+1)-tbqma(jx,jy+1)) - qma=qx1+(yj-jy)*(qx2-qx1) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental subroutine stmaq(the,pk,tma,qma) -!$$$ Subprogram Documentation Block -! 
-! -! Subprogram: stmaq Compute moist adiabat temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute temperature and specific humidity of a parcel -! lifted up a moist adiabat from equivalent potential temperature -! at the LCL and pressure over 1e5 Pa to the kappa power. -! Biquadratic interpolations are done between values in a lookup table -! computed in gtma. See documentation for stmaxg for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.0005 Kelvin -! and 1.e-7 kg/kg for temperature and humidity, respectively. -! On the Cray, stmaq is about 25 times faster than exact calculation. -! This subprogram should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell quadratic interpolation -! 1999-03-01 Iredell f90 module -! -! Usage: call stmaq(the,pk,tma,qma) -! -! Input argument list: -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp),intent(in):: the,pk - real(krealfp),intent(out):: tma,qma - integer jx,jy - real(krealfp) xj,yj,dxj,dyj - real(krealfp) ft11,ft12,ft13,ft21,ft22,ft23,ft31,ft32,ft33 - real(krealfp) ftx1,ftx2,ftx3 - real(krealfp) q11,q12,q13,q21,q22,q23,q31,q32,q33,qx1,qx2,qx3 -!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xma+c2xma*the,1._krealfp),real(nxma,krealfp)) - yj=min(max(c1yma+c2yma*pk,1._krealfp),real(nyma,krealfp)) - jx=min(max(nint(xj),2),nxma-1) - jy=min(max(nint(yj),2),nyma-1) - dxj=xj-jx - dyj=yj-jy - ft11=tbtma(jx-1,jy-1) - ft12=tbtma(jx-1,jy) - ft13=tbtma(jx-1,jy+1) - ft21=tbtma(jx,jy-1) - ft22=tbtma(jx,jy) - ft23=tbtma(jx,jy+1) - ft31=tbtma(jx+1,jy-1) - ft32=tbtma(jx+1,jy) - ft33=tbtma(jx+1,jy+1) - ftx1=(((ft31+ft11)/2-ft21)*dxj+(ft31-ft11)/2)*dxj+ft21 - ftx2=(((ft32+ft12)/2-ft22)*dxj+(ft32-ft12)/2)*dxj+ft22 - ftx3=(((ft33+ft13)/2-ft23)*dxj+(ft33-ft13)/2)*dxj+ft23 - tma=(((ftx3+ftx1)/2-ftx2)*dyj+(ftx3-ftx1)/2)*dyj+ftx2 - q11=tbqma(jx-1,jy-1) - q12=tbqma(jx-1,jy) - q13=tbqma(jx-1,jy+1) - q21=tbqma(jx,jy-1) - q22=tbqma(jx,jy) - q23=tbqma(jx,jy+1) - q31=tbqma(jx+1,jy-1) - q32=tbqma(jx+1,jy) - q33=tbqma(jx+1,jy+1) - qx1=(((q31+q11)/2-q21)*dxj+(q31-q11)/2)*dxj+q21 - qx2=(((q32+q12)/2-q22)*dxj+(q32-q12)/2)*dxj+q22 - qx3=(((q33+q13)/2-q23)*dxj+(q33-q13)/2)*dxj+q23 - qma=(((qx3+qx1)/2-qx2)*dyj+(qx3-qx1)/2)*dyj+qx2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental subroutine stmax(the,pk,tma,qma) -!$$$ Subprogram Documentation Block -! -! Subprogram: stmax Compute moist adiabat temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute temperature and humidity of a parcel -! lifted up a moist adiabat from equivalent potential temperature -! at the LCL and pressure over 1e5 Pa to the kappa power. -! An approximate parcel temperature for subprogram stmaxg -! is obtained using stma so gtma must be already called. -! See documentation for stmaxg for details. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! -! 
Usage: call stmax(the,pk,tma,qma) -! -! Input argument list: -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! Subprograms called: -! (stma) inlinable subprogram to compute parcel temperature -! (stmaxg) inlinable subprogram to compute parcel temperature -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp),intent(in):: the,pk - real(krealfp),intent(out):: tma,qma - real(krealfp) tg,qg -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - call stma(the,pk,tg,qg) - call stmaxg(tg,the,pk,tma,qma) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental subroutine stmaxg(tg,the,pk,tma,qma) -!$$$ Subprogram Documentation Block -! -! Subprogram: stmaxg Compute moist adiabat temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Exactly compute temperature and humidity of a parcel -! lifted up a moist adiabat from equivalent potential temperature -! at the LCL and pressure over 1e5 Pa to the kappa power. -! A guess parcel temperature must be provided. -! Equivalent potential temperature is constant for a saturated parcel -! rising adiabatically up a moist adiabat when the heat and mass -! of the condensed water are neglected. Ice is also neglected. -! The formula for equivalent potential temperature (Holton) is -! the=t*(pd**(-rocp))*exp(el*eps*pv/(cp*t*pd)) -! where t is the temperature, pv is the saturated vapor pressure, -! pd is the dry pressure p-pv, el is the temperature dependent -! latent heat of condensation hvap+dldt*(t-ttp), and other values -! are physical constants defined in parameter statements in the code. -!
The formula is inverted by iterating Newtonian approximations -! for each the and p until t is found to within 1.e-4 Kelvin. -! The specific humidity is then computed from pv and pd. -! This subprogram can be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell exact computation -! 1999-03-01 Iredell f90 module -! -! Usage: call stmaxg(tg,the,pk,tma,qma) -! -! Input argument list: -! tg Real(krealfp) guess parcel temperature in Kelvin -! the Real(krealfp) equivalent potential temperature in Kelvin -! pk Real(krealfp) pressure over 1e5 Pa to the kappa power -! -! Output argument list: -! tma Real(krealfp) parcel temperature in Kelvin -! qma Real(krealfp) parcel specific humidity in kg/kg -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp),intent(in):: tg,the,pk - real(krealfp),intent(out):: tma,qma - real(krealfp),parameter:: terrm=1.e-4 - real(krealfp) t,p,tr,pv,pd,el,expo,thet,dthet,terr - integer i -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - t=tg - p=pk**con_cpor - do i=1,100 - tr=con_ttp/t - pv=psatb*(tr**con_xpona)*exp(con_xponb*(1.-tr)) - pd=p-pv - el=con_hvap+con_dldt*(t-con_ttp) - expo=el*con_eps*pv/(con_cp*t*pd) - thet=t*pd**(-con_rocp)*exp(expo) - dthet=thet/t*(1.+expo*(con_dldt*t/el+el*p/(con_rv*t*pd))) - terr=(thet-the)/dthet - t=t-terr - if(abs(terr).le.terrm) exit - enddo - tma=t - tr=con_ttp/t - pv=psatb*(tr**con_xpona)*exp(con_xponb*(1.-tr)) - pd=p-pv - qma=con_eps*pv/(pd+con_eps*pv) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - subroutine gpkap -!$$$ Subprogram documentation block -! -! Subprogram: gpkap Compute coefficients for p**kappa -! Author: Phillips org: w/NMC2X2 Date: 29 dec 82 -! -! 
Abstract: Computes pressure to the kappa table as a function of pressure -! for the table lookup function fpkap. -! Exact pressure to the kappa values are calculated in subprogram fpkapx. -! The current implementation computes a table with a length -! of 5501 for pressures ranging up to 110000 Pascals. -! -! Program History Log: -! 94-12-30 Iredell -! 1999-03-01 Iredell f90 module -! 1999-03-24 Iredell table lookup -! -! Usage: call gpkap -! -! Subprograms called: -! fpkapx function to compute exact pressure to the kappa -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,x,p -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=0._krealfp - xmax=110000._krealfp - xinc=(xmax-xmin)/(nxpkap-1) - c1xpkap=1.-xmin/xinc - c2xpkap=1./xinc - do jx=1,nxpkap - x=xmin+(jx-1)*xinc - p=x - tbpkap(jx)=fpkapx(p) - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function fpkap(p) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpkap raise pressure to the kappa power. -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Raise pressure over 1e5 Pa to the kappa power. -! A linear interpolation is done between values in a lookup table -! computed in gpkap. See documentation for fpkapx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy ranges from 9 decimal places -! at 100000 Pascals to 5 decimal places at 1000 Pascals. -! On the Cray, fpkap is over 5 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell standardized kappa, -! increased range and accuracy -! 1999-03-01 Iredell f90 module -! 1999-03-24 Iredell table lookup -! -! 
Usage: pkap=fpkap(p) -! -! Input argument list: -! p Real(krealfp) pressure in Pascals -! -! Output argument list: -! fpkap Real(krealfp) p over 1e5 Pa to the kappa power -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpkap - real(krealfp),intent(in):: p - integer jx - real(krealfp) xj -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpkap+c2xpkap*p,1._krealfp),real(nxpkap,krealfp)) - jx=min(xj,nxpkap-1._krealfp) - fpkap=tbpkap(jx)+(xj-jx)*(tbpkap(jx+1)-tbpkap(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpkapq(p) -!$$$ Subprogram Documentation Block -! -! Subprogram: fpkapq raise pressure to the kappa power. -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Raise pressure over 1e5 Pa to the kappa power. -! A quadratic interpolation is done between values in a lookup table -! computed in gpkap. See documentation for fpkapx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy ranges from 12 decimal places -! at 100000 Pascals to 7 decimal places at 1000 Pascals. -! On the Cray, fpkapq is over 4 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell standardized kappa, -! increased range and accuracy -! 1999-03-01 Iredell f90 module -! 1999-03-24 Iredell table lookup -! -! Usage: pkap=fpkapq(p) -! -! Input argument list: -! p Real(krealfp) pressure in Pascals -! -! Output argument list: -! fpkapq Real(krealfp) p over 1e5 Pa to the kappa power -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpkapq - real(krealfp),intent(in):: p - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xpkap+c2xpkap*p,1._krealfp),real(nxpkap,krealfp)) - jx=min(max(nint(xj),2),nxpkap-1) - dxj=xj-jx - fj1=tbpkap(jx-1) - fj2=tbpkap(jx) - fj3=tbpkap(jx+1) - fpkapq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - function fpkapo(p) -!$$$ Subprogram documentation block -! -! Subprogram: fpkapo raise surface pressure to the kappa power. -! Author: Phillips org: w/NMC2X2 Date: 29 dec 82 -! -! Abstract: Raise surface pressure over 1e5 Pa to the kappa power -! using a rational weighted chebyshev approximation. -! The numerator is of order 2 and the denominator is of order 4. -! The pressure range is 40000-110000 Pa and kappa is defined in fpkapx. -! The accuracy of this approximation is almost 8 decimal places. -! On the Cray, fpkap is over 10 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell standardized kappa, -! increased range and accuracy -! 1999-03-01 Iredell f90 module -! -! Usage: pkap=fpkapo(p) -! -! Input argument list: -! p Real(krealfp) surface pressure in Pascals -! p should be in the range 40000 to 110000 -! -! Output argument list: -! fpkapo Real(krealfp) p over 1e5 Pa to the kappa power -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpkapo - real(krealfp),intent(in):: p - integer,parameter:: nnpk=2,ndpk=4 - real(krealfp):: cnpk(0:nnpk)=(/3.13198449e-1,5.78544829e-2,& - 8.35491871e-4/) - real(krealfp):: cdpk(0:ndpk)=(/1.,8.15968401e-2,5.72839518e-4,& - -4.86959812e-7,5.24459889e-10/) - integer n - real(krealfp) pkpa,fnpk,fdpk -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - pkpa=p*1.e-3_krealfp - fnpk=cnpk(nnpk) - do n=nnpk-1,0,-1 - fnpk=pkpa*fnpk+cnpk(n) - enddo - fdpk=cdpk(ndpk) - do n=ndpk-1,0,-1 - fdpk=pkpa*fdpk+cdpk(n) - enddo - fpkapo=fnpk/fdpk -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function fpkapx(p) -!$$$ Subprogram documentation block -! -! Subprogram: fpkapx raise pressure to the kappa power. -! Author: Phillips org: w/NMC2X2 Date: 29 dec 82 -! -! Abstract: raise pressure over 1e5 Pa to the kappa power. -! Kappa is equal to rd/cp where rd and cp are physical constants. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 94-12-30 Iredell made into inlinable function -! 1999-03-01 Iredell f90 module -! -! Usage: pkap=fpkapx(p) -! -! Input argument list: -! p Real(krealfp) pressure in Pascals -! -! Output argument list: -! fpkapx Real(krealfp) p over 1e5 Pa to the kappa power -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) fpkapx - real(krealfp),intent(in):: p -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - fpkapx=(p/1.e5_krealfp)**con_rocp -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine grkap -!$$$ Subprogram documentation block -! -! Subprogram: grkap Compute coefficients for p**(1/kappa) -! Author: Phillips org: w/NMC2X2 Date: 29 dec 82 -! -! Abstract: Computes pressure to the 1/kappa table as a function of pressure -! for the table lookup function frkap. -! Exact pressure to the 1/kappa values are calculated in subprogram frkapx. -! The current implementation computes a table with a length -! of 5501 for pressures ranging up to 110000 Pascals. 
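Reviewer note: `fpkapo` evaluates an order-2 over order-4 rational fit in kPa by Horner's rule. A Python transcription of the coefficients above, compared against the exact power (the kappa used for the comparison is an assumption, rd/cp = 287.05/1004.6):

```python
# Numerator (order 2) and denominator (order 4) coefficients copied from fpkapo
CNPK = (3.13198449e-1, 5.78544829e-2, 8.35491871e-4)
CDPK = (1.0, 8.15968401e-2, 5.72839518e-4, -4.86959812e-7, 5.24459889e-10)

def fpkapo_py(p):
    """Rational approximation of (p/1e5)**kappa, valid for 40000 <= p <= 110000 Pa."""
    pkpa = p * 1.0e-3             # the fit is expressed in kPa
    num = 0.0
    for c in reversed(CNPK):      # Horner evaluation, highest order first
        num = pkpa * num + c
    den = 0.0
    for c in reversed(CDPK):
        den = pkpa * den + c
    return num / den
```

At 100000 Pa the exact answer is 1 regardless of kappa, which makes a convenient self-check of the transcription.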
-! -! Program History Log: -! 94-12-30 Iredell -! 1999-03-01 Iredell f90 module -! 1999-03-24 Iredell table lookup -! -! Usage: call grkap -! -! Subprograms called: -! frkapx function to compute exact pressure to the 1/kappa -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx - real(krealfp) xmin,xmax,xinc,x,p -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=0._krealfp - xmax=fpkapx(110000._krealfp) - xinc=(xmax-xmin)/(nxrkap-1) - c1xrkap=1.-xmin/xinc - c2xrkap=1./xinc - do jx=1,nxrkap - x=xmin+(jx-1)*xinc - p=x - tbrkap(jx)=frkapx(p) - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function frkap(pkap) -!$$$ Subprogram Documentation Block -! -! Subprogram: frkap raise pressure to the 1/kappa power. -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Raise pressure over 1e5 Pa to the 1/kappa power. -! A linear interpolation is done between values in a lookup table -! computed in grkap. See documentation for frkapx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 7 decimal places. -! On the IBM, fpkap is about 4 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell standardized kappa, -! increased range and accuracy -! 1999-03-01 Iredell f90 module -! 1999-03-24 Iredell table lookup -! -! Usage: p=frkap(pkap) -! -! Input argument list: -! pkap Real(krealfp) p over 1e5 Pa to the kappa power -! -! Output argument list: -! frkap Real(krealfp) pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) frkap - real(krealfp),intent(in):: pkap - integer jx - real(krealfp) xj -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xrkap+c2xrkap*pkap,1._krealfp),real(nxrkap,krealfp)) - jx=min(xj,nxrkap-1._krealfp) - frkap=tbrkap(jx)+(xj-jx)*(tbrkap(jx+1)-tbrkap(jx)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function frkapq(pkap) -!$$$ Subprogram Documentation Block -! -! Subprogram: frkapq raise pressure to the 1/kappa power. -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Raise pressure over 1e5 Pa to the 1/kappa power. -! A quadratic interpolation is done between values in a lookup table -! computed in grkap. see documentation for frkapx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 11 decimal places. -! On the IBM, fpkap is almost 4 times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 94-12-30 Iredell standardized kappa, -! increased range and accuracy -! 1999-03-01 Iredell f90 module -! 1999-03-24 Iredell table lookup -! -! Usage: p=frkapq(pkap) -! -! Input argument list: -! pkap Real(krealfp) p over 1e5 Pa to the kappa power -! -! Output argument list: -! frkapq Real(krealfp) pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) frkapq - real(krealfp),intent(in):: pkap - integer jx - real(krealfp) xj,dxj,fj1,fj2,fj3 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xrkap+c2xrkap*pkap,1._krealfp),real(nxrkap,krealfp)) - jx=min(max(nint(xj),2),nxrkap-1) - dxj=xj-jx - fj1=tbrkap(jx-1) - fj2=tbrkap(jx) - fj3=tbrkap(jx+1) - frkapq=(((fj3+fj1)/2-fj2)*dxj+(fj3-fj1)/2)*dxj+fj2 -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function frkapx(pkap) -!$$$ Subprogram documentation block -! -! Subprogram: frkapx raise pressure to the 1/kappa power. -! Author: Phillips org: w/NMC2X2 Date: 29 dec 82 -! -! Abstract: raise pressure over 1e5 Pa to the 1/kappa power. -! Kappa is equal to rd/cp where rd and cp are physical constants. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 94-12-30 Iredell made into inlinable function -! 1999-03-01 Iredell f90 module -! -! Usage: p=frkapx(pkap) -! -! Input argument list: -! pkap Real(krealfp) p over 1e5 Pa to the kappa power -! -! Output argument list: -! frkapx Real(krealfp) pressure in Pascals -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) frkapx - real(krealfp),intent(in):: pkap -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - frkapx=pkap**(1/con_rocp)*1.e5_krealfp -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gtlcl -!$$$ Subprogram Documentation Block -! -! Subprogram: gtlcl Compute equivalent potential temperature table -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute lifting condensation level temperature table -! as a function of temperature and dewpoint depression for function ftlcl. -! Lifting condensation level temperature is calculated in subprogram ftlclx -! The current implementation computes a table with a first dimension -! of 151 for temperatures ranging from 180.0 to 330.0 Kelvin -! and a second dimension of 61 for dewpoint depression ranging from -! 0 to 60 Kelvin. -! -! Program History Log: -! 1999-03-01 Iredell f90 module -! -! Usage: call gtlcl -! -! 
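Reviewer note: `fpkapx` and `frkapx` are exact inverses (`p -> (p/1e5)**kappa` and back), which is what lets `grkap` build its table by feeding `fpkapx(110000)` in as the range limit. A sketch of the round trip (kappa assumed to be rd/cp = 287.05/1004.6):

```python
KAPPA = 287.05 / 1004.6     # assumed rd/cp from GFS physcons

def fpkapx_py(p):
    """Exact p -> (p/1e5)**kappa, as in fpkapx."""
    return (p / 1.0e5) ** KAPPA

def frkapx_py(pkap):
    """Exact inverse, as in frkapx: pkap**(1/kappa) * 1e5."""
    return pkap ** (1.0 / KAPPA) * 1.0e5
```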
Subprograms called: -! (ftlclx) inlinable function to compute LCL temperature -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - integer jx,jy - real(krealfp) xmin,xmax,ymin,ymax,xinc,yinc,x,y,tdpd,t -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xmin=180._krealfp - xmax=330._krealfp - ymin=0._krealfp - ymax=60._krealfp - xinc=(xmax-xmin)/(nxtlcl-1) - c1xtlcl=1.-xmin/xinc - c2xtlcl=1./xinc - yinc=(ymax-ymin)/(nytlcl-1) - c1ytlcl=1.-ymin/yinc - c2ytlcl=1./yinc - do jy=1,nytlcl - y=ymin+(jy-1)*yinc - tdpd=y - do jx=1,nxtlcl - x=xmin+(jx-1)*xinc - t=x - tbtlcl(jx,jy)=ftlclx(t,tdpd) - enddo - enddo -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- - elemental function ftlcl(t,tdpd) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftlcl Compute LCL temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute temperature at the lifting condensation level -! from temperature and dewpoint depression. -! A bilinear interpolation is done between values in a lookup table -! computed in gtlcl. See documentation for ftlclx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.0005 Kelvin. -! On the Cray, ftlcl is ? times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 1999-03-01 Iredell f90 module -! -! Usage: tlcl=ftlcl(t,tdpd) -! -! Input argument list: -! t Real(krealfp) LCL temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! Output argument list: -! ftlcl Real(krealfp) temperature at the LCL in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftlcl - real(krealfp),intent(in):: t,tdpd - integer jx,jy - real(krealfp) xj,yj,ftx1,ftx2 -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtlcl+c2xtlcl*t,1._krealfp),real(nxtlcl,krealfp)) - yj=min(max(c1ytlcl+c2ytlcl*tdpd,1._krealfp),real(nytlcl,krealfp)) - jx=min(xj,nxtlcl-1._krealfp) - jy=min(yj,nytlcl-1._krealfp) - ftx1=tbtlcl(jx,jy)+(xj-jx)*(tbtlcl(jx+1,jy)-tbtlcl(jx,jy)) - ftx2=tbtlcl(jx,jy+1)+(xj-jx)*(tbtlcl(jx+1,jy+1)-tbtlcl(jx,jy+1)) - ftlcl=ftx1+(yj-jy)*(ftx2-ftx1) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftlclq(t,tdpd) -!$$$ Subprogram Documentation Block -! -! Subprogram: ftlclq Compute LCL temperature -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute temperature at the lifting condensation level -! from temperature and dewpoint depression. -! A biquadratic interpolation is done between values in a lookup table -! computed in gtlcl. see documentation for ftlclx for details. -! Input values outside table range are reset to table extrema. -! The interpolation accuracy is better than 0.000003 Kelvin. -! On the Cray, ftlclq is ? times faster than exact calculation. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 1999-03-01 Iredell f90 module -! -! Usage: tlcl=ftlclq(t,tdpd) -! -! Input argument list: -! t Real(krealfp) LCL temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! Output argument list: -! ftlcl Real(krealfp) temperature at the LCL in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftlclq - real(krealfp),intent(in):: t,tdpd - integer jx,jy - real(krealfp) xj,yj,dxj,dyj - real(krealfp) ft11,ft12,ft13,ft21,ft22,ft23,ft31,ft32,ft33 - real(krealfp) ftx1,ftx2,ftx3 -! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - xj=min(max(c1xtlcl+c2xtlcl*t,1._krealfp),real(nxtlcl,krealfp)) - yj=min(max(c1ytlcl+c2ytlcl*tdpd,1._krealfp),real(nytlcl,krealfp)) - jx=min(max(nint(xj),2),nxtlcl-1) - jy=min(max(nint(yj),2),nytlcl-1) - dxj=xj-jx - dyj=yj-jy - ft11=tbtlcl(jx-1,jy-1) - ft12=tbtlcl(jx-1,jy) - ft13=tbtlcl(jx-1,jy+1) - ft21=tbtlcl(jx,jy-1) - ft22=tbtlcl(jx,jy) - ft23=tbtlcl(jx,jy+1) - ft31=tbtlcl(jx+1,jy-1) - ft32=tbtlcl(jx+1,jy) - ft33=tbtlcl(jx+1,jy+1) - ftx1=(((ft31+ft11)/2-ft21)*dxj+(ft31-ft11)/2)*dxj+ft21 - ftx2=(((ft32+ft12)/2-ft22)*dxj+(ft32-ft12)/2)*dxj+ft22 - ftx3=(((ft33+ft13)/2-ft23)*dxj+(ft33-ft13)/2)*dxj+ft23 - ftlclq=(((ftx3+ftx1)/2-ftx2)*dyj+(ftx3-ftx1)/2)*dyj+ftx2 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - function ftlclo(t,tdpd) -!$$$ Subprogram documentation block -! -! Subprogram: ftlclo Compute LCL temperature. -! Author: Phillips org: w/NMC2X2 Date: 29 dec 82 -! -! Abstract: Compute temperature at the lifting condensation level -! from temperature and dewpoint depression. the formula used is -! a polynomial taken from Phillips mstadb routine which empirically -! approximates the original exact implicit relationship. -! (This kind of approximation is customary (inman, 1969), but -! the original source for this particular one is not yet known. -MI) -! Its accuracy is about 0.03 Kelvin for a dewpoint depression of 30. -! This function should be expanded inline in the calling routine. -! -! Program History Log: -! 91-05-07 Iredell made into inlinable function -! 1999-03-01 Iredell f90 module -! -! Usage: tlcl=ftlclo(t,tdpd) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! tdpd Real(krealfp) dewpoint depression in Kelvin -! -! Output argument list: -! ftlclo Real(krealfp) temperature at the LCL in Kelvin -! -! Attributes: -! 
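Reviewer note: `fpkapq`, `frkapq`, and `ftlclq` all share the same three-point stencil, `(((f3+f1)/2 - f2)*dx + (f3-f1)/2)*dx + f2`, which is the unique parabola through the table values at offsets -1, 0, +1 evaluated at fractional offset `dx`. A sketch (hypothetical name) showing it reproduces quadratics exactly:

```python
def quad3(f1, f2, f3, dx):
    """Parabola through (-1, f1), (0, f2), (1, f3), evaluated at offset dx.

    This is the stencil shared by fpkapq, frkapq, and (applied twice) ftlclq.
    """
    return (((f3 + f1) / 2.0 - f2) * dx + (f3 - f1) / 2.0) * dx + f2
```

`ftlclq` applies this once along each table dimension, giving the biquadratic interpolation described in its abstract.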
Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftlclo - real(krealfp),intent(in):: t,tdpd - real(krealfp),parameter:: clcl1= 0.954442e+0,clcl2= 0.967772e-3,& - clcl3=-0.710321e-3,clcl4=-0.270742e-5 -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ftlclo=t-tdpd*(clcl1+clcl2*t+tdpd*(clcl3+clcl4*t)) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - elemental function ftlclx(t,tdpd) -!$$$ Subprogram documentation block -! -! Subprogram: ftlclx Compute LCL temperature. -! Author: Iredell org: w/NMC2X2 Date: 25 March 1999 -! -! Abstract: Compute temperature at the lifting condensation level -! from temperature and dewpoint depression. A parcel lifted -! adiabatically becomes saturated at the lifting condensation level. -! The water model assumes a perfect gas, constant specific heats -! for gas and liquid, and neglects the volume of the liquid. -! The model does account for the variation of the latent heat -! of condensation with temperature. The ice option is not included. -! The Clausius-Clapeyron equation is integrated from the triple point -! to get the formulas -! pvlcl=con_psat*(trlcl**xa)*exp(xb*(1.-trlcl)) -! pvdew=con_psat*(trdew**xa)*exp(xb*(1.-trdew)) -! where pvlcl is the saturated parcel vapor pressure at the LCL, -! pvdew is the unsaturated parcel vapor pressure initially, -! trlcl is ttp/tlcl and trdew is ttp/tdew. The adiabatic lifting -! of the parcel is represented by the following formula -! pvdew=pvlcl*(t/tlcl)**(1/kappa) -! This formula is inverted by iterating Newtonian approximations -! until tlcl is found to within 1.e-6 Kelvin. Note that the minimum -! returned temperature is 180 Kelvin. -! -! Program History Log: -! 1999-03-25 Iredell -! -! Usage: tlcl=ftlclx(t,tdpd) -! -! Input argument list: -! t Real(krealfp) temperature in Kelvin -! 
tdpd Real(krealfp) dewpoint depression in Kelvin -! -! Output argument list: -! ftlclx Real(krealfp) temperature at the LCL in Kelvin -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none - real(krealfp) ftlclx - real(krealfp),intent(in):: t,tdpd - real(krealfp),parameter:: terrm=1.e-4,tlmin=180.,tlminx=tlmin-5. - real(krealfp) tr,pvdew,tlcl,ta,pvlcl,el,dpvlcl,terr - integer i -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - tr=con_ttp/(t-tdpd) - pvdew=con_psat*(tr**con_xpona)*exp(con_xponb*(1.-tr)) - tlcl=t-tdpd - do i=1,100 - tr=con_ttp/tlcl - ta=t/tlcl - pvlcl=con_psat*(tr**con_xpona)*exp(con_xponb*(1.-tr))*ta**(1/con_rocp) - el=con_hvap+con_dldt*(tlcl-con_ttp) - dpvlcl=(el/(con_rv*t**2)+1/(con_rocp*tlcl))*pvlcl - terr=(pvlcl-pvdew)/dpvlcl - tlcl=tlcl-terr - if(abs(terr).le.terrm.or.tlcl.lt.tlminx) exit - enddo - ftlclx=max(tlcl,tlmin) -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end function -!------------------------------------------------------------------------------- - subroutine gfuncphys -!$$$ Subprogram Documentation Block -! -! Subprogram: gfuncphys Compute all physics function tables -! Author: N Phillips w/NMC2X2 Date: 30 dec 82 -! -! Abstract: Compute all physics function tables. Lookup tables are -! set up for computing saturation vapor pressure, dewpoint temperature, -! equivalent potential temperature, moist adiabatic temperature and humidity, -! pressure to the kappa, and lifting condensation level temperature. -! -! Program History Log: -! 1999-03-01 Iredell f90 module -! -! Usage: call gfuncphys -! -! Subprograms called: -! gpvsl compute saturation vapor pressure over liquid table -! gpvsi compute saturation vapor pressure over ice table -! gpvs compute saturation vapor pressure table -! gtdpl compute dewpoint temperature over liquid table -! gtdpi compute dewpoint temperature over ice table -! gtdp compute dewpoint temperature table -! 
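Reviewer note: the Newton iteration in `ftlclx` can be sketched directly from the formulas in its abstract. The physical constants below are assumptions taken from GFS physcons (and consistent with the parameters of the deleted `lcl.f` later in this patch); the derivative expression mirrors the Fortran, including its use of `t**2` rather than `tlcl**2`:

```python
import math

# Assumed constants (GFS physcons values; not defined in this excerpt)
TTP, PSAT = 273.16, 610.78      # triple point (K) and vapor pressure there (Pa)
HVAP, RV = 2.5000e6, 461.50     # latent heat of vaporization, gas constant for vapor
DLDT = 1846.0 - 4185.5          # cvap - cliq: temperature dependence of latent heat
ROCP = 287.05 / 1004.6          # rd/cp (kappa)
XA = -DLDT / RV                 # exponents from integrating Clausius-Clapeyron
XB = XA + HVAP / (RV * TTP)

def ftlclx_py(t, tdpd, terrm=1.0e-4, tlmin=180.0):
    """LCL temperature by Newton iteration, mirroring ftlclx."""
    tr = TTP / (t - tdpd)
    pvdew = PSAT * tr**XA * math.exp(XB * (1.0 - tr))  # initial parcel vapor pressure
    tlcl = t - tdpd                                    # first guess: the dewpoint
    for _ in range(100):
        tr = TTP / tlcl
        pvlcl = PSAT * tr**XA * math.exp(XB * (1.0 - tr)) * (t / tlcl) ** (1.0 / ROCP)
        el = HVAP + DLDT * (tlcl - TTP)
        dpvlcl = (el / (RV * t**2) + 1.0 / (ROCP * tlcl)) * pvlcl
        terr = (pvlcl - pvdew) / dpvlcl
        tlcl -= terr
        if abs(terr) <= terrm or tlcl < tlmin - 5.0:
            break
    return max(tlcl, tlmin)
```

For t = 300 K and a 10 K dewpoint depression this converges in a handful of iterations to roughly 287.7 K, in line with standard LCL approximations; the LCL temperature sits below the dewpoint, as expected for adiabatic ascent.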
gthe compute equivalent potential temperature table -! gtma compute moist adiabat tables -! gpkap compute pressure to the kappa table -! grkap compute pressure to the 1/kappa table -! gtlcl compute LCL temperature table -! -! Attributes: -! Language: Fortran 90. -! -!$$$ - implicit none -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - call gpvsl - call gpvsi - call gpvs - call gtdpl - call gtdpi - call gtdp - call gthe - call gtma - call gpkap - call grkap - call gtlcl -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - end subroutine -!------------------------------------------------------------------------------- -end module diff --git a/sorc/gfs_bufr.fd/gfsbufr.f b/sorc/gfs_bufr.fd/gfsbufr.f deleted file mode 100644 index e6e3d065177..00000000000 --- a/sorc/gfs_bufr.fd/gfsbufr.f +++ /dev/null @@ -1,276 +0,0 @@ - program meteormrf -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C -C MAIN PROGRAM: METEOMRF -C PRGMMR: PAN ORG: NP23 DATE: 1999-07-21 -C -C ABSTRACT: Creates BUFR meteogram files for the AVN and MRF. -C -C PROGRAM HISTORY LOG: -C 99-07-21 Hualu Pan -C 16-09-27 HUIYA CHUANG MODIFY TO READ GFS NEMS OUTPUT ON GRID SPACE -C 16-10-15 HUIYA CHUANG: CONSOLIDATE TO READ FLUX FIELDS IN THIS -C PACKAGE TOO AND THIS SPEEDS UP BFS BUFR BY 3X -C 17-02-27 GUANG PING LOU: CHANGE MODEL OUTPUT READ-IN TO HOURLY -C TO 120 HOURS AND 3 HOURLY TO 180 HOURS. -C 19-07-16 GUANG PING LOU: CHANGE FROM NEMSIO TO GRIB2. 
-C -C -C USAGE: -C INPUT FILES: -C FTxxF001 - UNITS 11 THRU 49 -C PARM - UNIT 5 (STANDARD READ) -C -C OUTPUT FILES: (INCLUDING SCRATCH FILES) -C FTxxF001 - UNITS 51 THRU 79 -C FTxxF001 - UNIT 6 (STANDARD PRINTFILE) -C -C SUBPROGRAMS CALLED: (LIST ALL CALLED FROM ANYWHERE IN CODES) -C UNIQUE: - ROUTINES THAT ACCOMPANY SOURCE FOR COMPILE -C LIBRARY: -C W3LIB - -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C =NNNN - TROUBLE OR SPECIAL FLAG - SPECIFY NATURE -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C LANGUAGE: INDICATE EXTENSIONS, COMPILER OPTIONS -C MACHINE: IBM SP -C -C$$$ - use netcdf - use mpi - use nemsio_module - use sigio_module - implicit none -!! include 'mpif.h' - integer,parameter:: nsta=3000 - integer,parameter:: ifile=11 - integer,parameter:: levso=64 - integer(sigio_intkind):: irets - type(nemsio_gfile) :: gfile - integer ncfsig, nsig - integer istat(nsta), idate(4), jdate - integer :: levs,nstart,nend,nint,nsfc,levsi,im,jm - integer :: npoint,np,ist,is,iret,lss,nss,nf,nsk,nfile - integer :: ielev - integer :: lsfc - real :: alat,alon,rla,rlo - real :: wrkd(1),dummy - real rlat(nsta), rlon(nsta), elevstn(nsta) - integer iidum(nsta),jjdum(nsta) - integer nint1, nend1, nint3, nend3, np1 - integer landwater(nsta) - character*1 ns, ew - character*4 t3 - character*4 cstat(nsta) - character*32 desc - character*150 dird, fnsig - logical f00, makebufr - CHARACTER*150 FILESEQ - CHARACTER*8 SBSET - LOGICAL SEQFLG(4) - CHARACTER*80 CLIST(4) - INTEGER NPP(4) - CHARACTER*8 SEQNAM(4) - integer ierr, mrank, msize,ntask - integer n0, ntot - integer :: error, ncid, id_var,dimid - character(len=10) :: dim_nam - character(len=6) :: fformat - !added from Cory - integer :: iope, ionproc - integer, allocatable :: iocomms(:) -C - DATA SBSET / 'ABCD1234' / -C - DATA SEQFLG / .FALSE., .TRUE., .FALSE., .FALSE. 
/ -C - DATA SEQNAM / 'HEADR', 'PROFILE', 'CLS1' ,'D10M' / -c DATA SEQNAM / 'HEADR', 'PRES TMDB UWND VWND SPFH OMEG', -c & 'CLS1' ,'D10M' / -C - namelist /nammet/ levs, makebufr, dird, - & nstart, nend, nint, nend1, nint1, - & nint3, nsfc, f00, fformat, np1 - - call mpi_init(ierr) - call mpi_comm_rank(MPI_COMM_WORLD,mrank,ierr) - call mpi_comm_size(MPI_COMM_WORLD,msize,ierr) - if(mrank.eq.0) then - CALL W3TAGB('METEOMRF',1999,0202,0087,'NP23') - endif - open(5,file='gfsparm') - read(5,nammet) - write(6,nammet) - npoint = 0 - 99 FORMAT (I6, F6.2,A1, F7.2,A1,1X,A4,1X,I2, A28, I4) - do np = 1, nsta+2 - read(8,99,end=200) IST,ALAT,NS,ALON,EW,T3,lsfc,DESC,IELEV -CC print*," IST,ALAT,NS,ALON,EW,T3,lsfc,DESC,IELEV= " -CC print*, IST,ALAT,NS,ALON,EW,T3,lsfc,DESC,IELEV - if(alat.lt.95.) then - npoint = npoint + 1 - RLA = 9999. - IF (NS .EQ. 'N') RLA = ALAT - IF (NS .EQ. 'S') RLA = -ALAT - RLO = 9999. - IF (EW .EQ. 'E') RLO = ALON - IF (EW .EQ. 'W') RLO = -ALON - rlat(npoint) = rla - rlon(npoint) = rlo - istat(npoint) = ist - cstat(npoint) = T3 - elevstn(npoint) = ielev - - if(lsfc .le. 9) then - landwater(npoint) = 2 !!nearest - else if(lsfc .le. 19) then - landwater(npoint) = 1 !!land - else if(lsfc .ge. 20) then - landwater(npoint) = 0 !!water - endif - endif - enddo - 200 continue - if(npoint.le.0) then - print *, ' station list file is empty, abort program' - call abort - elseif(npoint.gt.nsta) then - print *, ' number of station exceeds nsta, abort program' - call abort - endif -! print*,'npoint= ', npoint -! print*,'np,IST,idum,jdum,rlat(np),rlon(np)= ' - if(np1 == 0) then - do np = 1, npoint - read(7,98) IST, iidum(np), jjdum(np), ALAT, ALON - enddo - endif - 98 FORMAT (3I6, 2F9.2) - if (mrank.eq.0.and.makebufr) then - REWIND 1 - READ (1,100) SBSET - 100 FORMAT ( ////// 2X, A8 ) - PRINT 120, SBSET - 120 FORMAT ( ' SBSET=#', A8, '#' ) - REWIND 1 -C -C READ PARM NAMES AND NUMBER OF PARM NAMES FROM BUFR TABLE. 
- DO IS = 1,4 - CALL BFRHDR ( 1, SEQNAM(IS), SEQFLG(IS), - X CLIST(IS), NPP(IS), IRET ) - IF ( IRET .NE. 0 ) THEN - PRINT*, ' CALL BFRHDR IRET=', IRET - ENDIF - ENDDO - lss = len ( dird ) - DO WHILE ( dird (lss:lss) .eq. ' ' ) - lss = lss - 1 - END DO -C - endif - nsig = 11 - nss = nstart + nint - if(f00) nss = nstart -c do nf = nss, nend, nint - ntot = (nend - nss) / nint + 1 - ntask = mrank/(float(msize)/float(ntot)) - nf = ntask * nint + nss - print*,'n0 ntot nint nss mrank msize' - print*, n0,ntot,nint,nss,mrank,msize - print*,'nf, ntask= ', nf, ntask - if(nf .le. nend1) then - nfile = 21 + (nf / nint1) - else - nfile = 21 + (nend1/nint1) + (nf-nend1)/nint3 - endif - print*, 'nf,nint,nfile = ',nf,nint,nfile - if(nf.le.nend) then - if(nf.lt.10) then - fnsig = 'sigf0' - write(fnsig(6:6),'(i1)') nf - ncfsig = 6 - elseif(nf.lt.100) then - fnsig = 'sigf' - write(fnsig(5:6),'(i2)') nf - ncfsig = 6 - else - fnsig = 'sigf' - write(fnsig(5:7),'(i3)') nf - ncfsig = 7 - endif - print *, 'Opening file : ',fnsig - -!! 
read in either nemsio or NetCDF files - if (fformat == 'netcdf') then - error=nf90_open(trim(fnsig),nf90_nowrite,ncid) - error=nf90_inq_dimid(ncid,"grid_xt",dimid) - error=nf90_inquire_dimension(ncid,dimid,dim_nam,im) - error=nf90_inq_dimid(ncid,"grid_yt",dimid) - error=nf90_inquire_dimension(ncid,dimid,dim_nam,jm) - error=nf90_inq_dimid(ncid,"pfull",dimid) - error=nf90_inquire_dimension(ncid,dimid,dim_nam,levsi) - error=nf90_close(ncid) - print*,'NetCDF file im,jm,lm= ',im,jm,levs,levsi - - else - call nemsio_init(iret=irets) - print *,'nemsio_init, iret=',irets - call nemsio_open(gfile,trim(fnsig),'read',iret=irets) - if ( irets /= 0 ) then - print*,"fail to open nems atmos file";stop - endif - - call nemsio_getfilehead(gfile,iret=irets - & ,dimx=im,dimy=jm,dimz=levsi) - if( irets /= 0 ) then - print*,'error finding model dimensions '; stop - endif - print*,'nemsio file im,jm,lm= ',im,jm,levsi - call nemsio_close(gfile,iret=irets) - endif - allocate (iocomms(0:ntot)) - if (fformat == 'netcdf') then - print*,'iocomms= ', iocomms - call mpi_comm_split(MPI_COMM_WORLD,ntask,0,iocomms(ntask),ierr) - call mpi_comm_rank(iocomms(ntask), iope, ierr) - call mpi_comm_size(iocomms(ntask), ionproc, ierr) - - call meteorg(npoint,rlat,rlon,istat,cstat,elevstn, - & nf,nfile,fnsig,jdate,idate, - & levsi,im,jm,nsfc, - & landwater,nend1, nint1, nint3, iidum,jjdum,np1, - & fformat,iocomms(ntask),iope,ionproc) - call mpi_barrier(iocomms(ntask), ierr) - call mpi_comm_free(iocomms(ntask), ierr) - else -!! For nemsio input - call meteorg(npoint,rlat,rlon,istat,cstat,elevstn, - & nf,nfile,fnsig,jdate,idate, - & levs,im,jm,nsfc, - & landwater,nend1, nint1, nint3, iidum,jjdum,np1, - & fformat,iocomms(ntask),iope,ionproc) - endif - endif - call mpi_barrier(mpi_comm_world,ierr) - call mpi_finalize(ierr) - if(mrank.eq.0) then - print *, ' starting to make bufr files' - print *, ' makebufr= ', makebufr - print *, 'nint1,nend1,nint3,nend= ',nint1,nend1,nint3,nend -!! idate = 0 7 1 2019 -!! 
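Reviewer note: the main program's work decomposition (`ntask = mrank/(float(msize)/float(ntot))`, `nf = ntask*nint + nss`) assigns each MPI rank to one forecast hour, with ranks grouped when `msize > ntot`. A Python sketch of that arithmetic (hypothetical function name):

```python
def forecast_hour_for_rank(mrank, msize, nstart, nend, nint, f00=True):
    """Map an MPI rank to its forecast hour, mirroring the arithmetic in meteormrf."""
    nss = nstart if f00 else nstart + nint    # first forecast hour processed
    ntot = (nend - nss) // nint + 1           # number of forecast files to read
    ntask = int(mrank / (msize / ntot))       # task group, as in the Fortran float divide
    return nss + ntask * nint
```

With `msize` equal to `ntot` each rank gets its own hour; with more ranks than hours, consecutive ranks share a task group, which is what the later `mpi_comm_split(..., ntask, ...)` call exploits for parallel NetCDF reads.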
jdate = 2019070100 - - if(makebufr) then - nend3 = nend - call buff(nint1,nend1,nint3,nend3, - & npoint,idate,jdate,levso, - & dird,lss,istat,sbset,seqflg,clist,npp,wrkd) - CALL W3TAGE('METEOMRF') - endif - endif - end diff --git a/sorc/gfs_bufr.fd/gslp.f b/sorc/gfs_bufr.fd/gslp.f deleted file mode 100644 index 5b0eca1f519..00000000000 --- a/sorc/gfs_bufr.fd/gslp.f +++ /dev/null @@ -1,92 +0,0 @@ -!$$$ Subprogram documentation block -! -! Subprogram: gslp Compute sea level pressure as in the GFS -! Prgmmr: Iredell Org: np23 Date: 1999-10-18 -! -! Abstract: This subprogram computes sea level pressure from profile data -! using the Shuell method in the GFS. -! -! Program history log: -! 1999-10-18 Mark Iredell -! -! Usage: call gslp(km,hs,ps,p,t,sh,prmsl,h,ho) -! Input argument list: -! km integer number of levels -! hs real surface height (m) -! ps real surface pressure (Pa) -! p real (km) profile pressures (Pa) -! t real (km) profile temperatures (K) -! sh real (km) profile specific humidities (kg/kg) -! Output argument list: -! prmsl real sea level pressure (Pa) -! h real integer-layer height (m) -! ho real integer-layer height at 1000hPa and 500hPa (m) -! -! Modules used: -! physcons physical constants -! -! Attributes: -! Language: Fortran 90 -! -!$$$ -subroutine gslp(km,hs,ps,p,t,sh,prmsl,h,ho) - use physcons - implicit none - integer,intent(in):: km - real,intent(in):: hs,ps - real,intent(in),dimension(km):: p,t,sh - real,intent(out):: prmsl - real,intent(out),dimension(km):: h - real,intent(out),dimension(2):: ho - real,parameter:: gammam=-6.5e-3,zshul=75.,tvshul=290.66 - real,parameter:: pm1=1.e5,tm1=287.45,hm1=113.,hm2=5572.,& - fslp=con_g*(hm2-hm1)/(con_rd*tm1) - integer k,i - real aps,ap(km),tv(km) - real apo(2) - real tvu,tvd,gammas,part - real hfac -! 
compute model heights - aps=log(ps) - ap(1)=log(p(1)) - tv(1)=t(1)*(1+con_fvirt*sh(1)) - h(1)=hs-con_rog*tv(1)*(ap(1)-aps) - do k=2,km - ap(k)=log(p(k)) - tv(k)=t(k)*(1+con_fvirt*sh(k)) - h(k)=h(k-1)-con_rog*0.5*(tv(k-1)+tv(k))*(ap(k)-ap(k-1)) - enddo -! compute 1000 and 500 mb heights - apo(1)=log(1000.e2) - apo(2)=log(500.e2) - do i=1,2 - if(aps.lt.apo(i)) then - tvu=tv(1) - if(h(1).gt.zshul) then - tvd=tvu-gammam*h(1) - if(tvd.gt.tvshul) then - if(tvu.gt.tvshul) then - tvd=tvshul-5.e-3*(tvu-tvshul)**2 - else - tvd=tvshul - endif - endif - gammas=(tvu-tvd)/h(1) - else - gammas=0. - endif - part=con_rog*(apo(i)-ap(1)) - ho(i)=h(1)-tvu*part/(1.+0.5*gammas*part) - else - do k=1,km - if(ap(k).lt.apo(i)) then - ho(i)=h(k)-con_rog*tv(k)*(apo(i)-ap(k)) - exit - endif - enddo - endif - enddo -! compute sea level pressure - hfac=ho(1)/(ho(2)-ho(1)) - prmsl=pm1*exp(fslp*hfac) -end subroutine diff --git a/sorc/gfs_bufr.fd/lcl.f b/sorc/gfs_bufr.fd/lcl.f deleted file mode 100644 index 5fa4c4719e8..00000000000 --- a/sorc/gfs_bufr.fd/lcl.f +++ /dev/null @@ -1,45 +0,0 @@ - SUBROUTINE LCL(TLCL,PLCL,T,P,Q) -C -C LIFTING CONDENSATION LEVEL ROUTINE -C - REAL L0, KAPPA - parameter (dtdp=4.5e-4,kappa=.286,g=9.81) - parameter (cp=1004.6,cl=4185.5,cpv=1846.0) - parameter (rv=461.5,l0=2.500e6,t0=273.16,es0=610.78) - parameter (cps=2106.0,hfus=3.3358e5,rd=287.05) - parameter (fact1=(CPV - CL) / RV,fact1i=(cps-cl)/rv) - parameter (fact2=(L0 + (CL - CPV) * T0) / RV) - parameter (fact2i=(L0 + hfus + (CL - cps) * T0) / RV) - parameter (fact3=1. / T0,eps=rd/rv,tmix=t0-20.) 
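Reviewer note: the final step of `gslp` reduces the Shuell method to `prmsl = pm1*exp(fslp*hfac)` with `hfac = h1000/(h500 - h1000)`. A sketch using the constants declared in the routine (`con_g` and `con_rd` are assumed to be the GFS physcons values 9.80665 and 287.05):

```python
import math

CON_G, CON_RD = 9.80665, 287.05   # assumed GFS physcons values
PM1, TM1 = 1.0e5, 287.45          # reference pressure (Pa) and temperature (K), from gslp
HM1, HM2 = 113.0, 5572.0          # standard heights of the 1000 and 500 hPa surfaces (m)
FSLP = CON_G * (HM2 - HM1) / (CON_RD * TM1)

def shuell_prmsl(h1000, h500):
    """Sea level pressure (Pa) from the 1000 and 500 hPa heights, as in gslp."""
    hfac = h1000 / (h500 - h1000)
    return PM1 * math.exp(FSLP * hfac)
```

Feeding the standard-atmosphere heights back in gives roughly 1013.5 hPa, close to the standard sea level pressure, which is a quick sanity check on the constants.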
- DESDT(ES,T) = ES * (FACT1 / T + FACT2 / T ** 2) - DESDTi(ES,T) = ES * (FACT1i / T + FACT2i / T ** 2) - ITER = 0 - CALL TDEW(TG,T,Q,P) - 5 CALL SVP(QS,ES,P,TG) - DES = DESDT(ES,TG) - if(tg.ge.t0) then - des = desdt(es,tg) - elseif(tg.lt.tmix) then - des = desdti(es,tg) - else - w = (tg - tmix) / (t0 - tmix) - des = w * desdt(es,tg) + (1.-w) * desdti(es,tg) - endif - FT = P * (TG / T) ** KAPPA - DFT = KAPPA * FT / TG - GT = (EPS + Q * (1. - EPS)) * ES - Q * FT - DGT = (EPS + Q * (1. - EPS)) * DES - Q * DFT - DTG = GT / DGT -c WRITE(6,*) ' ITER, DTG =', ITER, DTG - TG = TG - DTG - IF(ABS(DTG).LT..1) GOTO 10 - ITER = ITER + 1 - IF(ITER.GT.30) THEN - WRITE(6,*) ' LCL ITERATION DIVERGES' - STOP 'ABEND 101' - ENDIF - GOTO 5 - 10 TLCL = TG - PLCL = P * (TLCL / T) ** KAPPA - RETURN - END diff --git a/sorc/gfs_bufr.fd/machine.f b/sorc/gfs_bufr.fd/machine.f deleted file mode 100644 index bec00028adb..00000000000 --- a/sorc/gfs_bufr.fd/machine.f +++ /dev/null @@ -1,15 +0,0 @@ - MODULE MACHINE - - IMPLICIT NONE - SAVE -! Machine dependant constants - integer kind_io4,kind_io8,kind_phys,kind_rad - parameter (kind_rad = selected_real_kind(13,60)) ! the '60' maps to 64-bit real - parameter (kind_phys = selected_real_kind(13,60)) ! the '60' maps to 64-bit real - parameter (kind_io4 = 4) -! parameter (kind_io8 = 8) - parameter (kind_io8 = 4) - integer kint_mpi - parameter (kint_mpi = 4) - - END MODULE MACHINE diff --git a/sorc/gfs_bufr.fd/meteorg.f b/sorc/gfs_bufr.fd/meteorg.f deleted file mode 100644 index 6b7c2c7db4b..00000000000 --- a/sorc/gfs_bufr.fd/meteorg.f +++ /dev/null @@ -1,1326 +0,0 @@ - subroutine meteorg(npoint,rlat,rlon,istat,cstat,elevstn, - & nf,nfile,fnsig,jdate,idate, - & levs,im,jm,kdim, - & landwater,nend1,nint1,nint3,iidum,jjdum,np1, - & fformat,iocomms,iope,ionproc) - -!$$$ SUBPROGRAM DOCUMENTATION BLOCK -! . . . . -! SUBPROGRAM: meteorg -! PRGMMR: HUALU PAN ORG: W/NMC23 DATE: 1999-07-21 -! -! ABSTRACT: Creates BUFR meteogram files for the AVN and MRF. -! -! 
PROGRAM HISTORY LOG: -! 1999-07-21 HUALU PAN -! 2007-02-02 FANGLIN YANG EXPAND FOR HYBRID COORDINATES USING SIGIO -! 2009-07-24 FANGLIN YANG CHANGE OUTPUT PRESSURE TO INTEGER-LAYER -! PRESSURE (line 290) -! CORRECT THE TEMPERATURE ADJUSTMENT (line 238) -! 2014-03-27 DANA CARLIS UNIFY CODE WITH GFS FORECAST MODEL PRECIP -! TYPE CALCULATION -! 2016-09-27 HUIYA CHUANG MODIFY TO READ GFS NEMS OUTPUT ON GRID SPACE -! 2017-02-27 GUANG PING LOU CHANGE OUTPUT PRECIPITATION TO HOURLY AMOUNT -! TO 120 HOURS AND 3 HOURLY TO 180 HOURS. -! 2018-02-01 GUANG PING LOU INGEST FV3GFS NEMSIO ACCUMULATED PRECIPITATION -! AND RECALCULATE HOURLY AND 3 HOURLY OUTPUT DEPENDING -! ON LOGICAL VALUE OF precip_accu. -! 2018-02-08 GUANG PING LOU ADDED READING IN AND USING DZDT AS VERTICAL VELOCITY -! 2018-02-16 GUANG PING LOU ADDED READING IN AND USING MODEL DELP AND DELZ -! 2018-02-21 GUANG PING LOU THIS VERSION IS BACKWARD COMPATIBLE TO GFS MODEL -! 2018-03-27 GUANG PING LOU CHANGE STATION ELEVATION CORRECTION LAPSE RATE FROM 0.01 TO 0.0065 -! 2018-03-28 GUANG PING LOU GENERALIZE TIME INTERVAL -! 2019-07-08 GUANG PING LOU ADDED STATION CHARACTER IDS -! 2019-10-08 GUANG PING LOU MODIFY TO READ IN NetCDF FILES. RETAIN NEMSIO -! RELATED CALLS AND CLEAN UP THE CODE. -! 2020-04-24 GUANG PING LOU Clean up code and remove station height -! adjustment -! -! USAGE: CALL PROGRAM meteorg -! INPUT: -! npoint - number of points -! rlat(npint) - latitude -! rlon(npoint) - longtitude -! istat(npoint) - station id -! elevstn(npoint) - station elevation (m) -! nf - forecast cycle -! fnsig - sigma file name -! idate(4) - date -! levs - input vertical layers -! kdim - sfc file dimension -! -! OUTPUT: -! nfile - output data file channel -! jdate - date YYYYMMDDHH -! -! ATTRIBUTES: -! LANGUAGE: -! MACHINE: IBM SP -! 
-!$$$ - use netcdf - use nemsio_module - use sigio_module - use physcons - use mersenne_twister - use funcphys - implicit none - include 'mpif.h' - type(nemsio_gfile) :: gfile - type(nemsio_gfile) :: ffile - type(nemsio_gfile) :: ffile2 - integer :: nfile,npoint,levs,kdim - integer :: nfile1 - integer :: i,j,im,jm,kk,idum,jdum,idvc,idsl -! idsl Integer(sigio_intkind) semi-lagrangian id -! idvc Integer(sigio_intkind) vertical coordinate id -! (=1 for sigma, =2 for ec-hybrid, =3 for ncep hybrid) - integer,parameter :: nvcoord=2 - integer,parameter :: levso=64 - integer :: idate(4),nij,nflx2,np,k,l,nf,nfhour,np1 - integer :: idate_nems(7) - integer :: iret,jdate,leveta,lm,lp1 - character*150 :: fnsig,fngrib -!! real*8 :: data(6*levs+25) - real*8 :: data2(6*levso+25) - real*8 :: rstat1 - character*8 :: cstat1 - character*4 :: cstat(npoint) - real :: fhour,pp,ppn,qs,qsn,esn,es,psfc,ppi,dtemp,nd - real :: t,q,u,v,td,tlcl,plcl,qw,tw,xlat,xlon - integer,dimension(npoint):: landwater - integer,dimension(im,jm):: lwmask - real,dimension(im,jm):: apcp, cpcp - real,dimension(npoint,2+levs*3):: grids - real,dimension(npoint) :: rlat,rlon,pmsl,ps,psn,elevstn - real,dimension(1) :: psone - real,dimension(im*jm) :: dum1d,dum1d2 - real,dimension(im,jm) :: gdlat, hgt, gdlon - real,dimension(im,jm,15) :: dum2d - real,dimension(im,jm,levs) :: t3d, q3d, uh, vh,omega3d - real,dimension(im,jm,levs) :: delpz - real,dimension(im,jm,levs+1) :: pint, zint - real,dimension(npoint,levs) :: gridu,gridv,omega,qnew,zp - real,dimension(npoint,levs) :: p1,pd3,ttnew - real,dimension(npoint,levs) :: z1 - real,dimension(npoint,levs+1) :: pi3 - real :: zp2(2) - real,dimension(kdim,npoint) :: sfc - real,dimension(1,levs+1) :: prsi,phii - real,dimension(1,levs) :: gt0,gq0,prsl,phy_f3d - real :: PREC,TSKIN,SR,randomno(1,2) - real :: DOMR,DOMZR,DOMIP,DOMS - real :: vcoord(levs+1,nvcoord),vdummy(levs+1) - real :: vcoordnems(levs+1,3,2) - real :: rdum - integer :: n3dfercld,iseedl - integer :: istat(npoint) 
- logical :: trace -!! logical, parameter :: debugprint=.true. - logical, parameter :: debugprint=.false. - character lprecip_accu*3 - real, parameter :: ERAD=6.371E6 - real, parameter :: DTR=3.1415926/180. - real :: ap - integer :: nf1, fint - integer :: nend1, nint1, nint3 - character*150 :: fngrib2 - integer recn_dpres,recn_delz,recn_dzdt - integer :: jrec - equivalence (cstat1,rstat1) - integer iidum(npoint),jjdum(npoint) - integer :: error, ncid, ncid2, id_var,dimid - character(len=100) :: long_name - character(len=6) :: fformat - integer,dimension(8) :: clocking - character(10) :: date - character(12) :: time - character(7) :: zone - character(3) :: Zreverse - character(20) :: VarName,LayName - integer iocomms,iope,ionproc - - nij = 12 -!! nflx = 6 * levs - nflx2 = 6 * levso - recn_dpres = 0 - recn_delz = 0 - recn_dzdt = 0 - jrec = 0 - lprecip_accu='yes' - - idvc=2 - idsl=1 -!read in NetCDF file header info - print*,"fformat= ", fformat -! print*,'meteorg.f, idum,jdum= ' -! do np = 1, npoint -! print*, iidum(np), jjdum(np) -! enddo - - if(fformat .eq. "netcdf") then - print*,'iocomms inside meteorg.f=', iocomms - error=nf90_open(trim(fnsig),ior(nf90_nowrite,nf90_mpiio), - & ncid,comm=iocomms, info = mpi_info_null) - error=nf90_get_att(ncid,nf90_global,"ak",vdummy) - do k = 1, levs+1 - vcoord(k,1)=vdummy(levs-k+1) - enddo - error=nf90_get_att(ncid,nf90_global,"bk",vdummy) - do k = 1, levs+1 - vcoord(k,2)=vdummy(levs-k+1) - enddo - error=nf90_inq_varid(ncid, "time", id_var) - error=nf90_get_var(ncid, id_var, nfhour) - print*, "nfhour:",nfhour - error=nf90_get_att(ncid,id_var,"units",long_name) -!! 
print*,'time units',' -- ',trim(long_name) - read(long_name(13:16),"(i4)")idate(4) - read(long_name(18:19),"(i2)")idate(2) - read(long_name(21:22),"(i2)")idate(3) - read(long_name(24:25),"(i2)")idate(1) - fhour=float(nfhour) - print*,'date= ', idate - jdate = idate(4)*1000000 + idate(2)*10000+ - & idate(3)*100 + idate(1) - print *, 'jdate = ', jdate - error=nf90_inq_varid(ncid, "lon", id_var) - error=nf90_get_var(ncid, id_var, gdlon) - error=nf90_inq_varid(ncid, "lat", id_var) - error=nf90_get_var(ncid, id_var, gdlat) -!!end read NetCDF hearder info, read nemsio below if necessary - else - - call nemsio_open(gfile,trim(fnsig),'read',iret=iret) - call nemsio_getfilehead(gfile,iret=iret - + ,idate=idate_nems(1:7),nfhour=nfhour - + ,idvc=idvc,idsl=idsl,lat=dum1d,lon=dum1d2 - + ,vcoord=vcoordnems) - - do k=1,levs+1 - vcoord(k,1)=vcoordnems(k,1,1) - vcoord(k,2)=vcoordnems(k,2,1) - end do - idate(1)=idate_nems(4) - idate(2)=idate_nems(2) - idate(3)=idate_nems(3) - idate(4)=idate_nems(1) - fhour=float(nfhour) - print *, ' processing forecast hour ', fhour - print *, ' idate =', idate - jdate = idate(4)*1000000 + idate(2)*10000+ - & idate(3)*100 + idate(1) - print *, 'jdate = ', jdate - print *, 'Total number of stations = ', npoint - ap = 0.0 - do j=1,jm - do i=1,im - gdlat(i,j)=dum1d((j-1)*im+i) - gdlon(i,j)=dum1d2((j-1)*im+i) - end do - end do - - endif !end read in nemsio hearder - - if(debugprint) then - do k=1,levs+1 - print*,'vcoord(k,1)= ', k, vcoord(k,1) - end do - do k=1,levs+1 - print*,'vcoord(k,2)= ', k, vcoord(k,2) - end do - print*,'sample lat= ',gdlat(im/5,jm/4) - + ,gdlat(im/5,jm/3),gdlat(im/5,jm/2) - print*,'sample lon= ',gdlon(im/5,jm/4) - + ,gdlon(im/5,jm/3),gdlon(im/5,jm/2) - endif -! 
topography - if (fformat == 'netcdf') then - VarName='hgtsfc' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,1,VarName,hgt,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'surface hgt not found' - else - VarName='hgt' - LayName='sfc' - call read_nemsio(gfile,im,jm,1,VarName,LayName,hgt, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'surface hgt not found' - endif - if(debugprint)print*,'sample sfc h= ',hgt(im/5,jm/4) - + ,hgt(im/5,jm/3),hgt(im/5,jm/2) - -! surface pressure (Pa) - if (fformat == 'netcdf') then - VarName='pressfc' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,1,VarName,pint(:,:,1), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'surface pressure not found' - else - VarName='pres' - LayName='sfc' - call read_nemsio(gfile,im,jm,1,VarName, - & LayName,pint(:,:,1),error) - if (error /= 0) print*,'surface pressure not found' - endif - if(debugprint)print*,'sample sfc P= ',pint(im/2,jm/4,1), - + pint(im/2,jm/3,1),pint(im/2,jm/2,1) - -! temperature using NetCDF - if (fformat == 'netcdf') then - VarName='tmp' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,levs,VarName,t3d,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'temp not found' - else - VarName='tmp' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName,t3d,error) - if (error /= 0) print*,'temp not found' - endif - if(debugprint) then - print*,'sample T at lev=1 to levs ' - do k = 1, levs - print*,k, t3d(im/2,jm/3,k) - enddo - endif -! 
specific humidity - if (fformat == 'netcdf') then - VarName='spfh' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,levs,VarName,q3d,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'spfh not found' - else - VarName='spfh' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName,q3d,error) - if (error /= 0) print*,'spfh not found' - endif - if(debugprint) then - print*,'sample Q at lev=1 to levs ' - do k = 1, levs - print*,k, q3d(im/2,jm/3,k) - enddo - endif -! U wind - if (fformat == 'netcdf') then - VarName='ugrd' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,levs,VarName,uh,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'ugrd not found' - else - VarName='ugrd' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName,uh,error) - if (error /= 0) print*,'ugrd not found' - endif - if(debugprint) then - print*,'sample U at lev=1 to levs ' - do k = 1, levs - print*,k, uh(im/2,jm/3,k) - enddo - endif -! V wind - if (fformat == 'netcdf') then - VarName='vgrd' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,levs,VarName,vh,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'vgrd not found' - else - VarName='vgrd' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName,vh,error) - if (error /= 0) print*,'vgrd not found' - endif - if(debugprint) then - print*,'sample V at lev=1 to levs ' - do k = 1, levs - print*,k, vh(im/2,jm/3,k) - enddo - endif -! 
dzdt !added by Guang Ping Lou for FV3GFS - if (fformat == 'netcdf') then - VarName='dzdt' - Zreverse='yes' - call read_netcdf_p(ncid,im,jm,levs,VarName,omega3d,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'dzdt not found' - else - VarName='dzdt' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName, - & omega3d,error) - if (error /= 0) print*,'dzdt not found' - endif - if(debugprint) then - print*,'sample dzdt at lev=1 to levs ' - do k = 1, levs - print*,k, omega3d(im/2,jm/3,k) - enddo - endif -! dpres !added by Guang Ping Lou for FV3GFS (interface pressure delta) - if (fformat == 'netcdf') then - VarName='dpres' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,levs,VarName,delpz,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'dpres not found' - else - VarName='dpres' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName, - & delpz,error) - if (error /= 0) print*,'dpres not found' - endif - if(debugprint) then - print*,'sample delp at lev=1 to levs ' - do k = 1, levs - print*,k, delpz(im/2,jm/3,k) - enddo - endif -! compute interface pressure - if(recn_dpres == -9999) then - do k=2,levs+1 - do j=1,jm - do i=1,im - pint(i,j,k)=vcoord(k,1) - + +vcoord(k,2)*pint(i,j,1) - end do - end do - end do - else -! compute pint using dpres from top down if DZDT is used - if (fformat == 'netcdf') then - do j=1,jm - do i=1,im - pint(i,j,levs+1) = delpz(i,j,1) - end do - end do - do k=levs,2,-1 - kk=levs-k+2 - do j=1,jm - do i=1,im - pint(i,j,k) = pint(i,j,k+1) + delpz(i,j,kk) - end do - end do - end do - else - do k=2,levs+1 - do j=1,jm - do i=1,im - pint(i,j,k) = pint(i,j,k-1) - delpz(i,j,k-1) - end do - end do - end do - endif - if(debugprint) then - print*,'sample interface pressure pint at lev =1 to levs ' - do k = 1, levs+1 - print*,k, pint(im/2,jm/3,k),pint(im/3,jm/8,k) - enddo - endif - endif -! 
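The interface-pressure block above integrates the `dpres` layer thicknesses away from the surface pressure (top-down in the NetCDF branch because that field arrives reversed; bottom-up in the nemsio branch). A minimal sketch of the bottom-up form, equivalent to the nemsio branch's `pint(i,j,k) = pint(i,j,k-1) - delpz(i,j,k-1)`, with illustrative inputs:

```python
# Sketch of the interface-pressure reconstruction in the deleted meteorg.f:
# when layer pressure thickness (dpres) is available, interfaces are built
# by subtracting each layer's thickness going upward from the surface.
def interface_pressures(psfc, dpres):
    """psfc: surface pressure (Pa); dpres[k]: thickness (Pa) of layer k,
    ordered surface upward. Returns levs+1 interfaces, surface first."""
    pint = [psfc]
    for dp in dpres:
        pint.append(pint[-1] - dp)   # pressure decreases with height
    return pint
```

When `dpres` is absent the original instead falls back to the hybrid-coordinate form `pint = ak + bk * psfc` read from the file header.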
delz !added by Guang Ping Lou for FV3GFS ("height thickness" with unit "meters" bottom up) - if (fformat == 'netcdf') then - VarName='delz' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,levs,VarName,delpz,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'delz not found' - else - VarName='delz' - LayName='mid layer' - call read_nemsio(gfile,im,jm,levs,VarName,LayName,delpz,error) - if (error /= 0) print*,'delz not found' - endif - if(debugprint) then - print*,'sample delz at lev=1 to levs ' - do k = 1, levs - print*,k, delpz(im/2,jm/3,k) - enddo - endif - -! compute interface height (meter) - if(recn_delz == -9999) then - print*, 'using calculated height' - else -! compute zint using delz from bot up if DZDT is used - if (fformat == 'netcdf') then - do j=1,jm - do i=1,im - zint(i,j,1) = 0.0 - end do - end do - do k=2,levs+1 - kk=levs-k+1 - do j=1,jm - do i=1,im - zint(i,j,k) = zint(i,j,k-1) - delpz(i,j,kk) - end do - end do - end do - else - do k=2,levs+1 - do j=1,jm - do i=1,im - zint(i,j,k) = zint(i,j,k-1) + delpz(i,j,k-1) - end do - end do - end do - endif - if(debugprint) then - print*,'sample interface height zint at lev =1 to levs ' - do k = 1, levs+1 - print*,k, zint(im/2,jm/3,k),zint(im/3,jm/8,k) - enddo - endif - endif - -! close up this NetCDF file - error=nf90_close(ncid) - -! Now open up NetCDF surface files - if ( nf .le. 
nend1 ) then - nf1 = nf - nint1 - else - nf1 = nf - nint3 - endif - if ( nf == 0 ) nf1=0 - if(nf==0) then - fngrib='flxf00' - elseif(nf.lt.10) then - fngrib='flxf0' - write(fngrib(6:6),'(i1)') nf - elseif(nf.lt.100) then - fngrib='flxf' - write(fngrib(5:6),'(i2)') nf - else - fngrib='flxf' - write(fngrib(5:7),'(i3)') nf - endif - if(nf1==0) then - fngrib2='flxf00' - elseif(nf1.lt.10) then - fngrib2='flxf0' - write(fngrib2(6:6),'(i1)') nf1 - elseif(nf1.lt.100) then - fngrib2='flxf' - write(fngrib2(5:6),'(i2)') nf1 - else - fngrib2='flxf' - write(fngrib2(5:7),'(i3)') nf1 - endif - if (fformat == 'netcdf') then - error=nf90_open(trim(fngrib),nf90_nowrite,ncid) -!open T-nint below - error=nf90_open(trim(fngrib2),nf90_nowrite,ncid2) - if(error /= 0)print*,'file not open',trim(fngrib), trim(fngrib2) - else - call nemsio_open(ffile,trim(fngrib),'read',iret=error) - call nemsio_open(ffile2,trim(fngrib2),'read',iret=error) - if(error /= 0)print*,'file not open',trim(fngrib), trim(fngrib2) - endif -! land water mask - if (fformat == 'netcdf') then - VarName='land' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,lwmask,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'lwmask not found' - else - VarName='land' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName,lwmask,error) - if (error /= 0) print*,'lwmask not found' - endif - if(debugprint) - + print*,'sample land mask= ',lwmask(im/2,jm/4), - + lwmask(im/2,jm/3) - -! surface T - if (fformat == 'netcdf') then - VarName='tmpsfc' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,1), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'tmpsfc not found' - else - VarName='tmp' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - & dum2d(:,:,1),error) - if (error /= 0) print*,'tmpsfc not found' - endif - if(debugprint) - + print*,'sample sfc T= ',dum2d(im/2,jm/4,1),dum2d(im/2,jm/3,1), - + dum2d(im/2,jm/2,1) -! 
2m T - if (fformat == 'netcdf') then - VarName='tmp2m' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,2), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'tmp2m not found' - else - VarName='tmp' - LayName='2 m above gnd' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,2),error) - if (error /= 0) print*,'tmp2m not found' - endif - if(debugprint) - + print*,'sample 2m T= ',dum2d(im/2,jm/4,2),dum2d(im/2,jm/3,2), - + dum2d(im/2,jm/2,2) - -! 2m Q - if (fformat == 'netcdf') then - VarName='spfh2m' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,3), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'spfh2m not found' - else - VarName='spfh' - LayName='2 m above gnd' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,3),error) - if (error /= 0) print*,'spfh2m not found' - endif - if(debugprint) - + print*,'sample 2m Q= ',dum2d(im/2,jm/4,3),dum2d(im/2,jm/3,3), - + dum2d(im/2,jm/2,3) - -! U10 - if (fformat == 'netcdf') then - VarName='ugrd10m' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,4), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'ugrd10m not found' - else - VarName='ugrd' - LayName='10 m above gnd' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,4),error) - if (error /= 0) print*,'ugrd10m not found' - endif - -! V10 - if (fformat == 'netcdf') then - VarName='vgrd10m' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,5), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'vgrd10m not found' - else - VarName='vgrd' - LayName='10 m above gnd' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,5),error) - if (error /= 0) print*,'vgrd10m not found' - endif - -! 
soil T - if (fformat == 'netcdf') then - VarName='soilt1' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,6), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'soilt1 not found' - else - VarName='tmp' - LayName='0-10 cm down' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,6),error) - if (error /= 0) print*,'soil T not found' - endif - if(debugprint) - + print*,'sample soil T= ',dum2d(im/2,jm/4,6),dum2d(im/2,jm/3,6), - + dum2d(im/2,jm/2,6) - -! snow depth - if (fformat == 'netcdf') then - VarName='snod' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,7), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'snod not found' - else - VarName='snod' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,7),error) - if (error /= 0) print*,'snod not found' - endif - -! evaporation -!instantaneous surface latent heat net flux - if (fformat == 'netcdf') then - VarName='lhtfl' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,8), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'lhtfl not found' - else - VarName='lhtfl' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,8),error) - if (error /= 0) print*,'lhtfl not found' - endif - if(debugprint) - + print*,'evaporation latent heat net flux= ', - + dum2d(im/2,jm/4,8),dum2d(im/2,jm/3,8) - if(debugprint) - + print*,'evaporation latent heat net flux stn 000692)= ', - + dum2d(2239,441,8) - -! total precip - if ( nf .le. nend1 ) then - fint = nint1 - else - fint = nint3 - endif -! for accumulated precipitation: - if (fformat == 'netcdf') then - VarName='prate_ave' - Zreverse='no' -!! call read_netcdf_p(ncid,im,jm,1,VarName,apcp,Zreverse,error) !current hour - call read_netcdf_p(ncid,im,jm,1,VarName,apcp,Zreverse, - & iope,ionproc,iocomms,error) -!! 
call read_netcdf_p(ncid2,im,jm,1,VarName,cpcp,Zreverse,error) !earlier hour - call read_netcdf_p(ncid2,im,jm,1,VarName,cpcp,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'prate_ave not found' - else - VarName='prate_ave' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + apcp,error) - call read_nemsio(ffile2,im,jm,1,VarName,LayName, - + cpcp,error) - if (error /= 0) print*,'prate_ave2 not found' - endif - if(debugprint) - & print*,'sample fhour ,3= ', fhour, - & '1sample precip rate= ',apcp(im/2,jm/3),cpcp(im/2,jm/3) - ap=fhour-fint - do j=1,jm - do i=1,im - dum2d(i,j,9) =(apcp(i,j)*fhour-cpcp(i,j)*ap)*3600.0 - end do - end do - - if(debugprint) - & print*,'sample fhour ,5= ', fhour, - & 'sample total precip= ',dum2d(im/2,jm/4,9), - + dum2d(im/2,jm/3,9),dum2d(im/2,jm/2,9) - -! convective precip - if (fformat == 'netcdf') then - VarName='cprat_ave' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,apcp,Zreverse, - & iope,ionproc,iocomms,error) - call read_netcdf_p(ncid2,im,jm,1,VarName,cpcp,Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'cprat_ave not found' - else - VarName='cprat_ave' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + apcp,error) - call read_nemsio(ffile2,im,jm,1,VarName,LayName, - + cpcp,error) - if (error /= 0) print*,'cprat_ave2 not found' - endif - ap=fhour-fint - do j=1,jm - do i=1,im - dum2d(i,j,10)=(apcp(i,j)*fhour-cpcp(i,j)*ap)*3600.0 - & - end do - end do - -! water equi - if (fformat == 'netcdf') then - VarName='weasd' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName,dum2d(:,:,11), - & Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'weasd not found' - else - VarName='weasd' - LayName='sfc' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,11),error) - if (error /= 0) print*,'weasd not found' - endif - -! 
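The precipitation arithmetic above deserves a note: `prate_ave`/`cprat_ave` are rates *averaged since forecast start* (kg m⁻² s⁻¹), so the amount that fell during the last output interval is recovered by differencing two running averages from the current and earlier flux files. A sketch with illustrative inputs, mirroring `(apcp*fhour - cpcp*ap)*3600.0` with `ap = fhour - fint`:

```python
# Sketch of the de-accumulation step in the deleted meteorg.f.
def interval_precip(apcp, cpcp, fhour, fint):
    """Precip (kg m-2) accumulated over the last `fint` hours.

    apcp : rate averaged over hours 0..fhour          (current flux file)
    cpcp : rate averaged over hours 0..(fhour - fint) (earlier flux file)
    """
    ap = fhour - fint
    # avg_rate * duration gives totals; the difference of the two running
    # totals is the accumulation over the final fint-hour window.
    return (apcp * fhour - cpcp * ap) * 3600.0
```

The later clamp `if(sfc(11,np) .gt. sfc(12,np))` exists because this differencing can leave convective precip marginally above total precip through rounding.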
low cloud fraction - if (fformat == 'netcdf') then - VarName='tcdc_avelcl' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName, - & dum2d(:,:,12),Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'tcdc_avelcl not found' - else - VarName='tcdc_ave' - LayName='low cld lay' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,12),error) - if (error /= 0) print*,'low cld lay not found' - endif - -! mid cloud fraction - if (fformat == 'netcdf') then - VarName='tcdc_avemcl' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName, - & dum2d(:,:,13),Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'tcdc_avemcl not found' - else - VarName='tcdc_ave' - LayName='mid cld lay' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,13),error) - if (error /= 0) print*,'mid cld lay not found' - endif - -! high cloud fraction - if (fformat == 'netcdf') then - VarName='tcdc_avehcl' - Zreverse='no' - call read_netcdf_p(ncid,im,jm,1,VarName, - & dum2d(:,:,14),Zreverse, - & iope,ionproc,iocomms,error) - if (error /= 0) print*,'tcdc_avehcl not found' - else - VarName='tcdc_ave' - LayName='high cld lay' - call read_nemsio(ffile,im,jm,1,VarName,LayName, - + dum2d(:,:,14),error) - if (error /= 0) print*,'high cld lay not found' - endif - - if(debugprint) - + print*,'sample high cloud frac= ',dum2d(im/2,jm/4,14), - + dum2d(im/2,jm/3,14),dum2d(im/2,jm/2,14) - - if (fformat == 'netcdf') then - error=nf90_close(ncid) - error=nf90_close(ncid2) - else - call nemsio_close(ffile,iret=error) - call nemsio_close(ffile2,iret=error) - endif - call date_and_time(date,time,zone,clocking) -! print *,'10reading surface data end= ', clocking - print *,'10date, time, zone',date, time, zone -! -! get the nearest neighbor i,j from the table -! - do np=1, npoint -! use read in predetermined i,j - if (np1==0) then - idum=iidum(np) - jdum=jjdum(np) - - else -! find nearest neighbor - rdum=rlon(np) - if(rdum<0.)rdum=rdum+360. 
- - do j=1,jm-1 - do i=1,im-1 - if((rdum>=gdlon(i,j) .and. rdum<=gdlon(i+1,j)) .and. - + (rlat(np)<=gdlat(i,j).and.rlat(np)>=gdlat(i,j+1)) ) then - if(landwater(np) == 2)then - idum=i - jdum=j - exit - else if(landwater(np) == lwmask(i,j))then - idum=i - jdum=j !1 - exit - else if(landwater(np) == lwmask(i+1,j))then - idum=i+1 - jdum=j ! 2 - exit - else if(landwater(np) == lwmask(i-1,j))then - idum=i-1 - jdum=j ! 3 - exit - else if(landwater(np) == lwmask(i,j+1))then - idum=i - jdum=j+1 ! 4 - exit - else if(landwater(np) == lwmask(i,j-1))then - idum=i - jdum=j-1 ! 5 - exit - else if(landwater(np) == lwmask(i+1,j-1))then - idum=i+1 - jdum=j-1 ! 6 - exit - else if(landwater(np) == lwmask(i+1,j+1))then - idum=i+1 - jdum=j+1 ! 7 - exit - else if(landwater(np) == lwmask(i-1,j+1))then - idum=i-1 - jdum=j+1 ! 8 - exit - else if(landwater(np) == lwmask(i-1,j-1))then - idum=i-1 - jdum=j-1 ! 9 - exit - else if(landwater(np) == lwmask(i,j+2))then - idum=i - jdum=j+2 ! 10 - exit - else if(landwater(np) == lwmask(i+2,j))then - idum=i+2 - jdum=j !11 - exit - else if(landwater(np) == lwmask(i,j-2))then - idum=i - jdum=j-2 ! 12 - exit - else if(landwater(np) == lwmask(i-2,j))then - idum=i-2 - jdum=j !13 - exit - else if(landwater(np) == lwmask(i-2,j+1))then - idum=i-2 - jdum=j+1 ! 
14 - exit - else if(landwater(np) == lwmask(i-1,j+2))then - idum=i-1 - jdum=j+2 !15 - exit - else if(landwater(np) == lwmask(i+1,j+2))then - idum=i+1 - jdum=j+2 !16 - exit - else if(landwater(np) == lwmask(i+2,j+1))then - idum=i+2 - jdum=j+1 !17 - exit - else if(landwater(np) == lwmask(i+2,j-1))then - idum=i+2 - jdum=j-1 !18 - exit - else if(landwater(np) == lwmask(i+1,j-2))then - idum=i+1 - jdum=j-2 !19 - exit - else if(landwater(np) == lwmask(i-1,j-2))then - idum=i-1 - jdum=j-2 !20 - exit - else if(landwater(np) == lwmask(i-2,j-1))then - idum=i-2 - jdum=j-1 !21 - exit - else if(landwater(np) == lwmask(i-2,j-2))then - idum=i-2 - jdum=j-2 !22 - exit - else if(landwater(np) == lwmask(i+2,j-2))then - idum=i+2 - jdum=j-2 !23 - exit - else if(landwater(np) == lwmask(i+2,j+2))then - idum=i+2 - jdum=j+2 !24 - exit - else if(landwater(np) == lwmask(i-2,j+2))then - idum=i-2 - jdum=j+2 !25 - exit - else if(landwater(np) == lwmask(i+3,j))then - idum=i+3 - jdum=j !26 - exit - else if(landwater(np) == lwmask(i-3,j))then - idum=i-3 - jdum=j !27 - exit - else if(landwater(np) == lwmask(i,j+3))then - idum=i - jdum=j+3 !28 - exit - else if(landwater(np) == lwmask(i,j-3))then - idum=i - jdum=j-3 !29 - exit - else -CC print*,'no matching land sea mask np,landwater,i,j,mask= ' -CC print*, np,landwater(np),i,j,lwmask(i,j) -CC print*, ' So it takes i,j ' - idum=i - jdum=j - exit - end if - end if - end do - end do - - idum=max0(min0(idum,im),1) - jdum=max0(min0(jdum,jm),1) - endif !! read in i,j ends here - if (fhour==0.0) then - if(debugprint) then - write(nij,98) np,idum,jdum,rlat(np),rlon(np) - 98 FORMAT (3I6, 2F9.2) - if(elevstn(np)==-999.) 
elevstn(np)=hgt(idum,jdum) - write(9,99) np,rlat(np),rlon(np),elevstn(np),hgt(idum,jdum) - 99 FORMAT (I6, 4F9.2) - if(np==1 .or.np==100)print*,'nearest neighbor for station ',np - + ,idum,jdum,rlon(np),rlat(np),lwmask(i,j),landwater(np) - endif - endif - - grids(np,1)=hgt(idum,jdum) - grids(np,2)=pint(idum,jdum,1) - - sfc(5,np)=dum2d(idum,jdum,1) - sfc(6,np)=dum2d(idum,jdum,6) - sfc(17,np)=dum2d(idum,jdum,8) - sfc(12,np)=dum2d(idum,jdum,9) - sfc(11,np)=dum2d(idum,jdum,10) - sfc(10,np)=dum2d(idum,jdum,11) - sfc(27,np)=dum2d(idum,jdum,12) - sfc(26,np)=dum2d(idum,jdum,13) - sfc(25,np)=dum2d(idum,jdum,14) - sfc(34,np)=dum2d(idum,jdum,4) - sfc(35,np)=dum2d(idum,jdum,5) - sfc(30,np)=dum2d(idum,jdum,2) - sfc(31,np)=dum2d(idum,jdum,3) - -CC There may be cases where convective precip is greater than total precip -CC due to rounding and interpolation errors, correct it here -G.P. Lou: - if(sfc(11,np) .gt. sfc(12,np)) sfc(11,np)=sfc(12,np) - - do k=1,levs - grids(np,k+2)=t3d(idum,jdum,k) - grids(np,k+2+levs)=q3d(idum,jdum,k) - grids(np,k+2+2*levs)=omega3d(idum,jdum,k) - gridu(np,k)=uh(idum,jdum,k) - gridv(np,k)=vh(idum,jdum,k) - p1(np,k)=pint(idum,jdum,k+1) - z1(np,k)=zint(idum,jdum,k+1) -!! p1(np,k)=0.5*(pint(idum,jdum,k)+pint(idum,jdum,k+1)) -!! z1(np,k)=0.5*(zint(idum,jdum,k)+zint(idum,jdum,k+1)) - - end do - end do - - print*,'finish finding nearest neighbor for each station' - - do np = 1, npoint -! !ps in kPa - ps(np) = grids(np,2)/1000. !! surface pressure - enddo - -! -! ----------------- -! Put topo(1),surf press(2),vir temp(3:66),and specifi hum(67:130) in grids -! for each station -!! if(recn_dzdt == 0 ) then !!DZDT - do k = 1, levs - do np = 1, npoint - omega(np,k) = grids(np,2+levs*2+k) - enddo - enddo - if(debugprint) - + print*,'sample (omega) dzdt ', (omega(3,k),k=1,levs) -! -! move surface pressure to the station surface from the model surface -! - do np = 1, npoint -! -! when the station elevation information in the table says missing, -! 
use the model elevation -! -! print *, "elevstn = ", elevstn(np) - if(elevstn(np)==-999.) elevstn(np) = grids(np,1) - psn(np) = ps(np) - psone = ps(np) - call sigio_modpr(1,1,levs,nvcoord,idvc, - & idsl,vcoord,iret, - & ps=psone*1000,pd=pd3(np,1:levs)) - grids(np,2) = log(psn(np)) - if(np==11)print*,'station H,grud H,psn,ps,new pm', - & elevstn(np),grids(np,1),psn(np),ps(np) - if(np==11)print*,'pd3= ', pd3(np,1:levs) - enddo -! -!! test removing height adjustments - print*, 'do not do height adjustments' -! -! get sea-level pressure (Pa) and layer geopotential height -! - do k = 1, levs - do np = 1, npoint - ttnew(np,k) = grids(np,k+2) - qnew(np,k) = grids(np,k+levs+2) - enddo - enddo - - do np=1,npoint -!! call gslp(levs,elevstn(np),ps(np)*1000, - call gslp(levs,grids(np,1),ps(np)*1000, - & p1(np,1:levs),ttnew(np,1:levs),qnew(np,1:levs), - & pmsl(np),zp(np,1:levs),zp2(1:2)) - enddo - print *, 'call gslp pmsl= ', (pmsl(np),np=1,20) - if(recn_delz == -9999) then - print*, 'using calculated height ' - else - print*, 'using model height m' - do k = 1, levs - do np=1, npoint - zp(np,k) = z1(np,k) - enddo - enddo - endif - print*,'finish computing MSLP' - print*,'finish computing zp ', (zp(11,k),k=1,levs) - print*,'finish computing zp2(11-12) ', zp2(11),zp2(12) -! -! prepare buffer data -! - if(iope == 0) then - do np = 1, npoint - pi3(np,1)=psn(np)*1000 - do k=1,levs - pi3(np,k+1)=pi3(np,k)-pd3(np,k) !layer pressure (Pa) - enddo -!! ==ivalence (cstat1,rstat1) - cstat1=cstat(np) -!! data(1) = ifix(fhour+.2) * 3600 ! FORECAST TIME (SEC) -!! data(2) = istat(np) ! STATION NUMBER -!! data(3) = rstat1 ! STATION CHARACTER ID -!! data(4) = rlat(np) ! LATITUDE (DEG N) -!! data(5) = rlon(np) ! LONGITUDE (DEG E) -!! data(6) = elevstn(np) ! STATION ELEVATION (M) - data2(1) = ifix(fhour+.2) * 3600 ! FORECAST TIME (SEC) - data2(2) = istat(np) ! STATION NUMBER - data2(3) = rstat1 ! STATION CHARACTER ID - data2(4) = rlat(np) ! LATITUDE (DEG N) - data2(5) = rlon(np) ! 
LONGITUDE (DEG E) - data2(6) = elevstn(np) ! STATION ELEVATION (M) - psfc = 10. * psn(np) ! convert to MB - leveta = 1 - do k = 1, levs - kk= k/2 + 1 -! -! look for the layer above 500 mb for precip type computation -! - if(pi3(np,k).ge.50000.) leveta = k - ppi = pi3(np,k) - t = grids(np,k+2) - q = max(1.e-8,grids(np,2+k+levs)) - u = gridu(np,k) - v = gridv(np,k) -!! data((k-1)*6+7) = p1(np,k) ! PRESSURE (PA) at integer layer -!! data((k-1)*6+8) = t ! TEMPERATURE (K) -!! data((k-1)*6+9) = u ! U WIND (M/S) -!! data((k-1)*6+10) = v ! V WIND (M/S) -!! data((k-1)*6+11) = q ! HUMIDITY (KG/KG) -!! data((k-1)*6+12) = omega(np,k)*100. ! Omega (pa/sec) !changed to dzdt(cm/s) if available - if (mod(k,2)>0) then - data2((kk-1)*6+7) = p1(np,k) - data2((kk-1)*6+8) = t - data2((kk-1)*6+9) = u - data2((kk-1)*6+10) = v - data2((kk-1)*6+11) = q - data2((kk-1)*6+12) = omega(np,k)*100. - endif -!changed to dzdt(cm/s) if available - enddo -! -! process surface flux file fields -! -!! data(8+nflx) = psfc * 100. ! SURFACE PRESSURE (PA) -!! data(7+nflx) = pmsl(np) - data2(8+nflx2) = psfc * 100. ! SURFACE PRESSURE (PA) - data2(7+nflx2) = pmsl(np) -!! dtemp = .0065 * (grids(np,1) - elevstn(np)) -!! dtemp = .0100 * (grids(np,1) - elevstn(np)) -!! sfc(37,np) = data(6+nflx) * .01 -!! sfc(37,np) = data(7+nflx) * .01 -!! sfc(39,np) = zp2(2) !500 hPa height - sfc(37,np) = data2(7+nflx2) * .01 - sfc(39,np) = zp2(2) !500 hPa height -! -! do height correction if there is no snow or if the temp is less than 0 -! G.P.LOU: -! It was decided that no corrctions were needed due to higher model -! resolution. -! -! if(sfc(10,np)==0.) then -! sfc(30,np) = sfc(30,np) + dtemp -! sfc(5,np) = sfc(5,np) + dtemp -! endif -! if(sfc(10,np).gt.0..and.sfc(5,np).lt.273.16) then -! sfc(5,np) = sfc(5,np) + dtemp -! if(sfc(5,np).gt.273.16) then -! dtemp = sfc(5,np) - 273.16 -! sfc(5,np) = 273.16 -! endif -! sfc(30,np) = sfc(30,np) + dtemp -! endif -! -!G.P. 
Lou 20200501: -!convert instantaneous surface latent heat net flux to surface -!evapolation 1 W m-2 = 0.0864 MJ m-2 day-1 -! and 1 mm day-1 = 2.45 MJ m-2 day-1 -! equivament to 0.0864/2.54 = 0.035265 -! equivament to 2.54/0.0864 = 28.3565 - if(debugprint) - + print*,'evaporation (stn 000692)= ',sfc(17,np) -!! data(9+nflx) = sfc(5,np) ! tsfc (K) -!! data(10+nflx) = sfc(6,np) ! 10cm soil temp (K) -!!! data(11+nflx) = sfc(17,np)/28.3565 ! evaporation (kg/m**2) from (W m-2) -!! data(11+nflx) = sfc(17,np)*0.035265 ! evaporation (kg/m**2) from (W m-2) -!! data(12+nflx) = sfc(12,np) ! total precip (m) -!! data(13+nflx) = sfc(11,np) ! convective precip (m) -!! data(14+nflx) = sfc(10,np) ! water equi. snow (m) -!! data(15+nflx) = sfc(27,np) ! low cloud (%) -!! data(16+nflx) = sfc(26,np) ! mid cloud -!! data(17+nflx) = sfc(25,np) ! high cloud -!! data(18+nflx) = sfc(34,np) ! U10 (m/s) -!! data(19+nflx) = sfc(35,np) ! V10 (m/s) -!! data(20+nflx) = sfc(30,np) ! T2 (K) -!! data(21+nflx) = sfc(31,np) ! Q2 (K) - -!! data(22+nflx) = 0. -!! data(23+nflx) = 0. -!! data(24+nflx) = 0. -!! data(25+nflx) = 0. -!! create 64 level bufr files - data2(9+nflx2) = sfc(5,np) ! tsfc (K) - data2(10+nflx2) = sfc(6,np) ! 10cm soil temp (K) -!! data2(11+nflx2) = sfc(17,np)/28.3565 ! evaporation (kg/m**2) from (W m-2) - data2(11+nflx2) = sfc(17,np)*0.035265 ! evaporation (kg/m**2) from (W m-2) - data2(12+nflx2) = sfc(12,np) ! total precip (m) - data2(13+nflx2) = sfc(11,np) ! convective precip (m) - data2(14+nflx2) = sfc(10,np) ! water equi. snow (m) - data2(15+nflx2) = sfc(27,np) ! low cloud (%) - data2(16+nflx2) = sfc(26,np) ! mid cloud - data2(17+nflx2) = sfc(25,np) ! high cloud - data2(18+nflx2) = sfc(34,np) ! U10 (m/s) - data2(19+nflx2) = sfc(35,np) ! V10 (m/s) - data2(20+nflx2) = sfc(30,np) ! T2 (K) - data2(21+nflx2) = sfc(31,np) ! Q2 (K) - - data2(22+nflx2) = 0. - data2(23+nflx2) = 0. - data2(24+nflx2) = 0. - data2(25+nflx2) = 0. - nd = 0 - trace = .false. - DOMS=0. - DOMR=0. - DOMIP=0. 
- DOMZR=0. - if(np==1.or.np==2) nd = 1 - if(np==1.or.np==2) trace = .true. - - if(sfc(12,np).gt.0.) then !check for precip then calc precip type - do k = 1, leveta+1 - pp = p1(np,k) - ppi = pi3(np,k) - t = grids(np,k+2) - q = max(0.,grids(np,2+k+levs)) - u = gridu(np,k) - v = gridv(np,k) - if(q.gt.1.e-6.and.pp.ge.20000.) then - call tdew(td,t,q,pp) - call lcl(tlcl,plcl,t,pp,q) - call mstadb(qw,tw,pp,q,tlcl,plcl) - else - td = t - 30. - tw = t - 30. - endif -! Calpreciptype input variables - gt0(1,k)= t - gq0(1,k) = q - prsl(1,k) = pp - prsi(1,k)=ppi - phii(1,k)=zp(np,k) !height in meters - enddo -! Use GFS routine calpreciptype.f to calculate precip type - xlat=rlat(np) - xlon=rlon(np) - lm=leveta - lp1=leveta+1 -!! PREC=data(12+nflx) - PREC=data2(12+nflx2) - n3dfercld=1 !if =3 then use Ferriers Explicit Precip Type - TSKIN=1. !used in Ferriers Explicit Precip Scheme - SR=1. !used in Ferriers Explicit Precip Scheme - iseedl=jdate - call random_setseed(iseedl) - call random_number(randomno) - call calpreciptype(1,1,1,1,lm,lp1,randomno,xlat,xlon, !input - & gt0,gq0,prsl,prsi,PREC,phii,n3dfercld,TSKIN,SR,phy_f3d, !input - & DOMR,DOMZR,DOMIP,DOMS) ! Output vars - endif -!! data(nflx + 22) = DOMS -!! data(nflx + 23) = DOMIP -!! data(nflx + 24) = DOMZR -!! data(nflx + 25) = DOMR - data2(nflx2 + 22) = DOMS - data2(nflx2 + 23) = DOMIP - data2(nflx2 + 24) = DOMZR - data2(nflx2 + 25) = DOMR - if(np==1.or.np==100) then - print *, ' surface fields for hour', nf, 'np =', np - print *, (data2(l+nflx2),l=1,25) - print *, ' temperature sounding' - print 6101, (data2((k-1)*6+8),k=1,levso) - print *, ' omega sounding' - print *, (data2((k-1)*6+12),k=1,levso) - endif -C print *, 'in meteorg nfile1= ', nfile1 -!! write(nfile) data - write(nfile) data2 - enddo !End loop over stations np - endif - call date_and_time(date,time,zone,clocking) -! 
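The station loop above packs every other model level into a flat, stride-6 record: for odd k, level kk = k/2 + 1 stores p, t, u, v, q, omega in slots 7–12 of its group, which is why the temperature sounding is later printed from slots (k-1)*6+8. A minimal Python sketch of that indexing — NVAR, OFFSET, and pack_level are illustrative names, not from the source:

```python
# Sketch of the flat-record indexing used by the bufr sounding packing above.
NVAR = 6     # variables per level: p, t, u, v, q, omega
OFFSET = 6   # slots 1..6 hold station metadata (id, lat, lon, elevation, ...)

def pack_level(record, kk, p, t, u, v, q, omega):
    """Store one output level kk (1-based) into the flat record (1-based slots)."""
    base = (kk - 1) * NVAR + OFFSET
    record[base + 1] = p
    record[base + 2] = t
    record[base + 3] = u
    record[base + 4] = v
    record[base + 5] = q
    record[base + 6] = omega

levso = 64
record = {}                          # dict stands in for the 1-based Fortran array
for k in range(1, 2 * levso, 2):     # odd model levels only, as in mod(k,2)>0
    kk = k // 2 + 1
    pack_level(record, kk, 1000.0 - k, 250.0, 5.0, -3.0, 1e-4, 0.1)

# The temperature sounding lives at slots (kk-1)*6+8, matching the print above.
temps = [record[(kk - 1) * 6 + 8] for kk in range(1, levso + 1)]
```

The dummy values (250.0 K everywhere, pressure 1000 - k) only exercise the index arithmetic, not real model output.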
print *,'13reading write data end= ', clocking - print *,'13date, time, zone',date, time, zone - print *, 'in meteorg nf,nfile,nfhour= ', nf,nfile,nfhour - print *, 'Finished writing bufr data file' - 6101 format(2x,6f12.3) - 6102 format(2x,6f12.5) - 6103 format(2x,6f12.5) -! - close(unit=nfile) - return - 910 print *, ' error reading surface flux file' - end - -!----------------------------------------------------------------------- diff --git a/sorc/gfs_bufr.fd/modstuff1.f b/sorc/gfs_bufr.fd/modstuff1.f deleted file mode 100644 index 95d4138334a..00000000000 --- a/sorc/gfs_bufr.fd/modstuff1.f +++ /dev/null @@ -1,75 +0,0 @@ - subroutine modstuff(km,idvc,idsl,nvcoord,vcoord,ps,psx,psy,d,u,v,& - pd,pm,om) -! pd,pi,pm,aps,apm,os,om,px,py) -!$$$ Subprogram documentation block -! -! Subprogram: modstuff Compute model coordinate dependent functions -! Prgmmr: Iredell Org: np23 Date: 1999-10-18 -! -! Abstract: This subprogram computes fields which depend on the model coordinate -! such as pressure thickness and vertical velocity. -! -! Program history log: -! 1999-10-18 Mark Iredell -! -! Usage: call modstuff(km,idvc,idsl,nvcoord,vcoord,ps,psx,psy,d,u,v,& -! pd,pi,pm,aps,apm,os,om,px,py) -! Input argument list: -! km integer number of levels -! idvc integer vertical coordinate id (1 for sigma and 2 for hybrid) -! idsl integer type of sigma structure (1 for phillips or 2 for mean) -! nvcoord integer number of vertical coordinates -! vcoord real (km+1,nvcoord) vertical coordinates -! ps real surface pressure (Pa) -! psx real log surface pressure x-gradient (1/m) -! psy real log surface pressure y-gradient (1/m) -! d real (km) wind divergence (1/s) -! u real (km) x-component wind (m/s) -! v real (km) y-component wind (m/s) -! Output argument list: -! pd real (km) pressure thickness (Pa) -! pi real (km+1) interface pressure (Pa) -! pm real (km) mid-layer pressure (Pa) -! aps real log surface pressure () -! apm real (km+1) log mid-layer pressure () -! 
os real (km) surface pressure tendency (Pa/s) -! om real (km) vertical velocity (Pa/s) -! px real (km) mid-layer pressure x-gradient (Pa/m) -! py real (km) mid-layer pressure y-gradient (Pa/m) -! -! Attributes: -! Language: Fortran 90 -! -!$$$ - use sigio_module - implicit none - integer,intent(in):: km,idvc,idsl,nvcoord - real,intent(in):: vcoord(km+1,nvcoord) - real,intent(in):: ps,psx,psy - real,intent(in):: u(km),v(km),d(km) - real,intent(out) :: pd(km),pm(km),om(km) - real aps,apm(km),os,pi(km+1),px(km),py(km) - real dpmdps(km),dpddps(km),dpidps(km+1),vgradp - integer k,iret -! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - call sigio_modpr(1,1,km,nvcoord,idvc,idsl,vcoord,iret,& - ps=(/ps/),& - pm=pm,pd=pd,dpmdps=dpmdps,dpddps=dpddps) - pi(1)=ps - dpidps(1)=1. - do k=1,km - pi(k+1)=pi(k)-pd(k) - dpidps(k+1)=dpidps(k)-dpddps(k) - enddo - aps=log(ps) - apm=log(pm) - os=0 - do k=km,1,-1 - vgradp=u(k)*psx+v(k)*psy - os=os-vgradp*ps*(dpmdps(k)-dpidps(k+1))-d(k)*(pm(k)-pi(k+1)) - om(k)=vgradp*ps*dpmdps(k)+os - os=os-vgradp*ps*(dpidps(k)-dpmdps(k))-d(k)*(pi(k)-pm(k)) - enddo - px=ps*dpmdps*psx - py=ps*dpmdps*psy - end subroutine diff --git a/sorc/gfs_bufr.fd/mstadb.f b/sorc/gfs_bufr.fd/mstadb.f deleted file mode 100644 index e9b01e09c6f..00000000000 --- a/sorc/gfs_bufr.fd/mstadb.f +++ /dev/null @@ -1,49 +0,0 @@ - SUBROUTINE MSTADB(Q2,T2,P2,Q1,T1,P1) -C -C THIS ROUTINE PROVIDES T2 AND QSAT AT T2 AT PRESSUE P2 THAT -C GIVES THE SAME EQUIVALENT POTENTIAL TEMPERATURE AS THE POINT -C ( T1, P1). FOR EASE OF COMPUTATION, Q1 IS REQUESTED -C - REAL L0, KAPPA - parameter (dtdp=4.5e-4,kappa=.286,g=9.81) - parameter (cp=1004.6,cl=4185.5,cpv=1846.0) - parameter (rv=461.5,l0=2.500e6,t0=273.16,es0=610.78) - parameter (cps=2106.0,hfus=3.3358e5,rd=287.05) - parameter (fact1=(CPV - CL) / RV,fact1i=(cps-cl)/rv) - parameter (fact2=(L0 + (CL - CPV) * T0) / RV) - parameter (fact2i=(L0 + hfus + (CL - cps) * T0) / RV) - parameter (fact3=1. 
/ T0,eps=rd/rv,tmix=t0-20.) - FUNC(QS,T) = EXP(L0 * QS / (CP * T)) - DESDT(ES,T) = ES * (FACT1 / T + FACT2 / T ** 2) - DESDTi(ES,T) = ES * (FACT1i / T + FACT2i / T ** 2) -C FIRST GUESS OF T2 - T2 = T1 + DTDP * (P2 - P1) - PFACT = (1.E5 / P2) ** KAPPA - CONST = T1 * (1.E5 / P1) ** KAPPA * FUNC(Q1,T1) - ITER = 0 -C ITERATION STARTS - 10 CALL SVP(Q2,E2,P2,T2) - FACT4 = FUNC(Q2,T2) - F = T2 * PFACT * FACT4 - CONST - if(t2.ge.t0) then - desdt2 = desdt(e2,t2) - elseif(t2.lt.tmix) then - desdt2 = desdti(e2,t2) - else - w = (t2 - tmix) / (t0 - tmix) - desdt2 = w * desdt(e2,t2) + (1.-w) * desdti(e2,t2) - endif - DQSDT = (Q2 / E2) * (P2 / (P2 - (1.-EPS) * E2)) * DESDT2 - DFDT = PFACT * FACT4 + PFACT * FACT4 * (L0 * DQSDT / CP - & - L0 * Q2 / (CP * T2)) - DT = - F / DFDT - T2 = T2 + DT - IF(ABS(DT).LT..1) GOTO 100 - ITER = ITER + 1 - IF(ITER.LT.50) GOTO 10 - WRITE(6,*) ' MSTADB ITERATION DIVERGED, PROGRAM STOPPED' - STOP 'ABEND 240' - 100 CONTINUE - CALL SVP(Q2,E2,P2,T2) - RETURN - END diff --git a/sorc/gfs_bufr.fd/newsig1.f b/sorc/gfs_bufr.fd/newsig1.f deleted file mode 100644 index 2b0b9ccb99a..00000000000 --- a/sorc/gfs_bufr.fd/newsig1.f +++ /dev/null @@ -1,65 +0,0 @@ -C----------------------------------------------------------------------- - SUBROUTINE NEWSIG(NSIL,IDVC,LEVS,NVCOORD,VCOORD,IRET) -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C -C SUBPROGRAM: NEWSIG GET NEW SIGMA STRUCTURE -C PRGMMR: IREDELL ORG: W/NMC23 DATE: 98-04-03 -C -C ABSTRACT: READ IN INTERFACE SIGMA VALUES (OR USE OLD VALUES) -C AND COMPUTE FULL SIGMA VALUES. 
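The MSTADB routine above is a classic Newton solve: a first-guess temperature T2 = T1 + DTDP*(P2-P1), an update DT = -F/DFDT, convergence when |DT| < 0.1 K, and an abort ('ABEND 240') after 50 iterations. This sketch applies the same loop shape to a simple stand-in function; the real thermodynamic F and DFDT of the original are not reproduced:

```python
# Newton iteration with the same stopping rules as MSTADB:
# accept when |DT| < tol, give up after max_iter steps.
def newton_scalar(f, dfdt, t_first_guess, tol=0.1, max_iter=50):
    t = t_first_guess
    for _ in range(max_iter):
        dt = -f(t) / dfdt(t)
        t += dt
        if abs(dt) < tol:
            return t
    raise RuntimeError("iteration diverged")  # mirrors 'ABEND 240'

# Stand-in problem: solve t**2 - 300**2 = 0 from a plausible first guess.
t2 = newton_scalar(lambda t: t * t - 90000.0, lambda t: 2.0 * t, 250.0)
```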
- -C PROGRAM HISTORY LOG: -C 98-04-03 IREDELL -C -C USAGE: CALL NEWSIG(NSIL,IDVC,LEVS,NVCOORD,VCOORD,IRET) -C INPUT ARGUMENTS: -C NSIL INTEGER UNIT NUMBER OF NEW SIGMA INTERFACE VALUES -C IDVC INTEGER VERTICAL COORDINATE ID -C LEVS INTEGER NEW NUMBER OF LEVELS -C NVCOORD INTEGER NEW NUMBER OF VERTICAL COORDINATES -C OUTPUT ARGUMENTS: -C VCOORD REAL (LEVS+1,NVCOORD) NEW VERTICAL COORDINATES -C IRET INTEGER RETURN CODE -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN -C -C$$$ - REAL VCOORD(LEVS+1,NVCOORD) -C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -C READ VERTICAL COORDINATES - READ(NSIL,*,IOSTAT=IRET) IDVCI,LEVSI,NVCOORDI - IF(IRET.EQ.0) THEN -C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - READ(NSIL,*,IOSTAT=IRET) ((VCOORD(K,N),N=1,NVCOORD),K=1,LEVS+1) - IF(IRET.NE.0) RETURN - IF(IDVCI.NE.IDVC.OR.LEVSI.NE.LEVS) IRET=28 - IF(NVCOORDI.NE.NVCOORD) IRET=28 - IF(IRET.NE.0) RETURN -C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -C READ INTERFACE HYBRID VALUES - ELSE - REWIND NSIL - READ(NSIL,*,IOSTAT=IRET) IDVCI - REWIND NSIL - IF(IRET.EQ.0.AND.(IDVCI.EQ.2.OR.IDVCI.EQ.3)) THEN - READ(NSIL,*,IOSTAT=IRET) IDVCI,LEVSI - READ(NSIL,*,IOSTAT=IRET) (VCOORD(K,1),VCOORD(K,2),K=1,LEVS+1) - IF(IRET.NE.0) RETURN - IF(IDVCI.NE.IDVC.OR.LEVSI.NE.LEVS) IRET=28 - IF(IRET.NE.0) RETURN -C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -C READ INTERFACE SIGMA VALUES - ELSE - VCOORD(1,1)=1. - VCOORD(LEVS+1,1)=0. 
- READ(NSIL,*,IOSTAT=IRET) LEVSI - READ(NSIL,*,IOSTAT=IRET) (VCOORD(K,1),K=2,LEVS) - IF(IRET.NE.0) RETURN - IF(LEVSI.NE.LEVS) IRET=28 - IF(IRET.NE.0) RETURN - ENDIF -C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ENDIF - IRET=0 - END diff --git a/sorc/gfs_bufr.fd/physcons.f b/sorc/gfs_bufr.fd/physcons.f deleted file mode 100644 index 03a0a8001d9..00000000000 --- a/sorc/gfs_bufr.fd/physcons.f +++ /dev/null @@ -1,40 +0,0 @@ -module physcons - use machine,only:kind_phys -! Physical constants as set in NMC handbook from Smithsonian tables. -! Physical constants are given to 5 places. -! 1990/04/30: g and rd are made consistent with NWS usage. -! 2001/10/22: g made consistent with SI usage. -! Math constants - real(kind=kind_phys),parameter:: con_pi =3.141593e+0 ! pi - real(kind=kind_phys),parameter:: con_sqrt2 =1.414214e+0 ! square root of 2 - real(kind=kind_phys),parameter:: con_sqrt3 =1.732051e+0 ! square root of 3 -! Primary constants - real(kind=kind_phys),parameter:: con_rerth =6.3712e+6 ! radius of earth (m) - real(kind=kind_phys),parameter:: con_g =9.80665e+0! gravity (m/s2) - real(kind=kind_phys),parameter:: con_omega =7.2921e-5 ! ang vel of earth (1/s) - real(kind=kind_phys),parameter:: con_rd =2.8705e+2 ! gas constant air (J/kg/K) - real(kind=kind_phys),parameter:: con_rv =4.6150e+2 ! gas constant H2O (J/kg/K) - real(kind=kind_phys),parameter:: con_cp =1.0046e+3 ! spec heat air @p (J/kg/K) - real(kind=kind_phys),parameter:: con_cv =7.1760e+2 ! spec heat air @v (J/kg/K) - real(kind=kind_phys),parameter:: con_cvap =1.8460e+3 ! spec heat H2O gas (J/kg/K) - real(kind=kind_phys),parameter:: con_cliq =4.1855e+3 ! spec heat H2O liq (J/kg/K) - real(kind=kind_phys),parameter:: con_csol =2.1060e+3 ! spec heat H2O ice (J/kg/K) - real(kind=kind_phys),parameter:: con_hvap =2.5000e+6 ! lat heat H2O cond (J/kg) - real(kind=kind_phys),parameter:: con_hfus =3.3358e+5 ! 
lat heat H2O fusion (J/kg) - real(kind=kind_phys),parameter:: con_psat =6.1078e+2 ! pres at H2O 3pt (Pa) - real(kind=kind_phys),parameter:: con_sbc =5.6730e-8 ! stefan-boltzmann (W/m2/K4) - real(kind=kind_phys),parameter:: con_solr =1.3533e+3 ! solar constant (W/m2) - real(kind=kind_phys),parameter:: con_t0c =2.7315e+2 ! temp at 0C (K) - real(kind=kind_phys),parameter:: con_ttp =2.7316e+2 ! temp at H2O 3pt (K) - real(kind=kind_phys),parameter:: con_epsq =1.0E-12 ! min q for computing precip type -! Secondary constants - real(kind=kind_phys),parameter:: con_rocp =con_rd/con_cp - real(kind=kind_phys),parameter:: con_cpor =con_cp/con_rd - real(kind=kind_phys),parameter:: con_rog =con_rd/con_g - real(kind=kind_phys),parameter:: con_fvirt =con_rv/con_rd-1. - real(kind=kind_phys),parameter:: con_eps =con_rd/con_rv - real(kind=kind_phys),parameter:: con_epsm1 =con_rd/con_rv-1. - real(kind=kind_phys),parameter:: con_dldt =con_cvap-con_cliq - real(kind=kind_phys),parameter:: con_xpona =-con_dldt/con_rv - real(kind=kind_phys),parameter:: con_xponb =-con_dldt/con_rv+con_hvap/(con_rv*con_ttp) -end module diff --git a/sorc/gfs_bufr.fd/read_nemsio.f b/sorc/gfs_bufr.fd/read_nemsio.f deleted file mode 100644 index d1262e7974e..00000000000 --- a/sorc/gfs_bufr.fd/read_nemsio.f +++ /dev/null @@ -1,55 +0,0 @@ - subroutine read_nemsio(gfile,im,jm,levs, - & VarName,LayName,Varout,iret) -!! This subroutine reads either 2d or 3d nemsio data -!! 
12/12/2019 Guang Ping Lou - - use nemsio_module - implicit none - include 'mpif.h' - type(nemsio_gfile) :: gfile - character(len=20) :: VarName,LayName - integer,intent(in) :: im,jm,levs - real,intent(out) :: Varout(im,jm,levs) - real,dimension(im*jm) :: dum1d - integer :: iret,i,j,k,jj - - print*,'read_nemsio,im,jm,levs' - print*, im,jm,levs - print*,'VarName=',trim(VarName) - print*,'LayName=',trim(LayName) - if(levs > 1) then - do k =1, levs - call nemsio_readrecvw34(gfile,trim(VarName), - & trim(LayName),k,data=dum1d,iret=iret) - !print*,"VarName,k= ",trim(VarName), k - if (iret /= 0) then - print*,trim(VarName)," not found" - else - do j=1,jm - jj= (j-1)*im - do i=1,im - Varout(i,j,k) = dum1d(jj+i) - end do - end do - end if - enddo - - else - call nemsio_readrecvw34(gfile,trim(VarName), - & trim(LayName),1,data=dum1d,iret=iret) - !print*,"VarName= ",trim(VarName) - if (iret /= 0) then - print*,trim(VarName)," not found" - else - do j=1,jm - jj= (j-1)*im - do i=1,im - Varout(i,j,1) = dum1d(jj+i) - end do - end do - endif - - end if - - end subroutine read_nemsio - diff --git a/sorc/gfs_bufr.fd/read_netcdf.f b/sorc/gfs_bufr.fd/read_netcdf.f deleted file mode 100644 index a024323b314..00000000000 --- a/sorc/gfs_bufr.fd/read_netcdf.f +++ /dev/null @@ -1,55 +0,0 @@ - subroutine read_netcdf(ncid,im,jm,levs, - & VarName,Varout,Zreverse,iret) -!! This subroutine reads either 2d or 3d NetCDF data -!! 
12/12/2019 Guang Ping Lou - - use netcdf - implicit none - include 'mpif.h' - character(len=20),intent(in) :: VarName - character(len=3),intent(in) :: Zreverse - integer,intent(in) :: ncid,im,jm,levs - real,intent(out) :: Varout(im,jm,levs) - real :: dummy3d(im,jm,levs) - integer :: iret,i,j,k,id_var,kk - - if(levs > 1) then - iret = nf90_inq_varid(ncid,trim(VarName),id_var) - !print*,stat,varname,id_var - iret = nf90_get_var(ncid,id_var,dummy3d) - if (iret /= 0) then - print*,VarName," not found" - else -!For FV3GFS NetCDF output, vertical layers need to be reversed - if(Zreverse == "yes" ) then - do k = 1, levs - kk=levs-k+1 - do j=1, jm - do i=1, im - Varout(i,j,k) = dummy3d(i,j,kk) - enddo - enddo - enddo - else - do k = 1, levs - do j=1, jm - do i=1, im - Varout(i,j,k) = dummy3d(i,j,k) - enddo - enddo - enddo - endif - endif - - else - iret = nf90_inq_varid(ncid,trim(VarName),id_var) - !print*,stat,varname,id_var - iret = nf90_get_var(ncid,id_var,Varout(:,:,1)) - if (iret /= 0) then - print*,VarName," not found" - endif - - end if - - end subroutine read_netcdf - diff --git a/sorc/gfs_bufr.fd/read_netcdf_p.f b/sorc/gfs_bufr.fd/read_netcdf_p.f deleted file mode 100644 index 4bfa8507be4..00000000000 --- a/sorc/gfs_bufr.fd/read_netcdf_p.f +++ /dev/null @@ -1,113 +0,0 @@ - subroutine read_netcdf_p(ncid,im,jm,levs, - & VarName,Varout,Zreverse,iope,ionproc, - & iocomms,iret) -!! This subroutine reads either 2d or 3d NetCDF data in parallel -!! 02/08/2020 Guang Ping Lou - - use netcdf - use mpi - implicit none -!! 
include 'mpif.h' - character(len=20),intent(in) :: VarName - character(len=3),intent(in) :: Zreverse - integer,intent(in) :: ncid,im,jm,levs - real,intent(out) :: Varout(im,jm,levs) - real :: dummy3d(im,jm,levs) - integer :: iret,i,j,k,id_var,kk - integer :: iope,ionproc,iocomms - integer :: chunksize,ionproc1 - real, allocatable :: dummy(:,:,:) - integer start(3), count(3) - integer nskip - integer, allocatable :: starts(:) - integer, allocatable :: counts(:) - integer, allocatable :: chunksizes(:) - integer, allocatable :: rdispls(:) - integer, allocatable :: ii(:) - - if(levs > 1) then - nskip = int(levs/ionproc) + 1 - k=ionproc*nskip - if(k > levs) then - kk=(k-levs)/nskip - ionproc1=ionproc - kk - else - ionproc1=ionproc - endif - iret = nf90_inq_varid(ncid,trim(VarName),id_var) - allocate(starts(ionproc1), counts(ionproc1),ii(ionproc1)) - allocate(chunksizes(ionproc1)) - allocate(rdispls(ionproc1)) - print*,'ionproc,ionproc1,nskip= ',ionproc,ionproc1, nskip - print*,'trim(VarName)in read= ',trim(VarName) - starts(1) = 1 - ii(1) = 1 - do i = 2, ionproc1 - starts(i) = 1 + (i-1)*nskip - ii(i)= ii(i-1) + 1 - end do - do i=1, ionproc1 - 1 - counts(i) = starts(i+1) - starts(i) - end do - counts(ionproc1) = levs - starts(ionproc1)+1 - print*,'starts= ',starts - print*, 'counts= ', counts - k=ii(iope+1) - start = (/1,1,starts(k)/) - count = (/im,jm,counts(k)/) - chunksizes(:) = im * jm * counts(:) - rdispls(:) = im * jm * (starts(:)-1) - print*, 'iope,k,start,count= ',iope,k,start(3),count(3) - print*, 'chunksizes= ', chunksizes - print*, 'rdispls= ', rdispls - allocate (dummy(im,jm,count(3))) - iret=nf90_get_var(ncid,id_var,dummy, - & start=start,count=count) - if (iret /= 0) then - print*,VarName," not found" - endif - print*,'start(3),st(3):cnt(3)-1=',start(3),(start(3)+count(3)-1) - print*,'dummy(im/2,jm/2,:)= ', dummy(im/2,jm/2,:) - call mpi_allgatherv(dummy,chunksizes(k),mpi_real,dummy3d, - & chunksizes, rdispls, mpi_real, iocomms, iret) - print*,'VarName= ', 
VarName - print*,'dummy3d(im/2,jm/2,:)= ', dummy3d(im/2,jm/2,:) -!! call mpi_alltoallv(dummy, chunksizes, sdispls, mpi_real, dummy3d, -!! & chunksizes, rdispls, mpi_real, iocomms, iret) - -! enddo -!For FV3GFS NetCDF output, vertical layers need to be reversed - if(Zreverse == "yes" ) then - do k = 1, levs - kk=levs-k+1 - do j=1, jm - do i=1, im - Varout(i,j,k) = dummy3d(i,j,kk) - enddo - enddo - enddo - else - do k = 1, levs - do j=1, jm - do i=1, im - Varout(i,j,k) = dummy3d(i,j,k) - enddo - enddo - enddo - endif - deallocate(starts, counts,ii) - deallocate(chunksizes) - deallocate(rdispls) - deallocate (dummy) - - else - iret = nf90_inq_varid(ncid,trim(VarName),id_var) - print*,'trim(VarName)in read= ',trim(VarName) - iret = nf90_get_var(ncid,id_var,Varout(:,:,1)) - if (iret /= 0) then - print*,VarName," not found" - endif - - end if - end subroutine read_netcdf_p - diff --git a/sorc/gfs_bufr.fd/rsearch.f b/sorc/gfs_bufr.fd/rsearch.f deleted file mode 100644 index 73141facf5a..00000000000 --- a/sorc/gfs_bufr.fd/rsearch.f +++ /dev/null @@ -1,145 +0,0 @@ -C----------------------------------------------------------------------- - SUBROUTINE RSEARCH(IM,KM1,IXZ1,KXZ1,Z1,KM2,IXZ2,KXZ2,Z2,IXL2,KXL2, - & L2) -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C -C SUBPROGRAM: RSEARCH SEARCH FOR A SURROUNDING REAL INTERVAL -C PRGMMR: IREDELL ORG: W/NMC23 DATE: 98-05-01 -C -C ABSTRACT: THIS SUBPROGRAM SEARCHES MONOTONIC SEQUENCES OF REAL NUMBERS -C FOR INTERVALS THAT SURROUND A GIVEN SEARCH SET OF REAL NUMBERS. -C THE SEQUENCES MAY BE MONOTONIC IN EITHER DIRECTION; THE REAL NUMBERS -C MAY BE SINGLE OR DOUBLE PRECISION; THE INPUT SEQUENCES AND SETS -C AND THE OUTPUT LOCATIONS MAY BE ARBITRARILY DIMENSIONED. 
-C -C PROGRAM HISTORY LOG: -C 1999-01-05 MARK IREDELL -C -C USAGE: CALL RSEARCH(IM,KM1,IXZ1,KXZ1,Z1,KM2,IXZ2,KXZ2,Z2,IXL2,KXL2, -C & L2) -C INPUT ARGUMENT LIST: -C IM INTEGER NUMBER OF SEQUENCES TO SEARCH -C KM1 INTEGER NUMBER OF POINTS IN EACH SEQUENCE -C IXZ1 INTEGER SEQUENCE SKIP NUMBER FOR Z1 -C KXZ1 INTEGER POINT SKIP NUMBER FOR Z1 -C Z1 REAL (1+(IM-1)*IXZ1+(KM1-1)*KXZ1) -C SEQUENCE VALUES TO SEARCH -C (Z1 MUST BE MONOTONIC IN EITHER DIRECTION) -C KM2 INTEGER NUMBER OF POINTS TO SEARCH FOR -C IN EACH RESPECTIVE SEQUENCE -C IXZ2 INTEGER SEQUENCE SKIP NUMBER FOR Z2 -C KXZ2 INTEGER POINT SKIP NUMBER FOR Z2 -C Z2 REAL (1+(IM-1)*IXZ2+(KM2-1)*KXZ2) -C SET OF VALUES TO SEARCH FOR -C (Z2 NEED NOT BE MONOTONIC) -C IXL2 INTEGER SEQUENCE SKIP NUMBER FOR L2 -C KXL2 INTEGER POINT SKIP NUMBER FOR L2 -C -C OUTPUT ARGUMENT LIST: -C L2 INTEGER (1+(IM-1)*IXL2+(KM2-1)*KXL2) -C INTERVAL LOCATIONS HAVING VALUES FROM 0 TO KM1 -C (Z2 WILL BE BETWEEN Z1(L2) AND Z1(L2+1)) -C -C SUBPROGRAMS CALLED: -C SBSRCH ESSL BINARY SEARCH -C DBSRCH ESSL BINARY SEARCH -C -C REMARKS: -C IF THE ARRAY Z1 IS DIMENSIONED (IM,KM1), THEN THE SKIP NUMBERS ARE -C IXZ1=1 AND KXZ1=IM; IF IT IS DIMENSIONED (KM1,IM), THEN THE SKIP -C NUMBERS ARE IXZ1=KM1 AND KXZ1=1; IF IT IS DIMENSIONED (IM,JM,KM1), -C THEN THE SKIP NUMBERS ARE IXZ1=1 AND KXZ1=IM*JM; ETCETERA. -C SIMILAR EXAMPLES APPLY TO THE SKIP NUMBERS FOR Z2 AND L2. -C -C RETURNED VALUES OF 0 OR KM1 INDICATE THAT THE GIVEN SEARCH VALUE -C IS OUTSIDE THE RANGE OF THE SEQUENCE. -C -C IF A SEARCH VALUE IS IDENTICAL TO ONE OF THE SEQUENCE VALUES -C THEN THE LOCATION RETURNED POINTS TO THE IDENTICAL VALUE. -C IF THE SEQUENCE IS NOT STRICTLY MONOTONIC AND A SEARCH VALUE IS -C IDENTICAL TO MORE THAN ONE OF THE SEQUENCE VALUES, THEN THE -C LOCATION RETURNED MAY POINT TO ANY OF THE IDENTICAL VALUES. 
-C -C TO BE EXACT, FOR EACH I FROM 1 TO IM AND FOR EACH K FROM 1 TO KM2, -C Z=Z2(1+(I-1)*IXZ2+(K-1)*KXZ2) IS THE SEARCH VALUE AND -C L=L2(1+(I-1)*IXL2+(K-1)*KXL2) IS THE LOCATION RETURNED. -C IF L=0, THEN Z IS LESS THAN THE START POINT Z1(1+(I-1)*IXZ1) -C FOR ASCENDING SEQUENCES (OR GREATER THAN FOR DESCENDING SEQUENCES). -C IF L=KM1, THEN Z IS GREATER THAN OR EQUAL TO THE END POINT -C Z1(1+(I-1)*IXZ1+(KM1-1)*KXZ1) FOR ASCENDING SEQUENCES -C (OR LESS THAN OR EQUAL TO FOR DESCENDING SEQUENCES). -C OTHERWISE Z IS BETWEEN THE VALUES Z1(1+(I-1)*IXZ1+(L-1)*KXZ1) AND -C Z1(1+(I-1)*IXZ1+(L-0)*KXZ1) AND MAY EQUAL THE FORMER. -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN -C -C$$$ - IMPLICIT NONE - INTEGER,INTENT(IN):: IM,KM1,IXZ1,KXZ1,KM2,IXZ2,KXZ2,IXL2,KXL2 - REAL,INTENT(IN):: Z1(1+(IM-1)*IXZ1+(KM1-1)*KXZ1) - REAL,INTENT(IN):: Z2(1+(IM-1)*IXZ2+(KM2-1)*KXZ2) - INTEGER,INTENT(OUT):: L2(1+(IM-1)*IXL2+(KM2-1)*KXL2) - INTEGER(4) INCX,N,INCY,M,INDX(KM2),RC(KM2),IOPT - INTEGER I,K1,K2,CT -C - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -C FIND THE SURROUNDING INPUT INTERVAL FOR EACH OUTPUT POINT. - print*, IM,KM1,KM2,INCX,INCY - DO I=1,IM - IF(Z1(1+(I-1)*IXZ1).LE.Z1(1+(I-1)*IXZ1+(KM1-1)*KXZ1)) THEN -C INPUT COORDINATE IS MONOTONICALLY ASCENDING. - INCX=KXZ2 - N=KM2 - INCY=KXZ1 - M=KM1 - IOPT=1 -! IF(DIGITS(1.).LT.DIGITS(1._8)) THEN -! CALL SBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ELSE -! CALL DBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ENDIF -! DO K2=1,KM2 -! L2(1+(I-1)*IXL2+(K2-1)*KXL2)=INDX(K2)-RC(K2) -! ENDDO - DO K2=1,KM2 - L2(K2)=KM1 - DO K1=(1+(I-1)*IXZ1),(1+(I-1)*IXZ1+(KM1-1)*KXZ1)-1 - IF(Z1(K1)>=Z2(K2).AND.Z1(K1+1)>Z2(K2)) THEN - L2(K2)=K1 - EXIT - ENDIF - ENDDO - ENDDO - ELSE -C INPUT COORDINATE IS MONOTONICALLY DESCENDING. - INCX=KXZ2 - N=KM2 - INCY=-KXZ1 - M=KM1 - IOPT=0 -! IF(DIGITS(1.).LT.DIGITS(1._8)) THEN -! CALL SBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! 
& Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ELSE -! CALL DBSRCH(Z2(1+(I-1)*IXZ2),INCX,N, -! & Z1(1+(I-1)*IXZ1),INCY,M,INDX,RC,IOPT) -! ENDIF -! DO K2=1,KM2 -! L2(1+(I-1)*IXL2+(K2-1)*KXL2)=KM1+1-INDX(K2) -! ENDDO - DO K2=1,KM2 - L2(K2)=KM1 - CT=0 - DO K1=(1+(I-1)*IXZ1+(KM1-1)*KXZ1),(1+(I-1)*IXZ1)+1,-1 - CT=CT+1 - IF(Z2(K2)<=Z1(K1).AND.Z2(K2) /dev/null && pwd) +top_dir=$(cd "$(dirname "${script_dir}")" &> /dev/null && pwd) +cd "${script_dir}" -if [ $RUN_ENVIR != emc -a $RUN_ENVIR != nco ]; then - echo ' Syntax: link_workflow.sh ( nco | emc ) ( hera | orion | jet | stampede )' - exit 1 -fi -if [ $machine != hera -a $machine != orion -a $machine != jet -a $machine != stampede ]; then - echo ' Syntax: link_workflow.sh ( nco | emc ) ( hera | orion | jet | stampede )' +export COMPILER="intel" +# shellcheck disable=SC1091 +source gfs_utils.fd/ush/detect_machine.sh # (sets MACHINE_ID) +# shellcheck disable= +machine=$(echo "${MACHINE_ID}" | cut -d. -f1) + +#------------------------------ +#--model fix fields +#------------------------------ +case "${machine}" in + "wcoss2") FIX_DIR="/lfs/h2/emc/global/noscrub/emc.global/FIX/fix" ;; + "hera") FIX_DIR="/scratch1/NCEPDEV/global/glopara/fix" ;; + "orion") FIX_DIR="/work/noaa/global/glopara/fix" ;; + "jet") FIX_DIR="/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix" ;; + "s4") FIX_DIR="/data/prod/glopara/fix" ;; + *) + echo "FATAL: Unknown target machine ${machine}, couldn't set FIX_DIR" exit 1 -fi + ;; +esac + +# Source fix version file +source "${top_dir}/versions/fix.ver" LINK="ln -fs" SLINK="ln -fs" -[[ $RUN_ENVIR = nco ]] && LINK="cp -rp" - -pwd=$(pwd -P) +if [[ "${RUN_ENVIR}" == "nco" ]]; then + LINK="cp -rp" +fi # Link post [[ -d upp.fd ]] && rm -rf upp.fd -$LINK ufs_model.fd/FV3/upp upp.fd - -#------------------------------ -#--model fix fields -#------------------------------ -if [ $machine = "hera" ]; then - FIX_DIR="/scratch1/NCEPDEV/global/glopara/fix_NEW" -elif [ $machine = "orion" ]; then - 
FIX_DIR="/work/noaa/global/glopara/fix_NEW" -elif [ $machine = "jet" ]; then - FIX_DIR="/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix_NEW" -elif [ $machine = "stampede" ]; then - FIX_DIR="/work/07738/jkuang/stampede2/tempFixICdir/fix_UFSp6" -fi +${LINK} ufs_model.fd/FV3/upp upp.fd -if [ ! -z $FIX_DIR ]; then - if [ ! -d ${pwd}/../fix ]; then mkdir ${pwd}/../fix; fi +if [[ -n "${FIX_DIR}" ]]; then + if [[ ! -d "${top_dir}/fix" ]]; then mkdir "${top_dir}/fix" || exit 1; fi fi -cd ${pwd}/../fix ||exit 8 -for dir in fix_aer \ - fix_am \ - fix_chem \ - fix_fv3_gmted2010 \ - fix_gldas \ - fix_lut \ - fix_fv3_fracoro \ - fix_orog \ - fix_sfc_climo \ - fix_verif \ - fix_cice \ - fix_mom6 \ - fix_cpl \ - fix_wave \ - fix_reg2grb2 \ - fix_ugwd +cd "${top_dir}/fix" || exit 1 +for dir in aer \ + am \ + chem \ + cice \ + cpl \ + datm \ + gldas \ + gsi \ + lut \ + mom6 \ + orog \ + reg2grb2 \ + sfc_climo \ + ugwd \ + verif \ + wave do - if [ -d $dir ]; then - [[ $RUN_ENVIR = nco ]] && chmod -R 755 $dir - rm -rf $dir + if [[ -d "${dir}" ]]; then + [[ "${RUN_ENVIR}" == "nco" ]] && chmod -R 755 "${dir}" + rm -rf "${dir}" fi - $LINK $FIX_DIR/$dir . + fix_ver="${dir}_ver" + ${LINK} "${FIX_DIR}/${dir}/${!fix_ver}" "${dir}" done -if [ -d ${pwd}/ufs_utils.fd ]; then - cd ${pwd}/ufs_utils.fd/fix - ./link_fixdirs.sh $RUN_ENVIR $machine + +if [[ -d "${script_dir}/ufs_utils.fd" ]]; then + cd "${script_dir}/ufs_utils.fd/fix" || exit 1 + ./link_fixdirs.sh "${RUN_ENVIR}" "${machine}" 2> /dev/null fi #--------------------------------------- #--add files from external repositories #--------------------------------------- -cd ${pwd}/../jobs ||exit 8 -if [ -d ../sorc/gldas.fd ]; then - $LINK ../sorc/gldas.fd/jobs/JGDAS_ATMOS_GLDAS . 
-fi -cd ${pwd}/../parm ||exit 8 - # [[ -d post ]] && rm -rf post - # $LINK ../sorc/upp.fd/parm post - if [ -d ../sorc/gldas.fd ]; then +cd "${top_dir}/parm" || exit 1 + if [[ -d "${script_dir}/gldas.fd" ]]; then [[ -d gldas ]] && rm -rf gldas - $LINK ../sorc/gldas.fd/parm gldas + ${LINK} "${script_dir}/gldas.fd/parm" gldas fi -cd ${pwd}/../parm/post ||exit 8 +cd "${top_dir}/parm/post" || exit 1 for file in postxconfig-NT-GEFS-ANL.txt postxconfig-NT-GEFS-F00.txt postxconfig-NT-GEFS.txt postxconfig-NT-GFS-ANL.txt \ postxconfig-NT-GFS-F00-TWO.txt postxconfig-NT-GFS-F00.txt postxconfig-NT-GFS-FLUX-F00.txt postxconfig-NT-GFS-FLUX.txt \ postxconfig-NT-GFS-GOES.txt postxconfig-NT-GFS-TWO.txt postxconfig-NT-GFS-WAFS-ANL.txt postxconfig-NT-GFS-WAFS.txt \ @@ -100,306 +132,318 @@ cd ${pwd}/../parm/post ||exit 8 post_tag_gfs128 post_tag_gfs65 gtg.config.gfs gtg_imprintings.txt nam_micro_lookup.dat \ AEROSOL_LUTS.dat optics_luts_DUST.dat optics_luts_SALT.dat optics_luts_SOOT.dat optics_luts_SUSO.dat optics_luts_WASO.dat \ ; do - $LINK ../../sorc/upp.fd/parm/$file . + ${LINK} "${script_dir}/upp.fd/parm/${file}" . done -cd ${pwd}/../scripts ||exit 8 - $LINK ../sorc/ufs_utils.fd/scripts/exemcsfc_global_sfc_prep.sh . - if [ -d ../sorc/gldas.fd ]; then - $LINK ../sorc/gldas.fd/scripts/exgdas_atmos_gldas.sh . - fi -cd ${pwd}/../ush ||exit 8 + +cd "${top_dir}/scripts" || exit 8 + ${LINK} "${script_dir}/ufs_utils.fd/scripts/exemcsfc_global_sfc_prep.sh" . +cd "${top_dir}/ush" || exit 8 for file in emcsfc_ice_blend.sh fv3gfs_driver_grid.sh fv3gfs_make_orog.sh global_cycle_driver.sh \ emcsfc_snow.sh fv3gfs_filter_topo.sh global_cycle.sh fv3gfs_make_grid.sh ; do - $LINK ../sorc/ufs_utils.fd/ush/$file . + ${LINK} "${script_dir}/ufs_utils.fd/ush/${file}" . + done + for file in finddate.sh make_ntc_bull.pl make_NTC_file.pl make_tif.sh month_name.sh ; do + ${LINK} "${script_dir}/gfs_utils.fd/ush/${file}" . 
done - if [ -d ../sorc/gldas.fd ]; then - for file in gldas_archive.sh gldas_forcing.sh gldas_get_data.sh gldas_process_data.sh gldas_liscrd.sh gldas_post.sh ; do - $LINK ../sorc/gldas.fd/ush/$file . - done - fi - #----------------------------------- #--add gfs_wafs link if checked out -if [ -d ${pwd}/gfs_wafs.fd ]; then +if [[ -d "${script_dir}/gfs_wafs.fd" ]]; then #----------------------------------- - cd ${pwd}/../jobs ||exit 8 - $LINK ../sorc/gfs_wafs.fd/jobs/* . - cd ${pwd}/../parm ||exit 8 + cd "${top_dir}/jobs" || exit 1 + ${LINK} "${script_dir}/gfs_wafs.fd/jobs"/* . + cd "${top_dir}/parm" || exit 1 [[ -d wafs ]] && rm -rf wafs - $LINK ../sorc/gfs_wafs.fd/parm/wafs wafs - cd ${pwd}/../scripts ||exit 8 - $LINK ../sorc/gfs_wafs.fd/scripts/* . - cd ${pwd}/../ush ||exit 8 - $LINK ../sorc/gfs_wafs.fd/ush/* . - cd ${pwd}/../fix ||exit 8 + ${LINK} "${script_dir}/gfs_wafs.fd/parm/wafs" wafs + cd "${top_dir}/scripts" || exit 1 + ${LINK} "${script_dir}/gfs_wafs.fd/scripts"/* . + cd "${top_dir}/ush" || exit 1 + ${LINK} "${script_dir}/gfs_wafs.fd/ush"/* . + cd "${top_dir}/fix" || exit 1 [[ -d wafs ]] && rm -rf wafs - $LINK ../sorc/gfs_wafs.fd/fix/* . + ${LINK} "${script_dir}/gfs_wafs.fd/fix"/* . fi -#------------------------------ -#--add GSI fix directory -#------------------------------ -if [ -d ../sorc/gsi_enkf.fd ]; then - cd ${pwd}/../fix ||exit 8 - [[ -d fix_gsi ]] && rm -rf fix_gsi - $LINK ../sorc/gsi_enkf.fd/fix fix_gsi -fi - #------------------------------ #--add GDASApp fix directory #------------------------------ -if [ -d ../sorc/gdas.cd ]; then - cd ${pwd}/../fix ||exit 8 - [[ -d fix_gdas ]] && rm -rf fix_gdas - $LINK $FIX_DIR/fix_gdas . +if [[ -d "${script_dir}/gdas.cd" ]]; then + cd "${top_dir}/fix" || exit 1 + [[ ! 
-d gdas ]] && mkdir -p gdas + cd gdas || exit 1 + for gdas_sub in crtm fv3jedi gsibec; do + if [[ -d "${gdas_sub}" ]]; then + rm -rf "${gdas_sub}" + fi + fix_ver="gdas_${gdas_sub}_ver" + ${LINK} "${FIX_DIR}/gdas/${gdas_sub}/${!fix_ver}" "${gdas_sub}" + done fi #------------------------------ #--add GDASApp files #------------------------------ -if [ -d ../sorc/gdas.cd ]; then - cd ${pwd}/../ush ||exit 8 - $LINK ../sorc/gdas.cd/ush/ufsda . +if [[ -d "${script_dir}/gdas.cd" ]]; then + cd "${top_dir}/ush" || exit 1 + ${LINK} "${script_dir}/gdas.cd/ush/ufsda" . + ${LINK} "${script_dir}/gdas.cd/ush/jediinc2fv3.py" . fi #------------------------------ #--add DA Monitor file (NOTE: ensure to use correct version) #------------------------------ -if [ -d ../sorc/gsi_monitor.fd ]; then - - cd ${pwd}/../fix ||exit 8 - [[ -d gdas ]] && rm -rf gdas - mkdir -p gdas - cd gdas - $LINK ../../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gdas/fix/gdas_minmon_cost.txt . - $LINK ../../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gdas/fix/gdas_minmon_gnorm.txt . - $LINK ../../sorc/gsi_monitor.fd/src/Ozone_Monitor/nwprod/gdas_oznmon/fix/gdas_oznmon_base.tar . - $LINK ../../sorc/gsi_monitor.fd/src/Ozone_Monitor/nwprod/gdas_oznmon/fix/gdas_oznmon_satype.txt . - $LINK ../../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/fix/gdas_radmon_base.tar . - $LINK ../../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/fix/gdas_radmon_satype.txt . - $LINK ../../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/fix/gdas_radmon_scaninfo.txt . - cd ${pwd}/../jobs ||exit 8 - $LINK ../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gdas/jobs/JGDAS_ATMOS_VMINMON . - $LINK ../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gfs/jobs/JGFS_ATMOS_VMINMON . - $LINK ../sorc/gsi_monitor.fd/src/Ozone_Monitor/nwprod/gdas_oznmon/jobs/JGDAS_ATMOS_VERFOZN . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/jobs/JGDAS_ATMOS_VERFRAD . 
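The fix-linking loops added above (fix_ver="${dir}_ver" and fix_ver="gdas_${gdas_sub}_ver", then ${!fix_ver}) rely on bash indirect expansion: the loop builds the *name* of a per-directory version variable sourced from versions/fix.ver, then expands its value. A small sketch with placeholder version values — am_ver and orog_ver here are illustrative, not the real fix.ver contents:

```shell
# Indirect expansion: ${!fix_ver} expands the variable whose name is in fix_ver.
am_ver="20220805"
orog_ver="20220805"

for dir in am orog; do
  fix_ver="${dir}_ver"                 # name of the per-directory version variable
  echo "would link FIX_DIR/${dir}/${!fix_ver} -> ${dir}"
done
```

This is what lets a single loop link every fix subdirectory at its own pinned version without hard-coding one variable per directory.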
- cd ${pwd}/../parm ||exit 8 +if [[ -d "${script_dir}/gsi_monitor.fd" ]]; then + + cd "${top_dir}/fix" || exit 1 + if [[ ! -d gdas ]]; then mkdir -p gdas || exit 1; fi + cd gdas || exit 1 + ${LINK} "${script_dir}/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gdas/fix/gdas_minmon_cost.txt" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gdas/fix/gdas_minmon_gnorm.txt" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Ozone_Monitor/nwprod/gdas_oznmon/fix/gdas_oznmon_base.tar" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Ozone_Monitor/nwprod/gdas_oznmon/fix/gdas_oznmon_satype.txt" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/fix/gdas_radmon_base.tar" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/fix/gdas_radmon_satype.txt" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/fix/gdas_radmon_scaninfo.txt" . + cd "${top_dir}/parm" || exit 1 [[ -d mon ]] && rm -rf mon mkdir -p mon
- cd ${pwd}/../ush ||exit 8 - $LINK ../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/minmon_shared/ush/minmon_xtrct_costs.pl . - $LINK ../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/minmon_shared/ush/minmon_xtrct_gnorms.pl . - $LINK ../sorc/gsi_monitor.fd/src/Minimization_Monitor/nwprod/minmon_shared/ush/minmon_xtrct_reduct.pl . - $LINK ../sorc/gsi_monitor.fd/src/Ozone_Monitor/nwprod/oznmon_shared/ush/ozn_xtrct.sh . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/ush/radmon_err_rpt.sh . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/ush/radmon_verf_angle.sh . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/ush/radmon_verf_bcoef.sh . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/ush/radmon_verf_bcor.sh . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/ush/radmon_verf_time.sh . - $LINK ../sorc/gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/ush/rstprod.sh . + cd mon || exit 1 + ${LINK} "${script_dir}/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/parm/gdas_radmon.parm" da_mon.parm + # ${LINK} "${script_dir}/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gdas/parm/gdas_minmon.parm" . + # ${LINK} "${script_dir}/gsi_monitor.fd/src/Minimization_Monitor/nwprod/gfs/parm/gfs_minmon.parm" . + ${LINK} "${script_dir}/gsi_monitor.fd/src/Ozone_Monitor/nwprod/gdas_oznmon/parm/gdas_oznmon.parm" . + # ${LINK} "${script_dir}/gsi_monitor.fd/src/Radiance_Monitor/nwprod/gdas_radmon/parm/gdas_radmon.parm" . fi #------------------------------ #--link executables #------------------------------ -if [ ! -d $pwd/../exec ]; then mkdir $pwd/../exec ; fi -cd $pwd/../exec +if [[ ! 
-d "${top_dir}/exec" ]]; then mkdir "${top_dir}/exec" || exit 1 ; fi +cd "${top_dir}/exec" || exit 1 -[[ -s gaussian_sfcanl.exe ]] && rm -f gaussian_sfcanl.exe -$LINK ../sorc/install/bin/gaussian_sfcanl.x gaussian_sfcanl.exe -for workflowexec in fbwndgfs gfs_bufr regrid_nemsio supvit syndat_getjtbul \ - syndat_maksynrc syndat_qctropcy tocsbufr ; do - [[ -s $workflowexec ]] && rm -f $workflowexec - $LINK ../sorc/install/bin/${workflowexec}.x $workflowexec -done -for workflowexec in enkf_chgres_recenter.x enkf_chgres_recenter_nc.x fv3nc2nemsio.x \ - tave.x vint.x reg2grb2.x ; do - [[ -s $workflowexec ]] && rm -f $workflowexec - $LINK ../sorc/install/bin/$workflowexec . +for utilexe in fbwndgfs.x gaussian_sfcanl.x gfs_bufr.x regrid_nemsio.x supvit.x syndat_getjtbul.x \ + syndat_maksynrc.x syndat_qctropcy.x tocsbufr.x enkf_chgres_recenter.x overgridid.x \ + mkgfsawps.x enkf_chgres_recenter_nc.x fv3nc2nemsio.x tave.x vint.x reg2grb2.x ; do + [[ -s "${utilexe}" ]] && rm -f "${utilexe}" + ${LINK} "${script_dir}/gfs_utils.fd/install/bin/${utilexe}" . done -[[ -s ufs_model.x ]] && rm -f ufs_model.x -$LINK ../sorc/ufs_model.fd/tests/ufs_model.x . +[[ -s "ufs_model.x" ]] && rm -f ufs_model.x +${LINK} "${script_dir}/ufs_model.fd/tests/ufs_model.x" . -[[ -s gfs_ncep_post ]] && rm -f gfs_ncep_post -$LINK ../sorc/upp.fd/exec/upp.x gfs_ncep_post +[[ -s "upp.x" ]] && rm -f upp.x +${LINK} "${script_dir}/upp.fd/exec/upp.x" . -if [ -d ${pwd}/gfs_wafs.fd ]; then +if [[ -d "${script_dir}/gfs_wafs.fd" ]]; then for wafsexe in \ wafs_awc_wafavn.x wafs_blending.x wafs_blending_0p25.x \ wafs_cnvgrib2.x wafs_gcip.x wafs_grib2_0p25.x \ wafs_makewafs.x wafs_setmissing.x; do - [[ -s $wafsexe ]] && rm -f $wafsexe - $LINK ../sorc/gfs_wafs.fd/exec/$wafsexe . + [[ -s ${wafsexe} ]] && rm -f "${wafsexe}" + ${LINK} "${script_dir}/gfs_wafs.fd/exec/${wafsexe}" . 
done fi for ufs_utilsexe in \ emcsfc_ice_blend emcsfc_snow2mdl global_cycle ; do - [[ -s $ufs_utilsexe ]] && rm -f $ufs_utilsexe - $LINK ../sorc/ufs_utils.fd/exec/$ufs_utilsexe . + [[ -s "${ufs_utilsexe}" ]] && rm -f "${ufs_utilsexe}" + ${LINK} "${script_dir}/ufs_utils.fd/exec/${ufs_utilsexe}" . done # GSI -if [ -d ../sorc/gsi_enkf.fd ]; then - for exe in enkf.x gsi.x; do - [[ -s $exe ]] && rm -f $exe - $LINK ../sorc/gsi_enkf.fd/install/bin/$exe . +if [[ -d "${script_dir}/gsi_enkf.fd" ]]; then + for gsiexe in enkf.x gsi.x; do + [[ -s "${gsiexe}" ]] && rm -f "${gsiexe}" + ${LINK} "${script_dir}/gsi_enkf.fd/install/bin/${gsiexe}" . done fi # GSI Utils -if [ -d ../sorc/gsi_utils.fd ]; then +if [[ -d "${script_dir}/gsi_utils.fd" ]]; then for exe in calc_analysis.x calc_increment_ens_ncio.x calc_increment_ens.x \ getsfcensmeanp.x getsigensmeanp_smooth.x getsigensstatp.x \ interp_inc.x recentersigp.x;do - [[ -s $exe ]] && rm -f $exe - $LINK ../sorc/gsi_utils.fd/install/bin/$exe . + [[ -s "${exe}" ]] && rm -f "${exe}" + ${LINK} "${script_dir}/gsi_utils.fd/install/bin/${exe}" . done fi # GSI Monitor -if [ -d ../sorc/gsi_monitor.fd ]; then +if [[ -d "${script_dir}/gsi_monitor.fd" ]]; then for exe in oznmon_horiz.x oznmon_time.x radmon_angle.x \ radmon_bcoef.x radmon_bcor.x radmon_time.x; do - [[ -s $exe ]] && rm -f $exe - $LINK ../sorc/gsi_monitor.fd/install/bin/$exe . + [[ -s "${exe}" ]] && rm -f "${exe}" + ${LINK} "${script_dir}/gsi_monitor.fd/install/bin/${exe}" . done fi -if [ -d ../sorc/gldas.fd ]; then +if [[ -d "${script_dir}/gldas.fd" ]]; then for gldasexe in gdas2gldas gldas2gdas gldas_forcing gldas_model gldas_post gldas_rst; do - [[ -s $gldasexe ]] && rm -f $gldasexe - $LINK ../sorc/gldas.fd/exec/$gldasexe . + [[ -s "${gldasexe}" ]] && rm -f "${gldasexe}" + ${LINK} "${script_dir}/gldas.fd/exec/${gldasexe}" . 
done fi # GDASApp -if [ -d ../sorc/gdas.cd ]; then - for gdasexe in fv3jedi_addincrement.x fv3jedi_diffstates.x fv3jedi_ensvariance.x fv3jedi_hofx.x \ - fv3jedi_var.x fv3jedi_convertincrement.x fv3jedi_dirac.x fv3jedi_error_covariance_training.x \ - fv3jedi_letkf.x fv3jedi_convertstate.x fv3jedi_eda.x fv3jedi_forecast.x fv3jedi_plot_field.x \ - fv3jedi_data_checker.py fv3jedi_enshofx.x fv3jedi_hofx_nomodel.x fv3jedi_testdata_downloader.py; do - [[ -s $gdasexe ]] && rm -f $gdasexe - $LINK ../sorc/gdas.cd/build/bin/$gdasexe . +if [[ -d "${script_dir}/gdas.cd" ]]; then + declare -a JEDI_EXE=("fv3jedi_addincrement.x" \ + "fv3jedi_diffstates.x" \ + "fv3jedi_ensvariance.x" \ + "fv3jedi_hofx.x" \ + "fv3jedi_var.x" \ + "fv3jedi_convertincrement.x" \ + "fv3jedi_dirac.x" \ + "fv3jedi_error_covariance_training.x" \ + "fv3jedi_letkf.x" \ + "fv3jedi_convertstate.x" \ + "fv3jedi_eda.x" \ + "fv3jedi_forecast.x" \ + "fv3jedi_plot_field.x" \ + "fv3jedi_data_checker.py" \ + "fv3jedi_enshofx.x" \ + "fv3jedi_hofx_nomodel.x" \ + "fv3jedi_testdata_downloader.py" \ + "soca_convertincrement.x" \ + "soca_error_covariance_training.x" \ + "soca_setcorscales.x" \ + "soca_gridgen.x" \ + "soca_var.x") + for gdasexe in "${JEDI_EXE[@]}"; do + [[ -s "${gdasexe}" ]] && rm -f "${gdasexe}" + ${LINK} "${script_dir}/gdas.cd/build/bin/${gdasexe}" . 
done fi #------------------------------ #--link source code directories #------------------------------ -cd ${pwd}/../sorc || exit 8 +cd "${script_dir}" || exit 8 - if [ -d gsi_enkf.fd ]; then + if [[ -d gsi_enkf.fd ]]; then [[ -d gsi.fd ]] && rm -rf gsi.fd - $SLINK gsi_enkf.fd/src/gsi gsi.fd + ${SLINK} gsi_enkf.fd/src/gsi gsi.fd [[ -d enkf.fd ]] && rm -rf enkf.fd - $SLINK gsi_enkf.fd/src/enkf enkf.fd + ${SLINK} gsi_enkf.fd/src/enkf enkf.fd fi - if [ -d gsi_utils.fd ]; then + if [[ -d gsi_utils.fd ]]; then [[ -d calc_analysis.fd ]] && rm -rf calc_analysis.fd - $SLINK gsi_utils.fd/src/netcdf_io/calc_analysis.fd calc_analysis.fd + ${SLINK} gsi_utils.fd/src/netcdf_io/calc_analysis.fd calc_analysis.fd [[ -d calc_increment_ens.fd ]] && rm -rf calc_increment_ens.fd - $SLINK gsi_utils.fd/src/EnKF/gfs/src/calc_increment_ens.fd calc_increment_ens.fd + ${SLINK} gsi_utils.fd/src/EnKF/gfs/src/calc_increment_ens.fd calc_increment_ens.fd [[ -d calc_increment_ens_ncio.fd ]] && rm -rf calc_increment_ens_ncio.fd - $SLINK gsi_utils.fd/src/EnKF/gfs/src/calc_increment_ens_ncio.fd calc_increment_ens_ncio.fd + ${SLINK} gsi_utils.fd/src/EnKF/gfs/src/calc_increment_ens_ncio.fd calc_increment_ens_ncio.fd [[ -d getsfcensmeanp.fd ]] && rm -rf getsfcensmeanp.fd - $SLINK gsi_utils.fd/src/EnKF/gfs/src/getsfcensmeanp.fd getsfcensmeanp.fd + ${SLINK} gsi_utils.fd/src/EnKF/gfs/src/getsfcensmeanp.fd getsfcensmeanp.fd [[ -d getsigensmeanp_smooth.fd ]] && rm -rf getsigensmeanp_smooth.fd - $SLINK gsi_utils.fd/src/EnKF/gfs/src/getsigensmeanp_smooth.fd getsigensmeanp_smooth.fd + ${SLINK} gsi_utils.fd/src/EnKF/gfs/src/getsigensmeanp_smooth.fd getsigensmeanp_smooth.fd [[ -d getsigensstatp.fd ]] && rm -rf getsigensstatp.fd - $SLINK gsi_utils.fd/src/EnKF/gfs/src/getsigensstatp.fd getsigensstatp.fd + ${SLINK} gsi_utils.fd/src/EnKF/gfs/src/getsigensstatp.fd getsigensstatp.fd [[ -d recentersigp.fd ]] && rm -rf recentersigp.fd - $SLINK gsi_utils.fd/src/EnKF/gfs/src/recentersigp.fd recentersigp.fd + ${SLINK} 
gsi_utils.fd/src/EnKF/gfs/src/recentersigp.fd recentersigp.fd [[ -d interp_inc.fd ]] && rm -rf interp_inc.fd - $SLINK gsi_utils.fd/src/netcdf_io/interp_inc.fd interp_inc.fd + ${SLINK} gsi_utils.fd/src/netcdf_io/interp_inc.fd interp_inc.fd fi - if [ -d gsi_monitor.fd ] ; then + if [[ -d gsi_monitor.fd ]] ; then [[ -d oznmon_horiz.fd ]] && rm -rf oznmon_horiz.fd - $SLINK gsi_monitor.fd/src/Ozone_Monitor/nwprod/oznmon_shared/sorc/oznmon_horiz.fd oznmon_horiz.fd + ${SLINK} gsi_monitor.fd/src/Ozone_Monitor/nwprod/oznmon_shared/sorc/oznmon_horiz.fd oznmon_horiz.fd [[ -d oznmon_time.fd ]] && rm -rf oznmon_time.fd - $SLINK gsi_monitor.fd/src/Ozone_Monitor/nwprod/oznmon_shared/sorc/oznmon_time.fd oznmon_time.fd + ${SLINK} gsi_monitor.fd/src/Ozone_Monitor/nwprod/oznmon_shared/sorc/oznmon_time.fd oznmon_time.fd [[ -d radmon_angle.fd ]] && rm -rf radmon_angle.fd - $SLINK gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radang.fd radmon_angle.fd + ${SLINK} gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radang.fd radmon_angle.fd [[ -d radmon_bcoef.fd ]] && rm -rf radmon_bcoef.fd - $SLINK gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radbcoef.fd radmon_bcoef.fd + ${SLINK} gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radbcoef.fd radmon_bcoef.fd [[ -d radmon_bcor.fd ]] && rm -rf radmon_bcor.fd - $SLINK gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radbcor.fd radmon_bcor.fd + ${SLINK} gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radbcor.fd radmon_bcor.fd [[ -d radmon_time.fd ]] && rm -rf radmon_time.fd - $SLINK gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radtime.fd radmon_time.fd + ${SLINK} gsi_monitor.fd/src/Radiance_Monitor/nwprod/radmon_shared/sorc/verf_radtime.fd radmon_time.fd fi [[ -d gfs_ncep_post.fd ]] && rm -rf gfs_ncep_post.fd - $SLINK upp.fd/sorc/ncep_post.fd gfs_ncep_post.fd + ${SLINK} upp.fd/sorc/ncep_post.fd gfs_ncep_post.fd 
for prog in fregrid make_hgrid make_solo_mosaic ; do - [[ -d ${prog}.fd ]] && rm -rf ${prog}.fd - $SLINK ufs_utils.fd/sorc/fre-nctools.fd/tools/$prog ${prog}.fd + [[ -d "${prog}.fd" ]] && rm -rf "${prog}.fd" + ${SLINK} "ufs_utils.fd/sorc/fre-nctools.fd/tools/${prog}" "${prog}.fd" done - for prog in global_cycle.fd \ + for prog in global_cycle.fd \ emcsfc_ice_blend.fd \ emcsfc_snow2mdl.fd ;do - [[ -d $prog ]] && rm -rf $prog - $SLINK ufs_utils.fd/sorc/$prog $prog + [[ -d "${prog}" ]] && rm -rf "${prog}" + ${SLINK} "ufs_utils.fd/sorc/${prog}" "${prog}" done + for prog in enkf_chgres_recenter.fd \ + enkf_chgres_recenter_nc.fd \ + fbwndgfs.fd \ + fv3nc2nemsio.fd \ + gaussian_sfcanl.fd \ + gfs_bufr.fd \ + mkgfsawps.fd \ + overgridid.fd \ + rdbfmsua.fd \ + reg2grb2.fd \ + regrid_nemsio.fd \ + supvit.fd \ + syndat_getjtbul.fd \ + syndat_maksynrc.fd \ + syndat_qctropcy.fd \ + tave.fd \ + tocsbufr.fd \ + vint.fd \ + webtitle.fd + do + if [[ -d "${prog}" ]]; then rm -rf "${prog}"; fi + ${LINK} "gfs_utils.fd/src/${prog}" . 
+ done - if [ -d ${pwd}/gfs_wafs.fd ]; then - $SLINK gfs_wafs.fd/sorc/wafs_awc_wafavn.fd wafs_awc_wafavn.fd - $SLINK gfs_wafs.fd/sorc/wafs_blending.fd wafs_blending.fd - $SLINK gfs_wafs.fd/sorc/wafs_blending_0p25.fd wafs_blending_0p25.fd - $SLINK gfs_wafs.fd/sorc/wafs_cnvgrib2.fd wafs_cnvgrib2.fd - $SLINK gfs_wafs.fd/sorc/wafs_gcip.fd wafs_gcip.fd - $SLINK gfs_wafs.fd/sorc/wafs_grib2_0p25.fd wafs_grib2_0p25.fd - $SLINK gfs_wafs.fd/sorc/wafs_makewafs.fd wafs_makewafs.fd - $SLINK gfs_wafs.fd/sorc/wafs_setmissing.fd wafs_setmissing.fd + if [[ -d "${script_dir}/gfs_wafs.fd" ]]; then + ${SLINK} gfs_wafs.fd/sorc/wafs_awc_wafavn.fd wafs_awc_wafavn.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_blending.fd wafs_blending.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_blending_0p25.fd wafs_blending_0p25.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_cnvgrib2.fd wafs_cnvgrib2.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_gcip.fd wafs_gcip.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_grib2_0p25.fd wafs_grib2_0p25.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_makewafs.fd wafs_makewafs.fd + ${SLINK} gfs_wafs.fd/sorc/wafs_setmissing.fd wafs_setmissing.fd fi - if [ -d gldas.fd ]; then + if [[ -d gldas.fd ]]; then for prog in gdas2gldas.fd gldas2gdas.fd gldas_forcing.fd gldas_model.fd gldas_post.fd gldas_rst.fd ;do - [[ -d $prog ]] && rm -rf $prog - $SLINK gldas.fd/sorc/$prog $prog + [[ -d "${prog}" ]] && rm -rf "${prog}" + ${SLINK} "gldas.fd/sorc/${prog}" "${prog}" done fi #------------------------------ # copy $HOMEgfs/parm/config/config.base.nco.static as config.base for operations # config.base in the $HOMEgfs/parm/config has no use in development -cd $pwd/../parm/config -[[ -s config.base ]] && rm -f config.base -[[ $RUN_ENVIR = nco ]] && cp -p config.base.nco.static config.base +cd "${top_dir}/parm/config" || exit 1 +[[ -s "config.base" ]] && rm -f config.base +if [[ "${RUN_ENVIR}" == "nco" ]] ; then + cp -p config.base.nco.static config.base + cp -p config.fv3.nco.static config.fv3 + cp -p config.resources.nco.static config.resources 
+fi #------------------------------ +echo "${BASH_SOURCE[0]} completed successfully" exit 0 - diff --git a/sorc/machine-setup.sh b/sorc/machine-setup.sh deleted file mode 100644 index 229c7f03430..00000000000 --- a/sorc/machine-setup.sh +++ /dev/null @@ -1,142 +0,0 @@ -# Create a test function for sh vs. bash detection. The name is -# randomly generated to reduce the chances of name collision. -__ms_function_name="setup__test_function__$$" -eval "$__ms_function_name() { /bin/true ; }" - -# Determine which shell we are using -__ms_ksh_test=$( eval '__text="text" ; if [[ $__text =~ ^(t).* ]] ; then printf "%s" ${.sh.match[1]} ; fi' 2> /dev/null | cat ) -__ms_bash_test=$( eval 'if ( set | grep '$__ms_function_name' | grep -v name > /dev/null 2>&1 ) ; then echo t ; fi ' 2> /dev/null | cat ) - -if [[ ! -z "$__ms_ksh_test" ]] ; then - __ms_shell=ksh -elif [[ ! -z "$__ms_bash_test" ]] ; then - __ms_shell=bash -else - # Not bash or ksh, so assume sh. - __ms_shell=sh -fi - -target="" -USERNAME=$(echo $LOGNAME | awk '{ print tolower($0)'}) -##--------------------------------------------------------------------------- -export hname=$(hostname | cut -c 1,1) -if [[ -d /work ]] ; then - # We are on MSU Orion - if ( ! eval module help > /dev/null 2>&1 ) ; then - echo load the module command 1>&2 - source /apps/lmod/lmod/init/$__ms_shell - fi - target=orion - - module purge - - export myFC=mpiifort - export FCOMP=mpiifort - -##--------------------------------------------------------------------------- -elif [[ -d /scratch1 ]] ; then - # We are on NOAA Hera - if ( ! eval module help > /dev/null 2>&1 ) ; then - echo load the module command 1>&2 - source /apps/lmod/lmod/init/$__ms_shell - fi - target=hera - - module purge - - export myFC=mpiifort - export FCOMP=mpiifort - -##--------------------------------------------------------------------------- -elif [[ -d /glade ]] ; then - # We are on NCAR Yellowstone - if ( ! 
eval module help > /dev/null 2>&1 ) ; then - echo load the module command 1>&2 - . /usr/share/Modules/init/$__ms_shell - fi - target=yellowstone - module purge - -##--------------------------------------------------------------------------- -elif [[ -d /lustre && -d /ncrc ]] ; then - # We are on GAEA. - echo gaea - if ( ! eval module help > /dev/null 2>&1 ) ; then - # We cannot simply load the module command. The GAEA - # /etc/profile modifies a number of module-related variables - # before loading the module command. Without those variables, - # the module command fails. Hence we actually have to source - # /etc/profile here. - source /etc/profile - __ms_source_etc_profile=yes - else - __ms_source_etc_profile=no - fi - module purge - module purge - # clean up after purge - unset _LMFILES_ - unset _LMFILES_000 - unset _LMFILES_001 - unset LOADEDMODULES - module load modules - if [[ -d /opt/cray/ari/modulefiles ]] ; then - module use -a /opt/cray/ari/modulefiles - fi - if [[ -d /opt/cray/pe/ari/modulefiles ]] ; then - module use -a /opt/cray/pe/ari/modulefiles - fi - if [[ -d /opt/cray/pe/craype/default/modulefiles ]] ; then - module use -a /opt/cray/pe/craype/default/modulefiles - fi - if [[ -s /etc/opt/cray/pe/admin-pe/site-config ]] ; then - source /etc/opt/cray/pe/admin-pe/site-config - fi - export NCEPLIBS=/lustre/f1/pdata/ncep_shared/NCEPLIBS/lib - if [[ -d "$NCEPLIBS" ]] ; then - module use $NCEPLIBS/modulefiles - fi - if [[ "$__ms_source_etc_profile" == yes ]] ; then - source /etc/profile - unset __ms_source_etc_profile - fi - target=gaea - - # GWV ADD - module load craype - module load intel - export NCEPLIBS=/lustre/f2/dev/ncep/George.Vandenberghe/NEWCOPY/l508/lib/ - module use $NCEPLIBS/modulefiles - export WRFPATH=$NCEPLIBS/wrf.shared.new/v1.1.1/src - export myFC=ftn - export FCOMP=ftn - # END GWV ADD - -##--------------------------------------------------------------------------- -elif [[ -d /lfs4 ]] ; then - # We are on NOAA Jet - if ( ! 
eval module help > /dev/null 2>&1 ) ; then - echo load the module command 1>&2 - source /apps/lmod/lmod/init/$__ms_shell - fi - target=jet - module purge - module load intel/18.0.5.274 - module load impi/2018.4.274 - export NCEPLIBS=/mnt/lfs4/HFIP/hfv3gfs/nwprod/NCEPLIBS - #export NCEPLIBS=/mnt/lfs3/projects/hfv3gfs/gwv/ljtjet/lib - #export NCEPLIBS=/mnt/lfs3/projects/hfv3gfs/gwv/ljtjet/lib - #export NCEPLIBS=/mnt/lfs3/projects/hfv3gfs/gwv/NCEPLIBS.15X - module use $NCEPLIBS/modulefiles - export WRFPATH=$NCEPLIBS/wrf.shared.new/v1.1.1/src - export myFC=mpiifort - -else - echo WARNING: UNKNOWN PLATFORM 1>&2 -fi - -unset __ms_shell -unset __ms_ksh_test -unset __ms_bash_test -unset $__ms_function_name -unset __ms_function_name diff --git a/sorc/ncl.setup b/sorc/ncl.setup index de01309038c..b4981689db2 100644 --- a/sorc/ncl.setup +++ b/sorc/ncl.setup @@ -1,12 +1,12 @@ #!/bin/bash set +x -case $target in +case ${target} in 'jet'|'hera') module load ncl/6.5.0 - export NCARG_LIB=$NCARG_ROOT/lib + export NCARG_LIB=${NCARG_ROOT}/lib ;; *) - echo "[${BASH_SOURCE}]: unknown $target" + echo "[${BASH_SOURCE[0]}]: unknown ${target}" ;; esac diff --git a/sorc/partial_build.sh b/sorc/partial_build.sh index 6b25f5bd859..0d4657136d5 100755 --- a/sorc/partial_build.sh +++ b/sorc/partial_build.sh @@ -8,13 +8,12 @@ declare -a Build_prg=("Build_ufs_model" \ "Build_gsi_utils" \ "Build_gsi_monitor" \ "Build_ww3_prepost" \ - "Build_reg2grb2" \ + "Build_gdas" \ "Build_gldas" \ "Build_upp" \ "Build_ufs_utils" \ "Build_gfs_wafs" \ - "Build_workflow_utils" \ - "Build_gfs_util") + "Build_gfs_utils") # # function parse_cfg: read config file and retrieve the values @@ -31,7 +30,7 @@ parse_cfg() { [[ ${config,,} == "--verbose" ]] && config=$3 all_prg=() for (( n = num_args + 2; n <= total_args; n++ )); do - all_prg+=( ${!n} ) + all_prg+=( "${!n}" ) done if [[ ${config^^} == ALL ]]; then @@ -39,22 +38,26 @@ parse_cfg() { # set all values to true # for var in "${Build_prg[@]}"; do - eval "$var=true" 
+ eval "${var}=true" done - elif [[ $config == config=* ]]; then + elif [[ ${config} == config=* ]]; then # # process config file # cfg_file=${config#config=} - $verbose && echo "INFO: settings in config file: $cfg_file" - while read cline; do + ${verbose} && echo "INFO: settings in config file: ${cfg_file}" + while read -r cline; do # remove leading white space clean_line="${cline#"${cline%%[![:space:]]*}"}" - ( [[ -z "$clean_line" ]] || [[ "${clean_line:0:1}" == "#" ]] ) || { - $verbose && echo $clean_line + { [[ -z "${clean_line}" ]] || [[ "${clean_line:0:1}" == "#" ]]; } || { + ${verbose} && echo "${clean_line}" first9=${clean_line:0:9} [[ ${first9,,} == "building " ]] && { - short_prg=$(sed -e 's/.*(\(.*\)).*/\1/' <<< "$clean_line") + # No shellcheck, this can't be replaced by a native bash substitute + # because it uses a regex + # shellcheck disable=SC2001 + short_prg=$(sed -e 's/.*(\(.*\)).*/\1/' <<< "${clean_line}") + # shellcheck disable= # remove trailing white space clean_line="${cline%"${cline##*[![:space:]]}"}" build_action=true @@ -63,27 +66,27 @@ parse_cfg() { last4=${clean_line: -4} [[ ${last4,,} == ". 
no" ]] && build_action=false found=false - for prg in ${all_prg[@]}; do - [[ $prg == "Build_"$short_prg ]] && { + for prg in "${all_prg[@]}"; do + [[ ${prg} == "Build_${short_prg}" ]] && { found=true - eval "$prg=$build_action" + eval "${prg}=${build_action}" break } done - $found || { - echo "*** Unrecognized line in config file \"$cfg_file\":" 2>&1 - echo "$cline" 2>&1 + ${found} || { + echo "*** Unrecognized line in config file \"${cfg_file}\":" 1>&2 + echo "${cline}" 1>&2 exit 3 } } } - done < $cfg_file - elif [[ $config == select=* ]]; then + done < "${cfg_file}" + elif [[ ${config} == select=* ]]; then # # set all values to (default) false # for var in "${Build_prg[@]}"; do - eval "$var=false" + eval "${var}=false" done # # read command line partial build setting @@ -91,43 +94,45 @@ parse_cfg() { del="" sel_prg=${config#select=} for separator in " " "," ";" ":" "/" "|"; do - [[ "${sel_prg/$separator}" == "$sel_prg" ]] || { - del=$separator - sel_prg=${sel_prg//$del/ } + [[ "${sel_prg/${separator}}" == "${sel_prg}" ]] || { + del=${separator} + sel_prg=${sel_prg//${del}/ } } done - [[ $del == "" ]] && { - short_prg=$sel_prg - found=false - for prg in ${all_prg[@]}; do - [[ $prg == "Build_"$short_prg ]] && { - found=true - eval "$prg=true" - break - } - done - $found || { - echo "*** Unrecognized program name \"$short_prg\" in command line" 2>&1 - exit 4 - } - } || { - for short_prg in $(echo ${sel_prg}); do + if [[ ${del} == "" ]]; then + { + short_prg=${sel_prg} found=false + for prg in "${all_prg[@]}"; do + [[ ${prg} == "Build_${short_prg}" ]] && { found=true - eval "$prg=true" + eval "${prg}=true" break } done - $found || { - echo "*** Unrecognized program name \"$short_prg\" in command line" 2>&1 - exit 5 + ${found} || { + echo "*** Unrecognized program name \"${short_prg}\" in command line" 1>&2 + exit 4 } - done - } + } || { + for short_prg in ${sel_prg}; do + found=false + for prg in
"${all_prg[@]}"; do + [[ ${prg} == "Build_${short_prg}" ]] && { + found=true + eval "${prg}=true" + break + } + done + ${found} || { + echo "*** Unrecognized program name \"${short_prg}\" in command line" 1>&2 + exit 5 + } + done + } + fi else - echo "*** Unrecognized command line option \"$config\"" 2>&1 + echo "*** Unrecognized command line option \"${config}\"" 1>&2 exit 6 fi } @@ -135,7 +140,7 @@ parse_cfg() { usage() { cat << EOF 2>&1 -Usage: $BASH_SOURCE [-c config_file][-h][-v] +Usage: ${BASH_SOURCE[0]} [-c config_file][-h][-v] -h: Print this help message and exit -v: @@ -154,7 +159,7 @@ verbose=false config_file="gfs_build.cfg" # Reset option counter for when this script is sourced OPTIND=1 -while getopts ":c:hs:v" option; do +while getopts ":c:hv" option; do case "${option}" in c) config_file="${OPTARG}";; h) usage;; @@ -162,12 +167,12 @@ while getopts ":c:hs:v" option; do verbose=true parse_argv+=( "--verbose" ) ;; - \?) - echo "[$BASH_SOURCE]: Unrecognized option: ${option}" + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage ;; - :) - echo "[$BASH_SOURCE]: ${option} requires an argument" + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" usage ;; esac @@ -175,21 +180,21 @@ done shift $((OPTIND-1)) -parse_argv+=( "config=$config_file" ) +parse_argv+=( "config=${config_file}" ) # # call arguments retriever/config parser # -parse_cfg ${#parse_argv[@]} "${parse_argv[@]}" ${Build_prg[@]} +parse_cfg ${#parse_argv[@]} "${parse_argv[@]}" "${Build_prg[@]}" # # print values of build array # -$verbose && { +${verbose} && { echo "INFO: partial build settings:" for var in "${Build_prg[@]}"; do - echo -n " $var: " - ${!var} && echo True || echo False + echo -n " ${var}: " + "${!var}" && echo True || echo False done } diff --git a/sorc/regrid_nemsio.fd/constants.f90 b/sorc/regrid_nemsio.fd/constants.f90 deleted file mode 100644 index 8627358e2d1..00000000000 --- a/sorc/regrid_nemsio.fd/constants.f90 +++ /dev/null @@ -1,314 +0,0 @@
-! this module was extracted from the GSI version operational -! at NCEP in Dec. 2007. -module constants -!$$$ module documentation block -! . . . . -! module: constants -! prgmmr: treadon org: np23 date: 2003-09-25 -! -! abstract: This module contains the definition of various constants -! used in the gsi code -! -! program history log: -! 2003-09-25 treadon - original code -! 2004-03-02 treadon - allow global and regional constants to differ -! 2004-06-16 treadon - update documentation -! 2004-10-28 treadon - replace parameter tiny=1.e-12 with tiny_r_kind -! and tiny_single -! 2004-11-16 treadon - add huge_single, huge_r_kind parameters -! 2005-01-27 cucurull - add ione -! 2005-08-24 derber - move cg_term to constants from qcmod -! 2006-03-07 treadon - add rd_over_cp_mass -! 2006-05-18 treadon - add huge_i_kind -! 2006-06-06 su - add var-qc wgtlim, change value to 0.25 (ECMWF) -! 2006-07-28 derber - add r1000 -! -! Subroutines Included: -! sub init_constants - compute derived constants, set regional/global constants -! -! Variable Definitions: -! see below -! -! attributes: -! language: f90 -! machine: ibm RS/6000 SP -! -!$$$ - use kinds, only: r_single,r_kind,i_kind - implicit none - -! 
Declare constants - integer(i_kind) izero,ione - real(r_kind) rearth,grav,omega,rd,rv,cp,cv,cvap,cliq - real(r_kind) csol,hvap,hfus,psat,t0c,ttp,jcal,cp_mass,cg_term - real(r_kind) fv,deg2rad,rad2deg,pi,tiny_r_kind,huge_r_kind,huge_i_kind - real(r_kind) ozcon,rozcon,tpwcon,rd_over_g,rd_over_cp,g_over_rd - real(r_kind) amsua_clw_d1,amsua_clw_d2,constoz,zero,one,two,four - real(r_kind) one_tenth,quarter,three,five,rd_over_cp_mass - real(r_kind) rearth_equator,stndrd_atmos_ps,r1000,stndrd_atmos_lapse - real(r_kind) semi_major_axis,semi_minor_axis,n_a,n_b - real(r_kind) eccentricity,grav_polar,grav_ratio - real(r_kind) grav_equator,earth_omega,grav_constant - real(r_kind) flattening,eccentricity_linear,somigliana - real(r_kind) dldt,dldti,hsub,psatk,tmix,xa,xai,xb,xbi - real(r_kind) eps,epsm1,omeps,wgtlim - real(r_kind) elocp,cpr,el2orc,cclimit,climit,epsq - real(r_kind) pcpeff0,pcpeff1,pcpeff2,pcpeff3,rcp,c0,delta - real(r_kind) h1000,factor1,factor2,rhcbot,rhctop,dx_max,dx_min,dx_inv - real(r_kind) h300,half,cmr,cws,ke2,row,rrow - real(r_single) zero_single,tiny_single,huge_single - real(r_single) rmw_mean_distance, roic_mean_distance - logical :: constants_initialized = .true. - - -! Define constants common to global and regional applications -! name value description units -! ---- ----- ----------- ----- - parameter(rearth_equator= 6.37813662e6_r_kind) ! equatorial earth radius (m) - parameter(omega = 7.2921e-5_r_kind) ! angular velocity of earth (1/s) - parameter(cp = 1.0046e+3_r_kind) ! specific heat of air @pressure (J/kg/K) - parameter(cvap = 1.8460e+3_r_kind) ! specific heat of h2o vapor (J/kg/K) - parameter(csol = 2.1060e+3_r_kind) ! specific heat of solid h2o (ice)(J/kg/K) - parameter(hvap = 2.5000e+6_r_kind) ! latent heat of h2o condensation (J/kg) - parameter(hfus = 3.3358e+5_r_kind) ! latent heat of h2o fusion (J/kg) - parameter(psat = 6.1078e+2_r_kind) ! pressure at h2o triple point (Pa) - parameter(t0c = 2.7315e+2_r_kind) ! 
temperature at zero celsius (K) - parameter(ttp = 2.7316e+2_r_kind) ! temperature at h2o triple point (K) - parameter(jcal = 4.1855e+0_r_kind) ! joules per calorie () - parameter(stndrd_atmos_ps = 1013.25e2_r_kind) ! 1976 US standard atmosphere ps (Pa) - -! Numeric constants - parameter(izero = 0) - parameter(ione = 1) - parameter(zero_single = 0.0_r_single) - parameter(zero = 0.0_r_kind) - parameter(one_tenth = 0.10_r_kind) - parameter(quarter= 0.25_r_kind) - parameter(one = 1.0_r_kind) - parameter(two = 2.0_r_kind) - parameter(three = 3.0_r_kind) - parameter(four = 4.0_r_kind) - parameter(five = 5.0_r_kind) - parameter(r1000 = 1000.0_r_kind) - -! Constants for gps refractivity - parameter(n_a=77.6_r_kind) !K/mb - parameter(n_b=3.73e+5_r_kind) !K^2/mb - -! Parameters below from WGS-84 model software inside GPS receivers. - parameter(semi_major_axis = 6378.1370e3_r_kind) ! (m) - parameter(semi_minor_axis = 6356.7523142e3_r_kind) ! (m) - parameter(grav_polar = 9.8321849378_r_kind) ! (m/s2) - parameter(grav_equator = 9.7803253359_r_kind) ! (m/s2) - parameter(earth_omega = 7.292115e-5_r_kind) ! (rad/s) - parameter(grav_constant = 3.986004418e14_r_kind) ! (m3/s2) - -! Derived geophysical constants - parameter(flattening = (semi_major_axis-semi_minor_axis)/semi_major_axis)!() - parameter(somigliana = & - (semi_minor_axis/semi_major_axis) * (grav_polar/grav_equator) - one)!() - parameter(grav_ratio = (earth_omega*earth_omega * & - semi_major_axis*semi_major_axis * semi_minor_axis) / grav_constant) !() - -! Derived thermodynamic constants - parameter ( dldti = cvap-csol ) - parameter ( hsub = hvap+hfus ) - parameter ( psatk = psat*0.001_r_kind ) - parameter ( tmix = ttp-20._r_kind ) - parameter ( elocp = hvap/cp ) - parameter ( rcp = one/cp ) - -! 
Constants used in GFS moist physics - parameter ( h300 = 300._r_kind ) - parameter ( half = 0.5_r_kind ) - parameter ( cclimit = 0.001_r_kind ) - parameter ( climit = 1.e-20_r_kind) - parameter ( epsq = 2.e-12_r_kind ) - parameter ( h1000 = 1000.0_r_kind) - parameter ( rhcbot=0.85_r_kind ) - parameter ( rhctop=0.85_r_kind ) - parameter ( dx_max=-8.8818363_r_kind ) - parameter ( dx_min=-5.2574954_r_kind ) - parameter ( dx_inv=one/(dx_max-dx_min) ) - parameter ( c0=0.002_r_kind ) - parameter ( delta=0.6077338_r_kind ) - parameter ( pcpeff0=1.591_r_kind ) - parameter ( pcpeff1=-0.639_r_kind ) - parameter ( pcpeff2=0.0953_r_kind ) - parameter ( pcpeff3=-0.00496_r_kind ) - parameter ( cmr = one/0.0003_r_kind ) - parameter ( cws = 0.025_r_kind ) - parameter ( ke2 = 0.00002_r_kind ) - parameter ( row = 1000._r_kind ) - parameter ( rrow = one/row ) - -! Constant used to process ozone - parameter ( constoz = 604229.0_r_kind) - -! Constants used in cloud liquid water correction for AMSU-A -! brightness temperatures - parameter ( amsua_clw_d1 = 0.754_r_kind ) - parameter ( amsua_clw_d2 = -2.265_r_kind ) - -! Constants used for variational qc - parameter ( wgtlim = 0.25_r_kind) ! Cutoff weight for concluding that obs has been - ! rejected by nonlinear qc. This limit is arbitrary - ! and DOES NOT affect nonlinear qc. It only affects - ! the printout which "counts" the number of obs that - ! "fail" nonlinear qc. Observations counted as failing - ! nonlinear qc are still assimilated. Their weight - ! relative to other observations is reduced. Changing - ! wgtlim does not alter the analysis, only - ! the nonlinear qc data "count" - -! Constants describing the Extended Best-Track Reanalysis [Demuth et -! al., 2008] tropical cyclone (TC) distance for regions relative to TC -! 
track position; units are in kilometers - - parameter (rmw_mean_distance = 64.5479412) - parameter (roic_mean_distance = 338.319656) - -contains - subroutine init_constants_derived -!$$$ subprogram documentation block -! . . . . -! subprogram: init_constants_derived set derived constants -! prgmmr: treadon org: np23 date: 2004-12-02 -! -! abstract: This routine sets derived constants -! -! program history log: -! 2004-12-02 treadon -! 2005-03-03 treadon - add implicit none -! -! input argument list: -! -! output argument list: -! -! attributes: -! language: f90 -! machine: ibm rs/6000 sp -! -!$$$ - implicit none - -! Trigonometric constants - pi = acos(-one) - deg2rad = pi/180.0_r_kind - rad2deg = one/deg2rad - cg_term = (sqrt(two*pi))/two ! constant for variational qc - tiny_r_kind = tiny(zero) - huge_r_kind = huge(zero) - tiny_single = tiny(zero_single) - huge_single = huge(zero_single) - huge_i_kind = huge(izero) - -! Geophysical parameters used in conversion of geopotential to -! geometric height - eccentricity_linear = sqrt(semi_major_axis**2 - semi_minor_axis**2) - eccentricity = eccentricity_linear / semi_major_axis - constants_initialized = .true. - - return - end subroutine init_constants_derived - - subroutine init_constants(regional) -!$$$ subprogram documentation block -! . . . . -! subprogram: init_constants set regional or global constants -! prgmmr: treadon org: np23 date: 2004-03-02 -! -! abstract: This routine sets constants specific to regional or global -! applications of the gsi -! -! program history log: -! 2004-03-02 treadon -! 2004-06-16 treadon, documentation -! 2004-10-28 treadon - use intrinsic TINY function to set value -! for smallest machine representable positive -! number -! 2004-12-03 treadon - move derived constants to init_constants_derived -! 2005-03-03 treadon - add implicit none -! -! input argument list: -! regional - if .true., set regional gsi constants; -! otherwise (.false.), use global constants -! -! 
output argument list: -! -! attributes: -! language: f90 -! machine: ibm rs/6000 sp -! -!$$$ - implicit none - logical regional - real(r_kind) reradius,g,r_d,r_v,cliq_wrf - - stndrd_atmos_lapse = 0.0065 - -! Define regional constants here - if (regional) then - -! Name given to WRF constants - reradius = one/6370.e03_r_kind - g = 9.81_r_kind - r_d = 287.04_r_kind - r_v = 461.6_r_kind - cliq_wrf = 4190.0_r_kind - cp_mass = 1004.67_r_kind - -! Transfer WRF constants into unified GSI constants - rearth = one/reradius - grav = g - rd = r_d - rv = r_v - cv = cp-r_d - cliq = cliq_wrf - rd_over_cp_mass = rd / cp_mass - -! Define global constants here - else - rearth = 6.3712e+6_r_kind - grav = 9.80665e+0_r_kind - rd = 2.8705e+2_r_kind - rv = 4.6150e+2_r_kind - cv = 7.1760e+2_r_kind - cliq = 4.1855e+3_r_kind - cp_mass= zero - rd_over_cp_mass = zero - endif - - -! Now define derived constants which depend on constants -! which differ between global and regional applications. - -! Constants related to ozone assimilation - ozcon = grav*21.4e-9_r_kind - rozcon= one/ozcon - -! Constant used in vertical integral for precipitable water - tpwcon = 100.0_r_kind/grav - -! Derived atmospheric constants - fv = rv/rd-one ! used in virtual temperature equation - dldt = cvap-cliq - xa = -(dldt/rv) - xai = -(dldti/rv) - xb = xa+hvap/(rv*ttp) - xbi = xai+hsub/(rv*ttp) - eps = rd/rv - epsm1 = rd/rv-one - omeps = one-eps - factor1 = (cvap-cliq)/rv - factor2 = hvap/rv-factor1*t0c - cpr = cp*rd - el2orc = hvap*hvap/(rv*cp) - rd_over_g = rd/grav - rd_over_cp = rd/cp - g_over_rd = grav/rd - - return - end subroutine init_constants - -end module constants diff --git a/sorc/regrid_nemsio.fd/fv3_interface.f90 b/sorc/regrid_nemsio.fd/fv3_interface.f90 deleted file mode 100644 index bbe558e4283..00000000000 --- a/sorc/regrid_nemsio.fd/fv3_interface.f90 +++ /dev/null @@ -1,779 +0,0 @@ -module fv3_interface - - !======================================================================= - - ! 
Define associated modules and subroutines - - !----------------------------------------------------------------------- - - use constants - - !----------------------------------------------------------------------- - - use gfs_nems_interface - use interpolation_interface - use mpi_interface - use namelist_def - use netcdfio_interface - use variable_interface - use nemsio_module - - !----------------------------------------------------------------------- - - implicit none - - !----------------------------------------------------------------------- - - ! Define all data and structure types for routine; these variables - ! are variables required by the subroutines within this module - - type analysis_grid - character(len=500) :: filename - character(len=500) :: filename2d - integer :: nx - integer :: ny - integer :: nz - integer :: ntime - end type analysis_grid ! type analysis_grid - - ! Define global variables - - integer n2dvar,n3dvar,ntvars,nrecs,nvvars - real(nemsio_realkind), dimension(:,:,:,:), allocatable :: fv3_var_3d - real(nemsio_realkind), dimension(:,:,:), allocatable :: fv3_var_2d - - !----------------------------------------------------------------------- - - ! Define interfaces and attributes for module routines - - private - public :: fv3_regrid_nemsio - - !----------------------------------------------------------------------- - -contains - - !----------------------------------------------------------------------- - - subroutine fv3_regrid_nemsio() - - ! 
Define variables computed within routine - - implicit none - type(analysis_grid) :: anlygrd(ngrids) - type(varinfo), allocatable, dimension(:) :: var_info,var_info2d,var_info3d - type(gfs_grid) :: gfs_grid - type(gridvar) :: invar,invar2 - type(gridvar) :: outvar,outvar2 - type(nemsio_meta) :: meta_nemsio2d, meta_nemsio3d - - type(esmfgrid) :: grid_bilin - type(esmfgrid) :: grid_nn - - character(len=20) :: var_name - character(len=20) :: nems_name - character(len=20) :: nems_levtyp - character(len=20) :: itrptyp - logical :: itrp_bilinear - logical :: itrp_nrstnghbr - real(nemsio_realkind), dimension(:,:), allocatable :: workgrid - real(nemsio_realkind), dimension(:), allocatable :: pk - real(nemsio_realkind), dimension(:), allocatable :: bk - real, dimension(:), allocatable :: sendbuffer,recvbuffer - integer :: fhour - integer :: ncoords - integer nems_lev,ndims,istatus,ncol,levs_fix - logical clip - - ! Define counting variables - - integer :: i, j, k, l,nlev,k2,k3,nrec - - !===================================================================== - - ! Define local variables - - call init_constants_derived() - call gfs_grid_initialize(gfs_grid) - - ! Loop through local variables - - if(mpi_procid .eq. mpi_masternode) then - print *,'variable table' - print *,'--------------' - open(912,file=trim(variable_table),form='formatted') - ntvars=0; n2dvar=0; n3dvar=0 - nrecs = 0 - loop_read: do while (istatus == 0) - read(912,199,iostat=istatus) var_name,nems_name,nems_levtyp,nems_lev,itrptyp,clip,ndims - if( istatus /= 0 ) exit loop_read - nrecs = nrecs + 1 - if(var_name(1:1) .ne. 
"#") then - ntvars = ntvars + 1 - ntvars = ntvars + 1 - if (ndims == 2) then - n2dvar = n2dvar+1 - else if (ndims == 3) then - n3dvar = n3dvar+1 - else - print *,'ndims must be 2 or 3 in variable_table.txt' - call mpi_abort(mpi_comm_world,-91,mpi_ierror) - stop - endif - !print *,'ntvars,n2dvar,n3dvar',ntvars,n2dvar,n3dvar - !write(6,199) var_name, nems_name,nems_levtyp,nems_lev,itrptyp,clip,ndims - endif - enddo loop_read - close(912) - print *,'nrecs,ntvars,n2dvar,n3dvar',nrecs,ntvars,n2dvar,n3dvar - endif - call mpi_bcast(nrecs,1,mpi_integer,mpi_masternode,mpi_comm_world,mpi_ierror) - call mpi_bcast(n2dvar,1,mpi_integer,mpi_masternode,mpi_comm_world,mpi_ierror) - call mpi_bcast(n3dvar,1,mpi_integer,mpi_masternode,mpi_comm_world,mpi_ierror) - call mpi_bcast(ntvars,1,mpi_integer,mpi_masternode,mpi_comm_world,mpi_ierror) - if (ntvars == 0) then - print *,'empty variable_table.txt!' - call mpi_interface_terminate() - stop - endif - allocate(var_info(ntvars)) - open(912,file=trim(variable_table),form='formatted') - k = 0 - nvvars = 0 ! number of vector variables - do nrec = 1, nrecs - read(912,199,iostat=istatus) var_name,nems_name,nems_levtyp,nems_lev,itrptyp,clip,ndims - if (var_name(1:1) .ne. "#") then - k = k + 1 - var_info(k)%var_name = var_name - var_info(k)%nems_name = nems_name - var_info(k)%nems_levtyp = nems_levtyp - var_info(k)%nems_lev = nems_lev - var_info(k)%itrptyp = itrptyp - if (itrptyp.EQ.'vector') then - nvvars=nvvars+1 - endif - var_info(k)%clip = clip - var_info(k)%ndims = ndims - if(mpi_procid .eq. mpi_masternode) then - write(6,199) var_info(k)%var_name, var_info(k)%nems_name,var_info(k)%nems_levtyp, & - var_info(k)%nems_lev,var_info(k)%itrptyp,var_info(k)%clip,var_info(k)%ndims - endif - endif - end do ! do k = 1, ntvars - ! 
assume vectors are in pairs - nvvars=nvvars/2 - call mpi_bcast(nvvars,1,mpi_integer,mpi_masternode,mpi_comm_world,mpi_ierror) - close(912) -199 format(a20,1x,a20,1x,a20,1x,i1,1x,a20,1x,l1,1x,i1) - allocate(var_info3d(n3dvar+2)) - allocate(var_info2d(n2dvar)) - k2 = 0 - k3 = 0 - do k=1,ntvars - if (var_info(k)%ndims == 2) then - k2 = k2 + 1 - var_info2d(k2) = var_info(k) - endif - if (var_info(k)%ndims == 3 .or. & - trim(var_info(k)%nems_name) == 'pres' .or. & - trim(var_info(k)%nems_name) == 'orog') then - k3 = k3 + 1 - var_info3d(k3) = var_info(k) - ! orography called 'hgt' in 3d file, not 'orog' - if (trim(var_info(k)%nems_name) == 'orog') then - var_info3d(k3)%nems_name = 'hgt ' - endif - endif - enddo - - - do i = 1, ngrids - anlygrd(i)%filename = analysis_filename(i) - anlygrd(i)%filename2d = analysis_filename2d(i) - call fv3_regrid_initialize(anlygrd(i)) - end do ! do i = 1, ngrids - - ! Define local variables - - ncxdim = anlygrd(1)%nx - ncydim = anlygrd(1)%ny - if (n3dvar > 0) then - nczdim = anlygrd(1)%nz - else - nczdim = 0 - endif - nctdim = anlygrd(1)%ntime - ncoords = ncxdim*ncydim - invar%ncoords = ncoords*ngrids - invar2%ncoords = ncoords*ngrids - outvar%ncoords = gfs_grid%ncoords - outvar2%ncoords = gfs_grid%ncoords - call interpolation_initialize_gridvar(invar) - call interpolation_initialize_gridvar(invar2) - call interpolation_initialize_gridvar(outvar) - call interpolation_initialize_gridvar(outvar2) - meta_nemsio3d%modelname = 'GFS' - meta_nemsio3d%version = 200509 - meta_nemsio3d%nrec = 2 + nczdim*n3dvar - meta_nemsio3d%nmeta = 5 - meta_nemsio3d%nmetavari = 3 - meta_nemsio3d%nmetaaryi = 1 - meta_nemsio3d%dimx = gfs_grid%nlons - meta_nemsio3d%dimy = gfs_grid%nlats - meta_nemsio3d%dimz = nczdim - meta_nemsio3d%jcap = ntrunc - meta_nemsio3d%nsoil = 4 - meta_nemsio3d%nframe = 0 - meta_nemsio3d%ntrac = 3 - meta_nemsio3d%idrt = 4 - meta_nemsio3d%ncldt = 3 - meta_nemsio3d%idvc = 2 - meta_nemsio3d%idvm = 2 - meta_nemsio3d%idsl = 1 - 
meta_nemsio3d%idate(1:6) = 0 - meta_nemsio3d%idate(7) = 1 - read(forecast_timestamp(9:10),'(i2)') meta_nemsio3d%idate(4) - read(forecast_timestamp(7:8), '(i2)') meta_nemsio3d%idate(3) - read(forecast_timestamp(5:6), '(i2)') meta_nemsio3d%idate(2) - read(forecast_timestamp(1:4), '(i4)') meta_nemsio3d%idate(1) - meta_nemsio2d = meta_nemsio3d - meta_nemsio2d%nrec = n2dvar - call mpi_barrier(mpi_comm_world,mpi_ierror) - call gfs_nems_meta_initialization(meta_nemsio2d,var_info2d,gfs_grid) - call gfs_nems_meta_initialization(meta_nemsio3d,var_info3d,gfs_grid) - - ! Allocate memory for local variables - - if(.not. allocated(fv3_var_2d) .and. n2dvar > 0) & - & allocate(fv3_var_2d(ngrids,ncxdim,ncydim)) - if (mpi_nprocs /= nczdim+1) then - call mpi_barrier(mpi_comm_world, mpi_ierror) - if (mpi_procid .eq. mpi_masternode) then - print *,'number of mpi tasks must be equal to number of levels + 1' - print *,'mpi procs = ',mpi_nprocs,' levels = ',nczdim - endif - call mpi_interface_terminate() - stop - endif - !print *,'allocate fv3_var_3d',ngrids,ncxdim,ncydim,nczdim,mpi_procid - if(.not. allocated(fv3_var_3d) .and. n3dvar > 0) & - & allocate(fv3_var_3d(ngrids,ncxdim,ncydim,nczdim)) - !print *,'done allocating fv3_var_3d',ngrids,ncxdim,ncydim,nczdim,mpi_procid - - ! Check local variable and proceed accordingly - - call mpi_barrier(mpi_comm_world,mpi_ierror) - if(mpi_procid .eq. mpi_masternode) then - - ! Allocate memory for local variables - - if (n3dvar > 0) then - if(.not. allocated(pk)) allocate(pk(nczdim+1)) - if(.not. allocated(bk)) allocate(bk(nczdim+1)) - - ! 
Define local variables - - if (trim(gfs_hyblevs_filename) == 'NOT USED' ) then - call netcdfio_values_1d(anlygrd(1)%filename,'pk',pk) - call netcdfio_values_1d(anlygrd(1)%filename,'bk',bk) - else - open(913,file=trim(gfs_hyblevs_filename),form='formatted') - read(913,*) ncol, levs_fix - if (levs_fix /= (nczdim+1) ) then - call mpi_barrier(mpi_comm_world, mpi_ierror) - print *,'levs in ', trim(gfs_hyblevs_filename), ' not equal to',(nczdim+1) - call mpi_interface_terminate() - stop - endif - do k=nczdim+1,1,-1 - read(913,*) pk(k),bk(k) - enddo - close(913) - endif - if (minval(pk) < -1.e10 .or. minval(bk) < -1.e10) then - print *,'pk,bk not found in netcdf file..' - meta_nemsio3d%vcoord = -9999._nemsio_realkind - else - ! Loop through local variable - - do k = 1, nczdim + 1 - - ! Define local variables - - meta_nemsio3d%vcoord((nczdim + 1) - k + 1,1,1) = pk(k) - meta_nemsio3d%vcoord((nczdim + 1) - k + 1,2,1) = bk(k) - - end do ! do k = 1, nczdim + 1 - endif - endif - - end if ! if(mpi_procid .eq. mpi_masternode) - - ! initialize/read in interpolation weight - - grid_bilin%filename = esmf_bilinear_filename - call interpolation_initialize_esmf(grid_bilin) - - grid_nn%filename = esmf_neareststod_filename - call interpolation_initialize_esmf(grid_nn) - - do l = 1, nctdim - - ncrec = l ! time level to read from netcdf file - - ! Define local variables - - call fv3_grid_fhour(anlygrd(1),meta_nemsio2d%nfhour) - call fv3_grid_fhour(anlygrd(1),meta_nemsio3d%nfhour) - meta_nemsio3d%nfminute = int(0.0) - meta_nemsio3d%nfsecondn = int(0.0) - meta_nemsio3d%nfsecondd = int(1.0) - meta_nemsio3d%fhour = meta_nemsio3d%nfhour - meta_nemsio2d%nfminute = int(0.0) - meta_nemsio2d%nfsecondn = int(0.0) - meta_nemsio2d%nfsecondd = int(1.0) - meta_nemsio2d%fhour = meta_nemsio2d%nfhour - - ! initialize nemsio file. - if(mpi_procid .eq. mpi_masternode) then - call gfs_nems_initialize(meta_nemsio2d, meta_nemsio3d) - end if - - ! wait here. - call mpi_barrier(mpi_comm_world,mpi_ierror) - - ! 
Loop through local variables - k2=1 - do k = 1, ntvars - nvvars - - ! Define local variables - - itrp_bilinear = .false. - itrp_nrstnghbr = .false. - - ! Do 2D variables. - - if(var_info(k2)%ndims .eq. 2) then - - ! Check local variable and proceed accordingly - - if(mpi_procid .eq. mpi_masternode) then - - ! Check local variable and proceed accordingly - - call fv3_grid_read(anlygrd(1:ngrids), var_info(k2)%var_name,.true.,.false.) - - call interpolation_define_gridvar(invar,ncxdim,ncydim, ngrids,fv3_var_2d) - if (trim(var_info(k2)%nems_name) == 'pres') then - ! interpolate in exner(pressure) - invar%var = (invar%var/stndrd_atmos_ps)**(rd_over_g*stndrd_atmos_lapse) - end if - - if(var_info(k2)%itrptyp .eq. 'bilinear') then - call interpolation_esmf(invar,outvar,grid_bilin, .false.) - end if - - if(var_info(k2)%itrptyp .eq. 'nrstnghbr') then - call interpolation_esmf(invar,outvar,grid_nn, .true.) - end if - - if (trim(var_info(k2)%nems_name) == 'pres') then - outvar%var = stndrd_atmos_ps*(outvar%var**(g_over_rd/stndrd_atmos_lapse)) - end if - - if(var_info(k2)%itrptyp .eq. 'vector') then - ! read in u winds - call fv3_grid_read(anlygrd(1:ngrids), var_info(k2)%var_name,.true.,.false.) - call interpolation_define_gridvar(invar,ncxdim,ncydim,ngrids,fv3_var_2d) - ! read in v winds - call fv3_grid_read(anlygrd(1:ngrids), var_info(k2+1)%var_name,.true.,.false.) - call interpolation_define_gridvar(invar2,ncxdim,ncydim,ngrids,fv3_var_2d) - call interpolation_esmf_vect(invar,invar2,grid_bilin,outvar,outvar2) - end if - - ! Clip variable to zero if desired. - if(var_info(k2)%clip) call variable_clip(outvar%var) - - ! Write to NEMSIO file. - call gfs_nems_write('2d',real(outvar%var), & - var_info(k2)%nems_name,var_info(k2)%nems_levtyp,var_info(k2)%nems_lev) - if (trim(var_info(k2)%nems_name) == 'pres' .or. & - trim(var_info(k2)%nems_name) == 'orog' .or. & - trim(var_info(k2)%nems_name) == 'hgt') then - ! write surface height and surface pressure to 3d file. - ! 
(surface height called 'orog' in nemsio bin4, 'hgt' in - ! grib) - if (trim(var_info(k2)%nems_name) == 'orog') then - call gfs_nems_write('3d',real(outvar%var), & - 'hgt ',var_info(k2)%nems_levtyp,1) - else - call gfs_nems_write('3d',real(outvar%var), & - var_info(k2)%nems_name,var_info(k2)%nems_levtyp,1) - endif - endif - if(var_info(k2)%itrptyp .eq. 'vector') then ! write v winds - call gfs_nems_write('2d',real(outvar2%var), & - var_info(k2+1)%nems_name,var_info(k2+1)%nems_levtyp,var_info(k2+1)%nems_lev) - endif - end if ! if(mpi_procid .eq. mpi_masternode) - - ! Define local variables - call mpi_barrier(mpi_comm_world,mpi_ierror) - - end if ! if(var_info(k2)%ndims .eq. 2) - - ! Do 3D variables. - - if(var_info(k2)%ndims .eq. 3) then - - ! read 3d grid on master node, send to other tasks - if(mpi_procid .eq. mpi_masternode) then - call fv3_grid_read(anlygrd(1:ngrids), var_info(k2)%var_name,.false.,.true.) - do nlev=1,nczdim - call mpi_send(fv3_var_3d(1,1,1,nlev),ngrids*ncxdim*ncydim,mpi_real,& - nlev,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - enddo - if(trim(adjustl(var_info(k2)%itrptyp)) .eq. 'vector') then ! winds - call mpi_barrier(mpi_comm_world,mpi_ierror) - call fv3_grid_read(anlygrd(1:ngrids), var_info(k2+1)%var_name,.false.,.true.) - do nlev=1,nczdim - call mpi_send(fv3_var_3d(1,1,1,nlev),ngrids*ncxdim*ncydim,mpi_real,& - nlev,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - enddo - endif - else if (mpi_procid .le. nczdim) then - ! do interpolation, one level on each task. - call mpi_recv(fv3_var_3d(1,1,1,mpi_procid),ngrids*ncxdim*ncydim,mpi_real,& - 0,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - - call interpolation_define_gridvar(invar,ncxdim,ncydim, ngrids,fv3_var_3d(:,:,:,mpi_procid)) - - if(var_info(k2)%itrptyp .eq. 'bilinear') then - call interpolation_esmf(invar,outvar,grid_bilin, .false.) - end if ! if(var_info(k2)%itrptyp .eq. 'bilinear') - - if(var_info(k2)%itrptyp .eq. 
'nrstnghbr') then - call interpolation_esmf(invar,outvar,grid_nn, .true.) - end if ! if(var_info(k2)%itrptyp .eq. 'nrstnghbr') - - if(trim(adjustl(var_info(k2)%itrptyp)) .eq. 'vector') then ! winds - call mpi_barrier(mpi_comm_world,mpi_ierror) - call mpi_recv(fv3_var_3d(1,1,1,mpi_procid),ngrids*ncxdim*ncydim,mpi_real,& - 0,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - call interpolation_define_gridvar(invar2,ncxdim,ncydim,ngrids,fv3_var_3d(:,:,:,mpi_procid)) - call interpolation_esmf_vect(invar,invar2,grid_bilin,outvar,outvar2) - endif - - if(var_info(k2)%clip) call variable_clip(outvar%var(:)) - - end if ! if(mpi_procid .ne. mpi_masternode .and. & - ! mpi_procid .le. nczdim) - - ! gather results back on root node to write out. - - if (mpi_procid == mpi_masternode) then - ! receive one level of interpolated data on root task. - if (.not. allocated(workgrid)) allocate(workgrid(gfs_grid%ncoords,nczdim)) - if (.not. allocated(recvbuffer)) allocate(recvbuffer(gfs_grid%ncoords)) - do nlev=1,nczdim - call mpi_recv(recvbuffer,gfs_grid%ncoords,mpi_real,& - nlev,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - workgrid(:,nlev) = recvbuffer - enddo - deallocate(recvbuffer) - else - ! send one level of interpolated data to root task. - if (.not. allocated(sendbuffer)) allocate(sendbuffer(gfs_grid%ncoords)) - sendbuffer(:) = outvar%var(:) - call mpi_send(sendbuffer,gfs_grid%ncoords,mpi_real,& - 0,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - endif - - ! Write to NEMSIO file. - - if(mpi_procid .eq. mpi_masternode) then - - ! Loop through local variable - - do j = 1, nczdim - - ! Define local variables - - call gfs_nems_write('3d',workgrid(:,nczdim - j + 1), & - & var_info(k2)%nems_name,var_info(k2)%nems_levtyp, & - & j) - - end do ! do j = 1, nczdim - - end if ! if(mpi_procid .eq. mpi_masternode) - - if(trim(adjustl(var_info(k2)%itrptyp)) .eq. 'vector') then ! winds - if (mpi_procid == mpi_masternode) then - ! receive one level of interpolated data on root task. - if (.not. 
allocated(workgrid)) allocate(workgrid(gfs_grid%ncoords,nczdim)) - if (.not. allocated(recvbuffer)) allocate(recvbuffer(gfs_grid%ncoords)) - do nlev=1,nczdim - call mpi_recv(recvbuffer,gfs_grid%ncoords,mpi_real,& - nlev,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - workgrid(:,nlev) = recvbuffer - enddo - deallocate(recvbuffer) - else - ! send one level of interpolated data to root task. - if (.not. allocated(sendbuffer)) allocate(sendbuffer(gfs_grid%ncoords)) - sendbuffer(:) = outvar2%var(:) - call mpi_send(sendbuffer,gfs_grid%ncoords,mpi_real,& - 0,1,mpi_comm_world,mpi_errorstatus,mpi_ierror) - endif - - ! Write to NEMSIO file. - - if(mpi_procid .eq. mpi_masternode) then - - do j = 1, nczdim - - call gfs_nems_write('3d',workgrid(:,nczdim - j + 1), & - & var_info(k2+1)%nems_name,var_info(k2+1)%nems_levtyp, & - & j) - end do ! do j = 1, nczdim - - end if ! if(mpi_procid .eq. mpi_masternode) - endif - - ! wait here - - call mpi_barrier(mpi_comm_world,mpi_ierror) - - end if ! if(var_info(k2)%ndims .eq. 3) - if(var_info(k2)%itrptyp .eq. 'vector') then ! skip v record here - k2=k2+1 - endif - k2=k2+1 - end do ! do k = 1, ntvars - - ! Wait here. - - call mpi_barrier(mpi_comm_world,mpi_ierror) - - ! Finalize and cleanup - - if(mpi_procid .eq. mpi_masternode) then - call gfs_nems_finalize() - end if - call mpi_barrier(mpi_comm_world,mpi_ierror) - if(allocated(workgrid)) deallocate(workgrid) - - end do ! do l = 1, nctdim - - -!===================================================================== - - end subroutine fv3_regrid_nemsio - - !======================================================================= - - ! fv3_regrid_initialize.f90: - - !----------------------------------------------------------------------- - - subroutine fv3_regrid_initialize(grid) - - ! Define variables passed to routine - - implicit none - type(analysis_grid) :: grid - - !===================================================================== - - ! 
Define local variables - - call netcdfio_dimension(grid%filename,'grid_xt',grid%nx) - call netcdfio_dimension(grid%filename,'grid_yt',grid%ny) - if (n3dvar > 0) then - call netcdfio_dimension(grid%filename,'pfull',grid%nz) - else - grid%nz = 0 - endif - call netcdfio_dimension(grid%filename,'time',grid%ntime) - - !===================================================================== - - end subroutine fv3_regrid_initialize - - !======================================================================= - - ! fv3_grid_read.f90: - - !----------------------------------------------------------------------- - - subroutine fv3_grid_read(anlygrd,varname,is_2d,is_3d) - - ! Define variables passed to subroutine - - type(analysis_grid) :: anlygrd(ngrids) - character(len=20) :: varname - logical :: is_2d - logical :: is_3d - - ! Define counting variables - - integer :: i, j, k - - !===================================================================== - - ! Loop through local variable - - do k = 1, ngrids - - ! Check local variable and proceed accordingly - - if(debug) write(6,500) ncrec, k - if(is_2d) then - - ! Define local variables - - ! orog and psfc are in 3d file. - if (trim(varname) == 'orog' .or. trim(varname) == 'psfc') then - call netcdfio_values_2d(anlygrd(k)%filename,varname, & - & fv3_var_2d(k,:,:)) - else - call netcdfio_values_2d(anlygrd(k)%filename2d,varname, & - & fv3_var_2d(k,:,:)) - endif - - end if ! if(is_2d) - - ! Check local variable and proceed accordingly - - if(is_3d) then - - ! Define local variables - - call netcdfio_values_3d(anlygrd(k)%filename,varname, & - & fv3_var_3d(k,:,:,:)) - - end if ! if(is_3d) - - end do ! do k = 1, ngrids - - !===================================================================== - - ! 
Define format statements - -500 format('FV3_GRID_READ: Time record = ', i6, '; Cubed sphere face = ', & - & i1,'.') - - !===================================================================== - - end subroutine fv3_grid_read - - !======================================================================= - - ! fv3_grid_fhour.f90: - - !----------------------------------------------------------------------- - - subroutine fv3_grid_fhour(grid,fhour) - - ! Define variables passed to routine - - implicit none - type(analysis_grid) :: grid - integer :: fhour - - ! Define variables computed within routine - - real(nemsio_realkind) :: workgrid(grid%ntime) - real(nemsio_realkind) :: start_jday - real(nemsio_realkind) :: fcst_jday - integer :: year - integer :: month - integer :: day - integer :: hour - integer :: minute - integer :: second, iw3jdn - character(len=80) timeunits - - !===================================================================== - - ! Define local variables - - read(forecast_timestamp(1:4), '(i4)') year - read(forecast_timestamp(5:6), '(i2)') month - read(forecast_timestamp(7:8), '(i2)') day - read(forecast_timestamp(9:10),'(i2)') hour - minute = 0; second = 0 - - ! Compute local variables - - ! 'flux day' (days since dec 31 1900) - !call date2wnday(start_jday,year,month,day) - ! same as above, but valid after 2099 - start_jday=real(iw3jdn(year,month,day)-iw3jdn(1900,12,31)) - start_jday = start_jday + real(hour)/24.0 + real(minute)/1440.0 + & - & real(second)/86400.0 - - ! Define local variables - - call netcdfio_values_1d(grid%filename,'time',workgrid) - call netcdfio_variable_attr(grid%filename,'time','units',timeunits) - - ! Compute local variables - - ! ncrec is a global variable in the netcdfio-interface module - if (timeunits(1:4) == 'days') then - fcst_jday = start_jday + workgrid(ncrec) - else if (timeunits(1:5) == 'hours') then - fcst_jday = start_jday + workgrid(ncrec)/24. 
- else if (timeunits(1:7) == 'seconds') then - fcst_jday = start_jday + workgrid(ncrec)/86400.0 - else - print *,'unrecognized time units',trim(timeunits) - call mpi_interface_terminate() - stop - endif - fhour = nint((86400*(fcst_jday - start_jday))/3600.0) - - !===================================================================== - - end subroutine fv3_grid_fhour - -! SUBROUTINE DATE2WNDAY(WDAY, IYR,MON,IDY) -! IMPLICIT NONE -! INTEGER IYR,MON,IDY -! REAL WDAY -!! -!!********** -!!* -!! 1) CONVERT DATE INTO 'FLUX DAY'. -!! -!! 2) THE 'FLUX DAY' IS THE NUMBER OF DAYS SINCE 001/1901 (WHICH IS -!! FLUX DAY 1.0). -!! FOR EXAMPLE: -!! A) IYR=1901,MON=1,IDY=1, REPRESENTS 0000Z HRS ON 01/01/1901 -!! SO WDAY WOULD BE 1.0. -!! A) IYR=1901,MON=1,IDY=2, REPRESENTS 0000Z HRS ON 02/01/1901 -!! SO WDAY WOULD BE 2.0. -!! YEAR MUST BE NO LESS THAN 1901.0, AND NO GREATER THAN 2099.0. -!! NOTE THAT YEAR 2000 IS A LEAP YEAR (BUT 1900 AND 2100 ARE NOT). -!! -!! 3) ALAN J. WALLCRAFT, NAVAL RESEARCH LABORATORY, JULY 2002. -!!* -!!********** -!! -! INTEGER NLEAP -! REAL WDAY1 -! REAL MONTH(13) -! DATA MONTH / 0, 31, 59, 90, 120, 151, 181, & -! 212, 243, 273, 304, 334, 365 / -!! FIND THE RIGHT YEAR. -! NLEAP = (IYR-1901)/4 -! WDAY = 365.0*(IYR-1901) + NLEAP + MONTH(MON) + IDY -! IF (MOD(IYR,4).EQ.0 .AND. MON.GT.2) THEN -! WDAY = WDAY + 1.0 -! ENDIF -! END SUBROUTINE DATE2WNDAY - - !======================================================================= - -end module fv3_interface diff --git a/sorc/regrid_nemsio.fd/gfs_nems_interface.f90 b/sorc/regrid_nemsio.fd/gfs_nems_interface.f90 deleted file mode 100644 index aa1305dc017..00000000000 --- a/sorc/regrid_nemsio.fd/gfs_nems_interface.f90 +++ /dev/null @@ -1,595 +0,0 @@ -module gfs_nems_interface - - !======================================================================= - - ! 
Define associated modules and subroutines - - !----------------------------------------------------------------------- - - use constants - use kinds - - !----------------------------------------------------------------------- - - use interpolation_interface - use mpi_interface - use namelist_def - use nemsio_module - use netcdfio_interface - use variable_interface - - !----------------------------------------------------------------------- - - implicit none - - !----------------------------------------------------------------------- - - ! Define all data and structure types for routine; these variables - ! are variables required by the subroutines within this module - - type gfs_grid - real(r_kind), dimension(:,:), allocatable :: rlon - real(r_kind), dimension(:,:), allocatable :: rlat - integer :: ncoords - integer :: nlons - integer :: nlats - integer :: nz - end type gfs_grid ! type gfs_grid - - type nemsio_meta - character(nemsio_charkind), dimension(:), allocatable :: recname - character(nemsio_charkind), dimension(:), allocatable :: reclevtyp - character(nemsio_charkind), dimension(:), allocatable :: variname - character(nemsio_charkind), dimension(:), allocatable :: varr8name - character(nemsio_charkind), dimension(:), allocatable :: aryiname - character(nemsio_charkind), dimension(:), allocatable :: aryr8name - character(nemsio_charkind8) :: gdatatype - character(nemsio_charkind8) :: modelname - real(nemsio_realkind), dimension(:,:,:), allocatable :: vcoord - real(nemsio_realkind), dimension(:), allocatable :: lon - real(nemsio_realkind), dimension(:), allocatable :: lat - integer(nemsio_intkind), dimension(:,:), allocatable :: aryival - integer(nemsio_intkind), dimension(:), allocatable :: reclev - integer(nemsio_intkind), dimension(:), allocatable :: varival - integer(nemsio_intkind), dimension(:), allocatable :: aryilen - integer(nemsio_intkind), dimension(:), allocatable :: aryr8len - integer(nemsio_intkind) :: idate(7) - integer(nemsio_intkind) :: 
-     integer(nemsio_intkind) :: version
-     integer(nemsio_intkind) :: nreo_vc
-     integer(nemsio_intkind) :: nrec
-     integer(nemsio_intkind) :: nmeta
-     integer(nemsio_intkind) :: nmetavari
-     integer(nemsio_intkind) :: nmetaaryi
-     integer(nemsio_intkind) :: nfhour
-     integer(nemsio_intkind) :: nfminute
-     integer(nemsio_intkind) :: nfsecondn
-     integer(nemsio_intkind) :: nfsecondd
-     integer(nemsio_intkind) :: jcap
-     integer(nemsio_intkind) :: dimx
-     integer(nemsio_intkind) :: dimy
-     integer(nemsio_intkind) :: dimz
-     integer(nemsio_intkind) :: nframe
-     integer(nemsio_intkind) :: nsoil
-     integer(nemsio_intkind) :: ntrac
-     integer(nemsio_intkind) :: ncldt
-     integer(nemsio_intkind) :: idvc
-     integer(nemsio_intkind) :: idsl
-     integer(nemsio_intkind) :: idvm
-     integer(nemsio_intkind) :: idrt
-     integer(nemsio_intkind) :: fhour
-  end type nemsio_meta ! type nemsio_meta
-
-  !-----------------------------------------------------------------------
-
-  ! Define global variables
-
-  type(nemsio_gfile) :: gfile2d,gfile3d
-  integer :: nemsio_iret
-
-  !-----------------------------------------------------------------------
-
-  ! Define interfaces and attributes for module routines
-
-  private
-  public :: gfs_grid_initialize
-  public :: gfs_grid_cleanup
-  public :: gfs_grid
-  public :: gfs_nems_meta_initialization
-  public :: gfs_nems_meta_cleanup
-  public :: gfs_nems_initialize
-  public :: gfs_nems_finalize
-  public :: gfs_nems_write
-  public :: nemsio_meta
-
-contains
-
-  !=======================================================================
-
-  ! gfs_nems_write.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_nems_write(c2dor3d,nems_data,nems_varname,nems_levtyp,nems_lev)
-
-    ! Define variables passed to routine
-
-    character(nemsio_charkind) :: nems_varname
-    character(nemsio_charkind) :: nems_levtyp
-    real(nemsio_realkind) :: nems_data(:)
-    integer(nemsio_intkind) :: nems_lev
-    character(len=2) :: c2dor3d
-
-    !=====================================================================
-
-    ! Define local variables
-
-    if (c2dor3d == '2d') then
-       call nemsio_writerecv(gfile2d,trim(adjustl(nems_varname)),levtyp= &
-            & trim(adjustl(nems_levtyp)),lev=nems_lev,data=nems_data, &
-            & iret=nemsio_iret)
-    else if (c2dor3d == '3d') then
-       call nemsio_writerecv(gfile3d,trim(adjustl(nems_varname)),levtyp= &
-            & trim(adjustl(nems_levtyp)),lev=nems_lev,data=nems_data, &
-            & iret=nemsio_iret)
-    else
-       nemsio_iret=-99
-    endif
-
-    ! Check local variable and proceed accordingly
-
-    if(debug) write(6,500) c2dor3d,trim(adjustl(nems_varname)), nemsio_iret, &
-         & nems_lev, minval(nems_data), maxval(nems_data)
-
-    !=====================================================================
-
-    ! Define format statements
-
-500 format('GFS_NEMS_WRITE',a2,': NEMS I/O name = ', a, '; writerecv return ', &
-         & 'code = ', i5,'; level = ', i3, '; (min,max) = (', f13.5,f13.5, &
-         & ').')
-    if (nemsio_iret /= 0) then
-       print *,'nemsio_writerecv failed, stopping...'
-       call mpi_interface_terminate()
-       stop
-    endif
-
-    !=====================================================================
-
-  end subroutine gfs_nems_write
-
-  !=======================================================================
-
-  ! gfs_nems_meta_initialization.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_nems_meta_initialization(meta_nemsio,var_info,grid)
-
-    ! Define variables passed to routine
-
-    type(nemsio_meta) :: meta_nemsio
-    type(varinfo) :: var_info(:)
-    type(gfs_grid) :: grid
-
-    ! Define variables computed within routine
-
-    integer :: offset
-    integer :: n2dvar
-    integer :: n3dvar
-
-    ! Define counting variables
-
-    integer :: i, j, k
-
-    !=====================================================================
-
-    ! Allocate memory for local variables
-
-    if(.not. allocated(meta_nemsio%recname)) &
-         & allocate(meta_nemsio%recname(meta_nemsio%nrec))
-    if(.not. allocated(meta_nemsio%reclevtyp)) &
-         & allocate(meta_nemsio%reclevtyp(meta_nemsio%nrec))
-    if(.not. allocated(meta_nemsio%reclev)) &
-         & allocate(meta_nemsio%reclev(meta_nemsio%nrec))
-    if(.not. allocated(meta_nemsio%variname)) &
-         & allocate(meta_nemsio%variname(meta_nemsio%nmetavari))
-    if(.not. allocated(meta_nemsio%varival)) &
-         & allocate(meta_nemsio%varival(meta_nemsio%nmetavari))
-    if(.not. allocated(meta_nemsio%aryiname)) &
-         & allocate(meta_nemsio%aryiname(meta_nemsio%nmetaaryi))
-    if(.not. allocated(meta_nemsio%aryilen)) &
-         & allocate(meta_nemsio%aryilen(meta_nemsio%nmetaaryi))
-    if(.not. allocated(meta_nemsio%vcoord)) &
-         & allocate(meta_nemsio%vcoord(meta_nemsio%dimz+1,3,2))
-    if(.not. allocated(meta_nemsio%aryival)) &
-         & allocate(meta_nemsio%aryival(grid%nlats/2, &
-         & meta_nemsio%nmetaaryi))
-    if(.not. allocated(meta_nemsio%lon)) &
-         & allocate(meta_nemsio%lon(grid%ncoords))
-    if(.not. allocated(meta_nemsio%lat)) &
-         & allocate(meta_nemsio%lat(grid%ncoords))
-    meta_nemsio%vcoord(:,:,:)=0.0
-    ! Define local variables
-
-    meta_nemsio%lon = &
-         & reshape(grid%rlon,(/grid%ncoords/))
-    meta_nemsio%lat = &
-         & reshape(grid%rlat,(/grid%ncoords/))
-    meta_nemsio%aryilen(1) = grid%nlats/2
-    meta_nemsio%aryiname(1) = 'lpl'
-    meta_nemsio%aryival(1:grid%nlats/2,1) = grid%nlons
-    k = 0
-
-    ! Loop through local variable
-    offset = 0
-    n3dvar = 0
-    n2dvar = 0
-
-
-    do i = 1, size(var_info)
-
-       ! Check local variable and proceed accordingly
-
-       if(var_info(i)%ndims .eq. 2) then
-
-          ! Define local variables
-
-          k = k + 1
-          meta_nemsio%reclev(k) = var_info(i)%nems_lev
-          meta_nemsio%recname(k) = trim(adjustl(var_info(i)%nems_name))
-          meta_nemsio%reclevtyp(k) = trim(adjustl(var_info(i)%nems_levtyp))
-          n2dvar = k
-
-       else if(var_info(i)%ndims .eq. 3) then
-
-          ! Loop through local variable
-
-          meta_nemsio%variname(1) = 'LEVS'
-          meta_nemsio%varival(1) = meta_nemsio%dimz
-          meta_nemsio%variname(2) = 'NVCOORD'
-          meta_nemsio%varival(2) = 2
-          meta_nemsio%variname(3) = 'IVS'
-          meta_nemsio%varival(3) = 200509
-          do k = 1, meta_nemsio%dimz
-
-             ! Define local variables
-
-             meta_nemsio%reclev(k+n2dvar+offset) = k
-             meta_nemsio%recname(k+n2dvar+offset) = &
-                  & trim(adjustl(var_info(i)%nems_name))
-             meta_nemsio%reclevtyp(k+n2dvar+offset) = &
-                  & trim(adjustl(var_info(i)%nems_levtyp))
-
-          end do ! do k = 1, nczdim
-
-          ! Define local variables
-
-          n3dvar = n3dvar + 1
-          offset = nczdim*n3dvar
-
-       end if ! if(var_info(i)%ndims .eq. 3)
-
-    end do ! do i = 1, size(var_info)
-
-    !=====================================================================
-
-  end subroutine gfs_nems_meta_initialization
-
-  !=======================================================================
-
-  ! gfs_nems_meta_cleanup.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_nems_meta_cleanup(meta_nemsio2d,meta_nemsio3d)
-
-    ! Define variables passed to routine
-
-    type(nemsio_meta) :: meta_nemsio2d,meta_nemsio3d
-
-    !=====================================================================
-
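The record bookkeeping in `gfs_nems_meta_initialization` above lays out NEMSIO metadata so that 2-D records occupy the leading slots and each 3-D variable then contributes one record per model level. A minimal Python sketch of that layout (function and sample names are illustrative, not part of the Fortran source; for brevity the 2-D level is fixed at 1, whereas the Fortran keeps each variable's own `nems_lev`):

```python
# Sketch of the NEMSIO record layout built above: 2-D records first,
# then dimz records (one per level) for every 3-D variable.
def build_records(var_info, dimz):
    recname, reclevtyp, reclev = [], [], []
    for name, levtyp, ndims in var_info:
        if ndims == 2:
            recname.append(name)
            reclevtyp.append(levtyp)
            reclev.append(1)          # the Fortran uses the var's nems_lev
        elif ndims == 3:
            for k in range(1, dimz + 1):
                recname.append(name)
                reclevtyp.append(levtyp)
                reclev.append(k)
    return recname, reclevtyp, reclev

names, types, levs = build_records(
    [("hgt", "sfc", 2), ("tmp", "mid layer", 3)], dimz=3)
```

The same indexing appears in the Fortran as `k + n2dvar + offset`, where `offset` advances by `dimz` for each completed 3-D variable.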
-    ! Deallocate memory for local variables
-
-    if(allocated(meta_nemsio2d%recname)) &
-         & deallocate(meta_nemsio2d%recname)
-    if(allocated(meta_nemsio2d%reclevtyp)) &
-         & deallocate(meta_nemsio2d%reclevtyp)
-    if(allocated(meta_nemsio2d%reclev)) &
-         & deallocate(meta_nemsio2d%reclev)
-    if(allocated(meta_nemsio2d%variname)) &
-         & deallocate(meta_nemsio2d%variname)
-    if(allocated(meta_nemsio2d%aryiname)) &
-         & deallocate(meta_nemsio2d%aryiname)
-    if(allocated(meta_nemsio2d%aryival)) &
-         & deallocate(meta_nemsio2d%aryival)
-    if(allocated(meta_nemsio2d%aryilen)) &
-         & deallocate(meta_nemsio2d%aryilen)
-    if(allocated(meta_nemsio2d%vcoord)) &
-         & deallocate(meta_nemsio2d%vcoord)
-    if(allocated(meta_nemsio2d%lon)) &
-         & deallocate(meta_nemsio2d%lon)
-    if(allocated(meta_nemsio2d%lat)) &
-         & deallocate(meta_nemsio2d%lat)
-    if(allocated(meta_nemsio3d%recname)) &
-         & deallocate(meta_nemsio3d%recname)
-    if(allocated(meta_nemsio3d%reclevtyp)) &
-         & deallocate(meta_nemsio3d%reclevtyp)
-    if(allocated(meta_nemsio3d%reclev)) &
-         & deallocate(meta_nemsio3d%reclev)
-    if(allocated(meta_nemsio3d%variname)) &
-         & deallocate(meta_nemsio3d%variname)
-    if(allocated(meta_nemsio3d%aryiname)) &
-         & deallocate(meta_nemsio3d%aryiname)
-    if(allocated(meta_nemsio3d%aryival)) &
-         & deallocate(meta_nemsio3d%aryival)
-    if(allocated(meta_nemsio3d%aryilen)) &
-         & deallocate(meta_nemsio3d%aryilen)
-    if(allocated(meta_nemsio3d%vcoord)) &
-         & deallocate(meta_nemsio3d%vcoord)
-    if(allocated(meta_nemsio3d%lon)) &
-         & deallocate(meta_nemsio3d%lon)
-    if(allocated(meta_nemsio3d%lat)) &
-         & deallocate(meta_nemsio3d%lat)
-
-    !=====================================================================
-
-  end subroutine gfs_nems_meta_cleanup
-
-  !=======================================================================
-
-  ! gfs_nems_initialize.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_nems_initialize(meta_nemsio2d, meta_nemsio3d)
-
-    ! Define variables passed to routine
-
-    type(nemsio_meta) :: meta_nemsio2d,meta_nemsio3d
-    character(len=500) :: filename
-    character(len=7) :: suffix
-
-    !=====================================================================
-
-    ! Define local variables
-
-    call nemsio_init(iret=nemsio_iret)
-    write(suffix,500) meta_nemsio2d%nfhour
-    filename = trim(adjustl(datapathout2d))//suffix
-    meta_nemsio2d%gdatatype = trim(adjustl(nemsio_opt2d))
-    meta_nemsio3d%gdatatype = trim(adjustl(nemsio_opt3d))
-    call nemsio_open(gfile2d,trim(adjustl(filename)),'write', &
-         & iret=nemsio_iret, &
-         & modelname=trim(adjustl(meta_nemsio2d%modelname)), &
-         & version=meta_nemsio2d%version, &
-         & gdatatype=meta_nemsio2d%gdatatype, &
-         & jcap=meta_nemsio2d%jcap, &
-         & dimx=meta_nemsio2d%dimx, &
-         & dimy=meta_nemsio2d%dimy, &
-         & dimz=meta_nemsio2d%dimz, &
-         & idate=meta_nemsio2d%idate, &
-         & nrec=meta_nemsio2d%nrec, &
-         & nframe=meta_nemsio2d%nframe, &
-         & idrt=meta_nemsio2d%idrt, &
-         & ncldt=meta_nemsio2d%ncldt, &
-         & idvc=meta_nemsio2d%idvc, &
-         & idvm=meta_nemsio2d%idvm, &
-         & idsl=meta_nemsio2d%idsl, &
-         & nfhour=meta_nemsio2d%fhour, &
-         & nfminute=meta_nemsio2d%nfminute, &
-         & nfsecondn=meta_nemsio2d%nfsecondn, &
-         & nfsecondd=meta_nemsio2d%nfsecondd, &
-         & extrameta=.true., &
-         & nmetaaryi=meta_nemsio2d%nmetaaryi, &
-         & recname=meta_nemsio2d%recname, &
-         & reclevtyp=meta_nemsio2d%reclevtyp, &
-         & reclev=meta_nemsio2d%reclev, &
-         & aryiname=meta_nemsio2d%aryiname, &
-         & aryilen=meta_nemsio2d%aryilen, &
-         & aryival=meta_nemsio2d%aryival, &
-         & vcoord=meta_nemsio2d%vcoord)
-    write(suffix,500) meta_nemsio3d%nfhour
-    filename = trim(adjustl(datapathout3d))//suffix
-    call nemsio_open(gfile3d,trim(adjustl(filename)),'write', &
-         & iret=nemsio_iret, &
-         & modelname=trim(adjustl(meta_nemsio3d%modelname)), &
-         & version=meta_nemsio3d%version, &
-         & gdatatype=meta_nemsio3d%gdatatype, &
-         & jcap=meta_nemsio3d%jcap, &
-         & dimx=meta_nemsio3d%dimx, &
-         & dimy=meta_nemsio3d%dimy, &
-         & dimz=meta_nemsio3d%dimz, &
-         & idate=meta_nemsio3d%idate, &
-         & nrec=meta_nemsio3d%nrec, &
-         & nframe=meta_nemsio3d%nframe, &
-         & idrt=meta_nemsio3d%idrt, &
-         & ncldt=meta_nemsio3d%ncldt, &
-         & idvc=meta_nemsio3d%idvc, &
-         & idvm=meta_nemsio3d%idvm, &
-         & idsl=meta_nemsio3d%idsl, &
-         & nfhour=meta_nemsio3d%fhour, &
-         & nfminute=meta_nemsio3d%nfminute, &
-         & nfsecondn=meta_nemsio3d%nfsecondn, &
-         & nfsecondd=meta_nemsio3d%nfsecondd, &
-         & extrameta=.true., &
-         & nmetaaryi=meta_nemsio3d%nmetaaryi, &
-         & recname=meta_nemsio3d%recname, &
-         & reclevtyp=meta_nemsio3d%reclevtyp, &
-         & reclev=meta_nemsio3d%reclev, &
-         & aryiname=meta_nemsio3d%aryiname, &
-         & aryilen=meta_nemsio3d%aryilen, &
-         & aryival=meta_nemsio3d%aryival, &
-         & variname=meta_nemsio3d%variname, &
-         & varival=meta_nemsio3d%varival, &
-         & nmetavari=meta_nemsio3d%nmetavari, &
-         & vcoord=meta_nemsio3d%vcoord)
-
-    !=====================================================================
-
-    ! Define format statements
-
-500 format('.fhr',i3.3)
-
-    !=====================================================================
-
-  end subroutine gfs_nems_initialize
-
-  !=======================================================================
-
-  ! gfs_nems_finalize.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_nems_finalize()
-
-    !=====================================================================
-
-    ! Define local variables
-
-    call nemsio_close(gfile2d,iret=nemsio_iret)
-    call nemsio_close(gfile3d,iret=nemsio_iret)
-
-    !=====================================================================
-
-  end subroutine gfs_nems_finalize
-
-  !=======================================================================
-
-  ! gfs_grid_initialize.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_grid_initialize(grid)
-
-    ! Define variables passed to routine
-
-    type(gfs_grid) :: grid
-
-    ! Define variables computed within routine
-
-    real(r_kind), dimension(:), allocatable :: slat
-    real(r_kind), dimension(:), allocatable :: wlat
-    real(r_kind), dimension(:), allocatable :: workgrid
-
-    ! Define counting variables
-
-    integer :: i, j, k
-
-    !=====================================================================
-
-    ! Check local variable and proceed accordingly
-
-    if(mpi_procid .eq. mpi_masternode) then
-
-       ! Define local variables
-
-       call init_constants_derived()
-
-       ! Check local variable and proceed accordingly
-
-       ! Define local variables
-
-       grid%nlons = nlons
-       grid%nlats = nlats
-
-    end if ! if(mpi_procid .eq. mpi_masternode)
-
-    ! Define local variables
-
-    call mpi_barrier(mpi_comm_world,mpi_ierror)
-
-    ! Broadcast all necessary variables to compute nodes
-
-    call mpi_bcast(grid%nlons,1,mpi_integer,mpi_masternode,mpi_comm_world, &
-         & mpi_ierror)
-    call mpi_bcast(grid%nlats,1,mpi_integer,mpi_masternode,mpi_comm_world, &
-         & mpi_ierror)
-
-    ! Allocate memory for local variables
-
-    if(.not. allocated(grid%rlon)) &
-         & allocate(grid%rlon(grid%nlons,grid%nlats))
-    if(.not. allocated(grid%rlat)) &
-         & allocate(grid%rlat(grid%nlons,grid%nlats))
-
-    ! Check local variable and proceed accordingly
-
-    if(mpi_procid .eq. mpi_masternode) then
-
-       ! Allocate memory for local variables
-
-       if(.not. allocated(slat)) allocate(slat(grid%nlats))
-       if(.not. allocated(wlat)) allocate(wlat(grid%nlats))
-       if(.not. allocated(workgrid)) allocate(workgrid(grid%nlats))
-
-       ! Compute local variables
-
-       grid%ncoords = grid%nlons*grid%nlats
-       call splat(grid%nlats,slat,wlat)
-       workgrid = acos(slat) - pi/2.0
-
-       ! Loop through local variable
-
-       do j = 1, grid%nlats
-
-          ! Loop through local variable
-
-          do i = 1, grid%nlons
-
-             ! Compute local variables
-
-             grid%rlon(i,j) = (i-1)*(360./grid%nlons)*deg2rad
-             grid%rlat(i,j) = workgrid(grid%nlats - j + 1)
-
-          end do ! do i = 1, grid%nlons
-
-       end do ! do j = 1, grid%nlats
-
-       ! Deallocate memory for local variables
-
-       if(allocated(slat)) deallocate(slat)
-       if(allocated(wlat)) deallocate(wlat)
-       if(allocated(workgrid)) deallocate(workgrid)
-
-    endif ! if(mpi_procid .eq. mpi_masternode)
-
-    ! Broadcast all necessary variables to compute nodes
-
-    call mpi_bcast(grid%ncoords,1,mpi_integer,mpi_masternode, &
-         & mpi_comm_world,mpi_ierror)
-    call mpi_bcast(grid%rlon,grid%ncoords,mpi_real,mpi_masternode, &
-         & mpi_comm_world,mpi_ierror)
-    call mpi_bcast(grid%rlat,grid%ncoords,mpi_real,mpi_masternode, &
-         & mpi_comm_world,mpi_ierror)
-
-    !=====================================================================
-
-  end subroutine gfs_grid_initialize
-
-  !=======================================================================
-
-  ! gfs_grid_cleanup.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine gfs_grid_cleanup(grid)
-
-    ! Define variables passed to routine
-
-    type(gfs_grid) :: grid
-
-    !=====================================================================
-
-    ! Deallocate memory for local variables
-
-    if(allocated(grid%rlon)) deallocate(grid%rlon)
-    if(allocated(grid%rlat)) deallocate(grid%rlat)
-
-    !=====================================================================
-
-  end subroutine gfs_grid_cleanup
-
-  !=======================================================================
-
-end module gfs_nems_interface
diff --git a/sorc/regrid_nemsio.fd/interpolation_interface.f90 b/sorc/regrid_nemsio.fd/interpolation_interface.f90
deleted file mode 100644
index 775d1a7cc33..00000000000
--- a/sorc/regrid_nemsio.fd/interpolation_interface.f90
+++ /dev/null
@@ -1,335 +0,0 @@
-module interpolation_interface
-
-  !=======================================================================
-
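The `gfs_grid_initialize` routine above builds a grid that is regular in longitude and Gaussian in latitude, with `splat` supplying the sines of the Gaussian latitudes. A minimal NumPy sketch of the same construction, assuming `splat` is equivalent to Legendre-Gauss nodes (function names here are illustrative, not from the Fortran source):

```python
import numpy as np

# Sketch: Gaussian latitudes from Legendre-Gauss nodes (a stand-in for
# splat), regular longitudes starting at 0; lat follows the Fortran
# expression workgrid = acos(slat) - pi/2.
def gaussian_grid(nlons, nlats):
    slat, _wlat = np.polynomial.legendre.leggauss(nlats)
    lat = np.arccos(slat) - np.pi / 2.0          # radians
    lon = np.deg2rad(np.arange(nlons) * (360.0 / nlons))
    return lon, lat

lon, lat = gaussian_grid(8, 4)
```

As in the Fortran, the latitudes come out symmetric about the equator and the longitudes span `[0, 2*pi)` with uniform spacing.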
-  ! Define associated modules and subroutines
-
-  !-----------------------------------------------------------------------
-
-  use constants
-  use kinds
-
-  !-----------------------------------------------------------------------
-
-  use namelist_def
-  use netcdf
-  use netcdfio_interface
-  use mpi_interface
-
-  !-----------------------------------------------------------------------
-
-  implicit none
-
-  !-----------------------------------------------------------------------
-
-  ! Define interfaces and attributes for module routines
-
-  private
-  public :: interpolation_initialize_gridvar
-  public :: interpolation_initialize_esmf
-  public :: interpolation_define_gridvar
-  public :: interpolation_define_gridvar_out
-  public :: interpolation_esmf
-  public :: interpolation_esmf_vect
-  public :: gridvar
-  public :: esmfgrid
-
-  !-----------------------------------------------------------------------
-
-  ! Define all data and structure types for routine; these variables
-  ! are variables required by the subroutines within this module
-
-  type esmfgrid
-     character(len=500) :: filename
-     real(r_double), dimension(:), allocatable :: s
-     integer, dimension(:), allocatable :: col
-     integer, dimension(:), allocatable :: row
-     real(r_double), dimension(:), allocatable :: inlats
-     real(r_double), dimension(:), allocatable :: inlons
-     real(r_double), dimension(:), allocatable :: outlats
-     real(r_double), dimension(:), allocatable :: outlons
-     integer :: n_s,n_a,n_b
-  end type esmfgrid ! type esmfgrid
-
-  type gridvar
-     logical, dimension(:), allocatable :: check
-     real(r_double), dimension(:), allocatable :: var
-     integer :: ncoords
-     integer :: nx
-     integer :: ny
-  end type gridvar ! type gridvar
-
-  ! Define global variables
-
-  integer :: ncfileid
-  integer :: ncvarid
-  integer :: ncdimid
-  integer :: ncstatus
-
-  !-----------------------------------------------------------------------
-
-contains
-
-  !=======================================================================
-
-  subroutine interpolation_define_gridvar(grid,xdim,ydim,ngrid,input)
-! collapses the cubed grid into a 1-d array
-! Define variables passed to routine
-
-    use nemsio_module, only: nemsio_realkind
-    integer,intent(in) :: ngrid
-    integer,intent(in) :: xdim,ydim
-    type(gridvar),intent(inout) :: grid
-    real(nemsio_realkind),intent(in) :: input(ngrid,xdim,ydim)
-
-! locals
-    integer :: i,j,k,ncount
-
-    ncount = 1
-    do k = 1, ngrid
-       do j = 1, ydim
-          do i = 1, xdim
-             grid%var(ncount) = input(k,i,j)
-             ncount = ncount + 1
-          end do
-       end do
-    end do
-
-
-  end subroutine interpolation_define_gridvar
-
-!=======================================================================
-
-
-  subroutine interpolation_define_gridvar_out(grid,xdim,ydim,output)
-! make a 2-d array for output
-    ! Define variables passed to routine
-
-    integer,intent(in) :: xdim,ydim
-    type(gridvar),intent(in) :: grid
-    real(r_double),intent(out) :: output(xdim,ydim)
-
-! locals
-    integer :: i,j,ncount
-
-    ncount = 1
-    do j = 1, ydim
-       do i = 1, xdim
-          output(j,i) = grid%var(ncount)
-          ncount = ncount + 1
-       enddo
-    enddo
-
-  end subroutine interpolation_define_gridvar_out
-
-  !=======================================================================
-
-  subroutine interpolation_initialize_gridvar(grid)
-
-    ! Define variables passed to routine
-
-    type(gridvar) :: grid
-
-    allocate(grid%var(grid%ncoords))
-
-  end subroutine interpolation_initialize_gridvar
-
-
-!=======================================================================
-
-  subroutine interpolation_initialize_esmf(grid)
-
-    ! Define variables passed to routine
-
-    type(esmfgrid) :: grid
-
-    !=====================================================================
-
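`interpolation_define_gridvar` above collapses a `(ngrid, xdim, ydim)` cubed-sphere field into one 1-D vector, tile by tile, with the `i` (x) index varying fastest. The same traversal can be sketched with NumPy (illustrative only, not part of the Fortran source):

```python
import numpy as np

# Sketch of the Fortran loop nest k -> j -> i over input(k,i,j):
# within each tile, column-major ("F") order makes i vary fastest.
def flatten_tiles(field):
    # field: array of shape (ngrid, xdim, ydim)
    return np.concatenate([tile.ravel(order="F") for tile in field])

flat = flatten_tiles(np.arange(12).reshape(2, 2, 3))
```

`interpolation_define_gridvar_out` performs the inverse step, unpacking the interpolated 1-D vector back onto a 2-D output grid in the same element order.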
-    ! Define local variables
-
-    ncstatus = nf90_open(path=trim(adjustl(grid%filename)),mode= &
-         & nf90_nowrite,ncid=ncfileid)
-    ncstatus = nf90_inq_dimid(ncfileid,'n_s',ncdimid)
-    ncstatus = nf90_inquire_dimension(ncfileid,ncdimid,len=grid%n_s)
-    ncstatus = nf90_inq_dimid(ncfileid,'n_a',ncdimid)
-    ncstatus = nf90_inquire_dimension(ncfileid,ncdimid,len=grid%n_a)
-    ncstatus = nf90_inq_dimid(ncfileid,'n_b',ncdimid)
-    ncstatus = nf90_inquire_dimension(ncfileid,ncdimid,len=grid%n_b)
-
-
-    ! Allocate memory for local variables
-
-    allocate(grid%s(grid%n_s))
-    allocate(grid%row(grid%n_s))
-    allocate(grid%col(grid%n_s))
-
-    allocate(grid%inlats(grid%n_a))
-    allocate(grid%inlons(grid%n_a))
-    allocate(grid%outlats(grid%n_b))
-    allocate(grid%outlons(grid%n_b))
-
-    ncstatus = nf90_inq_varid(ncfileid,'col',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%col)
-    ncstatus = nf90_inq_varid(ncfileid,'row',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%row)
-    ncstatus = nf90_inq_varid(ncfileid,'S',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%s)
-    ncstatus = nf90_inq_varid(ncfileid,'yc_a',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%inlats)
-    ncstatus = nf90_inq_varid(ncfileid,'xc_a',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%inlons)
-    where(grid%inlons .LT. 0.0)
-       grid%inlons=360+grid%inlons
-    endwhere
-    ncstatus = nf90_inq_varid(ncfileid,'yc_b',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%outlats)
-    ncstatus = nf90_inq_varid(ncfileid,'xc_b',ncvarid)
-    ncstatus = nf90_get_var(ncfileid,ncvarid,grid%outlons)
-    ncstatus = nf90_close(ncfileid)
-
-    !=====================================================================
-
-  end subroutine interpolation_initialize_esmf
-
-
-!=======================================================================
-
-
-  subroutine interpolation_esmf(invar,outvar,grid,is_nrstnghbr)
-
-    ! Define variables passed to routine
-
-    type(gridvar) :: invar
-    type(gridvar) :: outvar
-    logical :: is_nrstnghbr
-
-    type(esmfgrid) :: grid
-
-    integer :: i, j, k, l
-
-    outvar%var = dble(0.0)
-
-    if(is_nrstnghbr) then
-       do i = 1, grid%n_s
-          outvar%var(grid%row(i)) = invar%var(grid%col(i))
-       enddo
-    else
-       do i = 1, grid%n_s
-          outvar%var(grid%row(i)) = outvar%var(grid%row(i)) + grid%s(i)*invar%var(grid%col(i))
-       end do
-    end if
-
-  end subroutine interpolation_esmf
-!=====================================================================
-
-  subroutine interpolation_esmf_vect(invaru,invarv,grid,outvaru,outvarv)
-
-    ! Define variables passed to routine
-
-    type(gridvar) :: invaru,invarv
-    type(gridvar) :: outvaru,outvarv
-    type(esmfgrid) :: grid
-
-    integer :: i, j, k, l
-    real(r_double) :: cxy,sxy,urot,vrot
-
-
-    outvaru%var = dble(0.0)
-    outvarv%var = dble(0.0)
-
-    do i = 1, grid%n_s
-       CALL MOVECT(grid%inlats(grid%col(i)),grid%inlons(grid%col(i)),&
-            grid%outlats(grid%row(i)),grid%outlons(grid%row(i)),&
-            cxy,sxy)
-       urot=cxy*invaru%var(grid%col(i))-sxy*invarv%var(grid%col(i))
-       vrot=sxy*invaru%var(grid%col(i))+cxy*invarv%var(grid%col(i))
-       outvaru%var(grid%row(i)) = outvaru%var(grid%row(i)) + grid%s(i)*urot
-       outvarv%var(grid%row(i)) = outvarv%var(grid%row(i)) + grid%s(i)*vrot
-
-    end do
-
-  end subroutine interpolation_esmf_vect
-
-!=====================================================================
-
-  SUBROUTINE MOVECT(FLAT,FLON,TLAT,TLON,CROT,SROT)
-!$$$  SUBPROGRAM DOCUMENTATION BLOCK
-!
-! SUBPROGRAM:  MOVECT     MOVE A VECTOR ALONG A GREAT CIRCLE
-!   PRGMMR: IREDELL       ORG: W/NMC23       DATE: 96-04-10
-!
-! ABSTRACT: THIS SUBPROGRAM PROVIDES THE ROTATION PARAMETERS
-!           TO MOVE A VECTOR ALONG A GREAT CIRCLE FROM ONE
-!           POSITION TO ANOTHER WHILE CONSERVING ITS ORIENTATION
-!           WITH RESPECT TO THE GREAT CIRCLE.  THESE ROTATION
-!           PARAMETERS ARE USEFUL FOR VECTOR INTERPOLATION.
-!
-! PROGRAM HISTORY LOG:
-!   96-04-10  IREDELL
-! 1999-04-08  IREDELL  GENERALIZE PRECISION
-!
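The ESMF weight files read by `interpolation_initialize_esmf` store a sparse matrix in coordinate form (`row`, `col`, `S`), and `interpolation_esmf` above applies it as `y(row) += S * x(col)`, with nearest-neighbour weights reducing to a plain copy. A hedged Python sketch of that application step (names illustrative):

```python
import numpy as np

# Sketch of applying ESMF offline remap weights stored as (row, col, s)
# triplets: y[row] += s * x[col]; nearest-neighbour weights just copy.
def apply_weights(x, row, col, s, n_out, nearest=False):
    y = np.zeros(n_out)
    if nearest:
        y[row] = x[col]
    else:
        np.add.at(y, row, s * x[col])   # accumulate; rows may repeat
    return y

# Two source points averaged into one destination point:
y = apply_weights(np.array([1.0, 3.0]),
                  row=np.array([0, 0]), col=np.array([0, 1]),
                  s=np.array([0.5, 0.5]), n_out=1)
```

`np.add.at` is used instead of plain fancy-index assignment because destination rows repeat in the triplet list, exactly as in the Fortran accumulation loop.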
-! USAGE:    CALL MOVECT(FLAT,FLON,TLAT,TLON,CROT,SROT)
-!
-!   INPUT ARGUMENT LIST:
-!     FLAT     - REAL LATITUDE IN DEGREES FROM WHICH TO MOVE THE VECTOR
-!     FLON     - REAL LONGITUDE IN DEGREES FROM WHICH TO MOVE THE VECTOR
-!     TLAT     - REAL LATITUDE IN DEGREES TO WHICH TO MOVE THE VECTOR
-!     TLON     - REAL LONGITUDE IN DEGREES TO WHICH TO MOVE THE VECTOR
-!
-!   OUTPUT ARGUMENT LIST:
-!     CROT     - REAL CLOCKWISE VECTOR ROTATION COSINE
-!     SROT     - REAL CLOCKWISE VECTOR ROTATION SINE
-!                (UTO=CROT*UFROM-SROT*VFROM;
-!                 VTO=SROT*UFROM+CROT*VFROM)
-!
-! ATTRIBUTES:
-!   LANGUAGE: FORTRAN 90
-!
-!$$$
- IMPLICIT NONE
-!
- INTEGER, PARAMETER :: KD=SELECTED_REAL_KIND(15,45)
-!
- REAL(KIND=r_double), INTENT(IN   ) :: FLAT, FLON
- REAL(KIND=r_double), INTENT(IN   ) :: TLAT, TLON
- REAL(KIND=r_double), INTENT(  OUT) :: CROT, SROT
-!
- REAL(KIND=r_double), PARAMETER :: CRDLIM=0.9999999
- REAL(KIND=r_double), PARAMETER :: PI=3.14159265358979
- REAL(KIND=r_double), PARAMETER :: DPR=180./PI
-!
- REAL(KIND=r_double) :: CTLAT,STLAT,CFLAT,SFLAT
- REAL(KIND=r_double) :: CDLON,SDLON,CRD
- REAL(KIND=r_double) :: SRD2RN,STR,CTR,SFR,CFR
-! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-!  COMPUTE COSINE OF THE RADIAL DISTANCE BETWEEN THE POINTS.
- CTLAT=COS(TLAT/DPR)
- STLAT=SIN(TLAT/DPR)
- CFLAT=COS(FLAT/DPR)
- SFLAT=SIN(FLAT/DPR)
- CDLON=COS((FLON-TLON)/DPR)
- SDLON=SIN((FLON-TLON)/DPR)
- CRD=STLAT*SFLAT+CTLAT*CFLAT*CDLON
-! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-!  COMPUTE ROTATIONS AT BOTH POINTS WITH RESPECT TO THE GREAT CIRCLE
-!  AND COMBINE THEM TO GIVE THE TOTAL VECTOR ROTATION PARAMETERS.
- IF(ABS(CRD).LE.CRDLIM) THEN
-   SRD2RN=-1/(1-CRD**2)
-   STR=CFLAT*SDLON
-   CTR=CFLAT*STLAT*CDLON-SFLAT*CTLAT
-   SFR=CTLAT*SDLON
-   CFR=CTLAT*SFLAT*CDLON-STLAT*CFLAT
-   CROT=SRD2RN*(CTR*CFR-STR*SFR)
-   SROT=SRD2RN*(CTR*SFR+STR*CFR)
-!  USE A DIFFERENT APPROXIMATION FOR NEARLY COINCIDENT POINTS.
-!  MOVING VECTORS TO ANTIPODAL POINTS IS AMBIGUOUS ANYWAY.
- ELSE
-   CROT=CDLON
-   SROT=SDLON*STLAT
- ENDIF
-! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- END SUBROUTINE MOVECT
-
-  !=======================================================================
-
-end module interpolation_interface
diff --git a/sorc/regrid_nemsio.fd/kinds.f90 b/sorc/regrid_nemsio.fd/kinds.f90
deleted file mode 100644
index 11c93b98e03..00000000000
--- a/sorc/regrid_nemsio.fd/kinds.f90
+++ /dev/null
@@ -1,107 +0,0 @@
-! this module was extracted from the GSI version operational
-! at NCEP in Dec. 2007.
-module kinds
-!$$$   module documentation block
-!                .      .    .                                       .
-! module:    kinds
-!   prgmmr: treadon          org: np23                date: 2004-08-15
-!
-! abstract:  Module to hold specification kinds for variable declaration.
-!            This module is based on (copied from) Paul vanDelst's
-!            type_kinds module found in the community radiative transfer
-!            model
-!
-! module history log:
-!   2004-08-15  treadon
-!
-! Subroutines Included:
-!
-! Functions Included:
-!
-! remarks:
-!   The numerical data types defined in this module are:
-!      i_byte    - specification kind for byte (1-byte) integer variable
-!      i_short   - specification kind for short (2-byte) integer variable
-!      i_long    - specification kind for long (4-byte) integer variable
-!      i_llong   - specification kind for double long (8-byte) integer variable
-!      r_single  - specification kind for single precision (4-byte) real variable
-!      r_double  - specification kind for double precision (8-byte) real variable
-!      r_quad    - specification kind for quad precision (16-byte) real variable
-!
-!      i_kind    - generic specification kind for default integer
-!      r_kind    - generic specification kind for default floating point
-!
-!
-! attributes:
-!   language: f90
-!   machine:  ibm RS/6000 SP
-!
-!$$$ end documentation block
-  implicit none
-  private
-
-! Integer type definitions below
-
-! Integer types
-  integer, parameter, public :: i_byte = selected_int_kind(1)   ! byte integer
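For reference, the `MOVECT` subprogram above (great-circle vector rotation) can be transliterated into Python; this is an illustration only, not the operational Fortran, and the variable `str` is renamed `strr` to avoid shadowing the Python builtin:

```python
import math

# Sketch of MOVECT: (crot, srot) rotate a vector moved along the great
# circle from (flat, flon) to (tlat, tlon); all angles in degrees.
def movect(flat, flon, tlat, tlon, crdlim=0.9999999):
    dpr = 180.0 / math.pi
    ctlat, stlat = math.cos(tlat / dpr), math.sin(tlat / dpr)
    cflat, sflat = math.cos(flat / dpr), math.sin(flat / dpr)
    cdlon = math.cos((flon - tlon) / dpr)
    sdlon = math.sin((flon - tlon) / dpr)
    crd = stlat * sflat + ctlat * cflat * cdlon   # cos(radial distance)
    if abs(crd) <= crdlim:
        srd2rn = -1.0 / (1.0 - crd ** 2)
        strr = cflat * sdlon
        ctr = cflat * stlat * cdlon - sflat * ctlat
        sfr = ctlat * sdlon
        cfr = ctlat * sflat * cdlon - stlat * cflat
        crot = srd2rn * (ctr * cfr - strr * sfr)
        srot = srd2rn * (ctr * sfr + strr * cfr)
    else:  # nearly coincident (or antipodal) points
        crot, srot = cdlon, sdlon * stlat
    return crot, srot

crot, srot = movect(0.0, 0.0, 0.0, 90.0)        # along the equator
crot_id, srot_id = movect(45.0, 10.0, 45.0, 10.0)  # coincident points
```

Moving a vector along the equator (or not at all) should leave it unrotated, which gives a quick sanity check on the transliteration.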
-  integer, parameter, public  :: i_short = selected_int_kind(4)   ! short integer
-  integer, parameter, public  :: i_long  = selected_int_kind(8)   ! long  integer
-  integer, parameter, private :: llong_t = selected_int_kind(16)  ! llong integer
-  integer, parameter, public  :: i_llong = max( llong_t, i_long )
-
-! Expected 8-bit byte sizes of the integer kinds
-  integer, parameter, public :: num_bytes_for_i_byte  = 1
-  integer, parameter, public :: num_bytes_for_i_short = 2
-  integer, parameter, public :: num_bytes_for_i_long  = 4
-  integer, parameter, public :: num_bytes_for_i_llong = 8
-
-! Define arrays for default definition
-  integer, parameter, private :: num_i_kinds = 4
-  integer, parameter, dimension( num_i_kinds ), private :: integer_types = (/ &
-       i_byte, i_short, i_long,  i_llong  /)
-  integer, parameter, dimension( num_i_kinds ), private :: integer_byte_sizes = (/ &
-       num_bytes_for_i_byte, num_bytes_for_i_short, &
-       num_bytes_for_i_long, num_bytes_for_i_llong  /)
-
-! Default values
-! **** CHANGE THE FOLLOWING TO CHANGE THE DEFAULT INTEGER TYPE KIND ***
-  integer, parameter, public :: default_integer = 3  ! 1=byte,
-                                                     ! 2=short,
-                                                     ! 3=long,
-                                                     ! 4=llong
-  integer, parameter, public :: i_kind = integer_types( default_integer )
-  integer, parameter, public :: num_bytes_for_i_kind = &
-       integer_byte_sizes( default_integer )
-
-
-! Real definitions below
-
-! Real types
-  integer, parameter, public  :: r_single = selected_real_kind(6)   ! single precision
-  integer, parameter, public  :: r_double = selected_real_kind(15)  ! double precision
-  integer, parameter, private :: quad_t   = selected_real_kind(20)  ! quad precision
-  integer, parameter, public  :: r_quad   = max( quad_t, r_double )
-
-! Expected 8-bit byte sizes of the real kinds
-  integer, parameter, public :: num_bytes_for_r_single = 4
-  integer, parameter, public :: num_bytes_for_r_double = 8
-  integer, parameter, public :: num_bytes_for_r_quad   = 16
-
-! Define arrays for default definition
-  integer, parameter, private :: num_r_kinds = 3
-  integer, parameter, dimension( num_r_kinds ), private :: real_kinds = (/ &
-       r_single, r_double, r_quad /)
-  integer, parameter, dimension( num_r_kinds ), private :: real_byte_sizes = (/ &
-       num_bytes_for_r_single, num_bytes_for_r_double, &
-       num_bytes_for_r_quad /)
-
-! Default values
-! **** CHANGE THE FOLLOWING TO CHANGE THE DEFAULT REAL TYPE KIND ***
-  integer, parameter, public :: default_real = 1  ! 1=single,
-                                                  ! 2=double,
-                                                  ! 3=quad
-  integer, parameter, public :: r_kind = real_kinds( default_real )
-  integer, parameter, public :: num_bytes_for_r_kind = &
-       real_byte_sizes( default_real )
-
-end module kinds
diff --git a/sorc/regrid_nemsio.fd/main.f90 b/sorc/regrid_nemsio.fd/main.f90
deleted file mode 100644
index f3dfe4ef093..00000000000
--- a/sorc/regrid_nemsio.fd/main.f90
+++ /dev/null
@@ -1,92 +0,0 @@
-program regrid_nemsio_main
-
-  !=====================================================================
-
-  !$$$  PROGRAM DOCUMENTATION BLOCK
-  !
-  ! ABSTRACT:
-  !
-  ! This routine provides an interface between the National Oceanic
-  ! and Atmospheric Administration (NOAA) National Centers for
-  ! Environmental Prediction (NCEP) implemented NOAA Environmental
-  ! Modeling System (NEMS) input/output file format and the native
-  ! FV3 cubed sphere grid.
-  !
-  ! NOTES:
-  !
-  ! * Uses interpolation weights generated by
-  !   Earth-System Modeling Framework (ESMF) remapping utilities.
-  !
-  ! PRGMMR: Winterbottom
-  ! ORG:    ESRL/PSD1
-  ! DATE:   2016-02-02
-  !
-  ! PROGRAM HISTORY LOG:
-  !
-  ! 2016-02-02 Initial version. Henry R. Winterbottom
-  ! 2016-11-01 Modifed by Jeff Whitaker.
-  !
-  !$$$
-
-  !=====================================================================
-
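The `kinds` module above pins storage sizes through `selected_int_kind`/`selected_real_kind` and exposes default aliases (`i_kind`, `r_kind`) that index into the kind tables. A rough NumPy analogue of that mapping (an assumption for illustration only; the dtype correspondence is not part of the Fortran source):

```python
import numpy as np

# Rough analogue of the kind parameters: each kind fixes a storage
# size, and i_kind / r_kind alias one entry via the default indices.
integer_kinds = {"i_byte": np.int8, "i_short": np.int16,
                 "i_long": np.int32, "i_llong": np.int64}
real_kinds = {"r_single": np.float32, "r_double": np.float64}

i_kind = integer_kinds["i_long"]   # default_integer = 3 -> long (4 bytes)
r_kind = real_kinds["r_single"]    # default_real = 1 -> single (4 bytes)
```

Changing the default index in the Fortran (e.g. `default_real = 2`) corresponds to repointing the alias at a wider dtype, which is why the module centralizes precision choices in one place.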
-  ! Define associated modules and subroutines
-
-  !---------------------------------------------------------------------
-
-  use kinds
-
-  !---------------------------------------------------------------------
-
-  use mpi_interface
-  use fv3_interface
-  use namelist_def
-  use constants
-
-  !---------------------------------------------------------------------
-
-  implicit none
-
-  !=====================================================================
-
-  ! Define variables computed within routine
-
-  real(r_kind) :: exectime_start
-  real(r_kind) :: exectime_finish
-
-  !=====================================================================
-
-  ! Define local variables
-
-  call mpi_interface_initialize()
-  call init_constants(.false.)
-
-  if(mpi_procid .eq. mpi_masternode) then
-
-     call cpu_time(exectime_start)
-
-  end if
-  call mpi_barrier(mpi_comm_world,mpi_ierror)
-
-  call namelistparams()
-  call fv3_regrid_nemsio()
-
-
-  if(mpi_procid .eq. mpi_masternode) then
-
-     call cpu_time(exectime_finish)
-     write(6,500) exectime_finish - exectime_start
-
-  end if ! if(mpi_procid .eq. mpi_masternode)
-
-  call mpi_barrier(mpi_comm_world,mpi_ierror)
-  call mpi_interface_terminate()
-
-  !=====================================================================
-  ! Define format statements
-
-500 format('MAIN: Execution time: ', f13.5, ' seconds.')
-
-  !=====================================================================
-
-end program regrid_nemsio_main
diff --git a/sorc/regrid_nemsio.fd/mpi_interface.f90 b/sorc/regrid_nemsio.fd/mpi_interface.f90
deleted file mode 100644
index 2e6c5c7a944..00000000000
--- a/sorc/regrid_nemsio.fd/mpi_interface.f90
+++ /dev/null
@@ -1,89 +0,0 @@
-module mpi_interface
-
-  !=======================================================================
-
-  use kinds
-
-  !-----------------------------------------------------------------------
-
-  implicit none
-
-  !-----------------------------------------------------------------------
-
-  ! Define necessary include files
-
-  include "mpif.h"
-
-  !-----------------------------------------------------------------------
-
-  ! Define global variables
-
-  character :: mpi_nodename(mpi_max_processor_name)
-  character :: mpi_noderequest
-  logical :: abort_mpi
-  integer(kind=4), dimension(:), allocatable :: mpi_ranks
-  integer(kind=4) :: mpi_errorstatus(mpi_status_size)
-  integer(kind=4) :: mpi_masternode
-  integer(kind=4) :: mpi_slavenode
-  integer(kind=4) :: mpi_ierror
-  integer(kind=4) :: mpi_ierrorcode
-  integer(kind=4) :: mpi_procid
-  integer(kind=4) :: mpi_nprocs
-  integer(kind=4) :: mpi_node_source
-  integer(kind=4) :: mpi_node_destination
-  integer(kind=4) :: mpi_loopcount
-  integer(kind=4) :: mpi_request
-  integer(kind=4) :: mpi_group_user
-  integer(kind=4) :: mpi_group_nprocs
-  integer(kind=4) :: mpi_group_procid
-  integer(kind=4) :: mpi_group_begin
-  integer(kind=4) :: mpi_group_end
-
-  !-----------------------------------------------------------------------
-
-contains
-
-  !=======================================================================
-
-  ! mpi_interface_initialize.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine mpi_interface_initialize()
-
-    !=====================================================================
-
-    ! Define local variables
-
-    call mpi_init(mpi_ierror)
-    call mpi_comm_rank(mpi_comm_world,mpi_procid,mpi_ierror)
-    call mpi_comm_size(mpi_comm_world,mpi_nprocs,mpi_ierror)
-    mpi_masternode = 0
-    abort_mpi = .false.
-
-    !=====================================================================
-
-  end subroutine mpi_interface_initialize
-
-  !=======================================================================
-
-  ! mpi_interface_terminate.f90:
-
-  !-----------------------------------------------------------------------
-
-  subroutine mpi_interface_terminate()
-
-    !=====================================================================
-
-    ! Define local variables
-
-    !call mpi_abort(mpi_comm_world,ierror_code,mpi_ierror)
-    call mpi_finalize(mpi_ierror)
-
-    !=====================================================================
-
-  end subroutine mpi_interface_terminate
-
-  !=======================================================================
-
-end module mpi_interface
diff --git a/sorc/regrid_nemsio.fd/namelist_def.f90 b/sorc/regrid_nemsio.fd/namelist_def.f90
deleted file mode 100644
index ff15a335f56..00000000000
--- a/sorc/regrid_nemsio.fd/namelist_def.f90
+++ /dev/null
@@ -1,181 +0,0 @@
-module namelist_def
-
-  !=======================================================================
-
-  ! Define associated modules and subroutines
-
-  !-----------------------------------------------------------------------
-
-  use kinds
-
-  !-----------------------------------------------------------------------
-
-  use mpi_interface
-
-  !-----------------------------------------------------------------------
-
-  implicit none
-
-  !-----------------------------------------------------------------------
-
-  ! Define global variables
-
-  integer, parameter :: max_ngrids = 12
-  character(len=500) :: analysis_filename(max_ngrids) = 'NOT USED'
-  character(len=500) :: analysis_filename2d(max_ngrids) = 'NOT USED'
-  character(len=500) :: gfs_hyblevs_filename = 'NOT USED'
-  character(len=500) :: esmf_neareststod_filename = 'NOT USED'
-  character(len=500) :: esmf_bilinear_filename = 'NOT USED'
-  character(len=500) :: variable_table = 'NOT USED'
-  character(len=500) :: datapathout2d = './'
-  character(len=500) :: datapathout3d = './'
-  character(len=19) :: forecast_timestamp = '0000-00-00_00:00:00'
-  character(len=4) :: nemsio_opt = 'bin4'
-  character(len=4) :: nemsio_opt2d = 'none'
-  character(len=4) :: nemsio_opt3d = 'none'
-  logical :: is_ugrid2sgrid = .false.
-  logical :: debug = .false.
-  integer :: nlons = 0
-  integer :: nlats = 0
-  integer :: ntrunc = 0
-  integer :: ngrids = 0
-  namelist /share/ debug, nlons,nlats,ntrunc,datapathout2d,datapathout3d, &
-       analysis_filename, forecast_timestamp, nemsio_opt, nemsio_opt2d, nemsio_opt3d, &
-       analysis_filename2d, variable_table
-
-  namelist /interpio/ esmf_bilinear_filename, esmf_neareststod_filename, gfs_hyblevs_filename
-
-  !---------------------------------------------------------------------
-
-contains
-
-  !=====================================================================
-
-  ! namelistparams.f90:
-
-  !---------------------------------------------------------------------
-
-  subroutine namelistparams()
-
-    ! Define variables computed within routine
-
-    logical :: is_it_there
-    integer :: unit_nml
-
-    ! Define counting variables
-
-    integer :: i, j, k
-
-    !===================================================================
-
-    ! Define local variables
-
-    unit_nml = 9
-    is_it_there = .false.
-    inquire(file='regrid-nemsio.input',exist = is_it_there)
-
-    ! Check local variable and proceed accordingly
-
-    if(is_it_there) then
-
-       ! Define local variables
-
-       open(file   = 'regrid-nemsio.input', &
-            unit   = unit_nml , &
-            status = 'old' , &
-            form   = 'formatted' , &
-            action = 'read' , &
-            access = 'sequential' )
-       read(unit_nml,NML = share)
-       read(unit_nml,NML = interpio)
-       close(unit_nml)
-       if (nemsio_opt2d == 'none') nemsio_opt2d=nemsio_opt
-       if (nemsio_opt3d == 'none') nemsio_opt3d=nemsio_opt
-
-       ! Loop through local variable
-
-       do i = 1, max_ngrids
-
-          ! Check local variable and proceed accordingly
-
-          if(analysis_filename(i) .ne. 'NOT USED') then
-
-             ! Define local variables
-
-             ngrids = ngrids + 1
-
-          end if ! if(analysis_filename(i) .ne. 'NOT USED')
-
-       end do ! do i = 1, max_ngrids
-
-    else ! if(is_it_there)
-
-       ! Define local variables
-
-       if(mpi_procid .eq. mpi_masternode) write(6,500)
-       call mpi_barrier(mpi_comm_world,mpi_ierror)
-       call mpi_interface_terminate()
-
-    end if ! if(.not.
is_it_there) - - !=================================================================== - - ! Check local variable and proceed accordingly - - if(mpi_procid .eq. mpi_masternode) then - - ! Define local variables - - write(6,*) '&SHARE' - write(6,*) 'DEBUG = ', debug - write(6,*) 'ANALYSIS_FILENAME = ' - do k = 1, ngrids - write(6,*) trim(adjustl(analysis_filename(k))) - ! if analysis_filename2d not specified, set to analysis_filename - if (trim(analysis_filename2d(k)) == 'NOT USED') then - analysis_filename2d(k) = analysis_filename(k) - endif - end do ! do k = 1, ngrids - write(6,*) 'ANALYSIS_FILENAME2D = ' - do k = 1, ngrids - write(6,*) trim(adjustl(analysis_filename2d(k))) - end do ! do k = 1, ngrids - write(6,*) 'VARIABLE_TABLE = ', & - & trim(adjustl(variable_table)) - write(6,*) 'FORECAST_TIMESTAMP = ', forecast_timestamp - write(6,*) 'OUTPUT DATAPATH (2d) = ', & - & trim(adjustl(datapathout2d)) - write(6,*) 'OUTPUT DATAPATH (3d) = ', & - & trim(adjustl(datapathout3d)) - write(6,*) 'NEMSIO_OPT (2d) = ', nemsio_opt2d - write(6,*) 'NEMSIO_OPT (3d) = ', nemsio_opt3d - write(6,*) '/' - write(6,*) '&INTERPIO' - write(6,*) 'ESMF_BILINEAR_FILENAME = ', & - & trim(adjustl(esmf_bilinear_filename)) - write(6,*) 'ESMF_NEARESTSTOD_FILENAME = ', & - & trim(adjustl(esmf_neareststod_filename)) - write(6,*) 'GFS_HYBLEVS_FILENAME = ', & - & trim(adjustl(gfs_hyblevs_filename)) - write(6,*) '/' - - end if ! if(mpi_procid .eq. mpi_masternode) - - ! Define local variables - - call mpi_barrier(mpi_comm_world,mpi_ierror) - - !=================================================================== - - ! Define format statements - -500 format('NAMELISTPARAMS: regrid-nemsio.input not found in the', & - & ' current working directory. 
ABORTING!!!!') - - !=================================================================== - - end subroutine namelistparams - - !===================================================================== - -end module namelist_def diff --git a/sorc/regrid_nemsio.fd/netcdfio_interface.f90 b/sorc/regrid_nemsio.fd/netcdfio_interface.f90 deleted file mode 100644 index 473b765c507..00000000000 --- a/sorc/regrid_nemsio.fd/netcdfio_interface.f90 +++ /dev/null @@ -1,592 +0,0 @@ -module netcdfio_interface - - !======================================================================= - - ! Define associated modules and subroutines - - !----------------------------------------------------------------------- - - use kinds - - !----------------------------------------------------------------------- - - use namelist_def - use netcdf - use mpi_interface - - !----------------------------------------------------------------------- - - implicit none - - !----------------------------------------------------------------------- - - ! Define global variables - - logical :: ncstatic - integer :: ncrec - integer :: ncxdim - integer :: ncydim - integer :: nczdim - integer :: nctdim - integer :: ncfileid - integer :: ncvarid - integer :: ncdimid - integer :: ncstatus - - !----------------------------------------------------------------------- - - ! Define interfaces and attributes for module routines - - private - interface netcdfio_values_1d - module procedure netcdfio_values_1d_dblepr - module procedure netcdfio_values_1d_realpr - module procedure netcdfio_values_1d_intepr - end interface ! interface netcdfio_values_2d - interface netcdfio_values_2d - module procedure netcdfio_values_2d_dblepr - module procedure netcdfio_values_2d_realpr - module procedure netcdfio_values_2d_intepr - end interface ! 
interface netcdfio_values_2d - interface netcdfio_values_3d - module procedure netcdfio_values_3d_dblepr - module procedure netcdfio_values_3d_realpr - module procedure netcdfio_values_3d_intepr - end interface ! interface netcdfio_values_3d - interface netcdfio_global_attr - module procedure netcdfio_global_attr_char - end interface ! interface netcdfio_global_attr - interface netcdfio_variable_attr - module procedure netcdfio_variable_attr_char - end interface ! interface netcdfio_variable_attr - public :: netcdfio_values_1d - public :: netcdfio_values_2d - public :: netcdfio_values_3d - public :: netcdfio_dimension - public :: netcdfio_global_attr - public :: netcdfio_variable_attr - public :: ncrec - public :: ncxdim - public :: ncydim - public :: nczdim - public :: nctdim - public :: ncstatic - - !----------------------------------------------------------------------- - -contains - - !======================================================================= - - ! netcdfio_global_attr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_global_attr_char(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - character(len=*) :: varvalue - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_get_att(ncfileid,nf90_global,trim(adjustl(varname)), & - & varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - end subroutine netcdfio_global_attr_char - - subroutine netcdfio_variable_attr_char(filename,varname,attribute,varvalue) - - implicit none - - !======================================================================= - - ! 
Define variables passed to subroutine - - character(len=500),intent(in) :: filename - character(len=*),intent(in) :: attribute - character(len=*),intent(in) :: varname - - ! Define variables returned by subroutine - - character(len=80),intent(out) :: varvalue - - ! Define variables for decoding netCDF data - - integer ncid, varid, ncstatus - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite,ncid=ncid) - ncstatus = nf90_inq_varid(ncid,trim(adjustl(varname)),varid) - ncstatus = nf90_get_att(ncid,varid,trim(adjustl(attribute)),varvalue) - ncstatus = nf90_close(ncid) - - !===================================================================== - - end subroutine netcdfio_variable_attr_char - - !======================================================================= - - ! netcdfio_values_1d_dblepr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_1d_dblepr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - real(r_double) :: varvalue(:) - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus /= 0) then - varvalue = -1.e30 - else - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - endif - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - end subroutine netcdfio_values_1d_dblepr - - !======================================================================= - - !
netcdfio_values_2d_dblepr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_2d_dblepr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - real(r_double), dimension(ncxdim,ncydim) :: varvalue - - ! Define variables computed within routine - - integer, dimension(3) :: start - integer, dimension(3) :: count - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(ncstatic) start = (/1,1,1/) - if(.not. ncstatic) start = (/1,1,ncrec/) - count = (/ncxdim,ncydim,1/) - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue,start,count) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(debug) write(6,500) trim(adjustl(varname)), minval(varvalue), & - & maxval(varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - ! Define format statements - -500 format('NETCDFIO_VALUES_2D: Variable name = ', a, '; (min,max) = (', & - & f13.5,',',f13.5,').') - - !===================================================================== - - end subroutine netcdfio_values_2d_dblepr - - !======================================================================= - - ! netcdfio_values_3d_dblepr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_3d_dblepr(filename,varname,varvalue) - - ! 
Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - real(r_double), dimension(ncxdim,ncydim,nczdim) :: varvalue - - ! Define variables computed within routine - - integer, dimension(4) :: start - integer, dimension(4) :: count - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(ncstatic) start = (/1,1,1,1/) - if(.not. ncstatic) start = (/1,1,1,ncrec/) - count = (/ncxdim,ncydim,nczdim,1/) - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue,start,count) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(debug) write(6,500) trim(adjustl(varname)), minval(varvalue), & - & maxval(varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - ! Define format statements - -500 format('NETCDFIO_VALUES_3D: Variable name = ', a, '; (min,max) = (', & - & f13.5,',',f13.5,').') - - !===================================================================== - - end subroutine netcdfio_values_3d_dblepr - - !======================================================================= - - ! netcdfio_values_1d_realpr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_1d_realpr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - real(r_kind) :: varvalue(:) - - !===================================================================== - - ! 
Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if (ncstatus /= 0) then - varvalue = -1.e30 - else - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - endif - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - end subroutine netcdfio_values_1d_realpr - - !======================================================================= - - ! netcdfio_values_2d_realpr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_2d_realpr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - real(r_kind), dimension(ncxdim,ncydim) :: varvalue - - ! Define variables computed within routine - - integer, dimension(3) :: start - integer, dimension(3) :: count - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(ncstatic) start = (/1,1,1/) - if(.not. ncstatic) start = (/1,1,ncrec/) - count = (/ncxdim,ncydim,1/) - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue,start,count) - if (ncstatus .ne. 
0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(debug) write(6,500) trim(adjustl(varname)), minval(varvalue), & - & maxval(varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - ! Define format statements - -500 format('NETCDFIO_VALUES_2D: Variable name = ', a, '; (min,max) = (', & - & f13.5,',',f13.5,').') - - !===================================================================== - - end subroutine netcdfio_values_2d_realpr - - !======================================================================= - - ! netcdfio_values_3d_realpr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_3d_realpr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - real(r_kind), dimension(ncxdim,ncydim,nczdim) :: varvalue - - ! Define variables computed within routine - - integer, dimension(4) :: start - integer, dimension(4) :: count - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(ncstatic) start = (/1,1,1,1/) - if(.not. ncstatic) start = (/1,1,1,ncrec/) - count = (/ncxdim,ncydim,nczdim,1/) - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue,start,count) - if (ncstatus .ne. 
0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(debug) write(6,500) trim(adjustl(varname)), minval(varvalue), & - & maxval(varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - ! Define format statements - -500 format('NETCDFIO_VALUES_3D: Variable name = ', a, '; (min,max) = (', & - & f13.5,',',f13.5,').') - - !===================================================================== - - end subroutine netcdfio_values_3d_realpr - - !======================================================================= - - ! netcdfio_values_1d_intepr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_1d_intepr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - integer :: varvalue(:) - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus /= 0) then - varvalue = -9999 - else - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue) - endif - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - end subroutine netcdfio_values_1d_intepr - - !======================================================================= - - ! netcdfio_values_2d_intepr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_2d_intepr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - integer, dimension(ncxdim,ncydim) :: varvalue - - ! 
Define variables computed within routine - - integer, dimension(3) :: start - integer, dimension(3) :: count - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if(ncstatic) start = (/1,1,1/) - if(.not. ncstatic) start = (/1,1,ncrec/) - count = (/ncxdim,ncydim,1/) - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue,start,count) - if(debug) write(6,500) trim(adjustl(varname)), minval(varvalue), & - & maxval(varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - ! Define format statements - -500 format('NETCDFIO_VALUES_2D: Variable name = ', a, '; (min,max) = (', & - & f13.5,',',f13.5,').') - - !===================================================================== - - end subroutine netcdfio_values_2d_intepr - - !======================================================================= - - ! netcdfio_values_3d_intepr.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_values_3d_intepr(filename,varname,varvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: varname - integer, dimension(ncxdim,ncydim,nczdim) :: varvalue - - ! Define variables computed within routine - - integer, dimension(4) :: start - integer, dimension(4) :: count - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_varid(ncfileid,trim(adjustl(varname)),ncvarid) - if (ncstatus .ne. 0) then - print *,'fv3 read failed for ',trim(adjustl(varname)) - call mpi_interface_terminate() - stop - endif - if(ncstatic) start = (/1,1,1,1/) - if(.not. 
ncstatic) start = (/1,1,1,ncrec/) - count = (/ncxdim,ncydim,nczdim,1/) - ncstatus = nf90_get_var(ncfileid,ncvarid,varvalue,start,count) - if(debug) write(6,500) trim(adjustl(varname)), minval(varvalue), & - & maxval(varvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - ! Define format statements - -500 format('NETCDFIO_VALUES_3D: Variable name = ', a, '; (min,max) = (', & - & f13.5,',',f13.5,').') - - !===================================================================== - - end subroutine netcdfio_values_3d_intepr - - !======================================================================= - - ! netcdfio_dimension.f90: - - !----------------------------------------------------------------------- - - subroutine netcdfio_dimension(filename,dimname,dimvalue) - - ! Define variables passed to routine - - character(len=500) :: filename - character(len=*) :: dimname - integer :: dimvalue - - !===================================================================== - - ! Define local variables - - ncstatus = nf90_open(path=trim(adjustl(filename)),mode=nf90_nowrite, & - & ncid=ncfileid) - ncstatus = nf90_inq_dimid(ncfileid,trim(adjustl(dimname)),ncdimid) - ncstatus = nf90_inquire_dimension(ncfileid,ncdimid,len=dimvalue) - ncstatus = nf90_close(ncfileid) - - !===================================================================== - - end subroutine netcdfio_dimension - - !======================================================================= - -end module netcdfio_interface diff --git a/sorc/regrid_nemsio.fd/physcons.f90 b/sorc/regrid_nemsio.fd/physcons.f90 deleted file mode 100644 index 4e69dca3376..00000000000 --- a/sorc/regrid_nemsio.fd/physcons.f90 +++ /dev/null @@ -1,77 +0,0 @@ -! this module contains some of the most frequently used math and ! -! physics constants for gcm models. ! -! ! -! references: ! -! as set in NMC handbook from Smithsonian tables. ! -! ! - module physcons -!
- use kinds, only : r_kind -! - implicit none -! - public - -! --- ... Math constants - - real(r_kind),parameter:: con_pi =3.1415926535897931 ! pi - real(r_kind),parameter:: con_sqrt2 =1.414214e+0 ! square root of 2 - real(r_kind),parameter:: con_sqrt3 =1.732051e+0 ! square root of 3 - -! --- ... Geophysics/Astronomy constants - - real(r_kind),parameter:: con_rerth =6.3712e+6 ! radius of earth (m) - real(r_kind),parameter:: con_g =9.80665e+0 ! gravity (m/s2) - real(r_kind),parameter:: con_omega =7.2921e-5 ! ang vel of earth (1/s) - real(r_kind),parameter:: con_p0 =1.01325e5 ! std atms pressure (pa) - real(r_kind),parameter:: con_solr =1.3660e+3 ! solar constant (W/m2)-liu(2002) - -! --- ... Thermodynamics constants - - real(r_kind),parameter:: con_rgas =8.314472 ! molar gas constant (J/mol/K) - real(r_kind),parameter:: con_rd =2.8705e+2 ! gas constant air (J/kg/K) - real(r_kind),parameter:: con_rv =4.6150e+2 ! gas constant H2O (J/kg/K) - real(r_kind),parameter:: con_cp =1.0046e+3 ! spec heat air @p (J/kg/K) - real(r_kind),parameter:: con_cv =7.1760e+2 ! spec heat air @v (J/kg/K) - real(r_kind),parameter:: con_cvap =1.8460e+3 ! spec heat H2O gas (J/kg/K) - real(r_kind),parameter:: con_cliq =4.1855e+3 ! spec heat H2O liq (J/kg/K) - real(r_kind),parameter:: con_csol =2.1060e+3 ! spec heat H2O ice (J/kg/K) - real(r_kind),parameter:: con_hvap =2.5000e+6 ! lat heat H2O cond (J/kg) - real(r_kind),parameter:: con_hfus =3.3358e+5 ! lat heat H2O fusion (J/kg) - real(r_kind),parameter:: con_psat =6.1078e+2 ! pres at H2O 3pt (Pa) - real(r_kind),parameter:: con_t0c =2.7315e+2 ! temp at 0C (K) - real(r_kind),parameter:: con_ttp =2.7316e+2 ! temp at H2O 3pt (K) - real(r_kind),parameter:: con_tice =2.7120e+2 ! temp freezing sea (K) - real(r_kind),parameter:: con_jcal =4.1855E+0 ! joules per calorie () - real(r_kind),parameter:: con_rhw0 =1022.0 ! sea water reference density (kg/m^3) - real(r_kind),parameter:: con_epsq =1.0E-12 ! min q for computing precip type - -! 
Secondary constants - - real(r_kind),parameter:: con_rocp =con_rd/con_cp - real(r_kind),parameter:: con_cpor =con_cp/con_rd - real(r_kind),parameter:: con_rog =con_rd/con_g - real(r_kind),parameter:: con_fvirt =con_rv/con_rd-1. - real(r_kind),parameter:: con_eps =con_rd/con_rv - real(r_kind),parameter:: con_epsm1 =con_rd/con_rv-1. - real(r_kind),parameter:: con_dldt =con_cvap-con_cliq - real(r_kind),parameter:: con_xpona =-con_dldt/con_rv - real(r_kind),parameter:: con_xponb =-con_dldt/con_rv+con_hvap/(con_rv*con_ttp) - -! --- ... Other Physics/Chemistry constants (source: 2002 CODATA) - - real(r_kind),parameter:: con_c =2.99792458e+8 ! speed of light (m/s) - real(r_kind),parameter:: con_plnk =6.6260693e-34 ! planck constant (J s) - real(r_kind),parameter:: con_boltz =1.3806505e-23 ! boltzmann constant (J/K) - real(r_kind),parameter:: con_sbc =5.670400e-8 ! stefan-boltzmann (W/m2/K4) - real(r_kind),parameter:: con_avgd =6.0221415e23 ! avogadro constant (1/mol) - real(r_kind),parameter:: con_gasv =22413.996e-6 ! vol of ideal gas at 273.15k, 101.325kpa (m3/mol) - real(r_kind),parameter:: con_amd =28.9644 ! molecular wght of dry air (g/mol) - real(r_kind),parameter:: con_amw =18.0154 ! molecular wght of water vapor (g/mol) - real(r_kind),parameter:: con_amo3 =47.9982 ! molecular wght of o3 (g/mol) - real(r_kind),parameter:: con_amco2 =44.011 ! molecular wght of co2 (g/mol) - real(r_kind),parameter:: con_amo2 =31.9999 ! molecular wght of o2 (g/mol) - real(r_kind),parameter:: con_amch4 =16.043 ! molecular wght of ch4 (g/mol) - real(r_kind),parameter:: con_amn2o =44.013 !
molecular wght of n2o (g/mol) - -end module physcons diff --git a/sorc/regrid_nemsio.fd/regrid_nemsio_interface.f90 b/sorc/regrid_nemsio.fd/regrid_nemsio_interface.f90 deleted file mode 100644 index 9ab5597af82..00000000000 --- a/sorc/regrid_nemsio.fd/regrid_nemsio_interface.f90 +++ /dev/null @@ -1,50 +0,0 @@ -module regrid_nemsio_interface - - !======================================================================= - - ! Define associated modules and subroutines - - !----------------------------------------------------------------------- - - use constants - use kinds - - !----------------------------------------------------------------------- - - use fv3_interface - use gfs_nems_interface - use namelist_def - - !----------------------------------------------------------------------- - - implicit none - - !----------------------------------------------------------------------- - -contains - - !======================================================================= - - ! regrid_nemsio.f90: - - !----------------------------------------------------------------------- - - subroutine regrid_nemsio() - - !===================================================================== - - ! Define local variables - - call namelistparams() - - ! Check local variable and proceed accordingly - - call fv3_regrid_nemsio() - - !===================================================================== - - end subroutine regrid_nemsio - - !======================================================================= - -end module regrid_nemsio_interface diff --git a/sorc/regrid_nemsio.fd/variable_interface.f90 b/sorc/regrid_nemsio.fd/variable_interface.f90 deleted file mode 100644 index d0d568429d0..00000000000 --- a/sorc/regrid_nemsio.fd/variable_interface.f90 +++ /dev/null @@ -1,66 +0,0 @@ -module variable_interface - - !======================================================================= - - ! 
Define associated modules and subroutines - - !----------------------------------------------------------------------- - - use kinds - use physcons, only: rgas => con_rd, cp => con_cp, grav => con_g, & - & rerth => con_rerth, rocp => con_rocp, & - & pi => con_pi, con_rog - - !----------------------------------------------------------------------- - - use mpi_interface - use namelist_def - - !----------------------------------------------------------------------- - - implicit none - - !----------------------------------------------------------------------- - - ! Define interfaces and attributes for module routines - - private - public :: varinfo - !public :: variable_lookup - public :: variable_clip - - !----------------------------------------------------------------------- - - ! Define all data and structure types for routine; these variables - ! are variables required by the subroutines within this module - - type varinfo - character(len=20) :: var_name - character(len=20) :: nems_name - character(len=20) :: nems_levtyp - integer :: nems_lev - character(len=20) :: itrptyp - logical :: clip - integer :: ndims - end type varinfo ! type varinfo - - !----------------------------------------------------------------------- - -contains - - !======================================================================= - - subroutine variable_clip(grid) - - - real(r_double) :: grid(:) - real(r_double) :: clip - - clip = tiny(grid(1)) - where(grid .le. 
dble(0.0)) grid = clip - - end subroutine variable_clip - - !======================================================================= - -end module variable_interface diff --git a/sorc/supvit.fd/supvit_main.f b/sorc/supvit.fd/supvit_main.f deleted file mode 100644 index 1484e4efebe..00000000000 --- a/sorc/supvit.fd/supvit_main.f +++ /dev/null @@ -1,865 +0,0 @@ - program sort_and_update_vitals -c -c$$$ MAIN PROGRAM DOCUMENTATION BLOCK -c -c Main Program: SUPVIT Sort and Update Vitals File -C PRGMMR: MARCHOK ORG: NP22 DATE: 1999-04-14 -c -c ABSTRACT: This program searches through the TC Vitals file and reads -c the records for a particular dtg. It contains logic to eliminate -c duplicate records and only keep the most recent one (see further -c documentation below). It also searches to see if a storm was -c included in the Vitals file 6 hours earlier (or 3 hours earlier -c if we're tracking with the off-synoptic-time SREF) but is missing -c from the current Vitals records. In this case, the program assumes -c that the regional forecasting center was late in reporting the -c current position, and it includes the old Vitals record with -c the current Vitals records. This program will also take the -c position and heading from that old vitals record and extrapolate the -c information to get a current first guess estimate of the storm's -c position. By the way, if a storm was found 3 or 6 hours earlier, -c logic is also included to eliminate any duplicate records of that -c storm in those old records. Finally, if it turns out that the -c reason an old vitals is no longer on the current records is that -c the storm has dissipated, don't worry about including it to be -c passed into the tracking program; the tracking program will not be -c able to track it and that'll be the end of it. -c -c Program history log: -c 98-03-26 Marchok - Original operational version. 
-c 99-04-01 Marchok - Modified code to be able to read the year off -c of the TC Vitals card as a 4-digit integer, -c instead of as a 2-digit integer. -c 00-06-13 Marchok - Modified code to be able to read vitals from 6h -c ahead (this is for use in the GDAS tropical -c cyclone relocation system). -c 04-05-27 Marchok - Modified code to be able to read vitals from 3h -c ago. This is for tracking with the 09z and 21z -c SREF ensemble. Since there are no vitals at -c these offtimes, we need to update vitals from -c the synoptic times 3h earlier. -c -c Input files: -c unit 31 Text file containing all vitals (including duplicates) -c for current time and time from 3 or 6 hours ago and -c 3 or 6 hours ahead. -c Output files: -c unit 51 Text file containing sorted, updated vitals (without -c any duplicates) valid at the current time only. -c -c Subprograms called: -c read_nlists Read input namelists for input dates -c read_tcv_file Read TC vitals file to get initial storm positions -c delete_dups Delete duplicate TC vitals records from current time -c delete_old Delete records from 6h ago if current record exists -c delete_old_dups Delete duplicate records from 6h ago time -c update_old_vits Update position of storms from 6h ago positions -c output Output 1 record for each updated vitals record -c -c Attributes: -c Language: Fortran_90 -c -c$$$ -c -c------- -c -c - USE def_vitals; USE set_max_parms; USE inparms; USE date_checks - USE trig_vals -c - type (tcvcard) storm(maxstorm) - type (datecard) dnow, dold, dfuture - - logical okstorm(maxstorm) - integer vit_hr_incr -c - call w3tagb('SUPVIT ',1999,0104,0058,'NP22 ') -c - okstorm = .FALSE. -c - pi = 4. * atan(1.) ! pi, dtr and rtd were declared in module - dtr = pi/180.0 ! trig_vals, but were not yet defined. 
- rtd = 180.0/pi -c -c ----------------------------------------- -c Read namelists to get date information -c - call read_nlists (dnow,dold,dfuture,vit_hr_incr) -c -c ----------------------------------------------------------- -c Read in storm cards for current time and delete duplicates -c - - inowct = 0 - call read_tcv_file (storm,ymd_now,hhmm_now,inowct,okstorm) - - if (inowct > 0) then - call delete_dups (storm,inowct,okstorm) - else - print *,' ' - print *,'!!! No storms on tcv card for current time.' - print *,'!!! A check will be made for old tcv storm cards,' - print *,'!!! and if any exist, the positions will be updated' - print *,'!!! (extrapolated) to get a first guess position for' - print *,'!!! the current time.' - print *,'!!! Current forecast time = ',ymd_now,hhmm_now - print *,'!!! Old forecast time = ',ymd_old,hhmm_old - endif -c -c ----------------------------------------------------------- -c Read in storm cards for 3h or 6h ago and delete duplicates -c - rewind (31) - itempct = inowct - call read_tcv_file (storm,ymd_old,hhmm_old,itempct,okstorm) - ioldct = itempct - inowct - - if (ioldct > 0) then - if (inowct > 0) then - call delete_old (storm,inowct,ioldct,okstorm) - endif - call delete_old_dups (storm,inowct,ioldct,okstorm) - endif - -c ---------------------------------------------------------------- -c Now update any vitals records left from 3h or 6h ago by -c extrapolating their positions ahead to the current time. - - if (ioldct > 0) then - call update_old_vits (storm,inowct,ioldct,okstorm,vit_hr_incr) - endif - - -c -------------------------------------------------------------- -c Read in storm cards for 3h or 6h ahead and delete duplicates. -c This is used for Qingfu's vortex relocation purposes. If he is -c doing the analysis/relocation for, say, 12z, he looks at the -c first guess files from the 06z cycle and tracks from there. 
-c But suppose there is a storm whose first tcvitals card is -c issued at 12z; then we would have no tcvitals card at 06z for -c the tracker to use. So this next part reads the vitals from -c the cycle 6h ahead and, if it finds any vitals that were not -c included with the current time's vitals, then it extrapolates -c those vitals from the next cycle *backwards* to the current -c time. By the way, itempct is input/output for the read -c routine. Going in, it contains the count of the number of -c records read in so far. In that read routine, itempct is -c incremented for every valid record read for the input time. - - rewind (31) - iprevct = inowct + ioldct - call read_tcv_file (storm,ymd_future,hhmm_future,itempct,okstorm) - ifuturect = itempct - iprevct - - print *,'before d6a if, ifuturect = ',ifuturect,' iprevct= ' - & ,iprevct - print *,'before d6a if, inowct = ',inowct,' ioldct= ',ioldct - - if (ifuturect > 0) then - if (iprevct > 0) then - call delete_future (storm,iprevct,ifuturect,okstorm) - endif - call delete_future_dups (storm,iprevct,ifuturect,okstorm) - endif - -c ---------------------------------------------------------------- -c Now update any vitals records not filtered out from 3h or 6h -c ahead by extrapolating their future positions *backwards* to -c the current time. 
- - if (ifuturect > 0) then - call update_future_vits (storm,iprevct,ifuturect,okstorm - & ,vit_hr_incr) - endif - - -c --------------------------------------------------------- -c Now output all of the sorted, updated TC Vitals records - - itotalct = inowct + ioldct + ifuturect - call output (storm,itotalct,okstorm) -c - call w3tage('SUPVIT ') - stop - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine read_tcv_file (storm,ymd,hhmm,ict,okstorm) -c -c ABSTRACT: This routine reads in the TC Vitals file, and stores -c into an array those records that match the input ymd and hhmm. -c -c INPUT: -c -c ict Tells at what index in the storm array to begin reading -c the input records into. This is important because this -c subroutine is called twice; the first time the data are -c for the current time and are just started at ict = 0, -c but the second time it's called we're getting the 6h ago -c data, and they have to be added onto the end of the -c array, so we need to know where the current time's data -c ends so we know what index to start the 6h ago data. -c - USE def_vitals; USE set_max_parms -c - type (tcvcard) storm(maxstorm), ts -c - integer ymd,hhmm - logical okstorm(maxstorm) -c - lucard = 31 - - print *,' ' - print '(a26,i6.6,a8,i4.4)',' IN READ_TCV_FILE: , ymd= ',ymd - & ,' hhmm= ',hhmm - print *,' ' - - - do while (.true.) - read (lucard,21,END=801,ERR=891) ts - if (ts%tcv_yymmdd == ymd .and. ts%tcv_hhmm == hhmm) then - ict = ict + 1 - storm(ict) = ts - okstorm(ict) = .TRUE. - write (6,23) ' !!! MATCH, ict= ',ict,storm(ict) - endif - enddo - 801 continue - - 21 format (a4,1x,a3,1x,a9,1x,i2,i6,1x,i4,1x,i3,a1,1x,i4,a1,1x,i3,1x - & ,i3,a85) - 23 format (a18,i3,2x,a4,1x,a3,1x,a9,1x,i2,i6.6,1x,i4.4,1x,i3,a1,1x,i4 - & ,a1,1x,i3,1x,i3,a85) - - iret = 0 - return - - 891 print *,'!!! ERROR in program sort_and_update_vitals. Error ' - print *,'!!! 
occurred in read_tcv_file while reading unit ',lucard - iret = 98 - - return - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine output (storm,itotalct,okstorm) -c - USE def_vitals; USE set_max_parms; USE inparms -c - type (tcvcard) storm(maxstorm) - type (datecard) dnow, dold, dfuture - - logical okstorm(maxstorm) -c - lunvit = 51 - - ist = 1 - do while (ist <= itotalct) - - if (okstorm(ist)) then - if (storm(ist)%tcv_stdir == -99 .or. - & storm(ist)%tcv_stspd == -99) then - write (lunvit,23,ERR=891) storm(ist) - else - write (lunvit,21,ERR=891) storm(ist) - endif - endif - - ist = ist + 1 - - enddo - - 21 format (a4,1x,a3,1x,a9,1x,i2.2,i6.6,1x,i4.4,1x,i3.3,a1,1x,i4.4 - & ,a1,1x,i3.3,1x,i3.3,a85) - 23 format (a4,1x,a3,1x,a9,1x,i2.2,i6.6,1x,i4.4,1x,i3.3,a1,1x,i4.4 - & ,a1,1x,i3,1x,i3,a85) - - iret = 0 - return - - 891 print *,'!!! ERROR in program sort_and_update_vitals. Error ' - print *,'!!! occurred in output while writing new vitals file ' - print *,'!!! to unit number',lunvit - iret = 98 - - return - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine update_old_vits (storm,inowct,ioldct,okstorm - & ,vit_hr_incr) -c -c ABSTRACT: This subroutine updates the vitals from 3h or 6h ago. -c It uses the heading and direction values listed in the vitals -c record (see Module def_vitals for specfics on where to find -c heading & direction in the vitals record) to get a new -c position for the current time by extrapolating out 3h or 6h. -c - USE def_vitals; USE set_max_parms; USE inparms; USE date_checks - USE trig_vals -c - type (tcvcard) storm(maxstorm) - type (datecard) dnow, dold - - logical okstorm(maxstorm) - integer vit_hr_incr -c - ist = inowct + 1 - iend = inowct + ioldct - do while (ist <= iend) - - if (okstorm(ist) .and. 
storm(ist)%tcv_yymmdd == ymd_old .and. - & storm(ist)%tcv_hhmm == hhmm_old) then - - rlat = float(storm(ist)%tcv_lat) / 10. - rlon = float(storm(ist)%tcv_lon) / 10. - rhdg = float(storm(ist)%tcv_stdir) - rspd = float(storm(ist)%tcv_stspd) / 10. - -c ------------------------------------------ -c This first part updates the positions by simply -c extrapolating the current motion along the current -c heading at the current speed for 3h or 6h. Be -c careful with adding and subtracting these distances -c in the different hemispheres (see the if statements). -c Remember: In the storm message file, there are NO -c negative signs to distinguish between hemispheres, -c so a southern hemisphere latitude will be POSITIVE, -c but will be distinguished by the 'S'. - - strmucomp = rspd * sin(dtr*rhdg) - strmvcomp = rspd * cos(dtr*rhdg) -c - vdistdeg = (strmvcomp * secphr * vit_hr_incr) / dtk - if (storm(ist)%tcv_latns == 'N') then - rnewlat = rlat + vdistdeg - else - rnewlat = rlat - vdistdeg - endif -c - avglat = 0.5 * (rlat + rnewlat) - cosfac = cos(dtr * avglat) - udistdeg = (strmucomp * secphr * vit_hr_incr) / (dtk * cosfac) - if (storm(ist)%tcv_lonew == 'W') then - rnewlon = rlon - udistdeg - else - rnewlon = rlon + udistdeg - endif - -c ------------------------------------------ -c This part updates the E/W and N/S characters -c in the event that a storm changes hemisphere. -c (N to S and S to N is not really possible, but -c we'll include the code anyway). If a storm -c does change hemisphere, say from W to E at 180, -c we need to also adjust the new longitude value -c from say 186W to 174E. Have to include this -c code since storm messages contain longitudes on -c a 0-180 basis (E&W), NOT 0-360. - - if (storm(ist)%tcv_latns == 'N') then - if (rnewlat < 0.) then - storm(ist)%tcv_latns = 'S' - rnewlat = -1. * rnewlat - endif - else - if (rnewlat < 0.) then - storm(ist)%tcv_latns = 'N' - rnewlat = -1. 
* rnewlat - endif - endif -c - if (storm(ist)%tcv_lonew == 'W') then - if (rnewlon > 180.) then - storm(ist)%tcv_lonew = 'E' - rnewlon = 180. - abs(rnewlon - 180.) - endif - else - if (rnewlon > 180.) then - storm(ist)%tcv_lonew = 'W' - rnewlon = 180. - abs(rnewlon - 180.) - endif - endif - - storm(ist)%tcv_lat = int ((rnewlat + 0.05) * 10.) - storm(ist)%tcv_lon = int ((rnewlon + 0.05) * 10.) - storm(ist)%tcv_yymmdd = ymd_now - storm(ist)%tcv_hhmm = hhmm_now - - endif - - ist = ist + 1 - - enddo -c - return - end - -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine update_future_vits (storm,iprevct,ifuturect,okstorm - & ,vit_hr_incr) -c -c ABSTRACT: This subroutine updates the vitals from 3h or 6h ahead. -c It uses the heading and direction values listed in the vitals -c record (see Module def_vitals for specfics on where to find -c heading & direction in the vitals record) to get a new -c position for the current time by extrapolating *BACKWARDS* -c 3h or 6h to the current time. -c - USE def_vitals; USE set_max_parms; USE inparms; USE date_checks - USE trig_vals -c - type (tcvcard) storm(maxstorm) - type (datecard) dnow, dold, dfuture - - logical okstorm(maxstorm) - integer vit_hr_incr -c - ist = iprevct + 1 - iend = iprevct + ifuturect - do while (ist <= iend) - - if (okstorm(ist) .and. storm(ist)%tcv_yymmdd == ymd_future .and. - & storm(ist)%tcv_hhmm == hhmm_future) then - - rlat = float(storm(ist)%tcv_lat) / 10. - rlon = float(storm(ist)%tcv_lon) / 10. - rhdg = float(storm(ist)%tcv_stdir) - rspd = float(storm(ist)%tcv_stspd) / 10. - -c IMPORTANT NOTE: Since we are extrapolating *BACKWARDS* in -c time in this routine, we have to take that value of the -c storm heading in rhdg and switch it by 180 degrees so that -c we will be pointing back in the direction the storm came -c from.... - - if (rhdg >= 0. .and. rhdg <= 180.) then - rhdg = rhdg + 180. 
- else - rhdg = rhdg - 180. - endif - -c ------------------------------------------ -c This first part updates the positions by simply -c extrapolating the current motion along the REVERSE of -c the current heading at the current speed for 6 hours. -c Be careful with adding and subtracting these distances -c in the different hemispheres (see the if statements). -c Remember: In the storm message file, there are NO -c negative signs to distinguish between hemispheres, -c so a southern hemisphere latitude will be POSITIVE, -c but will be distinguished by the 'S'. - - strmucomp = rspd * sin(dtr*rhdg) - strmvcomp = rspd * cos(dtr*rhdg) -c - vdistdeg = (strmvcomp * secphr * vit_hr_incr) / dtk - if (storm(ist)%tcv_latns == 'N') then - rnewlat = rlat + vdistdeg - else - rnewlat = rlat - vdistdeg - endif -c - avglat = 0.5 * (rlat + rnewlat) - cosfac = cos(dtr * avglat) - udistdeg = (strmucomp * secphr * vit_hr_incr) / (dtk * cosfac) - if (storm(ist)%tcv_lonew == 'W') then - rnewlon = rlon - udistdeg - else - rnewlon = rlon + udistdeg - endif - -c ------------------------------------------ -c This part updates the E/W and N/S characters -c in the event that a storm changes hemisphere. -c (N to S and S to N is not really possible, but -c we'll include the code anyway). If a storm -c does change hemisphere, say from W to E at 180, -c we need to also adjust the new longitude value -c from say 186W to 174E. Have to include this -c code since storm messages contain longitudes on -c a 0-180 basis (E&W), NOT 0-360. - - if (storm(ist)%tcv_latns == 'N') then - if (rnewlat < 0.) then - storm(ist)%tcv_latns = 'S' - rnewlat = -1. * rnewlat - endif - else - if (rnewlat < 0.) then - storm(ist)%tcv_latns = 'N' - rnewlat = -1. * rnewlat - endif - endif -c - if (storm(ist)%tcv_lonew == 'W') then - if (rnewlon > 180.) then - storm(ist)%tcv_lonew = 'E' - rnewlon = 180. - abs(rnewlon - 180.) - endif - else - if (rnewlon > 180.) then - storm(ist)%tcv_lonew = 'W' - rnewlon = 180. 
- abs(rnewlon - 180.) - endif - endif - - storm(ist)%tcv_lat = int ((rnewlat + 0.05) * 10.) - storm(ist)%tcv_lon = int ((rnewlon + 0.05) * 10.) - storm(ist)%tcv_yymmdd = ymd_now - storm(ist)%tcv_hhmm = hhmm_now - - endif - - ist = ist + 1 - - enddo -c - return - end - -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine delete_old_dups (storm,inowct,ioldct,okstorm) -c -c ABSTRACT: The purpose of this subroutine is to loop through the -c list of storms for the dtg from 3h or 6h ago to eliminate any -c duplicates. Be sure to sort based on storm identifier (e.g., -c 13L) instead of storm name, since the name may change (e.g., -c from "THIRTEEN" to "IRIS") for an upgrade in intensity, but the -c storm number identifier will remain the same. -c -c ict Total number of storm card entries for this dtg -c - USE def_vitals; USE set_max_parms -c - type (tcvcard) storm(maxstorm) - logical okstorm(maxstorm) - character found_dup*1 -c - ist = inowct + 1 - iend = inowct + ioldct - do while (ist < iend) - - isortnum = ist + 1 - found_dup = 'n' - if (okstorm(ist)) then - - do while (isortnum <= iend .and. found_dup == 'n') - - if (storm(ist)%tcv_storm_id == storm(isortnum)%tcv_storm_id) - & then - found_dup = 'y' - endif - isortnum = isortnum + 1 - - enddo - - endif - - if (found_dup == 'y') then - okstorm(ist) = .FALSE. - endif - - ist = ist + 1 - - enddo - -c NOTE: The last member of the array to be checked is okay, -c since all potential duplicates for this record were eliminated -c in the previous sort while loop just completed, and, further, -c the last member of this array is either already FALSE (from -c being checked off in delete_old), or it's TRUE because it -c didn't get checked off in delete_old, so keep it. 
- - return - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine delete_old (storm,inowct,ioldct,okstorm) -c -c ABSTRACT: This subroutine compares the list of storm card entries -c from 3h or 6h ago to those from the current time to eliminate -c any matching storms (i.e., if we've got a current record for a -c storm, we obviously don't need the old one). -c - USE def_vitals; USE set_max_parms -c - type (tcvcard) storm(maxstorm) -c - logical okstorm(maxstorm) - character found_dup*1 -c - ist = inowct + 1 - iend = inowct + ioldct - do while (ist <= iend) - - isortnum = 1 - found_dup = 'n' - do while (isortnum <= inowct .and. found_dup == 'n') - - if (storm(ist)%tcv_storm_id == storm(isortnum)%tcv_storm_id) - & then - found_dup = 'y' - endif - isortnum = isortnum + 1 - - enddo - - if (found_dup == 'y') then - okstorm(ist) = .FALSE. - endif - - ist = ist + 1 - - enddo - - return - end - -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine delete_future (storm,iprevct,ifuturect,okstorm) -c -c ABSTRACT: This subroutine compares the list of storm card entries -c from 3h or 6h ahead to those from the current time and from 3h or -c 6h ago to eliminate any matching storms (i.e., we only need the -c record for the future time if we don't have either a current time -c record or an old record that we've updated). -c - USE def_vitals; USE set_max_parms -c - type (tcvcard) storm(maxstorm) -c - logical okstorm(maxstorm) - character found_dup*1 -c - ist = iprevct + 1 - iend = iprevct + ifuturect - do while (ist <= iend) - - isortnum = 1 - found_dup = 'n' - do while (isortnum <= iprevct .and. 
found_dup == 'n') - - if (storm(ist)%tcv_storm_id == storm(isortnum)%tcv_storm_id) - & then - found_dup = 'y' - endif - isortnum = isortnum + 1 - - enddo - - if (found_dup == 'y') then - okstorm(ist) = .FALSE. - endif - - ist = ist + 1 - - enddo - - return - end - -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine delete_future_dups (storm,iprevct,ifuturect,okstorm) -c -c ABSTRACT: The purpose of this subroutine is to loop through the -c list of storms for the dtg from 3h or 6h ahead to eliminate any -c duplicates. Be sure to sort based on storm identifier (e.g., -c 13L) instead of storm name, since the name may change (e.g., -c from "THIRTEEN" to "IRIS") for an upgrade in intensity, but the -c storm number identifier will remain the same. -c -c ict Total number of storm card entries for this dtg -c - USE def_vitals; USE set_max_parms -c - type (tcvcard) storm(maxstorm) - logical okstorm(maxstorm) - character found_dup*1 -c - ist = iprevct + 1 - iend = iprevct + ifuturect - do while (ist < iend) - - isortnum = ist + 1 - found_dup = 'n' - if (okstorm(ist)) then - - do while (isortnum <= iend .and. found_dup == 'n') - - if (storm(ist)%tcv_storm_id == storm(isortnum)%tcv_storm_id) - & then - found_dup = 'y' - endif - isortnum = isortnum + 1 - - enddo - - endif - - if (found_dup == 'y') then - okstorm(ist) = .FALSE. - endif - - ist = ist + 1 - - enddo - -c NOTE: The last member of the array to be checked is okay, -c since all potential duplicates for this record were eliminated -c in the previous sort while loop just completed, and, further, -c the last member of this array is either already FALSE (from -c being checked off in delete_future), or it's TRUE because it -c didn't get checked off in delete_future, so keep it. 
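The `delete_dups`, `delete_old_dups`, and `delete_future_dups` routines deleted above all apply the same rule over their respective index ranges: a vitals card is marked not-OK when a later card in the range carries the same storm identifier, so the last (most recent) card per storm survives. A compact Python sketch of that rule, not part of the patch (function name is illustrative):

```python
def delete_dups(storm_ids):
    # A record is dropped when any later record shares its storm identifier
    # (e.g. '13L'); comparing IDs rather than names is deliberate, since a
    # storm's name can change on upgrade while its identifier cannot.
    ok = [True] * len(storm_ids)
    for i, sid in enumerate(storm_ids):
        if sid in storm_ids[i + 1:]:
            ok[i] = False
    return ok
```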
- - return - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine delete_dups (storm,ict,okstorm) -c -c ABSTRACT: The purpose of this subroutine is to loop through the -c list of storms for the current dtg to eliminate any duplicates. -c Be sure to sort based on storm identifier (e.g.,13L) instead of -c storm name, since the name may change (e.g., from "THIRTEEN" to -c "IRIS") for an upgrade in intensity, but the storm number -c identifier will remain the same. -c -c ict Total number of storm card entries for this dtg -c - USE def_vitals; USE set_max_parms -c - type (tcvcard) storm(maxstorm) - logical okstorm(maxstorm) - character found_dup*1 -c - ist = 1 - do while (ist < ict) - - isortnum = ist + 1 - found_dup = 'n' - do while (isortnum <= ict .and. found_dup == 'n') - - if (storm(ist)%tcv_storm_id == storm(isortnum)%tcv_storm_id) - & then - found_dup = 'y' - endif - isortnum = isortnum + 1 - - enddo - - if (found_dup == 'y') then - okstorm(ist) = .FALSE. - endif - - ist = ist + 1 - - enddo - -c Now set the last member of the array to be checked as okay, -c since all potential duplicates for this record were eliminated -c in the previous sort while loop just completed. - - okstorm(ict) = .TRUE. -c - return - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine read_nlists (dnow,dold,dfuture,vit_hr_incr) -c -c ABSTRACT: Read in the namelists that contain the date for the -c current time, the time from 3h or 6h ago, and the time from 3h -c or 6h ahead . It also converts the input dates for the current -c time, the old time and the future time into a format that can -c be easily compared against the dates in the TC Vitals file. 
-c - USE inparms; USE date_checks -c - type (datecard) dnow,dold,dfuture -c - integer vit_hr_incr -c - namelist/datenowin/dnow - namelist/dateoldin/dold - namelist/datefuturein/dfuture - namelist/hourinfo/vit_hr_incr -c - read (5,NML=datenowin,END=801) - 801 continue - read (5,NML=dateoldin,END=803) - 803 continue - read (5,NML=datefuturein,END=805) - 805 continue - read (5,NML=hourinfo,END=807) - 807 continue -c - ymd_now = dnow%yy * 10000 + dnow%mm * 100 + dnow%dd - hhmm_now = dnow%hh * 100 - ymd_old = dold%yy * 10000 + dold%mm * 100 + dold%dd - hhmm_old = dold%hh * 100 - ymd_future = dfuture%yy * 10000 + dfuture%mm * 100 + dfuture%dd - hhmm_future = dfuture%hh * 100 -c - return - end -c -c---------------------------------------------------------------------- -c -c---------------------------------------------------------------------- - integer function char2int (charnum) -c -c This function takes as input a character numeral and -c returns the integer equivalent -c - character*1 charnum,cx(10) - data cx/'0','1','2','3','4','5','6','7','8','9'/ -c - do i=1,10 - if (charnum.eq.cx(i)) char2int = i-1 - enddo -c - return - end -c -c---------------------------------------------------------------------- -c -c---------------------------------------------------------------------- - character function int2char (inum) -c -c This function takes as input an integer and -c returns the character numeral equivalent -c - character*1 cx(10) - data cx/'0','1','2','3','4','5','6','7','8','9'/ -c - do i=1,10 - ihold=i-1 - if (ihold.eq.inum) int2char = cx(i) - enddo -c - return - end diff --git a/sorc/supvit.fd/supvit_modules.f b/sorc/supvit.fd/supvit_modules.f deleted file mode 100644 index 9172af58db9..00000000000 --- a/sorc/supvit.fd/supvit_modules.f +++ /dev/null @@ -1,52 +0,0 @@ - module def_vitals - type tcvcard ! Define a new type for a TC Vitals card - character*4 tcv_center ! Hurricane Center Acronym - character*3 tcv_storm_id ! 
Storm Identifier (03L, etc) - character*9 tcv_storm_name ! Storm name - integer tcv_century ! 2-digit century id (19 or 20) - integer tcv_yymmdd ! Date of observation - integer tcv_hhmm ! Time of observation (UTC) - integer tcv_lat ! Storm Lat (*10), always >0 - character*1 tcv_latns ! 'N' or 'S' - integer tcv_lon ! Storm Lon (*10), always >0 - character*1 tcv_lonew ! 'E' or 'W' - integer tcv_stdir ! Storm motion vector (in degr) - integer tcv_stspd ! Spd of storm movement (m/s*10) - character*85 tcv_chunk ! Remainder of vitals record; - ! will just be read & written - end type tcvcard - end module def_vitals -c - module inparms - type datecard ! Define a new type for the input namelist parms - sequence - integer yy ! Beginning yy of date to search for - integer mm ! Beginning mm of date to search for - integer dd ! Beginning dd of date to search for - integer hh ! Beginning hh of date to search for - end type datecard - end module inparms -c - module date_checks - integer, save :: ymd_now,hhmm_now,ymd_old,hhmm_old - & ,ymd_future,hhmm_future - end module date_checks -c - module set_max_parms - integer, parameter :: maxstorm=400 ! max # of storms pgm can - ! handle - end module set_max_parms -c - module trig_vals - real, save :: pi, dtr, rtd - real, save :: dtk = 111194.9 ! Dist (m) over 1 deg lat - ! using erad=6371.0e+3 - real, save :: erad = 6371.0e+3 ! Earth's radius (m) - real, save :: ecircum = 40030200 ! Earth's circumference - ! (m) using erad=6371.e3 - real, save :: omega = 7.292e-5 - real, save :: secphr = 3600. 
- end module trig_vals -c -c------------------------------------------------------ -c diff --git a/sorc/syndat_getjtbul.fd/getjtbul.f b/sorc/syndat_getjtbul.fd/getjtbul.f deleted file mode 100644 index c6e93f752b4..00000000000 --- a/sorc/syndat_getjtbul.fd/getjtbul.f +++ /dev/null @@ -1,248 +0,0 @@ -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C -C MAIN PROGRAM: SYNDAT_GETJTBUL RETRIEVES JTWC BULLETINS FROM TANK -C PRGMMR: STOKES ORG: NP23 DATE: 2013-02-22 -C -C ABSTRACT: RETRIEVES TROPICAL CYCLONE POSITION AND INTENSITY -C INFORMATION FROM JOINT TYPHOON WARNING CENTER/FNMOC. THESE -C BULLETINS COME IN TWO PIECES. THIS PROGRAM READS THEM AND -C JOINS THEM TOGETHER. THIS ALLOWS THE DOWNSTREAM PROGRAM -C QCTROPCY TO PROCESS THEM. -C -C PROGRAM HISTORY LOG: -C 1997-06-23 S. J. LORD ---- ORIGINAL AUTHOR -C 1998-11-24 D. A. KEYSER -- FORTRAN 90/Y2K COMPLIANT -C 1998-12-30 D. A. KEYSER -- MODIFIED TO ALWAYS OUTPUT RECORDS -C CONTAINING A 4-DIGIT YEAR (REGARDLESS OF INPUT) -C 2000-03-09 D. A. KEYSER -- MODIFIED TO RUN ON IBM-SP; CORRECTED -C PROBLEM FROM EARLIER CRAY VERSION WHICH RESULTED -C IN AN INCORRECT JOINING OF PIECES IF THE SAME -C 2-PIECE BULLETIN IS DUPLICATED IN THE ORIGINAL FILE -C THAT IS READ IN BY THIS PROGRAM -C 2013-02-22 D. C. STOKES -- MINOR DOC CHANGES. (WCOSS TRANSIITON) -C -C USAGE: -C INPUT FILES: -C UNIT 11 - FILE CONTAINING JTWC/FNMOC BULLETINS -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 51 - FILE CONTAINING JTWC/FNMOC BULLETINS NOW JOINED -C TOGETHER -C -C SUBPROGRAMS CALLED: -C UNIQUE: - NONE -C LIBRARY: -C W3NCO - W3TAGB W3TAGE ERREXIT -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN, DATA RETRIEVED -C = 1 - SUCCESSFUL RUN -- NO DATA RETRIEVED -C = 20 - TROUBLE - EITHER READ ERROR WITHIN PROGRAM OR -C NUMBER OF RECORDS IN INPUT FILE EXCEEDS PROGRAM -C LIMIT. -C -C REMARKS: THE Y2K-COMPLIANT VERSION IS SET-UP TO READ RECORDS WITH -C EITHER A 2-DIGIT YEAR STARTING IN COLUMN 20 OR A 4-DIGIT -C YEAR STARTING IN COLUMN 20. 
THIS WILL ALLOW THIS PROGRAM -C TO RUN PROPERLY WHEN JTWC/FNMOC TRANSITIONS RECORDS TO -C A 4-DIGIT YEAR. -C -C ATTRIBUTES: -C LANGUAGE FORTRAN 90 -C MACHINE: IBM SP and IBM iDataPlex -C -C$$$ - PROGRAM SYNDAT_GETJTBUL - - PARAMETER (NBULS=200) - - CHARACTER*1 INL1(80) - CHARACTER*9 STNAME - CHARACTER*18 HEAD(NBULS),CHEKHED - CHARACTER*37 ENDMSG - CHARACTER*80 INL,INLS(NBULS) - CHARACTER*80 DUMY2K - CHARACTER*95 OUTL - - INTEGER LINE(NBULS) - - EQUIVALENCE (INL1,INL) - - DATA IIN/11/,IOUT/51/,LINE/NBULS*0/ - - CALL W3TAGB('SYNDAT_GETJTBUL',2013,0053,0050,'NP23 ') - - WRITE(6,*) ' ' - WRITE(6,*) '===> WELCOME TO SYNDAT_GETJTBUL - F90/Y2K VERSION ', - $ '02-22-2013' - WRITE(6,*) ' ' - WRITE(6,*) ' ' - - NLINE = 0 - - DO N=1,NBULS - INL1=' ' - READ(IIN,2,END=100,ERR=200) INL - 2 FORMAT(A80) - NLINE = N - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -c OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -c BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -c LATITUDE BLANK CHARACTER TO FIND OUT ... - - IF(INL1(26).EQ.' ') THEN - -c ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -c ... THIS PROGRAM WILL NOW CONVERT THE RECORD TO A 4-DIGIT YEAR USING -c THE "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> This is an old-format record with a 2-digit ', - $ 'year "',INL(20:21),'"' - PRINT *, ' ' - DUMY2K(1:19) = INL(1:19) - IF(INL(20:21).GT.'20') then - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:80) = INL(20:80) - INL= DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ INL(20:23),'" via windowing technique' - PRINT *, ' ' - - ELSE - -c ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -c ... 
NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> This is an new-format record with a 4-digit ', - $ 'year "',INL(20:23),'"' - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - end if - - WRITE(6,3) NLINE,INL - 3 FORMAT(' ...Bulletin line number',I4,' is....',A80,'...') - INLS(NLINE)=INL - HEAD(NLINE)=INL(1:18) - WRITE(6,4) NLINE,HEAD(NLINE) - 4 FORMAT(' ... Header for line number',I4,' is ...',A18,'...') - ENDDO - -C Come here if no. of records in input file exceeds pgm limit ("NBULS") -C --------------------------------------------------------------------- - - WRITE(6,301) NBULS - 301 FORMAT(' **** Number of records in input File exceeds program ', - $ 'limit of',I4,'. Abort') - ICODE=20 - ENDMSG='SYNDAT_GETJTBUL TERMINATED ABNORMALLY' - GO TO 900 - - 100 CONTINUE - -C All records read in -C ------------------- - - IF(NLINE.EQ.0) THEN - -C Come here if ZERO records were read from input file -C --------------------------------------------------- - - ICODE=1 - WRITE(6,101) - 101 FORMAT(' ...No Bulletins available.') - ENDMSG='SYNDAT_GETJTBUL TERMINATED NORMALLY ' - GO TO 900 - ENDIF - - IF(MOD(NLINE,2).NE.0) THEN - -C Come here if number of records read was not even -C ------------------------------------------------ - - WRITE(6,111) NLINE - 111 FORMAT(' **** Number of records read in (=',I4,') is not ', - $ 'even. Abort') - ICODE=20 - ENDMSG='SYNDAT_GETJTBUL TERMINATED ABNORMALLY' - GO TO 900 - ENDIF - - PRINT *, ' ' - PRINT *, ' ' - NBULT=NLINE/2 - NBUL=0 - LOOP1: DO NL=1,NLINE - IF(LINE(NL).EQ.1) CYCLE LOOP1 - CHEKHED=HEAD(NL) - IFND = 0 - LOOP1n1: DO NB=NL+1,NLINE - IF(LINE(NB).EQ.1) CYCLE LOOP1n1 - NBSAV=NB - WRITE(6,11) CHEKHED,INLS(NB)(1:18) - 11 FORMAT(' ...message parts are ...',A18,'...',A18,'...') - IF(CHEKHED .EQ. 
INLS(NB)(1:18)) THEN - LINE(NL) = 1 - LINE(NB) = 1 - IFND = 1 - EXIT LOOP1n1 - ENDIF - ENDDO LOOP1n1 - IF(IFND.EQ.1) THEN - WRITE(6,131) INLS(NL)(10:10) - 131 FORMAT(' ...inls(nl)(10:10)=',A1,'...') - IF(INLS(NL)(10:10).eq.' ') THEN - LOOP1n2: DO IB=11,18 - IS=IB - IF(INLS(NL)(IS:IS).NE.' ') EXIT LOOP1n2 - ENDDO LOOP1n2 - STNAME=' ' - STNAME=INLS(NL)(IS:18) - INLS(NL)(10:18)=STNAME - ENDIF - OUTL=INLS(NL)(1:66)//INLS(NBSAV)(33:61) - WRITE(6,145) OUTL - 145 FORMAT(' ...Complete bulletin is ...',A95,'...') - WRITE(IOUT,22) OUTL - 22 FORMAT(A95) - NBUL=NBUL+1 - ENDIF - IF(NBUL .EQ. NBULT) GO TO 150 - ENDDO LOOP1 - - 150 CONTINUE - WRITE(6,151) NBUL - 151 FORMAT(' ...',I4,' bulletins have been made.') - ICODE=0 - ENDMSG='SYNDAT_GETJTBUL TERMINATED NORMALLY ' - GO TO 900 - - 200 continue - -C Come here if error reading a record from input file -C --------------------------------------------------- - - WRITE(6,201) - 201 FORMAT(' **** ERROR READING RECORD FROM INPUT FILE. ABORT') - ICODE=20 - ENDMSG='SYNDAT_GETJTBUL TERMINATED ABNORMALLY' - - 900 CONTINUE - - WRITE(6,*) ENDMSG - - CALL W3TAGE('SYNDAT_GETJTBUL') - - IF(ICODE.GT.0) CALL ERREXIT(ICODE) - - STOP - - END diff --git a/sorc/syndat_maksynrc.fd/maksynrc.f b/sorc/syndat_maksynrc.fd/maksynrc.f deleted file mode 100644 index dca5de25758..00000000000 --- a/sorc/syndat_maksynrc.fd/maksynrc.f +++ /dev/null @@ -1,472 +0,0 @@ -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C -C MAIN PROGRAM: SYNDAT_MAKSYNRC MAKE SYNDAT RECORD FROM HUMAN INPUT -C PRGMMR: STOKES ORG: NP23 DATE: 2013-03-15 -C -C ABSTRACT: QUERIES HUMAN INPUT FOR INFORMATION TO CONSTRUCT TROPICAL -C CYCLONE SYNTHETIC DATA RECORD AND WRITES RECORD TO FORTRAN -C UNIT 51 -C -C PROGRAM HISTORY LOG: -C 1997-06-26 S. J. LORD ---- ORIGINAL AUTHOR -C 1998-11-23 D. A. KEYSER -- FORTRAN 90 AND Y2K COMPLIANT -C 1998-12-30 D. A. KEYSER -- MODIFIED TO OUTPUT RECORDS CONTAINING A -C 4-DIGIT YEAR -C 2000-03-03 D. A. KEYSER -- CONVERTED TO RUN ON IBM-SP MACHINE -C 2013-03-15 D.
C. STOKES -- Modified some stdout writes to display -C cleanly as part of WCOSS transition. -C -C USAGE: -C INPUT FILES: -C UNIT 05 - INPUT FILE FOR HUMAN (KEYBOARD ENTRY) -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 51 - SYNTHETIC DATA RECORD (ONE PER RUN) -C -C SUBPROGRAMS CALLED: -C UNIQUE: - BEGINE ENDE MAKVIT NSEW -C LIBRARY: -C W3LIB: - W3TAGB W3TAGE -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C -C REMARKS: -C -C ATTRIBUTES: -C LANGUAGE FORTRAN 90 -C MACHINE: IBM-SP, IBM-iDataPlex -C -C$$$ - program SYNDAT_MAKSYNRC - logical fstflg - character rsmc*4,stmnam*9,stmid*3 - data iuntvi/51/,fstflg/.false./ - - CALL W3TAGB('SYNDAT_MAKSYNRC',2013,0074,0000,'NP23 ') - - write(6,*) "Welcome to the Synthetic Data Record Maker" - write(6,*) "+++ FORTRAN 90 / Y2K VERSION +++" - write(6,*) "+++ 03 March 2000 +++" - write(6,*) "Please follow all directions carefully, paying" - write(6,*) "careful attention to the Units as units" - write(6,*) "conversions are hardwired" - - call begine - write(6,*) 'Enter Storm Name (UPPER CASE)' - read(5,1) stmnam - 1 format(a) - write(6,2) stmnam - 2 format(' Storm name is:',a9) - call ende - - call begine - write(6,*) 'Enter Storm Identifier (e.g. 03P)' - read(5,11) stmid - 11 format(a) - write(6,12) stmid - 12 format(' Storm Identifier is:',a3) - call ende - - call begine - write(6,*) 'Enter Organization ID (e.g. 
NHC, JTWC)' - read(5,11) rsmc - write(6,13) rsmc - 13 format(' Organization Identifier is:',a4) - call ende - - call begine - write(6,*) 'Enter date (yyyymmdd)' - read(5,*) idate - write(6,*) 'Date is: ',idate - call ende - - call begine - write(6,*) 'Enter hour (hh)' - read(5,*) ihour - iutc=ihour*100 - write(6,*) 'Hour is: ',ihour - call ende - - call begine - write(6,*) 'Enter storm latitude (negative for south)' - read(5,*) stmlat - write(6,'(x,a,f5.1)') 'Storm latitude is: ',stmlat - call ende - - call begine - write(6,*) 'Enter storm longitude (DEG EAST)' - read(5,*) stmlon - write(6,'(x,a,f5.1)') 'Storm longitude is: ',stmlon - call ende - - call begine - write(6,*) 'Enter storm direction (DEG FROM NORTH)' - read(5,*) stmdir - write(6,'(x,a,f4.0)') 'Storm direction is: ',stmdir - call ende - - call begine - write(6,*) 'Enter storm speed (KNOTS)' - read(5,*) stmspd - write(6,'(x,a,f6.2)') 'Storm speed is: ',stmspd - stmspd=stmspd/1.94 - call ende - - call begine - write(6,*) 'Enter storm central pressure (MB)' - read(5,*) pcen - write(6,'(x,a,f5.0)') 'Storm central pressure is: ',pcen - call ende - - call begine - write(6,*) 'Enter storm environmental pressure (MB)' - read(5,*) penv - write(6,'(x,a,f5.0)') 'Storm environmental pressure is: ',penv - call ende - - call begine - write(6,*) 'Enter estimated maximum wind (KNOTS)' - read(5,*) vmax - write(6,'(x,a,f4.0)') 'Estimated maximum wind (KNOTS) is: ',vmax - vmax=vmax/1.94 - call ende - - call begine - write(6,*) 'Enter estimated radius of outermost closed ', - 1'isobar (ROCI), i.e. 
size of the storm circulation (KM)' - read(5,*) rmax - write(6,'(x,a,f5.0)') 'Estimated ROCI (KM) is: ',rmax - call ende - - call begine - write(6,*) 'Enter estimated radius of maximum wind (KM)' - read(5,*) rmw - write(6,'(x,a,f5.0)') - 1 'Estimated radius of maximum wind (KM) is: ',rmw - call ende - - call begine - call nsew - write(6,*) 'Enter estimated radius of 15 m/s (35 knot) winds (KM)' - write(6,*) - 1 'in each each of the following quadrants (e.g. 290 222 200 180)' - write(6,*) 'Note: numbers must be separated by blanks' - write(6,*) 'Note: numbers must be in the order NE SE SW NW and be' - 1 ,' separated by blanks' - write(6,*) 'Note: enter all negative numbers to denote no ', - 1'estimate' - read(5,*) r15ne,r15se,r15sw,r15nw - write(6,'(x,a,4f8.0)') - 1 'Estimated radius of 15 m/s (35 knot) winds is: ', - 2 r15ne,r15se,r15sw,r15nw - call ende - - call begine - call nsew - write(6,*) 'Enter estimated radius of 26 m/s (55 knot) winds (KM)' - write(6,*) - 1 'in each each of the following quadrants (e.g. 
50 50 50 50)' - write(6,*) 'Note: numbers must be separated by blanks' - write(6,*) 'Note: numbers must be in the order NE SE SW NW and be' - 1'separated by blanks' - write(6,*) 'Note: enter all negative numbers to denote no ', - 1'estimate' - read(5,*) r26ne,r26se,r26sw,r26nw - write(6,'(x,a,4f8.0)') - 1 'Estimated radius of 26 m/s (35 knot) winds is: ', - 2 r26ne,r26se,r26sw,r26nw - call ende - - call begine - write(6,*) 'Enter estimated top of cyclonic circulation (mb)' - read(5,*) ptop - write(6,'(x,a,f7.1)') - 1 'Estimated top of cyclonic circulation (mb) is: ',ptop - call ende - - call begine - write(6,*) 'Enter estimated latitude at maximum forecast time ' - write(6,*) '(negative for south)' - write(6,*) 'Note: enter -99.0 to denote no estimate' - read(5,*) fclat - write(6,'(x,a,f5.1)') - 1 'Estimated latitude at maximum forecast time is: ', fclat - call ende - - call begine - write(6,*) 'Enter estimated longitude at maximum forecast time ' - write(6,*) '(DEG EAST)' - write(6,*) 'Note: enter a negative number to denote no estimate' - read(5,*) fclon - write(6,'(x,a,f5.1)') - 1 'Estimated longitude at maximum forecast time is: ', fclon - call ende - - call begine - write(6,*) 'Enter maximum forecast time (hours, e.g. 72)' - write(6,*) 'Note: enter a negative number to denote no estimate' - read(5,*) fcstp - write(6,'(x,a,f4.0)') 'Maximum forecast time is: ',fcstp - call ende - - CALL MAKVIT(IUNTVI,IDATE,IUTC,STMLAT,STMLON,STMDIR,STMSPD, - 1 PCEN,PENV,RMAX,VMAX,RMW,R15NE,R15SE,R15SW, - 2 R15NW,PTOP,STMNAM,STMID,RSMC,FSTFLG,r26ne, - 3 r26se,r26sw,r26nw,fcstp,fclat,fclon) - - CALL W3TAGE('SYNDAT_MAKSYNRC') - stop - end - SUBROUTINE BEGINE - write(6,1) - 1 format(' ') - write(6,11) - 11 format(' *******************************************************') - return - end - - SUBROUTINE ENDE - write(6,1) - 1 format(' *******************************************************') - write(6,11) - 11 format(' ') - return - end -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . 
-C SUBPROGRAM: MAKVIT CREATES TROP. CYCLONE VITAL. STAT. DATA -C PRGMMR: D. A. KEYSER ORG: NP22 DATE: 1998-12-30 -C -C ABSTRACT: CREATES TROPICAL CYCLONE VITAL STATISTICS RECORDS FROM -C RAW INFORMATION SUCH AS LATITUDE, LONGITUDE, MAX. WINDS ETC. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD ---- ORIGINAL AUTHOR -C 1998-11-23 D. A. KEYSER -- FORTRAN 90 AND Y2K COMPLIANT -C 1998-12-30 D. A. KEYSER -- MODIFIED TO OUTPUT RECORDS CONTAINING A -C 4-DIGIT YEAR -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C MACHINE: CRAY, SGI -C -C$$$ - SUBROUTINE MAKVIT(IUNTVI,IDATE,IUTC,STMLAT,STMLON,STMDIR,STMSPD, - 1 PCEN,PENV,RMAX,VMAX,RMW,R15NE,R15SE,R15SW, - 2 R15NW,PTOP,STMNAM,STMID,RSMC,FSTFLG,r26ne, - 3 r26se,r26sw,r26nw,fcstp,fclat,fclon) -C - SAVE -C - CHARACTER *(*) RSMC,STMNAM,STMID - LOGICAL FSTFLG -C - PARAMETER (MAXCHR=129) - PARAMETER (MAXVIT=22) - PARAMETER (MAXTPC= 3) -C - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 SHALO*1,MEDIUM*1, - 2 DEEP*1,LATNS*1,LONEW*1,FMTVIT*6,FMTMIS*4,BUFINZ*129, - 3 RELOCZ*1,STMTPC*1,EXE*1, - 7 latnsf,lonewf -C - DIMENSION IVTVAR(MAXVIT),VITVAR(MAXVIT),VITFAC(MAXVIT), - 1 ISTVAR(MAXVIT),IENVAR(MAXVIT),STMTOP(0:MAXTPC) -C - DIMENSION BUFIN(MAXCHR),STMTPC(0:MAXTPC),FMTVIT(MAXVIT), - 1 MISSNG(MAXVIT),FMTMIS(MAXVIT) -C - 
EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ), - 4 (BUFIN(123),LATNSF),(BUFIN(129),LONEWF) -C - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) -C - EQUIVALENCE (VITVAR( 3),STMLTZ),(VITVAR( 4),STMLNZ), - 1 (VITVAR( 5),STMDRZ),(VITVAR( 6),STMSPZ), - 2 (VITVAR( 7),PCENZ), (VITVAR( 8),PENVZ), - 3 (VITVAR( 9),RMAXZ), (VITVAR(10),VMAXZ), - 4 (VITVAR(11),RMWZ), (VITVAR(12),R15NEZ), - 5 (VITVAR(13),R15SEZ),(VITVAR(14),R15SWZ), - 6 (VITVAR(15),R15NWZ),(VITVAR(16),R26NEZ), - 7 (VITVAR(17),R26SEZ),(VITVAR(18),R26SWZ), - 8 (VITVAR(19),R26NWZ),(VITVAR(20),FCSTPZ), - 9 (VITVAR(21),FCLATZ),(VITVAR(22),FCLONZ) -C - EQUIVALENCE (STMTPC(0), EXE),(STMTPC(1),SHALO),(STMTPC(2),MEDIUM), - 1 (STMTPC(3),DEEP) -C - DATA SHALO/'S'/,MEDIUM/'M'/,DEEP/'D'/,EXE/'X'/, - 2 VITFAC/2*1.0,2*0.1,1.0,0.1,14*1.0,2*0.1/, - 3 FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 4 3*'(I4.4)','(I2.2)','(I3.3)',8*'(I4.4)','(I2.2)', - 5 '(I3.3)','(I4.4)'/, - 6 FMTMIS/'(I8)','(I4)','(I3)','(I4)',2*'(I3)',3*'(I4)', - 7 '(I2)','(I3)',8*'(I4)','(I2)','(I3)','(I4)'/, - 8 MISSNG/-9999999,-999,-99,-999,2*-99,3*-999,-9,-99,8*-999,-9, - 9 -99,-999/, - O ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90, 97,102, - O 107,112,117,120,125/, - 1 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93,100,105, - 1 110,115,118,122,128/, - 3 STMTOP/-99.0,700.,400.,200./ -C - BUFINZ=' ' - RSMCZ=RSMC -cvvvvvy2k - -C NOTE: This program OUTPUTS a record containing a 4-digit year - for -C example: - -C NHC 13L MITCH 19981028 1800 164N 0858W 270 010 0957 1008 0371 51 019 0278 0278 0185 0185 D -C 12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345 ... -C 1 2 3 4 5 6 7 8 9 ... - -C This program will truncate the integer work containing the -C date in the form yyyymmdd to the form yymmdd prior to writing -C it into the output record. 
-cppppp - print *, ' ' - print *, ' ' - print *, '==> tcvitals file can now contain a 4-digit year, so ', - $ 'no conversion is necessary since 4-digit year is input' - print *, ' ' - print *, ' ' -cppppp -caaaaay2k - IDATEZ=IDATE - IUTCZ=IUTC - STMNMZ=STMNAM - STMIDZ=STMID - STMLTZ=STMLAT -C - IF(STMLTZ .GE. 0.0) THEN - LATNS='N' - ELSE - LATNS='S' - STMLTZ=ABS(STMLTZ) - ENDIF -C - IF(STMLON .GE. 180.) THEN - STMLNZ=360.-STMLON - LONEW='W' -C - ELSE - STMLNZ=STMLON - LONEW='E' - ENDIF -C - IF(fclat .GE. 0.0) THEN - fclatz=fclat - latnsf='N' - ELSE if (fclat .gt. -90.) then - latnsf='S' - fclatz=ABS(fclat) -c - else - latnsf='S' - fclatz=-99.9 - ENDIF -C - IF(fclon .GE. 180.) THEN - fclonz=360.-fclon - lonewf='W' -C - ELSE if (fclon .gt. 0.) then - fclonz=fclon - lonewf='E' -c - else - fclonz=-999.9 - lonewf='E' - ENDIF -C - STMDRZ=STMDIR - STMSPZ=STMSPD - PCENZ =PCEN - PENVZ =PENV - RMAXZ =RMAX - VMAXZ =VMAX - RMWZ =RMW - R15NEZ=R15NE - R15SEZ=R15SE - R15SWZ=R15SW - R15NWZ=R15NW - r26nez=r26ne - r26sez=r26se - r26swz=r26sw - r26nwz=r26nw - fcstpz=fcstp -C - FSTFLZ=' ' - IF(FSTFLG) FSTFLZ=':' -C - DO IV=1,2 - IF(IVTVAR(IV) .GE. 0) THEN - WRITE(BUFINZ(ISTVAR(IV):IENVAR(IV)),FMTVIT(IV)) IVTVAR(IV) - ELSE - WRITE(BUFINZ(ISTVAR(IV):IENVAR(IV)),FMTMIS(IV)) MISSNG(IV) - ENDIF - ENDDO -C - DO IV=3,MAXVIT - IF(VITVAR(IV) .GE. 0) THEN - IVTVAR(IV)=NINT(VITVAR(IV)/VITFAC(IV)) - WRITE(BUFINZ(ISTVAR(IV):IENVAR(IV)),FMTVIT(IV)) IVTVAR(IV) - ELSE - WRITE(BUFINZ(ISTVAR(IV):IENVAR(IV)),FMTMIS(IV)) MISSNG(IV) - ENDIF - ENDDO -C - DO ITOP=0,MAXTPC - IF(PTOP .EQ. STMTOP(ITOP)) THEN - STMDPZ=STMTPC(ITOP) - GO TO 31 - ENDIF - ENDDO - - 31 CONTINUE -C - IF(IUNTVI .GT. 
0) THEN - WRITE(IUNTVI,41) BUFINZ - 41 FORMAT(A) - WRITE(6,43) BUFINZ - 43 FORMAT(' ...',A,'...') - ELSE - WRITE(6,43) BUFINZ - ENDIF -C - RETURN - END - - SUBROUTINE NSEW - write(6,*) ' Quadrants' - write(6,*) ' NW : NE' - write(6,*) '----------- Order of quadrants: NE SE SW NW' - write(6,*) ' SW : SE' - return - end diff --git a/sorc/syndat_qctropcy.fd/qctropcy.f b/sorc/syndat_qctropcy.fd/qctropcy.f deleted file mode 100644 index e6bfadebd40..00000000000 --- a/sorc/syndat_qctropcy.fd/qctropcy.f +++ /dev/null @@ -1,12099 +0,0 @@ -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C -C MAIN PROGRAM: SYNDAT_QCTROPCY PERFORMS QC ON TROP. CYCLONE BULLETINS -C PRGMMR: KEYSER ORG: NP22 DATE: 2008-07-10 -C -C ABSTRACT: PERFORMS QUALITY CONTROL ON TROPICAL CYCLONE POSITION -C AND INTENSITY INFORMATION (T. C. VITAL STATISTICS). CHECKS -C PERFORMED ARE: DUPLICATE RECORDS, APPROPRIATE DATE/TIME, PROPER -C RECORD STRUCTURE (BLANKS IN PROPER PLACE AND NO IMPROPER NON- -C INTEGER NUMBERS), STORM NAME/ID NUMBER, RECORDS FROM MULTIPLE -C INSTITUTIONS, SECONDARY VARIABLES (E.G. CENTRAL PRESSURE), -C STORM POSITION AND DIRECTION/SPEED. EMPHASIS IS ON INTERNAL -C CONSISTENCY BETWEEN REPORTED STORM LOCATION AND PRIOR MOTION. -C -C PROGRAM HISTORY LOG: -C 1991-03-27 S. J. LORD -C 1991-07-18 S. J. LORD ADDED ROUTINE FSTSTM, MODIFIED ADFSTF -C 1992-01-22 S. J. LORD CHANGED W3FS12,W3FS13 CALLS TO W3FS19, W3FS17 -C 1992-02-19 S. J. LORD ADDED MULTIPLE RSMC CHECK -C 1992-04-09 S. J. LORD CHANGED SLMASK TO T126 FROM T80 -C 1992-05-20 S. J. LORD CORRECTED BUG IN SELACK CALL -C 1992-06-09 J. JOHNSON CHANGED COND=10 TO COND=4 FOR SUCCESSFUL RUN -C BUT WITH EMPTY INPUT FILES -C 1992-07-01 S. J. LORD ADDED DATE CHECK AND REVISED RITCUR -C 1992-07-10 S. J. LORD REVISED STIDCK TO DISMANTLE CONSISTENCY -C CHECKS IN THE CASE OF NUMBERED DEPRESSIONS -C 1992-07-16 S. J. LORD FIXED SOME BUGS IN RSMCCK -C 1992-08-20 S. J. LORD ADDED THE JTWC MEMORIAL SWITCH CHECK -C 1992-08-20 S. J. 
LORD MODIFIED DUPCHK TO ADD A NEW INPUT UNIT -C 1992-09-04 S. J. LORD ADDED PRESSURE WIND RELATIONSHIP TO SECVCK -C 1992-09-09 S. J. LORD ADDED CENTRAL PACIFIC NAMES AND NAME CHECK -C 1992-09-18 S. J. LORD ADDED CHECK FOR CORRECT MISSING DATA IN READCK -C 1992-10-28 S. J. LORD ADDED GREEK ALPHABET STORM NAMES -C 1992-12-14 S. J. LORD MODIFIED CONSOLE MESSAGE FOR ISTOP=4 -C 1993-03-05 S. J. LORD IMPLEMENTED STORM CATALOG (RCNCIL) -C 1993-03-31 S. J. LORD IMPLEMENTED READING STORM NAMES FROM EXTERNAL -C FILE IN STIDCK -C 1993-04-08 S. J. LORD IMPLEMENTED WEST PACIFIC CLIPER -C 1993-08-25 S. J. LORD ADDER RETURN CODE OF 10 FOR RCNCIL LOGICAL -C ERROR -C 1993-08-25 S. J. LORD UPGRADED STORM ID CHECKING FOR STORMS CHANGING -C 1994-06-20 S. J. LORD MODIFIED MAXCHK FOR THE GFDL FORMAT -C 1996-04-12 S. J. LORD REMOVED CALL TO DRSPCK -C 1997-06-24 S. J. LORD ADDED NEW UNIT FOR MANUALLY ENTERED MESSAGES -C 1998-03-24 S. J. LORD MODIFIED VITDATN.INC AND VITFMTN.INC TO -C RECOGNIZE RSMC ID "NWOC" (THIS HAD BEEN UNRECOGNIZED -C AND HAD CAUSED THE PROGRAM TO STOP 20); REMOVED -C UNINITIALIZED VARIABLES THAT WERE CAUSING COMPILER -C WARNINGS -C 1998-06-05 D.A. KEYSER - FORTRAN 90 AND Y2K COMPLIANT -C 1998-06-18 S.J. LORD - FORTRAN 90 AND Y2K COMPLIANT (vitfmt.inc) -C 1998-08-16 S.J. LORD - FORTRAN 90 AND Y2K COMPLIANT (completed) -C 1998-12-14 D. A. KEYSER - Y2K/F90 COMPLIANCE, STREAMLINED CODE; -C 2000-03-03 D. A. KEYSER - CONVERTED TO RUN ON IBM-SP MACHINE -C 2001-02-07 D. A. KEYSER - EXPANDED TEST STORM ID RANGE FROM 90-99 -C TO 80-99 AT REQUEST FOR JIM GROSS AT TPC {NOTE: IF THIS -C EVER HAS TO BE DONE AGAIN, THE ONLY LINES THAT NEED TO -C BE CHANGED ARE COMMENTED AS "CHG. 
TESTID" - ALSO MUST -C CHANGE PROGRAM bulls_bufrcyc WHICH GENERATES GTS -C MESSAGES, CHANGE UTILITY PROGRAM trpsfcmv WHICH -C GENERATES CHARTS FOR THE TROPICS (although technically -C trpsfcmv reads in q-c'd tcvitals files output by this -C program and thus they should not have test storms in -C them), and changes scripts: util/ush/extrkr.sh and -C ush/relocate_extrkr.sh} -C 2004-06-08 D. A. KEYSER - WHEN INTEGER VALUES ARE DECODED FROM -C CHARACTER-BASED RECORD VIA INTERNAL READ IN SUBR. DECVAR, -C IF BYTE IN UNITS DIGIT LOCATION IS ERRONEOUSLY CODED AS -C BLANK (" "), IT IS REPLACED WITH A "5" IN ORDER TO -C PREVENT INVALID VALUE FROM BEING RETURNED (I.E., IF -C "022 " WAS READ, IT WAS DECODED AS "22", IT IS NOW -C DECODED AS "225" - THIS HAPPENED FOR VALUE OF RADIUS OF -C LAST CLOSED ISOBAR FOR JTWC RECORDS FROM 13 JULY 2000 -C THROUGH FNMOC FIX ON 26 MAY 2004 - THE VALUE WAS REPLACED -C BY CLIMATOLOGY BECAUSE IT FAILED A GROSS CHECK, HAD THIS -C CHANGE BEEN IN PLACE THE DECODED VALUE WOULD HAVE BEEN -C W/I 0.5 KM OF THE ACTUAL VALUE) -C 2008-07-10 D. A. KEYSER - CORRECTED MEMORY CLOBBERING CONDITION -C IN SUBR. STIDCK RELATED TO ATTEMPTED STORAGE OF MORE WEST -C PACIFIC STORM NAMES FROM FILE syndat_stmnames (144) THAN -C ALLOCATED BY PROGRAM AND IN syndat_stmnames (140), THIS -C LED TO OVERWRITING OF FIRST FOUR syndat_stmnames STORM -C NAMES IN ATLANTIC BASIN FOR 2002, 2008, 2014 CYCLE - -C DISCOVERED BECAUSE 2008 STORM BERTHA (STORM #2 IN -C ATLANTIC BASIN LIST IN syndat_stmnames) WAS NOT BEING -C RECOGNIZED AND THUS NOT PROCESSED INTO OUTPUT TCVITALS -C FILE - CORRECTED BY LIMITING STORAGE OF WEST PACIFIC -C STORM NAMES TO EXACTLY THE MAXIMUM IN PROGRAM (AND NUMBER -C IN syndat_stmnames) (CURRENTLY 140), ALSO GENERALIZED -C CODE TO ENSURE THAT IS WILL NEVER CLOBBER MEMORY READING -C AND STORING STORM NAMES IN ANY OF THE BASINS EVEN IF THE -C NUMBER OF STORM NAMES IN syndat_stmnames INCREASE (AS -C LONG AS THE MAXIMUM VALUE IS .GE. 
TO THE NUMBER OF STORM -C NAMES FOR THE BASIN IN FILE syndat_stmnames) -C 2013-03-17 D. C. STOKES - CHANGED SOME LIST DIRECTED OUTPUT TO -C FORMATTED TO PREVENT UNNDECSSARY WRAPPING ON WCOSS. -C 2013-03-24 D. C. STOKES - INITIALIZE VARIABLES THAT WERE NOT GETTING -C SET WHEN THERE ARE NO RECORDS TO PROCESS. -C 2013-10-10 D. C. STOKES - ADDED NON-HYPHNATED CARDINAL NUMBERS IN -C ORDER TO RECOGNIZE SUCH NAMED STORMS IN BASINS L, E, C, W, -C AND TO RECOGNIZE NAME CHANGES OF SUCH IN THE OTHER BASINS. -C ALSO EXTENDED THAT LIST (FROM 36 TO 39). -C -C -C INPUT FILES: -C (Note: These need to be double checked) -C UNIT 03 - TEXT FILE ASSOCIATING UNIT NUMBERS WITH FILE NAMES -C UNIT 05 - NAMELIST: VARIABLES APPROPRIATE TO THIS Q/C PROGRAM: -C MAXUNT: NUMBER OF INPUT FILES -C FILES: LOGICAL VARIABLE CONTROLLING FINAL -C COPYING OF RECORDS AND FILE MANIPULATION. -C FOR NORMAL OPERATIONAL USAGE, SHOULD BE TRUE. -C WHEN TRUE, INPUT FILES (UNIT 30, UNIT 31, -C ETC) WILL ZEROED OUT. FOR MULTIPLE RUNS -C OVER THE SAME INPUT DATA SET, FILES MUST BE -C FALSE. FOR DEBUGGING, IT IS HIGHLY -C RECOMMENDED THAT FILES BE SET TO FALSE. -C LNDFIL: TRUE IF RECORDS OF STORMS OVER COASTAL -C POINTS ARE NOT COPIED TO THE FILE OF -C CURRENT QUALITY CONTROLLED RECORDS. -C RUNID: RUN IDENTIFIER (e.g., 'GDAS_TM00_00'). 
-C WINCUR: TIME WINDOW FOR WRITING CURRENT FILE -C NVSBRS: NUMBER OF VARIABLES ALLOWED FOR SUBSTITUTION -C IVSBRS: INDICES OF VARIABLES ALLOW FOR SUBSTITUTION -C UNIT 11 - APPROPRIATE T126 32-BIT GLOBAL SEA/LAND MASK FILE ON -C GAUSSIAN GRID -C UNIT 12 - RUN DATE FILE ('YYYYMMDDHH') -C UNIT 14 - DATA FILE CONTAINING STORM NAMES -C UNIT 20 - SCRATCH FILE CONTAINING PRELIMINARY Q/C RECORDS -C UNIT 21 - ORIGINAL SHORT-TERM HISTORY, CONTAINS ORIGINAL RECORDS -C BACK A GIVEN NUMBER (WINMIN) DAYS FROM PRESENT -C UNIT 22 - ALIASED SHORT-TERM HISTORY, CONTAINS ALIAS RECORDS -C BACK A GIVEN NUMBER (WINMIN) DAYS FROM PRESENT -C UNIT 25 - ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C UNIT 26 - NEW ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C NOTE: UCL SHOULD COPY THIS FILE TO UNIT 22 (THE OLD -C ALIAS FILE) AT THE END OF EXECUTION. -C UNIT 30 - STARTING POINT FOR FILES CONTAINING NEW RECORDS TO BE -C etc. QUALITY CONTROLLED. ADDITIONAL INPUT FILES ARE UNIT -C 31, UNIT 32 ETC. THE NUMBER OF THESE FILES IS -C CONTROLLED BY THE NAMELIST INPUT VARIABLE "MAXUNT" -C MENTIONED UNDER UNIT 05 ABOVE. AN EXAMPLE OF AN INPUT -C FILE IS: /tpcprd/atcf/ncep/tcvitals. THIS FILE IS -C WRITTEN BY A REMOTE JOB ENTRY (RJE) AT MIAMI AFTER ALL -C TROPICAL CYCLONE FIXES ARE ESTABLISHED FOR THE ATLANTIC -C AND EAST PACIFIC BY NHC(TPC). THIS FILE IS TYPICALLY -C UPDATED (cat'ed) AT 0230, 0830, 1430, AND 2030 UTC -C (I.E. 2.5 HOURS AFTER SYNOPTIC TIME), 4 TIMES DAILY. -C RECORDS APPROPRIATE TO A FUTURE CYCLE ARE WRITTEN BACK -C TO THE APPROPRIATE FILE. 
-C -C OUTPUT FILES: -C (Note: These need to be double checked) -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 20 - SCRATCH FILE CONTAINING PRELIMINARY Q/C RECORDS -C UNIT 21 - SHORT-TERM HISTORY, RECORDS BACK 4 DAYS FROM PRESENT -C UNIT 22 - NEW ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C UNIT 27 - STORM CATALOG FILE CONTAINING STORM NAME, ALIAS INFO -C FIRST AND LAST DATA OBSERVED -C UNIT 28 - SCRATCH FILE CONTAINING TEMPORARY CATALOG -C UNIT 30 - SEE INPUT FILES ABOVE. RECORDS APPROPRIATE TO A FUTURE -C etc. CYCLE ARE WRITTEN BACK TO THE APPROPRIATE FILE -C UNIT 54 - RUN DATE FILE FOR DATE CHECK ('YYYYMMDDHH') -C UNIT 60 - FILE CONTAINING QUALITY CONTROLLED RECORDS -C UNIT 61 - CONTAINS HISTORY OF ALL RECORDS THAT ARE OPERATED ON BY -C THIS PROGRAM -C -C SUBPROGRAMS CALLED: -C UNIQUE: - RSMCCK BASNCK AKASUB TCCLIM RCNCIL -C MNMXDA SCLIST AKLIST STCATI STCATN -C ADFSTF FSTSTM RITCUR RITSTH RITHIS -C FNLCPY CPYREC DUPCHK BLNKCK READCK -C DTCHK SETMSK STIDCK FIXDUP FIXNAM -C SECVCK WRNING F1 F2 SLDATE -C FIXSLM GAULAT BSSLZ1 TRKSUB NEWVIT -C DECVAR TIMSUB YTIME SORTRL DS2UV -C ATAN2D SIND COSD DISTSP AVGSUB -C ABORT1 OFILE0 -C LIBRARY: -C COMMON - IARGC GETARG INDEX -C W3LIB - W3TAGB W3TAGE W3DIFDAT W3MOVDAT W3UTCDAT -C - ERREXIT -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN. NO RECORDS WITH ERRORS -C = 1 - SUCCESSFUL RUN. FOUND RECORDS WITH STORM ID>=80 -C CHG. TESTID -C = 2 - SUCCESSFUL RUN. FOUND RECORDS WITH ERRORS -C = 3 - BOTH 1 AND 2 ABOVE -C = 4 - SUCCESSFUL RUN, BUT NO INPUT RECORDS FOUND -C = 5 - PROGRAM HAS BEEN RUN PREVIOUSLY -C =10 - LOGICAL INCONSISTENCY IN SUBROUTINE RCNCIL (??) -C =20 - FATAL ERROR (SEE STDOUT PRINT FOR MORE DETAILS) -C -C REMARKS: NONE. 
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - PROGRAM SYNDAT_QCTROPCY - - PARAMETER (MAXCHR=95) - PARAMETER (MAXREC=1000) - PARAMETER (MAXCKS=8) - PARAMETER (MAXRC=MAXREC*(MAXCKS+1)) - PARAMETER (MAXTBP=20) - PARAMETER (MAXFIL=99) - PARAMETER (IVSBMX=14,IVSBM1=IVSBMX+1) - - CHARACTER FILNAM*128 - - DIMENSION FILNAM(0:MAXFIL) - - CHARACTER TSTREC(0:MAXREC)*100,OKAREC(MAXREC)*100, - 1 BADREC(MAXREC)*100,DUMREC*100,SCRREC(0:MAXREC)*9, - 2 XXXREC*27,ZZZREC*100,NNNREC*100,TBPREC(MAXTBP)*100, - 3 SCRATC(MAXREC)*100 - - DIMENSION IEFAIL(MAXREC,0:MAXCKS),NUMOKA(MAXREC),NUMBAD(MAXREC), - 1 NUMTST(MAXREC),NUMTBP(MAXTBP),IDUPID(MAXREC), - 2 IUNTIN(MAXREC) - -C IUNTSL: UNIT NUMBER FOR READING T126 32-BIT SEA-LAND MASK -C ON GAUSSIAN GRID -C IUNTDT: UNIT NUMBER FOR READING RUN DATE ('YYYYMMDDHH') -C IUNTDC: UNIT NUMBER FOR RUN DATE ('YYYYMMDDHH') CHECK -C IUNTOK: UNIT NUMBER FOR PRELIMINARY QUALITY-CONTROLLED -C RECORDS. ***NOTE: AT THE END OF THIS PROGRAM, -C IUNTOK CONTAINS THE SHORT-TERM -C HISTORICAL RECORDS FOR THE NEXT -C INPUT TIME. -C IUNTAL: UNIT NUMBER FOR ALIAS FILE WHICH CONTAINS STORM IDS -C FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C IUNTAN: UNIT NUMBER FOR NEW ALIAS FILE -C IUNTCA: UNIT NUMBER FOR STORM CATALOG FILE WHICH CONTAINS -C CURRENT LISTING OF ALL STORMS, THEIR NAMES, DATES -C IDS AND ALIASES -C IUNTCN: UNIT NUMBER FOR SCRATCH STORM CATALOG -C IUNTCU: UNIT NUMBER FOR FINAL QUALITY-CONTROLLED RECORDS -C (CURRENT FILE) -C IUNTHO: UNIT NUMBER FOR THE SHORT-TERM HISTORICAL (ORIGINAL) -C VITAL STATISTICS RECORDS. LENGTH OF HISTORY -C CONTROLLED BY WINMIN. THESE ARE ORIGINAL RECORDS AND -C NOT ALIASED RECORDS! -C IUNTHA: UNIT NUMBER FOR THE SHORT-TERM HISTORICAL (ALIAS) -C VITAL STATISTICS RECORDS. LENGTH OF HISTORY -C CONTROLLED BY WINMIN. THESE ARE ALIAS RECORDS IF -C MULTIPLE OBSERVERS FOR A GIVEN STORM ARE PRESENT! -C IUNTHL: UNIT NUMBER FOR THE LONG-TERM HISTORICAL (PREVIOUS) -C VITAL STATISTICS RECORDS. 
ALL RECORDS, AND QUALITY -C CONTROL FLAGS ARE PUT INTO THIS FILE. -C IUNTVI: UNIT NUMBER FOR RAW VITAL STATISTICS FILE (NEITHER -C QUALITY CONTROLLED NOR CHECKED FOR DUPLICATES) -C WINMIN: WINDOW FOR SHORT-TERM HISTORY FILE (FRACTIONAL DAYS) -C WINMX1: WINDOW FOR MAXIMUM ACCEPTABLE DATE (FRACTIONAL DAYS) -C FOR RECORD PROCESSING -C WINCUR: WINDOW FOR WRITING CURRENT FILE (FRACTIONAL DAYS) -C FILES: TRUE IF NEW SHORT-TERM HISTORY FILE IS CREATED AND -C ALL NEW RECORD FILES ARE ZEROED OUT -C LNDFIL: TRUE IF RECORDS OF STORMS OVER COASTAL POINTS ARE -C NOT COPIED TO THE FILE OF CURRENT QUALITY CONTROLLED -C RECORDS. - - DIMENSION RINC(5) - - DIMENSION IVSBRS(0:IVSBMX) - LOGICAL FILES,LNDFIL - CHARACTER RUNID*12 - - NAMELIST/INPUT/IDATEZ,IUTCZ,RUNID,FILES,LNDFIL,MAXUNT,WINMIN, - 1 NVSBRS,IVSBRS,WINCUR - - DATA IUNTSL/11/,IUNTDT/12/,IUNTDC/54/,IUNTOK/20/,IUNTHO/21/, - 1 IUNTVI/30/,MAXUNT/2/,IUNTCU/60/,IUNTHL/61/,WINMIN/4./, - 2 WINMX1/0.0833333/,IEFAIL/MAXRC*0/,LNDFIL/.TRUE./,IUNTOP/3/, - 3 IUNTHA/22/,IUNTAL/25/,IUNTAN/26/,NVSBRS/0/,IVSBRS/IVSBM1*0/, - 4 WINCUR/0.25/,FIVMIN/3.4722E-3/,FILES/.FALSE./,IUNTCA/27/, - 5 IUNTCN/28/,IUNTSN/14/ - DATA NNNREC/'12345678901234567890123456789012345678901234567890123 - 1456789012345678901234567890123456789012345*****'/ - DATA ZZZREC/'RSMC#SID#NAMEZZZZZ#YYYYMMDD#HHMM#LATZ#LONGZ#DIR#SPD#P - 1CEN#PENV#RMAX#VM#RMW#15NE#15SE#15SW#15NW#D*****'/ - DATA - 1 XXXREC/' FL BL RD DT LL ID MR SV DS'/ - - CALL W3TAGB('SYNDAT_QCTROPCY',2013,0053,0050,'NP22 ') - -C INITIALIZE SOME VARIABLES THAT MIGHT GET USED BEFORE GETTING SET -C UNDER CERTAIN CONDITIONS - IERCHK=0 - IERRCN=0 - NTBP=0 - -C OPEN FILES - - filnam(0)='fildef.vit' - CALL OFILE0(IUNTOP,MAXFIL,NFTOT,FILNAM) - -C READ RUN DATE AND CONVERT TO FLOATING POINT DATE. 
-C THE RUN DATE ACCEPTANCE WINDOW IS NOT SYMMETRIC ABOUT -C THE CURRENT RUN DATE - - READ(5,INPUT) - WRITE(6,INPUT) - -C GET CURRENT RUN DATE AND OFFSET IN SJL FORMAT -C OFFSET ROUNDED TO THE NEAREST HOUR FROM W3 CALLS - - IOFFTM = 0 - - IF(IDATEZ .LT. 0) THEN - CALL SLDATE(IUNTDC,IDATCK,IUTCCK,IOFFTM) - CALL SLDATE(IUNTDT,IDATEZ,IUTCZ,IOFFTM) - IF(FILES .AND. IDATCK .EQ. IDATEZ .AND. IUTCCK .EQ. IUTCZ) THEN - WRITE(6,1) FILES,IDATCK,IUTCCK - 1 FORMAT(/'######WITH FILES=',L2,' THIS PROGRAM HAS RUN PREVIOUSLY', - 1 ' FOR DATE,TIME=',I9,I5) - ISTOP=5 - GO TO 1000 - ENDIF - ENDIF - - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAY0) - HROFF =IOFFTM*.01 - CYCOFF=(1.0+HROFF)/24. - IF(HROFF .GT. 24.) HROFF=-99.99 - - WRITE(6,2) IOFFTM,CYCOFF - 2 FORMAT(/'...OFFTIM,CYCOFF=',I12,F12.5) - -C THE MINIMUM WINDOW DETERMINES THE OLDEST RECORD THAT CAN -C BE PROCESSED BY QUALITY CONTROL. IT IS ALSO THE TIME COVERED -C BY THE SHORT-TERM HISTORICAL STORMS IN THE WORKING FILE. - -C THERE ARE TWO MAXIMUM WINDOWS: THE SHORT ONE (DAYMX1=2 HR) IS -C FOR PROCESSING RECORDS NO LATER THAN THE CYCLE TIME. THE -C LARGER ONE (DAYMX2) EXTENDS TO THE CURRENT TIME (THE TIME AT -C WHICH THIS PROGRAM IS RUN) PLUS 1 HOUR. RECORDS LATER THAN -C DAYMX1 BUT EARLIER THAN DAYMX2 WILL BE "THROWN BACK INTO -C THE POND" AND WILL BE PROCESSED AT THE NEXT CYCLE. 
- - DAYMIN=DAY0-WINMIN - DAYMX1=DAY0+WINMX1 - DAYMX2=DAY0+CYCOFF - DAYCUR=DAY0-WINCUR - DAYOFF=0.0 - - DAYMX1=DAYMX1+DAYOFF - - WRITE(6,3) WINMIN,WINMX1,DAYMIN,DAYMX1,DAYMX2 - 3 FORMAT(/'...WINMIN,WINMX1,DAYMIN,DAYMX1,DAYMX2=',/,4X,5F12.3) - - WRITE(6,5) IDATEZ,IUTCZ,DAY0,RUNID,LNDFIL,FILES - 5 FORMAT(20X,'***********************************************'/ - 1 20X,'***********************************************'/ - 2 20X,'**** WELCOME TO SYNDAT_QCTROPCY ****'/ - 3 20X,'**** Y2K/F90 VERSION - 17 MARCH 2013 ****'/ - 4 20X,'**** ****'/ - 5 20X,'**** VITAL STATISTICS RECORD CHECKER ****'/ - 6 20X,'**** FOR DATE=',I8,' UTC=',I4.4,10X,'****'/ - 7 20X,'**** JULIAN DAY=',F10.3,16X,'****'/ - 8 20X,'**** RUNID=',A12,' LNDFIL=',L1,' FILES=',L1,4X,'****'/ - 9 20X,'**** 1) INPUT RECORDS ARE CHECKED FOR ****'/ - O 20X,'**** EXACT DUPLICATES ****'/ - 1 20X,'**** 2) QUALITY CONTROL CHECKS. ****'/ - 2 20X,'**** FIRST: PRIMARY INFORMATION ****'/ - 3 20X,'**** (RECOVERY IS ESSENTIAL) ****'/ - 4 20X,'**** A) ALL COLUMNS ****'/ - 5 20X,'**** B) DATE/TIME ****'/ - 6 20X,'**** C) POSITION ****'/ - 7 20X,'**** SECOND: SECONDARY INFO. ****') - WRITE(6,6) - 6 FORMAT(20X,'**** (RECOVERY FROM PERSIS.) 
****'/ - 1 20X,'**** D) DIRECTION/SPEED ****'/ - 2 20X,'**** E) RMAX, PENV, PCEN, STM DEPTH ****'/ - 3 20X,'**** THIRD: TERTIARY INFORMATION ****'/ - 4 20X,'**** (RECOVERY DESIRABLE) ****'/ - 5 20X,'**** F) VMAX, RMW ****'/ - 6 20X,'**** G) R15 NE, SE, SW, NW ****'/ - 7 20X,'**** ****'/ - 8 20X,'***********************************************'/ - 9 20X,'***********************************************'/) - - WRITE(6,7) IUNTSL,IUNTDT,IUNTSN,IUNTOK,IUNTCU,IUNTAL,IUNTAN, - 1 IUNTCA,IUNTCN,IUNTHO,IUNTHA,IUNTHL,IUNTVI - 7 FORMAT(20X,'I/O UNITS ARE:'/ - 1 22X,'SEA/LAND MASK =IUNTSL =',I3/ - 2 22X,'RUN DATE (YYYYMMDDHH) =IUNTDT =',I3/ - 3 22X,'STORM NAMES =IUNTSN =',I3/ - 4 22X,'PRELIMINARY Q/C RECORDS =IUNTOK =',I3/ - 5 22X,'FINAL Q/C RECORDS =IUNTCU =',I3/ - 6 22X,'STORM ID ALIAS =IUNTAL =',I3/ - 7 22X,'NEW STORM ID ALIAS =IUNTAN =',I3/ - 8 22X,'STORM CATALOG =IUNTCA =',I3/ - 9 22X,'SCRATCH STORM CATALOG =IUNTCN =',I3/ - O 22X,'SHORT TERM HIST. (ORIG.)=IUNTHO =',I3/ - 1 22X,'SHORT TERM HIST. (ALIAS)=IUNTHA =',I3/ - 2 22X,'LONG TERM HIST. 
=IUNTHL =',I3/ - 3 22X,'NEW RECORDS =IUNTVI>=',I3) - -C SET UP THE T126 32-BIT SEA-LAND MASK ON GAUSSIAN GRID -C NTEST,NOKAY,NBAD ARE ALL MEANINGLESS NUMBERS AT THIS POINT - - NTEST=1 - NOKAY=1 - NBAD =1 - CALL SETMSK(IUNTSL,NTEST,NOKAY,NBAD,IECOST,IEFAIL(1:MAXREC,4), - 1 NUMTST,NUMOKA,NUMBAD,ZZZREC,NNNREC,TSTREC,BADREC, - 2 OKAREC) - -C INITIAL CHECKS ARE FOR EXACT DUPLICATES AND BLANKS IN THE -C CORRECT SPOT - - NOKAY=0 - NBAD=0 - CALL DUPCHK(IUNTVI,MAXUNT,MAXREC,IERCHK,NTEST,IEFAIL(1:MAXREC,0), - 1 NUMTST,DUMREC,TSTREC,BADREC,*500) - -C SAVE THE INPUT UNIT NUMBERS FOR ALL RECORDS - - IUNTIN(1:NTEST)=IEFAIL(1:NTEST,0) -C - CALL BLNKCK(NTEST,NOKAY,NBAD,IEFAIL(1:MAXREC,1),NUMTST,NUMOKA, - 1 NUMBAD,ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) - -C RELOAD THE TEST RECORDS - - NTEST=NOKAY - NUMTST(1:NOKAY)=NUMOKA(1:NOKAY) - TSTREC(1:NOKAY)=OKAREC(1:NOKAY) - NOKAY=0 - - CALL READCK(NTEST,NOKAY,NBAD,IEFAIL(1:MAXREC,2),NUMTST,NUMOKA, - 1 NUMBAD,ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) - -C RELOAD THE TEST RECORDS AGAIN - - NTEST=NOKAY - NUMTST(1:NOKAY)=NUMOKA(1:NOKAY) - TSTREC(1:NOKAY)=OKAREC(1:NOKAY) - NOKAY=0 - NTBP=MAXTBP -C - CALL DTCHK(NTEST,NOKAY,NBAD,NTBP,IEFAIL(1:MAXREC,3),NUMTST,NUMOKA, - 1 NUMBAD,NUMTBP,DAYMIN,DAYMX1,DAYMX2,DAYOFF,TSTREC, - 2 BADREC,OKAREC,TBPREC) - -C ENCORE, UNE FOIS - - NTEST=NOKAY - NUMTST(1:NOKAY)=NUMOKA(1:NOKAY) - TSTREC(1:NOKAY)=OKAREC(1:NOKAY) - NOKAY=0 - - CALL LLCHK(IUNTSL,NTEST,NOKAY,NBAD,IEFAIL(1:MAXREC,4),NUMTST, - 1 NUMOKA,NUMBAD,ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) - -C ONE MORE TIME (POUR CEUX QUI NE PARLE PAS FRANCAIS) - - NTEST=NOKAY - NUMTST(1:NOKAY)=NUMOKA(1:NOKAY) - TSTREC(1:NOKAY)=OKAREC(1:NOKAY) - NOKAY=0 - - CALL STIDCK(IUNTHO,IUNTSN,IUNTCA,NTEST,IYR,MAXREC,NOKAY,NBAD, - 1 IEFAIL(1:MAXREC,5),IDUPID,NUMTST,NUMOKA,NUMBAD,ZZZREC, - 2 NNNREC,TSTREC,BADREC,OKAREC,SCRATC) - - -C ***************************************************************** -C ***************************************************************** -C **** **** -C **** END OF 
THE FIRST PHASE OF ERROR CHECKING. FROM NOW **** -C **** ON, THE ORIGINAL RECORD SHORT-TERM HISTORY FILE IS **** -C **** CLOSED AND THE ALIAS SHORT-TERM HISTORY FILE IS OPEN. **** -C **** SOME INPUT RECORDS MAY BE CHANGED DUE TO SUBSTITUTION **** -C **** OF MISSING VALUES OR AVERAGING OF MULTIPLE STORM **** -C **** REPORTS. **** -C **** **** -C ***************************************************************** -C ***************************************************************** - -C MULTIPLE RSMC CHECK: SAME STORM REPORTED BY MORE THAN ONE -C TROPICAL CYCLONE WARNING CENTER. - -C CHECK FOR: -C 1) MULTIPLE STORM REPORTS BY DIFFERENT RSMC'S AT THE SAME TIME -C 2) TIME SERIES OF REPORTS ON THE SAME STORM BY DIFFERENT RSMC'S -C RECONCILE THE ABOVE: -C 1) ASSIGN A COMMON STORM ID -C 2) REMOVE MULTIPLE REPORTS IN FAVOR OF A SINGLE REPORT WITH THE -C COMMON STORM ID AND COMBINED (AVERAGED) PARAMETERS IF -C NECESSARY - -CCCC NTEST=NOKAY -CCCC WRITE(6,61) XXXREC -CCC61 FORMAT(///'...THE FOLLOWING ACCEPTABLE RECORDS ARE ELIGIBLE FOR ', -CCCC 1 'THE MULTIPLE RSMC CHECK.'/4X,'ERROR CODES ARE:'/21X, -CCCC 2 '=0: NO ERRORS OCCURRED'/21X,'<0: SUCCESSFUL ERROR ', -CCCC 3 'RECOVERY',55X,A/) - -CCCC DO NOK=1,NOKAY -CCCC NUMTST(NOK)=NUMOKA(NOK) -CCCC TSTREC(NOK)=OKAREC(NOK) -CCCC WRITE(6,67) NOK,OKAREC(NOK)(1:MAXCHR),(IEFAIL(NUMOKA(NOK),ICK), -CCCC 1 ICK=0,MAXCKS) - 67 FORMAT('...',I3,'...',A,'...',I2,8I3) -CCCC ENDDO -CCCC NOKAY=0 -CCCC REWIND IUNTOK - -c Stopgap measure is to not allow records to be written into -c the alias short-term history file (17 Sept. 1998) - NRCOVR=0 -CCCC CALL RSMCCK(IUNTHO,IUNTHA,IUNTAL,IUNTAN,IUNTCA,IUNTOK,NVSBRS, -CCCC 1 IVSBRS,MAXREC,NTEST,NOKAY,NBAD,NRCOVR, -CCCC 2 IEFAIL(1:MAXREC,6),NUMTST,NUMOKA,NUMBAD,IDUPID,TSTREC, -CCCC 3 BADREC,OKAREC,SCRATC) - -C COPY ALIAS SHORT-TERM HISTORY RECORDS FROM THE PRELIMINARY -C (SCRATCH) FILE TO THE ALIAS SHORT-TERM HISTORY FILE ONLY -C WHEN WE WISH TO UPDATE THE SHORT-TERM HISTORY FILE. 
- - IF(FILES) THEN - ICALL=1 - REWIND IUNTHA - WRITE(6,93) - 93 FORMAT(/'...THE FOLLOWING RECORDS WILL BE COPIED FROM THE ', - 1 'PRELIMINARY QUALITY CONTROLLED FILE TO THE ALIAS ', - 2 'SHORT-TERM HISTORICAL FILE:') - - CALL CPYREC(ICALL,IUNTOK,IUNTHA,NOKAY,DAYMIN,DUMREC,OKAREC) - ENDIF - -C BEGIN CHECKS FOR SECONDARY STORM INFORMATION WHICH INCLUDES: -C 1) DIRECTION, SPEED -C 2) PCEN, PENV, RMAX, STORM DEPTH -C THESE NUMBERS ARE NEEDED BY YOGI. IF MISSING, WE TRY TO -C FILL THEM IN BY PERSISTENCE. - -C FIRST, COPY HISTORICAL RECORDS TO THE PRELIMINARY QUALITY -C CONTROLLED FILE AND THEN COPY THE RECORDS FROM THE CURRENT FILE. - -C COPY HISTORICAL RECORDS TO PRELIMINARY FILE, CHECK FOR DUPLICATES - - REWIND IUNTOK - IF(FILES) THEN - ICALL=3 - WRITE(6,95) DAYMIN,ICALL - 95 FORMAT(/'...THE FOLLOWING RECORDS, HAVING DATES GREATER THAN ', - 1 'OR EQUAL TO DAY',F10.3,', WILL BE CHECKED FOR EXACT ', - 2 'AND PARTIAL DUPLICATES '/4X,'(ICALL=',I2,')', - 3 'AND COPIED FROM THE ALIAS SHORT-TERM HISTORICAL FILE ', - 4 'TO THE PRELIMINARY QUALITY CONTROLLED FILE WHICH NOW ', - 5 'WILL CONTAIN '/4X,'ALIAS RECORDS:'/) - - CALL CPYREC(ICALL,IUNTHA,IUNTOK,NOKAY,DAYMIN,DUMREC,OKAREC) - - ELSE - WRITE(6,97) - 97 FORMAT(/'...THE FOLLOWING RECORDS WILL BE COPIED FROM THE ', - 1 'SCRATCH ARRAY TO THE PRELIMINARY QUALITY CONTROLLED ', - 2 'FILE:') - DO NRC=1,NRCOVR - WRITE(6,105) SCRATC(NRC) - 105 FORMAT(' ...',A,'...') - WRITE(IUNTOK,107) SCRATC(NRC) - 107 FORMAT(A) - ENDDO - ENDIF - -C OH NO, NOT AGAIN!!! 
- - NTEST=NOKAY - write(6,1011) ntest - 1011 format(/'***debug ntest=nokay=',i4/) - WRITE(6,111) - 111 FORMAT(/'...IN PREPARATION FOR SECONDARY VARIABLE CHECKING, THE ', - 1 'FOLLOWING ACCEPTABLE RECORDS WILL BE '/4X,'ADDED TO THE', - 2 ' PRELIMINARY,QUALITY CONTROLLED FILE:'/) - DO NOK=1,NOKAY - NUMTST(NOK)=NUMOKA(NOK) - TSTREC(NOK)=OKAREC(NOK) - WRITE(6,113) NOK,NUMOKA(NOK),OKAREC(NOK) - 113 FORMAT(' ...',I4,'...',I4,'...',A) - WRITE(IUNTOK,119) OKAREC(NOK) - 119 FORMAT(A) - ENDDO - - NOKAY=0 - CALL SECVCK(IUNTOK,NTEST,NOKAY,NBAD,NUMTST,NUMOKA,NUMBAD,DAY0, - 1 DAYMIN,DAYMX1,DAYOFF,IEFAIL(1:MAXREC,7),ZZZREC,NNNREC, - 2 SCRREC,TSTREC,BADREC,OKAREC) - -C COPY HISTORICAL RECORDS TO PRELIMINARY FILE, CHECK FOR DUPLICATES - - REWIND IUNTOK - IF(FILES) THEN - ICALL=3 - WRITE(6,95) DAYMIN,ICALL - CALL CPYREC(ICALL,IUNTHA,IUNTOK,NOKAY,DAYMIN,DUMREC,OKAREC) - - ELSE - WRITE(6,97) - DO NRC=1,NRCOVR - WRITE(6,105) SCRATC(NRC) - WRITE(IUNTOK,107) SCRATC(NRC) - ENDDO - ENDIF - - NTEST=NOKAY - WRITE(6,201) - 201 FORMAT(//'...THE FOLLOWING ACCEPTABLE RECORDS WILL BE ADDED TO ', - 1 'THE PRELIMINARY QUALITY CONTROLLED FILE '/4X,'IN ', - 2 'PREPARATION FOR DIRECTION/SPEED CHECKING.'/) - DO NOK=1,NOKAY - NUMTST(NOK)=NUMOKA(NOK) - TSTREC(NOK)=OKAREC(NOK) - WRITE(6,203) NOK,OKAREC(NOK) - 203 FORMAT(' ...',I4,'...',A) - WRITE(IUNTOK,207) OKAREC(NOK) - 207 FORMAT(A) - ENDDO - - NOKAY=0 - -C SEA/LAND MASK CHECK - - CALL SELACK(NTEST,NOKAY,NBAD,IECOST,IEFAIL(1:MAXREC,4),NUMTST, - 1 NUMOKA,NUMBAD,LNDFIL,ZZZREC,NNNREC,TSTREC,BADREC, - 2 OKAREC) - - WRITE(6,301) XXXREC - 301 FORMAT(/'...THE SECONDARY VARIABLE, DIR/SPD AND SEA/LAND ', - 1 'CHECKING HAVE CONCLUDED. 
ERROR CHECKING HAS ENDED.'/4X, - 2 'OKAY RECORDS AND ERROR CODES ARE:',69X,A/) - - DO NOK=1,NOKAY - WRITE(6,67) NOK,OKAREC(NOK)(1:MAXCHR),IEFAIL(NUMOKA(NOK),0), - 1 (-IABS(IEFAIL(NUMOKA(NOK),ICK)), - 1 ICK=1,MAXCKS) - ENDDO - - WRITE(6,311) XXXREC - 311 FORMAT(/'...BAD RECORDS AND ERROR CODES ARE:',71X,A/) - - DO NBA=1,NBAD - WRITE(6,67) NBA,BADREC(NBA)(1:MAXCHR),IEFAIL(NUMBAD(NBA),0), - 1 (IEFAIL(NUMBAD(NBA),ICK),ICK=1,MAXCKS) - - ENDDO - -C RECONCILE THE STORM IDS WITH THE STORM CATALOG - -C LET'S PRETEND WE'RE NOT GOING TO DO IT, BUT DO IT ANYWAY - - NTEST=NOKAY+NBAD - WRITE(6,401) XXXREC - 401 FORMAT(///'...THE FOLLOWING ACCEPTABLE RECORDS WILL BE ', - 1 'RECONCILED WITH THE STORM CATALOG.'/4X,'ERROR CODES ', - 2 'ARE:'/21X,'=0: NO ERRORS OCCURRED'/21X,'<0: ', - 3 'SUCCESSFUL ERROR RECOVERY',56X,A/) - - DO NOK=1,NOKAY - NUMTST(NOK)=NUMOKA(NOK) - TSTREC(NOK)=OKAREC(NOK) - WRITE(6,67) NOK,OKAREC(NOK)(1:MAXCHR),IEFAIL(NUMOKA(NOK),0), - 1 (IEFAIL(NUMOKA(NOK),ICK),ICK=1,MAXCKS) - ENDDO - WRITE(6,411) XXXREC - 411 FORMAT(//'...THE FOLLOWING BAD RECORDS WILL BE RECONCILED WITH ', - 1 'THE STORM CATALOG FOR OVERLAND OR OVERLAPPING STORM ', - 2 'CASES.'/4X,'ERROR CODES ARE:'/21X,'>0: ERROR FOUND',70X, - 3 A/) - DO NBA=1,NBAD - NUMTST(NOKAY+NBA)=NUMBAD(NBA) - TSTREC(NOKAY+NBA)=BADREC(NBA) - IF(IEFAIL(NUMBAD(NBA),4) .EQ. 5 .OR. - 1 IEFAIL(NUMBAD(NBA),4) .EQ. 6 .OR. - 2 IEFAIL(NUMBAD(NBA),6) .EQ. 22) THEN - WRITE(6,67) NBA+NOKAY,BADREC(NBA)(1:MAXCHR),IEFAIL(NUMBAD(NBA),0), - 1 (IEFAIL(NUMBAD(NBA),ICK),ICK=1,MAXCKS) - ENDIF - ENDDO - - call rcncil(iuntca,iuntcn,iuntal,ntest,nokay,nbad,maxrec,maxcks, - 1 iefail,ierrcn,idupid,numtst,numoka,numbad,tstrec, - 2 badrec,okarec) - -C CLEAR OUT THE TEMPORARY ALIAS FILE; AKAVIT IS IN ITS FINAL FORM. - - REWIND IUNTAN - END FILE IUNTAN - -C ERROR CHECKING HAS FINALLY ENDED - - 500 WRITE(6,501) XXXREC - 501 FORMAT(//'...THE FINAL ERROR CHECKING HAS ENDED. 
BAD RECORDS ', - 1 'AND ERROR CODES ARE:',36X,A/) - ISTP90=0 - ISTPBR=0 - DO NBA=1,NBAD - DO NCK=1,MAXCKS - -C SELECT APPROPRIATE CONDITION CODE FOR STOP - - IF(IEFAIL(NUMBAD(NBA),NCK) .EQ. 2 .AND. NCK .EQ. 5) THEN - ISTP90=1 - ELSE IF(IEFAIL(NUMBAD(NBA),NCK) .NE. 0) THEN - ISTPBR=2 - ENDIF - ENDDO - - WRITE(6,543) NBA,BADREC(NBA)(1:MAXCHR),(IEFAIL(NUMBAD(NBA),ICK), - 1 ICK=0,MAXCKS) - 543 FORMAT(' ...',I3,'...',A,'...',I2,8I3) - ENDDO - ISTOP=ISTP90+ISTPBR - IF(IERCHK .EQ. 161) ISTOP=04 - IF(IERRCN .NE. 0) ISTOP=10 - WRITE(6,551) ISTP90,ISTPBR,IERRCN,ISTOP - 551 FORMAT(/'...STOP CODES ARE: ISTP90,ISTPBR,IERRCN,ISTOP=',4I3) - -C ADD FIRST OCCURRENCE FLAGS BY CHECKING THE SHORT-TERM HISTORY -C FILE - - CALL ADFSTF(IUNTHA,NOKAY,NBAD,MAXREC,MAXCKS,IECOST,NUMBAD,IEFAIL, - 1 DUMREC,OKAREC,BADREC) - -C WRITE THE RESULTS OF THE Q/C PROGRAM TO A LONG-TERM HISTORICAL -C FILE - - NRTOT=NOKAY+NBAD - CALL RITHIS(-IUNTHL,IEFAIL,NRTOT,IDATEZ,IUTCZ,NUMOKA,NOKAY,MAXREC, - 1 MAXCKS,HROFF,WINCUR,RUNID,LNDFIL,FILES,OKAREC,ZZZREC, - 2 XXXREC) - CALL RITHIS(IUNTHL,IEFAIL,NRTOT,IDATEZ,IUTCZ,NUMBAD,NBAD,MAXREC, - 1 MAXCKS,HROFF,WINCUR,RUNID,LNDFIL,FILES,BADREC,ZZZREC, - 2 ZZZREC) - -C UPDATE THE SHORT-TERM HISTORY FILES. -C **** IMPORTANT NOTE: ALL INFORMATION FROM TSTREC,OKAREC,BADREC, -C NUMTST,NUMOKA,NUMBAD WILL BE LOST **** -C **** PRENEZ GARDE **** - - IF(FILES) THEN - CALL RITSTH(IUNTHA,IUNTHO,IUNTOK,NOKAY,NBAD,DAYMIN,IECOST,MAXCKS, - 1 MAXREC,NUMBAD,IEFAIL,DUMREC,OKAREC,BADREC) - - CALL FNLCPY(IUNTVI,MAXUNT,IUNTOK,IUNTHA,MAXREC,NTBP,NUMTBP,IUNTIN, - 1 TBPREC,DUMREC) - NTEST=0 - NOKAY=0 - IUNTRD=IUNTOK - -C NOPE: SORRY, ONE LAST TIME, BUT ONLY FOR FILES=.FALSE. 
- - ELSE - NTEST=NOKAY - IUNTRD=IUNTHA - NUMTST(1:NOKAY)=NUMOKA(1:NOKAY) - TSTREC(1:NOKAY)=OKAREC(1:NOKAY) - NOKAY=0 - - ENDIF - -C WRITE THE FILE CONTAINING ALL CURRENT QUALITY CONTROLLED RECORDS - - CALL YTIME(IYR,DAYCUR+FIVMIN,IDATCU,JUTCCU) - CALL RITCUR(IUNTRD,IUNTCU,NTEST,NOKAY,NBAD,IDATCU,JUTCCU,DAYCUR, - 1 MAXREC,IEFAIL(1:MAXREC,4),NUMTST,NUMOKA,NUMBAD,FILES, - 2 LNDFIL,ZZZREC,NNNREC,DUMREC,SCRREC,TSTREC,OKAREC, - 3 BADREC) - -C CLEAN OUT THE SCRATCH FILE - - REWIND IUNTOK - END FILE IUNTOK - - 1000 CONTINUE - IF(FILES) CALL SLDTCK(IUNTDC) - - WRITE(6,1115) - 1115 FORMAT(////20X,'*******************************************' - 1 /20X,'*******************************************' - 2 /20X,'**** ****' - 3 /20X,'**** SUCCESSFUL COMPLETION OF ****' - 4 /20X,'**** SYNDAT_QCTROPCY ****' - 5 /20X,'**** ****' - 6 /20X,'*******************************************' - 7 /20X,'*******************************************') - - CALL W3TAGE('SYNDAT_QCTROPCY') - -ccccc IF(ISTOP .EQ. 0) THEN - STOP -ccccc ELSE IF(ISTOP .EQ. 1) THEN -ccccc call ERREXIT (1) -ccccc ELSE IF(ISTOP .EQ. 2) THEN -ccccc call ERREXIT (2) -ccccc ELSE IF(ISTOP .EQ. 3) THEN -ccccc call ERREXIT (3) -ccccc ELSE IF(ISTOP .EQ. 04) THEN -ccccc call ERREXIT (4) -ccccc ELSE IF(ISTOP .EQ. 05) THEN -ccccc call ERREXIT (5) -ccccc ELSE IF(ISTOP .EQ. 10) THEN -ccccc call ERREXIT (10) -ccccc ENDIF - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: RSMCCK CHECKS FOR MULTIPLE STORM REPORTS -C PRGMMR: S. LORD ORG: NP22 DATE: 1992-02-19 -C -C ABSTRACT: INPUT RECORDS ARE CHECKED FOR MULTIPLE REPORTS ON THE SAME -C STORM FROM DIFFERENT RSMC'S. 
THE FOLLOWING ACTIONS ARE -C TAKEN: -C 1) MULTIPLE STORM REPORTS BY DIFFERENT RSMC'S AT THE SAME -C TIME ARE REMOVED -C 2) TIME SERIES OF REPORTS ON THE SAME STORM BY DIFFERENT -C RSMC'S ARE DISCOVERED -C TO RECONCILE THE ABOVE: -C 1) A COMMON STORM ID IS ASSIGNED -C 2) MULTIPLE REPORTS ARE REMOVED IN FAVOR OF A SINGLE -C REPORT WITH THE COMMON STORM ID AND COMBINED -C (AVERAGED) PARAMETERS IF NECESSARY -C -C PROGRAM HISTORY LOG: -C 1992-02-19 S. LORD -C 1992-07-16 S. LORD FIXED SOME BUGS (390); ADDED RETURN CODE 2. -C 1993-03-09 S. LORD ADDED CODE FOR COMPATIBILITY WITH RCNCIL -C 2013-10-10 D. C. STOKES - ADDED NON-HYPHENATED CARDINAL NUMBER NAMES -C ALSO EXTENDED THAT LIST (FROM 36 TO 39). -C -C USAGE: CALL RSMCCK(IUNTHO,IUNTHA,IUNTAL,IUNTAN,IUNTOK,NVSBRS,IVSBRS, -C MAXOVR,NTEST,NOKAY,NBAD,NRCOVR,IFRSMC,NUMTST, -C NUMOKA,NUMBAD,IOVRLP,TSTREC,BADREC,OKAREC,OVRREC) -C INPUT ARGUMENT LIST: -C IUNTHO - UNIT NUMBER FOR SHORT-TERM HISTORY FILE OF ORIGINAL -C - RECORDS. -C IUNTHA - UNIT NUMBER FOR SHORT-TERM HISTORY FILE OF ALIASED -C - RECORDS. -C IUNTAL - UNIT NUMBER FOR ALIAS FILE. -C IUNTAN - UNIT NUMBER FOR NEW ALIAS FILE. -C IUNTOK - UNIT NUMBER FOR SCRATCH FILE. -C NVSBRS - NUMBER OF ALLOWABLE VARIABLES FOR SUBSTITUTION. -C IVSBRS - INDEX OF ALLOWABLE VARIABLES FOR SUBSTITUTION. -C MAXOVR - DIMENSION FOR SCRATCH SPACE. -C NTEST - NUMBER OF CURRENT RECORDS TO BE TESTED. -C NUMTST - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE TESTED. -C IOVRLP - SCRATCH ARRAY. -C TSTREC - CHARACTER ARRAY CONTAINING RECORDS TO BE TESTED. -C -C OUTPUT ARGUMENT LIST: -C NOKAY - NUMBER OF RECORDS THAT PASSED THE RSMC CHECK. -C NBAD - NUMBER OF RECORDS THAT FAILED THE RSMC CHECK. -C NRCOVR - NUMBER OF RECORDS RETURNED IN OVRREC. THESE CONTAIN -C - UPDATED ALIAS SHORT-TERM HISTORY RECORDS FOR USE WHEN -C - FILES=F. -C IFRSMC - INTEGER ARRAY CONTAINING ERROR CODE FOR EACH INPUT -C - RECORD. SEE COMMENTS IN PGM FOR KEY TO ERROR CODES.
-C NUMOKA - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH GOOD -C - RECORD. -C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C BADREC - CHARACTER ARRAY CONTAINING BAD RECORDS THAT FAILED -C - THE RSMC CHECK. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT PASSED -C - THE RSMC CHECK. -C OVRREC - CHARACTER ARRAY CONTAINING UPDATED ALIAS SHORT-TERM -C - HISTORY RECORDS. -C -C INPUT FILES: -C UNIT 20 - SCRATCH FILE CONTAINING SHORT-TERM HISTORY RECORDS -C UNIT 21 - ORIGINAL SHORT-TERM HISTORY FILE CONTAINING RECORDS -C PROCESSED BY THIS PROGRAM FOR THE LAST SEVERAL DAYS. -C IN THIS FILE, THE ORIGINAL RSMC AND STORM ID ARE KEPT. -C UNIT 22 - ALIAS SHORT-TERM HISTORY FILE CONTAINING RECORDS -C PROCESSED BY THIS PROGRAM FOR THE LAST SEVERAL DAYS. -C IN THIS FILE, THE RSMC AND STORM ID HAVE BEEN UNIFIED. -C UNIT 25 - ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C - FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C - DCB: LRECL=255, BLKSIZE=23400, RECFM=VB -C UNIT 26 - NEW ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C - FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 20 - SCRATCH FILE CONTAINING SHORT-TERM HISTORY RECORDS -C UNIT 25 - ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C - FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C - DCB: LRECL=255, BLKSIZE=23400, RECFM=VB -C UNIT 26 - NEW ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C - FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C - NOTE: UCL SHOULD COPY THIS FILE TO FT22F001 (THE OLD -C - ALIAS FILE) AT THE END OF EXECUTION. -C -C REMARKS: NONE. 
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE RSMCCK(IUNTHO,IUNTHA,IUNTAL,IUNTAN,IUNTCA,IUNTOK, - 1 NVSBRS,IVSBRS,MAXOVR,NTEST,NOKAY,NBAD,NRCOVR, - 2 IFRSMC,NUMTST,NUMOKA,NUMBAD,IOVRLP,TSTREC, - 3 BADREC,OKAREC,OVRREC) - - PARAMETER (NERCRS=10) - PARAMETER (MAXSTM=70) - PARAMETER (NOVRMX=MAXSTM) - PARAMETER (NADDMX=10) - PARAMETER (MAXREC=1000) - - SAVE - - CHARACTER*(*) TSTREC(0:NTEST),BADREC(MAXREC),OKAREC(NTEST), - 1 ERCRS(NERCRS)*60,OVRREC(MAXOVR) - CHARACTER*100 DUMY2K - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - PARAMETER (NBASIN=11) - PARAMETER (NRSMCX=4) - PARAMETER (NRSMCW=2) - PARAMETER (NCRDMX=57) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,RELOCZ*1,NAMVAR*5, - 2 IDBASN*1,NABASN*16,RSMCID*4,RSMCAP*1,CARDNM*9 - - DIMENSION IVTVAR(MAXVIT),VITVAR(MAXVIT),VITFAC(MAXVIT), - 1 ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION NAMVAR(MAXVIT+1),IDBASN(NBASIN),NABASN(NBASIN), - 1 BUFIN(MAXCHR),FMTVIT(MAXVIT), - 2 RSMCID(NRSMCX),RSMCAP(NRSMCX),RSMCPR(NBASIN), - 3 RSMCWT(NRSMCW),CARDNM(NCRDMX) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - EQUIVALENCE (VITVAR( 3),STMLTZ),(VITVAR( 4),STMLNZ), - 1 (VITVAR( 5),STMDRZ),(VITVAR( 6),STMSPZ), - 2 (VITVAR( 7),PCENZ), (VITVAR( 8),PENVZ), - 3 (VITVAR( 9),RMAXZ) - - CHARACTER STMNAM*9,STMID*3,RSMC*4 - - DIMENSION STMNAM(MAXSTM),STMLAT(MAXSTM),STMLON(MAXSTM), - 1 IDATE(MAXSTM),IUTC(MAXSTM),RMAX(MAXSTM),PENV(MAXSTM), - 2 PCEN(MAXSTM),RSMC(MAXSTM),STMID(MAXSTM) - - DIMENSION IFRSMC(MAXREC),NUMOKA(NTEST),NUMBAD(MAXREC), - 1 NUMTST(NTEST),IOVRLP(MAXOVR),IVSBRS(0:NVSBRS) - - DIMENSION IVTVRX(MAXVIT),VITVRX(MAXVIT) - - DIMENSION IPRIOR(NOVRMX),AVWT(NOVRMX),RSMCAL(NOVRMX), - 1 STIDAL(NOVRMX),STNMAD(NOVRMX),IRSMC(4),SRTDAY(NOVRMX), - 
2 IDASRT(NOVRMX),INDSAM(NOVRMX),DAYZAD(NADDMX), - 3 RSMCOV(NOVRMX),STIDOV(NOVRMX), - 4 RSMCAD(NADDMX),STIDAD(NADDMX) - - DIMENSION RINC(5) - - CHARACTER BUFCK(MAXCHR)*1,RSMCX*4,RELOCX*1,STMIDX*3,BUFINX*100, - 1 STMNMX*9,LATNSX*1,LONEWX*1,BSCOFL*2,RPCOFL*2,STNMAL*9, - 2 RSMCAL*4,STIDAL*3,STNMAD*9,RSMCOV*4,STIDOV*3,STNMOV*9, - 3 STIDAD*3,RSMCAD*4,STHCH*21 - - LOGICAL OSTHFL - - EQUIVALENCE (BUFCK(1),RSMCX),(BUFCK(5),RELOCX),(BUFCK(6),STMIDX), - 1 (BUFCK(1),BUFINX),(BUFCK(10),STMNMX), - 2 (BUFCK(35),LATNSX),(BUFCK(41),LONEWX) - - EQUIVALENCE (IVTVRX(1),IDATEX),(IVTVRX(2),IUTCX), - 1 (VITVRX(3),STMLTX),(VITVRX(4),STMLNX), - 2 (VITVRX(5),STMDRX),(VITVRX(6),STMSPX), - 3 (VITVRX(7),PCENX), (VITVRX(8),PENVX), - 4 (VITVRX(9),RMAXX) - - DATA VITFAC/2*1.0,2*0.1,1.0,0.1,9*1.0/, - 1 FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 2 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 3 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 4 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA IDBASN/'L','E','C','W','O','T','U','P','S','B','A'/ - - DATA NABASN/'ATLANTIC ','EAST PACIFIC ', - 1 'CENTRAL PACIFIC ','WEST PACIFIC ', - 2 'SOUTH CHINA SEA ','EAST CHINA SEA ', - 3 'AUSTRALIA ','SOUTH PACIFIC ', - 4 'SOUTH INDIAN OCN','BAY OF BENGAL ', - 5 'NRTH ARABIAN SEA'/ - - DATA RSMCID/'NHC ','JTWC','ADRM','JMA '/, - 1 RSMCAP/'N','W','A','J'/,RSMCPR/3*1,3*2,3,4*2/, - 2 RSMCWT/1.0,0.25/ - - DATA NAMVAR/'DATE ','TIME ','LAT. 
','LONG.','DIR ','SPEED', - 1 'PCEN ','PENV ','RMAX ','VMAX ','RMW ','R15NE', - 2 'R15SE','R15SW','R15NW','DEPTH'/ - -C CARDINAL NUMBER STORM NAMES FOR UNNAMED ATLANTIC AND EAST PACIFIC -C STORMS - - DATA CARDNM/'ONE ','TWO ','THREE ', - 1 'FOUR ','FIVE ','SIX ', - 2 'SEVEN ','EIGHT ','NINE ', - 3 'TEN ','ELEVEN ','TWELVE ', - 4 'THIRTEEN ','FOURTEEN ','FIFTEEN ', - 5 'SIXTEEN ','SEVENTEEN','EIGHTEEN ', - 6 'NINETEEN ','TWENTY ','TWENTY-ON', - 7 'TWENTY-TW','TWENTY-TH','TWENTY-FO', - 8 'TWENTY-FI','TWENTY-SI','TWENTY-SE', - 9 'TWENTY-EI','TWENTY-NI','THIRTY ', - O 'THIRTY-ON','THIRTY-TW','THIRTY-TH', - 1 'THIRTY-FO','THIRTY-FI','THIRTY-SI', - 2 'THIRTY-SE','THIRTY-EI','THIRTY-NI', - 3 'TWENTYONE','TWENTYTWO','TWENTYTHR', - 4 'TWENTYFOU','TWENTYFIV','TWENTYSIX', - 5 'TWENTYSEV','TWENTYEIG','TWENTYNIN', - 6 'THIRTYONE','THIRTYTWO','THIRTYTHR', - 7 'THIRTYFOU','THIRTYFIV','THIRTYSIX', - 8 'THIRTYSEV','THIRTYEIG','THIRTYNIN'/ - -C BUFZON: BUFFER ZONE REQUIRED BY SYNTHETIC DATA PROGRAM (SYNDATA) -C DEGLAT: ONE DEGREE LATITUDE IN KM -C RMAXMN: MINIMUM ALLOWABLE VALUE OF RMAX -C DTOVR : MINIMUM WINDOW (FRACTIONAL DAYS) FOR OVERLAPPING STORMS -C EXTRAPOLATED TO A COMMON TIME. -C IPRT : CONTROLS PRINTOUT IN SUBROUTINE BASNCK -C FACSPD: CONVERSION FACTOR FOR R(DEG LAT)=V(M/S)*T(FRAC DAY)* -C FACSPD - - DATA BUFZON/1.0/,DEGLAT/111.1775/,RMAXMN/100./,DTOVR/1.0/, - 1 IPRT/0/,FIVMIN/3.4722E-3/,FACSPD/0.77719/ - - DATA ERCRS - 1 /' 1: CANNOT RESOLVE: SAME RSMC REPORTED OVERLAPPING STORMS ', - 2 '10: RESOLVED: SAME RSMC REPORTED OVERLAPPING STORMS ', - 3 ' 2: CANNOT RESOLVE: DIFF. RSMCS REPORTED DIFF. OVERL.
STMS.', - 4 '21: DIFFERENT RSMCS REPORTED SAME OVERLAPPING STORMS (CUR) ', - 5 '22: DIFFERENT RSMCS REPORTED SAME OVERLAPPING STORMS (OSTH)', - 6 '30: UNIFIED RECORD CREATED FOR SINGLY OBSERVED STORM ', - 7 ' 3: STORM IS NOT IN A BASIN DEFINED BY BASNCK ', - 8 ' 4: RSMC IS NOT AMONG LISTED CENTERS (NO ERROR RECOVERY) ', - 9 ' 5: DIFFERENT RSMCS REPORTED DIFFERENT OVERLAPPING STORMS ', - O ' 6: SINGLE RSMC HAS TWO STORM IDS FOR THE SAME STORM '/ - -C ERROR CODES FOR BAD RECORDS RETURNED IN IFRSMC ARE AS FOLLOWS: -C 1: CANNOT RESOLVE: SAME RSMC REPORTED OVERLAPPING STORMS -C 10: RESOLVED: SAME RSMC REPORTED OVERLAPPING STORMS -C 2: CANNOT RESOLVE: DIFF. RSMCS REPORTED DIFF. OVERL. STMS. -C 21: DIFFERENT RSMCS REPORTED SAME OVERLAPPING STORMS (CUR) -C 22: DIFFERENT RSMCS REPORTED SAME OVERLAPPING STORMS (OSTH) -C 30: UNIFIED RECORD CREATED FOR SINGLY OBSERVED STORM -C 3: STORM IS NOT IN A BASIN DEFINED BY BASNCK -C 4: RSMC IS NOT AMONG LISTED CENTERS (NO ERROR RECOVERY) -C 5: TWO DIFFERENT RSMCS REPORT DIFFERENT OVERLAPPING STORMS -C 6: SINGLE RSMC HAS TWO STORM IDS FOR THE SAME STORM - - WRITE(6,1) NTEST,NOKAY,NBAD - 1 FORMAT(//'...ENTERING RSMCCK, LOOKING FOR MULTIPLE STORM ', - 1 'REPORTS. NTEST,NOKAY,NBAD=',3I5/) - - CALL WRNING('RSMCCK') - WRITE(6,3) NVSBRS,(NAMVAR(IVSBRS(NV)),NV=1,NVSBRS) - 3 FORMAT(/'...NUMBER OF ALLOWABLE VARIABLES FOR SUBSTITUTION ', - 1 'IS:',I3,' VARIABLES ARE:'/4X,10(A,1X)) - - NADD=0 - NSUBR=0 - NUNIFY=0 - NALADD=0 - REWIND IUNTAN - OVRREC(1:NTEST)=' ' - IOVRLP(1:NTEST)=0 - IFRSMC(NUMTST(1:NTEST))=0 - -C FOR COMPLETE COTEMPORANEOUS CHECKS, WE MUST MAKE AVAILABLE THE -C ORIGINAL SHORT-TERM HISTORY RECORDS. WE STORE THEM AT THE END -C OF THE OVRREC ARRAY. 
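The read loop that follows converts any old non-Y2K-compliant tcvitals record (2-digit year starting in column 20) to the 4-digit form via the "windowing" technique: two-digit years greater than 20 are assigned to the 1900s, the rest to the 2000s, and the century is spliced in ahead of the original digits. A minimal Python sketch of that DUMY2K logic, assuming the same 1-based column layout as the Fortran above (the helper name `window_year` and 0-based slicing are illustrative, not part of the original program):

```python
def window_year(rec: str) -> str:
    """Convert a 2-digit-year record to 4-digit form by windowing.

    Mirrors the Fortran: IF(REC(20:21).GT.'20') century='19' ELSE '20',
    then DUMY2K(22:) = REC(20:), i.e. the century is inserted before
    the original two digits. Python indices are 0-based, so Fortran
    column 20 is rec[19].
    """
    century = "19" if rec[19:21] > "20" else "20"
    return rec[:19] + century + rec[19:]
```

For example, a record carrying `98` in columns 20-21 becomes `1998`, while `05` becomes `2005`; the record grows by two characters, matching the shift of everything from column 20 onward into columns 22 and beyond.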
- - REWIND IUNTHO - NRECHO=0 - WRITE(6,13) IUNTHO - 13 FORMAT(/'...READING FROM ORIGINAL SHORT-TERM HISTORY FILE ', - 1 '(UNIT',I3,') INTO SCRATCH SPACE: RECORD #, STORAGE ', - 2 'INDEX, RECORD=') - - 20 CONTINUE - - READ(IUNTHO,21,END=25) OVRREC(MAXOVR-NRECHO) - 21 FORMAT(A) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - if(OVRREC(MAXOVR-NRECHO)(35:35).eq.'N' .or. - 1 OVRREC(MAXOVR-NRECHO)(35:35).eq.'S') then - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C ... THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',OVRREC(MAXOVR-NRECHO)(20:21),'"' - PRINT *, ' ' - PRINT *, 'From unit ',iuntho,'; OVRREC(MAXOVR-NRECHO)-2: ', - $ OVRREC(MAXOVR-NRECHO) - PRINT *, ' ' - DUMY2K(1:19) = OVRREC(MAXOVR-NRECHO)(1:19) - IF(OVRREC(MAXOVR-NRECHO)(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = OVRREC(MAXOVR-NRECHO)(20:100) - OVRREC(MAXOVR-NRECHO) = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ OVRREC(MAXOVR-NRECHO)(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT *, 'From unit ',iuntho,'; OVRREC(MAXOVR-NRECHO)-2: ', - $ OVRREC(MAXOVR-NRECHO) - PRINT *, ' ' - - ELSE IF(OVRREC(MAXOVR-NRECHO)(37:37).eq.'N' .OR. - 1 OVRREC(MAXOVR-NRECHO)(37:37).eq.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... 
NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT '(a,a,a)', '==> Read in RECORD from tcvitals file -- ', - $ ' contains a 4-digit year "',OVRREC(MAXOVR-NRECHO)(20:23),'"' - PRINT *, ' ' - PRINT '(a,i2,a,a)', - $ 'From unit ',iuntho,'; OVRREC(MAXOVR-NRECHO)-2: ', - $ OVRREC(MAXOVR-NRECHO) - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 20 - - END IF - - WRITE(6,23) NTEST+NRECHO+1,MAXOVR-NRECHO,OVRREC(MAXOVR-NRECHO) - 23 FORMAT(' ...',I4,'...',I4,'...',A) - NRECHO=NRECHO+1 - - IF(NRECHO .GE. MAXOVR-NTEST) THEN - WRITE(6,24) NRECHO,MAXOVR,NTEST - 24 FORMAT(/'******INSUFFICIENT SCRATCH SPACE TO STORE ORIGINAL ', - 1 'SHORT-TERM HISTORICAL RECORDS IN OVRREC. NRECHO,', - 2 'MAXOVR,NTEST=',3I3) - CALL ABORT1(' RSMCCK',24) - ENDIF - - GO TO 20 - 25 CONTINUE - WRITE(6,26) NRECHO - 26 FORMAT(' ...',I3,' RECORDS READ FROM ORIGINAL SHORT-TERM ', - 1 'HISTORY FILE.') - -C PART I: -C CHECK COTEMPORANEOUS RECORDS FOR STORMS WITHIN EACH OTHER'S RMAX - - WRITE(6,27) - 27 FORMAT(//'...BEGINNING RSMCCK PART I: COTEMPORANEOUS CHECKS FOR ', - 1 'OVERLAPPING STORMS.') - - DO NREC=1,NTEST - - IETYP=0 - IEROVR=0 - NOVRLP=1 - NRECSV=NREC - -C RECORDS THAT WERE PROCESSED AS COTEMPORANEOUS OVERLAPS PREVIOUSLY -C DO NOT GET FURTHER PROCESSING - - IF(IFRSMC(NUMTST(NREC)) .NE. 0) GO TO 400 - -C RECOVER DATE, UTC, LAT/LON AND RMAX - - BUFINZ=TSTREC(NREC) - - DO IV=1,MAX(9,IVSBRS(NVSBRS)) - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NREC)) - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BUFINZ) - ENDDO - - VITVAR(3:MAX(9,IVSBRS(NVSBRS)))= - $ REAL(IVTVAR(3:MAX(9,IVSBRS(NVSBRS))))* - $ VITFAC(3:MAX(9,IVSBRS(NVSBRS))) - IF(LATNS .EQ. 'S') STMLTZ=-STMLTZ - IF(LONEW .EQ. 
'W') STMLNZ=360.-STMLNZ - -C STORE NEEDED VARIABLES FOR LATER REFERENCE - - STMNAM(1)=STMNMZ - STMID (1)=STMIDZ - RSMC (1)=RSMCZ - STMLAT(1)=STMLTZ - STMLON(1)=STMLNZ - RMAX (1)=RMAXZ - PCEN (1)=PCENZ - PENV (1)=PENVZ - IOVRLP(1)=NREC - OVRREC(1)=BUFINZ - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - - IF(RMAXZ .LT. 0.0) THEN - DO NBA=1,NBASIN - IF(STMIDZ(3:3) .EQ. IDBASN(NBA)) THEN - IBASN=NBA - GO TO 46 - ENDIF - ENDDO - 46 CONTINUE - RMAXZ=TCCLIM(9,IBASN) - WRITE(6,47) NREC,RMAXZ,NABASN(IBASN) - 47 FORMAT(' ###RMAXZ MISSING FOR COTEMPORANEOUS CHECK ON RECORD',I3, - 1 '.'/4X,'REPLACEMENT VALUE WILL BE A CLIMATOLOGICAL ', - 2 'GUESS OF ',F6.1,' KM FOR BASIN ',A,'.') - ENDIF - -C NOW COMPARE WITH ALL REMAINING STORM REPORTS THAT HAVE NOT BEEN -C MARKED OFF AS ERRONEOUS - - NRECHZ=-1 - DO NTST=NREC+1,NTEST+NRECHO - - IF(NTST .LE. NTEST .AND. IFRSMC(NUMTST(NTST)) .NE. 0) GO TO 100 - - IF(NTST .LE. NTEST) THEN - INDTST=NTST - BUFINX=TSTREC(NTST) - OSTHFL=.FALSE. - ELSE - NRECHZ=NRECHZ+1 - INDTST=MAXOVR-NRECHZ - BUFINX=OVRREC(INDTST) - OSTHFL=.TRUE. - ENDIF - - DO IV=1,MAX(9,IVSBRS(NVSBRS)) - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV), - 1 BUFINX) - ENDDO - - VITVRX(3:MAX(9,IVSBRS(NVSBRS)))= - $ REAL(IVTVRX(3:MAX(9,IVSBRS(NVSBRS))))* - $ VITFAC(3:MAX(9,IVSBRS(NVSBRS))) - - IF(LATNSX .EQ. 'S') STMLTX=-STMLTX - IF(LONEWX .EQ. 'W') STMLNX=360.-STMLNX - -C COTEMPORANEOUS CHECK - - IF(IDATEX .EQ. IDATEZ .AND. IUTCX .EQ. IUTCZ) THEN - - RMAXSV=RMAXX - IF(RMAXX .LT. 0.0) THEN - DO NBA=1,NBASIN - IF(STMIDX(3:3) .EQ. 
IDBASN(NBA)) THEN - IBASN=NBA - GO TO 66 - ENDIF - ENDDO - 66 CONTINUE - RMAXX=TCCLIM(9,IBASN) - WRITE(6,75) NTST,RMAXX,NABASN(IBASN) - 75 FORMAT(' ###RMAXX MISSING FOR COTEMPORANEOUS CHECK ON RECORD',I3, - 1 '.'/4X,'REPLACEMENT VALUE WILL BE A CLIMATOLOGICAL ', - 2 'GUESS OF ',F6.1,' KM FOR BASIN ',A,'.') - ENDIF - - DISTZ=DISTSP(STMLTZ,STMLNZ,STMLTX,STMLNX)*1.E-3 - -C OVERLAP CHECK. BUFFER ZONE CORRESPONDS TO SYNDATA CONDITION - - IF(DISTZ .LE. RMAXZ+RMAXX+BUFZON*DEGLAT) THEN - -C IF THE MATCHING RECORD IS FROM THE SAME RSMC AND THE STORM -C ID IS THE SAME AND THE RECORD WAS IN THE ORIGINAL SHORT-TERM -C HISTORY FILE, WE ASSUME THE RECORD (NREC) IS AN UPDATE TO THE -C EARLIER RECORD. THE ERROR FLAG IS RESET TO INDICATE NO ERROR. - - IF(RSMCZ .EQ. RSMCX .AND. - 1 STMIDZ .EQ. STMIDX .AND. OSTHFL) THEN - WRITE(6,76) NREC,INDTST,NREC,BUFINZ,INDTST,BUFINX - 76 FORMAT(/'###RECORD IN ORIGINAL SHORT-TERM HISTORY FILE HAS ', - 1 'PROBABLY BEEN UPDATED . NREC,INDTST=',2I4, - 2 '. RECORDS ARE:'/2(4X,'...',I4,'...',A/)) - GO TO 100 - - ELSE - -C STORE NEEDED VARIABLES FOR LATER REFERENCE. DON'T USE THE -C CLIMATOLOGICAL VALUE! - - NOVRLP=NOVRLP+1 - IOVRLP(NOVRLP)=NTST - OVRREC(NOVRLP)=BUFINX - STMNAM(NOVRLP)=STMNMX - STMID (NOVRLP)=STMIDX - RSMC (NOVRLP)=RSMCX - STMLAT(NOVRLP)=STMLTX - STMLON(NOVRLP)=STMLNX - RMAX (NOVRLP)=RMAXSV - PCEN (NOVRLP)=PCENX - PENV (NOVRLP)=PENVX - - WRITE(6,77) DISTZ,NREC,NTST,INDTST,BUFINZ,BUFINX - 77 FORMAT(//'...TWO STORMS REPORTED AT THE SAME DATE/TIME WITHIN ', - 1 'THE OTHERS CIRCULATION. DISTZ,NREC,NTST,INDTST=',F7.1,2 - 2 I4,I5/2(4X,'...',A,'...'/)) - -C SAME OR DIFFERENT RSMC? - - IF(RSMCZ .EQ. RSMCX) THEN - IETYP=1 - ELSE - IETYP=2 - ENDIF - - IF(NOVRLP .EQ. 2) THEN - IEROVR=IETYP - - ELSE - IF(IETYP .NE. 
IEROVR) THEN - IOVRLP(NOVRLP)=-IABS(IOVRLP(NOVRLP)) - WRITE(6,71) NREC,NTST - 71 FORMAT(' ###WARNING: MULTIPLE OVERLAP TYPES FOR NREC=',I3/4X, - 1 'ERROR RECOVERY CURRENTLY WORKS ON A SINGLE OVERLAP TYPE ', - 2 'SO THIS RECORD=#',I3,' WILL BE AUTOMATICALLY DISCARDED.') - ENDIF - ENDIF - - ENDIF - ENDIF - ENDIF - 100 CONTINUE - ENDDO - IF(IETYP .EQ. 0) GO TO 390 - -C ERROR RECOVERY FOR PART I: - - WRITE(6,103) NREC,IEROVR,NOVRLP-1,(IOVRLP(NOVR),NOVR=2,NOVRLP) - 103 FORMAT(' ...SUMMARY OF OVERLAPS FOR NREC=',I3,'. OVERLAP ', - 1 'TYPE=',I3,' AND NUMBER OF OVERLAPS=',I3, - 2 ' OVERLAP INDICES ARE:'/4X,'(NEGATIVE OVERLAP ', - 3 'INDICES MEAN THAT THE OVERLAP TYPE DIFFERS FROM ', - 4 'THE PRIMARY ONE WHICH IS IEROVR)'/4X,10I3) - -C **************************************************** -C **************************************************** -C **** **** -C **** MULTIPLE REPORTS BY THE SAME INSTITUTION **** -C **** **** -C **************************************************** -C **************************************************** - - IF(IEROVR .EQ. 1) THEN - IVR=9 - WRITE(6,107) IETYP - 107 FORMAT(' ******STORMS ARE REPORTED BY THE SAME RSMC, WHICH ', - 1 'IS A LOGICAL ERROR. IETYP=',I2/4X,'WE PROCEED TO ', - 2 'RECOVER THIS ERROR BY REDUCING THE RMAX OF THE LARGEST ', - 3 'STORM SO THAT OVERLAP WILL NOT OCCUR.') - - IF(NOVRLP .GT. 2) WRITE(6,109) - 109 FORMAT(' ###WARNING, NOVRLP > 2 SO THAT PROCESSING WILL ', - 1 'OCCUR FOR ONLY THE LARGEST AND SMALLEST STORMS. ', - 2 'OTHERS WILL BE AUTOMATICALLY MARKED ERRONEOUS.') - -C PICK OUT THE LARGEST AND SMALLEST STORMS - - INDXZ=1 - INDXX=1 - RMAXZ=RMAX(1) - RMAXX=RMAX(1) - DO NOVR=2,NOVRLP - IF(IOVRLP(NOVR) .GT. 0) THEN - IF(RMAX(NOVR) .GT. RMAXZ) THEN - RMAXZ=RMAX(NOVR) - INDXZ=NOVR - ENDIF - IF(RMAX(NOVR) .LT. 
RMAXX) THEN - RMAXX=RMAX(NOVR) - INDXX=NOVR - ENDIF - ENDIF - ENDDO - - DISTZX=DISTSP(STMLAT(INDXZ),STMLON(INDXZ), - 1 STMLAT(INDXX),STMLON(INDXX))*1.E-3 - EXCESS=RMAXZ+RMAXX+BUFZON*DEGLAT-DISTZX - WRITE(6,121) INDXZ,INDXX,STMID(INDXZ),RMAXZ,STMID(INDXX),RMAXX, - 1 DISTZX,EXCESS - 121 FORMAT('...INDXZ,INDXX,STMID(INDXZ),RMAX(INDXZ),STMID(INDXX),', - 1 'RMAX(INDXX)=',2I3,2(1X,A,F7.1),' DISTZX,EXCESS=',2F9.1) - RMAXZT=RMAXZ-EXCESS - -C RECOVERY METHOD 1: SUBTRACT EXCESS FROM LARGEST RMAX BUT MAINTAIN -C RELATIVE SIZE - - IF(RMAXZT .GT. RMAXX) THEN - WRITE(OVRREC(INDXZ)(ISTVAR(IVR):IENVAR(IVR)),FMTVIT(IVR)) - 1 NINT(RMAXZT) - OVRREC(INDXZ)(ISTVAR(IVR)-1:ISTVAR(IVR)-1)='O' - OVRREC(INDXX)=TSTREC(IOVRLP(INDXX)) - WRITE(6,123) IOVRLP(INDXZ),RMAXZ,RMAXZT,INDXZ,OVRREC(INDXZ) - 123 FORMAT(' ###IMPORTANT NOTE: FOR RECORD',I3,' RMAXZ=',F7.1, - 1 ' WILL BE SUBSTITUTED BY RMAXZT=',F7.1,' FOR INDXZ=',I3, - 2 '. AFTER SUBSTITUTION, OVRREC='/4X,A) - IETYP=-10 - -C RECOVERY METHOD 2: SUBTRACT HALF THE EXCESS FROM EACH RMAX - - ELSE - WRITE(6,125) - 125 FORMAT('...UNABLE TO MAINTAIN RMAXZ>RMAXX. HALF THE ', - 1 'EXCESS WILL BE SUBTRACTED FROM EACH REPORT.') - RMAXZT=RMAXZ-0.5*EXCESS - RMAXXT=RMAXX-0.5*EXCESS - IF(RMAXZT .GE. RMAXMN .AND. RMAXXT .GE. RMAXMN) THEN - WRITE(OVRREC(INDXZ)(ISTVAR(IVR):IENVAR(IVR)),FMTVIT(IVR)) - 1 NINT(RMAXZT) - WRITE(OVRREC(INDXX)(ISTVAR(IVR):IENVAR(IVR)),FMTVIT(IVR)) - 1 NINT(RMAXXT) - OVRREC(INDXX)(ISTVAR(IVR)-1:ISTVAR(IVR)-1)='O' - WRITE(6,123) IOVRLP(INDXZ),RMAXZ,RMAXZT,INDXZ,OVRREC(INDXZ) - WRITE(6,127) IOVRLP(INDXX),RMAXX,RMAXXT,IOVRLP(INDXX), - 1 OVRREC(INDXX) - 127 FORMAT(' ###IMPORTANT NOTE: FOR RECORD',I3,' RMAXX=',F7.1, - 1 ' WILL BE SUBSTITUTED BY RMAXXT=',F7.1,' FOR INDXX=',I3, - 2 '. AFTER SUBSTITUTION, OVRREC='/4X,A) - IETYP=-10 - - ELSE - WRITE(6,129) RMAXZT,RMAXXT,RMAXMN - 129 FORMAT(' ******RMAXZ AND RMAXX REDUCTION METHODS HAVE FAILED. 
',
-     1 'RMAXZT,RMAXXT=',2F7.1,' < RMAXMN=',F7.1)
-      ENDIF
-      ENDIF
-
-      DO NOVR=1,NOVRLP
-
-C ASSIGN ERROR FLAGS AND UPDATE RECORDS FOR THE TWO RECORDS
-C THAT WE TRIED TO CORRECT
-
-      IF(NOVR .EQ. INDXZ .OR. NOVR .EQ. INDXX) THEN
-      IFRSMC(NUMTST(IOVRLP(NOVR)))=IETYP
-      IF(IETYP .GT. 0) THEN
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(IOVRLP(NOVR))
-      BADREC(NADD+NBAD)=TSTREC(IOVRLP(NOVR))
-      ELSE
-      NOKAY=NOKAY+1
-      NUMOKA(NOKAY)=NUMTST(IOVRLP(NOVR))
-      OKAREC(NOKAY)=OVRREC(NOVR)
-      ENDIF
-
-C ASSIGN ERROR FLAGS TO ALL OTHER RECORDS
-
-      ELSE
-      IFRSMC(NUMTST(IOVRLP(NOVR)))=IETYP
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(IOVRLP(NOVR))
-      BADREC(NADD+NBAD)=TSTREC(IOVRLP(NOVR))
-      ENDIF
-      ENDDO
-      GO TO 400
-
-C     ***************************************************
-C     ***************************************************
-C     ****                                           ****
-C     ****  MULTIPLE REPORTS BY TWO DIFFERENT RSMCS  ****
-C     ****                                           ****
-C     ***************************************************
-C     ***************************************************
-
-      ELSE IF(IEROVR .EQ. 2) THEN
-      WRITE(6,201) IETYP
-  201 FORMAT('...STORMS ARE REPORTED BY DIFFERENT RSMCS. ',
-     1 'WE PROCEED TO SEE IF THEY ARE THE SAME STORM BY ',
-     2 'COMPARING NAMES.'/4X,'THEN WE CONSTRUCT A COMMON ',
-     3 'STORM ID. PRELIMINARY IETYP=',I2)
-
-      BUFINZ=OVRREC(1)
-
-      NERROR=0
-      DO NOVR=2,NOVRLP
-      IF(STMNAM(NOVR) .EQ. 'NAMELESS' .AND.
-     1 STMNMZ .EQ. 'NAMELESS') THEN
-      WRITE(6,202) STMIDZ,RSMCZ,STMID(NOVR),RSMC(NOVR)
-  202 FORMAT(' ###OVERLAPPING NAMELESS STORMS HAVE IDS AND RSMCS=',
-     1 2(2(A,1X),2X))
-
-      ELSE IF(STMNAM(NOVR) .EQ. STMNMZ) THEN
-      WRITE(6,203) STMNAM(NOVR),NOVR
-  203 FORMAT('...STORM NAME=',A,' FOR NOVR=',I3,' MATCHES FIRST ',
-     1 'REPORT. THE STORMS ARE THE SAME.')
-
-      ELSE
-
-C IF ONE RSMC REPORTS A NAMELESS STORM AND THE OTHER RSMCS REPORT
-C A NAME, TRANSFER THE STORM NAME TO THE NAMELESS RECORD.
-
-      IF(STMNMZ .EQ. 'NAMELESS') THEN
-      WRITE(6,205) STMNAM(NOVR),NOVR
-  205 FORMAT('...STMNMZ IS NAMELESS. COPYING STMNAM(NOVR)=',A,' TO ',
-     1 'STMNMZ. NOVR=',I3)
-      STMNAM(1)=STMNAM(NOVR)
-      STMNMZ=STMNAM(NOVR)
-      OVRREC(1)=BUFINZ
-
-      IF(IOVRLP(1) .LE. NTEST) TSTREC(IOVRLP(1))=BUFINZ
-
-      ELSE IF(STMNAM(NOVR) .EQ. 'NAMELESS') THEN
-      WRITE(6,207) STMNMZ,NOVR
-  207 FORMAT('...STMNAM(NOVR) IS NAMELESS. COPYING STMNMZ=',A,' TO ',
-     1 'STMNAM(NOVR). NOVR=',I3)
-      STMNAM(NOVR)=STMNMZ
-      BUFINX=OVRREC(NOVR)
-      STMNMX=STMNMZ
-      OVRREC(NOVR)=BUFINX
-
-      IF(IOVRLP(NOVR) .LE. NTEST) TSTREC(IOVRLP(NOVR))=BUFINX
-
-C THERE ARE TWO NAMES, NEITHER OF WHICH IS NAMELESS. THUS THERE IS
-C AN UNTREATABLE ERROR
-
-      ELSE
-      IETYP=5
-      NERROR=NERROR+1
-      IOVRLP(NOVR)=-IABS(IOVRLP(NOVR))
-      WRITE(6,209) NOVR,STMNAM(NOVR),STMNMZ,IETYP
-  209 FORMAT(/'******FOR NOVR=',I3,' STORM NAME=',A,' DOES NOT MATCH ',
-     1 'NAME FOR THE FIRST REPORT=',A,'.'/4X,' THERE IS NO ',
-     2 'ERROR RECOVERY AT THIS TIME. IETYP=',I3)
-
-C ERROR MARKING OFF ON THE FLY HERE
-
-      IFRSMC(NUMTST(IABS(IOVRLP(NOVR))))=IETYP
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(IABS(IOVRLP(NOVR)))
-      BADREC(NADD+NBAD)=TSTREC(IABS(IOVRLP(NOVR)))
-      IETYP=IEROVR
-      ENDIF
-      ENDIF
-      ENDDO
-
-C IF AN ERROR HAS OCCURRED IN THE PREVIOUS PROCESSING REMOVE
-C THE ERRONEOUS RECORD FROM THE OVERLAP LIST AND CONTINUE
-
-      IF(NERROR .NE. 0) THEN
-      NOVRZ=0
-      WRITE(6,213) NERROR
-  213 FORMAT(' ******',I3,' ERRORS FOUND DURING STORM NAME MATCHING.')
-      DO NOVR=1,NOVRLP
-      IF(IOVRLP(NOVR) .GE. 0 .AND. IOVRLP(NOVR) .LE. NTEST) THEN
-      NOVRZ=NOVRZ+1
-      IOVRLP(NOVRZ)=IOVRLP(NOVR)
-      OVRREC(NOVRZ)=OVRREC(NOVR)
-      STMNAM(NOVRZ)=STMNAM(NOVR)
-      STMID (NOVRZ)=STMID(NOVR)
-      RSMC  (NOVRZ)=RSMC(NOVR)
-      STMLAT(NOVRZ)=STMLAT(NOVR)
-      STMLON(NOVRZ)=STMLON(NOVR)
-      RMAX  (NOVRZ)=RMAX(NOVR)
-      PCEN  (NOVRZ)=PCEN(NOVR)
-      PENV  (NOVRZ)=PENV(NOVR)
-      ENDIF
-      ENDDO
-      NOVRLP=NOVRZ
-      IF(NOVRLP .EQ. 1) GO TO 390
-      ENDIF
-
-      WRITE(6,221)
-  221 FORMAT(' ...THE OBSERVING RSMCS, THEIR ABBREVIATIONS, ',
-     1 'PRIORITIES, INDICES AND REPORTED BASINS ARE:'/11X,
-     2 'RSMC',3X,'RSMCAP',3X,'PRIORITY',3X,'INDEX',3X,'BASIN',3X,
-     3 'BSCOFL',3X,'RPCOFL')
-
-      NERROR=0
-      DO NOVR=1,NOVRLP
-
-C WHICH BASIN ARE WE IN?
-
-      CALL BASNCK(STMID(NOVR),STMLAT(NOVR),STMLON(NOVR),NBA,IPRT,IER)
-      IF(IER .EQ. 11) THEN
-      BSCOFL='IB'
-      ELSE
-      BSCOFL='CB'
-      ENDIF
-
-      IF(IER .EQ. 3) THEN
-      IETYP=IER
-      NERROR=NERROR+1
-      IOVRLP(NOVR)=-IABS(IOVRLP(NOVR))
-
-C AGAIN, ERROR MARKING OFF ON THE FLY
-
-      IFRSMC(NUMTST(IABS(IOVRLP(NOVR))))=IETYP
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(IABS(IOVRLP(NOVR)))
-      BADREC(NADD+NBAD)=TSTREC(IABS(IOVRLP(NOVR)))
-      IETYP=IEROVR
-      ENDIF
-
-      IF(NOVR .EQ. 1) THEN
-      NBASV=NBA
-      RPCOFL='CR'
-      ELSE
-      IF(NBA .NE. NBASV) THEN
-      RPCOFL='IR'
-      NBA=NBASV
-      ENDIF
-      ENDIF
-
-C IS THIS A REPORT BY THE PRIORITY RSMC FOR THIS BASIN? THE
-C PRIORITY FLAG IS TWO DIGITS. THE FIRST DIGIT IS PRIORITY
-C (=1 IF THE RSMC IS THE PRIORITY RSMC, =2 OTHERWISE). THE
-C SECOND DIGIT IS THE RSMC INDEX
-
-      NRSPRI=RSMCPR(NBA)
-      NRSMC=-1
-      DO NRSZ=1,NRSMCX
-      IF(RSMCID(NRSZ) .EQ. RSMC(NOVR)) THEN
-      NRSMC=NRSZ
-      IF(NRSMC .EQ. NRSPRI) THEN
-      IPRIOR(NOVR)=10+NRSMC
-      AVWT(NOVR)=RSMCWT(1)
-      BUFINZ=OVRREC(NOVR)
-      ELSE
-      IPRIOR(NOVR)=20+NRSMC
-      AVWT(NOVR)=RSMCWT(2)
-      ENDIF
-      GO TO 231
-      ENDIF
-      ENDDO
-  231 CONTINUE
-
-      IF(NRSMC .GE. 0) THEN
-      WRITE(6,233) NOVR,RSMC(NOVR),RSMCAP(NRSMC),IPRIOR(NOVR),NRSMC,
-     1 NBA,BSCOFL,RPCOFL
-  233 FORMAT(' ',5X,I3,2X,A,6X,A,8X,I2,5X,I4,5X,I3,2(7X,A))
-
-      ELSE
-      IETYP=4
-      NERROR=NERROR+1
-      IOVRLP(NOVR)=-IABS(IOVRLP(NOVR))
-      WRITE(6,235) RSMC(NOVR),NOVR,IETYP
-  235 FORMAT('0******RSMC=',A,' COULD NOT BE FOUND IN RSMCCK. THIS ',
-     1 'RECORD IS ERRONEOUS. NOVR=',I3,', IETYP=',I3)
-
-C AGAIN, ERROR MARKING OFF ON THE FLY
-
-      IFRSMC(NUMTST(IABS(IOVRLP(NOVR))))=IETYP
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(IABS(IOVRLP(NOVR)))
-      BADREC(NADD+NBAD)=TSTREC(IABS(IOVRLP(NOVR)))
-      ENDIF
-
-      ENDDO
-
-C IF AN ERROR HAS OCCURRED IN THE PREVIOUS PROCESSING REMOVE
-C THE ERRONEOUS RECORD FROM THE OVERLAP LIST AND CONTINUE
-
-      IF(NERROR .NE. 0) THEN
-      WRITE(6,243) NERROR
-  243 FORMAT(' ******',I3,' ERRORS FOUND DURING RSMC VERIFICATION.')
-      NOVRZ=0
-      DO NOVR=1,NOVRLP
-      IF(IOVRLP(NOVR) .GE. 0 .AND. IOVRLP(NOVR) .LE. NTEST) THEN
-      NOVRZ=NOVRZ+1
-      IOVRLP(NOVRZ)=IOVRLP(NOVR)
-      IPRIOR(NOVRZ)=IPRIOR(NOVR)
-      OVRREC(NOVRZ)=OVRREC(NOVR)
-      STMNAM(NOVRZ)=STMNAM(NOVR)
-      STMID (NOVRZ)=STMID(NOVR)
-      RSMC  (NOVRZ)=RSMC(NOVR)
-      STMLAT(NOVRZ)=STMLAT(NOVR)
-      STMLON(NOVRZ)=STMLON(NOVR)
-      RMAX  (NOVRZ)=RMAX(NOVR)
-      PCEN  (NOVRZ)=PCEN(NOVR)
-      PENV  (NOVRZ)=PENV(NOVR)
-      AVWT  (NOVRZ)=AVWT(NOVR)
-      ENDIF
-      ENDDO
-      NOVRLP=NOVRZ
-      IF(NOVRLP .EQ. 1) GO TO 390
-      ENDIF
-
-      WRITE(6,251) NOVRLP
-  251 FORMAT(6X,'KEY: BSCOFL=IB IF REPORTED LAT/LON AND BASIN ',
-     1 'ID FROM STORM ID ARE INCONSISTENT.'/18X,'=CB IF ',
-     2 'LAT/LON AND BASIN ID ARE CONSISTENT.'/12X,'RPCOFL=',
-     3 'CR IF REPORTED BASIN IS THE SAME AS THE FIRST RECORD.'
-     4 /18X,'=IR IF REPORTED BASIN IS DIFFERENT FROM THE FIRST ',
-     5 'RECORD.'/4X,I3,' OVERLAPPING STORMS HAVE BEEN FOUND.')
-
-C CHECK THE ALIAS FILE FOR REPORTS UNDER OTHER NAMES
-
-      DO NOVR=1,NOVRLP
-      NALIAS=0
-      NALREC=0
-      REWIND IUNTAL
-      WRITE(6,257) STMNAM(NOVR),STMID(NOVR)
-  257 FORMAT(/'...CHECKING THE ALIAS FILE TRYING TO FIND STORM NAME ',
-     1 'ID AND RSMC THAT MATCH',3(1X,A))
-
-  260 READ(IUNTAL,261,END=300) NALMX,STMNMX,(RSMCAL(NAL),STIDAL(NAL),
-     1 NAL=1,MIN(NALMX,NOVRMX))
-  261 FORMAT(I1,1X,A9,10(1X,A4,1X,A3))
-      NALREC=NALREC+1
-      IF(NOVR .EQ. 1) WRITE(6,267) NALREC,RSMCAL(1),STIDAL(1),
-     1 NALMX-1,STMNMX,(RSMCAL(NAL),STIDAL(NAL),NAL=2,MIN(NALMX,NOVRMX))
-  267 FORMAT('...ALIAS RECORD',I3,'=',2(A,1X),' HAS ',I3,' OBSERVERS ',
-     1 'AND NAME=',A,' OBSERVERS ARE:'/(14X,2(A,1X)))
-
-C     WRITE(6,293) STMID(NOVR),STIDAL(NAL)
-C 293 FORMAT('...CHECKING STORM IDS VERSUS ALIAS FILE. STMID(NOVR),',
-C    1 'STIDAL(NAL)=',2(A,1X))
-
-      IFNDAL=0
-      IF(STMNMX .NE. 'NAMELESS' .AND. STMNAM(NOVR) .EQ. STMNMX .AND.
-     1 STMID(NOVR)(3:3) .EQ. STIDAL(1)(3:3)) THEN
-      IFNDAL=1
-      WRITE(6,294) STMNMX,STIDAL(1)(3:3)
-  294 FORMAT('...EXACT NAME AND BASIN MATCH FOR NAMED STORM=',A,' IN ',
-     1 'BASIN ',A,' IN THE ALIAS FILE.')
-
-      ELSE
-      DO NALZZ=2,MIN(NALMX,NOVRMX)
-      IF(STMID(NOVR) .EQ. STIDAL(NALZZ) .AND.
-     1 RSMC(NOVR) .EQ. RSMCAL(NALZZ)) THEN
-      IFNDAL=1
-      WRITE(6,295) STMNMX,STIDAL(NALZZ),RSMC(NALZZ)
-  295 FORMAT('...STORM ID AND RSMC MATCH FOR STORM=',A,' IN THE ',
-     1 'ALIAS FILE. ID,RSMC=',2(A,1X))
-      ENDIF
-      ENDDO
-      ENDIF
-
-      IF(IFNDAL .EQ. 1) THEN
-      NALIAS=NALMX-1
-
-C CHECK THAT THE OBSERVING RSMCS IN THE ALIAS FILE ARE AT LEAST
-C THOSE OBSERVING FOR THIS CASE
-
-      NOFIND=0
-      DO NOVRZ=1,NOVRLP
-      DO NALZ=2,MIN(NALMX,NOVRMX)
-      IF(RSMC(NOVRZ) .EQ. RSMCAL(NALZ)) THEN
-      NOFIND=0
-      GO TO 2294
-      ELSE
-      NOFIND=NOFIND+1
-      ENDIF
-      ENDDO
- 2294 CONTINUE
-      IF(NOFIND .GT. 0) GO TO 2298
-      ENDDO
-
- 2298 IF(NOFIND .EQ. 0) THEN
-      RSMCZ=RSMCAL(1)
-      STMIDZ=STIDAL(1)
-
-C RESET NALIAS TO FORCE A NEW COMBINED RSMC IF THE OBSERVING
-C RSMCS AREN'T ON THE ALIAS FILE
-
-      ELSE
-      WRITE(6,297)
-  297 FORMAT('...RESETTING NALIAS=0 TO FORCE NEW ALIAS RECORD ',
-     1 'BECAUSE A NEW RSMC HAS OBSERVED THIS STORM.')
-      NALIAS=0
-      ENDIF
-      GO TO 301
-      ENDIF
-      GO TO 260
-  300 CONTINUE
-      ENDDO
-  301 CONTINUE
-
-C CONSTRUCT AND WRITE A NEW COMBINED RSMC IF NECESSARY
-
-      IF(NALIAS .EQ. 0) THEN
-      IF(NALREC .EQ. 0) WRITE(6,303)
-  303 FORMAT(/'...THE ALIAS FILE IS EMPTY. WE WILL ADD A NEW ALIAS.')
-
-      IF(IFNDAL .EQ. 0) THEN
-      RSMCZ='!'//RSMCAP(NRSPRI)
-      WRITE(6,343) NRSPRI,RSMCAP(NRSPRI),RSMCZ
-  343 FORMAT('...CONSTRUCTING NEW COMBINED RSMC FROM PRIORITY RSMC. ',
-     1 'NRSPRI,','RSMCAP(NRSPRI),RSMCZ=',I4,2(1X,'...',A,'...'))
-      NSUB=0
-      DO NOVZ=1,MIN0(NOVRLP,3)
-      IF(IPRIOR(NOVZ)/10 .NE. 1) THEN
-      NSUB=NSUB+1
-      RSMCZ(2+NSUB:2+NSUB)=RSMCAP(IPRIOR(NOVZ)-10*(IPRIOR(NOVZ)/10))
-      WRITE(6,349) RSMCZ(2+NSUB:2+NSUB),RSMCZ
-  349 FORMAT('...ADDING RSMCAP=',A,', RSMCZ=',A)
-      ENDIF
-      ENDDO
-
-      NSUB=1
-      DO NOVZ=1,MIN(NOVRLP,NOVRMX-1)
-      NSUB=NSUB+1
-      RSMCAL(NSUB)=RSMC(NOVZ)
-      STIDAL(NSUB)=STMID(NOVZ)
-      IF(IPRIOR(NOVZ)/10 .EQ. 1) THEN
-      RSMCAL(1)=RSMCZ
-      STIDAL(1)=STMIDZ
-      ENDIF
-      ENDDO
-      NOVRAD=NOVRLP+1
-
-C CHECK THE CHOICE OF STORM ID VERSUS THE CATALOG. MAKE ANOTHER
-C CHOICE IF THE FIRST CHOICE IS TAKEN.
-
-      WRITE(6,361) STIDAL(1),(STMID(NOVZ),RSMC(NOVZ),NOVZ=1,NOVRLP)
-  361 FORMAT('...CHECKING THE CATALOG TO SEE THE IF STORM IS IN ',
-     1 'THERE. FIRST CHOICE IS: ',A/4X,
-     2 'POSSIBLE IDS AND RSMCS ARE:'/(14X,2(A,2X)))
-
-      read(stidal(1)(1:2),3333) minid
- 3333 format(i2.2)
-      write(6,3334) minid
- 3334 FORMAT('...ID OF FIRST CHOICE STORM ID=',I3)
-
-      do novz=1,novrlp
-      call stcati(iuntca,stmid(novz),rsmc(novz),stmidx,ifnd)
-      if(ifnd .eq. 1) then
-      stidal(1)=stmidx
-      write(6,3335) stidal(1)
- 3335 format('...Eureka, this storm is in the catalog with id=',a)
-      go to 3341
-
-      else
-
-c Pick out the maximum storm id from the priority basin
-
-      if(stmid(novz)(3:3) .eq. stidal(1)(3:3)) then
-      read(stmid(novz)(1:2),3333) minidz
-      minid=max0(minid,minidz)
-      endif
-
-      endif
-      enddo
- 3341 continue
-
-      if(ifnd .eq. 0) then
-      write(stidal(1)(1:2),3333) minid
-      write(6,3351) stidal(1)
- 3351 format('...This storm is not in the catalog. Assign a unique ',
-     1 'id that is the smallest for the overlapping storms=',a)
-      endif
-      stmidz=stidal(1)
-
-      ELSE
-      WRITE(6,3357) RSMCAL(1),STIDAL(1),NALMX,(RSMCAL(NN),
-     1 STIDAL(NN),NN=2,NALMX)
- 3357 FORMAT('...COPYING RSMC =(',A,') AND STORM ID =(',A,') FROM ',
-     1 'ALIAS FILE AND ADDING NEW RSMCS.'/4X,'NEW RSMCS AND ',
-     2 'STORM IDS WILL NOW BE ADDED. CURRENT NUMBER IS',I3,
-     3 ' OTHER RSMCS, STORM IDS ARE:'/(10X,2(A,1X)))
-
-C ADD NEW RSMCS AND ALIASES AS APPROPRIATE
-
-      NADDRS=0
-
-      DO NOVR=1,NOVRLP
-
-      DO NRSZA=1,NRSMCX
-      IF(RSMCID(NRSZA) .EQ. RSMC(NOVR)) THEN
-      NRSAPA=NRSZA
-      WRITE(6,3359) NOVR,RSMC(NOVR),NRSAPA
- 3359 FORMAT('...FOR OVERLAP RECORD',I3,' RSMC AND INDEX ARE ',A,I4)
-      GO TO 3361
-      ENDIF
-      ENDDO
- 3361 CONTINUE
-
-      IADRMS=1
-      LNRSMC=INDEX(RSMCAL(1),' ')-1
-      DO LENG=2,LNRSMC
-      WRITE(6,3377) LENG,RSMCAL(1)(LENG:LENG),RSMCAP(NRSAPA)
- 3377 FORMAT('...TRYING TO MATCH RSMC ON ALIAS RECORD WITH OVERLAP ',
-     1 'RECORD, LENG,RSMCAL,RSMCAP=',I3,2(1X,A))
-      IF(RSMCAL(1)(LENG:LENG) .EQ. RSMCAP(NRSAPA)) THEN
-      IADRMS=0
-      ENDIF
-      ENDDO
-
-      IF(IADRMS .GT. 0) THEN
-      NADDRS=NADDRS+1
-      RSMCAL(1)(LNRSMC+NADDRS:LNRSMC+NADDRS)=RSMCAP(NRSAPA)
-      STIDAL(NALMX+NADDRS)=STMID(NOVR)
-      RSMCAL(NALMX+NADDRS)=RSMC(NOVR)
-      WRITE(6,3391) NADDRS,NALMX+NADDRS,RSMCAL(1)
- 3391 FORMAT('...ADDING RSMC, NADDRS,NALMX+NADDRS,RSMCAL(1)=',
-     1 2I4,1X,A)
-      ENDIF
-      ENDDO
-      NOVRAD=NALMX+NADDRS
-      STMIDZ=STIDAL(1)
-      RSMCZ=RSMCAL(1)
-      ENDIF
-
-C WRITE A NEW RECORD TO THE ALIAS FILE IF THERE ISN'T AN EARLIER
-C ONE IN THE NEW ALIAS FILE ALREADY
-
-      IFND=0
-      DO NADDZ=1,NALADD
-      IF(STNMAD(NADDZ) .EQ. STMNAM(NOVR) .OR.
-     1 (STIDAD(NADDZ) .EQ. STIDAL(1) .AND.
-     2 RSMCAD(NADDZ) .EQ. RSMCAL(1)) .AND.
-     3 DAYZ .GE. DAYZAD(NADDZ)) THEN
-      IFND=1
-      GO TO 3661
-      ENDIF
-      ENDDO
- 3661 CONTINUE
-
-      IF(IFND .EQ. 0) THEN
-      WRITE(6,3401) NOVRAD,NADDRS,RSMCAL(1),STIDAL(1),(RSMCAL(NN),
-     1 STIDAL(NN),NN=2,NOVRAD)
- 3401 FORMAT('...READY TO ADD MODIFIED ALIAS RECORD: NOVRAD,NADDRS,',
-     1 'PRIMARY RSMC,STORM ID=',2I4,2(1X,A),' SECONDARY ',
-     2 'RSMC, ID:'/(10X,2(A,1X)))
-      NALADD=NALADD+1
-      STNMAD(NALADD)=STMNAM(1)
-      STIDAD(NALADD)=STIDAL(1)
-      RSMCAD(NALADD)=RSMCAL(1)
-      DAYZAD(NALADD)=DAYZ
-      NAKA=MIN(NOVRAD,NOVRMX)
-      CALL AKASAV(NALADD,NAKA,DAYZ,STNMAD(NALADD),RSMCAL,STIDAL)
-      ENDIF
-
-      ENDIF
-
-C CALCULATE AVERAGE LAT/LON, RMAX
-C THEN SUBSTITUTE THE STORM ID, RSMC, LAT/LON, RMAX
-
-      WRITE(6,362) (NO,STMLAT(NO),STMLON(NO),RMAX(NO),PCEN(NO),
-     1 PENV(NO),NO=1,NOVRLP)
-  362 FORMAT(/'...READY FOR AVERAGING OVER COTEMPORANEOUS STORMS. ',
-     1 9X,'LAT',5X,'LON',4X,'RMAX',4X,'PCEN',4X,'PENV ARE:'
-     2 /(54X,I3,5F8.1))
-
-      CALL WTAVRG(STMLAT,AVWT,NOVRLP,STMLTZ)
-      CALL WTAVRG(STMLON,AVWT,NOVRLP,STMLNZ)
-      CALL WTAVGP(RMAX,AVWT,NOVRLP,RMAXZ)
-      CALL WTAVGP(PCEN,AVWT,NOVRLP,PCENZ)
-      CALL WTAVGP(PENV,AVWT,NOVRLP,PENVZ)
-      IF(STMLTZ .GE. 0) THEN
-      LATNS='N'
-      ELSE
-      LATNS='S'
-      STMLTZ=ABS(STMLTZ)
-      ENDIF
-      IF(STMLNZ .GT. 180.) THEN
-      LONEW='W'
-      ELSE
-      LONEW='E'
-      ENDIF
-      WRITE(6,363) LATNS,LONEW,STMLTZ,STMLNZ,RMAXZ,PCENZ,PENVZ
-  363 FORMAT('...AVERAGE STORM VALUES ARE:',2X,'(LATNS,LONEW=',2A2,')'
-     1 /57X,5F8.1)
-
-      IF(NVSBRS .NE. 0) THEN
-
-      DO IVR=1,NVSBRS
-      IVSB=IVSBRS(IVR)
-      IVTVAR(IVSB)=NINT(VITVAR(IVSB)/VITFAC(IVSB))
-      ENDDO
-
-      ELSE
-      WRITE(6,3364)
- 3364 FORMAT(' ###THESE AVERAGE VALUES WILL NOT BE SUBSTITUTED.')
-      ENDIF
-
-      WRITE(6,365) STMIDZ,RSMCZ
-  365 FORMAT(' ...SUBSTITUTING COMBINED STORM ID=',A,' AND RSMC=',A,
-     1 ' INTO OVERLAP RECORDS.',/,4X,'AFTER SUBSTITUTION, ',
-     2 'INDEX, INPUT RECORD#, RECORD ARE : (~~ INDICATES ',
-     3 'RECORD FROM ORIGINAL SHORT-TERM HISTORY FILE)')
-      ICURR=0
-      DO NOVR=1,NOVRLP
-C     WRITE(6,367) NOVR,STMIDZ,RSMCZ,OVRREC(NOVR)
-C 367 FORMAT('...BEFORE SUBSTITUTION,NOVR,STMIDZ,RSMCZ,OVRREC=',
-C    1 I3,2(1X,A)/4X,A,'...')
-
-C COUNT THE NUMBER OF CURRENT OVERLAPPING RECORDS
-
-      IF(IOVRLP(NOVR) .LE. NTEST) THEN
-      ICURR=ICURR+1
-      STHCH=' '
-      ELSE
-      STHCH='~~'
-      ENDIF
-
-      BUFINX=OVRREC(NOVR)
-      STMIDX=STMIDZ
-      RSMCX=RSMCZ
-      LATNSX=LATNS
-      LONEWX=LONEW
-      OVRREC(NOVR)=BUFINX
-      DO IVR=1,NVSBRS
-      IVSB=IVSBRS(IVR)
-      WRITE(OVRREC(NOVR)(ISTVAR(IVSB):IENVAR(IVSB)),FMTVIT(IVSB))
-     1 IVTVAR(IVSB)
-      OVRREC(NOVR)(ISTVAR(IVSB)-1:ISTVAR(IVSB)-1)='A'
-      ENDDO
-      WRITE(6,369) NOVR,IOVRLP(NOVR),STHCH,OVRREC(NOVR)
-  369 FORMAT(' ...',2I3,'...',A,'...',A,'...')
-      ENDDO
-
-C FINAL ASSIGNMENT OF ERROR CODE:
-C =21 IF ALL OVERLAPPING RECORDS ARE CURRENT
-C =22 IF ONE OF THE OVERLAPPING RECORDS WAS FROM THE ORIGINAL
-C SHORT TERM HISTORY FILE. IN THIS CASE ITS TOO LATE TO USE
-C THE CURRENT RECORD ANYWAY.
-
-      IF(ICURR .EQ. NOVRLP) THEN
-      IETYP=IETYP*10+1
-      ELSE
-      IETYP=IETYP*10+2
-      ENDIF
-
-C ONLY RECORDS FROM THE CURRENT TEST ARRAY CAN BE SPLIT INTO OKAY
-C AND BAD RECORDS.
-
-      DO NOVR=1,NOVRLP
-      IF(IOVRLP(NOVR) .LE. NTEST) THEN
-      IFRSMC(NUMTST(IOVRLP(NOVR)))=IETYP
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(IOVRLP(NOVR))
-      BADREC(NADD+NBAD)=TSTREC(IOVRLP(NOVR))
-      IF(IETYP .NE. 0 .AND. IPRIOR(NOVR)/10 .EQ. 1) THEN
-      NSUBR=NSUBR+1
-      NOKAY=NOKAY+1
-      NUMOKA(NOKAY)=NUMTST(IOVRLP(NOVR))
-      OKAREC(NOKAY)=OVRREC(NOVR)
-      ENDIF
-      ENDIF
-      ENDDO
-
-      GO TO 400
-      ENDIF
-
-C OTHER ERROR PROCESSING
-
-  390 CONTINUE
-
-      IFRSMC(NUMTST(NRECSV))=IETYP
-      IF(IETYP .GT. 0) THEN
-      NADD=NADD+1
-      NUMBAD(NADD+NBAD)=NUMTST(NRECSV)
-      BADREC(NADD+NBAD)=TSTREC(NRECSV)
-      ELSE
-      NOKAY=NOKAY+1
-      NUMOKA(NOKAY)=NUMTST(NRECSV)
-      OKAREC(NOKAY)=TSTREC(NRECSV)
-      ENDIF
-
-  400 CONTINUE
-      ENDDO
-
-C DUMP ALIAS RECORDS TO NEW ALIAS FILE
-
-      CALL AKADMP(IUNTAN)
-
-      WRITE(6,401)
-  401 FORMAT(//'...BEGINNING RSMCCK PART II: UNIFY STORM ID ACROSS ALL',
-     1 ' CURRENT AND HISTORICAL OCCURRENCES.')
-
-C COPY ALIAS FILE (AKAVIT) TO NEW ALIAS FILE. DON'T COPY RECORDS
-C THAT ALREADY EXIST IN NEW ALIAS FILE.
-
-      REWIND IUNTAL
-      CALL AKACPY(IUNTAL,IUNTAN)
-
-C CHECK ALL RECORDS IN THE ALIAS SHORT-TERM HISTORY FILE VERSUS
-C RECORDS THAT ARE OK SO FAR. FIRST, COPY ALL OKAY RECORDS
-C INTO WORKING SPACE.
-
-      NCHECK=NOKAY+1
-      REWIND IUNTHA
-      WRITE(6,503)
-  503 FORMAT(/'...COPYING OKAY RECORDS TO OVRREC ARRAY: RECORD #, ',
-     1 'RECORD=')
-      DO NOK=1,NOKAY
-      IOVRLP(NOK)=0
-      OVRREC(NOK)=OKAREC(NOK)
-      WRITE(6,505) NOK,OVRREC(NOK)
-  505 FORMAT('...',I3,'...',A,'...')
-      ENDDO
-      WRITE(6,511) NOKAY
-  511 FORMAT('...',I3,' OKAY RECORDS HAVE BEEN COPIED.')
-
-      WRITE(6,513) IUNTHA
-  513 FORMAT(/'...READING FROM ALIAS SHORT-TERM HISTORY FILE (UNIT',I3,
-     1 ') INTO OVRREC ARRAY: RECORD #, RECORD='/4X,A)
-
-  520 CONTINUE
-
-      READ(IUNTHA,521,END=540) OVRREC(NCHECK)
-  521 FORMAT(A)
-
-C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20
-C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR
-C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF
-C LATITUDE N/S INDICATOR TO FIND OUT ...
-
-      IF(OVRREC(NCHECK)(35:35).EQ.'N' .OR.
-     1 OVRREC(NCHECK)(35:35).EQ.'S') THEN
-
-C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR
-
-C ... THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE
-C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS
-
-      PRINT *, ' '
-      PRINT *, '==> Read in RECORD from tcvitals file -- contains a',
-     $ ' 2-digit year "',OVRREC(NCHECK)(20:21),'"'
-      PRINT *, ' '
-      PRINT *, 'From unit ',iuntha,'; OVRREC(NCHECK)-3: ',
-     $ OVRREC(NCHECK)
-      PRINT *, ' '
-      DUMY2K(1:19) = OVRREC(NCHECK)(1:19)
-      IF(OVRREC(NCHECK)(20:21).GT.'20') THEN
-      DUMY2K(20:21) = '19'
-      ELSE
-      DUMY2K(20:21) = '20'
-      ENDIF
-      DUMY2K(22:100) = OVRREC(NCHECK)(20:100)
-      OVRREC(NCHECK) = DUMY2K
-      PRINT *, ' '
-      PRINT *, '==> 2-digit year converted to 4-digit year "',
-     $ OVRREC(NCHECK)(20:23),'" via windowing technique'
-      PRINT *, ' '
-      PRINT *, 'From unit ',iuntha,'; OVRREC(NCHECK)-3: ',
-     $ OVRREC(NCHECK)
-      PRINT *, ' '
-
-      ELSE IF(OVRREC(NCHECK)(37:37).EQ.'N' .OR.
-     1 OVRREC(NCHECK)(37:37).EQ.'S') THEN
-
-C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR
-C ... NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS
-
-      PRINT *, ' '
-      PRINT *, '==> Read in RECORD from tcvitals file -- contains a',
-     $ ' 4-digit year "',OVRREC(NCHECK)(20:23),'"'
-      PRINT *, ' '
-      PRINT *, 'From unit ',iuntha,'; OVRREC(NCHECK)-3: ',
-     $ OVRREC(NCHECK)
-      PRINT *, ' '
-      PRINT *, '==> No conversion necessary'
-      PRINT *, ' '
-
-      ELSE
-
-      PRINT *, ' '
-      PRINT *, '***** Cannot determine if this record contains ',
-     $ 'a 2-digit year or a 4-digit year - skip it and try reading ',
-     $ 'the next record'
-      PRINT *, ' '
-      GO TO 520
-
-      END IF
-
-      IOVRLP(NCHECK)=0
-      WRITE(6,505) NCHECK,OVRREC(NCHECK)
-      NCHECK=NCHECK+1
-      GO TO 520
-
-  540 CONTINUE
-      NCHECK=NCHECK-1
-      WRITE(6,541) NCHECK-NOKAY
-  541 FORMAT('...',I3,' SHORT-TERM HISTORY RECORDS HAVE BEEN READ.')
-
-      REWIND IUNTAL
-      NALADD=0
-      DO NOK=1,NOKAY
-
-C DO ONLY RECORDS THAT HAVE NOT BEEN PROCESSED PREVIOUSLY
-
-      IF(IOVRLP(NOK) .LT. 0) GO TO 700
-      BUFINZ=OKAREC(NOK)
-      WRITE(6,543) NOK,STMNMZ,STMIDZ,RSMCZ
-  543 FORMAT(//'...READY TO CHECK OKAY RECORD',I3,' WITH STMNAM,ID,',
-     1 'RSMC=',3(1X,A))
-      DO IV=1,2
-      CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV),
-     1 BUFINZ)
-      ENDDO
-      CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN)
-      CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/),
-     $ 1,RINC)
-      JDY = NINT(RINC(1))
-      CALL FLDAY(JDY,IHR,IMIN,DAYZ)
-
-      IBANG=0
-      NSAME=1
-      STMID(NSAME)=STMIDZ
-      STMNAM(NSAME)=STMNMZ
-      RSMC (NSAME)=RSMCZ
-      IOVRLP(NOK)=-NOK
-      INDSAM(NSAME)=NOK
-      IDATE(NSAME)=IDATEZ
-      IUTC(NSAME)=IUTCZ
-      IDASRT(NSAME)=NSAME
-      SRTDAY(NSAME)=DAYZ
-      IF(RSMC(NSAME)(1:1) .EQ. '!') IBANG=NSAME
-
-C LOOK IN THE ALIAS FILE TO SEE IF THIS STORM HAS BEEN ALIASED
-C BEFORE.
-
-      NALSAV=NOVRMX
-      CALL AKAFND(IUNTAN,STMNMZ,RSMCZ,STMIDZ,NALSAV,STNMAL,RSMCAL,
-     1 STIDAL,IFNDAL)
-
-      IF(IFNDAL .NE. 0) THEN
-      NALMX=NALSAV
-      WRITE(6,557) STMNMZ,STMIDZ,NALMX
-  557 FORMAT('...STORM NAME,ID=',2(1X,A),' HAS BEEN ASSIGNED AN ALIAS ',
-     1 'NAME PREVIOUSLY.',I3,' ALIASES EXIST.')
-      ELSE
-      NALMX=1
-      WRITE(6,559) STMNMZ
-  559 FORMAT('...STORM ',A,' CANNOT BE FOUND IN THE ALIAS FILE.')
-      ENDIF
-
-C ACCUMULATE ALL OBSERVATIONAL REPORTS FOR THIS STORM.
-
-      DO NCK=NOK+1,NCHECK
-      IF(IOVRLP(NCK) .GE. 0) THEN
-      IFNDX=0
-      BUFINX=OVRREC(NCK)
-
-C NO MATCH FOR BOTH STORMS THAT ARE NAMED.
-
-      IF(STMNMZ .NE. 'NAMELESS' .AND. STMNMX .NE. 'NAMELESS') THEN
-      IF(STMNMX .EQ. STMNMZ) then
-      if(STMIDX(3:3) .EQ. STMIDZ(3:3)) then
-      IFNDX=1
-      else
-      icmat=0
-      do nc=1,ncrdmx
-      if(stmnmx .eq. cardnm(nc)) icmat=1
-      enddo
-      if(icmat .eq. 0) ifndx=1
-      endif
-      endif
-
-C POSSIBLE MATCH REMAINS: MATCH STORM ID FOR THE SAME RSMC. IF
-C STORM WAS IN ALIAS FILE, TRY TO MATCH ANY OF ITS ALIASES. IF
-C STORM WAS NOT IN ALIAS FILE, TRY TO MATCH STORM ID AND RSMC.
-C WARNING: THIS IS NOT A COMPLETE TEST!!!
-
-      ELSE
-      IF(IFNDAL .NE. 0) THEN
-
-      DO NAL=1,NALMX
-      IF(RSMCX .EQ. RSMCAL(NAL) .AND. STMIDX .EQ. STIDAL(NAL)) THEN
-      IFNDX=1
-      GO TO 561
-      ENDIF
-      ENDDO
-
-      ELSE
-      IF(RSMCX .EQ. RSMCZ .AND. STMIDX .EQ. STMIDZ) THEN
-      IFNDX=1
-      GO TO 561
-      ENDIF
-
-      ENDIF
-
-  561 CONTINUE
-      ENDIF
-
-C CONTINUE PROCESSING IF SAME STORM HAS BEEN FOUND.
-
-      IF(IFNDX .NE. 0) THEN
-
-      DO IV=1,2
-      CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV),
-     1 BUFINX)
-      ENDDO
-      CALL ZTIME(IDATEX,IUTCX,IYR,IMO,IDA,IHR,IMIN)
-      CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/),
-     $ 1,RINC)
-      JDY = NINT(RINC(1))
-      CALL FLDAY(JDY,IHR,IMIN,DAYX)
-
-C CHECK FOR RECORDS THAT HAVE THE SAME DATE/TIME
-
-      DO NSZ=1,NSAME
-      IF(ABS(DAYX-SRTDAY(NSZ)) .LT. FIVMIN) THEN
-      WRITE(6,567) NSZ,INDSAM(NSZ),BUFINX
-  567 FORMAT('###RECORD HAS SAME DATE/TIME AS RECORD #',I3,' WHICH ',
-     1 'IS INDEX#',I3,'. IT WILL NOT BE SAVED.',/,4X,A)
-      IOVRLP(NCK)=-999
-      GO TO 570
-      ENDIF
-      ENDDO
-
-      NSAME=NSAME+1
-      IDATE(NSAME)=IDATEX
-      IUTC(NSAME)=IUTCX
-      IOVRLP(NCK)=-NCK
-      INDSAM(NSAME)=NCK
-      STMID(NSAME)=STMIDX
-      STMNAM(NSAME)=STMNMX
-      RSMC (NSAME)=RSMCX
-      IDASRT(NSAME)=NSAME
-      SRTDAY(NSAME)=DAYX
-      IF(RSMC(NSAME)(1:1) .EQ. '!') IBANG=NSAME
-
-      ENDIF
-      ENDIF
-  570 CONTINUE
-      ENDDO
-
-      WRITE(6,571) NSAME-1,STMNMZ,STMIDZ,(INDSAM(NS),NS=2,NSAME)
-  571 FORMAT(/'...',I3,' MATCHING STORMS WERE FOUND FOR ',A,' WITH ',
-     1 'ID=',A,' BY NAME OR STORM ID MATCHING. INDICES OF ',
-     2 'MATCHING STORMS ARE:'/(4X,30I4))
-
-C FINAL CHECK: FIND THE CLOSEST STORMS TO EACH OF THE STORMS
-C THAT WERE DETERMINED TO BE THE SAME USING THE ABOVE PROCEDURE.
-C COMPARE POSITIONS EXTRAPOLATED TO THE COMMON TIMES.
-
-      NSVSAM=NSAME
-      DO NS=1,NSVSAM
-      ISAME=0
-      DISTMN=1.E10
-
-C RECOVER DATE, UTC, LAT/LON, STORM MOTION FOR SUBJECT STORM
-
-      BUFINZ=OVRREC(INDSAM(NS))
-
-      DO IV=1,9
-      CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV),
-     1 BUFINZ)
-      VITVAR(IV)=REAL(IVTVAR(IV))*VITFAC(IV)
-      ENDDO
-      IF(LATNS .EQ. 'S') STMLTZ=-STMLTZ
-      IF(LONEW .EQ. 'W') STMLNZ=360.-STMLNZ
-      DAYZ=SRTDAY(NS)
-      WRITE(6,1521) NS,NCHECK,STMNMZ,STMIDZ,IDATEZ,IUTCZ,STMLTZ,
-     1 STMLNZ,STMDRZ,STMSPZ,DAYZ,RMAXZ
- 1521 FORMAT(/'...BEGINNING PROXIMITY CHECK WITH INDEX=',I3,' AND ',
-     1 'NUMBER OF STORMS TO COMPARE=',I3/4X,'STORM=',A,'WITH ID',
-     2 '=',A,'. IDATEZ,IUTCZ,STMLTZ,STMLNZ,STMDRZ,STMSPZ,DAYZ,',
-     3 'RMAXZ='/3X,I9,I5,6F12.3)
-
-      DO 1580 NCK=1,NCHECK
-
-C PICK ONLY STORMS THAT HAVEN'T YET BEEN RECOGNIZED AS BEING THE
-C SAME AND THAT ARE NOT THEMSELVES.
-
-      IF(IOVRLP(NCK) .LT. 0 .OR. NCK .EQ. INDSAM(NS)) GO TO 1580
-
-C RECOVER DATE, UTC, LAT/LON, STORM MOTION AND RMAX FOR COMPARISON
-C STORM
-
-      BUFINX=OVRREC(NCK)
-      DO IV=1,9
-      CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV),
-     1 BUFINX)
-      VITVRX(IV)=REAL(IVTVRX(IV))*VITFAC(IV)
-      ENDDO
-      IF(LATNSX .EQ. 'S') STMLTX=-STMLTX
-      IF(LONEWX .EQ. 'W') STMLNX=360.-STMLNX
-      CALL ZTIME(IDATEX,IUTCX,IYR,IMO,IDA,IHR,IMIN)
-      CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/),
-     $ 1,RINC)
-      JDY = NINT(RINC(1))
-      CALL FLDAY(JDY,IHR,IMIN,DAYX)
-
-C PICK ONLY STORMS THAT ARE NOT COTEMPORANEOUS.
-
-      IF(ABS(DAYX-SRTDAY(NS)) .LT. FIVMIN) THEN
-C     WRITE(6,1553) NCK,INDSAM(NS)
-C1553 FORMAT('###RECORD ',I3,' HAS SAME DATE/TIME AS RECORD #',I3,'. ',
-C    1 'IT SHOULD HAVE BEEN TREATED BY THE COTEMPORANEOUS CHECK.')
-      GO TO 1580
-      ENDIF
-
-      IF(STMNMZ .NE. 'NAMELESS' .AND. STMNMX .NE. 'NAMELESS') THEN
-C     WRITE(6,1557) NCK,INDSAM(NS)
-C1557 FORMAT('###RECORDS ',I3,' AND',I3,' BOTH HAVE NAMES. THEY ',
-C    1 'SHOULD HAVE BEEN TREATED BY THE PREVIOUS MATCHING CHECK.')
-      GO TO 1580
-      ENDIF
-
-C CAN THEY CAN BE DEFINITIVELY PROVEN NOT TO BE THE SAME STORM?
-C IF THEY ARE BOTH BANG STORMS OR BOTH NOT BANG STORMS, THE RSMCS
-C AND STORMS IDS CAN BE COMPARED DIRECTLY. OTHERWISE, WE MUST LOOK
-C IN THE ALIAS FILE TO SEE IF THE SAME RSMC HAS OBSERVED EACH.
-
-      IF(RSMCZ .EQ. RSMCX .AND. STMIDZ .NE. STMIDX) THEN
-C     WRITE(6,2551) RSMCZ,STMIDZ,STMIDX
-C2551 FORMAT('...DIRECT COMPARISON OF STORM IDS FOR THE SAME RSMC ',
-C    1 'GIVES UNAMBIGUOUSLY DIFFERENT STORMS, RSMC,STORM IDS=',
-C    2 3(A,1X))
-      GO TO 1580
-      ENDIF
-
-C LOOK IN THE ALIAS FILE
-
-      IFNDOV=0
-      IRECOV=0
-      REWIND IUNTAN
- 2552 READ(IUNTAN,261,END=2560) NALOV,STNMOV,(RSMCOV(NAL),STIDOV(NAL),
-     1 NAL=1,NALOV)
-      IRECOV=IRECOV+1
-
-      DO NALX=1,NALOV
-      IF((RSMCX(1:1) .EQ. '!' .AND. STMIDX .EQ. STIDOV(NALX)) .OR.
-     1 (RSMCX(1:1) .NE. '!' .AND.
-     2 RSMCX .EQ. RSMCOV(NALX) .AND. STMIDX .EQ. STIDOV(NALX))) THEN
-      IFNDOV=1
-      DO NALZ=2,NALOV
-      IF(RSMCZ .EQ. RSMCOV(NALZ) .AND. STMIDZ .NE. STIDOV(NALZ)) THEN
-C     WRITE(6,2553) IRECOV,RSMCX,STMIDX,NALZ,RSMCOV(NALZ),STIDOV(NALZ)
-C    1 STMIDZ
-C2553 FORMAT('###ALIAS RECORD',I3,' MATCHES POTENTIAL OVERLAPPING ',
-C    1 'STORM WITH RSMC,ID=',2(A,1X,)/4X,'BUT FOR ALIAS #',I3,
-C    2 ' RSMC=',A,' IS THE SAME BUT STORM IDS=',2(A,1X),' ARE ',
-C    3 'DIFFERENT.')
-      GO TO 1580
-      ENDIF
-      ENDDO
-      ENDIF
-      ENDDO
-      GO TO 2552
-
- 2560 CONTINUE
-
-      IF(IFNDOV .EQ. 0 .AND. RSMCX(1:1) .EQ. '!') THEN
-      WRITE(6,2561) STMNMX,RSMCX,STMIDX
- 2561 FORMAT('...STORM ',A,' WITH RSMC AND ID=',2(A,1X),' WAS NOT ',
-     1 'FOUND IN THE ALIAS FILE. ABORT1')
-      CALL ABORT1(' RSMCCK',2561)
-      ENDIF
-
-      ISAME=ISAME+1
-      DISTZX=DISTSP(STMLTZ,STMLNZ,STMLTX,STMLNX)*1.E-3
-
-C     WRITE(6,1571) STMNMX,STMIDX,NCK,IDATEX,IUTCX,STMLTX,STMLNX,
-C    1 STMDRX,STMSPX,DAYX,DISTZX,RMAXX
-C1571 FORMAT('...BEGINNING COMPARISON WITH STORM=',A,'WITH ID=',A,'. ',
-C    1 'INDEX,IDATEX,IUTCX,STMLTX,STMLNX,STMDRX,STMSPX,DAYX,',
-C    2 'DISTZX,RMAXX='/4X,I3,I10,I5,7F12.3)
-      IF(DISTZX .LT. DISTMN) THEN
-      DISTMN=DISTZX
-      NCLOSE=NCK
-      DAYSAV=DAYX
-      IUTCSV=IUTCX
-      IDATSV=IDATEX
-      STLTSV=STMLTX
-      STLNSV=STMLNX
-      STDRSV=STMDRX
-      STSPSV=STMSPX
-      RMAXSV=RMAXX
-      ENDIF
- 1580 CONTINUE
-
-      IF(ISAME .GT. 0) THEN
-      WRITE(6,1581) NS,NCLOSE,DISTMN,OVRREC(INDSAM(NS)),OVRREC(NCLOSE)
- 1581 FORMAT(/'...FOR NS=',I3,', CLOSEST STORM IS INDEX=',I3,' WITH ',
-     1 'DISTANCE=',F8.1,' KM. RECORDS ARE:'/4X,'Z...',A/4X,
-     2 'X...',A/)
-
-      BUFINX=OVRREC(NCLOSE)
-
-      IF(RMAXZ .LT. 0.0) THEN
-      DO NBA=1,NBASIN
-      IF(STMIDZ(3:3) .EQ. IDBASN(NBA)) THEN
-      IBASN=NBA
-      GO TO 1546
-      ENDIF
-      ENDDO
- 1546 CONTINUE
-      RMAXZ=TCCLIM(9,IBASN)
-      WRITE(6,1583) NREC,RMAXZ,NABASN(IBASN)
- 1583 FORMAT('###RMAXZ MISSING FOR PROXIMITY CHECK ON RECORD',I3,'.'/4X,
-     1 'REPLACEMENT VALUE WILL BE A CLIMATOLOGICAL GUESS OF ',
-     2 F6.1,' KM FOR BASIN ',A,'.')
-      ENDIF
-
-      IF(RMAXSV .LT. 0.0) THEN
-      DO NBA=1,NBASIN
-      IF(STMIDX(3:3) .EQ. IDBASN(NBA)) THEN
-      IBASN=NBA
-      GO TO 1556
-      ENDIF
-      ENDDO
- 1556 CONTINUE
-      RMAXSV=TCCLIM(9,IBASN)
-      WRITE(6,1584) NREC,RMAXSV,NABASN(IBASN)
- 1584 FORMAT('###RMAXSV MISSING FOR PROXIMITY CHECK ON RECORD',I3,'. ',
-     1 'REPLACEMENT VALUE WILL BE A CLIMATOLOGICAL GUESS '/4X,
-     2 'OF ',F6.1,' KM FOR BASIN ',A,'.')
-      ENDIF
-
-      DTXZ=DAYSAV-DAYZ
-      DSTFAC=DTXZ*FACSPD
-      CALL DS2UV(USTMZ,VSTMZ,STMDRZ,STMSPZ)
-      CALL DS2UV(USTMX,VSTMX,STDRSV,STSPSV)
-      EXTLTZ=STMLTZ+VSTMZ*DSTFAC
-      EXTLNZ=STMLNZ+USTMZ*DSTFAC/COSD(EXTLTZ)
-      EXTLTX=STLTSV-VSTMX*DSTFAC
-      EXTLNX=STLNSV-USTMX*DSTFAC/COSD(EXTLTX)
-      DSTX2Z=DISTSP(STMLTZ,STMLNZ,EXTLTX,EXTLNX)*1.E-3
-      DSTZ2X=DISTSP(STLTSV,STLNSV,EXTLTZ,EXTLNZ)*1.E-3
-
-C LAST CRITERION FOR FINDING THE SAME STORM IS DISTANCE
-
-      DSTOLP=RMAXZ+RMAXSV
-      IF(DSTZ2X .GE. DSTOLP .OR. DSTX2Z .GE. DSTOLP) THEN
-C     WRITE(6,1585)
-C1585 FORMAT(/'...STORMS ARE NOT CONSIDERED THE SAME SINCE NO ',
-C    1 'OVERLAPPING IS PRESENT AT A COMMON EXTRAPOLATED TIME.')
-
-      ELSE
-      WRITE(6,1587) DAYZ,DAYX,DTXZ,DISTMN,STMNMZ,STMIDZ,STMLTZ,EXTLTZ,
-     1 STMLNZ,EXTLNZ,DSTZ2X,RMAXZ,STMNMX,STMIDX,STLTSV,
-     2 EXTLTX,STLNSV,EXTLNX,DSTX2Z,RMAXSV
- 1587 FORMAT(/'...EXTRAPOLATION TABLE TO COMMON TIMES: DAYX,DAYZ,DTXZ',
-     1 ',DISTMN=',4F12.3/20X,'SUBJECT (Z) STORM & ID',6X,
-     2 'T=0LAT',6X,'T=XLAT',6X,'T=0LON',6X,'T=XLON',2X,
-     3 'DISTANCE TO X',3X,'RMAXZ'/2(25X,A,2X,A,3X,6F12.3/),20X,
-     4 'COMPARISON (X) STORM & ID',3X,
-     5 'T=0LAT',6X,'T=ZLAT',6X,'T=0LON',6X,'T=ZLON',2X,
-     6 'DISTANCE TO Z',3X,'RMAXX')
-      WRITE(6,1589)
- 1589 FORMAT(/'###STORMS ARE OVERLAPPED AT A COMMON EXTRAPOLATED TIME.',
-     1 ' THEY ARE ASSUMED TO BE THE SAME.###')
-
-      BUFINX=OVRREC(NCLOSE)
-      NSAME=NSAME+1
-      IDATE(NSAME)=IDATSV
-      IUTC(NSAME)=IUTCSV
-      IOVRLP(NCLOSE)=-NCLOSE
-      INDSAM(NSAME)=NCLOSE
-      STMID(NSAME)=STMIDX
-      STMNAM(NSAME)=STMNMX
-      RSMC (NSAME)=RSMCX
-      IDASRT(NSAME)=NSAME
-      SRTDAY(NSAME)=DAYSAV
-      IF(RSMC(NSAME)(1:1) .EQ. '!') IBANG=NSAME
-
-      ENDIF
-      ENDIF
-      ENDDO
-
-C PROCESS ALL RECORDS FOR THE SAME STORM
-
-      IF(NSAME .GT. 1) THEN
-      BUFINZ=OKAREC(NOK)
-      WRITE(6,577) NSAME,STMNMZ,STMIDZ,(NS,IDATE(NS),IUTC(NS),
-     1 RSMC(NS),STMID(NS),STMNAM(NS),NS=1,NSAME)
-  577 FORMAT('...',I3,' RECORDS APPEAR TO BE THE SAME STORM WITH NAME,',
-     1 ' ID=',2(1X,A),' AND MUST BE UNIFIED.'/10X,' DATE ',
-     2 'UTC RSMC STMID NAME ARE:'/(4X,I3,I10,2X,I5,2X,2(3X,
-     3 A),4X,A))
-
-c Sort the records by time
-
-      CALL SORTRL(SRTDAY(1:NSAME),IDASRT(1:NSAME),NSAME)
-
-C LOOK IN THE ALIAS FILE TO SEE WHICH STORM ALIASES CORRESPOND
-C TO THE BANG STORM.
-
-      IF(IBANG .NE. 0) THEN
-      STMIDX=STMID(IBANG)
-      STMNMX=STMNAM(IBANG)
-      RSMCX=RSMC (IBANG)
-
-      REWIND IUNTAN
-      NRECAL=0
-  552 READ(IUNTAN,261,END=555) NALMX,STNMAL,(RSMCAL(NAL),STIDAL(NAL),
-     1 NAL=1,NALMX)
-      NRECAL=NRECAL+1
-
-C NO MATCH FOR BOTH STORMS THAT ARE NAMED.
-
-      IF(STMNMX .NE. 'NAMELESS' .AND.
-     1 STNMAL .NE. 'NAMELESS' .AND.
-     2 STNMAL .NE. STMNMX) GO TO 552
-
-C POSSIBLE MATCH REMAINS: MATCH STORM ID ONLY IN THIS CASE SINCE
-C THEY ARE BOTH BANG STORMS.
-
-      DO NAL=1,NALMX
-      IF(STMIDX .EQ. STIDAL(NAL)) THEN
-      IFNDAL=NRECAL
-      GO TO 555
-      ENDIF
-      ENDDO
-      GO TO 552
-
-  555 CONTINUE
-
-      IF(IFNDAL .EQ. 0) THEN
-      WRITE(6,5571) IBANG,STMNMX,RSMCX,STMIDX
- 5571 FORMAT('******BANG STORM WITH INDEX=',I3,', NAME,RSMC,ID=',
-     1 3(A,1X),' CANNOT BE FOUND IN THE ALIAS FILE. ABORT1')
-      CALL ABORT1(' RSMCCK',5571)
-
-      ELSE
-      WRITE(6,5573) IBANG,STMNMX,RSMCX,STMIDX,IFNDAL
- 5573 FORMAT('...BANG STORM WITH INDEX=',I3,', NAME,RSMC,ID=',3(A,1X),
-     1 ' WAS FOUND AS RECORD#',I4,' IN THE ALIAS FILE. ')
-      ENDIF
-      ENDIF
-
-C LOOK FOR ALL THE RSMCS THAT HAVE OBSERVED THIS STORM SO FAR
-
-      NRSMC=NALMX-1
-      NALMXZ=NALMX
-
-C LOAD RSMCS FROM THE ALIAS FILE, IF ANY
-
-      DO NRS=2,NALMX
-      DO NRSZ=1,NRSMCX
-      IF(RSMCAL(NRS) .EQ. RSMCID(NRSZ)) THEN
-      NRSMCF=NRSZ
-      ENDIF
-      ENDDO
-      IRSMC(NRS-1)=NRSMCF
-      WRITE(6,6633) NRS-1,RSMCID(NRSMCF)
- 6633 FORMAT('...STORING ALIAS #',I3,' WHICH IS ',A)
-      ENDDO
-
-      DO NS=1,NSAME
-
-      IF(RSMC(NS) (1:1) .EQ. '!') THEN
-      NPS=2
-      NPE=4
-      ELSE
-      NPS=1
-      NPE=1
-      ENDIF
-
-      DO NP=NPS,NPE
-
-C COMBINED RSMC CASE
-
-      IF(RSMC(NS) (1:1) .EQ. '!') THEN
-      DO NRSZ=1,NRSMCX
-      IF(RSMC(NS)(NP:NP) .EQ. RSMCAP(NRSZ)) THEN
-      NRSMCF=NRSZ
-      GO TO 591
-      ENDIF
-      ENDDO
-
-C INDIVIDUAL RSMC CASE
-
-      ELSE
-      DO NRSZ=1,NRSMCX
-      IF(RSMC(NS) .EQ. RSMCID(NRSZ)) THEN
-      NRSMCF=NRSZ
-      GO TO 591
-      ENDIF
-      ENDDO
-      ENDIF
-  591 CONTINUE
-
-
-      ISAV=0
-      DO NRSMS=1,NRSMC
-      IF(IRSMC(NRSMS) .EQ. NRSMCF) ISAV=ISAV+1
-      ENDDO
-
-      IF(ISAV .EQ. 0) THEN
-      NRSMC=NRSMC+1
-      IRSMC(NRSMC)=NRSMCF
-
-C STORE A NEW RSMC IF NECESSARY.
-
-      IADDAL=0
-      DO NAL=2,NALMXZ
-      IF(RSMCAL(NAL) .EQ. RSMCID(NRSMCF)) IADDAL=IADDAL+1
-C     WRITE(6,6441) NAL,RSMCAL(NAL),RSMCID(NRSMCF),IADDAL
-C6441 FORMAT('...DEBUGGING, NAL,RSMCAL(NAL),RSMCID(NRSMCF),IADDAL=',
-C    1 I3,2(1X,A),I3)
-      ENDDO
-
-      IF(IADDAL .EQ. 0) THEN
-      WRITE(6,641) RSMCID(NRSMCF),STMID(NS)
-  641 FORMAT('...THE LIST OF OBSERVERS WILL INCLUDE RSMC=',A,' FOR ',
-     1 'STORM ID=',A)
-      NALMXZ=NALMXZ+1
-      STIDAL(NALMXZ)=STMID(NS)
-      RSMCAL(NALMXZ)=RSMCID(NRSMCF)
-
-      ELSE
-      WRITE(6,643) RSMCID(NRSMCF),STMNMZ
-  643 FORMAT('...RSMC=',A,' IS ALREADY IN THE LIST OF OBSERVERS FOR ',A)
-      ENDIF
-
-      ENDIF
-
-      ENDDO
-      ENDDO
-      WRITE(6,651) STMNMZ,STMIDZ,NRSMC,(RSMCID(IRSMC(NRS)),NRS=1,NRSMC)
-  651 FORMAT(/'...SUMMARY OF ALL OBSERVING RSMCS FOR STORM WITH NAME,',
-     1 'ID=',2(1X,A),'. NUMBER OF RSMCS=',I3/4X,10(A,2X))
-
-C IF MORE THAN ONE RSMC HAS OBSERVED STORM, UNIFY THE STORM ID
-C AND RSMC IF ANY NEW RSMCS HAVE BEEN ADDED.
-
-      IF(NRSMC .GT. 1 .OR. IFNDAL .NE. 0) THEN
-
-      IF(NALMX .EQ. NALMXZ) THEN
-
-C NO NEW RSMC NEED BE ADDED. COPY STORM ID AND RSMC FROM A BANG
-C RECORD.
-
-      IRITAL=0
-
-      IF(IFNDAL .NE. 0) THEN
-      WRITE(6,6653) STMNMZ,STMIDZ,STNMAL,STIDAL(1),RSMCAL(1)
- 6653 FORMAT(/'...STORM WITH NAME, ID=',2(1X,A),' WAS FOUND IN ALIAS ',
-     1 'FILE WITH NAME=',A,'. ID,RSMC=',2(A,1X))
-      STMIDZ=STIDAL(1)
-      RSMCZ=RSMCAL(1)
-      STMNMZ=STNMAL
-
-      ELSE IF(IBANG .NE. 0) THEN
-      WRITE(6,653)
-  653 FORMAT('...STORM NOT FOUND IN ALIAS FILE AND NO NEW RSMC HAS ',
-     1 'BEEN ADDED. STORE RSMC AND STORM ID FROM A BANG RECORD.')
-      STMIDZ=STMID(IBANG)
-      RSMCZ=RSMC(IBANG)
-
-      ELSE
-      WRITE(6,655) STMNMZ,STMIDZ
-  655 FORMAT(/'******STORM WITH NAME, ID=',2(1X,A),' IS NOT LISTED AS ',
-     1 'A BANG STORM, CANNOT BE FOUND IN THE ALIAS FILE,'/7X,
-     2 'HAS MORE THAN ONE RSMC BUT NONE ARE TO BE ADDED. ABORT1')
-      CALL ABORT1(' RSMCCK',655)
-      ENDIF
-
-      ELSE
-
-C ADD A NEW RSMC. COPY RSMC FROM THE BANG STORM RECORD. THEN ADD
-C NEW RSMCS. IF THERE IS NO BANG RECORD, MAKE UP A NEW RSMC
-C AND STORM ID BASED ON THE EARLIEST RECORD.
-
-      IRITAL=1
-
-      NWRSMC=NALMXZ-NALMX
-      WRITE(6,6657) NWRSMC
- 6657 FORMAT('...',I3,' NEW RSMCS WILL BE ADDED.')
-
-c Mark a relocation flag for the record in which a new
-c rsmc has observed storm
-
-      do ns=2,nsame
-      if(rsmc(idasrt(ns)) .ne. rsmc(idasrt(1))) then
-      write(6,6679) ns,idasrt(1),rsmc(idasrt(1)),idasrt(ns),
-     1 rsmc(idasrt(ns)),nsame
- 6679 format('...For ns=',i3,' a new observing rsmc has been detected.',
-     1 ' Index,rsmc (first,new)=',2(i3,1x,a)/4x,'Total number ',
-     2 'of observed records=',i3,' We insert a relocation flag ',
-     3 'in the new record.')
-      bufinx=ovrrec(indsam(idasrt(ns)))
-      relocx='R'
-      ovrrec(indsam(idasrt(ns)))=bufinx
-      write(6,5509) indsam(idasrt(ns)),bufinx
- 5509 format('...Record index and corrected record are:',i3/4x,a)
-      endif
-      enddo
-
-      IF(IBANG .NE. 0) THEN
-      STMIDZ=STMID(IBANG)
-      RSMCZ=RSMC(IBANG)
-      LNRSMC=INDEX(RSMCZ,' ')-1
-      WRITE(6,657) LNRSMC
-  657 FORMAT('...BANG STORM EXISTS: STORE RSMC AND STORM ID FROM A ',
-     1 'BANG RECORD, LENGTH IS:',I2)
-
-      NWSLOT=0
-      DO NAD=1,NWRSMC
-      NWSLOT=NWSLOT+1
-
-      IF(LNRSMC+NWSLOT .LE. 4) THEN
-      DO NRSZ=1,NRSMCX
-      IF(RSMCAL(NALMX+NAD) .EQ. RSMCID(NRSZ)) THEN
-c     write(6,6541) nad,nalmx,nwslot,lnrsmc+nwslot,nrsz,
-c    1 rsmcal(nalmx+nad),rsmcid(nrsz)
-c6541 format('...debugging, nad,nalmx,nwslot,lnrsmc+nwslot,nrsz,',
-c    1 'rsmcal(nalmx+nad),rsmcid(nrsz)'/4x,5i4,2(1x,a))
-      NRSMCF=NRSZ
-      GO TO 6561
-      ENDIF
-      ENDDO
- 6561 CONTINUE
-      RSMCZ(LNRSMC+NWSLOT:LNRSMC+NWSLOT)=RSMCAP(NRSMCF)
-      WRITE(6,6563) RSMCAP(NRSMCF),RSMCZ
- 6563 FORMAT('...ADDING RSMC=',A,' TO AN ALREADY DEFINED BANG STORM ',
-     1 'RSMC. UPDATED RSMC=',A)
-
-      ELSE
-      WRITE(6,6567) NWSLOT,LNRSMC,NWRSMC
- 6567 FORMAT('###INSUFFICIENT SPACE TO ADD NEW RSMC, NWSLOT,LNRSMC,',
-     1 'NWRSMC=',3I3)
-      ENDIF
-      ENDDO
-
-      ELSE
-
-C IN THIS CASE, NO OBSERVERS ARE BANG RECORDS AND THE STORM IS
-C NOT IN THE ALIAS FILE. AN ALIAS RECORD MUST BE CREATED AND
-C WRITTEN TO THE ALIAS FILE
-
-      WRITE(6,659) IDASRT(1),STMID(IDASRT(1)),STMNAM(IDASRT(1))
-  659 FORMAT(/'...NO BANG STORMS EXIST. EARLIEST RECORD IS:',I3,
-     1 '. STORM ID IS: ',A,' STORM NAME IS: ',A)
-
-C SUBSTITUTE THE ID OF THE FIRST OBSERVING RSMC AND CONSTRUCT
-C A UNIFIED RSMC. SUBSTITUTE STORM NAME IF FIRST OBSERVATION
-C DOES NOT HAVE NAMELESS AS A STORM NAME.
-
-      RSMCZ=RSMC(IDASRT(1))
-      STMIDZ=STMID(IDASRT(1))
-      STMNMZ=STMNAM(IDASRT(1))
-
-C FIRST TWO RSMC SLOTS
-
-      IF(RSMCZ(1:1) .EQ. '!') THEN
-      WRITE(6,663) RSMC(IDASRT(1))(1:2)
-  663 FORMAT('...THIS RECORD IS A MULTIPLY OBSERVED STORM. COPY THE ',
-     1 'RSMCAP AND BANG FROM THIS RECORD=',A)
-      RSMCZ(1:2)=RSMC(IDASRT(1))(1:2)
-      DO NRSZ=1,NRSMCX
-      IF(RSMC(IDASRT(1))(2:2) .EQ. RSMCAP(NRSZ)) THEN
-      NRSST=NRSZ
-      GO TO 661
-      ENDIF
-      ENDDO
-  661 CONTINUE
-
-      ELSE
-      WRITE(6,667)
-  667 FORMAT('...THIS RECORD IS A SINGLY OBSERVED STORM. COPY THE ',
-     1 'RSMC FROM THIS RECORD.')
-      RSMCZ(1:1)='!'
-      DO NRSZ=1,NRSMCX
-      IF(RSMC(IDASRT(1)) .EQ. RSMCID(NRSZ)) THEN
-      NRSST=NRSZ
-      GO TO 671
-      ENDIF
-      ENDDO
-  671 CONTINUE
-      RSMCZ(2:2)=RSMCAP(NRSST)
-      ENDIF
-
-C REMAINING RSMC SLOTS
-
-      NID=2
-      RSMCZ(3:4)=' '
-      DO NRS=1,NRSMC
-      IF(RSMCID(IRSMC(NRS)) .NE. RSMCID(NRSST)) THEN
-      NID=NID+1
-      IF(NID .GT. 4) GO TO 680
-      RSMCZ(NID:NID)=RSMCAP(IRSMC(NRS))
-      WRITE(6,679) RSMCAP(IRSMC(NRS)),IRSMC(NRS),NID,RSMCZ
-  679 FORMAT('...ADDING RSMCAP ',A,' FOR RSMC ',I2,' IN SLOT ',I3,
-     1 ' RSMCZ=',A)
-      ENDIF
-  680 CONTINUE
-      ENDDO
-
-      ENDIF
-
-      ENDIF
-
-C HAS THE STORM BEEN NAMED BY SOMEONE OVER ITS HISTORY? IF SO,
-C SUBSTITUTE THE NAME FOR THE ALIAS FILE.
-
-      IF(STMNMZ .EQ. 'NAMELESS') THEN
-      DO NS=1,NSAME
-      IF(STMNAM(NS) .NE. 'NAMELESS') THEN
-      STMNMZ=STMNAM(NS)
-      WRITE(6,6689) STMNAM(NS),NS
- 6689 FORMAT('###STORM NAMELESS WILL BE RENAMED ',A,' IN THE ALIAS ',
-     1 'FILE. INDEX OF NAMED STORM=',I3)
-      IRITAL=1
-      GO TO 6691
-      ENDIF
-      ENDDO
- 6691 CONTINUE
-      ENDIF
-
-C IF NECESSARY, WRITE ALIAS RECORD AND SUBSTITUTE UNIFIED RSMC AND
-C STORM ID.
-
-      IF(IRITAL .EQ. 1) THEN
-      WRITE(6,681) STMNMZ,STMIDZ,RSMCZ
-  681 FORMAT(/'...WRITING A UNIFIED ALIAS RECORD FOR STORM NAME=',A,
-     1 '. STORM ID AND UNIFIED RSMC ARE:',2(1X,A))
-      NALADD=NALADD+1
-      STIDAL(1)=STMIDZ
-      RSMCAL(1)=RSMCZ
-      DAYZ=-999.0
-      CALL AKASAV(NALADD,NALMXZ,DAYZ,STMNMZ,RSMCAL,STIDAL)
-      ENDIF
-
-      DO NS=1,NSAME
-      BUFINX=OVRREC(INDSAM(NS))
-C     WRITE(6,683) NS,INDSAM(NS),BUFINX
-C 683 FORMAT('...SUBSTITUTING UNIFIED RSMC AND STMID. NS,INDSAM,RECORD',
-C    1 ' ARE:',2I3/' ...',A)
-      STMIDX=STMIDZ
-      RSMCX=RSMCZ
-      OVRREC(INDSAM(NS))=BUFINX
-C     WRITE(6,683) NS,INDSAM(NS),BUFINX
-      ENDDO
-
-      ELSE
-      WRITE(6,693)
-  693 FORMAT(/'...ONLY 1 RSMC HAS OBSERVED STORM. THERE IS NO NEED TO',
-     1 ' UNIFY THE RSMC AND STORM ID IF STORM IDS ARE THE SAME.'
-     2 /4X,'WE PROCEED TO CHECK STORM ID CONSISTENCY.')
-
-      ISAME=0
-      DO NS=2,NSAME
-      IF(STMID(NS) .NE. STMIDZ) THEN
-      IF(ABS(SRTDAY(NS)-SRTDAY(1)) .LE. DTOVR) THEN
-      ISAME=ISAME+1
-      IETYP=6
-      WRITE(6,1683) DTOVR,INDSAM(NS),INDSAM(1),STMID(NS),STMIDZ,
-     1 STMNAM(NS),STMNMZ,SRTDAY(NS),SRTDAY(1),
-     2 OVRREC(INDSAM(NS)),OVRREC(INDSAM(1))
- 1683 FORMAT(/'###TWO STORMS OBSERVED BY THE SAME RSMC WITH TIMES ',
-     1 'DIFFERING BY LESS THAN ',F5.1,' DAYS AND DIFFERENT ',
-     2 'STORM ID.'/4X,'THESE ARE PROBABLY THE SAME STORM.
IN ', - 3 'ORDER (NS,1), INDEX, STORM ID, STORM NAME, DAY AND ', - 4 'RECORD ARE:'/10X,2I5,4(2X,A),2F12.3/2(4X,A/)) - ELSE - WRITE(6,1687) DTOVR,INDSAM(NS),INDSAM(1),STMID(NS),STMIDZ, - 1 STMNAM(NS),STMNMZ,SRTDAY(NS),SRTDAY(1), - 2 OVRREC(INDSAM(NS)),OVRREC(INDSAM(1)) - 1687 FORMAT(/'###TWO STORMS OBSERVED BY THE SAME RSMC WITH TIMES ', - 1 'DIFFERING BY MORE THAN ',F5.1,' DAYS AND DIFFERENT ', - 2 'STORM ID.'/4X,'THESE ARE PROBABLY NOT THE SAME STORM.', - 3 ' IN ORDER (NS,1), INDEX, STORM ID, STORM NAME, DAY ', - 4 'AND RECORD ARE:'/10X,2I5,4(2X,A),2F12.3/2(4X,A/)) - ENDIF - ENDIF - ENDDO - -C STORMS HAVE ALREADY BEEN SORTED IN CHRONOLOGICAL ORDER SO -C SUBSTITUTE THE STORM ID OF THE EARLIEST STORM. - - IF(ISAME .NE. 0) THEN - - WRITE(6,1695) IDASRT(1),STMID(IDASRT(1)),STMNAM(IDASRT(1)) - 1695 FORMAT(/'...EARLIEST RECORD IS:',I3,'. STORM ID IS: ',A,' STORM ', - 1 'NAME IS: ',A/4X,'THIS STORM ID AND RSMC WILL BE COPIED ', - 2 'TO THE FOLLOWING STORMS:') - DO NS=1,NSAME - BUFINX=OVRREC(INDSAM(NS)) - STMIDX=STMID(IDASRT(1)) - RSMCX =RSMC (IDASRT(1)) - OVRREC(INDSAM(NS))=BUFINX - IF(INDSAM(NS) .LE. NOKAY) IFRSMC(NUMOKA(INDSAM(NS)))=-IETYP - WRITE(6,1697) NS,INDSAM(NS),OVRREC(INDSAM(NS)) - 1697 FORMAT('...',I3,'...',I3,'...',A) - ENDDO - ENDIF - - ENDIF - - ELSE - WRITE(6,697) NOK,OKAREC(NOK) - 697 FORMAT('...OKAY RECORD ',I3,' IS UNIQUE AMONG OKAY AND SHORT-', - 1 'TERM HISTORY RECORDS. NO FURTHER PROCESSING WILL BE ', - 2 'DONE. RECORD IS:'/4X,'...',A,'...') - ENDIF - - 700 CONTINUE - ENDDO - CALL AKADMP(IUNTAL) - -C SAVE AS BAD RECORDS THOSE ORIGINAL RECORDS THAT HAVE BEEN -C UNIFIED, BUT NOT MULTIPLY OBSERVED, SO THAT THEY CAN BE -C COPIED TO THE ORIGINAL SHORT-TERM HISTORY FILE LATER BY RITSTH. - - DO NOK=1,NOKAY - - IF(OKAREC(NOK)(1:1) .NE. '!' .AND. - 1 OVRREC(NOK)(1:1) .EQ. 
'!') THEN - IETYP=30 - IFRSMC(NUMOKA(NOK))=IETYP - NADD=NADD+1 - NUNIFY=NUNIFY+1 - NUMBAD(NADD+NBAD)=NUMOKA(NOK) - BADREC(NADD+NBAD)=OKAREC(NOK) - ENDIF - - OKAREC(NOK)=OVRREC(NOK) - ENDDO - - WRITE(6,711) IUNTOK - 711 FORMAT(/'...WE HAVE UNIFIED ALL RECORDS AND ARE WRITING THEM TO ', - 1 'THE SCRATCH FILE.'/4X,'THEY WILL BE WRITTEN TO THE ', - 2 'ALIAS SHORT-TERM HISTORY FILE IF UPDATING IS REQUIRED.'/ - 3 4X,'OLD ALIAS SHORT-TERM HISTORY RECORDS WRITTEN TO ', - 4 'IUNTOK=',I3,' ARE:') - NRCOVR=0 - DO NHA=NOKAY+1,NCHECK - IF(IOVRLP(NHA) .NE. -999) THEN - NRCOVR=NRCOVR+1 - WRITE(IUNTOK,521) OVRREC(NHA) - WRITE(6,719) NRCOVR,OVRREC(NHA) - 719 FORMAT('...',I3,'...',A,'...') - OVRREC(NRCOVR)=OVRREC(NHA) - ENDIF - ENDDO - WRITE(6,721) NRCOVR - 721 FORMAT(/'...IMPORTANT NOTE: THE UPDATED OLD ALIAS SHORT-TERM ', - 1 'HISTORY RECORDS ARE RETURNED TO THE MAIN PROGRAM IN ', - 2 'OVRREC.'/4X,'THEY WILL BE COPIED INTO THE SCRATCH FILE ', - 3 '(INSTEAD OF USING CPYREC) WHEN FILES=F.'/4X,'THE NUMBER', - 4 ' OF RECORDS RETURNED IS:',I4) - -C COPY NEW ALIAS FILE TO AKAVIT. DON'T COPY RECORDS -C THAT ALREADY EXIST IN AKAVIT. - - REWIND IUNTAN - CALL AKACPY(IUNTAN,IUNTAL) - -C DO NOT CLEAR OUT THE NEW ALIAS FILE; AKAVIT MAY BE CHANGED BY -C RCNCIL LATER - - WRITE(6,1001) NOKAY,-NSUBR,-NUNIFY,NADD,NTEST, - 1 (ERCRS(NER),NER=1,NERCRS) - 1001 FORMAT(//'...RESULTS OF THE MULTIPLE RSMC CHECK ARE: NOKAY=',I4, - 1 ' NSUBR=',I4,' NUNIFY=',I4,' AND NADD=',I4,' FOR A ', - 2 'TOTAL OF ',I4,' RECORDS.'//4X,'ERROR CODES ARE:'/(6X,A)) - WRITE(6,1003) - 1003 FORMAT(/'...OKAY RECORDS ARE:',100X,'ERC'/) - DO NOK=1,NOKAY - WRITE(6,1009) NOK,NUMOKA(NOK),OKAREC(NOK),-IFRSMC(NUMOKA(NOK)) - 1009 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - IF(NADD .GT. 
0) WRITE(6,1011) (NBAD+NBA,NUMBAD(NBAD+NBA), - 1 BADREC(NBAD+NBA), - 2 IFRSMC(NUMBAD(NBAD+NBA)), - 3 NBA=1,NADD) - 1011 FORMAT(/' ADDED BAD RECORDS ARE:',95X,'ERC'/(3X,I4,'...',I4, - 1 '...',A,'...',I3)) - NBAD=NBAD+NADD - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: BASNCK CHECKS FOR PROPERLY IDENTIFIED BASINS -C PRGMMR: S. LORD ORG: NP22 DATE: 1992-02-24 -C -C ABSTRACT: INPUT RECORDS ARE CHECKED FOR PROPERLY IDENTIFIED BASINS. -C THE INPUT LATITUDE AND LONGITUDE ARE CHECKED AGAINST -C TABULATED MIN AND MAX LATITUDES AND LONGITUDES FOR THE -C SPECIFIED BASIN. INCONSISTENCIES ARE FLAGGED. -C -C PROGRAM HISTORY LOG: -C 1992-02-19 S. LORD -C -C USAGE: CALL BASNCK(STMIDX,RLTSTM,RLNSTM,NBA,IPRT,IER) -C INPUT ARGUMENT LIST: -C STMIDX - 3 CHARACTER STORM ID. THIRD CHARACTER CARRIES BASIN -C IDENTIFIER -C RLTSTM - STORM LATITUDE (DEGREES) -C RLNSTM - STORM LONGITUDE (DEGREES EAST) -C IPRT - PRINT LEVEL. =1 FOR PRINTOUT; =0 FOR NO PRINTOUT -C -C OUTPUT ARGUMENT LIST: -C NBA - BASIN NUMBER CORRESPONDING TO THE INPUT LAT/LON -C IER - ERROR RETURN CODE: -C 3: STORM IS NOT IN A BASIN DEFINED BY THE TABULATED -C MINIMUM AND MAXIMUM LAT/LON -C 11: BASIN AND BASIN BOUNDARIES DO NOT MATCH. THIS DOES -C NOT NECESSARILY MEAN THERE IS AN ERROR SINCE THE -C STORM COULD HAVE ORIGINATED IN THAT BASIN AND MOVED -C TO ANOTHER -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE.
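As a rough illustration of the check BASNCK's documentation block describes, the following Python sketch tests a position against per-basin lat/lon bounds and rescans all basins on a mismatch (error code 11 if the storm lies in a different basin, 3 if it lies in none). Only three basins are shown, with bounds drawn from the BSLT*/BSLN* tables; the function and dictionary names are hypothetical:

```python
# Hypothetical sketch of the BASNCK consistency check: compare a storm
# position against the claimed basin's tabulated bounds, and rescan all
# basins when it falls outside them.
BASIN_BOUNDS = {  # id: (lat_min, lat_max, lon_min, lon_max), deg E
    "L": (-20.0, 60.0, 260.0, 350.0),   # Atlantic
    "E": (-20.0, 60.0, 220.0, 260.0),   # East Pacific
    "W": (-20.0, 60.0, 100.0, 180.0),   # West Pacific
}

def basin_check(basin_id, lat, lon):
    """Return (basin, ier): ier=0 consistent, 11 inconsistent, 3 no basin."""
    lo, hi, wlo, whi = BASIN_BOUNDS[basin_id]
    if lo <= lat <= hi and wlo <= lon <= whi:
        return basin_id, 0
    # inconsistent with the claimed basin: find where the storm really is
    for bid, (lo, hi, wlo, whi) in BASIN_BOUNDS.items():
        if lo <= lat <= hi and wlo <= lon <= whi:
            return bid, 11
    return basin_id, 3  # not in any defined basin
```

As in BASNCK, a code of 11 is not necessarily an error: the storm may simply have moved out of the basin in which it originated.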
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE BASNCK(STMIDX,RLTSTM,RLNSTM,NBA,IPRT,IER) - - SAVE - - CHARACTER*(*) STMIDX - - PARAMETER (NBASIN=11) - - CHARACTER IDBASN*1 - - DIMENSION IDBASN(NBASIN),BSLTMN(NBASIN),BSLTMX(NBASIN), - 1 BSLNMN(NBASIN),BSLNMX(NBASIN) - - DATA IDBASN/'L','E','C','W','O','T','U','P','S','B','A'/ - -C BASIN BOUNDARIES: MIN AND MAX LATITUDES; MIN AND MAX LONGITUDES -C NOTE: SOME BOUNDARIES MAY OVERLAP, BUT SCANNING IS IN ORDER OF -C DECREASING PRIORITY SO BASINS SHOULD BE CAPTURED PROPERLY - - DATA BSLTMN/3*-20.,2*0.0,20.,3*-50.,2*0.0/, - 1 BSLTMX/4*60.,25.,40.,3*0.0,2*30./, - 2 BSLNMN/260.,220.,180.,2*100.,110.,90.,160.,40.,75.,40./, - 3 BSLNMX/350.,260.,220.,180.,125.,140.,160.,290.,90.,100.,75./ - - - IER=0 - -C RECOVER BASIN NUMBER FROM STORM ID -C WE ASSUME ALL BASIN IDS ARE VALID HERE - - DO NB=1,NBASIN - IF(STMIDX(3:3) .EQ. IDBASN(NB)) THEN - NBA=NB - GO TO 11 - ENDIF - ENDDO - 11 CONTINUE - - IF(RLTSTM .LT. BSLTMN(NBA) .OR. RLTSTM .GT. BSLTMX(NBA) .OR. - 1 RLNSTM .LT. BSLNMN(NBA) .OR. RLNSTM .GT. BSLNMX(NBA)) THEN - IF(IPRT .EQ. 1) WRITE(6,21) STMIDX,NBA,RLTSTM,RLNSTM,BSLTMN(NBA), - 1 BSLTMX(NBA),BSLNMN(NBA),BSLNMX(NBA) - 21 FORMAT(/'******BASIN IDENTIFIER AND LAT/LON ARE INCONSISTENT. A ', - 1 'POSSIBLE ERROR EXISTS OR THE STORM ORIGINATED IN A ', - 2 'DIFFERENT BASIN.'/4X,'STMIDX,NBA,RLTSTM,RLNSTM,BSLTMN(', - 3 'NBA),BSLTMX(NBA),BSLNMN(NBA),BSLNMX(NBA)='/4X,A,I3,6F8.1) - IER=11 - -C IN WHICH BASIN IS THE STORM REALLY LOCATED? - - DO NB=1,NBASIN - IF(RLTSTM .GE. BSLTMN(NB) .AND. RLTSTM .LE. BSLTMX(NB) .AND. - 1 RLNSTM .GE. BSLNMN(NB) .AND. RLNSTM .LE. BSLNMX(NB)) THEN - NBA=NB - RETURN - ENDIF - ENDDO - IER=3 - WRITE(6,51) STMIDX,NBA,RLTSTM,RLNSTM,BSLTMN(NBA), - 1 BSLTMX(NBA),BSLNMN(NBA),BSLNMX(NBA) - 51 FORMAT(/'******STORM=',A,' IS NOT IN A DEFINED BASIN. 
NBA,', - 1 'RLTSTM,RLNSTM,BSLTMN(NBA),BSLTMX(NBA),BSLNMN(NBA),', - 2 'BSLNMX(NBA)='/I3,6F8.1) - ENDIF - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: AKASUB HANDLES STORAGE AND WRITING ALIAS RECORDS -C PRGMMR: S. LORD ORG: NP22 DATE: 1992-03-05 -C -C ABSTRACT: STORES ALIAS RECORDS UNTIL THEY ARE READY TO BE DUMPED TO -C DISK. DUMPING TO DISK INVOLVES FINDING THE ONE RECORD FOR -C EACH STORM THAT HAS THE EARLIEST DATE. COPYING FROM ONE -C UNIT TO ANOTHER ALSO INVOLVES FINDING THE EARLIEST DATE. -C FUNCTIONS ARE PERFORMED BY 3 SEPARATE ENTRIES AS SHOWN -C BELOW. AKASUB IS JUST A DUMMY HEADING. -C -C PROGRAM HISTORY LOG: -C 1992-03-05 S. LORD -C -C USAGE: CALL AKASUB(IUNITI,IUNITO,NAKREC,NAKA,DAYZ,AKANAM,AKRSMC, -C AKSTID) -C CALL AKASAV(NAKREC,NAKA,DAYZ,AKANAM,AKRSMC,AKSTID): STORES -C RECORDS -C CALL AKADMP(IUNITO): DUMPS RECORDS TO DISK -C CALL AKACPY(IUNITI,IUNITO): COPIES RECORDS FROM IUNITI TO -C IUNITO -C INPUT ARGUMENT LIST: -C IUNITI - INPUT UNIT NUMBER. FILE POSITIONING MUST BE HANDLED -C - OUTSIDE THIS ROUTINE. -C IUNITO - OUTPUT UNIT NUMBER. FILE POSITIONING MUST BE HANDLED -C - OUTSIDE THIS ROUTINE. -C NAKREC - RECORD NUMBER, FIRST RECORD IS 1 AND SO ON. -C NAKA - NUMBER OF ALIASES IN EACH RECORD. FIRST ALIAS IS -C - USUALLY A COMBINED OR UNIFIED ALIAS BEGINNING WITH A !. -C DAYZ - FRACTIONAL DAY FOR EACH RECORD -C AKANAM - STORM NAME (CHARACTER*9) -C AKRSMC - ARRAY CONTAINING ALL RSMCS (CHARACTER*4) -C AKSTID - ARRAY CONTAINING ALL STORM IDS (CHARACTER*3) -C -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. 
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE AKASUB(IUNITI,IUNITO,NAKREC,NAKA,DAYZ,AKANAM,AKRSMC, - 1 AKSTID,ICSTNM,ICRSMC,ICSTID,IFAKA) - - PARAMETER (MAXSTM=70) - PARAMETER (NOVRMX=MAXSTM) - PARAMETER (MAXAKA=10) - - SAVE - - DIMENSION NUMSAV(MAXSTM),SAVNAM(MAXSTM),SAVRSM(MAXSTM,MAXAKA), - 1 SAVID(MAXSTM,MAXAKA),SAVDAY(MAXSTM),INDSAM(MAXSTM) - - DIMENSION AKRSMC(NOVRMX),AKSTID(NOVRMX),RSMCCP(MAXAKA), - 1 STIDCP(MAXAKA) - - CHARACTER SAVNAM*9,SAVRSM*4,SAVID*3,STMNMX*9,RSMCCP*4,STIDCP*3 - CHARACTER*(*) AKANAM,AKRSMC,AKSTID,ICSTNM,ICRSMC,ICSTID - - LOGICAL FOUND - -C----------------------------------------------------------------------- -C THIS ENTRY STORES ALIAS ENTRIES - - ENTRY AKASAV(NAKREC,NAKA,DAYZ,AKANAM,AKRSMC,AKSTID) - - WRITE(6,1) NAKREC - 1 FORMAT(/'...ENTERING AKASAV TO STORE RECORD #',I3,'. RECORD IS:') - - NAKSAV=NAKREC - NUMSAV(NAKSAV)=NAKA - SAVNAM(NAKSAV)=AKANAM - SAVDAY(NAKSAV)=DAYZ - - SAVRSM(NAKSAV,1:NAKA)=AKRSMC(1:NAKA) - SAVID (NAKSAV,1:NAKA)=AKSTID(1:NAKA) - WRITE(6,11) NAKA,AKANAM,(AKRSMC(NAL),AKSTID(NAL),NAL=1,NAKA) - 11 FORMAT('...',I1,1X,A9,10(1X,A4,1X,A3)) - - RETURN - -C----------------------------------------------------------------------- -C THIS ENTRY DUMPS ALIAS ENTRIES. ONLY THE EARLIEST ENTRY FOR -C EACH STORM IS SAVED. - - ENTRY AKADMP(IUNITO) - - WRITE(6,21) IUNITO - 21 FORMAT(/'...ENTERING AKADMP TO WRITE EARLIEST UNIQUE ALIAS ', - 1 'RECORDS TO UNIT',I3,'. STORED RECORDS ARE:'/10X,'NAL', - 2 4X,'NAME',12X,'JDAY',5X,'RSMC',2X,'STMID') - DO NAK=1,NAKSAV - WRITE(6,23) NAK,NUMSAV(NAK),SAVNAM(NAK),SAVDAY(NAK), - 1 (SAVRSM(NAK,NS),SAVID(NAK,NS),NS=1,NUMSAV(NAK)) - 23 FORMAT(3X,I3,2X,I3,4X,A,3X,F12.3,10(3X,A)) - ENDDO - - NREC=0 - DO NAK=1,NAKSAV - IF(NUMSAV(NAK) .GT. 
0) THEN - IFND=1 - INDSAM(IFND)=NAK - WRITE(6,27) NAK,IFND,SAVNAM(NAK),SAVDAY(NAK),(SAVRSM(NAK,NSAV), - 1 SAVID(NAK,NSAV),NSAV=1,NUMSAV(NAK)) - 27 FORMAT(/'...LOOKING FOR MATCHING STORM NAMES FOR INDEX=',I3, - 1 ', IFND=',I3,' STORM NAME= ',A,' WITH DAY=',F12.3/4X, - 2 'ALIASES ARE: ',10(A,1X,A,'; ')) - WRITE(6,29) - 29 FORMAT('...IMPORTANT NOTE: ALIAS RECORDS WITH DATE=-999.0 WILL ', - 1 'ALWAYS BE COPIED.') - - DO NSAME=NAK+1,NAKSAV - IF(NUMSAV(NSAME) .GT. 0) THEN - FOUND=.FALSE. - -C SAME STORM NAME IF NOT NAMELESS - - IF(SAVNAM(NAK) .NE. 'NAMELESS' .AND. - 1 SAVNAM(NSAME) .NE. 'NAMELESS' .AND. - 2 SAVNAM(NAK) .EQ. SAVNAM(NSAME)) THEN - FOUND=.TRUE. - -C DIRECT COMPARISON OF STORM IDS FOR THE SAME RSMC - - ELSE - DO NAL2=1,NUMSAV(NAK) - DO NAL1=1,NUMSAV(NSAME) - IF(SAVRSM(NSAME,NAL1) .EQ. SAVRSM(NAK,NAL2) .AND. - 1 SAVID (NSAME,NAL1) .EQ. SAVID (NAK,NAL2)) FOUND=.TRUE. - ENDDO - ENDDO - ENDIF - - IF(FOUND) THEN - NUMSAV(NSAME)=-IABS(NUMSAV(NSAME)) - IFND=IFND+1 - INDSAM(IFND)=NSAME - WRITE(6,59) NSAME,IFND,SAVDAY(NSAME) - 59 FORMAT(/'...STORM NAME FOR INDEX=',I3,' MATCHES. IFND=',I3,' AND', - 1 ' DAY=',F12.3) - ENDIF - ENDIF - ENDDO - -C SINGLE OCCURRENCE - - IF(IFND .EQ. 1) THEN - NW=NAK - DAYMNZ=SAVDAY(NAK) - STMNMX=SAVNAM(NAK) - WRITE(6,61) NW,SAVNAM(NAK),SAVID(NAK,1) - 61 FORMAT('...INDEX',I3,' WITH NAME=',A,' AND ID=',A,' HAS ONLY A ', - 1 'SINGLE OCCURRENCE.') - -C IF THERE ARE MULTIPLE OCCURRENCES, WRITE ONLY THE EARLIEST RECORD, -C BUT SUBSTITUTE IN THE STORM NAME IF IT IS NOT NAMELESS. - - ELSE - WRITE(6,63) SAVNAM(NAK),SAVID(NAK,1) - 63 FORMAT('...STORM NAME=',A,' AND ID=',A,' HAS MULTIPLE ', - 1 'OCCURRENCES. WE LOOK FOR THE FIRST OCCURRENCE.') - DAYMNZ=1.E10 - STMNMX='NAMELESS' - DO IF=1,IFND - IF(STMNMX .EQ. 'NAMELESS' .AND. - 1 SAVNAM(INDSAM(IF)) .NE. 'NAMELESS') - 1 STMNMX=SAVNAM(INDSAM(IF)) - IF(SAVDAY(INDSAM(IF)) .LT. 
DAYMNZ) THEN - DAYMNZ=SAVDAY(INDSAM(IF)) - NW=INDSAM(IF) - ENDIF - ENDDO - ENDIF - -C WRITE THE RECORD - - NREC=NREC+1 - WRITE(IUNITO,81) IABS(NUMSAV(NW)),STMNMX,(SAVRSM(NW,NAL), - 1 SAVID(NW,NAL),NAL=1,IABS(NUMSAV(NW))) - 81 FORMAT(I1,1X,A9,10(1X,A4,1X,A3)) - WRITE(6,83) NREC,DAYMNZ,NW,IUNITO,STMNMX, - 1 IABS(NUMSAV(NW))-1,(SAVRSM(NW,NAL),SAVID(NW,NAL), - 2 NAL=1,IABS(NUMSAV(NW))) - 83 FORMAT('...ADDING NEW ALIAS RECORD ',I3,' WITH DATE=',F12.3, - 1 ' AND INDEX',I3,' TO UNIT ',I3,' FOR STORM NAME=',A,'.'/4X, - 2 'NUMBER OF OBSERVERS IS:',I2,' RSMC, STORM IDS ARE:'/10X, - 3 10(1X,A4,1X,A3)) - - ENDIF - ENDDO - WRITE(6,91) NREC,IUNITO - 91 FORMAT(/'...',I3,' RECORDS HAVE BEEN WRITTEN TO UNIT',I3) - - RETURN - -C----------------------------------------------------------------------- - - ENTRY AKACPY(IUNITI,IUNITO) - - NCPYAL=0 - WRITE(6,101) IUNITI,IUNITO - 101 FORMAT(/'...ENTERING AKACPY TO COPY ALIAS RECORDS FROM IUNITI=', - 1 I3,' TO IUNITO=',I3,':') - - 110 READ(IUNITI,81,END=180) NALMX,STMNMX,(RSMCCP(NAL),STIDCP(NAL), - 1 NAL=1,NALMX) - - DO NALZ=1,NAKSAV - FOUND=.FALSE. - -C SAME STORM NAME IF NOT NAMELESS - - IF(STMNMX .NE. 'NAMELESS' .AND. - 1 SAVNAM(NALZ) .NE. 'NAMELESS' .AND. - 2 STMNMX .EQ. SAVNAM(NALZ)) THEN - FOUND=.TRUE. - GO TO 171 - -C DIRECT COMPARISON OF STORM IDS FOR THE SAME RSMC - - ELSE - DO NAL2=1,NALMX - DO NAL1=1,NUMSAV(NALZ) - IF(SAVRSM(NALZ,NAL1) .EQ. RSMCCP(NAL2) .AND. - 1 SAVID (NALZ,NAL1) .EQ. STIDCP(NAL2)) FOUND=.TRUE. - ENDDO - ENDDO - ENDIF - - ENDDO - 171 CONTINUE - - IF(.NOT. FOUND) THEN - NCPYAL=NCPYAL+1 - WRITE(IUNITO,81) NALMX,STMNMX,(RSMCCP(NAL),STIDCP(NAL), - 1 NAL=1,NALMX) - WRITE(6,175) NALMX,STMNMX,(RSMCCP(NAL),STIDCP(NAL), - 1 NAL=1,NALMX) - 175 FORMAT('...',I1,1X,A9,10(1X,A4,1X,A3)) - - ELSE - WRITE(6,177) STMNMX - 177 FORMAT('...STORM ',A,' IS ALREADY IN OUTPUT ALIAS FILE. 
IT WILL ', - 1 'NOT BE COPIED.') - ENDIF - - GO TO 110 - - 180 CONTINUE - WRITE(6,181) NCPYAL,IUNITI,IUNITO - 181 FORMAT('...',I3,' RECORDS COPIED FROM UNIT',I3,' TO UNIT ',I3,'.') - - RETURN - -C----------------------------------------------------------------------- - - ENTRY AKAFND(IUNITI,ICSTNM,ICRSMC,ICSTID,NAKA,AKANAM,AKRSMC, - 1 AKSTID,IFAKA) - - ifaka=0 - irec=0 - rewind iuniti - 210 read(iuniti,81,end=240) nalmx,stmnmx,(rsmccp(nal),stidcp(nal), - 1 nal=1,min(nalmx,maxaka)) - irec=irec+1 - do nal=1,nalmx - if(icrsmc .eq. rsmccp(nal) .and. - 1 icstid .eq. stidcp(nal)) then - ifaka=irec - go to 240 - endif - enddo - go to 210 - 240 continue - - if(ifaka .gt. 0) then - - if(nalmx .gt. naka) then - write(6,241) nalmx,naka - 241 format('******Insufficient storage to return aliases. nalmx,', - 1 'naka=',2i5,' Abort.') - call abort1(' AKAFND',241) - endif - - naka=nalmx - akanam=stmnmx - akrsmc(1:nalmx)=rsmccp(1:nalmx) - akstid(1:nalmx)=stidcp(1:nalmx) -c write(6,251) naka,ifaka,icstnm,icrsmc,icstid,akanam, -c 1 (akrsmc(nal),akstid(nal),nal=1,naka) -c 251 format('...akafnd results: # of aliases=',i4,' matching alias ', -c 1 'record #=',i4,' input storm name,rsmc,id=',3(a,1x)/4x, -c 2 'matched name,rsmc,id=',a/(4x,10(1x,a4,1x,a3))) - - else -c write(6,271) icstnm,icrsmc,icstid -c 271 format('###Storm not found in akavit file, storm name,rsmc,', -c 1 'id are:',3(a,1x)) - endif - return - -C----------------------------------------------------------------------- - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: TCCLIM TROPICAL CYCLONE CLIMATOLOGICAL VALUES -C PRGMMR: S. LORD ORG: NP22 DATE: 1992-04-07 -C -C ABSTRACT: RETURNS CLIMATOLOGICAL VALUES FOR SOME TROPICAL CYCLONE -C PROPERTIES. PROPERTIES ARE: CENTRAL PRESSURE OF STORM; -C ENVIRONMENTAL PRESSURE ON THE OUTERMOST CLOSED ISOBAR; -C RADIUS OF THE OUTERMOST CLOSED ISOBAR. A SECOND ENTRY -C CONTAINS PRESSURE-WIND TABLES FOR THE ATLANTIC, EAST AND -C CENTRAL PACIFIC, AND WEST PACIFIC BASINS.
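The pressure-wind lookup mentioned in the abstract amounts to bracketing the central pressure in a descending pressure table and interpolating the maximum wind linearly between the bracketing entries, as the TCPWTB entry does. A Python sketch of that interpolation, using a small hypothetical subset of the table values rather than the full PRTABL/VMTABL data:

```python
# Sketch of a pressure-wind table lookup: bracket the central pressure
# in a descending table and interpolate the wind linearly.  Values are
# an illustrative subset, not the operational tables.
PRES = [1020.0, 987.0, 979.0, 970.0]   # mb, descending
WIND = [12.5, 33.5, 39.7, 46.4]        # m/s at those pressures

def pres_to_wind(p):
    """Linearly interpolate max wind (m/s) from central pressure (mb)."""
    if p >= PRES[0]:
        return WIND[0]  # weaker than the table's weakest entry
    for i in range(1, len(PRES)):
        if PRES[i] < p <= PRES[i - 1]:
            frac = (p - PRES[i - 1]) / (PRES[i] - PRES[i - 1])
            return WIND[i - 1] + frac * (WIND[i] - WIND[i - 1])
    return WIND[-1]  # deeper than the table: clamp to the strongest entry
```

The piecewise-linear form means a pressure exactly on a table entry returns that entry's wind, and intermediate pressures vary smoothly between neighbors.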
-C -C PROGRAM HISTORY LOG: -C 1992-04-07 S. LORD -C 1992-09-04 S. LORD ADDED PRESSURE WIND RELATIONSHIP -C -C USAGE: VALUE=TCCLIM(IVAR,IBASN) OR VALUE=TCPWTB(PRES,IBASN) -C INPUT ARGUMENT LIST: -C IVAR - VARIABLE NUMBER (7: CENTRAL PRESSURE) -C - (8: ENVIRONMENTAL PRESSURE) -C - (9: RADIUS OF OUTERMOST CLOSED ISOBAR) -C IBASN - BASIN NUMBER -C PRES - PRESSURE IN MB -C -C -C REMARKS: IVAR VALUES OF 7,8,9 ONLY ARE ALLOWED. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION TCCLIM(IVAR,IBASN) - - PARAMETER (NPRMAX=9) - - PARAMETER (NBASIN=11) - PARAMETER (ISECVR= 5,ITERVR=10) - PARAMETER (NSECVR=ITERVR-ISECVR) - - DIMENSION SECVCL(NBASIN,NSECVR-2),PRTABL(NBASIN,0:NPRMAX+1), - 1 VMTABL(NBASIN,0:NPRMAX+1) - - DATA SECVCL/3*940.0,3*930.0,2*970.0,3*960.0, - 1 3*1010.0,5*1008.0,3*1010.0, - 2 6*400.0,5*300.0/ - - DATA PRTABL/2*1020.,9*1020., 2*987.,9*976., - 2 2*979.,9*966., 2*970.,9*954., - 2 2*960.,9*941., 2*948.,9*927., - 3 2*935.,9*914., 2*921.,9*898., - 4 2*906.,9*879., 2*890.,9*858., - 5 2*850.,9*850./ - - DATA VMTABL/11*12.5,11*33.5,11*39.7,11*46.4,11*52.6,11*59.3, - 1 11*65.5,11*72.2,11*80.0,11*87.6,11*110./ - - ITABL=IVAR-(ISECVR+2)+1 - TCCLIM=SECVCL(IBASN,ITABL) - - RETURN - -C----------------------------------------------------------------------- - - ENTRY TCPWTB(PRESR,IBASN) - - DO IPR=1,NPRMAX - IF(PRESR .LE. PRTABL(IBASN,IPR-1) .AND. - 1 PRESR .GT. PRTABL(IBASN,IPR)) THEN - IPRZ=IPR - GO TO 11 - ENDIF - ENDDO - IPRZ=NPRMAX+1 - 11 CONTINUE - TCPWTB=VMTABL(IBASN,IPRZ-1)+ - 1 (VMTABL(IBASN,IPRZ)-VMTABL(IBASN,IPRZ-1))* - 2 (PRESR-PRTABL(IBASN,IPRZ-1))/ - 3 (PRTABL(IBASN,IPRZ)-PRTABL(IBASN,IPRZ-1)) - - RETURN - -C----------------------------------------------------------------------- - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: RCNCIL MANAGES STORM CATALOG -C PRGMMR: S. 
LORD ORG: NP22 DATE: 1993-03-05 -C -C ABSTRACT: STORM RECORDS ARE CHECKED FOR PRESENCE IN THE STORM -C CATALOG, AND UPDATED OR ADDED IF NECESSARY. -C -C PROGRAM HISTORY LOG: -C 1992-03-25 S. LORD -C 1992-08-25 S. LORD ADDED IER RETURN CODE -C -C USAGE: CALL RCNCIL(IUNTCA,IUNTCN,IUNTAL,NTEST,NOKAY,NBAD,MAXREC, -C MAXCKS,IEFAIL,IER,IECAT,NUMTST,NUMOKA,NUMBAD, -C TSTREC,BADREC,OKAREC) -C INPUT ARGUMENT LIST: -C IUNTCA - UNIT NUMBER FOR THE STORM CATALOG. -C -C IUNTCN - UNIT NUMBER FOR THE TEMPORARY CATALOG -C -C IUNTAL - UNIT NUMBER FOR ALIAS FILE. -C NTEST - NUMBER OF CURRENT RECORDS TO BE TESTED. -C MAXREC - MAXIMUM NUMBER OF RECORDS (STORAGE FOR ARRAYS) -C MAXCKS - MAXIMUM NUMBER OF ERROR CHECKS (STORAGE FOR ARRAYS) -C IEFAIL - ARRAY CONTAINING ERROR CODES FOR ERROR CHECKS -C NUMTST - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE TESTED. -C IOVRLP - SCRATCH ARRAY. -C TSTREC - CHARACTER ARRAY CONTAINING RECORDS TO BE TESTED. -C -C OUTPUT ARGUMENT LIST: -C NOKAY - NUMBER OF RECORDS THAT PASSED THE RSMC CHECK. -C NBAD - NUMBER OF RECORDS THAT FAILED THE RSMC CHECK. -C IER - ERROR RETURN CODE. 0 EXCEPT IF LOGICAL INCONSISTENCY -C FOUND. -C IECAT - INTEGER ARRAY CONTAINING ERROR CODE FOR EACH INPUT -C - RECORD. SEE COMMENTS IN PGM FOR KEY TO ERROR CODES. -C NUMOKA - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH GOOD -C - RECORD. -C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C BADREC - CHARACTER ARRAY CONTAINING BAD RECORDS THAT FAILED -C - THE RSMC CHECK. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT PASSED -C - THE RSMC CHECK.
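The IECAT codes RCNCIL returns ("see comments in pgm for key to error codes") are packed integers: a catalog action code and the matching catalog record number are combined with IPACK = 10*MAXREC as the base, so both pieces are recoverable by integer division. A small Python sketch of the packing arithmetic (`pack`/`unpack` are hypothetical helper names, and MAXREC's value is illustrative):

```python
# Sketch of RCNCIL's packed catalog codes: iecat = code*IPACK + ncat,
# with IPACK = 10*MAXREC so the catalog record number never overflows
# into the code digits.
MAXREC = 1000          # illustrative; must exceed any record number
IPACK = 10 * MAXREC

def pack(code, ncat):
    """Combine a catalog action code with a catalog record number."""
    return code * IPACK + ncat

def unpack(iecat):
    """Recover (code, record number) by integer division."""
    code = iecat // IPACK
    return code, iecat - code * IPACK
```

This mirrors the Fortran expressions `ietyp=iecat(nrec)/ipack` and `ircat=iecat(nrec)-ietyp*ipack` used in the second catalog pass.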
-C -C INPUT FILES: -C UNIT 25 - ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C - FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C - DCB: LRECL=255, BLKSIZE=23400, RECFM=VB -C UNIT 26 - NEW ALIAS FILE CONTAINING EQUIVALENT STORM IDS -C - FOR STORMS THAT HAVE BEEN REPORTED BY MULTIPLE RSMC'S -C UNIT 27 - STORM CATALOG FILE -C - DCB: LRECL=255, BLKSIZE=23400, RECFM=VB -C UNIT 28 - SCRATCH STORM CATALOG FILE -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 27 - SAME AS ABOVE -C UNIT 28 - SAME AS ABOVE -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE RCNCIL(IUNTCA,IUNTCN,IUNTAL,NTEST,NOKAY,NBAD,MAXREC, - 1 MAXCKS,IEFAIL,IER,IECAT,NUMTST,NUMOKA,NUMBAD, - 2 TSTREC,BADREC,OKAREC) - - PARAMETER (NERCRC=3) - PARAMETER (MAXSTM=70) - PARAMETER (NOVRMX=MAXSTM) - PARAMETER (NADDMX=10) - - CHARACTER*(*) TSTREC(0:NTEST),BADREC(MAXREC),OKAREC(NOKAY), - 1 ERCRCN(NERCRC)*60 - character stnmal*9,stidal*3,rsmcal*4,stnmca*9,stidca*3,rsmcca*4, - 1 stidad*3,rsmcad*4 - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - PARAMETER (NBASIN=11) - PARAMETER (NRSMCX=4) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,RELOCZ*1,IDBASN*1, - 2 RSMCID*4,RSMCAP*1 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION BUFIN(MAXCHR),IDBASN(NBASIN), - 1 FMTVIT(MAXVIT),RSMCID(NRSMCX),RSMCAP(NRSMCX) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - DIMENSION IVTVRX(MAXVIT) - - DIMENSION RINC(5) - - CHARACTER BUFCK(MAXCHR)*1,RSMCX*4,RELOCX*1,STMIDX*3,BUFINX*100, - 1 STMNMX*9,LATNSX*1,LONEWX*1 - - DIMENSION IEFAIL(MAXREC,0:MAXCKS),IECAT(MAXREC),NUMOKA(NOKAY), - 1 NUMBAD(MAXREC),NUMTST(NTEST),MAXNO(NBASIN) - - dimension rsmcal(novrmx),stidal(novrmx), - 1 
rsmcca(novrmx),stidca(novrmx), - 2 rsmcad(naddmx),stidad(naddmx) - - EQUIVALENCE (BUFCK(1),RSMCX),(BUFCK(5),RELOCX),(BUFCK(6),STMIDX), - 1 (BUFCK(1),BUFINX),(BUFCK(10),STMNMX), - 2 (BUFCK(35),LATNSX),(BUFCK(41),LONEWX) - - EQUIVALENCE (IVTVRX(1),IDATEX),(IVTVRX(2),IUTCX) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 3 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA IDBASN/'L','E','C','W','O','T','U','P','S','B','A'/ - - DATA RSMCID/'NHC ','JTWC','ADRM','JMA '/, - 1 RSMCAP/'N','W','A','J'/ - - data maxno/nbasin*0/,minday/-1/,maxday/1/ - - DATA ERCRCN - 1 /'10: NEW STORM, ADD TO CATALOG ', - 2 '20: DUP. STORM ID IN CATALOG. CREATE NEW ID, APPEND CATALOG ', - 3 '30: STORM FOUND IN CATALOG, UPDATE CATALOG ENTRY '/ - - write(6,1) nokay - 1 format(//'...Entering rcncil to reconcile catalog, alias file ', - 1 'and new records. Number of okay records=',i4/4x,'Codes', - 2 ' are:'/10x,'1: No catalog entry'/13x,'Action: Append ', - 3 'catalog (first time appearance), record unchanged'/10x, - 4 '2: Duplicate storm id to primary catalog id'/13x, - 5 'Action: Find new, unique id which is one more than the', - 6 'largest id for that basin, modify record, append to ', - 7 'catalog'/10x,'3: Storm found in catalog,'/13x,'Action:', - 8 'update catalog entry') - rewind iuntca - rewind iuntcn - ncat=0 - ipack=10*maxrec - nadd=0 - ier=0 - - write(6,3) - 3 format(/'...Input records are:') - - do iec=1,ntest - iecat(iec)=ipack - write(6,5) iec,numtst(iec),tstrec(iec) - 5 format('...',i4,'...',i5,'...',a) - - enddo - - call sclist(iuntca) - call aklist(iuntal) - -c First pass through catalog to determine what should be done - - 20 continue - READ(IUNTCA,21,END=90) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - 21 FORMAT(I1,1X,A9,2(1X,I8,1X,I4.4),10(1X,A4,1X,A3)) - ncat=ncat+1 - -c Determine maximum 
storm id in each basin from the catalog - - read(stidca(1)(1:2),23) idno - 23 format(i2) - do nb=1,nbasin - if(stidca(1)(3:3) .eq. idbasn(nb)) then - maxno(nb)=max0(maxno(nb),idno) - go to 31 - endif - enddo - 31 continue - -c Determine the catalog code for each record -c Codes and actions are: - -c Code 1: No catalog entry -c Action: Append catalog (first time appearance), record unchanged - -c Code 2: Duplicate storm id to primary catalog id, storm not -c found in catalog -c Action: Find new, unique id which is one more than the largest -c id for that basin, modify record, append to catalog - -c Code 3: Storm found in catalog -c Action: Update catalog date and other entries if necessary - -c Notes: codes from 1-3 are in order of increasing priority so that -c a code of 2 can be overridden by a code of 3 -c A final check on the consistency between the catalog and the alias -c (akavit) file is made. Any inconsistency is resolved in favor of the -c catalog but is flagged by a positive error code even though the -c record is retained. - -c Codes are packed so that the appropriate record number in the -c catalog is recoverable. Packing depends on maxrec, which -c should be a 4 digit number (1000 should work fine). - - do 80 nrec=1,ntest - -c Look at okay records and bad records with overland error codes. -c An error code for the rsmcck of 22 forces a look at the -c alias file since an entry has been made already. - - if(nrec .le. nokay .or. - 1 (nrec .gt. nokay .and. (iefail(numtst(nrec),4) .eq. 5 .or. - 2 iefail(numtst(nrec),4) .eq. 6 .or. - 3 iefail(numtst(nrec),6) .eq. 22))) then - - bufinz=tstrec(nrec) - - if(rsmcz(1:1) .ne. '!' .and. iefail(numtst(nrec),6) .ne.
22) - 1 then - nalsav=1 - stnmal=stmnmz - rsmcal(1)=rsmcz - stidal(1)=stmidz - - else -c write(6,35) nrec,stmnmz,rsmcz,stmidz -c 35 format('...Calling akafnd for record',i4,' with storm name,', -c 1 'rsmc,id=',3(a,1x),' to find all aliases.') - nalsav=novrmx - call akafnd(iuntal,stmnmz,rsmcz,stmidz,nalsav,stnmal,rsmcal, - 1 stidal,ifnd) - - if(ifnd .eq. 0) then - write(6,37) stmnmz,stmidz,rsmcz - 37 format('******Bang or overlapped storm not found in akavit file ', - 1 'when finding aliases. stmnmz,stmidz,rsmcz=',3(1x,a), - 2 ' abort') -c call abort1(' RCNCIL',37) - endif - - endif - - do nal=1,nalsav - -c Code 3: - -c if the record is nameless the entire storm id and rsmc -c must match - - IF(STMNMZ .NE. 'NAMELESS') THEN - - if(stnmca .eq. stnmal .and. - 1 stidca(1)(3:3) .eq. stidal(nal)(3:3)) then - iecat(nrec)=3*ipack+ncat - write(6,43) nrec,stnmal,stidal(nal),rsmcal(nal),iecat(nrec) - 43 format('...For nrec=',i5,' storm named=',a,' with id,rsmc=', - 1 2(a,1x),' is in catalog, iecat=',i6) - go to 80 - endif - ENDIF - - do nca=1,nalca - if(rsmcal(nal) .eq. rsmcca(nca) .and. - 1 stidal(nal) .eq. stidca(nca)) then - iecat(nrec)=3*ipack+ncat - write(6,47) nrec,nca,stnmal,stidal(nal),rsmcal(nal),iecat(nrec) - 47 format('...For nrec,nca=',2i5,' storm named=',a,' with id,rsmc=', - 1 2(a,1x),' is in catalog, iecat=',i6) - go to 80 - endif - enddo - enddo - - -c Code 2: now there is no exact match to the catalog - make sure there -c won't be a duplicate storm id - -c Possibilities are: -c 1) If both record and catalog are bang, RSMCCK may have changed the -c rsmc (e.g. added a new observing rsmc). We assume the storm is -c in the catalog (code 3). -c 2) If the catalog is a bang, and the record is not, the record is -c a new storm (code 2) or the record has been processed by rsmcck -c but not yet by rcncil. Check the AKAVIT file and adjust the -c code accordingly. -c 3) Neither record nor catalog entry is a bang (code 2). - - if(stmidz .eq.
stidca(1)) then - - if(rsmcz(1:1) .eq. '!' .and. - 1 rsmcca(1)(1:1) .eq. '!') then - iecatz=3 - write(6,71) nrec,stmidz,ncat,rsmcz,rsmcca(1) - 71 format(/'...For nrec=',i5,' only storm id=',a,' matches catalog ', - 1 'entry',i5,'. Record and catalog rsmcs are both bang:', - 2 2(1x,a)/4x,'###This case should never happen!') - - else if(rsmcz(1:1) .ne. '!' .and. - 1 rsmcca(1)(1:1) .eq. '!') then - - write(6,73) nrec,stmidz,rsmcz,rsmcca(1),stmnmz,rsmcz,stmidz - - 73 format('...For nrec=',i5,' only storm id=',a,' matches catalog ', - 1 'entry.'/4x,'...Record rsmc (',a,') is not bang but ', - 2 'catalog rsmc is (',a,').'/4x,'...Calling akafnd with ', - 3 'storm name, rsmc, id=',3(a,1x),' to find all aliases.') - - nalsav=novrmx - call akafnd(iuntal,stmnmz,rsmcz,stmidz,nalsav,stnmal,rsmcal, - 1 stidal,ifnd) - if(ifnd .eq. 1) then - write(6,75) - 75 format(3x,'...Record found in alias file. Code 3 assigned.') - iecatz=3 - - else - write(6,77) - 77 format(3x,'...Record not found in alias file. Code 2 retained.') - iecatz=2 - endif - - else - iecatz=2 - write(6,79) nrec,stmidz,ncat,rsmcz,rsmcca(1) - 79 format(/'...For nrec=',i5,' only storm id=',a,' matches catalog ', - 1 'entry',i5,'. Rsmcs are:',2(1x,a)/4x,' ###Probable new ', - 2 'storm with a duplicate storm id') - endif - - iecat(nrec)=max0(iecat(nrec)/ipack,iecatz)*ipack+ncat - endif - - endif - 80 continue - -c Write to the scratch catalog - - WRITE(IUNTCN,21) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - go to 20 - 90 continue - - if(ncat .eq. 0) then - write(6,91) - 91 format(/'...There are no catalog entries. All input records will', - 1 ' be assigned code 1.') - iecat(1:ntest)=ipack - - endif - - write(6,131) - 131 format('...Summary of catalog codes for first scan:') - do nrec=1,ntest - if(nrec .le. nokay .or. - 1 (nrec .gt. nokay .and. (iefail(numtst(nrec),4) .eq. 5 .or. - 2 iefail(numtst(nrec),4) .eq. 6 .or. - 3 iefail(numtst(nrec),6) .eq. 
22))) then - write(6,133) nrec,iecat(nrec),tstrec(nrec) - 133 format(4x,2i6,1x,'...',a,'...') - if(iabs(iefail(numtst(nrec),5)) .le. 9) then - iefail(numtst(nrec),5)=-(iabs(iefail(numtst(nrec),5))+ - 1 iabs(iecat(nrec))/ipack*10) - endif - endif - enddo - write(6,143) (nb,idbasn(nb),maxno(nb),nb=1,nbasin) - 143 format('...Summary of maximum storm ids for each basin:'/(4x,i3, - 1 1x,a,i4)) - -c Second pass: copy back from the scratch catalog and update -c each entry as needed - - rewind iuntca - rewind iuntcn - ncat=0 - - 201 continue - READ(IUNTCN,21,END=300) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - ncat=ncat+1 - -c *********************** -c **** Code 3 errors **** -c *********************** - - do nrec=1,ntest - - if(nrec .le. nokay .or. - 1 (nrec .gt. nokay .and. (iefail(numtst(nrec),4) .eq. 5 .or. - 2 iefail(numtst(nrec),4) .eq. 6 .or. - 3 iefail(numtst(nrec),6) .eq. 22))) then - - bufinz=tstrec(nrec) - ietyp=iecat(nrec)/ipack - ircat=iecat(nrec)-ietyp*ipack - - if(ircat .eq. ncat .and. ietyp .eq. 3) then - - write(6,213) nrec,bufinz,NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - 213 format(/'...Preparing to reconcile code 3 errors for nrec=',i3, - 1 ' record, catalog entry are:'/4x,a,'...'/4x,i1,1x,a9,2(1x, - 2 i8,1x,i4.4),10(1x,a4,1x,a3)) - - IF(STMNMZ .NE. 'NAMELESS' .AND. STNMCA .EQ. 'NAMELESS') THEN - write(6,217) stnmca,ncat,stmnmz,nrec - 217 format('...',a,' storm with catalog entry=',i4,' will have name=', - 1 a,' assigned, nrec=',i4) - STNMCA=STMNMZ - ENDIF - - do iv=1,2 - call decvar(istvar(iv),ienvar(iv),ivtvar(iv),ierdec,fmtvit(iv), - 1 bufinz) - enddo - - call mnmxda(iymdmn,iutcmn,idatez,iutcz,dayz,minday) - call mnmxda(iymdmx,iutcmx,idatez,iutcz,dayz,maxday) - daysav=dayz - ilate=nrec - -c Do all records identified as the same storm - - do nchk=nrec+1,ntest - - if(nchk .le. nokay .or. - 1 (nchk .gt. nokay .and. 
(iefail(numtst(nchk),4) .eq. 5 .or. - 2 iefail(numtst(nchk),4) .eq. 6 .or. - 3 iefail(numtst(nchk),6) .eq. 22))) then - - bufinx=tstrec(nchk) - ietypx=iecat(nchk)/ipack - ircatx=iecat(nchk)-ietyp*ipack - - if(ircatx .eq. ncat .and. ietypx .eq. 3) then - - IF(STMNMX .NE. 'NAMELESS' .AND. STNMCA .EQ. 'NAMELESS') THEN - write(6,227) stnmca,ncat,stmnmx,nchk - 227 format('...',a,' storm with catalog entry=',i4,' will have name=', - 1 a,' assigned, nchk=',i4) - STNMCA=STMNMX - ENDIF - - do iv=1,2 - call decvar(istvar(iv),ienvar(iv),ivtvrx(iv),ierdec,fmtvit(iv), - 1 bufinx) - enddo - -c write(6,231) nchk,iymdmn,iutcmn,idatex,iutcx,bufinx -c 231 format('...calling mnmxda with nchk,iymdmn,iutcmn,idatex,iutcx,' -c 1 'bufinx=',i4,i9,i6,i7,i6/4x,a) - call mnmxda(iymdmn,iutcmn,idatex,iutcx,dayz,minday) - call mnmxda(iymdmx,iutcmx,idatex,iutcx,dayz,maxday) - if(dayz .gt. daysav) then - daysav=dayz - ilate=nchk - endif - - iecat(nchk)=-iabs(iecat(nchk)) - endif - endif - enddo - -c Look in akavit for the storm. If it is there, extract -c latest pertinent information that will be transferred to the -c storm catalog - - write(6,243) ilate,stmnmz,rsmcz,stmidz - 243 format('...Look in akavit for appropriate information. Latest ', - 1 'record has index=',i5,' storm name,rsmc,id=',3(a,1x)) - - nalsav=novrmx - call akafnd(iuntal,stmnmz,rsmcz,stmidz,nalsav,stnmca,rsmcal, - 1 stidal,ifnd) - - if(ifnd .eq. 0) then - if(rsmcz(1:1) .eq. '!') then - write(6,271) stmnmz,stmidz,rsmcz - 271 format('******Storm not found in akavit file. stmnmz,stmidz,', - 1 'rsmcz=',3(1x,a),' abort') - call abort1(' RCNCIL',271) - - else - write(6,273) ilate - 273 format('...Storm is not multiply observed. We copy the latest ', - 1 'record (#',i5,') to get the latest information.') - bufinx=tstrec(ilate) - nalca=1 - rsmcca(1)=rsmcx - stidca(1)=stmidx - if(stmnmx .ne. 'NAMELESS') stnmca=stmnmx - endif - - else - write(6,277) - 277 format('...Storm is multiply observed. 
We copy the alias record ', - 1 'to get the latest information.') - -c Do not copy the storm id if there is already a catalog entry - - nalca=nalsav - rsmcca(1)=rsmcal(1) - rsmcca(2:nalca)=rsmcal(2:nalca) - stidca(2:nalca)=stidal(2:nalca) - endif - - iecat(nrec)=-iabs(iecat(nrec)) - - endif - endif - enddo - -c write to the updated catalog - - WRITE(IUNTCA,21) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - WRITE(6,293) NCAT,NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - 293 format(/'...CATALOG RECORD ',I3,' WRITTEN. RECORD IS:',I1,1X,A9, - 1 2(1X,I8,1X,I4.4),10(1X,A4,1X,A3)) - go to 201 - - 300 continue - -c **************************** -c **** Code 1 or 2 errors **** -c **************************** - -c Add new storms to the catalog or storms that have duplicate -c ids - - nadcat=0 -c** naladd=0 - do nrec=1,ntest - - if(nrec .le. nokay .or. - 1 (nrec .gt. nokay .and. (iefail(numtst(nrec),4) .eq. 5 .or. - 2 iefail(numtst(nrec),4) .eq. 6 .or. - 3 iefail(numtst(nrec),6) .eq. 22))) then - - bufinz=tstrec(nrec) - ietyp=iecat(nrec)/ipack - - if(ietyp .eq. 1 .or. ietyp .eq. 2) then - write(6,303) nrec,ietyp,bufinz - 303 format(//'...Ready to add new storm to catalog. nrec,ietyp,', - 1 'record are:',2i4/4x,a) - -c Default entry for catalog is a copy of the candidate record or the -c entry from the alias (akavit) file. These entries may be -c updated by records with a later date, entries from the -c alias file, and the need to create a new, unique storm id. - - if(rsmcz(1:1) .ne. '!') then - nalca=1 - stnmca=stmnmz - rsmcca(1)=rsmcz - stidca(1)=stmidz - - else - write(6,305) nrec,stmnmz,rsmcz,stmidz - 305 format('...Calling akafnd for record',i4,' with storm name,', - 1 'rsmc,id=',3(a,1x),' to produce default catalog entries.') - nalsav=novrmx - call akafnd(iuntal,stmnmz,rsmcz,stmidz,nalsav,stnmca,rsmcca, - 1 stidca,ifnd) - nalca=nalsav - - if(ifnd .eq. 
0) then - write(6,307) stmnmz,stmidz,rsmcz - 307 format('******Storm not found in akavit file. stmnmz,stmidz,', - 1 'rsmcz=',3(1x,a),' abort') - call abort1(' RCNCIL',307) - endif - endif - - read(stmidz(1:2),23) idno - do nb=1,nbasin - if(stmidz(3:3) .eq. idbasn(nb)) then - nbasav=nb - go to 311 - endif - enddo - 311 continue - - istidn=0 - if(idno .le. maxno(nbasav)) then - istidn=1 - write(6,313) idno,maxno(nbasav) - 313 format('###Storm id number=',i3,' is not larger than catalog ', - 1 'maximum. A new number and storm id must be created=',i4) - endif - - do naddc=1,nadcat - if(stmidz .eq. stidad(naddc)) then - istidn=1 - write(6,315) stmidz - 315 format('...Current storm id has already been added to catalog. A', - 1 ' unique one must be created.') - endif - enddo - -c Create added storm id and rsmc in advance to guarantee uniqueness -c or transfer new storm id to the catalog record. -c istidn=0 : no uniqueness problem has been detected -c istidn=1 : uniqueness problem detected and new id will -c be created -c The new id will be transferred to all records. It must be a bang -c record with only one observing rsmc. It must also be entered int -c the alias file. - - istidn=0 ! Qingfu added to skip the changes of storm ID number - - if(istidn .eq. 1) then - - if(rsmcz(1:1) .eq. '!') then - write(6,331) stmidz,rsmcz,bufinz - 331 format('###Storm with id, rsmc=',2(a,1x),'is a duplicate to a ', - 1 'catalog entry as well as being a bang storm. Record is:'/ - 2 4x,a) - write(6,333) - 333 format('******This problem is not yet coded. Abort') - call abort1(' rcncil',333) - - else - idnomx=-1 - do naddc=1,nadcat - read(stidad(naddc)(1:2),23) idno - if(stidad(naddc)(3:3) .eq. idbasn(nbasav)) - 1 idnomx=max0(idnomx,idno) - enddo - stidad(nadcat+1)(3:3)=idbasn(nbasav) - - if(idnomx .ge. 
0) then - write(stidad(nadcat+1)(1:2),3401) idnomx+1 - 3401 format(i2.2) - write(6,341) idbasn(nbasav),stidad(nadcat+1) - 341 format('...Previous storms have been added for basin ',a,' storm', - 1 ' id set to one more than the maximum already added to ', - 2 'the catalog=',a) - else - write(stidad(nadcat+1)(1:2),3401) maxno(nbasav)+1 - write(6,343) idbasn(nbasav),stidad(nadcat+1) - 343 format('...No previous storms added for basin ',a,'. Storm id ', - 1 'set to one more than the maximum already in the catalog=', - 2 a) - endif - -c Create a bang record with one observing rsmc - -c** naladd=naladd+1 - do nrsz=1,nrsmcx - if(rsmcid(nrsz) .eq. rsmcz) then - nrsmc=nrsz - go to 351 - endif - enddo - 351 continue - nalca=2 - rsmcad(nadcat+1)='!'//rsmcap(nrsmc) - stidca(1)=stidad(nadcat+1) - rsmcca(1)=rsmcad(nadcat+1) - stidca(2)=stmidz - rsmcca(2)=rsmcz -c** write(6,355) naladd,(stidca(nca),rsmcca(nca),nca=1,nalca) - write(6,355) nadcat+1,(stidca(nca),rsmcca(nca),nca=1,nalca) - 355 format('...New bang storm (#',i2,') created with unique id. Id, ', - 1 'rsmc are:'/(4x,2(a,3x))) -c** call akasav(naladd,nalca,dayz,stmnmz,rsmcca,stidca) - - endif - - endif - - do iv=1,2 - call decvar(istvar(iv),ienvar(iv),ivtvar(iv),ierdec,fmtvit(iv), - 1 bufinz) - enddo - idatmn=idatez - iutcmn=iutcz - idatmx=idatez - iutcmx=iutcz - call ztime(idatez,iutcz,iyr,imo,ida,ihr,imin) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - call flday(jdy,ihr,imin,daysav) - ilate=nrec - -C####################################################################### - -c Do all records identified as the same storm - - do nchk=nrec+1,ntest - -C----------------------------------------------------------------------- - if(nchk .le. nokay .or. - 1 (nchk .gt. nokay .and. (iefail(numtst(nchk),4) .eq. 5 .or. - 2 iefail(numtst(nchk),4) .eq. 6 .or. - 3 iefail(numtst(nchk),6) .eq. 
22))) then - - imatch=0 - - bufinx=tstrec(nchk) - ietypx=iecat(nchk)/ipack - -C....................................................................... - if(ietypx .eq. 1 .or. ietypx .eq. 2) then - - ifnd=0 - -c Storms are obviously the same - - if(stmidz .eq. stmidx .and. rsmcz .eq. rsmcx) then - write(6,371) nchk,nrec,nrec,bufinz,nchk,bufinx - 371 format('...Record',i5,' has the same storm id and rsmc as the ', - 1 'candidate record (#',i5,'). Records are:'/4x,i4,1x,a/4x, - 2 i4,1x,a) - ifnd=-1 - -c Last resort: look in akavit for the storm - - else - write(6,373) nchk,stmnmx,rsmcx,stmidx - 373 format('...calling akafnd for record',i4,' with storm name,rsmc,', - 1 'id=',3(a,1x)) - nalsav=novrmx - call akafnd(iuntal,stmnmx,rsmcx,stmidx,nalsav,stnmal, - 1 rsmcal,stidal,ifnd) - - if(ifnd .eq. 0) then - - if(rsmcx(1:1) .eq. '!') then - write(6,381) stmnmx,stmidx,rsmcx - 381 format('******Storm not found in akavit file. stmnmx,stmidx,', - 1 'rsmcx=',3(1x,a),' abort') - call abort1(' RCNCIL',381) - else -c write(6,383) -c 383 format('...Storm does not have a bang rsmc. It is therefore not ', -c 1 'required to find a match.') - endif - - else - write(6,405) ifnd - 405 format('...Storm found in akavit file at record #',i3) - do nal=1,nalsav - if(rsmcz .eq. rsmcal(nal) .and. - 1 stmidz .eq. stidal(nal)) then - imatch=1 - go to 411 - endif - enddo - 411 continue - endif - - endif - - if(imatch .eq. 1 .or. ifnd .eq. -1) then - write(6,413) ifnd,imatch - 413 format('...Storm matches exactly or by catalog association, ', - 1 'ifnd,imatch=',2i3) - do iv=1,2 - call decvar(istvar(iv),ienvar(iv),ivtvrx(iv),ierdec, - 1 fmtvit(iv),bufinx) - enddo - -c write(6,231) nchk,idatmn,iutcmn,idatex,iutcx,bufinx - call mnmxda(idatmn,iutcmn,idatex,iutcx,dayz,minday) - call mnmxda(idatmx,iutcmx,idatex,iutcx,dayz,maxday) - if(dayz .gt. daysav) then - daysav=dayz - ilate=nchk - endif - - if(istidn .eq. 
1) then - tstrec(nchk)=bufinx - nadd=nadd+1 - badrec(nbad+nadd)=bufinx - numbad(nbad+nadd)=numtst(nchk) - iefail(numbad(nbad+nadd),5)= - 1 -iabs(iefail(numtst(nchk),5)) - stmidx=stidad(nadcat+1) - rsmcx =rsmcad(nadcat+1) - write(6,473) stmidx,bufinx,nadd,badrec(nbad+nadd) - 473 format('...Record same as candidate record to be added to ', - 1 'catalog. New storm id=',a,' is assigned. Modified ', - 2 'record is:'/4x,a/4x,'Bad record #',i3,' added is:'/4x,a) - endif - - iecat(nchk)=-iabs(iecat(nchk)) - if(nchk .le. nokay) then - okarec(nchk)=bufinx - else - badrec(nchk-nokay)=bufinx - endif - - endif -C....................................................................... - -c Exact match: substitute storm name if it is not nameless - - if(ifnd .eq. -1) then - - if(stmnmx.ne.'NAMELESS' .and. stmnmz.eq.'NAMELESS') then - stnmca=stmnmx - write(6,475) stnmca - 475 format('...NAMELESS candidate record is renamed to ',a,'from a ', - 1 'matching record.') - endif - -c Match through the alias file: copy alias information for the -c catalog entry - - else if(imatch .eq. 1) then - if(stmnmz.eq.'NAMELESS' .and. stnmal.ne.'NAMELESS') then - stnmca=stnmal - write(6,477) stnmca - 477 format('...NAMELESS candidate record is renamed to ',a,'from a ', - 1 'matching alias record.') - endif - - nalca=nalsav - rsmcca(1:nalca)=rsmcal(1:nalca) - stidca(1:nalca)=stidal(1:nalca) - - else - write(6,491) ifnd,imatch - 491 format('...Storm does not match exactly or by catalog ', - 1 'association, ifnd,imatch=',2i3) - endif - - endif - endif -C----------------------------------------------------------------------- - enddo -C####################################################################### - - if(iecat(nrec) .gt. 0) then - nadcat=nadcat+1 - - if(nadcat .gt. naddmx) then - write(6,505) nadcat,naddmx - 505 format('******Trying to add too many storms to the catalog,', - 1 ' nadcat,naddmx=',2i3) - call abort1(' RCNCIL',505) - endif - - if(istidn .eq. 
1) then - nadd=nadd+1 - badrec(nbad+nadd)=bufinz - numbad(nbad+nadd)=numtst(nrec) - iefail(numbad(nbad+nadd),5)=-iabs(iefail(numtst(nrec),5)) - write(6,511) nadd,nrec,nbad+nadd,numtst(nrec) - 511 format(/'...Adding a new bad record due to duplicate storm id, ', - 1 'nadd,nrec,nbad+nadd,numtst=',4i4) - - stmidz=stidad(nadcat) - rsmcz =rsmcad(nadcat) - write(6,513) stidca(1),nalca,bufinz - 513 format('...Id for storm added to catalog =',a,' is new and ', - 1 'unique. nalca=',i3,' Record is:'/4x,a) - - else - stidad(nadcat)=stidca(1) - write(6,515) stidad(nadcat) - 515 format('...Id for storm added to catalog =',a,' has been ', - 1 'recorded to prevent duplication.') - endif - - WRITE(IUNTCA,21) NALCA,STNMCA,IDATMN,IUTCMN,IDATMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - WRITE(6,293) NCAT+NADCAT,NALCA,STNMCA,IDATMN,IUTCMN,IDATMX, - 1 IUTCMX,(RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - endif - - if(nrec .le. nokay) then - okarec(nrec)=bufinz - else - badrec(nrec-nokay)=bufinz - endif - - iecat(nrec)=-iabs(iecat(nrec)) - endif - endif - - enddo -c** write(6,601) nadcat,naladd -c 601 format('...',i3,' new storms added to catalog. ',i3,' bang ', -c 1 'storms added to temporary alias file.'/4x,'Dump alias ' -c 2 'records to temporary alias file if necessary (naladd>0).' - write(6,601) nadcat - 601 format('...',i3,' new storms added to catalog.') - -c Finally, storm catalog and alias file (akavit) reconciliation. -c We force the alias file to be a direct subset of the storm -c catalog. - -c write(6,703) -c 703 format(/'...Storm catalog and alias file reconciliation. '/4x, -c 1 'Copy temporary alias file records to the new alias file', -c 2 ' if necessary.') - - iuntaw=iuntal - rewind iuntca - rewind iuntaw - - 720 read(iuntca,21,end=830) nalca,stmnmz,iymdmn,iutcmn,iymdca,iutcca, - 1 (rsmcca(nca),stidca(nca), - 2 nca=1,min(nalca,novrmx)) - if(rsmcca(1)(1:1) .eq. 
'!') write(iuntaw,711) nalca,stmnmz, - 1 (rsmcca(nca),stidca(nca), - 2 nca=1,min(nalca,novrmx)) - 711 format(i1,1x,a9,10(1x,a4,1x,a3)) - -c** ifndca=0 - -c if(stmnmz .eq. stnmal .and. -c 1 stidca(1) .eq. stidal(1)) then -c ifndz=0 -c write(6,801) stmnmz,stidca(1) -c 801 format('...Alias file and catalog have the same storm and basin ', -c 1 'id=',a,1x,a) - -c do nc=1,nalca -c if(rsmcal(nc) .eq. rsmcca(nc) .and. -c 1 stidal(nc) .eq. stidca(nc)) then -c ifndz=ifndz+1 -c endif -c enddo - -c if(ifndz .eq. nalca) then -c ifndca=1 -c go to 831 -c endif -c** endif - - go to 720 - 830 continue -cc831 continue - -c** if(ifndca .eq. 0) then -c write(6,833) nalca,stmnmz,(rsmcca(nca),stidca(nca), -c 1 nca=1,min(nalca,novrmx)) -c write(6,835) nalmx,stnmal,(rsmcal(nal),stidal(nal), -c 3 nal=1,min(nalmx,novrmx)) -c 833 format('******Storm in alias file but different or not in ', -c 1 'catalog. Catalog entry is:'/4x,i1,1x,a9,10(1x,a4,1x,a3) -c 835 format('Alias entry is:'/4x,i1,1x,a9,10(1x,a4,1x,a3)) -c call abort1(' RCNCIL',835) - -c else -c write(6,841) nalmx,stnmal,(rsmcal(nal),stidal(nal), -c 1 nal=1,min(nalmx,novrmx)) -c 841 format('...Alias file entry is identical to catalog. Entry is:'/ -c 1 4x,i1,1x,a9,10(1x,a4,1x,a3)) -c endif -c** go to 710 - -c Error summary - - write(6,901) nokay,ntest,nadd,(ercrcn(ner),ner=1,nercrc) - 901 format(//'...Results of the catalog reconciliation check are: ', - 1 'nokay=',i4,', ntest=',i4,', nadd=',i3//4x,'Error codes ', - 2 'are:'/(6x,a)) - write(6,903) - 903 format(/'...Okay records are:',100x,'erc'/) - do nok=1,nokay - write(6,909) nok,numoka(nok),okarec(nok),iefail(numoka(nok),5) - 909 format(3x,i4,'...',i4,'...',a,'...',i3) - enddo - - write(6,913) - 913 format(/'...Updated overland or overlapped (bad) records are:', - 1 68x,'erc') - do nba=1,nbad - if(iefail(numbad(nba),4) .eq. 5 .or. - 1 iefail(numbad(nba),4) .eq. 6 .or. - 2 iefail(numbad(nba),6) .eq. 
22) then - write(6,919) nba,numbad(nba),badrec(nba),iefail(numbad(nba),5) - 919 format(3x,i4,'...',i4,'...',a,'...',i3) - endif - enddo - - write(6,923) - 923 format(/'...Added records due to duplicate storm id are:',73x, - 1 'erc'/) - do nad=1,nadd - write(6,929) nad,numbad(nbad+nad),badrec(nbad+nad), - 1 iabs(iefail(numbad(nbad+nad),5)) - 929 format(3x,i4,'...',i4,'...',a,'...',i3) - enddo - nbad=nbad+nadd - - return - end - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: MNMXDA SUBSTITUTES MIN OR MAX DATE -C PRGMMR: S. LORD ORG: NP22 DATE: 1993-06-01 -C -C ABSTRACT: SUBSTITUTES MIN OR MAX DATE -C -C PROGRAM HISTORY LOG: -C 1993-06-01 S. LORD -C -C USAGE: CALL MNMXDA(IYMDNX,IUTCNX,IYMDZ,IUTCZ,DAYZ,MINMAX) -C INPUT ARGUMENT LIST: -C IYMDNX - CURRENT MINIMUM OR MAXIMUM YEAR,MONTH,DAY. -C -C IUTCNX - CURRENT MINIMUM OR MAXIMUM HOUR (UTC). -C IYMDZ - INPUT YEAR,MONTH,DAY. -C -C IUTCZ - INPUT HOUR (UTC). -C MINMAX - <0: SUBSTITUTE MINIMUM DATE; >0: SUBSTITUTE MAXIMUM -C - DATE. -C -C OUTPUT ARGUMENT LIST: -C IYMDNX - UPDATED MINIMUM OR MAXIMUM YEAR,MONTH,DAY. -C IUTCNX - UPDATED MINIMUM OR MAXIMUM HOUR (UTC). -C DAYZ - FRACTIONAL DAY NUMBER OF THE INPUT DATE. -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - subroutine mnmxda(iymdnx,iutcnx,iymdz,iutcz,dayz,minmax) - - DIMENSION RINC(5) - -c if minmax<0, the minimum is returned -c if minmax>0, the maximum is returned - - call ztime(iymdnx,iutcnx,iyr,imo,ida,ihr,imin) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - call flday(jdy,ihr,imin,daynx) - - call ztime(iymdz,iutcz,iyr,imo,ida,ihr,imin) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - call flday(jdy,ihr,imin,dayz) - - if(minmax .gt. 0) then - if(dayz .gt. daynx) then - write(6,11) iymdnx,iutcnx,iymdz,iutcz - 11 format('...Substituting maximum date. iymdnx,iutcnx,iymdz,iutcz=', - 1 2(i9,i6.4)) - iymdnx=iymdz - iutcnx=iutcz - else -c write(6,13) iymdnx,iutcnx,iymdz,iutcz -c 13 format('...No substitution of maximum date. iymdnx,iutcnx,iymdz,', -c 1 'iutcz=',2(i9,i6.4)) - endif - - else if(minmax .lt. 0) then - if(dayz .lt.
daynx) then - write(6,21) iymdnx,iutcnx,iymdz,iutcz - 21 format('...Substituting minimum date. iymdnx,iutcnx,iymdz,iutcz=', - 1 2(i9,i6.4)) - iymdnx=iymdz - iutcnx=iutcz - else -c write(6,23) iymdnx,iutcnx,iymdz,iutcz -c 23 format('...No substitution of minimum date. iymdnx,iutcnx,iymdz,', -c 1 'iutcz=',2(i9,i6.4)) - endif - - else - write(6,31) minmax - 31 format('******minmax value=',i5,' is improper. abort.') - CALL ABORT1(' MNMXDA',31) - endif - - return - end - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: SCLIST LISTS STORM CATALOG -C PRGMMR: S. LORD ORG: NP22 DATE: 1993-06-01 -C -C ABSTRACT: LISTS STORM CATALOG -C -C PROGRAM HISTORY LOG: -C 1993-06-01 S. LORD -C -C USAGE: CALL SCLIST(IUNTCA) -C INPUT ARGUMENT LIST: -C IUNTCA - UNIT NUMBER FOR CATALOG. -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - subroutine sclist(iuntca) - parameter (novrmx=70) - - character stnmca*9,stidca*3,rsmcca*4 - dimension stidca(novrmx),rsmcca(novrmx) - - rewind iuntca - nrec=0 - - write(6,1) iuntca - 1 format(/'...Storm catalog list for unit ',i3) - 10 continue - READ(IUNTCA,21,END=90) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - nrec=nrec+1 - 21 FORMAT(I1,1X,A9,2(1X,I8,1X,I4.4),10(1X,A4,1X,A3)) - write(6,23) nrec,NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NAL),STIDCA(NAL), - 2 NAL=1,MIN(NALCA,NOVRMX)) - 23 FORMAT(3x,i4,2x,I1,1X,A9,2(1X,I8,1X,I4.4),10(1X,A4,1X,A3)) - go to 10 - - 90 continue - write(6,91) - 91 format('...End of storm catalog list.'/) - rewind iuntca - return - end - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: AKLIST LISTS ALIAS FILE -C PRGMMR: S. LORD ORG: NP22 DATE: 1993-06-01 -C -C ABSTRACT: LISTS ALIAS FILE -C -C PROGRAM HISTORY LOG: -C 1993-06-01 S. LORD -C -C USAGE: CALL AKLIST(IUNTAL) -C INPUT ARGUMENT LIST: -C IUNTAL - UNIT NUMBER FOR ALIAS FILE. -C -C REMARKS: NONE. 
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - subroutine aklist(iuntal) - parameter (novrmx=70) - - character stnmal*9,stidal*3,rsmcal*4 - dimension stidal(novrmx),rsmcal(novrmx) - - rewind iuntal - nrec=0 - - write(6,1) iuntal - 1 format(/'...Storm alias list for unit ',i3) - 10 continue - READ(IUNTAL,21,END=90) NALAL,STNMAL,(RSMCAL(NAL),STIDAL(NAL), - 1 NAL=1,MIN(NALAL,NOVRMX)) - nrec=nrec+1 - 21 FORMAT(I1,1X,A9,10(1X,A4,1X,A3)) - write(6,23) nrec,NALAL,STNMAL,(RSMCAL(NAL),STIDAL(NAL), - 1 NAL=1,MIN(NALAL,NOVRMX)) - 23 FORMAT(3x,i4,2x,I1,1X,A9,10(1X,A4,1X,A3)) - go to 10 - - 90 continue - write(6,91) - 91 format('...End of storm alias list.'/) - rewind iuntal - return - end - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: STCATI GETS STORM ID FROM CATALOG -C PRGMMR: S. LORD ORG: NP22 DATE: 1993-06-01 -C -C ABSTRACT: LOOKS FOR GIVEN STORM ID AND RSMC IN CATALOG -C -C PROGRAM HISTORY LOG: -C 1993-06-01 S. LORD -C -C USAGE: CALL STCATI(IUNTCA,STMIDZ,RSMCZ,STMIDX,IFND) -C INPUT ARGUMENT LIST: -C IUNTCA - UNIT NUMBER FOR STORM CATALOG. -C -C STMIDZ - REQUESTED STORM ID. -C RSMCZ - REQUESTED RSMC. -C -C OUTPUT ARGUMENT LIST: -C STMIDX - CATALOGED STORM ID. -C IFND - 1 IF FOUND; THE MATCH INCLUDES -C - THE RSMC CHECK. -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - subroutine stcati(iuntca,stmidz,rsmcz,stmidx,ifnd) - - parameter (novrmx=70) - - dimension rsmcca(novrmx),stidca(novrmx) - - character stmidz*(*),stmidx*(*),rsmcz*(*) - character stnmca*9,stidca*3,rsmcca*4 - - ifnd=0 - rewind iuntca - write(6,1) stmidz,rsmcz - 1 format('...Entering stcati looking for storm id,rsmc=',2(a,2x)) - 10 continue - READ(IUNTCA,21,END=90) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX, - 1 (RSMCCA(NCA),STIDCA(NCA), - 2 NCA=1,MIN(NALCA,NOVRMX)) - 21 FORMAT(I1,1X,A9,2(1X,I8,1X,I4.4),10(1X,A4,1X,A3)) - do nca=1,min(nalca,novrmx) - if(stmidz .eq. stidca(nca) .and. rsmcz .eq.
rsmcca(nca)) then - ifnd=1 - stmidx=stidca(1) - rewind iuntca - return - endif - enddo - go to 10 - - 90 continue - - rewind iuntca - return - end - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: STCATN GETS STORM NAME AND LAST DATE FROM CATLG -C PRGMMR: S. LORD ORG: NP22 DATE: 1993-08-25 -C -C ABSTRACT: LOOKS FOR GIVEN STORM NAME IN CATALOG AND RETURNS ITS -C LATEST DATE -C -C PROGRAM HISTORY LOG: -C 1993-08-25 S. LORD -C -C USAGE: CALL STCATN(IUNTCA,STMNMZ,IDATEZ,IUTCZ,IFND) -C INPUT ARGUMENT LIST: -C IUNTCA - UNIT NUMBER FOR STORM CATALOG. -C STMNMZ - REQUESTED STORM NAME. -C -C OUTPUT ARGUMENT LIST: -C IDATEZ - LATEST DATE FOUND FOR NAMED STORM. -C IUTCZ - LATEST HHMM FOUND FOR NAMED STORM. -C IFND - 1 IF FOUND. -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE STCATN(IUNTCA,STMNMZ,IDATEZ,IUTCZ,IFND) - - character STMNMZ*(*) - character stnmca*9 - - ifnd=0 - IDATEZ=-999999 - IUTCZ=-999 - rewind iuntca - write(6,1) STMNMZ - 1 format('...Entering stcatn looking for storm name=',a) - 10 continue - READ(IUNTCA,21,END=90) NALCA,STNMCA,IYMDMN,IUTCMN,IYMDMX,IUTCMX - 21 FORMAT(I1,1X,A9,2(1X,I8,1X,I4.4)) - if(STNMCA .eq. STMNMZ) then - ifnd=1 - IDATEZ=IYMDMX - IUTCZ=IUTCMX - endif - go to 10 - - 90 continue - - rewind iuntca - return - end - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: ADFSTF ADDS FIRST OCCURRENCE FLAGS TO RECORDS -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-07 -C -C ABSTRACT: ADDS FIRST OCCURRENCE FLAGS TO RECORDS AS APPROPRIATE, -C EVEN IF A RECORD HAS BEEN CLASSIFIED AS A BAD RECORD. -C -C PROGRAM HISTORY LOG: -C 1991-06-07 S. J. LORD -C 1991-06-07 S. J.
LORD DISABLED FIRST FLAGS FOR RELOCATED STORMS - -C USAGE: CALL ADFSTF(IUNTHA,NOKAY,NBAD,MAXREC,MAXCKS,IECOST,NUMBAD, -c IEFAIL,DUMREC,OKAREC,BADREC) -C INPUT ARGUMENT LIST: -C IUNTHA - UNIT NUMBER FOR THE ALIAS SHORT-TERM HISTORY FILE -C NOKAY - LENGTH OF ARRAY OKAREC -C NBAD - LENGTH OF ARRAY BADREC AND NUMBAD -C MAXREC - LENGTH OF FIRST DIMENSION OF ARRAY IEFAIL -C MAXCKS - LENGTH OF SECOND DIMENSION OF ARRAY IEFAIL -C IECOST - ERROR CODE FOR OVERLAND (COASTAL) TROPICAL CYCLONE -C - POSITIONS -C NUMBAD - ARRAY CONTAINING INDEX NUMBER OF EACH BAD RECORD -C IEFAIL - 2-D ARRAY OF ERROR CODES FOR ALL RECORDS -C DUMREC - DUMMY CHARACTER VARIABLE FOR READING SHORT-TERM -C - HISTORY RECORDS -C OKAREC - CHARACTER ARRAY OF OK RECORDS, RECORDS THAT HAVE -C - PASSED ALL Q/C CHECKS SO FAR -C BADREC - CHARACTER ARRAY OF BAD RECORDS, RECORDS THAT HAVE -C - FAILED AT LEAST ONE Q/C CHECK SO FAR -C -C OUTPUT ARGUMENT LIST: -C DUMREC - DESCRIPTION AS ABOVE -C OKAREC - SAME AS INPUT, EXCEPT FIRST OCCURRENCE FLAG MAY HAVE -C - BEEN ADDED -C BADREC - SAME AS INPUT, EXCEPT FIRST OCCURRENCE FLAG MAY HAVE -C - BEEN ADDED IN THE CASE OF OVER-LAND (COASTAL) STORMS -C -C INPUT FILES: -C UNIT "IUNTHA" - SHORT-TERM HISTORY FILE -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE.
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE ADFSTF(IUNTHA,NOKAY,NBAD,MAXREC,MAXCKS,IECOST,NUMBAD, - 1 IEFAIL,DUMREC,OKAREC,BADREC) - - SAVE - - LOGICAL FOUNDO,FOUNDB - - CHARACTER*(*) DUMREC,OKAREC(NOKAY),BADREC(NBAD) - CHARACTER*100 DUMY2K - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,RELOCZ*1 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION BUFIN(MAXCHR),FMTVIT(MAXVIT) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - DIMENSION IEFAIL(MAXREC,0:MAXCKS),NUMBAD(NBAD) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 3 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/, - 4 IFSTFL/19/,ISTID/6/,IENID/8/ - - DATA NUM/1/ - - WRITE(6,1) NOKAY,NBAD,IECOST - 1 FORMAT(/'...ENTERING ADFSTF WITH NOKAY,NBAD,IECOST=',3I4/4X, - 1 'WARNING: FIRST OCCURRENCE FLAGS (FOF) MAY OR MAY NOT BE', - 2 ' PRESENT IN THE ORIGINAL SHORT-TERM ALIAS FILE DUE TO ', - 3 'THIS ROUTINE.'/4X,'RELIABLE FOFS ARE PRESENT ONLY IN ', - 4 'THE ALIAS SHORT-TERM HISTORY FILE.') - -C CHECK EACH ALIAS SHORT-TERM HISTORY RECORD FIRST VERSUS THE -C "OKAY" RECORDS AND SECOND VERSUS THE "BAD" RECORDS THAT -C HAVE ONLY AN OVER COAST ERROR - - DO NOK=1,NOKAY - BUFINZ=OKAREC(NOK) - FOUNDO=.FALSE. 
- REWIND IUNTHA - NREC=0 - - 10 CONTINUE - - READ(IUNTHA,11,END=90) DUMREC - 11 FORMAT(A) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(DUMREC(35:35).EQ.'N' .OR. DUMREC(35:35).EQ.'S') THEN - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C ... THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',DUMREC(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntha,'; DUMREC-4: ',dumrec - PRINT *, ' ' - DUMY2K(1:19) = DUMREC(1:19) - IF(DUMREC(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = DUMREC(20:100) - DUMREC = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ DUMREC(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntha,'; DUMREC-4: ',dumrec - PRINT *, ' ' - - ELSE IF(DUMREC(37:37).EQ.'N' .OR. DUMREC(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',DUMREC(20:23),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntha,'; DUMREC-4: ',dumrec - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 10 - - END IF - - NREC=NREC+1 - IF(STMIDZ .EQ. DUMREC(ISTID:IENID) .AND. - 1 DUMREC(IFSTFL:IFSTFL) .NE. 
'*') THEN - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 DUMREC) - ENDDO - IDTDUM=IDATEZ - IUTDUM=IUTCZ - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 OKAREC(NOK)) - ENDDO - -C IF THERE ARE DUPLICATE DATES, THEN WE ASSUME THE OKAY RECORD -C IS AN UPDATED RECORD AND WE TRANSFER THE FIRST OCCURRENCE -C FLAG TO THE UPDATED RECORD. THIS CREATES A PARTIAL -C DUPLICATE RECORD THAT WILL BE DEALT WITH IN RITSTH. - - IF(IDATEZ .EQ. IDTDUM .AND. IUTCZ .EQ. IUTDUM) THEN - OKAREC(NOK)(IFSTFL:IFSTFL)=DUMREC(IFSTFL:IFSTFL) - ELSE - FOUNDO=.TRUE. - ENDIF - ENDIF - -C WRITE(6,87) NOK,FOUNDO,DUMREC,OKAREC(NOK) -C 87 FORMAT('...CHECKING FOR FIRST OCCURRENCE, NOK,FOUNDO,DUMREC,', -C 1 'OKAREC=',I3,1X,L1/4X,A/4X,A) - GO TO 10 - - 90 CONTINUE - -C IF THERE ARE NO MATCHING STORMS IN THE SHORT-TERM HISTORY FILE, -C FIND THE EARLIEST STORM IN THE OKAY RECORDS - - IF(.NOT. FOUNDO) THEN - CALL FSTSTM(NOKAY,NOK,NFIRST,OKAREC) - OKAREC(NFIRST)(IFSTFL:IFSTFL)=':' - ENDIF - - ENDDO - - DO NBA=1,NBAD - - IF(IEFAIL(NUMBAD(NBA),4) .EQ. IECOST) THEN - - DO NCK=1,MAXCKS - IF(NCK .NE. 4 .AND. IEFAIL(NUMBAD(NBA),NCK) .GT. 0) GO TO 200 - ENDDO - - BUFINZ=BADREC(NBA) - REWIND IUNTHA - FOUNDB=.FALSE. - NREC=0 - - 160 CONTINUE - - READ(IUNTHA,11,END=190) DUMREC - NREC=NREC+1 - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(DUMREC(35:35).EQ.'N' .OR. DUMREC(35:35).EQ.'S') THEN - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C ... 
THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',DUMREC(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntha,'; DUMREC-5: ',dumrec - PRINT *, ' ' - DUMY2K(1:19) = DUMREC(1:19) - IF(DUMREC(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = DUMREC(20:100) - DUMREC = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ DUMREC(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntha,'; DUMREC-5: ',dumrec - PRINT *, ' ' - - ELSE IF(DUMREC(37:37).EQ.'N' .OR. DUMREC(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',DUMREC(20:23),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntha,'; DUMREC-5: ',dumrec - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 160 - - END IF - - IF(STMIDZ .EQ. DUMREC(ISTID:IENID) .AND. - 1 DUMREC(IFSTFL:IFSTFL) .NE. '*') THEN - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 DUMREC) - ENDDO - IDTDUM=IDATEZ - IUTDUM=IUTCZ - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BADREC(NBA)) - ENDDO - -C IF THERE ARE DUPLICATE DATES, THEN WE ASSUME THE BAD RECORD -C IS AN UPDATED RECORD AND WE TRANSFER THE FIRST OCCURRENCE -C FLAG TO THE UPDATED RECORD. THIS CREATES A PARTIAL -C DUPLICATE RECORD THAT WILL BE DEALT WITH IN RITSTH. - - IF(IDATEZ .EQ. 
IDTDUM .AND. IUTCZ .EQ. IUTDUM) THEN - BADREC(NBA)(IFSTFL:IFSTFL)=DUMREC(IFSTFL:IFSTFL) - ELSE - FOUNDB=.TRUE. - ENDIF - ENDIF - -C WRITE(6,187) NBA,DUMREC,BADREC(NBA) -C 187 FORMAT('...CHECKING FOR FIRST OCCURRENCE, NBA,DUMREC,BADREC=',I3/ -C 1 4X,A/4X,A) - GO TO 160 - - 190 CONTINUE - -C IF THERE ARE NO MATCHING STORMS IN THE SHORT-TERM HISTORY FILE, -C FIND THE EARLIEST STORM IN THE BAD RECORDS - - IF(.NOT. FOUNDB) THEN - CALL FSTSTM(NBAD,NBA,NFIRST,BADREC) - BADREC(NFIRST)(IFSTFL:IFSTFL)='*' - ENDIF - - ENDIF - 200 CONTINUE - ENDDO - -C IF THERE ARE NO RECORDS IN THE SHORT-TERM HISTORY FILE, -C WE MUST ASSIGN A FIRST OCCURRENCE FLAG TO EACH STORM - - IF(NREC .EQ. 0) THEN - DO NOK=1,NOKAY - CALL FSTSTM(NOKAY,NOK,NFIRST,OKAREC) - OKAREC(NFIRST)(IFSTFL:IFSTFL)=':' - ENDDO - ENDIF - -C ADD FIRST OCCURRENCE FLAGS FOR RELOCATED STORMS -C DISABLED 4-9-93 - -C DO NOK=1,NOKAY -C BUFINZ=OKAREC(NOK) -C IF(RELOCZ .EQ. 'R') OKAREC(NOK)(IFSTFL:IFSTFL)=':' -C ENDDO - -C VERY SPECIAL CASE: NO RECORDS IN THE SHORT-TERM HISTORY FILE -C AND A RECORD HAS AN OVER LAND ERROR - - IF(NREC .EQ. 0) THEN - DO NBA=1,NBAD - - IF(IEFAIL(NUMBAD(NBA),4) .EQ. IECOST) THEN - - DO NCK=1,MAXCKS - IF(NCK .NE. 4 .AND. IEFAIL(NUMBAD(NBA),NCK) .GT. 0) GO TO 400 - ENDDO - - BADREC(NBA)(IFSTFL:IFSTFL)='*' - - ENDIF - 400 CONTINUE - ENDDO - ENDIF - - WRITE(6,401) NOKAY,NBAD,NREC - 401 FORMAT(/'...LEAVING ADFSTF, NOKAY, NBAD=',2I4/4X,I3,' RECORDS ', - 1 'READ FROM ALIAS SHORT-TERM HISTORY FILE.') - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: FSTSTM FINDS FIRST OCCURRENCE FOR A STORM -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-07-18 -C -C ABSTRACT: FINDS FIRST OCCURRENCE OF A PARTICULAR STORM BY PICKING -C OUT THE MINIMUM TIME. -C -C PROGRAM HISTORY LOG: -C 1991-07-18 S. J. 
LORD -C -C USAGE: CALL FSTSTM(NRCMX,NRCSTM,NFIRST,DUMREC) -C INPUT ARGUMENT LIST: -C NRCMX - LENGTH OF ARRAY DUMREC -C NRCSTM - INDEX OF THE RECORD CONTAINING THE DESIRED STORM -C DUMREC - ARRAY OF INPUT RECORDS -C -C OUTPUT ARGUMENT LIST: -C NFIRST - INDEX OF THE FIRST RECORD FOR THE DESIRED STORM -C DUMREC - DESCRIPTION AS ABOVE -C -C REMARKS: NONE -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE FSTSTM(NRCMX,NRCSTM,NFIRST,DUMREC) - - CHARACTER*(*) DUMREC(NRCMX) - - DIMENSION RINC(5) - - SAVE - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,RELOCZ*1 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION BUFIN(MAXCHR),FMTVIT(MAXVIT) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 3 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/, - 4 ISTID/6/,IENID/8/ - - DATA NUM/1/ - -C WRITE(6,1) NRCMX,NRCSTM -C 1 FORMAT(/'...ENTERING FSTSTM WITH NRCMX,NRCSTM=',2I4) - - DAYFST=1.0E10 - -C PICK OUT THE RECORD WITH THE MINIMUM DATE FOR THE CHOSEN STORM - - DO NCOM=1,NRCMX - BUFINZ=DUMREC(NCOM) - IF(STMIDZ .EQ. DUMREC(NRCSTM)(ISTID:IENID)) THEN - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BUFINZ) - ENDDO - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - IF(DAYZ .LE. DAYFST) THEN - NFIRST=NCOM - DAYFST=DAYZ - ENDIF - ENDIF - ENDDO - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . 
. . . -C SUBPROGRAM: RITCUR WRITES Q/C RECORDS TO CURRENT DATA FILE -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: WRITES CURRENT QUALITY CONTROLLED RECORDS TO THE CURRENT -C FILE (UNIT 60). -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1991-07-22 S. LORD ADDED IDATEZ,IUTCZ TO ARGUMENT LIST -C 1992-07-01 S. LORD REVISION FOR TIME WINDOW -C -C USAGE: CALL RITCUR(IUNTRD,IUNTCU,NTEST,NOKAY,NBAD,IDATCU,JUTCCU,DAY0, -C MAXREC,IFLLCK,NUMTST,NUMOKA,NUMBAD,FILES,LNDFIL, -C ZZZREC,NNNREC,DUMREC,SCRREC,TSTREC,OKAREC,BADREC) -C INPUT ARGUMENT LIST: -C IUNTRD - UNIT NUMBER FOR READING RECORDS -C IUNTCU - UNIT NUMBER FOR CURRENT DATA FILE -C NTEST - NUMBER OF INPUT RECORDS (>0 FOR FILES=FALSE OPTION, -C - =0 FOR FILES=TRUE OPTION) -C IDATCU - DATE (YYYYMMDD) FOR ACCEPTANCE WINDOW -C JUTCCU - UTC (HHMMSS) FOR ACCEPTANCE WINDOW -C DAY0 - DATE OF ACCEPTANCE WINDOW -C MAXREC - DIMENSION OF INPUT ARRAYS -C FILES - LOGICAL VARIABLE, TRUE IF UPDATED SHORT-TERM HISTORY -C FILE HAS BEEN CREATED -C LNDFIL - LOGICAL VARIABLE, TRUE IF OVER-LAND FILTER SHOULD BE -C APPLIED TO CURRENT RECORDS. -C RECORDS TO THE CURRENT FILE -C DUMREC - CHARACTER VARIABLE -C TSTREC - CHARACTER ARRAY (LENGTH MAXREC) OF INPUT RECORDS. ONLY -C - THE FIRST NTEST ARE VALID IN THE CASE OF FILES=.FALSE. 
-C NUMTST - INDEX FOR ARRAY TSTREC -C ZZZREC - CHARACTER VARIABLE CONTAINING HEADER INFO -C NNNREC - CHARACTER VARIABLE CONTAINING COLUMN INFO -C -C OUTPUT ARGUMENT LIST: -C OKAREC - CONTAINS CANDIDATE QUALITY CONTROLLED RECORDS COPIED -C - TO THE CURRENT FILE -C NOKAY - NUMBER OF OKAY RECORDS -C NBAD - NUMBER OF RECORDS THAT FAILED THE OVERLAND CHECK -C IFLLCK - CONTAINS FAILURE CODE OF BAD RECORDS -C BADREC - ARRAY CONTAINING BAD RECORDS -C SCRREC - SCRATCH ARRAY CONTAINING STORM IDS AND NAMES -C NUMOKA - ARRAY CONTAINING INDICES OF OKAY RECORDS -C NUMBAD - ARRAY CONTAINING INDICES OF BAD RECORDS -C -C INPUT FILES: -C UNIT 20 - SCRATCH FILE CONTAINING QUALITY CONTROLLED RECORDS -C - IUNTRD POINTS TO THIS FILE WHEN FILES=.TRUE. -C UNIT 22 - ALIAS SHORT-TERM HISTORY FILE CONTAINING RECORDS -C - PROCESSED BY THIS PROGRAM FOR THE LAST SEVERAL DAYS. -C - IUNTRD POINTS TO THIS FILE WHEN FILES=.FALSE. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 60 - QUALITY CONTROLLED RECORDS (IUNTCU) -C -C REMARKS: IF LENGTH OF OUTPUT RECORDS (MAXCHR) EXCEEDS THE DESIGNATED -C RECORD LENGTH FOR THE FILE (MAXSPC), THIS SUBROUTINE WILL -C PRINT A NASTY MESSAGE AND CALL AN ABORT1 PROGRAM THAT GIVES -C A RETURN CODE OF 20 FOR THIS PROGRAM EXECUTION. UNDER -C THE FILES=TRUE OPTION, RECORDS ARE READ FROM THE SCRATCH -C FILE, DATE CHECKED, CHECKED FOR OVERLAND POSITIONS IF NEED -C BE, AND THEN WRITTEN TO THE CURRENT FILE. UNDER THE FILES= -C FALSE OPTION, ALL RECORDS PROCESSED BY THE PRESENT RUN OF -C THIS PROGRAM MAY BE WRITTEN IN ADDITION TO SOME RECORDS FROM -C THE ALIAS SHORT-TERM HISTORY FILE. IN BOTH OPTIONS, ONLY THE -C LATEST STORM RECORD IS WRITTEN. ALL RECORDS LIE IN A TIME -C WINDOW GIVEN BY DAY0. 
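The two-digit-year "windowing" conversion that RITCUR (and the other routines here) applies to each tcvitals record can be sketched in Python. This is an illustrative helper, not part of the program; it assumes only what the comments state: a 2-digit year begins in column 20 (1-based), and years greater than 20 are windowed into the 1900s, the rest into the 2000s.

```python
def window_year(rec: str) -> str:
    """Convert a tcvitals record carrying a 2-digit year in columns
    20-21 (1-based) to a 4-digit year via the 'windowing' technique:
    years 21-99 map to 19xx, years 00-20 map to 20xx.
    Mirrors the DUMY2K logic in the Fortran (illustrative sketch)."""
    year2 = rec[19:21]                      # columns 20-21, 0-based slice
    century = "19" if year2 > "20" else "20"
    return rec[:19] + century + rec[19:]    # shift the rest right by two
```

Because the comparison is on the character pair, `"87" > "20"` windows to 1987 while `"05"` windows to 2005, matching the `IF(DUMREC(20:21).GT.'20')` test in the source.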
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE RITCUR(IUNTRD,IUNTCU,NTEST,NOKAY,NBAD,IDATCU,JUTCCU, - 1 DAY0,MAXREC,IFLLCK,NUMTST,NUMOKA,NUMBAD,FILES, - 2 LNDFIL,ZZZREC,NNNREC,DUMREC,SCRREC,TSTREC, - 3 OKAREC,BADREC) - - PARAMETER (MAXSPC=100) - - SAVE - - LOGICAL FIRST,FILES,LNDFIL,FOUND - - CHARACTER*(*) TSTREC(0:MAXREC),OKAREC(MAXREC),BADREC(MAXREC), - 1 ZZZREC,NNNREC,DUMREC,SCRREC(0:MAXREC) - CHARACTER*100 DUMY2K - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - - CHARACTER FMTVIT*6 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION FMTVIT(MAXVIT) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - DIMENSION IFLLCK(MAXREC),NUMTST(MAXREC),NUMOKA(MAXREC), - 1 NUMBAD(MAXREC) - - DIMENSION RINC(5) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 2 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 3 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 4 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/, - 5 ISTID/6/,IENID/8/ - - DATA FIRST/.TRUE./,NUM/1/,FIVMIN/3.4722E-3/ - - WRITE(6,1) IUNTRD,IUNTCU,FILES,LNDFIL,IDATCU,JUTCCU,DAY0 - 1 FORMAT(/'...ENTERING RITCUR WITH IUNTRD,IUNTCU,FILES,LNDFIL,', - 1 'IDATCU,JUTCCU,DAY0',2I3,2L2,I9,I7,F10.3) - - IF(FIRST) THEN - FIRST=.FALSE. - IF(MAXCHR .GT. MAXSPC) THEN - WRITE(6,5) MAXCHR,MAXSPC - 5 FORMAT(/'******INSUFFICIENT SPACE ALLOCATED FOR CURRENT HISTORY ', - 1 'FILE.'/7X,'MAXCHR, AVAILABLE SPACE ARE:',2I4) - CALL ABORT1(' RITCUR',1) - ENDIF - - ENDIF - -C RITCUR USES EITHER OF TWO POSSIBLE SOURCES FOR CURRENT RECORDS: -C 1) IF FILES=.TRUE., THE SCRATCH FILE (IUNTOK) CONTAINS -C ALL THE CURRENT RECORDS, INCLUDING THOSE PROCESSED BY A -C PREVIOUS RUN OF THIS PROGRAM. HOWEVER, A POSSIBILITY -C EXISTS THAT A CURRENT COASTAL RECORD MAY BE IN THE -C SCRATCH FILE. THEREFORE, THERE IS AN OPTIONAL FILTER -C (LNDFIL) BY USING A CALL TO SELACK TO WEED OUT THESE -C RECORDS. 
- -C 2) IF FILES=.FALSE., THE CURRENT RECORDS ARE THOSE -C PROCESSED BY THE PRESENT RUN OF THIS PROGRAM (OKAREC) -C AND CANDIDATES FROM THE ALIAS SHORT-TERM HISTORY FILE. - -C IN EITHER CASE, ONLY THE LATEST RECORD FOR EACH STORM IS -C WRITTEN. - - REWIND IUNTCU - REWIND IUNTRD - NUNIQ=0 - SCRREC(NUNIQ)='ZZZ' - print *, ' ' - print *, ' ' - - 10 CONTINUE - - READ(IUNTRD,11,END=100) DUMREC - 11 FORMAT(A) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(DUMREC(35:35).EQ.'N' .OR. DUMREC(35:35).EQ.'S') THEN - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C ... THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',DUMREC(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntrd,'; DUMREC-6: ',dumrec - PRINT *, ' ' - DUMY2K(1:19) = DUMREC(1:19) - IF(DUMREC(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = DUMREC(20:100) - DUMREC = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ DUMREC(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntrd,'; DUMREC-6: ',dumrec - PRINT *, ' ' - - ELSE IF(DUMREC(37:37).EQ.'N' .OR. DUMREC(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... 
NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',DUMREC(20:23),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntrd,'; DUMREC-6: ',dumrec - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 10 - - END IF - - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 DUMREC) - ENDDO - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - - IF(DAYZ .GE. DAY0-FIVMIN) THEN - NTEST=NTEST+1 - TSTREC(NTEST)=DUMREC - NUMTST(NTEST)=NTEST -C WRITE(6,33) NTEST,DUMREC -C 33 FORMAT('...READING FROM SCRATCH FILE'/4X,I4,'...',A,'...') - ENDIF - GO TO 10 - - 100 CONTINUE - - IF(NTEST .GT. 0) THEN - IF(LNDFIL .AND. FILES) THEN - WRITE(6,103) NTEST,NOKAY,NBAD - 103 FORMAT(/'...IN RITCUR, CALLING SELACK IN RITCUR TO CHECK FOR ', - 1 'OVERLAND POSITIONS.'/4X,'NTEST,NOKAY,NBAD=',3I4) - - CALL SELACK(NTEST,NOKAY,NBAD,IECOST,IFLLCK,NUMTST,NUMOKA,NUMBAD, - 1 LNDFIL,ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) - - ELSE - DO NOK=1,NTEST - OKAREC(NOK)=TSTREC(NOK) - NUMOKA(NOK)=NOK - ENDDO - NOKAY=NTEST - ENDIF - -C PICK OUT THE UNIQUE STORMS - - DO NOK=1,NOKAY - FOUND=.FALSE. - DO NUNI=1,NUNIQ - IF(OKAREC(NOK)(ISTID:IENID) .EQ. SCRREC(NUNI)(1:IENID-ISTID+1)) - 1 FOUND=.TRUE. - ENDDO - IF(.NOT. FOUND) THEN - NUNIQ=NUNIQ+1 - SCRREC(NUNIQ)(1:IENID-ISTID+1)=OKAREC(NOK)(ISTID:IENID) - ENDIF - ENDDO - WRITE(6,151) NUNIQ - 151 FORMAT(/'...THE NUMBER OF UNIQUE STORMS IS',I4) - -C SCAN THROUGH RECORDS AND PICK OUT THE LATEST STORM RECORD FOR -C EACH UNIQUE STORM. 
- - WRITE(6,157) - 157 FORMAT(/'...THE FOLLOWING LATEST RECORDS FOR EACH STORM ARE ', - 1 'BEING WRITTEN TO THE CURRENT FILE:') - - DO NUNI=1,NUNIQ - DAYCHK=-1.E10 - INDXZ=-99 - DO NOK=1,NOKAY - IF(OKAREC(NOK)(ISTID:IENID) .EQ. SCRREC(NUNI)(1:IENID-ISTID+1)) - 1 THEN - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 OKAREC(NOK)) - ENDDO - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - IF(DAYZ .GT. DAYCHK) THEN - INDXZ=NOK - DAYCHK=DAYZ - ENDIF - ENDIF - ENDDO - IF(INDXZ .GT. 0) THEN - WRITE(6,173) INDXZ,OKAREC(INDXZ)(1:MAXCHR) - WRITE(IUNTCU,177) OKAREC(INDXZ)(1:MAXCHR) - 173 FORMAT('...',I3,'...',A,'...') - 177 FORMAT(A) - - ELSE - WRITE(6,181) SCRREC(NUNI)(1:IENID-ISTID+1) - 181 FORMAT(/'###STORM ID=',A,' CANNOT BE FOUND. ABORT1') - CALL ABORT1(' RITCUR',181) - ENDIF - ENDDO - WRITE(6,221) NUNIQ,IUNTCU - 221 FORMAT(/'...',I4,' RECORDS HAVE BEEN COPIED TO THE CURRENT FILE ', - 1 '(UNIT',I3,').') - - ELSE - WRITE(6,231) - 231 FORMAT(/'...NO CURRENT RECORDS WILL BE WRITTEN.') - END FILE IUNTCU - ENDIF - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: RITSTH WRITES SHORT-TERM HISTORY FILE -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: WRITES ALL INPUT RECORDS AND QUALITY CONTROL MARKS -C ASSIGNED BY THIS PROGRAM TO A SCRATCH FILE THAT -C CONTAINS ALL RECENT HISTORICAL RECORDS FOR EACH STORM. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C -C USAGE: CALL RITSTH(IUNTHA,IUNTHO,IUNTOK,NOKAY,NBAD,DAYMIN,IECOST, -C MAXCKS,MAXREC,NUMBAD,IEFAIL,DUMREC,OKAREC,BADREC) -C INPUT ARGUMENT LIST: -C IUNTHA - UNIT NUMBER FOR THE ALIAS SHORT-TERM HISTORY FILE. -C IUNTHO - UNIT NUMBER FOR THE ORIGINAL SHORT-TERM HISTORY FILE. -C IUNTOK - UNIT NUMBER FOR THE SCRATCH FILE CONTAINING RECORDS -C - WRITTEN TO THE SHORT-TERM HISTORY FILE. 
-C NOKAY - NUMBER OF RECORDS THAT PASSED ALL Q/C CHECKS. -C NBAD - NUMBER OF RECORDS THAT HAVE AT LEAST ONE ERROR. -C DAYMIN - EARLIEST (MINIMUM) DATE FOR RECORDS THAT WILL BE -C - COPIED TO THE SHORT-TERM HISTORICAL FILE. -C - UNITS ARE DDD.FFF, WHERE DDD=JULIAN DAY, FFF=FRAC- -C - TIONAL DAY (E.G. .5=1200 UTC). -C IECOST - ERROR CODE FOR AN OVERLAND (COASTAL) RECORD. -C MAXCKS - NUMBER OF QUALITY CONTROL CHECKS. SECOND DIMENSION OF -C - ARRAY IEFAIL IS (0:MAXCKS). -C MAXREC - FIRST DIMENSION OF ARRAY IEFAIL. -C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C IEFAIL - INTEGER ARRAY CONTAINING QUALITY MARKS. INDEXING -C - IS ACCORDING TO ARRAY NUMBAD. -C DUMREC - CHARACTER VARIABLE LONG ENOUGH TO HOLD VITAL -C - STATISTICS RECORD. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT HAVE -C - PASSED ALL Q/C CHECKS -C BADREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT HAVE -C - FAILED AT LEAST ONE Q/C CHECK -C -C INPUT FILES: -C UNIT 22 - ALIAS SHORT-TERM HISTORY FILE -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 20 - SCRATCH FILE -C UNIT 21 - ORIGINAL SHORT-TERM HISTORY FILE -C -C REMARKS: RECORDS ARE COPIED FROM THE CURRENT ALIAS SHORT-TERM HISTORY -C FILE TO THE SCRATCH FILE IUNTOK. THE CONTENTS OF IUNTOK -C WILL FINALLY BE COPIED TO THE SHORT-TERM HISTORY FILE -C BY ROUTINE FNLCPY. ORIGINAL RECORDS THAT CONTRIBUTED TO -C MAKING ALIAS RECORDS ARE COPIED TO THE ORIGINAL -C SHORT-TERM HISTORY FILE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE RITSTH(IUNTHA,IUNTHO,IUNTOK,NOKAY,NBAD,DAYMIN,IECOST, - 1 MAXCKS,MAXREC,NUMBAD,IEFAIL,DUMREC,OKAREC,BADREC) - - SAVE - - CHARACTER*(*) DUMREC,OKAREC(NOKAY),BADREC(NBAD) - - DIMENSION IEFAIL(MAXREC,0:MAXCKS),NUMBAD(NBAD) - - ICALL=2 - - REWIND IUNTOK - -C COPY ALL RECORDS FROM THE CURRENT ORIGINAL SHORT-TERM HISTORY -C FILE TO A SCRATCH FILE (IUNTOK) FOR TEMPORARY STORAGE. 
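The bad-record salvage rule used in RITSTH and ADFSTF (a record that failed Q/C is still carried forward for future track checks if its only failure is the overland/coastal code in check 4) can be sketched as below. This is an illustrative Python rendering; the concrete value of `IECOST` is defined elsewhere in the program and is a placeholder here.

```python
IECOST = 15  # placeholder; the real overland error code is set elsewhere

def coastal_only(iefail_row, overland_check=4):
    """True when check 4 carries the overland (coastal) code and no
    other check 1..MAXCKS reports a positive (hard) failure.
    iefail_row is indexed 0..MAXCKS like the Fortran IEFAIL row;
    index 0 is the overall mark and is not inspected here."""
    if iefail_row[overland_check] != IECOST:
        return False
    return all(code <= 0
               for nck, code in enumerate(iefail_row)
               if nck not in (0, overland_check))
```

This mirrors the Fortran `DO NCK=1,MAXCKS` loop, which skips the record (`GO TO 300`) as soon as any non-coastal check is positive.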
- - WRITE(6,1) DAYMIN,ICALL - 1 FORMAT(/'...THE FOLLOWING RECORDS, HAVING DATES GREATER THAN OR ', - 1 'EQUAL TO DAY',F10.3,', WILL BE CHECKED FOR EXACT AND ', - 2 'PARTIAL DUPLICATES '/4X,'(ICALL=',I2,') AND WILL BE ', - 3 'COPIED FROM THE ORIGINAL SHORT-TERM HISTORICAL FILE TO ', - 4 'THE PRELIMINARY QUALITY CONTROLLED FILE'/4X,'(SCRATCH ', - 5 'FILE) FOR TEMPORARY STORAGE:') - - CALL CPYREC(ICALL,IUNTHO,IUNTOK,NOKAY,DAYMIN,DUMREC,OKAREC) - -C NOW ADD THE CURRENT RECORDS. - - WRITE(6,21) - 21 FORMAT(//'...THE FOLLOWING ACCEPTABLE ORIGINAL RECORDS WILL BE ', - 1 'ADDED TO THE NEW ORIGINAL SHORT-TERM HISTORY FILE:'/) - DO NOK=1,NOKAY - IF(OKAREC(NOK)(1:1) .NE. '!') THEN - WRITE(6,23) NOK,OKAREC(NOK) - 23 FORMAT('...',I4,'...',A) - WRITE(IUNTOK,27) OKAREC(NOK) - 27 FORMAT(A) - ENDIF - ENDDO - -C NOW WE APPEND THE SCRATCH FILE WITH RECORDS THAT CONTRIBUTED -C TO ALIAS RECORDS. - - WRITE(6,101) - 101 FORMAT(/'...THE FOLLOWING (BAD) RECORDS WITH RSMCCK OR RCNCIL ', - 1 'ERRORS WILL BE ADDED TO THE SHORT-TERM ORIGINAL'/4X, - 2 'HISTORY FILE:'/) - - DO NBA=1,NBAD - - IF(IEFAIL(NUMBAD(NBA),6) .EQ. 10 .OR. - 1 IEFAIL(NUMBAD(NBA),6) .GE. 21 .OR. - 1 IABS(IEFAIL(NUMBAD(NBA),5)) .EQ. 20) THEN - - DO NCK=1,MAXCKS - IF(NCK .NE. 6 .AND. NCK .NE. 5 .AND. - 1 IEFAIL(NUMBAD(NBA),NCK) .GT. 
0) GO TO 150 - ENDDO - - WRITE(6,131) NBA,BADREC(NBA) - 131 FORMAT('...',I4,'...',A) - WRITE(IUNTOK,133) BADREC(NBA) - 133 FORMAT(A) - - ENDIF - 150 CONTINUE - ENDDO - -C COPY RECORDS THAT ARE MORE RECENT THAN DAYMIN FROM THE -C SCRATCH FILE (IUNTOK) TO THE ORIGINAL SHORT-TERM -C HISTORY FILE - - ICALL=1 - REWIND IUNTOK - REWIND IUNTHO - WRITE(6,151) - 151 FORMAT(/'...THE FOLLOWING RECORDS WILL BE COPIED FROM THE ', - 1 'SCRATCH FILE TO THE NEW ORIGINAL SHORT-TERM HISTORICAL ', - 2 'FILE:') - - CALL CPYREC(ICALL,IUNTOK,IUNTHO,NOKAY,DAYMIN,DUMREC,OKAREC) - - ICALL=3 - - REWIND IUNTOK - -C COPY RECORDS THAT ARE MORE RECENT THAN DAYMIN FROM THE -C CURRENT ALIAS SHORT-TERM HISTORY FILE TO A SCRATCH FILE -C (IUNTOK). THEN ADD THE CURRENT RECORDS. - - CALL CPYREC(ICALL,IUNTHA,IUNTOK,NOKAY,DAYMIN,DUMREC,OKAREC) - - WRITE(6,211) - 211 FORMAT(//'...THE FOLLOWING ACCEPTABLE RECORDS WILL BE ADDED TO ', - 1 'THE NEW ALIAS SHORT-TERM HISTORY FILE:'/) - DO NOK=1,NOKAY - WRITE(6,213) NOK,OKAREC(NOK) - 213 FORMAT('...',I4,'...',A) - WRITE(IUNTOK,217) OKAREC(NOK) - 217 FORMAT(A) - ENDDO - -C ADD RECORDS THAT HAVE OVERLAND POSITIONS TO THE SHORT-TERM -C HISTORY FILE, PROVIDED THEY HAVE NO OTHER ERRORS - - WRITE(6,41) - 41 FORMAT(/'...THE FOLLOWING (BAD) RECORDS WITH COASTAL OVERLAND ', - 1 'POSITIONS WILL BE ADDED TO THE NEW ALIAS SHORT-TERM '/4X, - 2 'HISTORY FILE FOR FUTURE TRACK CHECKS:'/) - - DO NBA=1,NBAD - - IF(IEFAIL(NUMBAD(NBA),4) .EQ. IECOST) THEN - - DO NCK=1,MAXCKS - IF(NCK .NE. 4 .AND. IEFAIL(NUMBAD(NBA),NCK) .GT. 0) GO TO 300 - ENDDO - - WRITE(6,261) NBA,BADREC(NBA) - 261 FORMAT('...',I4,'...',A) - WRITE(IUNTOK,263) BADREC(NBA) - 263 FORMAT(A) - - ENDIF - 300 CONTINUE - ENDDO - -C THE SCRATCH FILE (IUNTOK) NOW CONTAINS ALL RECORDS THAT WILL -C BE IN THE NEW ALIAS SHORT-TERM HISTORY FILE. SUBROUTINE FNLCPY -C WILL COPY THIS SCRATCH FILE TO THE NEW ALIAS SHORT-TERM HISTORY -C FILE. - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . 
-C SUBPROGRAM: RITHIS WRITES RECORDS AND Q/C MARKS TO FILE -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: WRITES ALL INPUT RECORDS AND QUALITY CONTROL MARKS -C ASSIGNED BY THIS PROGRAM TO A LONG-TERM HISTORY FILE. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C -C USAGE: CALL RITHIS(IUNTHI,IEFAIL,NRTOT,IDATE,IUTC,NUMREC,NREC, -C MAXREC,MAXCKS,HROFF,WINCUR,RUNID,LNDFIL,FILES, -C RECORD,ZZZREC,XXXREC) -C INPUT ARGUMENT LIST: -C IUNTHI - UNIT NUMBER FOR THE OUTPUT FILE. NOTE: SIGN OF THE -C - QUALITY MARKS IS ATTACHED TO THIS NUMBER! -C IEFAIL - INTEGER ARRAY CONTAINING QUALITY MARKS. INDEXING -C - IS ACCORDING TO ARRAY NUMREC. SIGN OF THIS NUMBER IS -C - ATTACHED TO IUNTHI! -C NRTOT - TOTAL NUMBER OF RECORDS WRITTEN INTO THE FILE. NREC -C - IS THE NUMBER WRITTEN FOR EACH CALL OF THE ROUTINE. -C IDATE - YYYYMMDD FOR WHICH THE PROGRAM IS BEING RUN. -C IUTC - HHMM FOR WHICH THE PROGRAM IS BEING RUN. -C NUMREC - ARRAY OF RECORD NUMBERS CORRESPONDING TO THE QUALITY -C - MARKS STORED IN ARRAY IEFAIL. -C NREC - NUMBER OF RECORDS TO BE WRITTEN TO THE OUTPUT FILE. -C MAXREC - FIRST DIMENSION OF ARRAY IEFAIL. -C MAXCKS - NUMBER OF QUALITY CONTROL CHECKS. SECOND DIMENSION OF -C - ARRAY IEFAIL IS (0:MAXCKS). -C HROFF - OFFSET (FRACTIONAL HOURS) BETWEEN TIME PROGRAM IS -C - RUN AND THE VALID CYCLE TIME -C WINCUR - TIME WINDOW FOR ADDING RECORDS TO CURRENT FILE -C RUNID - CHARACTER VARIABLE IDENTIFYING RUN -C LNDFIL - LOGICAL VARIABLE, TRUE IF OVER LAND POSITIONS ARE -C - NOT WRITTEN TO CURRENT FILE -C FILES - LOGICAL VARIABLE: TRUE IF SHORT-TERM HISTORY FILES ARE -C - UPDATED. -C RECORD - CHARACTER ARRAY CONTAINING OUTPUT RECORDS. -C ZZZREC - COLUMN HEADER RECORD. -C XXXREC - COLUMN HEADER RECORD. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 61 - CONTAINS HISTORY OF ALL RECORDS THAT ARE OPERATED ON -C - BY THIS PROGRAM -C -C REMARKS: THE HEADER RECORD IS WRITTEN ON THE FIRST CALL OF THIS -C ROUTINE. 
IT CONSISTS OF IDATE,IUTC,NRTOT,NREC,ZZZREC -C AND XXXREC. FOR THE FIRST CALL, NREC CORRESPONDS TO THE -C NUMBER OF RECORDS THAT PASSED THE Q/C CHECKS. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE RITHIS(IUNTHI,IEFAIL,NRTOT,IDATE,IUTC,NUMREC,NREC, - 1 MAXREC,MAXCKS,HROFF,WINCUR,RUNID,LNDFIL,FILES, - 2 RECORD,ZZZREC,XXXREC) - - PARAMETER (MAXSPH=131) - - SAVE - - LOGICAL FIRST,LNDFIL,FILES - - CHARACTER*(*) RUNID,RECORD(NREC),ZZZREC,XXXREC - - PARAMETER (MAXCHR=95) - - DIMENSION IEFAIL(MAXREC,0:MAXCKS),NUMREC(NREC) - - DATA FIRST/.TRUE./ - - IF(FIRST) THEN - FIRST=.FALSE. - IF(MAXCHR+1+3*(MAXCKS+1) .GT. MAXSPH) THEN - WRITE(6,1) MAXCHR,MAXCKS,MAXCHR+1+3*(MAXCKS+1),MAXSPH - 1 FORMAT(/'******INSUFFICIENT SPACE ALLOCATED FOR LONG-TERM ', - 1 'HISTORY FILE.'/7X,'MAXCHR,MAXCK,(REQUIRED,AVAILABLE) ', - 2 ' SPACE ARE:',4I4) - CALL ABORT1(' RITHIS',1) - ENDIF - - NROKAY=NREC - WRITE(IABS(IUNTHI),3) IDATE,IUTC,NRTOT,NROKAY,HROFF,RUNID,LNDFIL, - 1 FILES,WINCUR,ZZZREC(1:MAXCHR),XXXREC - 3 FORMAT('IDATE=',I8,' IUTC=',I4,' NRTOT=',I4,' NROKAY=',I4, - 1 ' HROFF=',F6.2,' RUNID=',A12,' LNDFIL=',L1,' FILES=',L1, - 2 ' WINCUR=',F6.3/A,1X,A) - ENDIF - -C OUTPUT UNIT NUMBER IS NEGATIVE FOR OKAY RECORDS (ERROR CODES ARE -C ALWAYS NEGATIVE). OUTPUT UNIT NUMBER IS POSITIVE FOR BAD -C RECORDS, WHICH MAY HAVE A MIXTURE OF POSITIVE AND NEGATIVE -C ERROR CODES. - - IF(IUNTHI .LT. 0) THEN - DO NR=1,NREC - WRITE(IABS(IUNTHI),5) RECORD(NR)(1:MAXCHR),IEFAIL(NUMREC(NR),0), - 1 (-IABS(IEFAIL(NUMREC(NR),ICK)),ICK=1,MAXCKS) - 5 FORMAT(A,1X,I3,8I3) - ENDDO - - ELSE - DO NR=1,NREC - WRITE(IABS(IUNTHI),5) RECORD(NR)(1:MAXCHR),IEFAIL(NUMREC(NR),0), - 1 (IEFAIL(NUMREC(NR),ICK),ICK=1,MAXCKS) - ENDDO - ENDIF - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: FNLCPY RESETS FILES FOR THE NEXT INPUT CYCLE -C PRGMMR: S. 
LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: RESETS THE FILES CONTAINING THE INPUT RECORDS FOR THE -C NEXT RUN OF THE PROGRAM. THE SHORT-TERM HISTORY FILE IS UPDATED -C AND ALL INPUT FILES ARE FLUSHED; RECORDS THAT BELONG TO A FUTURE -C CYCLE ARE REINSERTED INTO THE INPUT FILES. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C -C USAGE: CALL FNLCPY(IUNTVZ,MAXUNT,IUNTOK,IUNTHA,MAXREC,NTBP,NUMTBP, -C IUNTIN,TBPREC,DUMREC) -C INPUT ARGUMENT LIST: -C IUNTVZ - UNIT NUMBER FOR FIRST INPUT FILE -C MAXUNT - NUMBER OF INPUT FILES TO BE RESET -C IUNTOK - UNIT NUMBER FOR TEMPORARY HISTORY FILE, WHICH CONTAINS -C - QUALITY CONTROLLED RECORDS, INCLUDING THOSE JUST -C - PROCESSED. -C IUNTHA - UNIT NUMBER FOR THE ALIAS SHORT-TERM HISTORY FILE. -C RECORDS ARE COPIED FROM IUNTOK TO IUNTHA. -C MAXREC - MAXIMUM NUMBER OF RECORDS, DIMENSION OF IUNTIN. -C NTBP - NUMBER OF RECORDS FOR THE NEXT CYCLE THAT WILL BE -C - PUT BACK INTO THE INPUT FILES (THROWN BACK INTO THE -C - POND). -C NUMTBP - INTEGER ARRAY CONTAINING INDICES OF RECORDS THAT WILL -C - BE THROWN BACK INTO THE POND. INDICES REFER TO POSITION -C - IN ARRAY IUNTIN. -C IUNTIN - INTEGER ARRAY CONTAINING UNIT NUMBERS FOR RECORDS -C - THAT WILL BE THROWN BACK INTO THE POND. -C TBPREC - CHARACTER ARRAY CONTAINING RECORDS THAT WILL BE -C - THROWN BACK INTO THE POND. -C DUMREC - CHARACTER VARIABLE FOR COPYING RECORDS TO THE -C - SHORT-TERM HISTORY FILE. -C -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 10 - SCRATCH FILE -C UNIT 22 - SHORT-TERM HISTORY, RECORDS BACK 4 DAYS FROM PRESENT -C UNIT 30 - FILE(S) CONTAINING NEW RECORDS TO BE QUALITY -C - CONTROLLED. RECORDS APPROPRIATE TO A FUTURE CYCLE ARE -C - WRITTEN BACK TO THIS FILE -C -C REMARKS: NONE. 
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE FNLCPY(IUNTVZ,MAXUNT,IUNTOK,IUNTHA,MAXREC,NTBP,NUMTBP, - 1 IUNTIN,TBPREC,DUMREC) - - SAVE - - CHARACTER DUMREC*(*),TBPREC(NTBP)*(*) - CHARACTER*100 DUMY2K - - DIMENSION NUMTBP(NTBP),IUNTIN(MAXREC) - -C FINAL COPYING BACK TO SHORT TERM HISTORY FILE AND ZEROING ALL -C FILES THAT WILL CONTAIN NEW RECORDS FOR THE NEXT CYCLE - - NREC=0 - REWIND IUNTOK - REWIND IUNTHA - - 10 CONTINUE - - READ(IUNTOK,11,END=20) DUMREC - 11 FORMAT(A) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(DUMREC(35:35).EQ.'N' .OR. DUMREC(35:35).EQ.'S') THEN - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C ... THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',DUMREC(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntok,'; DUMREC-7: ',dumrec - PRINT *, ' ' - DUMY2K(1:19) = DUMREC(1:19) - IF(DUMREC(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = DUMREC(20:100) - DUMREC = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ DUMREC(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntok,'; DUMREC-7: ',dumrec - PRINT *, ' ' - - ELSE IF(DUMREC(37:37).EQ.'N' .OR. DUMREC(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... 
NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',DUMREC(20:23),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntok,'; DUMREC-7: ',dumrec - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 10 - - END IF - - NREC=NREC+1 - WRITE(IUNTHA,11) DUMREC - GO TO 10 - - 20 CONTINUE - WRITE(6,21) NREC,IUNTHA - 21 FORMAT(/'...',I3,' RECORDS HAVE BEEN COPIED TO THE FUTURE ALIAS ', - 1 'SHORT-TERM HISTORY FILE, UNIT=',I3) - - IUNTVI=IUNTVZ - DO NFILE=1,MAXUNT - REWIND IUNTVI - - IF(NTBP .EQ. 0) THEN - - END FILE IUNTVI - WRITE(6,23) IUNTVI - 23 FORMAT(/'...UNIT',I3,' HAS BEEN ZEROED FOR THE NEXT CYCLE.') - -C THROW RECORDS FOR THE NEXT CYCLE BACK INTO THE POND - - ELSE - - WRITE(6,27) IUNTVI - 27 FORMAT(/'...THE FOLLOWING RECORDS WILL BE THROWN BACK INTO THE ', - 1 'POND = UNIT',I3,':') - - DO NTB=1,NTBP - IF(IUNTIN(NUMTBP(NTB)) .EQ. IUNTVI) THEN - WRITE(IUNTVI,11) TBPREC(NTB) - WRITE(6,29) NTB,NUMTBP(NTB),TBPREC(NTB) - 29 FORMAT(3X,I4,'...',I4,'...',A,'...') - ENDIF - ENDDO - - ENDIF - - IUNTVI=IUNTVI+1 - - ENDDO - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: CPYREC COPIES RECORDS CHECKS DATES & DUPLICATES -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: RECORDS ARE CHECKED FOR DATE AND EXACT AND PARTIAL -C DUPLICATES AND COPIED FROM ONE FILE TO A SECOND FILE. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1992-03-10 S. LORD - ADDED FILTERS. -C -C USAGE: CALL CPYREC(ICALL,IUNTRD,IUNTWT,NOKAY,DAYMN,DUMREC,OKAREC) -C INPUT ARGUMENT LIST: -C ICALL - TOGGLE FOR FILTER. 
1: NO FILTER (STRAIGHT COPY) -C 2: DATE/TIME, STORM ID & NAME -C 3: #2 ABOVE PLUS RSMC (PARTIAL -C DUPLICATE) -C IUNTRD - UNIT NUMBER FOR RECORDS TO BE COPIED -C IUNTWT - RECORDS COPIED TO THIS UNIT NUMBER -C NOKAY - LENGTH OF ARRAY OKAREC -C DAYMN - RECORDS WITH DATES PRIOR TO THIS DAY WILL NOT BE -C - COPIED. DAYMN HAS UNITS OF DDD.FFF, WHERE DDD= -C - JULIAN DAY, FFF=FRACTIONAL DAY (E.G. .5 IS 1200 UTC.) -C DUMREC - CHARACTER VARIABLE LONG ENOUGH TO HOLD COPIED RECORD. -C OKAREC - CHARACTER ARRAY CONTAINING RECORDS AGAINST WHICH -C - EACH COPIED RECORD WILL BE CHECKED FOR EXACT OR -C - PARTIAL DUPLICATES. A PARTIAL DUPLICATE IS ONE WITH -C - THE SAME RSMC, DATE/TIME AND STORM NAME/ID. -C -C INPUT FILES: -C UNIT 20 - SHORT TERM HISTORY -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C UNIT 22 - PRELIMINARY QUALITY CONTROLLED FILE -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE CPYREC(ICALL,IUNTRD,IUNTWT,NOKAY,DAYMN,DUMREC,OKAREC) - - SAVE - - CHARACTER*(*) DUMREC,OKAREC(NOKAY) - CHARACTER*100 DUMY2K - - DIMENSION RINC(5) - - PARAMETER (MAXVIT=15) - - CHARACTER FMTVIT*6 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION FMTVIT(MAXVIT) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 3 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA NUM/1/,FIVMIN/3.4722E-3/ - - NREC=0 - REWIND IUNTRD - - 10 CONTINUE - - READ(IUNTRD,11,END=100) DUMREC - 11 FORMAT(A) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(DUMREC(35:35).EQ.'N' .OR. DUMREC(35:35).EQ.'S') THEN - -C ... 
THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C ... THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',DUMREC(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntrd,'; DUMREC-8: ',dumrec - PRINT *, ' ' - DUMY2K(1:19) = DUMREC(1:19) - IF(DUMREC(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = DUMREC(20:100) - DUMREC = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ DUMREC(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntrd,'; DUMREC-8: ',dumrec - PRINT *, ' ' - - ELSE IF(DUMREC(37:37).EQ.'N' .OR. DUMREC(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',DUMREC(20:23),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntrd,'; DUMREC-8: ',dumrec - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 10 - - END IF - - IF(ICALL .GT. 1) THEN - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 DUMREC) - ENDDO - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) -C WRITE(6,21) IDATEZ,IUTCZ,DAYZ,DAYMN -C 21 FORMAT(/'...CHECKING DATE,TIME FOR COPYING HISTORICAL RECORDS',I9, -C I5,2F10.2) - - IF(DAYZ .GE. DAYMN-FIVMIN) THEN - - DO NOK=1,NOKAY - IF(DUMREC .EQ. 
OKAREC(NOK)) THEN - WRITE(6,27) DUMREC - 27 FORMAT(/'...EXACT DUPLICATE FOUND IN THE NEW AND HISTORICAL ', - 1 'FILES. THE HISTORICAL RECORD WILL NOT BE COPIED.'/8X, - 2 '...',A/) - GO TO 10 - ENDIF - -C CHECK FOR VARIOUS PARTIAL DUPLICATES: -C ICALL = 2: DATE/TIME, STORM ID, STORM NAME FILTER -C ICALL = 3: #2 ABOVE PLUS RSMC, I.E. A PARTIAL DUPLICATE - - IF(ICALL .EQ. 2 .AND. DUMREC(6:ISTVAR(3)-1) .EQ. - 1 OKAREC(NOK)(6:ISTVAR(3)-1)) THEN - WRITE(6,59) DUMREC,OKAREC(NOK) - 59 FORMAT(/'...PARTIAL DUPLICATE IN STORM ID & NAME, DATE AND TIME ', - 1 'FOUND IN THE NEW AND HISTORICAL FILES.'/4X,'THE ', - 2 'HISTORICAL RECORD WILL NOT BE COPIED.'/5X,'HIS...',A/5X, - 3 'NEW...',A/) - GO TO 10 - ENDIF - - IF(ICALL .GE. 3 .AND. DUMREC(1:ISTVAR(3)-1) .EQ. - 1 OKAREC(NOK)(1:ISTVAR(3)-1)) THEN - WRITE(6,69) DUMREC,OKAREC(NOK) - 69 FORMAT(/'...PARTIAL DUPLICATE IN RSMC, STORM ID & NAME, DATE AND', - 1 ' TIME FOUND IN THE NEW AND HISTORICAL FILES.'/4X,'THE ', - 2 'HISTORICAL RECORD WILL NOT BE COPIED.'/5X,'HIS...',A/5X, - 3 'NEW...',A/) - GO TO 10 - ENDIF - - ENDDO - - NREC=NREC+1 - WRITE(6,83) NREC,DUMREC - 83 FORMAT(3X,I4,'...',A,'...') - - WRITE(IUNTWT,11) DUMREC - ENDIF - - ELSE - NREC=NREC+1 - WRITE(6,83) NREC,DUMREC - WRITE(IUNTWT,11) DUMREC - ENDIF - - GO TO 10 - - 100 WRITE(6,101) NREC,IUNTRD,IUNTWT - 101 FORMAT(/'...',I4,' RECORDS HAVE BEEN COPIED FROM UNIT',I3,' TO ', - 1 'UNIT',I3,'.') - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: DUPCHK READS INPUT RECORDS, DUPLICATE CHECKS -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: READS INPUT RECORDS FROM ALL SPECIFIED FILES. CHECKS FOR -C EXACT DUPLICATES. RETURNS ALL RECORDS TO BE QUALITY CONTROLLED. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1992-08-20 S. LORD ADDED NEW UNIT FOR GTS BUFR MESSAGES -C 1997-06-24 S. 
LORD ADDED NEW UNIT FOR MANUALLY ENTERED MESSAGES -C -C USAGE: CALL DUPCHK(IUNTIN,MAXUNT,MAXREC,IERCHK,NUNI,IFILE, -C NUMOKA,DUMREC,UNIREC,DUPREC,*) -C INPUT ARGUMENT LIST: -C IUNTIN - THE INPUT UNIT NUMBER FOR THE FIRST FILE TO BE READ. -C MAXUNT - NUMBER OF INPUT FILES. -C MAXREC - MAXIMUM NUMBER OF INPUT RECORDS. SUBROUTINE -C - RETURNS WITH CONDITION CODE=51 OR 53 WHEN NUMBER OF -C - UNIQUE OR DUPLICATE RECORDS EXCEEDS MAXREC. -C -C OUTPUT ARGUMENT LIST: -C IERCHK - ERROR INDICATOR. -C NUNI - NUMBER OF UNIQUE RECORDS TO BE QUALITY CONTROLLED -C IFILE - INTEGER ARRAY CONTAINING THE UNIT NUMBER FROM WHICH -C - EACH INPUT RECORD WAS READ. -C NUMOKA - INDEX NUMBER FOR EACH UNIQUE RECORD. INDEX NUMBER -C - IS SIMPLY THE ORDINAL NUMBER OF EACH RECORD READ -C - THAT IS UNIQUE, I.E. NOT A DUPLICATE. -C DUMREC - DUMMY CHARACTER VARIABLE LARGE ENOUGH TO READ A RECORD. -C UNIREC - CHARACTER ARRAY HOLDING ALL INPUT RECORDS. -C DUPREC - CHARACTER ARRAY HOLDING ALL DUPLICATE RECORDS. -C * - ALTERNATE RETURN IF NO INPUT RECORDS ARE FOUND. -C - SUBROUTINE RETURNS WITH IERCHK=161. -C -C INPUT FILES: -C UNIT 30 - FILES CONTAINING NEW RECORDS TO BE QUALITY CONTROLLED. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE DUPCHK(IUNTIN,MAXUNT,MAXREC,IERCHK,NUNI,IFILE,NUMOKA, - 1 DUMREC,UNIREC,DUPREC,*) - - PARAMETER (MAXFIL=5) - - SAVE - - LOGICAL UNIQUE - CHARACTER*(*) DUMREC,UNIREC(0:MAXREC),DUPREC(MAXREC) - CHARACTER INPFIL(MAXFIL)*4 - CHARACTER*100 DUMY2K - - DIMENSION NUMOKA(MAXREC),IFILE(MAXREC) - - DATA INPFIL/'NHC ','FNOC','GBTB','GBFR','HBTB'/ - - IF(MAXUNT .GT. MAXFIL) THEN - WRITE(6,1) MAXUNT,MAXFIL - 1 FORMAT(/'******MAXIMUM NUMBER OF UNITS TO BE READ=',I3,' EXCEEDS', - 1 ' EXPECTATIONS. 
NUMBER WILL BE REDUCED TO',I3) - MAXUNT=MAXFIL - ENDIF - - IUNTVI=IUNTIN - IERCHK=0 - NUNI=0 - NDUP=0 - NSTART=0 - NALREC=0 - NRFILE=0 - UNIREC(0)='ZZZZZZZ' - - WRITE(6,3) MAXREC,IUNTVI,MAXUNT,(INPFIL(IFFF), - 1 IUNTIN+IFFF-1,IFFF=1,MAXUNT) - 3 FORMAT(//'...ENTERING DUPCHK: READING FILE AND LOOKING FOR EXACT', - 1 ' DUPLICATES. MAXREC=',I4,'.'/4X,'INITIAL UNIT NUMBER=', - 2 I4,' AND',I3,' UNITS WILL BE READ'/4X,'FILES AND UNIT ', - 3 'NUMBERS ARE:'/(6X,A,':',I3)) - - 10 CONTINUE - - DO NREC=1,MAXREC - READ(IUNTVI,11,END=130) DUMREC - 11 FORMAT(A) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(DUMREC(35:35).EQ.'N' .OR. DUMREC(35:35).EQ.'S') THEN - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR - -C FOR EXAMPLE: - -C NHC 13L MITCH 981028 1800 164N 0858W 270 010 0957 1008 0371 51 019 0278 0278 0185 0185 D -C 123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123 -C 1 2 3 4 5 6 7 8 9 - -C ... 
THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - FOR -C EXAMPLE, THE ABOVE RECORD IS CONVERTED TO: - -C NHC 13L MITCH 19981028 1800 164N 0858W 270 010 0957 1008 0371 51 019 0278 0278 0185 0185 D -C 12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345 -C 1 2 3 4 5 6 7 8 9 - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',DUMREC(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntvi,'; DUMREC-1: ',dumrec - PRINT *, ' ' - DUMY2K(1:19) = DUMREC(1:19) - IF(DUMREC(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = DUMREC(20:100) - DUMREC = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ DUMREC(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntvi,'; DUMREC-1: ',dumrec - PRINT *, ' ' - - ELSE IF(DUMREC(37:37).EQ.'N' .OR. DUMREC(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR - -C FOR EXAMPLE: - -C NHC 13L MITCH 19981028 1800 164N 0858W 270 010 0957 1008 0371 51 019 0278 0278 0185 0185 D -C 12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345 -C 1 2 3 4 5 6 7 8 9 - -C ... NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',DUMREC(20:23),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iuntvi,'; DUMREC-1: ',dumrec - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT '(a,a,a)', '***** Cannot determine if this record ', - $ 'contains a 2-digit year or a 4-digit year - skip it and ', - $ 'try reading the next record' - PRINT *, ' ' - GO TO 100 - - END IF - - NALREC=NALREC+1 - NRFILE=NRFILE+1 - - UNIQUE=.TRUE. 
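The Y2K handling above repeats in several subroutines of this file: a latitude N/S indicator in column 35 marks the old 2-digit-year form, column 37 marks the Y2K-compliant 4-digit form, and 2-digit years are windowed with the rule YY > 20 becomes 19YY, otherwise 20YY. A minimal Python sketch of that rule (illustrative only; `window_tcvitals_year` is a hypothetical helper, not part of this Fortran source):

```python
def window_tcvitals_year(record: str) -> str:
    """Sketch of the tcvitals Y2K windowing described above.

    Column 35 (1-based) holding the latitude N/S indicator means the
    year in columns 20-21 is 2-digit; the indicator in column 37 means
    the year is already 4-digit.  2-digit years window as
    YY > 20 -> 19YY, else 20YY.
    """
    if record[34] in "NS":                        # old 2-digit form
        century = "19" if record[19:21] > "20" else "20"
        return record[:19] + century + record[19:]
    if record[36] in "NS":                        # already Y2K-compliant
        return record
    raise ValueError("cannot determine 2- vs 4-digit year form")

# The MITCH example from the comments above, truncated after the lat/lon
# fields for brevity: "981028" windows to "19981028".
rec = "NHC  13L MITCH     981028 1800 164N 0858W"
converted = window_tcvitals_year(rec)
```

Feeding the converted record back through the function is a no-op, matching the "no conversion necessary" branch of the Fortran.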
- DO NR=NSTART,NUNI - IF(DUMREC .EQ. UNIREC(NR)) UNIQUE=.FALSE. - ENDDO - - IF(UNIQUE) THEN - - IF(NUNI .EQ. MAXREC) THEN - WRITE(6,51) MAXREC - 51 FORMAT('******INSUFFICIENT STORAGE FOR ALL VITAL ', - 1 'STATISTICS RECORDS, MAXREC=',I5) - IERCHK=51 - RETURN - ELSE - NUNI=NUNI+1 - NUMOKA(NUNI)=NUNI - UNIREC(NUNI)=DUMREC - IFILE(NUNI)=IUNTVI - ENDIF - - ELSE - - IF(NDUP .EQ. MAXREC) THEN - WRITE(6,51) MAXREC - IERCHK=53 - RETURN - ELSE - NDUP=NDUP+1 - DUPREC(NDUP)=DUMREC - ENDIF - ENDIF - NSTART=1 - - 100 continue - - ENDDO - - 130 CONTINUE - -C LOOP FOR MORE FILES IF REQUESTED - - IF(NRFILE .EQ. 0) WRITE(6,133) INPFIL(IUNTVI-29) - 133 FORMAT(/'###',A,' FILE IS EMPTY.') - - IUNTVI=IUNTVI+1 - IF(IUNTVI-IUNTIN .LT. MAXUNT) THEN - NRFILE=0 - WRITE(6,141) IUNTVI,MAXUNT - 141 FORMAT(/'...LOOPING TO READ UNIT NUMBER',I3,'. MAXUNT=',I3) - GO TO 10 - ENDIF - - WRITE(6,151) NALREC - 151 FORMAT(//'...TOTAL NUMBER OF RECORDS=',I4) - WRITE(6,153) NUNI,(NUMOKA(NR),UNIREC(NR),NR=1,NUNI) - 153 FORMAT(/'...',I4,' RECORDS ARE UNIQUE, BUT NOT ERROR CHECKED.'// - 1 (' ...',I4,'...',A)) - WRITE(6,157) NDUP,(NR,DUPREC(NR),NR=1,NDUP) - 157 FORMAT(/'...',I4,' RECORDS ARE EXACT DUPLICATES:'//(' ...',I4, - 1 '...',A)) - - IF(NUNI .EQ. 0) THEN - WRITE(6,161) - 161 FORMAT(/'###THERE ARE NO RECORDS TO BE READ. THIS PROGRAM ', - 1 'WILL COMPLETE FILE PROCESSING AND LEAVE AN EMPTY ', - 2 ' "CURRENT" FILE!!') - IERCHK=161 - RETURN 1 - ENDIF - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: BLNKCK CHECKS FOR PROPER COLUMNAR FORMAT -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: CHECKS ALL INPUT RECORDS FOR PROPER COLUMNAR FORMAT. -C THE TABULAR INPUT RECORD HAS SPECIFIED BLANK COLUMNS. IF -C NONBLANK CHARACTERS ARE FOUND IN SPECIFIED BLANK COLUMNS, -C AN OBVIOUS ERROR HAS OCCURRED. THE RECORD IS REJECTED IN THIS -C CASE. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1994-06-20 S. 
LORD MODIFIED MAXCHK FOR THE GFDL FORMAT -C -C USAGE: CALL BLNKCK(NTEST,NOKAY,NBAD,IFBLNK,NUMTST,NUMOKA,NUMBAD, -C ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) -C INPUT ARGUMENT LIST: -C NTEST - NUMBER OF RECORDS TO BE TESTED. -C NUMTST - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE TESTED. -C ZZZREC - CHARACTER VARIABLE CONTAINING VARIABLE NAMES. -C NNNREC - CHARACTER VARIABLE CONTAINING COLUMN NUMBERS. -C TSTREC - CHARACTER ARRAY CONTAINING RECORDS TO BE TESTED. -C -C OUTPUT ARGUMENT LIST: -C NOKAY - NUMBER OF RECORDS THAT PASSED THE BLANK CHECK. -C NBAD - NUMBER OF RECORDS THAT FAILED THE BLANK CHECK. -C IFBLNK - INTEGER ARRAY CONTAINING ERROR CODE FOR EACH INPUT -C - RECORD. SEE COMMENTS IN PGM FOR KEY TO ERROR CODES. -C NUMOKA - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH GOOD -C - RECORD. -C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C BADREC - CHARACTER ARRAY CONTAINING BAD RECORDS THAT FAILED -C - THE BLANK CHECK. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT PASSED -C - THE BLANK CHECK. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE BLNKCK(NTEST,NOKAY,NBAD,IFBLNK,NUMTST,NUMOKA,NUMBAD, - 1 ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) - - PARAMETER (MAXCHK=95) - PARAMETER (NERCBL=3) - PARAMETER (MAXREC=1000) - - SAVE - - CHARACTER*(*) ZZZREC,NNNREC,TSTREC(0:NTEST),BADREC(MAXREC), - 1 OKAREC(NTEST) - CHARACTER ERCBL(NERCBL)*60 - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - - CHARACTER NAMVAR*5 - - DIMENSION ISTVAR(MAXVIT) - - DIMENSION NAMVAR(MAXVIT+1) - - DIMENSION IFBLNK(MAXREC),NUMOKA(NTEST),NUMBAD(MAXREC), - 1 NUMTST(NTEST) - - DATA ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 1 LENHED/18/ - - DATA NAMVAR/'DATE ','TIME ','LAT. 
','LONG.','DIR ','SPEED', - 1 'PCEN ','PENV ','RMAX ','VMAX ','RMW ','R15NE', - 2 'R15SE','R15SW','R15NW','DEPTH'/ - - DATA ERCBL - 1 /'1 : LAST NON-BLANK CHARACTER IS IN THE WRONG COLUMN ', - 2 '18 : FIRST 18 COLUMNS ARE BLANK ', - 3 '19-87: FIRST NON-BLANK CHARACTER FOUND IN THIS COLUMN '/ - -C ERROR CODES FOR BAD RECORDS RETURNED IN IFBLNK ARE AS FOLLOWS: -C 1: LAST NON-BLANK CHARACTER IS IN THE WRONG COLUMN -C 18 : FIRST 18 COLUMNS ARE BLANK -C 19-87: NON-BLANK CHARACTER FOUND IN A BLANK COLUMN. ERROR -C CODE GIVES COLUMN OF LEFT-MOST OCCURRENCE - -C SET COUNTERS FOR INITIAL SORTING OF ALL RECORDS. ALL SUBSEQUENT -C CALLS TO THIS ROUTINE SHOULD BE FOR SINGLE RECORDS - - WRITE(6,1) NTEST - 1 FORMAT(//'...ENTERING BLNKCK, LOOKING FOR WRONGLY POSITIONED ', - 1 ' BLANKS. NTEST=',I4//) - - NADD=0 - IF(NREC .GT. 0) THEN - NOKAY=0 - NBAD =0 - ENDIF - -C DO ALL RECORDS - - DO NREC=1,NTEST - IETYP=0 - -C FIND THE RIGHT-MOST NON-BLANK CHARACTER: IT SHOULD CORRESPOND -C TO THE MAXIMUM NUMBER OF CHARACTERS IN THE MESSAGE (MAXCHR) - - DO ICH=MAXCHK,1,-1 - IF(TSTREC(NREC)(ICH:ICH) .NE. ' ') THEN - IBLANK=ICH - GO TO 31 - ENDIF - ENDDO - 31 CONTINUE -C WRITE(6,3311) IBLANK,TSTREC(NREC)(1:IBLANK) -C3311 FORMAT(/'...TESTING LENGTH OF RECORD, IBLANK,TSTREC=',I4/4X,'...', -C 1 A,'...') -C - IF(IBLANK .NE. MAXCHR) THEN - IETYP=1 - WRITE(6,33) NREC,IBLANK,NNNREC,ZZZREC,TSTREC(NREC) - 33 FORMAT(/'...RECORD #',I3,' HAS RIGHT-MOST NON-BLANK CHARACTER ', - 1 'AT POSITION',I4/2(1X,'@@@',A,'@@@'/),4X,A) - GO TO 41 - ENDIF - -C CHECK FOR BLANKS IN THE HEADER SECTION (THE FIRST 18 COLUMNS) - - IF(TSTREC(NREC)(1:LENHED) .EQ. ' ') THEN - IETYP=LENHED - WRITE(6,35) NREC,NNNREC,ZZZREC,TSTREC(NREC) - 35 FORMAT(/'...RECORD #',I3,' HAS BLANK HEADER SECTION.'/2(1X,'@@@', - 1 A,'@@@'/),4X,A) - ENDIF - -C CHECK COLUMN BLANKS STARTING TO THE LEFT OF THE YYMMDD GROUP - - DO IBL=1,MAXVIT - IF(TSTREC(NREC)(ISTVAR(IBL)-1:ISTVAR(IBL)-1) .NE. 
' ') THEN - IETYP=ISTVAR(IBL)-1 - WRITE(6,39) TSTREC(NREC)(ISTVAR(IBL)-1:ISTVAR(IBL)-1), - 1 ISTVAR(IBL)-1,NAMVAR(IBL),NNNREC,ZZZREC,TSTREC(NREC) - 39 FORMAT(/'...NONBLANK CHARACTER ',A1,' AT POSITION ',I3, - 1 ' PRECEDING VARIABLE',1X,A/2(1X,'@@@',A,'@@@'/),4X,A) - GO TO 41 - ENDIF - ENDDO - - 41 IFBLNK(NUMTST(NREC))=IETYP - IF(IETYP .GT. 0) THEN - NADD=NADD+1 - NUMBAD(NADD+NBAD)=NUMTST(NREC) - BADREC(NADD+NBAD)=TSTREC(NREC) - ELSE - NOKAY=NOKAY+1 - NUMOKA(NOKAY)=NUMTST(NREC) - OKAREC(NOKAY)=TSTREC(NREC) - ENDIF - - ENDDO - - print *, ' ' - IF(NTEST .GT. 1) THEN - WRITE(6,101) NOKAY,NADD,NTEST,(ERCBL(NER),NER=1,NERCBL) - 101 FORMAT(/'...RESULTS OF THE GLOBAL BLANK CHECK ARE: NOKAY=',I4, - 1 ' AND NADD=',I4,' FOR A TOTAL OF ',I4,' RECORDS.'//4X, - 2 'ERROR CODES ARE:'/(6X,A)) - WRITE(6,103) - 103 FORMAT(/'...OKAY RECORDS ARE:',100X,'ERC'/) - DO NOK=1,NOKAY - WRITE(6,109) NOK,NUMOKA(NOK),OKAREC(NOK),IFBLNK(NUMOKA(NOK)) - 109 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - IF(NADD .GT. 0) WRITE(6,111) (NBAD+NBA,NUMBAD(NBAD+NBA), - 1 BADREC(NBAD+NBA), - 2 IFBLNK(NUMBAD(NBAD+NBA)), - 3 NBA=1,NADD) - 111 FORMAT(/' ADDED BAD RECORDS ARE:',95X,'ERC'/(3X,I4,'...',I4, - 1 '...',A,'...',I3)) - NBAD=NBAD+NADD - ELSE - WRITE(6,113) IETYP,TSTREC(NTEST),NOKAY - 113 FORMAT(/'...BLANK TEST FOR SINGLE RECORD, BLANK ERROR CODE=',I2, - 1 ' RECORD IS:'/4X,'...',A/4X,'NOKAY=',I2) - ENDIF - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: READCK CHECKS READABILITY OF EACH RECORD -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: CHECKS READABILITY OF EACH RECORD. SINCE THE INPUT -C RECORD FORMAT CONTAINS ONLY NUMBERS AND LETTERS, -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1992-09-18 S. J. LORD ADDED CHECK FOR CORRECT MISSING DATA IN READCK -C -C USAGE: CALL READCK(NTEST,NOKAY,NBAD,IFREAD,NUMTST,NUMOKA,NUMBAD, -C ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) -C INPUT ARGUMENT LIST: -C NTEST - NUMBER OF RECORDS TO BE TESTED.
-C NUMTST - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE TESTED. -C ZZZREC - CHARACTER VARIABLE CONTAINING VARIABLE NAMES. -C NNNREC - CHARACTER VARIABLE CONTAINING COLUMN NUMBERS. -C TSTREC - CHARACTER ARRAY CONTAINING RECORDS TO BE TESTED. -C -C OUTPUT ARGUMENT LIST: -C NOKAY - NUMBER OF RECORDS THAT PASSED THE BLANK CHECK. -C NBAD - NUMBER OF RECORDS THAT FAILED THE BLANK CHECK. -C IFREAD - INTEGER ARRAY CONTAINING ERROR CODE FOR EACH INPUT -C - RECORD. SEE COMMENTS IN PGM FOR KEY TO ERROR CODES. -C NUMOKA - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH GOOD -C - RECORD. -C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C BADREC - CHARACTER ARRAY CONTAINING BAD RECORDS THAT FAILED -C - THE BLANK CHECK. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT PASSED -C - THE BLANK CHECK. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE READCK(NTEST,NOKAY,NBAD,IFREAD,NUMTST,NUMOKA,NUMBAD, - 1 ZZZREC,NNNREC,TSTREC,BADREC,OKAREC) - - PARAMETER (NERCRD=2) - PARAMETER (MAXREC=1000) - - SAVE - - CHARACTER*(*) ZZZREC,NNNREC,TSTREC(0:NTEST),BADREC(MAXREC), - 1 OKAREC(NTEST),ERCRD(NERCRD)*60 - - PARAMETER (MAXVIT=15) - PARAMETER (ITERVR=10) - - CHARACTER FMTVIT*6,NAMVAR*5 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION NAMVAR(MAXVIT+1),FMTVIT(MAXVIT),MISSNG(MAXVIT) - - DIMENSION IFREAD(MAXREC),NUMOKA(NTEST),NUMBAD(MAXREC), - 1 NUMTST(NTEST) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 MISSNG/-9999999,-999,-99,-999,2*-99,3*-999,-9,-99,4*-999/, - 3 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 4 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA NAMVAR/'DATE ','TIME ','LAT. 
','LONG.','DIR ','SPEED', - 1 'PCEN ','PENV ','RMAX ','VMAX ','RMW ','R15NE', - 2 'R15SE','R15SW','R15NW','DEPTH'/ - - DATA NUM/1/ - - DATA ERCRD - 1 /'N: INDEX OF THE FIRST UNREADABLE RECORD ', - 2 '20-N: WRONG MISSING CODE '/ - -C ERROR CODE FOR UNREADABLE RECORD IS THE INDEX OF THE FIRST -C UNREADABLE RECORD. -C ***NOTE: THERE MAY BE ADDITIONAL UNREADABLE RECORDS TO THE -C RIGHT. - - WRITE(6,1) NTEST - 1 FORMAT(//'...ENTERING READCK, LOOKING FOR UNREADABLE (NOT ', - 1 ' CONTAINING INTEGERS) PRIMARY AND SECONDARY VARIABLES,', - 2 ' NTEST=',I4//) - - NADD=0 - -C DO ALL RECORDS - - DO NREC=1,NTEST - IETYP=0 - - DO IV=1,ITERVR - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NREC)) - IF(IERDEC .NE. 0) THEN - IETYP=IV - WRITE(6,7) NREC,ISTVAR(IV),NAMVAR(IV),NNNREC,ZZZREC,TSTREC(NREC) - 7 FORMAT(/'...RECORD #',I3,' IS UNREADABLE AT POSITION',I3, - 1 ' FOR VARIABLE ',A,'.'/2(1X,'@@@',A,'@@@'/),4X,A) - GO TO 11 - ENDIF - ENDDO - 11 CONTINUE - - DO IV=1,ITERVR - IF(IVTVAR(IV) .LT. 0 .AND. IVTVAR(IV) .NE. MISSNG(IV)) THEN - IETYP=20-IV - WRITE(TSTREC(NREC) (ISTVAR(IV):IENVAR(IV)),FMTVIT(IV))MISSNG(IV) - ENDIF - ENDDO - - IFREAD(NUMTST(NREC))=IETYP - IF(IETYP .GT. 0) THEN - NADD=NADD+1 - NUMBAD(NADD+NBAD)=NUMTST(NREC) - BADREC(NADD+NBAD)=TSTREC(NREC) - ELSE - NOKAY=NOKAY+1 - NUMOKA(NOKAY)=NUMTST(NREC) - OKAREC(NOKAY)=TSTREC(NREC) - ENDIF - - ENDDO - - WRITE(6,101) NOKAY,NADD,NTEST,(ERCRD(NER),NER=1,NERCRD) - 101 FORMAT(//'...RESULTS OF THE READABILITY CHECK ARE: NOKAY=',I4, - 1 ' AND NADD=',I4,' FOR A TOTAL OF ',I4,' RECORDS.'//4X, - 2 'ERROR CODES ARE:'/(6X,A)) - WRITE(6,103) - 103 FORMAT(/'...OKAY RECORDS ARE:',100X,'ERC'/) - DO NOK=1,NOKAY - WRITE(6,109) NOK,NUMOKA(NOK),OKAREC(NOK),IFREAD(NUMOKA(NOK)) - 109 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - IF(NADD .GT. 
0) WRITE(6,111) (NBAD+NBA,NUMBAD(NBAD+NBA), - 1 BADREC(NBAD+NBA), - 2 IFREAD(NUMBAD(NBAD+NBA)), - 3 NBA=1,NADD) - 111 FORMAT(/' ADDED BAD RECORDS ARE:',95X,'ERC'/(3X,I4,'...',I4, - 1 '...',A,'...',I3)) - NBAD=NBAD+NADD - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: DTCHK CHECK FOR VALID DATE FOR ALL RECORDS -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: CHECKS FOR VALID DATE IN ALL RECORDS. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C -C USAGE: CALL DTCHK(NTEST,NOKAY,NBAD,NTBP,IFDTCK,NUMTST,NUMOKA, -C NUMBAD,NUMTBP,DAYMN,DAYMX1,DAYMX2,DAYOFF,TSTREC, -C BADREC,OKAREC,TBPREC) -C INPUT ARGUMENT LIST: -C NTEST - NUMBER OF RECORDS TO BE TESTED. -C NUMTST - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE TESTED. -C DAYMN - EARLIEST (MINIMUM) DATE FOR ACCEPTANCE OF A RECORD. -C - UNITS ARE DDD.FFF, WHERE DDD=JULIAN DAY, FFF=FRAC- -C - TIONAL DAY (E.G. .5=1200 UTC). -C DAYMX1 - LATEST (MAXIMUM) DATE FOR ACCEPTANCE OF A RECORD. -C - UNITS ARE FRACTIONAL JULIAN DAYS AS IN DAYMN ABOVE. -C DAYMX2 - EARLIEST (MINIMUM) DATE FOR REJECTION OF A RECORD. -C - RECORDS WITH DATES BETWEEN DAYMX1 AND DAYMX2 ARE -C - ASSUMED TO BELONG TO A FUTURE CYCLE AND ARE THROWN -C - BACK INTO THE POND, I.E. NEITHER REJECTED NOR ACCEPTED. -C - UNITS ARE FRACTIONAL JULIAN DAYS AS IN DAYMN ABOVE. -C DAYOFF - OFFSET DAYS WHEN ACCEPTANCE WINDOW CROSSES YEAR -C BOUNDARY -C TSTREC - CHARACTER ARRAY CONTAINING RECORDS TO BE TESTED. -C -C OUTPUT ARGUMENT LIST: -C NOKAY - NUMBER OF RECORDS THAT PASSED THE BLANK CHECK. -C NBAD - NUMBER OF RECORDS THAT FAILED THE BLANK CHECK. -C NTBP - NUMBER OF RECORDS THAT ARE TO BE RESTORED TO THE -C - INPUT FILES (THROWN BACK INTO THE POND). -C IFDTCK - INTEGER ARRAY CONTAINING ERROR CODE FOR EACH INPUT -C - RECORD. SEE COMMENTS IN PGM FOR KEY TO ERROR CODES. -C NUMOKA - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH GOOD -C - RECORD.
-C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C NUMTBP - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE THROWN BACK INTO THE POND. -C BADREC - CHARACTER ARRAY CONTAINING BAD RECORDS THAT FAILED -C - THE BLANK CHECK. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT PASSED -C - THE BLANK CHECK. -C TBPREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT ARE TO -C - BE THROWN BACK INTO THE POND. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE DTCHK(NTEST,NOKAY,NBAD,NTBP,IFDTCK,NUMTST,NUMOKA, - 1 NUMBAD,NUMTBP,DAYMN,DAYMX1,DAYMX2,DAYOFF,TSTREC, - 2 BADREC,OKAREC,TBPREC) - - PARAMETER (NERCDT=8) - PARAMETER (MAXREC=1000) - PARAMETER (MAXTBP=20) - - SAVE - - CHARACTER*(*) TSTREC(0:NTEST),BADREC(MAXREC),OKAREC(NTEST), - 1 TBPREC(MAXTBP),ERCDT(NERCDT)*60 - - PARAMETER (MAXVIT=15) - - CHARACTER FMTVIT*6 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION FMTVIT(MAXVIT) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - DIMENSION RINC(5) - - DIMENSION IFDTCK(MAXREC),NUMOKA(NTEST),NUMBAD(MAXREC), - 1 NUMTST(NTEST),NUMTBP(MAXTBP),IDAMX(12) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 3 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA NUM/1/,IYRMN/0/,IYRMX/9999/,IMOMN/1/,IMOMX/12/,IDAMN/1/, - 1 IDAMX/31,29,31,30,31,30,31,31,30,31,30,31/,IHRMN/0/, - 2 IHRMX/23/,IMINMN/0/,IMINMX/59/ - - DATA ERCDT - 1 /' 1: YEAR OUT OF RANGE ', - 2 ' 2: MONTH OUT OF RANGE ', - 3 ' 3: DAY OUT OF RANGE ', - 4 ' 4: HOUR OUT OF RANGE ', - 5 ' 5: MINUTE OUT OF RANGE ', - 6 ' 6: DATE/TIME LESS THAN ALLOWED WINDOW ', - 7 ' 7: DATE/TIME GREATER THAN ALLOWED MAXIMUM WINDOW ', - 8 '-8: DATE/TIME PROBABLY VALID AT LATER CYCLE TIME (TBIP) '/ - -C ERROR CODES FOR BAD RECORDS RETURNED IN 
IFDTCK ARE AS FOLLOWS: -C 1: YEAR OUT OF RANGE -C 2: MONTH OUT OF RANGE -C 3: DAY OUT OF RANGE -C 4: HOUR OUT OF RANGE -C 5: MINUTE OUT OF RANGE -C 6: DATE/TIME LESS THAN ALLOWED WINDOW -C 7: DATE/TIME GREATER THAN ALLOWED WINDOW -C -8: DATE/TIME PROBABLY VALID AT LATER CYCLE TIME (THROWN BACK -C INTO THE POND) - - WRITE(6,1) NTEST,NOKAY,NBAD,DAYMN,DAYMX1,DAYMX2 - 1 FORMAT(//'...ENTERING DTCHK, LOOKING FOR BAD DATE/TIME GROUPS. ', - 1 'NTEST,NOKAY,NBAD=',3I4,'.'/4X,'DAYMN,DAYMX1,DAYMX2=', - 2 3F12.4//) - - NADD=0 - NTBPZ=0 - DO NREC=1,NTEST - - IETYP=0 - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NREC)) - ENDDO - -C CONVERT DATE/TIME TO FLOATING POINT DATE - - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - - IF(IYR .LT. IYRMN .OR. IYR .GT. IYRMX) THEN - IETYP=1 - WRITE(6,21) IYR,IYRMN,IYRMX,TSTREC(NREC) - 21 FORMAT(/'******DECODED YEAR OUT OF ALLOWED BOUNDS, IYR,IYRMN,', - 1 'IYRMX,RECORD=',3I9/8X,A) - ENDIF - - IF(IMO .LT. IMOMN .OR. IMO .GT. IMOMX) THEN - IETYP=2 - WRITE(6,31) IMO,IMOMN,IMOMX,TSTREC(NREC) - 31 FORMAT(/'******DECODED MONTH OUT OF ALLOWED BOUNDS, IMO,IMOMN,', - 1 'IMOMX,RECORD=',3I9/8X,A/5X,'...(DAY WILL NOT BE CHECKED)') - - ELSE - IF(IDA .LT. IDAMN .OR. IDA .GT. IDAMX(IMO)) THEN - IETYP=3 - WRITE(6,41) IDA,IDAMN,IDAMX,TSTREC(NREC) - 41 FORMAT(/'******DECODED DAY OUT OF ALLOWED BOUNDS, IDA,IDAMN,', - 1 'IDAMX,RECORD=',3I9/8X,A) - ENDIF - ENDIF - - IF(IHR .LT. IHRMN .OR. IHR .GT. IHRMX) THEN - IETYP=4 - WRITE(6,51) IHR,IHRMN,IHRMX,TSTREC(NREC) - 51 FORMAT(/'******DECODED HOUR OUT OF ALLOWED BOUNDS, IHR,IHRMN,', - 1 'IHRMX,RECORD=',3I9/8X,A) - ENDIF - - IF(IMIN .LT. IMINMN .OR. IMIN .GT. 
IMINMX) THEN - IETYP=5 - WRITE(6,61) IMIN,IMINMN,IMINMX,TSTREC(NREC) - 61 FORMAT(/'******DECODED MINUTE OUT OF ALLOWED BOUNDS, IMIN,', - 1 'IMINMN,IMINMX,RECORD=',3I9/8X,A) - ENDIF - - IF(IETYP .EQ. 0 .AND. DAYZ+DAYOFF .LT. DAYMN) THEN - IETYP=6 - WRITE(6,71) DAYZ,DAYMN,TSTREC(NREC) - 71 FORMAT(/'******DECODED DAY LESS THAN MINIMUM WINDOW, DAYZ,DAYMN,', - 1 'RECORD=',2F12.4/8X,A) - ENDIF - - IF(IETYP .EQ. 0 .AND. DAYZ+DAYOFF .GT. DAYMX2) THEN - IETYP=7 - WRITE(6,73) DAYZ,DAYMX2,TSTREC(NREC) - 73 FORMAT(/'******DECODED DAY EXCEEDS MAXIMUM WINDOW, DAYZ,DAYMX2,', - 1 'RECORD=',2F12.4/8X,A) - ENDIF - - IF(IETYP .EQ. 0 .AND. DAYZ .GT. DAYMX1) THEN - IETYP=-8 - WRITE(6,77) DAYZ,DAYMX1,TSTREC(NREC) - 77 FORMAT(/'###DECODED DAY PROBABLY VALID AT FUTURE CYCLE TIME. ', - 1 'DAYZ,DAYMX1,RECORD=',2F12.4/8X,A/4X, 'THIS RECORD WILL ', - 2 'BE THROWN BACK IN THE POND.') - ENDIF - - IFDTCK(NUMTST(NREC))=IETYP - IF(IETYP .GT. 0) THEN - NADD=NADD+1 - NUMBAD(NADD+NBAD)=NUMTST(NREC) - BADREC(NADD+NBAD)=TSTREC(NREC) - ELSE IF(IETYP .EQ. 0) THEN - NOKAY=NOKAY+1 - NUMOKA(NOKAY)=NUMTST(NREC) - OKAREC(NOKAY)=TSTREC(NREC) - ELSE - NTBPZ=NTBPZ+1 - NUMTBP(NTBPZ)=NUMTST(NREC) - TBPREC(NTBPZ)=TSTREC(NREC) - ENDIF - - ENDDO - - NTBP=NTBPZ - WRITE(6,101) NOKAY,NADD,NTBP,NTEST,(ERCDT(NER),NER=1,NERCDT) - 101 FORMAT(//'...RESULTS OF THE DATE/TIME CHECK ARE: NOKAY=',I4, - 1 ' ,NADD=',I4,' AND NTBP=',I4,' FOR A TOTAL OF',I4, - 2 ' RECORDS.'//4X,'ERROR CODES ARE:'/(6X,A)) - - WRITE(6,103) - 103 FORMAT(/'...OKAY RECORDS ARE:',100X,'ERC'/) - DO NOK=1,NOKAY - WRITE(6,109) NOK,NUMOKA(NOK),OKAREC(NOK),IFDTCK(NUMOKA(NOK)) - 109 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - - WRITE(6,113) - 113 FORMAT(/'...RECORDS THAT WILL BE RETURNED TO THE INPUT FILES ', - 1 '(THROWN BACK INTO THE POND) ARE:',36X,'ERC'/) - DO NTB=1,NTBP - WRITE(6,119) NTB,NUMTBP(NTB),TBPREC(NTB), - 1 IFDTCK(NUMTBP(NTB)) - 119 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - - IF(NADD .GT. 
0) WRITE(6,131) (NBAD+NBA,NUMBAD(NBAD+NBA), - 1 BADREC(NBAD+NBA), - 2 IFDTCK(NUMBAD(NBAD+NBA)), - 3 NBA=1,NADD) - 131 FORMAT(/' ADDED BAD RECORDS ARE:',95X,'ERC'/(3X,I4,'...',I4, - 1 '...',A,'...',I3)) - NBAD=NBAD+NADD - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: SETMSK CHECKS ALL RECORDS FOR CORRECT LAT/LON -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: INPUT RECORDS ARE CHECKED FOR PHYSICALLY REALISTIC -C LATITUDE AND LONGITUDE (-70 < LAT < 70) [...] - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',SCRATC(NCHECK)(20:21),'"' - PRINT *, ' ' - PRINT *, 'From unit ',iuntho,'; SCRATC(NCHECK)-9: ', - $ scratc(ncheck) - PRINT *, ' ' - DUMY2K(1:19) = SCRATC(NCHECK)(1:19) - IF(SCRATC(NCHECK)(20:21).GT.'20') THEN - DUMY2K(20:21) = '19' - ELSE - DUMY2K(20:21) = '20' - ENDIF - DUMY2K(22:100) = SCRATC(NCHECK)(20:100) - SCRATC(NCHECK) = DUMY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ SCRATC(NCHECK)(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT *, 'From unit ',IUNTHO,'; SCRATC(NCHECK)-9: ', - $ scratc(ncheck) - PRINT *, ' ' - - ELSE IF(SCRATC(NCHECK)(37:37).EQ.'N' .OR. - 1 SCRATC(NCHECK)(37:37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ...
NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',SCRATC(NCHECK)(20:23),'"' - PRINT *, ' ' - PRINT *, 'From unit ',iuntho,'; SCRATC(NCHECK)-9: ', - $ SCRATC(NCHECK) - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 30 - - END IF - - WRITE(6,19) NCHECK,SCRATC(NCHECK) - NCPY=NCPY+1 - NCHECK=NCHECK+1 - GO TO 30 - - 40 CONTINUE - NCHECK=NCHECK-1 - WRITE(6,41) NCPY,NCHECK - 41 FORMAT('...',I3,' RECORDS COPIED FOR A TOTAL OF ',I4,' TO BE ', - 1 'CHECKED.') - - NADD=0 - DO NREC=1,NTEST - -C INITIALIZE THE CHARACTER STRING AND ERROR CODE - - BUFINZ=TSTREC(NREC) - IETYP=0 - NDUP =0 - -C SET THE FLAG FOR ERROR TYPE=4 (PREVIOUS RECORD WITH DUPLICATE -C RSMC, DATE/TIME AND STORM ID APPEARS TO BE VALID) - -C RECORDS THAT WERE MARKED ERRONEOUS EARLIER DO NOT RECEIVE -C FURTHER PROCESSING WITH THIS VERSION OF THE CODE. - - IF(IDUPID(NREC) .GT. 0) THEN - IETYP=IDUPID(NREC) - GO TO 190 - ENDIF - -C BASIN CHECK - - NIDBSN=999 - DO NBA=1,NBASIN - IF(STMIDZ(3:3) .EQ. IDBASN(NBA)) THEN - NIDBSN=NBA - ENDIF - ENDDO - - IF(NIDBSN .GT. 130) THEN - IETYP=1 - WRITE(6,51) NREC,STMIDZ(3:3),(IDBASN(NBA),NBA=1,NBASIN),NNNREC, - 1 ZZZREC,TSTREC(NREC) - 51 FORMAT(/'******RECORD #',I3,' HAS BAD BASIN CODE=',A1,'. ALLOWED', - 2 ' CODES ARE:',1X,11(A1,1X)/2(1X,'@@@',A,'@@@'/),4X,A) - -C CHECK THAT THE LAT/LON CORRESPONDS TO A VALID BASIN - - ELSE - DO IV=3,4 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NREC)) - VITVAR(IV)=REAL(IVTVAR(IV))*VITFAC(IV) - ENDDO - IF(LATNS .EQ. 'S') STMLTZ=-STMLTZ - IF(LONEW .EQ. 'W') STMLNZ=360.-STMLNZ - CALL BASNCK(STMIDZ,STMLTZ,STMLNZ,NBAZ,IPRT,IER) - IF(IER .EQ. 
3) THEN - IETYP=6 - WRITE(6,61) NREC,STMIDZ,STMLTZ,STMLNZ,IETYP,NNNREC,ZZZREC, - 1 TSTREC(NREC) - 61 FORMAT(/'******RECORD #',I3,' WITH STMID=',A,' HAS LAT/LON ', - 1 'OUTSIDE BASIN LIMITS. LAT/LON=',2F9.1,' IETYP=',I3/ - 2 2(1X,'@@@',A,'@@@'/),4X,A) - ENDIF - ENDIF - - IF(IETYP .EQ. 0) THEN - -C CHECK CODED STORM ID NUMBER: ID NUMBERS GREATER >= 80 ARE -C CONSIDERED ERRONEOUS. ! CHG. TESTID - - CALL DECVAR(ISTIDC,ISTIDC+ITWO-1,KSTORM,IERDEC,'(I2.2)', - 1 STMIDZ) - IF(KSTORM .LT. 1 .OR. KSTORM .GE. ISTMAX .OR. IERDEC .NE. 0) THEN - IETYP=2 - IF(KSTORM .GE. ISTMAX .AND. KSTORM .LT. 100) THEN - WRITE(6,94) NREC,STMIDZ(ISTIDC:ISTIDC+ITWO-1),NNNREC,ZZZREC, - 1 TSTREC(NREC) - 94 FORMAT(/'******RECORD #',I3,' HAS TEST STORM NUMBER=',A2, - 1 ' -- CONSIDER IT BAD'/2(1X,'@@@',A,'@@@'/),4X,A) - ELSE - WRITE(6,63) NREC,STMIDZ(ISTIDC:ISTIDC+ITWO-1),NNNREC,ZZZREC, - 1 TSTREC(NREC) - 63 FORMAT(/'******RECORD #',I3,' HAS BAD STORM NUMBER=',A2/ - 1 2(1X,'@@@',A,'@@@'/),4X,A) - END IF - ENDIF - -C CHECK CONSISTENCY BETWEEN STORM NAME AND STORM ID, PRESENT AND -C PAST. FIRST, CHECK FOR EXACT DUPLICATES IN THE INPUT AND -C SHORT-TERM HISTORY FILES. - - IF(IETYP .EQ. 0) THEN - DO NCK=NCHECK,NREC+1,-1 - BUFINX=SCRATC(NCK) - - IF(NCK .GT. NTEST .AND. BUFINZ(1:IFSTFL-1) .EQ. - 1 BUFINX(1:IFSTFL-1) .AND. - 2 BUFINZ(IFSTFL+1:MAXCHR) .EQ. - 3 BUFINX(IFSTFL+1:MAXCHR)) THEN - IETYP=9 - WRITE(6,64) NREC,NCK,NNNREC,ZZZREC,TSTREC(NREC),SCRATC(NCK) - 64 FORMAT(/'******RECORD #',I3,' IS IDENTICAL TO RECORD #',I3, - 1 ' WHICH IS FROM THE ORIGINAL SHORT-TERM HISTORY FILE.'/4X, - 2 'RECORDS ARE:'/2(1X,'@@@',A,'@@@'/),2(4X,A/)) - GO TO 71 - ENDIF - - IF(RSMCX .EQ. RSMCZ) THEN - -C DISABLE THE FOLLOWING TWO CHECKS IN THE CASE OF A CARDINAL -C TROPICAL STORM IDENTIFIER - - DO NCARD=1,NCRDMX - IF(STMNMZ(1:ICRDCH(NCARD)) .EQ. CARDNM(NCARD)(1:ICRDCH(NCARD)) - 1 .OR. - 2 STMNMX(1:ICRDCH(NCARD)) .EQ. 
CARDNM(NCARD)(1:ICRDCH(NCARD))) - 3 THEN - WRITE(6,1147) STMNMZ(1:ICRDCH(NCARD)), - 1 STMNMX(1:ICRDCH(NCARD)),NCARD,ICRDCH(NCARD) - 1147 FORMAT(/'...WE HAVE FOUND A MATCHING NAME FOR "',A,'" OR "',A, - 1 '" AT CARDINAL INDEX',I3,', FOR CHARACTERS 1-',I2,'.'/4X, - 2 'NAME CHECKING IS HEREBY DISABLED.') - GO TO 71 - ENDIF - ENDDO - -C SAME NAME BUT DIFFERENT ID - - IF(STMNMZ .NE. 'NAMELESS' .AND. - 1 STMNMZ .EQ. STMNMX .AND. STMIDZ .NE. STMIDX) THEN - IETYP=7 - IF(NCK .GT. NTEST) WRITE(6,65) NREC,STMNMZ,STMIDZ,NCK,STMIDX, - 1 NNNREC,ZZZREC,TSTREC(NREC),SCRATC(NCK) - 65 FORMAT(/'******RECORD #',I3,' HAS NAME=',A,' AND ID=',A,', BUT ', - 1 'ID IS DIFFERENT FROM VALIDATED ORIGINAL SHORT-TERM ', - 2 'HISTORY RECORD',I3/4X,' WHICH IS ',A,'. RECORDS ARE:'/ - 3 2(1X,'@@@',A,'@@@'/),2(4X,A/)) - IF(NCK .LE. NTEST) WRITE(6,66) NREC,STMNMZ,STMIDZ,NCK,STMIDX, - 1 NNNREC,ZZZREC,TSTREC(NREC),SCRATC(NCK) - 66 FORMAT(/'******RECORD #',I3,' HAS NAME=',A,' AND ID=',A,', BUT ', - 1 'ID IS DIFFERENT FROM TEST RECORD WITH LARGER INDEX',I3, - 2 ' WHICH IS ',A,'.'/4X,'RECORDS ARE:'/2(1X,'@@@',A,'@@@'/), - 3 2(4X,A/)) - IF(RSMCZ .EQ. 'JTWC' .AND. STMIDZ(1:2) .EQ. STMIDX(1:2)) THEN - IETYP=-7 - WRITE(6,165) - 165 FORMAT('###OBSERVER IS JTWC. BASIN NOT GUARANTEED TO BE ', - 1 'CONSISTENT. IETYP=-7.') - ENDIF - IF(IETYP .GT. 0) GO TO 71 - ENDIF - -C SAME ID BUT DIFFERENT NAME: NEITHER IS NAMELESS - - IF(STMNMZ .NE. 'NAMELESS' .AND. STMNMX .NE. 'NAMELESS') THEN - IF(STMIDZ .EQ. STMIDX .AND. STMNMZ .NE. STMNMX .AND. - 1 RELOCZ .EQ. ' ' .AND. RELOCX .EQ. ' ') THEN - IETYP=8 - IF(NCK .GT. NTEST) WRITE(6,67) NREC,STMIDZ,STMNMZ,NCK,STMIDX, - 1 NNNREC,ZZZREC,TSTREC(NREC),SCRATC(NCK) - 67 FORMAT(/'******RECORD #',I3,' HAS ID=',A,' AND NAME=',A,', BUT ', - 1 'NAME IS DIFFERENT FROM VALIDATED ORIGINAL'/7X,'SHORT-', - 2 'TERM HISTORY RECORD',I3,' WHICH IS ',A,'.'/7X,'RECORDS ', - 3 'ARE:'/2(1X,'@@@',A,'@@@'/),2(4X,A/)) - IF(NCK .LE. 
NTEST) WRITE(6,68) NREC,STMIDZ,STMNMZ,NCK,STMIDX, - 1 NNNREC,ZZZREC,TSTREC(NREC),SCRATC(NCK) - 68 FORMAT(/'******RECORD #',I3,' HAS ID=',A,' AND NAME=',A,', BUT ', - 1 'NAME IS DIFFERENT FROM TEST RECORD WITH LARGER INDEX',I3, - 2 ' WHICH IS ',A,'.'/4X,'RECORDS ARE:'/2(1X,'@@@',A,'@@@'/), - 3 2(4X,A/)) - GO TO 71 - ENDIF - ENDIF - - ENDIF - ENDDO - 71 CONTINUE - ENDIF - -C CHECK FOR RECORDS WITH IDENTICAL RSMC, DATE/TIME GROUP AND -C STORM ID. SINCE THE CURRENT RECORD IS FIRST, WE WILL SUPERCEDE -C IT WITH THE LATER RECORD - - IF(IETYP .EQ. 0) THEN - DO NCK=NREC+1,NTEST - BUFINX=TSTREC(NCK) - CALL DECVAR(ISTIDC,ISTIDC+ITWO-1,KSTMX,IERDEC,'(I2.2)', - 1 STMIDX) - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NREC)) - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NCK )) - ENDDO - - DO NBA=1,NBASIN - IF(STMIDX(3:3) .EQ. IDBASN(NBA)) THEN - NIDBSX=NBA - GO TO 91 - ENDIF - ENDDO - - 91 IF(RSMCX .EQ. RSMCZ .AND. - 1 IDATEX .EQ. IDATEZ .AND. - 2 IUTCX .EQ. IUTCZ .AND. - 3 NIDBSX .EQ. NIDBSN .AND. - 4 KSTMX .EQ. KSTORM) THEN - -C ACCUMULATE ALL RECORDS THAT HAVE THE SAME RSMC, DATE/TIME AND -C STORM ID FOR PROCESSING - - IF(NDUP .LT. NDUPMX) THEN - NDUP=NDUP+1 - INDXDP(NDUP)=NCK - - ELSE - WRITE(6,93) RSMCZ,IDATEZ,IUTCZ,STMIDZ,NDUPMX - 93 FORMAT(/'******NUMBER OF RECORDS WITH SAME RSMC=',A,', DATE=',I9, - 1 ', TIME=',I5,' AND STORM ID=',A/7X,'EXCEEDS THE MAXIMUM=', - 2 I3,'. THE PROGRAM WILL TERMINATE!!') - CALL ABORT1('STIDCK ',53) - ENDIF - - ENDIF - ENDDO - - IF(NDUP .GT. 0) THEN - CALL FIXDUP(IUNTHO,NTEST,NREC,NDUP,INDXDP,TSTREC,ZZZREC,NNNREC, - 1 IETYP) - IF(IETYP .EQ. 4) THEN - DO NDU=1,NDUP - WRITE(6,109) NDU,IABS(INDXDP(NDU)),IETYP - 109 FORMAT(/'...DUPLICATE RECORD',I3,' WITH INDEX=',I3,' HAS ', - 1 'PROBABLE DATE/TIME ERROR=',I3) - IF(INDXDP(NDU) .LT. 0) IDUPID(IABS(INDXDP(NDU)))=IETYP - ENDDO - -C CLEAR THE ERROR FLAG FOR THE CURRENT RECORD!!! 
- - IETYP=0 - ENDIF - ENDIF - - ENDIF - - IF(IETYP .EQ. 0) THEN - -C SKIP STORM NAME CHECK IF STORM NAME='NAMELESS' OR BASIN IS -C NEITHER ATLANTIC NOR EAST PACIFIC - - IF(STMNMZ .EQ. 'NAMELESS') THEN - WRITE(6,113) STMNMZ - 113 FORMAT(/'...STORM NAME IS ',A9,' SO NO NAME CHECKING WILL BE ', - 1 'DONE') - GO TO 190 - ENDIF - - IF(NIDBSN .LE. 4) THEN - IF(NIDBSN .LE. 2) THEN - NSTBSN=-1 - DO NST=1,NSTMAX - IF(STMNMZ .EQ. STBASN(NST,NIDBSN,IYRNAM)) THEN -C WRITE(6,117) STMNMZ,NST,NIDBSN,IYRNAM -C 117 FORMAT(/'...WE HAVE FOUND MATCHING NAME FOR ',A,' AT INDEX=',I4, -C 1 ', FOR NIDBSN,IYRNAM=',2I4) - NSTBSN=NST - GO TO 171 - ENDIF - ENDDO - -C FOR EAST PACIFIC STORM IDS, CHECK WHETHER THEY MAY HAVE BEEN NAMED -C IN THE CENTRAL PACIFIC - - IF(NIDBSN .EQ. 2) THEN - NSTBSN=-1 - DO NST=1,NSMXCP - IF(STMNMZ .EQ. STBACP(NST)) THEN - NSTBSN=NST - GO TO 171 - ENDIF - ENDDO - ENDIF - - ELSE IF(NIDBSN .EQ. 3) THEN - NSTBSN=-1 - DO NST=1,NSMXCP - IF(STMNMZ .EQ. STBACP(NST)) THEN - NSTBSN=NST - GO TO 171 - ENDIF - ENDDO - - ELSE IF(NIDBSN .EQ. 4) THEN - NSTBSN=-1 - DO NST=1,NSMXWP - IF(STMNMZ .EQ. STBAWP(NST)) THEN - NSTBSN=NST - GO TO 171 - ENDIF - ENDDO - ENDIF - -C CHECK FOR CARDINAL NUMBER IDENTIFIER FOR AS YET UNNAMED STORMS - - DO NCARD=1,NCRDMX - IF(STMNMZ(1:ICRDCH(NCARD)) .EQ. CARDNM(NCARD)(1:ICRDCH(NCARD))) - 1 THEN - WRITE(6,147) STMNMZ(1:ICRDCH(NCARD)),NCARD,ICRDCH(NCARD) - 147 FORMAT(/'...WE HAVE FOUND A MATCHING NAME FOR "',A,'" AT CARDINAL ', - 1 'INDEX',I3,', FOR CHARACTERS 1-',I2,'.') - NSTBSN=NCARD - GO TO 171 - ENDIF - ENDDO - -C CHECK FOR GREEK NAMES - - DO NGRK=1,NGRKMX - IF(STMNMZ(1:IGRKCH(NGRK)) .EQ. GREKNM(NGRK)(1:IGRKCH(NGRK))) - 1 THEN - WRITE(6,157) STMNMZ(1:IGRKCH(NGRK)),NGRK,IGRKCH(NGRK) - 157 FORMAT(/'...WE HAVE FOUND A MATCHING GREEK NAME FOR "',A,'" AT ', - 1 'GREEK INDEX',I3,', FOR CHARACTERS 1-',I2,'.') - NSTBSN=NGRK - GO TO 171 - ENDIF - ENDDO - - 171 IF(NSTBSN .LT. 
0) THEN - IETYP=5 - WRITE(6,173) NREC,STMNMZ,NIDBSN,IYRNAM,NNNREC,ZZZREC,TSTREC(NREC) - 173 FORMAT(/'+++RECORD #',I3,' HAS BAD STORM NAME=',A9,'. NIDBSN,', - 1 'IYRNAM=',2I4/4X,'ERROR RECOVERY WILL BE CALLED FOR THIS', - 2 ' RECORD:'/2(1X,'@@@',A,'@@@'/),4X,A) - - CALL FIXNAM(IUNTCA,NIDBSN,IYR,IETYP,STMNMZ,TSTREC(NREC)) - - ENDIF - - ELSE - WRITE(6,181) IDBASN(NIDBSN),STMNMZ - 181 FORMAT('...VALID BASIN ID=',A1,' DOES NOT ALLOW STORM NAME CHECK', - 1 ' AT THIS TIME. NAME=',A9) - ENDIF - - ENDIF - - ENDIF - - 190 IFSTCK(NUMTST(NREC))=IETYP - IF(IETYP .GT. 0) THEN - NADD=NADD+1 - NUMBAD(NADD+NBAD)=NUMTST(NREC) - BADREC(NADD+NBAD)=TSTREC(NREC) - ELSE - NOKAY=NOKAY+1 - NUMOKA(NOKAY)=NUMTST(NREC) - OKAREC(NOKAY)=TSTREC(NREC) - ENDIF - - ENDDO - - WRITE(6,201) NOKAY,NADD,NTEST,(ERCID(NER),NER=1,NERCID) - 201 FORMAT(//'...RESULTS OF THE STORM ID CHECK ARE: NOKAY=',I4,' AND', - 1 ' NADD=',I4,' FOR A TOTAL OF ',I4,' RECORDS.'//4X, - 2 'ERROR CODES ARE:'/(6X,A)) - WRITE(6,203) - 203 FORMAT(/'...OKAY RECORDS ARE:',100X,'ERC'/) - DO NOK=1,NOKAY - WRITE(6,209) NOK,NUMOKA(NOK),OKAREC(NOK),IFSTCK(NUMOKA(NOK)) - 209 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - IF(NADD .GT. 0) WRITE(6,211) (NBAD+NBA,NUMBAD(NBAD+NBA), - 1 BADREC(NBAD+NBA), - 2 IFSTCK(NUMBAD(NBAD+NBA)), - 3 NBA=1,NADD) - 211 FORMAT(/' ADDED BAD RECORDS ARE:',95X,'ERC'/(3X,I4,'...',I4, - 1 '...',A,'...',I3)) - NBAD=NBAD+NADD - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: FIXDUP ERROR RECOVERY FOR PARTIAL DUPLICATE RECS -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: ERROR RECOVERY FOR PARTIAL DUPLICATE RECORDS. PARTIAL -C DUPLICATE RECORDS ARE DEFINED AS THOSE WITH IDENTICAL RSMC, STORM -C ID & NAME, AND DATE/TIME. THE ERROR RECOVERY PROCEDURE BEGINS BY -C TRYING TO FIND A PREVIOUS RECORD FOR THE TARGET RECORD, WHICH IS -C DEFINED AS THE FIRST OF THE DUPLICATE RECORDS (ALL SUBSEQUENT -C RECORDS ARE DEFINED AS "DUPLICATES"). 
THE CURRENT RECORDS ARE -C SEARCHED FIRST, THEN THE SHORT-TERM HISTORY FILE IS SEARCHED. -C IF NO PREVIOUS RECORDS ARE FOUND ANYWHERE, THE DEFAULT DECISION IS -C TO KEEP THE LAST OF THE DUPLICATES, UNDER THE ASSUMPTION THAT -C THE DUPLICATE RECORDS ARE UPDATE RECORDS FOR THE SAME STORM. -C IF A PREVIOUS RECORD IS FOUND, ITS EXTRAPOLATED POSITION IS COMPARED -C WITH THE TARGET RECORD AND THE DUPLICATE RECORDS. IF THE TARGET -C POSITION ERROR IS GREATER THAN THE DUPLICATE POSITION ERROR, THE -C TARGET RECORD IS CONSIDERED ERRONEOUS. IF THE TARGET POSITION ERROR -C IS LESS THAN THE DUPLICATE POSITION ERROR, THE DUPLICATE POSITION -C IS CHECKED AGAINST AN EXTRAPOLATED FUTURE POSITION. IF THAT ERROR -C IS LESS THAN FOR THE CURRENT POSITION, IT IS ASSUMED THAT THE -C DUPLICATE RECORD HAS A DATE/TIME ERROR. IF THE DUPLICATE POSITION -C ERROR IS LARGER FOR THE FUTURE TIME, IT IS ASSUMED THAT THE -C DUPLICATE RECORD IS AN UPDATE RECORD WHICH SUPERCEDES THE TARGET. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C -C USAGE: CALL FIXDUP(IUNTHO,NTEST,NREC,NDUP,INDXDP,TSTREC,ZZZREC, -C NNNREC,IETYP) -C INPUT ARGUMENT LIST: -C IUNTHO - UNIT NUMBER FOR SHORT-TERM HISTORY FILE. -C NTEST - TOTAL NUMBER OF RECORDS AVAILABLE (DIMENSION OF TSTREC) -C NREC - INDEX NUMBER OF TARGET RECORD -C NDUP - NUMBER OF DUPLICATE RECORDS -C INDXDP - INTEGER ARRAY CONTAINING INDEX NUMBERS OF -C - DUPLICATE RECORDS -C TSTREC - CHARACTER ARRAY OF INPUT RECORDS. -C ZZZREC - CHARACTER VARIABLE CONTAINING VARIABLE NAMES. -C NNNREC - CHARACTER VARIABLE CONTAINING COLUMN NUMBERS. 
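The duplicate-resolution logic described in the abstract above can be sketched in Python. This is an illustrative sketch only, not the operational code: `dist` is a flat-plane stand-in for the great-circle DISTSP, FACSPD is the deg-lat per (m/s * fractional day) factor from the routine's DATA statement, and all other names are hypothetical.

```python
# Illustrative sketch of the FIXDUP decision procedure; names are hypothetical.
FACSPD = 0.77719  # deg lat traversed per (m/s * fractional day), as in FIXDUP

def dist(a, b):
    # Flat-plane stand-in for the operational great-circle distance (DISTSP)
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def resolve_duplicate(prev, target, dup, dt):
    """prev = (lat, lon, u, v) of the previous fix (u, v in m/s);
    target/dup = (lat, lon); dt = time gap in fractional days.
    Returns 'keep-duplicate' (IETYP=3) or 'date-time-error' (IETYP=4)."""
    plat, plon, u, v = prev
    ext = (plat + v * dt * FACSPD, plon + u * dt * FACSPD)  # extrapolate to now
    if dist(dup, ext) <= dist(target, ext):
        return 'keep-duplicate'      # duplicate fits better: it supersedes the target
    # Duplicate fits worse at the current time: try one more cycle ahead
    ext2 = (plat + v * 2.0 * dt * FACSPD, plon + u * 2.0 * dt * FACSPD)
    if dist(dup, ext2) < dist(dup, ext):
        return 'date-time-error'     # duplicate looks valid at a later time
    return 'keep-duplicate'          # otherwise treat it as an update record
```

As in the routine, a date/time error only marks the duplicate's index negative; the default in every other case is to let the later-arriving record win.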
-C -C OUTPUT ARGUMENT LIST: -C IETYP - ERROR CODE -C -C INPUT FILES: -C UNIT 21 - SHORT-TERM HISTORY FILE -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE FIXDUP(IUNTHO,NTEST,NREC,NDUP,INDXDP,TSTREC,ZZZREC, - 1 NNNREC,IETYP) - - PARAMETER (MAXSTM=70) - - SAVE - - CHARACTER*(*) TSTREC(0:NTEST),ZZZREC,NNNREC - - DIMENSION INDXDP(NDUP) - - DIMENSION RINC(5) - - CHARACTER STMNAM*9,STMID*3,RSMC*4 - - LOGICAL FSTFLG - - DIMENSION STMNAM(MAXSTM),STMLAT(MAXSTM),STMLON(MAXSTM), - 1 STMDIR(MAXSTM),STMSPD(MAXSTM),IDATE(MAXSTM), - 2 IUTC(MAXSTM),RMAX(MAXSTM),PENV(MAXSTM),PCEN(MAXSTM), - 3 PTOP(MAXSTM),RSMC(MAXSTM),RMW(MAXSTM),VMAX(MAXSTM), - 4 R15NW(MAXSTM),R15NE(MAXSTM),R15SE(MAXSTM),R15SW(MAXSTM), - 5 STMID(MAXSTM),FSTFLG(MAXSTM) - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - PARAMETER (NBASIN=11) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,RELOCZ*1,IDBASN*1 - - DIMENSION IVTVAR(MAXVIT),VITVAR(MAXVIT),VITFAC(MAXVIT), - 1 ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION IDBASN(NBASIN),BUFIN(MAXCHR),FMTVIT(MAXVIT) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - EQUIVALENCE (VITVAR( 3),STMLTZ),(VITVAR( 4),STMLNZ), - 1 (VITVAR( 5),STMDRZ),(VITVAR( 6),STMSPZ) - - DIMENSION IVTVRX(MAXVIT),VITVRX(MAXVIT) - - CHARACTER BUFCK(MAXCHR)*1,RSMCX*4,RELOCX*1,STMIDX*3,LATNSX*1, - 1 LONEWX*1,BUFINX*100 - - EQUIVALENCE (BUFCK(1),RSMCX),(BUFCK(5),RELOCX),(BUFCK(6),STMIDX), - 1 (BUFCK(35),LATNSX),(BUFCK(41),LONEWX), - 2 (BUFCK(1),BUFINX) - - EQUIVALENCE (IVTVRX(1),IDATEX),(IVTVRX(2),IUTCX), - 1 (VITVRX(3),STMLTX),(VITVRX(4),STMLNX), - 2 (VITVRX(5),STMDRX),(VITVRX(6),STMSPX) - - DATA VITFAC/2*1.0,2*0.1,1.0,0.1,9*1.0/, - 1 
FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 2 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 3 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 4 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA IDBASN/'L','E','C','W','O','T','U','P','S','B','A'/ - -C IPRNT : CONTROLS PRINTING IN SUBROUTINE NEWVIT -C FACSPD: CONVERSION FACTOR FOR R(DEG LAT)=V(M/S)*T(FRAC DAY)* -C FACSPD - - DATA NUM/1/,ITWO/2/,ISTIDC/1/,IPRNT/0/,FACSPD/0.77719/, - 1 IHRWIN/0/ - - WRITE(6,1) NDUP,NTEST,NREC - 1 FORMAT(/'...ENTERING FIXDUP WITH ',I3,' DUPLICATE RECORDS AND',I4, - 1 ' TOTAL RECORDS. TARGET RECORD TO BE CHECKED HAS INDEX=', - 2 I3) - -C RECOVER STORM ID, DATE,TIME ETC FROM THE TARGET RECORD - - BUFINZ=TSTREC(NREC) - CALL DECVAR(ISTIDC,ISTIDC+ITWO-1,KSTORM,IERDEC,'(I2.2)', - 1 STMIDZ) - DO IV=1,6 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BUFINZ) - VITVAR(IV)=IVTVAR(IV)*VITFAC(IV) - ENDDO - IF(LATNS .EQ. 'S') STMLTZ=-STMLTZ - IF(LONEW .EQ. 'W') STMLNZ=360.-STMLNZ - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - - WRITE(6,7) BUFINZ,(INDXDP(ND),TSTREC(INDXDP(ND)),ND=1,NDUP) - 7 FORMAT('...TARGET RECORD FOR COMPARISON IS:'/10X,A/4X, - 1 'DUPLICATE RECORDS ARE:'/(4X,I4,2X,A)) -C WRITE(6,9) STMLTZ,STMLNZ,STMDRZ,STMSPZ -C 9 FORMAT('...LAT/LON, DIR/SPD OF TARGET RECORD ARE ',4F10.3) - -C CHECK IF THERE ARE ANY PREVIOUS RECORDS IN TSTREC - - INDCLO=-99 - DTCLO=1.E10 - DO NCK=1,NTEST - BUFINX=TSTREC(NCK) - CALL DECVAR(ISTIDC,ISTIDC+ITWO-1,KSTMX,IERDEC,'(I2.2)', - 1 STMIDX) - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV), - 1 TSTREC(NCK)) - ENDDO - - DO NBA=1,NBASIN - IF(STMIDX(3:3) .EQ. IDBASN(NBA)) NIDBSX=NBA - IF(STMIDZ(3:3) .EQ. IDBASN(NBA)) NIDBSN=NBA - ENDDO - - IF(RSMCX .EQ. RSMCZ .AND. - 1 NIDBSX .EQ. NIDBSN .AND. - 2 KSTMX .EQ. KSTORM .AND. - 3 NCK .NE. 
NREC ) THEN - CALL ZTIME(IDATEX,IUTCX,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYX) -C WRITE(6,53) NCK,IDATEX,IUTCX,DAYX -C 53 FORMAT('...INDEX,DATE,TIME OF SAME STORM ARE:',I3,I9,I5,F10.3) - - IF(DAYX .LT. DAYZ .AND. DAYZ-DAYX .LT. DTCLO) THEN - INDCLO=NCK - DTCLO=DAYZ-DAYX - ENDIF - - ENDIF - - ENDDO - - IF(INDCLO .GT. 0) THEN - BUFINX=TSTREC(INDCLO) - DO IV=3,6 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV), - 1 BUFINX) - VITVRX(IV)=IVTVRX(IV)*VITFAC(IV) - ENDDO - IF(LATNSX .EQ. 'S') STMLTX=-STMLTX - IF(LONEWX .EQ. 'W') STMLNX=360.-STMLNX - CALL DS2UV(USTM,VSTM,STMDRX,STMSPX) - - ELSE - WRITE(6,77) IUNTHO - 77 FORMAT(/'...PREVIOUS STORM RECORD COULD NOT BE FOUND IN CURRENT ', - 1 'RECORDS. WE WILL LOOK IN THE SHORT-TERM HISTORY FILE, ', - 2 'UNIT=',I3) - -C SCAN HISTORICAL FILE FOR ALL OCCURRENCES OF EACH STORM. -C SAVE THE LATEST TIME FOR USE LATER. - - IOPT=5 - IDTREQ=IDATEZ - STMID(1)=STMIDZ - CALL NEWVIT(IUNTHO,IPRNT,IOPT,IERVIT,MAXSTM,KSTORM,IDTREQ,IHRREQ, - 1 IHRWIN,IDATE,IUTC,STMLAT,STMLON,STMDIR,STMSPD, - 2 PCEN,PENV,RMAX,VMAX,RMW,R15NE,R15SE,R15SW,R15NW, - 3 PTOP,FSTFLG,STMNAM,STMID,RSMC) - - IF(KSTORM .GT. 0) THEN - DO KST=1,KSTORM - CALL ZTIME(IDATE(KST),IUTC(KST),IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYX) -C WRITE(6,79) KST,DAYX,DAYZ -C 79 FORMAT('...INDEX,DAYX, DAYZ FROM ST. TERM HIST. FILE=',I3,2F10.3) - IF(DAYZ-DAYX .LT. DTCLO) THEN - INDCLO=KST - DTCLO=DAYZ-DAYX - ENDIF - ENDDO - - CALL DS2UV(USTM,VSTM,STMDIR(INDCLO),STMSPD(INDCLO)) - STMLTX=STMLAT(INDCLO) - STMLNX=STMLON(INDCLO) - - ELSE - WRITE(6,97) - 97 FORMAT('###PREVIOUS RECORD COULD NOT BE FOUND ANYWHERE. 
', - 1 'THEREFORE, WE MAKE THE ARBITRARY, BUT NECESSARY DECISION'/ - 2 4X,'TO RETAIN THE LAST DUPLICATE RECORD.') - - IETYP=3 - WRITE(6,99) NREC,INDXDP(NDUP),NNNREC,ZZZREC,TSTREC(NREC), - 1 TSTREC(INDXDP(NDUP)) - 99 FORMAT(/'******RECORD #',I3,' WILL BE SUPERCEDED BY RECORD #',I3, - 1 ', WHICH ARRIVED LATER AND HAS IDENTICAL RSMC, DATE/TIME', - 2 ' AND STORM ID'/2(1X,'@@@',A,'@@@'/),2(4X,A/)) - RETURN - ENDIF - - ENDIF - -C SAVE THE PREVIOUS FIX POSITION AND EXTRAPOLATE IT -C TO THE CURRENT TIME - - PRVLAT=STMLTX - PRVLON=STMLNX - EXTLAT=PRVLAT+VSTM*DTCLO*FACSPD - EXTLON=PRVLON+USTM*DTCLO*FACSPD - - EXTERZ=DISTSP(STMLTZ,STMLNZ,EXTLAT,EXTLON)*1.E-3 - WRITE(6,95) STMLTZ,STMLNZ,EXTERZ - 95 FORMAT(/'...LAT/LON,EXTRAPOLATION ERROR FOR RECORDS ARE:'/4X, - 1 'TARGET:',9X,3F10.3) - - DO NDU=1,NDUP - BUFINX=TSTREC(INDXDP(NDU)) - DO IV=3,4 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVRX(IV),IERDEC,FMTVIT(IV), - 1 BUFINX) - VITVRX(IV)=IVTVRX(IV)*VITFAC(IV) - ENDDO - IF(LATNSX .EQ. 'S') STMLTX=-STMLTX - IF(LONEWX .EQ. 'W') STMLNX=360.-STMLNX - EXTERD=DISTSP(STMLTX,STMLNX,EXTLAT,EXTLON)*1.E-3 - WRITE(6,111) NDU,STMLTX,STMLNX,EXTERD - 111 FORMAT('...DUP. RECORD:',I4,3F10.3) - - IF(EXTERD .GT. EXTERZ) THEN - EXTLT2=PRVLAT+VSTM*DTCLO*FACSPD*2.0 - EXTLN2=PRVLON+USTM*DTCLO*FACSPD*2.0 - EXTER2=DISTSP(STMLTX,STMLNX,EXTLT2,EXTLN2)*1.E-3 - WRITE(6,113) NDU,EXTLT2,EXTLN2,EXTER2 - 113 FORMAT('...2XDT EXTRAP:',I4,3F10.3) - -C IF THE DIFFERENCE BETWEEN THE DUPLICATE POSITION AND -C AN EXTRAPOLATED POSITION TO A FUTURE CYCLE IS LESS -C THAN THE DIFFERENCE AT THE CURRENT TIME, WE ASSUME -C THAT THE DUPLICATE HAS A BAD DATE/TIME, I.E. THAT IT -C IS VALID AT A LATER TIME. CURRENTLY THERE IS NO ERROR -C RETRIEVAL FOR THE DATE/TIME GROUP SO THAT THIS RECORD -C IS MARKED TO BE IN ERROR BY MAKING THE INDEX NEGATIVE. - - IF(EXTER2 .LT. 
EXTERD) THEN - IETYP=4 - INDXDP(NDU)=-INDXDP(NDU) - WRITE(6,117) IETYP,INDXDP(NDU) - 117 FORMAT(/'...DUPLICATE HAS DIFFERENCE WITH EXTRAPOLATED POSITION ', - 1 'TO FUTURE TIME THAT IS LESS THAN FOR CURRENT TIME.'/4X, - 2 'THEREFORE, WE CONCLUDE THAT THERE IS A DATE/TIME ERROR ', - 3 'IN THE DUPLICATE RECORD (IETYP=',I3,').'/4X,'THE INDEX=', - 4 I3,' IS MARKED NEGATIVE TO INDICATE AN ERROR.') - - ELSE - IETYP=3 - WRITE(6,119) NREC,INDXDP(NDUP),NNNREC,ZZZREC,TSTREC(NREC), - 1 TSTREC(INDXDP(NDUP)) - 119 FORMAT(/'...DUPLICATE HAS DIFFERENCE WITH EXTRAPOLATED FUTURE ', - 1 'POSITION GREATER THAN THAT FOR CURRENT POSITION.'/ - 2 ' ******RECORD #',I3,' WILL BE SUPERCEDED BY RECORD #',I3, - 3 ', WHICH ARRIVED LATER AND HAS IDENTICAL RSMC, DATE/TIME', - 4 ' AND STORM ID'/2(1X,'@@@',A,'@@@'/),2(4X,A/)) - ENDIF - - ELSE - IETYP=3 - WRITE(6,121) NREC,INDXDP(NDUP),NNNREC,ZZZREC,TSTREC(NREC), - 1 TSTREC(INDXDP(NDUP)) - 121 FORMAT(/'...DUPLICATE HAS DIFFERENCE WITH EXTRAPOLATED PAST ', - 1 'POSITION LESS THAN OR EQUAL TO THAT FOR TARGET.'/ - 2 ' ******RECORD #',I3,' WILL BE SUPERCEDED BY RECORD #',I3, - 3 ', WHICH ARRIVED LATER AND HAS IDENTICAL RSMC, DATE/TIME', - 4 ' AND STORM ID'/2(1X,'@@@',A,'@@@'/),2(4X,A/)) - ENDIF - - ENDDO - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: FIXNAM NAME RECOVERY FOR SYNDAT_QCTROPCY -C PRGMMR: S. LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: ERRONEOUS STORM NAMES ARE CHECKED FOR OLD (RETIRED) STORM -C NAMES (ATLANTIC BASIN ONLY). IF A RETIRED NAME MATCHES THE -C INPUT STORM NAME, ERROR RECOVERY IS SUCCESSFUL. SEE REMARKS BELOW. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1993-08-25 S. 
LORD ADDED CATALOG CHECKING FOR STORM IDS -C -C USAGE: CALL FIXNAM(IUNTCA,NIDBSN,IYRN,IETYP,STMNAM,DUMREC) -C INPUT ARGUMENT LIST: -C IUNTCA - STORM CATALOG UNIT NUMBER -C NIDBSN - BASIN INDEX -C IYRN - 4 DIGIT YEAR OF STORM (YYYY) -C IETYP - INPUT ERROR CODE (SHOULD BE POSITIVE) -C STMNAM - CHARACTER VARIABLE CONTAINING ERRONEOUS STORM NAME -C -C OUTPUT ARGUMENT LIST: -C IETYP - SIGN OF INPUT IETYP IS CHANGED TO NEGATIVE IF -C - RECOVERY IS SUCCESSFUL -C DUMREC - CHARACTER VARIABLE CONTAINING ENTIRE INPUT DATA RECORD -C - WITH CORRECTED NAME. -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE FIXNAM(IUNTCA,NIDBSN,IYRN,IETYP,STMNAM,DUMREC) - - PARAMETER (NRETIR= 7) - - SAVE - - CHARACTER*(*) STMNAM,DUMREC - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - PARAMETER (NBASIN=11) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,RELOCZ*1,NABASN*16 - - DIMENSION IVTVAR(MAXVIT),ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION NABASN(NBASIN),BUFIN(MAXCHR),FMTVIT(MAXVIT) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - CHARACTER RETNAM(NRETIR,NBASIN)*9 - DIMENSION IRETYR(NRETIR,NBASIN),NUMRET(NBASIN) - - DIMENSION RINC(5) - - DATA FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 1 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 2 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 3 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA NABASN/'ATLANTIC ','EAST PACIFIC ', - 1 'CENTRAL PACIFIC ','WEST PACIFIC ', - 2 'SOUTH CHINA SEA ','EAST CHINA SEA ', - 3 'AUSTRALIA ','SOUTH PACIFIC ', - 4 'SOUTH INDIAN OCN','BAY OF BENGAL ', - 5 'NRTH ARABIAN SEA'/ - - DATA RETNAM/'GILBERT ','JOAN ','HUGO ','GLORIA ', - 1 'DIANA ','BOB ','ANDREW ',70*' '/ - - DATA 
IRETYR/1988,1988,1989,1985,1990,1991,1992, - 1 70*00/ - - DATA NUMRET/7,1,9*0/,DYSPMX/2.0/ - - RETNAM(1,2)='INIKI' - IRETYR(1,2)=1992 - - BUFINZ=DUMREC - DO INUM=1,NUMRET(NIDBSN) - IF(STMNAM .EQ. RETNAM(INUM,NIDBSN) .AND. - 1 IYRN .EQ. IRETYR(INUM,NIDBSN)) THEN - WRITE(6,3) NABASN(NIDBSN),STMNAM,IYRN - 3 FORMAT(/'...SUCCESSFUL RECOVERY OF STORM NAME FROM RETIRED STORM ', - 1 'NAMES OF THE ',A,'. NAME, YEAR=',A,1X,I5) - STMNMZ=STMNAM - DUMREC=BUFINZ - IETYP=-IETYP - RETURN - ENDIF - ENDDO - -C LOOK FOR NAME IN STORM CATALOG. IF THERE, CHECK THAT IT IS A -C RECENT STORM. IF SO, ASSUME THAT THE STORM ID IS OK. - - CALL STCATN(IUNTCA,STMNAM,IDATCA,IUTCCA,IFND) - IF(IFND .EQ. 0) THEN - WRITE(6,101) STMNAM - 101 FORMAT(/'...UNSUCCESSFUL ATTEMPT TO RECOVER STORM NAME ...',A, - 1 '... HAS OCCURRED.') - ELSE - -C NOW CHECK DATE VERSUS SUBJECT RECORD - - do iv=1,2 - call decvar(istvar(iv),ienvar(iv),ivtvar(iv),ierdec,fmtvit(iv), - 1 bufinz) - enddo - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - - CALL ZTIME(IDATCA,IUTCCA,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYCA) - WRITE(6,133) IDATEZ,IUTCZ,IDATCA,IUTCCA,DAYZ,DAYCA - 133 FORMAT('...COMPARING DATES BETWEEN RECORD AND CATALOG. IDATEZ, ', - 1 'IUTCZ=',I9,I5,' IDATCA,IUTCCA=',I9,I5/4X,'DAYZ,DAYCA=', - 2 2F12.3) - IF(ABS(DAYZ-DAYCA) .GT. DYSPMX) RETURN - IETYP=-IETYP - WRITE(6,201) STMNAM - 201 FORMAT(/'...SUCCESSFUL ATTEMPT TO RECOVER STORM NAME ...',A, - 1 '... HAS OCCURRED.') - ENDIF - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: SECVCK SECONDARY VARIABLE Q/C CHECKING -C PRGMMR: S. 
LORD ORG: NP22 DATE: 1990-11-01 -C -C ABSTRACT: SECONDARY VARIABLES ARE: STORM DIRECTION AND SPEED, -C PCEN (CENTRAL PRESSURE), RMAX (RADIUS OF THE OUTERMOST CLOSED -C ISOBAR), PENV (PRESSURE AT RMAX), AND VMAX (MAXIMUM WIND SPEED). -C THIS ROUTINE CHECKS FOR MISSING AND OUT OF BOUNDS VALUES. -C FOR RMAX, PENV, AND VMAX, VALUES ARE SUBSTITUTED FROM THE LATEST -C HISTORICAL Q/C CHECKED RECORD IF THAT RECORD IS NO MORE THAN 12 -C HOURS OLD. -C -C PROGRAM HISTORY LOG: -C 1990-11-01 S. LORD -C 1991-11-17 S. LORD REVISED FOR MULTIPLE ERRORS -C 1992-08-20 S. LORD ADDED THE JTWC MEMORIAL SWITCH CHECK -C 1992-09-04 S. LORD ADDED PRESSURE WIND RELATIONSHIP -C -C USAGE: CALL SECVCK(IUNTOK,NTEST,NOKAY,NBAD,NUMTST,NUMOKA,NUMBAD, -C DAY0,DAYMIN,DAYMX1,DAYOFF,IFSECV,ZZZREC,NNNREC, -C SCRREC,TSTREC,BADREC,OKAREC) -C INPUT ARGUMENT LIST: -C IUNTOK - UNIT NUMBER FOR PRELIMINARY QUALITY CONTROLLED FILE. -C NTEST - NUMBER OF RECORDS TO BE TESTED. -C NUMTST - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH RECORD -C - TO BE TESTED. -C DAY0 - DATE AT WHICH THIS Q/C CHECK IS BEING MADE. -C - UNITS ARE DDD.FFF, WHERE DDD=JULIAN DAY, FFF=FRAC- -C - TIONAL DAY (E.G. .5=1200 UTC). -C DAYMIN - EARLIEST (MINIMUM) DATE FOR CONSTRUCTION OF A -C - HISTORICAL TRACK FOR EACH STORM. -C - UNITS SAME AS DAY0 ABOVE. -C DAYMX1 - LATEST (MAXIMUM) DATE FOR CONSTRUCTION OF HISTORICAL -C - TRACK FOR EACH STORM. UNITS ARE SAME AS DAY0 ABOVE. -C DAYOFF - OFFSET ADDED TO DAYMX1 IF DAYMIN REFERS TO THE YEAR -C - BEFORE DAYMX1. -C ZZZREC - CHARACTER VARIABLE CONTAINING VARIABLE NAMES. -C NNNREC - CHARACTER VARIABLE CONTAINING COLUMN NUMBERS. -C TSTREC - CHARACTER ARRAY CONTAINING RECORDS TO BE TESTED. -C -C OUTPUT ARGUMENT LIST: -C NOKAY - NUMBER OF RECORDS THAT PASSED THE SEC. VAR. CHECK. -C NBAD - NUMBER OF RECORDS THAT FAILED THE SEC. VAR. CHECK. -C IFSECV - INTEGER ARRAY CONTAINING ERROR CODE FOR EACH INPUT -C - RECORD. SEE COMMENTS IN PGM FOR KEY TO ERROR CODES. 
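The persistence-then-climatology recovery that the SECVCK abstract above describes can be sketched in Python. This is a hedged sketch under assumed names: DTPERS = 0.5 fractional days (12 h) matches the routine's DATA statement, while `substitute`, `climatology`, and the bounds passed in are illustrative, not the operational TCCLIM/TCPWTB interface.

```python
# Illustrative sketch of SECVCK's recovery policy; names are hypothetical.
DTPERS = 0.5  # max age of a persistence value, fractional days (12 hours)

def substitute(var, value, bounds, history, now, climatology):
    """history: (day, value) pairs of previous Q/C-checked fixes, newest first.
    Returns (value, flag): flag is 'P' (persistence), 'C' (climatology),
    or None when the value was kept or no recovery is possible."""
    lo, hi = bounds
    if lo <= value <= hi:
        return value, None                    # in bounds: keep as-is
    for day, prev in history:                 # scan previous fixes, newest first
        if lo <= prev <= hi and now - day <= DTPERS:
            return prev, 'P'                  # persistence substitution
    if var in ('PENV', 'RMAX', 'VMAX'):
        return climatology[var], 'C'          # climatological fallback
    return value, None                        # e.g. PCEN: no recovery here
```

As in the routine, only environmental pressure, storm size, and max wind fall back to climatology; the remaining secondary variables are simply flagged when no recent good fix exists.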
-C SCRREC - SCRATCH CHARACTER*9 ARRAY -C NUMOKA - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH GOOD -C - RECORD. -C NUMBAD - INTEGER ARRAY CONTAINING INDEX NUMBER OF EACH BAD -C - RECORD. -C BADREC - CHARACTER ARRAY CONTAINING BAD RECORDS THAT FAILED -C - THE SEC. VAR. CHECK. -C OKAREC - CHARACTER ARRAY CONTAINING ALL RECORDS THAT PASSED -C - THE SEC. VAR. CHECK. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: WARNING: RECORDS WITH CORRECT FORMAT BUT MISSING OR -C ERRONEOUS DATA MAY BE MODIFIED BY THIS ROUTINE!! -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE SECVCK(IUNTOK,NTEST,NOKAY,NBAD,NUMTST,NUMOKA,NUMBAD, - 1 DAY0,DAYMIN,DAYMX1,DAYOFF,IFSECV,ZZZREC,NNNREC, - 2 SCRREC,TSTREC,BADREC,OKAREC) - - PARAMETER (NPRVMX=61) - PARAMETER (MAXSTM=70) - PARAMETER (NERCSV=9) - PARAMETER (MAXREC=1000) - - SAVE - - CHARACTER*(*) ZZZREC,NNNREC,SCRREC(0:NTEST),TSTREC(0:NTEST), - 1 BADREC(MAXREC),OKAREC(NTEST),ERCSV(NERCSV)*60, - 2 STDPTP(-NPRVMX:-1)*1,SUBTOP*1,SUBFLG*1 - - LOGICAL NEWSTM - - DIMENSION NUMOKA(NTEST),IFSECV(MAXREC),NUMBAD(MAXREC), - 1 NUMTST(NTEST) - - DIMENSION NUMSTM(MAXSTM),INDXST(MAXSTM,MAXSTM),IOPSTM(MAXSTM), - 1 SRTDAY(MAXSTM,MAXSTM),IDASRT(MAXSTM) - - DIMENSION STLATP(-NPRVMX:-1),STLONP(-NPRVMX:-1), - 1 STDAYP(-NPRVMX: 0),STVMXP(-NPRVMX:-1), - 2 STDIRP(-NPRVMX:-1),STSPDP(-NPRVMX:-1), - 3 STPCNP(-NPRVMX:-1),STPENP(-NPRVMX:-1), - 4 STRMXP(-NPRVMX:-1) - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - PARAMETER (MAXTPC= 3) - PARAMETER (NBASIN=11) - PARAMETER (ISECVR= 5,ITERVR=10) - PARAMETER (NSECVR=ITERVR-ISECVR) - PARAMETER (NTERVR=MAXVIT-ITERVR+1) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 SHALO*1,MEDIUM*1,DEEP*1,LATNS*1,LONEW*1,FMTVIT*6, - 2 BUFINZ*100,STMREQ*9,RELOCZ*1,STMTPC*1,EXE*1,NAMVAR*5, - 3 IDBASN*1,NABASN*16 - - DIMENSION IVTVAR(MAXVIT),VITVAR(MAXVIT),VITFAC(MAXVIT), - 1 ISTVAR(MAXVIT),IENVAR(MAXVIT) - - DIMENSION NAMVAR(MAXVIT+1),IDBASN(NBASIN),NABASN(NBASIN), 
- 1 BUFIN(MAXCHR),STMTPC(0:MAXTPC),FMTVIT(MAXVIT) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - EQUIVALENCE (VITVAR( 3),STMLTZ),(VITVAR( 4),STMLNZ), - 1 (VITVAR( 5),STMDRZ),(VITVAR( 6),STMSPZ), - 2 (VITVAR( 7),PCENZ) - - EQUIVALENCE (STMTPC(0), EXE),(STMTPC(1),SHALO),(STMTPC(2),MEDIUM), - 1 (STMTPC(3),DEEP) - -C **** NOTE: SECBND AND PRVSVR ARE DIMENSIONED NSECVR+1 TO CARRY -C SPACE FOR VMAX, WHICH IS NOT STRICTLY A SECONDARY VARIABLE. -C THEREFORE, WE DO NOT ALLOW MISSING OR ERRONEOUS VALUES -C OF VMAX TO CAUSE RECORDS TO BE REJECTED. - -C ****NOTE: DEPTH OF CYCLONIC CIRCULATION IS CLASSIFIED AS A -C SECONDARY VARIABLE - - DIMENSION RINC(5) - - DIMENSION SECBND(NSECVR+1,2),PRVSVR(NSECVR+1,-NPRVMX:-1), - 1 TERBND(NTERVR,2),IERROR(NSECVR+2) - - EQUIVALENCE (DIRMN ,SECBND(1,1)),(DIRMX ,SECBND(1,2)), - 1 (SPDMN ,SECBND(2,1)),(SPDMX ,SECBND(2,2)), - 2 (PCENMN,SECBND(3,1)),(PCENMX,SECBND(3,2)), - 3 (PENVMN,SECBND(4,1)),(PENVMX,SECBND(4,2)), - 4 (RMAXMN,SECBND(5,1)),(RMAXMX,SECBND(5,2)), - 5 (VMAXMN,TERBND(1,1)),(VMAXMX,TERBND(1,2)) - - DATA SHALO/'S'/,MEDIUM/'M'/,DEEP/'D'/,EXE/'X'/, - 1 VITFAC/2*1.0,2*0.1,1.0,0.1,9*1.0/, - 2 FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 3 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 4 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 5 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/ - - DATA IDBASN/'L','E','C','W','O','T','U','P','S','B','A'/ - - DATA NABASN/'ATLANTIC ','EAST PACIFIC ', - 1 'CENTRAL PACIFIC ','WEST PACIFIC ', - 2 'SOUTH CHINA SEA ','EAST CHINA SEA ', - 3 'AUSTRALIA ','SOUTH PACIFIC ', - 4 'SOUTH INDIAN OCN','BAY OF BENGAL ', - 5 'NRTH ARABIAN SEA'/ - - DATA NAMVAR/'DATE ','TIME ','LAT. 
','LONG.','DIR ','SPEED', - 1 'PCEN ','PENV ','RMAX ','VMAX ','RMW ','R15NE', - 2 'R15SE','R15SW','R15NW','DEPTH'/ - -C RMISPR: MISSING CODE FOR RMAX, PCEN AND PENV -C RMISV: MISSING CODE FOR MAX. TANGENTIAL WIND (VMAX) -C EPSMIS: TOLERANCE FOR MISSING VMAX -C FIVMIN: FIVE MINUTES IN UNITS OF FRACTIONAL DAYS -C DTPERS: MAXIMUM TIME SEPARATION FOR SUBSTITUTION OF MISSING -C SECONDARY INFORMATION USING PERSISTENCE (12 HOURS) -C BOUNDS FOR SECONDARY VARIABLES: -C DIRMN =0.0 DEG DIRMX =360 DEG -C SPDMN =0.0 M/S SPDMX =30 M/S -C PCENMN=880 MB PCENMX=1020 MB -C PENVMN=970 MB PENVMX=1050 MB -C RMAXMN=100 KM RMAXMX=999 KM -C VMAXMN=7.7 M/S VMAXMX=100 M/S - - DATA RMISV/-9.0/,RMISPR/-999.0/,EPSMIS/1.E-1/,NUM/1/, - 1 FIVMIN/3.4722E-3/,DTPERS/0.5/ - - DATA DIRMN/0.0/,DIRMX/360./,SPDMN/0.0/,SPDMX/30./, - 1 PCENMN/880./,PCENMX/1020./,PENVMN/970./,PENVMX/1050./, - 2 RMAXMN/100./,RMAXMX/999.0/,VMAXMN/7.7 /,VMAXMX/100./ - - DATA ERCSV - 1 /'1: UNPHYSICAL OR MISSING DIRECTION (OUTSIDE BOUNDS) ', - 2 '2: UNPHYSICAL OR MISSING SPEED (OUTSIDE BOUNDS) ', - 3 '3: UNPHYSICAL OR MISSING CENTRAL PRESSURE (OUTSIDE BOUNDS) ', - 4 '4: UNPHYSICAL OR MISSING ENV. 
PRESSURE (OUTSIDE BOUNDS) ', - 5 '5: UNPHYSICAL OR MISSING RMAX (OUTSIDE BOUNDS) ', - 6 '6: UNPHYSICAL OR MISSING VMAX (OUTSIDE BOUNDS) ', - 7 '7: MISSING OR UNINTERPRETABLE DEPTH OF CYCLONE CIRCULATION ', - 8 '8: COMBINATION OF TWO OF THE ERROR TYPES 1-6 ', - 9 '9: COMBINATION OF THREE OR MORE OF THE ERROR TYPES 1-6 '/ - -C ERROR CODES FOR DIRECTION/SPEED GROUP CHECK ARE AS FOLLOWS: -C NEGATIVE NUMBERS INDICATE THAT AN ERRONEOUS OR MISSING VALUE -C WAS SUBSTITUTED USING PERSISTENCE OVER THE TIME DTPERS (12 H) -C MULTIPLE ERRORS ARE HANDLED AS FOLLOWS: -C THE FIRST ERROR OCCUPIES THE LEFT-MOST DIGIT -C THE SECOND ERROR OCCUPIES THE RIGHT-MOST DIGIT -C THREE OR MORE ERRORS REVERTS TO ERROR CODE=9 - -C 1: UNPHYSICAL DIRECTION (OUTSIDE BOUNDS) -C 2: UNPHYSICAL SPEED (OUTSIDE BOUNDS) -C 3: UNPHYSICAL CENTRAL PRESSURE (OUTSIDE BOUNDS) -C 4: UNPHYSICAL ENVIRONMENTAL PRESSURE (OUTSIDE BOUNDS) -C 5: UNPHYSICAL RMAX (OUTSIDE BOUNDS) -C 6: UNPHYSICAL VMAX (OUTSIDE BOUNDS) -C 7: MISSING OR UNINTERPRETABLE DEPTH OF CYCLONE CIRCULATION -C 8: COMBINATION OF TWO OF THE ERROR TYPES 1-6 -C 9: COMBINATION OF THREE OR MORE OF THE ERROR TYPES 1-6 - - NADD=0 - WRITE(6,1) NTEST,NOKAY,NBAD,DAY0,DAYMIN,DAYMX1, - 1 DAYOFF - 1 FORMAT(//'...ENTERING SECVCK TO CHECK SECONDARY VARIABLE ERRORS.', - 1 ' NTEST,NOKAY,NBAD=',3I4/4X,'TIME PARAMETERS ARE: DAY0,', - 2 'DAYMIN,DAYMX1,DAYOFF=',4F11.3///) - - CALL WRNING('SECVCK') - -C INITIALIZE SOME VARIABLES - - NUNI=0 - NSTART=0 - SCRREC(0)='ZZZZZ' - STDAYP(0)=-999.0 - SECBND(6,1:2)=TERBND(1,1:2) - - NUMSTM(1:MAXSTM)=0 - INDXST(1:MAXSTM,1:MAXSTM)=0 - -C FOR THE READABLE RECORDS, FIND THE UNIQUE STORMS AND SAVE THE -C INDEX FOR EACH STORM - - WRITE(6,31) - 31 FORMAT(/'...RECORDS THAT WILL BE CHECKED ARE:'/) - DO NREC=1,NTEST - - BUFINZ=TSTREC(NREC) - WRITE(6,33) NREC,NUMTST(NREC),BUFINZ - 33 FORMAT('...',I4,'...',I4,'...',A) - -C DECODE DATE FOR SORTING PURPOSES - - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 
BUFINZ) - ENDDO - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - -C CATEGORIZE ALL STORMS BY THEIR STORM ID - - IOPT=5 - STMREQ=STMIDZ - -C ENDIF - - NEWSTM=.TRUE. - DO NR=NSTART,NUNI - IF(STMREQ .EQ. SCRREC(NR)) THEN - NEWSTM=.FALSE. - INDX=NR - GO TO 85 - ENDIF - ENDDO - - 85 NSTART=1 - IF(NEWSTM) THEN - NUNI=NUNI+1 - SCRREC(NUNI)=STMREQ - IOPSTM(NUNI)=IOPT - INDX=NUNI - ENDIF - - NUMSTM(INDX)=NUMSTM(INDX)+1 - INDXST(NUMSTM(INDX),INDX)=NREC - SRTDAY(NUMSTM(INDX),INDX)=DAYZ - - ENDDO - - WRITE(6,101) NUNI - 101 FORMAT(/'...NUMBER OF UNIQUE STORMS=',I4) - -C CHECK SECONDARY VARIABLES DIRECTION,SPEED, PCEN, PENV, RMAX -C VMAX AND STORM DEPTH FOR MISSING AND OUT OF BOUNDS VALUES - - DO NUNIQ=1,NUNI - - BUFINZ=TSTREC(INDXST(1,NUNIQ)) - CALL DECVAR(ISTVAR(1),IENVAR(1),IVTVAR(1),IERDEC,FMTVIT(1), - 1 BUFINZ) - - print *, ' ' - print *, ' ' - IDTTRK=-IDATEZ - CALL SETTRK(IUNTOK,IOPSTM(NUNIQ),IDTTRK,DAY0,DAYMIN, - 1 DAYMX1,DAYOFF,STMDRZ,STMSPZ,STMLTZ,STMLNZ, - 2 SCRREC(NUNIQ),IERSET) - CALL PRVSTM(STLATP,STLONP,STDIRP,STSPDP,STDAYP, - 1 STRMXP,STPCNP,STPENP,STVMXP,STDPTP,KSTPRV) - PRVSVR(1,-1:-KSTPRV:-1)=STDIRP(-1:-KSTPRV:-1) - PRVSVR(2,-1:-KSTPRV:-1)=STSPDP(-1:-KSTPRV:-1) - PRVSVR(3,-1:-KSTPRV:-1)=STPCNP(-1:-KSTPRV:-1) - PRVSVR(4,-1:-KSTPRV:-1)=STPENP(-1:-KSTPRV:-1) - PRVSVR(5,-1:-KSTPRV:-1)=STRMXP(-1:-KSTPRV:-1) - PRVSVR(6,-1:-KSTPRV:-1)=STVMXP(-1:-KSTPRV:-1) - -C SORT ALL RECORDS BY TIME FOR EACH STORM SO THAT WE CAN TAKE -C THEM IN CHRONOLOGICAL ORDER - - CALL SORTRL(SRTDAY(1:NUMSTM(NUNIQ),NUNIQ),IDASRT(1:NUMSTM(NUNIQ)), - 1 NUMSTM(NUNIQ)) - - WRITE(6,107) KSTPRV,SCRREC(NUNIQ) - 107 FORMAT(/'...READY FOR ERROR CHECK WITH KSTPRV, STMID=',I3,1X,A) - - DO NUMST=1,NUMSTM(NUNIQ) - -C INITIALIZE ERROR COUNTERS - - NTOTER=0 - NPOSER=0 - IERROR(1:NSECVR+2)=0 - - NREC=INDXST(IDASRT(NUMST),NUNIQ) - BUFINZ=TSTREC(NREC) - -C GET DATE/TIME, STORM 
LAT/LON, AND THE SECONDARY -C VARIABLES DIRECTION/SPEED, PCEN, PENV, RMAX -C ****NOTE: ALTHOUGH NOT STRICTLY A SECONDARY VARIABLE, VMAX -C IS CHECKED HERE SINCE IT IS NEEDED FOR CLIPER. - - DO IV=1,ITERVR - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BUFINZ) - VITVAR(IV)=REAL(IVTVAR(IV))*VITFAC(IV) - ENDDO - - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - JDY=IFIX(DAYZ) - - INDX00=99 - DO NP=-1,-KSTPRV,-1 - IF(ABS(STDAYP(NP)-DAYZ) .LE. FIVMIN) INDX00=NP - ENDDO - IF(ABS(DAYZ-DAY0) .LT. FIVMIN) INDX00=0 - - IF(INDX00 .EQ. 99) THEN - WRITE(6,133) INDX00 - 133 FORMAT(/'******AN INDEXING ERROR HAS OCCURRED IN SECVCK, INDX00=', - 1 I4) - CALL ABORT1('SECVCK ',133) - ENDIF - -C ERROR RECOVERY FROM PERSISTENCE IS ALWAYS POSSIBLE. RECOVERY -C FROM CLIMATOLOGY IS POSSIBLE FOR ENVIRONMENTAL PRESSURE AND -C STORM SIZE. - -C THE JMA MEMORIAL DIRECTION/SPEED CHECK IS NOW IMPLEMENTED: -C IF BOTH DIRECTION AND SPEED ARE ZERO, AND THE RSMC IS JMA, -C WE TRY TO RECOVER A BETTER DIRECTION/SPEED. - - DO IV=ISECVR,ITERVR - - RMISVR=RMISPR - SUBVAR=-99.0 - IF(IV .EQ. ITERVR) RMISVR=RMISV - IF(ABS(VITVAR(IV)-RMISVR) .LE. EPSMIS .OR. - 1 VITVAR(IV) .LT. SECBND(IV-ISECVR+1,1) .OR. - 2 VITVAR(IV) .GT. SECBND(IV-ISECVR+1,2) .OR. - 3 (IV-ISECVR+1 .LE. 2 .AND. VITVAR(5) .EQ. 0.0 .AND. - 4 VITVAR(6) .EQ. 0.0 .AND. (RSMCZ .EQ. 'JMA' .OR. - 5 RSMCZ .EQ. '!WJ' .OR. RSMCZ .EQ. '!JW'))) THEN - - NTOTER=NTOTER+1 - IF(IV-ISECVR+1 .EQ. 
3) THEN - NPOSER=NPOSER+1 - IERROR(NTOTER)=IABS(IV-ISECVR+1) - ELSE - IERROR(NTOTER)=-IABS(IV-ISECVR+1) - ENDIF - - WRITE(6,141) NUNIQ,NUMST,INDX00,DAYZ,NTOTER,IERROR(NTOTER), - 1 NAMVAR(IV),VITVAR(IV),RMISVR,SECBND(IV-ISECVR+1,1), - 2 SECBND(IV-ISECVR+1,2),NNNREC,ZZZREC,TSTREC(NREC) - 141 FORMAT(//'...ERROR CHECKING NUNIQ,NUMST,INDX00,DAYZ,NTOTER,', - 1 'IERROR=',3I4,F11.3,2I4/4X,'HAS FOUND SECONDARY ', - 2 'VARIABLE ',A,' WITH VALUE=',F7.1,' MISSING OR ', - 3 'EXCEEDING BOUNDS. RMISVR,MINVAL,MAXVAL=',3F7.1/2(1X, - 4 '@@@',A,'@@@'/),4X,A) - -C NEGATE THE ERROR FLAG SO THAT IT SERVES ONLY AS A REMINDER THAT -C AN ERROR IS PRESENT - - IF(IV-ISECVR+1 .LE. 2 .AND. - 1 ((VITVAR(5) .NE. 0.0 .OR. - 2 VITVAR(6) .NE. 0.0) .OR. (RSMCZ .NE. 'JMA' .AND. - 3 RSMCZ .NE. '!WJ' .AND. RSMCZ .NE. '!JW'))) THEN - - WRITE(6,151) NAMVAR(IV),IERROR(NTOTER) - 151 FORMAT('...ERROR RECOVERY FOR ',A,' WILL BE DELAYED UNTIL DRSPCK', - 1 ' (NO LONGER CALLED).'/4X,'THE ERROR TYPE ',I3,' IS MADE ', - 2 'NEGATIVE AS A REMINDER THAT AN ERROR HAS OCCURRED.') - - ELSE - -C FOR ALL OTHER VARIABLES, IS THERE A PREVIOUS HISTORY? - - IF(KSTPRV .GT. 0) THEN - INDPER=0 - DO NP=INDX00-1,-KSTPRV,-1 - IF(ABS(PRVSVR(IV-ISECVR+1,NP)-RMISVR) .GT. EPSMIS .AND. - 1 PRVSVR(IV-ISECVR+1,NP) .GE. SECBND(IV-ISECVR+1,1) .AND. - 2 PRVSVR(IV-ISECVR+1,NP) .LE. SECBND(IV-ISECVR+1,2)) THEN - -c Because of the JMA memorial problem, we are not allowed to use -c a motionless storm as a persistence value - - if(iv-isecvr+1 .le. 2 .and. prvsvr(1,np) .eq. 0 .and. - 1 prvsvr(2,np) .eq. 0) then - ipers=0 - - else - INDPER=NP - IPERS=1 -C WRITE(6,161) INDPER,DAYZ,STDAYP(INDPER), -C 1 PRVSVR(IV-ISECVR+1,INDPER) -C 161 FORMAT(/'...INDPER,DAYZ,STDAYP(INDPER),PRVSVR(IV-ISECVR+1, -C 1 'INDPER)=',I3,3F10.3) - GO TO 221 - ENDIF - ENDIF - ENDDO - 221 CONTINUE - -C IS PERSISTENCE SUBSTITUTION POSSIBLE? - - IF(DAYZ-STDAYP(INDPER) .LE. DTPERS .AND. IPERS .EQ. 1) THEN - SUBVAR=PRVSVR(IV-ISECVR+1,INDPER) - SUBFLG='P' - IF(NPOSER .GT. 
0) NPOSER=NPOSER-1 - IERROR(NTOTER)=-IABS(IERROR(NTOTER)) - WRITE(6,223) SUBVAR - 223 FORMAT('...THE MISSING OR ERRONEOUS VALUE WILL BE REPLACED BY ', - 1 'A PERSISTENCE VALUE OF ',F7.1) - -C PERSISTENCE SUBSTITUTION NOT POSSIBLE - - ELSE - IF(IV-ISECVR+1 .LE. 3) THEN - SUBVAR=0.0 - WRITE(6,224) NAMVAR(IV),DAYZ,STDAYP(INDPER),DTPERS - 224 FORMAT(/'...TIME INTERVAL TO THE CLOSEST PREVIOUS RECORD WITH ', - 1 'A NON-MISSING ',A,' EXCEEDS DTPERS OR A '/4X,'NON-', - 2 'MISSING VALUE CANNOT BE FOUND. DAYZ,PREVIOUS DAY,', - 3 'DTPERS=',3F10.3,'.'/4X,'NO RECOVERY POSSIBLE FOR THIS', - 4 ' VARIABLE.') - - ELSE - WRITE(6,225) NAMVAR(IV),DAYZ,STDAYP(INDPER),DTPERS - 225 FORMAT(/'...TIME INTERVAL TO THE CLOSEST PREVIOUS RECORD WITH ', - 1 'A NON-MISSING ',A,' EXCEEDS DTPERS OR A '/4X,'NON-', - 2 'MISSING VALUE CANNOT BE FOUND. DAYZ,PREVIOUS DAY,', - 3 'DTPERS=',3F10.3/4X,'WE WILL SUBSTITUTE A ', - 4 'CLIMATOLOGICAL VALUE.') - ENDIF - ENDIF - -C NO PRIOR HISTORY - - ELSE - IF(IV-ISECVR+1 .LE. 3) THEN - SUBVAR=0.0 - WRITE(6,226) KSTPRV - 226 FORMAT(/'...KSTPRV=',I2,' SO THERE IS NO PRIOR HISTORY AND NO ', - 1 'CHECKING. NO RECOVERY POSSIBLE FOR THIS VARIABLE.') - - ELSE - WRITE(6,227) KSTPRV - 227 FORMAT(/'...KSTPRV=',I2,' SO THERE IS NO PRIOR HISTORY AND NO ', - 1 'CHECKING. CLIMATOLOGICAL VALUES WILL BE SUBSTITUTED.') - ENDIF - ENDIF - -C CLIMATOLOGICAL VARIABLE SUBSTITUTION - - IF(SUBVAR .EQ. -99.0) THEN - DO NBA=1,NBASIN - IF(STMIDZ(3:3) .EQ. IDBASN(NBA)) THEN - IBASN=NBA - GO TO 2228 - ENDIF - ENDDO - 2228 CONTINUE - -C SUBSTITUTE A PRESSURE-WIND RELATIONSHIP FOR MAX WIND - - IF(IV .EQ. ITERVR) THEN - SUBVAR=TCPWTB(VITVAR(7),IBASN) - ELSE - SUBVAR=TCCLIM(IV,IBASN) - ENDIF - SUBFLG='C' - WRITE(6,229) NAMVAR(IV),SUBVAR,NABASN(IBASN) - 229 FORMAT(/'...FOR VARIABLE ',A,', THE CLIMATOLOGICAL VALUE IS',F7.1, - 1 ' IN THE ',A,' BASIN.') - ENDIF - - IF(SUBVAR .NE. 
0.0) THEN - WRITE(TSTREC(NREC)(ISTVAR(IV):IENVAR(IV)),FMTVIT(IV)) - 1 NINT(SUBVAR/VITFAC(IV)) - TSTREC(NREC)(ISTVAR(IV)-1:ISTVAR(IV)-1)=SUBFLG - WRITE(6,2219) TSTREC(NREC) - 2219 FORMAT('...AFTER SUBSTITUTION, THE RECORD IS:'/4X,A) - BUFINZ=TSTREC(NREC) - -c Only update vitvar after direction errors have been corrected -c in the rare case for a JMA report with 0000 direction and -c 0000 speed - - if(iv-isecvr+1 .ge. 2) then - DO IVZ=1,ITERVR - CALL DECVAR(ISTVAR(IVZ),IENVAR(IVZ),IVTVAR(IVZ),IERDEC, - 1 FMTVIT(IVZ),BUFINZ) - VITVAR(IVZ)=REAL(IVTVAR(IVZ))*VITFAC(IVZ) - ENDDO - endif - ENDIF - - ENDIF - ENDIF - -C THE JTWC MEMORIAL PRESSURE SWITCHING CHECK -C IV=7 IS PCEN -C IV=8 IS PENV - - IF(IV-ISECVR+1 .EQ. 3) THEN - IF(VITVAR(IV) .GE. VITVAR(IV+1)) THEN - NTOTER=NTOTER+1 - WRITE(6,2301) VITVAR(IV),VITVAR(IV+1) - 2301 FORMAT(/'...UNPHYSICAL PCEN=',F7.1,' >= PENV=',F7.1) - IF(SUBVAR .GT. 0.0) THEN - NPOSER=NPOSER+1 - IERROR(NTOTER)=IABS(IV-ISECVR+1) - WRITE(6,2303) - 2303 FORMAT('...WE CANNOT RECOVER THIS ERROR SINCE SUBSTITUTION HAS ', - 1 'GIVEN UNPHYSICAL RESULTS.') - ELSE - IF(VITVAR(IV) .NE. RMISVR .AND. VITVAR(IV+1) .NE. RMISVR) THEN - SUBFLG='Z' - SUBVR1=VITVAR(IV) - SUBVR2=VITVAR(IV+1)-1.0 - WRITE(TSTREC(NREC)(ISTVAR(IV):IENVAR(IV)),FMTVIT(IV)) - 1 NINT(SUBVR2/VITFAC(IV)) - WRITE(TSTREC(NREC)(ISTVAR(IV+1):IENVAR(IV+1)),FMTVIT(IV+1)) - 1 NINT(SUBVR1/VITFAC(IV+1)) - TSTREC(NREC)(ISTVAR(IV)-1:ISTVAR(IV)-1)=SUBFLG - TSTREC(NREC)(ISTVAR(IV+1)-1:ISTVAR(IV+1)-1)=SUBFLG - WRITE(6,2219) TSTREC(NREC) - BUFINZ=TSTREC(NREC) - DO IVZ=1,ITERVR - CALL DECVAR(ISTVAR(IVZ),IENVAR(IVZ),IVTVAR(IVZ),IERDEC, - 1 FMTVIT(IVZ),BUFINZ) - VITVAR(IVZ)=REAL(IVTVAR(IVZ))*VITFAC(IVZ) - ENDDO - ENDIF - ENDIF - ENDIF - ENDIF - ENDDO - -C CHECK FOR MISSING DEPTH OF THE CYCLONIC CIRCULATION - - ITPC=0 - DO KTPC=1,MAXTPC - IF(STMDPZ .EQ. 
STMTPC(KTPC)) THEN - ITPC=KTPC -C WRITE(6,239) NUMST,STMDPZ -C 239 FORMAT('...RECORD ',I3,' HAS A PROPER CODE=',A,' FOR DEPTH OF ', -C 'THE CYCLONIC CIRCULATION.') - ENDIF - ENDDO - - IF(ITPC .EQ. 0) THEN - - SUBTOP=EXE - NTOTER=NTOTER+1 - IERROR(NTOTER)=-7 - - WRITE(6,241) NUNIQ,NUMST,INDX00,DAYZ,NTOTER,IERROR(NTOTER), - 1 STMDPZ,NNNREC,ZZZREC,TSTREC(NREC) - 241 FORMAT(//'...ERROR CHECKING NUNIQ,NUMST,INDX00,DAYZ,NTOTER,', - 1 'IERROR=',3I4,F11.3,2I4/4X,'HAS FOUND MISSING OR BAD ', - 2 'CODE=',A,' FOR DEPTH OF THE CYCLONIC CIRCULATION. ', - 3 'RECORD='/2(1X,'@@@',A,'@@@'/),4X,A) - - IF(KSTPRV .GT. 0) THEN - INDPER=0 - DO NP=INDX00-1,-KSTPRV,-1 - DO ITPC=1,MAXTPC - IF(STDPTP(NP) .EQ. STMTPC(ITPC)) THEN - INDPER=NP - SUBTOP=STDPTP(NP) - SUBFLG='P' - WRITE(6,243) INDPER,DAYZ,STDAYP(INDPER),SUBTOP - 243 FORMAT(/'...INDPER,DAYZ,STDAYP(INDPER),SUBTOP=',I3,2F10.3,1X,A) - GO TO 261 - ENDIF - ENDDO - - ENDDO - - 261 CONTINUE - IF(DAYZ-STDAYP(INDPER) .LE. DTPERS) THEN - WRITE(6,263) NAMVAR(MAXVIT+1),SUBTOP - 263 FORMAT('...THE MISSING OR ERRONEOUS VALUE OF ',A,' WILL BE ', - 1 'REPLACED BY A PERSISTENCE VALUE OF ',A) - - ELSE - - WRITE(6,273) DAYZ,STDAYP(INDPER),DTPERS - 273 FORMAT(/'...TIME INTERVAL TO THE CLOSEST PREVIOUS RECORD WITH ', - 1 'A PROPER STORM DEPTH CODE EXCEEDS DTPERS OR AN '/4X, - 2 'ACCEPTABLE VALUE CANNOT BE FOUND. ', - 3 'DAYZ,PREVIOUS DAY,DTPERS=',3F10.3/,4X,'WE WILL ', - 4 'SUBSTITUTE A CLIMATOLOGICAL VALUE.') - ENDIF - - ELSE - WRITE(6,277) KSTPRV - 277 FORMAT(/'...KSTPRV=',I2,' SO THERE IS NO PRIOR HISTORY AND NO ', - 1 'CHECKING. CLIMATOLOGICAL VALUES WILL BE SUBSTITUTED.') - ENDIF - -C DETERMINE CLIMATOLOGICAL VALUE IF NECESSARY - - IF(SUBTOP .EQ. EXE) THEN - IF(PCENZ .LE. 980.0) THEN - SUBTOP=DEEP - WRITE(6,279) PCENZ,SUBTOP - 279 FORMAT('...CLIMATOLOGICAL SUBSTITUTION OF STORM DEPTH: PCEN, ', - 1 'DEPTH=',F7.1,1X,A) - ELSE IF(PCENZ .LE. 
1000.0) THEN - SUBTOP=MEDIUM - WRITE(6,279) PCENZ,SUBTOP - ELSE - SUBTOP=SHALO - WRITE(6,279) PCENZ,SUBTOP - ENDIF - SUBFLG='C' - ENDIF - - WRITE(TSTREC(NREC)(MAXCHR:MAXCHR),'(A)') SUBTOP - TSTREC(NREC)(MAXCHR-1:MAXCHR-1)=SUBFLG - WRITE(6,269) TSTREC(NREC) - 269 FORMAT('...AFTER SUBSTITUTION, THE RECORD IS:'/4X,A) - ENDIF - -C ASSIGN SUMMARY ERROR CODE - -C NO ERRORS - - IF(NTOTER .EQ. 0) THEN - IETYP=0 - ISGNER=1 - -C IF ALL ERRORS HAVE BEEN FIXED, SUMMARY CODE IS NEGATIVE - - ELSE - IF(NPOSER .EQ. 0) THEN - ISGNER=-1 - ELSE - ISGNER=1 - ENDIF - -C ADD CODE FOR DEPTH OF THE CYCLONIC CIRCULATION FIRST - - NERZ=0 - NALLER=NTOTER - IF(IABS(IERROR(NTOTER)) .EQ. 7) THEN - NERZ=1 - IETYP=7 - NALLER=NTOTER-1 - ENDIF - -C ALL OTHER ERRORS. PICK OUT ONLY ALL ERRORS THAT REMAIN OR -C ALL ERRORS THAT HAVE BEEN FIXED IF THERE ARE NO REMAINING -C ERRORS. DO NOTHING WITH OTHER ERRORS. - - DO NER=1,NALLER - IF((ISGNER .LT. 0 .AND. IERROR(NER) .LT. 0) .OR. - 1 (ISGNER .GT. 0 .AND. IERROR(NER) .GT. 0)) THEN - NERZ=NERZ+1 - - ELSE - GO TO 280 - ENDIF - - IF(NERZ .EQ. 1) THEN - IETYP=IABS(IERROR(NER)) - - ELSE IF(NERZ .EQ. 2) THEN - IETYP=IABS(IETYP)*10+IABS(IERROR(NER)) - - ELSE IF(NERZ .EQ. 3) THEN - IF(IABS(IERROR(NTOTER)) .EQ. 7) THEN - IETYP=78 - ELSE - IETYP=9 - ENDIF - - ELSE - IF(IABS(IERROR(NTOTER)) .EQ. 7) THEN - IETYP=79 - ELSE - IETYP=9 - ENDIF - ENDIF - - 280 CONTINUE - ENDDO - ENDIF - IETYP=SIGN(IETYP,ISGNER) - - WRITE(6,281) SCRREC(NUNIQ),NUMST,NUMSTM(NUNIQ),NTOTER,NPOSER, - 1 ISGNER,IETYP,(IERROR(NER),NER=1,NTOTER) - 281 FORMAT(/'...ERROR SUMMARY FOR STMID,NUMST,NUMSTM=',A,2I3,':'/4X, - 1 'NTOTER,NPOSER,ISGNER,IETYP,IERROR=',4I4/(4X,10I4)) - -C WRITE(6,287) NREC,IETYP,NUMTST(NREC),NUMST,NUNIQ,BUFINZ -C 287 FORMAT(/'...DEBUGGING, NREC,IETYP,NUMTST(NREC),NUMST,NUNIQ,', -C 1 'BUFINZ=',5I4/4X,A) - IFSECV(NUMTST(NREC))=IETYP - IF(IETYP .GT. 
0) THEN - NADD=NADD+1 - NUMBAD(NADD+NBAD)=NUMTST(NREC) - BADREC(NADD+NBAD)=TSTREC(NREC) - ELSE - NOKAY=NOKAY+1 - NUMOKA(NOKAY)=NUMTST(NREC) - OKAREC(NOKAY)=TSTREC(NREC) - ENDIF - - ENDDO - - ENDDO - - WRITE(6,301) NOKAY,NADD,NTEST,(ERCSV(NER),NER=1,NERCSV) - 301 FORMAT(//'...RESULTS OF THE SECONDARY VARIABLE ERROR CHECK ARE: ', - 1 'NOKAY=',I4,' AND NADD=',I4,' FOR A TOTAL OF ',I4, - 2 ' RECORDS.'//4X,'ERROR CODES ARE:'/(6X,A)) - WRITE(6,303) - 303 FORMAT(/'...NOTES: NEGATIVE ERROR CODES (EXCEPT DIR/SPD) INDICATE' - 1 ,' SUCCESSFUL RECOVERY FROM MISSING OR ERRONEOUS DATA'/11X - 2 ,'BY SUBSTITUTION FROM PERSISTENCE.'/11X,'A NEGATIVE ERR', - 3 'OR CODE FOR DIR/SPD INDICATES THAT ERROR RECOVERY WILL ', - 4 'BE POSTPONED UNTIL THE DIR/SPD CHECK.'/11X,'MULTIPLE ', - 5 'ERRORS ARE HANDLED AS FOLLOWS:'/13X,'THE FIRST SECONDARY' - 6 ,' ERROR OCCUPIES THE LEFT-MOST DIGIT.'/13X,'THE NEXT ', - 7 'SECONDARY ERROR OCCUPIES THE RIGHT-MOST DIGIT.'/13X, - 8 'THREE OR MORE ERRORS REVERTS TO ERROR CODE=7, ETC.'/13X, - 9 'ERRORS FOR THE DEPTH OF THE CYCLONIC CIRCULATION ARE ', - A 'COUNTED SEPARATELY.'//3X,'OKAY RECORDS ARE:',100X,'ERC'/) - - DO NOK=1,NOKAY - WRITE(6,309) NOK,NUMOKA(NOK),OKAREC(NOK),IFSECV(NUMOKA(NOK)) - 309 FORMAT(3X,I4,'...',I4,'...',A,'...',I3) - ENDDO - IF(NADD .GT. 0) WRITE(6,311) (NBAD+NBA,NUMBAD(NBAD+NBA), - 1 BADREC(NBAD+NBA), - 2 IFSECV(NUMBAD(NBAD+NBA)), - 3 NBA=1,NADD) - 311 FORMAT(/' ADDED BAD RECORDS ARE:',95X,'ERC'/(3X,I4,'...',I4, - 1 '...',A,'...',I3)) - NBAD=NBAD+NADD - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: WRNING WRITES WARNING MESSAGE ABOUT RECORD MODS -C PRGMMR: S. LORD ORG: NP22 DATE: 1992-02-21 -C -C ABSTRACT: WRITES SIMPLE WARNING MESSAGE. -C -C PROGRAM HISTORY LOG: -C 1992-02-21 S. LORD -C -C USAGE: CALL WRNING(IDSUB) -C INPUT ARGUMENT LIST: -C IDSUB - SUBROUTINE NAME -C -C REMARKS: SEE REMARKS IN CODE. 
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE WRNING(IDSUB) - - CHARACTER*6 IDSUB - - WRITE(6,1) IDSUB - 1 FORMAT(21X,'***********************************************'/ - 1 21X,'***********************************************'/ - 2 21X,'**** ****'/ - 3 21X,'**** WARNING: RECORDS WITH CORRECT FORMAT ****'/ - 4 21X,'**** BUT MISSING OR ERRONEOUS ****'/ - 5 21X,'**** DATA MAY BE MODIFIED BY ****'/ - 6 21X,'**** THIS ROUTINE=',A6,'!!! ****'/ - 7 21X,'**** ****'/ - 8 21X,'**** TYPES OF SUBSTITUTIONS ARE: ****'/ - 9 21X,'**** CLIMATOLOGICAL SUBSTITUTION: C ****'/ - O 21X,'**** RSMC AVERAGING: A ****'/ - 1 21X,'**** PERSISTENCE SUBSTITUTION: P ****'/ - 2 21X,'**** OVERLAP MODIFICATION: O ****'/ - 3 21X,'**** DIRECTION/SPEED SUBSTITUTION: S ****'/ - 4 21X,'**** LATITUDE/LONGITUDE SUBSTITUTION: L ****'/ - 4 21X,'**** SWITCHED PCEN-PENV SUBSTITUTION: Z ****'/ - 8 21X,'**** ****'/ - 6 21X,'***********************************************'/ - 7 21X,'***********************************************') - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: F1 RECALCULATES LONGITUDES -C PRGMMR: S. LORD ORG: NP22 DATE: 1993-05-01 -C -C ABSTRACT: SEE COMMENTS IN PROGRAM. ORIGINALLY WRITTEN BY C. J. NEWMANN -C -C PROGRAM HISTORY LOG: -C 1993-05-01 S. LORD INSTALLED PROGRAM -C -C USAGE: CALL F1(ALON) -C INPUT ARGUMENT LIST: SEE COMMENTS IN PROGRAM -C -C OUTPUT ARGUMENT LIST: -C SEE COMMENTS IN PROGRAM -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION F1(ALON) - -C CONVERT FROM E LONGITUDE TO THOSE ACCEPTABLE IN AL TAYLOR ROUTINES - - IF(ALON.GT.180.)F1=360.-ALON - IF(ALON.LE.180.)F1=-ALON - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: F2 CALCULATES DATES -C PRGMMR: D. A. KEYSER ORG: NP22 DATE: 1998-06-05 -C -C ABSTRACT: SEE COMMENTS IN PROGRAM. ORIGINALLY WRITTEN BY C. J. -C NEWMANN -C -C PROGRAM HISTORY LOG: -C 1993-05-01 S. 
LORD INSTALLED PROGRAM -C 1998-06-05 D. A. KEYSER - Y2K, FORTRAN 90 COMPLIANT -C -C USAGE: CALL F2(IDATIM) -C INPUT ARGUMENT LIST: -C IDATIM - 10-DIGIT DATE IN FORM YYYYDDMMHH -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION F2(IDATIM) - -C OBTAIN JULIAN DAY NUMBER -C 0000UTC ON 1 JAN IS SET TO DAY NUMBER 0 AND 1800UTC ON 31 DEC IS SET -C TO DAY NUMBER 364.75. LEAP YEARS ARE IGNORED. - - CHARACTER*10 ALFA - WRITE(ALFA,'(I10)')IDATIM - READ(ALFA,'(I4,3I2)')KYR,MO,KDA,KHR - MON=MO - IF(MON.EQ.13)MON=1 - DANBR=3055*(MON+2)/100-(MON+10)/13*2-91+KDA - F2=DANBR-1.+REAL(KHR/6)*0.25 - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: SLDATE RETRIEVES DATE FROM SYSTEM AND DATE FILE -C PRGMMR: D. A. KEYSER ORG: NP22 DATE: 1998-06-05 -C -C ABSTRACT: RETRIEVES DATE FROM SYSTEM AND FROM A DATE FILE, AND -C OBTAINS THE DIFFERENCE BETWEEN THE TWO. CONSTRUCTS DATE -C IN FORM YYYYMMDD AND HHMM. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C 1998-06-05 D. A. KEYSER - Y2K/F90 COMPLIANCE -C -C USAGE: CALL SLDATE(IUNTDT,IDATEZ,IUTCZ,IOFFTM) -C INPUT ARGUMENT LIST: -C IUNTDT - UNIT NUMBER FOR FILE CONTAINING RUN DATE -C -C OUTPUT ARGUMENT LIST: -C IDATEZ - DATE IN FORM YYYYMMDD -C IUTCZ - DATE IN FORM HHMM -C IOFFTM - OFFSET (HOURS *100) BETWEEN SYSTEM DATE AND -C - FILE DATE (SYSTEM DATE MINUS FILE DATE) -C -C INPUT FILES: -C UNIT "IUNTDT" - FILE CONTAINING RUN DATE IN I4,3I2 FORMAT -C - ('YYYYMMDDHH') -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. 
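Reviewer note, not part of the patch: the day-number arithmetic in F2 above can be transcribed directly into Python for illustration. Note that the READ decodes the label as YYYYMMDDHH (year, month, day, hour), although the documentation block says "YYYYDDMMHH".

```python
def f2(idatim: int) -> float:
    """Fractional day-of-year, per the Fortran function F2.

    0000 UTC on 1 Jan maps to day number 0 and 1800 UTC on 31 Dec maps
    to 364.75; leap years are ignored, exactly as in the original.
    """
    text = f"{idatim:010d}"          # IDATIM is YYYYMMDDHH
    mo = int(text[4:6])
    kda = int(text[6:8])
    khr = int(text[8:10])
    mon = 1 if mo == 13 else mo
    # Integer (truncating) division mirrors the Fortran expression
    danbr = 3055 * (mon + 2) // 100 - (mon + 10) // 13 * 2 - 91 + kda
    return danbr - 1.0 + (khr // 6) * 0.25
```

For example, `f2(1998123118)` recovers 364.75, matching the comment in the source.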
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE SLDATE(IUNTDT,IDATEZ,IUTCZ,IOFFTM) - - CHARACTER USRDAT*10 - - SAVE - - DIMENSION IDAT(8),JDAT(6),RINC(5) - - EQUIVALENCE (IDAT(1),JW3YR),(IDAT(2),JW3MO),(IDAT(3),JW3DA), - 2 (IDAT(5),JW3HR),(IDAT(6),JW3MIN),(IDAT(7),JW3SEC) - - READ(IUNTDT,1) USRDAT - 1 FORMAT(A10) - WRITE(6,3) USRDAT - 3 FORMAT(/'...',A10,'...') - -C OBTAIN CURRENT SYSTEM DATE - IDAT (UTC) - - CALL W3UTCDAT(IDAT) - -C DECODE THE DATE LABEL INTO JDAT (UTC) - - READ(USRDAT(1: 4),'(I4)') JDAT(1) - READ(USRDAT(5: 6),'(I2)') JDAT(2) - READ(USRDAT(7: 8),'(I2)') JDAT(3) - READ(USRDAT(9:10),'(I2)') JDAT(5) - -C THIS IS THE TIME ZONE OFFSET (SAME AS FOR IDAT) - JDAT(4) = IDAT(4) - - JDAT(6) = 0 - -C COMBINE YEAR, MONTH, DAY, HOUR, MINUTE TO FORM YYYYMMDD - - IDATEZ=JDAT(1)*10000+JDAT(2)*100+JDAT(3) - IUTCZ =JDAT(5)*100+JDAT(6) - -C OBTAIN TIME DIFFERENCE (CURRENT TIME - LABEL TIME) IN HOURS * 100 - - CALL W3DIFDAT(IDAT,(/JDAT(1),JDAT(2),JDAT(3),JDAT(4),JDAT(5), - $ JDAT(6),0,0/),2,RINC) - IOFFTM=NINT(RINC(2)*100.) - - WRITE(6,5) JW3YR,JW3MO,JW3DA,JW3HR,JW3MIN,JW3SEC,IOFFTM - 5 FORMAT(/'...CURRENT DATE/TIME FROM W3UTCDAT CALL IS:'/'JW3YR=',I5, - 1 ' JW3MO=',I3,' JW3DA=',I3,' JW3HR=',I5,' JW3MIN=',I5, - 2 ' JW3SEC=',I5,' OFFTIM=',I8) - - WRITE(6,13) IDATEZ,IUTCZ - 13 FORMAT('...IN SLDATE, IDATEZ,IUTCZ=',I10,2X,I4) - - RETURN - -C----------------------------------------------------------------------- - ENTRY SLDTCK(IUNTDT) - - REWIND IUNTDT - WRITE(6,21) IUNTDT - 21 FORMAT('...WRITING USRDAT TO UNIT',I3) - WRITE(IUNTDT,1) USRDAT - - RETURN - -C----------------------------------------------------------------------- - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: FIXSLM MODIFIES SEA-LAND MASK -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: MODIFIES NCEP T126 GAUSSIAN GRID SEA-LAND MASK. CONVERTS -C SOME SMALL ISLANDS TO OCEAN POINTS. PROGRAM IS DEPENDENT -C ON MODEL RESOLUTION. 
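Reviewer note, not part of the patch: SLDATE above packs the run-date label into IDATEZ/IUTCZ and uses W3DIFDAT to get the system-minus-label offset in hours times 100. A minimal Python sketch of the same outputs (a transcription for illustration only):

```python
from datetime import datetime, timezone

def sldate(usrdat, now=None):
    """Mirror SLDATE's outputs for a 'YYYYMMDDHH' run-date label.

    Returns (IDATEZ, IUTCZ, IOFFTM): the date as YYYYMMDD, the time as
    HHMM, and the system-minus-label offset in hours * 100, as in the
    source. `now` stands in for the W3UTCDAT system-clock call.
    """
    label = datetime.strptime(usrdat, "%Y%m%d%H").replace(tzinfo=timezone.utc)
    idatez = label.year * 10000 + label.month * 100 + label.day
    iutcz = label.hour * 100 + label.minute   # minutes are always 0 here
    now = now or datetime.now(timezone.utc)
    iofftm = round((now - label).total_seconds() / 3600.0 * 100.0)
    return idatez, iutcz, iofftm
```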
-C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C 1992-04-08 S. J. LORD CONVERTED TO T126 FROM T80 -C -C USAGE: CALL FIXSLM(LONF,LATG2,RLON,RLAT,SLMASK) -C INPUT ARGUMENT LIST: -C LONF - NUMBER OF LONGITUDINAL POINTS, FIRST INDEX OF SLMASK -C LATG2 - NUMBER OF LATITUDINAL POINTS, SECOND INDEX OF SLMASK -C RLON - LONGITUDES -C RLAT - LATITUDES -C SLMASK - T162 SEA-LAND MASK ON GAUSSIAN GRID -C -C OUTPUT ARGUMENT LIST: -C SLMASK - MODIFIED T162 SEA-LAND MASK ON GAUSSIAN GRID -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE FIXSLM(LONF,LATG2,RLON,RLAT,SLMASK) - - PARAMETER (MAXSLM=35) - - SAVE - - DIMENSION RLAT(LATG2),RLON(LONF),SLMASK(LONF,LATG2),IPT(MAXSLM), - 1 JPT(MAXSLM) - - DATA NOCEAN/21/, - -C INDONESIAN ARCHIPELAGO,NEW CALEDONIA - - 1 IPT/133,135,129,177, - -C YUCATAN - - 2 290,291,292,289,290,291,289,290,291, - -C CUBA - - 3 299,300,301,302,303,303,304,305,14*0.0/, - -C INDONESIAN ARCHIPELAGO,NEW CALEDONIA - - 1 JPT/106,105,106,118, - -C YUCATAN - - 2 3*73,3*74,3*75, - -C CUBA - - 3 3*72,2*73,3*74,14*0.0/ - -C WRITE(6,7) -C 7 FORMAT('...CONVERTING LAND TO OCEAN, NPT,IPT,RLON,JPT,RLAT=') - DO NPT=1,NOCEAN - SLMASK(IPT(NPT),JPT(NPT))=0.0 -C WRITE(6,9) NPT,IPT(NPT),RLON(IPT(NPT)),JPT(NPT),RLAT(JPT(NPT)) -C 9 FORMAT(4X,2I5,F10.3,I5,F10.3) - ENDDO - - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: GAULAT CALCULATES GAUSSIAN GRID LATITUDES -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: CALCULATES GAUSSIAN GRID LATITUDES -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD - COPIED FROM KANAMITSU LIBRARY -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. 
-C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE GAULAT(GAUL,K) - - IMPLICIT REAL(8) (A-H,O-Z) - DIMENSION A(500) - REAL GAUL(1) - - SAVE - - ESP=1.D-14 - C=(1.D0-(2.D0/3.14159265358979D0)**2)*0.25D0 - FK=K - KK=K/2 - CALL BSSLZ1(A,KK) - DO IS=1,KK - XZ=COS(A(IS)/SQRT((FK+0.5D0)**2+C)) - ITER=0 - 10 PKM2=1.D0 - PKM1=XZ - ITER=ITER+1 - IF(ITER.GT.10) GO TO 70 - DO N=2,K - FN=N - PK=((2.D0*FN-1.D0)*XZ*PKM1-(FN-1.D0)*PKM2)/FN - PKM2=PKM1 - PKM1=PK - ENDDO - PKM1=PKM2 - PKMRK=(FK*(PKM1-XZ*PK))/(1.D0-XZ**2) - SP=PK/PKMRK - XZ=XZ-SP - AVSP=ABS(SP) - IF(AVSP.GT.ESP) GO TO 10 - A(IS)=XZ - ENDDO - IF(K.EQ.KK*2) GO TO 50 - A(KK+1)=0.D0 - PK=2.D0/FK**2 - DO N=2,K,2 - FN=N - PK=PK*FN**2/(FN-1.D0)**2 - ENDDO - 50 CONTINUE - DO N=1,KK - L=K+1-N - A(L)=-A(N) - ENDDO - - RADI=180./(4.*ATAN(1.)) - GAUL(1:K)=ACOS(A(1:K))*RADI -C PRINT *,'GAUSSIAN LAT (DEG) FOR JMAX=',K -C PRINT *,(GAUL(N),N=1,K) - - RETURN - 70 WRITE(6,6000) - 6000 FORMAT(//5X,14HERROR IN GAUAW//) - CALL ABORT1(' GAULAT',6000) - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: BSSLZ1 CALCULATES BESSEL FUNCTIONS -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: CALCULATES BESSEL FUNCTIONS -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD - COPIED FROM KANAMITSU LIBRARY -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. 
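Reviewer note, not part of the patch: GAULAT above finds the Gaussian latitudes by Newton iteration on the degree-K Legendre polynomial, seeded with the Bessel-function zeros from BSSLZ1, then returns GAUL = ACOS(root) in degrees. A quick cross-check is to take the Gauss-Legendre nodes from NumPy and convert them the same way (a different root-finder, so agreement is only to rounding):

```python
import numpy as np

def gaussian_colatitudes(k: int) -> np.ndarray:
    """Colatitudes (degrees) of the k-point Gaussian grid.

    Uses NumPy's Gauss-Legendre nodes in place of GAULAT's Bessel-seeded
    Newton iteration; the resulting set of angles should match GAULAT's.
    """
    nodes, _weights = np.polynomial.legendre.leggauss(k)
    return np.degrees(np.arccos(nodes))
```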
-C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE BSSLZ1(BES,N) - - IMPLICIT REAL(8) (A-H,O-Z) - DIMENSION BES(N) - DIMENSION BZ(50) - - DATA PI/3.14159265358979D0/ - DATA BZ / 2.4048255577D0, 5.5200781103D0, - $ 8.6537279129D0,11.7915344391D0,14.9309177086D0,18.0710639679D0, - $ 21.2116366299D0,24.3524715308D0,27.4934791320D0,30.6346064684D0, - $ 33.7758202136D0,36.9170983537D0,40.0584257646D0,43.1997917132D0, - $ 46.3411883717D0,49.4826098974D0,52.6240518411D0,55.7655107550D0, - $ 58.9069839261D0,62.0484691902D0,65.1899648002D0,68.3314693299D0, - $ 71.4729816036D0,74.6145006437D0,77.7560256304D0,80.8975558711D0, - $ 84.0390907769D0,87.1806298436D0,90.3221726372D0,93.4637187819D0, - $ 96.6052679510D0,99.7468198587D0,102.888374254D0,106.029930916D0, - $ 109.171489649D0,112.313050280D0,115.454612653D0,118.596176630D0, - $ 121.737742088D0,124.879308913D0,128.020877005D0,131.162446275D0, - $ 134.304016638D0,137.445588020D0,140.587160352D0,143.728733573D0, - $ 146.870307625D0,150.011882457D0,153.153458019D0,156.295034268D0/ - NN=N - IF(N.LE.50) GO TO 12 - BES(50)=BZ(50) - BES(51:N)=BES(50:N-1)+PI - NN=49 - 12 CONTINUE - BES(1:NN)=BZ(1:NN) - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: TRKSUB DETERMINES OBS. TROP. CYCLONE TRACKS -C PRGMMR: S. J. 
LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: CONTAINS VARIOUS ENTRY POINTS TO DETERMINE TROPICAL -C CYCLONE TRACKS, CALCULATE STORM RELATIVE COORDINATE, DETERMINES -C FIRST OCCURRENCE OF A PARTICULAR STORM. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL TRKSUB(IOVITL,IOPTZ,IDATTK,DAY0,DAYMN,DAYMX,DAYOFF, -C 1 STMDR0,STMSP0,STLAT0,STLON0,IERSET, -C 3 STLATP,STLONP,STDIRP,STSPDP,STDAYP, -C 4 STRMXP,STPCNP,STPENP,STVMXP,KSTPZ, -C 5 STDPTP,STMNTK) -C CALL SETTRK(IOVITL,IOPTZ,IDATTK,DAY0,DAYMN,DAYMX,DAYOFF, -C 1 STMDR0,STMSP0,STLAT0,STLON0,STMNTK,IERSET) -C INPUT ARGUMENT LIST: -C DAY0 - FRACTIONAL NUMBER OF DAYS SINCE 12/31/1899 -C DAYMX - FRACTIONAL NUMBER OF DAYS SINCE 12/31/1899 (MAX) -C DAYMN - FRACTIONAL NUMBER OF DAYS SINCE 12/31/1899 (MIN) -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE TRKSUB(IOVITL,IOPTZ,IDATTK,DAY0,DAYMN,DAYMX,DAYOFF, - 1 STMDR0,STMSP0,STLAT0,STLON0,IERSET,STLATP, - 2 STLONP,STDIRP,STSPDP,STDAYP,STRMXP,STPCNP, - 3 STPENP,STVMXP,KSTPZ,STDPTP,STMNTK) - - PARAMETER (MAXSTM=70) - PARAMETER (NSTM=MAXSTM,NSTM1=NSTM+1) - PARAMETER (NPRVMX=61) - - LOGICAL NOMIN,NOMAX,EXTRPB,EXTRPF - CHARACTER STMNTK*(*),STDPTP*1 - - SAVE - - DIMENSION STDPTP(-NPRVMX:-1) - - DIMENSION RINC(5) - - CHARACTER STMNAM*9,STMID*3,RSMC*4 - - LOGICAL FSTFLG - - DIMENSION STMNAM(MAXSTM),STMLAT(MAXSTM),STMLON(MAXSTM), - 1 STMDIR(MAXSTM),STMSPD(MAXSTM),IDATE(MAXSTM), - 2 IUTC(MAXSTM),RMAX(MAXSTM),PENV(MAXSTM),PCEN(MAXSTM), - 3 PTOP(MAXSTM),RSMC(MAXSTM),RMW(MAXSTM),VMAX(MAXSTM), - 4 
R15NW(MAXSTM),R15NE(MAXSTM),R15SE(MAXSTM),R15SW(MAXSTM), - 5 STMID(MAXSTM),FSTFLG(MAXSTM) - - PARAMETER (MAXTPC= 3) - - CHARACTER SHALO*1,MEDIUM*1,DEEP*1,STMTPC*1,EXE*1 - - DIMENSION STMTOP(0:MAXTPC) - - DIMENSION STMTPC(0:MAXTPC) - - EQUIVALENCE (STMTPC(0), EXE),(STMTPC(1),SHALO),(STMTPC(2),MEDIUM), - 1 (STMTPC(3),DEEP) - - DIMENSION TRKLTZ(0:NSTM1),TRKLNZ(0:NSTM1), - 1 TRKDRZ(0:NSTM1),TRKSPZ(0:NSTM1), - 2 TRKRMX(0:NSTM1),TRKPCN(0:NSTM1), - 3 TRKPEN(0:NSTM1),TRKVMX(0:NSTM1), - 4 TRKDPT(0:NSTM1) - - DIMENSION STMDAY(0:NSTM1),SRTDAY(NSTM),IDASRT(0:NSTM1), - 1 SRTLAT(NSTM),SRTLON(NSTM),SRTDIR(NSTM),SRTSPD(NSTM), - 2 ISRTDA(NSTM),ISRTUT(NSTM),SRTRMX(NSTM),SRTPCN(NSTM), - 3 SRTPEN(NSTM),SRTVMX(NSTM),SRTDPT(NSTM) - - DIMENSION STLATP(-NPRVMX:-1),STLONP(-NPRVMX:-1), - 1 STDIRP(-NPRVMX:-1),STSPDP(-NPRVMX:-1), - 1 STDAYP(-NPRVMX: 0),STRMXP(-NPRVMX:-1), - 1 STPCNP(-NPRVMX:-1),STPENP(-NPRVMX:-1), - 2 STVMXP(-NPRVMX:-1) - - EQUIVALENCE (TRKLTZ(1),STMLAT(1)),(TRKLNZ(1),STMLON(1)), - 1 (TRKDRZ(1),STMDIR(1)),(TRKSPZ(1),STMSPD(1)), - 2 (TRKRMX(1),RMAX (1)),(TRKPCN(1),PCEN (1)), - 3 (TRKPEN(1),PENV (1)),(TRKVMX(1),VMAX (1)), - 4 (TRKDPT(1),PTOP (1)) - - DATA SHALO/'S'/,MEDIUM/'M'/,DEEP/'D'/,EXE/'X'/, - 1 STMTOP/-99.0,700.,400.,200./ - -C FIVMIN IS FIVE MINUTES IN UNITS OF FRACTIONAL DAYS -C FACSPD IS CONVERSION FACTOR FOR R(DEG LAT)=V(M/S)*T(FRAC DAY)* - - DATA IPRNT/0/,FIVMIN/3.4722E-3/,FACSPD/0.77719/ - -C----------------------------------------------------------------------- - - ENTRY SETTRK(IOVITL,IOPTZ,IDATTK,DAY0,DAYMN,DAYMX,DAYOFF, - 1 STMDR0,STMSP0,STLAT0,STLON0,STMNTK,IERSET) - - IERSET=0 - IOPT=IOPTZ - IDTREQ=IDATTK - IF(IOPT .EQ. 5) THEN - STMID (1)=STMNTK(1:3) - ELSE IF(IOPT .EQ. 6) THEN - STMNAM(1)=STMNTK(1:9) - ELSE - WRITE(6,1) IOPT - 1 FORMAT(/'******ILLEGAL OPTION IN SETTRK=',I4) - IERSET=1 - RETURN - ENDIF - - WRITE(6,6) IOPT,STMNTK,DAY0,DAYMN,DAYMX,IDTREQ,IHRREQ - 6 FORMAT(/'...ENTERING SETTRK, WITH IOPT=',I2,'. 
LOOKING FOR ALL ', - 1 'FIXES FOR ',A,' WITH CENTRAL TIME=',F12.2,/4X,' MIN/MAX', - 2 ' TIMES=',2F12.2,' AND REQUESTED DATE/TIME=',2I10) - - CALL NEWVIT(IOVITL,IPRNT,IOPT,IERVIT,MAXSTM,KSTORM,IDTREQ,IHRREQ, - 1 IHRWIN,IDATE,IUTC,STMLAT,STMLON,STMDIR,STMSPD, - 2 PCEN,PENV,RMAX,VMAX,RMW,R15NE,R15SE,R15SW,R15NW, - 3 PTOP,FSTFLG,STMNAM,STMID,RSMC) - -C CONVERT FIX TIMES TO FLOATING POINT JULIAN DAY - - DO KST=1,KSTORM - CALL ZTIME(IDATE(KST),IUTC(KST),IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/), - $ 1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,STMDAY(KST)) - STMDAY(KST)=STMDAY(KST)+DAYOFF - -c WRITE(6,16) IDATE(KST),IUTC(KST),IYR,IMO,IDA,IHR,IMIN,JDY, -c 1 STMDAY(KST) -c 16 FORMAT('...STORM FIX TIMES ARE: IDATE,IUTC,IYR,IMO,IDA,IHR,IMIN,', -c 1 'JDY,STMDAY'/4X,8I8,F15.5) - - ENDDO - - CALL SORTRL(STMDAY(1:KSTORM),IDASRT(1:KSTORM),KSTORM) - -c WRITE(6,26) (STMDAY(KST),IDASRT(KST),KST=1,KSTORM) -c 26 FORMAT(/'...SORTED STORM DAYS AND INDEX ARE:'/(5X,F15.5,I6)) - -C PICK OUT TIMES AND LOCATIONS FROM SORTED LIST OF STORM DAYS - - NOMIN=.TRUE. - NOMAX=.TRUE. - EXTRPB=.FALSE. - EXTRPF=.FALSE. - KSRTMN=-1 - KSRTMX=-1 - - DO KSRT=1,KSTORM - - IF(STMDAY(KSRT) .GT. DAYMN .AND. NOMIN) THEN - NOMIN=.FALSE. - KSRTMN=KSRT-1 - ENDIF - - IF(STMDAY(KSRT) .GT. DAYMX .AND. NOMAX) THEN - NOMAX=.FALSE. - KSRTMX=KSRT - ENDIF - - ENDDO - - IF(KSRTMN .LE. 0) THEN - -C WE HAVENT'T BEEN ABLE TO FIND A STMDAY THAT IS LESS THAN 8 HOURS -C EARLIER THAN THE TIME WINDOW. EITHER THIS IS THE FIRST TIME -C THIS STORM HAS BEEN RUN OR THERE MAY BE AN ERROR. IN EITHER -C CASE, WE ALLOW EXTRAPOLATION OF THE OBSERVED MOTION BACK -C IN TIME, BUT SET AN ERROR FLAG. THE SAME METHOD IS -C USED FOR THE LAST RUN OF A PARTICULAR STORM. - - DT=STMDAY(1)-DAYMN - IF(DT .LE. 
0.333333) THEN - WRITE(6,41) KSTORM,KSRT,DAYMN,(STMDAY(KST),KST=1,KSTORM) - 41 FORMAT(/'######CANNOT FIND STORM RECORDS LESS THAN 8 HOURS ', - 1 'BEFORE WINDOW MINIMUM.'/7X,'THIS IS THE FIRST RECORD ', - 2 'FOR THIS STORM OR THERE MAY BE AN ERROR. KSTORM,KSRT,', - 3 'DAYMN,STMDAY=',2I4,F10.3/(5X,10F12.3)) - IERSET=41 - ENDIF - - EXTRPB=.TRUE. - KSRTMN=0 - ISRT=IDASRT(1) - IDASRT(KSRTMN)=0 - STMDAY(KSRTMN)=DAYMN - TRKDRZ(KSRTMN)=STMDIR(ISRT) - TRKSPZ(KSRTMN)=STMSPD(ISRT) - CALL DS2UV(USTM,VSTM,STMDIR(ISRT),STMSPD(ISRT)) - TRKLTZ(KSRTMN)=STMLAT(ISRT)-VSTM*DT*FACSPD - TRKLNZ(KSRTMN)=STMLON(ISRT)-USTM*DT*FACSPD/COSD(STMLAT(ISRT)) - TRKRMX(KSRTMN)=RMAX(ISRT) - TRKPCN(KSRTMN)=PCEN(ISRT) - TRKPEN(KSRTMN)=PENV(ISRT) - TRKVMX(KSRTMN)=VMAX(ISRT) - TRKDPT(KSRTMN)=PTOP(ISRT) - WRITE(6,39) ISRT,KSRTMN,STMDAY(KSRTMN),TRKDRZ(KSRTMN), - 1 TRKSPZ(KSRTMN),USTM,VSTM,DT,TRKLTZ(KSRTMN), - 2 TRKLNZ(KSRTMN),STMLAT(ISRT),STMLON(ISRT) - 39 FORMAT(/'...EXTRAPOLATING FIX BACKWARDS IN TIME: ISRT,KSRTMN,', - 1 '(STMDAY,TRKDRZ,TRKSPZ(KSRTMN)),USTM,VSTM,DT,'/41X, - 2 '(TRKLTZ,TRKLNZ(KSRTMN)),(STMLAT,STMLON(ISRT))='/40X, - 3 2I3,6F12.3/43X,4F12.3) - ENDIF - - IF(KSRTMX .LE. 0) THEN - DT=DAYMX-STMDAY(KSTORM) - IF(DT .LE. 0.333333) THEN - WRITE(6,51) KSTORM,KSRT,DAYMX,(STMDAY(KST),KST=1,KSTORM) - 51 FORMAT(/'######CANNOT FIND STORM RECORDS MORE THAN 8 HOURS ', - 1 'AFTER WINDOW MAXIMUM.'/7X,'THIS IS THE LAST RECORD ', - 2 'FOR THIS STORM OR THERE MAY BE AN ERROR. KSTORM,KSRT,', - 3 'DAYMX,STMDAY=',2I4,F10.3/(5X,10F12.3)) - IERSET=51 - ENDIF - - EXTRPF=.TRUE. 
- KSRTMX=KSTORM+1 - ISRT=IDASRT(KSTORM) - IDASRT(KSRTMX)=KSTORM+1 - STMDAY(KSRTMX)=DAYMX - TRKDRZ(KSRTMX)=STMDIR(ISRT) - TRKSPZ(KSRTMX)=STMSPD(ISRT) - CALL DS2UV(USTM,VSTM,TRKDRZ(ISRT),TRKSPZ(ISRT)) - TRKLTZ(KSRTMX)=STMLAT(ISRT)+VSTM*DT*FACSPD - TRKLNZ(KSRTMX)=STMLON(ISRT)+USTM*DT*FACSPD/COSD(STMLAT(ISRT)) - TRKRMX(KSRTMX)=RMAX(ISRT) - TRKPCN(KSRTMX)=PCEN(ISRT) - TRKPEN(KSRTMX)=PENV(ISRT) - TRKVMX(KSRTMX)=VMAX(ISRT) - TRKDPT(KSRTMX)=PTOP(ISRT) - WRITE(6,49) ISRT,STMDAY(KSRTMX),TRKDRZ(KSRTMX),TRKSPZ(KSRTMX), - 1 USTM,VSTM,DT,TRKLTZ(KSRTMX),TRKLNZ(KSRTMX), - 2 STMLAT(ISRT),STMLON(ISRT) - 49 FORMAT(/'...EXTRAPOLATING FIX FORWARDS IN TIME: ISRT,(STMDAY,', - 1 'TRKDIR,TRKSPD(KSRTMX)),USTM,VSTM,DT,'/41X,'(TRKLTZ,', - 2 'TRKLNZ(KSRTMX)),(STMLAT,STMLON(ISRT))='/40X,I3,6F12.3/ - 3 43X,4F12.3) - - ENDIF - - KK=1 - KST0=-1 - TIMMIN=1.E10 - -C PUT ALL FIXES THAT DEFINE THE TIME WINDOW INTO ARRAYS SORTED -C BY TIME. FIRST, ELIMINATE RECORDS WITH DUPLICATE TIMES. -C OUR ARBITRARY CONVENTION IS TO KEEP THE LATEST RECORD. ANY -C FIX TIME WITHIN 5 MINUTES OF ITS PREDECESSOR IN THE SORTED -C LIST IS CONSIDERED DUPLICATE. - - DO KST=KSRTMN,KSRTMX - IF(KST .GT. KSRTMN) THEN - IF(STMDAY(KST)-SRTDAY(KK) .LT. FIVMIN) THEN - WRITE(6,53) KST,KK,STMDAY(KST),SRTDAY(KK) - 53 FORMAT(/'...TIME SORTED FIX RECORDS SHOW A DUPLICATE, KST,KK,', - 1 'STMDAY(KST),SRTDAY(KK)=',2I5,2F12.3) - ELSE - KK=KK+1 - ENDIF - ENDIF - -C STORE SORTED LAT/LON, DIRECTION, SPEED FOR FUTURE USE. - - SRTLAT(KK)=TRKLTZ(IDASRT(KST)) - SRTLON(KK)=TRKLNZ(IDASRT(KST)) - SRTDIR(KK)=TRKDRZ(IDASRT(KST)) - SRTSPD(KK)=TRKSPZ(IDASRT(KST)) - SRTDAY(KK)=STMDAY(KST) - SRTRMX(KK)=TRKRMX(IDASRT(KST)) - SRTPCN(KK)=TRKPCN(IDASRT(KST)) - SRTPEN(KK)=TRKPEN(IDASRT(KST)) - SRTVMX(KK)=TRKVMX(IDASRT(KST)) - SRTDPT(KK)=TRKDPT(IDASRT(KST)) - -c fixit?? - to avoid subscript zero warning on next two lines, I did -c the following .... 
-cdak ISRTDA(KK)=IDATE(IDASRT(KST)) -cdak ISRTUT(KK)=IUTC (IDASRT(KST)) - if(IDASRT(KST).ne.0) then - ISRTDA(KK)=IDATE(IDASRT(KST)) - ISRTUT(KK)=IUTC (IDASRT(KST)) - else - ISRTDA(KK)=0 - ISRTUT(KK)=0 - end if - - IF(ABS(SRTDAY(KK)-DAY0) .LT. TIMMIN) THEN - IF(ABS(SRTDAY(KK)-DAY0) .LT. FIVMIN) KST0=KK - KSTZ=KK - TIMMIN=ABS(SRTDAY(KK)-DAY0) - ENDIF - ENDDO - - KSTMX=KK - -C ZERO OUT EXTRAPOLATED DATE AND TIME AS A REMINDER - - IF(EXTRPF) THEN - ISRTDA(KSTMX)=0 - ISRTUT(KSTMX)=0 - ENDIF - - IF(EXTRPB) THEN - ISRTDA(1)=0 - ISRTUT(1)=0 - ENDIF - - IF(KSTZ .EQ. KSTMX .AND. .NOT. (EXTRPB .OR. EXTRPF)) THEN - WRITE(6,61) KSTZ,KSTMX,(SRTDAY(KST),KST=1,KSTMX) - 61 FORMAT(/'******THE REFERENCE STORM INDEX IS THE MAXIMUM ALLOWED ', - 1 'A PROBABLE ERROR HAS OCCURRED'/8X,'KSTZ,KSTMX,SRTDAY=', - 2 2I5/(5X,10F12.3)) - CALL ABORT1(' SETTRK',61) - ENDIF - - IF(KST0 .LE. 0) THEN - WRITE(6,72) DAY0,KST0,(SRTDAY(KST),KST=1,KSTMX) - 72 FORMAT(/'******THERE IS NO FIX AT THE ANALYSIS TIME, AN ', - 1 'INTERPOLATED POSITION WILL BE CALCULATED'/5X,'DAY0,', - 2 'KST0,SRTDAY=',F12.3,I6/(5X,10F12.3)) - IF(DAY0-SRTDAY(KSTZ) .GT. 
0.0) THEN - RATIO=(DAY0-SRTDAY(KSTZ))/(SRTDAY(KSTZ+1)-SRTDAY(KSTZ)) - STLAT0=SRTLAT(KSTZ)+(SRTLAT(KSTZ+1)-SRTLAT(KSTZ))*RATIO - STLON0=SRTLON(KSTZ)+(SRTLON(KSTZ+1)-SRTLON(KSTZ))*RATIO - STMDR0=SRTDIR(KSTZ)+(SRTDIR(KSTZ+1)-SRTDIR(KSTZ))*RATIO - STMSP0=SRTSPD(KSTZ)+(SRTSPD(KSTZ+1)-SRTSPD(KSTZ))*RATIO - STDAY0=DAY0 - ELSE - RATIO=(DAY0-SRTDAY(KSTZ-1))/(SRTDAY(KSTZ)-SRTDAY(KSTZ-1)) - STLAT0=SRTLAT(KSTZ-1)+(SRTLAT(KSTZ)-SRTLAT(KSTZ-1))*RATIO - STLON0=SRTLON(KSTZ-1)+(SRTLON(KSTZ)-SRTLON(KSTZ-1))*RATIO - STMDR0=SRTDIR(KSTZ-1)+(SRTDIR(KSTZ)-SRTDIR(KSTZ-1))*RATIO - STMSP0=SRTSPD(KSTZ-1)+(SRTSPD(KSTZ)-SRTSPD(KSTZ-1))*RATIO - STDAY0=DAY0 - ENDIF - - ELSE - STLAT0=SRTLAT(KST0) - STLON0=SRTLON(KST0) - STMDR0=SRTDIR(KST0) - STMSP0=SRTSPD(KST0) - STDAY0=SRTDAY(KST0) - ENDIF - - WRITE(6,77) (KSRT,ISRTDA(KSRT),ISRTUT(KSRT), - 1 SRTDAY(KSRT),SRTLAT(KSRT),SRTLON(KSRT), - 2 SRTDIR(KSRT),SRTSPD(KSRT), - 3 SRTPCN(KSRT),SRTPEN(KSRT),SRTRMX(KSRT), - 4 SRTVMX(KSRT),SRTDPT(KSRT),KSRT=1,KSTMX) - 77 FORMAT(/'...FINAL SORTED LIST IS:'/6X,'YYYYMMDD',2X,'HHMM',4X, - 1 'RJDAY',7X,'LAT',7X,'LON',6X,'DIR',7X,'SPEED',4X,' PCEN', - 2 26X,'PENV',6X,'RMAX',5X,'VMAX',4X,'PTOP'/(1X,I3,2X,I8,2X, - 3 I4,8F10.2,2(3X,F5.1))) - - WRITE(6,79) STDAY0,STLAT0,STLON0,STMDR0,STMSP0 - 79 FORMAT(/'...THE REFERENCE TIME, LATITUDE, LONGITUDE, DIRECTION ', - 1 'AND SPEED ARE:',5F12.3) - - WRITE(6,89) - 89 FORMAT(/'...END SETTRK') - - RETURN - -C----------------------------------------------------------------------- - - ENTRY PRVSTM(STLATP,STLONP,STDIRP,STSPDP,STDAYP, - 1 STRMXP,STPCNP,STPENP,STVMXP,STDPTP,KSTPZ) - -C THIS ENTRY IS CURRENTLY SET UP TO RETURN THE TWO PREVIOUS -C SETS OF STORM LAT/LON, DIR/SPD, TIME. 
FOR CASES IN WHICH -C INSUFFICIENT STORM RECORDS HAVE BEEN FOUND, THE SLOTS ARE -C FILLED WITH -99.0 OR A DASH - -C KSTPZ IS THE NUMBER OF PREVIOUS, NON-EXTRAPOLATED, STORM RECORDS - - KSTPZ=MIN0(MAX0(KSTZ-1,0),NPRVMX) - STLATP(-NPRVMX:-1)=-99.0 - STLONP(-NPRVMX:-1)=-99.0 - STDIRP(-NPRVMX:-1)=-99.0 - STSPDP(-NPRVMX:-1)=-99.0 - STDAYP(-NPRVMX:-1)=-99.0 - STRMXP(-NPRVMX:-1)=-99.0 - STPCNP(-NPRVMX:-1)=-99.0 - STPENP(-NPRVMX:-1)=-99.0 - STVMXP(-NPRVMX:-1)=-99.0 - STDPTP(-NPRVMX:-1)='-' - - DO KSTP=1,KSTPZ - STLATP(-KSTP)=SRTLAT(KSTZ-KSTP) - STLONP(-KSTP)=SRTLON(KSTZ-KSTP) - STDIRP(-KSTP)=SRTDIR(KSTZ-KSTP) - STSPDP(-KSTP)=SRTSPD(KSTZ-KSTP) - STDAYP(-KSTP)=SRTDAY(KSTZ-KSTP) - STRMXP(-KSTP)=SRTRMX(KSTZ-KSTP) - STPCNP(-KSTP)=SRTPCN(KSTZ-KSTP) - STPENP(-KSTP)=SRTPEN(KSTZ-KSTP) - STVMXP(-KSTP)=SRTVMX(KSTZ-KSTP) - -C RECODE PRESSURE STORM DEPTH INTO A CHARACTER - - KTPC=0 - DO KTOP=1,MAXTPC - IF(SRTDPT(KSTZ-KSTP) .EQ. STMTOP(KTOP)) KTPC=KTOP - ENDDO - STDPTP(-KSTP)=STMTPC(KTPC) - - ENDDO - IF(EXTRPB .AND. KSTZ-KSTPZ .LE. 1) KSTPZ=KSTPZ-1 - - IF(KSTPZ .EQ. 0) THEN - WRITE(6,97) - 97 FORMAT(/'...NO STORM RECORDS PRECEDING THE REFERENCE TIME HAVE ', - 1 'BEEN FOUND BY PRVSTM.') - - ELSE - WRITE(6,98) KSTPZ,NPRVMX,STDAYP(-1) - 98 FORMAT(/'...PRVSTM HAS FOUND',I3,' STORM RECORDS PRECEDING THE ', - 1 'REFERENCE TIME (MAX ALLOWED=',I2,').'/4X,'THE TIME ', - 2 'CORRESPONDING TO INDEX -1 IS',F12.3,'.') - ENDIF - -C WRITE(6,99) KSTZ,KSTPZ,(STLATP(KK),STLONP(KK),STDIRP(KK), -C 1 STSPDP(KK),STDAYP(KK),STRMXP(KK),STPCNP(KK), -C 2 STPENP(KK),STVMXP(KK),KK=-1,-NPRVMX,-1) -C 99 FORMAT(/'...FROM PRVSTM, KSTZ,KSTPZ,STLATP,STLONP,STDIRP,STSPDP,', -C 1 'STDAYP,STRMXP,STPCNP,STPENP,STVMXP=',2I3/(5X,9F10.2)) - RETURN - -C----------------------------------------------------------------------- - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: NEWVIT READS TROPICAL CYCLONE VITAL STAT. FILE -C PRGMMR: S. J.
LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: GENERAL FILE READER FOR TROPICAL CYCLONE VITAL STATISTICS -C FILE. CAN FIND ALL STORMS OF A PARTICULAR NAME OR ID, ALL -C STORMS ON A PARTICULAR DATE/TIME AND VARIOUS COMBINATIONS OF -C THE ABOVE. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE NEWVIT(IOVITL,IPRNT,IOPT,IERVIT,MAXSTM,KSTORM,IDTREQ, - 1 IHRREQ,IHRWIN,IDATE,IUTC,STMLAT,STMLON,STMDIR,STMSPD, - 2 PCEN,PENV,RMAX,VMAX,RMW,R15NE,R15SE,R15SW,R15NW, - 3 PTOP,FSTFLG,STMNAM,STMID,RSMC) - - SAVE - - DIMENSION RINC(5) - - CHARACTER STMNAM*9,STMID*3,RSMC*4 - - LOGICAL FSTFLG - - DIMENSION STMNAM(MAXSTM),STMLAT(MAXSTM),STMLON(MAXSTM), - 1 STMDIR(MAXSTM),STMSPD(MAXSTM),IDATE(MAXSTM), - 2 IUTC(MAXSTM),RMAX(MAXSTM),PENV(MAXSTM),PCEN(MAXSTM), - 3 PTOP(MAXSTM),RSMC(MAXSTM),RMW(MAXSTM),VMAX(MAXSTM), - 4 R15NW(MAXSTM),R15NE(MAXSTM),R15SE(MAXSTM),R15SW(MAXSTM), - 5 STMID(MAXSTM),FSTFLG(MAXSTM) - - PARAMETER (MAXCHR=95) - PARAMETER (MAXVIT=15) - PARAMETER (MAXTPC= 3) - - CHARACTER BUFIN*1,RSMCZ*4,STMIDZ*3,STMNMZ*9,FSTFLZ*1,STMDPZ*1, - 1 LATNS*1,LONEW*1,FMTVIT*6,BUFINZ*100,STMREQ*9,RELOCZ*1 - CHARACTER BUFY2K*1 - - DIMENSION IVTVAR(MAXVIT),VITVAR(MAXVIT),VITFAC(MAXVIT), - 1 
ISTVAR(MAXVIT),IENVAR(MAXVIT),STMTOP(0:MAXTPC) - - DIMENSION BUFIN(MAXCHR),FMTVIT(MAXVIT) - DIMENSION BUFY2K(MAXCHR) - - EQUIVALENCE (BUFIN(1),RSMCZ),(BUFIN(5),RELOCZ),(BUFIN(6),STMIDZ), - 1 (BUFIN(10),STMNMZ),(BUFIN(19),FSTFLZ), - 2 (BUFIN(37),LATNS),(BUFIN(43),LONEW), - 3 (BUFIN(95),STMDPZ),(BUFIN(1),BUFINZ) - - EQUIVALENCE (IVTVAR(1),IDATEZ),(IVTVAR(2),IUTCZ) - - EQUIVALENCE (VITVAR( 3),STMLTZ),(VITVAR( 4),STMLNZ), - 1 (VITVAR( 5),STMDRZ),(VITVAR( 6),STMSPZ), - 2 (VITVAR( 7),PCENZ), (VITVAR( 8),PENVZ), - 3 (VITVAR( 9),RMAXZ), (VITVAR(10),VMAXZ), - 4 (VITVAR(11),RMWZ), (VITVAR(12),R15NEZ), - 5 (VITVAR(13),R15SEZ),(VITVAR(14),R15SWZ), - 6 (VITVAR(15),R15NWZ) - - DATA VITFAC/2*1.0,2*0.1,1.0,0.1,9*1.0/, - 1 FMTVIT/'(I8.8)','(I4.4)','(I3.3)','(I4.4)',2*'(I3.3)', - 2 3*'(I4.4)','(I2.2)','(I3.3)',4*'(I4.4)'/, - 3 ISTVAR/20,29,34,39,45,49,53,58,63,68,71,75,80,85,90/, - 4 IENVAR/27,32,36,42,47,51,56,61,66,69,73,78,83,88,93/, - 5 STMTOP/-99.0,700.,400.,200./ - -C FIVMIN IS FIVE MINUTES IN UNITS OF FRACTIONAL DAYS - - DATA FIVMIN/3.4722E-3/,IRDERM/20/,NUM/1/ - -C THIS SUBROUTINE READS A GLOBAL VITAL STATISTICS FILE FOR -C TROPICAL CYCLONES. THERE ARE A NUMBER OF OPTIONS (IOPT) -C UNDER WHICH THIS ROUTINE WILL OPERATE: -C 1) FIND ALL STORMS ON A SPECIFIED DATE/TIME (+WINDOW) -C 2) FIND A PARTICULAR STORM NAME ON A SPECIFIED DATE/TIME -C (+WINDOW) -C 3) FIND ALL OCCURRENCES OF A PARTICULAR STORM NAME -C 4) SAME AS OPTION 2, EXCEPT FOR A PARTICULAR STORM ID -C 5) SAME AS OPTION 3, EXCEPT FOR A PARTICULAR STORM ID -C 6) ALL OCCURRENCES OF A PARTICULAR STORM NAME, EVEN -C BEFORE IT HAD A NAME (FIND FIRST OCCURRENCE OF -C STORM NAME, SUBSTITUTE STORM ID, REWIND, THEN -C EXECUTE OPTION 5) - -C STORM ID POSITION CONTAINS THE BASIN IDENTIFIER IN THE -C LAST CHARACTER.
THESE ABBREVIATIONS ARE: -C NORTH ATLANTIC: L -C EAST PACIFIC: E -C CENTRAL PACIFIC: C -C WEST PACIFIC: W -C AUSTRALIAN: U -C SOUTH INDIAN: S -C SOUTH PACIFIC P -C N ARABIAN SEA A -C BAY OF BENGAL B -C SOUTH CHINA SEA O -C EAST CHINA SEA T - -C CHECK INPUT ARGUMENTS ACCORDING TO OPTION. ALSO DO OVERHEAD -C CHORES IF NECESSARY - - IERVIT=0 - STMREQ=' ' - IYRREQ=-9999 - - IF(IOPT .LE. 2 .OR. IOPT .EQ. 4) THEN - IF(IDTREQ .LE. 0) THEN - WRITE(6,11) IOPT,IDTREQ,IHRREQ,IHRWIN,MAXSTM,STMNAM(1),STMID(1) - 11 FORMAT(/'****** ILLEGAL DATE IN NEWVIT, IOPT,IDTREQ,IHRREQ,', - 1 'IHRWIN,MAXSTM,STMNAM,STMID='/9X,5I10,2X,A9,2X,A3) - IERVIT=10 - ENDIF - - IF(IHRREQ .LT. 0) THEN - WRITE(6,21) IOPT,IDTREQ,IHRREQ,IHRWIN,MAXSTM,STMNAM(1),STMID(1) - 21 FORMAT(/'****** ILLEGAL HOUR IN NEWVIT, IOPT,IDTREQ,IHRREQ,', - 1 'IHRWIN,MAXSTM,STMNAM,STMID='/9X,5I10,2X,A9,2X,A3) - IERVIT=20 - ENDIF - - IF(IHRWIN .LT. 0) THEN - WRITE(6,31) IOPT,IDTREQ,IHRREQ,IHRWIN,MAXSTM,STMNAM(1),STMID(1) - 31 FORMAT(/'****** ILLEGAL WINDOW IN NEWVIT, IOPT,IDTREQ,IHRREQ,', - 1 'IHRWIN,MAXSTM,STMNAM,STMID='/9X,5I10,2X,A9,2X,A3) - IERVIT=30 - -C SET UP PARAMETERS FOR TIME WINDOW - - ELSE IF(IHRWIN .GT. 0) THEN - CALL ZTIME(IDTREQ,IHRREQ,IYRWIN,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYRWIN,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0, - $ 0/),1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAY0) - -C NORMAL CASE - - WINDOW=REAL(IHRWIN)/24. - DAYPLS=DAY0+WINDOW+FIVMIN - DAYMNS=DAY0-WINDOW-FIVMIN - ENDIF - ENDIF - - IF(IOPT .EQ. 2 .OR. IOPT .EQ. 3 .OR. IOPT .EQ. 6) THEN - IF(STMNAM(1) .EQ. ' ') THEN - WRITE(6,41) IOPT,IDTREQ,IHRREQ,IHRWIN,MAXSTM,STMNAM(1),STMID(1) - 41 FORMAT(/'****** ILLEGAL STMNAM IN NEWVIT, IOPT,IDTREQ,IHRREQ,', - 1 'IHRWIN,MAXSTM,STMNAM,STMID='/9X,5I10,2X,A9,2X,A3) - IERVIT=40 - - ELSE - STMREQ=STMNAM(1) - ENDIF - - ELSE IF(IOPT .EQ. 4 .OR. IOPT .EQ. 5) THEN - IF(STMID(1) .EQ. 
' ') THEN - WRITE(6,51) IOPT,IDTREQ,IHRREQ,IHRWIN,MAXSTM,STMNAM(1),STMID(1) - 51 FORMAT(/'****** ILLEGAL STMID IN NEWVIT, IOPT,IDTREQ,IHRREQ,', - 1 'IHRWIN,MAXSTM,STMNAM,STMID='/9X,5I10,2X,A9,2X,A3) - IERVIT=50 - - ELSE - STMREQ=STMID(1) - ENDIF - - ELSE IF(IOPT .NE. 1) THEN - WRITE(6,61) IOPT,IDTREQ,IHRREQ,IHRWIN,MAXSTM,STMNAM(1),STMID(1) - 61 FORMAT(/'****** ILLEGAL OPTION IN NEWVIT, IOPT,IDTREQ,IHRREQ,', - 1 'IHRWIN,MAXSTM,STMNAM,STMID='/9X,5I10,2X,A9,2X,A3) - IERVIT=60 - ENDIF - -C FOR OPTIONS 3, 5, 6 (ALL OCCURRENCES OPTIONS), SEARCH IS -C RESTRICTED TO A SPECIFIC YEAR when idtreq is positive - - IF(IOPT .EQ. 3 .OR. IOPT .EQ. 5 .OR. IOPT .EQ. 6) - 1 IYRREQ=IDTREQ/10000 - -C ****** ERROR EXIT ****** - - IF(IERVIT .GT. 0) RETURN - -C INITIALIZE FILE AND COUNTERS - - 90 REWIND IOVITL - KREC=0 - KSTORM=0 - NERROR=0 - -C READ A RECORD INTO BUFFER - - 100 CONTINUE - - READ(IOVITL,101,ERR=990,END=200) (BUFIN(NCH),NCH=1,MAXCHR) - 101 FORMAT(95A1) - -C AT THIS POINT WE DO NOT KNOW IF A 2-DIGIT YEAR BEGINS IN COLUMN 20 -C OF THE RECORD (OLD NON-Y2K COMPLIANT FORM) OR IF A 4-DIGIT YEAR -C BEGINS IN COLUMN 20 (NEW Y2K COMPLIANT FORM) - TEST ON LOCATION OF -C LATITUDE N/S INDICATOR TO FIND OUT ... - - IF(BUFIN(35).EQ.'N' .OR. BUFIN(35).EQ.'S') THEN - -C ... THIS RECORD STILL CONTAINS THE OLD 2-DIGIT FORM OF THE YEAR -C ... 
THIS PROGRAM WILL CONVERT THE RECORD TO A 4-DIGIT YEAR USING THE -C "WINDOWING" TECHNIQUE SINCE SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 2-digit year "',BUFIN(20:21),'"' - PRINT *, ' ' - PRINT '(a,i0,a,a)', 'From unit ',iovitl,'; BUFIN-10: ',bufin - PRINT *, ' ' - BUFY2K(1:19) = BUFIN(1:19) - IF(BUFIN(20)//BUFIN(21).GT.'20') THEN - BUFY2K(20) = '1' - BUFY2K(21) = '9' - ELSE - BUFY2K(20) = '2' - BUFY2K(21) = '0' - ENDIF - BUFY2K(22:95) = BUFIN(20:93) - BUFIN = BUFY2K - PRINT *, ' ' - PRINT *, '==> 2-digit year converted to 4-digit year "', - $ BUFIN(20:23),'" via windowing technique' - PRINT *, ' ' - PRINT *, 'From unit ',iovitl,'; BUFIN-10: ',bufin - PRINT *, ' ' - - ELSE IF(BUFIN(37).EQ.'N' .OR. BUFIN(37).EQ.'S') THEN - -C ... THIS RECORD CONTAINS THE NEW 4-DIGIT FORM OF THE YEAR -C ... NO CONVERSION NECESSARY SINCE THIS SUBSEQUENT LOGIC EXPECTS THIS - - PRINT *, ' ' - PRINT *, '==> Read in RECORD from tcvitals file -- contains a', - $ ' 4-digit year "',BUFIN(20:23),'"' - PRINT *, ' ' - PRINT *, 'From unit ',iovitl,'; BUFIN-10: ',bufin - PRINT *, ' ' - PRINT *, '==> No conversion necessary' - PRINT *, ' ' - - ELSE - - PRINT *, ' ' - PRINT *, '***** Cannot determine if this record contains ', - $ 'a 2-digit year or a 4-digit year - skip it and try reading ', - $ 'the next record' - PRINT *, ' ' - GO TO 100 - - END IF - - KREC=KREC+1 - -C DECODE DATE AND TIME - - DO IV=1,2 - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BUFINZ) -c WRITE(6,109) IV,ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC, -c 1 FMTVIT(IV) -c 109 FORMAT(/'...DECODING VARIABLE #',I2,' ISTART,IEND,IVALUE,IER,', -c 1 'FMT=',2I4,I10,I3,2X,A10) - ENDDO - -C FILTER OUT RECORDS THAT ARE NOT GATHERED BY CURRENT OPTION - -C FIRST: DATE/TIME/WINDOW FILTER - - IF(IOPT .LE. 2 .OR. IOPT .EQ. 4) THEN - -C EXACT DATE/UTC ARE SPECIFIED - - IF(IHRWIN .EQ. 
0) THEN -C WRITE(6,117) IDATEZ,IUTCZ -C 117 FORMAT(/'...NO WINDOW OPTION: IDATEZ,IUTCZ=',2I10) - IF(IDTREQ .NE. IDATEZ) GO TO 100 - IF(IHRREQ .NE. IUTCZ ) GO TO 100 - - ELSE - CALL ZTIME(IDATEZ,IUTCZ,IYR,IMO,IDA,IHR,IMIN) - CALL W3DIFDAT((/IYR,IMO,IDA,0,0,0,0,0/),(/1899,12,31,0,0,0,0,0/) - $ ,1,RINC) - JDY = NINT(RINC(1)) - CALL FLDAY(JDY,IHR,IMIN,DAYZ) - -C WRITE(6,119) IYR,IMO,IDA,IHR,IMIN,JDY,DAYZ,DAYMNS,DAYPLS,IYRMNS -C 119 FORMAT('...DEBUGGING WINDOW TIME SELECTION: IYR,IMO,IDA,IHR,', -C 1 'IMIN,JDY,DAYZ,DAYMNS,DAYPLS,IYRMNS='/15X,6I5,3F12.4,I5) - -C YEAR WINDOW, THEN FRACTIONAL JULIAN DAY WINDOW - - IF(IYR .NE. IYRWIN) GO TO 100 - IF(DAYZ .LT. DAYMNS .OR. DAYZ .GT. DAYPLS) GO TO 100 - ENDIF - ENDIF - -C SECOND: STORM NAME FILTER - - IF(IOPT .EQ. 2 .OR. IOPT .EQ. 3 .OR. IOPT .EQ. 6) THEN - IF(IPRNT .GT. 0) WRITE(6,123) STMNMZ,STMREQ - 123 FORMAT('...STORM NAME FILTER, STMNMZ,STMREQ=',A9,2X,A9) - IF(STMNMZ .NE. STMREQ) GO TO 100 - IF(IOPT .EQ. 3 .OR. IOPT .EQ. 6) then - if(iyrreq .gt. 0 .and. IDATEZ/10000 .NE. IYRREQ) go to 100 - endif - -C FOR OPTION 6, BRANCH BACK TO LOOK FOR STORM ID INSTEAD OF -C STORM NAME - - IF(IOPT .EQ. 6) THEN - IOPT=5 - STMREQ=STMIDZ - GO TO 90 - ENDIF - - ENDIF - -C THIRD: STORM ID FILTER - - IF(IOPT .EQ. 4 .AND. STMIDZ .NE. STMREQ) GO TO 100 - IF(IOPT .EQ. 5 .AND. (STMIDZ .NE. STMREQ .OR. (iyrreq .gt. 0 - 1 .and. IDATEZ/10000 .NE. IYRREQ))) GO TO 100 - -C EUREKA - - IF(IPRNT .GT. 0) WRITE(6,137) STMREQ,KREC - 137 FORMAT('...REQUESTED STORM FOUND, NAME/ID=',A9,' AT RECORD #',I6) - - DO IV=3,MAXVIT - CALL DECVAR(ISTVAR(IV),IENVAR(IV),IVTVAR(IV),IERDEC,FMTVIT(IV), - 1 BUFINZ) - VITVAR(IV)=REAL(IVTVAR(IV))*VITFAC(IV) - ENDDO - -C DEPTH OF CYCLONIC CIRCULATION - - IF(STMDPZ .EQ. 'S') THEN - PTOPZ=STMTOP(1) - ELSE IF(STMDPZ .EQ. 'M') THEN - PTOPZ=STMTOP(2) - ELSE IF(STMDPZ .EQ. 'D') THEN - PTOPZ=STMTOP(3) - ELSE IF(STMDPZ .EQ. 
'X') THEN - PTOPZ=-99.0 -C WRITE(6,141) STMDPZ -C 141 FORMAT('******DEPTH OF CYCLONIC CIRCULATION HAS MISSING CODE=',A, -C 1 '.') - ELSE - WRITE(6,143) STMDPZ - 143 FORMAT('******ERROR DECODING DEPTH OF CYCLONIC CIRCULATION, ', - 1 'STMDPZ=',A1,'. ERROR RECOVERY NEEDED.') - ENDIF - -C ***************************************************** -C ***************************************************** -C **** IMPORTANT NOTES: **** -C **** **** -C **** ALL STORM LONGITUDES CONVERTED TO **** -C **** 0-360 DEGREES, POSITIVE EASTWARD !!! **** -C **** **** -C **** ALL STORM SPEEDS ARE IN M/SEC **** -C **** **** -C **** ALL DISTANCE DATA ARE IN KM **** -C **** **** -C **** ALL PRESSURE DATA ARE IN HPA (MB) **** -C ***************************************************** -C ***************************************************** - -C SIGN OF LATITUDE AND CONVERT LONGITUDE - - IF(LATNS .EQ. 'S') THEN - STMLTZ=-STMLTZ - ELSE IF(LATNS .NE. 'N') THEN - WRITE(6,153) STMLTZ,STMLNZ,LATNS - 153 FORMAT('******ERROR DECODING LATNS, ERROR RECOVERY NEEDED. ', - 1 'STMLTZ,STMLNZ,LATNS=',2F12.2,2X,A1) - GO TO 100 - ENDIF - - IF(LONEW .EQ. 'W') THEN - STMLNZ=360.-STMLNZ - ELSE IF(LONEW .NE. 'E') THEN - WRITE(6,157) STMLTZ,STMLNZ,LATNS - 157 FORMAT('******ERROR DECODING LONEW, ERROR RECOVERY NEEDED. ', - 1 'STMLTZ,STMLNZ,LATNS=',2F12.2,2X,A1) - ENDIF - - IF(IPRNT .EQ. 1) - 1 WRITE(6,161) IDATEZ,IUTCZ,STMLTZ,STMLNZ,STMDRZ,STMSPZ,PENVZ, - 2 PCENZ,RMAXZ,VMAXZ,RMWZ,R15NEZ,R15SEZ,R15SWZ,R15NWZ - 161 FORMAT('...ALL STORM DATA CALCULATED: IDATEZ,IUTCZ,STMLTZ,', - 1 'STMLNZ,STMDRZ,STMSPZ,PENVZ,PCENZ,RMAXZ,VMAXZ,RMWZ,', - 2 'R15NEZ,R15SEZ,R15SWZ,R15NWZ='/5X,2I10,13F8.2) - - IF(KSTORM .LT. 
MAXSTM) THEN - KSTORM=KSTORM+1 - IDATE(KSTORM)=IDATEZ - IUTC(KSTORM)=IUTCZ - PTOP(KSTORM)=PTOPZ - STMLAT(KSTORM)=STMLTZ - STMLON(KSTORM)=STMLNZ - STMDIR(KSTORM)=STMDRZ - STMSPD(KSTORM)=STMSPZ - STMNAM(KSTORM)=STMNMZ - STMID (KSTORM)=STMIDZ - RSMC (KSTORM)=RSMCZ - RMAX(KSTORM)=RMAXZ - PENV(KSTORM)=PENVZ - PCEN(KSTORM)=PCENZ - VMAX(KSTORM)=VMAXZ - RMW(KSTORM)=RMWZ - R15NE(KSTORM)=R15NEZ - R15SE(KSTORM)=R15SEZ - R15SW(KSTORM)=R15SWZ - R15NW(KSTORM)=R15NWZ - -C SET THE FIRST OCCURRENCE FLAG IF PRESENT - - IF(FSTFLZ .EQ. ':') THEN - FSTFLG(KSTORM)=.TRUE. - ELSE - FSTFLG(KSTORM)=.FALSE. - ENDIF - - GO TO 100 - - ELSE - GO TO 300 - ENDIF - - 200 CONTINUE - - IF(KSTORM .GT. 0) THEN - -C NORMAL RETURN HAVING FOUND REQUESTED STORM (S) AT DATE/TIME/WINDOW - - IF(IPRNT .EQ. 1) WRITE(6,201) STMREQ,IDTREQ,IHRREQ,KSTORM,KREC - 201 FORMAT(/'...FOUND STORM NAME/ID ',A12,' AT DATE, TIME=',I9,'/', - 1 I5,' UTC IN VITALS FILE.'/4X,I5,' RECORDS FOUND. ', - 2 'TOTAL NUMBER OF RECORDS READ=',I7) - RETURN - -C UNABLE TO FIND REQUESTED STORM AT DATE/TIME/WINDOW - - ELSE - IF(IOPT .EQ. 1) STMREQ='ALLSTORMS' - WRITE(6,207) IOPT,STMREQ,STMNMZ - 207 FORMAT(/'**** OPTION=',I3,' CANNOT FIND STORM NAME/ID=',A9, - 1 '... LAST STORM FOUND=',A9) - - WRITE(6,209) IDATEZ,IDTREQ,IUTCZ,IHRREQ - 209 FORMAT('**** CANNOT FIND REQUESTED DATE/TIME, (FOUND, ', - 1 'REQUESTED) (DATE/TIME)=',4I10/) - IERVIT=210 - RETURN - - ENDIF - - 300 WRITE(6,301) KSTORM - 301 FORMAT(/'******KSTORM EXCEEDS AVAILABLE SPACE, KSTORM=',I5) - RETURN - - 990 WRITE(6,991) BUFIN - 991 FORMAT('******ERROR READING STORM RECORD. BUFIN IS:'/' ******',A, - 1 '******') - NERROR=NERROR+1 - IF(NERROR .LE. IRDERM) GO TO 100 - IERVIT=990 - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: DECVAR DECODES VARIABLES -C PRGMMR: D. A. KEYSER ORG: NP22 DATE: 2004-06-08 -C -C ABSTRACT: DECODES A PARTICULAR INTEGER VARIABLE FROM AN INPUT -C CHARACTER- BASED RECORD IN THE TROPICAL CYCLONE VITAL STATISTICS -C FILE.
THIS IS DONE THROUGH AN INTERNAL READ. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C 2004-06-08 D. A. KEYSER - WHEN INTEGER VALUES ARE DECODED FROM -C CHARACTER-BASED RECORD VIA INTERNAL READ IN THIS SUBR., -C IF BYTE IN UNITS DIGIT LOCATION IS ERRONEOUSLY CODED AS -C BLANK (" "), IT IS REPLACED WITH A "5" IN ORDER TO -C PREVENT INVALID VALUE FROM BEING RETURNED (I.E., IF -C "022 " WAS READ, IT WAS DECODED AS "22", IT IS NOW -C DECODED AS "225" - THIS HAPPENED FOR VALUE OF RADIUS OF -C LAST CLOSED ISOBAR FOR JTWC RECORDS FROM 13 JULY 2000 -C THROUGH FNMOC FIX ON 26 MAY 2004 - THE VALUE WAS REPLACED -C BY CLIMATOLOGY BECAUSE IT FAILED A GROSS CHECK, HAD THIS -C CHANGE BEEN IN PLACE THE DECODED VALUE WOULD HAVE BEEN -C W/I 0.5 KM OF THE ACTUAL VALUE) -C -C USAGE: CALL DECVAR(ISTART,IEND,IVALUE,IERDEC,FMT,BUFF) -C INPUT ARGUMENT LIST: -C ISTART - INTEGER BYTE IN BUFF FROM WHICH TO BEGIN INTERNAL READ -C IEND - INTEGER BYTE IN BUFF FROM WHICH TO END INTERNAL READ -C FMT - CHARACTER*(*) FORMAT TO USE FOR INTERNAL READ -C BUFF - CHARACTER*(*) TROPICAL CYCLONE RECORD -C -C OUTPUT ARGUMENT LIST: -C IVALUE - INTEGER VALUE DECODED FROM BUFF -C IERDEC - ERROR RETURN CODE (= 0 - SUCCESSFUL DECODE, -C =10 - DECODE ERROR) -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: IF IERDEC = 10, IVALUE IS RETURNED AS -9, -99, -999 -C OR -9999999 DEPENDING UPON THE VALUE OF FMT. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE DECVAR(ISTART,IEND,IVALUE,IERDEC,FMT,BUFF) - - PARAMETER (NCHLIN=130) - - CHARACTER FMT*(*),BUFF*(*),BUFF_save*130,OUTLIN*1 - - SAVE - - DIMENSION OUTLIN(NCHLIN) - DIMENSION MISSNG(2:8) - - DATA MISSNG/-9,-99,-999,-9999,-99999,-999999,-9999999/ - -C WRITE(6,1) FMT,ISTART,IEND,BUFF -C 1 FORMAT(/'...FMT,ISTART,IEND=',A10,2I5/' ...BUFF=',A,'...') - - IF(BUFF(IEND:IEND).EQ.' 
') THEN - BUFF_save = BUFF - BUFF(IEND:IEND) = '5' - WRITE(6,888) IEND - 888 FORMAT(/' ++++++DECVAR: WARNING -- BLANK (" ") CHARACTER READ IN', - 1 ' UNITS DIGIT IN BYTE',I4,' OF RECORD - CHANGE TO "5" ', - 2 'AND CONTINUE DECODE'/) - OUTLIN=' ' - OUTLIN(IEND:IEND)='5' - WRITE(6,'(130A1)') OUTLIN - WRITE(6,'(A130/)') BUFF_save(1:130) - ENDIF - - READ(BUFF(ISTART:IEND),FMT,ERR=10) IVALUE - - IERDEC=0 - - RETURN - - 10 CONTINUE - - OUTLIN=' ' - OUTLIN(ISTART:IEND)='*' - - IVALUE = -9999999 - K = IEND - ISTART + 1 - IF(K.GT.1 .AND. K.LT.9) IVALUE = MISSNG(K) - - WRITE(6,31) OUTLIN - WRITE(6,32) BUFF(1:130),IVALUE - 31 FORMAT(/' ******DECVAR: ERROR DECODING, BUFF='/130A1) - 32 FORMAT(A130/7X,'VALUE RETURNED AS ',I9/) - - IERDEC=10 - - RETURN - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: TIMSUB PERFORMS TIME CHORES -C PRGMMR: D. A. KEYSER ORG: NP22 DATE: 1998-06-05 -C -C ABSTRACT: VARIOUS ENTRIES CONVERT 8 DIGIT YYYYMMDD INTO YEAR, MONTH -C AND DAY, AND FRACTIONAL JULIAN DAY FROM INTEGER JULIAN DAY, HOUR -C AND MINUTE. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C 1998-06-05 D. A. KEYSER - Y2K/F90 COMPLIANCE -C -C USAGE: CALL TIMSUB(IDATE,IUTC,IYR,IMO,IDA,IHR,IMIN,JDY,DAY) -C CALL FLDAY(JDY,IHR,IMIN,DAY) -C INPUT ARGUMENT LIST: -C IDATE - DATE IN FORM YYYYMMDD -C JDY - NUMBER OF DAYS SINCE 12/31/1899 -C -C OUTPUT ARGUMENT LIST: -C IYR - YEAR IN FORM YYYY -C IMO - MONTH OF YEAR -C IDA - DAY OF MONTH -C IHR - HOUR OF DAY -C IMIN - MINUTE OF HOUR -C DAY - FRACTIONAL NUMBER OF DAYS SINCE 12/31/1899 -C -C REMARKS: NONE. 
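For reviewers: the blank-units-digit repair that DECVAR performs (described in its 2004 history entry) can be sketched in Python. `decvar` below is a hypothetical stand-in, not the actual routine; it mirrors the replace-trailing-blank-with-'5' step and the width-dependent MISSNG values, using 1-based inclusive byte bounds like the Fortran.

```python
def decvar(buff, istart, iend):
    """Decode an integer field from buff[istart..iend] (1-based, inclusive).

    A blank units digit is replaced with '5' before decoding, matching the
    2004 fix (e.g. '022 ' decodes as 225 rather than 22). On a decode error,
    return a width-dependent missing value and error code 10.
    """
    field = buff[istart - 1:iend]
    if field.endswith(' '):
        field = field[:-1] + '5'  # blank units digit -> '5'
    try:
        return int(field), 0
    except ValueError:
        width = iend - istart + 1
        # MISSNG(2:8) = -9, -99, ..., -9999999 in the Fortran DATA statement
        missng = {k: -int('9' * (k - 1)) for k in range(2, 9)}
        return missng.get(width, -9999999), 10
```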
-C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE TIMSUB(IDATE,IUTC,IYR,IMO,IDA,IHR,IMIN,JDY,DAY) - -C----------------------------------------------------------------------- - - ENTRY ZTIME(IDATE,IUTC,IYR,IMO,IDA,IHR,IMIN) - -C PARSE 8 DIGIT YYYYMMDD INTO YEAR MONTH AND DAY - - IYR = IDATE/10000 - IMO =(IDATE-IYR*10000)/100 - IDA = IDATE-IYR*10000-IMO*100 - IHR =IUTC/100 - IMIN=IUTC-IHR*100 - RETURN - -C----------------------------------------------------------------------- -C THIS ENTRY CALCULATES THE FRACTIONAL JULIAN DAY FROM INTEGERS -C JULIAN DAY, HOUR AND MINUTE (ACTUALLY, JDY HERE IS NO. OF DAYS -C SINCE 12/31/1899) - - ENTRY FLDAY(JDY,IHR,IMIN,DAY) - DAY=REAL(JDY)+(REAL(IHR)*60.+REAL(IMIN))/1440. - RETURN - -C----------------------------------------------------------------------- - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: YTIME GETS INTEGER YYYY, YYYYMMDD, HHMM -C PRGMMR: D. A. KEYSER ORG: NP22 DATE: 1998-10-29 -C -C ABSTRACT: CALCULATES 8-DIGIT INTEGER YYYYMMDD, 4-DIGIT INTEGER YYYY, -C AND 6-DIGIT INTEGER HHMMSS FROM FRACTIONAL NUMBER OF DAYS SINCE -C 12/31/1899 -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C 1998-10-29 D. A. KEYSER - Y2K/F90 COMPLIANCE -C -C USAGE: CALL YTIME(IYR,DAYZ,IDATE,JUTC) -C INPUT ARGUMENT LIST: -C DAYZ - FRACTIONAL NUMBER OF DAYS SINCE 12/31/1899 -C -C OUTPUT ARGUMENT LIST: -C IYR - YEAR (YYYY) -C IDATE - DATE IN FORM YYYYMMDD -C JUTC - TIME IN FORM HHMMSS -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE YTIME(IYR,DAYZ,IDATE,JUTC) - DIMENSION JDAT(8) - - CALL W3MOVDAT((/DAYZ,0.,0.,0.,0./),(/1899,12,31,0,0,0,0,0/),JDAT) - IYR = JDAT(1) - IMO = JDAT(2) - IDA = JDAT(3) - IHR = JDAT(5) - IMN = JDAT(6) - ISC = JDAT(7) - - IDATE=IDA+(100*IMO)+(10000*IYR) - JUTC =ISC+100*IMN+10000*IHR - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: SORTRL SORTS REAL NUMBERS -C PRGMMR: S. J.
LORD ORG: NP22 DATE: 1991-06-04 -C -C ABSTRACT: SORTS REAL NUMBERS. OUTPUT ARRAY IS THE INDEX OF -C THE INPUT VALUES THAT ARE SORTED. -C -C PROGRAM HISTORY LOG: -C 1991-06-04 S. J. LORD (MODIFIED FROM NCAR CODE) -C -C USAGE: CALL SORTRL(A,LA,NL) -C INPUT ARGUMENT LIST: -C A - ARRAY OF ELEMENTS TO BE SORTED. -C NL - NUMBER OF ELEMENTS TO BE SORTED. -C -C OUTPUT ARGUMENT LIST: -C LA - INTEGER ARRAY CONTAINING THE INDEX OF THE SORTED -C - ELEMENTS. SORTING IS FROM SMALL TO LARGE. E.G. -C - LA(1) CONTAINS THE INDEX OF THE SMALLEST ELEMENT IN -C - ARRAY. LA(NL) CONTAINS THE INDEX OF THE LARGEST. -C -C -C REMARKS: NONE -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE SORTRL(A,LA,NL) - -C ENTRY SORTRL(A,LA,NL) SORT UP REAL NUMBERS -C ** REVISED (6/13/84) FOR THE USE IN VAX-11 -C ARGUMENTS ... -C A INPUT ARRAY OF NL ELEMENTS TO BE SORTED OR RE-ORDERED -C LA OUTPUT ARRAY OF NL ELEMENTS IN WHICH THE ORIGINAL LOCATION -C OF THE SORTED ELEMENTS OF A ARE SAVED, OR -C INPUT ARRAY TO SPECIFY THE REORDERING OF ARRAY A BY SORTED -C NL THE NUMBER OF ELEMENTS TO BE TREATED - - SAVE - - DIMENSION A(NL),LA(NL),LS1(64),LS2(64) - DATA NSX/64/ - -C SET THE ORIGINAL ORDER IN LA - - DO L=1,NL - LA(L)=L - ENDDO - -C SEPARATE NEGATIVES FROM POSITIVES - - L = 0 - M = NL + 1 - 12 L = L + 1 - IF(L.GE.M) GO TO 19 - IF(A(L)) 12,15,15 - 15 M = M - 1 - IF(L.GE.M) GO TO 19 - IF(A(M)) 18,15,15 - 18 AZ = A(M) - A(M) = A(L) - A(L) = AZ - LZ = LA(M) - LA(M) = LA(L) - LA(L) = LZ - GO TO 12 - 19 L = L - 1 - -C NOTE THAT MIN AND MAX FOR INTERVAL (1,NL) HAVE NOT BEEN DETERMINED - - LS1(1) = 0 - L2 = NL + 1 - NS = 1 - -C STEP UP - - 20 LS1(NS) = LS1(NS) + 1 - LS2(NS) = L - NS = NS + 1 - IF(NS.GT.NSX) GO TO 80 - L1 = L + 1 - LS1(NS) = L1 - L2 = L2 - 1 - GO TO 40 - -C STEP DOWN - - 30 NS=NS-1 - IF (NS.LE.0) GO TO 90 - L1 = LS1(NS) - L2 = LS2(NS) - 40 IF(L2.LE.L1) GO TO 30 - -C FIND MAX AND MIN OF THE INTERVAL (L1,L2) - - IF (A(L1)-A(L2) .LE. 
0) GO TO 52 - AN = A(L2) - LN = L2 - AX = A(L1) - LX = L1 - GO TO 54 - 52 AN = A(L1) - LN = L1 - AX = A(L2) - LX = L2 - 54 L1A = L1 + 1 - L2A = L2 - 1 - IF(L1A.GT.L2A) GO TO 60 - - DO L=L1A,L2A - IF (A(L)-AX .GT. 0) GO TO 56 - IF (A(L)-AN .GE. 0) GO TO 58 - AN = A(L) - LN = L - GO TO 58 - 56 AX = A(L) - LX = L - 58 CONTINUE - ENDDO - -C IF ALL ELEMENTS ARE EQUAL (AN=AX), STEP DOWN - - 60 IF (AN .EQ. AX) GO TO 30 - -C PLACE MIN AT L1, AND MAX AT L2 -C IF EITHER LN=L2 OR LX=L1, FIRST EXCHANGE L1 AND L2 - - IF(LN.EQ.L2.OR.LX.EQ.L1) GO TO 62 - GO TO 64 - 62 AZ=A(L1) - A(L1)=A(L2) - A(L2)=AZ - LZ=LA(L1) - LA(L1)=LA(L2) - LA(L2)=LZ - -C MIN TO L1, IF LN IS NOT AT EITHER END - - 64 IF(LN.EQ.L1.OR.LN.EQ.L2) GO TO 66 - A(LN)=A(L1) - A(L1)=AN - LZ=LA(LN) - LA(LN)=LA(L1) - LA(L1)=LZ - -C MAX TO L2, IF LX IS NOT AT EITHER END - - 66 IF(LX.EQ.L2.OR.LX.EQ.L1) GO TO 68 - A(LX)=A(L2) - A(L2)=AX - LZ=LA(LX) - LA(LX)=LA(L2) - LA(L2)=LZ - -C IF ONLY THREE ELEMENTS IN (L1,L2), STEP DOWN. - - 68 IF(L1A.GE.L2A) GO TO 30 - -C SET A CRITERION TO SPLIT THE INTERVAL (L1A,L2A) -C AC IS AN APPROXIMATE ARITHMETIC AVERAGE OF AX AND AN, -C PROVIDED THAT AX IS GREATER THAN AN. (IT IS THE CASE, HERE) -C ** IF A IS DISTRIBUTED EXPONENTIALLY, GEOMETRIC MEAN MAY -C BE MORE EFFICIENT - - AC = (AX+AN)/2 - -C MIN AT L1 AND MAX AT L2 ARE OUTSIDE THE INTERVAL - - L = L1 - M = L2 - 72 L = L + 1 - IF(L.GE.M) GO TO 78 -cc 73 CONTINUE - IF (A(L)-AC .LE. 0) GO TO 72 - 75 M = M - 1 - IF(L.GE.M) GO TO 78 -cc 76 CONTINUE - IF (A(M)-AC .GT. 0) GO TO 75 - AZ = A(M) - A(M) = A(L) - A(L) = AZ - LZ = LA(M) - LA(M) = LA(L) - LA(L) = LZ - GO TO 72 - -C SINCE 75 IS ENTERED ONLY IF 73 IS FALSE, 75 IS NOT TENTATIVE -C BUT 72 IS TENTATIVE, AND MUST BE CORRECTED IF NO FALSE 76 OCCURS - - 78 L = L - 1 - GO TO 20 - 80 WRITE(6,85) NSX - 85 FORMAT(/' === SORTING INCOMPLETE. SPLIT EXCEEDED',I3,' ==='/) - 90 RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . 
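For reviewers: the net effect of the SORTRL partition sort above is an index sort, where LA ends up holding the positions of the elements from smallest to largest. A hypothetical Python stand-in is far shorter (0-based indices here versus the Fortran's 1-based ones; the original also physically reorders A while it partitions, which this sketch does not):

```python
def sortrl(a):
    """Return the indices of a's elements in ascending order of value,
    so a[result[0]] is the smallest element and a[result[-1]] the largest."""
    return sorted(range(len(a)), key=lambda i: a[i])
```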
-C SUBPROGRAM: DS2UV CONVERTS DIRECTION/SPEED TO U/V MOTION -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: CONVERTS DIRECTION AND SPEED TO ZONAL AND MERIDIONAL -C MOTION. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE DS2UV(UZ,VZ,DIRZ,SPDZ) - -C THIS SUBROUTINE PRODUCES U, V CARTESIAN WINDS FROM DIRECTION,SPEED -C ****** IMPORTANT NOTE: DIRECTION IS DIRECTION WIND IS -C BLOWING, THE OPPOSITE OF METEOROLOGICAL CONVENTION *** - - UZ=SPDZ*SIND(DIRZ) - VZ=SPDZ*COSD(DIRZ) - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: ATAN2D ARC TAN FUNCTION FROM DEGREES INPUT -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: ARC TAN FUNCTION FROM DEGREES INPUT. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. 
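For reviewers: DS2UV above converts a direction/speed pair to Cartesian (u, v), with the direction meaning the heading the storm moves toward, the opposite of meteorological convention, as its comment warns. A hypothetical Python sketch (using exact degree-to-radian conversion rather than the truncated DEGRAD constant 0.017453):

```python
import math

def ds2uv(direction_deg, speed):
    """Zonal and meridional motion components from a heading (degrees,
    direction of motion TOWARD) and a speed; u = spd*sin(dir), v = spd*cos(dir)."""
    rad = math.radians(direction_deg)
    return speed * math.sin(rad), speed * math.cos(rad)
```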
-C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION ATAN2D(ARG1,ARG2) - -C DEGRAD CONVERTS DEGREES TO RADIANS - - DATA DEGRAD/0.017453/ - IF(ARG1 .EQ. 0.0 .AND. ARG2 .EQ. 0.0) THEN - ATAN2D=0.0 - ELSE - ATAN2D=ATAN2(ARG1,ARG2)/DEGRAD - ENDIF - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: SIND SINE FUNCTION FROM DEGREES INPUT -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: SINE FUNCTION FROM DEGREES INPUT. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION SIND(ARG) - -C DEGRAD CONVERTS DEGREES TO RADIANS - - DATA DEGRAD/0.017453/ - SIND=SIN(ARG*DEGRAD) - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: COSD COSINE FUNCTION FROM DEGREES INPUT -C PRGMMR: S. J. 
LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: RETURNS COSINE FUNCTION FROM DEGREES INPUT -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION COSD(ARG) - -C DEGRAD CONVERTS DEGREES TO RADIANS - - DATA DEGRAD/0.017453/ - COSD=COS(ARG*DEGRAD) - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: DISTSP DISTANCE ON GREAT CIRCLE -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: CALCULATES DISTANCE ON GREAT CIRCLE BETWEEN TWO LAT/LON -C POINTS. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: DXY=DISTSP(DLAT1,DLON1,DLAT2,DLON2) -C INPUT ARGUMENT LIST: -C DLAT1 - LATITUDE OF POINT 1 (-90<=LAT<=90) -C DLON1 - LONGITUDE OF POINT 1 (-180 TO 180 OR 0 TO 360) -C DLAT2 - LATITUDE OF POINT 2 (-90<=LAT<=90) -C DLON2 - LONGITUDE OF POINT 2 -C -C -C REMARKS: DISTANCE IS IN METERS -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - FUNCTION DISTSP(DLAT1,DLON1,DLAT2,DLON2) - DATA REARTH/6.37E6/ - - XXD=COSD(DLON1-DLON2)*COSD(DLAT1)*COSD(DLAT2)+ - 1 SIND(DLAT1)*SIND(DLAT2) - - XXM=AMIN1(1.0,AMAX1(-1.0,XXD)) - - DISTSP=ACOS(XXM)*REARTH - RETURN - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . .
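For reviewers: DISTSP above is the spherical law of cosines with the argument clamped into [-1, 1] before ACOS (the AMIN1/AMAX1 pair guards against roundoff). A hypothetical Python sketch of the same computation, using the same 6.37e6 m Earth radius:

```python
import math

REARTH = 6.37e6  # Earth radius in metres, matching the Fortran DATA statement

def distsp(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points given in degrees,
    via the spherical law of cosines; the cosine argument is clamped to
    [-1, 1] so roundoff cannot push it outside acos's domain."""
    la1, la2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon1 - lon2)
    x = math.cos(dlon) * math.cos(la1) * math.cos(la2) \
        + math.sin(la1) * math.sin(la2)
    return math.acos(max(-1.0, min(1.0, x))) * REARTH
```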
-C SUBPROGRAM: AVGSUB CALCULATES AVERAGES -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-06 -C -C ABSTRACT: CALCULATES AVERAGES WEIGHTED AND UNWEIGHTED FOR ALL -C INPUT NUMBERS OR JUST POSITIVE ONES. -C -C PROGRAM HISTORY LOG: -C 1991-06-06 S. J. LORD -C -C USAGE: CALL PGM-NAME(INARG1, INARG2, WRKARG, OUTARG1, ... ) -C INPUT ARGUMENT LIST: -C INARG1 - GENERIC DESCRIPTION, INCLUDING CONTENT, UNITS, -C INARG2 - TYPE. EXPLAIN FUNCTION IF CONTROL VARIABLE. -C -C OUTPUT ARGUMENT LIST: (INCLUDING WORK ARRAYS) -C WRKARG - GENERIC DESCRIPTION, ETC., AS ABOVE. -C OUTARG1 - EXPLAIN COMPLETELY IF ERROR RETURN -C ERRFLAG - EVEN IF MANY LINES ARE NEEDED -C -C INPUT FILES: (DELETE IF NO INPUT FILES IN SUBPROGRAM) -C DDNAME1 - GENERIC NAME & CONTENT -C -C OUTPUT FILES: (DELETE IF NO OUTPUT FILES IN SUBPROGRAM) -C DDNAME2 - GENERIC NAME & CONTENT AS ABOVE -C FT06F001 - INCLUDE IF ANY PRINTOUT -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE AVGSUB(XX,WT,LX,AVX) - - DIMENSION XX(LX),WT(LX) - - AVX=0.0 - N=0 - DO L=1,LX - AVX=AVX+XX(L) - N=N+1 - ENDDO - AVX=AVX/REAL(N) - RETURN - -C----------------------------------------------------------------------- - - ENTRY WTAVRG(XX,WT,LX,AVX) - - AVX=0.0 - W=0.0 - DO L=1,LX - AVX=AVX+XX(L)*WT(L) - W=W+WT(L) - ENDDO - AVX=AVX/W - RETURN - -C----------------------------------------------------------------------- - - ENTRY WTAVGP(XX,WT,LX,AVX) - - AVX=0.0 - W=0.0 - DO L=1,LX - IF(XX(L) .GE. 0.0) THEN - AVX=AVX+XX(L)*WT(L) - W=W+WT(L) - ENDIF - ENDDO - IF(W .NE. 0.0) THEN - AVX=AVX/W - ELSE - AVX=XX(1) - ENDIF - RETURN - -C----------------------------------------------------------------------- - - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: ABORT1 ERROR EXIT ROUTINE -C PRGMMR: S. J. 
LORD ORG: NP22 DATE: 1991-06-05 -C -C ABSTRACT: ERROR TERMINATION ROUTINE THAT LISTS ROUTINE WHERE -C ERROR OCCURRED AND THE NEAREST STATEMENT NUMBER. -C -C PROGRAM HISTORY LOG: -C 1991-06-05 S. J. LORD -C -C USAGE: CALL ABORT1(KENTRY,ISTMT) -C INPUT ARGUMENT LIST: -C KENTRY - CHARACTER VARIABLE (*7) GIVING PROGRAM OR SUBROUTINE -C - WHERE ERROR OCCURRED. -C ISTMT - STATEMENT NUMBER NEAR WHERE ERROR OCCURRED. -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: THIS ROUTINE IS CALLED WHENEVER AN INTERNAL PROBLEM -C TO THE CODE IS FOUND. EXAMPLES ARE CALLING PARAMETERS THAT -C WILL OVERFLOW ARRAY BOUNDARIES AND OBVIOUS INCONSISTENCIES -C IN NUMBERS GENERATED BY THE CODE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE ABORT1(KENTRY,ISTMT) - CHARACTER*7 KENTRY - WRITE(6,10) KENTRY,ISTMT - 10 FORMAT(//21X,'*********************************************'/ - 1 21X,'*********************************************'/ - 2 21X,'**** PROGRAM FAILED DUE TO FATAL ERROR ****'/ - 3 21X,'**** IN ROUTINE ',A,' NEAR ****'/ - 4 21X,'**** STATEMENT NUMBER',I5,'. ****'/ - 5 21X,'*********************************************'/ - 6 21X,'*********************************************') - CALL W3TAGE('SYNDAT_QCTROPCY') - call ERREXIT (20) - END - -C$$$ SUBPROGRAM DOCUMENTATION BLOCK -C . . . . -C SUBPROGRAM: OFILE0 OPENS ALL DATA FILES LISTED IN TEXT FILE -C PRGMMR: S. J. LORD ORG: NP22 DATE: 1991-06-07 -C -C ABSTRACT: OPENS ALL OF THE DATA FILES READ FROM A LIST IN A TEXT -C FILE. -C -C PROGRAM HISTORY LOG: -C 1991-06-07 S. J. 
LORD -C -C USAGE: CALL OFILE0(IUNTOP,NFILMX,NFTOT,FILNAM) -C INPUT ARGUMENT LIST: -C IUNTOP - UNIT NUMBER OF TEXT FILE ASSOCIATING UNIT NUMBERS -C - WITH FILE NAMES -C FILNAM - FILE NAMES (UPON INPUT ONLY ELEMENT 0 STORED - -C - THE FILE NAME ASSOCIATED WITH UNIT IUNTOP) -C NFILMX - THE MAXIMUM NUMBER OF FILES THAT CAN BE OPENED IN -C - THIS SUBROUTINE -C -C OUTPUT ARGUMENT LIST: -C NFTOT - NUMBER OF DATA FILES OPENED IN THIS SUBROUTINE -C -C INPUT FILES: -C UNIT "IUNTOP" -C - TEXT FILE ASSOCIATING UNIT NUMBERS WITH FILE NAMES -C MANY - READ FROM LIST IN UNIT IUNTOP -C -C OUTPUT FILES: -C UNIT 06 - STANDARD OUTPUT PRINT -C -C REMARKS: NONE. -C -C ATTRIBUTES: -C MACHINE: IBM-SP -C LANGUAGE: FORTRAN 90 -C -C$$$ - SUBROUTINE OFILE0(IUNTOP,NFILMX,NFTOT,FILNAM) - - PARAMETER (IDGMAX=7) - - SAVE - - CHARACTER FILNAM*(*),CFORM*11,CSTAT*7,CACCES*10,MACHIN*10, - 1 CFZ*1,CSTZ*1,CACZ*1,CPOS*10 - - DIMENSION IUNIT(NFILMX),CFORM(NFILMX),CSTAT(NFILMX), - 1 CACCES(NFILMX),CPOS(NFILMX) - DIMENSION FILNAM(0:NFILMX) - - INTEGER(4) IARGC,NDEF - - NF=0 - -C DEFAULT FILENAME IS SPECIFIED BY THE CALLING PROGRAM. -C RUNNING THE PROGRAM WITH ARGUMENTS ALLOWS -C YOU TO SPECIFY THE FILENAM AS FOLLOWS: - - NDEF=IARGC() - - IF(NDEF .LT. 
0) CALL GETARG(1_4,FILNAM(0)) - - LENG0=INDEX(FILNAM(0),' ')-1 - WRITE(6,5) NDEF,FILNAM(0)(1:LENG0) - 5 FORMAT(/'...SETTING UP TO READ I/O FILENAMES AND OPEN PARMS.', - 1 ' NDEF,FILNAM(0)=',I2,1X,'...',A,'...') - - OPEN(UNIT=IUNTOP,FORM='FORMATTED',STATUS='OLD',ACCESS= - 1 'SEQUENTIAL',FILE=FILNAM(0)(1:leng0),ERR=95,IOSTAT=IOS) - - READ(IUNTOP,11,ERR=90) MACHIN - 11 FORMAT(A) - WRITE(6,13) MACHIN - 13 FORMAT('...READY TO READ FILES TO OPEN ON MACHINE ',A) - - DO IFILE=1,NFILMX - NF=NF+1 - READ(IUNTOP,21,END=50,ERR=90,IOSTAT=IOS) IUNIT(NF), - 1 CFZ,CSTZ,CACZ,FILNAM(NF) - 21 FORMAT(I2,3(1X,A1),1X,A) - - LENGTH=INDEX(FILNAM(NF),' ')-1 - WRITE(6,23) NF,IUNIT(NF),CFZ,CSTZ,CACZ,FILNAM(NF)(1:LENGTH) - 23 FORMAT('...FOR FILE #',I3,', READING IUNIT, ABBREVIATIONS CFZ', - 1 ',CSTZ,CACZ='/4X,I3,3(1X,A,1X),5x,'...FILENAME=',A,'...') - -c Interpret the abbreviations - - if(CFZ .eq. 'f' .or. CFZ .eq. 'F') then - cform(nf)='FORMATTED' - else if(CFZ .eq. 'u' .or. CFZ .eq. 'U') then - cform(nf)='UNFORMATTED' - else - write(6,25) CFZ - 25 format('******option ',a,' for format is not allowed. Abort') - call abort1(' OFILE0',25) - endif - - if(CSTZ .eq. 'o' .or. CSTZ .eq. 'O') then - cstat(nf)='OLD' - else if(CSTZ .eq. 'n' .or. CSTZ .eq. 'N') then - cstat(nf)='NEW' - else if(CSTZ .eq. 'k' .or. CSTZ .eq. 'K') then - cstat(nf)='UNKNOWN' - else if(CSTZ .eq. 's' .or. CSTZ .eq. 'S') then - cstat(nf)='SCRATCH' - else - write(6,27) CSTZ - 27 format('******option ',a,' for status is not allowed. Abort') - call abort1(' OFILE0',27) - endif - - cpos(nf)=' ' - if(CACZ .eq. 'd' .or. CACZ .eq. 'D') then - cacces(nf)='DIRECT' - else if(CACZ .eq. 'q' .or. CACZ .eq. 'Q') then - cacces(nf)='SEQUENTIAL' - else if(CACZ .eq. 'a' .or. CACZ .eq. 'A') then - cacces(nf)='APPEND' - else if(CACZ .eq. 's' .or. CACZ .eq. 'S') then - cacces(nf)='SEQUENTIAL' - cpos(nf)='APPEND' - else if(CACZ .eq. 't' .or. CACZ .eq. 
'T') then - cacces(nf)='DIRECT' - cpos(nf)='APPEND' - else - write(6,29) CACZ - 29 format('******option ',a,' for access is not allowed. Abort') - call abort1(' OFILE0',29) - endif - - IF(CACCES(NF) .NE. 'DIRECT') THEN - if(cpos(nf) .eq. ' ') then - if (cstat(nf).eq.'OLD') then - OPEN(UNIT=IUNIT(NF),FORM=cform(nf),STATUS='OLD', - 1 ACCESS=cacces(nf),FILE=FILNAM(NF)(1:LENGTH), - 2 ERR=95,IOSTAT=IOS) - elseif (cstat(nf).eq.'NEW') then - OPEN(UNIT=IUNIT(NF),FORM=cform(nf),STATUS='NEW', - 1 ACCESS=cacces(nf),FILE=FILNAM(NF)(1:LENGTH), - 2 ERR=95,IOSTAT=IOS) - elseif (cstat(nf).eq.'UNKNOWN') then - OPEN(UNIT=IUNIT(NF),FORM=cform(nf),STATUS='UNKNOWN', - 1 ACCESS=cacces(nf),FILE=FILNAM(NF)(1:LENGTH), - 2 ERR=95,IOSTAT=IOS) - else - OPEN(UNIT=IUNIT(NF),FORM=cform(nf),STATUS=cstat(nf), - 1 ACCESS=cacces(nf), - 2 ERR=95,IOSTAT=IOS) - endif - else - if (cstat(nf).eq.'OLD') then - open(unit=iunit(nf),form=cform(nf),status='OLD', - 1 access=cacces(nf),position=cpos(nf), - 2 file=filnam(nf)(1:length),err=95,iostat=ios) - elseif (cstat(nf).eq.'NEW') then - open(unit=iunit(nf),form=cform(nf),status='NEW', - 1 access=cacces(nf),position=cpos(nf), - 2 file=filnam(nf)(1:length),err=95,iostat=ios) - elseif (cstat(nf).eq.'UNKNOWN') then - open(unit=iunit(nf),form=cform(nf),status='UNKNOWN', - 1 access=cacces(nf),position=cpos(nf), - 2 file=filnam(nf)(1:length),err=95,iostat=ios) - else - open(unit=iunit(nf),form=cform(nf),status=cstat(nf), - 1 access=cacces(nf),position=cpos(nf), - 2 err=95,iostat=ios) - endif - endif - ELSE - read(filnam(nf)(length+2:length+2+idgmax-1),37) lrec - 37 format(i7) - write(6,39) lrec - 39 format('...Direct access record length:',i7,'...') - if(cpos(nf) .eq. 
' ') then - if (cstat(nf).eq.'OLD') then - OPEN(UNIT=IUNIT(NF),FORM=CFORM(NF),STATUS='OLD', - 1 ACCESS=CACCES(NF),FILE=FILNAM(NF)(1:LENGTH), - 2 ERR=95,IOSTAT=IOS,RECL=lrec) - elseif (cstat(nf).eq.'NEW') then - OPEN(UNIT=IUNIT(NF),FORM=CFORM(NF),STATUS='NEW', - 1 ACCESS=CACCES(NF),FILE=FILNAM(NF)(1:LENGTH), - 2 ERR=95,IOSTAT=IOS,RECL=lrec) - elseif (cstat(nf).eq.'UNKNOWN') then - OPEN(UNIT=IUNIT(NF),FORM=CFORM(NF),STATUS='UNKNOWN', - 1 ACCESS=CACCES(NF),FILE=FILNAM(NF)(1:LENGTH), - 2 ERR=95,IOSTAT=IOS,RECL=lrec) - else - OPEN(UNIT=IUNIT(NF),FORM=CFORM(NF),STATUS=CSTAT(NF), - 1 ACCESS=CACCES(NF), - 2 ERR=95,IOSTAT=IOS,RECL=lrec) - endif - else - if (cstat(nf).eq.'OLD') then - open(unit=iunit(nf),form=cform(nf),status='OLD', - 1 access=cacces(nf),file=filnam(nf)(1:length), - 2 position=cpos(nf),err=95,iostat=ios,recl=lrec) - elseif (cstat(nf).eq.'NEW') then - open(unit=iunit(nf),form=cform(nf),status='NEW', - 1 access=cacces(nf),file=filnam(nf)(1:length), - 2 position=cpos(nf),err=95,iostat=ios,recl=lrec) - elseif (cstat(nf).eq.'UNKNOWN') then - open(unit=iunit(nf),form=cform(nf),status='UNKNOWN', - 1 access=cacces(nf),file=filnam(nf)(1:length), - 2 position=cpos(nf),err=95,iostat=ios,recl=lrec) - else - open(unit=iunit(nf),form=cform(nf),status=cstat(nf), - 1 access=cacces(nf), - 2 position=cpos(nf),err=95,iostat=ios,recl=lrec) - endif - endif - ENDIF - ENDDO - - WRITE(6,391) NFILMX - 391 FORMAT('******NUMBER OF FILES TO BE OPENED MEETS OR EXCEEDS ', - 1 'MAXIMUM SET BY PROGRAM (=',I3) - CALL ABORT1(' OFILE0',50) - - 50 CONTINUE - -C WE HAVE DEFINED AND OPENED ALL FILES - - NFTOT=NF-1 - WRITE(6,51) NFTOT,MACHIN - 51 FORMAT(/'...SUCCESSFULLY OPENED ',I3,' FILES ON ',A) - RETURN - - 90 CONTINUE - WRITE(6,91) FILNAM(0)(1:leng0),ios - 91 FORMAT('******ERROR READING OPEN FILE=',A,' error=',i4) - CALL ABORT1(' OFILE0',91) - - 95 CONTINUE - WRITE(6,96) NF,IOS - 96 FORMAT('******ERROR UPON OPENING FILE, NF,IOS=',2I5) - CALL ABORT1(' OFILE0',96) - - END diff --git 
a/sorc/tave.fd/tave.f b/sorc/tave.fd/tave.f deleted file mode 100644 index bbf52634636..00000000000 --- a/sorc/tave.fd/tave.f +++ /dev/null @@ -1,1083 +0,0 @@ - program tave -c -c ABSTRACT: This program averages the temperatures from an input -c grib file and produces an output grib file containing the mean -c temperature in the 300-500 mb layer. For each model and each -c lead time, there will need to be data from 300 to 500 mb in -c 50 mb increments, such that all 5 of these layers then get -c averaged together. -c -c Written by Tim Marchok - - USE params - USE grib_mod - - implicit none - - type(gribfield) :: holdgfld - integer, parameter :: lugb=11,lulv=16,lugi=31,lout=51 - integer, parameter :: nlevsout=1,nlevsin=5 - integer kpds(200),kgds(200) - integer iriret,iogret,kf,iggret,igdret,iidret,gribver,g2_jpdtn - integer iha,iho,iva,irfa,iodret,ifcsthour,iia,iparm - integer ilevs(nlevsin) - real, allocatable :: xinptmp(:,:),xouttmp(:) - logical(1), allocatable :: valid_pt(:),readflag(:) - real xoutlev - - namelist/timein/ifcsthour,iparm,gribver,g2_jpdtn -c - data ilevs /300, 350, 400, 450, 500/ - xoutlev = 401. -c - read (5,NML=timein,END=201) - 201 continue - print *,' ' - print *,'*---------------------------------------------*' - print *,' ' - print *,' +++ Top of tave +++ ' - print *,' ' - print *,'After tave namelist read, input forecast hour= ' - & ,ifcsthour - print *,' input GRIB parm= ',iparm - print *,' GRIB version= ',gribver - print *,' GRIB2 JPDTN= g2_jpdtn= ' - & ,g2_jpdtn - -c ilevs = -999 -c call read_input_levels (lulv,nlevsin,ilevs,iriret) -c -c if (iriret /= 0) then -c print *,' ' -c print *,'!!! RETURN CODE FROM read_input_levels /= 0' -c print *,'!!! RETURN CODE = iriret = ',iriret -c print *,'!!! EXITING....' -c print *,' ' -c goto 899 -c endif - - call open_grib_files (lugb,lugi,lout,gribver,iogret) - if (iogret /= 0) then - print '(/,a35,a5,i4,/)','!!! 
ERROR: in tave open_grib_files,' - & ,' rc= ',iogret - goto 899 - endif - call getgridinfo (lugb,lugi,kf,kpds,kgds,holdgfld,ifcsthour,iparm - & ,gribver,g2_jpdtn,iggret) - - allocate (xinptmp(kf,nlevsin),stat=iha) - allocate (xouttmp(kf),stat=iho) - allocate (valid_pt(kf),stat=iva) - allocate (readflag(nlevsin),stat=irfa) - if (iha /= 0 .or. iho /= 0 .or. iva /= 0 .or. irfa /= 0) then - print *,' ' - print *,'!!! ERROR in tave allocating arrays.' - print *,'!!! ERROR allocating the xinptmp, readflag, or the' - print *,'!!! valid_pt array, iha= ',iha,' iva= ',iva - print *,'!!! irfa= ',irfa,' iho= ',iho - print *,' ' - goto 899 - endif - - call getdata (lugb,lugi,kf,valid_pt,nlevsin,ilevs - & ,readflag,xinptmp,ifcsthour,iparm,gribver - & ,g2_jpdtn,igdret) - - call average_data (kf,valid_pt,nlevsin,ilevs,readflag - & ,xinptmp,xouttmp,iidret) - - call output_data (lout,kf,kpds,kgds,holdgfld,xouttmp,valid_pt - & ,xoutlev,nlevsout,gribver,ifcsthour,iodret) - - deallocate (xinptmp) - deallocate (xouttmp) - deallocate (valid_pt) - deallocate (readflag) - - 899 continue -c - stop - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine read_input_levels (lulv,nlevsin,ilevs,iriret) -c -c ABSTRACT: This subroutine reads in a text file that contains -c the number of input pressure levels for a given model. The -c format of the file goes like this, from upper levels to -c lower, for example: -c -c 1 200 -c 2 400 -c 3 500 -c 4 700 -c 5 850 -c 6 925 -c 7 1000 -c -c - implicit none - - integer lulv,nlevsin,iriret,inplev,ict,lvix - integer ilevs(nlevsin) -c - iriret=0 - ict = 0 - do while (.true.) - - print *,'Top of while loop in tave read_input_levels' - - read (lulv,85,end=130) lvix,inplev - - if (inplev > 0 .and. inplev <= 1000) then - ict = ict + 1 - ilevs(ict) = inplev - else - print *,' ' - print *,'!!! ERROR: Input level not between 0 and 1000' - print *,'!!! 
in tave. inplev= ',inplev - print *,'!!! STOPPING EXECUTION' - STOP 91 - endif - - print *,'tave readloop, ict= ',ict,' inplev= ',inplev - - enddo - - 85 format (i4,1x,i4) - 130 continue - - nlevsin = ict - - print *,' ' - print *,'Total number of tave levels read in = ',nlevsin -c - return - end - -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine getgridinfo (lugb,lugi,kf,kpds,kgds,holdgfld,ifcsthour - & ,iparm,gribver,g2_jpdtn,iggret) -c -c ABSTRACT: The purpose of this subroutine is just to get the max -c values of i and j and the dx and dy grid spacing intervals for the -c grid to be used in the rest of the program. So just read the -c grib file to get the lon and lat data. Also, get the info for -c the data grids boundaries. This boundary information will be -c used later in the tracking algorithm, and is accessed via Module -c grid_bounds. -c -C INPUT: -C lugb The Fortran unit number for the GRIB data file -C lugi The Fortran unit number for the GRIB index file -c ifcsthour input forecast hour to search for -c iparm input grib parm to search for -c gribver integer (1 or 2) to indicate if using GRIB1 / GRIB2 -c g2_jpdtn If GRIB2 data being read, this is the value for JPDTN -c that is input to getgb2. 
-C -C OUTPUT: -c kf Number of gridpoints on the grid -c kpds pds array for a GRIB1 record -c kgds gds array for a GRIB1 record -c holdgfld info for a GRIB2 record -c -C iggret The return code from this subroutine -c - USE params - USE grib_mod - - implicit none -c - CHARACTER(len=8) :: ctemp - CHARACTER(len=80) :: ftemplate - type(gribfield) :: gfld,prevfld,holdgfld - integer,dimension(200) :: jids,jpdt,jgdt - logical(1), allocatable :: lb(:) - integer, parameter :: jf=4000000 - integer jpds(200),jgds(200) - integer kpds(200),kgds(200) - integer :: listsec1(13) - integer ila,ifa,iret,ifcsthour,imax,jmax,jskp,jdisc - integer lugb,lugi,kf,j,k,iggret,iparm,gribver,g2_jpdtn - integer jpdtn,jgdtn,npoints,icount,ipack,krec - integer :: listsec0(2)=(/0,2/) - integer :: igds(5)=(/0,0,0,0,0/),previgds(5) - integer :: idrstmpl(200) - integer :: currlen=1000000 - logical :: unpack=.true. - logical :: open_grb=.false. - real, allocatable :: f(:) - real dx,dy -c - iggret = 0 - - allocate (lb(jf),stat=ila) - allocate (f(jf),stat=ifa) - if (ila /= 0 .or. ifa /= 0) then - print *,' ' - print *,'!!! ERROR in tave.' - print *,'!!! ERROR in getgridinfo allocating either lb or f' - print *,'!!! ila = ',ila,' ifa= ',ifa - iggret = 97 - return - endif - - if (gribver == 2) then - - ! Search for a record from a GRIB2 file - - ! - ! --- Initialize Variables --- - ! - - gfld%idsect => NULL() - gfld%local => NULL() - gfld%list_opt => NULL() - gfld%igdtmpl => NULL() - gfld%ipdtmpl => NULL() - gfld%coord_list => NULL() - gfld%idrtmpl => NULL() - gfld%bmap => NULL() - gfld%fld => NULL() - - jdisc=0 ! Meteorological products - jids=-9999 - jpdtn=g2_jpdtn ! 0 = analysis or forecast; 1 = ens fcst - jgdtn=0 ! 
lat/lon grid - jgdt=-9999 - jpdt=-9999 - - npoints=0 - icount=0 - jskp=0 - -c Search for Temperature by production template 4.0 - - JPDT(1:15)=(/ -9999,-9999,-9999,-9999,-9999,-9999,-9999,-9999 - & ,-9999,-9999,-9999,-9999,-9999,-9999,-9999/) - - call getgb2(lugb,lugi,jskp,jdisc,jids,jpdtn,jpdt,jgdtn,jgdt - & ,unpack,krec,gfld,iret) - if ( iret.ne.0) then - print *,' ' - print *,' ERROR: getgb2 error in getgridinfo = ',iret - endif - -c Determine packing information from GRIB2 file -c The default packing is 40 JPEG 2000 - - ipack = 40 - - print *,' gfld%idrtnum = ', gfld%idrtnum - - ! Set DRT info ( packing info ) - if ( gfld%idrtnum.eq.0 ) then ! Simple packing - ipack = 0 - elseif ( gfld%idrtnum.eq.2 ) then ! Complex packing - ipack = 2 - elseif ( gfld%idrtnum.eq.3 ) then ! Complex & spatial packing - ipack = 31 - elseif ( gfld%idrtnum.eq.40.or.gfld%idrtnum.eq.15 ) then - ! JPEG 2000 packing - ipack = 40 - elseif ( gfld%idrtnum.eq.41 ) then ! PNG packing - ipack = 41 - endif - - print *,'After check of idrtnum, ipack= ',ipack - - print *,'Number of gridpts= gfld%ngrdpts= ',gfld%ngrdpts - print *,'Number of elements= gfld%igdtlen= ',gfld%igdtlen - print *,'PDT num= gfld%ipdtnum= ',gfld%ipdtnum - print *,'GDT num= gfld%igdtnum= ',gfld%igdtnum - - imax = gfld%igdtmpl(8) - jmax = gfld%igdtmpl(9) - dx = float(gfld%igdtmpl(17))/1.e6 - dy = float(gfld%igdtmpl(17))/1.e6 - kf = gfld%ngrdpts - - holdgfld = gfld - - else - - ! Search for a record from a GRIB1 file - - jpds = -1 - jgds = -1 - - j=0 - - jpds(5) = iparm ! Get a temperature record - jpds(6) = 100 ! Get a record on a standard pressure level - jpds(14) = ifcsthour - - call getgb(lugb,lugi,jf,j,jpds,jgds, - & kf,k,kpds,kgds,lb,f,iret) - - if (iret.ne.0) then - print *,' ' - print *,'!!! ERROR in tave getgridinfo calling getgb' - print *,'!!! Return code from getgb = iret = ',iret - iggret = iret - return - else - iggret=0 - imax = kgds(2) - jmax = kgds(3) - dx = float(kgds(9))/1000. - dy = float(kgds(10))/1000. 
- endif - - endif - - print *,' ' - print *,'In getgridinfo, grid dimensions follow:' - print *,'imax= ',imax,' jmax= ',jmax - print *,' dx= ',dx,' dy= ',dy - print *,'number of gridpoints = ',kf - - deallocate (lb); deallocate(f) - - return - end - -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine getdata (lugb,lugi,kf,valid_pt,nlevsin,ilevs - & ,readflag,xinptmp,ifcsthour,iparm,gribver - & ,g2_jpdtn,igdret) -c -c ABSTRACT: This subroutine reads the input GRIB file for the -c tracked parameters. - - USE params - USE grib_mod - - implicit none -c - type(gribfield) :: gfld,prevfld - CHARACTER(len=8) :: ctemp,pabbrev - CHARACTER(len=80) :: ftemplate - integer,dimension(200) :: jids,jpdt,jgdt - integer, parameter :: jf=4000000 - integer ilevs(nlevsin) - integer jpds(200),jgds(200),kpds(200),kgds(200) - integer lugb,lugi,kf,nlevsin,igdret,iparm,jskp,jdisc - integer jpdtn,jgdtn,npoints,icount,ipack,krec - integer i,j,k,ict,np,lev,ifcsthour,iret,gribver,g2_jpdtn - integer pdt_4p0_vert_level,pdt_4p0_vtime,mm - integer :: listsec0(2)=(/0,2/) - integer :: listsec1(13) - integer :: igds(5)=(/0,0,0,0,0/),previgds(5) - integer :: idrstmpl(200) - integer :: currlen=1000000 - logical :: unpack=.true. - logical :: open_grb=.false. - logical(1) valid_pt(kf),lb(kf),readflag(nlevsin) - real f(kf),xinptmp(kf,nlevsin),xtemp(kf) - real dmin,dmax,firstval,lastval -c - igdret=0 - ict = 0 - - print *,'At top of getdata, ifcsthour= ',ifcsthour - - level_loop: do lev = 1,nlevsin - - print *,' ' - print *,'------------------------------------------------' - print *,'In tave getdata read loop, lev= ',lev,' level= ' - & ,ilevs(lev) - - if (gribver == 2) then - - ! - ! --- Initialize Variables --- - ! 
- - gfld%idsect => NULL() - gfld%local => NULL() - gfld%list_opt => NULL() - gfld%igdtmpl => NULL() - gfld%ipdtmpl => NULL() - gfld%coord_list => NULL() - gfld%idrtmpl => NULL() - gfld%bmap => NULL() - gfld%fld => NULL() - - jdisc=0 ! Meteorological products - jids=-9999 - jpdtn=g2_jpdtn ! 0 = analysis or forecast; 1 = ens fcst - jgdtn=0 ! lat/lon grid - jgdt=-9999 - jpdt=-9999 - - npoints=0 - icount=0 - jskp=0 - -c Search for input parameter by production template 4.0. This -c tave program is used primarily for temperature, but still we -c will leave that as a variable and not-hard wire it in case we -c choose to average something else in the future. - - if (iparm == 11) then - - ! Set defaults for JPDT, then override in array - ! assignments below... - - JPDT(1:15)=(/ -9999,-9999,-9999,-9999,-9999,-9999,-9999 - & ,-9999,-9999,-9999,-9999,-9999,-9999,-9999,-9999/) - JPDT(1) = 0 ! Param category from Table 4.1 - JPDT(2) = 0 ! Param number from Table 4.2 - JPDT(9) = ifcsthour - JPDT(10) = 100 ! Isobaric surface requested (Table 4.5) - JPDT(12) = ilevs(lev) * 100 ! value of specific level - - print *,'In getdata, just set JPDT inputs....' - - endif - - print *,'before getgb2 call, value of unpack = ',unpack - - do mm = 1,15 - print *,'tave getdata mm= ',mm,' JPDT(mm)= ',JPDT(mm) - enddo - - call getgb2(lugb,lugi,jskp,jdisc,jids,jpdtn,jpdt,jgdtn,jgdt - & ,unpack,krec,gfld,iret) - - print *,'iret from getgb2 in getdata = ',iret - - print *,'after getgb2 call, value of unpacked = ' - & ,gfld%unpacked - - print *,'after getgb2 call, gfld%ndpts = ',gfld%ndpts - print *,'after getgb2 call, gfld%ibmap = ',gfld%ibmap - - if ( iret == 0) then - -c Determine packing information from GRIB2 file -c The default packing is 40 JPEG 2000 - - ipack = 40 - - print *,' gfld%idrtnum = ', gfld%idrtnum - - ! Set DRT info ( packing info ) - if ( gfld%idrtnum.eq.0 ) then ! Simple packing - ipack = 0 - elseif ( gfld%idrtnum.eq.2 ) then ! 
Complex packing - ipack = 2 - elseif ( gfld%idrtnum.eq.3 ) then ! Complex & spatial - & ! packing - ipack = 31 - elseif ( gfld%idrtnum.eq.40.or.gfld%idrtnum.eq.15 ) then - ! JPEG 2000 packing - ipack = 40 - elseif ( gfld%idrtnum.eq.41 ) then ! PNG packing - ipack = 41 - endif - - print *,'After check of idrtnum, ipack= ',ipack - - print *,'Number of gridpts= gfld%ngrdpts= ',gfld%ngrdpts - print *,'Number of elements= gfld%igdtlen= ',gfld%igdtlen - print *,'GDT num= gfld%igdtnum= ',gfld%igdtnum - - kf = gfld%ndpts ! Number of gridpoints returned from read - - do np = 1,kf - xinptmp(np,lev) = gfld%fld(np) - xtemp(np) = gfld%fld(np) - if (gfld%ibmap == 0) then - valid_pt(np) = gfld%bmap(np) - else - valid_pt(np) = .true. - endif - enddo - - readflag(lev) = .TRUE. -c call bitmapchk(kf,gfld%bmap,gfld%fld,dmin,dmax) - call bitmapchk(kf,valid_pt,xtemp,dmin,dmax) - - if (ict == 0) then -c do np = 1,kf -c valid_pt(np) = gfld%bmap(np) -c enddo - ict = ict + 1 - endif - - firstval=gfld%fld(1) - lastval=gfld%fld(kf) - - print *,' ' - print *,' SECTION 0: discipl= ',gfld%discipline - & ,' gribver= ',gfld%version - - print *,' ' - print *,' SECTION 1: ' - - do j = 1,gfld%idsectlen - print *,' sect1, j= ',j,' gfld%idsect(j)= ' - & ,gfld%idsect(j) - enddo - - if ( associated(gfld%local).AND.gfld%locallen.gt.0) then - print *,' ' - print *,' SECTION 2: ',gfld%locallen,' bytes' - else - print *,' ' - print *,' SECTION 2 DOES NOT EXIST IN THIS RECORD' - endif - - print *,' ' - print *,' SECTION 3: griddef= ',gfld%griddef - print *,' ngrdpts= ',gfld%ngrdpts - print *,' numoct_opt= ',gfld%numoct_opt - print *,' interp_opt= ',gfld%interp_opt - print *,' igdtnum= ',gfld%igdtnum - print *,' igdtlen= ',gfld%igdtlen - - print *,' ' - print '(a17,i3,a2)',' GRID TEMPLATE 3.',gfld%igdtnum,': ' - do j=1,gfld%igdtlen - print *,' j= ',j,' gfld%igdtmpl(j)= ',gfld%igdtmpl(j) - enddo - - print *,' ' - print *,' PDT num (gfld%ipdtnum) = ',gfld%ipdtnum - print *,' ' - print '(a20,i3,a2)',' PRODUCT 
TEMPLATE 4.',gfld%ipdtnum,': ' - do j=1,gfld%ipdtlen - print *,' sect 4 j= ',j,' gfld%ipdtmpl(j)= ' - & ,gfld%ipdtmpl(j) - enddo - -c Print out values for data representation type - - print *,' ' - print '(a21,i3,a2)',' DATA REP TEMPLATE 5.',gfld%idrtnum - & ,': ' - do j=1,gfld%idrtlen - print *,' sect 5 j= ',j,' gfld%idrtmpl(j)= ' - & ,gfld%idrtmpl(j) - enddo - - pdt_4p0_vtime = gfld%ipdtmpl(9) - pdt_4p0_vert_level = gfld%ipdtmpl(12) - -c Get parameter abbrev for record that was retrieved - - pabbrev=param_get_abbrev(gfld%discipline,gfld%ipdtmpl(1) - & ,gfld%ipdtmpl(2)) - - print *,' ' - write (6,131) - 131 format (' rec# param level byy bmm bdd bhh ' - & ,'fhr npts firstval lastval minval ' - & ,' maxval') - print '(i5,3x,a8,2x,6i5,2x,i8,4g12.4)' - & ,krec,pabbrev,pdt_4p0_vert_level/100,gfld%idsect(6) - & ,gfld%idsect(7),gfld%idsect(8),gfld%idsect(9) - & ,pdt_4p0_vtime,gfld%ndpts,firstval,lastval,dmin,dmax - -c do np = 1,kf -c xinptmp(np,lev) = gfld%fld(np) -c enddo - - else - - print *,' ' - print *,'!!! ERROR: GRIB2 TAVE READ IN GETDATA FAILED FOR ' - & ,'LEVEL LEV= ',LEV - print *,' ' - - readflag(lev) = .FALSE. - - do np = 1,kf - xinptmp(np,lev) = -99999.0 - enddo - - endif - - else - - ! Reading a GRIB1 file.... - - jpds = -1 - jgds = -1 - j=0 - - jpds(5) = iparm ! parameter id for temperature - jpds(6) = 100 ! level id to indicate a pressure level - jpds(7) = ilevs(lev) ! actual level of the layer - jpds(14) = ifcsthour ! lead time to search for - - call getgb (lugb,lugi,jf,j,jpds,jgds, - & kf,k,kpds,kgds,lb,f,iret) - - print *,' ' - print *,'After tave getgb call, j= ',j,' k= ',k,' level= ' - & ,ilevs(lev),' iret= ',iret - - if (iret == 0) then - - readflag(lev) = .TRUE. 
- call bitmapchk(kf,lb,f,dmin,dmax) - - if (ict == 0) then - do np = 1,kf - valid_pt(np) = lb(np) - enddo - ict = ict + 1 - endif - - write (6,31) - 31 format (' rec# parm# levt lev byy bmm bdd bhh fhr ' - & ,'npts minval maxval') - print '(i4,2x,8i5,i8,2g12.4)', - & k,(kpds(i),i=5,11),kpds(14),kf,dmin,dmax - - do np = 1,kf - xinptmp(np,lev) = f(np) - enddo - - else - - print *,' ' - print *,'!!! ERROR: TAVE READ FAILED FOR LEVEL LEV= ',LEV - print *,' ' - - readflag(lev) = .FALSE. - - do np = 1,kf - xinptmp(np,lev) = -99999.0 - enddo - - endif - - endif - - enddo level_loop -c - return - end -c -c----------------------------------------------------------------------- -c -c----------------------------------------------------------------------- - subroutine average_data (kf,valid_pt,nlevsin,ilevs,readflag - & ,xinptmp,xouttmp,iidret) -c -c ABSTRACT: This routine averages data between 300 and 500 mb to get -c a mean temperature at 400 mb. The input data should be at 50 mb -c resolution, giving 5 input levels in total. 
- - implicit none - - logical(1) valid_pt(kf),readflag(nlevsin) - integer ilevs(nlevsin) - integer nlevsin,kf,k,n,iidret - real xinptmp(kf,nlevsin),xouttmp(kf) - real xinlevs_p(nlevsin),xinlevs_lnp(nlevsin) - real xsum -c - iidret=0 - print *,'*----------------------------------------------*' - print *,' Top of average data routine' - print *,'*----------------------------------------------*' - print *,' ' - - do n = 1,kf - xsum = 0.0 -c print *,' ' - do k = 1,nlevsin - xsum = xsum + xinptmp(n,k) -c print *,'n= ',n,' k= ',k,' xsum= ',xsum - enddo - xouttmp(n) = xsum / float(nlevsin) -c print *,'n= ',n,' mean= ',xouttmp(n) - enddo -c - return - end -c -c---------------------------------------------------------------------- -c -c---------------------------------------------------------------------- - subroutine output_data (lout,kf,kpds,kgds,holdgfld,xouttmp - & ,valid_pt,xoutlev,nlevsout,gribver,ifcsthour,iodret) -c -c ABSTRACT: This routine writes out the output data on the -c specified output pressure levels. - - USE params - USE grib_mod - - implicit none - - CHARACTER(len=1),pointer,dimension(:) :: cgrib -c CHARACTER(len=1),pointer,allocatable :: cgrib(:) - type(gribfield) :: holdgfld - logical(1) valid_pt(kf),bmap(kf) - integer lout,kf,lugb,lugi,iodret,nlevsout,igoret,ipret,lev - integer gribver,ierr,ipack,lengrib,npoints,newlen,idrsnum - integer numcoord,ica,n,j,ifcsthour - integer :: idrstmpl(200) - integer :: currlen=1000000 - integer :: listsec0(2)=(/0,2/) - integer :: igds(5)=(/0,0,0,0,0/),previgds(5) - integer kpds(200),kgds(200) - integer(4), parameter::idefnum=1 - integer(4) ideflist(idefnum),ibmap - real xouttmp(kf),xoutlev,coordlist -c - iodret=0 - call baopenw (lout,"fort.51",igoret) - print *,'baopenw: igoret= ',igoret - - if (igoret /= 0) then - print *,' ' - print *,'!!! ERROR in sub output_data opening' - print *,'!!! **OUTPUT** grib file. baopenw return codes:' - print *,'!!! 
grib file 1 return code = igoret = ',igoret - STOP 95 - return - endif - - if (gribver == 2) then - - ! Write data out as a GRIB2 message.... - - allocate(cgrib(currlen),stat=ica) - if (ica /= 0) then - print *,' ' - print *,'ERROR in output_data allocating cgrib' - print *,'ica= ',ica - iodret=95 - return - endif - - - ! Ensure that cgrib array is large enough - - if (holdgfld%ifldnum == 1 ) then ! start new GRIB2 message - npoints=holdgfld%ngrdpts - else - npoints=npoints+holdgfld%ngrdpts - endif - newlen=npoints*4 - if ( newlen.gt.currlen ) then -ccc if (allocated(cgrib)) deallocate(cgrib) - if (associated(cgrib)) deallocate(cgrib) - allocate(cgrib(newlen),stat=ierr) -c call realloc (cgrib,currlen,newlen,ierr) - if (ierr == 0) then - print *,' ' - print *,'re-allocate for large grib msg: ' - print *,' currlen= ',currlen - print *,' newlen= ',newlen - currlen=newlen - else - print *,'ERROR returned from 2nd allocate cgrib = ',ierr - stop 95 - endif - endif - - ! Create new GRIB Message - listsec0(1)=holdgfld%discipline - listsec0(2)=holdgfld%version - - print *,'output, holdgfld%idsectlen= ',holdgfld%idsectlen - do j = 1,holdgfld%idsectlen - print *,' sect1, j= ',j,' holdgfld%idsect(j)= ' - & ,holdgfld%idsect(j) - enddo - - call gribcreate(cgrib,currlen,listsec0,holdgfld%idsect,ierr) - if (ierr.ne.0) then - write(6,*) ' ERROR creating new GRIB2 field (gribcreate)= ' - & ,ierr - stop 95 - endif - - previgds=igds - igds(1)=holdgfld%griddef - igds(2)=holdgfld%ngrdpts - igds(3)=holdgfld%numoct_opt - igds(4)=holdgfld%interp_opt - igds(5)=holdgfld%igdtnum - - if (igds(3) == 0) then - ideflist = 0 - endif - - call addgrid (cgrib,currlen,igds,holdgfld%igdtmpl - & ,holdgfld%igdtlen,ideflist,idefnum,ierr) - - if (ierr.ne.0) then - write(6,*) ' ERROR from addgrid adding GRIB2 grid = ',ierr - stop 95 - endif - - - holdgfld%ipdtmpl(12) = int(xoutlev) * 100 - - ipack = 40 - idrsnum = ipack - idrstmpl = 0 - - idrstmpl(2)= holdgfld%idrtmpl(2) - idrstmpl(3)= holdgfld%idrtmpl(3) - 
idrstmpl(6)= 0 - idrstmpl(7)= 255 - - numcoord=0 - coordlist=0.0 ! Only needed for hybrid vertical coordinate, - ! not here, so set it to 0.0 - - ! 0 - A bit map applies to this product and is specified in - ! this section - ! 255 - A bit map does not apply to this product - ibmap=255 ! Bitmap indicator (see Code Table 6.0) - - print *,' ' - print *,'output, holdgfld%ipdtlen= ',holdgfld%ipdtlen - do n = 1,holdgfld%ipdtlen - print *,'output, n= ',n,' holdgfld%ipdtmpl= ' - & ,holdgfld%ipdtmpl(n) - enddo - - print *,'output, kf= ',kf - -c if (ifcsthour < 6) then -c do n = 1,kf -cc print *,'output, n= ',n,' xouttmp(n)= ',xouttmp(n) -c write (92,151) n,xouttmp(n) -c 151 format (1x,'n= ',i6,' xouttmp(n)= ',f10.4) -c enddo -c endif - - call addfield (cgrib,currlen,holdgfld%ipdtnum,holdgfld%ipdtmpl - & ,holdgfld%ipdtlen,coordlist - & ,numcoord - & ,idrsnum,idrstmpl,200 - & ,xouttmp,kf,ibmap,bmap,ierr) - - if (ierr /= 0) then - write(6,*) ' ERROR from addfield adding GRIB2 data = ',ierr - stop 95 - endif - -! Finalize GRIB message after all grids -! and fields have been added. It adds the End Section ( "7777" ) - - call gribend(cgrib,currlen,lengrib,ierr) - call wryte(lout,lengrib,cgrib) - - if (ierr == 0) then - print *,' ' - print *,'+++ GRIB2 write successful. ' - print *,' Len of message = currlen= ',currlen - print *,' Len of entire GRIB2 message = lengrib= ',lengrib - else - print *,' ERROR from gribend writing GRIB2 msg = ',ierr - stop 95 - endif - - else - - ! Write data out as a GRIB1 message.... 
- - kpds(6) = 100 - - do lev = 1,nlevsout - - kpds(7) = int(xoutlev) - - print *,'tave: just before call to putgb, kf= ',kf - - print *,'output, kf= ',kf -c do n = 1,kf -c print *,'output, n= ',n,' xouttmp(n)= ',xouttmp(n) -c enddo - - if (ifcsthour < 6) then - do n = 1,kf -c print *,'output, n= ',n,' xouttmp(n)= ',xouttmp(n) - write (91,161) n,xouttmp(n) - 161 format (1x,'n= ',i6,' xouttmp(n)= ',f10.4) - enddo - endif - - call putgb (lout,kf,kpds,kgds,valid_pt,xouttmp,ipret) - print *,'tave: just after call to putgb, kf= ',kf - if (ipret == 0) then - print *,' ' - print *,'+++ IPRET = 0 after call to putgb' - print *,' ' - else - print *,' ' - print *,'!!!!!! ERROR in tave' - print *,'!!!!!! ERROR: IPRET NE 0 AFTER CALL TO PUTGB !!!' - print *,'!!!!!! Level index= ',lev - print *,'!!!!!! pressure= ',xoutlev - print *,' ' - endif - - write(*,980) kpds(1),kpds(2) - write(*,981) kpds(3),kpds(4) - write(*,982) kpds(5),kpds(6) - write(*,983) kpds(7),kpds(8) - write(*,984) kpds(9),kpds(10) - write(*,985) kpds(11),kpds(12) - write(*,986) kpds(13),kpds(14) - write(*,987) kpds(15),kpds(16) - write(*,988) kpds(17),kpds(18) - write(*,989) kpds(19),kpds(20) - write(*,990) kpds(21),kpds(22) - write(*,991) kpds(23),kpds(24) - write(*,992) kpds(25) - write(*,880) kgds(1),kgds(2) - write(*,881) kgds(3),kgds(4) - write(*,882) kgds(5),kgds(6) - write(*,883) kgds(7),kgds(8) - write(*,884) kgds(9),kgds(10) - write(*,885) kgds(11),kgds(12) - write(*,886) kgds(13),kgds(14) - write(*,887) kgds(15),kgds(16) - write(*,888) kgds(17),kgds(18) - write(*,889) kgds(19),kgds(20) - write(*,890) kgds(21),kgds(22) - - enddo - - 980 format(' kpds(1) = ',i7,' kpds(2) = ',i7) - 981 format(' kpds(3) = ',i7,' kpds(4) = ',i7) - 982 format(' kpds(5) = ',i7,' kpds(6) = ',i7) - 983 format(' kpds(7) = ',i7,' kpds(8) = ',i7) - 984 format(' kpds(9) = ',i7,' kpds(10) = ',i7) - 985 format(' kpds(11) = ',i7,' kpds(12) = ',i7) - 986 format(' kpds(13) = ',i7,' kpds(14) = ',i7) - 987 format(' kpds(15) = ',i7,' 
kpds(16) = ',i7)
-  988 format(' kpds(17) = ',i7,'  kpds(18) = ',i7)
-  989 format(' kpds(19) = ',i7,'  kpds(20) = ',i7)
-  990 format(' kpds(21) = ',i7,'  kpds(22) = ',i7)
-  991 format(' kpds(23) = ',i7,'  kpds(24) = ',i7)
-  992 format(' kpds(25) = ',i7)
-  880 format(' kgds(1)  = ',i7,'  kgds(2)  = ',i7)
-  881 format(' kgds(3)  = ',i7,'  kgds(4)  = ',i7)
-  882 format(' kgds(5)  = ',i7,'  kgds(6)  = ',i7)
-  883 format(' kgds(7)  = ',i7,'  kgds(8)  = ',i7)
-  884 format(' kgds(9)  = ',i7,'  kgds(10) = ',i7)
-  885 format(' kgds(11) = ',i7,'  kgds(12) = ',i7)
-  886 format(' kgds(13) = ',i7,'  kgds(14) = ',i7)
-  887 format(' kgds(15) = ',i7,'  kgds(16) = ',i7)
-  888 format(' kgds(17) = ',i7,'  kgds(18) = ',i7)
-  889 format(' kgds(19) = ',i7,'  kgds(20) = ',i7)
-  890 format(' kgds(21) = ',i7,'  kgds(22) = ',i7)
-
-      endif
-c
-      return
-      end
-c
-c-----------------------------------------------------------------------
-c
-c-----------------------------------------------------------------------
-      subroutine open_grib_files (lugb,lugi,lout,gribver,iret)
-
-C     ABSTRACT: This subroutine must be called before any attempt is
-C     made to read from the input GRIB files.  The GRIB and index files
-C     are opened with a call to baopenr.  This call to baopenr was not
-C     needed in the cray version of this program (the files could be
-C     opened with a simple Cray assign statement), but the GRIB-reading
-C     utilities on the SP do require calls to this subroutine (it has
-C     something to do with the GRIB I/O being done in C on the SP, and
-C     the C I/O package needs an explicit open statement).
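The open_grib_files routine derives each file name from its Fortran unit number ("fort.NN", written into columns 6:7 with an I2 edit descriptor) before handing it to baopenr/baopenw. A minimal Python sketch of that naming scheme:

```python
def fort_filename(unit: int) -> str:
    """Build the "fort.NN" name that open_grib_files writes into
    columns 6:7 of its character*7 variables before calling
    baopenr/baopenw."""
    if not 10 <= unit <= 99:
        # the I2 field only holds two digits cleanly
        raise ValueError("unit number must be two digits")
    return f"fort.{unit}"
```

With the unit numbers vint uses (lugb=11, lugi=31, lout=51), this yields fort.11, fort.31, and fort.51.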
-C
-C INPUT:
-C   lugb     The Fortran unit number for the GRIB data file
-C   lugi     The Fortran unit number for the GRIB index file
-C   lout     The Fortran unit number for the output grib file
-c   gribver  integer (1 or 2) to indicate if using GRIB1 / GRIB2
-C
-C OUTPUT:
-C   iret     The return code from this subroutine
-
-      implicit none
-
-      character fnameg*7,fnamei*7,fnameo*7
-      integer   iret,gribver,lugb,lugi,lout,igoret,iioret,iooret
-
-      iret=0
-      fnameg(1:5) = "fort."
-      fnamei(1:5) = "fort."
-      fnameo(1:5) = "fort."
-      write(fnameg(6:7),'(I2)') lugb
-      write(fnamei(6:7),'(I2)') lugi
-      write(fnameo(6:7),'(I2)') lout
-      call baopenr (lugb,fnameg,igoret)
-      call baopenr (lugi,fnamei,iioret)
-      call baopenw (lout,fnameo,iooret)
-
-      print *,' '
-      print *,'tave baopen: igoret= ',igoret,' iioret= ',iioret
-     &       ,' iooret= ',iooret
-
-      if (igoret /= 0 .or. iioret /= 0 .or. iooret /= 0) then
-        print *,' '
-        print *,'!!! ERROR in tave'
-        print *,'!!! ERROR in sub open_grib_files opening grib file'
-        print *,'!!! or grib index file.  baopen return codes:'
-        print *,'!!! grib  file return code = igoret = ',igoret
-        print *,'!!! index file return code = iioret = ',iioret
-        print *,'!!! output file return code = iooret = ',iooret
-        iret = 93
-        return
-      endif
-
-      return
-      end
-c
-c-------------------------------------------------------------------
-c
-c-------------------------------------------------------------------
-      subroutine bitmapchk (n,ld,d,dmin,dmax)
-c
-c     This subroutine checks the bitmap for non-existent data values.
-c     Since the data from the regional models have been interpolated
-c     from either a polar stereographic or lambert conformal grid
-c     onto a lat/lon grid, there will be some gridpoints around the
-c     edges of this lat/lon grid that have no data; these grid
-c     points have been bitmapped out by Mark Iredell's interpolater.
-c     To provide another means of checking for invalid data points
-c     later in the program, set these bitmapped data values to a
-c     value of -999.0.
The min and max of this array are also -c returned if a user wants to check for reasonable values. -c - logical(1) ld - dimension ld(n),d(n) -c - dmin=1.E15 - dmax=-1.E15 -c - do i=1,n - if (ld(i)) then - dmin=min(dmin,d(i)) - dmax=max(dmax,d(i)) - else - d(i) = -999.0 - endif - enddo -c - return - end diff --git a/sorc/tocsbufr.fd/tocsbufr.f b/sorc/tocsbufr.fd/tocsbufr.f deleted file mode 100644 index 0f1914cd1a6..00000000000 --- a/sorc/tocsbufr.fd/tocsbufr.f +++ /dev/null @@ -1,272 +0,0 @@ - PROGRAM TOCSBUFR -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C . . . . -C MAIN PROGRAM: TOCSBUFR -C PRGMMR: GILBERT ORG: NP11 DATE: 2004-02-23 -C -C ABSTRACT: Reads each BUFR message from a standard fortran blocked (f77) -C file and adds a TOC -C Flag Field separator block and WMO Header in front of each BUFR -C field, and writes them out to a new file. The output file -C is in the format required for TOC's FTP Input Service, which -C can be used to disseminate the BUFR messages. -C This service is described at http://weather.gov/tg/ftpingest.html. -C -C TOCSBUFR contains two options that are selected using -C a namelist on unit 5 ( see INPUT FILES below ): -C 1) The specified WMO HEADER can be added to each BUFR -C message in the file OR once at the beginning of the -C file. -C 2) The BUFR messages can be used "as is", or if they -C in NCEP format they can be "standardized" for external -C users. -C -C PROGRAM HISTORY LOG: -C 2001-03-01 Gilbert modified from WMOGRIB -C 2004-02-23 Gilbert modified from WMOBUFR to write out BUFR -C messages in the NTC/FTP Input Service format -C instead of the old STATFILE format. -C 2005-04-07 Gilbert This version was created from original program -C TOCBUFR. A new more thorough "standardizing" -C routine is being used to create WMO standard -C BUFR messages for AWIPS. -C 2009-06-16 J. Ator The program was modified in response to BUFRLIB -C changes, including a change to the WRITSA call -C sequence. 
Also added a call to MAXOUT to stop -C BUFR messages larger than 10k bytes from being -C truncated when standardizing. The program can -C now standardize BUFR messages as large as the -C MAXOUT limit without any loss of data. -C 2012-12-06 J. Ator modified for WCOSS -C -C USAGE: -C INPUT FILES: -C 5 - STANDARD INPUT - NAMELIST /INPUT/. -C BULHED = "TTAAII" part of WMO Header (CHAR*6) -C KWBX = "CCCC" orig center part of WMO Header (CHAR*4) -C NCEP2STD = .true. - will convert NCEP format -C BUFR messages to standard WMO -C format. -C = .false. - No conversion done to BUFR -C messages. -C SEPARATE = .true. - Add Flag Field Separator and WMO -C Header to each BUFR message in -C file. -C = .false. - Add Flag Field Separator and WMO -C Header once at beginning of -C output file. -C MAXFILESIZE = Max size of output file in bytes. -C Used only when SEPARATE = .false. -C 11 - INPUT BUFR FILE -C -C OUTPUT FILES: (INCLUDING SCRATCH FILES) -C 6 - STANDARD FORTRAN PRINT FILE -C 51 - AWIPS BUFR FILE WITH WMO HEADERS ADDED -C -C SUBPROGRAMS CALLED: (LIST ALL CALLED FROM ANYWHERE IN CODES) -C UNIQUE: - makwmo mkfldsep -C LIBRARY: -C W3LIB - W3TAGB W3UTCDAT -C W3TAGE -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C 19 - ERROR READING COMMAND LINE ARGS FOR WMOHEADER -C 20 - Error opening output BUFR transmission file -C 30 - NO BUFR MESSSAGES FOUND -C -C REMARKS: This utility was written for the ETA BUFR sounding -C collectives, and assumes all BUFR messages in the input -C file require the same WMO Header. 
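MAKWMO, called later in the program, builds the 21-byte WMO header (lenhead=21) that is prepended to each bulletin. A hedged Python sketch of that stand-in, assuming the conventional abbreviated heading layout "TTAAII CCCC DDHHMM" followed by CR CR LF; the fixed 00 minute field and this exact layout are assumptions, since MAKWMO itself is a w3 library routine not shown here:

```python
def make_wmo_header(bulhed: str, day: int, hour: int, kwbx: str) -> bytes:
    """Hypothetical stand-in for the w3 library routine MAKWMO:
    build the 21-byte WMO header, assumed here to be the abbreviated
    heading 'TTAAII CCCC DDHHMM' plus CR CR LF.  The 00 minute field
    is an assumption."""
    assert len(bulhed) == 6 and len(kwbx) == 4
    heading = f"{bulhed} {kwbx} {day:02d}{hour:02d}00"
    return heading.encode("ascii") + b"\r\r\n"
```

The argument order mirrors the Fortran call `CALL MAKWMO (BULHED,iday,hour,KWBX,WMOHDR)`; "IUSX41"/"KWBC" in any usage would be illustrative values only.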
-C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C MACHINE: WCOSS -C -C$$$ -C - PARAMETER (MXSIZE=500000,MXSIZED4=MXSIZE/4) - INTEGER,PARAMETER :: INBUFR=11,OUTBUFR=51,TMPBUFR=91,iopt=2 -C - INTEGER,dimension(8):: ITIME=(/0,0,0,-500,0,0,0,0/) - INTEGER,dimension(MXSIZED4):: MBAY - INTEGER NBUL - INTEGER iday,hour - INTEGER :: MAXFILESIZE=1000000 -C - CHARACTER * 80 fileo - CHARACTER * 11 envvar - CHARACTER * 8 SUBSET - CHARACTER * 6 :: BULHED="CHEK12" - CHARACTER * 1 BUFR(MXSIZE) - CHARACTER * 4 :: ctemp,KWBX="OUTT" - CHARACTER * 1 CSEP(80) - integer,parameter :: lenhead=21 - CHARACTER * 1 WMOHDR(lenhead) - character*1,allocatable :: filebuf(:) - LOGICAL :: NCEP2STD=.false.,SEPARATE=.true. -C - EQUIVALENCE (BUFR(1), MBAY(1)) -C - NAMELIST /INPUT/ BULHED,KWBX,NCEP2STD,SEPARATE,MAXFILESIZE -C - CALL W3TAGB('TOCSBUFR',2012,0341,0083,'NP12') -C -C Read input values from namelist -C - READ(5,INPUT) - - PRINT * - PRINT *,'- Adding WMO Header: ',BULHED,' ',KWBX - IF (NCEP2STD) then - print *,'- Convert BUFR messages from NCEP format to standard', - & ' BUFR Format.' - else - print *,'- No conversion of BUFR messages will be done.' - endif - IF (SEPARATE) then - print *,'- Add Flag Field Separator and WMO Header to each ', - & 'BUFR message in file.' - else - print *,'- Add Flag Field Separator and WMO Header once at', - & ' beginning of file.' - allocate(filebuf(MAXFILESIZE)) - endif - PRINT * - -C -C Read output BUFR file name from FORT -C environment variable, and open file. -C - envvar='FORT ' - write(envvar(5:6),fmt='(I2)') outbufr - call get_environment_variable(envvar,fileo) - call baopenw(outbufr,fileo,iret1) - if ( iret1 .ne. 0 ) then - write(6,fmt='(" Error opening BUFR file: ",A80)') fileo - write(6,fmt='(" baopenw error = ",I5)') iret1 - stop 20 - endif -C -C Open input NCEP formatted BUFR file, if NCEP2STD = .true. 
-C - if (NCEP2STD) then - call OPENBF(INBUFR,'IN',INBUFR) - CALL MAXOUT(0) - call OPENBF(TMPBUFR,'NUL',INBUFR) - CALL STDMSG('Y') - endif - -C -C Get system date and time -C - call w3utcdat(itime) -C -C loop through input control records. -C - NBUL = 0 - nrec = 0 - itot = 0 - foreachbufrmessage: do - - if (NCEP2STD) then - if ( IREADMG (INBUFR,SUBSET,JDATE) .ne. 0 ) exit - if ( NMSUB(INBUFR) .gt. 0 ) then - nrec = nrec + 1 - CALL OPENMG (TMPBUFR,SUBSET,JDATE) - DO WHILE ( ICOPYSB(INBUFR,TMPBUFR) .eq. 0 ) - CONTINUE - END DO - CALL WRITSA( (-1)*TMPBUFR, MXSIZED4, MBAY, LMBAY) - else - cycle - endif - else - read(INBUFR,iostat=ios) BUFR -C print *,'Error reading message from input BUFR file.', -C & ' iostat = ',ios - if ( ios .le. 0 ) then - exit - endif - nrec = nrec + 1 - endif -C -C Extract BUFR edition number - ied = iupbs01(MBAY,'BEN') -C Calculate length of BUFR message - if (ied.le.1) then - call getlens(MBAY,5,len0,len1,len2,len3,len4,len5) - ILEN = len0+len1+len2+len3+len4+len5 - else - ILEN = iupbs01(MBAY,'LENM') - endif -C Check ending 7777 to see if we have a complete BUFR message - ctemp=BUFR(ILEN-3)//BUFR(ILEN-2)//BUFR(ILEN-1)//BUFR(ILEN) - if ( ctemp.ne.'7777') then - print *,' INVALID BUFR MESSAGE FOUND...SKIPPING ' - exit - endif -C -C MAKE WMO HEADER -C - iday=ITIME(3) - hour=ITIME(5) - CALL MAKWMO (BULHED,iday,hour,KWBX,WMOHDR) -C - NBUL = NBUL + 1 -C - IF (SEPARATE) THEN -C -C ADD Flag Field Separator AND WMO HEADERS -C TO BUFR MESSAGE. WRITE BUFR MESSAGE IN FILE -C - call mkfldsep(csep,iopt,insize,ilen+lenhead,lenout) - call wryte(outbufr,lenout,csep) - call wryte(outbufr,lenhead,WMOHDR) - call wryte(outbufr,ilen,bufr) - ELSE -C -C APPEND NEW BUFR MESSAGE TO filebuf ARRAY -C - if ((itot+ilen).lt.(MAXFILESIZE-101)) then - filebuf(itot+1:itot+ilen)=BUFR(1:ilen) - itot=itot+ilen - else - print *,' Internal Buffer of ',MAXFILESIZE,' bytes is ', - & 'full. Increase MAXFILESIZE in NAMELIST.' 
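TOCSBUFR's main loop either wraps every BUFR message in its own flag-field separator and WMO header (SEPARATE=.true.) or accumulates the messages in filebuf and wraps the whole file once (SEPARATE=.false.). A Python sketch of that control flow; `make_separator` is a hypothetical stand-in for mkfldsep, whose actual flag-field layout is not shown in this listing:

```python
def frame(payload: bytes, header: bytes, make_separator) -> bytes:
    """Prepend the flag-field separator (sized from the framed length,
    as mkfldsep is called with ilen+lenhead) and the WMO header."""
    body = header + payload
    return make_separator(len(body)) + body

def write_bulletins(messages, header, make_separator,
                    separate=True, maxfilesize=1_000_000):
    if separate:
        # SEPARATE=.true.: every BUFR message gets its own separator/header
        return b"".join(frame(m, header, make_separator) for m in messages)
    buf = b""
    for m in messages:
        # mirrors the (itot+ilen).lt.(MAXFILESIZE-101) guard
        if len(buf) + len(m) >= maxfilesize - 101:
            raise OverflowError("increase MAXFILESIZE")
        buf += m
    # SEPARATE=.false.: one separator/header for the whole file
    return frame(buf, header, make_separator)
```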
- exit - endif - ENDIF -C - enddo foreachbufrmessage -C - IF (.not.SEPARATE) THEN -C -C ADD Flag Field Separator AND WMO HEADERS -C TO BUFR MESSAGE. WRITE BUFR MESSAGE IN FILE -C - call mkfldsep(csep,iopt,insize,itot+lenhead,lenout) - call wryte(outbufr,lenout,csep) - call wryte(outbufr,lenhead,WMOHDR) - call wryte(outbufr,itot,filebuf) - deallocate(filebuf) - ENDIF -C -C* CLOSING SECTION -C - IF (NBUL .EQ. 0 ) THEN - WRITE (6,FMT='('' SOMETHING WRONG WITH INPUT BUFR FILE...'', - & ''NOTHING WAS PROCESSED'')') - CALL W3TAGE('TOCSBUFR') - call errexit(30) - ELSE - CALL BACLOSE (OUTBUFR,iret) - WRITE (6,FMT='(//,'' ******** RECAP OF THIS EXECUTION '', - & ''********'',/,5X,''READ '',I6,'' BUFR MESSAGES'', - & /,5X,''WROTE '',I6,'' BULLETINS OUT FOR TRANSMISSION'', - & //)') NREC, NBUL - ENDIF -C - CALL W3TAGE('TOCSBUFR') - STOP - END diff --git a/sorc/vint.fd/vint.f b/sorc/vint.fd/vint.f deleted file mode 100644 index e4d6db807c1..00000000000 --- a/sorc/vint.fd/vint.f +++ /dev/null @@ -1,1239 +0,0 @@ - program vint -c -c ABSTRACT: This program interpolates from various pressure levels -c onto regularly-spaced, 50-mb vertical levels. The intent is that -c we can use data with relatively coarse vertical resolution to -c get data on the necessary 50-mb intervals that we need for Bob -c Hart's cyclone phase space. For each model, we will need to read -c in a control file that contains the levels that we are -c interpolating from. 
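The 50-mb target levels described in the abstract are generated in vint's main program: 13 levels for geopotential height (parm 7 or 156), 5 for temperature, starting at 300 mb. A small sketch of that level list:

```python
def vint_output_levels(iparm: int) -> list:
    """Target pressure levels in mb: 300, 350, ... in 50-mb steps.
    Per vint's main program, geopotential height (parm 7 or 156)
    gets 13 levels (300-900 mb); temperature gets 5 (300-500 mb)."""
    nlevsout = 13 if iparm in (7, 156) else 5
    return [300.0 + 50.0 * k for k in range(nlevsout)]
```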
-c -c Written by Tim Marchok - - USE params - USE grib_mod - - implicit none - - type(gribfield) :: holdgfld - integer, parameter :: lugb=11,lulv=16,lugi=31,lout=51,maxlev=200 - integer kpds(200),kgds(200) - integer nlevsin,iriret,iogret,kf,iggret,igdret,iidret,ixo,k,n - integer iha,iho,iva,irfa,iodret,ifcsthour,iia,iparm,nlevsout - integer gribver,g2_jpdtn - integer ilevs(maxlev) - real, allocatable :: xinpdat(:,:),xoutdat(:,:),xoutlevs_p(:) - logical(1), allocatable :: valid_pt(:),readflag(:) - - namelist/timein/ifcsthour,iparm,gribver,g2_jpdtn -c - read (5,NML=timein,END=201) - 201 continue - print *,' ' - print *,'*----------------------------------------------------*' - print *,' ' - print *,' +++ Top of vint +++' - print *,' ' - print *,'After namelist read, input forecast hour = ',ifcsthour - print *,' input grib parm = ',iparm - print *,' GRIB version= ',gribver - print *,' GRIB2 JPDTN= g2_jpdtn= ' - & ,g2_jpdtn - - if (iparm == 7 .or. iparm == 156) then - nlevsout = 13 ! dealing with height - else - nlevsout = 5 ! dealing with temperature - endif - - allocate (xoutlevs_p(nlevsout),stat=ixo) - if (ixo /= 0) then - print *,' ' - print *,'!!! ERROR in vint allocating the xoutlevs_p array.' - print *,'!!! ixo= ',ixo - print *,' ' - goto 899 - endif - - do k = 1,nlevsout - xoutlevs_p(k) = 300. + float((k-1)*50) - enddo - - ilevs = -999 - call read_input_levels (lulv,maxlev,nlevsin,ilevs,iriret) - - if (iriret /= 0) then - print *,' ' - print *,'!!! ERROR in vint. ' - print *,'!!! RETURN CODE FROM read_input_levels /= 0' - print *,'!!! RETURN CODE = iriret = ',iriret - print *,'!!! EXITING....' - print *,' ' - goto 899 - endif - - call open_grib_files (lugb,lugi,lout,gribver,iogret) - - if (iogret /= 0) then - print '(/,a45,i4,/)','!!! 
ERROR: in vint open_grib_files, rc= ' - & ,iogret - goto 899 - endif - - call getgridinfo (lugb,lugi,kf,kpds,kgds,holdgfld,ifcsthour,iparm - & ,gribver,g2_jpdtn,iggret) - - allocate (xinpdat(kf,nlevsin),stat=iha) - allocate (xoutdat(kf,nlevsout),stat=iho) - allocate (valid_pt(kf),stat=iva) - allocate (readflag(nlevsin),stat=irfa) - if (iha /= 0 .or. iho /= 0 .or. iva /= 0 .or. irfa /= 0) then - print *,' ' - print *,'!!! ERROR in vint.' - print *,'!!! ERROR allocating the xinpdat, readflag, or the' - print *,'!!! valid_pt array, iha= ',iha,' iva= ',iva - print *,'!!! irfa= ',irfa,' iho= ',iho - print *,' ' - goto 899 - endif - - print *,'hold check, holdgfld%ipdtlen = ',holdgfld%ipdtlen - do n = 1,holdgfld%ipdtlen - print *,'hold check, n= ',n,' holdgfld%ipdtmpl= ' - & ,holdgfld%ipdtmpl(n) - enddo - - call getdata (lugb,lugi,kf,valid_pt,nlevsin,ilevs,maxlev - & ,readflag,xinpdat,ifcsthour,iparm,gribver,g2_jpdtn - & ,igdret) - - call interp_data (kf,valid_pt,nlevsin,ilevs,maxlev,readflag - & ,xinpdat,xoutdat,xoutlevs_p,nlevsout,iidret) - - call output_data (lout,kf,kpds,kgds,holdgfld,xoutdat,valid_pt - & ,xoutlevs_p,nlevsout,gribver,iodret) - - deallocate (xinpdat) - deallocate (xoutdat) - deallocate (valid_pt) - deallocate (readflag) - deallocate (xoutlevs_p) - - 899 continue -c - stop - end -c -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine read_input_levels (lulv,maxlev,nlevsin,ilevs,iriret) -c -c ABSTRACT: This subroutine reads in a text file that contains -c the number of input pressure levels for a given model. The -c format of the file goes like this, from upper levels to -c lower, for example: -c -c 1 200 -c 2 400 -c 3 500 -c 4 700 -c 5 850 -c 6 925 -c 7 1000 -c -c - implicit none - - integer lulv,nlevsin,maxlev,iriret,inplev,ict,lvix - integer ilevs(maxlev) -c - iriret=0 - ict = 0 - do while (.true.) 
- - print *,'Top of while loop in vint read_input_levels' - - read (lulv,85,end=130) lvix,inplev - - if (inplev > 0 .and. inplev <= 1000) then - ict = ict + 1 - ilevs(ict) = inplev - else - print *,' ' - print *,'!!! ERROR: Input level not between 0 and 1000' - print *,'!!! in vint. inplev= ',inplev - print *,'!!! STOPPING EXECUTION' - STOP 91 - endif - - print *,'vint readloop, ict= ',ict,' inplev= ',inplev - - enddo - - 85 format (i4,1x,i4) - 130 continue - - nlevsin = ict - - print *,' ' - print *,'Total number of vint levels read in = ',nlevsin -c - return - end - -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine getgridinfo (lugb,lugi,kf,kpds,kgds,holdgfld,ifcsthour - & ,iparm,gribver,g2_jpdtn,iggret) -c -c ABSTRACT: The purpose of this subroutine is just to get the max -c values of i and j and the dx and dy grid spacing intervals for the -c grid to be used in the rest of the program. So just read the -c grib file to get the lon and lat data. Also, get the info for -c the data grid's boundaries. This boundary information will be -c used later in the tracking algorithm, and is accessed via Module -c grid_bounds. -c -C INPUT: -C lugb The Fortran unit number for the GRIB data file -C lugi The Fortran unit number for the GRIB index file -c ifcsthour input forecast hour to search for -c iparm input grib parm to search for -c gribver integer (1 or 2) to indicate if using GRIB1 / GRIB2 -c g2_jpdtn If GRIB2 data being read, this is the value for JPDTN -c that is input to getgb2. 
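The read_input_levels loop above reads `(i4,1x,i4)` records until end-of-file and rejects any level outside (0, 1000]. An equivalent sketch in Python:

```python
def read_input_levels(text: str) -> list:
    """Parse the vint level control file: '<index> <level-mb>' pairs,
    ordered from upper levels to lower; levels must lie in (0, 1000]."""
    levels = []
    for line in text.splitlines():
        if not line.strip():
            continue
        _, lev = line.split()          # format (i4,1x,i4): index, level
        lev = int(lev)
        if not 0 < lev <= 1000:
            raise ValueError("input level not between 0 and 1000: %d" % lev)
        levels.append(lev)
    return levels
```

Fed the example file from the subroutine's abstract, this returns [200, 400, 500, 700, 850, 925, 1000].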
-C -C OUTPUT: -c kf Number of gridpoints on the grid -c kpds pds array for a GRIB1 record -c kgds gds array for a GRIB1 record -c holdgfld info for a GRIB2 record -c -C iggret The return code from this subroutine -c - USE params - USE grib_mod - - implicit none -c - type(gribfield) :: gfld,prevfld,holdgfld - integer,dimension(200) :: jids,jpdt,jgdt - logical(1), allocatable :: lb(:) - integer, parameter :: jf=4000000 - integer jpds(200),jgds(200) - integer kpds(200),kgds(200) - integer :: listsec1(13) - integer ila,ifa,iret,ifcsthour,imax,jmax,jskp,jdisc - integer lugb,lugi,kf,j,k,iggret,iparm,gribver,g2_jpdtn - integer jpdtn,jgdtn,npoints,icount,ipack,krec - integer :: listsec0(2)=(/0,2/) - integer :: igds(5)=(/0,0,0,0,0/),previgds(5) - integer :: idrstmpl(200) - integer :: currlen=1000000 - logical :: unpack=.true. - logical :: open_grb=.false. - real, allocatable :: f(:) - real dx,dy -c - iggret = 0 - - allocate (lb(jf),stat=ila) - allocate (f(jf),stat=ifa) - if (ila /= 0 .or. ifa /= 0) then - print *,' ' - print *,'!!! ERROR in vint.' - print *,'!!! ERROR in getgridinfo allocating either lb or f' - print *,'!!! ila = ',ila,' ifa= ',ifa - iggret = 97 - return - endif - - if (gribver == 2) then - - ! Search for a record from a GRIB2 file - - ! - ! --- Initialize Variables --- - ! - - gfld%idsect => NULL() - gfld%local => NULL() - gfld%list_opt => NULL() - gfld%igdtmpl => NULL() - gfld%ipdtmpl => NULL() - gfld%coord_list => NULL() - gfld%idrtmpl => NULL() - gfld%bmap => NULL() - gfld%fld => NULL() - - jdisc=0 ! meteorological products - jids=-9999 - jpdtn=g2_jpdtn ! 0 = analysis or forecast; 1 = ens fcst - jgdtn=0 ! lat/lon grid - jgdt=-9999 - jpdt=-9999 - - npoints=0 - icount=0 - jskp=0 - -c Search for Temperature or GP Height by production template.... - - JPDT(1:15)=(/-9999,-9999,-9999,-9999,-9999,-9999,-9999,-9999 - & ,-9999,-9999,-9999,-9999,-9999,-9999,-9999/) - - if (iparm == 7) then ! GP Height - jpdt(1) = 3 ! Param category from Table 4.1 - jpdt(2) = 5 ! 
Param number from Table 4.2-0-3 - elseif (iparm == 11) then ! Temperature - jpdt(1) = 0 ! Param category from Table 4.1 - jpdt(2) = 0 ! Param category from Table 4.2 - endif - - jpdt(9) = ifcsthour - - call getgb2(lugb,lugi,jskp,jdisc,jids,jpdtn,jpdt,jgdtn,jgdt - & ,unpack,krec,gfld,iret) - if ( iret.ne.0) then - print *,' ' - print *,' ERROR: getgb2 error in getgridinfo = ',iret - endif - -c Determine packing information from GRIB2 file -c The default packing is 40 JPEG 2000 - - ipack = 40 - - print *,' gfld%idrtnum = ', gfld%idrtnum - - ! Set DRT info ( packing info ) - if ( gfld%idrtnum.eq.0 ) then ! Simple packing - ipack = 0 - elseif ( gfld%idrtnum.eq.2 ) then ! Complex packing - ipack = 2 - elseif ( gfld%idrtnum.eq.3 ) then ! Complex & spatial packing - ipack = 31 - elseif ( gfld%idrtnum.eq.40.or.gfld%idrtnum.eq.15 ) then - ! JPEG 2000 packing - ipack = 40 - elseif ( gfld%idrtnum.eq.41 ) then ! PNG packing - ipack = 41 - endif - - print *,'After check of idrtnum, ipack= ',ipack - - print *,'Number of gridpts= gfld%ngrdpts= ',gfld%ngrdpts - print *,'Number of elements= gfld%igdtlen= ',gfld%igdtlen - print *,'PDT num= gfld%ipdtnum= ',gfld%ipdtnum - print *,'GDT num= gfld%igdtnum= ',gfld%igdtnum - - imax = gfld%igdtmpl(8) - print *,'at A' - jmax = gfld%igdtmpl(9) - print *,'at B' - dx = float(gfld%igdtmpl(17))/1.e6 - print *,'at C' - dy = float(gfld%igdtmpl(17))/1.e6 - print *,'at D' - kf = gfld%ngrdpts - print *,'at E' - - holdgfld = gfld - - else - - ! Search for a record from a GRIB1 file - - jpds = -1 - jgds = -1 - - j=0 - - jpds(5) = iparm ! Get a record for the input parm selected - jpds(6) = 100 ! Get a record on a standard pressure level - jpds(14) = ifcsthour - - call getgb(lugb,lugi,jf,j,jpds,jgds, - & kf,k,kpds,kgds,lb,f,iret) - - if (iret.ne.0) then - print *,' ' - print *,'!!! ERROR in vint getgridinfo calling getgb' - print *,'!!! 
Return code from getgb = iret = ',iret - iggret = iret - return - else - iggret=0 - imax = kgds(2) - jmax = kgds(3) - dx = float(kgds(9))/1000. - dy = float(kgds(10))/1000. - endif - - endif - - print *,' ' - print *,'In vint getgridinfo, grid dimensions follow:' - print *,'imax= ',imax,' jmax= ',jmax - print *,' dx= ',dx,' dy= ',dy - print *,'number of gridpoints = ',kf - - deallocate (lb); deallocate(f) - - return - end - -c--------------------------------------------------------------------- -c -c--------------------------------------------------------------------- - subroutine getdata (lugb,lugi,kf,valid_pt,nlevsin,ilevs,maxlev - & ,readflag,xinpdat,ifcsthour,iparm,gribver,g2_jpdtn - & ,igdret) -c -c ABSTRACT: This subroutine reads the input GRIB file for the -c tracked parameters. - - USE params - USE grib_mod - - implicit none -c - type(gribfield) :: gfld,prevfld - CHARACTER(len=8) :: pabbrev - integer,dimension(200) :: jids,jpdt,jgdt - logical(1) valid_pt(kf),lb(kf),readflag(nlevsin) - integer, parameter :: jf=4000000 - integer ilevs(maxlev) - integer jpds(200),jgds(200),kpds(200),kgds(200) - integer lugb,lugi,kf,nlevsin,maxlev,igdret,jskp,jdisc - integer i,j,k,ict,np,lev,ifcsthour,iret,iparm,gribver,g2_jpdtn - integer jpdtn,jgdtn,npoints,icount,ipack,krec - integer pdt_4p0_vert_level,pdt_4p0_vtime,mm - integer :: listsec0(2)=(/0,2/) - integer :: listsec1(13) - integer :: igds(5)=(/0,0,0,0,0/),previgds(5) - integer :: idrstmpl(200) - integer :: currlen=1000000 - logical :: unpack=.true. - logical :: open_grb=.false. - real f(kf),xinpdat(kf,nlevsin),xtemp(kf) - real dmin,dmax,firstval,lastval -c - igdret=0 - ict = 0 - - level_loop: do lev = 1,nlevsin - - print *,' ' - print *,'In vint getdata read loop, lev= ',lev,' level= ' - & ,ilevs(lev) - - if (gribver == 2) then - - ! - ! --- Initialize Variables --- - ! 
- - gfld%idsect => NULL() - gfld%local => NULL() - gfld%list_opt => NULL() - gfld%igdtmpl => NULL() - gfld%ipdtmpl => NULL() - gfld%coord_list => NULL() - gfld%idrtmpl => NULL() - gfld%bmap => NULL() - gfld%fld => NULL() - - jdisc=0 ! meteorological products - jids=-9999 - jpdtn=g2_jpdtn ! 0 = analysis or forecast; 1 = ens fcst - jgdtn=0 ! lat/lon grid - jgdt=-9999 - jpdt=-9999 - - npoints=0 - icount=0 - jskp=0 - -c Search for input parameter by production template 4.0. This -c vint program is used primarily for temperature, but still we -c will leave that as a variable and not-hard wire it in case we -c choose to average something else in the future. - - ! We are looking for Temperature or GP Height here. This - ! block of code, or even the smaller subset block of code that - ! contains the JPDT(1) and JPDT(2) assignments, can of course - ! be modified if this program is to be used for interpolating - ! other variables.... - - ! Set defaults for JPDT, then override in array - ! assignments below... - - JPDT(1:15)=(/-9999,-9999,-9999,-9999,-9999,-9999,-9999,-9999 - & ,-9999,-9999,-9999,-9999,-9999,-9999,-9999/) - - print *,' ' - print *,'In getdata vint, iparm= ',iparm - - if (iparm == 7) then ! GP Height - jpdt(1) = 3 ! Param category from Table 4.1 - jpdt(2) = 5 ! Param number from Table 4.2-0-3 - elseif (iparm == 11) then ! Temperature - jpdt(1) = 0 ! Param category from Table 4.1 - jpdt(2) = 0 ! Param category from Table 4.2 - endif - - JPDT(9) = ifcsthour - JPDT(10) = 100 ! Isobaric surface requested (Table 4.5) - JPDT(12) = ilevs(lev) * 100 ! 
value of specific level - - print *,'before getgb2 call, value of unpack = ',unpack - - do mm = 1,15 - print *,'VINT getdata mm= ',mm,' JPDT(mm)= ',JPDT(mm) - enddo - - call getgb2(lugb,lugi,jskp,jdisc,jids,jpdtn,jpdt,jgdtn,jgdt - & ,unpack,krec,gfld,iret) - - print *,'iret from getgb2 in getdata = ',iret - - print *,'after getgb2 call, value of unpacked = ' - & ,gfld%unpacked - - print *,'after getgb2 call, gfld%ndpts = ',gfld%ndpts - print *,'after getgb2 call, gfld%ibmap = ',gfld%ibmap - - if ( iret == 0) then - -c Determine packing information from GRIB2 file -c The default packing is 40 JPEG 2000 - - ipack = 40 - - print *,' gfld%idrtnum = ', gfld%idrtnum - - ! Set DRT info ( packing info ) - if ( gfld%idrtnum.eq.0 ) then ! Simple packing - ipack = 0 - elseif ( gfld%idrtnum.eq.2 ) then ! Complex packing - ipack = 2 - elseif ( gfld%idrtnum.eq.3 ) then ! Complex & spatial - & ! packing - ipack = 31 - elseif ( gfld%idrtnum.eq.40.or.gfld%idrtnum.eq.15 ) then - ! JPEG 2000 packing - ipack = 40 - elseif ( gfld%idrtnum.eq.41 ) then ! PNG packing - ipack = 41 - endif - - print *,'After check of idrtnum, ipack= ',ipack - - print *,'Number of gridpts= gfld%ngrdpts= ',gfld%ngrdpts - print *,'Number of elements= gfld%igdtlen= ',gfld%igdtlen - print *,'GDT num= gfld%igdtnum= ',gfld%igdtnum - - kf = gfld%ndpts ! Number of gridpoints returned from read - - do np = 1,kf - xinpdat(np,lev) = gfld%fld(np) - xtemp(np) = gfld%fld(np) - if (gfld%ibmap == 0) then - valid_pt(np) = gfld%bmap(np) - else - valid_pt(np) = .true. - endif - enddo - - readflag(lev) = .TRUE. 
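The idrtnum-to-ipack ladder above, repeated in both getgridinfo and getdata, amounts to a small lookup with a JPEG-2000 default. A sketch:

```python
# DRT number (GRIB2 Section 5 template) -> ipack code, as in getdata
DRT_TO_IPACK = {
    0: 0,    # simple packing
    2: 2,    # complex packing
    3: 31,   # complex packing + spatial differencing
    15: 40,  # handled as JPEG 2000 here
    40: 40,  # JPEG 2000
    41: 41,  # PNG
}

def ipack_for(idrtnum: int) -> int:
    """Mirror the if/elseif ladder: unrecognized DRT numbers fall
    back to the JPEG-2000 default of 40."""
    return DRT_TO_IPACK.get(idrtnum, 40)
```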
-c call bitmapchk(kf,gfld%bmap,gfld%fld,dmin,dmax) - call bitmapchk(kf,valid_pt,xtemp,dmin,dmax) - - if (ict == 0) then -c do np = 1,kf -c valid_pt(np) = gfld%bmap(np) -c enddo - ict = ict + 1 - endif - - firstval=gfld%fld(1) - lastval=gfld%fld(kf) - - print *,' ' - print *,' SECTION 0: discipl= ',gfld%discipline - & ,' gribver= ',gfld%version - print *,' ' - print *,' SECTION 1: ' - - do j = 1,gfld%idsectlen - print *,' sect1, j= ',j,' gfld%idsect(j)= ' - & ,gfld%idsect(j) - enddo - - if ( associated(gfld%local).AND.gfld%locallen.gt.0) then - print *,' ' - print *,' SECTION 2: ',gfld%locallen,' bytes' - else - print *,' ' - print *,' SECTION 2 DOES NOT EXIST IN THIS RECORD' - endif - - print *,' ' - print *,' SECTION 3: griddef= ',gfld%griddef - print *,' ngrdpts= ',gfld%ngrdpts - print *,' numoct_opt= ',gfld%numoct_opt - print *,' interp_opt= ',gfld%interp_opt - print *,' igdtnum= ',gfld%igdtnum - print *,' igdtlen= ',gfld%igdtlen - - print *,' ' - print '(a17,i3,a2)',' GRID TEMPLATE 3.',gfld%igdtnum,': ' - do j=1,gfld%igdtlen - print *,' j= ',j,' gfld%igdtmpl(j)= ',gfld%igdtmpl(j) - enddo - - print *,' ' - print *,' PDT num (gfld%ipdtnum) = ',gfld%ipdtnum - print *,' ' - print '(a20,i3,a2)',' PRODUCT TEMPLATE 4.',gfld%ipdtnum,': ' - do j=1,gfld%ipdtlen - print *,' sect 4 j= ',j,' gfld%ipdtmpl(j)= ' - & ,gfld%ipdtmpl(j) - enddo - -c Print out values for data representation type - - print *,' ' - print '(a21,i3,a2)',' DATA REP TEMPLATE 5.',gfld%idrtnum - & ,': ' - do j=1,gfld%idrtlen - print *,' sect 5 j= ',j,' gfld%idrtmpl(j)= ' - & ,gfld%idrtmpl(j) - enddo - -c Get parameter abbrev for record that was retrieved - - pdt_4p0_vtime = gfld%ipdtmpl(9) - pdt_4p0_vert_level = gfld%ipdtmpl(12) - - pabbrev=param_get_abbrev(gfld%discipline,gfld%ipdtmpl(1) - & ,gfld%ipdtmpl(2)) - - print *,' ' - write (6,131) - 131 format (' rec# param level byy bmm bdd bhh ' - & ,'fhr npts firstval lastval minval ' - & ,' maxval') - print '(i5,3x,a8,2x,6i5,2x,i8,4g12.4)' - & 
,krec,pabbrev,pdt_4p0_vert_level/100,gfld%idsect(6) - & ,gfld%idsect(7),gfld%idsect(8),gfld%idsect(9) - & ,pdt_4p0_vtime,gfld%ndpts,firstval,lastval,dmin,dmax - - do np = 1,kf - xinpdat(np,lev) = gfld%fld(np) - enddo - - else - - print *,' ' - print *,'!!! ERROR: GRIB2 VINT READ IN GETDATA FAILED FOR ' - & ,'LEVEL LEV= ',LEV - print *,' ' - - readflag(lev) = .FALSE. - - do np = 1,kf - xinpdat(np,lev) = -99999.0 - enddo - - endif - - else - - ! Reading a GRIB1 file.... - - jpds = -1 - jgds = -1 - j=0 - - jpds(5) = iparm ! grib parameter id to read in - jpds(6) = 100 ! level id to indicate a pressure level - jpds(7) = ilevs(lev) ! actual level of the layer - jpds(14) = ifcsthour ! lead time to search for - - call getgb (lugb,lugi,jf,j,jpds,jgds, - & kf,k,kpds,kgds,lb,f,iret) - - print *,' ' - print *,'After vint getgb call, j= ',j,' k= ',k,' level= ' - & ,ilevs(lev),' iret= ',iret - - if (iret == 0) then - - readflag(lev) = .TRUE. - call bitmapchk(kf,lb,f,dmin,dmax) - - if (ict == 0) then - do np = 1,kf - valid_pt(np) = lb(np) - enddo - ict = ict + 1 - endif - - write (6,31) - 31 format (' rec# parm# levt lev byy bmm bdd bhh fhr ' - & ,'npts minval maxval') - print '(i4,2x,8i5,i8,2g12.4)', - & k,(kpds(i),i=5,11),kpds(14),kf,dmin,dmax - - do np = 1,kf - xinpdat(np,lev) = f(np) - enddo - - else - - print *,' ' - print *,'!!! ERROR: VINT READ FAILED FOR LEVEL LEV= ',LEV - print *,' ' - - readflag(lev) = .FALSE. 
- - do np = 1,kf - xinpdat(np,lev) = -99999.0 - enddo - - endif - - endif - - enddo level_loop -c - return - end -c -c----------------------------------------------------------------------- -c -c----------------------------------------------------------------------- - subroutine interp_data (kf,valid_pt,nlevsin,ilevs,maxlev,readflag - & ,xinpdat,xoutdat,xoutlevs_p,nlevsout,iidret) -c -c ABSTRACT: This routine interpolates data in between available -c pressure levels to get data resolution at the 50-mb -c resolution that we need for the cyclone phase space -c diagnostics. - - implicit none - - logical(1) valid_pt(kf),readflag(nlevsin) - integer ilevs(maxlev) - integer nlevsin,nlevsout,maxlev,kf,kout,kin,k,n,kup,klo - integer iidret - real xinpdat(kf,nlevsin),xoutdat(kf,nlevsout) - real xoutlevs_p(nlevsout),xoutlevs_lnp(nlevsout) - real xinlevs_p(nlevsin),xinlevs_lnp(nlevsin) - real pdiff,pdiffmin,xu,xo,xl,yu,yl -c - iidret=0 - print *,' ' - print *,'*----------------------------------------------*' - print *,' Listing of standard output levels follows....' - print *,'*----------------------------------------------*' - print *,' ' - - do k = 1,nlevsout - xoutlevs_lnp(k) = log(xoutlevs_p(k)) - write (6,81) k,xoutlevs_p(k),xoutlevs_lnp(k) - enddo - 81 format (1x,'k= ',i3,' p= ',f6.1,' ln(p)= ',f9.6) - - do k = 1,nlevsin - xinlevs_p(k) = float(ilevs(k)) - xinlevs_lnp(k) = log(xinlevs_p(k)) - enddo - -c ----------------------------------------------------------------- -c We want to loop through for all the *output* levels that we need. -c We may have some input levels that match perfectly, often at -c least the standard levels like 500, 700, 850. For these levels, -c just take the data directly from the input file. For other -c output levels that fall between the input levels, we need to -c find the nearest upper and lower levels. 
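The strategy described above can be sketched compactly: copy exact matches, otherwise interpolate linearly in ln(p) between the nearest input level above (lower pressure) and below (higher pressure) the target. The weight expression is the conventional linear form in log pressure, assumed here to match the notation block at the end of interp_data:

```python
import math

def interp_level(p_out, p_in, data_in):
    """Interpolate one output level linearly in ln(p), as interp_data
    does: copy exact matches, otherwise use the nearest input level
    above (lower pressure) and below (higher pressure) the target."""
    if p_out in p_in:
        return data_in[p_in.index(p_out)]
    above = [p for p in p_in if p < p_out]   # pdiff = p_out - p_in > 0
    below = [p for p in p_in if p > p_out]
    if not above or not below:
        raise ValueError("no bracketing input levels for %s" % p_out)
    pu, pl = max(above), min(below)          # nearest upper / lower levels
    xu, xo, xl = math.log(pu), math.log(p_out), math.log(pl)
    yu, yl = data_in[p_in.index(pu)], data_in[p_in.index(pl)]
    return yl + (yu - yl) * (xo - xl) / (xu - xl)
```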
- - output_loop: do kout = 1,nlevsout - - print *,' ' - print *,'+------------------------------------------------+' - print *,'Top of vint output_loop, kout= ',kout,' pressure= ' - & ,xoutlevs_p(kout) - - ! Loop through all of the input levels and find the level - ! that is closest to the output level from the *upper* side. - ! And again, in this upper loop, if we hit a level that - ! exactly matches a needed output level, just copy that data - ! and then cycle back to the top of output_loop. - - kup = -999 - klo = -999 - - pdiffmin = 9999.0 - - inp_loop_up: do kin = 1,nlevsin - if (xinlevs_p(kin) == xoutlevs_p(kout)) then - print *,' ' - print *,'+++ Exact level found. kout= ',kout - print *,'+++ level= ',xoutlevs_p(kout) - print *,'+++ Data copied. No interpolation needed.' - if (readflag(kin)) then - do n = 1,kf - xoutdat(n,kout) = xinpdat(n,kin) - enddo - cycle output_loop - else - print *,' ' - print *,'!!! ERROR: readflag is FALSE in interp_data for' - print *,'!!! level kin= ',kin,', which is a level that ' - print *,'!!! exactly matches a required output level, and' - print *,'!!! the user has identified as being an input ' - print *,'!!! level with valid data for this model. We ' - print *,'!!! will get the data from a different level.' - endif - else - pdiff = xoutlevs_p(kout) - xinlevs_p(kin) - if (pdiff > 0.) then ! We have a level higher than outlev - if (pdiff < pdiffmin) then - pdiffmin = pdiff - kup = kin - endif - endif - endif - enddo inp_loop_up - - pdiffmin = 9999.0 - - inp_loop_lo: do kin = 1,nlevsin - pdiff = xinlevs_p(kin) - xoutlevs_p(kout) - if (pdiff > 0.) then ! We have a level lower than outlev - if (pdiff < pdiffmin) then - pdiffmin = pdiff - klo = kin - endif - endif - enddo inp_loop_lo - - if (kup == -999 .or. klo == -999) then - print *,' ' - print *,'!!! ERROR: While interpolating, could not find ' - print *,'!!! either an upper or lower input level to use' - print *,'!!! for interpolating *from*.' - print *,'!!! 
kup= ',kup,' klo= ',klo - print *,' ' - print *,'!!! STOPPING....' - stop 91 - endif - - if (.not. readflag(kup) .or. .not. readflag(klo)) then - print *,' ' - print *,'!!! ERROR: In interp_data, either the upper or the' - print *,'!!! lower input level closest to the target output' - print *,'!!! level did not have valid data read in.' - print *,'!!! ' - write (6,91) ' upper level k= ',kup,xinlevs_p(kup) - & ,xinlevs_lnp(kup) - write (6,101) xoutlevs_p(kout),xoutlevs_lnp(kout) - write (6,91) ' lower level k= ',klo,xinlevs_p(klo) - & ,xinlevs_lnp(klo) - print *,'!!! readflag upper = ',readflag(kup) - print *,'!!! readflag lower = ',readflag(klo) - print *,'!!! EXITING....' - stop 92 - endif - - print *,' ' - write (6,91) ' upper level k= ',kup,xinlevs_p(kup) - & ,xinlevs_lnp(kup) - write (6,101) xoutlevs_p(kout),xoutlevs_lnp(kout) - write (6,91) ' lower level k= ',klo,xinlevs_p(klo) - & ,xinlevs_lnp(klo) - - 91 format (1x,a17,1x,i3,' pressure= ',f6.1,' ln(p)= ',f9.6) - 101 format (13x,'Target output pressure= ',f6.1,' ln(p)= ',f9.6) - - !-------------------------------------------------------------- - ! Now perform the linear interpolation. Here is the notation - ! used in the interpolation: - ! - ! xu = ln of pressure at upper level - ! xo = ln of pressure at output level - ! xl = ln of pressure at lower level - ! yu = data value at upper level - ! 
yl = data value at lower level - !-------------------------------------------------------------- - - xu = xinlevs_lnp(kup) - xo = xoutlevs_lnp(kout) - xl = xinlevs_lnp(klo) - - do n = 1,kf - yu = xinpdat(n,kup) - yl = xinpdat(n,klo) - xoutdat(n,kout) = ((yl * (xo - xu)) - (yu * (xo - xl))) - & / (xl - xu) - enddo - - enddo output_loop -c - return - end -c -c---------------------------------------------------------------------- -c -c---------------------------------------------------------------------- - subroutine output_data (lout,kf,kpds,kgds,holdgfld,xoutdat - & ,valid_pt,xoutlevs_p,nlevsout,gribver,iodret) -c -c ABSTRACT: This routine writes out the output data on the -c specified output pressure levels. - - USE params - USE grib_mod - - implicit none - - CHARACTER(len=1),pointer,dimension(:) :: cgrib - type(gribfield) :: holdgfld - logical(1) valid_pt(kf),bmap(kf) - integer lout,kf,lugb,lugi,iodret,nlevsout,igoret,ipret,lev - integer gribver,ierr,ipack,lengrib,npoints,newlen,idrsnum - integer numcoord,ica,n,j - integer :: idrstmpl(200) - integer :: currlen=1000000 - integer :: listsec0(2)=(/0,2/) - integer :: igds(5)=(/0,0,0,0,0/),previgds(5) - integer kpds(200),kgds(200) - integer(4), parameter::idefnum=1 - integer(4) ideflist(idefnum),ibmap - real coordlist - real xoutdat(kf,nlevsout),xoutlevs_p(nlevsout) -c - iodret=0 - call baopenw (lout,"fort.51",igoret) - print *,'baopenw: igoret= ',igoret - - if (igoret /= 0) then - print *,' ' - print *,'!!! ERROR in vint in sub output_data opening' - print *,'!!! **OUTPUT** grib file. baopenw return codes:' - print *,'!!! grib file 1 return code = igoret = ',igoret - STOP 95 - return - endif - - levloop: do lev = 1,nlevsout - - if (gribver == 2) then - - ! Write data out as a GRIB2 message.... - - allocate(cgrib(currlen),stat=ica) - if (ica /= 0) then - print *,' ' - print *,'ERROR in output_data allocating cgrib' - print *,'ica= ',ica - iodret=95 - return - endif - - ! 
Ensure that cgrib array is large enough - - if (holdgfld%ifldnum == 1 ) then ! start new GRIB2 message - npoints=holdgfld%ngrdpts - else - npoints=npoints+holdgfld%ngrdpts - endif - newlen=npoints*4 - if ( newlen.gt.currlen ) then -ccc if (allocated(cgrib)) deallocate(cgrib) - if (associated(cgrib)) deallocate(cgrib) - allocate(cgrib(newlen),stat=ierr) -c call realloc (cgrib,currlen,newlen,ierr) - if (ierr == 0) then - print *,' ' - print *,'re-allocate for large grib msg: ' - print *,' currlen= ',currlen - print *,' newlen= ',newlen - currlen=newlen - else - print *,'ERROR returned from 2nd allocate cgrib = ',ierr - stop 95 - endif - endif - - ! Create new GRIB Message - listsec0(1)=holdgfld%discipline - listsec0(2)=holdgfld%version - - print *,'output, holdgfld%idsectlen= ',holdgfld%idsectlen - do j = 1,holdgfld%idsectlen - print *,' sect1, j= ',j,' holdgfld%idsect(j)= ' - & ,holdgfld%idsect(j) - enddo - - call gribcreate(cgrib,currlen,listsec0,holdgfld%idsect,ierr) - if (ierr.ne.0) then - write(6,*) ' ERROR creating new GRIB2 field (gribcreate)= ' - & ,ierr - stop 95 - endif - - previgds=igds - igds(1)=holdgfld%griddef - igds(2)=holdgfld%ngrdpts - igds(3)=holdgfld%numoct_opt - igds(4)=holdgfld%interp_opt - igds(5)=holdgfld%igdtnum - - if (igds(3) == 0) then - ideflist = 0 - endif - - call addgrid (cgrib,currlen,igds,holdgfld%igdtmpl - & ,holdgfld%igdtlen,ideflist,idefnum,ierr) - - if (ierr.ne.0) then - write(6,*) ' ERROR from addgrid adding GRIB2 grid = ',ierr - stop 95 - endif - - holdgfld%ipdtmpl(12) = int(xoutlevs_p(lev)) * 100 - - ipack = 40 - idrsnum = ipack - idrstmpl = 0 - - idrstmpl(2)= holdgfld%idrtmpl(2) - idrstmpl(3)= holdgfld%idrtmpl(3) - idrstmpl(6)= 0 - idrstmpl(7)= 255 - - numcoord=0 - coordlist=0.0 ! Only needed for hybrid vertical coordinate, - ! not here, so set it to 0.0 - - ! 0 - A bit map applies to this product and is specified in - ! this section - ! 255 - A bit map does not apply to this product - ibmap=255 ! 
Bitmap indicator (see Code Table 6.0) - - print *,' ' - print *,'output, holdgfld%ipdtlen= ',holdgfld%ipdtlen - do n = 1,holdgfld%ipdtlen - print *,'output, n= ',n,' holdgfld%ipdtmpl= ' - & ,holdgfld%ipdtmpl(n) - enddo - - print *,'output, kf= ',kf -c do n = 1,kf -c print *,'output, n= ',n,' xoutdat(n)= ',xoutdat(n) -c enddo - - call addfield (cgrib,currlen,holdgfld%ipdtnum,holdgfld%ipdtmpl - & ,holdgfld%ipdtlen,coordlist - & ,numcoord - & ,idrsnum,idrstmpl,200 - & ,xoutdat(1,lev),kf,ibmap,bmap,ierr) - - if (ierr /= 0) then - write(6,*) ' ERROR from addfield adding GRIB2 data = ',ierr - stop 95 - endif - -! Finalize GRIB message after all grids -! and fields have been added. It adds the End Section ( "7777" ) - - call gribend(cgrib,currlen,lengrib,ierr) - call wryte(lout,lengrib,cgrib) - - if (ierr == 0) then - print *,' ' - print *,'+++ GRIB2 write successful. ' - print *,' Len of message = currlen= ',currlen - print *,' Len of entire GRIB2 message = lengrib= ' - & ,lengrib - else - print *,' ERROR from gribend writing GRIB2 msg = ',ierr - stop 95 - endif - - else - - ! Write data out as a GRIB1 message.... - - kpds(7) = int(xoutlevs_p(lev)) - - print *,'In vint, just before call to putgb, kf= ',kf - call putgb (lout,kf,kpds,kgds,valid_pt,xoutdat(1,lev),ipret) - print *,'In vint, just after call to putgb, kf= ',kf - if (ipret == 0) then - print *,' ' - print *,'+++ IPRET = 0 after call to putgb in vint' - print *,' ' - else - print *,' ' - print *,'!!!!!! ERROR in vint.' - print *,'!!!!!! ERROR: IPRET NE 0 AFTER CALL TO PUTGB !!!' - print *,'!!!!!! Level index= ',lev - print *,'!!!!!! 
pressure= ',xoutlevs_p(lev) - print *,' ' - endif - - write(*,980) kpds(1),kpds(2) - write(*,981) kpds(3),kpds(4) - write(*,982) kpds(5),kpds(6) - write(*,983) kpds(7),kpds(8) - write(*,984) kpds(9),kpds(10) - write(*,985) kpds(11),kpds(12) - write(*,986) kpds(13),kpds(14) - write(*,987) kpds(15),kpds(16) - write(*,988) kpds(17),kpds(18) - write(*,989) kpds(19),kpds(20) - write(*,990) kpds(21),kpds(22) - write(*,991) kpds(23),kpds(24) - write(*,992) kpds(25) - write(*,880) kgds(1),kgds(2) - write(*,881) kgds(3),kgds(4) - write(*,882) kgds(5),kgds(6) - write(*,883) kgds(7),kgds(8) - write(*,884) kgds(9),kgds(10) - write(*,885) kgds(11),kgds(12) - write(*,886) kgds(13),kgds(14) - write(*,887) kgds(15),kgds(16) - write(*,888) kgds(17),kgds(18) - write(*,889) kgds(19),kgds(20) - write(*,890) kgds(21),kgds(22) - - 980 format(' kpds(1) = ',i7,' kpds(2) = ',i7) - 981 format(' kpds(3) = ',i7,' kpds(4) = ',i7) - 982 format(' kpds(5) = ',i7,' kpds(6) = ',i7) - 983 format(' kpds(7) = ',i7,' kpds(8) = ',i7) - 984 format(' kpds(9) = ',i7,' kpds(10) = ',i7) - 985 format(' kpds(11) = ',i7,' kpds(12) = ',i7) - 986 format(' kpds(13) = ',i7,' kpds(14) = ',i7) - 987 format(' kpds(15) = ',i7,' kpds(16) = ',i7) - 988 format(' kpds(17) = ',i7,' kpds(18) = ',i7) - 989 format(' kpds(19) = ',i7,' kpds(20) = ',i7) - 990 format(' kpds(21) = ',i7,' kpds(22) = ',i7) - 991 format(' kpds(23) = ',i7,' kpds(24) = ',i7) - 992 format(' kpds(25) = ',i7) - 880 format(' kgds(1) = ',i7,' kgds(2) = ',i7) - 881 format(' kgds(3) = ',i7,' kgds(4) = ',i7) - 882 format(' kgds(5) = ',i7,' kgds(6) = ',i7) - 883 format(' kgds(7) = ',i7,' kgds(8) = ',i7) - 884 format(' kgds(9) = ',i7,' kgds(10) = ',i7) - 885 format(' kgds(11) = ',i7,' kgds(12) = ',i7) - 886 format(' kgds(13) = ',i7,' kgds(14) = ',i7) - 887 format(' kgds(15) = ',i7,' kgds(16) = ',i7) - 888 format(' kgds(17) = ',i7,' kgds(18) = ',i7) - 889 format(' kgds(19) = ',i7,' kgds(20) = ',i7) - 890 format(' kgds(20) = ',i7,' kgds(22) = ',i7) - - endif - - 
enddo levloop -c - return - end -c -c----------------------------------------------------------------------- -c -c----------------------------------------------------------------------- - subroutine open_grib_files (lugb,lugi,lout,gribver,iret) - -C ABSTRACT: This subroutine must be called before any attempt is -C made to read from the input GRIB files. The GRIB and index files -C are opened with a call to baopenr. This call to baopenr was not -C needed in the cray version of this program (the files could be -C opened with a simple Cray assign statement), but the GRIB-reading -C utilities on the SP do require calls to this subroutine (it has -C something to do with the GRIB I/O being done in C on the SP, and -C the C I/O package needs an explicit open statement). -C -C INPUT: -C lugb The Fortran unit number for the GRIB data file -C lugi The Fortran unit number for the GRIB index file -C lout The Fortran unit number for the output grib file -c gribver integer (1 or 2) to indicate if using GRIB1 / GRIB2 -C -C OUTPUT: -C iret The return code from this subroutine - - implicit none - - character fnameg*7,fnamei*7,fnameo*7 - integer iret,gribver,lugb,lugi,lout,igoret,iioret,iooret - - iret=0 - fnameg(1:5) = "fort." - fnamei(1:5) = "fort." - fnameo(1:5) = "fort." - write(fnameg(6:7),'(I2)') lugb - write(fnamei(6:7),'(I2)') lugi - write(fnameo(6:7),'(I2)') lout - call baopenr (lugb,fnameg,igoret) - call baopenr (lugi,fnamei,iioret) - call baopenw (lout,fnameo,iooret) - - print *,' ' - print *,'vint: baopen: igoret= ',igoret,' iioret= ',iioret - & ,' iooret= ',iooret - - if (igoret /= 0 .or. iioret /= 0 .or. iooret /= 0) then - print *,' ' - print *,'!!! ERROR in vint.' - print *,'!!! ERROR in sub open_grib_files opening grib file' - print *,'!!! or grib index file. baopen return codes:' - print *,'!!! grib file return code = igoret = ',igoret - print *,'!!! index file return code = iioret = ',iioret - print *,'!!! 
output file return code = iooret = ',iooret - iret = 93 - return - endif - - return - end -c -c------------------------------------------------------------------- -c -c------------------------------------------------------------------- - subroutine bitmapchk (n,ld,d,dmin,dmax) -c -c This subroutine checks the bitmap for non-existent data values. -c Since the data from the regional models have been interpolated -c from either a polar stereographic or lambert conformal grid -c onto a lat/lon grid, there will be some gridpoints around the -c edges of this lat/lon grid that have no data; these grid -c points have been bitmapped out by Mark Iredell's interpolater. -c To provide another means of checking for invalid data points -c later in the program, set these bitmapped data values to a -c value of -999.0. The min and max of this array are also -c returned if a user wants to check for reasonable values. -c - logical(1) ld - dimension ld(n),d(n) -c - dmin=1.E15 - dmax=-1.E15 -c - do i=1,n - if (ld(i)) then - dmin=min(dmin,d(i)) - dmax=max(dmax,d(i)) - else - d(i) = -999.0 - endif - enddo -c - return - end diff --git a/test/diff_grib_files.py b/test/diff_grib_files.py index 43619f143d1..9c01afbb18e 100755 --- a/test/diff_grib_files.py +++ b/test/diff_grib_files.py @@ -1,6 +1,6 @@ #! /bin/env python3 ''' -Compares two grib2 files and print any variables that have a +Compares two grib2 files and print any variables that have a non-identity correlation. Syntax @@ -15,16 +15,16 @@ Path to the second grib2 file ''' - import re import sys import subprocess # TODO - Update to also check the min just in case the grib files have a constant offset + def count_nonid_corr(test_string: str, quiet=False): ''' - Scan a wgrib2 print of the correlation between two values and count + Scan a wgrib2 print of the correlation between two values and count how many variables have a non-identity correlation. Any such variables are printed. 
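The `count_nonid_corr` routine patched above scans wgrib2's `print_corr` output and counts variables whose correlation is not exactly 1. A minimal sketch of that parsing idea follows; the `rpn_corr=` record format and the tolerance are assumptions here, not taken from the actual wgrib2 output or the script's regex.

```python
import re

def count_nonid_corr(report: str, tol: float = 1e-6) -> int:
    """Count records whose printed correlation differs from 1.0,
    printing each such record. Assumes lines carry an 'rpn_corr=<value>'
    token (a hypothetical stand-in for the real wgrib2 format)."""
    count = 0
    for line in report.splitlines():
        m = re.search(r'rpn_corr=([0-9.eE+-]+)', line)
        if m and abs(float(m.group(1)) - 1.0) > tol:
            print(line)  # report the variable with non-identity correlation
            count += 1
    return count
```

As the TODO in the patch notes, a correlation of 1 alone does not catch a constant offset between files, so checking minima as well would strengthen the comparison.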
@@ -64,6 +64,7 @@ def count_nonid_corr(test_string: str, quiet=False): return count + if __name__ == '__main__': fileA = sys.argv[0] fileB = sys.argv[1] @@ -71,4 +72,5 @@ def count_nonid_corr(test_string: str, quiet=False): wgrib2_cmd = f"wgrib2 {fileA} -var -rpn 'sto_1' -import_grib {fileB} -rpn 'rcl_1:print_corr'" string = subprocess.run(wgrib2_cmd, shell=True, stdout=subprocess.PIPE).stdout.decode("utf-8") + count_nonid_corr(string) diff --git a/ush/calcanl_gfs.py b/ush/calcanl_gfs.py index 69f282cf417..a325ec35b3a 100755 --- a/ush/calcanl_gfs.py +++ b/ush/calcanl_gfs.py @@ -14,103 +14,95 @@ # function to calculate analysis from a given increment file and background -def calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, - ComIn_Ges, GPrefix, GSuffix, +def calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, + ComIn_Ges, GPrefix, FixDir, atmges_ens_mean, RunDir, NThreads, NEMSGet, IAUHrs, ExecCMD, ExecCMDMPI, ExecAnl, ExecChgresInc, Cdump): - print('calcanl_gfs beginning at: ',datetime.datetime.utcnow()) + print('calcanl_gfs beginning at: ', datetime.datetime.utcnow()) IAUHH = IAUHrs - if Cdump == "gfs": - IAUHH = list(map(int,'6')) - else: - IAUHH = IAUHrs - ######## copy and link files + # copy and link files if DoIAU and l4DEnsVar and Write4Danl: for fh in IAUHH: if fh == 6: # for full res analysis - CalcAnlDir = RunDir+'/calcanl_'+format(fh, '02') + CalcAnlDir = RunDir + '/calcanl_' + format(fh, '02') if not os.path.exists(CalcAnlDir): gsi_utils.make_dir(CalcAnlDir) - gsi_utils.copy_file(ExecAnl, CalcAnlDir+'/calc_anl.x') - gsi_utils.link_file(RunDir+'/siginc.nc', CalcAnlDir+'/siginc.nc.06') - gsi_utils.link_file(RunDir+'/sigf06', CalcAnlDir+'/ges.06') - gsi_utils.link_file(RunDir+'/siganl', CalcAnlDir+'/anl.06') - gsi_utils.copy_file(ExecChgresInc, CalcAnlDir+'/chgres_inc.x') + gsi_utils.copy_file(ExecAnl, CalcAnlDir + '/calc_anl.x') + gsi_utils.link_file(RunDir + '/siginc.nc', CalcAnlDir + '/siginc.nc.06') + 
gsi_utils.link_file(RunDir + '/sigf06', CalcAnlDir + '/ges.06') + gsi_utils.link_file(RunDir + '/siganl', CalcAnlDir + '/anl.06') + gsi_utils.copy_file(ExecChgresInc, CalcAnlDir + '/chgres_inc.x') # for ensemble res analysis - if Cdump == "gdas": - CalcAnlDir = RunDir+'/calcanl_ensres_'+format(fh, '02') + if Cdump in ["gdas", "gfs"]: + CalcAnlDir = RunDir + '/calcanl_ensres_' + format(fh, '02') if not os.path.exists(CalcAnlDir): gsi_utils.make_dir(CalcAnlDir) - gsi_utils.copy_file(ExecAnl, CalcAnlDir+'/calc_anl.x') - gsi_utils.link_file(RunDir+'/siginc.nc', CalcAnlDir+'/siginc.nc.06') - gsi_utils.link_file(ComOut+'/'+APrefix+'atmanl.ensres'+ASuffix, CalcAnlDir+'/anl.ensres.06') - gsi_utils.link_file(ComIn_Ges+'/'+GPrefix+'atmf006.ensres'+GSuffix, CalcAnlDir+'/ges.ensres.06') - gsi_utils.link_file(RunDir+'/sigf06', CalcAnlDir+'/ges.06') + gsi_utils.copy_file(ExecAnl, CalcAnlDir + '/calc_anl.x') + gsi_utils.link_file(RunDir + '/siginc.nc', CalcAnlDir + '/siginc.nc.06') + gsi_utils.link_file(ComOut + '/' + APrefix + 'atmanl.ensres.nc', CalcAnlDir + '/anl.ensres.06') + gsi_utils.link_file(ComIn_Ges + '/' + GPrefix + 'atmf006.ensres.nc', CalcAnlDir + '/ges.ensres.06') + gsi_utils.link_file(RunDir + '/sigf06', CalcAnlDir + '/ges.06') else: - if os.path.isfile('sigi'+format(fh, '02')+'.nc'): + if os.path.isfile('sigi' + format(fh, '02') + '.nc'): # for full res analysis - CalcAnlDir = RunDir+'/calcanl_'+format(fh, '02') - CalcAnlDir6 = RunDir+'/calcanl_'+format(6, '02') + CalcAnlDir = RunDir + '/calcanl_' + format(fh, '02') + CalcAnlDir6 = RunDir + '/calcanl_' + format(6, '02') if not os.path.exists(CalcAnlDir): gsi_utils.make_dir(CalcAnlDir) if not os.path.exists(CalcAnlDir6): gsi_utils.make_dir(CalcAnlDir6) - gsi_utils.link_file(ComOut+'/'+APrefix+'atma'+format(fh, '03')+ASuffix, - CalcAnlDir6+'/anl.'+format(fh, '02')) - gsi_utils.link_file(RunDir+'/siga'+format(fh, '02'), - CalcAnlDir6+'/anl.'+format(fh, '02')) - gsi_utils.link_file(RunDir+'/sigi'+format(fh, 
'02')+'.nc', - CalcAnlDir+'/siginc.nc.'+format(fh, '02')) - gsi_utils.link_file(CalcAnlDir6+'/inc.fullres.'+format(fh, '02'), - CalcAnlDir+'/inc.fullres.'+format(fh, '02')) - gsi_utils.link_file(RunDir+'/sigf'+format(fh, '02'), - CalcAnlDir6+'/ges.'+format(fh, '02')) - gsi_utils.link_file(RunDir+'/sigf'+format(fh, '02'), - CalcAnlDir+'/ges.'+format(fh, '02')) - gsi_utils.copy_file(ExecChgresInc, CalcAnlDir+'/chgres_inc.x') + gsi_utils.link_file(ComOut + '/' + APrefix + 'atma' + format(fh, '03') + '.nc', + CalcAnlDir6 + '/anl.' + format(fh, '02')) + gsi_utils.link_file(RunDir + '/siga' + format(fh, '02'), + CalcAnlDir6 + '/anl.' + format(fh, '02')) + gsi_utils.link_file(RunDir + '/sigi' + format(fh, '02') + '.nc', + CalcAnlDir + '/siginc.nc.' + format(fh, '02')) + gsi_utils.link_file(CalcAnlDir6 + '/inc.fullres.' + format(fh, '02'), + CalcAnlDir + '/inc.fullres.' + format(fh, '02')) + gsi_utils.link_file(RunDir + '/sigf' + format(fh, '02'), + CalcAnlDir6 + '/ges.' + format(fh, '02')) + gsi_utils.link_file(RunDir + '/sigf' + format(fh, '02'), + CalcAnlDir + '/ges.' 
+ format(fh, '02')) + gsi_utils.copy_file(ExecChgresInc, CalcAnlDir + '/chgres_inc.x') # for ensemble res analysis - CalcAnlDir = RunDir+'/calcanl_ensres_'+format(fh, '02') - CalcAnlDir6 = RunDir+'/calcanl_ensres_'+format(6, '02') + CalcAnlDir = RunDir + '/calcanl_ensres_' + format(fh, '02') + CalcAnlDir6 = RunDir + '/calcanl_ensres_' + format(6, '02') if not os.path.exists(CalcAnlDir): gsi_utils.make_dir(CalcAnlDir) if not os.path.exists(CalcAnlDir6): gsi_utils.make_dir(CalcAnlDir6) - gsi_utils.link_file(ComOut+'/'+APrefix+'atma'+format(fh, '03')+'.ensres'+ASuffix, - CalcAnlDir6+'/anl.ensres.'+format(fh, '02')) - gsi_utils.link_file(RunDir+'/sigi'+format(fh, '02')+'.nc', - CalcAnlDir6+'/siginc.nc.'+format(fh, '02')) - gsi_utils.link_file(ComIn_Ges+'/'+GPrefix+'atmf'+format(fh, '03')+'.ensres'+GSuffix, - CalcAnlDir6+'/ges.ensres.'+format(fh, '02')) - + gsi_utils.link_file(ComOut + '/' + APrefix + 'atma' + format(fh, '03') + '.ensres.nc', + CalcAnlDir6 + '/anl.ensres.' + format(fh, '02')) + gsi_utils.link_file(RunDir + '/sigi' + format(fh, '02') + '.nc', + CalcAnlDir6 + '/siginc.nc.' + format(fh, '02')) + gsi_utils.link_file(ComIn_Ges + '/' + GPrefix + 'atmf' + format(fh, '03') + '.ensres.nc', + CalcAnlDir6 + '/ges.ensres.' 
+ format(fh, '02')) else: # for full res analysis - CalcAnlDir = RunDir+'/calcanl_'+format(6, '02') + CalcAnlDir = RunDir + '/calcanl_' + format(6, '02') if not os.path.exists(CalcAnlDir): gsi_utils.make_dir(CalcAnlDir) - gsi_utils.copy_file(ExecAnl, CalcAnlDir+'/calc_anl.x') - gsi_utils.link_file(RunDir+'/siginc.nc', CalcAnlDir+'/siginc.nc.06') - gsi_utils.link_file(RunDir+'/sigf06', CalcAnlDir+'/ges.06') - gsi_utils.link_file(RunDir+'/siganl', CalcAnlDir+'/anl.06') - gsi_utils.copy_file(ExecChgresInc, CalcAnlDir+'/chgres_inc.x') + gsi_utils.copy_file(ExecAnl, CalcAnlDir + '/calc_anl.x') + gsi_utils.link_file(RunDir + '/siginc.nc', CalcAnlDir + '/siginc.nc.06') + gsi_utils.link_file(RunDir + '/sigf06', CalcAnlDir + '/ges.06') + gsi_utils.link_file(RunDir + '/siganl', CalcAnlDir + '/anl.06') + gsi_utils.copy_file(ExecChgresInc, CalcAnlDir + '/chgres_inc.x') # for ensemble res analysis - CalcAnlDir = RunDir+'/calcanl_ensres_'+format(6, '02') + CalcAnlDir = RunDir + '/calcanl_ensres_' + format(6, '02') if not os.path.exists(CalcAnlDir): gsi_utils.make_dir(CalcAnlDir) - gsi_utils.copy_file(ExecAnl, CalcAnlDir+'/calc_anl.x') - gsi_utils.link_file(RunDir+'/siginc.nc', CalcAnlDir+'/siginc.nc.06') - gsi_utils.link_file(ComOut+'/'+APrefix+'atmanl.ensres'+ASuffix, CalcAnlDir+'/anl.ensres.06') - gsi_utils.link_file(ComIn_Ges+'/'+GPrefix+'atmf006.ensres'+GSuffix, CalcAnlDir+'/ges.ensres.06') + gsi_utils.copy_file(ExecAnl, CalcAnlDir + '/calc_anl.x') + gsi_utils.link_file(RunDir + '/siginc.nc', CalcAnlDir + '/siginc.nc.06') + gsi_utils.link_file(ComOut + '/' + APrefix + 'atmanl.ensres.nc', CalcAnlDir + '/anl.ensres.06') + gsi_utils.link_file(ComIn_Ges + '/' + GPrefix + 'atmf006.ensres.nc', CalcAnlDir + '/ges.ensres.06') - ######## get dimension information from background and increment files + # get dimension information from background and increment files AnlDims = gsi_utils.get_ncdims('siginc.nc') - if ASuffix == ".nc": - GesDims = gsi_utils.get_ncdims('sigf06') - else: - 
GesDims = gsi_utils.get_nemsdims('sigf06',NEMSGet) + GesDims = gsi_utils.get_ncdims('sigf06') levs = AnlDims['lev'] LonA = AnlDims['lon'] @@ -120,71 +112,71 @@ def calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, # vertical coordinate info levs2 = levs + 1 - siglevel = FixDir+'/global_hyblev.l'+str(levs2)+'.txt' + siglevel = FixDir + '/global_hyblev.l' + str(levs2) + '.txt' - ####### determine how many forecast hours to process - nFH=0 + # determine how many forecast hours to process + nFH = 0 for fh in IAUHH: # first check to see if increment file exists - CalcAnlDir = RunDir+'/calcanl_'+format(fh, '02') - if (os.path.isfile(CalcAnlDir+'/siginc.nc.'+format(fh, '02'))): - print('will process increment file: '+CalcAnlDir+'/siginc.nc.'+format(fh, '02')) - nFH+=1 + CalcAnlDir = RunDir + '/calcanl_' + format(fh, '02') + if (os.path.isfile(CalcAnlDir + '/siginc.nc.' + format(fh, '02'))): + print('will process increment file: ' + CalcAnlDir + '/siginc.nc.' + format(fh, '02')) + nFH += 1 else: - print('Increment file: '+CalcAnlDir+'/siginc.nc.'+format(fh, '02')+' does not exist. Skipping.') + print('Increment file: ' + CalcAnlDir + '/siginc.nc.' + format(fh, '02') + ' does not exist. 
Skipping.') sys.stdout.flush() - ######## need to gather information about runtime environment - ExecCMD = ExecCMD.replace("$ncmd","1") + # need to gather information about runtime environment + ExecCMD = ExecCMD.replace("$ncmd", "1") os.environ['OMP_NUM_THREADS'] = str(NThreads) os.environ['ncmd'] = str(nFH) - ExecCMDMPI1 = ExecCMDMPI.replace("$ncmd",str(1)) - ExecCMDMPI = ExecCMDMPI.replace("$ncmd",str(nFH)) - ExecCMDLevs = ExecCMDMPI.replace("$ncmd",str(levs)) - ExecCMDMPI10 = ExecCMDMPI.replace("$ncmd",str(10)) + ExecCMDMPI1 = ExecCMDMPI.replace("$ncmd", str(1)) + ExecCMDMPI = ExecCMDMPI.replace("$ncmd", str(nFH)) + ExecCMDLevs = ExecCMDMPI.replace("$ncmd", str(levs)) + ExecCMDMPI10 = ExecCMDMPI.replace("$ncmd", str(10)) # are we using mpirun with lsf, srun, or aprun with Cray? launcher = ExecCMDMPI.split(' ')[0] if launcher == 'mpirun': - hostfile = os.getenv('LSB_DJOB_HOSTFILE','') + hostfile = os.getenv('LSB_DJOB_HOSTFILE', '') with open(hostfile) as f: hosts_tmp = f.readlines() hosts_tmp = [x.strip() for x in hosts_tmp] hosts = [] [hosts.append(x) for x in hosts_tmp if x not in hosts] nhosts = len(hosts) - ExecCMDMPI_host = 'mpirun -np '+str(nFH)+' --hostfile hosts' - tasks = int(os.getenv('LSB_DJOB_NUMPROC',1)) + ExecCMDMPI_host = 'mpirun -np ' + str(nFH) + ' --hostfile hosts' + tasks = int(os.getenv('LSB_DJOB_NUMPROC', 1)) if levs > tasks: - ExecCMDMPILevs_host = 'mpirun -np '+str(tasks)+' --hostfile hosts' - ExecCMDMPILevs_nohost = 'mpirun -np '+str(tasks) + ExecCMDMPILevs_host = 'mpirun -np ' + str(tasks) + ' --hostfile hosts' + ExecCMDMPILevs_nohost = 'mpirun -np ' + str(tasks) else: - ExecCMDMPILevs_host = 'mpirun -np '+str(levs)+' --hostfile hosts' - ExecCMDMPILevs_nohost = 'mpirun -np '+str(levs) + ExecCMDMPILevs_host = 'mpirun -np ' + str(levs) + ' --hostfile hosts' + ExecCMDMPILevs_nohost = 'mpirun -np ' + str(levs) ExecCMDMPI1_host = 'mpirun -np 1 --hostfile hosts' ExecCMDMPI10_host = 'mpirun -np 10 --hostfile hosts' elif launcher == 'mpiexec': - 
hostfile = os.getenv('PBS_NODEFILE','') + hostfile = os.getenv('PBS_NODEFILE', '') with open(hostfile) as f: hosts_tmp = f.readlines() hosts_tmp = [x.strip() for x in hosts_tmp] hosts = [] [hosts.append(x) for x in hosts_tmp if x not in hosts] nhosts = len(hosts) - ExecCMDMPI_host = 'mpiexec -l -n '+str(nFH) - tasks = int(os.getenv('ntasks',1)) + ExecCMDMPI_host = 'mpiexec -l -n ' + str(nFH) + tasks = int(os.getenv('ntasks', 1)) print('nhosts,tasks=', nhosts, tasks) if levs > tasks: - ExecCMDMPILevs_host = 'mpiexec -l -n '+str(tasks) - ExecCMDMPILevs_nohost = 'mpiexec -l -n '+str(tasks) + ExecCMDMPILevs_host = 'mpiexec -l -n ' + str(tasks) + ExecCMDMPILevs_nohost = 'mpiexec -l -n ' + str(tasks) else: - ExecCMDMPILevs_host = 'mpiexec -l -n '+str(levs) - ExecCMDMPILevs_nohost = 'mpiexec -l -n '+str(levs) - ExecCMDMPI1_host = 'mpiexec -l -n 1 --cpu-bind depth --depth '+str(NThreads) - ExecCMDMPI10_host = 'mpiexec -l -n 10 --cpu-bind depth --depth '+str(NThreads) + ExecCMDMPILevs_host = 'mpiexec -l -n ' + str(levs) + ExecCMDMPILevs_nohost = 'mpiexec -l -n ' + str(levs) + ExecCMDMPI1_host = 'mpiexec -l -n 1 --cpu-bind depth --depth ' + str(NThreads) + ExecCMDMPI10_host = 'mpiexec -l -n 10 --cpu-bind depth --depth ' + str(NThreads) elif launcher == 'srun': - nodes = os.getenv('SLURM_JOB_NODELIST','') - hosts_tmp = subprocess.check_output('scontrol show hostnames '+nodes, shell=True) + nodes = os.getenv('SLURM_JOB_NODELIST', '') + hosts_tmp = subprocess.check_output('scontrol show hostnames ' + nodes, shell=True) if (sys.version_info > (3, 0)): hosts_tmp = hosts_tmp.decode('utf-8') hosts_tmp = str(hosts_tmp).splitlines() @@ -196,165 +188,163 @@ def calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, hosts = [] [hosts.append(x) for x in hosts_tmp if x not in hosts] nhosts = len(hosts) - ExecCMDMPI_host = 'srun -n '+str(nFH)+' --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' + ExecCMDMPI_host = 'srun -n ' + str(nFH) + ' --verbose 
--export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' # need to account for when fewer than LEVS tasks are available - tasks = int(os.getenv('SLURM_NPROCS',1)) + tasks = int(os.getenv('SLURM_NPROCS', 1)) if levs > tasks: - ExecCMDMPILevs_host = 'srun -n '+str(tasks)+' --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' - ExecCMDMPILevs_nohost = 'srun -n '+str(tasks)+' --verbose --export=ALL' + ExecCMDMPILevs_host = 'srun -n ' + str(tasks) + ' --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' + ExecCMDMPILevs_nohost = 'srun -n ' + str(tasks) + ' --verbose --export=ALL' else: - ExecCMDMPILevs_host = 'srun -n '+str(levs)+' --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' - ExecCMDMPILevs_nohost = 'srun -n '+str(levs)+' --verbose --export=ALL' + ExecCMDMPILevs_host = 'srun -n ' + str(levs) + ' --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' + ExecCMDMPILevs_nohost = 'srun -n ' + str(levs) + ' --verbose --export=ALL' ExecCMDMPI1_host = 'srun -n 1 --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' ExecCMDMPI10_host = 'srun -n 10 --verbose --export=ALL -c 1 --distribution=arbitrary --cpu-bind=cores' elif launcher == 'aprun': - hostfile = os.getenv('LSB_DJOB_HOSTFILE','') + hostfile = os.getenv('LSB_DJOB_HOSTFILE', '') with open(hostfile) as f: hosts_tmp = f.readlines() hosts_tmp = [x.strip() for x in hosts_tmp] hosts = [] [hosts.append(x) for x in hosts_tmp if x not in hosts] nhosts = len(hosts) - ExecCMDMPI_host = 'aprun -l hosts -d '+str(NThreads)+' -n '+str(nFH) - ExecCMDMPILevs_host = 'aprun -l hosts -d '+str(NThreads)+' -n '+str(levs) - ExecCMDMPILevs_nohost = 'aprun -d '+str(NThreads)+' -n '+str(levs) - ExecCMDMPI1_host = 'aprun -l hosts -d '+str(NThreads)+' -n 1' - ExecCMDMPI10_host = 'aprun -l hosts -d '+str(NThreads)+' -n 10' + ExecCMDMPI_host = 'aprun -l hosts -d ' + str(NThreads) + ' -n ' + str(nFH) + ExecCMDMPILevs_host = 'aprun -l hosts -d ' + 
str(NThreads) + ' -n ' + str(levs) + ExecCMDMPILevs_nohost = 'aprun -d ' + str(NThreads) + ' -n ' + str(levs) + ExecCMDMPI1_host = 'aprun -l hosts -d ' + str(NThreads) + ' -n 1' + ExecCMDMPI10_host = 'aprun -l hosts -d ' + str(NThreads) + ' -n 10' else: print('unknown MPI launcher. Failure.') sys.exit(1) - ####### generate the full resolution analysis + # generate the full resolution analysis ihost = 0 - ### interpolate increment to full background resolution + # interpolate increment to full background resolution for fh in IAUHH: # first check to see if increment file exists - CalcAnlDir = RunDir+'/calcanl_'+format(fh, '02') - if (os.path.isfile(CalcAnlDir+'/siginc.nc.'+format(fh, '02'))): - print('Interpolating increment for f'+format(fh, '03')) + CalcAnlDir = RunDir + '/calcanl_' + format(fh, '02') + if (os.path.isfile(CalcAnlDir + '/siginc.nc.' + format(fh, '02'))): + print('Interpolating increment for f' + format(fh, '03')) # set up the namelist namelist = OrderedDict() namelist["setup"] = {"lon_out": LonB, - "lat_out": LatB, - "lev": levs, - "infile": "'siginc.nc."+format(fh, '02')+"'", - "outfile": "'inc.fullres."+format(fh, '02')+"'", - } - gsi_utils.write_nml(namelist, CalcAnlDir+'/fort.43') + "lat_out": LatB, + "lev": levs, + "infile": "'siginc.nc." + format(fh, '02') + "'", + "outfile": "'inc.fullres." 
+ format(fh, '02') + "'", + } + gsi_utils.write_nml(namelist, CalcAnlDir + '/fort.43') if ihost >= nhosts: ihost = 0 - with open(CalcAnlDir+'/hosts', 'w') as hostfile: - hostfile.write(hosts[ihost]+'\n') - if launcher == 'srun': # need to write host per task not per node for slurm + with open(CalcAnlDir + '/hosts', 'w') as hostfile: + hostfile.write(hosts[ihost] + '\n') + if launcher == 'srun': # need to write host per task not per node for slurm # For xjet, each instance of chgres_inc must run on two nodes each - if os.getenv('SLURM_JOB_PARTITION','') == 'xjet': - for a in range(0,4): - hostfile.write(hosts[ihost]+'\n') - ihost+=1 - for a in range(0,5): - hostfile.write(hosts[ihost]+'\n') - for a in range(0,9): # need 9 more of the same host for the 10 tasks for chgres_inc - hostfile.write(hosts[ihost]+'\n') + if os.getenv('SLURM_JOB_PARTITION', '') == 'xjet': + for a in range(0, 4): + hostfile.write(hosts[ihost] + '\n') + ihost += 1 + for a in range(0, 5): + hostfile.write(hosts[ihost] + '\n') + for a in range(0, 9): # need 9 more of the same host for the 10 tasks for chgres_inc + hostfile.write(hosts[ihost] + '\n') if launcher == 'srun': - os.environ['SLURM_HOSTFILE'] = CalcAnlDir+'/hosts' + os.environ['SLURM_HOSTFILE'] = CalcAnlDir + '/hosts' print('interp_inc', fh, namelist) - job = subprocess.Popen(ExecCMDMPI10_host+' '+CalcAnlDir+'/chgres_inc.x', shell=True, cwd=CalcAnlDir) - print(ExecCMDMPI10_host+' '+CalcAnlDir+'/chgres_inc.x submitted on '+hosts[ihost]) + job = subprocess.Popen(ExecCMDMPI10_host + ' ' + CalcAnlDir + '/chgres_inc.x', shell=True, cwd=CalcAnlDir) + print(ExecCMDMPI10_host + ' ' + CalcAnlDir + '/chgres_inc.x submitted on ' + hosts[ihost]) sys.stdout.flush() ec = job.wait() if ec != 0: - print('Error with chgres_inc.x at forecast hour: f'+format(fh, '03')) - print('Error with chgres_inc.x, exit code='+str(ec)) + print('Error with chgres_inc.x at forecast hour: f' + format(fh, '03')) + print('Error with chgres_inc.x, exit code=' + str(ec)) 
print(locals()) sys.exit(ec) - ihost+=1 + ihost += 1 else: - print('f'+format(fh, '03')+' is in $IAUFHRS but increment file is missing. Skipping.') + print('f' + format(fh, '03') + ' is in $IAUFHRS but increment file is missing. Skipping.') - #### generate analysis from interpolated increment - CalcAnlDir6 = RunDir+'/calcanl_'+format(6, '02') + # generate analysis from interpolated increment + CalcAnlDir6 = RunDir + '/calcanl_' + format(6, '02') # set up the namelist namelist = OrderedDict() - namelist["setup"] = {"datapath": "'./'", - "analysis_filename": "'anl'", - "firstguess_filename": "'ges'", - "increment_filename": "'inc.fullres'", - "fhr": 6, - } + namelist["setup"] = {"datapath": "'./'", + "analysis_filename": "'anl'", + "firstguess_filename": "'ges'", + "increment_filename": "'inc.fullres'", + "fhr": 6, + } - gsi_utils.write_nml(namelist, CalcAnlDir6+'/calc_analysis.nml') + gsi_utils.write_nml(namelist, CalcAnlDir6 + '/calc_analysis.nml') # run the executable - if ihost >= nhosts-1: + if ihost >= nhosts - 1: ihost = 0 if launcher == 'srun': del os.environ['SLURM_HOSTFILE'] print('fullres_calc_anl', namelist) - fullres_anl_job = subprocess.Popen(ExecCMDMPILevs_nohost+' '+CalcAnlDir6+'/calc_anl.x', shell=True, cwd=CalcAnlDir6) - print(ExecCMDMPILevs_nohost+' '+CalcAnlDir6+'/calc_anl.x submitted') + fullres_anl_job = subprocess.Popen(ExecCMDMPILevs_nohost + ' ' + CalcAnlDir6 + '/calc_anl.x', shell=True, cwd=CalcAnlDir6) + print(ExecCMDMPILevs_nohost + ' ' + CalcAnlDir6 + '/calc_anl.x submitted') sys.stdout.flush() exit_fullres = fullres_anl_job.wait() sys.stdout.flush() if exit_fullres != 0: - print('Error with calc_analysis.x for deterministic resolution, exit code='+str(exit_fullres)) + print('Error with calc_analysis.x for deterministic resolution, exit code=' + str(exit_fullres)) print(locals()) sys.exit(exit_fullres) - - ######## compute determinstic analysis on ensemble resolution - if Cdump == "gdas": + # compute deterministic analysis on ensemble
resolution + if Cdump in ["gdas", "gfs"]: chgres_jobs = [] for fh in IAUHH: # first check to see if guess file exists - CalcAnlDir6 = RunDir+'/calcanl_ensres_06' - print(CalcAnlDir6+'/ges.ensres.'+format(fh, '02')) - if (os.path.isfile(CalcAnlDir6+'/ges.ensres.'+format(fh, '02'))): - print('Calculating analysis on ensemble resolution for f'+format(fh, '03')) - ######## generate ensres analysis from interpolated background + CalcAnlDir6 = RunDir + '/calcanl_ensres_06' + print(CalcAnlDir6 + '/ges.ensres.' + format(fh, '02')) + if (os.path.isfile(CalcAnlDir6 + '/ges.ensres.' + format(fh, '02'))): + print('Calculating analysis on ensemble resolution for f' + format(fh, '03')) + # generate ensres analysis from interpolated background # set up the namelist namelist = OrderedDict() - namelist["setup"] = {"datapath": "'./'", - "analysis_filename": "'anl.ensres'", - "firstguess_filename": "'ges.ensres'", - "increment_filename": "'siginc.nc'", - "fhr": fh, - } + namelist["setup"] = {"datapath": "'./'", + "analysis_filename": "'anl.ensres'", + "firstguess_filename": "'ges.ensres'", + "increment_filename": "'siginc.nc'", + "fhr": fh, + } - gsi_utils.write_nml(namelist, CalcAnlDir6+'/calc_analysis.nml') + gsi_utils.write_nml(namelist, CalcAnlDir6 + '/calc_analysis.nml') # run the executable - if ihost > nhosts-1: + if ihost > nhosts - 1: ihost = 0 print('ensres_calc_anl', namelist) - ensres_anl_job = subprocess.Popen(ExecCMDMPILevs_nohost+' '+CalcAnlDir6+'/calc_anl.x', shell=True, cwd=CalcAnlDir6) - print(ExecCMDMPILevs_nohost+' '+CalcAnlDir6+'/calc_anl.x submitted') + ensres_anl_job = subprocess.Popen(ExecCMDMPILevs_nohost + ' ' + CalcAnlDir6 + '/calc_anl.x', shell=True, cwd=CalcAnlDir6) + print(ExecCMDMPILevs_nohost + ' ' + CalcAnlDir6 + '/calc_anl.x submitted') sys.stdout.flush() - ####### check on analysis steps + # check on analysis steps exit_ensres = ensres_anl_job.wait() if exit_ensres != 0: - print('Error with calc_analysis.x for ensemble resolution, exit 
code='+str(exit_ensres)) + print('Error with calc_analysis.x for ensemble resolution, exit code=' + str(exit_ensres)) print(locals()) sys.exit(exit_ensres) else: - print('f'+format(fh, '03')+' is in $IAUFHRS but ensemble resolution guess file is missing. Skipping.') + print('f' + format(fh, '03') + ' is in $IAUFHRS but ensemble resolution guess file is missing. Skipping.') - print('calcanl_gfs successfully completed at: ',datetime.datetime.utcnow()) + print('calcanl_gfs successfully completed at: ', datetime.datetime.utcnow()) print(locals()) + # run the function if this script is called from the command line if __name__ == '__main__': DoIAU = gsi_utils.isTrue(os.getenv('DOIAU', 'NO')) l4DEnsVar = gsi_utils.isTrue(os.getenv('l4densvar', 'NO')) Write4Danl = gsi_utils.isTrue(os.getenv('lwrite4danl', 'NO')) - ComIn_Ges = os.getenv('COMIN_GES', './') + ComIn_Ges = os.getenv('COM_ATMOS_HISTORY_PREV', './') GPrefix = os.getenv('GPREFIX', './') - GSuffix = os.getenv('GSUFFIX', './') - ComOut = os.getenv('COMOUT', './') + ComOut = os.getenv('COM_ATMOS_ANALYSIS', './') APrefix = os.getenv('APREFIX', '') - ASuffix= os.getenv('ASUFFIX', '') NThreads = os.getenv('NTHREADS_CHGRES', 1) FixDir = os.getenv('FIXgsm', './') atmges_ens_mean = os.getenv('ATMGES_ENSMEAN', './atmges_ensmean') @@ -363,13 +353,13 @@ def calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, ExecCMDMPI = os.getenv('APRUN_CALCINC', '') ExecAnl = os.getenv('CALCANLEXEC', './calc_analysis.x') ExecChgresInc = os.getenv('CHGRESINCEXEC', './interp_inc.x') - NEMSGet = os.getenv('NEMSIOGET','nemsio_get') - IAUHrs = list(map(int,os.getenv('IAUFHRS','6').split(','))) + NEMSGet = os.getenv('NEMSIOGET', 'nemsio_get') + IAUHrs = list(map(int, os.getenv('IAUFHRS', '6').split(','))) Cdump = os.getenv('CDUMP', 'gdas') print(locals()) - calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, - ComIn_Ges, GPrefix, GSuffix, + calcanl_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, + ComIn_Ges, 
GPrefix, FixDir, atmges_ens_mean, RunDir, NThreads, NEMSGet, IAUHrs, ExecCMD, ExecCMDMPI, ExecAnl, ExecChgresInc, Cdump) diff --git a/ush/calcinc_gfs.py b/ush/calcinc_gfs.py index 0306d9f39f1..cb334ac8361 100755 --- a/ush/calcinc_gfs.py +++ b/ush/calcinc_gfs.py @@ -12,79 +12,76 @@ from collections import OrderedDict # main function -def calcinc_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, IAUHrs, + + +def calcinc_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, IAUHrs, NThreads, IMP_Physics, Inc2Zero, RunDir, Exec, ExecCMD): - # run the calc_increment_ens executable + # run the calc_increment_ens executable - # copy and link files - if DoIAU and l4DEnsVar and Write4Danl: - nFH=0 - for fh in IAUHrs: - nFH+=1 - if fh == 6: - gsi_utils.link_file('sigf06', 'atmges_mem'+format(nFH, '03')) - gsi_utils.link_file('siganl', 'atmanl_mem'+format(nFH, '03')) - gsi_utils.link_file(ComOut+'/'+APrefix+'atminc.nc', 'atminc_mem'+format(nFH, '03')) - else: - gsi_utils.link_file('sigf'+format(fh, '02'), 'atmges_mem'+format(nFH, '03')) - gsi_utils.link_file('siga'+format(fh, '02'), 'atmanl_mem'+format(nFH, '03')) - gsi_utils.link_file(ComOut+'/'+APrefix+'atmi'+format(fh, '03')+'.nc', 'atminc_mem'+format(nFH, '03')) - else: - nFH=1 - gsi_utils.link_file('sigf06', 'atmges_mem001') - gsi_utils.link_file('siganl', 'atmanl_mem001') - gsi_utils.link_file(ComOut+'/'+APrefix+'atminc', 'atminc_mem001') - os.environ['OMP_NUM_THREADS'] = str(NThreads) - os.environ['ncmd'] = str(nFH) - shutil.copy(Exec,RunDir+'/calc_inc.x') - ExecCMD = ExecCMD.replace("$ncmd",str(nFH)) + # copy and link files + if DoIAU and l4DEnsVar and Write4Danl: + nFH = 0 + for fh in IAUHrs: + nFH += 1 + if fh == 6: + gsi_utils.link_file('sigf06', 'atmges_mem' + format(nFH, '03')) + gsi_utils.link_file('siganl', 'atmanl_mem' + format(nFH, '03')) + gsi_utils.link_file(ComOut + '/' + APrefix + 'atminc.nc', 'atminc_mem' + format(nFH, '03')) + else: + gsi_utils.link_file('sigf' + format(fh, '02'), 'atmges_mem' + 
format(nFH, '03')) + gsi_utils.link_file('siga' + format(fh, '02'), 'atmanl_mem' + format(nFH, '03')) + gsi_utils.link_file(ComOut + '/' + APrefix + 'atmi' + format(fh, '03') + '.nc', 'atminc_mem' + format(nFH, '03')) + else: + nFH = 1 + gsi_utils.link_file('sigf06', 'atmges_mem001') + gsi_utils.link_file('siganl', 'atmanl_mem001') + gsi_utils.link_file(ComOut + '/' + APrefix + 'atminc', 'atminc_mem001') + os.environ['OMP_NUM_THREADS'] = str(NThreads) + os.environ['ncmd'] = str(nFH) + shutil.copy(Exec, RunDir + '/calc_inc.x') + ExecCMD = ExecCMD.replace("$ncmd", str(nFH)) - # set up the namelist - namelist = OrderedDict() - namelist["setup"] = {"datapath": "'./'", - "analysis_filename": "'atmanl'", - "firstguess_filename": "'atmges'", - "increment_filename": "'atminc'", - "debug": ".false.", - "nens": str(nFH), - "imp_physics": str(IMP_Physics)} + # set up the namelist + namelist = OrderedDict() + namelist["setup"] = {"datapath": "'./'", + "analysis_filename": "'atmanl'", + "firstguess_filename": "'atmges'", + "increment_filename": "'atminc'", + "debug": ".false.", + "nens": str(nFH), + "imp_physics": str(IMP_Physics)} - namelist["zeroinc"] = {"incvars_to_zero": Inc2Zero} - - gsi_utils.write_nml(namelist, RunDir+'/calc_increment.nml') + namelist["zeroinc"] = {"incvars_to_zero": Inc2Zero} + + gsi_utils.write_nml(namelist, RunDir + '/calc_increment.nml') + + # run the executable + try: + err = subprocess.check_call(ExecCMD + ' ' + RunDir + '/calc_inc.x', shell=True) + print(locals()) + except subprocess.CalledProcessError as e: + print('Error with calc_inc.x, exit code=' + str(e.returncode)) + print(locals()) + sys.exit(e.returncode) - # run the executable - try: - err = subprocess.check_call(ExecCMD+' '+RunDir+'/calc_inc.x', shell=True) - print(locals()) - except subprocess.CalledProcessError as e: - print('Error with calc_inc.x, exit code='+str(e.returncode)) - print(locals()) - sys.exit(e.returncode) # run the function if this script is called from the command 
line if __name__ == '__main__': - DoIAU = gsi_utils.isTrue(os.getenv('DOIAU', 'NO')) - l4DEnsVar = gsi_utils.isTrue(os.getenv('l4densvar', 'NO')) - Write4Danl = gsi_utils.isTrue(os.getenv('lwrite4danl', 'NO')) - ComOut = os.getenv('COMOUT', './') - APrefix = os.getenv('APREFIX', '') - ASuffix= os.getenv('ASUFFIX', '') - NThreads = os.getenv('NTHREADS_CALCINC', 1) - IMP_Physics = os.getenv('imp_physics', 11) - RunDir = os.getenv('DATA', './') - ExecNC = os.getenv('CALCINCNCEXEC', './calc_increment_ens_ncio.x') - ExecNEMS = os.getenv('CALCINCEXEC', './calc_increment_ens.x') - Inc2Zero = os.getenv('INCREMENTS_TO_ZERO', '"NONE"') - ExecCMD = os.getenv('APRUN_CALCINC', '') - IAUHrs = list(map(int,os.getenv('IAUFHRS','6').split(','))) + DoIAU = gsi_utils.isTrue(os.getenv('DOIAU', 'NO')) + l4DEnsVar = gsi_utils.isTrue(os.getenv('l4densvar', 'NO')) + Write4Danl = gsi_utils.isTrue(os.getenv('lwrite4danl', 'NO')) + ComOut = os.getenv('COM_ATMOS_ANALYSIS', './') + APrefix = os.getenv('APREFIX', '') + NThreads = os.getenv('NTHREADS_CALCINC', 1) + IMP_Physics = os.getenv('imp_physics', 11) + RunDir = os.getenv('DATA', './') + ExecNC = os.getenv('CALCINCNCEXEC', './calc_increment_ens_ncio.x') + Inc2Zero = os.getenv('INCREMENTS_TO_ZERO', '"NONE"') + ExecCMD = os.getenv('APRUN_CALCINC', '') + IAUHrs = list(map(int, os.getenv('IAUFHRS', '6').split(','))) - # determine if the analysis is in netCDF or NEMSIO - if ASuffix == ".nc": - Exec = ExecNC - else: - Exec = ExecNEMS + Exec = ExecNC - print(locals()) - calcinc_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, ASuffix, IAUHrs, - NThreads, IMP_Physics, Inc2Zero, RunDir, Exec, ExecCMD) + print(locals()) + calcinc_gfs(DoIAU, l4DEnsVar, Write4Danl, ComOut, APrefix, IAUHrs, + NThreads, IMP_Physics, Inc2Zero, RunDir, Exec, ExecCMD) diff --git a/ush/compare_f90nml.py b/ush/compare_f90nml.py new file mode 100755 index 00000000000..f3c5573a927 --- /dev/null +++ b/ush/compare_f90nml.py @@ -0,0 +1,107 @@ +#!/usr/bin/env python3 + +import 
json +import f90nml +from typing import Dict +from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter + + +def get_dict_from_nml(filename: str) -> Dict: + """ + Read a F90 namelist and convert to a dictionary. + This method uses json to convert OrderedDictionary into regular dictionary + Parameters + ---------- + filename: str + Name of the F90 namelist + Returns + ------- + dictionary: Dict + F90 namelist returned as a dictionary + """ + return json.loads(json.dumps(f90nml.read(filename).todict())) + + +def compare_dicts(dict1: Dict, dict2: Dict, path: str = "") -> None: + """ + Compare 2 dictionaries. + This is done by looping over keys in dictionary 1 and searching for them + in dictionary 2. + If a matching key is found, the values are compared. + If a matching key is not found, it is marked as UNDEFINED. + Note: A reverse match is not performed in this method. For reverse matching, use the -r option in the main driver. + Note: This is a recursive method to handle nested dictionaries.
+ Parameters + ---------- + dict1: Dict + First dictionary + dict2: Dict + Second dictionary + path: str (optional) + default: "" + key (if nested dictionary) + Returns + ------- + None + """ + + result = dict() + for kk in dict1.keys(): # Loop over all keys of first dictionary + if kk in dict2.keys(): # kk is present in dict2 + if isinstance(dict1[kk], dict): # nested dictionary, go deeper + compare_dicts(dict1[kk], dict2[kk], path=kk) + else: + if dict1[kk] != dict2[kk]: + if path not in result: + result[path] = dict() + result[path][kk] = [dict1[kk], dict2[kk]] + else: # kk is *not* present in dict2 + tt = path if path else kk + if tt not in result: + result[tt] = dict() + result[tt][kk] = [dict1[kk], 'UNDEFINED'] + + def _print_diffs(diff_dict: Dict) -> None: + """ + Print the differences between the two dictionaries to stdout + Parameters + ---------- + diff_dict: Dict + Dictionary containing differences + Returns + ------- + None + """ + for path in diff_dict.keys(): + print(f"{path}:") + max_len = len(max(diff_dict[path], key=len)) + for kk in diff_dict[path].keys(): + items = diff_dict[path][kk] + print( + f"{kk:>{max_len+2}} : {' | '.join(map(str, diff_dict[path][kk]))}") + + _print_diffs(result) + + +if __name__ == "__main__": + + parser = ArgumentParser( + description=("Compare two Fortran namelists and display differences (left_namelist - right_namelist)"), + formatter_class=ArgumentDefaultsHelpFormatter) + parser.add_argument('left_namelist', type=str, help="Left namelist to compare") + parser.add_argument('right_namelist', type=str, help="Right namelist to compare") + parser.add_argument('-r', '--reverse', help='reverse diff (right_namelist - left_namelist)', + action='store_true', required=False) + args = parser.parse_args() + + nml1, nml2 = args.left_namelist, args.right_namelist + if args.reverse: + nml2, nml1 = nml1, nml2 + + dict1 = get_dict_from_nml(nml1) + dict2 = get_dict_from_nml(nml2) + + msg = f"comparing: {nml1} | {nml2}" + print(msg) + 
print("-" * len(msg)) + compare_dicts(dict1, dict2) diff --git a/ush/detect_machine.sh b/ush/detect_machine.sh new file mode 100755 index 00000000000..647722b7a3e --- /dev/null +++ b/ush/detect_machine.sh @@ -0,0 +1,73 @@ +#!/bin/bash + +# First detect w/ hostname +case $(hostname -f) in + + adecflow0[12].acorn.wcoss2.ncep.noaa.gov) MACHINE_ID=wcoss2 ;; ### acorn + alogin0[12].acorn.wcoss2.ncep.noaa.gov) MACHINE_ID=wcoss2 ;; ### acorn + clogin0[1-9].cactus.wcoss2.ncep.noaa.gov) MACHINE_ID=wcoss2 ;; ### cactus01-9 + clogin10.cactus.wcoss2.ncep.noaa.gov) MACHINE_ID=wcoss2 ;; ### cactus10 + dlogin0[1-9].dogwood.wcoss2.ncep.noaa.gov) MACHINE_ID=wcoss2 ;; ### dogwood01-9 + dlogin10.dogwood.wcoss2.ncep.noaa.gov) MACHINE_ID=wcoss2 ;; ### dogwood10 + + gaea9) MACHINE_ID=gaea ;; ### gaea9 + gaea1[0-6]) MACHINE_ID=gaea ;; ### gaea10-16 + gaea9.ncrc.gov) MACHINE_ID=gaea ;; ### gaea9 + gaea1[0-6].ncrc.gov) MACHINE_ID=gaea ;; ### gaea10-16 + + hfe0[1-9]) MACHINE_ID=hera ;; ### hera01-9 + hfe1[0-2]) MACHINE_ID=hera ;; ### hera10-12 + hecflow01) MACHINE_ID=hera ;; ### heraecflow01 + + s4-submit.ssec.wisc.edu) MACHINE_ID=s4 ;; ### s4 + + fe[1-8]) MACHINE_ID=jet ;; ### jet01-8 + tfe[12]) MACHINE_ID=jet ;; ### tjet1-2 + + Orion-login-[1-4].HPC.MsState.Edu) MACHINE_ID=orion ;; ### orion1-4 + + cheyenne[1-6].cheyenne.ucar.edu) MACHINE_ID=cheyenne ;; ### cheyenne1-6 + cheyenne[1-6].ib0.cheyenne.ucar.edu) MACHINE_ID=cheyenne ;; ### cheyenne1-6 + chadmin[1-6].ib0.cheyenne.ucar.edu) MACHINE_ID=cheyenne ;; ### cheyenne1-6 + + login[1-4].stampede2.tacc.utexas.edu) MACHINE_ID=stampede ;; ### stampede1-4 + + login0[1-2].expanse.sdsc.edu) MACHINE_ID=expanse ;; ### expanse1-2 + + discover3[1-5].prv.cube) MACHINE_ID=discover ;; ### discover31-35 + *) MACHINE_ID=UNKNOWN ;; # Unknown platform +esac + +# Overwrite auto-detect with MACHINE if set +MACHINE_ID=${MACHINE:-${MACHINE_ID}} + +# If MACHINE_ID is no longer UNKNNOWN, return it +if [[ "${MACHINE_ID}" != "UNKNOWN" ]]; then + return +fi + +# 
Try searching based on paths since hostname may not match on compute nodes +if [[ -d /lfs/f1 ]] ; then + # We are on NOAA Cactus or Dogwood + MACHINE_ID=wcoss2 +elif [[ -d /mnt/lfs1 ]] ; then + # We are on NOAA Jet + MACHINE_ID=jet +elif [[ -d /scratch1 ]] ; then + # We are on NOAA Hera + MACHINE_ID=hera +elif [[ -d /work ]] ; then + # We are on MSU Orion + MACHINE_ID=orion +elif [[ -d /glade ]] ; then + # We are on NCAR Yellowstone + MACHINE_ID=cheyenne +elif [[ -d /lustre && -d /ncrc ]] ; then + # We are on GAEA. + MACHINE_ID=gaea +elif [[ -d /data/prod ]] ; then + # We are on SSEC's S4 + MACHINE_ID=s4 +else + echo WARNING: UNKNOWN PLATFORM 1>&2 +fi diff --git a/ush/drive_makeprepbufr.sh b/ush/drive_makeprepbufr.sh deleted file mode 100755 index 31154c10bb7..00000000000 --- a/ush/drive_makeprepbufr.sh +++ /dev/null @@ -1,142 +0,0 @@ -#! /usr/bin/env bash - -############################################################### -# < next few lines under version control, D O N O T E D I T > -# $Date$ -# $Revision$ -# $Author$ -# $Id$ -############################################################### - -############################################################### -## Author: Rahul Mahajan Org: NCEP/EMC Date: April 2017 - -## Abstract: -## Prepare for analysis driver script -## EXPDIR : /full/path/to/config/files -## CDATE : current analysis date (YYYYMMDDHH) -## CDUMP : cycle name (gdas / gfs) -############################################################### - -source "$HOMEgfs/ush/preamble.sh" - -############################################################### -# Source relevant configs -configs="base prep prepbufr" -for config in $configs; do - . $EXPDIR/config.${config} - status=$? - [[ $status -ne 0 ]] && exit $status -done - -############################################################### -# Source machine runtime environment -. $BASE_ENV/${machine}.env prepbufr -status=$? 
-[[ $status -ne 0 ]] && exit $status - -############################################################### -KEEPDATA=${KEEPDATA:-"NO"} -DO_RELOCATE=${DO_RELOCATE:-"NO"} -DONST=${DONST:-"NO"} - -############################################################### -# Set script and dependency variables -COMPONENT=${COMPONENT:-atmos} - -GDATE=$($NDATE -$assim_freq $CDATE) - -cymd=$(echo $CDATE | cut -c1-8) -chh=$(echo $CDATE | cut -c9-10) -gymd=$(echo $GDATE | cut -c1-8) -ghh=$(echo $GDATE | cut -c9-10) - -OPREFIX="${CDUMP}.t${chh}z." -OSUFFIX=".bufr_d" -GPREFIX="gdas.t${ghh}z." -GSUFFIX=${GSUFFIX:-$SUFFIX} -APREFIX="${CDUMP}.t${chh}z." -ASUFFIX=${ASUFFIX:-$SUFFIX} - -COMIN_OBS=${COMIN_OBS:-"$DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc}/${COMPONENT}"} -COMIN_GES=${COMIN_GES:-"$ROTDIR/gdas.$gymd/$ghh/$COMPONENT"} -COMOUT=${COMOUT:-"$ROTDIR/$CDUMP.$cymd/$chh/$COMPONENT"} -[[ ! -d $COMOUT ]] && mkdir -p $COMOUT -export DATA="$RUNDIR/$CDATE/$CDUMP/prepbufr" -[[ -d $DATA ]] && rm -rf $DATA -mkdir -p $DATA -cd $DATA - -############################################################### -# MAKEPREPBUFRSH environment specific variables -export NEMSIO_IN=".true." 
-export COMSP="$DATA/" -export NET=$CDUMP - -############################################################### -# Link observation files in BUFRLIST -for bufrname in $BUFRLIST; do - $NLN $COMIN_OBS/${OPREFIX}${bufrname}.tm00$OSUFFIX ${bufrname}.tm00$OSUFFIX -done - -# Link first guess files -$NLN $COMIN_GES/${GPREFIX}atmf003${GSUFFIX} ./atmgm3$GSUFFIX -$NLN $COMIN_GES/${GPREFIX}atmf006${GSUFFIX} ./atmges$GSUFFIX -$NLN $COMIN_GES/${GPREFIX}atmf009${GSUFFIX} ./atmgp3$GSUFFIX - -[[ -f $COMIN_GES/${GPREFIX}atmf004${GSUFFIX} ]] && $NLN $COMIN_GES/${GPREFIX}atmf004${GSUFFIX} ./atmgm2$GSUFFIX -[[ -f $COMIN_GES/${GPREFIX}atmf005${GSUFFIX} ]] && $NLN $COMIN_GES/${GPREFIX}atmf005${GSUFFIX} ./atmgm1$GSUFFIX -[[ -f $COMIN_GES/${GPREFIX}atmf007${GSUFFIX} ]] && $NLN $COMIN_GES/${GPREFIX}atmf007${GSUFFIX} ./atmgp1$GSUFFIX -[[ -f $COMIN_GES/${GPREFIX}atmf008${GSUFFIX} ]] && $NLN $COMIN_GES/${GPREFIX}atmf008${GSUFFIX} ./atmgp2$GSUFFIX - -# If relocation is turned off: these files don't exist, touch them -if [ $DO_RELOCATE = "NO" ]; then - touch $DATA/tcvitals.relocate.tm00 - touch $DATA/tropcy_relocation_status.tm00 - echo "RECORDS PROCESSED" >> $DATA/tropcy_relocation_status.tm00 -fi - -############################################################### -# if PREPDATA is YES and -# 1. the aircft bufr file is not found, set PREPACQC to NO -# 2. the ****** bufr file is not found, set ******** to NO -if [ $PREPDATA = "YES" ]; then - [[ ! -s aircft.tm00$OSUFFIX ]] && export PREPACQC="NO" -fi - -############################################################### -# Execute MAKEPREPBUFRSH - -echo $(date) EXECUTING $MAKEPREPBUFRSH $CDATE >&2 -$MAKEPREPBUFRSH $CDATE -status=$? 
-echo $(date) EXITING $MAKEPREPBUFRSH with return code $status >&2 -[[ $status -ne 0 ]] && exit $status - -############################################################### -# Create nsstbufr file -if [ $DONST = "YES" ]; then - SFCSHPBF=${SFCSHPBF:-$COMIN_OBS/sfcshp.$CDUMP.$CDATE} - TESACBF=${TESACBF:-$COMIN_OBS/tesac.$CDUMP.$CDATE} - BATHYBF=${BATHYBF:-$COMIN_OBS/bathy.$CDUMP.$CDATE} - TRKOBBF=${TRKOBBF:-$COMIN_OBS/trkob.$CDUMP.$CDATE} - NSSTBF=${NSSTBF:-$COMOUT/${APREFIX}nsstbufr} - - cat $SFCSHPBF $TESACBF $BATHYBF $TRKOBBF > $NSSTBF - status=$? - echo $(date) CREATE $NSSTBF with return code $status >&2 - - # NSST bufr file must be restricted since it contains unmasked ship ids - chmod 640 $NSSTBF - $CHGRP_CMD $NSSTBF -fi -############################################################### -# Copy prepbufr and prepbufr.acft_profiles to COMOUT -$NCP $DATA/prepda.t${chh}z $COMOUT/${APREFIX}prepbufr -$NCP $DATA/prepbufr.acft_profiles $COMOUT/${APREFIX}prepbufr.acft_profiles - -############################################################### -# Exit out cleanly -if [ $KEEPDATA = "NO" ] ; then rm -rf $DATA ; fi - -exit 0 diff --git a/ush/forecast_det.sh b/ush/forecast_det.sh index f3823cde992..06329e07626 100755 --- a/ush/forecast_det.sh +++ b/ush/forecast_det.sh @@ -19,65 +19,61 @@ FV3_GFS_det(){ res_latlon_dynamics="''" # Determine if this is a warm start or cold start - if [ -f $gmemdir/RESTART/${sPDY}.${scyc}0000.coupler.res ]; then + if [[ -f "${COM_ATMOS_RESTART_PREV}/${sPDY}.${scyc}0000.coupler.res" ]]; then export warm_start=".true." fi # turn IAU off for cold start DOIAU_coldstart=${DOIAU_coldstart:-"NO"} - if [ $DOIAU = "YES" -a $warm_start = ".false." ] || [ $DOIAU_coldstart = "YES" -a $warm_start = ".true." ]; then + if [ ${DOIAU} = "YES" -a ${warm_start} = ".false." ] || [ ${DOIAU_coldstart} = "YES" -a ${warm_start} = ".true." 
]; then export DOIAU="NO" - echo "turning off IAU since warm_start = $warm_start" + echo "turning off IAU since warm_start = ${warm_start}" DOIAU_coldstart="YES" IAU_OFFSET=0 - sCDATE=$CDATE - sPDY=$PDY - scyc=$cyc - tPDY=$sPDY - tcyc=$cyc + sCDATE=${CDATE} + sPDY=${PDY} + scyc=${cyc} + tPDY=${sPDY} + tcyc=${cyc} fi #------------------------------------------------------- # determine if restart IC exists to continue from a previous forecast - RERUN="NO" - filecount=$(find $RSTDIR_ATM -type f | wc -l) - if [ $CDUMP = "gfs" -a $rst_invt1 -gt 0 -a $FHMAX -gt $rst_invt1 -a $filecount -gt 10 ]; then + RERUN=${RERUN:-"NO"} + filecount=$(find "${COM_ATMOS_RESTART:-/dev/null}" -type f | wc -l) + if [[ ( ${CDUMP} = "gfs" || ( ${RUN} = "gefs" && ${CDATE_RST} = "" )) && ${rst_invt1} -gt 0 && ${FHMAX} -gt ${rst_invt1} && ${filecount} -gt 10 ]]; then reverse=$(echo "${restart_interval[@]} " | tac -s ' ') - for xfh in $reverse ; do + for xfh in ${reverse} ; do yfh=$((xfh-(IAU_OFFSET/2))) - SDATE=$($NDATE +$yfh $CDATE) - PDYS=$(echo $SDATE | cut -c1-8) - cycs=$(echo $SDATE | cut -c9-10) - flag1=$RSTDIR_ATM/${PDYS}.${cycs}0000.coupler.res - flag2=$RSTDIR_ATM/coupler.res + SDATE=$(${NDATE} ${yfh} "${CDATE}") + PDYS=$(echo "${SDATE}" | cut -c1-8) + cycs=$(echo "${SDATE}" | cut -c9-10) + flag1=${COM_ATMOS_RESTART}/${PDYS}.${cycs}0000.coupler.res + flag2=${COM_ATMOS_RESTART}/coupler.res #make sure that the wave restart files also exist if cplwav=true waverstok=".true." - if [ $cplwav = ".true." ]; then - for wavGRD in $waveGRD ; do - if [ ! -f ${RSTDIR_WAVE}/${PDYS}.${cycs}0000.restart.${wavGRD} ]; then + if [[ "${cplwav}" = ".true." ]]; then + for wavGRD in ${waveGRD} ; do + if [[ ! -f "${COM_WAVE_RESTART}/${PDYS}.${cycs}0000.restart.${wavGRD}" ]]; then waverstok=".false." fi done fi - if [ -s $flag1 -a $waverstok = ".true." 
]; then - CDATE_RST=$SDATE - [[ $RERUN = "YES" ]] && break - mv $flag1 ${flag1}.old - if [ -s $flag2 ]; then mv $flag2 ${flag2}.old ;fi + if [[ -s "${flag1}" ]] && [[ ${waverstok} = ".true." ]]; then + CDATE_RST=${SDATE} + [[ ${RERUN} = "YES" ]] && break + mv "${flag1}" "${flag1}.old" + if [[ -s "${flag2}" ]]; then mv "${flag2}" "${flag2}.old" ;fi RERUN="YES" - [[ $xfh = $rst_invt1 ]] && RERUN="NO" + [[ ${xfh} = ${rst_invt1} ]] && RERUN="NO" fi done fi #------------------------------------------------------- } -FV3_GEFS_det(){ - echo "SUB ${FUNCNAME[0]}: Defining variables for FV3GEFS" -} - WW3_det(){ echo "SUB ${FUNCNAME[0]}: Run type determination for WW3" } diff --git a/ush/forecast_postdet.sh b/ush/forecast_postdet.sh index 2e00646d1f7..bbd9cbab872 100755 --- a/ush/forecast_postdet.sh +++ b/ush/forecast_postdet.sh @@ -11,11 +11,6 @@ ## for execution. ##### -FV3_GEFS_postdet(){ - echo SUB ${FUNCNAME[0]}: Linking input data for FV3 $RUN - # soft link commands insert here -} - DATM_postdet(){ ###################################################################### # Link DATM inputs (ie forcing files) # @@ -45,26 +40,25 @@ FV3_GFS_postdet(){ if [ $RERUN = "NO" ]; then #............................. - # Link all (except sfc_data) restart files from $gmemdir - for file in $(ls $gmemdir/RESTART/${sPDY}.${scyc}0000.*.nc); do + # Link all restart files from previous cycle + for file in "${COM_ATMOS_RESTART_PREV}/${sPDY}.${scyc}0000."*.nc; do file2=$(echo $(basename $file)) file2=$(echo $file2 | cut -d. -f3-) # remove the date from file fsuf=$(echo $file2 | cut -d. -f1) - if [ $fsuf != "sfc_data" ]; then - $NLN $file $DATA/INPUT/$file2 - fi + $NLN $file $DATA/INPUT/$file2 done - # Link sfcanl_data restart files from $memdir - for file in $(ls $memdir/RESTART/${sPDY}.${scyc}0000.*.nc); do - file2=$(echo $(basename $file)) - file2=$(echo $file2 | cut -d. -f3-) # remove the date from file - fsufanl=$(echo $file2 | cut -d. 
-f1) - if [ $fsufanl = "sfcanl_data" ]; then + # Replace sfc_data with sfcanl_data restart files from current cycle (if found) + if [ "${MODE}" = "cycled" ] && [ "${CCPP_SUITE}" = "FV3_GFS_v16" ]; then # TODO: remove if statement when global_cycle can handle NOAHMP + for file in "${COM_ATMOS_RESTART}/${sPDY}.${scyc}0000."*.nc; do + file2=$(echo $(basename $file)) + file2=$(echo $file2 | cut -d. -f3-) # remove the date from file + fsufanl=$(echo $file2 | cut -d. -f1) file2=$(echo $file2 | sed -e "s/sfcanl_data/sfc_data/g") + rm -f $DATA/INPUT/$file2 $NLN $file $DATA/INPUT/$file2 - fi - done + done + fi # Need a coupler.res when doing IAU if [ $DOIAU = "YES" ]; then @@ -81,9 +75,9 @@ EOF for i in $(echo $IAUFHRS | sed "s/,/ /g" | rev); do incfhr=$(printf %03i $i) if [ $incfhr = "006" ]; then - increment_file=$memdir/${CDUMP}.t${cyc}z.${PREFIX_ATMINC}atminc.nc + increment_file="${COM_ATMOS_ANALYSIS}/${RUN}.t${cyc}z.${PREFIX_ATMINC}atminc.nc" else - increment_file=$memdir/${CDUMP}.t${cyc}z.${PREFIX_ATMINC}atmi${incfhr}.nc + increment_file="${COM_ATMOS_ANALYSIS}/${RUN}.t${cyc}z.${PREFIX_ATMINC}atmi${incfhr}.nc" fi if [ ! -f $increment_file ]; then echo "ERROR: DOIAU = $DOIAU, but missing increment file for fhr $incfhr at $increment_file" @@ -96,7 +90,7 @@ EOF read_increment=".false." res_latlon_dynamics="" else - increment_file=$memdir/${CDUMP}.t${cyc}z.${PREFIX_ATMINC}atminc.nc + increment_file="${COM_ATMOS_ANALYSIS}/${RUN}.t${cyc}z.${PREFIX_ATMINC}atminc.nc" if [ -f $increment_file ]; then $NLN $increment_file $DATA/INPUT/fv3_increment.nc read_increment=".true." @@ -109,7 +103,7 @@ EOF export warm_start=".true." PDYT=$(echo $CDATE_RST | cut -c1-8) cyct=$(echo $CDATE_RST | cut -c9-10) - for file in $(ls $RSTDIR_ATM/${PDYT}.${cyct}0000.*); do + for file in "${COM_ATMOS_RESTART}/${PDYT}.${cyct}0000."*; do file2=$(echo $(basename $file)) file2=$(echo $file2 | cut -d. -f3-) $NLN $file $DATA/INPUT/$file2 @@ -134,7 +128,7 @@ EOF #............................. 
else ## cold start - for file in $(ls $memdir/INPUT/*.nc); do + for file in "${COM_ATMOS_INPUT}/"*.nc; do file2=$(echo $(basename $file)) fsuf=$(echo $file2 | cut -c1-3) if [ $fsuf = "gfs" -o $fsuf = "sfc" ]; then @@ -144,16 +138,10 @@ EOF fi - if [ $machine = 'sandbox' ]; then - echo SUB ${FUNCNAME[0]}: Checking initial condition, overriden in sandbox mode! - else - nfiles=$(ls -1 $DATA/INPUT/* | wc -l) - if [ $nfiles -le 0 ]; then - echo SUB ${FUNCNAME[0]}: Initial conditions must exist in $DATA/INPUT, ABORT! - msg="SUB ${FUNCNAME[0]}: Initial conditions must exist in $DATA/INPUT, ABORT!" - postmsg "$jlogfile" "$msg" - exit 1 - fi + nfiles=$(ls -1 $DATA/INPUT/* | wc -l) + if [ $nfiles -le 0 ]; then + echo SUB ${FUNCNAME[0]}: Initial conditions must exist in $DATA/INPUT, ABORT! + exit 1 fi # If doing IAU, change forecast hours @@ -166,10 +154,6 @@ EOF #-------------------------------------------------------------------------- # Grid and orography data - for n in $(seq 1 $ntiles); do - $NLN $FIXfv3/$CASE/${CASE}_grid.tile${n}.nc $DATA/INPUT/${CASE}_grid.tile${n}.nc - $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile${n}.nc $DATA/INPUT/oro_data.tile${n}.nc - done if [ $cplflx = ".false." ] ; then $NLN $FIXfv3/$CASE/${CASE}_mosaic.nc $DATA/INPUT/grid_spec.nc @@ -177,20 +161,12 @@ EOF $NLN $FIXfv3/$CASE/${CASE}_mosaic.nc $DATA/INPUT/${CASE}_mosaic.nc fi - # Fractional grid related - if [ $FRAC_GRID = ".true." 
]; then - OROFIX=${OROFIX:-"${FIX_DIR}/fix_fv3_fracoro/${CASE}.mx${OCNRES}_frac"} - FIX_SFC=${FIX_SFC:-"${OROFIX}/fix_sfc"} - for n in $(seq 1 $ntiles); do - $NLN ${OROFIX}/oro_${CASE}.mx${OCNRES}.tile${n}.nc $DATA/INPUT/oro_data.tile${n}.nc - done - else - OROFIX=${OROFIX:-"${FIXfv3}/${CASE}"} - FIX_SFC=${FIX_SFC:-"${OROFIX}/fix_sfc"} - for n in $(seq 1 $ntiles); do - $NLN ${OROFIX}/${CASE}_oro_data.tile${n}.nc $DATA/INPUT/oro_data.tile${n}.nc - done - fi + OROFIX=${OROFIX:-"${FIX_DIR}/orog/${CASE}.mx${OCNRES}_frac"} + FIX_SFC=${FIX_SFC:-"${OROFIX}/fix_sfc"} + for n in $(seq 1 $ntiles); do + $NLN ${OROFIX}/oro_${CASE}.mx${OCNRES}.tile${n}.nc $DATA/INPUT/oro_data.tile${n}.nc + $NLN ${OROFIX}/${CASE}_grid.tile${n}.nc $DATA/INPUT/${CASE}_grid.tile${n}.nc + done export CCPP_SUITE=${CCPP_SUITE:-"FV3_GFS_v16"} _suite_file=$HOMEgfs/sorc/ufs_model.fd/FV3/ccpp/suites/suite_${CCPP_SUITE}.xml @@ -249,11 +225,8 @@ EOF fi # Files for GWD - OROFIX_ugwd=${OROFIX_ugwd:-"${FIX_DIR}/fix_ugwd"} - if [[ "$CCPP_SUITE" != "FV3_RAP_cires_ugwp" && "$CCPP_SUITE" != "FV3_RAP_noah_sfcdiff_unified_ugwp" && "$CCPP_SUITE" != "FV3_RAP_noah_sfcdiff_ugwpv1" ]] ; then ## JKH - - $NLN ${OROFIX_ugwd}/ugwp_limb_tau.nc $DATA/ugwp_limb_tau.nc - fi + OROFIX_ugwd=${OROFIX_ugwd:-"${FIX_DIR}/ugwd"} + $NLN ${OROFIX_ugwd}/ugwp_limb_tau.nc $DATA/ugwp_limb_tau.nc for n in $(seq 1 $ntiles); do $NLN ${OROFIX_ugwd}/$CASE/${CASE}_oro_data_ls.tile${n}.nc $DATA/INPUT/oro_data_ls.tile${n}.nc $NLN ${OROFIX_ugwd}/$CASE/${CASE}_oro_data_ss.tile${n}.nc $DATA/INPUT/oro_data_ss.tile${n}.nc @@ -279,8 +252,8 @@ EOF if [ $imp_physics -eq 8 ]; then $NLN $FIX_AM/CCN_ACTIVATE.BIN $DATA/CCN_ACTIVATE.BIN $NLN $FIX_AM/freezeH2O.dat $DATA/freezeH2O.dat - $NLN $FIX_AM/qr_acr_qgV2.dat $DATA/qr_acr_qgV2.dat - $NLN $FIX_AM/qr_acr_qsV2.dat $DATA/qr_acr_qsV2.dat + $NLN $FIX_AM/qr_acr_qgV2.dat $DATA/qr_acr_qgV2.dat + $NLN $FIX_AM/qr_acr_qsV2.dat $DATA/qr_acr_qsV2.dat fi $NLN $FIX_AM/${O3FORC} $DATA/global_o3prdlos.f77 @@ -290,12 +263,12 @@ 
EOF ## merra2 aerosol climo if [ $IAER -eq "1011" ]; then - FIX_AER="${FIX_DIR}/fix_aer" + FIX_AER="${FIX_DIR}/aer" for month in $(seq 1 12); do MM=$(printf %02d $month) $NLN "${FIX_AER}/merra2.aerclim.2003-2014.m${MM}.nc" "aeroclim.m${MM}.nc" done - FIX_LUT="${FIX_DIR}/fix_lut" + FIX_LUT="${FIX_DIR}/lut" $NLN $FIX_LUT/optics_BC.v1_3.dat $DATA/optics_BC.dat $NLN $FIX_LUT/optics_OC.v1_3.dat $DATA/optics_OC.dat $NLN $FIX_LUT/optics_DU.v15_3.dat $DATA/optics_DU.dat @@ -321,9 +294,9 @@ EOF # inline post fix files if [ $WRITE_DOPOST = ".true." ]; then $NLN $PARM_POST/post_tag_gfs${LEVS} $DATA/itag - $NLN $PARM_POST/postxconfig-NT-GFS-TWO.txt $DATA/postxconfig-NT.txt - $NLN $PARM_POST/postxconfig-NT-GFS-F00-TWO.txt $DATA/postxconfig-NT_FH00.txt - $NLN $PARM_POST/params_grib2_tbl_new $DATA/params_grib2_tbl_new + $NLN ${FLTFILEGFS:-$PARM_POST/postxconfig-NT-GFS-TWO.txt} $DATA/postxconfig-NT.txt + $NLN ${FLTFILEGFSF00:-$PARM_POST/postxconfig-NT-GFS-F00-TWO.txt} $DATA/postxconfig-NT_FH00.txt + $NLN ${POSTGRB2TBL:-$PARM_POST/params_grib2_tbl_new} $DATA/params_grib2_tbl_new fi #------------------------------------------------------------------ @@ -515,29 +488,24 @@ EOF JCAP_STP=${JCAP_STP:-$JCAP_CASE} LONB_STP=${LONB_STP:-$LONB_CASE} LATB_STP=${LATB_STP:-$LATB_CASE} - cd $DATA - - affix="nc" - if [ "$OUTPUT_FILE" = "nemsio" ]; then - affix="nemsio" - fi - + if [[ ! -d ${COM_ATMOS_HISTORY} ]]; then mkdir -p ${COM_ATMOS_HISTORY}; fi + if [[ ! -d ${COM_ATMOS_MASTER} ]]; then mkdir -p ${COM_ATMOS_MASTER}; fi if [ $QUILTING = ".true." 
-a $OUTPUT_GRID = "gaussian_grid" ]; then fhr=$FHMIN for fhr in $OUTPUT_FH; do FH3=$(printf %03i $fhr) FH2=$(printf %02i $fhr) - atmi=atmf${FH3}.$affix - sfci=sfcf${FH3}.$affix - logi=logf${FH3} + atmi=atmf${FH3}.nc + sfci=sfcf${FH3}.nc + logi=log.atm.f${FH3} pgbi=GFSPRS.GrbF${FH2} flxi=GFSFLX.GrbF${FH2} - atmo=$memdir/${CDUMP}.t${cyc}z.atmf${FH3}.$affix - sfco=$memdir/${CDUMP}.t${cyc}z.sfcf${FH3}.$affix - logo=$memdir/${CDUMP}.t${cyc}z.logf${FH3}.txt - pgbo=$memdir/${CDUMP}.t${cyc}z.master.grb2f${FH3} - flxo=$memdir/${CDUMP}.t${cyc}z.sfluxgrbf${FH3}.grib2 + atmo=${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.atmf${FH3}.nc + sfco=${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.sfcf${FH3}.nc + logo=${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.atm.logf${FH3}.txt + pgbo=${COM_ATMOS_MASTER}/${RUN}.t${cyc}z.master.grb2f${FH3} + flxo=${COM_ATMOS_MASTER}/${RUN}.t${cyc}z.sfluxgrbf${FH3}.grib2 eval $NLN $atmo $atmi eval $NLN $sfco $sfci eval $NLN $logo $logi @@ -548,11 +516,11 @@ EOF done else for n in $(seq 1 $ntiles); do - eval $NLN nggps2d.tile${n}.nc $memdir/nggps2d.tile${n}.nc - eval $NLN nggps3d.tile${n}.nc $memdir/nggps3d.tile${n}.nc - eval $NLN grid_spec.tile${n}.nc $memdir/grid_spec.tile${n}.nc - eval $NLN atmos_static.tile${n}.nc $memdir/atmos_static.tile${n}.nc - eval $NLN atmos_4xdaily.tile${n}.nc $memdir/atmos_4xdaily.tile${n}.nc + eval $NLN nggps2d.tile${n}.nc ${COM_ATMOS_HISTORY}/nggps2d.tile${n}.nc + eval $NLN nggps3d.tile${n}.nc ${COM_ATMOS_HISTORY}/nggps3d.tile${n}.nc + eval $NLN grid_spec.tile${n}.nc ${COM_ATMOS_HISTORY}/grid_spec.tile${n}.nc + eval $NLN atmos_static.tile${n}.nc ${COM_ATMOS_HISTORY}/atmos_static.tile${n}.nc + eval $NLN atmos_4xdaily.tile${n}.nc ${COM_ATMOS_HISTORY}/atmos_4xdaily.tile${n}.nc done fi } @@ -560,10 +528,6 @@ EOF FV3_GFS_nml(){ # namelist output for a certain component echo SUB ${FUNCNAME[0]}: Creating name lists and model configure file for FV3 - if [ $machine = 'sandbox' ]; then - cd $SCRIPTDIR - echo "MAIN: !!!Sandbox mode, writing to current 
directory!!!" - fi # Call child scripts in current script directory source $SCRIPTDIR/parsing_namelists_FV3.sh FV3_namelists @@ -587,16 +551,16 @@ data_out_GFS() { if [ $SEND = "YES" ]; then # Copy model restart files - if [ $CDUMP = "gdas" -a $rst_invt1 -gt 0 ]; then + if [[ ${RUN} =~ "gdas" ]] && (( rst_invt1 > 0 )); then cd $DATA/RESTART - mkdir -p $memdir/RESTART + mkdir -p "${COM_ATMOS_RESTART}" for rst_int in $restart_interval ; do if [ $rst_int -ge 0 ]; then RDATE=$($NDATE +$rst_int $CDATE) rPDY=$(echo $RDATE | cut -c1-8) rcyc=$(echo $RDATE | cut -c9-10) - for file in $(ls ${rPDY}.${rcyc}0000.*) ; do - $NCP $file $memdir/RESTART/$file + for file in "${rPDY}.${rcyc}0000."* ; do + ${NCP} "${file}" "${COM_ATMOS_RESTART}/${file}" done fi done @@ -609,19 +573,19 @@ data_out_GFS() { RDATE=$($NDATE +$rst_iau $CDATE) rPDY=$(echo $RDATE | cut -c1-8) rcyc=$(echo $RDATE | cut -c9-10) - for file in $(ls ${rPDY}.${rcyc}0000.*) ; do - $NCP $file $memdir/RESTART/$file + for file in "${rPDY}.${rcyc}0000."* ; do + ${NCP} "${file}" "${COM_ATMOS_RESTART}/${file}" done fi - elif [ $CDUMP = "gfs" ]; then - $NCP $DATA/input.nml $ROTDIR/${CDUMP}.${PDY}/${cyc}/atmos/ - $NCP $DATA/model_configure $ROTDIR/${CDUMP}.${PDY}/${cyc}/atmos/ # GSL + elif [[ ${RUN} =~ "gfs" ]]; then + ${NCP} "${DATA}/input.nml" "${COM_ATMOS_HISTORY}/input.nml" fi fi echo "SUB ${FUNCNAME[0]}: Output data for FV3 copied" } + WW3_postdet() { echo "SUB ${FUNCNAME[0]}: Linking input data for WW3" COMPONENTwave=${COMPONENTwave:-${RUN}wave} @@ -633,68 +597,61 @@ WW3_postdet() { grdALL=$(printf "%s\n" "${array[@]}" | sort -u | tr '\n' ' ') for wavGRD in ${grdALL}; do - $NCP $ROTDIR/${CDUMP}.${PDY}/${cyc}/wave/rundata/${COMPONENTwave}.mod_def.$wavGRD $DATA/mod_def.$wavGRD + ${NCP} "${COM_WAVE_PREP}/${COMPONENTwave}.mod_def.${wavGRD}" "${DATA}/mod_def.${wavGRD}" done else #if shel, only 1 waveGRD which is linked to mod_def.ww3 - $NCP $ROTDIR/${CDUMP}.${PDY}/${cyc}/wave/rundata/${COMPONENTwave}.mod_def.$waveGRD 
$DATA/mod_def.ww3 + ${NCP} "${COM_WAVE_PREP}/${COMPONENTwave}.mod_def.${waveGRD}" "${DATA}/mod_def.ww3" fi #if wave mesh is not the same as the ocn/ice mesh, link it in the file comparemesh=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} - if [ "$MESH_WAV" = "$comparemesh" ]; then echo "Wave is on same mesh as ocean/ice" - else + if [ "$MESH_WAV" = "$comparemesh" ]; then echo "Wave is on same mesh as ocean/ice" + else $NLN -sf $FIXwave/$MESH_WAV $DATA/ - fi + fi - export WAVHCYC=${WAVHCYC:-6} - export WRDATE=$($NDATE -${WAVHCYC} $CDATE) - export WRPDY=$(echo $WRDATE | cut -c1-8) - export WRcyc=$(echo $WRDATE | cut -c9-10) - export WRDIR=${ROTDIR}/${CDUMPRSTwave}.${WRPDY}/${WRcyc}/wave/restart - export RSTDIR_WAVE=$ROTDIR/${CDUMP}.${PDY}/${cyc}/wave/restart - export datwave=$COMOUTwave/rundata - export wavprfx=${CDUMPwave}${WAV_MEMBER:-} + export wavprfx=${RUNwave}${WAV_MEMBER:-} #Copy initial condition files: for wavGRD in $waveGRD ; do if [ $warm_start = ".true." -o $RERUN = "YES" ]; then if [ $RERUN = "NO" ]; then - waverstfile=${WRDIR}/${sPDY}.${scyc}0000.restart.${wavGRD} + waverstfile=${COM_WAVE_RESTART_PREV}/${sPDY}.${scyc}0000.restart.${wavGRD} else - waverstfile=${RSTDIR_WAVE}/${PDYT}.${cyct}0000.restart.${wavGRD} + waverstfile=${COM_WAVE_RESTART}/${PDYT}.${cyct}0000.restart.${wavGRD} fi else - waverstfile=${RSTDIR_WAVE}/${sPDY}.${scyc}0000.restart.${wavGRD} + waverstfile=${COM_WAVE_RESTART}/${sPDY}.${scyc}0000.restart.${wavGRD} fi if [ ! -f ${waverstfile} ]; then if [ $RERUN = "NO" ]; then echo "WARNING: NON-FATAL ERROR wave IC is missing, will start from rest" - else - echo "ERROR: Wave IC is missing in RERUN, exiting." - exit 1 - fi + else + echo "ERROR: Wave IC is missing in RERUN, exiting." + exit 1 + fi else if [ $waveMULTIGRID = ".true." ]; then $NLN ${waverstfile} $DATA/restart.${wavGRD} - else + else $NLN ${waverstfile} $DATA/restart.ww3 fi fi - done + done if [ $waveMULTIGRID = ".true."
]; then for wavGRD in $waveGRD ; do - $NLN $datwave/${wavprfx}.log.${wavGRD}.${PDY}${cyc} log.${wavGRD} + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.log.${wavGRD}.${PDY}${cyc}" "log.${wavGRD}" done else - $NLN $datwave/${wavprfx}.log.${waveGRD}.${PDY}${cyc} log.ww3 - fi + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.log.${waveGRD}.${PDY}${cyc}" "log.ww3" + fi if [ "$WW3ICEINP" = "YES" ]; then - wavicefile=$COMINwave/rundata/${CDUMPwave}.${WAVEICE_FID}.${cycle}.ice + wavicefile="${COM_WAVE_PREP}/${RUNwave}.${WAVEICE_FID}.${cycle}.ice" if [ ! -f $wavicefile ]; then echo "ERROR: WW3ICEINP = ${WW3ICEINP}, but missing ice file" echo "Abort!" @@ -704,7 +661,7 @@ WW3_postdet() { fi if [ "$WW3CURINP" = "YES" ]; then - wavcurfile=$COMINwave/rundata/${CDUMPwave}.${WAVECUR_FID}.${cycle}.cur + wavcurfile="${COM_WAVE_PREP}/${RUNwave}.${WAVECUR_FID}.${cycle}.cur" if [ ! -f $wavcurfile ]; then echo "ERROR: WW3CURINP = ${WW3CURINP}, but missing current file" echo "Abort!" @@ -713,11 +670,13 @@ WW3_postdet() { $NLN $wavcurfile $DATA/current.${WAVECUR_FID} fi + if [[ ! -d ${COM_WAVE_HISTORY} ]]; then mkdir -p "${COM_WAVE_HISTORY}"; fi + # Link output files cd $DATA if [ $waveMULTIGRID = ".true." ]; then - $NLN $datwave/${wavprfx}.log.mww3.${PDY}${cyc} log.mww3 - fi + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.log.mww3.${PDY}${cyc}" "log.mww3" + fi # Loop for gridded output (uses FHINC) fhr=$FHMIN_WAV @@ -727,11 +686,11 @@ WW3_postdet() { HMS="$(echo $YMDH | cut -c9-10)0000" if [ $waveMULTIGRID = ".true." 
]; then for wavGRD in ${waveGRD} ; do - $NLN $datwave/${wavprfx}.out_grd.${wavGRD}.${YMD}.${HMS} $DATA/${YMD}.${HMS}.out_grd.${wavGRD} + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.out_grd.${wavGRD}.${YMD}.${HMS}" "${DATA}/${YMD}.${HMS}.out_grd.${wavGRD}" done else - $NLN $datwave/${wavprfx}.out_grd.${waveGRD}.${YMD}.${HMS} $DATA/${YMD}.${HMS}.out_grd.ww3 - fi + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.out_grd.${waveGRD}.${YMD}.${HMS}" "${DATA}/${YMD}.${HMS}.out_grd.ww3" + fi FHINC=$FHOUT_WAV if [ $FHMAX_HF_WAV -gt 0 -a $FHOUT_HF_WAV -gt 0 -a $fhr -lt $FHMAX_HF_WAV ]; then FHINC=$FHOUT_HF_WAV @@ -746,10 +705,10 @@ WW3_postdet() { YMD=$(echo $YMDH | cut -c1-8) HMS="$(echo $YMDH | cut -c9-10)0000" if [ $waveMULTIGRID = ".true." ]; then - $NLN $datwave/${wavprfx}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} $DATA/${YMD}.${HMS}.out_pnt.${waveuoutpGRD} + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS}" "${DATA}/${YMD}.${HMS}.out_pnt.${waveuoutpGRD}" else - $NLN $datwave/${wavprfx}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} $DATA/${YMD}.${HMS}.out_pnt.ww3 - fi + ${NLN} "${COM_WAVE_HISTORY}/${wavprfx}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS}" "${DATA}/${YMD}.${HMS}.out_pnt.ww3" + fi FHINC=$FHINCP_WAV fhr=$((fhr+FHINC)) @@ -758,15 +717,15 @@ WW3_postdet() { WW3_nml() { echo "SUB ${FUNCNAME[0]}: Copying input files for WW3" - WAV_MOD_TAG=${CDUMP}wave${waveMEMB} + WAV_MOD_TAG=${RUN}wave${waveMEMB} if [ "${USE_WAV_RMP:-YES}" = "YES" ]; then - if (( $( ls -1 $FIXwave/rmp_src_to_dst_conserv_* > /dev/null | wc -l) > 0 )); then + if (( $( ls -1 $FIXwave/rmp_src_to_dst_conserv_* 2> /dev/null | wc -l) > 0 )); then for file in $(ls $FIXwave/rmp_src_to_dst_conserv_*) ; do - $NLN $file $DATA/ + $NLN $file $DATA/ done else - echo 'FATAL ERROR : No rmp precomputed nc files found for wave model' - exit 4 + echo 'FATAL ERROR : No rmp precomputed nc files found for wave model' + exit 4 fi fi source $SCRIPTDIR/parsing_namelists_WW3.sh @@ -781,23 +740,39 @@ WW3_out() { CPL_out() { 
echo "SUB ${FUNCNAME[0]}: Copying output data for general cpl fields" if [ $esmf_profile = ".true." ]; then - $NCP $DATA/ESMF_Profile.summary $ROTDIR/$CDUMP.$PDY/$cyc/ + ${NCP} "${DATA}/ESMF_Profile.summary" "${COM_ATMOS_HISTORY}/ESMF_Profile.summary" fi } MOM6_postdet() { echo "SUB ${FUNCNAME[0]}: MOM6 after run type determination" - OCNRES=${OCNRES:-"025"} - # Copy MOM6 ICs - $NCP -pf $ICSDIR/$CDATE/ocn/MOM*nc $DATA/INPUT/ + ${NLN} "${COM_OCEAN_RESTART_PREV}/${PDY}.${cyc}0000.MOM.res.nc" "${DATA}/INPUT/MOM.res.nc" + case $OCNRES in + "025") + for nn in $(seq 1 4); do + if [[ -f "${COM_OCEAN_RESTART_PREV}/${PDY}.${cyc}0000.MOM.res_${nn}.nc" ]]; then + ${NLN} "${COM_OCEAN_RESTART_PREV}/${PDY}.${cyc}0000.MOM.res_${nn}.nc" "${DATA}/INPUT/MOM.res_${nn}.nc" + fi + done + ;; + esac + + # Link increment + if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then + if [[ ! -f "${COM_OCEAN_ANALYSIS}/${RUN}.t${cyc}z.ocninc.nc" ]]; then + echo "FATAL ERROR: Ocean increment not found, ABORT!" + exit 111 + fi + ${NLN} "${COM_OCEAN_ANALYSIS}/${RUN}.t${cyc}z.ocninc.nc" "${DATA}/INPUT/mom6_increment.nc" + fi # Copy MOM6 fixed files $NCP -pf $FIXmom/$OCNRES/* $DATA/INPUT/ # Copy coupled grid_spec - spec_file="$FIX_DIR/fix_cpl/a${CASE}o${OCNRES}/grid_spec.nc" + spec_file="$FIX_DIR/cpl/a${CASE}o${OCNRES}/grid_spec.nc" if [ -s $spec_file ]; then $NCP -pf $spec_file $DATA/INPUT/ else @@ -805,12 +780,29 @@ MOM6_postdet() { exit 3 fi - # Copy mediator restart files to RUNDIR - if [ $warm_start = ".true." -o $RERUN = "YES" ]; then - $NCP $ROTDIR/$CDUMP.$PDY/$cyc/med/ufs.cpld*.nc $DATA/ - $NCP $ROTDIR/$CDUMP.$PDY/$cyc/med/rpointer.cpl $DATA/ + # Copy mediator restart files to RUNDIR # TODO: mediator should have its own CMEPS_postdet() function + if [[ $warm_start = ".true." 
]]; then + local mediator_file="${COM_MED_RESTART}/${PDY}.${cyc}0000.ufs.cpld.cpl.r.nc" + if [[ -f "${mediator_file}" ]]; then + ${NCP} "${mediator_file}" "${DATA}/ufs.cpld.cpl.r.nc" + rm -f "${DATA}/rpointer.cpl" + touch "${DATA}/rpointer.cpl" + echo "ufs.cpld.cpl.r.nc" >> "${DATA}/rpointer.cpl" + else + # We have a choice to make here. + # Either we can FATAL ERROR out, or we can let the coupling fields initialize from zero + # cmeps_run_type is determined based on the availability of the mediator restart file + echo "WARNING: ${mediator_file} does not exist for warm_start = .true., initializing!" + #echo "FATAL ERROR: ${mediator_file} must exist for warm_start = .true. and does not, ABORT!" + #exit 4 + fi + else + # This is a cold start, so initialize the coupling fields from zero + export cmeps_run_type="startup" fi + # If using stochastic parameterizations, create a seed that does not exceed the + # largest signed integer if [ $DO_OCN_SPPT = "YES" -o $DO_OCN_PERT_EPBL = "YES" ]; then if [ ${SET_STP_SEED:-"YES"} = "YES" ]; then ISEED_OCNSPPT=$(( (CDATE*1000 + MEMBER*10 + 6) % 2147483647 )) fi fi + # Create COMOUTocean + [[ ! -d ${COM_OCEAN_HISTORY} ]] && mkdir -p "${COM_OCEAN_HISTORY}" + # Link output files + if [[ "${RUN}" =~ "gfs" ]]; then + # Link output files for RUN = gfs - export ENSMEM=${ENSMEM:-01} - export IDATE=$CDATE - [[ ! -d $COMOUTocean ]] && mkdir -p $COMOUTocean + # TODO: get requirements on what files need to be written out and what these dates here are and what they mean + export ENSMEM=${ENSMEM:-01} + export IDATE=$CDATE + fhrlst=${OUTPUT_FH} + if [[ !
-d ${COM_OCEAN_HISTORY} ]]; then mkdir -p ${COM_OCEAN_HISTORY}; fi - fhrlst=$OUTPUT_FH + for fhr in $fhrlst; do + if [ $fhr = 'anl' ]; then # Looking at OUTPUT_FH, this is never true, TODO: remove this block + continue + fi + if [ -z ${last_fhr:-} ]; then + last_fhr=$fhr + continue + fi + (( interval = fhr - last_fhr )) + (( midpoint = last_fhr + interval/2 )) + VDATE=$($NDATE $fhr $IDATE) + YYYY=$(echo $VDATE | cut -c1-4) + MM=$(echo $VDATE | cut -c5-6) + DD=$(echo $VDATE | cut -c7-8) + HH=$(echo $VDATE | cut -c9-10) + SS=$((10#$HH*3600)) + + VDATE_MID=$($NDATE $midpoint $IDATE) + YYYY_MID=$(echo $VDATE_MID | cut -c1-4) + MM_MID=$(echo $VDATE_MID | cut -c5-6) + DD_MID=$(echo $VDATE_MID | cut -c7-8) + HH_MID=$(echo $VDATE_MID | cut -c9-10) + SS_MID=$((10#$HH_MID*3600)) + + source_file="ocn_${YYYY_MID}_${MM_MID}_${DD_MID}_${HH_MID}.nc" + dest_file="ocn${VDATE}.${ENSMEM}.${IDATE}.nc" + ${NLN} ${COM_OCEAN_HISTORY}/${dest_file} ${DATA}/${source_file} + + source_file="ocn_daily_${YYYY}_${MM}_${DD}.nc" + dest_file=${source_file} + if [ ! 
-a "${DATA}/${source_file}" ]; then + $NLN ${COM_OCEAN_HISTORY}/${dest_file} ${DATA}/${source_file} + fi - for fhr in $fhrlst; do - if [ $fhr = 'anl' ]; then - continue - fi - if [ -z ${last_fhr:-} ]; then last_fhr=$fhr - continue - fi - (( interval = fhr - last_fhr )) - (( midpoint = last_fhr + interval/2 )) - VDATE=$($NDATE $fhr $IDATE) - YYYY=$(echo $VDATE | cut -c1-4) - MM=$(echo $VDATE | cut -c5-6) - DD=$(echo $VDATE | cut -c7-8) - HH=$(echo $VDATE | cut -c9-10) - SS=$((10#$HH*3600)) + done - VDATE_MID=$($NDATE $midpoint $IDATE) - YYYY_MID=$(echo $VDATE_MID | cut -c1-4) - MM_MID=$(echo $VDATE_MID | cut -c5-6) - DD_MID=$(echo $VDATE_MID | cut -c7-8) - HH_MID=$(echo $VDATE_MID | cut -c9-10) - SS_MID=$((10#$HH_MID*3600)) - - source_file="ocn_${YYYY_MID}_${MM_MID}_${DD_MID}_${HH_MID}.nc" - dest_file="ocn${VDATE}.${ENSMEM}.${IDATE}.nc" - ${NLN} ${COMOUTocean}/${dest_file} ${DATA}/${source_file} - - source_file="wavocn_${YYYY_MID}_${MM_MID}_${DD_MID}_${HH_MID}.nc" - dest_file=${source_file} - ${NLN} ${COMOUTocean}/${dest_file} ${DATA}/${source_file} - - source_file="ocn_daily_${YYYY}_${MM}_${DD}.nc" - dest_file=${source_file} - if [ ! 
-a "${DATA}/${source_file}" ]; then - $NLN ${COMOUTocean}/${dest_file} ${DATA}/${source_file} - fi + elif [[ "${RUN}" =~ "gdas" ]]; then + # Link output files for RUN = gdas + + # Save MOM6 backgrounds + for fhr in ${OUTPUT_FH}; do + local idatestr=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${fhr} hours" +%Y_%m_%d_%H) + local fhr3=$(printf %03i "${fhr}") + $NLN "${COM_OCEAN_HISTORY}/${RUN}.t${cyc}z.ocnf${fhr3}.nc" "${DATA}/ocn_da_${idatestr}.nc" + done + fi + + mkdir -p "${COM_OCEAN_RESTART}" - last_fhr=$fhr + # end point restart does not have a timestamp, calculate + local rdate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${FHMAX} hours" +%Y%m%d%H) + + # Link ocean restarts from DATA to COM + # Coarser than 1/2 degree has a single MOM restart + $NLN "${COM_OCEAN_RESTART}/${rdate:0:8}.${rdate:8:2}0000.MOM.res.nc" "${DATA}/MOM6_RESTART/" + # 1/4 degree resolution has 4 additional restarts + case ${OCNRES} in + "025") + for nn in $(seq 1 4); do + $NLN "${COM_OCEAN_RESTART}/${rdate:0:8}.${rdate:8:2}0000.MOM.res_${nn}.nc" "${DATA}/MOM6_RESTART/" + done + ;; + *) + ;; + esac + + # Loop over restart_interval frequency and link restarts from DATA to COM + local res_int=$(echo $restart_interval | cut -d' ' -f1) # If this is a list, get the frequency. 
# This is bound to break w/ IAU + local idate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${res_int} hours" +%Y%m%d%H) + while [[ $idate -lt $rdate ]]; do + local idatestr=$(date +%Y-%m-%d-%H -d "${idate:0:8} ${idate:8:2}") + $NLN "${COM_OCEAN_RESTART}/${idate:0:8}.${idate:8:2}0000.MOM.res.nc" "${DATA}/MOM6_RESTART/" + case ${OCNRES} in + "025") + for nn in $(seq 1 4); do + $NLN "${COM_OCEAN_RESTART}/${idate:0:8}.${idate:8:2}0000.MOM.res_${nn}.nc" "${DATA}/MOM6_RESTART/" + done + ;; + esac + local idate=$(date -d "${idate:0:8} ${idate:8:2} + ${res_int} hours" +%Y%m%d%H) done - $NLN ${COMOUTocean}/MOM_input $DATA/INPUT/MOM_input + + # TODO: mediator should have its own CMEPS_postdet() function + # Link mediator restarts from DATA to COM + # DANGER DANGER DANGER - Linking mediator restarts to COM causes the model to fail with a message like this below: + # Abort with message NetCDF: File exists && NC_NOCLOBBER in file pio-2.5.7/src/clib/pioc_support.c at line 2173 + # Instead of linking, copy the mediator files after the model finishes + #local COMOUTmed="${ROTDIR}/${RUN}.${PDY}/${cyc}/med" + #mkdir -p "${COMOUTmed}/RESTART" + #local idate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${res_int} hours" +%Y%m%d%H) + #while [[ $idate -le $rdate ]]; do + # local seconds=$(to_seconds ${idate:8:2}0000) # use function to_seconds from forecast_predet.sh to convert HHMMSS to seconds + # local idatestr="${idate:0:4}-${idate:4:2}-${idate:6:2}-${seconds}" + # $NLN "${COMOUTmed}/RESTART/${idate:0:8}.${idate:8:2}0000.ufs.cpld.cpl.r.nc" "${DATA}/RESTART/ufs.cpld.cpl.r.${idatestr}.nc" + # local idate=$(date -d "${idate:0:8} ${idate:8:2} + ${res_int} hours" +%Y%m%d%H) + #done echo "SUB ${FUNCNAME[0]}: MOM6 input data linked/copied" @@ -883,24 +936,57 @@ MOM6_nml() { MOM6_out() { echo "SUB ${FUNCNAME[0]}: Copying output data for MOM6" + + # Copy MOM_input from DATA to COM_OCEAN_INPUT after the forecast is run (and successful) + if [[ !
-d ${COM_OCEAN_INPUT} ]]; then mkdir -p "${COM_OCEAN_INPUT}"; fi + ${NCP} "${DATA}/INPUT/MOM_input" "${COM_OCEAN_INPUT}/" + + # TODO: mediator should have its own CMEPS_out() function + # Copy mediator restarts from DATA to COM + # Linking mediator restarts to COM causes the model to fail with a message. + # See MOM6_postdet() function for error message + mkdir -p "${COM_MED_RESTART}" + local res_int=$(echo $restart_interval | cut -d' ' -f1) # If this is a list, get the frequency. # This is bound to break w/ IAU + local idate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${res_int} hours" +%Y%m%d%H) + local rdate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${FHMAX} hours" +%Y%m%d%H) + while [[ $idate -le $rdate ]]; do + local seconds=$(to_seconds ${idate:8:2}0000) # use function to_seconds from forecast_predet.sh to convert HHMMSS to seconds + local idatestr="${idate:0:4}-${idate:4:2}-${idate:6:2}-${seconds}" + local mediator_file="${DATA}/RESTART/ufs.cpld.cpl.r.${idatestr}.nc" + if [[ -f ${mediator_file} ]]; then + $NCP "${DATA}/RESTART/ufs.cpld.cpl.r.${idatestr}.nc" "${COM_MED_RESTART}/${idate:0:8}.${idate:8:2}0000.ufs.cpld.cpl.r.nc" + else + echo "Mediator restart ${mediator_file} not found." 
+ fi + local idate=$(date -d "${idate:0:8} ${idate:8:2} + ${res_int} hours" +%Y%m%d%H) + done } CICE_postdet() { echo "SUB ${FUNCNAME[0]}: CICE after run type determination" + # TODO: move configuration settings to config.ice + + # TODO: These need to be calculated in the parsing_namelists_CICE.sh script CICE_namelists() function and set as local year=$(echo $CDATE|cut -c 1-4) month=$(echo $CDATE|cut -c 5-6) day=$(echo $CDATE|cut -c 7-8) - sec=$(echo $CDATE|cut -c 9-10) + sec=$(echo $CDATE|cut -c 9-10) stepsperhr=$((3600/$ICETIM)) nhours=$($NHOUR $CDATE ${year}010100) steps=$((nhours*stepsperhr)) npt=$((FHMAX*$stepsperhr)) # Need this in order for dump_last to work + # TODO: These settings should be elevated to config.ice histfreq_n=${histfreq_n:-6} - dumpfreq_n=${dumpfreq_n:-840} # restart write interval in seconds, default 35 days - dumpfreq=${dumpfreq:-"h"} # "h","d","m" or "y" for restarts at intervals of "hours", "days", "months" or "years" - cice_hist_avg=${cice_hist_avg:-".true."} + dumpfreq_n=${dumpfreq_n:-1000} # Set this to a really large value, as cice, mom6 and cmeps restart interval is controlled by nems.configure + dumpfreq=${dumpfreq:-"y"} # "h","d","m" or "y" for restarts at intervals of "hours", "days", "months" or "years" + + if [[ "${RUN}" =~ "gdas" ]]; then + cice_hist_avg=".false." # DA needs instantaneous + elif [[ "${RUN}" =~ "gfs" ]]; then + cice_hist_avg=".true." # P8 wants averaged over histfreq_n + fi FRAZIL_FWSALT=${FRAZIL_FWSALT:-".true."} ktherm=${ktherm:-2} @@ -910,68 +996,100 @@ CICE_postdet() { # restart_pond_lvl (if tr_pond_lvl=true): # -- if true, initialize the level ponds from restart (if runtype=continue) # -- if false, re-initialize level ponds to zero (if runtype=initial or continue) - - #TODO: Determine the proper way to determine if it's a 'hot start' or not - #note this is not mediator cold start or not - #if [ hotstart ]; then - # #continuing run "hot start" - # RUNTYPE='continue' - # USE_RESTART_TIME='.true.' 
- #fi - RUNTYPE='initial' - USE_RESTART_TIME='.false.' restart_pond_lvl=${restart_pond_lvl:-".false."} - ICERES=${ICERES:-"025"} - if [ $ICERES = '025' ]; then - ICERESdec="0.25" - fi - if [ $ICERES = '050' ]; then - ICERESdec="0.50" - fi - if [ $ICERES = '100' ]; then - ICERESdec="1.00" - fi + ICERES=${ICERES:-"025"} # TODO: similar to MOM_out, lift this higher ice_grid_file=${ice_grid_file:-"grid_cice_NEMS_mx${ICERES}.nc"} ice_kmt_file=${ice_kmt_file:-"kmtu_cice_NEMS_mx${ICERES}.nc"} export MESH_OCN_ICE=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} - iceic="cice_model.res_$CDATE.nc" - - # Copy CICE IC - $NCP -p $ICSDIR/$CDATE/ice/cice_model_${ICERESdec}.res_$CDATE.nc $DATA/$iceic + # Copy/link CICE IC to DATA + if [[ "${warm_start}" = ".true." ]]; then + cice_ana="${COM_ICE_RESTART}/${PDY}.${cyc}0000.cice_model_anl.res.nc" + if [[ -e ${cice_ana} ]]; then + ${NLN} "${cice_ana}" "${DATA}/cice_model.res.nc" + else + ${NLN} "${COM_ICE_RESTART_PREV}/${PDY}.${cyc}0000.cice_model.res.nc" "${DATA}/cice_model.res.nc" + fi + else # cold starts are typically SIS2 restarts obtained from somewhere else e.g. CPC + $NLN "${COM_ICE_RESTART}/${PDY}.${cyc}0000.cice_model.res.nc" "${DATA}/cice_model.res.nc" + fi + # TODO: add a check for the restarts to exist, if not, exit gracefully + rm -f "${DATA}/ice.restart_file" + touch "${DATA}/ice.restart_file" + echo "${DATA}/cice_model.res.nc" >> "${DATA}/ice.restart_file" echo "Link CICE fixed files" $NLN -sf $FIXcice/$ICERES/${ice_grid_file} $DATA/ $NLN -sf $FIXcice/$ICERES/${ice_kmt_file} $DATA/ $NLN -sf $FIXcice/$ICERES/$MESH_OCN_ICE $DATA/ - # Link output files - export ENSMEM=${ENSMEM:-01} - export IDATE=$CDATE - [[ ! -d $COMOUTice ]] && mkdir -p $COMOUTice - $NLN $COMOUTice/ice_in $DATA/ice_in - fhrlst=$OUTPUT_FH + # Link CICE output files + if [[ !
-d "${COM_ICE_HISTORY}" ]]; then mkdir -p "${COM_ICE_HISTORY}"; fi + mkdir -p ${COM_ICE_RESTART} - for fhr in $fhrlst; do - if [ $fhr = 'anl' ]; then - continue - fi - VDATE=$($NDATE $fhr $IDATE) - YYYY=$(echo $VDATE | cut -c1-4) - MM=$(echo $VDATE | cut -c5-6) - DD=$(echo $VDATE | cut -c7-8) - HH=$(echo $VDATE | cut -c9-10) - SS=$((10#$HH*3600)) + if [[ "${RUN}" =~ "gfs" ]]; then + # Link output files for RUN = gfs - if [[ 10#$fhr -eq 0 ]]; then - $NLN $COMOUTice/iceic$VDATE.$ENSMEM.$IDATE.nc $DATA/history/iceh_ic.${YYYY}-${MM}-${DD}-$(printf "%5.5d" ${SS}).nc - else - (( interval = fhr - last_fhr )) - $NLN $COMOUTice/ice$VDATE.$ENSMEM.$IDATE.nc $DATA/history/iceh_$(printf "%0.2d" $interval)h.${YYYY}-${MM}-${DD}-$(printf "%5.5d" ${SS}).nc - fi - last_fhr=$fhr + # TODO: make these forecast output files consistent w/ GFS output + # TODO: Work w/ NB to determine appropriate naming convention for these files + + export ENSMEM=${ENSMEM:-01} + export IDATE=$CDATE + + fhrlst=$OUTPUT_FH + + # TODO: consult w/ NB on how to improve on this. Gather requirements and more information on what these files are and how they are used to properly catalog them + for fhr in $fhrlst; do + if [ $fhr = 'anl' ]; then # Looking at OUTPUT_FH, this is never true. TODO: remove this block + continue + fi + VDATE=$($NDATE $fhr $IDATE) + YYYY=$(echo $VDATE | cut -c1-4) + MM=$(echo $VDATE | cut -c5-6) + DD=$(echo $VDATE | cut -c7-8) + HH=$(echo $VDATE | cut -c9-10) + SS=$((10#$HH*3600)) + + if [[ 10#$fhr -eq 0 ]]; then + ${NLN} "${COM_ICE_HISTORY}/iceic${VDATE}.${ENSMEM}.${IDATE}.nc" "${DATA}/CICE_OUTPUT/iceh_ic.${YYYY}-${MM}-${DD}-$(printf "%5.5d" ${SS}).nc" + else + (( interval = fhr - last_fhr )) # Umm.. isn't this histfreq_n? 
+ ${NLN} "${COM_ICE_HISTORY}/ice${VDATE}.${ENSMEM}.${IDATE}.nc" "${DATA}/CICE_OUTPUT/iceh_$(printf "%0.2d" $interval)h.${YYYY}-${MM}-${DD}-$(printf "%5.5d" ${SS}).nc" + fi + last_fhr=$fhr + done + + elif [[ "${RUN}" =~ "gdas" ]]; then + + # Link CICE generated initial condition file from DATA/CICE_OUTPUT to COMOUTice + # This can be thought of as the f000 output from the CICE model + local seconds=$(to_seconds ${CDATE:8:2}0000) # convert HHMMSS to seconds + $NLN "${COM_ICE_HISTORY}/${RUN}.t${cyc}z.iceic.nc" "${DATA}/CICE_OUTPUT/iceh_ic.${CDATE:0:4}-${CDATE:4:2}-${CDATE:6:2}-${seconds}.nc" + + # Link instantaneous CICE forecast output files from DATA/CICE_OUTPUT to COMOUTice + local fhr="${FHOUT}" + while [[ "${fhr}" -le "${FHMAX}" ]]; do + local idate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${fhr} hours" +%Y%m%d%H) + local seconds=$(to_seconds ${idate:8:2}0000) # convert HHMMSS to seconds + local fhr3=$(printf %03i ${fhr}) + $NLN "${COM_ICE_HISTORY}/${RUN}.t${cyc}z.icef${fhr3}.nc" "${DATA}/CICE_OUTPUT/iceh_inst.${idate:0:4}-${idate:4:2}-${idate:6:2}-${seconds}.nc" + local fhr=$((fhr + FHOUT)) + done + + fi + + # Link CICE restarts from CICE_RESTART to COMOUTice/RESTART + # Loop over restart_interval and link restarts from DATA to COM + local res_int=$(echo ${restart_interval} | cut -d' ' -f1) # If this is a list, get the frequency. 
# This is bound to break w/ IAU + local rdate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${FHMAX} hours" +%Y%m%d%H) + local idate=$(date -d "${CDATE:0:8} ${CDATE:8:2} + ${res_int} hours" +%Y%m%d%H) + while [[ ${idate} -le ${rdate} ]]; do + local seconds=$(to_seconds ${idate:8:2}0000) # convert HHMMSS to seconds + local idatestr="${idate:0:4}-${idate:4:2}-${idate:6:2}-${seconds}" + $NLN "${COM_ICE_RESTART}/${idate:0:8}.${idate:8:2}0000.cice_model.res.nc" "${DATA}/CICE_RESTART/cice_model.res.${idatestr}.nc" + local idate=$(date -d "${idate:0:8} ${idate:8:2} + ${res_int} hours" +%Y%m%d%H) done } @@ -983,6 +1101,10 @@ CICE_nml() { CICE_out() { echo "SUB ${FUNCNAME[0]}: Copying output data for CICE" + + # Copy ice_in namelist from DATA to COMOUTice after the forecast is run (and successful) + if [[ ! -d "${COM_ICE_INPUT}" ]]; then mkdir -p "${COM_ICE_INPUT}"; fi + ${NCP} "${DATA}/ice_in" "${COM_ICE_INPUT}/ice_in" } GOCART_rc() { @@ -1019,7 +1141,7 @@ GOCART_rc() { GOCART_postdet() { echo "SUB ${FUNCNAME[0]}: Linking output data for GOCART" - [[ ! -d $COMOUTaero ]] && mkdir -p $COMOUTaero + if [[ !
-d "${COM_CHEM_HISTORY}" ]]; then mkdir -p "${COM_CHEM_HISTORY}"; fi fhrlst=$OUTPUT_FH for fhr in $fhrlst; do @@ -1033,6 +1155,13 @@ GOCART_postdet() { HH=$(echo $VDATE | cut -c9-10) SS=$((10#$HH*3600)) - $NLN $COMOUTaero/gocart.inst_aod.${YYYY}${MM}${DD}_${HH}00z.nc4 $DATA/gocart.inst_aod.${YYYY}${MM}${DD}_${HH}00z.nc4 + # + # Temporarily delete existing files due to noclobber in GOCART + # + if [[ -e "${COM_CHEM_HISTORY}/gocart.inst_aod.${YYYY}${MM}${DD}_${HH}00z.nc4" ]]; then + rm "${COM_CHEM_HISTORY}/gocart.inst_aod.${YYYY}${MM}${DD}_${HH}00z.nc4" + fi + + ${NLN} "${COM_CHEM_HISTORY}/gocart.inst_aod.${YYYY}${MM}${DD}_${HH}00z.nc4" "${DATA}/gocart.inst_aod.${YYYY}${MM}${DD}_${HH}00z.nc4" done } diff --git a/ush/forecast_predet.sh b/ush/forecast_predet.sh index 947fae59cd3..334eacedeff 100755 --- a/ush/forecast_predet.sh +++ b/ush/forecast_predet.sh @@ -10,6 +10,29 @@ # For all non-environment variables # Cycling and forecast hour specific parameters + +to_seconds() { + # Function to convert HHMMSS to seconds since 00Z + local hhmmss=${1:?} + local hh=${hhmmss:0:2} + local mm=${hhmmss:2:2} + local ss=${hhmmss:4:2} + local seconds=$((10#${hh}*3600+10#${mm}*60+10#${ss})) + local padded_seconds=$(printf "%05d" ${seconds}) + echo ${padded_seconds} +} + +middle_date(){ + # Function to calculate mid-point date in YYYYMMDDHH between two dates also in YYYYMMDDHH + local date1=${1:?} + local date2=${2:?} + local date1s=$(date -d "${date1:0:8} ${date1:8:2}" +%s) + local date2s=$(date -d "${date2:0:8} ${date2:8:2}" +%s) + local dtsecsby2=$(( $((date2s - date1s)) / 2 )) + local mid_date=$(date -d "${date1:0:8} ${date1:8:2} + ${dtsecsby2} seconds" +%Y%m%d%H%M%S) + echo ${mid_date:0:10} +} + common_predet(){ echo "SUB ${FUNCNAME[0]}: Defining variables for shared through models" pwd=$(pwd) @@ -19,7 +42,6 @@ common_predet(){ CDATE=${CDATE:-2017032500} DATA=${DATA:-$pwd/fv3tmp$$} # temporary running directory ROTDIR=${ROTDIR:-$pwd} # rotating archive directory -
ICSDIR=${ICSDIR:-$pwd} # cold start initial conditions } DATM_predet(){ @@ -38,7 +60,6 @@ DATM_predet(){ FV3_GFS_predet(){ echo "SUB ${FUNCNAME[0]}: Defining variables for FV3GFS" CDUMP=${CDUMP:-gdas} - CDUMPwave="${CDUMP}wave" FHMIN=${FHMIN:-0} FHMAX=${FHMAX:-9} FHOUT=${FHOUT:-3} @@ -71,16 +92,14 @@ FV3_GFS_predet(){ # Directories. pwd=$(pwd) - NWPROD=${NWPROD:-${NWROOT:-$pwd}} - HOMEgfs=${HOMEgfs:-$NWPROD} + HOMEgfs=${HOMEgfs:-${PACKAGEROOT:-$pwd}} FIX_DIR=${FIX_DIR:-$HOMEgfs/fix} - FIX_AM=${FIX_AM:-$FIX_DIR/fix_am} - FIX_AER=${FIX_AER:-$FIX_DIR/fix_aer} - FIX_LUT=${FIX_LUT:-$FIX_DIR/fix_lut} - FIXfv3=${FIXfv3:-$FIX_DIR/fix_fv3_gmted2010} + FIX_AM=${FIX_AM:-$FIX_DIR/am} + FIX_AER=${FIX_AER:-$FIX_DIR/aer} + FIX_LUT=${FIX_LUT:-$FIX_DIR/lut} + FIXfv3=${FIXfv3:-$FIX_DIR/orog} DATA=${DATA:-$pwd/fv3tmp$$} # temporary running directory ROTDIR=${ROTDIR:-$pwd} # rotating archive directory - ICSDIR=${ICSDIR:-$pwd} # cold start initial conditions DMPDIR=${DMPDIR:-$pwd} # global dumps for seaice, snow and sst analysis # Model resolution specific parameters @@ -115,30 +134,15 @@ FV3_GFS_predet(){ PARM_POST=${PARM_POST:-$HOMEgfs/parm/post} # Model config options - APRUN_FV3=${APRUN_FV3:-${APRUN_FCST:-${APRUN:-""}}} - #the following NTHREAD_FV3 line is commented out because NTHREAD_FCST is not defined - #and because NTHREADS_FV3 gets overwritten by what is in the env/${macine}.env - #file and the value of npe_node_fcst is not correctly defined when using more than - #one thread and sets NTHREADS_FV3=1 even when the number of threads is appropraitely >1 - #NTHREADS_FV3=${NTHREADS_FV3:-${NTHREADS_FCST:-${nth_fv3:-1}}} - cores_per_node=${cores_per_node:-${npe_node_fcst:-24}} ntiles=${ntiles:-6} - if [ $MEMBER -lt 0 ]; then - NTASKS_TOT=${NTASKS_TOT:-${npe_fcst_gfs:-0}} - else - NTASKS_TOT=${NTASKS_TOT:-${npe_efcs:-0}} - fi TYPE=${TYPE:-"nh"} # choices: nh, hydro MONO=${MONO:-"non-mono"} # choices: mono, non-mono QUILTING=${QUILTING:-".true."} 
OUTPUT_GRID=${OUTPUT_GRID:-"gaussian_grid"} - OUTPUT_FILE=${OUTPUT_FILE:-"nemsio"} WRITE_NEMSIOFLIP=${WRITE_NEMSIOFLIP:-".true."} WRITE_FSYNCFLAG=${WRITE_FSYNCFLAG:-".true."} - affix="nemsio" - [[ "$OUTPUT_FILE" = "netcdf" ]] && affix="nc" rCDUMP=${rCDUMP:-$CDUMP} @@ -209,10 +213,9 @@ FV3_GFS_predet(){ print_freq=${print_freq:-6} #------------------------------------------------------- - if [ $CDUMP = "gfs" -a $rst_invt1 -gt 0 ]; then - RSTDIR_ATM=${RSTDIR:-$ROTDIR}/${CDUMP}.${PDY}/${cyc}/atmos/RERUN_RESTART - if [ ! -d $RSTDIR_ATM ]; then mkdir -p $RSTDIR_ATM ; fi - $NLN $RSTDIR_ATM RESTART + if [[ ${RUN} =~ "gfs" || ${RUN} = "gefs" ]] && (( rst_invt1 > 0 )); then + if [[ ! -d ${COM_ATMOS_RESTART} ]]; then mkdir -p "${COM_ATMOS_RESTART}" ; fi + ${NLN} "${COM_ATMOS_RESTART}" RESTART # The final restart written at the end doesn't include the valid date # Create links that keep the same name pattern for these files VDATE=$($NDATE +$FHMAX_GFS $CDATE) @@ -224,39 +227,19 @@ FV3_GFS_predet(){ files="${files} ${base}.tile${tile}.nc" done done - for file in $files; do - $NLN $RSTDIR_ATM/$file $RSTDIR_ATM/${vPDY}.${vcyc}0000.$file + for file in ${files}; do + ${NLN} "${COM_ATMOS_RESTART}/${file}" "${COM_ATMOS_RESTART}/${vPDY}.${vcyc}0000.${file}" done else mkdir -p $DATA/RESTART fi - #------------------------------------------------------- - # member directory - if [ $MEMBER -lt 0 ]; then - prefix=$CDUMP - rprefix=$rCDUMP - memchar="" - else - prefix=enkf$CDUMP - rprefix=enkf$rCDUMP - memchar=mem$(printf %03i $MEMBER) - fi - memdir=$ROTDIR/${prefix}.$PDY/$cyc/atmos/$memchar - if [ ! 
-d $memdir ]; then mkdir -p $memdir; fi - - GDATE=$($NDATE -$assim_freq $CDATE) - gPDY=$(echo $GDATE | cut -c1-8) - gcyc=$(echo $GDATE | cut -c9-10) - gmemdir=$ROTDIR/${rprefix}.$gPDY/$gcyc/atmos/$memchar - sCDATE=$($NDATE -3 $CDATE) - if [[ "$DOIAU" = "YES" ]]; then sCDATE=$($NDATE -3 $CDATE) sPDY=$(echo $sCDATE | cut -c1-8) scyc=$(echo $sCDATE | cut -c9-10) - tPDY=$gPDY - tcyc=$gcyc + tPDY=${gPDY} + tcyc=${gcyc} else sCDATE=$CDATE sPDY=$PDY @@ -270,36 +253,18 @@ FV3_GFS_predet(){ WW3_predet(){ echo "SUB ${FUNCNAME[0]}: Defining variables for WW3" - if [ $CDUMP = "gdas" ]; then - export RSTDIR_WAVE=$ROTDIR/${CDUMP}.${PDY}/${cyc}/wave/restart - else - export RSTDIR_WAVE=${RSTDIR_WAVE:-$ROTDIR/${CDUMP}.${PDY}/${cyc}/wave/restart} - fi - if [ ! -d $RSTDIR_WAVE ]; then mkdir -p $RSTDIR_WAVE ; fi - $NLN $RSTDIR_WAVE restart_wave + if [[ ! -d "${COM_WAVE_RESTART}" ]]; then mkdir -p "${COM_WAVE_RESTART}" ; fi + ${NLN} "${COM_WAVE_RESTART}" "restart_wave" } CICE_predet(){ echo "SUB ${FUNCNAME[0]}: CICE before run type determination" - if [ ! -d $ROTDIR ]; then mkdir -p $ROTDIR; fi - if [ ! -d $DATA ]; then mkdir -p $DATA; fi - if [ ! -d $DATA/RESTART ]; then mkdir -p $DATA/RESTART; fi - if [ ! -d $DATA/INPUT ]; then mkdir -p $DATA/INPUT; fi - if [ ! -d $DATA/restart ]; then mkdir -p $DATA/restart; fi - if [ ! -d $DATA/history ]; then mkdir -p $DATA/history; fi - if [ ! -d $DATA/OUTPUT ]; then mkdir -p $DATA/OUTPUT; fi + if [ ! -d $DATA/CICE_OUTPUT ]; then mkdir -p $DATA/CICE_OUTPUT; fi + if [ ! -d $DATA/CICE_RESTART ]; then mkdir -p $DATA/CICE_RESTART; fi } MOM6_predet(){ echo "SUB ${FUNCNAME[0]}: MOM6 before run type determination" - if [ ! -d $ROTDIR ]; then mkdir -p $ROTDIR; fi - if [ ! -d $DATA ]; then mkdir -p $DATA; fi - if [ ! -d $DATA/RESTART ]; then mkdir -p $DATA/RESTART; fi - if [ ! -d $DATA/INPUT ]; then mkdir -p $DATA/INPUT; fi - if [ ! -d $DATA/restart ]; then mkdir -p $DATA/restart; fi - if [ ! -d $DATA/history ]; then mkdir -p $DATA/history; fi - if [ ! 
-d $DATA/OUTPUT ]; then mkdir -p $DATA/OUTPUT; fi if [ ! -d $DATA/MOM6_OUTPUT ]; then mkdir -p $DATA/MOM6_OUTPUT; fi if [ ! -d $DATA/MOM6_RESTART ]; then mkdir -p $DATA/MOM6_RESTART; fi - cd $DATA || exit 8 } diff --git a/ush/fv3gfs_downstream_nems.sh b/ush/fv3gfs_downstream_nems.sh index 3b15b00cb20..48aacf0f072 100755 --- a/ush/fv3gfs_downstream_nems.sh +++ b/ush/fv3gfs_downstream_nems.sh @@ -34,10 +34,10 @@ source "$HOMEgfs/ush/preamble.sh" "$FH" export downset=${downset:-1} export DATA=${DATA:-/ptmpd2/$LOGNAME/test} -export CNVGRIB=${CNVGRIB:-${NWPROD:-/nwprod}/util/exec/cnvgrib21} -export COPYGB2=${COPYGB2:-${NWPROD:-/nwprod}/util/exec/copygb2} -export WGRIB2=${WGRIB2:-${NWPROD:-/nwprod}/util/exec/wgrib2} -export GRBINDEX=${GRBINDEX:-${NWPROD:-nwprod}/util/exec/grbindex} +export CNVGRIB=${CNVGRIB:-${grib_util_ROOT}/bin/cnvgrib} +export COPYGB2=${COPYGB2:-${grib_util_ROOT}/bin/copygb} +export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2} +export GRBINDEX=${GRBINDEX:-${wgrib2_ROOT}/bin/grbindex} export RUN=${RUN:-"gfs"} export cycn=$(echo $CDATE |cut -c 9-10) export TCYC=${TCYC:-".t${cycn}z."} @@ -130,7 +130,7 @@ while [ $nset -le $totalset ]; do set +e $WGRIB2 -d $end $tmpfile | egrep -i "ugrd|ustm|uflx|u-gwd" export rc=$? - ${ERR_EXIT_ON:-set -eu} + set_strict if [[ $rc -eq 0 ]] ; then export end=$(expr ${end} + 1) elif [[ $rc -gt 1 ]]; then @@ -141,7 +141,7 @@ while [ $nset -le $totalset ]; do set +e $WGRIB2 -d $end $tmpfile | egrep -i "land" export rc=$? 
- ${ERR_EXIT_ON:-set -eu} + set_strict if [[ $rc -eq 0 ]] ; then export end=$(expr ${end} + 1) elif [[ $rc -gt 1 ]]; then @@ -240,50 +240,50 @@ while [ $nset -le $totalset ]; do if [ $nset = 1 ]; then if [ $fhr3 = anl ]; then - cp pgb2file_${fhr3}_0p25 $COMOUT/${PREFIX}pgrb2.0p25.anl - $WGRIB2 -s pgb2file_${fhr3}_0p25 > $COMOUT/${PREFIX}pgrb2.0p25.anl.idx + cp "pgb2file_${fhr3}_0p25" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.anl" + ${WGRIB2} -s "pgb2file_${fhr3}_0p25" > "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.anl.idx" if [ "$PGBS" = "YES" ]; then - cp pgb2file_${fhr3}_0p5 $COMOUT/${PREFIX}pgrb2.0p50.anl - cp pgb2file_${fhr3}_1p0 $COMOUT/${PREFIX}pgrb2.1p00.anl - $WGRIB2 -s pgb2file_${fhr3}_0p5 > $COMOUT/${PREFIX}pgrb2.0p50.anl.idx - $WGRIB2 -s pgb2file_${fhr3}_1p0 > $COMOUT/${PREFIX}pgrb2.1p00.anl.idx + cp "pgb2file_${fhr3}_0p5" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.anl" + cp "pgb2file_${fhr3}_1p0" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.anl" + ${WGRIB2} -s "pgb2file_${fhr3}_0p5" > "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.anl.idx" + ${WGRIB2} -s "pgb2file_${fhr3}_1p0" > "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.anl.idx" if [ "$PGB1F" = 'YES' ]; then - cp pgbfile_${fhr3}_1p0 $COMOUT/${PREFIX}pgrb.1p00.anl - $GRBINDEX $COMOUT/${PREFIX}pgrb.1p00.anl $COMOUT/${PREFIX}pgrb.1p00.anl.idx + cp "pgbfile_${fhr3}_1p0" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb.1p00.anl" + ${GRBINDEX} "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb.1p00.anl" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb.1p00.anl.idx" fi fi else - cp pgb2file_${fhr3}_0p25 $COMOUT/${PREFIX}pgrb2.0p25.f${fhr3} - $WGRIB2 -s pgb2file_${fhr3}_0p25 > $COMOUT/${PREFIX}pgrb2.0p25.f${fhr3}.idx + cp "pgb2file_${fhr3}_0p25" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.f${fhr3}" + ${WGRIB2} -s "pgb2file_${fhr3}_0p25" > "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2.0p25.f${fhr3}.idx" if [ "$PGBS" = "YES" ]; then - cp pgb2file_${fhr3}_0p5 $COMOUT/${PREFIX}pgrb2.0p50.f${fhr3} - cp pgb2file_${fhr3}_1p0 
$COMOUT/${PREFIX}pgrb2.1p00.f${fhr3} - $WGRIB2 -s pgb2file_${fhr3}_0p5 > $COMOUT/${PREFIX}pgrb2.0p50.f${fhr3}.idx - $WGRIB2 -s pgb2file_${fhr3}_1p0 > $COMOUT/${PREFIX}pgrb2.1p00.f${fhr3}.idx + cp "pgb2file_${fhr3}_0p5" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.f${fhr3}" + cp "pgb2file_${fhr3}_1p0" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr3}" + ${WGRIB2} -s "pgb2file_${fhr3}_0p5" > "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.f${fhr3}.idx" + ${WGRIB2} -s "pgb2file_${fhr3}_1p0" > "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr3}.idx" if [ "$PGB1F" = 'YES' ]; then - cp pgbfile_${fhr3}_1p0 $COMOUT/${PREFIX}pgrb.1p00.f${fhr3} - $GRBINDEX $COMOUT/${PREFIX}pgrb.1p00.f${fhr3} $COMOUT/${PREFIX}pgrb.1p00.f${fhr3}.idx + cp "pgbfile_${fhr3}_1p0" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb.1p00.f${fhr3}" + ${GRBINDEX} "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb.1p00.f${fhr3}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb.1p00.f${fhr3}.idx" fi fi fi elif [ $nset = 2 ]; then if [ $fhr3 = anl ]; then - cp pgb2bfile_${fhr3}_0p25 $COMOUT/${PREFIX}pgrb2b.0p25.anl - $WGRIB2 -s pgb2bfile_${fhr3}_0p25 > $COMOUT/${PREFIX}pgrb2b.0p25.anl.idx + cp "pgb2bfile_${fhr3}_0p25" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.anl" + ${WGRIB2} -s "pgb2bfile_${fhr3}_0p25" > "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.anl.idx" if [ "$PGBS" = "YES" ]; then - cp pgb2bfile_${fhr3}_0p5 $COMOUT/${PREFIX}pgrb2b.0p50.anl - cp pgb2bfile_${fhr3}_1p0 $COMOUT/${PREFIX}pgrb2b.1p00.anl - $WGRIB2 -s pgb2bfile_${fhr3}_0p5 > $COMOUT/${PREFIX}pgrb2b.0p50.anl.idx - $WGRIB2 -s pgb2bfile_${fhr3}_1p0 > $COMOUT/${PREFIX}pgrb2b.1p00.anl.idx + cp "pgb2bfile_${fhr3}_0p5" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.anl" + cp "pgb2bfile_${fhr3}_1p0" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.anl" + ${WGRIB2} -s "pgb2bfile_${fhr3}_0p5" > "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.anl.idx" + ${WGRIB2} -s "pgb2bfile_${fhr3}_1p0" > "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.anl.idx" fi else - cp pgb2bfile_${fhr3}_0p25 
$COMOUT/${PREFIX}pgrb2b.0p25.f${fhr3} - $WGRIB2 -s pgb2bfile_${fhr3}_0p25 > $COMOUT/${PREFIX}pgrb2b.0p25.f${fhr3}.idx + cp "pgb2bfile_${fhr3}_0p25" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.f${fhr3}" + ${WGRIB2} -s "pgb2bfile_${fhr3}_0p25" > "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.f${fhr3}.idx" if [ "$PGBS" = "YES" ]; then - cp pgb2bfile_${fhr3}_0p5 $COMOUT/${PREFIX}pgrb2b.0p50.f${fhr3} - cp pgb2bfile_${fhr3}_1p0 $COMOUT/${PREFIX}pgrb2b.1p00.f${fhr3} - $WGRIB2 -s pgb2bfile_${fhr3}_0p5 > $COMOUT/${PREFIX}pgrb2b.0p50.f${fhr3}.idx - $WGRIB2 -s pgb2bfile_${fhr3}_1p0 > $COMOUT/${PREFIX}pgrb2b.1p00.f${fhr3}.idx + cp "pgb2bfile_${fhr3}_0p5" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.f${fhr3}" + cp "pgb2bfile_${fhr3}_1p0" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.f${fhr3}" + ${WGRIB2} -s "pgb2bfile_${fhr3}_0p5" > "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.f${fhr3}.idx" + ${WGRIB2} -s "pgb2bfile_${fhr3}_1p0" > "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.f${fhr3}.idx" fi fi fi diff --git a/ush/fv3gfs_dwn_nems.sh b/ush/fv3gfs_dwn_nems.sh index eb29445b363..fdc58c68ea6 100755 --- a/ush/fv3gfs_dwn_nems.sh +++ b/ush/fv3gfs_dwn_nems.sh @@ -18,9 +18,9 @@ export fhr3=$2 export iproc=$3 export nset=$4 -export CNVGRIB=${CNVGRIB:-$${NWPROD:-/nwprod}/util/exec/cnvgrib21} -export COPYGB2=${COPYGB2:-$${NWPROD:-/nwprod}/util/exec/copygb2} -export WGRIB2=${WGRIB2:-${NWPROD:-/nwprod}/util/exec/wgrib2} +export CNVGRIB=${CNVGRIB:-${grib_util_ROOT}/bin/cnvgrib} +export COPYGB2=${COPYGB2:-${grib_util_ROOT}/bin/copygb} +export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2} export TRIMRH=${TRIMRH:-$USHgfs/trim_rh.sh} export MODICEC=${MODICEC:-$USHgfs/mod_icec.sh} diff --git a/ush/fv3gfs_nc2nemsio.sh b/ush/fv3gfs_nc2nemsio.sh deleted file mode 100755 index 99eea9ce5f3..00000000000 --- a/ush/fv3gfs_nc2nemsio.sh +++ /dev/null @@ -1,73 +0,0 @@ -#! 
/usr/bin/env bash - -#---------------------------------------------------------------------------- -#--Fanglin Yang, October 2016: convert FV3 NetCDF files to NEMSIO format. -# Note FV3 lat-lon grid is located at the center of each grid box, -# starting from south to north and from east to west. -# For example, for a 0.5-deg uniform grid, nlon=720, nlat=360 -# X(1,1)=[0.25E,89.75S], X(nlon,nlat)=[359.75E,89.75N] -#--------------------------------------------------------------------------- - -source "$HOMEgfs/ush/preamble.sh" - -export CDATE=${CDATE:-"2016100300"} -export GG=${master_grid:-"0p25deg"} # 1deg 0p5deg 0p25deg 0p125deg -export FHZER=${FHZER:-6} # accumulation bucket in hours -export fdiag=${fdiag:-"none"} # specified forecast output hours - -pwd=$(pwd) -export DATA=${DATA:-$pwd} -export NWPROD=${NWPROD:-$pwd} -export HOMEgfs=${HOMEgfs:-$NWPROD} -export NC2NEMSIOEXE=${NC2NEMSIOEXE:-$HOMEgfs/exec/fv3nc2nemsio.x} - -cycn=$(echo $CDATE | cut -c 9-10) -export TCYC=${TCYC:-".t${cycn}z."} -export CDUMP=${CDUMP:-gfs} - -export PREFIX=${PREFIX:-${CDUMP}${TCYC}} -export SUFFIX=${SUFFIX:-".nemsio"} - -#-------------------------------------------------- -cd $DATA || exit 8 - -input_dir=$DATA -output_dir=$DATA - -in_3d=${PREFIX}nggps3d.${GG}.nc -in_2d=${PREFIX}nggps2d.${GG}.nc -if [ ! -s $in_3d -o ! -s $in_2d ]; then - echo "$in_3d and $in_2d are missing. exit" - exit 1 -fi - -#--check if the output is from non-hydrostatic case -nhrun=$(ncdump -c $in_3d | grep nhpres) -nhcase=$? 
- -# If no information on the time interval is given, deduce from the netCDF file -[[ $fdiag = "none" ]] && fdiag=$(ncks -H -s "%g " -C -v time $in_3d) - -#--------------------------------------------------- -nt=0 -err=0 -for fhour in $(echo $fdiag | sed "s/,/ /g"); do - nt=$((nt+1)) - ifhour=$(printf "%09d" $fhour) # convert to integer - fhzh=$(( (ifhour/FHZER-1)*FHZER )) # bucket accumulation starting hour - [[ $fhzh -lt 0 ]] && fhzh=0 - - fhr=$(printf "%03d" $fhour) - outfile=${PREFIX}atmf${fhr}${SUFFIX} - - $NC2NEMSIOEXE $CDATE $nt $fhzh $fhour $input_dir $in_2d $in_3d $output_dir $outfile $nhcase - rc=$? - ((err+=rc)) - - [[ ! -f $outfile ]] && ((err+=1)) - -done - -#--------------------------------------------------- - -exit $err diff --git a/ush/fv3gfs_regrid_nemsio.sh b/ush/fv3gfs_regrid_nemsio.sh deleted file mode 100755 index 7b92c27cde6..00000000000 --- a/ush/fv3gfs_regrid_nemsio.sh +++ /dev/null @@ -1,119 +0,0 @@ -#! /usr/bin/env bash - -################################################################################ -# UNIX Script Documentation Block -# Script name: fv3gfs_regrid_nemsio.sh -# Script description: Remap FV3 forecasts on six tile in NetCDF to global Gaussian -# grid with NEMSIO output -# -# $Id$ -# -# Author: Fanglin Yang Org: NCEP/EMC Date: 2016-12-01 -# Abstract: regrid_nemsio.fd provided by Jeffrey.S.Whitaker OAR/ESRL -# -# Script history log: -# 2016-12-01 Fanglin Yang -# 2017-02-13 Rahul Mahajan -# -# Attributes: -# Language: Portable Operating System Interface (POSIX) Shell -################################################################################ - -source "$HOMEgfs/ush/preamble.sh" - -#------------------------------------------------------- -# Directories and paths -pwd=$(pwd) -DATA=${DATA:-$pwd} -NWPROD=${NWPROD:-$pwd} -HOMEgfs=${HOMEgfs:-$NWPROD} -FIX_DIR=${FIX_DIR:-$HOMEgfs/fix} -FIX_AM=${FIX_AM:-$FIX_DIR/fix_am} -FIXfv3=${FIXfv3:-$FIX_DIR/fix_fv3_gmted2010} 
-REGRID_NEMSIO_EXEC=${REGRID_NEMSIO_EXEC:-$HOMEgfs/exec/regrid_nemsio} -REGRID_NEMSIO_TBL=${REGRID_NEMSIO_TBL:-$HOMEgfs/parm/parm_fv3diag/variable_table.txt} - -CDATE=${CDATE:-2017011500} -CDUMP=${CDUMP:-"gdas"} -CASE=${CASE:-C768} -LEVS=${LEVS:-65} -GG=${GG:-gaussian} # gaussian or regular lat-lon -res=$(echo $CASE | cut -c2-) -JCAP=${JCAP:-$((res*2-2))} -LATB=${LATB:-$((res*2))} -LONB=${LONB:-$((res*4))} - -NEMSIO_OUT2DNAME=${NEMSIO_OUT2DNAME:-sfc.$CDATE} -NEMSIO_OUT3DNAME=${NEMSIO_OUT3DNAME:-atm.$CDATE} -DEBUG=${REGRID_NEMSIO_DEBUG:-".true."} - -APRUN_REGRID_NEMSIO=${APRUN_REGRID_NEMSIO:-${APRUN:-""}} -NTHREADS_REGRID_NEMSIO=${NTHREADS_REGRID_NEMSIO:-${NTHREADS:-1}} - -NMV=${NMV:-"/bin/mv"} - -#------------------------------------------------------- -# IO specific parameters and error traps -ERRSCRIPT=${ERRSCRIPT:-'eval [[ $err = 0 ]]'} - -#-------------------------------------------------- -# ESMF regrid weights and output variable table -weight_bilinear=${weight_bilinear:-$FIXfv3/$CASE/fv3_SCRIP_${CASE}_GRIDSPEC_lon${LONB}_lat${LATB}.${GG}.bilinear.nc} -weight_neareststod=${weight_neareststod:-$FIXfv3/$CASE/fv3_SCRIP_${CASE}_GRIDSPEC_lon${LONB}_lat${LATB}.${GG}.neareststod.nc} - -#------------------------------------------------------- -# Go to the directory where the history files are -cd $DATA || exit 8 - -#------------------------------------------------------- -# Create namelist -rm -f regrid-nemsio.input - -cat > regrid-nemsio.input << EOF -&share - debug=$DEBUG, - ntrunc=$JCAP, - nlons=$LONB, - nlats=$LATB, - datapathout2d='$NEMSIO_OUT2DNAME', - datapathout3d='$NEMSIO_OUT3DNAME', - analysis_filename='fv3_history.tile1.nc','fv3_history.tile2.nc','fv3_history.tile3.nc','fv3_history.tile4.nc','fv3_history.tile5.nc','fv3_history.tile6.nc', - analysis_filename2d='fv3_history2d.tile1.nc','fv3_history2d.tile2.nc','fv3_history2d.tile3.nc','fv3_history2d.tile4.nc','fv3_history2d.tile5.nc','fv3_history2d.tile6.nc', - forecast_timestamp='${CDATE}', - 
variable_table='$REGRID_NEMSIO_TBL', - nemsio_opt3d='bin4', - nemsio_opt2d='bin4' -/ - -&interpio - esmf_bilinear_filename='$weight_bilinear', - esmf_neareststod_filename='$weight_neareststod', - gfs_hyblevs_filename='$FIX_AM/global_hyblev.l$LEVS.txt' -/ -EOF - -#------------------------------------------------------------------ -export OMP_NUM_THREADS=$NTHREADS_REGRID_NEMSIO -$APRUN_REGRID_NEMSIO $REGRID_NEMSIO_EXEC - -export ERR=$? -export err=$ERR -$ERRSCRIPT || exit $err - -rm -f regrid-nemsio.input - -#------------------------------------------------------------------ -PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) -PREFIX=${PREFIX:-"${CDUMP}.t${cyc}z."} -SUFFIX=${SUFFIX:-".nemsio"} -for ftype in atm sfc; do - for file in $(ls -1 ${ftype}.${CDATE}.fhr*); do - fhrchar=$(echo $file | cut -d. -f3 | cut -c4-) - $NMV $file ${PREFIX}${ftype}f${fhrchar}${SUFFIX} - done -done - -#------------------------------------------------------------------ - -exit $err diff --git a/ush/fv3gfs_remap.sh b/ush/fv3gfs_remap.sh index b1c3546d979..430e96c8682 100755 --- a/ush/fv3gfs_remap.sh +++ b/ush/fv3gfs_remap.sh @@ -13,10 +13,9 @@ export GG=${master_grid:-"0p25deg"} # 1deg 0p5deg 0p25deg 0p125deg pwd=$(pwd) export DATA=${DATA:-$pwd} -export NWPROD=${NWPROD:-$pwd} -export HOMEgfs=${HOMEgfs:-$NWPROD} +export HOMEgfs=${HOMEgfs:-$PACKAGEROOT} export FIX_DIR=${FIX_DIR:-$HOMEgfs/fix} -export FIXfv3=${FIXfv3:-$FIX_DIR/fix_fv3_gmted2010} +export FIXfv3=${FIXfv3:-$FIX_DIR/orog} export REMAPEXE=${REMAPEXE:-$HOMEgfs/exec/fregrid_parallel} export IPD4=${IPD4:-"YES"} diff --git a/ush/gaussian_sfcanl.sh b/ush/gaussian_sfcanl.sh index 147afd5497b..f8d2763bb58 100755 --- a/ush/gaussian_sfcanl.sh +++ b/ush/gaussian_sfcanl.sh @@ -28,9 +28,9 @@ # HOMEgfs Directory for gfs version. Default is # $BASEDIR/gfs_ver.v15.0.0} # FIXam Directory for the global fixed climatology files. 
-# Defaults to $HOMEgfs/fix/fix_am +# Defaults to $HOMEgfs/fix/am # FIXfv3 Directory for the model grid and orography netcdf -# files. Defaults to $HOMEgfs/fix/fix_fv3_gmted2010 +# files. Defaults to $HOMEgfs/fix/orog # FIXWGTS Weight file to use for interpolation # EXECgfs Directory of the program executable. Defaults to # $HOMEgfs/exec @@ -42,7 +42,7 @@ # defaults to current working directory # XC Suffix to add to executables. Defaults to none. # GAUSFCANLEXE Program executable. -# Defaults to $EXECgfs/gaussian_sfcanl.exe +# Defaults to $EXECgfs/gaussian_sfcanl.x # INISCRIPT Preprocessing script. Defaults to none. # LOGSCRIPT Log posting script. Defaults to none. # ERRSCRIPT Error processing script @@ -87,11 +87,11 @@ # $FIXWGTS # $FIXam/global_hyblev.l65.txt # -# input data : $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile*.nc +# input data : ${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile*.nc # # output data: $PGMOUT # $PGMERR -# $COMOUT/${APREFIX}sfcanl${ASUFFIX} +# $COMOUT/${APREFIX}sfcanl.nc # # Remarks: # @@ -121,27 +121,20 @@ LATB_SFC=${LATB_SFC:-$LATB_CASE} DONST=${DONST:-"NO"} LEVS=${LEVS:-64} LEVSP1=$(($LEVS+1)) -OUTPUT_FILE=${OUTPUT_FILE:-"nemsio"} -if [ $OUTPUT_FILE = "netcdf" ]; then - export NETCDF_OUT=".true." -else - export NETCDF_OUT=".false." -fi - +export NETCDF_OUT=".true." # Directories. -gfs_ver=${gfs_ver:-v15.0.0} -BASEDIR=${BASEDIR:-${NWROOT:-/nwprod2}} -HOMEgfs=${HOMEgfs:-$BASEDIR/gfs_ver.${gfs_ver}} +gfs_ver=${gfs_ver:-v16.3.0} +BASEDIR=${BASEDIR:-${PACKAGEROOT:-/lfs/h1/ops/prod/packages}} +HOMEgfs=${HOMEgfs:-$BASEDIR/gfs.${gfs_ver}} EXECgfs=${EXECgfs:-$HOMEgfs/exec} -FIXfv3=${FIXfv3:-$HOMEgfs/fix/fix_fv3_gmted2010} -FIXam=${FIXam:-$HOMEgfs/fix/fix_am} +FIXfv3=${FIXfv3:-$HOMEgfs/fix/orog} +FIXam=${FIXam:-$HOMEgfs/fix/am} FIXWGTS=${FIXWGTS:-$FIXfv3/$CASE/fv3_SCRIP_${CASE}_GRIDSPEC_lon${LONB_SFC}_lat${LATB_SFC}.gaussian.neareststod.nc} DATA=${DATA:-$(pwd)} -COMOUT=${COMOUT:-$(pwd)} # Filenames. 
XC=${XC:-} -GAUSFCANLEXE=${GAUSFCANLEXE:-$EXECgfs/gaussian_sfcanl.exe} +GAUSFCANLEXE=${GAUSFCANLEXE:-$EXECgfs/gaussian_sfcanl.x} SIGLEVEL=${SIGLEVEL:-$FIXam/global_hyblev.l${LEVSP1}.txt} CDATE=${CDATE:?} @@ -166,7 +159,8 @@ else mkdata=YES fi cd $DATA||exit 99 -[[ -d $COMOUT ]]||mkdir -p $COMOUT +[[ -d "${COM_ATMOS_ANALYSIS}" ]] || mkdir -p "${COM_ATMOS_ANALYSIS}" +[[ -d "${COM_ATMOS_RESTART}" ]] || mkdir -p "${COM_ATMOS_RESTART}" cd $DATA ################################################################################ @@ -175,12 +169,10 @@ export PGM=$GAUSFCANLEXE export pgm=$PGM $LOGSCRIPT -PDY=$(echo $CDATE | cut -c1-8) -cyc=$(echo $CDATE | cut -c9-10) -iy=$(echo $CDATE | cut -c1-4) -im=$(echo $CDATE | cut -c5-6) -id=$(echo $CDATE | cut -c7-8) -ih=$(echo $CDATE | cut -c9-10) +iy=${PDY:0:4} +im=${PDY:4:2} +id=${PDY:6:2} +ih=${cyc} export OMP_NUM_THREADS=${OMP_NUM_THREADS_SFC:-1} @@ -188,12 +180,12 @@ export OMP_NUM_THREADS=${OMP_NUM_THREADS_SFC:-1} $NLN $FIXWGTS ./weights.nc # input analysis tiles (with nst records) -$NLN $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile1.nc ./anal.tile1.nc -$NLN $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile2.nc ./anal.tile2.nc -$NLN $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile3.nc ./anal.tile3.nc -$NLN $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile4.nc ./anal.tile4.nc -$NLN $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile5.nc ./anal.tile5.nc -$NLN $COMOUT/RESTART/${PDY}.${cyc}0000.sfcanl_data.tile6.nc ./anal.tile6.nc +${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile1.nc" "./anal.tile1.nc" +${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile2.nc" "./anal.tile2.nc" +${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile3.nc" "./anal.tile3.nc" +${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile4.nc" "./anal.tile4.nc" +${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile5.nc" "./anal.tile5.nc" +${NLN} 
"${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile6.nc" "./anal.tile6.nc" # input orography tiles $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile1.nc ./orog.tile1.nc @@ -206,7 +198,7 @@ $NLN $FIXfv3/$CASE/${CASE}_oro_data.tile6.nc ./orog.tile6.nc $NLN $SIGLEVEL ./vcoord.txt # output gaussian global surface analysis files -$NLN $COMOUT/${APREFIX}sfcanl${ASUFFIX} ./sfc.gaussian.analysis.file +${NLN} "${COM_ATMOS_ANALYSIS}/${APREFIX}sfcanl.nc" "./sfc.gaussian.analysis.file" # Executable namelist cat < fort.41 diff --git a/ush/gfs_bfr2gpk.sh b/ush/gfs_bfr2gpk.sh index c11ec62735f..add68536ecc 100755 --- a/ush/gfs_bfr2gpk.sh +++ b/ush/gfs_bfr2gpk.sh @@ -10,7 +10,7 @@ # Log: # # K. Brill/HPC 04/12/05 # ######################################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" # Set GEMPAK paths. @@ -18,32 +18,19 @@ source "$HOMEgfs/ush/preamble.sh" # Go to a working directory. -cd $DATA - -# Set input directory name. - -#BPATH=$COMIN/bufr.t${cyc}z -BPATH=$COMOUT/bufr.t${cyc}z -export BPATH +cd "${DATA}" || exit 2 # Set output directory: - -COMAWP=${COMAWP:-$COMOUT/gempak} -OUTDIR=$COMAWP -if [ ! -d $OUTDIR ]; then mkdir -p $OUTDIR; fi +if [[ ! -d "${COM_ATMOS_GEMPAK}" ]]; then mkdir -p "${COM_ATMOS_GEMPAK}"; fi outfilbase=gfs_${PDY}${cyc} # Get the list of individual station files. 
date -##filelist=$(/bin/ls -1 $BPATH | grep bufr) -##rm -f bufr.combined -##for file in $filelist; do -## cat $BPATH/$file >> bufr.combined -##done - cat $BPATH/bufr.*.${PDY}${cyc} > bufr.combined +cat "${COM_ATMOS_BUFR}/bufr."*".${PDY}${cyc}" > bufr.combined date + namsnd << EOF > /dev/null SNBUFR = bufr.combined SNOUTF = ${outfilbase}.snd @@ -55,20 +42,20 @@ r ex EOF + date -/bin/rm *.nts +/bin/rm ./*.nts snd=${outfilbase}.snd sfc=${outfilbase}.sfc -cp $snd $OUTDIR/.$snd -cp $sfc $OUTDIR/.$sfc -mv $OUTDIR/.$snd $OUTDIR/$snd -mv $OUTDIR/.$sfc $OUTDIR/$sfc - -if [ $SENDDBN = "YES" ] -then - $DBNROOT/bin/dbn_alert MODEL GFS_PTYP_SFC $job $OUTDIR/$sfc - $DBNROOT/bin/dbn_alert MODEL GFS_PTYP_SND $job $OUTDIR/$snd +cp "${snd}" "${COM_ATMOS_GEMPAK}/.${snd}" +cp "${sfc}" "${COM_ATMOS_GEMPAK}/.${sfc}" +mv "${COM_ATMOS_GEMPAK}/.${snd}" "${COM_ATMOS_GEMPAK}/${snd}" +mv "${COM_ATMOS_GEMPAK}/.${sfc}" "${COM_ATMOS_GEMPAK}/${sfc}" + +if [[ ${SENDDBN} == "YES" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PTYP_SFC "${job}" "${COM_ATMOS_GEMPAK}/${sfc}" + "${DBNROOT}/bin/dbn_alert" MODEL GFS_PTYP_SND "${job}" "${COM_ATMOS_GEMPAK}/${snd}" fi -echo done > $DATA/gembufr.done +echo "done" > "${DATA}/gembufr.done" diff --git a/ush/gfs_bufr.sh b/ush/gfs_bufr.sh index 07bebd5ac0a..b782c707c90 100755 --- a/ush/gfs_bufr.sh +++ b/ush/gfs_bufr.sh @@ -19,30 +19,18 @@ # 2019-10-10 Guang Ping Lou: Read in NetCDF files # echo "History: February 2003 - First implementation of this utility script" # -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs:?}/ush/preamble.sh" -if test "$F00FLAG" = "YES" -then +if [[ "${F00FLAG}" == "YES" ]]; then f00flag=".true." else f00flag=".false." fi -hh=$FSTART -while test $hh -le $FEND -do - hh=$( expr $hh + $FINT ) - if test $hh -lt 10 - then - hh=0$hh - fi -done - -export pgm=gfs_bufr +export pgm="gfs_bufr.x" #. prep_step -if test "$MAKEBUFR" = "YES" -then +if [[ "${MAKEBUFR}" == "YES" ]]; then bufrflag=".true." else bufrflag=".false." 
@@ -51,67 +39,55 @@ fi ##fformat="nc" ##fformat="nemsio" - CLASS="class1fv3" +CLASS="class1fv3" cat << EOF > gfsparm &NAMMET - levs=$LEVS,makebufr=$bufrflag, - dird="$COMOUT/bufr.${cycle}/bufr", - nstart=$FSTART,nend=$FEND,nint=$FINT, - nend1=$NEND1,nint1=$NINT1,nint3=$NINT3, - nsfc=80,f00=$f00flag,fformat=$fformat,np1=0 + levs=${LEVS},makebufr=${bufrflag}, + dird="${COM_ATMOS_BUFR}/bufr", + nstart=${FSTART},nend=${FEND},nint=${FINT}, + nend1=${NEND1},nint1=${NINT1},nint3=${NINT3}, + nsfc=80,f00=${f00flag},fformat=${fformat},np1=0 / EOF -hh=$FSTART - if test $hh -lt 100 - then - hh1=$(echo "${hh#"${hh%??}"}") - hh=$hh1 - fi -while test $hh -le $FEND -do - if test $hh -lt 100 - then - hh2=0$hh - else - hh2=$hh - fi +for (( hr = 10#${FSTART}; hr <= 10#${FEND}; hr = hr + 10#${FINT} )); do + hh2=$(printf %02i "${hr}") + hh3=$(printf %03i "${hr}") -#--------------------------------------------------------- -# Make sure all files are available: + #--------------------------------------------------------- + # Make sure all files are available: ic=0 - while [ $ic -lt 1000 ] - do - if [ ! -f $COMIN/${RUN}.${cycle}.logf${hh2}.${logfm} ] - then + while (( ic < 1000 )); do + if [[ ! 
-f "${COM_ATMOS_HISTORY}/${RUN}.${cycle}.logf${hh3}.${logfm}" ]]; then sleep 10 - ic=$(expr $ic + 1) + ic=$((ic + 1)) else break fi - if [ $ic -ge 360 ] - then - err_exit "COULD NOT LOCATE logf${hh2} file AFTER 1 HOUR" + if (( ic >= 360 )); then + echo "FATAL: COULD NOT LOCATE logf${hh3} file AFTER 1 HOUR" + exit 2 fi done -#------------------------------------------------------------------ - ln -sf $COMIN/${RUN}.${cycle}.atmf${hh2}.${atmfm} sigf${hh} - ln -sf $COMIN/${RUN}.${cycle}.sfcf${hh2}.${atmfm} flxf${hh} - - hh=$( expr $hh + $FINT ) - if test $hh -lt 10 - then - hh=0$hh - fi -done + #------------------------------------------------------------------ + ln -sf "${COM_ATMOS_HISTORY}/${RUN}.${cycle}.atmf${hh3}.${atmfm}" "sigf${hh2}" + ln -sf "${COM_ATMOS_HISTORY}/${RUN}.${cycle}.sfcf${hh3}.${atmfm}" "flxf${hh2}" +done # define input BUFR table file. -ln -sf $PARMbufrsnd/bufr_gfs_${CLASS}.tbl fort.1 -ln -sf ${STNLIST:-$PARMbufrsnd/bufr_stalist.meteo.gfs} fort.8 -ln -sf $PARMbufrsnd/bufr_ij13km.txt fort.7 +ln -sf "${PARMbufrsnd}/bufr_gfs_${CLASS}.tbl" fort.1 +ln -sf "${STNLIST:-${PARMbufrsnd}/bufr_stalist.meteo.gfs}" fort.8 +ln -sf "${PARMbufrsnd}/bufr_ij13km.txt" fort.7 -${APRUN_POSTSND} $EXECbufrsnd/gfs_bufr < gfsparm > out_gfs_bufr_$FEND +${APRUN_POSTSND} "${EXECbufrsnd}/${pgm}" < gfsparm > "out_gfs_bufr_${FEND}" export err=$? +if [ $err -ne 0 ]; then + echo "GFS postsnd job error, Please check files " + echo "${COM_ATMOS_HISTORY}/${RUN}.${cycle}.atmf${hh2}.${atmfm}" + echo "${COM_ATMOS_HISTORY}/${RUN}.${cycle}.sfcf${hh2}.${atmfm}" + err_chk +fi + exit ${err} diff --git a/ush/gfs_bufr_netcdf.sh b/ush/gfs_bufr_netcdf.sh index 30d7631da32..b358c6b69af 100755 --- a/ush/gfs_bufr_netcdf.sh +++ b/ush/gfs_bufr_netcdf.sh @@ -38,7 +38,7 @@ do fi done -export pgm=gfs_bufr +export pgm="gfs_bufr.x" #. prep_step if test "$MAKEBUFR" = "YES" @@ -48,10 +48,8 @@ else bufrflag=".false." 
fi -fformat="nc" - - SFCF="sfc" - CLASS="class1fv3" +SFCF="sfc" +CLASS="class1fv3" cat << EOF > gfsparm &NAMMET levs=$LEVS,makebufr=$bufrflag, @@ -82,7 +80,7 @@ do ic=0 while [ $ic -lt 1000 ] do - if [ ! -f $COMIN/${RUN}.${cycle}.logf${hh2}.${fformat} ] + if [ ! -f $COMIN/${RUN}.${cycle}.logf${hh2}.txt ] then sleep 10 ic=$(expr $ic + 1) @@ -96,8 +94,8 @@ do fi done #------------------------------------------------------------------ - ln -sf $COMIN/${RUN}.${cycle}.atmf${hh2}.${fformat} sigf${hh} - ln -sf $COMIN/${RUN}.${cycle}.${SFCF}f${hh2}.${fformat} flxf${hh} + ln -sf $COMIN/${RUN}.${cycle}.atmf${hh2}.nc sigf${hh} + ln -sf $COMIN/${RUN}.${cycle}.${SFCF}f${hh2}.nc flxf${hh} hh=$( expr $hh + $FINT ) if test $hh -lt 10 @@ -111,7 +109,7 @@ ln -sf $PARMbufrsnd/bufr_gfs_${CLASS}.tbl fort.1 ln -sf ${STNLIST:-$PARMbufrsnd/bufr_stalist.meteo.gfs} fort.8 ln -sf $PARMbufrsnd/bufr_ij13km.txt fort.7 -${APRUN_POSTSND} $EXECbufrsnd/gfs_bufr < gfsparm > out_gfs_bufr_$FEND +${APRUN_POSTSND} "${EXECbufrsnd}/${pgm}" < gfsparm > "out_gfs_bufr_${FEND}" export err=$? exit ${err} diff --git a/ush/gfs_post.sh b/ush/gfs_post.sh new file mode 100755 index 00000000000..01161acf52d --- /dev/null +++ b/ush/gfs_post.sh @@ -0,0 +1,416 @@ +#! /usr/bin/env bash + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: gfs_post.sh +# Script description: Posts the global pressure GRIB file +# +# Author: Mark Iredell Org: NP23 Date: 1999-05-01 +# +# Abstract: This script reads a single global GFS IO file and (optionally) +# a global flux file and creates a global pressure GRIB file. +# The resolution and generating code of the output GRIB file can also +# be set in the argument list. 
+# +# Script history log: +# 1999-05-01 Mark Iredell +# 2007-04-04 Huiya Chuang: Modify the script to run unified post +# 2012-06-04 Jun Wang: add grib2 option +# 2015-03-20 Lin Gan: add Perl for Post XML performance upgrade +# 2016-02-08 Lin Gan: Modify to use Vertical Structure +# 2018-02-05 Wen Meng: For EE2 standard, create gfs_post.sh based +# global_post.sh and change EXECglobal to EXECgfs; +# Remove legacy setting for reading non-nemsio model output +# and generating grib1 data +# 2019-06-02 Wen Meng: Remove the links of gfs fix files. +# 2021-06-11 Yali Mao: Instead of err_chk, 'exit $err' for wafsfile +# if POSTGPEXEC fails +# +# Usage: global_postgp.sh SIGINP FLXINP FLXIOUT PGBOUT PGIOUT IGEN +# +# Input script positional parameters: +# 1 Input sigma file +# defaults to $SIGINP +# 2 Input flux file +# defaults to $FLXINP +# 3 Output flux index file +# defaults to $FLXIOUT +# 4 Output pressure GRIB file +# defaults to $PGBOUT +# 5 Output pressure GRIB index file +# defaults to $PGIOUT, then to none +# 8 Model generating code, +# defaults to $IGEN, then to input sigma generating code +# +# Imported Shell Variables: +# SIGINP Input sigma file +# overridden by $1 +# FLXINP Input flux file +# overridden by $2 +# FLXIOUT Output flux index file +# overridden by $3 +# PGBOUT Output pressure GRIB file +# overridden by $4. 
If not defined, +# post will use the filename specified in +# the control file +# PGIOUT Output pressure GRIB index file +# overridden by $5; defaults to none +# IGEN Model generating code +# overridden by $8; defaults to input sigma generating code +##### Moorthi: Add new imported shell variable for running chgres +# CHGRESSH optional: the script to run chgres +# default to to ${USHglobal}/global_chgres.sh +# SIGLEVEL optional: the coordinate text file +# default to to /nwprod/fix/global_hyblev.l${LEVS}.txt +##### Chuang: Add new imported Shell Variable for post +# OUTTYP Output file type read in by post +# 1: if user has a sigma file and needs post to run chgres to convert to gfs io file +# 2: if user already has a gfs io file +# 3: if user uses post to read sigma file directly +# 0: if user wishes to generate both gfsio and sigma files +# 4: if user uses post to read nemsio file directly +# VDATE Verifying date 10 digits yyyymmddhh +# GFSOUT Optional, output file name from chgres which is input file name to post +# if model already runs gfs io, make sure GFSOUT is linked to the gfsio file +# CTLFILE Optional, Your version of control file if not using operational one +# OVERPARMEXEC Optional, the executable for changing Grib KPDS ID +# default to to ${EXECglobal}/overparm_grib +# CHGRESTHREAD Optional, speed up chgres by using multiple threads +# default to 1 +# FILTER Optional, set to 1 to filter SLP and 500 mb height using copygb +# D3DINP Optional, Inout D3D file, if not defined, post will run +# without processing D3D file +# D3DOUT Optional, output D3D file, if not defined, post will +# use the file name specified in the control file +# IPVOUT Optional, output IPV file, if not defined, post will +# use the file name specified in the control file +# GENPSICHI Optional, set to YES will generate psi and chi and +# append it to the end of PGBOUT. Default to NO +# GENPSICHIEXE Optional, specify where executable is for generating +# psi and chi. 
+######################################################################## +# EXECUTIL Directory for utility executables +# defaults to /nwprod/util/exec +# USHUTIL Directory for utility scripts +# defaults to /nwprod/util/ush +# EXECglobal Directory for global executables +# defaults to /nwprod/exec +# USHglobal Directory for global scripts +# defaults to /nwprod/ush +# DATA working directory +# (if nonexistent will be made, used and deleted) +# defaults to current working directory +# MP Multi-processing type ("p" or "s") +# defaults to "p", or "s" if LOADL_STEP_TYPE is not PARALLEL +# XC Suffix to add to executables +# defaults to none +# POSTGPEXEC Global post executable +# defaults to ${EXECglobal}/upp.x +# GRBINDEX GRIB index maker +# defaults to ${EXECUTIL}/grbindex$XC +# POSTGPLIST File containing further namelist inputs +# defaults to /dev/null +# INISCRIPT Preprocessing script +# defaults to none +# LOGSCRIPT Log posting script +# defaults to none +# ERRSCRIPT Error processing script +# defaults to 'eval [[ $err = 0 ]]' +# ENDSCRIPT Postprocessing script +# defaults to none +# POSTGPVARS Other namelist inputs to the global post executable +# such as IDRT,KO,PO,KTT,KT,PT,KZZ,ZZ, +# NCPUS,MXBIT,IDS,POB,POT,MOO,MOOA,MOW,MOWA, +# ICEN,ICEN2,IENST,IENSI +# defaults to none set +# NTHREADS Number of threads +# defaults to 1 +# NTHSTACK Size of stack per thread +# defaults to 64000000 +# VERBOSE Verbose flag (YES or NO) +# defaults to NO +# PGMOUT Executable standard output +# defaults to $pgmout, then to '&1' +# PGMERR Executable standard error +# defaults to $pgmerr, then to '&1' +# pgmout Executable standard output default +# pgmerr Executable standard error default +# REDOUT standard output redirect ('1>' or '1>>') +# defaults to '1>', or to '1>>' to append if $PGMOUT is a file +# REDERR standard error redirect ('2>' or '2>>') +# defaults to '2>', or to '2>>' to append if $PGMERR is a file +# +# Exported Shell Variables: +# PGM Current program name +# pgm +# 
ERR Last return code +# err +# +# Modules and files referenced: +# scripts : $INISCRIPT +# $LOGSCRIPT +# $ERRSCRIPT +# $ENDSCRIPT +# +# programs : $POSTGPEXEC +# $GRBINDEX +# +# input data : $1 or $SIGINP +# $2 or $SFCINP +# $POSTGPLIST +# +# output data: $3 or $FLXIOUT +# $4 or $PGBOUT +# $5 or $PGIOUT +# $PGMOUT +# $PGMERR +# +# scratch : ${DATA}/postgp.inp.sig +# ${DATA}/postgp.inp.flx +# ${DATA}/postgp.out.pgb +# +# Remarks: +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +# Control variable resolution priority +# 1 Command line argument. +# 2 Environment variable. +# 3 Inline default. +# +# Attributes: +# Language: POSIX shell +# Machine: IBM SP +# +#### +################################################################################ +# Set environment. +source "${HOMEgfs}/ush/preamble.sh" + +# Command line arguments. +export SIGINP=${1:-${SIGINP:-}} +export FLXINP=${2:-${FLXINP:-}} +export FLXIOUT=${3:-${FLXIOUT:-}} +export PGBOUT=${4:-${PGBOUT:-}} +#export PGIOUT=${5:-${PGIOUT}} +export PGIOUT=${PGIOUT:-pgb.idx} +export IO=${6:-${IO:-0}} +export JO=${7:-${JO:-0}} +export IGEN=${8:-${IGEN:-0}} +# Directories. +export NWPROD=${NWPROD:-/nwprod} +#export EXECUTIL=${EXECUTIL:-${NWPROD}/util/exec} +export USHUTIL=${USHUTIL:-${NWPROD}/util/ush} +export EXECgfs=${EXECgfs:-${NWPROD}/exec} +export USHgfs=${USHgfs:-${NWPROD}/ush} +export DATA=${DATA:-$(pwd)} +# Filenames. 
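The resolution order documented above (1: command-line argument, 2: environment variable, 3: inline default) is implemented with nested parameter expansions such as `export PGIOUT=${PGIOUT:-pgb.idx}` and `export SIGINP=${1:-${SIGINP:-}}`. A minimal standalone sketch of that pattern; the `demo_resolve` helper and the filenames are illustrative only, not part of the script:

```shell
# Sketch of the control-variable resolution priority:
# positional argument > environment variable > inline default.
demo_resolve() {
  # ${1:-${PGIOUT:-pgb.idx}} takes the function's $1 if set and non-empty,
  # otherwise $PGIOUT, otherwise the literal default.
  echo "${1:-${PGIOUT:-pgb.idx}}"
}

unset PGIOUT
demo_resolve            # -> pgb.idx (inline default)
PGIOUT=env.idx
demo_resolve            # -> env.idx (environment variable)
demo_resolve arg.idx    # -> arg.idx (positional argument wins)
```

Note that `:-` (rather than `-`) also treats an empty string as unset, which is why an empty exported variable still falls through to the default.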
+export MP=${MP:-$([[ ${LOADL_STEP_TYPE:-SERIAL} = PARALLEL ]]&&echo "p"||echo "s")} +export XC=${XC:-} +export POSTGPEXEC=${POSTGPEXEC:-${EXECgfs}/upp.x} +export OVERPARMEXEC=${OVERPARMEXEC:-${EXECgfs}/overparm_grib} +export POSTGPLIST=${POSTGPLIST:-/dev/null} +export INISCRIPT=${INISCRIPT:-} +# Ignore warning about single quote not substituting now +# shellcheck disable=SC2016 +export ERRSCRIPT=${ERRSCRIPT:-'eval (( err == 0 ))'} +# shellcheck disable= +export LOGSCRIPT=${LOGSCRIPT:-} +export ENDSCRIPT=${ENDSCRIPT:-} +export GFSOUT=${GFSOUT:-gfsout} +export CTLFILE=${CTLFILE:-${NWPROD}/parm/gfs_cntrl.parm} +export GRIBVERSION=${GRIBVERSION:-'grib1'} +# Other variables. +export POSTGPVARS=${POSTGPVARS} +export NTHREADS=${NTHREADS:-1} +export NTHSTACK=${NTHSTACK:-64000000} +export PGMOUT=${PGMOUT:-${pgmout:-'&1'}} +export PGMERR=${PGMERR:-${pgmerr:-'&2'}} +export CHGRESTHREAD=${CHGRESTHREAD:-1} +export FILTER=${FILTER:-0} +export GENPSICHI=${GENPSICHI:-NO} +export GENPSICHIEXE=${GENPSICHIEXE:-${EXECgfs}/genpsiandchi} +export ens=${ens:-NO} +#export D3DINP=${D3DINP:-/dev/null} +l="$(echo "${PGMOUT}" | xargs | cut -c1)" +[[ ${l} = '&' ]]&&a=''||a='>' +export REDOUT=${REDOUT:-'1>'${a}} +l="$(echo "${PGMERR}" | xargs | cut -c1)" +[[ ${l} = '&' ]]&&a=''||a='>' +export REDERR=${REDERR:-'2>'${a}} +################################################################################ + +# Chuang: Run chgres if OUTTYP=1 or 0 + +export APRUN=${APRUNP:-${APRUN:-""}} + +# exit if NEMSINP does not exist +if (( OUTTYP == 4 )) ; then + if [ ! 
-s "${FLXINP}" ] ; then + echo "model files not found, exiting" + exit 111 + fi +fi + +export SIGHDR=${SIGHDR:-${NWPROD}/exec/global_sighdr} +export IDRT=${IDRT:-4} + +# run post to read file if OUTTYP=4 +if (( OUTTYP == 4 )) ; then + export MODEL_OUT_FORM=${MODEL_OUT_FORM:-netcdfpara} + export GFSOUT=${NEMSINP} +fi + +# allow threads to use threading in Jim's sp lib +# but set default to 1 +export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1} + +pwd=$(pwd) +if [[ -d "${DATA}" ]]; then + mkdata=NO +else + mkdir -p "${DATA}" + mkdata=YES +fi +cd "${DATA}" || exit 99 +################################################################################ +# Post GRIB +export PGM=${POSTGPEXEC} +export pgm=${PGM} +${LOGSCRIPT} +cat <<-EOF >postgp.inp.nml$$ + &NAMPGB + ${POSTGPVARS} +EOF + +cat <<-EOF >>postgp.inp.nml$$ + / +EOF + +if [[ "${VERBOSE}" = "YES" ]]; then + cat postgp.inp.nml$$ +fi + +# making the time stamp format for ncep post +YY=$(echo "${VDATE}" | cut -c1-4) +MM=$(echo "${VDATE}" | cut -c5-6) +DD=$(echo "${VDATE}" | cut -c7-8) +HH=$(echo "${VDATE}" | cut -c9-10) +export YY MM DD HH + +cat > itag <<-EOF + &model_inputs + fileName='${GFSOUT}' + IOFORM='${MODEL_OUT_FORM}' + grib='${GRIBVERSION}' + DateStr='${YY}-${MM}-${DD}_${HH}:00:00' + MODELNAME='GFS' + fileNameFlux='${FLXINP}' + / +EOF + +cat postgp.inp.nml$$ >> itag + +cat itag + +rm -f fort.* + +#ln -sf $SIGINP postgp.inp.sig$$ +#ln -sf $FLXINP postgp.inp.flx$$ +#ln -sf $PGBOUT postgp.out.pgb$$ + +# change model generating Grib number +if [ "${GRIBVERSION}" = "grib2" ]; then + cp "${POSTGRB2TBL}" . 
+ cp "${PostFlatFile}" ./postxconfig-NT.txt + if [ "${ens}" = "YES" ] ; then + sed < "${PostFlatFile}" -e "s#negatively_pert_fcst#${ens_pert_type}#" > ./postxconfig-NT.txt + fi + # cp ${CTLFILE} postcntrl.xml +fi +CTL=$(basename "${CTLFILE}") +export CTL + +ln -sf griddef.out fort.110 +cp "${PARMpost}/nam_micro_lookup.dat" ./eta_micro_lookup.dat + +echo "gfs_post.sh OMP_NUM_THREADS= ${OMP_NUM_THREADS}" +${APRUN:-mpirun.lsf} "${POSTGPEXEC}" < itag > "outpost_gfs_${VDATE}_${CTL}" + +export ERR=$? +export err=${ERR} + +if (( err != 0 )) ; then + if [ "${PGBOUT}" = "wafsfile" ] ; then + exit "${err}" + fi +fi +${ERRSCRIPT} || exit 2 + +if [ "${FILTER}" = "1" ] ; then + # Filter SLP and 500 mb height using copygb, change GRIB ID, and then + # cat the filtered fields to the pressure GRIB file, from Iredell + + if [ "${GRIBVERSION}" = "grib2" ]; then + if [ "${ens}" = "YES" ] ; then + "${COPYGB2}" -x -i'4,0,80' -k'1 3 0 7*-9999 101 0 0' "${PGBOUT}" tfile + export err=$?; err_chk + else + "${COPYGB2}" -x -i'4,0,80' -k'0 3 0 7*-9999 101 0 0' "${PGBOUT}" tfile + export err=$?; err_chk + fi + ${WGRIB2} tfile -set_byte 4 11 1 -grib prmsl + export err=$?; err_chk + if [ "${ens}" = "YES" ] ; then + "${COPYGB2}" -x -i'4,1,5' -k'1 3 5 7*-9999 100 0 50000' "${PGBOUT}" tfile + export err=$?; err_chk + else + "${COPYGB2}" -x -i'4,1,5' -k'0 3 5 7*-9999 100 0 50000' "${PGBOUT}" tfile + export err=$?; err_chk + fi + ${WGRIB2} tfile -set_byte 4 11 193 -grib h5wav + export err=$?; err_chk + + #cat $PGBOUT prmsl h5wav >> $PGBOUT + #wm + # cat prmsl h5wav >> $PGBOUT + [[ -f prmsl ]] && rm prmsl + [[ -f h5wav ]] && rm h5wav + [[ -f tfile ]] && rm tfile + fi +fi + +################################################################################ +# Make GRIB index file +if [[ -n "${PGIOUT}" ]]; then + if [ "${GRIBVERSION}" = "grib2" ]; then + ${GRB2INDEX} "${PGBOUT}" "${PGIOUT}" + fi +fi +if [[ -r ${FLXINP} && -n ${FLXIOUT} && ${OUTTYP} -le 3 ]]; then + ${GRBINDEX} "${FLXINP}" "${FLXIOUT}" +fi 
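The script above builds the post's `DateStr` by slicing the 10-digit `VDATE` (yyyymmddhh) with four `echo | cut` pipelines. Since the script already runs under bash (it sources `preamble.sh` with a bash shebang), the same split can be done with substring parameter expansion and no subprocesses; a sketch with an illustrative `VDATE` value:

```shell
# Sketch: parse a 10-digit VDATE (yyyymmddhh) with bash substring
# expansion instead of echo|cut. The VDATE value here is illustrative.
VDATE=2023051800
YY=${VDATE:0:4}   # year  -> 2023
MM=${VDATE:4:2}   # month -> 05
DD=${VDATE:6:2}   # day   -> 18
HH=${VDATE:8:2}   # hour  -> 00

# Same DateStr format the itag namelist uses:
echo "${YY}-${MM}-${DD}_${HH}:00:00"   # -> 2023-05-18_00:00:00
```

`${var:offset:length}` is a bash/ksh extension, so this shortcut only applies because the script is not restricted to POSIX sh.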
+################################################################################ +# generate psi and chi +echo "GENPSICHI = ${GENPSICHI}" +if [ "${GENPSICHI}" = "YES" ] ; then + #echo "PGBOUT PGIOUT=" $PGBOUT $PGIOUT + #echo "YY MM=" $YY $MM + export psichifile=./psichi.grb + ${GENPSICHIEXE} < postgp.inp.nml$$ + rc=$? + if (( rc != 0 )); then + echo "Nonzero return code rc=${rc}" + exit 3 + fi + cat ./psichi.grb >> "${PGBOUT}" +fi +################################################################################ +# Postprocessing +cd "${pwd}" || exit 2 +[[ "${mkdata}" = "YES" ]] && rmdir "${DATA}" + +exit "${err}" diff --git a/ush/gfs_sndp.sh b/ush/gfs_sndp.sh index a0616e27b4a..579dd5ae25d 100755 --- a/ush/gfs_sndp.sh +++ b/ush/gfs_sndp.sh @@ -32,14 +32,12 @@ cd $DATA/$m for stn in $(cat $file_list) do - cp ${COMOUT}/bufr.${cycle}/bufr.$stn.$PDY$cyc $DATA/${m}/bufrin - export pgm=tocsbufr + cp "${COM_ATMOS_BUFR}/bufr.${stn}.${PDY}${cyc}" "${DATA}/${m}/bufrin" + export pgm=tocsbufr.x #. 
prep_step export FORT11=$DATA/${m}/bufrin export FORT51=./bufrout - # JY - Turn off the startmsg to reduce the update on jlogfile in this loop - # startmsg - $EXECbufrsnd/tocsbufr << EOF + ${EXECbufrsnd}/${pgm} << EOF &INPUT BULHED="$WMOHEAD",KWBX="$CCCC", NCEP2STD=.TRUE., @@ -47,12 +45,11 @@ cd $DATA/$m MAXFILESIZE=600000 / EOF - # JY export err=$?; err_chk - export err=$?; #err_chk - if [ $err -ne 0 ] - then - echo "ERROR in $pgm" + export err=$?; + if (( err != 0 )); then + echo "FATAL ERROR in ${pgm}" err_chk + exit 3 fi cat $DATA/${m}/bufrout >> $DATA/${m}/gfs_collective$m.fil @@ -60,13 +57,12 @@ EOF rm $DATA/${m}/bufrout done -# if test $SENDCOM = 'NO' - if test $SENDCOM = 'YES' - then - if [ $SENDDBN = 'YES' ] ; then - cp $DATA/${m}/gfs_collective$m.fil $pcom/gfs_collective$m.postsnd_$cyc - $DBNROOT/bin/dbn_alert NTC_LOW BUFR $job $pcom/gfs_collective$m.postsnd_$cyc + if [[ ${SENDCOM} == 'YES' ]]; then + if [[ ${SENDDBN} == 'YES' ]] ; then + cp "${DATA}/${m}/gfs_collective${m}.fil" "${COM_ATMOS_WMO}/gfs_collective${m}.postsnd_${cyc}" + "${DBNROOT}/bin/dbn_alert" NTC_LOW BUFR "${job}" \ + "${COM_ATMOS_WMO}/gfs_collective${m}.postsnd_${cyc}" fi - cp $DATA/${m}/gfs_collective$m.fil ${COMOUT}/bufr.${cycle}/. + cp "${DATA}/${m}/gfs_collective${m}.fil" "${COM_ATMOS_BUFR}/." fi diff --git a/ush/gfs_truncate_enkf.sh b/ush/gfs_truncate_enkf.sh index c7bdfad0c46..0a7d6fc0dd7 100755 --- a/ush/gfs_truncate_enkf.sh +++ b/ush/gfs_truncate_enkf.sh @@ -14,7 +14,7 @@ mkdir -p $DATATMP cd $DATATMP export LEVS=${LEVS_LORES:-64} -export FIXam=${FIXam:-$HOMEgfs/fix/fix_am} +export FIXam=${FIXam:-$HOMEgfs/fix/am} export CHGRESSH=${CHGRESSH:-${USHgfs}/global_chgres.sh} export CHGRESEXEC=${CHGRESEXEC-${EXECgfs}/global_chgres} diff --git a/ush/gldas_forcing.sh b/ush/gldas_forcing.sh new file mode 100755 index 00000000000..ca5562f459d --- /dev/null +++ b/ush/gldas_forcing.sh @@ -0,0 +1,118 @@ +#! 
/usr/bin/env bash +########################################################################### +# this script gets cpc daily precipitation and uses gdas hourly precipitation +# to disaggregate daily values into hourly values +########################################################################### + +source "${HOMEgfs:?}/ush/preamble.sh" + +bdate=$1 +edate=$2 + +# HOMEgldas - gldas directory +# EXECgldas - gldas exec directory +# PARMgldas - gldas param directory +# FIXgldas - gldas fix field directory +export LISDIR=${HOMEgldas:?} +export fpath=${RUNDIR:?}/force +export xpath=${RUNDIR:?}/force +export WGRIB=${WGRIB:?} +export COPYGB=${COPYGB:?} +export ERRSCRIPT=${ERRSCRIPT:-"eval [[ ${err} = 0 ]]"} + +#------------------------------- +#--- extract variables of each timestep and create forcing files +sdate=${bdate} +edate=$(sh "${FINDDATE:?}" "${edate}" d-1) +while [[ "${sdate}" -lt "${edate}" ]] ; do + + sdat0=$(sh "${FINDDATE:?}" "${sdate}" d-1) + [[ ! -d ${xpath}/cpc.${sdate} ]] && mkdir -p "${xpath}/cpc.${sdate}" + [[ ! -d ${xpath}/cpc.${sdat0} ]] && mkdir -p "${xpath}/cpc.${sdat0}" + + cd "${xpath}" || exit + rm -f fort.* grib.* + + COMPONENT=${COMPONENT:-"atmos"} + pathp1=${CPCGAUGE:?}/gdas.${sdate}/00/${COMPONENT} + pathp2=${DCOMIN:?}/${sdate}/wgrbbul/cpc_rcdas + cpc_precip="PRCP_CU_GAUGE_V1.0GLB_0.125deg.lnx.${sdate}.RT" + if [[ "${RUN_ENVIR:?}" = "emc" ]] && [[ "${sdate}" -gt "${bdate}" ]]; then + cpc_precip="PRCP_CU_GAUGE_V1.0GLB_0.125deg.lnx.${sdate}.RT_early" + fi + cpc=${pathp1}/${cpc_precip} + if [[ ! -s "${cpc}" ]]; then cpc=${pathp2}/${cpc_precip} ; fi + if [[ "${RUN_ENVIR:?}" = "nco" ]]; then cpc=${pathp2}/${cpc_precip} ; fi + if [[ ! -s "${cpc}" ]]; then + echo "WARNING: GLDAS MISSING ${cpc}, WILL NOT RUN." + exit 3 + fi + cp "${cpc}" "${xpath}/cpc.${sdate}/." 
+ + sflux=${fpath}/gdas.${sdat0}/gdas1.t12z.sfluxgrbf06 + prate=gdas.${sdat0}12 + ${WGRIB} -s "${sflux}" | grep "PRATE:sfc" | ${WGRIB} -i "${sflux}" -grib -o "${prate}" + + sflux=${fpath}/gdas.${sdat0}/gdas1.t18z.sfluxgrbf06 + prate=gdas.${sdat0}18 + ${WGRIB} -s "${sflux}" | grep "PRATE:sfc" | ${WGRIB} -i "${sflux}" -grib -o "${prate}" + + sflux=${fpath}/gdas.${sdate}/gdas1.t00z.sfluxgrbf06 + prate=gdas.${sdate}00 + ${WGRIB} -s "${sflux}" | grep "PRATE:sfc" | ${WGRIB} -i "${sflux}" -grib -o "${prate}" + + sflux=${fpath}/gdas.${sdate}/gdas1.t06z.sfluxgrbf06 + prate=gdas.${sdate}06 + ${WGRIB} -s "${sflux}" | grep "PRATE:sfc" | ${WGRIB} -i "${sflux}" -grib -o "${prate}" + + if [[ "${USE_CFP:?}" = "YES" ]] ; then + rm -f ./cfile + touch ./cfile + { + echo "${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas.${sdat0}12 grib.12" + echo "${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas.${sdat0}18 grib.18" + echo "${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas.${sdate}00 grib.00" + echo "${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas.${sdate}06 grib.06" + } >> ./cfile + ${APRUN_GLDAS_DATA_PROC:?} ./cfile + else + ${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas."${sdat0}"12 grib.12 + ${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas."${sdat0}"18 grib.18 + ${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas."${sdate}"00 grib.00 + ${COPYGB} -i3 '-g255 0 2881 1441 90000 0 128 -90000 360000 125 125' -x gdas."${sdate}"06 grib.06 + fi + + rm -f fort.10 + touch fort.10 + echo "${sdat0}" >> fort.10 + echo "${sdate}" >> fort.10 + + export pgm=gldas_forcing + # shellcheck disable=SC1091 + . 
prep_step + # shellcheck disable= + + ${WGRIB} -d -bin grib.12 -o fort.11 + ${WGRIB} -d -bin grib.18 -o fort.12 + ${WGRIB} -d -bin grib.00 -o fort.13 + ${WGRIB} -d -bin grib.06 -o fort.14 + + ln -fs "${xpath}/cpc.${sdate}/${cpc_precip}" fort.15 + + "${EXECgldas:?}/gldas_forcing" 1>&1 2>&2 + + export err=$? + ${ERRSCRIPT} || exit 3 + + cp fort.21 "${xpath}/cpc.${sdat0}/precip.gldas.${sdat0}12" + cp fort.22 "${xpath}/cpc.${sdat0}/precip.gldas.${sdat0}18" + cp fort.23 "${xpath}/cpc.${sdate}/precip.gldas.${sdate}00" + cp fort.24 "${xpath}/cpc.${sdate}/precip.gldas.${sdate}06" + + rm -f fort.* grib.* + + sdate=$(sh "${FINDDATE}" "${sdate}" d+1) +done +#------------------------------- + +exit "${err}" diff --git a/ush/gldas_get_data.sh b/ush/gldas_get_data.sh new file mode 100755 index 00000000000..34163091193 --- /dev/null +++ b/ush/gldas_get_data.sh @@ -0,0 +1,76 @@ +#! /usr/bin/env bash +######################################################### +# This script generates gldas forcing from gdas prod sflux +######################################################### + +source "${HOMEgfs:?}/ush/preamble.sh" + +bdate=$1 +edate=$2 + +if [[ "${USE_CFP:-"NO"}" = "YES" ]] ; then + touch ./cfile +fi + +### COMINgdas = prod gdas sflux grib2 +### RUNDIR = gldas forcing in grib2 format +### RUNDIR/force = gldas forcing in grib1 format +export COMPONENT=${COMPONENT:-atmos} +fpath=${RUNDIR:?} +gpath=${RUNDIR}/force +cycint=${assim_freq:-6} + +# get gdas flux files to force gldas. +# CPC precipitation is from 12z to 12z. One more day of gdas data is +# needed to disaggregate daily CPC precipitation values to hourly values +cdate=$(${NDATE:?} -12 "${bdate}") + +iter=0 + +#------------------------------- +while [[ "${cdate}" -lt "${edate}" ]]; do + + ymd=$(echo "${cdate}" |cut -c 1-8) + cyc=$(echo "${cdate}" |cut -c 9-10) + [[ ! -d ${fpath}/gdas.${ymd} ]] && mkdir -p "${fpath}/gdas.${ymd}" + [[ ! 
-d ${gpath}/gdas.${ymd} ]] && mkdir -p "${gpath}/gdas.${ymd}" + + f=1 + while [[ "${f}" -le "${cycint}" ]]; do + rflux=${COMINgdas:?}/gdas.${ymd}/${cyc}/${COMPONENT}/gdas.t${cyc}z.sfluxgrbf00${f}.grib2 + fflux=${fpath}/gdas.${ymd}/gdas.t${cyc}z.sfluxgrbf0${f}.grib2 + gflux=${gpath}/gdas.${ymd}/gdas1.t${cyc}z.sfluxgrbf0${f} + if [[ ! -s "${rflux}" ]];then + echo "WARNING: GLDAS MISSING ${rflux}, WILL NOT RUN." + exit 2 + fi + rm -f "${fflux}" "${gflux}" + touch "${fflux}" "${gflux}" + + fcsty=anl + if [[ "${f}" -ge 1 ]]; then fcsty=fcst; fi + + if [[ "${USE_CFP:-"NO"}" = "YES" ]] ; then + if [[ "${CFP_MP:-"NO"}" = "YES" ]]; then + echo "${iter} ${USHgldas:?}/gldas_process_data.sh ${iter} ${rflux} ${fcsty} ${fflux} ${gflux} ${f}" >> ./cfile + else + echo "${USHgldas:?}/gldas_process_data.sh ${iter} ${rflux} ${fcsty} ${fflux} ${gflux} ${f}" >> ./cfile + fi + else + "${USHgldas:?}/gldas_process_data.sh" "${iter}" "${rflux}" "${fcsty}" "${fflux}" "${gflux}" "${f}" + fi + + iter=$((iter+1)) + f=$((f+1)) + done + +#------------------------------- + cdate=$(${NDATE} +"${cycint}" "${cdate}") +done +#------------------------------- + +if [[ "${USE_CFP:-"NO"}" = "YES" ]] ; then + ${APRUN_GLDAS_DATA_PROC:?} ./cfile +fi + +exit $? diff --git a/ush/gldas_liscrd.sh b/ush/gldas_liscrd.sh new file mode 100755 index 00000000000..7c0f4460354 --- /dev/null +++ b/ush/gldas_liscrd.sh @@ -0,0 +1,46 @@ +#! /usr/bin/env bash + +source "${HOMEgfs:?}/ush/preamble.sh" + +if [[ $# -lt 3 ]]; then + echo usage "$0" yyyymmddhh1 yyyymmddhh2 126/382/574/1534 + exit $? 
+fi + +date1=$1 +date2=$2 +grid=$3 + +yyyy1=$(echo "${date1}" | cut -c 1-4) +mm1=$(echo "${date1}" | cut -c 5-6) +dd1=$(echo "${date1}" | cut -c 7-8) +hh1=$(echo "${date1}" | cut -c 9-10) +yyyy2=$(echo "${date2}" | cut -c 1-4) +mm2=$(echo "${date2}" | cut -c 5-6) +dd2=$(echo "${date2}" | cut -c 7-8) +hh2=$(echo "${date2}" | cut -c 9-10) + +PARM_LM=${PARMgldas:?} +LISCARD=lis.crd + +rm -f "${LISCARD}" +touch "${LISCARD}" +{ + cat "${PARM_LM}/lis.crd.T${grid}.tmp.1" + echo "LIS%t%SSS = 0 " + echo "LIS%t%SMN = 00 " + echo "LIS%t%SHR = ${hh1} " + echo "LIS%t%SDA = ${dd1} " + echo "LIS%t%SMO = ${mm1} " + echo "LIS%t%SYR = ${yyyy1}" + echo "LIS%t%ENDCODE = 1 " + echo "LIS%t%ESS = 0 " + echo "LIS%t%EMN = 00 " + echo "LIS%t%EHR = ${hh2} " + echo "LIS%t%EDA = ${dd2} " + echo "LIS%t%EMO = ${mm2} " + echo "LIS%t%EYR = ${yyyy2}" + cat "${PARM_LM}/lis.crd.T${grid}.tmp.2" +} >> "${LISCARD}" + +exit 0 diff --git a/ush/gldas_process_data.sh b/ush/gldas_process_data.sh new file mode 100755 index 00000000000..4770170a97d --- /dev/null +++ b/ush/gldas_process_data.sh @@ -0,0 +1,34 @@ +#! 
/usr/bin/env bash + +source "${HOMEgfs:?}/ush/preamble.sh" "$1" + +rflux=$2 +fcsty=$3 +fflux=$4 +gflux=$5 +f=$6 + +WGRIB2=${WGRIB2:?} +CNVGRIB=${CNVGRIB:?} + +${WGRIB2} "${rflux}" | grep "TMP:1 hybrid" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "SPFH:1 hybrid" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "UGRD:1 hybrid" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "VGRD:1 hybrid" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "HGT:1 hybrid" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "PRES:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "PRATE:surface" | grep ave | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "VEG:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "SFCR:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "SFEXC:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "TMP:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "WEASD:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "SNOD:surface" | grep "${fcsty}" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" + +${WGRIB2} "${rflux}" | grep "DSWRF:surface:${f} hour fcst" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "DLWRF:surface:${f} hour fcst" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" +${WGRIB2} "${rflux}" | grep "USWRF:surface:${f} hour fcst" | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}" + +${CNVGRIB} -g21 "${fflux}" "${gflux}" 
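Each field in `gldas_process_data.sh` above is extracted with its own `wgrib2` inventory-match pipeline, one line per variable. The repetition could be expressed as a loop over the match patterns; the sketch below is a dry run that only prints what would be extracted (the `rflux`/`fflux` names are placeholders, and the real `wgrib2` invocation from the script is kept in a comment so nothing here requires `wgrib2` to be installed):

```shell
# Dry-run sketch: iterate over the wgrib2 inventory patterns used above
# instead of writing one pipeline per field. File names are placeholders.
rflux=gdas.t00z.sfluxgrbf001.grib2
fflux=forcing.grib2
fcsty=fcst

n_extracted=0
for var in "TMP:1 hybrid" "SPFH:1 hybrid" "UGRD:1 hybrid" "VGRD:1 hybrid" \
           "HGT:1 hybrid" "PRES:surface" "TMP:surface" "WEASD:surface" \
           "SNOD:surface"; do
  # Real invocation, as in the script:
  #   ${WGRIB2} "${rflux}" | grep "${var}" | grep "${fcsty}" \
  #     | ${WGRIB2} -i "${rflux}" -append -grib "${fflux}"
  echo "would extract '${var}' (${fcsty}) from ${rflux} into ${fflux}"
  n_extracted=$((n_extracted + 1))
done
echo "${n_extracted} fields selected"
```

The explicit one-line-per-field form in the script has the advantage that fields needing a different second filter (e.g. `PRATE:surface` matched against `ave`, or the radiation fluxes matched by forecast hour) stand out, which is likely why it was written that way.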
+ +exit $? diff --git a/ush/global_extrkr.sh b/ush/global_extrkr.sh deleted file mode 100755 index ad0b249b28a..00000000000 --- a/ush/global_extrkr.sh +++ /dev/null @@ -1,1697 +0,0 @@ -#! /usr/bin/env bash - -source "$HOMEgfs/ush/preamble.sh" - -userid=$LOGNAME - -############################################################################## -# cat<${DATA}/tmpsynvit.${atcfout}.${PDY}${CYL} - grep "${current_str}" ${synvitdir}/${synvitfile} \ - >>${DATA}/tmpsynvit.${atcfout}.${PDY}${CYL} - grep "${future_str}" ${synvitfuture_dir}/${synvitfuture_file} \ - >>${DATA}/tmpsynvit.${atcfout}.${PDY}${CYL} -else - set +x - echo " " - echo " There is no (synthetic) TC vitals file for ${CYL}z in ${synvitdir}," - echo " nor is there a TC vitals file for ${old_hh}z in ${synvitold_dir}." - echo " nor is there a TC vitals file for ${future_hh}z in ${synvitfuture_dir}," - echo " Checking the raw TC Vitals file ....." - echo " " - ${TRACE_ON:-set -x} -fi - -# Take the vitals from Steve Lord's /com/gfs/prod tcvitals file, -# and cat them with the NHC-only vitals from the raw, original -# /com/arch/prod/synda_tcvitals file. Do this because the nwprod -# tcvitals file is the original tcvitals file, and Steve runs a -# program that ignores the vitals for a storm that's over land or -# even just too close to land, and for tracking purposes for the -# US regional models, we need these locations. Only include these -# "inland" storm vitals for NHC (we're not going to track inland -# storms that are outside of NHC's domain of responsibility -- we -# don't need that info). -# UPDATE 5/12/98 MARCHOK: awk logic is added to screen NHC -# vitals such as "89E TEST", since TPC -# does not want tracks for such storms. 
- -grep "${old_str}" ${archsyndir}/syndat_tcvitals.${CENT}${syy} | \ - grep -v TEST | awk 'substr($0,6,1) !~ /8/ {print $0}' \ - >${DATA}/tmprawvit.${atcfout}.${PDY}${CYL} -grep "${current_str}" ${archsyndir}/syndat_tcvitals.${CENT}${syy} | \ - grep -v TEST | awk 'substr($0,6,1) !~ /8/ {print $0}' \ - >>${DATA}/tmprawvit.${atcfout}.${PDY}${CYL} -grep "${future_str}" ${archsyndir}/syndat_tcvitals.${CENT}${syy} | \ - grep -v TEST | awk 'substr($0,6,1) !~ /8/ {print $0}' \ - >>${DATA}/tmprawvit.${atcfout}.${PDY}${CYL} - - -# IMPORTANT: When "cat-ing" these files, make sure that the vitals -# files from the "raw" TC vitals files are first in order and Steve's -# TC vitals files second. This is because Steve's vitals file has -# been error-checked, so if we have a duplicate tc vitals record in -# these 2 files (very likely), program supvit.x below will -# only take the last vitals record listed for a particular storm in -# the vitals file (all previous duplicates are ignored, and Steve's -# error-checked vitals records are kept). - -cat ${DATA}/tmprawvit.${atcfout}.${PDY}${CYL} ${DATA}/tmpsynvit.${atcfout}.${PDY}${CYL} \ - >${DATA}/vitals.${atcfout}.${PDY}${CYL} - -#--------------------------------------------------------------# -# Now run a fortran program that will read all the TC vitals -# records for the current dtg and the dtg from 6h ago, and -# sort out any duplicates. If the program finds a storm that -# was included in the vitals file 6h ago but not for the current -# dtg, this program updates the 6h-old first guess position -# and puts these updated records as well as the records from -# the current dtg into a temporary vitals file. It is this -# temporary vitals file that is then used as the input for the -# tracking program. 
-#--------------------------------------------------------------# - -oldymdh=$( ${NDATE:?} -${vit_incr} ${PDY}${CYL}) -oldyy=${oldymdh:2:2} -oldmm=${oldymdh:4:2} -olddd=${oldymdh:6:2} -oldhh=${oldymdh:8:2} -oldymd=${oldyy}${oldmm}${olddd} - -futureymdh=$( ${NDATE:?} 6 ${PDY}${CYL}) -futureyy=${futureymdh:2:2} -futuremm=${futureymdh:4:2} -futuredd=${futureymdh:6:2} -futurehh=${futureymdh:8:2} -futureymd=${futureyy}${futuremm}${futuredd} - -cat<${DATA}/suv_input.${atcfout}.${PDY}${CYL} -&datenowin dnow%yy=${syy}, dnow%mm=${smm}, - dnow%dd=${sdd}, dnow%hh=${CYL}/ -&dateoldin dold%yy=${oldyy}, dold%mm=${oldmm}, - dold%dd=${olddd}, dold%hh=${oldhh}/ -&datefuturein dfuture%yy=${futureyy}, dfuture%mm=${futuremm}, - dfuture%dd=${futuredd}, dfuture%hh=${futurehh}/ -&hourinfo vit_hr_incr=${vit_incr}/ -EOF - - -numvitrecs=$(cat ${DATA}/vitals.${atcfout}.${PDY}${CYL} | wc -l) -if [ ${numvitrecs} -eq 0 ] -then - - if [ ${trkrtype} = 'tracker' ] - then - set +x - echo " " - echo "!!! NOTE -- There are no vitals records for this time period." - echo "!!! File ${DATA}/vitals.${atcfout}.${PDY}${CYL} is empty." - echo "!!! It could just be that there are no storms for the current" - echo "!!! time. Please check the dates and submit this job again...." - echo " " - ${TRACE_ON:-set -x} - exit 1 - fi - -fi - -# For tcgen cases, filter to use only vitals from the ocean -# basin of interest.... 
- -if [ ${trkrtype} = 'tcgen' ] - then - - if [ ${numvitrecs} -gt 0 ] - then - - fullvitfile=${DATA}/vitals.${atcfout}.${PDY}${CYL} - cp $fullvitfile ${DATA}/vitals.all_basins.${atcfout}.${PDY}${CYL} - basin=$( echo $regtype | cut -c1-2) - - if [ ${basin} = 'al' ]; then - cat $fullvitfile | awk '{if (substr($0,8,1) == "L") print $0}' \ - >${DATA}/vitals.tcgen_al_only.${atcfout}.${PDY}${CYL} - cp ${DATA}/vitals.tcgen_al_only.${atcfout}.${PDY}${CYL} \ - ${DATA}/vitals.${atcfout}.${PDY}${CYL} - fi - if [ ${basin} = 'ep' ]; then - cat $fullvitfile | awk '{if (substr($0,8,1) == "E") print $0}' \ - >${DATA}/vitals.tcgen_ep_only.${atcfout}.${PDY}${CYL} - cp ${DATA}/vitals.tcgen_ep_only.${atcfout}.${PDY}${CYL} \ - ${DATA}/vitals.${atcfout}.${PDY}${CYL} - fi - if [ ${basin} = 'wp' ]; then - cat $fullvitfile | awk '{if (substr($0,8,1) == "W") print $0}' \ - >${DATA}/vitals.tcgen_wp_only.${atcfout}.${PDY}${CYL} - cp ${DATA}/vitals.tcgen_wp_only.${atcfout}.${PDY}${CYL} \ - ${DATA}/vitals.${atcfout}.${PDY}${CYL} - fi - - cat ${DATA}/vitals.${atcfout}.${PDY}${CYL} - - fi - -fi - -# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -# Before running the program to read, sort and update the vitals, -# first run the vitals through some awk logic, the purpose of -# which is to convert all the 2-digit years into 4-digit years. -# We need this logic to ensure that all the vitals going -# into supvit.f have uniform, 4-digit years in their records. -# -# 1/8/2000: sed code added by Tim Marchok due to the fact that -# some of the vitals were getting past the syndata/qctropcy -# error-checking with a colon in them; the colon appeared -# in the character immediately to the left of the date, which -# was messing up the "(length($4) == 8)" statement logic. 
-# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -sed -e "s/\:/ /g" ${DATA}/vitals.${atcfout}.${PDY}${CYL} > ${DATA}/tempvit -mv ${DATA}/tempvit ${DATA}/vitals.${atcfout}.${PDY}${CYL} - -awk ' -{ - yycheck = substr($0,20,2) - if ((yycheck == 20 || yycheck == 19) && (length($4) == 8)) { - printf ("%s\n",$0) - } - else { - if (yycheck >= 0 && yycheck <= 50) { - printf ("%s20%s\n",substr($0,1,19),substr($0,20)) - } - else { - printf ("%s19%s\n",substr($0,1,19),substr($0,20)) - } - } -} ' ${DATA}/vitals.${atcfout}.${PDY}${CYL} >${DATA}/vitals.${atcfout}.${PDY}${CYL}.y4 - -mv ${DATA}/vitals.${atcfout}.${PDY}${CYL}.y4 ${DATA}/vitals.${atcfout}.${PDY}${CYL} - -if [ ${numvitrecs} -gt 0 ] -then - - export pgm=supvit - . $prep_step - - ln -s -f ${DATA}/vitals.${atcfout}.${PDY}${CYL} fort.31 - ln -s -f ${DATA}/vitals.upd.${atcfout}.${PDY}${CYL} fort.51 - - msg="$pgm start for $atcfout at ${CYL}z" - $postmsg "$jlogfile" "$msg" - - ${exectrkdir}/supvit <${DATA}/suv_input.${atcfout}.${PDY}${CYL} - suvrcc=$? - - if [ ${suvrcc} -eq 0 ] - then - msg="$pgm end for $atcfout at ${CYL}z completed normally" - $postmsg "$jlogfile" "$msg" - else - set +x - echo " " - echo "!!! ERROR -- An error occurred while running supvit.x, " - echo "!!! which is the program that updates the TC Vitals file." - echo "!!! Return code from supvit.x = ${suvrcc}" - echo "!!! model= ${atcfout}, forecast initial time = ${PDY}${CYL}" - echo "!!! Exiting...." - echo " " - ${TRACE_ON:-set -x} - err_exit " FAILED ${jobid} - ERROR RUNNING SUPVIT IN TRACKER SCRIPT- ABNORMAL EXIT" - fi - -else - - touch ${DATA}/vitals.upd.${atcfout}.${PDY}${CYL} - -fi - -#----------------------------------------------------------------- -# In this section, check to see if the user requested the use of -# operational TC vitals records for the initial time only. 
This -# option might be used for a retrospective medium range forecast -# in which the user wants to initialize with the storms that are -# currently there, but then let the model do its own thing for -# the next 10 or 14 days.... - -#------------------------------------------------------------------# -# Now select all storms to be processed, that is, process every -# storm that's listed in the updated vitals file for the current -# forecast hour. If there are no storms for the current time, -# then exit. -#------------------------------------------------------------------# - -numvitrecs=$(cat ${DATA}/vitals.upd.${atcfout}.${PDY}${CYL} | wc -l) -if [ ${numvitrecs} -eq 0 ] -then - if [ ${trkrtype} = 'tracker' ] - then - set +x - echo " " - echo "!!! NOTE -- There are no vitals records for this time period " - echo "!!! in the UPDATED vitals file." - echo "!!! It could just be that there are no storms for the current" - echo "!!! time. Please check the dates and submit this job again...." - echo " " - ${TRACE_ON:-set -x} - exit 1 - fi -fi - -set +x -echo " " -echo " *--------------------------------*" -echo " | STORM SELECTION |" -echo " *--------------------------------*" -echo " " -${TRACE_ON:-set -x} - -ict=1 -while [ $ict -le 15 ] -do - stormflag[${ict}]=3 - let ict=ict+1 -done - -dtg_current="${symd} ${CYL}00" -stormmax=$( grep "${dtg_current}" ${DATA}/vitals.upd.${atcfout}.${PDY}${CYL} | wc -l) - -if [ ${stormmax} -gt 15 ] -then - stormmax=15 -fi - -sct=1 -while [ ${sct} -le ${stormmax} ] -do - stormflag[${sct}]=1 - let sct=sct+1 -done - - -#---------------------------------------------------------------# -# -# -------- "Genesis" Vitals processing -------- -# -# May 2006: This entire genesis tracking system is being -# upgraded to more comprehensively track and categorize storms. -# One thing that has been missing from the tracking system is -# the ability to keep track of storms from one analysis cycle -# to the next. 
That is, the current system has been very -# effective at tracking systems within a forecast, but we have -# no methods in place for keeping track of storms across -# different initial times. For example, if we are running -# the tracker on today's 00z GFS analysis, we will get a -# position for various storms at the analysis time. But then -# if we go ahead and run again at 06z, we have no way of -# telling the tracker that we know about the 00z position of -# this storm. We now address that problem by creating -# "genesis" vitals, that is, when a storm is found at an -# analysis time, we not only produce "atcfunix" output to -# detail the track & intensity of a found storm, but we also -# produce a vitals record that will be used for the next -# run of the tracker script. These "genesis vitals" records -# will be of the format: -# -# YYYYMMDDHH_AAAH_LLLLX_TYP -# -# Where: -# -# YYYYMMDDHH = Date the storm was FIRST identified -# by the tracker. -# AAA = Abs(Latitude) * 10; integer value -# H = 'N' for northern hem, 'S' for southern hem -# LLLL = Abs(Longitude) * 10; integer value -# X = 'E' for eastern hem, 'W' for western hem -# TYP = Tropical cyclone storm id if this is a -# tropical cyclone (e.g., "12L", or "09W", etc). -# If this is one that the tracker instead "Found -# On the Fly (FOF)", we simply put those three -# "FOF" characters in there.
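As an illustration of the genesis-vitals ID format described in the comment above, the label can be composed from the storm's first-found time and position. The storm values below are hypothetical, not taken from any real vitals file:

```shell
# Hypothetical storm: first found 2023051800 at 25.3N 72.7W, ATCF id 12L.
first_found=2023051800
lat_tenths=253;  lat_hem=N    # Abs(latitude) * 10, 'N' for northern hemisphere
lon_tenths=0727; lon_hem=W    # Abs(longitude) * 10, 'W' for western hemisphere
storm_id=12L                  # would be "FOF" for a storm Found On the Fly

genesis_id="${first_found}_${lat_tenths}${lat_hem}_${lon_tenths}${lon_hem}_${storm_id}"
echo "${genesis_id}"          # 2023051800_253N_0727W_12L
```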
- -d6ago_ymdh=$( ${NDATE:?} -6 ${PDY}${CYL}) -d6ago_4ymd=$( echo ${d6ago_ymdh} | cut -c1-8) -d6ago_ymd=$( echo ${d6ago_ymdh} | cut -c3-8) -d6ago_hh=$( echo ${d6ago_ymdh} | cut -c9-10) -d6ago_str="${d6ago_ymd} ${d6ago_hh}00" - -d6ahead_ymdh=$( ${NDATE:?} 6 ${PDY}${CYL}) -d6ahead_4ymd=$( echo ${d6ahead_ymdh} | cut -c1-8) -d6ahead_ymd=$( echo ${d6ahead_ymdh} | cut -c3-8) -d6ahead_hh=$( echo ${d6ahead_ymdh} | cut -c9-10) -d6ahead_str="${d6ahead_ymd} ${d6ahead_hh}00" - -syyyym6=$( echo ${d6ago_ymdh} | cut -c1-4) -smmm6=$( echo ${d6ago_ymdh} | cut -c5-6) -sddm6=$( echo ${d6ago_ymdh} | cut -c7-8) -shhm6=$( echo ${d6ago_ymdh} | cut -c9-10) - -syyyyp6=$( echo ${d6ahead_ymdh} | cut -c1-4) -smmp6=$( echo ${d6ahead_ymdh} | cut -c5-6) -sddp6=$( echo ${d6ahead_ymdh} | cut -c7-8) -shhp6=$( echo ${d6ahead_ymdh} | cut -c9-10) - -set +x -echo " " -echo " d6ago_str= --->${d6ago_str}<---" -echo " dtg_current= --->${dtg_current}<---" -echo " d6ahead_str= --->${d6ahead_str}<---" -echo " " -echo " for the times 6h ago, current and 6h ahead:" -echo " " -echo " " -${TRACE_ON:-set -x} - - touch ${DATA}/genvitals.upd.${cmodel}.${atcfout}.${PDY}${CYL} - - -#-----------------------------------------------------------------# -# -# ------ CUT APART INPUT GRIB FILES ------- -# -# For the selected model, cut apart the GRIB input files in order -# to pull out only the variables that we need for the tracker. -# Put these selected variables from all forecast hours into 1 big -# GRIB file that we'll use as input for the tracker.
-# -#-----------------------------------------------------------------# - -set +x -echo " " -echo " -----------------------------------------" -echo " NOW CUTTING APART INPUT GRIB FILES TO " -echo " CREATE 1 BIG GRIB INPUT FILE " -echo " -----------------------------------------" -echo " " -${TRACE_ON:-set -x} - -#gix=$NWPROD/util/exec/grbindex -#g2ix=$NWPROD/util/exec/grb2index -#cgb=$NWPROD/util/exec/copygb -#cgb2=$NWPROD/util/exec/copygb2 - -regflag=$(grep NHC ${DATA}/vitals.upd.${atcfout}.${PDY}${CYL} | wc -l) - -# ---------------------------------------------------------------------- -find_gfile() { - # This subroutine finds an input file from a list of possible - # input filenames, and calls err_exit if no file is found. The - # first file found is returned. - - # Calling conventions: - # find_gfile GFS 30 /path/to/file1.master.pgrbq30.grib2 /path/to/file2.master.pgrbq030.grib2 ... - nicename="$1" - nicehour="$2" - shift 2 - gfile=none - echo "Searching for input $nicename data for forecast hour $nicehour" - ${TRACE_ON:-set -x} - now=$( date +%s ) - later=$(( now + wait_max_time )) - # Note: the loop has only one iteration if --wait-max-time is - # unspecified. That is because later=now. - while [[ ! ( "$now" -gt "$later" ) ]] ; do - for gfile in "$@" ; do - if [[ ! -e "$gfile" ]] ; then - set +x - echo "$gfile: does not exist" - ${TRACE_ON:-set -x} - gfile=none - elif [[ ! -s "$gfile" ]] ; then - set +x - echo "$gfile: exists, but is empty" - ${TRACE_ON:-set -x} - gfile=none - else - set +x - echo "$gfile: exists, is non-empty, so I will use this file" - ${TRACE_ON:-set -x} - return 0 - fi - done - now=$( date +%s ) - if [[ "$gfile" == none ]] ; then - if [[ ! ( "$now" -lt "$later" ) ]] ; then - set +x - echo " " - echo " " - echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" - echo " !!! $nicename missing for hour $nicehour" - echo " !!! Check for the existence of these files:" - for gfile in "$@" ; do - echo " !!!
$nicename File: $gfile" - done - echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" - echo " " - ${TRACE_ON:-set -x} - err_exit "ERROR: mandatory input GFS file for hour $nicehour is missing or empty. Aborting. Checked for these files: $*" - continue - else - set +x - echo " " - echo " !!! Mandatory input $nicename missing for hour $nicehour" - echo " !!! Will retry after $sleep_time second sleep." - echo " !!! Checked these files:" - for gfile in "$@" ; do - echo " !!! $nicename File: $gfile" - done - echo " " - ${TRACE_ON:-set -x} - sleep $sleep_time - fi - fi - done -} - -# -------------------------------------------------- -# Process GFS or GDAS data -# -------------------------------------------------- -if [[ ${model} -eq 1 || $model == 8 ]] ; then - - export nest_type="fixed" - export trkrebd=360.0 - export trkrwbd=0.0 - export trkrnbd=85.0 - export trkrsbd=-85.0 - rundescr="xxxx" - atcfdescr="xxxx" - - - if [ $loopnum -eq 1 ] - then - - if [ -s ${DATA}/gfsgribfile.${PDY}${CYL} ] - then - rm ${DATA}/gfsgribfile.${PDY}${CYL} - fi - - rm ${DATA}/master.gfsgribfile.${PDY}${CYL}.f* - rm ${DATA}/gfsgribfile.${PDY}${CYL}.f* - >${DATA}/gfsgribfile.${PDY}${CYL} - - set +x - echo " " - echo "Time before gfs wgrib loop is $(date)" - echo " " - ${TRACE_ON:-set -x} - - if [[ "$model" -eq 8 ]] ; then - name=gdas - name1=gdas - nicename=GDAS - else # not model 8, so assume GFS - name=gfs - name1=gfs - nicename=GFS - fi - - for fhour in ${fcsthrs} ; do - fhour=$( echo "$fhour" | bc ) - - if [ ${fhour} -eq $bad_hour ] - then - continue - fi - - fhour00=$( printf %02d "$fhour" ) - fhour000=$( printf %03d "$fhour" ) - fhour0000=$( printf %03d "$fhour" ) - - if [[ "$gribver" == 1 ]] ; then - - find_gfile "$nicename" "$fhour" \ - ${gfsdir}/$name1.t${CYL}z.${flag_pgb}$fhour00 \ - ${gfsdir}/$name1.t${CYL}z.${flag_pgb}$fhour000 \ - ${gfsdir}/pgb${flag_pgb}$fhour00.$name.${symdh} \ - ${gfsdir}/pgrb${flag_pgb}$fhour00.$name.${symdh} - ${WGRIB:?} -s $gfile 
>gfs.ix - - for parm in ${wgrib_parmlist} - do - case ${parm} in - "SurfaceU") grep "UGRD:10 m " gfs.ix ;; - "SurfaceV") grep "VGRD:10 m " gfs.ix ;; - *) grep "${parm}" gfs.ix ;; - esac - done | ${WGRIB:?} -s $gfile -i -grib -append \ - -o ${DATA}/master.gfsgribfile.${PDY}${CYL}.f${fhour000} - - gfs_master_file=${DATA}/master.gfsgribfile.${PDY}${CYL}.f${fhour000} - gfs_converted_file=${DATA}/gfsgribfile.${PDY}${CYL}.f${fhour000} - gfs_cat_file=${DATA}/gfsgribfile.${PDY}${CYL} -# $cgb -g4 -i2 -x ${gfs_master_file} ${gfs_converted_file} -# cat ${gfs_converted_file} >>${gfs_cat_file} - cat ${gfs_master_file} >>${gfs_cat_file} - - else # gribver is not 1, so assume GRIB2 - - find_gfile "$nicename" "$fhour" \ - ${gfsdir}/$name1.t${CYL}z.pgrb2.0p25.f${fhour000} \ - ${gfsdir}/$name1.t${CYL}z.pgrb2.0p25.f${fhour00} \ - ${gfsdir}/pgb${flag_pgb}$fhour00.$name.${symdh}.grib2 \ - ${gfsdir}/pgrb${flag_pgb}${fhour000}.$name.${symdh}.grib2 - ${WGRIB2:?} -s $gfile >gfs.ix - - for parm in ${wgrib_parmlist} - do - case ${parm} in - "SurfaceU") grep "UGRD:10 m " gfs.ix ;; - "SurfaceV") grep "VGRD:10 m " gfs.ix ;; - *) grep "${parm}" gfs.ix ;; - esac - done | ${WGRIB2:?} -i $gfile -append -grib \ - ${DATA}/master.gfsgribfile.${PDY}${CYL}.f${fhour000} - - gfs_master_file=${DATA}/master.gfsgribfile.${PDY}${CYL}.f${fhour000} - gfs_converted_file=${DATA}/gfsgribfile.${PDY}${CYL}.f${fhour000} - gfs_cat_file=${DATA}/gfsgribfile.${PDY}${CYL} - - ${GRB2INDEX:?} ${gfs_master_file} ${gfs_master_file}.ix - - g1=${gfs_master_file} - x1=${gfs_master_file}.ix - -# grid4="0 6 0 0 0 0 0 0 720 361 0 0 90000000 0 48 -90000000 359500000 500000 500000 0" -# $cgb2 -g "${grid4}" ${g1} ${x1} ${gfs_converted_file} -# cat ${gfs_converted_file} >>${gfs_cat_file} - - cat ${gfs_master_file} >>${gfs_cat_file} - - fi - - done - - if [ ${gribver} -eq 1 ]; then - ${GRBINDEX:?} ${DATA}/gfsgribfile.${PDY}${CYL} ${DATA}/gfsixfile.${PDY}${CYL} - else - ${GRB2INDEX:?} ${DATA}/gfsgribfile.${PDY}${CYL} 
${DATA}/gfsixfile.${PDY}${CYL} - fi - -# -------------------------------------------- - - if [[ "$PhaseFlag" == y ]] ; then - - catfile=${DATA}/gfs.${PDY}${CYL}.catfile - >${catfile} - - for fhour in ${fcsthrs} - do - - - fhour=$( echo "$fhour" | bc ) - - if [ ${fhour} -eq $bad_hour ] - then - continue - fi - - fhour00=$( printf %02d "$fhour" ) - fhour000=$( printf %03d "$fhour" ) - fhour0000=$( printf %03d "$fhour" ) - - set +x - echo " " - echo "Date in interpolation for model= $cmodel and fhour= $fhour000 before = $(date)" - echo " " - ${TRACE_ON:-set -x} - - gfile=${DATA}/gfsgribfile.${PDY}${CYL} - ifile=${DATA}/gfsixfile.${PDY}${CYL} - - if [ ${gribver} -eq 1 ]; then - ${GRBINDEX:?} $gfile $ifile - else - ${GRB2INDEX:?} $gfile $ifile - fi - - gparm=7 - namelist=${DATA}/vint_input.${PDY}${CYL}.z - echo "&timein ifcsthour=${fhour000}," >${namelist} - echo " iparm=${gparm}," >>${namelist} - echo " gribver=${gribver}," >>${namelist} - echo " g2_jpdtn=${g2_jpdtn}/" >>${namelist} - - ln -s -f ${gfile} fort.11 - ln -s -f ${FIXRELO}/gfs_hgt_levs.txt fort.16 - ln -s -f ${ifile} fort.31 - ln -s -f ${DATA}/${cmodel}.${PDY}${CYL}.z.f${fhour000} fort.51 - - ${exectrkdir}/vint.x <${namelist} - rcc1=$? - - - gparm=11 - namelist=${DATA}/vint_input.${PDY}${CYL}.t - echo "&timein ifcsthour=${fhour000}," >${namelist} - echo " iparm=${gparm}," >>${namelist} - echo " gribver=${gribver}," >>${namelist} - echo " g2_jpdtn=${g2_jpdtn}/" >>${namelist} - - ln -s -f ${gfile} fort.11 - ln -s -f ${FIXRELO}/gfs_tmp_levs.txt fort.16 - ln -s -f ${ifile} fort.31 - ln -s -f ${DATA}/${cmodel}.${PDY}${CYL}.t.f${fhour000} fort.51 - - ${exectrkdir}/vint.x <${namelist} - rcc2=$? 
- - namelist=${DATA}/tave_input.${PDY}${CYL} - echo "&timein ifcsthour=${fhour000}," >${namelist} - echo " iparm=${gparm}," >>${namelist} - echo " gribver=${gribver}," >>${namelist} - echo " g2_jpdtn=${g2_jpdtn}/" >>${namelist} - - ffile=${DATA}/${cmodel}.${PDY}${CYL}.t.f${fhour000} - ifile=${DATA}/${cmodel}.${PDY}${CYL}.t.f${fhour000}.i - - if [ ${gribver} -eq 1 ]; then - ${GRBINDEX:?} ${ffile} ${ifile} - else - ${GRB2INDEX:?} ${ffile} ${ifile} - fi - - ln -s -f ${ffile} fort.11 - ln -s -f ${ifile} fort.31 - ln -s -f ${DATA}/${cmodel}.tave.${PDY}${CYL}.f${fhour000} fort.51 - ln -s -f ${DATA}/${cmodel}.tave92.${PDY}${CYL}.f${fhour000} fort.92 - - ${exectrkdir}/tave.x <${namelist} - rcc3=$? - - if [ $rcc1 -eq 0 -a $rcc2 -eq 0 -a $rcc3 -eq 0 ]; then - echo " " - else - mailfile=${rundir}/errmail.${cmodel}.${PDY}${CYL} - echo "CPS/WC interp failure for $cmodel ${PDY}${CYL}" >${mailfile} - mail -s "GFS Failure (CPS/WC int) $cmodel ${PDY}${CYL}" ${userid} <${mailfile} - exit 8 - fi - - tavefile=${DATA}/${cmodel}.tave.${PDY}${CYL}.f${fhour000} - zfile=${DATA}/${cmodel}.${PDY}${CYL}.z.f${fhour000} - cat ${zfile} ${tavefile} >>${catfile} -## rm $tavefile $zfile - - set +x - echo " " - echo "Date in interpolation for cmodel= $cmodel and fhour= $fhour000 after = $(date)" - echo " " - ${TRACE_ON:-set -x} - - done - fi # end of "If PhaseFlag is on" - fi # end of "If loopnum is 1" - - gfile=${DATA}/gfsgribfile.${PDY}${CYL} - ifile=${DATA}/gfsixfile.${PDY}${CYL} - - if [[ "$PhaseFlag" == y ]] ; then - cat ${catfile} >>${gfile} - if [ ${gribver} -eq 1 ]; then - ${GRBINDEX:?} ${gfile} ${ifile} - else - ${GRB2INDEX:?} ${gfile} ${ifile} - fi - fi - - # File names for input to tracker: - gribfile=${DATA}/gfsgribfile.${PDY}${CYL} - ixfile=${DATA}/gfsixfile.${PDY}${CYL} -fi - -$postmsg "$jlogfile" "SUCCESS: have all inputs needed to run tracker. Will now run the tracker." 
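The field extraction above follows the wgrib/wgrib2 inventory-pipe idiom: dump the inventory with `-s`, grep out the wanted records, and feed the matching lines back with `-i` to write a subset GRIB file. A minimal sketch of that selection logic, run here against a fake inventory file so it does not require wgrib2 itself; the parm list and inventory lines are illustrative:

```shell
# Fake inventory standing in for `wgrib2 -s file` output; in the real job
# the selected lines would be piped back into `wgrib2 -i file -grib out`.
cat > inv.txt <<'EOF'
1:0:d=2023051800:HGT:500 mb:anl:
2:100:d=2023051800:UGRD:10 m above ground:anl:
3:200:d=2023051800:VGRD:10 m above ground:anl:
4:300:d=2023051800:ABSV:850 mb:anl:
EOF

wgrib_parmlist="HGT:500 SurfaceU SurfaceV"   # illustrative parm list
selected=$(for parm in ${wgrib_parmlist}; do
  case ${parm} in
    "SurfaceU") grep "UGRD:10 m " inv.txt ;;
    "SurfaceV") grep "VGRD:10 m " inv.txt ;;
    *) grep "${parm}" inv.txt ;;
  esac
done | wc -l)
echo "selected ${selected} of 4 inventory records"
rm -f inv.txt
```

The aliases `SurfaceU`/`SurfaceV` exist because the plain strings `UGRD`/`VGRD` would match every level in the inventory, not just the 10 m winds.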
- -#------------------------------------------------------------------------# -# Now run the tracker # -#------------------------------------------------------------------------# - -ist=1 -while [ $ist -le 15 ] -do - if [ ${stormflag[${ist}]} -ne 1 ] - then - set +x; echo "Storm number $ist NOT selected for processing"; ${TRACE_ON:-set -x} - else - set +x; echo "Storm number $ist IS selected for processing...."; ${TRACE_ON:-set -x} - fi - let ist=ist+1 -done - -namelist=${DATA}/input.${atcfout}.${PDY}${CYL} -ATCFNAME=$( echo "${atcfname}" | tr '[a-z]' '[A-Z]') - -if [ ${cmodel} = 'sref' ]; then - export atcfymdh=$( ${NDATE:?} -3 ${scc}${syy}${smm}${sdd}${shh}) -else - export atcfymdh=${scc}${syy}${smm}${sdd}${shh} -fi - -contour_interval=100.0 -write_vit=n -want_oci=.TRUE. - -cat <<EOF >${namelist} -&datein inp%bcc=${scc},inp%byy=${syy},inp%bmm=${smm}, - inp%bdd=${sdd},inp%bhh=${shh},inp%model=${model}, - inp%modtyp='${modtyp}', - inp%lt_units='${lead_time_units}', - inp%file_seq='${file_sequence}', - inp%nesttyp='${nest_type}'/ -&atcfinfo atcfnum=${atcfnum},atcfname='${ATCFNAME}', - atcfymdh=${atcfymdh},atcffreq=${atcffreq}/ -&trackerinfo trkrinfo%westbd=${trkrwbd}, - trkrinfo%eastbd=${trkrebd}, - trkrinfo%northbd=${trkrnbd}, - trkrinfo%southbd=${trkrsbd}, - trkrinfo%type='${trkrtype}', - trkrinfo%mslpthresh=${mslpthresh}, - trkrinfo%v850thresh=${v850thresh}, - trkrinfo%gridtype='${modtyp}', - trkrinfo%contint=${contour_interval}, - trkrinfo%want_oci=${want_oci}, - trkrinfo%out_vit='${write_vit}', - trkrinfo%gribver=${gribver}, - trkrinfo%g2_jpdtn=${g2_jpdtn}/ -&phaseinfo phaseflag='${PHASEFLAG}', - phasescheme='${PHASE_SCHEME}', - wcore_depth=${WCORE_DEPTH}/ -&structinfo structflag='${STRUCTFLAG}', - ikeflag='${IKEFLAG}'/ -&fnameinfo gmodname='${atcfname}', - rundescr='${rundescr}', - atcfdescr='${atcfdescr}'/ -&verbose verb=3/ -&waitinfo use_waitfor='n', - wait_min_age=10, - wait_min_size=100, - wait_max_wait=1800, - wait_sleeptime=5, - per_fcst_command=''/ -EOF - 
-export pgm=gettrk -. $prep_step - -ln -s -f ${gribfile} fort.11 -ln -s -f ${DATA}/vitals.upd.${atcfout}.${PDY}${shh} fort.12 -ln -s -f ${DATA}/genvitals.upd.${cmodel}.${atcfout}.${PDY}${CYL} fort.14 -ihour=1 -for fhour in ${fcsthrs} ; do - fhour=$( echo "$fhour" | bc ) # strip leading zeros - printf "%4d %5d\n" $ihour $(( fhour * 60 )) - let ihour=ihour+1 -done > leadtimes.txt -ln -s -f leadtimes.txt fort.15 -#ln -s -f ${FIXRELO}/${cmodel}.tracker_leadtimes fort.15 -ln -s -f ${ixfile} fort.31 - -if [[ -z "$atcfout" ]] ; then - err_exit 'ERROR: exgfs_trkr script forgot to set $atcfout variable' -fi - -track_file_path=nowhere - -if [ ${trkrtype} = 'tracker' ]; then - if [ ${atcfout} = 'gfdt' -o ${atcfout} = 'gfdl' -o \ - ${atcfout} = 'hwrf' -o ${atcfout} = 'hwft' ]; then - ln -s -f ${DATA}/trak.${atcfout}.all.${stormenv}.${PDY}${CYL} fort.61 - ln -s -f ${DATA}/trak.${atcfout}.atcf.${stormenv}.${PDY}${CYL} fort.62 - ln -s -f ${DATA}/trak.${atcfout}.radii.${stormenv}.${PDY}${CYL} fort.63 - ln -s -f ${DATA}/trak.${atcfout}.atcf_gen.${stormenv}.${PDY}${CYL} fort.66 - ln -s -f ${DATA}/trak.${atcfout}.atcf_sink.${stormenv}.${PDY}${CYL} fort.68 - ln -s -f ${DATA}/trak.${atcfout}.atcf_hfip.${stormenv}.${PDY}${CYL} fort.69 - track_file_path=${DATA}/trak.${atcfout}.atcfunix.${stormenv}.${PDY}${CYL} - else - ln -s -f ${DATA}/trak.${atcfout}.all.${PDY}${CYL} fort.61 - ln -s -f ${DATA}/trak.${atcfout}.atcf.${PDY}${CYL} fort.62 - ln -s -f ${DATA}/trak.${atcfout}.radii.${PDY}${CYL} fort.63 - ln -s -f ${DATA}/trak.${atcfout}.atcf_gen.${PDY}${CYL} fort.66 - ln -s -f ${DATA}/trak.${atcfout}.atcf_sink.${PDY}${CYL} fort.68 - ln -s -f ${DATA}/trak.${atcfout}.atcf_hfip.${PDY}${CYL} fort.69 - track_file_path=${DATA}/trak.${atcfout}.atcfunix.${PDY}${CYL} - fi -else - ln -s -f ${DATA}/trak.${atcfout}.all.${regtype}.${PDY}${CYL} fort.61 - ln -s -f ${DATA}/trak.${atcfout}.atcf.${regtype}.${PDY}${CYL} fort.62 - ln -s -f ${DATA}/trak.${atcfout}.radii.${regtype}.${PDY}${CYL} fort.63 - ln -s -f 
${DATA}/trak.${atcfout}.atcf_gen.${regtype}.${PDY}${CYL} fort.66 - ln -s -f ${DATA}/trak.${atcfout}.atcf_sink.${regtype}.${PDY}${CYL} fort.68 - ln -s -f ${DATA}/trak.${atcfout}.atcf_hfip.${regtype}.${PDY}${CYL} fort.69 - track_file_path=${DATA}/trak.${atcfout}.atcfunix.${regtype}.${PDY}${CYL} -fi - -if [[ "$track_file_path" == nowhere ]] ; then - err_exit 'ERROR: exgfs_trkr script forgot to set $track_file_path variable' -fi - -ln -s -f $track_file_path fort.64 - -if [ ${atcfname} = 'aear' ] -then - ln -s -f ${DATA}/trak.${atcfout}.initvitl.${PDY}${CYL} fort.65 -fi - -if [ ${write_vit} = 'y' ] -then - ln -s -f ${DATA}/output_genvitals.${atcfout}.${PDY}${shh} fort.67 -fi - -if [ ${PHASEFLAG} = 'y' ]; then - if [ ${atcfout} = 'gfdt' -o ${atcfout} = 'gfdl' -o \ - ${atcfout} = 'hwrf' -o ${atcfout} = 'hwft' ]; then - ln -s -f ${DATA}/trak.${atcfout}.cps_parms.${stormenv}.${PDY}${CYL} fort.71 - else - ln -s -f ${DATA}/trak.${atcfout}.cps_parms.${PDY}${CYL} fort.71 - fi -fi - -if [ ${STRUCTFLAG} = 'y' ]; then - if [ ${atcfout} = 'gfdt' -o ${atcfout} = 'gfdl' -o \ - ${atcfout} = 'hwrf' -o ${atcfout} = 'hwft' ]; then - ln -s -f ${DATA}/trak.${atcfout}.structure.${stormenv}.${PDY}${CYL} fort.72 - ln -s -f ${DATA}/trak.${atcfout}.fractwind.${stormenv}.${PDY}${CYL} fort.73 - ln -s -f ${DATA}/trak.${atcfout}.pdfwind.${stormenv}.${PDY}${CYL} fort.76 - else - ln -s -f ${DATA}/trak.${atcfout}.structure.${PDY}${CYL} fort.72 - ln -s -f ${DATA}/trak.${atcfout}.fractwind.${PDY}${CYL} fort.73 - ln -s -f ${DATA}/trak.${atcfout}.pdfwind.${PDY}${CYL} fort.76 - fi -fi - -if [ ${IKEFLAG} = 'y' ]; then - if [ ${atcfout} = 'gfdt' -o ${atcfout} = 'gfdl' -o \ - ${atcfout} = 'hwrf' -o ${atcfout} = 'hwft' ]; then - ln -s -f ${DATA}/trak.${atcfout}.ike.${stormenv}.${PDY}${CYL} fort.74 - else - ln -s -f ${DATA}/trak.${atcfout}.ike.${PDY}${CYL} fort.74 - fi -fi - -if [ ${trkrtype} = 'midlat' -o ${trkrtype} = 'tcgen' ]; then - ln -s -f ${DATA}/trkrmask.${atcfout}.${regtype}.${PDY}${CYL} fort.77 -fi - 
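All of the `ln -s -f ... fort.NN` lines above rely on the Fortran convention that unit NN, when opened without an explicit file name, attaches to a file literally named `fort.NN` in the working directory. A small self-contained illustration of the wiring (file names and contents hypothetical, mirroring how the script points gettrk's unit 12 at the vitals file):

```shell
# Stage a pretend input file and point "unit 12" at it via a fort.12 symlink.
workdir=$(mktemp -d)
echo "20230518 0000" > "${workdir}/vitals.txt"
ln -s -f vitals.txt "${workdir}/fort.12"      # relative link, like the script's ln -s -f

unit12_target=$(readlink "${workdir}/fort.12")
unit12_content=$(cat "${workdir}/fort.12")    # a Fortran OPEN on unit 12 would read this
echo "fort.12 -> ${unit12_target}: ${unit12_content}"
rm -rf "${workdir}"
```

Using `-f` lets a rerun in the same working directory silently repoint each unit instead of failing on an existing link.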
- -set +x -echo " " -echo " -----------------------------------------------" -echo " NOW EXECUTING TRACKER......" -echo " -----------------------------------------------" -echo " " -${TRACE_ON:-set -x} - -msg="$pgm start for $atcfout at ${CYL}z" -$postmsg "$jlogfile" "$msg" - -set +x -echo "+++ TIMING: BEFORE gettrk ---> $(date)" -${TRACE_ON:-set -x} - -set +x -echo " " -echo "TIMING: Before call to gettrk at $(date)" -echo " " -${TRACE_ON:-set -x} - -##/usrx/local/bin/getrusage -a /hwrf/save/Qingfu.Liu/trak/para/exec/gettrk <${namelist} - -${exectrkdir}/gettrk <${namelist} | tee gettrk.log -gettrk_rcc=${PIPESTATUS[0]} # capture gettrk's status, not tee's - -set +x -echo " " -echo "TIMING: After call to gettrk at $(date)" -echo " " -${TRACE_ON:-set -x} - -set +x -echo "+++ TIMING: AFTER gettrk ---> $(date)" -${TRACE_ON:-set -x} - -#--------------------------------------------------------------# -# Send a message to the jlogfile for each storm that used -# tcvitals for hour 0 track/intensity info. -#--------------------------------------------------------------# - -pcount=0 -cat gettrk.log | grep -a 'NOTE: TCVITALS_USED_FOR_ATCF_F00' | \ -while read line -do - echo "line is [$line]" - if [[ ! ( "$pcount" -lt 30 ) ]] ; then - $postmsg "$jlogfile" "Hit maximum number of postmsg commands for tcvitals usage at hour 0. Will stop warning about that, to avoid spamming jlogfile." - break - fi - $postmsg "$jlogfile" "$line" - pcount=$(( pcount + 1 )) -done - -#--------------------------------------------------------------# -# Now copy the output track files to different directories -#--------------------------------------------------------------# - -set +x -echo " " -echo " -----------------------------------------------" -echo " NOW COPYING OUTPUT TRACK FILES TO COM " -echo " -----------------------------------------------" -echo " " -${TRACE_ON:-set -x} - -if [[ ! -e "$track_file_path" ]] ; then - $postmsg "$jlogfile" "WARNING: tracker output file does not exist. This is probably an error.
File: $track_file_path" - $postmsg "$jlogfile" "WARNING: exgfs_trkr will create an empty track file and deliver that." - cat /dev/null > $track_file_path -elif [[ ! -s "$track_file_path" ]] ; then - $postmsg "$jlogfile" "WARNING: tracker output file is empty. That is only an error if there are storms or genesis cases somewhere in the world. File: $track_file_path" -else - $postmsg "$jlogfile" "SUCCESS: Track file exists and is non-empty: $track_file_path" - if [[ "$PHASEFLAG" == n ]] ; then - echo "Phase information was disabled. I will remove the empty phase information from the track file before delivery." - cp -p $track_file_path $track_file_path.orig - cut -c1-112 < $track_file_path.orig > $track_file_path - if [[ ! -s "$track_file_path" ]] ; then - $postmsg "$jlogfile" "WARNING: Something went wrong with \"cut\" command to remove phase information. Will deliver original file." - /bin/mv -f $track_file_path.orig $track_file_path - else - $postmsg "$jlogfile" "SUCCESS: Removed empty phase information because phase information is disabled." - fi - fi -fi - -#mkdir /global/save/Qingfu.Liu/gfspara_track/gfs.${PDY}${CYL} -#cp /ptmpp1/Qingfu.Liu/trakout2/${PDY}${CYL}/gfs/trak.gfso.atcf* /global/save/Qingfu.Liu/gfspara_track/gfs.${PDY}${CYL}/. -#rm -rf /ptmpp1/Qingfu.Liu/trakout2/${PDY}${CYL}/gfs/* - -if [ ${gettrk_rcc} -eq 0 ]; then - - if [ -s ${DATA}/output_genvitals.${atcfout}.${PDY}${shh} ]; then - cat ${DATA}/output_genvitals.${atcfout}.${PDY}${shh} >>${genvitfile} - fi - - if [ ${PARAFLAG} = 'YES' ] - then - - if [[ ! -s "$track_file_path" ]] ; then - $postmsg "$jlogfile" "WARNING: delivering empty track file to rundir." - fi - - cp $track_file_path ../. - cat $track_file_path >> \ - ${rundir}/${cmodel}.atcfunix.${syyyy} - if [ ${cmodel} = 'gfs' ]; then - cat ${rundir}/${cmodel}.atcfunix.${syyyy} | sed -e "s/ GFSO/ AVNO/g" >>${rundir}/avn.atcfunix.${syyyy} - fi -# cp ${DATA}/trak.${atcfout}.atcf_sink.${regtype}.${PDY}${CYL} ../.
-# cp ${DATA}/trak.${atcfout}.atcf_gen.${regtype}.${PDY}${CYL} ../. - fi - - msg="$pgm end for $atcfout at ${CYL}z completed normally" - $postmsg "$jlogfile" "$msg" - -# Now copy track files into various archives.... - - if [ ${SENDCOM} = 'YES' ] - then - - if [[ ! -s "$track_file_path" ]] ; then - $postmsg "$jlogfile" "WARNING: delivering an empty track file to COM." - return - fi - - glatuxarch=${glatuxarch:-${gltrkdir}/tracks.atcfunix.${syy}} - - cat $track_file_path >>${glatuxarch} - if [ ${cmodel} = 'gfs' ]; then - cat $track_file_path | sed -e "s/ GFSO/ AVNO/g" >>${glatuxarch} - fi - - if [ ${PARAFLAG} = 'YES' ] - then - echo " " - tmatuxarch=${tmatuxarch:-/gpfs/gd2/emc/hwrf/save/${userid}/trak/prod/tracks.atcfunix.${syy}} - cat $track_file_path >>${tmatuxarch} - if [ ${cmodel} = 'gfs' ]; then - cat $track_file_path | sed -e "s/ GFSO/ AVNO/g" >>${tmatuxarch} - fi - else - - if [ ${cmodel} = 'gfdl' ] - then - cp $track_file_path ${COM}/${stormenv}.${PDY}${CYL}.trackeratcfunix - else - cp $track_file_path ${COM}/${atcfout}.t${CYL}z.cyclone.trackatcfunix - if [ ${cmodel} = 'gfs' ]; then - cat $track_file_path | sed -e "s/ GFSO/ AVNO/g" >${COM}/avn.t${CYL}z.cyclone.trackatcfunix - fi - fi - - tmscrdir=/gpfs/gd2/emc/hwrf/save/${userid}/trak/prod - - tmtrakstat=${tmscrdir}/tracker.prod.status - echo "${atcfout} tracker completed okay for ${PDY}${CYL}" >>${tmtrakstat} - - export SENDDBN=${SENDDBN:-YES} - if [ ${SENDDBN} = 'YES' ] - then - if [ ${cmodel} = 'gfdl' ] - then - $DBNROOT/bin/dbn_alert ATCFUNIX GFS_NAVY $job ${COM}/${stormenv}.${PDY}${CYL}.trackeratcfunix - else - $DBNROOT/bin/dbn_alert ATCFUNIX GFS_NAVY $job ${COM}/${atcfout}.t${CYL}z.cyclone.trackatcfunix - if [ ${cmodel} = 'gfs' ]; then - $DBNROOT/bin/dbn_alert ATCFUNIX GFS_NAVY $job ${COM}/avn.t${CYL}z.cyclone.trackatcfunix - fi - fi - fi - - if [[ "$SENDNHC" == YES ]] ; then - # We need to parse apart the atcfunix file and distribute the forecasts to - # the necessary directories. 
To do this, first sort the atcfunix records - # by forecast hour (k6), then sort again by ocean basin (k1), storm number (k2) - # and then quadrant radii wind threshold (k12). Once you've got that organized - # file, break the file up by putting all the forecast records for each storm - # into a separate file. Then, for each file, find the corresponding atcfunix - # file in the /nhc/com/prod/atcf directory and dump the atcfunix records for that - # storm in there. - - if [ ${cmodel} = 'gfdl' ] - then - auxfile=${COM}/${stormenv}.${PDY}${CYL}.trackeratcfunix - else - auxfile=$track_file_path - fi - - sort -k6 ${auxfile} | sort -k1 -k2 -k12 >atcfunix.sorted - - old_string="XX, XX" - - ict=0 - while read unixrec - do - storm_string=$( echo "${unixrec}" | cut -c1-6) - if [ "${storm_string}" = "${old_string}" ] - then - echo "${unixrec}" >>atcfunix_file.${ict} - else - let ict=ict+1 - echo "${unixrec}" >atcfunix_file.${ict} - old_string="${storm_string}" - fi - done >${ATCFdir}/${at}${NO}${syyyy}/a${at}${NO}${syyyy}.dat - cat atcfunix_file.$mct >>${ATCFdir}/${at}${NO}${syyyy}/a${at}${NO}${syyyy}.dat - cat atcfunix_file.$mct >>${ATCFdir}/${at}${NO}${syyyy}/ncep_a${at}${NO}${syyyy}.dat - if [ ${cmodel} = 'gfs' ]; then - cat atcfunix_file.$mct | sed -e "s/ GFSO/ AVNO/g" >>${ATCFdir}/${at}${NO}${syyyy}/a${at}${NO}${syyyy}.dat - cat atcfunix_file.$mct | sed -e "s/ GFSO/ AVNO/g" >>${ATCFdir}/${at}${NO}${syyyy}/ncep_a${at}${NO}${syyyy}.dat - fi - set +x - echo " " - echo "+++ Adding records to TPC ATCFUNIX directory: /tpcprd/atcf_unix/${at}${NO}${syyyy}" - echo " " - ${TRACE_ON:-set -x} - else - set +x - echo " " - echo "There is no TPC ATCFUNIX directory for: /tpcprd/atcf_unix/${at}${NO}${syyyy}" - ${TRACE_ON:-set -x} - fi - done - fi - fi - fi - - fi - -else - - if [ ${PARAFLAG} = 'YES' ] - then - echo " " - else - tmtrakstat=/gpfs/gd2/emc/hwrf/save/${userid}/trak/prod/tracker.prod.status - echo "ERROR: ${atcfout} tracker FAILED for ${PDY}${CYL}" >>${tmtrakstat} - fi - - set 
+x - echo " " - echo "!!! ERROR -- An error occurred while running gettrk.x, " - echo "!!! which is the program that actually gets the track." - echo "!!! Return code from gettrk.x = ${gettrk_rcc}" - echo "!!! model= ${atcfout}, forecast initial time = ${PDY}${CYL}" - echo "!!! Exiting...." - echo " " - ${TRACE_ON:-set -x} - err_exit " FAILED ${jobid} - ERROR RUNNING GETTRK IN TRACKER SCRIPT- ABNORMAL EXIT" - -fi diff --git a/ush/gsi_utils.py b/ush/gsi_utils.py index b33be51adb0..97d66e8ace5 100644 --- a/ush/gsi_utils.py +++ b/ush/gsi_utils.py @@ -1,6 +1,6 @@ -### gsi_utils.py -### a collection of functions, classes, etc. -### used for the GSI global analysis +# gsi_utils.py +# a collection of functions, classes, etc. +# used for the GSI global analysis def isTrue(str_in): """ isTrue(str_in) @@ -11,12 +11,13 @@ def isTrue(str_in): """ str_in = str_in.upper() - if str_in in ['YES','.TRUE.']: + if str_in in ['YES', '.TRUE.']: status = True else: status = False return status + def link_file(from_file, to_file): """ link_file(from_file, to_file) - function to check if a path exists, and if not, make a symlink @@ -28,20 +29,23 @@ def link_file(from_file, to_file): if not os.path.islink(to_file): os.symlink(from_file, to_file) else: - print(to_file+" exists, unlinking.") + print(to_file + " exists, unlinking.") os.unlink(to_file) os.symlink(from_file, to_file) - print("ln -s "+from_file+" "+to_file) + print("ln -s " + from_file + " " + to_file) + def copy_file(from_file, to_file): import shutil shutil.copy(from_file, to_file) - print("cp "+from_file+" "+to_file) + print("cp " + from_file + " " + to_file) + def make_dir(directory): import os os.makedirs(directory) - print("mkdir -p "+directory) + print("mkdir -p " + directory) + def write_nml(nml_dict, nml_file): """ write_nml(nml_dict, nml_file) @@ -54,9 +58,9 @@ def write_nml(nml_dict, nml_file): nfile = open(nml_file, 'w') for nml, nmlvars in nml_dict.items(): - nfile.write('&'+nml+'\n') + nfile.write('&' + nml + '\n') 
for var, val in nmlvars.items(): - nfile.write(' '+str(var)+' = '+str(val)+'\n') + nfile.write(' ' + str(var) + ' = ' + str(val) + '\n') nfile.write('/\n\n') nfile.close() @@ -82,7 +86,8 @@ def get_ncdims(ncfile): return ncdims -def get_nemsdims(nemsfile,nemsexe): + +def get_nemsdims(nemsfile, nemsexe): """ get_nemsdims(nemsfile,nemsexe) - function to return dictionary of NEMSIO file dimensions for use input: nemsfile - string to path nemsio file @@ -93,17 +98,18 @@ def get_nemsdims(nemsfile,nemsexe): """ import subprocess ncdims = { - 'dimx': 'grid_xt', + 'dimx': 'grid_xt', 'dimy': 'grid_yt', 'dimz': 'pfull', - } + } nemsdims = {} - for dim in ['dimx','dimy','dimz']: - out = subprocess.Popen([nemsexe,nemsfile,dim],stdout=subprocess.PIPE,stderr=subprocess.STDOUT) + for dim in ['dimx', 'dimy', 'dimz']: + out = subprocess.Popen([nemsexe, nemsfile, dim], stdout=subprocess.PIPE, stderr=subprocess.STDOUT) stdout, stderr = out.communicate() nemsdims[ncdims[dim]] = int(stdout.split(' ')[-1].rstrip()) return nemsdims + def get_timeinfo(ncfile): """ get_timeinfo(ncfile) - function to return datetime objects of initialized time and valid time @@ -122,7 +128,7 @@ def get_timeinfo(ncfile): date_str = time_units.split('since ')[1] date_str = re.sub("[^0-9]", "", date_str) initstr = date_str[0:10] - inittime = dt.datetime.strptime(initstr,"%Y%m%d%H") + inittime = dt.datetime.strptime(initstr, "%Y%m%d%H") nfhour = int(ncf['time'][0]) validtime = inittime + dt.timedelta(hours=nfhour) ncf.close() diff --git a/ush/hpssarch_gen.sh b/ush/hpssarch_gen.sh index 9785de98acf..c2a84441bfc 100755 --- a/ush/hpssarch_gen.sh +++ b/ush/hpssarch_gen.sh @@ -4,32 +4,22 @@ # Fanglin Yang, 20180318 # --create bunches of files to be archived to HPSS ################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" type=${1:-gfs} ##gfs, gdas, enkfgdas or enkfggfs -CDATE=${CDATE:-2018010100} -PDY=$(echo $CDATE | cut -c 1-8) -cyc=$(echo $CDATE | 
cut -c 9-10) -OUTPUT_FILE=${OUTPUT_FILE:-"netcdf"} ARCH_GAUSSIAN=${ARCH_GAUSSIAN:-"YES"} ARCH_GAUSSIAN_FHMAX=${ARCH_GAUSSIAN_FHMAX:-36} ARCH_GAUSSIAN_FHINC=${ARCH_GAUSSIAN_FHINC:-6} -SUFFIX=${SUFFIX:-".nc"} -if [ $SUFFIX = ".nc" ]; then - format="netcdf" -else - format="nemsio" -fi # Set whether to archive downstream products DO_DOWN=${DO_DOWN:-"NO"} -if [ $DO_BUFRSND = "YES" -o $WAFSF = "YES" ]; then +if [[ ${DO_BUFRSND} = "YES" || ${WAFSF} = "YES" ]]; then export DO_DOWN="YES" fi #----------------------------------------------------- -if [ $type = "gfs" ]; then +if [[ ${type} = "gfs" ]]; then #----------------------------------------------------- FHMIN_GFS=${FHMIN_GFS:-0} FHMAX_GFS=${FHMAX_GFS:-384} @@ -44,49 +34,52 @@ if [ $type = "gfs" ]; then touch gfsb.txt touch gfs_restarta.txt - if [ $ARCH_GAUSSIAN = "YES" ]; then + if [[ ${ARCH_GAUSSIAN} = "YES" ]]; then rm -f gfs_pgrb2b.txt - rm -f gfs_${format}b.txt + rm -f gfs_netcdfb.txt rm -f gfs_flux.txt touch gfs_pgrb2b.txt - touch gfs_${format}b.txt + touch gfs_netcdfb.txt touch gfs_flux.txt - if [ $MODE = "cycled" ]; then - rm -f gfs_${format}a.txt - touch gfs_${format}a.txt + if [[ ${MODE} = "cycled" ]]; then + rm -f gfs_netcdfa.txt + touch gfs_netcdfa.txt fi fi - if [ $DO_DOWN = "YES" ]; then + if [[ ${DO_DOWN} = "YES" ]]; then rm -f gfs_downstream.txt touch gfs_downstream.txt fi - dirpath="gfs.${PDY}/${cyc}/atmos/" - dirname="./${dirpath}" - head="gfs.t${cyc}z." 
- if [ $ARCH_GAUSSIAN = "YES" ]; then - echo "${dirname}${head}pgrb2b.0p25.anl " >>gfs_pgrb2b.txt - echo "${dirname}${head}pgrb2b.0p25.anl.idx " >>gfs_pgrb2b.txt - echo "${dirname}${head}pgrb2b.1p00.anl " >>gfs_pgrb2b.txt - echo "${dirname}${head}pgrb2b.1p00.anl.idx " >>gfs_pgrb2b.txt - - if [ $MODE = "cycled" ]; then - echo "${dirname}${head}atmanl${SUFFIX} " >>gfs_${format}a.txt - echo "${dirname}${head}sfcanl${SUFFIX} " >>gfs_${format}a.txt - echo "${dirname}${head}atmi*.nc " >>gfs_${format}a.txt - echo "${dirname}${head}dtfanl.nc " >>gfs_${format}a.txt - echo "${dirname}${head}loginc.txt " >>gfs_${format}a.txt + if [[ ${ARCH_GAUSSIAN} = "YES" ]]; then + { + echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2b.0p25.anl" + echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2b.0p25.anl.idx" + echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2b.1p00.anl" + echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2b.1p00.anl.idx" + } >> gfs_pgrb2b.txt + + if [[ ${MODE} = "cycled" ]]; then + { + echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atmanl.nc" + echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}sfcanl.nc" + echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atmi*.nc" + echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}dtfanl.nc" + echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}loginc.txt" + } >> gfs_netcdfa.txt fi fh=0 - while [ $fh -le $ARCH_GAUSSIAN_FHMAX ]; do - fhr=$(printf %03i $fh) - echo "${dirname}${head}atmf${fhr}${SUFFIX} " >>gfs_${format}b.txt - echo "${dirname}${head}sfcf${fhr}${SUFFIX} " >>gfs_${format}b.txt + while (( fh <= ARCH_GAUSSIAN_FHMAX )); do + fhr=$(printf %03i "${fh}") + { + echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/${head}atmf${fhr}.nc" + echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/${head}sfcf${fhr}.nc" + } >> gfs_netcdfb.txt fh=$((fh+ARCH_GAUSSIAN_FHINC)) done fi @@ -94,135 +87,156 @@ if [ $type = "gfs" ]; then #.................. 
 # Exclude the gfsarch.log file, which will change during the tar operation
 # This uses the bash extended globbing option
-  echo "./logs/${CDATE}/gfs!(arch).log " >>gfsa.txt
-  echo "${dirname}input.nml " >>gfsa.txt
-  if [ $MODE = "cycled" ]; then
-    echo "${dirname}${head}gsistat " >>gfsa.txt
-    echo "${dirname}${head}nsstbufr " >>gfsa.txt
-    echo "${dirname}${head}prepbufr " >>gfsa.txt
-    echo "${dirname}${head}prepbufr.acft_profiles " >>gfsa.txt
-  fi
-  echo "${dirname}${head}pgrb2.0p25.anl " >>gfsa.txt
-  echo "${dirname}${head}pgrb2.0p25.anl.idx " >>gfsa.txt
-  #Only generated if there are cyclones to track
-  cyclone_files=(avno.t${cyc}z.cyclone.trackatcfunix
-                 avnop.t${cyc}z.cyclone.trackatcfunix
-                 trak.gfso.atcfunix.${PDY}${cyc}
-                 trak.gfso.atcfunix.altg.${PDY}${cyc}
-                 storms.gfso.atcf_gen.${PDY}${cyc}
-                 storms.gfso.atcf_gen.altg.${PDY}${cyc})
-
-  for file in ${cyclone_files[@]}; do
-    [[ -s $ROTDIR/${dirname}${file} ]] && echo "${dirname}${file}" >>gfsa.txt
-  done
+  {
+    echo "./logs/${PDY}${cyc}/gfs!(arch).log"
+    echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/input.nml"
+
+    if [[ ${MODE} = "cycled" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}gsistat"
+      echo "${COM_OBS/${ROTDIR}\//}/${head}nsstbufr"
+      echo "${COM_OBS/${ROTDIR}\//}/${head}prepbufr"
+      echo "${COM_OBS/${ROTDIR}\//}/${head}prepbufr.acft_profiles"
+    fi
 
-  if [ $DO_DOWN = "YES" ]; then
-    if [ $DO_BUFRSND = "YES" ]; then
-      echo "${dirname}gempak/gfs_${PDY}${cyc}.sfc " >>gfs_downstream.txt
-      echo "${dirname}gempak/gfs_${PDY}${cyc}.snd " >>gfs_downstream.txt
-      echo "${dirname}wmo/gfs_collective*.postsnd_${cyc} " >>gfs_downstream.txt
-      echo "${dirname}bufr.t${cyc}z " >>gfs_downstream.txt
-      echo "${dirname}gfs.t${cyc}z.bufrsnd.tar.gz " >>gfs_downstream.txt
-    fi
-    if [ $WAFSF = "YES" ]; then
-      echo "${dirname}wafsgfs*.t${cyc}z.gribf*.grib2 " >>gfs_downstream.txt
-      echo "${dirname}gfs.t${cyc}z.wafs_grb45f*.grib2 " >>gfs_downstream.txt
-      echo "${dirname}gfs.t${cyc}z.wafs_grb45f*.nouswafs.grib2 " >>gfs_downstream.txt
-      echo "${dirname}WAFS_blended_${PDY}${cyc}f*.grib2 " >>gfs_downstream.txt
-      echo "${dirname}gfs.t*z.gcip.f*.grib2 " >>gfs_downstream.txt
-      echo "${dirname}gfs.t${cyc}z.wafs_0p25.f*.grib2 " >>gfs_downstream.txt
-      echo "${dirname}gfs.t${cyc}z.wafs_0p25_unblended.f*.grib2" >>gfs_downstream.txt
-      echo "${dirname}WAFS_0p25_blended_${PDY}${cyc}f*.grib2 " >>gfs_downstream.txt
-    fi
-  fi
+    echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.anl"
+    echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.anl.idx"
 
-  echo "${dirname}${head}pgrb2.0p50.anl " >>gfsb.txt
-  echo "${dirname}${head}pgrb2.0p50.anl.idx " >>gfsb.txt
-  echo "${dirname}${head}pgrb2.1p00.anl " >>gfsb.txt
-  echo "${dirname}${head}pgrb2.1p00.anl.idx " >>gfsb.txt
+    #Only generated if there are cyclones to track
+    cyclone_files=("avno.t${cyc}z.cyclone.trackatcfunix"
+                   "avnop.t${cyc}z.cyclone.trackatcfunix"
+                   "trak.gfso.atcfunix.${PDY}${cyc}"
+                   "trak.gfso.atcfunix.altg.${PDY}${cyc}")
+    for file in "${cyclone_files[@]}"; do
+      [[ -s ${COM_ATMOS_TRACK}/${file} ]] && echo "${COM_ATMOS_TRACK/${ROTDIR}\//}/${file}"
+    done
 
-  fh=0
-  while [ $fh -le $FHMAX_GFS ]; do
-    fhr=$(printf %03i $fh)
-    if [ $ARCH_GAUSSIAN = "YES" ]; then
-      echo "${dirname}${head}sfluxgrbf${fhr}.grib2 " >>gfs_flux.txt
-      echo "${dirname}${head}sfluxgrbf${fhr}.grib2.idx " >>gfs_flux.txt
-
-      echo "${dirname}${head}pgrb2b.0p25.f${fhr} " >>gfs_pgrb2b.txt
-      echo "${dirname}${head}pgrb2b.0p25.f${fhr}.idx " >>gfs_pgrb2b.txt
-      if [ -s $ROTDIR/${dirpath}${head}pgrb2b.1p00.f${fhr} ]; then
-        echo "${dirname}${head}pgrb2b.1p00.f${fhr} " >>gfs_pgrb2b.txt
-        echo "${dirname}${head}pgrb2b.1p00.f${fhr}.idx " >>gfs_pgrb2b.txt
+    genesis_files=("storms.gfso.atcf_gen.${PDY}${cyc}"
+                   "storms.gfso.atcf_gen.altg.${PDY}${cyc}")
+    for file in "${genesis_files[@]}"; do
+      [[ -s ${COM_ATMOS_GENESIS}/${file} ]] && echo "${COM_ATMOS_GENESIS/${ROTDIR}\//}/${file}"
+    done
+  } >> gfsa.txt
+
+  {
+    if [[ ${DO_DOWN} = "YES" ]]; then
+      if [[ ${DO_BUFRSND} = "YES" ]]; then
+        echo "${COM_ATMOS_GEMPAK/${ROTDIR}\//}/gfs_${PDY}${cyc}.sfc"
+        echo "${COM_ATMOS_GEMPAK/${ROTDIR}\//}/gfs_${PDY}${cyc}.snd"
+        echo "${COM_ATMOS_WMO/${ROTDIR}\//}/gfs_collective*.postsnd_${cyc}"
+        echo "${COM_ATMOS_BUFR/${ROTDIR}\//}/bufr.t${cyc}z"
+        echo "${COM_ATMOS_BUFR/${ROTDIR}\//}/gfs.t${cyc}z.bufrsnd.tar.gz"
+      fi
+      if [[ ${WAFSF} = "YES" ]]; then
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/wafsgfs*.t${cyc}z.gribf*.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/gfs.t${cyc}z.wafs_grb45f*.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/gfs.t${cyc}z.wafs_grb45f*.nouswafs.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/WAFS_blended_${PDY}${cyc}f*.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/gfs.t*z.gcip.f*.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/gfs.t${cyc}z.wafs_0p25.f*.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/gfs.t${cyc}z.wafs_0p25_unblended.f*.grib2"
+        echo "${COM_ATMOS_WAFS/${ROTDIR}\//}/WAFS_0p25_blended_${PDY}${cyc}f*.grib2"
       fi
     fi
+  } >> gfs_downstream.txt
 
-    echo "${dirname}${head}pgrb2.0p25.f${fhr} " >>gfsa.txt
-    echo "${dirname}${head}pgrb2.0p25.f${fhr}.idx " >>gfsa.txt
-    echo "${dirname}${head}logf${fhr}.txt " >>gfsa.txt
+  {
+    echo "${COM_ATMOS_GRIB_0p50/${ROTDIR}\//}/${head}pgrb2.0p50.anl"
+    echo "${COM_ATMOS_GRIB_0p50/${ROTDIR}\//}/${head}pgrb2.0p50.anl.idx"
+    echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.anl"
+    echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.anl.idx"
+  } >> gfsb.txt
 
-    if [ -s $ROTDIR/${dirpath}${head}pgrb2.0p50.f${fhr} ]; then
-      echo "${dirname}${head}pgrb2.0p50.f${fhr} " >>gfsb.txt
-      echo "${dirname}${head}pgrb2.0p50.f${fhr}.idx " >>gfsb.txt
-    fi
-    if [ -s $ROTDIR/${dirpath}${head}pgrb2.1p00.f${fhr} ]; then
-      echo "${dirname}${head}pgrb2.1p00.f${fhr} " >>gfsb.txt
-      echo "${dirname}${head}pgrb2.1p00.f${fhr}.idx " >>gfsb.txt
+  fh=0
+  while (( fh <= FHMAX_GFS )); do
+    fhr=$(printf %03i "${fh}")
+    if [[ ${ARCH_GAUSSIAN} = "YES" ]]; then
+      {
+        echo "${COM_ATMOS_MASTER/${ROTDIR}\//}/${head}sfluxgrbf${fhr}.grib2"
+        echo "${COM_ATMOS_MASTER/${ROTDIR}\//}/${head}sfluxgrbf${fhr}.grib2.idx"
+      } >> gfs_flux.txt
+
+      {
+        echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2b.0p25.f${fhr}"
+        echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2b.0p25.f${fhr}.idx"
+        if [[ -s "${COM_ATMOS_GRIB_1p00}/${head}pgrb2b.1p00.f${fhr}" ]]; then
+          echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2b.1p00.f${fhr}"
+          echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2b.1p00.f${fhr}.idx"
+        fi
+      } >> gfs_pgrb2b.txt
     fi
 
-    inc=$FHOUT_GFS
-    if [ $FHMAX_HF_GFS -gt 0 -a $FHOUT_HF_GFS -gt 0 -a $fh -lt $FHMAX_HF_GFS ]; then
-      inc=$FHOUT_HF_GFS
+    {
+      echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.f${fhr}"
+      echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.f${fhr}.idx"
+      echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/${head}logf${fhr}.txt"
+    } >> gfsa.txt
+
+    {
+      if [[ -s "${COM_ATMOS_GRIB_0p50}/${head}pgrb2.0p50.f${fhr}" ]]; then
+        echo "${COM_ATMOS_GRIB_0p50/${ROTDIR}\//}/${head}pgrb2.0p50.f${fhr}"
+        echo "${COM_ATMOS_GRIB_0p50/${ROTDIR}\//}/${head}pgrb2.0p50.f${fhr}.idx"
+      fi
+      if [[ -s "${COM_ATMOS_GRIB_1p00}/${head}pgrb2.1p00.f${fhr}" ]]; then
+        echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.f${fhr}"
+        echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.f${fhr}.idx"
      fi
    } >> gfsb.txt
 
+    inc=${FHOUT_GFS}
+    if (( FHMAX_HF_GFS > 0 && FHOUT_HF_GFS > 0 && fh < FHMAX_HF_GFS )); then
+      inc=${FHOUT_HF_GFS}
+    fi
     fh=$((fh+inc))
   done
 
  #..................
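The forecast-hour loop above switches to a finer output increment inside the high-frequency window. A standalone sketch of that pattern with illustrative (not operational) values for the increment variables:

```shell
#!/usr/bin/env bash
# Illustrative values only; the workflow reads these from its config
FHMAX_GFS=24 FHOUT_GFS=6 FHMAX_HF_GFS=12 FHOUT_HF_GFS=3

fh=0
hours=()
while (( fh <= FHMAX_GFS )); do
  hours+=("$(printf %03i "${fh}")")   # zero-padded, e.g. 003
  inc=${FHOUT_GFS}
  # use the finer increment while still inside the high-frequency window
  if (( FHMAX_HF_GFS > 0 && FHOUT_HF_GFS > 0 && fh < FHMAX_HF_GFS )); then
    inc=${FHOUT_HF_GFS}
  fi
  fh=$((fh+inc))
done
echo "${hours[@]}"   # 000 003 006 009 012 018 024
```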
-  if [ $MODE = "cycled" ]; then
-    echo "${dirname}RESTART/*0000.sfcanl_data.tile1.nc " >>gfs_restarta.txt
-    echo "${dirname}RESTART/*0000.sfcanl_data.tile2.nc " >>gfs_restarta.txt
-    echo "${dirname}RESTART/*0000.sfcanl_data.tile3.nc " >>gfs_restarta.txt
-    echo "${dirname}RESTART/*0000.sfcanl_data.tile4.nc " >>gfs_restarta.txt
-    echo "${dirname}RESTART/*0000.sfcanl_data.tile5.nc " >>gfs_restarta.txt
-    echo "${dirname}RESTART/*0000.sfcanl_data.tile6.nc " >>gfs_restarta.txt
-  elif [ $MODE = "forecast-only" ]; then
-    echo "${dirname}INPUT/gfs_ctrl.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/gfs_data.tile1.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/gfs_data.tile2.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/gfs_data.tile3.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/gfs_data.tile4.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/gfs_data.tile5.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/gfs_data.tile6.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/sfc_data.tile1.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/sfc_data.tile2.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/sfc_data.tile3.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/sfc_data.tile4.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/sfc_data.tile5.nc " >>gfs_restarta.txt
-    echo "${dirname}INPUT/sfc_data.tile6.nc " >>gfs_restarta.txt
-  fi
+  {
+    if [[ ${MODE} = "cycled" ]]; then
+      echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile1.nc"
+      echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile2.nc"
+      echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile3.nc"
+      echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile4.nc"
+      echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile5.nc"
+      echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile6.nc"
+    elif [[ ${MODE} = "forecast-only" ]]; then
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_ctrl.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_data.tile1.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_data.tile2.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_data.tile3.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_data.tile4.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_data.tile5.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/gfs_data.tile6.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/sfc_data.tile1.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/sfc_data.tile2.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/sfc_data.tile3.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/sfc_data.tile4.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/sfc_data.tile5.nc"
+      echo "${COM_ATMOS_INPUT/${ROTDIR}\//}/sfc_data.tile6.nc"
+    fi
+  } >> gfs_restarta.txt
 
  #..................
-  if [ $DO_WAVE = "YES" ]; then
+  if [[ ${DO_WAVE} = "YES" ]]; then
 
     rm -rf gfswave.txt
     touch gfswave.txt
 
-    dirpath="gfs.${PDY}/${cyc}/wave/"
-    dirname="./${dirpath}"
-
     head="gfswave.t${cyc}z."
 
  #...........................
-    echo "${dirname}rundata/ww3_multi* " >>gfswave.txt
-    echo "${dirname}gridded/${head}* " >>gfswave.txt
-    echo "${dirname}station/${head}* " >>gfswave.txt
-
+    {
+      echo "${COM_WAVE_HISTORY/${ROTDIR}\//}/ww3_multi*"
+      echo "${COM_WAVE_GRID/${ROTDIR}\//}/${head}*"
+      echo "${COM_WAVE_STATION/${ROTDIR}\//}/${head}*"
+    } >> gfswave.txt
   fi
 
-  if [ $DO_OCN = "YES" ]; then
-    dirpath="gfs.${PDY}/${cyc}/ocean/"
-    dirname="./${dirpath}"
+  if [[ ${DO_OCN} = "YES" ]]; then
 
     head="gfs.t${cyc}z."
@@ -233,7 +247,6 @@ if [ $type = "gfs" ]; then
     rm -f ocn_3D.txt
     rm -f ocn_xsect.txt
     rm -f ocn_daily.txt
-    rm -f wavocn.txt
     touch gfs_flux_1p00.txt
     touch ocn_ice_grib2_0p5.txt
     touch ocn_ice_grib2_0p25.txt
@@ -241,44 +254,39 @@ if [ $type = "gfs" ]; then
     touch ocn_3D.txt
     touch ocn_xsect.txt
     touch ocn_daily.txt
-    touch wavocn.txt
 
-    echo "${dirname}MOM_input " >>ocn_2D.txt
-    echo "${dirname}ocn_2D* " >>ocn_2D.txt
-    echo "${dirname}ocn_3D* " >>ocn_3D.txt
-    echo "${dirname}ocn*EQ* " >>ocn_xsect.txt
-    echo "${dirname}ocn_daily* " >>ocn_daily.txt
-    echo "${dirname}wavocn* " >>wavocn.txt
-    echo "${dirname}ocn_ice*0p5x0p5.grb2 " >>ocn_ice_grib2_0p5.txt
-    echo "${dirname}ocn_ice*0p25x0p25.grb2 " >>ocn_ice_grib2_0p25.txt
-
-    dirpath="gfs.${PDY}/${cyc}/atmos/"
-    dirname="./${dirpath}"
-    echo "${dirname}${head}flux.1p00.f??? " >>gfs_flux_1p00.txt
-    echo "${dirname}${head}flux.1p00.f???.idx " >>gfs_flux_1p00.txt
+    echo "${COM_OCEAN_INPUT/${ROTDIR}\//}/MOM_input" >> ocn_2D.txt
+    echo "${COM_OCEAN_HISTORY/${ROTDIR}\//}/ocn_2D*" >> ocn_2D.txt
+    echo "${COM_OCEAN_HISTORY/${ROTDIR}\//}/ocn_3D*" >> ocn_3D.txt
+    echo "${COM_OCEAN_XSECT/${ROTDIR}\//}/ocn*EQ*" >> ocn_xsect.txt
+    echo "${COM_OCEAN_DAILY/${ROTDIR}\//}/ocn_daily*" >> ocn_daily.txt
+    echo "${COM_OCEAN_GRIB_0p50/${ROTDIR}\//}/ocn_ice*0p5x0p5.grb2" >> ocn_ice_grib2_0p5.txt
+    echo "${COM_OCEAN_GRIB_0p25/${ROTDIR}\//}/ocn_ice*0p25x0p25.grb2" >> ocn_ice_grib2_0p25.txt
+
+    # Also save fluxes from atmosphere
+    {
+      echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}flux.1p00.f???"
+      echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}flux.1p00.f???.idx"
+    } >> gfs_flux_1p00.txt
   fi
 
-  if [ $DO_ICE = "YES" ]; then
-    dirpath="gfs.${PDY}/${cyc}/ice/"
-    dirname="./${dirpath}"
-
+  if [[ ${DO_ICE} = "YES" ]]; then
     head="gfs.t${cyc}z."
     rm -f ice.txt
     touch ice.txt
 
-    echo "${dirname}ice_in " >>ice.txt
-    echo "${dirname}ice*nc " >>ice.txt
+    {
+      echo "${COM_ICE_INPUT/${ROTDIR}\//}/ice_in"
+      echo "${COM_ICE_HISTORY/${ROTDIR}\//}/ice*nc"
+    } >> ice.txt
   fi
 
-  if [ $DO_AERO = "YES" ]; then
-    dirpath="gfs.${PDY}/${cyc}/chem"
-    dirname="./${dirpath}"
-
+  if [[ ${DO_AERO} = "YES" ]]; then
     head="gocart"
 
     rm -f chem.txt
     touch chem.txt
 
-    echo "${dirname}/${head}*" >> chem.txt
+    echo "${COM_CHEM_HISTORY/${ROTDIR}\//}/${head}*" >> chem.txt
   fi
 
 #-----------------------------------------------------
@@ -288,7 +296,7 @@ fi ##end of gfs
 
 #-----------------------------------------------------
-if [ $type = "gdas" ]; then
+if [[ ${type} == "gdas" ]]; then
 #-----------------------------------------------------
 
   rm -f gdas.txt
@@ -298,115 +306,159 @@ if [ $type = "gdas" ]; then
   touch gdas_restarta.txt
   touch gdas_restartb.txt
 
-  dirpath="gdas.${PDY}/${cyc}/atmos/"
-  dirname="./${dirpath}"
   head="gdas.t${cyc}z."
 
  #..................
-  echo "${dirname}${head}gsistat " >>gdas.txt
-  echo "${dirname}${head}pgrb2.0p25.anl " >>gdas.txt
-  echo "${dirname}${head}pgrb2.0p25.anl.idx " >>gdas.txt
-  echo "${dirname}${head}pgrb2.1p00.anl " >>gdas.txt
-  echo "${dirname}${head}pgrb2.1p00.anl.idx " >>gdas.txt
-  echo "${dirname}${head}atmanl${SUFFIX} " >>gdas.txt
-  echo "${dirname}${head}sfcanl${SUFFIX} " >>gdas.txt
-  if [ -s $ROTDIR/${dirpath}${head}atmanl.ensres${SUFFIX} ]; then
-    echo "${dirname}${head}atmanl.ensres${SUFFIX} " >>gdas.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}atma003.ensres${SUFFIX} ]; then
-    echo "${dirname}${head}atma003.ensres${SUFFIX} " >>gdas.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}atma009.ensres${SUFFIX} ]; then
-    echo "${dirname}${head}atma009.ensres${SUFFIX} " >>gdas.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}cnvstat ]; then
-    echo "${dirname}${head}cnvstat " >>gdas.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}oznstat ]; then
-    echo "${dirname}${head}oznstat " >>gdas.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}radstat ]; then
-    echo "${dirname}${head}radstat " >>gdas.txt
-  fi
-  for fstep in prep anal gldas fcst vrfy radmon minmon oznmon; do
-    if [ -s $ROTDIR/logs/${CDATE}/gdas${fstep}.log ]; then
-      echo "./logs/${CDATE}/gdas${fstep}.log " >>gdas.txt
-    fi
-  done
-  echo "./logs/${CDATE}/gdaspost*.log " >>gdas.txt
-
-  fh=0
-  while [ $fh -le 9 ]; do
-    fhr=$(printf %03i $fh)
-    echo "${dirname}${head}sfluxgrbf${fhr}.grib2 " >>gdas.txt
-    echo "${dirname}${head}sfluxgrbf${fhr}.grib2.idx " >>gdas.txt
-    echo "${dirname}${head}pgrb2.0p25.f${fhr} " >>gdas.txt
-    echo "${dirname}${head}pgrb2.0p25.f${fhr}.idx " >>gdas.txt
-    echo "${dirname}${head}pgrb2.1p00.f${fhr} " >>gdas.txt
-    echo "${dirname}${head}pgrb2.1p00.f${fhr}.idx " >>gdas.txt
-    echo "${dirname}${head}logf${fhr}.txt " >>gdas.txt
-    echo "${dirname}${head}atmf${fhr}${SUFFIX} " >>gdas.txt
-    echo "${dirname}${head}sfcf${fhr}${SUFFIX} " >>gdas.txt
-    fh=$((fh+3))
-  done
-  flist="001 002 004 005 007 008"
-  for fhr in $flist; do
-    echo "${dirname}${head}sfluxgrbf${fhr}.grib2 " >>gdas.txt
-    echo "${dirname}${head}sfluxgrbf${fhr}.grib2.idx " >>gdas.txt
-  done
-
+  {
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}gsistat"
+    echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.anl"
+    echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.anl.idx"
+    echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.anl"
+    echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.anl.idx"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atmanl.nc"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}sfcanl.nc"
+    if [[ -s "${COM_ATMOS_ANALYSIS}/${head}atmanl.ensres.nc" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atmanl.ensres.nc"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS}/${head}atma003.ensres.nc" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atma003.ensres.nc"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS}/${head}atma009.ensres.nc" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atma009.ensres.nc"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS}/${head}cnvstat" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}cnvstat"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS}/${head}oznstat" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}oznstat"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS}/${head}radstat" ]]; then
+      echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}radstat"
+    fi
+    for fstep in prep anal gldas fcst vrfy radmon minmon oznmon; do
+      if [[ -s "${ROTDIR}/logs/${PDY}${cyc}/gdas${fstep}.log" ]]; then
+        echo "./logs/${PDY}${cyc}/gdas${fstep}.log"
+      fi
+    done
+    echo "./logs/${PDY}${cyc}/gdaspost*.log"
+
+    fh=0
+    while [[ ${fh} -le 9 ]]; do
+      fhr=$(printf %03i "${fh}")
+      echo "${COM_ATMOS_MASTER/${ROTDIR}\//}/${head}sfluxgrbf${fhr}.grib2"
+      echo "${COM_ATMOS_MASTER/${ROTDIR}\//}/${head}sfluxgrbf${fhr}.grib2.idx"
+      echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.f${fhr}"
+      echo "${COM_ATMOS_GRIB_0p25/${ROTDIR}\//}/${head}pgrb2.0p25.f${fhr}.idx"
+      echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.f${fhr}"
+      echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}pgrb2.1p00.f${fhr}.idx"
+      echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/${head}logf${fhr}.txt"
+      echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/${head}atmf${fhr}.nc"
+      echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/${head}sfcf${fhr}.nc"
+      fh=$((fh+3))
+    done
+    flist="001 002 004 005 007 008"
+    for fhr in ${flist}; do
+      file="${COM_ATMOS_MASTER/${ROTDIR}\//}/${head}sfluxgrbf${fhr}.grib2"
+      if [[ -s "${file}" ]]; then
+        echo "${file}"
+        echo "${file}.idx"
+      fi
+    done
+  } >> gdas.txt
 
  #..................
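A note on the `{ …; } >> list` idiom that replaces the per-line `>>` appends above: grouping the echos in a brace compound command opens the output file once for the whole group instead of once per echo. A minimal sketch with illustrative file names only:

```shell
#!/usr/bin/env bash
# Group several echos and redirect the whole group once,
# rather than repeating ">> list" on every line.
out=$(mktemp)
{
  echo "gfs.20230501/00/products/atmos/grib2/0p25/anl.grib2"
  echo "gfs.20230501/00/products/atmos/grib2/0p25/anl.grib2.idx"
} >> "${out}"
wc -l < "${out}"   # 2
```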
-  if [ -s $ROTDIR/${dirpath}${head}cnvstat ]; then
-    echo "${dirname}${head}cnvstat " >>gdas_restarta.txt
+  if [[ -s "${COM_ATMOS_ANALYSIS}/${head}cnvstat" ]]; then
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}cnvstat" >> gdas_restarta.txt
   fi
-  if [ -s $ROTDIR/${dirpath}${head}radstat ]; then
-    echo "${dirname}${head}radstat " >>gdas_restarta.txt
+  if [[ -s "${COM_ATMOS_ANALYSIS}/${head}radstat" ]]; then
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}radstat" >> gdas_restarta.txt
   fi
 
-  echo "${dirname}${head}nsstbufr " >>gdas_restarta.txt
-  echo "${dirname}${head}prepbufr " >>gdas_restarta.txt
-  echo "${dirname}${head}prepbufr.acft_profiles " >>gdas_restarta.txt
-  echo "${dirname}${head}abias " >>gdas_restarta.txt
-  echo "${dirname}${head}abias_air " >>gdas_restarta.txt
-  echo "${dirname}${head}abias_int " >>gdas_restarta.txt
-  echo "${dirname}${head}abias_pc " >>gdas_restarta.txt
-  echo "${dirname}${head}atmi*nc " >>gdas_restarta.txt
-  echo "${dirname}${head}dtfanl.nc " >>gdas_restarta.txt
-  echo "${dirname}${head}loginc.txt " >>gdas_restarta.txt
-
-  echo "${dirname}RESTART/*0000.sfcanl_data.tile1.nc " >>gdas_restarta.txt
-  echo "${dirname}RESTART/*0000.sfcanl_data.tile2.nc " >>gdas_restarta.txt
-  echo "${dirname}RESTART/*0000.sfcanl_data.tile3.nc " >>gdas_restarta.txt
-  echo "${dirname}RESTART/*0000.sfcanl_data.tile4.nc " >>gdas_restarta.txt
-  echo "${dirname}RESTART/*0000.sfcanl_data.tile5.nc " >>gdas_restarta.txt
-  echo "${dirname}RESTART/*0000.sfcanl_data.tile6.nc " >>gdas_restarta.txt
+  {
+    echo "${COM_OBS/${ROTDIR}\//}/${head}nsstbufr"
+    echo "${COM_OBS/${ROTDIR}\//}/${head}prepbufr"
+    echo "${COM_OBS/${ROTDIR}\//}/${head}prepbufr.acft_profiles"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}abias"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}abias_air"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}abias_int"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}abias_pc"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}atmi*nc"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}dtfanl.nc"
+    echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}loginc.txt"
+
+    echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile1.nc"
+    echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile2.nc"
+    echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile3.nc"
+    echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile4.nc"
+    echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile5.nc"
+    echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile6.nc"
+  } >> gdas_restarta.txt
 
  #..................
-  echo "${dirname}RESTART " >>gdas_restartb.txt
+  echo "${COM_ATMOS_RESTART/${ROTDIR}\//}" >> gdas_restartb.txt
 
  #..................
-  if [ $DO_WAVE = "YES" ]; then
+  if [[ ${DO_WAVE} = "YES" ]]; then
 
     rm -rf gdaswave.txt
     touch gdaswave.txt
     rm -rf gdaswave_restart.txt
     touch gdaswave_restart.txt
 
-    dirpath="gdas.${PDY}/${cyc}/wave/"
-    dirname="./${dirpath}"
-
     head="gdaswave.t${cyc}z."
 
  #...........................
-    echo "${dirname}gridded/${head}* " >>gdaswave.txt
-    echo "${dirname}station/${head}* " >>gdaswave.txt
+    {
+      echo "${COM_WAVE_GRID/${ROTDIR}\//}/${head}*"
+      echo "${COM_WAVE_STATION/${ROTDIR}\//}/${head}*"
    } >> gdaswave.txt
 
-    echo "${dirname}restart/* " >>gdaswave_restart.txt
+    echo "${COM_WAVE_RESTART/${ROTDIR}\//}/*" >> gdaswave_restart.txt
   fi
 
+  #..................
+  if [[ ${DO_OCN} = "YES" ]]; then
+
+    rm -rf gdasocean.txt
+    touch gdasocean.txt
+    rm -rf gdasocean_restart.txt
+    touch gdasocean_restart.txt
+
+    head="gdas.t${cyc}z."
+
+    #...........................
+    {
+      echo "${COM_OCEAN_HISTORY/${ROTDIR}\//}/${head}*"
+      echo "${COM_OCEAN_INPUT/${ROTDIR}\//}"
+    } >> gdasocean.txt
+
+    {
+      echo "${COM_OCEAN_RESTART/${ROTDIR}\//}/*"
+      echo "${COM_MED_RESTART/${ROTDIR}\//}/*"
+    } >> gdasocean_restart.txt
+
+  fi
+
+  if [[ ${DO_ICE} = "YES" ]]; then
+
+    rm -rf gdasice.txt
+    touch gdasice.txt
+    rm -rf gdasice_restart.txt
+    touch gdasice_restart.txt
+
+    head="gdas.t${cyc}z."
+
+    #...........................
+    {
+      echo "${COM_ICE_HISTORY/${ROTDIR}\//}/${head}*"
+      echo "${COM_ICE_INPUT/${ROTDIR}\//}/ice_in"
+    } >> gdasice.txt
+
+    echo "${COM_ICE_RESTART/${ROTDIR}\//}/*" >> gdasice_restart.txt
+
+  fi
+
 #-----------------------------------------------------
 fi  ##end of gdas
@@ -414,167 +466,180 @@ fi  ##end of gdas
 
 #-----------------------------------------------------
-if [ $type = "enkfgdas" -o $type = "enkfgfs" ]; then
+if [[ ${type} == "enkfgdas" || ${type} == "enkfgfs" ]]; then
 #-----------------------------------------------------
 
   IAUFHRS_ENKF=${IAUFHRS_ENKF:-6}
   lobsdiag_forenkf=${lobsdiag_forenkf:-".false."}
-  nfhrs=$(echo $IAUFHRS_ENKF | sed 's/,/ /g')
+  nfhrs="${IAUFHRS_ENKF//,/ }"
   NMEM_ENKF=${NMEM_ENKF:-80}
   NMEM_EARCGRP=${NMEM_EARCGRP:-10} ##number of ens memebers included in each tarball
   NTARS=$((NMEM_ENKF/NMEM_EARCGRP))
-  [[ $NTARS -eq 0 ]] && NTARS=1
-  [[ $((NTARS*NMEM_EARCGRP)) -lt $NMEM_ENKF ]] && NTARS=$((NTARS+1))
-##NTARS2=$((NTARS/2))  # number of earc groups to include analysis/increments
-  NTARS2=$NTARS
+  [[ ${NTARS} -eq 0 ]] && NTARS=1
+  [[ $((NTARS*NMEM_EARCGRP)) -lt ${NMEM_ENKF} ]] && NTARS=$((NTARS+1))
+  ##NTARS2=$((NTARS/2))  # number of earc groups to include analysis/increments
+  NTARS2=${NTARS}
 
-  dirpath="enkf${CDUMP}.${PDY}/${cyc}/atmos/"
-  dirname="./${dirpath}"
-  head="${CDUMP}.t${cyc}z."
+  head="${RUN}.t${cyc}z."
 
  #..................
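Two asides on the arithmetic just above. First, the tarball count is an integer division rounded up so every member lands in some group. Second, the old `sed 's/,/ /g'` turned every comma in `IAUFHRS_ENKF` into a space; the pure-bash equivalent needs `${var//,/ }` (double slash for global replacement, with a space as the replacement). A sketch with example member counts (the values are illustrative, not operational defaults):

```shell
#!/usr/bin/env bash
# Ceiling division: 25 members in groups of 10 needs 3 tarballs
NMEM_ENKF=25 NMEM_EARCGRP=10
NTARS=$((NMEM_ENKF/NMEM_EARCGRP))
[[ ${NTARS} -eq 0 ]] && NTARS=1
[[ $((NTARS*NMEM_EARCGRP)) -lt ${NMEM_ENKF} ]] && NTARS=$((NTARS+1))
echo "${NTARS}"   # 3

# Comma-to-space split: ${var//,/ } replaces *all* commas,
# matching the old "sed 's/,/ /g'" behaviour
IAUFHRS_ENKF="3,6,9"
nfhrs="${IAUFHRS_ENKF//,/ }"
echo "${nfhrs}"   # 3 6 9
```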
-  rm -f enkf${CDUMP}.txt
-  touch enkf${CDUMP}.txt
-
-  echo "${dirname}${head}enkfstat " >>enkf${CDUMP}.txt
-  echo "${dirname}${head}gsistat.ensmean " >>enkf${CDUMP}.txt
-  if [ -s $ROTDIR/${dirpath}${head}cnvstat.ensmean ]; then
-    echo "${dirname}${head}cnvstat.ensmean " >>enkf${CDUMP}.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}oznstat.ensmean ]; then
-    echo "${dirname}${head}oznstat.ensmean " >>enkf${CDUMP}.txt
-  fi
-  if [ -s $ROTDIR/${dirpath}${head}radstat.ensmean ]; then
-    echo "${dirname}${head}radstat.ensmean " >>enkf${CDUMP}.txt
-  fi
-  for FHR in $nfhrs; do  # loop over analysis times in window
-    if [ $FHR -eq 6 ]; then
-      if [ -s $ROTDIR/${dirpath}${head}atmanl.ensmean${SUFFIX} ]; then
-        echo "${dirname}${head}atmanl.ensmean${SUFFIX} " >>enkf${CDUMP}.txt
-      fi
-      if [ -s $ROTDIR/${dirpath}${head}atminc.ensmean${SUFFIX} ]; then
-        echo "${dirname}${head}atminc.ensmean${SUFFIX} " >>enkf${CDUMP}.txt
-      fi
-    else
-      if [ -s $ROTDIR/${dirpath}${head}atma00${FHR}.ensmean${SUFFIX} ]; then
-        echo "${dirname}${head}atma00${FHR}.ensmean${SUFFIX} " >>enkf${CDUMP}.txt
-      fi
-      if [ -s $ROTDIR/${dirpath}${head}atmi00${FHR}.ensmean${SUFFIX} ]; then
-        echo "${dirname}${head}atmi00${FHR}.ensmean${SUFFIX} " >>enkf${CDUMP}.txt
-      fi
-    fi
-  done # loop over FHR
-  for fstep in eobs ecen esfc eupd efcs epos ; do
-    echo "logs/${CDATE}/${CDUMP}${fstep}*.log " >>enkf${CDUMP}.txt
-  done
-
-# eomg* are optional jobs
-  for log in $ROTDIR/logs/${CDATE}/${CDUMP}eomg*.log; do
-    if [ -s "$log" ]; then
-      echo "logs/${CDATE}/${CDUMP}eomg*.log " >>enkf${CDUMP}.txt
-    fi
-    break
-  done
-
-# Ensemble spread file only available with netcdf output
-  fh=3
-  while [ $fh -le 9 ]; do
-    fhr=$(printf %03i $fh)
-    echo "${dirname}${head}atmf${fhr}.ensmean${SUFFIX} " >>enkf${CDUMP}.txt
-    echo "${dirname}${head}sfcf${fhr}.ensmean${SUFFIX} " >>enkf${CDUMP}.txt
-    if [ $OUTPUT_FILE = "netcdf" ]; then
-      if [ -s $ROTDIR/${dirpath}${head}atmf${fhr}.ensspread${SUFFIX} ]; then
-        echo "${dirname}${head}atmf${fhr}.ensspread${SUFFIX} " >>enkf${CDUMP}.txt
-      fi
-    fi
-    fh=$((fh+3))
-  done
+  rm -f "${RUN}.txt"
+  touch "${RUN}.txt"
+
+  {
+    echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}enkfstat"
+    echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}gsistat.ensmean"
+    if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}cnvstat.ensmean" ]]; then
+      echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}cnvstat.ensmean"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}oznstat.ensmean" ]]; then
+      echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}oznstat.ensmean"
+    fi
+    if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}radstat.ensmean" ]]; then
+      echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}radstat.ensmean"
+    fi
+    for FHR in $nfhrs; do  # loop over analysis times in window
+      if [ $FHR -eq 6 ]; then
+        if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}atmanl.ensmean.nc" ]]; then
+          echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}atmanl.ensmean.nc"
+        fi
+        if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}atminc.ensmean.nc" ]]; then
+          echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}atminc.ensmean.nc"
+        fi
+      else
+        if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}atma00${FHR}.ensmean.nc" ]]; then
+          echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}atma00${FHR}.ensmean.nc"
+        fi
+        if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}atmi00${FHR}.ensmean.nc" ]]; then
+          echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}atmi00${FHR}.ensmean.nc"
+        fi
+      fi
+    done # loop over FHR
+    for fstep in eobs ecen esfc eupd efcs epos ; do
+      echo "logs/${PDY}${cyc}/${RUN}${fstep}*.log"
+    done
+    # eomg* are optional jobs
+    for log in "${ROTDIR}/logs/${PDY}${cyc}/${RUN}eomg"*".log"; do
+      if [[ -s "${log}" ]]; then
+        echo "logs/${PDY}${cyc}/${RUN}eomg*.log"
+      fi
+      break
+    done
 
+    # Ensemble spread file only available with netcdf output
+    fh=3
+    while [ $fh -le 9 ]; do
+      fhr=$(printf %03i $fh)
+      echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}atmf${fhr}.ensmean.nc"
+      echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}sfcf${fhr}.ensmean.nc"
+      if [[ -s "${COM_ATMOS_ANALYSIS_ENSSTAT}/${head}atmf${fhr}.ensspread.nc" ]]; then
+        echo "${COM_ATMOS_ANALYSIS_ENSSTAT/${ROTDIR}\//}/${head}atmf${fhr}.ensspread.nc"
+      fi
+      fh=$((fh+3))
+    done
+  } >> "${RUN}.txt"
 
  #...........................
   n=1
-  while [ $n -le $NTARS ]; do
-  #...........................
+  while (( n <= NTARS )); do
+    #...........................
 
-    rm -f enkf${CDUMP}_grp${n}.txt
-    rm -f enkf${CDUMP}_restarta_grp${n}.txt
-    rm -f enkf${CDUMP}_restartb_grp${n}.txt
-    touch enkf${CDUMP}_grp${n}.txt
-    touch enkf${CDUMP}_restarta_grp${n}.txt
-    touch enkf${CDUMP}_restartb_grp${n}.txt
-
-    m=1
-    while [ $m -le $NMEM_EARCGRP ]; do
-      nm=$(((n-1)*NMEM_EARCGRP+m))
-      mem=$(printf %03i $nm)
-      dirpath="enkf${CDUMP}.${PDY}/${cyc}/atmos/mem${mem}/"
-      dirname="./${dirpath}"
-      head="${CDUMP}.t${cyc}z."
+    rm -f "${RUN}_grp${n}.txt"
+    rm -f "${RUN}_restarta_grp${n}.txt"
+    rm -f "${RUN}_restartb_grp${n}.txt"
+    touch "${RUN}_grp${n}.txt"
+    touch "${RUN}_restarta_grp${n}.txt"
+    touch "${RUN}_restartb_grp${n}.txt"
+
+    m=1
+    while (( m <= NMEM_EARCGRP )); do
+      nm=$(((n-1)*NMEM_EARCGRP+m))
+      mem=$(printf %03i ${nm})
+      head="${RUN}.t${cyc}z."
+
+      MEMDIR="mem${mem}" YMD=${PDY} HH=${cyc} generate_com \
+        COM_ATMOS_ANALYSIS_MEM:COM_ATMOS_ANALYSIS_TMPL \
+        COM_ATMOS_RESTART_MEM:COM_ATMOS_RESTART_TMPL
 
       #---
       for FHR in $nfhrs; do  # loop over analysis times in window
-        if [ $FHR -eq 6 ]; then
-          if [ $n -le $NTARS2 ]; then
-            if [ -s $ROTDIR/${dirpath}${head}atmanl${SUFFIX} ] ; then
-              echo "${dirname}${head}atmanl${SUFFIX} " >>enkf${CDUMP}_grp${n}.txt
-            fi
-            if [ -s $ROTDIR/${dirpath}${head}ratminc${SUFFIX} ] ; then
-              echo "${dirname}${head}ratminc${SUFFIX} " >>enkf${CDUMP}_grp${n}.txt
-            fi
-          fi
-          if [ -s $ROTDIR/${dirpath}${head}ratminc${SUFFIX} ] ; then
-            echo "${dirname}${head}ratminc${SUFFIX} " >>enkf${CDUMP}_restarta_grp${n}.txt
-          fi
-
-        else
-          if [ $n -le $NTARS2 ]; then
-            if [ -s $ROTDIR/${dirpath}${head}atma00${FHR}${SUFFIX} ] ; then
-              echo "${dirname}${head}atma00${FHR}${SUFFIX} " >>enkf${CDUMP}_grp${n}.txt
-            fi
-            if [ -s $ROTDIR/${dirpath}${head}ratmi00${FHR}${SUFFIX} ] ; then
-              echo "${dirname}${head}ratmi00${FHR}${SUFFIX} " >>enkf${CDUMP}_grp${n}.txt
-            fi
-          fi
-          if [ -s $ROTDIR/${dirpath}${head}ratmi00${FHR}${SUFFIX} ] ; then
-            echo "${dirname}${head}ratmi00${FHR}${SUFFIX} " >>enkf${CDUMP}_restarta_grp${n}.txt
-          fi
-
-        fi
-        echo "${dirname}${head}atmf00${FHR}${SUFFIX} " >>enkf${CDUMP}_grp${n}.txt
-        if [ $FHR -eq 6 ]; then
-          echo "${dirname}${head}sfcf00${FHR}${SUFFIX} " >>enkf${CDUMP}_grp${n}.txt
-        fi
-      done # loop over FHR
-
-      if [[ lobsdiag_forenkf = ".false." ]] ; then
-        echo "${dirname}${head}gsistat " >>enkf${CDUMP}_grp${n}.txt
-        if [ -s $ROTDIR/${dirpath}${head}cnvstat ] ; then
-          echo "${dirname}${head}cnvstat " >>enkf${CDUMP}_grp${n}.txt
-        fi
-        if [ -s $ROTDIR/${dirpath}${head}radstat ]; then
-          echo "${dirname}${head}radstat " >>enkf${CDUMP}_restarta_grp${n}.txt
-        fi
-        if [ -s $ROTDIR/${dirpath}${head}cnvstat ]; then
-          echo "${dirname}${head}cnvstat " >>enkf${CDUMP}_restarta_grp${n}.txt
-        fi
-        echo "${dirname}${head}abias " >>enkf${CDUMP}_restarta_grp${n}.txt
-        echo "${dirname}${head}abias_air " >>enkf${CDUMP}_restarta_grp${n}.txt
-        echo "${dirname}${head}abias_int " >>enkf${CDUMP}_restarta_grp${n}.txt
-        echo "${dirname}${head}abias_pc " >>enkf${CDUMP}_restarta_grp${n}.txt
-      fi
-      #---
-      echo "${dirname}RESTART/*0000.sfcanl_data.tile1.nc " >>enkf${CDUMP}_restarta_grp${n}.txt
-      echo "${dirname}RESTART/*0000.sfcanl_data.tile2.nc " >>enkf${CDUMP}_restarta_grp${n}.txt
-      echo "${dirname}RESTART/*0000.sfcanl_data.tile3.nc " >>enkf${CDUMP}_restarta_grp${n}.txt
-      echo "${dirname}RESTART/*0000.sfcanl_data.tile4.nc " >>enkf${CDUMP}_restarta_grp${n}.txt
-      echo "${dirname}RESTART/*0000.sfcanl_data.tile5.nc " >>enkf${CDUMP}_restarta_grp${n}.txt
-      echo "${dirname}RESTART/*0000.sfcanl_data.tile6.nc " >>enkf${CDUMP}_restarta_grp${n}.txt
-
-      #---
-      echo "${dirname}RESTART " >>enkf${CDUMP}_restartb_grp${n}.txt
-
-      m=$((m+1))
-    done
+        if [ $FHR -eq 6 ]; then
+          {
+            if (( n <= NTARS2 )); then
+              if [[ -s "${COM_ATMOS_ANALYSIS_MEM}/${head}atmanl.nc" ]] ; then
+                echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}atmanl.nc"
+              fi
+              if [[ -s "${COM_ATMOS_ANALYSIS_MEM}/${head}ratminc.nc" ]] ; then
+                echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}ratminc.nc"
+              fi
+            fi
+          } >> "${RUN}_grp${n}.txt"
+
+          if [[ -s "${COM_ATMOS_ANALYSIS_MEM}/${head}ratminc.nc" ]] ; then
+            echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}ratminc.nc" \
+              >> "${RUN}_restarta_grp${n}.txt"
+          fi
+
+        else
+          {
+            if (( n <= NTARS2 )); then
+              if [[ -s "${COM_ATMOS_ANALYSIS_MEM}/${head}atma00${FHR}.nc" ]] ; then
+                echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}atma00${FHR}.nc"
+              fi
+              if [[ -s "${COM_ATMOS_ANALYSIS_MEM}/${head}ratmi00${FHR}.nc" ]] ; then
+                echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}ratmi00${FHR}.nc"
+              fi
+            fi
+          } >> "${RUN}_grp${n}.txt"
+          if [[ -s "${COM_ATMOS_ANALYSIS_MEM}/${head}ratmi00${FHR}.nc" ]] ; then
+            echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}ratmi00${FHR}.nc" \
+              >> "${RUN}_restarta_grp${n}.txt"
+          fi
+        fi
+        {
+          echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}atmf00${FHR}.nc"
+          if (( FHR == 6 )); then
+            echo "${COM_ATMOS_ANALYSIS_MEM/${ROTDIR}\//}/${head}sfcf00${FHR}.nc"
+          fi
+        } >> "${RUN}_grp${n}.txt"
+      done # loop over FHR
+
+      if [[ ${lobsdiag_forenkf} == ".false." ]] ; then
+        {
+          echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}gsistat"
+          if [[ -s "${COM_ATMOS_RESTART_MEM}/${head}cnvstat" ]] ; then
+            echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}cnvstat"
+          fi
+        } >> "${RUN}_grp${n}.txt"
+
+        {
+          if [[ -s "${COM_ATMOS_RESTART_MEM}/${head}radstat" ]]; then
+            echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}radstat"
+          fi
+          if [[ -s "${COM_ATMOS_RESTART_MEM}/${head}cnvstat" ]]; then
+            echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}cnvstat"
+          fi
+          echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}abias"
+          echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}abias_air"
+          echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}abias_int"
+          echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/${head}abias_pc"
+        } >> "${RUN}_restarta_grp${n}.txt"
+      fi
+      #---
+      {
+        echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/*0000.sfcanl_data.tile1.nc"
+        echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/*0000.sfcanl_data.tile2.nc"
+        echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/*0000.sfcanl_data.tile3.nc"
+        echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/*0000.sfcanl_data.tile4.nc"
+        echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/*0000.sfcanl_data.tile5.nc"
+        echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}/*0000.sfcanl_data.tile6.nc"
+      } >> "${RUN}_restarta_grp${n}.txt"
+      #---
+      echo "${COM_ATMOS_RESTART_MEM/${ROTDIR}\//}" >> "${RUN}_restartb_grp${n}.txt"
+
+      m=$((m+1))
+    done
#........................... diff --git a/ush/inter_flux.sh b/ush/inter_flux.sh index f87b19b7ba6..b1f4475e05f 100755 --- a/ush/inter_flux.sh +++ b/ush/inter_flux.sh @@ -8,10 +8,10 @@ source "$HOMEgfs/ush/preamble.sh" "$FH" # into lat-lon grids. #----------------------------------------------------------------------- -export CNVGRIB=${CNVGRIB:-${NWPROD:-/nwprod}/util/exec/cnvgrib21} -export COPYGB2=${COPYGB2:-${NWPROD:-/nwprod}/util/exec/copygb2} -export WGRIB2=${WGRIB2:-${NWPROD:-/nwprod}/util/exec/wgrib2} -export GRBINDEX=${GRBINDEX:-${NWPROD:-nwprod}/util/exec/grbindex} +export CNVGRIB=${CNVGRIB:-${grib_util_ROOT}/bin/cnvgrib} +export COPYGB2=${COPYGB2:-${grib_util_ROOT}/bin/copygb} +export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2} +export GRBINDEX=${GRBINDEX:-${wgrib2_ROOT}/bin/grbindex} export RUN=${RUN:-"gfs"} export cycn=$(echo $CDATE |cut -c 9-10) export TCYC=${TCYC:-".t${cycn}z."} @@ -43,15 +43,13 @@ else fi #--------------------------------------------------------------- + ${WGRIB2} "${COM_ATMOS_MASTER}/${FLUXFL}" ${option1} ${option21} ${option22} ${option23} \ + ${option24} ${option25} ${option26} ${option27} ${option28} \ + -new_grid ${grid1p0} fluxfile_${fhr3}_1p00 + export err=$?; err_chk - $WGRIB2 $COMOUT/${FLUXFL} $option1 $option21 $option22 $option23 $option24 \ - $option25 $option26 $option27 $option28 \ - -new_grid $grid1p0 fluxfile_${fhr3}_1p00 - - - $WGRIB2 -s fluxfile_${fhr3}_1p00 > $COMOUT/${PREFIX}flux.1p00.f${fhr3}.idx - cp fluxfile_${fhr3}_1p00 $COMOUT/${PREFIX}flux.1p00.f${fhr3} - + ${WGRIB2} -s "fluxfile_${fhr3}_1p00" > "${COM_ATMOS_GRIB_1p00}/${PREFIX}flux.1p00.f${fhr3}.idx" + cp "fluxfile_${fhr3}_1p00" "${COM_ATMOS_GRIB_1p00}/${PREFIX}flux.1p00.f${fhr3}" #--------------------------------------------------------------- diff --git a/ush/jjob_header.sh b/ush/jjob_header.sh new file mode 100644 index 00000000000..45fa6402ae6 --- /dev/null +++ b/ush/jjob_header.sh @@ -0,0 +1,115 @@ +#! 
/usr/bin/env bash +# +# Universal header for global j-jobs +# +# Sets up and completes actions common to all j-jobs: +# - Creates and moves to $DATA after removing any +# existing one unless $WIPE_DATA is set to "NO" +# - Runs `setpdy.sh` +# - Sources configs provided as arguments +# - Sources machine environment script +# - Defines a few other variables +# +# The job name for the environment files should be passed +# in using the `-e` option (required). Any config files +# to be sourced should be passed in as an argument to +# the `-c` option. For example: +# ``` +# jjob_header.sh -e "fcst" -c "base fcst" +# ``` +# Will source `config.base` and `config.fcst`, then pass +# `fcst` to the ${machine}.env script. +# +# Script requires the following variables to already be +# defined in the environment: +# - $HOMEgfs +# - $DATAROOT (unless $DATA is overriden) +# - $jobid +# - $PDY +# - $cyc +# - $machine +# +# Additionally, there are a couple of optional settings that +# can be set before calling the script: +# - $EXPDIR : Override the default $EXPDIR +# [default: ${HOMEgfs}/parm/config] +# - $DATA : Override the default $DATA location +# [default: ${DATAROOT}/${jobid}] +# - $WIPE_DATA : Set whether to delete any existing $DATA +# [default: "YES"] +# - $pid : Override the default process id +# [default: $$] +# + +OPTIND=1 +while getopts "c:e:" option; do + case "${option}" in + c) read -ra configs <<< "${OPTARG}" ;; + e) env_job=${OPTARG} ;; + :) + echo "FATAL [${BASH_SOURCE[0]}]: ${option} requires an argument" + exit 1 + ;; + *) + echo "FATAL [${BASH_SOURCE[0]}]: Unrecognized option: ${option}" + exit 1 + ;; + esac +done +shift $((OPTIND-1)) + +if [[ -z ${env_job} ]]; then + echo "FATAL [${BASH_SOURCE[0]}]: Must specify a job name with -e" + exit 1 +fi + +############################################## +# make temp directory +############################################## +export DATA=${DATA:-"${DATAROOT}/${jobid}"} +if [[ ${WIPE_DATA:-YES} == "YES" ]]; then + rm -rf 
"${DATA}" +fi +mkdir -p "${DATA}" +cd "${DATA}" || ( echo "FATAL [${BASH_SOURCE[0]}]: ${DATA} does not exist"; exit 1 ) + + +############################################## +# Run setpdy and initialize PDY variables +############################################## +export cycle="t${cyc}z" +setpdy.sh +source ./PDY + + +############################################## +# Determine Job Output Name on System +############################################## +export pid="${pid:-$$}" +export pgmout="OUTPUT.${pid}" +export pgmerr=errfile + + +############################# +# Source relevant config files +############################# +export EXPDIR="${EXPDIR:-${HOMEgfs}/parm/config}" +for config in "${configs[@]:-''}"; do + source "${EXPDIR}/config.${config}" + status=$? + if (( status != 0 )); then + echo "FATAL [${BASH_SOURCE[0]}]: Unable to load config config.${config}" + exit "${status}" + fi +done + + +########################################## +# Source machine runtime environment +########################################## +source "${HOMEgfs}/env/${machine}.env" "${env_job}" +status=$? 
+if (( status != 0 )); then + echo "FATAL [${BASH_SOURCE[0]}]: Error while sourcing machine environment ${machine}.env for job ${env_job}" + exit "${status}" +fi diff --git a/ush/load_fv3gfs_modules.sh b/ush/load_fv3gfs_modules.sh index 4bd625b793d..2899e695144 100755 --- a/ush/load_fv3gfs_modules.sh +++ b/ush/load_fv3gfs_modules.sh @@ -10,12 +10,15 @@ fi ulimit_s=$( ulimit -S -s ) # Find module command and purge: -source "$HOMEgfs/modulefiles/module-setup.sh.inc" +source "${HOMEgfs}/modulefiles/module-setup.sh.inc" # Load our modules: -module use "$HOMEgfs/modulefiles" +module use "${HOMEgfs}/modulefiles" -if [[ -d /lfs4 ]] ; then +if [[ -d /lfs/f1 ]]; then + # We are on WCOSS2 (Cactus or Dogwood) + module load module_base.wcoss2 +elif [[ -d /mnt/lfs1 ]] ; then # We are on NOAA Jet module load module_base.jet elif [[ -d /scratch1 ]] ; then @@ -30,12 +33,17 @@ elif [[ -d /glade ]] ; then elif [[ -d /lustre && -d /ncrc ]] ; then # We are on GAEA. module load module_base.gaea +elif [[ -d /data/prod ]] ; then + # We are on SSEC S4 + module load module_base.s4 else echo WARNING: UNKNOWN PLATFORM fi +module list + # Restore stack soft limit: -ulimit -S -s "$ulimit_s" +ulimit -S -s "${ulimit_s}" unset ulimit_s -${TRACE_ON:-set -x} +set_trace diff --git a/ush/load_ufsda_modules.sh b/ush/load_ufsda_modules.sh new file mode 100755 index 00000000000..e3356432d7e --- /dev/null +++ b/ush/load_ufsda_modules.sh @@ -0,0 +1,90 @@ +#! /usr/bin/env bash + +############################################################### +if [[ "${DEBUG_WORKFLOW:-NO}" == "NO" ]]; then + echo "Loading modules quietly..." 
+ set +x +fi + +# Read optional module argument, default is to use GDAS +MODS="GDAS" +if [[ $# -gt 0 ]]; then + case "$1" in + --eva) + MODS="EVA" + ;; + --gdas) + MODS="GDAS" + ;; + *) + echo "Invalid option: $1" >&2 + exit 1 + ;; + esac +fi + +# Setup runtime environment by loading modules +ulimit_s=$( ulimit -S -s ) + +# Find module command and purge: +source "${HOMEgfs}/modulefiles/module-setup.sh.inc" + +# Load our modules: +module use "${HOMEgfs}/sorc/gdas.cd/modulefiles" + +if [[ -d /lfs/f1 ]]; then + # We are on WCOSS2 (Cactus or Dogwood) + echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM +elif [[ -d /lfs3 ]] ; then + # We are on NOAA Jet + echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM +elif [[ -d /scratch1 ]] ; then + # We are on NOAA Hera + module load "${MODS}/hera" + if [[ "${DEBUG_WORKFLOW:-NO}" == "YES" ]] ; then + module list + pip list + fi + # set NETCDF variable based on ncdump location + NETCDF=$( which ncdump ) + export NETCDF + # prod_util stuff, find a better solution later... + module use /scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/2022.1.2/ + module load prod_util +elif [[ -d /work ]] ; then + # We are on MSU Orion + # prod_util stuff, find a better solution later... 
+ #module use /apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/2022.1.2/ + #module load prod_util + export UTILROOT=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2 + export MDATE=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/mdate + export NDATE=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/ndate + export NHOUR=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/nhour + export FSYNC=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/fsync_file + module load "${MODS}/orion" + if [[ "${DEBUG_WORKFLOW:-NO}" == "YES" ]] ; then + module list + pip list + fi + # set NETCDF variable based on ncdump location + ncdump=$( which ncdump ) + NETCDF=$( echo "${ncdump}" | cut -d " " -f 3 ) + export NETCDF +elif [[ -d /glade ]] ; then + # We are on NCAR Yellowstone + echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM +elif [[ -d /lustre && -d /ncrc ]] ; then + # We are on GAEA. + echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM +elif [[ -d /data/prod ]] ; then + # We are on SSEC S4 + echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM +else + echo WARNING: UNKNOWN PLATFORM +fi + +# Restore stack soft limit: +ulimit -S -s "${ulimit_s}" +unset ulimit_s + +set_trace diff --git a/ush/merge_fv3_aerosol_tile.py b/ush/merge_fv3_aerosol_tile.py index 7538bc7a767..decf6e9cbab 100755 --- a/ush/merge_fv3_aerosol_tile.py +++ b/ush/merge_fv3_aerosol_tile.py @@ -155,8 +155,8 @@ def main() -> None: parser.add_argument('core_file', type=str, help="File containing the dycore sigma level coefficients") parser.add_argument('ctrl_file', type=str, help="File containing the sigma level coefficients for atmospheric IC data") parser.add_argument('rest_file', type=str, help="File containing the pressure level thickness for the restart state") - parser.add_argument('variable_file', type=str, help="File containing list of tracer variable_names in the chem_file to add to the atm_file, one tracer per line") - 
parser.add_argument('out_file', type=str, nargs="?", help="Name of file to create. If none is specified, the atm_file will be edited in place. New file will be a copy of atm_file with the specificed tracers listed in variable_file appended from chem_file and ntracers updated.") + parser.add_argument('variable_file', type=str, help="File with list of tracer variable_names in the chem_file to add to the atm_file, one tracer per line") + parser.add_argument('out_file', type=str, nargs="?", help="Name of file to create") args = parser.parse_args() diff --git a/ush/minmon_xtrct_costs.pl b/ush/minmon_xtrct_costs.pl new file mode 100755 index 00000000000..1b5d490102f --- /dev/null +++ b/ush/minmon_xtrct_costs.pl @@ -0,0 +1,231 @@ +#!/usr/bin/env perl + +#--------------------------------------------------------------------------- +# minmon_xtrct_costs.pl +# +# Extract cost data from gsistat file and load into cost +# and cost term files. +#--------------------------------------------------------------------------- + +use strict; +use warnings; + +#---------------------------------------------- +# subroutine to trim white space from strings +#---------------------------------------------- +sub trim { my $s = shift; $s =~ s/^\s+|\s+$//g; return $s }; + + +#--------------------------- +# +# Main routine begins here +# +#--------------------------- + +if ($#ARGV != 4 ) { + print "usage: minmon_xtrct_costs.pl SUFFIX PDY cyc infile jlogfile\n"; + exit; +} +my $suffix = $ARGV[0]; + +my $pdy = $ARGV[1]; +my $cyc = $ARGV[2]; +my $infile = $ARGV[3]; +my $jlogfile = $ARGV[4]; + +my $use_costterms = 0; +my $no_data = 0.00; + +my $scr = "minmon_xtrct_costs.pl"; +print "$scr has started\n"; + + +my $rc = 0; +my $cdate = sprintf '%s%s', $pdy, $cyc; + +if( (-e $infile) ) { + + my $found_cost = 0; + my $found_costterms = 0; + my @cost_array; + my @jb_array; + my @jo_array; + my @jc_array; + my @jl_array; + my @term_array; + my @all_cost_terms; + + my $cost_target; + my $cost_number; + my 
$costterms_target;
+ my $jb_number = 5;
+ my $jo_number = 6;
+ my $jc_number = 7;
+ my $jl_number = 8;
+
+ my $costfile = $ENV{"mm_costfile"};
+
+ if( (-e $costfile) ) {
+ open( COSTFILE, "<${costfile}" ) or die "Can't open ${costfile}: $!\n";
+ my $line;
+
+ while( $line = <COSTFILE> ) {
+ if( $line =~ /cost_target/ ) {
+ my @termsline = split( /:/, $line );
+ $cost_target = $termsline[1];
+ } elsif( $line =~ /cost_number/ ) {
+ my @termsline = split( /:/, $line );
+ $cost_number = $termsline[1];
+ } elsif( $line =~ /costterms_target/ ){
+ my @termsline = split( /:/, $line );
+ $costterms_target = $termsline[1];
+ }
+ }
+ close( COSTFILE );
+ } else {
+ $rc = 2;
+ }
+
+ #------------------------------------------------------------------------
+ # Open the infile and search for the $costterms_target and $cost_target
+ # strings. If found, parse out the cost information and push into
+ # holding arrays.
+ #------------------------------------------------------------------------
+ if( $rc == 0 ) {
+ open( INFILE, "<${infile}" ) or die "Can't open ${infile}: $!\n";
+
+ my $line;
+ my $term_ctr=0;
+
+ while( $line = <INFILE> ) {
+
+ if( $line =~ /$costterms_target/ ) {
+ my @termsline = split( / +/, $line );
+ push( @jb_array, $termsline[$jb_number] );
+ push( @jo_array, $termsline[$jo_number] );
+ push( @jc_array, $termsline[$jc_number] );
+ push( @jl_array, $termsline[$jl_number] );
+ $use_costterms = 1;
+ }
+
+ if( $line =~ /$cost_target/ ) {
+ my @costline = split( / +/, $line );
+ push( @cost_array, $costline[$cost_number] );
+ }
+
+ if( $term_ctr > 0 ) {
+ my @termline = split( / +/, $line );
+
+ if ( $term_ctr < 10 ) {
+ push( @term_array, trim($termline[1]) );
+ push( @term_array, trim($termline[2]) );
+ push( @term_array, trim($termline[3]) );
+ $term_ctr++;
+ } else {
+ push( @term_array, trim($termline[1]) );
+ push( @term_array, trim($termline[2]) );
+ $term_ctr = 0;
+ }
+
+ }elsif ( $line =~ "J=" && $line !~ "EJ=" ) {
+ my @termline = split( / +/, $line );
+ push(
@term_array, trim($termline[2]) ); + push( @term_array, trim($termline[3]) ); + push( @term_array, trim($termline[4]) ); + $term_ctr = 1; + } + } + + close( INFILE ); + + + #---------------------------------------------- + # move cost_array into all_costs by iteration + #---------------------------------------------- + my @all_costs; + for my $i (0 .. $#cost_array) { + my $iterline; + if( $use_costterms == 1 ){ + $iterline = sprintf ' %d,%e,%e,%e,%e,%e%s', + $i, $cost_array[$i], $jb_array[$i], $jo_array[$i], + $jc_array[$i], $jl_array[$i], "\n"; + } + else { + $iterline = sprintf ' %d,%e,%e,%e,%e,%e%s', + $i, $cost_array[$i], $no_data, $no_data, + $no_data, $no_data, "\n"; + } + + push( @all_costs, $iterline ); + } + + #--------------------------------------------------- + # move term_array into all_cost_terms by iteration + #--------------------------------------------------- + if( @term_array > 0 ) { + my $nterms = 32; + my $max_iter = ($#term_array+1)/$nterms; + my $niter = $max_iter -1; + + for my $iter (0 .. 
$niter ) { + my $step = $iter * $nterms; + my $iterline = sprintf '%d, %e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e,%e%s', + $iter, $term_array[$step], $term_array[$step+1], $term_array[$step+2], + $term_array[$step+3], $term_array[$step+4], $term_array[$step+5], + $term_array[$step+6], $term_array[$step+7], $term_array[$step+8], + $term_array[$step+9], $term_array[$step+10], $term_array[$step+11], + $term_array[$step+12], $term_array[$step+13], $term_array[$step+14], + $term_array[$step+15], $term_array[$step+16], $term_array[$step+17], + $term_array[$step+18], $term_array[$step+19], $term_array[$step+20], + $term_array[$step+21], $term_array[$step+22], $term_array[$step+23], + $term_array[$step+24], $term_array[$step+25], $term_array[$step+26], + $term_array[$step+27], $term_array[$step+28], $term_array[$step+29], + $term_array[$step+30], $term_array[$step+31], "\n"; + push( @all_cost_terms, $iterline ); + } + } + + #------------------------------------------ + # write all_costs array to costs.txt file + #------------------------------------------ + my $filename2 = "${cdate}.costs.txt"; + if( @all_costs > 0 ) { + open( OUTFILE, ">$filename2" ) or die "Can't open ${filename2}: $!\n"; + print OUTFILE @all_costs; + close( OUTFILE ); + } + + #----------------------------------------------------- + # write all_cost_terms array to costs_terms.txt file + #----------------------------------------------------- + my $filename3 = "${cdate}.cost_terms.txt"; + if( @all_cost_terms > 0 ) { + open( OUTFILE, ">$filename3" ) or die "Can't open ${filename3}: $!\n"; + print OUTFILE @all_cost_terms; + close( OUTFILE ); + } + + #-------------------------- + # move files to $M_TANKverf + #-------------------------- + my $tankdir = $ENV{"M_TANKverfM0"}; + if(! 
-d $tankdir) { + system( "mkdir -p $tankdir" ); + } + + if( -e $filename2 ) { + my $newfile2 = "${tankdir}/${filename2}"; + system("cp -f $filename2 $newfile2"); + } + if( -e $filename3 ) { + my $newfile3 = "${tankdir}/${filename3}"; + system("cp -f $filename3 $newfile3"); + } + + } # $rc still == 0 after reading gmon_cost.txt +} +else { # $infile does not exist + $rc = 1; +} + +print "$scr has ended, return code = $rc \n" diff --git a/ush/minmon_xtrct_gnorms.pl b/ush/minmon_xtrct_gnorms.pl new file mode 100755 index 00000000000..ecd44232da5 --- /dev/null +++ b/ush/minmon_xtrct_gnorms.pl @@ -0,0 +1,442 @@ +#!/usr/bin/env perl + +use strict; +use warnings; +use List::MoreUtils 'true'; +use List::MoreUtils 'first_index'; +use List::MoreUtils 'last_index'; + +#--------------------------------------------------------------------------- +# minmon_xtrct_gnorms.pl +# +# Update the gnorm_data.txt file with data from a new cycle. Add +# this new data to the last line of the gnorm_data.txt file. +# +# Note: If the gnorm_data.txt file does not exist, it will be created. +# +# The gnorm_data.txt file is used plotted directly by the javascript on +# the GSI stats page. 
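The `updateGnormData` subroutine below appends one comma-separated record per cycle to `gnorm_data.txt`. As a stand-alone sketch (not part of the patch), the record layout produced by its `sprintf ' %04d,%02d,%02d,%02d,%e,%e,%e,%e,%e'` format can be reproduced in shell; all values here are made up for illustration:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of one gnorm_data.txt record; the real fields come
# from parsing the gsistat file, these values are invented.
yr=2023; mon=05; day=01; hr=12
igrad=1.0; fgnorm=0.0002; avg=0.0003; min=0.0001; max=0.0005
# Same field layout as the Perl sprintf in updateGnormData
rec=$(printf ' %04d,%02d,%02d,%02d,%e,%e,%e,%e,%e' \
  "${yr}" "${mon}" "${day}" "${hr}" "${igrad}" "${fgnorm}" "${avg}" "${min}" "${max}")
echo "${rec}"
```

The date components lead each record so the 30-day trimming loop and the duplicate-cycle check can key on the `yr,mon,day,hr` prefix.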
+#---------------------------------------------------------------------------
+sub updateGnormData {
+ my $cycle = $_[0];
+ my $igrad = $_[1];
+ my $fgnorm = $_[2];
+ my $avg_gnorm = $_[3];
+ my $min_gnorm = $_[4];
+ my $max_gnorm = $_[5];
+ my $suffix = $_[6];
+
+ my $rc = 0;
+ my @filearray;
+
+ my $gdfile = "gnorm_data.txt";
+
+ my $outfile = "new_gnorm_data.txt";
+ my $yr = substr( $cycle, 0, 4);
+ my $mon = substr( $cycle, 4, 2);
+ my $day = substr( $cycle, 6, 2);
+ my $hr = substr( $cycle, 8, 2);
+
+ my $newln = sprintf ' %04d,%02d,%02d,%02d,%e,%e,%e,%e,%e%s',
+ $yr, $mon, $day, $hr, $igrad, $fgnorm,
+ $avg_gnorm, $min_gnorm, $max_gnorm, "\n";
+
+ #-------------------------------------------------------------
+ # attempt to locate the latest $gdfile and copy it locally
+ #
+ if( -e $gdfile ) {
+ open( INFILE, "<${gdfile}" ) or die "Can't open ${gdfile}: $!\n";
+
+ @filearray = <INFILE>;
+
+# This is the mechanism that limits the data to 30 days worth. Should I
+# keep it or let the transfer script(s) truncate? 6/12/16 -- I'm going to keep
+# it. I can add this as a later change once I add a user mechanism to vary the
+# amount of data plotted (on the fly).
+
+ my $cyc_interval = $ENV{'CYCLE_INTERVAL'};
+ if( $cyc_interval eq "" ) {
+ $cyc_interval = 6;
+ }
+
+ my $max_cyc = 119; # default 30 days worth of data = 120 cycles
+ # If CYCLE_INTERVAL is other than "" or 6
+ # then set the $max_cyc using that interval
+ if( $cyc_interval != 6 && $cyc_interval != 0 ) {
+ my $cyc_per_day = 24 / $cyc_interval;
+ $max_cyc = (30 * $cyc_per_day) - 1;
+ }
+
+ while( $#filearray > $max_cyc ) {
+ shift( @filearray );
+ }
+ close( INFILE );
+ }
+
+ # Here is the problem Russ encountered after re-running the MinMon:
+ # If the cycle time in $newln is the same as an existing record in
+ # *.gnorm_data.txt then we end up with 2+ rows for the same cycle time.
+ # In that case $newln should replace the first existing line + # in @filearray and all other lines that might match should be deleted. + # Else when the cycle time doesn't already exist (the expected condition) + # it should be pushed into @filearray. + + # algorithm: + # ========= + # Establish $count of matches on "$yr,$mon,$day,$hr" + # if $count > 0 + # while $count > 1 + # get last_index and remove with splice + # replace first_index with $newln + # else + # push $newln + # + my $srch_strng = "$yr,$mon,$day,$hr"; + my $count = true { /$srch_strng/ } @filearray; + + if( $count > 0 ) { + while( $count > 1 ) { + my $l_index = last_index { /$srch_strng/ } @filearray; + splice @filearray, $l_index, 1; + $count = true { /$srch_strng/ } @filearray; + } + my $f_index = first_index { /$srch_strng/ } @filearray; + splice @filearray, $f_index, 1, $newln; + } + else { + push( @filearray, $newln ); + } + + open( OUTFILE, ">$outfile" ) or die "Can't open ${$outfile}: $!\n"; + print OUTFILE @filearray; + close( OUTFILE ); + + system("cp -f $outfile $gdfile"); + +} + +#--------------------------------------------------------------------------- +# makeErrMsg +# +# Apply a gross check on the final value of the gnorm for a specific +# cycle. If the final_gnorm value is greater than the gross_check value +# then put that in the error message file. Also check for resets or a +# premature halt, and journal those events to the error message file too. 
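The duplicate-cycle rule spelled out in the comments above (replace the first matching record, drop any further matches, otherwise append) can be sketched stand-alone in shell with `awk`; the file contents and the `newln` record here are invented for illustration:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the gnorm_data.txt de-duplication rule.
key='2023,05,01,12'
newln=' 2023,05,01,12,1.000000e+00,2.000000e-04,3.000000e-04,1.000000e-04,5.000000e-04'
printf '%s\n' \
  ' 2023,05,01,06,oldrecord' \
  ' 2023,05,01,12,duplicate1' \
  ' 2023,05,01,12,duplicate2' > gnorm_data.txt
# First record matching the cycle key is replaced by newln;
# every later match is dropped; everything else passes through.
awk -v key="${key}" -v newln="${newln}" '
  index($0, key) { if (!done) { print newln; done = 1 }; next }
  { print }
' gnorm_data.txt > new_gnorm_data.txt
cat new_gnorm_data.txt
```

The Perl implementation achieves the same result with `first_index`/`last_index` and `splice`, which is why `List::MoreUtils` is imported at the top of the script.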
+# +# Note to self: reset_iter array is passed by reference +#--------------------------------------------------------------------------- +sub makeErrMsg { + my $suffix = $_[0]; + my $cycle = $_[1]; + my $final_gnorm = $_[2]; + my $stop_flag = $_[3]; + my $stop_iter = $_[4]; + my $reset_flag = $_[5]; + my $reset_iter = $_[6]; #reset iteration array + my $infile = $_[7]; + my $gross_check = $_[8]; + + my $mail_msg =""; + my $out_file = "${cycle}.errmsg.txt"; + + + if( $stop_flag > 0 ) { + my $stop_msg = " Gnorm check detected premature iteration stop: suffix = $suffix, cycle = $cycle, iteration = $stop_iter"; + $mail_msg .= $stop_msg; + } + + if( $reset_flag > 0 ) { + my $ctr=0; + my $reset_msg = "\n Gnorm check detected $reset_flag reset(s): suffix = $suffix, cycle = $cycle"; + $mail_msg .= $reset_msg; + $mail_msg .= "\n"; + $mail_msg .= " Reset(s) detected in iteration(s): @{$reset_iter}[$ctr] \n"; + + my $arr_size = @{$reset_iter}; + for( $ctr=1; $ctr < $arr_size; $ctr++ ) { + $mail_msg .= " @{$reset_iter}[$ctr]\n"; + } + } + + if( $final_gnorm >= $gross_check ){ + my $gnorm_msg = " Final gnorm gross check failure: suffix = $suffix, cycle = $cycle, final gnorm = $final_gnorm "; + + $mail_msg .= $gnorm_msg; + } + + if( length $mail_msg > 0 ){ + my $file_msg = " File source for report is: $infile"; + $mail_msg .= $file_msg; + } + + if( length $mail_msg > 0 ){ + my $mail_link = "http://www.emc.ncep.noaa.gov/gmb/gdas/gsi_stat/index.html?src=$suffix&typ=gnorm&cyc=$cycle"; + open( OUTFILE, ">$out_file" ) or die "Can't open ${$out_file}: $!\n"; + print OUTFILE $mail_msg; + print OUTFILE "\n\n $mail_link"; + close( OUTFILE ); + } +} + + +#--------------------------------------------------------------------------- +# +# Main routine begins here +# +#--------------------------------------------------------------------------- + +if ($#ARGV != 4 ) { + print "usage: minmon_xtrct_gnorms.pl SUFFIX pdy cyc infile jlogfile\n"; + exit; +} + + +my $suffix = $ARGV[0]; +my $pdy = 
$ARGV[1];
+my $cyc = $ARGV[2];
+my $infile = $ARGV[3];
+my $jlogfile = $ARGV[4];
+
+
+my $scr = "minmon_xtrct_gnorms.pl";
+print "$scr Has Started\n";
+
+#
+# This needs to be redesigned to get the gnorm value from the gsistat file
+# using the line that starts "cost,grad,step,b,step?:". The line formerly
+# used for the gnorm and reduction values may not be available if the the
+# verbose output flag is set to FALSE.
+#
+# So, using the grad value on that line:
+# gnorm[i] = (grad[i]**)/(grad[0]**)
+# reduct[i] = sqrt(gnorm)
+
+my $igrad_target;
+my $igrad_number;
+my $expected_gnorms;
+my $gross_check_val;
+
+my $rc = 0;
+my $cdate = sprintf '%s%s', $pdy, $cyc;
+
+my $gnormfile = $ENV{"mm_gnormfile"};
+
+
+if( (-e $gnormfile) ) {
+ open( GNORMFILE, "<${gnormfile}" ) or die "Can't open ${gnormfile}: $!\n";
+ my $line;
+
+ while( $line = <GNORMFILE> ) {
+ if( $line =~ /igrad_target/ ) {
+ my @termsline = split( /:/, $line );
+ $igrad_target = $termsline[1];
+ } elsif( $line =~ /igrad_number/ ) {
+ my @termsline = split( /:/, $line );
+ $igrad_number = $termsline[1];
+ } elsif( $line =~ /expected_gnorms/ ){
+ my @termsline = split( /:/, $line );
+ $expected_gnorms = $termsline[1];
+ } elsif( $line =~ /gross_check_val/ ){
+ my @termsline = split( /:/, $line );
+ $gross_check_val = $termsline[1];
+ }
+ }
+ close( GNORMFILE );
+} else {
+ $rc = 4;
+}
+
+if( $rc == 0 ) {
+ if( (-e $infile) ) {
+ open( INFILE, "<${infile}" ) or die "Can't open ${infile}: $!\n";
+
+ my $found_igrad = 0;
+ my $final_gnorm = 0.0;
+ my $igrad = 0.0;
+ my $header = 4;
+ my $header2 = 0;
+ my @gnorm_array;
+ my @last_10_gnorm;
+
+ my $reset_flag = 0;
+ my $stop_flag = 0;
+ my $warn_str = "WARNING";
+ my $stop_str = "Stopping";
+ my $stop_iter = "";
+ my $reset_str = "Reset";
+ my @reset_iter; # reset iteration array
+
+ my $stop_iter_flag = 0;
+ my $reset_iter_flag = 0;
+ my $line;
+ while( $line = <INFILE> ) {
+
+ ##############################################
+ # if the reset_iter_flag is 1 then record the
+ #
current outer & inner iteration number
+ ##############################################
+ if( $reset_iter_flag == 1 ) {
+ if( $line =~ /${igrad_target}/ ) {
+ my @iterline = split( / +/, $line );
+ my $iter_str = $iterline[2] . "," . $iterline[3];
+ push( @reset_iter, $iter_str);
+ $reset_iter_flag = 0;
+ }
+ }
+
+
+ if( $line =~ /${igrad_target}/ ) {
+ my @gradline = split( / +/, $line );
+
+ my $grad = $gradline[$igrad_number];
+
+ if( $found_igrad == 0 ){
+ $igrad = $grad;
+ $found_igrad = 1;
+ }
+
+ my $igrad_sqr = $igrad**2;
+ my $grad_sqr = $grad**2;
+ my $gnorm = $grad_sqr/$igrad_sqr;
+
+ push( @gnorm_array, $gnorm );
+ }
+
+
+ if( $line =~ /${warn_str}/ ) {
+ if( $line =~ /${stop_str}/ ) {
+ $stop_flag++;
+ $stop_iter_flag=1;
+ }
+ elsif( $line =~ /${reset_str}/ ){
+ $reset_flag++;
+ $reset_iter_flag = 1;
+ }
+ }
+
+ }
+ close( INFILE );
+
+ ########################################################################
+ # If the stop_flag is >0 then record the last outer & inner
+ # iteration number. The trick is that it's the last iteration in the
+ # log file and we just passed it when we hit the stop warning message,
+ # so we have to reopen the file and get the last iteration number.
+ ########################################################################
+ if( $stop_flag > 0 ) {
+ open( INFILE, "<${infile}" ) or die "Can't open ${infile}: $!\n";
+
+ my @lines = reverse <INFILE>;
+ foreach $line (@lines) {
+ if( $line =~ /${igrad_target}/ ){
+ my @iterline = split( / +/, $line );
+ $stop_iter = $iterline[2] . "," . $iterline[3];
+ last;
+ }
+ }
+ close( INFILE );
+ }
+
+
+ my @all_gnorm = @gnorm_array;
+
+ ##############################################################################
+ ##
+ ## If the iterations were halted due to error then the @all_gnorm array won't
+ ## be the expected size. In that case we need to pad the array out with
+ ## RMISS values so GrADS won't choke when it tries to read the data file.
+ ##
+ ## Note that we're padding @all_gnorm.
The @gnorm_array is examined below + ## and we don't want to pad that and mess up the min/max calculation. + ## + ############################################################################### + my $arr_size = @all_gnorm; + + if( $arr_size < $expected_gnorms ) { + for( my $ctr = $arr_size; $ctr < $expected_gnorms; $ctr++ ) { + push( @all_gnorm, -999.0 ); + } + } + + my $sum_10_gnorm = 0.0; + my $min_gnorm = 9999999.0; + my $max_gnorm = -9999999.0; + my $avg_gnorm = 0.0; + + for( my $ctr = 9; $ctr >= 0; $ctr-- ) { + my $new_gnorm = pop( @gnorm_array ); + $sum_10_gnorm = $sum_10_gnorm + $new_gnorm; + if( $new_gnorm > $max_gnorm ) { + $max_gnorm = $new_gnorm; + } + if( $new_gnorm < $min_gnorm ) { + $min_gnorm = $new_gnorm; + } + if( $ctr == 9 ) { + $final_gnorm = $new_gnorm; + } + } + + $avg_gnorm = $sum_10_gnorm / 10; + + + ##################################################################### + # Update the gnorm_data.txt file with information on the + # initial gradient, final gnorm, and avg/min/max for the last 10 + # iterations. + ##################################################################### + updateGnormData( $cdate,$igrad,$final_gnorm,$avg_gnorm,$min_gnorm,$max_gnorm,$suffix ); + + + ##################################################################### + # Call makeErrMsg to build the error message file to record any + # abnormalities in the minimization. This file can be mailed by + # a calling script. + ##################################################################### + makeErrMsg( $suffix, $cdate, $final_gnorm, $stop_flag, $stop_iter, $reset_flag, \@reset_iter, $infile, $gross_check_val ); + + + ######################################################### + # write to GrADS ready output data file + # + # Note: this uses pack to achieve the same results as + # an unformatted binary Fortran file. 
+ ######################################################### + my $filename2 = "${cdate}.gnorms.ieee_d"; + + open( OUTFILE, ">$filename2" ) or die "Can't open ${filename2}: $!\n"; + binmode OUTFILE; + + print OUTFILE pack( 'f*', @all_gnorm); + + close( OUTFILE ); + + #-------------------------- + # move files to $M_TANKverf + #-------------------------- + my $tankdir = $ENV{"M_TANKverfM0"}; + if(! -d $tankdir) { + system( "mkdir -p $tankdir" ); + } + + if( -e $filename2 ) { + system("cp -f $filename2 ${tankdir}/."); + } + + my $gdfile = "gnorm_data.txt"; + if( -e $gdfile ) { + system("cp -f $gdfile ${tankdir}/."); + } + + my $errmsg = "${cdate}.errmsg.txt"; + if( -e $errmsg ) { + system("cp -f $errmsg ${tankdir}/."); + } + + } # $rc still == 0 after reading gmon_gnorm.txt + +}else { # $infile does not exist + $rc = 3; +} + +print "$scr has ended, return code = $rc \n" diff --git a/ush/minmon_xtrct_reduct.pl b/ush/minmon_xtrct_reduct.pl new file mode 100755 index 00000000000..f6037d3f324 --- /dev/null +++ b/ush/minmon_xtrct_reduct.pl @@ -0,0 +1,89 @@ +#!/usr/bin/env perl + +use strict; + +#--------------------------------------------------------------------------- +# minmon_xtrct_reduct.pl +# +# Extract the reduction stats for a GSI minimization run and store in +# reduction.ieee_d files ready for GrADS use. 
+#--------------------------------------------------------------------------- + +if ($#ARGV != 4 ) { + print "usage: minmon_xtrct_reduct.pl SUFFIX pdy cyc infile jlogfile\n"; + print " suffix is data source identifier\n"; + print " pdy is YYYYMMDD of the cycle to be processed\n"; + print " cyc is HH of the cycle to be processed\n"; + print " infile is the data file containing the reduction stats\n"; + print " jlogfile is the job log file\n"; + exit; +} +my $suffix = $ARGV[0]; +my $pdy = $ARGV[1]; +my $cyc = $ARGV[2]; +my $infile = $ARGV[3]; +my $jlogfile = $ARGV[4]; + +my $scr = "minmon_xtrct_reduct.pl"; +print "$scr has started\n"; + +my $rc = 0; +my $cdate = sprintf '%s%s', $pdy, $cyc; +my $initial_gradient = -999.0; +my $iter_gradient; + +if( (-e $infile) ) { + + my $reduct_target = "cost,grad,step,b,step?"; + my $gradient_num = 5; + my $reduct; + + open( INFILE, "<${infile}" ) or die "Can't open ${infile}: $!\n"; + + my @reduct_array; + + while( my $line = <INFILE> ) { + if( $line =~ /$reduct_target/ ) { + my @reduct_ln = split( / +/, $line ); + $iter_gradient = $reduct_ln[$gradient_num]; + if( $initial_gradient == -999.0 ){ + $initial_gradient = $iter_gradient; + } + + $reduct = $iter_gradient / $initial_gradient; + + push( @reduct_array, $reduct ); + } + } + + close( INFILE ); + + + ################################# + # write reduct_array to outfile + ################################# + my $outfile = "${cdate}.reduction.ieee_d"; + open( OUTFILE, ">$outfile" ) or die "Can't open ${outfile}: $!\n"; + binmode OUTFILE; + + print OUTFILE pack( 'f*', @reduct_array); + close( OUTFILE ); + + #---------------------------- + # copy outfile to $M_TANKverf + #---------------------------- + my $tankdir = $ENV{"M_TANKverfM0"}; + if(!
-d $tankdir) { + system( "mkdir -p $tankdir" ); + } + + if( -e $outfile ) { + my $newfile = "${tankdir}/${outfile}"; + system("cp -f $outfile $newfile"); + } + +} else { # $infile does not exist + $rc = 5; +} + +print "$scr has ended, return code = $rc \n" diff --git a/ush/mod_icec.sh b/ush/mod_icec.sh index f62131846e3..96ccab90757 100755 --- a/ush/mod_icec.sh +++ b/ush/mod_icec.sh @@ -7,7 +7,7 @@ source "$HOMEgfs/ush/preamble.sh" f=$1 -export WGRIB2=${WGRIB2:-${NWPROD:-/nwprod}/util/exec/wgrib2} +export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2} $WGRIB2 ${optncpu:-} $f \ -if 'LAND' -rpn 'sto_1' -fi \ diff --git a/ush/module-setup.sh b/ush/module-setup.sh new file mode 100755 index 00000000000..9c27ab4f7ca --- /dev/null +++ b/ush/module-setup.sh @@ -0,0 +1,107 @@ +#!/bin/bash +set -u + +if [[ ${MACHINE_ID} = jet* ]] ; then + # We are on NOAA Jet + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /apps/lmod/lmod/init/bash + fi + export LMOD_SYSTEM_DEFAULT_MODULES=contrib + module reset + +elif [[ ${MACHINE_ID} = hera* ]] ; then + # We are on NOAA Hera + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /apps/lmod/lmod/init/bash + fi + export LMOD_SYSTEM_DEFAULT_MODULES=contrib + module reset + +elif [[ ${MACHINE_ID} = orion* ]] ; then + # We are on Orion + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /apps/lmod/init/bash + fi + export LMOD_SYSTEM_DEFAULT_MODULES=contrib + module reset + +elif [[ ${MACHINE_ID} = s4* ]] ; then + # We are on SSEC Wisconsin S4 + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /usr/share/lmod/lmod/init/bash + fi + export LMOD_SYSTEM_DEFAULT_MODULES=license_intel + module reset + +elif [[ ${MACHINE_ID} = wcoss2 ]]; then + # We are on WCOSS2 + module reset + +elif [[ ${MACHINE_ID} = cheyenne* ]] ; then + # We are on NCAR Cheyenne + if ( ! 
eval module help > /dev/null 2>&1 ) ; then + source /glade/u/apps/ch/modulefiles/default/localinit/localinit.sh + fi + module purge + +elif [[ ${MACHINE_ID} = stampede* ]] ; then + # We are on TACC Stampede + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /opt/apps/lmod/lmod/init/bash + fi + module purge + +elif [[ ${MACHINE_ID} = gaea* ]] ; then + # We are on GAEA. + if ( ! eval module help > /dev/null 2>&1 ) ; then + # We cannot simply load the module command. The GAEA + # /etc/profile modifies a number of module-related variables + # before loading the module command. Without those variables, + # the module command fails. Hence we actually have to source + # /etc/profile here. + source /etc/profile + __ms_source_etc_profile=yes + else + __ms_source_etc_profile=no + fi + module purge + # clean up after purge + unset _LMFILES_ + unset _LMFILES_000 + unset _LMFILES_001 + unset LOADEDMODULES + module load modules + if [[ -d /opt/cray/ari/modulefiles ]] ; then + module use -a /opt/cray/ari/modulefiles + fi + if [[ -d /opt/cray/pe/ari/modulefiles ]] ; then + module use -a /opt/cray/pe/ari/modulefiles + fi + if [[ -d /opt/cray/pe/craype/default/modulefiles ]] ; then + module use -a /opt/cray/pe/craype/default/modulefiles + fi + if [[ -s /etc/opt/cray/pe/admin-pe/site-config ]] ; then + source /etc/opt/cray/pe/admin-pe/site-config + fi + if [[ "${__ms_source_etc_profile}" == yes ]] ; then + source /etc/profile + unset __ms_source_etc_profile + fi + +elif [[ ${MACHINE_ID} = expanse* ]]; then + # We are on SDSC Expanse + if ( ! eval module help > /dev/null 2>&1 ) ; then + source /etc/profile.d/modules.sh + fi + module purge + module load slurm/expanse/20.02.3 + +elif [[ ${MACHINE_ID} = discover* ]]; then + # We are on NCCS discover + export SPACK_ROOT=/discover/nobackup/mapotts1/spack + export PATH=${PATH}:${SPACK_ROOT}/bin + . 
"${SPACK_ROOT}"/share/spack/setup-env.sh + +else + echo WARNING: UNKNOWN PLATFORM 1>&2 +fi diff --git a/ush/nems.configure.atm.IN b/ush/nems.configure.atm.IN index 2179b593124..c74fe381289 100644 --- a/ush/nems.configure.atm.IN +++ b/ush/nems.configure.atm.IN @@ -1,8 +1,12 @@ # ESMF # -logKindFlag: @[esmf_logkind] +logKindFlag: @[esmf_logkind] +globalResourceControl: true EARTH_component_list: ATM -ATM_model: fv3 +ATM_model: @[atm_model] +ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] + runSeq:: ATM :: diff --git a/ush/nems.configure.atm_aero.IN b/ush/nems.configure.atm_aero.IN index b3fc775034d..dcce57b0489 100644 --- a/ush/nems.configure.atm_aero.IN +++ b/ush/nems.configure.atm_aero.IN @@ -3,7 +3,8 @@ ############################################# # ESMF # - logKindFlag: @[esmf_logkind] +logKindFlag: @[esmf_logkind] +globalResourceControl: true # EARTH # EARTH_component_list: ATM CHM @@ -14,6 +15,7 @@ EARTH_attributes:: # ATM # ATM_model: @[atm_model] ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] ATM_attributes:: Verbosity = max :: @@ -21,6 +23,7 @@ ATM_attributes:: # CHM # CHM_model: @[chm_model] CHM_petlist_bounds: @[chm_petlist_bounds] +CHM_omp_num_threads: @[chm_omp_num_threads] CHM_attributes:: Verbosity = max :: diff --git a/ush/nems.configure.blocked_atm_wav.IN b/ush/nems.configure.blocked_atm_wav.IN index 159cf6a4589..9aeaefa875f 100644 --- a/ush/nems.configure.blocked_atm_wav.IN +++ b/ush/nems.configure.blocked_atm_wav.IN @@ -3,7 +3,8 @@ ############################################# # ESMF # - logKindFlag: @[esmf_logkind] +logKindFlag: @[esmf_logkind] +globalResourceControl: true # EARTH # EARTH_component_list: ATM WAV @@ -14,6 +15,7 @@ EARTH_attributes:: # ATM # ATM_model: @[atm_model] ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] ATM_attributes:: Verbosity = max DumpFields = true @@ -22,6 +24,7 @@ ATM_attributes:: # WAV # WAV_model: 
@[wav_model] WAV_petlist_bounds: @[wav_petlist_bounds] +WAV_omp_num_threads: @[wav_omp_num_threads] WAV_attributes:: Verbosity = max :: @@ -31,7 +34,7 @@ WAV_attributes:: # Run Sequence # runSeq:: @@[coupling_interval_sec] - ATM -> WAV + ATM -> WAV ATM WAV @ diff --git a/ush/nems.configure.cpld.IN b/ush/nems.configure.cpld.IN index dda18352051..abc9091c4e1 100644 --- a/ush/nems.configure.cpld.IN +++ b/ush/nems.configure.cpld.IN @@ -3,7 +3,8 @@ ############################################# # ESMF # -logKindFlag: @[esmf_logkind] +logKindFlag: @[esmf_logkind] +globalResourceControl: true # EARTH # EARTH_component_list: MED ATM OCN ICE @@ -14,11 +15,13 @@ EARTH_attributes:: # MED # MED_model: @[med_model] MED_petlist_bounds: @[med_petlist_bounds] +MED_omp_num_threads: @[med_omp_num_threads] :: # ATM # ATM_model: @[atm_model] ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] ATM_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -29,6 +32,7 @@ ATM_attributes:: # OCN # OCN_model: @[ocn_model] OCN_petlist_bounds: @[ocn_petlist_bounds] +OCN_omp_num_threads: @[ocn_omp_num_threads] OCN_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -40,6 +44,7 @@ OCN_attributes:: # ICE # ICE_model: @[ice_model] ICE_petlist_bounds: @[ice_petlist_bounds] +ICE_omp_num_threads: @[ice_omp_num_threads] ICE_attributes:: Verbosity = 0 DumpFields = @[DumpFields] diff --git a/ush/nems.configure.cpld_aero_outerwave.IN b/ush/nems.configure.cpld_aero_outerwave.IN new file mode 100644 index 00000000000..3b25faa268e --- /dev/null +++ b/ush/nems.configure.cpld_aero_outerwave.IN @@ -0,0 +1,148 @@ +############################################# +#### NEMS Run-Time Configuration File ##### +############################################# + +# ESMF # +logKindFlag: @[esmf_logkind] +globalResourceControl: true + +# EARTH # +EARTH_component_list: MED ATM CHM OCN ICE WAV +EARTH_attributes:: + Verbosity = 0 +:: + +# MED # +MED_model: @[med_model] +MED_petlist_bounds: 
@[med_petlist_bounds] +MED_omp_num_threads: @[med_omp_num_threads] +:: + +# ATM # +ATM_model: @[atm_model] +ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] +ATM_attributes:: + Verbosity = 0 + DumpFields = @[DumpFields] + ProfileMemory = false + OverwriteSlice = true +:: + +# CHM # +CHM_model: @[chm_model] +CHM_petlist_bounds: @[chm_petlist_bounds] +CHM_omp_num_threads: @[chm_omp_num_threads] +CHM_attributes:: + Verbosity = 0 +:: + +# OCN # +OCN_model: @[ocn_model] +OCN_petlist_bounds: @[ocn_petlist_bounds] +OCN_omp_num_threads: @[ocn_omp_num_threads] +OCN_attributes:: + Verbosity = 0 + DumpFields = @[DumpFields] + ProfileMemory = false + OverwriteSlice = true + mesh_ocn = @[MESH_OCN_ICE] +:: + +# ICE # +ICE_model: @[ice_model] +ICE_petlist_bounds: @[ice_petlist_bounds] +ICE_omp_num_threads: @[ice_omp_num_threads] +ICE_attributes:: + Verbosity = 0 + DumpFields = @[DumpFields] + ProfileMemory = false + OverwriteSlice = true + mesh_ice = @[MESH_OCN_ICE] + stop_n = @[RESTART_N] + stop_option = nhours + stop_ymd = -999 +:: + +# WAV # +WAV_model: @[wav_model] +WAV_petlist_bounds: @[wav_petlist_bounds] +WAV_omp_num_threads: @[wav_omp_num_threads] +WAV_attributes:: + Verbosity = 0 + OverwriteSlice = false + diro = "." 
+ logfile = wav.log + mesh_wav = @[MESH_WAV] + multigrid = @[MULTIGRID] +:: + +# CMEPS warm run sequence +runSeq:: +@@[coupling_interval_slow_sec] + MED med_phases_prep_wav_avg + MED med_phases_prep_ocn_avg + MED -> WAV :remapMethod=redist + MED -> OCN :remapMethod=redist + WAV + OCN + @@[coupling_interval_fast_sec] + MED med_phases_prep_atm + MED med_phases_prep_ice + MED -> ATM :remapMethod=redist + MED -> ICE :remapMethod=redist + ATM phase1 + ATM -> CHM + CHM + CHM -> ATM + ATM phase2 + ICE + ATM -> MED :remapMethod=redist + MED med_phases_post_atm + ICE -> MED :remapMethod=redist + MED med_phases_post_ice + MED med_phases_prep_ocn_accum + MED med_phases_prep_wav_accum + @ + OCN -> MED :remapMethod=redist + WAV -> MED :remapMethod=redist + MED med_phases_post_ocn + MED med_phases_post_wav + MED med_phases_restart_write +@ +:: + +# CMEPS variables + +DRIVER_attributes:: +:: +MED_attributes:: + ATM_model = @[atm_model] + ICE_model = @[ice_model] + OCN_model = @[ocn_model] + WAV_model = @[wav_model] + history_n = 0 + history_option = nhours + history_ymd = -999 + coupling_mode = @[CPLMODE] + history_tile_atm = @[ATMTILESIZE] +:: +ALLCOMP_attributes:: + ScalarFieldCount = 2 + ScalarFieldIdxGridNX = 1 + ScalarFieldIdxGridNY = 2 + ScalarFieldName = cpl_scalars + start_type = @[RUNTYPE] + restart_dir = RESTART/ + case_name = ufs.cpld + restart_n = @[RESTART_N] + restart_option = nhours + restart_ymd = -999 + dbug_flag = @[cap_dbug_flag] + use_coldstart = @[use_coldstart] + use_mommesh = @[use_mommesh] + eps_imesh = @[eps_imesh] + stop_n = @[FHMAX] + stop_option = nhours + stop_ymd = -999 +:: diff --git a/ush/nems.configure.cpld_aero_wave.IN b/ush/nems.configure.cpld_aero_wave.IN index 9e67af9ba45..6b886b0626d 100644 --- a/ush/nems.configure.cpld_aero_wave.IN +++ b/ush/nems.configure.cpld_aero_wave.IN @@ -3,7 +3,8 @@ ############################################# # ESMF # - logKindFlag: @[esmf_logkind] +logKindFlag: @[esmf_logkind] +globalResourceControl: true # EARTH # 
EARTH_component_list: MED ATM CHM OCN ICE WAV @@ -14,11 +15,13 @@ EARTH_attributes:: # MED # MED_model: @[med_model] MED_petlist_bounds: @[med_petlist_bounds] +MED_omp_num_threads: @[med_omp_num_threads] :: # ATM # ATM_model: @[atm_model] ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] ATM_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -29,6 +32,7 @@ ATM_attributes:: # CHM # CHM_model: @[chm_model] CHM_petlist_bounds: @[chm_petlist_bounds] +CHM_omp_num_threads: @[chm_omp_num_threads] CHM_attributes:: Verbosity = 0 :: @@ -36,6 +40,7 @@ CHM_attributes:: # OCN # OCN_model: @[ocn_model] OCN_petlist_bounds: @[ocn_petlist_bounds] +OCN_omp_num_threads: @[ocn_omp_num_threads] OCN_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -47,6 +52,7 @@ OCN_attributes:: # ICE # ICE_model: @[ice_model] ICE_petlist_bounds: @[ice_petlist_bounds] +ICE_omp_num_threads: @[ice_omp_num_threads] ICE_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -61,6 +67,7 @@ ICE_attributes:: # WAV # WAV_model: @[wav_model] WAV_petlist_bounds: @[wav_petlist_bounds] +WAV_omp_num_threads: @[wav_omp_num_threads] WAV_attributes:: Verbosity = 0 OverwriteSlice = false diff --git a/ush/nems.configure.cpld_outerwave.IN b/ush/nems.configure.cpld_outerwave.IN new file mode 100644 index 00000000000..ec30d132a74 --- /dev/null +++ b/ush/nems.configure.cpld_outerwave.IN @@ -0,0 +1,136 @@ +############################################# +#### NEMS Run-Time Configuration File ##### +############################################# + +# ESMF # +logKindFlag: @[esmf_logkind] +globalResourceControl: true + +# EARTH # +EARTH_component_list: MED ATM OCN ICE WAV +EARTH_attributes:: + Verbosity = 0 +:: + +# MED # +MED_model: @[med_model] +MED_petlist_bounds: @[med_petlist_bounds] +MED_omp_num_threads: @[med_omp_num_threads] +:: + +# ATM # +ATM_model: @[atm_model] +ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] +ATM_attributes:: + 
Verbosity = 0 + DumpFields = @[DumpFields] + ProfileMemory = false + OverwriteSlice = true +:: + +# OCN # +OCN_model: @[ocn_model] +OCN_petlist_bounds: @[ocn_petlist_bounds] +OCN_omp_num_threads: @[ocn_omp_num_threads] +OCN_attributes:: + Verbosity = 0 + DumpFields = @[DumpFields] + ProfileMemory = false + OverwriteSlice = true + mesh_ocn = @[MESH_OCN_ICE] +:: + +# ICE # +ICE_model: @[ice_model] +ICE_petlist_bounds: @[ice_petlist_bounds] +ICE_omp_num_threads: @[ice_omp_num_threads] +ICE_attributes:: + Verbosity = 0 + DumpFields = @[DumpFields] + ProfileMemory = false + OverwriteSlice = true + mesh_ice = @[MESH_OCN_ICE] + stop_n = @[RESTART_N] + stop_option = nhours + stop_ymd = -999 +:: + +# WAV # +WAV_model: @[wav_model] +WAV_petlist_bounds: @[wav_petlist_bounds] +WAV_omp_num_threads: @[wav_omp_num_threads] +WAV_attributes:: + Verbosity = 0 + OverwriteSlice = false + diro = "." + logfile = wav.log + mesh_wav = @[MESH_WAV] + multigrid = @[MULTIGRID] +:: + +# CMEPS warm run sequence +runSeq:: +@@[coupling_interval_slow_sec] + MED med_phases_prep_wav_avg + MED med_phases_prep_ocn_avg + MED -> WAV :remapMethod=redist + MED -> OCN :remapMethod=redist + WAV + OCN + @@[coupling_interval_fast_sec] + MED med_phases_prep_atm + MED med_phases_prep_ice + MED -> ATM :remapMethod=redist + MED -> ICE :remapMethod=redist + ATM + ICE + ATM -> MED :remapMethod=redist + MED med_phases_post_atm + ICE -> MED :remapMethod=redist + MED med_phases_post_ice + MED med_phases_prep_ocn_accum + MED med_phases_prep_wav_accum + @ + OCN -> MED :remapMethod=redist + WAV -> MED :remapMethod=redist + MED med_phases_post_ocn + MED med_phases_post_wav + MED med_phases_restart_write +@ +:: + +# CMEPS variables + +DRIVER_attributes:: +:: +MED_attributes:: + ATM_model = @[atm_model] + ICE_model = @[ice_model] + OCN_model = @[ocn_model] + WAV_model = @[wav_model] + history_n = 0 + history_option = nhours + history_ymd = -999 + coupling_mode = @[CPLMODE] + history_tile_atm = @[ATMTILESIZE] +:: 
+ALLCOMP_attributes:: + ScalarFieldCount = 2 + ScalarFieldIdxGridNX = 1 + ScalarFieldIdxGridNY = 2 + ScalarFieldName = cpl_scalars + start_type = @[RUNTYPE] + restart_dir = RESTART/ + case_name = ufs.cpld + restart_n = @[RESTART_N] + restart_option = nhours + restart_ymd = -999 + dbug_flag = @[cap_dbug_flag] + use_coldstart = @[use_coldstart] + use_mommesh = @[use_mommesh] + eps_imesh = @[eps_imesh] + stop_n = @[FHMAX] + stop_option = nhours + stop_ymd = -999 +:: diff --git a/ush/nems.configure.cpld_wave.IN b/ush/nems.configure.cpld_wave.IN index 89ef5101605..f2843a5b2cf 100644 --- a/ush/nems.configure.cpld_wave.IN +++ b/ush/nems.configure.cpld_wave.IN @@ -4,6 +4,7 @@ # ESMF # logKindFlag: @[esmf_logkind] +globalResourceControl: true # EARTH # EARTH_component_list: MED ATM OCN ICE WAV @@ -14,11 +15,13 @@ EARTH_attributes:: # MED # MED_model: @[med_model] MED_petlist_bounds: @[med_petlist_bounds] +MED_omp_num_threads: @[med_omp_num_threads] :: # ATM # ATM_model: @[atm_model] ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] ATM_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -29,6 +32,7 @@ ATM_attributes:: # OCN # OCN_model: @[ocn_model] OCN_petlist_bounds: @[ocn_petlist_bounds] +OCN_omp_num_threads: @[ocn_omp_num_threads] OCN_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -40,6 +44,7 @@ OCN_attributes:: # ICE # ICE_model: @[ice_model] ICE_petlist_bounds: @[ice_petlist_bounds] +ICE_omp_num_threads: @[ice_omp_num_threads] ICE_attributes:: Verbosity = 0 DumpFields = @[DumpFields] @@ -54,12 +59,13 @@ ICE_attributes:: # WAV # WAV_model: @[wav_model] WAV_petlist_bounds: @[wav_petlist_bounds] +WAV_omp_num_threads: @[wav_omp_num_threads] WAV_attributes:: Verbosity = 0 OverwriteSlice = false diro = "." 
logfile = wav.log - mesh_wav = @[MESH_WAV] + mesh_wav = @[MESH_WAV] multigrid = @[MULTIGRID] :: diff --git a/ush/nems.configure.leapfrog_atm_wav.IN b/ush/nems.configure.leapfrog_atm_wav.IN index a29951d001f..b302a27e8a9 100644 --- a/ush/nems.configure.leapfrog_atm_wav.IN +++ b/ush/nems.configure.leapfrog_atm_wav.IN @@ -3,7 +3,8 @@ ############################################# # ESMF # - logKindFlag: @[esmf_logkind] +logKindFlag: @[esmf_logkind] +globalResourceControl: true # EARTH # EARTH_component_list: ATM WAV @@ -14,6 +15,7 @@ EARTH_attributes:: # ATM # ATM_model: @[atm_model] ATM_petlist_bounds: @[atm_petlist_bounds] +ATM_omp_num_threads: @[atm_omp_num_threads] ATM_attributes:: Verbosity = max DumpFields = true @@ -22,6 +24,7 @@ ATM_attributes:: # WAV # WAV_model: @[wav_model] WAV_petlist_bounds: @[wav_petlist_bounds] +WAV_omp_num_threads: @[wav_omp_num_threads] WAV_attributes:: Verbosity = max :: @@ -32,7 +35,7 @@ WAV_attributes:: runSeq:: @@[coupling_interval_slow_sec] ATM - ATM -> WAV + ATM -> WAV WAV @ :: diff --git a/ush/nems_configure.sh b/ush/nems_configure.sh index 04fea90f354..7645c9e76bb 100755 --- a/ush/nems_configure.sh +++ b/ush/nems_configure.sh @@ -7,119 +7,138 @@ ## $cpl** switches. ## ## This is a child script of modular -## forecast script. This script is definition only +## forecast script. This script is definition only (Is it? There is nothing defined here being used outside this script.) ##### writing_nems_configure() { echo "SUB ${FUNCNAME[0]}: parsing_nems_configure begins" -if [ -e $SCRIPTDIR/nems.configure ]; then - rm -f $SCRIPTDIR/nems.configure +if [[ -e "${SCRIPTDIR}/nems.configure" ]]; then + rm -f "${SCRIPTDIR}/nems.configure" fi # Setup nems.configure -DumpFields=${NEMSDumpFields:-false} -cap_dbug_flag=${cap_dbug_flag:-0} -if [ $warm_start = ".true." 
]; then - cmeps_run_type='continue' +local DumpFields=${NEMSDumpFields:-false} +local cap_dbug_flag=${cap_dbug_flag:-0} +# Determine "cmeps_run_type" based on the availability of the mediator restart file +# If it is a warm_start, we already copied the mediator restart to DATA, if it was present +# If the mediator restart was not present, despite being a "warm_start", we put out a WARNING +# in forecast_postdet.sh +if [[ -f "${DATA}/ufs.cpld.cpl.r.nc" ]]; then + local cmeps_run_type='continue' else - cmeps_run_type='startup' + local cmeps_run_type='startup' fi -restart_interval=${restart_interval:-3024000} # Interval in seconds to write restarts - -ATM_model=${ATM_model:-'fv3'} -OCN_model=${OCN_model:-'mom6'} -ICE_model=${ICE_model:-'cice'} -WAV_model=${WAV_model:-'ww3'} -CHM_model=${CHM_model:-'gocart'} - -ATMPETS=${ATMPETS:-8} -MEDPETS=${MEDPETS:-8} -OCNPETS=${OCNPETS:-0} -ICEPETS=${ICEPETS:-0} -WAVPETS=${WAVPETS:-0} -CHMPETS=${CHMPETS:-${ATMPETS}} - -USE_MOMMESH=${USE_MOMMESH:-"true"} -MESH_OCN_ICE=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} - -if [[ $OCNRES = "100" ]]; then - EPS_IMESH='2.5e-1' -else - EPS_IMESH='1.0e-1' -fi - -rm -f $DATA/nems.configure +local res_int=${restart_interval:-3024000} # Interval in seconds to write restarts -med_petlist_bounds=${med_petlist_bounds:-"0 $(( $MEDPETS-1 ))"} -atm_petlist_bounds=${atm_petlist_bounds:-"0 $(( $ATMPETS-1 ))"} -ocn_petlist_bounds=${ocn_petlist_bounds:-"$ATMPETS $(( $ATMPETS+$OCNPETS-1 ))"} -ice_petlist_bounds=${ice_petlist_bounds:-"$(( $ATMPETS+$OCNPETS )) $(( $ATMPETS+$OCNPETS+$ICEPETS-1 ))"} -wav_petlist_bounds=${wav_petlist_bounds:-"$(( $ATMPETS+$OCNPETS+$ICEPETS )) $(( $ATMPETS+$OCNPETS+$ICEPETS+$WAVPETS-1 ))"} -chm_petlist_bounds=${chm_petlist_bounds:-"0 $(( $CHMPETS-1 ))"} +rm -f "${DATA}/nems.configure" -esmf_logkind=${esmf_logkind:-"ESMF_LOGKIND_MULTI"} #options: ESMF_LOGKIND_MULTI_ON_ERROR, ESMF_LOGKIND_MULTI, ESMF_LOGKIND_NONE +local esmf_logkind=${esmf_logkind:-"ESMF_LOGKIND_MULTI"} #options: 
ESMF_LOGKIND_MULTI_ON_ERROR, ESMF_LOGKIND_MULTI, ESMF_LOGKIND_NONE # Copy the selected template into run directory -infile="$SCRIPTDIR/nems.configure.$confignamevarfornems.IN" -if [ -s $infile ]; then - cp $infile tmp1 +infile="${SCRIPTDIR}/nems.configure.${confignamevarfornems}.IN" +if [[ -s ${infile} ]]; then + cp "${infile}" tmp1 else - echo "FATAL ERROR: nem.configure template '$infile' does not exist!" + echo "FATAL ERROR: nem.configure template '${infile}' does not exist!" exit 1 fi + +local atm_petlist_bounds="0 $(( ${ATMPETS}-1 ))" +local med_petlist_bounds="0 $(( ${MEDPETS}-1 ))" + +sed -i -e "s;@\[atm_model\];fv3;g" tmp1 +sed -i -e "s;@\[atm_petlist_bounds\];${atm_petlist_bounds};g" tmp1 +sed -i -e "s;@\[atm_omp_num_threads\];${ATMTHREADS};g" tmp1 sed -i -e "s;@\[med_model\];cmeps;g" tmp1 -sed -i -e "s;@\[atm_model\];$ATM_model;g" tmp1 -sed -i -e "s;@\[med_petlist_bounds\];$med_petlist_bounds;g" tmp1 -sed -i -e "s;@\[atm_petlist_bounds\];$atm_petlist_bounds;g" tmp1 -sed -i -e "s;@\[esmf_logkind\];$esmf_logkind;g" tmp1 +sed -i -e "s;@\[med_petlist_bounds\];${med_petlist_bounds};g" tmp1 +sed -i -e "s;@\[med_omp_num_threads\];${MEDTHREADS};g" tmp1 +sed -i -e "s;@\[esmf_logkind\];${esmf_logkind};g" tmp1 -if [ $cpl = ".true." ]; then - sed -i -e "s;@\[coupling_interval_slow_sec\];$CPL_SLOW;g" tmp1 +if [[ "${cpl}" = ".true." ]]; then + sed -i -e "s;@\[coupling_interval_slow_sec\];${CPL_SLOW};g" tmp1 fi -if [ $cplflx = .true. ]; then - if [ $restart_interval -gt 0 ]; then - restart_interval_nems=$restart_interval +if [[ "${cplflx}" = ".true." 
]]; then + if [[ ${res_int} -gt 0 ]]; then + local restart_interval_nems=${res_int} else - restart_interval_nems=$FHMAX + local restart_interval_nems=${FHMAX} fi - sed -i -e "s;@\[ocn_model\];$OCN_model;g" tmp1 - sed -i -e "s;@\[ocn_petlist_bounds\];$ocn_petlist_bounds;g" tmp1 - sed -i -e "s;@\[DumpFields\];$DumpFields;g" tmp1 - sed -i -e "s;@\[cap_dbug_flag\];$cap_dbug_flag;g" tmp1 - sed -i -e "s;@\[use_coldstart\];$use_coldstart;g" tmp1 - sed -i -e "s;@\[RUNTYPE\];$cmeps_run_type;g" tmp1 - sed -i -e "s;@\[CPLMODE\];$cplmode;g" tmp1 - sed -i -e "s;@\[restart_interval\];$restart_interval;g" tmp1 - sed -i -e "s;@\[coupling_interval_fast_sec\];$CPL_FAST;g" tmp1 - sed -i -e "s;@\[RESTART_N\];$restart_interval_nems;g" tmp1 - sed -i -e "s;@\[use_mommesh\];$USE_MOMMESH;g" tmp1 - sed -i -e "s;@\[eps_imesh\];$EPS_IMESH;g" tmp1 - sed -i -e "s;@\[ATMTILESIZE\];$RESTILE;g" tmp1 + + # TODO: Should this be raised up to config.ufs or config.ocn? + case "${OCNRES}" in + "500") local eps_imesh="4.0e-1";; + "100") local eps_imesh="2.5e-1";; + *) local eps_imesh="1.0e-1";; + esac + + local use_coldstart=${use_coldstart:-".false."} + local use_mommesh=${USE_MOMMESH:-"true"} + local restile=$(echo "${CASE}" |cut -c2-) + + local start="${ATMPETS}" + local end="$(( ${start}+${OCNPETS}-1 ))" + local ocn_petlist_bounds="${start} ${end}" + + sed -i -e "s;@\[ocn_model\];mom6;g" tmp1 + sed -i -e "s;@\[ocn_petlist_bounds\];${ocn_petlist_bounds};g" tmp1 + sed -i -e "s;@\[ocn_omp_num_threads\];${OCNTHREADS};g" tmp1 + sed -i -e "s;@\[DumpFields\];${DumpFields};g" tmp1 + sed -i -e "s;@\[cap_dbug_flag\];${cap_dbug_flag};g" tmp1 + sed -i -e "s;@\[use_coldstart\];${use_coldstart};g" tmp1 + sed -i -e "s;@\[RUNTYPE\];${cmeps_run_type};g" tmp1 + sed -i -e "s;@\[CPLMODE\];${cplmode};g" tmp1 + sed -i -e "s;@\[restart_interval\];${res_int};g" tmp1 + sed -i -e "s;@\[coupling_interval_fast_sec\];${CPL_FAST};g" tmp1 + sed -i -e "s;@\[RESTART_N\];${restart_interval_nems};g" tmp1 + sed -i -e 
"s;@\[use_mommesh\];${use_mommesh};g" tmp1 + sed -i -e "s;@\[eps_imesh\];${eps_imesh};g" tmp1 + sed -i -e "s;@\[ATMTILESIZE\];${restile};g" tmp1 fi -if [ $cplwav = .true. ]; then - sed -i -e "s;@\[wav_model\];ww3;g" tmp1 - sed -i -e "s;@\[wav_petlist_bounds\];$wav_petlist_bounds;g" tmp1 - sed -i -e "s;@\[MESH_WAV\];$MESH_WAV;g" tmp1 - sed -i -e "s;@\[MULTIGRID\];$waveMULTIGRID;g" tmp1 + +if [[ "${cplice}" = ".true." ]]; then + + local mesh_ocn_ice=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} + + local start="$(( ${ATMPETS}+${OCNPETS} ))" + local end="$(( ${start}+${ICEPETS}-1 ))" + local ice_petlist_bounds="${start} ${end}" + + sed -i -e "s;@\[ice_model\];cice6;g" tmp1 + sed -i -e "s;@\[ice_petlist_bounds\];${ice_petlist_bounds};g" tmp1 + sed -i -e "s;@\[ice_omp_num_threads\];${ICETHREADS};g" tmp1 + sed -i -e "s;@\[MESH_OCN_ICE\];${mesh_ocn_ice};g" tmp1 + sed -i -e "s;@\[FHMAX\];${FHMAX_GFS};g" tmp1 fi -if [ $cplice = .true. ]; then - sed -i -e "s;@\[ice_model\];$ICE_model;g" tmp1 - sed -i -e "s;@\[ice_petlist_bounds\];$ice_petlist_bounds;g" tmp1 - sed -i -e "s;@\[MESH_OCN_ICE\];$MESH_OCN_ICE;g" tmp1 - sed -i -e "s;@\[FHMAX\];$FHMAX_GFS;g" tmp1 + +if [[ "${cplwav}" = ".true." ]]; then + + local start="$(( ${ATMPETS}+${OCNPETS:-0}+${ICEPETS:-0} ))" + local end="$(( ${start}+${WAVPETS}-1 ))" + local wav_petlist_bounds="${start} ${end}" + + sed -i -e "s;@\[wav_model\];ww3;g" tmp1 + sed -i -e "s;@\[wav_petlist_bounds\];${wav_petlist_bounds};g" tmp1 + sed -i -e "s;@\[wav_omp_num_threads\];${WAVTHREADS};g" tmp1 + sed -i -e "s;@\[MESH_WAV\];${MESH_WAV};g" tmp1 + sed -i -e "s;@\[MULTIGRID\];${waveMULTIGRID};g" tmp1 fi -if [ $cplchm = .true. ]; then - sed -i -e "s;@\[chm_model\];$CHM_model;g" tmp1 - sed -i -e "s;@\[chm_petlist_bounds\];$chm_petlist_bounds;g" tmp1 - sed -i -e "s;@\[coupling_interval_fast_sec\];$CPL_FAST;g" tmp1 + +if [[ "${cplchm}" = ".true." 
]]; then + + local chm_petlist_bounds="0 $(( ${CHMPETS}-1 ))" + + sed -i -e "s;@\[chm_model\];gocart;g" tmp1 + sed -i -e "s;@\[chm_petlist_bounds\];${chm_petlist_bounds};g" tmp1 + sed -i -e "s;@\[chm_omp_num_threads\];${CHMTHREADS};g" tmp1 + sed -i -e "s;@\[coupling_interval_fast_sec\];${CPL_FAST};g" tmp1 fi mv tmp1 nems.configure echo "$(cat nems.configure)" -if [ $cplflx = .true. ]; then +if [[ "${cplflx}" = ".true." ]]; then #Create other CMEPS mediator related files cat > pio_in << EOF @@ -176,8 +195,8 @@ echo "$(cat med_modelio.nml)" fi -cp $HOMEgfs/sorc/ufs_model.fd/tests/parm/fd_nems.yaml fd_nems.yaml +${NCP} "${HOMEgfs}/sorc/ufs_model.fd/tests/parm/fd_nems.yaml" fd_nems.yaml -echo "SUB ${FUNCNAME[0]}: Nems configured for $confignamevarfornems" +echo "SUB ${FUNCNAME[0]}: Nems configured for ${confignamevarfornems}" } diff --git a/ush/ocnpost.ncl b/ush/ocnpost.ncl index 81f24673fcb..27e60b0edfa 100755 --- a/ush/ocnpost.ncl +++ b/ush/ocnpost.ncl @@ -93,7 +93,8 @@ begin ; pull from environment COMDIR = getenv("COMOUTocean") IDATE = getenv("IDATE") - FHR2 = getenv("FHR") + VDATE = getenv("VDATE") + FHR2 = getenv("FHR") FHR=FHR2 ENSMEM = getenv("ENSMEM") DATA_TMP = getenv("DATA") @@ -101,7 +102,7 @@ begin ; nemsrc = "/scratch2/NCEPDEV/climate/Bin.Li/S2S/fix/ocean_ice_post/FIXDIR/" ; calculate and break apart verification date - VDATE = tochar(systemfunc("$NDATE "+FHR+" "+IDATE)) + ; VDATE = tochar(systemfunc("$NDATE "+FHR+" "+IDATE)) ; YYYY = tostring(VDATE(0:3)) ; MM = tostring(VDATE(4:5)) ; DD = tostring(VDATE(6:7)) diff --git a/ush/ozn_xtrct.sh b/ush/ozn_xtrct.sh new file mode 100755 index 00000000000..3f6b3fed19b --- /dev/null +++ b/ush/ozn_xtrct.sh @@ -0,0 +1,261 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +#------------------------------------------------------------------ +# ozn_xtrct.sh +# +# This script performs the data extraction from the oznstat +# diagnostic files. 
The resulting data (*.ieee_d) files, GrADS +# control files and stdout files will be moved to the +# $TANKverf_ozn. +# +# Calling scripts must define: +# $TANKverf_ozn +# $HOMEoznmon +# $PDATE +# +# Return values are +# 0 = normal +# 2 = unable to generate satype list; may indicate no diag +# files found in oznstat file +#------------------------------------------------------------------ + +#-------------------------------------------------- +# check_diag_files +# +# Compare $satype (which contains the contents of +# gdas_oznmon_satype.txt to $avail_satype which is +# determined by the contents of the oznstat file. +# Report any missing diag files in a file named +# bad_diag.$PDATE +# +check_diag_files() { + pdate=$1 + found_satype=$2 + avail_satype=$3 + + out_file="bad_diag.${pdate}" + + echo ""; echo ""; echo "--> check_diag_files" + + for type in ${found_satype}; do + len_check=$(echo ${avail_satype} | grep ${type} | wc -c) + + if [[ ${len_check} -le 1 ]]; then + echo "missing diag file -- diag_${type}_ges.${pdate}.gz not found " >> ./${out_file} + fi + done + + echo "<-- check_diag_files"; echo ""; echo "" +} + + +iret=0 +export NCP=${NCP:-/bin/cp} +VALIDATE_DATA=${VALIDATE_DATA:-0} +nregion=${nregion:-6} +DO_DATA_RPT=${DO_DATA_RPT:-0} + +netcdf_boolean=".false." +if [[ $OZNMON_NETCDF -eq 1 ]]; then + netcdf_boolean=".true." +fi + +OZNMON_NEW_HDR=${OZNMON_NEW_HDR:-0} +new_hdr="F" +if [[ $OZNMON_NEW_HDR -eq 1 ]]; then + new_hdr="T" +fi + +#------------------------------------------------------------------ +# if VALIDATE_DATA then locate and untar base file +# +validate=".FALSE." +if [[ $VALIDATE_DATA -eq 1 ]]; then + if [[ ! -e $ozn_val_file && ! -h $ozn_val_file ]]; then + echo "WARNING: VALIDATE_DATA set to 1, but unable to locate $ozn_val_file" + echo " Setting VALIDATE_DATA to 0/OFF" + VALIDATE_DATA=0 + else + validate=".TRUE." 
+ val_file=$(basename ${ozn_val_file}) + ${NCP} $ozn_val_file $val_file + tar -xvf $val_file + fi +fi +echo "VALIDATE_DATA, validate = $VALIDATE_DATA, $validate " + + + +#------------------------------------------------------------------ +# ozn_ptype here is the processing type which is intended to be "ges" +# or "anl". Default is "ges". +# +ozn_ptype=${ozn_ptype:-"ges anl"} + + +#--------------------------------------------------------------------------- +# Build satype list from the available diag files. +# +# An empty satype list means there are no diag files to process. That's +# a problem, reported by an iret value of 2 +# + +avail_satype=$(ls -1 d*ges* | sed -e 's/_/ /g;s/\./ /' | gawk '{ print $2 "_" $3 }') + +if [[ ${DO_DATA_RPT} -eq 1 ]]; then + if [[ -e ${SATYPE_FILE} ]]; then + satype=$(cat ${SATYPE_FILE}) + check_diag_files ${PDATE} "${satype}" "${avail_satype}" + else + echo "WARNING: missing ${SATYPE_FILE}" + fi +fi + +len_satype=$(echo -n "${satype}" | wc -c) + +if [[ ${len_satype} -le 1 ]]; then + satype=${avail_satype} +fi + +echo ${satype} + + +len_satype=$(echo -n "${satype}" | wc -c) + +if [[ ${DO_DATA_RPT} -eq 1 && ${len_satype} -lt 1 ]]; then + iret=2 + +else + + #-------------------------------------------------------------------- + # Copy extraction programs to working directory + # + ${NCP} ${HOMEoznmon}/exec/oznmon_time.x ./oznmon_time.x + if [[ ! -e oznmon_time.x ]]; then + iret=2 + exit ${iret} + fi + ${NCP} ${HOMEoznmon}/exec/oznmon_horiz.x ./oznmon_horiz.x + if [[ ! 
-e oznmon_horiz.x ]]; then + iret=3 + exit ${iret} + fi + + + #--------------------------------------------------------------------------- + # Outer loop over $ozn_ptype (default values 'ges', 'anl') + # + for ptype in ${ozn_ptype}; do + + iyy=$(echo ${PDATE} | cut -c1-4) + imm=$(echo ${PDATE} | cut -c5-6) + idd=$(echo ${PDATE} | cut -c7-8) + ihh=$(echo ${PDATE} | cut -c9-10) + + for type in ${avail_satype}; do + if [[ -f "diag_${type}_${ptype}.${PDATE}.gz" ]]; then + mv diag_${type}_${ptype}.${PDATE}.gz ${type}.${ptype}.gz + gunzip ./${type}.${ptype}.gz + + echo "processing ptype, type: ${ptype}, ${type}" + rm -f input + +cat << EOF > input + &INPUT + satname='${type}', + iyy=${iyy}, + imm=${imm}, + idd=${idd}, + ihh=${ihh}, + idhh=-720, + incr=6, + nregion=${nregion}, + region(1)='global', rlonmin(1)=-180.0,rlonmax(1)=180.0,rlatmin(1)=-90.0,rlatmax(1)= 90.0, + region(2)='70N-90N', rlonmin(2)=-180.0,rlonmax(2)=180.0,rlatmin(2)= 70.0,rlatmax(2)= 90.0, + region(3)='20N-70N', rlonmin(3)=-180.0,rlonmax(3)=180.0,rlatmin(3)= 20.0,rlatmax(3)= 70.0, + region(4)='20S-20N', rlonmin(4)=-180.0,rlonmax(4)=180.0,rlatmin(4)=-20.0,rlatmax(4)= 20.0, + region(5)='20S-70S', rlonmin(5)=-180.0,rlonmax(5)=180.0,rlatmin(5)=-70.0,rlatmax(5)=-20.0, + region(6)='70S-90S', rlonmin(6)=-180.0,rlonmax(6)=180.0,rlatmin(6)=-90.0,rlatmax(6)=-70.0, + validate=${validate}, + new_hdr=${new_hdr}, + ptype=${ptype}, + netcdf=${netcdf_boolean} + / +EOF + + + echo "oznmon_time.x HAS STARTED ${type}" + + ./oznmon_time.x < input > stdout.time.${type}.${ptype} + + echo "oznmon_time.x HAS ENDED ${type}" + + if [[ ! 
-d ${TANKverf_ozn}/time ]]; then + mkdir -p ${TANKverf_ozn}/time + fi + $NCP ${type}.${ptype}.ctl ${TANKverf_ozn}/time/ + $NCP ${type}.${ptype}.${PDATE}.ieee_d ${TANKverf_ozn}/time/ + + $NCP bad* ${TANKverf_ozn}/time/ + + rm -f input + +cat << EOF > input + &INPUT + satname='${type}', + iyy=${iyy}, + imm=${imm}, + idd=${idd}, + ihh=${ihh}, + idhh=-18, + incr=6, + new_hdr=${new_hdr}, + ptype=${ptype}, + netcdf=${netcdf_boolean} + / +EOF + + echo "oznmon_horiz.x HAS STARTED ${type}" + + ./oznmon_horiz.x < input > stdout.horiz.${type}.${ptype} + + echo "oznmon_horiz.x HAS ENDED ${type}" + + if [[ ! -d ${TANKverf_ozn}/horiz ]]; then + mkdir -p ${TANKverf_ozn}/horiz + fi + $NCP ${type}.${ptype}.ctl ${TANKverf_ozn}/horiz/ + + $COMPRESS ${type}.${ptype}.${PDATE}.ieee_d + $NCP ${type}.${ptype}.${PDATE}.ieee_d.${Z} ${TANKverf_ozn}/horiz/ + + + echo "finished processing ptype, type: ${ptype}, ${type}" + + else + echo "diag file for ${type}.${ptype} not found" + fi + + done # type in satype + + done # ptype in $ozn_ptype + + tar -cvf stdout.horiz.tar stdout.horiz* + ${COMPRESS} stdout.horiz.tar + ${NCP} stdout.horiz.tar.${Z} ${TANKverf_ozn}/horiz/ + + tar -cvf stdout.time.tar stdout.time* + ${COMPRESS} stdout.time.tar + ${NCP} stdout.time.tar.${Z} ${TANKverf_ozn}/time/ +fi + +#------------------------------------------------------- +# Conditionally remove data files older than 40 days +# +if [[ ${CLEAN_TANKDIR:-0} -eq 1 ]]; then + ${HOMEoznmon}/ush/clean_tankdir.sh glb 40 +fi + +exit ${iret} diff --git a/ush/parsing_model_configure_FV3.sh b/ush/parsing_model_configure_FV3.sh index 4574b6e3528..91b82a0d76d 100755 --- a/ush/parsing_model_configure_FV3.sh +++ b/ush/parsing_model_configure_FV3.sh @@ -12,6 +12,13 @@ FV3_model_configure(){ +local restile=$(echo "${CASE}" |cut -c2-) +local ichunk2d=$((4*restile)) +local jchunk2d=$((2*restile)) +local ichunk3d=$((4*restile)) +local jchunk3d=$((2*restile)) +local kchunk3d=1 + rm -f model_configure cat >> model_configure < ice_in < 
ice_in < ice_in < ice_in < ice_in < ice_in <> input.nml <> input.nml <> input.nml <> input.nml <> input.nml <> input.nml < $DATA/INPUT/MOM_input -rm $DATA/INPUT/MOM_input_template_$OCNRES +${NCP} -pf "${HOMEgfs}/parm/mom6/MOM_input_template_${OCNRES}" "${DATA}/INPUT/" +sed -e "s/@\[DT_THERM_MOM6\]/${DT_THERM_MOM6}/g" \ + -e "s/@\[DT_DYNAM_MOM6\]/${DT_DYNAM_MOM6}/g" \ + -e "s/@\[MOM6_RIVER_RUNOFF\]/${MOM6_RIVER_RUNOFF}/g" \ + -e "s/@\[MOM6_THERMO_SPAN\]/${MOM6_THERMO_SPAN}/g" \ + -e "s/@\[MOM6_USE_LI2016\]/${MOM6_USE_LI2016}/g" \ + -e "s/@\[MOM6_USE_WAVES\]/${MOM6_USE_WAVES}/g" \ + -e "s/@\[MOM6_ALLOW_LANDMASK_CHANGES\]/${MOM6_ALLOW_LANDMASK_CHANGES}/g" \ + -e "s/@\[NX_GLB\]/${NX_GLB}/g" \ + -e "s/@\[NY_GLB\]/${NY_GLB}/g" \ + -e "s/@\[CHLCLIM\]/${CHLCLIM}/g" \ + -e "s/@\[DO_OCN_SPPT\]/${OCN_SPPT}/g" \ + -e "s/@\[PERT_EPBL\]/${PERT_EPBL}/g" \ + -e "s/@\[ODA_INCUPD_NHOURS\]/${ODA_INCUPD_NHOURS}/g" \ + -e "s/@\[ODA_INCUPD\]/${ODA_INCUPD}/g" "${DATA}/INPUT/MOM_input_template_${OCNRES}" > "${DATA}/INPUT/MOM_input" +rm "${DATA}/INPUT/MOM_input_template_${OCNRES}" #data table for runoff: -DATA_TABLE=${DATA_TABLE:-$PARM_FV3DIAG/data_table} -$NCP $DATA_TABLE $DATA/data_table_template -sed -e "s/@\[FRUNOFF\]/$FRUNOFF/g" $DATA/data_table_template > $DATA/data_table -rm $DATA/data_table_template +DATA_TABLE=${DATA_TABLE:-${PARM_FV3DIAG}/data_table} +${NCP} "${DATA_TABLE}" "${DATA}/data_table_template" +sed -e "s/@\[FRUNOFF\]/${FRUNOFF}/g" "${DATA}/data_table_template" > "${DATA}/data_table" +rm "${DATA}/data_table_template" } diff --git a/ush/parsing_namelists_WW3.sh b/ush/parsing_namelists_WW3.sh index 209fe9d11ae..c53af9f18f4 100755 --- a/ush/parsing_namelists_WW3.sh +++ b/ush/parsing_namelists_WW3.sh @@ -70,7 +70,7 @@ WW3_namelists(){ echo " starting time : $time_beg" echo " ending time : $time_end" echo ' ' - ${TRACE_ON:-set -x} + set_trace @@ -108,7 +108,7 @@ WW3_namelists(){ then set +x echo " buoy.loc copied ($PARMwave/wave_${NET}.buoys)." 
- ${TRACE_ON:-set -x} + set_trace else echo " FATAL ERROR : buoy.loc ($PARMwave/wave_${NET}.buoys) NOT FOUND" exit 12 diff --git a/ush/preamble.sh b/ush/preamble.sh old mode 100755 new mode 100644 index bfa326f1030..be64684aa87 --- a/ush/preamble.sh +++ b/ush/preamble.sh @@ -19,68 +19,139 @@ # ####### set +x -if [[ -v '1' ]]; then - id="(${1})" +if (( $# > 0 )); then + id="(${1})" else - id="" + id="" fi # Record the start time so we can calculate the elapsed time later start_time=$(date +%s) # Get the base name of the calling script -_calling_script=$(basename ${BASH_SOURCE[1]}) +_calling_script=$(basename "${BASH_SOURCE[1]}") # Announce the script has begun -echo "Begin ${_calling_script} at $(date -u)" +start_time_human=$(date -d"@${start_time}" -u) +echo "Begin ${_calling_script} at ${start_time_human}" -# Stage our variables -export STRICT=${STRICT:-"YES"} -export TRACE=${TRACE:-"YES"} -export ERR_EXIT_ON="" -export TRACE_ON="" +declare -rx PS4='+ $(basename ${BASH_SOURCE[0]:-${FUNCNAME[0]:-"Unknown"}})[${LINENO}]'"${id}: " -if [[ $STRICT == "YES" ]]; then - # Exit on error and undefined variable - export ERR_EXIT_ON="set -eu" -fi -if [[ $TRACE == "YES" ]]; then - export TRACE_ON="set -x" - # Print the script name and line number of each command as it is executed - export PS4='+ $(basename $BASH_SOURCE)[$LINENO]'"$id: " -fi +set_strict() { + if [[ ${STRICT:-"YES"} == "YES" ]]; then + # Exit on error and undefined variable + set -eu + fi +} + +set_trace() { + # Print the script name and line number of each command as it is + # executed when using trace. + if [[ ${TRACE:-"YES"} == "YES" ]]; then + set -x + fi +} postamble() { - # - # Commands to execute when a script ends. - # - # Syntax: - # postamble script start_time rc - # - # Arguments: - # script: name of the script ending - # start_time: start time of script (in seconds) - # rc: the exit code of the script - # + # + # Commands to execute when a script ends. 
+ # + # Syntax: + # postamble script start_time rc + # + # Arguments: + # script: name of the script ending + # start_time: start time of script (in seconds) + # rc: the exit code of the script + # - set +x - script=${1} - start_time=${2} - rc=${3} + set +x + script="${1}" + start_time="${2}" + rc="${3}" - # Calculate the elapsed time - end_time=$(date +%s) - elapsed_sec=$((end_time - start_time)) - elapsed=$(date -d@${elapsed_sec} -u +%H:%M:%S) + # Calculate the elapsed time + end_time=$(date +%s) + end_time_human=$(date -d@"${end_time}" -u +%H:%M:%S) + elapsed_sec=$((end_time - start_time)) + elapsed=$(date -d@"${elapsed_sec}" -u +%H:%M:%S) - # Announce the script has ended, then pass the error code up - echo "End ${script} at $(date -u) with error code ${rc:-0} (time elapsed: ${elapsed})" - exit ${rc} + # Announce the script has ended, then pass the error code up + echo "End ${script} at ${end_time_human} with error code ${rc:-0} (time elapsed: ${elapsed})" + exit "${rc}" } # Place the postamble in a trap so it is always called no matter how the script exits +# Shellcheck: Turn off warning about substitutions at runtime instead of signal time +# shellcheck disable=SC2064 trap "postamble ${_calling_script} ${start_time} \$?" EXIT +# shellcheck disable= + +function generate_com() { + # + # Generate a list of COM variables from a template by substituting in env variables. + # + # Each argument must have a corresponding template with the name ${ARG}_TMPL. Any + # variables in the template are replaced with their values. Undefined variables + # are just removed without raising an error. + # + # Accepts as options `-r` and `-x`, which do the same thing as the same options in + # `declare`. Variables are automatically marked as `-g` so the variable is visible + # in the calling script.
+ # + # Syntax: + # generate_com [-rx] $var1[:$tmpl1] [$var2[:$tmpl2]] [...] + # + # options: + # -r: Make variable read-only (same as `declare -r`) + # -x: Mark variable for export (same as `declare -x`) + # var1, var2, etc: Variable names whose values will be generated from a template + # and declared + # tmpl1, tmpl2, etc: Specify the template to use (default is "${var}_TMPL") + # + # Examples: + # # Current cycle and RUN, implicitly using template COM_ATMOS_ANALYSIS_TMPL + # YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS + # + # # Previous cycle and gdas using an explicit template + # RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + # COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL + # + # # Current cycle and COM for first member + # MEMDIR='mem001' YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY + # + if [[ ${DEBUG_WORKFLOW:-"NO"} == "NO" ]]; then set +x; fi + local opts="-g" + local OPTIND=1 + while getopts "rx" option; do + opts="${opts}${option}" + done + shift $((OPTIND-1)) + + for input in "$@"; do + IFS=':' read -ra args <<< "${input}" + local com_var="${args[0]}" + local template + local value + if (( ${#args[@]} > 1 )); then + template="${args[1]}" + else + template="${com_var}_TMPL" + fi + if [[ ! -v "${template}" ]]; then + echo "FATAL ERROR in generate_com: Requested template ${template} not defined!"
+ exit 2 + fi + value=$(echo "${!template}" | envsubst) + # shellcheck disable=SC2086 + declare ${opts} "${com_var}"="${value}" + echo "generate_com :: ${com_var}=${value}" + done + set_trace +} +# shellcheck disable= +declare -xf generate_com # Turn on our settings -$ERR_EXIT_ON -$TRACE_ON +set_strict +set_trace diff --git a/ush/python/pygfs/__init__.py b/ush/python/pygfs/__init__.py new file mode 100644 index 00000000000..e69de29bb2d diff --git a/ush/python/pygfs/task/__init__.py b/ush/python/pygfs/task/__init__.py new file mode 100644 index 00000000000..e69de29bb2d diff --git a/ush/python/pygfs/task/aero_analysis.py b/ush/python/pygfs/task/aero_analysis.py new file mode 100644 index 00000000000..2a0b2c2dd64 --- /dev/null +++ b/ush/python/pygfs/task/aero_analysis.py @@ -0,0 +1,304 @@ +#!/usr/bin/env python3 + +import os +import glob +import gzip +import tarfile +from logging import getLogger +from typing import Dict, List, Any + +from pygw.attrdict import AttrDict +from pygw.file_utils import FileHandler +from pygw.timetools import add_to_datetime, to_fv3time, to_timedelta +from pygw.fsutils import rm_p, chdir +from pygw.timetools import to_fv3time +from pygw.yaml_file import YAMLFile, parse_yamltmpl, parse_j2yaml, save_as_yaml +from pygw.logger import logit +from pygw.executable import Executable +from pygw.exceptions import WorkflowException +from pygfs.task.analysis import Analysis + +logger = getLogger(__name__.split('.')[-1]) + + +class AerosolAnalysis(Analysis): + """ + Class for global aerosol analysis tasks + """ + @logit(logger, name="AerosolAnalysis") + def __init__(self, config): + super().__init__(config) + + _res = int(self.config['CASE'][1:]) + _res_enkf = int(self.config['CASE_ENKF'][1:]) + _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config['assim_freq']}H") / 2) + _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml") + + # 
Create a local dictionary that is repeatedly used across this class + local_dict = AttrDict( + { + 'npx_ges': _res + 1, + 'npy_ges': _res + 1, + 'npz_ges': self.config.LEVS - 1, + 'npz': self.config.LEVS - 1, + 'npx_anl': _res_enkf + 1, + 'npy_anl': _res_enkf + 1, + 'npz_anl': self.config['LEVS'] - 1, + 'AERO_WINDOW_BEGIN': _window_begin, + 'AERO_WINDOW_LENGTH': f"PT{self.config['assim_freq']}H", + 'OPREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN + 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN + 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", + 'fv3jedi_yaml': _fv3jedi_yaml, + } + ) + + # task_config is everything that this task should need + self.task_config = AttrDict(**self.config, **self.runtime_config, **local_dict) + + @logit(logger) + def initialize(self: Analysis) -> None: + """Initialize a global aerosol analysis + + This method will initialize a global aerosol analysis using JEDI. 
+ This includes: + - staging CRTM fix files + - staging FV3-JEDI fix files + - staging B error files + - staging model backgrounds + - generating a YAML file for the JEDI executable + - creating output directories + """ + super().initialize() + + # stage CRTM fix files + crtm_fix_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'parm_gdas', 'aero_crtm_coeff.yaml') + logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") + crtm_fix_list = parse_yamltmpl(crtm_fix_list_path, self.task_config) + FileHandler(crtm_fix_list).sync() + + # stage fix files + jedi_fix_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'parm_gdas', 'aero_jedi_fix.yaml') + logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") + jedi_fix_list = parse_yamltmpl(jedi_fix_list_path, self.task_config) + FileHandler(jedi_fix_list).sync() + + # stage berror files + # copy BUMP files, otherwise it will assume ID matrix + if self.task_config.get('STATICB_TYPE', 'identity') in ['bump']: + FileHandler(self.get_berror_dict(self.task_config)).sync() + + # stage backgrounds + FileHandler(self.get_bkg_dict(AttrDict(self.task_config, **self.task_config))).sync() + + # generate variational YAML file + logger.debug(f"Generate variational YAML file: {self.task_config.fv3jedi_yaml}") + varda_yaml = parse_j2yaml(self.task_config['AEROVARYAML'], self.task_config) + save_as_yaml(varda_yaml, self.task_config.fv3jedi_yaml) + logger.info(f"Wrote variational YAML to: {self.task_config.fv3jedi_yaml}") + + # need output dir for diags and anl + logger.debug("Create empty output [anl, diags] directories to receive output from executable") + newdirs = [ + os.path.join(self.task_config['DATA'], 'anl'), + os.path.join(self.task_config['DATA'], 'diags'), + ] + FileHandler({'mkdir': newdirs}).sync() + + @logit(logger) + def execute(self: Analysis) -> None: + + chdir(self.task_config.DATA) + + exec_cmd = Executable(self.task_config.APRUN_AEROANL) + exec_name = 
os.path.join(self.task_config.DATA, 'fv3jedi_var.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occurred during execution of {exec_cmd}") + + pass + + @logit(logger) + def finalize(self: Analysis) -> None: + """Finalize a global aerosol analysis + + This method will finalize a global aerosol analysis using JEDI. + This includes: + - tarring up output diag files and placing them in the ROTDIR + - copying the generated YAML file from initialize to the ROTDIR + - copying the guess files to the ROTDIR + - applying the increments to the original RESTART files + - moving the increment files to the ROTDIR + + Please note that some of these steps are temporary and will be modified + once the model is able to read aerosol tracer increments. + """ + # ---- tar up diags + # path of output tar statfile + aerostat = os.path.join(self.task_config.COM_CHEM_ANALYSIS, f"{self.task_config['APREFIX']}aerostat") + + # get list of diag files to put in tarball + diags = glob.glob(os.path.join(self.task_config['DATA'], 'diags', 'diag*nc4')) + + # gzip the files first + for diagfile in diags: + with open(diagfile, 'rb') as f_in, gzip.open(f"{diagfile}.gz", 'wb') as f_out: + f_out.writelines(f_in) + + # open tar file for writing + with tarfile.open(aerostat, "w") as archive: + for diagfile in diags: + diaggzip = f"{diagfile}.gz" + archive.add(diaggzip, arcname=os.path.basename(diaggzip)) + + # copy full YAML from executable to ROTDIR + src = os.path.join(self.task_config['DATA'], f"{self.task_config['CDUMP']}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml") + dest = os.path.join(self.task_config.COM_CHEM_ANALYSIS, f"{self.task_config['CDUMP']}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml") + yaml_copy = { + 'mkdir': [self.task_config.COM_CHEM_ANALYSIS], + 'copy':
[[src, dest]] + } + FileHandler(yaml_copy).sync() + + # ---- NOTE below is 'temporary', eventually we will not be using FMS RESTART formatted files + # ---- all of the rest of this method will need to be changed but requires model and JEDI changes + # ---- copy RESTART fv_tracer files for future reference + template = '{}.fv_tracer.res.tile{}.nc'.format(to_fv3time(self.task_config.current_cycle), '{tilenum}') + bkglist = [] + for itile in range(1, self.task_config.ntiles + 1): + tracer = template.format(tilenum=itile) + src = os.path.join(self.task_config.COM_ATMOS_RESTART_PREV, tracer) + dest = os.path.join(self.task_config.COM_CHEM_ANALYSIS, f'aeroges.{tracer}') + bkglist.append([src, dest]) + FileHandler({'copy': bkglist}).sync() + + # ---- add increments to RESTART files + logger.info('Adding increments to RESTART files') + self._add_fms_cube_sphere_increments() + + # ---- move increments to ROTDIR + logger.info('Moving increments to ROTDIR') + template = f'aeroinc.{to_fv3time(self.task_config.current_cycle)}.fv_tracer.res.tile{{tilenum}}.nc' + inclist = [] + for itile in range(1, self.task_config.ntiles + 1): + tracer = template.format(tilenum=itile) + src = os.path.join(self.task_config.DATA, 'anl', tracer) + dest = os.path.join(self.task_config.COM_CHEM_ANALYSIS, tracer) + inclist.append([src, dest]) + FileHandler({'copy': inclist}).sync() + + def clean(self): + super().clean() + + @logit(logger) + def _add_fms_cube_sphere_increments(self: Analysis) -> None: + """This method adds increments to RESTART files to get an analysis + NOTE this is only needed for now because the model cannot read aerosol increments. + This method will be assumed to be deprecated before this is implemented operationally + """ + # only need the fv_tracer files + template = f'{to_fv3time(self.task_config.current_cycle)}.fv_tracer.res.tile{{tilenum}}.nc' + inc_template = os.path.join(self.task_config.DATA, 'anl', 'aeroinc.' 
+ template) + bkg_template = os.path.join(self.task_config.COM_ATMOS_RESTART_PREV, template) + # get list of increment vars + incvars_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'parm_gdas', 'aeroanl_inc_vars.yaml') + incvars = YAMLFile(path=incvars_list_path)['incvars'] + super().add_fv3_increments(inc_template, bkg_template, incvars) + + @logit(logger) + def get_bkg_dict(self, task_config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of model background files to copy + + This method constructs a dictionary of FV3 RESTART files (coupler, core, tracer) + that are needed for global aerosol DA and returns said dictionary for use by the FileHandler class. + + Parameters + ---------- + task_config: Dict + a dictionary containing all of the configuration needed for the task + + Returns + ---------- + bkg_dict: Dict + a dictionary containing the list of model background files to copy for FileHandler + """ + # NOTE for now this is FV3 RESTART files and just assumed to be fh006 + + # get FV3 RESTART files, this will be a lot simpler when using history files + rst_dir = task_config.COM_ATMOS_RESTART_PREV + run_dir = os.path.join(task_config['DATA'], 'bkg') + + # Start accumulating list of background files to copy + bkglist = [] + + # aerosol DA needs coupler + basename = f'{to_fv3time(task_config.current_cycle)}.coupler.res' + bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) + + # aerosol DA only needs core/tracer + for ftype in ['core', 'tracer']: + template = f'{to_fv3time(self.task_config.current_cycle)}.fv_{ftype}.res.tile{{tilenum}}.nc' + for itile in range(1, task_config.ntiles + 1): + basename = template.format(tilenum=itile) + bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) + + bkg_dict = { + 'mkdir': [run_dir], + 'copy': bkglist, + } + return bkg_dict + + @logit(logger) + def get_berror_dict(self, config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile 
a dictionary of background error files to copy + + This method will construct a dictionary of BUMP background error files + for global aerosol DA and return said dictionary for use by the FileHandler class. + This dictionary contains coupler and fv_tracer files + for correlation and standard deviation as well as NICAS localization. + + Parameters + ---------- + config: Dict + a dictionary containing all of the configuration needed + + Returns + ---------- + berror_dict: Dict + a dictionary containing the list of background error files to copy for FileHandler + """ + # aerosol static-B needs nicas, cor_rh, cor_rv and stddev files. + b_dir = config.BERROR_DATA_DIR + b_datestr = to_fv3time(config.BERROR_DATE) + berror_list = [] + + for ftype in ['cor_rh', 'cor_rv', 'stddev']: + coupler = f'{b_datestr}.{ftype}.coupler.res' + berror_list.append([ + os.path.join(b_dir, coupler), os.path.join(config.DATA, 'berror', coupler) + ]) + template = f'{b_datestr}.{ftype}.fv_tracer.res.tile{{tilenum}}.nc' + for itile in range(1, config.ntiles + 1): + tracer = template.format(tilenum=itile) + berror_list.append([ + os.path.join(b_dir, tracer), os.path.join(config.DATA, 'berror', tracer) + ]) + + nproc = config.ntiles * config.layout_x * config.layout_y + for nn in range(1, nproc + 1): + berror_list.append([ + os.path.join(b_dir, f'nicas_aero_nicas_local_{nproc:06}-{nn:06}.nc'), + os.path.join(config.DATA, 'berror', f'nicas_aero_nicas_local_{nproc:06}-{nn:06}.nc') + ]) + berror_dict = { + 'mkdir': [os.path.join(config.DATA, 'berror')], + 'copy': berror_list, + } + return berror_dict diff --git a/ush/python/pygfs/task/analysis.py b/ush/python/pygfs/task/analysis.py new file mode 100644 index 00000000000..7c24c9cbdb1 --- /dev/null +++ b/ush/python/pygfs/task/analysis.py @@ -0,0 +1,201 @@ +#!/usr/bin/env python3 + +import os +from logging import getLogger +from netCDF4 import Dataset +from typing import List, Dict, Any + +from pygw.yaml_file import YAMLFile, parse_j2yaml, parse_yamltmpl
+from pygw.file_utils import FileHandler +from pygw.template import Template, TemplateConstants +from pygw.logger import logit +from pygw.task import Task + +logger = getLogger(__name__.split('.')[-1]) + + +class Analysis(Task): + """Parent class for GDAS tasks + + The Analysis class is the parent class for all + Global Data Assimilation System (GDAS) tasks + directly related to performing an analysis + """ + + def __init__(self, config: Dict[str, Any]) -> None: + super().__init__(config) + self.config.ntiles = 6 + + def initialize(self) -> None: + super().initialize() + # all analyses need to stage observations + obs_dict = self.get_obs_dict() + FileHandler(obs_dict).sync() + + # some analyses need to stage bias corrections + bias_dict = self.get_bias_dict() + FileHandler(bias_dict).sync() + + # link jedi executable to run directory + self.link_jediexe() + + @logit(logger) + def get_obs_dict(self: Task) -> Dict[str, Any]: + """Compile a dictionary of observation files to copy + + This method uses the OBS_LIST configuration variable to generate a dictionary + from a list of YAML files that specify what observation files are to be + copied to the run directory from the observation input directory + + Parameters + ---------- + + Returns + ---------- + obs_dict: Dict + a dictionary containing the list of observation files to copy for FileHandler + """ + logger.debug(f"OBS_LIST: {self.task_config['OBS_LIST']}") + obs_list_config = parse_j2yaml(self.task_config["OBS_LIST"], self.task_config) + logger.debug(f"obs_list_config: {obs_list_config}") + # get observers from master dictionary + observers = obs_list_config['observers'] + copylist = [] + for ob in observers: + obfile = ob['obs space']['obsdatain']['engine']['obsfile'] + basename = os.path.basename(obfile) + copylist.append([os.path.join(self.task_config['COM_OBS'], basename), obfile]) + obs_dict = { + 'mkdir': [os.path.join(self.runtime_config['DATA'], 'obs')], + 'copy': copylist + } + return obs_dict + + 
@logit(logger) + def get_bias_dict(self: Task) -> Dict[str, Any]: + """Compile a dictionary of observation bias correction files to copy + + This method uses the OBS_LIST configuration variable to generate a dictionary + from a list of YAML files that specify what observation bias correction files + are to be copied to the run directory from the observation input directory + + Parameters + ---------- + + Returns + ---------- + bias_dict: Dict + a dictionary containing the list of observation bias files to copy for FileHandler + """ + logger.debug(f"OBS_LIST: {self.task_config['OBS_LIST']}") + obs_list_config = parse_j2yaml(self.task_config["OBS_LIST"], self.task_config) + logger.debug(f"obs_list_config: {obs_list_config}") + # get observers from master dictionary + observers = obs_list_config['observers'] + copylist = [] + for ob in observers: + if 'obs bias' in ob.keys(): + obfile = ob['obs bias']['input file'] + obdir = os.path.dirname(obfile) + basename = os.path.basename(obfile) + prefix = '.'.join(basename.split('.')[:-2]) + for file in ['satbias.nc4', 'satbias_cov.nc4', 'tlapse.txt']: + bfile = f"{prefix}.{file}" + copylist.append([os.path.join(self.task_config.COM_ATMOS_ANALYSIS_PREV, bfile), os.path.join(obdir, bfile)]) + + bias_dict = { + 'mkdir': [os.path.join(self.runtime_config.DATA, 'bc')], + 'copy': copylist + } + return bias_dict + + @logit(logger) + def add_fv3_increments(self, inc_file_tmpl: str, bkg_file_tmpl: str, incvars: List) -> None: + """Add cubed-sphere increments to cubed-sphere backgrounds + + Parameters + ---------- + inc_file_tmpl : str + template of the FV3 increment file of the form: 'filetype.tile{tilenum}.nc' + bkg_file_tmpl : str + template of the FV3 background file of the form: 'filetype.tile{tilenum}.nc' + incvars : List + List of increment variables to add to the background + """ + + for itile in range(1, self.config.ntiles + 1): + inc_path = inc_file_tmpl.format(tilenum=itile) + bkg_path = bkg_file_tmpl.format(tilenum=itile) + with 
Dataset(inc_path, mode='r') as incfile, Dataset(bkg_path, mode='a') as rstfile: + for vname in incvars: + increment = incfile.variables[vname][:] + bkg = rstfile.variables[vname][:] + anl = bkg + increment + rstfile.variables[vname][:] = anl[:] + try: + rstfile.variables[vname].delncattr('checksum') # remove the checksum so fv3 does not complain + except (AttributeError, RuntimeError): + pass # checksum is missing, move on + + @logit(logger) + def get_bkg_dict(self, task_config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of model background files to copy + + This method is a placeholder for now... will be possibly made generic at a later date + + Parameters + ---------- + task_config: Dict + a dictionary containing all of the configuration needed for the task + + Returns + ---------- + bkg_dict: Dict + a dictionary containing the list of model background files to copy for FileHandler + """ + bkg_dict = {'foo': 'bar'} + return bkg_dict + + @logit(logger) + def get_berror_dict(self, config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of background error files to copy + + This method is a placeholder for now... will be possibly made generic at a later date + + Parameters + ---------- + config: Dict + a dictionary containing all of the configuration needed + + Returns + ---------- + berror_dict: Dict + a dictionary containing the list of background error files to copy for FileHandler + """ + berror_dict = {'foo': 'bar'} + return berror_dict + + @logit(logger) + def link_jediexe(self: Task) -> None: + """Link the JEDI executable to the run directory + + This method links a JEDI executable to the run directory + + Parameters + ---------- + Task: GDAS task + + Returns + ---------- + None + """ + exe_src = self.task_config.JEDIEXE + + # TODO: linking is not permitted per EE2. Needs work in JEDI to be able to copy the exec.
+ logger.debug(f"Link executable {exe_src} to DATA/") + exe_dest = os.path.join(self.task_config.DATA, os.path.basename(exe_src)) + if os.path.exists(exe_dest): + rm_p(exe_dest) + os.symlink(exe_src, exe_dest) + + return diff --git a/ush/python/pygfs/task/atm_analysis.py b/ush/python/pygfs/task/atm_analysis.py new file mode 100644 index 00000000000..3ab0ae3240e --- /dev/null +++ b/ush/python/pygfs/task/atm_analysis.py @@ -0,0 +1,435 @@ +#!/usr/bin/env python3 + +import os +import glob +import gzip +import tarfile +from logging import getLogger +from typing import Dict, List, Any + +from pygw.attrdict import AttrDict +from pygw.file_utils import FileHandler +from pygw.timetools import add_to_datetime, to_fv3time, to_timedelta, to_YMDH +from pygw.fsutils import rm_p, chdir +from pygw.yaml_file import parse_yamltmpl, parse_j2yaml, save_as_yaml +from pygw.logger import logit +from pygw.executable import Executable +from pygw.exceptions import WorkflowException +from pygfs.task.analysis import Analysis + +logger = getLogger(__name__.split('.')[-1]) + + +class AtmAnalysis(Analysis): + """ + Class for global atm analysis tasks + """ + @logit(logger, name="AtmAnalysis") + def __init__(self, config): + super().__init__(config) + + _res = int(self.config.CASE[1:]) + _res_anl = int(self.config.CASE_ANL[1:]) + _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config.assim_freq}H") / 2) + _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmvar.yaml") + + # Create a local dictionary that is repeatedly used across this class + local_dict = AttrDict( + { + 'npx_ges': _res + 1, + 'npy_ges': _res + 1, + 'npz_ges': self.config.LEVS - 1, + 'npz': self.config.LEVS - 1, + 'npx_anl': _res_anl + 1, + 'npy_anl': _res_anl + 1, + 'npz_anl': self.config.LEVS - 1, + 'ATM_WINDOW_BEGIN': _window_begin, + 'ATM_WINDOW_LENGTH': f"PT{self.config.assim_freq}H", + 'OPREFIX': 
f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN + 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN + 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", + 'fv3jedi_yaml': _fv3jedi_yaml, + } + ) + + # task_config is everything that this task should need + self.task_config = AttrDict(**self.config, **self.runtime_config, **local_dict) + + @logit(logger) + def initialize(self: Analysis) -> None: + """Initialize a global atm analysis + + This method will initialize a global atm analysis using JEDI. + This includes: + - staging CRTM fix files + - staging FV3-JEDI fix files + - staging B error files + - staging model backgrounds + - generating a YAML file for the JEDI executable + - creating output directories + """ + super().initialize() + + # stage CRTM fix files + crtm_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'parm_gdas', 'atm_crtm_coeff.yaml') + logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") + crtm_fix_list = parse_yamltmpl(crtm_fix_list_path, self.task_config) + FileHandler(crtm_fix_list).sync() + + # stage fix files + jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'parm_gdas', 'atm_jedi_fix.yaml') + logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") + jedi_fix_list = parse_yamltmpl(jedi_fix_list_path, self.task_config) + FileHandler(jedi_fix_list).sync() + + # stage berror files + # copy static background error files, otherwise it will assume ID matrix + logger.debug(f"Stage files for STATICB_TYPE {self.task_config.STATICB_TYPE}") + FileHandler(self.get_berror_dict(self.task_config)).sync() + + # stage backgrounds + FileHandler(self.get_bkg_dict(AttrDict(self.task_config))).sync() + + # generate variational YAML file + logger.debug(f"Generate variational YAML file: {self.task_config.fv3jedi_yaml}") + varda_yaml = 
parse_j2yaml(self.task_config.ATMVARYAML, self.task_config) + save_as_yaml(varda_yaml, self.task_config.fv3jedi_yaml) + logger.info(f"Wrote variational YAML to: {self.task_config.fv3jedi_yaml}") + + # need output dir for diags and anl + logger.debug("Create empty output [anl, diags] directories to receive output from executable") + newdirs = [ + os.path.join(self.task_config.DATA, 'anl'), + os.path.join(self.task_config.DATA, 'diags'), + ] + FileHandler({'mkdir': newdirs}).sync() + + @logit(logger) + def execute(self: Analysis) -> None: + + chdir(self.task_config.DATA) + + exec_cmd = Executable(self.task_config.APRUN_ATMANL) + exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_var.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occurred during execution of {exec_cmd}") + + pass + + @logit(logger) + def finalize(self: Analysis) -> None: + """Finalize a global atm analysis + + This method will finalize a global atm analysis using JEDI.
+ This includes: + - tar output diag files and place in ROTDIR + - copy the generated YAML file from initialize to the ROTDIR + - copy the updated bias correction files to ROTDIR + - write UFS model readable atm increment file + + """ + # ---- tar up diags + # path of output tar statfile + atmstat = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.APREFIX}atmstat") + + # get list of diag files to put in tarball + diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc4')) + + logger.info(f"Compressing {len(diags)} diag files to {atmstat}.gz") + + # gzip the files first + logger.debug(f"Gzipping {len(diags)} diag files") + for diagfile in diags: + with open(diagfile, 'rb') as f_in, gzip.open(f"{diagfile}.gz", 'wb') as f_out: + f_out.writelines(f_in) + + # open tar file for writing + logger.debug(f"Creating tar file {atmstat} with {len(diags)} gzipped diag files") + with tarfile.open(atmstat, "w") as archive: + for diagfile in diags: + diaggzip = f"{diagfile}.gz" + archive.add(diaggzip, arcname=os.path.basename(diaggzip)) + + # copy full YAML from executable to ROTDIR + logger.info(f"Copying {self.task_config.fv3jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS}") + src = os.path.join(self.task_config.DATA, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmvar.yaml") + dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmvar.yaml") + logger.debug(f"Copying {src} to {dest}") + yaml_copy = { + 'mkdir': [self.task_config.COM_ATMOS_ANALYSIS], + 'copy': [[src, dest]] + } + FileHandler(yaml_copy).sync() + + # copy bias correction files to ROTDIR + logger.info("Copy bias correction files from DATA/ to COM/") + biasdir = os.path.join(self.task_config.DATA, 'bc') + biasls = os.listdir(biasdir) + biaslist = [] + for bfile in biasls: + src = os.path.join(biasdir, bfile) + dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, bfile) + biaslist.append([src, dest]) + + 
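The gzip-then-tar sequence in `finalize` (compress each diag file individually, then bundle the `.gz` files into an uncompressed tar "statfile") can be sketched standalone; the scratch directory and diag file names below are made up for illustration:

```python
import glob
import gzip
import os
import tarfile
import tempfile

# Scratch directory standing in for DATA/, with two fake diag files
tmpdir = tempfile.mkdtemp()
for name in ('diag_t.nc4', 'diag_q.nc4'):
    with open(os.path.join(tmpdir, name), 'wb') as f:
        f.write(b'pretend-netcdf-bytes')

# gzip each diag file first, exactly as finalize() does
diags = glob.glob(os.path.join(tmpdir, 'diag*nc4'))
for diagfile in diags:
    with open(diagfile, 'rb') as f_in, gzip.open(f"{diagfile}.gz", 'wb') as f_out:
        f_out.writelines(f_in)

# then collect the .gz files into a plain (uncompressed) tar statfile
atmstat = os.path.join(tmpdir, 'atmstat')
with tarfile.open(atmstat, 'w') as archive:
    for diagfile in diags:
        diaggzip = f"{diagfile}.gz"
        archive.add(diaggzip, arcname=os.path.basename(diaggzip))

with tarfile.open(atmstat) as archive:
    print(sorted(archive.getnames()))  # -> ['diag_q.nc4.gz', 'diag_t.nc4.gz']
```

The tar itself is opened with mode `"w"` (no compression), so members stay individually gzipped and can be extracted selectively.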
gprefix = f"{self.task_config.GPREFIX}" + gsuffix = f"{to_YMDH(self.task_config.previous_cycle)}" + ".txt" + aprefix = f"{self.task_config.APREFIX}" + asuffix = f"{to_YMDH(self.task_config.current_cycle)}" + ".txt" + + logger.info(f"Copying {gprefix}*{gsuffix} from DATA/ to COM/ as {aprefix}*{asuffix}") + obsdir = os.path.join(self.task_config.DATA, 'obs') + obsls = os.listdir(obsdir) + for ofile in obsls: + if ofile.endswith(".txt"): + src = os.path.join(obsdir, ofile) + tfile = ofile.replace(gprefix, aprefix) + tfile = tfile.replace(gsuffix, asuffix) + dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, tfile) + biaslist.append([src, dest]) + + bias_copy = { + 'mkdir': [self.task_config.COM_ATMOS_ANALYSIS], + 'copy': biaslist, + } + FileHandler(bias_copy).sync() + + # Create UFS model readable atm increment file from UFS-DA atm increment + logger.info("Create UFS model readable atm increment file from UFS-DA atm increment") + self.jedi2fv3inc() + + def clean(self): + super().clean() + + @logit(logger) + def get_bkg_dict(self, task_config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of model background files to copy + + This method constructs a dictionary of FV3 restart files (coupler, core, tracer) + that are needed for global atm DA and returns said dictionary for use by the FileHandler class. + + Parameters + ---------- + task_config: Dict + a dictionary containing all of the configuration needed for the task + + Returns + ---------- + bkg_dict: Dict + a dictionary containing the list of model background files to copy for FileHandler + """ + # NOTE for now this is FV3 restart files and just assumed to be fh006 + + # get FV3 restart files, this will be a lot simpler when using history files + rst_dir = os.path.join(task_config.COM_ATMOS_RESTART_PREV) # for now, option later? 
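The prefix/suffix rewrite above (renaming bias files carried over from the previous gdas cycle so they take the current cycle's prefix and date) is plain string substitution; the cycle times and file name below are invented for illustration:

```python
# Hypothetical previous-cycle (guess) and current-cycle (analysis) naming
gprefix, gsuffix = "gdas.t00z.", "2021032400" + ".txt"
aprefix, asuffix = "gdas.t06z.", "2021032406" + ".txt"

ofile = "gdas.t00z.abias_air.2021032400.txt"   # made-up bias file name
tfile = ofile.replace(gprefix, aprefix).replace(gsuffix, asuffix)
print(tfile)  # -> gdas.t06z.abias_air.2021032406.txt
```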
+ run_dir = os.path.join(task_config.DATA, 'bkg') + + # Start accumulating list of background files to copy + bkglist = [] + + # atm DA needs coupler + basename = f'{to_fv3time(task_config.current_cycle)}.coupler.res' + bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) + + # atm DA needs core, srf_wnd, tracer, phy_data, sfc_data + for ftype in ['core', 'srf_wnd', 'tracer']: + template = f'{to_fv3time(self.task_config.current_cycle)}.fv_{ftype}.res.tile{{tilenum}}.nc' + for itile in range(1, task_config.ntiles + 1): + basename = template.format(tilenum=itile) + bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) + + for ftype in ['phy_data', 'sfc_data']: + template = f'{to_fv3time(self.task_config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' + for itile in range(1, task_config.ntiles + 1): + basename = template.format(tilenum=itile) + bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) + + bkg_dict = { + 'mkdir': [run_dir], + 'copy': bkglist, + } + return bkg_dict + + @logit(logger) + def get_berror_dict(self, config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of background error files to copy + + This method will construct a dictionary of either bump or gsibec background + error files for global atm DA and return said dictionary for use by the + FileHandler class.
+ + Parameters + ---------- + config: Dict + a dictionary containing all of the configuration needed + + Returns + ---------- + berror_dict: Dict + a dictionary containing the list of atm background error files to copy for FileHandler + """ + SUPPORTED_BERROR_STATIC_MAP = {'identity': self._get_berror_dict_identity, + 'bump': self._get_berror_dict_bump, + 'gsibec': self._get_berror_dict_gsibec} + + try: + berror_dict = SUPPORTED_BERROR_STATIC_MAP[config.STATICB_TYPE](config) + except KeyError: + raise KeyError(f"{config.STATICB_TYPE} is not a supported background error type.\n" + + f"Currently supported background error types are:\n" + + f'{" | ".join(SUPPORTED_BERROR_STATIC_MAP.keys())}') + + return berror_dict + + @staticmethod + @logit(logger) + def _get_berror_dict_identity(config: Dict[str, Any]) -> Dict[str, List[str]]: + """Identity BE does not need any files for staging. + + This is a private method and should not be accessed directly. + + Parameters + ---------- + config: Dict + a dictionary containing all of the configuration needed + Returns + ---------- + berror_dict: Dict + Empty dictionary [identity BE needs no files to stage] + """ + logger.info(f"Identity background error does not use staged files. Return empty dictionary") + return {} + + @staticmethod + @logit(logger) + def _get_berror_dict_bump(config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of atm bump background error files to copy + + This method will construct a dictionary of atm bump background error + files for global atm DA and return said dictionary to the parent + + This is a private method and should not be accessed directly. + + Parameters + ---------- + config: Dict + a dictionary containing all of the configuration needed + + Returns + ---------- + berror_dict: Dict + a dictionary of atm bump background error files to copy for FileHandler + """ + # BUMP atm static-B needs nicas, cor_rh, cor_rv and stddev files.
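`get_berror_dict` selects its handler through a small dispatch map keyed by `STATICB_TYPE`. The pattern in isolation, with hypothetical handlers standing in for the real static methods:

```python
from typing import Any, Callable, Dict, List

def _berror_identity(config: Dict[str, Any]) -> Dict[str, List]:
    # identity B stages nothing
    return {}

def _berror_gsibec(config: Dict[str, Any]) -> Dict[str, List]:
    # pretend gsibec stages a namelist into a berror/ directory
    return {'mkdir': ['berror'],
            'copy': [['gfs_gsi_global.nml', 'berror/gfs_gsi_global.nml']]}

SUPPORTED_BERROR_STATIC_MAP: Dict[str, Callable] = {
    'identity': _berror_identity,
    'gsibec': _berror_gsibec,
}

def get_berror_dict(config: Dict[str, Any]) -> Dict[str, List]:
    try:
        return SUPPORTED_BERROR_STATIC_MAP[config['STATICB_TYPE']](config)
    except KeyError:
        raise KeyError(f"{config.get('STATICB_TYPE')} is not a supported background error type.\n"
                       "Currently supported background error types are:\n"
                       f"{' | '.join(SUPPORTED_BERROR_STATIC_MAP)}")

print(get_berror_dict({'STATICB_TYPE': 'identity'}))  # -> {}
```

Adding a new background error type is then a one-line change to the map, with no edits to the lookup logic.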
+ b_dir = config.BERROR_DATA_DIR + b_datestr = to_fv3time(config.BERROR_DATE) + berror_list = [] + for ftype in ['cor_rh', 'cor_rv', 'stddev']: + coupler = f'{b_datestr}.{ftype}.coupler.res' + berror_list.append([ + os.path.join(b_dir, coupler), os.path.join(config.DATA, 'berror', coupler) + ]) + + template = f'{b_datestr}.{ftype}.fv_tracer.res.tile{{tilenum}}.nc' + for itile in range(1, config.ntiles + 1): + tracer = template.format(tilenum=itile) + berror_list.append([ + os.path.join(b_dir, tracer), os.path.join(config.DATA, 'berror', tracer) + ]) + + nproc = config.ntiles * config.layout_x * config.layout_y + for nn in range(1, nproc + 1): + berror_list.append([ + os.path.join(b_dir, f'nicas_aero_nicas_local_{nproc:06}-{nn:06}.nc'), + os.path.join(config.DATA, 'berror', f'nicas_aero_nicas_local_{nproc:06}-{nn:06}.nc') + ]) + + # create dictionary of background error files to stage + berror_dict = { + 'mkdir': [os.path.join(config.DATA, 'berror')], + 'copy': berror_list, + } + return berror_dict + + @staticmethod + @logit(logger) + def _get_berror_dict_gsibec(config: Dict[str, Any]) -> Dict[str, List[str]]: + """Compile a dictionary of atm gsibec background error files to copy + + This method will construct a dictionary of atm gsibec background error + files for global atm DA and return said dictionary to the parent + + This is a private method and should not be accessed directly. + + Parameters + ---------- + config: Dict + a dictionary containing all of the configuration needed + + Returns + ---------- + berror_dict: Dict + a dictionary of atm gsibec background error files to copy for FileHandler + """ + # GSI atm static-B needs namelist and coefficient files.
+ b_dir = os.path.join(config.HOMEgfs, 'fix', 'gdas', 'gsibec', config.CASE_ANL) + berror_list = [] + for ftype in ['gfs_gsi_global.nml', 'gsi-coeffs-gfs-global.nc4']: + berror_list.append([ + os.path.join(b_dir, ftype), + os.path.join(config.DATA, 'berror', ftype) + ]) + + # create dictionary of background error files to stage + berror_dict = { + 'mkdir': [os.path.join(config.DATA, 'berror')], + 'copy': berror_list, + } + return berror_dict + + @logit(logger) + def jedi2fv3inc(self: Analysis) -> None: + """Generate UFS model readable analysis increment + + This method writes a UFS DA atm increment in UFS model readable format. + This includes: + - write UFS-DA atm increments using variable names expected by UFS model + - compute and write delp increment + - compute and write hydrostatic delz increment + + Please note that some of these steps are temporary and will be modified + once the model is able to directly read atm increments. + + """ + # Select the atm guess file based on the analysis and background resolutions + # Fields from the atm guess are used to compute the delp and delz increments + case_anl = int(self.task_config.CASE_ANL[1:]) + case = int(self.task_config.CASE[1:]) + + file = f"{self.task_config.GPREFIX}" + "atmf006" + f"{'' if case_anl == case else '.ensres'}" + ".nc" + atmges_fv3 = os.path.join(self.task_config.COM_ATMOS_HISTORY_PREV, file) + + # Set the path/name to the input UFS-DA atm increment file (atminc_jedi) + # and the output UFS model atm increment file (atminc_fv3) + cdate = to_fv3time(self.task_config.current_cycle) + cdate_inc = cdate.replace('.', '_') + atminc_jedi = os.path.join(self.task_config.DATA, 'anl', f'atminc.{cdate_inc}z.nc4') + atminc_fv3 = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atminc.nc") + + # Reference the python script which does the actual work + incpy = os.path.join(self.task_config.HOMEgfs, 'ush/jediinc2fv3.py') + + # Execute incpy to create the
UFS model atm increment file + cmd = Executable(incpy) + cmd.add_default_arg(atmges_fv3) + cmd.add_default_arg(atminc_jedi) + cmd.add_default_arg(atminc_fv3) + logger.debug(f"Executing {cmd}") + cmd(output='stdout', error='stderr') diff --git a/ush/python/pygfs/task/atmens_analysis.py b/ush/python/pygfs/task/atmens_analysis.py new file mode 100644 index 00000000000..28b121644a9 --- /dev/null +++ b/ush/python/pygfs/task/atmens_analysis.py @@ -0,0 +1,351 @@ +#!/usr/bin/env python3 + +import os +import glob +import gzip +import tarfile +from logging import getLogger +from typing import Dict, List, Any + +from pygw.attrdict import AttrDict +from pygw.file_utils import FileHandler +from pygw.timetools import add_to_datetime, to_fv3time, to_timedelta, to_YMDH, to_YMD +from pygw.fsutils import rm_p, chdir +from pygw.yaml_file import parse_yamltmpl, parse_j2yaml, save_as_yaml +from pygw.logger import logit +from pygw.executable import Executable +from pygw.exceptions import WorkflowException +from pygw.template import Template, TemplateConstants +from pygfs.task.analysis import Analysis + +logger = getLogger(__name__.split('.')[-1]) + + +class AtmEnsAnalysis(Analysis): + """ + Class for global atmens analysis tasks + """ + @logit(logger, name="AtmEnsAnalysis") + def __init__(self, config): + super().__init__(config) + + _res = int(self.config.CASE_ENKF[1:]) + _res_anl = int(self.config.CASE_ANL[1:]) + _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config.assim_freq}H") / 2) + _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmens.yaml") + + # Create a local dictionary that is repeatedly used across this class + local_dict = AttrDict( + { + 'npx_ges': _res + 1, + 'npy_ges': _res + 1, + 'npz_ges': self.config.LEVS - 1, + 'npz': self.config.LEVS - 1, + 'npx_anl': _res_anl + 1, + 'npy_anl': _res_anl + 1, + 'npz_anl': self.config.LEVS - 1, + 'ATM_WINDOW_BEGIN': 
_window_begin, + 'ATM_WINDOW_LENGTH': f"PT{self.config.assim_freq}H", + 'OPREFIX': f"{self.config.EUPD_CYC}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN + 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN + 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", + 'fv3jedi_yaml': _fv3jedi_yaml, + } + ) + + # task_config is everything that this task should need + self.task_config = AttrDict(**self.config, **self.runtime_config, **local_dict) + + @logit(logger) + def initialize(self: Analysis) -> None: + """Initialize a global atmens analysis + + This method will initialize a global atmens analysis using JEDI. + This includes: + - staging CRTM fix files + - staging FV3-JEDI fix files + - staging model backgrounds + - generating a YAML file for the JEDI executable + - creating output directories + + Parameters + ---------- + Analysis: parent class for GDAS task + + Returns + ---------- + None + """ + super().initialize() + + # Make member directories in DATA for background and in DATA and ROTDIR for analysis files + # create template dictionary for output member analysis directories + template_inc = self.task_config.COM_ATMOS_ANALYSIS_TMPL + tmpl_inc_dict = { + 'ROTDIR': self.task_config.ROTDIR, + 'RUN': self.task_config.RUN, + 'YMD': to_YMD(self.task_config.current_cycle), + 'HH': self.task_config.current_cycle.strftime('%H') + } + dirlist = [] + for imem in range(1, self.task_config.NMEM_ENKF + 1): + dirlist.append(os.path.join(self.task_config.DATA, 'bkg', f'mem{imem:03d}')) + dirlist.append(os.path.join(self.task_config.DATA, 'anl', f'mem{imem:03d}')) + + # create output directory path for member analysis + tmpl_inc_dict['MEMDIR'] = f"mem{imem:03d}" + incdir = Template.substitute_structure(template_inc, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_inc_dict.get) + dirlist.append(incdir) + + FileHandler({'mkdir': dirlist}).sync() + + # stage CRTM fix files + 
crtm_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'parm_gdas', 'atm_crtm_coeff.yaml') + logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") + crtm_fix_list = parse_yamltmpl(crtm_fix_list_path, self.task_config) + FileHandler(crtm_fix_list).sync() + + # stage fix files + jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'parm_gdas', 'atm_jedi_fix.yaml') + logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") + jedi_fix_list = parse_yamltmpl(jedi_fix_list_path, self.task_config) + FileHandler(jedi_fix_list).sync() + + # stage backgrounds + FileHandler(self.get_bkg_dict()).sync() + + # generate ensemble da YAML file + logger.debug(f"Generate ensemble da YAML file: {self.task_config.fv3jedi_yaml}") + ensda_yaml = parse_j2yaml(self.task_config.ATMENSYAML, self.task_config) + save_as_yaml(ensda_yaml, self.task_config.fv3jedi_yaml) + logger.info(f"Wrote ensemble da YAML to: {self.task_config.fv3jedi_yaml}") + + # need output dir for diags and anl + logger.debug("Create empty output [anl, diags] directories to receive output from executable") + newdirs = [ + os.path.join(self.task_config.DATA, 'anl'), + os.path.join(self.task_config.DATA, 'diags'), + ] + FileHandler({'mkdir': newdirs}).sync() + + @logit(logger) + def execute(self: Analysis) -> None: + """Execute a global atmens analysis + + This method will execute a global atmens analysis using JEDI. 
+ This includes: + - changing to the run directory + - running the global atmens analysis executable + + Parameters + ---------- + Analysis: parent class for GDAS task + + Returns + ---------- + None + """ + chdir(self.task_config.DATA) + + exec_cmd = Executable(self.task_config.APRUN_ATMENSANL) + exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_letkf.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occurred during execution of {exec_cmd}") + + pass + + @logit(logger) + def finalize(self: Analysis) -> None: + """Finalize a global atmens analysis + + This method will finalize a global atmens analysis using JEDI. + This includes: + - tar output diag files and place in ROTDIR + - copy the generated YAML file from initialize to the ROTDIR + - write UFS model readable atm increment file + + Parameters + ---------- + Analysis: parent class for GDAS task + + Returns + ---------- + None + """ + # ---- tar up diags + # path of output tar statfile + atmensstat = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.APREFIX}atmensstat") + + # get list of diag files to put in tarball + diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc4')) + + logger.info(f"Compressing {len(diags)} diag files to {atmensstat}.gz") + + # gzip the files first + logger.debug(f"Gzipping {len(diags)} diag files") + for diagfile in diags: + with open(diagfile, 'rb') as f_in, gzip.open(f"{diagfile}.gz", 'wb') as f_out: + f_out.writelines(f_in) + + # open tar file for writing + logger.debug(f"Creating tar file {atmensstat} with {len(diags)} gzipped diag files") + with tarfile.open(atmensstat, "w") as archive: + for diagfile in diags: + diaggzip = f"{diagfile}.gz" + archive.add(diaggzip, arcname=os.path.basename(diaggzip)) + 
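`execute` separates two failure modes: an `OSError` when the binary cannot be launched at all, and anything else re-raised as a `WorkflowException` once it is running. The same shape using only the standard library (pygw's `Executable` is not assumed here; `WorkflowError` is a stand-in for `WorkflowException`):

```python
import subprocess

class WorkflowError(Exception):
    """Stand-in for pygw's WorkflowException."""

def run_executable(cmd: list) -> None:
    try:
        subprocess.run(cmd, check=True)
    except OSError as err:
        # launch failure: missing binary, bad permissions, ...
        raise OSError(f"Failed to execute {cmd}") from err
    except subprocess.CalledProcessError as err:
        # the program started but exited non-zero
        raise WorkflowError(f"An error occurred during execution of {cmd}") from err

run_executable(['true'])   # a POSIX no-op that exits 0
```

Keeping the two exception types distinct lets the calling job tell a misconfigured path apart from a genuine solver failure.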
+ # copy full YAML from executable to ROTDIR + logger.info(f"Copying {self.task_config.fv3jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS_ENS}") + src = os.path.join(self.task_config.DATA, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmens.yaml") + dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmens.yaml") + logger.debug(f"Copying {src} to {dest}") + yaml_copy = { + 'mkdir': [self.task_config.COM_ATMOS_ANALYSIS_ENS], + 'copy': [[src, dest]] + } + FileHandler(yaml_copy).sync() + + # Create UFS model readable atm increment file from UFS-DA atm increment + logger.info("Create UFS model readable atm increment file from UFS-DA atm increment") + self.jedi2fv3inc() + + def clean(self): + super().clean() + + @logit(logger) + def jedi2fv3inc(self: Analysis) -> None: + """Generate UFS model readable analysis increment + + This method writes a UFS DA atm increment in UFS model readable format. + This includes: + - write UFS-DA atm increments using variable names expected by UFS model + - compute and write delp increment + - compute and write hydrostatic delz increment + + Please note that some of these steps are temporary and will be modified + once the model is able to directly read atm increments.
+ + Parameters + ---------- + Analysis: parent class for GDAS task + + Returns + ---------- + None + """ + # Select the atm guess file based on the analysis and background resolutions + # Fields from the atm guess are used to compute the delp and delz increments + cdate = to_fv3time(self.task_config.current_cycle) + cdate_inc = cdate.replace('.', '_') + + # Reference the python script which does the actual work + incpy = os.path.join(self.task_config.HOMEgfs, 'ush/jediinc2fv3.py') + + # create template dictionaries + template_inc = self.task_config.COM_ATMOS_ANALYSIS_TMPL + tmpl_inc_dict = { + 'ROTDIR': self.task_config.ROTDIR, + 'RUN': self.task_config.RUN, + 'YMD': to_YMD(self.task_config.current_cycle), + 'HH': self.task_config.current_cycle.strftime('%H') + } + + template_ges = self.task_config.COM_ATMOS_HISTORY_TMPL + tmpl_ges_dict = { + 'ROTDIR': self.task_config.ROTDIR, + 'RUN': self.task_config.RUN, + 'YMD': to_YMD(self.task_config.previous_cycle), + 'HH': self.task_config.previous_cycle.strftime('%H') + } + + # loop over ensemble members + for imem in range(1, self.task_config.NMEM_ENKF + 1): + memchar = f"mem{imem:03d}" + + # create output path for member analysis increment + tmpl_inc_dict['MEMDIR'] = memchar + incdir = Template.substitute_structure(template_inc, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_inc_dict.get) + + # rewrite UFS-DA atmens increments + tmpl_ges_dict['MEMDIR'] = memchar + gesdir = Template.substitute_structure(template_ges, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_ges_dict.get) + atmges_fv3 = os.path.join(gesdir, f"{self.task_config.CDUMP}.t{self.task_config.previous_cycle.hour:02d}z.atmf006.nc") + atminc_jedi = os.path.join(self.task_config.DATA, 'anl', memchar, f'atminc.{cdate_inc}z.nc4') + atminc_fv3 = os.path.join(incdir, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atminc.nc") + + # Execute incpy to create the UFS model atm increment file + # TODO: use MPMD or parallelize with mpi4py + # See 
https://github.com/NOAA-EMC/global-workflow/pull/1373#discussion_r1173060656 + cmd = Executable(incpy) + cmd.add_default_arg(atmges_fv3) + cmd.add_default_arg(atminc_jedi) + cmd.add_default_arg(atminc_fv3) + logger.debug(f"Executing {cmd}") + cmd(output='stdout', error='stderr') + + @logit(logger) + def get_bkg_dict(self: Analysis) -> Dict[str, List[str]]: + """Compile a dictionary of model background files to copy + + This method constructs a dictionary of ensemble FV3 restart files (coupler, core, tracer) + that are needed for global atmens DA and returns said dictionary for use by the FileHandler class. + + Parameters + ---------- + None + + Returns + ---------- + bkg_dict: Dict + a dictionary containing the list of model background files to copy for FileHandler + """ + # NOTE for now this is FV3 restart files and just assumed to be fh006 + # loop over ensemble members + rstlist = [] + bkglist = [] + + # get FV3 restart files, this will be a lot simpler when using history files + template_res = self.task_config.COM_ATMOS_RESTART_TMPL + tmpl_res_dict = { + 'ROTDIR': self.task_config.ROTDIR, + 'RUN': self.task_config.RUN, + 'YMD': to_YMD(self.task_config.previous_cycle), + 'HH': self.task_config.previous_cycle.strftime('%H'), + 'MEMDIR': None + } + + for imem in range(1, self.task_config.NMEM_ENKF + 1): + memchar = f"mem{imem:03d}" + + # get FV3 restart files, this will be a lot simpler when using history files + tmpl_res_dict['MEMDIR'] = memchar + rst_dir = Template.substitute_structure(template_res, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_res_dict.get) + rstlist.append(rst_dir) + + run_dir = os.path.join(self.task_config.DATA, 'bkg', memchar) + + # atmens DA needs coupler + basename = f'{to_fv3time(self.task_config.current_cycle)}.coupler.res' + bkglist.append([os.path.join(rst_dir, basename), os.path.join(self.task_config.DATA, 'bkg', memchar, basename)]) + + # atmens DA needs core, srf_wnd, tracer, phy_data, sfc_data + for ftype in ['fv_core.res', 
'fv_srf_wnd.res', 'fv_tracer.res', 'phy_data', 'sfc_data']: + template = f'{to_fv3time(self.task_config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' + for itile in range(1, self.task_config.ntiles + 1): + basename = template.format(tilenum=itile) + bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) + + bkg_dict = { + 'mkdir': rstlist, + 'copy': bkglist, + } + + return bkg_dict diff --git a/ush/python/pygfs/task/gfs_forecast.py b/ush/python/pygfs/task/gfs_forecast.py new file mode 100644 index 00000000000..3527c623e0d --- /dev/null +++ b/ush/python/pygfs/task/gfs_forecast.py @@ -0,0 +1,35 @@ +import os +import logging +from typing import Dict, Any + +from pygw.logger import logit +from pygw.task import Task +from pygfs.ufswm.gfs import GFS + +logger = logging.getLogger(__name__.split('.')[-1]) + + +class GFSForecast(Task): + """ + UFS-weather-model forecast task for the GFS + """ + + @logit(logger, name="GFSForecast") + def __init__(self, config: Dict[str, Any], *args, **kwargs): + """ + Parameters + ---------- + config : Dict + dictionary object containing configuration from environment + + *args : tuple + Additional arguments to `Task` + + **kwargs : dict, optional + Extra keyword arguments to `Task` + """ + + super().__init__(config, *args, **kwargs) + + # Create and initialize the GFS variant of the UFS + self.gfs = GFS(config) diff --git a/ush/python/pygfs/ufswm/__init__.py b/ush/python/pygfs/ufswm/__init__.py new file mode 100644 index 00000000000..e69de29bb2d diff --git a/ush/python/pygfs/ufswm/gfs.py b/ush/python/pygfs/ufswm/gfs.py new file mode 100644 index 00000000000..f86164d706f --- /dev/null +++ b/ush/python/pygfs/ufswm/gfs.py @@ -0,0 +1,20 @@ +import copy +import logging + +from pygw.logger import logit +from pygfs.ufswm.ufs import UFS + +logger = logging.getLogger(__name__.split('.')[-1]) + + +class GFS(UFS): + + @logit(logger, name="GFS") + def __init__(self, config): + + super().__init__("GFS", config) + + # Start 
putting fixed properties of the GFS + self.ntiles = 6 + + # Determine coupled/uncoupled from config and define as appropriate diff --git a/ush/python/pygfs/ufswm/ufs.py b/ush/python/pygfs/ufswm/ufs.py new file mode 100644 index 00000000000..a9118801b92 --- /dev/null +++ b/ush/python/pygfs/ufswm/ufs.py @@ -0,0 +1,58 @@ +import re +import copy +import logging +from typing import Dict, Any + +from pygw.template import Template, TemplateConstants +from pygw.logger import logit + +logger = logging.getLogger(__name__.split('.')[-1]) + +UFS_VARIANTS = ['GFS'] + + +class UFS: + + @logit(logger, name="UFS") + def __init__(self, model_name: str, config: Dict[str, Any]): + """Initialize the UFS-weather-model generic class and check if the model_name is a valid variant + + Parameters + ---------- + model_name: str + UFS variant + config : Dict + Incoming configuration dictionary + """ + + # First check if this is a valid variant + if model_name not in UFS_VARIANTS: + logger.warn(f"{model_name} is not a valid UFS variant") + raise NotImplementedError(f"{model_name} is not yet implemented") + + # Make a deep copy of incoming config for caching purposes. _config should not be updated + self._config = copy.deepcopy(config) + + @logit(logger) + def parse_ufs_templates(input_template, output_file, ctx: Dict) -> None: + """ + This method parses UFS-weather-model templates of the pattern @[VARIABLE] + drawing the value from ctx['VARIABLE'] + """ + + with open(input_template, 'r') as fhi: + file_in = fhi.read() + file_out = Template.substitute_structure( + file_in, TemplateConstants.AT_SQUARE_BRACES, ctx.get) + + # If there are unrendered bits, find out what they are + pattern = r"@\[.*?\]+" + matches = re.findall(pattern, file_out) + if matches: + logger.warn(f"{input_template} was rendered incompletely") + logger.warn(f"The following variables were not substituted") + print(matches) # TODO: improve the formatting of this message + # TODO: Should we abort here? 
or continue to write output_file? + + with open(output_file, 'w') as fho: + fho.write(file_out) diff --git a/ush/python/pygw/.gitignore b/ush/python/pygw/.gitignore new file mode 100644 index 00000000000..13a1a9f8512 --- /dev/null +++ b/ush/python/pygw/.gitignore @@ -0,0 +1,139 @@ +# Byte-compiled / optimized / DLL files +__pycache__/ +*.py[cod] +*$py.class + +# C extensions +*.so + +# Distribution / packaging +.Python +build/ +develop-eggs/ +dist/ +downloads/ +eggs/ +.eggs/ +lib/ +lib64/ +parts/ +sdist/ +var/ +wheels/ +pip-wheel-metadata/ +share/python-wheels/ +*.egg-info/ +.installed.cfg +*.egg +MANIFEST + +# PyInstaller +# Usually these files are written by a python script from a template +# before PyInstaller builds the exe, so as to inject date/other infos into it. +*.manifest +*.spec + +# Installer logs +pip-log.txt +pip-delete-this-directory.txt + +# Unit test / coverage reports +htmlcov/ +.tox/ +.nox/ +.coverage +.coverage.* +.cache +nosetests.xml +coverage.xml +*.cover +*.py,cover +.hypothesis/ +.pytest_cache/ + +# Translations +*.mo +*.pot + +# Django stuff: +*.log +local_settings.py +db.sqlite3 +db.sqlite3-journal + +# Flask stuff: +instance/ +.webassets-cache + +# Scrapy stuff: +.scrapy + +# Sphinx documentation +docs/_build/ + +# PyBuilder +target/ + +# Jupyter Notebook +.ipynb_checkpoints + +# IPython +profile_default/ +ipython_config.py + +# pyenv +.python-version + +# pipenv +# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. +# However, in case of collaboration, if having platform-specific dependencies or dependencies +# having no cross-platform support, pipenv may install dependencies that don't work, or not +# install all needed dependencies. +#Pipfile.lock + +# PEP 582; used by e.g. 
github.com/David-OConnor/pyflow +__pypackages__/ + +# Celery stuff +celerybeat-schedule +celerybeat.pid + +# SageMath parsed files +*.sage.py + +# Environments +.env +.venv +env/ +venv/ +ENV/ +env.bak/ +venv.bak/ + +# Spyder project settings +.spyderproject +.spyproject + +# Rope project settings +.ropeproject + +# mkdocs documentation +/site + +# mypy +.mypy_cache/ +.dmypy.json +dmypy.json + +# Pyre type checker +.pyre/ + +# Sphinx documentation +docs/_build/ + +# Editor backup files (Emacs, vim) +*~ +*.sw[a-p] + +# Pycharm IDE files +.idea/ diff --git a/ush/python/pygw/README.md b/ush/python/pygw/README.md new file mode 100644 index 00000000000..13db34471c7 --- /dev/null +++ b/ush/python/pygw/README.md @@ -0,0 +1,36 @@ +# global workflow specific tools + +Python tools specifically for global applications + +## Installation +Simple installation instructions +```sh +$> git clone https://github.com/noaa-emc/global-workflow +$> cd global-workflow/ush/python/pygw +$> pip install . +``` + +It is not required to install this package. Instead, +```sh +$> cd global-workflow/ush/python/pygw +$> export PYTHONPATH=$PWD/src/pygw +``` +would put this package in the `PYTHONPATH` + +### Note: +These instructions will be updated and the tools are under development. 
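For a one-off script, the `PYTHONPATH` export described above can also be done at runtime; `repo_root` below is a hypothetical clone location, and the path points at the `src` directory that contains the `pygw` package (per `package_dir` in setup.cfg):

```python
import sys
from pathlib import Path

# Hypothetical location of a global-workflow clone; adjust as needed
repo_root = Path.home() / 'global-workflow'
pygw_src = repo_root / 'ush' / 'python' / 'pygw' / 'src'

# Prepend so this checkout shadows any pip-installed pygw
if str(pygw_src) not in sys.path:
    sys.path.insert(0, str(pygw_src))

# `from pygw.attrdict import AttrDict` should now resolve,
# provided the clone actually exists at repo_root
```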
+ +### Running python tests: +Simple instructions to enable executing pytests manually +```sh +# Create a python virtual environment and step into it +$> cd global-workflow/ush/python/pygw +$> python3 -m venv venv +$> source venv/bin/activate + +# Install pygw with the developer requirements +(venv) $> pip install .[dev] + +# Run pytests +(venv) $> pytest -v +``` diff --git a/ush/python/pygw/setup.cfg b/ush/python/pygw/setup.cfg new file mode 100644 index 00000000000..1d45df0d761 --- /dev/null +++ b/ush/python/pygw/setup.cfg @@ -0,0 +1,62 @@ +[metadata] +name = pygw +version = 0.0.1 +description = Global applications specific workflow related tools +long_description = file: README.md +long_description_content_type = text/markdown +author = "NOAA/NWS/NCEP/EMC" +#author_email = first.last@domain.tld +keywords = NOAA, NWS, NCEP, EMC, GFS, GEFS +home_page = https://github.com/noaa-emc/global-workflow +license = GNU Lesser General Public License +classifiers = + Development Status :: 4 - Beta + Intended Audience :: Developers + Intended Audience :: Science/Research + License :: OSI Approved :: GNU Lesser General Public License + Natural Language :: English + Operating System :: OS Independent + Programming Language :: Python + Programming Language :: Python :: 3 + Programming Language :: Python :: 3.6 + Programming Language :: Python :: 3.7 + Programming Language :: Python :: 3.8 + Programming Language :: Python :: 3.9 + Topic :: Software Development :: Libraries :: Python Modules + Operating System :: OS Independent + Typing :: Typed +project_urls = + Bug Tracker = https://github.com/noaa-emc/global-workflow/issues + CI = https://github.com/noaa-emc/global-workflow/actions + +[options] +zip_safe = False +include_package_data = True +package_dir = + =src +packages = find_namespace: +python_requires = >= 3.6 +setup_requires = + setuptools +install_requires = + numpy==1.21.6 + PyYAML==6.0 + Jinja2==3.1.2 +tests_require = + pytest + +[options.packages.find] +where=src + 
+[options.package_data]
+* = *.txt, *.md
+
+[options.extras_require]
+dev = pytest>=7; pytest-cov>=3
+
+[green]
+file-pattern = test_*.py
+verbose = 2
+no-skip-report = true
+quiet-stdout = true
+run-coverage = true
diff --git a/ush/python/pygw/setup.py b/ush/python/pygw/setup.py
new file mode 100644
index 00000000000..e748ce0b714
--- /dev/null
+++ b/ush/python/pygw/setup.py
@@ -0,0 +1,4 @@
+''' Standard file for building the package with setuptools. '''
+
+import setuptools
+setuptools.setup()
diff --git a/ush/python/pygw/src/pygw/__init__.py b/ush/python/pygw/src/pygw/__init__.py
new file mode 100644
index 00000000000..d44158004c6
--- /dev/null
+++ b/ush/python/pygw/src/pygw/__init__.py
@@ -0,0 +1,8 @@
+"""
+Commonly used toolset for the global applications and beyond.
+"""
+__docformat__ = "restructuredtext"
+
+import os
+
+pygw_directory = os.path.dirname(__file__)
diff --git a/ush/python/pygw/src/pygw/attrdict.py b/ush/python/pygw/src/pygw/attrdict.py
new file mode 100644
index 00000000000..f2add20a196
--- /dev/null
+++ b/ush/python/pygw/src/pygw/attrdict.py
@@ -0,0 +1,171 @@
+# attrdict is a Python module that gives you dictionaries whose values are both
+# gettable and settable using attributes, in addition to standard item-syntax.
+# https://github.com/mewwts/addict +# addict/addict.py -> attrdict.py +# hash: 7e8d23d +# License: MIT +# class Dict -> class AttrDict to prevent name collisions w/ typing.Dict + +import copy + +__all__ = ['AttrDict'] + + +class AttrDict(dict): + + def __init__(__self, *args, **kwargs): + object.__setattr__(__self, '__parent', kwargs.pop('__parent', None)) + object.__setattr__(__self, '__key', kwargs.pop('__key', None)) + object.__setattr__(__self, '__frozen', False) + for arg in args: + if not arg: + continue + elif isinstance(arg, dict): + for key, val in arg.items(): + __self[key] = __self._hook(val) + elif isinstance(arg, tuple) and (not isinstance(arg[0], tuple)): + __self[arg[0]] = __self._hook(arg[1]) + else: + for key, val in iter(arg): + __self[key] = __self._hook(val) + + for key, val in kwargs.items(): + __self[key] = __self._hook(val) + + def __setattr__(self, name, value): + if hasattr(self.__class__, name): + raise AttributeError("'AttrDict' object attribute " + "'{0}' is read-only".format(name)) + else: + self[name] = value + + def __setitem__(self, name, value): + isFrozen = (hasattr(self, '__frozen') and + object.__getattribute__(self, '__frozen')) + if isFrozen and name not in super(AttrDict, self).keys(): + raise KeyError(name) + if isinstance(value, dict): + value = AttrDict(value) + super(AttrDict, self).__setitem__(name, value) + try: + p = object.__getattribute__(self, '__parent') + key = object.__getattribute__(self, '__key') + except AttributeError: + p = None + key = None + if p is not None: + p[key] = self + object.__delattr__(self, '__parent') + object.__delattr__(self, '__key') + + def __add__(self, other): + if not self.keys(): + return other + else: + self_type = type(self).__name__ + other_type = type(other).__name__ + msg = "unsupported operand type(s) for +: '{}' and '{}'" + raise TypeError(msg.format(self_type, other_type)) + + @classmethod + def _hook(cls, item): + if isinstance(item, dict): + return cls(item) + elif 
isinstance(item, (list, tuple)): + return type(item)(cls._hook(elem) for elem in item) + return item + + def __getattr__(self, item): + return self.__getitem__(item) + + def __missing__(self, name): + if object.__getattribute__(self, '__frozen'): + raise KeyError(name) + return self.__class__(__parent=self, __key=name) + + def __delattr__(self, name): + del self[name] + + def to_dict(self): + base = {} + for key, value in self.items(): + if isinstance(value, type(self)): + base[key] = value.to_dict() + elif isinstance(value, (list, tuple)): + base[key] = type(value)( + item.to_dict() if isinstance(item, type(self)) else + item for item in value) + else: + base[key] = value + return base + + def copy(self): + return copy.copy(self) + + def deepcopy(self): + return copy.deepcopy(self) + + def __deepcopy__(self, memo): + other = self.__class__() + memo[id(self)] = other + for key, value in self.items(): + other[copy.deepcopy(key, memo)] = copy.deepcopy(value, memo) + return other + + def update(self, *args, **kwargs): + other = {} + if args: + if len(args) > 1: + raise TypeError() + other.update(args[0]) + other.update(kwargs) + for k, v in other.items(): + if ((k not in self) or + (not isinstance(self[k], dict)) or + (not isinstance(v, dict))): + self[k] = v + else: + self[k].update(v) + + def __getnewargs__(self): + return tuple(self.items()) + + def __getstate__(self): + return self + + def __setstate__(self, state): + self.update(state) + + def __or__(self, other): + if not isinstance(other, (AttrDict, dict)): + return NotImplemented + new = AttrDict(self) + new.update(other) + return new + + def __ror__(self, other): + if not isinstance(other, (AttrDict, dict)): + return NotImplemented + new = AttrDict(other) + new.update(self) + return new + + def __ior__(self, other): + self.update(other) + return self + + def setdefault(self, key, default=None): + if key in self: + return self[key] + else: + self[key] = default + return default + + def freeze(self, 
shouldFreeze=True): + object.__setattr__(self, '__frozen', shouldFreeze) + for key, val in self.items(): + if isinstance(val, AttrDict): + val.freeze(shouldFreeze) + + def unfreeze(self): + self.freeze(False) diff --git a/ush/python/pygw/src/pygw/configuration.py b/ush/python/pygw/src/pygw/configuration.py new file mode 100644 index 00000000000..da39a21748e --- /dev/null +++ b/ush/python/pygw/src/pygw/configuration.py @@ -0,0 +1,179 @@ +import glob +import os +import random +import subprocess +from pathlib import Path +from pprint import pprint +from typing import Union, List, Dict, Any + +from pygw.attrdict import AttrDict +from pygw.timetools import to_datetime + +__all__ = ['Configuration', 'cast_as_dtype', 'cast_strdict_as_dtypedict'] + + +class ShellScriptException(Exception): + def __init__(self, scripts, errors): + self.scripts = scripts + self.errors = errors + super(ShellScriptException, self).__init__( + str(errors) + + ': error processing' + + (' '.join(scripts))) + + +class UnknownConfigError(Exception): + pass + + +class Configuration: + """ + Configuration parser for the global-workflow + (or generally for sourcing a shell script into a python dictionary) + """ + + def __init__(self, config_dir: Union[str, Path]): + """ + Given a directory containing config files (config.XYZ), + return a list of config_files minus the ones ending with ".default" + """ + + self.config_dir = config_dir + self.config_files = self._get_configs + + @property + def _get_configs(self) -> List[str]: + """ + Given a directory containing config files (config.XYZ), + return a list of config_files minus the ones ending with ".default" + """ + result = list() + for config in glob.glob(f'{self.config_dir}/config.*'): + if not config.endswith('.default'): + result.append(config) + + return result + + def find_config(self, config_name: str) -> str: + """ + Given a config file name, find the full path of the config file + """ + + for config in self.config_files: + if config_name == 
os.path.basename(config): + return config + + raise UnknownConfigError( + f'{config_name} does not exist (known: {repr(config_name)}), ABORT!') + + def parse_config(self, files: Union[str, bytes, list]) -> Dict[str, Any]: + """ + Given the name of config file(s), key-value pair of all variables in the config file(s) + are returned as a dictionary + :param files: config file or list of config files + :type files: list or str or unicode + :return: Key value pairs representing the environment variables defined + in the script. + :rtype: dict + """ + if isinstance(files, (str, bytes)): + files = [files] + files = [self.find_config(file) for file in files] + return cast_strdict_as_dtypedict(self._get_script_env(files)) + + def print_config(self, files: Union[str, bytes, list]) -> None: + """ + Given the name of config file(s), key-value pair of all variables in the config file(s) are printed + Same signature as parse_config + :param files: config file or list of config files + :type files: list or str or unicode + :return: None + """ + config = self.parse_config(files) + pprint(config, width=4) + + @classmethod + def _get_script_env(cls, scripts: List) -> Dict[str, Any]: + default_env = cls._get_shell_env([]) + and_script_env = cls._get_shell_env(scripts) + vars_just_in_script = set(and_script_env) - set(default_env) + union_env = dict(default_env) + union_env.update(and_script_env) + return dict([(v, union_env[v]) for v in vars_just_in_script]) + + @staticmethod + def _get_shell_env(scripts: List) -> Dict[str, Any]: + varbls = dict() + runme = ''.join([f'source {s} ; ' for s in scripts]) + magic = f'--- ENVIRONMENT BEGIN {random.randint(0,64**5)} ---' + runme += f'/bin/echo -n "{magic}" ; /usr/bin/env -0' + with open('/dev/null', 'w') as null: + env = subprocess.Popen(runme, shell=True, stdin=null.fileno(), + stdout=subprocess.PIPE) + (out, err) = env.communicate() + out = out.decode() + begin = out.find(magic) + if begin < 0: + raise ShellScriptException(scripts, 
'Cannot find magic string; ' + 'at least one script failed: ' + repr(out)) + for entry in out[begin + len(magic):].split('\x00'): + iequal = entry.find('=') + varbls[entry[0:iequal]] = entry[iequal + 1:] + return varbls + + +def cast_strdict_as_dtypedict(ctx: Dict[str, str]) -> Dict[str, Any]: + """ + Environment variables are typically stored as str + This method attempts to translate those into datatypes + Parameters + ---------- + ctx : dict + dictionary with values as str + Returns + ------- + varbles : dict + dictionary with values as datatypes + """ + varbles = AttrDict() + for key, value in ctx.items(): + varbles[key] = cast_as_dtype(value) + return varbles + + +def cast_as_dtype(string: str) -> Union[str, int, float, bool, Any]: + """ + Cast a value into known datatype + Parameters + ---------- + string: str + Returns + ------- + value : str or int or float or datetime + default: str + """ + TRUTHS = ['y', 'yes', 't', 'true', '.t.', '.true.'] + BOOLS = ['n', 'no', 'f', 'false', '.f.', '.false.'] + TRUTHS + BOOLS = [x.upper() for x in BOOLS] + BOOLS + ['Yes', 'No', 'True', 'False'] + + def _cast_or_not(type: Any, string: str): + try: + return type(string) + except ValueError: + return string + + def _true_or_not(string: str): + try: + return string.lower() in TRUTHS + except AttributeError: + return string + + try: + return to_datetime(string) # Try as a datetime + except Exception as exc: + if string in BOOLS: # Likely a boolean, convert to True/False + return _true_or_not(string) + elif '.' 
in string: # Likely a number and that too a float + return _cast_or_not(float, string) + else: # Still could be a number, may be an integer + return _cast_or_not(int, string) diff --git a/ush/python/pygw/src/pygw/exceptions.py b/ush/python/pygw/src/pygw/exceptions.py new file mode 100644 index 00000000000..a97cba6406f --- /dev/null +++ b/ush/python/pygw/src/pygw/exceptions.py @@ -0,0 +1,87 @@ +# pylint: disable=unused-argument + +# ---- + +from collections.abc import Callable + +from pygw.logger import Logger, logit + +logger = Logger(level="error", colored_log=True) + +__all__ = ["WorkflowException", "msg_except_handle"] + + +class WorkflowException(Exception): + """ + Description + ----------- + + This is the base-class for all exceptions; it is a sub-class of + Exceptions. + + Parameters + ---------- + + msg: str + + A Python string containing a message to accompany the + exception. + + """ + + @logit(logger) + def __init__(self: Exception, msg: str): + """ + Description + ----------- + + Creates a new WorkflowException object. + + """ + + # Define the base-class attributes. + logger.error(msg=msg) + super().__init__() + + +# ---- + + +def msg_except_handle(err_cls: object) -> Callable: + """ + Description + ----------- + + This function provides a decorator to be used to raise specified + exceptions. + + Parameters + ---------- + + err_cls: object + + A Python object containing the WorkflowException subclass to + be used for exception raises. + + Parameters + ---------- + + decorator: Callable + + A Python decorator. + + """ + + # Define the decorator function. + def decorator(func: Callable): + + # Execute the caller function; proceed accordingly. + def call_function(msg: str) -> None: + + # If an exception is encountered, raise the respective + # exception. 
+ raise err_cls(msg=msg) + + return call_function + + return decorator diff --git a/ush/python/pygw/src/pygw/executable.py b/ush/python/pygw/src/pygw/executable.py new file mode 100644 index 00000000000..45464dba736 --- /dev/null +++ b/ush/python/pygw/src/pygw/executable.py @@ -0,0 +1,354 @@ +import os +import shlex +import subprocess +import sys +from typing import Any, Optional + +__all__ = ["Executable", "which", "CommandNotFoundError"] + + +class Executable: + """ + Class representing a program that can be run on the command line. + + Example: + -------- + + >>> from pygw.executable import Executable + >>> cmd = Executable('srun') # Lets say we need to run command e.g. "srun" + >>> cmd.add_default_arg('my_exec.x') # Lets say we need to run the executable "my_exec.x" + >>> cmd.add_default_arg('my_arg.yaml') # Lets say we need to pass an argument to this executable e.g. "my_arg.yaml" + >>> cmd.add_default_env('OMP_NUM_THREADS', 4) # Lets say we want to run w/ 4 threads in the environment + >>> cmd(output='stdout', error='stderr') # Run the command and capture the stdout and stderr in files named similarly. + + `cmd` line above will translate to: + + $ export OMP_NUM_THREADS=4 + $ srun my_exec.x my_arg.yaml 1>&stdout 2>&stderr + + References + ---------- + .. [1] "spack.util.executable.py", https://github.com/spack/spack/blob/develop/lib/spack/spack/util/executable.py + """ + + def __init__(self, name: str): + """ + Construct an executable object. + + Parameters + ---------- + name : str + name of the executable to run + """ + self.exe = shlex.split(str(name)) + self.default_env = {} + self.returncode = None + + if not self.exe: + raise ProcessError(f"Cannot construct executable for '{name}'") + + def add_default_arg(self, arg: str) -> None: + """ + Add a default argument to the command. 
+ Parameters + ---------- + arg : str + argument to the executable + """ + self.exe.append(arg) + + def add_default_env(self, key: str, value: Any) -> None: + """ + Set an environment variable when the command is run. + + Parameters: + ---------- + key : str + The environment variable to set + value : Any + The value to set it to + """ + self.default_env[key] = str(value) + + @property + def command(self) -> str: + """ + The command-line string. + + Returns: + -------- + str : The executable and default arguments + """ + return " ".join(self.exe) + + @property + def name(self) -> str: + """ + The executable name. + + Returns: + -------- + str : The basename of the executable + """ + return os.path.basename(self.path) + + @property + def path(self) -> str: + """ + The path to the executable. + + Returns: + -------- + str : The path to the executable + """ + return self.exe[0] + + def __call__(self, *args, **kwargs): + """ + Run this executable in a subprocess. + + Parameters: + ----------- + *args (str): Command-line arguments to the executable to run + + Keyword Arguments: + ------------------ + _dump_env : Dict + Dict to be set to the environment actually + used (envisaged for testing purposes only) + env : Dict + The environment with which to run the executable + fail_on_error : bool + Raise an exception if the subprocess returns + an error. Default is True. The return code is available as + ``exe.returncode`` + ignore_errors : int or List + A list of error codes to ignore. + If these codes are returned, this process will not raise + an exception even if ``fail_on_error`` is set to ``True`` + input : + Where to read stdin from + output : + Where to send stdout + error : + Where to send stderr + + Accepted values for input, output, and error: + + * python streams, e.g. open Python file objects, or ``os.devnull`` + * filenames, which will be automatically opened for writing + * ``str``, as in the Python string type. 
If you set these to ``str``, + output and error will be written to pipes and returned as a string. + If both ``output`` and ``error`` are set to ``str``, then one string + is returned containing output concatenated with error. Not valid + for ``input`` + * ``str.split``, as in the ``split`` method of the Python string type. + Behaves the same as ``str``, except that value is also written to + ``stdout`` or ``stderr``. + + By default, the subprocess inherits the parent's file descriptors. + + """ + # Environment + env_arg = kwargs.get("env", None) + + # Setup default environment + env = os.environ.copy() if env_arg is None else {} + env.update(self.default_env) + + # Apply env argument + if env_arg: + env.update(env_arg) + + if "_dump_env" in kwargs: + kwargs["_dump_env"].clear() + kwargs["_dump_env"].update(env) + + fail_on_error = kwargs.pop("fail_on_error", True) + ignore_errors = kwargs.pop("ignore_errors", ()) + + # If they just want to ignore one error code, make it a tuple. + if isinstance(ignore_errors, int): + ignore_errors = (ignore_errors,) + + output = kwargs.pop("output", None) + error = kwargs.pop("error", None) + input = kwargs.pop("input", None) + + if input is str: + raise ValueError("Cannot use `str` as input stream.") + + def streamify(arg, mode): + if isinstance(arg, str): + return open(arg, mode), True + elif arg in (str, str.split): + return subprocess.PIPE, False + else: + return arg, False + + istream, close_istream = streamify(input, "r") + ostream, close_ostream = streamify(output, "w") + estream, close_estream = streamify(error, "w") + + cmd = self.exe + list(args) + + escaped_cmd = ["'%s'" % arg.replace("'", "'\"'\"'") for arg in cmd] + cmd_line_string = " ".join(escaped_cmd) + + proc = None # initialize to avoid lint warning + try: + proc = subprocess.Popen(cmd, stdin=istream, stderr=estream, stdout=ostream, env=env, close_fds=False) + out, err = proc.communicate() + + result = None + if output in (str, str.split) or error in (str, 
str.split):
+            result = ""
+            if output in (str, str.split):
+                outstr = str(out.decode("utf-8"))
+                result += outstr
+                if output is str.split:
+                    sys.stdout.write(outstr)
+            if error in (str, str.split):
+                errstr = str(err.decode("utf-8"))
+                result += errstr
+                if error is str.split:
+                    sys.stderr.write(errstr)
+
+            rc = self.returncode = proc.returncode
+            if fail_on_error and rc != 0 and (rc not in ignore_errors):
+                long_msg = cmd_line_string
+                if result:
+                    # If the output is not captured in the result, it will have
+                    # been stored either in the specified files (e.g. if
+                    # 'output' specifies a file) or written to the parent's
+                    # stdout/stderr (e.g. if 'output' is not specified)
+                    long_msg += "\n" + result
+
+                raise ProcessError(f"Command exited with status {proc.returncode}:", long_msg)
+
+            return result
+
+        except OSError as e:
+            raise ProcessError(f"{self.exe[0]}: {e.strerror}", f"Command: {cmd_line_string}")
+
+        except subprocess.CalledProcessError as e:
+            if fail_on_error:
+                raise ProcessError(
+                    str(e),
+                    f"\nExit status {proc.returncode} when invoking command: {cmd_line_string}",
+                )
+
+        finally:
+            if close_ostream:
+                ostream.close()
+            if close_estream:
+                estream.close()
+            if close_istream:
+                istream.close()
+
+    def __eq__(self, other):
+        return hasattr(other, "exe") and self.exe == other.exe
+
+    def __neq__(self, other):
+        return not (self == other)
+
+    def __hash__(self):
+        return hash((type(self),) + tuple(self.exe))
+
+    def __repr__(self):
+        return f"<exe: {self.name}>"
+
+    def __str__(self):
+        return " ".join(self.exe)
+
+
+def which_string(*args, **kwargs) -> str:
+    """
+    Like ``which()``, but return a string instead of an ``Executable``.
+
+    If given multiple executables, returns the string of the first one that is found.
+    If no executables are found, returns None.
+
+    Parameters:
+    -----------
+    *args : str
+        One or more executables to search for
+
+    Keyword Arguments:
+    ------------------
+    path : str or List
+        The path to search. 
Defaults to ``PATH`` + required : bool + If set to True, raise an error if executable not found + + Returns: + -------- + str : + The first executable that is found in the path + """ + path = kwargs.get("path", os.environ.get("PATH", "")) + required = kwargs.get("required", False) + + if isinstance(path, str): + path = path.split(os.pathsep) + + for name in args: + for candidate_name in [name]: + if os.path.sep in candidate_name: + exe = os.path.abspath(candidate_name) + if os.path.isfile(exe) and os.access(exe, os.X_OK): + return exe + else: + for directory in path: + exe = os.path.join(directory, candidate_name) + if os.path.isfile(exe) and os.access(exe, os.X_OK): + return exe + + if required: + raise CommandNotFoundError(f"'{args[0]}' is required. Make sure it is in your PATH.") + + return None + + +def which(*args, **kwargs) -> Optional[Executable]: + """ + Finds an executable in the PATH like command-line which. + + If given multiple executables, returns the first one that is found. + If no executables are found, returns None. + + Parameters: + ----------- + *args : str + One or more executables to search for + + Keyword Arguments: + ------------------ + path : str or List + The path to search. Defaults to ``PATH`` + required : bool + If set to True, raise an error if executable not found + + Returns: + -------- + Executable: The first executable that is found in the path + """ + exe = which_string(*args, **kwargs) + return Executable(shlex.quote(exe)) if exe else None + + +class ProcessError(Exception): + """ + ProcessErrors are raised when Executables exit with an error code. + """ + def __init__(self, short_msg, long_msg=None): + self.short_msg = short_msg + self.long_msg = long_msg + message = short_msg + '\n' + long_msg if long_msg else short_msg + super().__init__(message) + + +class CommandNotFoundError(OSError): + """ + Raised when ``which()`` cannot find a required executable. 
+ """ diff --git a/ush/python/pygw/src/pygw/file_utils.py b/ush/python/pygw/src/pygw/file_utils.py new file mode 100644 index 00000000000..062a707d055 --- /dev/null +++ b/ush/python/pygw/src/pygw/file_utils.py @@ -0,0 +1,73 @@ +from .fsutils import cp, mkdir + +__all__ = ['FileHandler'] + + +class FileHandler: + """Class to manipulate files in bulk for a given configuration + + Parameters + ---------- + config : dict + A dictionary containing the "action" and the "act" in the form of a list + + NOTE + ---- + "action" can be one of mkdir", "copy", etc. + Corresponding "act" would be ['dir1', 'dir2'], [['src1', 'dest1'], ['src2', 'dest2']] + + Attributes + ---------- + config : dict + Dictionary of files to manipulate + """ + + def __init__(self, config): + + self.config = config + + def sync(self): + """ + Method to execute bulk actions on files described in the configuration + """ + sync_factory = { + 'copy': self._copy_files, + 'mkdir': self._make_dirs, + } + # loop through the configuration keys + for action, files in self.config.items(): + sync_factory[action](files) + + @staticmethod + def _copy_files(filelist): + """Function to copy all files specified in the list + + `filelist` should be in the form: + - [src, dest] + + Parameters + ---------- + filelist : list + List of lists of [src, dest] + """ + for sublist in filelist: + if len(sublist) != 2: + raise Exception( + f"List must be of the form ['src', 'dest'], not {sublist}") + src = sublist[0] + dest = sublist[1] + cp(src, dest) + print(f'Copied {src} to {dest}') # TODO use logger + + @staticmethod + def _make_dirs(dirlist): + """Function to make all directories specified in the list + + Parameters + ---------- + dirlist : list + List of directories to create + """ + for dd in dirlist: + mkdir(dd) + print(f'Created {dd}') # TODO use logger diff --git a/ush/python/pygw/src/pygw/fsutils.py b/ush/python/pygw/src/pygw/fsutils.py new file mode 100644 index 00000000000..50d7d10bbff --- /dev/null +++ 
b/ush/python/pygw/src/pygw/fsutils.py
@@ -0,0 +1,73 @@
+import os
+import errno
+import shutil
+import contextlib
+
+__all__ = ['mkdir', 'mkdir_p', 'rmdir', 'chdir', 'rm_p', 'cp']
+
+
+def mkdir_p(path):
+    try:
+        os.makedirs(path)
+    except OSError as exc:
+        if exc.errno == errno.EEXIST and os.path.isdir(path):
+            pass
+        else:
+            raise OSError(f"unable to create directory at {path}")
+
+
+mkdir = mkdir_p
+
+
+def rmdir(dir_path):
+    try:
+        shutil.rmtree(dir_path)
+    except OSError as exc:
+        raise OSError(f"unable to remove {dir_path}")
+
+
+@contextlib.contextmanager
+def chdir(path):
+    cwd = os.getcwd()
+    try:
+        os.chdir(path)
+        yield
+    finally:
+        try:
+            os.chdir(cwd)
+        except OSError:
+            print(f"WARNING: Unable to chdir({cwd})")  # TODO: use logging
+
+
+def rm_p(path):
+    try:
+        os.unlink(path)
+    except OSError as exc:
+        if exc.errno == errno.ENOENT:
+            pass
+        else:
+            raise OSError(f"unable to remove {path}")
+
+
+def cp(source: str, target: str) -> None:
+    """
+    Copy `source` file to `target` using `shutil.copyfile`.
+    If `target` is a directory, the filename from `source` is retained within `target`.
+
+    Parameters
+    ----------
+    source : str
+        Source filename
+    target : str
+        Destination filename or directory
+
+    Returns
+    -------
+    None
+    """
+
+    if os.path.isdir(target):
+        target = os.path.join(target, os.path.basename(source))
+
+    try:
+        shutil.copyfile(source, target)
+    except OSError:
+        raise OSError(f"unable to copy {source} to {target}")
diff --git a/ush/python/pygw/src/pygw/jinja.py b/ush/python/pygw/src/pygw/jinja.py
new file mode 100644
index 00000000000..0e4bcab0ee0
--- /dev/null
+++ b/ush/python/pygw/src/pygw/jinja.py
@@ -0,0 +1,228 @@
+import io
+import os
+import sys
+import jinja2
+from markupsafe import Markup
+from pathlib import Path
+from typing import Dict
+
+from .timetools import strftime, to_YMDH, to_YMD, to_fv3time, to_isotime
+
+__all__ = ['Jinja']
+
+
+@jinja2.pass_eval_context
+class 
SilentUndefined(jinja2.Undefined): + """ + Description + ----------- + A Jinja2 undefined that does not raise an error when it is used in a + template. Instead, it returns the template back when the variable is not found + This class is not to be used outside of this file + Its purpose is to return the template instead of an empty string + Presently, it also does not return the filter applied to the variable. + This will be added later when a use case for it presents itself. + """ + def __str__(self): + return "{{ " + self._undefined_name + " }}" + + def __add__(self, other): + return str(self) + other + + def __radd__(self, other): + return other + str(self) + + def __mod__(self, other): + return str(self) % other + + def __call__(self, *args, **kwargs): + return Markup("{{ " + self._undefined_name + " }}") + + +class Jinja: + """ + Description + ----------- + A wrapper around jinja2 to render templates + """ + + def __init__(self, template_path_or_string: str, data: Dict, allow_missing: bool = True): + """ + Description + ----------- + Given a path to a (jinja2) template and a data object, substitute the + template file with data. + Allow for retaining missing or undefined variables. 
+ Parameters + ---------- + template_path_or_string : str + Path to the template file or a templated string + data : dict + Data to be substituted into the template + allow_missing : bool + If True, allow for missing or undefined variables + """ + + self.data = data + self.undefined = SilentUndefined if allow_missing else jinja2.StrictUndefined + + if os.path.isfile(template_path_or_string): + self.template_type = 'file' + self.template_path = Path(template_path_or_string) + else: + self.template_type = 'stream' + self.template_stream = template_path_or_string + + @property + def render(self, data: Dict = None) -> str: + """ + Description + ----------- + Render the Jinja2 template with the data + Parameters + ---------- + data: dict (optional) + Additional data to be used in the template + Not implemented yet. Placed here for future use + Returns + ------- + rendered: str + Rendered template into text + """ + + render_map = {'stream': self._render_stream, + 'file': self._render_file} + return render_map[self.template_type]() + + def get_set_env(self, loader: jinja2.BaseLoader) -> jinja2.Environment: + """ + Description + ----------- + Define the environment for the jinja2 template + Any number of filters can be added here + + Parameters + ---------- + loader: of class jinja2.BaseLoader + Returns + ------- + env: jinja2.Environment + """ + env = jinja2.Environment(loader=loader, undefined=self.undefined) + env.filters["strftime"] = lambda dt, fmt: strftime(dt, fmt) + env.filters["to_isotime"] = lambda dt: to_isotime(dt) if not isinstance(dt, SilentUndefined) else dt + env.filters["to_fv3time"] = lambda dt: to_fv3time(dt) if not isinstance(dt, SilentUndefined) else dt + env.filters["to_YMDH"] = lambda dt: to_YMDH(dt) if not isinstance(dt, SilentUndefined) else dt + env.filters["to_YMD"] = lambda dt: to_YMD(dt) if not isinstance(dt, SilentUndefined) else dt + return env + + @staticmethod + def add_filter_env(env: jinja2.Environment, filter_name: str, filter_func: 
callable): + """ + Description + ----------- + Add a custom filter to the jinja2 environment + Not implemented yet. Placed here for future use + Parameters + ---------- + env: jinja2.Environment + Active jinja2 environment + filter_name: str + name of the filter + filter_func: callable + function that will be called + Returns + ------- + env: jinja2.Environment + Active jinja2 environment with the new filter added + """ + raise NotImplementedError("Not implemented yet. Placed here for future use") + # Implementation would look something like the following + # env.filters[filter_name] = filter_func + # return env + + def _render_stream(self): + loader = jinja2.BaseLoader() + env = self.get_set_env(loader) + template = env.from_string(self.template_stream) + return self._render_template(template) + + def _render_file(self, data: Dict = None): + template_dir = self.template_path.parent + template_file = self.template_path.relative_to(template_dir) + + loader = jinja2.FileSystemLoader(template_dir) + env = self.get_set_env(loader) + template = env.get_template(str(template_file)) + return self._render_template(template) + + def _render_template(self, template: jinja2.Template): + """ + Description + ----------- + Render a jinja2 template object + Parameters + ---------- + template: jinja2.Template + + Returns + ------- + rendered: str + """ + try: + rendered = template.render(**self.data) + except jinja2.UndefinedError as ee: + raise Exception(f"Undefined variable in Jinja2 template\n{ee}") + + return rendered + + def _render(self, template_name: str, loader: jinja2.BaseLoader) -> str: + """ + Description + ----------- + Internal method to render a jinja2 template + Parameters + ---------- + template_name: str + loader: jinja2.BaseLoader + Returns + ------- + rendered: str + rendered template + """ + env = jinja2.Environment(loader=loader, undefined=self.undefined) + template = env.get_template(template_name) + try: + rendered = template.render(**self.data) + except 
jinja2.UndefinedError as ee: + raise Exception(f"Undefined variable in Jinja2 template\n{ee}") + + return rendered + + def save(self, output_file: str) -> None: + """ + Description + ----------- + Render and save the output to a file + Parameters + ---------- + output_file: str + Path to the output file + Returns + ------- + None + """ + with open(output_file, 'wb') as fh: + fh.write(self.render.encode("utf-8")) + + def dump(self) -> None: + """ + Description + ----------- + Render and dump the output to stdout + Returns + ------- + None + """ + io.TextIOWrapper(sys.stdout.buffer, + encoding="utf-8").write(self.render) diff --git a/ush/python/pygw/src/pygw/logger.py b/ush/python/pygw/src/pygw/logger.py new file mode 100644 index 00000000000..1bf2ed29858 --- /dev/null +++ b/ush/python/pygw/src/pygw/logger.py @@ -0,0 +1,275 @@ +""" +Logger +""" + +import os +import sys +from functools import wraps +from pathlib import Path +from typing import Union, List +import logging + + +class ColoredFormatter(logging.Formatter): + """ + Logging colored formatter + adapted from https://stackoverflow.com/a/56944256/3638629 + """ + + grey = '\x1b[38;21m' + blue = '\x1b[38;5;39m' + yellow = '\x1b[38;5;226m' + red = '\x1b[38;5;196m' + bold_red = '\x1b[31;1m' + reset = '\x1b[0m' + + def __init__(self, fmt): + super().__init__() + self.fmt = fmt + self.formats = { + logging.DEBUG: self.blue + self.fmt + self.reset, + logging.INFO: self.grey + self.fmt + self.reset, + logging.WARNING: self.yellow + self.fmt + self.reset, + logging.ERROR: self.red + self.fmt + self.reset, + logging.CRITICAL: self.bold_red + self.fmt + self.reset + } + + def format(self, record): + log_fmt = self.formats.get(record.levelno) + formatter = logging.Formatter(log_fmt) + return formatter.format(record) + + +class Logger: + """ + Improved logging + """ + LOG_LEVELS = ['DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL'] + DEFAULT_LEVEL = 'INFO' + DEFAULT_FORMAT = '%(asctime)s - %(levelname)-8s - %(name)-12s: 
%(message)s' + + def __init__(self, name: str = None, + level: str = os.environ.get("LOGGING_LEVEL"), + _format: str = DEFAULT_FORMAT, + colored_log: bool = False, + logfile_path: Union[str, Path] = None): + """ + Initialize Logger + + Parameters + ---------- + name : str + Name of the Logger object + default : None + level : str + Desired Logging level + default : 'INFO' + _format : str + Desired Logging Format + default : '%(asctime)s - %(levelname)-8s - %(name)-12s: %(message)s' + colored_log : bool + Use colored logging for stdout + default: False + logfile_path : str or Path + Path for logging to a file + default : None + """ + + self.name = name + self.level = level.upper() if level else Logger.DEFAULT_LEVEL + self.format = _format + self.colored_log = colored_log + + if self.level not in Logger.LOG_LEVELS: + raise LookupError(f"{self.level} is unknown logging level\n" + + f"Currently supported log levels are:\n" + + f"{' | '.join(Logger.LOG_LEVELS)}") + + # Initialize the root logger if no name is present + self._logger = logging.getLogger(name) if name else logging.getLogger() + + self._logger.setLevel(self.level) + + _handlers = [] + # Add console handler for logger + _handler = Logger.add_stream_handler( + level=self.level, + _format=self.format, + colored_log=self.colored_log, + ) + _handlers.append(_handler) + self._logger.addHandler(_handler) + + # Add file handler for logger + if logfile_path is not None: + _handler = Logger.add_file_handler( + logfile_path, level=self.level, _format=self.format) + self._logger.addHandler(_handler) + _handlers.append(_handler) + + def __getattr__(self, attribute): + """ + Allows calling logging module methods directly + + Parameters + ---------- + attribute : str + attribute name of a logging object + + Returns + ------- + attribute : logging attribute + """ + return getattr(self._logger, attribute) + + def get_logger(self): + """ + Return the logging object + + Returns + ------- + logger : Logger object + """ + 
return self._logger + + @classmethod + def add_handlers(cls, logger: logging.Logger, handlers: List[logging.Handler]): + """ + Add a list of handlers to a logger + + Parameters + ---------- + logger : logging.Logger + Logger object to add a new handler to + handlers: list + A list of handlers to be added to the logger object + + Returns + ------- + logger : Logger object + """ + for handler in handlers: + logger.addHandler(handler) + + return logger + + @classmethod + def add_stream_handler(cls, level: str = DEFAULT_LEVEL, + _format: str = DEFAULT_FORMAT, + colored_log: bool = False): + """ + Create stream handler + This classmethod will allow setting a custom stream handler on children + + Parameters + ---------- + level : str + logging level + default : 'INFO' + _format : str + logging format + default : '%(asctime)s - %(levelname)-8s - %(name)-12s: %(message)s' + colored_log : bool + enable colored output for stdout + default : False + + Returns + ------- + handler : logging.Handler + stream handler of a logging object + """ + + handler = logging.StreamHandler(sys.stdout) + handler.setLevel(level) + _format = ColoredFormatter( + _format) if colored_log else logging.Formatter(_format) + handler.setFormatter(_format) + + return handler + + @classmethod + def add_file_handler(cls, logfile_path: Union[str, Path], + level: str = DEFAULT_LEVEL, + _format: str = DEFAULT_FORMAT): + """ + Create file handler. 
+        This classmethod will allow setting a custom file handler on children
+
+        Parameters
+        ----------
+        logfile_path: str or Path
+            Path for writing out logfiles from logging
+            default : None
+        level : str
+            logging level
+            default : 'INFO'
+        _format : str
+            logging format
+            default : '%(asctime)s - %(levelname)-8s - %(name)-12s: %(message)s'
+
+        Returns
+        -------
+        handler : logging.Handler
+            file handler of a logging object
+        """
+
+        logfile_path = Path(logfile_path)
+
+        # Create the directory containing the logfile_path
+        if not logfile_path.parent.is_dir():
+            logfile_path.parent.mkdir(parents=True, exist_ok=True)
+
+        handler = logging.FileHandler(str(logfile_path))
+        handler.setLevel(level)
+        handler.setFormatter(logging.Formatter(_format))
+
+        return handler
+
+
+def logit(logger, name=None, message=None):
+    """
+    Logger decorator to add logging to a function.
+    Simply add:
+    @logit(logger) before any function
+    Parameters
+    ----------
+    logger : Logger
+        Logger object
+    name : str
+        Name of the module to be logged
+        default: __module__
+    message : str
+        Name of the function to be logged
+        default: __name__
+    """
+
+    def decorate(func):
+
+        log_name = name if name else func.__module__
+        log_msg = message if message else log_name + "."
+ func.__name__ + + @wraps(func) + def wrapper(*args, **kwargs): + + passed_args = [repr(aa) for aa in args] + passed_kwargs = [f"{kk}={repr(vv)}" for kk, vv in list(kwargs.items())] + + call_msg = 'BEGIN: ' + log_msg + logger.info(call_msg) + logger.debug(f"( {', '.join(passed_args + passed_kwargs)} )") + + # Call the function + retval = func(*args, **kwargs) + + # Close the logging with printing the return val + ret_msg = ' END: ' + log_msg + logger.info(ret_msg) + logger.debug(f" returning: {retval}") + + return retval + + return wrapper + + return decorate diff --git a/ush/python/pygw/src/pygw/task.py b/ush/python/pygw/src/pygw/task.py new file mode 100644 index 00000000000..22ce4626d8f --- /dev/null +++ b/ush/python/pygw/src/pygw/task.py @@ -0,0 +1,93 @@ +import logging +from typing import Dict + +from pygw.attrdict import AttrDict +from pygw.timetools import add_to_datetime, to_timedelta + +logger = logging.getLogger(__name__.split('.')[-1]) + + +class Task: + """ + Base class for all tasks + """ + + def __init__(self, config: Dict, *args, **kwargs): + """ + Every task needs a config. + Additional arguments (or key-value arguments) can be provided. 
+
+        Parameters
+        ----------
+        config : Dict
+            dictionary object containing task configuration
+
+        *args : tuple
+            Additional arguments to `Task`
+
+        **kwargs : dict, optional
+            Extra keyword arguments to `Task`
+        """
+
+        # Store the config and arguments as attributes of the object
+        self.config = AttrDict(config)
+
+        for arg in args:
+            setattr(self, str(arg), arg)
+
+        for key, value in kwargs.items():
+            setattr(self, key, value)
+
+        # Pull basic runtime keys and their values out of config into a separate runtime config
+        self.runtime_config = AttrDict()
+        runtime_keys = ['PDY', 'cyc', 'DATA', 'RUN', 'CDUMP']  # TODO: eliminate CDUMP and use RUN instead
+        for kk in runtime_keys:
+            try:
+                self.runtime_config[kk] = config[kk]
+                logger.debug(f'Deleting runtime_key {kk} from config')
+                del self.config[kk]
+            except KeyError:
+                raise KeyError(f"Task config is missing the required runtime_key '{kk}'")
+
+        # Any other composite runtime variables that may be needed for the duration of the task
+        # can be constructed here
+
+        # Construct the current cycle datetime object
+        self.runtime_config['current_cycle'] = add_to_datetime(self.runtime_config['PDY'], to_timedelta(f"{self.runtime_config.cyc}H"))
+        logger.debug(f"current cycle: {self.runtime_config['current_cycle']}")
+
+        # Construct the previous cycle datetime object
+        self.runtime_config['previous_cycle'] = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config['assim_freq']}H"))
+        logger.debug(f"previous cycle: {self.runtime_config['previous_cycle']}")
+
+    def initialize(self):
+        """
+        Initialize methods for a task
+        """
+        pass
+
+    def configure(self):
+        """
+        Configuration methods for a task in preparation for execution
+        """
+        pass
+
+    def execute(self):
+        """
+        Execute methods for a task
+        """
+        pass
+
+    def finalize(self):
+        """
+        Methods to run after execution that produce the task's output
+        """
+        pass
+
+    def clean(self):
+        """
+        Methods to clean after execution and finalization prior to
+        closing out a task
+        """
+        pass
diff --git a/ush/python/pygw/src/pygw/template.py b/ush/python/pygw/src/pygw/template.py
new file mode 100644
index 00000000000..85323057833
--- /dev/null
+++ b/ush/python/pygw/src/pygw/template.py
@@ -0,0 +1,191 @@
+import re
+import os
+import copy
+from collections import namedtuple
+from collections.abc import Sequence
+
+# Template imported with permission from jcsda/solo
+
+__all__ = ['Template', 'TemplateConstants']
+
+
+class TemplateConstants:
+    DOLLAR_CURLY_BRACE = '${}'
+    DOLLAR_PARENTHESES = '$()'
+    DOUBLE_CURLY_BRACES = '{{}}'
+    AT_SQUARE_BRACES = '@[]'
+    AT_ANGLE_BRACKETS = '@<>'
+
+    SubPair = namedtuple('SubPair', ['regex', 'slice'])
+
+
+class Template:
+
+    """
+    Utility for substituting variables in a template. The template can be the contents of a whole file
+    as a string (substitute_string) or in a complex dictionary (substitute_structure).
+    substitutions define different types of variables with a regex and a slice:
+    - the regex is supposed to find the whole variable, e.g., $(variable)
+    - the slice indicates how to slice the value returned by the regex to get the variable name; in the
+      case of $(variable), the slice is 2, -1 to remove $( and ).
+    You can easily add new types of variables following those rules.
+
+    Please note that the regexes allow for at least one nested variable and the code is able to handle it.
+    It means that $($(variable)) will be processed correctly but the substitutions will need more than one
+    pass.
+
+    If you have a file that is deeper than just a simple dictionary or has lists in it, you can use the method
+    build_index to create a dictionary that will have all the options from deeper levels (lists, dicts).
+    You can then pass index.get as an argument to any method you use.
+    If you use substitute_with_dependencies, this is done automatically.
+ """ + + substitutions = { + TemplateConstants.DOLLAR_CURLY_BRACE: TemplateConstants.SubPair(re.compile(r'\${.*?}+'), slice(2, -1)), + TemplateConstants.DOLLAR_PARENTHESES: TemplateConstants.SubPair(re.compile(r'\$\(.*?\)+'), slice(2, -1)), + TemplateConstants.DOUBLE_CURLY_BRACES: TemplateConstants.SubPair(re.compile(r'{{.*?}}+'), slice(2, -2)), + TemplateConstants.AT_SQUARE_BRACES: TemplateConstants.SubPair(re.compile(r'@\[.*?\]+'), slice(2, -1)), + TemplateConstants.AT_ANGLE_BRACKETS: TemplateConstants.SubPair( + re.compile(r'@\<.*?\>+'), slice(2, -1)) + } + + @classmethod + def find_variables(cls, variable_to_substitute: str, var_type: str): + pair = cls.substitutions[var_type] + return [x[pair.slice] for x in re.findall(pair.regex, variable_to_substitute)] + + @classmethod + def substitute_string(cls, variable_to_substitute, var_type: str, get_value): + """ + Substitutes variables under the form var_type (e.g. DOLLAR_CURLY_BRACE), looks for a value returned + by function get_value and if found, substitutes the variable. Convert floats and int to string + before substitution. If the value in the dictionary is a complex type, just assign it instead + of substituting. + get_value is a function that returns the value to substitute: + signature: get_value(variable_name). + If substituting from a dictionary my_dict, pass my_dict.get + """ + pair = cls.substitutions[var_type] + if isinstance(variable_to_substitute, str): + variable_names = re.findall(pair.regex, variable_to_substitute) + for variable in variable_names: + var = variable[pair.slice] + v = get_value(var) + if v is not None: + if not is_single_type_or_string(v): + if len(variable_names) == 1: + # v could be a list or a dictionary (complex structure and not a string). + # If there is one variable that is the whole + # string, we can safely replace, otherwise do nothing. 
+ if variable_to_substitute.replace(variable_names[0][pair.slice], '') == var_type: + variable_to_substitute = v + else: + if isinstance(v, float) or isinstance(v, int): + v = str(v) + if isinstance(v, str): + variable_to_substitute = variable_to_substitute.replace( + variable, v) + else: + variable_to_substitute = v + else: + more = re.search(pair.regex, var) + if more is not None: + new_value = cls.substitute_string( + var, var_type, get_value) + variable_to_substitute = variable_to_substitute.replace( + var, new_value) + return variable_to_substitute + + @classmethod + def substitute_structure(cls, structure_to_substitute, var_type: str, get_value): + """ + Traverses a dictionary and substitutes variables in fields, lists + and nested dictionaries. + """ + if isinstance(structure_to_substitute, dict): + for key, item in structure_to_substitute.items(): + structure_to_substitute[key] = cls.substitute_structure( + item, var_type, get_value) + elif is_sequence_and_not_string(structure_to_substitute): + for i, item in enumerate(structure_to_substitute): + structure_to_substitute[i] = cls.substitute_structure( + item, var_type, get_value) + else: + structure_to_substitute = cls.substitute_string(structure_to_substitute, var_type, + get_value) + return structure_to_substitute + + @classmethod + def substitute_structure_from_environment(cls, structure_to_substitute): + return cls.substitute_structure(structure_to_substitute, TemplateConstants.DOLLAR_CURLY_BRACE, os.environ.get) + + @classmethod + def substitute_with_dependencies(cls, dictionary, keys, var_type: str, shallow_precedence=True, excluded=()): + """ + Given a dictionary with a complex (deep) structure, we want to substitute variables, + using keys, another dictionary that may also have a deep structure (dictionary and keys + can be the same dictionary if you want to substitute in place). + We create an index based on keys (see build_index) and substitute values in dictionary + using index. 
+        If variables may refer to other variables, more than one pass of substitution
+        may be needed, so we substitute until there is no more change in dictionary (convergence).
+        """
+        all_variables = cls.build_index(keys, excluded, shallow_precedence)
+        previous = {}
+        while dictionary != previous:
+            previous = copy.deepcopy(dictionary)
+            dictionary = cls.substitute_structure(
+                dictionary, var_type, all_variables.get)
+        return dictionary
+
+    @classmethod
+    def build_index(cls, dictionary, excluded=None, shallow_precedence=True):
+        """
+        Builds an index of all keys with their values, going deep into the dictionary. The index
+        is a flat structure (dictionary).
+        If the same key name is present more than once in the structure, we want to
+        either prioritise the values that are near the root of the tree (shallow_precedence=True)
+        or values that are near the leaves (shallow_precedence=False). We don't anticipate use
+        cases where the "nearest variable" should be used, but this could constitute a future
+        improvement.
+        """
+        def build(structure, variables):
+            if isinstance(structure, dict):
+                for k, i in structure.items():
+                    if ((k not in variables) or (k in variables and not shallow_precedence)) and k not in excluded:
+                        variables[k] = i
+                    build(i, variables)
+            elif is_sequence_and_not_string(structure):
+                for v in structure:
+                    build(v, variables)
+        var = {}
+        if excluded is None:
+            excluded = set()
+        build(dictionary, var)
+        return var
+
+
+# These used to be in basic.py, and have been copied here till they are needed elsewhere.
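
[Editor's note: the regex-plus-slice substitution scheme described in the Template docstring above can be sketched as a small standalone demo. The `substitute` helper, the `$()` pair, and the sample context below are illustrative names for this sketch only, not part of the pygw API; the real class additionally handles nested variables and non-string values.]

```python
import re
from collections import namedtuple

# One substitution "type": a regex that finds the whole token, and a slice
# that strips the delimiters to leave the bare variable name.
SubPair = namedtuple('SubPair', ['regex', 'slice'])
DOLLAR_PARENTHESES = SubPair(re.compile(r'\$\(.*?\)+'), slice(2, -1))

def substitute(text, get_value, pair=DOLLAR_PARENTHESES):
    """Replace every $(name) token for which get_value(name) is not None."""
    for token in re.findall(pair.regex, text):
        value = get_value(token[pair.slice])  # slice(2, -1) strips '$(' and ')'
        if value is not None:
            text = text.replace(token, str(value))
    return text

ctx = {'RUN': 'gdas', 'cyc': 18}
print(substitute("com/$(RUN)/$(RUN).t$(cyc)z.nc", ctx.get))
# prints com/gdas/gdas.t18z.nc
```

As in `Template.substitute_string`, passing a dictionary's `.get` as `get_value` leaves unknown tokens untouched instead of raising.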
+
+
+def is_sequence_and_not_string(a):
+    return isinstance(a, Sequence) and not isinstance(a, str)
+
+
+def is_single_type(s):
+    try:
+        len(s)
+    except TypeError:
+        return True
+    else:
+        return False
+
+
+def is_single_type_or_string(s):
+    if isinstance(s, str):
+        return True
+    try:
+        len(s)
+    except TypeError:
+        return True
+    else:
+        return False
diff --git a/ush/python/pygw/src/pygw/timetools.py b/ush/python/pygw/src/pygw/timetools.py
new file mode 100644
index 00000000000..b63e1323376
--- /dev/null
+++ b/ush/python/pygw/src/pygw/timetools.py
@@ -0,0 +1,291 @@
+import re
+import datetime
+
+
+__all__ = ["to_datetime", "to_timedelta",
+           "datetime_to_YMDH", "datetime_to_YMD",
+           "timedelta_to_HMS",
+           "strftime", "strptime",
+           "to_YMDH", "to_YMD",
+           "to_isotime", "to_fv3time",
+           "add_to_datetime", "add_to_timedelta"]
+
+
+_DATETIME_RE = re.compile(
+    r"(?P<year>\d{4})(-)?(?P<month>\d{2})(-)?(?P<day>\d{2})"
+    r"(T)?(?P<hour>\d{2})?(:)?(?P<minute>\d{2})?(:)?(?P<second>\d{2})?(Z)?")
+
+_TIMEDELTA_HOURS_RE = re.compile(
+    r"(?P<sign>[+-])?"
+    r"((?P<days>\d+)[d])?"
+    r"(T)?((?P<hours>\d+)[H])?((?P<minutes>\d+)[M])?((?P<seconds>\d+)[S])?(Z)?")
+_TIMEDELTA_TIME_RE = re.compile(
+    r"(?P<sign>[+-])?"
+    r"((?P<days>\d+)(\s)day(s)?,(\s)?)?"
+    r"(T)?(?P<hours>\d{1,2})?(:(?P<minutes>\d{1,2}))?(:(?P<seconds>\d{1,2}))?")
+
+
+def to_datetime(dtstr: str) -> datetime.datetime:
+    """
+    Description
+    -----------
+    Translate a string into a datetime object in a generic way.
+    The string can also support ISO 8601 representation.
+ + Formats accepted (T, Z, -, :) are optional: + YYYY-mm-dd + YYYY-mm-ddTHHZ + YYYY-mm-ddTHH:MMZ + YYYY-mm-ddTHH:MM:SSZ + + Parameters + ---------- + dtstr : str + String to be translated into a datetime object + + Returns + ------- + datetime.datetime + Datetime object + """ + + mm = _DATETIME_RE.match(dtstr) + if mm: + return datetime.datetime(**{kk: int(vv) for kk, vv in mm.groupdict().items() if vv}) + else: + raise Exception(f"Bad datetime string: '{dtstr}'") + + +def to_timedelta(tdstr: str) -> datetime.timedelta: + """ + Description + ----------- + Translate a string into a timedelta object in a generic way + + Formats accepted (, T, Z) are optional: +
+        <sign><days>dT<hours>H<minutes>M<seconds>SZ
+        <sign><days> day(s), hh:mm:ss
+
+    <sign> can be +/-, default is +
+    <days> can be any integer, default is 0
+    <hours> can be any integer, default is 0
+    <minutes> can be any integer, default is 0
+    <seconds> can be any integer, default is 0
+
+    Parameters
+    ----------
+    tdstr : str
+        String to be translated into a timedelta object
+
+    Returns
+    -------
+    datetime.timedelta
+        Timedelta object
+    """
+
+    time_dict = {'sign': '+',
+                 'days': 0,
+                 'hours': 0,
+                 'minutes': 0,
+                 'seconds': 0}
+
+    if any(x in tdstr for x in ['day', 'days', ':']):
+        mm = _TIMEDELTA_TIME_RE.match(tdstr)  # timedelta representation
+    else:
+        mm = _TIMEDELTA_HOURS_RE.match(tdstr)  # ISO 8601 representation
+
+    if mm:
+        nmm = {kk: vv if vv is not None else time_dict[kk]
+               for kk, vv in mm.groupdict().items()}
+        del nmm['sign']
+        nmm = {kk: float(vv) for kk, vv in nmm.items()}
+        dt = datetime.timedelta(**nmm)
+        if mm.group('sign') is not None and mm.group('sign') == '-':
+            dt = -dt
+        return dt
+    else:
+        raise Exception(f"Bad timedelta string: '{tdstr}'")
+
+
+def datetime_to_YMDH(dt: datetime.datetime) -> str:
+    """
+    Description
+    -----------
+    Translate a datetime object to 'YYYYmmddHH' format.
+
+    Parameters
+    ----------
+    dt : datetime.datetime
+        Datetime object to translate.
+
+    Returns
+    -------
+    str: str
+        Formatted string in 'YYYYmmddHH' format.
+    """
+    try:
+        return dt.strftime('%Y%m%d%H')
+    except Exception:
+        raise Exception(f"Bad datetime: '{dt}'")
+
+
+def datetime_to_YMD(dt: datetime.datetime) -> str:
+    """
+    Description
+    -----------
+    Translate a datetime object to 'YYYYmmdd' format.
+
+    Parameters
+    ----------
+    dt : datetime.datetime
+        Datetime object to translate.
+
+    Returns
+    -------
+    str: str
+        Formatted string in 'YYYYmmdd' format.
+    """
+    try:
+        return dt.strftime('%Y%m%d')
+    except Exception:
+        raise Exception(f"Bad datetime: '{dt}'")
+
+
+def timedelta_to_HMS(td: datetime.timedelta) -> str:
+    """
+    Description
+    -----------
+    Translate a timedelta object to 'HH:MM:SS' format.
+
+    Parameters
+    ----------
+    td : datetime.timedelta
+        Timedelta object to translate.
+ + Returns + ------- + str: str + Formatted string in 'HH:MM:SS' format. + """ + try: + hours, remainder = divmod(int(td.total_seconds()), 3600) + minutes, seconds = divmod(remainder, 60) + return f"{hours:02d}:{minutes:02d}:{seconds:02d}" + except Exception: + raise Exception(f"Bad timedelta: '{td}'") + + +def strftime(dt: datetime.datetime, fmt: str) -> str: + """ + Return a formatted string from a datetime object. + """ + try: + return dt.strftime(fmt) + except Exception: + raise Exception(f"Bad datetime (format): '{dt} ({fmt})'") + + +def strptime(dtstr: str, fmt: str) -> datetime.datetime: + """ + Description + ----------- + Translate a formatted string into datetime object. + + Parameters + ---------- + dtstr : str + Datetime string to translate. + fmt : str + Datetime string format. + + Returns + ------- + datetime.datetime: datetime.datetime + Datetime object. + """ + try: + return datetime.datetime.strptime(dtstr, fmt) + except Exception: + raise Exception(f"Bad datetime string (format): '{dtstr} ({fmt})'") + + +def to_isotime(dt: datetime.datetime) -> str: + """ + Description + ----------- + Return a ISO formatted '%Y-%m-%dT%H:%M:%SZ' string from a datetime object. + + Parameters + ---------- + dt : datetime.datetime + Datetime object to format. + + Returns + ------- + str: str + Formatted string in ISO format. + """ + return strftime(dt, '%Y-%m-%dT%H:%M:%SZ') + + +def to_fv3time(dt: datetime.datetime) -> str: + """ + Description + ----------- + Return a FV3 formatted string from a datetime object. + + Parameters + ---------- + dt : datetime.datetime + Datetime object to format. + + Returns + ------- + str: str + Formatted string in FV3 format. + """ + return strftime(dt, '%Y%m%d.%H%M%S') + + +def add_to_datetime(dt: datetime.datetime, td: datetime.timedelta) -> datetime.datetime: + """ + Description + ----------- + Adds a timedelta to a datetime object. + + Parameters + ---------- + dt : datetime.datetime + Datetime object to add to. 
+ td : datetime.timedelta + Timedelta object to add. + + Returns + ------- + datetime.datetime + """ + return dt + td + + +def add_to_timedelta(td1, td2): + """ + Description + ----------- + Adds two timedelta objects. + + Parameters + ---------- + td1 : datetime.timedelta + First timedelta object to add. + td2 : datetime.timedelta + Second timedelta object to add. + + Returns + ------- + datetime.timedelta + """ + return td1 + td2 + + +to_YMDH = datetime_to_YMDH +to_YMD = datetime_to_YMD diff --git a/ush/python/pygw/src/pygw/yaml_file.py b/ush/python/pygw/src/pygw/yaml_file.py new file mode 100644 index 00000000000..89cd1e2ec08 --- /dev/null +++ b/ush/python/pygw/src/pygw/yaml_file.py @@ -0,0 +1,208 @@ +import os +import re +import json +import yaml +import datetime +from typing import Any, Dict +from .attrdict import AttrDict +from .template import TemplateConstants, Template +from .jinja import Jinja + +__all__ = ['YAMLFile', 'parse_yaml', 'parse_yamltmpl', 'parse_j2yaml', + 'save_as_yaml', 'dump_as_yaml', 'vanilla_yaml'] + + +class YAMLFile(AttrDict): + """ + Reads a YAML file as an AttrDict and recursively converts + nested dictionaries into AttrDict. + This is the entry point for all YAML files. + """ + + def __init__(self, path=None, data=None): + super().__init__() + + if path and data: + print("Ignoring 'data' and using 'path' argument") + + config = None + if path is not None: + config = parse_yaml(path=path) + elif data is not None: + config = parse_yaml(data=data) + + if config is not None: + self.update(config) + + def save(self, target): + save_as_yaml(self, target) + + def dump(self): + return dump_as_yaml(self) + + def as_dict(self): + return vanilla_yaml(self) + + +def save_as_yaml(data, target): + # specifies a wide file so that long strings are on one line. 
+ with open(target, 'w') as fh: + yaml.safe_dump(vanilla_yaml(data), fh, + width=100000, sort_keys=False) + + +def dump_as_yaml(data): + return yaml.dump(vanilla_yaml(data), + width=100000, sort_keys=False) + + +def parse_yaml(path=None, data=None, + encoding='utf-8', loader=yaml.SafeLoader): + """ + Load a yaml configuration file and resolve any environment variables + The environment variables must have !ENV before them and be in this format + to be parsed: ${VAR_NAME}. + E.g.: + database: + host: !ENV ${HOST} + port: !ENV ${PORT} + app: + log_path: !ENV '/var/${LOG_PATH}' + something_else: !ENV '${AWESOME_ENV_VAR}/var/${A_SECOND_AWESOME_VAR}' + :param str path: the path to the yaml file + :param str data: the yaml data itself as a stream + :param Type[yaml.loader] loader: Specify which loader to use. Defaults to yaml.SafeLoader + :param str encoding: the encoding of the data if a path is specified, defaults to utf-8 + :return: the dict configuration + :rtype: Dict[str, Any] + + Adopted from: + https://dev.to/mkaranasou/python-yaml-configuration-with-environment-variables-parsing-2ha6 + """ + # define tags + envtag = '!ENV' + inctag = '!INC' + # pattern for global vars: look for ${word} + pattern = re.compile(r'.*?\${(\w+)}.*?') + loader = loader or yaml.SafeLoader + + # the envtag will be used to mark where to start searching for the pattern + # e.g. 
somekey: !ENV somestring${MYENVVAR}blah blah blah + loader.add_implicit_resolver(envtag, pattern, None) + loader.add_implicit_resolver(inctag, pattern, None) + + def expand_env_variables(line): + match = pattern.findall(line) # to find all env variables in line + if match: + full_value = line + for g in match: + full_value = full_value.replace( + f'${{{g}}}', os.environ.get(g, f'${{{g}}}') + ) + return full_value + return line + + def constructor_env_variables(loader, node): + """ + Extracts the environment variable from the node's value + :param yaml.Loader loader: the yaml loader + :param node: the current node in the yaml + :return: the parsed string that contains the value of the environment + variable + """ + value = loader.construct_scalar(node) + return expand_env_variables(value) + + def constructor_include_variables(loader, node): + """ + Extracts the environment variable from the node's value + :param yaml.Loader loader: the yaml loader + :param node: the current node in the yaml + :return: the content of the file to be included + """ + value = loader.construct_scalar(node) + value = expand_env_variables(value) + expanded = parse_yaml(value) + return expanded + + loader.add_constructor(envtag, constructor_env_variables) + loader.add_constructor(inctag, constructor_include_variables) + + if path: + with open(path, 'r', encoding=encoding) as conf_data: + return yaml.load(conf_data, Loader=loader) + elif data: + return yaml.load(data, Loader=loader) + else: + raise ValueError( + "Either a path or data should be defined as input") + + +def vanilla_yaml(ctx): + """ + Transform an input object of complex type as a plain type + """ + if isinstance(ctx, AttrDict): + return {kk: vanilla_yaml(vv) for kk, vv in ctx.items()} + elif isinstance(ctx, list): + return [vanilla_yaml(vv) for vv in ctx] + elif isinstance(ctx, datetime.datetime): + return ctx.strftime("%Y-%m-%dT%H:%M:%SZ") + else: + return ctx + + +def parse_j2yaml(path: str, data: Dict) -> Dict[str, Any]: + 
""" + Description + ----------- + Load a compound jinja2-templated yaml file and resolve any templated variables. + The jinja2 templates are first resolved and then the rendered template is parsed as a yaml. + Finally, any remaining $( ... ) templates are resolved + + Parameters + ---------- + path : str + the path to the yaml file + data : Dict[str, Any], optional + the context for jinja2 templating + Returns + ------- + Dict[str, Any] + the dict configuration + """ + jenv = Jinja(path, data) + yaml_file = jenv.render + yaml_dict = YAMLFile(data=yaml_file) + yaml_dict = Template.substitute_structure( + yaml_dict, TemplateConstants.DOLLAR_PARENTHESES, data.get) + + # If the input yaml file included other yamls with jinja2 templates, then we need to re-parse the jinja2 templates in them + jenv2 = Jinja(json.dumps(yaml_dict, indent=4), data) + yaml_file2 = jenv2.render + yaml_dict = YAMLFile(data=yaml_file2) + + return yaml_dict + + +def parse_yamltmpl(path: str, data: Dict = None) -> Dict[str, Any]: + """ + Description + ----------- + Load a simple templated yaml file and then resolve any templated variables defined as $( ... 
) + Parameters + ---------- + path : str + the path to the yaml file + data : Dict[str, Any], optional + the context for pygw.Template templating + Returns + ------- + Dict[str, Any] + the dict configuration + """ + yaml_dict = YAMLFile(path=path) + if data is not None: + yaml_dict = Template.substitute_structure(yaml_dict, TemplateConstants.DOLLAR_PARENTHESES, data.get) + + return yaml_dict diff --git a/ush/python/pygw/src/tests/__init__.py b/ush/python/pygw/src/tests/__init__.py new file mode 100644 index 00000000000..e69de29bb2d diff --git a/ush/python/pygw/src/tests/test_configuration.py b/ush/python/pygw/src/tests/test_configuration.py new file mode 100644 index 00000000000..e83c2755b8d --- /dev/null +++ b/ush/python/pygw/src/tests/test_configuration.py @@ -0,0 +1,172 @@ +import os +import pytest +from datetime import datetime + +from pygw.configuration import Configuration, cast_as_dtype + +file0 = """#!/bin/bash +export SOME_ENVVAR1="${USER}" +export SOME_LOCALVAR1="myvar1" +export SOME_LOCALVAR2="myvar2.0" +export SOME_LOCALVAR3="myvar3_file0" +export SOME_PATH1="/path/to/some/directory" +export SOME_PATH2="/path/to/some/file" +export SOME_DATE1="20221225" +export SOME_DATE2="2022122518" +export SOME_DATE3="202212251845" +export SOME_INT1=3 +export SOME_INT2=15 +export SOME_INT3=-999 +export SOME_FLOAT1=0.2 +export SOME_FLOAT2=3.5 +export SOME_FLOAT3=-9999. +export SOME_BOOL1=YES +export SOME_BOOL2=.true. +export SOME_BOOL3=.T. +export SOME_BOOL4=NO +export SOME_BOOL5=.false. +export SOME_BOOL6=.F. +""" + +file1 = """#!/bin/bash +export SOME_LOCALVAR3="myvar3_file1" +export SOME_LOCALVAR4="myvar4" +export SOME_BOOL7=.TRUE. 
+""" + +file0_dict = { + 'SOME_ENVVAR1': os.environ['USER'], + 'SOME_LOCALVAR1': "myvar1", + 'SOME_LOCALVAR2': "myvar2.0", + 'SOME_LOCALVAR3': "myvar3_file0", + 'SOME_PATH1': "/path/to/some/directory", + 'SOME_PATH2': "/path/to/some/file", + 'SOME_DATE1': datetime(2022, 12, 25, 0, 0, 0), + 'SOME_DATE2': datetime(2022, 12, 25, 18, 0, 0), + 'SOME_DATE3': datetime(2022, 12, 25, 18, 45, 0), + 'SOME_INT1': 3, + 'SOME_INT2': 15, + 'SOME_INT3': -999, + 'SOME_FLOAT1': 0.2, + 'SOME_FLOAT2': 3.5, + 'SOME_FLOAT3': -9999., + 'SOME_BOOL1': True, + 'SOME_BOOL2': True, + 'SOME_BOOL3': True, + 'SOME_BOOL4': False, + 'SOME_BOOL5': False, + 'SOME_BOOL6': False +} + +file1_dict = { + 'SOME_LOCALVAR3': "myvar3_file1", + 'SOME_LOCALVAR4': "myvar4", + 'SOME_BOOL7': True +} + +str_dtypes = [ + ('HOME', 'HOME'), +] + +int_dtypes = [ + ('1', 1), +] + +float_dtypes = [ + ('1.0', 1.0), +] + +bool_dtypes = [ + ('y', True), ('n', False), + ('Y', True), ('N', False), + ('yes', True), ('no', False), + ('Yes', True), ('No', False), + ('YES', True), ('NO', False), + ('t', True), ('f', False), + ('T', True), ('F', False), + ('true', True), ('false', False), + ('True', True), ('False', False), + ('TRUE', True), ('FALSE', False), + ('.t.', True), ('.f.', False), + ('.T.', True), ('.F.', False), +] + +datetime_dtypes = [ + ('20221215', datetime(2022, 12, 15, 0, 0, 0)), + ('2022121518', datetime(2022, 12, 15, 18, 0, 0)), + ('2022121518Z', datetime(2022, 12, 15, 18, 0, 0)), + ('20221215T1830', datetime(2022, 12, 15, 18, 30, 0)), + ('20221215T1830Z', datetime(2022, 12, 15, 18, 30, 0)), +] + + +def evaluate(dtypes): + for pair in dtypes: + print(f"Test: '{pair[0]}' ==> {pair[1]}") + assert pair[1] == cast_as_dtype(pair[0]) + + +def test_cast_as_dtype_str(): + evaluate(str_dtypes) + + +def test_cast_as_dtype_int(): + evaluate(int_dtypes) + + +def test_cast_as_dtype_float(): + evaluate(float_dtypes) + + +def test_cast_as_dtype_bool(): + evaluate(bool_dtypes) + + +def test_cast_as_dtype_datetimes(): + 
evaluate(datetime_dtypes) + + +@pytest.fixture +def create_configs(tmp_path): + + file_path = tmp_path / 'config.file0' + with open(file_path, 'w') as fh: + fh.write(file0) + + file_path = tmp_path / 'config.file1' + with open(file_path, 'w') as fh: + fh.write(file1) + + +def test_configuration_config_dir(tmp_path, create_configs): + cfg = Configuration(tmp_path) + assert cfg.config_dir == tmp_path + + +@pytest.mark.skip(reason="fails in GH runner, passes on localhost") +def test_configuration_config_files(tmp_path, create_configs): + cfg = Configuration(tmp_path) + config_files = [str(tmp_path / 'config.file0'), str(tmp_path / 'config.file1')] + assert config_files == cfg.config_files + + +def test_find_config(tmp_path, create_configs): + cfg = Configuration(tmp_path) + file0 = cfg.find_config('config.file0') + assert str(tmp_path / 'config.file0') == file0 + + +@pytest.mark.skip(reason="fails in GH runner, passes on localhost") +def test_parse_config1(tmp_path, create_configs): + cfg = Configuration(tmp_path) + f0 = cfg.parse_config('config.file0') + assert file0_dict == f0 + + +@pytest.mark.skip(reason="fails in GH runner, passes on localhost") +def test_parse_config2(tmp_path, create_configs): + cfg = Configuration(tmp_path) + ff = cfg.parse_config(['config.file0', 'config.file1']) + ff_dict = file0_dict.copy() + ff_dict.update(file1_dict) + assert ff_dict == ff diff --git a/ush/python/pygw/src/tests/test_exceptions.py b/ush/python/pygw/src/tests/test_exceptions.py new file mode 100644 index 00000000000..79f3e4f1ecd --- /dev/null +++ b/ush/python/pygw/src/tests/test_exceptions.py @@ -0,0 +1,35 @@ +import pytest + +from pygw.exceptions import WorkflowException + +# ---- + + +class TestError(WorkflowException): + """ + Description + ----------- + + This is the base-class for exceptions encountered within the + pygw/errors unit-tests module; it is a sub-class of Error. 
+ + """ + +# ---- + + +def test_errors() -> None: + """ + Description + ----------- + + This function provides a unit test for the errors module. + + """ + + # Raise the base-class exception. + with pytest.raises(Exception): + msg = "Testing exception raise." + raise TestError(msg=msg) + + assert True diff --git a/ush/python/pygw/src/tests/test_executable.py b/ush/python/pygw/src/tests/test_executable.py new file mode 100644 index 00000000000..4c0e584fab2 --- /dev/null +++ b/ush/python/pygw/src/tests/test_executable.py @@ -0,0 +1,60 @@ +import os +from pathlib import Path +import pytest +from pygw.executable import Executable, which, CommandNotFoundError + + +script = """#!/bin/bash +echo ${USER} +""" + + +def test_executable(tmp_path): + """ + Tests the class `Executable` + Parameters: + ----------- + tmp_path : Path + temporary path created by pytest + """ + whoami = os.environ['USER'] + + test_file = tmp_path / 'whoami.x' + Path(test_file).touch(mode=0o755) + with open(test_file, 'w') as fh: + fh.write(script) + + cmd = Executable(str(test_file)) + assert cmd.exe == [str(test_file)] + + stdout_file = tmp_path / 'stdout' + stderr_file = tmp_path / 'stderr' + cmd(output=str(stdout_file), error=str(stderr_file)) + with open(str(stdout_file)) as fh: + assert fh.read() == whoami + '\n' + + +def test_which(tmpdir): + """ + Tests the `which()` function. 
+ `which` should return `None` if the executable is not found + Parameters + ---------- + tmpdir : Path + path to a temporary directory created by pytest + """ + os.environ["PATH"] = str(tmpdir) + assert which('test.x') is None + + with pytest.raises(CommandNotFoundError): + which('test.x', required=True) + + path = str(tmpdir.join("test.x")) + + # create a test.x executable in the tmpdir + with tmpdir.as_cwd(): + Path('test.x').touch(mode=0o755) + + exe = which("test.x") + assert exe is not None + assert exe.path == path diff --git a/ush/python/pygw/src/tests/test_file_utils.py b/ush/python/pygw/src/tests/test_file_utils.py new file mode 100644 index 00000000000..684c76b650b --- /dev/null +++ b/ush/python/pygw/src/tests/test_file_utils.py @@ -0,0 +1,66 @@ +import os +from pygw.file_utils import FileHandler + + +def test_mkdir(tmp_path): + """ + Test for creating directories: + Parameters + ---------- + tmp_path - pytest fixture + """ + + dir_path = tmp_path / 'my_test_dir' + d1 = f'{dir_path}1' + d2 = f'{dir_path}2' + d3 = f'{dir_path}3' + + # Create config object for FileHandler + config = {'mkdir': [d1, d2, d3]} + + # Create d1, d2, d3 + FileHandler(config).sync() + + # Check if d1, d2, d3 were indeed created + for dd in config['mkdir']: + assert os.path.exists(dd) + + +def test_copy(tmp_path): + """ + Test for copying files: + Parameters + ---------- + tmp_path - pytest fixture + """ + + input_dir_path = tmp_path / 'my_input_dir' + + # Create the input directory + config = {'mkdir': [input_dir_path]} + FileHandler(config).sync() + + # Put empty files in input_dir_path + src_files = [input_dir_path / 'a.txt', input_dir_path / 'b.txt'] + for ff in src_files: + ff.touch() + + # Create output_dir_path and expected file names + output_dir_path = tmp_path / 'my_output_dir' + config = {'mkdir': [output_dir_path]} + FileHandler(config).sync() + dest_files = [output_dir_path / 'a.txt', output_dir_path / 'bb.txt'] + + copy_list = [] + for src, dest in zip(src_files, 
dest_files): + copy_list.append([src, dest]) + + # Create config object for FileHandler + config = {'copy': copy_list} + + # Copy input files to output files + FileHandler(config).sync() + + # Check if files were indeed copied + for ff in dest_files: + assert os.path.isfile(ff) diff --git a/ush/python/pygw/src/tests/test_jinja.py b/ush/python/pygw/src/tests/test_jinja.py new file mode 100644 index 00000000000..10749515abd --- /dev/null +++ b/ush/python/pygw/src/tests/test_jinja.py @@ -0,0 +1,37 @@ +import pytest + +from datetime import datetime +from pygw.jinja import Jinja +from pygw.timetools import to_isotime + +current_date = datetime.now() +j2tmpl = """Hello {{ name }}! {{ greeting }} It is: {{ current_date | to_isotime }}""" + + +@pytest.fixture +def create_template(tmp_path): + file_path = tmp_path / 'template.j2' + with open(file_path, 'w') as fh: + fh.write(j2tmpl) + + +def test_render_stream(): + data = {"name": "John"} + j = Jinja(j2tmpl, data, allow_missing=True) + assert j.render == "Hello John! {{ greeting }} It is: {{ current_date }}" + + data = {"name": "Jane", "greeting": "How are you?", "current_date": current_date} + j = Jinja(j2tmpl, data, allow_missing=False) + assert j.render == f"Hello Jane! How are you? It is: {to_isotime(current_date)}" + + +def test_render_file(tmp_path, create_template): + + file_path = tmp_path / 'template.j2' + data = {"name": "John"} + j = Jinja(str(file_path), data, allow_missing=True) + assert j.render == "Hello John! {{ greeting }} It is: {{ current_date }}" + + data = {"name": "Jane", "greeting": "How are you?", "current_date": current_date} + j = Jinja(str(file_path), data, allow_missing=False) + assert j.render == f"Hello Jane! How are you? 
It is: {to_isotime(current_date)}" diff --git a/ush/python/pygw/src/tests/test_logger.py b/ush/python/pygw/src/tests/test_logger.py new file mode 100644 index 00000000000..a9b4504d57b --- /dev/null +++ b/ush/python/pygw/src/tests/test_logger.py @@ -0,0 +1,67 @@ +from pygw.logger import Logger +from pygw.logger import logit + +level = 'debug' +number_of_log_msgs = 5 +reference = {'debug': "Logging test has started", + 'info': "Logging to 'logger.log' in the script dir", + 'warning': "This is my last warning, take heed", + 'error': "This is an error", + 'critical': "He's dead, She's dead. They are all dead!"} + + +def test_logger(tmp_path): + """Test log file""" + + logfile = tmp_path / "logger.log" + + try: + log = Logger('test_logger', level=level, logfile_path=logfile, colored_log=True) + log.debug(reference['debug']) + log.info(reference['info']) + log.warning(reference['warning']) + log.error(reference['error']) + log.critical(reference['critical']) + except Exception as e: + raise AssertionError(f'logging failed as {e}') + + # Make sure log to file created messages + try: + with open(logfile, 'r') as fh: + log_msgs = fh.readlines() + except Exception as e: + raise AssertionError(f'failed reading log file as {e}') + + # Ensure number of messages are same + log_msgs_in_logfile = len(log_msgs) + assert log_msgs_in_logfile == number_of_log_msgs + + # Ensure messages themselves are same + for count, line in enumerate(log_msgs): + lev = line.split('-')[3].strip().lower() + message = line.split(':')[-1].strip() + assert reference[lev] == message + + +def test_logit(tmp_path): + + logger = Logger('test_logit', level=level, colored_log=True) + + @logit(logger) + def add(x, y): + return x + y + + @logit(logger) + def usedict(n, j=0, k=1): + return n + j + k + + @logit(logger, 'example') + def spam(): + print('Spam!') + + add(2, 3) + usedict(2, 3) + usedict(2, k=3) + spam() + + assert True diff --git a/ush/python/pygw/src/tests/test_template.py 
b/ush/python/pygw/src/tests/test_template.py new file mode 100644 index 00000000000..f6d594b2d9d --- /dev/null +++ b/ush/python/pygw/src/tests/test_template.py @@ -0,0 +1,147 @@ +import os +from pygw.template import TemplateConstants, Template + + +def test_substitute_string_from_dict(): + """ + Substitute with ${v} + """ + template = '${greeting} to ${the_world}' + dictionary = { + 'greeting': 'Hello', + 'the_world': 'the world' + } + final = 'Hello to the world' + assert Template.substitute_structure(template, + TemplateConstants.DOLLAR_CURLY_BRACE, dictionary.get) == final + + +def test_substitute_string_from_dict_paren(): + """ + Substitute with $(v) + """ + template = '$(greeting) to $(the_world)' + dictionary = { + 'greeting': 'Hello', + 'the_world': 'the world' + } + final = 'Hello to the world' + assert Template.substitute_structure(template, + TemplateConstants.DOLLAR_PARENTHESES, dictionary.get) == final + + +def test_assign_string_from_dict_paren(): + """ + Substitute with $(v) should replace with the actual object + """ + template = '$(greeting)' + dictionary = { + 'greeting': { + 'a': 1, + 'b': 2 + } + } + assert Template.substitute_structure(template, + TemplateConstants.DOLLAR_PARENTHESES, + dictionary.get) == dictionary['greeting'] + + +def test_substitute_string_from_dict_double_curly(): + """ + Substitute with {{v}} + """ + template = '{{greeting}} to {{the_world}}' + dictionary = { + 'greeting': 'Hello', + 'the_world': 'the world' + } + final = 'Hello to the world' + assert Template.substitute_structure(template, + TemplateConstants.DOUBLE_CURLY_BRACES, + dictionary.get) == final + + +def test_substitute_string_from_dict_at_square(): + """ + Substitute with @[v] + """ + template = '@[greeting] to @[the_world]' + dictionary = { + 'greeting': 'Hello', + 'the_world': 'the world' + } + final = 'Hello to the world' + assert Template.substitute_structure(template, + TemplateConstants.AT_SQUARE_BRACES, + dictionary.get) == final + + +def 
test_substitute_string_from_dict_at_carrots(): + """ + Substitute with @<v> + """ + template = '@<greeting> to @<the_world>' + dictionary = { + 'greeting': 'Hello', + 'the_world': 'the world' + } + final = 'Hello to the world' + assert Template.substitute_structure(template, + TemplateConstants.AT_ANGLE_BRACKETS, + dictionary.get) == final + + +def test_substitute_string_from_environment(): + """ + Substitute from environment + """ + template = '${GREETING} to ${THE_WORLD}' + os.environ['GREETING'] = 'Hello' + os.environ['THE_WORLD'] = 'the world' + final = 'Hello to the world' + assert Template.substitute_structure_from_environment(template) == final + + +def test_substitute_with_dependencies(): + input = { + 'root': '/home/user', + 'config_file': 'config.yaml', + 'config': '$(root)/config/$(config_file)', + 'greeting': 'hello $(world)', + 'world': 'world', + 'complex': '$(dictionary)', + 'dictionary': { + 'a': 1, + 'b': 2 + }, + 'dd': {'2': 'a', '1': 'b'}, + 'ee': {'3': 'a', '1': 'b'}, + 'ff': {'4': 'a', '1': 'b $(greeting)'}, + 'host': { + 'name': 'xenon', + 'config': '$(root)/hosts', + 'config_file': '$(config)/$(name).config.yaml', + 'proxy2': { + 'config': '$(root)/$(name).$(greeting).yaml', + 'list': [['$(root)/$(name)', 'toto.$(name).$(greeting)'], '$(config_file)'] + } + } + } + output = {'complex': {'a': 1, 'b': 2}, + 'config': '/home/user/config/config.yaml', + 'config_file': 'config.yaml', + 'dd': {'1': 'b', '2': 'a'}, + 'dictionary': {'a': 1, 'b': 2}, + 'ee': {'1': 'b', '3': 'a'}, + 'ff': {'1': 'b hello world', '4': 'a'}, + 'greeting': 'hello world', + 'host': {'config': '/home/user/hosts', + 'config_file': '/home/user/config/config.yaml/xenon.config.yaml', + 'name': 'xenon', + 'proxy2': {'config': '/home/user/xenon.hello world.yaml', + 'list': [['/home/user/xenon', 'toto.xenon.hello world'], + 'config.yaml']}}, + 'root': '/home/user', + 'world': 'world'} + + assert Template.substitute_with_dependencies(input, input, TemplateConstants.DOLLAR_PARENTHESES) == output diff --git 
a/ush/python/pygw/src/tests/test_timetools.py b/ush/python/pygw/src/tests/test_timetools.py new file mode 100644 index 00000000000..592fd479054 --- /dev/null +++ b/ush/python/pygw/src/tests/test_timetools.py @@ -0,0 +1,76 @@ +from datetime import datetime, timedelta +from pygw.timetools import * + +current_date = datetime.now() + + +def test_to_datetime(): + + assert to_datetime('20220314') == datetime(2022, 3, 14) + assert to_datetime('2022031412') == datetime(2022, 3, 14, 12) + assert to_datetime('202203141230') == datetime(2022, 3, 14, 12, 30) + assert to_datetime('2022-03-14') == datetime(2022, 3, 14) + assert to_datetime('2022-03-14T12Z') == datetime(2022, 3, 14, 12) + assert to_datetime('2022-03-14T12:30Z') == datetime(2022, 3, 14, 12, 30) + assert to_datetime('2022-03-14T12:30:45') == datetime(2022, 3, 14, 12, 30, 45) + assert to_datetime('2022-03-14T12:30:45Z') == datetime(2022, 3, 14, 12, 30, 45) + + +def test_to_timedelta(): + assert to_timedelta('2d3H4M5S') == timedelta(days=2, hours=3, minutes=4, seconds=5) + assert to_timedelta('-3H15M') == timedelta(hours=-3, minutes=-15) + assert to_timedelta('1:30:45') == timedelta(hours=1, minutes=30, seconds=45) + assert to_timedelta('5 days, 12:30:15') == timedelta(days=5, hours=12, minutes=30, seconds=15) + + +def test_datetime_to_ymdh(): + assert datetime_to_YMDH(current_date) == current_date.strftime('%Y%m%d%H') + + +def test_datetime_to_ymd(): + assert datetime_to_YMD(current_date) == current_date.strftime('%Y%m%d') + + +def test_timedelta_to_hms(): + td = timedelta(hours=5, minutes=39, seconds=56) + assert timedelta_to_HMS(td) == '05:39:56' + td = timedelta(days=4, hours=5, minutes=39, seconds=56) + assert timedelta_to_HMS(td) == '101:39:56' + + +def test_strftime(): + assert strftime(current_date, '%Y%m%d') == current_date.strftime('%Y%m%d') + assert strftime(current_date, '%Y%m%d %H') == current_date.strftime('%Y%m%d %H') + + +def test_strptime(): + assert strptime(current_date.strftime('%Y%m%d'), 
'%Y%m%d') == \ + datetime.strptime(current_date.strftime('%Y%m%d'), '%Y%m%d') + + +def test_to_isotime(): + assert to_isotime(current_date) == current_date.strftime('%Y-%m-%dT%H:%M:%SZ') + + +def test_to_fv3time(): + assert to_fv3time(current_date) == current_date.strftime('%Y%m%d.%H%M%S') + + +def test_add_to_timedelta(): + assert add_to_timedelta(timedelta(days=1), timedelta(hours=3)) == \ + timedelta(days=1, hours=3) + assert add_to_timedelta(timedelta(hours=5, minutes=30), timedelta(minutes=15)) == \ + timedelta(hours=5, minutes=45) + assert add_to_timedelta(timedelta(seconds=45), timedelta(milliseconds=500)) == \ + timedelta(seconds=45, milliseconds=500) + + +def test_add_to_datetime(): + dt = datetime(2023, 3, 14, 12, 0, 0) + td = timedelta(days=1, hours=6) + negative_td = timedelta(days=-1, hours=-6) + zero_td = timedelta() + + assert add_to_datetime(dt, td) == datetime(2023, 3, 15, 18, 0, 0) + assert add_to_datetime(dt, negative_td) == datetime(2023, 3, 13, 6, 0, 0) + assert add_to_datetime(dt, zero_td) == datetime(2023, 3, 14, 12, 0, 0) diff --git a/ush/python/pygw/src/tests/test_yaml_file.py b/ush/python/pygw/src/tests/test_yaml_file.py new file mode 100644 index 00000000000..d01eb154b24 --- /dev/null +++ b/ush/python/pygw/src/tests/test_yaml_file.py @@ -0,0 +1,97 @@ +import os +import pytest +from datetime import datetime +from pygw.yaml_file import YAMLFile, parse_yamltmpl, parse_j2yaml, save_as_yaml, dump_as_yaml + +host_yaml = """ +host: + hostname: test_host + host_user: !ENV ${USER} +""" + +conf_yaml = """ +config: + config_file: !ENV ${TMP_PATH}/config.yaml + user: !ENV ${USER} + host_file: !INC ${TMP_PATH}/host.yaml +""" + +tmpl_yaml = """ +config: + config_file: !ENV ${TMP_PATH}/config.yaml + user: !ENV ${USER} + host_file: !INC ${TMP_PATH}/host.yaml +tmpl: + cdate: '{{PDY}}{{cyc}}' + homedir: /home/$(user) +""" +# Note the quotes ' ' around {{ }}. 
These quotes are necessary; without them YAML will fail to parse the {{ }} braces + +j2tmpl_yaml = """ +config: + config_file: !ENV ${TMP_PATH}/config.yaml + user: !ENV ${USER} + host_file: !INC ${TMP_PATH}/host.yaml +tmpl: + cdate: '{{ current_cycle | to_YMD }}{{ current_cycle | strftime('%H') }}' + homedir: /home/$(user) +""" + + +@pytest.fixture +def create_template(tmpdir): + """Create temporary templates for testing""" + tmpdir.join('host.yaml').write(host_yaml) + tmpdir.join('config.yaml').write(conf_yaml) + tmpdir.join('tmpl.yaml').write(tmpl_yaml) + tmpdir.join('j2tmpl.yaml').write(j2tmpl_yaml) + + +def test_yaml_file(tmp_path, create_template): + + # Set env. variable + os.environ['TMP_PATH'] = str(tmp_path) + conf = YAMLFile(path=str(tmp_path / 'config.yaml')) + + # Write out yaml file + yaml_out = tmp_path / 'config_output.yaml' + conf.save(yaml_out) + + # Read in the yaml file and compare w/ conf + yaml_in = YAMLFile(path=str(yaml_out)) + + assert yaml_in == conf + + +def test_yaml_file_with_templates(tmp_path, create_template): + + # Set env. variable + os.environ['TMP_PATH'] = str(tmp_path) + data = {'user': os.environ['USER']} + conf = parse_yamltmpl(path=str(tmp_path / 'tmpl.yaml'), data=data) + + # Write out yaml file + yaml_out = tmp_path / 'tmpl_output.yaml' + save_as_yaml(conf, yaml_out) + + # Read in the yaml file and compare w/ conf + yaml_in = YAMLFile(path=yaml_out) + + assert yaml_in == conf + + +def test_yaml_file_with_j2templates(tmp_path, create_template): + + # Set env. 
variable + os.environ['TMP_PATH'] = str(tmp_path) + data = {'user': os.environ['USER'], 'current_cycle': datetime.now()} + conf = parse_j2yaml(path=str(tmp_path / 'j2tmpl.yaml'), data=data) + + # Write out yaml file + yaml_out = tmp_path / 'j2tmpl_output.yaml' + save_as_yaml(conf, yaml_out) + + # Read in the yaml file and compare w/ conf + yaml_in = YAMLFile(path=yaml_out) + + assert yaml_in == conf diff --git a/ush/radmon_diag_ck.sh b/ush/radmon_diag_ck.sh new file mode 100755 index 00000000000..142e99f8c78 --- /dev/null +++ b/ush/radmon_diag_ck.sh @@ -0,0 +1,175 @@ +#!/bin/bash + +#---------------------------------------------------------------- +# Check the contents of the radstat file and compare to +# the ${run}_radmon_satype.txt file. Report any missing +# or zero sized diag files. +# + + function usage { + echo "Usage: radmon_diag_ck.sh --rad radstat --sat satype --out output " + echo "" + echo " -r,--rad radstat file (required)" + echo " File name or path to radstat file." + echo "" + echo " -s,--sat satype file (required)" + echo " File name or path to satype file." + echo "" + echo " -o,--out output file name (required)" + echo " File name for missing diag file report." 
+ } + + +echo "--> radmon_diag_ck.sh" + + +#-------------------------- +# Process input arguments +# + nargs=$# + if [[ $nargs -ne 6 ]]; then + usage + exit 1 + fi + + while [[ $# -ge 1 ]] + do + key="$1" + echo $key + + case $key in + -r|--rad) + radstat_file="$2" + shift # past argument + ;; + -s|--sat) + satype_file="$2" + shift # past argument + ;; + -o|--out) + output_file="$2" + shift # past argument + ;; + *) + #unspecified key + echo " unsupported key = $key" + ;; + esac + + shift + done + +# set -ax + + echo " radstat_file = ${radstat_file}" + echo " satype_file = ${satype_file}" + echo " output_file = ${output_file}" + + missing_diag="" + zero_len_diag="" + + #--------------------------------------------- + # get list of diag files in the radstat file + # + radstat_contents=`tar -tf ${radstat_file} | grep '_ges' | + gawk -F"diag_" '{print $2}' | + gawk -F"_ges" '{print $1}'` + + + #--------------------------------------------- + # load contents of satype_file into an array + # + satype_contents=`cat ${satype_file}` + + + #------------------------------------------------- + # compare $satype_contents and $radstat_contents + # report anything missing + # + for sat in $satype_contents; do + test=`echo $radstat_contents | grep $sat` + + if [[ ${#test} -le 0 ]]; then + missing_diag="${missing_diag} ${sat}" + fi + + done + + echo "" + echo "missing_diag = ${missing_diag}" + echo "" + + + #--------------------------------------------------------- + # Check for zero sized diag files. The diag files in + # the radstat file (which is a tar file) are gzipped. + # I find that a 0 sized, gzipped file has a size of ~52 + # (I assume that's for header and block size). + # + # So for this check we'll assume anything in the radstat + # file with a size of < 1000 bytes is suspect. (That's + # overkill, 100 is probably sufficient, but I'm the + # nervous type.) So we'll extract, uncompress, and check + # the actual file size of those. 
Anything with an + # uncompressed size of 0 goes on the zero_len_diag list. + # + verbose_contents=`tar -tvf ${radstat_file} | grep '_ges'` + + + #------------------------------------------------------- + # note: need to reset the IFS to line breaks otherwise + # the $vc value in the for loop below will break + # on all white space, not the line break. + SAVEIFS=$IFS + IFS=$(echo -en "\n\b") + + + for vc in ${verbose_contents}; do + + gzip_len=`echo ${vc} | gawk '{print $3}'` + + if [[ ${gzip_len} -le 1000 ]]; then + test_file=`echo ${vc} | gawk '{print $6}'` + tar -xf ${radstat_file} ${test_file} + + gunzip ${test_file} + unzipped_file=`echo ${test_file%.*}` + + uz_file_size=`ls -la ${unzipped_file} | gawk '{print $5}'` + + if [[ ${uz_file_size} -le 0 ]]; then + sat=`echo ${unzipped_file} | gawk -F"diag_" '{print $2}' | + gawk -F"_ges" '{print $1}'` + + zero_len_diag="${zero_len_diag} ${sat}" + fi + + rm -f ${unzipped_file} + fi + done + + IFS=${SAVEIFS} # reset IFS to default (white space) + + echo "" + echo "zero_len_diag = ${zero_len_diag}" + echo "" + + + #----------------------------------------- + # Write results to $output_file + # + if [[ ${#zero_len_diag} -gt 0 ]]; then + for zld in ${zero_len_diag}; do + echo " Zero Length diagnostic file: $zld" >> $output_file + done + fi + + if [[ ${#missing_diag} -gt 0 ]]; then + for md in ${missing_diag}; do + echo " Missing diagnostic file : $md" >> $output_file + done + fi + + +echo "<-- radmon_diag_ck.sh" +exit diff --git a/ush/radmon_err_rpt.sh b/ush/radmon_err_rpt.sh new file mode 100755 index 00000000000..8561563d48f --- /dev/null +++ b/ush/radmon_err_rpt.sh @@ -0,0 +1,194 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: radmon_err_rpt.sh +# Script description: Compare the contents of error files from two different +# cycles. 
+# +# Author: Ed Safford Org: NP23 Date: 2012-02-02 +# +# Abstract: This script compares the contents of two error files from two different +# sets of radiance diagnostic files (which are an output from GSI runs). +# All unique satellite instrument/channel/region combinations that appear +# in both files are reported. +# +# This script is run as a child script of radmon_verf_time.sh. The parent +# script creates/copies the error files into a temporary working +# directory before invoking this script. +# +# +# Usage: radmon_err_rpt.sh file1 file2 type cycle1 cycle2 diag_rpt outfile +# +# Input script positional parameters: +# file1 obs, penalty, or channel error file +# required +# file2 obs, penalty, or channel error file +# required +# type type of error file +# choices are obs, pen, chan, or cnt; required +# cycle1 first cycle processing date +# yyyymmddcc format; required +# cycle2 second cycle processing date +# yyyymmddcc format; required +# diag_rpt diagnostic report text file +# required +# outfile output file name +# required +# +# Remarks: +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +#################################################################### + +# Command line arguments. 
+file1=${1:-${file1:?}} +file2=${2:-${file2:?}} +type=${3:-${type:?}} +cycle1=${4:-${cycle1:?}} +cycle2=${5:-${cycle2:?}} +diag_rpt=${6:-${diag_rpt:?}} +outfile=${7:-${outfile:?}} + +# Directories +HOMEradmon=${HOMEradmon:-$(pwd)} + +# Other variables +err=0 +RADMON_SUFFIX=${RADMON_SUFFIX} + +have_diag_rpt=0 +if [[ -s $diag_rpt ]]; then + have_diag_rpt=1 +else + err=1 +fi +echo "have_diag_rpt = $have_diag_rpt" + +#----------------------------------------------------------------------------- +# read each line in the $file1 +# search $file2 for the same satname, channel, and region +# if same combination is in both files, add the values to the output file +# +{ while read myline; do + echo "myline = $myline" + bound="" + + echo $myline + satname=$(echo $myline | gawk '{print $1}') + channel=$(echo $myline | gawk '{print $3}') + region=$(echo $myline | gawk '{print $5}') + value1=$(echo $myline | gawk '{print $7}') + bound=$(echo $myline | gawk '{print $9}') + +# +# Check findings against diag_report. If the satellite/instrument is on the +# diagnostic report it means the diagnostic file for the +# satellite/instrument is missing for this cycle, so skip any additional +# error checking for that source. Otherwise, evaluate as per normal. 
+# + + diag_match="" + diag_match_len=0 + + if [[ $have_diag_rpt == 1 ]]; then + diag_match=$(gawk "/$satname/" $diag_rpt) + diag_match_len=$(echo ${#diag_match}) + fi + + + if [[ $diag_match_len == 0 ]]; then + + if [[ $type == "chan" ]]; then + echo "looking for match for $satname and $channel" + { while read myline2; do + satname2=$(echo $myline2 | gawk '{print $1}') + channel2=$(echo $myline2 | gawk '{print $3}') + + if [[ $satname == $satname2 && $channel == $channel2 ]]; then + match="$satname channel= $channel" + echo "match from gawk = $match" + break; + else + match="" + fi + + done } < $file2 + + + else + match=$(gawk "/$satname/ && /channel= $channel / && /region= $region /" $file2) + echo match = $match + + match_len=$(echo ${#match}) + if [[ $match_len > 0 ]]; then + channel2=$(echo $match | gawk '{print $3}') + + if [[ $channel2 != $channel ]]; then + match="" + fi + fi + + fi + match_len=$(echo ${#match}) + + if [[ $match_len > 0 ]]; then + + value2=$(echo $match | gawk '{print $7}') + bound2=$(echo $match | gawk '{print $9}') + + if [[ $type == "chan" ]]; then + tmpa=" $satname channel= $channel" + tmpb="" + + elif [[ $type == "pen" ]]; then + tmpa="$satname channel= $channel region= $region" + tmpb="$cycle1 $value1 $bound" + + elif [[ $type == "cnt" ]]; then + tmpa="$satname channel= $channel region= $region" + tmpb="$cycle1 $value1 $bound" + + else + tmpa="$satname channel= $channel region= $region" + tmpb="$cycle1: $type= $value1" + fi + + line1="$tmpa $tmpb" + echo "$line1" >> $outfile + + if [[ $type != "chan" ]]; then + tmpc=$(echo $tmpa |sed 's/[a-z]/ /g' | sed 's/[0-9]/ /g' | sed 's/=/ /g' | sed 's/_/ /g' | sed 's/-/ /g') + + if [[ $type == "pen" || $type == "cnt" ]]; then + line2=" $tmpc $cycle2 $value2 $bound2" + else + line2=" $tmpc $cycle2: $type= $value2" + fi + + echo "$line2" >> $outfile + fi + + #----------------------------------------- + # add hyperlink to warning entry + # + line3=" 
http://www.emc.ncep.noaa.gov/gmb/gdas/radiance/es_rad/${RADMON_SUFFIX}/index.html?sat=${satname}&region=${region}&channel=${channel}&stat=${type}" + if [[ $channel -gt 0 ]]; then + echo "$line3" >> $outfile + echo "" >> $outfile + fi + fi + fi +done } < $file1 + + +################################################################################ +# Post processing + +exit ${err} + diff --git a/ush/radmon_verf_angle.sh b/ush/radmon_verf_angle.sh new file mode 100755 index 00000000000..b2dab0825ab --- /dev/null +++ b/ush/radmon_verf_angle.sh @@ -0,0 +1,235 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: radmon_verf_angle.sh +# Script description: Extract angle dependent data from radiance +# diagnostic files. +# +# Author: Ed Safford Org: NP23 Date: 2012-02-02 +# +# Abstract: This script extracts angle dependent data from radiance +# diagnostic files (which are an output from GSI runs), +# storing the extracted data in small binary files. +# +# This script is a child script of exgdas_vrfyrad.sh.sms. The parent +# script opens and uncompresses the radiance diagnostic file and copies +# other supporting files into a temporary working directory. 
+ +# +# Usage: radmon_verf_angle.sh PDATE +# +# Input script positional parameters: +# PDATE processing date +# yyyymmddcc format; required +# +# Imported Shell Variables: +# RADMON_SUFFIX data source suffix +# defaults to opr +# EXECradmon executable directory +# defaults to current directory +# RAD_AREA global or regional flag +# defaults to global +# TANKverf_rad data repository +# defaults to current directory +# SATYPE list of satellite/instrument sources +# defaults to none +# VERBOSE Verbose flag (YES or NO) +# defaults to NO +# LITTLE_ENDIAN flag to indicate LE machine +# defaults to 0 (big endian) +# USE_ANL use analysis files as inputs in addition to +# the ges files. Default is 0 (ges only) +# +# Modules and files referenced: +# scripts : +# +# programs : $NCP +# $angle_exec +# +# fixed data : $scaninfo +# +# input data : $data_file +# +# output data: $angle_file +# $angle_ctl +# $pgmout +# +# Remarks: +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +#################################################################### + +# Command line arguments. +RAD_AREA=${RAD_AREA:-glb} +REGIONAL_RR=${REGIONAL_RR:-0} # rapid refresh model flag +rgnHH=${rgnHH:-} +rgnTM=${rgnTM:-} + +export PDATE=${1:-${PDATE:?}} + +echo " REGIONAL_RR, rgnHH, rgnTM = $REGIONAL_RR, $rgnHH, $rgnTM" +netcdf_boolean=".false." +if [[ $RADMON_NETCDF -eq 1 ]]; then + netcdf_boolean=".true." 
+fi +echo " RADMON_NETCDF, netcdf_boolean = ${RADMON_NETCDF}, $netcdf_boolean" + +which prep_step +which startmsg + +# Directories +FIXgdas=${FIXgdas:-$(pwd)} +EXECradmon=${EXECradmon:-$(pwd)} +TANKverf_rad=${TANKverf_rad:-$(pwd)} + +# File names +export pgmout=${pgmout:-${jlogfile}} +touch $pgmout + +# Other variables +SATYPE=${SATYPE:-} +VERBOSE=${VERBOSE:-NO} +LITTLE_ENDIAN=${LITTLE_ENDIAN:-0} +USE_ANL=${USE_ANL:-0} + + +if [[ $USE_ANL -eq 1 ]]; then + gesanl="ges anl" +else + gesanl="ges" +fi + +err=0 +angle_exec=radmon_angle.x +shared_scaninfo=${shared_scaninfo:-$FIXgdas/gdas_radmon_scaninfo.txt} +scaninfo=scaninfo.txt + +#-------------------------------------------------------------------- +# Copy extraction program and supporting files to working directory + +$NCP ${EXECradmon}/${angle_exec} ./ +$NCP $shared_scaninfo ./${scaninfo} + +if [[ ! -s ./${angle_exec} || ! -s ./${scaninfo} ]]; then + err=2 +else +#-------------------------------------------------------------------- +# Run program for given time + + export pgm=${angle_exec} + + iyy=$(echo $PDATE | cut -c1-4) + imm=$(echo $PDATE | cut -c5-6) + idd=$(echo $PDATE | cut -c7-8) + ihh=$(echo $PDATE | cut -c9-10) + + ctr=0 + fail=0 + touch "./errfile" + + for type in ${SATYPE}; do + + if [[ ! 
-s ${type} ]]; then + echo "ZERO SIZED: ${type}" + continue + fi + + for dtype in ${gesanl}; do + + echo "pgm = $pgm" + echo "pgmout = $pgmout" + prep_step + + ctr=$(expr $ctr + 1) + + if [[ $dtype == "anl" ]]; then + data_file=${type}_anl.${PDATE}.ieee_d + ctl_file=${type}_anl.ctl + angl_ctl=angle.${ctl_file} + else + data_file=${type}.${PDATE}.ieee_d + ctl_file=${type}.ctl + angl_ctl=angle.${ctl_file} + fi + + angl_file="" + if [[ $REGIONAL_RR -eq 1 ]]; then + angl_file=${rgnHH}.${data_file}.${rgnTM} + fi + + + if [[ -f input ]]; then rm input; fi + + nchanl=-999 +cat << EOF > input + &INPUT + satname='${type}', + iyy=${iyy}, + imm=${imm}, + idd=${idd}, + ihh=${ihh}, + idhh=-720, + incr=${CYCLE_INTERVAL}, + nchanl=${nchanl}, + suffix='${RADMON_SUFFIX}', + gesanl='${dtype}', + little_endian=${LITTLE_ENDIAN}, + rad_area='${RAD_AREA}', + netcdf=${netcdf_boolean}, + / +EOF + + startmsg + ./${angle_exec} < input >> ${pgmout} 2>>errfile + export err=$?; err_chk + if [[ $err -ne 0 ]]; then + fail=$(expr $fail + 1) + fi + + if [[ -s ${angl_file} ]]; then + ${COMPRESS} -f ${angl_file} + fi + + if [[ -s ${angl_ctl} ]]; then + ${COMPRESS} -f ${angl_ctl} + fi + + + done # for dtype in ${gesanl} loop + + done # for type in ${SATYPE} loop + + + ${USHradmon}/rstprod.sh + + tar_file=radmon_angle.tar + if compgen -G "angle*.ieee_d*" > /dev/null || compgen -G "angle*.ctl*" > /dev/null; then + tar -cf $tar_file angle*.ieee_d* angle*.ctl* + ${COMPRESS} ${tar_file} + mv $tar_file.${Z} ${TANKverf_rad}/. 
+ + if [[ $RAD_AREA = "rgn" ]]; then + cwd=$(pwd) + cd ${TANKverf_rad} + tar -xf ${tar_file}.${Z} + rm ${tar_file}.${Z} + cd ${cwd} + fi + fi + + if [[ $ctr -gt 0 && $fail -eq $ctr || $fail -gt $ctr ]]; then + err=3 + fi +fi + +################################################################################ +# Post processing + +exit ${err} diff --git a/ush/radmon_verf_bcoef.sh b/ush/radmon_verf_bcoef.sh new file mode 100755 index 00000000000..374c8db7b28 --- /dev/null +++ b/ush/radmon_verf_bcoef.sh @@ -0,0 +1,233 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: radmon_verf_bcoef.sh +# Script description: Extract bias correction coefficients data from radiance +# diagnostic files. +# +# Author: Ed Safford Org: NP23 Date: 2012-02-02 +# +# Abstract: This script extracts bias correction coefficient related data from +# radiance diagnostic files (which are an output from GSI runs), +# storing the extracted data in small binary files. +# +# This script is a child script of exgdas_vrfyrad.sh.sms. The parent +# script opens and uncompresses the radiance diagnostic file and copies +# other supporting files into a temporary working directory. 
+# + +# Usage: radmon_verf_bcoef.sh PDATE + +# Input script positional parameters: +# PDATE processing date +# yyyymmddcc format; required +# +# Imported Shell Variables: +# RADMON_SUFFIX data source suffix +# defaults to opr +# EXECradmon executable directory +# defaults to current directory +# FIXgdas fixed data directory +# defaults to current directory +# RAD_AREA global or regional flag +# defaults to global +# TANKverf_rad data repository +# defaults to current directory +# SATYPE list of satellite/instrument sources +# defaults to none +# LITTLE_ENDIAN flag for little endian machine +# defaults to 0 (big endian) +# USE_ANL use analysis files as inputs in addition to +# the ges files. Default is 0 (ges only) +# +# Modules and files referenced: +# scripts : +# +# programs : $NCP +# $bcoef_exec +# +# fixed data : $biascr +# +# input data : $data_file +# +# output data: $bcoef_file +# $bcoef_ctl +# $pgmout +# +# Remarks: +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +#################################################################### +# Command line arguments. +export PDATE=${1:-${PDATE:?}} + +netcdf_boolean=".false." +if [[ $RADMON_NETCDF -eq 1 ]]; then + netcdf_boolean=".true." 
+fi +echo " RADMON_NETCDF, netcdf_boolean = ${RADMON_NETCDF}, $netcdf_boolean" + +# Directories +FIXgdas=${FIXgdas:-$(pwd)} +EXECradmon=${EXECradmon:-$(pwd)} +TANKverf_rad=${TANKverf_rad:-$(pwd)} + +# File names +pgmout=${pgmout:-${jlogfile}} +touch $pgmout + +# Other variables +RAD_AREA=${RAD_AREA:-glb} +REGIONAL_RR=${REGIONAL_RR:-0} +rgnHH=${rgnHH:-} +rgnTM=${rgnTM:-} +SATYPE=${SATYPE:-} +LITTLE_ENDIAN=${LITTLE_ENDIAN:-0} +USE_ANL=${USE_ANL:-0} + + +err=0 +bcoef_exec=radmon_bcoef.x + +if [[ $USE_ANL -eq 1 ]]; then + gesanl="ges anl" +else + gesanl="ges" +fi + +#-------------------------------------------------------------------- +# Copy extraction program and supporting files to working directory + +$NCP $EXECradmon/${bcoef_exec} ./${bcoef_exec} +$NCP ${biascr} ./biascr.txt + +if [[ ! -s ./${bcoef_exec} || ! -s ./biascr.txt ]]; then + err=4 +else + + +#-------------------------------------------------------------------- +# Run program for given time + + export pgm=${bcoef_exec} + + iyy=$(echo $PDATE | cut -c1-4) + imm=$(echo $PDATE | cut -c5-6) + idd=$(echo $PDATE | cut -c7-8) + ihh=$(echo $PDATE | cut -c9-10) + + ctr=0 + fail=0 + + nchanl=-999 + npredr=5 + + for type in ${SATYPE}; do + + if [[ ! 
-s ${type} ]]; then + echo "ZERO SIZED: ${type}" + continue + fi + + for dtype in ${gesanl}; do + + prep_step + + ctr=$(expr $ctr + 1) + + if [[ $dtype == "anl" ]]; then + data_file=${type}_anl.${PDATE}.ieee_d + ctl_file=${type}_anl.ctl + bcoef_ctl=bcoef.${ctl_file} + else + data_file=${type}.${PDATE}.ieee_d + ctl_file=${type}.ctl + bcoef_ctl=bcoef.${ctl_file} + fi + + if [[ $REGIONAL_RR -eq 1 ]]; then + bcoef_file=${rgnHH}.bcoef.${data_file}.${rgnTM} + else + bcoef_file=bcoef.${data_file} + fi + + + if [[ -f input ]]; then rm input; fi + + +cat << EOF > input + &INPUT + satname='${type}', + npredr=${npredr}, + nchanl=${nchanl}, + iyy=${iyy}, + imm=${imm}, + idd=${idd}, + ihh=${ihh}, + idhh=-720, + incr=${CYCLE_INTERVAL}, + suffix='${RADMON_SUFFIX}', + gesanl='${dtype}', + little_endian=${LITTLE_ENDIAN}, + netcdf=${netcdf_boolean}, + / +EOF + startmsg + ./${bcoef_exec} < input >>${pgmout} 2>>errfile + export err=$?; err_chk + if [[ $err -ne 0 ]]; then + fail=$(expr $fail + 1) + fi + + +#------------------------------------------------------------------- +# move data, control, and stdout files to $TANKverf_rad and compress +# + + if [[ -s ${bcoef_file} ]]; then + ${COMPRESS} ${bcoef_file} + fi + + if [[ -s ${bcoef_ctl} ]]; then + ${COMPRESS} ${bcoef_ctl} + fi + + + done # dtype in $gesanl loop + done # type in $SATYPE loop + + + ${USHradmon}/rstprod.sh + + if compgen -G "bcoef*.ieee_d*" > /dev/null || compgen -G "bcoef*.ctl*" > /dev/null; then + tar_file=radmon_bcoef.tar + tar -cf $tar_file bcoef*.ieee_d* bcoef*.ctl* + ${COMPRESS} ${tar_file} + mv $tar_file.${Z} ${TANKverf_rad} + + if [[ $RAD_AREA = "rgn" ]]; then + cwd=$(pwd) + cd ${TANKverf_rad} + tar -xf ${tar_file}.${Z} + rm ${tar_file}.${Z} + cd ${cwd} + fi + fi + + if [[ $ctr -gt 0 && $fail -eq $ctr || $fail -gt $ctr ]]; then + err=5 + fi +fi + + +################################################################################ +# Post processing + +exit ${err} diff --git a/ush/radmon_verf_bcor.sh 
b/ush/radmon_verf_bcor.sh new file mode 100755 index 00000000000..3e267f018cd --- /dev/null +++ b/ush/radmon_verf_bcor.sh @@ -0,0 +1,226 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: radmon_verf_bcor.sh +# Script description: Extract bias correction data from radiance diagnostic +# files. +# +# Author: Ed Safford Org: NP23 Date: 2012-02-02 +# +# Abstract: This script extracts bias correction related data from radiance +# diagnostic files (which are an output from GSI runs), storing the +# extracted data in small binary files. +# +# This script is a child script of exgdas_vrfyrad.sh.sms. The parent +# script opens and uncompresses the radiance diagnostic file and copies +# other supporting files into a temporary working directory. +# +# +# Usage: radmon_verf_bcor.sh PDATE +# +# Input script positional parameters: +# PDATE processing date +# yyyymmddcc format; required +# +# Imported Shell Variables: +# RADMON_SUFFIX data source suffix +# defaults to opr +# EXECradmon executable directory +# defaults to current directory +# RAD_AREA global or regional flag +# defaults to global +# TANKverf_rad data repository +# defaults to current directory +# SATYPE list of satellite/instrument sources +# defaults to none +# LITTLE_ENDIAN flag for little endian machine +# defaults to 0 (big endian) +# USE_ANL use analysis files as inputs in addition to +# the ges files. Default is 0 (ges only) +# +# Modules and files referenced: +# scripts : +# +# programs : $NCP +# $bcor_exec +# +# fixed data : none +# +# input data : $data_file +# +# output data: $bcor_file +# $bcor_ctl +# $pgmout +# +# Remarks: +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +#################################################################### + +# Command line arguments. 
+export PDATE=${1:-${PDATE:?}} + +# Directories +EXECradmon=${EXECradmon:-$(pwd)} +TANKverf_rad=${TANKverf_rad:-$(pwd)} + +# File names +pgmout=${pgmout:-${jlogfile}} +touch $pgmout + +# Other variables +RAD_AREA=${RAD_AREA:-glb} +SATYPE=${SATYPE:-} +LITTLE_ENDIAN=${LITTLE_ENDIAN:-0} +USE_ANL=${USE_ANL:-0} + +bcor_exec=radmon_bcor.x +err=0 + +netcdf_boolean=".false." +if [[ $RADMON_NETCDF -eq 1 ]]; then + netcdf_boolean=".true." +fi + +if [[ $USE_ANL -eq 1 ]]; then + gesanl="ges anl" +else + gesanl="ges" +fi + + +#-------------------------------------------------------------------- +# Copy extraction program to working directory + +$NCP ${EXECradmon}/${bcor_exec} ./${bcor_exec} + +if [[ ! -s ./${bcor_exec} ]]; then + err=6 +else + + +#-------------------------------------------------------------------- +# Run program for given time + + export pgm=${bcor_exec} + + iyy=$(echo $PDATE | cut -c1-4) + imm=$(echo $PDATE | cut -c5-6) + idd=$(echo $PDATE | cut -c7-8) + ihh=$(echo $PDATE | cut -c9-10) + + ctr=0 + fail=0 + touch "./errfile" + + for type in ${SATYPE}; do + + for dtype in ${gesanl}; do + + prep_step + + ctr=$(expr $ctr + 1) + + if [[ $dtype == "anl" ]]; then + data_file=${type}_anl.${PDATE}.ieee_d + bcor_file=bcor.${data_file} + ctl_file=${type}_anl.ctl + bcor_ctl=bcor.${ctl_file} + stdout_file=stdout.${type}_anl + bcor_stdout=bcor.${stdout_file} + input_file=${type}_anl + else + data_file=${type}.${PDATE}.ieee_d + bcor_file=bcor.${data_file} + ctl_file=${type}.ctl + bcor_ctl=bcor.${ctl_file} + stdout_file=stdout.${type} + bcor_stdout=bcor.${stdout_file} + input_file=${type} + fi + + if [[ -f input ]]; then rm input; fi + + # Check for 0 length input file here and avoid running + # the executable if $input_file doesn't exist or is 0 bytes + # + if [[ -s $input_file ]]; then + nchanl=-999 + +cat << EOF > input + &INPUT + satname='${type}', + iyy=${iyy}, + imm=${imm}, + idd=${idd}, + ihh=${ihh}, + idhh=-720, + incr=6, + nchanl=${nchanl}, + 
suffix='${RADMON_SUFFIX}', + gesanl='${dtype}', + little_endian=${LITTLE_ENDIAN}, + rad_area='${RAD_AREA}', + netcdf=${netcdf_boolean}, + / +EOF + + startmsg + ./${bcor_exec} < input >> ${pgmout} 2>>errfile + export err=$?; err_chk + if [[ $err -ne 0 ]]; then + fail=$(expr $fail + 1) + fi + + +#------------------------------------------------------------------- +# move data, control, and stdout files to $TANKverf_rad and compress +# + + if [[ -s ${bcor_file} ]]; then + ${COMPRESS} ${bcor_file} + fi + + if [[ -s ${bcor_ctl} ]]; then + ${COMPRESS} ${bcor_ctl} + fi + + fi + done # dtype in $gesanl loop + done # type in $SATYPE loop + + + ${USHradmon}/rstprod.sh + tar_file=radmon_bcor.tar + + if compgen -G "bcor*.ieee_d*" > /dev/null || compgen -G "bcor*.ctl*" > /dev/null; then + tar -cf $tar_file bcor*.ieee_d* bcor*.ctl* + ${COMPRESS} ${tar_file} + mv $tar_file.${Z} ${TANKverf_rad}/. + + if [[ $RAD_AREA = "rgn" ]]; then + cwd=$(pwd) + cd ${TANKverf_rad} + tar -xf ${tar_file}.${Z} + rm ${tar_file}.${Z} + cd ${cwd} + fi + fi + + if [[ $ctr -gt 0 && $fail -eq $ctr || $fail -gt $ctr ]]; then + err=7 + fi +fi + +################################################################################ +# Post processing + +exit ${err} + diff --git a/ush/radmon_verf_time.sh b/ush/radmon_verf_time.sh new file mode 100755 index 00000000000..51743277c91 --- /dev/null +++ b/ush/radmon_verf_time.sh @@ -0,0 +1,567 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: radmon_verf_time.sh +# Script description: Extract time data from radiance diagnostic files, +# perform data integrity checks. +# +# Author: Ed Safford Org: NP23 Date: 2012-02-02 +# +# Abstract: This script extracts time related data from radiance diagnostic +# files (which are an output from GSI runs), storing the extracted +# data in small binary files. 
Data integrity checks are performed +# on the data and mail messages are sent if potential errors are +# detected. +# +# This script is a child script of exgdas_vrfyrad.sh.sms. The parent +# script opens and uncompresses the radiance diagnostic file and copies +# other supporting files into a temporary working directory. +# +# +# Usage: radmon_verf_time.sh PDATE +# +# Input script positional parameters: +# PDATE processing date +# yyyymmddcc format; required +# +# Imported Shell Variables: +# DO_DATA_RPT switch to build the data report +# defaults to 1 (on) +# RADMON_SUFFIX data source suffix +# defaults to opr +# EXECradmon executable directory +# defaults to current directory +# FIXgdas fixed data directory +# defaults to current directory +# RAD_AREA global or regional flag +# defaults to global +# TANKverf_rad data repository +# defaults to current directory +# SATYPE list of satellite/instrument sources +# defaults to none +# VERBOSE Verbose flag (YES or NO) +# defaults to NO +# LITTLE_ENDIAN flag for little endian machine +# defaults to 0 (big endian) +# USE_ANL use analysis files as inputs in addition to +# the ges files. Default is 0 (ges only) +# +# Modules and files referenced: +# scripts : +# +# programs : $NCP +# $time_exec +# +# fixed data : gdas_radmon_base.tar +# +# input data : $data_file +# +# output data: $time_file +# $time_ctl +# $pgmout +# $bad_pen +# $bad_chan +# $report +# $diag_report +# +# +# Remarks: +# +# Condition codes +# 0 - no problem encountered +# >0 - some problem encountered +# +#################################################################### + +# Command line arguments. 
+export PDATE=${1:-${PDATE:?}} + +# Directories +FIXgdas=${FIXgdas:-$(pwd)} +EXECradmon=${EXECradmon:-$(pwd)} +TANKverf_rad=${TANKverf_rad:-$(pwd)} + +# File names +#pgmout=${pgmout:-${jlogfile}} +#touch $pgmout + +radmon_err_rpt=${radmon_err_rpt:-${USHradmon}/radmon_err_rpt.sh} +base_file=${base_file:-$FIXgdas/gdas_radmon_base.tar} +report=report.txt +disclaimer=disclaimer.txt + +diag_report=diag_report.txt +diag_hdr=diag_hdr.txt +diag=diag.txt + +obs_err=obs_err.txt +obs_hdr=obs_hdr.txt +pen_err=pen_err.txt +pen_hdr=pen_hdr.txt + +chan_err=chan_err.txt +chan_hdr=chan_hdr.txt +count_hdr=count_hdr.txt +count_err=count_err.txt + +netcdf_boolean=".false." +if [[ $RADMON_NETCDF -eq 1 ]]; then + netcdf_boolean=".true." +fi + +DO_DATA_RPT=${DO_DATA_RPT:-1} +RADMON_SUFFIX=${RADMON_SUFFIX:-opr} +RAD_AREA=${RAD_AREA:-glb} +REGIONAL_RR=${REGIONAL_RR:-0} +rgnHH=${rgnHH:-} +rgnTM=${rgnTM:-} +SATYPE=${SATYPE:-} +VERBOSE=${VERBOSE:-NO} +LITTLE_ENDIAN=${LITTLE_ENDIAN:-0} + +time_exec=radmon_time.x +USE_ANL=${USE_ANL:-0} +err=0 + +if [[ $USE_ANL -eq 1 ]]; then + gesanl="ges anl" +else + gesanl="ges" +fi + + +#-------------------------------------------------------------------- +# Copy extraction program and base files to working directory +#------------------------------------------------------------------- +$NCP ${EXECradmon}/${time_exec} ./ +if [[ ! -s ./${time_exec} ]]; then + err=8 +fi + +iyy=$(echo $PDATE | cut -c1-4) +imm=$(echo $PDATE | cut -c5-6) +idd=$(echo $PDATE | cut -c7-8) +ihh=$(echo $PDATE | cut -c9-10) +cyc=$ihh +CYCLE=$cyc + +local_base="local_base" +if [[ $DO_DATA_RPT -eq 1 ]]; then + + if [[ -e ${base_file}.${Z} ]]; then + $NCP ${base_file}.${Z} ./${local_base}.${Z} + ${UNCOMPRESS} ${local_base}.${Z} + else + $NCP ${base_file} ./${local_base} + fi + + if [[ ! 
-s ./${local_base} ]]; then + echo "RED LIGHT: local_base file not found" + else + echo "Confirming local_base file is good = ${local_base}" + tar -xf ./${local_base} + echo "local_base is untarred" + fi +fi + +if [[ $err -eq 0 ]]; then + ctr=0 + fail=0 + + export pgm=${time_exec} +#-------------------------------------------------------------------- +# Loop over each entry in SATYPE +#-------------------------------------------------------------------- + for type in ${SATYPE}; do + + if [[ ! -s ${type} ]]; then + echo "ZERO SIZED: ${type}" + continue + fi + + ctr=$(expr $ctr + 1) + + for dtype in ${gesanl}; do + + if [[ -f input ]]; then rm input; fi + + if [[ $dtype == "anl" ]]; then + data_file=${type}_anl.${PDATE}.ieee_d + ctl_file=${type}_anl.ctl + time_ctl=time.${ctl_file} + else + data_file=${type}.${PDATE}.ieee_d + ctl_file=${type}.ctl + time_ctl=time.${ctl_file} + fi + + if [[ $REGIONAL_RR -eq 1 ]]; then + time_file=${rgnHH}.time.${data_file}.${rgnTM} + else + time_file=time.${data_file} + fi + +#-------------------------------------------------------------------- +# Run program for given satellite/instrument +#-------------------------------------------------------------------- + nchanl=-999 +cat << EOF > input + &INPUT + satname='${type}', + iyy=${iyy}, + imm=${imm}, + idd=${idd}, + ihh=${ihh}, + idhh=-720, + incr=${CYCLE_INTERVAL}, + nchanl=${nchanl}, + suffix='${RADMON_SUFFIX}', + gesanl='${dtype}', + little_endian=${LITTLE_ENDIAN}, + rad_area='${RAD_AREA}', + netcdf=${netcdf_boolean}, + / +EOF + + ./${time_exec} < input >> stdout.${type} 2>>errfile + export err=$? + + if [[ $err -ne 0 ]]; then + fail=$(expr $fail + 1) + fi + +#------------------------------------------------------------------- +# move data, control, and stdout files to $TANKverf_rad and compress +#------------------------------------------------------------------- + cat stdout.${type} >> stdout.time + + if [[ -s ${time_file} ]]; then + ${COMPRESS} ${time_file} + fi + + if [[ -s ${time_ctl} ]]; then + 
${COMPRESS} ${time_ctl} + fi + + done + done + + + ${USHradmon}/rstprod.sh + + if compgen -G "time*.ieee_d*" > /dev/null || compgen -G "time*.ctl*" > /dev/null; then + tar_file=radmon_time.tar + tar -cf $tar_file time*.ieee_d* time*.ctl* + ${COMPRESS} ${tar_file} + mv $tar_file.${Z} ${TANKverf_rad}/. + + if [[ $RAD_AREA = "rgn" ]]; then + cwd=$(pwd) + cd ${TANKverf_rad} + tar -xf ${tar_file}.${Z} + rm ${tar_file}.${Z} + cd ${cwd} + fi + fi + + if [[ $ctr -gt 0 && $fail -eq $ctr || $fail -gt $ctr ]]; then + echo "fail, ctr = $fail, $ctr" + err=10 + fi + +fi + + + +#################################################################### +#------------------------------------------------------------------- +# Begin error analysis and reporting +#------------------------------------------------------------------- +#################################################################### + +if [[ $DO_DATA_RPT -eq 1 ]]; then + +#--------------------------- +# build report disclaimer +# + cat << EOF > ${disclaimer} + + +*********************** WARNING *************************** +THIS IS AN AUTOMATED EMAIL. REPLIES TO SENDER WILL NOT BE +RECEIVED. 
PLEASE DIRECT REPLIES TO edward.safford@noaa.gov +*********************** WARNING *************************** +EOF + + +#------------------------------------------------------------------- +# Check for missing diag files +# + tmp_satype="./tmp_satype.txt" + echo ${SATYPE} > ${tmp_satype} + ${USHradmon}/radmon_diag_ck.sh --rad ${radstat} --sat ${tmp_satype} --out ${diag} + + if [[ -s ${diag} ]]; then + cat << EOF > ${diag_hdr} + + Problem Reading Diagnostic File + + + Problems were encountered reading the diagnostic file for + the following sources: + +EOF + + cat ${diag_hdr} >> ${diag_report} + cat ${diag} >> ${diag_report} + + echo >> ${diag_report} + + rm ${diag_hdr} + fi + +#------------------------------------------------------------------- +# move warning notification to TANKverf +# + if [[ -s ${diag} ]]; then + lines=$(wc -l <${diag}) + echo "lines in diag = $lines" + + if [[ $lines -gt 0 ]]; then + cat ${diag_report} + cp ${diag} ${TANKverf_rad}/bad_diag.${PDATE} + else + rm ${diag_report} + fi + fi + + + + #---------------------------------------------------------------- + # Identify bad_pen and bad_chan files for this cycle and + # previous cycle + + bad_pen=bad_pen.${PDATE} + bad_chan=bad_chan.${PDATE} + low_count=low_count.${PDATE} + + qdate=$($NDATE -${CYCLE_INTERVAL} $PDATE) + pday=$(echo $qdate | cut -c1-8) + + prev_bad_pen=bad_pen.${qdate} + prev_bad_chan=bad_chan.${qdate} + prev_low_count=low_count.${qdate} + + prev_bad_pen=${TANKverf_radM1}/${prev_bad_pen} + prev_bad_chan=${TANKverf_radM1}/${prev_bad_chan} + prev_low_count=${TANKverf_radM1}/${prev_low_count} + + if [[ -s $bad_pen ]]; then + echo "bad_pen = $bad_pen" + fi + if [[ -s $prev_bad_pen ]]; then + echo "prev_bad_pen = $prev_bad_pen" + fi + + if [[ -s $bad_chan ]]; then + echo "bad_chan = $bad_chan" + fi + if [[ -s $prev_bad_chan ]]; then + echo "prev_bad_chan = $prev_bad_chan" + fi + if [[ -s $low_count ]]; then + echo "low_count = $low_count" + fi + if [[ -s $prev_low_count ]]; then + 
echo "prev_low_count = $prev_low_count" + fi + + do_pen=0 + do_chan=0 + do_cnt=0 + + if [[ -s $bad_pen && -s $prev_bad_pen ]]; then + do_pen=1 + fi + + if [[ -s $low_count && -s $prev_low_count ]]; then + do_cnt=1 + fi + + #-------------------------------------------------------------------- + # avoid doing the bad_chan report for REGIONAL_RR sources -- because + # they run hourly they often have 0 count channels for off-hour runs. + # + if [[ -s $bad_chan && -s $prev_bad_chan && $REGIONAL_RR -eq 0 ]]; then + do_chan=1 + fi + + #-------------------------------------------------------------------- + # Remove extra spaces in new bad_pen & low_count files + # + if [[ -s ${bad_pen} ]]; then + gawk '{$1=$1}1' $bad_pen > tmp.bad_pen + mv -f tmp.bad_pen $bad_pen + fi + if [[ -s ${low_count} ]]; then + gawk '{$1=$1}1' $low_count > tmp.low_count + mv -f tmp.low_count $low_count + fi + + echo " do_pen, do_chan, do_cnt = $do_pen, $do_chan, $do_cnt" + echo " diag_report = $diag_report " + if [[ $do_pen -eq 1 || $do_chan -eq 1 || $do_cnt -eq 1 || -s ${diag_report} ]]; then + + if [[ $do_pen -eq 1 ]]; then + + echo "calling radmon_err_rpt for pen" + ${radmon_err_rpt} ${prev_bad_pen} ${bad_pen} pen ${qdate} \ + ${PDATE} ${diag_report} ${pen_err} + fi + + if [[ $do_chan -eq 1 ]]; then + + echo "calling radmon_err_rpt for chan" + ${radmon_err_rpt} ${prev_bad_chan} ${bad_chan} chan ${qdate} \ + ${PDATE} ${diag_report} ${chan_err} + fi + + if [[ $do_cnt -eq 1 ]]; then + + echo "calling radmon_err_rpt for cnt" + ${radmon_err_rpt} ${prev_low_count} ${low_count} cnt ${qdate} \ + ${PDATE} ${diag_report} ${count_err} + fi + + #------------------------------------------------------------------- + # put together the unified error report with any obs, chan, and + # penalty problems and mail it + + if [[ -s ${obs_err} || -s ${pen_err} || -s ${chan_err} || -s ${count_err} || -s ${diag_report} ]]; then + + echo DOING ERROR REPORTING + + + cat << EOF > $report +Radiance Monitor warning report + + 
Net: ${RADMON_SUFFIX} + Run: ${RUN} + Cycle: $PDATE + +EOF + + if [[ -s ${diag_report} ]]; then + echo OUTPUTTING DIAG_REPORT + cat ${diag_report} >> $report + fi + + if [[ -s ${chan_err} ]]; then + + echo OUTPUTTING CHAN_ERR + + cat << EOF > ${chan_hdr} + + The following channels report 0 observational counts over the past two cycles: + + Satellite/Instrument Channel + ==================== ======= + +EOF + + cat ${chan_hdr} >> $report + cat ${chan_err} >> $report + + fi + + if [[ -s ${count_err} ]]; then + + cat << EOF > ${count_hdr} + + + + The following channels report abnormally low observational counts in the latest 2 cycles: + +Satellite/Instrument Obs Count Avg Count +==================== ========= ========= + +EOF + + cat ${count_hdr} >> $report + cat ${count_err} >> $report + fi + + + if [[ -s ${pen_err} ]]; then + + cat << EOF > ${pen_hdr} + + + Penalty values outside of the established normal range were found + for these sensor/channel/regions in the past two cycles: + + Questionable Penalty Values + ============ ======= ====== Cycle Penalty Bound + ----- ------- ----- +EOF + cat ${pen_hdr} >> $report + cat ${pen_err} >> $report + rm -f ${pen_hdr} + rm -f ${pen_err} + fi + + echo >> $report + cat ${disclaimer} >> $report + echo >> $report + fi + + #------------------------------------------------------------------- + # dump report to log file + # + if [[ -s ${report} ]]; then + lines=$(wc -l <${report}) + if [[ $lines -gt 2 ]]; then + cat ${report} + + $NCP ${report} ${TANKverf_rad}/warning.${PDATE} + fi + fi + + + fi + + #------------------------------------------------------------------- + # copy new bad_pen, bad_chan, and low_count files to $TANKverf_rad + # + if [[ -s ${bad_chan} ]]; then + mv ${bad_chan} ${TANKverf_rad}/. + fi + + if [[ -s ${bad_pen} ]]; then + mv ${bad_pen} ${TANKverf_rad}/. + fi + + if [[ -s ${low_count} ]]; then + mv ${low_count} ${TANKverf_rad}/. 
+ fi + + +fi + + for type in ${SATYPE}; do + rm -f stdout.${type} + done + +################################################################################ +#------------------------------------------------------------------- +# end error reporting section +#------------------------------------------------------------------- +################################################################################ + +################################################################################ +# Post processing + +exit ${err} diff --git a/ush/rstprod.sh b/ush/rstprod.sh new file mode 100755 index 00000000000..acac0340bb7 --- /dev/null +++ b/ush/rstprod.sh @@ -0,0 +1,19 @@ +#! /usr/bin/env bash + +source "$HOMEgfs/ush/preamble.sh" + +#--------------------------------------------------------- +# rstprod.sh +# +# Restrict data from select sensors and satellites +#--------------------------------------------------------- + +# Restrict select sensors and satellites + +export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} +rlist="saphir abi_g16" +for rtype in $rlist; do + if compgen -G "*${rtype}*" > /dev/null; then + ${CHGRP_CMD} *${rtype}* + fi +done diff --git a/ush/scale_dec.sh b/ush/scale_dec.sh index 59e2bab14eb..77136d7f70d 100755 --- a/ush/scale_dec.sh +++ b/ush/scale_dec.sh @@ -13,7 +13,7 @@ source "$HOMEgfs/ush/preamble.sh" f=$1 -export WGRIB2=${WGRIB2:-${NWROOT}/grib_util.v1.1.0/exec/wgrib2} +export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2} # export WGRIB2=/gpfs/dell1/nco/ops/nwprod/grib_util.v1.1.0/exec/wgrib2 diff --git a/ush/syndat_getjtbul.sh b/ush/syndat_getjtbul.sh index 89196d05966..c17067ff723 100755 --- a/ush/syndat_getjtbul.sh +++ b/ush/syndat_getjtbul.sh @@ -33,17 +33,8 @@ EXECSYND=${EXECSYND:-${HOMESYND}/exec} cd $DATA if [ "$#" -ne '1' ]; then - msg="**NON-FATAL ERROR PROGRAM SYNDAT_GETJTBUL run date not in \ + echo "**NON-FATAL ERROR PROGRAM SYNDAT_GETJTBUL run date not in \ positional parameter 1" - set +x - echo - echo $msg - echo - 
${TRACE_ON:-set -x} - echo $msg >> $pgmout - set +u - [ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" - set -u echo "Leaving sub-shell syndat_getjtbul.sh to recover JTWC Bulletins" \ >> $pgmout @@ -94,7 +85,7 @@ echo " pdym1 is $pdym1" echo echo " ymddir is $ymddir" echo -${TRACE_ON:-set -x} +set_trace find=$ymd" "$hour echo "looking for string $find in $jtwcdir/tropcyc" >> $pgmout @@ -124,18 +115,15 @@ fi perl -wpi.ORIG -e 's/(^.... ... )(\S{9,9})(\S{1,})/$1$2/' jtwcbul diff jtwcbul.ORIG jtwcbul > jtwcbul_changes.txt if [ -s jtwcbul_changes.txt ]; then - msg="***WARNING: SOME JTWC VITALS SEGMENTS REQUIRED PRELIMINARY MODIFICATION!" - [ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" - echo -e "\n${msg}. Changes follow:" >> $pgmout - cat jtwcbul_changes.txt >> $pgmout - echo -e "\n" >> $pgmout + echo "***WARNING: SOME JTWC VITALS SEGMENTS REQUIRED PRELIMINARY MODIFICATION!" + cat jtwcbul_changes.txt fi # Execute bulletin processing [ -s jtwcbul ] && echo "Processing JTWC bulletin halfs into tcvitals records" >> $pgmout -pgm=$(basename $EXECSYND/syndat_getjtbul) +pgm=$(basename $EXECSYND/syndat_getjtbul.x) export pgm if [ -s prep_step ]; then set +u @@ -150,7 +138,7 @@ rm -f fnoc export FORT11=jtwcbul export FORT51=fnoc -time -p $EXECSYND/syndat_getjtbul >> $pgmout 2> errfile +time -p ${EXECSYND}/${pgm} >> $pgmout 2> errfile errget=$? 
###cat errfile cat errfile >> $pgmout @@ -159,7 +147,7 @@ set +x echo echo 'The foreground exit status for SYNDAT_GETJTBUL is ' $errget echo -${TRACE_ON:-set -x} +set_trace if [ "$errget" -gt '0' ];then if [ "$errget" -eq '1' ];then msg="No JTWC bulletins in $jtwcdir/tropcyc, no JTWC tcvitals \ @@ -175,30 +163,12 @@ available for qctropcy for $CDATE10" fi fi else - msg="**NON-FATAL ERROR PROGRAM SYNDAT_GETJTBUL FOR $CDATE10 \ + echo "**NON-FATAL ERROR PROGRAM SYNDAT_GETJTBUL FOR $CDATE10 \ RETURN CODE $errget" fi - set +x - echo - echo $msg - echo - ${TRACE_ON:-set -x} - echo $msg >> $pgmout - set +u - [ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" - set -u else - msg="program SYNDAT_GETJTBUL completed normally for $CDATE10, JTWC \ + echo "program SYNDAT_GETJTBUL completed normally for $CDATE10, JTWC \ rec. passed to qctropcy" - set +x - echo - echo $msg - echo - ${TRACE_ON:-set -x} - echo $msg >> $pgmout - set +u - [ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" - set -u fi set +x echo @@ -206,7 +176,7 @@ echo "----------------------------------------------------------" echo "*********** COMPLETED PROGRAM syndat_getjtbul **********" echo "----------------------------------------------------------" echo -${TRACE_ON:-set -x} +set_trace if [ "$errget" -eq '0' ];then echo "Completed JTWC tcvitals records are:" >> $pgmout @@ -215,6 +185,6 @@ fi echo "Leaving sub-shell syndat_getjtbul.sh to recover JTWC Bulletins" \ >> $pgmout -echo " " >> $pgmout +echo " " >> "${pgmout}" exit diff --git a/ush/syndat_qctropcy.sh b/ush/syndat_qctropcy.sh index 571a7543b53..5b5b4ba34be 100755 --- a/ush/syndat_qctropcy.sh +++ b/ush/syndat_qctropcy.sh @@ -63,8 +63,6 @@ # copy_back - switch to copy updated files back to archive directory and # to tcvitals directory # (Default: YES) -# jlogfile - path to job log file (skipped over by this script if not -# passed in) # SENDCOM switch copy output files to $COMSP # (Default: YES) # files_override - switch to override default "files" setting 
for given run @@ -78,7 +76,7 @@ HOMENHCp1=${HOMENHCp1:-/gpfs/?p1/nhc/save/guidance/storm-data/ncep} HOMENHC=${HOMENHC:-/gpfs/dell2/nhc/save/guidance/storm-data/ncep} TANK_TROPCY=${TANK_TROPCY:-${DCOMROOT}/us007003} -FIXSYND=${FIXSYND:-$HOMEgfs/fix/fix_am} +FIXSYND=${FIXSYND:-$HOMEgfs/fix/am} USHSYND=${USHSYND:-$HOMEgfs/ush} EXECSYND=${EXECSYND:-$HOMEgfs/exec} PARMSYND=${PARMSYND:-$HOMEgfs/parm/relo} @@ -95,7 +93,7 @@ set +x echo echo $msg echo -${TRACE_ON:-set -x} +set_trace echo $msg >> $pgmout if [ "$#" -ne '1' ]; then @@ -105,27 +103,29 @@ positional parameter 1" echo echo $msg echo - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout msg="**NO TROPICAL CYCLONE tcvitals processed --> non-fatal" set +x echo echo $msg echo - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout -# Copy null files into "${COMSP}syndata.tcvitals.$tmmark" and -# "${COMSP}jtwc-fnoc.tcvitals.$tmmark" so later ftp attempts will find and +# Copy null files into "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.$tmmark" and +# "${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.$tmmark" so later ftp attempts will find and # copy the zero-length file and avoid wasting time with multiple attempts # to remote machine(s) # (Note: Only do so if files don't already exist) if [ $SENDCOM = YES ]; then - [ ! -s ${COMSP}syndata.tcvitals.$tmmark ] && \ - cp /dev/null ${COMSP}syndata.tcvitals.$tmmark - [ ! -s ${COMSP}jtwc-fnoc.tcvitals.$tmmark ] && \ - cp /dev/null ${COMSP}jtwc-fnoc.tcvitals.$tmmark + if [[ ! -s "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" ]]; then + cp "/dev/null" "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" + fi + if [[ ! 
-s "${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.${tmmark}" ]]; then + cp "/dev/null" "${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.${tmmark}" + fi fi exit @@ -137,7 +137,7 @@ set +x echo echo "Run date is $CDATE10" echo -${TRACE_ON:-set -x} +set_trace year=$(echo $CDATE10 | cut -c1-4) @@ -159,7 +159,7 @@ if [ $dateck_size -lt 10 ]; then echo 1900010100 > dateck set +x echo -e "\n${msg}\n" - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout fi @@ -188,7 +188,7 @@ if [ -n "$files_override" ]; then # for testing, typically want FILES=F fi set +x echo -e "\n${msg}\n" - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout fi @@ -250,7 +250,7 @@ cp $slmask slmask.126 # Execute program syndat_qctropcy -pgm=$(basename $EXECSYND/syndat_qctropcy) +pgm=$(basename $EXECSYND/syndat_qctropcy.x) export pgm if [ -s prep_step ]; then set +u @@ -264,7 +264,7 @@ fi echo "$CDATE10" > cdate10.dat export FORT11=slmask.126 export FORT12=cdate10.dat -$EXECSYND/syndat_qctropcy >> $pgmout 2> errfile +${EXECSYND}/${pgm} >> $pgmout 2> errfile errqct=$? 
###cat errfile cat errfile >> $pgmout @@ -273,34 +273,36 @@ set +x echo echo "The foreground exit status for SYNDAT_QCTROPCY is " $errqct echo -${TRACE_ON:-set -x} +set_trace if [ "$errqct" -gt '0' ];then msg="**NON-FATAL ERROR PROGRAM SYNDAT_QCTROPCY RETURN CODE $errqct" set +x echo echo $msg echo - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout msg="**NO TROPICAL CYCLONE tcvitals processed --> non-fatal" set +x echo echo $msg echo - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout # In the event of a ERROR in PROGRAM SYNDAT_QCTROPCY, copy null files into -# "${COMSP}syndata.tcvitals.$tmmark" and "${COMSP}jtwc-fnoc.tcvitals.$tmmark" +# "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.$tmmark" and "${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.$tmmark" # so later ftp attempts will find and copy the zero-length file and avoid # wasting time with multiple attempts to remote machine(s) # (Note: Only do so if files don't already exist) if [ $SENDCOM = YES ]; then - [ ! -s ${COMSP}syndata.tcvitals.$tmmark ] && \ - cp /dev/null ${COMSP}syndata.tcvitals.$tmmark - [ ! -s ${COMSP}jtwc-fnoc.tcvitals.$tmmark ] && \ - cp /dev/null ${COMSP}jtwc-fnoc.tcvitals.$tmmark + if [[ ! -s "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" ]]; then + cp "/dev/null" "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" + fi + if [[ ! 
-s ${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.${tmmark} ]]; then + cp "/dev/null" "${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.${tmmark}" + fi fi exit @@ -311,7 +313,7 @@ echo "----------------------------------------------------------" echo "********** COMPLETED PROGRAM syndat_qctropcy **********" echo "----------------------------------------------------------" echo -${TRACE_ON:-set -x} +set_trace if [ "$copy_back" = 'YES' ]; then cat lthistry>>$ARCHSYND/syndat_lthistry.$year @@ -356,7 +358,7 @@ $HOMENHC/tcvitals successfully updated by syndat_qctropcy" echo echo $msg echo - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout fi @@ -368,7 +370,7 @@ not changed by syndat_qctropcy" echo echo $msg echo - ${TRACE_ON:-set -x} + set_trace echo $msg >> $pgmout fi @@ -377,15 +379,15 @@ fi # This is the file that connects to the later RELOCATE and/or PREP scripts -[ $SENDCOM = YES ] && cp current ${COMSP}syndata.tcvitals.$tmmark +[ $SENDCOM = YES ] && cp current "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" # Create the DBNet alert if [ $SENDDBN = "YES" ] then - $DBNROOT/bin/dbn_alert MODEL GDAS_TCVITALS $job ${COMSP}syndata.tcvitals.$tmmark + "${DBNROOT}/bin/dbn_alert" "MODEL" "GDAS_TCVITALS" "${job}" "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" fi # Write JTWC/FNOC Tcvitals to /com path since not saved anywhere else -[ $SENDCOM = YES ] && cp fnoc ${COMSP}jtwc-fnoc.tcvitals.$tmmark +[ $SENDCOM = YES ] && cp fnoc "${COM_OBS}/${RUN}.${cycle}.jtwc-fnoc.tcvitals.${tmmark}" exit diff --git a/ush/trim_rh.sh b/ush/trim_rh.sh index 2de2e17c7bb..5a8903cae61 100755 --- a/ush/trim_rh.sh +++ b/ush/trim_rh.sh @@ -7,7 +7,7 @@ source "$HOMEgfs/ush/preamble.sh" f=$1 -export WGRIB2=${WGRIB2:-${NWPROD:-/nwprod}/util/exec/wgrib2} +export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2} $WGRIB2 ${optncpu:-} $f -not_if ':RH:' -grib $f.new \ -if ':RH:' -rpn "10:*:0.5:+:floor:1000:min:10:/" -set_grib_type same \ diff --git a/ush/tropcy_relocate.sh b/ush/tropcy_relocate.sh 
index e3a82efaf7c..9b170ddfd00 100755 --- a/ush/tropcy_relocate.sh +++ b/ush/tropcy_relocate.sh @@ -124,7 +124,6 @@ # -stdoutmode ordered" # USHGETGES String indicating directory path for GETGES utility ush # file -# Default is "/nwprod/util/ush" # USHRELO String indicating directory path for RELOCATE ush files # Default is "${HOMERELO}/ush" # EXECRELO String indicating directory path for RELOCATE executables @@ -142,7 +141,7 @@ # Default is "$EXECRELO/relocate_mv_nvortex" # SUPVX String indicating executable path for SUPVIT utility # program -# Default is "$EXECUTIL/supvit" +# Default is "$EXECUTIL/supvit.x" # GETTX String indicating executable path for GETTRK utility # program # Default is "$EXECUTIL/gettrk" @@ -158,8 +157,6 @@ # be used by the script. If they are not, they will be skipped # over by the script. # -# jlogfile String indicating path to joblog file -# # Exported Shell Variables: # CDATE10 String indicating the center date/time for the relocation # processing @@ -183,9 +180,7 @@ # $USHRELO/tropcy_relocate_extrkr.sh) # $DATA/err_chk (here and in child script # $USHRELO/tropcy_relocate_extrkr.sh) -# NOTE 1: postmsg above is required ONLY if "$jlogfile" is -# present. -# NOTE 2: The last three scripts above are NOT REQUIRED utilities. +# NOTE: The last three scripts above are NOT REQUIRED utilities. # If $DATA/prep_step not found, a scaled down version of it is # executed in-line. If $DATA/err_exit or $DATA/err_chk are not # found and a fatal error has occurred, then the script calling @@ -216,7 +211,7 @@ source "$HOMEgfs/ush/preamble.sh" MACHINE=${MACHINE:-$(hostname -s | cut -c 1-3)} SENDCOM=${SENDCOM:-YES} -export NWROOT=${NWROOT:-/nwprod2} +export OPSROOT=${OPSROOT:-/lfs/h1/ops/prod} GRIBVERSION=${GRIBVERSION:-"grib2"} if [ ! -d $DATA ] ; then mkdir -p $DATA ;fi @@ -256,7 +251,7 @@ then echo "problem with obtaining date record;" echo "ABNORMAL EXIT!!!!!!!!!!!" 
echo - ${TRACE_ON:-set -x} + set_trace if [ -s $DATA/err_exit ]; then $DATA/err_exit else @@ -274,7 +269,7 @@ set +x echo echo "CENTER DATE/TIME FOR RELOCATION PROCESSING IS $CDATE10" echo -${TRACE_ON:-set -x} +set_trace #---------------------------------------------------------------------------- @@ -284,13 +279,12 @@ ${TRACE_ON:-set -x} envir=${envir:-prod} if [ $MACHINE != sgi ]; then - HOMEALL=${HOMEALL:-$NWROOT} + HOMEALL=${HOMEALL:-$OPSROOT} else HOMEALL=${HOMEALL:-/disk1/users/snake/prepobs} fi HOMERELO=${HOMERELO:-${shared_global_home}} -#HOMERELO=${HOMERELO:-$NWROOT/tropcy_qc_reloc.${tropcy_qc_reloc_ver}} envir_getges=${envir_getges:-$envir} if [ $modhr -eq 0 ]; then @@ -317,7 +311,7 @@ RELOX=${RELOX:-$EXECRELO/relocate_mv_nvortex} export BKGFREQ=${BKGFREQ:-1} -SUPVX=${SUPVX:-$EXECRELO/supvit} +SUPVX=${SUPVX:-$EXECRELO/supvit.x} GETTX=${GETTX:-$EXECRELO/gettrk} ################################################ @@ -327,11 +321,7 @@ GETTX=${GETTX:-$EXECRELO/gettrk} # attempt to perform tropical cyclone relocation # ---------------------------------------------- -msg="Attempt to perform tropical cyclone relocation for $CDATE10" -set +u -##[ -n "$jlogfile" ] && $DATA/postmsg "$jlogfile" "$msg" -[ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" -set -u +echo "Attempt to perform tropical cyclone relocation for $CDATE10" if [ $modhr -ne 0 ]; then @@ -344,7 +334,7 @@ if [ $modhr -ne 0 ]; then not a multiple of 3-hrs;" echo "ABNORMAL EXIT!!!!!!!!!!!" 
echo - ${TRACE_ON:-set -x} + set_trace if [ -s $DATA/err_exit ]; then $DATA/err_exit else @@ -367,14 +357,14 @@ echo " Get TCVITALS file valid for -$fhr hrs relative to center" echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo - ${TRACE_ON:-set -x} + set_trace $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -f $fhr -t tcvges tcvitals.m${fhr} set +x echo echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo - ${TRACE_ON:-set -x} + set_trace fi done @@ -417,7 +407,7 @@ echo " Get global sigma GUESS valid for $fhr hrs relative to center" echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo - ${TRACE_ON:-set -x} + set_trace $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -t $stype $sges errges=$? @@ -429,7 +419,7 @@ echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" to center relocation date/time;" echo "ABNORMAL EXIT!!!!!!!!!!!" 
echo - ${TRACE_ON:-set -x} + set_trace if [ -s $DATA/err_exit ]; then $DATA/err_exit else @@ -440,28 +430,28 @@ to center relocation date/time;" fi # For center time sigma guess file obtained via getges, store pathname from -# getges into ${COMSP}sgesprep_pre-relocate_pathname.$tmmark and, for now, -# also in ${COMSP}sgesprep_pathname.$tmmark - if relocation processing stops +# getges into ${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.$tmmark and, for now, +# also in ${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.$tmmark - if relocation processing stops # due to an error or due to no input tcvitals records found, then the center # time sigma guess will not be modified and this getges file will be read in # subsequent PREP processing; if relocation processing continues and the -# center sigma guess is modified, then ${COMSP}sgesprep_pathname.$tmmark will +# center sigma guess is modified, then ${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.$tmmark will # be removed later in this script {the subsequent PREP step will correctly -# update ${COMSP}sgesprep_pathname.$tmmark to point to the sgesprep file +# update ${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.$tmmark to point to the sgesprep file # updated here by the relocation} # ---------------------------------------------------------------------------- if [ $fhr = "0" ]; then - $USHGETGES/getges.sh -e $envir_getges -n $network_getges -v $CDATE10 \ - -t $stype > ${COMSP}sgesprep_pre-relocate_pathname.$tmmark - cp ${COMSP}sgesprep_pre-relocate_pathname.$tmmark \ - ${COMSP}sgesprep_pathname.$tmmark + "${USHGETGES}/getges.sh" -e "${envir_getges}" -n "${network_getges}" -v "${CDATE10}" \ + -t "${stype}" > "${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.${tmmark}" + cp "${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.${tmmark}" \ + "${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.${tmmark}" fi set +x echo echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo - 
${TRACE_ON:-set -x} + set_trace fi if [ ! -s $pges ]; then set +x @@ -471,7 +461,7 @@ echo " Get global pressure grib GUESS valid for $fhr hrs relative to center" echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo - ${TRACE_ON:-set -x} + set_trace $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -t $ptype $pges errges=$? @@ -483,7 +473,7 @@ echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" relative to center relocation date/time;" echo "ABNORMAL EXIT!!!!!!!!!!!" echo - ${TRACE_ON:-set -x} + set_trace if [ -s $DATA/err_exit ]; then $DATA/err_exit else @@ -496,14 +486,14 @@ relative to center relocation date/time;" echo echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo - ${TRACE_ON:-set -x} + set_trace fi done if [ -f ${tstsp}syndata.tcvitals.$tmmark ]; then cp ${tstsp}syndata.tcvitals.$tmmark tcvitals.now else - cp ${COMSP}syndata.tcvitals.$tmmark tcvitals.now + cp "${COM_OBS}/${RUN}.${cycle}.syndata.tcvitals.${tmmark}" "tcvitals.now" fi @@ -524,13 +514,10 @@ grep "$pdy $cyc" VITL errgrep=$? 
> tcvitals if [ $errgrep -ne 0 ] ; then - msg="NO TCVITAL RECORDS FOUND FOR $CDATE10 - EXIT TROPICAL CYCLONE \ + echo "NO TCVITAL RECORDS FOUND FOR $CDATE10 - EXIT TROPICAL CYCLONE \ RELOCATION PROCESSING" - set +u - [ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" - set -u -# The existence of ${COMSP}tropcy_relocation_status.$tmmark file will tell the +# The existence of ${COM_OBS}/${RUN}.${cycle}.tropcy_relocation_status.$tmmark file will tell the # subsequent PREP processing that RELOCATION processing occurred, echo # "NO RECORDS to process" into it to further tell PREP processing that records # were not processed by relocation and the global sigma guess was NOT @@ -538,14 +525,15 @@ RELOCATION PROCESSING" # found) # Note: When tropical cyclone relocation does run to completion and the # global sigma guess is modified, the parent script to this will echo -# "RECORDS PROCESSED" into ${COMSP}tropcy_relocation_status.$tmmark +# "RECORDS PROCESSED" into ${COM_OBS}/${RUN}.${cycle}.tropcy_relocation_status.$tmmark # assuming it doesn't already exist (meaning "NO RECORDS to process" # was NOT echoed into it here) # ---------------------------------------------------------------------------- - echo "NO RECORDS to process" > ${COMSP}tropcy_relocation_status.$tmmark - [ ! -s ${COMSP}tcvitals.relocate.$tmmark ] && \ - cp /dev/null ${COMSP}tcvitals.relocate.$tmmark + echo "NO RECORDS to process" > "${COM_OBS}/${RUN}.${cycle}.tropcy_relocation_status.${tmmark}" + if [[ ! -s "${COM_OBS}/${RUN}.${cycle}.tcvitals.relocate.${tmmark}" ]]; then + cp "/dev/null" "${COM_OBS}/${RUN}.${cycle}.tcvitals.relocate.${tmmark}" + fi else cat VITL >>tcvitals @@ -568,7 +556,7 @@ else echo "$USHRELO/tropcy_relocate_extrkr.sh failed" echo "ABNORMAL EXIT!!!!!!!!!!!" 
echo - ${TRACE_ON:-set -x} + set_trace if [ -s $DATA/err_exit ]; then $DATA/err_exit "Script $USHRELO/tropcy_relocate_extrkr.sh failed" else @@ -651,7 +639,7 @@ else # check for success # ----------------- - echo; ${TRACE_ON:-set -x} + echo; set_trace if [ "$errSTATUS" -gt '0' ]; then if [ -s $DATA/err_exit ]; then $DATA/err_exit "Script RELOCATE_GES failed" @@ -700,38 +688,35 @@ else rm -f RELOCATE_GES cmd if [ "$SENDCOM" = "YES" ]; then - cp rel_inform1 ${COMSP}inform.relocate.$tmmark - cp tcvitals ${COMSP}tcvitals.relocate.$tmmark + cp "rel_inform1" "${COM_OBS}/${RUN}.${cycle}.inform.relocate.${tmmark}" + cp "tcvitals" "${COM_OBS}/${RUN}.${cycle}.tcvitals.relocate.${tmmark}" if [ "$SENDDBN" = "YES" ]; then if test "$RUN" = "gdas1" then - $DBNROOT/bin/dbn_alert MODEL GDAS1_TCI $job ${COMSP}inform.relocate.$tmmark - $DBNROOT/bin/dbn_alert MODEL GDAS1_TCI $job ${COMSP}tcvitals.relocate.$tmmark + "${DBNROOT}/bin/dbn_alert" "MODEL" "GDAS1_TCI" "${job}" "${COM_OBS}/${RUN}.${cycle}.inform.relocate.${tmmark}" + "${DBNROOT}/bin/dbn_alert" "MODEL" "GDAS1_TCI" "${job}" "${COM_OBS}/${RUN}.${cycle}.tcvitals.relocate.${tmmark}" fi if test "$RUN" = "gfs" then - $DBNROOT/bin/dbn_alert MODEL GFS_TCI $job ${COMSP}inform.relocate.$tmmark - $DBNROOT/bin/dbn_alert MODEL GFS_TCI $job ${COMSP}tcvitals.relocate.$tmmark + "${DBNROOT}/bin/dbn_alert" "MODEL" "GFS_TCI" "${job}" "${COM_OBS}/${RUN}.${cycle}.inform.relocate.${tmmark}" + "${DBNROOT}/bin/dbn_alert" "MODEL" "GFS_TCI" "${job}" "${COM_OBS}/${RUN}.${cycle}.tcvitals.relocate.${tmmark}" fi fi fi # -------------------------------------------------------------------------- # Since relocation processing has ended sucessfully (and the center sigma -# guess has been modified), remove ${COMSP}sgesprep_pathname.$tmmark (which +# guess has been modified), remove ${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.$tmmark (which # had earlier had getges center sigma guess pathname written into it - in # case of error or no input tcvitals records 
found) - the subsequent PREP -# step will correctly update ${COMSP}sgesprep_pathname.$tmmark to point to +# step will correctly update ${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.$tmmark to point to # the sgesprep file updated here by the relocation # -------------------------------------------------------------------------- - rm ${COMSP}sgesprep_pathname.$tmmark + rm "${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.${tmmark}" - msg="TROPICAL CYCLONE RELOCATION PROCESSING SUCCESSFULLY COMPLETED FOR \ + echo "TROPICAL CYCLONE RELOCATION PROCESSING SUCCESSFULLY COMPLETED FOR \ $CDATE10" - set +u - [ -n "$jlogfile" ] && postmsg "$jlogfile" "$msg" - set -u # end GFDL ges manipulation # ------------------------- diff --git a/ush/tropcy_relocate_extrkr.sh b/ush/tropcy_relocate_extrkr.sh index 79295cead0a..ede2318c4ad 100755 --- a/ush/tropcy_relocate_extrkr.sh +++ b/ush/tropcy_relocate_extrkr.sh @@ -239,7 +239,7 @@ cmodel=$(echo ${cmodel} | tr "[A-Z]" "[a-z]") case ${cmodel} in - gdas) set +x; echo " "; echo " ++ operational GDAS chosen"; ${TRACE_ON:-set -x}; + gdas) set +x; echo " "; echo " ++ operational GDAS chosen"; set_trace; fcstlen=9 ; fcsthrs="" for fhr in $( seq 0 $BKGFREQ 9); do @@ -272,48 +272,48 @@ case ${cmodel} in # jpdtn=0 for deterministic data. 
g2_jpdtn=0 model=8;; - gfs) set +x; echo " "; echo " ++ operational GFS chosen"; ${TRACE_ON:-set -x}; + gfs) set +x; echo " "; echo " ++ operational GFS chosen"; set_trace; fcsthrsgfs=' 00 06 12 18 24 30 36 42 48 54 60 66 72 78'; gfsdir=$COMIN; gfsgfile=gfs.t${dishh}z.pgrbf; model=1;; - mrf) set +x; echo " "; echo " ++ operational MRF chosen"; ${TRACE_ON:-set -x}; + mrf) set +x; echo " "; echo " ++ operational MRF chosen"; set_trace; fcsthrsmrf=' 00 12 24 36 48 60 72'; mrfdir=$COMIN; mrfgfile=drfmr.t${dishh}z.pgrbf; model=2;; - ukmet) set +x; echo " "; echo " ++ operational UKMET chosen"; ${TRACE_ON:-set -x}; + ukmet) set +x; echo " "; echo " ++ operational UKMET chosen"; set_trace; fcsthrsukmet=' 00 12 24 36 48 60 72'; ukmetdir=$COMIN; ukmetgfile=ukmet.t${dishh}z.ukmet; model=3;; - ecmwf) set +x; echo " "; echo " ++ operational ECMWF chosen"; ${TRACE_ON:-set -x}; + ecmwf) set +x; echo " "; echo " ++ operational ECMWF chosen"; set_trace; fcsthrsecmwf=' 00 24 48 72'; ecmwfdir=$COMIN; ecmwfgfile=ecmgrb25.t12z; model=4;; - ngm) set +x; echo " "; echo " ++ operational NGM chosen"; ${TRACE_ON:-set -x}; + ngm) set +x; echo " "; echo " ++ operational NGM chosen"; set_trace; fcsthrsngm=' 00 06 12 18 24 30 36 42 48'; ngmdir=$COMIN; ngmgfile=ngm.t${dishh}z.pgrb.f; model=5;; - nam) set +x; echo " "; echo " ++ operational Early NAM chosen"; ${TRACE_ON:-set -x}; + nam) set +x; echo " "; echo " ++ operational Early NAM chosen"; set_trace; fcsthrsnam=' 00 06 12 18 24 30 36 42 48'; namdir=$COMIN; namgfile=nam.t${dishh}z.awip32; model=6;; - ngps) set +x; echo " "; echo " ++ operational NAVGEM chosen"; ${TRACE_ON:-set -x}; + ngps) set +x; echo " "; echo " ++ operational NAVGEM chosen"; set_trace; fcsthrsngps=' 00 12 24 36 48 60 72'; #ngpsdir=/com/hourly/prod/hourly.${CENT}${symd}; ngpsdir=$OMIN; ngpsgfile=fnoc.t${dishh}z; model=7;; other) set +x; echo " "; echo " Model selected by user is ${cmodel}, which is a "; - echo "user-defined model, NOT operational...."; echo " "; 
${TRACE_ON:-set -x}; + echo "user-defined model, NOT operational...."; echo " "; set_trace; model=9;; *) set +x; echo " "; echo " !!! Model selected is not recognized."; echo " Model= ---> ${cmodel} <--- ..... Please submit the script again...."; - echo " "; ${TRACE_ON:-set -x}; exit 8;; + echo " "; set_trace; exit 8;; esac @@ -377,7 +377,7 @@ if [ ${cmodel} = 'other' ]; then echo " replace the forecast hour characters 00 with XX. Please check the" echo " name in the kickoff script and qsub it again. Exiting....." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -400,7 +400,7 @@ if [ ${cmodel} = 'other' ]; then echo " " echo " !!! Exiting loop, only processing 14 forecast files ...." echo " " - ${TRACE_ON:-set -x} + set_trace break fi @@ -415,7 +415,7 @@ if [ ${cmodel} = 'other' ]; then echo " " echo " +++ Found file ${fnamebeg}${fhour}${fnameend}" echo " " - ${TRACE_ON:-set -x} + set_trace let fhrct=fhrct+1 else fflag='n' @@ -435,7 +435,7 @@ if [ ${cmodel} = 'other' ]; then echo " !!! Please check the directory to make sure the file" echo " !!! is there and then submit this job again." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -444,7 +444,7 @@ if [ ${cmodel} = 'other' ]; then echo " Max forecast hour is $maxhour" echo " List of forecast hours: $fcsthrsother" echo " " - ${TRACE_ON:-set -x} + set_trace # -------------------------------------------------- # In order for the fortran program to know how many @@ -526,7 +526,7 @@ if [ ${numvitrecs} -eq 0 ]; then echo "!!! It could just be that there are no storms for the current" echo "!!! time. Please check the dates and submit this job again...." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -574,7 +574,7 @@ pgm=$(basename $SUPVX) if [ -s $DATA/prep_step ]; then set +e . 
$DATA/prep_step - ${ERR_EXIT_ON:-set -eu} + set_strict else [ -f errfile ] && rm errfile export XLFUNITS=0 @@ -613,14 +613,14 @@ set +x echo echo 'The foreground exit status for SUPVIT is ' $err echo -${TRACE_ON:-set -x} +set_trace if [ $err -eq 0 ]; then set +x echo " " echo " Normal end for program supvitql (which updates TC vitals file)." echo " " - ${TRACE_ON:-set -x} + set_trace else set +x echo " " @@ -630,7 +630,7 @@ else echo "!!! model= ${cmodel}, forecast initial time = ${symd}${dishh}" echo "!!! Exiting...." echo " " - ${TRACE_ON:-set -x} + set_trace fi if [ -s $DATA/err_chk ]; then $DATA/err_chk @@ -660,7 +660,7 @@ if [ ${numvitrecs} -eq 0 ]; then echo "!!! File ${vdir}/vitals.upd.${cmodel}.${symd}${dishh} is empty." echo "!!! Please check the dates and submit this job again...." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -676,7 +676,7 @@ echo " Below is a list of the storms to be processed: " | tee -a storm_list echo " " | tee -a storm_list cat ${vdir}/vitals.upd.${cmodel}.${symd}${dishh} | tee -a storm_list echo " " | tee -a storm_list -${TRACE_ON:-set -x} +set_trace set +u [ -n "../$pgmout" ] && cat storm_list >> ../$pgmout @@ -729,7 +729,7 @@ echo " NOW CUTTING APART INPUT GRIB FILES TO " echo " CREATE 1 BIG GRIB INPUT FILE " echo " -----------------------------------------" echo " " -${TRACE_ON:-set -x} +set_trace #grid='255 0 151 71 70000 190000 128 0000 340000 1000 1000 64' #grid='255 0 360 181 90000 0000 128 -90000 -1000 1000 1000 64' @@ -756,7 +756,7 @@ if [ ${model} -eq 5 ]; then echo " !!! in the analysis data." echo " *******************************************************************" echo " " - ${TRACE_ON:-set -x} + set_trace fi if [ -s ${vdir}/ngmlatlon.pgrb.${symd}${dishh} ]; then @@ -772,7 +772,7 @@ if [ ${model} -eq 5 ]; then echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " !!! NGM File missing: ${ngmdir}/${ngmgfile}${fhour}" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" 
- ${TRACE_ON:-set -x} + set_trace continue fi if [ -s $TMPDIR/tmpixfile ]; then rm $TMPDIR/tmpixfile; fi @@ -783,7 +783,7 @@ if [ ${model} -eq 5 ]; then echo " " echo " Extracting NGM GRIB data for forecast hour = $fhour" echo " " - ${TRACE_ON:-set -x} + set_trace g1=${ngmdir}/${ngmgfile}${fhour} @@ -807,7 +807,7 @@ if [ ${model} -eq 5 ]; then echo "!!! sure you've allocated enough memory for this job (error 134 using $COPYGB is " echo "!!! typically due to using more memory than you've allocated). Exiting....." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -846,7 +846,7 @@ if [ ${model} -eq 6 ]; then echo " !!! in the analysis data." echo " *******************************************************************" echo " " - ${TRACE_ON:-set -x} + set_trace fi if [ -s ${vdir}/namlatlon.pgrb.${symd}${dishh} ]; then @@ -862,7 +862,7 @@ if [ ${model} -eq 6 ]; then echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " !!! Early NAM File missing: ${namdir}/${namgfile}${fhour}.tm00" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" - ${TRACE_ON:-set -x} + set_trace continue fi if [ -s $TMPDIR/tmpixfile ]; then rm $TMPDIR/tmpixfile; fi @@ -873,7 +873,7 @@ if [ ${model} -eq 6 ]; then echo " " echo " Extracting Early NAM GRIB data for forecast hour = $fhour" echo " " - ${TRACE_ON:-set -x} + set_trace g1=${namdir}/${namgfile}${fhour}.tm00 @@ -898,7 +898,7 @@ if [ ${model} -eq 6 ]; then echo "!!! sure you've allocated enough memory for this job (error 134 using $COPYGB is " echo "!!! typically due to using more memory than you've allocated). Exiting....." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -946,7 +946,7 @@ if [ ${model} -eq 4 ]; then echo " " echo " !!! Due to missing ECMWF file, execution is ending...." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -989,7 +989,7 @@ if [ ${model} -eq 1 ]; then echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " !!! 
GFS File missing: ${gfsdir}/${gfsgfile}${fhour}" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" - ${TRACE_ON:-set -x} + set_trace continue fi @@ -1060,7 +1060,7 @@ if [ ${model} -eq 8 ]; then echo " !!! gdas File missing: $gfile" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " " - ${TRACE_ON:-set -x} + set_trace continue fi @@ -1109,7 +1109,7 @@ if [ ${model} -eq 8 ]; then echo " !!! gdas File missing: $gfile" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " " - ${TRACE_ON:-set -x} + set_trace continue fi @@ -1164,7 +1164,7 @@ if [ ${model} -eq 2 ]; then echo " !!! MRF File missing: ${mrfdir}/${mrfgfile}${fhour}" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " " - ${TRACE_ON:-set -x} + set_trace continue fi @@ -1219,7 +1219,7 @@ if [ ${model} -eq 3 ]; then echo " !!! UKMET File missing: ${ukmetdir}/${ukmetgfile}${fhour}" echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " " - ${TRACE_ON:-set -x} + set_trace continue fi @@ -1260,7 +1260,7 @@ if [ ${model} -eq 7 ]; then echo " " echo " !!! Due to missing NAVGEM file, execution is ending...." echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -1335,7 +1335,7 @@ if [ ${model} -eq 9 ]; then echo "!!! Forecast File missing: ${otherdir}/${fnamebeg}00${fnameend}" echo "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" echo " " - ${TRACE_ON:-set -x} + set_trace continue fi @@ -1409,7 +1409,7 @@ if [ ${model} -eq 9 ]; then echo "!!! sure you've allocated enough memory for this job (error 134 using $COPYGB is " echo "!!! typically due to using more memory than you've allocated). Exiting....." 
echo " " - ${TRACE_ON:-set -x} + set_trace exit 8 fi @@ -1440,9 +1440,9 @@ while [ $ist -le 15 ] do if [ ${stormflag[${ist}]} -ne 1 ] then - set +x; echo "Storm number $ist NOT selected for processing"; ${TRACE_ON:-set -x} + set +x; echo "Storm number $ist NOT selected for processing"; set_trace else - set +x; echo "Storm number $ist IS selected for processing...."; ${TRACE_ON:-set -x} + set +x; echo "Storm number $ist IS selected for processing...."; set_trace fi let ist=ist+1 done @@ -1561,7 +1561,7 @@ set +x echo echo 'The foreground exit status for GETTRK is ' $err echo -${TRACE_ON:-set -x} +set_trace if [ -s $DATA/err_chk ]; then $DATA/err_chk diff --git a/ush/wave_grib2_sbs.sh b/ush/wave_grib2_sbs.sh index a4463156f64..8511515abbe 100755 --- a/ush/wave_grib2_sbs.sh +++ b/ush/wave_grib2_sbs.sh @@ -25,55 +25,54 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" # 0.a Basic modes of operation - cd $GRIBDATA +cd "${GRIBDATA}" || exit 2 - alertName=$(echo $RUN|tr [a-z] [A-Z]) +alertName=${RUN^^} - grdID=$1 - gribDIR=${grdID}_grib - rm -rfd ${gribDIR} - mkdir ${gribDIR} - err=$? - if [ $err != 0 ] - then - set +x - echo ' ' - echo '******************************************************************************* ' - echo '*** FATAL ERROR : ERROR IN ww3_grib2 (COULD NOT CREATE TEMP DIRECTORY) *** ' - echo '******************************************************************************* ' - echo ' ' - ${TRACE_ON:-set -x} - exit 1 - fi +grdID=$1 +gribDIR="${grdID}_grib" +rm -rfd "${gribDIR}" +mkdir "${gribDIR}" +err=$? 
+if [[ ${err} != 0 ]]; then + set +x + echo ' ' + echo '******************************************************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '******************************************************************************* ' + echo ' ' + set_trace + exit 1 +fi - cd ${gribDIR} +cd "${gribDIR}" || exit 2 # 0.b Define directories and the search path. # The tested variables should be exported by the postprocessor script. - GRIDNR=$2 - MODNR=$3 - ymdh=$4 - fhr=$5 - grdnam=$6 - grdres=$7 - gribflags=$8 - ngrib=1 # only one time slice - dtgrib=3600 # only one time slice +GRIDNR=$2 +MODNR=$3 +ymdh=$4 +fhr=$5 +grdnam=$6 +grdres=$7 +gribflags=$8 +ngrib=1 # only one time slice +dtgrib=3600 # only one time slice # SBS one time slice per file - FH3=$(printf %03i $fhr) +FH3=$(printf %03i "${fhr}") # Verify if grib2 file exists from interrupted run - ENSTAG="" - if [ ${waveMEMB} ]; then ENSTAG=".${membTAG}${waveMEMB}" ; fi - outfile=${WAV_MOD_TAG}.${cycle}${ENSTAG}.${grdnam}.${grdres}.f${FH3}.grib2 +ENSTAG="" +if [[ -n ${waveMEMB} ]]; then ENSTAG=".${membTAG}${waveMEMB}" ; fi +outfile="${WAV_MOD_TAG}.${cycle}${ENSTAG}.${grdnam}.${grdres}.f${FH3}.grib2" # Only create file if not present in COM - if [ ! -s ${COMOUT}/gridded/${outfile}.idx ]; then +if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then set +x echo ' ' @@ -81,178 +80,175 @@ source "$HOMEgfs/ush/preamble.sh" echo '! 
Make GRIB files |' echo '+--------------------------------+' echo " Model ID : $WAV_MOD_TAG" - ${TRACE_ON:-set -x} + set_trace - if [ -z "$CDATE" ] || [ -z "$cycle" ] || [ -z "$EXECwave" ] || \ - [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ - [ -z "$gribflags" ] || \ - [ -z "$GRIDNR" ] || [ -z "$MODNR" ] || [ -z "$SENDDBN" ] - then + if [[ -z "${PDY}" ]] || [[ -z ${cyc} ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECwave}" ]] || \ + [[ -z "${COM_WAVE_GRID}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${SENDCOM}" ]] || \ + [[ -z "${gribflags}" ]] || [[ -z "${GRIDNR}" ]] || [[ -z "${MODNR}" ]] || \ + [[ -z "${SENDDBN}" ]]; then set +x echo ' ' echo '***************************************************' echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' echo '***************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 fi -# 0.c Starting time for output + # 0.c Starting time for output - tstart="$(echo $ymdh | cut -c1-8) $(echo $ymdh | cut -c9-10)0000" + tstart="${ymdh:0:8} ${ymdh:8:2}0000" set +x - echo " Starting time : $tstart" - echo " Time step : Single SBS - echo " Number of times : Single SBS - echo " GRIB field flags : $gribflags" + echo " Starting time : ${tstart}" + echo " Time step : Single SBS" + echo " Number of times : Single SBS" + echo " GRIB field flags : ${gribflags}" echo ' ' - ${TRACE_ON:-set -x} + set_trace -# 0.e Links to working directory + # 0.e Links to working directory - ln -s ${DATA}/mod_def.$grdID mod_def.ww3 - ln -s ${DATA}/output_${ymdh}0000/out_grd.$grdID out_grd.ww3 + ln -s "${DATA}/mod_def.${grdID}" "mod_def.ww3" + ln -s "${DATA}/output_${ymdh}0000/out_grd.${grdID}" "out_grd.ww3" -# --------------------------------------------------------------------------- # -# 1. Generate GRIB file with all data -# 1.a Generate input file for ww3_grib2 -# Template copied in mother script ... + # --------------------------------------------------------------------------- # + # 1. 
Generate GRIB file with all data + # 1.a Generate input file for ww3_grib2 + # Template copied in mother script ... set +x echo " Generate input file for ww3_grib2" - ${TRACE_ON:-set -x} + set_trace - sed -e "s/TIME/$tstart/g" \ - -e "s/DT/$dtgrib/g" \ - -e "s/NT/$ngrib/g" \ - -e "s/GRIDNR/$GRIDNR/g" \ - -e "s/MODNR/$MODNR/g" \ - -e "s/FLAGS/$gribflags/g" \ - ${DATA}/ww3_grib2.${grdID}.inp.tmpl > ww3_grib.inp + sed -e "s/TIME/${tstart}/g" \ + -e "s/DT/${dtgrib}/g" \ + -e "s/NT/${ngrib}/g" \ + -e "s/GRIDNR/${GRIDNR}/g" \ + -e "s/MODNR/${MODNR}/g" \ + -e "s/FLAGS/${gribflags}/g" \ + "${DATA}/ww3_grib2.${grdID}.inp.tmpl" > ww3_grib.inp echo "ww3_grib.inp" cat ww3_grib.inp -# 1.b Run GRIB packing program + + # 1.b Run GRIB packing program set +x echo " Run ww3_grib2" - echo " Executing $EXECwave/ww3_grib" - ${TRACE_ON:-set -x} + echo " Executing ${EXECwave}/ww3_grib" + set_trace export pgm=ww3_grib;. prep_step - $EXECwave/ww3_grib > grib2_${grdnam}_${FH3}.out 2>&1 + "${EXECwave}/ww3_grib" > "grib2_${grdnam}_${FH3}.out" 2>&1 export err=$?;err_chk - if [ ! -s gribfile ]; then - set +x - echo ' ' - echo '************************************************ ' - echo '*** FATAL ERROR : ERROR IN ww3_grib encoding *** ' - echo '************************************************ ' - echo ' ' - ${TRACE_ON:-set -x} - exit 3 - fi - - if [ $fhr -gt 0 ]; then - $WGRIB2 gribfile -set_date $CDATE -set_ftime "$fhr hour fcst" -grib ${COMOUT}/gridded/${outfile} + if [ ! -s gribfile ]; then + set +x + echo ' ' + echo '************************************************ ' + echo '*** FATAL ERROR : ERROR IN ww3_grib encoding *** ' + echo '************************************************ ' + echo ' ' + set_trace + exit 3 + fi + + if (( fhr > 0 )); then + ${WGRIB2} gribfile -set_date "${PDY}${cyc}" -set_ftime "${fhr} hour fcst" -grib "${COM_WAVE_GRID}/${outfile}" err=$? 
else - $WGRIB2 gribfile -set_date $CDATE -set_ftime "$fhr hour fcst" -set table_1.4 1 -set table_1.2 1 -grib ${COMOUT}/gridded/${outfile} + ${WGRIB2} gribfile -set_date "${PDY}${cyc}" -set_ftime "${fhr} hour fcst" \ + -set table_1.4 1 -set table_1.2 1 -grib "${COM_WAVE_GRID}/${outfile}" err=$? fi - if [ $err != 0 ] - then + if [[ ${err} != 0 ]]; then set +x echo ' ' echo '********************************************* ' echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' echo '********************************************* ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi -# Create index - $WGRIB2 -s $COMOUT/gridded/${outfile} > $COMOUT/gridded/${outfile}.idx + # Create index + ${WGRIB2} -s "${COM_WAVE_GRID}/${outfile}" > "${COM_WAVE_GRID}/${outfile}.idx" -# Create grib2 subgrid is this is the source grid - if [ "${grdID}" = "${WAV_SUBGRBSRC}" ]; then + # Create grib2 subgrid if this is the source grid + if [[ "${grdID}" = "${WAV_SUBGRBSRC}" ]]; then for subgrb in ${WAV_SUBGRB}; do subgrbref=$(echo ${!subgrb} | cut -d " " -f 1-20) subgrbnam=$(echo ${!subgrb} | cut -d " " -f 21) subgrbres=$(echo ${!subgrb} | cut -d " " -f 22) subfnam="${WAV_MOD_TAG}.${cycle}${ENSTAG}.${subgrbnam}.${subgrbres}.f${FH3}.grib2" - $COPYGB2 -g "${subgrbref}" -i0 -x ${COMOUT}/gridded/${outfile} ${COMOUT}/gridded/${subfnam} - $WGRIB2 -s $COMOUT/gridded/${subfnam} > $COMOUT/gridded/${subfnam}.idx + ${COPYGB2} -g "${subgrbref}" -i0 -x "${COM_WAVE_GRID}/${outfile}" "${COM_WAVE_GRID}/${subfnam}" + ${WGRIB2} -s "${COM_WAVE_GRID}/${subfnam}" > "${COM_WAVE_GRID}/${subfnam}.idx" done fi -# 1.e Save in /com - - if [ ! -s $COMOUT/gridded/${outfile} ] - then - set +x - echo ' ' - echo '********************************************* ' - echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' - echo '********************************************* ' - echo ' ' - echo " Error in moving grib file ${outfile} to com" - echo ' ' - ${TRACE_ON:-set -x} - exit 4 - fi - if [ !
-s $COMOUT/gridded/${outfile} ] - then - set +x - echo ' ' - echo '*************************************************** ' - echo '*** FATAL ERROR : ERROR IN ww3_grib2 INDEX FILE *** ' - echo '*************************************************** ' - echo ' ' - echo " Error in moving grib file ${outfile}.idx to com" - echo ' ' - ${TRACE_ON:-set -x} - exit 4 - fi - - if [[ "$SENDDBN" = 'YES' ]] && [[ ${outfile} != *global.0p50* ]] - then - set +x - echo " Alerting GRIB file as $COMOUT/gridded/${outfile}" - echo " Alerting GRIB index file as $COMOUT/gridded/${outfile}.idx" - ${TRACE_ON:-set -x} - $DBNROOT/bin/dbn_alert MODEL ${alertName}_WAVE_GB2 $job $COMOUT/gridded/${outfile} - $DBNROOT/bin/dbn_alert MODEL ${alertName}_WAVE_GB2_WIDX $job $COMOUT/gridded/${outfile}.idx - else - echo "${outfile} is global.0p50, not alert out" - fi + # 1.e Save in /com + if [[ ! -s "${COM_WAVE_GRID}/${outfile}" ]]; then + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' + echo '********************************************* ' + echo ' ' + echo " Error in moving grib file ${outfile} to com" + echo ' ' + set_trace + exit 4 + fi + if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then + set +x + echo ' ' + echo '*************************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 INDEX FILE *** ' + echo '*************************************************** ' + echo ' ' + echo " Error in moving grib file ${outfile}.idx to com" + echo ' ' + set_trace + exit 4 + fi -# --------------------------------------------------------------------------- # -# 3. 
Clean up the directory + if [[ "${SENDDBN}" = 'YES' ]] && [[ ${outfile} != *global.0p50* ]]; then + set +x + echo " Alerting GRIB file as ${COM_WAVE_GRID}/${outfile}" + echo " Alerting GRIB index file as ${COM_WAVE_GRID}/${outfile}.idx" + set_trace + "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_GB2" "${job}" "${COM_WAVE_GRID}/${outfile}" + "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_GB2_WIDX" "${job}" "${COM_WAVE_GRID}/${outfile}.idx" + else + echo "${outfile} is global.0p50 or SENDDBN is NO, no alert sent" + fi + + + # --------------------------------------------------------------------------- # + # 3. Clean up the directory rm -f gribfile set +x echo " Removing work directory after success." - ${TRACE_ON:-set -x} + set_trace cd ../ - mv -f ${gribDIR} done.${gribDIR} + mv -f "${gribDIR}" "done.${gribDIR}" - else - set +x - echo ' ' - echo " File ${COMOUT}/gridded/${outfile} found, skipping generation process" - echo ' ' - ${TRACE_ON:-set -x} - fi +else + set +x + echo ' ' + echo " File ${COM_WAVE_GRID}/${outfile} found, skipping generation process" + echo ' ' + set_trace +fi # End of ww3_grib2.sh -------------------------------------------------- # diff --git a/ush/wave_grid_interp_sbs.sh b/ush/wave_grid_interp_sbs.sh index 59a604d0f5c..bf34068874d 100755 --- a/ush/wave_grid_interp_sbs.sh +++ b/ush/wave_grid_interp_sbs.sh @@ -48,7 +48,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : ERROR IN ww3_grid_interp (COULD NOT CREATE TEMP DIRECTORY) *** ' echo '************************************************************************************* ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 fi @@ -63,11 +63,11 @@ source "$HOMEgfs/ush/preamble.sh" echo '! 
Make GRID files |' echo '+--------------------------------+' echo " Model ID : $WAV_MOD_TAG" - ${TRACE_ON:-set -x} + set_trace - if [ -z "$CDATE" ] || [ -z "$cycle" ] || [ -z "$EXECwave" ] || \ - [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ - [ -z "$SENDDBN" ] || [ -z "$waveGRD" ] + if [[ -z "${PDY}" ]] || [[ -z "${cyc}" ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECwave}" ]] || \ + [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${SENDCOM}" ]] || \ + [[ -z "${SENDDBN}" ]] || [ -z "${waveGRD}" ] then set +x echo ' ' @@ -75,8 +75,8 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' echo '***************************************************' echo ' ' - echo "$CDATE $cycle $EXECwave $COMOUT $WAV_MOD_TAG $SENDCOM $SENDDBN $waveGRD" - ${TRACE_ON:-set -x} + echo "${PDY}${cyc} ${cycle} ${EXECwave} ${COM_WAVE_PREP} ${WAV_MOD_TAG} ${SENDCOM} ${SENDDBN} ${waveGRD}" + set_trace exit 1 fi @@ -103,7 +103,7 @@ source "$HOMEgfs/ush/preamble.sh" # 1. Generate GRID file with all data # 1.a Generate Input file - time="$(echo $ymdh | cut -c1-8) $(echo $ymdh | cut -c9-10)0000" + time="${ymdh:0:8} ${ymdh:8:2}0000" sed -e "s/TIME/$time/g" \ -e "s/DT/$dt/g" \ @@ -118,7 +118,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' ' echo " Copying $FIXwave/WHTGRIDINT.bin.${grdID} " - ${TRACE_ON:-set -x} + set_trace cp $FIXwave/WHTGRIDINT.bin.${grdID} ${DATA} wht_OK='yes' else @@ -138,7 +138,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Run ww3_gint echo " Executing $EXECwave/ww3_gint - ${TRACE_ON:-set -x} + set_trace export pgm=ww3_gint;. 
prep_step $EXECwave/ww3_gint 1> gint.${grdID}.out 2>&1 @@ -160,7 +160,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : ERROR IN ww3_gint interpolation * ' echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi @@ -175,15 +175,15 @@ source "$HOMEgfs/ush/preamble.sh" if [ "$SENDCOM" = 'YES' ] then set +x - echo " Saving GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${CDATE}" - ${TRACE_ON:-set -x} - cp ${DATA}/output_${ymdh}0000/out_grd.$grdID $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${CDATE} + echo " Saving GRID file as ${COM_WAVE_PREP}/${WAV_MOD_TAG}.out_grd.${grdID}.${PDY}${cyc}" + set_trace + cp "${DATA}/output_${ymdh}0000/out_grd.${grdID}" "${COM_WAVE_PREP}/${WAV_MOD_TAG}.out_grd.${grdID}.${PDY}${cyc}" # if [ "$SENDDBN" = 'YES' ] # then # set +x -# echo " Alerting GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${CDATE} -# ${TRACE_ON:-set -x} +# echo " Alerting GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${PDY}${cyc} +# set_trace # # PUT DBNET ALERT HERE .... diff --git a/ush/wave_grid_moddef.sh b/ush/wave_grid_moddef.sh index 80c041df37c..5b1b212a168 100755 --- a/ush/wave_grid_moddef.sh +++ b/ush/wave_grid_moddef.sh @@ -38,7 +38,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '+--------------------------------+' echo " Grid : $1" echo ' ' - ${TRACE_ON:-set -x} + set_trace # 0.b Check if grid set @@ -50,7 +50,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** Grid not identifife in ww3_mod_def.sh ***' echo '**************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 else grdID=$1 @@ -67,7 +67,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** EXPORTED VARIABLES IN ww3_mod_def.sh NOT SET ***' echo '*********************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 2 fi @@ -79,7 +79,7 @@ source "$HOMEgfs/ush/preamble.sh" echo ' Creating mod_def file ...' 
echo " Executing $EXECwave/ww3_grid" echo ' ' - ${TRACE_ON:-set -x} + set_trace rm -f ww3_grid.inp ln -sf ../ww3_grid.inp.$grdID ww3_grid.inp @@ -95,13 +95,13 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : ERROR IN ww3_grid *** ' echo '******************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi if [ -f mod_def.ww3 ] then - cp mod_def.ww3 $COMOUT/rundata/${CDUMP}wave.mod_def.${grdID} + cp mod_def.ww3 "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" mv mod_def.ww3 ../mod_def.$grdID else set +x @@ -110,7 +110,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : MOD DEF FILE NOT FOUND *** ' echo '******************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 4 fi diff --git a/ush/wave_outp_cat.sh b/ush/wave_outp_cat.sh index 7adf77dbf0c..f4bf6b22944 100755 --- a/ush/wave_outp_cat.sh +++ b/ush/wave_outp_cat.sh @@ -38,7 +38,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** LOCATION ID IN ww3_outp_spec.sh NOT SET ***' echo '***********************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 else buoy=$bloc @@ -56,7 +56,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** EXPORTED VARIABLES IN ww3_outp_cat.sh NOT SET ***' echo '******************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi @@ -66,7 +66,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Generate input file for ww3_outp." 
- ${TRACE_ON:-set -x} + set_trace if [ "$specdir" = "bull" ] then @@ -113,7 +113,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "*** FATAL ERROR : OUTPUT DATA FILE FOR BOUY $bouy at ${ymdh} NOT FOUND *** " echo '************************************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=2; export err;${errchk} exit $err fi @@ -137,7 +137,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " FATAL ERROR : OUTPUTFILE ${outfile} not created " echo '*************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace err=2; export err;${errchk} exit $err fi diff --git a/ush/wave_outp_spec.sh b/ush/wave_outp_spec.sh index a652d36745e..5acc0f95ab5 100755 --- a/ush/wave_outp_spec.sh +++ b/ush/wave_outp_spec.sh @@ -45,7 +45,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : ERROR IN ww3_outp_spec (COULD NOT CREATE TEMP DIRECTORY) *** ' echo '****************************************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 fi @@ -57,7 +57,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '! 
Make spectral file |' echo '+--------------------------------+' echo " Model ID : $WAV_MOD_TAG" - ${TRACE_ON:-set -x} + set_trace # 0.b Check if buoy location set @@ -69,7 +69,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** LOCATION ID IN ww3_outp_spec.sh NOT SET ***' echo '***********************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 else buoy=$bloc @@ -84,7 +84,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Location ID/# : $buoy (${point})" echo " Spectral output start time : $ymdh " echo ' ' - ${TRACE_ON:-set -x} + set_trace break fi done < tmp_list.loc @@ -95,7 +95,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** LOCATION ID IN ww3_outp_spec.sh NOT RECOGNIZED ***' echo '******************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 2 fi fi @@ -113,7 +113,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** EXPORTED VARIABLES IN ww3_outp_spec.sh NOT SET ***' echo '******************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi @@ -125,7 +125,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Output starts at $tstart." echo ' ' - ${TRACE_ON:-set -x} + set_trace # 0.e sync important files @@ -144,7 +144,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Generate input file for ww3_outp." - ${TRACE_ON:-set -x} + set_trace if [ "$specdir" = "bull" ] then @@ -171,7 +171,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Executing $EXECwave/ww3_outp" - ${TRACE_ON:-set -x} + set_trace export pgm=ww3_outp;. 
prep_step $EXECwave/ww3_outp 1> outp_${specdir}_${buoy}.out 2>&1 @@ -186,7 +186,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : ERROR IN ww3_outp *** ' echo '******************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 4 fi @@ -230,7 +230,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : OUTPUT DATA FILE FOR BOUY $bouy NOT FOUND *** ' echo '***************************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 5 fi diff --git a/ush/wave_prnc_cur.sh b/ush/wave_prnc_cur.sh index 7b193313d32..6b1ab19db25 100755 --- a/ush/wave_prnc_cur.sh +++ b/ush/wave_prnc_cur.sh @@ -29,18 +29,19 @@ curfile=$2 fhr=$3 flagfirst=$4 fh3=$(printf "%03d" "${fhr#0}") +fext='f' # Timing has to be made relative to the single 00z RTOFS cycle for that PDY mkdir -p rtofs_${ymdh_rtofs} cd rtofs_${ymdh_rtofs} -ncks -x -v sst,sss,layer_density $curfile cur_uv_${PDY}_${fext}${fh3}.nc -ncks -O -a -h -x -v Layer cur_uv_${PDY}_${fext}${fh3}.nc cur_temp1.nc +ncks -x -v sst,sss,layer_density "${curfile}" "cur_uv_${PDY}_${fext}${fh3}.nc" +ncks -O -a -h -x -v Layer "cur_uv_${PDY}_${fext}${fh3}.nc" "cur_temp1.nc" ncwa -h -O -a Layer cur_temp1.nc cur_temp2.nc ncrename -h -O -v MT,time -d MT,time cur_temp2.nc ncks -v u_velocity,v_velocity cur_temp2.nc cur_temp3.nc -mv -f cur_temp3.nc cur_uv_${PDY}_${fext}${fh3}_flat.nc +mv -f "cur_temp3.nc" "cur_uv_${PDY}_${fext}${fh3}_flat.nc" # Convert to regular lat lon file # If weights need to be regenerated due to CDO ver change, use: @@ -48,19 +49,19 @@ mv -f cur_temp3.nc cur_uv_${PDY}_${fext}${fh3}_flat.nc cp ${FIXwave}/weights_rtofs_to_r4320x2160.nc ./weights.nc # Interpolate to regular 5 min grid -$CDO remap,r4320x2160,weights.nc cur_uv_${PDY}_${fext}${fh3}_flat.nc cur_5min_01.nc +${CDO} remap,r4320x2160,weights.nc "cur_uv_${PDY}_${fext}${fh3}_flat.nc" "cur_5min_01.nc" # Perform 9-point smoothing twice to make RTOFS data less noisy when # interpolating from 1/12
deg RTOFS grid to 1/6 deg wave grid if [ "WAV_CUR_CDO_SMOOTH" = "YES" ]; then - $CDO -f nc -smooth9 cur_5min_01.nc cur_5min_02.nc - $CDO -f nc -smooth9 cur_5min_02.nc cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc + ${CDO} -f nc -smooth9 "cur_5min_01.nc" "cur_5min_02.nc" + ${CDO} -f nc -smooth9 "cur_5min_02.nc" "cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc" else - mv cur_5min_01.nc cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc + mv "cur_5min_01.nc" "cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc" fi # Cleanup -rm -f cur_temp[123].nc cur_5min_??.nc cur_glo_uv_${PDY}_${fext}${fh3}.nc weights.nc +rm -f cur_temp[123].nc cur_5min_??.nc "cur_glo_uv_${PDY}_${fext}${fh3}.nc" weights.nc if [ ${flagfirst} = "T" ] then @@ -70,8 +71,8 @@ else fi rm -f cur.nc -ln -s cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc cur.nc -ln -s ${DATA}/mod_def.${WAVECUR_FID} ./mod_def.ww3 +ln -s "cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc" "cur.nc" +ln -s "${DATA}/mod_def.${WAVECUR_FID}" ./mod_def.ww3 export pgm=ww3_prnc;. prep_step $EXECwave/ww3_prnc 1> prnc_${WAVECUR_FID}_${ymdh_rtofs}.out 2>&1 diff --git a/ush/wave_prnc_ice.sh b/ush/wave_prnc_ice.sh index 16473dbd1f2..a32a2b7e430 100755 --- a/ush/wave_prnc_ice.sh +++ b/ush/wave_prnc_ice.sh @@ -47,18 +47,18 @@ source "$HOMEgfs/ush/preamble.sh" echo '! Make ice fields |' echo '+--------------------------------+' echo " Model TAG : $WAV_MOD_TAG" - echo " Model ID : ${CDUMP}wave" + echo " Model ID : ${RUN}wave" echo " Ice grid ID : $WAVEICE_FID" echo " Ice file : $WAVICEFILE" echo ' ' - ${TRACE_ON:-set -x} + set_trace echo "Making ice fields."
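A note on the quoting changes in the wave_prnc_cur.sh hunk above: wrapping two filenames in a single pair of quotes (as in the ncks and rm -f lines) passes them to the command as one argument, so the file is never found. A minimal sketch of the difference; `count_args` and the sample filenames are illustrative, not part of the workflow:

```shell
# count_args is a throwaway helper: it reports how many arguments it received.
count_args() { echo "$#"; }

curfile="input file.nc"   # hypothetical path containing a space, which exposes the bug
outfile="cur_uv_f003.nc"  # hypothetical output name

count_args "${curfile} ${outfile}"    # one joined string -> prints 1
count_args "${curfile}" "${outfile}"  # two quoted words  -> prints 2
```

With separate quoting each filename arrives intact as its own argument, even when a path contains spaces.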
- if [ -z "$YMDH" ] || [ -z "$cycle" ] || \ - [ -z "$COMOUT" ] || [ -z "$FIXwave" ] || [ -z "$EXECwave" ] || \ - [ -z "$WAV_MOD_TAG" ] || [ -z "$WAVEICE_FID" ] || [ -z "$SENDCOM" ] || \ - [ -z "$COMIN_WAV_ICE" ] - then + if [[ -z "${YMDH}" ]] || [[ -z "${cycle}" ]] || \ + [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${FIXwave}" ]] || [[ -z "${EXECwave}" ]] || \ + [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${WAVEICE_FID}" ]] || [[ -z "${SENDCOM}" ]] || \ + [[ -z "${COM_OBS}" ]]; then + set +x echo ' ' echo '**************************************************' @@ -66,7 +66,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '**************************************************' echo ' ' exit 1 - ${TRACE_ON:-set -x} + set_trace echo "NON-FATAL ERROR - EXPORTED VARIABLES IN preprocessor NOT SET" fi @@ -78,7 +78,7 @@ source "$HOMEgfs/ush/preamble.sh" # 1. Get the necessary files # 1.a Copy the ice data file - file=${COMIN_WAV_ICE}/${WAVICEFILE} + file=${COM_OBS}/${WAVICEFILE} if [ -f $file ] then @@ -89,7 +89,7 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo " ice.grib copied ($file)." - ${TRACE_ON:-set -x} + set_trace else set +x echo ' ' @@ -97,7 +97,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "*** FATAL ERROR: NO ICE FILE $file *** " echo '************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace echo "FATAL ERROR - NO ICE FILE (GFS GRIB)" exit 2 fi @@ -108,7 +108,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' Extracting data from ice.grib ...' - ${TRACE_ON:-set -x} + set_trace $WGRIB2 ice.grib -netcdf icean_5m.nc 2>&1 > wgrib.out @@ -124,7 +124,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** ERROR IN UNPACKING GRIB ICE FILE *** ' echo '**************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace echo "ERROR IN UNPACKING GRIB ICE FILE." exit 3 fi @@ -139,7 +139,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' Run through preprocessor ...' 
echo ' ' - ${TRACE_ON:-set -x} + set_trace cp -f ${DATA}/ww3_prnc.ice.$WAVEICE_FID.inp.tmpl ww3_prnc.inp @@ -157,7 +157,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** WARNING: NON-FATAL ERROR IN ww3_prnc *** ' echo '******************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace echo "WARNING: NON-FATAL ERROR IN ww3_prnc." exit 4 fi @@ -175,13 +175,13 @@ source "$HOMEgfs/ush/preamble.sh" icefile=${WAV_MOD_TAG}.${WAVEICE_FID}.$cycle.ice elif [ "${WW3ATMIENS}" = "F" ] then - icefile=${CDUMP}wave.${WAVEICE_FID}.$cycle.ice + icefile=${RUN}wave.${WAVEICE_FID}.$cycle.ice fi set +x - echo " Saving ice.ww3 as $COMOUT/rundata/${icefile}" - ${TRACE_ON:-set -x} - cp ice.ww3 $COMOUT/rundata/${icefile} + echo " Saving ice.ww3 as ${COM_WAVE_PREP}/${icefile}" + set_trace + cp ice.ww3 "${COM_WAVE_PREP}/${icefile}" rm -f ice.ww3 # --------------------------------------------------------------------------- # diff --git a/ush/wave_tar.sh b/ush/wave_tar.sh index 452601dceb4..9264aac5f3e 100755 --- a/ush/wave_tar.sh +++ b/ush/wave_tar.sh @@ -42,7 +42,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " ID : $1" echo " Type : $2" echo " Number of files : $3" - ${TRACE_ON:-set -x} + set_trace # 0.b Check if type set @@ -55,7 +55,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** VARIABLES IN ww3_tar.sh NOT SET ***' echo '********************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 1 else ID=$1 @@ -76,16 +76,15 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Define directories and the search path. # The tested variables should be exported by the postprocessor script. 
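These scripts guard against unset environment variables with long chains of `[[ -z ... ]]` tests, as in the wave_tar.sh check that follows. A compact equivalent can be sketched with bash indirect expansion; `check_required` and the sample values below are illustrative, not part of the workflow:

```shell
# check_required reports every missing variable instead of stopping at the
# first one; ${!name} is bash indirect expansion (value of the named variable).
check_required() {
  local name missing=0
  for name in "$@"; do
    if [[ -z "${!name:-}" ]]; then
      echo "FATAL: required variable ${name} is not set" >&2
      missing=1
    fi
  done
  return "${missing}"
}

# Illustrative values only:
cycle="t00z"; COM_WAVE_STATION="/com/wave/station"; WAV_MOD_TAG="gfswave"
check_required cycle COM_WAVE_STATION WAV_MOD_TAG && echo "all required variables set"
```

Listing every missing name in one pass makes the fatal-error block above more useful than an echo of all values at once.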
- if [ -z "$cycle" ] || [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || \ - [ -z "$SENDCOM" ] || [ -z "$SENDDBN" ] || [ -z "${STA_DIR}" ] - then + if [[ -z "${cycle}" ]] || [[ -z "${COM_WAVE_STATION}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || \ + [[ -z "${SENDCOM}" ]] || [[ -z "${SENDDBN}" ]] || [[ -z "${STA_DIR}" ]]; then set +x echo ' ' echo '*****************************************************' echo '*** EXPORTED VARIABLES IN ww3_tar.sh NOT SET ***' echo '*****************************************************' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 2 fi @@ -97,7 +96,7 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' ' echo ' Making tar file ...' - ${TRACE_ON:-set -x} + set_trace count=0 countMAX=5 @@ -121,7 +120,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : TAR CREATION FAILED *** ' echo '***************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi @@ -132,7 +131,7 @@ source "$HOMEgfs/ush/preamble.sh" else set +x echo ' All files not found for tar. Sleeping 10 seconds and trying again ..' - ${TRACE_ON:-set -x} + set_trace sleep 10 count=$(expr $count + 1) fi @@ -147,7 +146,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : TAR CREATION FAILED *** ' echo '***************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 3 fi @@ -167,7 +166,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : SPECTRAL TAR COMPRESSION FAILED *** ' echo '***************************************************** ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 4 fi fi @@ -180,10 +179,10 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' ' - echo " Moving tar file ${file_name} to $COMOUT ..." - ${TRACE_ON:-set -x} + echo " Moving tar file ${file_name} to ${COM_WAVE_STATION} ..." + set_trace - cp ${file_name} $COMOUT/station/. + cp "${file_name}" "${COM_WAVE_STATION}/." exit=$? 
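The dbn_alert invocation in wave_tar.sh is sensitive to where the quotes fall: quoting the command path together with its first argument makes the shell look up a single command whose name contains a space, which fails at run time. A sketch of the failure mode, with plain `echo` standing in for `dbn_alert`:

```shell
# "echo hello" (quoted together) asks the shell for a command literally named
# "echo hello"; "echo" hello runs echo with one argument.
run_joined() { "echo hello"; }   # command-not-found when called
run_split()  { "echo" hello; }   # works as expected

run_joined 2>/dev/null || echo "lookup failed"   # prints: lookup failed
run_split                                        # prints: hello
```

Only the command word itself should be quoted; each argument gets its own quotes.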
@@ -195,7 +194,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** FATAL ERROR : TAR COPY FAILED *** ' echo '************************************* ' echo ' ' - ${TRACE_ON:-set -x} + set_trace exit 4 fi @@ -203,10 +202,11 @@ source "$HOMEgfs/ush/preamble.sh" then set +x echo ' ' - echo " Alerting TAR file as $COMOUT/station/${file_name}" + echo " Alerting TAR file as ${COM_WAVE_STATION}/${file_name}" echo ' ' - ${TRACE_ON:-set -x} - $DBNROOT/bin/dbn_alert MODEL ${alertName}_WAVE_TAR $job $COMOUT/station/${file_name} + set_trace + "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_TAR" "${job}" \ + "${COM_WAVE_STATION}/${file_name}" fi # --------------------------------------------------------------------------- # diff --git a/util/modulefiles/gfs_util.hera b/util/modulefiles/gfs_util.hera deleted file mode 100644 index ac8a7d941cd..00000000000 --- a/util/modulefiles/gfs_util.hera +++ /dev/null @@ -1,28 +0,0 @@ -#%Module##################################################### -## Module file for GFS util -############################################################# -# -# Loading required system modules -# - -module use /scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/modulefiles/stack -module load hpc/1.1.0 -module load hpc-intel/18.0.5.274 -module load hpc-impi/2018.0.4 - -module load bacio/2.4.1 -module load w3emc/2.7.3 -module load w3nco/2.4.1 -module load ip/3.3.3 -module load sp/2.3.3 -module load bufr/11.4.0 - -module load jasper/2.0.22 -module load png/1.6.35 -module load zlib/1.2.11 - -module load ncl/6.5.0 -module load gempak/7.4.2 - -export GEMINC=/apps/gempak/7.4.2/gempak/include -export GEMOLB=/apps/gempak/7.4.2/os/linux64/lib diff --git a/util/sorc/compile_gfs_util_wcoss.sh b/util/sorc/compile_gfs_util_wcoss.sh deleted file mode 100755 index 724626d3ea1..00000000000 --- a/util/sorc/compile_gfs_util_wcoss.sh +++ /dev/null @@ -1,49 +0,0 @@ -#!/bin/sh - -###################################################################### -# -# Build executable GFS utility
for GFS V16.0.0 -# -###################################################################### - -LMOD_EXACT_MATCH=no -source ../../sorc/machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) - -if [ "$target" = "hera" ] ; then - echo " " - echo " You are on $target " - echo " " -else - echo " " - echo " Your machine $target is not supported" - echo " The script $0 can not continue. Aborting!" - echo " " - exit -fi -echo " " - -# Load required modules -source ../modulefiles/gfs_util.${target} -module list - -dirlist="overgridid rdbfmsua webtitle mkgfsawps" -set -x - -for dir in $dirlist -do - cd ${dir}.fd - echo "PWD: $PWD" - set +x - echo " " - echo " ### ${dir} ### " - echo " " - set -x - ./compile_${dir}_wcoss.sh - set +x - echo " " - echo " ######################################### " - echo " " - cd .. - echo "BACK TO: $PWD" -done diff --git a/util/sorc/mkgfsawps.fd/compile_mkgfsawps_wcoss.sh b/util/sorc/mkgfsawps.fd/compile_mkgfsawps_wcoss.sh deleted file mode 100755 index 5d12f3e53ce..00000000000 --- a/util/sorc/mkgfsawps.fd/compile_mkgfsawps_wcoss.sh +++ /dev/null @@ -1,28 +0,0 @@ -#!/bin/sh -LMOD_EXACT_MATCH=no -source ../../../sorc/machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) - -if [ "$target" = "hera" ]; then - echo " " - echo " You are on $target " - echo " " -else - echo " " - echo " Your machine $target is not supported" - echo " The script $0 can not continue. Aborting!" 
- echo " " - exit -fi -echo " " - -# Load required modules -source ../../modulefiles/gfs_util.${target} -module list - -set -x - -mkdir -p ../../exec -make -f makefile.$target -make -f makefile.$target clean -mv mkgfsawps ../../exec diff --git a/util/sorc/mkgfsawps.fd/makefile b/util/sorc/mkgfsawps.fd/makefile deleted file mode 100755 index 86f3c417b19..00000000000 --- a/util/sorc/mkgfsawps.fd/makefile +++ /dev/null @@ -1,53 +0,0 @@ -SHELL=/bin/sh -# -SRCS= mkgfsawps.f - -OBJS= mkgfsawps.o - -# Tunable parameters -# -# FC Name of the fortran compiling system to use -# LDFLAGS Flags to the loader -# LIBS List of libraries -# CMD Name of the executable -# PROFLIB Library needed for profiling -# -FC = ifort - -LDFLAGS = -IOMP5_LIB=/usrx/local/prod/intel/2018UP01/lib/intel64/libiomp5.a - -LIBS = -Xlinker --start-group ${W3NCO_LIBd} ${W3NCO_LIBd} ${IP_LIBd} ${SP_LIBd} ${BACIO_LIB4} ${IOMP5_LIB} - -CMD = mkgfsawps -PROFLIB = -lprof - -# To perform the default compilation, use the first line -# To compile with flowtracing turned on, use the second line -# To compile giving profile additonal information, use the third line -# WARNING: SIMULTANEOUSLY PROFILING AND FLOWTRACING IS NOT RECOMMENDED -FFLAGS = -O3 -g -convert big_endian -r8 -i4 -assume noold_ldout_format - -# Lines from here on down should not need to be changed. They are the -# actual rules which make uses to build a.out. 
-# -all: $(CMD) - -$(CMD): $(OBJS) - $(FC) -o $(LDFLAGS) $(@) $(OBJS) $(LIBS) - rm -f $(OBJS) - -# Make the profiled version of the command and call it a.out.prof -# -$(CMD).prof: $(OBJS) - $(FC) -o $(LDFLAGS) $(@) $(OBJS) $(LIBS) - -rm -f $(OBJS) - -clean: - -rm -f $(OBJS) - -clobber: clean - -rm -f $(CMD) $(CMD).prof - -void: clobber - -rm -f $(SRCS) makefile diff --git a/util/sorc/mkgfsawps.fd/makefile.hera b/util/sorc/mkgfsawps.fd/makefile.hera deleted file mode 100755 index 99052691e72..00000000000 --- a/util/sorc/mkgfsawps.fd/makefile.hera +++ /dev/null @@ -1,53 +0,0 @@ -SHELL=/bin/sh -# -SRCS= mkgfsawps.f - -OBJS= mkgfsawps.o - -# Tunable parameters -# -# FC Name of the fortran compiling system to use -# LDFLAGS Flags to the loader -# LIBS List of libraries -# CMD Name of the executable -# PROFLIB Library needed for profiling -# -FC = ifort - -LDFLAGS = -# IOMP5_LIB=/usrx/local/prod/intel/2018UP01/lib/intel64/libiomp5.a - -LIBS = -qopenmp -Xlinker --start-group ${W3NCO_LIBd} ${W3NCO_LIBd} ${IP_LIBd} ${SP_LIBd} ${BACIO_LIB4} ${IOMP5_LIB} - -CMD = mkgfsawps -PROFLIB = -lprof - -# To perform the default compilation, use the first line -# To compile with flowtracing turned on, use the second line -# To compile giving profile additonal information, use the third line -# WARNING: SIMULTANEOUSLY PROFILING AND FLOWTRACING IS NOT RECOMMENDED -FFLAGS = -O3 -g -convert big_endian -r8 -i4 -assume noold_ldout_format - -# Lines from here on down should not need to be changed. They are the -# actual rules which make uses to build a.out. 
-# -all: $(CMD) - -$(CMD): $(OBJS) - $(FC) -o $(LDFLAGS) $(@) $(OBJS) $(LIBS) - rm -f $(OBJS) - -# Make the profiled version of the command and call it a.out.prof -# -$(CMD).prof: $(OBJS) - $(FC) -o $(LDFLAGS) $(@) $(OBJS) $(LIBS) - -rm -f $(OBJS) - -clean: - -rm -f $(OBJS) - -clobber: clean - -rm -f $(CMD) $(CMD).prof - -void: clobber - -rm -f $(SRCS) makefile diff --git a/util/sorc/mkgfsawps.fd/mkgfsawps.f b/util/sorc/mkgfsawps.fd/mkgfsawps.f deleted file mode 100755 index 4e4e57db3ca..00000000000 --- a/util/sorc/mkgfsawps.fd/mkgfsawps.f +++ /dev/null @@ -1,511 +0,0 @@ - PROGRAM MKGFSAWPS -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C . . . . -C MAIN PROGRAM: MKGFSAWPS -C PRGMMR: VUONG ORG: NP11 DATE: 2004-04-21 -C -C ABSTRACT: PROGRAM READS GRIB FILE FROM SPECTRAL MODEL WITH 0.5 DEGREE -C (GRID 4) OR 1 DEGREE (GRID 3) OR 2.5 DEGREE (GRID 2) RECORDS. -C UNPACKS THEM, AND CAN MAKE AWIPS GRIB GRIDS 201,202, 203, -C 204, 211, 213 and 225. THEN, ADD A TOC FLAG FIELD SEPARATOR -C BLOCK AND WMO HEADER IN FRONT OF EACH GRIB FIELD, AND WRITES -C THEM OUT TO A NEW FILE. THE OUTPUT FILE IS IN THE FORMAT -C REQUIRED FOR TOC'S FTP INPUT SERVICE, WHICH CAN BE USED TO -C DISSEMINATE THE GRIB BULLETINS. -C -C PROGRAM HISTORY LOG: -C 2004-04-21 VUONG -C 2010-05-27 VUONG INCREASED SIZE OF ARRAYS -C -C USAGE: -C INPUT FILES: -C 5 - STANDARD FORTRAN INPUT FILE. -C 11 - GRIB FILE FROM SPECTRAL MODEL WITH GRID 2 OR 3. -C 31 - CRAY GRIB INDEX FILE FOR FILE 11 -C PARM - PASS IN 4 CHARACTERS 'KWBX' WITH PARM FIELD -C -C OUTPUT FILES: (INCLUDING SCRATCH FILES) -C 6 - STANDARD FORTRAN PRINT FILE -C 51 - AWIPS GRIB GRID TYPE 201,202,203,211,213 and 225 RECORDS -C MADE FROM GRIB GRID 2, 3 OR 4 RECORDS. 
-C -C SUBPROGRAMS CALLED: (LIST ALL CALLED FROM ANYWHERE IN CODES) -C UNIQUE: - MAKWMO -C LIBRARY: -C W3LIB - W3AS00 IW3PDS W3FP11 W3UTCDAT -C W3FI63 W3FI72 W3FI83 W3TAGB GETGB GETGBP -C BACIO - BAREAD BAOPENR BAOPENW BACLOSE -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C 10 - ERROR OPENING INPUT GRIB DATA FILE -C 18 - ERROR READING CONTROL CARD FILE -C 19 - ERROR READING CONTROL CARD FILE -C 20 - ERROR OPENING OUTPUT GRIB FILE -C 30 - BULLETINS ARE MISSING -C -C REMARKS: LIST CAVEATS, OTHER HELPFUL HINTS OR INFORMATION -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C - PARAMETER (MXSIZE=2000000,MXSIZ3=MXSIZE*3) - PARAMETER (LUGI=31,LUGB=11,LUGO=51) - PARAMETER (LENHEAD=21) -C - REAL FLDI(MXSIZE) - REAL FLDV(MXSIZE) - REAL FLDO(MXSIZE),FLDVO(MXSIZE) - REAL RLAT(MXSIZE),RLON(MXSIZE) - REAL CROT(MXSIZE),SROT(MXSIZE) -C - INTEGER D(20) - INTEGER IFLD(MXSIZE) - INTEGER IBDSFL(12) - INTEGER IBMAP(MXSIZE) - INTEGER IDAWIP(200) - INTEGER JGDS(100) - INTEGER MPDS(25) - INTEGER,DIMENSION(8):: ITIME=(/0,0,0,-500,0,0,0,0/) - INTEGER KGDS(200),KGDSO(200) - INTEGER KPDS(25) - INTEGER MAPNUM(20) - INTEGER NBITS(20) - INTEGER NPARM - INTEGER NBUL - INTEGER PUNUM - INTEGER IPOPT(20) - INTEGER,DIMENSION(28):: HEXPDS -C - CHARACTER * 6 BULHED(20) - CHARACTER * 100 CPARM - CHARACTER * 17 DESC - CHARACTER * 3 EOML - CHARACTER * 1 GRIB(MXSIZ3) - CHARACTER * 1 KBUF(MXSIZ3) - CHARACTER * 4 KWBX - CHARACTER * 2 NGBFLG - CHARACTER * 1 PDS(28),GDS(400) - CHARACTER * 1 PDSL(28) - CHARACTER * 1 PDSAWIP(28) - CHARACTER * 132 TITLE - CHARACTER * 1 WMOHDR(21) - CHARACTER * 1 WFLAG - CHARACTER * 6 ENVVAR - CHARACTER * 80 FIlEB,FILEI,FILEO - CHARACTER * 1 CSEP(80) -C - LOGICAL IW3PDS - LOGICAL*1 KBMS(MXSIZE),KBMSO(MXSIZE) -C - SAVE -C - DATA IBDSFL/ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0/ - DATA IP/0/,IPOPT/1,19*0/ - DATA HEXPDS /28*0/ - DATA KM/1/ -C - CALL W3TAGB('MKGFSAWIPS',2004,0112,0112,'NP11') -C -C READ GRIB DATA AND INDEX FILE NAMES FROM THE FORT -C ENVIRONMENT VARIABLES, AND OPEN THE FILES. 
-C
-      ENVVAR='FORT  '
-      WRITE(ENVVAR(5:6),FMT='(I2)') LUGB
-      CALL GETENV(ENVVAR,FILEB)
-      WRITE(ENVVAR(5:6),FMT='(I2)') LUGI
-      CALL GETENV(ENVVAR,FILEI)
-
-      CALL BAOPENR(LUGB,FILEB,IRET1)
-      IF ( IRET1 .NE. 0 ) THEN
-        WRITE(6,FMT='(" ERROR OPENING GRIB FILE: ",A80)') FILEB
-        WRITE(6,FMT='(" BAOPENR ERROR = ",I5)') IRET1
-        STOP 10
-      ENDIF
-
-      CALL BAOPENR(LUGI,FILEI,IRET2)
-      IF ( IRET2 .NE. 0 ) THEN
-        WRITE(6,FMT='(" ERROR OPENING GRIB INDEX FILE: ",A80)') FILEI
-        WRITE(6,FMT='(" BAOPENR ERROR = ",I5)') IRET2
-        STOP 10
-      ENDIF
-C
-C     READ OUTPUT GRIB BULLETIN FILE NAME FROM FORT
-C     ENVIRONMENT VARIABLE, AND OPEN FILE.
-C
-      ENVVAR='FORT  '
-      WRITE(ENVVAR(5:6),FMT='(I2)') LUGO
-      CALL GETENV(ENVVAR,FILEO)
-      CALL BAOPENW(LUGO,FILEO,IRET3)
-      IF ( IRET3 .NE. 0 ) THEN
-        WRITE(6,FMT='(" ERROR OPENING OUTPUT GRIB FILE: ",A80)') FILEO
-        WRITE(6,FMT='(" BAOPENW ERROR = ",I5)') IRET3
-        STOP 20
-      ENDIF
-C
-C     GET PARM FIELD WITH UP TO 100 CHARACTERS
-C
-      CPARM = ' '
-      KWBX  = 'KWBC'
-      CALL W3AS00(NPARM,CPARM,IER)
-      IF (IER.EQ.0) THEN
-        IF (NPARM.EQ.0.OR.CPARM(1:4).EQ.'    ') THEN
-          PRINT *,'THERE IS A PARM FIELD BUT IT IS EMPTY'
-          PRINT *,'OR BLANK, I WILL USE THE DEFAULT KWBC'
-        ELSE
-          KWBX(1:4) = CPARM(1:4)
-        END IF
-      ELSE IF (IER.EQ.2.OR.IER.EQ.3) THEN
-        PRINT *,'W3AS00 ERROR = ',IER
-        PRINT *,'THERE IS NO PARM FIELD, I USED DEFAULT KWBC'
-      ELSE
-        PRINT *,'W3AS00 ERROR = ',IER
-      END IF
-      PRINT *,'NPARM = ',NPARM
-      PRINT *,'CPARM = ',CPARM(1:4)
-      PRINT *,'KWBX  = ',KWBX(1:4)
-C
-      IRET   = 0
-      IOPT   = 2
-      INSIZE = 19
-      NBUL   = 0
-      NGBSUM = 0
-C
-      CALL W3UTCDAT (ITIME)
-C
-C     LOOP TO READ UNPACKED GRIB DATA
-C     28 BYTE PDS AND 65160 FLOATING POINT NUMBERS
-C
-      NREC = 0
-      DO 699 IREAD = 1,1000
-        READ (*,66,END=800) (HEXPDS(J),J=1,12),
-     &    (HEXPDS(J),J=17,20), PUNUM, NGBFLG, DESC
-  66    FORMAT(3(2X,4Z2),3X,4Z2,6X,I3,1X,A2,1X,A17)
-C
-C       CHARACTERS ON CONTROL CARD NOT 0-9, A-F, OR a-f
-C       ALL RECORDS EXCEPT V-GRD ARE READ INTO ARRAY C
-C
-C       EXIT LOOP, IF NO MORE BULLETINS IN INPUT CARDS
-C
-        PDS=CHAR(HEXPDS)
-        IF (MOVA2I(PDS(1)) .EQ. 255) EXIT
-        NREC = NREC + 1
-        WRITE (6,FMT='(''**************************************'',
-     &    ''************************************************'')')
-        PRINT *,'START NEW RECORD NO. = ',NREC
-        WRITE (6,FMT='('' INPUT PDS, PUNUM, NGBFLG'',
-     &    '' & DESC...DESIRED GRIB MAPS LISTED ON FOLLOWING '',
-     &    ''LINES...'',/,4X,3(2X,4Z2.2),3X,4Z2.2,6X,I3,1X,A2,
-     &    1X,A17)') (HEXPDS(J),J=1,12),
-     &    (HEXPDS(J),J=17,20), PUNUM, NGBFLG, DESC
-C
-C       READ IN GRIDS TO INTERPOLATE TO
-C
-        NGB = 0
-        DO J = 1,20
-          READ (*,END=710,FMT='(4X,I3,2X,I2,2X,A6,1X,I3,24X,A3)')
-     &      MAPNUM(J),NBITS(J), BULHED(J), D(J), EOML
-          WRITE (6,FMT='(4X,I3,2X,I2,2X,A6,1X,I3,24X,A3)')
-     &      MAPNUM(J),NBITS(J), BULHED(J), D(J), EOML
-          NGB = J
-          IF (EOML .EQ. 'EOM') EXIT
-        ENDDO
-C
-        NGBSUM = NGBSUM + NGB
-        JREW    = 0
-        MPDS    = -1
-        JGDS    = -1
-        MPDS(3) = MOVA2I(PDS(7))
-        MPDS(5) = MOVA2I(PDS(9))
-        WFLAG   = ' '
-        IF (MPDS(5).EQ.33) THEN
-          WFLAG = 'U'
-        ELSE IF (MPDS(5).EQ.34) THEN
-          WFLAG = 'V'
-        END IF
-        MPDS(6) = MOVA2I(PDS(10))
-        MPDS(7) = MOVA2I(PDS(11)) * 256 + MOVA2I(PDS(12))
-        IF (MPDS(5).EQ.61.OR.MPDS(5).EQ.62.OR.
-     &      MPDS(5).EQ.63) THEN
-          MPDS(14) = MOVA2I(PDS(19))
-          MPDS(15) = MOVA2I(PDS(20))
-        END IF
-C
-C       PRINT *,'CHECK POINT BEFORE GETGB'
-C       IF YOU GET U-GRD, ALSO READ V-GRD INTO ARRAY FLDV
-C       ALL RECORDS EXCEPT V-GRD ARE READ INTO ARRAY FLDI
-C       IF YOU GET V-GRD, READ INTO ARRAY FLDV, READ U-GRD INTO FLDI
-C
-        IF (WFLAG.EQ.'V') MPDS(5) = 33
-        CALL GETGB(LUGB,LUGI,MXSIZE,JREW,MPDS,JGDS,
-     &    MI,KREW,KPDS,KGDS,KBMS,FLDI,IRET)
-        CALL GETGBP(LUGB,LUGI,MXSIZ3,KREW-1,MPDS,JGDS,
-     &    KBYTES,KREW,KPDS,KGDS,GRIB,IRET)
-        IF (IRET.NE.0) THEN
-          IF (IRET.LT.96) PRINT *,'GETGB-W3FI63: ERROR = ',IRET
-          IF (IRET.EQ.96) PRINT *,'GETGB: ERROR READING INDEX FILE'
-          IF (IRET.EQ.97) PRINT *,'GETGB: ERROR READING GRIB FILE'
-          IF (IRET.EQ.98) THEN
-            PRINT *,'GETGB ERROR: NUM. OF DATA POINTS GREATER THAN JF'
-          END IF
-          IF (IRET.EQ.99) PRINT *,'GETGB ERROR: REQUEST NOT FOUND'
-          IF (IRET.GT.99) PRINT *,'GETGB ERROR = ',IRET
-          GO TO 699
-        END IF
-        PDSL(1:28)=GRIB(9:36)
-        IBI=MOD(KPDS(4)/64,2)
-        IF (WFLAG.EQ.'U') THEN
-          CALL W3FP11 (GRIB,PDSL,TITLE,IER)
-C
-C         COMPARE RECORD (GRIB) TO CONTROL CARD (PDS), THEY SHOULD MATCH
-C
-          KEY = 2
-          IF (.NOT.IW3PDS(PDSL,PDS,KEY)) THEN
-            PRINT 2900, IREAD, (MOVA2I(PDSL(J)),J=1,28),
-     *        (MOVA2I(PDS(J)),J=1,28)
-            GO TO 699
-          END IF
-        END IF
-C
-C       READ V-GRD INTO ARRAY FLDV
-C
-        IF (WFLAG.EQ.'U'.OR.WFLAG.EQ.'V') THEN
-          MPDS(5) = 34
-          CALL GETGB(LUGB,LUGI,MXSIZE,JREW,MPDS,JGDS,
-     &      MI,KREW,KPDS,KGDS,KBMS,FLDV,JRET)
-          CALL GETGBP(LUGB,LUGI,MXSIZ3,KREW-1,MPDS,JGDS,
-     &      KBYTES,KREW,KPDS,KGDS,GRIB,JRET)
-          IF (JRET.NE.0) THEN
-            IF (JRET.LT.96) PRINT *,'GETGB-W3FI63: ERROR = ',JRET
-            IF (JRET.EQ.96) PRINT *,'GETGB: ERROR READING INDEX FILE'
-            IF (JRET.EQ.97) PRINT *,'GETGB: ERROR READING GRIB FILE'
-            IF (JRET.EQ.98) THEN
-              PRINT *,'GETGB ERROR: NUM. OF DATA POINTS GREATER THAN JF'
-            END IF
-            IF (JRET.EQ.99) PRINT *,'GETGB ERROR: REQUEST NOT FOUND'
-            IF (JRET.GT.99) PRINT *,'GETGB ERROR = ',JRET
-            GO TO 699
-          END IF
-          IF (WFLAG.EQ.'V') THEN
-            CALL W3FP11 (GRIB,PDSL,TITLE,IER)
-          END IF
-        END IF
-        PRINT *,'RECORD NO. OF GRIB RECORD IN INPUT FILE = ',KREW
-C
-C       COMPARE RECORD (GRIB) TO CONTROL CARD (PDS), THEY SHOULD MATCH
-C
-        KEY = 2
-        IF (WFLAG.EQ.' '.OR.WFLAG.EQ.'V') THEN
-          PDSL(1:28)=GRIB(9:36)
-          IF (.NOT.IW3PDS(PDSL,PDS,KEY)) THEN
-            PRINT 2900, IREAD, (MOVA2I(PDSL(J)),J=1,28),
-     *        (MOVA2I(PDS(J)),J=1,28)
- 2900       FORMAT ( 1X,I4,' (PDS) IN RECORD DOES NOT MATCH (PDS) IN ',
-     &        'CONTROL CARD ',/,7(1X,4Z2.2), /,7(1X,4Z2.2))
-            GO TO 699
-          END IF
-        END IF
-C
-        PRINT 2, (MOVA2I(PDSL(J)),J=1,28)
-  2     FORMAT (' PDS = ',7(4Z2.2,1X))
-C
-        IF (WFLAG.EQ.' ') THEN
-          CALL W3FP11 (GRIB,PDSL,TITLE,IER)
-        END IF
-        IF (IER.NE.0) PRINT *,'W3FP11 ERROR = ',IER
-        PRINT *,TITLE(1:86)
-C
-C       MASK OUT ZERO PRECIP GRIDPOINTS BEFORE INTERPOLATION
-C
-        IF (MPDS(5).EQ.61.OR.MPDS(5).EQ.62.OR.
-     &      MPDS(5).EQ.63) THEN
-          DO J=1,MI
-            IF ( FLDI(J).EQ.0.0 ) THEN
-              KBMS(J)=.FALSE.
-              IBI=1
-            ENDIF
-          ENDDO
-        END IF
-C
-C       PROCESS EACH GRID
-C
-        DO 690 I = 1,NGB
-
-          CALL MAKGDS(MAPNUM(I),KGDSO,GDS,LENGDS,IRET)
-          IF ( IRET.NE.0) THEN
-            PRINT *,' GRID ',MAPNUM(I),' NOT VALID.'
-            CYCLE
-          ENDIF
-
-          IF (WFLAG.EQ.' ') THEN
-            CALL IPOLATES(IP,IPOPT,KGDS,KGDSO,MI,MXSIZE,KM,IBI,KBMS,FLDI,
-     *        IGPTS,RLAT,RLON,IBO,KBMSO,FLDO,IRET)
-          ELSE
-            CALL IPOLATEV(IP,IPOPT,KGDS,KGDSO,MI,MXSIZE,KM,IBI,KBMS,
-     *        FLDI,FLDV,IGPTS,RLAT,RLON,CROT,SROT,
-     *        IBO,KBMSO,FLDO,FLDVO,IRET)
-          ENDIF
-
-          IF (IRET.NE.0) THEN
-            PRINT *,' INTERPOLATION TO GRID ',MAPNUM(I),' FAILED.'
-            CYCLE
-          ENDIF
-
-          IF (WFLAG.EQ.'V') THEN
-            FLDO=FLDVO
-          ENDIF
-C
-C         CALL W3FI69 TO UNPACK PDS INTO 25 WORD INTEGER ARRAY
-C
-          CALL W3FI69(PDSL,IDAWIP)
-C
-C         CHANGE MODEL NUMBER AND GRID TYPE
-C
-          IDAWIP(5) = MAPNUM(I)
-          IF (WFLAG.EQ.'U') IDAWIP(8) = 33
-          IF (WFLAG.EQ.'V') IDAWIP(8) = 34
-C
-C         ZERO PRECIP GRIDPOINTS WHERE MASK WAS APPLIED BEFORE INTERPOLATION
-C
-          IF (IDAWIP(8).EQ.61.OR.IDAWIP(8).EQ.62.OR.
-     &        IDAWIP(8).EQ.63) THEN
-            IF (IBO.EQ.1) THEN
-              DO J=1,IGPTS
-                IF ( .NOT.KBMSO(J) ) THEN
-                  KBMSO(J)=.TRUE.
-                  FLDO(J)=0.0
-                ENDIF
-              ENDDO
-            END IF
-          END IF
-C
-C         TEST RELATIVE HUMIDITY FOR GT 100.0 AND LT 0.0
-C         IF SO, RESET TO 100.0 AND 0.0
-C
-          IF (IDAWIP(8).EQ.52) THEN
-            DO J = 1,IGPTS
-              IF (FLDO(J).GT.100.0) FLDO(J) = 100.0
-              IF (FLDO(J).LT.0.0)   FLDO(J) = 0.0
-            END DO
-          END IF
-C
-C         SET ALL NEGATIVE ACUM PCP VALUES TO 0
-C
-          IF (IDAWIP(8).EQ.61.OR.IDAWIP(8).EQ.62.OR.
-     &        IDAWIP(8).EQ.63) THEN
-            DO J = 1,IGPTS
-              IF (FLDO(J).LT.0.0) FLDO(J) = 0.0
-            END DO
-          END IF
-C
-C         COPY OUTPUT BITMAP FROM LOGICAL TO INTEGER ARRAY FOR W3FI72
-C
-          IF (IBO.EQ.1) THEN
-            DO J=1,IGPTS
-              IF (KBMSO(J)) THEN
-                IBMAP(J)=1
-              ELSE
-                IBMAP(J)=0
-              ENDIF
-            ENDDO
-          ELSE
-            IBMAP=1
-          ENDIF
-C
-C         IF D VALUE EQUALS ZERO, USE D VALUE IN 1 DEGREE INPUT RECORDS,
-C         ELSE USE THE D VALUE
-C
-          IF (D(I).NE.0) THEN
-            IDAWIP(25) = D(I)
-          END IF
-C
-C         PRINT *,'W3FI69 = ',IDAWIP
-C         PRINT *,'CHECK POINT AFTER W3FI69'
-C
-          IBITL  = NBITS(I)
-          ITYPE  = 0
-          IGRID  = MAPNUM(I)
-          IPFLAG = 0
-          IGFLAG = 0
-          IBFLAG = 0
-          ICOMP  = 0
-          IBLEN  = IGPTS
-          JERR   = 0
-C
-C         GRIB AWIPS GRID 37-44
-C
-C         PRINT *,'CHECK POINT BEFORE W3FI72'
-          CALL W3FI72(ITYPE,FLDO,IFLD,IBITL,
-     &      IPFLAG,IDAWIP,PDSAWIP,
-     &      IGFLAG,IGRID,KGDSO,ICOMP,
-     &      IBFLAG,IBMAP,IBLEN,
-     &      IBDSFL,
-     &      NPTS,KBUF,ITOT,JERR)
-C         PRINT *,'CHECK POINT AFTER W3FI72'
-          IF (JERR.NE.0) PRINT *,' W3FI72 ERROR = ',JERR
-          PRINT *,'NPTS, ITOT = ',NPTS,ITOT
-          PRINT 2, (MOVA2I(PDSAWIP(J)),J=1,28)
-C
-C         PRINT *,'SIZE OF GRIB FIELD = ',ITOT
-C
-C         MAKE FLAG FIELD SEPARATOR BLOCK
-C
-          CALL MKFLDSEP(CSEP,IOPT,INSIZE,ITOT+LENHEAD,LENOUT)
-C
-C         MAKE WMO HEADER
-C
-          CALL MAKWMO (BULHED(I),KPDS(10),KPDS(11),KWBX,WMOHDR)
-C
-C         WRITE OUT SEPARATOR BLOCK, ABBREVIATED WMO HEADING,
-C         AND GRIB FIELD
-C
-          CALL WRYTE(LUGO,LENOUT,CSEP)
-          CALL WRYTE(LUGO,LENHEAD,WMOHDR)
-          CALL WRYTE(LUGO,ITOT,KBUF)
-          NBUL = NBUL + 1
- 690    CONTINUE
-C
- 699  CONTINUE
-C--------------------------------------------------------------
-C
-C     CLOSING SECTION
-C
- 800  CONTINUE
-      IF (NBUL .EQ. 0 .AND. NUMFLD .EQ. 0) THEN
-        WRITE (6,FMT='('' SOMETHING WRONG WITH DATA CARDS...'',
-     &    ''NOTHING WAS PROCESSED'')')
-        CALL W3TAGE('MKGFSAWPS')
-        STOP 19
-      ELSE
-        CALL BACLOSE (LUGB,IRET)
-        CALL BACLOSE (LUGI,IRET)
-        CALL BACLOSE (LUGO,IRET)
-        WRITE (6,FMT='(//,'' ******** RECAP OF THIS EXECUTION '',
-     &    ''********'',/,5X,''READ  '',I6,'' INDIVIDUAL IDS'',
-     &    /,5X,''WROTE '',I6,'' BULLETINS OUT FOR TRANSMISSION'',
-     &    //)') NREC, NBUL
-C
-C       TEST TO SEE IF ANY BULLETINS ARE MISSING
-C
-        MBUL = NGBSUM - NBUL
-        IF (MBUL.NE.0) THEN
-          PRINT *,'BULLETINS MISSING = ',MBUL
-          CALL W3TAGE('MKGFSAWPS')
-          STOP 30
-        END IF
-C
-        CALL W3TAGE('MKGFSAWPS')
-        STOP
-      ENDIF
-C
-C     ERROR MESSAGES
-C
- 710  CONTINUE
-      WRITE (6,FMT='('' ?*?*? CHECK DATA CARDS... READ IN '',
-     &  ''GRIB PDS AND WAS EXPECTING GRIB MAP CARDS TO FOLLOW.'',/,
-     &  '' MAKE SURE NGBFLG = ZZ OR SUPPLY '',
-     &  ''SOME GRIB MAP DEFINITIONS!'')')
-      CALL W3TAGE('MKGFSAWPS')
-      STOP 18
-C
-      END
diff --git a/util/sorc/overgridid.fd/compile_overgridid_wcoss.sh b/util/sorc/overgridid.fd/compile_overgridid_wcoss.sh
deleted file mode 100755
index d7b0e0185c1..00000000000
--- a/util/sorc/overgridid.fd/compile_overgridid_wcoss.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/sh
-
-######################################################################
-#
-# Build executable : GFS utilities
-#
-######################################################################
-
-LMOD_EXACT_MATCH=no
-source ../../../sorc/machine-setup.sh > /dev/null 2>&1
-cwd=$(pwd)
-
-if [ "$target" = "hera" ]; then
-  echo " "
-  echo " You are on $target "
-  echo " "
-else
-  echo " "
-  echo " Your machine $target is not supported"
-  echo " The script $0 can not continue.  Aborting!"
- echo " " - exit -fi -echo " " - -# Load required modules -source ../../modulefiles/gfs_util.${target} -module list - -set -x - -mkdir -p ../../exec -make -mv overgridid ../../exec -make clean diff --git a/util/sorc/overgridid.fd/makefile b/util/sorc/overgridid.fd/makefile deleted file mode 100755 index 7141872bc14..00000000000 --- a/util/sorc/overgridid.fd/makefile +++ /dev/null @@ -1,8 +0,0 @@ -LIBS = ${W3NCO_LIB4} ${BACIO_LIB4} -OBJS= overgridid.o -overgridid: overgridid.f - ifort -o overgridid overgridid.f $(LIBS) -clean: - -rm -f $(OBJS) - - diff --git a/util/sorc/overgridid.fd/overgridid.f b/util/sorc/overgridid.fd/overgridid.f deleted file mode 100755 index 29b0080bf69..00000000000 --- a/util/sorc/overgridid.fd/overgridid.f +++ /dev/null @@ -1,59 +0,0 @@ - program overgridid -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C -C MAIN PROGRAM: OVERGRIDID REPLACE iGRID ID IN A GRIB FILE -C PRGMMR: VUONG ORG: NP23 DATE: 2014-05-21 -C -C ABSTRACT: THIS PROGRAM READS AN ENTIRE GRIB FILE FROM UNIT 11 -C AND WRITES IT BACK OUT TO UNIT 51, REPLACING THE INTERNAL -C GRID ID WITH THE GRID ID READ IN VIA UNIT 5. -C -C PROGRAM HISTORY LOG: -C 1998-01-01 IREDELL -C 1998-06-17 FARLEY MODIFIED OVERDATE ROUTINE -C 1999-05-24 Gilbert - added calls to BAOPEN. -C 2014-05-21 Vuong - Modified to change grid id in a grib file -C -C INPUT FILES: -C UNIT 5 2-DIGIT MODEL ID (in base 10) -C UNIT 11 INPUT GRIB FILE = "fort.11" -C -C OUTPUT FILES: -C UNIT 51 OUTPUT GRIB FILE = "fort.51" -C -C SUBPROGRAMS CALLED: -C SKGB - Find next GRIB field -C BAREAD - Read GRIB field -C WRYTE - Read GRIB field -C -C REMARKS: -C ANY NON-GRIB INFORMATION IN THE INPUT GRIB FILE WILL BE LOST. -C AN OUTPUT LINE WILL BE WRITTEN FOR EACH GRIB MESSAGE COPIED. -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C -C$$$ - parameter(msk1=32000,msk2=4000,mgrib=10000000) - character cgrib(mgrib) -C - read *,id ! 
grid id, ie 03 for 1.0 deg grib - call baopenr(11,"fort.11",iret1) - call baopenw(51,"fort.51",iret2) -C - n=0 - iseek=0 - call skgb(11,iseek,msk1,lskip,lgrib) - dowhile(lgrib.gt.0.and.lgrib.le.mgrib) - call baread(11,lskip,lgrib,ngrib,cgrib) - if(ngrib.ne.lgrib) call exit(2) - n=n+1 - id0=mova2i(cgrib(8+7)) - cgrib(8+7)=char(id) - call wryte(51,lgrib,cgrib) - print '("msg",i6,4x,"len",i8,4x,"was",i4.2,4x,"now",i4.2)', - & n,lgrib,id0,id - iseek=lskip+lgrib - call skgb(11,iseek,msk2,lskip,lgrib) - enddo - end diff --git a/util/sorc/overgridid.fd/sample.script b/util/sorc/overgridid.fd/sample.script deleted file mode 100755 index fdfd9316004..00000000000 --- a/util/sorc/overgridid.fd/sample.script +++ /dev/null @@ -1,13 +0,0 @@ -# THIS SCRIPT READ A FORECAST FILE (UNIT 11), MODIFIES PDS OCTET(8) -# TO CORRECT THE GRIB GRID ID AND RE-WRITES THE FILE TO UNIT 51. - -# STANDARD INPUT IS A 3-DIGIT INTEGER, FOR EXAMPLE 255 (User define grid) - -ln -s master.grbf06 fort.11 - -overgridid << EOF -255 -EOF - -mv fort.51 master.grbf06.new -rm fort.11 diff --git a/util/sorc/rdbfmsua.fd/MAPFILE b/util/sorc/rdbfmsua.fd/MAPFILE deleted file mode 100755 index 19e0decd716..00000000000 --- a/util/sorc/rdbfmsua.fd/MAPFILE +++ /dev/null @@ -1,4045 +0,0 @@ -Archive member included because of file (symbol) - -/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - rdbfmsua.o (fl_clos_) -/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) (fl_flun_) -/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) - rdbfmsua.o (fl_tbop_) -/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) (fl_tdat_) 
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) (fl_tinq_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) (st_ldsp_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) (st_lstr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmbl.o)
-                              rdbfmsua.o (st_rmbl_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmst.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) (st_rmst_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o)
-                              rdbfmsua.o (tb_rstn_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flbksp.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) (fl_bksp_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) (fl_inqr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flpath.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) (fl_path_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) (fl_sopn_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o) (ss_envr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssgsym.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o) (ss_gsym_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlcuc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o) (st_lcuc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stuclc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o) (st_uclc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbastn.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o) (tb_astn_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flglun.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o) (fl_glun_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libbridge.a(dcbsrh.o)
-                              rdbfmsua.o (dc_bsrh_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ireadns.o)
-                              rdbfmsua.o (ireadns_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o)
-                              rdbfmsua.o (openbf_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (posapn_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (posapx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o) (rdmsgw_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (readdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ireadns.o) (readns_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) (readsb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(status.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (status_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o)
-                              rdbfmsua.o (ufbint_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) (ufbrw_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) (upb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (wrdlen_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (writdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wtstat.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (wtstat_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(adn30.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) (adn30_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (bfrini_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) (bort2_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort_exit.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o) (bort_exit_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (bort_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (conwin_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cpbfdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o) (cpbfdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(drstpl.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (drstpl_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (dxinit_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxmini.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) (dxmini_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (getwin_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ibfms.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (ibfms_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ichkstr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) (ichkstr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ifxy.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) (ifxy_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invcon.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o) (invcon_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invwin.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (invwin_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ipkm.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) (ipkm_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(irev.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upb.o) (irev_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupm.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o) (iupm_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lmsg.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o) (lmsg_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrpc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o) (lstrpc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrps.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (lstrps_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) (msgwrt_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(newwin.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (newwin_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nmwrd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lmsg.o) (nmwrd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nxtwin.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) (nxtwin_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ovrbs1.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (ovrbs1_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(padmsg.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (padmsg_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) (pkb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (pkbs1_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) (pkc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pktdd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o) (pktdd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs01.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) (pkvs01_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs1.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) (pkvs1_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o) (rdbfdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) (rdcmps_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) (rdtree_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o) (rdusdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) (readmg_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) (seqsdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (stndrd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) (string_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) (strnum_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strsuc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o) (strsuc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) (trybump_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upbb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o) (upbb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o) (upc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(usrtpl.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o) (usrtpl_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(capit.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (capit_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrna.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ichkstr.o) (chrtrna_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrn.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (chrtrn_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o) (cktaba_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cnved4.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (cnved4_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(digit.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (digit_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) (elemdx_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getlens.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (getlens_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(gets1loc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o) (gets1loc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(i4dy.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) (i4dy_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(idn30.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (idn30_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(igetdate.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) (igetdate_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(istdesc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o) (istdesc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (iupb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupbs01.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) (iupbs01_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstchr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o) (jstchr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstnum.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o) (jstnum_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstjpb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o) (lstjpb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (makestab_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(mvb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o) (mvb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemock.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) (nemock_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtab.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) (nemtab_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbax.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) (nemtbax_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenuaa.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (nenuaa_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenubd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) (nenubd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numbck.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) (numbck_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtab.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) (numtab_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbt.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) (openbt_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parstr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) (parstr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) (parusr_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o) (parutg_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o) (rcstpl_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o) (rdmsgb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o) (restd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rsvfvm.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) (rsvfvm_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strcln.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) (strcln_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) (tabsub_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(uptdd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) (uptdd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdesc.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) (wrdesc_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cadn30.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) (cadn30_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chekstab.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) (chekstab_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(inctab.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) (inctab_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) (nemtbb_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) (nemtbd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtbd.o)
-                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) (numtbd_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabent.o)
                              /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) (tabent_)
-/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o) (valx_) -/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rjust.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o) (rjust_) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o (for_rtl_finish_) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (for_check_env_name) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) - rdbfmsua.o (for_open) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (for__preconnected_units_create) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o) - rdbfmsua.o (for_set_reentrancy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (for_since_epoch_t) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) - rdbfmsua.o (for_stop_core) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_vm.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (for__set_signal_ops_during_vm) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) - rdbfmsua.o (for_write_int_fmt) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o) - rdbfmsua.o (for_write_seq_fmt) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) 
- rdbfmsua.o (for_write_seq_lis) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o) (for__aio_lub_table) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) (for__reopen_file) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio_wrap.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) (for__aio_pthread_self) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_int.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) (cvt_text_to_integer) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_f.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_vax_f_to_ieee_single) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_d.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_vax_d_to_ieee_double) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_g.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_vax_g_to_ieee_double) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cray.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_cray_to_ieee_double) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_short.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_ibm_short_to_ieee_single) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_long.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_ibm_long_to_ieee_double) 
-/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_ieee_double_to_cray) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_single.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (cvt_ieee_single_to_ibm_short) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) (for__close_default) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) (for__close_proc) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_default_io_sizes_env_init.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (for__default_io_sizes_env_init) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) (for__desc_ret_item) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) (for__io_return) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o) (for_exit) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit_handler.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (for__exit_handler) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) (for__format_compiler) 
-/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) (for__format_value) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) (for__get_s) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) (for__interp_fmt) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_ldir_wfs.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) (for__wfs_table) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) (for__acquire_lun) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) (for__add_to_lf_table) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) (for__put_su) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) (for__finish_ufseq_write) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) (tbk_stack_trace) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt__globals.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o) (vax_c) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o) - 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_integer_to_text) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_data_to_text) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_log_to_text.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_boolean_to_text_ex) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_data.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_text_to_data) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_log.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_text_to_boolean) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_ieee_t_to_text_ex) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_ieee_s_to_text_ex) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) (cvt_ieee_x_to_text_ex) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) (cvtas_a_to_s) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) (cvtas_a_to_t) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) (cvtas_s_to_a) 
-/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) (cvtas_t_to_a) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_s.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) (cvtas__nan_s) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_t.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) (cvtas__nan_t) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) (cvtas_a_to_x) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) (cvtas_x_to_a) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_x.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) (cvtas__nan_x) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_globals.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) (cvtas_pten_word) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_53.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) (cvtas_pten_t) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_64.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) (cvtas_pten_64) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_128.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) (cvtas_pten_128) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(fetestexcept.o) - 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) (fetestexcept) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o) (lroundf) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_stub.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o) (lround) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_stub.o) (lround.L) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_ct.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o) (lroundf.L) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_gen.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o) (lroundf.A) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_gen.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_stub.o) (lround.A) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o) (__libm_error_support) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrf.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) (matherrf) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrl.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) (matherrl) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherr.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) (matherr) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ints2q.o) - 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) (__jtoq) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(qcomp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) (__neq) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fp2q.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) (__dtoq) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(q2fp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) (__qtof) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) (tbk_string_stack_signal_impl) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) (tbk_getPC) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(cpu_feature_disp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o) (__intel_cpu_features_init_x) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (_intel_fast_memcpy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) (_intel_fast_memmove) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemset.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) (_intel_fast_memset) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o (__intel_new_feature_proc_init) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_addsubq.o) - 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) (__addq) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_divq.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) (__divq) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcpy.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) (__intel_sse2_strcpy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncpy.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) (__intel_sse2_strncpy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strlen.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) (__intel_sse2_strlen) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strchr.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) (__intel_sse2_strchr) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncmp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) (__intel_sse2_strncmp) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcat.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) (__intel_sse2_strcat) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncat.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) (__intel_sse2_strncat) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memcpy_pp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) (__intel_new_memcpy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memset_pp.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemset.o) (__intel_new_memset) 
-/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) (__intel_ssse3_memcpy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) (__intel_ssse3_rep_memcpy) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o) (__intel_ssse3_memmove) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o) (__intel_ssse3_rep_memmove) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) (__libirc_get_msg) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_mem_ops.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memset_pp.o) (__libirc_largest_cache_size) -/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(proc_init_utils.o) - /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o) (__intel_proc_init_ftzdazule) -/usr/lib64/libc_nonshared.a(elf-init.oS) - /usr/lib/../lib64/crt1.o (__libc_csu_fini) -/opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o) - /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o) (__powidf2) - -Allocating common symbols -Common symbol size file - -utgprm_ 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o) -maxcmp_ 0x18 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -msgstd_ 0x1 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -thread_count_mutex 0x28 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) -reptab_ 0x64 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -stbfr_ 0x100 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) -hrdwrd_ 0x2c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o) -bitbuf_ 0x192dd8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) -usrbit_ 0x27100 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o) -stcach_ 0x4844c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) -bufrmg_ 0xc354 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -msgcmp_ 0x1 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -nulbfr_ 0x80 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) -threads_in_flight_mutex - 0x28 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) -usrint_ 0x753080 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) -acmode_ 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -s01cm_ 0x7c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) -gmbdta_ 0x1c4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) -for__pthread_mutex_unlock_ptr - 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) -for__a_argv 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) 
-for__pthread_mutex_init_ptr - 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) -charac_ 0x804 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o) -stords_ 0x1f40 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) -bufrsr_ 0xc3f8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -tabccc_ 0x10 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) -unptyp_ 0x80 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) -msgfmt_ 0x80 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) -dateln_ 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -sect01_ 0x7c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) -message_catalog 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) -tables_ 0x13d628 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) -mrgcom_ 0x10 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -padesc_ 0x14 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) -usrtmp_ 0x16e3600 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o) -dxtab_ 0x300 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) -for__pthread_mutex_lock_ptr - 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) -tababd_ 0xbbe58c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) -usrstr_ 0xd0 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) -msgcwd_ 0x280 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) -for__l_argc 0x4 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) -quiet_ 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) -for__aio_lub_table 0x400 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - -Discarded input sections - - .note.GNU-stack - 0x0000000000000000 0x0 /usr/lib/../lib64/crt1.o - .note.GNU-stack - 0x0000000000000000 0x0 /usr/lib/../lib64/crti.o - .note.GNU-stack - 0x0000000000000000 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - .note.GNU-stack - 0x0000000000000000 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o - .note.GNU-stack - 0x0000000000000000 0x0 rdbfmsua.o - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o) - .note.GNU-stack - 0x0000000000000000 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o) - .note.GNU-stack - 0x0000000000000000 0x0 
[GNU ld linker-map listing removed by this patch: object-file and `.note.GNU-stack` section entries for the GEMPAK v6.32.0 `rdbfmsua` build (libgemlib.a, libbridge.a, libncepBUFR.a, Intel composer_xe_2015.3.187 libifcore/libimf/libirc, GCC 4.9.2 runtime), followed by the memory configuration, LOAD directives, and PLT symbol table. Full listing omitted.]
fprintf@@GLIBC_2.2.5 - 0x00000000004037f0 localtime@@GLIBC_2.2.5 - 0x0000000000403800 write@@GLIBC_2.2.5 - 0x0000000000403810 _gfortran_pow_i4_i4@@GFORTRAN_1.0 - 0x0000000000403820 fcntl@@GLIBC_2.2.5 - *(.iplt) - .iplt 0x0000000000000000 0x0 /usr/lib/../lib64/crt1.o - -.text 0x0000000000403830 0xb2738 - *(.text.unlikely .text.*_unlikely) - .text.unlikely - 0x0000000000403830 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - .text.unlikely - 0x0000000000403830 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o) - .text.unlikely - 0x0000000000403830 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - *(.text.exit .text.exit.*) - *(.text.startup .text.startup.*) - *(.text.hot .text.hot.*) - *(.text .stub .text.* .gnu.linkonce.t.*) - .text 0x0000000000403830 0x2c /usr/lib/../lib64/crt1.o - 0x0000000000403830 _start - .text 0x000000000040385c 0x17 /usr/lib/../lib64/crti.o - *fill* 0x0000000000403873 0xd - .text 0x0000000000403880 0x116 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - *fill* 0x0000000000403996 0xa - .text 0x00000000004039a0 0x40 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o - 0x00000000004039a0 main - .text 0x00000000004039e0 0x1920 rdbfmsua.o - 0x00000000004039e0 MAIN__ - .text 0x0000000000405300 0x82 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - 0x0000000000405300 fl_clos_ - *fill* 0x0000000000405382 0x2 - .text 0x0000000000405384 0x4d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - 0x0000000000405384 fl_flun_ - *fill* 0x00000000004053d1 0x3 - .text 0x00000000004053d4 0x13e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) - 0x00000000004053d4 fl_tbop_ - *fill* 0x0000000000405512 0x2 - .text 0x0000000000405514 0x155 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - 
0x0000000000405514 fl_tdat_ - *fill* 0x0000000000405669 0x3 - .text 0x000000000040566c 0xd4a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) - 0x000000000040566c fl_tinq_ - *fill* 0x00000000004063b6 0x2 - .text 0x00000000004063b8 0x332 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o) - 0x00000000004063b8 st_ldsp_ - *fill* 0x00000000004066ea 0x2 - .text 0x00000000004066ec 0x9b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o) - 0x00000000004066ec st_lstr_ - *fill* 0x0000000000406787 0x1 - .text 0x0000000000406788 0x28e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmbl.o) - 0x0000000000406788 st_rmbl_ - *fill* 0x0000000000406a16 0x2 - .text 0x0000000000406a18 0x576 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmst.o) - 0x0000000000406a18 st_rmst_ - *fill* 0x0000000000406f8e 0x2 - .text 0x0000000000406f90 0x560 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o) - 0x0000000000406f90 tb_rstn_ - .text 0x00000000004074f0 0x6a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flbksp.o) - 0x00000000004074f0 fl_bksp_ - *fill* 0x000000000040755a 0x2 - .text 0x000000000040755c 0x5c5 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o) - 0x000000000040755c fl_inqr_ - *fill* 0x0000000000407b21 0x3 - .text 0x0000000000407b24 0x31c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flpath.o) - 0x0000000000407b24 fl_path_ - .text 0x0000000000407e40 0x230 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o) - 0x0000000000407e40 fl_sopn_ - .text 0x0000000000408070 0x78f 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o) - 0x0000000000408070 ss_envr_ - *fill* 0x00000000004087ff 0x1 - .text 0x0000000000408800 0x13b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssgsym.o) - 0x0000000000408800 ss_gsym_ - *fill* 0x000000000040893b 0x1 - .text 0x000000000040893c 0x146 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlcuc.o) - 0x000000000040893c st_lcuc_ - *fill* 0x0000000000408a82 0x2 - .text 0x0000000000408a84 0x146 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stuclc.o) - 0x0000000000408a84 st_uclc_ - *fill* 0x0000000000408bca 0x2 - .text 0x0000000000408bcc 0xb03 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbastn.o) - 0x0000000000408bcc tb_astn_ - *fill* 0x00000000004096cf 0x1 - .text 0x00000000004096d0 0x89 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flglun.o) - 0x00000000004096d0 fl_glun_ - *fill* 0x0000000000409759 0x3 - .text 0x000000000040975c 0x181 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libbridge.a(dcbsrh.o) - 0x000000000040975c dc_bsrh_ - *fill* 0x00000000004098dd 0x3 - .text 0x00000000004098e0 0x45 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ireadns.o) - 0x00000000004098e0 ireadns_ - *fill* 0x0000000000409925 0x3 - .text 0x0000000000409928 0xe23 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) - 0x0000000000409928 openbf_ - *fill* 0x000000000040a74b 0x1 - .text 0x000000000040a74c 0x10c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o) - 0x000000000040a74c posapn_ - .text 0x000000000040a858 0x135 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapx.o) - 0x000000000040a858 
posapx_ - *fill* 0x000000000040a98d 0x3 - .text 0x000000000040a990 0x1ff /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o) - 0x000000000040a990 rdmsgw_ - *fill* 0x000000000040ab8f 0x1 - .text 0x000000000040ab90 0x3e9 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o) - 0x000000000040ab90 readdx_ - *fill* 0x000000000040af79 0x3 - .text 0x000000000040af7c 0x12a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) - 0x000000000040af7c readns_ - *fill* 0x000000000040b0a6 0x2 - .text 0x000000000040b0a8 0x390 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) - 0x000000000040b0a8 readsb_ - .text 0x000000000040b438 0x223 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(status.o) - 0x000000000040b438 status_ - *fill* 0x000000000040b65b 0x1 - .text 0x000000000040b65c 0xfd2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) - 0x000000000040b65c ufbint_ - *fill* 0x000000000040c62e 0x2 - .text 0x000000000040c630 0xab2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) - 0x000000000040c630 ufbrw_ - *fill* 0x000000000040d0e2 0x2 - .text 0x000000000040d0e4 0x1b4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upb.o) - 0x000000000040d0e4 upb_ - .text 0x000000000040d298 0x1276 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o) - 0x000000000040d298 wrdlen_ - *fill* 0x000000000040e50e 0x2 - .text 0x000000000040e510 0xa79 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) - 0x000000000040e510 writdx_ - *fill* 0x000000000040ef89 0x3 - .text 0x000000000040ef8c 0x4ff 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wtstat.o) - 0x000000000040ef8c wtstat_ - *fill* 0x000000000040f48b 0x1 - .text 0x000000000040f48c 0x484 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(adn30.o) - 0x000000000040f48c adn30_ - .text 0x000000000040f910 0x638 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) - 0x000000000040f910 bfrini_ - .text 0x000000000040ff48 0x26d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o) - 0x000000000040ff48 bort2_ - *fill* 0x00000000004101b5 0x3 - .text 0x00000000004101b8 0xe /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort_exit.o) - 0x00000000004101b8 bort_exit_ - *fill* 0x00000000004101c6 0x2 - .text 0x00000000004101c8 0x1e8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort.o) - 0x00000000004101c8 bort_ - .text 0x00000000004103b0 0x1f6 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o) - 0x00000000004103b0 conwin_ - *fill* 0x00000000004105a6 0x2 - .text 0x00000000004105a8 0x48b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cpbfdx.o) - 0x00000000004105a8 cpbfdx_ - *fill* 0x0000000000410a33 0x1 - .text 0x0000000000410a34 0x159 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(drstpl.o) - 0x0000000000410a34 drstpl_ - *fill* 0x0000000000410b8d 0x3 - .text 0x0000000000410b90 0xa44 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o) - 0x0000000000410b90 dxinit_ - .text 0x00000000004115d4 0x83e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxmini.o) - 0x00000000004115d4 dxmini_ - *fill* 0x0000000000411e12 0x2 - .text 0x0000000000411e14 0x318 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o) - 0x0000000000411e14 getwin_ - .text 0x000000000041212c 0x67 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ibfms.o) - 0x000000000041212c ibfms_ - *fill* 0x0000000000412193 0x1 - .text 0x0000000000412194 0xec /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ichkstr.o) - 0x0000000000412194 ichkstr_ - .text 0x0000000000412280 0x107 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ifxy.o) - 0x0000000000412280 ifxy_ - *fill* 0x0000000000412387 0x1 - .text 0x0000000000412388 0x509 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invcon.o) - 0x0000000000412388 invcon_ - *fill* 0x0000000000412891 0x3 - .text 0x0000000000412894 0x279 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invwin.o) - 0x0000000000412894 invwin_ - *fill* 0x0000000000412b0d 0x3 - .text 0x0000000000412b10 0x1cf /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ipkm.o) - 0x0000000000412b10 ipkm_ - *fill* 0x0000000000412cdf 0x1 - .text 0x0000000000412ce0 0x6e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(irev.o) - 0x0000000000412ce0 irev_ - *fill* 0x0000000000412d4e 0x2 - .text 0x0000000000412d50 0x196 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupm.o) - 0x0000000000412d50 iupm_ - *fill* 0x0000000000412ee6 0x2 - .text 0x0000000000412ee8 0x40 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lmsg.o) - 0x0000000000412ee8 lmsg_ - .text 0x0000000000412f28 0x362 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrpc.o) - 0x0000000000412f28 lstrpc_ - *fill* 0x000000000041328a 0x2 - .text 0x000000000041328c 0x362 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrps.o) - 0x000000000041328c lstrps_ - *fill* 0x00000000004135ee 0x2 - .text 0x00000000004135f0 0x995 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) - 0x00000000004135f0 msgwrt_ - *fill* 0x0000000000413f85 0x3 - .text 0x0000000000413f88 0x226 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(newwin.o) - 0x0000000000413f88 newwin_ - *fill* 0x00000000004141ae 0x2 - .text 0x00000000004141b0 0x63 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nmwrd.o) - 0x00000000004141b0 nmwrd_ - *fill* 0x0000000000414213 0x1 - .text 0x0000000000414214 0x2ae /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nxtwin.o) - 0x0000000000414214 nxtwin_ - *fill* 0x00000000004144c2 0x2 - .text 0x00000000004144c4 0x331 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ovrbs1.o) - 0x00000000004144c4 ovrbs1_ - *fill* 0x00000000004147f5 0x3 - .text 0x00000000004147f8 0xd6 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(padmsg.o) - 0x00000000004147f8 padmsg_ - *fill* 0x00000000004148ce 0x2 - .text 0x00000000004148d0 0x349 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkb.o) - 0x00000000004148d0 pkb_ - *fill* 0x0000000000414c19 0x3 - .text 0x0000000000414c1c 0x44f /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o) - 0x0000000000414c1c pkbs1_ - *fill* 0x000000000041506b 0x1 - .text 0x000000000041506c 0x3d9 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkc.o) - 0x000000000041506c pkc_ - *fill* 0x0000000000415445 0x3 - .text 0x0000000000415448 0x4d3 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pktdd.o) 
- 0x0000000000415448 pktdd_ - *fill* 0x000000000041591b 0x1 - .text 0x000000000041591c 0x2a8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs01.o) - 0x000000000041591c pkvs01_ - .text 0x0000000000415bc4 0x39d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs1.o) - 0x0000000000415bc4 pkvs1_ - *fill* 0x0000000000415f61 0x3 - .text 0x0000000000415f64 0x14fe /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) - 0x0000000000415f64 rdbfdx_ - *fill* 0x0000000000417462 0x2 - .text 0x0000000000417464 0x519 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o) - 0x0000000000417464 rdcmps_ - *fill* 0x000000000041797d 0x3 - .text 0x0000000000417980 0x40d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o) - 0x0000000000417980 rdtree_ - *fill* 0x0000000000417d8d 0x3 - .text 0x0000000000417d90 0x1989 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) - 0x0000000000417d90 rdusdx_ - *fill* 0x0000000000419719 0x3 - .text 0x000000000041971c 0x27f /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o) - 0x000000000041971c readmg_ - *fill* 0x000000000041999b 0x1 - .text 0x000000000041999c 0x1da9 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) - 0x000000000041999c seqsdx_ - *fill* 0x000000000041b745 0x3 - .text 0x000000000041b748 0x984 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o) - 0x000000000041b748 stndrd_ - .text 0x000000000041c0cc 0x7af /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) - 0x000000000041c0cc string_ - *fill* 0x000000000041c87b 0x1 - .text 0x000000000041c87c 0x36b 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o) - 0x000000000041c87c strnum_ - *fill* 0x000000000041cbe7 0x1 - .text 0x000000000041cbe8 0x39b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strsuc.o) - 0x000000000041cbe8 strsuc_ - *fill* 0x000000000041cf83 0x1 - .text 0x000000000041cf84 0x2ef /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o) - 0x000000000041cf84 trybump_ - *fill* 0x000000000041d273 0x1 - .text 0x000000000041d274 0x1a0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upbb.o) - 0x000000000041d274 upbb_ - .text 0x000000000041d414 0x11c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upc.o) - 0x000000000041d414 upc_ - .text 0x000000000041d530 0x1abd /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(usrtpl.o) - 0x000000000041d530 usrtpl_ - *fill* 0x000000000041efed 0x3 - .text 0x000000000041eff0 0xa6 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(capit.o) - 0x000000000041eff0 capit_ - *fill* 0x000000000041f096 0x2 - .text 0x000000000041f098 0x110 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrna.o) - 0x000000000041f098 chrtrna_ - .text 0x000000000041f1a8 0x99 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrn.o) - 0x000000000041f1a8 chrtrn_ - *fill* 0x000000000041f241 0x3 - .text 0x000000000041f244 0xfe6 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) - 0x000000000041f244 cktaba_ - *fill* 0x000000000042022a 0x2 - .text 0x000000000042022c 0x69d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cnved4.o) - 0x000000000042022c cnved4_ - *fill* 0x00000000004208c9 0x3 - .text 
0x00000000004208cc 0x6f /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(digit.o) - 0x00000000004208cc digit_ - *fill* 0x000000000042093b 0x1 - .text 0x000000000042093c 0xadb /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o) - 0x000000000042093c elemdx_ - *fill* 0x0000000000421417 0x1 - .text 0x0000000000421418 0x1e4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getlens.o) - 0x0000000000421418 getlens_ - .text 0x00000000004215fc 0x57a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(gets1loc.o) - 0x00000000004215fc gets1loc_ - *fill* 0x0000000000421b76 0x2 - .text 0x0000000000421b78 0x6b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(i4dy.o) - 0x0000000000421b78 i4dy_ - *fill* 0x0000000000421be3 0x1 - .text 0x0000000000421be4 0x3f3 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(idn30.o) - 0x0000000000421be4 idn30_ - *fill* 0x0000000000421fd7 0x1 - .text 0x0000000000421fd8 0x110 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(igetdate.o) - 0x0000000000421fd8 igetdate_ - .text 0x00000000004220e8 0x14a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(istdesc.o) - 0x00000000004220e8 istdesc_ - *fill* 0x0000000000422232 0x2 - .text 0x0000000000422234 0x4b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupb.o) - 0x0000000000422234 iupb_ - *fill* 0x000000000042227f 0x1 - .text 0x0000000000422280 0x26e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupbs01.o) - 0x0000000000422280 iupbs01_ - *fill* 0x00000000004224ee 0x2 - .text 0x00000000004224f0 0xff /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstchr.o) - 0x00000000004224f0 
jstchr_ - *fill* 0x00000000004225ef 0x1 - .text 0x00000000004225f0 0x51d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstnum.o) - 0x00000000004225f0 jstnum_ - *fill* 0x0000000000422b0d 0x3 - .text 0x0000000000422b10 0x38b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstjpb.o) - 0x0000000000422b10 lstjpb_ - *fill* 0x0000000000422e9b 0x1 - .text 0x0000000000422e9c 0x14fe /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) - 0x0000000000422e9c makestab_ - *fill* 0x000000000042439a 0x2 - .text 0x000000000042439c 0x20c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(mvb.o) - 0x000000000042439c mvb_ - .text 0x00000000004245a8 0xf2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemock.o) - 0x00000000004245a8 nemock_ - *fill* 0x000000000042469a 0x2 - .text 0x000000000042469c 0x496 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtab.o) - 0x000000000042469c nemtab_ - *fill* 0x0000000000424b32 0x2 - .text 0x0000000000424b34 0x396 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbax.o) - 0x0000000000424b34 nemtbax_ - *fill* 0x0000000000424eca 0x2 - .text 0x0000000000424ecc 0x28a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenuaa.o) - 0x0000000000424ecc nenuaa_ - *fill* 0x0000000000425156 0x2 - .text 0x0000000000425158 0x4d2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenubd.o) - 0x0000000000425158 nenubd_ - *fill* 0x000000000042562a 0x2 - .text 0x000000000042562c 0x17b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numbck.o) - 0x000000000042562c numbck_ - *fill* 0x00000000004257a7 0x1 - .text 0x00000000004257a8 0x637 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtab.o) - 0x00000000004257a8 numtab_ - *fill* 0x0000000000425ddf 0x1 - .text 0x0000000000425de0 0x217 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbt.o) - 0x0000000000425de0 openbt_ - *fill* 0x0000000000425ff7 0x1 - .text 0x0000000000425ff8 0x739 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parstr.o) - 0x0000000000425ff8 parstr_ - *fill* 0x0000000000426731 0x3 - .text 0x0000000000426734 0x1044 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o) - 0x0000000000426734 parusr_ - .text 0x0000000000427778 0x8d9 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o) - 0x0000000000427778 parutg_ - *fill* 0x0000000000428051 0x3 - .text 0x0000000000428054 0x8cf /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o) - 0x0000000000428054 rcstpl_ - *fill* 0x0000000000428923 0x1 - .text 0x0000000000428924 0x1ed /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgb.o) - 0x0000000000428924 rdmsgb_ - *fill* 0x0000000000428b11 0x3 - .text 0x0000000000428b14 0x4e2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) - 0x0000000000428b14 restd_ - *fill* 0x0000000000428ff6 0x2 - .text 0x0000000000428ff8 0x7c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rsvfvm.o) - 0x0000000000428ff8 rsvfvm_ - .text 0x0000000000429074 0x25 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strcln.o) - 0x0000000000429074 strcln_ - *fill* 0x0000000000429099 0x3 - .text 0x000000000042909c 0xe9b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) - 0x000000000042909c tabsub_ - *fill* 
0x0000000000429f37 0x1 - .text 0x0000000000429f38 0x227 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(uptdd.o) - 0x0000000000429f38 uptdd_ - *fill* 0x000000000042a15f 0x1 - .text 0x000000000042a160 0xa8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdesc.o) - 0x000000000042a160 wrdesc_ - .text 0x000000000042a208 0x9f /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cadn30.o) - 0x000000000042a208 cadn30_ - *fill* 0x000000000042a2a7 0x1 - .text 0x000000000042a2a8 0x368 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chekstab.o) - 0x000000000042a2a8 chekstab_ - .text 0x000000000042a610 0x2dd /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(inctab.o) - 0x000000000042a610 inctab_ - *fill* 0x000000000042a8ed 0x3 - .text 0x000000000042a8f0 0x86c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o) - 0x000000000042a8f0 nemtbb_ - .text 0x000000000042b15c 0xaf0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbd.o) - 0x000000000042b15c nemtbd_ - .text 0x000000000042bc4c 0x314 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtbd.o) - 0x000000000042bc4c numtbd_ - .text 0x000000000042bf60 0x886 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabent.o) - 0x000000000042bf60 tabent_ - *fill* 0x000000000042c7e6 0x2 - .text 0x000000000042c7e8 0x609 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o) - 0x000000000042c7e8 valx_ - *fill* 0x000000000042cdf1 0x3 - .text 0x000000000042cdf4 0xaf /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rjust.o) - 0x000000000042cdf4 rjust_ - *fill* 0x000000000042cea3 0xd - .text 0x000000000042ceb0 
0x2a50 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - 0x000000000042ceb0 for__process_start_time - 0x000000000042ced0 for__signal_handler - 0x000000000042de20 for_enable_underflow - 0x000000000042de40 for_get_fpe_ - 0x000000000042e020 for_set_fpe_ - 0x000000000042e3a0 for_fpe_service - 0x000000000042e750 for_get_fpe_counts_ - 0x000000000042e7a0 for_rtl_finish_ - 0x000000000042e7c0 dump_dfil_exception_info - 0x000000000042f6a0 for_rtl_init_ - .text 0x000000000042f900 0x1120 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) - 0x000000000042f900 for__adjust_buffer - 0x000000000042fb50 for__lower_bound_index - 0x000000000042fba0 for__cvt_foreign_read - 0x00000000004300f0 for__cvt_foreign_write - 0x00000000004308f0 for__cvt_foreign_check - 0x0000000000430970 for_check_env_name - .text 0x0000000000430a20 0x5ca0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) - 0x0000000000430a20 SetEndian - 0x0000000000430e20 CheckStreamRecortType - 0x0000000000431300 CheckEndian - 0x0000000000431760 for_open - 0x0000000000432a80 for__update_reopen_keywords - 0x0000000000433a50 for__set_foreign_bits - 0x0000000000434d70 for__open_key - 0x0000000000435020 for__open_args - 0x00000000004357e0 for__find_iomsg - 0x0000000000435880 for__set_terminator_option - 0x0000000000435d80 for__set_conversion_option - 0x0000000000436190 for__is_special_device - 0x0000000000436340 for__open_default - .text 0x00000000004366c0 0x240 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o) - 0x00000000004366c0 for__preconnected_units_create - .text 0x0000000000436900 0x280 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o) - 0x0000000000436900 for_set_reentrancy - 0x0000000000436920 for__reentrancy_cleanup - 0x00000000004369b0 for__disable_asynch_deliv_private - 0x00000000004369d0 for__enable_asynch_deliv_private - 0x00000000004369f0 
for__once_private - 0x0000000000436a40 for__reentrancy_init - .text 0x0000000000436b80 0x870 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) - 0x0000000000436b80 for_since_epoch - 0x0000000000436c20 for_since_epoch_t - 0x0000000000436cc0 for_since_epoch_x - 0x0000000000436dc0 for_secnds - 0x0000000000436ed0 for_secnds_t - 0x0000000000436fe0 for_secnds_x - 0x0000000000437240 for_cpusec - 0x00000000004372d0 for_cpusec_t - 0x0000000000437350 for_cpusec_x - .text 0x00000000004373f0 0x2b50 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) - 0x00000000004373f0 for_abort - 0x0000000000437eb0 for_stop_core_impl - 0x0000000000438a60 for_stop_core - 0x0000000000439590 for_stop - .text 0x0000000000439f40 0x1070 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_vm.o) - 0x0000000000439f40 for__set_signal_ops_during_vm - 0x0000000000439f80 for__get_vm - 0x000000000043a0c0 for__realloc_vm - 0x000000000043a1b0 for__free_vm - 0x000000000043a230 for_allocate - 0x000000000043a5a0 for_alloc_allocatable - 0x000000000043a920 for_deallocate - 0x000000000043aab0 for_dealloc_allocatable - 0x000000000043ac60 for_check_mult_overflow - 0x000000000043ad80 for_check_mult_overflow64 - 0x000000000043af00 for__spec_align_alloc - 0x000000000043afa0 for__spec_align_free - .text 0x000000000043afb0 0x39f0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) - 0x000000000043afb0 for_write_int_fmt - 0x000000000043c120 for_write_int_fmt_xmit - .text 0x000000000043e9a0 0x4b60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o) - 0x000000000043e9a0 for_write_seq_fmt - 0x0000000000440750 for_write_seq_fmt_xmit - 0x0000000000443330 for__write_args - .text 0x0000000000443500 0x6b90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) - 0x0000000000443500 ensure_one_leading_blank_before_data - 0x0000000000443910 for_write_seq_lis - 
0x00000000004454f0 for_write_seq_lis_xmit - .text 0x000000000044a090 0x4f30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - 0x000000000044a090 for__aio_acquire_lun_fname - 0x000000000044a360 for__aio_release - 0x000000000044a430 for__aio_acquire_lun - 0x000000000044aca0 for__aio_release_lun - 0x000000000044b2c0 for__aio_destroy - 0x000000000044b760 for_asynchronous - 0x000000000044c7a0 for_waitid - 0x000000000044d640 for_wait - 0x000000000044e390 for__aio_check_unit - 0x000000000044e5d0 for__aio_error_handling - 0x000000000044ee00 for__aio_init - .text 0x000000000044efc0 0x5350 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) - 0x000000000044efc0 fname_from_piped_fd - 0x000000000044f200 for__reopen_file - 0x000000000044f2f0 for__compute_filename - 0x0000000000451a80 for__open_proc - 0x0000000000454300 for__decl_exit_hand - .text 0x0000000000454310 0xb0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio_wrap.o) - 0x0000000000454310 for__aio_pthread_self - 0x0000000000454320 for__aio_pthread_create - 0x0000000000454340 for__aio_pthread_cancel - 0x0000000000454350 for__aio_pthread_detach - 0x0000000000454360 for__aio_pthread_mutex_lock - 0x0000000000454370 for__aio_pthread_mutex_unlock - 0x0000000000454380 for__aio_pthread_cond_wait - 0x0000000000454390 for__aio_pthread_cond_signal - 0x00000000004543a0 for__aio_pthread_mutex_init - 0x00000000004543b0 for__aio_pthread_exit - .text 0x00000000004543c0 0xad0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_int.o) - 0x00000000004543c0 cvt_text_to_integer - 0x00000000004546e0 cvt_text_to_unsigned - 0x0000000000454910 cvt_text_to_integer64 - 0x0000000000454c40 cvt_text_to_unsigned64 - .text 0x0000000000454e90 0xd20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_f.o) - 0x0000000000454e90 cvt_vax_f_to_ieee_single_ - 0x00000000004552f0 CVT_VAX_F_TO_IEEE_SINGLE - 0x0000000000455750 
cvt_vax_f_to_ieee_single - .text 0x0000000000455bb0 0xf80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_d.o) - 0x0000000000455bb0 cvt_vax_d_to_ieee_double_ - 0x00000000004560e0 CVT_VAX_D_TO_IEEE_DOUBLE - 0x0000000000456610 cvt_vax_d_to_ieee_double - .text 0x0000000000456b30 0xf20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_g.o) - 0x0000000000456b30 cvt_vax_g_to_ieee_double_ - 0x0000000000457030 CVT_VAX_G_TO_IEEE_DOUBLE - 0x0000000000457530 cvt_vax_g_to_ieee_double - .text 0x0000000000457a50 0x1f40 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cray.o) - 0x0000000000457a50 cvt_cray_to_ieee_single_ - 0x0000000000457f20 CVT_CRAY_TO_IEEE_SINGLE - 0x00000000004583f0 cvt_cray_to_ieee_single - 0x00000000004588e0 cvt_cray_to_ieee_double_ - 0x0000000000458e60 CVT_CRAY_TO_IEEE_DOUBLE - 0x00000000004593e0 cvt_cray_to_ieee_double - .text 0x0000000000459990 0xd80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_short.o) - 0x0000000000459990 cvt_ibm_short_to_ieee_single_ - 0x0000000000459e10 CVT_IBM_SHORT_TO_IEEE_SINGLE - 0x000000000045a290 cvt_ibm_short_to_ieee_single - .text 0x000000000045a710 0x1070 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_long.o) - 0x000000000045a710 cvt_ibm_long_to_ieee_double_ - 0x000000000045ac70 CVT_IBM_LONG_TO_IEEE_DOUBLE - 0x000000000045b1d0 cvt_ibm_long_to_ieee_double - .text 0x000000000045b780 0x4080 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o) - 0x000000000045b780 cvt_ieee_double_to_cray_ - 0x000000000045bc00 CVT_IEEE_DOUBLE_TO_CRAY - 0x000000000045c080 cvt_ieee_double_to_cray - 0x000000000045c500 cvt_ieee_double_to_ibm_long_ - 0x000000000045c9c0 CVT_IEEE_DOUBLE_TO_IBM_LONG - 0x000000000045ce80 cvt_ieee_double_to_ibm_long - 0x000000000045d360 cvt_ieee_double_to_vax_d_ - 0x000000000045d730 CVT_IEEE_DOUBLE_TO_VAX_D - 0x000000000045db00 cvt_ieee_double_to_vax_d - 
0x000000000045df00 cvt_ieee_double_to_vax_g_ - 0x000000000045e2d0 CVT_IEEE_DOUBLE_TO_VAX_G - 0x000000000045e6a0 cvt_ieee_double_to_vax_g - 0x000000000045eaa0 cvt_ieee_double_to_vax_h_ - 0x000000000045ef10 CVT_IEEE_DOUBLE_TO_VAX_H - 0x000000000045f380 cvt_ieee_double_to_vax_h - .text 0x000000000045f800 0x1fe0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_single.o) - 0x000000000045f800 cvt_ieee_single_to_cray_ - 0x000000000045fba0 CVT_IEEE_SINGLE_TO_CRAY - 0x000000000045ff40 cvt_ieee_single_to_cray - 0x0000000000460300 cvt_ieee_single_to_ibm_short_ - 0x00000000004606f0 CVT_IEEE_SINGLE_TO_IBM_SHORT - 0x0000000000460ae0 cvt_ieee_single_to_ibm_short - 0x0000000000460ef0 cvt_ieee_single_to_vax_f_ - 0x00000000004611f0 CVT_IEEE_SINGLE_TO_VAX_F - 0x00000000004614f0 cvt_ieee_single_to_vax_f - .text 0x00000000004617e0 0x750 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close.o) - 0x00000000004617e0 for_close - 0x0000000000461ce0 for__close_args - 0x0000000000461e10 for__close_default - .text 0x0000000000461f30 0x6d0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) - 0x0000000000461f30 for__close_proc - .text 0x0000000000462600 0x220 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_default_io_sizes_env_init.o) - 0x0000000000462600 for__default_io_sizes_env_init - .text 0x0000000000462820 0xbd0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o) - 0x0000000000462820 for__desc_ret_item - 0x0000000000462b30 for__key_desc_ret_item - 0x0000000000462e60 for__desc_test_item - 0x0000000000463080 for__desc_zero_length_item - .text 0x00000000004633f0 0x4a50 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - 0x00000000004633f0 for__this_image_number_or_zero - 0x0000000000463440 for__io_return - 0x0000000000464040 for__issue_diagnostic - 0x00000000004649e0 for__get_msg - 0x0000000000464ce0 for_emit_diagnostic 
- 0x0000000000464e50 for__message_catalog_close - 0x00000000004655b0 for_errmsg - 0x0000000000465770 for__rtc_uninit_use - 0x0000000000465790 for__rtc_uninit_use_src - 0x00000000004657b0 TRACEBACKQQ - 0x00000000004659f0 tracebackqq_ - 0x0000000000465c30 for_perror_ - 0x0000000000466e20 for_gerror_ - 0x0000000000467bc0 for__establish_user_error_handler - 0x0000000000467c00 for__continue_traceback_ - 0x0000000000467d20 for__continue_traceback - .text 0x0000000000467e40 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit.o) - 0x0000000000467e40 for_exit - .text 0x0000000000467e60 0x2f0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit_handler.o) - 0x0000000000467e60 for__fpe_exit_handler - 0x0000000000467f40 for__exit_handler - .text 0x0000000000468150 0x3320 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o) - 0x0000000000468150 for__format_compiler - .text 0x000000000046b470 0x1810 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) - 0x000000000046b470 for__format_value - 0x000000000046c1c0 for__cvt_value - .text 0x000000000046cc80 0x2490 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o) - 0x000000000046cc80 for__get_s - 0x000000000046e050 for__read_input - 0x000000000046e170 for__get_d - 0x000000000046e520 for__get_su_fields - 0x000000000046ef30 for__get_more_fields - .text 0x000000000046f110 0xe80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o) - 0x000000000046f110 for__interp_fmt - .text 0x000000000046ff90 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_ldir_wfs.o) - .text 0x000000000046ff90 0x2540 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o) - 0x000000000046ff90 for__acquire_lun - 0x0000000000470e40 for__release_lun - 0x0000000000471160 for__create_lub - 0x0000000000471300 for__deallocate_lub - 0x0000000000471e30 
for__get_next_lub - 0x00000000004722a0 for__get_free_newunit - 0x0000000000472470 for__release_newunit - .text 0x00000000004724d0 0x4e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o) - 0x00000000004724d0 for__add_to_lf_table - 0x0000000000472930 for__rm_from_lf_table - .text 0x00000000004729b0 0x1ec0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o) - 0x00000000004729b0 for__put_su - 0x0000000000473030 for__write_output - 0x0000000000473410 for__put_sf - 0x00000000004744d0 for__put_d - 0x0000000000474760 for__flush_readahead - .text 0x0000000000474870 0x6200 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o) - 0x0000000000474870 for_write_seq - 0x0000000000475f70 for_write_seq_xmit - 0x000000000047a610 for__finish_ufseq_write - .text 0x000000000047aa70 0x1360 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) - 0x000000000047bb70 tbk_stack_trace - .text 0x000000000047bdd0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt__globals.o) - .text 0x000000000047bdd0 0x7d0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o) - 0x000000000047bdd0 cvt_integer_to_text - 0x000000000047bfc0 cvt_unsigned_to_text - 0x000000000047c1a0 cvt_integer64_to_text - 0x000000000047c3b0 cvt_unsigned64_to_text - .text 0x000000000047c5a0 0x780 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o) - 0x000000000047c5a0 cvt_data_to_text - 0x000000000047c960 cvt_data64_to_text - .text 0x000000000047cd20 0x8d0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_log_to_text.o) - 0x000000000047cd20 cvt_boolean_to_text - 0x000000000047d020 cvt_boolean_to_text_ex - 0x000000000047d320 cvt_boolean64_to_text - .text 0x000000000047d5f0 0x570 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_data.o) - 0x000000000047d5f0 cvt_text_to_data - 
0x000000000047d8d0 cvt_text_to_data64 - .text 0x000000000047db60 0x220 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_log.o) - 0x000000000047db60 cvt_text_to_boolean - 0x000000000047dc70 cvt_text_to_boolean64 - .text 0x000000000047dd80 0x2820 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) - 0x000000000047dd80 cvt_ieee_t_to_text_ex - 0x000000000047f0f0 cvt_ieee_t_to_text - 0x0000000000480410 cvt_text_to_ieee_t_ex - .text 0x00000000004805a0 0x2760 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) - 0x00000000004805a0 cvt_ieee_s_to_text_ex - 0x00000000004818b0 cvt_ieee_s_to_text - 0x0000000000482b70 cvt_text_to_ieee_s_ex - .text 0x0000000000482d00 0x1610 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) - 0x0000000000482d00 cvt_ieee_x_to_text - 0x0000000000482d50 cvt_ieee_x_to_text_ex - 0x0000000000484180 cvt_text_to_ieee_x_ex - .text 0x0000000000484310 0x1660 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) - 0x0000000000484310 cvtas_a_to_s - .text 0x0000000000485970 0x2bb0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) - 0x0000000000485970 cvtas_a_to_t - .text 0x0000000000488520 0x53f0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) - 0x0000000000488520 cvtas_s_to_a - .text 0x000000000048d910 0x5530 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o) - 0x000000000048d910 cvtas_t_to_a - .text 0x0000000000492e40 0xd0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_s.o) - 0x0000000000492e40 cvtas__nan_s - .text 0x0000000000492f10 0xc0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_t.o) - 0x0000000000492f10 cvtas__nan_t - .text 0x0000000000492fd0 0x5270 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) - 
0x0000000000492fd0 cvtas_a_to_x - .text 0x0000000000498240 0x5750 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o) - 0x0000000000498240 cvtas_x_to_a - .text 0x000000000049d990 0xf0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_x.o) - 0x000000000049d990 cvtas__nan_x - .text 0x000000000049da80 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_globals.o) - .text 0x000000000049da80 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_53.o) - .text 0x000000000049da80 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_64.o) - .text 0x000000000049da80 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_128.o) - .text 0x000000000049da80 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(fetestexcept.o) - 0x000000000049da80 fetestexcept - .text 0x000000000049dab0 0x50 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o) - 0x000000000049dab0 lroundf - .text 0x000000000049db00 0x50 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_stub.o) - 0x000000000049db00 lround - .text 0x000000000049db50 0x170 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o) - 0x000000000049db50 lround.L - .text 0x000000000049dcc0 0x130 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_ct.o) - 0x000000000049dcc0 lroundf.L - .text 0x000000000049ddf0 0xe0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_gen.o) - 0x000000000049ddf0 lroundf.A - .text 0x000000000049ded0 0xf0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_gen.o) - 0x000000000049ded0 lround.A - .text 0x000000000049dfc0 0x7f0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - 0x000000000049e1d0 __libm_copy_value - 0x000000000049e320 __libm_error_support - 
0x000000000049e720 __libm_setusermatherrl - 0x000000000049e750 __libm_setusermatherr - 0x000000000049e780 __libm_setusermatherrf - .text 0x000000000049e7b0 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrf.o) - 0x000000000049e7b0 matherrf - .text 0x000000000049e7c0 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrl.o) - 0x000000000049e7c0 matherrl - .text 0x000000000049e7d0 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherr.o) - 0x000000000049e7d0 matherr - .text 0x000000000049e7e0 0x1e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ints2q.o) - 0x000000000049e7e0 __ktoq - 0x000000000049e870 __jtoq - 0x000000000049e920 __itoq - 0x000000000049e980 __utoq - .text 0x000000000049e9c0 0x560 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(qcomp.o) - 0x000000000049e9c0 __eqq - 0x000000000049ea40 __neq - 0x000000000049ead0 __leq - 0x000000000049ebb0 __ltq - 0x000000000049ec90 __geq - 0x000000000049ed70 __gtq - 0x000000000049ee50 __compareq - .text 0x000000000049ef20 0x1d0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fp2q.o) - 0x000000000049ef20 __dtoq - 0x000000000049eff0 __ltoq - 0x000000000049f060 __ftoq - .text 0x000000000049f0f0 0x7e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(q2fp.o) - 0x000000000049f0f0 __qtod - 0x000000000049f3e0 __qtol - 0x000000000049f620 __qtof - .text 0x000000000049f8d0 0x650 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) - 0x000000000049f8d0 tbk_string_stack_signal - 0x000000000049f936 tbk_string_stack_signal_impl - .text 0x000000000049ff20 0x1640 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) - 0x000000000049ff20 tbk_getPC - 0x000000000049ff30 tbk_getRetAddr - 0x000000000049ff40 tbk_getFramePtr - 0x000000000049ff50 tbk_getModuleName - 0x00000000004a0270 tbk_get_pc_info - 0x00000000004a0e30 tbk_geterrorstring - 
0x00000000004a0fe0 tbk_trace_stack - 0x00000000004a1034 tbk_trace_stack_impl - .text 0x00000000004a1560 0x460 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(cpu_feature_disp.o) - 0x00000000004a1560 __intel_cpu_features_init_x - 0x00000000004a1580 __intel_cpu_features_init - .text 0x00000000004a19c0 0xc0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) - 0x00000000004a19c0 _intel_fast_memcpy.A - 0x00000000004a19d0 _intel_fast_memcpy.J - 0x00000000004a19e0 _intel_fast_memcpy.M - 0x00000000004a19f0 _intel_fast_memcpy.P - 0x00000000004a1a00 _intel_fast_memcpy - .text 0x00000000004a1a80 0x90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o) - 0x00000000004a1a80 _intel_fast_memmove.A - 0x00000000004a1a90 _intel_fast_memmove.M - 0x00000000004a1aa0 _intel_fast_memmove.P - 0x00000000004a1ab0 _intel_fast_memmove - .text 0x00000000004a1b10 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemset.o) - 0x00000000004a1b10 _intel_fast_memset.A - 0x00000000004a1b20 _intel_fast_memset.J - 0x00000000004a1b30 _intel_fast_memset - .text 0x00000000004a1b70 0x360 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o) - 0x00000000004a1b70 __intel_new_feature_proc_init - .text 0x00000000004a1ed0 0x3190 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_addsubq.o) - 0x00000000004a31e0 __addq - 0x00000000004a3290 __subq - .text 0x00000000004a5060 0x1c00 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_divq.o) - 0x00000000004a5060 __divq.L - 0x00000000004a5e40 __divq.A - 0x00000000004a6c20 __divq - .text 0x00000000004a6c60 0x130 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcpy.o) - 0x00000000004a6c60 __intel_sse2_strcpy - .text 0x00000000004a6d90 0x190 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncpy.o) - 0x00000000004a6d90 __intel_sse2_strncpy - .text 0x00000000004a6f20 0x30 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strlen.o) - 0x00000000004a6f20 __intel_sse2_strlen - .text 0x00000000004a6f50 0x40 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strchr.o) - 0x00000000004a6f50 __intel_sse2_strchr - .text 0x00000000004a6f90 0x2e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncmp.o) - 0x00000000004a6f90 __intel_sse2_strncmp - .text 0x00000000004a7270 0x280 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcat.o) - 0x00000000004a7270 __intel_sse2_strcat - .text 0x00000000004a74f0 0x330 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncat.o) - 0x00000000004a74f0 __intel_sse2_strncat - .text 0x00000000004a7820 0x17b0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memcpy_pp.o) - 0x00000000004a7820 __intel_memcpy - 0x00000000004a7820 __intel_new_memcpy - .text 0x00000000004a8fd0 0x11e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memset_pp.o) - 0x00000000004a8fd0 __intel_memset - 0x00000000004a8fd0 __intel_new_memset - .text 0x00000000004aa1b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o) - .text.ssse3 0x00000000004aa1b0 0x29c5 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o) - 0x00000000004aa1b0 __intel_ssse3_memcpy - *fill* 0x00000000004acb75 0x3 - .text 0x00000000004acb78 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o) - *fill* 0x00000000004acb78 0x8 - .text.ssse3 0x00000000004acb80 0x2ab6 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o) - 0x00000000004acb80 __intel_ssse3_rep_memcpy - *fill* 0x00000000004af636 0x2 - .text 0x00000000004af638 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o) - *fill* 0x00000000004af638 0x8 - .text.ssse3 0x00000000004af640 0x2b76 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o) - 0x00000000004af640 __intel_ssse3_memmove - *fill* 0x00000000004b21b6 0x2 - .text 0x00000000004b21b8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o) - *fill* 0x00000000004b21b8 0x8 - .text.ssse3 0x00000000004b21c0 0x2af6 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o) - 0x00000000004b21c0 __intel_ssse3_rep_memmove - *fill* 0x00000000004b4cb6 0xa - .text 0x00000000004b4cc0 0x4e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - 0x00000000004b4cc0 __libirc_get_msg - 0x00000000004b4ef0 __libirc_print - .text 0x00000000004b51a0 0xbe0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_mem_ops.o) - 0x00000000004b51a0 __cacheSize - .text 0x00000000004b5d80 0xb0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(proc_init_utils.o) - 0x00000000004b5d80 __intel_proc_init_ftzdazule - .text 0x00000000004b5e30 0x99 /usr/lib64/libc_nonshared.a(elf-init.oS) - 0x00000000004b5e30 __libc_csu_fini - 0x00000000004b5e40 __libc_csu_init - *fill* 0x00000000004b5ec9 0x7 - .text 0x00000000004b5ed0 0x51 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o) - 0x00000000004b5ed0 __powidf2 - *fill* 0x00000000004b5f21 0xf - .text 0x00000000004b5f30 0x36 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - *fill* 0x00000000004b5f66 0x2 - .text 0x00000000004b5f68 0x0 /usr/lib/../lib64/crtn.o - *(.gnu.warning) - -.fini 0x00000000004b5f68 0x16 - *(SORT(.fini)) - .fini 0x00000000004b5f68 0x10 /usr/lib/../lib64/crti.o - 0x00000000004b5f68 _fini - .fini 0x00000000004b5f78 0x5 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - .fini 0x00000000004b5f7d 0x1 /usr/lib/../lib64/crtn.o - 0x00000000004b5f7e PROVIDE (__etext, .) - 0x00000000004b5f7e PROVIDE (_etext, .) - 0x00000000004b5f7e PROVIDE (etext, .) 
- -.rodata 0x00000000004b5f80 0x1a5e8 - *(.rodata .rodata.* .gnu.linkonce.r.*) - .rodata.cst4 0x00000000004b5f80 0x4 /usr/lib/../lib64/crt1.o - 0x00000000004b5f80 _IO_stdin_used - *fill* 0x00000000004b5f84 0x4 - .rodata 0x00000000004b5f88 0x240 rdbfmsua.o - .rodata.str1.4 - 0x00000000004b61c8 0x1c2 rdbfmsua.o - 0x208 (size before relaxing) - *fill* 0x00000000004b638a 0x6 - .rodata 0x00000000004b6390 0x50 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - .rodata 0x00000000004b63e0 0x55 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - .rodata 0x00000000004b6435 0x21 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) - .rodata 0x00000000004b6456 0x1 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o) - *fill* 0x00000000004b6457 0x1 - .rodata 0x00000000004b6458 0xc /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o) - *fill* 0x00000000004b6464 0x4 - .rodata 0x00000000004b6468 0x50 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flbksp.o) - .rodata 0x00000000004b64b8 0x58 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o) - .rodata 0x00000000004b6510 0x1 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flpath.o) - *fill* 0x00000000004b6511 0x7 - .rodata 0x00000000004b6518 0x5d /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o) - .rodata 0x00000000004b6575 0x3 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o) - .rodata 0x00000000004b6578 0xc4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbastn.o) - *fill* 0x00000000004b663c 0x4 - .rodata 0x00000000004b6640 0x3a2 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) - *fill* 0x00000000004b69e2 0x6 - .rodata 0x00000000004b69e8 0x76 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o) - *fill* 0x00000000004b6a5e 0x2 - .rodata 0x00000000004b6a60 0x3e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapx.o) - *fill* 0x00000000004b6a9e 0x2 - .rodata 0x00000000004b6aa0 0x14 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o) - *fill* 0x00000000004b6ab4 0x4 - .rodata 0x00000000004b6ab8 0x3bb /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o) - *fill* 0x00000000004b6e73 0x5 - .rodata 0x00000000004b6e78 0x97 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) - *fill* 0x00000000004b6f0f 0x1 - .rodata 0x00000000004b6f10 0xe8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) - .rodata 0x00000000004b6ff8 0x5c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(status.o) - *fill* 0x00000000004b7054 0x4 - .rodata 0x00000000004b7058 0x478 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) - .rodata 0x00000000004b74d0 0x7c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) - *fill* 0x00000000004b754c 0x4 - .rodata 0x00000000004b7550 0x331 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o) - *fill* 0x00000000004b7881 0x7 - .rodata 0x00000000004b7888 0xad /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) - *fill* 0x00000000004b7935 0x3 - .rodata 0x00000000004b7938 0x1fc 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wtstat.o) - *fill* 0x00000000004b7b34 0x4 - .rodata 0x00000000004b7b38 0xe4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(adn30.o) - .rodata 0x00000000004b7c1c 0x2c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) - .rodata 0x00000000004b7c48 0x41 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o) - *fill* 0x00000000004b7c89 0x7 - .rodata 0x00000000004b7c90 0x41 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort.o) - *fill* 0x00000000004b7cd1 0x3 - .rodata 0x00000000004b7cd4 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cpbfdx.o) - .rodata 0x00000000004b7cd8 0xc /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(drstpl.o) - .rodata 0x00000000004b7ce4 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o) - *fill* 0x00000000004b7cec 0x4 - .rodata 0x00000000004b7cf0 0xea /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxmini.o) - *fill* 0x00000000004b7dda 0x6 - .rodata 0x00000000004b7de0 0x57 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o) - *fill* 0x00000000004b7e37 0x9 - .rodata 0x00000000004b7e40 0x10 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ibfms.o) - .rodata 0x00000000004b7e50 0x11 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ifxy.o) - *fill* 0x00000000004b7e61 0x7 - .rodata 0x00000000004b7e68 0x82 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invcon.o) - *fill* 0x00000000004b7eea 0x6 - .rodata 0x00000000004b7ef0 0x82 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invwin.o) - *fill* 0x00000000004b7f72 0x6 - .rodata 0x00000000004b7f78 0x86 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ipkm.o) - *fill* 0x00000000004b7ffe 0x2 - .rodata 0x00000000004b8000 0x87 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupm.o) - *fill* 0x00000000004b8087 0x1 - .rodata 0x00000000004b8088 0xd0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrpc.o) - .rodata 0x00000000004b8158 0xd0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrps.o) - .rodata 0x00000000004b8228 0x144 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) - *fill* 0x00000000004b836c 0x4 - .rodata 0x00000000004b8370 0x82 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(newwin.o) - .rodata 0x00000000004b83f2 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nmwrd.o) - *fill* 0x00000000004b83f6 0x2 - .rodata 0x00000000004b83f8 0x82 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nxtwin.o) - *fill* 0x00000000004b847a 0x6 - .rodata 0x00000000004b8480 0xeb /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ovrbs1.o) - *fill* 0x00000000004b856b 0x5 - .rodata 0x00000000004b8570 0x6c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(padmsg.o) - *fill* 0x00000000004b85dc 0x4 - .rodata 0x00000000004b85e0 0xc5 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o) - *fill* 0x00000000004b86a5 0x3 - .rodata 0x00000000004b86a8 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkc.o) - .rodata 0x00000000004b86b0 0x11c 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pktdd.o) - *fill* 0x00000000004b87cc 0x4 - .rodata 0x00000000004b87d0 0x80 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs01.o) - .rodata 0x00000000004b8850 0xfc /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs1.o) - *fill* 0x00000000004b894c 0x4 - .rodata 0x00000000004b8950 0x37b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) - *fill* 0x00000000004b8ccb 0x1 - .rodata 0x00000000004b8ccc 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o) - *fill* 0x00000000004b8cd4 0x4 - .rodata 0x00000000004b8cd8 0x494 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) - *fill* 0x00000000004b916c 0x4 - .rodata 0x00000000004b9170 0xde /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o) - *fill* 0x00000000004b924e 0x2 - .rodata 0x00000000004b9250 0x662 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) - *fill* 0x00000000004b98b2 0x6 - .rodata 0x00000000004b98b8 0x24c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o) - *fill* 0x00000000004b9b04 0x4 - .rodata 0x00000000004b9b08 0xf2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) - *fill* 0x00000000004b9bfa 0x6 - .rodata 0x00000000004b9c00 0x84 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o) - *fill* 0x00000000004b9c84 0x4 - .rodata 0x00000000004b9c88 0xb2 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strsuc.o) - *fill* 0x00000000004b9d3a 0x2 - .rodata 0x00000000004b9d3c 0x8 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o) - .rodata 0x00000000004b9d44 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upc.o) - *fill* 0x00000000004b9d4c 0x4 - .rodata 0x00000000004b9d50 0x4f0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(usrtpl.o) - .rodata 0x00000000004ba240 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrna.o) - .rodata 0x00000000004ba248 0x274 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) - *fill* 0x00000000004ba4bc 0x4 - .rodata 0x00000000004ba4c0 0xf5 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cnved4.o) - *fill* 0x00000000004ba5b5 0x3 - .rodata 0x00000000004ba5b8 0x14a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o) - *fill* 0x00000000004ba702 0x2 - .rodata 0x00000000004ba704 0x10 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getlens.o) - .rodata 0x00000000004ba714 0x4a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(gets1loc.o) - *fill* 0x00000000004ba75e 0x2 - .rodata 0x00000000004ba760 0x148 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(idn30.o) - .rodata 0x00000000004ba8a8 0x10 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(igetdate.o) - .rodata 0x00000000004ba8b8 0x18 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(istdesc.o) - .rodata 0x00000000004ba8d0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupbs01.o) - .rodata 0x00000000004ba8f8 0x1 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstchr.o) - *fill* 
0x00000000004ba8f9 0x7 - .rodata 0x00000000004ba900 0xfa /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstnum.o) - *fill* 0x00000000004ba9fa 0x6 - .rodata 0x00000000004baa00 0xd0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstjpb.o) - .rodata 0x00000000004baad0 0x170 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) - .rodata 0x00000000004bac40 0x78 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(mvb.o) - .rodata 0x00000000004bacb8 0x1f /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtab.o) - *fill* 0x00000000004bacd7 0x1 - .rodata 0x00000000004bacd8 0xb0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbax.o) - .rodata 0x00000000004bad88 0xbb /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenuaa.o) - *fill* 0x00000000004bae43 0x5 - .rodata 0x00000000004bae48 0x16b /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenubd.o) - .rodata 0x00000000004bafb3 0x13 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numbck.o) - *fill* 0x00000000004bafc6 0x2 - .rodata 0x00000000004bafc8 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtab.o) - *fill* 0x00000000004bafef 0x1 - .rodata 0x00000000004baff0 0xca /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbt.o) - *fill* 0x00000000004bb0ba 0x6 - .rodata 0x00000000004bb0c0 0x181 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parstr.o) - *fill* 0x00000000004bb241 0x7 - .rodata 0x00000000004bb248 0x363 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o) - *fill* 0x00000000004bb5ab 0x5 
- .rodata 0x00000000004bb5b0 0x1e4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o) - *fill* 0x00000000004bb794 0x4 - .rodata 0x00000000004bb798 0xf0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o) - .rodata 0x00000000004bb888 0x18 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgb.o) - .rodata 0x00000000004bb8a0 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) - *fill* 0x00000000004bb8c7 0x1 - .rodata 0x00000000004bb8c8 0x3f6 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) - *fill* 0x00000000004bbcbe 0x2 - .rodata 0x00000000004bbcc0 0x68 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(uptdd.o) - .rodata 0x00000000004bbd28 0x47 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdesc.o) - *fill* 0x00000000004bbd6f 0x1 - .rodata 0x00000000004bbd70 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cadn30.o) - *fill* 0x00000000004bbd74 0x4 - .rodata 0x00000000004bbd78 0x10e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chekstab.o) - *fill* 0x00000000004bbe86 0x2 - .rodata 0x00000000004bbe88 0x6e /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(inctab.o) - *fill* 0x00000000004bbef6 0xa - .rodata 0x00000000004bbf00 0x290 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o) - .rodata 0x00000000004bc190 0x2b7 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbd.o) - .rodata 0x00000000004bc447 0x1 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtbd.o) - .rodata 0x00000000004bc448 0xb1 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabent.o) - *fill* 0x00000000004bc4f9 0x7 - .rodata 0x00000000004bc500 0x108 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o) - .rodata 0x00000000004bc608 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rjust.o) - .rodata.str1.4 - 0x00000000004bc610 0xd0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - .rodata 0x00000000004bc6e0 0x160 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - .rodata.str1.32 - 0x00000000004bc840 0x1ecd /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - 0x1ee0 (size before relaxing) - *fill* 0x00000000004be70d 0x13 - .rodata 0x00000000004be720 0x3c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) - .rodata.str1.4 - 0x00000000004beae0 0x3 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) - 0x4 (size before relaxing) - *fill* 0x00000000004beae3 0x1 - .rodata.str1.4 - 0x00000000004beae4 0x3cd /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) - 0x3f0 (size before relaxing) - *fill* 0x00000000004beeb1 0xf - .rodata 0x00000000004beec0 0x1980 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) - .rodata.str1.4 - 0x00000000004c0840 0x7 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o) - 0x8 (size before relaxing) - *fill* 0x00000000004c0847 0x19 - .rodata 0x00000000004c0860 0x80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o) - .rodata 0x00000000004c08e0 0x70 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) - .rodata.str1.4 - 0x00000000004c0950 0xb /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) - 0x14 (size before 
relaxing) - *fill* 0x00000000004c095b 0x5 - .rodata 0x00000000004c0960 0x200 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) - .rodata.str1.4 - 0x00000000004c0b60 0x53 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) - 0x64 (size before relaxing) - *fill* 0x00000000004c0bb3 0xd - .rodata 0x00000000004c0bc0 0x220 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) - .rodata.str1.4 - 0x00000000004c0de0 0xf /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o) - 0x24 (size before relaxing) - *fill* 0x00000000004c0def 0x1 - .rodata 0x00000000004c0df0 0x2e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o) - .rodata 0x00000000004c10d0 0x4b0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) - .rodata.str1.4 - 0x00000000004c1580 0xf /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) - 0x18 (size before relaxing) - *fill* 0x00000000004c158f 0x1 - .rodata.str1.4 - 0x00000000004c1590 0xf2 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - 0x100 (size before relaxing) - *fill* 0x00000000004c1682 0x2 - .rodata.str1.4 - 0x00000000004c1684 0xb0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) - 0x104 (size before relaxing) - *fill* 0x00000000004c1734 0x4 - .rodata 0x00000000004c1738 0x310 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) - .rodata 0x00000000004c1a48 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_int.o) - .rodata 0x00000000004c1a58 0x198 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_f.o) - .rodata 0x00000000004c1bf0 0x198 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_d.o) - .rodata 0x00000000004c1d88 0x198 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_g.o) - .rodata 0x00000000004c1f20 0x330 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cray.o) - .rodata 0x00000000004c2250 0x198 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_short.o) - .rodata 0x00000000004c23e8 0x198 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_long.o) - .rodata 0x00000000004c2580 0x7f8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o) - .rodata 0x00000000004c2d78 0x4c8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_single.o) - .rodata.str1.4 - 0x0000000000000000 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close.o) - .rodata 0x00000000004c3240 0x48 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) - .rodata.str1.4 - 0x00000000004c3288 0x66 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) - 0x68 (size before relaxing) - *fill* 0x00000000004c32ee 0x2 - .rodata.str1.4 - 0x00000000004c32f0 0x43 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_default_io_sizes_env_init.o) - 0x44 (size before relaxing) - *fill* 0x00000000004c3333 0xd - .rodata 0x00000000004c3340 0x200 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o) - 0x00000000004c34a0 for__dsc_itm_table - .rodata.str1.4 - 0x00000000004c3540 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o) - .rodata.str1.4 - 0x00000000004c3550 0x3c41 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - 0x3cc4 (size before relaxing) - *fill* 0x00000000004c7191 0x7 - .rodata 0x00000000004c7198 0x58 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - *fill* 0x00000000004c71f0 0x10 - .rodata.str1.32 - 0x00000000004c7200 0x1644 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - 0x1660 (size before relaxing) - .rodata.str1.4 - 0x00000000004c8844 0x13 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit_handler.o) - 0x14 (size before relaxing) - *fill* 0x00000000004c8857 0x9 - .rodata 0x00000000004c8860 0x3a0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o) - .rodata.str1.4 - 0x00000000004c8c00 0xf /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o) - 0x10 (size before relaxing) - *fill* 0x00000000004c8c0f 0x1 - .rodata 0x00000000004c8c10 0x1100 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) - .rodata 0x00000000004c9d10 0x48 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o) - .rodata.str1.4 - 0x00000000004c9d58 0xa /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o) - 0xc (size before relaxing) - *fill* 0x00000000004c9d62 0x1e - .rodata 0x00000000004c9d80 0xbe0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o) - 0x00000000004ca300 for__oz_fmt_table - 0x00000000004ca360 for__b_fmt_table - 0x00000000004ca400 for__fedg_fmt_table - 0x00000000004ca4e0 for__coerce_data_types - 0x00000000004ca943 for__i_fmt_table - .rodata.str1.4 - 0x00000000004ca960 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o) - *fill* 0x00000000004ca970 0x10 - .rodata 0x00000000004ca980 0x240 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_ldir_wfs.o) - 0x00000000004ca980 for__wfs_table - 0x00000000004caaa0 for__wfs_msf_table - .rodata 0x00000000004cabc0 0x2d0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o) - .rodata.str1.4 - 0x00000000004cae90 0xe /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o) - 0x10 (size before relaxing) - .rodata.str1.4 - 
0x0000000000000000 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o) - *fill* 0x00000000004cae9e 0x2 - .rodata 0x00000000004caea0 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o) - .rodata 0x00000000004caf00 0x210 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o) - .rodata.str1.4 - 0x00000000004cb110 0xa /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o) - 0xc (size before relaxing) - *fill* 0x00000000004cb11a 0x2 - .rodata.str1.4 - 0x00000000004cb11c 0xb /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o) - 0x14 (size before relaxing) - *fill* 0x00000000004cb127 0x1 - .rodata 0x00000000004cb128 0x248 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o) - .rodata.str1.4 - 0x00000000004cb370 0x1ef /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) - 0x1fc (size before relaxing) - *fill* 0x00000000004cb55f 0x1 - .rodata.str1.32 - 0x00000000004cb560 0xb93 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) - 0xba0 (size before relaxing) - *fill* 0x00000000004cc0f3 0xd - .rodata 0x00000000004cc100 0x1c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt__globals.o) - 0x00000000004cc100 vax_c - 0x00000000004cc140 ieee_t - 0x00000000004cc1b0 ieee_s - 0x00000000004cc1e8 ibm_s - 0x00000000004cc204 ibm_l - 0x00000000004cc23c cray - 0x00000000004cc274 int_c - .rodata 0x00000000004cc2c0 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o) - .rodata.str1.4 - 0x00000000004cc2e0 0x11 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o) - 0x14 (size before relaxing) - .rodata.str1.4 - 0x0000000000000000 0x14 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o) - *fill* 0x00000000004cc2f1 0xf - .rodata 
0x00000000004cc300 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o) - .rodata 0x00000000004cc320 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_log_to_text.o) - .rodata 0x00000000004cc350 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) - .rodata 0x00000000004cc360 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) - .rodata 0x00000000004cc370 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) - .rodata 0x00000000004cc380 0x150 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) - .rodata.str1.4 - 0x00000000004cc4d0 0xd /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) - 0x10 (size before relaxing) - *fill* 0x00000000004cc4dd 0x3 - .rodata 0x00000000004cc4e0 0x150 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) - .rodata.str1.4 - 0x0000000000000000 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) - .rodata.str1.4 - 0x00000000004cc630 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) - 0x30 (size before relaxing) - .rodata 0x00000000004cc650 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) - .rodata.str1.4 - 0x0000000000000000 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o) - .rodata 0x00000000004cc660 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o) - .rodata 0x00000000004cc670 0x138 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) - .rodata.str1.4 - 0x0000000000000000 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) - .rodata.str1.4 - 0x0000000000000000 0x30 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o) - *fill* 0x00000000004cc7a8 0x8 - .rodata 0x00000000004cc7b0 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o) - .rodata 0x00000000004cc7c0 0x180 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_globals.o) - 0x00000000004cc7c0 cvtas_pten_word - 0x00000000004cc860 cvtas_globals_t - 0x00000000004cc8c0 cvtas_globals_x - 0x00000000004cc920 cvtas_globals_s - .rodata 0x00000000004cc940 0x4e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_53.o) - 0x00000000004cc940 cvtas_pten_t - 0x00000000004ccc40 cvtas_tiny_pten_t - 0x00000000004ccce0 cvtas_tiny_pten_t_map - 0x00000000004ccd40 cvtas_huge_pten_t - 0x00000000004ccdc0 cvtas_huge_pten_t_map - .rodata 0x00000000004cce20 0x5e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_64.o) - 0x00000000004cce20 cvtas_pten_64 - 0x00000000004cd120 cvtas_pten_64_bexp - 0x00000000004cd1e0 cvtas_tiny_pten_64 - 0x00000000004cd260 cvtas_tiny_pten_64_map - 0x00000000004cd2e0 cvtas_huge_pten_64 - 0x00000000004cd360 cvtas_huge_pten_64_map - 0x00000000004cd3ba cvtas_tiny_pten_64_bexp - 0x00000000004cd3d8 cvtas_huge_pten_64_bexp - .rodata 0x00000000004cd400 0x520 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_128.o) - 0x00000000004cd400 cvtas_pten_128 - 0x00000000004cd5c0 cvtas_tiny_tiny_pten_128 - 0x00000000004cd600 cvtas_tiny_pten_128 - 0x00000000004cd6a0 cvtas_tiny_pten_128_map - 0x00000000004cd740 cvtas_huge_huge_pten_128 - 0x00000000004cd780 cvtas_huge_pten_128 - 0x00000000004cd820 cvtas_huge_pten_128_map - 0x00000000004cd8a8 cvtas_pten_128_bexp - 0x00000000004cd8de cvtas_tiny_tiny_pten_128_bexp - 0x00000000004cd8e6 cvtas_tiny_pten_128_bexp - 0x00000000004cd8fa cvtas_huge_huge_pten_128_bexp - 0x00000000004cd902 cvtas_huge_pten_128_bexp - .rodata 0x00000000004cd920 0x20 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o) - .rodata 0x00000000004cd940 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_ct.o) - .rodata 0x00000000004cd960 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_gen.o) - .rodata 0x00000000004cd980 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_gen.o) - .rodata.str1.4 - 0x00000000004cd9a0 0x45a /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - 0x484 (size before relaxing) - *fill* 0x00000000004cddfa 0x6 - .rodata 0x00000000004cde00 0x918 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - .rodata 0x00000000004ce718 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(qcomp.o) - .rodata 0x00000000004ce720 0xc /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fp2q.o) - *fill* 0x00000000004ce72c 0x4 - .rodata 0x00000000004ce730 0x28 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(q2fp.o) - .rodata.str1.4 - 0x00000000004ce758 0x113 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) - 0x118 (size before relaxing) - *fill* 0x00000000004ce86b 0x15 - .rodata.str1.32 - 0x00000000004ce880 0xa2 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) - 0xc0 (size before relaxing) - *fill* 0x00000000004ce922 0x2 - .rodata.str1.4 - 0x00000000004ce924 0x24 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) - 0x3c (size before relaxing) - .rodata 0x00000000004ce948 0x24 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) - .rodata.str1.4 - 0x00000000004ce96c 0x14b /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o) - 0x150 (size before relaxing) - *fill* 0x00000000004ceab7 0x9 - .rodata 0x00000000004ceac0 0xa0 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_addsubq.o) - .rodata 0x00000000004ceb60 0x90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_divq.o) - .rodata.ssse3 0x00000000004cebf0 0x1c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o) - .rodata.ssse3 0x00000000004cedb0 0x500 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o) - .rodata.ssse3 0x00000000004cf2b0 0x1c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o) - .rodata.ssse3 0x00000000004cf470 0x500 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o) - .rodata.str1.4 - 0x00000000004cf970 0x58c /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - 0x5b0 (size before relaxing) - *fill* 0x00000000004cfefc 0x4 - .rodata.str1.32 - 0x00000000004cff00 0x660 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - .rodata.cst8 0x00000000004d0560 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o) - -.rodata1 - *(.rodata1) - -.eh_frame_hdr 0x00000000004d0568 0xee4 - *(.eh_frame_hdr) - .eh_frame_hdr 0x00000000004d0568 0xee4 /usr/lib/../lib64/crti.o - -.eh_frame 0x00000000004d1450 0xc694 - *(.eh_frame) - .eh_frame 0x00000000004d1450 0x40 /usr/lib/../lib64/crt1.o - .eh_frame 0x00000000004d1490 0x20 /usr/lib/../lib64/crti.o - 0x38 (size before relaxing) - .eh_frame 0x00000000004d14b0 0x38 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o - 0x50 (size before relaxing) - .eh_frame 0x00000000004d14e8 0x158 rdbfmsua.o - .eh_frame 0x00000000004d1640 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1660 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - 0x38 (size before 
relaxing) - .eh_frame 0x00000000004d1680 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d16a8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d16c8 0x30 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d16f8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1720 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1740 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmbl.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1768 0x30 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmst.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d1798 0x30 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d17c8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flbksp.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d17e8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1810 0x30 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flpath.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d1840 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1860 0x28 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1888 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssgsym.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d18b0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlcuc.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d18d8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stuclc.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1900 0x30 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbastn.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d1930 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flglun.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1950 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libbridge.a(dcbsrh.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1970 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ireadns.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1990 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d19b0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d19d8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1a00 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1a28 0x20 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1a48 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1a68 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1a88 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(status.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1aa8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1ad0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1af8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upb.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1b18 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1b38 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1b60 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wtstat.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1b80 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(adn30.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1ba0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1bc8 0x20 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1be8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort_exit.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1c08 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1c28 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1c50 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cpbfdx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1c78 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(drstpl.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1c98 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1cc0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxmini.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1ce8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1d08 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ibfms.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1d28 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ichkstr.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1d50 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ifxy.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1d70 0x20 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invcon.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1d90 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invwin.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1db0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ipkm.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1dd0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(irev.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1df0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupm.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1e10 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lmsg.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1e30 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrpc.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1e50 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrps.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1e70 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1e98 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(newwin.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1eb8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nmwrd.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1ed8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nxtwin.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1ef8 0x20 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ovrbs1.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1f18 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(padmsg.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1f40 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkb.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1f68 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1f88 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkc.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1fb0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pktdd.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d1fd0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs01.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d1ff8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs1.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2018 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2040 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2068 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2090 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d20b8 0x20 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d20d8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2100 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2128 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2150 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2178 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strsuc.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d21a0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d21c0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upbb.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d21e0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upc.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2208 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(usrtpl.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2230 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(capit.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2250 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrna.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2278 0x28 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrn.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d22a0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d22c8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cnved4.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d22e8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(digit.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2308 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2328 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getlens.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2350 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(gets1loc.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2370 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(i4dy.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2390 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(idn30.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d23b8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(igetdate.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d23d8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(istdesc.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d23f8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupb.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2418 0x28 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupbs01.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2440 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstchr.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2468 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstnum.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2490 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstjpb.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d24b0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d24d8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(mvb.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2500 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemock.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2520 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtab.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2548 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbax.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2570 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenuaa.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2598 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenubd.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d25c0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numbck.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d25e0 0x28 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtab.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2608 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbt.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2628 0x30 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parstr.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d2658 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2680 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d26a8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d26c8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgb.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d26f0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2718 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rsvfvm.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2738 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strcln.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2758 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2780 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(uptdd.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d27a0 0x20 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdesc.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d27c0 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cadn30.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d27e8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chekstab.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2810 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(inctab.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2838 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2858 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbd.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d2880 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtbd.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d28a8 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabent.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d28c8 0x28 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d28f0 0x20 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rjust.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004d2910 0x358 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - 0x370 (size before relaxing) - .eh_frame 0x00000000004d2c68 0x1c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) - 0x1d8 (size before relaxing) - .eh_frame 0x00000000004d2e28 0xa08 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) - 0xa20 (size before relaxing) - .eh_frame 0x00000000004d3830 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004d3860 0x1d0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o) - 0x1e8 (size before relaxing) - .eh_frame 0x00000000004d3a30 0x1e8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) - 0x200 (size before relaxing) - .eh_frame 0x00000000004d3c18 0x238 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) - 0x250 (size before relaxing) - .eh_frame 0x00000000004d3e50 0x320 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_vm.o) - 0x338 (size before relaxing) - .eh_frame 0x00000000004d4170 0x6e8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) - 0x700 (size before relaxing) - .eh_frame 0x00000000004d4858 0x9b0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o) - 0x9c8 (size before relaxing) - .eh_frame 0x00000000004d5208 0xbe8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) - 0xc00 (size before relaxing) - .eh_frame 0x00000000004d5df0 0x758 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - 0x770 (size before relaxing) - .eh_frame 0x00000000004d6548 0x698 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) - 0x6b0 (size before relaxing) - .eh_frame 0x00000000004d6be0 0x180 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio_wrap.o) - 0x198 (size before relaxing) - .eh_frame 0x00000000004d6d60 0x290 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_int.o) - 0x2a8 (size before relaxing) - .eh_frame 0x00000000004d6ff0 0xc8 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_f.o) - 0xe0 (size before relaxing) - .eh_frame 0x00000000004d70b8 0xc8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_d.o) - 0xe0 (size before relaxing) - .eh_frame 0x00000000004d7180 0xc8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_g.o) - 0xe0 (size before relaxing) - .eh_frame 0x00000000004d7248 0x1e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cray.o) - 0x1f8 (size before relaxing) - .eh_frame 0x00000000004d7428 0xe0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_short.o) - 0xf8 (size before relaxing) - .eh_frame 0x00000000004d7508 0x108 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_long.o) - 0x120 (size before relaxing) - .eh_frame 0x00000000004d7610 0x508 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o) - 0x520 (size before relaxing) - .eh_frame 0x00000000004d7b18 0x2a8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_single.o) - 0x2c0 (size before relaxing) - .eh_frame 0x00000000004d7dc0 0x220 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close.o) - 0x238 (size before relaxing) - .eh_frame 0x00000000004d7fe0 0x70 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) - 0x88 (size before relaxing) - .eh_frame 0x00000000004d8050 0x28 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_default_io_sizes_env_init.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d8078 0x1d8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o) - 0x1f0 (size before relaxing) - .eh_frame 0x00000000004d8250 0x4d8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - 0x4f0 (size before relaxing) - .eh_frame 0x00000000004d8728 0x28 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit.o) - 0x40 (size before relaxing) - .eh_frame 0x00000000004d8750 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit_handler.o) - 0x78 (size before relaxing) - .eh_frame 0x00000000004d87b0 0x6a8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o) - 0x6c0 (size before relaxing) - .eh_frame 0x00000000004d8e58 0xf0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) - 0x108 (size before relaxing) - .eh_frame 0x00000000004d8f48 0x4c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o) - 0x4d8 (size before relaxing) - .eh_frame 0x00000000004d9408 0xe8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o) - 0x100 (size before relaxing) - .eh_frame 0x00000000004d94f0 0x430 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o) - 0x448 (size before relaxing) - .eh_frame 0x00000000004d9920 0xb0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o) - 0xc8 (size before relaxing) - .eh_frame 0x00000000004d99d0 0x220 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o) - 0x238 (size before relaxing) - .eh_frame 0x00000000004d9bf0 0x9e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o) - 0x9f8 (size before relaxing) - .eh_frame 0x00000000004da5d0 0xa0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) - 0xb8 (size before relaxing) - .eh_frame 0x00000000004da670 0x100 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o) - 0x118 (size before relaxing) - .eh_frame 0x00000000004da770 0x180 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o) - 0x198 (size before relaxing) - .eh_frame 0x00000000004da8f0 0x140 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_log_to_text.o) - 0x158 (size before relaxing) - .eh_frame 0x00000000004daa30 0x150 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_data.o) - 0x168 (size before relaxing) - .eh_frame 0x00000000004dab80 0x90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_log.o) - 0xa8 (size before relaxing) - .eh_frame 0x00000000004dac10 0x2f8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) - 0x310 (size before relaxing) - .eh_frame 0x00000000004daf08 0x2f8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) - 0x310 (size before relaxing) - .eh_frame 0x00000000004db200 0x2d8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) - 0x2f0 (size before relaxing) - .eh_frame 0x00000000004db4d8 0x90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) - 0xa8 (size before relaxing) - .eh_frame 0x00000000004db568 0x90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) - 0xa8 (size before relaxing) - .eh_frame 0x00000000004db5f8 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) - 0x78 (size before relaxing) - .eh_frame 0x00000000004db658 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o) - 0x78 (size before relaxing) - .eh_frame 0x00000000004db6b8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_s.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004db6d0 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_t.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004db6e8 0x90 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) - 0xa8 (size before relaxing) - .eh_frame 0x00000000004db778 0x60 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o) - 0x78 (size before relaxing) - .eh_frame 0x00000000004db7d8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_x.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004db7f0 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(fetestexcept.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004db810 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004db830 0x20 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_ct.o) - 0x38 (size before relaxing) - .eh_frame 0x00000000004db850 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_gen.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004db880 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_gen.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004db8b0 0xe8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - 0x100 (size before relaxing) - .eh_frame 0x00000000004db998 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrf.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004db9b0 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrl.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004db9c8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherr.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004db9e0 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ints2q.o) - 0x78 (size before relaxing) - .eh_frame 0x00000000004dba40 0xa8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(qcomp.o) - 0xc0 (size before relaxing) - .eh_frame 0x00000000004dbae8 0x48 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fp2q.o) - 0x60 (size before relaxing) - .eh_frame 
0x00000000004dbb30 0x88 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(q2fp.o) - 0xa0 (size before relaxing) - .eh_frame 0x00000000004dbbb8 0x1e8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) - 0x200 (size before relaxing) - .eh_frame 0x00000000004dbda0 0x470 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) - 0x488 (size before relaxing) - .eh_frame 0x00000000004dc210 0x1a8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(cpu_feature_disp.o) - 0x1c0 (size before relaxing) - .eh_frame 0x00000000004dc3b8 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) - 0x78 (size before relaxing) - .eh_frame 0x00000000004dc418 0x48 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o) - 0x60 (size before relaxing) - .eh_frame 0x00000000004dc460 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemset.o) - 0x48 (size before relaxing) - .eh_frame 0x00000000004dc490 0x80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o) - 0x98 (size before relaxing) - .eh_frame 0x00000000004dc510 0xb30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_addsubq.o) - 0xb48 (size before relaxing) - .eh_frame 0x00000000004dd040 0x570 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_divq.o) - 0x588 (size before relaxing) - .eh_frame 0x00000000004dd5b0 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcpy.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd5c8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncpy.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd5e0 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strlen.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd5f8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strchr.o) - 
0x30 (size before relaxing) - .eh_frame 0x00000000004dd610 0xd8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncmp.o) - 0xf0 (size before relaxing) - .eh_frame 0x00000000004dd6e8 0x58 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcat.o) - 0x70 (size before relaxing) - .eh_frame 0x00000000004dd740 0x68 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncat.o) - 0x80 (size before relaxing) - .eh_frame 0x00000000004dd7a8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memcpy_pp.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd7c0 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memset_pp.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd7d8 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd7f0 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd808 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd820 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004dd838 0x108 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - 0x120 (size before relaxing) - .eh_frame 0x00000000004dd940 0xe0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_mem_ops.o) - 0xf8 (size before relaxing) - .eh_frame 0x00000000004dda20 0x68 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(proc_init_utils.o) - 0x80 (size before relaxing) - .eh_frame 0x00000000004dda88 0x40 /usr/lib64/libc_nonshared.a(elf-init.oS) - 0x58 (size before relaxing) - .eh_frame 0x00000000004ddac8 0x18 
/opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o) - 0x30 (size before relaxing) - .eh_frame 0x00000000004ddae0 0x4 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - -.gcc_except_table - *(.gcc_except_table .gcc_except_table.*) - -.exception_ranges - *(.exception_ranges .exception_ranges*) - 0x00000000004ddae4 . = (ALIGN (0x200000) - ((0x200000 - .) & 0x1fffff)) - 0x00000000006de80c . = DATA_SEGMENT_ALIGN (0x200000, 0x1000) - -.eh_frame - *(.eh_frame) - -.gcc_except_table - *(.gcc_except_table .gcc_except_table.*) - -.exception_ranges - *(.exception_ranges .exception_ranges*) - -.tdata - *(.tdata .tdata.* .gnu.linkonce.td.*) - -.tbss - *(.tbss .tbss.* .gnu.linkonce.tb.*) - *(.tcommon) - -.preinit_array 0x00000000006de80c 0x0 - 0x00000000006de80c PROVIDE (__preinit_array_start, .) - *(.preinit_array) - 0x00000000006de80c PROVIDE (__preinit_array_end, .) - -.init_array 0x00000000006de80c 0x0 - 0x00000000006de80c PROVIDE (__init_array_start, .) - *(SORT(.init_array.*)) - *(.init_array) - 0x00000000006de80c PROVIDE (__init_array_end, .) - -.fini_array 0x00000000006de80c 0x0 - 0x00000000006de80c PROVIDE (__fini_array_start, .) - *(SORT(.fini_array.*)) - *(.fini_array) - 0x00000000006de80c PROVIDE (__fini_array_end, .) 
- -.ctors 0x00000000006de810 0x18 - *crtbegin.o(.ctors) - .ctors 0x00000000006de810 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - *crtbegin?.o(.ctors) - *(EXCLUDE_FILE(*crtend?.o *crtend.o) .ctors) - .ctors 0x00000000006de818 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_mem_ops.o) - *(SORT(.ctors.*)) - *(.ctors) - .ctors 0x00000000006de820 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - -.dtors 0x00000000006de828 0x10 - *crtbegin.o(.dtors) - .dtors 0x00000000006de828 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - *crtbegin?.o(.dtors) - *(EXCLUDE_FILE(*crtend?.o *crtend.o) .dtors) - *(SORT(.dtors.*)) - *(.dtors) - .dtors 0x00000000006de830 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - 0x00000000006de830 __DTOR_END__ - -.jcr 0x00000000006de838 0x8 - *(.jcr) - .jcr 0x00000000006de838 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - .jcr 0x00000000006de838 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - -.data.rel.ro 0x00000000006de840 0x480 - *(.data.rel.ro.local* .gnu.linkonce.d.rel.ro.local.*) - .data.rel.ro.local - 0x00000000006de840 0xa0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - .data.rel.ro.local - 0x00000000006de8e0 0x3e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - *(.data.rel.ro .data.rel.ro.* .gnu.linkonce.d.rel.ro.*) - -.dynamic 0x00000000006decc0 0x1e0 - *(.dynamic) - .dynamic 0x00000000006decc0 0x1e0 /usr/lib/../lib64/crt1.o - 0x00000000006decc0 _DYNAMIC - -.got 0x00000000006deea0 0x158 - *(.got) - .got 0x00000000006deea0 0x158 /usr/lib/../lib64/crt1.o - *(.igot) - 0x00000000006dffe8 . 
= DATA_SEGMENT_RELRO_END (., (SIZEOF (.got.plt) >= 0x18)?0x18:0x0) - -.got.plt 0x00000000006df000 0x398 - *(.got.plt) - .got.plt 0x00000000006df000 0x398 /usr/lib/../lib64/crt1.o - 0x00000000006df000 _GLOBAL_OFFSET_TABLE_ - *(.igot.plt) - .igot.plt 0x0000000000000000 0x0 /usr/lib/../lib64/crt1.o - -.data 0x00000000006df3c0 0x3ce0 - *(.data .data.* .gnu.linkonce.d.*) - .data 0x00000000006df3c0 0x4 /usr/lib/../lib64/crt1.o - 0x00000000006df3c0 data_start - 0x00000000006df3c0 __data_start - .data 0x00000000006df3c4 0x0 /usr/lib/../lib64/crti.o - *fill* 0x00000000006df3c4 0x4 - .data 0x00000000006df3c8 0x8 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - 0x00000000006df3c8 __dso_handle - .data 0x00000000006df3d0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o - *fill* 0x00000000006df3d0 0x10 - .data 0x00000000006df3e0 0x240 rdbfmsua.o - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o) - .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmbl.o) - .data 0x00000000006df620 0x0 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmst.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flbksp.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flpath.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssgsym.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlcuc.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stuclc.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbastn.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flglun.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libbridge.a(dcbsrh.o)
- .data 0x00000000006df620 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ireadns.o)
- *fill* 0x00000000006df620 0x20
- .data 0x00000000006df640 0xd8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapx.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(status.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upb.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wtstat.o)
- .data 0x00000000006df718 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(adn30.o)
- *fill* 0x00000000006df718 0x28
- .data 0x00000000006df740 0x7dc /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o)
- .data 0x00000000006dff1c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o)
- .data 0x00000000006dff1c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort_exit.o)
- .data 0x00000000006dff1c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort.o)
- .data 0x00000000006dff1c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o)
- .data 0x00000000006dff1c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cpbfdx.o)
- .data 0x00000000006dff1c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(drstpl.o)
- *fill* 0x00000000006dff1c 0x24
- .data 0x00000000006dff40 0x128 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxmini.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ibfms.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ichkstr.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ifxy.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invcon.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invwin.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ipkm.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(irev.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupm.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lmsg.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrpc.o)
- .data 0x00000000006e0068 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrps.o)
- .data 0x00000000006e0068 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(newwin.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nmwrd.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nxtwin.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ovrbs1.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(padmsg.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkb.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkc.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pktdd.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs01.o)
- .data 0x00000000006e0070 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs1.o)
- *fill* 0x00000000006e0070 0x10
- .data 0x00000000006e0080 0x68 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o)
- .data 0x00000000006e00e8 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o)
- .data 0x00000000006e00f0 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o)
- .data 0x00000000006e00f8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o)
- .data 0x00000000006e00f8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o)
- .data 0x00000000006e00f8 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strsuc.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upbb.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upc.o)
- .data 0x00000000006e0100 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(usrtpl.o)
- .data 0x00000000006e0100 0x3a /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(capit.o)
- *fill* 0x00000000006e013a 0x2
- .data 0x00000000006e013c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrna.o)
- .data 0x00000000006e013c 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrn.o)
- .data 0x00000000006e013c 0xa /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o)
- *fill* 0x00000000006e0146 0x2
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cnved4.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(digit.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getlens.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(gets1loc.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(i4dy.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(idn30.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(igetdate.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(istdesc.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupb.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupbs01.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstchr.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstnum.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstjpb.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o)
- .data 0x00000000006e0148 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(mvb.o)
- *fill* 0x00000000006e0148 0x18
- .data 0x00000000006e0160 0x46 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemock.o)
- *fill* 0x00000000006e01a6 0x2
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtab.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbax.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenuaa.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenubd.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numbck.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtab.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbt.o)
- .data 0x00000000006e01a8 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parstr.o)
- .data 0x00000000006e01a8 0xc /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o)
- *fill* 0x00000000006e01b4 0xc
- .data 0x00000000006e01c0 0x60 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o)
- .data 0x00000000006e0220 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o)
- .data 0x00000000006e0220 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgb.o)
- .data 0x00000000006e0220 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o)
- .data 0x00000000006e0220 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rsvfvm.o)
- .data 0x00000000006e0220 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strcln.o)
- .data 0x00000000006e0220 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(uptdd.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdesc.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cadn30.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chekstab.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(inctab.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbd.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtbd.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabent.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o)
- .data 0x00000000006e0224 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rjust.o)
- *fill* 0x00000000006e0224 0x4
- .data 0x00000000006e0228 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o)
- 0x00000000006e0228 for__segv_default_msg
- 0x00000000006e0230 for__l_current_arg
- .data 0x00000000006e0238 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o)
- .data 0x00000000006e0238 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o)
- .data 0x00000000006e0238 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o)
- *fill* 0x00000000006e0238 0x8
- .data 0x00000000006e0240 0x140 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o)
- 0x00000000006e0240 for__static_threadstor_private
- .data 0x00000000006e0380 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o)
- .data 0x00000000006e0380 0x80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o)
- .data 0x00000000006e0400 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_vm.o)
- .data 0x00000000006e0400 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o)
- .data 0x00000000006e0400 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o)
- .data 0x00000000006e0400 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o)
- .data 0x00000000006e0400 0x4 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio_wrap.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_int.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_f.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_d.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_g.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cray.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_short.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_long.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_single.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_default_io_sizes_env_init.o)
- .data 0x00000000006e0404 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o)
- *fill* 0x00000000006e0404 0x1c
- .data 0x00000000006e0420 0x1e80 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit_handler.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o)
- .data 0x00000000006e22a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_ldir_wfs.o)
- .data 0x00000000006e22a0 0xc /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt__globals.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_log_to_text.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_data.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_log.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_s.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_t.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_x.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_globals.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_53.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_64.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_128.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(fetestexcept.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_stub.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_ct.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_gen.o)
- .data 0x00000000006e22ac 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_gen.o)
- *fill* 0x00000000006e22ac 0x14
- .data 0x00000000006e22c0 0x3c0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o)
- 0x00000000006e2660 __libm_pmatherrf
- 0x00000000006e2668 __libm_pmatherr
- 0x00000000006e2670 __libm_pmatherrl
- 0x00000000006e267c _LIB_VERSIONIMF
- .data 0x00000000006e2680 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrf.o)
- .data 0x00000000006e2680 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrl.o)
- .data 0x00000000006e2680 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherr.o)
- .data 0x00000000006e2680 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ints2q.o)
- .data 0x00000000006e2680 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(qcomp.o)
- .data 0x00000000006e2680 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fp2q.o)
- .data 0x00000000006e2680 0x28 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(q2fp.o)
- .data 0x00000000006e26a8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o)
- .data 0x00000000006e26a8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o)
- .data 0x00000000006e26a8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(cpu_feature_disp.o)
- .data 0x00000000006e26a8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o)
- .data 0x00000000006e26a8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o)
- .data 0x00000000006e26a8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemset.o)
- *fill* 0x00000000006e26a8 0x18
- .data 0x00000000006e26c0 0x160 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o)
- .data 0x00000000006e2820 0x28 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_addsubq.o)
- .data 0x00000000006e2848 0x30 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_divq.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcpy.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncpy.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strlen.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strchr.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncmp.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcat.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncat.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memcpy_pp.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memset_pp.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o)
- .data 0x00000000006e2878 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o)
- .data 0x00000000006e2878 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o)
- .data 0x00000000006e2880 0x820 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_mem_ops.o)
- 0x00000000006e3080 __libirc_largest_cache_size
- 0x00000000006e3084 __libirc_largest_cache_size_half
- 0x00000000006e3088 __libirc_data_cache_size
- 0x00000000006e308c __libirc_data_cache_size_half
- .data 0x00000000006e30a0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(proc_init_utils.o)
- .data 0x00000000006e30a0 0x0 /usr/lib64/libc_nonshared.a(elf-init.oS)
- .data 0x00000000006e30a0 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o)
- .data 0x00000000006e30a0 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o
- .data 0x00000000006e30a0 0x0 /usr/lib/../lib64/crtn.o
-
-.tm_clone_table
- 0x00000000006e30a0 0x0
- .tm_clone_table
- 0x00000000006e30a0 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o
- .tm_clone_table
- 0x00000000006e30a0 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o
-
-.data1
- *(.data1)
- 0x00000000006e30a0 _edata = .
- 0x00000000006e30a0 PROVIDE (edata, .)
- 0x00000000006e30a0 . = .
- 0x00000000006e30a0 __bss_start = .
-
-.bss 0x00000000006e30c0 0x2ebcf68
- *(.dynbss)
- .dynbss 0x00000000006e30c0 0x18 /usr/lib/../lib64/crt1.o
- 0x00000000006e30c0 stdin@@GLIBC_2.2.5
- 0x00000000006e30c8 stderr@@GLIBC_2.2.5
- 0x00000000006e30d0 stdout@@GLIBC_2.2.5
- *(.bss .bss.* .gnu.linkonce.b.*)
- .bss 0x00000000006e30d8 0x0 /usr/lib/../lib64/crt1.o
- .bss 0x00000000006e30d8 0x0 /usr/lib/../lib64/crti.o
- .bss 0x00000000006e30d8 0x10 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o
- .bss 0x00000000006e30e8 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o
- *fill* 0x00000000006e30e8 0x18
- .bss 0x00000000006e3100 0x91360 rdbfmsua.o
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmbl.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmst.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbrstn.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flbksp.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flinqr.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flpath.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flsopn.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssenvr.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(ssgsym.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlcuc.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stuclc.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(tbastn.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flglun.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libbridge.a(dcbsrh.o)
- .bss 0x0000000000774460 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ireadns.o)
- .bss 0x0000000000774460 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o)
- *fill* 0x0000000000774464 0x1c
- .bss 0x0000000000774480 0xc350 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapn.o)
- *fill* 0x00000000007807d0 0x30
- .bss 0x0000000000780800 0xc350 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(posapx.o)
- .bss 0x000000000078cb50 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o)
- .bss 0x000000000078cb50 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readdx.o)
- .bss 0x000000000078cb50 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o)
- .bss 0x000000000078cb50 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o)
- .bss 0x000000000078cb50 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(status.o)
- .bss 0x000000000078cb50 0x8 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o)
- .bss 0x000000000078cb58 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbrw.o)
- .bss 0x000000000078cb58 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upb.o)
- .bss 0x000000000078cb58 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o)
- *fill* 0x000000000078cb5c 0x24
- .bss 0x000000000078cb80 0xc350 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wtstat.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(adn30.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort2.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort_exit.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bort.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(conwin.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cpbfdx.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(drstpl.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxinit.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(dxmini.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getwin.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ibfms.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ichkstr.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ifxy.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invcon.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(invwin.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ipkm.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(irev.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupm.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lmsg.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrpc.o)
- .bss 0x0000000000798ed0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstrps.o)
- *fill* 0x0000000000798ed0 0x30
- .bss 0x0000000000798f00 0xc350 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o)
- .bss 0x00000000007a5250 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(newwin.o)
- .bss 0x00000000007a5250 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nmwrd.o)
- .bss 0x00000000007a5250 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nxtwin.o)
- .bss 0x00000000007a5250 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ovrbs1.o)
- .bss 0x00000000007a5254 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(padmsg.o)
- .bss 0x00000000007a5254 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkb.o)
- .bss 0x00000000007a5254 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkbs1.o)
- .bss 0x00000000007a5254 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkc.o)
- .bss 0x00000000007a5254 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pktdd.o)
- .bss 0x00000000007a5254 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs01.o)
- .bss 0x00000000007a5258 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(pkvs1.o)
- *fill* 0x00000000007a525c 0x24
- .bss 0x00000000007a5280 0xc350 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdbfdx.o)
- .bss 0x00000000007b15d0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdcmps.o)
- *fill* 0x00000000007b15d0 0x30
- .bss 0x00000000007b1600 0x13880 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdusdx.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readmg.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(seqsdx.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(stndrd.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strnum.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strsuc.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(trybump.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upbb.o)
- .bss 0x00000000007c4e80 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(upc.o)
- .bss 0x00000000007c4e80 0x3a980 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(usrtpl.o)
- .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(capit.o)
- .bss 0x00000000007ff800 0x0
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrna.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chrtrn.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cktaba.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cnved4.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(digit.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(elemdx.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(getlens.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(gets1loc.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(i4dy.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(idn30.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(igetdate.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(istdesc.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupb.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(iupbs01.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstchr.o) - .bss 0x00000000007ff800 0x0 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(jstnum.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(lstjpb.o) - .bss 0x00000000007ff800 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(makestab.o) - .bss 0x00000000007ff800 0x30d40 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(mvb.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemock.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtab.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbax.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenuaa.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nenubd.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numbck.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtab.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbt.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parstr.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parusr.o) - .bss 0x0000000000830540 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o) - .bss 0x0000000000830540 0x0 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o) - .bss 0x0000000000830540 0x186a0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgb.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(restd.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rsvfvm.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(strcln.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(uptdd.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdesc.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(cadn30.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(chekstab.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(inctab.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbb.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(nemtbd.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(numtbd.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabent.o) - .bss 0x0000000000848be0 0x0 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(valx.o) - .bss 0x0000000000848be0 0x0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rjust.o) - .bss 0x0000000000848be0 0x48 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - 0x0000000000848bf0 for__l_excpt_info - 0x0000000000848bfc for__l_fpe_mask - 0x0000000000848c00 for__l_undcnt - 0x0000000000848c04 for__l_ovfcnt - 0x0000000000848c08 for__l_div0cnt - 0x0000000000848c0c for__l_invcnt - 0x0000000000848c10 for__l_inecnt - 0x0000000000848c14 for__l_fmtrecl - 0x0000000000848c18 for__l_ufmtrecl - 0x0000000000848c1c for__l_blocksize - 0x0000000000848c20 for__l_buffercount - .bss 0x0000000000848c28 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_io_util.o) - *fill* 0x0000000000848c28 0x18 - .bss 0x0000000000848c40 0x440 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open.o) - 0x0000000000849060 for__l_exit_hand_decl - .bss 0x0000000000849080 0x15e0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_preconnected_units_init.o) - .bss 0x000000000084a660 0x18 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_reentrancy.o) - 0x000000000084a670 for__reentrancy_mode - 0x000000000084a674 for__reentrancy_initialized - .bss 0x000000000084a678 0x4 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_secnds.o) - .bss 0x000000000084a67c 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_stop.o) - .bss 0x000000000084a684 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_vm.o) - .bss 0x000000000084a694 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wint_fmt.o) - .bss 0x000000000084a694 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_fmt.o) - .bss 0x000000000084a694 0x0 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq_lis.o) - *fill* 0x000000000084a694 0x4 - .bss 0x000000000084a698 0xd8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - 0x000000000084a740 for__aio_global_mutex - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_open_proc.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio_wrap.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_int.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_f.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_d.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_vax_g.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cray.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_short.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ibm_long.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_double.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_ieee_single.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_close_proc.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_default_io_sizes_env_init.o) - .bss 0x000000000084a770 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_desc_item.o) - *fill* 
0x000000000084a770 0x10 - .bss 0x000000000084a780 0x260 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - 0x000000000084a9a0 for__user_iomsg_buf - 0x000000000084a9a8 for__user_iomsg_len - .bss 0x000000000084a9e0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit.o) - .bss 0x000000000084a9e0 0x4 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_exit_handler.o) - 0x000000000084a9e0 for__l_exit_termination - .bss 0x000000000084a9e4 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_comp.o) - .bss 0x000000000084a9e4 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_fmt_val.o) - .bss 0x000000000084a9e4 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_get.o) - .bss 0x000000000084a9e4 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_intrp_fmt.o) - .bss 0x000000000084a9e4 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_ldir_wfs.o) - *fill* 0x000000000084a9e4 0x1c - .bss 0x000000000084aa00 0x2760 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_lub_mgt.o) - 0x000000000084aa20 for__lub_table - .bss 0x000000000084d160 0x20a0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_need_lf.o) - 0x000000000084d160 for__file_info_hash_table - .bss 0x000000000084f200 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_put.o) - .bss 0x000000000084f200 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_wseq.o) - .bss 0x000000000084f200 0x4 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(tbk_traceback.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt__globals.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_int_to_text.o) - .bss 0x000000000084f204 0x0 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_data_to_text.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_log_to_text.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_data.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_text_to_log.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_t.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_s.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvt_cvtas_x.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_s.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_t.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_s_to_a.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_t_to_a.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_s.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_t.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_a_to_x.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_x_to_a.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_nan_x.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_globals.o) - .bss 0x000000000084f204 0x0 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_53.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_64.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(cvtas_pow_ten_128.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(fetestexcept.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_stub.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_stub.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_ct.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_ct.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lroundf_gen.o) - .bss 0x000000000084f204 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(lround_gen.o) - *fill* 0x000000000084f204 0x4 - .bss 0x000000000084f208 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(libm_error.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrf.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherrl.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libimf.a(matherr.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ints2q.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(qcomp.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fp2q.o) - .bss 0x000000000084f210 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(q2fp.o) - .bss 0x000000000084f210 0x0 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_display.o) - *fill* 0x000000000084f210 0x10 - .bss 0x000000000084f220 0x180 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(tbk_backtrace.o) - 0x000000000084f2c0 tbk__jmp_env - .bss 0x000000000084f3a0 0x10 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(cpu_feature_disp.o) - 0x000000000084f3a0 __intel_cpu_feature_indicator - 0x000000000084f3a8 __intel_cpu_feature_indicator_x - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemcpy.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemmove.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fastmemset.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(new_proc_init.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_addsubq.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(ia32_divq.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcpy.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncpy.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strlen.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strchr.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncmp.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strcat.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(sse2_strncat.o) - .bss 0x000000000084f3b0 0x0 
/opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memcpy_pp.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_memset_pp.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memcpy.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memcpy.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_memmove.o) - .bss 0x000000000084f3b0 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(intel_ssse3_rep_memmove.o) - *fill* 0x000000000084f3b0 0x10 - .bss 0x000000000084f3c0 0x420 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(irc_msg_support.o) - .bss 0x000000000084f7e0 0x60 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(fast_mem_ops.o) - 0x000000000084f824 __libirc_mem_ops_method - 0x000000000084f828 __libirc_largest_cachelinesize - .bss 0x000000000084f840 0x0 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libirc.a(proc_init_utils.o) - .bss 0x000000000084f840 0x0 /usr/lib64/libc_nonshared.a(elf-init.oS) - .bss 0x000000000084f840 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2//libgcc.a(_powidf2.o) - .bss 0x000000000084f840 0x0 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtend.o - .bss 0x000000000084f840 0x0 /usr/lib/../lib64/crtn.o - *(COMMON) - COMMON 0x000000000084f840 0x1c4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - 0x000000000084f840 gmbdta_ - *fill* 0x000000000084fa04 0x3c - COMMON 0x000000000084fa40 0x484 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(openbf.o) - 0x000000000084fa40 stbfr_ - 0x000000000084fb40 nulbfr_ - 0x000000000084fbc0 msgfmt_ - 0x000000000084fc40 msgcwd_ - 0x000000000084fec0 quiet_ - *fill* 0x000000000084fec4 0x1c - COMMON 0x000000000084fee0 
0x2c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdmsgw.o) - 0x000000000084fee0 hrdwrd_ - *fill* 0x000000000084ff0c 0x34 - COMMON 0x000000000084ff40 0x13d628 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readns.o) - 0x000000000084ff40 tables_ - *fill* 0x000000000098d568 0x18 - COMMON 0x000000000098d580 0x192e80 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(readsb.o) - 0x000000000098d580 bitbuf_ - 0x0000000000b20380 unptyp_ - COMMON 0x0000000000b20400 0x753150 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(ufbint.o) - 0x0000000000b20400 usrint_ - 0x0000000001273480 usrstr_ - *fill* 0x0000000001273550 0x30 - COMMON 0x0000000001273580 0x804 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(wrdlen.o) - 0x0000000001273580 charac_ - *fill* 0x0000000001273d84 0x3c - COMMON 0x0000000001273dc0 0xbbe88c /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(writdx.o) - 0x0000000001273dc0 dxtab_ - 0x00000000012740c0 tababd_ - *fill* 0x0000000001e3264c 0x34 - COMMON 0x0000000001e32680 0x188d4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(bfrini.o) - 0x0000000001e32680 maxcmp_ - 0x0000000001e326a0 msgstd_ - 0x0000000001e326c0 reptab_ - 0x0000000001e32740 bufrmg_ - 0x0000000001e3eaa0 msgcmp_ - 0x0000000001e3eac0 acmode_ - 0x0000000001e3eb00 bufrsr_ - 0x0000000001e4af00 dateln_ - 0x0000000001e4af20 mrgcom_ - 0x0000000001e4af40 padesc_ - *fill* 0x0000000001e4af54 0x2c - COMMON 0x0000000001e4af80 0xfc /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(msgwrt.o) - 0x0000000001e4af80 s01cm_ - 0x0000000001e4b000 sect01_ - *fill* 0x0000000001e4b07c 0x4 - COMMON 0x0000000001e4b080 0x27100 
/gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rdtree.o) - 0x0000000001e4b080 usrbit_ - COMMON 0x0000000001e72180 0x4a3c0 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(string.o) - 0x0000000001e72180 stcach_ - 0x0000000001eba600 stords_ - COMMON 0x0000000001ebc540 0x4 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(parutg.o) - 0x0000000001ebc540 utgprm_ - *fill* 0x0000000001ebc544 0x3c - COMMON 0x0000000001ebc580 0x16e3600 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(rcstpl.o) - 0x0000000001ebc580 usrtmp_ - COMMON 0x000000000359fb80 0x10 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libncepBUFR.a(tabsub.o) - 0x000000000359fb80 tabccc_ - COMMON 0x000000000359fb90 0xc /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_init.o) - 0x000000000359fb90 for__a_argv - 0x000000000359fb98 for__l_argc - *fill* 0x000000000359fb9c 0x4 - COMMON 0x000000000359fba0 0x480 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_aio.o) - 0x000000000359fba0 thread_count_mutex - 0x000000000359fbc8 threads_in_flight_mutex - 0x000000000359fbf0 for__pthread_mutex_unlock_ptr - 0x000000000359fbf8 for__pthread_mutex_init_ptr - 0x000000000359fc00 for__pthread_mutex_lock_ptr - 0x000000000359fc20 for__aio_lub_table - COMMON 0x00000000035a0020 0x8 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/libifcore.a(for_diags_intel.o) - 0x00000000035a0020 message_catalog - 0x00000000035a0028 . = ALIGN ((. != 0x0)?0x8:0x1) - -.lbss - *(.dynlbss) - *(.lbss .lbss.* .gnu.linkonce.lb.*) - *(LARGE_COMMON) - 0x00000000035a0028 . = ALIGN (0x8) - -.lrodata - *(.lrodata .lrodata.* .gnu.linkonce.lr.*) - -.ldata 0x00000000039a0028 0x0 - *(.ldata .ldata.* .gnu.linkonce.l.*) - 0x00000000039a0028 . = ALIGN ((. != 0x0)?0x8:0x1) - 0x00000000039a0028 . 
= ALIGN (0x8) - 0x00000000039a0028 _end = . - 0x00000000039a0028 PROVIDE (end, .) - 0x00000000039a0028 . = DATA_SEGMENT_END (.) - -.stab - *(.stab) - -.stabstr - *(.stabstr) - -.stab.excl - *(.stab.excl) - -.stab.exclstr - *(.stab.exclstr) - -.stab.index - *(.stab.index) - -.stab.indexstr - *(.stab.indexstr) - -.comment 0x0000000000000000 0x73 - *(.comment) - .comment 0x0000000000000000 0x39 /usr/lib/../lib64/crt1.o - 0x3a (size before relaxing) - .comment 0x0000000000000000 0x3a /usr/lib/../lib64/crti.o - .comment 0x0000000000000039 0x26 /opt/gcc/4.9.2/snos/lib/gcc/x86_64-suse-linux/4.9.2/crtbegin.o - 0x27 (size before relaxing) - .comment 0x000000000000005f 0x14 /opt/intel/composer_xe_2015.3.187/compiler/lib/intel64/for_main.o - .comment 0x0000000000000000 0x14 rdbfmsua.o - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flclos.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(flflun.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltbop.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltdat.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(fltinq.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stldsp.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(stlstr.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmbl.o) - .comment 0x0000000000000000 0x27 /gpfs/hps/nco/ops/nwprod/gempak.v6.32.0/nawips/os/linux3.0.101_x86_64/lib/libgemlib.a(strmst.o) - .comment 0x0000000000000000 0x27 
*(.debug_typenames)
-
-.debug_varnames
- *(.debug_varnames)
-
-.debug_pubtypes
- *(.debug_pubtypes)
-
-.debug_ranges 0x0000000000000000 0x50
- *(.debug_ranges)
- .debug_ranges 0x0000000000000000 0x50 /usr/lib64/libc_nonshared.a(elf-init.oS)
-
-.debug_macro
- *(.debug_macro)
-
-.gnu.attributes
- *(.gnu.attributes)
-
-/DISCARD/
- *(.note.GNU-stack)
- *(.gnu_debuglink)
- *(.gnu.lto_*)
-OUTPUT(rdbfmsua elf64-x86-64)
diff --git a/util/sorc/rdbfmsua.fd/README b/util/sorc/rdbfmsua.fd/README
deleted file mode 100755
index 4128761bcff..00000000000
--- a/util/sorc/rdbfmsua.fd/README
+++ /dev/null
@@ -1,2 +0,0 @@
-added libgem.a and changed libbufr_4_32 to 64-bit.
-also changed -m32 -m64.
diff --git a/util/sorc/rdbfmsua.fd/README.new b/util/sorc/rdbfmsua.fd/README.new
deleted file mode 100755
index f72a61f38a9..00000000000
--- a/util/sorc/rdbfmsua.fd/README.new
+++ /dev/null
@@ -1,10 +0,0 @@
-added libgem.a and changed libbufr_4_32 to 64-bit.
-also changed -m32 -m64.
-
-# JY - 02/09/2016
-Run the following command before run the "make"
- . /nwprod/gempak/.gempak
-
-# Boi - 09/10/2016
-on CRAY
-module load gempak/6.32.0
diff --git a/util/sorc/rdbfmsua.fd/compile_rdbfmsua_wcoss.sh b/util/sorc/rdbfmsua.fd/compile_rdbfmsua_wcoss.sh
deleted file mode 100755
index 2ffcdc6190d..00000000000
--- a/util/sorc/rdbfmsua.fd/compile_rdbfmsua_wcoss.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/sh
-
-######################################################################
-#
-# Build executable : GFS utilities
-#
-######################################################################
-
-LMOD_EXACT_MATCH=no
-source ../../../sorc/machine-setup.sh > /dev/null 2>&1
-cwd=$(pwd)
-
-if [ "$target" = "hera" ]; then
- echo " "
- echo " You are on $target "
- echo " "
-else
- echo " "
- echo " Your machine $target is not supported."
- echo " The script $0 can not continue. Aborting!"
- echo " "
- exit
-fi
-echo " "
-
-# Load required modules
-source ../../modulefiles/gfs_util.${target}
-module list
-
-set -x
-
-mkdir -p ../../exec
-make -f makefile.$target
-make -f makefile.$target clean
-mv rdbfmsua ../../exec
diff --git a/util/sorc/rdbfmsua.fd/makefile b/util/sorc/rdbfmsua.fd/makefile
deleted file mode 100755
index 69d183f394c..00000000000
--- a/util/sorc/rdbfmsua.fd/makefile
+++ /dev/null
@@ -1,84 +0,0 @@
-SHELL=/bin/sh
-#
-# This makefile was produced by /usr/bin/fmgen at 11:21:07 AM on 10/28/94
-# If it is invoked by the command line
-# make -f makefile
-# it will compile the fortran modules indicated by SRCS into the object
-# modules indicated by OBJS and produce an executable named a.out.
-#
-# If it is invoked by the command line
-# make -f makefile a.out.prof
-# it will compile the fortran modules indicated by SRCS into the object
-# modules indicated by OBJS and produce an executable which profiles
-# named a.out.prof.
-#
-# To remove all the objects but leave the executables use the command line
-# make -f makefile clean
-#
-# To remove everything but the source files use the command line
-# make -f makefile clobber
-#
-# To remove the source files created by /usr/bin/fmgen and this makefile
-# use the command line
-# make -f makefile void
-#
-# The parameters SRCS and OBJS should not need to be changed. If, however,
-# you need to add a new module add the name of the source module to the
-# SRCS parameter and add the name of the resulting object file to the OBJS
-# parameter. The new modules are not limited to fortran, but may be C, YACC,
-# LEX, or CAL. An explicit rule will need to be added for PASCAL modules.
-#
-OBJS= rdbfmsua.o
-
-
-# Tunable parameters
-#
-# FC Name of the fortran compiling system to use
-# LDFLAGS Flags to the loader
-# LIBS List of libraries
-# CMD Name of the executable
-#
-FC = ifort
-# FFLAGS = -O3 -q32 -I${GEMINC} -I${NAWIPS}/os/${NA_OS}/include
-# FFLAGS = -I${GEMINC} -I${NAWIPS}/os/${NA_OS}/include
-FFLAGS = -I${GEMINC} -I${OS_INC}
-# LDFLAGS = -O3 -q32 -s
-# LDFLAGS = -Wl,-Map,MAPFILE
-
-# BRIDGE=/gpfs/dell1/nco/ops/nwpara/gempak.v7.3.1/nawips/os/linux3.10.0_x86_64/lib/libbridge.a
-BRIDGE=${GEMOLB}/libbridge.a
-
-LIBS = ${DECOD_UT_LIB} ${BUFR_LIB4} \
- -L${GEMOLB} -lgemlib -lappl -lsyslib -lcgemlib -lgfortran ${BRIDGE}
-
-# -L${GEMOLB} -lgemlib -lappl -lsyslib -lcgemlib -lgfortran ${BRIDGE}
-# -L/nwprod/gempak/nawips1/os/linux2.6.32_x86_64/lib -lgemlib -lappl -lsyslib -lcgemlib -lbridge -lncepBUFR \
-# -lgfortran
-
-CMD = rdbfmsua
-
-# To perform the default compilation, use the first line
-# To compile with flowtracing turned on, use the second line
-# To compile giving profile additonal information, use the third line
-# CFLAGS= -O3 -q32
-
-# Lines from here on down should not need to be changed. They are the
-# actual rules which make uses to build a.out.
-#
-
-$(CMD): $(OBJS)
- $(FC) $(LDFLAGS) -o $(@) $(OBJS) $(LIBS)
-
-
-# The following rule reads the required NAWIPS definitions and then recursively
-# runs this same makefile with a new target in the spawned shell.
-#
-
-clean:
- -rm -f ${OBJS}
-
-clobber: clean
- -rm -f ${CMD}
-
-void: clobber
- -rm -f ${SRCS} makefile
diff --git a/util/sorc/rdbfmsua.fd/makefile.hera b/util/sorc/rdbfmsua.fd/makefile.hera
deleted file mode 100755
index a1359e6cb8c..00000000000
--- a/util/sorc/rdbfmsua.fd/makefile.hera
+++ /dev/null
@@ -1,88 +0,0 @@
-SHELL=/bin/sh
-#
-# This makefile was produced by /usr/bin/fmgen at 11:21:07 AM on 10/28/94
-# If it is invoked by the command line
-# make -f makefile
-# it will compile the fortran modules indicated by SRCS into the object
-# modules indicated by OBJS and produce an executable named a.out.
-#
-# If it is invoked by the command line
-# make -f makefile a.out.prof
-# it will compile the fortran modules indicated by SRCS into the object
-# modules indicated by OBJS and produce an executable which profiles
-# named a.out.prof.
-#
-# To remove all the objects but leave the executables use the command line
-# make -f makefile clean
-#
-# To remove everything but the source files use the command line
-# make -f makefile clobber
-#
-# To remove the source files created by /usr/bin/fmgen and this makefile
-# use the command line
-# make -f makefile void
-#
-# The parameters SRCS and OBJS should not need to be changed. If, however,
-# you need to add a new module add the name of the source module to the
-# SRCS parameter and add the name of the resulting object file to the OBJS
-# parameter. The new modules are not limited to fortran, but may be C, YACC,
-# LEX, or CAL. An explicit rule will need to be added for PASCAL modules.
-#
-OBJS= rdbfmsua.o
-
-
-# Tunable parameters
-#
-# FC Name of the fortran compiling system to use
-# LDFLAGS Flags to the loader
-# LIBS List of libraries
-# CMD Name of the executable
-#
-FC = ifort
-# FFLAGS = -O3 -q32 -I${GEMINC} -I${NAWIPS}/os/${NA_OS}/include
-# FFLAGS = -I${GEMINC} -I${NAWIPS}/os/${NA_OS}/include
-FFLAGS = -I${GEMINC} -I${OS_INC}
-# LDFLAGS = -O3 -q32 -s
-# LDFLAGS = -Wl,-Map,MAPFILE
-
-# BRIDGE=/gpfs/dell1/nco/ops/nwpara/gempak.v7.3.1/nawips/os/linux3.10.0_x86_64/lib/libbridge.a
-BRIDGE=${GEMOLB}/bridge.a
-GFORTRAN=/apps/gcc/6.2.0/lib64
-
-LIBS = ${DECOD_UT_LIB} ${BUFR_LIB4} \
- ${GEMLIB}/gemlib.a ${GEMLIB}/appl.a ${GEMLIB}/syslib.a ${GEMLIB}/cgemlib.a -L${GFORTRAN} -lgfortran ${BRIDGE}
-
-# LIBS = ${DECOD_UT_LIB} ${BUFR_LIB4} \
-# -L${GEMOLB} -lgemlib -lappl -lsyslib -lcgemlib -lgfortran ${BRIDGE}
-
-# -L${GEMOLB} -lgemlib -lappl -lsyslib -lcgemlib -lgfortran ${BRIDGE}
-# -L/nwprod/gempak/nawips1/os/linux2.6.32_x86_64/lib -lgemlib -lappl -lsyslib -lcgemlib -lbridge -lncepBUFR \
-# -lgfortran
-
-CMD = rdbfmsua
-
-# To perform the default compilation, use the first line
-# To compile with flowtracing turned on, use the second line
-# To compile giving profile additonal information, use the third line
-# CFLAGS= -O3 -q32
-
-# Lines from here on down should not need to be changed. They are the
-# actual rules which make uses to build a.out.
-#
-
-$(CMD): $(OBJS)
- $(FC) $(LDFLAGS) -o $(@) $(OBJS) $(LIBS)
-
-
-# The following rule reads the required NAWIPS definitions and then recursively
-# runs this same makefile with a new target in the spawned shell.
-#
-
-clean:
- -rm -f ${OBJS}
-
-clobber: clean
- -rm -f ${CMD}
-
-void: clobber
- -rm -f ${SRCS} makefile
diff --git a/util/sorc/rdbfmsua.fd/rdbfmsua.f b/util/sorc/rdbfmsua.fd/rdbfmsua.f
deleted file mode 100755
index c2d50889202..00000000000
--- a/util/sorc/rdbfmsua.fd/rdbfmsua.f
+++ /dev/null
@@ -1,398 +0,0 @@
- PROGRAM RDBFUA
-C$$$ MAIN PROGRAM DOCUMENTATION BLOCK
-C
-C MAIN PROGRAM: RDBFUA
-C PRGMMR: J. ATOR ORG: NP12 DATE: 2007-08-13
-C
-C ABSTRACT: Upper Air Plotted Data for levels 1000MB; 925MB; 850MB; 700MB;
-C 500MB; 400MB; 300MB; 250MB; 200MB; 150MB, and 100MB for the
-C following regions: 1)United States; 2)Canada; 3)Alaska; and,
-C the 4)Mexico and Caribbean. Note that Alaska includes eastern
-C Russia. Also adding South America, Africa, and the Pacific.
-C
-C PROGRAM HISTORY LOG:
-C
-C 2007-08-13 J. ATOR -- ORIGINAL AUTHOR
-C 2007-08-20 C. Magee -- Added block 25 (eastern Russia)
-C 2007-09-20 S. Lilly -- Changing to read blks 60 thru 91.
-C 2007-09-20 C. Magee -- Added code to read upper air and metar stn tables
-C 2007-09-25 S. Lilly -- Added logic to write statements in order to put STID,
-C STNM and TIME on the same line.
-C 2007-09-27 C. Magee -- Change output for stntbl.out. Use st_rmbl to remove
-C leading blank from reportid if internal write was
-C used to convert integer WMO block/stn number to
-C char report id.
-C 2012-01-24 J. Cahoon -- Modified from original RDBFUA to include
-C significant and standard together in output
-C 2012-02-15 B. Mabe -- Changed Program name and output file to reflect
-C change to output for sig and man data
-C 2016-10-18 B. Vuong -- Removed hardwire '/nwprod/dictionaries/' in CALL FL_TBOP
-C 2020-01-15 B. Vuong -- Increased dimensional array r8lvl(6,200)
-C
-C USAGE:
-C INPUT FILES:
-C UNIT 40 - adpupa dumpfile (contains data from BUFR tank b002/xx001)
-C
-C sonde.land.tbl
-C metar.tbl
-C
-C OUTPUT FILES:
-C UNIT 51 - rdbfmsua.out - contains ASCII upper air data for the desired
-C UNIT 52 - stnmstbl.out - contains ASCII station table info for use by -C html generator. -C -C SUBPROGRAMS CALLED: -C UNIQUE: -C LIBRARY: BUFRLIB - OPENBF UFBINT -C GEMLIB - FL_TBOP ST_RMBL TB_RSTN -C BRIDGE - DC_BSRH -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C -C REMARKS: -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C MACHINE : IBM-SP -C -C$$$ - INCLUDE 'GEMPRM.PRM' - INCLUDE 'BRIDGE.PRM' -C*---------------------------------------------------------------------- -C* Set the name of the output file. -C*---------------------------------------------------------------------- - - CHARACTER*(*) FLO, STNO - - PARAMETER ( FLO = 'rdbfmsua.out' ) - PARAMETER ( STNO = 'sonde.idsms.tbl' ) - - REAL*8 BFMSNG - PARAMETER ( BFMSNG = 10.0E10 ) - - PARAMETER ( GPMSNG = -9999.0 ) - PARAMETER ( MAXSTN = 10000 ) - - REAL*8 r8hdr ( 9, 1 ), r8lvl ( 6, 200 ), r8arr( 1, 1 ) - REAL*8 r8tmp ( 6, 100 ), r8out ( 6, 300 ),swpbuf - REAL*8 r8tmptot ( 6, 300 ) - - CHARACTER*8 cmgtag, reportid - CHARACTER stnnam*32, tbchrs*20, state*2, tabcon*2 - CHARACTER ldcoun( LLSTFL )*2, mtcoun ( MAXSTN )*2 - CHARACTER ldstid ( LLSTFL )*8, mtstid ( MAXSTN )*8 - INTEGER ldstnm ( LLSTFL ), mtstnm ( MAXSTN ), ispri - INTEGER itabnum - REAL slat, slon, selv - LOGICAL nomatch, needHeader - -C*---------------------------------------------------------------------- -C* Open and read the sonde land station table. -C*---------------------------------------------------------------------- - CALL FL_TBOP ( 'sonde.land.tbl', - + 'stns', iunltb, iertop ) - IF ( iertop .ne. 0 ) THEN - print*,' error opening sonde land station table' - END IF - - ii = 1 - ierrst = 0 - DO WHILE ( ( ii .le. LLSTFL ) .and. ( ierrst .eq. 0 ) ) - CALL TB_RSTN ( iunltb, ldstid (ii), stnnam, ldstnm (ii), - + state, ldcoun (ii), slat, slon, - + selv, ispri, tbchrs, ierrst ) - ii = ii + 1 - END DO - IF ( ierrst .eq. 
-1 ) THEN - numua = ii - 1 - END IF -C*---------------------------------------------------------------------- -C* Close the sonde land station table file. -C*---------------------------------------------------------------------- - CALL FL_CLOS ( iunltb, iercls ) -C*---------------------------------------------------------------------- -C* Open and read the metar station table. -C*---------------------------------------------------------------------- - CALL FL_TBOP ( 'metar_stnm.tbl', - + 'stns', iunmtb, iertop ) - IF ( iertop .ne. 0 ) THEN - print*,' error opening metar station table' - END IF - - jj = 1 - ierrst = 0 - DO WHILE ( ( jj .le. MAXSTN ) .and. ( ierrst .eq. 0 ) ) - CALL TB_RSTN ( iunmtb, mtstid (jj), stnnam, mtstnm (jj), - + state, mtcoun(jj), slat, slon, - + selv, ispri, tbchrs, ierrst ) - jj = jj + 1 - END DO - IF ( ierrst .eq. -1 ) THEN - nummet = jj - 1 - END IF -C*---------------------------------------------------------------------- -C* Close the metar station table file. -C*---------------------------------------------------------------------- - CALL FL_CLOS ( iunmtb, iercls ) -C*---------------------------------------------------------------------- -C* Open and initialize the output files. -C*---------------------------------------------------------------------- - - OPEN ( UNIT = 51, FILE = FLO ) - WRITE ( 51, FMT = '(A)' ) 'PARM=PRES;HGHT;TMPK;DWPK;DRCT;SPED' - OPEN ( UNIT = 52, FILE = STNO) - -C*---------------------------------------------------------------------- -C* Open the BUFR file. -C*---------------------------------------------------------------------- - - CALL OPENBF ( 40, 'IN', 40 ) - -C*---------------------------------------------------------------------- -C* Read a BUFR subset from the BUFR file. -C*---------------------------------------------------------------------- - - DO WHILE ( IREADNS ( 40, cmgtag, imgdt ) .eq. 0 ) - - IF ( cmgtag .eq. 
'NC002001' ) THEN - -C*---------------------------------------------------------------------- -C* Unpack the header information from this subset. -C*---------------------------------------------------------------------- - - CALL UFBINT ( 40, r8hdr, 9, 1, nlev, - + 'WMOB WMOS CLAT CLON SELV YEAR MNTH DAYS HOUR' ) - - IF ( ( ( r8hdr(1,1) .ge. 60 ) .and. - + ( r8hdr(1,1) .le. 91 ) ) .or. - + ( r8hdr(1,1) .eq. 25 ) ) THEN - -C*---------------------------------------------------------------------- -C* Unpack the level information from this subset. -C* and replicate for VISG =2,4,and 32 -C*---------------------------------------------------------------------- - levelit = 0 - needHeader = .true. - nlevtot = 0 - DO WHILE ( levelit .le. 2 ) - IF ( levelit .eq. 0 ) THEN - CALL UFBINT ( 40, r8lvl, 6, 50, nlev, - + 'VSIG=2 PRLC GP10 TMDB TMDP WDIR WSPD' ) - ELSE IF ( levelit .eq. 1 ) THEN - CALL UFBINT ( 40, r8lvl, 6, 50, nlev, - + 'VSIG=4 PRLC GP10 TMDB TMDP WDIR WSPD' ) - ELSE IF ( levelit .eq. 2 ) THEN - CALL UFBINT ( 40, r8lvl, 6, 50, nlev, - + 'VSIG=32 PRLC GP10 TMDB TMDP WDIR WSPD' ) - END IF - IF ( nlev .gt. 0 ) THEN -C*---------------------------------------------------------------------- -C* Find the corresponding 3 or 4 character ID -C* in the sonde land station table. Store into -C* reportid only if non-blank. -C*---------------------------------------------------------------------- - iblkstn = NINT( r8hdr(1,1)*1000 + r8hdr(2,1) ) - nomatch = .true. - CALL DC_BSRH ( iblkstn, ldstnm, numua, - + ii, iersrh ) - IF ( iersrh .ge. 0 ) THEN - reportid = ldstid(ii) - tabcon = ldcoun(ii) - itabnum = ldstnm(ii) - IF ( ldstid (ii) .ne. ' ') THEN - nomatch = .false. - END IF - END IF -C*---------------------------------------------------------------------- -C* Either no match in sonde land table or tdstid -C* was found but ldstid was blank, so check metar -C* table for match and non-blank char id. 
-C*---------------------------------------------------------------------- - IF ( nomatch ) THEN - mblkstn = INT( iblkstn * 10 ) - CALL DC_BSRH ( mblkstn, mtstnm, nummet, - + jj, iersrh ) - IF ( iersrh .ge. 0 ) THEN - reportid = mtstid(jj) - tabcon = mtcoun(jj) - itabnum = mtstnm(jj) - nomatch = .false. - END IF - END IF -C*---------------------------------------------------------------------- -C* If no header, build it -C*---------------------------------------------------------------------- - IF ( needHeader ) THEN -C*---------------------------------------------------------------------- -C* Write the data to the output file. -C*---------------------------------------------------------------------- - IF ( reportid .ne. ' ' ) THEN -C*---------------------------------------------------------------------- -C* 3- or 4-char ID found. -C*---------------------------------------------------------------------- - WRITE ( 51, - + FMT = '(/,A,A5,3X,A,I2,I3.3,3x,A,3I2.2,A,2I2.2)' ) - + 'STID=', reportid(1:5), - + 'STNM=', INT(r8hdr(1,1)), INT(r8hdr(2,1)), - + 'TIME=', MOD(NINT(r8hdr(6,1)),100), - + NINT(r8hdr(7,1)), NINT(r8hdr(8,1)), - + '/', NINT(r8hdr(9,1)), 0 - WRITE ( 51, - + FMT = '(2(A,F7.2,1X),A,F7.1)' ) - + 'SLAT=', r8hdr(3,1), - + 'SLON=', r8hdr(4,1), - + 'SELV=', r8hdr(5,1) - ELSE -C*---------------------------------------------------------------------- -C* write WMO block/station instead -C*---------------------------------------------------------------------- - WRITE ( 51, - + FMT = '(/,A,I2,I3.3,3X,A,I2,I3.3,3x,A,3I2.2,A,2I2.2)' ) - + 'STID=', INT(r8hdr(1,1)), INT(r8hdr(2,1)), - + 'STNM=', INT(r8hdr(1,1)), INT(r8hdr(2,1)), - + 'TIME=', MOD(NINT(r8hdr(6,1)),100), - + NINT(r8hdr(7,1)), NINT(r8hdr(8,1)), - + '/', NINT(r8hdr(9,1)), 0 - WRITE ( 51, - + FMT = '(2(A,F7.2,1X),A,F7.1)' ) - + 'SLAT=', r8hdr(3,1), - + 'SLON=', r8hdr(4,1), - + 'SELV=', r8hdr(5,1) - END IF - - - WRITE ( 51, FMT = '(/,6(A8,1X))' ) - + 'PRES', 'HGHT', 'TMPK', 'DWPK', 'DRCT', 'SPED' - needHeader 
= .false. - END IF - DO jj = 1, nlev - -C*---------------------------------------------------------------------- -C* Convert pressure to millibars. -C*---------------------------------------------------------------------- - - IF ( r8lvl(1,jj) .lt. BFMSNG ) THEN - r8lvl(1,jj) = r8lvl (1,jj) / 100.0 - ELSE - r8lvl(1,jj) = GPMSNG - END IF - -C*---------------------------------------------------------------------- -C* Convert geopotential to height in meters. -C*---------------------------------------------------------------------- - - IF ( r8lvl(2,jj) .lt. BFMSNG ) THEN - r8lvl (2,jj) = r8lvl (2,jj) / 9.8 - ELSE - r8lvl (2,jj) = GPMSNG - END IF - - DO ii = 3, 6 - IF ( r8lvl(ii,jj) .ge. BFMSNG ) THEN - r8lvl (ii,jj) = GPMSNG - END IF - END DO - END DO -C*---------------------------------------------------------------------- -C* itterate through levels and add to total array -C* ignore -9999 and 0 pressure levels -C*---------------------------------------------------------------------- - IF ( nlevtot .eq. 0 ) THEN - nlevtot = 1 - END IF - DO jj = 1,nlev - IF ( r8lvl(1,jj) .gt. 99 ) THEN - DO ii = 1,6 - r8tmptot(ii,nlevtot) = r8lvl(ii,jj) - END DO - nlevtot = nlevtot + 1 - END IF - END DO - nlevtot = nlevtot - 1 - END IF - levelit = levelit + 1 - END DO -C*--------------------------------------------------------------------- -C* bubble sort so output starts at lowest level of the -C* atmosphere (usu. 1000mb), only if there are available -C* levels -C*--------------------------------------------------------------------- - IF (nlevtot .gt. 0) THEN - istop = nlevtot - 1 - iswflg = 1 - DO WHILE ( ( iswflg .ne. 0 ) .and. - + ( istop .ge. 1 ) ) - iswflg = 0 -C - DO j = 1, istop - IF ( r8tmptot(1,j) .lt. 
r8tmptot(1,j+1) ) THEN - iswflg = 1 - DO i = 1,6 - swpbuf = r8tmptot (i,j) - r8tmptot (i,j) = r8tmptot (i,j+1) - r8tmptot (i,j+1) = swpbuf - END DO - END IF - END DO - istop = istop-1 - END DO -C*--------------------------------------------------------------------- -C* check for exact or partial dupes and only write -C* one line for each level to output file. -C*--------------------------------------------------------------------- - DO jj = 1,nlevtot - DO ii = 1,6 - r8out(ii,jj) = r8tmptot(ii,jj) - END DO - END DO - - kk = 1 - DO jj = 1,nlevtot-1 - IF ( r8out(1,kk) .eq. r8tmptot(1,jj+1) ) THEN - r8out(1,kk) = r8tmptot(1,jj) - DO ii = 2,6 - IF ( r8out(ii,kk) .lt. r8tmptot(ii,jj+1)) - + THEN - r8out(ii,kk) = r8tmptot(ii,jj+1) - END IF - END DO - ELSE - kk = kk + 1 - r8out(1,kk) = r8tmptot(1,jj+1) - r8out(2,kk) = r8tmptot(2,jj+1) - r8out(3,kk) = r8tmptot(3,jj+1) - r8out(4,kk) = r8tmptot(4,jj+1) - r8out(5,kk) = r8tmptot(5,jj+1) - r8out(6,kk) = r8tmptot(6,jj+1) - END IF - END DO -C*---------------------------------------------------------------------- -C* write pres, hght, temp, dew point, wind dir, -C* and wind speed to output file. -C*---------------------------------------------------------------------- - DO jj = 1,kk - WRITE ( 51, FMT = '(6(F8.2,1X))' ) - + ( r8out (ii,jj), ii = 1,6 ) - END DO -C*---------------------------------------------------------------------- -C* Write info for the current station to new table. -C* Includes reportid, lat, lon, country, and blk/ -C* stn. -C*---------------------------------------------------------------------- - IF ( reportid .eq. 
' ') THEN - WRITE ( reportid(1:6),FMT='(I6)') itabnum - CALL ST_RMBL ( reportid,reportid,len,iret ) - END IF - WRITE ( 52, FMT = '(A6,F7.2,1X,F7.2, - + 1X,A2,1x,I6)' ) - + reportid(1:6),r8hdr(3,1),r8hdr(4,1), - + tabcon,itabnum - END IF - END IF - END IF - END DO - - STOP - END diff --git a/util/sorc/rdbfmsua.fd/rdbfmsua.f_org b/util/sorc/rdbfmsua.fd/rdbfmsua.f_org deleted file mode 100755 index 343c985fcb8..00000000000 --- a/util/sorc/rdbfmsua.fd/rdbfmsua.f_org +++ /dev/null @@ -1,397 +0,0 @@ - PROGRAM RDBFUA -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C -C MAIN PROGRAM: RDBFUA -C PRGMMR: J. ATOR ORG: NP12 DATE: 2007-08-13 -C -C ABSTRACT: Upper Air Plotted Data for levels 1000MB; 925MB; 850MB; 700MB; -C 500MB; 400MB; 300MB; 250MB; 200MB; 150MB, and 100MB for the -C following regions: 1)United States; 2)Canada; 3)Alaska; and, -C the 4)Mexico and Caribbean. Note that Alaska includes eastern -C Russia. Also adding South America, Africa, and the Pacific. -C -C PROGRAM HISTORY LOG: -C -C 2007-08-13 J. ATOR -- ORIGINAL AUTHOR -C 2007-08-20 C. Magee -- Added block 25 (eastern Russia) -C 2007-09-20 S. Lilly -- Changing to read blks 60 thru 91. -C 2007-09-20 C. Magee -- Added code to read upper air and metar stn tables -C 2007-09-25 S. Lilly -- Added logic to write statements in order to put STID, -C STNM and TIME on the same line. -C 2007-09-27 C. Magee -- Change output for stntbl.out. Use st_rmbl to remove -C leading blank from reportid if internal write was -C used to convert integer WMO block/stn number to -C char report id. -C 2012-01-24 J. Cahoon -- Modified from original RDBFUA to include -C significant and standard together in output -C 2012-02-15 B. Mabe -- Changed Program name and output file to reflect -C change to output for sig and man data -C 2016-10-18 B. 
Vuong -- Removed hardwire '/nwprod/dictionaries/' in CALL FL_TBOP -C -C USAGE: -C INPUT FILES: -C UNIT 40 - adpupa dumpfile (contains data from BUFR tank b002/xx001) -C -C sonde.land.tbl -C metar.tbl -C -C OUTPUT FILES: -C UNIT 51 - rdbfmsua.out - contains ASCII upper air data for the desired -C stations. -C UNIT 52 - stnmstbl.out - contains ASCII station table info for use by -C html generator. -C -C SUBPROGRAMS CALLED: -C UNIQUE: -C LIBRARY: BUFRLIB - OPENBF UFBINT -C GEMLIB - FL_TBOP ST_RMBL TB_RSTN -C BRIDGE - DC_BSRH -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C -C REMARKS: -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C MACHINE : IBM-SP -C -C$$$ - INCLUDE 'GEMPRM.PRM' - INCLUDE 'BRIDGE.PRM' -C*---------------------------------------------------------------------- -C* Set the name of the output file. -C*---------------------------------------------------------------------- - - CHARACTER*(*) FLO, STNO - - PARAMETER ( FLO = 'rdbfmsua.out' ) - PARAMETER ( STNO = 'sonde.idsms.tbl' ) - - REAL*8 BFMSNG - PARAMETER ( BFMSNG = 10.0E10 ) - - PARAMETER ( GPMSNG = -9999.0 ) - PARAMETER ( MAXSTN = 10000 ) - - REAL*8 r8hdr ( 9, 1 ), r8lvl ( 6, 100 ), r8arr( 1, 1 ) - REAL*8 r8tmp ( 6, 100 ), r8out ( 6, 300 ),swpbuf - REAL*8 r8tmptot ( 6, 300 ) - - CHARACTER*8 cmgtag, reportid - CHARACTER stnnam*32, tbchrs*20, state*2, tabcon*2 - CHARACTER ldcoun( LLSTFL )*2, mtcoun ( MAXSTN )*2 - CHARACTER ldstid ( LLSTFL )*8, mtstid ( MAXSTN )*8 - INTEGER ldstnm ( LLSTFL ), mtstnm ( MAXSTN ), ispri - INTEGER itabnum - REAL slat, slon, selv - LOGICAL nomatch, needHeader - -C*---------------------------------------------------------------------- -C* Open and read the sonde land station table. -C*---------------------------------------------------------------------- - CALL FL_TBOP ( 'sonde.land.tbl', - + 'stns', iunltb, iertop ) - IF ( iertop .ne. 0 ) THEN - print*,' error opening sonde land station table' - END IF - - ii = 1 - ierrst = 0 - DO WHILE ( ( ii .le. LLSTFL ) .and. ( ierrst .eq. 
0 ) ) - CALL TB_RSTN ( iunltb, ldstid (ii), stnnam, ldstnm (ii), - + state, ldcoun (ii), slat, slon, - + selv, ispri, tbchrs, ierrst ) - ii = ii + 1 - END DO - IF ( ierrst .eq. -1 ) THEN - numua = ii - 1 - END IF -C*---------------------------------------------------------------------- -C* Close the sonde land station table file. -C*---------------------------------------------------------------------- - CALL FL_CLOS ( iunltb, iercls ) -C*---------------------------------------------------------------------- -C* Open and read the metar station table. -C*---------------------------------------------------------------------- - CALL FL_TBOP ( 'metar_stnm.tbl', - + 'stns', iunmtb, iertop ) - IF ( iertop .ne. 0 ) THEN - print*,' error opening metar station table' - END IF - - jj = 1 - ierrst = 0 - DO WHILE ( ( jj .le. MAXSTN ) .and. ( ierrst .eq. 0 ) ) - CALL TB_RSTN ( iunmtb, mtstid (jj), stnnam, mtstnm (jj), - + state, mtcoun(jj), slat, slon, - + selv, ispri, tbchrs, ierrst ) - jj = jj + 1 - END DO - IF ( ierrst .eq. -1 ) THEN - nummet = jj - 1 - END IF -C*---------------------------------------------------------------------- -C* Close the metar station table file. -C*---------------------------------------------------------------------- - CALL FL_CLOS ( iunmtb, iercls ) -C*---------------------------------------------------------------------- -C* Open and initialize the output files. -C*---------------------------------------------------------------------- - - OPEN ( UNIT = 51, FILE = FLO ) - WRITE ( 51, FMT = '(A)' ) 'PARM=PRES;HGHT;TMPK;DWPK;DRCT;SPED' - OPEN ( UNIT = 52, FILE = STNO) - -C*---------------------------------------------------------------------- -C* Open the BUFR file. -C*---------------------------------------------------------------------- - - CALL OPENBF ( 40, 'IN', 40 ) - -C*---------------------------------------------------------------------- -C* Read a BUFR subset from the BUFR file. 
-C*---------------------------------------------------------------------- - - DO WHILE ( IREADNS ( 40, cmgtag, imgdt ) .eq. 0 ) - - IF ( cmgtag .eq. 'NC002001' ) THEN - -C*---------------------------------------------------------------------- -C* Unpack the header information from this subset. -C*---------------------------------------------------------------------- - - CALL UFBINT ( 40, r8hdr, 9, 1, nlev, - + 'WMOB WMOS CLAT CLON SELV YEAR MNTH DAYS HOUR' ) - - IF ( ( ( r8hdr(1,1) .ge. 60 ) .and. - + ( r8hdr(1,1) .le. 91 ) ) .or. - + ( r8hdr(1,1) .eq. 25 ) ) THEN - -C*---------------------------------------------------------------------- -C* Unpack the level information from this subset. -C* and replicate for VISG =2,4,and 32 -C*---------------------------------------------------------------------- - levelit = 0 - needHeader = .true. - nlevtot = 0 - DO WHILE ( levelit .le. 2 ) - IF ( levelit .eq. 0 ) THEN - CALL UFBINT ( 40, r8lvl, 6, 50, nlev, - + 'VSIG=2 PRLC GP10 TMDB TMDP WDIR WSPD' ) - ELSE IF ( levelit .eq. 1 ) THEN - CALL UFBINT ( 40, r8lvl, 6, 50, nlev, - + 'VSIG=4 PRLC GP10 TMDB TMDP WDIR WSPD' ) - ELSE IF ( levelit .eq. 2 ) THEN - CALL UFBINT ( 40, r8lvl, 6, 50, nlev, - + 'VSIG=32 PRLC GP10 TMDB TMDP WDIR WSPD' ) - END IF - IF ( nlev .gt. 0 ) THEN -C*---------------------------------------------------------------------- -C* Find the corresponding 3 or 4 character ID -C* in the sonde land station table. Store into -C* reportid only if non-blank. -C*---------------------------------------------------------------------- - iblkstn = NINT( r8hdr(1,1)*1000 + r8hdr(2,1) ) - nomatch = .true. - CALL DC_BSRH ( iblkstn, ldstnm, numua, - + ii, iersrh ) - IF ( iersrh .ge. 0 ) THEN - reportid = ldstid(ii) - tabcon = ldcoun(ii) - itabnum = ldstnm(ii) - IF ( ldstid (ii) .ne. ' ') THEN - nomatch = .false. 
- END IF - END IF -C*---------------------------------------------------------------------- -C* Either no match in sonde land table or ldstnm -C* was found but ldstid was blank, so check metar -C* table for match and non-blank char id. -C*---------------------------------------------------------------------- - IF ( nomatch ) THEN - mblkstn = INT( iblkstn * 10 ) - CALL DC_BSRH ( mblkstn, mtstnm, nummet, - + jj, iersrh ) - IF ( iersrh .ge. 0 ) THEN - reportid = mtstid(jj) - tabcon = mtcoun(jj) - itabnum = mtstnm(jj) - nomatch = .false. - END IF - END IF -C*---------------------------------------------------------------------- -C* If no header, build it -C*---------------------------------------------------------------------- - IF ( needHeader ) THEN -C*---------------------------------------------------------------------- -C* Write the data to the output file. -C*---------------------------------------------------------------------- - IF ( reportid .ne. ' ' ) THEN -C*---------------------------------------------------------------------- -C* 3- or 4-char ID found.
-C*---------------------------------------------------------------------- - WRITE ( 51, - + FMT = '(/,A,A5,3X,A,I2,I3.3,3x,A,3I2.2,A,2I2.2)' ) - + 'STID=', reportid(1:5), - + 'STNM=', INT(r8hdr(1,1)), INT(r8hdr(2,1)), - + 'TIME=', MOD(NINT(r8hdr(6,1)),100), - + NINT(r8hdr(7,1)), NINT(r8hdr(8,1)), - + '/', NINT(r8hdr(9,1)), 0 - WRITE ( 51, - + FMT = '(2(A,F7.2,1X),A,F7.1)' ) - + 'SLAT=', r8hdr(3,1), - + 'SLON=', r8hdr(4,1), - + 'SELV=', r8hdr(5,1) - ELSE -C*---------------------------------------------------------------------- -C* write WMO block/station instead -C*---------------------------------------------------------------------- - WRITE ( 51, - + FMT = '(/,A,I2,I3.3,3X,A,I2,I3.3,3x,A,3I2.2,A,2I2.2)' ) - + 'STID=', INT(r8hdr(1,1)), INT(r8hdr(2,1)), - + 'STNM=', INT(r8hdr(1,1)), INT(r8hdr(2,1)), - + 'TIME=', MOD(NINT(r8hdr(6,1)),100), - + NINT(r8hdr(7,1)), NINT(r8hdr(8,1)), - + '/', NINT(r8hdr(9,1)), 0 - WRITE ( 51, - + FMT = '(2(A,F7.2,1X),A,F7.1)' ) - + 'SLAT=', r8hdr(3,1), - + 'SLON=', r8hdr(4,1), - + 'SELV=', r8hdr(5,1) - END IF - - - WRITE ( 51, FMT = '(/,6(A8,1X))' ) - + 'PRES', 'HGHT', 'TMPK', 'DWPK', 'DRCT', 'SPED' - needHeader = .false. - END IF - DO jj = 1, nlev - -C*---------------------------------------------------------------------- -C* Convert pressure to millibars. -C*---------------------------------------------------------------------- - - IF ( r8lvl(1,jj) .lt. BFMSNG ) THEN - r8lvl(1,jj) = r8lvl (1,jj) / 100.0 - ELSE - r8lvl(1,jj) = GPMSNG - END IF - -C*---------------------------------------------------------------------- -C* Convert geopotential to height in meters. -C*---------------------------------------------------------------------- - - IF ( r8lvl(2,jj) .lt. BFMSNG ) THEN - r8lvl (2,jj) = r8lvl (2,jj) / 9.8 - ELSE - r8lvl (2,jj) = GPMSNG - END IF - - DO ii = 3, 6 - IF ( r8lvl(ii,jj) .ge. 
BFMSNG ) THEN - r8lvl (ii,jj) = GPMSNG - END IF - END DO - END DO -C*---------------------------------------------------------------------- -C* itterate through levels and add to total array -C* ignore -9999 and 0 pressure levels -C*---------------------------------------------------------------------- - IF ( nlevtot .eq. 0 ) THEN - nlevtot = 1 - END IF - DO jj = 1,nlev - IF ( r8lvl(1,jj) .gt. 99 ) THEN - DO ii = 1,6 - r8tmptot(ii,nlevtot) = r8lvl(ii,jj) - END DO - nlevtot = nlevtot + 1 - END IF - END DO - nlevtot = nlevtot - 1 - END IF - levelit = levelit + 1 - END DO -C*--------------------------------------------------------------------- -C* bubble sort so output starts at lowest level of the -C* atmosphere (usu. 1000mb), only if there are available -C* levels -C*--------------------------------------------------------------------- - IF (nlevtot .gt. 0) THEN - istop = nlevtot - 1 - iswflg = 1 - DO WHILE ( ( iswflg .ne. 0 ) .and. - + ( istop .ge. 1 ) ) - iswflg = 0 -C - DO j = 1, istop - IF ( r8tmptot(1,j) .lt. r8tmptot(1,j+1) ) THEN - iswflg = 1 - DO i = 1,6 - swpbuf = r8tmptot (i,j) - r8tmptot (i,j) = r8tmptot (i,j+1) - r8tmptot (i,j+1) = swpbuf - END DO - END IF - END DO - istop = istop-1 - END DO -C*--------------------------------------------------------------------- -C* check for exact or partial dupes and only write -C* one line for each level to output file. -C*--------------------------------------------------------------------- - DO jj = 1,nlevtot - DO ii = 1,6 - r8out(ii,jj) = r8tmptot(ii,jj) - END DO - END DO - - kk = 1 - DO jj = 1,nlevtot-1 - IF ( r8out(1,kk) .eq. r8tmptot(1,jj+1) ) THEN - r8out(1,kk) = r8tmptot(1,jj) - DO ii = 2,6 - IF ( r8out(ii,kk) .lt. 
r8tmptot(ii,jj+1)) - + THEN - r8out(ii,kk) = r8tmptot(ii,jj+1) - END IF - END DO - ELSE - kk = kk + 1 - r8out(1,kk) = r8tmptot(1,jj+1) - r8out(2,kk) = r8tmptot(2,jj+1) - r8out(3,kk) = r8tmptot(3,jj+1) - r8out(4,kk) = r8tmptot(4,jj+1) - r8out(5,kk) = r8tmptot(5,jj+1) - r8out(6,kk) = r8tmptot(6,jj+1) - END IF - END DO -C*---------------------------------------------------------------------- -C* write pres, hght, temp, dew point, wind dir, -C* and wind speed to output file. -C*---------------------------------------------------------------------- - DO jj = 1,kk - WRITE ( 51, FMT = '(6(F8.2,1X))' ) - + ( r8out (ii,jj), ii = 1,6 ) - END DO -C*---------------------------------------------------------------------- -C* Write info for the current station to new table. -C* Includes reportid, lat, lon, country, and blk/ -C* stn. -C*---------------------------------------------------------------------- - IF ( reportid .eq. ' ') THEN - WRITE ( reportid(1:6),FMT='(I6)') itabnum - CALL ST_RMBL ( reportid,reportid,len,iret ) - END IF - WRITE ( 52, FMT = '(A6,F7.2,1X,F7.2, - + 1X,A2,1x,I6)' ) - + reportid(1:6),r8hdr(3,1),r8hdr(4,1), - + tabcon,itabnum - END IF - END IF - END IF - END DO - - STOP - END diff --git a/util/sorc/webtitle.fd/README b/util/sorc/webtitle.fd/README deleted file mode 100755 index 4f26e568a6d..00000000000 --- a/util/sorc/webtitle.fd/README +++ /dev/null @@ -1,9 +0,0 @@ -FF 11/09/12 -no essl library -intel's mkl blas/others are supposed to be compatible: http://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms - -other concerns: - -makefile:39: warning: overriding commands for target `webtitle' -makefile:34: warning: ignoring old commands for target `webtitle' - diff --git a/util/sorc/webtitle.fd/compile_webtitle_wcoss.sh b/util/sorc/webtitle.fd/compile_webtitle_wcoss.sh deleted file mode 100755 index d0d8f79bdec..00000000000 --- a/util/sorc/webtitle.fd/compile_webtitle_wcoss.sh +++ /dev/null @@ -1,35 +0,0 @@ -#!/bin/sh - 
-###################################################################### -# -# Build executable : GFS utilities -# -###################################################################### - -LMOD_EXACT_MATCH=no -source ../../../sorc/machine-setup.sh > /dev/null 2>&1 -cwd=$(pwd) - -if [ "$target" = "hera" ]; then - echo " " - echo " You are on $target " - echo " " -else - echo " " - echo " Your machine $target is not supported." - echo " The script $0 can not continue. Aborting!" - echo " " - exit -fi -echo " " - -# Load required modules -source ../../modulefiles/gfs_util.${target} -module list - -set -x - -mkdir -p ../../exec -make -mv webtitle ../../exec -make clean diff --git a/util/sorc/webtitle.fd/makefile b/util/sorc/webtitle.fd/makefile deleted file mode 100755 index bcad6f8f9f3..00000000000 --- a/util/sorc/webtitle.fd/makefile +++ /dev/null @@ -1,37 +0,0 @@ -# Modified BSM for WCOSS build 1/30/2013 -SHELL=/bin/sh - -SRCS= webtitle.f -OBJS= webtitle.o -# Tunable parameters -# -# FC Name of the fortran compiling system to use -# LDFLAGS Flags to the loader -# LIBS List of libraries -# CMD Name of the executable -# PROFLIB Library needed for profiling -# -FC = ifort - -LIBS= ${W3NCO_LIB4} - -CMD = webtitle -FFLAGS = -#FFLAGS = -debug - -# Lines from here on down should not need to be changed. They are the -# actual rules which make uses to build a.out. -# -all: $(CMD) - -$(CMD): $(OBJS) - $(FC) $(FFLAGS) -o $(@) $(OBJS) $(LIBS) - -clean: - -rm -f $(OBJS) - -clobber: clean - -rm -f $(CMD) - -void: clobber - -rm -f $(SRCS) makefile diff --git a/util/sorc/webtitle.fd/webtitle.f b/util/sorc/webtitle.fd/webtitle.f deleted file mode 100755 index b4bfdfa0b00..00000000000 --- a/util/sorc/webtitle.fd/webtitle.f +++ /dev/null @@ -1,147 +0,0 @@ -C$$$ MAIN PROGRAM DOCUMENTATION BLOCK -C . . . . 
-C MAIN PROGRAM: WEBTITLE -C PRGMMR: SAGER ORG: NP12 DATE: 2003-10-02 -C -C ABSTRACT: READS A FILE CONTAINING THE CURRENT DATE AND THE FORECAST -C HOUR AND WRITES A FILE CONTAINING A TITLE CONTAINING A REFORMATED -C DATE. THIS FILE IS USED TO CREATE A NEW FORMATED TITLE FOR THE -C NCEP MODEL GRAPHICS WEBSITE -C -C PROGRAM HISTORY LOG: -C -C 03-10-02 L. SAGER ORIGINAL VERSION -C 01-30-13 B. MABE Updated for WCOSS system. Remove Equiv and -C char to integer implied casts -C USAGE: -C INPUT FILES: -C FT05 - CURRENT DATE AND FORECAST HOUR -C -C OUTPUT FILES: -C FT55 - UPDATED TITLE CONTAINING REFORMATTED -C DATE -C -C SUBPROGRAMS CALLED: -C UNIQUE: - -C LIBRARY: - W3AI15 W3FS15 W3DOXDAT -C COMMON - -C -C EXIT STATES: -C COND = 0 - SUCCESSFUL RUN -C -C REMARKS: -C -C ATTRIBUTES: -C LANGUAGE: FORTRAN 90 -C MACHINE: IBM -C -C$$$ -C - INTEGER idat(8) - CHARACTER*4 cout(10) - CHARACTER*3 days(7) - CHARACTER*14 block - CHARACTER*40 line1 - CHARACTER*40 line2 - CHARACTER*4 tb1(2) - CHARACTER*2 tb2(3) - BYTE bsmdate(4) - BYTE retdate(4) - - DATA idat /8*0/ - DATA days /'SUN','MON','TUE','WED','THU','FRI','SAT'/ - - DATA line1 /'09/01/2003 12UTC 24HR FCST VALID TUE 09'/ - - DATA line2 /'/02/2003 12UTC NCEP/NWS/NOAA'/ - - CALL W3TAGB('WEBTITLE',2001,0275,0076,'NP12') -C -C Start by reading in the date/time -C - READ(5,102) block - 102 FORMAT(a14) - READ(block,100) tb1(1), tb1(2), tb2(1), tb2(2), tb2(3) - 100 FORMAT(2a4,4a2) - - read(tb1(1),*) jtau - read(tb1(2),*) iyear - iwork = iyear - 2000 - bsmdate(1)=iwork - read(tb2(1),*) bsmdate(2) - read(tb2(2),*) bsmdate(3) - read(tb2(3),*) bsmdate(4) - -C USAGE: CALL W3FS15 (IDATE, JTAU, NDATE) -C INPUT ARGUMENT LIST: -C IDATE - PACKED BINARY DATE/TIME AS FOLLOWS: -C BYTE 1 IS YEAR OF CENTURY 00-99 -C BYTE 2 IS MONTH 01-12 -C BYTE 3 IS DAY OF MONTH 01-31 -C BYTE 4 IS HOUR 00-23 -C SUBROUTINE TAKES ADVANTAGE OF FORTRAN ADDRESS -C PASSING, IDATE AND NDATE MAY BE -C A CHARACTER*1 ARRAY OF FOUR, THE LEFT 32 -C BITS OF 64 BIT INTEGER WORD. 
AN OFFICE NOTE 85 -C LABEL CAN BE STORED IN -C 4 INTEGER WORDS. -C IF INTEGER THE 2ND WORD IS USED. OUTPUT -C IS STORED IN LEFT 32 BITS. FOR A OFFICE NOTE 84 -C LABEL THE 7TH WORD IS IN THE 4TH CRAY 64 BIT -C INTEGER, THE LEFT 32 BITS. -C JTAU - INTEGER NUMBER OF HOURS TO UPDATE (IF POSITIVE) -C OR BACKDATE (IF NEGATIVE) -C -C OUTPUT ARGUMENT LIST: -C NDATE - NEW DATE/TIME WORD RETURNED IN THE -C SAME FORMAT AS 'IDATE'. 'NDATE' AND 'IDATE' MAY -C BE THE SAME VARIABLE. - - CALL w3fs15(bsmdate,jtau,retdate) -C -C... w3doxdat returns the day of the week -C -C INPUT VARIABLES: -C IDAT INTEGER (8) NCEP ABSOLUTE DATE AND TIME -C (YEAR, MONTH, DAY, TIME ZONE, -C HOUR, MINUTE, SECOND, MILLISECOND) -C -C OUTPUT VARIABLES: -C JDOW INTEGER DAY OF WEEK (1-7, WHERE 1 IS SUNDAY) -C JDOY INTEGER DAY OF YEAR (1-366, WHERE 1 IS JANUARY 1) -C JDAY INTEGER JULIAN DAY (DAY NUMBER FROM JAN. 1,4713 B.C.) -C - idat(1) = iyear - idat(2) = retdate(2) - idat(3) = retdate(3) - idat(5) = retdate(4) - - CALL w3doxdat(idat,jdow,jdoy,jday) - -C -C Convert the valid date back to character -C - - CALL w3ai15(idat,cout,10,2,' ') - - - line1(1:2) = block(9:10) - line1(4:5) = block(11:12) - line1(9:10) = block(7:8) - line1(12:13) = block(13:14) - line1(18:20) = block(2:4) - line1(35:37) = days(jdow) - line1(39:40) = cout(2)(1:2) - - line2(2:3) = cout(3)(1:2) - line2(7:8) = cout(1)(1:2) - line2(10:11) = cout(5)(1:2) - - - - write(55,105) line1,line2 - 105 FORMAT(2a40) - - CALL W3TAGE('WEBTITLE') - STOP - END diff --git a/util/ush/finddate.sh b/util/ush/finddate.sh deleted file mode 100755 index 1805c2103ca..00000000000 --- a/util/ush/finddate.sh +++ /dev/null @@ -1,163 +0,0 @@ -# finddate.sh -# author: Luke Lin phone: 457-5047 24 June 1998 -# abstract: This script looks in ether forward or backward in time to -# generate either a variable containing sequential date/time stamps -# for a period up to a month or just the date/time stamp occurring -# at the end of such a period. 
-# Time stamp is in the form yyyymmdd. The script should be good for many -# years. Leap years are accounted for. Years go 1998, 1999, 2000, 2001, -# 2002, 2003, .... -# etc. -# -# usage: examples assume today's date is 19990929. -# To generate a sequence looking 10 days forward then execute: -# list=$(sh /nwprod/util/scripts/finddate.sh 19990929 s+10) -# To generate just the date/time 10 days from now then execute: -# list=$(sh /nwprod/util/scripts/finddate.sh 19990929 d+10) -# To generate a sequence looking 10 days backward then execute: -# list=$(sh /nwprod/util/scripts/finddate.sh 19990929 s-10) -# To generate just the date/time 10 days ago then execute: -# list=$(sh /nwprod/util/scripts/finddate.sh 19990929 d-10) -# list will contain 10 time stamps starting with 19990930. Time stamps -# are separated by blanks. -# -# This script will work for periods up to a month. The number indicating -# the period in question should be two digits. For single digits 1-9 -# use 01, 02, 03, etc. -set +x -unset pdstr -today=$1 -var=$2 -yy=$(echo $today | cut -c1-4 ) -mm=$(echo $today | cut -c5-6 ) -dd=$(echo $today | cut -c7-8 ) -nxtyy=$yy -pyy=$yy -what=$(echo $var | cut -c1-1) -up=$(echo $var | cut -c2-2) -num=$(echo $var | cut -c3-4) -mod=$(expr \( $yy / 4 \) \* 4 - $yy ) -leap=0 -if test "$mod" -eq 0 -then -leap=1 -fi -case $mm in -01) mday=31 - pday=31 - pmon=12 - pyy=$(expr $yy - 1) - if test $pyy -lt '0' - then - pyy='1999' - fi - nxtmon=02;; -02) mday=$(expr "$leap" + 28 ) - pday=31 - pmon=01 - nxtmon=03;; -03) mday=31 - pday=$(expr "$leap" + 28 ) - pmon=02 - nxtmon=04;; -04) mday=30 - pday=31 - pmon=03 - nxtmon=05;; -05) mday=31 - pday=30 - pmon=04 - nxtmon=06;; -06) mday=30 - pday=31 - pmon=05 - nxtmon=07;; -07) mday=31 - pday=30 - pmon=06 - nxtmon=08;; -08) mday=31 - pday=31 - pmon=07 - nxtmon=09;; -09) mday=30 - pday=31 - pmon=08 - nxtmon=10;; -10) mday=31 - pday=30 - pmon=09 - nxtmon=11;; -11) mday=30 - pday=31 - pmon=10 - nxtmon=12;; -12) mday=31 - pday=30 -
pmon=11 - nxtmon=01 - nxtyy=$(expr $yy + 1 ) - if test $yy -eq 1999 - then - nxtyy=2000 - fi ;; -*) echo mon=$mon is illegal - exit 99 ;; -esac - -if test $dd -gt $mday -then - echo "day=$dd is illegal. In month=$mon there are only $mday days." - exit 16 -fi - -i=1 -n=0 -while test $i -le $num -do - if test "$up" = '+' - then - ddn=$(expr $dd + $i) - mmn=$mm - yyn=$yy - if test $ddn -gt $mday - then - n=$(expr $n + 1) - ddn=$n - mmn=$nxtmon - yyn=$nxtyy - fi - if test $ddn -lt 10 - then - ddn="0$ddn" - fi - elif test "$up" = '-' - then - ddn=$(expr $dd - $i) - mmn=$mm - yyn=$yy - if test $ddn -le '0' - then - n=$(expr $pday + $ddn) - ddn=$n - mmn=$pmon - yyn=$pyy - fi - if test $ddn -lt 10 - then - ddn="0$ddn" - fi - else - echo '+ or - are allowed for 2nd variable in argument.' - echo "You tried $up, this is illegal." - exit 16 - fi - i=$(expr $i + 1 ) - if test "$what" = 's' - then - pdstr=$pdstr"$yyn$mmn$ddn " - else - pdstr=$yyn$mmn$ddn - fi -done -echo $pdstr diff --git a/util/ush/make_NTC_file.pl b/util/ush/make_NTC_file.pl deleted file mode 100755 index 478bd6a2882..00000000000 --- a/util/ush/make_NTC_file.pl +++ /dev/null @@ -1,119 +0,0 @@ -#!/usr/bin/perl -# -#------------------------------------------------------ -# -# This is make_NTC_file.pl -# It attaches the appropriate headers to the input file -# and copies it to a unique name for input to NTC. -# -# The following lines are prepended to the file: -# 1. A Bulletin Flag Field Seperator -# 2. A WMO header line -# 3. An optional subheader, e.g. DIFAX1064 -# -# Input wmoheader Originator datetime path -# where: -# wmoheader - WMO id to use in WMO header. -# subheader - "NONE" if none. 
-# Originator - Originator to use in WMO header -# datetime - date/time to use in WMO header, yyyymmddhh -# path - name of input file -# output_path - name of output file -# -# Author: Paula Freeman based on script by Larry Sager -# -#------------------------------------------------------ - -$NArgs = @ARGV; - -if ($NArgs < 6) { - usage (); - exit; -} - -# -# Get input -# - -$WMOHeader=shift; -$Origin=shift; -$YYYYMMDDHH=shift; -$SubHeader=shift; -$Filename=shift; -$OutputFilename=shift; - -print "Filename is $Filename\n"; -print "Output Filename is $OutputFilename\n"; -$YYYYMMDDHH =~ /\d{4}(\d{2})(\d{4})/; -$MMDDHH = $1 . $2; -$DDHHMM = $2 . "00"; -print "WMOHeader = $WMOHeader\n"; -print "SubHeader = $SubHeader\n"; -print "Origin = $Origin\n"; - - -if ( ($WMOHeader eq "") || ($Origin eq "") || ($YYYYMMDDHH eq "") || ($Filename eq "") || ($OutputFilename eq "") || ($SubHeader eq "") ) { - usage (); - exit; -} - -# -# Create the file for TOC -# - - make_toc (); -# -# - - -sub usage () { - print "Usage: $0 \n"; -} - -sub make_toc { - -# -# Attach WMO header and subheader (if not "NONE"). -# Get the bytecount of file to insert into the Bulletin Flag Field Separator. -# Add in length of WMO header, plus two carriage returns and line feed. -# If Subheader specified, count that in also, plus a line feed.
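The byte-count rules just described (payload size, plus the WMO header and its CR-CR-LF, plus an optional subheader that gets its own CR-CR-LF unless it is an IMAG product) can be sketched as follows. The function name and signature are illustrative, not part of the original script, and the payload size comes from `len()` rather than the script's `wc -c`:

```python
def bulletin_separator(payload: bytes, wmo_header: str, subheader: str = "NONE") -> str:
    # Total bytes = payload + WMO header + 3 (two carriage returns, one line feed).
    count = len(payload) + len(wmo_header) + 3
    if subheader != "NONE":
        # A subheader adds its own length; non-IMAG subheaders also get CR-CR-LF.
        count += len(subheader)
        if "IMAG" not in subheader:
            count += 3
    # Same zero-padded field as the Perl sprintf("****%10.10d****", $ByteCount).
    return "****%10.10d****" % count
```

The `%10.10d` conversion zero-pads the count to ten digits, matching the Bulletin Flag Field Separator format the TOC expects.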
-# - - $Header = "$WMOHeader $Origin $DDHHMM"; - $ByteCount = `wc -c $Filename | cut -c1-8`; - $ByteCount= $ByteCount + length($Header) + 3; - if ($SubHeader =~ /NONE/) { - print "No Subheader\n"; - } else { - if ($SubHeader =~ /IMAG/){ - $ByteCount = $ByteCount + length($SubHeader); - } else { - $ByteCount = $ByteCount + length($SubHeader) + 3; - } - } - $BulletinFlagFieldSep = sprintf( "****%10.10d****", $ByteCount); - - open(OUTFILE, ">$OutputFilename") or die "Cannot open $OutputFilename for output."; - print OUTFILE "$BulletinFlagFieldSep\n"; - print OUTFILE "$Header\r\r\n"; - if ($SubHeader =~ /NONE/) { - print "No Subheader\n"; - } else { - if ($SubHeader =~ /IMAG/){ - print OUTFILE "$SubHeader"; - } else { - print OUTFILE "$SubHeader\r\r\n"; - } - } - open (INFILE, $Filename) or die "Cannot open $Filename"; - - while ($rec=<INFILE>) { - print OUTFILE $rec; - } - - close INFILE; - close OUTFILE; - - print "$Filename -> $OutputFilename\n"; -} - diff --git a/util/ush/make_ntc_bull.pl b/util/ush/make_ntc_bull.pl deleted file mode 100755 index c6ca287eadc..00000000000 --- a/util/ush/make_ntc_bull.pl +++ /dev/null @@ -1,250 +0,0 @@ -#!/usr/bin/perl -# -#------------------------------------------------------ -# -# This is make_ntc_bull.pl -# It attaches the appropriate headers to the input file -# and copies it to a unique name for input to NTC. -# -# A Bulletin Flag Field Separator is prepended to the -# text bulletin. This TOC header contains the total -# number of bytes in the product not counting the -# bulletin flag field separator. -# -# Input: -# File identifier - Output name identifier. -# subheader - "NONE" if none.
-# Originator - Not used currently -# datetime - Not used currently -# filename - input file name -# output_path - name of output file -# -# Author: Larry Sager based on a script by Paula Freeman -# -# 31 Oct 05 -- new script -# -#------------------------------------------------------ - -if ($ENV{job}) { $job=$ENV{job}; } -if ($ENV{SENDCOM}) { $SENDCOM=$ENV{SENDCOM}; } -if ($ENV{SENDDBN}) { $SENDDBN=$ENV{SENDDBN}; } -$NArgs = @ARGV; - -if ($NArgs < 6) { - usage (); - exit; -} - -# -# Get input -# - -$NAME=shift; -$WMOname=shift; -$ORIGname=shift; -$DATEname=shift; -$Filename=shift; -$OutputFilename=shift; -print " Input : $Filename"; -print " Output: $OutputFilename"; - - -if ( ($Filename eq "") || ($OutputFilename eq "") ) { - usage (); - exit; -} - -# -# Create the file for TOC -# - if ( $NAME eq "plot" ) { - make_tocplot (); - } - elsif ($NAME eq "redb" ) { - make_tocredb (); - } - else { - make_tocbull (); - } -# -# - - -sub usage () { - print "Usage: $0 \n"; -} - -sub make_tocbull { - -# -# Attach WMO header -# Get the bytecount of file to insert into the Bulletin Flag Field Separator. -# - - $ix = 0; - $under = "_"; - open (INFILE, $Filename) or die "Cannot open $Filename"; - - while ($cho=<INFILE>) { - $rec = $rec .
$cho; - } - $cho = $rec; - $cho =~ s/\n//g; - $cho =~ s/<<@@/\r\r\n/g; - $cho =~ s/<<@/\r\r\n/g; - $cho =~ s/<//g; - $cho =~ s/\^//g; - $cho =~ s/\$//g; - $cho =~ s/\|/+/g; - $value = 40; - $Outp="$OutputFilename"; - open(OUTFILE, ">$Outp") or die "Cannot open $OutputFilename for output."; - while ($ix == 0) { - $cho = substr($cho,$value); - $value = 38; - $cho =~ s/'1/\&\&/; - $cho =~ s/'0/\&\&/; -# print "cho is $cho"; - ($cho2,$cho) = split(/\&\&/,$cho); - ($cho2,$cho3) = split(/\%/,$cho2); -# print "cho2 is $cho2"; - $ByteCount = length($cho2); - print " length is $ByteCount "; - $BulletinFlagFieldSep = sprintf( "****%10.10d****", $ByteCount); - if ($SENDCOM eq "YES") { - if ($ByteCount > 50 ) { - print OUTFILE "$BulletinFlagFieldSep\n"; - print OUTFILE $cho2; - } - else { - $ix = 1; - } - } - } - close OUTFILE; - if ($SENDDBN eq "YES" ) { -# Modified 20051205 by wx11rp to ensure the current production machine is used. -# $dbn_alert="/gpfs/w/nco/dbnet/bin/dbn_alert"; - $dbn_alert=$ENV{'DBNROOT'} . "/bin/dbn_alert"; - $type="GRIB_LOW"; - $job2=$job; - $subtype=$ORIGname; - $file_path=$Outp; - @command = ($dbn_alert, $type, $subtype, $job2, $file_path); - if (system (@command) != 0) { - print "Error alerting: @command \n"; - } - } - - close INFILE; - close OUTFILE; - - print "$Filename -> $OutputFilename\n"; -} - -sub make_tocplot { - -# -# Attach WMO header -# Get the bytecount of file to insert into the Bulletin Flag Field Separator. -# - - $ix = 0; - $under = "_"; - open (INFILE, $Filename) or die "Cannot open $Filename"; - - while ($cho=<INFILE>) { - $rec = $rec . $cho; - } - $cho = $rec; -# $Outp="$OutputFilename$under$job"; - $Outp="$OutputFilename"; - open(OUTFILE, ">$Outp") or die "Cannot open $OutputFilename for output."; - while ($ix == 0) { - $cho =~ s/\$\$/\&\&/; - ($cho2,$cho) = split(/\&\&/,$cho); -# $cho2 =~ s/@/ /g; -# $cho2 = $cho2 .
" "; - $ByteCount = length($cho2); - print " length is $ByteCount "; - $BulletinFlagFieldSep = sprintf( "****%10.10d****", $ByteCount); - if ($SENDCOM eq "YES") { - if ($ByteCount > 50 ) { - print OUTFILE "$BulletinFlagFieldSep\n"; - print OUTFILE $cho2; - } - else { - $ix = 1; - } - } - } - close OUTFILE; - if ($SENDDBN eq "YES" ) { -# 20051205 Modified by wx11rp to allow the script to run on any manchine labeled as the production machine -# $dbn_alert="/gpfs/w/nco/dbnet/bin/dbn_alert"; - $dbn_alert=$ENV{'DBNROOT'} . "/bin/dbn_alert"; - $type="GRIB_LOW"; - $subtype=$DATEname; - $job2=$job; - $file_path=$Outp; - @command = ($dbn_alert, $type, $subtype, $job2, $file_path); - if (system (@command) != 0) { - print "Error alerting: @command \n"; - } - } - - close INFILE; - close OUTFILE; - - print "$Filename -> $OutputFilename\n"; -} -sub make_tocredb { - -# -# Prepare the Redbook graphic for transmission to TOC by removing the AWIPS -# header and creating an NTC header. Get the Bytecount of the file to -# insert into the Bulletin Flag Field Seperator. -# - - $ix = 0; - $under = "_"; - open (INFILE, $Filename) or die "Cannot open $Filename"; - - while ($cho=) { - $rec = $rec . $cho; - } - $cho = $rec; - $Outp="$OutputFilename"; - open(OUTFILE, ">$Outp") or die "Cannot open $OutputFilename for output."; - $cho = substr($cho,24); - $ByteCount = length($cho); - print " length is $ByteCount "; - $BulletinFlagFieldSep = sprintf( "****%10.10d****", $ByteCount); - if ($SENDCOM eq "YES") { - if ($ByteCount > 50 ) { - print OUTFILE "$BulletinFlagFieldSep\n"; - print OUTFILE $cho; - - } - } - close OUTFILE; - if ($SENDDBN eq "YES" ) { -# 20051205 Modified by wx11rp to allow the script to run on any manchine labeled as the production machine -# $dbn_alert="/gpfs/w/nco/dbnet/bin/dbn_alert"; - $dbn_alert=$ENV{'DBNROOT'} . 
"/bin/dbn_alert"; - $type="GRIB_LOW"; - $subtype=$DATEname; - $job2=$job; - $file_path=$Outp; - @command = ($dbn_alert, $type, $subtype, $job2, $file_path); - if (system (@command) != 0) { - print "Error alerting: @command \n"; - } - } - - close INFILE; - close OUTFILE; - - print "$Filename -> $OutputFilename\n"; -} diff --git a/util/ush/make_tif.sh b/util/ush/make_tif.sh deleted file mode 100755 index 2609d1d7979..00000000000 --- a/util/ush/make_tif.sh +++ /dev/null @@ -1,45 +0,0 @@ -#!/bin/sh - -cd $DATA -# -# Use Image Magick to convert the GIF to TIF -# format -# -# module show imagemagick-intel-sandybridge/6.8.3 on CRAY -# export PATH=$PATH:/usrx/local/prod/imagemagick/6.8.3/intel/sandybridge/bin:. -# export LIBPATH="$LIBPATH":/usrx/local/prod/imagemagick/6.8.3/intel/sandybridge/lib -# export DELEGATE_PATH=/usrx/local/prod/imagemagick/6.8.3/intel/sandybridge/share/ImageMagick-6 - -# module show imagemagick/6.9.9-25 on DELL - export PATH=$PATH:/usrx/local/dev/packages/ImageMagick/6.9.9-25/bin:. 
- export LIBPATH="$LIBPATH":/usrx/local/dev/packages/ImageMagick/6.9.9-25/lib - export DELEGATE_PATH=/usrx/local/dev/packages/ImageMagick/6.9.9-25/share/ImageMagick-6 - - outname=out.tif - - convert gif:$input fax:$outname - -# -# Add the ntc heading: -# - -WMO=QTUA11 -ORIG=KWBC -PDYHH=${PDY}${cyc} - -if [ $HEADER = "YES" ] -then - INPATH=$DATA/$outname - SUB=DFAX1064 -# make_NTC_file.pl $WMO $ORIG $PDYHH $SUB $INPATH $OUTPATH - $UTILgfs/ush/make_NTC_file.pl $WMO $ORIG $PDYHH $SUB $INPATH $OUTPATH -# -# Send the graphic to TOC - - cp $OUTPATH ${COMOUTwmo}/gfs_500_hgt_tmp_nh_anl_${cyc}.tif - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert GRIB_LOW ${NET} ${job} ${COMOUTwmo}/gfs_500_hgt_tmp_nh_anl_${cyc}.tif - fi -fi - diff --git a/util/ush/month_name.sh b/util/ush/month_name.sh deleted file mode 100755 index a31a82e8a2b..00000000000 --- a/util/ush/month_name.sh +++ /dev/null @@ -1,112 +0,0 @@ -#!/bin/ksh - -#################################################################### -# -# SCRIPT: month_name.sh -# -# This script returns the name/abreviation of a month -# in a small text file, month_name.txt. It also echos the -# name/abreviation to stdout. The form of the returned -# name/abreviation is specified by the script arguments. 
-# -# USAGE: ./month_name.sh < month > < monthspec> -# -# EXAMPLE: ./month_name.sh 5 MON -# -# month spec contents of month_name.txt -# ----------- ------ ---------------------------- -# -# 6/06 Mon Jun -# 8/08 Month August -# 9/09 MON SEP -# 11 MONTH NOVEMBER -# -# -# Note: Variables may be assigned the value of the returned name -# by either of the following methods: -# -# MM=$(cat month_name.txt) after executing month_name.sh -# - OR - -# MM=$(month_name.sh 5 MON) (for example) -# -# -# -# HISTORY: 07/08/2005 - Original script -# -# -#################################################################### - - - typeset -Z2 month_num - - - month_num=$1 - month_spec=$2 - - case ${month_num} in - - 01) Mon=Jan - Month=January ;; - - 02) Mon=Feb - Month=February ;; - - 03) Mon=Mar - Month=March ;; - - 04) Mon=Apr - Month=April ;; - - 05) Mon=May - Month=May ;; - - 06) Mon=Jun - Month=June ;; - - 07) Mon=Jul - Month=July ;; - - 08) Mon=Aug - Month=August ;; - - 09) Mon=Sep - Month=September ;; - - 10) Mon=Oct - Month=October ;; - - 11) Mon=Nov - Month=November ;; - - 12) Mon=Dec - Month=December ;; - - esac - - - if [ ${month_spec} = Mon ]; then - - echo ${Mon} - echo ${Mon} > month_name.txt - - elif [ ${month_spec} = Month ]; then - - echo ${Month} - echo ${Month} > month_name.txt - - elif [ ${month_spec} = MON ]; then - - MON=$(echo ${Mon} | tr [a-z] [A-Z]) - echo ${MON} - echo ${MON} > month_name.txt - - elif [ ${month_spec} = MONTH ]; then - - MONTH=$(echo ${Month} | tr [a-z] [A-Z]) - echo ${MONTH} - echo ${MONTH} > month_name.txt - - fi - - - diff --git a/versions/fix.ver b/versions/fix.ver new file mode 100644 index 00000000000..0046ed5eb8e --- /dev/null +++ b/versions/fix.ver @@ -0,0 +1,23 @@ +#!/bin/bash +# Fix file subfolder versions + +export aer_ver=20220805 +export am_ver=20220805 +export chem_ver=20220805 +export cice_ver=20220805 +export cpl_ver=20220805 +export datm_ver=20220805 +export gdas_crtm_ver=20220805 +export gdas_fv3jedi_ver=20220805 +export 
gdas_gsibec_ver=20221031 +export gldas_ver=20220920 +export glwu_ver=20220805 +export gsi_ver=20221128 +export lut_ver=20220805 +export mom6_ver=20220805 +export orog_ver=20220805 +export reg2grb2_ver=20220805 +export sfc_climo_ver=20220805 +export ugwd_ver=20220805 +export verif_ver=20220805 +export wave_ver=20220805 diff --git a/workflow/applications.py b/workflow/applications.py index 1766c4071fe..c481cb76ff2 100644 --- a/workflow/applications.py +++ b/workflow/applications.py @@ -2,8 +2,8 @@ from typing import Dict, Any from datetime import timedelta -from configuration import Configuration from hosts import Host +from pygw.configuration import Configuration __all__ = ['AppConfig'] @@ -78,11 +78,11 @@ class AppConfig: VALID_MODES = ['cycled', 'forecast-only'] - def __init__(self, configuration: Configuration) -> None: + def __init__(self, conf: Configuration) -> None: self.scheduler = Host().scheduler - _base = configuration.parse_config('config.base') + _base = conf.parse_config('config.base') self.mode = _base['MODE'] @@ -103,11 +103,14 @@ def __init__(self, configuration: Configuration) -> None: self.do_bufrsnd = _base.get('DO_BUFRSND', False) self.do_gempak = _base.get('DO_GEMPAK', False) self.do_awips = _base.get('DO_AWIPS', False) - self.do_wafs = _base.get('DO_WAFS', False) + self.do_wafs = _base.get('WAFSF', False) self.do_vrfy = _base.get('DO_VRFY', True) + self.do_fit2obs = _base.get('DO_FIT2OBS', True) self.do_metp = _base.get('DO_METP', False) - self.do_jedivar = _base.get('DO_JEDIVAR', False) - self.do_jediens = _base.get('DO_JEDIENS', False) + self.do_jediatmvar = _base.get('DO_JEDIVAR', False) + self.do_jediatmens = _base.get('DO_JEDIENS', False) + self.do_jediocnvar = _base.get('DO_JEDIOCNVAR', False) + self.do_mergensst = _base.get('DO_MERGENSST', False) self.do_hpssarch = _base.get('HPSSARCH', False) @@ -133,7 +136,7 @@ def __init__(self, configuration: Configuration) -> None: self.configs_names = self._get_app_configs() # Source the 
config_files for the jobs in the application - self.configs = self._source_configs(configuration) + self.configs = self._source_configs(conf) # Update the base config dictionary based on application upd_base_map = {'cycled': self._cycled_upd_base, @@ -175,20 +178,25 @@ def _cycled_configs(self): configs = ['prep'] - if self.do_jedivar: - configs += ['atmanalprep', 'atmanalrun', 'atmanalpost'] + if self.do_jediatmvar: + configs += ['atmanlinit', 'atmanlrun', 'atmanlfinal'] else: configs += ['anal', 'analdiag'] - configs += ['sfcanl', 'analcalc', 'fcst', 'post', 'vrfy', 'arch'] + if self.do_jediocnvar: + configs += ['ocnanalprep', 'ocnanalbmat', 'ocnanalrun', 'ocnanalchkpt', 'ocnanalpost', 'ocnanalvrfy'] + if self.do_ocean: + configs += ['ocnpost'] + + configs += ['sfcanl', 'analcalc', 'fcst', 'post', 'vrfy', 'fit2obs', 'arch'] if self.do_gldas: configs += ['gldas'] if self.do_hybvar: - if self.do_jediens: - configs += ['atmensanalprep', 'atmensanalrun', 'atmensanalpost'] + if self.do_jediatmens: + configs += ['atmensanlinit', 'atmensanlrun', 'atmensanlfinal'] else: configs += ['eobs', 'eomg', 'ediag', 'eupd'] configs += ['ecen', 'esfc', 'efcs', 'echgres', 'epos', 'earc'] @@ -217,6 +225,9 @@ def _cycled_configs(self): if self.do_wafs: configs += ['wafs', 'wafsgrib2', 'wafsblending', 'wafsgcip', 'wafsgrib20p25', 'wafsblending0p25'] + if self.do_aero: + configs += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] + return configs @property @@ -282,7 +293,7 @@ def _forecast_only_upd_base(base_in): return base_out - def _source_configs(self, configuration: Configuration) -> Dict[str, Any]: + def _source_configs(self, conf: Configuration) -> Dict[str, Any]: """ Given the configuration object and jobs, source the configurations for each config and return a dictionary @@ -292,7 +303,7 @@ def _source_configs(self, configuration: Configuration) -> Dict[str, Any]: configs = dict() # Return config.base as well - configs['base'] = configuration.parse_config('config.base') + 
configs['base'] = conf.parse_config('config.base') # Source the list of all config_files involved in the application for config in self.configs_names: @@ -312,7 +323,7 @@ def _source_configs(self, configuration: Configuration) -> Dict[str, Any]: files += [f'config.{config}'] print(f'sourcing config.{config}') - configs[config] = configuration.parse_config(files) + configs[config] = conf.parse_config(files) return configs @@ -338,34 +349,45 @@ def _get_cycled_task_names(self): """ gdas_gfs_common_tasks_before_fcst = ['prep'] - gdas_gfs_common_tasks_after_fcst = ['post', 'vrfy'] + gdas_gfs_common_tasks_after_fcst = ['post'] + # if self.do_ocean: # TODO: uncomment when ocnpost is fixed in cycled mode + # gdas_gfs_common_tasks_after_fcst += ['ocnpost'] + gdas_gfs_common_tasks_after_fcst += ['vrfy'] + gdas_gfs_common_cleanup_tasks = ['arch'] - if self.do_jedivar: - gdas_gfs_common_tasks_before_fcst += ['atmanalprep', 'atmanalrun', 'atmanalpost'] + if self.do_jediatmvar: + gdas_gfs_common_tasks_before_fcst += ['atmanlinit', 'atmanlrun', 'atmanlfinal'] else: gdas_gfs_common_tasks_before_fcst += ['anal'] + if self.do_jediocnvar: + gdas_gfs_common_tasks_before_fcst += ['ocnanalprep', 'ocnanalbmat', 'ocnanalrun', + 'ocnanalchkpt', 'ocnanalpost', 'ocnanalvrfy'] + gdas_gfs_common_tasks_before_fcst += ['sfcanl', 'analcalc'] + if self.do_aero: + gdas_gfs_common_tasks_before_fcst += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] + gldas_tasks = ['gldas'] wave_prep_tasks = ['waveinit', 'waveprep'] wave_bndpnt_tasks = ['wavepostbndpnt', 'wavepostbndpntbll'] wave_post_tasks = ['wavepostsbs', 'wavepostpnt'] - hybrid_gdas_or_gfs_tasks = [] - hybrid_gdas_tasks = [] + hybrid_tasks = [] + hybrid_after_eupd_tasks = [] if self.do_hybvar: - if self.do_jediens: - hybrid_gdas_or_gfs_tasks += ['atmensanalprep', 'atmensanalrun', 'atmensanalpost', 'echgres'] + if self.do_jediatmens: + hybrid_tasks += ['atmensanlinit', 'atmensanlrun', 'atmensanlfinal', 'echgres'] else: - hybrid_gdas_or_gfs_tasks 
+= ['eobs', 'eupd', 'echgres'] - hybrid_gdas_or_gfs_tasks += ['ediag'] if self.lobsdiag_forenkf else ['eomg'] - hybrid_gdas_tasks += ['ecen', 'esfc', 'efcs', 'epos', 'earc'] + hybrid_tasks += ['eobs', 'eupd', 'echgres'] + hybrid_tasks += ['ediag'] if self.lobsdiag_forenkf else ['eomg'] + hybrid_after_eupd_tasks += ['ecen', 'esfc', 'efcs', 'epos', 'earc'] # Collect all "gdas" cycle tasks gdas_tasks = gdas_gfs_common_tasks_before_fcst.copy() - if not self.do_jedivar: + if not self.do_jediatmvar: gdas_tasks += ['analdiag'] if self.do_gldas: @@ -378,16 +400,14 @@ def _get_cycled_task_names(self): gdas_tasks += gdas_gfs_common_tasks_after_fcst - if self.do_hybvar: - if 'gdas' in self.eupd_cdumps: - gdas_tasks += hybrid_gdas_or_gfs_tasks - gdas_tasks += hybrid_gdas_tasks - if self.do_wave and 'gdas' in self.wave_cdumps: if self.do_wave_bnd: gdas_tasks += wave_bndpnt_tasks gdas_tasks += wave_post_tasks + if self.do_fit2obs: + gdas_tasks += ['fit2obs'] + gdas_tasks += gdas_gfs_common_cleanup_tasks # Collect "gfs" cycle tasks @@ -403,9 +423,6 @@ def _get_cycled_task_names(self): if self.do_metp: gfs_tasks += ['metp'] - if self.do_hybvar and 'gfs' in self.eupd_cdumps: - gfs_tasks += hybrid_gdas_or_gfs_tasks - if self.do_wave and 'gfs' in self.wave_cdumps: if self.do_wave_bnd: gfs_tasks += wave_bndpnt_tasks @@ -429,7 +446,21 @@ def _get_cycled_task_names(self): gfs_tasks += gdas_gfs_common_cleanup_tasks - tasks = {'gdas': gdas_tasks, 'gfs': gfs_tasks} + tasks = dict() + tasks['gdas'] = gdas_tasks + + if self.do_hybvar and 'gdas' in self.eupd_cdumps: + enkfgdas_tasks = hybrid_tasks + hybrid_after_eupd_tasks + tasks['enkfgdas'] = enkfgdas_tasks + + # Add CDUMP=gfs tasks if running early cycle + if self.gfs_cyc > 0: + tasks['gfs'] = gfs_tasks + + if self.do_hybvar and 'gfs' in self.eupd_cdumps: + enkfgfs_tasks = hybrid_tasks + hybrid_after_eupd_tasks + enkfgfs_tasks.remove("echgres") + tasks['enkfgfs'] = enkfgfs_tasks return tasks diff --git 
a/workflow/ecFlow/ecflow_definitions.py b/workflow/ecFlow/ecflow_definitions.py index 7193790732b..0aea65710c0 100644 --- a/workflow/ecFlow/ecflow_definitions.py +++ b/workflow/ecFlow/ecflow_definitions.py @@ -370,12 +370,12 @@ def add_repeat(self, repeat, parent=None): """ repeat_token = re.search( - "(\d{8,10})( | to )(\d{10})( | by )(\d{1,2}:)?(\d{1,2}:\d{1,2})", + r"(\d{8,10})( | to )(\d{10})( | by )(\d{1,2}:)?(\d{1,2}:\d{1,2})", repeat) start = repeat_token.group(1).strip() end = repeat_token.group(3).strip() byday = repeat_token.group(5).strip() if repeat_token.group(5) is not \ - None else repeat_token.group(5) + None else repeat_token.group(5) bytime = repeat_token.group(6).strip() startdate = datetime.strptime(start, "%Y%m%d%H") if len(start) == 10 \ @@ -633,8 +633,8 @@ def add_task(self, task, parents, scriptrepo, template=None, self.ecf_nodes[task_name].setup_script(scriptrepo, template) if self.build_tree: self.ecf_nodes[task_name].generate_ecflow_task(self.ecfhome, - self.get_suite_name(), - parents) + self.get_suite_name(), + parents) self.ecf_nodes[parents] += self.ecf_nodes[task_name] def add_task_edits(self, task, edit_dict, parent_node=None, index=None): @@ -1002,7 +1002,7 @@ def __init__(self, ecfitem, ecfparent=None): self.__populate_full_name_items() if (ecfparent and self.__max_value is None and (ecfparent.is_list or ecfparent.is_range) and - len(self.__items) == len(ecfparent.get_full_name_items())): + len(self.__items) == len(ecfparent.get_full_name_items())): self.use_parent_counter = True def get_name(self): @@ -1162,7 +1162,6 @@ def __set_max_value(self, range_token): None """ - if not range_token[0]: self.__max_value = None self.use_parent_counter = True @@ -1301,8 +1300,8 @@ def __populate_full_name_items(self): for item in self.__items: if isinstance(item, int): self.__full_name_items.append(f"{self.__base}" - f"{item:03}" - f"{self.__suffix}") + f"{item:03}" + f"{self.__suffix}") elif isinstance(item, str): 
self.__full_name_items.append(f"{self.__base}" f"{item}" @@ -1438,6 +1437,7 @@ def get_type(self): return 'task' + class EcfFamilyNode(EcfNode): """ Extension class for the EcfNodes to identify tasks. @@ -1455,6 +1455,7 @@ def get_type(self): return 'family' + class EcfEventNode(EcfNode): """ Extension class for the EcfNodes to identify events. @@ -1472,6 +1473,7 @@ def get_type(self): return 'event' + class ecfTriggerNode(EcfNode): """ Extension class for the EcfNodes to identify triggers. Overloads the @@ -1719,7 +1721,7 @@ def has_event(self): """ if 'event' in self.task_setup.keys(): if self.is_list or self.is_range: - self.event = EcfEventNode(self.task_setup['event'],self) + self.event = EcfEventNode(self.task_setup['event'], self) else: self.event = EcfEventNode(self.task_setup['event'], self.ecfparent) @@ -1744,6 +1746,7 @@ def invalid_event_range(self): "Please review the configuration.") sys.exit(1) + class EcfEventNode(EcfNode): """ Extension class for the EcfNodes to identify events. @@ -1769,6 +1772,7 @@ def get_type(self): """ return 'event' + class EcfEditNode(EcfNode): """ Extension class for the EcfNodes to identify edits. @@ -1795,7 +1799,8 @@ def get_type(self): return 'edit' -class EcfRoot( ): + +class EcfRoot(): """ A root level class that is not an EcfNode object from above but an object that will extend a class from the ecflow module. @@ -1821,7 +1826,8 @@ def get_base_name(): The name of the node if it has a prefix, this strips out the surrounding range and just returns the beginning. 
""" - return re.search("(.*)\{.*\}",self.name()).group(1).strip() + return re.search(r"(.*)\{.*\}", self.name()).group(1).strip() + class EcfSuite(ecflow.Suite, EcfRoot): """ @@ -1855,6 +1861,7 @@ def generate_folders(self, ecfhome): if not os.path.exists(folder_path): os.makedirs(folder_path) + class EcfFamily(ecflow.Family, EcfRoot): """ Extends the ecflow.Family and EcfRoot classes to provide the folder @@ -1895,6 +1902,7 @@ def generate_folders(self, ecfhome, suite, parents): if not os.path.exists(folder_path): os.makedirs(folder_path) + class EcfTask(ecflow.Task, EcfRoot): """ Extends the ecflow.Task and EcfRoot classes to allow the task scripts to @@ -1964,12 +1972,12 @@ def generate_ecflow_task(self, ecfhome, suite, parents): script_name = f"{self.name()}.ecf" ecfscript = None search_script = f"{self.template}.ecf" if self.template is not \ - None else script_name + None else script_name if parents: script_path = f"{ecfhome}/{suite}/{parents.replace('>','/')}/{script_name}" else: script_path = f"{ecfhome}/{suite}/{script_name}" - for root,dirs,files in os.walk(self.scriptrepo): + for root, dirs, files in os.walk(self.scriptrepo): if search_script in files and ecfscript is None: ecfscript = os.path.join(root, search_script) elif script_name in files: @@ -1985,14 +1993,18 @@ def generate_ecflow_task(self, ecfhome, suite, parents): sys.exit(1) # define Python user-defined exceptions + + class Error(Exception): """Base class for other exceptions""" pass + class RangeError(Error): """Raised when the range in the configuration file is incorrect""" pass + class ConfigurationError(Error): """Raised when there is an error in the configuration file.""" pass diff --git a/workflow/ecFlow/ecflow_setup.py b/workflow/ecFlow/ecflow_setup.py index ca7e5162ca3..1170ebc4794 100644 --- a/workflow/ecFlow/ecflow_setup.py +++ b/workflow/ecFlow/ecflow_setup.py @@ -187,7 +187,7 @@ def get_suite_names(suitename): # If the name does have a list, break apart the prefix and suffix # 
from the list and then run it through a for loop to get all # possible values. - name_token = re.search("(.*)\[(.*)\](.*)", suitename) + name_token = re.search(r"(.*)\[(.*)\](.*)", suitename) base = name_token.group(1).strip() list_items = name_token.group(2).strip().split(',') suffix = name_token.group(3).strip() @@ -647,7 +647,7 @@ def find_env_param(node, value, envconfig): new_node = node if value in node: - variable_lookup = re.search(f".*{value}([\dA-Za-z_]*)", node).group(1).strip() + variable_lookup = re.search(fr".*{value}([\dA-Za-z_]*)", node).group(1).strip() if variable_lookup in os.environ: if isinstance(os.environ[variable_lookup], datetime.datetime): new_variable = os.environ[variable_lookup].strftime("%Y%m%d%H") @@ -705,7 +705,7 @@ def runupdate(nested_dict, value): for k, v in nested_dict.items(): if isinstance(v, str) and value in v: lookup = v.split('.') - variable_lookup = re.findall("[\dA-Za-z_]*", lookup[1])[0] + variable_lookup = re.findall(r"[\dA-Za-z_]*", lookup[1])[0] if variable_lookup in os.environ: if isinstance(os.environ[variable_lookup], datetime.datetime): nested_dict[k] = os.environ[variable_lookup].strftime("%Y%m%d%H") diff --git a/workflow/hosts.py b/workflow/hosts.py index 57bce4e5e4f..b97ac67d899 100644 --- a/workflow/hosts.py +++ b/workflow/hosts.py @@ -2,20 +2,11 @@ import os from pathlib import Path -from yaml import load -try: - from yaml import CLoader as Loader -except ImportError: - from yaml import Loader - -__all__ = ['Host'] +from pygw.yaml_file import YAMLFile -def load_yaml(_path: Path): - with open(_path, "r") as _file: - yaml_dict = load(_file, Loader=Loader) - return yaml_dict +__all__ = ['Host'] class Host: @@ -24,7 +15,7 @@ class Host: """ SUPPORTED_HOSTS = ['HERA', 'ORION', 'JET', - 'WCOSS2'] + 'WCOSS2', 'S4', 'CONTAINER'] def __init__(self, host=None): @@ -35,12 +26,13 @@ def __init__(self, host=None): self.machine = detected_host self.info = self._get_info - self.scheduler = self.info['scheduler'] + 
self.scheduler = self.info['SCHEDULER'] @classmethod def detect(cls): machine = 'NOTFOUND' + container = os.getenv('SINGULARITY_NAME', None) if os.path.exists('/scratch1/NCEPDEV'): machine = 'HERA' @@ -50,6 +42,10 @@ def detect(cls): machine = 'JET' elif os.path.exists('/lfs/f1'): machine = 'WCOSS2' + elif os.path.exists('/data/prod'): + machine = 'S4' + elif container is not None: + machine = 'CONTAINER' if machine not in Host.SUPPORTED_HOSTS: raise NotImplementedError(f'This machine is not a supported host.\n' + @@ -63,7 +59,7 @@ def _get_info(self) -> dict: hostfile = Path(os.path.join(os.path.dirname(__file__), f'hosts/{self.machine.lower()}.yaml')) try: - info = load_yaml(hostfile) + info = YAMLFile(path=hostfile) except FileNotFoundError: raise FileNotFoundError(f'{hostfile} does not exist!') except IOError: diff --git a/workflow/hosts/container.yaml b/workflow/hosts/container.yaml new file mode 100644 index 00000000000..3aab3df7511 --- /dev/null +++ b/workflow/hosts/container.yaml @@ -0,0 +1,23 @@ +BASE_GIT: '' +DMPDIR: '/home/${USER}' +PACKAGEROOT: '' +COMROOT: '' +COMINsyn: '' +HOMEDIR: '/home/${USER}' +STMP: '/home/${USER}' +PTMP: '/home/${USER}' +NOSCRUB: $HOMEDIR +SCHEDULER: none +ACCOUNT: '' +QUEUE: '' +QUEUE_SERVICE: '' +PARTITION_BATCH: '' +PARTITION_SERVICE: '' +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '${NOSCRUB}/archive_rotdir/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C96', 'C48'] diff --git a/workflow/hosts/hera_emc.yaml b/workflow/hosts/hera_emc.yaml index 84debf1b9aa..d6e3b83b3d4 100644 --- a/workflow/hosts/hera_emc.yaml +++ b/workflow/hosts/hera_emc.yaml @@ -1,19 +1,23 @@ -base_git: '/scratch1/NCEPDEV/global/glopara/git' -base_svn: '/scratch1/NCEPDEV/global/glopara/svn' -dmpdir: '/scratch1/NCEPDEV/global/glopara/dump' -nwprod: '/scratch1/NCEPDEV/global/glopara/nwpara' -comroot: '/scratch1/NCEPDEV/global/glopara/com' -homedir: '/scratch1/NCEPDEV/global/$USER' 
-stmp: '/scratch1/NCEPDEV/stmp2/$USER' -ptmp: '/scratch1/NCEPDEV/stmp4/$USER' -noscrub: $HOMEDIR -account: fv3-cpu -scheduler: slurm -queue: batch -queue_service: service -partition_batch: hera -chgrp_rstprod: 'YES' -chgrp_cmd: 'chgrp rstprod' -hpssarch: 'YES' -localarch: 'NO' -atardir: '/NCEPDEV/$HPSS_PROJECT/1year/$USER/$machine/scratch/$PSLOT' +BASE_GIT: '/scratch1/NCEPDEV/global/glopara/git' +DMPDIR: '/scratch1/NCEPDEV/global/glopara/dump' +PACKAGEROOT: '/scratch1/NCEPDEV/global/glopara/nwpara' +COMROOT: '/scratch1/NCEPDEV/global/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/scratch1/NCEPDEV/global/${USER}' +STMP: '/scratch1/NCEPDEV/stmp2/${USER}' +PTMP: '/scratch1/NCEPDEV/stmp4/${USER}' +NOSCRUB: $HOMEDIR +ACCOUNT: fv3-cpu +SCHEDULER: slurm +QUEUE: batch +QUEUE_SERVICE: batch +PARTITION_BATCH: hera +PARTITION_SERVICE: service +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'YES' +LOCALARCH: 'NO' +ATARDIR: '/NCEPDEV/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C768', 'C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/hera_gsl.yaml b/workflow/hosts/hera_gsl.yaml index 35638d2c446..957d491f2fc 100644 --- a/workflow/hosts/hera_gsl.yaml +++ b/workflow/hosts/hera_gsl.yaml @@ -1,19 +1,23 @@ -base_git: '/scratch1/NCEPDEV/global/glopara/git' -base_svn: '/scratch1/NCEPDEV/global/glopara/svn' -dmpdir: '/scratch1/NCEPDEV/global/glopara/dump' -nwprod: '/scratch1/NCEPDEV/global/glopara/nwpara' -comroot: '/scratch1/NCEPDEV/global/glopara/com' -homedir: '/scratch1/BMC/gsd-fv3-dev/NCEPDEV/global/$USER' -stmp: '$ROTDIR/..' -ptmp: '$ROTDIR/..' 
-noscrub: $HOMEDIR -account: gsd-fv3-dev -scheduler: slurm -queue: batch -queue_service: service -partition_batch: hera -chgrp_rstprod: 'YES' -chgrp_cmd: 'chgrp rstprod' -hpssarch: 'NO' -localarch: 'NO' -atardir: '/BMC/$HPSS_PROJECT/1year/$USER/$machine/scratch/$PSLOT' +BASE_GIT: '/scratch1/NCEPDEV/global/glopara/git' +DMPDIR: '/scratch1/NCEPDEV/global/glopara/dump' +PACKAGEROOT: '/scratch1/NCEPDEV/global/glopara/nwpara' +COMROOT: '/scratch1/NCEPDEV/global/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/scratch1/BMC/gsd-fv3-dev/NCEPDEV/global/${USER}' +STMP: '${ROTDIR}/..' +PTMP: '${ROTDIR}/..' +NOSCRUB: $HOMEDIR +ACCOUNT: gsd-fv3-dev +SCHEDULER: slurm +QUEUE: batch +QUEUE_SERVICE: batch +PARTITION_BATCH: hera +PARTITION_SERVICE: service +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '/BMC/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C768', 'C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/jet_emc.yaml b/workflow/hosts/jet_emc.yaml new file mode 100644 index 00000000000..903213b7612 --- /dev/null +++ b/workflow/hosts/jet_emc.yaml @@ -0,0 +1,23 @@ +BASE_GIT: '/lfs4/HFIP/hfv3gfs/glopara/git' +DMPDIR: '/lfs4/HFIP/hfv3gfs/glopara/dump' +PACKAGEROOT: '/lfs4/HFIP/hfv3gfs/glopara/nwpara' +COMROOT: '/lfs4/HFIP/hfv3gfs/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/lfs4/HFIP/hfv3gfs/${USER}' +STMP: '/lfs4/HFIP/hfv3gfs/${USER}/stmp' +PTMP: '/lfs4/HFIP/hfv3gfs/${USER}/ptmp' +NOSCRUB: $HOMEDIR +ACCOUNT: hfv3gfs +SCHEDULER: slurm +QUEUE: batch +QUEUE_SERVICE: batch +PARTITION_BATCH: kjet +PARTITION_SERVICE: service +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'YES' +LOCALARCH: 'NO' +ATARDIR: '/NCEPDEV/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/jet_gsl.yaml 
b/workflow/hosts/jet_gsl.yaml index cb7880f1c86..609f22c8464 100644 --- a/workflow/hosts/jet_gsl.yaml +++ b/workflow/hosts/jet_gsl.yaml @@ -1,19 +1,23 @@ -base_git: '/lfs4/HFIP/hfv3gfs/glopara/git' -base_svn: '/lfs4/HFIP/hfv3gfs/glopara/svn' -dmpdir: '/lfs4/HFIP/hfv3gfs/glopara/dump' -nwprod: '/lfs4/HFIP/hfv3gfs/glopara/nwpara' -comroot: '/lfs4/HFIP/hfv3gfs/glopara/com' -homedir: '/lfs4/BMC/gsd-fv3-dev/NCEPDEV/global/$USER' -stmp: '$ROTDIR/..' -ptmp: '$ROTDIR/..' -noscrub: $HOMEDIR -account: gsd-fv3-dev -scheduler: slurm -queue: batch -queue_service: service -partition_batch: jet -chgrp_rstprod: 'YES' -chgrp_cmd: 'chgrp rstprod' -hpssarch: 'NO' -localarch: 'NO' -atardir: '/BMC/$HPSS_PROJECT/1year/$USER/$machine/scratch/$PSLOT' +BASE_GIT: '/lfs4/HFIP/hfv3gfs/glopara/git' +DMPDIR: '/lfs4/HFIP/hfv3gfs/glopara/dump' +PACKAGEROOT: '/lfs4/HFIP/hfv3gfs/glopara/nwpara' +COMROOT: '/lfs4/HFIP/hfv3gfs/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/lfs4/BMC/gsd-fv3-dev/NCEPDEV/global/${USER}' +STMP: '${ROTDIR}/..' +PTMP: '${ROTDIR}/..' 
+NOSCRUB: $HOMEDIR +ACCOUNT: gsd-fv3-dev +SCHEDULER: slurm +QUEUE: batch +QUEUE_SERVICE: batch +PARTITION_BATCH: jet +PARTITION_SERVICE: service +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '/BMC/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/orion_emc.yaml b/workflow/hosts/orion_emc.yaml index bf59ea3e1d6..b1f0a8ad978 100644 --- a/workflow/hosts/orion_emc.yaml +++ b/workflow/hosts/orion_emc.yaml @@ -1,19 +1,23 @@ -base_git: '/work/noaa/global/glopara/git' -base_svn: '/work/noaa/global/glopara/svn' -dmpdir: '/work/noaa/rstprod/dump' -nwprod: '/work/noaa/global/glopara/nwpara' -comroot: '/work/noaa/global/glopara/com' -homedir: '/work/noaa/global/$USER' -stmp: '/work/noaa/stmp/$USER' -ptmp: '/work/noaa/stmp/$USER' -noscrub: $HOMEDIR -scheduler: slurm -account: fv3-cpu -queue: batch -queue_service: service -partition_batch: orion -chgrp_rstprod: 'YES' -chgrp_cmd: 'chgrp rstprod' -hpssarch: 'NO' -localarch: 'NO' -atardir: '$NOSCRUB/archive_rotdir/$PSLOT' \ No newline at end of file +BASE_GIT: '/work/noaa/global/glopara/git' +DMPDIR: '/work/noaa/rstprod/dump' +PACKAGEROOT: '/work/noaa/global/glopara/nwpara' +COMROOT: '/work/noaa/global/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/work/noaa/global/${USER}' +STMP: '/work/noaa/stmp/${USER}' +PTMP: '/work/noaa/stmp/${USER}' +NOSCRUB: $HOMEDIR +SCHEDULER: slurm +ACCOUNT: fv3-cpu +QUEUE: batch +QUEUE_SERVICE: batch +PARTITION_BATCH: orion +PARTITION_SERVICE: service +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '${NOSCRUB}/archive_rotdir/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C768', 'C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/orion_gsl.yaml b/workflow/hosts/orion_gsl.yaml index 9ffc9089cc3..5dfb0efa06c 100644 --- 
a/workflow/hosts/orion_gsl.yaml +++ b/workflow/hosts/orion_gsl.yaml @@ -1,19 +1,23 @@ -base_git: '/work/noaa/global/glopara/git' -base_svn: '/work/noaa/global/glopara/svn' -dmpdir: '/work/noaa/rstprod/dump' -nwprod: '/work/noaa/global/glopara/nwpara' -comroot: '/work/noaa/global/glopara/com' -homedir: '/work/noaa/gsd-fv3-dev/$USER' -stmp: '/work/noaa/gsd-fv3-dev/stmp/$USER' -ptmp: '/work/noaa/gsd-fv3-dev/stmp/$USER' -noscrub: $HOMEDIR -scheduler: slurm -account: gsd-fv3-dev -queue: batch -queue_service: service -partition_batch: orion -chgrp_rstprod: 'YES' -chgrp_cmd: 'chgrp rstprod' -hpssarch: 'NO' -localarch: 'NO' -atardir: '$NOSCRUB/archive_rotdir/$PSLOT' +BASE_GIT: '/work/noaa/global/glopara/git' +DMPDIR: '/work/noaa/rstprod/dump' +PACKAGEROOT: '/work/noaa/global/glopara/nwpara' +COMROOT: '/work/noaa/global/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/work/noaa/gsd-fv3-dev/${USER}' +STMP: '/work/noaa/gsd-fv3-dev/stmp/${USER}' +PTMP: '/work/noaa/gsd-fv3-dev/stmp/${USER}' +NOSCRUB: $HOMEDIR +SCHEDULER: slurm +ACCOUNT: gsd-fv3-dev +QUEUE: batch +QUEUE_SERVICE: batch +PARTITION_BATCH: orion +PARTITION_SERVICE: service +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '${NOSCRUB}/archive_rotdir/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C768', 'C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/s4.yaml b/workflow/hosts/s4.yaml new file mode 100644 index 00000000000..d200eaa5193 --- /dev/null +++ b/workflow/hosts/s4.yaml @@ -0,0 +1,23 @@ +BASE_GIT: '/data/prod/glopara/git' +DMPDIR: '/data/prod/glopara/dump' +PACKAGEROOT: '/data/prod/glopara/nwpara' +COMROOT: '/data/prod/glopara/com' +COMINsyn: '${COMROOT}/gfs/prod/syndat' +HOMEDIR: '/data/users/${USER}' +STMP: '/scratch/users/${USER}' +PTMP: '/scratch/users/${USER}' +NOSCRUB: ${HOMEDIR} +ACCOUNT: star +SCHEDULER: slurm +QUEUE: s4 +QUEUE_SERVICE: serial +PARTITION_BATCH: s4 +PARTITION_SERVICE: serial +CHGRP_RSTPROD: 
'NO' +CHGRP_CMD: 'ls' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '${NOSCRUB}/archive_rotdir/${PSLOT}' +MAKE_NSSTBUFR: 'YES' +MAKE_ACFTBUFR: 'YES' +SUPPORTED_RESOLUTIONS: ['C384', 'C192', 'C96', 'C48'] diff --git a/workflow/hosts/wcoss2.yaml b/workflow/hosts/wcoss2.yaml new file mode 100644 index 00000000000..a049f6ae1fc --- /dev/null +++ b/workflow/hosts/wcoss2.yaml @@ -0,0 +1,23 @@ +BASE_GIT: '/lfs/h2/emc/global/save/emc.global/git' +DMPDIR: '/lfs/h2/emc/global/noscrub/emc.global/dump' +PACKAGEROOT: '${PACKAGEROOT:-"/lfs/h1/ops/prod/packages"}' +COMROOT: '${COMROOT:-"/lfs/h1/ops/prod/com"}' +COMINsyn: '${COMROOT}/gfs/${gfs_ver:-"v16.2"}/syndat' +HOMEDIR: '/lfs/h2/emc/global/noscrub/${USER}' +STMP: '/lfs/h2/emc/stmp/${USER}' +PTMP: '/lfs/h2/emc/ptmp/${USER}' +NOSCRUB: $HOMEDIR +ACCOUNT: 'GFS-DEV' +SCHEDULER: pbspro +QUEUE: 'dev' +QUEUE_SERVICE: 'dev_transfer' +PARTITION_BATCH: '' +PARTITION_SERVICE: '' +CHGRP_RSTPROD: 'YES' +CHGRP_CMD: 'chgrp rstprod' +HPSSARCH: 'NO' +LOCALARCH: 'NO' +ATARDIR: '/NCEPDEV/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' +MAKE_NSSTBUFR: 'NO' +MAKE_ACFTBUFR: 'NO' +SUPPORTED_RESOLUTIONS: ['C768', 'C384', 'C192', 'C96', 'C48'] diff --git a/workflow/pygw b/workflow/pygw new file mode 120000 index 00000000000..dfa1d9a164b --- /dev/null +++ b/workflow/pygw @@ -0,0 +1 @@ +../ush/python/pygw/src/pygw \ No newline at end of file diff --git a/workflow/rocoto/rocoto.py b/workflow/rocoto/rocoto.py index f17caccae2c..11699016639 100644 --- a/workflow/rocoto/rocoto.py +++ b/workflow/rocoto/rocoto.py @@ -78,7 +78,7 @@ def create_task(task_dict: Dict[str, Any]) -> List[str]: threads = resources_dict.get('threads', 1) log = task_dict.get('log', 'demo.log') envar = task_dict.get('envars', None) - dependency = task_dict.get('dependency', None) + dependency = task_dict.get('dependency', []) str_maxtries = str(maxtries) str_final = ' final="true"' if final else '' @@ -109,12 +109,14 @@ def create_task(task_dict: Dict[str, Any]) -> List[str]: 
strings.append(f'\t{e}\n') strings.append('\n') - if dependency is not None: + if dependency is not None and len(dependency) > 0: strings.append('\t<dependency>\n') for d in dependency: strings.append(f'\t\t{d}\n') strings.append('\t</dependency>\n') strings.append('\n') + elif taskname != "gfswaveinit": + print("WARNING: No dependencies for task " + taskname) strings.append('\n') @@ -293,7 +295,7 @@ def _traverse(o, tree_types=(list, tuple)): yield o -def create_dependency(dep_condition=None, dep=None) -> List[str]: +def create_dependency(dep_condition=None, dep=[]) -> List[str]: """ create a compound dependency given a list of dependencies, and compounding condition the list of dependencies are created using add_dependency @@ -309,10 +311,10 @@ def create_dependency(dep_condition=None, dep=None) -> List[str]: strings = [] - if dep_condition is not None: - strings.append(f'<{dep_condition}>') + if len(dep) > 0: + if dep_condition is not None: + strings.append(f'<{dep_condition}>') - if dep[0] is not None: for d in dep: if dep_condition is None: strings.append(f'{d}') @@ -320,8 +322,8 @@ for e in _traverse(d): strings.append(f'\t{e}') - if dep_condition is not None: - strings.append(f'</{dep_condition}>') + if dep_condition is not None: + strings.append(f'</{dep_condition}>') return strings diff --git a/workflow/rocoto/workflow_tasks_emc.py b/workflow/rocoto/workflow_tasks_emc.py index c34605b52a8..786dfac7aaa 100644 --- a/workflow/rocoto/workflow_tasks_emc.py +++ b/workflow/rocoto/workflow_tasks_emc.py @@ -4,6 +4,7 @@ from typing import List from applications import AppConfig import rocoto.rocoto as rocoto +from pygw.template import Template, TemplateConstants __all__ = ['Tasks', 'create_wf_task', 'get_wf_tasks'] @@ -12,10 +13,12 @@ class Tasks: SERVICE_TASKS = ['arch', 'earc', 'getic'] VALID_TASKS = ['aerosol_init', 'coupled_ic', 'getic', 'init', 'prep', 'anal', 'sfcanl', 'analcalc', 'analdiag', 'gldas', 'arch', - 'atmanalprep', 'atmanalrun', 'atmanalpost',
+ 'atmanlinit', 'atmanlrun', 'atmanlfinal', + 'ocnanalprep', 'ocnanalbmat', 'ocnanalrun', 'ocnanalchkpt', 'ocnanalpost', 'ocnanalvrfy', 'earc', 'ecen', 'echgres', 'ediag', 'efcs', 'eobs', 'eomg', 'epos', 'esfc', 'eupd', - 'atmensanalprep', 'atmensanalrun', 'atmensanalpost', + 'atmensanlinit', 'atmensanlrun', 'atmensanlfinal', + 'aeroanlinit', 'aeroanlrun', 'aeroanlfinal', 'fcst', 'post', 'ocnpost', 'vrfy', 'metp', 'postsnd', 'awips', 'gempak', 'wafs', 'wafsblending', 'wafsblending0p25', @@ -37,10 +40,14 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None: envar_dict = {'RUN_ENVIR': self._base.get('RUN_ENVIR', 'emc'), 'HOMEgfs': self._base.get('HOMEgfs'), 'EXPDIR': self._base.get('EXPDIR'), + 'NET': 'gfs', 'CDUMP': self.cdump, + 'RUN': self.cdump, 'CDATE': '@Y@m@d@H', 'PDY': '@Y@m@d', - 'cyc': '@H'} + 'cyc': '@H', + 'COMROOT': self._base.get('COMROOT'), + 'DATAROOT': self._base.get('DATAROOT')} self.envars = self._set_envars(envar_dict) @staticmethod @@ -60,8 +67,53 @@ def _get_hybgroups(nens: int, nmem_per_group: int, start_index: int = 1): @staticmethod def _is_this_a_gdas_task(cdump, task_name): - if cdump != 'gdas': - raise TypeError(f'{task_name} must be part of the "gdas" cycle and not {cdump}') + if cdump != 'enkfgdas': + raise TypeError(f'{task_name} must be part of the "enkfgdas" cycle and not {cdump}') + + def _template_to_rocoto_cycstring(self, template: str, subs_dict: dict = {}) -> str: + ''' + Takes a string templated with ${ } and converts it into a string suitable + for use in a rocoto <cycstr>. Some common substitutions are defined by + default. Any additional variables in the template and overrides of the + defaults can be passed in by an optional dict.
+ + Variables substituted by default: + ${ROTDIR} -> '&ROTDIR;' + ${RUN} -> self.cdump + ${DUMP} -> self.cdump + ${MEMDIR} -> '' + ${YMD} -> '@Y@m@d' + ${HH} -> '@H' + + Parameters + ---------- + template: str + Template string with variables to be replaced + subs_dict: dict, optional + Dictionary containing substitutions + + Returns + ------- + str + Updated string with variables substituted + + ''' + + # Defaults + rocoto_conversion_dict = { + 'ROTDIR': '&ROTDIR;', + 'RUN': self.cdump, + 'DUMP': self.cdump, + 'MEMDIR': '', + 'YMD': '@Y@m@d', + 'HH': '@H' + } + + rocoto_conversion_dict.update(subs_dict) + + return Template.substitute_structure(template, + TemplateConstants.DOLLAR_CURLY_BRACE, + rocoto_conversion_dict.get) def get_resource(self, task_name): """ @@ -97,15 +149,19 @@ def get_resource(self, task_name): memory = task_config.get(f'memory_{task_name}', None) - native = '--export=NONE' if scheduler in ['slurm'] else None + native = None + if scheduler in ['pbspro']: + native = '-l debug=true,place=vscatter' + if task_config.get('is_exclusive', False): + native += ':exclhost' + elif scheduler in ['slurm']: + native = '--export=NONE' - queue = task_config['QUEUE'] - if task_name in Tasks.SERVICE_TASKS and scheduler not in ['slurm']: - queue = task_config['QUEUE_SERVICE'] + queue = task_config['QUEUE_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config['QUEUE'] partition = None if scheduler in ['slurm']: - partition = task_config['QUEUE_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config[ + partition = task_config['PARTITION_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config[ 'PARTITION_BATCH'] task_resource = {'account': account, @@ -145,8 +201,8 @@ def coupled_ic(self): prefix = f"{cpl_ic['BASE_CPLIC']}/{cpl_ic['CPL_ATMIC']}/@Y@m@d@H/{self.cdump}" for file in ['gfs_ctrl.nc'] + \ [f'{datatype}_data.tile{tile}.nc' - for datatype in ['gfs', 'sfc'] - for tile in range(1, self.n_tiles + 1)]: + for datatype in ['gfs', 'sfc'] + for
tile in range(1, self.n_tiles + 1)]: data = f"{prefix}/{atm_res}/INPUT/{file}" dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) @@ -161,7 +217,7 @@ def coupled_ic(self): if self.app_config.do_ocean: ocn_res = f"{self._base.get('OCNRES', '025'):03d}" prefix = f"{cpl_ic['BASE_CPLIC']}/{cpl_ic['CPL_OCNIC']}/@Y@m@d@H/ocn" - for res in ['res'] + [f'res_{res_index}' for res_index in range(1, 5)]: + for res in ['res'] + [f'res_{res_index}' for res_index in range(1, 4)]: data = f"{prefix}/{ocn_res}/MOM.{res}.nc" dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) @@ -192,13 +248,14 @@ def coupled_ic(self): def getic(self): - files = ['INPUT/sfc_data.tile6.nc', - 'RESTART/@Y@m@d.@H0000.sfcanl_data.tile6.nc'] + atm_input_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_INPUT_TMPL']) + atm_restart_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_RESTART_TMPL']) deps = [] - for file in files: - dep_dict = {'type': 'data', 'data': f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/{file}'} - deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'data', 'data': f'{atm_input_path}/sfc_data.tile6.nc'} + deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'data', 'data': f'{atm_restart_path}/@Y@m@d.@H0000.sfcanl_data.tile6.nc'} + deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='nor', dep=deps) resources = self.get_resource('getic') @@ -208,16 +265,14 @@ def getic(self): def init(self): - files = ['gfs.t@Hz.sanl', - 'gfs.t@Hz.atmanl.nemsio', - 'gfs.t@Hz.atmanl.nc', - 'atmos/gfs.t@Hz.atmanl.nc', - 'atmos/RESTART/@Y@m@d.@H0000.sfcanl_data.tile6.nc'] + atm_anl_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_ANALYSIS_TMPL"]) + atm_restart_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_RESTART_TMPL"]) deps = [] - for file in files: - dep_dict = {'type': 'data', 'data': 
f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/{file}'} - deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'data', 'data': f'{atm_anl_path}/gfs.t@Hz.atmanl.nc'} + deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'data', 'data': f'{atm_restart_path}/@Y@m@d.@H0000.sfcanl_data.tile6.nc'} + deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) if self.app_config.do_hpssarch: @@ -232,19 +287,22 @@ def init(self): def prep(self): - suffix = self._base["SUFFIX"] dump_suffix = self._base["DUMP_SUFFIX"] gfs_cyc = self._base["gfs_cyc"] dmpdir = self._base["DMPDIR"] + atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"], {'RUN': 'gdas'}) + dump_path = self._template_to_rocoto_cycstring(self._base["COM_OBSDMP_TMPL"], + {'DMPDIR': dmpdir, 'DUMP_SUFFIX': dump_suffix}) + gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False deps = [] - dep_dict = {'type': 'metatask', 'name': f'{"gdas"}post', 'offset': '-06:00:00'} + dep_dict = {'type': 'metatask', 'name': 'gdaspost', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) - data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009{suffix}' + data = f'{atm_hist_path}/gdas.t@Hz.atmf009.nc' dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) - data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/{self.cdump}.t@Hz.updated.status.tm00.bufr_d' + data = f'{dump_path}/{self.cdump}.t@Hz.updated.status.tm00.bufr_d' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -289,11 +347,14 @@ def waveprep(self): def aerosol_init(self): + input_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_INPUT_TMPL']) + restart_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_RESTART_TMPL']) + 
deps = [] # Files from current cycle files = ['gfs_ctrl.nc'] + [f'gfs_data.tile{tile}.nc' for tile in range(1, self.n_tiles + 1)] for file in files: - data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/INPUT/{file}' + data = f'{input_path}/{file}' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) @@ -305,24 +366,22 @@ def aerosol_init(self): interval = self._base['INTERVAL'] offset = f'-{interval}' - # Previous cycle - dep_dict = {'type': 'cycleexist', 'offset': offset} - deps.append(rocoto.add_dependency(dep_dict)) - # Files from previous cycle files = [f'@Y@m@d.@H0000.fv_core.res.nc'] + \ [f'@Y@m@d.@H0000.fv_core.res.tile{tile}.nc' for tile in range(1, self.n_tiles + 1)] + \ [f'@Y@m@d.@H0000.fv_tracer.res.tile{tile}.nc' for tile in range(1, self.n_tiles + 1)] for file in files: - data = [f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/RERUN_RESTART/', file] + data = [f'{restart_path}', file] dep_dict = {'type': 'data', 'data': data, 'offset': [offset, None]} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + cycledef = 'gfs_seq' resources = self.get_resource('aerosol_init') - task = create_wf_task('aerosol_init', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + task = create_wf_task('aerosol_init', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) return task @@ -331,7 +390,7 @@ def anal(self): dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'} deps.append(rocoto.add_dependency(dep_dict)) if self.app_config.do_hybvar: - dep_dict = {'type': 'metatask', 'name': f'{"gdas"}epmn', 'offset': '-06:00:00'} + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) else: @@ -345,8 +404,8 @@ def anal(self): def sfcanl(self): deps = [] - if self.app_config.do_jedivar: - 
dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalrun'} + if self.app_config.do_jediatmvar: + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlfinal'} else: dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'} deps.append(rocoto.add_dependency(dep_dict)) @@ -360,15 +419,15 @@ def sfcanl(self): def analcalc(self): deps = [] - if self.app_config.do_jedivar: - dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalrun'} + if self.app_config.do_jediatmvar: + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlfinal'} else: dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'} deps.append(rocoto.add_dependency(dep_dict)) dep_dict = {'type': 'task', 'name': f'{self.cdump}sfcanl'} deps.append(rocoto.add_dependency(dep_dict)) if self.app_config.do_hybvar and self.cdump in ['gdas']: - dep_dict = {'type': 'task', 'name': f'{"gdas"}echgres', 'offset': '-06:00:00'} + dep_dict = {'type': 'task', 'name': 'enkfgdasechgres', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -382,8 +441,6 @@ def analdiag(self): deps = [] dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'cycleexist', 'offset': '-06:00:00'} - deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) resources = self.get_resource('analdiag') @@ -391,62 +448,203 @@ def analdiag(self): return task - def atmanalprep(self): + def atmanlinit(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'} + deps.append(rocoto.add_dependency(dep_dict)) + if self.app_config.do_hybvar: + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + else: + dependencies = rocoto.create_dependency(dep=deps) - suffix = 
self._base["SUFFIX"] - dump_suffix = self._base["DUMP_SUFFIX"] gfs_cyc = self._base["gfs_cyc"] + gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False + + cycledef = self.cdump + if self.cdump in ['gfs'] and gfs_enkf and gfs_cyc != 4: + cycledef = 'gdas' + + resources = self.get_resource('atmanlinit') + task = create_wf_task('atmanlinit', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) + + return task + + def atmanlrun(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlinit'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + resources = self.get_resource('atmanlrun') + task = create_wf_task('atmanlrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + + return task + + def atmanlfinal(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlrun'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + + resources = self.get_resource('atmanlfinal') + task = create_wf_task('atmanlfinal', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + + return task + + def aeroanlinit(self): + + dump_suffix = self._base["DUMP_SUFFIX"] dmpdir = self._base["DMPDIR"] - do_gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False + atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"], {'RUN': 'gdas'}) + dump_path = self._template_to_rocoto_cycstring(self._base["COM_OBSDMP_TMPL"], + {'DMPDIR': dmpdir, 'DUMP_SUFFIX': dump_suffix}) deps = [] dep_dict = {'type': 'metatask', 'name': 'gdaspost', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) - data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009{suffix}' + data = f'{atm_hist_path}/gdas.t@Hz.atmf009.nc' dep_dict = {'type': 'data', 'data': 
data, 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) - data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/{self.cdump}.t@Hz.updated.status.tm00.bufr_d' - dep_dict = {'type': 'data', 'data': data} + dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - cycledef = self.cdump - if self.cdump in ['gfs'] and do_gfs_enkf and gfs_cyc != 4: - cycledef = 'gdas' + resources = self.get_resource('aeroanlinit') + task = create_wf_task('aeroanlinit', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + return task + + def aeroanlrun(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlinit'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + resources = self.get_resource('aeroanlrun') + task = create_wf_task('aeroanlrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) - resources = self.get_resource('atmanalprep') - task = create_wf_task('atmanalprep', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, - cycledef=cycledef) return task - def atmanalrun(self): + def aeroanlfinal(self): deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalprep'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlrun'} deps.append(rocoto.add_dependency(dep_dict)) - if self.app_config.do_hybvar: - dep_dict = {'type': 'metatask', 'name': 'gdasepmn', 'offset': '-06:00:00'} + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + + resources = self.get_resource('aeroanlfinal') + task = create_wf_task('aeroanlfinal', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + + return task + + def ocnanalprep(self): + + dump_suffix = self._base["DUMP_SUFFIX"] + dmpdir = self._base["DMPDIR"] + ocean_hist_path = 
self._template_to_rocoto_cycstring(self._base["COM_OCEAN_HISTORY_TMPL"]) + + deps = [] + data = f'{ocean_hist_path}/gdas.t@Hz.ocnf009.nc' + dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + resources = self.get_resource('ocnanalprep') + task = create_wf_task('ocnanalprep', + resources, + cdump=self.cdump, + envar=self.envars, + dependency=dependencies) + + return task + + def ocnanalbmat(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalprep'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + resources = self.get_resource('ocnanalbmat') + task = create_wf_task('ocnanalbmat', + resources, + cdump=self.cdump, + envar=self.envars, + dependency=dependencies) + + return task + + def ocnanalrun(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalbmat'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + resources = self.get_resource('ocnanalrun') + task = create_wf_task('ocnanalrun', + resources, + cdump=self.cdump, + envar=self.envars, + dependency=dependencies) + + return task + + def ocnanalchkpt(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalrun'} + deps.append(rocoto.add_dependency(dep_dict)) + if self.app_config.do_mergensst: + data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.sfcanl.nc' + dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - else: - dependencies = rocoto.create_dependency(dep=deps) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - resources = self.get_resource('atmanalrun') - task = create_wf_task('atmanalrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + resources 
= self.get_resource('ocnanalchkpt') + task = create_wf_task('ocnanalchkpt', + resources, + cdump=self.cdump, + envar=self.envars, + dependency=dependencies) return task - def atmanalpost(self): + def ocnanalpost(self): deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalrun'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalchkpt'} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'cycleexist', 'offset': '-06:00:00'} + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + + resources = self.get_resource('ocnanalpost') + task = create_wf_task('ocnanalpost', + resources, + cdump=self.cdump, + envar=self.envars, + dependency=dependencies) + + return task + + def ocnanalvrfy(self): + + deps = [] + dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalpost'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - resources = self.get_resource('atmanalpost') - task = create_wf_task('atmanalpost', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + resources = self.get_resource('ocnanalvrfy') + task = create_wf_task('ocnanalvrfy', + resources, + cdump=self.cdump, + envar=self.envars, + dependency=dependencies) return task @@ -455,8 +653,6 @@ def gldas(self): deps = [] dep_dict = {'type': 'task', 'name': f'{self.cdump}sfcanl'} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'cycleexist', 'offset': '-06:00:00'} - deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) resources = self.get_resource('gldas') @@ -470,7 +666,7 @@ def fcst(self): 'cycled': self._fcst_cycled} try: - task = fcst_map[self.app_config.mode] + task = fcst_map[self.app_config.mode]() except KeyError: raise NotImplementedError(f'{self.app_config.mode} is not a valid type.\n' + 'Currently supported forecast types are:\n' + @@ -478,16 +674,17 @@ def fcst(self): return task - 
@property def _fcst_forecast_only(self): dependencies = [] deps = [] if self.app_config.do_atm: - data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/INPUT/sfc_data.tile6.nc' + atm_input_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_INPUT_TMPL"]) + atm_restart_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_RESTART_TMPL"]) + data = f'{atm_input_path}/sfc_data.tile6.nc' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) - data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/RESTART/@Y@m@d.@H0000.sfcanl_data.tile6.nc' + data = f'{atm_restart_path}/@Y@m@d.@H0000.sfcanl_data.tile6.nc' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) dependencies.append(rocoto.create_dependency(dep_condition='or', dep=deps)) @@ -531,13 +728,16 @@ def _fcst_forecast_only(self): return task - @property def _fcst_cycled(self): dep_dict = {'type': 'task', 'name': f'{self.cdump}sfcanl'} dep = rocoto.add_dependency(dep_dict) dependencies = rocoto.create_dependency(dep=dep) + if self.app_config.do_jediocnvar: + dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalpost'} + dependencies.append(rocoto.add_dependency(dep_dict)) + if self.app_config.do_gldas and self.cdump in ['gdas']: dep_dict = {'type': 'task', 'name': f'{self.cdump}gldas'} dependencies.append(rocoto.add_dependency(dep_dict)) @@ -546,6 +746,10 @@ def _fcst_cycled(self): dep_dict = {'type': 'task', 'name': f'{self.cdump}waveprep'} dependencies.append(rocoto.add_dependency(dep_dict)) + if self.app_config.do_aero: + dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlfinal'} + dependencies.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies) if self.cdump in ['gdas']: @@ -553,8 +757,11 @@ def _fcst_cycled(self): dependencies.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='or', dep=dependencies) + cycledef = 
'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump + resources = self.get_resource('fcst') - task = create_wf_task('fcst', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + task = create_wf_task('fcst', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) return task @@ -566,7 +773,8 @@ def post(self): return self._post_task('post', add_anl_to_post=add_anl_to_post) def ocnpost(self): - return self._post_task('ocnpost', add_anl_to_post=False) + if self.app_config.mode in ['forecast-only']: # TODO: fix ocnpost in cycled mode + return self._post_task('ocnpost', add_anl_to_post=False) def _post_task(self, task_name, add_anl_to_post=False): if task_name not in ['post', 'ocnpost']: @@ -610,7 +818,8 @@ def _get_postgroups(cdump, config, add_anl=False): return grp, dep, lst deps = [] - data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.log#dep#.txt' + atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"]) + data = f'{atm_hist_path}/{self.cdump}.t@Hz.atm.log#dep#.txt' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) dep_dict = {'type': 'task', 'name': f'{self.cdump}fcst'} @@ -628,16 +837,19 @@ def _get_postgroups(cdump, config, add_anl=False): varval1, varval2, varval3 = _get_postgroups(self.cdump, self._configs[task_name], add_anl=add_anl_to_post) vardict = {varname2: varval2, varname3: varval3} + cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump + resources = self.get_resource(task_name) task = create_wf_task(task_name, resources, cdump=self.cdump, envar=postenvars, dependency=dependencies, - metatask=task_name, varname=varname1, varval=varval1, vardict=vardict) + metatask=task_name, varname=varname1, varval=varval1, vardict=vardict, cycledef=cycledef) return task def wavepostsbs(self): deps = [] for wave_grid in self._configs['wavepostsbs']['waveGRD'].split(): - data = 
f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/wave/rundata/{self.cdump}wave.out_grd.{wave_grid}.@Y@m@d.@H0000' + wave_hist_path = self._template_to_rocoto_cycstring(self._base["COM_WAVE_HISTORY_TMPL"]) + data = f'{wave_hist_path}/{self.cdump}wave.out_grd.{wave_grid}.@Y@m@d.@H0000' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -660,7 +872,8 @@ def wavepostbndpnt(self): def wavepostbndpntbll(self): deps = [] - data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.logf180.txt' + wave_hist_path = self._template_to_rocoto_cycstring(self._base["COM_WAVE_HISTORY_TMPL"]) + data = f'{wave_hist_path}/{self.cdump}.t@Hz.atm.logf180.txt' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -715,7 +928,7 @@ def waveawipsgridded(self): deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) - resources = self.get_resource('waeawipsgridded') + resources = self.get_resource('waveawipsgridded') task = create_wf_task('waveawipsgridded', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) @@ -737,10 +950,12 @@ def _wafs_task(self, task_name): if task_name not in ['wafs', 'wafsgcip', 'wafsgrib2', 'wafsgrib20p25']: raise KeyError(f'Invalid WAFS task: {task_name}') + wafs_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_WAFS_TMPL"]) + deps = [] fhrlst = [6] + [*range(12, 36 + 3, 3)] for fhr in fhrlst: - data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.wafs.grb2if{fhr:03d}' + data = f'{wafs_path}/{self.cdump}.t@Hz.wafs.grb2if{fhr:03d}' dep_dict = {'type': 'data', 'data': data} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -862,8 +1077,25 @@ def vrfy(self): deps.append(rocoto.add_dependency(dep_dict)) dependencies = 
rocoto.create_dependency(dep=deps) + cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump + resources = self.get_resource('vrfy') - task = create_wf_task('vrfy', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + task = create_wf_task('vrfy', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) + + return task + + def fit2obs(self): + deps = [] + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}post'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump + + resources = self.get_resource('fit2obs') + task = create_wf_task('fit2obs', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) return task @@ -894,6 +1126,9 @@ def arch(self): if self.app_config.do_vrfy: dep_dict = {'type': 'task', 'name': f'{self.cdump}vrfy'} deps.append(rocoto.add_dependency(dep_dict)) + if self.app_config.do_fit2obs and self.cdump in ['gdas']: + dep_dict = {'type': 'task', 'name': f'{self.cdump}fit2obs'} + deps.append(rocoto.add_dependency(dep_dict)) if self.app_config.do_metp and self.cdump in ['gfs']: dep_dict = {'type': 'metatask', 'name': f'{self.cdump}metp'} deps.append(rocoto.add_dependency(dep_dict)) @@ -906,21 +1141,30 @@ def arch(self): dep_dict = {'type': 'task', 'name': f'{self.cdump}wavepostbndpnt'} deps.append(rocoto.add_dependency(dep_dict)) if self.app_config.do_ocean: - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ocnpost'} + if self.app_config.mode in ['forecast-only']: # TODO: fix ocnpost to run in cycled mode + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ocnpost'} + deps.append(rocoto.add_dependency(dep_dict)) + # If all verification and ocean/wave coupling is off, add the gdas/gfs post metatask as a dependency + if len(deps) == 0: + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}post'} 
deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump + resources = self.get_resource('arch') - task = create_wf_task('arch', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + task = create_wf_task('arch', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) return task # Start of ensemble tasks def eobs(self): deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'} + dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}prep'} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'metatask', 'name': f'{"gdas"}epmn', 'offset': '-06:00:00'} + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -971,65 +1215,49 @@ def eupd(self): return task - def atmensanalprep(self): - - suffix = self._base["SUFFIX"] - dump_suffix = self._base["DUMP_SUFFIX"] - gfs_cyc = self._base["gfs_cyc"] - dmpdir = self._base["DMPDIR"] - do_gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False - + def atmensanlinit(self): deps = [] - dep_dict = {'type': 'metatask', 'name': 'gdaspost', 'offset': '-06:00:00'} + dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}prep'} deps.append(rocoto.add_dependency(dep_dict)) - data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009{suffix}' - dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'} - deps.append(rocoto.add_dependency(dep_dict)) - data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/{self.cdump}.t@Hz.updated.status.tm00.bufr_d' - dep_dict = {'type': 'data', 'data': data} + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} 
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
-        cycledef = self.cdump
-        if self.cdump in ['gfs'] and do_gfs_enkf and gfs_cyc != 4:
-            cycledef = 'gdas'
-
-        resources = self.get_resource('atmensanalprep')
-        task = create_wf_task('atmensanalprep', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+        cycledef = "gdas"
+        resources = self.get_resource('atmensanlinit')
+        task = create_wf_task('atmensanlinit', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
                               cycledef=cycledef)
 
         return task
 
-    def atmensanalrun(self):
+    def atmensanlrun(self):
 
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanalprep'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlinit'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dep_dict = {'type': 'metatask', 'name': 'gdasepmn', 'offset': '-06:00:00'}
+        dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
-        resources = self.get_resource('atmensanalrun')
-        task = create_wf_task('atmensanalrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        resources = self.get_resource('atmensanlrun')
+        task = create_wf_task('atmensanlrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
 
         return task
 
-    def atmensanalpost(self):
+    def atmensanlfinal(self):
 
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanalrun'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlrun'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
 
-        resources = self.get_resource('atmensanalpost')
-        task = create_wf_task('atmensanalpost', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        resources = self.get_resource('atmensanlfinal')
+        task = create_wf_task('atmensanlfinal', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
 
         return task
 
     def ecen(self):
 
-        self._is_this_a_gdas_task(self.cdump, 'ecen')
-
         def _get_ecengroups():
 
             if self._base.get('DOIAU_ENKF', False):
@@ -1053,15 +1281,13 @@ def _get_ecengroups():
 
             return grp, dep, lst
 
-        eupd_cdump = 'gdas' if 'gdas' in self.app_config.eupd_cdumps else 'gfs'
-
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}analcalc'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}analcalc'}
         deps.append(rocoto.add_dependency(dep_dict))
-        if self.app_config.do_jediens:
-            dep_dict = {'type': 'task', 'name': f'{eupd_cdump}atmensanalrun'}
+        if self.app_config.do_jediatmens:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlfinal'}
         else:
-            dep_dict = {'type': 'task', 'name': f'{eupd_cdump}eupd'}
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}eupd'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
@@ -1082,29 +1308,25 @@ def _get_ecengroups():
 
     def esfc(self):
 
-        self._is_this_a_gdas_task(self.cdump, 'esfc')
-
-        eupd_cdump = 'gdas' if 'gdas' in self.app_config.eupd_cdumps else 'gfs'
+        # eupd_cdump = 'gdas' if 'gdas' in self.app_config.eupd_cdumps else 'gfs'
 
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}analcalc'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}analcalc'}
         deps.append(rocoto.add_dependency(dep_dict))
-        if self.app_config.do_jediens:
-            dep_dict = {'type': 'task', 'name': f'{eupd_cdump}atmensanalrun'}
+        if self.app_config.do_jediatmens:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlfinal'}
         else:
-            dep_dict = {'type': 'task', 'name': f'{eupd_cdump}eupd'}
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}eupd'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
         resources = self.get_resource('esfc')
-        task = create_wf_task('esfc', resources, cdump='gdas', envar=self.envars, dependency=dependencies)
+        task = create_wf_task('esfc', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
 
         return task
 
     def efcs(self):
 
-        self._is_this_a_gdas_task(self.cdump, 'efcs')
-
         deps = []
         dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ecmn'}
         deps.append(rocoto.add_dependency(dep_dict))
@@ -1120,9 +1342,12 @@ def efcs(self):
 
         groups = self._get_hybgroups(self._base['NMEM_ENKF'], self._configs['efcs']['NMEM_EFCSGRP'])
+        if self.cdump == "enkfgfs":
+            groups = self._get_hybgroups(self._base['NMEM_EFCS'], self._configs['efcs']['NMEM_EFCSGRP_GFS'])
+        cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '')
         resources = self.get_resource('efcs')
         task = create_wf_task('efcs', resources, cdump=self.cdump, envar=efcsenvars, dependency=dependencies,
-                              metatask='efmn', varname='grp', varval=groups)
+                              metatask='efmn', varname='grp', varval=groups, cycledef=cycledef)
 
         return task
 
@@ -1131,25 +1356,29 @@ def echgres(self):
         self._is_this_a_gdas_task(self.cdump, 'echgres')
 
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}fcst'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}fcst'}
         deps.append(rocoto.add_dependency(dep_dict))
         dep_dict = {'type': 'task', 'name': f'{self.cdump}efcs01'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump
+
         resources = self.get_resource('echgres')
-        task = create_wf_task('echgres', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        task = create_wf_task('echgres', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
 
         return task
 
     def epos(self):
 
-        self._is_this_a_gdas_task(self.cdump, 'epos')
-
         def _get_eposgroups(epos):
             fhmin = epos['FHMIN_ENKF']
             fhmax = epos['FHMAX_ENKF']
             fhout = epos['FHOUT_ENKF']
+            if self.cdump == "enkfgfs":
+                fhmax = epos['FHMAX_ENKF_GFS']
+                fhout = epos['FHOUT_ENKF_GFS']
             fhrs = range(fhmin, fhmax + fhout, fhout)
 
             neposgrp = epos['NEPOSGRP']
@@ -1180,16 +1409,16 @@ def _get_eposgroups(epos):
 
         varval1, varval2, varval3 = _get_eposgroups(self._configs['epos'])
         vardict = {varname2: varval2, varname3: varval3}
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '')
+
         resources = self.get_resource('epos')
         task = create_wf_task('epos', resources, cdump=self.cdump, envar=eposenvars, dependency=dependencies,
-                              metatask='epmn', varname=varname1, varval=varval1, vardict=vardict)
+                              metatask='epmn', varname=varname1, varval=varval1, vardict=vardict, cycledef=cycledef)
 
         return task
 
     def earc(self):
 
-        self._is_this_a_gdas_task(self.cdump, 'earc')
-
         deps = []
         dep_dict = {'type': 'metatask', 'name': f'{self.cdump}epmn'}
         deps.append(rocoto.add_dependency(dep_dict))
@@ -1200,9 +1429,11 @@ def earc(self):
 
         groups = self._get_hybgroups(self._base['NMEM_ENKF'], self._configs['earc']['NMEM_EARCGRP'], start_index=0)
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '')
+
         resources = self.get_resource('earc')
         task = create_wf_task('earc', resources, cdump=self.cdump, envar=earcenvars, dependency=dependencies,
-                              metatask='eamn', varname='grp', varval=groups)
+                              metatask='eamn', varname='grp', varval=groups, cycledef=cycledef)
 
         return task
 
@@ -1220,7 +1451,7 @@ def create_wf_task(task_name, resources,
                 'varval': f'{varval}',
                 'vardict': vardict}
 
-    cycledefstr = cdump if cycledef is None else cycledef
+    cycledefstr = cdump.replace('enkf', '') if cycledef is None else cycledef
 
     task_dict = {'taskname': f'{tasknamestr}',
                  'cycledef': f'{cycledefstr}',
diff --git a/workflow/rocoto/workflow_tasks_gsl.py b/workflow/rocoto/workflow_tasks_gsl.py
index 734fe5a6d58..854e0d2a3c1 100644
--- a/workflow/rocoto/workflow_tasks_gsl.py
+++ b/workflow/rocoto/workflow_tasks_gsl.py
@@ -4,6 +4,7 @@
 from typing import List
 from applications import AppConfig
 import rocoto.rocoto as rocoto
+from pygw.template import Template, TemplateConstants
 
 __all__ = ['Tasks', 'create_wf_task', 'get_wf_tasks']
 
@@ -12,10 +13,12 @@ class Tasks:
     SERVICE_TASKS = ['arch', 'earc', 'getic']
     VALID_TASKS = ['aerosol_init', 'coupled_ic', 'getic', 'init',
                    'prep', 'anal', 'sfcanl', 'analcalc', 'analdiag', 'gldas', 'arch',
-                   'atmanalprep', 'atmanalrun', 'atmanalpost',
+                   'atmanlinit', 'atmanlrun', 'atmanlfinal',
+                   'ocnanalprep', 'ocnanalbmat', 'ocnanalrun', 'ocnanalchkpt', 'ocnanalpost', 'ocnanalvrfy',
                    'earc', 'ecen', 'echgres', 'ediag', 'efcs',
                    'eobs', 'eomg', 'epos', 'esfc', 'eupd',
-                   'atmensanalprep', 'atmensanalrun', 'atmensanalpost',
+                   'atmensanlinit', 'atmensanlrun', 'atmensanlfinal',
+                   'aeroanlinit', 'aeroanlrun', 'aeroanlfinal',
                    'fcst', 'post', 'ocnpost', 'vrfy', 'metp',
                    'postsnd', 'awips', 'gempak',
                    'wafs', 'wafsblending', 'wafsblending0p25',
@@ -38,10 +41,14 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None:
                       'HOMEgfs': self._base.get('HOMEgfs'),
                       'EXPDIR': self._base.get('EXPDIR'),
                       'ROTDIR': self._base.get('ROTDIR'),
+                      'NET': 'gfs',
                       'CDUMP': self.cdump,
+                      'RUN': self.cdump,
                       'CDATE': '@Y@m@d@H',
                       'PDY': '@Y@m@d',
-                      'cyc': '@H'}
+                      'cyc': '@H',
+                      'COMROOT': self._base.get('COMROOT'),
+                      'DATAROOT': self._base.get('DATAROOT')}
         self.envars = self._set_envars(envar_dict)
 
     @staticmethod
@@ -61,8 +68,53 @@ def _get_hybgroups(nens: int, nmem_per_group: int, start_index: int = 1):
 
     @staticmethod
     def _is_this_a_gdas_task(cdump, task_name):
-        if cdump != 'gdas':
-            raise TypeError(f'{task_name} must be part of the "gdas" cycle and not {cdump}')
+        if cdump != 'enkfgdas':
+            raise TypeError(f'{task_name} must be part of the "enkfgdas" cycle and not {cdump}')
+
+    def _template_to_rocoto_cycstring(self, template: str, subs_dict: dict = {}) -> str:
+        '''
+        Takes a string templated with ${ } and converts it into a string suitable
+        for use in a rocoto <cyclestr>. Some common substitutions are defined by
+        default. Any additional variables in the template and overrides of the
+        defaults can be passed in by an optional dict.
+
+        Variables substituted by default:
+            ${ROTDIR} -> '&ROTDIR;'
+            ${RUN}    -> self.cdump
+            ${DUMP}   -> self.cdump
+            ${MEMDIR} -> ''
+            ${YMD}    -> '@Y@m@d'
+            ${HH}     -> '@H'
+
+        Parameters
+        ----------
+        template: str
+                  Template string with variables to be replaced
+        subs_dict: dict, optional
+                   Dictionary containing substitutions
+
+        Returns
+        -------
+        str
+            Updated string with variables substituted
+
+        '''
+
+        # Defaults
+        rocoto_conversion_dict = {
+            'ROTDIR': '&ROTDIR;',
+            'RUN': self.cdump,
+            'DUMP': self.cdump,
+            'MEMDIR': '',
+            'YMD': '@Y@m@d',
+            'HH': '@H'
+        }
+
+        rocoto_conversion_dict.update(subs_dict)
+
+        return Template.substitute_structure(template,
+                                             TemplateConstants.DOLLAR_CURLY_BRACE,
+                                             rocoto_conversion_dict.get)
 
     def get_resource(self, task_name):
         """
@@ -98,15 +150,19 @@ def get_resource(self, task_name):
 
         memory = task_config.get(f'memory_{task_name}', None)
 
-        native = '&NATIVE_STR;' if scheduler in ['slurm'] else None
+        native = None
+        if scheduler in ['pbspro']:
+            native = '-l debug=true,place=vscatter'
+            if task_config.get('is_exclusive', False):
+                native += ':exclhost'
+        elif scheduler in ['slurm']:
+            native = '&NATIVE_STR;'
 
-        queue = task_config['QUEUE']
-        if task_name in Tasks.SERVICE_TASKS and scheduler not in ['slurm']:
-            queue = task_config['QUEUE_SERVICE']
+        queue = task_config['QUEUE_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config['QUEUE']
 
         partition = None
         if scheduler in ['slurm']:
-            partition = task_config['QUEUE_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config[
+            partition = task_config['PARTITION_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config[
                 'PARTITION_BATCH']
 
         task_resource = {'account': account,
@@ -146,8 +202,8 @@ def coupled_ic(self):
         prefix = f"{cpl_ic['BASE_CPLIC']}/{cpl_ic['CPL_ATMIC']}/@Y@m@d@H/{self.cdump}"
         for file in ['gfs_ctrl.nc'] + \
                     [f'{datatype}_data.tile{tile}.nc'
-                        for datatype in ['gfs', 'sfc']
-                        for tile in range(1, self.n_tiles + 1)]:
+                     for datatype in ['gfs', 'sfc']
+                     for tile in range(1, self.n_tiles + 1)]:
             data = f"{prefix}/{atm_res}/INPUT/{file}"
             dep_dict = {'type': 'data', 'data': data}
             deps.append(rocoto.add_dependency(dep_dict))
@@ -162,7 +218,7 @@ def coupled_ic(self):
         if self.app_config.do_ocean:
             ocn_res = f"{self._base.get('OCNRES', '025'):03d}"
             prefix = f"{cpl_ic['BASE_CPLIC']}/{cpl_ic['CPL_OCNIC']}/@Y@m@d@H/ocn"
-            for res in ['res'] + [f'res_{res_index}' for res_index in range(1, 5)]:
+            for res in ['res'] + [f'res_{res_index}' for res_index in range(1, 4)]:
                 data = f"{prefix}/{ocn_res}/MOM.{res}.nc"
                 dep_dict = {'type': 'data', 'data': data}
                 deps.append(rocoto.add_dependency(dep_dict))
@@ -193,13 +249,14 @@ def coupled_ic(self):
 
     def getic(self):
 
-        files = ['INPUT/sfc_data.tile6.nc',
-                 'RESTART/@Y@m@d.@H0000.sfcanl_data.tile6.nc']
+        atm_input_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_INPUT_TMPL'])
+        atm_restart_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_RESTART_TMPL'])
 
         deps = []
-        for file in files:
-            dep_dict = {'type': 'data', 'data': f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/{file}'}
-            deps.append(rocoto.add_dependency(dep_dict))
+        dep_dict = {'type': 'data', 'data': f'{atm_input_path}/sfc_data.tile6.nc'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dep_dict = {'type': 'data', 'data': f'{atm_restart_path}/@Y@m@d.@H0000.sfcanl_data.tile6.nc'}
+        deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='nor', dep=deps)
 
         resources = self.get_resource('getic')
@@ -207,36 +264,22 @@ def getic(self):
 
         return task
 
-##JKH    def init(self):
-##JKH
-##JKH        files = ['gfs.t@Hz.sanl',
-##JKH                 'gfs.t@Hz.atmanl.nemsio',
-##JKH                 'gfs.t@Hz.atmanl.nc',
-##JKH                 'atmos/gfs.t@Hz.atmanl.nc',
-##JKH                 'atmos/RESTART/@Y@m@d.@H0000.sfcanl_data.tile6.nc']
-##JKH
-##JKH        deps = []
-##JKH        for file in files:
-##JKH            dep_dict = {'type': 'data', 'data': f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/{file}'}
-##JKH            deps.append(rocoto.add_dependency(dep_dict))
-##JKH        dependencies = rocoto.create_dependency(dep_condition='or', dep=deps)
-##JKH
-##JKH        if self.app_config.do_hpssarch:
-##JKH            dep_dict = {'type': 'task', 'name': f'{self.cdump}getic'}
-##JKH            dependencies.append(rocoto.add_dependency(dep_dict))
-##JKH            dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies)
-##JKH
-##JKH        resources = self.get_resource('init')
-##JKH        task = create_wf_task('init', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
-##JKH
-##JKH        return task
-
     def init(self):
 
+        atm_anl_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_ANALYSIS_TMPL"])
+        atm_restart_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_RESTART_TMPL"])
+
         deps = []
-        dep_dict = {'type': 'data', 'data': f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/INPUT'}
+        dep_dict = {'type': 'data', 'data': f'{atm_anl_path}/gfs.t@Hz.atmanl.nc'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dependencies = rocoto.create_dependency(dep_condition='not', dep=deps)
+        dep_dict = {'type': 'data', 'data': f'{atm_restart_path}/@Y@m@d.@H0000.sfcanl_data.tile6.nc'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep_condition='or', dep=deps)
+
+        if self.app_config.do_hpssarch:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}getic'}
+            dependencies.append(rocoto.add_dependency(dep_dict))
+            dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies)
 
         resources = self.get_resource('init')
         task = create_wf_task('init', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
@@ -245,19 +288,22 @@ def init(self):
 
     def prep(self):
 
-        suffix = self._base["SUFFIX"]
         dump_suffix = self._base["DUMP_SUFFIX"]
         gfs_cyc = self._base["gfs_cyc"]
         dmpdir = self._base["DMPDIR"]
+        atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"], {'RUN': 'gdas'})
+        dump_path = self._template_to_rocoto_cycstring(self._base["COM_OBSDMP_TMPL"],
+                                                       {'DMPDIR': dmpdir, 'DUMP_SUFFIX': dump_suffix})
+
         gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False
 
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{"gdas"}post', 'offset': '-06:00:00'}
+        dep_dict = {'type': 'metatask', 'name': 'gdaspost', 'offset': '-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
-        data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009{suffix}'
+        data = f'{atm_hist_path}/gdas.t@Hz.atmf009.nc'
         dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
-        data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/{self.cdump}.t@Hz.updated.status.tm00.bufr_d'
+        data = f'{dump_path}/{self.cdump}.t@Hz.updated.status.tm00.bufr_d'
         dep_dict = {'type': 'data', 'data': data}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
@@ -302,11 +348,14 @@ def waveprep(self):
 
     def aerosol_init(self):
 
+        input_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_INPUT_TMPL'])
+        restart_path = self._template_to_rocoto_cycstring(self._base['COM_ATMOS_RESTART_TMPL'])
+
         deps = []
         # Files from current cycle
         files = ['gfs_ctrl.nc'] + [f'gfs_data.tile{tile}.nc' for tile in range(1, self.n_tiles + 1)]
         for file in files:
-            data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/INPUT/{file}'
+            data = f'{input_path}/{file}'
             dep_dict = {'type': 'data', 'data': data}
             deps.append(rocoto.add_dependency(dep_dict))
 
@@ -318,24 +367,22 @@ def aerosol_init(self):
         interval = self._base['INTERVAL']
         offset = f'-{interval}'
 
-        # Previous cycle
-        dep_dict = {'type': 'cycleexist', 'offset': offset}
-        deps.append(rocoto.add_dependency(dep_dict))
-
         # Files from previous cycle
         files = [f'@Y@m@d.@H0000.fv_core.res.nc'] + \
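Editor's note: the new `_template_to_rocoto_cycstring` helper used throughout these hunks maps `${VAR}` COM-path templates onto Rocoto cyclestr tokens via pygw's `Template.substitute_structure`. A standard-library stand-in of the same idea, with the default mapping mirroring the method's docstring (the helper name and `cdump` default below are illustrative, not part of the patch):

```python
import re

def to_rocoto_cycstring(template, subs_dict=None, cdump='gdas'):
    """Replace ${NAME} tokens with Rocoto-friendly values; a sketch of what
    _template_to_rocoto_cycstring does via pygw, using only the stdlib."""
    conv = {'ROTDIR': '&ROTDIR;', 'RUN': cdump, 'DUMP': cdump,
            'MEMDIR': '', 'YMD': '@Y@m@d', 'HH': '@H'}
    conv.update(subs_dict or {})
    # Substitute each ${NAME}; names without a mapping are left untouched.
    return re.sub(r'\$\{(\w+)\}',
                  lambda m: conv.get(m.group(1), m.group(0)),
                  template)

# A COM template like COM_ATMOS_HISTORY_TMPL might look like this:
result = to_rocoto_cycstring('${ROTDIR}/${RUN}.${YMD}/${HH}/atmos')
```

This is why the hunks can pass overrides such as `{'RUN': 'gdas'}` or `{'DMPDIR': dmpdir, 'DUMP_SUFFIX': dump_suffix}`: `subs_dict` entries take precedence over the defaults.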
                [f'@Y@m@d.@H0000.fv_core.res.tile{tile}.nc' for tile in range(1, self.n_tiles + 1)] + \
                [f'@Y@m@d.@H0000.fv_tracer.res.tile{tile}.nc' for tile in range(1, self.n_tiles + 1)]
 
         for file in files:
-            data = [f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/RERUN_RESTART/', file]
+            data = [f'{restart_path}', file]
             dep_dict = {'type': 'data', 'data': data, 'offset': [offset, None]}
             deps.append(rocoto.add_dependency(dep_dict))
 
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
+        cycledef = 'gfs_seq'
         resources = self.get_resource('aerosol_init')
-        task = create_wf_task('aerosol_init', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        task = create_wf_task('aerosol_init', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
 
         return task
 
@@ -344,7 +391,7 @@ def anal(self):
         dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'}
         deps.append(rocoto.add_dependency(dep_dict))
         if self.app_config.do_hybvar:
-            dep_dict = {'type': 'metatask', 'name': f'{"gdas"}epmn', 'offset': '-06:00:00'}
+            dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'}
             deps.append(rocoto.add_dependency(dep_dict))
             dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
         else:
@@ -358,8 +405,8 @@ def sfcanl(self):
 
         deps = []
-        if self.app_config.do_jedivar:
-            dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalrun'}
+        if self.app_config.do_jediatmvar:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlfinal'}
         else:
             dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'}
         deps.append(rocoto.add_dependency(dep_dict))
@@ -373,15 +420,15 @@ def analcalc(self):
 
         deps = []
-        if self.app_config.do_jedivar:
-            dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalrun'}
+        if self.app_config.do_jediatmvar:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlfinal'}
         else:
             dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'}
         deps.append(rocoto.add_dependency(dep_dict))
         dep_dict = {'type': 'task', 'name': f'{self.cdump}sfcanl'}
         deps.append(rocoto.add_dependency(dep_dict))
         if self.app_config.do_hybvar and self.cdump in ['gdas']:
-            dep_dict = {'type': 'task', 'name': f'{"gdas"}echgres', 'offset': '-06:00:00'}
+            dep_dict = {'type': 'task', 'name': 'enkfgdasechgres', 'offset': '-06:00:00'}
             deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
@@ -395,8 +442,6 @@ def analdiag(self):
         deps = []
         dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dep_dict = {'type': 'cycleexist', 'offset': '-06:00:00'}
-        deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
         resources = self.get_resource('analdiag')
@@ -404,62 +449,203 @@ def analdiag(self):
 
         return task
 
-    def atmanalprep(self):
+    def atmanlinit(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        if self.app_config.do_hybvar:
+            dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'}
+            deps.append(rocoto.add_dependency(dep_dict))
+            dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+        else:
+            dependencies = rocoto.create_dependency(dep=deps)
 
-        suffix = self._base["SUFFIX"]
-        dump_suffix = self._base["DUMP_SUFFIX"]
         gfs_cyc = self._base["gfs_cyc"]
+        gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False
+
+        cycledef = self.cdump
+        if self.cdump in ['gfs'] and gfs_enkf and gfs_cyc != 4:
+            cycledef = 'gdas'
+
+        resources = self.get_resource('atmanlinit')
+        task = create_wf_task('atmanlinit', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
+
+        return task
+
+    def atmanlrun(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlinit'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep=deps)
+
+        resources = self.get_resource('atmanlrun')
+        task = create_wf_task('atmanlrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+
+        return task
+
+    def atmanlfinal(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlrun'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+
+        resources = self.get_resource('atmanlfinal')
+        task = create_wf_task('atmanlfinal', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+
+        return task
+
+    def aeroanlinit(self):
+
+        dump_suffix = self._base["DUMP_SUFFIX"]
         dmpdir = self._base["DMPDIR"]
-        do_gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False
+        atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"], {'RUN': 'gdas'})
+        dump_path = self._template_to_rocoto_cycstring(self._base["COM_OBSDMP_TMPL"],
+                                                       {'DMPDIR': dmpdir, 'DUMP_SUFFIX': dump_suffix})
 
         deps = []
         dep_dict = {'type': 'metatask', 'name': 'gdaspost', 'offset': '-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
-        data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009{suffix}'
+        data = f'{atm_hist_path}/gdas.t@Hz.atmf009.nc'
         dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
-        data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/{self.cdump}.t@Hz.updated.status.tm00.bufr_d'
-        dep_dict = {'type': 'data', 'data': data}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
-        cycledef = self.cdump
-        if self.cdump in ['gfs'] and do_gfs_enkf and gfs_cyc != 4:
-            cycledef = 'gdas'
+        resources = self.get_resource('aeroanlinit')
+        task = create_wf_task('aeroanlinit', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+
+        return task
+
+    def aeroanlrun(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlinit'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep=deps)
+
+        resources = self.get_resource('aeroanlrun')
+        task = create_wf_task('aeroanlrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
 
-        resources = self.get_resource('atmanalprep')
-        task = create_wf_task('atmanalprep', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
-                              cycledef=cycledef)
         return task
 
-    def atmanalrun(self):
+    def aeroanlfinal(self):
 
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalprep'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlrun'}
         deps.append(rocoto.add_dependency(dep_dict))
-        if self.app_config.do_hybvar:
-            dep_dict = {'type': 'metatask', 'name': 'gdasepmn', 'offset': '-06:00:00'}
+        dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+
+        resources = self.get_resource('aeroanlfinal')
+        task = create_wf_task('aeroanlfinal', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+
+        return task
+
+    def ocnanalprep(self):
+
+        dump_suffix = self._base["DUMP_SUFFIX"]
+        dmpdir = self._base["DMPDIR"]
+        ocean_hist_path = self._template_to_rocoto_cycstring(self._base["COM_OCEAN_HISTORY_TMPL"])
+
+        deps = []
+        data = f'{ocean_hist_path}/gdas.t@Hz.ocnf009.nc'
+        dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep=deps)
+
+        resources = self.get_resource('ocnanalprep')
+        task = create_wf_task('ocnanalprep',
+                              resources,
+                              cdump=self.cdump,
+                              envar=self.envars,
+                              dependency=dependencies)
+
+        return task
+
+    def ocnanalbmat(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalprep'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep=deps)
+
+        resources = self.get_resource('ocnanalbmat')
+        task = create_wf_task('ocnanalbmat',
+                              resources,
+                              cdump=self.cdump,
+                              envar=self.envars,
+                              dependency=dependencies)
+
+        return task
+
+    def ocnanalrun(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalbmat'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep=deps)
+
+        resources = self.get_resource('ocnanalrun')
+        task = create_wf_task('ocnanalrun',
+                              resources,
+                              cdump=self.cdump,
+                              envar=self.envars,
+                              dependency=dependencies)
+
+        return task
+
+    def ocnanalchkpt(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalrun'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        if self.app_config.do_mergensst:
+            data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.sfcanl.nc'
+            dep_dict = {'type': 'data', 'data': data}
             deps.append(rocoto.add_dependency(dep_dict))
-            dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
-        else:
-            dependencies = rocoto.create_dependency(dep=deps)
+        dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
-        resources = self.get_resource('atmanalrun')
-        task = create_wf_task('atmanalrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        resources = self.get_resource('ocnanalchkpt')
+        task = create_wf_task('ocnanalchkpt',
+                              resources,
+                              cdump=self.cdump,
+                              envar=self.envars,
+                              dependency=dependencies)
 
         return task
 
-    def atmanalpost(self):
+    def ocnanalpost(self):
 
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanalrun'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalchkpt'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dep_dict = {'type': 'cycleexist', 'offset': '-06:00:00'}
+        dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+
+        resources = self.get_resource('ocnanalpost')
+        task = create_wf_task('ocnanalpost',
+                              resources,
+                              cdump=self.cdump,
+                              envar=self.envars,
+                              dependency=dependencies)
+
+        return task
+
+    def ocnanalvrfy(self):
+
+        deps = []
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalpost'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
-        resources = self.get_resource('atmanalpost')
-        task = create_wf_task('atmanalpost', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        resources = self.get_resource('ocnanalvrfy')
+        task = create_wf_task('ocnanalvrfy',
+                              resources,
+                              cdump=self.cdump,
+                              envar=self.envars,
+                              dependency=dependencies)
 
         return task
 
@@ -468,8 +654,6 @@ def gldas(self):
         deps = []
         dep_dict = {'type': 'task', 'name': f'{self.cdump}sfcanl'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dep_dict = {'type': 'cycleexist', 'offset': '-06:00:00'}
-        deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
         resources = self.get_resource('gldas')
@@ -483,7 +667,7 @@ def fcst(self):
                     'cycled': self._fcst_cycled}
 
         try:
-            task = fcst_map[self.app_config.mode]
+            task = fcst_map[self.app_config.mode]()
         except KeyError:
             raise NotImplementedError(f'{self.app_config.mode} is not a valid type.\n' +
                                       'Currently supported forecast types are:\n' +
@@ -491,16 +675,17 @@ def fcst(self):
 
         return task
 
-    @property
     def _fcst_forecast_only(self):
         dependencies = []
 
         deps = []
         if self.app_config.do_atm:
-            data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/INPUT/sfc_data.tile6.nc'
+            atm_input_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_INPUT_TMPL"])
+            atm_restart_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_RESTART_TMPL"])
+            data = f'{atm_input_path}/sfc_data.tile6.nc'
             dep_dict = {'type': 'data', 'data': data}
             deps.append(rocoto.add_dependency(dep_dict))
-            data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/RESTART/@Y@m@d.@H0000.sfcanl_data.tile6.nc'
+            data = f'{atm_restart_path}/@Y@m@d.@H0000.sfcanl_data.tile6.nc'
             dep_dict = {'type': 'data', 'data': data}
             deps.append(rocoto.add_dependency(dep_dict))
             dependencies.append(rocoto.create_dependency(dep_condition='or', dep=deps))
@@ -544,13 +729,16 @@ def _fcst_forecast_only(self):
 
         return task
 
-    @property
     def _fcst_cycled(self):
 
         dep_dict = {'type': 'task', 'name': f'{self.cdump}sfcanl'}
         dep = rocoto.add_dependency(dep_dict)
         dependencies = rocoto.create_dependency(dep=dep)
 
+        if self.app_config.do_jediocnvar:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}ocnanalpost'}
+            dependencies.append(rocoto.add_dependency(dep_dict))
+
         if self.app_config.do_gldas and self.cdump in ['gdas']:
             dep_dict = {'type': 'task', 'name': f'{self.cdump}gldas'}
             dependencies.append(rocoto.add_dependency(dep_dict))
@@ -559,6 +747,10 @@ def _fcst_cycled(self):
             dep_dict = {'type': 'task', 'name': f'{self.cdump}waveprep'}
             dependencies.append(rocoto.add_dependency(dep_dict))
 
+        if self.app_config.do_aero:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlfinal'}
+            dependencies.append(rocoto.add_dependency(dep_dict))
+
         dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies)
 
         if self.cdump in ['gdas']:
@@ -566,8 +758,11 @@ def _fcst_cycled(self):
             dependencies.append(rocoto.add_dependency(dep_dict))
             dependencies = rocoto.create_dependency(dep_condition='or', dep=dependencies)
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump
+
         resources = self.get_resource('fcst')
-        task = create_wf_task('fcst', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        task = create_wf_task('fcst', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
 
         return task
 
@@ -579,7 +774,8 @@ def post(self):
         return self._post_task('post', add_anl_to_post=add_anl_to_post)
 
     def ocnpost(self):
-        return self._post_task('ocnpost', add_anl_to_post=False)
+        if self.app_config.mode in ['forecast-only']:  # TODO: fix ocnpost in cycled mode
+            return self._post_task('ocnpost', add_anl_to_post=False)
 
     def _post_task(self, task_name, add_anl_to_post=False):
         if task_name not in ['post', 'ocnpost']:
@@ -623,7 +819,8 @@ def _get_postgroups(cdump, config, add_anl=False):
             return grp, dep, lst
 
         deps = []
-        data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.log#dep#.txt'
+        atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"])
+        data = f'{atm_hist_path}/{self.cdump}.t@Hz.atm.log#dep#.txt'
         dep_dict = {'type': 'data', 'data': data}
         deps.append(rocoto.add_dependency(dep_dict))
         dep_dict = {'type': 'task', 'name': f'{self.cdump}fcst'}
@@ -641,16 +838,19 @@ def _get_postgroups(cdump, config, add_anl=False):
 
         varval1, varval2, varval3 = _get_postgroups(self.cdump, self._configs[task_name], add_anl=add_anl_to_post)
         vardict = {varname2: varval2, varname3: varval3}
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump
+
         resources = self.get_resource(task_name)
         task = create_wf_task(task_name, resources, cdump=self.cdump, envar=postenvars, dependency=dependencies,
-                              metatask=task_name, varname=varname1, varval=varval1, vardict=vardict)
+                              metatask=task_name, varname=varname1, varval=varval1, vardict=vardict, cycledef=cycledef)
 
         return task
 
     def wavepostsbs(self):
         deps = []
         for wave_grid in self._configs['wavepostsbs']['waveGRD'].split():
-            data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/wave/rundata/{self.cdump}wave.out_grd.{wave_grid}.@Y@m@d.@H0000'
+            wave_hist_path = self._template_to_rocoto_cycstring(self._base["COM_WAVE_HISTORY_TMPL"])
+            data = f'{wave_hist_path}/{self.cdump}wave.out_grd.{wave_grid}.@Y@m@d.@H0000'
             dep_dict = {'type': 'data', 'data': data}
             deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
@@ -673,7 +873,8 @@ def wavepostbndpnt(self):
 
     def wavepostbndpntbll(self):
         deps = []
-        data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.logf180.txt'
+        wave_hist_path = self._template_to_rocoto_cycstring(self._base["COM_WAVE_HISTORY_TMPL"])
+        data = f'{wave_hist_path}/{self.cdump}.t@Hz.atm.logf180.txt'
         dep_dict = {'type': 'data', 'data': data}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -728,7 +929,7 @@ def waveawipsgridded(self):
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
 
-        resources = self.get_resource('waeawipsgridded')
+        resources = self.get_resource('waveawipsgridded')
         task = create_wf_task('waveawipsgridded', resources, cdump=self.cdump, envar=self.envars,
                               dependency=dependencies)
 
@@ -750,10 +951,12 @@ def _wafs_task(self, task_name):
         if task_name not in ['wafs', 'wafsgcip', 'wafsgrib2', 'wafsgrib20p25']:
             raise KeyError(f'Invalid WAFS task: {task_name}')
 
+        wafs_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_WAFS_TMPL"])
+
         deps = []
         fhrlst = [6] + [*range(12, 36 + 3, 3)]
         for fhr in fhrlst:
-            data = f'&ROTDIR;/{self.cdump}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.wafs.grb2if{fhr:03d}'
+            data = f'{wafs_path}/{self.cdump}.t@Hz.wafs.grb2if{fhr:03d}'
            dep_dict = {'type': 'data', 'data': data}
            deps.append(rocoto.add_dependency(dep_dict))
        dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
@@ -875,8 +1078,25 @@ def vrfy(self):
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump
+
         resources = self.get_resource('vrfy')
-        task = create_wf_task('vrfy', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        task = create_wf_task('vrfy', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
+
+        return task
+
+    def fit2obs(self):
+        deps = []
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}post'}
+        deps.append(rocoto.add_dependency(dep_dict))
+        dependencies = rocoto.create_dependency(dep=deps)
+
+        cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump
+
+        resources = self.get_resource('fit2obs')
+        task = create_wf_task('fit2obs', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
 
         return task
 
@@ -907,6 +1127,9 @@ def arch(self):
         if self.app_config.do_vrfy:
             dep_dict = {'type': 'task', 'name': f'{self.cdump}vrfy'}
             deps.append(rocoto.add_dependency(dep_dict))
+        if self.app_config.do_fit2obs and self.cdump in ['gdas']:
+            dep_dict = {'type': 'task', 'name': f'{self.cdump}fit2obs'}
+            deps.append(rocoto.add_dependency(dep_dict))
         if self.app_config.do_metp and self.cdump in ['gfs']:
             dep_dict = {'type': 'metatask', 'name': f'{self.cdump}metp'}
             deps.append(rocoto.add_dependency(dep_dict))
@@ -919,21 +1142,30 @@ def arch(self):
             dep_dict = {'type': 'task', 'name': f'{self.cdump}wavepostbndpnt'}
             deps.append(rocoto.add_dependency(dep_dict))
         if self.app_config.do_ocean:
-            dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ocnpost'}
+            if self.app_config.mode in ['forecast-only']:  # TODO: fix ocnpost to run in cycled mode
+                dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ocnpost'}
+                deps.append(rocoto.add_dependency(dep_dict))
+        # If all verification and ocean/wave coupling is off, add the gdas/gfs post metatask as a dependency
+        if len(deps) == 0:
+            dep_dict = {'type': 'metatask', 'name': f'{self.cdump}post'}
             deps.append(rocoto.add_dependency(dep_dict))
+
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
 
+        cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump
+
         resources = self.get_resource('arch')
-        task = create_wf_task('arch', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies)
+        task = create_wf_task('arch', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies,
+                              cycledef=cycledef)
 
         return task
 
     # Start of ensemble tasks
     def eobs(self):
         deps = []
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'}
+        dep_dict = {'type': 'task', 'name': 
f'{self.cdump.replace("enkf","")}prep'} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'metatask', 'name': f'{"gdas"}epmn', 'offset': '-06:00:00'} + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -984,65 +1216,49 @@ def eupd(self): return task - def atmensanalprep(self): - - suffix = self._base["SUFFIX"] - dump_suffix = self._base["DUMP_SUFFIX"] - gfs_cyc = self._base["gfs_cyc"] - dmpdir = self._base["DMPDIR"] - do_gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False - + def atmensanlinit(self): deps = [] - dep_dict = {'type': 'metatask', 'name': 'gdaspost', 'offset': '-06:00:00'} + dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}prep'} deps.append(rocoto.add_dependency(dep_dict)) - data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009{suffix}' - dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'} - deps.append(rocoto.add_dependency(dep_dict)) - data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/{self.cdump}.t@Hz.updated.status.tm00.bufr_d' - dep_dict = {'type': 'data', 'data': data} + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - cycledef = self.cdump - if self.cdump in ['gfs'] and do_gfs_enkf and gfs_cyc != 4: - cycledef = 'gdas' - - resources = self.get_resource('atmensanalprep') - task = create_wf_task('atmensanalprep', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef = "gdas" + resources = self.get_resource('atmensanlinit') + task = create_wf_task('atmensanlinit', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, cycledef=cycledef) return task - def atmensanalrun(self): + def atmensanlrun(self): deps = [] 
- dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanalprep'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlinit'} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'metatask', 'name': 'gdasepmn', 'offset': '-06:00:00'} + dep_dict = {'type': 'metatask', 'name': 'enkfgdasepmn', 'offset': '-06:00:00'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - resources = self.get_resource('atmensanalrun') - task = create_wf_task('atmensanalrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + resources = self.get_resource('atmensanlrun') + task = create_wf_task('atmensanlrun', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) return task - def atmensanalpost(self): + def atmensanlfinal(self): deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanalrun'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlrun'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) - resources = self.get_resource('atmensanalpost') - task = create_wf_task('atmensanalpost', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + resources = self.get_resource('atmensanlfinal') + task = create_wf_task('atmensanlfinal', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) return task def ecen(self): - self._is_this_a_gdas_task(self.cdump, 'ecen') - def _get_ecengroups(): if self._base.get('DOIAU_ENKF', False): @@ -1066,15 +1282,13 @@ def _get_ecengroups(): return grp, dep, lst - eupd_cdump = 'gdas' if 'gdas' in self.app_config.eupd_cdumps else 'gfs' - deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}analcalc'} + dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}analcalc'} deps.append(rocoto.add_dependency(dep_dict)) - if self.app_config.do_jediens: - dep_dict = {'type': 'task', 'name': 
f'{eupd_cdump}atmensanalrun'} + if self.app_config.do_jediatmens: + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlfinal'} else: - dep_dict = {'type': 'task', 'name': f'{eupd_cdump}eupd'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}eupd'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) @@ -1095,29 +1309,25 @@ def _get_ecengroups(): def esfc(self): - self._is_this_a_gdas_task(self.cdump, 'esfc') - - eupd_cdump = 'gdas' if 'gdas' in self.app_config.eupd_cdumps else 'gfs' + # eupd_cdump = 'gdas' if 'gdas' in self.app_config.eupd_cdumps else 'gfs' deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}analcalc'} + dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}analcalc'} deps.append(rocoto.add_dependency(dep_dict)) - if self.app_config.do_jediens: - dep_dict = {'type': 'task', 'name': f'{eupd_cdump}atmensanalrun'} + if self.app_config.do_jediatmens: + dep_dict = {'type': 'task', 'name': f'{self.cdump}atmensanlfinal'} else: - dep_dict = {'type': 'task', 'name': f'{eupd_cdump}eupd'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}eupd'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) resources = self.get_resource('esfc') - task = create_wf_task('esfc', resources, cdump='gdas', envar=self.envars, dependency=dependencies) + task = create_wf_task('esfc', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) return task def efcs(self): - self._is_this_a_gdas_task(self.cdump, 'efcs') - deps = [] dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ecmn'} deps.append(rocoto.add_dependency(dep_dict)) @@ -1133,9 +1343,12 @@ def efcs(self): groups = self._get_hybgroups(self._base['NMEM_ENKF'], self._configs['efcs']['NMEM_EFCSGRP']) + if self.cdump == "enkfgfs": + groups = self._get_hybgroups(self._base['NMEM_EFCS'], self._configs['efcs']['NMEM_EFCSGRP_GFS']) + cycledef = 
'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '') resources = self.get_resource('efcs') task = create_wf_task('efcs', resources, cdump=self.cdump, envar=efcsenvars, dependency=dependencies, - metatask='efmn', varname='grp', varval=groups) + metatask='efmn', varname='grp', varval=groups, cycledef=cycledef) return task @@ -1144,25 +1357,29 @@ def echgres(self): self._is_this_a_gdas_task(self.cdump, 'echgres') deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}fcst'} + dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}fcst'} deps.append(rocoto.add_dependency(dep_dict)) dep_dict = {'type': 'task', 'name': f'{self.cdump}efcs01'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump + resources = self.get_resource('echgres') - task = create_wf_task('echgres', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies) + task = create_wf_task('echgres', resources, cdump=self.cdump, envar=self.envars, dependency=dependencies, + cycledef=cycledef) return task def epos(self): - self._is_this_a_gdas_task(self.cdump, 'epos') - def _get_eposgroups(epos): fhmin = epos['FHMIN_ENKF'] fhmax = epos['FHMAX_ENKF'] fhout = epos['FHOUT_ENKF'] + if self.cdump == "enkfgfs": + fhmax = epos['FHMAX_ENKF_GFS'] + fhout = epos['FHOUT_ENKF_GFS'] fhrs = range(fhmin, fhmax + fhout, fhout) neposgrp = epos['NEPOSGRP'] @@ -1193,16 +1410,16 @@ def _get_eposgroups(epos): varval1, varval2, varval3 = _get_eposgroups(self._configs['epos']) vardict = {varname2: varval2, varname3: varval3} + cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '') + resources = self.get_resource('epos') task = create_wf_task('epos', resources, cdump=self.cdump, envar=eposenvars, dependency=dependencies, - metatask='epmn', varname=varname1, varval=varval1, vardict=vardict) + 
metatask='epmn', varname=varname1, varval=varval1, vardict=vardict, cycledef=cycledef) return task def earc(self): - self._is_this_a_gdas_task(self.cdump, 'earc') - deps = [] dep_dict = {'type': 'metatask', 'name': f'{self.cdump}epmn'} deps.append(rocoto.add_dependency(dep_dict)) @@ -1213,9 +1430,11 @@ def earc(self): groups = self._get_hybgroups(self._base['NMEM_ENKF'], self._configs['earc']['NMEM_EARCGRP'], start_index=0) + cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '') + resources = self.get_resource('earc') task = create_wf_task('earc', resources, cdump=self.cdump, envar=earcenvars, dependency=dependencies, - metatask='eamn', varname='grp', varval=groups) + metatask='eamn', varname='grp', varval=groups, cycledef=cycledef) return task @@ -1233,7 +1452,7 @@ def create_wf_task(task_name, resources, 'varval': f'{varval}', 'vardict': vardict} - cycledefstr = cdump if cycledef is None else cycledef + cycledefstr = cdump.replace('enkf', '') if cycledef is None else cycledef task_dict = {'taskname': f'{tasknamestr}', 'cycledef': f'{cycledefstr}', diff --git a/workflow/rocoto/workflow_xml_emc.py b/workflow/rocoto/workflow_xml_emc.py index e0786964f99..856cade2b94 100644 --- a/workflow/rocoto/workflow_xml_emc.py +++ b/workflow/rocoto/workflow_xml_emc.py @@ -3,7 +3,9 @@ import os from distutils.spawn import find_executable from datetime import datetime +from pygw.timetools import to_timedelta from collections import OrderedDict +from typing import Dict from applications import AppConfig from rocoto.workflow_tasks import get_wf_tasks import rocoto.rocoto as rocoto @@ -11,9 +13,10 @@ class RocotoXML: - def __init__(self, app_config: AppConfig) -> None: + def __init__(self, app_config: AppConfig, rocoto_config: Dict) -> None: self._app_config = app_config + self.rocoto_config = rocoto_config self._base = self._app_config.configs['base'] @@ -56,13 +59,10 @@ def _get_definitions(self) -> str: entity['PSLOT'] = self._base['PSLOT'] 
- if self._app_config.mode in ['forecast-only']: - entity['ICSDIR'] = self._base['ICSDIR'] - entity['ROTDIR'] = self._base['ROTDIR'] entity['JOBS_DIR'] = self._base['BASE_JOB'] - entity['MAXTRIES'] = self._base.get('ROCOTO_MAXTRIES', 2) + entity['MAXTRIES'] = self.rocoto_config['maxtries'] # Put them all in an XML key-value syntax strings = [] @@ -77,9 +77,9 @@ def _get_workflow_header(self): """ scheduler = self._app_config.scheduler - cyclethrottle = self._base.get('ROCOTO_CYCLETHROTTLE', 3) - taskthrottle = self._base.get('ROCOTO_TASKTHROTTLE', 25) - verbosity = self._base.get('ROCOTO_VERBOSITY', 10) + cyclethrottle = self.rocoto_config['cyclethrottle'] + taskthrottle = self.rocoto_config['taskthrottle'] + verbosity = self.rocoto_config['verbosity'] expdir = self._base['EXPDIR'] @@ -110,29 +110,41 @@ def _get_cycledefs(self): return cycledefs def _get_cycledefs_cycled(self): - sdate = self._base['SDATE'].strftime('%Y%m%d%H%M') - edate = self._base['EDATE'].strftime('%Y%m%d%H%M') + sdate = self._base['SDATE'] + edate = self._base['EDATE'] interval = self._base.get('INTERVAL', '06:00:00') - strings = [f'\t{sdate} {edate} {interval}'] + strings = [] + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {sdate.strftime("%Y%m%d%H%M")} {interval}') + sdate = sdate + to_timedelta(interval) + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {edate.strftime("%Y%m%d%H%M")} {interval}') if self._app_config.gfs_cyc != 0: - sdate_gfs = self._base['SDATE_GFS'].strftime('%Y%m%d%H%M') - edate_gfs = self._base['EDATE_GFS'].strftime('%Y%m%d%H%M') + sdate_gfs = self._base['SDATE_GFS'] + edate_gfs = self._base['EDATE_GFS'] interval_gfs = self._base['INTERVAL_GFS'] - strings.append(f'\t{sdate_gfs} {edate_gfs} {interval_gfs}') - strings.append('') - strings.append('') + strings.append(f'\t{sdate_gfs.strftime("%Y%m%d%H%M")} {edate_gfs.strftime("%Y%m%d%H%M")} {interval_gfs}') + + sdate_gfs = sdate_gfs + to_timedelta(interval_gfs) + if sdate_gfs <= edate_gfs: + 
strings.append(f'\t{sdate_gfs.strftime("%Y%m%d%H%M")} {edate_gfs.strftime("%Y%m%d%H%M")} {interval_gfs}') + + strings.append('') + strings.append('') return '\n'.join(strings) def _get_cycledefs_forecast_only(self): - sdate = self._base['SDATE'].strftime('%Y%m%d%H%M') - edate = self._base['EDATE'].strftime('%Y%m%d%H%M') + sdate = self._base['SDATE'] + edate = self._base['EDATE'] interval = self._base.get('INTERVAL_GFS', '24:00:00') - cdump = self._base['CDUMP'] - strings = f'\t{sdate} {edate} {interval}\n\n' + strings = [] + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {edate.strftime("%Y%m%d%H%M")} {interval}') + + sdate = sdate + to_timedelta(interval) + if sdate <= edate: + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {edate.strftime("%Y%m%d%H%M")} {interval}') - return strings + return '\n'.join(strings) @staticmethod def _get_workflow_footer(): @@ -154,8 +166,8 @@ def _assemble_xml(self) -> str: return ''.join(strings) def write(self, xml_file: str = None, crontab_file: str = None): - self._write_xml(xml_file = xml_file) - self._write_crontab(crontab_file = crontab_file) + self._write_xml(xml_file=xml_file) + self._write_crontab(crontab_file=crontab_file) def _write_xml(self, xml_file: str = None) -> None: diff --git a/workflow/rocoto/workflow_xml_gsl.py b/workflow/rocoto/workflow_xml_gsl.py index 245884ce064..44443d27b22 100644 --- a/workflow/rocoto/workflow_xml_gsl.py +++ b/workflow/rocoto/workflow_xml_gsl.py @@ -3,7 +3,9 @@ import os from distutils.spawn import find_executable from datetime import datetime +from pygw.timetools import to_timedelta from collections import OrderedDict +from typing import Dict from applications import AppConfig from rocoto.workflow_tasks import get_wf_tasks import rocoto.rocoto as rocoto @@ -11,9 +13,10 @@ class RocotoXML: - def __init__(self, app_config: AppConfig) -> None: + def __init__(self, app_config: AppConfig, rocoto_config: Dict) -> None: self._app_config = app_config + self.rocoto_config = rocoto_config 
self._base = self._app_config.configs['base'] @@ -56,14 +59,11 @@ def _get_definitions(self) -> str: entity['PSLOT'] = self._base['PSLOT'] - if self._app_config.mode in ['forecast-only']: - entity['ICSDIR'] = self._base['ICSDIR'] - entity['ROTDIR'] = self._base['ROTDIR'] entity['JOBS_DIR'] = self._base['BASE_JOB'] entity['NATIVE_STR'] = '--export=NONE' - entity['MAXTRIES'] = self._base.get('ROCOTO_MAXTRIES', 2) + entity['MAXTRIES'] = self.rocoto_config['maxtries'] # Put them all in an XML key-value syntax strings = [] @@ -78,9 +78,9 @@ def _get_workflow_header(self): """ scheduler = self._app_config.scheduler - cyclethrottle = self._base.get('ROCOTO_CYCLETHROTTLE', 3) - taskthrottle = self._base.get('ROCOTO_TASKTHROTTLE', 25) - verbosity = self._base.get('ROCOTO_VERBOSITY', 10) + cyclethrottle = self.rocoto_config['cyclethrottle'] + taskthrottle = self.rocoto_config['taskthrottle'] + verbosity = self.rocoto_config['verbosity'] expdir = self._base['EXPDIR'] @@ -111,29 +111,41 @@ def _get_cycledefs(self): return cycledefs def _get_cycledefs_cycled(self): - sdate = self._base['SDATE'].strftime('%Y%m%d%H%M') - edate = self._base['EDATE'].strftime('%Y%m%d%H%M') + sdate = self._base['SDATE'] + edate = self._base['EDATE'] interval = self._base.get('INTERVAL', '06:00:00') - strings = [f'\t{sdate} {edate} {interval}'] + strings = [] + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {sdate.strftime("%Y%m%d%H%M")} {interval}') + sdate = sdate + to_timedelta(interval) + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {edate.strftime("%Y%m%d%H%M")} {interval}') if self._app_config.gfs_cyc != 0: - sdate_gfs = self._base['SDATE_GFS'].strftime('%Y%m%d%H%M') - edate_gfs = self._base['EDATE_GFS'].strftime('%Y%m%d%H%M') + sdate_gfs = self._base['SDATE_GFS'] + edate_gfs = self._base['EDATE_GFS'] interval_gfs = self._base['INTERVAL_GFS'] - strings.append(f'\t{sdate_gfs} {edate_gfs} {interval_gfs}') - strings.append('') - strings.append('') + 
strings.append(f'\t{sdate_gfs.strftime("%Y%m%d%H%M")} {edate_gfs.strftime("%Y%m%d%H%M")} {interval_gfs}') + + sdate_gfs = sdate_gfs + to_timedelta(interval_gfs) + if sdate_gfs <= edate_gfs: + strings.append(f'\t{sdate_gfs.strftime("%Y%m%d%H%M")} {edate_gfs.strftime("%Y%m%d%H%M")} {interval_gfs}') + + strings.append('') + strings.append('') return '\n'.join(strings) def _get_cycledefs_forecast_only(self): - sdate = self._base['SDATE'].strftime('%Y%m%d%H%M') - edate = self._base['EDATE'].strftime('%Y%m%d%H%M') + sdate = self._base['SDATE'] + edate = self._base['EDATE'] interval = self._base.get('INTERVAL_GFS', '24:00:00') - cdump = self._base['CDUMP'] - strings = f'\t{sdate} {edate} {interval}\n\n' + strings = [] + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {edate.strftime("%Y%m%d%H%M")} {interval}') + + sdate = sdate + to_timedelta(interval) + if sdate <= edate: + strings.append(f'\t{sdate.strftime("%Y%m%d%H%M")} {edate.strftime("%Y%m%d%H%M")} {interval}') - return strings + return '\n'.join(strings) @staticmethod def _get_workflow_footer(): @@ -155,8 +167,8 @@ def _assemble_xml(self) -> str: return ''.join(strings) def write(self, xml_file: str = None, crontab_file: str = None): - self._write_xml(xml_file = xml_file) - self._write_crontab(crontab_file = crontab_file) + self._write_xml(xml_file=xml_file) + self._write_crontab(crontab_file=crontab_file) def _write_xml(self, xml_file: str = None) -> None: diff --git a/workflow/rocoto_viewer.py b/workflow/rocoto_viewer.py index 63db6f25383..95dd9e76dd8 100755 --- a/workflow/rocoto_viewer.py +++ b/workflow/rocoto_viewer.py @@ -12,9 +12,14 @@ # rocoto_viewer.py -w my_gfs-workflow.xml -d my_database.db # # The script is located in the directory para/exp/rocoto/rocotoviewers/rocotoviewer_curses/rocoto_viewer.py -# The view will continuously update every four minutes and reflect the current status of your workflow. 
You may use your mouse or arrow keys to select a particular task and view its status details by pressing the key \p c as indicated as \b \ (which runs \b rocotocheck) or perform a \b rocotorewind by pressing \b \ to restart the workflow at that point. Running \b rocotorewind causes the state information of that task to be cleared from the database and resubmits the job to the scheduler. +# The view will continuously update every four minutes and reflect the current status of your workflow. +# You may use your mouse or arrow keys to select a particular task and view its status details by pressing the key \p c as indicated as \b \ +# (which runs \b rocotocheck) or perform a \b rocotorewind by pressing \b \ to restart the workflow at that point. +# Running \b rocotorewind causes the state information of that task to be cleared from the database and resubmits the job to the scheduler. # -# Tasks marked with the \b \< symbol are \b metatasks and can be expanded by highlight that task with the mouse, and then clicking on the \b \< symbol which then changes to \b \> . You can then click on the \b \> symbol to collapse it again. Alternatively, you can select the 'x' to expand and collapse metatasks when selected. +# Tasks marked with the \b \< symbol are \b metatasks and can be expanded by highlighting that task with the mouse, +# and then clicking on the \b \< symbol which then changes to \b \>. +# You can then click on the \b \> symbol to collapse it again. Alternatively, you can select the 'x' to expand and collapse metatasks when selected. 
# # @cond ROCOTO_VIEWER_CURSES @@ -125,6 +130,7 @@ mlines = 0 mcols = 0 + def eprint(message: str) -> None: """ Print to stderr instead of stdout @@ -208,7 +214,7 @@ def string_to_timedelta(td_string: str) -> timedelta: and mdict['negative'] == '-': return -dt return dt - except(TypeError, ValueError, AttributeError): + except (TypeError, ValueError, AttributeError): raise @@ -941,13 +947,13 @@ def get_tasklist(workflow_file): task_cycledefs = cycle_noname if list_tasks: print(f"{task_name}, {task_cycledefs}") - # dependancies = child.getiterator('dependency') + # dependancies = child.iter('dependency') # for dependency in dependancies: # for them in dependency.getchildren(): # print(them.attrib) tasks_ordered.append((task_name, task_cycledefs, log_file)) elif child.tag == 'metatask': - all_metatasks_iterator = child.getiterator('metatask') + all_metatasks_iterator = child.iter('metatask') all_vars = dict() all_tasks = [] for i, metatasks in enumerate(all_metatasks_iterator): @@ -1113,9 +1119,14 @@ def get_rocoto_stat(params, queue_stat): (theid, jobid, task_order, taskname, cycle, state, exit_status, duration, tries) = row if jobid != '-': if use_performance_metrics: - line = f"{datetime.fromtimestamp(cycle).strftime('%Y%m%d%H%M')} {taskname} {str(jobid)} {str(state)} {str(exit_status)} {str(tries)} {str(duration).split('.')[0]} {str(slots)} {str(qtime)} {str(cputime).split('.')[0]} {str(runtime)}" + line = (f"{datetime.fromtimestamp(cycle).strftime('%Y%m%d%H%M')} " + f"{taskname} {str(jobid)} {str(state)} {str(exit_status)} " + f"{str(tries)} {str(duration).split('.')[0]} {str(slots)} " + f"{str(qtime)} {str(cputime).split('.')[0]} {str(runtime)}") else: - line = f"{datetime.fromtimestamp(cycle).strftime('%Y%m%d%H%M')} {taskname} {str(jobid)} {str(state)} {str(exit_status)} {str(tries)} {str(duration).split('.')[0]}" + line = (f"{datetime.fromtimestamp(cycle).strftime('%Y%m%d%H%M')} " + f"{taskname} {str(jobid)} {str(state)} {str(exit_status)} " + f"{str(tries)} 
{str(duration).split('.')[0]}") info[cycle].append(line) for every_cycle in cycles: @@ -1279,7 +1290,8 @@ def main(screen): use_multiprocessing = False # header_string = ' '*18+'CYCLE'+' '*17+'TASK'+' '*39+'JOBID'+' '*6+'STATE'+' '*9+'EXIT'+' '*2+'TRIES'+' '*2+'DURATION' - header_string = ' ' * 7 + 'CYCLE' + ' ' * (int(job_name_length_max / 2) + 3) + 'TASK' + ' ' * (int(job_name_length_max / 2) + 3) + 'JOBID' + ' ' * 6 + 'STATE' + ' ' * 9 + 'EXIT' + ' ' * 1 + 'TRIES' + ' ' * 1 + 'DURATION' + header_string = ' ' * 7 + 'CYCLE' + ' ' * (int(job_name_length_max / 2) + 3) + 'TASK' + ' ' * (int(job_name_length_max / 2) + 3) + \ + 'JOBID' + ' ' * 6 + 'STATE' + ' ' * 9 + 'EXIT' + ' ' * 1 + 'TRIES' + ' ' * 1 + 'DURATION' header_string_under = '=== (updated:tttttttttttttttt) =================== PSLOT: pslot ' + '=' * 44 global use_performance_metrics @@ -1336,8 +1348,10 @@ def main(screen): html_ptr = open(html_output_file, 'w') html_ptr.write(ccs_html) stat_update_time = str(datetime.now()).rsplit(':', 1)[0] - html_discribe_line = f'\n\n\n\n' - html_discribe_line += f'\n\n
ExpandRefreshed: {stat_update_time}PSLOT: {PSLOT}
ROTDIR: {workflow_name}Turn Around Times
\n
\n' + html_discribe_line = f'\n\n\n\n' + html_discribe_line += f'' + html_discribe_line += f'\n\n
' + html_discribe_line += f'ExpandRefreshed: {stat_update_time}PSLOT: {PSLOT}
ROTDIR: {workflow_name}Turn Around Times
\n
\n' html_discribe_line += html_header_line html_ptr.write(html_discribe_line) else: @@ -1701,7 +1715,9 @@ def main(screen): column = column[:7] html_line += f'{column}' elif i == 3: - if meta_tasks[cycle][line_num][1] and len(metatasks_state_string_cycle[cycle][columns[1]].split()) != 1 and metatasks_state_cycle[cycle][columns[1]]: + if meta_tasks[cycle][line_num][1] \ + and len(metatasks_state_string_cycle[cycle][columns[1]].split()) != 1 \ + and metatasks_state_cycle[cycle][columns[1]]: column = metatasks_state_string_cycle[cycle][columns[1]] if len(column) > 15: if column.split()[1] == 'SUCCEEDED': @@ -1780,8 +1796,10 @@ def main(screen): html_ptr = open(html_output_file, 'w') html_ptr.write(ccs_html) stat_update_time = str(datetime.now()).rsplit(':', 1)[0] - html_discribe_line = f'\n\n\n\n' - html_discribe_line += f'\n\n
CollapseRefreshed: {stat_update_time}PSLOT: {PSLOT}
ROTDIR: {workflow_name}Turn Around Times
\n
\n' + html_discribe_line = f'\n\n\n\n' + html_discribe_line += f'\n\n
' + html_discribe_line += f'CollapseRefreshed: {stat_update_time}PSLOT: {PSLOT}
ROTDIR: {workflow_name}' + html_discribe_line += f'Turn Around Times
\n
\n' html_discribe_line += html_header_line html_ptr.write(html_discribe_line) html_output_firstpass = False @@ -1947,7 +1965,9 @@ def main(screen): else: pad.addstr(job_id + ' ' * (11 - len(job_id))) elif i == 3: - if meta_tasks[cycle][line_num][1] and len(metatasks_state_string_cycle[cycle][columns[1]].split()) != 1 and metatasks_state_cycle[cycle][columns[1]]: + if meta_tasks[cycle][line_num][1] \ + and len(metatasks_state_string_cycle[cycle][columns[1]].split()) != 1 \ + and metatasks_state_cycle[cycle][columns[1]]: column = metatasks_state_string_cycle[cycle][columns[1]] if red_override: the_text_color = 2 diff --git a/workflow/setup_expt.py b/workflow/setup_expt.py index 566e936d24c..e948a295285 100755 --- a/workflow/setup_expt.py +++ b/workflow/setup_expt.py @@ -7,14 +7,19 @@ import os import glob import shutil -from datetime import datetime +import warnings from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter + from hosts import Host +from pygw.yaml_file import YAMLFile +from pygw.timetools import to_datetime, to_timedelta, datetime_to_YMDH + _here = os.path.dirname(__file__) _top = os.path.abspath(os.path.join(os.path.abspath(_here), '..')) + def makedirs_if_missing(dirname): """ Creates a directory if not already present @@ -51,28 +56,139 @@ def fill_COMROT_cycled(host, inputs): Implementation of 'fill_COMROT' for cycled mode """ - idatestr = inputs.idate.strftime('%Y%m%d%H') comrot = os.path.join(inputs.comrot, inputs.pslot) - if inputs.icsdir is not None: - # Link ensemble member initial conditions - enkfdir = f'enkf{inputs.cdump}.{idatestr[:8]}/{idatestr[8:]}' - makedirs_if_missing(os.path.join(comrot, enkfdir)) - for ii in range(1, inputs.nens + 1): - makedirs_if_missing(os.path.join(comrot, enkfdir, f'mem{ii:03d}')) - os.symlink(os.path.join(inputs.icsdir, idatestr, f'C{inputs.resens}', f'mem{ii:03d}', 'RESTART'), - os.path.join(comrot, enkfdir, f'mem{ii:03d}', 'RESTART')) + do_ocean = do_ice = do_med = False + + if inputs.app in ['S2S', 
'S2SW']: + do_ocean = do_ice = do_med = True + + if inputs.icsdir is None: + warnings.warn("User did not provide '--icsdir' to stage initial conditions") + return - # Link deterministic initial conditions + rdatestr = datetime_to_YMDH(inputs.idate - to_timedelta('T06H')) + idatestr = datetime_to_YMDH(inputs.idate) + + if os.path.isdir(os.path.join(inputs.icsdir, f'{inputs.cdump}.{rdatestr[:8]}', rdatestr[8:], 'model_data', 'atmos')): + flat_structure = False + else: + flat_structure = True + + # Destination always uses the new COM structure + # These should match the templates defined in config.com + if inputs.start in ['warm']: + dst_atm_dir = os.path.join('model_data', 'atmos', 'restart') + dst_med_dir = os.path.join('model_data', 'med', 'restart') + else: + dst_atm_dir = os.path.join('model_data', 'atmos', 'input') + dst_med_dir = '' # no mediator files for a "cold start" + do_med = False + dst_ocn_rst_dir = os.path.join('model_data', 'ocean', 'restart') + dst_ocn_anl_dir = os.path.join('analysis', 'ocean') + dst_ice_dir = os.path.join('model_data', 'ice', 'restart') + dst_atm_anl_dir = os.path.join('analysis', 'atmos') + + if flat_structure: + # ICs are in the old flat COM structure + if inputs.start in ['warm']: # This is warm start experiment + src_atm_dir = os.path.join('atmos', 'RESTART') + src_med_dir = os.path.join('med', 'RESTART') + elif inputs.start in ['cold']: # This is a cold start experiment + src_atm_dir = os.path.join('atmos', 'INPUT') + src_med_dir = '' # no mediator files for a "cold start" + do_med = False + # ocean and ice have the same filenames for warm and cold + src_ocn_rst_dir = os.path.join('ocean', 'RESTART') + src_ocn_anl_dir = 'ocean' + src_ice_dir = os.path.join('ice', 'RESTART') + src_atm_anl_dir = 'atmos' + else: + src_atm_dir = dst_atm_dir + src_med_dir = dst_med_dir + src_ocn_rst_dir = dst_ocn_rst_dir + src_ocn_anl_dir = dst_ocn_anl_dir + src_ice_dir = dst_ice_dir + src_atm_anl_dir = dst_atm_anl_dir + + def 
link_files_from_src_to_dst(src_dir, dst_dir): + files = os.listdir(src_dir) + for fname in files: + os.symlink(os.path.join(src_dir, fname), + os.path.join(dst_dir, fname)) + return + + # Link ensemble member initial conditions + if inputs.nens > 0: + if inputs.start in ['warm']: + enkfdir = f'enkf{inputs.cdump}.{rdatestr[:8]}/{rdatestr[8:]}' + elif inputs.start in ['cold']: + enkfdir = f'enkf{inputs.cdump}.{idatestr[:8]}/{idatestr[8:]}' + + for ii in range(1, inputs.nens + 1): + memdir = f'mem{ii:03d}' + # Link atmospheric files + dst_dir = os.path.join(comrot, enkfdir, memdir, dst_atm_dir) + src_dir = os.path.join(inputs.icsdir, enkfdir, memdir, src_atm_dir) + makedirs_if_missing(dst_dir) + link_files_from_src_to_dst(src_dir, dst_dir) + # ocean, ice, etc. TBD ... + + # Link deterministic initial conditions + + # Link atmospheric files + if inputs.start in ['warm']: + detdir = f'{inputs.cdump}.{rdatestr[:8]}/{rdatestr[8:]}' + elif inputs.start in ['cold']: detdir = f'{inputs.cdump}.{idatestr[:8]}/{idatestr[8:]}' - makedirs_if_missing(os.path.join(comrot, detdir)) - os.symlink(os.path.join(inputs.icsdir, idatestr, f'C{inputs.resdet}', 'control', 'RESTART'), - os.path.join(comrot, detdir, 'RESTART')) - # Link bias correction and radiance diagnostics files - for fname in ['abias', 'abias_pc', 'abias_air', 'radstat']: - os.symlink(os.path.join(inputs.icsdir, idatestr, f'{inputs.cdump}.t{idatestr[8:]}z.{fname}'), - os.path.join(comrot, detdir, f'{inputs.cdump}.t{idatestr[8:]}z.{fname}')) + dst_dir = os.path.join(comrot, detdir, dst_atm_dir) + src_dir = os.path.join(inputs.icsdir, detdir, src_atm_dir) + makedirs_if_missing(dst_dir) + link_files_from_src_to_dst(src_dir, dst_dir) + + # Link ocean files + if do_ocean: + detdir = f'{inputs.cdump}.{rdatestr[:8]}/{rdatestr[8:]}' + dst_dir = os.path.join(comrot, detdir, dst_ocn_rst_dir) + src_dir = os.path.join(inputs.icsdir, detdir, src_ocn_rst_dir) + makedirs_if_missing(dst_dir) + link_files_from_src_to_dst(src_dir, dst_dir) 
+
+        # First 1/2 cycle needs a MOM6 increment
+        incdir = f'{inputs.cdump}.{idatestr[:8]}/{idatestr[8:]}'
+        incfile = f'{inputs.cdump}.t{idatestr[8:]}z.ocninc.nc'
+        src_file = os.path.join(inputs.icsdir, incdir, src_ocn_anl_dir, incfile)
+        dst_file = os.path.join(comrot, incdir, dst_ocn_anl_dir, incfile)
+        makedirs_if_missing(os.path.join(comrot, incdir, dst_ocn_anl_dir))
+        os.symlink(src_file, dst_file)
+
+    # Link ice files
+    if do_ice:
+        detdir = f'{inputs.cdump}.{rdatestr[:8]}/{rdatestr[8:]}'
+        dst_dir = os.path.join(comrot, detdir, dst_ice_dir)
+        src_dir = os.path.join(inputs.icsdir, detdir, src_ice_dir)
+        makedirs_if_missing(dst_dir)
+        link_files_from_src_to_dst(src_dir, dst_dir)
+
+    # Link mediator files
+    if do_med:
+        detdir = f'{inputs.cdump}.{rdatestr[:8]}/{rdatestr[8:]}'
+        dst_dir = os.path.join(comrot, detdir, dst_med_dir)
+        src_dir = os.path.join(inputs.icsdir, detdir, src_med_dir)
+        makedirs_if_missing(dst_dir)
+        link_files_from_src_to_dst(src_dir, dst_dir)
+
+    # Link bias correction and radiance diagnostics files
+    detdir = f'{inputs.cdump}.{idatestr[:8]}/{idatestr[8:]}'
+    src_dir = os.path.join(inputs.icsdir, detdir, src_atm_anl_dir)
+    dst_dir = os.path.join(comrot, detdir, dst_atm_anl_dir)
+    makedirs_if_missing(dst_dir)
+    for ftype in ['abias', 'abias_pc', 'abias_air', 'radstat']:
+        fname = f'{inputs.cdump}.t{idatestr[8:]}z.{ftype}'
+        src_file = os.path.join(src_dir, fname)
+        if os.path.exists(src_file):
+            os.symlink(src_file, os.path.join(dst_dir, fname))
     return
@@ -81,6 +197,7 @@ def fill_COMROT_forecasts(host, inputs):
     """
     Implementation of 'fill_COMROT' for forecast-only mode
     """
+    print('forecast-only mode treats ICs differently and cannot be staged here')
     return
@@ -108,76 +225,102 @@ def fill_EXPDIR(inputs):
     return
+def update_configs(host, inputs):
+
+    # First update config.base
+    edit_baseconfig(host, inputs)
+
+    yaml_path = inputs.yaml
+    yaml_dict = YAMLFile(path=yaml_path)
+
+    # loop over other configs and update them
+    for cfg in yaml_dict.keys():
+        cfg_file = f'{inputs.expdir}/{inputs.pslot}/config.{cfg}'
+        cfg_dict = get_template_dict(yaml_dict[cfg])
+        edit_config(cfg_file, cfg_file, cfg_dict)
+
+    return
+
+
 def edit_baseconfig(host, inputs):
     """
     Parses and populates the templated `config.base.emc.dyn` to `config.base`
     """
     tmpl_dict = {
-        "@MACHINE@": host.machine.upper(),
+        "@HOMEgfs@": _top,
+        "@MACHINE@": host.machine.upper()}
+
+    # Replace host related items
+    extend_dict = get_template_dict(host.info)
+    tmpl_dict = dict(tmpl_dict, **extend_dict)
+
+    extend_dict = dict()
+    extend_dict = {
         "@PSLOT@": inputs.pslot,
-        "@SDATE@": inputs.idate.strftime('%Y%m%d%H'),
-        "@EDATE@": inputs.edate.strftime('%Y%m%d%H'),
+        "@SDATE@": datetime_to_YMDH(inputs.idate),
+        "@EDATE@": datetime_to_YMDH(inputs.edate),
         "@CASECTL@": f'C{inputs.resdet}',
-        "@HOMEgfs@": _top,
-        "@BASE_GIT@": host.info["base_git"],
-        "@DMPDIR@": host.info["dmpdir"],
-        "@NWPROD@": host.info["nwprod"],
-        "@COMROOT@": host.info["comroot"],
-        "@HOMEDIR@": host.info["homedir"],
         "@EXPDIR@": inputs.expdir,
         "@ROTDIR@": inputs.comrot,
-        "@ICSDIR@": inputs.icsdir,
-        "@STMP@": host.info["stmp"],
-        "@PTMP@": host.info["ptmp"],
-        "@NOSCRUB@": host.info["noscrub"],
-        "@ACCOUNT@": host.info["account"],
-        "@QUEUE@": host.info["queue"],
-        "@QUEUE_SERVICE@": host.info["queue_service"],
-        "@PARTITION_BATCH@": host.info["partition_batch"],
         "@EXP_WARM_START@": inputs.warm_start,
         "@MODE@": inputs.mode,
-        "@CHGRP_RSTPROD@": host.info["chgrp_rstprod"],
-        "@CHGRP_CMD@": host.info["chgrp_cmd"],
-        "@HPSSARCH@": host.info["hpssarch"],
-        "@LOCALARCH@": host.info["localarch"],
-        "@ATARDIR@": host.info["atardir"],
         "@gfs_cyc@": inputs.gfs_cyc,
-        "@APP@": inputs.app,
+        "@APP@": inputs.app
     }
+    tmpl_dict = dict(tmpl_dict, **extend_dict)
     extend_dict = dict()
     if inputs.mode in ['cycled']:
         extend_dict = {
             "@CASEENS@": f'C{inputs.resens}',
             "@NMEM_ENKF@": inputs.nens,
+            "@DOHYBVAR@": "YES" if inputs.nens > 0 else "NO",
         }
         tmpl_dict = dict(tmpl_dict, **extend_dict)
-    # Open and read the templated config.base.emc.dyn
-    base_tmpl = f'{inputs.configdir}/config.base.emc.dyn'
-    with open(base_tmpl, 'rt') as fi:
-        basestr = fi.read()
+    # All apps and modes now use the same physics and CCPP suite by default
+    extend_dict = {"@CCPP_SUITE@": "FV3_GFS_v17_p8", "@IMP_PHYSICS@": 8}
+    tmpl_dict = dict(tmpl_dict, **extend_dict)
+
+    base_input = f'{inputs.configdir}/config.base.emc.dyn'
+    base_output = f'{inputs.expdir}/{inputs.pslot}/config.base'
+    edit_config(base_input, base_output, tmpl_dict)
-    for key, val in tmpl_dict.items():
-        basestr = basestr.replace(key, str(val))
+    return
-    # Write and clobber the experiment config.base
-    base_config = f'{inputs.expdir}/{inputs.pslot}/config.base'
-    if os.path.exists(base_config):
-        os.unlink(base_config)
-    with open(base_config, 'wt') as fo:
-        fo.write(basestr)
+def edit_config(input_config, output_config, config_dict):
-    print('')
-    print(f'EDITED: {base_config} as per user input.')
-    print(f'DEFAULT: {base_tmpl} is for reference only.')
-    print('')
+    # Read input config
+    with open(input_config, 'rt') as fi:
+        config_str = fi.read()
+
+    # Substitute from config_dict
+    for key, val in config_dict.items():
+        config_str = config_str.replace(key, str(val))
+
+    # Ensure no output_config file exists
+    if os.path.exists(output_config):
+        os.unlink(output_config)
+
+    # Write output config
+    with open(output_config, 'wt') as fo:
+        fo.write(config_str)
+
+    print(f'EDITED: {output_config} as per user input.')
     return
+def get_template_dict(input_dict):
+    output_dict = dict()
+    for key, value in input_dict.items():
+        output_dict[f'@{key}@'] = value
+
+    return output_dict
+
+
 def input_args():
     """
     Method to collect user arguments for `setup_expt.py`
@@ -187,7 +330,6 @@ def input_args():
     Setup files and directories to start a GFS parallel.\n
     Create EXPDIR, copy config files.\n
     Create COMROT experiment directory structure,
-    link initial condition files from $ICSDIR to $COMROT
     """
     parser = ArgumentParser(description=description,
@@ -210,9 +352,9 @@ def input_args():
                       type=str, required=False, default=os.getenv('HOME'))
     subp.add_argument('--expdir', help='full path to EXPDIR',
                       type=str, required=False, default=os.getenv('HOME'))
-    subp.add_argument('--idate', help='starting date of experiment, initial conditions must exist!', required=True, type=lambda dd: datetime.strptime(dd, '%Y%m%d%H'))
-    subp.add_argument('--edate', help='end date experiment', required=True, type=lambda dd: datetime.strptime(dd, '%Y%m%d%H'))
-    subp.add_argument('--icsdir', help='full path to initial condition directory', type=str, required=False, default=None)
+    subp.add_argument('--idate', help='starting date of experiment, initial conditions must exist!',
+                      required=True, type=lambda dd: to_datetime(dd))
+    subp.add_argument('--edate', help='end date experiment', required=True, type=lambda dd: to_datetime(dd))
     subp.add_argument('--configdir', help='full path to directory containing the config files',
                       type=str, required=False, default=os.path.join(_top, 'parm/config'))
     subp.add_argument('--cdump', help='CDUMP to start the experiment',
@@ -222,28 +364,32 @@ def input_args():
     subp.add_argument('--start', help='restart mode: warm or cold', type=str,
                       choices=['warm', 'cold'], required=False, default='cold')
+    subp.add_argument('--yaml', help='Defaults to substitute from', type=str,
+                      required=False, default=os.path.join(_top, 'parm/config/yaml/defaults.yaml'))
+
+    ufs_apps = ['ATM', 'ATMA', 'ATMW', 'S2S', 'S2SW']
+
     # cycled mode additional arguments
+    cycled.add_argument('--icsdir', help='full path to initial condition directory', type=str, required=False, default=None)
     cycled.add_argument('--resens', help='resolution of the ensemble model forecast',
                         type=int, required=False, default=192)
     cycled.add_argument('--nens', help='number of ensemble members',
                         type=int, required=False, default=20)
     cycled.add_argument('--app', help='UFS application', type=str,
-                        choices=['ATM', 'ATMW'], required=False, default='ATM')
+                        choices=ufs_apps, required=False, default='ATM')
     # forecast only mode additional arguments
-    forecasts.add_argument('--app', help='UFS application', type=str, choices=[
-        'ATM', 'ATMA', 'ATMW', 'S2S', 'S2SW', 'S2SWA', 'NG-GODAS'], required=False, default='ATM')
+    forecasts.add_argument('--app', help='UFS application', type=str,
+                           choices=ufs_apps + ['S2SWA'], required=False, default='ATM')
     args = parser.parse_args()
-    if args.app in ['S2S', 'S2SW'] and args.icsdir is None:
-        raise SyntaxError("An IC directory must be specified with --icsdir when running the S2S or S2SW app")
-
     # Add an entry for warm_start = .true. or .false.
-    if args.start == "warm":
+    if args.start in ['warm']:
         args.warm_start = ".true."
-    else:
+    elif args.start in ['cold']:
         args.warm_start = ".false."
+
     return args
@@ -266,11 +412,21 @@ def query_and_clean(dirname):
     return create_dir
+def validate_user_request(host, inputs):
+    expt_res = f'C{inputs.resdet}'
+    supp_res = host.info['SUPPORTED_RESOLUTIONS']
+    machine = host.machine
+    if expt_res not in supp_res:
+        raise NotImplementedError(f"Supported resolutions on {machine} are:\n{', '.join(supp_res)}")
+
+
 if __name__ == '__main__':
     user_inputs = input_args()
     host = Host()
+    validate_user_request(host, user_inputs)
+
     comrot = os.path.join(user_inputs.comrot, user_inputs.pslot)
     expdir = os.path.join(user_inputs.expdir, user_inputs.pslot)
@@ -284,4 +440,4 @@ def query_and_clean(dirname):
     if create_expdir:
         makedirs_if_missing(expdir)
         fill_EXPDIR(user_inputs)
-        edit_baseconfig(host, user_inputs)
+        update_configs(host, user_inputs)
diff --git a/workflow/setup_xml.py b/workflow/setup_xml.py
index 6e63cade118..d43efe21e1f 100755
--- a/workflow/setup_xml.py
+++ b/workflow/setup_xml.py
@@ -6,9 +6,9 @@
 import os
 from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter
-from configuration import Configuration
 from applications import AppConfig
 from rocoto.workflow_xml import RocotoXML
+from pygw.configuration import Configuration
 def input_args():
@@ -28,6 +28,15 @@ def input_args():
     parser.add_argument('expdir', help='full path to experiment directory containing config files',
                         type=str, default=os.environ['PWD'])
+    parser.add_argument('--maxtries', help='maximum number of retries', type=int,
+                        default=2, required=False)
+    parser.add_argument('--cyclethrottle', help='maximum number of concurrent cycles', type=int,
+                        default=3, required=False)
+    parser.add_argument('--taskthrottle', help='maximum number of concurrent tasks', type=int,
+                        default=25, required=False)
+    parser.add_argument('--verbosity', help='verbosity level of Rocoto', type=int,
+                        default=10, required=False)
+
     args = parser.parse_args()
     return args
@@ -45,6 +54,10 @@ def check_expdir(cmd_expdir, cfg_expdir):
 if __name__ == '__main__':
     user_inputs = input_args()
+    rocoto_param_dict = {'maxtries': user_inputs.maxtries,
+                         'cyclethrottle': user_inputs.cyclethrottle,
+                         'taskthrottle': user_inputs.taskthrottle,
+                         'verbosity': user_inputs.verbosity}
     cfg = Configuration(user_inputs.expdir)
@@ -54,5 +67,5 @@ def check_expdir(cmd_expdir, cfg_expdir):
     app_config = AppConfig(cfg)
     # Create Rocoto Tasks and Assemble them into an XML
-    xml = RocotoXML(app_config)
+    xml = RocotoXML(app_config, rocoto_param_dict)
     xml.write()
diff --git a/workflow/test_configuration.py b/workflow/test_configuration.py
index f210ceefa4e..5c59fd35bf7 100644
--- a/workflow/test_configuration.py
+++ b/workflow/test_configuration.py
@@ -1,5 +1,5 @@
 import sys
-from configuration import Configuration
+from pygw.configuration import Configuration
 expdir = sys.argv[1]
@@ -14,17 +14,19 @@
 print(f'config.base: {cfg.find_config("config.base")}')
-print('*'*80)
+print('*' * 80)
 print('config.base ...')
 base = cfg.parse_config('config.base')
 cfg.print_config('config.base')
+print(type(base))
+print(base.HOMEgfs)
-print('*'*80)
+print('*' * 80)
 print('config.anal...')
 cfg.print_config(['config.base', 'config.anal'])
-print('*'*80)
+print('*' * 80)
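The new `setup_xml.py` options above all carry defaults, so existing invocations keep working unchanged. A small self-contained sketch of how the parsed values feed `rocoto_param_dict` (the `RocotoXML` call itself is omitted; the simulated command line is illustrative):

```python
from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter

parser = ArgumentParser(description='sketch of the added setup_xml.py options',
                        formatter_class=ArgumentDefaultsHelpFormatter)
parser.add_argument('--maxtries', help='maximum number of retries', type=int, default=2)
parser.add_argument('--cyclethrottle', help='maximum number of concurrent cycles', type=int, default=3)
parser.add_argument('--taskthrottle', help='maximum number of concurrent tasks', type=int, default=25)
parser.add_argument('--verbosity', help='verbosity level of Rocoto', type=int, default=10)

# Simulate `setup_xml.py --taskthrottle 50`; the other options fall back to defaults
args = parser.parse_args(['--taskthrottle', '50'])
rocoto_param_dict = {'maxtries': args.maxtries,
                     'cyclethrottle': args.cyclethrottle,
                     'taskthrottle': args.taskthrottle,
                     'verbosity': args.verbosity}
print(rocoto_param_dict)
```

Passing the dict into the `RocotoXML` constructor, rather than having it re-read argv, keeps the XML builder testable with plain dictionaries.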
 print('config.efcs ...')
 configs = ['config.base', 'config.fcst', 'config.efcs']
 cfg.print_config(configs)
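Stepping back to the `setup_expt.py` changes earlier in this patch: the refactored `get_template_dict`/`edit_config` pair boils down to wrapping dictionary keys as `@KEY@` placeholders and running a plain `str.replace` loop over the config text. A condensed sketch of that mechanism (the template contents below are invented, not taken from `config.base.emc.dyn`):

```python
def get_template_dict(input_dict):
    # Wrap each key as an @KEY@ placeholder, as setup_expt.py does
    return {f'@{key}@': value for key, value in input_dict.items()}

def substitute(config_str, config_dict):
    # The same naive replace loop edit_config applies to the file contents
    for key, val in config_dict.items():
        config_str = config_str.replace(key, str(val))
    return config_str

# Hypothetical two-line template standing in for a real config file
template = 'export PSLOT="@PSLOT@"\nexport NMEM_ENKF=@NMEM_ENKF@\n'
rendered = substitute(template, get_template_dict({'PSLOT': 'test_expt', 'NMEM_ENKF': 20}))
print(rendered)
```

Because the substitution is plain text replacement with no escaping, any `@KEY@` token left unmatched simply survives verbatim in the output config, which is how partially-templated files pass through unharmed.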