
The wrong ICs are used in p7a (and in develop/p7b branch) #624

Closed
JessicaMeixner-NOAA opened this issue Jun 7, 2021 · 15 comments · Fixed by #639
Assignees
Labels
bug Something isn't working

Comments

@JessicaMeixner-NOAA
Collaborator

Description

The soil variables in the benchmark atm ICs were not updated with the spin-up from NOAH-MP; they were taken directly from GEFS.

Additional context

This will need to be updated in the release/P7a, release/P7b and the develop branch.

I can either point the ATM at a new higher-level BM_IC-YYYYMMDD directory that has the updated ICs, or I can create an entirely new input-data-YYYYMMDD to update the corresponding sfc files in FV3_input_frac/BM7_IC.
If there's a preference, please let me know.

Output

The following diff output shows that we're using the wrong ones:
diff -r -q /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100 /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_ctrl.nc
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_data.tile1.nc
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_data.tile2.nc
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_data.tile3.nc
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_data.tile4.nc
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_data.tile5.nc
Only in /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT: gfs_data.tile6.nc
Files /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100/sfc_data.tile1.nc and /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT/sfc_data.tile1.nc differ
Files /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100/sfc_data.tile2.nc and /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT/sfc_data.tile2.nc differ
Files /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100/sfc_data.tile3.nc and /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT/sfc_data.tile3.nc differ
Files /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100/sfc_data.tile4.nc and /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT/sfc_data.tile4.nc differ
Files /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100/sfc_data.tile5.nc and /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT/sfc_data.tile5.nc differ
Files /scratch1/NCEPDEV/stmp2/Michael.Barlage/spinup/mini7a/jarvis/2013040100/sfc_data.tile6.nc and /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2013040100/gfs/C384_L127/INPUT/sfc_data.tile6.nc differ
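The `diff -r -q` comparison above can be reproduced with Python's standard-library `filecmp` module. A minimal sketch with two throwaway stand-in directories (the real paths above live on Hera and are not reproduced here):

```python
import filecmp
import os
import tempfile

# Build two small stand-in directories; the real comparison runs
# against the spin-up and RT input directories on Hera.
spinup = tempfile.mkdtemp()
rt_input = tempfile.mkdtemp()

for name in ("sfc_data.tile1.nc", "sfc_data.tile2.nc"):
    with open(os.path.join(spinup, name), "w") as f:
        f.write("spinup soil fields")
    with open(os.path.join(rt_input, name), "w") as f:
        f.write("gefs soil fields")  # different content, mirroring the issue

# gfs_* files exist only on the RT side, as in the diff output above.
with open(os.path.join(rt_input, "gfs_ctrl.nc"), "w") as f:
    f.write("ctrl")

cmp = filecmp.dircmp(spinup, rt_input)
print("Only in RT input:", cmp.right_only)
print("Files that differ:", sorted(cmp.diff_files))
```

Note that `dircmp` does a shallow comparison (stat signatures) by default; for NetCDF files of identical size a byte-level or variable-level comparison such as nccmp (below) is the more reliable check.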

Which variables are different:
$ nccmp -dsSqf 2012010100/sfc_data.tile1.nc /scratch1/NCEPDEV/nems/emc.nemspara/RT/NEMSfv3gfs/input-data-20210528/FV3_input_frac/BM7_IC/2012010100/gfs/C384_L127/INPUT/sfc_data.tile1.nc
Variable Group Count Sum AbsSum Min Max Range Mean StdDev
slc / 233372 11798.2 15751.6 -0.244231 0.289306 0.533537 0.0505552 0.0726616
smc / 233372 11793.3 15752.2 -0.244231 0.289306 0.533537 0.0505345 0.0726831
stc / 233372 172438 752991 -13.5334 10.5819 24.1153 0.738899 3.65829
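The summary columns nccmp prints (Count, Sum, AbsSum, Min, Max, Range, Mean, StdDev) are elementwise statistics over the per-point differences between the two files. A pure-Python sketch over a small synthetic difference array, assuming a population standard deviation (nccmp's exact convention may differ):

```python
import math

# Synthetic per-point differences (stand-in for slc/smc/stc deltas).
diffs = [-0.2, 0.1, 0.3, -0.05, 0.15]

count = len(diffs)
total = sum(diffs)
abs_sum = sum(abs(d) for d in diffs)
lo, hi = min(diffs), max(diffs)
mean = total / count
# Population standard deviation of the differences.
std = math.sqrt(sum((d - mean) ** 2 for d in diffs) / count)

print(f"Count={count} Sum={total:.3g} AbsSum={abs_sum:.3g} "
      f"Min={lo} Max={hi} Range={hi - lo} Mean={mean:.3g} StdDev={std:.3g}")
```

Reading the real output this way: all 233372 land points differ in slc, smc, and stc, with soil temperature (stc) off by up to ~13.5 K, which is consistent with the sfc files never having received the NOAH-MP spin-up.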

@JessicaMeixner-NOAA JessicaMeixner-NOAA added the bug Something isn't working label Jun 7, 2021
@jiandewang
Collaborator

Creating an entirely new input-data-YYYYMMDD would be a clean way; that's what I prefer.

@junwang-noaa
Collaborator

junwang-noaa commented Jun 8, 2021 via email

@JessicaMeixner-NOAA
Collaborator Author

I will be creating input-data-20210608 in the Hera emc.nemspara area, and the PRs will follow once I know all 8 cases can at least start.

@junwang-noaa
Collaborator

@DeniseWorthen is cleaning up the BM ICs, let's check with her on the new input directories.

@JessicaMeixner-NOAA in P7a/P7b the wave IC files (restart files from CFSR) are not consistent with the atmosphere ICs. Do we want to create the wave ICs from the new atmos ICs for P7a/P7b?

@JessicaMeixner-NOAA
Collaborator Author

@junwang-noaa The plan is to change the wave initial conditions in P7c.

@DeniseWorthen is there a timeline for cleaning up the BM ICs? If we want to change the structure of the BM ICs for this branch and the p7b branch, I need to communicate a timeline to @AvichalMehra-NOAA.

@JessicaMeixner-NOAA
Collaborator Author

Also, I'm leaving my 20-minute tests running to make sure everything will start up with these new sfc files. They are all in the queue; none have started yet. If others think this is a waste of resources, I can stop, but I still feel this is an important check.

@DeniseWorthen
Collaborator

@JessicaMeixner-NOAA It should be by the end of the week.

A couple of questions I'm confused about:

  1. You've created new W3 ICs that match the new atm ICs. But those restart files will only work with the updated (PR Update WW3 with fb_coupling_fields; add cice fix for zlvs #573) W3? Is that right?

  2. For P7a/P7b, we need W3 ICs for the new atm ICs which do not contain the two extra fields, since those two releases are not using the updated W3. Is that right?

  3. There's some confusion about actual land ICs. The location that these were pulled from was documented in the original PR Add p7a test using tiled FV3 Fix files, P7a ICs and NoahMP; add open-water normalization in CMEPS (#549) #585. Apparently something changed or the information we were provided at the time was incorrect.

@JessicaMeixner-NOAA
Collaborator Author

@DeniseWorthen Thanks for the timeline update.

Answers to your questions:

  1. Yes, the new ICs will only work with the updated PR Update WW3 with fb_coupling_fields; add cice fix for zlvs #573. This PR optionally adds the possibility of extra input, which we do not use. There were changes in the restart file that do not affect our answers but do require a new WW3 IC/restart file. Since we had to change them, we took the opportunity to switch the WW3 IC forcing.

  2. We do not need new ICs for WW3 that match the new atm data source for P7a and P7b. The plan is to switch the source of the ICs with P7c. Yes, this will cause answer changes, but @AvichalMehra-NOAA confirmed to me yesterday that this is the plan and it's okay to not update p7a and p7b WW3 ICs.

  3. The information provided originally with PR Add p7a test using tiled FV3 Fix files, P7a ICs and NoahMP; add open-water normalization in CMEPS (#549) #585 was inadvertently incorrect. No one discovered until yesterday that the IC sfc files did not have the NOAH-MP soil fields as originally intended.

@junwang-noaa
Collaborator

Will we rerun P7a? What is the plan for the P7a mini benchmark?

@JessicaMeixner-NOAA
Collaborator Author

P7a will be re-run with the correct ICs once we have a commit hash in the release branch we can re-run it from, which at this point looks like the end of this week or early next week.

@AvichalMehra-NOAA

AvichalMehra-NOAA commented Jun 8, 2021 via email

@junwang-noaa
Collaborator

@AvichalMehra-NOAA We discussed this; we will merge P7a PR #625 into the P7a branch so that it won't delay your schedule for rerunning the P7a experiment. After Denise cleans up the BM_IC, we will use that in P7b and the develop branch.

@AvichalMehra-NOAA

AvichalMehra-NOAA commented Jun 8, 2021 via email

@DeniseWorthen
Collaborator

@JessicaMeixner-NOAA Please confirm a Hera location where I can find the W3 ICs for each benchmark date which can be used after PR #573 is merged. I have what I copied from Gaea, but I'd like to compare just to be sure. Thanks.

@JessicaMeixner-NOAA
Collaborator Author

@DeniseWorthen here's the WW3 IC locations by WW3 commit date:

Between 2020-09-25 and 2021-05-27 (so for P6, P7a, P7b) the ICs can be found at:
/scratch2/NCEPDEV/climate/climpara/S2S/IC/CFSRwave20200925/$YYYYMMDDHH/wav/gwes_30m

Between 2021-05-27 and 2021-06-08 the ICs are what you have on gaea and can also be found on hera here: /scratch2/NCEPDEV/climate/Jessica.Meixner/WW3ICGEFS/WW3_20210527/RestartFiles

For 2021-06-08 and after, I'm creating them now, they will be here:
/scratch2/NCEPDEV/climate/Jessica.Meixner/WW3ICGEFS/RestartFiles

I'm not sure if we'll update to the 2021-06-08 version of WW3 (which would likely be the simplest) in PR #573 or leave it as originally intended and use the "between 2021-05-27 and 2021-06-08" version.

Please let me know if I can provide more details.
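The date-range-to-location mapping above could be expressed as a simple lookup. This sketch is purely illustrative: the function name and structure are hypothetical, only the paths come from the comment above.

```python
from datetime import date

# Hypothetical lookup of the WW3 IC directory by WW3 commit date,
# using the ranges and Hera paths listed above.
IC_LOCATIONS = [
    (date(2020, 9, 25), date(2021, 5, 27),
     "/scratch2/NCEPDEV/climate/climpara/S2S/IC/CFSRwave20200925/$YYYYMMDDHH/wav/gwes_30m"),
    (date(2021, 5, 27), date(2021, 6, 8),
     "/scratch2/NCEPDEV/climate/Jessica.Meixner/WW3ICGEFS/WW3_20210527/RestartFiles"),
    (date(2021, 6, 8), date.max,
     "/scratch2/NCEPDEV/climate/Jessica.Meixner/WW3ICGEFS/RestartFiles"),
]

def ic_path_for(commit_date):
    """Return the IC directory matching a given WW3 commit date."""
    for start, end, path in IC_LOCATIONS:
        if start <= commit_date < end:
            return path
    raise ValueError(f"no IC location for {commit_date}")

print(ic_path_for(date(2021, 6, 1)))
```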

@DeniseWorthen DeniseWorthen self-assigned this Jul 9, 2021
epic-cicd-jenkins pushed a commit that referenced this issue Apr 17, 2023
…#624)

Based on discussion in last week's code management meeting, it was agreed that the default values for the atmospheric timestep DT_ATMOS are too low, and for most cases are very wasteful of core hours. This change increases the default timestep for the following pre-defined domains:

* RRFS_CONUS_25km, RRFS_CONUScompact_25km: 40 --> 180
* RRFS_CONUS_13km, RRFS_CONUScompact_13km: 45 --> 75
* RRFS_NA_13km: 50 --> 75
* RRFS_CONUScompact_3km, RRFS_SUBCONUS_3km, SUBCONUS_Ind_3km: 40 --> 36
* CONUS_3km_GFDLgrid: 18 --> 36

There is a known limitation for certain CCPP suites designed for high-resolution cases (per @JeffBeck-NOAA, these are FV3_RRFS_v1beta, FV3_WoFS_v0, and FV3_HRRR); these suites may see failures if DT_ATMOS is too high. For those cases, the default DT_ATMOS is set to 40; however, users can still override this value in their config file.

NOTE: This change will affect results for users who are using the affected domains and have not manually specified DT_ATMOS. This should be reflected in the release notes for the next release.
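The default-selection logic described in that commit message could be sketched as a simple lookup. The function and dictionary names here are hypothetical and do not match the actual workflow code; the domain names, timestep values, and suite exception come from the text above.

```python
# Hypothetical sketch of the default-DT_ATMOS logic described above.
NEW_DT_ATMOS = {
    "RRFS_CONUS_25km": 180, "RRFS_CONUScompact_25km": 180,
    "RRFS_CONUS_13km": 75, "RRFS_CONUScompact_13km": 75,
    "RRFS_NA_13km": 75,
    "RRFS_CONUScompact_3km": 36, "RRFS_SUBCONUS_3km": 36,
    "SUBCONUS_Ind_3km": 36,
    "CONUS_3km_GFDLgrid": 36,
}

# Suites that may fail with a large timestep keep the conservative default.
HIGH_RES_SUITES = {"FV3_RRFS_v1beta", "FV3_WoFS_v0", "FV3_HRRR"}

def default_dt_atmos(domain, ccpp_suite, user_value=None):
    """User override wins, then the high-res suite cap, then the domain default."""
    if user_value is not None:
        return user_value
    if ccpp_suite in HIGH_RES_SUITES:
        return 40
    # Fallback of 40 for unlisted domains is an assumption of this sketch.
    return NEW_DT_ATMOS.get(domain, 40)

print(default_dt_atmos("RRFS_CONUS_25km", "FV3_GFS_v16"))  # 180
print(default_dt_atmos("RRFS_CONUS_25km", "FV3_HRRR"))     # 40
print(default_dt_atmos("RRFS_CONUS_25km", "FV3_HRRR", 30)) # 30
```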