UGWP Version 0 #298

Merged: 9 commits merged into NCAR:gmtb/develop on Aug 27, 2019

Conversation

bluefinweiwei
Collaborator

new file:   physics/GFS_GWD_generic.F90 (same content as in gwdps.f but with a generic name because both gwdps and ugwp use the _pre part)
new file:   physics/cires_ugwp.F90 (CCPP-compliant UGWP)
new file:   physics/cires_ugwp_post.F90 (CCPP-compliant UGWP)
new file:   physics/cires_ugwp_initialize.F90 (VAY's original code)
new file:   physics/cires_ugwp_module.F90 (VAY's original code)
new file:   physics/cires_ugwp_solvers.F90 (VAY's original code)
new file:   physics/cires_ugwp_triggers.F90 (VAY's original code)
new file:   physics/cires_ugwp_utils.F90 (VAY's original code)
new file:   physics/cires_vert_lsatdis.F90 (VAY's original code)
new file:   physics/cires_vert_orodis.F90 (VAY's original code)
new file:   physics/cires_vert_wmsdis.F90 (VAY's original code)
new file:   physics/cires_orowam2017.f (VAY's original code)
new file:   physics/ugwp_driver_v0.f (VAY's original code)

Tests were done with the GFS 2017 suite, which is the regression test (RT) configuration that UGWP uses with IPD; the PR passes the b4b RT on Cheyenne. Once the PR has been reviewed and is ready, it can be committed, which will bring UGWP v0 into existence. The code also contains a UGWP v1; a guard has been added in cires_ugwp.F90 because v1 will not be made CCPP-compliant at this time. Roles and responsibilities for it will be discussed later with EMC and Valery.
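For orientation, here is a minimal sketch of the kind of guard described above; the module, flag name, and message are assumptions for illustration and not the actual cires_ugwp.F90 code:

    ! Hypothetical sketch only: refuse the not-yet-CCPP-compliant UGWP v1 at
    ! init time via the standard CCPP errmsg/errflg error handling.
    module ugwp_v1_guard_sketch
       implicit none
    contains
       subroutine ugwp_version_check(knob_ugwp_version, errmsg, errflg)
          integer,          intent(in)  :: knob_ugwp_version  ! assumed namelist knob
          character(len=*), intent(out) :: errmsg
          integer,          intent(out) :: errflg
          errmsg = ''
          errflg = 0
          if (knob_ugwp_version /= 0) then
             errmsg = 'Only UGWP v0 is CCPP-compliant at this time'
             errflg = 1
          end if
       end subroutine ugwp_version_check
    end module ugwp_v1_guard_sketch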

@climbfuji
Collaborator

climbfuji commented Aug 21, 2019

!! subgrid scale orography including convective breaking, shear
!! breaking and the presence of critical levels.
!!
!! \section arg_table_GFS_GWD_generic_run Argument Table
Collaborator

This is a massive amount of code, more than any other interstitial scheme. Is this really "generic"? Or should this be an additional "UGWD" scheme, or a gravity wave drag mountain blocking scheme? Is this used with other GWD schemes as well? @grantfirl?

Collaborator Author

This is a complete copy of gwdps.f, which is the orographic gravity wave drag and mountain blocking scheme; @grantfirl has commented on this in GFS_physics_driver.F90 (around line 2897). It is used if the unified gravity wave physics (UGWP) is not invoked. UGWP only uses the _pre part, for the statistics of the sub-grid orography.

Collaborator

I am confused, but that's probably my lack of understanding. Isn't this the exact same code as in gwdps_pre, gwdps and gwdps_post, which are already in CCPP? Can't we use those? Are there any differences (in particular in pre or post?)

Collaborator Author

They are almost exactly the same, though not 100%: there was an issue with a continuation line that showed up as a syntax error during my runs, so I had to reformat slightly by adjusting the "&" symbols. Below is from the email by @grantfirl: "We had been separating interstitial code into "scheme-specific" interstitial code that only gets executed when a specific scheme is active, or "scheme-generic" code that gets executed as part of a suite for any scheme of that type, i.e. for any GWD scheme. When we originally wrote the interstitial schemes (I believe Gerard Ketefian did the GWD schemes), there was only one GWD scheme, so the interstitial code was scheme-specific by default. Now that there is more than one GWD scheme, we should create a new file called GFS_GWD_generic.F90. It should at least contain what is in gwdps_pre_run in gwdps.f and maybe more, depending on what we find out from VAY about the diagnostics and tendencies code in GFS_physics_driver.F90."

Collaborator

It seems to me that the entirety of the GFS_GWD_generic module in this file is not needed -- isn't it just the gwdps scheme itself?

What I meant by that email was that the file gwdps.f contains interstitial schemes (modules gwdps_pre and gwdps_post) that are no longer specific to just the gwdps scheme since the same code is called in GFS_physics_driver when UGWP is active (is this still correct? -- I haven't looked since you got the new code from EMC). These modules should be moved out into a new file called GFS_GWD_generic.F90, which you did (did you also delete them from gwdps.f?). But the meat of the old gwdps scheme, gwdps_run, should stay in gwdps.f.
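To make the intended split concrete, here is a hedged sketch (the argument list is abbreviated and partly assumed; the real GFS_GWD_generic_pre_run interface unpacks the full set of sub-grid orography statistics):

    ! Sketch only: the interstitial "pre" logic lives in physics/GFS_GWD_generic.F90,
    ! while the scheme itself, gwdps_run, stays in physics/gwdps.f.
    module GFS_GWD_generic_pre
       implicit none
    contains
       subroutine GFS_GWD_generic_pre_run(im, nmtvr, mntvar, hprime, oc, errmsg, errflg)
          use machine, only: kind_phys
          integer,              intent(in)  :: im, nmtvr
          real(kind=kind_phys), intent(in)  :: mntvar(im, nmtvr)
          real(kind=kind_phys), intent(out) :: hprime(im), oc(im)
          character(len=*),     intent(out) :: errmsg
          integer,              intent(out) :: errflg
          errmsg = ''
          errflg = 0
          ! unpack two of the sub-grid orography statistics; the actual routine
          ! also handles oa4, clx, theta, sigma, gamma, elvmax, etc.
          hprime(:) = mntvar(:,1)
          oc(:)     = mntvar(:,2)
       end subroutine GFS_GWD_generic_pre_run
    end module GFS_GWD_generic_pre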

Collaborator

It's somewhat arbitrary and we have not been consistent. I think my preference is to have one file named "[X]_generic" for both pre and post modules (if both exist; only pre in this case) rather than individual files for pre and post modules, since that one file can be a place to consolidate both pre and post code in the future and it is cleaner to have fewer files. However, others working with the CCPP have been more prolific in their files and names, so there are examples where separate "[Y]_pre" and "[Y]_post" files exist, although I haven't heard a reason why this would be preferable.

Once you're done, please remove the gwdps_pre module from gwdps.f since we do not want to duplicate code. This will also mean changing gwdps_pre to GFS_GWD_generic_pre in all SDFs that use it.

Collaborator

At this point, please no changes to any of the PRs. I am proposing several changes to make sure that we don't miss this window of getting the PRs in tomorrow or Friday. I can handle this and you can both review it. Thanks.

Collaborator Author

Thanks @grantfirl, I'll adjust the GWD_generic. @climbfuji I won't push any new changes until next week (perhaps) and thanks for the timely reminder.

Collaborator

Thanks @grantfirl, I'll adjust the GWD_generic. @climbfuji I won't push any new changes until next week (perhaps) and thanks for the timely reminder.

Just to be clear, I will make the changes as recommended by Grant and then create pull requests to update your PRs (with several other changes, including update to the latest gmtb/develop codebase for each of the PRs). Once you are both OK with the changes, you can merge them into your code (I will show you how this works tomorrow in case you need help) and kick off the full suite of regression tests. Assuming this all goes well, your PRs will be merged into gmtb/develop on Friday.

Collaborator Author

Sounds very good. Thx!

@climbfuji
Collaborator

A few more notes:

  • constants are still imported from physcons.F90 instead of being passed through the argument list
  • dimension mismatch for ak and bk in metadata tables (both in GFS_typedefs.meta and cires_ugwp.F90)
  • more than half of the intent values in cires_ugwp.F90 are set to the invalid value "none" (copy & paste from GFS_typedefs?); a sketch of valid declarations follows below
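For illustration only (the names and dimension below are assumptions; the authoritative metadata lives in GFS_typedefs.meta): ak and bk are interface-level hybrid coordinate coefficients, so in cires_ugwp.F90 one would expect something along these lines rather than intent "none":

    ! Illustrative declarations, not the actual cires_ugwp.F90 interface.
    subroutine cires_ugwp_run_sketch(levs, ak, bk)
       use machine, only: kind_phys
       integer,              intent(in) :: levs
       real(kind=kind_phys), intent(in) :: ak(levs+1), bk(levs+1)  ! interface-level hybrid coefficients
    end subroutine cires_ugwp_run_sketch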

@climbfuji
Collaborator

@grantfirl @llpcarson, can you please review these PRs? @grantfirl, please make sure that the concerns we had were addressed correctly by me, and @llpcarson, please give the overall approval (since both @climbfuji and @grantfirl had significant input into these PRs). Thank you ...

@climbfuji
Collaborator

See https://github.com/NCAR/NEMSfv3gfs/pull/218 for testing.

Collaborator

@climbfuji climbfuji left a comment

Looks good to me now. Tests are still running.

Contributor

@llpcarson llpcarson left a comment

Looks good to me

& sinlat, xlatd, taup, taud, pkdis)
!
USE MACHINE , ONLY : kind_phys
use ugwp_common , only : grav, omega2
Collaborator

This makes me uneasy. The UGWP scheme is using its own set of constants rather than the host-provided ones. Perhaps address this in a follow-up PR?

Collaborator

Well spotted, I didn't see this one. @bluefinweiwei, can you please create an issue for this in NCAR's ccpp-physics describing the problem and noting that it needs to be addressed in a follow-up PR (i.e. constants have to be passed in via cires_ugwp_{init,run,finalize} and then down to this routine). Thanks.
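A hedged sketch of the direction being requested (names and argument lists are illustrative, not the actual cires_ugwp interface): the host constants enter through the CCPP entry point and are forwarded down, instead of the lower-level routines importing them from physcons or ugwp_common.

    module cires_ugwp_constants_sketch
       use machine, only: kind_phys
       implicit none
    contains
       ! the CCPP entry point receives the host constants ...
       subroutine cires_ugwp_run_sketch(im, levs, con_g, con_omega, pkdis)
          integer,              intent(in)    :: im, levs
          real(kind=kind_phys), intent(in)    :: con_g, con_omega
          real(kind=kind_phys), intent(inout) :: pkdis(im,levs)
          ! ... and forwards them, rather than the solver doing
          ! "use ugwp_common, only : grav, omega2"
          call gw_solver_sketch(im, levs, con_g, 2.0_kind_phys*con_omega, pkdis)
       end subroutine cires_ugwp_run_sketch

       subroutine gw_solver_sketch(im, levs, grav, omega2, pkdis)
          integer,              intent(in)    :: im, levs
          real(kind=kind_phys), intent(in)    :: grav, omega2
          real(kind=kind_phys), intent(inout) :: pkdis(im,levs)
          ! the scheme body would use the dummy arguments grav/omega2 here
       end subroutine gw_solver_sketch
    end module cires_ugwp_constants_sketch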

Collaborator

I think that some of the constants are specific to the scheme, but many are duplicates of host-provided ones. Of course, changing them would change the answer...

subroutine oro_meanflow(nz, nzi, u1, v1, t1, pint, pmid,
& delp, rho, bn2, uzi, rhoi, ktur, kalp, dzi, xn, yn)

use ugwp_common , only : grav, rgrav, rdi, velmin, dw2min
Collaborator

not using host-provided constants again

Collaborator

@bluefinweiwei mention this place as well please, and others that Grant is finding.


! end module oro_state

module ugwp_common
Collaborator

here is where UGWP constants are defined - lots of duplicates here

!
subroutine init_conv_gws(nwaves, nazdir, nstoch, effac, &
lonr, kxw, cgwf)
use ugwp_common, only : pi2, arad
Collaborator

not sure if these are duplicates to host-provided constants or not

real, allocatable :: xaz_fjet(:), yaz_fjet(:)
contains
subroutine init_fjet_gws(nwaves, nazdir, nstoch, effac, lonr, kxw)
use ugwp_common, only : pi2, arad
Collaborator

ditto

!
subroutine init_okw_gws(nwaves, nazdir, nstoch, effac, lonr, kxw)

use ugwp_common, only : pi2, arad
Collaborator

ditto

real, parameter :: maxdudt = 250.e-5

real, parameter :: hpscale= 7000., rhp2 = 0.5/hpscale
real, parameter :: omega2 = 2.*6.28/86400
Collaborator

omega2 is already in UGWP's own set of constants and is redefined here (omega is in the host model's physcons too).
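As a side note, 2.*6.28/86400 is about 1.4537e-4 1/s, which differs slightly from 2*omega = 1.4584e-4 1/s with the standard omega = 7.2921e-5 1/s. A minimal sketch of the preferred form, assuming the rotation rate is taken from the host (the con_omega value is written out here only so the snippet is self-contained):

    module omega2_sketch
       use machine, only: kind_phys
       implicit none
       real(kind=kind_phys), parameter :: con_omega = 7.2921e-5_kind_phys   ! would come from the host/physcons
       real(kind=kind_phys), parameter :: omega2    = 2.0_kind_phys*con_omega
    end module omega2_sketch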

! call initsolv_wmsdis(me, master, knob_ugwp_wvspec(2), knob_ugwp_azdir(2), &
! knob_ugwp_stoch(2), knob_ugwp_effac(2), do_physb_gwsrcs, kxw)
!
use ugwp_common, only : pi, pi2
Collaborator

not using host-model defined constants

logical :: do_physb_gwsrcs = .false. ! control for physics-based GW-sources
logical :: do_rfdamp = .false. ! control for Rayleigh friction inside ugwp_driver

real, parameter :: arad=6370.e3
Collaborator

again, not using either UGWP- or host-model defined constants and redefining here.


!
use machine, only: kind_phys
use physcons, only: con_cp, con_fvirt, con_g, con_rd
Collaborator

using host-model provided constants, but not through CCPP here

!
use machine, only: kind_phys
use physcons, only: con_cp, con_fvirt, con_g, con_rd
use ugwp_common, only: omega2
Collaborator

except for this guy (omega is in host-model constants too, though, so ?)

use ugwp_oro_init, only : cdmb, cleff, sigfac, hncrit, hpmin, hminmt
use ugwp_oro_init, only : gamm_std, sigma_std
use ugwp_common , only : rgrav, grav, cpd, rd, rv, rcpd, rcpd2

Collaborator

not using host-provided constants here

fcor, c2f2, u, v, t, q, prsi, delp, prsl, prslk, phii, phil, &
ax, ay, eps, ked, tauz)

use ugwp_common , only : rgrav, grav, cpd, rd, rv, rcpd, rcpd2
Collaborator

not using host-provided constants here

fcor, c2f2, u, v, t, q, prsi, delp, prsl, prslk, phii, phil, &
ax, ay, eps, ked, tauz)
! use para_taub, only : tau_ex
use ugwp_common , only : rgrav, grav, cpd, rd, rv, rcpd, rcpd2
Collaborator

not using host-provided constants here

!
!
subroutine rf_damp(im, levs, levs_rf, dtp, rfdis, rfdist, u, v, ax, ay, eps)
use ugwp_common, only : rcpd2
Collaborator

not using host-provided constants here

!
SUBROUTINE subs_diag_geo(nx, ny, lat, lon, rlat, rlon, dy, dx, &
cosv, rlatc, brcos, brcos2, dlam1, dlam2, dlat, divJp, divJm)
use ugwp_common , only : deg_to_rad
Collaborator

not using host-provided constants here

! geometric factors to compute deriv-es etc ...
! coriolis coslat tan etc...
!
earth_r = 6370.e3
Collaborator

seems like these should be provided by host model too

subroutine um_flow(nz, klow, ktop, up, vp, tp, qp, dp, zpm, zpi, &
pmid, pint, bn2, uhm, vhm, bn2hm, rhohm)
!
use ugwp_common, only : bnv2min, grav, gocp, fv, rdi
Collaborator

not using host-provided constants here

subroutine mflow_tauz(levs, up, vp, tp, qp, dp, zpm, zpi, &
pmid, pint, rho, ui, vi, ti, bn2i, bvi, rhoi)

use ugwp_common, only : bnv2min, grav, gocp, fv, rdi
Collaborator

not using host-provided constants here

! call ugwp_lsatdis_naz(levs, ksrc, nw, naz, kxw, taub_spect, ch, xaz, yaz, &
! fcor(j), c2f2(j), dp, zmid, zint, pmid, pint, rho, ui, vi, ti, &
! kvg, ktg, krad, kion, bn2i, bvi, rhoi, ax1, ay1, eps1, ked1)
use ugwp_common, only : rcpd, grav, rgrav
Collaborator

not using host-provided constants here

return

!
print *
Collaborator

Maybe wrap the print statements in some kind of DEBUG flag in a follow-up PR? There are several places throughout this scheme with print statements, not just here...
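One possible pattern for that follow-up, sketched here with assumed flag and argument names (not the scheme's actual interface): gate the diagnostics behind a run-time debug switch and restrict them to the master task.

    subroutine ugwp_debug_print_sketch(do_debug_print, me, master, tag, val)
       use machine, only: kind_phys
       logical,              intent(in) :: do_debug_print   ! e.g. a namelist debug flag
       integer,              intent(in) :: me, master
       character(len=*),     intent(in) :: tag
       real(kind=kind_phys), intent(in) :: val
       if (do_debug_print .and. me == master) print *, trim(tag), val
    end subroutine ugwp_debug_print_sketch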

elvpd, elvp, hprime , sigma, theta, oc, oa4, clx4, gam, zpbl, &
up, vp, tp, qp, dp, zpm, zpi, pmid, pint, idxzb, drmtb,taumtb)

use ugwp_common, only : bnv2min, grav, grcp, fv, rad_to_deg, dw2min, velmin, rdi
Collaborator

not using host-provided constants here

elvp, up, vp, tp, qp, dp, zpm, zpi, pmid, pint, xn, yn, umag, &
tautot, tauogw, taulee, drlee, tau_src, kxridge, kdswj, krefj, kotr)
!
use ugwp_common, only : bnv2min, grav, pi, pi2, dw2min, velmin
Collaborator

not using host-provided constants here

kxw, fcor, kxridge, up, vp, tp, qp, dp, zpm, zpi, pmid, pint, &
xn, yn, umag, drtau, kdis)

use ugwp_common, only : bnv2min, grav, pi, pi2, dw2min, velmin, rgrav
Collaborator

not using host-provided constants here

subroutine ugwp_tofd(im, levs, sigflt, elvmax, zpbl, u, v, zmid, &
utofd, vtofd, epstofd, krf_tofd)
use machine , only : kind_phys
use ugwp_common , only : rcpd2
Collaborator

not using host-provided constants here

subroutine ugwp_tofd1d(levs, sigflt, elvmax, zsurf, zpbl, u, v, &
zmid, utofd, vtofd, epstofd, krf_tofd)
use machine , only : kind_phys
use ugwp_common , only : rcpd2
Collaborator

not using host-provided constants here

!
!
! use para_taub, only : tau_ex
use ugwp_common, only : rcpd, grav, rgrav
Collaborator

not using host-provided constants here

!----------------------------------------

USE MACHINE , ONLY : kind_phys
use ugwp_common , only : rgrav, grav, cpd, rd, rv, rcpd, rcpd2
Collaborator

not using host-provided constants here

! Alternative expression: ZMTB = max(Heff*(1. -Fcrit_gfs/Fr), 0)
! fcrit_gfs/fr
!
goto 788
Collaborator

GOTO OK?
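For the record, a forward goto that skips a section can usually be rewritten as a structured IF block; a minimal, purely schematic sketch (not the actual driver logic):

    ! Sketch only: the code that used to sit between "goto 788" and the
    ! "788 continue" label moves inside the IF block.
    subroutine goto_free_sketch(skip_section)
       logical, intent(in) :: skip_section
       if (.not. skip_section) then
          ! ... formerly skipped computation goes here ...
       end if
    end subroutine goto_free_sketch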

! ---------------------------------------------------------------------------------
!

use ugwp_common , only : rgrav, grav, cpd, rd, rv
Collaborator

not using host-provided constants here

if (kdt == 1 .and. mpi_id == master) then
print *, 'vgw done '
!
print *, maxval(
Collaborator

not using host-provided constants here

Collaborator

@grantfirl grantfirl left a comment

I'll go ahead and approve as long as the RTs pass, but there are a few things noted to follow up with. Some of the things could be related to v1 (which was not touched for CCPP purposes), but some are definitely related to v0. The constants, as noted, are messy. Some are passed in to the top level scheme driver, but they are not propagated through the scheme. There are several places where constants are redefined locally and not even grabbed from the scheme-specific constant list OR the host-model provided list. There was one GOTO statement which is supposed to be a no-no for CCPP. The last thing is that I saw a bunch of print statements -- I'm not sure if they're just during the init stage or not, but it might be good to wrap those in a debug flag or something to prevent annoying terminal output flooding from the scheme.

@climbfuji
Collaborator

I'll go ahead and approve as long as the RTs pass, but there are a few things noted to follow up with. Some of the things could be related to v1 (which was not touched for CCPP purposes), but some are definitely related to v0. The constants, as noted, are messy. Some are passed in to the top level scheme driver, but they are not propagated through the scheme. There are several places where constants are redefined locally and not even grabbed from the scheme-specific constant list OR the host-model provided list. There was one GOTO statement which is supposed to be a no-no for CCPP. The last thing is that I saw a bunch of print statements -- I'm not sure if they're just during the init stage or not, but it might be good to wrap those in a debug flag or something to prevent annoying terminal output flooding from the scheme.

Thanks for this thorough review. What a mess, these constants. See issue #304.

@bluefinweiwei
Collaborator Author

Thanks for the review. Also note that all the UGWP-related subroutines and modules (where @grantfirl noticed the constant, stdout, and "goto" issues) are exactly the same as those in the IPD version.

@grantfirl
Collaborator

Thanks for the review. Also note that all the UGWP-related subroutines and modules (where @grantfirl noticed the constant, stdout, and "goto" issues) are exactly the same as those in the IPD version.

Yes, good point; definitely not @bluefinweiwei's fault. In fact, one could not have addressed the constants in the initial CCPP implementation while still passing the RT using UGWP, since the ugwp_common constants are different from the FV3-provided ones. A new baseline will need to be established once the constants all come from FV3 properly.

@bluefinweiwei
Collaborator Author

Indeed, I think optimizing the IPD UGWP code is a follow-on piece of work. Thanks @grantfirl.

@climbfuji
Collaborator

Indeed, I think optimizing the IPD UGWP code is a follow-on piece of work. Thanks @grantfirl.

... but we should do this as soon as possible, before it gets forgotten and the issue moves down the list ... it's actually not that hard; it just needs to be done, and the issue of not being able to compare b4b with IPD needs to be resolved.

@bluefinweiwei
Collaborator Author

@climbfuji sounds good. I've put a note in issue #304.

@climbfuji climbfuji merged commit 2506e5d into NCAR:gmtb/develop Aug 27, 2019