
Adding the WRF-Solar EPS model #1547

Merged: 22 commits into wrf-model:develop, Jan 24, 2022
Conversation

@pedro-jm (Contributor) commented Sep 1, 2021:

TYPE: New feature

KEYWORDS: WRF-Solar, ensemble, stochastic

SOURCE: Pedro A. Jimenez, Ju Hye Kim, Jimy Dudhia (NCAR) and Jaemo Yang (NREL)

DESCRIPTION OF CHANGES:
Problem:
The WRF-Solar model is a deterministic model tailored for solar energy applications. A probabilistic component tailored for solar energy applications has been missing.

Solution:
The WRF-Solar Ensemble Prediction System (WRF-Solar EPS, Kim et al. 2021) was designed to provide probabilistic forecasts tailored for solar energy applications. We selected the most relevant variables from six WRF packages using tangent linear methodologies (Yang et al. 2021), and then used the existing stochastic infrastructure in WRF to perturb these variables at runtime. The scheme is activated by setting multi_perturb = 1 in the &stoch block of namelist.input. The characteristics of the perturbations are specified in a new file in the run directory called STOCHPERT.TBL. The perturbations to each module and variable are then turned on/off via namelist settings (see run/README.namelist). More information can be found on this (very preliminary) website: https://ral.ucar.edu/solutions/products/wrf-solar-eps
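
For concreteness, a minimal sketch of the activation described above. multi_perturb is the switch added by this PR, and the per-module pert_* flags are the ones listed later in this thread; the values shown are illustrative placeholders, not recommendations:

```
&stoch
 multi_perturb = 1        ! activate the WRF-Solar EPS stochastic perturbations
 pert_noah     = .true.   ! perturb Noah LSM variables
 pert_thom     = .true.   ! perturb Thompson microphysics variables
 pert_cld3     = .true.   ! perturb cld3 scheme variables
 pert_farms    = .true.   ! perturb FARMS radiation variables
 pert_deng     = .true.   ! perturb Deng shallow cumulus variables
 pert_mynn     = .true.   ! perturb MYNN PBL inputs
/
```

The perturbation characteristics themselves (amplitudes, scales) live in STOCHPERT.TBL, whose format is documented with the PR rather than sketched here.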

Link to Yang et al. 2021 article: https://www-sciencedirect-com.cuucar.idm.oclc.org/science/article/pii/S0038092X21002322?via%3Dihub

Link to Kim et al. 2021 article: https://ieeexplore-ieee-org.cuucar.idm.oclc.org/document/9580552

LIST OF MODIFIED FILES:
M Registry/Registry.EM_COMMON
M Registry/registry.dimspec
M Registry/registry.stoch
M dyn_em/module_first_rk_step_part1.F
M dyn_em/module_first_rk_step_part2.F
M dyn_em/module_stoch.F
M dyn_em/solve_em.F
M phys/module_microphysics_driver.F
M phys/module_pbl_driver.F
M phys/module_ra_farms.F
M phys/module_radiation_driver.F
M phys/module_sf_noahdrv.F
M phys/module_shallowcu_driver.F
M phys/module_shcu_deng.F
M phys/module_surface_driver.F
M run/README.namelist
A run/STOCHPERT.TBL
M share/module_check_a_mundo.F

TESTS CONDUCTED:

  1. We have run multiple years of WRF-Solar EPS and analyzed the characteristics of the probabilistic forecasts as well as calibrated forecasts. Our analysis does not show any problems at this point.
  2. We ran a simulation with 10 ensemble members and calculated the spread of the ensemble (standard deviation) as a function of lead time. The spread (this PR compared to our other tests) is very similar. Below is a plot of the ensemble spread at a given lead time with the original code (left) and the current mods (right). The area covered is CONUS.

[Screenshot: ensemble spread maps over CONUS, original code (left) vs. current mods (right)]

  3. Jenkins testing is all PASS.

RELEASE NOTE: WRF-Solar was expanded with a stochastic ensemble prediction system (WRF-Solar EPS) tailored for solar energy applications (Yang et al. 2021, Kim et al. 2022). The stochastic perturbations can be introduced into variables of six parameterizations controlling cloud and radiation processes. A more detailed description of the model is provided on the WRF-Solar EPS website:
https://ral.ucar.edu/solutions/products/wrf-solar-eps

Kim, J.-H., P. A. Jimenez, M. Sengupta, J. Yang, J. Dudhia, S. Alessandrini, and Y. Xie, 2022: The WRF-Solar Ensemble Prediction System to provide solar irradiance probabilistic forecasts. IEEE Journal of Photovoltaics (in press).

Yang, J., J.-H. Kim, P. A. Jimenez, M. Sengupta, J. Dudhia, Y. Xie, A. Golnas, and R. Giering, 2021: An efficient method to identify uncertainties of WRF-Solar variables in forecasting solar irradiance using a tangent linear sensitivity analysis. Solar Energy, 220, 509-522.

@pedro-jm pedro-jm requested review from a team as code owners September 1, 2021 18:45
@joeolson42 (Contributor):

I think we need to talk about the MYNN changes. This looks like it will be a significant snag in our code universalization efforts because these new stochastic variables/arrays are not in CCPP. Either this alternative stochastic approach needs to be added to CCPP or it needs to be merged with the preexisting stochastic approach in the MYNN. I'm open to compromises, but it would take some convincing to have to carry two separate approaches. It looks like there is considerable overlap in the targeted impact of both stochastic approaches.

@dudhia (Collaborator) commented Sep 1, 2021 via email.

@joeolson42 (Contributor):

That may be a workaround, but it's ugly. I'm trying to eliminate dycore-specific code so that the same features/configurations are possible in all dycores, not just identical code. Maybe we can do this in the short term, but it should be called out as a bad practice if code universalization is truly our goal.

Is it possible to perturb these arrays (th, qv, qc, qi, etc) before they are input to the MYNN? It seems like these mods could be an interstitial routine.

@pedro-jm (Contributor, Author) commented Sep 1, 2021:

Joe, I think it may be possible to do the perturbations outside MYNN. Let me think about it a bit more and take a look at the code.
Pedro.

@dudhia (Collaborator) commented Sep 2, 2021 via email.

@pedro-jm (Contributor, Author) commented Sep 8, 2021:

Joe, @joeolson42
I was able to implement the perturbations outside the MYNN module. No changes in module_bl_mynn.F.
So we should be good to go with respect to MYNN.

@joeolson42 (Contributor) commented Sep 8, 2021 via email.

@pedro-jm (Contributor, Author):

All the variables are now perturbed in the drivers (instead of the parameterizations modules).
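
A hedged, self-contained illustration of that driver-level pattern (invented names, not the actual WRF code): the stochastic pattern is applied to the scheme's input fields in the driver, so the physics module itself, e.g. module_bl_mynn.F, needs no changes.

```fortran
PROGRAM driver_perturb_sketch
  IMPLICIT NONE
  REAL, DIMENSION(4) :: th   = 300.0                           ! scheme input column
  REAL, DIMENSION(4) :: pert = (/ 0.01, -0.02, 0.00, 0.015 /)  ! stochastic pattern
  ! Perturb in the driver, before the scheme is called, instead of
  ! threading perturbation arrays through the scheme itself.
  th = th * ( 1.0 + pert )
  CALL scheme ( th )
CONTAINS
  SUBROUTINE scheme ( th )
    REAL, INTENT(IN) :: th(:)
    PRINT *, 'scheme sees perturbed th:', th
  END SUBROUTINE scheme
END PROGRAM driver_perturb_sketch
```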

@weiwangncar (Collaborator):

@pedro-jm Can you verify that a restart run produces results identical to a continuous run? What about runs produced with different numbers of processors? Can you also show some plots that illustrate the spread you can produce with this method?

@pedro-jm (Contributor, Author):

@davegill
I did a test and it appears to work fine. I ran a simulation with 10 ensemble members using the code before and after your mods. I then calculated the spread of the ensemble (standard deviation) as a function of lead time. The spread is very similar. Below is a plot of the ensemble spread at a given lead time before (left) and after (right) your changes. The area covered is CONUS.

[Screenshot: ensemble spread maps over CONUS, before (left) and after (right) the changes]

@pedro-jm (Contributor, Author):

The spread in the previous plots is for the global horizontal irradiance (GHI).

@davegill (Contributor):

@pedro-jm
Great news!
Thank you much

@davegill (Contributor):

@weiwangncar @dudhia @pedro-jm
Wei and Jimy,
This is ready for review.

Pedro,
Thank you so much for checking this out, especially considering the PLAGUE!

@weiwangncar (Collaborator):

@pedro-jm Can you verify that a restart run produces results identical to a continuous run? We would also like to know if the code produces identical results when run with one and with multiple processors. You can do this with dmpar-compiled code, running it with one and then multiple processors.

@pedro-jm (Contributor, Author):

@davegill
Thanks to you. You added considerable development work; we probably would not have been able to add all of that in time.

@weiwangncar
I should be able to do those tests.

@dudhia (Collaborator) commented Jan 20, 2022 via email.

@weiwangncar (Collaborator):

@pedro-jm @dudhia For the restart test, optimized code should be used. For the bit-for-bit test, all we require is code built without optimization, but if optimized code gives you bit-for-bit results, that is good too. Many of our options can produce bit-for-bit results when using different numbers of processors.

@davegill (Contributor):

@pedro-jm @weiwangncar @dudhia

Does Pedro need to turn off optimization for this test or can it work optimized?

Yes, we tend to find that unoptimized builds produce bit-identical results more often.

  1. use "configure -d", that will turn off all optimization
  2. run about 10 - 20 time steps for the parallel test, you can use an MPI-only run and then vary the number of tasks
  3. use a REALLY small domain, perhaps 70x70 up to 100x100, which means you could run "mpirun -np 1" and "mpirun -np 4", for example
  4. for the restart test, compare 20 time steps to a 10+10 time step simulation
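
A hedged sketch of that procedure as shell commands; paths, task counts, and file names are placeholders, and diffwrf is the field-comparison utility WRF builds alongside its I/O layer (it also shows up in the compile log quoted later in this thread):

```
# 1. build without optimization (pick a dmpar option when prompted)
./configure -d
./compile em_real

# 2./3. parallel reproducibility: small domain (~100x100), ~20 time steps
mpirun -np 1 ./wrf.exe && mv wrfout_d01_* np1/
mpirun -np 4 ./wrf.exe && mv wrfout_d01_* np4/
diffwrf np1/wrfout_d01_<date> np4/wrfout_d01_<date>   # should report no differing fields

# 4. restart test: run 20 steps straight through, then 10 steps + restart + 10 steps,
#    and diffwrf the final-time output files from the two runs
```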

@pedro-jm (Contributor, Author):

I am able to get bit-identical results for the first 20 time steps I checked using 1 and 36 cores.

However, I needed to change the PBL setting from 5 to 0 to get bit-identical results. Turning off the perturbations of this PR and using PBL option 5 still does not produce bit-identical results. Since there are no changes for PBL option 5 in this PR, it seems we are fine here.

@pedro-jm (Contributor, Author):

There was an oversight on my side with the perturbations and the bit-identical results for 1 and 36 cores.

I get bit-identical results when I perturb these modules:
pert_noah = .true.
pert_thom = .true.
pert_cld3 = .true.

I DO NOT get bit-identical results when I perturb these modules:
pert_farms = .true.
pert_deng = .true.

I cannot test this one:
pert_mynn = .true.

@pedro-jm (Contributor, Author):

I forgot to document that the random perturbations themselves are OK.
The issue appears when we apply them, so it is clear where to look for the problem.

@pedro-jm (Contributor, Author):

Restarts currently do not produce bit-for-bit results. This is expected: we do not carry the perturbations in the wrfrst files, and there may also be issues with the initialization of the perturbations on restart. We can put a check in module_check_a_mundo.F to stop if it is a restart run and the perturbations of this PR are activated. Most users of this PR probably would not use restarts.
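
A minimal sketch of the kind of guard described, as it might look in share/module_check_a_mundo.F (a fragment, not the actual code; it assumes the restart and multi_perturb namelist entries are reachable via model_config_rec, and the message text is invented):

```fortran
! Hypothetical fragment: stop a restart run when the EPS perturbations
! are active, since the perturbation fields are not carried in wrfrst.
IF ( model_config_rec%restart .AND. &
     model_config_rec%multi_perturb == 1 ) THEN
   CALL wrf_error_fatal ( '--- ERROR: multi_perturb = 1 does not support restart runs' )
END IF
```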

@davegill (Contributor):

@weiwangncar @dudhia
What are your opinions? If you deem that this is not ready for review, I doubt anyone really has the time to make it ready for review.

@dudhia (Collaborator) commented Jan 21, 2022 via email.

@weiwangncar (Collaborator):

@pedro-jm @dudhia Can you add a bit more info to the RELEASE NOTE? That is the text we will use to describe the update in the release. Please include information on which physics may be perturbed; you can cite the Solar physics suite, but it should be explicitly stated here for general info. Also, does the paper describe how the STOCHPERT.TBL is created? Some words about that, plus the paper reference, in the RELEASE NOTE would be useful.

@pedro-jm (Contributor, Author):

@weiwangncar
I just updated the RELEASE NOTE, providing a link to the WRF-Solar EPS website. If any relevant info is still missing, let me know.

@weiwangncar (Collaborator):

@pedro-jm Thanks for the added info. I'm OK with this PR. It is good that the test results (restart, bit-for-bit) are recorded in this PR, so that we can refer to them later.

@davegill self-requested a review January 24, 2022.

@davegill (Contributor) left a review comment:
Jimy says it can go in AS-IS, and we sort things out before the v4.4 release.

@davegill davegill merged commit fed10f4 into wrf-model:develop Jan 24, 2022
davegill added a commit that referenced this pull request Jan 24, 2022
TYPE: bug fix

KEYWORDS: netcdfpar, Error

SOURCE: internal

DESCRIPTION OF CHANGES:
IMPORTANT: Without these mods, every commit since the parallel netcdf4 IO mods will fail the DA
build test in the regression test. For example, at least these commits:
```
fed10f4 Adding the WRF-Solar EPS model (#1547)
0bda5e0 Fix 4dvar build failure after commit 8b5bfe5 (#1652)
8b5bfe5 Thompson AA enhancements: BC aerosol, biomass burning emissions, and … (#1616)
9dc68ca After testing with UFS/GFS/FV-3, some tuning knob changes to Thompson-MP and icloud3 (cloud fraction) scheme (#1626)
96fd889 Update HONO, TERP, and CO2 emissions (#1644)
64fb190 SFCLAY=1, add shallow water roughness calculation (#1543)
609c2fc New module firebrand_spotting for WRF-Fire (#1540)
75bfe6d MYNN PBL clouds in photolysis option 4 (TUV) (#1622)
f8c4b13 Fix runtime error when using sf_surface_mosaic = 1 with use_wudapt_lcz = 0 (#1638)
b511c70 Run-time option for climate GHG for radiation (#1625)
8194c66 Bug fix for configuration option INTEL:HSW/BDW (#1645)
16c9287  bug fixes for radar_rf_opt=2 (#1642)
a82ce24 Sync with NoahMP Github version with all NoahMP updates since v4.3 (#1641)
7b642cc Bug fix for TAMDAR T VarBC (#1632)
92fd706 fix WRFDA build for Parallel netcdf-4 IO (#1634)
```
Problem:
With PR #1552 "Parallel netcdf-4 IO option" (SHA1 3cd4713), when the code was built without
the new parallel NetCDF4 compression, the build log had an `Error`:
```
> grep Error compile.log
Fatal Error: Cannot open module file ‘wrf_data_ncpar.mod’ for reading at (1): No such file or directory
make[2]: [diffwrf] Error 1 (ignored)
make[2]: [diffwrf] Error 1 (ignored)
wrf_io.f:117: Error: Can't open included file 'mpif.h'
make[2]: [wrf_io.o] Error 1 (ignored)
Fatal Error: Cannot open module file ‘wrf_data_ncpar.mod’ for reading at (1): No such file or directory
make[2]: [field_routines.o] Error 1 (ignored)
make[2]: [libwrfio_nfpar.a] Error 127 (ignored)
make[2]: [libwrfio_nfpar.a] Error 1 (ignored)
```
The problem was related to constructing the object files in the io_netcdfpar directory. When the
option is not selected at compile time, we do not care about errors in a directory that will
never be used.

Solution:
If the NETCDFPAR option is not selected at compile time, then SKIP going into the io_netcdfpar
directory altogether.
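
A hedged sketch of the kind of guard the solution describes (GNU make syntax; the variable and target names here are illustrative, not the actual ones in WRF's Makefile). The idea is to swap the sub-make for an echo when NETCDFPAR is unset, which matches the "echo SKIPPING make" output in the tests below:

```
# If NETCDFPAR is unset, turn the sub-make into an echo so the
# io_netcdfpar directory is skipped and its (ignored) errors never
# reach compile.log. Recipe lines must be indented with a tab.
NETCDFPAR_MAKE = $(if $(NETCDFPAR),make,echo SKIPPING make)

wrfio_nfpar :
	cd ../io_netcdfpar ; \
	$(NETCDFPAR_MAKE) -i -r NETCDFPARPATH="$(NETCDFPARPATH)"
```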

LIST OF MODIFIED FILES:
m Makefile
m arch/Config.pl
m arch/configure.defaults
m configure

TESTS CONDUCTED:
1. Without the NETCDFPAR parameter set, the build for the io_netcdfpar directory is bypassed:
```
          cd ../io_netcdfpar ; \
          echo SKIPPING make -i -r NETCDFPARPATH="/glade/u/apps/ch/opt/netcdf/4.7.3/gnu/9.1.0" \

          cd ../io_netcdfpar ; \
          echo SKIPPING make -i -r NETCDFPARPATH="/glade/u/apps/ch/opt/netcdf/4.7.3/gnu/9.1.0" \
```

2. When the NETCDFPAR env variable is set, the build includes the io_netcdfpar directory:
```
          cd ../io_netcdfpar ; \
           make -i -r NETCDFPARPATH="/glade/u/apps/ch/opt/netcdf/4.8.0/gnu/9.1.0" \

          cd ../io_netcdfpar ; \
           make -i -r NETCDFPARPATH="/glade/u/apps/ch/opt/netcdf/4.8.0/gnu/9.1.0" \
```

3. Jenkins tests are all PASS.
davegill added a commit that referenced this pull request Jan 25, 2022
TYPE: bug fix

KEYWORDS: EPS, DA

SOURCE: internal

DESCRIPTION OF CHANGES:
Problem:
The WRF-Solar EPS mods (PR #1547, SHA1 fed10f4) modified the solve_em.F file, adding
arguments to the microphysics driver call. The solve_em.F file is one of the special files
that has an _ad and a _tl version. Any changes to solve_em.F need to be reflected in the
other two DA-related files.

Solution:
In this case, since only Thompson MP uses the EPS fields, the new args were made optional in
the MP driver.
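
A hedged, self-contained illustration of the optional-argument pattern described in the solution (names are invented; this is not the actual WRF interface):

```fortran
MODULE demo_mp_driver
CONTAINS
  ! The EPS perturbation field enters as an OPTIONAL argument, so the
  ! DA (_ad/_tl) call sites can omit it and still compile and run.
  SUBROUTINE mp_driver ( qv, pert_qv )
    REAL, INTENT(INOUT)        :: qv
    REAL, OPTIONAL, INTENT(IN) :: pert_qv
    IF ( PRESENT( pert_qv ) ) qv = qv * ( 1.0 + pert_qv )
  END SUBROUTINE mp_driver
END MODULE demo_mp_driver

PROGRAM test_optional_args
  USE demo_mp_driver
  IMPLICIT NONE
  REAL :: qv = 1.0e-3
  CALL mp_driver ( qv )         ! DA-style call: no perturbation argument
  CALL mp_driver ( qv, 0.05 )   ! EPS-style call: perturbation supplied
  PRINT *, 'qv after calls:', qv
END PROGRAM test_optional_args
```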

LIST OF MODIFIED FILES:
M phys/module_microphysics_driver.F

TESTS CONDUCTED:
1. Code compiles and passes regression tests.