Updates of ocean diagnostics and recipes for GMD-2019-291 (ESMValTool Part II paper) #1621

Open
wants to merge 62 commits into base: main

Changes from all commits
62 commits
7bc5728
Created recipe filler to fill out additional_datasets sections of rec…
ledm Jun 21, 2019
8370ce2
Changing recipes to add new figures.
ledm Jun 24, 2019
6d23284
Added [modelling_realm]
ledm Jun 24, 2019
86caafd
Added more specific fields to the searches.
ledm Jun 24, 2019
6c45ec9
Added some changes to make it compatible with the recipe_filler.py
ledm Jun 24, 2019
6820884
Used esmvaltool/interface_scripts/recipe_filler.py to add missing fil…
ledm Jun 24, 2019
16a4fdf
Added calculate_anomaly tool.
ledm Jun 25, 2019
6200f74
Merge branch 'version2_development_ocean_development_for_v2_paper_116…
ledm Jun 25, 2019
120591f
Added thetaoga to change units list, fixed anomaly calculations, adde…
ledm Jun 25, 2019
1adfafe
bugfix
ledm Jun 25, 2019
6bb37c8
Added a few comments and minor changes.
ledm Jun 26, 2019
c912020
Merge branch 'version2_development_ocean_development_for_v2_paper_116…
ledm Jun 26, 2019
a1c3c49
span: overlap- > span: full
ledm Jun 26, 2019
738d81d
Added some prints and a debug loop.
ledm Jun 26, 2019
da19285
Merge branch 'version2_development_ocean_development_for_v2_paper_116…
ledm Jun 26, 2019
50cdbed
Removed models with no data in 19th century.
ledm Jun 26, 2019
0839444
Cut time range to 2005, added some minor changes to diag script.
ledm Jun 26, 2019
80dfb69
Created new recipe to make time series of BGC metrics. THis is the ba…
ledm Jun 26, 2019
221663d
changed Omyr -> Oyr
ledm Jun 26, 2019
51f7b5b
Added two bug fixes.
ledm Jun 26, 2019
edfcbe9
Added final set of CMIP5 models and bug fixes.
ledm Jun 27, 2019
bfd9cce
Added data into recipe and minor change to tools.
ledm Jun 27, 2019
55e0afa
Added fix for when the depth axis is upside down.
ledm Jun 27, 2019
d43e1f3
Switched to derived variables instead of trying to do it by itself.
ledm Jun 27, 2019
42308c6
Added recipe switch for plot title and ylabel.
ledm Jun 27, 2019
cf64e4e
Commented out everything to run faster.
ledm Jun 27, 2019
a0bd732
Sending minor changes back to pml
ledm Jun 28, 2019
381f0d9
Added the rest of the models back in.
ledm Jun 28, 2019
eeae403
Added a multi model mean.
ledm Jun 28, 2019
1bac06d
Changed final year from 2012 to 2005 for one model.
ledm Jul 4, 2019
fd87f68
Changed final year from 2012 to 2005 for one model and added the annu…
ledm Jul 4, 2019
b223631
Moved several ocean scalar fields into a single recipe.
ledm Jul 4, 2019
5f80b65
minor change.
ledm Jul 4, 2019
f051090
Added clip to remove negative values
ledm Jul 4, 2019
09821e1
Merge branch 'version2_development_ocean_development_for_v2_paper_116…
ledm Jul 4, 2019
07d8033
Merge remote-tracking branch 'remotes/origin/version2_development' in…
ledm Sep 19, 2019
f62f155
Changed area_statistics and removed needless yaml anchors (&)
ledm Jan 30, 2020
a71130d
Fix bug described in #1505 by @YanchunHe https://github.com/ESMValGro…
ledm Jan 30, 2020
81e8a2c
Sharing recipe with @valeriupredoi
ledm Jan 31, 2020
76a1102
commented out most diagnostics
ledm Jan 31, 2020
459778c
commented out more diagnostics
ledm Jan 31, 2020
a9f6af0
Fixed p2p scatter plots for veronikas paper, working on example recipe.
ledm Mar 17, 2020
391a860
Added 4th pane back in.
ledm Apr 16, 2020
61afed0
Merge branch 'master' into develment_tool_area_statistics_445
ledm Apr 16, 2020
3b245b7
added additional scalar fields
ledm Apr 16, 2020
d9b5242
Merge branch 'develment_tool_area_statistics_445' into development_oc…
ledm Apr 16, 2020
56d504c
Merge remote-tracking branch 'origin/master' into version2_developmen…
ledm May 21, 2020
f01f954
Merge remote-tracking branch 'origin/master' into version2_developmen…
ledm May 21, 2020
cdd2226
Added new BGC scalar fields
ledm May 21, 2020
a0a9b11
PEP8 compliance
ledm May 21, 2020
c26f00e
Merge remote-tracking branch 'origin/master' into development_ocean_r…
ledm May 21, 2020
237d142
Merge branch 'version2_development_ocean_development_for_v2_paper_116…
ledm May 21, 2020
fb864de
PEP8 fiuxes
ledm May 21, 2020
f570982
PEP8
ledm May 21, 2020
db590cb
minor changes for codacy
ledm May 22, 2020
bed6f87
codacy fixes
ledm May 22, 2020
7e9c47a
added spaces to appease codacy
ledm May 22, 2020
34aa525
deleted recipe_filler.py
ledm May 22, 2020
e885e09
split scalr plots into three recipes.
ledm May 28, 2020
e5fe5c6
cleaned up and added new FX format
ledm May 28, 2020
4011a4f
merged two similar recipes.
ledm May 28, 2020
a4e9133
working here
ledm Jul 9, 2020
37 changes: 25 additions & 12 deletions esmvaltool/diag_scripts/ocean/diagnostic_model_vs_obs.py
@@ -228,6 +228,14 @@ def make_model_vs_obs_plots(
cmap='bwr',
title=' '.join([model, 'over', obs]),
log=True)
else:
add_map_subplot(
224,
cube224,
logspace4,
cmap='bwr',
title=' '.join([model, 'over', obs]),
log=True)

# Add overall title
fig.suptitle(long_name + ' [' + units + ']', fontsize=14)
@@ -362,9 +370,8 @@ def make_scatter(
obs_filename: str
the preprocessed observations file.
"""

filenames = {'model': model_filename, 'obs': obs_filename}
logger.debug('make_scatter: \t%s', filenames)

# ####
# Load the data for each layer as a separate cube
layers = {}
@@ -377,9 +384,6 @@
for layer in cubes[model_type]:
layers[layer] = True

logger.debug('layers: %s', layers)
logger.debug('cubes: %s', ', '.join(cubes))

# ####
# load names:
model = metadata[filenames['model']]['dataset']
@@ -392,28 +396,37 @@

# Make a plot for each layer
for layer in layers:

fig = plt.figure()
fig.set_size_inches(7, 6)

# Create the cubes
model_data = np.ma.array(cubes['model'][layer].data)
obs_data = np.ma.array(cubes['obs'][layer].data)
units = str(cubes['model'][layer].units)

mask = model_data.mask + obs_data.mask
model_data = np.ma.masked_where(mask, model_data).compressed()
obs_data = np.ma.masked_where(mask, obs_data).compressed()

colours = 'gist_yarg'
zrange = diagtools.get_array_range([model_data, obs_data])
plotrange = [zrange[0], zrange[1], zrange[0], zrange[1]]

x_scale = 'log'
mask_neg = False
if np.min(zrange) < 0.:
if mask_neg:
mask = np.ma.masked_where(model_data <= 0., model_data).mask
mask += np.ma.masked_where(obs_data <= 0., obs_data).mask

model_data = np.ma.masked_where(mask, model_data).compressed()
obs_data = np.ma.masked_where(mask, obs_data).compressed()
zrange = diagtools.get_array_range([model_data, obs_data])
plotrange = [zrange[0], zrange[1], zrange[0], zrange[1]]
else:
x_scale = 'linear'

if np.min(zrange) * np.max(zrange) < -1:
x_scale = 'linear'
if np.min(zrange) < 0.:
logger.info('Skip scatter for %s. Min is < 0', long_name)
return

pyplot.hexbin(
model_data,
@@ -423,7 +436,7 @@
bins='log',
# extent=np.log10(plotrange),
gridsize=50,
cmap=pyplot.get_cmap(colours),
cmap=pyplot.get_cmap('gist_yarg'),
mincnt=0)
cbar = pyplot.colorbar()
cbar.set_label('log10(N)')
@@ -439,7 +452,7 @@
add_diagonal=True,
extent=plotrange)

pyplot.title(long_name)
pyplot.title(long_name + ', ' + units)
pyplot.xlabel(model)
pyplot.ylabel(obs)

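The scatter change above jointly masks model and observation points before deciding between log and linear axes: a point that is missing or non-positive in either array would break a logarithmic hexbin. A minimal NumPy sketch of that idea (the name `mask_for_log_scatter` is illustrative, not ESMValTool's actual helper):

```python
import numpy as np

def mask_for_log_scatter(model_data, obs_data):
    """Drop pairs where either value is missing (NaN) or non-positive,
    so the surviving pairs are safe to plot on logarithmic axes."""
    model_data = np.ma.masked_invalid(model_data)
    obs_data = np.ma.masked_invalid(obs_data)
    bad = (np.ma.getmaskarray(model_data) | np.ma.getmaskarray(obs_data)
           | (model_data.filled(0.) <= 0.) | (obs_data.filled(0.) <= 0.))
    return (np.ma.masked_where(bad, model_data).compressed(),
            np.ma.masked_where(bad, obs_data).compressed())

model = np.array([1., -2., 3., np.nan])
obs = np.array([2., 2., -3., 4.])
model_ok, obs_ok = mask_for_log_scatter(model, obs)
# only the (1., 2.) pair survives: the others are non-positive or NaN
```

Masking is applied pairwise so the two compressed arrays stay the same length, which `hexbin` requires.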
2 changes: 1 addition & 1 deletion esmvaltool/diag_scripts/ocean/diagnostic_seaice.py
@@ -289,7 +289,7 @@ def make_polar_map(
ax1 = plt.subplot(111, projection=cartopy.crs.SouthPolarStereo())
ax1.set_extent([-180, 180, -90, -50], cartopy.crs.PlateCarree())

linrange = np.linspace(0., 100., 21.)
linrange = np.linspace(0., 100., 21)
qplt.contourf(cube, linrange, cmap=cmap, linewidth=0, rasterized=True)
plt.tight_layout()

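The one-character `linspace` fix above matters because NumPy requires the `num` argument to be an integer; passing a float such as `21.` was deprecated and raises a `TypeError` on modern NumPy versions. A quick sketch of the corrected call:

```python
import numpy as np

# `num` must be an integer: np.linspace(0., 100., 21.) fails on
# modern NumPy, while the integer form below is always valid.
linrange = np.linspace(0., 100., 21)
# 21 evenly spaced contour levels from 0 to 100, in steps of 5
```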
164 changes: 126 additions & 38 deletions esmvaltool/diag_scripts/ocean/diagnostic_timeseries.py
@@ -50,14 +50,15 @@

import logging
import os

import datetime
import iris
import matplotlib.pyplot as plt
import numpy as np

from esmvaltool.diag_scripts.ocean import diagnostic_tools as diagtools
from esmvaltool.diag_scripts.shared import run_diagnostic


# This part sends debug statements to stdout
logger = logging.getLogger(os.path.basename(__file__))

@@ -137,42 +138,94 @@ def moving_average(cube, window):

times = cube.coord('time').units.num2date(cube.coord('time').points)

datetime = diagtools.guess_calendar_datetime(cube)
guessed_dt = diagtools.guess_calendar_datetime(cube)

output = []

times = np.array([
datetime(time_itr.year, time_itr.month, time_itr.day, time_itr.hour,
time_itr.minute) for time_itr in times
guessed_dt(time_itr.year, time_itr.month, time_itr.day, time_itr.hour,
time_itr.minute) for time_itr in times
])

for time_itr in times:
if win_units in ['years', 'yrs', 'year', 'yr']:
tmin = datetime(time_itr.year - window_len, time_itr.month,
time_itr.day, time_itr.hour, time_itr.minute)
tmax = datetime(time_itr.year + window_len, time_itr.month,
time_itr.day, time_itr.hour, time_itr.minute)
tmin = guessed_dt(time_itr.year - window_len, time_itr.month,
time_itr.day, time_itr.hour, time_itr.minute)
tmax = guessed_dt(time_itr.year + window_len, time_itr.month,
time_itr.day, time_itr.hour, time_itr.minute)

if win_units in ['months', 'month', 'mn']:
tmin = datetime(time_itr.year, time_itr.month - window_len,
time_itr.day, time_itr.hour, time_itr.minute)
tmax = datetime(time_itr.year, time_itr.month + window_len,
time_itr.day, time_itr.hour, time_itr.minute)
tmin = guessed_dt(time_itr.year, time_itr.month - window_len,
time_itr.day, time_itr.hour, time_itr.minute)
tmax = guessed_dt(time_itr.year, time_itr.month + window_len,
time_itr.day, time_itr.hour, time_itr.minute)

if win_units in ['days', 'day', 'dy']:
tmin = datetime(time_itr.year, time_itr.month,
time_itr.day - window_len, time_itr.hour,
time_itr.minute)
tmax = datetime(time_itr.year, time_itr.month,
time_itr.day + window_len, time_itr.hour,
time_itr.minute)
tmin = guessed_dt(time_itr.year, time_itr.month,
time_itr.day - window_len, time_itr.hour,
time_itr.minute)
tmax = guessed_dt(time_itr.year, time_itr.month,
time_itr.day + window_len, time_itr.hour,
time_itr.minute)

arr = np.ma.masked_where((times < tmin) + (times > tmax), cube.data)
output.append(arr.mean())
cube.data = np.array(output)
return cube
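The renamed `guessed_dt` logic above builds, for each timestep, a window of ± the requested width and averages every value whose timestamp falls inside it. A simplified sketch of that centred moving average, using plain `datetime` objects instead of iris cubes and only the year-window case (the function name and signature here are illustrative):

```python
import datetime
import numpy as np

def moving_average(times, values, window_years=1):
    """Centred moving average: each point becomes the mean of all
    values whose timestamp lies within +/- window_years of it."""
    values = np.asarray(values, dtype=float)
    out = []
    for time in times:
        tmin = time.replace(year=time.year - window_years)
        tmax = time.replace(year=time.year + window_years)
        # Keep every sample inside the window, inclusive of the edges.
        keep = np.array([tmin <= t <= tmax for t in times])
        out.append(values[keep].mean())
    return np.array(out)

times = [datetime.datetime(year, 7, 1) for year in range(2000, 2010)]
values = np.arange(10.0)
smoothed = moving_average(times, values, window_years=1)
# smoothed[5] averages the 2004-2006 values (4, 5, 6) -> 5.0
```

Note the real diagnostic uses `diagtools.guess_calendar_datetime` because model calendars (e.g. 360-day) do not round-trip through `datetime.datetime`.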


def calculate_anomaly(cube, anomaly):
"""
Calculate the anomaly using a specified time range.

The anomaly window is a list containing a start year and an end year,
marking the start and end of the period over which the anomaly
baseline is calculated.

Parameters
----------
cube: iris.cube.Cube
Input cube
anomaly: list
A start year and end year to calculate an anomaly.

Returns
-------
iris.cube.Cube:
A cube with the anomaly calculated.
"""

start = [int(np.array(anomaly).min()), 1, 1]
end = [int(np.array(anomaly).max()), 12, 31]

time_units = cube.coord('time').units
if time_units.calendar == '360_day':
start[2] = 30
end[2] = 30

start_date = datetime.datetime(
int(start[0]), int(start[1]), int(start[2]))
end_date = datetime.datetime(int(end[0]), int(end[1]), int(end[2]))

t_1 = time_units.date2num(start_date)
t_2 = time_units.date2num(end_date)
constraint = iris.Constraint(
time=lambda t: t_1 < time_units.date2num(t.point) < t_2)

new_cube = cube.extract(constraint)
if new_cube is None:
return None
mean = new_cube.data.mean()
cube.data = cube.data - mean
return cube
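Stripped of the iris constraint machinery, `calculate_anomaly` subtracts the mean over a reference window and returns `None` when no data falls inside it. A minimal sketch with plain arrays, assuming annual values indexed by year (names here are illustrative):

```python
import numpy as np

def calculate_anomaly(years, values, anomaly):
    """Subtract the mean over the reference period
    [min(anomaly), max(anomaly)]; return None if that period
    contains no data."""
    years = np.asarray(years)
    values = np.asarray(values, dtype=float)
    start, end = min(anomaly), max(anomaly)
    reference = values[(years >= start) & (years <= end)]
    if reference.size == 0:
        return None  # mirrors the new_cube is None early exit above
    return values - reference.mean()

years = np.arange(2000, 2010)
values = np.arange(10.0)
anom = calculate_anomaly(years, values, [2000, 2003])
# reference mean over 2000-2003 is 1.5, so anom starts at -1.5
```

The 360-day-calendar special case in the real code exists because day 31 does not exist in that calendar, so the window endpoints are clamped to day 30.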


def make_time_series_plots(
cfg,
metadata,
@@ -211,28 +264,32 @@ def make_time_series_plots(
# Making plots for each layer
for layer_index, (layer, cube_layer) in enumerate(cubes.items()):
layer = str(layer)
title = ' '.join([metadata['dataset'], metadata['long_name']])

if 'moving_average' in cfg:
cube_layer = moving_average(cube_layer, cfg['moving_average'])

if multi_model:
timeplot(cube_layer, label=metadata['dataset'], ls=':')
else:
timeplot(cube_layer, label=metadata['dataset'])
if 'anomaly' in cfg:
title = ' '.join([title, 'anomaly'])
cube_layer = calculate_anomaly(cube_layer, cfg['anomaly'])
if cube_layer is None:
return

# Add title, legend to plots
title = ' '.join([metadata['dataset'], metadata['long_name']])
if layer != '':
if cube_layer.coords('depth'):
z_units = cube_layer.coord('depth').units
else:
z_units = ''
title = ' '.join([title, '(', layer, str(z_units), ')'])
plt.title(title)
plt.legend(loc='best')
plt.ylabel(str(cube_layer.units))

ylabel = str(cube_layer.units)
if 'ylabel' in cfg:
ylabel = cfg['ylabel']

# Determine image filename:
if multi_model:
timeplot(cube_layer, label=metadata['dataset'], ls=':')
path = diagtools.get_image_path(
cfg,
metadata,
Expand All @@ -244,17 +301,21 @@ def make_time_series_plots(
'start_year', 'end_year'
],
)

else:
timeplot(cube_layer, label=metadata['dataset'])

path = diagtools.get_image_path(
cfg,
metadata,
suffix='timeseries_' + str(layer_index) + image_extention,
)

plt.title(title)
plt.legend(loc='best')
plt.ylabel(ylabel)

# Saving files:
if cfg['write_plots']:

logger.info('Saving plots to %s', path)
plt.savefig(path)

@@ -320,17 +381,25 @@ def multi_model_time_series(
else:
cube = model_cubes[filename][layer]

if 'MultiModel' in metadata[filename]['dataset']:
if 'anomaly' in cfg:
cube = calculate_anomaly(cube, cfg['anomaly'])
if cube is None:
logger.info(' '.join(['Not enough time for anomaly',
'calculation',
metadata[filename]['dataset']]))
continue

if metadata[filename]['dataset'].lower().find('multimodel') > -1:
timeplot(
cube,
c=color,
c='black',
# label=metadata[filename]['dataset'],
ls=':',
ls='--',
lw=2.,
)
plot_details[filename] = {
'c': color,
'ls': ':',
'c': 'black',
'ls': '--',
'lw': 2.,
'label': metadata[filename]['dataset']
}
@@ -350,24 +419,38 @@
}

title = metadata[filename]['long_name']
ylabel = str(model_cubes[filename][layer].units)
if layer != '':
if model_cubes[filename][layer].coords('depth'):
z_units = model_cubes[filename][layer].coord('depth').units
else:
z_units = ''

# Add title, legend to plots
if 'anomaly' in cfg:
title = ' '.join([title, 'anomaly'])

if layer:
title = ' '.join([title, '(', str(layer), str(z_units), ')'])

# Check whether a title is specified in the recipe.
# If so, it overrides the default title.
if 'title' in cfg:
title = cfg['title']

if 'ylabel' in cfg:
ylabel = cfg['ylabel']

plt.title(title)
plt.legend(loc='best')
plt.ylabel(str(model_cubes[filename][layer].units))
plt.ylabel(ylabel)

# Saving files:
if cfg['write_plots']:
path = diagtools.get_image_path(
cfg,
metadata[filename],
prefix='MultipleModels_',
prefix='MultipleModels',
suffix='_'.join(['timeseries',
str(layer) + image_extention]),
metadata_id_list=[
@@ -377,9 +460,14 @@
)

# Resize and add legend outside the axes.
plt.gcf().set_size_inches(9., 6.)
diagtools.add_legend_outside_right(
plot_details, plt.gca(), column_width=0.15)
if len(plot_details) < 25:
plt.gcf().set_size_inches(9., 6.)
diagtools.add_legend_outside_right(
plot_details, plt.gca(), column_width=0.15)
else:
plt.gcf().set_size_inches(11., 6.)
diagtools.add_legend_outside_right(
plot_details, plt.gca(), column_width=0.18)

logger.info('Saving plots to %s', path)
plt.savefig(path)