Reproducing Key Figures from Kay et al. (2015) Paper

Introduction

This Jupyter Notebook demonstrates how one might use the NCAR Community Earth System Model (CESM) Large Ensemble (LENS) data hosted on AWS S3 (doi:10.26024/wt24-5j82). The notebook shows how to reproduce Figures 2 and 4 from the Kay et al. (2015) paper describing the CESM LENS dataset (doi:10.1175/BAMS-D-13-00255.1).

This resource is intended to be helpful for people who are not familiar with elements of the Pangeo framework, including Jupyter Notebooks, Xarray, and the Zarr data format, or with the original paper, so it includes additional explanation.

Notebook version 3.3 (2019 Dec 19)

Set up environment

[1]:
# Display output of plots directly in Notebook
%matplotlib inline
import warnings
warnings.filterwarnings("ignore")

import intake
import numpy as np
import pandas as pd
import xarray as xr

Create and Connect to Dask Distributed Cluster

[2]:
# Create cluster
from dask_gateway import Gateway
from dask.distributed import Client
gateway = Gateway()
cluster = gateway.new_cluster()
cluster.adapt(minimum=2, maximum=100)
# Connect to cluster
client = Client(cluster)
# Display cluster dashboard URL
cluster

☝️ Link to scheduler dashboard will appear above.
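
When finished (or to change the adaptive range), the cluster can be scaled or shut down explicitly. A brief sketch of the relevant dask-gateway / distributed calls, left commented out because they are not needed yet:

# Optional housekeeping (not run here):
# cluster.scale(20)      # request a fixed number of workers instead of adapting
# client.close()         # disconnect this notebook from the scheduler
# cluster.shutdown()     # stop the scheduler and all workers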

Load data into xarray from a catalog using intake-esm

[3]:
# Open collection description file
catalog_url = 'https://ncar-cesm-lens.s3-us-west-2.amazonaws.com/catalogs/aws-cesm1-le.json'
col = intake.open_esm_datastore(catalog_url)
col

aws-cesm1-le catalog with 27 dataset(s) from 365 asset(s):

unique
component 5
frequency 5
experiment 6
variable 75
path 365
[4]:
# Show the first few lines of the catalog
col.df.head(10)
[4]:
component frequency experiment variable path
0 atm daily 20C FLNS s3://ncar-cesm-lens/atm/daily/cesmLE-20C-FLNS....
1 atm daily 20C FLNSC s3://ncar-cesm-lens/atm/daily/cesmLE-20C-FLNSC...
2 atm daily 20C FLUT s3://ncar-cesm-lens/atm/daily/cesmLE-20C-FLUT....
3 atm daily 20C FSNS s3://ncar-cesm-lens/atm/daily/cesmLE-20C-FSNS....
4 atm daily 20C FSNSC s3://ncar-cesm-lens/atm/daily/cesmLE-20C-FSNSC...
5 atm daily 20C FSNTOA s3://ncar-cesm-lens/atm/daily/cesmLE-20C-FSNTO...
6 atm daily 20C ICEFRAC s3://ncar-cesm-lens/atm/daily/cesmLE-20C-ICEFR...
7 atm daily 20C LHFLX s3://ncar-cesm-lens/atm/daily/cesmLE-20C-LHFLX...
8 atm daily 20C PRECL s3://ncar-cesm-lens/atm/daily/cesmLE-20C-PRECL...
9 atm daily 20C PRECSC s3://ncar-cesm-lens/atm/daily/cesmLE-20C-PRECS...
[5]:
# Show expanded version of collection structure with details
import pprint
uniques = col.unique(columns=["component", "frequency", "experiment", "variable"])
pprint.pprint(uniques, compact=True, indent=4)
{   'component': {   'count': 5,
                     'values': ['atm', 'ice_nh', 'ice_sh', 'lnd', 'ocn']},
    'experiment': {   'count': 6,
                      'values': [   '20C', 'CTRL', 'CTRL_AMIP', 'CTRL_SLAB_OCN',
                                    'HIST', 'RCP85']},
    'frequency': {   'count': 5,
                     'values': [   'daily', 'hourly6-1990-2005',
                                   'hourly6-2026-2035', 'hourly6-2071-2080',
                                   'monthly']},
    'variable': {   'count': 75,
                    'values': [   'DIC', 'DOC', 'FLNS', 'FLNSC', 'FLUT', 'FSNO',
                                  'FSNS', 'FSNSC', 'FSNTOA', 'FW', 'H2OSNO',
                                  'HMXL', 'ICEFRAC', 'LHFLX', 'O2', 'PD',
                                  'PRECC', 'PRECL', 'PRECSC', 'PRECSL', 'PRECT',
                                  'PRECTMX', 'PS', 'PSL', 'Q', 'Q850', 'QFLUX',
                                  'QREFHT', 'QRUNOFF', 'QSW_HBL', 'QSW_HTP',
                                  'RAIN', 'RESID_S', 'RESID_T', 'SALT', 'SFWF',
                                  'SFWF_WRST', 'SHF', 'SHFLX', 'SHF_QSW',
                                  'SNOW', 'SOILLIQ', 'SOILWATER_10CM', 'SSH',
                                  'SST', 'T', 'TAUX', 'TAUX2', 'TAUY', 'TAUY2',
                                  'TEMP', 'TMQ', 'TREFHT', 'TREFHTMN',
                                  'TREFHTMX', 'TS', 'U', 'UBOT', 'UES', 'UET',
                                  'UVEL', 'V', 'VNS', 'VNT', 'VVEL',
                                  'WSPDSRFAV', 'WTS', 'WTT', 'WVEL', 'Z3',
                                  'Z500', 'aice', 'aice_d', 'hi', 'hi_d']}}

Extract data needed to construct Figure 2 of Kay et al. paper

Search the catalog to find the desired data: in this case, the reference height temperature of the atmosphere at daily and monthly time resolution, for the Historical, 20th Century, and RCP8.5 (IPCC Representative Concentration Pathway 8.5) experiments.

[6]:
col_subset = col.search(frequency=["daily", "monthly"], component="atm", variable="TREFHT",
                        experiment=["20C", "RCP85", "HIST"])

col_subset

aws-cesm1-le catalog with 5 dataset(s) from 5 asset(s):

unique
component 1
frequency 2
experiment 3
variable 1
path 5
[7]:
col_subset.df
[7]:
component frequency experiment variable path
0 atm daily 20C TREFHT s3://ncar-cesm-lens/atm/daily/cesmLE-20C-TREFH...
1 atm daily RCP85 TREFHT s3://ncar-cesm-lens/atm/daily/cesmLE-RCP85-TRE...
2 atm monthly 20C TREFHT s3://ncar-cesm-lens/atm/monthly/cesmLE-20C-TRE...
3 atm monthly HIST TREFHT s3://ncar-cesm-lens/atm/monthly/cesmLE-HIST-TR...
4 atm monthly RCP85 TREFHT s3://ncar-cesm-lens/atm/monthly/cesmLE-RCP85-T...
[8]:
# Load catalog entries for subset into a dictionary of xarray datasets
dsets = col_subset.to_dataset_dict(zarr_kwargs={"consolidated": True}, storage_options={"anon": True})
print(f"\nDataset dictionary keys:\n {dsets.keys()}")

--> The keys in the returned dictionary of datasets are constructed as follows:
        'component.experiment.frequency'
100.00% [5/5 00:00<00:00]

Dataset dictionary keys:
 dict_keys(['atm.20C.daily', 'atm.RCP85.monthly', 'atm.20C.monthly', 'atm.HIST.monthly', 'atm.RCP85.daily'])
[9]:
# Define Xarray datasets corresponding to the three experiments
ds_HIST = dsets['atm.HIST.monthly']
ds_20C = dsets['atm.20C.daily']
ds_RCP85 = dsets['atm.RCP85.daily']
[10]:
# Use Dask.Distributed utility function to display size of each dataset
from distributed.utils import format_bytes
print(f"Historical: {format_bytes(ds_HIST.nbytes)}\n"
      f"20th Century: {format_bytes(ds_20C.nbytes)}\n"
      f"RCP8.5: {format_bytes(ds_RCP85.nbytes)}")
Historical: 186.12 MB
20th Century: 277.72 GB
RCP8.5: 306.79 GB
[11]:
# Extract the Reference Height Temperature data variable
t_hist = ds_HIST["TREFHT"]
t_20c = ds_20C["TREFHT"]
t_rcp = ds_RCP85["TREFHT"]
t_20c
[11]:
xarray.DataArray 'TREFHT' (member_id: 40, time: 31390, lat: 192, lon: 288)
dask.array<chunksize=(2, 365, 192, 288), meta=np.ndarray>
    Bytes: 277.72 GB (whole array), 161.46 MB (per chunk); 1720 chunks; dtype float32
Coordinates:
  * lat        (lat) float64 -90.0 -89.06 -88.12 ... 89.06 90.0
  * lon        (lon) float64 0.0 1.25 2.5 ... 356.2 357.5 358.8
  * member_id  (member_id) int64 1 2 3 4 5 6 ... 101 102 103 104 105
  * time       (time) object 1920-01-01 00:00:00 ... 2005-12-31 00:00:00
Attributes:
    cell_methods:  time: mean
    long_name:     Reference height temperature
    units:         K
[12]:
# The global surface temperature anomaly was computed relative to the 1961-90 base period
# in the Kay et al. paper, so extract that time slice
t_ref = t_20c.sel(time=slice("1961", "1990"))

Read grid cell areas

Cell size varies with latitude, so this must be accounted for when computing the global mean.

Note: Each Zarr store includes area values and other ancillary information in addition to the actual data. A possible optimization to reduce data size would be to extract the duplicated information into separate objects.
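
If exact cell areas were not available, an approximate weighting proportional to the cosine of latitude could be used instead. A minimal sketch for a single time step of a single member (illustrative only; the computations below use the exact area field):

# Approximate cos(latitude) weighting -- illustrative, not used below
coslat = np.cos(np.deg2rad(t_20c.lat))
approx_mean = (t_20c.isel(member_id=0, time=0) * coslat).sum(dim=("lat", "lon")) / (
    coslat.sum() * t_20c.sizes["lon"]
)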

[13]:
cell_area = ds_20C.area
total_area = cell_area.sum()
cell_area
[13]:
xarray.DataArray 'area' (lat: 192, lon: 288)
dask.array<chunksize=(192, 288), meta=np.ndarray>
    Bytes: 221.18 kB in a single chunk; dtype float32
Coordinates:
  * lat      (lat) float64 -90.0 -89.06 -88.12 ... 89.06 90.0
  * lon      (lon) float64 0.0 1.25 2.5 ... 356.2 357.5 358.8
Attributes:
    long_name:      Grid-Cell Area
    standard_name:  cell_area
    units:          m2

Define weighted means

Note: resample(time="AS") does an Annual resampling based on Start of calendar year.

See documentation for Pandas resampling options
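
A toy illustration of the annual-start resampling (synthetic values, purely for illustration): 24 monthly values collapse to one mean per calendar year.

# Toy example, not part of the analysis: resample 24 synthetic monthly values
toy = xr.DataArray(
    np.arange(24.0),
    dims="time",
    coords={"time": pd.date_range("2000-01-01", periods=24, freq="MS")},
)
print(toy.resample(time="AS").mean("time").values)  # [ 5.5 17.5]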

[14]:
t_ref_ts = (
    (t_ref.resample(time="AS").mean("time") * cell_area).sum(dim=("lat", "lon"))
    / total_area
).mean(dim=("time", "member_id"))

t_hist_ts = (
    (t_hist.resample(time="AS").mean("time") * cell_area).sum(dim=("lat", "lon"))
) / total_area

t_20c_ts = (
    (t_20c.resample(time="AS").mean("time") * cell_area).sum(dim=("lat", "lon"))
) / total_area

t_rcp_ts = (
    (t_rcp.resample(time="AS").mean("time") * cell_area).sum(dim=("lat", "lon"))
) / total_area

Read data and compute means

Note: Dask’s “lazy execution” philosophy means that until this point we have not actually read the bulk of the data. These steps take a while to complete, so we include the Notebook “cell magic” directive %%time to display elapsed and CPU times after computation.
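
One optional way to confirm that nothing has been read yet: the time-series objects defined above are still backed by lazy Dask arrays rather than NumPy arrays.

# Optional check that the computation is still lazy
print(type(t_20c_ts.data))   # dask.array.core.Array, not numpy.ndarray
print(t_20c_ts)              # repr describes the task graph, not computed values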

[15]:
%%time
t_ref_mean = t_ref_ts.load()
t_ref_mean
CPU times: user 2.2 s, sys: 41.7 ms, total: 2.24 s
Wall time: 1min 39s
[15]:
xarray.DataArray ()
array(286.38766, dtype=float32)
    [16]:
    
    %%time
    t_hist_ts_df = t_hist_ts.to_series().T
    t_hist_ts_df.head()
    
    CPU times: user 262 ms, sys: 4.99 ms, total: 267 ms
    Wall time: 6.95 s
    
    [16]:
    
    time
    1850-01-01 00:00:00    286.372620
    1851-01-01 00:00:00    286.280853
    1852-01-01 00:00:00    286.260742
    1853-01-01 00:00:00    286.218781
    1854-01-01 00:00:00    286.159119
    dtype: float32
    
    [17]:
    
    %%time
    t_20c_ts_df = t_20c_ts.to_series().unstack().T
    t_20c_ts_df.head()
    
    CPU times: user 6.37 s, sys: 164 ms, total: 6.53 s
    Wall time: 45.6 s
    
    [17]:
    
    member_id 1 2 3 4 5 6 7 8 9 10 ... 31 32 33 34 35 101 102 103 104 105
    time
    1920-01-01 00:00:00 286.311310 286.346710 286.283875 286.363983 286.328400 286.373444 286.386017 286.302185 286.374878 286.348358 ... 286.243469 286.283783 286.173859 286.309509 286.296234 286.341064 286.341187 286.376831 286.321167 286.254822
    1921-01-01 00:00:00 286.250641 286.198181 286.287292 286.390564 286.309204 286.334229 286.311310 286.300232 286.315857 286.305603 ... 286.179413 286.315674 286.075104 286.295990 286.318085 286.375275 286.246063 286.356201 286.492523 286.224274
    1922-01-01 00:00:00 286.293488 286.296356 286.265686 286.336517 286.293579 286.220093 286.010773 286.195099 286.205170 286.396545 ... 286.142365 286.316254 286.140167 286.293549 286.327972 286.142365 286.412598 286.369232 286.503418 286.282074
    1923-01-01 00:00:00 286.329163 286.322662 286.251099 286.322723 286.237457 286.152069 286.066040 286.204498 286.271454 286.292236 ... 286.168762 286.300781 286.095490 286.116302 286.227905 286.226440 286.512909 286.381348 286.215302 286.396332
    1924-01-01 00:00:00 286.307465 286.237366 286.148895 286.311890 286.361694 286.185974 286.248352 286.288177 286.330444 286.411835 ... 286.143066 286.287079 286.234100 286.199890 286.252777 286.322815 286.256165 286.221588 286.247437 286.422028

    5 rows × 40 columns

    [18]:
    
    %%time
    t_rcp_ts_df = t_rcp_ts.to_series().unstack().T
    t_rcp_ts_df.head()
    
    CPU times: user 7.11 s, sys: 213 ms, total: 7.32 s
    Wall time: 55.3 s
    
    [18]:
    
    member_id 1 2 3 4 5 6 7 8 9 10 ... 31 32 33 34 35 101 102 103 104 105
    time
    2006-01-01 00:00:00 286.764832 286.960358 286.679230 286.793152 286.754547 287.022339 286.850464 287.089844 286.960022 286.775787 ... 286.866089 286.925049 286.663971 286.955414 286.712524 287.115601 286.863556 286.881683 287.308411 287.030334
    2007-01-01 00:00:00 287.073792 286.908539 286.808746 286.998901 286.841675 286.993042 286.914124 286.938965 286.933563 286.675385 ... 286.804108 286.849548 286.628204 287.010529 286.811523 287.187225 286.862823 287.008240 287.222534 287.239044
    2008-01-01 00:00:00 287.104095 286.815033 286.995056 287.081543 287.100708 286.960510 286.854706 286.878937 287.062927 286.702454 ... 286.825653 286.844086 286.811859 286.803741 286.956635 287.080994 286.930084 286.945801 287.087128 287.157745
    2009-01-01 00:00:00 286.984497 287.059418 287.010498 287.144745 286.948700 287.092316 286.888458 287.050964 287.138428 286.890839 ... 286.785797 286.876556 286.953094 287.060364 287.056885 287.124908 287.005615 287.083984 287.254211 287.060730
    2010-01-01 00:00:00 286.991821 287.102295 286.988159 286.875183 286.954407 287.121796 286.938843 287.116211 286.957245 287.049622 ... 286.937317 286.928284 286.980499 287.118713 287.178040 287.030212 287.114716 287.083038 287.256927 287.066528

    5 rows × 40 columns

    Get observations for figure 2 (HadCRUT4; Morice et al. 2012)

    [19]:
    
    # Observational time series data for comparison with ensemble average
    obsDataURL = "https://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/cru/hadcrut4/air.mon.anom.median.nc"
    
    [20]:
    
    ds = xr.open_dataset(obsDataURL).load()
    ds
    
    [20]:
    
    xarray.Dataset
    Dimensions:    (lat: 36, lon: 72, nbnds: 2, time: 2044)
    Coordinates:
      * lat        (lat) float32 87.5 82.5 77.5 ... -82.5 -87.5
      * lon        (lon) float32 -177.5 -172.5 ... 172.5 177.5
      * time       (time) datetime64[ns] 1850-01-01 ... 2020-04-01
    Data variables:
        time_bnds  (time, nbnds) datetime64[ns] 1850-01-01 ... 2020-04-30
        air        (time, lat, lon) float32 nan nan nan nan ... nan nan nan nan
    Attributes of air: dataset: HADCRUT4; var_desc: Air Temperature;
        level_desc: Surface; statistic: Anomaly; units: degC;
        cell_methods: time: anomaly (monthly from values);
        standard_name: air_temperature_anomaly;
        long_name: HADCRUT4: Median Surface Air Temperature Monthly Median Anomaly from 100 ensemble members
    Attributes:
        platform:       Surface
        title:          HADCRUT4 Combined Air Temperature/SST Anomaly
        history:        Originally created at NOAA/ESRL PSD by CAS 04/2012 from files obtained at the Hadley Center
        Conventions:    CF-1.0
        Comment:        This dataset supersedes V3
        Source:         Obtained from http://hadobs.metoffice.com/ Data is a collaboration of CRU and the Hadley Center
        version:        4.2.0
        dataset_title:  HadCRUT4
        References:     https://www.psl.noaa.gov/data/gridded/data.hadcrut4.html
    [21]:
    
    def weighted_temporal_mean(ds):
        """Compute the annual mean of the observed anomalies, weighting each month
        by its length in days."""
        # Length of each month, in days, from the time bounds
        time_bound_diff = ds.time_bnds.diff(dim="nbnds")[:, 0]
        # Normalize so that the weights within each calendar year sum to 1
        wgts = time_bound_diff.groupby("time.year") / time_bound_diff.groupby(
            "time.year"
        ).sum(xr.ALL_DIMS)
        np.testing.assert_allclose(wgts.groupby("time.year").sum(xr.ALL_DIMS), 1.0)
        obs = ds["air"]
        cond = obs.isnull()
        ones = xr.where(cond, 0.0, 1.0)
        # Weighted annual sums of the data and of its valid-data mask, so that
        # missing months do not bias the annual mean
        obs_sum = (obs * wgts).resample(time="AS").sum(dim="time")
        ones_out = (ones * wgts).resample(time="AS").sum(dim="time")
        # Annual, globally averaged anomaly time series
        obs_s = (obs_sum / ones_out).mean(("lat", "lon")).to_series()
        return obs_s
    
    • Limit Observations to 20th Century

    [22]:
    
    obs_s = weighted_temporal_mean(ds)
    obs_s = obs_s['1920':]
    obs_s.head()
    
    [22]:
    
    time
    1920-01-01   -0.262006
    1921-01-01   -0.195891
    1922-01-01   -0.301986
    1923-01-01   -0.269062
    1924-01-01   -0.292857
    Freq: AS-JAN, dtype: float64
    
    [23]:
    
    all_ts_anom = pd.concat([t_20c_ts_df, t_rcp_ts_df]) - t_ref_mean.data
    years = [val.year for val in all_ts_anom.index]
    obs_years = [val.year for val in obs_s.index]
    
    [24]:
    
    # Combine ensemble member 1 data from historical and 20th century experiments
    hist_anom = t_hist_ts_df - t_ref_mean.data
    member1 = pd.concat([hist_anom.iloc[:-2], all_ts_anom.iloc[:,0]], verify_integrity=True)
    member1_years = [val.year for val in member1.index]
    

    Figure 2: Global surface temperature anomaly (1961-90 base period) for individual ensemble members, and observations

    [25]:
    
    import matplotlib.pyplot as plt
    
    [26]:
    
    ax = plt.axes()
    
    ax.tick_params(right=True, top=True, direction="out", length=6, width=2, grid_alpha=0.5)
    ax.plot(years, all_ts_anom.iloc[:,1:], color="grey")
    ax.plot(obs_years, obs_s['1920':], color="red")
    ax.plot(member1_years, member1, color="black")
    
    
    ax.text(
        0.35,
        0.4,
        "observations",
        verticalalignment="bottom",
        horizontalalignment="left",
        transform=ax.transAxes,
        color="red",
        fontsize=10,
    )
    ax.text(
        0.35,
        0.33,
        "members 2-40",
        verticalalignment="bottom",
        horizontalalignment="left",
        transform=ax.transAxes,
        color="grey",
        fontsize=10,
    )
    ax.text(
        0.05,
        0.2,
        "member 1",
        verticalalignment="bottom",
        horizontalalignment="left",
        transform=ax.transAxes,
        color="black",
        fontsize=10,
    )
    
    ax.set_xticks([1850, 1920, 1950, 2000, 2050, 2100])
    plt.ylim(-1, 5)
    plt.xlim(1850, 2100)
    plt.ylabel("Global Surface\nTemperature Anomaly (K)")
    plt.show()
    
    [Figure 2 plot output]

    Figure will appear above when ready. Compare with Fig. 2 of Kay et al. 2015 (doi:10.1175/BAMS-D-13-00255.1).

    [Fig. 2 from Kay et al. 2015, shown for comparison]

    Compute linear trend for winter seasons

    [27]:
    
    def linear_trend(da, dim="time"):
        da_chunk = da.chunk({dim: -1})
        trend = xr.apply_ufunc(
            calc_slope,
            da_chunk,
            vectorize=True,
            input_core_dims=[[dim]],
            output_core_dims=[[]],
            output_dtypes=[np.float64],
            dask="parallelized",
        )
        return trend
    
    
    def calc_slope(y):
        """ufunc to be used by linear_trend"""
        x = np.arange(len(y))
    
        # drop missing values (NaNs) from x and y
        finite_indexes = ~np.isnan(y)
        slope = np.nan if (np.sum(finite_indexes) < 2) else np.polyfit(x[finite_indexes], y[finite_indexes], 1)[0]
        return slope
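
    As a hedged sketch of how these helpers might be applied to the winter-trend analysis, the fragment below selects December-February months, averages them by calendar year, and computes the gridpoint trend for one ensemble member. The DJF selection is simplified (it ignores the December year-boundary subtlety) and is illustrative rather than the exact recipe behind Figure 4.

    # Illustrative only: simplified DJF means per year, then the linear trend
    # (K per year) at each grid point for ensemble member 1; lazy until .load()
    t_djf = t_20c.where(t_20c.time.dt.season == "DJF").resample(time="AS").mean("time")
    djf_trend = linear_trend(t_djf.isel(member_id=0))
    # djf_trend.load() would trigger the computation on the Dask cluster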
    

    Get Observations for Figure 4 (NASA GISS GISTEMP)

    [31]:
    
    # Observational time series data for comparison with ensemble average
    # NASA GISS Surface Temperature Analysis, https://data.giss.nasa.gov/gistemp/
    obsDataURL = "https://data.giss.nasa.gov/pub/gistemp/gistemp1200_GHCNv4_ERSSTv5.nc.gz"
    
    [32]:
    
    # Download, unzip, and load file
    import os
    os.system("wget " + obsDataURL)
    
    obsDataFileName = obsDataURL.split('/')[-1]
    os.system("gunzip " + obsDataFileName)
    
    obsDataFileName = obsDataFileName[:-3]
    ds = xr.open_dataset(obsDataFileName).load()
    ds
    
    [32]:
    
    xarray.Dataset
    Dimensions:      (lat: 90, lon: 180, nv: 2, time: 1685)
    Coordinates:
      * lat          (lat) float32 -89.0 -87.0 -85.0 ... 87.0 89.0
      * lon          (lon) float32 -179.0 -177.0 ... 177.0 179.0
      * time         (time) datetime64[ns] 1880-01-15 ... 2020-05-15
    Data variables:
        time_bnds    (time, nv) datetime64[ns] 1880-01-01 ... 2020-06-01
        tempanomaly  (time, lat, lon) float32 nan nan nan nan ... 3.51 3.51 3.51
    Attributes of tempanomaly: long_name: Surface temperature anomaly; units: K; cell_methods: time: mean
    Attributes:
        title:        GISTEMP Surface Temperature Analysis
        institution:  NASA Goddard Institute for Space Studies
        source:       http://data.giss.nasa.gov/gistemp/
        Conventions:  CF-1.6
        history:      Created 2020-06-10 15:03:48 by SBBX_to_nc 2.0 - ILAND=1200, IOCEAN=NCDC/ER5, Base: 1951-1980
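
    If wget and gunzip are not available in the compute environment, the same file can be fetched and decompressed in pure Python. A sketch assuming the obsDataURL defined above (the local file names are arbitrary):

    # Pure-Python alternative to the wget/gunzip shell calls above (optional)
    import gzip, shutil, urllib.request
    with urllib.request.urlopen(obsDataURL) as r, open("gistemp1200.nc.gz", "wb") as f:
        shutil.copyfileobj(r, f)
    with gzip.open("gistemp1200.nc.gz", "rb") as zf, open("gistemp1200.nc", "wb") as f:
        shutil.copyfileobj(zf, f)
    # ds = xr.open_dataset("gistemp1200.nc").load()   # equivalent to the cell above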
    [33]:
    
    # Remap longitude range from [-180, 180] to [0, 360] for plotting purposes
    ds = ds.assign_coords(lon=((ds.lon + 360) % 360)).sortby('lon')
    ds
    
    [33]:
    
    xarray.Dataset
    Dimensions:      (lat: 90, lon: 180, nv: 2, time: 1685)
    Coordinates:
      * lat          (lat) float32 -89.0 -87.0 -85.0 ... 87.0 89.0
      * lon          (lon) float32 1.0 3.0 5.0 ... 355.0 357.0 359.0
      * time         (time) datetime64[ns] 1880-01-15 ... 2020-05-15
    Data variables:
        time_bnds    (time, nv) datetime64[ns] 1880-01-01 ... 2020-06-01
        tempanomaly  (time, lat, lon) float32 nan nan nan nan ... 3.51 3.51 3.51
    Attributes:
        title:        GISTEMP Surface Temperature Analysis
        institution:  NASA Goddard Institute for Space Studies
        source:       http://data.giss.nasa.gov/gistemp/
        Conventions:  CF-1.6
        history:      Created 2020-06-10 15:03:48 by SBBX_to_nc 2.0 - ILAND=1200, IOCEAN=NCDC/ER5, Base: 1951-1980
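
    The coordinate change above is just modular arithmetic followed by a sort; a tiny illustration with a few sample longitudes:

    # Tiny illustration of the remap used above: [-180, 180) -> [0, 360)
    lons = np.array([-177.5, -2.5, 2.5, 177.5])
    print((lons + 360) % 360)   # [182.5 357.5   2.5 177.5]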