This used to be such a pain: people were looping over a year's worth of files at a time for 60 years of data. This is ERA5:
```python
import dask.array
import flox.xarray  # registers flox so that method="blockwise" is available
import pandas as pd
import xarray as xr

dims = ("time", "level", "lat", "lon")
# nyears is the number of years; adjust to make it bigger.
# The full dataset is 60-ish years.
nyears = 20
shape = (nyears * 365 * 24, 37, 721, 1440)
# One chunk per day along time, one per vertical level, whole horizontal field.
chunks = (24, 1, -1, -1)
ds = xr.Dataset(
    {
        "U": (dims, dask.array.random.random(shape, chunks=chunks)),
        "V": (dims, dask.array.random.random(shape, chunks=chunks)),
        "W": (dims, dask.array.random.random(shape, chunks=chunks)),
        "T": (dims, dask.array.random.random(shape, chunks=chunks)),
    },
    coords={"time": pd.date_range("2001-01-01", periods=shape[0], freq="H")},
)

# Zonal means and eddy (anomaly) covariances for the TEM diagnostics.
zonal_means = ds.mean("lon")
anomaly = ds - zonal_means
anomaly["uv"] = anomaly.U * anomaly.V
anomaly["vt"] = anomaly.V * anomaly.T
anomaly["uw"] = anomaly.U * anomaly.W
temdiags = zonal_means.merge(anomaly[["uv", "vt", "uw"]].mean("lon"))

# note: method="blockwise" uses flox
temdiags = temdiags.resample(time="D").mean(method="blockwise")
temdiags
```
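The reason `method="blockwise"` works so well here is that the 24-hour chunks line up exactly with calendar-day boundaries, so each daily mean reduces a single chunk with no communication between tasks. A minimal sketch of that alignment, using only numpy and pandas on a tiny hypothetical array (sizes are illustrative, not from the issue):

```python
import numpy as np
import pandas as pd

# Tiny stand-in: 3 days of hourly data, "chunked" in blocks of 24 hours.
hours = pd.date_range("2001-01-01", periods=3 * 24, freq="H")
values = np.arange(len(hours), dtype=float)

# Chunk boundaries every 24 steps coincide with day boundaries, so a
# per-chunk mean is identical to the daily resampled mean.
per_chunk = values.reshape(-1, 24).mean(axis=1)
daily = pd.Series(values, index=hours).resample("D").mean().to_numpy()
assert np.allclose(per_chunk, daily)
```

If the chunking did not align with the resampling frequency (say 7-hour chunks), a blockwise reduction would no longer be valid and flox would need one of its other strategies.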