Plotting a raster dataset that is too large for memory #9874
Please ask here instead: https://discourse.pangeo.io/
Hello All,
I'm trying to generate a plot from a dataset that is too large to fit into memory (data based on a global MODIS 500 m mosaic, stored as individual NetCDF/GeoTIFF tiles). Doing the naive thing and calling xarray.open_mfdataset (with reasonable chunks set) followed by .plot crashes the terminal.
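For reference, the failing pattern looks roughly like the sketch below. The tile files and variable name are made up (two tiny synthetic tiles stand in for the real mosaic), but the open_mfdataset call is the same shape as the one described above:

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Write two tiny stand-in tiles so open_mfdataset has something to read
# (the real inputs would be the MODIS 500 m mosaic tiles).
tmp = tempfile.mkdtemp()
for i in range(2):
    xr.Dataset(
        {"band": (("y", "x"), np.random.rand(10, 10))},
        coords={"y": np.arange(10), "x": np.arange(i * 10, i * 10 + 10)},
    ).to_netcdf(os.path.join(tmp, f"tile{i}.nc"))

# Lazily combine the tiles along their coordinates; `chunks` keeps
# each dask task small, so opening itself is cheap.
ds = xr.open_mfdataset(
    os.path.join(tmp, "tile*.nc"),
    combine="by_coords",
    chunks={"y": 5, "x": 5},
)

# ds["band"].plot() would first compute the *entire* array; on the
# real global mosaic, that load is what exhausts memory.
```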
Searching around, there does not seem to be a tutorial on how to plot larger-than-memory raster data using xarray/dask with matplotlib/HoloViews/other. The nearest answer I could find is https://discourse.holoviz.org/t/larger-than-memory-plotting-how-to-set-initial-slice-to-plot/2570/7, which involves the intermediate step of a point cloud.
Looking through the libraries, the HoloViews/Datashader combination does seem to go in the right direction, but all the examples only display points (either gridded directly or over polygonal vector shapes).
So the question is: how can I plot and export the too-large-for-memory dataset using xarray, so that I can choose both the spatial extent (global to local, with the option to easily add things like coastlines) and the temporal extent (processing with dask along the time axis for mean, sum, and similar)?
Thank you for your help.
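One common pattern for the workflow described above is to keep the reduction lazy and only compute the small 2-D result that actually gets plotted. A minimal sketch, using a synthetic dask-backed DataArray in place of the real open_mfdataset output (dimension names and sizes are assumptions):

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for the mosaic: a lazily chunked DataArray.
# In the real case this would come from xr.open_mfdataset(...).
da = xr.DataArray(
    np.random.rand(4, 100, 100),
    dims=("time", "y", "x"),
).chunk({"time": 1, "y": 50, "x": 50})

# Reduce along the time axis lazily; nothing is loaded yet.
mean_map = da.mean("time")

# Subset a spatial window *before* computing, so only that small
# 2-D slice is ever materialized in memory.
window = mean_map.isel(y=slice(0, 50), x=slice(0, 50)).compute()

# window.plot()  # the computed slice now fits in memory
```

For coastlines, the usual approach (not shown here, to keep the sketch dependency-light) is to pass a cartopy projection as the axes' projection and call ax.coastlines() before plotting the computed slice.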