Handle large area processing #49
Conversation
sophieherrmann
left a comment
There are some nice improvements!
Have you already checked whether all the other functions pass attributes through properly?
I just found this http://xarray.pydata.org/en/stable/generated/xarray.save_mfdataset.html
Thanks, I just inserted that :)
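The `xarray.save_mfdataset` function linked above writes a list of datasets to a list of paths in one call, which is useful when a large result is split into parts. A minimal sketch of that pattern, using made-up example data (the variable name `t2m` and the split points are illustrative only, not from this PR):

```python
import os
import tempfile
import numpy as np
import xarray as xr

# Small example dataset with a time dimension (illustrative values only).
ds = xr.Dataset(
    {"t2m": (("time", "x"), np.arange(12.0).reshape(6, 2))},
    coords={"time": np.arange(6.0), "x": [0.0, 1.0]},
)

# Split along "time" and write every part with a single save_mfdataset call.
parts = [ds.isel(time=slice(0, 3)), ds.isel(time=slice(3, None))]
tmpdir = tempfile.mkdtemp()
paths = [os.path.join(tmpdir, f"part_{i}.nc") for i in range(len(parts))]
xr.save_mfdataset(parts, paths)

# Reassemble by opening each file and concatenating along "time"
# (avoids the dask dependency that xr.open_mfdataset would require).
roundtrip = xr.concat([xr.open_dataset(p) for p in paths], dim="time")
```

This keeps each output file at a bounded size while still producing them in one call, which fits the large-area workflow discussed in this PR.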
As discussed in openEOPlatform/architecture-docs#27.
Signed-off-by: sherrmann <sophie.herrmann@eodc.eu>
clausmichele
left a comment
Why are you splitting the request into two separate ones? This should be more flexible: split it into parts of a size you know is fine. Otherwise the approach works for, say, x = 100 split into x1 = 50 and x2 = 50, but what about x = 1000? (The numbers are just examples.)
Thanks for reviewing it. That was just a test to try splitting it into two parts, but I will remove the change, as it does not improve the process.
To handle large areas and apply processes to them, I had a look at all the processes. The ones that still need an update to work on large areas are `sort` and `order`. The issue is described here: #52
As this PR already provides a number of new features and bug fixes that are urgently needed, I'll merge it now. A connected issue is that odc_load_helper now only changes nodata values to np.nan.
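A minimal sketch of what "changing nodata values to np.nan" can look like with xarray, assuming the nodata value is stored in the array's attrs under the key `"nodata"` (the helper name and the attribute key are assumptions for illustration; the actual odc_load_helper implementation may differ):

```python
import numpy as np
import xarray as xr

def mask_nodata(da: xr.DataArray) -> xr.DataArray:
    """Replace the nodata value recorded in the attrs with np.nan,
    leaving all other values untouched. Illustrative sketch only;
    the attribute key "nodata" is an assumption."""
    nodata = da.attrs.get("nodata")
    if nodata is None:
        return da
    # where() keeps cells matching the condition and fills the rest with NaN.
    return da.where(da != nodata)

# Example: -9999.0 as the nodata marker (made-up values).
da = xr.DataArray([0.0, -9999.0, 2.0], dims="x", attrs={"nodata": -9999.0})
masked = mask_nodata(da)
```

Only replacing nodata (rather than also rescaling or clipping) keeps the loader's behaviour minimal and leaves any further cleaning to downstream processes.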