
load_uploaded_files not implemented? #177

Closed
kempenep opened this issue Feb 10, 2021 · 6 comments
kempenep commented Feb 10, 2021

I am looking for the process load_uploaded_files (see the specs). Has this been implemented in the openeo-python-client?

I also noted that when using:

polygonal_mean_timeseries('path')

the process graph calls read_vector. This process_id is not listed in the specs; should it be load_uploaded_files instead?

@soxofaan
Member

load_uploaded_files is indeed not implemented as a method or function in the python client yet.
You can, however, build a cube using the generic datacube_from_process method.
Something like this:

cube = con.datacube_from_process("load_uploaded_files", paths=["/path/to/....", "..."], format="...")

The read_vector in polygonal_mean_timeseries is a VITO-specific process at the moment; see #104.
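
For context, datacube_from_process wraps a raw process call into a cube. A hedged sketch of the single process-graph node such a call would produce (the node id, path, and format value below are illustrative placeholders, not the original elided arguments):

```python
# Sketch of the JSON node that datacube_from_process("load_uploaded_files", ...)
# would emit; node id "loaduploadedfiles1" and the arguments are hypothetical.
node = {
    "loaduploadedfiles1": {
        "process_id": "load_uploaded_files",
        "arguments": {
            "paths": ["/data/example.geojson"],  # hypothetical uploaded-file path
            "format": "GeoJSON",                 # hypothetical file-format id
        },
        "result": True,
    }
}
```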

@kempenep
Author

Thanks for the hint.
The function polygonal_mean_timeseries('path') would suit my needs, and I can implement a read_vector in our back-end. However, as long as read_vector is not part of the specs, the graph parser from @lforesta will complain (or I would have to include a proprietary definition in our own specs).

As for building a cube using the generic datacube_from_process, is it possible to load a vector? When I try to load a GeoJSON, I get:

UserWarning: No cube:dimensions metadata
  complain("No cube:dimensions metadata")

How would it be combined with the polygonal_mean_timeseries? Or is there a more generic way to call the aggregate_spatial function?

@soxofaan
Member

That "No cube:dimensions metadata" is just a warning; it shouldn't block you. Can you provide a bit more source code? It's hard to guess what you are trying to do.

@kempenep
Author

Sure, I want a simple aggregate_spatial from a datacube, e.g., ndvi obtained via:

datacube = connection.load_collection("EarthObservation.Copernicus.S2.scenes.source.L2A")
datacube = datacube.filter_bbox(west=5.251809, south=51.705835, east=5.462144, north=51.838069)
datacube = datacube.filter_temporal(start_date="2020-01-01", end_date="2020-03-01")
datacube = datacube.filter_bands(["B4", "B8"])
ndvi = datacube.ndvi(nir='B8', red='B4')
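
For reference, the per-pixel quantity that the ndvi step computes is the Normalized Difference Vegetation Index, (NIR − RED) / (NIR + RED); a minimal standalone sketch:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    return (nir - red) / (nir + red)

# Example: high NIR relative to red indicates vegetation.
# ndvi(0.6, 0.2) → 0.5
```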

I have tried this:

ndvi = ndvi.aggregate_spatial('/path/to/vector.json', 'median')

But this calls the read_vector process.

From your hint, I figured the vector could be loaded as an uploaded-files cube (?):

ndvi = ndvi.aggregate_spatial(con.datacube_from_process("load_uploaded_files", paths=["/path/to/....", "..."], format="..."), 'median')

But then the cube is not supported in this process, as geometries are expected (error message):

OpenEoClientException: Invalid `geometries`: <openeo.rest.datacube.DataCube object at 0x7f4058cd44f0>
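
Indeed, aggregate_spatial expects actual geometries rather than a data cube. A hedged sketch of an alternative, assuming the client accepts a parsed GeoJSON dict for the geometries argument (the polygon below is illustrative; in practice you would json.load() it from /path/to/vector.json):

```python
# A minimal GeoJSON polygon, built inline here for illustration; passing the
# parsed dict (instead of a path string) inlines the geometry into the process
# graph, avoiding the back-end-specific read_vector process.
geometries = {
    "type": "Polygon",
    "coordinates": [[
        [5.251809, 51.705835],
        [5.462144, 51.705835],
        [5.462144, 51.838069],
        [5.251809, 51.838069],
        [5.251809, 51.705835],  # ring is closed: last point repeats the first
    ]],
}

# Hypothetical usage (requires an openeo connection and cube):
# ndvi = ndvi.aggregate_spatial(geometries, "median")
```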

@lforesta
Contributor

> However, as long as read_vector is not a part of the specs, the graph parser from @lforesta will complain (or I would have to include a proprietary definition in our own specs).

Indeed, you can add a custom predefined process to your back-end. Otherwise the validation step will complain, since that process does not appear in your supported processes.

@soxofaan
Member

soxofaan commented Aug 8, 2023

load_uploaded_files support was added in b1b5474

load_geojson and load_url have been added too under #424/#457
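
For reference, the newer load_geojson process takes the GeoJSON inline as a process argument. A hedged sketch of a corresponding process-graph node, assuming the process's data/properties parameters (node id and geometry are illustrative):

```python
# Hypothetical process-graph node for load_geojson; the client would normally
# build this for you from something like con.load_geojson({...}).
node = {
    "loadgeojson1": {
        "process_id": "load_geojson",
        "arguments": {
            "data": {"type": "Point", "coordinates": [5.3, 51.8]},  # inline GeoJSON
            "properties": [],  # no extra properties selected
        },
        "result": True,
    }
}
```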

@soxofaan soxofaan closed this as completed Aug 8, 2023