All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- `load_stac`/`metadata_from_stac`: add support for extracting actual temporal dimension metadata (#567)
- `MultiBackendJobManager`: add `cancel_running_job_after` option to automatically cancel jobs that are running for too long (#590) (example below)
- Added `openeo.api.process.Parameter` helper to easily create a "spatial_extent" UDP parameter
- `MultiBackendJobManager`: changed job metadata storage API, to enable working with large databases
- `DataCube.apply_polygon()`: rename `polygons` argument to `geometries`, but keep support for legacy `polygons` for now (#592, #511)
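A minimal sketch of the job cancellation option; the backend URL and the assumption that the timeout is passed to the constructor in seconds are illustrative, not authoritative:

```python
import openeo
from openeo.extra.job_management import MultiBackendJobManager

# Cancel any job that has been running for more than two hours (value in seconds).
manager = MultiBackendJobManager(cancel_running_job_after=2 * 60 * 60)
manager.add_backend("example", connection=openeo.connect("openeo.example.com"))
```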
- `apply_dimension` with a `target_dimension` argument was not correctly adjusting datacube metadata on the client side, causing a mismatch.
- Preserve non-spatial dimension metadata in `aggregate_spatial` (#612)
- Add experimental `openeo.testing.results` subpackage with reusable test utilities for comparing batch job results with reference data
- `MultiBackendJobManager`: add initial support for storing job metadata in Parquet file (instead of CSV) (#571)
- Add `Connection.authenticate_oidc_access_token()` to set up authorization headers with an access token that is obtained "out-of-band" (#598) (example below)
- Add `JobDatabaseInterface` to allow custom job metadata storage with `MultiBackendJobManager` (#571)
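A sketch of the out-of-band token usage; the token value is a placeholder and the keyword argument name is an assumption:

```python
import openeo

connection = openeo.connect("openeo.example.com")
# Access token obtained "out-of-band", e.g. through another tool or service.
connection.authenticate_oidc_access_token(access_token="eyJhbGciOiJSUzI1NiJ9...")
```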
- Add `openeo.udf.run_code.extract_udf_dependencies()` to extract UDF dependency declarations from UDF code (related to Open-EO/openeo-geopyspark-driver#237) (example below)
- Document PEP 723 based Python UDF dependency declarations (Open-EO/openeo-geopyspark-driver#237)
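The declaration format follows PEP 723 inline script metadata; a minimal sketch (the dependency list is illustrative):

```python
from openeo.udf.run_code import extract_udf_dependencies

udf_code = '''
# /// script
# dependencies = ["scipy", "rasterio"]
# ///
import scipy
'''
print(extract_udf_dependencies(udf_code))  # e.g. ['scipy', 'rasterio']
```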
- Added more `openeo.api.process.Parameter` helpers to easily create "bounding_box", "date", "datetime", "geojson" and "temporal_interval" parameters for UDP construction.
- Added convenience method `Connection.load_stac_from_job(job)` to easily load the results of a batch job with the `load_stac` process (#566) (example below)
- `load_stac`/`metadata_from_stac`: add support for extracting band info from "item_assets" in collection metadata (#573)
- Added initial `openeo.testing` submodule for reusable test utilities
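A sketch of loading the results of a finished batch job; the job id is a placeholder:

```python
job = connection.job("j-240102abcdef")  # placeholder job id of a finished job
cube = connection.load_stac_from_job(job)
```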
- Initial fix for broken `DataCube.reduce_temporal()` after `load_stac` (#568)
- OIDC device code flow: hide progress bar on completed (or timed out) authentication
- Introduced superclass `CubeMetadata` for `CollectionMetadata` for essential metadata handling (just dimensions for now) without collection-specific STAC metadata parsing. (#464)
- Added `VectorCube.vector_to_raster()` (#550)
- Changed default `chunk_size` of various `download` functions from None to 10MB. This improves the handling of large downloads and reduces memory usage. (#528)
- `Connection.execute()` and `DataCube.execute()` now have an `auto_decode` argument. If set to True (the default), the response is decoded as JSON and an exception is raised if this fails; if set to False, the raw `requests.Response` object is returned. (#499) (example below)
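A sketch of both modes, assuming an authenticated connection (the process graph is a trivial placeholder):

```python
import openeo

connection = openeo.connect("openeo.example.com").authenticate_oidc()
process_graph = {"add": {"process_id": "add", "arguments": {"x": 3, "y": 5}, "result": True}}

result = connection.execute(process_graph)  # decoded JSON (default); raises if decoding fails
response = connection.execute(process_graph, auto_decode=False)  # raw requests.Response
```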
- Preserve geo-referenced `x` and `y` coordinates in `execute_local_udf` (#549)
- Add `DataCube.filter_labels()`
- Update autogenerated functions/methods in `openeo.processes` to definitions from openeo-processes project version 2.0.0-rc1. This removes `create_raster_cube`, `fit_class_random_forest`, `fit_regr_random_forest` and `save_ml_model`. Although removed from openeo-processes 2.0.0-rc1, support for `load_result`, `predict_random_forest` and `load_ml_model` is preserved but deprecated. (#424)
- Show more informative error message on `403 Forbidden` errors from CDSE firewall (#512)
- Handle API error responses more strictly and avoid hiding possibly important information in JSON-formatted but non-compliant error responses.
- Fix band name support in `DataCube.band()` when no metadata is available (#515)
- Support optional child callbacks in generated `openeo.processes`, e.g. `merge_cubes` (#522)
- Fix broken pre-flight validation in `Connection.save_user_defined_process` (#526)
- Support new UDF signature: `def apply_datacube(cube: DataArray, context: dict) -> DataArray` (#310) (example below)
- Add `collection_property()` helper to easily build collection metadata property filters for `Connection.load_collection()` (#331)
- Add `DataCube.apply_polygon()` (standardized version of experimental `chunk_polygon`) (#424)
- Various improvements to band mapping with the Awesome Spectral Indices feature. Allow explicitly specifying the satellite platform for band name mapping (e.g. "Sentinel2" or "LANDSAT8") if cube metadata lacks info. Follow the official band mapping from Awesome Spectral Indices better. Allow manually specifying the desired band mapping. (#485, #501)
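A minimal UDF sketch using this signature; the rescaling logic is just an illustration:

```python
# my_udf.py
from xarray import DataArray

def apply_datacube(cube: DataArray, context: dict) -> DataArray:
    # `context` carries user-provided parameters; apply a simple rescaling here.
    factor = context.get("factor", 0.0001) if context else 0.0001
    return cube * factor
```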
- Also attempt to automatically refresh OIDC access token on a `401 TokenInvalid` response (in addition to `403 TokenInvalid`) (#508)
- Add `Parameter.object()` factory for `object` type parameters
- Remove custom spectral indices "NDGI", "NDMI" and "S2WI" from "extra-indices-dict.json" that were shadowing the official definitions from Awesome Spectral Indices (#501)
- Initial support for "spectral indices" that use constants defined by Awesome Spectral Indices (#501)
- Introduce `OpenEoApiPlainError` for API error responses that are not well-formed, for better distinction from properly formed API error responses (`OpenEoApiError`). (#491)
- Fix missing `validate` support in `LocalConnection.execute` (#493)
- Add `DataCube.reduce_spatial()`
- Added option (enabled by default) to automatically validate a process graph before execution. Validation issues just trigger warnings for now. (#404)
- Added "Sentinel1" band mapping support to "Awesome Spectral Indices" wrapper (#484)
- Run tests in GitHub Actions against Python 3.12 as well
- Enforce `XarrayDataCube` dimension order in `execute_local_udf()` to (t, bands, y, x) to improve UDF interoperability with existing back-end implementations.
- Support year/month shorthand date notations in temporal extent arguments of `Connection.load_collection`, `DataCube.filter_temporal` and related (#421) (example below)
- Support parameterized `bands` in `load_collection` (#471)
- Allow specifying item schema in `Parameter.array()`
- Support "subtype" and "format" schema options in `Parameter.string()`
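A sketch of the shorthand notation; the collection id is a placeholder and the expansion semantics described in the comment are my reading of the feature:

```python
# "2021" is shorthand for 2021-01-01; as an end bound, "2022" means up to (not including) 2022-01-01.
cube = connection.load_collection("SENTINEL2_L2A", temporal_extent=["2021", "2022"])
cube = cube.filter_temporal("2021-03", "2021-05")
```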
- Before doing user-defined process (UDP) listing/creation: verify that the back-end supports it (through the openEO capabilities document) to improve error messages.
- Skip metadata-based normalization/validation and stop showing unhelpful warnings/errors like "No cube:dimensions metadata" or "Invalid dimension" when no metadata is available client-side anyway (e.g. when using `datacube_from_process`, parameterized cube building, ...). (#442)
- Bumped minimal supported Python version to 3.7 (#460)
- Support handling of "callback" parameters in `openeo.processes` callables (#470)
- Processes that take a CRS as argument now try harder to normalize your input to a CRS representation that aligns with the openEO API (using the `pyproj` library when available) (#259)
- Initial `load_geojson` support with `Connection.load_geojson()` (#424)
- Initial `load_url` (for vector cubes) support with `Connection.load_url()` (#424)
- Add `VectorCube.apply_dimension()` (Open-EO/openeo-python-driver#197)
- Support lambda based property filtering in `Connection.load_stac()` (#425)
- `VectorCube`: initial support for `filter_bands`, `filter_bbox`, `filter_labels` and `filter_vector` (#459)
- `Connection` based requests: always use finite timeouts by default (20 minutes in general, 30 minutes for synchronous execute requests) (#454)
- Fix: `MultiBackendJobManager` should stop when finished, also when a job finishes with an error (#452)
- Fix `spatial_extent`/`temporal_extent` handling in "localprocessing" `load_stac` (#451)
- Add support in `VectorCube.download()` and `VectorCube.execute_batch()` to guess the output format from the extension of a given filename (#401, #449)
- Added `load_stac` for Client Side Processing, based on the openeo-processes-dask implementation
- Updated docs for Client Side Processing with `load_stac` examples, available at https://open-eo.github.io/openeo-python-client/cookbook/localprocessing.html
- Avoid double `save_result` nodes when combining `VectorCube.save_result()` and `.download()`. (#401, #448)
- Added automatic renewal of access tokens with OIDC client credentials grant (`Connection.authenticate_oidc_client_credentials`) (#436)
- Simplified `BatchJob` methods `start()`, `stop()`, `describe()`, ... Legacy aliases `start_job()`, `describe_job()`, ... are still available and don't trigger a deprecation warning for now. (#280)
- Update `openeo.extra.spectral_indices` to Awesome Spectral Indices v0.4.0
- Generalized support for setting (default) OIDC provider id through env var `OPENEO_AUTH_PROVIDER_ID` (#419)
- Added `OidcDeviceCodePollTimeout`: specific exception for OIDC device code flow poll timeouts
- On-demand preview: Added `DataCube.preview()` to generate an XYZ service with the process graph and display a map widget
- Fix format option conflict between `save_result` and `create_job` (#433)
- Ensure that OIDC device code link opens in a new tab/window (#443)
- Support OIDC client credentials grant from a generic `connection.authenticate_oidc()` call through environment variables (#419)
- Fixed UDP parameter conversion issue in `build_process_dict` when using a parameter in the `context` of `run_udf` (#431)
- `Connection.authenticate_oidc()`: add argument `max_poll_time` to set maximum device code flow poll time
- Show progress bar while waiting for OIDC authentication with device code flow, including a special mode for Jupyter notebooks. (#237)
- Basic support for the `load_stac` process with `Connection.load_stac()` (#425) (example below)
- Add `DataCube.aggregate_spatial_window()`
- Include "scope" parameter in OIDC token request with client credentials grant.
- Support fractional seconds in `Rfc3339.parse_datetime` (#418)
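A sketch of loading a STAC collection as a data cube; the URL is a placeholder and the filtering keyword arguments are assumed to mirror `load_collection`:

```python
cube = connection.load_stac(
    "https://stac.example.com/collections/sentinel-2-l2a",  # placeholder STAC URL
    spatial_extent={"west": 3.2, "south": 51.0, "east": 3.3, "north": 51.1},
    temporal_extent=["2023-05-01", "2023-06-01"],
    bands=["B04", "B08"],
)
```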
- Full support for user-uploaded files (`/files` endpoints) (#377)
- Initial, experimental "local processing" feature to use openEO Python Client Library functionality on local GeoTIFF/NetCDF files and also do the processing locally using the `openeo_processes_dask` package (#338)
- Add `BatchJob.get_results_metadata_url()`
- `Connection.list_files()` returns a list of `UserFile` objects instead of a list of metadata dictionaries. Use `UserFile.metadata` to get the original dictionary. (#377)
- `DataCube.aggregate_spatial()` returns a `VectorCube` now, instead of a `DataCube` (#386). The (experimental) `fit_class_random_forest()` and `fit_regr_random_forest()` methods moved accordingly to the `VectorCube` class.
- Improved documentation on `openeo.processes` and `ProcessBuilder` (#390).
- `DataCube.create_job()` and `Connection.create_job()` now require keyword arguments for all but the first argument for clarity. (#412)
- Pass minimum log level to backend when retrieving batch job and secondary service logs. (Open-EO/openeo-api#485, Open-EO/openeo-python-driver#170)
- Dropped support for pre-1.0.0 versions of the openEO API (#134):
  - Remove `ImageCollectionClient` and related helpers (now unused leftovers from version 0.4.0 and earlier). (Also #100)
  - Drop support for pre-1.0.0 job result metadata
  - Require at least version 1.0.0 of the openEO API for a back-end in `Connection` and all its methods.
- Reinstated old behavior of authentication related user files (e.g. refresh token store) on Windows: when `PrivateJsonFile` may be readable by others, just log a message instead of raising `PermissionError` (#387)
- `VectorCube.create_job()` and `MlModel.create_job()` are properly aligned with `DataCube.create_job()` regarding setting job title, description, etc. (#412)
- More robust handling of billing currency/plans in capabilities (#414)
- Avoid blindly adding a `save_result` node from `DataCube.execute_batch()` when there is already one (#401)
- The openeo Python client library can now also be installed with conda (conda-forge channel) (#176)
- Allow using a custom `requests.Session` in `openeo.rest.auth.oidc` logic
- Less verbose log printing on failed batch job (#332)
- Improve (UTC) timezone handling in `openeo.util.Rfc3339` and add `rfc3339.today()`/`rfc3339.utcnow()`.
- Fine-tuned `XarrayDataCube` tests for conda building and packaging (#176)
- Jupyter integration: show process graph visualization of `DataCube` objects instead of generic `repr`. (#336)
- Add `Connection.vectorcube_from_paths()` to load a vector cube from files (on back-end) or URLs with the `load_uploaded_files` process.
- Python 3.10 and 3.11 are now officially supported (test run now also for 3.10 and 3.11 in GitHub Actions, #346)
- Support for simplified OIDC device code flow (#335)
- Added `MultiBackendJobManager`, based on implementation from openeo-classification project (#361)
- Added resilience to `MultiBackendJobManager` for backend failures (#365)
- `execute_batch` also skips temporary `502 Bad Gateway` errors. (#352)
- Fixed/improved math operator/process support for `DataCube`s in "apply" mode (non-"band math"), allowing expressions like `10 * cube.log10()` and `~(cube == 0)` (#123)
- Support `PrivateJsonFile` permissions properly on Windows, using the oschmod library. (#198)
- Fixed some broken unit tests on Windows related to path (separator) handling. (#350)
- Add `max_cloud_cover` argument to `load_collection()` to simplify setting the maximum cloud cover (property `eo:cloud_cover`) (#328) (example below)
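A sketch of the cloud cover filter; collection id, extent and threshold are placeholders:

```python
cube = connection.load_collection(
    "SENTINEL2_L2A",
    temporal_extent=["2023-05-01", "2023-06-01"],
    bands=["B04", "B08"],
    max_cloud_cover=80,  # only consider products with eo:cloud_cover <= 80
)
```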
- Improve default dimension metadata of a datacube created with `openeo.rest.datacube.DataCube.load_disk_collection`
- `DataCube.download()`: only automatically add a `save_result` node when there is none yet.
- Deprecation warnings: make sure they are shown by default and can be hidden when necessary.
- Rework and improve `openeo.UDF` helper class for UDF usage (#312) (example below):
  - allow loading directly from local file or URL
  - autodetect `runtime` from file/URL suffix or source code
  - hide implementation details around `data` argument (e.g. `data={"from_parameter": "x"}`)
  - old usage patterns of `openeo.UDF` and `DataCube.apply_dimension()` still work but trigger deprecation warnings
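A sketch of the reworked helper, assuming a local UDF file `my_udf.py` (placeholder) and a `from_file` constructor as part of the rework:

```python
import openeo

# Load UDF code from a local file; runtime is autodetected from the suffix.
udf = openeo.UDF.from_file("my_udf.py")
cube = cube.apply_dimension(process=udf, dimension="t")
```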
- Show warning when using `load_collection` property filters that are not defined in the collection metadata (summaries).
- Eliminate dependency on `distutils.version.LooseVersion`, which started to trigger deprecation warnings (#316).
- Remove old `Connection.oidc_auth_user_id_token_as_bearer` workaround flag (#300)
- Fix refresh token handling in case of OIDC token request with refresh token grant (#326)
- Allow passing raw JSON string, JSON file path or URL to `Connection.download()`, `Connection.execute()` and `Connection.create_job()`
- Add support for reverse math operators on DataCube in `apply` mode (#323)
- Add `DataCube.print_json()` to simplify exporting process graphs in Jupyter or other interactive environments (#324)
- Raise `DimensionAlreadyExistsException` when trying to `add_dimension()` a dimension with existing name (Open-EO/openeo-geopyspark-driver#205)
- `DataCube.execute_batch()` now also guesses the output format from the filename, and allows using the `format` argument next to the current `out_format` to align with the `DataCube.download()` method. (#240)
- Better client-side handling of merged band name metadata in `DataCube.merge_cubes()`
- Remove legacy `DataCube.graph` and `DataCube.flatten()` to prevent usage patterns that cause interoperability issues (#155, #209, #324)
- Add support for passing a PGNode/VectorCube as geometry to `aggregate_spatial`, `mask_polygon`, ...
- Add support for second order callbacks, e.g. `is_valid` in `count` in `reduce_dimension` (#317)
- Rename `RESTJob` class name to less cryptic and more user-friendly `BatchJob`. Original `RESTJob` is still available as deprecated alias. (#280)
- Dropped default reducer ("max") from `DataCube.reduce_temporal_simple()`
- Various documentation improvements.
- Drop hardcoded `h5netcdf` engine from `XarrayIO.from_netcdf_file()` and `XarrayIO.to_netcdf_file()` (#314)
- Changed argument name of `Connection.describe_collection()` from `name` to `collection_id` to be more in line with other methods/functions.
- Fix `context`/`condition` confusion bug with `count` callback in `DataCube.reduce_dimension()` (#317)
- Add `context` parameter to `DataCube.aggregate_spatial()`, `DataCube.apply_dimension()`, `DataCube.apply_neighborhood()`, `DataCube.apply()`, `DataCube.merge_cubes()`. (#291)
- Add `DataCube.fit_regr_random_forest()` (#293)
- Add `PGNode.update_arguments()`, which combined with `DataCube.result_node()` allows advanced process graph argument tweaking/updating without resorting to `._pg` hacks.
- `JobResults.download_files()`: also download (by default) the job result metadata as STAC JSON file (#184)
- OIDC handling in `Connection`: try to automatically refresh access token when expired (#298)
- `Connection.create_job` raises exception if response does not contain a valid job_id
- Add `openeo.udf.debug.inspect` for using the openEO `inspect` process in a UDF (#302)
- Add `openeo.util.to_bbox_dict()` to simplify building an openEO style bbox dictionary, e.g. from a list or shapely geometry (#304) (example below)
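A sketch of the bbox helper; the coordinates are placeholders:

```python
from openeo.util import to_bbox_dict

bbox = to_bbox_dict([3.2, 51.0, 3.3, 51.1])
# e.g. {"west": 3.2, "south": 51.0, "east": 3.3, "north": 51.1}
```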
- Removed deprecated (and non-functional) `zonal_statistics` method from old `ImageCollectionClient` API. (#144)
- Add support for comparison operators (`<`, `>`, `<=` and `>=`) in callback process building
- Added `Connection.describe_process()` to retrieve and show a single process
- Added `DataCube.flatten_dimensions()` and `DataCube.unflatten_dimension` (Open-EO/openeo-processes#308, Open-EO/openeo-processes#316)
- Added `VectorCube.run_udf` (to avoid non-standard `process_with_node(UDF(...))` usage)
- Added `DataCube.fit_class_random_forest()` and `Connection.load_ml_model()` to train and load Machine Learning models (#279)
- Added `DataCube.predict_random_forest()` to easily use `reduce_dimension` with a `predict_random_forest` reducer using a `MlModel` (trained with `fit_class_random_forest`) (#279)
- Added `DataCube.resample_cube_temporal` (#284)
- Add `target_dimension` argument to `DataCube.aggregate_spatial` (#288)
- Add basic configuration file system to define a default back-end URL and enable auto-authentication (#264, #187)
- Add `context` argument to `DataCube.chunk_polygon()`
- Add `Connection.version_info()` to list version information about the client, the API and the back-end
- Include openEO API error id automatically in exception message to simplify user support and post-mortem analysis.
- Use `Connection.default_timeout` (when set) also on version discovery request
- Drop `ImageCollection` from `DataCube`'s class hierarchy. This practically removes very old (pre-0.4.0) methods like `date_range_filter` and `bbox_filter` from `DataCube`. (#100, #278)
- Deprecate `DataCube.send_job` in favor of `DataCube.create_job` for better consistency (internally and with other libraries) (#276)
- Update (autogenerated) `openeo.processes` module to 1.2.0 release (2021-12-13) of openeo-processes
- Update (autogenerated) `openeo.processes` module to draft version of 2022-03-16 (e4df8648) of openeo-processes
- Update `openeo.extra.spectral_indices` to a post-0.0.6 version of Awesome Spectral Indices
- Removed deprecated `zonal_statistics` method from `DataCube`. (#144)
- Deprecate old-style `DataCube.polygonal_mean_timeseries()`, `DataCube.polygonal_histogram_timeseries()`, `DataCube.polygonal_median_timeseries()` and `DataCube.polygonal_standarddeviation_timeseries()`
- Support `rename_labels` on temporal dimension (#274)
- Basic support for mixing `DataCube` and `ProcessBuilder` objects/processing (#275)
- Add experimental support for `chunk_polygon` process (Open-EO/openeo-processes#287)
- Add support for `spatial_extent`, `temporal_extent` and `bands` to `Connection.load_result()`
- Setting the environment variable `OPENEO_BASEMAP_URL` allows setting a new templated URL to an XYZ basemap for the Vue Components library; `OPENEO_BASEMAP_ATTRIBUTION` allows setting the attribution for the basemap (#260)
- Initial support for experimental "federation:missing" flag on partial openEO Platform user job listings (Open-EO/openeo-api#419)
- Best effort detection of mistakenly using Python builtin `sum` or `all` functions in callbacks (Forum #113)
- Automatically print batch job logs when a job doesn't finish successfully (using `execute_batch`/`run_synchronous`/`start_and_wait`).
- Add `options` argument to `DataCube.atmospheric_correction` (Open-EO/openeo-python-driver#91)
- Add `atmospheric_correction_options` and `cloud_detection_options` arguments to `DataCube.ard_surface_reflectance` (Open-EO/openeo-python-driver#91)
- UDP storing: add support for "returns", "categories", "examples" and "links" properties (#242)
- Add `openeo.extra.spectral_indices`: experimental API to easily compute spectral indices (vegetation, water, urban, ...) on a `DataCube`, using the index definitions from Awesome Spectral Indices
- Batch job status poll loop: ignore (temporary) "service unavailable" errors (Open-EO/openeo-python-driver#96)
- Batch job status poll loop: fail when there are too many soft errors (temporary connection/availability issues)
- Fix `DataCube.ard_surface_reflectance()` to use process `ard_surface_reflectance` instead of `atmospheric_correction`
- Add command line tool `openeo-auth token-clear` to remove OIDC refresh token cache
- Add support for OIDC device authorization grant without PKCE nor client secret (#225, openeo-api#410)
- Add `DataCube.dimension_labels()` (EP-4008)
- Add `Connection.load_result()` (EP-4008)
- Add proper support for child callbacks in `fit_curve` and `predict_curve` (#229)
- `ProcessBuilder`: Add support for `array_element(data, n)` through `data[n]` syntax (#228)
- `ProcessBuilder`: Add support for `eq` and `neq` through `==` and `!=` operators (EP-4011)
- Add `DataCube.validate()` for process graph validation (EP-4012 related)
- Add `Connection.as_curl()` for generating a curl command to evaluate a process graph or `DataCube` from the command line (example below)
- Add support in `DataCube.download()` to guess the output format from the extension of a given filename
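A sketch of the curl export, assuming an authenticated `connection` and a `cube` built earlier:

```python
# Print a curl command that would execute the cube's process graph synchronously.
print(connection.as_curl(cube))
```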
- Improve default handling of `crs` (and `base`/`height`) in `filter_bbox`: avoid explicitly sending `null` unnecessarily (#233).
- Update documentation/examples/tests: EPSG CRS in `filter_bbox` should be an integer code, not a string (#233).
- Raise `ProcessGraphVisitException` from `ProcessGraphVisitor.resolve_from_node()` (instead of generic `ValueError`)
- `DataCube.linear_scale_range` is now a shortcut for `DataCube.apply(lambda x: x.linear_scale_range(input_min, input_max, output_min, output_max))`, instead of creating an invalid process graph that tries to invoke `linear_scale_range` on a datacube directly.
- Nicer error message when back-end does not support basic auth (#247)
- Remove unused and outdated (0.4-style) `File`/`RESTFile` classes (#115)
- Deprecate usage of `DataCube.graph` property (#209)
Minor release to address version packaging issue.
- Support nested callbacks inside array arguments, for instance in `array_modify`, `array_create`
- Support `array_concat`
- Add `ProcessGraphUnflattener` and `PGNodeGraphUnflattener` to unflatten a flat dict representation of a process graph to a `PGNode` graph (EP-3609)
- Add `Connection.datacube_from_flat_graph` and `Connection.datacube_from_json` to construct a `DataCube` from a flat process graph representation (e.g. JSON file or JSON URL) (EP-3609)
- Add documentation about UDP unflattening and sharing (EP-3609)
- Add `fit_curve` and `predict_curve`, two methods used in change detection
- Update `processes.py` based on 1.1.0 release of the openeo-processes project
- `processes.py`: include all processes from the "proposals" folder of the openeo-processes project
- Jupyter integration: visual rendering for process graphs shown instead of a plain JSON representation.
- Migrate from Travis CI to GitHub Actions for documentation building and unit tests (#178, EP-3645)
- Removed unit test runs for Python 3.5 (#210)
- Allow, but raise a warning when, specifying a CRS for the geometry passed to `aggregate_spatial` and `mask_polygon`; this is a non-standard/experimental feature, only supported by specific back-ends (#204)
- Add `optional` argument to `Parameter` and fix re-encoding parameters with default value. (EP-3846)
- Add support to test strict equality with `ComparableVersion`
- Jupyter integration: add rich HTML rendering for more backend metadata (Job, Job Estimate, Logs, Services, User-Defined Processes)
- Add support for `filter_spatial`
- Add support for `aggregate_temporal_period`
- Added class `Service` for secondary web-services
- Added a method `service` to `Connection`
- Add `Rfc3339.parse_date` and `Rfc3339.parse_date_or_datetime`
- Disallow redirects on POST/DELETE/... requests and require status code 200 on `POST /result` requests. This improves error information where `POST /result` would involve a redirect. (EP-3889)
- Class `JobLogEntry` got replaced with a more complete and re-usable `LogEntry` dict
- The following methods return a `Service` class instead of a dict: `tiled_viewing_service` in `ImageCollection`, `ImageCollectionClient` and `DataCube`, `create_service` in `Connection`
- The method `remove_service` in `Connection` has been deprecated in favor of `delete_service` in the `Service` class
- Add dependency on `xarray` package (#159, #190, EP-3578)
- Add support for default OIDC clients advertised by backend (#192, Open-EO/openeo-api#366)
- Add support for default OIDC provider (based on provider order advertised by backend) (Open-EO/openeo-api#373)
- Eliminate development/optional dependency on `openeo_udf` project (#159, #190, EP-3578). Now the openEO client library itself contains the necessary classes and implementation to run UDF code locally.
- `Connection`: don't send default auth headers to non-backend domains (#201)
- Improve OpenID Connect usability on Windows: don't raise exception on file permissions that cannot be changed (by `os.chmod` on Windows) (#198)
- Add initial/experimental support for OIDC device code flow with PKCE (alternative for client secret) (#191 / EP-3700)
- When creating a connection: use "https://" by default when no protocol is specified
- `DataCube.mask_polygon`: support `Parameter` argument for `mask`
- Add initial/experimental support for default OIDC client (#192, Open-EO/openeo-api#366)
- Add `Connection.authenticate_oidc` for user-friendlier OIDC authentication: first try refresh token and fall back on device code flow
- Add experimental support for `array_modify` process (Open-EO/openeo-processes#202)
- Remove old/deprecated `Connection.authenticate_OIDC()`
- Add namespace support to `DataCube.process`, `PGNode`, `ProcessGraphVisitor` (minor API breaking change) and related. Allows building process graphs with processes from non-"backend" namespaces (#182)
- `collection_items` to request collection items through a STAC API
- `paginate` as a basic method to support link-based pagination
- Add namespace support to `Connection.datacube_from_process`
- Add basic support for band name aliases in `metadata.Band` for band index lookup (EP-3670)
- `OpenEoApiError` moved from `openeo.rest.connection` to `openeo.rest`
- Added HTML representation for `list_jobs`, `list_services`, `list_files` and for job results
- Improve refresh token handling in OIDC logic: avoid requesting refresh token (which can fail if OIDC client is not set up for that) when not necessary (EP-3700)
- `RESTJob.start_and_wait`: add status line when sending "start" request, and drop microsecond resolution from status lines
- Updated Vue Components library (solves issue with loading from slower back-ends where no result was shown)
- Add "reflected" operator support to
ProcessBuilder
- Add
RESTJob.get_results()
,JobResults
andResultAsset
for more fine-grained batch job result handling. (EP-3739) - Add documentation on batch job result (asset) handling and downloading
- Mark `Connection.imagecollection` more clearly as deprecated/legacy alias of `Connection.load_collection`
- Deprecated `job_results()` and `job_logs()` on `Connection` object; it's better to work through a `RESTJob` object.
- Update `DataCube.sar_backscatter` to the latest process spec: add `coefficient` argument and remove `orthorectify`, `rtc`. (openeo-processes#210)
- Remove outdated batch job result download logic left-overs
- Remove (outdated) abstract base class `openeo.job.Job`: did not add value, only caused maintenance overhead. (#115)
- Make `DataCube.filter_bbox()` easier to use: allow passing a bbox tuple, list, dict or even a shapely geometry directly as first positional argument or as `bbox` keyword argument. Handling of the legacy non-standard west-east-north-south positional argument order is preserved for now (#136) (example below)
- Add "band math" methods `DataCube.ln()`, `DataCube.logarithm(base)`, `DataCube.log10()` and `DataCube.log2()`
- Improved support for creating and handling parameters when defining user-defined processes (EP-3698)
- Initial Jupyter integration: add rich HTML rendering of backend metadata (collections, file formats, UDF runtimes, ...) (#170)
- Add `resolution_merge` process (experimental) (EP-3687, openeo-processes#221)
- Add `sar_backscatter` process (experimental) (EP-3612, openeo-processes#210)
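A sketch of the more flexible bbox passing; coordinates are placeholders and the tuple order shown is an assumption:

```python
# Equivalent ways to pass a bounding box (west, south, east, north):
cube = cube.filter_bbox((3.2, 51.0, 3.3, 51.1))
cube = cube.filter_bbox(bbox={"west": 3.2, "south": 51.0, "east": 3.3, "north": 51.1})
```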
- Fixed 'Content-Encoding' handling in `Connection.download`: client did not automatically decompress `/result` responses when necessary (#175)
- Add `DataCube.aggregate_spatial()`
- Get/create default `RefreshTokenStore` lazily in `Connection`
- Various documentation tweaks
- Add support for `title`/`description`/`plan`/`budget` in `DataCube.send_job` (#157 / #158)
- Add `DataCube.to_json()` to easily get JSON representation of a DataCube
- Allow subclassing `CollectionMetadata` and preserve the original type when "cloning"
- Changed `execute_batch` to support downloading multiple files (within EP-3359, support profiling)
- Don't send None-valued `title`/`description`/`plan`/`budget` fields from `DataCube.send_job` (#157 / #158)
- Remove duplicate and broken `Connection.list_processgraphs`
- Various documentation fixes and tweaks
- Avoid `merge_cubes` warning when using non-band-math `DataCube` operators
- Add `DataCube.aggregate_temporal`
- Add initial support to download profiling information
- Deprecated legacy functions/methods are better documented as such and link to a recommended alternative (EP-3617).
- Get/create default `AuthConfig` in `Connection` lazily (allows client to run in environments without existing (default) config folder)
- Deprecate `zonal_statistics` in favor of `aggregate_spatial`
- Remove support for old, non-standard `stretch_colors` process (use `linear_scale_range` instead).
- Also handle `dict` arguments in `dereference_from_node_arguments` (EP-3509)
(EP-3509) - Add support for less/greater than and equal operators
- Raise warning when user defines a UDP with same id as a pre-defined one (EP-3544, #147)
- Add `rename_labels` support in metadata (EP-3585)
- Improve "callback" handling (sub-process graphs): add predefined callbacks for all official processes and functionality to assemble these (EP-3555, #153)
- Moved datacube write/save/plot utilities from udf to client (EP-3456)
- Add documentation on OpenID Connect authentication (EP-3485)
- Fix `kwargs` handling in `TimingLogger` decorator
- Add `openeo-auth` command line tool to manage OpenID Connect (and basic auth) related configs (EP-3377/EP-3493)
- Support for using config files for OpenID Connect and basic auth based authentication, instead of hardcoding credentials (EP-3377/EP-3493)
- Fix `target_band` handling in `DataCube.ndvi` (EP-3496)