# Ecoset

A flexible application for serving geospatial datasets.
Ecoset processes and serves geospatial information about environmental variables. Each environmental variable may be produced by one or many methods, each of which can have its own technical implementation.
- An API with Swagger definitions and user interface.
- In-built methods to process data from shapefiles, geotiffs and local biodiversity occurrence databases.
- Variables with multiple implementations.
- Handles feature, raster, and table data.
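The relationship between variables, their alternative method implementations, and the three data kinds can be pictured with a small sketch. The type and function names here are illustrative only, not ecoset's actual internal types:

```typescript
// Illustrative model (not ecoset's real types): one variable may be
// served by several methods, each with its own implementation and
// each yielding feature, raster, or table data.
type DataType = "feature" | "raster" | "table";

interface VariableMethod {
  name: string;                   // e.g. "geotiff" or "shapefile"
  output: DataType;               // the kind of data this method yields
  run: (area: string) => string;  // stand-in for real processing
}

interface Variable {
  id: string;
  methods: VariableMethod[];
}

// A hypothetical variable with two alternative implementations.
const landCover: Variable = {
  id: "land_cover",
  methods: [
    { name: "geotiff", output: "raster", run: a => `raster for ${a}` },
    { name: "shapefile", output: "feature", run: a => `features for ${a}` },
  ],
};

// Pick the first method that produces the requested data type.
function selectMethod(v: Variable, want: DataType): VariableMethod | undefined {
  return v.methods.find(m => m.output === want);
}

console.log(selectMethod(landCover, "raster")?.name); // "geotiff"
```

A request for a variable can then fall through to whichever implementation is available for the requested output kind.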
## Table of contents
- Getting started
- Configuration files
- Geotemporal Variables and Methods
- Creating Custom Variable Methods
- Spatial and temporal dimensionality
## Getting started

Local development and testing can be done either inside or outside of Docker containers. A small amount of sample data is included in `/test/sample-data` so that some basic functions may be tested easily.
Ecoset can be tested, alongside its redis and gbif server dependencies, using the docker-compose files in the root directory. Run `docker-compose -f docker-compose.yml -f docker-compose.dev.yml build`, then `docker-compose -f docker-compose.yml -f docker-compose.dev.yml up`.
If not using Docker, ensure you have at least Node 20 LTS installed.

- Set up an available redis instance and, if it is not on localhost, set the cache host and port in `/src/config/default.yml`.
- Navigate to `src`, then run `yarn` to restore packages.
- Run `yarn run tsoa:gen` to generate route definitions.
- Run `yarn run dev` to start ecoset and watch for file changes.
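The cache settings mentioned in the first step might look something like the fragment below. The key names are a guess for illustration; check the `/src/config/default.yml` shipped with ecoset for the exact schema:

```yaml
# Hypothetical sketch of the cache section in /src/config/default.yml.
# Key names may differ from ecoset's actual configuration schema.
cache:
  host: redis.example.internal   # hostname of your redis instance
  port: 6379                     # default redis port
```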
To work on and test GBIF plugins, an available MySQL database containing a mirrored copy of gbif is required.
- Filters for pre- and post-processing of datasets (e.g. buffering the query area in pre-processing, summarising results in post-processing).
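The idea of a pre-filter that buffers the query area and a post-filter that summarises results could be composed along these lines. This is a sketch only; the function names and filter signatures are invented, not ecoset's API:

```typescript
// Illustrative filter pipeline (invented names, not ecoset's API).
interface BBox { latN: number; latS: number; lonE: number; lonW: number }

// Pre-filter: expand the requested bounding box by a margin (degrees).
const buffer = (margin: number) => (b: BBox): BBox => ({
  latN: b.latN + margin,
  latS: b.latS - margin,
  lonE: b.lonE + margin,
  lonW: b.lonW - margin,
});

// Post-filter: summarise extracted raster values to their mean.
const summarise = (values: number[]): number =>
  values.reduce((sum, v) => sum + v, 0) / values.length;

// Compose: buffer the area before data extraction, summarise afterwards.
const area: BBox = { latN: 51, latS: 50, lonE: 1, lonW: 0 };
const buffered = buffer(0.5)(area);
console.log(buffered.latN);            // 51.5
console.log(summarise([1, 2, 3, 4]));  // 2.5
```

Keeping pre- and post-filters as plain functions like this makes them easy to chain per-variable without touching each method's implementation.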