# Contributing

See `development/docs` for more information.

## Setting up a development environment

The Docker Compose development environment should work without any additional setup. However, installing dependencies locally may be required if you are not using Docker Compose, or for some editor integrations; one possible local setup is sketched after the list below.

- For browser/API development, install Node.js, pnpm, and dependencies:

  ```sh
  pnpm install
  ```

- For data pipeline development, install Python, dependencies, and development tools:

  ```sh
  pip install -r data-pipeline/requirements.txt
  pip install -r requirements-dev.txt
  pip install -r deploy/deployctl/requirements.txt
  ```
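
The commands above assume Node.js, pnpm, and Python are already installed. As a rough sketch (not a project requirement), one way to provision them is to enable pnpm through Corepack and keep the Python dependencies in a virtual environment; match versions to whatever the repository actually pins.

```sh
# Sketch only: Corepack ships with recent Node.js releases and can manage pnpm
corepack enable
corepack prepare pnpm@latest --activate

# Sketch only: install the Python dependencies into an isolated virtual environment
python3 -m venv .venv
source .venv/bin/activate
pip install -r data-pipeline/requirements.txt
pip install -r requirements-dev.txt
pip install -r deploy/deployctl/requirements.txt
```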

## Browser

The production API can be used for browser development; one way to point a local browser at it is sketched after the list below. To start a local instance of only the browser:

- with Docker:

  ```sh
  # create browser/build.env file
  cat <<EOF > browser/build.env
  GA_TRACKING_ID=
  REPORT_VARIANT_URL=
  REPORT_VARIANT_VARIANT_ID_PARAMETER=
  REPORT_VARIANT_DATASET_PARAMETER=
  EOF

  ./development/env.sh browser up
  ```

- without Docker:

  ```sh
  cd browser
  ./start.sh
  ```
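
When running against the production API, the local dev server needs to know where to send API requests. The sketch below is only illustrative: `GNOMAD_API_URL` is a hypothetical variable name, so check `browser/start.sh` (or the dev server configuration) for the setting it actually reads.

```sh
# Hypothetical variable name -- verify what browser/start.sh actually expects
cd browser
GNOMAD_API_URL=https://gnomad.broadinstitute.org/api ./start.sh
```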

## API

Because of the size of the gnomAD database, API development is usually done using an Elasticsearch cluster hosted in the cloud. See the deployment documentation for instructions on deploying a browser environment in GCP and the data pipeline documentation for instructions on populating the database.

- Install and configure the Google Cloud SDK.

- Select a GCP project and zone for the development deployment:

  ```sh
  ./deployctl config set project $PROJECT
  ./deployctl config set zone $ZONE
  ```

- Start a local instance of the API:

  - with Docker:

    ```sh
    ./development/env.sh api up
    ```

    or use `./development/env.sh up` to start both the API and browser

  - without Docker:

    ```sh
    cd graphql-api
    ELASTICSEARCH_USERNAME=elastic ELASTICSEARCH_PASSWORD=$(../deployctl elasticsearch get-password) ./start.sh
    ```

The Docker Compose configuration could be modified to run Elasticsearch locally.
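
If a fully local setup is preferred, one option is a single-node Elasticsearch container. The sketch below is illustrative rather than the project's supported configuration: the image version, port, and security settings are assumptions, the API must still be pointed at this instance, and the database still needs to be populated by the data pipeline.

```sh
# Sketch only: run a throwaway single-node Elasticsearch for local development.
# The version is an assumption -- match whatever the project actually deploys.
docker run -d --name gnomad-elasticsearch \
  -p 9200:9200 \
  -e discovery.type=single-node \
  -e xpack.security.enabled=false \
  -e ES_JAVA_OPTS="-Xms2g -Xmx2g" \
  docker.elastic.co/elasticsearch/elasticsearch:7.17.18

# Confirm the cluster is reachable
curl http://localhost:9200/_cluster/health
```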

## Data pipeline

See `data-pipeline/README.md`.

## Conventions

All code should be formatted using either Prettier for JavaScript or Black for Python. To run these formatters, use:

- Prettier: `pnpm format`
- Black: `black .`

If pre-commit hooks are installed, the formatters run automatically on each commit.
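
Prefer the repository's own hook setup if it has one. As a fallback sketch (not part of the repository), a plain Git hook can run the formatters from this section before each commit:

```sh
# Sketch only: a minimal .git/hooks/pre-commit that runs both formatters
cat <<'EOF' > .git/hooks/pre-commit
#!/bin/sh
set -e
pnpm format
black .
EOF
chmod +x .git/hooks/pre-commit
```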

Some other conventions are enforced using ESLint for JavaScript, Stylelint for CSS (and styled-components styles), and Pylint for Python. To run these linters, use:

- ESLint: `pnpm lint:js`
- Stylelint: `pnpm lint:css`
- Pylint: `pylint data-pipeline/src/data_pipeline`

## Tests

Jest is used for JavaScript unit tests and is configured to look for files named `*.spec.js` in the `browser` and `graphql-api` directories.

To run all Jest tests, use:

```sh
pnpm jest
```

To run only tests for one component, use `pnpm jest --projects browser` or `pnpm jest --projects graphql-api`.
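
Jest's usual CLI filters also work through pnpm when only a subset of tests is needed; the patterns below are placeholders rather than files in this repository.

```sh
# Run spec files whose path matches a pattern (placeholder pattern)
pnpm jest --projects browser VariantPage

# Run tests whose name matches a pattern (placeholder name)
pnpm jest --projects browser -t "renders without crashing"
```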

## Updating dependencies

Images for the Docker Compose development environment need to be rebuilt after updating dependencies.

```sh
./development/env.sh build browser
./development/env.sh build api
```
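
For example, a JavaScript dependency bump might look like the following; the package name is a placeholder, and `pnpm up` is standard pnpm usage rather than a project-specific script.

```sh
# Placeholder package name -- update whichever dependency actually changed
pnpm up some-package@latest

# Rebuild the development images so they pick up the updated lockfile
./development/env.sh build browser
./development/env.sh build api
```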