| Environment | Branch | Status |
| --- | --- | --- |
| production | production | |
| staging | master | |
Google Cloud Functions for carrying out event-driven tasks in the CIDC.
- Pub/Sub-triggered:
  - `ingest_upload`: when a successful upload job is published to the "uploads" topic, transfers data from the upload bucket to the data bucket in GCS. Contains a separate permissions system for CIDC Biofx.
  - `vis_preprocessing`: perform and save precomputation on a given `downloadable_file` to facilitate visualization of that file's data in the CIDC Portal.
  - `derive_files_from_manifest_upload`: when a shipping/receiving manifest is ingested successfully, generate derivative files for the associated trial.
  - `derive_files_from_assay_or_analysis_upload`: when an assay or analysis upload completes, generate derivative files for the associated trial.
  - `store_auth0_logs`: pull logs for the past day from Auth0 and store them in Google Cloud Storage.
  - `send_email`: when an email is published to the "emails" topic, sends the email using the SendGrid API.
  - `update_cidc_from_csms`: when a trial/manifest ID matching dict is published to the "csms_trigger" topic, update said trial/manifest from NCI's CSMS.
  - `disable_inactive_users`: find users who appear to have become inactive, and disable their accounts.
  - `refresh_download_permissions`: extend GCS IAM permission expiry dates for users who were active in the past day.
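These functions follow the standard Cloud Functions background-function contract for Pub/Sub triggers, in which the message payload arrives base64-encoded in `event["data"]`. As a purely illustrative sketch (the handler name and body below are invented, not taken from this repo's code):

```python
import base64


def example_pubsub_handler(event: dict, context) -> None:
    """Illustrative Pub/Sub-triggered background function.

    `event["data"]` carries the base64-encoded message published to the
    triggering topic; `context` holds event metadata such as the event ID.
    """
    payload = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Received Pub/Sub message: {payload}")
```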
To install dependencies:

```
pip install -r requirements.dev.txt
```
To install and configure pre-commit hooks:

```
pre-commit install
```
To set up the git hook for JIRA integration, run:

```
ln -s ../../.githooks/commit-msg .git/hooks/commit-msg
chmod +x .git/hooks/commit-msg
rm .git/hooks/commit-msg.sample
```
This symbolic link is necessary to correctly link files in `.githooks` to `.git/hooks`. Note that setting the `core.hooksPath` configuration variable would lead to `pre-commit` failing. The `commit-msg` hook runs after the `pre-commit` hook, hence the two are decoupled in this workflow.
To associate a commit with an issue, you will need to reference the JIRA issue key (e.g., 'CIDC-1111') in the corresponding commit message.
To start our hand-rolled local emulator:

```
python main.py
```

This starts up a Flask HTTP server that can simulate pubsub publish events and trigger cloud functions appropriately. E.g., to simulate publishing to the `uploads` pubsub topic:

```
curl http://localhost:3001/projects/cidc-dfci-staging/topics/uploads -d "data=<base64-encoded pubsub message>"
```
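Equivalently, from Python (the message body here is just a placeholder; a real message must match whatever the targeted function expects):

```python
import base64

import requests

# Base64-encode an illustrative payload and POST it as the "data" form
# field, mirroring the curl command above.
message = base64.b64encode(b'{"example": "payload"}').decode("utf-8")
resp = requests.post(
    "http://localhost:3001/projects/cidc-dfci-staging/topics/uploads",
    data={"data": message},
)
print(resp.status_code)
```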
If you add a new cloud function, you'll need to add it to the local emulator by hand.
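What that involves depends on how `main.py` is structured, but conceptually it amounts to adding a Flask route that unpacks the form-encoded message and calls your function. A hypothetical sketch, not the emulator's actual code (the function and topic names are made up):

```python
from flask import Flask, request

app = Flask(__name__)


def my_new_function(event: dict, context=None) -> None:
    """Stand-in for a newly added cloud function."""
    print("got message:", event["data"])


@app.route("/projects/<project>/topics/my-new-topic", methods=["POST"])
def emulate_my_new_topic(project: str):
    # Mimic the event dict a real Pub/Sub trigger would deliver: the
    # base64-encoded payload arrives as the "data" form field.
    my_new_function({"data": request.form["data"]}, context=None)
    return "OK"
```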
To run the tests:

```
pytest
```
This project uses GitHub Actions for continuous integration and deployment. To deploy an update to this application, follow these steps:
- Create a new branch locally, commit updates to it, then push that branch to this repository.
- Make a pull request from your branch into `master`. This will trigger GitHub Actions to run various tests and report back success or failure. You can't merge your PR until it passes the build, so if the build fails, you'll probably need to fix your code.
- Once the build passes (and pending approval from collaborators reviewing the PR), merge your changes into `master`. This will trigger GitHub Actions to re-run tests on the code then deploy changes to the staging project.
- Try out your deployed changes in the staging environment once the build completes.
- If you're satisfied that staging should be deployed into production, make a PR from `master` into `production`.
- Once the PR build passes, merge `master` into `production`. This will trigger GitHub Actions to deploy the changes on staging to the production project.
For more information or to update the CI workflow, check out the configuration in `.github/workflows/ci.yml`.
While it's recommended that the functions in this repo be deployed automatically using GitHub Actions, you might find that you need to deploy a function by hand. To do so, check out the Google Cloud Functions docs.