Merge pull request #2 from Mindgrub/feature/1/step-functions-support
feat: Add step functions support, resolves #1
doublecompile authored Oct 30, 2023
2 parents e3e18c4 + 3cdcefa commit 074d94f
Showing 3 changed files with 21 additions and 5 deletions.
4 changes: 2 additions & 2 deletions Dockerfile
@@ -1,6 +1,6 @@
-FROM alpine:3.15
+FROM alpine:3.18
 
-RUN apk add --no-cache mysql-client python3 py3-pip coreutils \
+RUN apk add --no-cache mysql-client python3 py3-pip coreutils jq \
   && pip install awscli
 
 ADD run.sh /run.sh
3 changes: 2 additions & 1 deletion README.md
@@ -15,10 +15,11 @@ An Alpine-based Docker image for producing a file with `mysqldump` and uploading
 - `S3_PREFIX` – Optional. A string to prepend to the S3 object key (Default: "").
 - `MYSQL_NET_BUFFER_LENGTH` – Optional. The `net_buffer_length` setting for `mysqldump` (Default: "16384").
 - `REQUESTOR` – Optional. The email address of the user who requested this dump to be stored in the S3 metadata.
+- `SFN_TASK_TOKEN` – Optional. A Step Functions [Task Token](https://docs.aws.amazon.com/step-functions/latest/apireference/API_GetActivityTask.html#StepFunctions-GetActivityTask-response-taskToken). If present, this token will be used to call [`SendTaskHeartbeat`](https://docs.aws.amazon.com/step-functions/latest/apireference/API_SendTaskHeartbeat.html) and [`SendTaskSuccess`](https://docs.aws.amazon.com/step-functions/latest/apireference/API_SendTaskSuccess.html). The task output sent to `SendTaskSuccess` will consist of a JSON object with a single property: `uri` (containing the S3 URI of the database dump).
 
 ### AWS Permissions
 
-If this Docker image is used within Amazon ECS, specify permissions to S3 within your Task Definition role. Otherwise, you can provide `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` as environment variables.
+If this Docker image is used within Amazon ECS, specify permissions to S3 (and optionally Step Functions) within your Task Definition role. Otherwise, you can provide `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` as environment variables.
 
 ## Technical Details
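The `SFN_TASK_TOKEN` contract documented above can be exercised without AWS credentials, since the script only acts when the variable is non-empty. A minimal sketch of that guard pattern (`send_heartbeat` here is a hypothetical stand-in for the real `aws stepfunctions send-task-heartbeat` call, and `tok-example` is an invented token):

```shell
#!/bin/sh
# Sketch of the optional-token guard used by run.sh.
# send_heartbeat is a hypothetical stand-in for:
#   aws stepfunctions send-task-heartbeat --task-token "$1"
send_heartbeat() {
  echo "heartbeat for token: $1"
}

if [ -n "$SFN_TASK_TOKEN" ]; then
  send_heartbeat "$SFN_TASK_TOKEN"
else
  echo "no task token; skipping Step Functions calls"
fi
```

Because the token is optional, the same image works unchanged both inside a `.waitForTaskToken` Step Functions integration and as a plain scheduled task.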
19 changes: 17 additions & 2 deletions run.sh
100644 → 100755
@@ -1,6 +1,11 @@
 #!/bin/sh
 set -e
 
+# Send heartbeat
+if [ -n "$SFN_TASK_TOKEN" ]; then
+  aws stepfunctions send-task-heartbeat --task-token "$SFN_TASK_TOKEN"
+fi
+
 # Variable defaults
 : "${FILENAME_PREFIX:=snapshot}"
 : "${MYSQL_NET_BUFFER_LENGTH:=16384}"
@@ -18,12 +23,22 @@ echo "About to export mysql://$DB_HOST/$DB_NAME to $destination"
 mysqldump -h "$DB_HOST" -u "$DB_USER" --password="$DB_PASS" -P "$DB_PORT" -R -E --triggers --single-transaction --comments --net-buffer-length="$MYSQL_NET_BUFFER_LENGTH" "$DB_NAME" | gzip > "$destination"
 echo "Export to $destination completed"
 
+# Send heartbeat
+if [ -n "$SFN_TASK_TOKEN" ]; then
+  aws stepfunctions send-task-heartbeat --task-token "$SFN_TASK_TOKEN"
+fi
+
-# Publish to S3
 extra_metadata=""
 if [ -n "$REQUESTOR" ]; then
   extra_metadata=",Requestor=$REQUESTOR"
 fi
-
+# Publish to S3
 echo "About to upload $destination to $s3_url"
 aws s3 cp "$destination" "$s3_url" --storage-class "$S3_STORAGE_TIER" --metadata "DatabaseHost=${DB_HOST},DatabaseName=${DB_NAME}${extra_metadata}"
 echo "Upload to $s3_url completed"
+
+# Send activity success
+if [ -n "$SFN_TASK_TOKEN" ]; then
+  json_output=$(jq -cn --arg uri "$s3_url" '{"uri":$uri}')
+  aws stepfunctions send-task-success --task-token "$SFN_TASK_TOKEN" --task-output "$json_output"
+fi
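The success payload the script builds is small enough to verify locally: `jq -cn --arg uri … '{"uri":$uri}'` emits a one-line JSON object with the URI safely escaped, which is why the Dockerfile change installs `jq`. A sketch (the bucket and key below are invented; the `printf` variant is a jq-free approximation that is only valid when the URI contains no characters needing JSON escaping):

```shell
#!/bin/sh
s3_url="s3://example-bucket/snapshot-20231030.sql.gz"  # hypothetical URI

# As in run.sh (jq handles JSON escaping):
# json_output=$(jq -cn --arg uri "$s3_url" '{"uri":$uri}')

# jq-free approximation, valid only for URIs needing no escaping:
json_output=$(printf '{"uri":"%s"}' "$s3_url")
echo "$json_output"
# → {"uri":"s3://example-bucket/snapshot-20231030.sql.gz"}
```

This matches the task output shape the README promises to `SendTaskSuccess`: a single `uri` property holding the S3 URI of the dump.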
