
Merge pull request #853 from hackforla/feat-dev-ops-ecs
Minor devops changes
mattyweb authored Dec 2, 2020
2 parents fd68a86 + 7c6dce5 commit 9394df8
Showing 6 changed files with 25 additions and 39 deletions.
18 changes: 0 additions & 18 deletions .github/workflows/deploy_backend_dev.yml

This file was deleted.

21 changes: 15 additions & 6 deletions server/aws/README.md
@@ -14,22 +14,25 @@ Note that this blueprint only handles the server infrastructure. The client appl

## Assumptions

- AWS account is already created
- A task-definition.json is already loaded to the AWS account
- AWS account is already created and a profile with the required user credentials is set up locally
- IAM Role (ecsTaskExecutionRole) already created with SSMReadOnlyAccess and ECSTaskExecutionRolePolicy applied (see the sketch after this list)
- SSL/TLS certificate is already created and loaded to the AWS account
- DNS is manually pointed to the ALB once the deployment is complete
- DNS can be manually pointed to the ALB once the deployment is complete
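
Attaching those two policies could look like the sketch below, assuming the AWS-managed policies ```AmazonSSMReadOnlyAccess``` and ```AmazonECSTaskExecutionRolePolicy``` are what is meant and that the role itself already exists:

```bash
# Attach the managed policies to the existing task execution role
aws iam attach-role-policy \
  --role-name ecsTaskExecutionRole \
  --policy-arn arn:aws:iam::aws:policy/AmazonSSMReadOnlyAccess
aws iam attach-role-policy \
  --role-name ecsTaskExecutionRole \
  --policy-arn arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy
```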

## Deployment

The blueprint is meant to be deployed manually. The person deploying the blueprint needs the AWS CLI installed and a profile with Admin access for the AWS account being used. Use the Terraform commands (init, plan, apply, etc.) under Terraform 0.12 to configure and deploy the environment.

The same blueprint is intended to be run in separate stages for ```dev``` and ```prod``` using Terraform Workspaces as described [here](https://learn.hashicorp.com/tutorials/terraform/organize-configuration?in=terraform/modules#separate-states).
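
Under those assumptions, a dev deployment might look like the following sketch (the ```dev.tfvars``` file is described under Parameters below):

```bash
terraform init                        # one-time setup of providers and state
terraform workspace new dev           # or: terraform workspace select dev
terraform plan -var-file=dev.tfvars   # review the changes for this stage
terraform apply -var-file=dev.tfvars
```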

### Parameters

The following parameters (at minimum) need to be set in order to run the blueprint. You can use, for example, a .tfvars file for these.
The following parameters (at minimum) need to be set in order to run the blueprint. You can use, for example, per-stage .tfvars files for these (e.g. a dev.tfvars and a prod.tfvars; see the sketch after this list).

- profile
- account_id
- stage
- image_tag
- db_name
- db_username
- db_password
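
A minimal ```dev.tfvars``` might look like this sketch (every value below is a placeholder, not a real setting):

```hcl
profile     = "default"          # local AWS CLI profile
account_id  = "123456789012"     # placeholder account id
stage       = "dev"
image_tag   = "latest"
db_name     = "lacity311"        # placeholder database settings
db_username = "admin"
db_password = "change-me"
```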
@@ -43,9 +43,15 @@ For example, the value from ```/dev/us-east-1/DB_DSN``` will be injected as the
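
Those SSM parameters have to exist before the tasks can start; creating one might look like this sketch (the DSN value is a placeholder):

```bash
# Store a secret at the path the task definitions reference
aws ssm put-parameter \
  --name /dev/us-east-1/DB_DSN \
  --type SecureString \
  --value "postgresql://user:password@host:5432/dbname"
```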

## After the Deployment

The environment will be created with a blank database. As a result the application will initially show an error message. The initial Alembic migration needs to be run then the database populated using the Prefect data pipeline.
The environment will be created with a blank database. As a result, the application will initially show an error message. The initial Alembic migration needs to be run, and then the database populated using the Prefect data pipeline. The easiest way to do this is with an SSH tunnel to the database using the Bastion server.

Alternatively, the appropriate ECS task ```prod-la-311-data-server``` can be run manually with the following container Command override:

```bash
alembic,upgrade,head
```

The easiest way to do this is with a SSH tunnel to the database using the Bastion server.
Once the database schema has been applied, the data loading task ```prod-la-311-data-nightly-update``` can be run manually. When run on a blank database it will pull all available data.
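
Running either one-off task from the CLI might look like the sketch below. The cluster name, container name, and network settings here are assumptions; take the real values from the Terraform outputs:

```bash
# One-off migration run with the command override (all names illustrative)
aws ecs run-task \
  --cluster prod-la-311 \
  --task-definition prod-la-311-data-server \
  --launch-type FARGATE \
  --network-configuration 'awsvpcConfiguration={subnets=[subnet-00000000],securityGroups=[sg-00000000],assignPublicIp=ENABLED}' \
  --overrides '{"containerOverrides":[{"name":"app","command":["alembic","upgrade","head"]}]}'
```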

### Developer Access with SSH and Bastion Server

4 changes: 1 addition & 3 deletions server/aws/templates/prefect.json
@@ -13,9 +13,7 @@
},
"secrets": [
{ "name": "PREFECT__CONTEXT__SECRETS__DSN", "valueFrom": "/${stage}/${region}/DB_DSN" },
{ "name": "PREFECT__CONTEXT__SECRETS__SLACK_HOOK", "valueFrom": "/${stage}/${region}/SLACK_HOOK" }
],
"environment": [
{ "name": "PREFECT__CONTEXT__SECRETS__SLACK_HOOK", "valueFrom": "/${stage}/${region}/SLACK_HOOK" },
{ "name": "PREFECT__API_URL", "valueFrom": "/${stage}/${region}/API_URL" },
{ "name": "PREFECT__STAGE", "valueFrom": "/${stage}/${region}/STAGE" }
],
4 changes: 2 additions & 2 deletions server/aws/variables.tf
@@ -36,12 +36,12 @@ variable db_password {}

variable container_cpu {
type = number
default = 256
default = 512
}

variable container_memory {
type = number
default = 1024
default = 2048
}
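
The new defaults can still be dialed back per stage without editing this file, for example (illustrative; these are the old values):

```bash
terraform plan -var-file=dev.tfvars -var container_cpu=256 -var container_memory=1024
```

If the tasks run on Fargate, note that only certain CPU/memory pairs are accepted (512 CPU units pairs with 1024-4096 MB, for instance).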

variable container_port {
8 changes: 2 additions & 6 deletions server/prefect/config.toml
@@ -1,22 +1,18 @@
# Default configuration for 311 Data prefect task

# These settings can be overridden with environment variables that follow the pattern:
# PREFECT__[setting name]

# determines whether Dask should be used to parallelize the flow run
dask = true

# (!) WARNING: setting this to "true" will wipe out whatever is in the target table (e.g. 'requests')
reset_db = false

# will write to ./output if left blank but can be overridden
temp_folder = ""

# set to local API instance or override with PREFECT__API_URL
api_url = "http://localhost:5000"

# whether the data update is being run in a testing, development or production environment
stage = "Testing"
# whether to vacuum database and reset stats after load
vacuum_db = false
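
Per the override pattern noted at the top of this file, the new flag can be flipped for a single run without editing the file (a sketch, assuming Prefect's usual env-var coercion of "true" to a boolean):

```bash
# Enable the post-load VACUUM for one run only
export PREFECT__VACUUM_DB=true
```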

[socrata]
host = "data.lacity.org"
9 changes: 5 additions & 4 deletions server/prefect/tasks/postgres.py
@@ -214,10 +214,11 @@ def complete_load() -> Dict[str, int]:
connection.commit()
logger.info("Views successfully refreshed")

# need to have autocommit set for VACUUM to work
connection.autocommit = True
cursor.execute("VACUUM FULL ANALYZE")
logger.info("Database vacuumed and analyzed")
if prefect.config.vacuum_db:
# need to have autocommit set for VACUUM to work
connection.autocommit = True
cursor.execute("VACUUM FULL ANALYZE")
logger.info("Database vacuumed and analyzed")

cursor.close()
connection.close()
