Drop dependencies for cols needs a CASCADE (move-coop#819)
* Merge main into major-release (move-coop#814)

* Use black formatting in addition to flake8 (move-coop#796)

* Run black formatter on entire repository

* Update requirements.txt and CONTRIBUTING.md to reflect black format

* Use black linting in circleci test job

* Use longer variable name to resolve flake8 E741

* Move noqa comments back to proper lines after black reformat

* Standardize S3 Prefix Conventions (move-coop#803)

This PR catches exceptions when a user does not have exhaustive access to keys in an S3 bucket

* Add Default Parameter Flexibility (move-coop#807)

Skips the new `/` logic checks if the prefix is `None` (the default)

* MoveOn Shopify / AK changes (move-coop#801)

* Add access_token authentication option for Shopify

* Remove unnecessary check
The access token will either be None or explicitly set; don't worry about an empty string.
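The bullet above reasons that the access token is either `None` or explicitly set, so no empty-string check is needed. A minimal sketch of that logic (the `build_auth` helper is illustrative, not Parsons' actual code; `X-Shopify-Access-Token` is Shopify's token-auth request header):

```python
def build_auth(access_token, api_key=None, api_password=None):
    # Hypothetical sketch of the commit's reasoning: the token is either
    # None or explicitly set, so a truthiness check for "" is unnecessary.
    if access_token is not None:
        # Token-based auth: send the token as a request header.
        return {"headers": {"X-Shopify-Access-Token": access_token}}
    # Otherwise fall back to basic auth with the API key and password.
    return {"auth": (api_key, api_password)}
```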

* Add get_orders function and test

* Add get_transactions function and test

* Add function and test to get order

* style fixes

* style fixes

---------

Co-authored-by: sjwmoveon <[email protected]>
Co-authored-by: Alex French <[email protected]>
Co-authored-by: Kathy Nguyen <[email protected]>

* Catch File Extensions in S3 Prefix (move-coop#809)

* add exception handling

* Shortened logs for flake8

* add logic for default case

* added file logic + note to user

* restructured prefix logic

This change moves the prefix -> prefix/ logic into a try/except block; this will be more robust for most use cases, while adding the flexibility we desire for split-permission buckets
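The prefix -> prefix/ handling described above can be sketched as follows. This is a hypothetical helper under assumed names (`normalize_prefix` and the boto3-style client call are illustrative, not Parsons' actual implementation):

```python
import logging

logger = logging.getLogger(__name__)


def normalize_prefix(s3_client, bucket, prefix):
    # Hypothetical sketch of the prefix -> prefix/ logic described above.
    if prefix is None:
        # Default case: no prefix supplied, so skip the "/" logic entirely.
        return prefix
    if prefix.endswith("/") or "." in prefix.rsplit("/", 1)[-1]:
        # Already folder-style, or looks like a file with an extension.
        return prefix
    try:
        # Probe the bucket; this can raise on split-permission buckets
        # where the user cannot list every key.
        s3_client.list_objects_v2(Bucket=bucket, Prefix=prefix + "/", MaxKeys=1)
        return prefix + "/"
    except Exception as exc:
        # Fall back to the prefix as given, with a verbose error log.
        logger.warning(
            "Could not verify prefix %r in bucket %r (%s); using it as given.",
            prefix, bucket, exc,
        )
        return prefix
```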

* drop nested try/catch + add verbose error log

* Add error message verbosity

Co-authored-by: willyraedy <[email protected]>

---------

Co-authored-by: willyraedy <[email protected]>

---------

Co-authored-by: Austin Weisgrau <[email protected]>
Co-authored-by: Ian <[email protected]>
Co-authored-by: Cody Gordon <[email protected]>
Co-authored-by: sjwmoveon <[email protected]>
Co-authored-by: Alex French <[email protected]>
Co-authored-by: Kathy Nguyen <[email protected]>
Co-authored-by: willyraedy <[email protected]>

* cascade

* black

---------

Co-authored-by: Jason <[email protected]>
Co-authored-by: Austin Weisgrau <[email protected]>
Co-authored-by: Ian <[email protected]>
Co-authored-by: Cody Gordon <[email protected]>
Co-authored-by: sjwmoveon <[email protected]>
Co-authored-by: Alex French <[email protected]>
Co-authored-by: Kathy Nguyen <[email protected]>
Co-authored-by: willyraedy <[email protected]>
Co-authored-by: Elyse Weiss <[email protected]>
10 people authored and tal42levy committed Jul 8, 2023
1 parent 0dae7d2 commit 7386621
Showing 3 changed files with 5 additions and 9 deletions.
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -28,7 +28,7 @@
 # The short X.Y version
 version = ""
 # The full version, including alpha/beta/rc tags
-release = ""
+release = "0.5"


 # -- General configuration ---------------------------------------------------
10 changes: 3 additions & 7 deletions parsons/databases/redshift/redshift.py
@@ -221,7 +221,6 @@ def query_with_connection(self, sql, connection, parameters=None, commit=True):
         # rows in the correct order

         with self.cursor(connection) as cursor:
-
             if "credentials" not in sql:
                 logger.debug(f"SQL Query: {sql}")
             cursor.execute(sql, parameters)
@@ -235,7 +234,6 @@ def query_with_connection(self, sql, connection, parameters=None, commit=True):
                 return None

             else:
-
                 # Fetch the data in batches, and "pickle" the rows to a temp file.
                 # (We pickle rather than writing to, say, a CSV, so that we maintain
                 # all the type information for each field.)
@@ -397,7 +395,6 @@ def copy_s3(
         """

         with self.connection() as connection:
-
             if self._create_table_precheck(connection, table_name, if_exists):
                 if template_table:
                     sql = f"CREATE TABLE {table_name} (LIKE {template_table})"
@@ -620,7 +617,6 @@ def copy(
             cols = None

         with self.connection() as connection:
-
             # Check to see if the table exists. If it does not or if_exists = drop, then
             # create the new table.
             if self._create_table_precheck(connection, table_name, if_exists):
@@ -859,7 +855,6 @@ def generate_manifest(
         # Generate manifest file
         manifest = {"entries": []}
         for bucket in buckets:
-
             # Retrieve list of files in bucket
             key_list = s3.list_keys(bucket, prefix=prefix)
             for key in key_list:
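For context, the hunk above assembles a Redshift COPY manifest, which lists one entry per S3 object. A minimal sketch of that structure (the `build_manifest` helper name and the bucket/key values are illustrative, not Parsons' code):

```python
import json


def build_manifest(keys_by_bucket):
    # Build a Redshift COPY manifest: a dict with an "entries" list,
    # one {"url": ..., "mandatory": ...} entry per S3 object.
    manifest = {"entries": []}
    for bucket, keys in keys_by_bucket.items():
        for key in keys:
            manifest["entries"].append(
                {"url": f"s3://{bucket}/{key}", "mandatory": True}
            )
    return json.dumps(manifest)
```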
@@ -982,7 +977,6 @@ def upsert(
             raise ValueError("Primary key column contains duplicate values.")

         with self.connection() as connection:
-
             try:
                 # Copy to a staging table
                 logger.info(f"Building staging table: {staging_tbl}")
@@ -1087,7 +1081,9 @@ def drop_dependencies_for_cols(self, schema, table, cols):
             tbl = self.query_with_connection(sql_depend, connection)
             dropped_views = [row["table_name"] for row in tbl]
             if dropped_views:
-                sql_drop = "\n".join([f"drop view {view};" for view in dropped_views])
+                sql_drop = "\n".join(
+                    [f"drop view {view} CASCADE;" for view in dropped_views]
+                )
                 tbl = self.query_with_connection(sql_drop, connection)
                 logger.info(f"Dropped the following views: {dropped_views}")

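The hunk above is the fix the commit title describes: in Redshift (as in Postgres), a plain `DROP VIEW` fails when other views depend on the view being dropped, while `CASCADE` drops those dependents as well. The statement generation can be sketched in isolation:

```python
def drop_view_statements(dropped_views):
    # Mirror the diff above: one DROP VIEW per dependent view, with CASCADE
    # so views that themselves have dependents can still be dropped.
    return "\n".join(f"drop view {view} CASCADE;" for view in dropped_views)
```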
2 changes: 1 addition & 1 deletion setup.py
@@ -62,7 +62,7 @@ def main():
         version="1.0.0",
         author="The Movement Cooperative",
         author_email="[email protected]",
-        url="https://github.com/move-coop/parsons",
+        url="https://github.com/movementcoop/parsons",
         keywords=["PROGRESSIVE", "API", "ETL"],
         packages=find_packages(),
         install_requires=install_requires,
