
feature/databricks-sql-warehouse-compatibility #121

Merged
21 commits
137ed6c
feature/databricks-sql-warehouse-compatibility
fivetran-joemarkiewicz Apr 2, 2024
382dbe2
run databricks sql
fivetran-joemarkiewicz Apr 2, 2024
99d7f83
changelog pr ref and schema update
fivetran-joemarkiewicz Apr 2, 2024
12ea509
changes to incremental startegy selection
fivetran-joemarkiewicz Apr 2, 2024
c575861
databricks different schemas
fivetran-joemarkiewicz Apr 2, 2024
b0ac2fc
sql warehouse specific test runs
fivetran-joemarkiewicz Apr 2, 2024
bd2175b
adjustment for databricks sql and all other destinations
fivetran-joemarkiewicz Apr 2, 2024
d41ebf8
unique schema name to ensure drop wont conflict
fivetran-joemarkiewicz Apr 2, 2024
839c093
changelog reword
fivetran-joemarkiewicz Apr 2, 2024
4e87cfa
docs regen
fivetran-joemarkiewicz Apr 2, 2024
cf257e5
validations and docs regen
fivetran-joemarkiewicz Apr 2, 2024
37e5c52
variable adjustment
fivetran-joemarkiewicz Apr 2, 2024
91ffac9
change cleanup
fivetran-joemarkiewicz Apr 2, 2024
aa43436
changelog update
fivetran-joemarkiewicz Apr 2, 2024
bd30997
update readme & changelog
fivetran-catfritz Apr 3, 2024
2cc5a70
Update models/fivetran_platform__audit_table.sql
fivetran-joemarkiewicz Apr 3, 2024
8aad434
Update README.md
fivetran-joemarkiewicz Apr 3, 2024
9df06fb
docs regen after review
fivetran-joemarkiewicz Apr 3, 2024
064955d
spark removal from macro
fivetran-joemarkiewicz Apr 3, 2024
ce3abc9
Apply suggestions from code review
fivetran-joemarkiewicz Apr 3, 2024
ba8ef14
changelog for fileformat addition
fivetran-joemarkiewicz Apr 3, 2024
2 changes: 2 additions & 0 deletions .buildkite/hooks/pre-command
@@ -23,6 +23,8 @@ export CI_DATABRICKS_DBT_HOST=$(gcloud secrets versions access latest --secret="
export CI_DATABRICKS_DBT_HTTP_PATH=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_HTTP_PATH" --project="dbt-package-testing-363917")
export CI_DATABRICKS_DBT_TOKEN=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_TOKEN" --project="dbt-package-testing-363917")
export CI_DATABRICKS_DBT_CATALOG=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_DBT_CATALOG" --project="dbt-package-testing-363917")
export CI_DATABRICKS_SQL_DBT_HTTP_PATH=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_SQL_DBT_HTTP_PATH" --project="dbt-package-testing-363917")
export CI_DATABRICKS_SQL_DBT_TOKEN=$(gcloud secrets versions access latest --secret="CI_DATABRICKS_SQL_DBT_TOKEN" --project="dbt-package-testing-363917")
export CI_SQLSERVER_DBT_SERVER=$(gcloud secrets versions access latest --secret="CI_SQLSERVER_DBT_SERVER" --project="dbt-package-testing-363917")
export CI_SQLSERVER_DBT_DATABASE=$(gcloud secrets versions access latest --secret="CI_SQLSERVER_DBT_DATABASE" --project="dbt-package-testing-363917")
export CI_SQLSERVER_DBT_USER=$(gcloud secrets versions access latest --secret="CI_SQLSERVER_DBT_USER" --project="dbt-package-testing-363917")
15 changes: 15 additions & 0 deletions .buildkite/pipeline.yml
@@ -73,6 +73,21 @@ steps:
commands: |
bash .buildkite/scripts/run_models.sh databricks

- label: ":databricks: :database: Run Tests - Databricks SQL Warehouse"
key: "run_dbt_databricks_sql"
plugins:
- docker#v3.13.0:
image: "python:3.8"
shell: [ "/bin/bash", "-e", "-c" ]
environment:
- "BASH_ENV=/tmp/.bashrc"
- "CI_DATABRICKS_DBT_HOST"
- "CI_DATABRICKS_SQL_DBT_HTTP_PATH"
- "CI_DATABRICKS_SQL_DBT_TOKEN"
- "CI_DATABRICKS_DBT_CATALOG"
commands: |
bash .buildkite/scripts/run_models.sh databricks-sql

- label: ":azure: Run Tests - SQLServer"
key: "run_dbt_sqlserver"
plugins:
17 changes: 17 additions & 0 deletions .buildkite/scripts/run_models.sh
@@ -36,6 +36,22 @@ db=$1
echo `pwd`
cd integration_tests
dbt deps
if [ "$1" = "databricks-sql" ]; then
dbt seed --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db" --full-refresh
dbt compile --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db"
dbt run --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db" --full-refresh
dbt run --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db"
dbt test --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db"
dbt run --vars '{fivetran_platform_schema: sqlw_tests, fivetran_platform__usage_pricing: true}' --target "$db" --full-refresh
dbt run --vars '{fivetran_platform_schema: sqlw_tests, fivetran_platform__usage_pricing: true}' --target "$db"
dbt test --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db"
dbt run --vars '{fivetran_platform_schema: sqlw_tests, fivetran_platform__credits_pricing: false, fivetran_platform__usage_pricing: true}' --target "$db" --full-refresh
dbt run --vars '{fivetran_platform_schema: sqlw_tests, fivetran_platform__credits_pricing: false, fivetran_platform__usage_pricing: true}' --target "$db"
dbt test --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db"
dbt run --vars '{fivetran_platform_schema: sqlw_tests, fivetran_platform__usage_pricing: false, fivetran_platform_using_destination_membership: false, fivetran_platform_using_user: false}' --target "$db" --full-refresh
dbt run --vars '{fivetran_platform_schema: sqlw_tests, fivetran_platform__usage_pricing: false, fivetran_platform_using_destination_membership: false, fivetran_platform_using_user: false}' --target "$db"
dbt test --vars '{fivetran_platform_schema: sqlw_tests}' --target "$db"
else
dbt seed --target "$db" --full-refresh
dbt compile --target "$db"
dbt run --target "$db" --full-refresh
@@ -50,6 +66,7 @@ dbt test --target "$db"
dbt run --vars '{fivetran_platform__usage_pricing: false, fivetran_platform_using_destination_membership: false, fivetran_platform_using_user: false}' --target "$db" --full-refresh
dbt run --vars '{fivetran_platform__usage_pricing: false, fivetran_platform_using_destination_membership: false, fivetran_platform_using_user: false}' --target "$db"
dbt test --target "$db"
fi
if [ "$1" != "sqlserver" ]; then
dbt run-operation fivetran_utils.drop_schemas_automation --target "$db"
fi
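The branching added to `run_models.sh` keys every SQL Warehouse invocation to a dedicated `sqlw_tests` schema. A minimal sketch of that pattern (the `build_vars` helper is hypothetical, not part of the PR):

```shell
#!/bin/bash
# Hypothetical helper mirroring the branch above: pick the dbt --vars payload per target.
build_vars() {
  local target="$1"
  if [ "$target" = "databricks-sql" ]; then
    # SQL Warehouse runs pin a dedicated schema so schema drops cannot collide
    echo "{fivetran_platform_schema: sqlw_tests}"
  else
    echo "{}"
  fi
}

# Usage: dbt run --vars "$(build_vars "$db")" --target "$db"
build_vars databricks-sql
build_vars databricks
```

Centralizing the payload in one helper would avoid repeating the schema override across a dozen `dbt` invocations.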
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,15 @@
# dbt_fivetran_log v1.7.1
[PR #121](https://github.com/fivetran/dbt_fivetran_log/pull/121) includes the following updates:

## Bug Fixes
- Users leveraging the Databricks SQL Warehouse runtime were previously unable to run the `fivetran_platform__audit_table` model due to an incompatible incremental strategy. The following updates address this:
  - A new macro, `is_databricks_sql_warehouse()`, determines whether a Databricks runtime is a SQL Warehouse. It returns `true` when the runtime is a SQL Warehouse and `false` for any other runtime or destination.
  - This macro drives the incremental strategy selection within `fivetran_platform__audit_table`: for Databricks SQL Warehouses, **no** incremental strategy is applied. Strategies for all other destinations are unchanged.
  - The strongest candidate incremental strategy for the SQL Warehouse runtime would be `merge`; however, we do not yet have full confidence in the data integrity of the output model under that strategy, so for the time being the model performs a full create-or-replace on each run.

## Under the Hood
- Added integration testing pipeline for Databricks SQL Warehouse.

# dbt_fivetran_log v1.7.0
[PR #119](https://github.com/fivetran/dbt_fivetran_log/pull/119) includes the following updates:

2 changes: 1 addition & 1 deletion dbt_project.yml
@@ -1,6 +1,6 @@
config-version: 2
name: 'fivetran_log'
-version: '1.7.0'
+version: '1.7.1'
require-dbt-version: [">=1.3.0", "<2.0.0"]

models:
8 changes: 8 additions & 0 deletions integration_tests/ci/sample.profiles.yml
@@ -52,6 +52,14 @@ integration_tests:
threads: 2
token: "{{ env_var('CI_DATABRICKS_DBT_TOKEN') }}"
type: databricks
databricks-sql:
catalog: "{{ env_var('CI_DATABRICKS_DBT_CATALOG') }}"
host: "{{ env_var('CI_DATABRICKS_DBT_HOST') }}"
http_path: "{{ env_var('CI_DATABRICKS_SQL_DBT_HTTP_PATH') }}"
schema: sqlw_tests
threads: 2
token: "{{ env_var('CI_DATABRICKS_SQL_DBT_TOKEN') }}"
type: databricks
sqlserver:
type: sqlserver
driver: 'ODBC Driver 18 for SQL Server'
6 changes: 3 additions & 3 deletions integration_tests/dbt_project.yml
@@ -1,5 +1,5 @@
name: 'fivetran_log_integration_tests'
-version: '1.7.0'
+version: '1.7.1'

config-version: 2
profile: 'integration_tests'
@@ -10,7 +10,7 @@ dispatch:

vars:
fivetran_log:
-fivetran_platform_schema: fivetran_platform_integration_tests
+fivetran_platform_schema: "fivetran_platform_integration_tests"
fivetran_platform_account_identifier: "account"
fivetran_platform_incremental_mar_identifier: "incremental_mar"
fivetran_platform_connector_identifier: "connector"
@@ -24,7 +24,7 @@ vars:

models:
fivetran_log:
-+schema: fivetran_platform
++schema: "{{ 'sqlw_tests' if target.name == 'databricks-sql' else 'fivetran_platform' }}"
fivetran-joemarkiewicz (author): This is an artifact of needing to use two different schemas for the Databricks jobs.


seeds:
fivetran_log_integration_tests:
15 changes: 15 additions & 0 deletions macros/is_databricks_sql_warehouse.sql
@@ -0,0 +1,15 @@
{% macro is_databricks_sql_warehouse(target) %}
Reviewer comment: we may want to add this to fivetran_utils in the future since Databricks SQL Warehouse folks using other packages with incremental models will have the same issue.

fivetran-joemarkiewicz (author): I agree. This is something we will migrate there in the future.
{% if target.type in ('databricks','spark') %}
{% set re = modules.re %}
{% set path_match = target.http_path %}
{% set regex_pattern = "/sql/.+/warehouses/" %}
{% set match_result = re.search(regex_pattern, path_match) %}
{% if match_result %}
{{ return(True) }}
{% else %}
{{ return(False) }}
{% endif %}
{% else %}
{{ return(False) }}
{% endif %}
{% endmacro %}
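The regex at the heart of the macro can be exercised outside Jinja; `modules.re` in dbt exposes Python's `re` module, so the check behaves as in this Python sketch (the example `http_path` values are hypothetical, not taken from the PR):

```python
import re

# Same pattern the macro uses to spot a SQL Warehouse HTTP path
SQL_WAREHOUSE_PATTERN = r"/sql/.+/warehouses/"

def is_sql_warehouse_path(http_path: str) -> bool:
    """Return True when the http_path looks like a Databricks SQL Warehouse endpoint."""
    return re.search(SQL_WAREHOUSE_PATTERN, http_path) is not None

# Hypothetical example paths:
print(is_sql_warehouse_path("/sql/1.0/warehouses/abc123"))          # True
print(is_sql_warehouse_path("/sql/protocolv1/o/123/clusters/xyz"))  # False
```

Because the pattern only requires a `/warehouses/` segment after `/sql/`, any all-purpose cluster path (which lacks that segment) falls through to `False`.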
fivetran-joemarkiewicz (author): I also have an open question on dbt Slack which asks if there is a better way to do this natively using the dbt-databricks adapter.

fivetran-joemarkiewicz (author): It seems per dbt Slack that this is the preferred route.

4 changes: 2 additions & 2 deletions models/fivetran_platform__audit_table.sql
@@ -6,8 +6,8 @@
'data_type': 'date'
} if target.type == 'bigquery' else ['sync_start_day'],
cluster_by = ['sync_start_day'],
-incremental_strategy='insert_overwrite' if target.type in ('bigquery', 'spark', 'databricks') else 'delete+insert',
-file_format='parquet'
+incremental_strategy='insert_overwrite' if target.type in ('bigquery','databricks','spark') and not is_databricks_sql_warehouse(target) else (none if is_databricks_sql_warehouse(target) else 'delete+insert'),
+file_format='parquet' if not is_databricks_sql_warehouse(target) else 'delta'
) }}

with sync_log as (
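The conditional config in `fivetran_platform__audit_table` reduces to a small decision table. A Python sketch of the intended outcomes (illustrative only; `choose_config` is not part of the package):

```python
def choose_config(target_type: str, is_sql_warehouse: bool = False):
    """Mirror the audit-table config: return (incremental_strategy, file_format)."""
    if is_sql_warehouse:
        # SQL Warehouse: no incremental strategy (full create-or-replace), delta format
        return (None, "delta")
    if target_type in ("bigquery", "databricks", "spark"):
        return ("insert_overwrite", "parquet")
    return ("delete+insert", "parquet")

print(choose_config("databricks", is_sql_warehouse=True))  # (None, 'delta')
print(choose_config("bigquery"))                           # ('insert_overwrite', 'parquet')
print(choose_config("snowflake"))                          # ('delete+insert', 'parquet')
```

Note that `file_format` only changes for the SQL Warehouse case, where `parquet` is not supported and `delta` is the default table format.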