-
Hey @dataders, is there a plan for releasing a 1.2-compatible version of
-
Alright, so I have a branch that is passing. A couple of comments/questions:
-
If I just support dbt-core v1.2.0 directly, regardless of v1.1.0, will the adapter still work well with v1.1.0?
-
Another question:
-
I'm working on getting dbt-sqlite working with 1.2.0. I'm finding it difficult to troubleshoot failing tests for the cross-database macros. When tests fail due to valid SQL that produces an unexpected result, `dbt build` will indicate that the test failed but not provide any additional information. I'm able to retain the failures if I override the test like so:

    from dbt.tests.util import run_dbt
    from dbt.tests.adapter.utils.base_utils import BaseUtils  # import path assumed

    class BaseDateDiff(BaseUtils):
        # actual test sequence
        def test_build_assert_equal(self, project):
            run_dbt(["build", "--store-failures"])  # seed, model, test

I'm having to do a lot of manual work loading the seed data from the fixtures, then looking at the output from pytest to find the raw SQL to paste into a database client to test. Is there an easier way?
-
Thank you for this overview! Would it be possible to tag this post and the ones for future dbt-core versions with a label? It would make it easier for adapter maintainers to just follow all posts with that specific label instead of digging through all of the community discussions :)
-
If there is a particular cross-database macro that just doesn't work (in this case
-
Trying to run the test cases with `pytest test_basic.py`, I'm hitting a "project not found" issue.
-
Pulling info in from an email thread here. @colin-rogers-dbt @Fleid @McKnight-42, can y'all help out the Vertica folks? Thanks!
-
Hello team, we are working on updating and enhancing the dbt-vertica adapter. Right now, we are trying to run the basic adapter test cases. `class TestSnapshotCheckColsVertica(BaseSnapshotCheckCols):` and `class TestSnapshotTimestampVertica(BaseSnapshotTimestamp):` are failing with `AssertionError: dbt exit state did not match expected`. Please look for logs in the below-mentioned links. We look forward to hearing from you soon.
-
This discussion is for communicating to adapter maintainers the scope of work needed to make use of the changes in 1.2.0. If you have questions and concerns, please ask them here for posterity.
The first release cut for 1.2.0, `dbt-core==1.2.0rc1`, was published two days ago (PyPI | GitHub). We are targeting a second release cut in the next week, and the official cut of 1.2.0 by the end of the month. The next release cut will likely contain only some smaller changes that will not impact end users (though perhaps #5432 if we're lucky). UPDATE 7/22: #5432 has been merged, so see the new section below on implementing connection retries.

This minor release contains no breaking changes, but has both new features for end users and some developer quality-of-life improvements that will benefit most adapter maintainers. This means that end users can install and run `dbt-core==1.2.0` alongside an adapter of a previous minor version (e.g. `1.1.0` or `1.0.0`), but they won't necessarily have access to the new features.

## Connection Retries
#5432 adds a new method to the `BaseAdapter`, `retry_connection()` (source), that allows an adapter to automatically try again when the attempt to open a new connection to the database hits a transient, infrequent error. The option can be set using the `retries` config in the target of a `~/.dbt/profiles.yml`.

If relevant to your adapter, you'll need to implement the following changes in your adapter's `connections.py` (see the sketch after this list):

- add a `retries` value to the connection manager class, along with its default number of times to retry (likely: 1?)
- within the `ConnectionManager.open()` method:
  - define a `connect()` function (postgres's implementation) that returns a `handle` object. A `handle` is the connection object returned by the standard Python DB API 2.0 driver immediately after attempting to connect
  - define a list `retryable_exceptions` of exceptions (dbt-postgres's version) that the underlying driver might expose
  - the `.open()` method is now a call to `cls.retry_connection()` (dbt-postgres's version)
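Below is a minimal sketch of what those changes could look like, modeled on the dbt-postgres implementation. The adapter name, driver module, credential fields, and exception class (`myadapter_driver`, `MyAdapterCredentials`, `OperationalError`) are hypothetical placeholders, and import paths assume dbt-core 1.2.

```python
# Hypothetical sketch of an adapter's connections.py after adopting retry_connection().
from dataclasses import dataclass

from dbt.adapters.base import Credentials
from dbt.adapters.sql import SQLConnectionManager
from dbt.events import AdapterLogger

import myadapter_driver  # placeholder for your database's DB API 2.0 driver

logger = AdapterLogger("MyAdapter")


@dataclass
class MyAdapterCredentials(Credentials):
    host: str = "localhost"
    user: str = ""
    password: str = ""
    retries: int = 1  # surfaced to users as `retries` in profiles.yml

    @property
    def type(self):
        return "myadapter"

    def _connection_keys(self):
        return ("host", "user", "database", "schema", "retries")


class MyAdapterConnectionManager(SQLConnectionManager):
    TYPE = "myadapter"
    # exception_handler, cancel, get_response, etc. omitted for brevity

    @classmethod
    def open(cls, connection):
        if connection.state == "open":
            logger.debug("Connection is already open, skipping open.")
            return connection

        credentials = connection.credentials

        # 1. a zero-argument callable that attempts to connect and returns the handle
        def connect():
            handle = myadapter_driver.connect(
                host=credentials.host,
                user=credentials.user,
                password=credentials.password,
                database=credentials.database,
            )
            return handle

        # 2. transient errors worth retrying; consult your driver's documentation
        retryable_exceptions = [
            myadapter_driver.OperationalError,
        ]

        # 3. delegate the open/retry loop to the new BaseAdapter helper
        return cls.retry_connection(
            connection,
            connect=connect,
            logger=logger,
            retry_limit=credentials.retries,
            retryable_exceptions=retryable_exceptions,
        )
```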
## Cross-database Macros
In #5298, we migrated a collection of "cross-database macros" from dbt-utils to dbt-core. Default implementations are automatically inherited by adapters and included in the testing suite. Adapter maintainers may need to override the implementation of one or more macros to align with database-specific syntax or optimize performance. For details on the testing suite, see: "Testing a new adapter".
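On the testing side, picking up the default implementations can be as simple as inheriting the new base tests in your adapter's functional test suite. A hedged sketch, assuming the module paths shipped with the dbt-core 1.2 adapter testing framework (`dbt-tests-adapter`); the adapter class names and file location are placeholders:

```python
# tests/functional/adapter/utils/test_utils.py (illustrative location)
from dbt.tests.adapter.utils.test_dateadd import BaseDateAdd
from dbt.tests.adapter.utils.test_datediff import BaseDateDiff
from dbt.tests.adapter.utils.test_split_part import BaseSplitPart


class TestDateAddMyAdapter(BaseDateAdd):
    pass


class TestDateDiffMyAdapter(BaseDateDiff):
    pass


class TestSplitPartMyAdapter(BaseSplitPart):
    pass
```

If a given macro needs database-specific syntax, override the macro in your adapter's include path and keep the corresponding test class to verify the override.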
The TL;DR rationale for this work is:
As for how to make it happen, the following PRs for dbt Labs-maintained adapters show it clearly:
## Grants
Managing access grants is one of the most asked-for features from dbt users. We're delivering this capability, but naturally there's variance across data platforms as to how grants work, so it's time for adapter maintainers to roll their sleeves up. You might get lucky and not have to override any of them, but in case you do, below are descriptions of the new methods and macros, grouped into levels of complexity (start with the easy ones first!).
Pull requests adding grants to dbt Labs-maintained adapters should be very useful as a reference, for example: dbt-labs/dbt-bigquery#212.
### Override-able macros and methods
The two macros below are simple Boolean toggles (i.e. a `True`/`False` value) indicating whether certain features are available on your database. The default for both of these macros is `True`, because we believe that all databases should support these ergonomic features. However, we've built for flexibility, so overriding these macros for your adapter will handle the case where your database doesn't support them.

`copy_grants()`
- does your database automatically copy grants over when an object is recreated (cf. Snowflake's `copy_grants` configuration)? `True` by default, which means "play it safe": grants MIGHT have copied over, so dbt will run an extra query to check them and calculate diffs.
- `default__copy_grants()`
- `snowflake__copy_grants()`

`support_multiple_grantees_per_dcl_statement()`
- can you `grant {privilege} to user_a, user_b, ...`? Or do `user_a` and `user_b` need their own separate grant statements?
- `default__support_multiple_grantees_per_dcl_statement()`
- `spark__support_multiple_grantees_per_dcl_statement()`
If the above macros do not suffice, then at least one of these `get_*_sql()` macros will need to be overridden. They're all one-liners and might need small syntax tweaks to work on your database.

`get_show_grant_sql()`
- shows the CURRENT grants (privilege-grantee pairs) for a given relation
- `default__get_show_grant_sql()`
- `redshift__get_show_grant_sql()`

`get_grant_sql()`
- `default__get_grant_sql()`
- `spark__get_grant_sql()`

`get_revoke_sql()`
- `default__get_revoke_sql()`
- `bigquery__get_revoke_sql()`
Materializations fetch and apply the user-supplied grant config via `{% set grant_config = config.get('grants') %}` and `{% do apply_grants(target_relation, grant_config) %}`. By default, the `should_revoke` argument of `apply_grants` is `True`: dbt will first run a query to "show" grants, then calculate diffs, then apply revoke/grant statements. You can use the `should_revoke` macro to determine whether this extra step is necessary. In cases where dbt is fully replacing an object, or creating one for the first time, grants may not be carried over, so it may be more efficient to skip the "show" step and just add the grants.

If the above sets of macros still aren't cutting it, here's the final depth of complexity in which to wade.
`get_dcl_statement_list()`
- `default__get_dcl_statement_list()`

`call_dcl_statements()`
- `default__call_dcl_statements()`

`Adapter.standardize_grants_dict()`
- returns `{"privilege_name": [list, of, grantees], ...}`, matching the structure of the user-supplied `grant_config`
- `BaseAdapter.standardize_grants_dict()` in `core/dbt/adapters/base/impl.py`
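If your "show grants" query returns columns with different names or shapes, `standardize_grants_dict()` is where to reshape the result. The sketch below mirrors the default `BaseAdapter` implementation; the column names `grantee` and `privilege_type` are assumptions about your database's output and should be adjusted to match what `get_show_grant_sql()` actually returns:

```python
# impl.py (sketch): reshape the agate.Table from get_show_grant_sql()
# into {"privilege_name": [grantees, ...]}.
import agate

from dbt.adapters.sql import SQLAdapter


class MyAdapter(SQLAdapter):
    def standardize_grants_dict(self, grants_table: agate.Table) -> dict:
        grants_dict: dict = {}
        for row in grants_table:
            # column names assumed; rename to match your show-grants query
            grantee = row["grantee"]
            privilege = row["privilege_type"]
            grants_dict.setdefault(privilege, []).append(grantee)
        return grants_dict
```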
### Testing grants with your adapter
The tests for grants are implemented in the same way as the pytest tests that were introduced in dbt-core v1.1.0: they are importable, and you can create adapter-specific child classes of each test in your repo. For example, see how dbt-bigquery implements the tests. Notice the `BaseGrantsBigQuery` class, which maps the standard privileges to BigQuery-specific privilege names.
It is also worth noting that your test database needs three users created. If your integration test database is persistent, you'll only need to add the users to the database once; if the database is set up and torn down within CI testing, you'll need to add the users as part of your CI testing (or even the Docker image).
In the example test.env, the users are prescribed as environment variables as follows:
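In dbt Labs-maintained adapters those environment variables are typically `DBT_TEST_USER_1`, `DBT_TEST_USER_2`, and `DBT_TEST_USER_3`. A hedged sketch of the adapter-specific test classes, following the dbt-bigquery pattern; the module paths and the `privilege_grantee_name_overrides` hook reflect the dbt-core 1.2 base tests, while the adapter class names and the right-hand privilege names are placeholders for your database's own:

```python
# tests/functional/adapter/test_grants.py (illustrative location)
from dbt.tests.adapter.grants.base_grants import BaseGrants
from dbt.tests.adapter.grants.test_model_grants import BaseModelGrants
from dbt.tests.adapter.grants.test_seed_grants import BaseSeedGrants


class BaseGrantsMyAdapter(BaseGrants):
    def privilege_grantee_name_overrides(self):
        # map dbt's standardized privilege names to your database's names,
        # e.g. BigQuery maps "select" to "roles/bigquery.dataViewer"
        return {
            "select": "select",
            "insert": "insert",
            "fake_privilege": "fake_privilege",
            "invalid_user": "invalid_user",
        }


class TestModelGrantsMyAdapter(BaseGrantsMyAdapter, BaseModelGrants):
    pass


class TestSeedGrantsMyAdapter(BaseGrantsMyAdapter, BaseSeedGrants):
    pass
```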
## Materialization inheritance!
Via a community contribution from @volkangurel at Layer.ai, #5348 enables materializations to be inherited from parent adapters, in much the same way as macros are dispatched.
This is a big deal for folks who maintain inheriting adapters, e.g. as dbt-synapse does with dbt-sqlserver, and for the family of adapters that inherit from dbt-spark today.
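Child adapters already declare their parent via the `dependencies` argument of `AdapterPlugin` (that is how macro inheritance works today); assuming #5348 builds on the same declaration, materializations defined by the parent can now be resolved for the child as well. A sketch of that plugin definition, with all `myadapter` names as placeholders and dbt-sqlserver assumed as the parent:

```python
# dbt/adapters/myadapter/__init__.py (sketch for a child adapter)
from dbt.adapters.base import AdapterPlugin
from dbt.adapters.myadapter.connections import MyAdapterCredentials  # hypothetical
from dbt.adapters.myadapter.impl import MyAdapter  # hypothetical
from dbt.include import myadapter  # hypothetical include package

Plugin = AdapterPlugin(
    adapter=MyAdapter,
    credentials=MyAdapterCredentials,
    include_path=myadapter.PACKAGE_PATH,
    # parent adapter whose macros, and now materializations, can be inherited
    dependencies=["sqlserver"],
)
```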
## New basic tests to implement in adapterland: `BaseDocsGenerate` and `BaseDocsGenReferences`
#5058 is another step along the path of converting all our functional tests to the new framework, in order to empower adapter maintainers and other contributors to make use of the same tests that the core team uses for their own adapters. Effectively, this test validates an adapter's ability to correctly generate the catalog that serves as the static backend of a project docs site.
If your adapter does not add extra relation-level metadata (e.g. table size in rows and bytes, or a last-modified timestamp), which is the case by default, then you can follow the same inherit-and-`pass` pattern to enable your version of `BaseDocsGenerate` and `BaseDocsGenReferences` (see the sketch below). However, if you are supplementing the catalog with more metadata, you'll have to override the `expected_catalog` fixture, passing the above into `model_stats` and `seed_stats`.
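A hedged sketch of the inherit-and-`pass` pattern for the new docs tests; the module path assumes the dbt-core 1.2 adapter testing framework, and the class names and file location are placeholders:

```python
# tests/functional/adapter/test_docs.py (illustrative location)
from dbt.tests.adapter.basic.test_docs_generate import (
    BaseDocsGenerate,
    BaseDocsGenReferences,
)


class TestDocsGenerateMyAdapter(BaseDocsGenerate):
    pass


class TestDocsGenReferencesMyAdapter(BaseDocsGenReferences):
    pass
```

If your catalog carries extra stats, override the `expected_catalog` fixture on these classes instead of just passing.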
Example PRs:
## More Python functions now available in the dbt Jinja context
Python's `set` and `zip`, and most of `itertools`, are now available in the dbt-Jinja context. Yay! (#5107 and #5140). There's no explicit action needed here; it's only mentioned in case it enables some Jinja simplifications.

## Slight change to the default seed materialization
- who: folks who override the entire seed materialization, and anyone who overrides materializations for small reasons. This is a great example of how the global_project can be modified to reduce boilerplate within adapters.
- what: a new macro, `get_csv_sql()`, was added to `macros/materializations/seeds/helpers.sql`.
- why: transactions are no longer the default behavior for dbt-snowflake; however, they're still needed for bundling the seed table creation and insertion. So now we have a new default macro, which dbt-snowflake can override with a version that makes the two statements happen in the same transaction.
- more info: check out issues #5206 and #5207.