[Bug] Macro overwrites are not picked up since dbt version 0.21.0 #4098
@JCZuurmond Thanks for opening the issue. Check out #4029 (comment). As a quick restatement:
```yaml
# dbt_project.yml
dispatch:
  - macro_namespace: dbt
    search_order: ['<name_of_your_root_project>', 'dbt']
```

I previously closed that issue, but anytime we get the same issue multiple times, it's worth thinking about whether we should do something differently. There are a few directions we could go from here:
...wait

Ok, as I finished writing the second option above, it occurred to me that there might be a happy middle ground: we could set all macro namespaces to search the root project first by default. That might be as simple as changing this conditional to include the root project, too (dbt-core/core/dbt/context/providers.py, lines 146 to 147 at 280e9ad):
```python
# The conditional with the proposed change applied: the root project is
# searched ahead of the package that owns the macro namespace.
if not search_packages and namespace in self._adapter.config.dependencies:
    search_packages = [self.config.project_name, namespace]
```

I just tested that locally (reimplementing a dispatched macro in my root project), and it works. Is that a change you might be interested in contributing? Given that there's a win-win way to make it, and it restores previous behavior, I think we could sneak this into the next patch release (v0.21.1).
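To illustrate what this means for resolution order, here is a minimal sketch; the function name and structure are hypothetical, not dbt-core's actual API — only the two-element default list mirrors the change above:

```python
# Hypothetical sketch of dispatch search-order resolution.

def macro_search_order(root_project, namespace, explicit_order=None):
    """Return the packages searched, in order, for a dispatched macro."""
    if explicit_order:
        # An explicit `dispatch:` config in dbt_project.yml still wins,
        # as in the workaround restated at the top of this comment.
        return explicit_order
    # Proposed default: search the root project first, then the package
    # owning the macro namespace. Previously this was just [namespace],
    # which is why root-project overrides stopped being picked up in 0.21.0.
    return [root_project, namespace]

# A root-project override of spark__get_merge_sql is now found first:
assert macro_search_order("my_project", "dbt") == ["my_project", "dbt"]
# The documented workaround keeps working unchanged:
assert macro_search_order("my_project", "dbt",
                          ["my_project", "dbt"]) == ["my_project", "dbt"]
```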
Hi @jtcohen6, thank you for your quick and elaborate response! I missed the issue you linked in my search. I also read the CHANGELOG, but did not connect the dispatch change mentioned there to this issue.
I like your suggestion to restore the old behavior, which I also think is more intuitive. I will have a go at a PR tomorrow.
Is there an existing issue for this?
I have searched the existing issues
Current Behavior
There is a custom macro which overrides a `dbt-spark` macro (`spark__get_merge_sql`). The custom macro is not used in preference to the default macro.

Expected Behavior

The custom macro is used in preference to the default macro.
Steps To Reproduce
1. Overwrite `spark__get_merge_sql` (anything other than the default)
2. `dbt run <model name>` to create the model
3. `dbt --debug run <model name>` to see the SQL of the merge statement

Relevant log output
No response
Environment
What database are you using dbt with?
other (mention it in "Additional Context")
Additional Context
- `dbt-spark` with a Databricks workspace
- `dbt run` with version `0.20.0` uses the custom macro as expected