
[Bug] Macro overwrites are not picked up since dbt version 0.21.0 #4098

Closed
1 task done
JCZuurmond opened this issue Oct 20, 2021 · 2 comments
Labels
bug (Something isn't working), good_first_issue (Straightforward + self-contained changes, good for new contributors!), regression

Comments

@JCZuurmond
Contributor

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

A custom macro overrides a dbt-spark macro (spark__get_merge_sql). The custom macro is not used in preference to the default macro.

Expected Behavior

The custom macro is used in preference to the default macro.

Steps To Reproduce

  1. With dbt version 0.21.0 and dbt-spark
  2. Create a custom macro that overwrites the default spark__get_merge_sql (anything other than the default)
  3. Create an incremental model
  4. dbt run -m <model name> to create the model
  5. dbt --debug run -m <model name> to see the SQL of the merge statement

Relevant log output

No response

Environment

- OS: Ubuntu latest (in GitHub Actions - I guess 20.04) and macOS Big Sur 11.6
- Python: 3.7
- dbt: 0.21.0

What database are you using dbt with?

other (mention it in "Additional Context")

Additional Context

  • Using dbt-spark with a databricks workspace
  • No errors when running dbt run with version 0.20.0
@JCZuurmond JCZuurmond added the bug and triage labels Oct 20, 2021
@jtcohen6
Contributor

jtcohen6 commented Oct 20, 2021

@JCZuurmond Thanks for opening the issue. Check out #4029 (comment). As a quick restatement:

This did change in v0.21, as a trade-off to unlock other capabilities (#3851). Global macros now specify the dbt namespace when searching for dispatch candidates.

You have two options to get this working again, both quite straightforward:

  1. Name your local version get_merge_sql, instead of spark__get_merge_sql. This overrides the entire get_merge_sql macro. It's only possible to do this for macros defined in your root project.
  2. Keep the local version named spark__get_merge_sql. Add a dispatch config to your dbt_project.yml, telling dbt to prefer your candidate (adapter-prefixed) macros over the built-in ones when dispatching the dbt namespace:
# dbt_project.yml
dispatch:
  - macro_namespace: dbt
    search_order: ['<name_of_your_root_project>', 'dbt']
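Option (1), for comparison, means defining the non-prefixed name directly in the root project, which replaces the dispatching macro entirely. Again an illustrative sketch; the argument list is an assumption, so match it to get_merge_sql in your dbt version.

```sql
-- macros/get_merge_sql.sql in the root project (hypothetical file name)
{% macro get_merge_sql(target, source, unique_key, dest_columns) %}
    {# your custom merge SQL goes here; once the root project defines
       this name, no adapter dispatch happens for it #}
{% endmacro %}
```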

I previously closed that issue, but anytime we get the same issue multiple times, it's worth thinking about whether we should do something differently.

There are a few directions we could go from here:

  • Keep the current functionality, and do a better job of documenting it. Right now, it's documented under dispatch: https://docs.getdbt.com/reference/dbt-jinja-functions/dispatch#overriding-global-macros. Track down the other places we talk about overriding global macros, and make sure they link back there.
  • Slightly rework dispatch search orders to implicitly include your root project first, and the dbt namespace last, so that you don't have to include the explicit config in step (2) above, and the previous behavior would "just work." This feels 95% uncontroversial to me, but it does represent a reduction of flexibility, since it would no longer be possible to prefer a built-in or package macro over an implementation defined in the root project.

...wait

Ok, as I finished writing the second option above, it occurred to me that there might be a happy middle ground: we could set all macro namespaces to use ['<name_of_your_root_project>', '<macro_namespace>'] as the search order by default, unless you override it with a config in your own project, in which case it uses only the explicit search order you've defined.

That might be as simple as changing this conditional to include the root project, too:

The conditional as it stands today:

            if not search_packages and namespace in self._adapter.config.dependencies:
                search_packages = [namespace]

And the proposed change:

            if not search_packages and namespace in self._adapter.config.dependencies:
                search_packages = [self.config.project_name, namespace]

I just tested that locally (reimplementing postgres__create_table_as to log a message), and it works as expected. I still have the ability to define a custom search order via project-level dispatch config, but it changes the default to be a bit more intuitive, too.
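The proposed default can be illustrated with a small standalone sketch. This is plain Python modeling the resolution logic, not dbt-core's actual implementation; all names here (resolve_search_packages, dispatch, my_project) are hypothetical.

```python
# Toy model of dbt's dispatch search-order resolution (not real dbt-core
# code): given a macro namespace, decide which packages to search,
# preferring an explicit project-level dispatch config, then falling
# back to the proposed default of root project + namespace.

def resolve_search_packages(namespace, root_project, dependencies, dispatch_config):
    """Return the ordered list of packages to search for macro candidates."""
    # An explicit dispatch config in dbt_project.yml wins outright.
    explicit = dispatch_config.get(namespace)
    if explicit:
        return explicit
    # Proposed default: look in the root project first, then the
    # namespace package itself (restoring pre-0.21 override behavior).
    if namespace in dependencies:
        return [root_project, namespace]
    return [namespace]

def dispatch(macro_name, namespace, macros, root_project, dependencies,
             dispatch_config=None):
    """Find the first adapter-prefixed candidate across the search order."""
    dispatch_config = dispatch_config or {}
    packages = resolve_search_packages(namespace, root_project,
                                       dependencies, dispatch_config)
    for package in packages:
        candidate = (package, f"spark__{macro_name}")
        if candidate in macros:
            return candidate
    raise KeyError(macro_name)

# Usage: a root project that redefines spark__get_merge_sql now wins by default.
macros = {
    ("my_project", "spark__get_merge_sql"),
    ("dbt", "spark__get_merge_sql"),
}
print(dispatch("get_merge_sql", "dbt", macros, "my_project", {"dbt"}))
# -> ('my_project', 'spark__get_merge_sql')
```

An explicit search_order in the project's dispatch config still takes precedence, so the extra flexibility from option (2) above is preserved.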

Is that a change you might be interested in contributing? Given that there's a win-win way to make it, and it restores previous behavior, I think we could sneak this into the next patch release (v0.21.1).

@jtcohen6 jtcohen6 added the good_first_issue and regression labels and removed the triage label Oct 20, 2021
@jtcohen6 jtcohen6 added this to the 0.21.1 milestone Oct 20, 2021
@JCZuurmond
Contributor Author

Hi @jtcohen6, thank you for your quick and elaborate response! I missed the issue you linked in my search. I also read the CHANGELOG, but did not link the dispatch comment to this:

Specify macro_namespace = 'dbt' for all dispatched macros in the global project, making it possible to dispatch to macro implementations defined in packages. Dispatch generate_schema_name and generate_alias_name (#3456, #3851)

I like your suggestion to get the old behavior again, which I also think is more intuitive. I will have a go at a PR tomorrow.
