Merge branch 'current' into sl-sigma-preview
mirnawong1 authored Dec 16, 2024
2 parents b9cbee4 + d27439a commit c95dd29
Showing 52 changed files with 454 additions and 247 deletions.
12 changes: 6 additions & 6 deletions .github/CODEOWNERS
@@ -4,14 +4,14 @@
* @dbt-labs/product-docs

# Adapter & Package Development Docs
-/website/docs/docs/supported-data-platforms.md @dbt-labs/product-docs @dataders
-/website/docs/reference/warehouse-setups @dbt-labs/product-docs @dataders
+/website/docs/docs/supported-data-platforms.md @dbt-labs/product-docs @amychen1776
+/website/docs/reference/warehouse-setups @dbt-labs/product-docs @amychen1776
# `resource-configs` contains more than just warehouse setups
-/website/docs/reference/resource-configs/*-configs.md @dbt-labs/product-docs @dataders
-/website/docs/guides/advanced/adapter-development @dbt-labs/product-docs @dataders @dbeatty10
+/website/docs/reference/resource-configs/*-configs.md @dbt-labs/product-docs @amychen1776
+/website/docs/guides/advanced/adapter-development @dbt-labs/product-docs @amychen1776

-/website/docs/guides/building-packages @dbt-labs/product-docs @amychen1776 @dataders @dbeatty10
-/website/docs/guides/creating-new-materializations @dbt-labs/product-docs @dataders @dbeatty10
+/website/docs/guides/building-packages @dbt-labs/product-docs @amychen1776
+/website/docs/guides/creating-new-materializations @dbt-labs/product-docs

# Require approval from the Multicell team when making
# changes to the public facing migration documentation.
7 changes: 3 additions & 4 deletions .github/workflows/vale.yml
@@ -29,11 +29,11 @@ jobs:
python-version: '3.x'

- name: Install Vale
-run: pip install vale==2.27.0 # Install a stable version of Vale
+run: pip install vale==3.9.1.0 # Install a stable version of Vale

- name: Get changed files
id: changed-files
-uses: tj-actions/changed-files@v34
+uses: tj-actions/changed-files@v45
with:
files: |
website/**/*.md
@@ -63,10 +63,9 @@ jobs:
uses: errata-ai/vale-action@reviewdog
with:
token: ${{ secrets.GITHUB_TOKEN }}
-reporter: github-check
+reporter: github-pr-review
files: ${{ steps.changed-files.outputs.all_changed_and_modified_files }}
separator: ' '
-version: '2.27.0'

# - name: Post summary comment
# if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
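Taken together, the changed lines above leave the lint job shaped roughly like the following sketch. Only the Vale pin, the `changed-files` bump, and the `reporter` switch come from this diff; the checkout and Python setup steps and overall layout are assumed boilerplate.

```yaml
# Hypothetical consolidated view of .github/workflows/vale.yml after this
# commit; surrounding structure is assumed, changed values are from the diff.
jobs:
  vale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: '3.x'

      - name: Install Vale
        run: pip install vale==3.9.1.0  # stable Vale pin per the diff

      - name: Get changed files
        id: changed-files
        uses: tj-actions/changed-files@v45
        with:
          files: |
            website/**/*.md

      - name: Run Vale
        uses: errata-ai/vale-action@reviewdog
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          reporter: github-pr-review  # inline PR review comments instead of a check run
          files: ${{ steps.changed-files.outputs.all_changed_and_modified_files }}
          separator: ' '
```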
@@ -65,3 +65,5 @@ N/A
pseudocolumn
yml
values=
+dbt v\\d+\\.\\d+
+v\\d+\\.\\d+
File renamed without changes.
2 changes: 2 additions & 0 deletions styles/custom/Typos.yml
@@ -37,3 +37,5 @@ tokens:
- '\w+/\w+|\w+-\w+|n/a'
- 'n/a'
- 'N/A'
+- 'dbt v\\d+\\.\\d+'
+- 'v\\d+\\.\\d+ '
2 changes: 1 addition & 1 deletion website/blog/2022-04-14-add-ci-cd-to-bitbucket.md
@@ -197,7 +197,7 @@ Reading the file over, you can see that we:

In summary, anytime anything is pushed to main, we’ll ensure our production database reflects the dbt transformation, and we’ve saved the resulting artifacts to defer to.

-> ❓ **What are artifacts and why should I defer to them?** dbt artifacts are metadata of the last run - what models and tests were defined, which ones ran successfully, and which failed. If a future dbt run is set to ***defer*** to this metadata, it means that it can select models and tests to run based on their state, including and especially their difference from the reference metadata. See [Artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts), [Selection methods: “state”](https://docs.getdbt.com/reference/node-selection/methods#the-state-method), and [Caveats to state comparison](https://docs.getdbt.com/reference/node-selection/state-comparison-caveats) for details.
+> ❓ **What are artifacts and why should I defer to them?** dbt artifacts are metadata of the last run - what models and tests were defined, which ones ran successfully, and which failed. If a future dbt run is set to ***defer*** to this metadata, it means that it can select models and tests to run based on their state, including and especially their difference from the reference metadata. See [Artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts), [Selection methods: “state”](https://docs.getdbt.com/reference/node-selection/methods#state), and [Caveats to state comparison](https://docs.getdbt.com/reference/node-selection/state-comparison-caveats) for details.

### Slim Continuous Integration: Retrieve the artifacts and do a state-based run

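To make the deferral idea above concrete, a state-based CI step in the Bitbucket Pipelines file this post builds up could look like the following sketch; the step name, artifact path, and selector are illustrative assumptions, not part of the post.

```yaml
# Hypothetical bitbucket-pipelines.yml excerpt: build only modified models
# (and their children), deferring unchanged upstream refs to production.
pipelines:
  pull-requests:
    '**':
      - step:
          name: Slim CI dbt build
          script:
            # ./prod-artifacts is an assumed path containing the production
            # manifest this run defers to.
            - dbt build --select state:modified+ --defer --state ./prod-artifacts
```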
@@ -7,6 +7,7 @@ id: 5-how-we-style-our-yaml

- 2️⃣ Indents should be two spaces
- ➡️ List items should be indented
+- 🔠 List items with a single entry can be a string. For example, `'select': 'other_user'`, but it's best practice to provide the argument as an explicit list. For example, `'select': ['other_user']`
- 🆕 Use a new line to separate list items that are dictionaries where appropriate
- 📏 Lines of YAML should be no longer than 80 characters.
- 🛠️ Use the [dbt JSON schema](https://github.com/dbt-labs/dbt-jsonschema) with any compatible IDE and a YAML formatter (we recommend [Prettier](https://prettier.io/)) to validate your YAML files and format them automatically.
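The single-entry rule added above shows up naturally in dbt's grants config; a minimal sketch (the role name is a placeholder):

```yaml
# Both forms parse; the explicit list is the style guide's preference.
models:
  - name: orders
    config:
      grants:
        'select': ['other_user']  # preferred: explicit list
        # 'select': 'other_user'  # allowed: bare string for a single entry
```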
5 changes: 5 additions & 0 deletions website/docs/docs/build/data-tests.md
@@ -9,6 +9,11 @@ id: "data-tests"
keywords:
- test, tests, testing, dag
---

+import CopilotBeta from '/snippets/_dbt-copilot-avail.md';
+
+<CopilotBeta resource='data tests' />

## Related reference docs
* [Test command](/reference/commands/test)
* [Data test properties](/reference/resource-properties/data-tests)
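For orientation, the data tests this page documents are declared on columns in model YAML; a minimal, assumed example using dbt's built-in generics:

```yaml
# Placeholder model and column names; `unique` and `not_null` are
# dbt's built-in generic data tests.
models:
  - name: orders
    columns:
      - name: order_id
        data_tests:
          - unique
          - not_null
```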
2 changes: 1 addition & 1 deletion website/docs/docs/build/dbt-tips.md
@@ -40,7 +40,7 @@ Leverage these dbt packages to streamline your workflow:
- Set `vars` in your `dbt_project.yml` to define global defaults for certain conditions, which you can then override using the `--vars` flag in your commands.
- Use [for loops](/guides/using-jinja?step=3) in Jinja to <Term id="dry">DRY</Term> up repetitive logic, such as selecting a series of columns that all require the same transformations and naming patterns to be applied.
- Instead of relying on post-hooks, use the [grants config](/reference/resource-configs/grants) to apply permission grants in the warehouse resiliently.
-- Define [source-freshness](/docs/build/sources#snapshotting-source-data-freshness) thresholds on your sources to avoid running transformations on data that has already been processed.
+- Define [source-freshness](/docs/build/sources#source-data-freshness) thresholds on your sources to avoid running transformations on data that has already been processed.
- Use the `+` operator on the left of a model `dbt build --select +model_name` to run a model and all of its upstream dependencies. Use the `+` operator on the right of the model `dbt build --select model_name+` to run a model and everything downstream that depends on it.
- Use `dir_name` to run all models in a package or directory.
- Use the `@` operator on the left of a model in a non-state-aware CI setup to test it. This operator runs all of a selection’s parents and children, and also runs the parents of its children, which in a fresh CI schema will likely not exist yet.
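To ground the `vars` tip in the list above, a minimal sketch (variable names are placeholders):

```yaml
# dbt_project.yml — global defaults that a command-line flag can override,
# e.g. dbt build --vars '{"start_date": "2024-01-01"}'.
vars:
  start_date: '2020-01-01'
  include_deleted: false
```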
4 changes: 4 additions & 0 deletions website/docs/docs/build/documentation.md
@@ -7,6 +7,10 @@ id: "documentation"
Good documentation for your dbt models will help downstream consumers discover and understand the datasets you curate for them.
dbt provides a way to generate documentation for your dbt project and render it as a website.

+import CopilotBeta from '/snippets/_dbt-copilot-avail.md';
+
+<CopilotBeta resource='documentation' />

## Related documentation

* [Declaring properties](/reference/configs-and-properties)
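As a reminder of the underlying mechanics, the documentation this page describes starts from `description` fields in model YAML; a minimal, assumed example:

```yaml
# Placeholder names; `description` values are rendered on the docs site
# generated by `dbt docs generate`.
models:
  - name: events
    description: "One row per tracked front-end event."
    columns:
      - name: event_id
        description: "Primary key for events."
```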
2 changes: 1 addition & 1 deletion website/docs/docs/build/exposures.md
@@ -77,5 +77,5 @@ When we generate the [dbt Explorer site](/docs/collaborate/explore-projects), yo
## Related docs

* [Exposure properties](/reference/exposure-properties)
-* [`exposure:` selection method](/reference/node-selection/methods#the-exposure-method)
+* [`exposure:` selection method](/reference/node-selection/methods#exposure)
* [Data health tiles](/docs/collaborate/data-tile)
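For context on the page being edited, an exposure is itself a small YAML resource; a minimal, assumed sketch:

```yaml
# Placeholder names; `depends_on` ties the exposure to upstream models so
# it appears downstream of them in the DAG and in dbt Explorer.
exposures:
  - name: weekly_kpi_dashboard
    type: dashboard
    owner:
      name: Data Team
      email: data@example.com
    depends_on:
      - ref('fct_orders')
```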
2 changes: 1 addition & 1 deletion website/docs/docs/build/groups.md
@@ -119,4 +119,4 @@ dbt.exceptions.DbtReferenceError: Parsing Error

* [Model Access](/docs/collaborate/govern/model-access#groups)
* [Group configuration](/reference/resource-configs/group)
-* [Group selection](/reference/node-selection/methods#the-group-method)
+* [Group selection](/reference/node-selection/methods#group)
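For context, a group is declared once and then referenced from model configs; a minimal, assumed sketch:

```yaml
# Placeholder group; models join it via the `group` config, and `private`
# access then restricts cross-group refs.
groups:
  - name: finance
    owner:
      email: finance@example.com
```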
33 changes: 32 additions & 1 deletion website/docs/docs/build/materializations.md
@@ -18,7 +18,11 @@ You can also configure [custom materializations](/guides/create-new-materializat


## Configuring materializations
-By default, dbt models are materialized as "views". Models can be configured with a different materialization by supplying the `materialized` configuration parameter as shown below.
+By default, dbt models are materialized as "views". Models can be configured with a different materialization by supplying the [`materialized` configuration](/reference/resource-configs/materialized) parameter as shown in the following tabs.

+<Tabs>
+
+<TabItem value="Project file">

<File name='dbt_project.yml'>

@@ -49,6 +53,10 @@ models:
</File>
+</TabItem>
+<TabItem value="Model file">
Alternatively, materializations can be configured directly inside of the model sql files. This can be useful if you are also setting [Performance Optimization] configs for specific models (for example, [Redshift specific configurations](/reference/resource-configs/redshift-configs) or [BigQuery specific configurations](/reference/resource-configs/bigquery-configs)).
<File name='models/events/stg_event_log.sql'>
@@ -63,6 +71,29 @@ from ...

</File>

+</TabItem>
+
+<TabItem value="Property file">
+
+Materializations can also be configured in the model's `properties.yml` file. The following example shows the `table` materialization type. For a complete list of materialization types, refer to [materializations](/docs/build/materializations#materializations).
+
+<File name='models/properties.yml'>
+
+```yaml
+version: 2
+
+models:
+  - name: events
+    config:
+      materialized: table
+```
+</File>
+</TabItem>
+</Tabs>
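Because the project-file tab's example is elided in this view, here is a hedged sketch of what that approach generally looks like (project and folder names assumed):

```yaml
# dbt_project.yml — directory-level materialization defaults, with a
# narrower override for one subfolder.
models:
  my_project:             # assumed project name
    +materialized: view   # default for every model in the project
    events:
      +materialized: table  # override for models in models/events/
```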
## Materializations
4 changes: 4 additions & 0 deletions website/docs/docs/build/semantic-models.md
@@ -9,6 +9,10 @@ tags: [Metrics, Semantic Layer]
pagination_next: "docs/build/dimensions"
---

+import CopilotBeta from '/snippets/_dbt-copilot-avail.md';
+
+<CopilotBeta resource='semantic models' />

Semantic models are the foundation for data definition in MetricFlow, which powers the dbt Semantic Layer:

- Think of semantic models as nodes connected by entities in a semantic graph.
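To anchor the definition above, a semantic model is YAML layered on a dbt model; a minimal, assumed sketch:

```yaml
# Hypothetical semantic model (all names assumed): one primary entity,
# one time dimension, and one measure over fct_orders.
semantic_models:
  - name: orders
    model: ref('fct_orders')
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum
```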
14 changes: 7 additions & 7 deletions website/docs/docs/build/sources.md
@@ -130,11 +130,11 @@ You can find more details on the available properties for sources in the [refere
<FAQ path="Tests/testing-sources" />
<FAQ path="Runs/running-models-downstream-of-source" />

-## Snapshotting source data freshness
-With a couple of extra configs, dbt can optionally snapshot the "freshness" of the data in your source tables. This is useful for understanding if your data pipelines are in a healthy state, and is a critical component of defining SLAs for your warehouse.
+## Source data freshness
+With a couple of extra configs, dbt can optionally capture the "freshness" of the data in your source tables. This is useful for understanding if your data pipelines are in a healthy state, and is a critical component of defining SLAs for your warehouse.

### Declaring source freshness
-To configure sources to snapshot freshness information, add a `freshness` block to your source and `loaded_at_field` to your table declaration:
+To configure source freshness information, add a `freshness` block to your source and `loaded_at_field` to your table declaration:

<File name='models/<filename>.yml'>

@@ -164,14 +164,14 @@ sources:

</File>

-In the `freshness` block, one or both of `warn_after` and `error_after` can be provided. If neither is provided, then dbt will not calculate freshness snapshots for the tables in this source.
+In the `freshness` block, one or both of `warn_after` and `error_after` can be provided. If neither is provided, then dbt will not calculate freshness for the tables in this source.

Additionally, the `loaded_at_field` is required to calculate freshness for a table. If a `loaded_at_field` is not provided, then dbt will not calculate freshness for the table.

These configs are applied hierarchically, so `freshness` and `loaded_at_field` values specified for a `source` will flow through to all of the `tables` defined in that source. This is useful when all of the tables in a source have the same `loaded_at_field`, as the config can just be specified once in the top-level source definition.
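The elided YAML above follows dbt's standard source-freshness shape; a minimal sketch consistent with the fields discussed here (source and column names taken from the example query below):

```yaml
# Source-level freshness flows down to every table; `loaded_at_field`
# names the timestamp column compared against the current time.
sources:
  - name: jaffle_shop
    database: raw
    freshness:
      warn_after: {count: 12, period: hour}
      error_after: {count: 24, period: hour}
    loaded_at_field: _etl_loaded_at
    tables:
      - name: orders
```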

### Checking source freshness
-To snapshot freshness information for your sources, use the `dbt source freshness` command ([reference docs](/reference/commands/source)):
+To obtain freshness information for your sources, use the `dbt source freshness` command ([reference docs](/reference/commands/source)):

```
$ dbt source freshness
@@ -182,7 +182,7 @@ Behind the scenes, dbt uses the freshness properties to construct a `select` que
```sql
select
max(_etl_loaded_at) as max_loaded_at,
-convert_timezone('UTC', current_timestamp()) as snapshotted_at
+convert_timezone('UTC', current_timestamp()) as calculated_at
from raw.jaffle_shop.orders
```
@@ -198,7 +198,7 @@ Some databases can have tables where a filter over certain columns are required,
```sql
select
max(_etl_loaded_at) as max_loaded_at,
-convert_timezone('UTC', current_timestamp()) as snapshotted_at
+convert_timezone('UTC', current_timestamp()) as calculated_at
from raw.jaffle_shop.orders
where _etl_loaded_at >= date_sub(current_date(), interval 1 day)
```
1 change: 1 addition & 0 deletions website/docs/docs/build/unit-tests.md
@@ -24,6 +24,7 @@ Starting in dbt Core v1.8, we have introduced an additional type of test to dbt
- We currently only support adding unit tests to models in your _current_ project.
- We currently _don't_ support unit testing models that use the [`materialized view`](/docs/build/materializations#materialized-view) materialization.
- We currently _don't_ support unit testing models that use recursive SQL.
+- We currently _don't_ support unit testing models that use introspective queries.
- If your model has multiple versions, by default the unit test will run on *all* versions of your model. Read [unit testing versioned models](/reference/resource-properties/unit-testing-versions) for more information.
- Unit tests must be defined in a YML file in your [`models/` directory](/reference/project-configs/model-paths).
- Table names must be aliased in order to unit test `join` logic.
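For readers new to the feature, a unit test pins a model's logic against fixed inputs; a minimal, assumed sketch:

```yaml
# Hypothetical unit test (model, column, and fixture values assumed):
# given one fixed input row, assert the transformed output row.
unit_tests:
  - name: test_is_valid_email_address
    model: dim_customers
    given:
      - input: ref('stg_customers')
        rows:
          - {customer_id: 1, email: cool@example.com}
    expect:
      rows:
        - {customer_id: 1, is_valid_email: true}
```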
2 changes: 1 addition & 1 deletion website/docs/docs/collaborate/govern/model-versions.md
@@ -99,7 +99,7 @@ Let's say that `dim_customers` has three versions defined: `v2` is the "latest",

As you'll see in the implementation section below, a versioned model can reuse the majority of its YAML properties and configuration. Each version needs to only say how it _differs_ from the shared set of attributes. This gives you, as the producer of a versioned model, the opportunity to highlight the differences across versions—which is otherwise difficult to detect in models with dozens or hundreds of columns—and to clearly track, in one place, all versions of the model which are currently live.

-dbt also supports [`version`-based selection](/reference/node-selection/methods#the-version-method). For example, you could define a [default YAML selector](/reference/node-selection/yaml-selectors#default) that avoids running any old model versions in development, even while you continue to run them in production through a sunset and migration period. (You could accomplish something similar by applying `tags` to these models, and cycling through those tags over time.)
+dbt also supports [`version`-based selection](/reference/node-selection/methods#version). For example, you could define a [default YAML selector](/reference/node-selection/yaml-selectors#default) that avoids running any old model versions in development, even while you continue to run them in production through a sunset and migration period. (You could accomplish something similar by applying `tags` to these models, and cycling through those tags over time.)

<File name="selectors.yml">

3 changes: 2 additions & 1 deletion website/docs/docs/community-adapters.md
@@ -7,7 +7,8 @@ Community adapters are adapter plugins contributed and maintained by members of

| Data platforms (click to view setup guide) |||
| ------------------------------------------ | -------------------------------- | ------------------------------------- |
-| [Clickhouse](/docs/core/connect-data-platform/clickhouse-setup) | [Databend Cloud](/docs/core/connect-data-platform/databend-setup) | [Doris & SelectDB](/docs/core/connect-data-platform/doris-setup) |
+| [Clickhouse](/docs/core/connect-data-platform/clickhouse-setup) | [CrateDB](/docs/core/connect-data-platform/cratedb-setup) |
+| [Databend Cloud](/docs/core/connect-data-platform/databend-setup) | [Doris & SelectDB](/docs/core/connect-data-platform/doris-setup) |
| [DuckDB](/docs/core/connect-data-platform/duckdb-setup) | [Exasol Analytics](/docs/core/connect-data-platform/exasol-setup) | [Extrica](/docs/core/connect-data-platform/extrica-setup) |
| [Hive](/docs/core/connect-data-platform/hive-setup) | [IBM DB2](/docs/core/connect-data-platform/ibmdb2-setup) | [Impala](/docs/core/connect-data-platform/impala-setup) |
| [Infer](/docs/core/connect-data-platform/infer-setup) | [iomete](/docs/core/connect-data-platform/iomete-setup) | [MindsDB](/docs/core/connect-data-platform/mindsdb-setup) |
62 changes: 62 additions & 0 deletions website/docs/docs/core/connect-data-platform/cratedb-setup.md
@@ -0,0 +1,62 @@
---
title: "CrateDB setup"
description: "Read this guide to learn about the CrateDB data platform setup in dbt."
id: "cratedb-setup"
meta:
maintained_by: Crate.io, Inc.
authors: 'CrateDB maintainers'
github_repo: 'crate/dbt-cratedb2'
pypi_package: 'dbt-cratedb2'
min_core_version: 'v1.0.0'
cloud_support: Not Supported
min_supported_version: 'n/a'
slack_channel_name: 'Community Forum'
slack_channel_link: 'https://community.cratedb.com/'
platform_name: 'CrateDB'
config_page: '/reference/resource-configs/no-configs'
---

import SetUpPages from '/snippets/_setup-pages-intro.md';

<SetUpPages meta={frontMatter.meta}/>


[CrateDB] is compatible with PostgreSQL, so its dbt adapter builds directly on
dbt-postgres, documented at [PostgreSQL profile setup].

CrateDB targets are configured in exactly the same way (see also [PostgreSQL
configuration]), with just a few CrateDB-specific points to consider. Those
details are outlined at [using dbt with CrateDB], which also includes
up-to-date information.


## Profile configuration

CrateDB targets should be set up with a minimal configuration like the
following sample in your [`profiles.yml`] file.

<File name='~/.dbt/profiles.yml'>

```yaml
cratedb_analytics:
target: dev
outputs:
dev:
type: cratedb
host: [clustername].aks1.westeurope.azure.cratedb.net
port: 5432
user: [username]
pass: [password]
dbname: crate # Do not change this value. CrateDB's only catalog is `crate`.
schema: doc # Define the schema name. CrateDB's default schema is `doc`.
```
</File>
[CrateDB]: https://cratedb.com/database
[PostgreSQL configuration]: https://docs.getdbt.com/reference/resource-configs/postgres-configs
[PostgreSQL profile setup]: https://docs.getdbt.com/docs/core/connect-data-platform/postgres-setup
[`profiles.yml`]: https://docs.getdbt.com/docs/core/connect-data-platform/profiles.yml
[using dbt with CrateDB]: https://cratedb.com/docs/guide/integrate/dbt/
8 changes: 1 addition & 7 deletions website/docs/docs/core/connect-data-platform/dremio-setup.md
@@ -60,10 +60,6 @@ Next, configure the profile for your project.

When you initialize a project, you create one of these three profiles. You must configure it before trying to connect to Dremio Cloud or Dremio Software.

-## Profiles
-
-When you initialize a project, you create one of these three profiles. You must configure it before trying to connect to Dremio Cloud or Dremio Software.

* Profile for Dremio Cloud
* Profile for Dremio Software with Username/Password Authentication
* Profile for Dremio Software with Authentication Through a Personal Access Token
@@ -149,9 +145,7 @@ For descriptions of the configurations in these profiles, see [Configurations](#
</TabItem>
</Tabs>
-## Configurations
-### Configurations Common to Profiles for Dremio Cloud and Dremio Software
+## Configurations Common to Profiles for Dremio Cloud and Dremio Software
| Configuration | Required? | Default Value | Description |
