[Bug] Contract enforcement results in an error when data_type: GEOGRAPHY
#894
Thanks for raising this issue @gwerbin-tive! I was able to reproduce the error message given your example.

Root cause

The root cause is that this doesn't work for the GEOGRAPHY data type:

cast(null as geography) as geo

Instead, it needs to be:

to_geography(null) as geo

Workaround

See below for a workaround you can try out in the meantime. It works by defining a dispatched cast macro and overriding default__get_empty_schema_sql so that each column is rendered as:

{{ cast("null", col['data_type']) }} as {{ col_name }}{{ ", " if not loop.last }}

macro definitions
{% macro cast(expression, data_type) -%}
{{ adapter.dispatch('cast', 'dbt') (expression, data_type) }}
{%- endmacro %}
{% macro default__cast(expression, data_type) -%}
cast({{ expression }} as {{ data_type }})
{%- endmacro %}
{% macro snowflake__cast(expression, data_type) -%}
{#-- If the data type is 'geography' then it can't use cast() --#}
{%- if expression.strip().lower() == "null" and data_type.strip().lower() == "geography" -%}
to_geography(null)
{%- else -%}
cast({{ expression }} as {{ data_type }})
{%- endif -%}
{%- endmacro %}
{% macro default__get_empty_schema_sql(columns) %}
{%- set col_err = [] -%}
{%- set col_naked_numeric = [] -%}
select
{% for i in columns %}
{%- set col = columns[i] -%}
{%- if col['data_type'] is not defined -%}
{%- do col_err.append(col['name']) -%}
{#-- If this column's type is just 'numeric' then it is missing precision/scale, raise a warning --#}
{%- elif col['data_type'].strip().lower() in ('numeric', 'decimal', 'number') -%}
{%- do col_naked_numeric.append(col['name']) -%}
{%- endif -%}
{% set col_name = adapter.quote(col['name']) if col.get('quote') else col['name'] %}
{{ cast("null", col['data_type']) }} as {{ col_name }}{{ ", " if not loop.last }}
{%- endfor -%}
{%- if (col_err | length) > 0 -%}
{{ exceptions.column_type_missing(column_names=col_err) }}
{%- elif (col_naked_numeric | length) > 0 -%}
{{ exceptions.warn("Detected columns with numeric type and unspecified precision/scale, this can lead to unintended rounding: " ~ col_naked_numeric ~ "`") }}
{%- endif -%}
{% endmacro %}

Proposed path forward

I'd propose that we solve this in two parts (both using the code from the workaround above):
Benefits

This approach would also benefit other adapters that have done workarounds like this (via the cross-database cast macro).
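For illustration only (not part of the original comment), here is a minimal sketch of how the dispatched cast macro from the workaround resolves once it lives in a project's macros/ directory; the model and column names are hypothetical:

-- hypothetical model exercising the dispatched cast macro defined above
select
    {{ cast("null", "geography") }} as geo,       -- renders on Snowflake as: to_geography(null)
    {{ cast("null", "number(38,0)") }} as id      -- renders as: cast(null as number(38,0))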
I suspect this might become a problem with some of the other Snowflake data types, although I haven't tested it to confirm. That, or this is a Snowflake bug and it needs to be reported there.
Awesome!
I tested out all the Snowflake data types (and aliases) that I could find within their documentation. There were only two data types where cast(null as ...) didn't work: GEOGRAPHY and GEOMETRY.
In those cases, the solution will be to do this instead:
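Concretely, these are the same two lines that appear in the full query below:

to_geography(null) as my_GEOGRAPHY,
to_geometry(null) as my_GEOMETRY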
See below for the full query that I used:

query

select
cast(null as NUMBER) as my_NUMBER,
cast(null as FLOAT) as my_FLOAT,
cast(null as TEXT) as my_TEXT,
cast(null as BINARY) as my_BINARY,
cast(null as BOOLEAN) as my_BOOLEAN,
cast(null as DATE) as my_DATE,
cast(null as TIME) as my_TIME,
cast(null as TIMESTAMP_LTZ) as my_TIMESTAMP_LTZ,
cast(null as TIMESTAMP_NTZ) as my_TIMESTAMP_NTZ,
cast(null as TIMESTAMP_TZ) as my_TIMESTAMP_TZ,
cast(null as VARIANT) as my_VARIANT,
cast(null as OBJECT) as my_OBJECT,
cast(null as ARRAY) as my_ARRAY,
TO_GEOGRAPHY(null) as my_GEOGRAPHY,
TO_GEOMETRY(null) as my_GEOMETRY,
-- aliases
cast(null as DECIMAL) as my_DECIMAL__NUMBER,
cast(null as DEC) as my_DEC__NUMBER,
cast(null as NUMERIC) as my_NUMERIC__NUMBER,
cast(null as INT) as my_INT__NUMBER,
cast(null as INTEGER) as my_INTEGER__NUMBER,
cast(null as BIGINT) as my_BIGINT__NUMBER,
cast(null as SMALLINT) as my_SMALLINT__NUMBER,
cast(null as TINYINT) as my_TINYINT__NUMBER,
cast(null as BYTEINT) as my_BYTEINT__NUMBER,
cast(null as FLOAT4) as my_FLOAT4__FLOAT,
cast(null as FLOAT8) as my_FLOAT8__FLOAT,
cast(null as DOUBLE) as my_DOUBLE__FLOAT,
cast(null as DOUBLE PRECISION) as my_DOUBLE_PRECISION__FLOAT,
cast(null as REAL) as my_REAL__FLOAT,
cast(null as CHAR) as my_CHAR__TEXT,
cast(null as CHARACTER) as my_CHARACTER__TEXT,
cast(null as NCHAR) as my_NCHAR__TEXT,
cast(null as STRING) as my_STRING__TEXT,
cast(null as VARCHAR) as my_VARCHAR__TEXT,
cast(null as NVARCHAR) as my_NVARCHAR__TEXT,
cast(null as NVARCHAR2) as my_NVARCHAR2__TEXT,
cast(null as CHAR VARYING) as my_CHAR_VARYING__TEXT,
cast(null as NCHAR VARYING) as my_NCHAR_VARYING__TEXT,
cast(null as VARBINARY) as my_VARBINARY__BINARY,
cast(null as DATETIME) as my_DATETIME__TIMESTAMP_NTZ,
cast(null as TIMESTAMP) as my_TIMESTAMP__TIMESTAMP_X

catalog.json

{
"metadata": {
"dbt_schema_version": "https://schemas.getdbt.com/dbt/catalog/v1.json",
"dbt_version": "1.7.5",
"generated_at": "2024-02-09T01:56:05.811681Z",
"invocation_id": "a0590684-685b-4dc0-98f1-48adf70e4376",
"env": {}
},
"nodes": {
"model.my_project.data_types": {
"metadata": {
"type": "VIEW",
"schema": "DBT_DBEATTY",
"name": "DATA_TYPES",
"database": "ANALYTICS_DEV",
"comment": null,
"owner": "TRANSFORMER"
},
"columns": {
"MY_NUMBER": {
"type": "NUMBER",
"index": 1,
"name": "MY_NUMBER",
"comment": null
},
"MY_FLOAT": {
"type": "FLOAT",
"index": 2,
"name": "MY_FLOAT",
"comment": null
},
"MY_TEXT": {
"type": "TEXT",
"index": 3,
"name": "MY_TEXT",
"comment": null
},
"MY_BINARY": {
"type": "BINARY",
"index": 4,
"name": "MY_BINARY",
"comment": null
},
"MY_BOOLEAN": {
"type": "BOOLEAN",
"index": 5,
"name": "MY_BOOLEAN",
"comment": null
},
"MY_DATE": {
"type": "DATE",
"index": 6,
"name": "MY_DATE",
"comment": null
},
"MY_TIME": {
"type": "TIME",
"index": 7,
"name": "MY_TIME",
"comment": null
},
"MY_TIMESTAMP_LTZ": {
"type": "TIMESTAMP_LTZ",
"index": 8,
"name": "MY_TIMESTAMP_LTZ",
"comment": null
},
"MY_TIMESTAMP_NTZ": {
"type": "TIMESTAMP_NTZ",
"index": 9,
"name": "MY_TIMESTAMP_NTZ",
"comment": null
},
"MY_TIMESTAMP_TZ": {
"type": "TIMESTAMP_TZ",
"index": 10,
"name": "MY_TIMESTAMP_TZ",
"comment": null
},
"MY_VARIANT": {
"type": "VARIANT",
"index": 11,
"name": "MY_VARIANT",
"comment": null
},
"MY_OBJECT": {
"type": "OBJECT",
"index": 12,
"name": "MY_OBJECT",
"comment": null
},
"MY_ARRAY": {
"type": "ARRAY",
"index": 13,
"name": "MY_ARRAY",
"comment": null
},
"MY_GEOGRAPHY": {
"type": "GEOGRAPHY",
"index": 14,
"name": "MY_GEOGRAPHY",
"comment": null
},
"MY_GEOMETRY": {
"type": "GEOMETRY",
"index": 15,
"name": "MY_GEOMETRY",
"comment": null
},
"MY_DECIMAL__NUMBER": {
"type": "NUMBER",
"index": 16,
"name": "MY_DECIMAL__NUMBER",
"comment": null
},
"MY_DEC__NUMBER": {
"type": "NUMBER",
"index": 17,
"name": "MY_DEC__NUMBER",
"comment": null
},
"MY_NUMERIC__NUMBER": {
"type": "NUMBER",
"index": 18,
"name": "MY_NUMERIC__NUMBER",
"comment": null
},
"MY_INT__NUMBER": {
"type": "NUMBER",
"index": 19,
"name": "MY_INT__NUMBER",
"comment": null
},
"MY_INTEGER__NUMBER": {
"type": "NUMBER",
"index": 20,
"name": "MY_INTEGER__NUMBER",
"comment": null
},
"MY_BIGINT__NUMBER": {
"type": "NUMBER",
"index": 21,
"name": "MY_BIGINT__NUMBER",
"comment": null
},
"MY_SMALLINT__NUMBER": {
"type": "NUMBER",
"index": 22,
"name": "MY_SMALLINT__NUMBER",
"comment": null
},
"MY_TINYINT__NUMBER": {
"type": "NUMBER",
"index": 23,
"name": "MY_TINYINT__NUMBER",
"comment": null
},
"MY_BYTEINT__NUMBER": {
"type": "NUMBER",
"index": 24,
"name": "MY_BYTEINT__NUMBER",
"comment": null
},
"MY_FLOAT4__FLOAT": {
"type": "FLOAT",
"index": 25,
"name": "MY_FLOAT4__FLOAT",
"comment": null
},
"MY_FLOAT8__FLOAT": {
"type": "FLOAT",
"index": 26,
"name": "MY_FLOAT8__FLOAT",
"comment": null
},
"MY_DOUBLE__FLOAT": {
"type": "FLOAT",
"index": 27,
"name": "MY_DOUBLE__FLOAT",
"comment": null
},
"MY_DOUBLE_PRECISION__FLOAT": {
"type": "FLOAT",
"index": 28,
"name": "MY_DOUBLE_PRECISION__FLOAT",
"comment": null
},
"MY_REAL__FLOAT": {
"type": "FLOAT",
"index": 29,
"name": "MY_REAL__FLOAT",
"comment": null
},
"MY_CHAR__TEXT": {
"type": "TEXT",
"index": 30,
"name": "MY_CHAR__TEXT",
"comment": null
},
"MY_CHARACTER__TEXT": {
"type": "TEXT",
"index": 31,
"name": "MY_CHARACTER__TEXT",
"comment": null
},
"MY_NCHAR__TEXT": {
"type": "TEXT",
"index": 32,
"name": "MY_NCHAR__TEXT",
"comment": null
},
"MY_STRING__TEXT": {
"type": "TEXT",
"index": 33,
"name": "MY_STRING__TEXT",
"comment": null
},
"MY_VARCHAR__TEXT": {
"type": "TEXT",
"index": 34,
"name": "MY_VARCHAR__TEXT",
"comment": null
},
"MY_NVARCHAR__TEXT": {
"type": "TEXT",
"index": 35,
"name": "MY_NVARCHAR__TEXT",
"comment": null
},
"MY_NVARCHAR2__TEXT": {
"type": "TEXT",
"index": 36,
"name": "MY_NVARCHAR2__TEXT",
"comment": null
},
"MY_CHAR_VARYING__TEXT": {
"type": "TEXT",
"index": 37,
"name": "MY_CHAR_VARYING__TEXT",
"comment": null
},
"MY_NCHAR_VARYING__TEXT": {
"type": "TEXT",
"index": 38,
"name": "MY_NCHAR_VARYING__TEXT",
"comment": null
},
"MY_VARBINARY__BINARY": {
"type": "BINARY",
"index": 39,
"name": "MY_VARBINARY__BINARY",
"comment": null
},
"MY_DATETIME__TIMESTAMP_NTZ": {
"type": "TIMESTAMP_NTZ",
"index": 40,
"name": "MY_DATETIME__TIMESTAMP_NTZ",
"comment": null
},
"MY_TIMESTAMP__TIMESTAMP_X": {
"type": "TIMESTAMP_NTZ",
"index": 41,
"name": "MY_TIMESTAMP__TIMESTAMP_X",
"comment": null
}
},
"stats": {
"has_stats": {
"id": "has_stats",
"label": "Has Stats?",
"value": false,
"include": false,
"description": "Indicates whether there are statistics for this table"
}
},
"unique_id": "model.my_project.data_types"
}
},
"sources": {},
"errors": null
}

values

{
"node": "data_types",
"show": [
{
"MY_NUMBER": null,
"MY_FLOAT": null,
"MY_TEXT": null,
"MY_BINARY": null,
"MY_BOOLEAN": null,
"MY_DATE": null,
"MY_TIME": null,
"MY_TIMESTAMP_LTZ": null,
"MY_TIMESTAMP_NTZ": null,
"MY_TIMESTAMP_TZ": null,
"MY_VARIANT": null,
"MY_OBJECT": null,
"MY_ARRAY": null,
"MY_GEOGRAPHY": null,
"MY_GEOMETRY": null,
"MY_DECIMAL__NUMBER": null,
"MY_DEC__NUMBER": null,
"MY_NUMERIC__NUMBER": null,
"MY_INT__NUMBER": null,
"MY_INTEGER__NUMBER": null,
"MY_BIGINT__NUMBER": null,
"MY_SMALLINT__NUMBER": null,
"MY_TINYINT__NUMBER": null,
"MY_BYTEINT__NUMBER": null,
"MY_FLOAT4__FLOAT": null,
"MY_FLOAT8__FLOAT": null,
"MY_DOUBLE__FLOAT": null,
"MY_DOUBLE_PRECISION__FLOAT": null,
"MY_REAL__FLOAT": null,
"MY_CHAR__TEXT": null,
"MY_CHARACTER__TEXT": null,
"MY_NCHAR__TEXT": null,
"MY_STRING__TEXT": null,
"MY_VARCHAR__TEXT": null,
"MY_NVARCHAR__TEXT": null,
"MY_NVARCHAR2__TEXT": null,
"MY_CHAR_VARYING__TEXT": null,
"MY_NCHAR_VARYING__TEXT": null,
"MY_VARBINARY__BINARY": null,
"MY_DATETIME__TIMESTAMP_NTZ": null,
"MY_TIMESTAMP__TIMESTAMP_X": null
}
]
}
@dbeatty10 I noticed we're casting these two specific data types differently in this recent PR. Should we apply this more broadly?
You have eagle eyes @mikealfare! 🦅 Added this comment in that PR to tie these things together -- they are very much related!

Short answer

Yes, we should apply this more broadly.

Longer answer

To apply this more broadly, we should:
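For illustration only (not something proposed in the original comment), here is a hedged sketch of what one such per-adapter override could look like for BigQuery, whose JSON type likewise needs a dedicated constructor rather than a plain cast (note the to_json(null) line in the bigquery query below); the macro body simply mirrors the Snowflake workaround above:

{% macro bigquery__cast(expression, data_type) -%}
  {#-- Hypothetical sketch: a null JSON value is produced with to_json() rather than cast() --#}
  {%- if expression.strip().lower() == "null" and data_type.strip().lower() == "json" -%}
    to_json(null)
  {%- else -%}
    cast({{ expression }} as {{ data_type }})
  {%- endif -%}
{%- endmacro %}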
Here are example queries for a handful of databases:

snowflake

select
cast(null as NUMBER) as my_NUMBER,
cast(null as FLOAT) as my_FLOAT,
cast(null as TEXT) as my_TEXT,
cast(null as BINARY) as my_BINARY,
cast(null as BOOLEAN) as my_BOOLEAN,
cast(null as DATE) as my_DATE,
cast(null as TIME) as my_TIME,
cast(null as TIMESTAMP_LTZ) as my_TIMESTAMP_LTZ,
cast(null as TIMESTAMP_NTZ) as my_TIMESTAMP_NTZ,
cast(null as TIMESTAMP_TZ) as my_TIMESTAMP_TZ,
cast(null as VARIANT) as my_VARIANT,
cast(null as OBJECT) as my_OBJECT,
cast(null as ARRAY) as my_ARRAY,
TO_GEOGRAPHY(null) as my_GEOGRAPHY,
TO_GEOMETRY(null) as my_GEOMETRY,
-- aliases
cast(null as DECIMAL) as my_DECIMAL__NUMBER,
cast(null as DEC) as my_DEC__NUMBER,
cast(null as NUMERIC) as my_NUMERIC__NUMBER,
cast(null as INT) as my_INT__NUMBER,
cast(null as INTEGER) as my_INTEGER__NUMBER,
cast(null as BIGINT) as my_BIGINT__NUMBER,
cast(null as SMALLINT) as my_SMALLINT__NUMBER,
cast(null as TINYINT) as my_TINYINT__NUMBER,
cast(null as BYTEINT) as my_BYTEINT__NUMBER,
cast(null as FLOAT4) as my_FLOAT4__FLOAT,
cast(null as FLOAT8) as my_FLOAT8__FLOAT,
cast(null as DOUBLE) as my_DOUBLE__FLOAT,
cast(null as DOUBLE PRECISION) as my_DOUBLE_PRECISION__FLOAT,
cast(null as REAL) as my_REAL__FLOAT,
cast(null as CHAR) as my_CHAR__TEXT,
cast(null as CHARACTER) as my_CHARACTER__TEXT,
cast(null as NCHAR) as my_NCHAR__TEXT,
cast(null as STRING) as my_STRING__TEXT,
cast(null as VARCHAR) as my_VARCHAR__TEXT,
cast(null as NVARCHAR) as my_NVARCHAR__TEXT,
cast(null as NVARCHAR2) as my_NVARCHAR2__TEXT,
cast(null as CHAR VARYING) as my_CHAR_VARYING__TEXT,
cast(null as NCHAR VARYING) as my_NCHAR_VARYING__TEXT,
cast(null as VARBINARY) as my_VARBINARY__BINARY,
cast(null as DATETIME) as my_DATETIME__TIMESTAMP_NTZ,
cast(null as TIMESTAMP) as my_TIMESTAMP__TIMESTAMP_X

bigquery

select
cast(null as TIME) as my_TIME,
cast(null as DATE) as my_DATE,
cast(null as DATETIME) as my_DATETIME,
cast(null as TIMESTAMP) as my_TIMESTAMP,
cast(null as INTERVAL) as my_INTERVAL,
cast(null as FLOAT64) as my_FLOAT64,
cast(null as INT64) as my_INT64,
cast(null as NUMERIC) as my_NUMERIC,
cast(null as BIGNUMERIC) as my_BIGNUMERIC,
cast(null as BOOL) as my_BOOL,
cast(null as STRING) as my_STRING,
cast(null as BYTES) as my_BYTES,
cast(null as ARRAY<INT64>) as my_ARRAY,
to_json(null) as my_JSON,
cast(null as STRUCT<x STRING, y INT64>) as my_STRUCT,
cast(null as GEOGRAPHY) as my_GEOGRAPHY,
-- aliases
cast(null as INT) as my_INT__INT64,
cast(null as SMALLINT) as my_SMALLINT__INT64,
cast(null as INTEGER) as my_INTEGER__INT64,
cast(null as BIGINT) as my_BIGINT__INT64,
cast(null as TINYINT) as my_TINYINT__INT64,
cast(null as BYTEINT) as my_BYTEINT__INT64,
cast(null as DECIMAL) as my_DECIMAL__NUMERIC,
cast(null as BIGDECIMAL) as my_BIGDECIMAL__BIGNUMERIC

redshift

select
cast(null as SMALLINT) as my_SMALLINT,
cast(null as INTEGER) as my_INTEGER,
cast(null as BIGINT) as my_BIGINT,
cast(null as NUMERIC) as my_NUMERIC,
cast(null as REAL) as my_REAL,
cast(null as DOUBLE PRECISION) as my_DOUBLE_PRECISION,
cast(null as BOOLEAN) as my_BOOLEAN,
cast(null as CHARACTER) as my_CHARACTER,
cast(null as CHARACTER VARYING) as my_CHARACTER_VARYING,
cast(null as DATE) as my_DATE,
cast(null as TIMESTAMP WITHOUT TIME ZONE) as my_TIMESTAMP_WITHOUT_TIME_ZONE,
cast(null as TIMESTAMP WITH TIME ZONE) as my_TIMESTAMP_WITH_TIME_ZONE,
cast(null as TIME WITHOUT TIME ZONE) as my_TIME_WITHOUT_TIME_ZONE,
cast(null as TIME WITH TIME ZONE) as my_TIME_WITH_TIME_ZONE,
cast(null as BINARY VARYING) as my_BINARY_VARYING,
cast(null as GEOMETRY) as my_GEOMETRY,
cast(null as GEOGRAPHY) as my_GEOGRAPHY,
cast(null as HLLSKETCH) as my_HLLSKETCH,
cast(null as SUPER) as my_SUPER,
-- aliases
cast(null as INT2) as my_INT2__SMALLINT,
cast(null as INT) as my_INT__INTEGER,
cast(null as INT4) as my_INT4__INTEGER,
cast(null as INT8) as my_INT8__BIGINT,
cast(null as DECIMAL) as my_DECIMAL__NUMERIC,
cast(null as FLOAT4) as my_FLOAT4__REAL,
cast(null as FLOAT8) as my_FLOAT8__DOUBLE_PRECISION,
cast(null as FLOAT) as my_FLOAT__DOUBLE_PRECISION,
cast(null as BOOL) as my_BOOL__BOOLEAN,
cast(null as CHAR) as my_CHAR__CHARACTER,
cast(null as NCHAR) as my_NCHAR__CHARACTER,
cast(null as BPCHAR) as my_BPCHAR__CHARACTER,
cast(null as VARCHAR) as my_VARCHAR__CHARACTER_VARYING,
cast(null as NVARCHAR) as my_NVARCHAR__CHARACTER_VARYING,
cast(null as TEXT) as my_TEXT__CHARACTER_VARYING,
cast(null as TIMESTAMP) as my_TIMESTAMP__TIMESTAMP_WITHOUT_TIME_ZONE,
cast(null as TIMESTAMPTZ) as my_TIMESTAMPTZ__TIMESTAMP_WITH_TIME_ZONE,
cast(null as TIME) as my_TIME__TIME_WITHOUT_TIME_ZONE,
cast(null as TIMETZ) as my_TIMETZ__TIME_WITH_TIME_ZONE,
cast(null as VARBINARY) as my_VARBINARY__BINARY_VARYING,
cast(null as VARBYTE) as my_VARBYTE__BINARY_VARYING

postgres

-- excluding the following since they are autoincrementing data types:
-- smallserial, serial, bigserial
-- aliases: serial2, serial4, serial8
select
cast(null as smallint) as my_smallint,
cast(null as integer) as my_integer,
cast(null as bigint) as my_bigint,
cast(null as numeric) as my_numeric,
cast(null as real) as my_real,
cast(null as double precision) as my_double_precision,
-- cast(null as smallserial) as my_smallserial,
-- cast(null as serial) as my_serial,
-- cast(null as bigserial) as my_bigserial,
cast(null as date) as my_date,
cast(null as timestamp without time zone) as my_timestamp_without_time_zone,
cast(null as timestamp with time zone) as my_timestamp_with_time_zone,
cast(null as time without time zone) as my_time_without_time_zone,
cast(null as time with time zone) as my_time_with_time_zone,
cast(null as interval) as my_interval,
cast(null as boolean) as my_boolean,
cast(null as bytea) as my_bytea,
cast(null as character varying(100)) as my_character_varying,
cast(null as character(100)) as my_character,
cast(null as text) as my_text,
cast(null as bpchar) as my_bpchar__character,
-- aliases
cast(null as int2) as my_int2__smallint,
cast(null as int) as my_int__integer,
cast(null as int4) as my_int4__integer,
cast(null as int8) as my_int8__bigint,
cast(null as decimal) as my_decimal__numeric,
cast(null as float4) as my_float4__real,
cast(null as float8) as my_float8__double_precision,
-- cast(null as serial2) as my_serial2__smallserial,
-- cast(null as serial4) as my_serial4__serial,
-- cast(null as serial8) as my_serial8__bigserial,
cast(null as timestamp) as my_timestamp__timestamp_without_time_zone,
cast(null as timestamptz) as my_timestamptz__timestamp_with_time_zone,
cast(null as time) as my_time__time_without_time_zone,
cast(null as timetz) as my_timetz__time_with_time_zone,
cast(null as bool) as my_bool__boolean,
cast(null as varchar(100)) as my_varchar__character_varying,
cast(null as char(100)) as my_char__character

databricks

select
cast(null as BINARY) as my_BINARY,
cast(null as BOOLEAN) as my_BOOLEAN,
cast(null as STRING) as my_STRING,
cast(null as INTERVAL) as my_INTERVAL,
cast(null as DATE) as my_DATE,
cast(null as TIMESTAMP) as my_TIMESTAMP,
cast(null as BYTE) as my_BYTE,
cast(null as SHORT) as my_SHORT,
cast(null as INTEGER) as my_INTEGER,
cast(null as BIGINT) as my_BIGINT,
cast(null as DECIMAL) as my_DECIMAL,
cast(null as FLOAT) as my_FLOAT,
cast(null as DOUBLE) as my_DOUBLE,
cast(null as ARRAY<INT>) as my_ARRAY,
cast(null as MAP<STRING, INT>) as my_MAP,
cast(null as STRUCT<name: STRING, age: INT>) as my_STRUCT,
cast(null as VOID) as my_VOID,
-- aliases
-- These are NOT documented, so they need to be discovered one-by-one experimentally
cast(null as INT) as my_INT__INTEGER,
cast(null as SMALLINT) as my_SMALLINT__SHORT,
cast(null as TINYINT) as my_TINYINT__BYTE,
cast(null as DEC) as my_DEC__NUMBER,
cast(null as NUMERIC) as my_NUMERIC__NUMBER,
cast(null as REAL) as my_REAL__FLOAT

spark

select
cast(null as BINARY) as my_BINARY,
cast(null as BOOLEAN) as my_BOOLEAN,
cast(null as STRING) as my_STRING,
cast(null as INTERVAL) as my_INTERVAL,
cast(null as DATE) as my_DATE,
cast(null as TIMESTAMP) as my_TIMESTAMP,
cast(null as BYTE) as my_BYTE,
cast(null as SHORT) as my_SHORT,
cast(null as INTEGER) as my_INTEGER,
cast(null as LONG) as my_LONG,
cast(null as DECIMAL) as my_DECIMAL,
cast(null as FLOAT) as my_FLOAT,
cast(null as DOUBLE) as my_DOUBLE,
cast(null as ARRAY<INT>) as my_ARRAY,
cast(null as MAP<STRING, INT>) as my_MAP,
cast(null as STRUCT<name: STRING, age: INT>) as my_STRUCT,
cast(null as VOID) as my_VOID,
-- aliases
-- These are NOT documented, so they need to be discovered one-by-one experimentally
cast(null as INT) as my_INT__INTEGER,
cast(null as SMALLINT) as my_SMALLINT__SHORT,
cast(null as TINYINT) as my_TINYINT__BYTE,
cast(null as DEC) as my_DEC__DECIMAL,
cast(null as NUMERIC) as my_NUMERIC__DECIMAL,
cast(null as REAL) as my_REAL__FLOAT
Is this a new bug in dbt-snowflake?
Current Behavior
When trying to run a model with an enforced contract and a column of data_type: GEOGRAPHY, the following error occurs:

Expected Behavior
I expected this to work with no errors.
Steps To Reproduce
models/mymodel.yml:

models/mymodel.sql:
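As a hypothetical minimal pair in the same shape (the model name, column name, and WKT literal are assumptions, not necessarily the reporter's originals):

# models/mymodel.yml (hypothetical)
version: 2
models:
  - name: mymodel
    config:
      contract:
        enforced: true
    columns:
      - name: geo
        data_type: GEOGRAPHY

-- models/mymodel.sql (hypothetical)
select to_geography('POINT(0 0)') as geo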
Output from dbt run --full-refresh --select mymodel:
Relevant log output
Compiled SQL found in the logs:
It is possible and valid to put null in a column of GEOGRAPHY type in Snowflake, but it doesn't seem like they support explicitly casting a null value to GEOGRAPHY. Unclear if there's an accepted workaround, or if this is a bad idea for some technical reason.
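For illustration (not from the original report), a minimal Snowflake sketch of the distinction being described; the table and column names are hypothetical:

-- storing a null in a GEOGRAPHY column works
create or replace temporary table geo_demo (geo geography);
insert into geo_demo (geo) values (null);

-- producing a typed null via the constructor works
select to_geography(null) as geo;

-- but an explicit cast of null to geography is what errors
select cast(null as geography) as geo;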
Environment

Additional Context
No response