From 3c11953bf2800ac8297da14b2394d49e3e11dd39 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Jan=20Cie=C5=9Blak?=
Date: Mon, 17 Jun 2024 10:33:44 +0200
Subject: [PATCH] feat: standard database v1 readiness (#2842)

Done in this PR:
- Addressed comments from the [previous PR](https://github.com/Snowflake-Labs/terraform-provider-snowflake/pull/2834)
- Added a new `standard_database` resource (also added examples and import, and mentioned it in the migration notes)
- Deprecated the old database resource
- Adjusted the `databases` data source to align with v1 requirements (added filters, missing values, and outputs from the DESCRIBE and SHOW PARAMETERS commands)
- Replaced `snowflake_database` with `snowflake_standard_database` in all of the examples

To be done:
- Add missing properties on all three new database types
- Make sure all of the issues were resolved with the new types of databases

## Test Plan
* [x] acceptance tests

## References
[CREATE DATABASE](https://docs.snowflake.com/en/sql-reference/sql/create-database)

## Update
Changes done:
- Added missing parameters to all the database types and moved them, and the operations on them, to a common place (only the metric_level parameter wasn't included, as it is a preview feature, there wasn't enough information about it, and it seemed like ORGADMIN or certain privileges were required to be able to test/use it).
- Switched to plain values instead of nested ones for parameters and adjusted customdiffs so that the state is always refreshed when expected.
- Every database type resolves database-connected issues (most, if not all, of them were already resolved by the latest versions of the provider for the `snowflake_database` resource).
- Refresh for the secondary database was not added, as the replication guidelines recommend creating a task that refreshes the replica at a certain interval. To aid our users, I created an example showing how to create a task that runs the refresh every 10 minutes.
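Such a refresh task could look roughly like this (a sketch only — the database, schema, warehouse, and secondary database names are placeholders, and the exact example shipped with this PR may differ):

```terraform
# Runs ALTER DATABASE ... REFRESH on the secondary database every 10 minutes.
resource "snowflake_task" "refresh_secondary" {
  database      = "TASKS_DB"   # placeholder: database holding the task
  schema        = "PUBLIC"     # placeholder: schema holding the task
  name          = "REFRESH_SECONDARY_DB"
  warehouse     = "COMPUTE_WH" # placeholder: warehouse running the task
  schedule      = "10 MINUTE"
  sql_statement = "ALTER DATABASE my_secondary_db REFRESH"
  enabled       = true
}
```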
An easy upgrade (if we would like to) would be to add a toggle that calls refresh on every read operation. The toggle could be turned on by default, with the option to turn it off and refresh it in the task "manually".
- State upgrader for snowflake_database (because we chose to rename the old one to have the _old suffix).
---
 MIGRATION_GUIDE.md | 102 +-
 docs/data-sources/databases.md | 125 +-
 docs/index.md | 1 +
 docs/resources/database.md | 122 +-
 docs/resources/database_old.md | 85 +
 docs/resources/failover_group.md | 2 +-
 docs/resources/secondary_database.md | 125 ++
 docs/resources/sequence.md | 6 +-
 docs/resources/shared_database.md | 107 ++
 docs/resources/tag_association.md | 16 +-
 examples/additional/deprecated_resources.MD | 1 +
 .../snowflake_databases/data-source.tf | 74 +-
 .../resources/snowflake_database/import.sh | 2 +-
 .../resources/snowflake_database/resource.tf | 71 +-
 .../snowflake_database_old/import.sh | 1 +
 .../snowflake_database_old/resource.tf | 30 +
 .../snowflake_failover_group/resource.tf | 2 +-
 .../snowflake_secondary_database/resource.tf | 64 +-
 .../resources/snowflake_sequence/resource.tf | 6 +-
 .../snowflake_shared_database/resource.tf | 45 +-
 .../snowflake_tag_association/resource.tf | 16 +-
 pkg/acceptance/asserts.go | 23 +
 pkg/acceptance/asserts_test.go | 73 +
 pkg/acceptance/check_destroy.go | 3 +
 pkg/acceptance/helpers/database_client.go | 73 +-
 pkg/acceptance/helpers/parameter_client.go | 31 +
 pkg/acceptance/snowflakechecks/database.go | 28 +
 .../testenvs/testing_environment_variables.go | 3 +
 pkg/datasources/databases.go | 227 ++-
 pkg/datasources/databases_acceptance_test.go | 211 ++-
 .../testdata/TestAcc_Databases/like/test.tf | 16 +
 .../TestAcc_Databases/like/variables.tf | 15 +
 .../testdata/TestAcc_Databases/limit/test.tf | 19 +
 .../TestAcc_Databases/limit/variables.tf | 19 +
 .../TestAcc_Databases/optionals_set/test.tf | 20 +
 .../optionals_set/variables.tf | 11 +
 .../TestAcc_Databases/optionals_unset/test.tf | 22 +
.../optionals_unset/variables.tf | 11 + .../TestAcc_Databases/starts_with/test.tf | 16 + .../starts_with/variables.tf | 15 + .../without_database/test.tf | 10 + pkg/internal/provider/docs/doc_helpers.go | 1 + pkg/provider/provider.go | 3 + pkg/provider/resources/resources.go | 1 + pkg/resources/custom_diffs.go | 68 +- pkg/resources/custom_diffs_test.go | 154 +- pkg/resources/database.go | 627 ++++---- pkg/resources/database_acceptance_test.go | 1365 ++++++++++++++--- pkg/resources/database_commons.go | 342 +++++ pkg/resources/database_old.go | 371 +++++ pkg/resources/database_old_acceptance_test.go | 450 ++++++ pkg/resources/database_state_upgraders.go | 29 + ...vileges_to_account_role_acceptance_test.go | 8 +- ...rant_privileges_to_role_acceptance_test.go | 8 +- pkg/resources/helpers.go | 78 +- pkg/resources/helpers_test.go | 68 + pkg/resources/secondary_database.go | 307 +--- .../secondary_database_acceptance_test.go | 427 ++++-- pkg/resources/shared_database.go | 201 +-- .../shared_database_acceptance_test.go | 222 +-- .../testdata/TestAcc_Database/basic/test.tf | 4 + .../TestAcc_Database/basic/variables.tf | 8 + .../testdata/TestAcc_Database/catalog/test.tf | 4 + .../TestAcc_Database/catalog/variables.tf | 8 + .../complete_optionals_set/test.tf | 31 + .../complete_optionals_set/variables.tf | 87 ++ .../int_parameter/set/test.tf | 4 + .../int_parameter/set/variables.tf | 8 + .../int_parameter/unset/test.tf | 3 + .../int_parameter/unset/variables.tf | 3 + .../TestAcc_Database/replication/test.tf | 12 + .../TestAcc_Database/replication/variables.tf | 15 + .../test.tf | 2 +- .../WithDataRetentionSet/test.tf | 2 +- .../WithoutDataRetentionSet/test.tf | 2 +- .../ImportedPrivileges/test.tf | 12 +- .../ImportedPrivileges/variables.tf | 6 +- .../ImportedPrivileges/test.tf | 12 +- .../ImportedPrivileges/variables.tf | 6 +- .../complete-optionals-set/test.tf | 34 +- .../complete-optionals-set/variables.tf | 30 +- .../complete-optionals-unset/test.tf | 13 +- 
.../complete-optionals-unset/variables.tf | 36 - .../TestAcc_SharedDatabase/complete/test.tf | 28 +- .../complete/variables.tf | 32 +- pkg/sdk/context_functions.go | 38 +- pkg/sdk/databases.go | 272 +++- pkg/sdk/databases_test.go | 83 +- pkg/sdk/parameters.go | 79 +- .../context_functions_integration_test.go | 18 + pkg/sdk/testint/databases_integration_test.go | 425 +++-- .../resources/secondary_database.md.tmpl | 33 + v1-preparations/ESSENTIAL_GA_OBJECTS.MD | 2 +- 93 files changed, 5898 insertions(+), 2033 deletions(-) create mode 100644 docs/resources/database_old.md create mode 100644 docs/resources/secondary_database.md create mode 100644 docs/resources/shared_database.md create mode 100644 examples/resources/snowflake_database_old/import.sh create mode 100644 examples/resources/snowflake_database_old/resource.tf create mode 100644 pkg/acceptance/asserts.go create mode 100644 pkg/acceptance/asserts_test.go create mode 100644 pkg/acceptance/snowflakechecks/database.go create mode 100644 pkg/datasources/testdata/TestAcc_Databases/like/test.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/like/variables.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/limit/test.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/limit/variables.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/optionals_set/test.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/optionals_set/variables.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/optionals_unset/test.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/optionals_unset/variables.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/starts_with/test.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/starts_with/variables.tf create mode 100644 pkg/datasources/testdata/TestAcc_Databases/without_database/test.tf create mode 100644 pkg/resources/database_commons.go create mode 100644 
pkg/resources/database_old.go create mode 100644 pkg/resources/database_old_acceptance_test.go create mode 100644 pkg/resources/database_state_upgraders.go create mode 100644 pkg/resources/testdata/TestAcc_Database/basic/test.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/basic/variables.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/catalog/test.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/catalog/variables.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/complete_optionals_set/test.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/complete_optionals_set/variables.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/int_parameter/set/test.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/int_parameter/set/variables.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/int_parameter/unset/test.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/int_parameter/unset/variables.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/replication/test.tf create mode 100644 pkg/resources/testdata/TestAcc_Database/replication/variables.tf create mode 100644 templates/resources/secondary_database.md.tmpl diff --git a/MIGRATION_GUIDE.md b/MIGRATION_GUIDE.md index 1f2972b2d2..0326d84de8 100644 --- a/MIGRATION_GUIDE.md +++ b/MIGRATION_GUIDE.md @@ -60,6 +60,106 @@ To easily handle three-value logic (true, false, unknown) in provider's configs, #### *(note)* `resource_monitor` validation and diff suppression `resource_monitor` is an identifier and handling logic may be still slightly changed as part of https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#identifiers-rework. It should be handled automatically (without needed manual actions on user side), though, but it is not guaranteed. 
+### new database resources
+As part of the [preparation for v1](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1), we split up the database resource into multiple ones:
+- Standard database - can be used as `snowflake_database` (replaces the old one and is used to create databases, with the optional ability to become a primary database ready for replication)
+- Shared database - can be used as `snowflake_shared_database` (used to create databases from externally defined shares)
+- Secondary database - can be used as `snowflake_secondary_database` (used to create replicas of databases from external sources)
+
+All the field changes in comparison to the previous database resource are:
+- `is_transient`
+  - in `snowflake_shared_database`
+    - removed: the field is removed from `snowflake_shared_database` as it doesn't have any effect on shared databases.
+- `from_database` - database cloning was entirely removed and is not possible with any of the new database resources.
+- `from_share` - the parameter was moved to the dedicated resource for databases created from shares, `snowflake_shared_database`. Right now, it's a text field instead of a map. Additionally, instead of the legacy account identifier format, we're expecting the new one, which together with the share looks like this: `..`. For more information on account identifiers, visit the [official documentation](https://docs.snowflake.com/en/user-guide/admin-account-identifier).
+- `from_replica` - the parameter was moved to the dedicated resource for databases created from primary databases, `snowflake_secondary_database`.
+- `replication_configuration` - renamed: was renamed to `replication` and is only available in `snowflake_database`. Its internal schema changed: instead of a list of accounts, we now expect a list of nested objects with accounts for which replication (and optionally failover) should be enabled.
More information about converting between both versions can be found [here](#resource-renamed-snowflake_database---snowflake_database_old). Additionally, instead of the legacy account identifier format, we're expecting the new one, which looks like this: `.`. For more information on account identifiers, visit the [official documentation](https://docs.snowflake.com/en/user-guide/admin-account-identifier).
+- `data_retention_time_in_days`
+  - in `snowflake_shared_database`
+    - removed: the field is removed from `snowflake_shared_database` as it doesn't have any effect on shared databases.
+  - in `snowflake_database` and `snowflake_secondary_database`
+    - adjusted: it now uses a different approach that doesn't set it to -1 as a default value, but rather fills the field with the current value from Snowflake (this can still change).
+- added: The following set of [parameters](https://docs.snowflake.com/en/sql-reference/parameters) was added to every database type:
+  - `max_data_extension_time_in_days`
+  - `external_volume`
+  - `catalog`
+  - `replace_invalid_characters`
+  - `default_ddl_collation`
+  - `storage_serialization_policy`
+  - `log_level`
+  - `trace_level`
+  - `suspend_task_after_num_failures`
+  - `task_auto_retry_attempts`
+  - `user_task_managed_initial_warehouse_size`
+  - `user_task_timeout_ms`
+  - `user_task_minimum_trigger_interval_in_seconds`
+  - `quoted_identifiers_ignore_case`
+  - `enable_console_output`
+
+The split was done (and will be done for several objects during the refactor) to simplify the resources on the maintainability and usage level.
+Its purpose was also to divide the resources by their specific purpose rather than cramming every use case of an object into one resource.
+
+### Resource renamed snowflake_database -> snowflake_database_old
+We decided to reuse the existing `snowflake_database` resource and redesign it into a standard database.
+The previous `snowflake_database` was renamed to `snowflake_database_old`, and the current `snowflake_database`
+contains a completely new implementation that follows the guidelines we set for V1.
+When upgrading to the 0.93.0 version, the automatic state upgrader should cover the migration for databases that didn't have the following fields set:
+- `from_share` (now, the new `snowflake_shared_database` should be used instead)
+- `from_replica` (now, the new `snowflake_secondary_database` should be used instead)
+- `replication_configuration`
+
+For configurations containing `replication_configuration` like this one:
+```terraform
+resource "snowflake_database" "test" {
+  name = ""
+  replication_configuration {
+    accounts             = ["", ""]
+    ignore_edition_check = true
+  }
+}
+```
+
+You have to transform the configuration into the following format (notice the change from the account locator into the new account identifier format):
+```terraform
+resource "snowflake_database" "test" {
+  name = ""
+  replication {
+    enable_to_account {
+      account_identifier = "."
+      with_failover      = false
+    }
+    enable_to_account {
+      account_identifier = "."
+      with_failover      = false
+    }
+    ignore_edition_check = true
+  }
+}
+```
+
+If you had `from_database` set, it should migrate automatically.
+For now, we're dropping the possibility to create a database as a clone of another database.
+The only way will be to clone a database manually and import it as `snowflake_database`, but if
+cloned databases diverge in behavior from standard databases, it may cause issues.
+
+For databases with one of the fields mentioned above, manual migration will be needed.
+Please refer to our [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md) to perform a zero downtime migration.
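The manual clone-and-import path mentioned above could look like this (a sketch with hypothetical object names; the clone itself is created in Snowflake, not by Terraform):

```shell
# 1. Clone the database manually in Snowflake (e.g. in a worksheet):
#      CREATE DATABASE cloned_db CLONE source_db;
# 2. Add a matching `resource "snowflake_database" "cloned"` block to your configuration.
# 3. Import the clone into Terraform state as a standard database:
terraform import snowflake_database.cloned 'cloned_db'
```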
+
+If you would like to upgrade to the latest version but postpone the migration, you still have to perform the manual migration
+to the `snowflake_database_old` resource by following the [zero downtime migrations document](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md).
+The only difference would be that instead of writing/generating new configurations, you just have to rename the existing ones to contain the `_old` suffix.
+
+### *(behavior change)* snowflake_databases datasource
+- `terse` and `history` fields were removed.
+- `replication_configuration` field was removed from `databases`.
+- `pattern` was replaced by the `like` field.
+- Additional filtering options were added (`limit`).
+- Added missing fields returned by SHOW DATABASES.
+- Added outputs from **DESC DATABASE** and **SHOW PARAMETERS IN DATABASE** (they can be turned off by declaring `with_describe = false` and `with_parameters = false`; **they're turned on by default**).
+The additional parameters call **DESC DATABASE** (with `with_describe` turned on) and **SHOW PARAMETERS IN DATABASE** (with `with_parameters` turned on) **per database** returned by **SHOW DATABASES**.
+It's important to limit the records and calls to Snowflake to a minimum. That's why we recommend assessing which information you need from the data source and then providing strong filters and turning off the additional fields for better plan performance.
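A minimal sketch of such a constrained query (the filter pattern is illustrative):

```terraform
data "snowflake_databases" "filtered" {
  like            = "analytics_%" # strong filter: only databases matching the pattern
  with_describe   = false         # skip the per-database DESC DATABASE call
  with_parameters = false         # skip the per-database SHOW PARAMETERS call
}
```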
+
 ## v0.89.0 ➞ v0.90.0
 ### snowflake_table resource changes
 #### *(behavior change)* Validation to column type added
@@ -79,7 +179,7 @@ resource "snowflake_tag_masking_policy_association" "name" {
   masking_policy_id = snowflake_masking_policy.example_masking_policy.id
 }
 ```
-
+
 After
 ```terraform
 resource "snowflake_tag_masking_policy_association" "name" {
diff --git a/docs/data-sources/databases.md b/docs/data-sources/databases.md
index bc71a5832c..47097a56a0 100644
--- a/docs/data-sources/databases.md
+++ b/docs/data-sources/databases.md
@@ -12,7 +12,79 @@ description: |-
 ## Example Usage
 ```terraform
-data "snowflake_databases" "this" {}
+# Simple usage
+data "snowflake_databases" "simple" {
+}
+
+output "simple_output" {
+  value = data.snowflake_databases.simple.databases
+}
+
+# Filtering (like)
+data "snowflake_databases" "like" {
+  like = "database-name"
+}
+
+output "like_output" {
+  value = data.snowflake_databases.like.databases
+}
+
+# Filtering (starts_with)
+data "snowflake_databases" "starts_with" {
+  starts_with = "database-"
+}
+
+output "starts_with_output" {
+  value = data.snowflake_databases.starts_with.databases
+}
+
+# Filtering (limit)
+data "snowflake_databases" "limit" {
+  limit {
+    rows = 10
+    from = "database-"
+  }
+}
+
+output "limit_output" {
+  value = data.snowflake_databases.limit.databases
+}
+
+# Without additional data (to limit the number of calls made for every found database)
+data "snowflake_databases" "only_show" {
+  # with_describe is turned on by default and it calls DESCRIBE DATABASE for every database found and attaches its output to databases.*.description field
+  with_describe = false
+
+  # with_parameters is turned on by default and it calls SHOW PARAMETERS FOR DATABASE for every database found and attaches its output to databases.*.parameters field
+  with_parameters = false
+}
+
+output "only_show_output" {
+  value = data.snowflake_databases.only_show.databases
+}
+
+# Ensure the result contains at least
one element (with the use of a postcondition)
+data "snowflake_databases" "assert_with_postcondition" {
+  starts_with = "database-name"
+  lifecycle {
+    postcondition {
+      condition     = length(self.databases) > 0
+      error_message = "there should be at least one database"
+    }
+  }
+}
+
+# Ensure exactly one database is returned (with the use of a check block)
+check "database_check" {
+  data "snowflake_databases" "assert_with_check_block" {
+    like = "database-name"
+  }
+
+  assert {
+    condition     = length(data.snowflake_databases.assert_with_check_block.databases) == 1
+    error_message = "Databases filtered by '${data.snowflake_databases.assert_with_check_block.like}' returned ${length(data.snowflake_databases.assert_with_check_block.databases)} databases where one was expected"
+  }
+}
```
@@ -20,16 +92,29 @@ data "snowflake_databases" "this" {}
### Optional
-- `history` (Boolean) Optionally includes dropped databases that have not yet been purged The output also includes an additional `dropped_on` column
-- `pattern` (String) Optionally filters the databases by a pattern
-- `starts_with` (String) Optionally filters the databases by a pattern
-- `terse` (Boolean) Optionally returns only the columns `created_on` and `name` in the results
+- `like` (String) Filters the output with a **case-insensitive** pattern, with support for SQL wildcard characters (`%` and `_`).
+- `limit` (Block List, Max: 1) Limits the number of rows returned. If `limit.from` is set, then the limit will start from the first element matched by the expression. The expression is only used to match the first element; later elements are not matched by the prefix, but you can enforce a certain pattern with `starts_with` or `like`. (see [below for nested schema](#nestedblock--limit))
+- `starts_with` (String) Filters the output with **case-sensitive** characters indicating the beginning of the object name.
+- `with_describe` (Boolean) Runs DESC DATABASE for each database returned by SHOW DATABASES.
The output of describe is saved to the description field. By default this value is set to true.
+- `with_parameters` (Boolean) Runs SHOW PARAMETERS FOR DATABASE for each database returned by SHOW DATABASES. The output of the command is saved to the parameters field as a map. By default this value is set to true.
### Read-Only
-- `databases` (List of Object) Snowflake databases (see [below for nested schema](#nestedatt--databases))
+- `databases` (List of Object) Holds the output of SHOW DATABASES. (see [below for nested schema](#nestedatt--databases))
- `id` (String) The ID of this resource.
+
+### Nested Schema for `limit`
+
+Required:
+
+- `rows` (Number) The maximum number of rows to return.
+
+Optional:
+
+- `from` (String) Specifies a **case-sensitive** pattern that is used to match the object name. After the first match, the limit on the number of rows will be applied.
+
+
### Nested Schema for `databases`
@@ -37,19 +122,37 @@ Read-Only:
- `comment` (String)
- `created_on` (String)
+- `description` (List of Object) (see [below for nested schema](#nestedobjatt--databases--description))
- `is_current` (Boolean)
- `is_default` (Boolean)
+- `is_transient` (Boolean)
+- `kind` (String)
- `name` (String)
- `options` (String)
- `origin` (String)
- `owner` (String)
-- `replication_configuration` (List of Object) (see [below for nested schema](#nestedobjatt--databases--replication_configuration))
+- `owner_role_type` (String)
+- `parameters` (List of Object) (see [below for nested schema](#nestedobjatt--databases--parameters))
+- `resource_group` (String)
- `retention_time` (Number)
-
-### Nested Schema for `databases.replication_configuration`
+
+### Nested Schema for `databases.description`
+
+Read-Only:
+
+- `created_on` (String)
+- `kind` (String)
+- `name` (String)
+
+
+### Nested Schema for `databases.parameters`
Read-Only:
-- `accounts` (List of String)
-- `ignore_edition_check` (Boolean)
+- `default` (String)
+- `description` (String)
+- `key` (String)
+- `level`
(String) +- `value` (String) diff --git a/docs/index.md b/docs/index.md index 6c09a7fe1f..6ed5bb38de 100644 --- a/docs/index.md +++ b/docs/index.md @@ -231,6 +231,7 @@ The Snowflake provider will use the following order of precedence when determini - [snowflake_account_grant](./docs/resources/account_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead - [snowflake_database_grant](./docs/resources/database_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead +- [snowflake_database_old](./docs/resources/database_old) - [snowflake_external_table_grant](./docs/resources/external_table_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead - [snowflake_failover_group_grant](./docs/resources/failover_group_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead - [snowflake_file_format_grant](./docs/resources/file_format_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead diff --git a/docs/resources/database.md b/docs/resources/database.md index 29edebb8e9..d9b2a07cc2 100644 --- a/docs/resources/database.md +++ b/docs/resources/database.md @@ -2,44 +2,75 @@ page_title: "snowflake_database Resource - terraform-provider-snowflake" subcategory: "" description: |- - + Represents a standard database. If replication configuration is specified, the database is promoted to serve as a primary database for replication. --- # snowflake_database (Resource) - +Represents a standard database. If replication configuration is specified, the database is promoted to serve as a primary database for replication. 
## Example Usage
```terraform
-resource "snowflake_database" "simple" {
-  name                        = "testing"
-  comment                     = "test comment"
-  data_retention_time_in_days = 3
+## Minimal
+resource "snowflake_database" "primary" {
+  name = "database_name"
}
-resource "snowflake_database" "with_replication" {
-  name    = "testing_2"
-  comment = "test comment 2"
-  replication_configuration {
-    accounts = ["test_account1", "test_account_2"]
+## Complete (with every optional set)
+resource "snowflake_database" "primary" {
+  name         = "database_name"
+  is_transient = false
+  comment      = "my standard database"
+
+  data_retention_time_in_days                   = 10
+  max_data_extension_time_in_days               = 20
+  external_volume                               = ""
+  catalog                                       = ""
+  replace_invalid_characters                    = false
+  default_ddl_collation                         = "en_US"
+  storage_serialization_policy                  = "COMPATIBLE"
+  log_level                                     = "INFO"
+  trace_level                                   = "ALWAYS"
+  suspend_task_after_num_failures               = 10
+  task_auto_retry_attempts                      = 10
+  user_task_managed_initial_warehouse_size      = "LARGE"
+  user_task_timeout_ms                          = 3600000
+  user_task_minimum_trigger_interval_in_seconds = 120
+  quoted_identifiers_ignore_case                = false
+  enable_console_output                         = false
+
+  replication {
+    enable_to_account {
+      account_identifier = "."
+      with_failover      = true
+    }
+    ignore_edition_check = true
+  }
+}
-resource "snowflake_database" "from_replica" {
-  name                        = "testing_3"
-  comment                     = "test comment"
-  data_retention_time_in_days = 3
-  from_replica                = "\"org1\".\"account1\".\"primary_db_name\""
+## Replication with for_each
+locals {
+  replication_configs = [
+    {
+      account_identifier = "."
+      with_failover      = true
+    },
+    {
+      account_identifier = "."
+      with_failover      = true
+    },
+  ]
}
-resource "snowflake_database" "from_share" {
-  name    = "testing_4"
-  comment = "test comment"
-  from_share = {
-    provider = "account1_locator"
-    share    = "share1"
+resource "snowflake_database" "primary" {
+  name = "database_name"
+
+  replication {
+    dynamic "enable_to_account" {
+      for_each = local.replication_configs
+      content {
+        account_identifier = enable_to_account.value.account_identifier
+        with_failover      = enable_to_account.value.with_failover
+      }
+    }
+    ignore_edition_check = true
  }
}
```
@@ -49,37 +80,60 @@ resource "snowflake_database" "from_share" {
### Required
-- `name` (String) Specifies the identifier for the database; must be unique for your account.
+- `name` (String) Specifies the identifier for the database; must be unique for your account. As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '..') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database.
### Optional
+- `catalog` (String) The database parameter that specifies the default catalog to use for Iceberg tables.
- `comment` (String) Specifies a comment for the database.
-- `data_retention_time_in_days` (Number) Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).
-- `from_database` (String) Specify a database to create a clone from.
-- `from_replica` (String) Specify a fully-qualified path to a database to create a replica from.
A fully qualified path follows the format of `""."".""`. An example would be: `"myorg1"."account1"."db1"` -- `from_share` (Map of String) Specify a provider and a share in this map to create a database from a share. As of version 0.87.0, the provider field is the account locator. -- `is_transient` (Boolean) Specifies a database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss. -- `replication_configuration` (Block List, Max: 1) When set, specifies the configurations for database replication. (see [below for nested schema](#nestedblock--replication_configuration)) +- `data_retention_time_in_days` (Number) Specifies the number of days for which Time Travel actions (CLONE and UNDROP) can be performed on the database, as well as specifying the default Time Travel retention time for all schemas created in the database. For more details, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel). +- `default_ddl_collation` (String) Specifies a default collation specification for all schemas and tables added to the database. It can be overridden on schema or table level. For more information, see [collation specification](https://docs.snowflake.com/en/sql-reference/collation#label-collation-specification). +- `enable_console_output` (Boolean) If true, enables stdout/stderr fast path logging for anonymous stored procedures. +- `external_volume` (String) The database parameter that specifies the default external volume to use for Iceberg tables. +- `is_transient` (Boolean) Specifies the database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss. 
+- `log_level` (String) Specifies the severity level of messages that should be ingested and made available in the active event table. Valid options are: [TRACE DEBUG INFO WARN ERROR FATAL OFF]. Messages at the specified level (and at more severe levels) are ingested. For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level). +- `max_data_extension_time_in_days` (Number) Object parameter that specifies the maximum number of days for which Snowflake can extend the data retention period for tables in the database to prevent streams on the tables from becoming stale. For a detailed description of this parameter, see [MAX_DATA_EXTENSION_TIME_IN_DAYS](https://docs.snowflake.com/en/sql-reference/parameters.html#label-max-data-extension-time-in-days). +- `quoted_identifiers_ignore_case` (Boolean) If true, the case of quoted identifiers is ignored. +- `replace_invalid_characters` (Boolean) Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table. You can only set this parameter for tables that use an external Iceberg catalog. +- `replication` (Block List, Max: 1) Configures replication for a given database. When specified, this database will be promoted to serve as a primary database for replication. A primary database can be replicated in one or more accounts, allowing users in those accounts to query objects in each secondary (i.e. replica) database. (see [below for nested schema](#nestedblock--replication)) +- `storage_serialization_policy` (String) The storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: [COMPATIBLE OPTIMIZED]. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake. 
+- `suspend_task_after_num_failures` (Number) How many times a task must fail in a row before it is automatically suspended. 0 disables auto-suspending. +- `task_auto_retry_attempts` (Number) Maximum automatic retries allowed for a user task. +- `trace_level` (String) Controls how trace events are ingested into the event table. Valid options are: [ALWAYS ON_EVENT OFF]. For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level). +- `user_task_managed_initial_warehouse_size` (String) The initial size of warehouse to use for managed warehouses in the absence of history. +- `user_task_minimum_trigger_interval_in_seconds` (Number) Minimum amount of time between Triggered Task executions in seconds. +- `user_task_timeout_ms` (Number) User task execution timeout in milliseconds. ### Read-Only - `id` (String) The ID of this resource. - -### Nested Schema for `replication_configuration` + +### Nested Schema for `replication` + +Required: + +- `enable_to_account` (Block List, Min: 1) Entry to enable replication and optionally failover for a given account identifier. (see [below for nested schema](#nestedblock--replication--enable_to_account)) + +Optional: + +- `ignore_edition_check` (Boolean) Allows replicating data to accounts on lower editions in either of the following scenarios: 1. The primary database is in a Business Critical (or higher) account but one or more of the accounts approved for replication are on lower editions. Business Critical Edition is intended for Snowflake accounts with extremely sensitive data. 2. The primary database is in a Business Critical (or higher) account and a signed business associate agreement is in place to store PHI data in the account per HIPAA and HITRUST regulations, but no such agreement is in place for one or more of the accounts approved for replication, regardless if they are Business Critical (or higher) accounts. 
Both scenarios are prohibited by default in an effort to help prevent account administrators for Business Critical (or higher) accounts from inadvertently replicating sensitive data to accounts on lower editions. + + +### Nested Schema for `replication.enable_to_account` Required: -- `accounts` (List of String) +- `account_identifier` (String) Specifies the account identifier for which replication should be enabled. The account identifiers should be in the form of `"".""`. Optional: -- `ignore_edition_check` (Boolean) +- `with_failover` (Boolean) Specifies if failover should be enabled for the specified account identifier. ## Import Import is supported using the following syntax: ```shell -terraform import snowflake_database.example name +terraform import snowflake_database.example 'database_name' ``` diff --git a/docs/resources/database_old.md b/docs/resources/database_old.md new file mode 100644 index 0000000000..e6fdecabe8 --- /dev/null +++ b/docs/resources/database_old.md @@ -0,0 +1,85 @@ +--- +page_title: "snowflake_database_old Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + +--- + +# snowflake_database_old (Resource) + +~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use snowflake_database, snowflake_secondary_database, or snowflake_shared_database instead.
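+
+Existing state can be moved to one of the new resources. The following is a minimal, hypothetical migration sketch (it assumes a database named `testing` previously managed as `snowflake_database_old.simple`, and Terraform >= 1.5 for the `import` block; the old entry is first dropped from state with `terraform state rm snowflake_database_old.simple`):
+
+```terraform
+# Hypothetical migration sketch: adopt the existing database under the new resource type.
+import {
+  to = snowflake_database.simple
+  id = "testing" # the import ID is the database name
+}
+
+resource "snowflake_database" "simple" {
+  name    = "testing"
+  comment = "test comment"
+}
+```
+
+Running `terraform plan` after adding the block shows the database being imported rather than created; alternatively, the `terraform import` CLI command shown below can be used instead of the `import` block.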
+ +## Example Usage + +```terraform +resource "snowflake_database_old" "simple" { + name = "testing" + comment = "test comment" + data_retention_time_in_days = 3 +} + +resource "snowflake_database_old" "with_replication" { + name = "testing_2" + comment = "test comment 2" + replication_configuration { + accounts = ["test_account1", "test_account_2"] + ignore_edition_check = true + } +} + +resource "snowflake_database_old" "from_replica" { + name = "testing_3" + comment = "test comment" + data_retention_time_in_days = 3 + from_replica = "\"org1\".\"account1\".\"primary_db_name\"" +} + +resource "snowflake_database_old" "from_share" { + name = "testing_4" + comment = "test comment" + from_share = { + provider = "account1_locator" + share = "share1" + } +} +``` + + +## Schema + +### Required + +- `name` (String) Specifies the identifier for the database; must be unique for your account. + +### Optional + +- `comment` (String) Specifies a comment for the database. +- `data_retention_time_in_days` (Number) Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel). +- `from_database` (String) Specify a database to create a clone from. +- `from_replica` (String) Specify a fully-qualified path to a database to create a replica from. A fully qualified path follows the format of `""."".""`. An example would be: `"myorg1"."account1"."db1"` +- `from_share` (Map of String) Specify a provider and a share in this map to create a database from a share. As of version 0.87.0, the provider field is the account locator. +- `is_transient` (Boolean) Specifies a database as transient. 
Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss. +- `replication_configuration` (Block List, Max: 1) When set, specifies the configurations for database replication. (see [below for nested schema](#nestedblock--replication_configuration)) + +### Read-Only + +- `id` (String) The ID of this resource. + + +### Nested Schema for `replication_configuration` + +Required: + +- `accounts` (List of String) + +Optional: + +- `ignore_edition_check` (Boolean) + +## Import + +Import is supported using the following syntax: + +```shell +terraform import snowflake_database_old.example 'database_name' +``` diff --git a/docs/resources/failover_group.md b/docs/resources/failover_group.md index ee4fc6537b..f166661477 100644 --- a/docs/resources/failover_group.md +++ b/docs/resources/failover_group.md @@ -43,7 +43,7 @@ resource "snowflake_failover_group" "target_failover_group" { from_replica { organization_name = "..." source_account_name = "..." - name = snowflake_failover_group.fg.name + name = snowflake_failover_group.source_failover_group.name } } ``` diff --git a/docs/resources/secondary_database.md b/docs/resources/secondary_database.md new file mode 100644 index 0000000000..d3948e8037 --- /dev/null +++ b/docs/resources/secondary_database.md @@ -0,0 +1,125 @@ +--- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "snowflake_secondary_database Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + A secondary database creates a replica of an existing primary database (i.e. a secondary database). For more information about database replication, see Introduction to database replication across multiple accounts https://docs.snowflake.com/en/user-guide/db-replication-intro. 
+--- + +# snowflake_secondary_database (Resource) + +~> **Note** The snowflake_secondary_database resource doesn't refresh itself, as the best practice is to use tasks scheduled for a certain interval. Check out the examples to see how to set up the refresh task. For a SQL-based replication guide, see the [official documentation](https://docs.snowflake.com/en/user-guide/db-replication-config#replicating-a-database-to-another-account). + +A secondary database creates a replica of an existing primary database (i.e. a secondary database). For more information about database replication, see [Introduction to database replication across multiple accounts](https://docs.snowflake.com/en/user-guide/db-replication-intro). + +## Example Usage + +```terraform +# 1. Preparing primary database +resource "snowflake_database" "primary" { + provider = primary_account # notice the provider fields + name = "database_name" + replication { + enable_to_account { + account_identifier = "." + with_failover = true + } + ignore_edition_check = true + } +} + +# 2. Creating secondary database +## 2.1. Minimal version +resource "snowflake_secondary_database" "test" { + provider = secondary_account + name = snowflake_database.primary.name # It's recommended to give a secondary database the same name as its primary database + as_replica_of = "..${snowflake_database.primary.name}" +} + +## 2.2.
Complete version (with every optional argument set) +resource "snowflake_secondary_database" "test" { + provider = secondary_account + name = snowflake_database.primary.name # It's recommended to give a secondary database the same name as its primary database + is_transient = false + as_replica_of = "..${snowflake_database.primary.name}" + comment = "A secondary database" + + data_retention_time_in_days = 10 + max_data_extension_time_in_days = 20 + external_volume = "" + catalog = "" + replace_invalid_characters = false + default_ddl_collation = "en_US" + storage_serialization_policy = "COMPATIBLE" + log_level = "INFO" + trace_level = "ALWAYS" + suspend_task_after_num_failures = 10 + task_auto_retry_attempts = 10 + user_task_managed_initial_warehouse_size = "LARGE" + user_task_timeout_ms = 3600000 + user_task_minimum_trigger_interval_in_seconds = 120 + quoted_identifiers_ignore_case = false + enable_console_output = false +} + +# The snowflake_secondary_database resource doesn't refresh itself, as the best practice is to use tasks scheduled for a certain interval. +# To create the refresh tasks, use a separate database and schema. + +resource "snowflake_database" "tasks" { + name = "database_for_tasks" +} + +resource "snowflake_schema" "tasks" { + name = "schema_for_tasks" + database = snowflake_database.tasks.name +} + +resource "snowflake_task" "refresh_secondary_database" { + database = snowflake_database.tasks.name + name = "refresh_secondary_database" + schema = snowflake_schema.tasks.name + schedule = "10 minute" + sql_statement = "ALTER DATABASE ${snowflake_secondary_database.test.name} REFRESH" +} +``` + + +## Schema + +### Required + +- `as_replica_of` (String) A fully qualified path to a database to create a replica from. A fully qualified path follows the format of `""."".""`. +- `name` (String) Specifies the identifier for the database; must be unique for your account.
As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '..') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database. + +### Optional + +- `catalog` (String) The database parameter that specifies the default catalog to use for Iceberg tables. +- `comment` (String) Specifies a comment for the database. +- `data_retention_time_in_days` (Number) Specifies the number of days for which Time Travel actions (CLONE and UNDROP) can be performed on the database, as well as specifying the default Time Travel retention time for all schemas created in the database. For more details, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel). +- `default_ddl_collation` (String) Specifies a default collation specification for all schemas and tables added to the database. It can be overridden on schema or table level. For more information, see [collation specification](https://docs.snowflake.com/en/sql-reference/collation#label-collation-specification). +- `enable_console_output` (Boolean) If true, enables stdout/stderr fast path logging for anonymous stored procedures. +- `external_volume` (String) The database parameter that specifies the default external volume to use for Iceberg tables. +- `is_transient` (Boolean) Specifies the database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss. 
+- `log_level` (String) Specifies the severity level of messages that should be ingested and made available in the active event table. Valid options are: [TRACE DEBUG INFO WARN ERROR FATAL OFF]. Messages at the specified level (and at more severe levels) are ingested. For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level). +- `max_data_extension_time_in_days` (Number) Object parameter that specifies the maximum number of days for which Snowflake can extend the data retention period for tables in the database to prevent streams on the tables from becoming stale. For a detailed description of this parameter, see [MAX_DATA_EXTENSION_TIME_IN_DAYS](https://docs.snowflake.com/en/sql-reference/parameters.html#label-max-data-extension-time-in-days). +- `quoted_identifiers_ignore_case` (Boolean) If true, the case of quoted identifiers is ignored. +- `replace_invalid_characters` (Boolean) Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table. You can only set this parameter for tables that use an external Iceberg catalog. +- `storage_serialization_policy` (String) The storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: [COMPATIBLE OPTIMIZED]. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake. +- `suspend_task_after_num_failures` (Number) How many times a task must fail in a row before it is automatically suspended. 0 disables auto-suspending. +- `task_auto_retry_attempts` (Number) Maximum automatic retries allowed for a user task. +- `trace_level` (String) Controls how trace events are ingested into the event table. Valid options are: [ALWAYS ON_EVENT OFF]. 
For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level). +- `user_task_managed_initial_warehouse_size` (String) The initial size of warehouse to use for managed warehouses in the absence of history. +- `user_task_minimum_trigger_interval_in_seconds` (Number) Minimum amount of time between Triggered Task executions in seconds. +- `user_task_timeout_ms` (Number) User task execution timeout in milliseconds. + +### Read-Only + +- `id` (String) The ID of this resource. + +## Import + +Import is supported using the following syntax: + +```shell +terraform import snowflake_secondary_database.example 'secondary_database_name' +``` diff --git a/docs/resources/sequence.md b/docs/resources/sequence.md index 2fd888dc31..2b4dfba100 100644 --- a/docs/resources/sequence.md +++ b/docs/resources/sequence.md @@ -12,17 +12,17 @@ description: |- ## Example Usage ```terraform -resource "snowflake_database" "database" { +resource "snowflake_database" "test" { name = "things" } resource "snowflake_schema" "test_schema" { name = "things" - database = snowflake_database.test_database.name + database = snowflake_database.test.name } resource "snowflake_sequence" "test_sequence" { - database = snowflake_database.test_database.name + database = snowflake_database.test.name schema = snowflake_schema.test_schema.name name = "thing_counter" } diff --git a/docs/resources/shared_database.md b/docs/resources/shared_database.md new file mode 100644 index 0000000000..d9b899dba0 --- /dev/null +++ b/docs/resources/shared_database.md @@ -0,0 +1,107 @@ +--- +page_title: "snowflake_shared_database Resource - terraform-provider-snowflake" +subcategory: "" +description: |- + A shared database creates a database from a share provided by another Snowflake account. For more information about shares, see Introduction to Secure Data Sharing https://docs.snowflake.com/en/user-guide/data-sharing-intro. 
+--- + +# snowflake_shared_database (Resource) + +A shared database creates a database from a share provided by another Snowflake account. For more information about shares, see [Introduction to Secure Data Sharing](https://docs.snowflake.com/en/user-guide/data-sharing-intro). + +## Example Usage + +```terraform +# 1. Preparing database to share +resource "snowflake_share" "test" { + provider = primary_account # notice the provider fields + name = "share_name" + accounts = ["."] +} + +resource "snowflake_database" "test" { + provider = primary_account + name = "shared_database" +} + +resource "snowflake_grant_privileges_to_share" "test" { + provider = primary_account + to_share = snowflake_share.test.name + privileges = ["USAGE"] + on_database = snowflake_database.test.name +} + +# 2. Creating shared database +## 2.1. Minimal version +resource "snowflake_shared_database" "test" { + provider = secondary_account + depends_on = [snowflake_grant_privileges_to_share.test] + name = snowflake_database.test.name # shared database should have the same name as the "imported" one + from_share = "..${snowflake_share.test.name}" +} + +## 2.2.
Complete version (with every optional argument set) +resource "snowflake_shared_database" "test" { + provider = secondary_account + depends_on = [snowflake_grant_privileges_to_share.test] + name = snowflake_database.test.name # shared database should have the same name as the "imported" one + is_transient = false + from_share = "..${snowflake_share.test.name}" + comment = "A shared database" + + data_retention_time_in_days = 10 + max_data_extension_time_in_days = 20 + external_volume = "" + catalog = "" + replace_invalid_characters = false + default_ddl_collation = "en_US" + storage_serialization_policy = "COMPATIBLE" + log_level = "INFO" + trace_level = "ALWAYS" + suspend_task_after_num_failures = 10 + task_auto_retry_attempts = 10 + user_task_managed_initial_warehouse_size = "LARGE" + user_task_timeout_ms = 3600000 + user_task_minimum_trigger_interval_in_seconds = 120 + quoted_identifiers_ignore_case = false + enable_console_output = false +} +``` + + +## Schema + +### Required + +- `from_share` (String) A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `""."".""`. +- `name` (String) Specifies the identifier for the database; must be unique for your account. + +### Optional + +- `catalog` (String) The database parameter that specifies the default catalog to use for Iceberg tables. +- `comment` (String) Specifies a comment for the database. +- `default_ddl_collation` (String) Specifies a default collation specification for all schemas and tables added to the database. It can be overridden on schema or table level. For more information, see [collation specification](https://docs.snowflake.com/en/sql-reference/collation#label-collation-specification). +- `enable_console_output` (Boolean) If true, enables stdout/stderr fast path logging for anonymous stored procedures. +- `external_volume` (String) The database parameter that specifies the default external volume to use for Iceberg tables.
+- `log_level` (String) Specifies the severity level of messages that should be ingested and made available in the active event table. Valid options are: [TRACE DEBUG INFO WARN ERROR FATAL OFF]. Messages at the specified level (and at more severe levels) are ingested. For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level). +- `quoted_identifiers_ignore_case` (Boolean) If true, the case of quoted identifiers is ignored. +- `replace_invalid_characters` (Boolean) Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table. You can only set this parameter for tables that use an external Iceberg catalog. +- `storage_serialization_policy` (String) The storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: [COMPATIBLE OPTIMIZED]. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake. +- `suspend_task_after_num_failures` (Number) How many times a task must fail in a row before it is automatically suspended. 0 disables auto-suspending. +- `task_auto_retry_attempts` (Number) Maximum automatic retries allowed for a user task. +- `trace_level` (String) Controls how trace events are ingested into the event table. Valid options are: [ALWAYS ON_EVENT OFF]. For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level). +- `user_task_managed_initial_warehouse_size` (String) The initial size of warehouse to use for managed warehouses in the absence of history. +- `user_task_minimum_trigger_interval_in_seconds` (Number) Minimum amount of time between Triggered Task executions in seconds. 
+- `user_task_timeout_ms` (Number) User task execution timeout in milliseconds. + +### Read-Only + +- `id` (String) The ID of this resource. + +## Import + +Import is supported using the following syntax: + +```shell +terraform import snowflake_shared_database.example 'shared_database_name' +``` diff --git a/docs/resources/tag_association.md b/docs/resources/tag_association.md index 5319ed4035..3abe2e9066 100644 --- a/docs/resources/tag_association.md +++ b/docs/resources/tag_association.md @@ -12,28 +12,28 @@ description: |- ## Example Usage ```terraform -resource "snowflake_database" "database" { +resource "snowflake_database" "test" { name = "database" } -resource "snowflake_schema" "schema" { +resource "snowflake_schema" "test" { name = "schema" - database = snowflake_database.database.name + database = snowflake_database.test.name } -resource "snowflake_tag" "tag" { +resource "snowflake_tag" "test" { name = "cost_center" - database = snowflake_database.database.name - schema = snowflake_schema.schema.name + database = snowflake_database.test.name + schema = snowflake_schema.test.name allowed_values = ["finance", "engineering"] } resource "snowflake_tag_association" "db_association" { object_identifier { - name = snowflake_database.database.name + name = snowflake_database.test.name } object_type = "DATABASE" - tag_id = snowflake_tag.tag.id + tag_id = snowflake_tag.test.id tag_value = "finance" } diff --git a/examples/additional/deprecated_resources.MD b/examples/additional/deprecated_resources.MD index 870eb9371d..01bdc91bbe 100644 --- a/examples/additional/deprecated_resources.MD +++ b/examples/additional/deprecated_resources.MD @@ -2,6 +2,7 @@ - [snowflake_account_grant](./docs/resources/account_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead - [snowflake_database_grant](./docs/resources/database_grant) - use 
[snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead +- [snowflake_database_old](./docs/resources/database_old) - use [snowflake_database](./docs/resources/database), [snowflake_secondary_database](./docs/resources/secondary_database), or [snowflake_shared_database](./docs/resources/shared_database) instead - [snowflake_external_table_grant](./docs/resources/external_table_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead - [snowflake_failover_group_grant](./docs/resources/failover_group_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead - [snowflake_file_format_grant](./docs/resources/file_format_grant) - use [snowflake_grant_privileges_to_account_role](./docs/resources/grant_privileges_to_account_role) instead diff --git a/examples/data-sources/snowflake_databases/data-source.tf b/examples/data-sources/snowflake_databases/data-source.tf index 289c9d3563..f6f21658df 100644 --- a/examples/data-sources/snowflake_databases/data-source.tf +++ b/examples/data-sources/snowflake_databases/data-source.tf @@ -1 +1,73 @@ -data "snowflake_databases" "this" {} +# Simple usage +data "snowflake_databases" "simple" { +} + +output "simple_output" { + value = data.snowflake_databases.simple.databases +} + +# Filtering (like) +data "snowflake_databases" "like" { + like = "database-name" +} + +output "like_output" { + value = data.snowflake_databases.like.databases +} + +# Filtering (starts_with) +data "snowflake_databases" "starts_with" { + starts_with = "database-" +} + +output "starts_with_output" { + value = data.snowflake_databases.starts_with.databases +} + +# Filtering (limit) +data "snowflake_databases" "limit" { + limit { + rows = 10 + from = "database-" + } +} + +output "limit_output" { + value = data.snowflake_databases.limit.databases +} + +# Without additional data (to limit the number of calls made for every found database) +data "snowflake_databases" "only_show" { + # with_describe is turned on by default and it calls DESCRIBE DATABASE for every database found and attaches its
output to databases.*.description field + with_describe = false + + # with_parameters is turned on by default and it calls SHOW PARAMETERS FOR DATABASE for every database found and attaches its output to databases.*.parameters field + with_parameters = false +} + +output "only_show_output" { + value = data.snowflake_databases.only_show.databases +} + +# Ensure at least one database is returned (with the use of a postcondition) +data "snowflake_databases" "assert_with_postcondition" { + starts_with = "database-name" + lifecycle { + postcondition { + condition = length(self.databases) > 0 + error_message = "there should be at least one database" + } + } +} + +# Ensure exactly one database is returned (with the use of a check block) +check "database_check" { + data "snowflake_databases" "assert_with_check_block" { + like = "database-name" + } + + assert { + condition = length(data.snowflake_databases.assert_with_check_block.databases) == 1 + error_message = "Databases filtered by '${data.snowflake_databases.assert_with_check_block.like}' returned ${length(data.snowflake_databases.assert_with_check_block.databases)} databases where one was expected" + } +} diff --git a/examples/resources/snowflake_database/import.sh b/examples/resources/snowflake_database/import.sh index 66914f1a48..8a30774299 100644 --- a/examples/resources/snowflake_database/import.sh +++ b/examples/resources/snowflake_database/import.sh @@ -1 +1 @@ -terraform import snowflake_database.example name +terraform import snowflake_database.example 'database_name' diff --git a/examples/resources/snowflake_database/resource.tf b/examples/resources/snowflake_database/resource.tf index 2eed3a059e..13c1833c6b 100644 --- a/examples/resources/snowflake_database/resource.tf +++ b/examples/resources/snowflake_database/resource.tf @@ -1,30 +1,61 @@ -resource "snowflake_database" "simple" { - name = "testing" - comment = "test comment" - data_retention_time_in_days = 3 +## Minimal +resource "snowflake_database" "primary" { + 
name = "database_name" } -resource "snowflake_database" "with_replication" { - name = "testing_2" - comment = "test comment 2" - replication_configuration { - accounts = ["test_account1", "test_account_2"] +## Complete (with every optional argument set) +resource "snowflake_database" "primary" { + name = "database_name" + is_transient = false + comment = "my standard database" + + data_retention_time_in_days = 10 + max_data_extension_time_in_days = 20 + external_volume = "" + catalog = "" + replace_invalid_characters = false + default_ddl_collation = "en_US" + storage_serialization_policy = "COMPATIBLE" + log_level = "INFO" + trace_level = "ALWAYS" + suspend_task_after_num_failures = 10 + task_auto_retry_attempts = 10 + user_task_managed_initial_warehouse_size = "LARGE" + user_task_timeout_ms = 3600000 + user_task_minimum_trigger_interval_in_seconds = 120 + quoted_identifiers_ignore_case = false + enable_console_output = false + + replication { + enable_to_account { + account_identifier = "." + with_failover = true + } ignore_edition_check = true } } -resource "snowflake_database" "from_replica" { - name = "testing_3" - comment = "test comment" - data_retention_time_in_days = 3 - from_replica = "\"org1\".\"account1\".\"primary_db_name\"" +## Replication with for_each +locals { + replication_configs = [ + { + account_identifier = "." + with_failover = true + }, + { + account_identifier = "."
+ with_failover = true + }, + ] } -resource "snowflake_database" "from_share" { - name = "testing_4" - comment = "test comment" - from_share = { - provider = "account1_locator" - share = "share1" +resource "snowflake_database" "primary" { + name = "database_name" + + replication { + dynamic "enable_to_account" { + for_each = local.replication_configs + content { + account_identifier = enable_to_account.value.account_identifier + with_failover = enable_to_account.value.with_failover + } + } + ignore_edition_check = true } } diff --git a/examples/resources/snowflake_database_old/import.sh b/examples/resources/snowflake_database_old/import.sh new file mode 100644 index 0000000000..3ea61a2c21 --- /dev/null +++ b/examples/resources/snowflake_database_old/import.sh @@ -0,0 +1 @@ +terraform import snowflake_database_old.example 'database_name' diff --git a/examples/resources/snowflake_database_old/resource.tf b/examples/resources/snowflake_database_old/resource.tf new file mode 100644 index 0000000000..2219295495 --- /dev/null +++ b/examples/resources/snowflake_database_old/resource.tf @@ -0,0 +1,30 @@ +resource "snowflake_database_old" "simple" { + name = "testing" + comment = "test comment" + data_retention_time_in_days = 3 +} + +resource "snowflake_database_old" "with_replication" { + name = "testing_2" + comment = "test comment 2" + replication_configuration { + accounts = ["test_account1", "test_account_2"] + ignore_edition_check = true + } +} + +resource "snowflake_database_old" "from_replica" { + name = "testing_3" + comment = "test comment" + data_retention_time_in_days = 3 + from_replica = "\"org1\".\"account1\".\"primary_db_name\"" +} + +resource "snowflake_database_old" "from_share" { + name = "testing_4" + comment = "test comment" + from_share = { + provider = "account1_locator" + share = "share1" + } +} diff --git a/examples/resources/snowflake_failover_group/resource.tf b/examples/resources/snowflake_failover_group/resource.tf index 3fe2e0ffeb..a5ce34acf6 100644 --- a/examples/resources/snowflake_failover_group/resource.tf +++ b/examples/resources/snowflake_failover_group/resource.tf @@ -29,6
+29,6 @@ resource "snowflake_failover_group" "target_failover_group" { from_replica { organization_name = "..." source_account_name = "..." - name = snowflake_failover_group.fg.name + name = snowflake_failover_group.source_failover_group.name } } diff --git a/examples/resources/snowflake_secondary_database/resource.tf b/examples/resources/snowflake_secondary_database/resource.tf index dd606162ef..743759d395 100644 --- a/examples/resources/snowflake_secondary_database/resource.tf +++ b/examples/resources/snowflake_secondary_database/resource.tf @@ -2,33 +2,65 @@ resource "snowflake_database" "primary" { provider = primary_account # notice the provider fields name = "database_name" - replication_configuration { - accounts = ["."] + replication { + enable_to_account { + account_identifier = "." + with_failover = true + } ignore_edition_check = true } } # 2. Creating secondary database +## 2.1. Minimal version resource "snowflake_secondary_database" "test" { provider = secondary_account name = snowflake_database.primary.name # It's recommended to give a secondary database the same name as its primary database as_replica_of = "..${snowflake_database.primary.name}" +} + +## 2.2. 
Complete version (with every optional argument set) +resource "snowflake_secondary_database" "test" { + provider = secondary_account + name = snowflake_database.primary.name # It's recommended to give a secondary database the same name as its primary database is_transient = false + as_replica_of = "..${snowflake_database.primary.name}" + comment = "A secondary database" - data_retention_time_in_days { - value = 10 - } + data_retention_time_in_days = 10 + max_data_extension_time_in_days = 20 + external_volume = "" + catalog = "" + replace_invalid_characters = false + default_ddl_collation = "en_US" + storage_serialization_policy = "COMPATIBLE" + log_level = "INFO" + trace_level = "ALWAYS" + suspend_task_after_num_failures = 10 + task_auto_retry_attempts = 10 + user_task_managed_initial_warehouse_size = "LARGE" + user_task_timeout_ms = 3600000 + user_task_minimum_trigger_interval_in_seconds = 120 + quoted_identifiers_ignore_case = false + enable_console_output = false +} - max_data_extension_time_in_days { - value = 20 - } +# The snowflake_secondary_database resource doesn't refresh itself, as the best practice is to use tasks scheduled for a certain interval. +# To create the refresh tasks, use a separate database and schema.
+ +resource "snowflake_database" "tasks" { + name = "database_for_tasks" +} + +resource "snowflake_schema" "tasks" { + name = "schema_for_tasks" + database = snowflake_database.tasks.name +} - external_volume = "external_volume_name" - catalog = "catalog_name" - replace_invalid_characters = false - default_ddl_collation = "en_US" - storage_serialization_policy = "OPTIMIZED" - log_level = "OFF" - trace_level = "OFF" - comment = "A secondary database" +resource "snowflake_task" "refresh_secondary_database" { + database = snowflake_database.tasks.name + name = "refresh_secondary_database" + schema = snowflake_schema.tasks.name + schedule = "10 minute" + sql_statement = "ALTER DATABASE ${snowflake_secondary_database.test.name} REFRESH" } diff --git a/examples/resources/snowflake_sequence/resource.tf b/examples/resources/snowflake_sequence/resource.tf index 08412a41e0..0518539c4b 100644 --- a/examples/resources/snowflake_sequence/resource.tf +++ b/examples/resources/snowflake_sequence/resource.tf @@ -1,14 +1,14 @@ -resource "snowflake_database" "database" { +resource "snowflake_database" "test" { name = "things" } resource "snowflake_schema" "test_schema" { name = "things" - database = snowflake_database.test_database.name + database = snowflake_database.test.name } resource "snowflake_sequence" "test_sequence" { - database = snowflake_database.test_database.name + database = snowflake_database.test.name schema = snowflake_schema.test_schema.name name = "thing_counter" } diff --git a/examples/resources/snowflake_shared_database/resource.tf b/examples/resources/snowflake_shared_database/resource.tf index 7f506bccf9..9128bfaacf 100644 --- a/examples/resources/snowflake_shared_database/resource.tf +++ b/examples/resources/snowflake_shared_database/resource.tf @@ -18,18 +18,37 @@ resource "snowflake_grant_privileges_to_share" "test" { } # 2. Creating shared database +## 2.1. 
Minimal version
 resource "snowflake_shared_database" "test" {
- provider = secondary_account
- depends_on = [snowflake_grant_privileges_to_share.test]
- name = snowflake_database.test.name # shared database should have the same as the "imported" one
- from_share = "..${snowflake_share.test.name}"
- is_transient = false
- external_volume = "external_volume_name"
- catalog = "catalog_name"
- replace_invalid_characters = false
- default_ddl_collation = "en_US"
- storage_serialization_policy = "OPTIMIZED"
- log_level = "OFF"
- trace_level = "OFF"
- comment = "A shared database"
+ provider = secondary_account
+ depends_on = [snowflake_grant_privileges_to_share.test]
+ name = snowflake_database.test.name # shared database should have the same name as the "imported" one
+ from_share = "..${snowflake_share.test.name}"
+}
+
+## 2.2. Complete version (with every optional set)
+resource "snowflake_shared_database" "test" {
+ provider = secondary_account
+ depends_on = [snowflake_grant_privileges_to_share.test]
+ name = snowflake_database.test.name # shared database should have the same name as the "imported" one
+ is_transient = false
+ from_share = "..${snowflake_share.test.name}"
+ comment = "A shared database"
+
+ data_retention_time_in_days = 10
+ max_data_extension_time_in_days = 20
+ external_volume = ""
+ catalog = ""
+ replace_invalid_characters = false
+ default_ddl_collation = "en_US"
+ storage_serialization_policy = "COMPATIBLE"
+ log_level = "INFO"
+ trace_level = "ALWAYS"
+ suspend_task_after_num_failures = 10
+ task_auto_retry_attempts = 10
+ user_task_managed_initial_warehouse_size = "LARGE"
+ user_task_timeout_ms = 3600000
+ user_task_minimum_trigger_interval_in_seconds = 120
+ quoted_identifiers_ignore_case = false
+ enable_console_output = false
 }
diff --git a/examples/resources/snowflake_tag_association/resource.tf b/examples/resources/snowflake_tag_association/resource.tf
index ab57b58884..36d5fbf7de 100644
---
a/examples/resources/snowflake_tag_association/resource.tf +++ b/examples/resources/snowflake_tag_association/resource.tf @@ -1,25 +1,25 @@ -resource "snowflake_database" "database" { +resource "snowflake_database" "test" { name = "database" } -resource "snowflake_schema" "schema" { +resource "snowflake_schema" "test" { name = "schema" - database = snowflake_database.database.name + database = snowflake_database.test.name } -resource "snowflake_tag" "tag" { +resource "snowflake_tag" "test" { name = "cost_center" - database = snowflake_database.database.name - schema = snowflake_schema.schema.name + database = snowflake_database.test.name + schema = snowflake_schema.test.name allowed_values = ["finance", "engineering"] } resource "snowflake_tag_association" "db_association" { object_identifier { - name = snowflake_database.database.name + name = snowflake_database.test.name } object_type = "DATABASE" - tag_id = snowflake_tag.tag.id + tag_id = snowflake_tag.test.id tag_value = "finance" } diff --git a/pkg/acceptance/asserts.go b/pkg/acceptance/asserts.go new file mode 100644 index 0000000000..4ce547dfa9 --- /dev/null +++ b/pkg/acceptance/asserts.go @@ -0,0 +1,23 @@ +package acceptance + +import ( + "fmt" + "strconv" + + "github.com/hashicorp/terraform-plugin-testing/helper/resource" +) + +func IsGreaterOrEqualTo(greaterOrEqualValue int) resource.CheckResourceAttrWithFunc { + return func(value string) error { + intValue, err := strconv.Atoi(value) + if err != nil { + return fmt.Errorf("unable to parse value %s as integer, err = %w", value, err) + } + + if intValue < greaterOrEqualValue { + return fmt.Errorf("expected value %d to be greater or equal to %d", intValue, greaterOrEqualValue) + } + + return nil + } +} diff --git a/pkg/acceptance/asserts_test.go b/pkg/acceptance/asserts_test.go new file mode 100644 index 0000000000..6d7e488d7f --- /dev/null +++ b/pkg/acceptance/asserts_test.go @@ -0,0 +1,73 @@ +package acceptance + +import ( + "testing" + + 
"github.com/stretchr/testify/assert" +) + +func TestIsGreaterOrEqualTo(t *testing.T) { + testCases := []struct { + Name string + GreaterOrEqualTo int + Actual string + Error string + }{ + { + Name: "validation: smaller than expected", + GreaterOrEqualTo: 20, + Actual: "10", + Error: "expected value 10 to be greater or equal to 20", + }, + { + Name: "validation: zero actual value", + GreaterOrEqualTo: 20, + Actual: "0", + Error: "expected value 0 to be greater or equal to 20", + }, + { + Name: "validation: zero greater value", + GreaterOrEqualTo: 0, + Actual: "-10", + Error: "expected value -10 to be greater or equal to 0", + }, + { + Name: "validation: negative value", + GreaterOrEqualTo: -20, + Actual: "-30", + Error: "expected value -30 to be greater or equal to -20", + }, + { + Name: "validation: not int value", + GreaterOrEqualTo: 20, + Actual: "not_int", + Error: "unable to parse value not_int as integer, err = strconv.Atoi: parsing \"not_int\": invalid syntax", + }, + { + Name: "validation: equal value", + GreaterOrEqualTo: 20, + Actual: "20", + }, + { + Name: "validation: greater value", + GreaterOrEqualTo: 20, + Actual: "30", + }, + { + Name: "validation: greater value with expected negative value", + GreaterOrEqualTo: -20, + Actual: "30", + }, + } + + for _, testCase := range testCases { + t.Run(testCase.Name, func(t *testing.T) { + err := IsGreaterOrEqualTo(testCase.GreaterOrEqualTo)(testCase.Actual) + if testCase.Error != "" { + assert.ErrorContains(t, err, testCase.Error) + } else { + assert.NoError(t, err) + } + }) + } +} diff --git a/pkg/acceptance/check_destroy.go b/pkg/acceptance/check_destroy.go index f9c46a102e..68b9054c78 100644 --- a/pkg/acceptance/check_destroy.go +++ b/pkg/acceptance/check_destroy.go @@ -73,6 +73,9 @@ var showByIdFunctions = map[resources.Resource]showByIdFunc{ resources.Database: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error { return runShowById(ctx, id, client.Databases.ShowByID) }, + 
resources.DatabaseOld: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error { + return runShowById(ctx, id, client.Databases.ShowByID) + }, resources.DatabaseRole: func(ctx context.Context, client *sdk.Client, id sdk.ObjectIdentifier) error { return runShowById(ctx, id, client.DatabaseRoles.ShowByID) }, diff --git a/pkg/acceptance/helpers/database_client.go b/pkg/acceptance/helpers/database_client.go index ad050aff03..2e9c71cf23 100644 --- a/pkg/acceptance/helpers/database_client.go +++ b/pkg/acceptance/helpers/database_client.go @@ -25,30 +25,6 @@ func (c *DatabaseClient) client() sdk.Databases { return c.context.client.Databases } -func (c *DatabaseClient) CreatePrimaryDatabase(t *testing.T, enableReplicationTo []sdk.AccountIdentifier) (*sdk.Database, sdk.ExternalObjectIdentifier, func()) { - t.Helper() - ctx := context.Background() - - primaryDatabase, primaryDatabaseCleanup := c.CreateDatabase(t) - - err := c.client().AlterReplication(ctx, primaryDatabase.ID(), &sdk.AlterDatabaseReplicationOptions{ - EnableReplication: &sdk.EnableReplication{ - ToAccounts: enableReplicationTo, - IgnoreEditionCheck: sdk.Bool(true), - }, - }) - require.NoError(t, err) - - organizationName, err := c.context.client.ContextFunctions.CurrentOrganizationName(ctx) - require.NoError(t, err) - - accountName, err := c.context.client.ContextFunctions.CurrentAccountName(ctx) - require.NoError(t, err) - - externalPrimaryId := sdk.NewExternalObjectIdentifier(sdk.NewAccountIdentifier(organizationName, accountName), primaryDatabase.ID()) - return primaryDatabase, externalPrimaryId, primaryDatabaseCleanup -} - func (c *DatabaseClient) CreateDatabase(t *testing.T) (*sdk.Database, func()) { t.Helper() return c.CreateDatabaseWithOptions(t, c.ids.RandomAccountObjectIdentifier(), &sdk.CreateDatabaseOptions{}) @@ -113,18 +89,49 @@ func (c *DatabaseClient) CreateSecondaryDatabaseWithOptions(t *testing.T, id sdk } } -func (c *DatabaseClient) UpdateDataRetentionTime(t *testing.T, 
id sdk.AccountObjectIdentifier, days int) func() { +func (c *DatabaseClient) CreatePrimaryDatabase(t *testing.T, enableReplicationTo []sdk.AccountIdentifier) (*sdk.Database, sdk.ExternalObjectIdentifier, func()) { t.Helper() ctx := context.Background() - return func() { - err := c.client().Alter(ctx, id, &sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - DataRetentionTimeInDays: sdk.Int(days), - }, - }) - require.NoError(t, err) - } + primaryDatabase, primaryDatabaseCleanup := c.CreateDatabase(t) + + err := c.client().AlterReplication(ctx, primaryDatabase.ID(), &sdk.AlterDatabaseReplicationOptions{ + EnableReplication: &sdk.EnableReplication{ + ToAccounts: enableReplicationTo, + IgnoreEditionCheck: sdk.Bool(true), + }, + }) + require.NoError(t, err) + + sessionDetails, err := c.context.client.ContextFunctions.CurrentSessionDetails(ctx) + require.NoError(t, err) + + externalPrimaryId := sdk.NewExternalObjectIdentifier(sdk.NewAccountIdentifier(sessionDetails.OrganizationName, sessionDetails.AccountName), primaryDatabase.ID()) + return primaryDatabase, externalPrimaryId, primaryDatabaseCleanup +} + +func (c *DatabaseClient) UpdateDataRetentionTime(t *testing.T, id sdk.AccountObjectIdentifier, days int) { + t.Helper() + ctx := context.Background() + + err := c.client().Alter(ctx, id, &sdk.AlterDatabaseOptions{ + Set: &sdk.DatabaseSet{ + DataRetentionTimeInDays: sdk.Int(days), + }, + }) + require.NoError(t, err) +} + +func (c *DatabaseClient) UnsetCatalog(t *testing.T, id sdk.AccountObjectIdentifier) { + t.Helper() + ctx := context.Background() + + err := c.client().Alter(ctx, id, &sdk.AlterDatabaseOptions{ + Unset: &sdk.DatabaseUnset{ + Catalog: sdk.Bool(true), + }, + }) + require.NoError(t, err) } func (c *DatabaseClient) Show(t *testing.T, id sdk.AccountObjectIdentifier) (*sdk.Database, error) { diff --git a/pkg/acceptance/helpers/parameter_client.go b/pkg/acceptance/helpers/parameter_client.go index 9726cd8014..54435b4c63 100644 --- 
a/pkg/acceptance/helpers/parameter_client.go +++ b/pkg/acceptance/helpers/parameter_client.go @@ -5,6 +5,8 @@ import ( "fmt" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/stretchr/testify/require" ) @@ -23,6 +25,28 @@ func (c *ParameterClient) client() sdk.Parameters { return c.context.client.Parameters } +func (c *ParameterClient) ShowAccountParameters(t *testing.T) []*sdk.Parameter { + t.Helper() + params, err := c.client().ShowParameters(context.Background(), &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Account: sdk.Bool(true), + }, + }) + require.NoError(t, err) + return params +} + +func (c *ParameterClient) ShowDatabaseParameters(t *testing.T, id sdk.AccountObjectIdentifier) []*sdk.Parameter { + t.Helper() + params, err := c.client().ShowParameters(context.Background(), &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: id, + }, + }) + require.NoError(t, err) + return params +} + func (c *ParameterClient) UpdateAccountParameterTemporarily(t *testing.T, parameter sdk.AccountParameter, newValue string) func() { t.Helper() ctx := context.Background() @@ -62,3 +86,10 @@ func (c *ParameterClient) UnsetAccountParameter(t *testing.T, parameter sdk.Acco _, err := c.context.client.ExecForTests(ctx, fmt.Sprintf("ALTER ACCOUNT UNSET %s", parameter)) require.NoError(t, err) } + +func FindParameter(t *testing.T, parameters []*sdk.Parameter, parameter sdk.AccountParameter) *sdk.Parameter { + t.Helper() + param, err := collections.FindOne(parameters, func(p *sdk.Parameter) bool { return p.Key == string(parameter) }) + require.NoError(t, err) + return *param +} diff --git a/pkg/acceptance/snowflakechecks/database.go b/pkg/acceptance/snowflakechecks/database.go new file mode 100644 index 0000000000..f3630aa2e1 --- /dev/null +++ b/pkg/acceptance/snowflakechecks/database.go @@ -0,0 +1,28 @@ +package 
snowflakechecks
+
+import (
+ "errors"
+ "fmt"
+ "testing"
+
+ acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance"
+ "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers"
+ "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk"
+ "github.com/hashicorp/terraform-plugin-testing/helper/resource"
+ "github.com/hashicorp/terraform-plugin-testing/terraform"
+)
+
+func CheckDatabaseDataRetentionTimeInDays(t *testing.T, databaseId sdk.AccountObjectIdentifier, expectedLevel sdk.ParameterType, expectedValue string) resource.TestCheckFunc {
+ t.Helper()
+ return func(state *terraform.State) error {
+ param := helpers.FindParameter(t, acc.TestClient().Parameter.ShowDatabaseParameters(t, databaseId), sdk.AccountParameterDataRetentionTimeInDays)
+ var errs []error
+ if param.Level != expectedLevel {
+ errs = append(errs, fmt.Errorf("expected parameter level %s, got %s", expectedLevel, param.Level))
+ }
+ if param.Value != expectedValue {
+ errs = append(errs, fmt.Errorf("expected parameter value %s, got %s", expectedValue, param.Value))
+ }
+ return errors.Join(errs...)
+ } +} diff --git a/pkg/acceptance/testenvs/testing_environment_variables.go b/pkg/acceptance/testenvs/testing_environment_variables.go index 01ca81eacc..7bc1a3e082 100644 --- a/pkg/acceptance/testenvs/testing_environment_variables.go +++ b/pkg/acceptance/testenvs/testing_environment_variables.go @@ -4,6 +4,8 @@ import ( "fmt" "os" "testing" + + "github.com/hashicorp/terraform-plugin-testing/helper/resource" ) type env string @@ -27,6 +29,7 @@ const ( SkipManagedAccountTest env = "TEST_SF_TF_SKIP_MANAGED_ACCOUNT_TEST" SkipSamlIntegrationTest env = "TEST_SF_TF_SKIP_SAML_INTEGRATION_TEST" + EnableAcceptance env = resource.EnvTfAcc EnableSweep env = "TEST_SF_TF_ENABLE_SWEEP" ConfigureClientOnce env = "SF_TF_ACC_TEST_CONFIGURE_CLIENT_ONCE" TestObjectsSuffix env = "TEST_SF_TF_TEST_OBJECT_SUFFIX" diff --git a/pkg/datasources/databases.go b/pkg/datasources/databases.go index ce46ff1b6f..c8ddae417a 100644 --- a/pkg/datasources/databases.go +++ b/pkg/datasources/databases.go @@ -4,52 +4,77 @@ import ( "context" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" ) var databasesSchema = map[string]*schema.Schema{ - "terse": { + "with_describe": { Type: schema.TypeBool, Optional: true, - Default: false, - Description: "Optionally returns only the columns `created_on` and `name` in the results", + Default: true, + Description: "Runs DESC DATABASE for each database returned by SHOW DATABASES. The output of describe is saved to the description field. 
By default this value is set to true.",
 },
- "history": {
+ "with_parameters": {
 Type: schema.TypeBool,
 Optional: true,
- Default: false,
- Description: "Optionally includes dropped databases that have not yet been purged The output also includes an additional `dropped_on` column",
+ Default: true,
+ Description: "Runs SHOW PARAMETERS FOR DATABASE for each database returned by SHOW DATABASES. The output of SHOW PARAMETERS is saved to the parameters field. By default this value is set to true.",
 },
- "pattern": {
+ "like": {
 Type: schema.TypeString,
 Optional: true,
- Description: "Optionally filters the databases by a pattern",
+ Description: "Filters the output with a **case-insensitive** pattern, with support for SQL wildcard characters (`%` and `_`).",
 },
 "starts_with": {
 Type: schema.TypeString,
 Optional: true,
- Description: "Optionally filters the databases by a pattern",
+ Description: "Filters the output with **case-sensitive** characters indicating the beginning of the object name.",
+ },
+ "limit": {
+ Type: schema.TypeList,
+ Optional: true,
+ Description: "Limits the number of rows returned. If `limit.from` is set, then the limit will start from the first element matched by the expression. The expression is used only to match the first element; later elements are not matched by the prefix, but you can enforce a certain pattern with `starts_with` or `like`.",
+ MaxItems: 1,
+ Elem: &schema.Resource{
+ Schema: map[string]*schema.Schema{
+ "rows": {
+ Type: schema.TypeInt,
+ Required: true,
+ Description: "The maximum number of rows to return.",
+ },
+ "from": {
+ Type: schema.TypeString,
+ Optional: true,
+ Description: "Specifies a **case-sensitive** pattern that is used to match the object name.
After the first match, the limit on the number of rows will be applied.", + }, + }, + }, }, "databases": { Type: schema.TypeList, Computed: true, - Description: "Snowflake databases", + Description: "Holds the output of SHOW DATABASES.", Elem: &schema.Resource{ Schema: map[string]*schema.Schema{ - "name": { + "created_on": { Type: schema.TypeString, Computed: true, }, - "comment": { + "name": { Type: schema.TypeString, Computed: true, }, - "owner": { + "kind": { Type: schema.TypeString, Computed: true, }, + "is_transient": { + Type: schema.TypeBool, + Computed: true, + }, "is_default": { Type: schema.TypeBool, Computed: true, @@ -62,11 +87,11 @@ var databasesSchema = map[string]*schema.Schema{ Type: schema.TypeString, Computed: true, }, - "retention_time": { - Type: schema.TypeInt, + "owner": { + Type: schema.TypeString, Computed: true, }, - "created_on": { + "comment": { Type: schema.TypeString, Computed: true, }, @@ -74,18 +99,63 @@ var databasesSchema = map[string]*schema.Schema{ Type: schema.TypeString, Computed: true, }, - "replication_configuration": { - Type: schema.TypeList, + "retention_time": { + Type: schema.TypeInt, Computed: true, + }, + "resource_group": { + Type: schema.TypeString, + Computed: true, + }, + "owner_role_type": { + Type: schema.TypeString, + Computed: true, + }, + "description": { + Type: schema.TypeList, + Computed: true, + Description: "Holds the output of DESCRIBE DATABASE.", + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "created_on": { + Type: schema.TypeString, + Computed: true, + }, + "name": { + Type: schema.TypeString, + Computed: true, + }, + "kind": { + Type: schema.TypeString, + Computed: true, + }, + }, + }, + }, + "parameters": { + Type: schema.TypeList, + Computed: true, + Description: "Holds the output of SHOW PARAMETERS FOR DATABASE.", Elem: &schema.Resource{ Schema: map[string]*schema.Schema{ - "accounts": { - Type: schema.TypeList, + "key": { + Type: schema.TypeString, + Computed: true, + }, + 
"value": { + Type: schema.TypeString, + Computed: true, + }, + "level": { + Type: schema.TypeString, + Computed: true, + }, + "default": { + Type: schema.TypeString, Computed: true, - Elem: &schema.Schema{Type: schema.TypeString}, }, - "ignore_edition_check": { - Type: schema.TypeBool, + "description": { + Type: schema.TypeString, Computed: true, }, }, @@ -99,52 +169,107 @@ var databasesSchema = map[string]*schema.Schema{ // Databases the Snowflake current account resource. func Databases() *schema.Resource { return &schema.Resource{ - Read: ReadDatabases, - Schema: databasesSchema, + ReadContext: ReadDatabases, + Schema: databasesSchema, } } // ReadDatabases read the current snowflake account information. -func ReadDatabases(d *schema.ResourceData, meta interface{}) error { +func ReadDatabases(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client - ctx := context.Background() - opts := sdk.ShowDatabasesOptions{} - if terse, ok := d.GetOk("terse"); ok { - opts.Terse = sdk.Bool(terse.(bool)) - } - if history, ok := d.GetOk("history"); ok { - opts.History = sdk.Bool(history.(bool)) - } - if pattern, ok := d.GetOk("pattern"); ok { + var opts sdk.ShowDatabasesOptions + + if likePattern, ok := d.GetOk("like"); ok { opts.Like = &sdk.Like{ - Pattern: sdk.String(pattern.(string)), + Pattern: sdk.String(likePattern.(string)), } } + if startsWith, ok := d.GetOk("starts_with"); ok { opts.StartsWith = sdk.String(startsWith.(string)) } + + if limit, ok := d.GetOk("limit"); ok && len(limit.([]any)) == 1 { + limitMap := limit.([]any)[0].(map[string]any) + + rows := limitMap["rows"].(int) + opts.LimitFrom = &sdk.LimitFrom{ + Rows: &rows, + } + + if from, ok := limitMap["from"].(string); ok { + opts.LimitFrom.From = &from + } + } + databases, err := client.Databases.Show(ctx, &opts) if err != nil { - return err + return diag.FromErr(err) } d.SetId("databases_read") - flattenedDatabases := []map[string]interface{}{} - for 
_, database := range databases { - flattenedDatabase := map[string]interface{}{} - flattenedDatabase["name"] = database.Name - flattenedDatabase["comment"] = database.Comment - flattenedDatabase["owner"] = database.Owner - flattenedDatabase["is_default"] = database.IsDefault - flattenedDatabase["is_current"] = database.IsCurrent - flattenedDatabase["origin"] = database.Origin - flattenedDatabase["created_on"] = database.CreatedOn.String() - flattenedDatabase["options"] = database.Options - flattenedDatabase["retention_time"] = database.RetentionTime - flattenedDatabases = append(flattenedDatabases, flattenedDatabase) + + flattenedDatabases := make([]map[string]any, len(databases)) + + for i, database := range databases { + var databaseDescription []map[string]any + if d.Get("with_describe").(bool) { + describeResult, err := client.Databases.Describe(ctx, database.ID()) + if err != nil { + return diag.FromErr(err) + } + for _, description := range describeResult.Rows { + databaseDescription = append(databaseDescription, map[string]any{ + "created_on": description.CreatedOn.String(), + "name": description.Name, + "kind": description.Kind, + }) + } + } + + var databaseParameters []map[string]any + if d.Get("with_parameters").(bool) { + parameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: database.ID(), + }, + }) + if err != nil { + return diag.FromErr(err) + } + for _, parameter := range parameters { + databaseParameters = append(databaseParameters, map[string]any{ + "key": parameter.Key, + "value": parameter.Value, + "default": parameter.Default, + "level": string(parameter.Level), + "description": parameter.Description, + }) + } + } + + flattenedDatabases[i] = map[string]any{ + "created_on": database.CreatedOn.String(), + "name": database.Name, + "kind": database.Kind, + "is_transient": database.Transient, + "is_default": database.IsDefault, + "is_current": database.IsCurrent, + "origin": 
database.Origin, + "owner": database.Owner, + "comment": database.Comment, + "options": database.Options, + "retention_time": database.RetentionTime, + "resource_group": database.ResourceGroup, + "owner_role_type": database.OwnerRoleType, + "description": databaseDescription, + "parameters": databaseParameters, + } } + err = d.Set("databases", flattenedDatabases) if err != nil { - return err + return diag.FromErr(err) } + return nil } diff --git a/pkg/datasources/databases_acceptance_test.go b/pkg/datasources/databases_acceptance_test.go index 2a20b2a611..9989a1ad77 100644 --- a/pkg/datasources/databases_acceptance_test.go +++ b/pkg/datasources/databases_acceptance_test.go @@ -1,105 +1,170 @@ package datasources_test import ( - "fmt" + "maps" + "regexp" "strconv" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/hashicorp/terraform-plugin-testing/config" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" "github.com/hashicorp/terraform-plugin-testing/helper/resource" - "github.com/hashicorp/terraform-plugin-testing/terraform" "github.com/hashicorp/terraform-plugin-testing/tfversion" ) -func TestAcc_Databases(t *testing.T) { +func TestAcc_Databases_Complete(t *testing.T) { databaseName := acc.TestClient().Ids.Alpha() comment := random.Comment() + + configVariables := config.Variables{ + "name": config.StringVariable(databaseName), + "comment": config.StringVariable(comment), + "account_identifier": config.StringVariable(strconv.Quote(acc.SecondaryTestClient().Account.GetAccountIdentifier(t).FullyQualifiedName())), + } + resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, TerraformVersionChecks: []tfversion.TerraformVersionCheck{ tfversion.RequireAbove(tfversion.Version1_5_0), }, - CheckDestroy: 
nil, + CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ { - Config: databases(databaseName, comment), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Databases/optionals_set"), + ConfigVariables: configVariables, Check: resource.ComposeTestCheckFunc( - checkDatabases(databaseName, comment), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.#", "1"), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.created_on"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.name", databaseName), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.kind", "STANDARD"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.is_transient", "false"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.is_default", "false"), + // Commenting as this value depends on the currently used database, which is different when running as a single test and multiple tests (e.g., on CI) + // resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.is_current", "true"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.origin", ""), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.owner"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.comment", comment), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.options", ""), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.retention_time"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.resource_group", ""), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.owner_role_type"), + + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.description.#", "2"), + 
resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.description.0.created_on"), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.description.0.name"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.description.0.kind", "SCHEMA"), + + resource.TestCheckResourceAttrWith("data.snowflake_databases.test", "databases.0.parameters.#", acc.IsGreaterOrEqualTo(10)), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.parameters.0.key"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.parameters.0.value", ""), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.parameters.0.default", ""), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.parameters.0.level", ""), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.parameters.0.description"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Databases/optionals_unset"), + ConfigVariables: configVariables, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.#", "1"), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.created_on"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.name", databaseName), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.kind", "STANDARD"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.is_transient", "false"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.is_default", "false"), + // Commenting for the same reason as above + // resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.is_current", "false"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.origin", ""), + 
resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.owner"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.comment", comment), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.options", ""), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.retention_time"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.resource_group", ""), + resource.TestCheckResourceAttrSet("data.snowflake_databases.test", "databases.0.owner_role_type"), + + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.description.#", "0"), + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.0.parameters.#", "0"), ), }, }, }) } -func databases(databaseName, comment string) string { - return fmt.Sprintf(` - resource snowflake_database "test_database" { - name = "%v" - comment = "%v" - } - data snowflake_databases "t" { - depends_on = [snowflake_database.test_database] - } - `, databaseName, comment) -} +func TestAcc_Databases_DifferentFiltering(t *testing.T) { + prefix := random.String() + idOne := acc.TestClient().Ids.RandomAccountObjectIdentifierWithPrefix(prefix) + idTwo := acc.TestClient().Ids.RandomAccountObjectIdentifierWithPrefix(prefix) + idThree := acc.TestClient().Ids.RandomAccountObjectIdentifier() + + commonVariables := config.Variables{ + "name_1": config.StringVariable(idOne.Name()), + "name_2": config.StringVariable(idTwo.Name()), + "name_3": config.StringVariable(idThree.Name()), + } + + likeConfig := config.Variables{ + "like": config.StringVariable(idOne.Name()), + } + maps.Copy(likeConfig, commonVariables) -func checkDatabases(databaseName string, comment string) resource.TestCheckFunc { - return func(s *terraform.State) error { - resourceState := s.Modules[0].Resources["data.snowflake_databases.t"] - if resourceState == nil { - return fmt.Errorf("resource not found in state") - } - 
instanceState := resourceState.Primary - if instanceState == nil { - return fmt.Errorf("resource has no primary instance") - } - if instanceState.ID != "databases_read" { - return fmt.Errorf("expected ID to be 'databases_read', got %s", instanceState.ID) - } - nDbs, err := strconv.Atoi(instanceState.Attributes["databases.#"]) - if err != nil { - return fmt.Errorf("expected a number for field 'databases', got %s", instanceState.Attributes["databases.#"]) - } - if nDbs == 0 { - return fmt.Errorf("expected databases to be greater or equal to 1, got %s", instanceState.Attributes["databases.#"]) - } - dbIdx := -1 - for i := 0; i < nDbs; i++ { - idxName := fmt.Sprintf("databases.%d.name", i) - if instanceState.Attributes[idxName] == databaseName { - dbIdx = i - break - } - } - if dbIdx == -1 { - return fmt.Errorf("database %s not found", databaseName) - } - idxComment := fmt.Sprintf("databases.%d.comment", dbIdx) - if instanceState.Attributes[idxComment] != comment { - return fmt.Errorf("expected comment '%s', got '%s'", comment, instanceState.Attributes[idxComment]) - } - idxCreatedOn := fmt.Sprintf("databases.%d.created_on", dbIdx) - if instanceState.Attributes[idxCreatedOn] == "" { - return fmt.Errorf("expected 'created_on' to be set") - } - idxOwner := fmt.Sprintf("databases.%d.owner", dbIdx) - if instanceState.Attributes[idxOwner] == "" { - return fmt.Errorf("expected 'owner' to be set") - } - idxRetentionTime := fmt.Sprintf("databases.%d.retention_time", dbIdx) - if instanceState.Attributes[idxRetentionTime] == "" { - return fmt.Errorf("expected 'retention_time' to be set") - } - idxIsCurrent := fmt.Sprintf("databases.%d.is_current", dbIdx) - if instanceState.Attributes[idxIsCurrent] == "" { - return fmt.Errorf("expected 'is_current' to be set") - } - idxIsDefault := fmt.Sprintf("databases.%d.is_default", dbIdx) - if instanceState.Attributes[idxIsDefault] == "" { - return fmt.Errorf("expected 'is_default' to be set") - } - return nil + startsWithConfig := 
config.Variables{ + "starts_with": config.StringVariable(prefix), } + maps.Copy(startsWithConfig, commonVariables) + + limitConfig := config.Variables{ + "rows": config.IntegerVariable(1), + "from": config.StringVariable(prefix), + } + maps.Copy(limitConfig, commonVariables) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Databases/like"), + ConfigVariables: likeConfig, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.#", "1"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Databases/starts_with"), + ConfigVariables: startsWithConfig, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.#", "2"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Databases/limit"), + ConfigVariables: limitConfig, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("data.snowflake_databases.test", "databases.#", "1"), + ), + }, + }, + }) +} + +func TestAcc_Databases_DatabaseNotFound_WithPostConditions(t *testing.T) { + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + Steps: []resource.TestStep{ + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Databases/without_database"), + ExpectError: regexp.MustCompile("there should be at least one database"), + }, + }, + }) } diff --git a/pkg/datasources/testdata/TestAcc_Databases/like/test.tf b/pkg/datasources/testdata/TestAcc_Databases/like/test.tf new file mode 
100644 index 0000000000..7b7b0425ba --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/like/test.tf @@ -0,0 +1,16 @@ +resource "snowflake_database" "test_1" { + name = var.name_1 +} + +resource "snowflake_database" "test_2" { + name = var.name_2 +} + +resource "snowflake_database" "test_3" { + name = var.name_3 +} + +data "snowflake_databases" "test" { + depends_on = [snowflake_database.test_1, snowflake_database.test_2, snowflake_database.test_3] + like = var.like +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/like/variables.tf b/pkg/datasources/testdata/TestAcc_Databases/like/variables.tf new file mode 100644 index 0000000000..6bd0278080 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/like/variables.tf @@ -0,0 +1,15 @@ +variable "name_1" { + type = string +} + +variable "name_2" { + type = string +} + +variable "name_3" { + type = string +} + +variable "like" { + type = string +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/limit/test.tf b/pkg/datasources/testdata/TestAcc_Databases/limit/test.tf new file mode 100644 index 0000000000..d7e6184c0f --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/limit/test.tf @@ -0,0 +1,19 @@ +resource "snowflake_database" "test_1" { + name = var.name_1 +} + +resource "snowflake_database" "test_2" { + name = var.name_2 +} + +resource "snowflake_database" "test_3" { + name = var.name_3 +} + +data "snowflake_databases" "test" { + depends_on = [snowflake_database.test_1, snowflake_database.test_2, snowflake_database.test_3] + limit { + rows = var.rows + from = var.from + } +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/limit/variables.tf b/pkg/datasources/testdata/TestAcc_Databases/limit/variables.tf new file mode 100644 index 0000000000..989508a9ce --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/limit/variables.tf @@ -0,0 +1,19 @@ +variable "name_1" { + type = string +} + +variable "name_2" { + type = string +} + +variable "name_3" { + type = 
string +} + +variable "rows" { + type = number +} + +variable "from" { + type = string +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/optionals_set/test.tf b/pkg/datasources/testdata/TestAcc_Databases/optionals_set/test.tf new file mode 100644 index 0000000000..c7b1aa7069 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/optionals_set/test.tf @@ -0,0 +1,20 @@ +resource "snowflake_database" "test" { + name = var.name + comment = var.comment + replication { + enable_to_account { + account_identifier = var.account_identifier + with_failover = true + } + ignore_edition_check = true + } +} + +data "snowflake_databases" "test" { + depends_on = [snowflake_database.test] + like = var.name + starts_with = var.name + limit { + rows = 1 + } +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/optionals_set/variables.tf b/pkg/datasources/testdata/TestAcc_Databases/optionals_set/variables.tf new file mode 100644 index 0000000000..ea75b95f23 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/optionals_set/variables.tf @@ -0,0 +1,11 @@ +variable "name" { + type = string +} + +variable "account_identifier" { + type = string +} + +variable "comment" { + type = string +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/optionals_unset/test.tf b/pkg/datasources/testdata/TestAcc_Databases/optionals_unset/test.tf new file mode 100644 index 0000000000..c640dcafc0 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/optionals_unset/test.tf @@ -0,0 +1,22 @@ +resource "snowflake_database" "test" { + name = var.name + comment = var.comment + replication { + enable_to_account { + account_identifier = var.account_identifier + with_failover = true + } + ignore_edition_check = true + } +} + +data "snowflake_databases" "test" { + with_describe = false + with_parameters = false + depends_on = [snowflake_database.test] + like = var.name + starts_with = var.name + limit { + rows = 1 + } +} diff --git 
a/pkg/datasources/testdata/TestAcc_Databases/optionals_unset/variables.tf b/pkg/datasources/testdata/TestAcc_Databases/optionals_unset/variables.tf new file mode 100644 index 0000000000..ea75b95f23 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/optionals_unset/variables.tf @@ -0,0 +1,11 @@ +variable "name" { + type = string +} + +variable "account_identifier" { + type = string +} + +variable "comment" { + type = string +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/starts_with/test.tf b/pkg/datasources/testdata/TestAcc_Databases/starts_with/test.tf new file mode 100644 index 0000000000..003a25d1a4 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/starts_with/test.tf @@ -0,0 +1,16 @@ +resource "snowflake_database" "test_1" { + name = var.name_1 +} + +resource "snowflake_database" "test_2" { + name = var.name_2 +} + +resource "snowflake_database" "test_3" { + name = var.name_3 +} + +data "snowflake_databases" "test" { + depends_on = [snowflake_database.test_1, snowflake_database.test_2, snowflake_database.test_3] + starts_with = var.starts_with +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/starts_with/variables.tf b/pkg/datasources/testdata/TestAcc_Databases/starts_with/variables.tf new file mode 100644 index 0000000000..a4044d2176 --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/starts_with/variables.tf @@ -0,0 +1,15 @@ +variable "name_1" { + type = string +} + +variable "name_2" { + type = string +} + +variable "name_3" { + type = string +} + +variable "starts_with" { + type = string +} diff --git a/pkg/datasources/testdata/TestAcc_Databases/without_database/test.tf b/pkg/datasources/testdata/TestAcc_Databases/without_database/test.tf new file mode 100644 index 0000000000..5fe341159d --- /dev/null +++ b/pkg/datasources/testdata/TestAcc_Databases/without_database/test.tf @@ -0,0 +1,10 @@ +data "snowflake_databases" "test" { + like = "non-existing-database" + + lifecycle { + postcondition { + 
condition = length(self.databases) > 0 + error_message = "there should be at least one database" + } + } +} diff --git a/pkg/internal/provider/docs/doc_helpers.go b/pkg/internal/provider/docs/doc_helpers.go index 7016300e36..82877a8b8b 100644 --- a/pkg/internal/provider/docs/doc_helpers.go +++ b/pkg/internal/provider/docs/doc_helpers.go @@ -8,6 +8,7 @@ import ( // deprecationMessageRegex is the message that should be used in resource/datasource DeprecationMessage to get a nice link in the documentation to the replacing resource. var deprecationMessageRegex = regexp.MustCompile(`Please use (snowflake_(\w+)) instead.`) +// TODO(SNOW-1465227): Should detect more than one replacements // GetDeprecatedResourceReplacement allows us to get resource replacement based on the regex deprecationMessageRegex func GetDeprecatedResourceReplacement(deprecationMessage string) (replacement string, replacementPage string, ok bool) { resourceReplacement := deprecationMessageRegex.FindStringSubmatch(deprecationMessage) diff --git a/pkg/provider/provider.go b/pkg/provider/provider.go index 457f3e0b14..e6922f1155 100644 --- a/pkg/provider/provider.go +++ b/pkg/provider/provider.go @@ -456,6 +456,7 @@ func getResources() map[string]*schema.Resource { "snowflake_account_parameter": resources.AccountParameter(), "snowflake_alert": resources.Alert(), "snowflake_api_integration": resources.APIIntegration(), + "snowflake_database_old": resources.DatabaseOld(), "snowflake_database": resources.Database(), "snowflake_database_role": resources.DatabaseRole(), "snowflake_dynamic_table": resources.DynamicTable(), @@ -494,9 +495,11 @@ func getResources() map[string]*schema.Resource { "snowflake_saml_integration": resources.SAMLIntegration(), "snowflake_schema": resources.Schema(), "snowflake_scim_integration": resources.SCIMIntegration(), + "snowflake_secondary_database": resources.SecondaryDatabase(), "snowflake_sequence": resources.Sequence(), "snowflake_session_parameter": 
resources.SessionParameter(), "snowflake_share": resources.Share(), + "snowflake_shared_database": resources.SharedDatabase(), "snowflake_stage": resources.Stage(), "snowflake_storage_integration": resources.StorageIntegration(), "snowflake_stream": resources.Stream(), diff --git a/pkg/provider/resources/resources.go b/pkg/provider/resources/resources.go index bd97c4e572..81172b32a6 100644 --- a/pkg/provider/resources/resources.go +++ b/pkg/provider/resources/resources.go @@ -6,6 +6,7 @@ const ( Account resource = "snowflake_account" Alert resource = "snowflake_alert" ApiIntegration resource = "snowflake_api_integration" + DatabaseOld resource = "snowflake_database_old" Database resource = "snowflake_database" DatabaseRole resource = "snowflake_database_role" DynamicTable resource = "snowflake_dynamic_table" diff --git a/pkg/resources/custom_diffs.go b/pkg/resources/custom_diffs.go index 3205522ced..a3f338bed1 100644 --- a/pkg/resources/custom_diffs.go +++ b/pkg/resources/custom_diffs.go @@ -2,51 +2,63 @@ package resources import ( "context" + "log" "strconv" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" + + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" ) -// NestedIntValueAccountObjectComputedIf is NestedValueComputedIf, -// but dedicated for account level objects with integer-typed properties. 
-func NestedIntValueAccountObjectComputedIf(key string, parameter sdk.AccountParameter) schema.CustomizeDiffFunc { - return NestedValueComputedIf( - key, - func(client *sdk.Client) (*sdk.Parameter, error) { - return client.Parameters.ShowAccountParameter(context.Background(), parameter) - }, - func(v any) string { return strconv.Itoa(v.(int)) }, - ) +func StringParameterValueComputedIf(key string, params []*sdk.Parameter, parameterLevel sdk.ParameterType, parameter sdk.AccountParameter) schema.CustomizeDiffFunc { + return ParameterValueComputedIf(key, params, parameterLevel, parameter, func(value any) string { return value.(string) }) } -// NestedValueComputedIf internally calls schema.ResourceDiff.SetNewComputed whenever the inner function returns true. -// It's main purpose was to use it with hierarchical values that are marked with Computed and Optional. Such values should -// be recomputed whenever the value is not in the configuration and the remote value is not equal to the value in state. 
-func NestedValueComputedIf(key string, showParam func(client *sdk.Client) (*sdk.Parameter, error), valueToString func(v any) string) schema.CustomizeDiffFunc { - return customdiff.ComputedIf(key, func(ctx context.Context, d *schema.ResourceDiff, meta interface{}) bool { - configValue, ok := d.GetRawConfig().AsValueMap()[key] - if ok && len(configValue.AsValueSlice()) == 1 { - return false - } +func IntParameterValueComputedIf(key string, params []*sdk.Parameter, parameterLevel sdk.ParameterType, parameter sdk.AccountParameter) schema.CustomizeDiffFunc { + return ParameterValueComputedIf(key, params, parameterLevel, parameter, func(value any) string { return strconv.Itoa(value.(int)) }) +} - client := meta.(*provider.Context).Client +func BoolParameterValueComputedIf(key string, params []*sdk.Parameter, parameterLevel sdk.ParameterType, parameter sdk.AccountParameter) schema.CustomizeDiffFunc { + return ParameterValueComputedIf(key, params, parameterLevel, parameter, func(value any) string { return strconv.FormatBool(value.(bool)) }) +} - param, err := showParam(client) +func ParameterValueComputedIf(key string, parameters []*sdk.Parameter, objectParameterLevel sdk.ParameterType, accountParameter sdk.AccountParameter, valueToString func(v any) string) schema.CustomizeDiffFunc { + return func(ctx context.Context, d *schema.ResourceDiff, meta any) error { + foundParameter, err := collections.FindOne(parameters, func(parameter *sdk.Parameter) bool { return parameter.Key == string(accountParameter) }) if err != nil { - return false + log.Printf("[WARN] failed to find account parameter: %s", accountParameter) + return nil } + parameter := *foundParameter - stateValue := d.Get(key).([]any) - if len(stateValue) != 1 { - return false + configValue, ok := d.GetRawConfig().AsValueMap()[key] + + // For cases where currently set value (in the config) is equal to the parameter, but not set on the right level. 
+ // The parameter is set somewhere higher in the hierarchy, and we need to "forcefully" set the value to + // perform the actual set on Snowflake (and set the parameter on the correct level). + if ok && !configValue.IsNull() && parameter.Level != objectParameterLevel && parameter.Value == valueToString(d.Get(key)) { + return d.SetNewComputed(key) } - return param.Value != valueToString(stateValue[0].(map[string]any)["value"]) - }) + // For all other cases, if a parameter is set in the configuration, we can ignore parts needed for Computed fields. + if ok && !configValue.IsNull() { + return nil + } + + // If the configuration is not set, perform SetNewComputed for cases like: + // 1. Check if the parameter value differs from the one saved in state (if they differ, we'll update the computed value). + // 2. Check if the parameter is set on the object level (if so, it means that it was set externally, and we have to unset it). + if parameter.Value != valueToString(d.Get(key)) || parameter.Level == objectParameterLevel { + return d.SetNewComputed(key) + } + + return nil + } } func BoolComputedIf(key string, getDefault func(client *sdk.Client, id sdk.AccountObjectIdentifier) (string, error)) schema.CustomizeDiffFunc { diff --git a/pkg/resources/custom_diffs_test.go b/pkg/resources/custom_diffs_test.go index e535af6278..17bcc3ae90 100644 --- a/pkg/resources/custom_diffs_test.go +++ b/pkg/resources/custom_diffs_test.go @@ -16,116 +16,108 @@ import ( "github.com/stretchr/testify/require" ) -func TestNestedValueComputedIf(t *testing.T) { - customDiff := resources.NestedValueComputedIf( - "nested_value", - func(client *sdk.Client) (*sdk.Parameter, error) { - return &sdk.Parameter{ - Key: "Parameter", - Value: "snow-value", - }, nil - }, - func(v any) string { return v.(string) }, - ) - providerConfig := createProviderWithNestedValueAndCustomDiff(t, schema.TypeString, customDiff) +func TestParameterValueComputedIf(t *testing.T) { + createProviderConfig := 
func(parameterLevel sdk.ParameterType, parameterValue sdk.LogLevel) *schema.Provider { + customDiff := resources.ParameterValueComputedIf( + "value", + []*sdk.Parameter{ + { + Key: string(sdk.AccountParameterLogLevel), + Level: parameterLevel, + Value: string(parameterValue), + }, + }, + sdk.ParameterTypeDatabase, + sdk.AccountParameterLogLevel, + func(v any) string { return v.(string) }, + ) + return createProviderWithValuePropertyAndCustomDiff(t, schema.TypeString, customDiff) + } - t.Run("value set in the configuration and state", func(t *testing.T) { + t.Run("config: true - state: true - level: different - value: same", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeAccount, sdk.LogLevelInfo) diff := calculateDiff(t, providerConfig, cty.MapVal(map[string]cty.Value{ - "nested_value": cty.ListVal([]cty.Value{ - cty.MapVal(map[string]cty.Value{ - "value": cty.NumberIntVal(123), - }), - }), + "value": cty.StringVal(string(sdk.LogLevelInfo)), }), map[string]any{ - "nested_value": []any{ - map[string]any{ - "value": 123, - }, - }, + "value": string(sdk.LogLevelInfo), }) - assert.False(t, diff.Attributes["nested_value.#"].NewComputed) + assert.True(t, diff.Attributes["value"].NewComputed) }) - t.Run("value set only in the configuration", func(t *testing.T) { + t.Run("config: true - state: true - level: different - value: different", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeAccount, sdk.LogLevelDebug) diff := calculateDiff(t, providerConfig, cty.MapVal(map[string]cty.Value{ - "nested_value": cty.ListVal([]cty.Value{ - cty.MapVal(map[string]cty.Value{ - "value": cty.NumberIntVal(123), - }), - }), - }), map[string]any{}) - assert.True(t, diff.Attributes["nested_value.#"].NewComputed) + "value": cty.StringVal(string(sdk.LogLevelInfo)), + }), map[string]any{ + "value": string(sdk.LogLevelInfo), + }) + assert.False(t, diff.Attributes["value"].NewComputed) }) - t.Run("value set in the state and not equals 
with parameter", func(t *testing.T) { - diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.Type{}), map[string]any{ - "nested_value": []any{ - map[string]any{ - "value": "value-to-change", - }, - }, + t.Run("config: true - state: true - level: same - value: same", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeDatabase, sdk.LogLevelInfo) + diff := calculateDiff(t, providerConfig, cty.MapVal(map[string]cty.Value{ + "value": cty.StringVal(string(sdk.LogLevelInfo)), + }), map[string]any{ + "value": string(sdk.LogLevelInfo), }) - assert.True(t, diff.Attributes["nested_value.#"].NewComputed) + assert.False(t, diff.Attributes["value"].NewComputed) }) - t.Run("value set in the state and equals with parameter", func(t *testing.T) { - diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.Type{}), map[string]any{ - "nested_value": []any{ - map[string]any{ - "value": "snow-value", - }, - }, + t.Run("config: true - state: true - level: same - value: different", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeDatabase, sdk.LogLevelDebug) + diff := calculateDiff(t, providerConfig, cty.MapVal(map[string]cty.Value{ + "value": cty.StringVal(string(sdk.LogLevelInfo)), + }), map[string]any{ + "value": string(sdk.LogLevelInfo), }) - assert.False(t, diff.Attributes["nested_value.#"].NewComputed) + assert.False(t, diff.Attributes["value"].NewComputed) }) -} -func TestNestedIntValueAccountObjectComputedIf(t *testing.T) { - providerConfig := createProviderWithNestedValueAndCustomDiff(t, schema.TypeInt, resources.NestedIntValueAccountObjectComputedIf("nested_value", sdk.AccountParameterDataRetentionTimeInDays)) + t.Run("config: false - state: true - level: different - value: same", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeAccount, sdk.LogLevelInfo) + diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.String), map[string]any{ + "value": string(sdk.LogLevelInfo), + 
}) + assert.False(t, diff.Attributes["value"].NewComputed) + }) - t.Run("different value than on the Snowflake side", func(t *testing.T) { - diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.Type{}), map[string]any{ - "nested_value": []any{ - map[string]any{ - "value": 999, // value outside of valid range - }, - }, + t.Run("config: false - state: true - level: different - value: different", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeAccount, sdk.LogLevelDebug) + diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.String), map[string]any{ + "value": string(sdk.LogLevelInfo), }) - assert.True(t, diff.Attributes["nested_value.#"].NewComputed) + assert.True(t, diff.Attributes["value"].NewComputed) }) - t.Run("same value as in Snowflake", func(t *testing.T) { - dataRetentionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), sdk.AccountParameterDataRetentionTimeInDays) - require.NoError(t, err) + t.Run("config: false - state: true - level: same - value: same", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeAccount, sdk.LogLevelInfo) + diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.String), map[string]any{ + "value": string(sdk.LogLevelInfo), + }) + assert.False(t, diff.Attributes["value"].NewComputed) + }) - diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.Type{}), map[string]any{ - "nested_value": []any{ - map[string]any{ - "value": dataRetentionTimeInDays.Value, - }, - }, + t.Run("config: false - state: true - level: same - value: different", func(t *testing.T) { + providerConfig := createProviderConfig(sdk.ParameterTypeAccount, sdk.LogLevelDebug) + diff := calculateDiff(t, providerConfig, cty.MapValEmpty(cty.String), map[string]any{ + "value": string(sdk.LogLevelInfo), }) - assert.False(t, diff.Attributes["nested_value.#"].NewComputed) + assert.True(t, diff.Attributes["value"].NewComputed) }) + + // Tests for filled 
config and empty state were not added as the only way + // of getting into this situation would be in create operation for which custom diffs are skipped. } -func createProviderWithNestedValueAndCustomDiff(t *testing.T, valueType schema.ValueType, customDiffFunc schema.CustomizeDiffFunc) *schema.Provider { +func createProviderWithValuePropertyAndCustomDiff(t *testing.T, valueType schema.ValueType, customDiffFunc schema.CustomizeDiffFunc) *schema.Provider { t.Helper() return &schema.Provider{ ResourcesMap: map[string]*schema.Resource{ "test": { Schema: map[string]*schema.Schema{ - "nested_value": { - Type: schema.TypeList, - MaxItems: 1, - Elem: &schema.Resource{ - Schema: map[string]*schema.Schema{ - "value": { - Type: valueType, - Required: true, - }, - }, - }, + "value": { + Type: valueType, Computed: true, Optional: true, }, diff --git a/pkg/resources/database.go b/pkg/resources/database.go index d6fd261400..921c407a66 100644 --- a/pkg/resources/database.go +++ b/pkg/resources/database.go @@ -2,395 +2,470 @@ package resources import ( "context" + "errors" "fmt" - "log" "slices" - "strconv" + "strings" + + "github.com/hashicorp/go-cty/cty" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" ) var databaseSchema = map[string]*schema.Schema{ "name": { Type: schema.TypeString, Required: true, - Description: "Specifies the identifier for the database; must be unique for your account.", - }, - "comment": { - Type: schema.TypeString, - Optional: true, - Default: "", - Description: "Specifies a comment for the database.", + Description: "Specifies the identifier for the 
database; must be unique for your account. As a best practice for [Database Replication and Failover](https://docs.snowflake.com/en/user-guide/db-replication-intro), it is recommended to give each secondary database the same name as its primary database. This practice supports referencing fully-qualified objects (i.e. '<db>.<schema>.<object>') by other objects in the same database, such as querying a fully-qualified table name in a view. If a secondary database has a different name from the primary database, then these object references would break in the secondary database.", }, "is_transient": { Type: schema.TypeBool, Optional: true, - Default: false, - Description: "Specifies a database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss.", ForceNew: true, + Description: "Specifies the database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss.", }, - "data_retention_time_in_days": { - Type: schema.TypeInt, - Optional: true, - Default: -1, - Description: "Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).", - ValidateFunc: validation.IntBetween(-1, 90), - }, - "from_share": { - Type: schema.TypeMap, - Elem: &schema.Schema{Type: schema.TypeString}, - Description: "Specify a provider and a share in this map to create a database from a share. 
As of version 0.87.0, the provider field is the account locator.", - Optional: true, - ForceNew: true, - ConflictsWith: []string{"from_database", "from_replica"}, - }, - "from_database": { - Type: schema.TypeString, - Description: "Specify a database to create a clone from.", - Optional: true, - ForceNew: true, - ConflictsWith: []string{"from_share", "from_replica"}, - }, - "from_replica": { - Type: schema.TypeString, - Description: "Specify a fully-qualified path to a database to create a replica from. A fully qualified path follows the format of `\"<organization_name>\".\"<account_name>\".\"<db_name>\"`. An example would be: `\"myorg1\".\"account1\".\"db1\"`", - Optional: true, - ForceNew: true, - ConflictsWith: []string{"from_share", "from_database"}, - }, - // TODO: Add accounts for replication (it will promote local database to serve as a primary database for replication). - // "accounts for replication": { - // Type: schema.TypeList, - // Required: true, - // MinItems: 1, - // Elem: &schema.Schema{ - // Type: schema.TypeString, - // // TODO(ticket-number): Validate account identifiers. - // }, - // // TODO: Desc - // }, - // "accounts for failover": { - // Type: schema.TypeList, - // Required: true, - // MinItems: 1, - // Elem: &schema.Schema{ - // Type: schema.TypeString, - // // TODO(ticket-number): Validate account identifiers. - // }, - // // TODO: Desc - // }, - // "ignore_edition_check": { - // Type: schema.TypeBool, - // // TODO: Desc - // Optional: true, - // }, - "replication_configuration": { + "replication": { Type: schema.TypeList, - Description: "When set, specifies the configurations for database replication.", Optional: true, + Description: "Configures replication for a given database. When specified, this database will be promoted to serve as a primary database for replication. A primary database can be replicated in one or more accounts, allowing users in those accounts to query objects in each secondary (i.e. 
replica) database.", MaxItems: 1, Elem: &schema.Resource{ Schema: map[string]*schema.Schema{ - "accounts": { - Type: schema.TypeList, - Required: true, - MinItems: 1, - Elem: &schema.Schema{Type: schema.TypeString}, + "enable_to_account": { + Type: schema.TypeList, + Required: true, + Description: "Entry to enable replication and optionally failover for a given account identifier.", + MinItems: 1, + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "account_identifier": { + Type: schema.TypeString, + Required: true, + // TODO(SNOW-1438810): Add account identifier validator + Description: "Specifies account identifier for which replication should be enabled. The account identifiers should be in the form of `\"<organization_name>\".\"<account_name>\"`.", + }, + "with_failover": { + Type: schema.TypeBool, + Optional: true, + Description: "Specifies if failover should be enabled for the specified account identifier", + }, + }, + }, + }, "ignore_edition_check": { Type: schema.TypeBool, - Default: true, Optional: true, + Description: "Allows replicating data to accounts on lower editions in either of the following scenarios: " + + "1. The primary database is in a Business Critical (or higher) account but one or more of the accounts approved for replication are on lower editions. Business Critical Edition is intended for Snowflake accounts with extremely sensitive data. " + + "2. The primary database is in a Business Critical (or higher) account and a signed business associate agreement is in place to store PHI data in the account per HIPAA and HITRUST regulations, but no such agreement is in place for one or more of the accounts approved for replication, regardless if they are Business Critical (or higher) accounts. 
" + + "Both scenarios are prohibited by default in an effort to help prevent account administrators for Business Critical (or higher) accounts from inadvertently replicating sensitive data to accounts on lower editions.", }, }, }, }, + "comment": { + Type: schema.TypeString, + Optional: true, + Description: "Specifies a comment for the database.", + }, } -// Database returns a pointer to the resource representing a database. func Database() *schema.Resource { return &schema.Resource{ - Create: CreateDatabase, - Read: ReadDatabase, - Delete: DeleteDatabase, - Update: UpdateDatabase, + SchemaVersion: 1, - Schema: databaseSchema, + CreateContext: CreateDatabase, + ReadContext: ReadDatabase, + DeleteContext: DeleteDatabase, + UpdateContext: UpdateDatabase, + Description: "Represents a standard database. If replication configuration is specified, the database is promoted to serve as a primary database for replication.", + + CustomizeDiff: DatabaseParametersCustomDiff, + Schema: MergeMaps(databaseSchema, DatabaseParametersSchema), Importer: &schema.ResourceImporter{ StateContext: schema.ImportStatePassthroughContext, }, + + StateUpgraders: []schema.StateUpgrader{ + { + Version: 0, + // setting type to cty.EmptyObject is a bit hacky here but following https://developer.hashicorp.com/terraform/plugin/framework/migrating/resources/state-upgrade#sdkv2-1 would require lots of repetitive code; this should work with cty.EmptyObject + Type: cty.EmptyObject, + Upgrade: v092DatabaseStateUpgrader, + }, + }, } } -// CreateDatabase implements schema.CreateFunc. -func CreateDatabase(d *schema.ResourceData, meta interface{}) error { +func CreateDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client - ctx := context.Background() - name := d.Get("name").(string) - id := sdk.NewAccountObjectIdentifier(name) - - // Is it a Shared Database? 
- if fromShare, ok := d.GetOk("from_share"); ok { - account := fromShare.(map[string]interface{})["provider"].(string) - share := fromShare.(map[string]interface{})["share"].(string) - shareID := sdk.NewExternalObjectIdentifier(sdk.NewAccountIdentifierFromAccountLocator(account), sdk.NewAccountObjectIdentifier(share)) - opts := &sdk.CreateSharedDatabaseOptions{} - if v, ok := d.GetOk("comment"); ok { - opts.Comment = sdk.String(v.(string)) - } - err := client.Databases.CreateShared(ctx, id, shareID, opts) - if err != nil { - return fmt.Errorf("error creating database %v: %w", name, err) - } - d.SetId(name) - return ReadDatabase(d, meta) - } - // Is it a Secondary Database? - if primaryName, ok := d.GetOk("from_replica"); ok { - primaryID := sdk.NewExternalObjectIdentifierFromFullyQualifiedName(primaryName.(string)) - opts := &sdk.CreateSecondaryDatabaseOptions{} - if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { - opts.DataRetentionTimeInDays = sdk.Int(v.(int)) - } - err := client.Databases.CreateSecondary(ctx, id, primaryID, opts) - if err != nil { - return fmt.Errorf("error creating database %v: %w", name, err) - } - d.SetId(name) - // todo: add failover_configuration block - return ReadDatabase(d, meta) - } - // Otherwise it is a Standard Database - opts := sdk.CreateDatabaseOptions{} - if v, ok := d.GetOk("comment"); ok { - opts.Comment = sdk.String(v.(string)) + id := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) + + dataRetentionTimeInDays, + maxDataExtensionTimeInDays, + externalVolume, + catalog, + replaceInvalidCharacters, + defaultDDLCollation, + storageSerializationPolicy, + logLevel, + traceLevel, + suspendTaskAfterNumFailures, + taskAutoRetryAttempts, + userTaskManagedInitialWarehouseSize, + userTaskTimeoutMs, + userTaskMinimumTriggerIntervalInSeconds, + quotedIdentifiersIgnoreCase, + enableConsoleOutput, + err := GetAllDatabaseParameters(d) + if err != nil { + return diag.FromErr(err) } - if v, ok := d.GetOk("is_transient"); ok 
&& v.(bool) { - opts.Transient = sdk.Bool(v.(bool)) + err = client.Databases.Create(ctx, id, &sdk.CreateDatabaseOptions{ + Transient: GetPropertyAsPointer[bool](d, "is_transient"), + DataRetentionTimeInDays: dataRetentionTimeInDays, + MaxDataExtensionTimeInDays: maxDataExtensionTimeInDays, + ExternalVolume: externalVolume, + Catalog: catalog, + ReplaceInvalidCharacters: replaceInvalidCharacters, + DefaultDDLCollation: defaultDDLCollation, + StorageSerializationPolicy: storageSerializationPolicy, + LogLevel: logLevel, + TraceLevel: traceLevel, + SuspendTaskAfterNumFailures: suspendTaskAfterNumFailures, + TaskAutoRetryAttempts: taskAutoRetryAttempts, + UserTaskManagedInitialWarehouseSize: userTaskManagedInitialWarehouseSize, + UserTaskTimeoutMs: userTaskTimeoutMs, + UserTaskMinimumTriggerIntervalInSeconds: userTaskMinimumTriggerIntervalInSeconds, + QuotedIdentifiersIgnoreCase: quotedIdentifiersIgnoreCase, + EnableConsoleOutput: enableConsoleOutput, + Comment: GetPropertyAsPointer[string](d, "comment"), + }) + if err != nil { + return diag.FromErr(err) } - if v, ok := d.GetOk("from_database"); ok { - opts.Clone = &sdk.Clone{ - SourceObject: sdk.NewAccountObjectIdentifier(v.(string)), - } - } + d.SetId(helpers.EncodeSnowflakeID(id)) - if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { - opts.DataRetentionTimeInDays = sdk.Int(v.(int)) - } + var diags diag.Diagnostics - err := client.Databases.Create(ctx, id, &opts) - if err != nil { - return fmt.Errorf("error creating database %v: %w", name, err) - } - d.SetId(name) - - if v, ok := d.GetOk("replication_configuration"); ok { - replicationConfiguration := v.([]interface{})[0].(map[string]interface{}) - accounts := replicationConfiguration["accounts"].([]interface{}) - accountIDs := make([]sdk.AccountIdentifier, len(accounts)) - for i, account := range accounts { - accountIDs[i] = sdk.NewAccountIdentifierFromAccountLocator(account.(string)) - } - opts := &sdk.AlterDatabaseReplicationOptions{ - EnableReplication: 
&sdk.EnableReplication{ - ToAccounts: accountIDs, - }, - } - if ignoreEditionCheck, ok := replicationConfiguration["ignore_edition_check"]; ok { - opts.EnableReplication.IgnoreEditionCheck = sdk.Bool(ignoreEditionCheck.(bool)) + if v, ok := d.GetOk("replication"); ok { + replicationConfiguration := v.([]any)[0].(map[string]any) + + var ignoreEditionCheck *bool + if v, ok := replicationConfiguration["ignore_edition_check"]; ok { + ignoreEditionCheck = sdk.Pointer(v.(bool)) } - err := client.Databases.AlterReplication(ctx, id, opts) - if err != nil { - return fmt.Errorf("error enabling replication for database %v: %w", name, err) + + if enableToAccounts, ok := replicationConfiguration["enable_to_account"]; ok { + enableToAccountList := enableToAccounts.([]any) + + if len(enableToAccountList) > 0 { + replicationToAccounts := make([]sdk.AccountIdentifier, 0) + failoverToAccounts := make([]sdk.AccountIdentifier, 0) + + for _, enableToAccount := range enableToAccountList { + accountConfig := enableToAccount.(map[string]any) + accountIdentifier := sdk.NewAccountIdentifierFromFullyQualifiedName(accountConfig["account_identifier"].(string)) + + replicationToAccounts = append(replicationToAccounts, accountIdentifier) + if v, ok := accountConfig["with_failover"]; ok && v.(bool) { + failoverToAccounts = append(failoverToAccounts, accountIdentifier) + } + } + + if len(replicationToAccounts) > 0 { + err := client.Databases.AlterReplication(ctx, id, &sdk.AlterDatabaseReplicationOptions{ + EnableReplication: &sdk.EnableReplication{ + ToAccounts: replicationToAccounts, + IgnoreEditionCheck: ignoreEditionCheck, + }, + }) + if err != nil { + diags = append(diags, diag.Diagnostic{ + Severity: diag.Warning, + Summary: err.Error(), + }) + } + } + + if len(failoverToAccounts) > 0 { + err = client.Databases.AlterFailover(ctx, id, &sdk.AlterDatabaseFailoverOptions{ + EnableFailover: &sdk.EnableFailover{ + ToAccounts: failoverToAccounts, + }, + }) + if err != nil { + diags = append(diags, 
diag.Diagnostic{ + Severity: diag.Warning, + Summary: err.Error(), + }) + } + } + } } } - return ReadDatabase(d, meta) + return append(diags, ReadDatabase(ctx, d, meta)...) } -func ReadDatabase(d *schema.ResourceData, meta interface{}) error { +func UpdateDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client - ctx := context.Background() id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - database, err := client.Databases.ShowByID(ctx, id) - if err != nil { - d.SetId("") - log.Printf("Database %s not found, err = %s", id.Name(), err) - return nil + if d.HasChange("name") { + newId := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) + err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ + NewName: &newId, + }) + if err != nil { + return diag.FromErr(err) + } + d.SetId(helpers.EncodeSnowflakeID(newId)) + id = newId } - if err := d.Set("name", database.Name); err != nil { - return err - } - if err := d.Set("comment", database.Comment); err != nil { - return err - } + databaseSetRequest := new(sdk.DatabaseSet) + databaseUnsetRequest := new(sdk.DatabaseUnset) - dataRetention, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameterDataRetentionTimeInDays) - if err != nil { - return err - } - paramDataRetention, err := strconv.Atoi(dataRetention.Value) - if err != nil { - return err + if updateParamDiags := HandleDatabaseParametersChanges(d, databaseSetRequest, databaseUnsetRequest); len(updateParamDiags) > 0 { + return updateParamDiags } - if dataRetentionDays := d.Get("data_retention_time_in_days"); dataRetentionDays.(int) != -1 || database.RetentionTime != paramDataRetention { - if err := d.Set("data_retention_time_in_days", database.RetentionTime); err != nil { - return err - } - } + if d.HasChange("replication") { + before, after := d.GetChange("replication") - if err := d.Set("is_transient", database.Transient); err != nil { - return err - } 
+ getReplicationConfiguration := func(replicationConfigs []any) (replicationEnabledToAccounts []sdk.AccountIdentifier, failoverEnabledToAccounts []sdk.AccountIdentifier) { + replicationEnabledToAccounts = make([]sdk.AccountIdentifier, 0) + failoverEnabledToAccounts = make([]sdk.AccountIdentifier, 0) - return nil -} + for _, replicationConfigurationMap := range replicationConfigs { + replicationConfiguration := replicationConfigurationMap.(map[string]any) + for _, enableToAccountMap := range replicationConfiguration["enable_to_account"].([]any) { + enableToAccount := enableToAccountMap.(map[string]any) + accountIdentifier := sdk.NewAccountIdentifierFromFullyQualifiedName(enableToAccount["account_identifier"].(string)) -func UpdateDatabase(d *schema.ResourceData, meta interface{}) error { - id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - client := meta.(*provider.Context).Client - ctx := context.Background() + replicationEnabledToAccounts = append(replicationEnabledToAccounts, accountIdentifier) + if enableToAccount["with_failover"].(bool) { + failoverEnabledToAccounts = append(failoverEnabledToAccounts, accountIdentifier) + } + } + } - if d.HasChange("name") { - newName := d.Get("name").(string) - newId := sdk.NewAccountObjectIdentifier(newName) - opts := &sdk.AlterDatabaseOptions{ - NewName: &newId, + return replicationEnabledToAccounts, failoverEnabledToAccounts } - err := client.Databases.Alter(ctx, id, opts) - if err != nil { - return fmt.Errorf("error updating database name on %v err = %w", d.Id(), err) - } - d.SetId(helpers.EncodeSnowflakeID(newId)) - id = newId - } + beforeReplicationEnabledToAccounts, beforeFailoverEnabledToAccounts := getReplicationConfiguration(before.([]any)) + afterReplicationEnabledToAccounts, afterFailoverEnabledToAccounts := getReplicationConfiguration(after.([]any)) - if d.HasChange("comment") { - comment := "" - if c := d.Get("comment"); c != nil { - comment = c.(string) - } - opts := 
&sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - Comment: sdk.String(comment), - }, + addedFailovers, removedFailovers := ListDiff(beforeFailoverEnabledToAccounts, afterFailoverEnabledToAccounts) + addedReplications, removedReplications := ListDiff(beforeReplicationEnabledToAccounts, afterReplicationEnabledToAccounts) + // Failovers will be disabled implicitly by disabled replications + removedFailovers = slices.DeleteFunc(removedFailovers, func(identifier sdk.AccountIdentifier) bool { return slices.Contains(removedReplications, identifier) }) + + if len(addedReplications) > 0 { + err := client.Databases.AlterReplication(ctx, id, &sdk.AlterDatabaseReplicationOptions{ + EnableReplication: &sdk.EnableReplication{ + ToAccounts: addedReplications, + IgnoreEditionCheck: sdk.Bool(d.Get("replication.0.ignore_edition_check").(bool)), + }, + }) + if err != nil { + return diag.FromErr(err) + } } - err := client.Databases.Alter(ctx, id, opts) - if err != nil { - return fmt.Errorf("error updating database comment on %v err = %w", d.Id(), err) + + if len(addedFailovers) > 0 { + err := client.Databases.AlterFailover(ctx, id, &sdk.AlterDatabaseFailoverOptions{ + EnableFailover: &sdk.EnableFailover{ + ToAccounts: addedFailovers, + }, + }) + if err != nil { + return diag.FromErr(err) + } } - } - if d.HasChange("data_retention_time_in_days") { - if days := d.Get("data_retention_time_in_days"); days.(int) != -1 { - err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - DataRetentionTimeInDays: sdk.Int(days.(int)), + if len(removedReplications) > 0 { + err := client.Databases.AlterReplication(ctx, id, &sdk.AlterDatabaseReplicationOptions{ + DisableReplication: &sdk.DisableReplication{ + ToAccounts: removedReplications, }, }) if err != nil { - return fmt.Errorf("error when setting database data retention time on %v err = %w", d.Id(), err) + return diag.FromErr(err) } - } else { - err := client.Databases.Alter(ctx, id, 
&sdk.AlterDatabaseOptions{ - Unset: &sdk.DatabaseUnset{ - DataRetentionTimeInDays: sdk.Bool(true), + } + + if len(removedFailovers) > 0 { + err := client.Databases.AlterFailover(ctx, id, &sdk.AlterDatabaseFailoverOptions{ + DisableFailover: &sdk.DisableFailover{ + ToAccounts: removedFailovers, }, }) if err != nil { - return fmt.Errorf("error when usetting database data retention time on %v err = %w", d.Id(), err) + return diag.FromErr(err) } } } - // If replication configuration changes, need to update accounts that have permission to replicate database - if d.HasChange("replication_configuration") { - oldConfig, newConfig := d.GetChange("replication_configuration") + if d.HasChange("comment") { + comment := d.Get("comment").(string) + if len(comment) > 0 { + databaseSetRequest.Comment = &comment + } else { + databaseUnsetRequest.Comment = sdk.Bool(true) + } + } - newAccountIDs := make([]sdk.AccountIdentifier, 0) - ignoreEditionCheck := false - if len(newConfig.([]interface{})) != 0 { - newAccounts := newConfig.([]interface{})[0].(map[string]interface{})["accounts"].([]interface{}) - for _, account := range newAccounts { - newAccountIDs = append(newAccountIDs, sdk.NewAccountIdentifierFromAccountLocator(account.(string))) - } - ignoreEditionCheck = newConfig.([]interface{})[0].(map[string]interface{})["ignore_edition_check"].(bool) + if (*databaseSetRequest != sdk.DatabaseSet{}) { + err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ + Set: databaseSetRequest, + }) + if err != nil { + return diag.FromErr(err) } + } - oldAccountIDs := make([]sdk.AccountIdentifier, 0) - if len(oldConfig.([]interface{})) != 0 { - oldAccounts := oldConfig.([]interface{})[0].(map[string]interface{})["accounts"].([]interface{}) - for _, account := range oldAccounts { - oldAccountIDs = append(oldAccountIDs, sdk.NewAccountIdentifierFromAccountLocator(account.(string))) - } + if (*databaseUnsetRequest != sdk.DatabaseUnset{}) { + err := client.Databases.Alter(ctx, id, 
&sdk.AlterDatabaseOptions{ + Unset: databaseUnsetRequest, + }) + if err != nil { + return diag.FromErr(err) + } + } + + return ReadDatabase(ctx, d, meta) +} + +func ReadDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { + client := meta.(*provider.Context).Client + id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) - database, err := client.Databases.ShowByID(ctx, id) - if err != nil { - d.SetId("") - log.Printf("Database %s not found, err = %s", id.Name(), err) - return nil + database, err := client.Databases.ShowByID(ctx, id) + if err != nil { + if errors.Is(err, sdk.ErrObjectNotFound) { + d.SetId("") + return diag.Diagnostics{ + diag.Diagnostic{ + Severity: diag.Warning, + Summary: "Failed to query database. Marking the resource as removed.", + Detail: fmt.Sprintf("DatabaseName: %s, Err: %s", id.FullyQualifiedName(), err), + }, } } + return diag.FromErr(err) + } + + if err := d.Set("name", database.Name); err != nil { + return diag.FromErr(err) + } + + if err := d.Set("is_transient", database.Transient); err != nil { + return diag.FromErr(err) + } + + if err := d.Set("comment", database.Comment); err != nil { + return diag.FromErr(err) + } + + sessionDetails, err := client.ContextFunctions.CurrentSessionDetails(ctx) + if err != nil { + return diag.FromErr(err) + } + + currentAccountIdentifier := sdk.NewAccountIdentifier(sessionDetails.OrganizationName, sessionDetails.AccountName) + replicationDatabases, err := client.ReplicationFunctions.ShowReplicationDatabases(ctx, &sdk.ShowReplicationDatabasesOptions{ + WithPrimary: sdk.Pointer(sdk.NewExternalObjectIdentifier(currentAccountIdentifier, id)), + }) + if err != nil { + return diag.FromErr(err) + } + + if len(replicationDatabases) == 1 { + replicationAllowedToAccounts := make([]sdk.AccountIdentifier, 0) + 
failoverAllowedToAccounts := make([]sdk.AccountIdentifier, 0) - // Find accounts to add - for _, newAccountID := range newAccountIDs { - if !slices.Contains(oldAccountIDs, newAccountID) { - accountsToAdd = append(accountsToAdd, newAccountID) + for _, allowedAccount := range strings.Split(replicationDatabases[0].ReplicationAllowedToAccounts, ",") { + allowedAccountIdentifier := sdk.NewAccountIdentifierFromFullyQualifiedName(strings.TrimSpace(allowedAccount)) + if currentAccountIdentifier.FullyQualifiedName() == allowedAccountIdentifier.FullyQualifiedName() { + continue } + replicationAllowedToAccounts = append(replicationAllowedToAccounts, allowedAccountIdentifier) } - if len(accountsToAdd) > 0 { - opts := &sdk.AlterDatabaseReplicationOptions{ - EnableReplication: &sdk.EnableReplication{ - ToAccounts: accountsToAdd, - }, - } - if ignoreEditionCheck { - opts.EnableReplication.IgnoreEditionCheck = sdk.Bool(ignoreEditionCheck) - } - err := client.Databases.AlterReplication(ctx, id, opts) - if err != nil { - return fmt.Errorf("error enabling replication configuration on %v err = %w", d.Id(), err) + + for _, allowedAccount := range strings.Split(replicationDatabases[0].FailoverAllowedToAccounts, ",") { + allowedAccountIdentifier := sdk.NewAccountIdentifierFromFullyQualifiedName(strings.TrimSpace(allowedAccount)) + if currentAccountIdentifier.FullyQualifiedName() == allowedAccountIdentifier.FullyQualifiedName() { + continue } + failoverAllowedToAccounts = append(failoverAllowedToAccounts, allowedAccountIdentifier) } - if len(accountsToRemove) > 0 { - opts := &sdk.AlterDatabaseReplicationOptions{ - DisableReplication: &sdk.DisableReplication{ - ToAccounts: accountsToRemove, - }, + enableToAccount := make([]map[string]any, 0) + for _, allowedAccount := range replicationAllowedToAccounts { + enableToAccount = append(enableToAccount, map[string]any{ + "account_identifier": allowedAccount.FullyQualifiedName(), + "with_failover": slices.Contains(failoverAllowedToAccounts, 
allowedAccount), + }) + } + + var ignoreEditionCheck bool + if v, ok := d.GetOk("replication.0.ignore_edition_check"); ok { + ignoreEditionCheck = v.(bool) + } + + if len(enableToAccount) == 0 { + err := d.Set("replication", []any{}) + if err != nil { + return diag.FromErr(err) } - err := client.Databases.AlterReplication(ctx, id, opts) + } else { + err := d.Set("replication", []any{ + map[string]any{ + "enable_to_account": enableToAccount, + "ignore_edition_check": ignoreEditionCheck, + }, + }) if err != nil { - return fmt.Errorf("error disabling replication configuration on %v err = %w", d.Id(), err) + return diag.FromErr(err) } } } - return ReadDatabase(d, meta) + databaseParameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: id, + }, + }) + if err != nil { + return diag.FromErr(err) + } + + if diags := HandleDatabaseParameterRead(d, databaseParameters); diags != nil { + return diags + } + + return nil } -func DeleteDatabase(d *schema.ResourceData, meta interface{}) error { +func DeleteDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client - ctx := context.Background() id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + err := client.Databases.Drop(ctx, id, &sdk.DropDatabaseOptions{ IfExists: sdk.Bool(true), }) if err != nil { - return err + return diag.FromErr(err) } + d.SetId("") return nil } diff --git a/pkg/resources/database_acceptance_test.go b/pkg/resources/database_acceptance_test.go index e002d31037..f25ce17db2 100644 --- a/pkg/resources/database_acceptance_test.go +++ b/pkg/resources/database_acceptance_test.go @@ -1,25 +1,62 @@ package resources_test import ( - "context" "fmt" "strconv" "testing" - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs" + + 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/snowflakechecks" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/importchecks" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/planchecks" + tfjson "github.com/hashicorp/terraform-json" + "github.com/hashicorp/terraform-plugin-testing/plancheck" + "github.com/stretchr/testify/require" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-testing/config" "github.com/hashicorp/terraform-plugin-testing/helper/resource" - "github.com/hashicorp/terraform-plugin-testing/plancheck" - "github.com/hashicorp/terraform-plugin-testing/terraform" "github.com/hashicorp/terraform-plugin-testing/tfversion" ) -func TestAcc_DatabaseWithUnderscore(t *testing.T) { - prefix := acc.TestClient().Ids.AlphaWithPrefix("_") +func TestAcc_Database_Basic(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + comment := random.Comment() + + newId := acc.TestClient().Ids.RandomAccountObjectIdentifier() + newComment := random.Comment() + + var ( + accountDataRetentionTimeInDays = new(string) + accountMaxDataExtensionTimeInDays = new(string) + accountExternalVolume = new(string) + accountCatalog = new(string) + accountReplaceInvalidCharacters = new(string) + accountDefaultDdlCollation = new(string) + accountStorageSerializationPolicy = new(string) + accountLogLevel = 
new(string) + accountTraceLevel = new(string) + accountSuspendTaskAfterNumFailures = new(string) + accountTaskAutoRetryAttempts = new(string) + accountUserTaskMangedInitialWarehouseSize = new(string) + accountUserTaskTimeoutMs = new(string) + accountUserTaskMinimumTriggerIntervalInSeconds = new(string) + accountQuotedIdentifiersIgnoreCase = new(string) + accountEnableConsoleOutput = new(string) + ) + + configVariables := func(id sdk.AccountObjectIdentifier, comment string) config.Variables { + return config.Variables{ + "name": config.StringVariable(id.Name()), + "comment": config.StringVariable(comment), + } + } resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, @@ -30,22 +67,150 @@ func TestAcc_DatabaseWithUnderscore(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ { - Config: dbConfig(prefix), + PreConfig: func() { + params := acc.TestClient().Parameter.ShowAccountParameters(t) + *accountDataRetentionTimeInDays = helpers.FindParameter(t, params, sdk.AccountParameterDataRetentionTimeInDays).Value + *accountMaxDataExtensionTimeInDays = helpers.FindParameter(t, params, sdk.AccountParameterMaxDataExtensionTimeInDays).Value + *accountExternalVolume = helpers.FindParameter(t, params, sdk.AccountParameterExternalVolume).Value + *accountCatalog = helpers.FindParameter(t, params, sdk.AccountParameterCatalog).Value + *accountReplaceInvalidCharacters = helpers.FindParameter(t, params, sdk.AccountParameterReplaceInvalidCharacters).Value + *accountDefaultDdlCollation = helpers.FindParameter(t, params, sdk.AccountParameterDefaultDDLCollation).Value + *accountStorageSerializationPolicy = helpers.FindParameter(t, params, sdk.AccountParameterStorageSerializationPolicy).Value + *accountLogLevel = helpers.FindParameter(t, params, sdk.AccountParameterLogLevel).Value + *accountTraceLevel = helpers.FindParameter(t, params, sdk.AccountParameterTraceLevel).Value + 
*accountSuspendTaskAfterNumFailures = helpers.FindParameter(t, params, sdk.AccountParameterSuspendTaskAfterNumFailures).Value + *accountTaskAutoRetryAttempts = helpers.FindParameter(t, params, sdk.AccountParameterTaskAutoRetryAttempts).Value + *accountUserTaskMangedInitialWarehouseSize = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskManagedInitialWarehouseSize).Value + *accountUserTaskTimeoutMs = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskTimeoutMs).Value + *accountUserTaskMinimumTriggerIntervalInSeconds = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds).Value + *accountQuotedIdentifiersIgnoreCase = helpers.FindParameter(t, params, sdk.AccountParameterQuotedIdentifiersIgnoreCase).Value + *accountEnableConsoleOutput = helpers.FindParameter(t, params, sdk.AccountParameterEnableConsoleOutput).Value + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, comment), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", prefix), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test comment"), - resource.TestCheckResourceAttrSet("snowflake_database.db", "data_retention_time_in_days"), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "0"), + + resource.TestCheckResourceAttrPtr("snowflake_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "external_volume", 
accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(newId, newComment), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", newId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + 
resource.TestCheckResourceAttr("snowflake_database.test", "comment", newComment), + + resource.TestCheckResourceAttrPtr("snowflake_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_database.test", 
"enable_console_output", accountEnableConsoleOutput), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(newId, newComment), + ResourceName: "snowflake_database.test", + ImportState: true, + ImportStateVerify: true, + }, }, }) } -func TestAcc_Database(t *testing.T) { - prefix := acc.TestClient().Ids.Alpha() - prefix2 := acc.TestClient().Ids.Alpha() +func TestAcc_Database_ComputedValues(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + comment := random.Comment() - secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) + configVariables := func(id sdk.AccountObjectIdentifier, comment string) config.Variables { + return config.Variables{ + "name": config.StringVariable(id.Name()), + "comment": config.StringVariable(comment), + } + } + + secondaryAccountIdentifier := acc.SecondaryTestClient().Account.GetAccountIdentifier(t).FullyQualifiedName() + + externalVolumeId, externalVolumeCleanup := acc.TestClient().ExternalVolume.Create(t) + t.Cleanup(externalVolumeCleanup) + + catalogId, catalogCleanup := acc.TestClient().CatalogIntegration.Create(t) + t.Cleanup(catalogCleanup) + + var ( + accountDataRetentionTimeInDays = new(string) + accountMaxDataExtensionTimeInDays = new(string) + accountExternalVolume = new(string) + accountCatalog = new(string) + accountReplaceInvalidCharacters = new(string) + accountDefaultDdlCollation = new(string) + accountStorageSerializationPolicy = new(string) + accountLogLevel = new(string) + accountTraceLevel = new(string) + accountSuspendTaskAfterNumFailures = new(string) + accountTaskAutoRetryAttempts = new(string) + accountUserTaskMangedInitialWarehouseSize = new(string) + accountUserTaskTimeoutMs = new(string) + accountUserTaskMinimumTriggerIntervalInSeconds = new(string) + accountQuotedIdentifiersIgnoreCase = new(string) + accountEnableConsoleOutput = new(string) + ) + + completeConfigVariables := config.Variables{ + 
"name": config.StringVariable(id.Name()), + "comment": config.StringVariable(comment), + "transient": config.BoolVariable(false), + "account_identifier": config.StringVariable(secondaryAccountIdentifier), + "with_failover": config.BoolVariable(true), + "ignore_edition_check": config.BoolVariable(true), + "data_retention_time_in_days": config.IntegerVariable(20), + "max_data_extension_time_in_days": config.IntegerVariable(30), + "external_volume": config.StringVariable(externalVolumeId.Name()), + "catalog": config.StringVariable(catalogId.Name()), + "replace_invalid_characters": config.BoolVariable(true), + "default_ddl_collation": config.StringVariable("en_US"), + "storage_serialization_policy": config.StringVariable(string(sdk.StorageSerializationPolicyCompatible)), + "log_level": config.StringVariable(string(sdk.LogLevelInfo)), + "trace_level": config.StringVariable(string(sdk.TraceLevelOnEvent)), + "suspend_task_after_num_failures": config.IntegerVariable(20), + "task_auto_retry_attempts": config.IntegerVariable(20), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeXLarge)), + "user_task_timeout_ms": config.IntegerVariable(1200000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(120), + "quoted_identifiers_ignore_case": config.BoolVariable(true), + "enable_console_output": config.BoolVariable(true), + } resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, @@ -56,58 +221,240 @@ func TestAcc_Database(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ { - Config: dbConfig(prefix), + PreConfig: func() { + params := acc.TestClient().Parameter.ShowAccountParameters(t) + *accountDataRetentionTimeInDays = helpers.FindParameter(t, params, sdk.AccountParameterDataRetentionTimeInDays).Value + *accountMaxDataExtensionTimeInDays = helpers.FindParameter(t, params, 
sdk.AccountParameterMaxDataExtensionTimeInDays).Value + *accountExternalVolume = helpers.FindParameter(t, params, sdk.AccountParameterExternalVolume).Value + *accountCatalog = helpers.FindParameter(t, params, sdk.AccountParameterCatalog).Value + *accountReplaceInvalidCharacters = helpers.FindParameter(t, params, sdk.AccountParameterReplaceInvalidCharacters).Value + *accountDefaultDdlCollation = helpers.FindParameter(t, params, sdk.AccountParameterDefaultDDLCollation).Value + *accountStorageSerializationPolicy = helpers.FindParameter(t, params, sdk.AccountParameterStorageSerializationPolicy).Value + *accountLogLevel = helpers.FindParameter(t, params, sdk.AccountParameterLogLevel).Value + *accountTraceLevel = helpers.FindParameter(t, params, sdk.AccountParameterTraceLevel).Value + *accountSuspendTaskAfterNumFailures = helpers.FindParameter(t, params, sdk.AccountParameterSuspendTaskAfterNumFailures).Value + *accountTaskAutoRetryAttempts = helpers.FindParameter(t, params, sdk.AccountParameterTaskAutoRetryAttempts).Value + *accountUserTaskMangedInitialWarehouseSize = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskManagedInitialWarehouseSize).Value + *accountUserTaskTimeoutMs = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskTimeoutMs).Value + *accountUserTaskMinimumTriggerIntervalInSeconds = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds).Value + *accountQuotedIdentifiersIgnoreCase = helpers.FindParameter(t, params, sdk.AccountParameterQuotedIdentifiersIgnoreCase).Value + *accountEnableConsoleOutput = helpers.FindParameter(t, params, sdk.AccountParameterEnableConsoleOutput).Value + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, comment), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", prefix), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test 
comment"), - resource.TestCheckResourceAttrSet("snowflake_database.db", "data_retention_time_in_days"), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + + resource.TestCheckResourceAttrPtr("snowflake_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_database.test", 
"user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, - // RENAME { - Config: dbConfig(prefix2), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/complete_optionals_set"), + ConfigVariables: completeConfigVariables, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", prefix2), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test comment"), - resource.TestCheckResourceAttrSet("snowflake_database.db", "data_retention_time_in_days"), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "max_data_extension_time_in_days", "30"), + resource.TestCheckResourceAttr("snowflake_database.test", "external_volume", externalVolumeId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "catalog", catalogId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replace_invalid_characters", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyCompatible)), + resource.TestCheckResourceAttr("snowflake_database.test", "log_level", string(sdk.LogLevelInfo)), + resource.TestCheckResourceAttr("snowflake_database.test", "trace_level", string(sdk.TraceLevelOnEvent)), + resource.TestCheckResourceAttr("snowflake_database.test", 
"suspend_task_after_num_failures", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "task_auto_retry_attempts", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_managed_initial_warehouse_size", string(sdk.WarehouseSizeXLarge)), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_timeout_ms", "1200000"), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_minimum_trigger_interval_in_seconds", "120"), + resource.TestCheckResourceAttr("snowflake_database.test", "quoted_identifiers_ignore_case", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "enable_console_output", "true"), ), }, - // CHANGE PROPERTIES { - Config: dbConfig2(prefix2), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, comment), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", prefix2), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test comment 2"), - resource.TestCheckResourceAttr("snowflake_database.db", "data_retention_time_in_days", "3"), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + + resource.TestCheckResourceAttrPtr("snowflake_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), 
+ resource.TestCheckResourceAttrPtr("snowflake_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, - // ADD REPLICATION - // proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2369 error + }, + }) +} + +func TestAcc_Database_Complete(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + secondaryAccountIdentifier := acc.SecondaryTestClient().Account.GetAccountIdentifier(t).FullyQualifiedName() + comment := random.Comment() + + externalVolumeId, externalVolumeCleanup := acc.TestClient().ExternalVolume.Create(t) + t.Cleanup(externalVolumeCleanup) + + catalogId, catalogCleanup := acc.TestClient().CatalogIntegration.Create(t) + t.Cleanup(catalogCleanup) + 
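The `new(string)` pointers filled in `PreConfig` and asserted via `resource.TestCheckResourceAttrPtr` work because the check dereferences the pointer when it runs, after `PreConfig` has queried the live account, not when the check is constructed. A minimal stand-alone sketch of that late-binding pattern, outside the test framework (the helper name `checkAttrPtr` is illustrative, not part of the provider or of terraform-plugin-testing):

```go
package main

import "fmt"

// checkAttrPtr mimics the idea behind resource.TestCheckResourceAttrPtr: the
// expected value is read through a pointer at check time, so it can be filled
// in after the check function has already been built.
func checkAttrPtr(actual string, expected *string) error {
	if actual != *expected {
		return fmt.Errorf("expected %q, got %q", *expected, actual)
	}
	return nil
}

func main() {
	// Allocated before SHOW PARAMETERS has run, like the account* pointers above.
	accountLogLevel := new(string)

	// The closure captures the pointer, not its (still empty) value.
	check := func(actual string) error { return checkAttrPtr(actual, accountLogLevel) }

	// Later, e.g. in PreConfig, the pointer is filled with the live value.
	*accountLogLevel = "INFO"

	fmt.Println(check("INFO"))  // <nil>
	fmt.Println(check("DEBUG")) // mismatch error
}
```

This is why the test can build its `Check` list up front while still comparing against account-level values that only exist once the test is running.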
+ completeConfigVariables := config.Variables{ + "name": config.StringVariable(id.Name()), + "comment": config.StringVariable(comment), + "transient": config.BoolVariable(false), + "account_identifier": config.StringVariable(secondaryAccountIdentifier), + "with_failover": config.BoolVariable(true), + "ignore_edition_check": config.BoolVariable(true), + + "data_retention_time_in_days": config.IntegerVariable(20), + "max_data_extension_time_in_days": config.IntegerVariable(30), + "external_volume": config.StringVariable(externalVolumeId.Name()), + "catalog": config.StringVariable(catalogId.Name()), + "replace_invalid_characters": config.BoolVariable(true), + "default_ddl_collation": config.StringVariable("en_US"), + "storage_serialization_policy": config.StringVariable(string(sdk.StorageSerializationPolicyCompatible)), + "log_level": config.StringVariable(string(sdk.LogLevelInfo)), + "trace_level": config.StringVariable(string(sdk.TraceLevelOnEvent)), + "suspend_task_after_num_failures": config.IntegerVariable(20), + "task_auto_retry_attempts": config.IntegerVariable(20), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeXLarge)), + "user_task_timeout_ms": config.IntegerVariable(1200000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(120), + "quoted_identifiers_ignore_case": config.BoolVariable(true), + "enable_console_output": config.BoolVariable(true), + } + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ { - Config: dbConfigWithReplication(prefix2, secondaryAccountName), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/complete_optionals_set"), + ConfigVariables: 
completeConfigVariables, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", prefix2), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test comment 2"), - resource.TestCheckResourceAttr("snowflake_database.db", "data_retention_time_in_days", "3"), - resource.TestCheckResourceAttr("snowflake_database.db", "replication_configuration.#", "1"), - resource.TestCheckResourceAttr("snowflake_database.db", "replication_configuration.0.accounts.#", "1"), - resource.TestCheckResourceAttr("snowflake_database.db", "replication_configuration.0.accounts.0", secondaryAccountName), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "max_data_extension_time_in_days", "30"), + resource.TestCheckResourceAttr("snowflake_database.test", "external_volume", externalVolumeId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "catalog", catalogId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replace_invalid_characters", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "default_ddl_collation", "en_US"), + resource.TestCheckResourceAttr("snowflake_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyCompatible)), + resource.TestCheckResourceAttr("snowflake_database.test", "log_level", string(sdk.LogLevelInfo)), + resource.TestCheckResourceAttr("snowflake_database.test", "trace_level", string(sdk.TraceLevelOnEvent)), + resource.TestCheckResourceAttr("snowflake_database.test", "suspend_task_after_num_failures", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", 
"task_auto_retry_attempts", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_managed_initial_warehouse_size", string(sdk.WarehouseSizeXLarge)), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_timeout_ms", "1200000"), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_minimum_trigger_interval_in_seconds", "120"), + resource.TestCheckResourceAttr("snowflake_database.test", "quoted_identifiers_ignore_case", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "enable_console_output", "true"), + + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.ignore_edition_check", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.account_identifier", secondaryAccountIdentifier), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.with_failover", "true"), ), }, - // IMPORT { - ResourceName: "snowflake_database.db", + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/complete_optionals_set"), + ConfigVariables: completeConfigVariables, + ResourceName: "snowflake_database.test", ImportState: true, ImportStateVerify: true, - ImportStateVerifyIgnore: []string{"replication_configuration"}, + ImportStateVerifyIgnore: []string{"replication.0.ignore_edition_check"}, }, }, }) } -func TestAcc_DatabaseRemovedOutsideOfTerraform(t *testing.T) { +func TestAcc_Database_Update(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - name := id.Name() + comment := random.Comment() + + newId := acc.TestClient().Ids.RandomAccountObjectIdentifier() + newComment := random.Comment() + + secondaryAccountIdentifier := 
acc.SecondaryTestClient().Account.GetAccountIdentifier(t).FullyQualifiedName() + + externalVolumeId, externalVolumeCleanup := acc.TestClient().ExternalVolume.Create(t) + t.Cleanup(externalVolumeCleanup) + + catalogId, catalogCleanup := acc.TestClient().CatalogIntegration.Create(t) + t.Cleanup(catalogCleanup) + + basicConfigVariables := func(id sdk.AccountObjectIdentifier, comment string) config.Variables { + return config.Variables{ + "name": config.StringVariable(id.Name()), + "comment": config.StringVariable(comment), + } + } + + completeConfigVariables := config.Variables{ + "name": config.StringVariable(newId.Name()), + "comment": config.StringVariable(newComment), + "transient": config.BoolVariable(false), + "account_identifier": config.StringVariable(secondaryAccountIdentifier), + "with_failover": config.BoolVariable(true), + "ignore_edition_check": config.BoolVariable(true), + "data_retention_time_in_days": config.IntegerVariable(20), + "max_data_extension_time_in_days": config.IntegerVariable(30), + "external_volume": config.StringVariable(externalVolumeId.Name()), + "catalog": config.StringVariable(catalogId.Name()), + "replace_invalid_characters": config.BoolVariable(true), + "default_ddl_collation": config.StringVariable("en_US"), + "storage_serialization_policy": config.StringVariable(string(sdk.StorageSerializationPolicyCompatible)), + "log_level": config.StringVariable(string(sdk.LogLevelInfo)), + "trace_level": config.StringVariable(string(sdk.TraceLevelOnEvent)), + "suspend_task_after_num_failures": config.IntegerVariable(20), + "task_auto_retry_attempts": config.IntegerVariable(20), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeXLarge)), + "user_task_timeout_ms": config.IntegerVariable(1200000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(120), + "quoted_identifiers_ignore_case": config.BoolVariable(true), + "enable_console_output": config.BoolVariable(true), + } 
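The helper-created external volume and catalog integration are dropped via `t.Cleanup`, which runs registered functions in last-in-first-out order, so objects created later are removed first. A small stand-alone illustration of that ordering, assuming only a minimal stack type that mimics `testing.T`:

```go
package main

import "fmt"

// cleanupStack mimics testing.T.Cleanup: callbacks run in LIFO order when the
// test finishes, so later-created resources are dropped first.
type cleanupStack struct{ fns []func() }

func (c *cleanupStack) Cleanup(f func()) { c.fns = append(c.fns, f) }

func (c *cleanupStack) runAll() {
	for i := len(c.fns) - 1; i >= 0; i-- {
		c.fns[i]()
	}
}

func main() {
	var t cleanupStack
	t.Cleanup(func() { fmt.Println("drop external volume") })
	t.Cleanup(func() { fmt.Println("drop catalog integration") })
	t.runAll() // prints "drop catalog integration", then "drop external volume"
}
```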
resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, @@ -118,43 +465,118 @@ func TestAcc_DatabaseRemovedOutsideOfTerraform(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ { - ConfigDirectory: config.TestNameDirectory(), - ConfigVariables: map[string]config.Variable{ - "db": config.StringVariable(name), - }, - ConfigPlanChecks: resource.ConfigPlanChecks{ - PreApply: []plancheck.PlanCheck{plancheck.ExpectNonEmptyPlan()}, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: basicConfigVariables(id, comment), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/complete_optionals_set"), + ConfigVariables: completeConfigVariables, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", newId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", newComment), + + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "max_data_extension_time_in_days", "30"), + resource.TestCheckResourceAttr("snowflake_database.test", "external_volume", externalVolumeId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "catalog", catalogId.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replace_invalid_characters", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "default_ddl_collation", "en_US"), + resource.TestCheckResourceAttr("snowflake_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyCompatible)), + resource.TestCheckResourceAttr("snowflake_database.test", "log_level", string(sdk.LogLevelInfo)), + resource.TestCheckResourceAttr("snowflake_database.test", "trace_level", 
string(sdk.TraceLevelOnEvent)), + resource.TestCheckResourceAttr("snowflake_database.test", "suspend_task_after_num_failures", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "task_auto_retry_attempts", "20"), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_managed_initial_warehouse_size", string(sdk.WarehouseSizeXLarge)), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_timeout_ms", "1200000"), + resource.TestCheckResourceAttr("snowflake_database.test", "user_task_minimum_trigger_interval_in_seconds", "120"), + resource.TestCheckResourceAttr("snowflake_database.test", "quoted_identifiers_ignore_case", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "enable_console_output", "true"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: basicConfigVariables(id, comment), + }, + }, + }) +} + +func TestAcc_Database_HierarchicalValues(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + comment := random.Comment() + + configVariables := func(id sdk.AccountObjectIdentifier, comment string) config.Variables { + return config.Variables{ + "name": config.StringVariable(id.Name()), + "comment": config.StringVariable(comment), + } + } + + paramDefault := new(string) + var revertAccountParameterToDefault func() + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + PreConfig: func() { + *paramDefault = acc.TestClient().Parameter.ShowAccountParameter(t, sdk.AccountParameterMaxDataExtensionTimeInDays).Default }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: 
configVariables(id, comment), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", name), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test comment"), - testAccCheckDatabaseExistence(t, id, true), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "max_data_extension_time_in_days", paramDefault), ), }, { - PreConfig: func() { acc.TestClient().Database.DropDatabaseFunc(t, id)() }, - ConfigDirectory: config.TestNameDirectory(), - ConfigVariables: map[string]config.Variable{ - "db": config.StringVariable(name), + PreConfig: func() { + revertAccountParameterToDefault = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterMaxDataExtensionTimeInDays, "50") + t.Cleanup(revertAccountParameterToDefault) }, - ConfigPlanChecks: resource.ConfigPlanChecks{ - PreApply: []plancheck.PlanCheck{plancheck.ExpectNonEmptyPlan()}, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, comment), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "max_data_extension_time_in_days", "50"), + ), + }, + { + PreConfig: func() { + revertAccountParameterToDefault() }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, comment), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", name), - resource.TestCheckResourceAttr("snowflake_database.db", "comment", "test comment"), - testAccCheckDatabaseExistence(t, id, true), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "max_data_extension_time_in_days", paramDefault), ), }, }, }) } -// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2021 -func TestAcc_Database_issue2021(t *testing.T) { - name := acc.TestClient().Ids.Alpha() +func TestAcc_Database_Replication(t 
*testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + secondaryAccountIdentifier := acc.SecondaryTestClient().Account.GetAccountIdentifier(t).FullyQualifiedName() - secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) + configVariables := func(id sdk.AccountObjectIdentifier, withReplication bool, withFailover bool, ignoreEditionCheck bool) config.Variables { + if withReplication { + return config.Variables{ + "name": config.StringVariable(id.Name()), + "account_identifier": config.StringVariable(secondaryAccountIdentifier), + "with_failover": config.BoolVariable(withFailover), + "ignore_edition_check": config.BoolVariable(ignoreEditionCheck), + } + } + return config.Variables{ + "name": config.StringVariable(id.Name()), + "comment": config.StringVariable(""), + } + } resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, @@ -165,33 +587,81 @@ func TestAcc_Database_issue2021(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ { - Config: dbConfigWithReplication(name, secondaryAccountName), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, false, false, false), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "0"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/replication"), + ConfigVariables: configVariables(id, true, true, true), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.ignore_edition_check", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", 
"replication.0.enable_to_account.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.account_identifier", secondaryAccountIdentifier), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.with_failover", "true"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/replication"), + ConfigVariables: configVariables(id, true, false, true), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.db", "name", name), - resource.TestCheckResourceAttr("snowflake_database.db", "replication_configuration.#", "1"), - resource.TestCheckResourceAttr("snowflake_database.db", "replication_configuration.0.accounts.#", "1"), - resource.TestCheckResourceAttr("snowflake_database.db", "replication_configuration.0.accounts.0", secondaryAccountName), - testAccCheckIfDatabaseIsReplicated(t, name), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.ignore_edition_check", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.account_identifier", secondaryAccountIdentifier), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.with_failover", "false"), ), }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/basic"), + ConfigVariables: configVariables(id, false, false, false), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "0"), + ), + }, + { + ConfigDirectory: 
acc.ConfigurationDirectory("TestAcc_Database/replication"), + ConfigVariables: configVariables(id, true, true, true), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.ignore_edition_check", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.account_identifier", secondaryAccountIdentifier), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.with_failover", "true"), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/replication"), + ConfigVariables: configVariables(id, true, true, true), + ResourceName: "snowflake_database.test", + ImportState: true, + ImportStateVerify: true, + ImportStateVerifyIgnore: []string{"replication.0.ignore_edition_check"}, + }, }, }) } -// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356 issue is fixed. 
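The `TestAcc_Database_HierarchicalValues` test above relies on Snowflake's parameter hierarchy: a value set on the database shadows the account-level value, which in turn shadows the system default. A simplified sketch of that resolution order (real resolution has more levels, such as session and user, so treat this as a conceptual model only):

```go
package main

import "fmt"

// resolveParameter sketches the hierarchy exercised above: database-level
// value wins, then account-level, then the system default. nil means "not
// set at this level".
func resolveParameter(dbLevel, accountLevel *string, systemDefault string) string {
	if dbLevel != nil {
		return *dbLevel
	}
	if accountLevel != nil {
		return *accountLevel
	}
	return systemDefault
}

func main() {
	def := "14"
	fmt.Println(resolveParameter(nil, nil, def)) // 14: falls back to the default

	acct := "50"
	fmt.Println(resolveParameter(nil, &acct, def)) // 50: account override applies

	db := "25"
	fmt.Println(resolveParameter(&db, &acct, def)) // 25: database-level value wins
}
```

This is why the test expects the database attribute to track the account parameter while it is unset in config, and to snap back to the default once the account override is reverted.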
-func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { +func TestAcc_Database_IntParameter(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - configVariablesWithoutDatabaseDataRetentionTime := func() config.Variables { - return config.Variables{ - "database": config.StringVariable(id.Name()), - } + databaseBasicConfig := config.Variables{ + "name": config.StringVariable(id.Name()), } - configVariablesWithDatabaseDataRetentionTime := func(databaseDataRetentionTime int) config.Variables { - vars := configVariablesWithoutDatabaseDataRetentionTime() - vars["database_data_retention_time"] = config.IntegerVariable(databaseDataRetentionTime) - return vars + databaseWithIntParameterConfig := func(dataRetentionTimeInDays int) config.Variables { + return config.Variables{ + "name": config.StringVariable(id.Name()), + "data_retention_time_in_days": config.IntegerVariable(dataRetentionTimeInDays), + } } resource.Test(t, resource.TestCase{ @@ -202,87 +672,252 @@ func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { }, CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ + // create with setting one param + { + ConfigVariables: databaseWithIntParameterConfig(50), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/set"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionCreate, nil, sdk.String("50")), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", false), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "50"), + ), + }, + // do not make any change (to check if there is no drift) + { + ConfigVariables: databaseWithIntParameterConfig(50), + 
ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/set"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectEmptyPlan(), + }, + }, + }, + // import when param in config + { + ResourceName: "snowflake_database.test", + ImportState: true, + ConfigVariables: databaseWithIntParameterConfig(50), + ImportStateCheck: importchecks.ComposeImportStateCheck( + importchecks.TestCheckResourceAttrInstanceState(id.Name(), "data_retention_time_in_days", "50"), + ), + }, + // change the param value in config + { + ConfigVariables: databaseWithIntParameterConfig(25), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/set"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionUpdate, sdk.String("50"), sdk.String("25")), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", false), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "25"), + ), + }, + // change param value on account - expect no changes { PreConfig: func() { - revertParameter := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "5") - t.Cleanup(revertParameter) + param := acc.TestClient().Parameter.ShowAccountParameter(t, sdk.AccountParameterDataRetentionTimeInDays) + require.Equal(t, "", string(param.Level)) + revert := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "50") + t.Cleanup(revert) + }, + ConfigVariables: databaseWithIntParameterConfig(25), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/set"), + ConfigPlanChecks: 
resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionNoop, sdk.String("25"), sdk.String("25")), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", false), + plancheck.ExpectEmptyPlan(), + }, }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "-1"), - checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "25"), ), }, + // change the param value externally { PreConfig: func() { - _ = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "10") + // clean after the previous step + acc.TestClient().Parameter.UnsetAccountParameter(t, sdk.AccountParameterDataRetentionTimeInDays) + // update externally + acc.TestClient().Database.UpdateDataRetentionTime(t, id, 50) + }, + ConfigVariables: databaseWithIntParameterConfig(25), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/set"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectDrift("snowflake_database.test", "data_retention_time_in_days", sdk.String("25"), sdk.String("50")), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionUpdate, sdk.String("50"), sdk.String("25")), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", false), + }, }, - 
ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "-1"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "25"), + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, sdk.ParameterTypeDatabase, "25"), + ), + }, + // remove the param from config + { + PreConfig: func() { + param := acc.TestClient().Parameter.ShowAccountParameter(t, sdk.AccountParameterDataRetentionTimeInDays) + require.Equal(t, "", string(param.Level)) + }, + ConfigVariables: databaseBasicConfig, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/unset"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionUpdate, sdk.String("25"), nil), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", true), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "1"), + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, "", "1"), + ), + }, + // import when param not in config (snowflake default) + { + ResourceName: "snowflake_database.test", + ImportState: true, + ConfigVariables: databaseBasicConfig, + ImportStateCheck: importchecks.ComposeImportStateCheck( + importchecks.TestCheckResourceAttrInstanceState(id.Name(), "data_retention_time_in_days", "1"), ), }, + // change the param value in config to snowflake default { - ConfigDirectory: 
acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(5), + ConfigVariables: databaseWithIntParameterConfig(1), + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/set"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionUpdate, sdk.String("1"), nil), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", true), + }, + }, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "5"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 5), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "1"), + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, sdk.ParameterTypeDatabase, "1"), ), }, + // remove the param from config { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(15), + PreConfig: func() { + param := acc.TestClient().Parameter.ShowAccountParameter(t, sdk.AccountParameterDataRetentionTimeInDays) + require.Equal(t, "", string(param.Level)) + }, + ConfigVariables: databaseBasicConfig, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/unset"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionUpdate, sdk.String("1"), nil), + planchecks.ExpectComputed("snowflake_database.test", 
"data_retention_time_in_days", true), + }, + }, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "15"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 15), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "1"), // Database default + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, "", "1"), ), }, + // change param value on account - change expected to be noop { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + PreConfig: func() { + param := acc.TestClient().Parameter.ShowAccountParameter(t, sdk.AccountParameterDataRetentionTimeInDays) + require.Equal(t, "", string(param.Level)) + revert := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "50") + t.Cleanup(revert) + }, + ConfigVariables: databaseBasicConfig, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/unset"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectDrift("snowflake_database.test", "data_retention_time_in_days", sdk.String("1"), sdk.String("50")), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionNoop, sdk.String("50"), sdk.String("50")), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", false), + }, + }, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "-1"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "50"), + 
snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, sdk.ParameterTypeAccount, "50"), ), }, + // import when param not in config (set on account) { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(0), + ResourceName: "snowflake_database.test", + ImportState: true, + ConfigVariables: databaseBasicConfig, + ImportStateCheck: importchecks.ComposeImportStateCheck( + importchecks.TestCheckResourceAttrInstanceState(id.Name(), "data_retention_time_in_days", "50"), + ), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "0"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 0), + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, sdk.ParameterTypeAccount, "50"), ), }, + // change param value on database { - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(3), + PreConfig: func() { + acc.TestClient().Database.UpdateDataRetentionTime(t, id, 50) + }, + ConfigVariables: databaseBasicConfig, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/unset"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionUpdate, sdk.String("50"), nil), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", true), + }, + }, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "3"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 3), + resource.TestCheckResourceAttr("snowflake_database.test", 
"data_retention_time_in_days", "50"), + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, sdk.ParameterTypeAccount, "50"), + ), + }, + // unset param on account + { + PreConfig: func() { + acc.TestClient().Parameter.UnsetAccountParameter(t, sdk.AccountParameterDataRetentionTimeInDays) + }, + ConfigVariables: databaseBasicConfig, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/int_parameter/unset"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "data_retention_time_in_days"), + planchecks.ExpectDrift("snowflake_database.test", "data_retention_time_in_days", sdk.String("50"), sdk.String("1")), + planchecks.ExpectChange("snowflake_database.test", "data_retention_time_in_days", tfjson.ActionNoop, sdk.String("1"), sdk.String("1")), + planchecks.ExpectComputed("snowflake_database.test", "data_retention_time_in_days", false), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "1"), + snowflakechecks.CheckDatabaseDataRetentionTimeInDays(t, id, "", "1"), ), }, }, }) } -// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356 issue is fixed. 
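
The `TestAcc_Database_IntParameter` steps above all hinge on the same question: at which level (database, account, or Snowflake default) is a parameter like `DATA_RETENTION_TIME_IN_DAYS` set, and does that level make a differing value drift or a no-op? A minimal, self-contained sketch of that precedence logic follows — the type and function names here are illustrative only, not the provider's actual `sdk` helpers:

```go
package main

import "fmt"

// ParameterLevel mirrors the idea behind sdk.ParameterType: the level at
// which a Snowflake parameter value was set. Names are illustrative only.
type ParameterLevel string

const (
	LevelDefault  ParameterLevel = ""         // not set anywhere; Snowflake default applies
	LevelAccount  ParameterLevel = "ACCOUNT"  // set on the account
	LevelDatabase ParameterLevel = "DATABASE" // set on the database itself
)

// effectiveValue resolves the value a database actually observes:
// database-level wins over account-level, which wins over the default.
func effectiveValue(level ParameterLevel, dbVal, accountVal, defaultVal int) int {
	switch level {
	case LevelDatabase:
		return dbVal
	case LevelAccount:
		return accountVal
	default:
		return defaultVal
	}
}

// driftExpected reports whether a plan should show a change: only a value
// set explicitly on the database is "owned" by the config, so an
// account-level or default value differing from config means an update.
func driftExpected(level ParameterLevel, configured, observed int) bool {
	return level != LevelDatabase || configured != observed
}

func main() {
	// Mirrors the test step where the account param is 50 and the config wants 25.
	fmt.Println(effectiveValue(LevelAccount, 0, 50, 1)) // 50
	fmt.Println(driftExpected(LevelAccount, 25, 50))    // true: account value is not ours
	fmt.Println(driftExpected(LevelDatabase, 25, 25))   // false: config matches database level
}
```

This is why the test asserts `ExpectComputed(..., true)` only when the config stops setting the parameter: once the database-level value is unset, the effective value falls back to whatever the account or default provides, so it cannot be known at plan time.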
-func TestAcc_Database_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) { +func TestAcc_Database_StringValueSetOnDifferentParameterLevelWithSameValue(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - configVariablesWithoutDatabaseDataRetentionTime := func() config.Variables { - return config.Variables{ - "database": config.StringVariable(id.Name()), - } - } + catalogId, catalogCleanup := acc.TestClient().CatalogIntegration.Create(t) + t.Cleanup(catalogCleanup) - configVariablesWithDatabaseDataRetentionTime := func(databaseDataRetentionTime int) config.Variables { - vars := configVariablesWithoutDatabaseDataRetentionTime() - vars["database_data_retention_time"] = config.IntegerVariable(databaseDataRetentionTime) - return vars + configVariables := config.Variables{ + "name": config.StringVariable(id.Name()), + "catalog": config.StringVariable(catalogId.Name()), } resource.Test(t, resource.TestCase{ @@ -293,158 +928,440 @@ func TestAcc_Database_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing. 
}, CheckDestroy: acc.CheckDestroy(t, resources.Database), Steps: []resource.TestStep{ + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/catalog"), + ConfigVariables: configVariables, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "catalog", catalogId.Name()), + ), + }, { PreConfig: func() { - revertParameter := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "5") - t.Cleanup(revertParameter) + require.Empty(t, acc.TestClient().Parameter.ShowAccountParameter(t, sdk.AccountParameterCatalog).Level) + acc.TestClient().Database.UnsetCatalog(t, id) + t.Cleanup(acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterCatalog, catalogId.Name())) }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + planchecks.PrintPlanDetails("snowflake_database.test", "catalog"), + planchecks.ExpectChange("snowflake_database.test", "catalog", tfjson.ActionUpdate, sdk.String(catalogId.Name()), nil), + planchecks.ExpectComputed("snowflake_database.test", "catalog", true), + }, + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database/catalog"), + ConfigVariables: configVariables, Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "-1"), - checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "catalog", catalogId.Name()), ), }, + }, + }) +} + +func TestAcc_Database_UpgradeWithTheSameFieldsAsInTheOldOne(t *testing.T) 
{ + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + comment := random.Comment() + dataRetentionTimeInDays := new(string) + + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ { - PreConfig: acc.TestClient().Database.UpdateDataRetentionTime(t, id, 20), - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), - ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: databaseStateUpgraderBasic(id, comment), Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "-1"), - checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), ), }, { PreConfig: func() { - _ = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "10") + *dataRetentionTimeInDays = helpers.FindParameter(t, acc.TestClient().Parameter.ShowDatabaseParameters(t, id), sdk.AccountParameterDataRetentionTimeInDays).Value }, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), - ConfigVariables: configVariablesWithDatabaseDataRetentionTime(3), - Check: resource.ComposeTestCheckFunc( - 
resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "3"), - checkAccountAndDatabaseDataRetentionTime(t, id, 10, 3), - ), + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: databaseStateUpgraderBasic(id, comment), ConfigPlanChecks: resource.ConfigPlanChecks{ - PostApplyPostRefresh: []plancheck.PlanCheck{ + PreApply: []plancheck.PlanCheck{ plancheck.ExpectEmptyPlan(), }, }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + resource.TestCheckResourceAttrPtr("snowflake_database.test", "data_retention_time_in_days", dataRetentionTimeInDays), + ), }, }, }) } -func dbConfig(prefix string) string { - s := ` -resource "snowflake_database" "db" { +func databaseStateUpgraderBasic(id sdk.AccountObjectIdentifier, comment string) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { name = "%s" - comment = "test comment" + is_transient = true + comment = "%s" } -` - return fmt.Sprintf(s, prefix) +`, id.Name(), comment) } -func dbConfig2(prefix string) string { - s := ` -resource "snowflake_database" "db" { +func TestAcc_Database_UpgradeWithDataRetentionSet(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + comment := random.Comment() + + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: 
"Snowflake-Labs/snowflake", + }, + }, + Config: databaseStateUpgraderDataRetentionSet(id, comment, 10), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "10"), + ), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: databaseStateUpgraderDataRetentionSet(id, comment, 10), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectEmptyPlan(), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "10"), + ), + }, + }, + }) +} + +func databaseStateUpgraderDataRetentionSet(id sdk.AccountObjectIdentifier, comment string, dataRetention int) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { name = "%s" - comment = "test comment 2" - data_retention_time_in_days = 3 + comment = "%s" + data_retention_time_in_days = %d +} +`, id.Name(), comment, dataRetention) } -` - return fmt.Sprintf(s, prefix) + +func TestAcc_Database_WithReplication(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + secondaryAccountLocator := acc.SecondaryClient(t).GetAccountLocator() + secondaryAccountIdentifier := 
acc.SecondaryTestClient().Account.GetAccountIdentifier(t).FullyQualifiedName() + + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: databaseStateUpgraderWithReplicationOld(id, secondaryAccountLocator), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication_configuration.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication_configuration.0.ignore_edition_check", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication_configuration.0.accounts.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication_configuration.0.accounts.0", secondaryAccountLocator), + ), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: databaseStateUpgraderWithReplicationNew(id, secondaryAccountIdentifier), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + // plancheck.ExpectNonEmptyPlan(), // Account locators have to be changed to the new account identifier format + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", 
"replication.0.ignore_edition_check", "true"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.#", "1"), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.account_identifier", secondaryAccountIdentifier), + resource.TestCheckResourceAttr("snowflake_database.test", "replication.0.enable_to_account.0.with_failover", "false"), + resource.TestCheckNoResourceAttr("snowflake_database.test", "replication_configuration"), + ), + }, + }, + }) } -func dbConfigWithReplication(prefix string, secondaryAccountName string) string { - s := ` -resource "snowflake_database" "db" { +func databaseStateUpgraderWithReplicationOld(id sdk.AccountObjectIdentifier, enableToAccount string) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { name = "%s" - comment = "test comment 2" - data_retention_time_in_days = 3 replication_configuration { - accounts = [ - "%s" - ] + accounts = ["%s"] + ignore_edition_check = true } } -` - return fmt.Sprintf(s, prefix, secondaryAccountName) +`, id.Name(), enableToAccount) } -// TODO [SNOW-936093]: this is used mostly as check for unsafe execute, not as normal check destroy in other resources. Handle with the helpers cleanup. 
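
The replication upgrade test above moves state from the old flat `replication_configuration` block (a list of account locators) to the new `replication` block with `enable_to_account` entries. A rough sketch of what such a state-shape conversion could look like — field names are taken from the configs above, but the function itself is hypothetical, not the provider's actual `StateUpgrader`:

```go
package main

import "fmt"

// upgradeReplicationState converts a raw old-format state map into the new
// replication shape. Illustrative only: the real upgrader works on the
// terraform-plugin-sdk raw state and cannot translate account locators into
// organization.account identifiers, which is why the test expects a manual
// config change (and a non-empty plan) for the accounts themselves.
func upgradeReplicationState(old map[string]any) map[string]any {
	newState := map[string]any{}
	for k, v := range old {
		if k != "replication_configuration" {
			newState[k] = v
		}
	}
	cfgs, ok := old["replication_configuration"].([]any)
	if !ok || len(cfgs) == 0 {
		return newState
	}
	cfg := cfgs[0].(map[string]any)
	accounts, _ := cfg["accounts"].([]any)
	enable := make([]any, 0, len(accounts))
	for _, acc := range accounts {
		enable = append(enable, map[string]any{
			"account_identifier": acc, // still a locator; new format expects org.account identifiers
			"with_failover":      false,
		})
	}
	newState["replication"] = []any{map[string]any{
		"enable_to_account":    enable,
		"ignore_edition_check": cfg["ignore_edition_check"],
	}}
	return newState
}

func main() {
	old := map[string]any{
		"name": "DB",
		"replication_configuration": []any{map[string]any{
			"accounts":             []any{"LOCATOR1"},
			"ignore_edition_check": true,
		}},
	}
	upgraded := upgradeReplicationState(old)
	fmt.Println(upgraded["replication_configuration"] == nil) // true: old key dropped
}
```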
-func testAccCheckDatabaseExistence(t *testing.T, id sdk.AccountObjectIdentifier, shouldExist bool) func(state *terraform.State) error { - t.Helper() - return func(state *terraform.State) error { - _, err := acc.TestClient().Database.Show(t, id) - if shouldExist { - if err != nil { - return fmt.Errorf("error while retrieving database %s, err = %w", id, err) - } - } else { - if err == nil { - return fmt.Errorf("database %v still exists", id) - } +func databaseStateUpgraderWithReplicationNew(id sdk.AccountObjectIdentifier, enableToAccount string) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + replication { + enable_to_account { + account_identifier = %s + with_failover = false } - return nil + ignore_edition_check = true } } +`, id.Name(), strconv.Quote(enableToAccount)) +} -func testAccCheckIfDatabaseIsReplicated(t *testing.T, id string) func(state *terraform.State) error { - t.Helper() - return func(state *terraform.State) error { - client := acc.Client(t) +func TestAcc_Database_UpgradeFromShare(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) - ctx := context.Background() - replicationDatabases, err := client.ReplicationFunctions.ShowReplicationDatabases(ctx, nil) - if err != nil { - return err - } + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + secondaryClientLocator := acc.SecondaryClient(t).GetAccountLocator() - var exists bool - for _, o := range replicationDatabases { - if o.Name == id { - exists = true - break - } - } + shareExternalId := createShareableDatabase(t) - if !exists { - return fmt.Errorf("database %s should be replicated", id) - } + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + ExternalProviders: 
map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: databaseStateUpgraderFromShareOld(id, secondaryClientLocator, shareExternalId), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "from_share.provider", secondaryClientLocator), + resource.TestCheckResourceAttr("snowflake_database.test", "from_share.share", shareExternalId.Name()), + ), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: databaseStateUpgraderFromShareNew(id), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectEmptyPlan(), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckNoResourceAttr("snowflake_database.test", "from_share"), + ), + }, + }, + }) +} - return nil +func databaseStateUpgraderFromShareOld(id sdk.AccountObjectIdentifier, secondaryClientLocator string, externalShare sdk.ExternalObjectIdentifier) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + data_retention_time_in_days = 0 # to avoid in-place update to -1 + from_share = { + provider = "%s" + share = "%s" } } +`, id.Name(), secondaryClientLocator, externalShare.Name()) +} -func checkAccountAndDatabaseDataRetentionTime(t *testing.T, id sdk.AccountObjectIdentifier, expectedAccountRetentionDays int, expectedDatabaseRetentionsDays int) func(state *terraform.State) error { - t.Helper() - return func(state *terraform.State) error { - providerContext := acc.TestAccProvider.Meta().(*provider.Context) - client := providerContext.Client - ctx := 
context.Background() +func databaseStateUpgraderFromShareNew(id sdk.AccountObjectIdentifier) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + data_retention_time_in_days = 0 +} +`, id.Name()) +} - database, err := acc.TestClient().Database.Show(t, id) - if err != nil { - return err - } +func TestAcc_Database_UpgradeFromReplica(t *testing.T) { + _ = testenvs.GetOrSkipTest(t, testenvs.EnableAcceptance) - if database.RetentionTime != expectedDatabaseRetentionsDays { - return fmt.Errorf("invalid database retention time, expected: %d, got: %d", expectedDatabaseRetentionsDays, database.RetentionTime) - } + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + _, primaryDatabaseId, databaseCleanup := acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ + acc.TestClient().Account.GetAccountIdentifier(t), + }) + t.Cleanup(databaseCleanup) - param, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameterDataRetentionTimeInDays) - if err != nil { - return err - } - accountRetentionDays, err := strconv.Atoi(param.Value) - if err != nil { - return err - } + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: databaseStateUpgraderFromReplicaOld(id, primaryDatabaseId), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "from_replica", primaryDatabaseId.FullyQualifiedName()), + ), + }, + { 
+ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: databaseStateUpgraderFromReplicaNew(id), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectEmptyPlan(), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.test", "id", id.Name()), + resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), + resource.TestCheckNoResourceAttr("snowflake_database.test", "from_replica"), + ), + }, + }, + }) +} - if accountRetentionDays != expectedAccountRetentionDays { - return fmt.Errorf("invalid account retention time, expected: %d, got: %d", expectedAccountRetentionDays, accountRetentionDays) - } +func databaseStateUpgraderFromReplicaOld(id sdk.AccountObjectIdentifier, primaryDatabaseId sdk.ExternalObjectIdentifier) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + data_retention_time_in_days = 0 # to avoid in-place update to -1 + from_replica = %s +} +`, id.Name(), strconv.Quote(primaryDatabaseId.FullyQualifiedName())) +} - return nil - } +func databaseStateUpgraderFromReplicaNew(id sdk.AccountObjectIdentifier) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + data_retention_time_in_days = 0 +} +`, id.Name()) +} + +func TestAcc_Database_UpgradeFromClonedDatabase(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + cloneId := acc.TestClient().Ids.RandomAccountObjectIdentifier() + + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Database), + Steps: []resource.TestStep{ + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: 
databaseStateUpgraderFromDatabaseOld(id, cloneId), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.cloned", "id", cloneId.Name()), + resource.TestCheckResourceAttr("snowflake_database.cloned", "name", cloneId.Name()), + resource.TestCheckResourceAttr("snowflake_database.cloned", "from_database", id.Name()), + ), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: databaseStateUpgraderFromDatabaseNew(id, cloneId), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectEmptyPlan(), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database.cloned", "id", cloneId.Name()), + resource.TestCheckResourceAttr("snowflake_database.cloned", "name", cloneId.Name()), + resource.TestCheckNoResourceAttr("snowflake_database.cloned", "from_database"), + ), + }, + }, + }) +} + +func databaseStateUpgraderFromDatabaseOld(id sdk.AccountObjectIdentifier, secondId sdk.AccountObjectIdentifier) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + data_retention_time_in_days = 0 # to avoid in-place update to -1 +} + +resource "snowflake_database" "cloned" { + name = "%s" + data_retention_time_in_days = 0 # to avoid in-place update to -1 + from_database = snowflake_database.test.name +} +`, id.Name(), secondId.Name()) +} + +func databaseStateUpgraderFromDatabaseNew(id sdk.AccountObjectIdentifier, secondId sdk.AccountObjectIdentifier) string { + return fmt.Sprintf(` +resource "snowflake_database" "test" { + name = "%s" + data_retention_time_in_days = 0 +} + +resource "snowflake_database" "cloned" { + name = "%s" + data_retention_time_in_days = 0 +} +`, id.Name(), secondId.Name()) } diff --git a/pkg/resources/database_commons.go b/pkg/resources/database_commons.go new file mode 100644 index 0000000000..0f3fb84f1a --- /dev/null +++ b/pkg/resources/database_commons.go @@ -0,0 +1,342 @@ +package 
resources + +import ( + "context" + "fmt" + "slices" + "strconv" + "strings" + + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" +) + +var ( + DatabaseParametersSchema = make(map[string]*schema.Schema) + SharedDatabaseParametersSchema = make(map[string]*schema.Schema) + sharedDatabaseNotApplicableParameters = []sdk.ObjectParameter{ + sdk.ObjectParameterDataRetentionTimeInDays, + sdk.ObjectParameterMaxDataExtensionTimeInDays, + } + DatabaseParametersCustomDiff = func(ctx context.Context, d *schema.ResourceDiff, meta any) error { + if d.Id() == "" { + return nil + } + + client := meta.(*provider.Context).Client + params, err := client.Parameters.ShowParameters(context.Background(), &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier), + }, + }) + if err != nil { + return err + } + + return customdiff.All( + IntParameterValueComputedIf("data_retention_time_in_days", params, sdk.ParameterTypeDatabase, sdk.AccountParameterDataRetentionTimeInDays), + IntParameterValueComputedIf("max_data_extension_time_in_days", params, sdk.ParameterTypeDatabase, sdk.AccountParameterMaxDataExtensionTimeInDays), + StringParameterValueComputedIf("external_volume", params, sdk.ParameterTypeDatabase, sdk.AccountParameterExternalVolume), + StringParameterValueComputedIf("catalog", params, sdk.ParameterTypeDatabase, sdk.AccountParameterCatalog), + BoolParameterValueComputedIf("replace_invalid_characters", params, sdk.ParameterTypeDatabase, 
sdk.AccountParameterReplaceInvalidCharacters), + StringParameterValueComputedIf("default_ddl_collation", params, sdk.ParameterTypeDatabase, sdk.AccountParameterDefaultDDLCollation), + StringParameterValueComputedIf("storage_serialization_policy", params, sdk.ParameterTypeDatabase, sdk.AccountParameterStorageSerializationPolicy), + StringParameterValueComputedIf("log_level", params, sdk.ParameterTypeDatabase, sdk.AccountParameterLogLevel), + StringParameterValueComputedIf("trace_level", params, sdk.ParameterTypeDatabase, sdk.AccountParameterTraceLevel), + IntParameterValueComputedIf("suspend_task_after_num_failures", params, sdk.ParameterTypeDatabase, sdk.AccountParameterSuspendTaskAfterNumFailures), + IntParameterValueComputedIf("task_auto_retry_attempts", params, sdk.ParameterTypeDatabase, sdk.AccountParameterTaskAutoRetryAttempts), + StringParameterValueComputedIf("user_task_managed_initial_warehouse_size", params, sdk.ParameterTypeDatabase, sdk.AccountParameterUserTaskManagedInitialWarehouseSize), + IntParameterValueComputedIf("user_task_timeout_ms", params, sdk.ParameterTypeDatabase, sdk.AccountParameterUserTaskTimeoutMs), + IntParameterValueComputedIf("user_task_minimum_trigger_interval_in_seconds", params, sdk.ParameterTypeDatabase, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds), + BoolParameterValueComputedIf("quoted_identifiers_ignore_case", params, sdk.ParameterTypeDatabase, sdk.AccountParameterQuotedIdentifiersIgnoreCase), + BoolParameterValueComputedIf("enable_console_output", params, sdk.ParameterTypeDatabase, sdk.AccountParameterEnableConsoleOutput), + )(ctx, d, meta) + } +) + +func init() { + databaseParameterFields := []struct { + Name sdk.ObjectParameter + Type schema.ValueType + Description string + DiffSuppress schema.SchemaDiffSuppressFunc + ValidateDiag schema.SchemaValidateDiagFunc + }{ + { + Name: sdk.ObjectParameterDataRetentionTimeInDays, + Type: schema.TypeInt, + Description: "Specifies the number of days for which Time Travel 
actions (CLONE and UNDROP) can be performed on the database, as well as specifying the default Time Travel retention time for all schemas created in the database. For more details, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).", + // Choosing higher range (for the standard edition or transient databases, the maximum number is 1) + ValidateDiag: validation.ToDiagFunc(validation.IntBetween(0, 90)), + }, + { + Name: sdk.ObjectParameterDefaultDDLCollation, + Type: schema.TypeString, + Description: "Specifies a default collation specification for all schemas and tables added to the database. It can be overridden on schema or table level. For more information, see [collation specification](https://docs.snowflake.com/en/sql-reference/collation#label-collation-specification).", + }, + { + Name: sdk.ObjectParameterCatalog, + Type: schema.TypeString, + Description: "The database parameter that specifies the default catalog to use for Iceberg tables.", + ValidateDiag: IsValidIdentifier[sdk.AccountObjectIdentifier](), + }, + { + Name: sdk.ObjectParameterExternalVolume, + Type: schema.TypeString, + Description: "The database parameter that specifies the default external volume to use for Iceberg tables.", + ValidateDiag: IsValidIdentifier[sdk.AccountObjectIdentifier](), + }, + { + Name: sdk.ObjectParameterLogLevel, + Type: schema.TypeString, + Description: fmt.Sprintf("Specifies the severity level of messages that should be ingested and made available in the active event table. Valid options are: %v. Messages at the specified level (and at more severe levels) are ingested. 
For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level).", sdk.AsStringList(sdk.AllLogLevels)), + ValidateDiag: StringInSlice(sdk.AsStringList(sdk.AllLogLevels), true), + DiffSuppress: func(k, oldValue, newValue string, d *schema.ResourceData) bool { + return strings.EqualFold(oldValue, newValue) + }, + }, + { + Name: sdk.ObjectParameterTraceLevel, + Type: schema.TypeString, + Description: fmt.Sprintf("Controls how trace events are ingested into the event table. Valid options are: %v. For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level).", sdk.AsStringList(sdk.AllTraceLevels)), + ValidateDiag: StringInSlice(sdk.AsStringList(sdk.AllTraceLevels), true), + DiffSuppress: func(k, oldValue, newValue string, d *schema.ResourceData) bool { + return strings.EqualFold(oldValue, newValue) + }, + }, + { + Name: sdk.ObjectParameterMaxDataExtensionTimeInDays, + Type: schema.TypeInt, + Description: "Object parameter that specifies the maximum number of days for which Snowflake can extend the data retention period for tables in the database to prevent streams on the tables from becoming stale. For a detailed description of this parameter, see [MAX_DATA_EXTENSION_TIME_IN_DAYS](https://docs.snowflake.com/en/sql-reference/parameters.html#label-max-data-extension-time-in-days).", + ValidateDiag: validation.ToDiagFunc(validation.IntBetween(0, 90)), + }, + { + Name: sdk.ObjectParameterReplaceInvalidCharacters, + Type: schema.TypeBool, + Description: "Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table. 
You can only set this parameter for tables that use an external Iceberg catalog.", + }, + { + Name: sdk.ObjectParameterStorageSerializationPolicy, + Type: schema.TypeString, + Description: fmt.Sprintf("The storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: %v. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake.", sdk.AsStringList(sdk.AllStorageSerializationPolicies)), + ValidateDiag: StringInSlice(sdk.AsStringList(sdk.AllStorageSerializationPolicies), true), + DiffSuppress: func(k, oldValue, newValue string, d *schema.ResourceData) bool { + return strings.EqualFold(oldValue, newValue) + }, + }, + { + Name: sdk.ObjectParameterSuspendTaskAfterNumFailures, + Type: schema.TypeInt, + Description: "How many times a task must fail in a row before it is automatically suspended. 
0 disables auto-suspending.", + ValidateDiag: validation.ToDiagFunc(validation.IntAtLeast(0)), + }, + { + Name: sdk.ObjectParameterTaskAutoRetryAttempts, + Type: schema.TypeInt, + Description: "Maximum automatic retries allowed for a user task.", + ValidateDiag: validation.ToDiagFunc(validation.IntAtLeast(0)), + }, + { + Name: sdk.ObjectParameterUserTaskManagedInitialWarehouseSize, + Type: schema.TypeString, + Description: "The initial size of warehouse to use for managed warehouses in the absence of history.", + ValidateDiag: sdkValidation(sdk.ToWarehouseSize), + DiffSuppress: NormalizeAndCompare(sdk.ToWarehouseSize), + }, + { + Name: sdk.ObjectParameterUserTaskTimeoutMs, + Type: schema.TypeInt, + Description: "User task execution timeout in milliseconds.", + ValidateDiag: validation.ToDiagFunc(validation.IntBetween(0, 86400000)), + }, + { + Name: sdk.ObjectParameterUserTaskMinimumTriggerIntervalInSeconds, + Type: schema.TypeInt, + Description: "Minimum amount of time between Triggered Task executions in seconds.", + // TODO(DOC-2511): ValidateDiag: Not documented + }, + { + Name: sdk.ObjectParameterQuotedIdentifiersIgnoreCase, + Type: schema.TypeBool, + Description: "If true, the case of quoted identifiers is ignored.", + }, + { + Name: sdk.ObjectParameterEnableConsoleOutput, + Type: schema.TypeBool, + Description: "If true, enables stdout/stderr fast path logging for anonymous stored procedures.", + }, + } + + for _, field := range databaseParameterFields { + fieldName := strings.ToLower(string(field.Name)) + + DatabaseParametersSchema[fieldName] = &schema.Schema{ + Type: field.Type, + Description: field.Description, + Computed: true, + Optional: true, + ValidateDiagFunc: field.ValidateDiag, + DiffSuppressFunc: field.DiffSuppress, + } + + if !slices.Contains(sharedDatabaseNotApplicableParameters, field.Name) { + SharedDatabaseParametersSchema[fieldName] = &schema.Schema{ + Type: field.Type, + Description: field.Description, + ForceNew: true, + Optional: true, + 
Computed: true, + ValidateDiagFunc: field.ValidateDiag, + DiffSuppressFunc: field.DiffSuppress, + } + } + } +} + +// TODO(SNOW-1480106): Change to smaller and safer return type +func GetAllDatabaseParameters(d *schema.ResourceData) ( + dataRetentionTimeInDays *int, + maxDataExtensionTimeInDays *int, + externalVolume *sdk.AccountObjectIdentifier, + catalog *sdk.AccountObjectIdentifier, + replaceInvalidCharacters *bool, + defaultDDLCollation *string, + storageSerializationPolicy *sdk.StorageSerializationPolicy, + logLevel *sdk.LogLevel, + traceLevel *sdk.TraceLevel, + suspendTaskAfterNumFailures *int, + taskAutoRetryAttempts *int, + userTaskManagedInitialWarehouseSize *sdk.WarehouseSize, + userTaskTimeoutMs *int, + userTaskMinimumTriggerIntervalInSeconds *int, + quotedIdentifiersIgnoreCase *bool, + enableConsoleOutput *bool, + err error, +) { + dataRetentionTimeInDays = GetPropertyAsPointer[int](d, "data_retention_time_in_days") + maxDataExtensionTimeInDays = GetPropertyAsPointer[int](d, "max_data_extension_time_in_days") + if externalVolumeRaw := GetPropertyAsPointer[string](d, "external_volume"); externalVolumeRaw != nil { + externalVolume = sdk.Pointer(sdk.NewAccountObjectIdentifier(*externalVolumeRaw)) + } + if catalogRaw := GetPropertyAsPointer[string](d, "catalog"); catalogRaw != nil { + catalog = sdk.Pointer(sdk.NewAccountObjectIdentifier(*catalogRaw)) + } + replaceInvalidCharacters = GetPropertyAsPointer[bool](d, "replace_invalid_characters") + defaultDDLCollation = GetPropertyAsPointer[string](d, "default_ddl_collation") + if storageSerializationPolicyRaw := GetPropertyAsPointer[string](d, "storage_serialization_policy"); storageSerializationPolicyRaw != nil { + storageSerializationPolicy = sdk.Pointer(sdk.StorageSerializationPolicy(*storageSerializationPolicyRaw)) + } + if logLevelRaw := GetPropertyAsPointer[string](d, "log_level"); logLevelRaw != nil { + logLevel = sdk.Pointer(sdk.LogLevel(*logLevelRaw)) + } + if traceLevelRaw := 
GetPropertyAsPointer[string](d, "trace_level"); traceLevelRaw != nil { + traceLevel = sdk.Pointer(sdk.TraceLevel(*traceLevelRaw)) + } + suspendTaskAfterNumFailures = GetPropertyAsPointer[int](d, "suspend_task_after_num_failures") + taskAutoRetryAttempts = GetPropertyAsPointer[int](d, "task_auto_retry_attempts") + if userTaskManagedInitialWarehouseSizeRaw := GetPropertyAsPointer[string](d, "user_task_managed_initial_warehouse_size"); userTaskManagedInitialWarehouseSizeRaw != nil { + var warehouseSize sdk.WarehouseSize + if warehouseSize, err = sdk.ToWarehouseSize(*userTaskManagedInitialWarehouseSizeRaw); err != nil { + return + } + userTaskManagedInitialWarehouseSize = sdk.Pointer(warehouseSize) + } + userTaskTimeoutMs = GetPropertyAsPointer[int](d, "user_task_timeout_ms") + userTaskMinimumTriggerIntervalInSeconds = GetPropertyAsPointer[int](d, "user_task_minimum_trigger_interval_in_seconds") + quotedIdentifiersIgnoreCase = GetPropertyAsPointer[bool](d, "quoted_identifiers_ignore_case") + enableConsoleOutput = GetPropertyAsPointer[bool](d, "enable_console_output") + return +} + +func HandleDatabaseParametersChanges(d *schema.ResourceData, set *sdk.DatabaseSet, unset *sdk.DatabaseUnset) diag.Diagnostics { + return JoinDiags( + handleValuePropertyChange[int](d, "data_retention_time_in_days", &set.DataRetentionTimeInDays, &unset.DataRetentionTimeInDays), + handleValuePropertyChange[int](d, "max_data_extension_time_in_days", &set.MaxDataExtensionTimeInDays, &unset.MaxDataExtensionTimeInDays), + handleValuePropertyChangeWithMapping[string](d, "external_volume", &set.ExternalVolume, &unset.ExternalVolume, sdk.NewAccountObjectIdentifier), + handleValuePropertyChangeWithMapping[string](d, "catalog", &set.Catalog, &unset.Catalog, sdk.NewAccountObjectIdentifier), + handleValuePropertyChange[bool](d, "replace_invalid_characters", &set.ReplaceInvalidCharacters, &unset.ReplaceInvalidCharacters), + handleValuePropertyChange[string](d, "default_ddl_collation", 
&set.DefaultDDLCollation, &unset.DefaultDDLCollation), + handleValuePropertyChangeWithMapping[string](d, "storage_serialization_policy", &set.StorageSerializationPolicy, &unset.StorageSerializationPolicy, func(value string) sdk.StorageSerializationPolicy { return sdk.StorageSerializationPolicy(value) }), + handleValuePropertyChangeWithMapping[string](d, "log_level", &set.LogLevel, &unset.LogLevel, func(value string) sdk.LogLevel { return sdk.LogLevel(value) }), + handleValuePropertyChangeWithMapping[string](d, "trace_level", &set.TraceLevel, &unset.TraceLevel, func(value string) sdk.TraceLevel { return sdk.TraceLevel(value) }), + handleValuePropertyChange[int](d, "suspend_task_after_num_failures", &set.SuspendTaskAfterNumFailures, &unset.SuspendTaskAfterNumFailures), + handleValuePropertyChange[int](d, "task_auto_retry_attempts", &set.TaskAutoRetryAttempts, &unset.TaskAutoRetryAttempts), + handleValuePropertyChangeWithMapping[string](d, "user_task_managed_initial_warehouse_size", &set.UserTaskManagedInitialWarehouseSize, &unset.UserTaskManagedInitialWarehouseSize, func(value string) sdk.WarehouseSize { return sdk.WarehouseSize(value) }), + handleValuePropertyChange[int](d, "user_task_timeout_ms", &set.UserTaskTimeoutMs, &unset.UserTaskTimeoutMs), + handleValuePropertyChange[int](d, "user_task_minimum_trigger_interval_in_seconds", &set.UserTaskMinimumTriggerIntervalInSeconds, &unset.UserTaskMinimumTriggerIntervalInSeconds), + handleValuePropertyChange[bool](d, "quoted_identifiers_ignore_case", &set.QuotedIdentifiersIgnoreCase, &unset.QuotedIdentifiersIgnoreCase), + handleValuePropertyChange[bool](d, "enable_console_output", &set.EnableConsoleOutput, &unset.EnableConsoleOutput), + ) +} + +// handleValuePropertyChange calls internally handleValuePropertyChangeWithMapping with identity mapping +func handleValuePropertyChange[T any](d *schema.ResourceData, key string, setField **T, unsetField **bool) diag.Diagnostics { + return handleValuePropertyChangeWithMapping[T, 
T](d, key, setField, unsetField, func(value T) T { return value }) +} + +// handleValuePropertyChangeWithMapping checks schema.ResourceData for a change in the key's value. If a change is detected +// (or the value is unknown, which indicates diff.SetNewComputed was called on the key), it checks whether the value is set in the configuration. +// If the value is set, setField (the setter for a value) receives the new planned value, with mapping applied beforehand for cases where enum values, +// identifiers, etc. have to be converted. Otherwise, unsetField is populated. +func handleValuePropertyChangeWithMapping[T, R any](d *schema.ResourceData, key string, setField **R, unsetField **bool, mapping func(value T) R) diag.Diagnostics { + if d.HasChange(key) || !d.GetRawPlan().AsValueMap()[key].IsKnown() { + if !d.GetRawConfig().AsValueMap()[key].IsNull() { + *setField = sdk.Pointer(mapping(d.Get(key).(T))) + } else { + *unsetField = sdk.Bool(true) + } + } + return nil +} + +func HandleDatabaseParameterRead(d *schema.ResourceData, databaseParameters []*sdk.Parameter) diag.Diagnostics { + for _, parameter := range databaseParameters { + switch parameter.Key { + case + string(sdk.ObjectParameterDataRetentionTimeInDays), + string(sdk.ObjectParameterMaxDataExtensionTimeInDays), + string(sdk.ObjectParameterSuspendTaskAfterNumFailures), + string(sdk.ObjectParameterTaskAutoRetryAttempts), + string(sdk.ObjectParameterUserTaskTimeoutMs), + string(sdk.ObjectParameterUserTaskMinimumTriggerIntervalInSeconds): + value, err := strconv.Atoi(parameter.Value) + if err != nil { + return diag.FromErr(err) + } + if err := d.Set(strings.ToLower(parameter.Key), value); err != nil { + return diag.FromErr(err) + } + case + string(sdk.ObjectParameterExternalVolume), + string(sdk.ObjectParameterCatalog), + string(sdk.ObjectParameterDefaultDDLCollation), + string(sdk.ObjectParameterStorageSerializationPolicy), + string(sdk.ObjectParameterLogLevel), + string(sdk.ObjectParameterTraceLevel), 
string(sdk.ObjectParameterUserTaskManagedInitialWarehouseSize): + if err := d.Set(strings.ToLower(parameter.Key), parameter.Value); err != nil { + return diag.FromErr(err) + } + case + string(sdk.ObjectParameterReplaceInvalidCharacters), + string(sdk.ObjectParameterQuotedIdentifiersIgnoreCase), + string(sdk.ObjectParameterEnableConsoleOutput): + value, err := strconv.ParseBool(parameter.Value) + if err != nil { + return diag.FromErr(err) + } + if err := d.Set(strings.ToLower(parameter.Key), value); err != nil { + return diag.FromErr(err) + } + } + } + + return nil +} diff --git a/pkg/resources/database_old.go b/pkg/resources/database_old.go new file mode 100644 index 0000000000..409338dfea --- /dev/null +++ b/pkg/resources/database_old.go @@ -0,0 +1,371 @@ +package resources + +import ( + "context" + "fmt" + "log" + "slices" + "strconv" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" +) + +var databaseOldSchema = map[string]*schema.Schema{ + "name": { + Type: schema.TypeString, + Required: true, + Description: "Specifies the identifier for the database; must be unique for your account.", + }, + "comment": { + Type: schema.TypeString, + Optional: true, + Default: "", + Description: "Specifies a comment for the database.", + }, + "is_transient": { + Type: schema.TypeBool, + Optional: true, + Default: false, + Description: "Specifies a database as transient. 
Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss.", + ForceNew: true, + }, + "data_retention_time_in_days": { + Type: schema.TypeInt, + Optional: true, + Default: -1, + Description: "Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).", + ValidateFunc: validation.IntBetween(-1, 90), + }, + "from_share": { + Type: schema.TypeMap, + Elem: &schema.Schema{Type: schema.TypeString}, + Description: "Specify a provider and a share in this map to create a database from a share. As of version 0.87.0, the provider field is the account locator.", + Optional: true, + ForceNew: true, + ConflictsWith: []string{"from_database", "from_replica"}, + }, + "from_database": { + Type: schema.TypeString, + Description: "Specify a database to create a clone from.", + Optional: true, + ForceNew: true, + ConflictsWith: []string{"from_share", "from_replica"}, + }, + "from_replica": { + Type: schema.TypeString, + Description: "Specify a fully-qualified path to a database to create a replica from. A fully qualified path follows the format of `\"<organization_name>\".\"<account_name>\".\"<database_name>\"`. 
An example would be: `\"myorg1\".\"account1\".\"db1\"`", + Optional: true, + ForceNew: true, + ConflictsWith: []string{"from_share", "from_database"}, + }, + "replication_configuration": { + Type: schema.TypeList, + Description: "When set, specifies the configurations for database replication.", + Optional: true, + MaxItems: 1, + Elem: &schema.Resource{ + Schema: map[string]*schema.Schema{ + "accounts": { + Type: schema.TypeList, + Required: true, + MinItems: 1, + Elem: &schema.Schema{Type: schema.TypeString}, + }, + "ignore_edition_check": { + Type: schema.TypeBool, + Default: true, + Optional: true, + }, + }, + }, + }, +} + +// DatabaseOld returns a pointer to the resource representing a database. +func DatabaseOld() *schema.Resource { + return &schema.Resource{ + Create: CreateDatabaseOld, + Read: ReadDatabaseOld, + Delete: DeleteDatabaseOld, + Update: UpdateDatabaseOld, + DeprecationMessage: "This resource is deprecated and will be removed in a future major version release. Please use snowflake_database, snowflake_shared_database, or snowflake_secondary_database instead.", + + Schema: databaseOldSchema, + Importer: &schema.ResourceImporter{ + StateContext: schema.ImportStatePassthroughContext, + }, + } +} + +// CreateDatabaseOld implements schema.CreateFunc. +func CreateDatabaseOld(d *schema.ResourceData, meta interface{}) error { + client := meta.(*provider.Context).Client + ctx := context.Background() + name := d.Get("name").(string) + id := sdk.NewAccountObjectIdentifier(name) + + // Is it a Shared Database? 
+ if fromShare, ok := d.GetOk("from_share"); ok { + account := fromShare.(map[string]interface{})["provider"].(string) + share := fromShare.(map[string]interface{})["share"].(string) + shareID := sdk.NewExternalObjectIdentifier(sdk.NewAccountIdentifierFromAccountLocator(account), sdk.NewAccountObjectIdentifier(share)) + opts := &sdk.CreateSharedDatabaseOptions{} + if v, ok := d.GetOk("comment"); ok { + opts.Comment = sdk.String(v.(string)) + } + err := client.Databases.CreateShared(ctx, id, shareID, opts) + if err != nil { + return fmt.Errorf("error creating database %v: %w", name, err) + } + d.SetId(name) + return ReadDatabaseOld(d, meta) + } + // Is it a Secondary Database? + if primaryName, ok := d.GetOk("from_replica"); ok { + primaryID := sdk.NewExternalObjectIdentifierFromFullyQualifiedName(primaryName.(string)) + opts := &sdk.CreateSecondaryDatabaseOptions{} + if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { + opts.DataRetentionTimeInDays = sdk.Int(v.(int)) + } + err := client.Databases.CreateSecondary(ctx, id, primaryID, opts) + if err != nil { + return fmt.Errorf("error creating database %v: %w", name, err) + } + d.SetId(name) + // todo: add failover_configuration block + return ReadDatabaseOld(d, meta) + } + + // Otherwise it is a Standard Database + opts := sdk.CreateDatabaseOptions{} + if v, ok := d.GetOk("comment"); ok { + opts.Comment = sdk.String(v.(string)) + } + + if v, ok := d.GetOk("is_transient"); ok && v.(bool) { + opts.Transient = sdk.Bool(v.(bool)) + } + + if v, ok := d.GetOk("from_database"); ok { + opts.Clone = &sdk.Clone{ + SourceObject: sdk.NewAccountObjectIdentifier(v.(string)), + } + } + + if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { + opts.DataRetentionTimeInDays = sdk.Int(v.(int)) + } + + err := client.Databases.Create(ctx, id, &opts) + if err != nil { + return fmt.Errorf("error creating database %v: %w", name, err) + } + d.SetId(name) + + if v, ok := d.GetOk("replication_configuration"); ok { + 
replicationConfiguration := v.([]interface{})[0].(map[string]interface{}) + accounts := replicationConfiguration["accounts"].([]interface{}) + accountIDs := make([]sdk.AccountIdentifier, len(accounts)) + for i, account := range accounts { + accountIDs[i] = sdk.NewAccountIdentifierFromAccountLocator(account.(string)) + } + opts := &sdk.AlterDatabaseReplicationOptions{ + EnableReplication: &sdk.EnableReplication{ + ToAccounts: accountIDs, + }, + } + if ignoreEditionCheck, ok := replicationConfiguration["ignore_edition_check"]; ok { + opts.EnableReplication.IgnoreEditionCheck = sdk.Bool(ignoreEditionCheck.(bool)) + } + err := client.Databases.AlterReplication(ctx, id, opts) + if err != nil { + return fmt.Errorf("error enabling replication for database %v: %w", name, err) + } + } + + return ReadDatabaseOld(d, meta) +} + +func ReadDatabaseOld(d *schema.ResourceData, meta interface{}) error { + client := meta.(*provider.Context).Client + ctx := context.Background() + id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + + database, err := client.Databases.ShowByID(ctx, id) + if err != nil { + d.SetId("") + log.Printf("Database %s not found, err = %s", id.Name(), err) + return nil + } + + if err := d.Set("name", database.Name); err != nil { + return err + } + if err := d.Set("comment", database.Comment); err != nil { + return err + } + + dataRetention, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameterDataRetentionTimeInDays) + if err != nil { + return err + } + paramDataRetention, err := strconv.Atoi(dataRetention.Value) + if err != nil { + return err + } + + if dataRetentionDays := d.Get("data_retention_time_in_days"); dataRetentionDays.(int) != -1 || database.RetentionTime != paramDataRetention { + if err := d.Set("data_retention_time_in_days", database.RetentionTime); err != nil { + return err + } + } + + if err := d.Set("is_transient", database.Transient); err != nil { + return err + } + + return nil +} + +func 
UpdateDatabaseOld(d *schema.ResourceData, meta interface{}) error { + id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + client := meta.(*provider.Context).Client + ctx := context.Background() + + if d.HasChange("name") { + newName := d.Get("name").(string) + newId := sdk.NewAccountObjectIdentifier(newName) + opts := &sdk.AlterDatabaseOptions{ + NewName: &newId, + } + err := client.Databases.Alter(ctx, id, opts) + if err != nil { + return fmt.Errorf("error updating database name on %v err = %w", d.Id(), err) + } + d.SetId(helpers.EncodeSnowflakeID(newId)) + id = newId + } + + if d.HasChange("comment") { + comment := "" + if c := d.Get("comment"); c != nil { + comment = c.(string) + } + opts := &sdk.AlterDatabaseOptions{ + Set: &sdk.DatabaseSet{ + Comment: sdk.String(comment), + }, + } + err := client.Databases.Alter(ctx, id, opts) + if err != nil { + return fmt.Errorf("error updating database comment on %v err = %w", d.Id(), err) + } + } + + if d.HasChange("data_retention_time_in_days") { + if days := d.Get("data_retention_time_in_days"); days.(int) != -1 { + err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ + Set: &sdk.DatabaseSet{ + DataRetentionTimeInDays: sdk.Int(days.(int)), + }, + }) + if err != nil { + return fmt.Errorf("error when setting database data retention time on %v err = %w", d.Id(), err) + } + } else { + err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ + Unset: &sdk.DatabaseUnset{ + DataRetentionTimeInDays: sdk.Bool(true), + }, + }) + if err != nil { + return fmt.Errorf("error when unsetting database data retention time on %v err = %w", d.Id(), err) + } + } + } + + // If replication configuration changes, need to update accounts that have permission to replicate database + if d.HasChange("replication_configuration") { + oldConfig, newConfig := d.GetChange("replication_configuration") + + newAccountIDs := make([]sdk.AccountIdentifier, 0) + ignoreEditionCheck := false + if 
len(newConfig.([]interface{})) != 0 { + newAccounts := newConfig.([]interface{})[0].(map[string]interface{})["accounts"].([]interface{}) + for _, account := range newAccounts { + newAccountIDs = append(newAccountIDs, sdk.NewAccountIdentifierFromAccountLocator(account.(string))) + } + ignoreEditionCheck = newConfig.([]interface{})[0].(map[string]interface{})["ignore_edition_check"].(bool) + } + + oldAccountIDs := make([]sdk.AccountIdentifier, 0) + if len(oldConfig.([]interface{})) != 0 { + oldAccounts := oldConfig.([]interface{})[0].(map[string]interface{})["accounts"].([]interface{}) + for _, account := range oldAccounts { + oldAccountIDs = append(oldAccountIDs, sdk.NewAccountIdentifierFromAccountLocator(account.(string))) + } + } + + accountsToRemove := make([]sdk.AccountIdentifier, 0) + accountsToAdd := make([]sdk.AccountIdentifier, 0) + // Find accounts to remove + for _, oldAccountID := range oldAccountIDs { + if !slices.Contains(newAccountIDs, oldAccountID) { + accountsToRemove = append(accountsToRemove, oldAccountID) + } + } + + // Find accounts to add + for _, newAccountID := range newAccountIDs { + if !slices.Contains(oldAccountIDs, newAccountID) { + accountsToAdd = append(accountsToAdd, newAccountID) + } + } + if len(accountsToAdd) > 0 { + opts := &sdk.AlterDatabaseReplicationOptions{ + EnableReplication: &sdk.EnableReplication{ + ToAccounts: accountsToAdd, + }, + } + if ignoreEditionCheck { + opts.EnableReplication.IgnoreEditionCheck = sdk.Bool(ignoreEditionCheck) + } + err := client.Databases.AlterReplication(ctx, id, opts) + if err != nil { + return fmt.Errorf("error enabling replication configuration on %v err = %w", d.Id(), err) + } + } + + if len(accountsToRemove) > 0 { + opts := &sdk.AlterDatabaseReplicationOptions{ + DisableReplication: &sdk.DisableReplication{ + ToAccounts: accountsToRemove, + }, + } + err := client.Databases.AlterReplication(ctx, id, opts) + if err != nil { + return fmt.Errorf("error disabling replication configuration on %v err 
= %w", d.Id(), err) + } + } + } + + return ReadDatabaseOld(d, meta) +} + +func DeleteDatabaseOld(d *schema.ResourceData, meta interface{}) error { + client := meta.(*provider.Context).Client + ctx := context.Background() + id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + err := client.Databases.Drop(ctx, id, &sdk.DropDatabaseOptions{ + IfExists: sdk.Bool(true), + }) + if err != nil { + return err + } + d.SetId("") + return nil +} diff --git a/pkg/resources/database_old_acceptance_test.go b/pkg/resources/database_old_acceptance_test.go new file mode 100644 index 0000000000..37cc76d7aa --- /dev/null +++ b/pkg/resources/database_old_acceptance_test.go @@ -0,0 +1,450 @@ +package resources_test + +import ( + "context" + "fmt" + "strconv" + "testing" + + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-testing/config" + "github.com/hashicorp/terraform-plugin-testing/helper/resource" + "github.com/hashicorp/terraform-plugin-testing/plancheck" + "github.com/hashicorp/terraform-plugin-testing/terraform" + "github.com/hashicorp/terraform-plugin-testing/tfversion" +) + +func TestAcc_DatabaseWithUnderscore(t *testing.T) { + prefix := acc.TestClient().Ids.AlphaWithPrefix("_") + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), + Steps: []resource.TestStep{ + { + Config: dbConfig(prefix), + Check: 
resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), + resource.TestCheckResourceAttrSet("snowflake_database_old.db", "data_retention_time_in_days"), + ), + }, + }, + }) +} + +func TestAcc_Database(t *testing.T) { + prefix := acc.TestClient().Ids.Alpha() + prefix2 := acc.TestClient().Ids.Alpha() + + secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), + Steps: []resource.TestStep{ + { + Config: dbConfig(prefix), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), + resource.TestCheckResourceAttrSet("snowflake_database_old.db", "data_retention_time_in_days"), + ), + }, + // RENAME + { + Config: dbConfig(prefix2), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix2), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), + resource.TestCheckResourceAttrSet("snowflake_database_old.db", "data_retention_time_in_days"), + ), + }, + // CHANGE PROPERTIES + { + Config: dbConfig2(prefix2), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix2), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment 2"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "data_retention_time_in_days", "3"), + ), + }, + // ADD REPLICATION + // 
proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2369 error + { + Config: dbConfigWithReplication(prefix2, secondaryAccountName), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", prefix2), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment 2"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "data_retention_time_in_days", "3"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.#", "1"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.#", "1"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.0", secondaryAccountName), + ), + }, + // IMPORT + { + ResourceName: "snowflake_database_old.db", + ImportState: true, + ImportStateVerify: true, + ImportStateVerifyIgnore: []string{"replication_configuration"}, + }, + }, + }) +} + +func TestAcc_DatabaseRemovedOutsideOfTerraform(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + name := id.Name() + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), + Steps: []resource.TestStep{ + { + ConfigDirectory: config.TestNameDirectory(), + ConfigVariables: map[string]config.Variable{ + "db": config.StringVariable(name), + }, + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{plancheck.ExpectNonEmptyPlan()}, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", name), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), + 
testAccCheckDatabaseExistence(t, id, true), + ), + }, + { + PreConfig: func() { acc.TestClient().Database.DropDatabaseFunc(t, id)() }, + ConfigDirectory: config.TestNameDirectory(), + ConfigVariables: map[string]config.Variable{ + "db": config.StringVariable(name), + }, + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{plancheck.ExpectNonEmptyPlan()}, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", name), + resource.TestCheckResourceAttr("snowflake_database_old.db", "comment", "test comment"), + testAccCheckDatabaseExistence(t, id, true), + ), + }, + }, + }) +} + +// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2021 +func TestAcc_Database_issue2021(t *testing.T) { + name := acc.TestClient().Ids.Alpha() + + secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), + Steps: []resource.TestStep{ + { + Config: dbConfigWithReplication(name, secondaryAccountName), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.db", "name", name), + resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.#", "1"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.#", "1"), + resource.TestCheckResourceAttr("snowflake_database_old.db", "replication_configuration.0.accounts.0", secondaryAccountName), + testAccCheckIfDatabaseIsReplicated(t, name), + ), + }, + }, + }) +} + +// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356 issue is fixed. 
+func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + + configVariablesWithoutDatabaseDataRetentionTime := func() config.Variables { + return config.Variables{ + "database": config.StringVariable(id.Name()), + } + } + + configVariablesWithDatabaseDataRetentionTime := func(databaseDataRetentionTime int) config.Variables { + vars := configVariablesWithoutDatabaseDataRetentionTime() + vars["database_data_retention_time"] = config.IntegerVariable(databaseDataRetentionTime) + return vars + } + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), + Steps: []resource.TestStep{ + { + PreConfig: func() { + revertParameter := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "5") + t.Cleanup(revertParameter) + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), + ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), + ), + }, + { + PreConfig: func() { + _ = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "10") + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), + ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + 
checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), + ConfigVariables: configVariablesWithDatabaseDataRetentionTime(5), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "5"), + checkAccountAndDatabaseDataRetentionTime(t, id, 10, 5), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), + ConfigVariables: configVariablesWithDatabaseDataRetentionTime(15), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "15"), + checkAccountAndDatabaseDataRetentionTime(t, id, 10, 15), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), + ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), + ConfigVariables: configVariablesWithDatabaseDataRetentionTime(0), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "0"), + checkAccountAndDatabaseDataRetentionTime(t, id, 10, 0), + ), + }, + { + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), + ConfigVariables: configVariablesWithDatabaseDataRetentionTime(3), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "3"), + 
checkAccountAndDatabaseDataRetentionTime(t, id, 10, 3), + ), + }, + }, + }) +} + +// proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356 issue is fixed. +func TestAcc_Database_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + + configVariablesWithoutDatabaseDataRetentionTime := func() config.Variables { + return config.Variables{ + "database": config.StringVariable(id.Name()), + } + } + + configVariablesWithDatabaseDataRetentionTime := func(databaseDataRetentionTime int) config.Variables { + vars := configVariablesWithoutDatabaseDataRetentionTime() + vars["database_data_retention_time"] = config.IntegerVariable(databaseDataRetentionTime) + return vars + } + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.DatabaseOld), + Steps: []resource.TestStep{ + { + PreConfig: func() { + revertParameter := acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "5") + t.Cleanup(revertParameter) + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), + ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), + ), + }, + { + PreConfig: func() { acc.TestClient().Database.UpdateDataRetentionTime(t, id, 20) }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), + ConfigVariables: 
configVariablesWithoutDatabaseDataRetentionTime(), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), + ), + }, + { + PreConfig: func() { + _ = acc.TestClient().Parameter.UpdateAccountParameterTemporarily(t, sdk.AccountParameterDataRetentionTimeInDays, "10") + }, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet"), + ConfigVariables: configVariablesWithDatabaseDataRetentionTime(3), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "3"), + checkAccountAndDatabaseDataRetentionTime(t, id, 10, 3), + ), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PostApplyPostRefresh: []plancheck.PlanCheck{ + plancheck.ExpectEmptyPlan(), + }, + }, + }, + }, + }) +} + +func dbConfig(prefix string) string { + s := ` +resource "snowflake_database_old" "db" { + name = "%s" + comment = "test comment" +} +` + return fmt.Sprintf(s, prefix) +} + +func dbConfig2(prefix string) string { + s := ` +resource "snowflake_database_old" "db" { + name = "%s" + comment = "test comment 2" + data_retention_time_in_days = 3 +} +` + return fmt.Sprintf(s, prefix) +} + +func dbConfigWithReplication(prefix string, secondaryAccountName string) string { + s := ` +resource "snowflake_database_old" "db" { + name = "%s" + comment = "test comment 2" + data_retention_time_in_days = 3 + replication_configuration { + accounts = [ + "%s" + ] + } +} +` + return fmt.Sprintf(s, prefix, secondaryAccountName) +} + +// TODO [SNOW-936093]: this is used mostly as check for unsafe execute, not as normal check destroy in other resources. Handle with the helpers cleanup. 
+func testAccCheckDatabaseExistence(t *testing.T, id sdk.AccountObjectIdentifier, shouldExist bool) func(state *terraform.State) error { + t.Helper() + return func(state *terraform.State) error { + _, err := acc.TestClient().Database.Show(t, id) + if shouldExist { + if err != nil { + return fmt.Errorf("error while retrieving database %s, err = %w", id, err) + } + } else { + if err == nil { + return fmt.Errorf("database %v still exists", id) + } + } + return nil + } +} + +func testAccCheckIfDatabaseIsReplicated(t *testing.T, id string) func(state *terraform.State) error { + t.Helper() + return func(state *terraform.State) error { + client := acc.Client(t) + + ctx := context.Background() + replicationDatabases, err := client.ReplicationFunctions.ShowReplicationDatabases(ctx, nil) + if err != nil { + return err + } + + var exists bool + for _, o := range replicationDatabases { + if o.Name == id { + exists = true + break + } + } + + if !exists { + return fmt.Errorf("database %s should be replicated", id) + } + + return nil + } +} + +func checkAccountAndDatabaseDataRetentionTime(t *testing.T, id sdk.AccountObjectIdentifier, expectedAccountRetentionDays int, expectedDatabaseRetentionsDays int) func(state *terraform.State) error { + t.Helper() + return func(state *terraform.State) error { + providerContext := acc.TestAccProvider.Meta().(*provider.Context) + client := providerContext.Client + ctx := context.Background() + + database, err := acc.TestClient().Database.Show(t, id) + if err != nil { + return err + } + + if database.RetentionTime != expectedDatabaseRetentionsDays { + return fmt.Errorf("invalid database retention time, expected: %d, got: %d", expectedDatabaseRetentionsDays, database.RetentionTime) + } + + param, err := client.Parameters.ShowAccountParameter(ctx, sdk.AccountParameterDataRetentionTimeInDays) + if err != nil { + return err + } + accountRetentionDays, err := strconv.Atoi(param.Value) + if err != nil { + return err + } + + if accountRetentionDays != 
expectedAccountRetentionDays { + return fmt.Errorf("invalid account retention time, expected: %d, got: %d", expectedAccountRetentionDays, accountRetentionDays) + } + + return nil + } +} diff --git a/pkg/resources/database_state_upgraders.go b/pkg/resources/database_state_upgraders.go new file mode 100644 index 0000000000..df04f75d13 --- /dev/null +++ b/pkg/resources/database_state_upgraders.go @@ -0,0 +1,30 @@ +package resources + +import ( + "context" +) + +func v092DatabaseStateUpgrader(ctx context.Context, rawState map[string]any, meta any) (map[string]any, error) { + if rawState == nil { + return rawState, nil + } + + if replicationConfigurations, ok := rawState["replication_configuration"]; ok && len(replicationConfigurations.([]any)) == 1 { + replicationConfiguration := replicationConfigurations.([]any)[0].(map[string]any) + replication := make(map[string]any) + replication["ignore_edition_check"] = replicationConfiguration["ignore_edition_check"] + + accounts := replicationConfiguration["accounts"].([]any) + enableForAccounts := make([]map[string]any, len(accounts)) + for i, account := range accounts { + enableForAccounts[i] = map[string]any{ + "account_identifier": account, + } + } + + replication["enable_to_account"] = enableForAccounts + rawState["replication"] = []any{replication} + } + + return rawState, nil +} diff --git a/pkg/resources/grant_privileges_to_account_role_acceptance_test.go b/pkg/resources/grant_privileges_to_account_role_acceptance_test.go index 1d98b760e8..2699b0744d 100644 --- a/pkg/resources/grant_privileges_to_account_role_acceptance_test.go +++ b/pkg/resources/grant_privileges_to_account_role_acceptance_test.go @@ -1004,14 +1004,14 @@ func TestAcc_GrantPrivilegesToAccountRole_ImportedPrivileges(t *testing.T) { sharedDatabaseId := acc.TestClient().Ids.RandomAccountObjectIdentifier() sharedDatabaseName := sharedDatabaseId.Name() shareId := acc.TestClient().Ids.RandomAccountObjectIdentifier() - shareName := shareId.Name() roleName := acc.TestClient().Ids.Alpha() - secondaryAccountName :=
acc.SecondaryTestClient().Context.CurrentAccount(t) configVariables := config.Variables{ "role_name": config.StringVariable(roleName), "shared_database_name": config.StringVariable(sharedDatabaseName), - "share_name": config.StringVariable(shareName), - "account_name": config.StringVariable(secondaryAccountName), + "external_share_name": config.StringVariable(sdk.NewExternalObjectIdentifier( + acc.SecondaryTestClient().Account.GetAccountIdentifier(t), + shareId, + ).FullyQualifiedName()), "privileges": config.ListVariable( config.StringVariable(sdk.AccountObjectPrivilegeImportedPrivileges.String()), ), diff --git a/pkg/resources/grant_privileges_to_role_acceptance_test.go b/pkg/resources/grant_privileges_to_role_acceptance_test.go index b09a9747b9..121f1484a2 100644 --- a/pkg/resources/grant_privileges_to_role_acceptance_test.go +++ b/pkg/resources/grant_privileges_to_role_acceptance_test.go @@ -1081,14 +1081,14 @@ func TestAcc_GrantPrivilegesToRole_ImportedPrivileges(t *testing.T) { sharedDatabaseId := acc.TestClient().Ids.RandomAccountObjectIdentifier() sharedDatabaseName := sharedDatabaseId.Name() shareId := acc.TestClient().Ids.RandomAccountObjectIdentifier() - shareName := shareId.Name() roleName := acc.TestClient().Ids.Alpha() - secondaryAccountName := acc.SecondaryTestClient().Context.CurrentAccount(t) configVariables := config.Variables{ "role_name": config.StringVariable(roleName), "shared_database_name": config.StringVariable(sharedDatabaseName), - "share_name": config.StringVariable(shareName), - "account_name": config.StringVariable(secondaryAccountName), + "external_share_name": config.StringVariable(sdk.NewExternalObjectIdentifier( + acc.SecondaryTestClient().Account.GetAccountIdentifier(t), + shareId, + ).FullyQualifiedName()), "privileges": config.ListVariable( config.StringVariable(sdk.AccountObjectPrivilegeImportedPrivileges.String()), ), diff --git a/pkg/resources/helpers.go b/pkg/resources/helpers.go index 45c9389416..70315b087f 100644 --- 
a/pkg/resources/helpers.go +++ b/pkg/resources/helpers.go @@ -2,8 +2,11 @@ package resources import ( "fmt" + "slices" "strings" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/snowflake" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" @@ -138,6 +141,10 @@ func GetPropertyAsPointer[T any](d *schema.ResourceData, property string) *T { return &typedValue } +func GetPropertyOfFirstNestedObjectByValueKey[T any](d *schema.ResourceData, propertyKey string) (*T, error) { + return GetPropertyOfFirstNestedObjectByKey[T](d, propertyKey, "value") +} + // GetPropertyOfFirstNestedObjectByKey should be used for single objects defined in the Terraform schema as // schema.TypeList with MaxItems set to one and inner schema with single value. To easily retrieve // the inner value, you can specify the top-level property with propertyKey and the nested value with nestedValueKey. @@ -170,6 +177,21 @@ func GetPropertyOfFirstNestedObjectByKey[T any](d *schema.ResourceData, property return &typedNestedValue, nil } +func SetPropertyOfFirstNestedObjectByValueKey[T any](d *schema.ResourceData, propertyKey string, value T) error { + return SetPropertyOfFirstNestedObjectByKey[T](d, propertyKey, "value", value) +} + +// SetPropertyOfFirstNestedObjectByKey should be used for single objects defined in the Terraform schema as +// schema.TypeList with MaxItems set to one and inner schema with single value. To easily set +// the inner value, you can specify top-level property with propertyKey, nested value with nestedValueKey and value at the end. 
+func SetPropertyOfFirstNestedObjectByKey[T any](d *schema.ResourceData, propertyKey string, nestedValueKey string, value T) error { + return d.Set(propertyKey, []any{ + map[string]any{ + nestedValueKey: value, + }, + }) +} + type tags []tag func (t tags) toSnowflakeTagValues() []snowflake.TagValue { @@ -261,20 +283,46 @@ func getTags(from interface{}) (to tags) { return to } -func nestedProperty(innerType schema.ValueType, fieldDescription string) *schema.Schema { - return &schema.Schema{ - Type: schema.TypeList, - MaxItems: 1, - Elem: &schema.Resource{ - Schema: map[string]*schema.Schema{ - "value": { - Type: innerType, - Required: true, - }, - }, - }, - Computed: true, - Optional: true, - Description: fieldDescription, +func MergeMaps[M ~map[K]V, K comparable, V any](src ...M) M { + merged := make(M) + for _, m := range src { + for k, v := range m { + merged[k] = v + } } + return merged +} + +// TODO(SNOW-1479870): Test +// JoinDiags iterates through passed diag.Diagnostics and joins them into one diag.Diagnostics. +// If none of the passed diagnostics contained any element a nil reference will be returned. +func JoinDiags(diagnostics ...diag.Diagnostics) diag.Diagnostics { + var result diag.Diagnostics + for _, diagnostic := range diagnostics { + if len(diagnostic) > 0 { + result = append(result, diagnostic...) + } + } + return result +} + +// ListDiff Compares two lists (before and after), then compares and returns two lists that include +// added and removed items between those lists. 
+func ListDiff[T comparable](beforeList []T, afterList []T) (added []T, removed []T) { + added = make([]T, 0) + removed = make([]T, 0) + + for _, privilegeBeforeChange := range beforeList { + if !slices.Contains(afterList, privilegeBeforeChange) { + removed = append(removed, privilegeBeforeChange) + } + } + + for _, privilegeAfterChange := range afterList { + if !slices.Contains(beforeList, privilegeAfterChange) { + added = append(added, privilegeAfterChange) + } + } + + return added, removed } diff --git a/pkg/resources/helpers_test.go b/pkg/resources/helpers_test.go index cc9534ae7f..dabf7bbfc6 100644 --- a/pkg/resources/helpers_test.go +++ b/pkg/resources/helpers_test.go @@ -228,3 +228,71 @@ func TestGetFirstNestedObjectByKey(t *testing.T) { _, err = resources.GetPropertyOfFirstNestedObjectByKey[int](d, "string_property", "value") assert.ErrorContains(t, err, "nested property string_property.value is not of type int, got: string") } + +func TestListDiff(t *testing.T) { + testCases := []struct { + Name string + Before []any + After []any + Added []any + Removed []any + }{ + { + Name: "no changes", + Before: []any{1, 2, 3, 4}, + After: []any{1, 2, 3, 4}, + Removed: []any{}, + Added: []any{}, + }, + { + Name: "only removed", + Before: []any{1, 2, 3, 4}, + After: []any{}, + Removed: []any{1, 2, 3, 4}, + Added: []any{}, + }, + { + Name: "only added", + Before: []any{}, + After: []any{1, 2, 3, 4}, + Removed: []any{}, + Added: []any{1, 2, 3, 4}, + }, + { + Name: "added repeated items", + Before: []any{2}, + After: []any{1, 2, 1}, + Removed: []any{}, + Added: []any{1, 1}, + }, + { + Name: "removed repeated items", + Before: []any{1, 2, 1}, + After: []any{2}, + Removed: []any{1, 1}, + Added: []any{}, + }, + { + Name: "simple diff: ints", + Before: []any{1, 2, 3, 4, 5, 6, 7, 8, 9}, + After: []any{1, 3, 5, 7, 9, 12, 13, 14}, + Removed: []any{2, 4, 6, 8}, + Added: []any{12, 13, 14}, + }, + { + Name: "simple diff: strings", + Before: []any{"one", "two", "three", "four"}, + 
After: []any{"five", "two", "four", "six"}, + Removed: []any{"one", "three"}, + Added: []any{"five", "six"}, + }, + } + + for _, tc := range testCases { + t.Run(tc.Name, func(t *testing.T) { + added, removed := resources.ListDiff(tc.Before, tc.After) + assert.Equal(t, tc.Added, added) + assert.Equal(t, tc.Removed, removed) + }) + } +} diff --git a/pkg/resources/secondary_database.go b/pkg/resources/secondary_database.go index 3f1b64e7a1..f66f800042 100644 --- a/pkg/resources/secondary_database.go +++ b/pkg/resources/secondary_database.go @@ -4,13 +4,11 @@ import ( "context" "errors" "fmt" - "strconv" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/diag" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" ) @@ -32,64 +30,6 @@ var secondaryDatabaseSchema = map[string]*schema.Schema{ ForceNew: true, Description: "Specifies the database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss.", }, - "data_retention_time_in_days": nestedProperty( - schema.TypeInt, - "Specifies the number of days for which Time Travel actions (CLONE and UNDROP) can be performed on the database, as well as specifying the default Time Travel retention time for all schemas created in the database. 
For more details, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).", - ), - "max_data_extension_time_in_days": nestedProperty( - schema.TypeInt, - "Object parameter that specifies the maximum number of days for which Snowflake can extend the data retention period for tables in the database to prevent streams on the tables from becoming stale. For a detailed description of this parameter, see [MAX_DATA_EXTENSION_TIME_IN_DAYS](https://docs.snowflake.com/en/sql-reference/parameters.html#label-max-data-extension-time-in-days).", - ), - // TODO: Below parameters should be nested properties - "external_volume": { - Type: schema.TypeString, - Optional: true, - ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), - Description: "The database parameter that specifies the default external volume to use for Iceberg tables.", - }, - "catalog": { - Type: schema.TypeString, - Optional: true, - ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), - Description: "The database parameter that specifies the default catalog to use for Iceberg tables.", - }, - "replace_invalid_characters": { - Type: schema.TypeBool, - Optional: true, - Description: "Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table. You can only set this parameter for tables that use an external Iceberg catalog.", - }, - "default_ddl_collation": { - Type: schema.TypeString, - Optional: true, - Description: "Specifies a default collation specification for all schemas and tables added to the database. It can be overridden on schema or table level. 
For more information, see [collation specification](https://docs.snowflake.com/en/sql-reference/collation#label-collation-specification).", - }, - "storage_serialization_policy": { - Type: schema.TypeString, - Optional: true, - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllStorageSerializationPolicies), true), - Description: fmt.Sprintf("Specifies the storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: %v. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake.", sdk.AsStringList(sdk.AllStorageSerializationPolicies)), - DiffSuppressFunc: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return d.Get(k).(string) == string(sdk.StorageSerializationPolicyOptimized) && newValue == "" - }, - }, - "log_level": { - Type: schema.TypeString, - Optional: true, - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllLogLevels), true), - DiffSuppressFunc: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return d.Get(k).(string) == string(sdk.LogLevelOff) && newValue == "" - }, - Description: fmt.Sprintf("Specifies the severity level of messages that should be ingested and made available in the active event table. Valid options are: %v. Messages at the specified level (and at more severe levels) are ingested. 
For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level).", sdk.AsStringList(sdk.AllLogLevels)), - }, - "trace_level": { - Type: schema.TypeString, - Optional: true, - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllTraceLevels), true), - DiffSuppressFunc: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return d.Get(k).(string) == string(sdk.TraceLevelOff) && newValue == "" - }, - Description: fmt.Sprintf("Controls how trace events are ingested into the event table. Valid options are: %v. For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level).", sdk.AsStringList(sdk.AllTraceLevels)), - }, "comment": { Type: schema.TypeString, Optional: true, @@ -105,12 +45,8 @@ func SecondaryDatabase() *schema.Resource { DeleteContext: DeleteSecondaryDatabase, Description: "A secondary database creates a replica of an existing primary database (i.e. a secondary database). 
For more information about database replication, see [Introduction to database replication across multiple accounts](https://docs.snowflake.com/en/user-guide/db-replication-intro).", - CustomizeDiff: customdiff.All( - NestedIntValueAccountObjectComputedIf("data_retention_time_in_days", sdk.AccountParameterDataRetentionTimeInDays), - NestedIntValueAccountObjectComputedIf("max_data_extension_time_in_days", sdk.AccountParameterMaxDataExtensionTimeInDays), - ), - - Schema: secondaryDatabaseSchema, + CustomizeDiff: DatabaseParametersCustomDiff, + Schema: MergeMaps(secondaryDatabaseSchema, DatabaseParametersSchema), Importer: &schema.ResourceImporter{ StateContext: schema.ImportStatePassthroughContext, }, @@ -123,46 +59,46 @@ func CreateSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta a secondaryDatabaseId := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) primaryDatabaseId := sdk.NewExternalObjectIdentifierFromFullyQualifiedName(d.Get("as_replica_of").(string)) - dataRetentionTimeInDays, _ := GetPropertyOfFirstNestedObjectByKey[int](d, "data_retention_time_in_days", "value") - maxDataExtensionTimeInDays, _ := GetPropertyOfFirstNestedObjectByKey[int](d, "max_data_extension_time_in_days", "value") - - var externalVolume *sdk.AccountObjectIdentifier - if v, ok := d.GetOk("external_volume"); ok { - externalVolume = sdk.Pointer(sdk.NewAccountObjectIdentifier(v.(string))) - } - - var catalog *sdk.AccountObjectIdentifier - if v, ok := d.GetOk("catalog"); ok { - catalog = sdk.Pointer(sdk.NewAccountObjectIdentifier(v.(string))) - } - - var storageSerializationPolicy *sdk.StorageSerializationPolicy - if v, ok := d.GetOk("storage_serialization_policy"); ok { - storageSerializationPolicy = sdk.Pointer(sdk.StorageSerializationPolicy(v.(string))) - } - - var logLevel *sdk.LogLevel - if v, ok := d.GetOk("log_level"); ok { - logLevel = sdk.Pointer(sdk.LogLevel(v.(string))) - } - - var traceLevel *sdk.TraceLevel - if v, ok := d.GetOk("trace_level"); ok { - 
traceLevel = sdk.Pointer(sdk.TraceLevel(v.(string))) + dataRetentionTimeInDays, + maxDataExtensionTimeInDays, + externalVolume, + catalog, + replaceInvalidCharacters, + defaultDDLCollation, + storageSerializationPolicy, + logLevel, + traceLevel, + suspendTaskAfterNumFailures, + taskAutoRetryAttempts, + userTaskManagedInitialWarehouseSize, + userTaskTimeoutMs, + userTaskMinimumTriggerIntervalInSeconds, + quotedIdentifiersIgnoreCase, + enableConsoleOutput, + err := GetAllDatabaseParameters(d) + if err != nil { + return diag.FromErr(err) } - err := client.Databases.CreateSecondary(ctx, secondaryDatabaseId, primaryDatabaseId, &sdk.CreateSecondaryDatabaseOptions{ - Transient: GetPropertyAsPointer[bool](d, "is_transient"), - DataRetentionTimeInDays: dataRetentionTimeInDays, - MaxDataExtensionTimeInDays: maxDataExtensionTimeInDays, - ExternalVolume: externalVolume, - Catalog: catalog, - ReplaceInvalidCharacters: GetPropertyAsPointer[bool](d, "replace_invalid_characters"), - DefaultDDLCollation: GetPropertyAsPointer[string](d, "default_ddl_collation"), - StorageSerializationPolicy: storageSerializationPolicy, - LogLevel: logLevel, - TraceLevel: traceLevel, - Comment: GetPropertyAsPointer[string](d, "comment"), + err = client.Databases.CreateSecondary(ctx, secondaryDatabaseId, primaryDatabaseId, &sdk.CreateSecondaryDatabaseOptions{ + Transient: GetPropertyAsPointer[bool](d, "is_transient"), + DataRetentionTimeInDays: dataRetentionTimeInDays, + MaxDataExtensionTimeInDays: maxDataExtensionTimeInDays, + ExternalVolume: externalVolume, + Catalog: catalog, + ReplaceInvalidCharacters: replaceInvalidCharacters, + DefaultDDLCollation: defaultDDLCollation, + StorageSerializationPolicy: storageSerializationPolicy, + LogLevel: logLevel, + TraceLevel: traceLevel, + SuspendTaskAfterNumFailures: suspendTaskAfterNumFailures, + TaskAutoRetryAttempts: taskAutoRetryAttempts, + UserTaskManagedInitialWarehouseSize: userTaskManagedInitialWarehouseSize, + UserTaskTimeoutMs: userTaskTimeoutMs, + 
UserTaskMinimumTriggerIntervalInSeconds: userTaskMinimumTriggerIntervalInSeconds, + QuotedIdentifiersIgnoreCase: quotedIdentifiersIgnoreCase, + EnableConsoleOutput: enableConsoleOutput, + Comment: GetPropertyAsPointer[string](d, "comment"), }) if err != nil { return diag.FromErr(err) @@ -189,95 +125,11 @@ func UpdateSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta a secondaryDatabaseId = newId } - var databaseSetRequest sdk.DatabaseSet - var databaseUnsetRequest sdk.DatabaseUnset - - if d.HasChange("data_retention_time_in_days") { - dataRetentionObject, ok := d.GetOk("data_retention_time_in_days") - if ok && len(dataRetentionObject.([]any)) > 0 { - dataRetentionTimeInDays, err := GetPropertyOfFirstNestedObjectByKey[int](d, "data_retention_time_in_days", "value") - if err != nil { - return diag.FromErr(err) - } - databaseSetRequest.DataRetentionTimeInDays = dataRetentionTimeInDays - } else { - databaseUnsetRequest.DataRetentionTimeInDays = sdk.Bool(true) - } - } - - if d.HasChange("max_data_extension_time_in_days") { - maxDataExtensionTimeInDaysObject, ok := d.GetOk("max_data_extension_time_in_days") - if ok && len(maxDataExtensionTimeInDaysObject.([]any)) > 0 { - maxDataExtensionTimeInDays, err := GetPropertyOfFirstNestedObjectByKey[int](d, "max_data_extension_time_in_days", "value") - if err != nil { - return diag.FromErr(err) - } - databaseSetRequest.MaxDataExtensionTimeInDays = maxDataExtensionTimeInDays - } else { - databaseUnsetRequest.MaxDataExtensionTimeInDays = sdk.Bool(true) - } - } - - if d.HasChange("external_volume") { - externalVolume := d.Get("external_volume").(string) - if len(externalVolume) > 0 { - databaseSetRequest.ExternalVolume = sdk.Pointer(sdk.NewAccountObjectIdentifier(externalVolume)) - } else { - databaseUnsetRequest.ExternalVolume = sdk.Bool(true) - } - } - - if d.HasChange("catalog") { - catalog := d.Get("catalog").(string) - if len(catalog) > 0 { - databaseSetRequest.Catalog = 
sdk.Pointer(sdk.NewAccountObjectIdentifier(catalog)) - } else { - databaseUnsetRequest.Catalog = sdk.Bool(true) - } - } - - if d.HasChange("replace_invalid_characters") { - if d.Get("replace_invalid_characters").(bool) { - databaseSetRequest.ReplaceInvalidCharacters = sdk.Bool(true) - } else { - databaseUnsetRequest.ReplaceInvalidCharacters = sdk.Bool(true) - } - } - - if d.HasChange("default_ddl_collation") { - defaultDdlCollation := d.Get("default_ddl_collation").(string) - if len(defaultDdlCollation) > 0 { - databaseSetRequest.DefaultDDLCollation = &defaultDdlCollation - } else { - databaseUnsetRequest.DefaultDDLCollation = sdk.Bool(true) - } - } + databaseSetRequest := new(sdk.DatabaseSet) + databaseUnsetRequest := new(sdk.DatabaseUnset) - if d.HasChange("storage_serialization_policy") { - storageSerializationPolicy := d.Get("storage_serialization_policy").(string) - if len(storageSerializationPolicy) > 0 { - databaseSetRequest.StorageSerializationPolicy = sdk.Pointer(sdk.StorageSerializationPolicy(storageSerializationPolicy)) - } else { - databaseUnsetRequest.StorageSerializationPolicy = sdk.Bool(true) - } - } - - if d.HasChange("log_level") { - logLevel := d.Get("log_level").(string) - if len(logLevel) > 0 { - databaseSetRequest.LogLevel = sdk.Pointer(sdk.LogLevel(logLevel)) - } else { - databaseUnsetRequest.LogLevel = sdk.Bool(true) - } - } - - if d.HasChange("trace_level") { - traceLevel := d.Get("trace_level").(string) - if len(traceLevel) > 0 { - databaseSetRequest.TraceLevel = sdk.Pointer(sdk.TraceLevel(traceLevel)) - } else { - databaseUnsetRequest.TraceLevel = sdk.Bool(true) - } + if updateParamDiags := HandleDatabaseParametersChanges(d, databaseSetRequest, databaseUnsetRequest); len(updateParamDiags) > 0 { + return updateParamDiags } if d.HasChange("comment") { @@ -289,18 +141,18 @@ func UpdateSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta a } } - if (databaseSetRequest != sdk.DatabaseSet{}) { + if (*databaseSetRequest != 
sdk.DatabaseSet{}) { err := client.Databases.Alter(ctx, secondaryDatabaseId, &sdk.AlterDatabaseOptions{ - Set: &databaseSetRequest, + Set: databaseSetRequest, }) if err != nil { return diag.FromErr(err) } } - if (databaseUnsetRequest != sdk.DatabaseUnset{}) { + if (*databaseUnsetRequest != sdk.DatabaseUnset{}) { err := client.Databases.Alter(ctx, secondaryDatabaseId, &sdk.AlterDatabaseOptions{ - Unset: &databaseUnsetRequest, + Unset: databaseUnsetRequest, }) if err != nil { return diag.FromErr(err) @@ -329,15 +181,6 @@ func ReadSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta any return diag.FromErr(err) } - secondaryDatabaseParameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ - In: &sdk.ParametersIn{ - Database: secondaryDatabaseId, - }, - }) - if err != nil { - return diag.FromErr(err) - } - replicationDatabases, err := client.ReplicationFunctions.ShowReplicationDatabases(ctx, &sdk.ShowReplicationDatabasesOptions{ Like: &sdk.Like{ Pattern: sdk.String(secondaryDatabaseId.Name()), @@ -372,57 +215,21 @@ func ReadSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta any return diag.FromErr(err) } - if err := d.Set("data_retention_time_in_days", []any{map[string]any{"value": secondaryDatabase.RetentionTime}}); err != nil { + if err := d.Set("comment", secondaryDatabase.Comment); err != nil { return diag.FromErr(err) } - if err := d.Set("comment", secondaryDatabase.Comment); err != nil { + secondaryDatabaseParameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: secondaryDatabaseId, + }, + }) + if err != nil { return diag.FromErr(err) } - for _, secondaryDatabaseParameter := range secondaryDatabaseParameters { - switch secondaryDatabaseParameter.Key { - case "MAX_DATA_EXTENSION_TIME_IN_DAYS": - maxDataExtensionTimeInDays, err := strconv.Atoi(secondaryDatabaseParameter.Value) - if err != nil { - return diag.FromErr(err) - } - if err := 
d.Set("max_data_extension_time_in_days", []any{map[string]any{"value": maxDataExtensionTimeInDays}}); err != nil { - return diag.FromErr(err) - } - case "EXTERNAL_VOLUME": - if err := d.Set("external_volume", secondaryDatabaseParameter.Value); err != nil { - return diag.FromErr(err) - } - case "CATALOG": - if err := d.Set("catalog", secondaryDatabaseParameter.Value); err != nil { - return diag.FromErr(err) - } - case "DEFAULT_DDL_COLLATION": - if err := d.Set("default_ddl_collation", secondaryDatabaseParameter.Value); err != nil { - return diag.FromErr(err) - } - case "LOG_LEVEL": - if err := d.Set("log_level", secondaryDatabaseParameter.Value); err != nil { - return diag.FromErr(err) - } - case "TRACE_LEVEL": - if err := d.Set("trace_level", secondaryDatabaseParameter.Value); err != nil { - return diag.FromErr(err) - } - case "REPLACE_INVALID_CHARACTERS": - boolValue, err := strconv.ParseBool(secondaryDatabaseParameter.Value) - if err != nil { - return diag.FromErr(err) - } - if err := d.Set("replace_invalid_characters", boolValue); err != nil { - return diag.FromErr(err) - } - case "STORAGE_SERIALIZATION_POLICY": - if err := d.Set("storage_serialization_policy", secondaryDatabaseParameter.Value); err != nil { - return diag.FromErr(err) - } - } + if diags := HandleDatabaseParameterRead(d, secondaryDatabaseParameters); diags != nil { + return diags } return nil diff --git a/pkg/resources/secondary_database_acceptance_test.go b/pkg/resources/secondary_database_acceptance_test.go index 4b2186eb8d..921025ef90 100644 --- a/pkg/resources/secondary_database_acceptance_test.go +++ b/pkg/resources/secondary_database_acceptance_test.go @@ -4,6 +4,8 @@ import ( "context" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" @@ -14,9 +16,7 @@ import ( "github.com/stretchr/testify/require" ) -func TestAcc_CreateSecondaryDatabase_minimal(t *testing.T) { - t.Skip("To be unskipped in the next database pr") - +func TestAcc_CreateSecondaryDatabase_Basic(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() comment := random.Comment() @@ -28,11 +28,24 @@ func TestAcc_CreateSecondaryDatabase_minimal(t *testing.T) { newId := acc.TestClient().Ids.RandomAccountObjectIdentifier() newComment := random.Comment() - accountDataRetentionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), sdk.AccountParameterDataRetentionTimeInDays) - require.NoError(t, err) - - accountMaxDataExtensionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), sdk.AccountParameterMaxDataExtensionTimeInDays) - require.NoError(t, err) + var ( + accountDataRetentionTimeInDays = new(string) + accountMaxDataExtensionTimeInDays = new(string) + accountExternalVolume = new(string) + accountCatalog = new(string) + accountReplaceInvalidCharacters = new(string) + accountDefaultDdlCollation = new(string) + accountStorageSerializationPolicy = new(string) + accountLogLevel = new(string) + accountTraceLevel = new(string) + accountSuspendTaskAfterNumFailures = new(string) + accountTaskAutoRetryAttempts = new(string) + accountUserTaskMangedInitialWarehouseSize = new(string) + accountUserTaskTimeoutMs = new(string) + accountUserTaskMinimumTriggerIntervalInSeconds = new(string) + accountQuotedIdentifiersIgnoreCase = new(string) + accountEnableConsoleOutput = new(string) + ) configVariables := func(id sdk.AccountObjectIdentifier, primaryDatabaseName sdk.ExternalObjectIdentifier, comment string) config.Variables { return config.Variables{ @@ -51,21 +64,48 @@ func TestAcc_CreateSecondaryDatabase_minimal(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, 
resources.SecondaryDatabase), Steps: []resource.TestStep{ { + PreConfig: func() { + params := acc.TestClient().Parameter.ShowAccountParameters(t) + *accountDataRetentionTimeInDays = helpers.FindParameter(t, params, sdk.AccountParameterDataRetentionTimeInDays).Value + *accountMaxDataExtensionTimeInDays = helpers.FindParameter(t, params, sdk.AccountParameterMaxDataExtensionTimeInDays).Value + *accountExternalVolume = helpers.FindParameter(t, params, sdk.AccountParameterExternalVolume).Value + *accountCatalog = helpers.FindParameter(t, params, sdk.AccountParameterCatalog).Value + *accountReplaceInvalidCharacters = helpers.FindParameter(t, params, sdk.AccountParameterReplaceInvalidCharacters).Value + *accountDefaultDdlCollation = helpers.FindParameter(t, params, sdk.AccountParameterDefaultDDLCollation).Value + *accountStorageSerializationPolicy = helpers.FindParameter(t, params, sdk.AccountParameterStorageSerializationPolicy).Value + *accountLogLevel = helpers.FindParameter(t, params, sdk.AccountParameterLogLevel).Value + *accountTraceLevel = helpers.FindParameter(t, params, sdk.AccountParameterTraceLevel).Value + *accountSuspendTaskAfterNumFailures = helpers.FindParameter(t, params, sdk.AccountParameterSuspendTaskAfterNumFailures).Value + *accountTaskAutoRetryAttempts = helpers.FindParameter(t, params, sdk.AccountParameterTaskAutoRetryAttempts).Value + *accountUserTaskMangedInitialWarehouseSize = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskManagedInitialWarehouseSize).Value + *accountUserTaskTimeoutMs = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskTimeoutMs).Value + *accountUserTaskMinimumTriggerIntervalInSeconds = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds).Value + *accountQuotedIdentifiersIgnoreCase = helpers.FindParameter(t, params, sdk.AccountParameterQuotedIdentifiersIgnoreCase).Value + *accountEnableConsoleOutput = helpers.FindParameter(t, params, 
sdk.AccountParameterEnableConsoleOutput).Value + }, ConfigVariables: configVariables(id, externalPrimaryId, comment), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/basic"), Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_secondary_database.test", "name", id.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "as_replica_of", externalPrimaryId.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", accountDataRetentionTimeInDays.Value), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days.0.value", accountMaxDataExtensionTimeInDays.Value), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "external_volume", ""), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "catalog", ""), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "replace_invalid_characters", "false"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "default_ddl_collation", ""), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "storage_serialization_policy", "OPTIMIZED"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", "OFF"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", "OFF"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", comment), + + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "external_volume", accountExternalVolume), + 
resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, // Rename + comment update @@ -75,16 +115,24 @@ func TestAcc_CreateSecondaryDatabase_minimal(t *testing.T) { Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_secondary_database.test", "name", newId.Name()), 
resource.TestCheckResourceAttr("snowflake_secondary_database.test", "as_replica_of", externalPrimaryId.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", accountDataRetentionTimeInDays.Value), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days.0.value", accountMaxDataExtensionTimeInDays.Value), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "external_volume", ""), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "catalog", ""), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "replace_invalid_characters", "false"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "default_ddl_collation", ""), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "storage_serialization_policy", "OPTIMIZED"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", "OFF"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", "OFF"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", newComment), + + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "default_ddl_collation", accountDefaultDdlCollation), + 
resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, // Import all values @@ -100,8 +148,6 @@ func TestAcc_CreateSecondaryDatabase_minimal(t *testing.T) { } func TestAcc_CreateSecondaryDatabase_complete(t *testing.T) { - t.Skip("To be unskipped in the next database pr") - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() comment := random.Comment() @@ -125,47 +171,74 @@ func TestAcc_CreateSecondaryDatabase_complete(t *testing.T) { newCatalogId, newCatalogCleanup := acc.TestClient().CatalogIntegration.Create(t) t.Cleanup(newCatalogCleanup) - accountDataRetentionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), 
sdk.AccountParameterDataRetentionTimeInDays) - require.NoError(t, err) + var ( + accountDataRetentionTimeInDays = new(string) + accountMaxDataExtensionTimeInDays = new(string) + accountExternalVolume = new(string) + accountCatalog = new(string) + accountReplaceInvalidCharacters = new(string) + accountDefaultDdlCollation = new(string) + accountStorageSerializationPolicy = new(string) + accountLogLevel = new(string) + accountTraceLevel = new(string) + accountSuspendTaskAfterNumFailures = new(string) + accountTaskAutoRetryAttempts = new(string) + accountUserTaskMangedInitialWarehouseSize = new(string) + accountUserTaskTimeoutMs = new(string) + accountUserTaskMinimumTriggerIntervalInSeconds = new(string) + accountQuotedIdentifiersIgnoreCase = new(string) + accountEnableConsoleOutput = new(string) + ) - accountMaxDataExtensionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), sdk.AccountParameterMaxDataExtensionTimeInDays) - require.NoError(t, err) + unsetConfigVariables := config.Variables{ + "name": config.StringVariable(id.Name()), + "as_replica_of": config.StringVariable(externalPrimaryId.FullyQualifiedName()), + } - configVariables := func( - id sdk.AccountObjectIdentifier, - primaryDatabaseName sdk.ExternalObjectIdentifier, - transient bool, - dataRetentionTimeInDays *int, - maxDataExtensionTimeInDays *int, - externalVolume string, - catalog string, - replaceInvalidCharacters bool, - defaultDdlCollation string, - storageSerializationPolicy sdk.StorageSerializationPolicy, - logLevel sdk.LogLevel, - traceLevel sdk.TraceLevel, - comment string, - ) config.Variables { - variables := config.Variables{ - "name": config.StringVariable(id.Name()), - "as_replica_of": config.StringVariable(primaryDatabaseName.FullyQualifiedName()), - "transient": config.BoolVariable(transient), - "external_volume": config.StringVariable(externalVolume), - "catalog": config.StringVariable(catalog), - "replace_invalid_characters": 
config.BoolVariable(replaceInvalidCharacters), - "default_ddl_collation": config.StringVariable(defaultDdlCollation), - "storage_serialization_policy": config.StringVariable(string(storageSerializationPolicy)), - "log_level": config.StringVariable(string(logLevel)), - "trace_level": config.StringVariable(string(traceLevel)), - "comment": config.StringVariable(comment), - } - if dataRetentionTimeInDays != nil { - variables["data_retention_time_in_days"] = config.IntegerVariable(*dataRetentionTimeInDays) - } - if maxDataExtensionTimeInDays != nil { - variables["max_data_extension_time_in_days"] = config.IntegerVariable(*maxDataExtensionTimeInDays) - } - return variables + setConfigVariables := config.Variables{ + "name": config.StringVariable(id.Name()), + "as_replica_of": config.StringVariable(externalPrimaryId.FullyQualifiedName()), + "comment": config.StringVariable(comment), + + "data_retention_time_in_days": config.IntegerVariable(20), + "max_data_extension_time_in_days": config.IntegerVariable(25), + "external_volume": config.StringVariable(externalVolumeId.Name()), + "catalog": config.StringVariable(catalogId.Name()), + "replace_invalid_characters": config.BoolVariable(true), + "default_ddl_collation": config.StringVariable("en_US"), + "storage_serialization_policy": config.StringVariable(string(sdk.StorageSerializationPolicyCompatible)), + "log_level": config.StringVariable(string(sdk.LogLevelDebug)), + "trace_level": config.StringVariable(string(sdk.TraceLevelAlways)), + "suspend_task_after_num_failures": config.IntegerVariable(20), + "task_auto_retry_attempts": config.IntegerVariable(20), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeLarge)), + "user_task_timeout_ms": config.IntegerVariable(1200000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(60), + "quoted_identifiers_ignore_case": config.BoolVariable(true), + "enable_console_output": config.BoolVariable(true), + } + + 
updatedConfigVariables := config.Variables{ + "name": config.StringVariable(newId.Name()), + "as_replica_of": config.StringVariable(externalPrimaryId.FullyQualifiedName()), + "comment": config.StringVariable(newComment), + + "data_retention_time_in_days": config.IntegerVariable(40), + "max_data_extension_time_in_days": config.IntegerVariable(45), + "external_volume": config.StringVariable(newExternalVolumeId.Name()), + "catalog": config.StringVariable(newCatalogId.Name()), + "replace_invalid_characters": config.BoolVariable(false), + "default_ddl_collation": config.StringVariable("en_GB"), + "storage_serialization_policy": config.StringVariable(string(sdk.StorageSerializationPolicyOptimized)), + "log_level": config.StringVariable(string(sdk.LogLevelInfo)), + "trace_level": config.StringVariable(string(sdk.TraceLevelOnEvent)), + "suspend_task_after_num_failures": config.IntegerVariable(40), + "task_auto_retry_attempts": config.IntegerVariable(40), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeXLarge)), + "user_task_timeout_ms": config.IntegerVariable(2400000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(120), + "quoted_identifiers_ignore_case": config.BoolVariable(false), + "enable_console_output": config.BoolVariable(false), } resource.Test(t, resource.TestCase{ @@ -177,121 +250,135 @@ func TestAcc_CreateSecondaryDatabase_complete(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.SecondaryDatabase), Steps: []resource.TestStep{ { - ConfigVariables: configVariables( - id, - externalPrimaryId, - false, - sdk.Int(2), - sdk.Int(5), - externalVolumeId.Name(), - catalogId.Name(), - true, - "en_US", - sdk.StorageSerializationPolicyOptimized, - sdk.LogLevelInfo, - sdk.TraceLevelOnEvent, - comment, - ), + PreConfig: func() { + params := acc.TestClient().Parameter.ShowAccountParameters(t) + *accountDataRetentionTimeInDays = helpers.FindParameter(t, params, 
sdk.AccountParameterDataRetentionTimeInDays).Value + *accountMaxDataExtensionTimeInDays = helpers.FindParameter(t, params, sdk.AccountParameterMaxDataExtensionTimeInDays).Value + *accountExternalVolume = helpers.FindParameter(t, params, sdk.AccountParameterExternalVolume).Value + *accountCatalog = helpers.FindParameter(t, params, sdk.AccountParameterCatalog).Value + *accountReplaceInvalidCharacters = helpers.FindParameter(t, params, sdk.AccountParameterReplaceInvalidCharacters).Value + *accountDefaultDdlCollation = helpers.FindParameter(t, params, sdk.AccountParameterDefaultDDLCollation).Value + *accountStorageSerializationPolicy = helpers.FindParameter(t, params, sdk.AccountParameterStorageSerializationPolicy).Value + *accountLogLevel = helpers.FindParameter(t, params, sdk.AccountParameterLogLevel).Value + *accountTraceLevel = helpers.FindParameter(t, params, sdk.AccountParameterTraceLevel).Value + *accountSuspendTaskAfterNumFailures = helpers.FindParameter(t, params, sdk.AccountParameterSuspendTaskAfterNumFailures).Value + *accountTaskAutoRetryAttempts = helpers.FindParameter(t, params, sdk.AccountParameterTaskAutoRetryAttempts).Value + *accountUserTaskMangedInitialWarehouseSize = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskManagedInitialWarehouseSize).Value + *accountUserTaskTimeoutMs = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskTimeoutMs).Value + *accountUserTaskMinimumTriggerIntervalInSeconds = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds).Value + *accountQuotedIdentifiersIgnoreCase = helpers.FindParameter(t, params, sdk.AccountParameterQuotedIdentifiersIgnoreCase).Value + *accountEnableConsoleOutput = helpers.FindParameter(t, params, sdk.AccountParameterEnableConsoleOutput).Value + }, + ConfigVariables: setConfigVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), Check: resource.ComposeTestCheckFunc( 
resource.TestCheckResourceAttr("snowflake_secondary_database.test", "name", id.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "is_transient", "false"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "as_replica_of", externalPrimaryId.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", "2"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days.0.value", "5"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", comment), + + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", "20"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days", "25"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "external_volume", externalVolumeId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "catalog", catalogId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "replace_invalid_characters", "true"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "default_ddl_collation", "en_US"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyOptimized)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", string(sdk.LogLevelInfo)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", string(sdk.TraceLevelOnEvent)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyCompatible)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", 
string(sdk.LogLevelDebug)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", string(sdk.TraceLevelAlways)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "suspend_task_after_num_failures", "20"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "task_auto_retry_attempts", "20"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_managed_initial_warehouse_size", "LARGE"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_timeout_ms", "1200000"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_minimum_trigger_interval_in_seconds", "60"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "quoted_identifiers_ignore_case", "true"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "enable_console_output", "true"), ), }, { - ConfigVariables: configVariables( - newId, - externalPrimaryId, - false, - nil, - nil, - newExternalVolumeId.Name(), - newCatalogId.Name(), - false, - "en_GB", - sdk.StorageSerializationPolicyOptimized, - sdk.LogLevelDebug, - sdk.TraceLevelAlways, - newComment, - ), - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-unset"), + ConfigVariables: updatedConfigVariables, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_secondary_database.test", "name", newId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "is_transient", "false"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "as_replica_of", externalPrimaryId.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", accountDataRetentionTimeInDays.Value), - 
resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days.0.value", accountMaxDataExtensionTimeInDays.Value), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", newComment), + + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", "40"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days", "45"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "external_volume", newExternalVolumeId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "catalog", newCatalogId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "replace_invalid_characters", "false"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "default_ddl_collation", "en_GB"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyOptimized)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", string(sdk.LogLevelDebug)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", string(sdk.TraceLevelAlways)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", newComment), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", string(sdk.LogLevelInfo)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", string(sdk.TraceLevelOnEvent)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "suspend_task_after_num_failures", "40"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "task_auto_retry_attempts", "40"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_managed_initial_warehouse_size", "XLARGE"), + 
resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_timeout_ms", "2400000"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_minimum_trigger_interval_in_seconds", "120"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "quoted_identifiers_ignore_case", "false"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "enable_console_output", "false"), ), }, { - ConfigVariables: configVariables( - id, - externalPrimaryId, - false, - sdk.Int(2), - sdk.Int(5), - externalVolumeId.Name(), - catalogId.Name(), - true, - "en_US", - sdk.StorageSerializationPolicyCompatible, - sdk.LogLevelInfo, - sdk.TraceLevelOnEvent, - comment, + ConfigVariables: unsetConfigVariables, + ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-unset"), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "is_transient", "false"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "as_replica_of", externalPrimaryId.FullyQualifiedName()), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", ""), + + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "max_data_extension_time_in_days", accountMaxDataExtensionTimeInDays), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + 
resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskMangedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_secondary_database.test", "enable_console_output", accountEnableConsoleOutput), ), + }, + { + ConfigVariables: setConfigVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_secondary_database.test", "name", id.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "is_transient", "false"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "as_replica_of", externalPrimaryId.FullyQualifiedName()), - 
resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", "2"), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days.0.value", "5"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", comment), + + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", "20"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "max_data_extension_time_in_days", "25"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "external_volume", externalVolumeId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "catalog", catalogId.Name()), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "replace_invalid_characters", "true"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "default_ddl_collation", "en_US"), resource.TestCheckResourceAttr("snowflake_secondary_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyCompatible)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", string(sdk.LogLevelInfo)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", string(sdk.TraceLevelOnEvent)), - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "log_level", string(sdk.LogLevelDebug)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "trace_level", string(sdk.TraceLevelAlways)), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "suspend_task_after_num_failures", "20"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "task_auto_retry_attempts", "20"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_managed_initial_warehouse_size", 
"LARGE"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_timeout_ms", "1200000"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "user_task_minimum_trigger_interval_in_seconds", "60"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "quoted_identifiers_ignore_case", "true"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "enable_console_output", "true"), ), }, // Import all values { - ConfigVariables: configVariables( - id, - externalPrimaryId, - false, - sdk.Int(2), - sdk.Int(5), - externalVolumeId.Name(), - catalogId.Name(), - true, - "en_US", - sdk.StorageSerializationPolicyCompatible, - sdk.LogLevelInfo, - sdk.TraceLevelOnEvent, - comment, - ), + ConfigVariables: setConfigVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), ResourceName: "snowflake_secondary_database.test", ImportState: true, @@ -302,8 +389,6 @@ func TestAcc_CreateSecondaryDatabase_complete(t *testing.T) { } func TestAcc_CreateSecondaryDatabase_DataRetentionTimeInDays(t *testing.T) { - t.Skip("To be unskipped in the next database pr") - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() _, externalPrimaryId, primaryDatabaseCleanup := acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ @@ -314,27 +399,41 @@ func TestAcc_CreateSecondaryDatabase_DataRetentionTimeInDays(t *testing.T) { accountDataRetentionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), sdk.AccountParameterDataRetentionTimeInDays) require.NoError(t, err) + externalVolumeId, externalVolumeCleanup := acc.TestClient().ExternalVolume.Create(t) + t.Cleanup(externalVolumeCleanup) + + catalogId, catalogCleanup := acc.TestClient().CatalogIntegration.Create(t) + t.Cleanup(catalogCleanup) + configVariables := func( id sdk.AccountObjectIdentifier, primaryDatabaseName sdk.ExternalObjectIdentifier, 
dataRetentionTimeInDays *int, ) config.Variables { variables := config.Variables{ - "name": config.StringVariable(id.Name()), - "as_replica_of": config.StringVariable(primaryDatabaseName.FullyQualifiedName()), - "transient": config.BoolVariable(false), - "external_volume": config.StringVariable(""), - "catalog": config.StringVariable(""), - "replace_invalid_characters": config.StringVariable("false"), - "default_ddl_collation": config.StringVariable(""), - "storage_serialization_policy": config.StringVariable("OPTIMIZED"), - "log_level": config.StringVariable("OFF"), - "trace_level": config.StringVariable("OFF"), - "comment": config.StringVariable(""), + "name": config.StringVariable(id.Name()), + "as_replica_of": config.StringVariable(primaryDatabaseName.FullyQualifiedName()), + "transient": config.BoolVariable(false), + "comment": config.StringVariable(""), + + "max_data_extension_time_in_days": config.IntegerVariable(10), + "external_volume": config.StringVariable(externalVolumeId.Name()), + "catalog": config.StringVariable(catalogId.Name()), + "replace_invalid_characters": config.BoolVariable(true), + "default_ddl_collation": config.StringVariable("en_US"), + "storage_serialization_policy": config.StringVariable("OPTIMIZED"), + "log_level": config.StringVariable("OFF"), + "trace_level": config.StringVariable("OFF"), + "suspend_task_after_num_failures": config.IntegerVariable(10), + "task_auto_retry_attempts": config.IntegerVariable(10), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeSmall)), + "user_task_timeout_ms": config.IntegerVariable(120000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(120), + "quoted_identifiers_ignore_case": config.BoolVariable(true), + "enable_console_output": config.BoolVariable(true), } if dataRetentionTimeInDays != nil { variables["data_retention_time_in_days"] = config.IntegerVariable(*dataRetentionTimeInDays) - variables["max_data_extension_time_in_days"] = 
config.IntegerVariable(10) } return variables } @@ -353,21 +452,21 @@ func TestAcc_CreateSecondaryDatabase_DataRetentionTimeInDays(t *testing.T) { ConfigVariables: configVariables(id, externalPrimaryId, sdk.Int(2)), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", "2"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", "2"), ), }, { ConfigVariables: configVariables(id, externalPrimaryId, sdk.Int(1)), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", "1"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", "1"), ), }, { ConfigVariables: configVariables(id, externalPrimaryId, nil), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-unset"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", accountDataRetentionTimeInDays.Value), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays.Value), ), }, { @@ -378,7 +477,7 @@ func TestAcc_CreateSecondaryDatabase_DataRetentionTimeInDays(t *testing.T) { ConfigVariables: configVariables(id, externalPrimaryId, nil), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-unset"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", "3"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", 
"data_retention_time_in_days", "3"), ), }, { @@ -388,21 +487,21 @@ func TestAcc_CreateSecondaryDatabase_DataRetentionTimeInDays(t *testing.T) { ConfigVariables: configVariables(id, externalPrimaryId, nil), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-unset"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", accountDataRetentionTimeInDays.Value), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays.Value), ), }, { ConfigVariables: configVariables(id, externalPrimaryId, sdk.Int(3)), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-set"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", "3"), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", "3"), ), }, { ConfigVariables: configVariables(id, externalPrimaryId, nil), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SecondaryDatabase/complete-optionals-unset"), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days.0.value", accountDataRetentionTimeInDays.Value), + resource.TestCheckResourceAttr("snowflake_secondary_database.test", "data_retention_time_in_days", accountDataRetentionTimeInDays.Value), ), }, }, diff --git a/pkg/resources/shared_database.go b/pkg/resources/shared_database.go index e6075a632b..28855b66e2 100644 --- a/pkg/resources/shared_database.go +++ b/pkg/resources/shared_database.go @@ -4,7 +4,6 @@ import ( "context" "errors" "fmt" - "strconv" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" @@ -23,7 
+22,12 @@ var sharedDatabaseSchema = map[string]*schema.Schema{ Type: schema.TypeString, Required: true, ForceNew: true, - Description: "A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `\"<account_name>\".\"<share_name>\"`.", + Description: "A fully qualified path to a share from which the database will be created. A fully qualified path follows the format of `\"<organization_name>\".\"<account_name>\".\"<share_name>\"`.", + }, + "comment": { + Type: schema.TypeString, + Optional: true, + Description: "Specifies a comment for the database.", }, // TODO(SNOW-1325381): Add it as an item to discuss and either remove or uncomment (and implement) it // "is_transient": { @@ -32,67 +36,6 @@ var sharedDatabaseSchema = map[string]*schema.Schema{ // ForceNew: true, // Description: "Specifies the database as transient. Transient databases do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel; however, this means they are also not protected by Fail-safe in the event of a data loss.", // }, - "external_volume": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), - Description: "The database parameter that specifies the default external volume to use for Iceberg tables.", - }, - "catalog": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), - Description: "The database parameter that specifies the default catalog to use for Iceberg tables.", - }, - "replace_invalid_characters": { - Type: schema.TypeBool, - Optional: true, - ForceNew: true, - Description: "Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table.
You can only set this parameter for tables that use an external Iceberg catalog.", - }, - "default_ddl_collation": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - Description: "Specifies a default collation specification for all schemas and tables added to the database. It can be overridden on schema or table level. For more information, see [collation specification](https://docs.snowflake.com/en/sql-reference/collation#label-collation-specification).", - }, - "storage_serialization_policy": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllStorageSerializationPolicies), true), - Description: fmt.Sprintf("Specifies the storage serialization policy for Iceberg tables that use Snowflake as the catalog. Valid options are: %v. COMPATIBLE: Snowflake performs encoding and compression of data files that ensures interoperability with third-party compute engines. OPTIMIZED: Snowflake performs encoding and compression of data files that ensures the best table performance within Snowflake.", sdk.AsStringList(sdk.AllStorageSerializationPolicies)), - DiffSuppressFunc: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return d.Get(k).(string) == string(sdk.StorageSerializationPolicyOptimized) && newValue == "" - }, - }, - "log_level": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllLogLevels), true), - DiffSuppressFunc: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return d.Get(k).(string) == string(sdk.LogLevelOff) && newValue == "" - }, - Description: fmt.Sprintf("Specifies the severity level of messages that should be ingested and made available in the active event table. Valid options are: %v. Messages at the specified level (and at more severe levels) are ingested. 
For more information, see [LOG_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-log-level).", sdk.AsStringList(sdk.AllLogLevels)), - }, - "trace_level": { - Type: schema.TypeString, - Optional: true, - ForceNew: true, - ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllTraceLevels), true), - DiffSuppressFunc: func(k, oldValue, newValue string, d *schema.ResourceData) bool { - return d.Get(k).(string) == string(sdk.TraceLevelOff) && newValue == "" - }, - Description: fmt.Sprintf("Controls how trace events are ingested into the event table. Valid options are: %v. For information about levels, see [TRACE_LEVEL](https://docs.snowflake.com/en/sql-reference/parameters.html#label-trace-level).", sdk.AsStringList(sdk.AllTraceLevels)), - }, - "comment": { - Type: schema.TypeString, - Optional: true, - Description: "Specifies a comment for the database.", - }, } func SharedDatabase() *schema.Resource { @@ -103,7 +46,7 @@ func SharedDatabase() *schema.Resource { DeleteContext: DeleteSharedDatabase, Description: "A shared database creates a database from a share provided by another Snowflake account. 
For more information about shares, see [Introduction to Secure Data Sharing](https://docs.snowflake.com/en/user-guide/data-sharing-intro).", - Schema: sharedDatabaseSchema, + Schema: MergeMaps(sharedDatabaseSchema, SharedDatabaseParametersSchema), Importer: &schema.ResourceImporter{ StateContext: schema.ImportStatePassthroughContext, }, @@ -116,42 +59,43 @@ func CreateSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) id := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) externalShareId := sdk.NewExternalObjectIdentifierFromFullyQualifiedName(d.Get("from_share").(string)) - var externalVolume *sdk.AccountObjectIdentifier - if v, ok := d.GetOk("external_volume"); ok { - externalVolume = sdk.Pointer(sdk.NewAccountObjectIdentifier(v.(string))) - } - - var catalog *sdk.AccountObjectIdentifier - if v, ok := d.GetOk("catalog"); ok { - catalog = sdk.Pointer(sdk.NewAccountObjectIdentifier(v.(string))) - } - - var storageSerializationPolicy *sdk.StorageSerializationPolicy - if v, ok := d.GetOk("storage_serialization_policy"); ok { - storageSerializationPolicy = sdk.Pointer(sdk.StorageSerializationPolicy(v.(string))) - } - - var logLevel *sdk.LogLevel - if v, ok := d.GetOk("log_level"); ok { - logLevel = sdk.Pointer(sdk.LogLevel(v.(string))) - } - - var traceLevel *sdk.TraceLevel - if v, ok := d.GetOk("trace_level"); ok { - traceLevel = sdk.Pointer(sdk.TraceLevel(v.(string))) + _, _, externalVolume, + catalog, + replaceInvalidCharacters, + defaultDDLCollation, + storageSerializationPolicy, + logLevel, + traceLevel, + suspendTaskAfterNumFailures, + taskAutoRetryAttempts, + userTaskManagedInitialWarehouseSize, + userTaskTimeoutMs, + userTaskMinimumTriggerIntervalInSeconds, + quotedIdentifiersIgnoreCase, + enableConsoleOutput, + err := GetAllDatabaseParameters(d) + if err != nil { + return diag.FromErr(err) } - err := client.Databases.CreateShared(ctx, id, externalShareId, &sdk.CreateSharedDatabaseOptions{ + err = client.Databases.CreateShared(ctx, id, 
externalShareId, &sdk.CreateSharedDatabaseOptions{ // TODO(SNOW-1325381) // Transient: GetPropertyAsPointer[bool](d, "is_transient"), - ExternalVolume: externalVolume, - Catalog: catalog, - ReplaceInvalidCharacters: GetPropertyAsPointer[bool](d, "replace_invalid_characters"), - DefaultDDLCollation: GetPropertyAsPointer[string](d, "default_ddl_collation"), - StorageSerializationPolicy: storageSerializationPolicy, - LogLevel: logLevel, - TraceLevel: traceLevel, - Comment: GetPropertyAsPointer[string](d, "comment"), + ExternalVolume: externalVolume, + Catalog: catalog, + ReplaceInvalidCharacters: replaceInvalidCharacters, + DefaultDDLCollation: defaultDDLCollation, + StorageSerializationPolicy: storageSerializationPolicy, + LogLevel: logLevel, + TraceLevel: traceLevel, + SuspendTaskAfterNumFailures: suspendTaskAfterNumFailures, + TaskAutoRetryAttempts: taskAutoRetryAttempts, + UserTaskManagedInitialWarehouseSize: userTaskManagedInitialWarehouseSize, + UserTaskTimeoutMs: userTaskTimeoutMs, + UserTaskMinimumTriggerIntervalInSeconds: userTaskMinimumTriggerIntervalInSeconds, + QuotedIdentifiersIgnoreCase: quotedIdentifiersIgnoreCase, + EnableConsoleOutput: enableConsoleOutput, + Comment: GetPropertyAsPointer[string](d, "comment"), }) if err != nil { return diag.FromErr(err) @@ -167,15 +111,15 @@ func UpdateSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) if d.HasChange("name") { - newName := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) + newId := sdk.NewAccountObjectIdentifier(d.Get("name").(string)) err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ - NewName: &newName, + NewName: &newId, }) if err != nil { return diag.FromErr(err) } - d.SetId(helpers.EncodeSnowflakeID(newName)) - id = newName + d.SetId(helpers.EncodeSnowflakeID(newId)) + id = newId } if d.HasChange("comment") { @@ -223,15 +167,6 @@ func ReadSharedDatabase(ctx context.Context, d 
*schema.ResourceData, meta any) d return diag.FromErr(err) } - parameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ - In: &sdk.ParametersIn{ - Database: id, - }, - }) - if err != nil { - return diag.FromErr(err) - } - if err := d.Set("name", database.Name); err != nil { return diag.FromErr(err) } @@ -243,47 +178,23 @@ func ReadSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) d // TODO(SNOW-1325381) // if err := d.Set("is_transient", database.Transient); err != nil { // return diag.FromErr(err) - //} + // } if err := d.Set("comment", database.Comment); err != nil { return diag.FromErr(err) } - for _, parameter := range parameters { - switch parameter.Key { - case "EXTERNAL_VOLUME": - if err := d.Set("external_volume", parameter.Value); err != nil { - return diag.FromErr(err) - } - case "CATALOG": - if err := d.Set("catalog", parameter.Value); err != nil { - return diag.FromErr(err) - } - case "DEFAULT_DDL_COLLATION": - if err := d.Set("default_ddl_collation", parameter.Value); err != nil { - return diag.FromErr(err) - } - case "LOG_LEVEL": - if err := d.Set("log_level", parameter.Value); err != nil { - return diag.FromErr(err) - } - case "TRACE_LEVEL": - if err := d.Set("trace_level", parameter.Value); err != nil { - return diag.FromErr(err) - } - case "REPLACE_INVALID_CHARACTERS": - boolValue, err := strconv.ParseBool(parameter.Value) - if err != nil { - return diag.FromErr(err) - } - if err := d.Set("replace_invalid_characters", boolValue); err != nil { - return diag.FromErr(err) - } - case "STORAGE_SERIALIZATION_POLICY": - if err := d.Set("storage_serialization_policy", parameter.Value); err != nil { - return diag.FromErr(err) - } - } + databaseParameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: id, + }, + }) + if err != nil { + return diag.FromErr(err) + } + + if diags := HandleDatabaseParameterRead(d, databaseParameters); diags != nil { + 
return diags } return nil diff --git a/pkg/resources/shared_database_acceptance_test.go b/pkg/resources/shared_database_acceptance_test.go index 768adfaacb..fc0ec0cc1d 100644 --- a/pkg/resources/shared_database_acceptance_test.go +++ b/pkg/resources/shared_database_acceptance_test.go @@ -5,6 +5,9 @@ import ( "regexp" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" + "github.com/hashicorp/terraform-plugin-testing/plancheck" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" @@ -15,15 +18,30 @@ import ( "github.com/stretchr/testify/require" ) -func TestAcc_CreateSharedDatabase_minimal(t *testing.T) { - t.Skip("To be unskipped in the next database pr") - +func TestAcc_CreateSharedDatabase_Basic(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() comment := random.Comment() newId := acc.TestClient().Ids.RandomAccountObjectIdentifier() newComment := random.Comment() + var ( + accountExternalVolume = new(string) + accountCatalog = new(string) + accountReplaceInvalidCharacters = new(string) + accountDefaultDdlCollation = new(string) + accountStorageSerializationPolicy = new(string) + accountLogLevel = new(string) + accountTraceLevel = new(string) + accountSuspendTaskAfterNumFailures = new(string) + accountTaskAutoRetryAttempts = new(string) + accountUserTaskManagedInitialWarehouseSize = new(string) + accountUserTaskTimeoutMs = new(string) + accountUserTaskMinimumTriggerIntervalInSeconds = new(string) + accountQuotedIdentifiersIgnoreCase = new(string) + accountEnableConsoleOutput = new(string) + ) + configVariables := func(id sdk.AccountObjectIdentifier, shareName sdk.ExternalObjectIdentifier, comment string) config.Variables { return config.Variables{ "name": 
config.StringVariable(id.Name()), @@ -43,35 +61,73 @@ func TestAcc_CreateSharedDatabase_minimal(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.SharedDatabase), Steps: []resource.TestStep{ { + PreConfig: func() { + params := acc.TestClient().Parameter.ShowAccountParameters(t) + *accountExternalVolume = helpers.FindParameter(t, params, sdk.AccountParameterExternalVolume).Value + *accountCatalog = helpers.FindParameter(t, params, sdk.AccountParameterCatalog).Value + *accountReplaceInvalidCharacters = helpers.FindParameter(t, params, sdk.AccountParameterReplaceInvalidCharacters).Value + *accountDefaultDdlCollation = helpers.FindParameter(t, params, sdk.AccountParameterDefaultDDLCollation).Value + *accountStorageSerializationPolicy = helpers.FindParameter(t, params, sdk.AccountParameterStorageSerializationPolicy).Value + *accountLogLevel = helpers.FindParameter(t, params, sdk.AccountParameterLogLevel).Value + *accountTraceLevel = helpers.FindParameter(t, params, sdk.AccountParameterTraceLevel).Value + *accountSuspendTaskAfterNumFailures = helpers.FindParameter(t, params, sdk.AccountParameterSuspendTaskAfterNumFailures).Value + *accountTaskAutoRetryAttempts = helpers.FindParameter(t, params, sdk.AccountParameterTaskAutoRetryAttempts).Value + *accountUserTaskManagedInitialWarehouseSize = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskManagedInitialWarehouseSize).Value + *accountUserTaskTimeoutMs = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskTimeoutMs).Value + *accountUserTaskMinimumTriggerIntervalInSeconds = helpers.FindParameter(t, params, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds).Value + *accountQuotedIdentifiersIgnoreCase = helpers.FindParameter(t, params, sdk.AccountParameterQuotedIdentifiersIgnoreCase).Value + *accountEnableConsoleOutput = helpers.FindParameter(t, params, sdk.AccountParameterEnableConsoleOutput).Value + }, ConfigVariables: configVariables(id, shareExternalId, comment), ConfigDirectory: 
acc.ConfigurationDirectory("TestAcc_SharedDatabase/basic"), Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_shared_database.test", "name", id.Name()), resource.TestCheckResourceAttr("snowflake_shared_database.test", "from_share", shareExternalId.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "external_volume", ""), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "catalog", ""), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "replace_invalid_characters", "false"), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "default_ddl_collation", ""), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "storage_serialization_policy", "OPTIMIZED"), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "log_level", "OFF"), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "trace_level", "OFF"), resource.TestCheckResourceAttr("snowflake_shared_database.test", "comment", comment), + + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "suspend_task_after_num_failures", 
accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskManagedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, { ConfigVariables: configVariables(newId, shareExternalId, newComment), ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SharedDatabase/basic"), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction("snowflake_shared_database.test", plancheck.ResourceActionUpdate), + }, + }, Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_shared_database.test", "name", newId.Name()), resource.TestCheckResourceAttr("snowflake_shared_database.test", "from_share", shareExternalId.FullyQualifiedName()), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "external_volume", ""), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "catalog", ""), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "replace_invalid_characters", "false"), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "default_ddl_collation", ""), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "storage_serialization_policy", "OPTIMIZED"), - 
resource.TestCheckResourceAttr("snowflake_shared_database.test", "log_level", "OFF"), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "trace_level", "OFF"), resource.TestCheckResourceAttr("snowflake_shared_database.test", "comment", newComment), + + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "external_volume", accountExternalVolume), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "catalog", accountCatalog), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "replace_invalid_characters", accountReplaceInvalidCharacters), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "default_ddl_collation", accountDefaultDdlCollation), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "storage_serialization_policy", accountStorageSerializationPolicy), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "log_level", accountLogLevel), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "trace_level", accountTraceLevel), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "suspend_task_after_num_failures", accountSuspendTaskAfterNumFailures), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "task_auto_retry_attempts", accountTaskAutoRetryAttempts), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "user_task_managed_initial_warehouse_size", accountUserTaskManagedInitialWarehouseSize), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "user_task_timeout_ms", accountUserTaskTimeoutMs), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "user_task_minimum_trigger_interval_in_seconds", accountUserTaskMinimumTriggerIntervalInSeconds), + resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "quoted_identifiers_ignore_case", accountQuotedIdentifiersIgnoreCase), + 
resource.TestCheckResourceAttrPtr("snowflake_shared_database.test", "enable_console_output", accountEnableConsoleOutput), ), }, // Import all values @@ -87,8 +143,6 @@ func TestAcc_CreateSharedDatabase_minimal(t *testing.T) { } func TestAcc_CreateSharedDatabase_complete(t *testing.T) { - t.Skip("To be unskipped in the next database pr") - id := acc.TestClient().Ids.RandomAccountObjectIdentifier() comment := random.Comment() externalShareId := createShareableDatabase(t) @@ -99,30 +153,25 @@ func TestAcc_CreateSharedDatabase_complete(t *testing.T) { catalogId, catalogCleanup := acc.TestClient().CatalogIntegration.Create(t) t.Cleanup(catalogCleanup) - configVariables := func( - id sdk.AccountObjectIdentifier, - shareName sdk.ExternalObjectIdentifier, - externalVolume sdk.AccountObjectIdentifier, - catalog sdk.AccountObjectIdentifier, - replaceInvalidCharacters bool, - defaultDdlCollation string, - storageSerializationPolicy sdk.StorageSerializationPolicy, - logLevel sdk.LogLevel, - traceLevel sdk.TraceLevel, - comment string, - ) config.Variables { - return config.Variables{ - "name": config.StringVariable(id.Name()), - "from_share": config.StringVariable(shareName.FullyQualifiedName()), - "external_volume": config.StringVariable(externalVolume.Name()), - "catalog": config.StringVariable(catalog.Name()), - "replace_invalid_characters": config.BoolVariable(replaceInvalidCharacters), - "default_ddl_collation": config.StringVariable(defaultDdlCollation), - "storage_serialization_policy": config.StringVariable(string(storageSerializationPolicy)), - "log_level": config.StringVariable(string(logLevel)), - "trace_level": config.StringVariable(string(traceLevel)), - "comment": config.StringVariable(comment), - } + configVariables := config.Variables{ + "name": config.StringVariable(id.Name()), + "from_share": config.StringVariable(externalShareId.FullyQualifiedName()), + "comment": config.StringVariable(comment), + + "external_volume": 
config.StringVariable(externalVolumeId.Name()), + "catalog": config.StringVariable(catalogId.Name()), + "replace_invalid_characters": config.BoolVariable(true), + "default_ddl_collation": config.StringVariable("en_US"), + "storage_serialization_policy": config.StringVariable(string(sdk.StorageSerializationPolicyOptimized)), + "log_level": config.StringVariable(string(sdk.LogLevelInfo)), + "trace_level": config.StringVariable(string(sdk.TraceLevelOnEvent)), + "suspend_task_after_num_failures": config.IntegerVariable(20), + "task_auto_retry_attempts": config.IntegerVariable(20), + "user_task_managed_initial_warehouse_size": config.StringVariable(string(sdk.WarehouseSizeXLarge)), + "user_task_timeout_ms": config.IntegerVariable(1200000), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(120), + "quoted_identifiers_ignore_case": config.BoolVariable(true), + "enable_console_output": config.BoolVariable(true), } resource.Test(t, resource.TestCase{ @@ -134,21 +183,13 @@ func TestAcc_CreateSharedDatabase_complete(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.SharedDatabase), Steps: []resource.TestStep{ { - ConfigVariables: configVariables( - id, - externalShareId, - externalVolumeId, - catalogId, - true, - "en_US", - sdk.StorageSerializationPolicyOptimized, - sdk.LogLevelInfo, - sdk.TraceLevelOnEvent, - comment, - ), + ConfigVariables: configVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SharedDatabase/complete"), Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_shared_database.test", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "from_share", externalShareId.FullyQualifiedName()), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "external_volume", externalVolumeId.Name()), resource.TestCheckResourceAttr("snowflake_shared_database.test", 
"catalog", catalogId.Name()), resource.TestCheckResourceAttr("snowflake_shared_database.test", "replace_invalid_characters", "true"), @@ -156,23 +197,18 @@ func TestAcc_CreateSharedDatabase_complete(t *testing.T) { resource.TestCheckResourceAttr("snowflake_shared_database.test", "storage_serialization_policy", string(sdk.StorageSerializationPolicyOptimized)), resource.TestCheckResourceAttr("snowflake_shared_database.test", "log_level", string(sdk.LogLevelInfo)), resource.TestCheckResourceAttr("snowflake_shared_database.test", "trace_level", string(sdk.TraceLevelOnEvent)), - resource.TestCheckResourceAttr("snowflake_shared_database.test", "comment", comment), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "suspend_task_after_num_failures", "20"), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "task_auto_retry_attempts", "20"), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "user_task_managed_initial_warehouse_size", string(sdk.WarehouseSizeXLarge)), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "user_task_timeout_ms", "1200000"), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "user_task_minimum_trigger_interval_in_seconds", "120"), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "quoted_identifiers_ignore_case", "true"), + resource.TestCheckResourceAttr("snowflake_shared_database.test", "enable_console_output", "true"), ), }, // Import all values { - ConfigVariables: configVariables( - id, - externalShareId, - externalVolumeId, - catalogId, - true, - "en_US", - sdk.StorageSerializationPolicyOptimized, - sdk.LogLevelInfo, - sdk.TraceLevelOnEvent, - comment, - ), + ConfigVariables: configVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SharedDatabase/complete"), ResourceName: "snowflake_shared_database.test", ImportState: true, @@ -183,30 +219,27 @@ func TestAcc_CreateSharedDatabase_complete(t *testing.T) { } func 
TestAcc_CreateSharedDatabase_InvalidValues(t *testing.T) { - t.Skip("To be unskipped in the next database pr") - comment := random.Comment() - configVariables := func( - replaceInvalidCharacters bool, - defaultDdlCollation string, - storageSerializationPolicy string, - logLevel string, - traceLevel string, - comment string, - ) config.Variables { - return config.Variables{ - "name": config.StringVariable(""), - "from_share": config.StringVariable(""), - "external_volume": config.StringVariable(""), - "catalog": config.StringVariable(""), - "replace_invalid_characters": config.BoolVariable(replaceInvalidCharacters), - "default_ddl_collation": config.StringVariable(defaultDdlCollation), - "storage_serialization_policy": config.StringVariable(storageSerializationPolicy), - "log_level": config.StringVariable(logLevel), - "trace_level": config.StringVariable(traceLevel), - "comment": config.StringVariable(comment), - } + configVariables := config.Variables{ + "name": config.StringVariable("name"), + "from_share": config.StringVariable("org.acc.name"), + "comment": config.StringVariable(comment), + + "external_volume": config.StringVariable(""), + "catalog": config.StringVariable(""), + "replace_invalid_characters": config.BoolVariable(false), + "default_ddl_collation": config.StringVariable(""), + "storage_serialization_policy": config.StringVariable("invalid_value"), + "log_level": config.StringVariable("invalid_value"), + "trace_level": config.StringVariable("invalid_value"), + "suspend_task_after_num_failures": config.IntegerVariable(0), + "task_auto_retry_attempts": config.IntegerVariable(0), + "user_task_managed_initial_warehouse_size": config.StringVariable(""), + "user_task_timeout_ms": config.IntegerVariable(0), + "user_task_minimum_trigger_interval_in_seconds": config.IntegerVariable(0), + "quoted_identifiers_ignore_case": config.BoolVariable(false), + "enable_console_output": config.BoolVariable(false), } resource.Test(t, resource.TestCase{ @@ -218,14 +251,7 
@@ func TestAcc_CreateSharedDatabase_InvalidValues(t *testing.T) { CheckDestroy: acc.CheckDestroy(t, resources.SharedDatabase), Steps: []resource.TestStep{ { - ConfigVariables: configVariables( - true, - "en_US", - "invalid_value", - "invalid_value", - "invalid_value", - comment, - ), + ConfigVariables: configVariables, ConfigDirectory: acc.ConfigurationDirectory("TestAcc_SharedDatabase/complete"), ExpectError: regexp.MustCompile(`(expected \[{{} log_level}\] to be one of \[\"TRACE\" \"DEBUG\" \"INFO\" \"WARN\" \"ERROR\" \"FATAL\" \"OFF\"\], got invalid_value)|` + `(expected \[{{} trace_level}\] to be one of \[\"ALWAYS\" \"ON_EVENT\" \"OFF\"\], got invalid_value)|` + diff --git a/pkg/resources/testdata/TestAcc_Database/basic/test.tf b/pkg/resources/testdata/TestAcc_Database/basic/test.tf new file mode 100644 index 0000000000..3477ed36c1 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/basic/test.tf @@ -0,0 +1,4 @@ +resource "snowflake_database" "test" { + name = var.name + comment = var.comment +} diff --git a/pkg/resources/testdata/TestAcc_Database/basic/variables.tf b/pkg/resources/testdata/TestAcc_Database/basic/variables.tf new file mode 100644 index 0000000000..0f11034ff3 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/basic/variables.tf @@ -0,0 +1,8 @@ +variable "name" { + type = string +} + +variable "comment" { + type = string +} + diff --git a/pkg/resources/testdata/TestAcc_Database/catalog/test.tf b/pkg/resources/testdata/TestAcc_Database/catalog/test.tf new file mode 100644 index 0000000000..3af1bc5b81 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/catalog/test.tf @@ -0,0 +1,4 @@ +resource "snowflake_database" "test" { + name = var.name + catalog = var.catalog +} diff --git a/pkg/resources/testdata/TestAcc_Database/catalog/variables.tf b/pkg/resources/testdata/TestAcc_Database/catalog/variables.tf new file mode 100644 index 0000000000..f2f8c268a8 --- /dev/null +++ 
b/pkg/resources/testdata/TestAcc_Database/catalog/variables.tf @@ -0,0 +1,8 @@ +variable "name" { + type = string +} + +variable "catalog" { + type = string +} + diff --git a/pkg/resources/testdata/TestAcc_Database/complete_optionals_set/test.tf b/pkg/resources/testdata/TestAcc_Database/complete_optionals_set/test.tf new file mode 100644 index 0000000000..a109934fb9 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/complete_optionals_set/test.tf @@ -0,0 +1,31 @@ +resource "snowflake_database" "test" { + name = var.name + comment = var.comment + is_transient = var.transient + + data_retention_time_in_days = var.data_retention_time_in_days + max_data_extension_time_in_days = var.max_data_extension_time_in_days + external_volume = var.external_volume + catalog = var.catalog + replace_invalid_characters = var.replace_invalid_characters + default_ddl_collation = var.default_ddl_collation + storage_serialization_policy = var.storage_serialization_policy + log_level = var.log_level + trace_level = var.trace_level + suspend_task_after_num_failures = var.suspend_task_after_num_failures + task_auto_retry_attempts = var.task_auto_retry_attempts + user_task_managed_initial_warehouse_size = var.user_task_managed_initial_warehouse_size + user_task_timeout_ms = var.user_task_timeout_ms + user_task_minimum_trigger_interval_in_seconds = var.user_task_minimum_trigger_interval_in_seconds + quoted_identifiers_ignore_case = var.quoted_identifiers_ignore_case + enable_console_output = var.enable_console_output + + replication { + enable_to_account { + account_identifier = var.account_identifier + with_failover = var.with_failover + } + + ignore_edition_check = var.ignore_edition_check + } +} diff --git a/pkg/resources/testdata/TestAcc_Database/complete_optionals_set/variables.tf b/pkg/resources/testdata/TestAcc_Database/complete_optionals_set/variables.tf new file mode 100644 index 0000000000..d450c98e35 --- /dev/null +++ 
b/pkg/resources/testdata/TestAcc_Database/complete_optionals_set/variables.tf @@ -0,0 +1,87 @@ +variable "name" { + type = string +} + +variable "transient" { + type = bool +} + +variable "comment" { + type = string +} + +variable "account_identifier" { + type = string +} + +variable "with_failover" { + type = bool +} + +variable "ignore_edition_check" { + type = bool +} + +variable "data_retention_time_in_days" { + type = string +} + +variable "max_data_extension_time_in_days" { + type = string +} + +variable "external_volume" { + type = string +} + +variable "catalog" { + type = string +} + +variable "replace_invalid_characters" { + type = string +} + +variable "default_ddl_collation" { + type = string +} + +variable "storage_serialization_policy" { + type = string +} + +variable "log_level" { + type = string +} + +variable "trace_level" { + type = string +} + +variable "suspend_task_after_num_failures" { + type = number +} + +variable "task_auto_retry_attempts" { + type = number +} + +variable "user_task_managed_initial_warehouse_size" { + type = string +} + +variable "user_task_timeout_ms" { + type = number +} + +variable "user_task_minimum_trigger_interval_in_seconds" { + type = number +} + +variable "quoted_identifiers_ignore_case" { + type = bool +} + +variable "enable_console_output" { + type = bool +} diff --git a/pkg/resources/testdata/TestAcc_Database/int_parameter/set/test.tf b/pkg/resources/testdata/TestAcc_Database/int_parameter/set/test.tf new file mode 100644 index 0000000000..2480217c91 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/int_parameter/set/test.tf @@ -0,0 +1,4 @@ +resource "snowflake_database" "test" { + name = var.name + data_retention_time_in_days = var.data_retention_time_in_days +} diff --git a/pkg/resources/testdata/TestAcc_Database/int_parameter/set/variables.tf b/pkg/resources/testdata/TestAcc_Database/int_parameter/set/variables.tf new file mode 100644 index 0000000000..f4a4f3cef7 --- /dev/null +++ 
b/pkg/resources/testdata/TestAcc_Database/int_parameter/set/variables.tf @@ -0,0 +1,8 @@ +variable "name" { + type = string +} + +variable "data_retention_time_in_days" { + type = number +} + diff --git a/pkg/resources/testdata/TestAcc_Database/int_parameter/unset/test.tf b/pkg/resources/testdata/TestAcc_Database/int_parameter/unset/test.tf new file mode 100644 index 0000000000..402d24eee3 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/int_parameter/unset/test.tf @@ -0,0 +1,3 @@ +resource "snowflake_database" "test" { + name = var.name +} diff --git a/pkg/resources/testdata/TestAcc_Database/int_parameter/unset/variables.tf b/pkg/resources/testdata/TestAcc_Database/int_parameter/unset/variables.tf new file mode 100644 index 0000000000..77e5cc9698 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/int_parameter/unset/variables.tf @@ -0,0 +1,3 @@ +variable "name" { + type = string +} diff --git a/pkg/resources/testdata/TestAcc_Database/replication/test.tf b/pkg/resources/testdata/TestAcc_Database/replication/test.tf new file mode 100644 index 0000000000..df4b236328 --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/replication/test.tf @@ -0,0 +1,12 @@ +resource "snowflake_database" "test" { + name = var.name + + replication { + enable_to_account { + account_identifier = var.account_identifier + with_failover = var.with_failover + } + + ignore_edition_check = var.ignore_edition_check + } +} diff --git a/pkg/resources/testdata/TestAcc_Database/replication/variables.tf b/pkg/resources/testdata/TestAcc_Database/replication/variables.tf new file mode 100644 index 0000000000..56e38ab5ea --- /dev/null +++ b/pkg/resources/testdata/TestAcc_Database/replication/variables.tf @@ -0,0 +1,15 @@ +variable "name" { + type = string +} + +variable "account_identifier" { + type = string +} + +variable "with_failover" { + type = bool +} + +variable "ignore_edition_check" { + type = bool +} diff --git 
a/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf b/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf index f0d3470e09..180a0c22bd 100644 --- a/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf +++ b/pkg/resources/testdata/TestAcc_DatabaseRemovedOutsideOfTerraform/test.tf @@ -1,4 +1,4 @@ -resource "snowflake_database" "db" { +resource "snowflake_database_old" "db" { name = var.db comment = "test comment" } diff --git a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf index 12a910c7d2..2f9535a0f1 100644 --- a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf +++ b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithDataRetentionSet/test.tf @@ -1,4 +1,4 @@ -resource "snowflake_database" "test" { +resource "snowflake_database_old" "test" { name = var.database data_retention_time_in_days = var.database_data_retention_time } diff --git a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf index 894e86ac20..c3386f300a 100644 --- a/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf +++ b/pkg/resources/testdata/TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet/test.tf @@ -1,3 +1,3 @@ -resource "snowflake_database" "test" { +resource "snowflake_database_old" "test" { name = var.database } diff --git a/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/test.tf b/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/test.tf index f0ab03829d..767b17da93 100644 --- a/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/test.tf +++ 
b/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/test.tf @@ -1,10 +1,6 @@ -resource "snowflake_database" "test" { - name = var.shared_database_name - data_retention_time_in_days = 0 - from_share = { - provider = var.account_name - share = var.share_name - } +resource "snowflake_shared_database" "test" { + name = var.shared_database_name + from_share = var.external_share_name } resource "snowflake_role" "test" { @@ -12,7 +8,7 @@ resource "snowflake_role" "test" { } resource "snowflake_grant_privileges_to_account_role" "test" { - depends_on = [snowflake_database.test, snowflake_role.test] + depends_on = [snowflake_shared_database.test, snowflake_role.test] account_role_name = "\"${var.role_name}\"" privileges = var.privileges on_account_object { diff --git a/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/variables.tf b/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/variables.tf index 9e948b415f..3f5de98393 100644 --- a/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/variables.tf +++ b/pkg/resources/testdata/TestAcc_GrantPrivilegesToAccountRole/ImportedPrivileges/variables.tf @@ -10,10 +10,6 @@ variable "shared_database_name" { type = string } -variable "share_name" { - type = string -} - -variable "account_name" { +variable "external_share_name" { type = string } diff --git a/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/test.tf b/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/test.tf index 593ba8b1e2..786d7d34bc 100644 --- a/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/test.tf +++ b/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/test.tf @@ -1,10 +1,6 @@ -resource "snowflake_database" "test" { - name = var.shared_database_name - data_retention_time_in_days = 0 - from_share = { - provider = var.account_name - share = var.share_name - } 
+resource "snowflake_shared_database" "test" { + name = var.shared_database_name + from_share = var.external_share_name } resource "snowflake_role" "test" { @@ -12,7 +8,7 @@ resource "snowflake_role" "test" { } resource "snowflake_grant_privileges_to_role" "test" { - depends_on = [snowflake_database.test, snowflake_role.test] + depends_on = [snowflake_shared_database.test, snowflake_role.test] role_name = "\"${var.role_name}\"" privileges = var.privileges on_account_object { diff --git a/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/variables.tf b/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/variables.tf index 9e948b415f..3f5de98393 100644 --- a/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/variables.tf +++ b/pkg/resources/testdata/TestAcc_GrantPrivilegesToRole/ImportedPrivileges/variables.tf @@ -10,10 +10,6 @@ variable "shared_database_name" { type = string } -variable "share_name" { - type = string -} - -variable "account_name" { +variable "external_share_name" { type = string } diff --git a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/test.tf b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/test.tf index 4cefd5d621..a8688141e0 100644 --- a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/test.tf +++ b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/test.tf @@ -1,22 +1,22 @@ resource "snowflake_secondary_database" "test" { name = var.name as_replica_of = var.as_replica_of - is_transient = var.transient + comment = var.comment - data_retention_time_in_days { - value = var.data_retention_time_in_days - } - - max_data_extension_time_in_days { - value = var.max_data_extension_time_in_days - } - - external_volume = var.external_volume - catalog = var.catalog - replace_invalid_characters = var.replace_invalid_characters - default_ddl_collation = var.default_ddl_collation - 
storage_serialization_policy = var.storage_serialization_policy - log_level = var.log_level - trace_level = var.trace_level - comment = var.comment + data_retention_time_in_days = var.data_retention_time_in_days + max_data_extension_time_in_days = var.max_data_extension_time_in_days + external_volume = var.external_volume + catalog = var.catalog + replace_invalid_characters = var.replace_invalid_characters + default_ddl_collation = var.default_ddl_collation + storage_serialization_policy = var.storage_serialization_policy + log_level = var.log_level + trace_level = var.trace_level + suspend_task_after_num_failures = var.suspend_task_after_num_failures + task_auto_retry_attempts = var.task_auto_retry_attempts + user_task_managed_initial_warehouse_size = var.user_task_managed_initial_warehouse_size + user_task_timeout_ms = var.user_task_timeout_ms + user_task_minimum_trigger_interval_in_seconds = var.user_task_minimum_trigger_interval_in_seconds + quoted_identifiers_ignore_case = var.quoted_identifiers_ignore_case + enable_console_output = var.enable_console_output } diff --git a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/variables.tf b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/variables.tf index cfe7514845..534c20a278 100644 --- a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/variables.tf +++ b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-set/variables.tf @@ -6,8 +6,8 @@ variable "as_replica_of" { type = string } -variable "transient" { - type = bool +variable "comment" { + type = string } variable "data_retention_time_in_days" { @@ -46,6 +46,30 @@ variable "trace_level" { type = string } -variable "comment" { +variable "suspend_task_after_num_failures" { + type = number +} + +variable "task_auto_retry_attempts" { + type = number +} + +variable "user_task_managed_initial_warehouse_size" { type = string } + +variable "user_task_timeout_ms" { + type = number +} 
+ +variable "user_task_minimum_trigger_interval_in_seconds" { + type = number +} + +variable "quoted_identifiers_ignore_case" { + type = bool +} + +variable "enable_console_output" { + type = bool +} diff --git a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/test.tf b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/test.tf index 5aa60d21ed..770f36fc00 100644 --- a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/test.tf +++ b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/test.tf @@ -1,13 +1,4 @@ resource "snowflake_secondary_database" "test" { - name = var.name - as_replica_of = var.as_replica_of - is_transient = var.transient - external_volume = var.external_volume - catalog = var.catalog - replace_invalid_characters = var.replace_invalid_characters - default_ddl_collation = var.default_ddl_collation - storage_serialization_policy = var.storage_serialization_policy - log_level = var.log_level - trace_level = var.trace_level - comment = var.comment + name = var.name + as_replica_of = var.as_replica_of } diff --git a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/variables.tf b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/variables.tf index 977a6bdfe1..ecf7c3557e 100644 --- a/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/variables.tf +++ b/pkg/resources/testdata/TestAcc_SecondaryDatabase/complete-optionals-unset/variables.tf @@ -5,39 +5,3 @@ variable "name" { variable "as_replica_of" { type = string } - -variable "transient" { - type = bool -} - -variable "external_volume" { - type = string -} - -variable "catalog" { - type = string -} - -variable "replace_invalid_characters" { - type = string -} - -variable "default_ddl_collation" { - type = string -} - -variable "storage_serialization_policy" { - type = string -} - -variable "log_level" { - type = string -} - -variable 
"trace_level" { - type = string -} - -variable "comment" { - type = string -} diff --git a/pkg/resources/testdata/TestAcc_SharedDatabase/complete/test.tf b/pkg/resources/testdata/TestAcc_SharedDatabase/complete/test.tf index 5c2f7493b6..417d31a68d 100644 --- a/pkg/resources/testdata/TestAcc_SharedDatabase/complete/test.tf +++ b/pkg/resources/testdata/TestAcc_SharedDatabase/complete/test.tf @@ -1,12 +1,20 @@ resource "snowflake_shared_database" "test" { - name = var.name - from_share = var.from_share - external_volume = var.external_volume - catalog = var.catalog - replace_invalid_characters = var.replace_invalid_characters - default_ddl_collation = var.default_ddl_collation - storage_serialization_policy = var.storage_serialization_policy - log_level = var.log_level - trace_level = var.trace_level - comment = var.comment + name = var.name + from_share = var.from_share + comment = var.comment + + external_volume = var.external_volume + catalog = var.catalog + replace_invalid_characters = var.replace_invalid_characters + default_ddl_collation = var.default_ddl_collation + storage_serialization_policy = var.storage_serialization_policy + log_level = var.log_level + trace_level = var.trace_level + suspend_task_after_num_failures = var.suspend_task_after_num_failures + task_auto_retry_attempts = var.task_auto_retry_attempts + user_task_managed_initial_warehouse_size = var.user_task_managed_initial_warehouse_size + user_task_timeout_ms = var.user_task_timeout_ms + user_task_minimum_trigger_interval_in_seconds = var.user_task_minimum_trigger_interval_in_seconds + quoted_identifiers_ignore_case = var.quoted_identifiers_ignore_case + enable_console_output = var.enable_console_output } diff --git a/pkg/resources/testdata/TestAcc_SharedDatabase/complete/variables.tf b/pkg/resources/testdata/TestAcc_SharedDatabase/complete/variables.tf index b704eb8dfe..03f5793ff2 100644 --- a/pkg/resources/testdata/TestAcc_SharedDatabase/complete/variables.tf +++ 
b/pkg/resources/testdata/TestAcc_SharedDatabase/complete/variables.tf @@ -6,6 +6,10 @@ variable "from_share" { type = string } +variable "comment" { + type = string +} + variable "external_volume" { type = string } @@ -15,7 +19,7 @@ variable "catalog" { } variable "replace_invalid_characters" { - type = bool + type = string } variable "default_ddl_collation" { @@ -34,6 +38,30 @@ variable "trace_level" { type = string } -variable "comment" { +variable "suspend_task_after_num_failures" { + type = number +} + +variable "task_auto_retry_attempts" { + type = number +} + +variable "user_task_managed_initial_warehouse_size" { type = string } + +variable "user_task_timeout_ms" { + type = number +} + +variable "user_task_minimum_trigger_interval_in_seconds" { + type = number +} + +variable "quoted_identifiers_ignore_case" { + type = bool +} + +variable "enable_console_output" { + type = bool +} diff --git a/pkg/sdk/context_functions.go b/pkg/sdk/context_functions.go index e0eb339db8..f853417fd3 100644 --- a/pkg/sdk/context_functions.go +++ b/pkg/sdk/context_functions.go @@ -35,19 +35,23 @@ type contextFunctions struct { } type currentSessionDetailsDBRow struct { - CurrentAccount string `db:"CURRENT_ACCOUNT"` - CurrentRole string `db:"CURRENT_ROLE"` - CurrentRegion string `db:"CURRENT_REGION"` - CurrentSession string `db:"CURRENT_SESSION"` - CurrentUser string `db:"CURRENT_USER"` + CurrentAccount string `db:"CURRENT_ACCOUNT"` + CurrentAccountName string `db:"CURRENT_ACCOUNT_NAME"` + CurrentOrganizationName string `db:"CURRENT_ORGANIZATION_NAME"` + CurrentRole string `db:"CURRENT_ROLE"` + CurrentRegion string `db:"CURRENT_REGION"` + CurrentSession string `db:"CURRENT_SESSION"` + CurrentUser string `db:"CURRENT_USER"` } type CurrentSessionDetails struct { - Account string `db:"CURRENT_ACCOUNT"` - Role string `db:"CURRENT_ROLE"` - Region string `db:"CURRENT_REGION"` - Session string `db:"CURRENT_SESSION"` - User string `db:"CURRENT_USER"` + Account string `db:"CURRENT_ACCOUNT"` 
+ AccountName string `db:"CURRENT_ACCOUNT_NAME"` + OrganizationName string `db:"CURRENT_ORGANIZATION_NAME"` + Role string `db:"CURRENT_ROLE"` + Region string `db:"CURRENT_REGION"` + Session string `db:"CURRENT_SESSION"` + User string `db:"CURRENT_USER"` } func (acc *CurrentSessionDetails) AccountURL() (string, error) { @@ -185,16 +189,18 @@ func (c *contextFunctions) CurrentUser(ctx context.Context) (AccountObjectIdenti func (c *contextFunctions) CurrentSessionDetails(ctx context.Context) (*CurrentSessionDetails, error) { s := &currentSessionDetailsDBRow{} - err := c.client.queryOne(ctx, s, "SELECT CURRENT_ACCOUNT() as CURRENT_ACCOUNT, CURRENT_ROLE() as CURRENT_ROLE, CURRENT_REGION() AS CURRENT_REGION, CURRENT_SESSION() as CURRENT_SESSION, CURRENT_USER() as CURRENT_USER") + err := c.client.queryOne(ctx, s, "SELECT CURRENT_ACCOUNT() as CURRENT_ACCOUNT, CURRENT_ROLE() as CURRENT_ROLE, CURRENT_REGION() AS CURRENT_REGION, CURRENT_SESSION() as CURRENT_SESSION, CURRENT_USER() as CURRENT_USER, CURRENT_ACCOUNT_NAME() as CURRENT_ACCOUNT_NAME, CURRENT_ORGANIZATION_NAME() as CURRENT_ORGANIZATION_NAME") if err != nil { return nil, err } return &CurrentSessionDetails{ - Account: s.CurrentAccount, - Role: s.CurrentRole, - Region: s.CurrentRegion, - Session: s.CurrentSession, - User: s.CurrentUser, + Account: s.CurrentAccount, + AccountName: s.CurrentAccountName, + OrganizationName: s.CurrentOrganizationName, + Role: s.CurrentRole, + Region: s.CurrentRegion, + Session: s.CurrentSession, + User: s.CurrentUser, }, nil } diff --git a/pkg/sdk/databases.go b/pkg/sdk/databases.go index cdefbc81d8..e676148167 100644 --- a/pkg/sdk/databases.go +++ b/pkg/sdk/databases.go @@ -151,24 +151,34 @@ var AllStorageSerializationPolicies = []StorageSerializationPolicy{ // CreateDatabaseOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-database.
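The hunk above extends `CurrentSessionDetails` with `CURRENT_ACCOUNT_NAME` and `CURRENT_ORGANIZATION_NAME`. A minimal sketch (hypothetical struct, not the SDK's actual types) of why these two values are useful together: they compose into a Snowflake account URL of the form `https://<org>-<account>.snowflakecomputing.com`.

```go
package main

import "fmt"

// sessionDetails is an illustrative stand-in for the SDK's
// CurrentSessionDetails, holding the two newly queried values.
type sessionDetails struct {
	AccountName      string // from CURRENT_ACCOUNT_NAME()
	OrganizationName string // from CURRENT_ORGANIZATION_NAME()
}

// accountURL combines organization and account name into the
// organization-based Snowflake account URL format.
func (s sessionDetails) accountURL() string {
	return fmt.Sprintf("https://%s-%s.snowflakecomputing.com",
		s.OrganizationName, s.AccountName)
}

func main() {
	s := sessionDetails{AccountName: "myaccount", OrganizationName: "myorg"}
	fmt.Println(s.accountURL())
}
```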
type CreateDatabaseOptions struct { - create bool `ddl:"static" sql:"CREATE"` - OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` - Transient *bool `ddl:"keyword" sql:"TRANSIENT"` - database bool `ddl:"static" sql:"DATABASE"` - IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` - name AccountObjectIdentifier `ddl:"identifier"` - Clone *Clone `ddl:"-"` - DataRetentionTimeInDays *int `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"` - MaxDataExtensionTimeInDays *int `ddl:"parameter" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` - ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` - Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` - ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` - DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` - StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` - LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` - TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` - Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` - Tag []TagAssociation `ddl:"keyword,parentheses" sql:"TAG"` + create bool `ddl:"static" sql:"CREATE"` + OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` + Transient *bool `ddl:"keyword" sql:"TRANSIENT"` + database bool `ddl:"static" sql:"DATABASE"` + IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` + name AccountObjectIdentifier `ddl:"identifier"` + Clone *Clone `ddl:"-"` + + // Parameters + DataRetentionTimeInDays *int `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"` + MaxDataExtensionTimeInDays *int `ddl:"parameter" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` + ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` + Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` + ReplaceInvalidCharacters *bool `ddl:"parameter" 
sql:"REPLACE_INVALID_CHARACTERS"` + DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` + StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + SuspendTaskAfterNumFailures *int `ddl:"parameter" sql:"SUSPEND_TASK_AFTER_NUM_FAILURES"` + TaskAutoRetryAttempts *int `ddl:"parameter" sql:"TASK_AUTO_RETRY_ATTEMPTS"` + UserTaskManagedInitialWarehouseSize *WarehouseSize `ddl:"parameter" sql:"USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE"` + UserTaskTimeoutMs *int `ddl:"parameter" sql:"USER_TASK_TIMEOUT_MS"` + UserTaskMinimumTriggerIntervalInSeconds *int `ddl:"parameter" sql:"USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS"` + QuotedIdentifiersIgnoreCase *bool `ddl:"parameter" sql:"QUOTED_IDENTIFIERS_IGNORE_CASE"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + + Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` + Tag []TagAssociation `ddl:"keyword,parentheses" sql:"TAG"` } func (opts *CreateDatabaseOptions) validate() error { @@ -214,22 +224,32 @@ func (v *databases) Create(ctx context.Context, id AccountObjectIdentifier, opts // CreateSharedDatabaseOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-database. 
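The options structs in this file drive SQL generation through `ddl`/`sql` struct tags. A simplified, self-contained sketch of that pattern (this is not the provider's real builder, which handles many more tag variants such as `single_quotes` and `identifier`): non-nil pointer fields tagged `ddl:"parameter"` are rendered as `KEY = value` clauses in declaration order.

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// createOptions mimics two fields of CreateDatabaseOptions for illustration.
type createOptions struct {
	DataRetentionTimeInDays *int  `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"`
	EnableConsoleOutput     *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"`
}

// renderParameters walks the struct via reflection and emits one
// "KEY = value" clause per set (non-nil) parameter field.
func renderParameters(opts any) string {
	v := reflect.ValueOf(opts).Elem()
	t := v.Type()
	var clauses []string
	for i := 0; i < t.NumField(); i++ {
		f := v.Field(i)
		if t.Field(i).Tag.Get("ddl") != "parameter" || f.IsNil() {
			continue // unset parameters are simply omitted from the SQL
		}
		clauses = append(clauses, fmt.Sprintf("%s = %v",
			t.Field(i).Tag.Get("sql"), f.Elem().Interface()))
	}
	return strings.Join(clauses, " ")
}

func main() {
	days, on := 1, true
	fmt.Println(renderParameters(&createOptions{
		DataRetentionTimeInDays: &days,
		EnableConsoleOutput:     &on,
	}))
}
```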
type CreateSharedDatabaseOptions struct { - create bool `ddl:"static" sql:"CREATE"` - OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` - Transient *bool `ddl:"keyword" sql:"TRANSIENT"` - database bool `ddl:"static" sql:"DATABASE"` - IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` - name AccountObjectIdentifier `ddl:"identifier"` - fromShare ExternalObjectIdentifier `ddl:"identifier" sql:"FROM SHARE"` - ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` - Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` - ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` - DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` - StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` - LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` - TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` - Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` - Tag []TagAssociation `ddl:"keyword,parentheses" sql:"TAG"` + create bool `ddl:"static" sql:"CREATE"` + OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` + Transient *bool `ddl:"keyword" sql:"TRANSIENT"` + database bool `ddl:"static" sql:"DATABASE"` + IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` + name AccountObjectIdentifier `ddl:"identifier"` + fromShare ExternalObjectIdentifier `ddl:"identifier" sql:"FROM SHARE"` + + // Parameters + ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` + Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` + ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` + DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` + StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` + LogLevel *LogLevel 
`ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + SuspendTaskAfterNumFailures *int `ddl:"parameter" sql:"SUSPEND_TASK_AFTER_NUM_FAILURES"` + TaskAutoRetryAttempts *int `ddl:"parameter" sql:"TASK_AUTO_RETRY_ATTEMPTS"` + UserTaskManagedInitialWarehouseSize *WarehouseSize `ddl:"parameter" sql:"USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE"` + UserTaskTimeoutMs *int `ddl:"parameter" sql:"USER_TASK_TIMEOUT_MS"` + UserTaskMinimumTriggerIntervalInSeconds *int `ddl:"parameter" sql:"USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS"` + QuotedIdentifiersIgnoreCase *bool `ddl:"parameter" sql:"QUOTED_IDENTIFIERS_IGNORE_CASE"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + + Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` + Tag []TagAssociation `ddl:"keyword,parentheses" sql:"TAG"` } func (opts *CreateSharedDatabaseOptions) validate() error { @@ -276,23 +296,33 @@ func (v *databases) CreateShared(ctx context.Context, id AccountObjectIdentifier // CreateSecondaryDatabaseOptions is based on https://docs.snowflake.com/en/sql-reference/sql/create-database. 
type CreateSecondaryDatabaseOptions struct { - create bool `ddl:"static" sql:"CREATE"` - OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` - Transient *bool `ddl:"keyword" sql:"TRANSIENT"` - database bool `ddl:"static" sql:"DATABASE"` - IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` - name AccountObjectIdentifier `ddl:"identifier"` - primaryDatabase ExternalObjectIdentifier `ddl:"identifier" sql:"AS REPLICA OF"` - DataRetentionTimeInDays *int `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"` - MaxDataExtensionTimeInDays *int `ddl:"parameter" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` - ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` - Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` - ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` - DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` - StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` - LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` - TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` - Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` + create bool `ddl:"static" sql:"CREATE"` + OrReplace *bool `ddl:"keyword" sql:"OR REPLACE"` + Transient *bool `ddl:"keyword" sql:"TRANSIENT"` + database bool `ddl:"static" sql:"DATABASE"` + IfNotExists *bool `ddl:"keyword" sql:"IF NOT EXISTS"` + name AccountObjectIdentifier `ddl:"identifier"` + primaryDatabase ExternalObjectIdentifier `ddl:"identifier" sql:"AS REPLICA OF"` + + // Parameters + DataRetentionTimeInDays *int `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"` + MaxDataExtensionTimeInDays *int `ddl:"parameter" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` + ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` + Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` + 
ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` + DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` + StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + SuspendTaskAfterNumFailures *int `ddl:"parameter" sql:"SUSPEND_TASK_AFTER_NUM_FAILURES"` + TaskAutoRetryAttempts *int `ddl:"parameter" sql:"TASK_AUTO_RETRY_ATTEMPTS"` + UserTaskManagedInitialWarehouseSize *WarehouseSize `ddl:"parameter" sql:"USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE"` + UserTaskTimeoutMs *int `ddl:"parameter" sql:"USER_TASK_TIMEOUT_MS"` + UserTaskMinimumTriggerIntervalInSeconds *int `ddl:"parameter" sql:"USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS"` + QuotedIdentifiersIgnoreCase *bool `ddl:"parameter" sql:"QUOTED_IDENTIFIERS_IGNORE_CASE"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + + Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` } func (opts *CreateSecondaryDatabaseOptions) validate() error { @@ -380,16 +410,25 @@ func (opts *AlterDatabaseOptions) validate() error { } type DatabaseSet struct { - DataRetentionTimeInDays *int `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"` - MaxDataExtensionTimeInDays *int `ddl:"parameter" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` - ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` - Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` - ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` - DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` - StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` - LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` - 
TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` - Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` + // Parameters + DataRetentionTimeInDays *int `ddl:"parameter" sql:"DATA_RETENTION_TIME_IN_DAYS"` + MaxDataExtensionTimeInDays *int `ddl:"parameter" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` + ExternalVolume *AccountObjectIdentifier `ddl:"identifier,equals" sql:"EXTERNAL_VOLUME"` + Catalog *AccountObjectIdentifier `ddl:"identifier,equals" sql:"CATALOG"` + ReplaceInvalidCharacters *bool `ddl:"parameter" sql:"REPLACE_INVALID_CHARACTERS"` + DefaultDDLCollation *string `ddl:"parameter,single_quotes" sql:"DEFAULT_DDL_COLLATION"` + StorageSerializationPolicy *StorageSerializationPolicy `ddl:"parameter" sql:"STORAGE_SERIALIZATION_POLICY"` + LogLevel *LogLevel `ddl:"parameter,single_quotes" sql:"LOG_LEVEL"` + TraceLevel *TraceLevel `ddl:"parameter,single_quotes" sql:"TRACE_LEVEL"` + SuspendTaskAfterNumFailures *int `ddl:"parameter" sql:"SUSPEND_TASK_AFTER_NUM_FAILURES"` + TaskAutoRetryAttempts *int `ddl:"parameter" sql:"TASK_AUTO_RETRY_ATTEMPTS"` + UserTaskManagedInitialWarehouseSize *WarehouseSize `ddl:"parameter" sql:"USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE"` + UserTaskTimeoutMs *int `ddl:"parameter" sql:"USER_TASK_TIMEOUT_MS"` + UserTaskMinimumTriggerIntervalInSeconds *int `ddl:"parameter" sql:"USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS"` + QuotedIdentifiersIgnoreCase *bool `ddl:"parameter" sql:"QUOTED_IDENTIFIERS_IGNORE_CASE"` + EnableConsoleOutput *bool `ddl:"parameter" sql:"ENABLE_CONSOLE_OUTPUT"` + + Comment *string `ddl:"parameter,single_quotes" sql:"COMMENT"` } func (v *DatabaseSet) validate() error { @@ -400,29 +439,112 @@ func (v *DatabaseSet) validate() error { if v.Catalog != nil && !ValidObjectIdentifier(v.Catalog) { errs = append(errs, errInvalidIdentifier("DatabaseSet", "Catalog")) } - if !anyValueSet(v.DataRetentionTimeInDays, v.MaxDataExtensionTimeInDays, v.ExternalVolume, v.Catalog, v.ReplaceInvalidCharacters, 
v.DefaultDDLCollation, v.StorageSerializationPolicy, v.LogLevel, v.TraceLevel, v.Comment) { - errs = append(errs, errAtLeastOneOf("DatabaseSet", "DataRetentionTimeInDays", "MaxDataExtensionTimeInDays", "ExternalVolume", "Catalog", "ReplaceInvalidCharacters", "DefaultDDLCollation", "StorageSerializationPolicy", "LogLevel", "TraceLevel", "Comment")) + if !anyValueSet( + v.DataRetentionTimeInDays, + v.MaxDataExtensionTimeInDays, + v.ExternalVolume, + v.Catalog, + v.ReplaceInvalidCharacters, + v.DefaultDDLCollation, + v.StorageSerializationPolicy, + v.LogLevel, + v.TraceLevel, + v.SuspendTaskAfterNumFailures, + v.TaskAutoRetryAttempts, + v.UserTaskManagedInitialWarehouseSize, + v.UserTaskTimeoutMs, + v.UserTaskMinimumTriggerIntervalInSeconds, + v.QuotedIdentifiersIgnoreCase, + v.EnableConsoleOutput, + v.Comment, + ) { + errs = append(errs, errAtLeastOneOf( + "DatabaseSet", + "DataRetentionTimeInDays", + "MaxDataExtensionTimeInDays", + "ExternalVolume", + "Catalog", + "ReplaceInvalidCharacters", + "DefaultDDLCollation", + "StorageSerializationPolicy", + "LogLevel", + "TraceLevel", + "SuspendTaskAfterNumFailures", + "TaskAutoRetryAttempts", + "UserTaskManagedInitialWarehouseSize", + "UserTaskTimeoutMs", + "UserTaskMinimumTriggerIntervalInSeconds", + "QuotedIdentifiersIgnoreCase", + "EnableConsoleOutput", + "Comment", + )) } return errors.Join(errs...) 
} type DatabaseUnset struct { - DataRetentionTimeInDays *bool `ddl:"keyword" sql:"DATA_RETENTION_TIME_IN_DAYS"` - MaxDataExtensionTimeInDays *bool `ddl:"keyword" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` - ExternalVolume *bool `ddl:"keyword" sql:"EXTERNAL_VOLUME"` - Catalog *bool `ddl:"keyword" sql:"CATALOG"` - ReplaceInvalidCharacters *bool `ddl:"keyword" sql:"REPLACE_INVALID_CHARACTERS"` - DefaultDDLCollation *bool `ddl:"keyword" sql:"DEFAULT_DDL_COLLATION"` - StorageSerializationPolicy *bool `ddl:"keyword" sql:"STORAGE_SERIALIZATION_POLICY"` - LogLevel *bool `ddl:"keyword" sql:"LOG_LEVEL"` - TraceLevel *bool `ddl:"keyword" sql:"TRACE_LEVEL"` - Comment *bool `ddl:"keyword" sql:"COMMENT"` + // Parameters + DataRetentionTimeInDays *bool `ddl:"keyword" sql:"DATA_RETENTION_TIME_IN_DAYS"` + MaxDataExtensionTimeInDays *bool `ddl:"keyword" sql:"MAX_DATA_EXTENSION_TIME_IN_DAYS"` + ExternalVolume *bool `ddl:"keyword" sql:"EXTERNAL_VOLUME"` + Catalog *bool `ddl:"keyword" sql:"CATALOG"` + ReplaceInvalidCharacters *bool `ddl:"keyword" sql:"REPLACE_INVALID_CHARACTERS"` + DefaultDDLCollation *bool `ddl:"keyword" sql:"DEFAULT_DDL_COLLATION"` + StorageSerializationPolicy *bool `ddl:"keyword" sql:"STORAGE_SERIALIZATION_POLICY"` + LogLevel *bool `ddl:"keyword" sql:"LOG_LEVEL"` + TraceLevel *bool `ddl:"keyword" sql:"TRACE_LEVEL"` + SuspendTaskAfterNumFailures *bool `ddl:"keyword" sql:"SUSPEND_TASK_AFTER_NUM_FAILURES"` + TaskAutoRetryAttempts *bool `ddl:"keyword" sql:"TASK_AUTO_RETRY_ATTEMPTS"` + UserTaskManagedInitialWarehouseSize *bool `ddl:"keyword" sql:"USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE"` + UserTaskTimeoutMs *bool `ddl:"keyword" sql:"USER_TASK_TIMEOUT_MS"` + UserTaskMinimumTriggerIntervalInSeconds *bool `ddl:"keyword" sql:"USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS"` + QuotedIdentifiersIgnoreCase *bool `ddl:"keyword" sql:"QUOTED_IDENTIFIERS_IGNORE_CASE"` + EnableConsoleOutput *bool `ddl:"keyword" sql:"ENABLE_CONSOLE_OUTPUT"` + + Comment *bool `ddl:"keyword" sql:"COMMENT"` 
} func (v *DatabaseUnset) validate() error { var errs []error - if !anyValueSet(v.DataRetentionTimeInDays, v.MaxDataExtensionTimeInDays, v.ExternalVolume, v.Catalog, v.ReplaceInvalidCharacters, v.DefaultDDLCollation, v.StorageSerializationPolicy, v.LogLevel, v.TraceLevel, v.Comment) { - errs = append(errs, errAtLeastOneOf("DatabaseUnset", "DataRetentionTimeInDays", "MaxDataExtensionTimeInDays", "ExternalVolume", "Catalog", "ReplaceInvalidCharacters", "DefaultDDLCollation", "StorageSerializationPolicy", "LogLevel", "TraceLevel", "Comment")) + if !anyValueSet( + v.DataRetentionTimeInDays, + v.MaxDataExtensionTimeInDays, + v.ExternalVolume, + v.Catalog, + v.ReplaceInvalidCharacters, + v.DefaultDDLCollation, + v.StorageSerializationPolicy, + v.LogLevel, + v.TraceLevel, + v.SuspendTaskAfterNumFailures, + v.TaskAutoRetryAttempts, + v.UserTaskManagedInitialWarehouseSize, + v.UserTaskTimeoutMs, + v.UserTaskMinimumTriggerIntervalInSeconds, + v.QuotedIdentifiersIgnoreCase, + v.EnableConsoleOutput, + v.Comment, + ) { + errs = append(errs, errAtLeastOneOf( + "DatabaseUnset", + "DataRetentionTimeInDays", + "MaxDataExtensionTimeInDays", + "ExternalVolume", + "Catalog", + "ReplaceInvalidCharacters", + "DefaultDDLCollation", + "StorageSerializationPolicy", + "LogLevel", + "TraceLevel", + "SuspendTaskAfterNumFailures", + "TaskAutoRetryAttempts", + "UserTaskManagedInitialWarehouseSize", + "UserTaskTimeoutMs", + "UserTaskMinimumTriggerIntervalInSeconds", + "QuotedIdentifiersIgnoreCase", + "EnableConsoleOutput", + "Comment", + )) } return errors.Join(errs...) 
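Both `DatabaseSet.validate` and `DatabaseUnset.validate` rely on an "at least one of" check over optional pointer fields. A self-contained sketch of that check (an assumption about the shape of the SDK's `anyValueSet` helper, not its actual implementation): treat a field as set when the interface holds a non-nil pointer.

```go
package main

import (
	"fmt"
	"reflect"
)

// anyValueSet reports whether at least one of the given optional
// (pointer) values is non-nil. Inputs are assumed to be pointers.
func anyValueSet(values ...any) bool {
	for _, v := range values {
		if v == nil {
			continue // untyped nil: nothing to inspect
		}
		// A typed nil pointer (e.g. (*string)(nil)) yields a non-nil
		// interface, so check the pointer itself via reflection.
		if rv := reflect.ValueOf(v); rv.Kind() == reflect.Pointer && !rv.IsNil() {
			return true
		}
	}
	return false
}

func main() {
	var comment *string
	fmt.Println(anyValueSet(comment)) // nothing set yet
	c := "my comment"
	comment = &c
	fmt.Println(anyValueSet(comment)) // one field set
}
```

This is why the validation errors above list every field name: when no pointer is set, the caller produced an ALTER with an empty SET/UNSET clause, which Snowflake would reject.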
} diff --git a/pkg/sdk/databases_test.go b/pkg/sdk/databases_test.go index e3b1d36687..4dc8fd3cbe 100644 --- a/pkg/sdk/databases_test.go +++ b/pkg/sdk/databases_test.go @@ -70,6 +70,7 @@ func TestDatabasesCreate(t *testing.T) { opts := defaultOpts() opts.IfNotExists = Bool(true) opts.Transient = Bool(true) + opts.DataRetentionTimeInDays = Int(1) opts.MaxDataExtensionTimeInDays = Int(1) opts.ExternalVolume = &externalVolumeId @@ -79,6 +80,14 @@ func TestDatabasesCreate(t *testing.T) { opts.StorageSerializationPolicy = Pointer(StorageSerializationPolicyCompatible) opts.LogLevel = Pointer(LogLevelInfo) opts.TraceLevel = Pointer(TraceLevelOnEvent) + opts.SuspendTaskAfterNumFailures = Int(10) + opts.TaskAutoRetryAttempts = Int(10) + opts.UserTaskManagedInitialWarehouseSize = Pointer(WarehouseSizeMedium) + opts.UserTaskTimeoutMs = Int(12000) + opts.UserTaskMinimumTriggerIntervalInSeconds = Int(30) + opts.QuotedIdentifiersIgnoreCase = Bool(true) + opts.EnableConsoleOutput = Bool(true) + opts.Comment = String("comment") tagId := randomAccountObjectIdentifier() opts.Tag = []TagAssociation{ @@ -87,7 +96,7 @@ func TestDatabasesCreate(t *testing.T) { Value: "v1", }, } - assertOptsValidAndSQLEquals(t, opts, `CREATE TRANSIENT DATABASE IF NOT EXISTS %s DATA_RETENTION_TIME_IN_DAYS = 1 MAX_DATA_EXTENSION_TIME_IN_DAYS = 1 EXTERNAL_VOLUME = %s CATALOG = %s REPLACE_INVALID_CHARACTERS = true DEFAULT_DDL_COLLATION = 'en_US' STORAGE_SERIALIZATION_POLICY = COMPATIBLE LOG_LEVEL = 'INFO' TRACE_LEVEL = 'ON_EVENT' COMMENT = 'comment' TAG (%s = 'v1')`, opts.name.FullyQualifiedName(), externalVolumeId.FullyQualifiedName(), catalogId.FullyQualifiedName(), tagId.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE TRANSIENT DATABASE IF NOT EXISTS %s DATA_RETENTION_TIME_IN_DAYS = 1 MAX_DATA_EXTENSION_TIME_IN_DAYS = 1 EXTERNAL_VOLUME = %s CATALOG = %s REPLACE_INVALID_CHARACTERS = true DEFAULT_DDL_COLLATION = 'en_US' STORAGE_SERIALIZATION_POLICY = COMPATIBLE LOG_LEVEL = 'INFO' 
TRACE_LEVEL = 'ON_EVENT' SUSPEND_TASK_AFTER_NUM_FAILURES = 10 TASK_AUTO_RETRY_ATTEMPTS = 10 USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE = MEDIUM USER_TASK_TIMEOUT_MS = 12000 USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS = 30 QUOTED_IDENTIFIERS_IGNORE_CASE = true ENABLE_CONSOLE_OUTPUT = true COMMENT = 'comment' TAG (%s = 'v1')`, opts.name.FullyQualifiedName(), externalVolumeId.FullyQualifiedName(), catalogId.FullyQualifiedName(), tagId.FullyQualifiedName()) }) } @@ -142,13 +151,22 @@ func TestDatabasesCreateShared(t *testing.T) { externalVolumeId := randomAccountObjectIdentifier() catalogId := randomAccountObjectIdentifier() opts.OrReplace = Bool(true) + opts.ExternalVolume = &externalVolumeId opts.Catalog = &catalogId - opts.ReplaceInvalidCharacters = Bool(false) + opts.ReplaceInvalidCharacters = Bool(true) opts.DefaultDDLCollation = String("en_US") - opts.StorageSerializationPolicy = Pointer(StorageSerializationPolicyOptimized) + opts.StorageSerializationPolicy = Pointer(StorageSerializationPolicyCompatible) opts.LogLevel = Pointer(LogLevelInfo) opts.TraceLevel = Pointer(TraceLevelOnEvent) + opts.SuspendTaskAfterNumFailures = Int(10) + opts.TaskAutoRetryAttempts = Int(10) + opts.UserTaskManagedInitialWarehouseSize = Pointer(WarehouseSizeMedium) + opts.UserTaskTimeoutMs = Int(12000) + opts.UserTaskMinimumTriggerIntervalInSeconds = Int(30) + opts.QuotedIdentifiersIgnoreCase = Bool(true) + opts.EnableConsoleOutput = Bool(true) + opts.Comment = String("comment") tagId := randomAccountObjectIdentifier() opts.Tag = []TagAssociation{ @@ -157,7 +175,7 @@ func TestDatabasesCreateShared(t *testing.T) { Value: "v1", }, } - assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE DATABASE %s FROM SHARE %s EXTERNAL_VOLUME = %s CATALOG = %s REPLACE_INVALID_CHARACTERS = false DEFAULT_DDL_COLLATION = 'en_US' STORAGE_SERIALIZATION_POLICY = OPTIMIZED LOG_LEVEL = 'INFO' TRACE_LEVEL = 'ON_EVENT' COMMENT = 'comment' TAG (%s = 'v1')`, opts.name.FullyQualifiedName(), 
opts.fromShare.FullyQualifiedName(), externalVolumeId.FullyQualifiedName(), catalogId.FullyQualifiedName(), tagId.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE DATABASE %s FROM SHARE %s EXTERNAL_VOLUME = %s CATALOG = %s REPLACE_INVALID_CHARACTERS = true DEFAULT_DDL_COLLATION = 'en_US' STORAGE_SERIALIZATION_POLICY = COMPATIBLE LOG_LEVEL = 'INFO' TRACE_LEVEL = 'ON_EVENT' SUSPEND_TASK_AFTER_NUM_FAILURES = 10 TASK_AUTO_RETRY_ATTEMPTS = 10 USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE = MEDIUM USER_TASK_TIMEOUT_MS = 12000 USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS = 30 QUOTED_IDENTIFIERS_IGNORE_CASE = true ENABLE_CONSOLE_OUTPUT = true COMMENT = 'comment' TAG (%s = 'v1')`, opts.name.FullyQualifiedName(), opts.fromShare.FullyQualifiedName(), externalVolumeId.FullyQualifiedName(), catalogId.FullyQualifiedName(), tagId.FullyQualifiedName()) }) } @@ -212,17 +230,26 @@ func TestDatabasesCreateSecondary(t *testing.T) { opts.OrReplace = Bool(true) opts.Transient = Bool(true) opts.primaryDatabase = primaryDatabaseId + opts.DataRetentionTimeInDays = Int(1) - opts.MaxDataExtensionTimeInDays = Int(10) + opts.MaxDataExtensionTimeInDays = Int(1) opts.ExternalVolume = &externalVolumeId opts.Catalog = &catalogId opts.ReplaceInvalidCharacters = Bool(true) opts.DefaultDDLCollation = String("en_US") - opts.StorageSerializationPolicy = Pointer(StorageSerializationPolicyOptimized) + opts.StorageSerializationPolicy = Pointer(StorageSerializationPolicyCompatible) opts.LogLevel = Pointer(LogLevelInfo) opts.TraceLevel = Pointer(TraceLevelOnEvent) + opts.SuspendTaskAfterNumFailures = Int(10) + opts.TaskAutoRetryAttempts = Int(10) + opts.UserTaskManagedInitialWarehouseSize = Pointer(WarehouseSizeMedium) + opts.UserTaskTimeoutMs = Int(12000) + opts.UserTaskMinimumTriggerIntervalInSeconds = Int(30) + opts.QuotedIdentifiersIgnoreCase = Bool(true) + opts.EnableConsoleOutput = Bool(true) + opts.Comment = String("comment") - assertOptsValidAndSQLEquals(t, opts, `CREATE 
OR REPLACE TRANSIENT DATABASE %s AS REPLICA OF %s DATA_RETENTION_TIME_IN_DAYS = 1 MAX_DATA_EXTENSION_TIME_IN_DAYS = 10 EXTERNAL_VOLUME = %s CATALOG = %s REPLACE_INVALID_CHARACTERS = true DEFAULT_DDL_COLLATION = 'en_US' STORAGE_SERIALIZATION_POLICY = OPTIMIZED LOG_LEVEL = 'INFO' TRACE_LEVEL = 'ON_EVENT' COMMENT = 'comment'`, opts.name.FullyQualifiedName(), primaryDatabaseId.FullyQualifiedName(), externalVolumeId.FullyQualifiedName(), catalogId.FullyQualifiedName()) + assertOptsValidAndSQLEquals(t, opts, `CREATE OR REPLACE TRANSIENT DATABASE %s AS REPLICA OF %s DATA_RETENTION_TIME_IN_DAYS = 1 MAX_DATA_EXTENSION_TIME_IN_DAYS = 1 EXTERNAL_VOLUME = %s CATALOG = %s REPLACE_INVALID_CHARACTERS = true DEFAULT_DDL_COLLATION = 'en_US' STORAGE_SERIALIZATION_POLICY = COMPATIBLE LOG_LEVEL = 'INFO' TRACE_LEVEL = 'ON_EVENT' SUSPEND_TASK_AFTER_NUM_FAILURES = 10 TASK_AUTO_RETRY_ATTEMPTS = 10 USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE = MEDIUM USER_TASK_TIMEOUT_MS = 12000 USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS = 30 QUOTED_IDENTIFIERS_IGNORE_CASE = true ENABLE_CONSOLE_OUTPUT = true COMMENT = 'comment'`, opts.name.FullyQualifiedName(), primaryDatabaseId.FullyQualifiedName(), externalVolumeId.FullyQualifiedName(), catalogId.FullyQualifiedName()) }) } @@ -263,13 +290,51 @@ func TestDatabasesAlter(t *testing.T) { t.Run("validation: at least one set option", func(t *testing.T) { opts := defaultOpts() opts.Set = &DatabaseSet{} - assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf("DatabaseSet", "DataRetentionTimeInDays", "MaxDataExtensionTimeInDays", "ExternalVolume", "Catalog", "ReplaceInvalidCharacters", "DefaultDDLCollation", "StorageSerializationPolicy", "LogLevel", "TraceLevel", "Comment")) + assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf( + "DatabaseSet", + "DataRetentionTimeInDays", + "MaxDataExtensionTimeInDays", + "ExternalVolume", + "Catalog", + "ReplaceInvalidCharacters", + "DefaultDDLCollation", + "StorageSerializationPolicy", + "LogLevel", + "TraceLevel", + 
"SuspendTaskAfterNumFailures", + "TaskAutoRetryAttempts", + "UserTaskManagedInitialWarehouseSize", + "UserTaskTimeoutMs", + "UserTaskMinimumTriggerIntervalInSeconds", + "QuotedIdentifiersIgnoreCase", + "EnableConsoleOutput", + "Comment", + )) }) t.Run("validation: at least one unset option", func(t *testing.T) { opts := defaultOpts() opts.Unset = &DatabaseUnset{} - assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf("DatabaseUnset", "DataRetentionTimeInDays", "MaxDataExtensionTimeInDays", "ExternalVolume", "Catalog", "ReplaceInvalidCharacters", "DefaultDDLCollation", "StorageSerializationPolicy", "LogLevel", "TraceLevel", "Comment")) + assertOptsInvalidJoinedErrors(t, opts, errAtLeastOneOf( + "DatabaseUnset", + "DataRetentionTimeInDays", + "MaxDataExtensionTimeInDays", + "ExternalVolume", + "Catalog", + "ReplaceInvalidCharacters", + "DefaultDDLCollation", + "StorageSerializationPolicy", + "LogLevel", + "TraceLevel", + "SuspendTaskAfterNumFailures", + "TaskAutoRetryAttempts", + "UserTaskManagedInitialWarehouseSize", + "UserTaskTimeoutMs", + "UserTaskMinimumTriggerIntervalInSeconds", + "QuotedIdentifiersIgnoreCase", + "EnableConsoleOutput", + "Comment", + )) }) t.Run("validation: invalid external volume identifier", func(t *testing.T) { diff --git a/pkg/sdk/parameters.go b/pkg/sdk/parameters.go index cc6a650047..87d4cfe0bd 100644 --- a/pkg/sdk/parameters.go +++ b/pkg/sdk/parameters.go @@ -279,6 +279,8 @@ func (parameters *parameters) SetObjectParameterOnAccount(ctx context.Context, p return err } opts.Set.Parameters.ObjectParameters.EnableUnredactedQuerySyntaxError = b + case ObjectParameterCatalog: + opts.Set.Parameters.ObjectParameters.Catalog = &value default: return fmt.Errorf("Invalid object parameter: %v", string(parameter)) } @@ -399,18 +401,26 @@ const ( AccountParameterWeekStart AccountParameter = "WEEK_START" // Object Parameters (inherited) - AccountParameterDataRetentionTimeInDays AccountParameter = "DATA_RETENTION_TIME_IN_DAYS" - 
AccountParameterDefaultDDLCollation AccountParameter = "DEFAULT_DDL_COLLATION" - AccountParameterLogLevel AccountParameter = "LOG_LEVEL" - AccountParameterMaxConcurrencyLevel AccountParameter = "MAX_CONCURRENCY_LEVEL" - AccountParameterMaxDataExtensionTimeInDays AccountParameter = "MAX_DATA_EXTENSION_TIME_IN_DAYS" - AccountParameterPipeExecutionPaused AccountParameter = "PIPE_EXECUTION_PAUSED" - AccountParameterStatementQueuedTimeoutInSeconds AccountParameter = "STATEMENT_QUEUED_TIMEOUT_IN_SECONDS" - AccountParameterShareRestrictions AccountParameter = "SHARE_RESTRICTIONS" - AccountParameterSuspendTaskAfterNumFailures AccountParameter = "SUSPEND_TASK_AFTER_NUM_FAILURES" - AccountParameterTraceLevel AccountParameter = "TRACE_LEVEL" - AccountParameterUserTaskManagedInitialWarehouseSize AccountParameter = "USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE" - AccountParameterUserTaskTimeoutMs AccountParameter = "USER_TASK_TIMEOUT_MS" + AccountParameterCatalog AccountParameter = "CATALOG" + AccountParameterDataRetentionTimeInDays AccountParameter = "DATA_RETENTION_TIME_IN_DAYS" + AccountParameterDefaultDDLCollation AccountParameter = "DEFAULT_DDL_COLLATION" + AccountParameterExternalVolume AccountParameter = "EXTERNAL_VOLUME" + AccountParameterLogLevel AccountParameter = "LOG_LEVEL" + AccountParameterMaxConcurrencyLevel AccountParameter = "MAX_CONCURRENCY_LEVEL" + AccountParameterMaxDataExtensionTimeInDays AccountParameter = "MAX_DATA_EXTENSION_TIME_IN_DAYS" + AccountParameterPipeExecutionPaused AccountParameter = "PIPE_EXECUTION_PAUSED" + AccountParameterReplaceInvalidCharacters AccountParameter = "REPLACE_INVALID_CHARACTERS" + AccountParameterStatementQueuedTimeoutInSeconds AccountParameter = "STATEMENT_QUEUED_TIMEOUT_IN_SECONDS" + AccountParameterStorageSerializationPolicy AccountParameter = "STORAGE_SERIALIZATION_POLICY" + AccountParameterShareRestrictions AccountParameter = "SHARE_RESTRICTIONS" + AccountParameterSuspendTaskAfterNumFailures AccountParameter = 
"SUSPEND_TASK_AFTER_NUM_FAILURES" + AccountParameterTraceLevel AccountParameter = "TRACE_LEVEL" + AccountParameterUserTaskManagedInitialWarehouseSize AccountParameter = "USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE" + AccountParameterUserTaskTimeoutMs AccountParameter = "USER_TASK_TIMEOUT_MS" + AccountParameterTaskAutoRetryAttempts AccountParameter = "TASK_AUTO_RETRY_ATTEMPTS" + AccountParameterUserTaskMinimumTriggerIntervalInSeconds AccountParameter = "USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS" + AccountParameterMetricLevel AccountParameter = "METRIC_LEVEL" + AccountParameterEnableConsoleOutput AccountParameter = "ENABLE_CONSOLE_OUTPUT" // User Parameters (inherited) AccountParameterEnableUnredactedQuerySyntaxError AccountParameter = "ENABLE_UNREDACTED_QUERY_SYNTAX_ERROR" @@ -464,25 +474,30 @@ type ObjectParameter string const ( // Object Parameters - ObjectParameterDataRetentionTimeInDays ObjectParameter = "DATA_RETENTION_TIME_IN_DAYS" - ObjectParameterDefaultDDLCollation ObjectParameter = "DEFAULT_DDL_COLLATION" - ObjectParameterLogLevel ObjectParameter = "LOG_LEVEL" - ObjectParameterMaxConcurrencyLevel ObjectParameter = "MAX_CONCURRENCY_LEVEL" - ObjectParameterMaxDataExtensionTimeInDays ObjectParameter = "MAX_DATA_EXTENSION_TIME_IN_DAYS" - ObjectParameterPipeExecutionPaused ObjectParameter = "PIPE_EXECUTION_PAUSED" - ObjectParameterPreventUnloadToInternalStages ObjectParameter = "PREVENT_UNLOAD_TO_INTERNAL_STAGES" // also an account param - ObjectParameterStatementQueuedTimeoutInSeconds ObjectParameter = "STATEMENT_QUEUED_TIMEOUT_IN_SECONDS" - ObjectParameterStatementTimeoutInSeconds ObjectParameter = "STATEMENT_TIMEOUT_IN_SECONDS" - ObjectParameterNetworkPolicy ObjectParameter = "NETWORK_POLICY" // also an account param - ObjectParameterShareRestrictions ObjectParameter = "SHARE_RESTRICTIONS" - ObjectParameterSuspendTaskAfterNumFailures ObjectParameter = "SUSPEND_TASK_AFTER_NUM_FAILURES" - ObjectParameterTraceLevel ObjectParameter = "TRACE_LEVEL" - 
ObjectParameterUserTaskManagedInitialWarehouseSize ObjectParameter = "USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE" - ObjectParameterUserTaskTimeoutMs ObjectParameter = "USER_TASK_TIMEOUT_MS" - ObjectParameterCatalog ObjectParameter = "CATALOG" - ObjectParameterExternalVolume ObjectParameter = "EXTERNAL_VOLUME" - ObjectParameterReplaceInvalidCharacters ObjectParameter = "REPLACE_INVALID_CHARACTERS" - ObjectParameterStorageSerializationPolicy ObjectParameter = "STORAGE_SERIALIZATION_POLICY" + ObjectParameterDataRetentionTimeInDays ObjectParameter = "DATA_RETENTION_TIME_IN_DAYS" + ObjectParameterDefaultDDLCollation ObjectParameter = "DEFAULT_DDL_COLLATION" + ObjectParameterLogLevel ObjectParameter = "LOG_LEVEL" + ObjectParameterMaxConcurrencyLevel ObjectParameter = "MAX_CONCURRENCY_LEVEL" + ObjectParameterMaxDataExtensionTimeInDays ObjectParameter = "MAX_DATA_EXTENSION_TIME_IN_DAYS" + ObjectParameterPipeExecutionPaused ObjectParameter = "PIPE_EXECUTION_PAUSED" + ObjectParameterPreventUnloadToInternalStages ObjectParameter = "PREVENT_UNLOAD_TO_INTERNAL_STAGES" // also an account param + ObjectParameterStatementQueuedTimeoutInSeconds ObjectParameter = "STATEMENT_QUEUED_TIMEOUT_IN_SECONDS" + ObjectParameterStatementTimeoutInSeconds ObjectParameter = "STATEMENT_TIMEOUT_IN_SECONDS" + ObjectParameterNetworkPolicy ObjectParameter = "NETWORK_POLICY" // also an account param + ObjectParameterShareRestrictions ObjectParameter = "SHARE_RESTRICTIONS" + ObjectParameterSuspendTaskAfterNumFailures ObjectParameter = "SUSPEND_TASK_AFTER_NUM_FAILURES" + ObjectParameterTraceLevel ObjectParameter = "TRACE_LEVEL" + ObjectParameterUserTaskManagedInitialWarehouseSize ObjectParameter = "USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE" + ObjectParameterUserTaskTimeoutMs ObjectParameter = "USER_TASK_TIMEOUT_MS" + ObjectParameterCatalog ObjectParameter = "CATALOG" + ObjectParameterExternalVolume ObjectParameter = "EXTERNAL_VOLUME" + ObjectParameterReplaceInvalidCharacters ObjectParameter = 
"REPLACE_INVALID_CHARACTERS" + ObjectParameterStorageSerializationPolicy ObjectParameter = "STORAGE_SERIALIZATION_POLICY" + ObjectParameterTaskAutoRetryAttempts ObjectParameter = "TASK_AUTO_RETRY_ATTEMPTS" + ObjectParameterUserTaskMinimumTriggerIntervalInSeconds ObjectParameter = "USER_TASK_MINIMUM_TRIGGER_INTERVAL_IN_SECONDS" + ObjectParameterQuotedIdentifiersIgnoreCase ObjectParameter = "QUOTED_IDENTIFIERS_IGNORE_CASE" + ObjectParameterMetricLevel ObjectParameter = "METRIC_LEVEL" + ObjectParameterEnableConsoleOutput ObjectParameter = "ENABLE_CONSOLE_OUTPUT" // User Parameters ObjectParameterEnableUnredactedQuerySyntaxError ObjectParameter = "ENABLE_UNREDACTED_QUERY_SYNTAX_ERROR" @@ -789,6 +804,7 @@ type ObjectParameters struct { TraceLevel *TraceLevel `ddl:"parameter" sql:"TRACE_LEVEL"` UserTaskManagedInitialWarehouseSize *WarehouseSize `ddl:"parameter" sql:"USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE"` UserTaskTimeoutMs *int `ddl:"parameter" sql:"USER_TASK_TIMEOUT_MS"` + Catalog *string `ddl:"parameter" sql:"CATALOG"` } func (v *ObjectParameters) validate() error { @@ -904,6 +920,7 @@ const ( ParameterTypeSession ParameterType = "SESSION" ParameterTypeObject ParameterType = "OBJECT" ParameterTypeWarehouse ParameterType = "WAREHOUSE" + ParameterTypeDatabase ParameterType = "DATABASE" ) type Parameter struct { diff --git a/pkg/sdk/testint/context_functions_integration_test.go b/pkg/sdk/testint/context_functions_integration_test.go index 7dd2a2624a..8767e051d8 100644 --- a/pkg/sdk/testint/context_functions_integration_test.go +++ b/pkg/sdk/testint/context_functions_integration_test.go @@ -17,6 +17,24 @@ func TestInt_CurrentAccount(t *testing.T) { assert.NotEmpty(t, account) } +func TestInt_CurrentAccountName(t *testing.T) { + client := testClient(t) + ctx := testContext(t) + + accountName, err := client.ContextFunctions.CurrentAccountName(ctx) + require.NoError(t, err) + assert.NotEmpty(t, accountName) +} + +func TestInt_CurrentOrganizationName(t *testing.T) { + 
client := testClient(t) + ctx := testContext(t) + + organizationName, err := client.ContextFunctions.CurrentOrganizationName(ctx) + require.NoError(t, err) + assert.NotEmpty(t, organizationName) +} + func TestInt_CurrentRole(t *testing.T) { client := testClient(t) ctx := testContext(t) diff --git a/pkg/sdk/testint/databases_integration_test.go b/pkg/sdk/testint/databases_integration_test.go index 5342c01f72..9fbe65e62f 100644 --- a/pkg/sdk/testint/databases_integration_test.go +++ b/pkg/sdk/testint/databases_integration_test.go @@ -4,6 +4,8 @@ import ( "fmt" "testing" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/internal/collections" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" @@ -51,7 +53,7 @@ func TestInt_DatabasesCreate(t *testing.T) { }) t.Run("complete", func(t *testing.T) { - databaseID := testClientHelper().Ids.RandomAccountObjectIdentifier() + databaseId := testClientHelper().Ids.RandomAccountObjectIdentifier() // new database and schema created on purpose databaseTest, databaseCleanup := testClientHelper().Database.CreateDatabase(t) @@ -73,19 +75,26 @@ func TestInt_DatabasesCreate(t *testing.T) { t.Cleanup(catalogCleanup) comment := random.Comment() - err := client.Databases.Create(ctx, databaseID, &sdk.CreateDatabaseOptions{ - Transient: sdk.Bool(true), - IfNotExists: sdk.Bool(true), - DataRetentionTimeInDays: sdk.Int(1), - MaxDataExtensionTimeInDays: sdk.Int(1), - ExternalVolume: &externalVolume, - Catalog: &catalog, - ReplaceInvalidCharacters: sdk.Bool(true), - DefaultDDLCollation: sdk.String("en_US"), - StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyCompatible), - LogLevel: sdk.Pointer(sdk.LogLevelInfo), - TraceLevel: sdk.Pointer(sdk.TraceLevelOnEvent), - Comment: sdk.String(comment), + err := client.Databases.Create(ctx, databaseId, &sdk.CreateDatabaseOptions{ + 
Transient: sdk.Bool(true), + IfNotExists: sdk.Bool(true), + DataRetentionTimeInDays: sdk.Int(0), + MaxDataExtensionTimeInDays: sdk.Int(10), + ExternalVolume: &externalVolume, + Catalog: &catalog, + ReplaceInvalidCharacters: sdk.Bool(true), + DefaultDDLCollation: sdk.String("en_US"), + StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyCompatible), + LogLevel: sdk.Pointer(sdk.LogLevelInfo), + TraceLevel: sdk.Pointer(sdk.TraceLevelOnEvent), + SuspendTaskAfterNumFailures: sdk.Int(10), + TaskAutoRetryAttempts: sdk.Int(10), + UserTaskManagedInitialWarehouseSize: sdk.Pointer(sdk.WarehouseSizeMedium), + UserTaskTimeoutMs: sdk.Int(12_000), + UserTaskMinimumTriggerIntervalInSeconds: sdk.Int(30), + QuotedIdentifiersIgnoreCase: sdk.Bool(true), + EnableConsoleOutput: sdk.Bool(true), + Comment: sdk.String(comment), Tag: []sdk.TagAssociation{ { Name: tagTest.ID(), @@ -98,41 +107,35 @@ func TestInt_DatabasesCreate(t *testing.T) { }, }) require.NoError(t, err) - t.Cleanup(testClientHelper().Database.DropDatabaseFunc(t, databaseID)) + t.Cleanup(testClientHelper().Database.DropDatabaseFunc(t, databaseId)) - database, err := client.Databases.ShowByID(ctx, databaseID) + database, err := client.Databases.ShowByID(ctx, databaseId) require.NoError(t, err) - assert.Equal(t, databaseID.Name(), database.Name) + assert.Equal(t, databaseId.Name(), database.Name) assert.Equal(t, comment, database.Comment) - assert.Equal(t, 1, database.RetentionTime) - - param, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterMaxDataExtensionTimeInDays, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, "1", param.Value) - - externalVolumeParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterExternalVolume, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, externalVolume.Name(), externalVolumeParam.Value) - - catalogParam, err := 
client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterCatalog, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, catalog.Name(), catalogParam.Value) - - logLevelParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterLogLevel, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.LogLevelInfo), logLevelParam.Value) - traceLevelParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterTraceLevel, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.TraceLevelOnEvent), traceLevelParam.Value) + params := testClientHelper().Parameter.ShowDatabaseParameters(t, databaseId) + assertParameterEquals := func(t *testing.T, parameterName sdk.AccountParameter, expected string) { + t.Helper() + assert.Equal(t, expected, helpers.FindParameter(t, params, parameterName).Value) + } - ignoreInvalidCharactersParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterReplaceInvalidCharacters, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, "true", ignoreInvalidCharactersParam.Value) - - serializationPolicyParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterStorageSerializationPolicy, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseID}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.StorageSerializationPolicyCompatible), serializationPolicyParam.Value) + assertParameterEquals(t, sdk.AccountParameterDataRetentionTimeInDays, "0") + assertParameterEquals(t, sdk.AccountParameterMaxDataExtensionTimeInDays, "10") + assertParameterEquals(t, sdk.AccountParameterDefaultDDLCollation, "en_US") + assertParameterEquals(t, sdk.AccountParameterExternalVolume, externalVolume.Name()) + assertParameterEquals(t, sdk.AccountParameterCatalog, 
catalog.Name()) + assertParameterEquals(t, sdk.AccountParameterLogLevel, string(sdk.LogLevelInfo)) + assertParameterEquals(t, sdk.AccountParameterTraceLevel, string(sdk.TraceLevelOnEvent)) + assertParameterEquals(t, sdk.AccountParameterReplaceInvalidCharacters, "true") + assertParameterEquals(t, sdk.AccountParameterStorageSerializationPolicy, string(sdk.StorageSerializationPolicyCompatible)) + assertParameterEquals(t, sdk.AccountParameterSuspendTaskAfterNumFailures, "10") + assertParameterEquals(t, sdk.AccountParameterTaskAutoRetryAttempts, "10") + assertParameterEquals(t, sdk.AccountParameterUserTaskManagedInitialWarehouseSize, string(sdk.WarehouseSizeMedium)) + assertParameterEquals(t, sdk.AccountParameterUserTaskTimeoutMs, "12000") + assertParameterEquals(t, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds, "30") + assertParameterEquals(t, sdk.AccountParameterQuotedIdentifiersIgnoreCase, "true") + assertParameterEquals(t, sdk.AccountParameterEnableConsoleOutput, "true") tag1Value, err := client.SystemFunctions.GetTag(ctx, tagTest.ID(), database.ID(), sdk.ObjectTypeDatabase) require.NoError(t, err) @@ -190,16 +193,23 @@ func TestInt_DatabasesCreateShared(t *testing.T) { comment := random.Comment() err = client.Databases.CreateShared(ctx, databaseId, shareTest.ExternalID(), &sdk.CreateSharedDatabaseOptions{ - Transient: sdk.Bool(true), - IfNotExists: sdk.Bool(true), - ExternalVolume: &externalVolume, - Catalog: &catalog, - ReplaceInvalidCharacters: sdk.Bool(true), - DefaultDDLCollation: sdk.String("en_US"), - StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyOptimized), - LogLevel: sdk.Pointer(sdk.LogLevelDebug), - TraceLevel: sdk.Pointer(sdk.TraceLevelAlways), - Comment: sdk.String(comment), + Transient: sdk.Bool(true), + IfNotExists: sdk.Bool(true), + ExternalVolume: &externalVolume, + Catalog: &catalog, + LogLevel: sdk.Pointer(sdk.LogLevelDebug), + TraceLevel: sdk.Pointer(sdk.TraceLevelAlways), + ReplaceInvalidCharacters: 
sdk.Bool(true), + DefaultDDLCollation: sdk.String("en_US"), + StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyOptimized), + SuspendTaskAfterNumFailures: sdk.Int(10), + TaskAutoRetryAttempts: sdk.Int(10), + UserTaskManagedInitialWarehouseSize: sdk.Pointer(sdk.WarehouseSizeMedium), + UserTaskTimeoutMs: sdk.Int(12_000), + UserTaskMinimumTriggerIntervalInSeconds: sdk.Int(30), + QuotedIdentifiersIgnoreCase: sdk.Bool(true), + EnableConsoleOutput: sdk.Bool(true), + Comment: sdk.String(comment), Tag: []sdk.TagAssociation{ { Name: testTag.ID(), @@ -216,29 +226,26 @@ func TestInt_DatabasesCreateShared(t *testing.T) { assert.Equal(t, databaseId.Name(), database.Name) assert.Equal(t, comment, database.Comment) - externalVolumeParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterExternalVolume, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, externalVolume.Name(), externalVolumeParam.Value) - - catalogParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterCatalog, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, catalog.Name(), catalogParam.Value) - - logLevelParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterLogLevel, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.LogLevelDebug), logLevelParam.Value) - - traceLevelParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterTraceLevel, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.TraceLevelAlways), traceLevelParam.Value) - - ignoreInvalidCharactersParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterReplaceInvalidCharacters, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, 
"true", ignoreInvalidCharactersParam.Value) + params := testClientHelper().Parameter.ShowDatabaseParameters(t, databaseId) + assertParameterEquals := func(t *testing.T, parameterName sdk.AccountParameter, expected string) { + t.Helper() + assert.Equal(t, expected, helpers.FindParameter(t, params, parameterName).Value) + } - serializationPolicyParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterStorageSerializationPolicy, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.StorageSerializationPolicyOptimized), serializationPolicyParam.Value) + assertParameterEquals(t, sdk.AccountParameterDefaultDDLCollation, "en_US") + assertParameterEquals(t, sdk.AccountParameterExternalVolume, externalVolume.Name()) + assertParameterEquals(t, sdk.AccountParameterCatalog, catalog.Name()) + assertParameterEquals(t, sdk.AccountParameterLogLevel, string(sdk.LogLevelDebug)) + assertParameterEquals(t, sdk.AccountParameterTraceLevel, string(sdk.TraceLevelAlways)) + assertParameterEquals(t, sdk.AccountParameterReplaceInvalidCharacters, "true") + assertParameterEquals(t, sdk.AccountParameterStorageSerializationPolicy, string(sdk.StorageSerializationPolicyOptimized)) + assertParameterEquals(t, sdk.AccountParameterSuspendTaskAfterNumFailures, "10") + assertParameterEquals(t, sdk.AccountParameterTaskAutoRetryAttempts, "10") + assertParameterEquals(t, sdk.AccountParameterUserTaskManagedInitialWarehouseSize, string(sdk.WarehouseSizeMedium)) + assertParameterEquals(t, sdk.AccountParameterUserTaskTimeoutMs, "12000") + assertParameterEquals(t, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds, "30") + assertParameterEquals(t, sdk.AccountParameterQuotedIdentifiersIgnoreCase, "true") + assertParameterEquals(t, sdk.AccountParameterEnableConsoleOutput, "true") tag1Value, err := client.SystemFunctions.GetTag(ctx, testTag.ID(), database.ID(), sdk.ObjectTypeDatabase) require.NoError(t, err) @@ -275,17 
+282,24 @@ func TestInt_DatabasesCreateSecondary(t *testing.T) { comment := random.Comment() err = client.Databases.CreateSecondary(ctx, databaseId, externalDatabaseId, &sdk.CreateSecondaryDatabaseOptions{ - IfNotExists: sdk.Bool(true), - DataRetentionTimeInDays: sdk.Int(1), - MaxDataExtensionTimeInDays: sdk.Int(10), - ExternalVolume: &externalVolume, - Catalog: &catalog, - ReplaceInvalidCharacters: sdk.Bool(true), - DefaultDDLCollation: sdk.String("en_US"), - StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyOptimized), - LogLevel: sdk.Pointer(sdk.LogLevelDebug), - TraceLevel: sdk.Pointer(sdk.TraceLevelAlways), - Comment: sdk.String(comment), + IfNotExists: sdk.Bool(true), + DataRetentionTimeInDays: sdk.Int(10), + MaxDataExtensionTimeInDays: sdk.Int(10), + ExternalVolume: &externalVolume, + Catalog: &catalog, + ReplaceInvalidCharacters: sdk.Bool(true), + DefaultDDLCollation: sdk.String("en_US"), + StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyOptimized), + LogLevel: sdk.Pointer(sdk.LogLevelDebug), + TraceLevel: sdk.Pointer(sdk.TraceLevelAlways), + SuspendTaskAfterNumFailures: sdk.Int(10), + TaskAutoRetryAttempts: sdk.Int(10), + UserTaskManagedInitialWarehouseSize: sdk.Pointer(sdk.WarehouseSizeMedium), + UserTaskTimeoutMs: sdk.Int(12_000), + UserTaskMinimumTriggerIntervalInSeconds: sdk.Int(30), + QuotedIdentifiersIgnoreCase: sdk.Bool(true), + EnableConsoleOutput: sdk.Bool(true), + Comment: sdk.String(comment), }) require.NoError(t, err) t.Cleanup(testClientHelper().Database.DropDatabaseFunc(t, databaseId)) @@ -294,36 +308,30 @@ func TestInt_DatabasesCreateSecondary(t *testing.T) { require.NoError(t, err) assert.Equal(t, databaseId.Name(), database.Name) - assert.Equal(t, 1, database.RetentionTime) assert.Equal(t, comment, database.Comment) - param, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterMaxDataExtensionTimeInDays, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - 
assert.NoError(t, err) - assert.Equal(t, "10", param.Value) - - externalVolumeParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterExternalVolume, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, externalVolume.Name(), externalVolumeParam.Value) - - catalogParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterCatalog, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, catalog.Name(), catalogParam.Value) - - logLevelParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterLogLevel, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.LogLevelDebug), logLevelParam.Value) - - traceLevelParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterTraceLevel, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.TraceLevelAlways), traceLevelParam.Value) - - ignoreInvalidCharactersParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterReplaceInvalidCharacters, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, "true", ignoreInvalidCharactersParam.Value) + params := testClientHelper().Parameter.ShowDatabaseParameters(t, databaseId) + assertParameterEquals := func(t *testing.T, parameterName sdk.AccountParameter, expected string) { + t.Helper() + assert.Equal(t, expected, helpers.FindParameter(t, params, parameterName).Value) + } - serializationPolicyParam, err := client.Parameters.ShowObjectParameter(ctx, sdk.ObjectParameterStorageSerializationPolicy, sdk.Object{ObjectType: sdk.ObjectTypeDatabase, Name: databaseId}) - assert.NoError(t, err) - assert.Equal(t, string(sdk.StorageSerializationPolicyOptimized), serializationPolicyParam.Value) + assertParameterEquals(t, 
sdk.AccountParameterDataRetentionTimeInDays, "10") + assertParameterEquals(t, sdk.AccountParameterMaxDataExtensionTimeInDays, "10") + assertParameterEquals(t, sdk.AccountParameterDefaultDDLCollation, "en_US") + assertParameterEquals(t, sdk.AccountParameterExternalVolume, externalVolume.Name()) + assertParameterEquals(t, sdk.AccountParameterCatalog, catalog.Name()) + assertParameterEquals(t, sdk.AccountParameterLogLevel, string(sdk.LogLevelDebug)) + assertParameterEquals(t, sdk.AccountParameterTraceLevel, string(sdk.TraceLevelAlways)) + assertParameterEquals(t, sdk.AccountParameterReplaceInvalidCharacters, "true") + assertParameterEquals(t, sdk.AccountParameterStorageSerializationPolicy, string(sdk.StorageSerializationPolicyOptimized)) + assertParameterEquals(t, sdk.AccountParameterSuspendTaskAfterNumFailures, "10") + assertParameterEquals(t, sdk.AccountParameterTaskAutoRetryAttempts, "10") + assertParameterEquals(t, sdk.AccountParameterUserTaskManagedInitialWarehouseSize, string(sdk.WarehouseSizeMedium)) + assertParameterEquals(t, sdk.AccountParameterUserTaskTimeoutMs, "12000") + assertParameterEquals(t, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds, "30") + assertParameterEquals(t, sdk.AccountParameterQuotedIdentifiersIgnoreCase, "true") + assertParameterEquals(t, sdk.AccountParameterEnableConsoleOutput, "true") } func TestInt_DatabasesAlter(t *testing.T) { @@ -331,19 +339,20 @@ func TestInt_DatabasesAlter(t *testing.T) { secondaryClient := testSecondaryClient(t) ctx := testContext(t) - queryParameterForDatabase := func(t *testing.T, id sdk.AccountObjectIdentifier, parameter sdk.ObjectParameter) *sdk.Parameter { + assertDatabaseParameterEquals := func(t *testing.T, params []*sdk.Parameter, parameterName sdk.AccountParameter, expected string) { t.Helper() - value, err := client.Parameters.ShowObjectParameter(ctx, parameter, sdk.Object{ - ObjectType: sdk.ObjectTypeDatabase, - Name: id, - }) - require.NoError(t, err) - return value + assert.Equal(t, 
expected, helpers.FindParameter(t, params, parameterName).Value) } - queryParameterValueForDatabase := func(t *testing.T, id sdk.AccountObjectIdentifier, parameter sdk.ObjectParameter) string { + assertDatabaseParameterEqualsToDefaultValue := func(t *testing.T, params []*sdk.Parameter, parameterName sdk.ObjectParameter) { t.Helper() - return queryParameterForDatabase(t, id, parameter).Value + param, err := collections.FindOne(params, func(param *sdk.Parameter) bool { return param.Key == string(parameterName) }) + assert.NoError(t, err) + assert.NotNil(t, param) + if param != nil && (*param).Level == "" { + param := *param + assert.Equal(t, param.Default, param.Value) + } } testCases := []struct { @@ -447,72 +456,7 @@ func TestInt_DatabasesAlter(t *testing.T) { assert.Equal(t, newName.Name(), database.Name) }) - t.Run(fmt.Sprintf("Database: %s - setting and unsetting log_level and trace_level", testCase.DatabaseType), func(t *testing.T) { - if testCase.DatabaseType == "From Share" { - t.Skipf("Skipping database test because from share is not supported") - } - - databaseTest, databaseTestCleanup := testCase.CreateFn(t) - t.Cleanup(databaseTestCleanup) - - err := client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - LogLevel: sdk.Pointer(sdk.LogLevelInfo), - TraceLevel: sdk.Pointer(sdk.TraceLevelOnEvent), - }, - }) - require.NoError(t, err) - - require.Equal(t, string(sdk.LogLevelInfo), queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterLogLevel)) - require.Equal(t, string(sdk.TraceLevelOnEvent), queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterTraceLevel)) - - err = client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ - Unset: &sdk.DatabaseUnset{ - LogLevel: sdk.Bool(true), - TraceLevel: sdk.Bool(true), - }, - }) - require.NoError(t, err) - - require.Equal(t, string(sdk.LogLevelOff), queryParameterValueForDatabase(t, databaseTest.ID(), 
sdk.ObjectParameterLogLevel)) - require.Equal(t, string(sdk.TraceLevelOff), queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterTraceLevel)) - }) - - t.Run(fmt.Sprintf("Database: %s - setting and unsetting replace_invalid_characters and storage_serialization_policy", testCase.DatabaseType), func(t *testing.T) { - if testCase.DatabaseType == "From Share" { - t.Skipf("Skipping database test because from share is not supported") - } - - databaseTest, databaseTestCleanup := testCase.CreateFn(t) - t.Cleanup(databaseTestCleanup) - - err := client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - ReplaceInvalidCharacters: sdk.Bool(true), - StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyCompatible), - }, - }) - require.NoError(t, err) - - require.Equal(t, "true", queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterReplaceInvalidCharacters)) - require.Equal(t, string(sdk.StorageSerializationPolicyCompatible), queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterStorageSerializationPolicy)) - - err = client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ - Unset: &sdk.DatabaseUnset{ - ReplaceInvalidCharacters: sdk.Bool(true), - StorageSerializationPolicy: sdk.Bool(true), - }, - }) - require.NoError(t, err) - - replaceInvalidCharactersParam := queryParameterForDatabase(t, databaseTest.ID(), sdk.ObjectParameterReplaceInvalidCharacters) - storageSerializationPolicyParam := queryParameterForDatabase(t, databaseTest.ID(), sdk.ObjectParameterStorageSerializationPolicy) - - require.Equal(t, replaceInvalidCharactersParam.Default, replaceInvalidCharactersParam.Value) - require.Equal(t, storageSerializationPolicyParam.Default, storageSerializationPolicyParam.Value) - }) - - t.Run(fmt.Sprintf("Database: %s - setting and unsetting external volume and catalog", testCase.DatabaseType), func(t *testing.T) { + t.Run(fmt.Sprintf("Database: %s - 
setting and unsetting parameters", testCase.DatabaseType), func(t *testing.T) { if testCase.DatabaseType == "From Share" { t.Skipf("Skipping database test because from share is not supported") } @@ -528,54 +472,83 @@ func TestInt_DatabasesAlter(t *testing.T) { err := client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ Set: &sdk.DatabaseSet{ - ExternalVolume: &externalVolumeTest, - Catalog: &catalogIntegrationTest, - }, - }) - require.NoError(t, err) - require.Equal(t, externalVolumeTest.Name(), queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterExternalVolume)) - require.Equal(t, catalogIntegrationTest.Name(), queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterCatalog)) - - err = client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ - Unset: &sdk.DatabaseUnset{ - ExternalVolume: sdk.Bool(true), - Catalog: sdk.Bool(true), - }, - }) - require.NoError(t, err) - require.Empty(t, queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterExternalVolume)) - require.Empty(t, queryParameterValueForDatabase(t, databaseTest.ID(), sdk.ObjectParameterCatalog)) - }) - - t.Run(fmt.Sprintf("Database: %s - setting and unsetting retention time", testCase.DatabaseType), func(t *testing.T) { - if testCase.DatabaseType == "From Share" { - t.Skipf("Skipping database test because from share is not supported") - } - - databaseTest, databaseTestCleanup := testCase.CreateFn(t) - t.Cleanup(databaseTestCleanup) - - err := client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{ - Set: &sdk.DatabaseSet{ - DataRetentionTimeInDays: sdk.Int(42), + DataRetentionTimeInDays: sdk.Int(42), + MaxDataExtensionTimeInDays: sdk.Int(42), + ExternalVolume: &externalVolumeTest, + Catalog: &catalogIntegrationTest, + ReplaceInvalidCharacters: sdk.Bool(true), + DefaultDDLCollation: sdk.String("en_US"), + StorageSerializationPolicy: sdk.Pointer(sdk.StorageSerializationPolicyCompatible), + LogLevel: 
sdk.Pointer(sdk.LogLevelInfo),
+				TraceLevel:                              sdk.Pointer(sdk.TraceLevelOnEvent),
+				SuspendTaskAfterNumFailures:             sdk.Int(10),
+				TaskAutoRetryAttempts:                   sdk.Int(10),
+				UserTaskManagedInitialWarehouseSize:     sdk.Pointer(sdk.WarehouseSizeMedium),
+				UserTaskTimeoutMs:                       sdk.Int(12_000),
+				UserTaskMinimumTriggerIntervalInSeconds: sdk.Int(30),
+				QuotedIdentifiersIgnoreCase:             sdk.Bool(true),
+				EnableConsoleOutput:                     sdk.Bool(true),
 			},
 		})
 		require.NoError(t, err)
 
-		database, err := client.Databases.ShowByID(ctx, databaseTest.ID())
-		require.NoError(t, err)
-		assert.Equal(t, 42, database.RetentionTime)
+		params := testClientHelper().Parameter.ShowDatabaseParameters(t, databaseTest.ID())
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterDataRetentionTimeInDays, "42")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterMaxDataExtensionTimeInDays, "42")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterExternalVolume, externalVolumeTest.Name())
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterCatalog, catalogIntegrationTest.Name())
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterReplaceInvalidCharacters, "true")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterDefaultDDLCollation, "en_US")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterStorageSerializationPolicy, string(sdk.StorageSerializationPolicyCompatible))
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterLogLevel, string(sdk.LogLevelInfo))
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterTraceLevel, string(sdk.TraceLevelOnEvent))
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterSuspendTaskAfterNumFailures, "10")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterTaskAutoRetryAttempts, "10")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterUserTaskManagedInitialWarehouseSize, string(sdk.WarehouseSizeMedium))
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterUserTaskTimeoutMs, "12000")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds, "30")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterQuotedIdentifiersIgnoreCase, "true")
+		assertDatabaseParameterEquals(t, params, sdk.AccountParameterEnableConsoleOutput, "true")
 
 		err = client.Databases.Alter(ctx, databaseTest.ID(), &sdk.AlterDatabaseOptions{
 			Unset: &sdk.DatabaseUnset{
-				DataRetentionTimeInDays: sdk.Bool(true),
+				DataRetentionTimeInDays:                 sdk.Bool(true),
+				MaxDataExtensionTimeInDays:              sdk.Bool(true),
+				ExternalVolume:                          sdk.Bool(true),
+				Catalog:                                 sdk.Bool(true),
+				ReplaceInvalidCharacters:                sdk.Bool(true),
+				DefaultDDLCollation:                     sdk.Bool(true),
+				StorageSerializationPolicy:              sdk.Bool(true),
+				LogLevel:                                sdk.Bool(true),
+				TraceLevel:                              sdk.Bool(true),
+				SuspendTaskAfterNumFailures:             sdk.Bool(true),
+				TaskAutoRetryAttempts:                   sdk.Bool(true),
+				UserTaskManagedInitialWarehouseSize:     sdk.Bool(true),
+				UserTaskTimeoutMs:                       sdk.Bool(true),
+				UserTaskMinimumTriggerIntervalInSeconds: sdk.Bool(true),
+				QuotedIdentifiersIgnoreCase:             sdk.Bool(true),
+				EnableConsoleOutput:                     sdk.Bool(true),
 			},
 		})
 		require.NoError(t, err)
 
-		database, err = client.Databases.ShowByID(ctx, databaseTest.ID())
-		require.NoError(t, err)
-		assert.NotEqual(t, 42, database.RetentionTime)
+		params = testClientHelper().Parameter.ShowDatabaseParameters(t, databaseTest.ID())
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterDataRetentionTimeInDays)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterMaxDataExtensionTimeInDays)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterExternalVolume)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterCatalog)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterReplaceInvalidCharacters)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterDefaultDDLCollation)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterStorageSerializationPolicy)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterLogLevel)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterTraceLevel)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterSuspendTaskAfterNumFailures)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterTaskAutoRetryAttempts)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterUserTaskManagedInitialWarehouseSize)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterUserTaskTimeoutMs)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterUserTaskMinimumTriggerIntervalInSeconds)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterQuotedIdentifiersIgnoreCase)
+		assertDatabaseParameterEqualsToDefaultValue(t, params, sdk.ObjectParameterEnableConsoleOutput)
 	})
 
 	t.Run(fmt.Sprintf("Database: %s - setting and unsetting comment", testCase.DatabaseType), func(t *testing.T) {
diff --git a/templates/resources/secondary_database.md.tmpl b/templates/resources/secondary_database.md.tmpl
new file mode 100644
index 0000000000..2fb8f72c98
--- /dev/null
+++ b/templates/resources/secondary_database.md.tmpl
@@ -0,0 +1,33 @@
+---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "{{.Name}} {{.Type}} - {{.ProviderName}}"
+subcategory: ""
+description: |-
+{{ if gt (len (split .Description "<deprecation>")) 1 -}}
+{{ index (split .Description "<deprecation>") 1 | plainmarkdown | trimspace | prefixlines "  " }}
+{{- else -}}
+{{ .Description | plainmarkdown | trimspace | prefixlines "  " }}
+{{- end }}
+---
+
+# {{.Name}} ({{.Type}})
+
+~> **Note** The snowflake_secondary_database resource doesn't refresh itself; the recommended practice is to use a task scheduled at a certain interval. Check out the examples to see how to set up the refresh task. For a guide on SQL-based replication, see the [official documentation](https://docs.snowflake.com/en/user-guide/db-replication-config#replicating-a-database-to-another-account).
+
+{{ .Description | trimspace }}
+
+{{ if .HasExample -}}
+## Example Usage
+
+{{ tffile (printf "examples/resources/%s/resource.tf" .Name)}}
+{{- end }}
+
+{{ .SchemaMarkdown | trimspace }}
+{{- if .HasImport }}
+
+## Import
+
+Import is supported using the following syntax:
+
+{{ codefile "shell" (printf "examples/resources/%s/import.sh" .Name)}}
+{{- end }}
diff --git a/v1-preparations/ESSENTIAL_GA_OBJECTS.MD b/v1-preparations/ESSENTIAL_GA_OBJECTS.MD
index e518d0c7f3..d175228f57 100644
--- a/v1-preparations/ESSENTIAL_GA_OBJECTS.MD
+++ b/v1-preparations/ESSENTIAL_GA_OBJECTS.MD
@@ -21,7 +21,7 @@ newer provider versions. We will address these while working on the given object
 | SECURITY INTEGRATION | 👨‍💻 | [#2719](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2719), [#2568](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2568), [#2177](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2177), [#1851](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1851), [#1773](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1773), [#1741](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1741), [#1637](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1637), [#1503](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1503), [#1498](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1498), [#1421](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1421), [#1224](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1224) |
 | USER | ❌ | [#2817](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2817), [#2662](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2662), [#1572](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1572), [#1535](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1535), [#1155](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1155) |
 | WAREHOUSE | 👨‍💻 | [#1844](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1844), [#1104](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1104) |
-| FUNCTION | ❌ | [#2735](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2735), [#2426](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2426), [#1479](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1479), [#1393](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1393), [#1208](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1208), [#1079](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1079) |
+| FUNCTION | ❌ | [#2859](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2859), [#2735](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2735), [#2426](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2426), [#1479](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1479), [#1393](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1393), [#1208](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1208), [#1079](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1079) |
 | MASKING POLICY | ❌ | [#2236](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2236), [#2035](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2035), [#1799](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1799), [#1764](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1764), [#1656](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1656), [#1444](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1444), [#1422](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1422), [#1097](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1097) |
 | PROCEDURE | ❌ | [#2735](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2735), [#2623](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2623), [#2257](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2257), [#2146](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2146), [#1855](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1855), [#1695](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1695), [#1640](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1640), [#1195](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1195), [#1189](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1189), [#1178](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1178), [#1050](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1050) |
 | ROW ACCESS POLICY | ❌ | [#2053](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2053), [#1600](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1600), [#1151](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1151) |
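
The secondary_database template above points readers at the examples for the recommended refresh task. A sketch of such a setup is below; all identifiers are placeholders and the attribute names are my reading of the provider's `snowflake_task` and `snowflake_secondary_database` schemas, so verify them against your provider version rather than treating this as the PR's shipped example:

```terraform
# Hypothetical names throughout; check attribute names against your provider version.
resource "snowflake_secondary_database" "secondary" {
  name          = "SECONDARY_DB"
  # Fully qualified identifier of the primary database being replicated.
  as_replica_of = "MY_ORG.MY_ACCOUNT.PRIMARY_DB"
}

# Scheduled task that refreshes the replica every 10 minutes, as the
# replication guidelines recommend instead of refreshing on every read.
resource "snowflake_task" "refresh_secondary" {
  database      = "TASKS_DB"
  schema        = "PUBLIC"
  name          = "REFRESH_SECONDARY_DB"
  warehouse     = "COMPUTE_WH"
  schedule      = "10 MINUTE"
  sql_statement = "ALTER DATABASE ${snowflake_secondary_database.secondary.name} REFRESH"
  enabled       = true
}
```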
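
The test above replaces direct `SHOW DATABASES` field checks with per-parameter assertions (`assertDatabaseParameterEquals` / `assertDatabaseParameterEqualsToDefaultValue`, added in `pkg/acceptance/asserts.go` in this PR). Their implementations are not shown in this chunk; the following is a minimal standalone sketch of the lookup-and-compare pattern they presumably follow, with `Parameter` as an assumed stand-in for `sdk.Parameter`:

```go
package main

import "fmt"

// Parameter is a stand-in for the relevant fields of sdk.Parameter (assumed shape).
type Parameter struct {
	Key     string
	Value   string
	Default string
}

// findParameter returns the first parameter whose key matches, or nil.
func findParameter(params []*Parameter, key string) *Parameter {
	for _, p := range params {
		if p.Key == key {
			return p
		}
	}
	return nil
}

// assertParameterEquals errors when the named parameter is missing or its
// current value differs from the expected one (set path of the test).
func assertParameterEquals(params []*Parameter, key, expected string) error {
	p := findParameter(params, key)
	if p == nil {
		return fmt.Errorf("parameter %s not found", key)
	}
	if p.Value != expected {
		return fmt.Errorf("parameter %s: expected %q, got %q", key, expected, p.Value)
	}
	return nil
}

// assertParameterIsDefault errors unless the parameter's value equals its
// default, i.e. the parameter was successfully unset on the database.
func assertParameterIsDefault(params []*Parameter, key string) error {
	p := findParameter(params, key)
	if p == nil {
		return fmt.Errorf("parameter %s not found", key)
	}
	if p.Value != p.Default {
		return fmt.Errorf("parameter %s: expected default %q, got %q", key, p.Default, p.Value)
	}
	return nil
}
```

Comparing against `Default` rather than a hardcoded value keeps the unset-path assertions valid even when account-level defaults differ between test environments.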