diff --git a/MIGRATION_GUIDE.md b/MIGRATION_GUIDE.md index 0d1c894bb4..2e72424d20 100644 --- a/MIGRATION_GUIDE.md +++ b/MIGRATION_GUIDE.md @@ -5,6 +5,18 @@ describe deprecations or breaking changes and help you to change your configuration across different versions. ## v0.92.0 ➞ v0.93.0 + +### general changes + +With this change, we introduce the first resources redesigned for v1. We have made a few design choices that are reflected in these and in the subsequently reworked resources. These include: +- Handling the [default values](./v1-preparations/CHANGES_BEFORE_V1.md#default-values). +- Handling the ["empty" values](./v1-preparations/CHANGES_BEFORE_V1.md#empty-values). +- Handling the [Snowflake parameters](./v1-preparations/CHANGES_BEFORE_V1.md#snowflake-parameters). +- Saving the [config values in the state](./v1-preparations/CHANGES_BEFORE_V1.md#config-values-in-the-state). +- Providing a ["raw Snowflake output"](./v1-preparations/CHANGES_BEFORE_V1.md#raw-snowflake-output) for the managed resources. + +They are all described briefly in the [changes before v1 doc](./v1-preparations/CHANGES_BEFORE_V1.md). Please familiarize yourself with these changes before the upgrade. + ### old grant resources removal Following the [announcement](https://github.com/Snowflake-Labs/terraform-provider-snowflake/discussions/2736), we have removed the old grant resources. The two resources [snowflake_role_ownership_grant](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/role_ownership_grant) and [snowflake_user_ownership_grant](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/user_ownership_grant) were not listed in the announcement, but they were also marked as deprecated. We are removing them as well to conclude the grants redesign saga. 
@@ -27,9 +39,14 @@ Now, the `sync_password` field will set the state value to `unknown` whenever th Renamed field `provisioner_role` to `run_as_role` to align with Snowflake docs. Please rename this field in your configuration files. State will be migrated automatically. -#### *(behavior change)* Changed behavior of `enabled` +#### *(feature)* New fields +Fields added to the resource: +- `enabled` +- `sync_password` +- `comment` -Field `enabled` is now required. Previously the default value during create in Snowflake was `true`. If you created a resource with Terraform, please add `enabled = true` to have the same value. +#### *(behavior change)* Changed behavior of `enabled` +The new field `enabled` is required. Previously, the default value during create in Snowflake was `true`. If you created a resource with Terraform, please add `enabled = true` to keep the same value. #### *(behavior change)* Force new for multiple attributes Force new was added for the following attributes (because there are no usable SQL alter statements for them): @@ -38,10 +55,10 @@ Force new was added for the following attributes (because no usable SQL alter st ### snowflake_warehouse resource changes -Because of the multiple changes in the resource, the easiest migration way is to follow our [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md) to perform zero downtime migration. Alternatively, it is possible to follow some pointers below. Either way, familiarize yourself with the resource changes before version bumping. +Because of the multiple changes in the resource, the easiest way to migrate is to follow our [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md) and perform a zero-downtime migration. Alternatively, it is possible to follow the pointers below. 
Either way, familiarize yourself with the resource changes before bumping the version. Also, check the [design decisions](./v1-preparations/CHANGES_BEFORE_V1.md). #### *(potential behavior change)* Default values removed -As part of the [redesign](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1) we are removing the default values for attributes having their defaults on Snowflake side to reduce coupling with the provider. Because of that the following defaults were removed: +As part of the [redesign](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1), we are removing the default values for attributes whose defaults live on the Snowflake side, to reduce coupling with the provider (read more in [default values](./v1-preparations/CHANGES_BEFORE_V1.md#default-values)). Because of that, the following defaults were removed: - `comment` (previously `""`) - `enable_query_acceleration` (previously `false`) - `query_acceleration_max_scale_factor` (previously `8`) @@ -50,7 +67,7 @@ As part of the [redesign](https://github.com/Snowflake-Labs/terraform-provider-s - `statement_queued_timeout_in_seconds` (previously `0`) - `statement_timeout_in_seconds` (previously `172800`) -**Beware!** For attributes being Snowflake parameters (in case of warehouse: `max_concurrency_level`, `statement_queued_timeout_in_seconds`, and `statement_timeout_in_seconds`), this is a breaking change. Previously, not setting a value for them was treated as a fallback to values hardcoded on the provider side. This caused warehouse creation with these parameters set on the warehouse level (and not using the Snowflake default from hierarchy; read more in the [parameters documentation](https://docs.snowflake.com/en/sql-reference/parameters)). 
To keep the previous values, fill in your configs to the default values listed above. +**Beware!** For attributes that are Snowflake parameters (in the case of the warehouse: `max_concurrency_level`, `statement_queued_timeout_in_seconds`, and `statement_timeout_in_seconds`), this is a breaking change (read more in [Snowflake parameters](./v1-preparations/CHANGES_BEFORE_V1.md#snowflake-parameters)). Previously, not setting a value for them was treated as a fallback to values hardcoded on the provider side. This caused warehouses to be created with these parameters set at the warehouse level (and not using the Snowflake default from the hierarchy; read more in the [parameters documentation](https://docs.snowflake.com/en/sql-reference/parameters)). To keep the previous values, set these attributes in your configs to the default values listed above. All previous defaults were aligned with the current Snowflake ones; however, it's not possible to distinguish between a filled-out value and no value in the automatic state upgrader. Therefore, if a given attribute is not filled out in your configuration, Terraform will try to perform an update after the change (to UNSET the given attribute back to the Snowflake default); this should result in no changes on the Snowflake object side, but it is required to align the Terraform state with your config. **All** other optional fields that were not set in the config at all (because of the change in state-handling logic on our provider side) will follow the same logic. To avoid these changes, fill out the default fields in your config. Alternatively, run apply; no further changes should be shown as part of the plan. 
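For illustration, the defaults listed above can be pinned explicitly so the upgrade does not change the effective settings. This is only a sketch: the warehouse name is made up, and note that `enable_query_acceleration` is now a string-encoded boolean in this version.

```terraform
resource "snowflake_warehouse" "example" {
  name = "EXAMPLE_WH" # illustrative name

  # Previous provider-side defaults, now set explicitly:
  comment                             = ""
  enable_query_acceleration           = "false" # boolean values are now string-encoded
  query_acceleration_max_scale_factor = 8
  max_concurrency_level               = 8
  statement_queued_timeout_in_seconds = 0
  statement_timeout_in_seconds        = 172800
}
```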
@@ -58,9 +75,7 @@ All previous defaults were aligned with the current Snowflake ones, however it's There are three migrations that should happen automatically with the version bump: - incorrect `2XLARGE`, `3XLARGE`, `4XLARGE`, `5XLARGE`, `6XLARGE` values for warehouse size are changed to the proper ones - deprecated `wait_for_provisioning` attribute is removed from the state -- old empty resource monitor attribute is cleaned (earlier it was set to `"null"` string) - -[//]: # (TODO [SNOW-1348102 - after discussion]: describe the new state approach if decided) +- old empty resource monitor attribute is cleaned (earlier it was set to `"null"` string) #### *(fix)* Warehouse size UNSET @@ -100,6 +115,8 @@ To easily handle three-value logic (true, false, unknown) in provider's configs, The outputs of both commands are held in `warehouses` entry, where **DESC WAREHOUSE** is saved in the `describe_output` field, and **SHOW PARAMETERS IN WAREHOUSE** in the `parameters` field. It's important to limit the records and calls to Snowflake to the minimum. That's why we recommend assessing which information you need from the data source and then providing strong filters and turning off additional fields for better plan performance. +You can read more in ["raw Snowflake output"](./v1-preparations/CHANGES_BEFORE_V1.md#empty-values). 
+ ### new database resources As part of the [preparation for v1](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1), we split up the database resource into multiple ones: - Standard database - can be used as `snowflake_database` (replaces the old one and is used to create databases with optional ability to become a primary database ready for replication) diff --git a/docs/resources/scim_integration.md b/docs/resources/scim_integration.md index 819e1a2cda..e4a0dd1086 100644 --- a/docs/resources/scim_integration.md +++ b/docs/resources/scim_integration.md @@ -45,7 +45,7 @@ resource "snowflake_scim_integration" "test" { - `comment` (String) Specifies a comment for the integration. - `network_policy` (String) Specifies an existing network policy that controls SCIM network traffic. -- `sync_password` (String) Specifies whether to enable or disable the synchronization of a user password from an Okta SCIM client as part of the API request to Snowflake. Available options are: `true` or `false`. When the value is not set in the configuration the provider will put `unknown` there which means to use the Snowflake default for this value. +- `sync_password` (String) Specifies whether to enable or disable the synchronization of a user password from an Okta SCIM client as part of the API request to Snowflake. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. ### Read-Only diff --git a/docs/resources/warehouse.md b/docs/resources/warehouse.md index cc7e1db4d0..3f62256b2d 100644 --- a/docs/resources/warehouse.md +++ b/docs/resources/warehouse.md @@ -28,10 +28,10 @@ resource "snowflake_warehouse" "warehouse" { ### Optional -- `auto_resume` (String) Specifies whether to automatically resume a warehouse when a SQL statement (e.g. query) is submitted to it. 
+- `auto_resume` (String) Specifies whether to automatically resume a warehouse when a SQL statement (e.g. query) is submitted to it. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. - `auto_suspend` (Number) Specifies the number of seconds of inactivity after which a warehouse is automatically suspended. - `comment` (String) Specifies a comment for the warehouse. -- `enable_query_acceleration` (String) Specifies whether to enable the query acceleration service for queries that rely on this warehouse for compute resources. +- `enable_query_acceleration` (String) Specifies whether to enable the query acceleration service for queries that rely on this warehouse for compute resources. Available options are: "true" or "false". When the value is not set in the configuration, the provider will put "default" there, which means to use the Snowflake default for this value. - `initially_suspended` (Boolean) Specifies whether the warehouse is created initially in the ‘Suspended’ state. - `max_cluster_count` (Number) Specifies the maximum number of server clusters for the warehouse. - `max_concurrency_level` (Number) Object parameter that specifies the concurrency level for SQL statements (i.e. queries and DML) executed by a warehouse. 
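The three-value logic behind these string-encoded booleans ("true", "false", and a sentinel meaning "use the Snowflake default") can be sketched in Go. The helper name and the "default" sentinel below are illustrative, not the provider's actual API:

```go
package main

import (
	"fmt"
	"strconv"
)

// booleanStringToPointer maps a string-encoded boolean onto *bool:
// "true"/"false" become a concrete value, while the sentinel "default"
// becomes nil, meaning "do not send the field and let Snowflake apply
// its own default". The function name and sentinel are illustrative only.
func booleanStringToPointer(s string) (*bool, error) {
	if s == "default" {
		return nil, nil
	}
	b, err := strconv.ParseBool(s)
	if err != nil {
		return nil, fmt.Errorf("expected \"true\", \"false\" or \"default\", got %q", s)
	}
	return &b, nil
}

func main() {
	for _, v := range []string{"true", "false", "default"} {
		p, err := booleanStringToPointer(v)
		if err != nil {
			panic(err)
		}
		if p == nil {
			fmt.Printf("%s -> use Snowflake default\n", v)
		} else {
			fmt.Printf("%s -> %v\n", v, *p)
		}
	}
}
```

In the provider itself, the same idea surfaces as the documented `"true"`/`"false"`/`"default"` values of fields like `sync_password` and `auto_resume`.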
diff --git a/pkg/datasources/databases.go b/pkg/datasources/databases.go index cc61d32795..fa0acd83e8 100644 --- a/pkg/datasources/databases.go +++ b/pkg/datasources/databases.go @@ -3,12 +3,11 @@ package datasources import ( "context" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" - "github.com/hashicorp/terraform-plugin-sdk/v2/diag" - + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/diag" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" ) @@ -61,7 +60,7 @@ var databasesSchema = map[string]*schema.Schema{ Description: "Holds the aggregated output of all database details queries.", Elem: &schema.Resource{ Schema: map[string]*schema.Schema{ - "show_output": { + resources.ShowOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of SHOW DATABASES.", @@ -69,7 +68,7 @@ var databasesSchema = map[string]*schema.Schema{ Schema: schemas.ShowDatabaseSchema, }, }, - "describe_output": { + resources.DescribeOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of DESCRIBE DATABASE.", @@ -77,7 +76,7 @@ var databasesSchema = map[string]*schema.Schema{ Schema: schemas.DatabaseDescribeSchema, }, }, - "parameters": { + resources.ParametersAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of SHOW PARAMETERS FOR DATABASE.", @@ -159,9 +158,9 @@ func ReadDatabases(ctx context.Context, d *schema.ResourceData, meta any) diag.D } flattenedDatabases[i] = map[string]any{ - "show_output": []map[string]any{schemas.DatabaseToSchema(&database)}, - "describe_output": databaseDescription, - 
"parameters": databaseParameters, + resources.ShowOutputAttributeName: []map[string]any{schemas.DatabaseToSchema(&database)}, + resources.DescribeOutputAttributeName: databaseDescription, + resources.ParametersAttributeName: databaseParameters, } } diff --git a/pkg/datasources/security_integrations.go b/pkg/datasources/security_integrations.go index 18a8fd1305..c020cc78c2 100644 --- a/pkg/datasources/security_integrations.go +++ b/pkg/datasources/security_integrations.go @@ -4,6 +4,7 @@ import ( "context" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/diag" @@ -28,7 +29,7 @@ var securityIntegrationsSchema = map[string]*schema.Schema{ Description: "Holds the aggregated output of all security integrations details queries.", Elem: &schema.Resource{ Schema: map[string]*schema.Schema{ - "show_output": { + resources.ShowOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of SHOW SECURITY INTEGRATIONS.", @@ -36,7 +37,7 @@ var securityIntegrationsSchema = map[string]*schema.Schema{ Schema: schemas.ShowSecurityIntegrationSchema, }, }, - "describe_output": { + resources.DescribeOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of DESCRIBE SECURITY INTEGRATIONS.", @@ -88,8 +89,8 @@ func ReadSecurityIntegrations(ctx context.Context, d *schema.ResourceData, meta } flattenedSecurityIntegrations[i] = map[string]any{ - "show_output": []map[string]any{schemas.SecurityIntegrationToSchema(&securityIntegration)}, - "describe_output": securityIntegrationDescriptions, + resources.ShowOutputAttributeName: []map[string]any{schemas.SecurityIntegrationToSchema(&securityIntegration)}, + 
resources.DescribeOutputAttributeName: securityIntegrationDescriptions, } } diff --git a/pkg/datasources/warehouses.go b/pkg/datasources/warehouses.go index bd431093b2..8c153989bd 100644 --- a/pkg/datasources/warehouses.go +++ b/pkg/datasources/warehouses.go @@ -4,6 +4,7 @@ import ( "context" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/diag" @@ -34,7 +35,7 @@ var warehousesSchema = map[string]*schema.Schema{ Description: "Holds the aggregated output of all warehouse details queries.", Elem: &schema.Resource{ Schema: map[string]*schema.Schema{ - "show_output": { + resources.ShowOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of SHOW WAREHOUSES.", @@ -42,7 +43,7 @@ var warehousesSchema = map[string]*schema.Schema{ Schema: schemas.ShowWarehouseSchema, }, }, - "describe_output": { + resources.DescribeOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of DESCRIBE WAREHOUSE.", @@ -50,7 +51,7 @@ var warehousesSchema = map[string]*schema.Schema{ Schema: schemas.WarehouseDescribeSchema, }, }, - "parameters": { + resources.ParametersAttributeName: { Type: schema.TypeList, Computed: true, Description: "Holds the output of SHOW PARAMETERS FOR WAREHOUSE.", @@ -114,9 +115,9 @@ func ReadWarehouses(ctx context.Context, d *schema.ResourceData, meta any) diag. 
} flattenedWarehouses[i] = map[string]any{ - "show_output": []map[string]any{schemas.WarehouseToSchema(&warehouse)}, - "describe_output": warehouseDescription, - "parameters": warehouseParameters, + resources.ShowOutputAttributeName: []map[string]any{schemas.WarehouseToSchema(&warehouse)}, + resources.DescribeOutputAttributeName: warehouseDescription, + resources.ParametersAttributeName: warehouseParameters, } } diff --git a/pkg/resources/custom_diffs.go b/pkg/resources/custom_diffs.go index 14382da0af..72183af3e5 100644 --- a/pkg/resources/custom_diffs.go +++ b/pkg/resources/custom_diffs.go @@ -4,12 +4,11 @@ import ( "context" "log" "strconv" + "strings" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" - - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" ) @@ -72,3 +71,50 @@ func ComputedIfAnyAttributeChanged(key string, changedAttributeKeys ...string) s return result }) } + +type parameter struct { + parameterName sdk.AccountParameter + valueType valueType + parameterType sdk.ParameterType +} + +type valueType string + +const ( + valueTypeInt valueType = "int" + valueTypeBool valueType = "bool" + valueTypeString valueType = "string" +) + +type ResourceIdProvider interface { + Id() string +} + +func ParametersCustomDiff(parametersProvider func(context.Context, ResourceIdProvider, any) ([]*sdk.Parameter, error), parameters ...parameter) schema.CustomizeDiffFunc { + return func(ctx context.Context, d *schema.ResourceDiff, meta any) error { + if d.Id() == "" { + return nil + } + + params, err := parametersProvider(ctx, d, meta) + if err != nil { + return err + } + + diffFunctions := make([]schema.CustomizeDiffFunc, len(parameters)) + for idx, p := range parameters { + var diffFunction 
schema.CustomizeDiffFunc + switch p.valueType { + case valueTypeInt: + diffFunction = IntParameterValueComputedIf(strings.ToLower(string(p.parameterName)), params, p.parameterType, p.parameterName) + case valueTypeBool: + diffFunction = BoolParameterValueComputedIf(strings.ToLower(string(p.parameterName)), params, p.parameterType, p.parameterName) + case valueTypeString: + diffFunction = StringParameterValueComputedIf(strings.ToLower(string(p.parameterName)), params, p.parameterType, p.parameterName) + } + diffFunctions[idx] = diffFunction + } + + return customdiff.All(diffFunctions...)(ctx, d, meta) + } +} diff --git a/pkg/resources/database.go b/pkg/resources/database.go index 921c407a66..bc73e4ec5b 100644 --- a/pkg/resources/database.go +++ b/pkg/resources/database.go @@ -79,17 +79,18 @@ func Database() *schema.Resource { SchemaVersion: 1, CreateContext: CreateDatabase, + UpdateContext: UpdateDatabase, ReadContext: ReadDatabase, DeleteContext: DeleteDatabase, - UpdateContext: UpdateDatabase, Description: "Represents a standard database. If replication configuration is specified, the database is promoted to serve as a primary database for replication.", - CustomizeDiff: DatabaseParametersCustomDiff, - Schema: MergeMaps(databaseSchema, DatabaseParametersSchema), + Schema: MergeMaps(databaseSchema, DatabaseParametersSchema), Importer: &schema.ResourceImporter{ StateContext: schema.ImportStatePassthroughContext, }, + CustomizeDiff: DatabaseParametersCustomDiff, + StateUpgraders: []schema.StateUpgrader{ { Version: 0, @@ -128,7 +129,7 @@ func CreateDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag. 
} err = client.Databases.Create(ctx, id, &sdk.CreateDatabaseOptions{ - Transient: GetPropertyAsPointer[bool](d, "is_transient"), + Transient: GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "is_transient"), DataRetentionTimeInDays: dataRetentionTimeInDays, MaxDataExtensionTimeInDays: maxDataExtensionTimeInDays, ExternalVolume: externalVolume, @@ -145,7 +146,7 @@ func CreateDatabase(ctx context.Context, d *schema.ResourceData, meta any) diag. UserTaskMinimumTriggerIntervalInSeconds: userTaskMinimumTriggerIntervalInSeconds, QuotedIdentifiersIgnoreCase: quotedIdentifiersIgnoreCase, EnableConsoleOutput: enableConsoleOutput, - Comment: GetPropertyAsPointer[string](d, "comment"), + Comment: GetConfigPropertyAsPointerAllowingZeroValue[string](d, "comment"), }) if err != nil { return diag.FromErr(err) diff --git a/pkg/resources/database_acceptance_test.go b/pkg/resources/database_acceptance_test.go index cdd9466069..3ca0128b94 100644 --- a/pkg/resources/database_acceptance_test.go +++ b/pkg/resources/database_acceptance_test.go @@ -6,24 +6,23 @@ import ( "strconv" "testing" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/snowflakechecks" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/importchecks" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/planchecks" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" tfjson "github.com/hashicorp/terraform-json" - "github.com/hashicorp/terraform-plugin-testing/plancheck" - "github.com/stretchr/testify/require" - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/importchecks" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/planchecks" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/snowflakechecks" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testenvs" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-testing/config" "github.com/hashicorp/terraform-plugin-testing/helper/resource" + "github.com/hashicorp/terraform-plugin-testing/plancheck" "github.com/hashicorp/terraform-plugin-testing/tfversion" + "github.com/stretchr/testify/require" ) func TestAcc_Database_Basic(t *testing.T) { @@ -986,7 +985,7 @@ func TestAcc_Database_UpgradeWithTheSameFieldsAsInTheOldOne(t *testing.T) { resource.TestCheckResourceAttr("snowflake_database.test", "name", id.Name()), resource.TestCheckResourceAttr("snowflake_database.test", "is_transient", "true"), resource.TestCheckResourceAttr("snowflake_database.test", "comment", comment), - resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_database.test", "data_retention_time_in_days", r.IntDefaultString), ), }, { diff --git a/pkg/resources/database_commons.go b/pkg/resources/database_commons.go index b2a1cf307e..17e3c46431 100644 --- a/pkg/resources/database_commons.go +++ b/pkg/resources/database_commons.go @@ -7,15 +7,12 @@ import ( "strconv" "strings" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" - 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-sdk/v2/diag" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" ) var ( @@ -25,41 +22,40 @@ var ( sdk.ObjectParameterDataRetentionTimeInDays, sdk.ObjectParameterMaxDataExtensionTimeInDays, } - DatabaseParametersCustomDiff = func(ctx context.Context, d *schema.ResourceDiff, meta any) error { - if d.Id() == "" { - return nil - } - - client := meta.(*provider.Context).Client - params, err := client.Parameters.ShowParameters(context.Background(), &sdk.ShowParametersOptions{ - In: &sdk.ParametersIn{ - Database: helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier), - }, - }) - if err != nil { - return err - } + DatabaseParametersCustomDiff = ParametersCustomDiff( + databaseParametersProvider, + parameter{sdk.AccountParameterDataRetentionTimeInDays, valueTypeInt, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterMaxDataExtensionTimeInDays, valueTypeInt, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterExternalVolume, valueTypeString, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterCatalog, valueTypeString, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterReplaceInvalidCharacters, valueTypeBool, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterDefaultDDLCollation, valueTypeString, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterStorageSerializationPolicy, valueTypeString, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterLogLevel, valueTypeString, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterTraceLevel, valueTypeString, sdk.ParameterTypeDatabase}, + 
parameter{sdk.AccountParameterSuspendTaskAfterNumFailures, valueTypeInt, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterTaskAutoRetryAttempts, valueTypeInt, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterUserTaskManagedInitialWarehouseSize, valueTypeString, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterUserTaskTimeoutMs, valueTypeInt, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds, valueTypeInt, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterQuotedIdentifiersIgnoreCase, valueTypeBool, sdk.ParameterTypeDatabase}, + parameter{sdk.AccountParameterEnableConsoleOutput, valueTypeBool, sdk.ParameterTypeDatabase}, + ) +) - return customdiff.All( - IntParameterValueComputedIf("data_retention_time_in_days", params, sdk.ParameterTypeDatabase, sdk.AccountParameterDataRetentionTimeInDays), - IntParameterValueComputedIf("max_data_extension_time_in_days", params, sdk.ParameterTypeDatabase, sdk.AccountParameterMaxDataExtensionTimeInDays), - StringParameterValueComputedIf("external_volume", params, sdk.ParameterTypeDatabase, sdk.AccountParameterExternalVolume), - StringParameterValueComputedIf("catalog", params, sdk.ParameterTypeDatabase, sdk.AccountParameterCatalog), - BoolParameterValueComputedIf("replace_invalid_characters", params, sdk.ParameterTypeDatabase, sdk.AccountParameterReplaceInvalidCharacters), - StringParameterValueComputedIf("default_ddl_collation", params, sdk.ParameterTypeDatabase, sdk.AccountParameterDefaultDDLCollation), - StringParameterValueComputedIf("storage_serialization_policy", params, sdk.ParameterTypeDatabase, sdk.AccountParameterStorageSerializationPolicy), - StringParameterValueComputedIf("log_level", params, sdk.ParameterTypeDatabase, sdk.AccountParameterLogLevel), - StringParameterValueComputedIf("trace_level", params, sdk.ParameterTypeDatabase, sdk.AccountParameterTraceLevel), - IntParameterValueComputedIf("suspend_task_after_num_failures", 
params, sdk.ParameterTypeDatabase, sdk.AccountParameterSuspendTaskAfterNumFailures), - IntParameterValueComputedIf("task_auto_retry_attempts", params, sdk.ParameterTypeDatabase, sdk.AccountParameterTaskAutoRetryAttempts), - StringParameterValueComputedIf("user_task_managed_initial_warehouse_size", params, sdk.ParameterTypeDatabase, sdk.AccountParameterUserTaskManagedInitialWarehouseSize), - IntParameterValueComputedIf("user_task_timeout_ms", params, sdk.ParameterTypeDatabase, sdk.AccountParameterUserTaskTimeoutMs), - IntParameterValueComputedIf("user_task_minimum_trigger_interval_in_seconds", params, sdk.ParameterTypeDatabase, sdk.AccountParameterUserTaskMinimumTriggerIntervalInSeconds), - BoolParameterValueComputedIf("quoted_identifiers_ignore_case", params, sdk.ParameterTypeDatabase, sdk.AccountParameterQuotedIdentifiersIgnoreCase), - BoolParameterValueComputedIf("enable_console_output", params, sdk.ParameterTypeDatabase, sdk.AccountParameterEnableConsoleOutput), - )(ctx, d, meta) +func databaseParametersProvider(ctx context.Context, d ResourceIdProvider, meta any) ([]*sdk.Parameter, error) { + client := meta.(*provider.Context).Client + id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + databaseParameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ + In: &sdk.ParametersIn{ + Database: id, + }, + }) + if err != nil { + return nil, err } -) + return databaseParameters, nil +} func init() { databaseParameterFields := []struct { @@ -220,38 +216,38 @@ func GetAllDatabaseParameters(d *schema.ResourceData) ( enableConsoleOutput *bool, err error, ) { - dataRetentionTimeInDays = GetPropertyAsPointer[int](d, "data_retention_time_in_days") - maxDataExtensionTimeInDays = GetPropertyAsPointer[int](d, "max_data_extension_time_in_days") - if externalVolumeRaw := GetPropertyAsPointer[string](d, "external_volume"); externalVolumeRaw != nil { + dataRetentionTimeInDays = GetConfigPropertyAsPointerAllowingZeroValue[int](d, 
"data_retention_time_in_days") + maxDataExtensionTimeInDays = GetConfigPropertyAsPointerAllowingZeroValue[int](d, "max_data_extension_time_in_days") + if externalVolumeRaw := GetConfigPropertyAsPointerAllowingZeroValue[string](d, "external_volume"); externalVolumeRaw != nil { externalVolume = sdk.Pointer(sdk.NewAccountObjectIdentifier(*externalVolumeRaw)) } - if catalogRaw := GetPropertyAsPointer[string](d, "catalog"); catalogRaw != nil { + if catalogRaw := GetConfigPropertyAsPointerAllowingZeroValue[string](d, "catalog"); catalogRaw != nil { catalog = sdk.Pointer(sdk.NewAccountObjectIdentifier(*catalogRaw)) } - replaceInvalidCharacters = GetPropertyAsPointer[bool](d, "replace_invalid_characters") - defaultDDLCollation = GetPropertyAsPointer[string](d, "default_ddl_collation") - if storageSerializationPolicyRaw := GetPropertyAsPointer[string](d, "storage_serialization_policy"); storageSerializationPolicyRaw != nil { + replaceInvalidCharacters = GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "replace_invalid_characters") + defaultDDLCollation = GetConfigPropertyAsPointerAllowingZeroValue[string](d, "default_ddl_collation") + if storageSerializationPolicyRaw := GetConfigPropertyAsPointerAllowingZeroValue[string](d, "storage_serialization_policy"); storageSerializationPolicyRaw != nil { storageSerializationPolicy = sdk.Pointer(sdk.StorageSerializationPolicy(*storageSerializationPolicyRaw)) } - if logLevelRaw := GetPropertyAsPointer[string](d, "log_level"); logLevelRaw != nil { + if logLevelRaw := GetConfigPropertyAsPointerAllowingZeroValue[string](d, "log_level"); logLevelRaw != nil { logLevel = sdk.Pointer(sdk.LogLevel(*logLevelRaw)) } - if traceLevelRaw := GetPropertyAsPointer[string](d, "trace_level"); traceLevelRaw != nil { + if traceLevelRaw := GetConfigPropertyAsPointerAllowingZeroValue[string](d, "trace_level"); traceLevelRaw != nil { traceLevel = sdk.Pointer(sdk.TraceLevel(*traceLevelRaw)) } - suspendTaskAfterNumFailures = GetPropertyAsPointer[int](d, 
"suspend_task_after_num_failures") - taskAutoRetryAttempts = GetPropertyAsPointer[int](d, "task_auto_retry_attempts") - if userTaskManagedInitialWarehouseSizeRaw := GetPropertyAsPointer[string](d, "user_task_managed_initial_warehouse_size"); userTaskManagedInitialWarehouseSizeRaw != nil { + suspendTaskAfterNumFailures = GetConfigPropertyAsPointerAllowingZeroValue[int](d, "suspend_task_after_num_failures") + taskAutoRetryAttempts = GetConfigPropertyAsPointerAllowingZeroValue[int](d, "task_auto_retry_attempts") + if userTaskManagedInitialWarehouseSizeRaw := GetConfigPropertyAsPointerAllowingZeroValue[string](d, "user_task_managed_initial_warehouse_size"); userTaskManagedInitialWarehouseSizeRaw != nil { var warehouseSize sdk.WarehouseSize if warehouseSize, err = sdk.ToWarehouseSize(*userTaskManagedInitialWarehouseSizeRaw); err != nil { return } userTaskManagedInitialWarehouseSize = sdk.Pointer(warehouseSize) } - userTaskTimeoutMs = GetPropertyAsPointer[int](d, "user_task_timeout_ms") - userTaskMinimumTriggerIntervalInSeconds = GetPropertyAsPointer[int](d, "user_task_minimum_trigger_interval_in_seconds") - quotedIdentifiersIgnoreCase = GetPropertyAsPointer[bool](d, "quoted_identifiers_ignore_case") - enableConsoleOutput = GetPropertyAsPointer[bool](d, "enable_console_output") + userTaskTimeoutMs = GetConfigPropertyAsPointerAllowingZeroValue[int](d, "user_task_timeout_ms") + userTaskMinimumTriggerIntervalInSeconds = GetConfigPropertyAsPointerAllowingZeroValue[int](d, "user_task_minimum_trigger_interval_in_seconds") + quotedIdentifiersIgnoreCase = GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "quoted_identifiers_ignore_case") + enableConsoleOutput = GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "enable_console_output") return } diff --git a/pkg/resources/database_old.go b/pkg/resources/database_old.go index 409338dfea..0d5acb8a63 100644 --- a/pkg/resources/database_old.go +++ b/pkg/resources/database_old.go @@ -36,7 +36,7 @@ var databaseOldSchema = 
map[string]*schema.Schema{ "data_retention_time_in_days": { Type: schema.TypeInt, Optional: true, - Default: -1, + Default: IntDefault, Description: "Number of days for which Snowflake retains historical data for performing Time Travel actions (SELECT, CLONE, UNDROP) on the object. A value of 0 effectively disables Time Travel for the specified database. Default value for this field is set to -1, which is a fallback to use Snowflake default. For more information, see [Understanding & Using Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel).", ValidateFunc: validation.IntBetween(-1, 90), }, @@ -128,7 +128,7 @@ func CreateDatabaseOld(d *schema.ResourceData, meta interface{}) error { if primaryName, ok := d.GetOk("from_replica"); ok { primaryID := sdk.NewExternalObjectIdentifierFromFullyQualifiedName(primaryName.(string)) opts := &sdk.CreateSecondaryDatabaseOptions{} - if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { + if v := d.Get("data_retention_time_in_days"); v.(int) != IntDefault { opts.DataRetentionTimeInDays = sdk.Int(v.(int)) } err := client.Databases.CreateSecondary(ctx, id, primaryID, opts) @@ -156,7 +156,7 @@ func CreateDatabaseOld(d *schema.ResourceData, meta interface{}) error { } } - if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { + if v := d.Get("data_retention_time_in_days"); v.(int) != IntDefault { opts.DataRetentionTimeInDays = sdk.Int(v.(int)) } @@ -218,7 +218,7 @@ func ReadDatabaseOld(d *schema.ResourceData, meta interface{}) error { return err } - if dataRetentionDays := d.Get("data_retention_time_in_days"); dataRetentionDays.(int) != -1 || database.RetentionTime != paramDataRetention { + if dataRetentionDays := d.Get("data_retention_time_in_days"); dataRetentionDays.(int) != IntDefault || database.RetentionTime != paramDataRetention { if err := d.Set("data_retention_time_in_days", database.RetentionTime); err != nil { return err } @@ -267,7 +267,7 @@ func UpdateDatabaseOld(d *schema.ResourceData, 
meta interface{}) error { } if d.HasChange("data_retention_time_in_days") { - if days := d.Get("data_retention_time_in_days"); days.(int) != -1 { + if days := d.Get("data_retention_time_in_days"); days.(int) != IntDefault { err := client.Databases.Alter(ctx, id, &sdk.AlterDatabaseOptions{ Set: &sdk.DatabaseSet{ DataRetentionTimeInDays: sdk.Int(days.(int)), diff --git a/pkg/resources/database_old_acceptance_test.go b/pkg/resources/database_old_acceptance_test.go index 37cc76d7aa..2ac6014380 100644 --- a/pkg/resources/database_old_acceptance_test.go +++ b/pkg/resources/database_old_acceptance_test.go @@ -7,6 +7,7 @@ import ( "testing" acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" @@ -210,7 +211,7 @@ func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), ), }, @@ -221,7 +222,7 @@ func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + 
resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), ), }, @@ -245,7 +246,7 @@ func TestAcc_Database_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), checkAccountAndDatabaseDataRetentionTime(t, id, 10, 10), ), }, @@ -301,7 +302,7 @@ func TestAcc_Database_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing. ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), ), }, @@ -310,7 +311,7 @@ func TestAcc_Database_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing. 
ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Database_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutDatabaseDataRetentionTime(), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_database_old.test", "data_retention_time_in_days", r.IntDefaultString), checkAccountAndDatabaseDataRetentionTime(t, id, 5, 5), ), }, diff --git a/pkg/resources/diff_suppressions.go b/pkg/resources/diff_suppressions.go index 59edce5c83..3e17034d46 100644 --- a/pkg/resources/diff_suppressions.go +++ b/pkg/resources/diff_suppressions.go @@ -33,7 +33,7 @@ func IgnoreChangeToCurrentSnowflakeValueInShow(keyInShowOutput string) schema.Sc return false } - if queryOutput, ok := d.GetOk(showOutputAttributeName); ok { + if queryOutput, ok := d.GetOk(ShowOutputAttributeName); ok { queryOutputList := queryOutput.([]any) if len(queryOutputList) == 1 { result := queryOutputList[0].(map[string]any) @@ -53,7 +53,7 @@ func IgnoreChangeToCurrentSnowflakeValueInDescribe(keyInDescribeOutput string) s return false } - if queryOutput, ok := d.GetOk(describeOutputAttributeName); ok { + if queryOutput, ok := d.GetOk(DescribeOutputAttributeName); ok { queryOutputList := queryOutput.([]any) if len(queryOutputList) == 1 { result := queryOutputList[0].(map[string]any) diff --git a/pkg/resources/helpers.go b/pkg/resources/helpers.go index cbf936c2ba..e661f7b761 100644 --- a/pkg/resources/helpers.go +++ b/pkg/resources/helpers.go @@ -141,6 +141,18 @@ func GetPropertyAsPointer[T any](d *schema.ResourceData, property string) *T { return &typedValue } +func GetConfigPropertyAsPointerAllowingZeroValue[T any](d *schema.ResourceData, property string) *T { + if d.GetRawConfig().AsValueMap()[property].IsNull() { + return nil + } + value := d.Get(property) + typedValue, ok := value.(T) + if !ok { + return nil + } + return &typedValue +} + func 
GetPropertyOfFirstNestedObjectByValueKey[T any](d *schema.ResourceData, propertyKey string) (*T, error) { return GetPropertyOfFirstNestedObjectByKey[T](d, propertyKey, "value") } diff --git a/pkg/resources/helpers_test.go b/pkg/resources/helpers_test.go index dabf7bbfc6..1b4170d41b 100644 --- a/pkg/resources/helpers_test.go +++ b/pkg/resources/helpers_test.go @@ -16,39 +16,131 @@ import ( "github.com/stretchr/testify/assert" ) -type grantType int - -const ( - normal grantType = iota - onFuture - onAll -) - -func TestGetPropertyAsPointer(t *testing.T) { +func Test_GetPropertyAsPointer(t *testing.T) { d := schema.TestResourceDataRaw(t, map[string]*schema.Schema{ "integer": { Type: schema.TypeInt, Required: true, }, + "second_integer": { + Type: schema.TypeInt, + Optional: true, + }, + "third_integer": { + Type: schema.TypeInt, + Optional: true, + }, "string": { Type: schema.TypeString, Required: true, }, + "second_string": { + Type: schema.TypeString, + Optional: true, + }, + "third_string": { + Type: schema.TypeString, + Optional: true, + }, "boolean": { Type: schema.TypeBool, Required: true, }, + "second_boolean": { + Type: schema.TypeBool, + Optional: true, + }, + "third_boolean": { + Type: schema.TypeBool, + Optional: true, + }, }, map[string]interface{}{ - "integer": 123, - "string": "some string", - "boolean": true, - "invalid": true, + "integer": 123, + "second_integer": 0, + "string": "some string", + "second_string": "", + "boolean": true, + "second_boolean": false, + "invalid": true, }) assert.Equal(t, 123, *resources.GetPropertyAsPointer[int](d, "integer")) assert.Equal(t, "some string", *resources.GetPropertyAsPointer[string](d, "string")) assert.Equal(t, true, *resources.GetPropertyAsPointer[bool](d, "boolean")) assert.Nil(t, resources.GetPropertyAsPointer[bool](d, "invalid")) + + assert.Equal(t, 123, *resources.GetPropertyAsPointer[int](d, "integer")) + assert.Nil(t, resources.GetPropertyAsPointer[int](d, "second_integer")) + assert.Nil(t, 
resources.GetPropertyAsPointer[int](d, "third_integer")) + assert.Equal(t, "some string", *resources.GetPropertyAsPointer[string](d, "string")) + assert.Nil(t, resources.GetPropertyAsPointer[string](d, "second_string")) + assert.Nil(t, resources.GetPropertyAsPointer[string](d, "third_string")) + assert.Equal(t, true, *resources.GetPropertyAsPointer[bool](d, "boolean")) + assert.Nil(t, resources.GetPropertyAsPointer[bool](d, "second_boolean")) + assert.Nil(t, resources.GetPropertyAsPointer[bool](d, "third_boolean")) + assert.Nil(t, resources.GetPropertyAsPointer[bool](d, "invalid")) +} + +// TODO [SNOW-1511594]: provide TestResourceDataRaw with working GetRawConfig() +func Test_GetConfigPropertyAsPointerAllowingZeroValue(t *testing.T) { + t.Skip("TestResourceDataRaw does not set up the ResourceData correctly - GetRawConfig is nil") + d := schema.TestResourceDataRaw(t, map[string]*schema.Schema{ + "integer": { + Type: schema.TypeInt, + Required: true, + }, + "second_integer": { + Type: schema.TypeInt, + Optional: true, + }, + "third_integer": { + Type: schema.TypeInt, + Optional: true, + }, + "string": { + Type: schema.TypeString, + Required: true, + }, + "second_string": { + Type: schema.TypeString, + Optional: true, + }, + "third_string": { + Type: schema.TypeString, + Optional: true, + }, + "boolean": { + Type: schema.TypeBool, + Required: true, + }, + "second_boolean": { + Type: schema.TypeBool, + Optional: true, + }, + "third_boolean": { + Type: schema.TypeBool, + Optional: true, + }, + }, map[string]interface{}{ + "integer": 123, + "second_integer": 0, + "string": "some string", + "second_string": "", + "boolean": true, + "second_boolean": false, + "invalid": true, + }) + + assert.Equal(t, 123, *resources.GetConfigPropertyAsPointerAllowingZeroValue[int](d, "integer")) + assert.Equal(t, 0, *resources.GetConfigPropertyAsPointerAllowingZeroValue[int](d, "second_integer")) + assert.Nil(t, resources.GetConfigPropertyAsPointerAllowingZeroValue[int](d, "third_integer")) 
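The key difference these tests exercise can be sketched without the plugin SDK: a `GetOk`-style lookup cannot distinguish an explicit zero value from an unset field, whereas the new raw-config approach tracks nullness separately from the value. A minimal standalone sketch, assuming simplified stand-ins (`getOkStyle`, `rawConfigStyle`, and the maps are illustrative, not the provider's API):

```go
package main

import "fmt"

// getOkStyle mimics schema.ResourceData.GetOk: a zero value is
// indistinguishable from an unset field, so an explicit 0 yields nil.
func getOkStyle(values map[string]int, key string) *int {
	v, ok := values[key]
	if !ok || v == 0 {
		return nil
	}
	return &v
}

// rawConfigStyle mimics GetConfigPropertyAsPointerAllowingZeroValue:
// nullness is tracked separately from the value, so an explicit 0 survives.
func rawConfigStyle(values map[string]int, isNull map[string]bool, key string) *int {
	if isNull[key] {
		return nil
	}
	v := values[key]
	return &v
}

func main() {
	values := map[string]int{"suspend_task_after_num_failures": 0}
	isNull := map[string]bool{"user_task_timeout_ms": true}

	fmt.Println(getOkStyle(values, "suspend_task_after_num_failures"))             // <nil>: explicit zero is swallowed
	fmt.Println(*rawConfigStyle(values, isNull, "suspend_task_after_num_failures")) // 0: explicit zero survives
	fmt.Println(rawConfigStyle(values, isNull, "user_task_timeout_ms"))            // <nil>: field truly absent
}
```

This is why the redesigned resources can now send `SUSPEND_TASK_AFTER_NUM_FAILURES = 0` (or an empty string) to Snowflake instead of silently treating it as "not configured".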
+ assert.Equal(t, "some string", *resources.GetConfigPropertyAsPointerAllowingZeroValue[string](d, "string")) + assert.Equal(t, "", *resources.GetConfigPropertyAsPointerAllowingZeroValue[string](d, "second_string")) + assert.Nil(t, resources.GetConfigPropertyAsPointerAllowingZeroValue[string](d, "third_string")) + assert.Equal(t, true, *resources.GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "boolean")) + assert.Equal(t, false, *resources.GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "second_boolean")) + assert.Nil(t, resources.GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "third_boolean")) + assert.Nil(t, resources.GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "invalid")) } // queriedAccountRolePrivilegesEqualTo will check if all the privileges specified in the argument are granted in Snowflake. diff --git a/pkg/resources/schema.go b/pkg/resources/schema.go index 686a20e328..bf76545066 100644 --- a/pkg/resources/schema.go +++ b/pkg/resources/schema.go @@ -50,7 +50,7 @@ var schemaSchema = map[string]*schema.Schema{ "data_retention_days": { Type: schema.TypeInt, Optional: true, - Default: -1, + Default: IntDefault, Description: "Specifies the number of days for which Time Travel actions (CLONE and UNDROP) can be performed on the schema, as well as specifying the default Time Travel retention time for all tables created in the schema. 
Default value for this field is set to -1, which is a fallback to use Snowflake default.", ValidateFunc: validation.IntBetween(-1, 90), }, @@ -87,7 +87,7 @@ func CreateSchema(d *schema.ResourceData, meta interface{}) error { } dataRetentionTimeInDays := GetPropertyAsPointer[int](d, "data_retention_days") - if dataRetentionTimeInDays != nil && *dataRetentionTimeInDays != -1 { + if dataRetentionTimeInDays != nil && *dataRetentionTimeInDays != IntDefault { createReq.DataRetentionTimeInDays = dataRetentionTimeInDays } @@ -133,7 +133,7 @@ func ReadSchema(d *schema.ResourceData, meta interface{}) error { } } - if dataRetentionDays := d.Get("data_retention_days"); dataRetentionDays.(int) != -1 || int64(database.RetentionTime) != retentionTime { + if dataRetentionDays := d.Get("data_retention_days"); dataRetentionDays.(int) != IntDefault || int64(database.RetentionTime) != retentionTime { if err := d.Set("data_retention_days", retentionTime); err != nil { return err } @@ -235,7 +235,7 @@ func UpdateSchema(d *schema.ResourceData, meta interface{}) error { } if d.HasChange("data_retention_days") { - if days := d.Get("data_retention_days"); days.(int) != -1 { + if days := d.Get("data_retention_days"); days.(int) != IntDefault { err := client.Schemas.Alter(ctx, id, &sdk.AlterSchemaOptions{ Set: &sdk.SchemaSet{ DataRetentionTimeInDays: sdk.Int(days.(int)), diff --git a/pkg/resources/schema_acceptance_test.go b/pkg/resources/schema_acceptance_test.go index dc77b363bd..c5ea1d76af 100644 --- a/pkg/resources/schema_acceptance_test.go +++ b/pkg/resources/schema_acceptance_test.go @@ -7,6 +7,7 @@ import ( "testing" acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" @@ -191,7 +192,7 @@ func 
TestAcc_Schema_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Schema_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutSchemaDataRetentionTime(5), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", "-1"), + resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", r.IntDefaultString), checkDatabaseAndSchemaDataRetentionTime(t, id, 5, 5), ), }, @@ -199,7 +200,7 @@ func TestAcc_Schema_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Schema_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutSchemaDataRetentionTime(10), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", "-1"), + resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", r.IntDefaultString), checkDatabaseAndSchemaDataRetentionTime(t, id, 10, 10), ), }, @@ -223,7 +224,7 @@ func TestAcc_Schema_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Schema_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutSchemaDataRetentionTime(10), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", "-1"), + resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", r.IntDefaultString), checkDatabaseAndSchemaDataRetentionTime(t, id, 10, 10), ), }, @@ -279,7 +280,7 @@ func TestAcc_Schema_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Schema_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutSchemaDataRetentionTime(5), Check: resource.ComposeTestCheckFunc( - 
resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", "-1"), + resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", r.IntDefaultString), checkDatabaseAndSchemaDataRetentionTime(t, id, 5, 5), ), }, @@ -288,7 +289,7 @@ func TestAcc_Schema_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Schema_DefaultDataRetentionTime/WithoutDataRetentionSet"), ConfigVariables: configVariablesWithoutSchemaDataRetentionTime(5), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", "-1"), + resource.TestCheckResourceAttr("snowflake_schema.test", "data_retention_days", r.IntDefaultString), checkDatabaseAndSchemaDataRetentionTime(t, id, 5, 5), ), }, diff --git a/pkg/resources/scim_integration.go b/pkg/resources/scim_integration.go index 4bfacb3d1f..dfc2b71e1a 100644 --- a/pkg/resources/scim_integration.go +++ b/pkg/resources/scim_integration.go @@ -7,19 +7,16 @@ import ( "strconv" "strings" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/logging" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/go-cty/cty" "github.com/hashicorp/terraform-plugin-sdk/v2/diag" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" 
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" ) var scimIntegrationSchema = map[string]*schema.Schema{ @@ -40,7 +37,7 @@ var scimIntegrationSchema = map[string]*schema.Schema{ Required: true, ForceNew: true, Description: fmt.Sprintf("Specifies the client type for the scim integration. Valid options are: %v.", sdk.AsStringList(sdk.AllScimSecurityIntegrationScimClients)), - ValidateFunc: validation.StringInSlice(sdk.AsStringList(sdk.AllScimSecurityIntegrationScimClients), true), + ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllScimSecurityIntegrationScimClients), true), DiffSuppressFunc: ignoreCaseAndTrimSpaceSuppressFunc, }, "run_as_role": { @@ -49,7 +46,7 @@ var scimIntegrationSchema = map[string]*schema.Schema{ ForceNew: true, Description: fmt.Sprintf("Specify the SCIM role in Snowflake that owns any users and roles that are imported from the identity provider into Snowflake using SCIM."+ " Provider assumes that the specified role is already provided. 
Valid options are: %v.", sdk.AllScimSecurityIntegrationRunAsRoles), - ValidateFunc: validation.StringInSlice(sdk.AsStringList(sdk.AllScimSecurityIntegrationRunAsRoles), true), + ValidateDiagFunc: StringInSlice(sdk.AsStringList(sdk.AllScimSecurityIntegrationRunAsRoles), true), DiffSuppressFunc: func(k, old, new string, d *schema.ResourceData) bool { normalize := func(s string) string { return strings.ToUpper(strings.ReplaceAll(s, "-", "")) @@ -67,17 +64,17 @@ var scimIntegrationSchema = map[string]*schema.Schema{ "sync_password": { Type: schema.TypeString, Optional: true, - Default: "unknown", - ValidateFunc: validation.StringInSlice([]string{"true", "false"}, true), + Default: BooleanDefault, + ValidateDiagFunc: validateBooleanString, DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInDescribe("sync_password"), - Description: "Specifies whether to enable or disable the synchronization of a user password from an Okta SCIM client as part of the API request to Snowflake. Available options are: `true` or `false`. 
When the value is not set in the configuration the provider will put `unknown` there which means to use the Snowflake default for this value.", + Description: booleanStringFieldDescription("Specifies whether to enable or disable the synchronization of a user password from an Okta SCIM client as part of the API request to Snowflake."), }, "comment": { Type: schema.TypeString, Optional: true, Description: "Specifies a comment for the integration.", }, - showOutputAttributeName: { + ShowOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Outputs the result of `SHOW SECURITY INTEGRATIONS` for the given security integration.", @@ -85,7 +82,7 @@ var scimIntegrationSchema = map[string]*schema.Schema{ Schema: schemas.ShowSecurityIntegrationSchema, }, }, - describeOutputAttributeName: { + DescribeOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Outputs the result of `DESCRIBE SECURITY INTEGRATIONS` for the given security integration.", @@ -110,8 +107,8 @@ func SCIMIntegration() *schema.Resource { }, CustomizeDiff: customdiff.All( - ComputedIfAnyAttributeChanged(showOutputAttributeName, "enabled", "scim_client", "comment"), - ComputedIfAnyAttributeChanged(describeOutputAttributeName, "enabled", "comment", "network_policy", "run_as_role", "sync_password"), + ComputedIfAnyAttributeChanged(ShowOutputAttributeName, "enabled", "scim_client", "comment"), + ComputedIfAnyAttributeChanged(DescribeOutputAttributeName, "enabled", "comment", "network_policy", "run_as_role", "sync_password"), ), StateUpgraders: []schema.StateUpgrader{ @@ -194,7 +191,7 @@ func CreateContextSCIMIntegration(ctx context.Context, d *schema.ResourceData, m req.WithNetworkPolicy(sdk.NewAccountObjectIdentifier(v.(string))) } - if v := d.Get("sync_password").(string); v != "unknown" { + if v := d.Get("sync_password").(string); v != BooleanDefault { parsed, err := strconv.ParseBool(v) if err != nil { return diag.FromErr(err) @@ -327,11 +324,11 @@ func 
ReadContextSCIMIntegration(withExternalChangesMarking bool) schema.ReadCont } } - if err = d.Set(showOutputAttributeName, []map[string]any{schemas.SecurityIntegrationToSchema(integration)}); err != nil { + if err = d.Set(ShowOutputAttributeName, []map[string]any{schemas.SecurityIntegrationToSchema(integration)}); err != nil { return diag.FromErr(err) } - if err = d.Set(describeOutputAttributeName, []map[string]any{schemas.ScimSecurityIntegrationPropertiesToSchema(integrationProperties)}); err != nil { + if err = d.Set(DescribeOutputAttributeName, []map[string]any{schemas.ScimSecurityIntegrationPropertiesToSchema(integrationProperties)}); err != nil { return diag.FromErr(err) } @@ -357,7 +354,7 @@ func UpdateContextSCIMIntegration(ctx context.Context, d *schema.ResourceData, m } if d.HasChange("sync_password") { - if v := d.Get("sync_password").(string); v != "unknown" { + if v := d.Get("sync_password").(string); v != BooleanDefault { parsed, err := strconv.ParseBool(v) if err != nil { return diag.FromErr(err) diff --git a/pkg/resources/scim_integration_acceptance_test.go b/pkg/resources/scim_integration_acceptance_test.go index d7df385d87..aa93f03a8a 100644 --- a/pkg/resources/scim_integration_acceptance_test.go +++ b/pkg/resources/scim_integration_acceptance_test.go @@ -4,13 +4,13 @@ import ( "fmt" "testing" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/importchecks" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/planchecks" + acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" tfjson "github.com/hashicorp/terraform-json" - acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" + 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/importchecks" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/planchecks" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/snowflakeroles" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" @@ -61,7 +61,7 @@ func TestAcc_ScimIntegration_basic(t *testing.T) { resource.TestCheckResourceAttr("snowflake_scim_integration.test", "scim_client", "GENERIC"), resource.TestCheckResourceAttr("snowflake_scim_integration.test", "run_as_role", role.Name()), resource.TestCheckNoResourceAttr("snowflake_scim_integration.test", "network_policy"), - resource.TestCheckResourceAttr("snowflake_scim_integration.test", "sync_password", "unknown"), + resource.TestCheckResourceAttr("snowflake_scim_integration.test", "sync_password", r.BooleanDefault), resource.TestCheckNoResourceAttr("snowflake_scim_integration.test", "comment"), resource.TestCheckResourceAttr("snowflake_scim_integration.test", "show_output.#", "1"), @@ -151,7 +151,7 @@ func TestAcc_ScimIntegration_basic(t *testing.T) { resource.TestCheckResourceAttr("snowflake_scim_integration.test", "scim_client", "OKTA"), resource.TestCheckResourceAttr("snowflake_scim_integration.test", "run_as_role", role2.Name()), resource.TestCheckResourceAttr("snowflake_scim_integration.test", "network_policy", ""), - resource.TestCheckResourceAttr("snowflake_scim_integration.test", "sync_password", "unknown"), + resource.TestCheckResourceAttr("snowflake_scim_integration.test", "sync_password", r.BooleanDefault), resource.TestCheckResourceAttr("snowflake_scim_integration.test", "comment", ""), ), }, @@ -231,8 +231,8 @@ func TestAcc_ScimIntegration_invalid(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_ScimIntegration/complete"), ConfigVariables: m(), ExpectError: 
helpers.MatchAllStringsInOrderNonOverlapping([]string{ - `expected scim_client to be one of ["OKTA" "AZURE" "GENERIC"], got invalid`, - `expected run_as_role to be one of ["OKTA_PROVISIONER" "AAD_PROVISIONER" "GENERIC_SCIM_PROVISIONER"], got invalid`, + `expected [{{} scim_client}] to be one of ["OKTA" "AZURE" "GENERIC"], got invalid`, + `expected [{{} run_as_role}] to be one of ["OKTA_PROVISIONER" "AAD_PROVISIONER" "GENERIC_SCIM_PROVISIONER"], got invalid`, }), }, }, @@ -299,7 +299,7 @@ func TestAcc_ScimIntegration_migrateFromVersion091(t *testing.T) { planchecks.ExpectChange("snowflake_scim_integration.test", "scim_client", tfjson.ActionUpdate, sdk.String("GENERIC"), sdk.String("GENERIC")), planchecks.ExpectChange("snowflake_scim_integration.test", "run_as_role", tfjson.ActionUpdate, sdk.String(role.Name()), sdk.String(role.Name())), planchecks.ExpectChange("snowflake_scim_integration.test", "network_policy", tfjson.ActionUpdate, sdk.String(""), sdk.String("")), - planchecks.ExpectChange("snowflake_scim_integration.test", "sync_password", tfjson.ActionUpdate, nil, sdk.String("unknown")), + planchecks.ExpectChange("snowflake_scim_integration.test", "sync_password", tfjson.ActionUpdate, nil, sdk.String(r.BooleanDefault)), planchecks.ExpectChange("snowflake_scim_integration.test", "comment", tfjson.ActionUpdate, nil, nil), }, }, diff --git a/pkg/resources/secondary_database.go b/pkg/resources/secondary_database.go index f66f800042..a1110251cb 100644 --- a/pkg/resources/secondary_database.go +++ b/pkg/resources/secondary_database.go @@ -81,7 +81,7 @@ func CreateSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta a } err = client.Databases.CreateSecondary(ctx, secondaryDatabaseId, primaryDatabaseId, &sdk.CreateSecondaryDatabaseOptions{ - Transient: GetPropertyAsPointer[bool](d, "is_transient"), + Transient: GetConfigPropertyAsPointerAllowingZeroValue[bool](d, "is_transient"), DataRetentionTimeInDays: dataRetentionTimeInDays, MaxDataExtensionTimeInDays: 
maxDataExtensionTimeInDays, ExternalVolume: externalVolume, @@ -98,7 +98,7 @@ func CreateSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta a UserTaskMinimumTriggerIntervalInSeconds: userTaskMinimumTriggerIntervalInSeconds, QuotedIdentifiersIgnoreCase: quotedIdentifiersIgnoreCase, EnableConsoleOutput: enableConsoleOutput, - Comment: GetPropertyAsPointer[string](d, "comment"), + Comment: GetConfigPropertyAsPointerAllowingZeroValue[string](d, "comment"), }) if err != nil { return diag.FromErr(err) diff --git a/pkg/resources/shared_database.go b/pkg/resources/shared_database.go index 28855b66e2..6bbf799d2f 100644 --- a/pkg/resources/shared_database.go +++ b/pkg/resources/shared_database.go @@ -95,7 +95,7 @@ func CreateSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) UserTaskMinimumTriggerIntervalInSeconds: userTaskMinimumTriggerIntervalInSeconds, QuotedIdentifiersIgnoreCase: quotedIdentifiersIgnoreCase, EnableConsoleOutput: enableConsoleOutput, - Comment: GetPropertyAsPointer[string](d, "comment"), + Comment: GetConfigPropertyAsPointerAllowingZeroValue[string](d, "comment"), }) if err != nil { return diag.FromErr(err) diff --git a/pkg/resources/warehouse_rework_show_output_proposal.go b/pkg/resources/show_and_describe_handlers.go similarity index 55% rename from pkg/resources/warehouse_rework_show_output_proposal.go rename to pkg/resources/show_and_describe_handlers.go index 654fec57e7..91a0fc9cb1 100644 --- a/pkg/resources/warehouse_rework_show_output_proposal.go +++ b/pkg/resources/show_and_describe_handlers.go @@ -1,17 +1,20 @@ package resources import ( + "log" + "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" ) const ( - showOutputAttributeName = "show_output" - describeOutputAttributeName = "describe_output" + ShowOutputAttributeName = "show_output" + DescribeOutputAttributeName = "describe_output" + ParametersAttributeName = "parameters" ) -// handleExternalChangesToObjectInShow assumes that show output 
is kept in showOutputAttributeName attribute +// handleExternalChangesToObjectInShow assumes that show output is kept in ShowOutputAttributeName attribute func handleExternalChangesToObjectInShow(d *schema.ResourceData, mappings ...showMapping) error { - if showOutput, ok := d.GetOk(showOutputAttributeName); ok { + if showOutput, ok := d.GetOk(ShowOutputAttributeName); ok { showOutputList := showOutput.([]any) if len(showOutputList) == 1 { result := showOutputList[0].(map[string]any) @@ -39,9 +42,9 @@ type showMapping struct { normalizeFunc func(any) any } -// handleExternalChangesToObjectInDescribe assumes that show output is kept in describeOutputAttributeName attribute +// handleExternalChangesToObjectInDescribe assumes that describe output is kept in DescribeOutputAttributeName attribute func handleExternalChangesToObjectInDescribe(d *schema.ResourceData, mappings ...describeMapping) error { - if describeOutput, ok := d.GetOk(describeOutputAttributeName); ok { + if describeOutput, ok := d.GetOk(DescribeOutputAttributeName); ok { describeOutputList := describeOutput.([]any) if len(describeOutputList) == 1 { result := describeOutputList[0].(map[string]any) @@ -78,3 +81,39 @@ type describeMapping struct { valueToSet any normalizeFunc func(any) any } + +// setStateToValuesFromConfig currently handles only int, float, and string types. 
+// It's needed for the case where: +// - previous config was empty (therefore Snowflake defaults had been used) +// - new config has the same values that are already in SF +func setStateToValuesFromConfig(d *schema.ResourceData, resourceSchema map[string]*schema.Schema, fields []string) error { + if !d.GetRawConfig().IsNull() { + vMap := d.GetRawConfig().AsValueMap() + for _, field := range fields { + if v, ok := vMap[field]; ok && !v.IsNull() { + if schemaField, ok := resourceSchema[field]; ok { + switch schemaField.Type { + case schema.TypeInt: + intVal, _ := v.AsBigFloat().Int64() + if err := d.Set(field, intVal); err != nil { + return err + } + case schema.TypeFloat: + if err := d.Set(field, v.AsBigFloat()); err != nil { + return err + } + case schema.TypeString: + if err := d.Set(field, v.AsString()); err != nil { + return err + } + default: + log.Printf("[DEBUG] field %s has unsupported schema type %v", field, schemaField.Type) + } + } else { + log.Printf("[DEBUG] schema field %s not found", field) + } + } + } + } + return nil +} diff --git a/pkg/resources/special_values.go b/pkg/resources/special_values.go new file mode 100644 index 0000000000..c4837cd94a --- /dev/null +++ b/pkg/resources/special_values.go @@ -0,0 +1,39 @@ +package resources + +import ( + "fmt" +) + +const ( + BooleanTrue = "true" + BooleanFalse = "false" + BooleanDefault = "default" + + IntDefault = -1 + IntDefaultString = "-1" +) + +var validateBooleanString = StringInSlice([]string{BooleanTrue, BooleanFalse}, false) + +func booleanStringFromBool(value bool) string { + if value { + return BooleanTrue + } else { + return BooleanFalse + } +} + +func booleanStringToBool(value string) (bool, error) { + switch value { + case BooleanTrue: + return true, nil + case BooleanFalse: + return false, nil + default: + return false, fmt.Errorf("cannot retrieve boolean value from %s", value) + } +} + +func booleanStringFieldDescription(description string) string { + return fmt.Sprintf(`%s 
Available options are: "%s" or "%s". When the value is not set in the configuration, the provider uses "%s", which means to use the Snowflake default for this value.`, description, BooleanTrue, BooleanFalse, BooleanDefault) +} diff --git a/pkg/resources/table.go b/pkg/resources/table.go index 3b09f11627..402b65ef94 100644 --- a/pkg/resources/table.go +++ b/pkg/resources/table.go @@ -186,7 +186,7 @@ var tableSchema = map[string]*schema.Schema{ "data_retention_time_in_days": { Type: schema.TypeInt, Optional: true, - Default: -1, + Default: IntDefault, Description: "Specifies the retention period for the table so that Time Travel actions (SELECT, CLONE, UNDROP) can be performed on historical data in the table. If you wish to inherit the parent schema setting then pass in the schema attribute to this argument or do not fill this parameter at all; the default value for this field is -1, which is a fallback to use Snowflake default - in this case the schema value", ValidateFunc: validation.IntBetween(-1, 90), }, @@ -596,7 +596,7 @@ func CreateTable(d *schema.ResourceData, meta interface{}) error { } } - if v := d.Get("data_retention_time_in_days"); v.(int) != -1 { + if v := d.Get("data_retention_time_in_days"); v.(int) != IntDefault { createRequest.WithDataRetentionTimeInDays(sdk.Int(v.(int))) } @@ -674,7 +674,7 @@ func ReadTable(d *schema.ResourceData, meta interface{}) error { "change_tracking": table.ChangeTracking, "qualified_name": id.FullyQualifiedName(), } - if v := d.Get("data_retention_time_in_days"); v.(int) != -1 || int64(table.RetentionTime) != schemaRetentionTime { + if v := d.Get("data_retention_time_in_days"); v.(int) != IntDefault || int64(table.RetentionTime) != schemaRetentionTime { toSet["data_retention_time_in_days"] = table.RetentionTime } @@ -727,7 +727,7 @@ func UpdateTable(d *schema.ResourceData, meta interface{}) error { } if d.HasChange("data_retention_time_in_days") { - if days := d.Get("data_retention_time_in_days"); days.(int) != -1 { 
+ if days := d.Get("data_retention_time_in_days"); days.(int) != IntDefault { runSetStatement = true setRequest.WithDataRetentionTimeInDays(sdk.Int(days.(int))) } else { diff --git a/pkg/resources/table_acceptance_test.go b/pkg/resources/table_acceptance_test.go index 87e82e64a6..937263d0a8 100644 --- a/pkg/resources/table_acceptance_test.go +++ b/pkg/resources/table_acceptance_test.go @@ -8,6 +8,7 @@ import ( "testing" acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" @@ -1498,7 +1499,7 @@ func TestAcc_Table_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithDatabaseDataRetentionSet"), ConfigVariables: configWithDatabaseDataRetentionSet(5), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 5, 5, 5), ), }, @@ -1506,7 +1507,7 @@ func TestAcc_Table_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithSchemaDataRetentionSet"), ConfigVariables: configWithSchemaDataRetentionSet(5, 10), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 5, 10, 10), ), }, @@ -1530,7 +1531,7 @@ func TestAcc_Table_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: 
acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithSchemaDataRetentionSet"), ConfigVariables: configWithSchemaDataRetentionSet(10, 3), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 10, 3, 3), ), }, @@ -1538,7 +1539,7 @@ func TestAcc_Table_DefaultDataRetentionTime(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithDatabaseDataRetentionSet"), ConfigVariables: configWithDatabaseDataRetentionSet(10), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 10, 10, 10), ), }, @@ -1596,7 +1597,7 @@ func TestAcc_Table_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithDatabaseDataRetentionSet"), ConfigVariables: configWithDatabaseDataRetentionSet(5), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 5, 5, 5), ), }, @@ -1607,7 +1608,7 @@ func TestAcc_Table_DefaultDataRetentionTime_SetOutsideOfTerraform(t *testing.T) ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithDatabaseDataRetentionSet"), ConfigVariables: configWithDatabaseDataRetentionSet(5), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", 
"data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 5, 5, 5), ), }, @@ -1675,7 +1676,7 @@ func TestAcc_Table_DefaultDataRetentionTimeSettingUnsetting(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithTableDataRetentionSet"), ConfigVariables: configWithTableDataRetentionSet(10, 3, -1), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 10, 3, 3), ), }, @@ -1683,7 +1684,7 @@ func TestAcc_Table_DefaultDataRetentionTimeSettingUnsetting(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithSchemaDataRetentionSet"), ConfigVariables: configWithSchemaDataRetentionSet(10, 3), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 10, 3, 3), ), }, @@ -1691,7 +1692,7 @@ func TestAcc_Table_DefaultDataRetentionTimeSettingUnsetting(t *testing.T) { ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Table_DefaultDataRetentionTime/WithTableDataRetentionSet"), ConfigVariables: configWithTableDataRetentionSet(10, 3, -1), Check: resource.ComposeTestCheckFunc( - resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", "-1"), + resource.TestCheckResourceAttr("snowflake_table.test", "data_retention_time_in_days", r.IntDefaultString), checkDatabaseSchemaAndTableDataRetentionTime(tableId, 10, 3, 3), ), }, diff --git 
a/pkg/resources/warehouse.go b/pkg/resources/warehouse.go index 71f77d3a69..22d4a863ed 100644 --- a/pkg/resources/warehouse.go +++ b/pkg/resources/warehouse.go @@ -19,7 +19,6 @@ import ( "github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation" ) -// TODO [SNOW-1348102 - if we choose this approach]: extract three-value logic; add better description for each field var warehouseSchema = map[string]*schema.Schema{ "name": { Type: schema.TypeString, @@ -43,14 +42,14 @@ var warehouseSchema = map[string]*schema.Schema{ "max_cluster_count": { Type: schema.TypeInt, Optional: true, - ValidateFunc: validation.IntBetween(1, 10), + ValidateDiagFunc: validation.ToDiagFunc(validation.IntBetween(1, 10)), DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("max_cluster_count"), Description: "Specifies the maximum number of server clusters for the warehouse.", }, "min_cluster_count": { Type: schema.TypeInt, Optional: true, - ValidateFunc: validation.IntBetween(1, 10), + ValidateDiagFunc: validation.ToDiagFunc(validation.IntBetween(1, 10)), DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("min_cluster_count"), Description: "Specifies the minimum number of server clusters for the warehouse (only applies to multi-cluster warehouses).", }, @@ -64,18 +63,18 @@ var warehouseSchema = map[string]*schema.Schema{ "auto_suspend": { Type: schema.TypeInt, Optional: true, - ValidateFunc: validation.IntAtLeast(0), + ValidateDiagFunc: validation.ToDiagFunc(validation.IntAtLeast(0)), DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("auto_suspend"), Description: "Specifies the number of seconds of inactivity after which a warehouse is automatically suspended.", - Default: -1, + Default: IntDefault, }, "auto_resume": { Type: schema.TypeString, Optional: true, - ValidateFunc: validation.StringInSlice([]string{"true", "false"}, true), + ValidateDiagFunc: validateBooleanString, DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("auto_resume"), - 
Description: "Specifies whether to automatically resume a warehouse when a SQL statement (e.g. query) is submitted to it.", - Default: "unknown", + Description: booleanStringFieldDescription("Specifies whether to automatically resume a warehouse when a SQL statement (e.g. query) is submitted to it."), + Default: BooleanDefault, }, "initially_suspended": { Type: schema.TypeBool, @@ -98,41 +97,41 @@ var warehouseSchema = map[string]*schema.Schema{ "enable_query_acceleration": { Type: schema.TypeString, Optional: true, - ValidateFunc: validation.StringInSlice([]string{"true", "false"}, true), + ValidateDiagFunc: validateBooleanString, DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("enable_query_acceleration"), - Description: "Specifies whether to enable the query acceleration service for queries that rely on this warehouse for compute resources.", - Default: "unknown", + Description: booleanStringFieldDescription("Specifies whether to enable the query acceleration service for queries that rely on this warehouse for compute resources."), + Default: BooleanDefault, }, "query_acceleration_max_scale_factor": { Type: schema.TypeInt, Optional: true, - ValidateFunc: validation.IntBetween(0, 100), + ValidateDiagFunc: validation.ToDiagFunc(validation.IntBetween(0, 100)), DiffSuppressFunc: IgnoreChangeToCurrentSnowflakeValueInShow("query_acceleration_max_scale_factor"), Description: "Specifies the maximum scale factor for leasing compute resources for query acceleration. The scale factor is used as a multiplier based on warehouse size.", - Default: -1, + Default: IntDefault, }, strings.ToLower(string(sdk.ObjectParameterMaxConcurrencyLevel)): { - Type: schema.TypeInt, - Optional: true, - Computed: true, - ValidateFunc: validation.IntAtLeast(1), - Description: "Object parameter that specifies the concurrency level for SQL statements (i.e. 
queries and DML) executed by a warehouse.", + Type: schema.TypeInt, + Optional: true, + Computed: true, + ValidateDiagFunc: validation.ToDiagFunc(validation.IntAtLeast(1)), + Description: "Object parameter that specifies the concurrency level for SQL statements (i.e. queries and DML) executed by a warehouse.", }, strings.ToLower(string(sdk.ObjectParameterStatementQueuedTimeoutInSeconds)): { - Type: schema.TypeInt, - Optional: true, - Computed: true, - ValidateFunc: validation.IntAtLeast(0), - Description: "Object parameter that specifies the time, in seconds, a SQL statement (query, DDL, DML, etc.) can be queued on a warehouse before it is canceled by the system.", + Type: schema.TypeInt, + Optional: true, + Computed: true, + ValidateDiagFunc: validation.ToDiagFunc(validation.IntAtLeast(0)), + Description: "Object parameter that specifies the time, in seconds, a SQL statement (query, DDL, DML, etc.) can be queued on a warehouse before it is canceled by the system.", }, strings.ToLower(string(sdk.ObjectParameterStatementTimeoutInSeconds)): { - Type: schema.TypeInt, - Optional: true, - Computed: true, - ValidateFunc: validation.IntBetween(0, 604800), - Description: "Specifies the time, in seconds, after which a running SQL statement (query, DDL, DML, etc.) is canceled by the system", + Type: schema.TypeInt, + Optional: true, + Computed: true, + ValidateDiagFunc: validation.ToDiagFunc(validation.IntBetween(0, 604800)), + Description: "Specifies the time, in seconds, after which a running SQL statement (query, DDL, DML, etc.) 
is canceled by the system", }, - showOutputAttributeName: { + ShowOutputAttributeName: { Type: schema.TypeList, Computed: true, Description: "Outputs the result of `SHOW WAREHOUSE` for the given warehouse.", @@ -140,7 +139,7 @@ var warehouseSchema = map[string]*schema.Schema{ Schema: schemas.ShowWarehouseSchema, }, }, - parametersAttributeName: { + ParametersAttributeName: { Type: schema.TypeList, Computed: true, Description: "Outputs the result of `SHOW PARAMETERS IN WAREHOUSE` for the given warehouse.", @@ -150,27 +149,46 @@ var warehouseSchema = map[string]*schema.Schema{ }, } -// TODO: merge with DatabaseParametersCustomDiff and extract common -var warehouseParametersCustomDiff = func(ctx context.Context, d *schema.ResourceDiff, meta any) error { - if d.Id() == "" { - return nil - } - +func warehouseParametersProvider(ctx context.Context, d ResourceIdProvider, meta any) ([]*sdk.Parameter, error) { client := meta.(*provider.Context).Client - params, err := client.Parameters.ShowParameters(context.Background(), &sdk.ShowParametersOptions{ + id := helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier) + warehouseParameters, err := client.Parameters.ShowParameters(ctx, &sdk.ShowParametersOptions{ In: &sdk.ParametersIn{ - Warehouse: helpers.DecodeSnowflakeID(d.Id()).(sdk.AccountObjectIdentifier), + Warehouse: id, }, }) if err != nil { - return err + return nil, err } + return warehouseParameters, nil +} - return customdiff.All( - IntParameterValueComputedIf("max_concurrency_level", params, sdk.ParameterTypeWarehouse, sdk.AccountParameterMaxConcurrencyLevel), - IntParameterValueComputedIf("statement_queued_timeout_in_seconds", params, sdk.ParameterTypeWarehouse, sdk.AccountParameterStatementQueuedTimeoutInSeconds), - IntParameterValueComputedIf("statement_timeout_in_seconds", params, sdk.ParameterTypeWarehouse, sdk.AccountParameterStatementTimeoutInSeconds), - )(ctx, d, meta) +func handleWarehouseParametersChanges(d *schema.ResourceData, set 
*sdk.WarehouseSet, unset *sdk.WarehouseUnset) diag.Diagnostics { + return JoinDiags( + handleValuePropertyChange[int](d, "max_concurrency_level", &set.MaxConcurrencyLevel, &unset.MaxConcurrencyLevel), + handleValuePropertyChange[int](d, "statement_queued_timeout_in_seconds", &set.StatementQueuedTimeoutInSeconds, &unset.StatementQueuedTimeoutInSeconds), + handleValuePropertyChange[int](d, "statement_timeout_in_seconds", &set.StatementTimeoutInSeconds, &unset.StatementTimeoutInSeconds), + ) +} + +func handleWarehouseParameterRead(d *schema.ResourceData, warehouseParameters []*sdk.Parameter) diag.Diagnostics { + for _, parameter := range warehouseParameters { + switch parameter.Key { + case + string(sdk.ObjectParameterMaxConcurrencyLevel), + string(sdk.ObjectParameterStatementQueuedTimeoutInSeconds), + string(sdk.ObjectParameterStatementTimeoutInSeconds): + value, err := strconv.Atoi(parameter.Value) + if err != nil { + return diag.FromErr(err) + } + if err := d.Set(strings.ToLower(parameter.Key), value); err != nil { + return diag.FromErr(err) + } + } + } + + return nil } // Warehouse returns a pointer to the resource representing a warehouse. 
@@ -190,12 +208,17 @@ func Warehouse() *schema.Resource { }, CustomizeDiff: customdiff.All( - ComputedIfAnyAttributeChanged(showOutputAttributeName, "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "resource_monitor", "comment", "enable_query_acceleration", "query_acceleration_max_scale_factor"), - ComputedIfAnyAttributeChanged(parametersAttributeName, strings.ToLower(string(sdk.ObjectParameterMaxConcurrencyLevel)), strings.ToLower(string(sdk.ObjectParameterStatementQueuedTimeoutInSeconds)), strings.ToLower(string(sdk.ObjectParameterStatementTimeoutInSeconds))), + ComputedIfAnyAttributeChanged(ShowOutputAttributeName, "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "resource_monitor", "comment", "enable_query_acceleration", "query_acceleration_max_scale_factor"), + ComputedIfAnyAttributeChanged(ParametersAttributeName, strings.ToLower(string(sdk.ObjectParameterMaxConcurrencyLevel)), strings.ToLower(string(sdk.ObjectParameterStatementQueuedTimeoutInSeconds)), strings.ToLower(string(sdk.ObjectParameterStatementTimeoutInSeconds))), customdiff.ForceNewIfChange("warehouse_size", func(ctx context.Context, old, new, meta any) bool { return old.(string) != "" && new.(string) == "" }), - warehouseParametersCustomDiff, + ParametersCustomDiff( + warehouseParametersProvider, + parameter{sdk.AccountParameterMaxConcurrencyLevel, valueTypeInt, sdk.ParameterTypeWarehouse}, + parameter{sdk.AccountParameterStatementQueuedTimeoutInSeconds, valueTypeInt, sdk.ParameterTypeWarehouse}, + parameter{sdk.AccountParameterStatementTimeoutInSeconds, valueTypeInt, sdk.ParameterTypeWarehouse}, + ), ), StateUpgraders: []schema.StateUpgrader{ @@ -240,7 +263,7 @@ func ImportWarehouse(ctx context.Context, d *schema.ResourceData, meta any) ([]* if err = d.Set("auto_suspend", w.AutoSuspend); err != nil { return nil, err } - if err = 
d.Set("auto_resume", fmt.Sprintf("%t", w.AutoResume)); err != nil { + if err = d.Set("auto_resume", booleanStringFromBool(w.AutoResume)); err != nil { return nil, err } if err = d.Set("resource_monitor", w.ResourceMonitor.Name()); err != nil { @@ -249,7 +272,7 @@ func ImportWarehouse(ctx context.Context, d *schema.ResourceData, meta any) ([]* if err = d.Set("comment", w.Comment); err != nil { return nil, err } - if err = d.Set("enable_query_acceleration", fmt.Sprintf("%t", w.EnableQueryAcceleration)); err != nil { + if err = d.Set("enable_query_acceleration", booleanStringFromBool(w.EnableQueryAcceleration)); err != nil { return nil, err } if err = d.Set("query_acceleration_max_scale_factor", w.QueryAccelerationMaxScaleFactor); err != nil { @@ -294,11 +317,11 @@ func CreateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag } createOptions.ScalingPolicy = &scalingPolicy } - if v := d.Get("auto_suspend").(int); v != -1 { + if v := d.Get("auto_suspend").(int); v != IntDefault { createOptions.AutoSuspend = sdk.Int(v) } - if v := d.Get("auto_resume").(string); v != "unknown" { - parsed, err := strconv.ParseBool(v) + if v := d.Get("auto_resume").(string); v != BooleanDefault { + parsed, err := booleanStringToBool(v) if err != nil { return diag.FromErr(err) } @@ -313,23 +336,23 @@ func CreateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag if v, ok := d.GetOk("comment"); ok { createOptions.Comment = sdk.String(v.(string)) } - if v := d.Get("enable_query_acceleration").(string); v != "unknown" { - parsed, err := strconv.ParseBool(v) + if v := d.Get("enable_query_acceleration").(string); v != BooleanDefault { + parsed, err := booleanStringToBool(v) if err != nil { return diag.FromErr(err) } createOptions.EnableQueryAcceleration = sdk.Bool(parsed) } - if v := d.Get("query_acceleration_max_scale_factor").(int); v != -1 { + if v := d.Get("query_acceleration_max_scale_factor").(int); v != IntDefault { 
createOptions.QueryAccelerationMaxScaleFactor = sdk.Int(v) } - if v := GetPropertyAsPointerWithPossibleZeroValues[int](d, "max_concurrency_level"); v != nil { + if v := GetConfigPropertyAsPointerAllowingZeroValue[int](d, "max_concurrency_level"); v != nil { createOptions.MaxConcurrencyLevel = v } - if v := GetPropertyAsPointerWithPossibleZeroValues[int](d, "statement_queued_timeout_in_seconds"); v != nil { + if v := GetConfigPropertyAsPointerAllowingZeroValue[int](d, "statement_queued_timeout_in_seconds"); v != nil { createOptions.StatementQueuedTimeoutInSeconds = v } - if v := GetPropertyAsPointerWithPossibleZeroValues[int](d, "statement_timeout_in_seconds"); v != nil { + if v := GetConfigPropertyAsPointerAllowingZeroValue[int](d, "statement_timeout_in_seconds"); v != nil { createOptions.StatementTimeoutInSeconds = v } @@ -342,19 +365,6 @@ func CreateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag return GetReadWarehouseFunc(false)(ctx, d, meta) } -// TODO: move -func GetPropertyAsPointerWithPossibleZeroValues[T any](d *schema.ResourceData, property string) *T { - if d.GetRawConfig().AsValueMap()[property].IsNull() { - return nil - } - value := d.Get(property) - typedValue, ok := value.(T) - if !ok { - return nil - } - return &typedValue -} - func GetReadWarehouseFunc(withExternalChangesMarking bool) schema.ReadContextFunc { return func(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics { client := meta.(*provider.Context).Client @@ -401,66 +411,6 @@ func GetReadWarehouseFunc(withExternalChangesMarking bool) schema.ReadContextFun } } - // These are all identity sets, needed for the case where: - // - previous config was empty (therefore Snowflake defaults had been used) - // - new config have the same values that are already in SF - if !d.GetRawConfig().IsNull() { - if v := d.GetRawConfig().AsValueMap()["warehouse_type"]; !v.IsNull() { - if err = d.Set("warehouse_type", v.AsString()); err != nil { - return 
diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["warehouse_size"]; !v.IsNull() { - if err = d.Set("warehouse_size", v.AsString()); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["max_cluster_count"]; !v.IsNull() { - intVal, _ := v.AsBigFloat().Int64() - if err = d.Set("max_cluster_count", intVal); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["min_cluster_count"]; !v.IsNull() { - intVal, _ := v.AsBigFloat().Int64() - if err = d.Set("min_cluster_count", intVal); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["scaling_policy"]; !v.IsNull() { - if err = d.Set("scaling_policy", v.AsString()); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["auto_suspend"]; !v.IsNull() { - intVal, _ := v.AsBigFloat().Int64() - if err = d.Set("auto_suspend", intVal); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["auto_resume"]; !v.IsNull() { - if err = d.Set("auto_resume", v.AsString()); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["resource_monitor"]; !v.IsNull() { - if err = d.Set("resource_monitor", v.AsString()); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["enable_query_acceleration"]; !v.IsNull() { - if err = d.Set("enable_query_acceleration", v.AsString()); err != nil { - return diag.FromErr(err) - } - } - if v := d.GetRawConfig().AsValueMap()["query_acceleration_max_scale_factor"]; !v.IsNull() { - intVal, _ := v.AsBigFloat().Int64() - if err = d.Set("query_acceleration_max_scale_factor", intVal); err != nil { - return diag.FromErr(err) - } - } - } - if err = d.Set("name", w.Name); err != nil { return diag.FromErr(err) } @@ -468,15 +418,30 @@ func GetReadWarehouseFunc(withExternalChangesMarking bool) schema.ReadContextFun return diag.FromErr(err) } + if err = 
setStateToValuesFromConfig(d, warehouseSchema, []string{ + "warehouse_type", + "warehouse_size", + "max_cluster_count", + "min_cluster_count", + "scaling_policy", + "auto_suspend", + "auto_resume", + "resource_monitor", + "enable_query_acceleration", + "query_acceleration_max_scale_factor", + }); err != nil { + return diag.FromErr(err) + } + if diags := handleWarehouseParameterRead(d, warehouseParameters); diags != nil { return diags } - if err = d.Set(showOutputAttributeName, []map[string]any{schemas.WarehouseToSchema(w)}); err != nil { + if err = d.Set(ShowOutputAttributeName, []map[string]any{schemas.WarehouseToSchema(w)}); err != nil { return diag.FromErr(err) } - if err = d.Set(parametersAttributeName, []map[string]any{schemas.WarehouseParametersToSchema(warehouseParameters)}); err != nil { + if err = d.Set(ParametersAttributeName, []map[string]any{schemas.WarehouseParametersToSchema(warehouseParameters)}); err != nil { return diag.FromErr(err) } @@ -484,26 +449,6 @@ func GetReadWarehouseFunc(withExternalChangesMarking bool) schema.ReadContextFun } } -func handleWarehouseParameterRead(d *schema.ResourceData, warehouseParameters []*sdk.Parameter) diag.Diagnostics { - for _, parameter := range warehouseParameters { - switch parameter.Key { - case - string(sdk.ObjectParameterMaxConcurrencyLevel), - string(sdk.ObjectParameterStatementQueuedTimeoutInSeconds), - string(sdk.ObjectParameterStatementTimeoutInSeconds): - value, err := strconv.Atoi(parameter.Value) - if err != nil { - return diag.FromErr(err) - } - if err := d.Set(strings.ToLower(parameter.Key), value); err != nil { - return diag.FromErr(err) - } - } - } - - return nil -} - // UpdateWarehouse implements schema.UpdateFunc. 
 func UpdateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics {
 	client := meta.(*provider.Context).Client
@@ -578,7 +523,7 @@ func UpdateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag
 		}
 	}
 	if d.HasChange("auto_suspend") {
-		if v := d.Get("auto_suspend").(int); v != -1 {
+		if v := d.Get("auto_suspend").(int); v != IntDefault {
 			set.AutoSuspend = sdk.Int(v)
 		} else {
 			// TODO [SNOW-1473453]: UNSET of auto suspend works incorrectly
@@ -587,8 +532,8 @@ func UpdateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag
 		}
 	}
 	if d.HasChange("auto_resume") {
-		if v := d.Get("auto_resume").(string); v != "unknown" {
-			parsed, err := strconv.ParseBool(v)
+		if v := d.Get("auto_resume").(string); v != BooleanDefault {
+			parsed, err := booleanStringToBool(v)
 			if err != nil {
 				return diag.FromErr(err)
 			}
@@ -614,8 +559,8 @@ func UpdateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag
 		}
 	}
 	if d.HasChange("enable_query_acceleration") {
-		if v := d.Get("enable_query_acceleration").(string); v != "unknown" {
-			parsed, err := strconv.ParseBool(v)
+		if v := d.Get("enable_query_acceleration").(string); v != BooleanDefault {
+			parsed, err := booleanStringToBool(v)
 			if err != nil {
 				return diag.FromErr(err)
 			}
@@ -625,21 +570,21 @@ func UpdateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag
 		}
 	}
 	if d.HasChange("query_acceleration_max_scale_factor") {
-		if v := d.Get("query_acceleration_max_scale_factor").(int); v != -1 {
+		if v := d.Get("query_acceleration_max_scale_factor").(int); v != IntDefault {
 			set.QueryAccelerationMaxScaleFactor = sdk.Int(v)
 		} else {
 			unset.QueryAccelerationMaxScaleFactor = sdk.Bool(true)
 		}
 	}
 	if d.HasChange("max_concurrency_level") {
-		if v := d.Get("max_concurrency_level").(int); v != -1 {
+		if v := d.Get("max_concurrency_level").(int); v != IntDefault {
 			set.MaxConcurrencyLevel = sdk.Int(v)
 		} else {
 			unset.MaxConcurrencyLevel = sdk.Bool(true)
 		}
 	}
 	if d.HasChange("statement_queued_timeout_in_seconds") {
-		if v := d.Get("statement_queued_timeout_in_seconds").(int); v != -1 {
+		if v := d.Get("statement_queued_timeout_in_seconds").(int); v != IntDefault {
 			set.StatementQueuedTimeoutInSeconds = sdk.Int(v)
 		} else {
 			unset.StatementQueuedTimeoutInSeconds = sdk.Bool(true)
@@ -671,14 +616,6 @@ func UpdateWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag
 	return GetReadWarehouseFunc(false)(ctx, d, meta)
 }
 
-func handleWarehouseParametersChanges(d *schema.ResourceData, set *sdk.WarehouseSet, unset *sdk.WarehouseUnset) diag.Diagnostics {
-	return JoinDiags(
-		handleValuePropertyChange[int](d, "max_concurrency_level", &set.MaxConcurrencyLevel, &unset.MaxConcurrencyLevel),
-		handleValuePropertyChange[int](d, "statement_queued_timeout_in_seconds", &set.StatementQueuedTimeoutInSeconds, &unset.StatementQueuedTimeoutInSeconds),
-		handleValuePropertyChange[int](d, "statement_timeout_in_seconds", &set.StatementTimeoutInSeconds, &unset.StatementTimeoutInSeconds),
-	)
-}
-
 // DeleteWarehouse implements schema.DeleteFunc.
 func DeleteWarehouse(ctx context.Context, d *schema.ResourceData, meta any) diag.Diagnostics {
 	client := meta.(*provider.Context).Client
diff --git a/pkg/resources/warehouse_acceptance_test.go b/pkg/resources/warehouse_acceptance_test.go
index 0c9c217fc6..41c520c5bc 100644
--- a/pkg/resources/warehouse_acceptance_test.go
+++ b/pkg/resources/warehouse_acceptance_test.go
@@ -7,6 +7,7 @@ import (
 	"testing"
 
 	acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance"
+	r "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/resources"
 	tfjson "github.com/hashicorp/terraform-json"
 
 	"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random"
@@ -52,13 +53,13 @@ func TestAcc_Warehouse_BasicFlows(t *testing.T) {
 					resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "max_cluster_count"),
 					resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "min_cluster_count"),
 					resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "scaling_policy"),
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", "-1"),
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_resume", "unknown"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", r.IntDefaultString),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_resume", r.BooleanDefault),
 					resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "initially_suspended"),
 					resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "resource_monitor"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "comment", comment),
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "enable_query_acceleration", "unknown"),
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "query_acceleration_max_scale_factor", "-1"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "enable_query_acceleration", r.BooleanDefault),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "query_acceleration_max_scale_factor", r.IntDefaultString),
 
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "max_concurrency_level", "8"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "statement_queued_timeout_in_seconds", "0"),
@@ -130,9 +131,9 @@ func TestAcc_Warehouse_BasicFlows(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "enable_query_acceleration", "query_acceleration_max_scale_factor", "max_concurrency_level", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "enable_query_acceleration", "query_acceleration_max_scale_factor", "max_concurrency_level", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", r.ShowOutputAttributeName),
 						plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionUpdate),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseFullDefaultWithoutParametersConfig(name2, comment),
@@ -166,7 +167,7 @@ func TestAcc_Warehouse_BasicFlows(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "enable_query_acceleration", "query_acceleration_max_scale_factor", "max_concurrency_level", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "enable_query_acceleration", "query_acceleration_max_scale_factor", "max_concurrency_level", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", r.ShowOutputAttributeName),
 
 						// this is this only situation in which there will be a strange output in the plan
 						planchecks.ExpectComputed("snowflake_warehouse.w", "max_concurrency_level", true),
@@ -209,7 +210,7 @@ func TestAcc_Warehouse_BasicFlows(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "enable_query_acceleration", "query_acceleration_max_scale_factor", "max_concurrency_level", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "warehouse_size", "max_cluster_count", "min_cluster_count", "scaling_policy", "auto_suspend", "auto_resume", "enable_query_acceleration", "query_acceleration_max_scale_factor", "max_concurrency_level", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseTypeStandard)), sdk.String(string(sdk.WarehouseTypeSnowparkOptimized))),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseSizeXSmall)), sdk.String(string(sdk.WarehouseSizeMedium))),
@@ -296,9 +297,9 @@ func TestAcc_Warehouse_WarehouseType(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionCreate, nil, sdk.String(string(sdk.WarehouseTypeStandard))),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithTypeConfig(id.Name(), sdk.WarehouseTypeStandard, sdk.WarehouseSizeMedium),
@@ -324,9 +325,9 @@ func TestAcc_Warehouse_WarehouseType(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseTypeStandard)), sdk.String(string(sdk.WarehouseTypeSnowparkOptimized))),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithTypeConfig(id.Name(), sdk.WarehouseTypeSnowparkOptimized, sdk.WarehouseSizeMedium),
@@ -343,9 +344,9 @@ func TestAcc_Warehouse_WarehouseType(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionUpdate),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseTypeSnowparkOptimized)), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -359,9 +360,9 @@ func TestAcc_Warehouse_WarehouseType(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionUpdate, nil, sdk.String(strings.ToLower(string(sdk.WarehouseTypeSnowparkOptimized)))),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithTypeConfig(id.Name(), sdk.WarehouseType(strings.ToLower(string(sdk.WarehouseTypeSnowparkOptimized))), sdk.WarehouseSizeMedium),
@@ -381,11 +382,11 @@ func TestAcc_Warehouse_WarehouseType(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectNonEmptyPlan(),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", r.ShowOutputAttributeName),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "warehouse_type", sdk.String(strings.ToLower(string(sdk.WarehouseTypeSnowparkOptimized))), sdk.String(string(sdk.WarehouseTypeStandard))),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "show_output.0.type", sdk.String(string(sdk.WarehouseTypeSnowparkOptimized)), sdk.String(string(sdk.WarehouseTypeStandard))),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseTypeStandard)), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -405,11 +406,11 @@ func TestAcc_Warehouse_WarehouseType(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectNonEmptyPlan(),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_type", r.ShowOutputAttributeName),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "warehouse_type", nil, sdk.String(string(sdk.WarehouseTypeSnowparkOptimized))),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "show_output.0.type", sdk.String(string(sdk.WarehouseTypeStandard)), sdk.String(string(sdk.WarehouseTypeSnowparkOptimized))),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_type", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseTypeSnowparkOptimized)), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -449,9 +450,9 @@ func TestAcc_Warehouse_WarehouseSizes(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionCreate, nil, sdk.String(string(sdk.WarehouseSizeSmall))),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithSizeConfig(id.Name(), string(sdk.WarehouseSizeSmall)),
@@ -477,9 +478,9 @@ func TestAcc_Warehouse_WarehouseSizes(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionUpdate, sdk.String(string(sdk.WarehouseSizeSmall)), sdk.String(string(sdk.WarehouseSizeMedium))),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithSizeConfig(id.Name(), string(sdk.WarehouseSizeMedium)),
@@ -496,9 +497,9 @@ func TestAcc_Warehouse_WarehouseSizes(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionDestroyBeforeCreate),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionCreate, sdk.String(string(sdk.WarehouseSizeMedium)), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -512,9 +513,9 @@ func TestAcc_Warehouse_WarehouseSizes(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionUpdate, nil, sdk.String(strings.ToLower(string(sdk.WarehouseSizeSmall)))),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithSizeConfig(id.Name(), strings.ToLower(string(sdk.WarehouseSizeSmall))),
@@ -534,11 +535,11 @@ func TestAcc_Warehouse_WarehouseSizes(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectNonEmptyPlan(),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", r.ShowOutputAttributeName),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "warehouse_size", sdk.String(strings.ToLower(string(sdk.WarehouseSizeSmall))), sdk.String(string(sdk.WarehouseSizeXSmall))),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "show_output.0.size", sdk.String(string(sdk.WarehouseSizeSmall)), sdk.String(string(sdk.WarehouseSizeXSmall))),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionCreate, sdk.String(string(sdk.WarehouseSizeXSmall)), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -558,11 +559,11 @@ func TestAcc_Warehouse_WarehouseSizes(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectNonEmptyPlan(),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "warehouse_size", r.ShowOutputAttributeName),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "warehouse_size", nil, sdk.String(string(sdk.WarehouseSizeSmall))),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "show_output.0.size", sdk.String(string(sdk.WarehouseSizeXSmall)), sdk.String(string(sdk.WarehouseSizeSmall))),
 						planchecks.ExpectChange("snowflake_warehouse.w", "warehouse_size", tfjson.ActionCreate, sdk.String(string(sdk.WarehouseSizeSmall)), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -620,11 +621,11 @@ func TestAcc_Warehouse_Validations(t *testing.T) {
 			},
 			{
 				Config:      warehouseWithAutoResumeConfig(id.Name(), "other"),
-				ExpectError: regexp.MustCompile(`expected auto_resume to be one of \["true" "false"], got other`),
+				ExpectError: regexp.MustCompile(`expected \[\{\{} auto_resume}] to be one of \["true" "false"], got other`),
 			},
 			{
-				Config:      warehouseConfigWithMaxConcurrencyLevel(id.Name(), -1),
-				ExpectError: regexp.MustCompile(`expected max_concurrency_level to be at least \(1\), got -1`),
+				Config:      warehouseConfigWithMaxConcurrencyLevel(id.Name(), -2),
+				ExpectError: regexp.MustCompile(`expected max_concurrency_level to be at least \(1\), got -2`),
 			},
 		},
 	})
@@ -701,9 +702,9 @@ func TestAcc_Warehouse_AutoResume(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionCreate, nil, sdk.String("true")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithAutoResumeConfig(id.Name(), "true"),
@@ -729,9 +730,9 @@ func TestAcc_Warehouse_AutoResume(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("true"), sdk.String("false")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseWithAutoResumeConfig(id.Name(), "false"),
@@ -748,13 +749,13 @@ func TestAcc_Warehouse_AutoResume(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionUpdate),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", "show_output"),
-						planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("false"), sdk.String("unknown")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", r.ShowOutputAttributeName),
+						planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("false"), sdk.String(r.BooleanDefault)),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_resume", "unknown"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_resume", r.BooleanDefault),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.#", "1"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.0.auto_resume", "true"),
 					snowflakechecks.CheckAutoResume(t, id, true),
@@ -770,15 +771,15 @@ func TestAcc_Warehouse_AutoResume(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectNonEmptyPlan(),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", "show_output"),
-						planchecks.ExpectDrift("snowflake_warehouse.w", "auto_resume", sdk.String("unknown"), sdk.String("false")),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_resume", r.ShowOutputAttributeName),
+						planchecks.ExpectDrift("snowflake_warehouse.w", "auto_resume", sdk.String(r.BooleanDefault), sdk.String("false")),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "show_output.0.auto_resume", sdk.String("true"), sdk.String("false")),
-						planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("false"), sdk.String("unknown")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("false"), sdk.String(r.BooleanDefault)),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_resume", "unknown"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_resume", r.BooleanDefault),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.#", "1"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.0.auto_resume", "true"),
 					snowflakechecks.CheckWarehouseType(t, id, sdk.WarehouseTypeStandard),
@@ -815,9 +816,9 @@ func TestAcc_Warehouse_AutoSuspend(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionCreate, nil, sdk.String("1200")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseConfigWithAutoSuspend(id.Name(), 1200),
@@ -843,9 +844,9 @@ func TestAcc_Warehouse_AutoSuspend(t *testing.T) {
 			{
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("1200"), sdk.String("600")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Config: warehouseConfigWithAutoSuspend(id.Name(), 600),
@@ -862,13 +863,13 @@ func TestAcc_Warehouse_AutoSuspend(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionUpdate),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "show_output"),
-						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("600"), sdk.String("-1")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", r.ShowOutputAttributeName),
+						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("600"), sdk.String(r.IntDefaultString)),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", "-1"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", r.IntDefaultString),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.#", "1"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.0.auto_suspend", "600"),
 					snowflakechecks.CheckAutoSuspendCount(t, id, 600),
@@ -884,15 +885,15 @@ func TestAcc_Warehouse_AutoSuspend(t *testing.T) {
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
 						plancheck.ExpectNonEmptyPlan(),
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "show_output"),
-						planchecks.ExpectDrift("snowflake_warehouse.w", "auto_suspend", sdk.String("-1"), sdk.String("2400")),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", r.ShowOutputAttributeName),
+						planchecks.ExpectDrift("snowflake_warehouse.w", "auto_suspend", sdk.String(r.IntDefaultString), sdk.String("2400")),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "show_output.0.auto_suspend", sdk.String("600"), sdk.String("2400")),
-						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("2400"), sdk.String("-1")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("2400"), sdk.String(r.IntDefaultString)),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", "-1"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", r.IntDefaultString),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.#", "1"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.0.auto_suspend", "600"),
 					snowflakechecks.CheckAutoSuspendCount(t, id, 600),
@@ -929,12 +930,12 @@ func TestAcc_Warehouse_ZeroValues(t *testing.T) {
 				Config: warehouseWithAllValidZeroValuesConfig(id.Name()),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "query_acceleration_max_scale_factor", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", "show_output"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "query_acceleration_max_scale_factor", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", r.ShowOutputAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionCreate, nil, sdk.String("0")),
 						planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionCreate, nil, sdk.String("0")),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_queued_timeout_in_seconds", tfjson.ActionCreate, nil, sdk.String("0")),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionCreate, nil, sdk.String("0")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -959,17 +960,17 @@ func TestAcc_Warehouse_ZeroValues(t *testing.T) {
 				Config: warehouseBasicConfig(id.Name()),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "query_acceleration_max_scale_factor", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", "show_output"),
-						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("0"), sdk.String("-1")),
-						planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("0"), sdk.String("-1")),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "query_acceleration_max_scale_factor", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", r.ShowOutputAttributeName),
+						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("0"), sdk.String(r.IntDefaultString)),
+						planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("0"), sdk.String(r.IntDefaultString)),
 						planchecks.ExpectComputed("snowflake_warehouse.w", "statement_queued_timeout_in_seconds", true),
 						planchecks.ExpectComputed("snowflake_warehouse.w", "statement_timeout_in_seconds", true),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", "-1"),
-					resource.TestCheckResourceAttr("snowflake_warehouse.w", "query_acceleration_max_scale_factor", "-1"),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "auto_suspend", r.IntDefaultString),
+					resource.TestCheckResourceAttr("snowflake_warehouse.w", "query_acceleration_max_scale_factor", r.IntDefaultString),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "statement_queued_timeout_in_seconds", "0"),
 					resource.TestCheckResourceAttr("snowflake_warehouse.w", "statement_timeout_in_seconds", "172800"),
@@ -989,12 +990,12 @@ func TestAcc_Warehouse_ZeroValues(t *testing.T) {
 				Config: warehouseWithAllValidZeroValuesConfig(id.Name()),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "query_acceleration_max_scale_factor", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", "show_output"),
-						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("-1"), sdk.String("0")),
-						planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("-1"), sdk.String("0")),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "auto_suspend", "query_acceleration_max_scale_factor", "statement_queued_timeout_in_seconds", "statement_timeout_in_seconds", r.ShowOutputAttributeName),
+						planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String(r.IntDefaultString), sdk.String("0")),
+						planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String(r.IntDefaultString), sdk.String("0")),
 						planchecks.ExpectComputed("snowflake_warehouse.w", "statement_queued_timeout_in_seconds", true),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("172800"), sdk.String("0")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "show_output", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ShowOutputAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1057,9 +1058,9 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseWithParameterConfig(id.Name(), 86400),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionCreate, nil, sdk.String("86400")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1096,9 +1097,9 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseWithParameterConfig(id.Name(), 43200),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("86400"), sdk.String("43200")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1120,7 +1121,7 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseWithParameterConfig(id.Name(), 43200),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "parameters.0.statement_timeout_in_seconds.0.value", tfjson.ActionNoop, sdk.String("43200"), sdk.String("43200")),
 						plancheck.ExpectEmptyPlan(),
 					},
@@ -1144,10 +1145,10 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseWithParameterConfig(id.Name(), 43200),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "statement_timeout_in_seconds", sdk.String("43200"), sdk.String("86400")),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("86400"), sdk.String("43200")),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1167,10 +1168,10 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseWithParameterConfig(id.Name(), 43200),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("43200"), nil),
 						planchecks.ExpectComputed("snowflake_warehouse.w", "statement_timeout_in_seconds", true),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1192,9 +1193,9 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseBasicConfig(id.Name()),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("43200"), nil),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1222,10 +1223,10 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseWithParameterConfig(id.Name(), 172800),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("172800"), nil),
 						planchecks.ExpectComputed("snowflake_warehouse.w", "statement_timeout_in_seconds", true),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1245,10 +1246,10 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseBasicConfig(id.Name()),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("172800"), nil),
 						planchecks.ExpectComputed("snowflake_warehouse.w", "statement_timeout_in_seconds", true),
-						planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true),
+						planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true),
 					},
 				},
 				Check: resource.ComposeTestCheckFunc(
@@ -1270,7 +1271,7 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseBasicConfig(id.Name()),
 				ConfigPlanChecks: resource.ConfigPlanChecks{
 					PreApply: []plancheck.PlanCheck{
-						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"),
+						planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName),
 						planchecks.ExpectDrift("snowflake_warehouse.w", "parameters.0.statement_timeout_in_seconds.0.value", sdk.String("172800"), sdk.String("86400")),
 						planchecks.ExpectChange("snowflake_warehouse.w", "parameters.0.statement_timeout_in_seconds.0.value", tfjson.ActionNoop, sdk.String("86400"), sdk.String("86400")),
 					},
@@ -1303,10 +1304,10 @@ func TestAcc_Warehouse_Parameter(t *testing.T) {
 				Config: warehouseBasicConfig(id.Name()),
ConfigPlanChecks: resource.ConfigPlanChecks{ PreApply: []plancheck.PlanCheck{ - planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"), + planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName), planchecks.ExpectChange("snowflake_warehouse.w", "statement_timeout_in_seconds", tfjson.ActionUpdate, sdk.String("86400"), nil), planchecks.ExpectComputed("snowflake_warehouse.w", "statement_timeout_in_seconds", true), - planchecks.ExpectComputed("snowflake_warehouse.w", "parameters", true), + planchecks.ExpectComputed("snowflake_warehouse.w", r.ParametersAttributeName, true), }, }, Check: resource.ComposeTestCheckFunc( @@ -1325,7 +1326,7 @@ func TestAcc_Warehouse_Parameter(t *testing.T) { Config: warehouseBasicConfig(id.Name()), ConfigPlanChecks: resource.ConfigPlanChecks{ PreApply: []plancheck.PlanCheck{ - planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", "parameters"), + planchecks.PrintPlanDetails("snowflake_warehouse.w", "statement_timeout_in_seconds", r.ParametersAttributeName), planchecks.ExpectDrift("snowflake_warehouse.w", "parameters.0.statement_timeout_in_seconds.0.value", sdk.String("86400"), sdk.String("172800")), planchecks.ExpectDrift("snowflake_warehouse.w", "parameters.0.statement_timeout_in_seconds.0.level", sdk.String(string(sdk.ParameterTypeAccount)), sdk.String("")), }, @@ -1472,6 +1473,69 @@ func TestAcc_Warehouse_migrateFromVersion092_allFieldsFilledBeforeMigration(t *t resource.TestCheckResourceAttr("snowflake_warehouse.w", "name", id.Name()), resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "wait_for_provisioning"), resource.TestCheckNoResourceAttr("snowflake_warehouse.w", "resource_monitor"), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "enable_query_acceleration", "true"), + ), + }, + // let's try to change the value of the parameter that was earlier a bool and now is a string + { + 
ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionUpdate), + planchecks.ExpectChange("snowflake_warehouse.w", "enable_query_acceleration", tfjson.ActionUpdate, sdk.String("true"), sdk.String("false")), + }, + }, + Config: warehouseFullDefaultConfigWithQueryAcceleration(id.Name(), "new comment", false, 8), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_warehouse.w", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "comment", "new comment"), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "enable_query_acceleration", "false"), + ), + }, + }, + }) +} + +func TestAcc_Warehouse_migrateFromVersion092_allFieldsFilledBeforeMigration_booleanChangeRightAfter(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Warehouse), + + Steps: []resource.TestStep{ + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.92.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: warehouseFullMigrationConfig(id.Name(), true), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_warehouse.w", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "wait_for_provisioning", "true"), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "resource_monitor", "null"), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "enable_query_acceleration", "true"), + ), + }, + // let's try to change the value of the parameter that was earlier a bool and now is a string + { 
+ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction("snowflake_warehouse.w", plancheck.ResourceActionUpdate), + planchecks.ExpectChange("snowflake_warehouse.w", "enable_query_acceleration", tfjson.ActionUpdate, sdk.String("true"), sdk.String("false")), + }, + }, + Config: warehouseFullDefaultConfigWithQueryAcceleration(id.Name(), "new comment", false, 8), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_warehouse.w", "name", id.Name()), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "comment", "new comment"), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "enable_query_acceleration", "false"), ), }, }, @@ -1509,7 +1573,7 @@ func TestAcc_Warehouse_migrateFromVersion092_queryAccelerationMaxScaleFactor_sam ConfigPlanChecks: resource.ConfigPlanChecks{ PreApply: []plancheck.PlanCheck{ plancheck.ExpectEmptyPlan(), - planchecks.PrintPlanDetails("snowflake_warehouse.w", "query_acceleration_max_scale_factor", "show_output"), + planchecks.PrintPlanDetails("snowflake_warehouse.w", "query_acceleration_max_scale_factor", r.ShowOutputAttributeName), }, }, Check: resource.ComposeTestCheckFunc( @@ -1554,13 +1618,13 @@ func TestAcc_Warehouse_migrateFromVersion092_queryAccelerationMaxScaleFactor_noI Config: warehouseFullDefaultConfigWithQueryAccelerationMaxScaleFactorRemoved(id.Name(), ""), ConfigPlanChecks: resource.ConfigPlanChecks{ PreApply: []plancheck.PlanCheck{ - planchecks.PrintPlanDetails("snowflake_warehouse.w", "query_acceleration_max_scale_factor", "show_output"), - planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("8"), sdk.String("-1")), + planchecks.PrintPlanDetails("snowflake_warehouse.w", "query_acceleration_max_scale_factor", r.ShowOutputAttributeName), + planchecks.ExpectChange("snowflake_warehouse.w", 
"query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("8"), sdk.String(r.IntDefaultString)), }, }, Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_warehouse.w", "name", id.Name()), - resource.TestCheckResourceAttr("snowflake_warehouse.w", "query_acceleration_max_scale_factor", "-1"), + resource.TestCheckResourceAttr("snowflake_warehouse.w", "query_acceleration_max_scale_factor", r.IntDefaultString), resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.#", "1"), resource.TestCheckResourceAttr("snowflake_warehouse.w", "show_output.0.query_acceleration_max_scale_factor", "8"), @@ -1600,7 +1664,7 @@ func TestAcc_Warehouse_migrateFromVersion092_queryAccelerationMaxScaleFactor_dif Config: warehouseFullDefaultConfigWithQueryAcceleration(id.Name(), "", true, 10), ConfigPlanChecks: resource.ConfigPlanChecks{ PreApply: []plancheck.PlanCheck{ - planchecks.PrintPlanDetails("snowflake_warehouse.w", "query_acceleration_max_scale_factor", "show_output"), + planchecks.PrintPlanDetails("snowflake_warehouse.w", "query_acceleration_max_scale_factor", r.ShowOutputAttributeName), planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("8"), sdk.String("10")), }, }, @@ -1721,10 +1785,10 @@ func TestAcc_Warehouse_migrateFromVersion092_defaultsRemoved(t *testing.T) { planchecks.ExpectChange("snowflake_warehouse.w", "max_cluster_count", tfjson.ActionUpdate, sdk.String("1"), nil), planchecks.ExpectChange("snowflake_warehouse.w", "min_cluster_count", tfjson.ActionUpdate, sdk.String("1"), nil), planchecks.ExpectChange("snowflake_warehouse.w", "scaling_policy", tfjson.ActionUpdate, sdk.String(string(sdk.ScalingPolicyStandard)), nil), - planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("600"), sdk.String("-1")), - planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("true"), 
sdk.String("unknown")), - planchecks.ExpectChange("snowflake_warehouse.w", "enable_query_acceleration", tfjson.ActionUpdate, sdk.String("false"), sdk.String("unknown")), - planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("8"), sdk.String("-1")), + planchecks.ExpectChange("snowflake_warehouse.w", "auto_suspend", tfjson.ActionUpdate, sdk.String("600"), sdk.String(r.IntDefaultString)), + planchecks.ExpectChange("snowflake_warehouse.w", "auto_resume", tfjson.ActionUpdate, sdk.String("true"), sdk.String(r.BooleanDefault)), + planchecks.ExpectChange("snowflake_warehouse.w", "enable_query_acceleration", tfjson.ActionUpdate, sdk.String("false"), sdk.String(r.BooleanDefault)), + planchecks.ExpectChange("snowflake_warehouse.w", "query_acceleration_max_scale_factor", tfjson.ActionUpdate, sdk.String("8"), sdk.String(r.IntDefaultString)), planchecks.ExpectComputed("snowflake_warehouse.w", "max_concurrency_level", true), planchecks.ExpectComputed("snowflake_warehouse.w", "statement_queued_timeout_in_seconds", true), diff --git a/pkg/resources/warehouse_rework_parameters_proposal.go b/pkg/resources/warehouse_rework_parameters_proposal.go deleted file mode 100644 index 35937fec7a..0000000000 --- a/pkg/resources/warehouse_rework_parameters_proposal.go +++ /dev/null @@ -1,52 +0,0 @@ -package resources - -import ( - "strconv" - "strings" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/collections" - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" - "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" -) - -const parametersAttributeName = "parameters" - -// markChangedParameters assumes that the snowflake parameter name is mirrored in schema (as lower-cased name) -// TODO [SNOW-1348102 - after discussion]: test (unit and acceptance) -// TODO [SNOW-1348102 - after discussion]: more readable errors -// TODO [SNOW-1348102 - after discussion]: 
handle different types than int -func markChangedParameters(objectParameters []sdk.ObjectParameter, currentParameters []*sdk.Parameter, d *schema.ResourceData, level sdk.ParameterType) error { - for _, param := range objectParameters { - currentSnowflakeParameter, err := collections.FindOne(currentParameters, func(p *sdk.Parameter) bool { - return p.Key == string(param) - }) - if err != nil { - return err - } - // this handles situations in which parameter was set on object externally (so either the value or the level was changed) - // we can just set the config value to the current Snowflake value because: - // 1. if it did not change, then no drift will be reported - // 2. if it had different non-empty value, then the drift will be reported and the value will be set during update - // 3. if it had empty value, then the drift will be reported and the value will be unset during update - if (*currentSnowflakeParameter).Level == level { - intValue, err := strconv.Atoi((*currentSnowflakeParameter).Value) - if err != nil { - return err - } - if err = d.Set(strings.ToLower(string(param)), intValue); err != nil { - return err - } - } - // this handles situations in which parameter was unset from the object - // we can just set the config value to because: - // 1. if it was missing in config before, then no drift will be reported - // 2. 
if it had a non-empty value, then the drift will be reported and the value will be set during update - if (*currentSnowflakeParameter).Level != level { - // TODO [SNOW-1348102 - after discussion]: this is currently set to an artificial default - if err = d.Set(strings.ToLower(string(param)), -1); err != nil { - return err - } - } - } - return nil -} diff --git a/v1-preparations/CHANGES_BEFORE_V1.md b/v1-preparations/CHANGES_BEFORE_V1.md index 2f8f505102..3ec832d168 100644 --- a/v1-preparations/CHANGES_BEFORE_V1.md +++ b/v1-preparations/CHANGES_BEFORE_V1.md @@ -1,7 +1,6 @@ -# Changes before v1 +# Design decisions before v1 -This document is a changelog of resources and datasources as part of the https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1. -Each provider version lists changes made in resources and datasources definitions during v1 preparations, like added, modified and removed fields. +This document is a supplement to all the resource changes described in the [migration guide](../MIGRATION_GUIDE.md) on our road to V1 (check the [roadmap](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#05052024-roadmap-overview)). Its purpose is to give explanation/context for the decisions spanning multiple resources. It will be updated with more findings/conventions. ## Default values For any resource that went through the rework as part of the [resource preparation for V1](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/ROADMAP.md#preparing-essential-ga-objects-for-the-provider-v1), @@ -16,16 +15,34 @@ create a resource with slightly different configuration in Snowflake (depending current account configuration, and most-likely other factors). That is why we recommend setting optional fields where you want to ensure that the specified value has been set on the Snowflake side. 
-## v0.91.0 ➞ v0.92.0 -### snowflake_scim_integration resource changes - -New fields: -- `enabled` -- `sync_password` -- `comment` - -Changed fields: -- `provisioner_role` renamed to `run_as_role` - -Other changes: -- `scim_client` and `run_as_role` marked as `ForceNew` +## "Empty" values +The [Terraform SDK v2](https://github.com/hashicorp/terraform-plugin-sdk) that is currently used in our provider detects the presence of the attribute based on its non-zero Golang value. This means, that it is not possible to distinguish the removal of the value inside a config from setting it explicitely to a zero value, e.g. `0` for the numeric value (check [this thread](https://discuss.hashicorp.com/t/is-it-possible-to-differentiate-between-a-zero-value-and-a-removed-property-in-the-terraform-provider-sdk/43131)). Before we migrate to the new recommended [Terraform Plugin Framework](https://github.com/hashicorp/terraform-plugin-framework) we want to handle such cases the same way inside the provider. It means that: +- boolean attributes will be migrated to the string attributes with two values: `"true"` and `"false"` settable in the config and the special third value `"default"` that will mean, that the given attribute is not set inside the config. +- integer values with the possible `0` value in Snowflake (e.g. `AUTO_SUSPEND` in [warehouse](https://docs.snowflake.com/en/sql-reference/sql/create-warehouse)) will have a special default (usually a `-1`) assigned on the provider side when the config is left empty for them. +- string values with the possible empty (`""`) value (e.g. default for column value in a table) will have a special default `""` that will be used for the empty config. +It won't be possible to use the above values directly (it will be for the string attributes) but users should be aware of them, because they may appear in the terraform plans. 
+ +## Snowflake parameters +[Snowflake parameters](https://docs.snowflake.com/en/sql-reference/parameters) have different types and hierarchies. In earlier versions of the provider they were handled non-intuitively, with default values set inside the provider (e.g. [#2356](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2356)). We want to change that, so we decided to: +- expose all parameters available for the given object directly in the resource (without the need to use the `snowflake_object_parameter` resource, whose future will be discussed in the next few weeks) +- remove the default values for Snowflake parameters from every resource before V1. This is an important **breaking change**. In previous versions, not setting a given parameter usually resulted in using the provider default. This was different from creating the same object without the parameter by hand (because Snowflake just takes the parameter from the hierarchy in such a case). +- make the provider identify both internal and external changes to these parameters, on both the `value` and the `level`, e.g.: + - setting the parameter inside the config and then manually unsetting it to the same value on a higher level will result in detecting a change + - not setting the parameter inside the config and then manually changing the parameter on the object level to the same value as the one level higher in the hierarchy will result in detecting a change +- handle parameters as optional/computed values in the provider +- add, to all objects having at least one parameter, a special computed collection `parameters` containing the values and levels of all parameters (the result of `SHOW PARAMETERS IN `). + +## Config values in the state +Currently, not setting a value for a given attribute inside the config results in populating this field in the state with the value extracted from Snowflake (usually by running `SHOW`/`DESCRIBE`).
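As a sketch of how this looks from the user's perspective (the nested index path mirrors the `parameters.0.statement_timeout_in_seconds.0.value` paths used in the acceptance tests above; treat the surrounding shape as an illustration):

```hcl
resource "snowflake_warehouse" "w" {
  name = "EXAMPLE_WH"

  # parameter set explicitly in the config - managed on the warehouse level
  statement_timeout_in_seconds = 43200
}

# the computed `parameters` collection exposes the value and level of every
# parameter, mirroring SHOW PARAMETERS for the warehouse
output "statement_timeout_level" {
  value = snowflake_warehouse.w.parameters[0].statement_timeout_in_seconds[0].level
}
```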
This makes it challenging to identify whether a change happened externally or whether the value is just a Snowflake default (multiple reported issues describe infinite plans or odd drifts - this is one of the main reasons). After getting rid of the Snowflake defaults in the provider, this is not easy to do in the currently used [Terraform SDK v2](https://github.com/hashicorp/terraform-plugin-sdk). We have considered and tested a variety of options, including custom diff suppression and setting these fields as optional and computed, but each approach had smaller or bigger problems. What we ended up with, and what will be the guideline for V1, is: +- we do not fill the given attribute in the state if it is not present inside the config (for the optional attributes; the required ones are always present) +- we encourage you to always set the value explicitly if you don't want to depend on the Snowflake default (consult the [default values](#default-values) section) +- this may result in changes being detected when migrating to newer versions of the provider (because currently the value is stored regardless of whether it is present in the config, and there is no way to deduce its presence in the automatic state migrations we can provide) - the alternative is to follow our [resource migration guide](../docs/technical-documentation/resource_migration.md) +- we will provide `show_output` and `describe_output` in each resource (more in the [Raw Snowflake output](#raw-snowflake-output) section) + +## Raw Snowflake output +Because of the changes regarding [Config values in the state](#config-values-in-the-state), we still want to allow our users to read the value of a given attribute, even when it is not set in the config.
For each resource (and datasource) we will provide: +- a `show_output` computed field, containing the response of `SHOW ` for the given managed object +- a `describe_output` computed field, containing the response of `DESCRIBE ` for the given managed object +- a `parameters` computed field, containing all the values and levels of Snowflake parameters (the result of `SHOW PARAMETERS IN `) + +This way, it is still possible to reference these values in your configs, even without setting them directly on the given managed object.
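For instance, a sketch like the following surfaces a value from `show_output` even when the corresponding attribute is absent from the warehouse config (the field path matches the `show_output.0.query_acceleration_max_scale_factor` checks used in the acceptance tests earlier in this diff):

```hcl
# read the Snowflake-side value without setting the attribute on the
# snowflake_warehouse.w resource itself
output "wh_qa_scale_factor" {
  value = snowflake_warehouse.w.show_output[0].query_acceleration_max_scale_factor
}
```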