diff --git a/CHANGELOG.md b/CHANGELOG.md index 1fee191a98..90016b3fdd 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,12 @@ # Changelog +## [0.94.1](https://github.com/Snowflake-Labs/terraform-provider-snowflake/compare/v0.94.0...v0.94.1) (2024-08-02) + + +### 🐛 **Bug fixes:** + +* Use ALTER for managing PUBLIC schemas that exist ([#2973](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2973)) ([567e9be](https://github.com/Snowflake-Labs/terraform-provider-snowflake/commit/567e9be5efb0be731fa7ee56143b8ca4326bd037)) + ## [0.94.0](https://github.com/Snowflake-Labs/terraform-provider-snowflake/compare/v0.93.0...v0.94.0) (2024-07-26) diff --git a/MIGRATION_GUIDE.md b/MIGRATION_GUIDE.md index 00d72115d2..0082d9442c 100644 --- a/MIGRATION_GUIDE.md +++ b/MIGRATION_GUIDE.md @@ -4,7 +4,7 @@ This document is meant to help you migrate your Terraform config to the new newe describe deprecations or breaking changes and help you to change your configuration to keep the same (or similar) behavior across different versions. -## v0.94.0 ➞ v0.95.0 +## v0.94.x ➞ v0.95.0 ### snowflake_warehouse resource changes @@ -76,6 +76,11 @@ The following set of [parameters](https://docs.snowflake.com/en/sql-reference/pa - [NETWORK_POLICY](https://docs.snowflake.com/en/sql-reference/parameters#network-policy) - [PREVENT_UNLOAD_TO_INTERNAL_STAGES](https://docs.snowflake.com/en/sql-reference/parameters#prevent-unload-to-internal-stages) +## v0.94.0 ➞ v0.94.1 +### changes in snowflake_schema + +In order to avoid dropping `PUBLIC` schemas, we have decided to use `ALTER` instead of `OR REPLACE` during creation. In the future, we plan to use `CREATE OR ALTER` once it becomes available for schemas. + ## v0.93.0 ➞ v0.94.0 ### *(breaking change)* changes in snowflake_scim_integration @@ -116,7 +121,7 @@ New fields: - added `describe_output` field that holds the response from DESCRIBE SCHEMA.
Note that one needs to grant sufficient privileges e.g. with [grant_ownership](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/grant_ownership) on all objects in the schema. Otherwise, this field is not filled. - added `parameters` field that holds the response from SHOW PARAMETERS IN SCHEMA. -We allow creating and managing `PUBLIC` schemas now. When the name of the schema is `PUBLIC`, it's created with `OR_REPLACE`. We've decided this based on [#2826](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2826). +We allow creating and managing `PUBLIC` schemas now. When the name of the schema is `PUBLIC`, it's created with `OR_REPLACE`. Please be careful with this operation, because you may experience data loss. `OR_REPLACE` does `DROP` before `CREATE`, so all objects in the schema will be dropped, and this is not visible in the Terraform plan. To restore data-related objects that might have been accidentally or intentionally deleted, please read about [Time Travel](https://docs.snowflake.com/en/user-guide/data-time-travel). The alternative is to import the `PUBLIC` schema manually and then manage it with Terraform. We've decided this based on [#2826](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2826). #### *(behavior change)* Boolean type changes To easily handle three-value logic (true, false, unknown) in provider's configs, type of `is_transient` and `with_managed_access` was changed from boolean to string. This should not require updating existing configs (boolean value should be accepted and state will be migrated to string automatically), however we recommend changing config values to strings. diff --git a/docs/data-sources/databases.md b/docs/data-sources/databases.md index 4a501293cc..a32b9f9da9 100644 --- a/docs/data-sources/databases.md +++ b/docs/data-sources/databases.md @@ -5,7 +5,7 @@ description: |- Datasource used to get details of filtered databases.
Filtering is aligned with the current possibilities for SHOW DATABASES https://docs.snowflake.com/en/sql-reference/sql/show-databases query (like, starts_with, and limit are all supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_databases (Data Source) diff --git a/docs/data-sources/network_policies.md b/docs/data-sources/network_policies.md index f2e8c505ce..9a930a231b 100644 --- a/docs/data-sources/network_policies.md +++ b/docs/data-sources/network_policies.md @@ -5,7 +5,7 @@ description: |- Datasource used to get details of filtered network policies. Filtering is aligned with the current possibilities for SHOW NETWORK POLICIES https://docs.snowflake.com/en/sql-reference/sql/show-network-policies query (like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection. --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. 
We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_network_policies (Data Source) diff --git a/docs/data-sources/roles.md b/docs/data-sources/roles.md index 4d43f7348e..8382bffa5b 100644 --- a/docs/data-sources/roles.md +++ b/docs/data-sources/roles.md @@ -5,7 +5,7 @@ description: |- Datasource used to get details of filtered roles. Filtering is aligned with the current possibilities for SHOW ROLES https://docs.snowflake.com/en/sql-reference/sql/show-roles query (like and in_class are all supported). The results of SHOW are encapsulated in one output collection. --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. 
+!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_roles (Data Source) diff --git a/docs/data-sources/schemas.md b/docs/data-sources/schemas.md index 2823553a55..81cc107919 100644 --- a/docs/data-sources/schemas.md +++ b/docs/data-sources/schemas.md @@ -5,6 +5,8 @@ description: |- Datasource used to get details of filtered schemas. Filtering is aligned with the current possibilities for SHOW SCHEMAS https://docs.snowflake.com/en/sql-reference/sql/show-schemas query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. --- +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + # snowflake_schemas (Data Source) Datasource used to get details of filtered schemas. Filtering is aligned with the current possibilities for [SHOW SCHEMAS](https://docs.snowflake.com/en/sql-reference/sql/show-schemas) query. The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection.
diff --git a/docs/data-sources/security_integrations.md b/docs/data-sources/security_integrations.md index 16f187dfb3..4f0bc30c5b 100644 --- a/docs/data-sources/security_integrations.md +++ b/docs/data-sources/security_integrations.md @@ -5,7 +5,7 @@ description: |- Datasource used to get details of filtered security integrations. Filtering is aligned with the current possibilities for SHOW SECURITY INTEGRATIONS https://docs.snowflake.com/en/sql-reference/sql/show-integrations query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection security_integrations. --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_security_integrations (Data Source) diff --git a/docs/data-sources/streamlits.md b/docs/data-sources/streamlits.md index 99b6c25d06..ef767a9082 100644 --- a/docs/data-sources/streamlits.md +++ b/docs/data-sources/streamlits.md @@ -5,7 +5,7 @@ description: |- Datasource used to get details of filtered streamlits. 
Filtering is aligned with the current possibilities for SHOW STREAMLITS https://docs.snowflake.com/en/sql-reference/sql/show-streamlits query (only like is supported). The results of SHOW and DESCRIBE are encapsulated in one output collection streamlits. --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. # snowflake_streamlits (Data Source) diff --git a/docs/data-sources/warehouses.md b/docs/data-sources/warehouses.md index dcfef526c0..99ce968572 100644 --- a/docs/data-sources/warehouses.md +++ b/docs/data-sources/warehouses.md @@ -5,7 +5,7 @@ description: |- Datasource used to get details of filtered warehouses. Filtering is aligned with the current possibilities for SHOW WAREHOUSES https://docs.snowflake.com/en/sql-reference/sql/show-warehouses query (only like is supported). The results of SHOW, DESCRIBE, and SHOW PARAMETERS IN are encapsulated in one output collection. --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. 
We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # snowflake_warehouses (Data Source) diff --git a/docs/resources/schema.md b/docs/resources/schema.md index 92ff5d74d6..704a302ce5 100644 --- a/docs/resources/schema.md +++ b/docs/resources/schema.md @@ -5,6 +5,8 @@ description: |- Resource used to manage schema objects. For more information, check schema documentation https://docs.snowflake.com/en/sql-reference/sql/create-schema. --- +!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + # snowflake_schema (Resource) Resource used to manage schema objects. For more information, check [schema documentation](https://docs.snowflake.com/en/sql-reference/sql/create-schema). 
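For context on the documented `PUBLIC` behavior above: a config like the following hypothetical one (database and resource names are placeholders, not from the PR) would no longer replace an existing `PUBLIC` schema on create; instead the provider issues `ALTER` to converge it to the desired state:

```terraform
# Hypothetical sketch: managing the automatically created PUBLIC schema.
# As of v0.94.1, when the name is PUBLIC and the schema already exists,
# creation uses ALTER SCHEMA rather than CREATE OR REPLACE, so the
# objects already inside the schema are preserved.
resource "snowflake_schema" "public" {
  database = "EXAMPLE_DB"
  name     = "PUBLIC"
}
```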
@@ -53,7 +55,7 @@ resource "snowflake_schema" "schema" { ### Required - `database` (String) The database in which to create the schema. -- `name` (String) Specifies the identifier for the schema; must be unique for the database in which the schema is created. +- `name` (String) Specifies the identifier for the schema; must be unique for the database in which the schema is created. When the name is `PUBLIC`, during creation the provider checks if this schema has already been created and, in such case, `ALTER` is used to match the desired state. ### Optional diff --git a/pkg/acceptance/helpers/database_client.go b/pkg/acceptance/helpers/database_client.go index 2e9c71cf23..bc657a1fa3 100644 --- a/pkg/acceptance/helpers/database_client.go +++ b/pkg/acceptance/helpers/database_client.go @@ -49,15 +49,21 @@ func (c *DatabaseClient) CreateDatabaseWithOptions(t *testing.T, id sdk.AccountO } func (c *DatabaseClient) DropDatabaseFunc(t *testing.T, id sdk.AccountObjectIdentifier) func() { + t.Helper() + return func() { require.NoError(t, c.DropDatabase(t, id)) } +} + +func (c *DatabaseClient) DropDatabase(t *testing.T, id sdk.AccountObjectIdentifier) error { t.Helper() ctx := context.Background() - return func() { - err := c.client().Drop(ctx, id, &sdk.DropDatabaseOptions{IfExists: sdk.Bool(true)}) - require.NoError(t, err) - err = c.context.client.Sessions.UseSchema(ctx, c.ids.SchemaId()) - require.NoError(t, err) + if err := c.client().Drop(ctx, id, &sdk.DropDatabaseOptions{IfExists: sdk.Bool(true)}); err != nil { + return err + } + if err := c.context.client.Sessions.UseSchema(ctx, c.ids.SchemaId()); err != nil { + return err } + return nil } func (c *DatabaseClient) CreateSecondaryDatabaseWithOptions(t *testing.T, id sdk.AccountObjectIdentifier, externalId sdk.ExternalObjectIdentifier, opts *sdk.CreateSecondaryDatabaseOptions) (*sdk.Database, func()) { diff --git a/pkg/acceptance/helpers/grant_client.go b/pkg/acceptance/helpers/grant_client.go new file mode 100644 index 
0000000000..fa7ba85ccc --- /dev/null +++ b/pkg/acceptance/helpers/grant_client.go @@ -0,0 +1,45 @@ +package helpers + +import ( + "context" + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/stretchr/testify/require" +) + +type GrantClient struct { + context *TestClientContext + ids *IdsGenerator +} + +func NewGrantClient(context *TestClientContext, idsGenerator *IdsGenerator) *GrantClient { + return &GrantClient{ + context: context, + ids: idsGenerator, + } +} + +func (c *GrantClient) client() sdk.Grants { + return c.context.client.Grants +} + +func (c *GrantClient) GrantOnSchemaToAccountRole(t *testing.T, schemaId sdk.DatabaseObjectIdentifier, accountRoleId sdk.AccountObjectIdentifier, privileges ...sdk.SchemaPrivilege) { + t.Helper() + ctx := context.Background() + + err := c.client().GrantPrivilegesToAccountRole( + ctx, + &sdk.AccountRoleGrantPrivileges{ + SchemaPrivileges: privileges, + }, + &sdk.AccountRoleGrantOn{ + Schema: &sdk.GrantOnSchema{ + Schema: &schemaId, + }, + }, + accountRoleId, + new(sdk.GrantPrivilegesToAccountRoleOptions), + ) + require.NoError(t, err) +} diff --git a/pkg/acceptance/helpers/schema_client.go b/pkg/acceptance/helpers/schema_client.go index 390b736894..ac77595262 100644 --- a/pkg/acceptance/helpers/schema_client.go +++ b/pkg/acceptance/helpers/schema_client.go @@ -95,3 +95,12 @@ func (c *SchemaClient) Show(t *testing.T, id sdk.DatabaseObjectIdentifier) (*sdk return c.client().ShowByID(ctx, id) } + +func (c *SchemaClient) ShowWithOptions(t *testing.T, opts *sdk.ShowSchemaOptions) []sdk.Schema { + t.Helper() + ctx := context.Background() + + schemas, err := c.client().Show(ctx, opts) + require.NoError(t, err) + return schemas +} diff --git a/pkg/acceptance/helpers/test_client.go b/pkg/acceptance/helpers/test_client.go index ad1b2a982b..c629c37a0e 100644 --- a/pkg/acceptance/helpers/test_client.go +++ b/pkg/acceptance/helpers/test_client.go @@ -24,6 +24,7 @@ type TestClient 
struct { ExternalVolume *ExternalVolumeClient FailoverGroup *FailoverGroupClient FileFormat *FileFormatClient + Grant *GrantClient MaskingPolicy *MaskingPolicyClient MaterializedView *MaterializedViewClient NetworkPolicy *NetworkPolicyClient @@ -77,6 +78,7 @@ func NewTestClient(c *sdk.Client, database string, schema string, warehouse stri ExternalVolume: NewExternalVolumeClient(context, idsGenerator), FailoverGroup: NewFailoverGroupClient(context, idsGenerator), FileFormat: NewFileFormatClient(context, idsGenerator), + Grant: NewGrantClient(context, idsGenerator), MaskingPolicy: NewMaskingPolicyClient(context, idsGenerator), MaterializedView: NewMaterializedViewClient(context, idsGenerator), NetworkPolicy: NewNetworkPolicyClient(context, idsGenerator), diff --git a/pkg/acceptance/testing.go b/pkg/acceptance/testing.go index 510b881439..44ccdd24bf 100644 --- a/pkg/acceptance/testing.go +++ b/pkg/acceptance/testing.go @@ -24,10 +24,12 @@ import ( "github.com/snowflakedb/gosnowflake" ) +const AcceptanceTestPrefix = "acc_test_" + var ( - TestDatabaseName = "acc_test_db_" + random.AcceptanceTestsSuffix - TestSchemaName = "acc_test_sc_" + random.AcceptanceTestsSuffix - TestWarehouseName = "acc_test_wh_" + random.AcceptanceTestsSuffix + TestDatabaseName = fmt.Sprintf("%sdb_%s", AcceptanceTestPrefix, random.AcceptanceTestsSuffix) + TestSchemaName = fmt.Sprintf("%ssc_%s", AcceptanceTestPrefix, random.AcceptanceTestsSuffix) + TestWarehouseName = fmt.Sprintf("%swh_%s", AcceptanceTestPrefix, random.AcceptanceTestsSuffix) ) var ( diff --git a/pkg/acceptance/testprofiles/testing_config_profiles.go b/pkg/acceptance/testprofiles/testing_config_profiles.go index 0fbef504ca..6bca3093c1 100644 --- a/pkg/acceptance/testprofiles/testing_config_profiles.go +++ b/pkg/acceptance/testprofiles/testing_config_profiles.go @@ -3,5 +3,7 @@ package testprofiles const ( Default = "default" Secondary = "secondary_test_account" + Third = "third_test_account" + Fourth = "fourth_test_account" 
IncorrectUserAndPassword = "incorrect_test_profile" ) diff --git a/pkg/datasources/database.go b/pkg/datasources/database.go index e3caa58b04..dcf0ec536f 100644 --- a/pkg/datasources/database.go +++ b/pkg/datasources/database.go @@ -83,7 +83,11 @@ func ReadDatabase(d *schema.ResourceData, meta interface{}) error { if err := d.Set("is_current", database.IsCurrent); err != nil { return err } - if err := d.Set("origin", database.Origin); err != nil { + var origin string + if database.Origin != nil { + origin = database.Origin.FullyQualifiedName() + } + if err := d.Set("origin", origin); err != nil { return err } if err := d.Set("retention_time", database.RetentionTime); err != nil { diff --git a/pkg/helpers/helpers.go b/pkg/helpers/helpers.go index 96866792c4..a3e0f04737 100644 --- a/pkg/helpers/helpers.go +++ b/pkg/helpers/helpers.go @@ -105,7 +105,7 @@ func DecodeSnowflakeID(id string) sdk.ObjectIdentifier { // The following configuration { "some_identifier": "db.name" } will be parsed as an object called "name" that lives // inside database called "db", not a database called "db.name". In this case quotes should be used. func DecodeSnowflakeParameterID(identifier string) (sdk.ObjectIdentifier, error) { - parts, err := ParseIdentifierString(identifier) + parts, err := sdk.ParseIdentifierString(identifier) if err != nil { return nil, err } @@ -126,7 +126,7 @@ func DecodeSnowflakeParameterID(identifier string) (sdk.ObjectIdentifier, error) // DecodeSnowflakeAccountIdentifier decodes account identifier (usually passed as one of the parameter in tf configuration) into sdk.AccountIdentifier. // Check more in https://docs.snowflake.com/en/sql-reference/sql/create-account#required-parameters. 
func DecodeSnowflakeAccountIdentifier(identifier string) (sdk.AccountIdentifier, error) { - parts, err := ParseIdentifierString(identifier) + parts, err := sdk.ParseIdentifierString(identifier) if err != nil { return sdk.AccountIdentifier{}, err } @@ -166,7 +166,7 @@ func ConcatSlices[T any](slices ...[]T) []T { // TODO(SNOW-999049): address during identifiers rework func ParseRootLocation(location string) (sdk.SchemaObjectIdentifier, string, error) { location = strings.TrimPrefix(location, "@") - parts, err := parseIdentifierStringWithOpts(location, func(r *csv.Reader) { + parts, err := sdk.ParseIdentifierStringWithOpts(location, func(r *csv.Reader) { r.Comma = '.' r.LazyQuotes = true }) diff --git a/pkg/helpers/identifier_string_parser.go b/pkg/helpers/identifier_string_parser.go deleted file mode 100644 index c5b1ee2049..0000000000 --- a/pkg/helpers/identifier_string_parser.go +++ /dev/null @@ -1,32 +0,0 @@ -package helpers - -import ( - "encoding/csv" - "fmt" - "strings" -) - -const ( - ParameterIDDelimiter = '.' 
-) - -func ParseIdentifierString(identifier string) ([]string, error) { - return parseIdentifierStringWithOpts(identifier, func(r *csv.Reader) { - r.Comma = ParameterIDDelimiter - }) -} - -func parseIdentifierStringWithOpts(identifier string, opts func(*csv.Reader)) ([]string, error) { - reader := csv.NewReader(strings.NewReader(identifier)) - if opts != nil { - opts(reader) - } - lines, err := reader.ReadAll() - if err != nil { - return nil, fmt.Errorf("unable to read identifier: %s, err = %w", identifier, err) - } - if len(lines) != 1 { - return nil, fmt.Errorf("incompatible identifier: %s", identifier) - } - return lines[0], nil -} diff --git a/pkg/helpers/identifier_string_parser_test.go b/pkg/helpers/identifier_string_parser_test.go deleted file mode 100644 index 1635808290..0000000000 --- a/pkg/helpers/identifier_string_parser_test.go +++ /dev/null @@ -1,94 +0,0 @@ -package helpers - -import ( - "testing" - - "github.com/stretchr/testify/require" -) - -// TODO [SNOW-999049]: add more fancy cases -func Test_ParseIdentifierString(t *testing.T) { - containsAll := func(t *testing.T, parts, expectedParts []string) { - t.Helper() - require.Len(t, parts, len(expectedParts)) - for _, part := range expectedParts { - require.Contains(t, parts, part) - } - } - - t.Run("returns read error", func(t *testing.T) { - input := `ab"c` - - _, err := ParseIdentifierString(input) - - require.ErrorContains(t, err, "unable to read identifier") - require.ErrorContains(t, err, `bare " in non-quoted-field`) - }) - - t.Run("returns error for empty input", func(t *testing.T) { - input := "" - - _, err := ParseIdentifierString(input) - - require.ErrorContains(t, err, "incompatible identifier") - }) - - t.Run("returns error for multiple lines", func(t *testing.T) { - input := "abc\ndef" - - _, err := ParseIdentifierString(input) - - require.ErrorContains(t, err, "incompatible identifier") - }) - - t.Run("returns parts correctly without quoting", func(t *testing.T) { - input := 
"abc.def" - expected := []string{"abc", "def"} - - parts, err := ParseIdentifierString(input) - - require.NoError(t, err) - containsAll(t, parts, expected) - }) - - t.Run("returns parts correctly with quoting", func(t *testing.T) { - input := `"abc"."def"` - expected := []string{"abc", "def"} - - parts, err := ParseIdentifierString(input) - - require.NoError(t, err) - containsAll(t, parts, expected) - }) - - t.Run("returns parts correctly with mixed quoting", func(t *testing.T) { - input := `"abc".def."ghi"` - expected := []string{"abc", "def", "ghi"} - - parts, err := ParseIdentifierString(input) - - require.NoError(t, err) - containsAll(t, parts, expected) - }) - - // Quote inside must have a preceding quote (https://docs.snowflake.com/en/sql-reference/identifiers-syntax). - t.Run("returns parts correctly with quote inside", func(t *testing.T) { - input := `"ab""c".def` - expected := []string{`ab"c`, "def"} - - parts, err := ParseIdentifierString(input) - - require.NoError(t, err) - containsAll(t, parts, expected) - }) - - t.Run("returns parts correctly with dots inside", func(t *testing.T) { - input := `"ab.c".def` - expected := []string{`ab.c`, "def"} - - parts, err := ParseIdentifierString(input) - - require.NoError(t, err) - containsAll(t, parts, expected) - }) -} diff --git a/pkg/helpers/resource_identifier.go b/pkg/helpers/resource_identifier.go new file mode 100644 index 0000000000..684b6ecd55 --- /dev/null +++ b/pkg/helpers/resource_identifier.go @@ -0,0 +1,18 @@ +package helpers + +import ( + "strings" +) + +const ResourceIdDelimiter = '|' + +func ParseResourceIdentifier(identifier string) []string { + if identifier == "" { + return make([]string, 0) + } + return strings.Split(identifier, string(ResourceIdDelimiter)) +} + +func EncodeResourceIdentifier(parts ...string) string { + return strings.Join(parts, string(ResourceIdDelimiter)) +} diff --git a/pkg/helpers/resource_identifier_test.go b/pkg/helpers/resource_identifier_test.go new file mode 100644 
index 0000000000..f453cb8339 --- /dev/null +++ b/pkg/helpers/resource_identifier_test.go @@ -0,0 +1,39 @@ +package helpers + +import ( + "fmt" + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/stretchr/testify/assert" +) + +func Test_Encoding_And_Parsing_Of_ResourceIdentifier(t *testing.T) { + testCases := []struct { + Input []string + Expected string + ExpectedAfterDecoding []string + }{ + {Input: []string{sdk.NewSchemaObjectIdentifier("a", "b", "c").FullyQualifiedName(), "info"}, Expected: `"a"."b"."c"|info`}, + {Input: []string{}, Expected: ``}, + {Input: []string{"", "", ""}, Expected: `||`}, + {Input: []string{"a", "b", "c"}, Expected: `a|b|c`}, + // If one of the parts contains a separator sign (pipe in this case), + // we can end up with more parts than we started with. + {Input: []string{"a", "b", "c|d"}, Expected: `a|b|c|d`, ExpectedAfterDecoding: []string{"a", "b", "c", "d"}}, + } + + for _, testCase := range testCases { + t.Run(fmt.Sprintf(`Encoding and parsing %s resource identifier`, testCase.Input), func(t *testing.T) { + encodedIdentifier := EncodeResourceIdentifier(testCase.Input...) 
+ assert.Equal(t, testCase.Expected, encodedIdentifier) + + parsedIdentifier := ParseResourceIdentifier(encodedIdentifier) + if testCase.ExpectedAfterDecoding != nil { + assert.Equal(t, testCase.ExpectedAfterDecoding, parsedIdentifier) + } else { + assert.Equal(t, testCase.Input, parsedIdentifier) + } + }) + } +} diff --git a/pkg/resources/grant_ownership_identifier.go b/pkg/resources/grant_ownership_identifier.go index a6bc3277c3..2b897f262c 100644 --- a/pkg/resources/grant_ownership_identifier.go +++ b/pkg/resources/grant_ownership_identifier.go @@ -2,7 +2,6 @@ package resources import ( "fmt" - "strings" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" @@ -59,7 +58,7 @@ func (g *OnObjectGrantOwnershipData) String() string { var parts []string parts = append(parts, g.ObjectType.String()) parts = append(parts, g.ObjectName.FullyQualifiedName()) - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) } func (g *GrantOwnershipId) String() string { @@ -81,13 +80,13 @@ func (g *GrantOwnershipId) String() string { if len(data) > 0 { parts = append(parts, data) } - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) 
} func ParseGrantOwnershipId(id string) (*GrantOwnershipId, error) { grantOwnershipId := new(GrantOwnershipId) - parts := strings.Split(id, helpers.IDDelimiter) + parts := helpers.ParseResourceIdentifier(id) if len(parts) < 5 { return grantOwnershipId, sdk.NewError(`grant ownership identifier should hold at least 5 parts "||||"`) } diff --git a/pkg/resources/grant_privileges_identifier_commons.go b/pkg/resources/grant_privileges_identifier_commons.go index 885b5d61c5..81779c08d9 100644 --- a/pkg/resources/grant_privileges_identifier_commons.go +++ b/pkg/resources/grant_privileges_identifier_commons.go @@ -39,7 +39,7 @@ func (d *OnSchemaGrantData) String() string { case OnAllSchemasInDatabaseSchemaGrantKind, OnFutureSchemasInDatabaseSchemaGrantKind: parts = append(parts, d.DatabaseName.FullyQualifiedName()) } - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) } type OnSchemaObjectGrantData struct { @@ -57,7 +57,7 @@ func (d *OnSchemaObjectGrantData) String() string { case OnAllSchemaObjectGrantKind, OnFutureSchemaObjectGrantKind: parts = append(parts, d.OnAllOrFuture.String()) } - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) } type BulkOperationGrantKind string @@ -84,7 +84,7 @@ func (d *BulkOperationGrantData) String() string { case InSchemaBulkOperationGrantKind: parts = append(parts, d.Schema.FullyQualifiedName()) } - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) 
} func getBulkOperationGrantData(in *sdk.GrantOnSchemaObjectIn) *BulkOperationGrantData { diff --git a/pkg/resources/grant_privileges_to_account_role_identifier.go b/pkg/resources/grant_privileges_to_account_role_identifier.go index 03f91dee4d..024e18bc11 100644 --- a/pkg/resources/grant_privileges_to_account_role_identifier.go +++ b/pkg/resources/grant_privileges_to_account_role_identifier.go @@ -30,10 +30,7 @@ type OnAccountObjectGrantData struct { } func (d *OnAccountObjectGrantData) String() string { - return strings.Join([]string{ - d.ObjectType.String(), - d.ObjectName.FullyQualifiedName(), - }, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(d.ObjectType.String(), d.ObjectName.FullyQualifiedName()) } type GrantPrivilegesToAccountRoleId struct { @@ -61,13 +58,13 @@ func (g *GrantPrivilegesToAccountRoleId) String() string { if len(data) > 0 { parts = append(parts, data) } - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) } func ParseGrantPrivilegesToAccountRoleId(id string) (GrantPrivilegesToAccountRoleId, error) { var accountRoleId GrantPrivilegesToAccountRoleId - parts := strings.Split(id, helpers.IDDelimiter) + parts := helpers.ParseResourceIdentifier(id) if len(parts) < 5 { return accountRoleId, sdk.NewError(`account role identifier should hold at least 5 parts "||||"`) } diff --git a/pkg/resources/grant_privileges_to_database_role_identifier.go b/pkg/resources/grant_privileges_to_database_role_identifier.go index 56e4ec2044..973b7ab024 100644 --- a/pkg/resources/grant_privileges_to_database_role_identifier.go +++ b/pkg/resources/grant_privileges_to_database_role_identifier.go @@ -39,7 +39,7 @@ func (g *GrantPrivilegesToDatabaseRoleId) String() string { } parts = append(parts, string(g.Kind)) parts = append(parts, g.Data.String()) - return strings.Join(parts, helpers.IDDelimiter) + return helpers.EncodeResourceIdentifier(parts...) 
} type OnDatabaseGrantData struct { @@ -53,7 +53,7 @@ func (d *OnDatabaseGrantData) String() string { func ParseGrantPrivilegesToDatabaseRoleId(id string) (GrantPrivilegesToDatabaseRoleId, error) { var databaseRoleId GrantPrivilegesToDatabaseRoleId - parts := strings.Split(id, helpers.IDDelimiter) + parts := helpers.ParseResourceIdentifier(id) if len(parts) < 6 { return databaseRoleId, sdk.NewError(`database role identifier should hold at least 6 parts "|||||"`) } diff --git a/pkg/resources/grant_privileges_to_share_identifier.go b/pkg/resources/grant_privileges_to_share_identifier.go index 744facc1b2..9996c38add 100644 --- a/pkg/resources/grant_privileges_to_share_identifier.go +++ b/pkg/resources/grant_privileges_to_share_identifier.go @@ -29,18 +29,18 @@ type GrantPrivilegesToShareId struct { } func (id *GrantPrivilegesToShareId) String() string { - return strings.Join([]string{ + return helpers.EncodeResourceIdentifier( id.ShareName.FullyQualifiedName(), strings.Join(id.Privileges, ","), string(id.Kind), id.Identifier.FullyQualifiedName(), - }, helpers.IDDelimiter) + ) } func ParseGrantPrivilegesToShareId(idString string) (GrantPrivilegesToShareId, error) { var grantPrivilegesToShareId GrantPrivilegesToShareId - parts := strings.Split(idString, helpers.IDDelimiter) + parts := helpers.ParseResourceIdentifier(idString) if len(parts) != 4 { return grantPrivilegesToShareId, sdk.NewError(fmt.Sprintf(`snowflake_grant_privileges_to_share id is composed out of 4 parts "|||", but got %d parts: %v`, len(parts), parts)) } diff --git a/pkg/resources/network_policy.go b/pkg/resources/network_policy.go index 5d84e96dd2..8ca3e82f94 100644 --- a/pkg/resources/network_policy.go +++ b/pkg/resources/network_policy.go @@ -92,10 +92,10 @@ func NetworkPolicy() *schema.Resource { Description: "Resource used to control network traffic. 
For more information, check an [official guide](https://docs.snowflake.com/en/user-guide/network-policies) on controlling network traffic with network policies.", CustomizeDiff: customdiff.All( - // For now, allowed_network_rule_list and blocked_network_rule_list have to stay commented and the implementation - // for ComputedIfAnyAttributeChanged has to be adjusted. The main issue lays in fields that have diff suppression. - // When the value in state and the value in config are different (which is normal with diff suppressions) show - // and describe outputs are constantly recomputed (which will appear in every terraform plan). + // For now, allowed_network_rule_list and blocked_network_rule_list have to stay commented. + // The main issue lies in the old Terraform SDK and how it handles DiffSuppression and CustomizeDiff + // for complex types like Sets, Lists, and Maps. When every element of the Set is suppressed in custom diff, + // it returns true for d.HasChange anyway (it returns false for suppressed changes on primitive types like Number, Bool, String, etc.).
ComputedIfAnyAttributeChanged( ShowOutputAttributeName, // "allowed_network_rule_list", diff --git a/pkg/resources/network_policy_acceptance_test.go b/pkg/resources/network_policy_acceptance_test.go index 638b58dfbd..e552fe84e5 100644 --- a/pkg/resources/network_policy_acceptance_test.go +++ b/pkg/resources/network_policy_acceptance_test.go @@ -363,6 +363,37 @@ func TestAcc_NetworkPolicy_InvalidBlockedIpListValue(t *testing.T) { }) } +func TestAcc_NetworkPolicy_InvalidNetworkRuleIds(t *testing.T) { + id := acc.TestClient().Ids.RandomAccountObjectIdentifier() + + resource.Test(t, resource.TestCase{ + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.NetworkPolicy), + Steps: []resource.TestStep{ + { + Config: networkPolicyConfigInvalidAllowedNetworkRules(id.Name()), + ExpectError: regexp.MustCompile(`sdk\.TableColumnIdentifier\. The correct form of the fully qualified name for`), + }, + { + Config: networkPolicyConfigInvalidAllowedNetworkRules(id.Name()), + ExpectError: regexp.MustCompile(`sdk\.DatabaseObjectIdentifier\. The correct form of the fully qualified name`), + }, + { + Config: networkPolicyConfigInvalidBlockedNetworkRules(id.Name()), + ExpectError: regexp.MustCompile(`sdk\.TableColumnIdentifier\. The correct form of the fully qualified name for`), + }, + { + Config: networkPolicyConfigInvalidBlockedNetworkRules(id.Name()), + ExpectError: regexp.MustCompile(`sdk\.DatabaseObjectIdentifier\. 
The correct form of the fully qualified name`), + }, + }, + }) +} + func networkPolicyConfigBasic(name string) string { return fmt.Sprintf(`resource "snowflake_network_policy" "test" { name = "%v" @@ -376,6 +407,20 @@ func networkPolicyConfigInvalidBlockedIpListValue(name string) string { }`, name) } +func networkPolicyConfigInvalidAllowedNetworkRules(name string) string { + return fmt.Sprintf(`resource "snowflake_network_policy" "test" { + name = "%v" + allowed_network_rule_list = ["a.b", "a.b.c.d"] + }`, name) +} + +func networkPolicyConfigInvalidBlockedNetworkRules(name string) string { + return fmt.Sprintf(`resource "snowflake_network_policy" "test" { + name = "%v" + blocked_network_rule_list = ["a.b", "a.b.c.d"] + }`, name) +} + func networkPolicyConfigComplete( name string, allowedRuleList []string, diff --git a/pkg/resources/schema.go b/pkg/resources/schema.go index 0ed2b229ae..57c2baf88b 100644 --- a/pkg/resources/schema.go +++ b/pkg/resources/schema.go @@ -8,24 +8,21 @@ import ( "slices" "strings" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/schemas" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/helpers" - + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/go-cty/cty" "github.com/hashicorp/terraform-plugin-sdk/v2/diag" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/customdiff" "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" ) var schemaSchema = map[string]*schema.Schema{ "name": { Type: schema.TypeString, Required: true, - Description: "Specifies the identifier for the schema; must be unique for the database in which the schema is created.", + Description: "Specifies the 
identifier for the schema; must be unique for the database in which the schema is created. When the name is `PUBLIC`, during creation the provider checks if this schema has already been created and, in such case, `ALTER` is used to match the desired state.", DiffSuppressFunc: suppressIdentifierQuoting, }, "database": { @@ -177,6 +174,18 @@ func CreateContextSchema(ctx context.Context, d *schema.ResourceData, meta any) database := d.Get("database").(string) id := sdk.NewDatabaseObjectIdentifier(database, name) + if strings.EqualFold(strings.TrimSpace(name), "PUBLIC") { + _, err := client.Schemas.ShowByID(ctx, id) + if err != nil && !errors.Is(err, sdk.ErrObjectNotFound) { + return diag.FromErr(err) + } else if err == nil { + // there is already a PUBLIC schema, so we need to alter it instead + log.Printf("[DEBUG] found PUBLIC schema during creation, updating...") + d.SetId(helpers.EncodeSnowflakeID(database, name)) + return UpdateContextSchema(ctx, d, meta) + } + } + opts := &sdk.CreateSchemaOptions{ Comment: GetConfigPropertyAsPointerAllowingZeroValue[string](d, "comment"), } @@ -198,9 +207,6 @@ func CreateContextSchema(ctx context.Context, d *schema.ResourceData, meta any) } opts.WithManagedAccess = sdk.Bool(parsed) } - if strings.EqualFold(strings.TrimSpace(name), "PUBLIC") { - opts.OrReplace = sdk.Pointer(true) - } if err := client.Schemas.Create(ctx, id, opts); err != nil { return diag.Diagnostics{ diag.Diagnostic{ @@ -210,6 +216,7 @@ func CreateContextSchema(ctx context.Context, d *schema.ResourceData, meta any) }, } } + d.SetId(helpers.EncodeSnowflakeID(database, name)) return ReadContextSchema(false)(ctx, d, meta) @@ -312,7 +319,7 @@ func UpdateContextSchema(ctx context.Context, d *schema.ResourceData, meta any) id := helpers.DecodeSnowflakeID(d.Id()).(sdk.DatabaseObjectIdentifier) client := meta.(*provider.Context).Client - if d.HasChange("name") { + if d.HasChange("name") && !d.GetRawState().IsNull() { newId := 
sdk.NewDatabaseObjectIdentifier(d.Get("database").(string), d.Get("name").(string)) err := client.Schemas.Alter(ctx, id, &sdk.AlterSchemaOptions{ NewName: sdk.Pointer(newId), diff --git a/pkg/resources/schema_acceptance_test.go b/pkg/resources/schema_acceptance_test.go index 4d1ebfd35b..d07397a08e 100644 --- a/pkg/resources/schema_acceptance_test.go +++ b/pkg/resources/schema_acceptance_test.go @@ -1,8 +1,10 @@ package resources_test import ( + "cmp" "fmt" "regexp" + "slices" "strconv" "testing" @@ -447,8 +449,9 @@ func TestAcc_Schema_Rename(t *testing.T) { }) } -func TestAcc_Schema_ManagePublic(t *testing.T) { +func TestAcc_Schema_ManagePublicVersion_0_94_0(t *testing.T) { name := "PUBLIC" + schemaId := sdk.NewDatabaseObjectIdentifier(acc.TestDatabaseName, name) resource.Test(t, resource.TestCase{ PreCheck: func() { acc.TestAccPreCheck(t) }, @@ -469,13 +472,13 @@ func TestAcc_Schema_ManagePublic(t *testing.T) { ExpectError: regexp.MustCompile("Error: error creating schema PUBLIC"), }, { - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - ConfigDirectory: acc.ConfigurationDirectory("TestAcc_Schema/basic_with_pipe_execution_paused"), - ConfigVariables: map[string]config.Variable{ - "name": config.StringVariable(name), - "database": config.StringVariable(acc.TestDatabaseName), - "pipe_execution_paused": config.BoolVariable(true), + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.94.0", + Source: "Snowflake-Labs/snowflake", + }, }, + Config: schemav094WithPipeExecutionPaused(name, acc.TestDatabaseName, true), Check: resource.ComposeTestCheckFunc( resource.TestCheckResourceAttr("snowflake_schema.test", "name", name), resource.TestCheckResourceAttr("snowflake_schema.test", "database", acc.TestDatabaseName), @@ -483,13 +486,28 @@ func TestAcc_Schema_ManagePublic(t *testing.T) { ), }, { - ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, - ConfigDirectory: 
acc.ConfigurationDirectory("TestAcc_Schema/basic_with_pipe_execution_paused"), - ConfigVariables: map[string]config.Variable{ - "name": config.StringVariable(name), - "database": config.StringVariable(acc.TestDatabaseName), - "pipe_execution_paused": config.BoolVariable(false), + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: "=0.94.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + PreConfig: func() { + // In v0.94 `CREATE OR REPLACE` was called, so we should see a DROP event. + schemas := acc.TestClient().Schema.ShowWithOptions(t, &sdk.ShowSchemaOptions{ + History: sdk.Pointer(true), + Like: &sdk.Like{ + Pattern: sdk.String(schemaId.Name()), + }, + }) + require.Len(t, schemas, 2) + slices.SortFunc(schemas, func(x, y sdk.Schema) int { + return cmp.Compare(x.DroppedOn.Unix(), y.DroppedOn.Unix()) + }) + require.Zero(t, schemas[0].DroppedOn) + require.NotZero(t, schemas[1].DroppedOn) }, + Config: schemav094WithPipeExecutionPaused(name, acc.TestDatabaseName, false), ConfigPlanChecks: resource.ConfigPlanChecks{ PreApply: []plancheck.PlanCheck{ plancheck.ExpectResourceAction("snowflake_schema.test", plancheck.ResourceActionUpdate), @@ -505,6 +523,70 @@ func TestAcc_Schema_ManagePublic(t *testing.T) { }) } +func TestAcc_Schema_ManagePublicVersion_0_94_1(t *testing.T) { + name := "PUBLIC" + + // use a separate db because this test relies on schema history + db, cleanupDb := acc.TestClient().Database.CreateDatabase(t) + t.Cleanup(cleanupDb) + schemaId := sdk.NewDatabaseObjectIdentifier(db.Name, name) + + resource.Test(t, resource.TestCase{ + PreCheck: func() { acc.TestAccPreCheck(t) }, + TerraformVersionChecks: []tfversion.TerraformVersionCheck{ + tfversion.RequireAbove(tfversion.Version1_5_0), + }, + CheckDestroy: acc.CheckDestroy(t, resources.Schema), + Steps: []resource.TestStep{ + // PUBLIC can not be created in v0.93 + { + ExternalProviders: map[string]resource.ExternalProvider{ + "snowflake": { + VersionConstraint: 
"=0.93.0", + Source: "Snowflake-Labs/snowflake", + }, + }, + Config: schemav093(name, db.Name), + ExpectError: regexp.MustCompile("Error: error creating schema PUBLIC"), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + Config: schemav094WithPipeExecutionPaused(name, db.Name, true), + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_schema.test", "name", name), + resource.TestCheckResourceAttr("snowflake_schema.test", "database", db.Name), + resource.TestCheckResourceAttr("snowflake_schema.test", "pipe_execution_paused", "true"), + ), + }, + { + ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, + PreConfig: func() { + // In newer versions, ALTER was called, so we should not see a DROP event. + schemas := acc.TestClient().Schema.ShowWithOptions(t, &sdk.ShowSchemaOptions{ + History: sdk.Pointer(true), + Like: &sdk.Like{ + Pattern: sdk.String(schemaId.Name()), + }, + }) + require.Len(t, schemas, 1) + require.Zero(t, schemas[0].DroppedOn) + }, + Config: schemav094WithPipeExecutionPaused(name, db.Name, true), + ConfigPlanChecks: resource.ConfigPlanChecks{ + PreApply: []plancheck.PlanCheck{ + plancheck.ExpectResourceAction("snowflake_schema.test", plancheck.ResourceActionNoop), + }, + }, + Check: resource.ComposeTestCheckFunc( + resource.TestCheckResourceAttr("snowflake_schema.test", "name", name), + resource.TestCheckResourceAttr("snowflake_schema.test", "database", db.Name), + resource.TestCheckResourceAttr("snowflake_schema.test", "pipe_execution_paused", "true"), + ), + }, + }, + }) +} + // TestAcc_Schema_TwoSchemasWithTheSameNameOnDifferentDatabases proves https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2209 issue. 
func TestAcc_Schema_TwoSchemasWithTheSameNameOnDifferentDatabases(t *testing.T) { name := "test_schema" @@ -949,3 +1031,14 @@ resource "snowflake_schema" "test" { ` return fmt.Sprintf(s, name, database) } + +func schemav094WithPipeExecutionPaused(name, database string, pipeExecutionPaused bool) string { + s := ` +resource "snowflake_schema" "test" { + name = "%s" + database = "%s" + pipe_execution_paused = %t +} +` + return fmt.Sprintf(s, name, database, pipeExecutionPaused) +} diff --git a/pkg/resources/secondary_database.go b/pkg/resources/secondary_database.go index ed60b30cbb..111947a39e 100644 --- a/pkg/resources/secondary_database.go +++ b/pkg/resources/secondary_database.go @@ -175,8 +175,10 @@ func ReadSecondaryDatabase(ctx context.Context, d *schema.ResourceData, meta any return diag.FromErr(err) } - if err := d.Set("as_replica_of", sdk.NewExternalObjectIdentifierFromFullyQualifiedName(replicationPrimaryDatabase.PrimaryDatabase).FullyQualifiedName()); err != nil { - return diag.FromErr(err) + if replicationPrimaryDatabase.PrimaryDatabase != nil { + if err := d.Set("as_replica_of", replicationPrimaryDatabase.PrimaryDatabase.FullyQualifiedName()); err != nil { + return diag.FromErr(err) + } } if err := d.Set("is_transient", secondaryDatabase.Transient); err != nil { diff --git a/pkg/resources/secondary_database_acceptance_test.go b/pkg/resources/secondary_database_acceptance_test.go index 921025ef90..edac458ec5 100644 --- a/pkg/resources/secondary_database_acceptance_test.go +++ b/pkg/resources/secondary_database_acceptance_test.go @@ -3,6 +3,7 @@ package resources_test import ( "context" "testing" + "time" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" @@ -20,10 +21,13 @@ func TestAcc_CreateSecondaryDatabase_Basic(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() comment := random.Comment() - _, externalPrimaryId, primaryDatabaseCleanup := 
acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ + primaryDatabase, externalPrimaryId, _ := acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ acc.TestClient().Account.GetAccountIdentifier(t), }) - t.Cleanup(primaryDatabaseCleanup) + t.Cleanup(func() { + // TODO(SNOW-1562172): Create a better solution for this type of situations + require.Eventually(t, func() bool { return acc.SecondaryTestClient().Database.DropDatabase(t, primaryDatabase.ID()) == nil }, time.Second*5, time.Second) + }) newId := acc.TestClient().Ids.RandomAccountObjectIdentifier() newComment := random.Comment() @@ -151,10 +155,13 @@ func TestAcc_CreateSecondaryDatabase_complete(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() comment := random.Comment() - _, externalPrimaryId, primaryDatabaseCleanup := acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ + primaryDatabase, externalPrimaryId, _ := acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ sdk.NewAccountIdentifierFromAccountLocator(acc.Client(t).GetAccountLocator()), }) - t.Cleanup(primaryDatabaseCleanup) + t.Cleanup(func() { + // TODO(SNOW-1562172): Create a better solution for this type of situations + require.Eventually(t, func() bool { return acc.SecondaryTestClient().Database.DropDatabase(t, primaryDatabase.ID()) == nil }, time.Second*5, time.Second) + }) externalVolumeId, externalVolumeCleanup := acc.TestClient().ExternalVolume.Create(t) t.Cleanup(externalVolumeCleanup) @@ -391,10 +398,13 @@ func TestAcc_CreateSecondaryDatabase_complete(t *testing.T) { func TestAcc_CreateSecondaryDatabase_DataRetentionTimeInDays(t *testing.T) { id := acc.TestClient().Ids.RandomAccountObjectIdentifier() - _, externalPrimaryId, primaryDatabaseCleanup := acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ + primaryDatabase, externalPrimaryId, _ := 
acc.SecondaryTestClient().Database.CreatePrimaryDatabase(t, []sdk.AccountIdentifier{ sdk.NewAccountIdentifierFromAccountLocator(acc.Client(t).GetAccountLocator()), }) - t.Cleanup(primaryDatabaseCleanup) + t.Cleanup(func() { + // TODO(SNOW-1562172): Create a better solution for this type of situations + require.Eventually(t, func() bool { return acc.SecondaryTestClient().Database.DropDatabase(t, primaryDatabase.ID()) == nil }, time.Second*5, time.Second) + }) accountDataRetentionTimeInDays, err := acc.Client(t).Parameters.ShowAccountParameter(context.Background(), sdk.AccountParameterDataRetentionTimeInDays) require.NoError(t, err) diff --git a/pkg/resources/shared_database.go b/pkg/resources/shared_database.go index 6dbb594b40..5f290b05c5 100644 --- a/pkg/resources/shared_database.go +++ b/pkg/resources/shared_database.go @@ -143,8 +143,10 @@ func ReadSharedDatabase(ctx context.Context, d *schema.ResourceData, meta any) d return diag.FromErr(err) } - if err := d.Set("from_share", sdk.NewExternalObjectIdentifierFromFullyQualifiedName(database.Origin).FullyQualifiedName()); err != nil { - return diag.FromErr(err) + if database.Origin != nil { + if err := d.Set("from_share", database.Origin.FullyQualifiedName()); err != nil { + return diag.FromErr(err) + } } // TODO(SNOW-1325381) diff --git a/pkg/resources/shared_database_acceptance_test.go b/pkg/resources/shared_database_acceptance_test.go index fc0ec0cc1d..d700a25646 100644 --- a/pkg/resources/shared_database_acceptance_test.go +++ b/pkg/resources/shared_database_acceptance_test.go @@ -4,16 +4,16 @@ import ( "context" "regexp" "testing" - - "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" - "github.com/hashicorp/terraform-plugin-testing/plancheck" + "time" acc "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers" 
"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/helpers/random" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/provider/resources" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" "github.com/hashicorp/terraform-plugin-testing/config" "github.com/hashicorp/terraform-plugin-testing/helper/resource" + "github.com/hashicorp/terraform-plugin-testing/plancheck" "github.com/hashicorp/terraform-plugin-testing/tfversion" "github.com/stretchr/testify/require" ) @@ -174,6 +174,24 @@ func TestAcc_CreateSharedDatabase_complete(t *testing.T) { "enable_console_output": config.BoolVariable(true), } + // TODO(SNOW-1562172): Create a better solution for this type of situations + // We have to create a test database from the share before the actual test to check if the newly created share is ready + // after the previous test (there's some kind of issue or delay between cleaning up a share and creating a new one right after). + testId := acc.TestClient().Ids.RandomAccountObjectIdentifier() + err := acc.Client(t).Databases.CreateShared(context.Background(), testId, externalShareId, new(sdk.CreateSharedDatabaseOptions)) + require.NoError(t, err) + + require.Eventually(t, func() bool { + database, err := acc.TestClient().Database.Show(t, testId) + if err != nil { + return false + } + // Origin is returned as "" in those cases; because it's not a valid sdk.ExternalObjectIdentifier, the parser sets it to nil. + // Once it turns into a valid sdk.ExternalObjectIdentifier, we're ready to proceed with the actual test.
+ return database.Origin != nil + }, time.Minute, time.Second*6) + acc.TestClient().Database.DropDatabaseFunc(t, testId)() + resource.Test(t, resource.TestCase{ ProtoV6ProviderFactories: acc.TestAccProtoV6ProviderFactories, TerraformVersionChecks: []tfversion.TerraformVersionCheck{ diff --git a/pkg/resources/user_password_policy_attachment.go b/pkg/resources/user_password_policy_attachment.go index 567f6a40e5..12882103a3 100644 --- a/pkg/resources/user_password_policy_attachment.go +++ b/pkg/resources/user_password_policy_attachment.go @@ -3,7 +3,6 @@ package resources import ( "context" "fmt" - "strings" "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/internal/provider" @@ -14,10 +13,11 @@ import ( var userPasswordPolicyAttachmentSchema = map[string]*schema.Schema{ "user_name": { - Type: schema.TypeString, - Required: true, - ForceNew: true, - Description: "User name of the user you want to attach the password policy to", + Type: schema.TypeString, + Required: true, + ForceNew: true, + Description: "User name of the user you want to attach the password policy to", + ValidateDiagFunc: IsValidIdentifier[sdk.AccountObjectIdentifier](), }, "password_policy_name": { Type: schema.TypeString, @@ -57,7 +57,7 @@ func CreateUserPasswordPolicyAttachment(d *schema.ResourceData, meta any) error return err } - d.SetId(helpers.EncodeSnowflakeID(userName.FullyQualifiedName(), passwordPolicy.FullyQualifiedName())) + d.SetId(helpers.EncodeResourceIdentifier(userName.FullyQualifiedName(), passwordPolicy.FullyQualifiedName())) return ReadUserPasswordPolicyAttachment(d, meta) } @@ -66,7 +66,7 @@ func ReadUserPasswordPolicyAttachment(d *schema.ResourceData, meta any) error { client := meta.(*provider.Context).Client ctx := context.Background() - parts := strings.Split(d.Id(), helpers.IDDelimiter) + parts := helpers.ParseResourceIdentifier(d.Id()) if len(parts) != 2 { return fmt.Errorf("required id format 'user_name|password_policy_name', but got: '%s'", d.Id()) } 
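The grant and attachment resources above switch their ID handling from raw `strings.Join`/`strings.Split` to the `helpers.EncodeResourceIdentifier`/`helpers.ParseResourceIdentifier` pair. A minimal standalone sketch of that pipe-delimited round-trip, using local stand-in names (the provider's real helpers in `pkg/helpers` may handle edge cases such as delimiter escaping differently):

```go
package main

import (
	"fmt"
	"strings"
)

// idDelimiter mirrors helpers.IDDelimiter; the functions below are
// illustrative stand-ins, not the provider's exported helpers.
const idDelimiter = "|"

// encodeResourceIdentifier joins ID parts (often fully qualified names)
// into a single pipe-delimited resource ID.
func encodeResourceIdentifier(parts ...string) string {
	return strings.Join(parts, idDelimiter)
}

// parseResourceIdentifier splits a pipe-delimited resource ID back into its parts.
func parseResourceIdentifier(id string) []string {
	return strings.Split(id, idDelimiter)
}

func main() {
	// As in CreateUserPasswordPolicyAttachment: user name and policy name in one ID.
	id := encodeResourceIdentifier(`"USER1"`, `"DB"."SCHEMA"."POLICY"`)
	fmt.Println(id)
	fmt.Println(len(parseResourceIdentifier(id)))
}
```

The part-count check after parsing (e.g. `len(parts) != 2` in `ReadUserPasswordPolicyAttachment`) is what turns a malformed ID into a readable error instead of an out-of-range panic.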
diff --git a/pkg/schemas/database_gen.go b/pkg/schemas/database_gen.go index f4a5596b14..5ad77e4895 100644 --- a/pkg/schemas/database_gen.go +++ b/pkg/schemas/database_gen.go @@ -75,7 +75,9 @@ func DatabaseToSchema(database *sdk.Database) map[string]any { databaseSchema["name"] = database.Name databaseSchema["is_default"] = database.IsDefault databaseSchema["is_current"] = database.IsCurrent - databaseSchema["origin"] = database.Origin + if database.Origin != nil { + databaseSchema["origin"] = database.Origin.FullyQualifiedName() + } databaseSchema["owner"] = database.Owner databaseSchema["comment"] = database.Comment databaseSchema["options"] = database.Options diff --git a/pkg/sdk/client_integration_test.go b/pkg/sdk/client_integration_test.go index c7192517c5..57522f9fae 100644 --- a/pkg/sdk/client_integration_test.go +++ b/pkg/sdk/client_integration_test.go @@ -66,26 +66,26 @@ func TestClient_NewClient(t *testing.T) { } func TestClient_ping(t *testing.T) { - client := testClient(t) + client := defaultTestClient(t) err := client.Ping() require.NoError(t, err) } func TestClient_close(t *testing.T) { - client := testClient(t) + client := defaultTestClient(t) err := client.Close() require.NoError(t, err) } func TestClient_exec(t *testing.T) { - client := testClient(t) + client := defaultTestClient(t) ctx := context.Background() _, err := client.exec(ctx, "SELECT 1") require.NoError(t, err) } func TestClient_query(t *testing.T) { - client := testClient(t) + client := defaultTestClient(t) ctx := context.Background() rows := []struct { One int `db:"ONE"` @@ -98,7 +98,7 @@ func TestClient_query(t *testing.T) { } func TestClient_queryOne(t *testing.T) { - client := testClient(t) + client := defaultTestClient(t) ctx := context.Background() row := struct { One int `db:"ONE"` diff --git a/pkg/sdk/databases.go b/pkg/sdk/databases.go index 6e00a78d68..da872eb1aa 100644 --- a/pkg/sdk/databases.go +++ b/pkg/sdk/databases.go @@ -5,6 +5,7 @@ import ( "database/sql" "errors" 
"fmt" + "log" "strconv" "strings" "time" @@ -50,7 +51,7 @@ type Database struct { Name string IsDefault bool IsCurrent bool - Origin string + Origin *ExternalObjectIdentifier Owner string Comment string Options string @@ -97,8 +98,14 @@ func (row databaseRow) convert() *Database { if row.IsCurrent.Valid { database.IsCurrent = row.IsCurrent.String == "Y" } - if row.Origin.Valid { - database.Origin = row.Origin.String + if row.Origin.Valid && row.Origin.String != "" { + originId, err := ParseExternalObjectIdentifier(row.Origin.String) + if err != nil { + // TODO(SNOW-1561641): Return error + log.Printf("[DEBUG] unable to parse origin ID: %v", row.Origin.String) + } else { + database.Origin = &originId + } } if row.Owner.Valid { database.Owner = row.Owner.String diff --git a/pkg/sdk/helper_test.go b/pkg/sdk/helper_test.go index adfe806f45..fd82cb3665 100644 --- a/pkg/sdk/helper_test.go +++ b/pkg/sdk/helper_test.go @@ -6,7 +6,7 @@ import ( "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/acceptance/testprofiles" ) -func testClient(t *testing.T) *Client { +func defaultTestClient(t *testing.T) *Client { t.Helper() client, err := NewDefaultClient() @@ -17,16 +17,31 @@ func testClient(t *testing.T) *Client { return client } -func testSecondaryClient(t *testing.T) *Client { +func secondaryTestClient(t *testing.T) *Client { + t.Helper() + return testClient(t, testprofiles.Secondary) +} + +func thirdTestClient(t *testing.T) *Client { + t.Helper() + return testClient(t, testprofiles.Third) +} + +func fourthTestClient(t *testing.T) *Client { + t.Helper() + return testClient(t, testprofiles.Fourth) +} + +func testClient(t *testing.T, profile string) *Client { t.Helper() - config, err := ProfileConfig(testprofiles.Secondary) + config, err := ProfileConfig(profile) if err != nil { - t.Skipf("Snowflake secondary account not configured. 
Must be set in ~./snowflake/config.yml with profile name: %s", testprofiles.Secondary) + t.Skipf("Snowflake %s profile not configured. Must be set in ~/.snowflake/config.yml", profile) } client, err := NewClient(config) if err != nil { - t.Skipf("Snowflake secondary account not configured. Must be set in ~./snowflake/config.yml with profile name: %s", testprofiles.Secondary) + t.Skipf("Snowflake %s profile not configured. Must be set in ~/.snowflake/config.yml", profile) } return client diff --git a/pkg/sdk/identifier_helpers.go b/pkg/sdk/identifier_helpers.go index dde7b86686..c7632347da 100644 --- a/pkg/sdk/identifier_helpers.go +++ b/pkg/sdk/identifier_helpers.go @@ -100,6 +100,10 @@ func NewExternalObjectIdentifierFromFullyQualifiedName(fullyQualifiedName string } } +func (i ExternalObjectIdentifier) AccountIdentifier() AccountIdentifier { + return i.accountIdentifier +} + func (i ExternalObjectIdentifier) Name() string { return i.objectIdentifier.Name() } @@ -137,6 +141,14 @@ func NewAccountIdentifierFromFullyQualifiedName(fullyQualifiedName string) Accou return NewAccountIdentifier(organizationName, accountName) } +func (i AccountIdentifier) OrganizationName() string { + return i.organizationName +} + +func (i AccountIdentifier) AccountName() string { + return i.accountName +} + func (i AccountIdentifier) Name() string { if i.organizationName != "" && i.accountName != "" { return fmt.Sprintf("%s.%s", i.organizationName, i.accountName) diff --git a/pkg/sdk/identifier_parsers.go b/pkg/sdk/identifier_parsers.go new file mode 100644 index 0000000000..584761b0be --- /dev/null +++ b/pkg/sdk/identifier_parsers.go @@ -0,0 +1,142 @@ +package sdk + +import ( + "encoding/csv" + "fmt" + "strings" +) + +const IdDelimiter = '.'
+ +// TODO(SNOW-1495053): Temporarily exported, make as private +func ParseIdentifierStringWithOpts(identifier string, opts func(*csv.Reader)) ([]string, error) { + reader := csv.NewReader(strings.NewReader(identifier)) + if opts != nil { + opts(reader) + } + lines, err := reader.ReadAll() + if err != nil { + return nil, fmt.Errorf("unable to read identifier: %s, err = %w", identifier, err) + } + if len(lines) != 1 { + return nil, fmt.Errorf("incompatible identifier: %s", identifier) + } + return lines[0], nil +} + +// TODO(SNOW-1495053): Temporarily exported, make as private +func ParseIdentifierString(identifier string) ([]string, error) { + parts, err := ParseIdentifierStringWithOpts(identifier, func(r *csv.Reader) { + r.Comma = IdDelimiter + }) + if err != nil { + return nil, err + } + for _, part := range parts { + // TODO(SNOW-1571674): Remove the validation + if strings.Contains(part, `"`) { + return nil, fmt.Errorf(`unable to parse identifier: %s, currently identifiers containing double quotes are not supported in the provider`, identifier) + } + } + return parts, nil +} + +func parseIdentifier[T ObjectIdentifier](identifier string, expectedParts int, expectedFormat string, constructFromParts func(parts []string) T) (T, error) { + var emptyIdentifier T + parts, err := ParseIdentifierString(identifier) + if err != nil { + return emptyIdentifier, err + } + if len(parts) != expectedParts { + return emptyIdentifier, fmt.Errorf(`unexpected number of parts %[1]d in identifier %[2]s, expected %[3]d in a form of "%[4]s"`, len(parts), identifier, expectedParts, expectedFormat) + } + return constructFromParts(parts), nil +} + +func ParseAccountObjectIdentifier(identifier string) (AccountObjectIdentifier, error) { + return parseIdentifier[AccountObjectIdentifier]( + identifier, 1, "", + func(parts []string) AccountObjectIdentifier { + return NewAccountObjectIdentifier(parts[0]) + }, + ) +} + +// TODO(SNOW-1495053): Replace ParseObjectIdentifier +// 
ParseObjectIdentifierString tries to guess the identifier by the number of parts it contains. +// Because of the overlapping, in some cases, the output ObjectIdentifier can be one of the following implementations: +// - AccountObjectIdentifier for one part +// - DatabaseObjectIdentifier for two parts +// - SchemaObjectIdentifier for three parts (overlaps with ExternalObjectIdentifier) +// - TableColumnIdentifier for four parts +func ParseObjectIdentifierString(identifier string) (ObjectIdentifier, error) { + parts, err := ParseIdentifierString(identifier) + if err != nil { + return nil, err + } + switch len(parts) { + case 1: + return NewAccountObjectIdentifier(parts[0]), nil + case 2: + return NewDatabaseObjectIdentifier(parts[0], parts[1]), nil + case 3: + return NewSchemaObjectIdentifier(parts[0], parts[1], parts[2]), nil + case 4: + return NewTableColumnIdentifier(parts[0], parts[1], parts[2], parts[3]), nil + default: + return nil, fmt.Errorf("unsupported identifier: %[1]s (number of parts: %[2]d)", identifier, len(parts)) + } +} + +func ParseDatabaseObjectIdentifier(identifier string) (DatabaseObjectIdentifier, error) { + return parseIdentifier[DatabaseObjectIdentifier]( + identifier, 2, ".", + func(parts []string) DatabaseObjectIdentifier { + return NewDatabaseObjectIdentifier(parts[0], parts[1]) + }, + ) +} + +func ParseSchemaObjectIdentifier(identifier string) (SchemaObjectIdentifier, error) { + return parseIdentifier[SchemaObjectIdentifier]( + identifier, 3, "..", + func(parts []string) SchemaObjectIdentifier { + return NewSchemaObjectIdentifier(parts[0], parts[1], parts[2]) + }, + ) +} + +func ParseTableColumnIdentifier(identifier string) (TableColumnIdentifier, error) { + return parseIdentifier[TableColumnIdentifier]( + identifier, 4, "...", + func(parts []string) TableColumnIdentifier { + return NewTableColumnIdentifier(parts[0], parts[1], parts[2], parts[3]) + }, + ) +} + +// ParseAccountIdentifier is implemented with an assumption that the 
recommended format is used that contains two parts, +// organization name and account name. +func ParseAccountIdentifier(identifier string) (AccountIdentifier, error) { + return parseIdentifier[AccountIdentifier]( + identifier, 2, ".", + func(parts []string) AccountIdentifier { + return NewAccountIdentifier(parts[0], parts[1]) + }, + ) +} + +// ParseExternalObjectIdentifier is implemented with an assumption that the identifier consists of three parts, because: +// - After identifier rework, we expect account identifiers to always have two parts ".". +// - So far, the only external things that we referred to with external identifiers had only one part (not including the account identifier), +// meaning it will always be represented as sdk.AccountObjectIdentifier. Documentation also doesn't describe any case where +// account identifier would be used as part of the identifier that would refer to the "lower level" object. +// Reference: https://docs.snowflake.com/en/user-guide/admin-account-identifier#where-are-account-identifiers-used. 
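The "guess the type by the number of parts" dispatch that `ParseObjectIdentifierString` performs can be sketched standalone. Quote handling is omitted for brevity (the real parser first splits with the quote-aware `ParseIdentifierString`), and `identifierKind` is an illustrative name, not a provider API:

```go
package main

import (
	"fmt"
	"strings"
)

// identifierKind mirrors the switch in ParseObjectIdentifierString: the
// concrete identifier type is chosen purely from the number of
// dot-separated parts. Quoting is ignored here for brevity.
func identifierKind(identifier string) (string, error) {
	parts := strings.Split(identifier, ".")
	switch len(parts) {
	case 1:
		return "AccountObjectIdentifier", nil
	case 2:
		return "DatabaseObjectIdentifier", nil
	case 3:
		// Ambiguous: three parts could also be an ExternalObjectIdentifier
		// (<org>.<account>.<object>); the guessing parser picks schema objects.
		return "SchemaObjectIdentifier", nil
	case 4:
		return "TableColumnIdentifier", nil
	default:
		return "", fmt.Errorf("unsupported identifier: %s (number of parts: %d)", identifier, len(parts))
	}
}

func main() {
	kind, _ := identifierKind("db.schema.table")
	fmt.Println(kind) // SchemaObjectIdentifier
}
```

This is why the dedicated `ParseXIdentifier` functions exist alongside the guessing variant: when the caller knows the expected shape, the exact part count can be validated instead of inferred.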
+func ParseExternalObjectIdentifier(identifier string) (ExternalObjectIdentifier, error) { + return parseIdentifier[ExternalObjectIdentifier]( + identifier, 3, "..", + func(parts []string) ExternalObjectIdentifier { + return NewExternalObjectIdentifier(NewAccountIdentifier(parts[0], parts[1]), NewAccountObjectIdentifier(parts[2])) + }, + ) +} diff --git a/pkg/sdk/identifier_parsers_test.go b/pkg/sdk/identifier_parsers_test.go new file mode 100644 index 0000000000..de86bbc9dc --- /dev/null +++ b/pkg/sdk/identifier_parsers_test.go @@ -0,0 +1,252 @@ +package sdk + +import ( + "fmt" + "testing" + + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" +) + +func Test_ParseIdentifierString(t *testing.T) { + containsAll := func(t *testing.T, parts, expectedParts []string) { + t.Helper() + require.Len(t, parts, len(expectedParts)) + for _, part := range expectedParts { + require.Contains(t, parts, part) + } + } + + t.Run("returns read error", func(t *testing.T) { + input := `ab"c` + + _, err := ParseIdentifierString(input) + + require.ErrorContains(t, err, "unable to read identifier") + require.ErrorContains(t, err, `bare " in non-quoted-field`) + }) + + t.Run("returns error for empty input", func(t *testing.T) { + input := "" + + _, err := ParseIdentifierString(input) + + require.ErrorContains(t, err, "incompatible identifier") + }) + + t.Run("returns error for multiple lines", func(t *testing.T) { + input := "abc\ndef" + + _, err := ParseIdentifierString(input) + + require.ErrorContains(t, err, "incompatible identifier") + }) + + t.Run("returns parts correctly without quoting", func(t *testing.T) { + input := "abc.def" + expected := []string{"abc", "def"} + + parts, err := ParseIdentifierString(input) + + require.NoError(t, err) + containsAll(t, parts, expected) + }) + + t.Run("returns parts correctly with quoting", func(t *testing.T) { + input := `"abc"."def"` + expected := []string{"abc", "def"} + + parts, err := 
ParseIdentifierString(input) + + require.NoError(t, err) + containsAll(t, parts, expected) + }) + + t.Run("returns parts correctly with mixed quoting", func(t *testing.T) { + input := `"abc".def."ghi"` + expected := []string{"abc", "def", "ghi"} + + parts, err := ParseIdentifierString(input) + + require.NoError(t, err) + containsAll(t, parts, expected) + }) + + // Quote inside must have a preceding quote (https://docs.snowflake.com/en/sql-reference/identifiers-syntax). + t.Run("returns parts correctly with quote inside", func(t *testing.T) { + input := `"ab""c".def` + _, err := ParseIdentifierString(input) + + require.ErrorContains(t, err, `unable to parse identifier: "ab""c".def, currently identifiers containing double quotes are not supported in the provider`) + }) + + t.Run("returns parts correctly with dots inside", func(t *testing.T) { + input := `"ab.c".def` + expected := []string{`ab.c`, "def"} + + parts, err := ParseIdentifierString(input) + + require.NoError(t, err) + containsAll(t, parts, expected) + }) + + t.Run("empty identifier", func(t *testing.T) { + input := `""` + expected := []string{""} + + parts, err := ParseIdentifierString(input) + + require.NoError(t, err) + containsAll(t, parts, expected) + }) + + t.Run("handled correctly double quotes", func(t *testing.T) { + input := `""."."".".".""."".""."".""."".""."""""` + + _, err := ParseIdentifierString(input) + require.ErrorContains(t, err, `unable to parse identifier: "".".""."."."".""."".""."".""."".""""", currently identifiers containing double quotes are not supported in the provider`) + }) +} + +func Test_IdentifierParsers(t *testing.T) { + testCases := []struct { + IdentifierType string + Input string + Expected ObjectIdentifier + Error string + }{ + {IdentifierType: "AccountObjectIdentifier", Input: ``, Error: "incompatible identifier: "}, + {IdentifierType: "AccountObjectIdentifier", Input: "a\nb", Error: "incompatible identifier: a\nb"}, + {IdentifierType: "AccountObjectIdentifier", Input: 
`a"b`, Error: "unable to read identifier: a\"b, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {IdentifierType: "AccountObjectIdentifier", Input: `abc.cde`, Error: `unexpected number of parts 2 in identifier abc.cde, expected 1 in a form of ""`}, + {IdentifierType: "AccountObjectIdentifier", Input: `""""`, Error: `unable to parse identifier: """", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "AccountObjectIdentifier", Input: `"a""bc"`, Error: `unable to parse identifier: "a""bc", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "AccountObjectIdentifier", Input: `""`, Expected: NewAccountObjectIdentifier(``)}, + {IdentifierType: "AccountObjectIdentifier", Input: `abc`, Expected: NewAccountObjectIdentifier(`abc`)}, + {IdentifierType: "AccountObjectIdentifier", Input: `"abc"`, Expected: NewAccountObjectIdentifier(`abc`)}, + {IdentifierType: "AccountObjectIdentifier", Input: `"ab.c"`, Expected: NewAccountObjectIdentifier(`ab.c`)}, + + {IdentifierType: "DatabaseObjectIdentifier", Input: ``, Error: "incompatible identifier: "}, + {IdentifierType: "DatabaseObjectIdentifier", Input: "a\nb.cde", Error: "unable to read identifier: a\nb.cde, err = record on line 2: wrong number of fields"}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `a"b.cde`, Error: "unable to read identifier: a\"b.cde, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `abc.cde.efg`, Error: `unexpected number of parts 3 in identifier abc.cde.efg, expected 2 in a form of "."`}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `abc`, Error: `unexpected number of parts 1 in identifier abc, expected 2 in a form of "."`}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `"""".""""`, Error: `unable to parse identifier: """"."""", currently identifiers containing double quotes are not 
supported in the provider`}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `"a""bc"."cd""e"`, Error: `unable to parse identifier: "a""bc"."cd""e", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `"".""`, Expected: NewDatabaseObjectIdentifier(``, ``)}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `abc.cde`, Expected: NewDatabaseObjectIdentifier(`abc`, `cde`)}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `"abc"."cde"`, Expected: NewDatabaseObjectIdentifier(`abc`, `cde`)}, + {IdentifierType: "DatabaseObjectIdentifier", Input: `"ab.c"."cd.e"`, Expected: NewDatabaseObjectIdentifier(`ab.c`, `cd.e`)}, + + {IdentifierType: "SchemaObjectIdentifier", Input: ``, Error: "incompatible identifier: "}, + {IdentifierType: "SchemaObjectIdentifier", Input: "a\nb.cde.efg", Error: "unable to read identifier: a\nb.cde.efg, err = record on line 2: wrong number of fields"}, + {IdentifierType: "SchemaObjectIdentifier", Input: `a"b.cde.efg`, Error: "unable to read identifier: a\"b.cde.efg, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {IdentifierType: "SchemaObjectIdentifier", Input: `abc.cde.efg.ghi`, Error: `unexpected number of parts 4 in identifier abc.cde.efg.ghi, expected 3 in a form of ".."`}, + {IdentifierType: "SchemaObjectIdentifier", Input: `abc.cde`, Error: `unexpected number of parts 2 in identifier abc.cde, expected 3 in a form of ".."`}, + {IdentifierType: "SchemaObjectIdentifier", Input: `""""."""".""""`, Error: `unable to parse identifier: """".""""."""", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "SchemaObjectIdentifier", Input: `"a""bc"."cd""e"."ef""g"`, Error: `unable to parse identifier: "a""bc"."cd""e"."ef""g", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "SchemaObjectIdentifier", Input: `""."".""`, Expected: 
NewSchemaObjectIdentifier(``, ``, ``)}, + {IdentifierType: "SchemaObjectIdentifier", Input: `abc.cde.efg`, Expected: NewSchemaObjectIdentifier(`abc`, `cde`, `efg`)}, + {IdentifierType: "SchemaObjectIdentifier", Input: `"abc"."cde"."efg"`, Expected: NewSchemaObjectIdentifier(`abc`, `cde`, `efg`)}, + {IdentifierType: "SchemaObjectIdentifier", Input: `"ab.c"."cd.e"."ef.g"`, Expected: NewSchemaObjectIdentifier(`ab.c`, `cd.e`, `ef.g`)}, + + {IdentifierType: "TableColumnIdentifier", Input: ``, Error: "incompatible identifier: "}, + {IdentifierType: "TableColumnIdentifier", Input: "a\nb.cde.efg.ghi", Error: "unable to read identifier: a\nb.cde.efg.ghi, err = record on line 2: wrong number of fields"}, + {IdentifierType: "TableColumnIdentifier", Input: `a"b.cde.efg.ghi`, Error: "unable to read identifier: a\"b.cde.efg.ghi, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {IdentifierType: "TableColumnIdentifier", Input: `abc.cde.efg.ghi.ijk`, Error: `unexpected number of parts 5 in identifier abc.cde.efg.ghi.ijk, expected 4 in a form of "..."`}, + {IdentifierType: "TableColumnIdentifier", Input: `abc.cde`, Error: `unexpected number of parts 2 in identifier abc.cde, expected 4 in a form of "..."`}, + {IdentifierType: "TableColumnIdentifier", Input: `"""".""""."""".""""`, Error: `unable to parse identifier: """"."""".""""."""", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "TableColumnIdentifier", Input: `"a""bc"."cd""e"."ef""g"."gh""i"`, Error: `unable to parse identifier: "a""bc"."cd""e"."ef""g"."gh""i", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "TableColumnIdentifier", Input: `"".""."".""`, Expected: NewTableColumnIdentifier(``, ``, ``, ``)}, + {IdentifierType: "TableColumnIdentifier", Input: `abc.cde.efg.ghi`, Expected: NewTableColumnIdentifier(`abc`, `cde`, `efg`, `ghi`)}, + {IdentifierType: "TableColumnIdentifier", Input: 
`"abc"."cde"."efg"."ghi"`, Expected: NewTableColumnIdentifier(`abc`, `cde`, `efg`, `ghi`)}, + {IdentifierType: "TableColumnIdentifier", Input: `"ab.c"."cd.e"."ef.g"."gh.i"`, Expected: NewTableColumnIdentifier(`ab.c`, `cd.e`, `ef.g`, `gh.i`)}, + + {IdentifierType: "AccountIdentifier", Input: ``, Error: "incompatible identifier: "}, + {IdentifierType: "AccountIdentifier", Input: "a\nb.cde", Error: "unable to read identifier: a\nb.cde, err = record on line 2: wrong number of fields"}, + {IdentifierType: "AccountIdentifier", Input: `a"b.cde`, Error: "unable to read identifier: a\"b.cde, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {IdentifierType: "AccountIdentifier", Input: `abc.cde.efg`, Error: `unexpected number of parts 3 in identifier abc.cde.efg, expected 2 in a form of "."`}, + {IdentifierType: "AccountIdentifier", Input: `abc`, Error: `unexpected number of parts 1 in identifier abc, expected 2 in a form of "."`}, + {IdentifierType: "AccountIdentifier", Input: `"""".""""`, Error: `unable to parse identifier: """"."""", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "AccountIdentifier", Input: `"a""bc"."cd""e"`, Error: `unable to parse identifier: "a""bc"."cd""e", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "AccountIdentifier", Input: `"".""`, Expected: NewAccountIdentifier(``, ``)}, + {IdentifierType: "AccountIdentifier", Input: `abc.cde`, Expected: NewAccountIdentifier(`abc`, `cde`)}, + {IdentifierType: "AccountIdentifier", Input: `"abc"."cde"`, Expected: NewAccountIdentifier(`abc`, `cde`)}, + {IdentifierType: "AccountIdentifier", Input: `"ab.c"."cd.e"`, Expected: NewAccountIdentifier(`ab.c`, `cd.e`)}, + + {IdentifierType: "ExternalObjectIdentifier", Input: ``, Error: "incompatible identifier: "}, + {IdentifierType: "ExternalObjectIdentifier", Input: "a\nb.cde.efg", Error: "unable to read identifier: a\nb.cde.efg, err = 
record on line 2: wrong number of fields"}, + {IdentifierType: "ExternalObjectIdentifier", Input: `a"b.cde.efg`, Error: "unable to read identifier: a\"b.cde.efg, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {IdentifierType: "ExternalObjectIdentifier", Input: `abc.cde.efg.ghi`, Error: `unexpected number of parts 4 in identifier abc.cde.efg.ghi, expected 3 in a form of ".."`}, + {IdentifierType: "ExternalObjectIdentifier", Input: `abc.cde`, Error: `unexpected number of parts 2 in identifier abc.cde, expected 3 in a form of ".."`}, + {IdentifierType: "ExternalObjectIdentifier", Input: `""""."""".""""`, Error: `unable to parse identifier: """".""""."""", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "ExternalObjectIdentifier", Input: `"a""bc"."cd""e"."ef""g"`, Error: `unable to parse identifier: "a""bc"."cd""e"."ef""g", currently identifiers containing double quotes are not supported in the provider`}, + {IdentifierType: "ExternalObjectIdentifier", Input: `""."".""`, Expected: NewExternalObjectIdentifier(NewAccountIdentifier(``, ``), NewAccountObjectIdentifier(``))}, + {IdentifierType: "ExternalObjectIdentifier", Input: `abc.cde.efg`, Expected: NewExternalObjectIdentifier(NewAccountIdentifier(`abc`, `cde`), NewAccountObjectIdentifier(`efg`))}, + {IdentifierType: "ExternalObjectIdentifier", Input: `"abc"."cde"."efg"`, Expected: NewExternalObjectIdentifier(NewAccountIdentifier(`abc`, `cde`), NewAccountObjectIdentifier(`efg`))}, + {IdentifierType: "ExternalObjectIdentifier", Input: `"ab.c"."cd.e"."ef.g"`, Expected: NewExternalObjectIdentifier(NewAccountIdentifier(`ab.c`, `cd.e`), NewAccountObjectIdentifier(`ef.g`))}, + } + + for _, testCase := range testCases { + t.Run(fmt.Sprintf(`Parsing %s with input: "%s"`, testCase.IdentifierType, testCase.Input), func(t *testing.T) { + var id ObjectIdentifier + var err error + + switch testCase.IdentifierType { + case "AccountObjectIdentifier": + id, 
err = ParseAccountObjectIdentifier(testCase.Input) + case "DatabaseObjectIdentifier": + id, err = ParseDatabaseObjectIdentifier(testCase.Input) + case "SchemaObjectIdentifier": + id, err = ParseSchemaObjectIdentifier(testCase.Input) + case "TableColumnIdentifier": + id, err = ParseTableColumnIdentifier(testCase.Input) + case "AccountIdentifier": + id, err = ParseAccountIdentifier(testCase.Input) + case "ExternalObjectIdentifier": + id, err = ParseExternalObjectIdentifier(testCase.Input) + default: + t.Fatalf("unknown identifier type: %s", testCase.IdentifierType) + } + + if testCase.Error != "" { + assert.ErrorContains(t, err, testCase.Error) + } else { + assert.Equal(t, testCase.Expected, id) + assert.NoError(t, err) + } + }) + } +} + +func Test_ParseObjectIdentifierString(t *testing.T) { + testCases := []struct { + Input string + Expected ObjectIdentifier + Error string + }{ + {Input: `to.many.parts.for.identifier`, Error: "unsupported identifier: to.many.parts.for.identifier (number of parts: 5)"}, + {Input: "a\nb.cde.efg", Error: "unable to read identifier: a\nb.cde.efg, err = record on line 2: wrong number of fields"}, + {Input: `a"b.cde.efg`, Error: "unable to read identifier: a\"b.cde.efg, err = parse error on line 1, column 2: bare \" in non-quoted-field"}, + {Input: ``, Error: "incompatible identifier: "}, + {Input: `abc`, Expected: NewAccountObjectIdentifier(`abc`)}, + {Input: `abc.def`, Expected: NewDatabaseObjectIdentifier(`abc`, `def`)}, + {Input: `abc.def.ghi`, Expected: NewSchemaObjectIdentifier(`abc`, `def`, `ghi`)}, + {Input: `abc."d.e.f".ghi`, Expected: NewSchemaObjectIdentifier(`abc`, `d.e.f`, `ghi`)}, + {Input: `abc."d""e""f".ghi`, Expected: NewSchemaObjectIdentifier(`abc`, `d"e"f`, `ghi`), Error: `unable to parse identifier: abc."d""e""f".ghi, currently identifiers containing double quotes are not supported in the provider`}, + {Input: `abc.def.ghi.jkl`, Expected: NewTableColumnIdentifier(`abc`, `def`, `ghi`, `jkl`)}, + } + + for _, testCase := 
range testCases { + t.Run(fmt.Sprintf("ParseObjectIdentifierString for input %s", testCase.Input), func(t *testing.T) { + id, err := ParseObjectIdentifierString(testCase.Input) + + if testCase.Error != "" { + assert.ErrorContains(t, err, testCase.Error) + } else { + assert.Equal(t, testCase.Expected, id) + assert.NoError(t, err) + } + }) + } +} diff --git a/pkg/sdk/replication_functions.go b/pkg/sdk/replication_functions.go index 23bb07115f..c23aa974d0 100644 --- a/pkg/sdk/replication_functions.go +++ b/pkg/sdk/replication_functions.go @@ -4,6 +4,7 @@ import ( "context" "database/sql" "errors" + "log" "time" ) @@ -83,7 +84,7 @@ type ReplicationDatabase struct { Name string Comment string IsPrimary bool - PrimaryDatabase string + PrimaryDatabase *ExternalObjectIdentifier ReplicationAllowedToAccounts string FailoverAllowedToAccounts string OrganizationName string @@ -97,10 +98,17 @@ func (row replicationDatabaseRow) convert() *ReplicationDatabase { AccountName: row.AccountName, Name: row.Name, IsPrimary: row.IsPrimary, - PrimaryDatabase: row.PrimaryDatabase, OrganizationName: row.OrganizationName, AccountLocator: row.AccountLocator, } + if row.PrimaryDatabase != "" { + primaryDatabaseId, err := ParseExternalObjectIdentifier(row.PrimaryDatabase) + if err != nil { + log.Printf("unable to parse primary database identifier: %v, err = %s", row.PrimaryDatabase, err) + } else { + db.PrimaryDatabase = &primaryDatabaseId + } + } if row.RegionGroup.Valid { db.RegionGroup = row.RegionGroup.String } diff --git a/pkg/sdk/sweepers_test.go b/pkg/sdk/sweepers_test.go index 354711780c..e50031cd74 100644 --- a/pkg/sdk/sweepers_test.go +++ b/pkg/sdk/sweepers_test.go @@ -2,8 +2,10 @@ package sdk import ( "context" + "errors" "fmt" "log" + "slices" "testing" "time" @@ -17,8 +19,8 @@ func TestSweepAll(t *testing.T) { testenvs.AssertEnvSet(t, string(testenvs.TestObjectsSuffix)) t.Run("sweep after tests", func(t *testing.T) { - client := testClient(t) - secondaryClient := 
testSecondaryClient(t) + client := defaultTestClient(t) + secondaryClient := secondaryTestClient(t) err := SweepAfterIntegrationTests(client, random.IntegrationTestsSuffix) assert.NoError(t, err) @@ -37,38 +39,66 @@ func TestSweepAll(t *testing.T) { func Test_Sweeper_NukeStaleObjects(t *testing.T) { _ = testenvs.GetOrSkipTest(t, testenvs.EnableSweep) - t.Run("sweep integration test precreated objects", func(t *testing.T) { - client := testClient(t) - secondaryClient := testSecondaryClient(t) + client := defaultTestClient(t) + secondaryClient := secondaryTestClient(t) + thirdClient := thirdTestClient(t) + fourthClient := fourthTestClient(t) - err := nukeWarehouses(client, "int_test_wh_%")() - assert.NoError(t, err) + allClients := []*Client{client, secondaryClient, thirdClient, fourthClient} - err = nukeWarehouses(secondaryClient, "int_test_wh_%")() - assert.NoError(t, err) + // can't use extracted IntegrationTestPrefix and AcceptanceTestPrefix until sweepers reside in the SDK package (cyclic) + const integrationTestPrefix = "int_test_" + const acceptanceTestPrefix = "acc_test_" - err = nukeDatabases(client, "int_test_db_%")() - assert.NoError(t, err) + t.Run("sweep integration test precreated objects", func(t *testing.T) { + integrationTestWarehousesPrefix := fmt.Sprintf("%swh_%%", integrationTestPrefix) + integrationTestDatabasesPrefix := fmt.Sprintf("%sdb_%%", integrationTestPrefix) - err = nukeDatabases(secondaryClient, "int_test_db_%")() - assert.NoError(t, err) + for _, c := range allClients { + err := nukeWarehouses(c, integrationTestWarehousesPrefix)() + assert.NoError(t, err) + + err = nukeDatabases(c, integrationTestDatabasesPrefix)() + assert.NoError(t, err) + } }) t.Run("sweep acceptance tests precreated objects", func(t *testing.T) { - client := testClient(t) - secondaryClient := testSecondaryClient(t) + acceptanceTestWarehousesPrefix := fmt.Sprintf("%swh_%%", acceptanceTestPrefix) + acceptanceTestDatabasesPrefix := fmt.Sprintf("%sdb_%%", 
acceptanceTestPrefix) - err := nukeWarehouses(client, "acc_test_wh_%")() - assert.NoError(t, err) + for _, c := range allClients { + err := nukeWarehouses(c, acceptanceTestWarehousesPrefix)() + assert.NoError(t, err) - err = nukeWarehouses(secondaryClient, "acc_test_wh_%")() - assert.NoError(t, err) + err = nukeDatabases(c, acceptanceTestDatabasesPrefix)() + assert.NoError(t, err) + } + }) - err = nukeDatabases(client, "acc_test_db_%")() - assert.NoError(t, err) + t.Run("sweep users", func(t *testing.T) { + for _, c := range allClients { + err := nukeUsers(c)() + assert.NoError(t, err) + } + }) - err = nukeDatabases(secondaryClient, "acc_test_db_%")() - assert.NoError(t, err) + // TODO [SNOW-955520]: + t.Run("sweep databases", func(t *testing.T) { + t.Skipf("Used for manual sweeping; will be addressed during SNOW-955520") + for _, c := range allClients { + err := nukeDatabases(c, "")() + assert.NoError(t, err) + } + }) + + // TODO [SNOW-955520]: + t.Run("sweep warehouses", func(t *testing.T) { + t.Skipf("Used for manual sweeping; will be addressed during SNOW-955520") + for _, c := range allClients { + err := nukeWarehouses(c, "")() + assert.NoError(t, err) + } }) // TODO [SNOW-955520]: nuke stale objects (e.g. 
created more than 2 weeks ago) @@ -76,51 +106,113 @@ func Test_Sweeper_NukeStaleObjects(t *testing.T) { // TODO [SNOW-955520]: generalize nuke methods (sweepers too) func nukeWarehouses(client *Client, prefix string) func() error { + protectedWarehouses := []string{ + "SNOWFLAKE", + "SYSTEM$STREAMLIT_NOTEBOOK_WH", + } + return func() error { log.Printf("[DEBUG] Nuking warehouses with prefix %s\n", prefix) ctx := context.Background() - whs, err := client.Warehouses.Show(ctx, &ShowWarehouseOptions{Like: &Like{Pattern: String(prefix)}}) + var like *Like = nil + if prefix != "" { + like = &Like{Pattern: String(prefix)} + } + + whs, err := client.Warehouses.Show(ctx, &ShowWarehouseOptions{Like: like}) if err != nil { return fmt.Errorf("sweeping warehouses ended with error, err = %w", err) } + var errs []error log.Printf("[DEBUG] Found %d warehouses matching search criteria\n", len(whs)) for idx, wh := range whs { log.Printf("[DEBUG] Processing warehouse [%d/%d]: %s...\n", idx+1, len(whs), wh.ID().FullyQualifiedName()) - if wh.Name != "SNOWFLAKE" && wh.CreatedOn.Before(time.Now().Add(-4*time.Hour)) { + if !slices.Contains(protectedWarehouses, wh.Name) && wh.CreatedOn.Before(time.Now().Add(-2*time.Hour)) { log.Printf("[DEBUG] Dropping warehouse %s, created at: %s\n", wh.ID().FullyQualifiedName(), wh.CreatedOn.String()) if err := client.Warehouses.Drop(ctx, wh.ID(), &DropWarehouseOptions{IfExists: Bool(true)}); err != nil { - return fmt.Errorf("sweeping warehouse %s ended with error, err = %w", wh.ID().FullyQualifiedName(), err) + log.Printf("[DEBUG] Dropping warehouse %s, resulted in error %v\n", wh.ID().FullyQualifiedName(), err) + errs = append(errs, fmt.Errorf("sweeping warehouse %s ended with error, err = %w", wh.ID().FullyQualifiedName(), err)) } } else { log.Printf("[DEBUG] Skipping warehouse %s, created at: %s\n", wh.ID().FullyQualifiedName(), wh.CreatedOn.String()) } } - return nil + return errors.Join(errs...) 
} } func nukeDatabases(client *Client, prefix string) func() error { + protectedDatabases := []string{ + "SNOWFLAKE", + "MFA_ENFORCEMENT_POLICY", + } + return func() error { log.Printf("[DEBUG] Nuking databases with prefix %s\n", prefix) ctx := context.Background() - dbs, err := client.Databases.Show(ctx, &ShowDatabasesOptions{Like: &Like{Pattern: String(prefix)}}) + var like *Like = nil + if prefix != "" { + like = &Like{Pattern: String(prefix)} + } + dbs, err := client.Databases.Show(ctx, &ShowDatabasesOptions{Like: like}) if err != nil { return fmt.Errorf("sweeping databases ended with error, err = %w", err) } + var errs []error log.Printf("[DEBUG] Found %d databases matching search criteria\n", len(dbs)) for idx, db := range dbs { log.Printf("[DEBUG] Processing database [%d/%d]: %s...\n", idx+1, len(dbs), db.ID().FullyQualifiedName()) - if db.Name != "SNOWFLAKE" && db.CreatedOn.Before(time.Now().Add(-4*time.Hour)) { + if !slices.Contains(protectedDatabases, db.Name) && db.CreatedOn.Before(time.Now().Add(-2*time.Hour)) { log.Printf("[DEBUG] Dropping database %s, created at: %s\n", db.ID().FullyQualifiedName(), db.CreatedOn.String()) if err := client.Databases.Drop(ctx, db.ID(), &DropDatabaseOptions{IfExists: Bool(true)}); err != nil { - return fmt.Errorf("sweeping database %s ended with error, err = %w", db.ID().FullyQualifiedName(), err) + log.Printf("[DEBUG] Dropping database %s, resulted in error %v\n", db.ID().FullyQualifiedName(), err) + errs = append(errs, fmt.Errorf("sweeping database %s ended with error, err = %w", db.ID().FullyQualifiedName(), err)) } } else { log.Printf("[DEBUG] Skipping database %s, created at: %s\n", db.ID().FullyQualifiedName(), db.CreatedOn.String()) } } - return nil + return errors.Join(errs...) 
+ } +} + +func nukeUsers(client *Client) func() error { + protectedUsers := []string{ + "SNOWFLAKE", + "ARTUR_SAWICKI", + "ARTUR_SAWICKI_LEGACY", + "JAKUB_MICHALAK", + "JAKUB_MICHALAK_LEGACY", + "JAN_CIESLAK", + "JAN_CIESLAK_LEGACY", + "TERRAFORM_SVC_ACCOUNT", + "TEST_CI_SERVICE_USER", + } + + return func() error { + log.Println("[DEBUG] Nuking users") + ctx := context.Background() + + users, err := client.Users.Show(ctx, &ShowUserOptions{}) + if err != nil { + return fmt.Errorf("sweeping users ended with error, err = %w", err) + } + var errs []error + log.Printf("[DEBUG] Found %d users\n", len(users)) + for idx, user := range users { + log.Printf("[DEBUG] Processing user [%d/%d]: %s...\n", idx+1, len(users), user.ID().FullyQualifiedName()) + if !slices.Contains(protectedUsers, user.Name) && user.CreatedOn.Before(time.Now().Add(-2*time.Hour)) { + log.Printf("[DEBUG] Dropping user %s\n", user.ID().FullyQualifiedName()) + if err := client.Users.Drop(ctx, user.ID(), &DropUserOptions{IfExists: Bool(true)}); err != nil { + log.Printf("[DEBUG] Dropping user %s, resulted in error %v\n", user.ID().FullyQualifiedName(), err) + errs = append(errs, fmt.Errorf("sweeping user %s ended with error, err = %w", user.ID().FullyQualifiedName(), err)) + } + } else { + log.Printf("[DEBUG] Skipping user %s\n", user.ID().FullyQualifiedName()) + } + } + return errors.Join(errs...) 
} } diff --git a/pkg/sdk/testint/identifier_integration_test.go b/pkg/sdk/testint/identifier_integration_test.go new file mode 100644 index 0000000000..f4153a5252 --- /dev/null +++ b/pkg/sdk/testint/identifier_integration_test.go @@ -0,0 +1,163 @@ +package testint + +import ( + "context" + "fmt" + "testing" + + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk" + "github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk/internal/collections" + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" +) + +func TestInt_IdentifiersForOnePartIdentifierAsNameAndReference(t *testing.T) { + testCases := []struct { + Name sdk.AccountObjectIdentifier + ShowName string + Error string + }{ + // special cases + {Name: sdk.NewAccountObjectIdentifier(``), Error: "invalid object identifier"}, + {Name: sdk.NewAccountObjectIdentifier(`"`), Error: "invalid object identifier"}, + // This is a valid identifier, but because in NewXIdentifier functions we're trimming double quotes it won't work + {Name: sdk.NewAccountObjectIdentifier(`""`), Error: "invalid object identifier"}, + // This is a valid identifier, but because in NewXIdentifier functions we're trimming double quotes it won't work + {Name: sdk.NewAccountObjectIdentifier(`""""`), Error: "invalid object identifier"}, + {Name: sdk.NewAccountObjectIdentifier(`"."`), ShowName: `.`}, + + // lower case + {Name: sdk.NewAccountObjectIdentifier(`abc`), ShowName: `abc`}, + {Name: sdk.NewAccountObjectIdentifier(`ab.c`), ShowName: `ab.c`}, + {Name: sdk.NewAccountObjectIdentifier(`a"bc`), Error: `unexpected '"`}, + {Name: sdk.NewAccountObjectIdentifier(`"a""bc"`), ShowName: `a"bc`}, + + // upper case + {Name: sdk.NewAccountObjectIdentifier(`ABC`), ShowName: `ABC`}, + {Name: sdk.NewAccountObjectIdentifier(`AB.C`), ShowName: `AB.C`}, + {Name: sdk.NewAccountObjectIdentifier(`A"BC`), Error: `unexpected '"`}, + {Name: sdk.NewAccountObjectIdentifier(`"A""BC"`), ShowName: 
`A"BC`}, + + // mixed case + {Name: sdk.NewAccountObjectIdentifier(`AbC`), ShowName: `AbC`}, + {Name: sdk.NewAccountObjectIdentifier(`Ab.C`), ShowName: `Ab.C`}, + {Name: sdk.NewAccountObjectIdentifier(`A"bC`), Error: `unexpected '"`}, + {Name: sdk.NewAccountObjectIdentifier(`"A""bC"`), ShowName: `A"bC`}, + } + + for _, testCase := range testCases { + testCase := testCase + + t.Run(fmt.Sprintf("one part identifier name and reference for input: %s", testCase.Name.FullyQualifiedName()), func(t *testing.T) { + ctx := context.Background() + + err := testClient(t).ResourceMonitors.Create(ctx, testCase.Name, new(sdk.CreateResourceMonitorOptions)) + if testCase.Error != "" { + require.ErrorContains(t, err, testCase.Error) + } else { + t.Cleanup(testClientHelper().ResourceMonitor.DropResourceMonitorFunc(t, testCase.Name)) + } + + err = testClient(t).Warehouses.Create(ctx, testCase.Name, &sdk.CreateWarehouseOptions{ + ResourceMonitor: &testCase.Name, + }) + if testCase.Error != "" { + require.ErrorContains(t, err, testCase.Error) + } else { + require.NoError(t, err) + t.Cleanup(testClientHelper().Warehouse.DropWarehouseFunc(t, testCase.Name)) + var result struct { + Name string `db:"name"` + ResourceMonitor string `db:"resource_monitor"` + } + err = testClient(t).QueryOneForTests(ctx, &result, fmt.Sprintf("SHOW WAREHOUSES LIKE '%s'", testCase.ShowName)) + require.NoError(t, err) + + // For one part identifiers, we expect Snowflake to return unescaped identifiers (just like the ones we used for SHOW) + assert.Equal(t, testCase.ShowName, result.Name) + assert.Equal(t, testCase.ShowName, result.ResourceMonitor) + } + }) + } +} + +func TestInt_IdentifiersForTwoPartIdentifierAsReference(t *testing.T) { + type RawGrantOutput struct { + Name string `db:"name"` + Privilege string `db:"privilege"` + } + + testCases := []struct { + Name sdk.DatabaseObjectIdentifier + OverrideExpectedSnowflakeOutput string + Error string + }{ + // special cases + {Name: 
sdk.NewDatabaseObjectIdentifier(``, ``), Error: "invalid object identifier"}, + {Name: sdk.NewDatabaseObjectIdentifier(`"`, `"`), Error: "invalid object identifier"}, + // This is a valid identifier, but because in NewXIdentifier functions we're trimming double quotes it won't work + {Name: sdk.NewDatabaseObjectIdentifier(`""`, `""`), Error: "invalid object identifier"}, + // This is a valid identifier, but because in NewXIdentifier functions we're trimming double quotes it won't work + {Name: sdk.NewDatabaseObjectIdentifier(`""""`, `""""`), Error: "invalid object identifier"}, + {Name: sdk.NewDatabaseObjectIdentifier(`"."`, `"."`)}, + + // lower case + {Name: sdk.NewDatabaseObjectIdentifier(`abc`, `abc`)}, + {Name: sdk.NewDatabaseObjectIdentifier(`ab.c`, `ab.c`)}, + {Name: sdk.NewDatabaseObjectIdentifier(`a"bc`, `a"bc`), Error: `unexpected '"`}, + {Name: sdk.NewDatabaseObjectIdentifier(`"a""bc"`, `"a""bc"`)}, + + // upper case + {Name: sdk.NewDatabaseObjectIdentifier(`ABC`, `ABC`), OverrideExpectedSnowflakeOutput: `ABC.ABC`}, + {Name: sdk.NewDatabaseObjectIdentifier(`AB.C`, `AB.C`)}, + {Name: sdk.NewDatabaseObjectIdentifier(`A"BC`, `A"BC`), Error: `unexpected '"`}, + {Name: sdk.NewDatabaseObjectIdentifier(`"A""BC"`, `"A""BC"`)}, + + // mixed case + {Name: sdk.NewDatabaseObjectIdentifier(`AbC`, `AbC`)}, + {Name: sdk.NewDatabaseObjectIdentifier(`Ab.C`, `Ab.C`)}, + {Name: sdk.NewDatabaseObjectIdentifier(`A"bC`, `A"bC`), Error: `unexpected '"`}, + {Name: sdk.NewDatabaseObjectIdentifier(`"A""bC"`, `"A""bC"`)}, + } + + role, roleCleanup := testClientHelper().Role.CreateRole(t) + t.Cleanup(roleCleanup) + + for _, testCase := range testCases { + t.Run(fmt.Sprintf("two part identifier reference for input: %s", testCase.Name.FullyQualifiedName()), func(t *testing.T) { + ctx := context.Background() + + err := testClient(t).Databases.Create(ctx, testCase.Name.DatabaseId(), new(sdk.CreateDatabaseOptions)) + if testCase.Error != "" { + require.ErrorContains(t, err, 
testCase.Error) + } else { + t.Cleanup(testClientHelper().Database.DropDatabaseFunc(t, testCase.Name.DatabaseId())) + } + + err = testClient(t).Schemas.Create(ctx, testCase.Name, new(sdk.CreateSchemaOptions)) + if testCase.Error != "" { + require.ErrorContains(t, err, testCase.Error) + } else { + require.NoError(t, err) + t.Cleanup(testClientHelper().Schema.DropSchemaFunc(t, testCase.Name)) + + testClientHelper().Grant.GrantOnSchemaToAccountRole(t, testCase.Name, role.ID(), sdk.SchemaPrivilegeCreateTable) + + var grants []RawGrantOutput + err = testClient(t).QueryForTests(ctx, &grants, fmt.Sprintf("SHOW GRANTS ON SCHEMA %s", testCase.Name.FullyQualifiedName())) + require.NoError(t, err) + + createTableGrant, err := collections.FindOne(grants, func(output RawGrantOutput) bool { return output.Privilege == sdk.SchemaPrivilegeCreateTable.String() }) + require.NoError(t, err) + + // For two part identifiers, we expect Snowflake to return escaped identifiers, with the exception + // of identifiers that don't contain any lowercase characters or special symbols.
+ if testCase.OverrideExpectedSnowflakeOutput != "" { + assert.Equal(t, testCase.OverrideExpectedSnowflakeOutput, createTableGrant.Name) + } else { + assert.Equal(t, testCase.Name.FullyQualifiedName(), createTableGrant.Name) + } + } + }) + } +} diff --git a/pkg/sdk/testint/replication_functions_integration_test.go b/pkg/sdk/testint/replication_functions_integration_test.go index d9a2a05d05..d6a58dd01d 100644 --- a/pkg/sdk/testint/replication_functions_integration_test.go +++ b/pkg/sdk/testint/replication_functions_integration_test.go @@ -55,7 +55,8 @@ func TestInt_ShowReplicationDatabases(t *testing.T) { require.NotEmpty(t, rdb.SnowflakeRegion) require.NotEmpty(t, rdb.CreatedOn) require.NotEmpty(t, rdb.AccountName) - require.NotEmpty(t, rdb.PrimaryDatabase) + require.NotNil(t, rdb.PrimaryDatabase) + require.NotEmpty(t, rdb.PrimaryDatabase.FullyQualifiedName()) if expectedIsPrimary { require.NotEmpty(t, rdb.ReplicationAllowedToAccounts) require.NotEmpty(t, rdb.FailoverAllowedToAccounts) diff --git a/pkg/sdk/testint/setup_test.go b/pkg/sdk/testint/setup_test.go index 3e7425aa51..d09be1d489 100644 --- a/pkg/sdk/testint/setup_test.go +++ b/pkg/sdk/testint/setup_test.go @@ -17,10 +17,12 @@ import ( "github.com/snowflakedb/gosnowflake" ) +const IntegrationTestPrefix = "int_test_" + var ( - TestWarehouseName = "int_test_wh_" + random.IntegrationTestsSuffix - TestDatabaseName = "int_test_db_" + random.IntegrationTestsSuffix - TestSchemaName = "int_test_sc_" + random.IntegrationTestsSuffix + TestWarehouseName = fmt.Sprintf("%swh_%s", IntegrationTestPrefix, random.IntegrationTestsSuffix) + TestDatabaseName = fmt.Sprintf("%sdb_%s", IntegrationTestPrefix, random.IntegrationTestsSuffix) + TestSchemaName = fmt.Sprintf("%ssc_%s", IntegrationTestPrefix, random.IntegrationTestsSuffix) NonExistingAccountObjectIdentifier = sdk.NewAccountObjectIdentifier("does_not_exist") NonExistingDatabaseObjectIdentifier = sdk.NewDatabaseObjectIdentifier(TestDatabaseName, "does_not_exist") 
diff --git a/templates/data-sources/databases.md.tmpl b/templates/data-sources/databases.md.tmpl index 18e5fffd7a..d3ff8d9c6c 100644 --- a/templates/data-sources/databases.md.tmpl +++ b/templates/data-sources/databases.md.tmpl @@ -9,7 +9,7 @@ description: |- {{- end }} --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # {{.Name}} ({{.Type}}) diff --git a/templates/data-sources/network_policies.md.tmpl b/templates/data-sources/network_policies.md.tmpl index 18e5fffd7a..d3ff8d9c6c 100644 --- a/templates/data-sources/network_policies.md.tmpl +++ b/templates/data-sources/network_policies.md.tmpl @@ -9,7 +9,7 @@ description: |- {{- end }} --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. 
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # {{.Name}} ({{.Type}}) diff --git a/templates/data-sources/roles.md.tmpl b/templates/data-sources/roles.md.tmpl index 18e5fffd7a..60acfefd96 100644 --- a/templates/data-sources/roles.md.tmpl +++ b/templates/data-sources/roles.md.tmpl @@ -9,7 +9,7 @@ description: |- {{- end }} --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it.
# {{.Name}} ({{.Type}}) diff --git a/templates/data-sources/schemas.md.tmpl b/templates/data-sources/schemas.md.tmpl new file mode 100644 index 0000000000..0b004f8501 --- /dev/null +++ b/templates/data-sources/schemas.md.tmpl @@ -0,0 +1,24 @@ +--- +page_title: "{{.Name}} {{.Type}} - {{.ProviderName}}" +subcategory: "" +description: |- +{{ if gt (len (split .Description "")) 1 -}} +{{ index (split .Description "") 1 | plainmarkdown | trimspace | prefixlines " " }} +{{- else -}} +{{ .Description | plainmarkdown | trimspace | prefixlines " " }} +{{- end }} +--- + +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +# {{.Name}} ({{.Type}}) + +{{ .Description | trimspace }} + +{{ if .HasExample -}} +## Example Usage + +{{ tffile (printf "examples/data-sources/%s/data-source.tf" .Name)}} +{{- end }} + +{{ .SchemaMarkdown | trimspace }} diff --git a/templates/data-sources/security_integrations.md.tmpl b/templates/data-sources/security_integrations.md.tmpl index 18e5fffd7a..d3ff8d9c6c 100644 --- a/templates/data-sources/security_integrations.md.tmpl +++ b/templates/data-sources/security_integrations.md.tmpl @@ -9,7 +9,7 @@ description: |- {{- end }} --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. 
Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # {{.Name}} ({{.Type}}) diff --git a/templates/data-sources/streamlits.md.tmpl b/templates/data-sources/streamlits.md.tmpl index 9129f2f243..0b004f8501 100644 --- a/templates/data-sources/streamlits.md.tmpl +++ b/templates/data-sources/streamlits.md.tmpl @@ -9,7 +9,7 @@ description: |- {{- end }} --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. 
# {{.Name}} ({{.Type}}) diff --git a/templates/data-sources/warehouses.md.tmpl b/templates/data-sources/warehouses.md.tmpl index 18e5fffd7a..d3ff8d9c6c 100644 --- a/templates/data-sources/warehouses.md.tmpl +++ b/templates/data-sources/warehouses.md.tmpl @@ -9,7 +9,7 @@ description: |- {{- end }} --- -!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. +!> **V1 release candidate** This data source was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the data source if needed. Any errors reported will be resolved with a higher priority. We encourage checking this data source out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0920--v0930) to use it. # {{.Name}} ({{.Type}}) diff --git a/templates/resources/schema.md.tmpl b/templates/resources/schema.md.tmpl new file mode 100644 index 0000000000..536ca2d3e6 --- /dev/null +++ b/templates/resources/schema.md.tmpl @@ -0,0 +1,32 @@ +--- +page_title: "{{.Name}} {{.Type}} - {{.ProviderName}}" +subcategory: "" +description: |- +{{ if gt (len (split .Description "")) 1 -}} +{{ index (split .Description "") 1 | plainmarkdown | trimspace | prefixlines " " }} +{{- else -}} +{{ .Description | plainmarkdown | trimspace | prefixlines " " }} +{{- end }} +--- + +!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. 
We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0930--v0940) to use it. + +# {{.Name}} ({{.Type}}) + +{{ .Description | trimspace }} + +{{ if .HasExample -}} +## Example Usage + +{{ tffile (printf "examples/resources/%s/resource.tf" .Name)}} +{{- end }} + +{{ .SchemaMarkdown | trimspace }} +{{- if .HasImport }} + +## Import + +Import is supported using the following syntax: + +{{ codefile "shell" (printf "examples/resources/%s/import.sh" .Name)}} +{{- end }} diff --git a/v1-preparations/ESSENTIAL_GA_OBJECTS.MD b/v1-preparations/ESSENTIAL_GA_OBJECTS.MD index 3acbcbfa60..9e9d3ea9f1 100644 --- a/v1-preparations/ESSENTIAL_GA_OBJECTS.MD +++ b/v1-preparations/ESSENTIAL_GA_OBJECTS.MD @@ -29,7 +29,7 @@ newer provider versions. 
We will address these while working on the given object | ROW ACCESS POLICY | ❌ | [#2053](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2053), [#1600](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1600), [#1151](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1151) | | SCHEMA | 👨‍💻 | [#2826](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2826), [#2211](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2211), [#1243](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1243), [#506](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/506) | | STAGE | ❌ | [#2818](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2818), [#2505](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2505), [#1911](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1911), [#1903](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1903), [#1795](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1795), [#1705](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1705), [#1544](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1544), [#1491](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1491), [#1087](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1087), [#265](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/265) | -| STREAM | ❌ | [#2413](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2413), [#2201](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2201), [#1150](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1150) | +| STREAM 
| ❌ | [#2975](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2975), [#2413](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2413), [#2201](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2201), [#1150](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1150) | | STREAMLIT | 👨‍💻 | [#1933](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1933) | | TABLE | ❌ | [#2844](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2844), [#2839](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2839), [#2735](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2735), [#2733](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2733), [#2683](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2683), [#2676](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2676), [#2674](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2674), [#2629](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2629), [#2418](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2418), [#2415](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2415), [#2406](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2406), [#2236](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2236), [#2035](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2035), [#1823](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1823), [#1799](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1799), [#1764](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1764), 
[#1600](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1600), [#1387](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1387), [#1272](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1272), [#1271](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1271), [#1248](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1248), [#1241](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1241), [#1146](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1146), [#1032](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1032), [#420](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/420) | | TAG | ❌ | [#2943](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2902), [#2598](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/2598), [#1910](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1910), [#1909](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1909), [#1862](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1862), [#1806](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1806), [#1657](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1657), [#1496](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1496), [#1443](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1443), [#1394](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1394), [#1372](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1372), [#1074](https://github.com/Snowflake-Labs/terraform-provider-snowflake/issues/1074) |
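The identifier tests above hinge on Snowflake's quoting rules: an identifier containing lowercase characters or special symbols must be wrapped in double quotes, and embedded double quotes are escaped by doubling them (so `a"bc` round-trips as `"a""bc"`). As a minimal standalone sketch of those rules (`quoteIdentifier` and `unquoteIdentifier` are illustrative names, not functions from the SDK):

```go
package main

import (
	"fmt"
	"strings"
)

// quoteIdentifier wraps an identifier in double quotes and escapes any
// embedded double quote by doubling it, matching how the test cases above
// expect `a"bc` to be written as `"a""bc"`.
func quoteIdentifier(name string) string {
	return `"` + strings.ReplaceAll(name, `"`, `""`) + `"`
}

// unquoteIdentifier reverses the operation, mirroring the unescaped form
// Snowflake returns in SHOW output for one-part identifiers.
func unquoteIdentifier(quoted string) string {
	trimmed := strings.TrimSuffix(strings.TrimPrefix(quoted, `"`), `"`)
	return strings.ReplaceAll(trimmed, `""`, `"`)
}

func main() {
	fmt.Println(quoteIdentifier(`a"bc`))      // "a""bc"
	fmt.Println(unquoteIdentifier(`"a""bc"`)) // a"bc
}
```

Under these rules, the trimming done in the `NewXIdentifier` constructors explains the special cases above: inputs like `""` or `""""` lose their quoting before validation and come out as invalid or empty identifiers.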