
BigQuery: support encryptionConfiguration in google_bigquery_data_transfer_config #11478

Conversation

@BBBmau (Collaborator) commented Aug 17, 2024

This PR adds support for the encryptionConfiguration field, which is currently missing.

REST API reference for encryptionConfiguration: https://cloud.google.com/bigquery/docs/reference/datatransfer/rest/v1/projects.locations.transferConfigs#encryptionconfiguration

Release Note Template for Downstream PRs (will be copied)

bigquery: added `encryption_configuration` field to `google_bigquery_data_transfer_config` resource
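
A minimal sketch of how the new field is expected to surface in Terraform configuration, assuming the resource attribute mirrors the API field in snake_case (the key path below is illustrative, not from this PR):

resource "google_bigquery_data_transfer_config" "example" {
  # ... existing transfer config arguments ...

  # New block added by this PR
  encryption_configuration {
    kms_key_name = "projects/example-project/locations/us-central1/keyRings/example-ring/cryptoKeys/example-key"
  }
}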

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 2 files changed, 136 insertions(+))
google-beta provider: Diff ( 2 files changed, 136 insertions(+))
terraform-google-conversion: Diff ( 1 file changed, 29 insertions(+))

Missing test report

Your PR includes resource fields which are not covered by any test.

Resource: google_bigquery_data_transfer_config (13 total tests)
Please add an acceptance test which includes these fields. The test should include the following:

resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}
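
For illustration, a hedged sketch of what such an acceptance-test-style configuration might look like, assuming an existing KMS key is referenced through the google_kms_crypto_key data source (project, key ring, key, and dataset names are all hypothetical):

data "google_kms_crypto_key" "existing" {
  name     = "example-key"
  key_ring = "projects/example-project/locations/us-central1/keyRings/example-ring"
}

resource "google_bigquery_dataset" "my_dataset" {
  dataset_id = "my_dataset"
  location   = "us-central1"
}

resource "google_bigquery_data_transfer_config" "primary" {
  display_name           = "example-query"
  location               = "us-central1"
  data_source_id         = "scheduled_query"
  schedule               = "first sunday of quarter 00:00"
  destination_dataset_id = google_bigquery_dataset.my_dataset.dataset_id
  params = {
    destination_table_name_template = "my_table"
    write_disposition               = "WRITE_APPEND"
    query                           = "SELECT name FROM table WHERE x = 'y'"
  }

  # Field under test: references the KMS key used to encrypt BigQuery data.
  encryption_configuration {
    kms_key_name = data.google_kms_crypto_key.existing.id
  }
}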

@modular-magician (Collaborator)

Tests analytics

Total tests: 1
Passed tests: 1
Skipped tests: 0
Affected tests: 0

Affected service packages:
  • bigquerydatatransfer

$\textcolor{green}{\textsf{All tests passed!}}$

View the build log

@@ -172,6 +181,18 @@ properties:
reingests data for [today-10, today-1], rather than ingesting data for
just [today-1]. Only valid if the data source supports the feature.
Set the value to 0 to use the default value.
- !ruby/object:Api::Type::NestedObject
name: 'encryptionConfiguration'
description: |

Reviewer (Member):

To be future-proof, can we make this description more generic and not specific about encryption key?

BBBmau (Collaborator, Author):

Updated the description to match the one found in the API reference.

Reviewer (Member):

According to https://cloud.google.com/bigquery/docs/reference/datatransfer/rest/v1/projects.locations.transferConfigs#encryptionconfiguration, "The name of the KMS key used for encrypting BigQuery data." is the API description of kmsKeyName, while the description of EncryptionConfiguration is "Represents the encryption configuration for a transfer."

BBBmau (Collaborator, Author):

Apologies, it seems I got the descriptions mixed up; it should be correct now.

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 2 files changed, 132 insertions(+))
google-beta provider: Diff ( 2 files changed, 132 insertions(+))
terraform-google-conversion: Diff ( 1 file changed, 29 insertions(+))

Missing test report

Your PR includes resource fields which are not covered by any test.

Resource: google_bigquery_data_transfer_config (12 total tests)
Please add an acceptance test which includes these fields. The test should include the following:

resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

@modular-magician (Collaborator)

Tests analytics

Total tests: 1
Passed tests: 1
Skipped tests: 0
Affected tests: 0

Affected service packages:
  • bigquerydatatransfer

$\textcolor{green}{\textsf{All tests passed!}}$

View the build log

@BBBmau (Collaborator, Author) commented Aug 22, 2024

Marking as ready for review, with a reproducible Terraform config for manually testing this.

When applying the config below:

data "google_project" "project" {
}

resource "google_kms_crypto_key" "mau_crypto_key" {
  name = "mau-crypto-key-test-bq"
  key_ring = "projects/hc-terraform-testing/locations/us-central1/keyRings/mau-latest-test-1"
  purpose = "ENCRYPT_DECRYPT"
}

data "google_iam_policy" "owner" {
  binding {
    role = "roles/bigquery.dataOwner"

    members = [
      "serviceAccount:mauricio-alvarezleon@hc-terraform-testing.iam.gserviceaccount.com",
    ]
  }
}

resource "google_bigquery_dataset_iam_policy" "dataset" {
  dataset_id  = google_bigquery_dataset.my_dataset.dataset_id
  policy_data = data.google_iam_policy.owner.policy_data
}

resource "google_bigquery_data_transfer_config" "query_config" {
  depends_on = [ google_kms_crypto_key.mau_crypto_key ]
  encryption_configuration {
    kms_key_name = "projects/hc-terraform-testing/locations/us-central1/keyRings/mau-latest-test/cryptoKeys/mau-crypto-key-test-bq"
  }
  display_name           = "my-query-mau"
  location               = "us-central1"
  data_source_id         = "scheduled_query"
  schedule               = "first sunday of quarter 00:00"
  destination_dataset_id = google_bigquery_dataset.my_dataset.dataset_id
  params = {
    destination_table_name_template = "my_table"
    write_disposition               = "WRITE_APPEND"
    query                           = "SELECT name FROM table WHERE x = 'y'"
  }
}

resource "google_bigquery_dataset" "my_dataset" {
  dataset_id    = "my_dataset"
  friendly_name = "foo"
  description   = "bar"
  location      = "us-central1"
}

We get a successful apply, with the following showing up for the google_bigquery_data_transfer_config; you'll notice that the encryption key has been applied successfully.

[screenshot of the applied google_bigquery_data_transfer_config showing the encryption key]

Let me know if anything else is needed to make this review easier. @wj-chen

@BBBmau BBBmau marked this pull request as ready for review August 22, 2024 04:12

@wj-chen (Member) commented Aug 22, 2024

[quotes BBBmau's test notes and Terraform config from above]

Super helpful, thank you for the test notes!

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 2 files changed, 132 insertions(+))
google-beta provider: Diff ( 2 files changed, 132 insertions(+))
terraform-google-conversion: Diff ( 1 file changed, 29 insertions(+))

Missing test report

Your PR includes resource fields which are not covered by any test.

Resource: google_bigquery_data_transfer_config (12 total tests)
Please add an acceptance test which includes these fields. The test should include the following:

resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

@modular-magician (Collaborator)

Tests analytics

Total tests: 1
Passed tests: 1
Skipped tests: 0
Affected tests: 0

Affected service packages:
  • bigquerydatatransfer

$\textcolor{green}{\textsf{All tests passed!}}$

View the build log

@BBBmau BBBmau requested a review from hao-nan-li August 23, 2024 01:02

@hao-nan-li (Contributor) left a comment:

Thanks for adding this field.
The terraform-google-conversion error is unrelated to this change.

@@ -46,13 +46,22 @@ examples:
vars:
display_name: 'my-query'
dataset_id: 'my_dataset'
- !ruby/object:Provider::Terraform::Examples
name: 'bigquerydatatransfer_config_cmek'
skip_test: true

Reviewer (Contributor):

Any particular reason to skip the test? As it stands, it looks like the added field is not covered by any test.

BBBmau (Collaborator, Author):

This was based on another example from bigquery_dataset, which includes CMEK support but doesn't include a test: #2226

I believe we skip it because the example creates a cryptoKeyRing, which cannot be deleted: https://cloud.google.com/kms/docs/reference/rest/v1/projects.locations.keyRings

I've added a test that covers the kmsKeyName field.
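
For reference, a hypothetical sketch of what the skipped bigquerydatatransfer_config_cmek example could look like, given that it creates its own key ring and crypto key (key rings cannot be deleted, which is why the example is marked skip_test); all names are illustrative:

resource "google_kms_key_ring" "example" {
  # Key rings cannot be deleted once created, so this example is not run as a test.
  name     = "example-keyring"
  location = "us-central1"
}

resource "google_kms_crypto_key" "example" {
  name     = "example-key"
  key_ring = google_kms_key_ring.example.id
  purpose  = "ENCRYPT_DECRYPT"
}

resource "google_bigquery_dataset" "example" {
  dataset_id = "example_dataset"
  location   = "us-central1"
}

resource "google_bigquery_data_transfer_config" "example" {
  display_name           = "example-query"
  location               = "us-central1"
  data_source_id         = "scheduled_query"
  schedule               = "first sunday of quarter 00:00"
  destination_dataset_id = google_bigquery_dataset.example.dataset_id
  params = {
    destination_table_name_template = "my_table"
    write_disposition               = "WRITE_APPEND"
    query                           = "SELECT name FROM table WHERE x = 'y'"
  }

  encryption_configuration {
    kms_key_name = google_kms_crypto_key.example.id
  }
}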

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 3 files changed, 226 insertions(+))
google-beta provider: Diff ( 3 files changed, 226 insertions(+))
terraform-google-conversion: Diff ( 1 file changed, 29 insertions(+))

Missing test report

Your PR includes resource fields which are not covered by any test.

Resource: google_bigquery_data_transfer_config (12 total tests)
Please add an acceptance test which includes these fields. The test should include the following:

resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

@modular-magician (Collaborator)

Tests analytics

Total tests: 0
Passed tests: 0
Skipped tests: 0
Affected tests: 0

Affected service packages:
  • bigquerydatatransfer

$\textcolor{red}{\textsf{Errors occurred during REPLAYING mode. Please fix them to complete your PR.}}$

View the build log

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 3 files changed, 223 insertions(+))
google-beta provider: Diff ( 3 files changed, 223 insertions(+))
terraform-google-conversion: Diff ( 1 file changed, 29 insertions(+))

Missing test report

Your PR includes resource fields which are not covered by any test.

Resource: google_bigquery_data_transfer_config (12 total tests)
Please add an acceptance test which includes these fields. The test should include the following:

resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

@modular-magician (Collaborator)

Tests analytics

Total tests: 1
Passed tests: 1
Skipped tests: 0
Affected tests: 0

Affected service packages:
  • bigquerydatatransfer

$\textcolor{green}{\textsf{All tests passed!}}$

View the build log

@BBBmau BBBmau requested a review from hao-nan-li August 23, 2024 21:45
@@ -383,6 +383,29 @@ func testAccBigqueryDataTransferConfig_scheduledQuery_update(t *testing.T) {
})
}

func testAccBigqueryDataTransferConfig_CMEK(t *testing.T) {

Reviewer (Contributor):

Any reason to skip the VCR run? Also I think this function should be named TestAccBigqueryDataTransferConfig_CMEK

BBBmau (Collaborator, Author):

Same reason as above: #11478 (comment)

I also agree with the naming convention, but I went with what I committed because the whole test file follows that pattern.

Should we rename all the tests here or in a separate PR?

Reviewer (Contributor):

Ah, thanks for letting me know. If other tests are written this way, then I'm fine with it.

@BBBmau BBBmau requested a review from hao-nan-li August 26, 2024 20:21

@hao-nan-li (Contributor)

Do you happen to know why this contributor-membership-checker is failing?

@BBBmau (Collaborator, Author) commented Aug 26, 2024

/gcbrun

@BBBmau (Collaborator, Author) commented Aug 26, 2024

Do you happen to know why this contributor-membership-checker is failing?

@hao-nan-li It seems the build is missing; I've rerun it to see if it resolves itself. Edit: after rerunning the build, it appears to be passing 😄

Relevant logs:

request url:  https://api.github.com/repos/GoogleCloudPlatform/magic-modules/issues/11478
request body:  null

response status-code:  200

User is a core contributor
Error: Failed to find pending build for PR 11478
Usage:
  magician membership-checker [flags]

@modular-magician (Collaborator)

Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report

Your PR generated some diffs in downstreams - here they are.

google provider: Diff ( 3 files changed, 223 insertions(+))
google-beta provider: Diff ( 3 files changed, 223 insertions(+))
terraform-google-conversion: Diff ( 1 file changed, 29 insertions(+))

Missing test report

Your PR includes resource fields which are not covered by any test.

Resource: google_bigquery_data_transfer_config (12 total tests)
Please add an acceptance test which includes these fields. The test should include the following:

resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

@modular-magician (Collaborator)

Tests analytics

Total tests: 1
Passed tests: 0
Skipped tests: 0
Affected tests: 1

Affected service packages:
  • bigquerydatatransfer

Action taken

Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Affected tests:
  • TestAccBigqueryDataTransferConfig

Get to know how VCR tests work

@modular-magician (Collaborator)

$\textcolor{green}{\textsf{Tests passed during RECORDING mode:}}$
TestAccBigqueryDataTransferConfig[Debug log]

$\textcolor{green}{\textsf{No issues found for passed tests after REPLAYING rerun.}}$


$\textcolor{green}{\textsf{All tests passed!}}$

View the build log or the debug log for each test

@SarahFrench (Contributor)

Was the change to mmv1/.ruby-version intentional here?
