BigQuery: support encryptionConfiguration in google_bigquery_data_transfer_config #11478
Conversation
Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report: Your PR generated some diffs in downstreams - here they are.

Missing test report: Your PR includes resource fields which are not covered by any test.
Resource:
resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

Tests analytics: Total tests: 1
@@ -172,6 +181,18 @@ properties:
        reingests data for [today-10, today-1], rather than ingesting data for
        just [today-1]. Only valid if the data source supports the feature.
        Set the value to 0 to use the default value.
  - !ruby/object:Api::Type::NestedObject
    name: 'encryptionConfiguration'
    description: |
To be future-proof, can we make this description more generic and not specific to the encryption key?
Updated the description to match the one found in the API Reference.
According to https://cloud.google.com/bigquery/docs/reference/datatransfer/rest/v1/projects.locations.transferConfigs#encryptionconfiguration, "The name of the KMS key used for encrypting BigQuery data." is the API description of kmsKeyName. The description of EncryptionConfiguration is "Represents the encryption configuration for a transfer."
Apologies, it seems like I got the descriptions mixed up; they should be correct now.
Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report: Your PR generated some diffs in downstreams - here they are.

Missing test report: Your PR includes resource fields which are not covered by any test.
Resource:
resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

Tests analytics: Total tests: 1
Marking as ready for review, with a reproducible tfconfig for manually testing this. When applying the tfconfig below:

data "google_project" "project" {
}
resource "google_kms_crypto_key" "mau_crypto_key" {
name = "mau-crypto-key-test-bq"
key_ring = "projects/hc-terraform-testing/locations/us-central1/keyRings/mau-latest-test-1"
purpose = "ENCRYPT_DECRYPT"
}
data "google_iam_policy" "owner" {
binding {
role = "roles/bigquery.dataOwner"
members = [
"serviceAccount:mauricio-alvarezleon@hc-terraform-testing.iam.gserviceaccount.com",
]
}
}
resource "google_bigquery_dataset_iam_policy" "dataset" {
dataset_id = google_bigquery_dataset.my_dataset.dataset_id
policy_data = data.google_iam_policy.owner.policy_data
}
resource "google_bigquery_data_transfer_config" "query_config" {
depends_on = [ google_kms_crypto_key.mau_crypto_key ]
encryption_configuration {
kms_key_name = "projects/hc-terraform-testing/locations/us-central1/keyRings/mau-latest-test/cryptoKeys/mau-crypto-key-test-bq"
}
display_name = "my-query-mau"
location = "us-central1"
data_source_id = "scheduled_query"
schedule = "first sunday of quarter 00:00"
destination_dataset_id = google_bigquery_dataset.my_dataset.dataset_id
params = {
destination_table_name_template = "my_table"
write_disposition = "WRITE_APPEND"
query = "SELECT name FROM table WHERE x = 'y'"
}
}
resource "google_bigquery_dataset" "my_dataset" {
dataset_id = "my_dataset"
friendly_name = "foo"
description = "bar"
location = "us-central1"
}
We get a successful apply, with the following showing up for the … Let me know if anything else is needed to make this review easier. @wj-chen
Super helpful, thank you for the test notes!
Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report: Your PR generated some diffs in downstreams - here they are.

Missing test report: Your PR includes resource fields which are not covered by any test.
Resource:
resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

Tests analytics: Total tests: 1
Thanks for adding this field.
The Terraform google conversion error is unrelated to this change.
@@ -46,13 +46,22 @@ examples:
      vars:
        display_name: 'my-query'
        dataset_id: 'my_dataset'
  - !ruby/object:Provider::Terraform::Examples
    name: 'bigquerydatatransfer_config_cmek'
    skip_test: true
Any particular reason to skip the test? If so, it looks like the added field is not covered by any test.
This was based off another example from bigquery_dataset which includes CMEK support but doesn't include a test: #2226. I believe we skip it because the test would include a cryptoKeyRing creation, which we are unable to remove: https://cloud.google.com/kms/docs/reference/rest/v1/projects.locations.keyRings
I've included a test that includes the kmsKeyName field.
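For illustration only (not part of the PR), here is a minimal sketch of how a CMEK config can point at a pre-existing key ring through data sources instead of creating one, which is the constraint behind skipping the example test. The key ring and key names are hypothetical, and the rest of the transfer config is assumed to match the reproducible tfconfig above.

# Sketch only: look up an existing key ring and key rather than creating them,
# since key rings cannot be deleted once created.
data "google_kms_key_ring" "existing" {
  name     = "my-existing-keyring" # hypothetical, assumed to already exist
  location = "us-central1"
}

data "google_kms_crypto_key" "existing" {
  name     = "my-existing-key" # hypothetical, assumed to already exist
  key_ring = data.google_kms_key_ring.existing.id
}

resource "google_bigquery_data_transfer_config" "cmek_sketch" {
  display_name           = "my-query-cmek"
  location               = "us-central1"
  data_source_id         = "scheduled_query"
  schedule               = "first sunday of quarter 00:00"
  destination_dataset_id = google_bigquery_dataset.my_dataset.dataset_id

  # The field added by this PR; the referenced key must already be usable by
  # the BigQuery Data Transfer service account.
  encryption_configuration {
    kms_key_name = data.google_kms_crypto_key.existing.id
  }

  params = {
    destination_table_name_template = "my_table"
    write_disposition               = "WRITE_APPEND"
    query                           = "SELECT name FROM table WHERE x = 'y'"
  }
}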
Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report: Your PR generated some diffs in downstreams - here they are.

Missing test report: Your PR includes resource fields which are not covered by any test.
Resource:
resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

Tests analytics: Total tests: 0
Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report: Your PR generated some diffs in downstreams - here they are.

Missing test report: Your PR includes resource fields which are not covered by any test.
Resource:
resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

Tests analytics: Total tests: 1
@@ -383,6 +383,29 @@ func testAccBigqueryDataTransferConfig_scheduledQuery_update(t *testing.T) {
	})
}

func testAccBigqueryDataTransferConfig_CMEK(t *testing.T) {
Any reason to skip the VCR run? Also I think this function should be named TestAccBigqueryDataTransferConfig_CMEK
Same reason as above: #11478 (comment).
I also agree with the naming convention, but went with what I committed because the whole test file matches that pattern. Should we do the renaming of all the tests here or in a separate PR?
Ah thanks for letting me know. If other tests are created in this way then I'm fine with it.
Do you happen to know why this …
/gcbrun
@hao-nan-li Seems like it's missing the build; I've rerun the build to see if it'll get resolved by itself. Edit: looks like after rerunning the build it appears to be passing 😄 Relevant logs:

request url: https://api.github.com/repos/GoogleCloudPlatform/magic-modules/issues/11478
request body: null
response status-code: 200
User is a core contributor
Error: Failed to find pending build for PR 11478
Usage:
  magician membership-checker [flags]
Hi there, I'm the Modular magician. I've detected the following information about your changes:

Diff report: Your PR generated some diffs in downstreams - here they are.

Missing test report: Your PR includes resource fields which are not covered by any test.
Resource:
resource "google_bigquery_data_transfer_config" "primary" {
  encryption_configuration {
    kms_key_name = # value needed
  }
}

Tests analytics: Total tests: 1

Action taken: Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit.
Was the change to mmv1/.ruby-version intentional here?
This PR would support the encryptionConfiguration field, which is missing.

Related REST API Reference of encryptionConfiguration: https://cloud.google.com/bigquery/docs/reference/datatransfer/rest/v1/projects.locations.transferConfigs#encryptionconfiguration

Release Note Template for Downstream PRs (will be copied)
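For reference (not part of the release note), a minimal usage sketch of the new block; the dataset ID and KMS key path below are placeholders, and only the resource type, encryption_configuration, and kms_key_name are taken from the PR itself.

resource "google_bigquery_data_transfer_config" "example" {
  display_name           = "my-query"
  location               = "us-central1"
  data_source_id         = "scheduled_query"
  schedule               = "first sunday of quarter 00:00"
  destination_dataset_id = "my_dataset" # placeholder; an existing dataset ID

  # New in this PR: customer-managed encryption key for the transfer.
  encryption_configuration {
    kms_key_name = "projects/PROJECT/locations/us-central1/keyRings/KEYRING/cryptoKeys/KEY" # placeholder
  }

  params = {
    destination_table_name_template = "my_table"
    write_disposition               = "WRITE_APPEND"
    query                           = "SELECT name FROM table WHERE x = 'y'"
  }
}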