feat: Supports new networking attribute in `mongodbatlas_stream_connection` (#2474)

* support new networking attribute in stream connection

* add changelog entry

* fix unit tests

* documentation

* fix migration tests

* include networking in examples for stream_connection

* improve changelog entry

* include access.name attribute in model unit tests

* fix: add PlanModifiers to avoid unexpected plan changes

* test: refactor the kafkaStreamConnectionConfig and use normal `mig.CreateAndRunTest`

* test: refactor `testCaseCluster` and re-use in migration test

* test: improve check using actual name

* chore: add revertable commit to avoid pending resources in CI

* refactor: remove `networking.access.name` attribute (tests will fail until blocked issue is resolved)

* chore: remove name also from data source schema

* test: fix wrong parameter for kafkaNetworking

* clean up test after merge

* move into var block

* address PR comments

* test: remove old sleep function from stream connection tests

* test: refactor networking.access.type check to all test cases

* test: conditionally check networking.access.type based on provider version

* feat: implement DeleteStreamConnection with retry logic and add tests

---------

Co-authored-by: EspenAlbert <[email protected]>
oarbusi and EspenAlbert authored Dec 23, 2024
1 parent f668cba commit 5504e20
Showing 15 changed files with 295 additions and 86 deletions.
11 changes: 11 additions & 0 deletions .changelog/2474.txt
@@ -0,0 +1,11 @@
```release-note:enhancement
resource/mongodbatlas_stream_connection: Adds `networking` attribute
```

```release-note:enhancement
data-source/mongodbatlas_stream_connection: Adds `networking` attribute
```

```release-note:enhancement
data-source/mongodbatlas_stream_connections: Adds `networking` attribute
```
8 changes: 8 additions & 0 deletions docs/data-sources/stream_connection.md
@@ -31,6 +31,7 @@ If `type` is of value `Kafka` the following additional attributes are defined:
* `bootstrap_servers` - Comma-separated list of server addresses.
* `config` - A map of Kafka key-value pairs for optional configuration. This is a flat object, and keys can have '.' characters.
* `security` - Properties for the secure transport connection to Kafka. For SSL, this can include the trusted certificate to use. See [security](#security).
* `networking` - Networking access type; can be either `PUBLIC` (default) or `VPC`. See [networking](#networking).

### Authentication

@@ -48,5 +49,12 @@ If `type` is of value `Kafka` the following additional attributes are defined:
* `role` - The name of the role to use. Can be a built-in role or a custom role.
* `type` - Type of the DB role. Can be either `BUILT_IN` or `CUSTOM`.

### Networking
* `access` - Information about the networking access. See [access](#access).

### Access
* `name` - ID of the VPC peer when the type is `VPC`.
* `type` - Selected networking type. Either `PUBLIC` or `VPC`. Defaults to `PUBLIC`.

To learn more, see the [MongoDB Atlas API - Stream Connection](https://www.mongodb.com/docs/atlas/reference/api-resources-spec/#tag/Streams/operation/getStreamConnection) documentation.
The [Terraform Provider Examples Section](https://github.com/mongodb/terraform-provider-mongodbatlas/blob/master/examples/mongodbatlas_stream_instance/atlas-streams-user-journey.md) also contains details on the overall support for Atlas Streams Processing in Terraform.
8 changes: 8 additions & 0 deletions docs/data-sources/stream_connections.md
@@ -43,6 +43,7 @@ If `type` is of value `Kafka` the following additional attributes are defined:
* `bootstrap_servers` - Comma-separated list of server addresses.
* `config` - A map of Kafka key-value pairs for optional configuration. This is a flat object, and keys can have '.' characters.
* `security` - Properties for the secure transport connection to Kafka. For SSL, this can include the trusted certificate to use. See [security](#security).
* `networking` - Networking access type; can be either `PUBLIC` (default) or `VPC`. See [networking](#networking).

### Authentication

@@ -60,5 +61,12 @@ If `type` is of value `Kafka` the following additional attributes are defined:
* `role` - The name of the role to use. Can be a built-in role or a custom role.
* `type` - Type of the DB role. Can be either `BUILT_IN` or `CUSTOM`.

### Networking
* `access` - Information about the networking access. See [access](#access).

### Access
* `name` - ID of the VPC peer when the type is `VPC`.
* `type` - Selected networking type. Either `PUBLIC` or `VPC`. Defaults to `PUBLIC`.

To learn more, see the [MongoDB Atlas API - Stream Connection](https://www.mongodb.com/docs/atlas/reference/api-resources-spec/#tag/Streams/operation/listStreamConnections) documentation.
The [Terraform Provider Examples Section](https://github.com/mongodb/terraform-provider-mongodbatlas/blob/master/examples/mongodbatlas_stream_instance/atlas-streams-user-journey.md) also contains details on the overall support for Atlas Streams Processing in Terraform.
8 changes: 8 additions & 0 deletions docs/resources/stream_connection.md
@@ -82,6 +82,7 @@ If `type` is of value `Kafka` the following additional arguments are defined:
* `bootstrap_servers` - Comma-separated list of server addresses.
* `config` - A map of Kafka key-value pairs for optional configuration. This is a flat object, and keys can have '.' characters.
* `security` - Properties for the secure transport connection to Kafka. For SSL, this can include the trusted certificate to use. See [security](#security).
* `networking` - Networking access type; can be either `PUBLIC` (default) or `VPC`. See [networking](#networking).

### Authentication

@@ -99,6 +100,13 @@ If `type` is of value `Kafka` the following additional arguments are defined:
* `role` - The name of the role to use. Value can be `atlasAdmin`, `readWriteAnyDatabase`, or `readAnyDatabase` if `type` is set to `BUILT_IN`, or the name of a user-defined role if `type` is set to `CUSTOM`.
* `type` - Type of the DB role. Can be either `BUILT_IN` or `CUSTOM`.

### Networking
* `access` - Information about the networking access. See [access](#access).

### Access
* `name` - ID of the VPC peer when the type is `VPC`.
* `type` - Selected networking type. Either `PUBLIC` or `VPC`. Defaults to `PUBLIC`.

## Import

You can import a stream connection resource using the instance name, project ID, and connection name. The format must be `INSTANCE_NAME-PROJECT_ID-CONNECTION_NAME`. For example:
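Splitting such a composite ID back into its parts is a small exercise. Below is a hedged, stdlib-only Go sketch — a hypothetical helper, not the provider's actual code — that assumes the project ID is a 24-character hex Atlas ObjectID; because that middle group is fixed-width, instance and connection names may themselves contain hyphens:

```go
package main

import (
	"fmt"
	"regexp"
)

// importIDPattern assumes the middle segment (the project ID) is a
// 24-character lowercase-hex Atlas ObjectID. The fixed-width hex group
// anchors the split even when the surrounding names contain hyphens.
var importIDPattern = regexp.MustCompile(`^(.+)-([0-9a-f]{24})-(.+)$`)

// splitImportID is a hypothetical helper illustrating the format; it is
// not necessarily how the provider parses the import ID.
func splitImportID(id string) (instanceName, projectID, connectionName string, err error) {
	parts := importIDPattern.FindStringSubmatch(id)
	if parts == nil {
		return "", "", "", fmt.Errorf("expected INSTANCE_NAME-PROJECT_ID-CONNECTION_NAME, got %q", id)
	}
	return parts[1], parts[2], parts[3], nil
}

func main() {
	inst, proj, conn, err := splitImportID("my-instance-664619d870c247453e6fb83b-kafka-conn")
	fmt.Println(inst, proj, conn, err)
}
```

Note the caveat baked into the assumption: if an instance name happened to end in its own 24-hex-character run, the greedy first group would make the split ambiguous.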
5 changes: 5 additions & 0 deletions examples/mongodbatlas_stream_connection/main.tf
@@ -36,6 +36,11 @@ resource "mongodbatlas_stream_connection" "example-kafka-plaintext" {
security = {
protocol = "PLAINTEXT"
}
networking = {
access = {
type = "PUBLIC"
}
}
}

resource "mongodbatlas_stream_connection" "example-kafka-ssl" {
@@ -20,8 +20,8 @@ func TestAccStreamDSStreamConnection_kafkaPlaintext(t *testing.T) {
CheckDestroy: CheckDestroyStreamConnection,
Steps: []resource.TestStep{
{
Config: streamConnectionDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", false)),
Check: kafkaStreamConnectionAttributeChecks(dataSourceName, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", false, false),
Config: streamConnectionDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", kafkaNetworkingPublic, false)),
Check: kafkaStreamConnectionAttributeChecks(dataSourceName, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", networkingTypePublic, false, false),
},
},
})
@@ -39,8 +39,8 @@ func TestAccStreamDSStreamConnection_kafkaSSL(t *testing.T) {
CheckDestroy: CheckDestroyStreamConnection,
Steps: []resource.TestStep{
{
Config: streamConnectionDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092", "earliest", true)),
Check: kafkaStreamConnectionAttributeChecks(dataSourceName, instanceName, "user", "rawpassword", "localhost:9092", "earliest", true, false),
Config: streamConnectionDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092", "earliest", kafkaNetworkingPublic, true)),
Check: kafkaStreamConnectionAttributeChecks(dataSourceName, instanceName, "user", "rawpassword", "localhost:9092", "earliest", networkingTypePublic, true, false),
},
},
})
@@ -21,7 +21,7 @@ func TestAccStreamDSStreamConnections_basic(t *testing.T) {
CheckDestroy: CheckDestroyStreamConnection,
Steps: []resource.TestStep{
{
Config: streamConnectionsDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", false)),
Config: streamConnectionsDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", kafkaNetworkingPublic, false)),
Check: streamConnectionsAttributeChecks(dataSourceName, nil, nil, 1),
},
},
@@ -40,7 +40,7 @@ func TestAccStreamDSStreamConnections_withPageConfig(t *testing.T) {
CheckDestroy: CheckDestroyStreamConnection,
Steps: []resource.TestStep{
{
Config: streamConnectionsWithPageAttrDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", false)),
Config: streamConnectionsWithPageAttrDataSourceConfig(kafkaStreamConnectionConfig(projectID, instanceName, "user", "rawpassword", "localhost:9092,localhost:9092", "earliest", kafkaNetworkingPublic, false)),
Check: streamConnectionsAttributeChecks(dataSourceName, admin.PtrInt(2), admin.PtrInt(1), 0),
},
},
25 changes: 25 additions & 0 deletions internal/service/streamconnection/model_stream_connection.go
@@ -60,6 +60,18 @@ func NewStreamConnectionReq(ctx context.Context, plan *TFStreamConnectionModel)
}
}

if !plan.Networking.IsNull() && !plan.Networking.IsUnknown() {
networkingModel := &TFNetworkingModel{}
if diags := plan.Networking.As(ctx, networkingModel, basetypes.ObjectAsOptions{}); diags.HasError() {
return nil, diags
}
streamConnection.Networking = &admin.StreamsKafkaNetworking{
Access: &admin.StreamsKafkaNetworkingAccess{
Type: networkingModel.Access.Type.ValueStringPointer(),
},
}
}

return &streamConnection, nil
}

@@ -114,6 +126,19 @@ func NewTFStreamConnection(ctx context.Context, projID, instanceName string, cur
connectionModel.DBRoleToExecute = dbRoleToExecuteModel
}

connectionModel.Networking = types.ObjectNull(NetworkingObjectType.AttrTypes)
if apiResp.Networking != nil {
networkingModel, diags := types.ObjectValueFrom(ctx, NetworkingObjectType.AttrTypes, TFNetworkingModel{
Access: TFNetworkingAccessModel{
Type: types.StringPointerValue(apiResp.Networking.Access.Type),
},
})
if diags.HasError() {
return nil, diags
}
connectionModel.Networking = networkingModel
}

return &connectionModel, nil
}

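The conversion in `NewTFStreamConnection` above follows a common provider pattern: default the field to a typed null object, then overwrite it only when the API response carries the nested struct. A minimal stdlib-only Go sketch of that null-then-populate idea, with hypothetical type names (the real code uses `types.ObjectNull` and `types.ObjectValueFrom` from the plugin framework):

```go
package main

import "fmt"

// apiNetworking mirrors the shape of the SDK response type (assumed).
type apiNetworking struct {
	AccessType *string
}

// tfNetworking stands in for the Terraform model value; a nil pointer
// plays the role of types.ObjectNull in this sketch.
type tfNetworking struct {
	AccessType string
}

// toModel defaults to "null" and populates the model only when the API
// actually returned the nested object, so an optional attribute that the
// server omitted round-trips as null instead of a zero value.
func toModel(api *apiNetworking) *tfNetworking {
	if api == nil || api.AccessType == nil {
		return nil
	}
	return &tfNetworking{AccessType: *api.AccessType}
}

func main() {
	fmt.Println(toModel(nil))
	public := "PUBLIC"
	fmt.Println(toModel(&apiNetworking{AccessType: &public}))
}
```

The design point is that "absent" and "empty" stay distinguishable, which is what keeps Terraform from reporting spurious diffs on optional computed attributes.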
27 changes: 27 additions & 0 deletions internal/service/streamconnection/model_stream_connection_test.go
@@ -24,6 +24,7 @@ const (
dbRole = "customRole"
dbRoleType = "CUSTOM"
sampleConnectionName = "sample_stream_solar"
networkingType = "PUBLIC"
)

var configMap = map[string]string{
@@ -67,6 +68,7 @@ func TestStreamConnectionSDKToTFModel(t *testing.T) {
Config: types.MapNull(types.StringType),
Security: types.ObjectNull(streamconnection.ConnectionSecurityObjectType.AttrTypes),
DBRoleToExecute: tfDBRoleToExecuteObject(t, dbRole, dbRoleType),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
},
{
@@ -98,6 +100,7 @@ func TestStreamConnectionSDKToTFModel(t *testing.T) {
Config: tfConfigMap(t, configMap),
Security: tfSecurityObject(t, DummyCACert, securityProtocol),
DBRoleToExecute: types.ObjectNull(streamconnection.DBRoleToExecuteObjectType.AttrTypes),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
},
{
@@ -118,6 +121,7 @@ func TestStreamConnectionSDKToTFModel(t *testing.T) {
Config: types.MapNull(types.StringType),
Security: types.ObjectNull(streamconnection.ConnectionSecurityObjectType.AttrTypes),
DBRoleToExecute: types.ObjectNull(streamconnection.DBRoleToExecuteObjectType.AttrTypes),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
},
{
@@ -149,6 +153,7 @@ func TestStreamConnectionSDKToTFModel(t *testing.T) {
Config: tfConfigMap(t, configMap),
Security: tfSecurityObject(t, DummyCACert, securityProtocol),
DBRoleToExecute: types.ObjectNull(streamconnection.DBRoleToExecuteObjectType.AttrTypes),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
},
{
@@ -168,6 +173,7 @@ func TestStreamConnectionSDKToTFModel(t *testing.T) {
Config: types.MapNull(types.StringType),
Security: types.ObjectNull(streamconnection.ConnectionSecurityObjectType.AttrTypes),
DBRoleToExecute: types.ObjectNull(streamconnection.DBRoleToExecuteObjectType.AttrTypes),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
},
}
@@ -212,6 +218,11 @@ func TestStreamConnectionsSDKToTFModel(t *testing.T) {
Protocol: admin.PtrString(securityProtocol),
BrokerPublicCertificate: admin.PtrString(DummyCACert),
},
Networking: &admin.StreamsKafkaNetworking{
Access: &admin.StreamsKafkaNetworkingAccess{
Type: admin.PtrString(networkingType),
},
},
},
{
Name: admin.PtrString(connectionName),
@@ -253,6 +264,7 @@ func TestStreamConnectionsSDKToTFModel(t *testing.T) {
Config: tfConfigMap(t, configMap),
Security: tfSecurityObject(t, DummyCACert, securityProtocol),
DBRoleToExecute: types.ObjectNull(streamconnection.DBRoleToExecuteObjectType.AttrTypes),
Networking: tfNetworkingObject(t, networkingType),
},
{
ID: types.StringValue(fmt.Sprintf("%s-%s-%s", instanceName, dummyProjectID, connectionName)),
@@ -265,6 +277,7 @@ func TestStreamConnectionsSDKToTFModel(t *testing.T) {
Config: types.MapNull(types.StringType),
Security: types.ObjectNull(streamconnection.ConnectionSecurityObjectType.AttrTypes),
DBRoleToExecute: tfDBRoleToExecuteObject(t, dbRole, dbRoleType),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
{
ID: types.StringValue(fmt.Sprintf("%s-%s-%s", instanceName, dummyProjectID, sampleConnectionName)),
@@ -277,6 +290,7 @@ func TestStreamConnectionsSDKToTFModel(t *testing.T) {
Config: types.MapNull(types.StringType),
Security: types.ObjectNull(streamconnection.ConnectionSecurityObjectType.AttrTypes),
DBRoleToExecute: types.ObjectNull(streamconnection.DBRoleToExecuteObjectType.AttrTypes),
Networking: types.ObjectNull(streamconnection.NetworkingObjectType.AttrTypes),
},
},
},
@@ -470,3 +484,16 @@ func tfDBRoleToExecuteObject(t *testing.T, role, roleType string) types.Object {
}
return auth
}

func tfNetworkingObject(t *testing.T, networkingType string) types.Object {
t.Helper()
networking, diags := types.ObjectValueFrom(context.Background(), streamconnection.NetworkingObjectType.AttrTypes, streamconnection.TFNetworkingModel{
Access: streamconnection.TFNetworkingAccessModel{
Type: types.StringValue(networkingType),
},
})
if diags.HasError() {
t.Errorf("failed to create terraform data model: %s", diags.Errors()[0].Summary())
}
return networking
}
21 changes: 21 additions & 0 deletions internal/service/streamconnection/resource_schema.go
@@ -5,6 +5,7 @@ import (

"github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator"
"github.com/hashicorp/terraform-plugin-framework/resource/schema"
"github.com/hashicorp/terraform-plugin-framework/resource/schema/objectplanmodifier"
"github.com/hashicorp/terraform-plugin-framework/resource/schema/planmodifier"
"github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier"
"github.com/hashicorp/terraform-plugin-framework/schema/validator"
@@ -16,6 +17,9 @@ func ResourceSchema(ctx context.Context) schema.Schema {
Attributes: map[string]schema.Attribute{
"id": schema.StringAttribute{
Computed: true,
PlanModifiers: []planmodifier.String{
stringplanmodifier.UseStateForUnknown(),
},
},
"project_id": schema.StringAttribute{
Required: true,
@@ -95,6 +99,23 @@ func ResourceSchema(ctx context.Context) schema.Schema {
},
},
},
"networking": schema.SingleNestedAttribute{
Optional: true,
Computed: true,
PlanModifiers: []planmodifier.Object{
objectplanmodifier.UseStateForUnknown(),
},
Attributes: map[string]schema.Attribute{
"access": schema.SingleNestedAttribute{
Required: true,
Attributes: map[string]schema.Attribute{
"type": schema.StringAttribute{
Required: true,
},
},
},
},
},
},
}
}
23 changes: 21 additions & 2 deletions internal/service/streamconnection/resource_stream_connection.go
@@ -5,12 +5,14 @@ import (
"errors"
"net/http"
"regexp"
"time"

"github.com/hashicorp/terraform-plugin-framework/attr"
"github.com/hashicorp/terraform-plugin-framework/path"
"github.com/hashicorp/terraform-plugin-framework/resource"
"github.com/hashicorp/terraform-plugin-framework/types"
"github.com/mongodb/terraform-provider-mongodbatlas/internal/common/conversion"

"github.com/hashicorp/terraform-plugin-framework/types"
"github.com/mongodb/terraform-provider-mongodbatlas/internal/config"
)

@@ -43,6 +45,7 @@ type TFStreamConnectionModel struct {
Config types.Map `tfsdk:"config"`
Security types.Object `tfsdk:"security"`
DBRoleToExecute types.Object `tfsdk:"db_role_to_execute"`
Networking types.Object `tfsdk:"networking"`
}

type TFConnectionAuthenticationModel struct {
@@ -77,6 +80,22 @@ var DBRoleToExecuteObjectType = types.ObjectType{AttrTypes: map[string]attr.Type{
"type": types.StringType,
}}

type TFNetworkingAccessModel struct {
Type types.String `tfsdk:"type"`
}

var NetworkingAccessObjectType = types.ObjectType{AttrTypes: map[string]attr.Type{
"type": types.StringType,
}}

type TFNetworkingModel struct {
Access TFNetworkingAccessModel `tfsdk:"access"`
}

var NetworkingObjectType = types.ObjectType{AttrTypes: map[string]attr.Type{
"access": NetworkingAccessObjectType,
}}

func (r *streamConnectionRS) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) {
resp.Schema = ResourceSchema(ctx)
conversion.UpdateSchemaDescription(&resp.Schema)
@@ -181,7 +200,7 @@ func (r *streamConnectionRS) Delete(ctx context.Context, req resource.DeleteRequ
projectID := streamConnectionState.ProjectID.ValueString()
instanceName := streamConnectionState.InstanceName.ValueString()
connectionName := streamConnectionState.ConnectionName.ValueString()
if _, _, err := connV2.StreamsApi.DeleteStreamConnection(ctx, projectID, instanceName, connectionName).Execute(); err != nil {
if err := DeleteStreamConnection(ctx, connV2.StreamsApi, projectID, instanceName, connectionName, time.Minute); err != nil {
resp.Diagnostics.AddError("error deleting resource", err.Error())
return
}