[Automation] Generate Fluent Lite from Swagger datafactory#package-2018-06 #43413

Merged — 5 commits, Dec 17, 2024
2 changes: 1 addition & 1 deletion eng/versioning/version_client.txt
@@ -326,7 +326,7 @@ com.azure.resourcemanager:azure-resourcemanager-frontdoor;1.0.0;1.1.0-beta.1
com.azure.resourcemanager:azure-resourcemanager-mixedreality;1.0.0-beta.3;1.0.0-beta.4
com.azure.resourcemanager:azure-resourcemanager-automation;1.0.0-beta.3;1.0.0-beta.4
com.azure.resourcemanager:azure-resourcemanager-resourcemover;1.2.0;1.3.0-beta.1
com.azure.resourcemanager:azure-resourcemanager-datafactory;1.0.0-beta.30;1.0.0
com.azure.resourcemanager:azure-resourcemanager-advisor;1.0.0-beta.3;1.0.0-beta.4
com.azure.resourcemanager:azure-resourcemanager-appconfiguration;1.0.0;1.1.0-beta.1
com.azure.resourcemanager:azure-resourcemanager-attestation;1.0.0-beta.3;1.0.0-beta.4
148 changes: 144 additions & 4 deletions sdk/datafactory/azure-resourcemanager-datafactory/CHANGELOG.md
@@ -1,14 +1,154 @@
# Release History

## 1.0.0 (2024-12-16)

- Azure Resource Manager DataFactory client library for Java. This package contains Microsoft Azure SDK for DataFactory Management SDK. The Azure Data Factory V2 management API provides a RESTful set of web services that interact with Azure Data Factory V2 services. Package tag package-2018-06. For documentation on how to use this package, please see [Azure Management Libraries for Java](https://aka.ms/azsdk/java/mgmt).

### Breaking Changes

#### `models.Expression` was modified

* `withType(java.lang.String)` was removed

#### `models.TumblingWindowTrigger` was modified

* `runtimeState()` was removed

#### `models.DatasetReference` was modified

* `withType(java.lang.String)` was removed

#### `models.CustomEventsTrigger` was modified

* `runtimeState()` was removed

#### `models.IntegrationRuntimeReference` was modified

* `withType(java.lang.String)` was removed

#### `models.MultiplePipelineTrigger` was modified

* `runtimeState()` was removed

#### `models.RerunTumblingWindowTrigger` was modified

* `runtimeState()` was removed

#### `models.ScheduleTrigger` was modified

* `runtimeState()` was removed

#### `models.LinkedServiceReference` was modified

* `withType(java.lang.String)` was removed

#### `models.PipelineReference` was modified

* `withType(java.lang.String)` was removed

#### `models.BlobEventsTrigger` was modified

* `runtimeState()` was removed

#### `models.BlobTrigger` was modified

* `runtimeState()` was removed

#### `models.SelfHostedIntegrationRuntimeStatus` was modified

* `dataFactoryName()` was removed
* `state()` was removed

#### `models.ManagedIntegrationRuntimeStatus` was modified

* `dataFactoryName()` was removed
* `state()` was removed

#### `models.ChainingTrigger` was modified

* `runtimeState()` was removed
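Most of these breaking changes remove setters and getters whose values are now fixed or read-only on the model. As a hedged migration sketch (assuming the remaining builder methods on the 1.0.0 reference models are unchanged), code that previously set a reference type explicitly simply drops that call:

```java
import com.azure.resourcemanager.datafactory.models.DatasetReference;

// Up to 1.0.0-beta.30, the reference type could be set explicitly:
//   new DatasetReference().withType("DatasetReference").withReferenceName("InputDataset");
// In 1.0.0, withType(String) is removed and the type is fixed by the model,
// so only the reference name needs to be supplied:
DatasetReference ref = new DatasetReference().withReferenceName("InputDataset");
```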

### Features Added

* `models.IcebergSink` was added

* `models.IcebergDataset` was added

* `models.IcebergWriteSettings` was added
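A minimal sketch of wiring the new Iceberg models together, assuming `IcebergSink` follows the same `formatSettings` pattern as the SDK's other format-aware sinks; the linked service name is purely illustrative:

```java
import com.azure.resourcemanager.datafactory.models.IcebergDataset;
import com.azure.resourcemanager.datafactory.models.IcebergSink;
import com.azure.resourcemanager.datafactory.models.IcebergWriteSettings;
import com.azure.resourcemanager.datafactory.models.LinkedServiceReference;

// Iceberg dataset bound to a hypothetical storage linked service.
IcebergDataset dataset = new IcebergDataset()
    .withLinkedServiceName(new LinkedServiceReference().withReferenceName("AdlsGen2LinkedService"));

// Copy-activity sink writing Iceberg; formatSettings mirrors the pattern
// used by other format-specific sinks (an assumption, not confirmed by this diff).
IcebergSink sink = new IcebergSink().withFormatSettings(new IcebergWriteSettings());
```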

#### `models.ScriptActivity` was modified

* `returnMultistatementResult()` was added
* `withReturnMultistatementResult(java.lang.Object)` was added

#### `models.SalesforceV2Source` was modified

* `pageSize()` was added
* `withPageSize(java.lang.Object)` was added

#### `models.MariaDBLinkedService` was modified

* `withUseSystemTrustStore(java.lang.Object)` was added
* `withSslMode(java.lang.Object)` was added
* `useSystemTrustStore()` was added
* `sslMode()` was added

#### `models.ServiceNowV2Source` was modified

* `pageSize()` was added
* `withPageSize(java.lang.Object)` was added

#### `models.SnowflakeV2LinkedService` was modified

* `withHost(java.lang.Object)` was added
* `host()` was added

#### `models.AzurePostgreSqlLinkedService` was modified

* `withEncoding(java.lang.Object)` was added
* `withPort(java.lang.Object)` was added
* `timeout()` was added
* `server()` was added
* `sslMode()` was added
* `encoding()` was added
* `withCommandTimeout(java.lang.Object)` was added
* `withTimezone(java.lang.Object)` was added
* `withTrustServerCertificate(java.lang.Object)` was added
* `trustServerCertificate()` was added
* `withServer(java.lang.Object)` was added
* `timezone()` was added
* `username()` was added
* `commandTimeout()` was added
* `withDatabase(java.lang.Object)` was added
* `withSslMode(java.lang.Object)` was added
* `withTimeout(java.lang.Object)` was added
* `database()` was added
* `readBufferSize()` was added
* `port()` was added
* `withReadBufferSize(java.lang.Object)` was added
* `withUsername(java.lang.Object)` was added
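The new granular settings can be combined on a single linked service. A hedged sketch with illustrative values only (each setter takes `Object` because these properties accept Data Factory expressions as well as literals):

```java
import com.azure.resourcemanager.datafactory.models.AzurePostgreSqlLinkedService;

// Illustrative values; in 1.0.0 the connection can be configured
// property-by-property instead of only via a single connection string.
AzurePostgreSqlLinkedService linkedService = new AzurePostgreSqlLinkedService()
    .withServer("example.postgres.database.azure.com") // hypothetical server name
    .withPort(5432)
    .withDatabase("mydb")
    .withUsername("myuser")
    .withSslMode(2)          // illustrative literal; an expression object is also accepted
    .withCommandTimeout(30);
```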

#### `models.PostgreSqlV2LinkedService` was modified

* `withAuthenticationType(java.lang.Object)` was added
* `authenticationType()` was added

#### `models.MySqlLinkedService` was modified

* `connectionTimeout()` was added
* `withSslKey(java.lang.Object)` was added
* `treatTinyAsBoolean()` was added
* `sslCert()` was added
* `withSslCert(java.lang.Object)` was added
* `withTreatTinyAsBoolean(java.lang.Object)` was added
* `convertZeroDateTime()` was added
* `withGuidFormat(java.lang.Object)` was added
* `allowZeroDateTime()` was added
* `sslKey()` was added
* `guidFormat()` was added
* `withAllowZeroDateTime(java.lang.Object)` was added
* `withConnectionTimeout(java.lang.Object)` was added
* `withConvertZeroDateTime(java.lang.Object)` was added

## 1.0.0-beta.30 (2024-08-21)

38 changes: 22 additions & 16 deletions sdk/datafactory/azure-resourcemanager-datafactory/README.md
@@ -32,7 +32,7 @@ Various documentation is available to help you get started
<dependency>
<groupId>com.azure.resourcemanager</groupId>
<artifactId>azure-resourcemanager-datafactory</artifactId>
<version>1.0.0</version>
</dependency>
```
[//]: # ({x-version-update-end})
@@ -72,30 +72,33 @@ See [API design][design] for general introduction on design and key concepts on

```java
// storage account
StorageAccount storageAccount = storageManager.storageAccounts()
.define(STORAGE_ACCOUNT)
.withRegion(REGION)
.withExistingResourceGroup(resourceGroup)
.create();
final String storageAccountKey = storageAccount.getKeys().iterator().next().value();
final String connectionString
= getStorageConnectionString(STORAGE_ACCOUNT, storageAccountKey, storageManager.environment());

// container
final String containerName = "adf";
storageManager.blobContainers()
.defineContainer(containerName)
.withExistingStorageAccount(resourceGroup, STORAGE_ACCOUNT)
.withPublicAccess(PublicAccess.NONE)
.create();

// blob as input
BlobClient blobClient = new BlobClientBuilder().connectionString(connectionString)
.containerName(containerName)
.blobName("input/data.txt")
.buildClient();
blobClient.upload(BinaryData.fromString("data"));

// data factory
Factory dataFactory = manager.factories()
.define(DATA_FACTORY)
.withRegion(REGION)
.withExistingResourceGroup(resourceGroup)
.create();
@@ -106,15 +109,16 @@ connectionStringProperty.put("type", "SecureString");
connectionStringProperty.put("value", connectionString);

final String linkedServiceName = "LinkedService";
manager.linkedServices()
.define(linkedServiceName)
.withExistingFactory(resourceGroup, DATA_FACTORY)
.withProperties(new AzureStorageLinkedService().withConnectionString(connectionStringProperty))
.create();

// input dataset
final String inputDatasetName = "InputDataset";
manager.datasets()
.define(inputDatasetName)
.withExistingFactory(resourceGroup, DATA_FACTORY)
.withProperties(new AzureBlobDataset()
.withLinkedServiceName(new LinkedServiceReference().withReferenceName(linkedServiceName))
@@ -125,7 +129,8 @@

// output dataset
final String outputDatasetName = "OutputDataset";
manager.datasets()
.define(outputDatasetName)
.withExistingFactory(resourceGroup, DATA_FACTORY)
.withProperties(new AzureBlobDataset()
.withLinkedServiceName(new LinkedServiceReference().withReferenceName(linkedServiceName))
@@ -135,14 +140,15 @@ manager.datasets().define(outputDatasetName)
.create();

// pipeline
PipelineResource pipeline = manager.pipelines()
.define("CopyBlobPipeline")
.withExistingFactory(resourceGroup, DATA_FACTORY)
.withActivities(Collections.singletonList(new CopyActivity().withName("CopyBlob")
.withSource(new BlobSource())
.withSink(new BlobSink())
.withInputs(Collections.singletonList(new DatasetReference().withReferenceName(inputDatasetName)))
.withOutputs(
Collections.singletonList(new DatasetReference().withReferenceName(outputDatasetName)))))
.create();

// run pipeline
14 changes: 7 additions & 7 deletions sdk/datafactory/azure-resourcemanager-datafactory/pom.xml
@@ -14,7 +14,7 @@

<groupId>com.azure.resourcemanager</groupId>
<artifactId>azure-resourcemanager-datafactory</artifactId>
<version>1.0.0</version> <!-- {x-version-update;com.azure.resourcemanager:azure-resourcemanager-datafactory;current} -->
<packaging>jar</packaging>

<name>Microsoft Azure SDK for DataFactory Management</name>
@@ -45,14 +45,9 @@
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<jacoco.min.linecoverage>0</jacoco.min.linecoverage>
<jacoco.min.branchcoverage>0</jacoco.min.branchcoverage>
<revapi.skip>true</revapi.skip>
<spotless.skip>false</spotless.skip>
</properties>
<dependencies>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-core</artifactId>
@@ -75,6 +70,11 @@
<version>1.14.2</version> <!-- {x-version-update;com.azure:azure-identity;dependency} -->
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-json</artifactId>
<version>1.3.0</version> <!-- {x-version-update;com.azure:azure-json;dependency} -->
</dependency>
<dependency>
<groupId>com.azure.resourcemanager</groupId>
<artifactId>azure-resourcemanager-storage</artifactId>
@@ -11,15 +11,15 @@
import com.azure.core.http.HttpPipelinePosition;
import com.azure.core.http.policy.AddDatePolicy;
import com.azure.core.http.policy.AddHeadersFromContextPolicy;
import com.azure.core.http.policy.BearerTokenAuthenticationPolicy;
import com.azure.core.http.policy.HttpLogOptions;
import com.azure.core.http.policy.HttpLoggingPolicy;
import com.azure.core.http.policy.HttpPipelinePolicy;
import com.azure.core.http.policy.HttpPolicyProviders;
import com.azure.core.http.policy.RequestIdPolicy;
import com.azure.core.http.policy.RetryOptions;
import com.azure.core.http.policy.RetryPolicy;
import com.azure.core.http.policy.UserAgentPolicy;
import com.azure.core.management.profile.AzureProfile;
import com.azure.core.util.Configuration;
import com.azure.core.util.logging.ClientLogger;
@@ -43,8 +43,8 @@
import com.azure.resourcemanager.datafactory.implementation.OperationsImpl;
import com.azure.resourcemanager.datafactory.implementation.PipelineRunsImpl;
import com.azure.resourcemanager.datafactory.implementation.PipelinesImpl;
import com.azure.resourcemanager.datafactory.implementation.PrivateEndPointConnectionsImpl;
import com.azure.resourcemanager.datafactory.implementation.PrivateEndpointConnectionOperationsImpl;
import com.azure.resourcemanager.datafactory.implementation.PrivateLinkResourcesImpl;
import com.azure.resourcemanager.datafactory.implementation.TriggerRunsImpl;
import com.azure.resourcemanager.datafactory.implementation.TriggersImpl;
@@ -66,8 +66,8 @@
import com.azure.resourcemanager.datafactory.models.Operations;
import com.azure.resourcemanager.datafactory.models.PipelineRuns;
import com.azure.resourcemanager.datafactory.models.Pipelines;
import com.azure.resourcemanager.datafactory.models.PrivateEndPointConnections;
import com.azure.resourcemanager.datafactory.models.PrivateEndpointConnectionOperations;
import com.azure.resourcemanager.datafactory.models.PrivateLinkResources;
import com.azure.resourcemanager.datafactory.models.TriggerRuns;
import com.azure.resourcemanager.datafactory.models.Triggers;
@@ -294,7 +294,7 @@ public DataFactoryManager authenticate(TokenCredential credential, AzureProfile
.append("-")
.append("com.azure.resourcemanager.datafactory")
.append("/")
.append("1.0.0");
if (!Configuration.getGlobalConfiguration().get("AZURE_TELEMETRY_DISABLED", false)) {
userAgentBuilder.append(" (")
.append(Configuration.getGlobalConfiguration().get("java.version"))
@@ -327,7 +327,7 @@ public DataFactoryManager authenticate(TokenCredential credential, AzureProfile
HttpPolicyProviders.addBeforeRetryPolicies(policies);
policies.add(retryPolicy);
policies.add(new AddDatePolicy());
policies.add(new BearerTokenAuthenticationPolicy(credential, scopes.toArray(new String[0])));
policies.addAll(this.policies.stream()
.filter(p -> p.getPipelinePosition() == HttpPipelinePosition.PER_RETRY)
.collect(Collectors.toList()));
@@ -363,7 +363,6 @@ public AmazonRdsForSqlServerLinkedServiceTypeProperties withPooling(Object pooli
*/
@Override
public void validate() {
if (password() != null) {
password().validate();
}