From 9d2537ebb6e0741a22879a69c4a812f69d5a119e Mon Sep 17 00:00:00 2001
From: Mohammad Derakhshani
Date: Mon, 3 May 2021 12:20:44 -0700
Subject: [PATCH] updated readme (#20956)

cosmos spark update readme after release
---
 sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
index 94ab1886315d4..f930191751e78 100644
--- a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
+++ b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
@@ -30,19 +30,20 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 
 | Connector     | Spark         | Minimum Java Version | Supported Scala Versions |
 | ------------- | ------------- | -------------------- | ----------------------- |
+| 4.0.0-beta.2  | 3.1.1         | 8                    | 2.12                     |
 | 4.0.0-beta.1  | 3.1.1         | 8                    | 2.12                     |
 
 ## Download
 
 You can use the maven coordinate of the jar to auto install the Spark Connector to your Databricks Runtime 8 from Maven:
-`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.0.0-beta.1`
+`com.azure.cosmos.spark:azure-cosmos-spark_3-1_2-12:4.0.0-beta.2`
 
 You can also integrate against Cosmos DB Spark Connector in your SBT project:
 ```scala
-libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.0.0-beta.1"
+libraryDependencies += "com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.0.0-beta.2"
 ```
 
-Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.0.0-beta.1/jar).
+Cosmos DB Spark Connector is available on [Maven Central Repo](https://search.maven.org/artifact/com.azure.cosmos.spark/azure-cosmos-spark_3-1_2-12/4.0.0-beta.2/jar).
 
 ### General