Commit
Merge remote-tracking branch 'origin/branch-22.10' into SP-4656
Signed-off-by: Raza Jafri <[email protected]>
razajafri committed Aug 31, 2022
2 parents b9cc96c + 14d2d10 commit b1f367b
Showing 62 changed files with 2,027 additions and 948 deletions.
1 change: 1 addition & 0 deletions .github/workflows/blossom-ci.yml
@@ -67,6 +67,7 @@ jobs:
mattahrens,\
sinkinben,\
thirtiseven,\
YanxuanLiu,\
', format('{0},', github.actor)) && github.event.comment.body == 'build'
steps:
- name: Check if comment is issued by authorized person
4 changes: 2 additions & 2 deletions .github/workflows/mvn-verify-check.yml
@@ -68,7 +68,7 @@ jobs:

- name: package aggregator check
run: >
mvn -B package -pl aggregator -am
mvn -Dmaven.wagon.http.retryHandler.count=3 -B package -pl aggregator -am
-P 'individual,pre-merge'
-Dbuildver=${{ matrix.spark-version }}
-DskipTests
@@ -92,7 +92,7 @@
# includes RAT, code style and doc-gen checks of default shim
- name: verify all modules with lowest-supported Spark version
run: >
mvn -B verify
mvn -Dmaven.wagon.http.retryHandler.count=3 -B verify
-P 'individual,pre-merge'
-Dbuildver=${{ needs.get-noSnapshot-versions-from-dist.outputs.sparkHeadVersion }}
-DskipTests
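Both hunks above add `-Dmaven.wagon.http.retryHandler.count=3`, which tells Maven's Wagon HTTP transport to retry a failed artifact download up to three times, hardening CI against transient network errors. A minimal sketch of the same hardening applied to a local run (mirroring the verify invocation in the workflow; no `<test>` is meaningful here since it is a build command):

```shell
# Retry transient HTTP failures up to 3 times when resolving artifacts;
# -B keeps Maven in batch (non-interactive) mode, as in the CI workflow.
mvn -Dmaven.wagon.http.retryHandler.count=3 -B verify \
    -P 'individual,pre-merge' \
    -DskipTests
```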
63 changes: 62 additions & 1 deletion dist/pom.xml
@@ -58,6 +58,7 @@
312db,
321db
</databricks.buildvers>
<dist.jar.name>${project.build.directory}/${project.build.finalName}-${cuda.version}.jar</dist.jar.name>
</properties>
<profiles>
<profile>
@@ -331,7 +332,8 @@
<configuration>
<target>
<zip update="true" basedir="${project.build.directory}/extra-resources"
destfile="${project.build.directory}/${project.build.finalName}-${cuda.version}.jar"/>
compress="${dist.jar.compress}"
destfile="${dist.jar.name}"/>
</target>
</configuration>
</execution>
@@ -450,6 +452,65 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>3.0.1</version>
<executions>
<execution>
<id>default-install</id>
<phase>none</phase>
</execution>
<execution>
<id>install-parallel-worlds-jar</id>
<phase>install</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<file>${dist.jar.name}</file>
<artifactId>${project.artifactId}</artifactId>
<classifier>${cuda.version}</classifier>
<groupId>${project.groupId}</groupId>
<version>${project.version}</version>
<packaging>jar</packaging>
<!-- pomFile will be taken from META-INF in jar
https://github.com/apache/maven-install-plugin/blob/9f77fb95ab2a95b1d8d0c34c39c6f088f9f690ab/src/main/java/org/apache/maven/plugins/install/InstallFileMojo.java#L309
-->
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-deploy-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<id>default-deploy</id>
<phase>none</phase>
</execution>
<execution>
<id>deploy-parallel-worlds-jar</id>
<phase>deploy</phase>
<goals>
<goal>deploy-file</goal>
</goals>
<configuration>
<file>${dist.jar.name}</file>
<url>file://${java.io.tmpdir}/m2-repo</url>
<artifactId>${project.artifactId}</artifactId>
<classifier>${cuda.version}</classifier>
<groupId>${project.groupId}</groupId>
<packaging>jar</packaging>
<!-- pomFile will be taken from META-INF in jar
https://github.com/apache/maven-deploy-plugin/blob/4a72d8e9778c1878058435bdb919d40d65c879dd/src/main/java/org/apache/maven/plugins/deploy/DeployFileMojo.java#L186
-->
<version>${project.version}</version>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
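The two plugin blocks added above follow the same pattern: the `default-install` and `default-deploy` executions are bound to phase `none`, which disables the standard handling of the dist artifact, and replacement executions use the `install-file` / `deploy-file` goals to publish the cuda-classified jar instead (the pom is read from the jar's META-INF, per the linked plugin sources). A hypothetical stand-alone equivalent of the install execution, with placeholder coordinates rather than values taken from this commit:

```shell
# Illustrative equivalent of the install-parallel-worlds-jar execution.
# <groupId>, <artifactId>, <version>, and the jar path are placeholders.
mvn install:install-file \
    -Dfile=target/<artifactId>-<version>-cuda11.jar \
    -DgroupId=<groupId> \
    -DartifactId=<artifactId> \
    -Dversion=<version> \
    -Dclassifier=cuda11 \
    -Dpackaging=jar
```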
Empty file removed: dist/unshimmed-spark311.txt
6 changes: 4 additions & 2 deletions docs/compatibility.md
@@ -882,6 +882,8 @@ Casting from string to timestamp currently has the following limitations.
| `"yyyy-[M]M "` | Yes |
| `"yyyy-[M]M-[d]d"` | Yes |
| `"yyyy-[M]M-[d]d "` | Yes |
| `"yyyy-[M]M-[d]dT[h]h:[m]m:[s]s[zone_id]"` | Partial [\[1\]](#Footnote1) |
| `"yyyy-[M]M-[d]d [h]h:[m]m:[s]s[zone_id]"` | Partial [\[1\]](#Footnote1) |
| `"yyyy-[M]M-[d]dT[h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]"` | Partial [\[1\]](#Footnote1) |
| `"yyyy-[M]M-[d]d [h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]"` | Partial [\[1\]](#Footnote1) |
| `"[h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]"` | Partial [\[1\]](#Footnote1) |
@@ -892,8 +894,8 @@ Casting from string to timestamp currently has the following limitations.
| `"tomorrow"` | Yes |
| `"yesterday"` | Yes |

- <a name="Footnote1"></a>[1] The timestamp portion must have 6 digits for milliseconds.
Only timezone 'Z' (UTC) is supported. Casting unsupported formats will result in null values.
- <a name="Footnote1"></a>[1] Leap seconds are not supported. If a zone_id is provided then only
timezone 'Z' (UTC) is supported. Casting unsupported formats will result in null values.
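The date/time patterns in the table above can be approximated with a regular expression. This is only an illustrative sketch of the documented formats, not the plugin's actual parser; it omits the time-only and named-value forms (`epoch`, `now`, etc.) and assumes the fractional-seconds field, when present, has one to six digits:

```python
import re

# Rough approximation of the string-to-timestamp patterns the table marks
# as supported. Only the 'Z' (UTC) zone id is accepted, per the footnote.
_SUPPORTED = re.compile(
    r"^\d{4}-\d{1,2}"                 # yyyy-[M]M
    r"(-\d{1,2}"                      # optional -[d]d
    r"([ T]\d{1,2}:\d{1,2}:\d{1,2}"   # optional time, ' ' or 'T' separated
    r"(\.\d{1,6})?"                   # optional fractional seconds
    r"Z?)?)?"                         # optional 'Z' zone id
    r" ?$"                            # optional trailing space
)

def looks_supported(s: str) -> bool:
    """True if `s` matches one of the documented cast patterns."""
    return bool(_SUPPORTED.match(s))

print(looks_supported("2022-01-31T12:34:56Z"))  # True
print(looks_supported("2022-1"))                # True
```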

Spark is very lenient when casting from string to timestamp because all date and time components
are optional, meaning that input values such as `T`, `T2`, `:`, `::`, `1:`, `:1`, and `::1`
1 change: 1 addition & 0 deletions docs/configs.md
@@ -225,6 +225,7 @@ Name | SQL Function(s) | Description | Default Value | Notes
<a name="sql.expression.Explode"></a>spark.rapids.sql.expression.Explode|`explode`, `explode_outer`|Given an input array produces a sequence of rows for each value in the array|true|None|
<a name="sql.expression.Expm1"></a>spark.rapids.sql.expression.Expm1|`expm1`|Euler's number e raised to a power minus 1|true|None|
<a name="sql.expression.Floor"></a>spark.rapids.sql.expression.Floor|`floor`|Floor of a number|true|None|
<a name="sql.expression.FromUTCTimestamp"></a>spark.rapids.sql.expression.FromUTCTimestamp|`from_utc_timestamp`|Render the input UTC timestamp in the input timezone|true|None|
<a name="sql.expression.FromUnixTime"></a>spark.rapids.sql.expression.FromUnixTime|`from_unixtime`|Get the string from a unix timestamp|true|None|
<a name="sql.expression.GetArrayItem"></a>spark.rapids.sql.expression.GetArrayItem| |Gets the field at `ordinal` in the Array|true|None|
<a name="sql.expression.GetArrayStructFields"></a>spark.rapids.sql.expression.GetArrayStructFields| |Extracts the `ordinal`-th fields of all array elements for the data with the type of array of struct|true|None|
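The newly listed `FromUTCTimestamp` expression renders a UTC timestamp in a target timezone. Its SQL-level behavior can be sketched in plain Python (an illustration of `from_utc_timestamp` semantics, not the GPU implementation; `zoneinfo` requires Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def from_utc_timestamp(ts: datetime, tz: str) -> datetime:
    """Treat the naive timestamp `ts` as UTC and render it in `tz`,
    returning a naive (zone-less) result, mirroring the SQL function."""
    return (ts.replace(tzinfo=ZoneInfo("UTC"))
              .astimezone(ZoneInfo(tz))
              .replace(tzinfo=None))

# 12:00 UTC is 21:00 in Seoul (UTC+9, no DST).
print(from_utc_timestamp(datetime(2022, 8, 31, 12, 0), "Asia/Seoul"))
# 2022-08-31 21:00:00
```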