
[SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later #41943

Conversation

eejbyfeldt
Contributor

What changes were proposed in this pull request?

Drop the hardcoded -target:jvm-1.8 value from the scalac arguments in pom.xml.

Why are the changes needed?

The Maven build is broken when using Scala 2.13 and Java 11 or later.

It fails with

$ ./build/mvn compile -Pscala-2.13 -Djava.version=11 -X
...
[WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
[ERROR] [Error] : target platform version 8 is older than the release version 11
[WARNING] one warning found
[ERROR] one error found
...

when setting the java.version property, or with

$ ./build/mvn compile -Pscala-2.13
...
[WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: not found: value sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: not found: object sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: not found: object sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:210: not found: type Unsafe
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:212: not found: type Unsafe
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: Unused import
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala:452: not found: value sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: not found: object sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type SignalHandler
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:83: not found: type Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: type SignalHandler
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: value Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:114: not found: type Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:116: not found: value Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:128: not found: value Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
[WARNING] one warning found
[ERROR] 23 errors found
...

This is caused by the fact that pom.xml hardcodes -target:jvm-1.8 as a scalac argument (regardless of the value of java.version). This was fine for Scala 2.12.18 and Scala 2.13.8, because the scala-maven-plugin would also add a -target argument based on the java.version property (https://github.com/davidB/scala-maven-plugin/blob/4.8.0/src/main/java/scala_maven/ScalaMojoSupport.java#L629-L648); since that argument comes later, it took precedence over the hardcoded value and everything worked as expected.

The problem comes with Scala 2.13.11, where -target is deprecated, so the scala-maven-plugin uses the -release argument instead. The second failure, about not being able to access sun._ packages, is expected behavior when using -release 8 (see scala/bug#12643), but if one sets -release 11 when using Java 11, the sun._ access compiles just fine.
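
For illustration, here is a minimal sketch (not Spark code, just an assumed example) of the kind of JDK-internal access that -release 8 rejects on a newer JDK, mirroring the StorageUtils.scala and SignalUtils.scala errors above:

```scala
// Minimal sketch (not Spark code) of JDK-internal usage that fails to resolve
// under "-release 8" on JDK 11+: sun.* packages are excluded from the
// --release symbol tables, so these imports produce errors such as
// "not found: object sun", while plain -target compilation against the full
// JDK class path still accepts them.
import java.nio.ByteBuffer
import sun.nio.ch.DirectBuffer
import sun.misc.{Signal, SignalHandler}

object SunInternals {
  // Mirrors the DirectBuffer usage in StorageUtils.scala.
  def isDirect(buffer: ByteBuffer): Boolean = buffer.isInstanceOf[DirectBuffer]

  // Mirrors the Signal/SignalHandler usage in SignalUtils.scala.
  def onTerm(handler: SignalHandler): Unit =
    Signal.handle(new Signal("TERM"), handler)
}
```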

Note: builds using Scala 2.13 and Java 11 or later without setting java.version to the appropriate value will still fail.

Note 2: Java 8 builds still succeed, as rt.jar is passed on the -bootclasspath when using Java 8.

Does this PR introduce any user-facing change?

Fixes the Maven build when using Scala 2.13 and Java 11 or later.

How was this patch tested?

Existing CI builds and manual builds locally.

@github-actions github-actions bot added the BUILD label Jul 11, 2023
@pan3793
Member

pan3793 commented Jul 11, 2023

A quick question: previously, the output artifacts were runnable on JDK 8 regardless of the JDK version used for building. Is that still true after this change?

cc @LuciferYang

@LuciferYang
Contributor

It's my bad; I only tested Java 11 and 17 under SBT and missed the Maven scenario when upgrading to Scala 2.13.11.

@LuciferYang
Contributor

@eejbyfeldt Do you know why the SBT build does not fail?

```scala
javaVersion := SbtPomKeys.effectivePom.value.getProperties.get("java.version").asInstanceOf[String],
(Compile / javacOptions) ++= Seq(
  "-encoding", UTF_8.name(),
  "-source", javaVersion.value
),
// This -target and Xlint:unchecked options cannot be set in the Compile configuration scope since
// `javadoc` doesn't play nicely with them; see https://github.com/sbt/sbt/issues/355#issuecomment-3817629
// for additional discussion and explanation.
(Compile / compile / javacOptions) ++= Seq(
  "-target", javaVersion.value,
  "-Xlint:unchecked"
),
(Compile / scalacOptions) ++= Seq(
  s"-target:jvm-${javaVersion.value}",
  "-sourcepath", (ThisBuild / baseDirectory).value.getAbsolutePath // Required for relative source links in scaladoc
),
```

java -version
openjdk version "11.0.18" 2023-01-17 LTS
OpenJDK Runtime Environment Zulu11.62+17-CA (build 11.0.18+10-LTS)
OpenJDK 64-Bit Server VM Zulu11.62+17-CA (build 11.0.18+10-LTS, mixed mode)

build/sbt compile -Pscala-2.13 succeeds, and it looks like -target:jvm-${javaVersion.value} resolves to -target:jvm-1.8.

@LuciferYang
Contributor

LuciferYang commented Jul 12, 2023

friendly ping @dongjoon-hyun Should we keep using Scala 2.13.8 in Spark 3.5.0? WDYT?

also cc @srowen

@eejbyfeldt
Contributor Author

eejbyfeldt commented Jul 12, 2023

@eejbyfeldt Do you know why the SBT build does not fail?

The failures are not directly related to Scala 2.13.11. They are caused by the Maven plugin we are using for running scalac. For Scala versions greater than or equal to 2.13.9, that plugin sets -release instead of -target (https://github.com/davidB/scala-maven-plugin/blob/4.8.0/src/main/java/scala_maven/ScalaMojoSupport.java#L629-L648). And as noted in scala/bug#12643 and scala/bug#12824, replacing -target with -release is not a noop; it has a slightly different meaning and behavior. My understanding is that in the sbt build we only set -target and do not use -release, and therefore we do not run into the same issues with the sbt build.
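
To make the difference concrete, here is a hypothetical sketch (not the actual Spark build definition) of what the sbt build would look like if it passed -release instead of -target; with java.version left at its default, this would be expected to hit the same sun._ errors as the Maven build:

```scala
// Hypothetical variant of the SparkBuild.scala snippet quoted earlier
// (an illustration, not the actual Spark build): passing -release instead
// of -target would apply the stricter platform-API check under sbt too.
(Compile / scalacOptions) ++= Seq(
  // -release expects "8" rather than "1.8", so the pom-style value is normalized here.
  s"-release:${javaVersion.value.stripPrefix("1.")}",
  "-sourcepath", (ThisBuild / baseDirectory).value.getAbsolutePath
)
```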

A quick question: previously, the output artifacts were runnable on JDK 8 regardless of the JDK version used for building. Is that still true after this change?

Based on the example provided in scala/bug#12824, using -target:8 on Java 17 is unsafe and can cause runtime failures, so I am not sure we actually had that guarantee before. That said, as long as one does not override java.version, my understanding is that we get the same behavior as before and therefore the same compatibility.

@srowen
Member

srowen commented Jul 12, 2023

I think the issue is that you end up compiling against a later version of JDK libraries, which is not necessarily compatible, even if emitting bytecode for a lower Java version. However, yeah we've always accepted that and test it in CI/CD to make sure it works.
If this change works for Java 8 and later, it seems fine.

@eejbyfeldt
Contributor Author

I think the issue is that you end up compiling against a later version of JDK [...]

If this change works for Java 8 and later, it seems fine.

There are still issues compiling 2.13 with a newer Java like 11 while not setting java.version to 11. When the java.version property is not set, the scala-maven-plugin will set -release 8, which will not allow access to sun.* classes. This is the case currently in master and is not resolved by this PR.

@srowen
Member

srowen commented Jul 12, 2023

I think we target Java 8 for release builds and then want that to run on Java 11+. Does that work, or did that already work?

@LuciferYang
Contributor

I think we target Java 8 for release builds and then want that to run on Java 11+.

I think this is ok now

@LuciferYang
Contributor

LuciferYang commented Jul 12, 2023

I think the current issue is that we may have to specify the Java version through -Djava.version when using other Java versions for the Maven build & test (after fixing the issue in this PR).

I did the following experiment:

  1. hardcode the java.version in pom.xml to 11
  2. maven build using Java 17:
build/mvn clean install -DskipTests -Pscala-2.13

This fails with:

[WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: object nio is not a member of package sun
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import

We now have to use the following command to build using Java 17 (current usage in GitHub Actions):

build/mvn clean install -DskipTests -Pscala-2.13 -Djava.version=17

@srowen
Member

srowen commented Jul 12, 2023

I think the idea is you build with java 8, and test with 11/17 - does that work? or that is certainly what we want, to have one release that works across all the java versions.

@LuciferYang
Contributor

I think the idea is you build with java 8, and test with 11/17 - does that work? or that is certainly what we want, to have one release that works across all the java versions.

I think this is ok.

@pan3793
Member

pan3793 commented Jul 13, 2023

The java.version can be resolved by

       <profile>
            <id>java-8</id>
            <activation>
                <jdk>1.8</jdk>
            </activation>
            <properties>
                <java.version>1.8</java.version>
            </properties>
        </profile>

        <profile>
            <id>java-11</id>
            <activation>
                <jdk>11</jdk>
            </activation>
            <properties>
                <java.version>11</java.version>
            </properties>
        </profile>

@eejbyfeldt eejbyfeldt requested a review from LuciferYang July 31, 2023 08:17
@eejbyfeldt
Contributor Author

Any opinions on how we should proceed with this? It would be nice to have this fixed in the 3.5 branch, as not being able to build with Java 11 or newer is a regression compared to previous Spark releases.

@LuciferYang
Contributor

The java.version can be resolved by

       <profile>
            <id>java-8</id>
            <activation>
                <jdk>1.8</jdk>
            </activation>
            <properties>
                <java.version>1.8</java.version>
            </properties>
        </profile>

        <profile>
            <id>java-11</id>
            <activation>
                <jdk>11</jdk>
            </activation>
            <properties>
                <java.version>11</java.version>
            </properties>
        </profile>

Does this suggestion work? If it works, we don't need to manually specify -Djava.version=?

@eejbyfeldt eejbyfeldt force-pushed the fix-build-mvn-using-2.13-and-java11+ branch from 6804b1b to d5b9d4f Compare July 31, 2023 09:40
@eejbyfeldt
Contributor Author

eejbyfeldt commented Jul 31, 2023

Does this suggestion work? If it works, we don't need to manually specify -Djava.version=?

Added it to this PR. Seems to work based on my testing locally.

I did not add it immediately as it seemed like a change in behavior compared to now, and it was not clear to me that everyone agreed it was a desired solution.

@srowen
Member

srowen commented Jul 31, 2023

I think the issue is you will target Java 17 bytecode if running on 17, when we want to target 8 in all cases

@eejbyfeldt
Contributor Author

I think the issue is you will target Java 17 bytecode if running on 17, when we want to target 8 in all cases

If that is the case, then the changes currently in this PR are not what we want. But are we really sure that this is something that is expected or used? As far as I can tell, this is not something that actually worked in the past. If I take a Spark 3.4.1 build that I built using the v3.4.1 tag on Java 11 and then try to run spark-submit run-example SparkPi on Java 8, it fails with

2023-07-31 14:59:15,304 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) failed in 0.259 s due to Job aborted due to stage failure: Task serialization failed: java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
	at org.apache.spark.util.io.ChunkedByteBufferOutputStream.toChunkedByteBuffer(ChunkedByteBufferOutputStream.scala:115)
	at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:362)
	at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:160)
	at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:99)
	at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:38)
	at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:78)
	at org.apache.spark.SparkContext.broadcastInternal(SparkContext.scala:1548)
	at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1530)
	at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1535)
	at org.apache.spark.scheduler.DAGScheduler.submitStage(DAGScheduler.scala:1353)
	at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:1295)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2931)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2923)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2912)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)

I think the problem here boils down to the fact that we have previously used the scalac arg -target to attempt to achieve what you describe. But according to the comment here (scala/bug#12643 (comment)):

-target says "emit class file of version N, but I want to use arbitrary classes from the JDK and take my chances".

so only specifying -target is not the proper way to build on a later Java version while targeting Java 8. My understanding is that if that is what we actually want, then we would need to specify the Java version using -release and actually fix the build errors that it causes.
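
As a concrete illustration of that runtime failure (a minimal sketch, not Spark code): on JDK 9+, ByteBuffer overrides flip() with a covariant ByteBuffer return type, so code compiled against JDK 11+ class files with only -target:8 emits a call descriptor that does not exist on a Java 8 runtime.

```scala
// Minimal sketch (not Spark code) of the pattern behind the
// java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
// Compiled on JDK 11+ with only -target:8, the flip() call resolves to the
// covariant ByteBuffer.flip() overload and fails on a Java 8 runtime;
// compiling with -release 8 would link it against Buffer.flip() instead.
import java.nio.ByteBuffer

object FlipRepro {
  def main(args: Array[String]): Unit = {
    val buf = ByteBuffer.allocate(16)
    buf.put(42.toByte)
    buf.flip() // the call site whose descriptor depends on the compilation target
    println(buf.get()) // prints 42 when the call links successfully
  }
}
```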

@srowen
Member

srowen commented Jul 31, 2023

Right, targeting Java 8 bytecode is necessary but not sufficient to run on Java 8. I think that's why we build releases on Java 8. These releases should still work on later Java releases, at least that's what the CI jobs are trying to test.

@LuciferYang
Contributor

LuciferYang commented Aug 1, 2023

@eejbyfeldt can we change to use -release:8?

I have made the following changes based on your PR:

  1. upgrade scala-maven-plugin from 4.8.0 to 4.8.1
  2. change -target:jvm-1.8 to -release:8, both line 2911 and line 3652

then I tested:

java -version
openjdk version "17.0.8" 2023-07-18 LTS
OpenJDK Runtime Environment Zulu17.44+15-CA (build 17.0.8+7-LTS)
OpenJDK 64-Bit Server VM Zulu17.44+15-CA (build 17.0.8+7-LTS, mixed mode, sharing)

./build/mvn -DskipTests clean package   
./build/mvn clean compile -Pscala-2.13 

Both Scala 2.12 and Scala 2.13 with Java 17 build successfully, and -release is always 8.

@eejbyfeldt
Contributor Author

change -target:jvm-1.8 to -release:8, both line 2911 and line 3652

Hardcoding -release:8 with the new default activation will not actually set the -release config to 8. This is because the scala-maven-plugin will also append a -release flag based on the java.version property. Since the flag appended by scala-maven-plugin comes later in the list of args, it takes precedence. So while doing as you suggest will compile, it will not have created a Java 8 release. The args can be seen by passing -X to Maven:

$ ./build/mvn clean compile -Pscala-2.13 -X
...
[DEBUG] [zinc] Running cached compiler 76b0ae1b for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with:
        -unchecked
        -deprecation
        -feature
        -explaintypes
        -release:8
        -Wconf:cat=deprecation:wv,any:e
        -Wunused:imports
        -Wconf:cat=scaladoc:wv
        -Wconf:cat=lint-multiarg-infix:wv
        -Wconf:cat=other-nullary-override:wv
        -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
        -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
        -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
        -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
        -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
        -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
        -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
        -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
        -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
        -Wconf:cat=unchecked&msg=outer reference:s
        -Wconf:cat=unchecked&msg=eliminated by erasure:s
        -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
        -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
        -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
        -Wconf:msg=Implicit definition should have explicit type:s
        -release
        17
        -bootclasspath
        /home/eejbyfeldt/.m2/repository/org/scala-lang/scala-library/2.13.11/scala-library-2.13.11.jar
        -classpath
        /home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-io/9.4.50.v20221201/jetty-io-9.4.50.v20221201.jar:/home/eejbyfeldt/.m2/repository/org/slf4j/slf4j-api/2.0.7/slf4j-api-2.0.7.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-client/9.4.51.v20230217/jetty-client-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-http/9.4.51.v20230217/jetty-http-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-util/9.4.51.v20230217/jetty-util-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/eejbyfeldt/dev/apache/spark/common/tags/target/scala-2.13/classes:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-reflect/2.13.11/scala-reflect-2.13.11.jar:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-compiler/2.13.11/scala-compiler-2.13.11.jar:/home/eejbyfeldt/.m2/repository/io/github/java-diff-utils/java-diff-utils/4.12/java-diff-utils-4.12.jar:/home/eejbyfeldt/.m2/repository/org/jline/jline/3.22.0/jline-3.22.0.jar:/home/eejbyfeldt/.m2/repository/net/java/dev/jna/jna/5.13.0/jna-5.13.0.jar:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-library/2.13.11/scala-library-2.13.11.jar
...

Running with -Djava.version=8 will set -release to 8 properly, and then compilation fails with:

$ ./build/mvn clean compile -Pscala-2.13 -Djava.version=8
...
[INFO] Compiler bridge file: /home/eejbyfeldt/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.13-1.8.0-bin_2.13.11__61.0-1.8.0_20221110T195421.jar
[INFO] compiling 603 Scala sources and 77 Java sources to /home/eejbyfeldt/dev/apache/spark/core/target/scala-2.13/classes ...
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: not found: value sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: not found: object sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: not found: object sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:210: not found: type Unsafe
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:212: not found: type Unsafe
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: Unused import
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala:452: not found: value sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: not found: object sun
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type SignalHandler
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:83: not found: type Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: type SignalHandler
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: value Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:114: not found: type Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:116: not found: value Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:128: not found: value Signal
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
[ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
[ERROR] 23 errors found

and based on the discussion in scala/bug#12643 I believe this is the expected behavior.

@LuciferYang
Contributor

@eejbyfeldt Thank you for your response, I don’t have any further suggestions for now.

@srowen
Member

srowen commented Aug 1, 2023

Can we set java.version to 8?
I just don't see a direct use case in the current Spark build for targeting higher java versions.
Assuming that a Java 8 build works on Java 17.

@eejbyfeldt
Contributor Author

Assuming that a Java 8 build works on Java 17.

The problem is that this does not work with the current code, as building with -release 8 will forbid access to the classes in sun.*. That is basically the failing build I posted in my last comment.

@srowen
Member

srowen commented Aug 2, 2023

OK, got it. The problem is, I don't think it helps to target Java 11 here, as it will just make the release unusable on Java 8 right?

Is this workaround possible? scala/bug#12643 (comment)

Or else, just don't further upgrade Scala 2.13 until Java 8 support is dropped. That could reasonably happen in Spark 4

@LuciferYang
Contributor

I made the following attempt and tested using Java 17:

java -version                                                     
openjdk version "17.0.8" 2023-07-18 LTS
OpenJDK Runtime Environment Zulu17.44+15-CA (build 17.0.8+7-LTS)
OpenJDK 64-Bit Server VM Zulu17.44+15-CA (build 17.0.8+7-LTS, mixed mode, sharing)
  1. made the following changes to the code:
diff --git a/pom.xml b/pom.xml
index 2e9d1d2d8f3..af9481b0f37 100644
--- a/pom.xml
+++ b/pom.xml
@@ -112,7 +112,7 @@
   <properties>
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
     <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-    <java.version>1.8</java.version>
+    <java.version>11</java.version>
     <maven.compiler.source>${java.version}</maven.compiler.source>
     <maven.compiler.target>${java.version}</maven.compiler.target>
     <maven.version>3.8.8</maven.version>
@@ -2908,7 +2908,7 @@
               <arg>-deprecation</arg>
               <arg>-feature</arg>
               <arg>-explaintypes</arg>
-              <arg>-target:jvm-1.8</arg>
+              <arg>-target:11</arg>
               <arg>-Xfatal-warnings</arg>
               <arg>-Ywarn-unused:imports</arg>
               <arg>-P:silencer:globalFilters=.*deprecated.*</arg>
@@ -3619,7 +3619,7 @@
                   <arg>-deprecation</arg>
                   <arg>-feature</arg>
                   <arg>-explaintypes</arg>
-                  <arg>-target:jvm-1.8</arg>
+                  <arg>-target:11</arg>
                   <arg>-Wconf:cat=deprecation:wv,any:e</arg>
                   <arg>-Wunused:imports</arg>
                   <!--
  2. run dev/change-scala-version.sh 2.13 to change the Scala version
  3. run build/mvn clean install -DskipTests -Pscala-2.13 -Djava.version=17, which fails with:
[ERROR] [Error] : target platform version 11 is older than the release version 17
  4. run build/mvn clean install -DskipTests -Pscala-2.13, which fails with:
[WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: object nio is not a member of package sun
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
[ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
[WARNING] one warning found
[ERROR] 7 errors found
[INFO] ------------------------------------------------------------------------

So, it seems that even if we upgrade to Java 11, we still need to keep the values of java.version and -target:${version} consistent.

@LuciferYang
Contributor

Due to this issue, should we downgrade the Scala 2.13 version to 2.13.8 in branch-3.5? also cc @dongjoon-hyun

@srowen
Member

srowen commented Aug 5, 2023

Yeah, I also confirmed this fails by testing the 3.5.0 RC1 build. If you build with Java 8, but test on Java 17, it won't work. We want this combination to work. If 2.13.8 still works as before, I believe we should use that. (And I think we should drop java 8 support in 4.0)

@LuciferYang
Contributor

I submitted a PR to test the branch-3.5 branch with Scala 2.13.8; I will update the PR description later.

#42362

dongjoon-hyun pushed a commit that referenced this pull request Sep 13, 2023
### What changes were proposed in this pull request?
This PR downgrades `scala-maven-plugin` to version 4.7.1 to avoid it automatically adding the `-release` option as a Scala compilation argument.

### Why are the changes needed?
The `scala-maven-plugin` versions 4.7.2 and later will try to automatically append the `-release` option as a Scala compilation argument when it is not specified by the user:

1. 4.7.2 and 4.8.0: try to add the `-release` option for Scala versions 2.13.9 and higher.
2. 4.8.1: try to append the `-release` option for Scala versions 2.12.x/2.13.x/3.1.1, and append `-java-output-version` for Scala 3.1.2.

The addition of the `-release` option has caused the issues mentioned in SPARK-44376 | #41943 and #40442 (comment). This is because the `-release` option has stronger compilation restrictions than `-target`, ensuring not only the bytecode format, but also that the APIs used in the code are compatible with the specified version of Java. However, many APIs in the `sun.*` packages are not exported in Java 11, 17, and 21, such as `sun.nio.ch.DirectBuffer`, `sun.util.calendar.ZoneInfo`, and `sun.nio.cs.StreamDecoder`, making them invisible when compiling across different versions.

For discussions within the Scala community, see scala/bug#12643, scala/bug#12824, scala/bug#12866,  but this is not a bug.

I have also submitted an issue to the `scala-maven-plugin` community to discuss the possibility of adding additional settings to control the addition of the `-release` option: davidB/scala-maven-plugin#722.

For Apache Spark 4.0, in the short term, I suggest downgrading `scala-maven-plugin` to version 4.7.1 to avoid it automatically adding the `-release` option as a Scala compilation argument. In the long term, we should reduce the use of APIs that are not exported, for compatibility with the `-release` compilation option, since `-target` is already deprecated as of Scala 2.13.9.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Pass GitHub Actions
- Manual check

run `git revert 656bf36` to revert to using Scala 2.13.11 and run `dev/change-scala-version.sh 2.13` to change Scala to 2.13

1. Run `build/mvn clean install -DskipTests -Pscala-2.13 -X` to check the Scala compilation arguments.

Before

```
[DEBUG] [zinc] Running cached compiler 1992eaf4 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with:
  -unchecked
  -deprecation
  -feature
  -explaintypes
  -target:jvm-1.8
  -Wconf:cat=deprecation:wv,any:e
  -Wunused:imports
  -Wconf:cat=scaladoc:wv
  -Wconf:cat=lint-multiarg-infix:wv
  -Wconf:cat=other-nullary-override:wv
  -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
  -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
  -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
  -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
  -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
  -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
  -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
  -Wconf:cat=unchecked&msg=outer reference:s
  -Wconf:cat=unchecked&msg=eliminated by erasure:s
  -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
  -Wconf:msg=Implicit definition should have explicit type:s
  -release
  8
  -bootclasspath
...
```

After

```
[DEBUG] [zinc] Running cached compiler 72dd4888 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with:
  -unchecked
  -deprecation
  -feature
  -explaintypes
  -target:jvm-1.8
  -Wconf:cat=deprecation:wv,any:e
  -Wunused:imports
  -Wconf:cat=scaladoc:wv
  -Wconf:cat=lint-multiarg-infix:wv
  -Wconf:cat=other-nullary-override:wv
  -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
  -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
  -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
  -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
  -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
  -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
  -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
  -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
  -Wconf:cat=unchecked&msg=outer reference:s
  -Wconf:cat=unchecked&msg=eliminated by erasure:s
  -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
  -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
  -Wconf:msg=Implicit definition should have explicit type:s
  -target:8
  -bootclasspath
  ...
```

After downgrading the version, the `-release` option should no longer appear in the compilation arguments.

2. Maven can build the project with Java 17 without the issue described in #41943. And after this PR, we can re-upgrade Scala 2.13 to 2.13.11.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42899 from LuciferYang/SPARK-45144.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
@eejbyfeldt
Contributor Author

Closing this as it is no longer relevant. In 3.5, Scala 2.13 was downgraded, and the update will be done in #42918.

@eejbyfeldt eejbyfeldt closed this Sep 15, 2023
@eejbyfeldt eejbyfeldt deleted the fix-build-mvn-using-2.13-and-java11+ branch September 15, 2023 06:46
dongjoon-hyun pushed a commit that referenced this pull request Sep 16, 2023
### What changes were proposed in this pull request?
This PR aims to re-upgrade Scala to 2.13.11; after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`; this is a new compile check introduced by scala/scala#10083, and we must fix it when upgrading to Scala 3.
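
For illustration (a minimal sketch outside the Spark code base), the kind of definition this new check flags looks like:

```scala
// Minimal sketch (not Spark code) of what the Scala 2.13.11 check
// "Implicit definition should have explicit type" warns about.
object ImplicitTypeExample {
  implicit val inferredSeparator = ","           // warns: result type is inferred (String)
  implicit val annotatedTimeoutMs: Long = 30000L // fine: result type is explicit
}
```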

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warning messages related to `match may not be exhaustive` in the Spark compile log, and for the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; there is just

https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130

in the Spark code, and I checked that `javax.annotation.Nonnull` does not have this issue.

So I think these two issues will not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes are as follows:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing Test

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
szehon-ho pushed a commit to szehon-ho/spark that referenced this pull request Aug 7, 2024
szehon-ho pushed a commit to szehon-ho/spark that referenced this pull request Aug 7, 2024