build: switch to Java 17 for all modules #388

Open · wants to merge 1 commit into main

Conversation

nielspardon (Contributor)

We've been seeing occasional compilation errors where the Gradle toolchain JDK auto-detection picked up a Java 11 JDK, which does not work with some of the newer syntax being used, especially in unit tests.

The behavior was a bit random: sometimes it picked up Java 17 and the build went through; other times it picked up Java 11 and failed to compile.

I would propose switching all modules to Java 17. This made compilation more reliable for me locally.
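For illustration, a minimal sketch of what pinning the toolchain could look like in a module's build.gradle.kts; the placement and exact block are assumptions, not the PR's diff:

```kotlin
// Sketch (assumed placement): pin the Java toolchain so Gradle provisions a
// Java 17 JDK instead of auto-detecting whatever happens to be installed.
java {
    toolchain {
        languageVersion.set(JavaLanguageVersion.of(17))
    }
}
```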

I had to configure one JVM arg for the Spark tests to run successfully, since something in those tests tries to access a protected module.
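As a hedged sketch of what such a flag could look like in Gradle: the `sun.nio.ch` module below is an assumption (a common requirement for Spark on Java 17), as the PR does not name the module the tests actually touch.

```kotlin
// Sketch: forward an --add-opens flag to test JVMs. The module named here
// is assumed, not taken from the PR.
tasks.withType<Test>().configureEach {
    jvmArgs("--add-opens=java.base/sun.nio.ch=ALL-UNNAMED")
}
```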

BREAKING CHANGE: this might break consumers running older Java versions

Signed-off-by: Niels Pardon <[email protected]>
@vbarua (Member) commented Apr 11, 2025

(not a Java build expert)

My understanding based on

is that we explicitly want to support Java 8 for Spark. If we want to change that, we should check with the broader community.

That being said, I've also run into annoying build issues locally because of this, especially with Jabel (#195), and have considered ripping that out entirely to see if that helps.
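For context, a minimal sketch of one possible Jabel-free approach: targeting Java 8 directly via the compiler's --release flag, which, unlike Jabel, also restricts source syntax and APIs to Java 8. This is an assumption about one alternative, not what the repository does:

```kotlin
// Sketch (assumed alternative): emit Java 8 compatible bytecode from a newer
// JDK, at the cost of losing the newer syntax that Jabel enables.
tasks.withType<JavaCompile>().configureEach {
    options.release.set(8)
}
```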

@nielspardon (Contributor, Author)

Yeah, I tried to dig into that and found the following page from Cloudera, which says that Java 17 has been supported since Spark 3.3:

https://community.cloudera.com/t5/Community-Articles/Spark-and-Java-versions-Supportability-Matrix/ta-p/383669

I consulted my colleague @andrew-coleman, who has been working on the Spark mappings for us, and he told me he only runs with Java 17.

And I found this line in substrait-spark/build.gradle, which makes me believe we are already compiling Scala to the Java 17 release level:

```
scalaCompileOptions.additionalParameters = listOf("-release:17")
```
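For context, that flag would typically sit inside a ScalaCompile task configuration; the surrounding block below is a sketch of the likely shape, assuming the Kotlin DSL, not a copy of the actual file:

```kotlin
// Sketch (assumed shape): apply -release:17 to every Scala compilation task
// so Scala output targets the Java 17 release level discussed above.
tasks.withType<ScalaCompile>().configureEach {
    scalaCompileOptions.additionalParameters = listOf("-release:17")
}
```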

@nielspardon (Contributor, Author)

I cannot find any dependency on substrait-java in the Apache Gluten project. It rather looks like they have their own implementation of the Java classes:
https://github.com/apache/incubator-gluten/blob/main/gluten-substrait/pom.xml
https://github.com/apache/incubator-gluten/tree/main/gluten-substrait/src/main/java/org/apache/gluten/substrait

@EpsilonPrime (Member)

Gluten has their own fork of the Substrait protos. Ideally we can unfork their efforts at some point, but they aren't willing to put in any work towards that end despite being an Apache project.
