4 changes: 2 additions & 2 deletions .github/workflows/benchmark.yml
@@ -27,9 +27,9 @@ on:
required: true
default: '*'
jdk:
- description: 'JDK version: 8, 11, 17 or 21-ea'
+ description: 'JDK version: 17 or 21'
required: true
- default: '8'
+ default: '17'
scala:
description: 'Scala version: 2.13'
required: true
15 changes: 7 additions & 8 deletions .github/workflows/build_and_test.yml
@@ -25,7 +25,7 @@ on:
java:
required: false
type: string
- default: 8
+ default: 17
branch:
description: Branch to run the build against
required: false
@@ -649,11 +649,11 @@ jobs:
if [ -f ./dev/free_disk_space_container ]; then
./dev/free_disk_space_container
fi
- - name: Install Java 8
+ - name: Install Java ${{ inputs.java }}
Contributor Author: It seems easier to install Java this way; the daily tests of branch-3.x all run with `java: 8` (see the YAML sketch after this step).
uses: actions/setup-java@v3
with:
distribution: zulu
- java-version: 8
+ java-version: ${{ inputs.java }}
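For readers following along: a daily build for an older branch could keep JDK 8 by passing the `java` input when it calls this reusable workflow. A minimal sketch — the file name and env values are hypothetical, not part of this PR:

```yaml
# Hypothetical daily workflow for an older branch (names and values are
# assumptions). It reuses build_and_test.yml and pins the JDK via the
# `java` input that this PR threads through to setup-java.
name: "Build (branch-3.5, Hadoop 3, JDK 8)"

on:
  schedule:
    - cron: '0 9 * * *'

jobs:
  run-build:
    name: Run
    uses: ./.github/workflows/build_and_test.yml
    if: github.repository == 'apache/spark'
    with:
      java: 8
      branch: branch-3.5
      hadoop: hadoop3
```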
- name: License test
run: ./dev/check-license
- name: Dependencies test
@@ -777,7 +777,6 @@ jobs:
fail-fast: false
matrix:
java:
- - 11
- 17
- 21
runs-on: ubuntu-22.04
@@ -868,11 +867,11 @@ jobs:
key: tpcds-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
restore-keys: |
tpcds-coursier-
- - name: Install Java 8
+ - name: Install Java ${{ inputs.java }}
uses: actions/setup-java@v3
with:
distribution: zulu
- java-version: 8
+ java-version: ${{ inputs.java }}
- name: Cache TPC-DS generated data
id: cache-tpcds-sf-1
uses: actions/cache@v3
@@ -974,11 +973,11 @@ jobs:
key: docker-integration-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
restore-keys: |
docker-integration-coursier-
- - name: Install Java 8
+ - name: Install Java ${{ inputs.java }}
uses: actions/setup-java@v3
with:
distribution: zulu
- java-version: 8
+ java-version: ${{ inputs.java }}
- name: Run tests
run: |
./dev/run-tests --parallelism 1 --modules docker-integration-tests --included-tags org.apache.spark.tags.DockerTest
4 changes: 2 additions & 2 deletions .github/workflows/build_ansi.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build / ANSI (master, Hadoop 3, JDK 8, Scala 2.13)"
name: "Build / ANSI (master, Hadoop 3, JDK 17, Scala 2.13)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
- java: 8
+ java: 17
branch: master
hadoop: hadoop3
envs: >-
4 changes: 2 additions & 2 deletions .github/workflows/build_coverage.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build / Coverage (master, Scala 2.13, Hadoop 3, JDK 8)"
name: "Build / Coverage (master, Scala 2.13, Hadoop 3, JDK 17)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
- java: 8
+ java: 17
branch: master
hadoop: hadoop3
envs: >-
49 changes: 0 additions & 49 deletions .github/workflows/build_java11.yml

This file was deleted.

49 changes: 0 additions & 49 deletions .github/workflows/build_java17.yml

This file was deleted.

4 changes: 2 additions & 2 deletions .github/workflows/build_rockdb_as_ui_backend.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build / RocksDB as UI Backend (master, Hadoop 3, JDK 8, Scala 2.13)"
name: "Build / RocksDB as UI Backend (master, Hadoop 3, JDK 17, Scala 2.13)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
- java: 8
+ java: 17
branch: master
hadoop: hadoop3
envs: >-
9 changes: 8 additions & 1 deletion .github/workflows/publish_snapshot.yml
@@ -47,11 +47,18 @@ jobs:
key: snapshot-maven-${{ hashFiles('**/pom.xml') }}
restore-keys: |
snapshot-maven-
- - name: Install Java 8
+ - name: Install Java 8 for branch-3.x
+ if: matrix.branch == 'branch-3.5' || matrix.branch == 'branch-3.4' || matrix.branch == 'branch-3.3'
uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
+ - name: Install Java 17
+ if: matrix.branch != 'branch-3.5' && matrix.branch != 'branch-3.4' && matrix.branch != 'branch-3.3'
+ uses: actions/setup-java@v3
+ with:
+ distribution: temurin
+ java-version: 17
- name: Publish snapshot
env:
ASF_USERNAME: ${{ secrets.NEXUS_USER }}
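For context, the `matrix.branch` conditions above assume a branch matrix roughly like the sketch below (illustrative; the actual branch list in the workflow may differ):

```yaml
# Assumed shape of the publish-snapshot job matrix (a sketch, not the
# exact file contents). The two setup-java steps above select JDK 8
# for the 3.x branches and JDK 17 for everything else (i.e. master).
jobs:
  publish-snapshot:
    strategy:
      fail-fast: false
      matrix:
        branch:
          - master
          - branch-3.5
          - branch-3.4
          - branch-3.3
```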
2 changes: 1 addition & 1 deletion dev/infra/Dockerfile
@@ -30,7 +30,7 @@ RUN apt-get update && apt-get install -y \
pkg-config \
curl \
wget \
- openjdk-8-jdk \
+ openjdk-17-jdk-headless \
Contributor Author: @dongjoon-hyun @HyukjinKwon should we change this?

Member: cc @Yikun

Contributor Author: @dongjoon-hyun So in this PR, should we revert this change and consider it as a follow-up?

Member: Hmm, the JDK install here seems unnecessary, because we use a GitHub Action to install Java. But if CI passes, it is OK.

Contributor Author: Thanks @Yikun! Let me continue to monitor the GitHub Actions runs.
gfortran \
libopenblas-dev \
liblapack-dev \
2 changes: 1 addition & 1 deletion docs/building-spark.md
@@ -27,7 +27,7 @@ license: |
## Apache Maven

The Maven-based build is the build of reference for Apache Spark.
- Building Spark using Maven requires Maven 3.9.4 and Java 8/11/17.
+ Building Spark using Maven requires Maven 3.9.4 and Java 17.
Contributor Author: Should we add "Support for Java 8/11 was removed in Spark 4.0.0."?

Spark requires Scala 2.13; support for Scala 2.12 was removed in Spark 4.0.0.
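To illustrate the new requirement, a build step under JDK 17 might look like this — a hedged sketch; the `JAVA_HOME` path is an assumption, and the memory settings are the subject of the next section:

```yaml
# Sketch: building Spark with the bundled Maven wrapper under JDK 17.
# The JAVA_HOME path is an assumption; adjust for your system.
- name: Build Spark with Maven
  run: |
    export JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64
    export MAVEN_OPTS="-Xss64m -Xmx2g"
    ./build/mvn -DskipTests clean package
```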

### Setting up Maven's Memory Usage
5 changes: 2 additions & 3 deletions docs/index.md
@@ -34,12 +34,11 @@ source, visit [Building Spark](building-spark.html).

Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a supported version of Java. This should include JVMs on x86_64 and ARM64. It's easy to run locally on one machine --- all you need is to have `java` installed on your system `PATH`, or the `JAVA_HOME` environment variable pointing to a Java installation.

- Spark runs on Java 8/11/17, Scala 2.13, Python 3.8+, and R 3.5+.
- Java 8 prior to version 8u371 support is deprecated as of Spark 3.5.0.
+ Spark runs on Java 17, Scala 2.13, Python 3.8+, and R 3.5+.
When using the Scala API, it is necessary for applications to use the same version of Scala that Spark was compiled for.
For example, when using Scala 2.13, use Spark compiled for 2.13, and compile code/applications for Scala 2.13 as well.

- For Java 11, setting `-Dio.netty.tryReflectionSetAccessible=true` is required for the Apache Arrow library. This prevents the `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` error when Apache Arrow uses Netty internally.
+ For Java 17, setting `-Dio.netty.tryReflectionSetAccessible=true` is required for the Apache Arrow library. This prevents the `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` error when Apache Arrow uses Netty internally.
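One way to pass that flag to both the driver and the executors is via `spark-submit`, using the standard `extraJavaOptions` configuration keys — a minimal sketch:

```yaml
# Sketch: forwarding the Netty flag to driver and executor JVMs.
# spark.{driver,executor}.extraJavaOptions are standard Spark configs;
# the example application is the bundled Pi example.
- name: Run an example with the Arrow/Netty flag
  run: |
    ./bin/spark-submit \
      --conf "spark.driver.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true" \
      --conf "spark.executor.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true" \
      examples/src/main/python/pi.py 100
```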

# Running the Examples and Shell

10 changes: 5 additions & 5 deletions docs/security.md
@@ -498,8 +498,8 @@ replaced with one of the above namespaces.
A comma-separated list of ciphers. The specified ciphers must be supported by JVM.

<br />The reference list of protocols can be found in the "JSSE Cipher Suite Names" section
- of the Java security guide. The list for Java 8 can be found at
- <a href="https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html#ciphersuites">this</a>
+ of the Java security guide. The list for Java 17 can be found at
+ <a href="https://docs.oracle.com/en/java/javase/17/docs/specs/security/standard-names.html#jsse-cipher-suite-names">this</a>
page.

<br />Note: If not set, the default cipher suite for the JRE will be used.
@@ -537,8 +537,8 @@ replaced with one of the above namespaces.
TLS protocol to use. The protocol must be supported by JVM.

<br />The reference list of protocols can be found in the "Additional JSSE Standard Names"
- section of the Java security guide. For Java 8, the list can be found at
- <a href="https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html#jssenames">this</a>
+ section of the Java security guide. For Java 17, the list can be found at
+ <a href="https://docs.oracle.com/en/java/javase/17/docs/specs/security/standard-names.html#additional-jsse-standard-names">this</a>
page.
</td>
</tr>
@@ -591,7 +591,7 @@ Or via SparkConf "spark.hadoop.hadoop.security.credential.provider.path=jceks://
## Preparing the key stores

Key stores can be generated by `keytool` program. The reference documentation for this tool for
- Java 8 is [here](https://docs.oracle.com/javase/8/docs/technotes/tools/unix/keytool.html).
+ Java 17 is [here](https://docs.oracle.com/en/java/javase/17/docs/specs/man/keytool.html).
The most basic steps to configure the key stores and the trust store for a Spark Standalone
deployment mode is as follows:
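The diff truncates the steps here; for orientation, the kind of `keytool` invocations the doc goes on to describe look roughly like this — a sketch in which aliases, paths, and passwords are all placeholders:

```yaml
# Hedged sketch of generating a key store, exporting its certificate,
# and importing it into a trust store. All names are placeholders.
- name: Prepare key store and trust store
  run: |
    keytool -genkeypair -alias spark -keyalg RSA -keysize 4096 \
      -keystore keystore.jks -storepass password
    keytool -exportcert -alias spark -keystore keystore.jks \
      -storepass password -file spark.cer
    keytool -importcert -alias spark -file spark.cer \
      -keystore truststore.jks -storepass password -noprompt
```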

4 changes: 2 additions & 2 deletions pom.xml
@@ -112,7 +112,7 @@
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
- <java.version>1.8</java.version>
+ <java.version>17</java.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
<maven.version>3.9.4</maven.version>
@@ -2934,7 +2934,7 @@
<arg>-deprecation</arg>
<arg>-feature</arg>
<arg>-explaintypes</arg>
- <arg>-target:jvm-1.8</arg>
+ <arg>-target:17</arg>
<arg>-Wconf:cat=deprecation:wv,any:e</arg>
<arg>-Wunused:imports</arg>
<!--
2 changes: 1 addition & 1 deletion project/SparkBuild.scala
@@ -340,7 +340,7 @@ object SparkBuild extends PomBuild {
),

(Compile / scalacOptions) ++= Seq(
s"-target:jvm-${javaVersion.value}",
s"-target:${javaVersion.value}",
"-sourcepath", (ThisBuild / baseDirectory).value.getAbsolutePath // Required for relative source links in scaladoc
),

4 changes: 2 additions & 2 deletions python/docs/source/getting_started/install.rst
@@ -164,7 +164,7 @@ Package Supported version Note
`googleapis-common-protos` ==1.56.4 Required for Spark Connect
========================== ========================= ======================================================================================

- Note that PySpark requires Java 8 or later with ``JAVA_HOME`` properly set.
- If using JDK 11, set ``-Dio.netty.tryReflectionSetAccessible=true`` for Arrow related features and refer
+ Note that PySpark requires Java 17 or later with ``JAVA_HOME`` properly set.
+ You need to set ``-Dio.netty.tryReflectionSetAccessible=true`` for Arrow-related features; refer
to |downloading|_.
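As an illustration, a CI job could satisfy the new requirement with a setup-java step (which also exports ``JAVA_HOME``) before installing PySpark — a sketch, not part of this PR:

```yaml
# Sketch: JDK 17 plus PySpark in CI. actions/setup-java exports
# JAVA_HOME automatically; the distribution choice is an assumption.
- name: Install Java 17
  uses: actions/setup-java@v3
  with:
    distribution: zulu
    java-version: 17
- name: Install PySpark
  run: pip install pyspark
```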