2 changes: 1 addition & 1 deletion assembly/pom.xml
@@ -20,7 +20,7 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
Member commented:

Interesting, so the parent is no longer Scala-version-specific? I would have thought it might be, since it provides variables for the Scala version. I expect you've thought this through more than I have, though.

Contributor Author replied:

Empirically, it looks like the generated POMs that are used for publishing end up with all variables in the <dependencies> section expanded according to whichever profile was active at publication time. For example, in http://central.maven.org/maven2/org/apache/spark/spark-core_2.10/1.6.1/spark-core_2.10-1.6.1.pom there are no references to scala.binary.version in the <dependencies> section. There are references in the <build> section, but I don't think that's relevant to consumers of this POM, since I don't expect people to depend on our child POMs in their own builds.
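
To illustrate the expansion (a sketch only; the concrete version numbers here are examples, not taken from this PR), a dependency declared in a child POM as

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-network-common_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
    </dependency>

shows up in the published POM with both properties already resolved for whichever profile was active, e.g.

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-network-common_2.10</artifactId>
      <version>1.6.1</version>
    </dependency>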

Contributor Author replied:

Based on some of the comments in https://issues.apache.org/jira/browse/MNG-2971, https://issues.apache.org/jira/browse/MNG-4223, and other tickets, I believe the intent is that properties should be expanded when installing/deploying POMs.

A somewhat more explicit way of ensuring this expansion would be to use the flatten-maven-plugin (http://www.mojohaus.org/flatten-maven-plugin/).
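
A minimal sketch of how that plugin could be wired into the parent POM (the version and flattenMode below are assumptions I haven't verified against our build):

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>flatten-maven-plugin</artifactId>
      <version>1.0.0</version>
      <configuration>
        <!-- "oss" resolves properties while keeping the metadata OSS repositories require -->
        <flattenMode>oss</flattenMode>
      </configuration>
      <executions>
        <!-- generate the flattened POM early in the build so install/deploy pick it up -->
        <execution>
          <id>flatten</id>
          <phase>process-resources</phase>
          <goals>
            <goal>flatten</goal>
          </goals>
        </execution>
        <!-- remove the generated .flattened-pom.xml during mvn clean -->
        <execution>
          <id>flatten.clean</id>
          <phase>clean</phase>
          <goals>
            <goal>clean</goal>
          </goals>
        </execution>
      </executions>
    </plugin>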

Contributor Author replied:

It looks like there's a more complete summary of Maven's intended behavior at https://cwiki.apache.org/confluence/display/MAVENOLD/Artifact-Coordinate+Expression+Transformation

<version>2.0.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
18 changes: 0 additions & 18 deletions bin/load-spark-env.cmd
@@ -33,24 +33,6 @@ if [%SPARK_ENV_LOADED%] == [] (
call :LoadSparkEnv
)

rem Setting SPARK_SCALA_VERSION if not already set.

set ASSEMBLY_DIR2="%SPARK_HOME%\assembly\target\scala-2.11"
set ASSEMBLY_DIR1="%SPARK_HOME%\assembly\target\scala-2.10"

if [%SPARK_SCALA_VERSION%] == [] (

if exist %ASSEMBLY_DIR2% if exist %ASSEMBLY_DIR1% (
echo "Presence of build for both scala versions(SCALA 2.10 and SCALA 2.11) detected."
echo "Either clean one of them or, set SPARK_SCALA_VERSION=2.11 in spark-env.cmd."
exit 1
)
if exist %ASSEMBLY_DIR2% (
set SPARK_SCALA_VERSION=2.11
) else (
set SPARK_SCALA_VERSION=2.10
)
)
exit /b 0

:LoadSparkEnv
20 changes: 0 additions & 20 deletions bin/load-spark-env.sh
@@ -41,23 +41,3 @@ if [ -z "$SPARK_ENV_LOADED" ]; then
set +a
fi
fi

# Setting SPARK_SCALA_VERSION if not already set.

if [ -z "$SPARK_SCALA_VERSION" ]; then

ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-2.11"
ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-2.10"

if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
echo -e "Presence of build for both scala versions(SCALA 2.10 and SCALA 2.11) detected." 1>&2
echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION=2.11 in spark-env.sh.' 1>&2
exit 1
fi

if [ -d "$ASSEMBLY_DIR2" ]; then
export SPARK_SCALA_VERSION="2.11"
else
export SPARK_SCALA_VERSION="2.10"
fi
fi
4 changes: 2 additions & 2 deletions bin/spark-class
@@ -39,7 +39,7 @@ fi
if [ -f "${SPARK_HOME}/RELEASE" ]; then
SPARK_JARS_DIR="${SPARK_HOME}/jars"
else
SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/jars"
fi

if [ ! -d "$SPARK_JARS_DIR" ] && [ -z "$SPARK_TESTING$SPARK_SQL_TESTING" ]; then
@@ -52,7 +52,7 @@

# Add the launcher build dir to the classpath if requested.
if [ -n "$SPARK_PREPEND_CLASSES" ]; then
LAUNCH_CLASSPATH="${SPARK_HOME}/launcher/target/scala-$SPARK_SCALA_VERSION/classes:$LAUNCH_CLASSPATH"
LAUNCH_CLASSPATH="${SPARK_HOME}/launcher/target/classes:$LAUNCH_CLASSPATH"
fi

# For tests
4 changes: 2 additions & 2 deletions bin/spark-class2.cmd
@@ -32,7 +32,7 @@ rem Find Spark jars.
if exist "%SPARK_HOME%\RELEASE" (
set SPARK_JARS_DIR="%SPARK_HOME%\jars"
) else (
set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\jars"
)

if not exist "%SPARK_JARS_DIR%"\ (
@@ -45,7 +45,7 @@ set LAUNCH_CLASSPATH=%SPARK_JARS_DIR%\*

rem Add the launcher build dir to the classpath if requested.
if not "x%SPARK_PREPEND_CLASSES%"=="x" (
set LAUNCH_CLASSPATH="%SPARK_HOME%\launcher\target\scala-%SPARK_SCALA_VERSION%\classes;%LAUNCH_CLASSPATH%"
set LAUNCH_CLASSPATH="%SPARK_HOME%\launcher\target\classes;%LAUNCH_CLASSPATH%"
)

rem Figure out where java is.
24 changes: 1 addition & 23 deletions build/mvn
@@ -93,25 +93,6 @@ install_zinc() {
ZINC_BIN="${_DIR}/${zinc_path}"
}

# Determine the Scala version from the root pom.xml file, set the Scala URL,
# and, with that, download the specific version of Scala necessary under
# the build/ folder
install_scala() {
# determine the Scala version used in Spark
local scala_version=`grep "scala.version" "${_DIR}/../pom.xml" | \
head -1 | cut -f2 -d'>' | cut -f1 -d'<'`
local scala_bin="${_DIR}/scala-${scala_version}/bin/scala"
local TYPESAFE_MIRROR=${TYPESAFE_MIRROR:-https://downloads.typesafe.com}

install_app \
"${TYPESAFE_MIRROR}/scala/${scala_version}" \
"scala-${scala_version}.tgz" \
"scala-${scala_version}/bin/scala"

SCALA_COMPILER="$(cd "$(dirname "${scala_bin}")/../lib" && pwd)/scala-compiler.jar"
SCALA_LIBRARY="$(cd "$(dirname "${scala_bin}")/../lib" && pwd)/scala-library.jar"
}

# Setup healthy defaults for the Zinc port if none were provided from
# the environment
ZINC_PORT=${ZINC_PORT:-"3030"}
@@ -132,7 +113,6 @@ fi

# Install the proper version of Scala and Zinc for the build
install_zinc
install_scala

# Reset the current working directory
cd "${_CALLING_DIR}"
@@ -142,9 +122,7 @@ cd "${_CALLING_DIR}"
if [ -n "${ZINC_INSTALL_FLAG}" -o -z "`"${ZINC_BIN}" -status -port ${ZINC_PORT}`" ]; then
export ZINC_OPTS=${ZINC_OPTS:-"$_COMPILE_JVM_OPTS"}
"${ZINC_BIN}" -shutdown -port ${ZINC_PORT}
"${ZINC_BIN}" -start -port ${ZINC_PORT} \
-scala-compiler "${SCALA_COMPILER}" \
-scala-library "${SCALA_LIBRARY}" &>/dev/null
"${ZINC_BIN}" -start -port ${ZINC_PORT} &>/dev/null
fi

# Set any `mvn` options if not already present
10 changes: 2 additions & 8 deletions common/network-common/pom.xml
@@ -21,13 +21,13 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
<version>2.0.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<groupId>org.apache.spark</groupId>
<artifactId>spark-network-common_2.11</artifactId>
<artifactId>spark-network-common</artifactId>
<packaging>jar</packaging>
<name>Spark Project Networking</name>
<url>http://spark.apache.org/</url>
@@ -64,10 +64,6 @@
<artifactId>log4j</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
@@ -81,8 +77,6 @@
</dependencies>

<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
<plugins>
<!-- Create a test-jar so network-shuffle can depend on our test utilities. -->
<plugin>
17 changes: 4 additions & 13 deletions common/network-shuffle/pom.xml
@@ -21,13 +21,13 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
<version>2.0.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<groupId>org.apache.spark</groupId>
<artifactId>spark-network-shuffle_2.11</artifactId>
<artifactId>spark-network-shuffle</artifactId>
<packaging>jar</packaging>
<name>Spark Project Shuffle Streaming Service</name>
<url>http://spark.apache.org/</url>
Expand All @@ -39,7 +39,7 @@
<!-- Core dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-network-common_${scala.binary.version}</artifactId>
<artifactId>spark-network-common</artifactId>
<version>${project.version}</version>
</dependency>

@@ -73,15 +73,11 @@
<!-- Test dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-network-common_${scala.binary.version}</artifactId>
<artifactId>spark-network-common</artifactId>
<version>${project.version}</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
@@ -93,9 +89,4 @@
<scope>test</scope>
</dependency>
</dependencies>

<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
</build>
</project>
15 changes: 4 additions & 11 deletions common/network-yarn/pom.xml
@@ -21,36 +21,31 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
<version>2.0.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<groupId>org.apache.spark</groupId>
<artifactId>spark-network-yarn_2.11</artifactId>
<artifactId>spark-network-yarn</artifactId>
<packaging>jar</packaging>
<name>Spark Project YARN Shuffle Service</name>
<url>http://spark.apache.org/</url>
<properties>
<sbt.project.name>network-yarn</sbt.project.name>
<!-- Make sure all Hadoop dependencies are provided to avoid repackaging. -->
<hadoop.deps.scope>provided</hadoop.deps.scope>
<shuffle.jar>${project.build.directory}/scala-${scala.binary.version}/spark-${project.version}-yarn-shuffle.jar</shuffle.jar>
<shuffle.jar>${project.build.directory}/spark-${project.version}-yarn-shuffle.jar</shuffle.jar>
<shade>org/spark_project/</shade>
</properties>

<dependencies>
<!-- Core dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-network-shuffle_${scala.binary.version}</artifactId>
<artifactId>spark-network-shuffle</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
</dependency>

<!-- Provided dependencies -->
<dependency>
<groupId>org.apache.hadoop</groupId>
@@ -64,8 +59,6 @@
</dependencies>

<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
13 changes: 2 additions & 11 deletions common/sketch/pom.xml
@@ -21,30 +21,21 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
<version>2.0.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<groupId>org.apache.spark</groupId>
<artifactId>spark-sketch_2.11</artifactId>
<artifactId>spark-sketch</artifactId>
<packaging>jar</packaging>
<name>Spark Project Sketch</name>
<url>http://spark.apache.org/</url>
<properties>
<sbt.project.name>sketch</sbt.project.name>
</properties>

<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>

<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
<pluginManagement>
<plugins>
<plugin>
7 changes: 1 addition & 6 deletions common/tags/pom.xml
@@ -21,7 +21,7 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
<version>2.0.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
@@ -42,9 +42,4 @@
<scope>compile</scope>
</dependency>
</dependencies>

<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
</build>
</project>
14 changes: 4 additions & 10 deletions common/unsafe/pom.xml
@@ -21,13 +21,13 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent_2.11</artifactId>
<artifactId>spark-parent</artifactId>
<version>2.0.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<groupId>org.apache.spark</groupId>
<artifactId>spark-unsafe_2.11</artifactId>
<artifactId>spark-unsafe</artifactId>
<packaging>jar</packaging>
<name>Spark Project Unsafe</name>
<url>http://spark.apache.org/</url>
@@ -37,8 +37,8 @@

<dependencies>
<dependency>
<groupId>com.twitter</groupId>
<artifactId>chill_${scala.binary.version}</artifactId>
<groupId>com.esotericsoftware</groupId>
Member commented:
Truly small nit but these got indented too far

<artifactId>kryo-shaded</artifactId>
</dependency>

<!-- Core dependencies -->
@@ -59,10 +59,6 @@
</dependency>

<!-- Test dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
@@ -80,8 +76,6 @@
</dependency>
</dependencies>
<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
<pluginManagement>
<plugins>
<plugin>