From d383efba12c66addb17006dea107bb0421d50bc3 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E9=83=AD=E5=B0=8F=E9=BE=99=2010207633?=
Date: Fri, 31 Mar 2017 21:57:09 +0800
Subject: [PATCH 1/6] [SPARK-20177]Document about compression way has some
 little detail changes.

---
 docs/configuration.md | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index a9753925407d7..156997b539e65 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -639,6 +639,7 @@ Apart from these, the following properties are also available, and may be useful
   false
   Whether to compress logged events, if spark.eventLog.enabled is true.
+  Compression will use spark.io.compression.codec.
@@ -773,14 +774,15 @@ Apart from these, the following properties are also available, and may be useful
   true
   Whether to compress broadcast variables before sending them. Generally a good idea.
+  Compression will use spark.io.compression.codec.

   spark.io.compression.codec
   lz4
-  The codec used to compress internal data such as RDD partitions, broadcast variables and
-  shuffle outputs. By default, Spark provides three codecs: lz4, lzf,
+  The codec used to compress internal data such as RDD partitions,event log, broadcast variables
+  and shuffle outputs. By default, Spark provides three codecs: lz4, lzf,
   and snappy. You can also use fully qualified class names to specify the codec,
   e.g.
   org.apache.spark.io.LZ4CompressionCodec,
@@ -881,6 +883,7 @@ Apart from these, the following properties are also available, and may be useful
   StorageLevel.MEMORY_ONLY_SER in Java and Scala or
   StorageLevel.MEMORY_ONLY in Python).
   Can save substantial space at the cost of some extra CPU time.
+  Compression will use spark.io.compression.codec.
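The doc additions above all point at the same fact: event-log, broadcast, and RDD compression share the codec chosen by `spark.io.compression.codec`. A minimal `spark-defaults.conf` sketch (values are illustrative, not defaults introduced by the patch):

```properties
# spark-defaults.conf -- illustrative values.
# All three compress settings below share the codec selected by
# spark.io.compression.codec (lz4 | lzf | snappy, or a fully
# qualified class name such as org.apache.spark.io.LZ4CompressionCodec).
spark.eventLog.enabled       true
spark.eventLog.compress      true
spark.broadcast.compress     true
spark.io.compression.codec   lz4
```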
From 3059013e9d2aec76def14eb314b6761bea0e7ca0 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E9=83=AD=E5=B0=8F=E9=BE=99=2010207633?=
Date: Sat, 1 Apr 2017 09:38:02 +0800
Subject: [PATCH 2/6] [SPARK-20177] event log add a space

---
 docs/configuration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index 156997b539e65..2687f542b8bd3 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -781,7 +781,7 @@ Apart from these, the following properties are also available, and may be useful
   spark.io.compression.codec
   lz4
-  The codec used to compress internal data such as RDD partitions,event log, broadcast variables
+  The codec used to compress internal data such as RDD partitions, event log, broadcast variables
   and shuffle outputs. By default, Spark provides three codecs: lz4, lzf,
   and snappy. You can also use fully qualified class names to specify the codec,
   e.g.

From 555cef88fe09134ac98fd0ad056121c7df2539aa Mon Sep 17 00:00:00 2001
From: guoxiaolongzte
Date: Sun, 2 Apr 2017 08:16:08 +0800
Subject: [PATCH 3/6] '/applications/[app-id]/jobs' in rest api,status should
 be [running|succeeded|failed|unknown]

---
 docs/monitoring.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index 80519525af0c3..c1e827224b5b4 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -289,7 +289,7 @@ can be identified by their `[attempt-id]`. In the API listed below, when running
   /applications/[app-id]/jobs
   A list of all jobs for a given application.
-  ?status=[complete|succeeded|failed] list only jobs in the specific state.
+  ?status=[running|succeeded|failed|unknown] list only jobs in the specific state.

From 0efb0dd9e404229cce638fe3fb0c966276784df7 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E9=83=AD=E5=B0=8F=E9=BE=99=2010207633?=
Date: Wed, 5 Apr 2017 11:47:53 +0800
Subject: [PATCH 4/6] [SPARK-20218]'/applications/[app-id]/stages' in REST
 API,add description.

---
 docs/monitoring.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index 4d0617d253b80..d180d77e2cd4d 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -299,6 +299,7 @@ can be identified by their `[attempt-id]`. In the API listed below, when running
   /applications/[app-id]/stages
   A list of all stages for a given application.
+  ?status=[active|complete|pending|failed] list only stages in the state.

From 0e37fdeee28e31fc97436dabd001d3c85c5a7794 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E9=83=AD=E5=B0=8F=E9=BE=99=2010207633?=
Date: Wed, 5 Apr 2017 13:22:54 +0800
Subject: [PATCH 5/6] [SPARK-20218] '/applications/[app-id]/stages/[stage-id]'
 in REST API,remove redundant description.

---
 docs/monitoring.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index d180d77e2cd4d..da954385dc452 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -305,7 +305,6 @@ can be identified by their `[attempt-id]`. In the API listed below, when running
   /applications/[app-id]/stages/[stage-id]
   A list of all attempts for the given stage.
-  ?status=[active|complete|pending|failed] list only stages in the state.

From 9ecc1a0257964acca6aabddcfc26cbcecf5086e8 Mon Sep 17 00:00:00 2001
From: guoxiaolong
Date: Mon, 17 Apr 2017 15:25:36 +0800
Subject: =?UTF-8?q?[SPARK-20354]/api/v1/applications=E2=80=99?=
 =?UTF-8?q?=20return=20sparkUser=20is=20null=20in=20REST=20API.?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 core/src/main/scala/org/apache/spark/ui/SparkUI.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/core/src/main/scala/org/apache/spark/ui/SparkUI.scala b/core/src/main/scala/org/apache/spark/ui/SparkUI.scala
index 7d31ac54a7177..bf4cf79e9faa3 100644
--- a/core/src/main/scala/org/apache/spark/ui/SparkUI.scala
+++ b/core/src/main/scala/org/apache/spark/ui/SparkUI.scala
@@ -117,7 +117,7 @@ private[spark] class SparkUI private (
       endTime = new Date(-1),
       duration = 0,
       lastUpdated = new Date(startTime),
-      sparkUser = "",
+      sparkUser = getSparkUser,
       completed = false
     ))
   ))
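Patches 3/6 through 5/6 adjust the documented `?status=` filters for the jobs and stages endpoints. A small sketch that builds those query URLs and checks the state names against the documented values; `BASE`, `jobs_url`, and `stages_url` are hypothetical helpers (not Spark API), and the host/port assume a local history server:

```python
from urllib.parse import urlencode

# Hypothetical base URL; the REST paths come from docs/monitoring.md.
BASE = "http://localhost:18080/api/v1"

def jobs_url(app_id: str, status: str) -> str:
    # Per PATCH 3/6, valid job states are running|succeeded|failed|unknown.
    allowed = {"running", "succeeded", "failed", "unknown"}
    if status not in allowed:
        raise ValueError(f"job status must be one of {sorted(allowed)}")
    return f"{BASE}/applications/{app_id}/jobs?{urlencode({'status': status})}"

def stages_url(app_id: str, status: str) -> str:
    # Per PATCH 4/6, valid stage states are active|complete|pending|failed.
    allowed = {"active", "complete", "pending", "failed"}
    if status not in allowed:
        raise ValueError(f"stage status must be one of {sorted(allowed)}")
    return f"{BASE}/applications/{app_id}/stages?{urlencode({'status': status})}"

print(jobs_url("app-20170417152536-0000", "running"))
```

With PATCH 6/6 applied, the `sparkUser` field returned by `/api/v1/applications` for a live UI is populated via `getSparkUser` rather than left as an empty string.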