Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/97#issuecomment-37505082
BTW, as mentioned above, please use PriorityQueue here instead of copying
their heap. It's just a lot of work to copy the heap; we can take the
performance hit instead.
---
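A minimal sketch of the suggestion above, using the standard-library PriorityQueue for a bounded top-k instead of a hand-maintained heap copy. The `topK` helper and its signature are illustrative, not Spark's actual code:

```scala
import scala.collection.mutable

// Keep the k largest elements seen so far using the library heap
// instead of a hand-rolled copy. With the reversed ordering the queue
// is a min-heap, so the smallest retained element sits at the head
// and eviction is O(log k).
def topK[T](items: Iterator[T], k: Int)(implicit ord: Ordering[T]): Seq[T] = {
  val heap = mutable.PriorityQueue.empty[T](ord.reverse) // min-heap
  items.foreach { item =>
    if (heap.size < k) {
      heap.enqueue(item)
    } else if (ord.gt(item, heap.head)) {
      heap.dequeue() // drop the current smallest
      heap.enqueue(item)
    }
  }
  heap.dequeueAll.reverse // largest first
}
```

This trades the constant-factor speed of a specialized heap for zero maintenance cost, which matches the "take the performance hit" reasoning in the comment.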
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37504123
One or more automated tests failed
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13149/
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37504122
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37504133
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37504130
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37504132
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13151/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37504125
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13152/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37504124
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37504134
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13150/
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/112
---
Github user sryza commented on a diff in the pull request:
https://github.com/apache/spark/pull/120#discussion_r10554025
--- Diff:
yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
---
@@ -133,11 +148,11 @@ class ClientArguments(val args: Array[String],
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/112#issuecomment-37503280
Merged into master and 0.9
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/113#issuecomment-37503197
@andrewor14 any comments or reservations on this one?
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/120#issuecomment-37503046
Sandy - looks good to me. Are you still changing things? I noticed there
are a few comments that maybe should be updated:
```
alpha/src/main/scala/org/apach
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/120#discussion_r10553741
--- Diff:
yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
---
@@ -67,24 +67,39 @@ class ClientArguments(val args: Array[String]
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/120#discussion_r10553724
--- Diff:
yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
---
@@ -133,11 +148,11 @@ class ClientArguments(val args: Array[Strin
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/86#issuecomment-37502813
@mateiz maybe you could take a pass on this?
---
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/86#discussion_r10553709
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -0,0 +1,176 @@
+/*
+ * Licensed to the Apache Software Foundati
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/120#discussion_r10553691
--- Diff:
yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
---
@@ -133,11 +148,11 @@ class ClientArguments(val args: Array[Strin
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553393
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37501962
Merged build started.
---
Github user tdas commented on the pull request:
https://github.com/apache/spark/pull/126#issuecomment-37501973
How do immutable HashMaps help to store metadata? For example, how would
you store block ID --> block info in the BlockManager using immutable HashMaps?
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37501961
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37501920
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37501921
Merged build started.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37501874
Merged build triggered.
---
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/126#issuecomment-37501863
If you don't need high performance, why not just put a normal immutable
hashmap so you don't have to worry about concurrency?
---
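The "normal immutable hashmap" approach suggested in the comment above can be sketched as an immutable Map swapped atomically behind an AtomicReference. The `AtomicMap` class and its names below are hypothetical, not Spark code:

```scala
import java.util.concurrent.atomic.AtomicReference
import scala.annotation.tailrec

// Keep the current snapshot in an AtomicReference and replace it with
// compare-and-set. Readers never lock or see a partial update; writers
// copy-and-retry on contention.
class AtomicMap[K, V] {
  private val ref = new AtomicReference(Map.empty[K, V])

  @tailrec
  final def put(k: K, v: V): Unit = {
    val old = ref.get()
    if (!ref.compareAndSet(old, old + (k -> v))) put(k, v) // retry on race
  }

  def get(k: K): Option[V] = ref.get().get(k)
}
```

Every write copies a small part of the underlying trie, so this is reasonable for metadata-sized maps but not for hot update paths, which matches the "if you don't need high performance" caveat.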
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37501875
Merged build started.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37501793
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37501794
Merged build started.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37501743
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37501744
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13148/
---
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553293
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user tdas commented on the pull request:
https://github.com/apache/spark/pull/126#issuecomment-37501724
@rxin It overrides stuff to make sure things like traversing the entire
HashMap do not happen. They are meant to be drop-in replacements for Scala
HashMaps when applyin
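A stripped-down illustration of the wrapper idea under discussion: delegate to a java.util.concurrent.ConcurrentHashMap so that no operation ever needs to copy or traverse the whole map. The class and method names here are invented for illustration, not the actual WrappedJavaHashMap:

```scala
import java.util.concurrent.ConcurrentHashMap

// Expose a small Scala-style surface over a ConcurrentHashMap. Values
// are constrained to AnyRef so a missing key maps cleanly to None
// (ConcurrentHashMap signals absence with null).
class WrappedMap[K, V <: AnyRef] {
  private val underlying = new ConcurrentHashMap[K, V]()

  def put(k: K, v: V): Unit = underlying.put(k, v)
  def get(k: K): Option[V] = Option(underlying.get(k))
  def remove(k: K): Option[V] = Option(underlying.remove(k))
  def size: Int = underlying.size() // no traversal needed
}
```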
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553150
--- Diff: core/src/main/scala/org/apache/spark/util/BoundedHashMap.scala ---
@@ -0,0 +1,67 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/126#issuecomment-37501259
@tdas I haven't finished looking at this (will probably spend more time
after Fri) - but WrappedJavaHashMap is fairly complicated, and it seems like a
recipe for complexity a
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553128
--- Diff:
core/src/main/scala/org/apache/spark/util/TimeStampedWeakValueHashMap.scala ---
@@ -0,0 +1,112 @@
+/*
+ * Licensed to the Apache Software Founda
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553097
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553086
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553056
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/ShuffleMapTask.scala ---
@@ -17,28 +17,24 @@
package org.apache.spark.scheduler
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553044
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/ShuffleMapTask.scala ---
@@ -17,28 +17,24 @@
package org.apache.spark.scheduler
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10553024
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala ---
@@ -35,9 +35,9 @@ private[storage] object BlockManagerMessages {
case
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10553029
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -181,15 +178,50 @@ private[spark] class MapOutputTracker(conf:
SparkConf) extends
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552982
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala ---
@@ -35,9 +35,9 @@ private[storage] object BlockManagerMessages {
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552981
--- Diff: core/src/main/scala/org/apache/spark/util/BoundedHashMap.scala ---
@@ -0,0 +1,67 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552941
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552946
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user mengxr commented on the pull request:
https://github.com/apache/spark/pull/131#issuecomment-37500737
@srowen , level 3 BLAS would certainly help improve the performance. DSYRK
is for computing C <- A^T A + C, but I don't know whether we have it in jblas.
However, for impli
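For reference, the DSYRK operation mentioned above (C <- A^T A + C) written out naively in Scala. This is a sketch of the math only, not a substitute for a BLAS level 3 call, which would block and vectorize the same loops:

```scala
// Naive C <- A^T * A + C, with A given as an array of rows (m x n)
// and C as an n x n array updated in place. Entry (i, j) of A^T A is
// the dot product of columns i and j of A.
def syrkNaive(a: Array[Array[Double]], c: Array[Array[Double]]): Unit = {
  val m = a.length     // rows of A
  val n = a(0).length  // cols of A; C is n x n
  for (i <- 0 until n; j <- 0 until n) {
    var s = 0.0
    var k = 0
    while (k < m) { s += a(k)(i) * a(k)(j); k += 1 }
    c(i)(j) += s
  }
}
```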
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552915
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552886
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -20,15 +20,15 @@ package org.apache.spark
import java.io._
import java.util.
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552879
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552859
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala ---
@@ -35,9 +35,9 @@ private[storage] object BlockManagerMessages {
case
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552846
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -20,15 +20,15 @@ package org.apache.spark
import java.io._
import java.util.
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552839
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552833
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala ---
@@ -35,9 +35,9 @@ private[storage] object BlockManagerMessages {
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552834
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552817
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under on
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552793
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -181,15 +178,50 @@ private[spark] class MapOutputTracker(conf:
SparkConf) extends Log
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552777
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/ShuffleMapTask.scala ---
@@ -17,28 +17,24 @@
package org.apache.spark.scheduler
+imp
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552754
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
I am trying to get a few examples to run in IntelliJ and I am seeing this.
What am I missing?
[ERROR] Failed to execute goal on project spark-examples_2.10: Could not
resolve dependencies for project
org.apache.spark:spark-examples_2.10:jar:0.9.1-incubating-SNAPSHOT: Failed
to collect dependencies
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552717
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552716
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -278,7 +278,7 @@ private[spark] class Executor(
// have left some weir
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552723
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -206,12 +206,12 @@ class SendingConnection(val address:
InetSocketAddress, selector
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552696
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/StateDStream.scala
---
@@ -64,7 +64,7 @@ class StateDStream[K: ClassTag, V: ClassTag, S: Cl
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552705
--- Diff:
streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala ---
@@ -152,7 +152,7 @@ class InputStreamsSuite extends TestSuiteBase with
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552693
--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -0,0 +1,135 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) unde
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552712
--- Diff:
yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
@@ -137,7 +137,7 @@ trait ClientBase extends Logging {
} else if (
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552692
--- Diff: core/src/main/scala/org/apache/spark/rdd/RDD.scala ---
@@ -1025,6 +1025,14 @@ abstract class RDD[T: ClassTag](
checkpointData.flatMap(_.getCheck
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552681
--- Diff:
graphx/src/main/scala/org/apache/spark/graphx/impl/Serializers.scala ---
@@ -298,7 +298,7 @@ abstract class ShuffleSerializationStream(s:
OutputStream)
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552685
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/NetworkInputDStream.scala
---
@@ -128,7 +128,7 @@ abstract class NetworkReceiver[T: ClassTa
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552680
--- Diff:
graphx/src/main/scala/org/apache/spark/graphx/impl/Serializers.scala ---
@@ -391,7 +391,7 @@ abstract class ShuffleDeserializationStream(s:
InputStream
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552671
--- Diff:
examples/src/main/scala/org/apache/spark/examples/SparkHdfsLR.scala ---
@@ -34,8 +34,8 @@ object SparkHdfsLR {
case class DataPoint(x: Vector, y:
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552662
--- Diff: examples/src/main/scala/org/apache/spark/examples/LocalALS.scala
---
@@ -53,7 +53,7 @@ object LocalALS {
for (i <- 0 until M; j <- 0 until U) {
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552655
--- Diff: core/src/test/scala/org/apache/spark/CheckpointSuite.scala ---
@@ -432,7 +432,7 @@ object CheckpointSuite {
// This is a custom cogroup function t
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552666
--- Diff: examples/src/main/scala/org/apache/spark/examples/SparkALS.scala
---
@@ -54,7 +54,7 @@ object SparkALS {
for (i <- 0 until M; j <- 0 until U) {
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552654
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala ---
@@ -35,9 +35,9 @@ private[storage] object BlockManagerMessages {
case
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552645
--- Diff: core/src/main/scala/org/apache/spark/rdd/RDD.scala ---
@@ -1025,6 +1025,14 @@ abstract class RDD[T: ClassTag](
checkpointData.flatMap(_.getC
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552638
--- Diff:
core/src/main/scala/org/apache/spark/util/TimeStampedWeakValueHashMap.scala ---
@@ -0,0 +1,112 @@
+/*
+ * Licensed to the Apache Software Founda
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37499891
Merged build started.
---
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552625
--- Diff:
core/src/main/scala/org/apache/spark/util/TimeStampedWeakValueHashMap.scala ---
@@ -0,0 +1,112 @@
+/*
+ * Licensed to the Apache Software Founda
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/79#issuecomment-37499890
Merged build triggered.
---
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552599
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -181,15 +178,50 @@ private[spark] class MapOutputTracker(conf:
SparkConf) extends
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/124#discussion_r10552608
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala ---
@@ -35,9 +35,9 @@ private[storage] object BlockManagerMessages {
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/101#issuecomment-37499630
Merged build finished.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/101#issuecomment-37499631
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13147/
---
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552496
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -181,15 +178,50 @@ private[spark] class MapOutputTracker(conf:
SparkConf) extends Log
Github user aarondav commented on the pull request:
https://github.com/apache/spark/pull/101#issuecomment-37499281
Thanks for updating this, I think changing the package is a very good
cleanup.
---
Github user aarondav commented on a diff in the pull request:
https://github.com/apache/spark/pull/101#discussion_r10552386
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -222,4 +232,27 @@ private[spark] object HadoopRDD {
def putCachedMetadat
Github user aarondav commented on a diff in the pull request:
https://github.com/apache/spark/pull/101#discussion_r10552362
--- Diff: core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala ---
@@ -15,18 +15,19 @@
* limitations under the License.
*/
-packa
Github user aarondav commented on a diff in the pull request:
https://github.com/apache/spark/pull/101#discussion_r10552358
--- Diff: core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala ---
@@ -15,18 +15,19 @@
* limitations under the License.
*/
-packa
Github user aarondav commented on a diff in the pull request:
https://github.com/apache/spark/pull/101#discussion_r10552253
--- Diff: core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala ---
@@ -15,18 +15,19 @@
* limitations under the License.
*/
-packa
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552205
--- Diff:
core/src/main/scala/org/apache/spark/util/TimeStampedWeakValueHashMap.scala ---
@@ -0,0 +1,112 @@
+/*
+ * Licensed to the Apache Software Founda
Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/126#discussion_r10552156
--- Diff:
core/src/main/scala/org/apache/spark/util/TimeStampedWeakValueHashMap.scala ---
@@ -0,0 +1,112 @@
+/*
+ * Licensed to the Apache Software Founda
Github user tdas commented on the pull request:
https://github.com/apache/spark/pull/126#issuecomment-37498313
@yaoshengzhe
This is only a safe, best-effort attempt to clean metadata, so no guarantee
is being provided here. All we are trying to do for long running Spark
computatio
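The best-effort cleanup described here could be sketched as a map that holds its values through WeakReferences together with an insertion timestamp, so entries disappear either when the GC reclaims them or when an explicit sweep drops old ones. This is a hypothetical sketch, not the actual TimeStampedWeakValueHashMap implementation:

```scala
import java.lang.ref.WeakReference
import scala.collection.concurrent.TrieMap

// Values are held weakly so the GC can reclaim them at any time;
// clearOldValues additionally drops entries older than a cutoff or
// already collected. No guarantees: purely best-effort cleanup.
class TimestampedWeakMap[K, V <: AnyRef] {
  private case class Entry(ref: WeakReference[V], ts: Long)
  private val map = TrieMap.empty[K, Entry]

  def put(k: K, v: V): Unit =
    map.put(k, Entry(new WeakReference(v), System.currentTimeMillis()))

  def get(k: K): Option[V] =
    map.get(k).flatMap(e => Option(e.ref.get()))

  // Drop entries inserted before `cutoff` or whose value was collected.
  def clearOldValues(cutoff: Long): Unit =
    map.foreach { case (k, e) =>
      if (e.ts < cutoff || e.ref.get() == null) map.remove(k)
    }

  def size: Int = map.size
}
```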
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/101#issuecomment-37498220
Hi @kayousterhout and @aarondav, thank you for your comments; I addressed
them.
One potential issue is that, to call the function in HadoopRDD, I moved
SparkH
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/42#discussion_r10551998
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Founda
Github user qqsun8819 commented on the pull request:
https://github.com/apache/spark/pull/110#issuecomment-37497727
Thanks @aarondav, it doesn't matter. Any advice on this patch is welcome.
---
Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/42#discussion_r10551959
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/SparkReplayerBus.scala ---
@@ -0,0 +1,97 @@
+/*
+ * Licensed to the Apache Software Foundation
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/101#issuecomment-37497545
Merged build triggered.
---