spark-submit --class "Word2VecApp" --master local[30]
target/scala-2.10/word2vec-project_2.10-1.0.jar
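Since the failure in this thread is an OOM, it is worth noting that under a `local[N]` master the whole job runs inside the driver JVM, so the driver's heap is the one that must be raised; executor memory settings have no effect in local mode. A sketch of the launch line with an explicit heap (the `8g` value is illustrative, not a figure from the thread):

```shell
# Illustrative relaunch with an explicit driver heap. Under local[N]
# masters the application runs entirely in the driver JVM, so
# --driver-memory is the knob that matters; the 8g value is an
# assumption, not the poster's actual setting.
spark-submit \
  --class "Word2VecApp" \
  --master "local[30]" \
  --driver-memory 8g \
  target/scala-2.10/word2vec-project_2.10-1.0.jar
```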
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ClosureCleaner-or-java-serializer-OOM-when-trying-to-grow-tp24796p25383.html
categoricalFeaturesInfo,
numTrees, featureSubsetStrategy, impurity, maxDepth, maxBins)
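The fragment above is the tail of an MLlib `RandomForest.trainClassifier` call. For context, a sketch of the full call in PySpark form (the variable names and hyper-parameter values below are illustrative, not the poster's actual ones, and running it requires a Spark installation):

```python
# Sketch of the full MLlib call that the fragment above is the tail of.
# Requires a Spark installation at run time; hyper-parameter values are
# illustrative assumptions, not the poster's.

def train_random_forest(data):
    """Train an MLlib random-forest classifier on an RDD of LabeledPoint."""
    from pyspark.mllib.tree import RandomForest  # deferred import: needs pyspark

    return RandomForest.trainClassifier(
        data,
        numClasses=2,
        categoricalFeaturesInfo={},      # empty dict: all features continuous
        numTrees=100,
        featureSubsetStrategy="auto",    # let MLlib pick features per node
        impurity="gini",
        maxDepth=4,
        maxBins=32,
    )
```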
> spark.storage.memoryFraction 0.9
> spark.storage.shuffleFraction 0.05
> spark.default.parallelism 128
>
> The master machine has approximately 240 GB of ram and each worker has
> about
> 120GB of ram.
>
> I load in a relatively tiny RDD of MLlib LabeledPoint objects; for each
> point only about 3000 or so of the features are non-zero.
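The sizes quoted above can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes 8-byte double values and 4-byte integer indices per non-zero of a sparse vector, and a hypothetical 100 GB executor heap (the thread does not state the actual heap size):

```python
# Rough sizing for the scenario above. All constants here are
# illustrative assumptions, not figures from the thread.

def sparse_vector_bytes(num_nonzeros, value_bytes=8, index_bytes=4):
    """Approximate payload of a sparse vector: one value plus one index per non-zero."""
    return num_nonzeros * (value_bytes + index_bytes)

def storage_pool_gb(heap_gb, memory_fraction):
    """Heap carved out for cached RDDs under the legacy spark.storage.memoryFraction."""
    return heap_gb * memory_fraction

per_point_kb = sparse_vector_bytes(3000) / 1024   # ~35 KB per LabeledPoint
pool_gb = storage_pool_gb(100, 0.9)               # 90 GB reserved for RDD storage
print(per_point_kb, pool_gb)
```

At roughly 35 KB per point, even a million such points is on the order of 35 GB, and a "relatively tiny" RDD far less, so the cached data itself should fit comfortably in a pool this size; that is at least consistent with the OOM in the thread title coming from closure serialization rather than raw data volume.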