Hi,
I am getting java.lang.OutOfMemoryError: Java heap space. I have increased
both driver memory and executor memory, but I am still facing this issue.
I am using r4 instances for the driver and the core nodes (16 of them).
How can we see which step fails, or whether it is related to GC? Can we
pinpoint it to a single place in the code?
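One way to get that visibility, as a minimal sketch: per-task GC time already
shows up in the "GC Time" column of the Stages tab in the Spark UI, and you can
ask the executor JVMs to log their collections. The flags below are standard
JVM options; how you pass the conf depends on how you submit the job.

    import org.apache.spark.SparkConf;

    SparkConf conf = new SparkConf()
        // Each executor JVM prints a line per collection to its stdout log.
        .set("spark.executor.extraJavaOptions",
            "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps");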
Hi,
Have you tried creating more column blocks?
BlockMatrix matrix = cmatrix.toBlockMatrix(100, 100);
for example.
Is your data randomly spread out, or do you generally have clusters of
data points together?
On Wed, Jan 25, 2017 at 4:23 AM, Petr Shestov wrote:
Hi all!
I'm using Spark 2.0.1 with two workers (one executor each) with 20 GB each,
and run the following code:

    JavaRDD<MatrixEntry> entries = ...; // filling the data
    CoordinateMatrix cmatrix = new CoordinateMatrix(entries.rdd());
    BlockMatrix matrix = cmatrix.toBlockMatrix(100, 1000);
    BlockMatrix cooc = matrix.transp
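The quoted snippet cuts off at matrix.transp. A self-contained sketch of what
it plausibly continues into, a co-occurrence matrix built by transpose-multiply,
follows; the transpose().multiply(matrix) step and the toy input are
assumptions, while the MLlib class and method names are the real API.

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.mllib.linalg.distributed.BlockMatrix;
    import org.apache.spark.mllib.linalg.distributed.CoordinateMatrix;
    import org.apache.spark.mllib.linalg.distributed.MatrixEntry;

    SparkConf conf = new SparkConf().setAppName("cooc-sketch").setMaster("local[2]");
    JavaSparkContext sc = new JavaSparkContext(conf);

    // Toy (row, col, value) entries standing in for the real data.
    JavaRDD<MatrixEntry> entries = sc.parallelize(Arrays.asList(
        new MatrixEntry(0, 0, 1.0),
        new MatrixEntry(0, 1, 1.0),
        new MatrixEntry(1, 1, 1.0)));

    CoordinateMatrix cmatrix = new CoordinateMatrix(entries.rdd());
    BlockMatrix matrix = cmatrix.toBlockMatrix(100, 1000);
    BlockMatrix cooc = matrix.transpose().multiply(matrix); // assumed: A^T * A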
You can do df.persist(StorageLevel.MEMORY_AND_DISK()).
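In context, a minimal runnable sketch; the SparkSession, df, and the input path
are hypothetical. Note that in the Java API cache() takes no arguments, so an
explicit storage level goes through persist(), and MEMORY_AND_DISK lets
partitions spill to disk instead of failing with an OutOfMemoryError.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.storage.StorageLevel;

    SparkSession spark = SparkSession.builder()
        .appName("persist-sketch").master("local[*]").getOrCreate();

    Dataset<Row> df = spark.read().parquet("/path/to/data"); // hypothetical input
    df.persist(StorageLevel.MEMORY_AND_DISK()); // spill to disk rather than OOM
    df.count(); // an action, so the cache actually materializes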
Double-check your driver memory in the Spark Web UI and make sure the driver
memory is close to half of the 16 GB available.
…2g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value

You might need to change your spark.driver.maxResultSize setting if you
plan on doing a collect() on the entire RDD/DataFrame.
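As a sketch of applying those settings programmatically; the 2g value is
illustrative rather than a recommendation, and the truncated line above may
have belonged to a different config key.

    import org.apache.spark.SparkConf;

    SparkConf conf = new SparkConf()
        // Raise the cap on serialized results shipped back to the driver.
        .set("spark.driver.maxResultSize", "2g")
        // Make executor GC activity visible in the executor stdout logs.
        .set("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails");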
You fundamentally want (half of) the Cartesian product, so I don't think it
gets a lot faster to form this. You could implement this on cogroup directly
and maybe avoid forming the tuples you will filter out. I'd think more about
whether you really need to do this at all, or whether there is another way to
get the result you need.
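For concreteness, a minimal sketch of the cartesian-plus-filter formulation
being discussed; String keys and values are assumed in place of the original
case classes, and c1 stands for the existing input RDD.

    import org.apache.spark.api.java.JavaPairRDD;
    import scala.Tuple2;

    JavaPairRDD<String, String> c1 = ...; // the 8000-element input

    // All ordered pairs, keeping one orientation per unordered pair:
    // N*(N-1)/2 results when keys are distinct.
    JavaPairRDD<Tuple2<String, String>, Tuple2<String, String>> c2 =
        c1.cartesian(c1)
          .filter(p -> p._1()._1().compareTo(p._2()._1()) < 0);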
Thought about it some more, and simplified the problem space for discussion:

Given: JavaPairRDD<K, V> c1; // c1.count() == 8000
Goal:  JavaPairRDD<Tuple2<K, V>, Tuple2<K, V>> c2; // all lexicographical pairs
Where: all lexicographic permutations on c1 ::
(c1_i._1().compareTo(c1_j._1()) < 0) -> new Tuple2<Tuple2<K, V>, Tuple2<K, V>>(c1_i, c1_j)
I am trying to generate all N(N-1)/2 lexicographical 2-tuples from a glom()'d
JavaPairRDD<…>. The construction of this initial Tuple2 JavaPairRDD space is
well formed from the case classes I provide it (AQ, AQV, AQQ, CT) and is
performant; minimized code:

    SparkConf conf = new SparkConf()
        .se
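The quote cuts off at .se; a typical completion, with an assumed app name and
master just so the sketch is self-contained:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    SparkConf conf = new SparkConf()
        .setAppName("lexicographic-pairs") // assumed; the original line is truncated
        .setMaster("local[*]");            // assumed
    JavaSparkContext sc = new JavaSparkContext(conf);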