I use Sparkling Water 1.6.3 and Spark 1.6, with Oracle Java 8 or OpenJDK 7. Every time I transform a Spark DataFrame into an H2O Frame (a rough sketch of that step is at the bottom of this message), I get the error below and the Spark cluster dies:

ERROR:py4j.java_gateway:Error while sending or receiving.
Traceback (most recent call last):
  File ".../Spark1.6/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 746, in send_command
    raise Py4JError("Answer from Java side is empty")
Py4JError: Answer from Java side is empty
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server
Traceback (most recent call last):
  File ".../Spark1.6/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 690, in start
    self.socket.connect((self.address, self.port))
  File "/usr/local/anaconda/lib/python2.7/socket.py", line 228, in meth
    return getattr(self._sock,name)(*args)
error: [Errno 111] Connection refused
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server
Traceback (most recent call last):

My conf file:

spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.kryoserializer.buffer.max  1500mb
spark.driver.memory              65g
spark.driver.extraJavaOptions    -XX:-PrintGCDetails -XX:PermSize=35480m -XX:-PrintGCTimeStamps -XX:-PrintTenuringDistribution
spark.python.worker.memory       65g
spark.local.dir                  /data/spark-tmp
spark.ext.h2o.client.log.dir     /data/h2o
spark.logConf                    false
spark.master                     local[*]
spark.driver.maxResultSize       0
spark.eventLog.enabled           True
spark.eventLog.dir               /data/spark_log

In the code I persist the data (about 5.7 GB of it). There is nothing in the H2O log files, and as far as I can tell there is enough memory.

Could anyone help me? Thanks!
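
For reference, the conversion step looks roughly like the sketch below (simplified; the app name, input path, and "df" are placeholders, and the application is launched with the conf shown above):

from pyspark import SparkContext
from pyspark.sql import SQLContext
from pysparkling import H2OContext

sc = SparkContext(appName="sparkling-water-test")   # placeholder app name; conf comes from the file above
sqlContext = SQLContext(sc)

# Placeholder source; the real data set is about 5.7 GB and is persisted before conversion.
df = sqlContext.read.parquet("/data/input")
df.persist()

hc = H2OContext(sc).start()        # start the H2O cloud inside the Spark application
h2o_frame = hc.as_h2o_frame(df)    # <-- this is where the Py4J error appears and the driver dies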