I hope the example is clear enough. I look forward to your response.
Thank you for your time,
Irina Stan
We have an application that reads text files, converts them to dataframes,
and saves them in Parquet format. The application runs fine when processing
a few files, but several thousand files are produced every day. When running
the job over all of them, the spark-submit process is killed with an
out-of-memory error:
#
# java.lang
help!
Irina
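For readers hitting the same wall, one common mitigation (an editorial sketch, not something proposed in this thread) is to split the file list into fixed-size batches and process them incrementally, so the job never tracks thousands of files at once. A minimal sketch in plain Python, where `process_batch` is a hypothetical stand-in for the real text-to-dataframe-to-Parquet step:

```python
# Sketch: process input files in fixed-size batches instead of all at once.
# `process_batch` is a hypothetical placeholder for the real conversion job.

def chunked(items, size):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def process_batch(paths):
    # Placeholder: the real job would read these text files,
    # build a dataframe, and write Parquet.
    return len(paths)

files = [f"input-{n:04d}.txt" for n in range(10)]
total = sum(process_batch(batch) for batch in chunked(files, 4))
```

Batching keeps peak memory roughly constant regardless of how many files arrive per day, at the cost of some scheduling overhead per batch.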
On 04.10.14 00:17, Yana Kadiyska wrote:
I don't think it's a red herring... (btw, spark.driver.host needs to be
set to the IP or FQDN of the machine where you're running the program).
I am running 0.9.2 on CDH4 and the beginning of my executor log looks
like below (I…
…your executor log, this seems fairly likely:
is host1/xxx.xx.xx.xx:45542 the machine where your driver is running? Is
that host/port reachable from the worker machines?
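Yana's reachability question can be checked mechanically from a worker machine. A minimal sketch in plain Python (the host and port values are hypothetical; substitute the driver address from your own logs):

```python
# Sketch: check from a worker machine whether the driver's host:port
# accepts TCP connections, mirroring the "Connection refused" diagnosis.
import socket

def reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example with hypothetical values from the error message:
# reachable("host1", 45542)
```

If this returns False from a worker but True from the driver machine itself, the usual suspects are a firewall rule or the driver binding to an address the workers cannot route to.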
On Fri, Oct 3, 2014 at 5:32 AM, Irina Fedulova <fedul...@gmail.com> wrote:
Hi,
I have set up Spark 0.9.2 standalone…
Hi ssimanta,
were you able to resolve the problem where the standalone Scala program
fails but the Spark REPL works just fine? I am getting the same issue...
Thanks,
Irina
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Akka-Connection-refused-standalone

$$anon$2: Connection refused: host1/xxx.xx.xx.xx:45542
---
Thanks!
Irina
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org