Hi Patrick,
Here's the process:
java -cp
/root/ephemeral-hdfs/conf::::/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/assembly/target/scala-2.10/spark-assembly-1.1.1-SNAPSHOT-hadoop1.0.4.jar
-XX:MaxPermSize=128m -Djava.library.path=/root/ephemeral-hdfs/lib/native/
-Xms5g -Xmx10g -XX:MaxPermSize=10g -Dspark.akka.timeout=300
-Dspark.driver.port=59156 -Xms5g -Xmx10g -XX:MaxPermSize=10g -Xms58315M
-Xmx58315M org.apache.spark.executor.CoarseGrainedExecutorBackend
akka.tcp://sp...@ip-10-226-198-178.us-west-2.compute.internal:59156/user/CoarseGrainedScheduler
5 ip-10-38-9-181.us-west-2.compute.internal 8
akka.tcp://sparkwor...@ip-10-38-9-181.us-west-2.compute.internal:34533/user/Worker
app-20140825214225-0001

Attached is the requested stack trace.
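For context, a thread dump like the attached one can be gathered roughly as sketched below. This is a hedged reconstruction, not the exact command that was run: the process-name match and the output filename are assumptions based on the process listing above and the attachment name, and jstack requires a JDK on the worker.

```shell
# Sketch: capture a thread dump from the Spark executor JVM on a worker.
# "CoarseGrainedExecutorBackend" matches the process shown above (assumption);
# jstack.txt matches the attachment name (assumption).
PID=""
if command -v pgrep >/dev/null 2>&1; then
  # Take the first matching executor process, if any is running here.
  PID=$(pgrep -f CoarseGrainedExecutorBackend | head -n1)
fi
if [ -n "$PID" ] && command -v jstack >/dev/null 2>&1; then
  jstack "$PID" > jstack.txt
  MSG="wrote jstack.txt for PID $PID"
else
  MSG="no executor JVM (or no jstack) found on this machine"
fi
echo "$MSG"
```

On a multi-executor cluster the same capture would be repeated on each worker, plus once against the spark-shell driver process, as Patrick suggests below.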



On Mon, Aug 25, 2014 at 1:35 PM, Patrick Wendell [via Apache Spark
Developers List] <ml-node+s1001551n8001...@n3.nabble.com> wrote:

> One other idea: when things freeze up, try running jstack on the Spark
> shell process and on the executors, and attach the results. It could be
> that you are somehow encountering a deadlock somewhere.
>
>
> On Mon, Aug 25, 2014 at 1:26 PM, Matei Zaharia <[hidden email]> wrote:
>
> > Was the original issue with Spark 1.1 (i.e. master branch) or an earlier
> > release?
> >
> > One possibility is that your S3 bucket is in a remote Amazon region,
> > which would make it very slow. In my experience, though, saveAsTextFile
> > has worked even for pretty large datasets in that situation, so maybe
> > there's something else in your job causing a problem. Have you tried
> > other operations on the data, like count(), or saving synthetic datasets
> > (e.g. sc.parallelize(1 to 100*1000*1000, 20).saveAsTextFile(...))?
> >
> > Matei
> >
> > On August 25, 2014 at 12:09:25 PM, amnonkhen ([hidden email]) wrote:
> >
> > Hi jerryye,
> > Maybe if you upvoted my question on Stack Overflow, it would get some
> > traction and we would get closer to a solution.
> > Thanks,
> > Amnon
> >
> >
> >
> > --
> > View this message in context:
> >
> http://apache-spark-developers-list.1001551.n3.nabble.com/saveAsTextFile-to-s3-on-spark-does-not-work-just-hangs-tp7795p7991.html
> > Sent from the Apache Spark Developers List mailing list archive at
> > Nabble.com.
> >
> >
>
>


jstack.txt (92K) 
<http://apache-spark-developers-list.1001551.n3.nabble.com/attachment/8006/0/jstack.txt>




--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/saveAsTextFile-to-s3-on-spark-does-not-work-just-hangs-tp7795p8006.html