Re: Spark Exception

2020-11-20 Thread Russell Spitzer
The general exceptions here mean that components within the Spark cluster can't communicate. The most common cause for this is a failure of the processes that are supposed to be communicating. I generally see this when one of the processes goes into a GC storm or is shut down because of an exception …
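
A minimal sketch of the kind of tuning this reply points at (not from the thread; the config keys are standard Spark settings and the values are illustrative):

    import org.apache.spark.sql.SparkSession

    // Raise the network timeout so a long GC pause is less likely to look
    // like a dead peer, and enable GC logging so a real GC storm shows up
    // in the executor logs. Values are illustrative, not recommendations.
    val spark = SparkSession.builder()
      .appName("timeout-and-gc-logging-sketch")
      .config("spark.network.timeout", "300s")
      .config("spark.executor.extraJavaOptions", "-verbose:gc -XX:+PrintGCDetails")
      .getOrCreate()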

Re: Spark Exception

2020-11-20 Thread Amit Sharma
Russell, I increased the RPC timeout to 240 seconds but I am still getting this issue once in a while, and after it occurs my Spark streaming job gets stuck and does not process any requests, so I need to restart it every time. Any suggestions please. Thanks, Amit. On Wed, Nov 18, 2020 at 12:05 PM Amit Sharma …
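
For reference, a sketch of setting the RPC timeout the message describes (the 240-second value is from the message; the builder around it is assumed):

    import org.apache.spark.sql.SparkSession

    // spark.rpc.askTimeout controls how long RPC ask operations wait
    // before failing; 240s matches the value mentioned above.
    val spark = SparkSession.builder()
      .appName("rpc-timeout-sketch")
      .config("spark.rpc.askTimeout", "240s")
      .getOrCreate()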

Re: Spark Exception

2020-11-20 Thread Amit Sharma
Please help. Thanks, Amit. On Wed, Nov 18, 2020 at 12:05 PM Amit Sharma wrote: > Hi, we are running a Spark streaming job and sometimes it throws the two exceptions below. > I do not understand the difference between these two exceptions: > one timeout is 120 seconds and the other is 600 …

Spark Exception

2020-11-18 Thread Amit Sharma
Hi, we are running a Spark streaming job and sometimes it throws the two exceptions below. I do not understand the difference between these two exceptions: one timeout is 120 seconds and the other is 600 seconds. What could be the reason for these? Error running job streaming job 160570 …

Re: spark exception

2020-07-24 Thread Russell Spitzer
Usually this is just a sign that one of the executors quit unexpectedly, which explains the dead executors you see in the UI. The next step is usually to go look at those executor logs and see if there's any reason for the termination. If you end up seeing an abrupt truncation of the log, that …

spark exception

2020-07-24 Thread Amit Sharma
Hi All, sometimes I get this error in the Spark logs. I notice a few executors are shown as dead in the executor tab during this error, although my job succeeds. Please help me find the root cause of this issue. I have 3 workers with 30 cores and 64 GB RAM each. My job uses 3 cores per executor and …

Re: Apache Spark - Exception on adding column to Structured Streaming DataFrame

2018-02-05 Thread M Singh
Hi TD: Just wondering if you have any insight for me or need more info. Thanks. On Thursday, February 1, 2018 7:43 AM, M Singh wrote: Hi TD: Here is the updated code with explain and the full stack trace. Please let me know what could be the issue and what to look for in the explain output.

Re: Apache Spark - Exception on adding column to Structured Streaming DataFrame

2018-02-01 Thread M Singh
Hi TD: Here is the updated code with explain and the full stack trace. Please let me know what could be the issue and what to look for in the explain output. Updated code: import scala.collection.immutable; import org.apache.spark.sql.functions._; import org.joda.time._; import org.apache.spark.sql._; import …

Re: Apache Spark - Exception on adding column to Structured Streaming DataFrame

2018-01-31 Thread Tathagata Das
Could you give the full stack trace of the exception? Also, can you do `dataframe2.explain(true)` and show us the plan output? On Wed, Jan 31, 2018 at 3:35 PM, M Singh wrote: > Hi Folks: > > I have to add a column to a structured *streaming* dataframe but when I > do that (using select or withColumn) …

Apache Spark - Exception on adding column to Structured Streaming DataFrame

2018-01-31 Thread M Singh
Hi Folks: I have to add a column to a structured streaming dataframe, but when I do that (using select or withColumn) I get an exception. I can add a column to a non-streaming structured dataframe. I could not find any documentation on how to do this in the following doc: [https://spar…]
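
The operation being asked about, reduced to a minimal sketch (the rate source, schema, and derived column are assumed for illustration; this is the general pattern, not the poster's code):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("streaming-add-column-sketch")
      .getOrCreate()

    // A streaming DataFrame from the built-in rate source (Spark 2.2+).
    val stream = spark.readStream.format("rate").load()

    // Adding a derived column with withColumn, the operation in question.
    val withExtra = stream.withColumn("valueTimesTwo", col("value") * 2)

    val query = withExtra.writeStream.format("console").start()
    query.awaitTermination()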

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Solved!! The solution is using date_format with the “u” option. Thank you very much. Best, Carlo. On 28 Jul 2016, at 18:59, carlo allocca wrote: Hi Mark, Thanks for the suggestion. I changed the Maven entries as follows: spark-core_2.10 2.0.0 and spark-sql_2.10 2.0.0 …
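
A sketch of the fix as described (tmp5 and ORD_DAYOFWEEK follow the names quoted in the message below; ORD_DATE and the sample row are assumed). Under the SimpleDateFormat patterns that Spark 2.x used, "u" is the day-of-week number, with Monday = 1:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    val spark = SparkSession.builder().master("local[*]").appName("dayofweek-sketch").getOrCreate()
    import spark.implicits._

    // Stand-in for the thread's tmp5; ORD_DATE is an assumed column name.
    val tmp5 = Seq("2016-07-28").toDF("ORD_DATE")
      .withColumn("ORD_DATE", to_date(col("ORD_DATE")))

    // "u" formats a date as its day-of-week number (Monday -> 1 ... Sunday -> 7).
    val tmp6 = tmp5.withColumn("ORD_DAYOFWEEK",
      date_format(col("ORD_DATE"), "u").cast("int"))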

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Hi Mark, Thanks for the suggestion. I changed the Maven entries as follows: spark-core_2.10 2.0.0 and spark-sql_2.10 2.0.0. As a result, it worked once I removed the following line of code, which computes DAYOFWEEK (Monday → 1, etc.): Dataset tmp6 = tmp5.withColumn("ORD_DAYOFWEEK", …

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Mark Hamstra
Don't use Spark 2.0.0-preview. That was a preview release with known issues, intended only for early, pre-release testing purposes. Spark 2.0.0 is now released, and you should be using that. On Thu, Jul 28, 2016 at 3:48 AM, Carlo.Allocca wrote: > and, of course I am using > …
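
In dependency terms, the advice amounts to pinning the released artifacts instead of the preview ones. A sketch in sbt form (the thread itself uses Maven, so these coordinates are just the equivalent):

    // build.sbt sketch: the released 2.0.0 artifacts, not 2.0.0-preview.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.0.0",
      "org.apache.spark" %% "spark-sql"  % "2.0.0"
    )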

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
and, of course, I am using org.apache.spark:spark-core_2.11:2.0.0-preview and org.apache.spark:spark-sql_2.11:2.0.0-preview (jar). Is the below problem/issue related to the experimental …

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
I have also found the following two related links: 1) https://github.com/apache/spark/commit/947b9020b0d621bc97661a0a056297e6889936d3 2) https://github.com/apache/spark/pull/12433. Both explain why it happens, but say nothing about how to solve it. Do you have any suggestion/recommendation …

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Hi Rui, Thanks for the prompt reply. No, I am not using Mesos. OK, I am writing code to build a suitable dataset for my needs, as follows: == Session configuration: SparkSession spark = SparkSession.builder().master("local[6]") // …

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Sun Rui
Are you using Mesos? If not, https://issues.apache.org/jira/browse/SPARK-16522 is not relevant. Could you share more information about your Spark environment, and the full stack trace? > On Jul 28, 2016, at 17:44, Carlo.Allocca wrote: > > Hi …

SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Hi All, I am running Spark locally, and when running d3 = join(d1, d2) and d5 = join(d3, d4) I am getting the following exception: "org.apache.spark.SparkException: Exception thrown in awaitResult". Googling for it, I found that the closest answer is the one reported at https://issues.apache.org/jira/browse/SPARK-16522 …
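
The shape of the failing program, as a minimal runnable sketch (the data and the join key are assumed; the thread only gives the d1..d5 names and the local[6] master):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[6]").appName("join-sketch").getOrCreate()
    import spark.implicits._

    // Stand-ins for the thread's datasets; "key" is an assumed join column.
    val d1 = Seq((1, "a")).toDF("key", "v1")
    val d2 = Seq((1, "b")).toDF("key", "v2")
    val d4 = Seq((1, "c")).toDF("key", "v4")

    val d3 = d1.join(d2, Seq("key"))
    val d5 = d3.join(d4, Seq("key"))

    d5.explain(true)  // inspect the plan; small inputs become broadcast joins
    d5.show()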

Re: Apache Spark Exception in thread “main” java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2016-03-19 Thread Josh Rosen
See the instructions in the Spark documentation: https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211 On Wed, Mar 16, 2016 at 7:05 PM satyajit vegesna wrote: > Hi, > Scala version: 2.11.7 (had to upgrade the Scala version to enable case classes to accept more than …

Fwd: Apache Spark Exception in thread “main” java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2016-03-19 Thread satyajit vegesna
Hi, Scala version: 2.11.7 (had to upgrade the Scala version to enable case classes to accept more than 22 parameters). Spark version: 1.6.1. PFB pom.xml. Getting the below error when trying to set up Spark in the IntelliJ IDE: 16/03/16 18:36:44 INFO spark.SparkContext: Running Spark version 1.6.1 Exception …

Re: Apache Spark Exception in thread “main” java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2016-03-19 Thread Josh Rosen
Err, whoops, looks like this is a user app and not building Spark itself, so you'll have to change your deps to use the 2.11 versions of Spark, e.g. spark-streaming_2.10 -> spark-streaming_2.11. On Wed, Mar 16, 2016 at 7:07 PM Josh Rosen wrote: > See the instructions in the Spark documentation: …
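
With sbt this mismatch cannot happen, because %% appends the project's Scala binary version to the artifact name automatically (a sketch assuming an sbt build rather than the poster's Maven one):

    // build.sbt sketch: %% resolves to spark-streaming_2.11 when
    // scalaVersion is 2.11.x, so the suffix always matches.
    scalaVersion := "2.11.7"

    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"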

Re: Install via directions in "Learning Spark". Exception when running bin/pyspark

2015-10-13 Thread David Bess

Re: Install via directions in "Learning Spark". Exception when running bin/pyspark

2015-10-13 Thread Robineast
… doing that. Robin

Install via directions in "Learning Spark". Exception when running bin/pyspark

2015-10-12 Thread David Bess

Spark exception when sending message to akka actor

2014-12-22 Thread Priya Ch
Hi All, I have Akka remote actors running on 2 nodes. I submitted a Spark application from node1. In the Spark code, in one of the RDDs, I am sending a message to an actor running on node1. My Spark code is as follows: class ActorClient extends Actor with Serializable { import context._ val curre…
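
A minimal sketch of the remote-messaging half of this setup (the system, host, port, and actor names are all assumed; the archive truncates the poster's code):

    import akka.actor.{ActorSelection, ActorSystem}

    // Assumed names throughout; akka.tcp remoting matches the Akka
    // versions in use around this era.
    val system = ActorSystem("ClientSystem")
    val remote: ActorSelection = system.actorSelection(
      "akka.tcp://ServerSystem@node1:2552/user/serverActor")

    // Fire-and-forget message to the remote actor.
    remote ! "records-processed"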

Re: spark Exception while performing saveAsTextFiles

2014-12-08 Thread Akhil Das
… at java.lang.Thread.run(Thread.java:745) > The code was running fine 2 days ago, but now I am facing this error. What can be the reason? …

spark Exception while performing saveAsTextFiles

2014-12-07 Thread Hafiz Mujadid

Spark exception while reading different inputs

2014-08-20 Thread durga
… (Mailbox.scala:237) at akka.dispatch.Mailbox.run(Mailbox.scala:219) at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386) at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) at scala.concurrent.forkjoin.ForkJ…

Error in spark: Exception in thread "delete Spark temp dir"

2014-07-14 Thread Rahul Bhojwani
I am getting an error saying: Exception in thread "delete Spark temp dir C:\Users\shawn\AppData\Local\Temp\spark-b4f1105c-d67b-488c-83f9-eff1d1b95786" java.io.IOException: Failed to delete: C:\Users\shawn\AppData\Local\Temp\spark-b4f1105c-d67b-488c-83f9-eff1d1b95786\tmppr36zu at org.apac…
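
A hedged workaround sketch (not from the thread): relocating Spark's scratch space away from the user temp folder, which on Windows can be held open by other processes so deletes fail at shutdown. The path is an assumption:

    import org.apache.spark.{SparkConf, SparkContext}

    // Assumption: C:/spark-tmp exists, is writable, and is not locked by
    // another process; spark.local.dir moves Spark's temp dirs there.
    val conf = new SparkConf()
      .setAppName("tempdir-sketch")
      .setMaster("local[*]")
      .set("spark.local.dir", "C:/spark-tmp")
    val sc = new SparkContext(conf)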