Is there any chance we could also print the least recent failure in the stage,
in addition to the most recent failure, before the Driver stacktrace?
> >> Caused by: org.apache.spark.SparkException: Job aborted due to stage
> >> failure: Task 10 in stage 1.0 failed 4 times, most recent failure: Lost
> >> task 10
The stack trace is omitted by the JVM when an exception is thrown too
many times. This usually happens when you have multiple Spark tasks on the
same executor JVM throwing the same exception. See
https://stackoverflow.com/a/3010106
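If you need the full traces, one option is to turn that JIT optimization off on the
executors. A minimal sketch, assuming a HotSpot JVM (the app name is just a
placeholder):

    // Disable the "fast throw" optimization so repeated exceptions keep
    // their full stack traces on the executors.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("keep-full-stack-traces")
      // Executor JVMs are launched after this config is set, so the flag reaches them.
      .config("spark.executor.extraJavaOptions", "-XX:-OmitStackTraceInFastThrow")
      .getOrCreate()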
Best Regards,
Ryan
On Tue, Apr 28, 2020 at 10:45 PM lec ssmi wrote:
It is probably a problem with my data quality. It's curious that the driver-side
exception stack has no specific exception information.
On Tue, Apr 28, 2020 at 3:32 PM, Edgardo Szrajber wrote:
> The exception occurred while aborting the stage. It might be interesting to
> try to understand the reason for the abortion.
The exception occurred while aborting the stage. It might be interesting to try
to understand the reason for the abortion. Maybe a timeout? How long does the
query run?
Bentzi
Sent from Yahoo Mail on Android
On Tue, Apr 28, 2020 at 9:25, Jungtaek Lim wrote:
The root cause of the exception occurred on the executor side ("Lost task 10.3 in
stage 1.0 (TID 81, spark6, executor 1)"), so you may need to check there.
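If digging through the executor logs is inconvenient, one option is to surface the
executor-side failure reason on the driver with a task listener. A minimal sketch,
assuming Spark 2.x and its @DeveloperApi listener classes (the println is only
illustrative; `spark` is your existing SparkSession):

    import org.apache.spark.TaskFailedReason
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    // Log the failure reason reported by the executor for every failed task,
    // so the root cause also shows up in the driver log.
    spark.sparkContext.addSparkListener(new SparkListener {
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        taskEnd.reason match {
          case failure: TaskFailedReason =>
            println(s"Task ${taskEnd.taskInfo.id} in stage ${taskEnd.stageId} " +
              s"failed: ${failure.toErrorString}")
          case _ => // successful task, nothing to log
        }
      }
    })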
On Tue, Apr 28, 2020 at 2:52 PM lec ssmi wrote:
> Hi:
> One of my long-running queries occasionally encountered the following
> exception:
>
>
>