The Definitive Guide
Chapter 18: Monitoring and Debugging
"This chapter covers the key details you need to monitor and debug your
Spark Applications. To do this, we will walk through the Spark UI with an
example query designed to help you understand how to trace your own jobs
through the execution…"
>
> Regards,
> Sam
>
> On Thu, 16 Feb 2017 at 22:00, Md. Rezaul Karim <
> rezaul.ka...@insight-centre.org> wrote:
>
>> Hi,
>>
>> I was looking for some URLs/documents for getting started on debugging
>> Spark applications.
>>
>> I prefer developing Spark applications with Scala on Eclipse and then
>> package the application jar before submitting.
>>
>> Kind regards,
>> Reza
>
Hi,
I was looking for some URLs/documents for getting started on debugging
Spark applications.
I prefer developing Spark applications with Scala on Eclipse and then
package the application jar before submitting.
Kind regards,
Reza
On Wed, Aug 26, 2015 at 11:02 PM, Joanne Contact wrote:
> Hi, I have an Ubuntu box with 4GB memory and dual cores. Do you think it
> won't be enough to run Spark Streaming and Kafka? I'm trying to install
> Spark in standalone mode with Kafka so I can debug them in an IDE. Do I
> need to install Hadoop?
Hi,
It should be enough for small-scale local testing, and you don't need a
Hadoop installation to run Spark in local or standalone mode.
Hi, I have an Ubuntu box with 4GB memory and dual cores. Do you think it
won't be enough to run Spark Streaming and Kafka? I'm trying to install
Spark in standalone mode with Kafka so I can debug them in an IDE. Do I
need to install Hadoop?
Thanks!
J
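To the point above: running Spark locally for debugging does not require a
Hadoop installation, and a machine of that size is workable for small tests.
A minimal sketch, assuming the prebuilt Spark and Kafka tarballs are unpacked
at the placeholder paths shown:

```shell
# Hypothetical paths; no Hadoop install is needed for local[] or standalone mode.
./spark/bin/spark-shell --master "local[2]" --driver-memory 1g

# Kafka runs from its own distribution alongside (older Kafka needs ZooKeeper first):
./kafka/bin/zookeeper-server-start.sh ./kafka/config/zookeeper.properties
./kafka/bin/kafka-server-start.sh ./kafka/config/server.properties
```

Keeping the driver and broker heaps modest (around 1g each) leaves headroom on
a 4GB machine.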
Deepesh,
You have to call an action to start actual processing; transformations such
as map and flatMap are lazy and only describe the job.
words.count() would do the trick.
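A quick way to see this laziness, sketched as a spark-shell session in local
mode (assumes a Spark distribution on the PATH; the sample data is made up):

```shell
# flatMap only builds the lineage; count() is the action that submits the job.
spark-shell --master "local[*]" <<'EOF'
val words = sc.parallelize(Seq("a b", "c d")).flatMap(_.split(" "))  // lazy: no job runs yet
println(words.count())  // count() is an action: the job actually executes here
EOF
```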
On 05 Aug 2015, at 11:42, Deepesh Maheshwari
wrote:
> Hi,
>
> As the Spark job is executed when you run the start() method of JavaStreamingContext,
> all the jobs like map and flatMap are already defined ea…
Hi,
As the Spark job is executed when you run the start() method of
JavaStreamingContext, all the jobs like map and flatMap are already defined
earlier. But even though you put breakpoints in the function, the breakpoint
doesn't stop there, so how can I debug the Spark jobs?
JavaDStream<String> words = lines.flatMap(
    line -> Arrays.asList(line.split(" ")).iterator());
For debugging you can refer to these two threads:
http://apache-spark-user-list.1001560.n3.nabble.com/How-do-you-hit-breakpoints-using-IntelliJ-In-functions-used-by-an-RDD-td12754.html
http://mail-archives.apache.org/mod_mbox/spark-user/201410.mbox/%3ccahuq+_ygfioj2aa3e2zsh7zfsv_z-wsorhvbipahxjlm2fj.
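The usual approach in those threads is to run the driver with a JDWP agent
and attach the IDE debugger to it. A sketch (the class and jar names are
placeholders; --driver-java-options and the -agentlib:jdwp string are the
standard spark-submit flag and JVM debug agent):

```shell
# suspend=y makes the JVM wait until a debugger attaches on port 5005.
spark-submit \
  --master "local[*]" \
  --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
  --class com.example.StreamingJob \
  target/streaming-job.jar
```

Then create a "Remote JVM Debug" run configuration in IntelliJ or Eclipse
pointing at localhost:5005. With local[*] the executors run inside the same
JVM as the driver, so breakpoints inside map/flatMap functions are hit too;
on a real cluster those functions run in separate executor JVMs, which is why
they are normally missed.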
Hello experts,
Is there an easy way to debug a Spark Java application?
I'm putting debug logs in the map's function but there aren't any logs on
the console.
Also, can I include my custom jars while launching spark-shell and do my
PoC there?
This might be a naive question, but any help here is appreciated.
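Both questions have short answers; a sketch (the jar paths are placeholders):

```shell
# spark-shell (and spark-submit) accept extra jars as a comma-separated list:
spark-shell --jars /path/to/my-custom.jar,/path/to/helper.jar
```

As for the missing logs: code inside map() executes on the executors, so its
log output lands in the executor stderr/log files (linked from the Executors
tab of the Spark UI), not on the driver console.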
I am a newbie and am looking for pointers to start debugging my Spark app,
and did not find a straightforward tutorial. Any help is appreciated.
Sent from my iPhone
Did you check the executor stderr logs?
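Where those executor logs live depends on the deployment; a sketch (the
application and executor IDs shown are placeholders):

```shell
# Standalone mode: on each worker machine, under the worker's work directory.
cat $SPARK_HOME/work/app-20140516123456-0001/0/stderr

# YARN mode: aggregate the logs after the application finishes.
yarn logs -applicationId application_1400000000000_0001
```

Both are also reachable from the Executors tab of the Spark UI.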
On 5/16/14, 2:37 PM, "Robert James" wrote:
> I have Spark code which runs beautifully when MASTER=local. When I
> run it with MASTER set to a spark ec2 cluster, the workers seem to
> run, but the results, which are supposed to be put to AWS S3, don't
> appear
I have Spark code which runs beautifully when MASTER=local. When I
run it with MASTER set to a spark ec2 cluster, the workers seem to
run, but the results, which are supposed to be put to AWS S3, don't
appear on S3. I'm at a loss for how to debug this. I don't see any
S3 exceptions anywhere.
13 matches