> toDebug.foreachPartition(partition -> {
>     partition.forEachRemaining(message -> {
>         // breakpoint doesn't stop here
>     });
> });
>
> toDebug.first(); // now is when this method will run
>
foreachPartition is a void method.
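For what it's worth, a small self-contained sketch of that shape (the RDD
name toDebug is taken from the snippet above; everything else here is a
placeholder). In the Java API, foreachPartition returns void and is itself
an action, so it triggers the job without any extra call to first():

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ForeachPartitionSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("foreach-partition-sketch")
                .setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaRDD<String> toDebug = sc.parallelize(Arrays.asList("a", "b", "c"));

            // An action: returns void and runs the job immediately. Note that
            // the lambda body executes on the executors, not on the driver.
            toDebug.foreachPartition(partition ->
                    partition.forEachRemaining(message -> System.out.println(message)));
        }
    }
}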
>
>
> 2016-05-31 17:59 GMT-03:00 Marcelo Oikawa :
>
That is not the problem itself, because the code inside
forEachRemaining runs well, but I can't debug this block.
> This is when Spark will run the operations.
> Have you tried that?
>
> Cheers.
>
> 2016-05-31 17:18 GMT-03:00 Marcelo Oikawa :
>
>> Hello, list.
>>
>
Hello, list.
I'm trying to debug my Spark application in IntelliJ IDEA. Before I submit
my job, I run:

export SPARK_SUBMIT_OPTS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=4000

and after that:

bin/spark-submit app-jar-with-dependencies.jar

The IDE connects with the application, but the breakpoints inside
forEachRemaining are never hit.
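A note on why the breakpoint may never be hit: SPARK_SUBMIT_OPTS instruments
only the JVM that spark-submit launches, i.e. the driver, while the body of
forEachRemaining runs inside the executor JVMs. Two possible workarounds
(sketches only; the master, port, and suspend settings are placeholders, not
from the original message):

# Option 1: run everything in a single JVM, so a debugger attached to the
# driver also sees the code inside forEachRemaining.
bin/spark-submit --master local[*] app-jar-with-dependencies.jar

# Option 2: on a real cluster, attach the debugger to the executors instead
# (suspend=n so the executors do not block waiting for the IDE).
bin/spark-submit \
  --conf "spark.executor.extraJavaOptions=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005" \
  app-jar-with-dependencies.jar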
.discoveryPath("/druid/discovery") // ZooKeeper path used for Druid service discovery
.location(druidLocation)
.timestampSpec(timestampSpec)
.rollup(druidRollup)
.tuning(clusteredBeamTuning)
.buildTranquilizer();

tranquilizer.start();
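For completeness, a sketch of how the built instance is then typically driven
(the event map and its fields are placeholders; send, flush, and stop come
from Tranquility's Tranquilizer API and are worth double-checking against
your version):

Map<String, Object> event = new HashMap<>();
event.put("timestamp", DateTime.now().toString()); // must match the timestampSpec
event.put("metric", 1);

tranquilizer.send(event); // asynchronous; returns a Future that reports per-message failures
tranquilizer.flush();     // block until buffered messages are handed off to Druid
tranquilizer.stop();      // release ZooKeeper/Druid resources on shutdown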
That worked for me. Thank you Ted, Alonso.
On Thu, Mar 31, 2016 at 11:37 AM, Marcelo Oikawa <
> marcelo.oik...@webradar.com> wrote:
>
>> Hi, Alonso.
>>
>> As you can see, jackson-core is provided by several libraries; try to
>>> exclude it from spark-core. I think the minor version is included within
>>> the Spark distribution.
>
> 2016-03-31 20:01 GMT+02:00 Marcelo Oikawa :
>
>> Hey, Alonso.
>>
>> here is the output:
>>
>> [INFO] ...
...and the jackson version 2.4.4 was not listed in the maven dependencies...
>
> Consider using the maven-shade-plugin to resolve the conflict if you use Maven.
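For reference, a minimal sketch of a shade relocation that isolates jackson
inside the fat jar (plugin version and shaded package name are placeholders):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals><goal>shade</goal></goals>
            <configuration>
                <relocations>
                    <!-- Rename jackson in the fat jar so it cannot clash with
                         the 2.4.4 copy shipped with the Spark distribution. -->
                    <relocation>
                        <pattern>com.fasterxml.jackson</pattern>
                        <shadedPattern>shaded.com.fasterxml.jackson</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>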
>
> Cheers
>
> On Thu, Mar 31, 2016 at 9:50 AM, Marcelo Oikawa <
> marcelo.oik...@webradar.com> wrote:
Hi, list.
We are working on a Spark application that sends messages to Druid. For
that, we're using Tranquility core. In my local test, I'm using the
"spark-1.6.1-bin-hadoop2.6" distribution and the following dependencies in
my app:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
</dependency>
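As an aside (this command is not from the original message), the standard way
to see which library pulls in which jackson version is Maven's dependency tree:

mvn dependency:tree -Dincludes=com.fasterxml.jackson.core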