You need to run your app in local mode (i.e., master=local[2]) to debug it
locally. If you are running it on a cluster, then you can use the remote
debugging feature:
<http://stackoverflow.com/questions/19128264/how-to-remote-debug-in-intellij-12-1-4>
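
For the local-mode option, a minimal sketch looks like the following (the
class name, app name, and comments are mine, not from the original code):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class LocalDebugExample {
        public static void main(String[] args) {
            // master=local[2] runs the driver and the tasks in a single JVM,
            // so IDE breakpoints set inside FlatMapFunction code will be hit.
            SparkConf conf = new SparkConf()
                    .setAppName("word-count-debug")
                    .setMaster("local[2]");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... build RDDs here and call an action such as count() ...
            sc.stop();
        }
    }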

For remote debugging, you need to pass the following JVM options:

-Xdebug -Xrunjdwp:server=y,transport=dt_socket,address=4000,suspend=n

and then configure your IDE to attach a remote debugger on that port (4000).
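
If you want to wire those options into the Spark executors from the driver
code, one possible sketch (my assumption, using the standard
spark.executor.extraJavaOptions setting; the class and app names are
placeholders) is:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class RemoteDebugExample {
        public static JavaSparkContext createContext() {
            // Forward the JDWP agent options to every executor JVM; suspend=n
            // lets executors start without waiting for a debugger to attach.
            SparkConf conf = new SparkConf()
                    .setAppName("word-count-remote-debug")
                    .set("spark.executor.extraJavaOptions",
                         "-Xdebug -Xrunjdwp:server=y,transport=dt_socket,"
                                 + "address=4000,suspend=n");
            return new JavaSparkContext(conf);
        }
    }

Then add a Remote run configuration in your IDE that attaches to the
executor's host on port 4000.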

Thanks
Best Regards


On Tue, Aug 26, 2014 at 1:32 AM, Sean Owen <so...@cloudera.com> wrote:

> PS from an offline exchange -- yes, more is being called here; the rest
> is the standard WordCount example.
>
> The trick was to make sure the task executes locally, and calling
> setMaster("local") on SparkConf in the example code does that. That
> seems to work fine in IntelliJ for debugging this.
>
> On Mon, Aug 25, 2014 at 6:41 PM, Steve Lewis <lordjoe2...@gmail.com>
> wrote:
> > ????
> > That was not quite in English
> >
> >
> > My FlatMap code is shown below.
> >
> > I know the code is called, since the answers are correct, but I would like
> > to put a break point in dropNonLetters to make sure that code works
> > properly.
> >
> > I am running in the IntelliJ debugger but believe the code is executing on
> > a Spark Worker.
> > I am not sure what magic IntelliJ uses to hook up a debugger to a worker
> > but hope it is possible.
> >
> > import java.util.Arrays;
> > import java.util.regex.Pattern;
> >
> > import org.apache.spark.api.java.function.FlatMapFunction;
> >
> > public class WordsMapFunction implements FlatMapFunction<String, String> {
> >
> >     private static final Pattern SPACE = Pattern.compile(" ");
> >
> >     public Iterable<String> call(String s) {
> >         String[] split = SPACE.split(s);
> >         for (int i = 0; i < split.length; i++) {
> >             split[i] = regularizeString(split[i]);
> >         }
> >         return Arrays.asList(split);
> >     }
> >
> >     public static String dropNonLetters(String s) {
> >         StringBuilder sb = new StringBuilder();
> >         for (int i = 0; i < s.length(); i++) {
> >             char c = s.charAt(i);
> >             if (Character.isLetter(c))
> >                 sb.append(c);
> >         }
> >
> >         return sb.toString();
> >     }
> >
> >
> >     public static String regularizeString(String inp) {
> >         inp = inp.trim();
> >         inp = inp.toUpperCase();
> >         return dropNonLetters(inp);
> >     }
> >
> > }
> >
> >
> > On Mon, Aug 25, 2014 at 10:35 AM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> flatMap() is a transformation only. Calling it by itself does nothing,
> >> and it just describes the relationship between one RDD and another.
> >> You should see it swing into action if you invoke an action, like
> >> count(), on the words RDD.
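
A minimal sketch of this, reusing the lines RDD and the WordsMapFunction
class from earlier in the thread (names assumed from the quoted code):

    // flatMap() only records the transformation; nothing executes yet.
    JavaRDD<String> words = lines.flatMap(new WordsMapFunction());
    // count() is an action, so it forces the flatMap to run; breakpoints
    // inside WordsMapFunction.call() can then be hit (in local mode).
    long total = words.count();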
> >>
> >> On Mon, Aug 25, 2014 at 6:32 PM, Steve Lewis <lordjoe2...@gmail.com>
> >> wrote:
> >> > I was able to get JavaWordCount running with a local instance under
> >> > IntelliJ.
> >> >
> >> > In order to do so I needed to use maven to package my code and call
> >> >
> >> >     String[] jars = { "/SparkExamples/target/word-count-examples_2.10-1.0.0.jar" };
> >> >     sparkConf.setJars(jars);
> >> >
> >> > After that the sample ran properly and in the debugger I could set
> >> > break points in the main.
> >> >
> >> > However when I do
> >> > something like
> >> >    JavaRDD<String> words = lines.flatMap( new WordsMapFunction());
> >> >
> >> > where WordsMapFunction is a separate class like
> >> >
> >> > public static class WordsMapFunction
> >> >         implements FlatMapFunction<String, String> {
> >> >      private static final Pattern SPACE = Pattern.compile(" ");
> >> >      public Iterable<String> call(String s) {
> >> >         String[] split = SPACE.split(s);
> >> >         for (int i = 0; i < split.length; i++) {
> >> >             split[i] = toUpperCase(split[i]);
> >> >         }
> >> >         return Arrays.asList(split);
> >> >     }
> >> > }
> >> >
> >> > Breakpoints set in WordsMapFunction are never hit.
> >> >
> >> > Most of the interesting functionality in the problems I am trying to
> >> > solve is in the FlatMapFunction and the Function2 code, and this is the
> >> > functionality I will need to examine in more detail.
> >> >
> >> > Has anyone figured out how to configure a project to hit breakpoints
> >> > in these functions??
> >
> >
> >
> >
> > --
> > Steven M. Lewis PhD
> > 4221 105th Ave NE
> > Kirkland, WA 98033
> > 206-384-1340 (cell)
> > Skype lordjoe_com
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
