Builds are failing

2016-02-22 Thread Iulian Dragoș
Just in case you missed this: https://issues.apache.org/jira/browse/SPARK-13431 Builds are failing with 'Method code too large' in the "shading" step with Maven. iulian -- Iulian Dragos, Reactive Apps on the JVM, www.typesafe.com

Re: pull request template

2016-02-19 Thread Iulian Dragoș
It's a good idea. I would add the spec for the PR title in there; I always get the order between the JIRA ID and the component wrong. Moreover, CONTRIBUTING.md is also missing it. Any reason not to add it there? I can open PRs for both, but maybe you want to keep that info on the wiki instead. iulian On
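
For reference, a sketch of the title convention in question, with the JIRA ID first and the component second (the component tag here is illustrative):

    [SPARK-XXXX][CORE] One-line summary of the change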

Re: build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Iulian Dragoș
> After this change: > [SPARK-12681] [SQL] split IdentifiersParser.g into two files > the biggest file under > sql/catalyst/src/main/antlr3/org/apache/spark/sql/catalyst/parser is > SparkSqlParser.g > Maybe split SparkSqlParser.g up as well? > On Thu, Jan 28,

build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Iulian Dragoș
Hi, Has anyone seen this error? The code of method specialStateTransition(int, IntStream) is exceeding the 65535 bytes limit (SparkSqlParser_IdentifiersParser.java:39907). The error is in ANTLR-generated files; the JVM caps a single method body at 65535 bytes of bytecode, and (according to Stack Overflow) the overflow is due to state explosion in the parser (or lexer). Th

Re: Unable to compile and test Spark in IntelliJ

2016-01-26 Thread Iulian Dragoș
On Tue, Jan 19, 2016 at 6:06 AM, Hyukjin Kwon wrote: > Hi all, > I have usually been working with Spark in IntelliJ. > Before this PR, > https://github.com/apache/spark/commit/7cd7f2202547224593517b392f56e49e4c94cabc > for > `[SPARK-12575][SQL] Grammar parity with existing SQL parser`. I was

Re: Removing the Mesos fine-grained mode

2016-01-20 Thread Iulian Dragoș
let me see if I can pull some logs > together in the next couple days. > > On Tue, Jan 19, 2016 at 10:08 AM, Iulian Dragoș < > iulian.dra...@typesafe.com> wrote: > >> It would be good to get to the bottom of this. >> >> Adam, could you share the Spark app th

Re: Removing the Mesos fine-grained mode

2016-01-19 Thread Iulian Dragoș
h. Again, the fine and coarse-grained > execution tests are on the exact same machines, exact same dataset, and only > changing spark.mesos.coarse to true/false. > Let me know if there's anything else I can provide here. > Thanks,

Re: [VOTE] Release Apache Spark 1.6.0 (RC4)

2015-12-23 Thread Iulian Dragoș
+1 (non-binding) Tested Mesos deployments (client and cluster-mode, fine-grained and coarse-grained). Things look good. iulian On Wed, Dec 23, 2015 at 2:35 PM, Sean Owen wrote: > Docker integration tests still fail for Mark

Re: [VOTE] Release Apache Spark 1.6.0 (RC3)

2015-12-17 Thread Iulian Dragoș
-0 (non-binding) Unfortunately the Mesos cluster regression is still there (see my comment for explanations). I'm not voting to delay the release any longer though. We tested (and passed) Mesos in: - client mode - fine/coarse-grained

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-17 Thread Iulian Dragoș
> master you can set LIBPROCESS_IP and LIBPROCESS_PORT. > It is a Mesos-specific setting. We can definitely update the docs. > Note that in the future, as we move to the new Mesos HTTP API, these > configurations won't be needed (also libm

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-16 Thread Iulian Dragoș
on slave node with public IP 192.168.56.50 > 1. Set > export LIBPROCESS_IP=192.168.56.50 > export SPARK_LOCAL_IP=192.168.56.50 > 2. Ensure your hostname resolves to public iface IP - (for testing) edit

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-16 Thread Iulian Dragoș
> 1. Set > export LIBPROCESS_IP=192.168.56.50 > export SPARK_LOCAL_IP=192.168.56.50 > 2. Ensure your hostname resolves to public iface IP - (for testing) edit > /etc/hosts to resolve your domain name to 192.168.56.50 > 3. Set

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-16 Thread Iulian Dragoș
Hi Aaron, I never had to use that variable. What is it for? On Wed, Dec 16, 2015 at 2:00 PM, Aaron wrote: > In going through running various Spark jobs, both Spark 1.5.2 and the > new Spark 1.6 SNAPSHOTs, on a Mesos cluster (currently 0.25), we > noticed that in order to run the Spark shells

Re: [VOTE] Release Apache Spark 1.6.0 (RC2)

2015-12-15 Thread Iulian Dragoș
Thanks for the heads up. On Tue, Dec 15, 2015 at 11:40 PM, Michael Armbrust wrote: > This vote is canceled due to the issue with the incorrect version. This > issue will be fixed by https://github.com/apache/spark/pull/10317 > > We can wait a little bit for a fix to > https://issues.apache.org/

Re: [VOTE] Release Apache Spark 1.6.0 (RC2)

2015-12-15 Thread Iulian Dragoș
-1 (non-binding) Cluster mode on Mesos is broken (regression compared to 1.5.2). It seems to be related to the way SPARK_HOME is handled. In the driver logs I see: I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0 I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave 130b
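
For context, a minimal sketch of the kind of cluster-mode submission that hit this regression; the dispatcher host and jar URL are placeholders:

    ./bin/spark-submit \
      --master mesos://dispatcher-host:7077 \
      --deploy-mode cluster \
      --class org.apache.spark.examples.SparkPi \
      http://some-host/spark-examples.jar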

Re: How to debug Spark source using IntelliJ/ Eclipse

2015-12-07 Thread Iulian Dragoș
What errors do you see? I’m using Eclipse and things work pretty much as described (I’m on Scala 2.11, so there’s a slight difference there, but if you’re fine using Scala 2.10 it should be good to go). One little difference: the sbt command is no longer in the sbt directory; instead run: bu
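
A sketch of the relocated launcher being referred to, assuming it is the one shipped under build/ in the Spark checkout:

    # run from the root of the Spark source tree
    ./build/sbt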

Re: Removing the Mesos fine-grained mode

2015-11-23 Thread Iulian Dragoș
On Sat, Nov 21, 2015 at 3:37 AM, Adam McElwee wrote: > I've used fine-grained mode on our mesos spark clusters until this week, > mostly because it was the default. I started trying coarse-grained because > of the recent chatter on the mailing list about wanting to move the mesos > execution path

Re: Removing the Mesos fine-grained mode

2015-11-20 Thread Iulian Dragoș
>> define a Mesos framework. That said, with dyn-allocation and Mesos support >> for resource reservation, oversubscription, and revocation, I think the >> direction is clear that the coarse mode is the proper way forward, and >> having the two code paths is just nois

Removing the Mesos fine-grained mode

2015-11-19 Thread Iulian Dragoș
Hi all, Mesos is the only cluster manager that has a fine-grained mode, but it's more often than not problematic, and it's a maintenance burden. I'd like to suggest removing it in the 2.0 release. A few reasons: - code/maintenance complexity. The two modes duplicate a lot of functionality (and s

Re: Mesos cluster dispatcher doesn't respect most args from the submit req

2015-11-17 Thread Iulian Dragoș
Hi Jo, I agree that there's something fishy with the cluster dispatcher; I've seen some issues like that. I think it actually tries to send all properties as part of `SPARK_EXECUTOR_OPTS`, which may not be everything that's needed: https://github.com/jayv/spark/blob/mesos_cluster_params/core/src

Re: Please reply if you use Mesos fine grained mode

2015-11-04 Thread Iulian Dragoș
Probably because only coarse-grained mode respects `spark.cores.max` right now. See (and maybe review ;-)) #9027 (sorry for the shameless plug). iulian On Wed, Nov 4, 2015 at 5:05 PM, Timothy Chen wrote: > Hi Chris, > > How does coarse grain mode give
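
A minimal sketch of the two settings discussed in this thread, with illustrative values and a placeholder master URL:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("example")                     // illustrative app name
      .setMaster("mesos://zk://host:2181/mesos") // placeholder Mesos master
      .set("spark.mesos.coarse", "true")         // per the thread, only this mode honors the cap below
      .set("spark.cores.max", "8")               // illustrative cluster-wide core cap
    val sc = new SparkContext(conf)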

Building Spark w/ 1.8 and binary incompatibilities

2015-10-19 Thread Iulian Dragoș
Hey all, tl;dr: I built Spark with Java 1.8 even though my JAVA_HOME pointed to 1.7. Then it failed with binary incompatibilities. I couldn’t find any mention of this in the docs, so it might be a known thing, but it’s definitely too easy to do the wrong thing. The problem is that Maven is using
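
A quick way to spot the mismatch described above (a sketch): mvn reports the JDK it actually runs with, which can differ from what the shell's PATH suggests.

    mvn -version      # shows the Java version Maven uses
    echo $JAVA_HOME   # what you think it should be
    javac -version    # what the PATH actually provides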

Re: Scala 2.11 builds broken / Can the PR build also run 2.11?

2015-10-12 Thread Iulian Dragoș
On Fri, Oct 9, 2015 at 10:34 PM, Patrick Wendell wrote: > I would push back slightly. The reason we have the PR builds taking so > long is death by a million small things that we add. Doing a full 2.11 > compile is on the order of minutes... it's a nontrivial increase to the build times. > We can host the

Re: Scala 2.11 builds broken / Can the PR build also run 2.11?

2015-10-09 Thread Iulian Dragoș
--- >> [INFO] Total time: 17:49 min >> FYI >> On Thu, Oct 8, 2015 at 6:50 AM, Ted Yu wrote: >>> Interesting >>> https://amplab.cs.berkeley.edu/je

Scala 2.11 builds broken / Can the PR build also run 2.11?

2015-10-08 Thread Iulian Dragoș
Since Oct. 4 the build fails on 2.11 with the dreaded [error] /home/ubuntu/workspace/Apache Spark (master) on 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: no valid targets for annotation on value conf - it is discarded unused. You may specify targets with meta-annotat
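
The compiler hint cut off above refers to Scala's meta-annotations; a minimal sketch of the warning and the fix the compiler suggests, with illustrative class and parameter names:

    import scala.annotation.meta.param

    // 2.11 warns here: the plain constructor parameter is not a field,
    // so @transient has no valid target and is discarded
    class Endpoint(@transient conf: Map[String, String])

    // explicitly targeting the parameter silences the warning
    class EndpointFixed(@(transient @param) conf: Map[String, String])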

Re: Automatically deleting pull request comments left by AmplabJenkins

2015-08-14 Thread Iulian Dragoș
On Fri, Aug 14, 2015 at 4:21 AM, Josh Rosen wrote: > Prototype is at https://github.com/databricks/spark-pr-dashboard/pull/59 > > On Wed, Aug 12, 2015 at 7:51 PM, Josh Rosen wrote: > >> *TL;DR*: would anyone object if I wrote a script to auto-delete pull >> request comments from AmplabJenkins? >

Re: non-deprecation compiler warnings are upgraded to build errors now

2015-07-25 Thread Iulian Dragoș
e most common one (if not the only one). iulian > On Fri, Jul 24, 2015 at 10:24 AM, Iulian Dragoș <iulian.dra...@typesafe.com> wrote: >> On Thu, Jul 23, 2015 at 6:08 AM, Reynold Xin wrote: >>> Hi all, >>> FYI, we just merged a p

Re: non-deprecation compiler warnings are upgraded to build errors now

2015-07-24 Thread Iulian Dragoș
On Thu, Jul 23, 2015 at 6:08 AM, Reynold Xin wrote: > Hi all, > FYI, we just merged a patch that fails a build if there is a Scala > compiler warning (if it is not a deprecation warning). I’m a bit confused, since I see quite a lot of warnings in semi-legitimate code. For instance, @transient (p

Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-29 Thread Iulian Dragoș
On Mon, Jun 29, 2015 at 3:02 AM, Alessandro Baretta wrote: > I am building the current master branch with Scala 2.11 following these > instructions: > > Building for Scala 2.11 > > To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 > property: > > dev/change-version-to-2.1
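
The steps being quoted from the docs of that era, reconstructed as a sketch:

    ./dev/change-version-to-2.11.sh
    mvn -Dscala-2.11 -DskipTests clean package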

Various forks

2015-06-25 Thread Iulian Dragoș
Could someone point me to the source of the Spark fork used to build genjavadoc-plugin? Even more important would be to know the reasoning behind this fork. Ironically, it hinders my attempts at removing another fork, the Spark REPL fork (and the upgrade to Scala 2.11.7). See here

Re: [VOTE] Release Apache Spark 1.4.0 (RC2)

2015-05-26 Thread Iulian Dragoș
I tried the 1.4.0-rc2 binaries on a 3-node Mesos cluster; everything seemed to work fine, both spark-shell and spark-submit. Cluster-mode deployment also worked. +1 (non-binding) iulian On Tue, May 26, 2015 at 4:44 AM, jameszhouyi wrote: > Compiled: > git clone https://github.com/apache/spark.git

Why use "lib_managed" for the Sbt build?

2015-05-21 Thread Iulian Dragoș
I’m trying to understand why sbt is configured to pull all libs under lib_managed. - it seems like unnecessary duplication (I will have those libraries under ~/.m2, via Maven, anyway) - every time I call make-distribution I lose lib_managed (via mvn clean install) and have to wait to dow
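
For reference, a sketch of the sbt setting that produces this behavior; that SparkBuild.scala sets it exactly this way is an assumption:

    // in the sbt build definition: copy every resolved dependency
    // under lib_managed/ in addition to the Ivy cache
    retrieveManaged := true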

Re: Problem building master on 2.11

2015-05-19 Thread Iulian Dragoș
There's an open PR to fix it. If you could try it and report back on the PR, it'd be great - it's more likely to get in fast that way. https://github.com/apache/spark/pull/6260 On Mon, May 18, 2015 at 6:43 PM, Fernando O. wrote: > I just noticed I sent this to users instead of dev: > -- Forwarded mes

Re: IntelliJ Spark Source Compilation

2015-05-11 Thread Iulian Dragoș
Oh, I see. Then try to run one build on the command line first (or try sbt avro:generate, though I’m not sure it’s enough). I just noticed that I have an additional source folder, target/scala-2.10/src_managed/main/compiled_avro, for spark-streaming-flume-sink. I guess I built the project once and

Re: IntelliJ Spark Source Compilation

2015-05-11 Thread Iulian Dragoș
Hi, `old-deps` is not really a project, so you can simply skip it (or close it). The rest should work fine (clean and build all). On Sat, May 9, 2015 at 10:27 PM, rtimp wrote: > Hi Iulian, > > Thanks for the reply! > > With respect to eclipse, I'm doing this all with a fresh download of the > s

Re: IntelliJ Spark Source Compilation

2015-05-09 Thread Iulian Dragoș
On Sat, May 9, 2015 at 12:29 AM, rtimp wrote: > Hello, > > I'm trying to compile the master branch of the spark source (25889d8) in > intellij. I followed the instructions in the wiki > https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools, > namely I downloaded IntelliJ 14.1.2

Re: Speeding up Spark build during development

2015-05-05 Thread Iulian Dragoș
I'm probably the only Eclipse user here, but it seems I have the best workflow :) At least for me things work as they should: once I imported projects in the workspace I can build and run/debug tests from the IDE. I only go to sbt when I need to re-create projects or I want to run the full test sui

Re: Update Wiki Developer instructions

2015-05-04 Thread Iulian Dragoș
th a suggested text change if it is significant enough to > need discussion. If it's trivial, just post it here and someone can > take care of it. > On Mon, May 4, 2015 at 2:32 PM, Iulian Dragoș wrote: > > I'd like to update the information about using Ec

Update Wiki Developer instructions

2015-05-04 Thread Iulian Dragoș
I'd like to update the information about using Eclipse to develop on the Spark project found on this page: https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=38572224 I don't see any way to edit this page (I created an account). Since it's a wiki, I assumed it's supposed to be edita

Re: Unit tests

2015-02-10 Thread Iulian Dragoș
February 9, 2015 at 5:47:59 AM, Iulian Dragoș (iulian.dra...@typesafe.com) wrote: > Hi Patrick, > Thanks for the heads up. I was trying to set up our own infrastructure for > testing Spark (essentially, running `run-tests` every night) on EC2. I > stumbled upon a nu

Re: Unit tests

2015-02-09 Thread Iulian Dragoș
Hi Patrick, Thanks for the heads up. I was trying to set up our own infrastructure for testing Spark (essentially, running `run-tests` every night) on EC2. I stumbled upon a number of flaky tests, but none of them look similar to anything in Jira with the flaky-test tag. I wonder if there's someth
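
A sketch of the nightly job described, assuming a Spark checkout under the user's home directory:

    # crontab entry: run the full suite at 02:00 every night
    0 2 * * * cd $HOME/spark && ./dev/run-tests > run-tests.log 2>&1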

UnknownHostException while running YarnTestSuite

2015-01-27 Thread Iulian Dragoș
Hi, I’m trying to run the Spark test suite on an EC2 instance, but I can’t get the Yarn tests to pass. The hostname I get on that machine is not resolvable; adding a line in /etc/hosts makes all the other tests pass, but not the Yarn ones. Any help is greatly appreciated! thanks, iulian ubuntu@ip-1
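
A hypothetical /etc/hosts entry of the kind described, mapping the instance's unresolvable EC2 hostname to its private address (hostname and address are placeholders):

    172.31.0.12   ip-172-31-0-12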