Should be an easy rebase for your PR, so I went ahead just to get this fixed up:
https://github.com/apache/spark/pull/1466
On Thu, Jul 17, 2014 at 5:32 PM, Ted Malaska wrote:
Don't make this change yet. I have a 1642 that needs to get through around
the same code.
I can make this change after 1642 is through.
On Thu, Jul 17, 2014 at 12:25 PM, Sean Owen wrote:
CC tmalaska since he touched the line in question. This is a fun one.
So, here's the line of code added last week:
val channelFactory = new NioServerSocketChannelFactory
(Executors.newCachedThreadPool(), Executors.newCachedThreadPool());
Scala parses this as two statements: one invoking the no-arg constructor, and
a second that is just a tuple expression whose value is discarded.
er, that line being in toDebugString, where it really shouldn't affect
anything (no signature changes or the like)
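To illustrate the parsing gotcha above, here is a minimal, self-contained sketch. The class `Demo` is hypothetical, standing in for NioServerSocketChannelFactory, which likewise has both a no-arg and a multi-arg constructor, which is why the mis-parsed code still compiled:

```scala
// Hypothetical stand-in for NioServerSocketChannelFactory: like it, this
// class offers both a no-arg constructor and a two-arg constructor.
class Demo(val tag: String) {
  def this() = this("no-arg")
  def this(a: Int, b: Int) = this(s"two-arg($a,$b)")
}

object ParseDemo extends App {
  // Newline before the argument list: semicolon inference splits this into
  // `new Demo` (the no-arg constructor) followed by a discarded tuple (2, 3).
  val broken = new Demo
  (2, 3)

  // Argument list on the same line: the two-arg constructor is invoked.
  val fixed = new Demo(2, 3)

  println(broken.tag) // prints "no-arg"
  println(fixed.tag)  // prints "two-arg(2,3)"
}
```

The fix is simply to keep the opening parenthesis on the same line as the constructor name.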
On Thu, Jul 17, 2014 at 10:58 AM, Nathan Kronenfeld <
nkronenf...@oculusinfo.com> wrote:
My full build command is:
./sbt/sbt -Dhadoop.version=2.0.0-mr1-cdh4.6.0 clean assembly
I've changed one line in RDD.scala, nothing else.
On Thu, Jul 17, 2014 at 10:56 AM, Sean Owen wrote:
This looks like a Jetty version problem actually. Are you bringing in
something that might be changing the version of Jetty used by Spark?
It depends a lot on how you are building things.
Good to specify exactly how you're building here.
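One way to check for a Jetty version conflict (a sketch, assuming a Maven-side build of the same tree; the `core` module name is an assumption, adjust to your checkout) is to print the dependency tree and look for Jetty:

```shell
# Sketch: list which Jetty artifacts the build resolves. The hadoop.version
# property mirrors the sbt command in the thread.
mvn -Dhadoop.version=2.0.0-mr1-cdh4.6.0 dependency:tree -pl core | grep -i jetty
```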
On Thu, Jul 17, 2014 at 3:43 PM, Nathan Kronenfeld
wrote:
I'm trying to compile the latest code, with the hadoop-version set for
2.0.0-mr1-cdh4.6.0.
I'm getting the following error, which I don't get when I don't set the
hadoop version:
[error]
/data/hdfs/1/home/nkronenfeld/git/spark-ndk/external/flume/src/main/scala/org/apache/spark/streaming/flume/Flu