This looks like a Netty version conflict, actually (the error is from
org.jboss.netty). Are you bringing in something that might be changing
the version of Netty used by Spark? It depends a lot on how you are
building things.

It would help to specify exactly how you're building here.
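If it does turn out to be a transitive dependency dragging in an older
Netty, one common fix is to exclude that artifact so the version Spark
expects wins. A rough build.sbt sketch (the coordinates and version below
are assumptions -- adjust them to whatever your build actually depends on):

```scala
// Hypothetical build.sbt fragment -- coordinates/version are placeholders.
// Exclude the old Netty (org.jboss.netty) that a Hadoop/Flume transitive
// dependency may pull in, so only the newer Netty lands on the classpath.
libraryDependencies += ("org.apache.spark" %% "spark-streaming-flume" % "1.0.1")
  .exclude("org.jboss.netty", "netty")
```

With Maven you can diagnose the same thing with `mvn dependency:tree` and
add an `<exclusion>` on the offending dependency.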

On Thu, Jul 17, 2014 at 3:43 PM, Nathan Kronenfeld
<nkronenf...@oculusinfo.com> wrote:
> I'm trying to compile the latest code, with the hadoop-version set for
> 2.0.0-mr1-cdh4.6.0.
>
> I'm getting the following error, which I don't get when I don't set the
> hadoop version:
>
> [error]
> /data/hdfs/1/home/nkronenfeld/git/spark-ndk/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:156:
> overloaded method constructor NioServerSocketChannelFactory with
> alternatives:
> [error]   (x$1: java.util.concurrent.Executor,x$2:
> java.util.concurrent.Executor,x$3:
> Int)org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory <and>
> [error]   (x$1: java.util.concurrent.Executor,x$2:
> java.util.concurrent.Executor)org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory
> [error]  cannot be applied to ()
> [error]       val channelFactory = new NioServerSocketChannelFactory
> [error]                            ^
> [error] one error found
>
>
> I don't know flume from a hole in the wall - does anyone know what I can do
> to fix this?
>
>
> Thanks,
>          -Nathan
>
>
> --
> Nathan Kronenfeld
> Senior Visualization Developer
> Oculus Info Inc
> 2 Berkeley Street, Suite 600,
> Toronto, Ontario M5A 4J5
> Phone:  +1-416-203-3003 x 238
> Email:  nkronenf...@oculusinfo.com
