We use fine-grained mode. Coarse-grained mode keeps JVMs around, which often
leads to OOMs; those in turn kill the entire executor, causing entire
stages to be retried. In fine-grained mode, only the task fails and is
subsequently retried without taking out an entire stage or worse.
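For reference, the scheduler mode is controlled by a single property; a minimal config sketch in spark-defaults.conf style (assuming a standard Mesos-backed deployment, not taken from this thread):

```
# Run Spark tasks as short-lived Mesos tasks (fine-grained)
# instead of holding long-lived executor JVMs (coarse-grained)
spark.mesos.coarse   false
```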
On Tue, Nov 3
Yieldbot is pleased to announce the release of Flambo, our Clojure DSL for
Apache Spark.
Flambo allows one to write Spark applications in pure Clojure, as an
alternative to the Scala, Java, and Python APIs currently available in Spark.
We have already written a substantial amount of internal code in Clojure
Are the Hadoop configuration files on the classpath for your Mesos
executors?
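If they are not, one common fix (an assumption about the setup, not something confirmed in this thread) is to export the Hadoop conf directory in conf/spark-env.sh so executors pick up core-site.xml and hdfs-site.xml:

```
# conf/spark-env.sh -- path is hypothetical; use your cluster's conf dir
export HADOOP_CONF_DIR=/etc/hadoop/conf
```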
On Thu, Jul 3, 2014 at 6:45 PM, Steven Cox wrote:
> ...and a real subject line.
> --
> *From:* Steven Cox [s...@renci.org]
> *Sent:* Thursday, July 03, 2014 9:21 PM
> *To:* user@spark.apa
I typed "spark parquet" into Google and the top result was this blog post
about reading and writing Parquet files from Spark:
http://zenfractal.com/2013/08/21/a-powerful-big-data-trio/
On Mon, Jul 7, 2014 at 5:23 PM, Michael Armbrust
wrote:
> SchemaRDDs, provided by Spark SQL, have a saveAsPar
Hello,
I get a lot of these exceptions on my Mesos cluster when running Spark jobs:
14/07/19 16:29:43 WARN spark.network.SendingConnection: Error finishing
connection to prd-atl-mesos-slave-010/10.88.160.200:37586
java.net.ConnectException: Connection timed out
at sun.nio.ch.SocketChannelImpl.che
Anybody? Seems like a reasonable thing to be able to do, no?
On Fri, Mar 21, 2014 at 3:58 PM, Benjamin Black wrote:
> Howdy, folks!
>
> Anybody out there having a working kafka _output_ for Spark streaming?
> Perhaps one that doesn't involve instantiating a new producer for every
> batch?
>
> Th
Hello,
Is it possible to use a custom class as my Spark job's KryoSerializer when
running under Mesos?
I've tried adding my jar containing the class to my Spark context (via
SparkConf.addJars), but I always get:
java.lang.ClassNotFoundException: flambo.kryo.FlamboKryoSerializer
at java.net.URLCla
Does spark support extending and registering a KryoSerializer class in an
application jar?
An example of why you might want to do this would be to always register
some set of common classes within an organization while still allowing the
particular application jar to use a Kryo registrator to regi
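For context, Spark's documented hook for this is the `spark.kryo.registrator` property together with the Kryo serializer; a minimal config sketch (the registrator class name is hypothetical):

```
# spark-defaults.conf
spark.serializer        org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator  com.example.MyKryoRegistrator
```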
There is a JavaSparkContext, but no JavaSparkConf object. I know SparkConf
is new in 0.9.x.
Is there a plan to add something like this to the Java API?
It's rather a bother to have things like setAll take a Scala
Traversable[(String, String)] when using SparkConf from the Java API.
At a minimum addi
rsions. The class itself is simple. But I agree adding java
> setters would be nice.
>
> On Tue, Apr 29, 2014 at 8:32 PM, Soren Macbeth wrote:
> > There is a JavaSparkContext, but no JavaSparkConf object. I know
> SparkConf
> > is new in 0.9.x.
> >
> > Is t
Hallo,
I'm getting this rather crazy Kryo exception trying to run my Spark job:
Exception in thread "main" org.apache.spark.SparkException: Job aborted:
Exception while deserializing and fetching task:
com.esotericsoftware.kryo.KryoException:
java.lang.IllegalArgumentException: Can not set final
so
It seems that it is dying while trying to fetch results from my tasks to
return to the driver.
Am I close?
On Fri, May 2, 2014 at 3:35 PM, Soren Macbeth wrote:
> Hallo,
>
> I'm getting this rather crazy Kryo exception trying to run my Spark job:
>
> Ex
Does this perhaps have to do with the spark.closure.serializer?
On Sat, May 3, 2014 at 7:50 AM, Soren Macbeth wrote:
> Poking around in the bowels of Scala, it seems like this has something to
> do with implicit Scala -> Java collection munging. Why would it be doing
> this an
Is this supposed to be supported? It doesn't work, at least in Mesos
fine-grained mode. First it fails a bunch of times because it can't find my
registrator class, because my assembly jar hasn't been fetched yet, like so:
java.lang.ClassNotFoundException: pickles.kryo.PicklesRegistrator
at java.
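One common mitigation (an assumption, not confirmed in this thread) is to place the assembly jar on the executor classpath up front, so the registrator class is visible before any fetched jars are needed; a sketch with a hypothetical path:

```
# spark-defaults.conf -- jar must already exist at this path on every slave
spark.executor.extraClassPath   /opt/myapp/my-assembly.jar
```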