Hi,
I had a closer look into this and opened a PR to fix the issue:
https://github.com/apache/flink/pull/1457
As there may be side effects I am not aware of, please give feedback
on whether this fix can be merged...
Thx.
-Matthias
On 12/11/2015 06:26 PM, Nick Dimiduk wrote:
Ah I see. This explains the issues I had with submitting streaming jobs
that package JDBC drivers. Is there a section in the guide/docs about
classloader considerations with Flink?
On Thu, Dec 10, 2015 at 11:53 PM, Stephan Ewen wrote:
Flink's classloading is different from Hadoop's.
In Hadoop, the entire JVM is started with all classes (including the user
jar) on the classpath already. In Flink, jars are added dynamically to
running JVMs via custom class loaders. That way, running worker/master
processes can accept new jars without being restarted.
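As a rough illustration of that difference (this is not Flink's actual code; the class and jar path are made up), a running JVM can pick up classes from a jar supplied after startup through a URLClassLoader layered on top of the system class loader:

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch: a child class loader wrapping a user jar that was submitted
// to an already-running JVM. Classes not found in the parent would be
// resolved from the user jar.
public class DynamicJarLoading {
    public static ClassLoader loaderFor(URL userJar, ClassLoader parent) {
        return new URLClassLoader(new URL[] { userJar }, parent);
    }

    public static void main(String[] args) throws Exception {
        // Without a real user jar, we can still show the delegation model:
        // classes already on the parent's classpath resolve to the same Class.
        ClassLoader cl = loaderFor(new URL("file:///tmp/user-job.jar"),
                                   DynamicJarLoading.class.getClassLoader());
        Class<?> c = cl.loadClass("java.lang.String");
        System.out.println(c == String.class);
    }
}
```

The crucial point is that such classes are visible only through the custom loader, not through the system class loader that started the JVM.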
As far as the jvm is concerned, clojure is just another library. You should
be able to package it up like any other dependency and submit the job.
That's always how it worked in Hadoop/MR anyway...
On Thu, Dec 10, 2015 at 3:22 PM, Matthias J. Sax wrote:
Thanks for this idea.
I extended my pom to include clojure-1.5.1.jar in my program jar.
However, the problem is still there... I did some research on the
Internet, and it seems I need to mess around with Clojure's class
loading strategy...
-Matthias
On 12/10/2015 06:47 PM, Nick Dimiduk wrote:
I think Mattias's project is using maven though -- there's a pom in the
project that doesn't look generated. If you want to do it from lein, maybe
my old lein-hadoop [0] plugin can help?
[0]: https://github.com/ndimiduk/lein-hadoop
On Thu, Dec 10, 2015 at 8:54 AM, Robert Metzger wrote:
I had the same thought as Nick. Maybe Leiningen allows building a
fat jar containing the Clojure standard library.
On Thu, Dec 10, 2015 at 5:51 PM, Nick Dimiduk wrote:
What happens when you follow the packaging examples provided in the flink
quick start archetypes? These have the maven-foo required to package an
uberjar suitable for flink submission. Can you try adding that step to your
pom.xml?
On Thursday, December 10, 2015, Stephan Ewen wrote:
This is a problem in Java.
I think you cannot dynamically modify the initial system class loader.
What most apps do is check for the thread context class loader when
dynamically loading classes. We can check and make sure that one is set,
but if Clojure does not respect that, we have a problem.
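A minimal sketch of the context-class-loader pattern Stephan describes (the names here are illustrative, not Flink's actual implementation): code that loads classes by name should consult the thread context class loader, which the framework sets to the user-code loader before invoking the job.

```java
// Sketch: dynamic class loading that respects the thread context class
// loader, falling back to the defining class loader if none is set.
public class ContextClassLoaderDemo {
    public static Class<?> loadUserClass(String name) throws ClassNotFoundException {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        if (cl == null) {
            cl = ContextClassLoaderDemo.class.getClassLoader();
        }
        return Class.forName(name, true, cl);
    }

    public static void main(String[] args) throws Exception {
        ClassLoader previous = Thread.currentThread().getContextClassLoader();
        try {
            // A framework would set this to the user-jar class loader before
            // running user code; here we reuse the current loader to show
            // the mechanics.
            Thread.currentThread()
                  .setContextClassLoader(ContextClassLoaderDemo.class.getClassLoader());
            System.out.println(loadUserClass("java.util.ArrayList").getSimpleName());
        } finally {
            Thread.currentThread().setContextClassLoader(previous);
        }
    }
}
```

Libraries that instead hard-code `Class.forName(name)` (resolving against their own defining loader) will miss classes that are only visible through the context loader, which is the failure mode described in this thread.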
Would it make sense (if possible?) for Flink to add the user jar
dynamically to its own classpath so Clojure can find it? Or somehow
modify Clojure's class loader?
The jars in lib are added to the classpath at startup. This makes it
practically impossible to execute a Flink program that is written in
Clojure.
Clojure is not considering the user-jar when trying to load the class.
On 10 Dec 2015, at 17:05, Matthias J. Sax wrote:
Hi Squirrels,
I was playing with a Flink Clojure WordCount example today.
https://github.com/mjsax/flink-external/tree/master/flink-clojure
After building the project with "mvn package" I tried to submit it to a
local cluster. Before I started the cluster, I manually copied
"clojure-1.5.1.jar" into Flink's lib folder.