Flink's classloading is different from Hadoop's.
In Hadoop, the entire JVM is started with all classes (including the user
jar) already on the classpath. In Flink, jars are added dynamically to
running JVMs with custom class loaders. That way, running worker/master
processes can accept new jars without restarting.
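Roughly, the mechanism looks like the sketch below (made-up names throughout; this is not Flink's actual user-code class loader, the jar path and class name are placeholders):

import java.net.URL;
import java.net.URLClassLoader;

// Sketch only: load a job class from a user jar that was never on the
// system classpath, via a dedicated child class loader.
public class DynamicJarLoading {
    public static void main(String[] args) throws Exception {
        URL userJar = new URL("file:///tmp/user-job.jar"); // placeholder path

        try (URLClassLoader userCodeLoader =
                 new URLClassLoader(new URL[]{userJar},
                                    DynamicJarLoading.class.getClassLoader())) {
            // Resolve a class from the user jar through the custom loader.
            Class<?> jobClass = Class.forName("com.example.WordCount", true, userCodeLoader);
            System.out.println("Loaded " + jobClass.getName());
        }
    }
}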
As far as the JVM is concerned, Clojure is just another library. You should
be able to package it up like any other dependency and submit the job.
That's always how it worked in Hadoop/MR anyway...
On Thu, Dec 10, 2015 at 3:22 PM, Matthias J. Sax wrote:
> Thanks for this idea.
Thanks for this idea.
I extended my pom to include clojure-1.5.1.jar in my program jar.
However, the problem is still there... I did some research on the
Internet, and it seems I need to mess around with Clojure's class
loading strategy...
-Matthias
On 12/10/2015 06:47 PM, Nick Dimiduk wrote:
The pluggable architecture is great! (Why don't we call it Flink Plugins?
My 2 cents.)
It would be nice if we came up with an idea of what the directory structure
should look like before we start dumping connectors (plugins) into it.
Also, I wonder what to do with versioning.
At some point, for example, the Twitter v1 connector...
You are right. I'll post the link with my next message on the maven user
list.
Here is the link to the maven discussion:
http://mail-archives.apache.org/mod_mbox/maven-users/201512.mbox/%3CCAGr9p8CdMJL0sdbewgZ5WqJ0_4OWCox-mw00T7Vn3KDxz9PjtA%40mail.gmail.com%3E
His last answer says that Maven 3.2's
I think Matthias's project is using Maven, though -- there's a pom in the
project that doesn't look generated. If you want to do it from lein, maybe
my old lein-hadoop [0] plugin can help?
[0]: https://github.com/ndimiduk/lein-hadoop
On Thu, Dec 10, 2015 at 8:54 AM, Robert Metzger wrote:
> I had the same thought as Nick.
Lol. Okay, thanks a bunch. Mind linking back here with your discussion
thread on the maven list? This will be of interest to other projects as
well.
On Thursday, December 10, 2015, Robert Metzger wrote:
> I further looked into the issue.
I had the same thought as Nick. Maybe Leiningen can somehow build a
fat jar containing the Clojure standard library.
On Thu, Dec 10, 2015 at 5:51 PM, Nick Dimiduk wrote:
> What happens when you follow the packaging examples provided in the Flink
> quick start archetypes?
What happens when you follow the packaging examples provided in the Flink
quick start archetypes? These have the maven-foo required to package an
uberjar suitable for Flink submission. Can you try adding that step to your
pom.xml?
On Thursday, December 10, 2015, Stephan Ewen wrote:
> This is a problem in Java.
This is a problem in Java.
I think you cannot dynamically modify the initial system class loader.
What most apps do is check for the thread context class loader when
dynamically loading classes. We can check and make sure that one is set,
but if Clojure does not respect that, we have a problem.
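For what it's worth, the usual pattern for "making sure one is set" is something like this sketch (illustrative names, not Flink's actual code):

// Illustrative helper: run user code with the user-jar class loader installed
// as the thread context class loader, restoring the previous loader after.
public final class ContextClassLoaderUtil {
    public static void runWithContextClassLoader(ClassLoader userCodeLoader, Runnable task) {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(userCodeLoader);
        try {
            task.run();
        } finally {
            current.setContextClassLoader(previous);
        }
    }
}

If I remember correctly, Clojure's RT.baseLoader() does fall back to the context class loader when *use-context-classloader* is true (which is the default), so this might be enough.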
Would it make sense (if possible?) for Flink to add the user jar
dynamically to its own classpath so Clojure can find it? Or somehow
modify Clojure's class loader?
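For the first idea, the classic (and fragile) trick is reflection on URLClassLoader.addURL. A sketch, assuming the system class loader actually is a URLClassLoader, which holds on Java 7/8:

import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

// Hack, not Flink code: inject a jar into the running system classpath.
public class ClasspathInjector {
    public static void addToSystemClasspath(URL jarUrl) throws Exception {
        URLClassLoader systemLoader = (URLClassLoader) ClassLoader.getSystemClassLoader();
        Method addURL = URLClassLoader.class.getDeclaredMethod("addURL", URL.class);
        addURL.setAccessible(true); // addURL is protected
        addURL.invoke(systemLoader, jarUrl);
    }
}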
The jars in lib are added to the classpath at startup. This makes it
practically impossible to execute a Flink program that is written in Clojure.
Clojure is not considering the user-jar when trying to load the class.
> On 10 Dec 2015, at 17:05, Matthias J. Sax wrote:
>
> Hi Squirrels,
>
> I was playing with a Flink Clojure WordCount example today.
> https://github.com/mjsax/flink-external/tree/master/flink-clojure
Hi Squirrels,
I was playing with a Flink Clojure WordCount example today.
https://github.com/mjsax/flink-external/tree/master/flink-clojure
After building the project with "mvn package" I tried to submit it to a
local cluster. Before I started the cluster, I manually copied
"clojure-1.5.1.jar" in
Hi All,
I want to discuss some ideas about improving the primitives/operations that
Flink offers for user state, timers, and windows, and how these concepts can
be unified.
It has come up a lot lately that people have very specific requirements
regarding the state that they keep, and it seems necessary...
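To make this concrete, here is a purely hypothetical sketch of what a unified state/timer primitive could look like; none of these interfaces exist in Flink, all names are invented for illustration:

// Hypothetical only: one user-facing function with keyed state and timers.
public interface StatefulTimelyFunction<I, O> {

    void processElement(I value, Context<O> ctx) throws Exception;

    // Called when a timer registered via the context fires.
    void onTimer(long timestamp, Context<O> ctx) throws Exception;

    interface Context<O> {
        <T> State<T> state(String name, Class<T> type); // scoped to the current key
        void registerTimer(long timestamp);             // fires onTimer() later
        void emit(O record);
    }

    interface State<T> {
        T get();
        void update(T value);
    }
}

Windows could then be expressed on top of the same state and timer primitives instead of being a separate mechanism.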
Robert Metzger created FLINK-3159:
----------------------------------
Summary: YARN AM should shut itself down after the job for which
the AM was started has finished
Key: FLINK-3159
URL: https://issues.apache.org/jira/browse/FLINK-3159
We would need to have a stable interface between the connectors and Flink and
have very good checks that ensure that we don’t inadvertently break things.
> On 10 Dec 2015, at 15:45, Fabian Hueske wrote:
>
> Sounds like a good idea to me.
>
> +1
>
> Fabian
>
> 2015-12-10 15:31 GMT+01:00 Maximilian Michels :
I like this a lot. It has multiple advantages:
- Obviously more frequent connector updates without being forced to go to
a snapshot version
- Reduced complexity and build time of the core Flink repository
We should make sure that, for example, 0.10.x connectors always work with
0.10.x Flink core.
Hi Ali!
Seems like the Google Doc has restricted access, it tells me I have no
permission to view it...
Stephan
On Wed, Dec 9, 2015 at 8:49 PM, Kashmar, Ali wrote:
> Hi Stephan,
>
> Here’s a link to the screenshot I tried to attach earlier:
>
> https://drive.google.com/open?id=0B0_jTR8-IvUcMEd
Sounds like a good idea to me.
+1
Fabian
2015-12-10 15:31 GMT+01:00 Maximilian Michels :
> Hi squirrels,
>
> By this time, we have numerous connectors which let you insert data
> into Flink or output data from Flink.
>
> On the streaming side we have
>
> - RollingSink
> - Flume
> - Kafka
> - Nifi
Hi squirrels,
By this time, we have numerous connectors which let you insert data
into Flink or output data from Flink.
On the streaming side we have
- RollingSink
- Flume
- Kafka
- Nifi
- RabbitMQ
- Twitter
On the batch side we have
- Avro
- Hadoop compatibility
- HBase
- HCatalog
- JDBC
Thanks for all your feedback! I updated the PR.
I would like to publish the post today. Please let me know if you have
any more comments on the draft.
-Matthias
On 12/09/2015 08:12 PM, Vasiliki Kalavri wrote:
> Thanks Matthias! This is a very nice blog post and reads easily.
I further looked into the issue. I have the strong feeling it's a bug in
Maven... or at least a change in its behavior.
The build with Maven 3.2.5 is correct; with 3.3.9, the first build is
incorrect and only a second build of "flink-dist" alone produces a correct
result.
This is the issue: https://issues.apache.org/jira/browse/FL
Robert Metzger created FLINK-3158:
----------------------------------
Summary: Shading does not remove google guava from flink-dist fat
jar
Key: FLINK-3158
URL: https://issues.apache.org/jira/browse/FLINK-3158
Project: Flink