Hi Robert,
Thanks for the detailed response. I worked on HDFS encryption as well as
the crypto file system in HDFS, so I am aware of how it is done in Hadoop. Let
me sync up with Max to get started on it.
I will also start looking into the current implementations.
Niraj
From: Robert Metzger
If we were to drop CDH4 / Hadoop 2.0.0-alpha, would this mean we do
not even need to shade the Hadoop fat jars, or do we still need to
support 1.x?
- Henry
On Thu, Feb 26, 2015 at 8:57 AM, Robert Metzger wrote:
> Hi,
>
> I'm currently working on https://issues.apache.org/jira/browse/FLINK-1605
> a
Hi Niraj,
Welcome to the Flink community ;)
I'm really excited that you want to contribute to our project, and since
you've asked for something in the security area, I actually have something
very concrete in mind.
We recently added some support for accessing (Kerberos) secured HDFS
clusters in Flink
We haven’t yet implemented any of these machine learning models directly on the
Flink API, but we have run them through the existing Samoa tasks, using Flink
Streaming as a backend. Apart from that, we have a student looking into machine
learning pipelines on Flink Streaming with a focus on iterativ
Hi Niraj,
Thanks for your interest in Apache Flink. The quickest way is to just give
Flink a spin and figure out how it works.
This would get you started on how it works before actually doing work on Flink =)
Please do visit the Flink how-to-contribute page [1] and subscribe to the dev
mailing list [2] to star
Hi Flink Dev,
I am looking to contribute to Flink, especially in the area of security. In the
past, I have contributed to Pig, Hive and HDFS. I would really appreciate it if
I could get some work assigned to me. Looking forward to hearing back from the
development community of Flink.
Thanks
Niraj
Hi,
Can you tell me where I can find the TaskManager logs? I can’t find them in
the logs folder. I don’t suppose I should run taskmanager.sh as well, right?
I’m using OS X Yosemite. I’ll send you my ifconfig output.
lo0: flags=8049 mtu 16384
options=3
inet6 ::1 prefixlen 128
Hi Martin,
welcome back :-)
I'll try to merge the documentation PR tonight.
Gelly is in the flink-staging package and most of the Gelly methods are in
the Graph class and have javadocs that describe their functionality.
Regarding your specific tasks, you can easily get the degree distribution
us
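The degree distribution Martin asks about is simple to compute even outside Gelly. As an illustration only (this is a standalone plain-Java sketch, not Gelly API code), one way to build a degree histogram from an edge list:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class DegreeDistribution {

    // Count each vertex's degree from an undirected edge list,
    // then aggregate the degrees into a (degree -> vertex count) histogram.
    static Map<Long, Long> degreeDistribution(long[][] edges) {
        Map<Long, Long> degree = new HashMap<>();
        for (long[] e : edges) {
            degree.merge(e[0], 1L, Long::sum);
            degree.merge(e[1], 1L, Long::sum);
        }
        Map<Long, Long> histogram = new TreeMap<>();
        for (long d : degree.values()) {
            histogram.merge(d, 1L, Long::sum);
        }
        return histogram;
    }

    public static void main(String[] args) {
        // Tiny example graph: a triangle 1-2-3 plus a pendant vertex 4.
        long[][] edges = { {1, 2}, {1, 3}, {2, 3}, {3, 4} };
        System.out.println(degreeDistribution(edges)); // {1=1, 2=2, 3=1}
    }
}
```

In Gelly the same per-vertex degree counting is done distributed over a DataSet rather than in a single HashMap, but the aggregation idea is the same.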
I have no very strong love for the Hadoop 2.0.0-alpha version, and it seems
that most users go through YARN anyway.
Just to understand: the solution would be to not shade protobuf in the fat
Hadoop jar at all? Is that not a problem in other situations, like for users
with an earlier protobuf version
Hi,
I'm currently working on https://issues.apache.org/jira/browse/FLINK-1605
and it's a hell of a mess.
I got almost everything working, except for the hadoop 2.0.0-alpha profile.
The profile exists because google protobuf has a different version in that
Hadoop release.
Since maven is setting the
Hi Dulaj!
Thanks for helping to debug.
My guess is that you are seeing now the same thing between JobManager and
TaskManager as you saw before between JobManager and JobClient. I have a
patch pending that should help the issue (see
https://issues.apache.org/jira/browse/FLINK-1608), let's see if t
Hello,
Also, for guidelines on how to implement a graph algorithm in Gelly, you
can
use the provided examples:
https://github.com/apache/flink/tree/master/flink-staging/flink-gelly/src/main/java/org/apache/flink/graph/example
Have fun!
Andra
On Thu, Feb 26, 2015 at 5:31 PM, Fabian Hueske wrote:
Hej,
I was busy with other stuff for a while but I hope I will have more time to
work on Flink and Graphs again now.
I need to do some basic analytics on a large graph set (stuff like degree
distribution, triangle count, component size distribution, etc.).
Is there anything implemented in Gelly al
Hi Martin,
as a start, there is a PR with Gelly documentation:
https://github.com/vasia/flink/blob/gelly-guide/docs/gelly_guide.md
Cheers, Fabian
2015-02-26 17:12 GMT+01:00 Martin Neumann :
> Hej,
>
> I was busy with other stuff for a while but I hope I will have more time to
> work on Flink an
Hej,
I was busy with other stuff for a while but I hope I will have more time to
work on Flink and Graphs again now.
I need to do some basic analytics on a large graph set (stuff like degree
distribution, triangle count, component size distribution, etc.).
Is there anything implemented in Gelly al
Hi,
It’s great to help out. :)
Setting 127.0.0.1 instead of “localhost” in jobmanager.rpc.address
helped establish the connection to the JobManager. Apparently localhost
resolution differs between the webclient and the JobManager. I think it’s good to
set "jobmanager.rpc.address: 127
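For reference, the workaround described above as a config fragment (the key name comes from the thread; whether a hard-coded loopback address is appropriate outside a local setup is a separate question):

```yaml
# flink-conf.yaml: bind the JobManager RPC address explicitly
# instead of relying on how "localhost" resolves on this machine.
jobmanager.rpc.address: 127.0.0.1
```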
On 25 Feb 2015, at 16:35, Till Rohrmann wrote:
> The reason for this behaviour is the following:
>
> The log4j-test.properties file is not the standard log4j properties file. It
> is only used if it is explicitly passed to the executing JVM via
> -Dlog4j.configuration. The parent pom defines for the sur
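Till's point is that log4j only picks up a non-default config file when the JVM is told about it. A sketch of how a surefire section could pass the property to forked test JVMs (the plugin and `argLine` element are standard surefire; the actual contents of the Flink parent pom are an assumption here):

```xml
<!-- Sketch only: pass the test log4j config to surefire's forked JVMs. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Dlog4j.configuration=log4j-test.properties</argLine>
  </configuration>
</plugin>
```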
Alexander Alexandrov created FLINK-1613:
---
Summary: Cannot submit to remote ExecutionEnvironment from IDE
Key: FLINK-1613
URL: https://issues.apache.org/jira/browse/FLINK-1613
Project: Flink
Thanks, changing it.
On Thu, Feb 26, 2015 at 10:45 AM, Till Rohrmann
wrote:
> If the streaming-examples module uses the <classifier> tag to add the
> test-core dependency, then we should change it into the <type> tag as
> recommended by Maven [1]. Otherwise build failures might occur if the
> install lifecycle is
If the streaming-examples module uses the <classifier> tag to add the
test-core dependency, then we should change it into the <type> tag as
recommended by Maven [1]. Otherwise build failures might occur if the
install lifecycle is not executed.
The dependency import should look like:
<groupId>org.apache.flink</groupId>
<artifactId>flink-st
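The fragment above is cut off in the archive; based on Maven's attached-test-JAR guide, the recommended `test-jar` form would look roughly like this (the flink-streaming-core artifact is an assumption drawn from the rest of the thread):

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-core</artifactId>
  <version>${project.version}</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
```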
To update the local repository, you have to execute the "install" goal.
I recommend always doing a "mvn clean install".
On Thu, Feb 26, 2015 at 10:11 AM, Matthias J. Sax <
mj...@informatik.hu-berlin.de> wrote:
> Thanks for clarifying Marton!
>
> I was on the latest build already. However,
Thanks for clarifying Marton!
I was on the latest build already. However, my local maven repository
contained old jars. After removing all flink-jars from my local maven
repository it works!
Why does maven not automatically update the local repository?
-Matthias
On 02/26/2015 09:20 AM, Márton
Dear Matthias,
Thanks for reporting the issue. I have successfully built
flink-streaming-examples with Maven. You can depend on test classes; the
following in the pom does the trick:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-core</artifactId>
    <version>${project.version}</version>
    <scope>test</scope>
    <classifier>tests</classifier>
</dependency>
This tells maven that the test cla
Hi,
I just built "flink-streaming" and avoided the problem. I guess that the
issue is related to the module structure and dependencies.
"flink-streaming-examples" uses
org.apache.flink.streaming.util.StreamingProgramTestBase (which is
defined in "flink-streaming-core/src/test").
From my understanding