This is great news, Kostas and Sri.
Looks like the visit was super useful =)
I would love to jump in and help work on this!
- Henry
On Sat, Jan 24, 2015 at 1:07 PM, Sri Ambati wrote:
> Kostas,
> Thank you for your generosity.
>
> We are honored to be a part of Apache Flink's community & its amazin
Hi,
"mvn clean verify" fails for me on Ubuntu with deleted .m2 repository.
I'm getting the following:
Results :
Failed tests:
YARNSessionFIFOITCase.setup:56->YarnTestBase.startYARNWithConfig:249 null
YARNSessionCapacitySchedulerITCase.setup:42->YarnTestBase.startYARNWithConfig:249
null
Tests
Fabian Hueske created FLINK-1445:
Summary: Add support to enforce local input split assignment
Key: FLINK-1445
URL: https://issues.apache.org/jira/browse/FLINK-1445
Project: Flink
Issue Type:
Fabian Hueske created FLINK-1444:
Summary: Add data properties for data sources
Key: FLINK-1444
URL: https://issues.apache.org/jira/browse/FLINK-1444
Project: Flink
Issue Type: New Feature
Fabian Hueske created FLINK-1443:
Summary: Add replicated data source
Key: FLINK-1443
URL: https://issues.apache.org/jira/browse/FLINK-1443
Project: Flink
Issue Type: New Feature
Co
The build also fails after the .m2 repository was deleted.
Does anybody else have this problem?
2015-01-24 21:31 GMT+01:00 Stephan Ewen :
> Is this reproducible on a machine when you delete the .m2/repository
> directory (local maven cache) ?
>
> (I currently cannot try that because I am behind
As the community of Flink add-ons grows, a CPAN- or Maven-like mechanism
might be a nice option. That would let people download and install
extensions very fluidly.
The argument for making Apache contributions is definitely valid, but the
argument for the agility of fostering independent projects
Hey guys,
I opened a pull request which adds support to the webclient for executing
and visualizing streaming programs.
I had to make modifications to the clients and the way plans are handled,
so someone should definitely review it :)
https://github.com/apache/flink/pull/334
Cheers,
Gyula
I agree with ForwardFields as well.
I vaguely remember that Joe Harjung (when working on the first Scala API
version) called it the CopySet. I would assume that ForwardFields is more
intuitive to most people.
I only mention this because Joe was one of the few native English
speakers on the team
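For anyone skimming the thread, here is a minimal sketch of how such an
annotation sits on a user function in the Java API. It assumes the
ForwardedFields spelling (the very name this thread is about) and a made-up
ScaleValue function, so treat it as illustration only:

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.functions.FunctionAnnotation.ForwardedFields;
import org.apache.flink.api.java.tuple.Tuple2;

// Declares that tuple field 0 is copied unmodified from input to output,
// so the optimizer can keep partitioning/sorting properties on that field.
@ForwardedFields("f0")
public class ScaleValue implements MapFunction<Tuple2<Long, Double>, Tuple2<Long, Double>> {

    @Override
    public Tuple2<Long, Double> map(Tuple2<Long, Double> value) {
        // f0 (the key) is forwarded as-is; only f1 is modified.
        return new Tuple2<>(value.f0, value.f1 * 2.0);
    }
}

Whatever name we settle on, the usage stays the same: the hint lives right
next to the function it describes.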
I am also more in favor of option 1).
2015-01-24 20:27 GMT+01:00 Kostas Tzoumas :
> Thanks Fabian for starting the discussion.
>
> I would be biased towards option (1) that Stephan highlighted for the
> following reasons:
>
> - A separate github project is one more infrastructure to manage, and i
Kostas,
Thank you for your generosity.
We are honored to be a part of Apache Flink's community & its amazing journey
from Berlin and beyond!
H2O is excited to bring best-in-class machine learning to application
developers worldwide.
Looking forward,
Sri
> On Jan 24, 2015, at 11:39 AM, Kost
Stephan Ewen created FLINK-1442:
Summary: Archived Execution Graph consumes too much memory
Key: FLINK-1442
URL: https://issues.apache.org/jira/browse/FLINK-1442
Project: Flink
Issue Type: Bug
Is this reproducible on a machine when you delete the .m2/repository
directory (local maven cache) ?
(I currently cannot try that because I am behind a rather low-bandwidth
connection and it would take very long to re-download all dependency artifacts)
On Sat, Jan 24, 2015 at 5:54 AM, Fabian Hueske
Not 100%
My guess is that it comes from the scala tests in flink-tests for POJOs
containing joda time classes (to test the custom serializers)
Stephan
On Sat, Jan 24, 2015 at 12:16 PM, Aljoscha Krettek
wrote:
> Yes, I will look into it.
>
> Are you sure this happens in the Scala code?
>
> On
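For context on the custom serializers mentioned above: in a user program,
registering a dedicated Kryo serializer for a Joda type looks roughly like
the sketch below. The Event POJO is made up and the snippet assumes the
de.javakaffee kryo-serializers artifact is on the classpath; the actual
setup in flink-tests may differ.

import org.apache.flink.api.java.ExecutionEnvironment;
import org.joda.time.DateTime;

import de.javakaffee.kryoserializers.jodatime.JodaDateTimeSerializer;

public class JodaSerializerExample {

    // Simple POJO with a Joda-Time field; Flink's own serializers cannot
    // handle DateTime, so that field falls back to Kryo.
    public static class Event {
        public long id;
        public DateTime timestamp;

        public Event() {}

        public Event(long id, DateTime timestamp) {
            this.id = id;
            this.timestamp = timestamp;
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Use a dedicated serializer for DateTime instead of Kryo's generic
        // reflection-based fallback.
        env.getConfig().registerTypeWithKryoSerializer(
                DateTime.class, JodaDateTimeSerializer.class);

        // Tiny pipeline that exercises the POJO serialization path.
        // (On older Flink versions print() is a sink and needs env.execute().)
        env.fromElements(new Event(1L, DateTime.now()), new Event(2L, DateTime.now()))
           .print();
    }
}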
Yes, I will look into it.
Are you sure this happens in the Scala code?
On Sat, Jan 24, 2015 at 8:57 PM, Stephan Ewen wrote:
> Hi!
>
> When running a recent build, I am seeing the following error message in the
> "flink-tests" project.
>
> [WARNING] warning: Class org.joda.convert.ToString not fo
Hi!
When running a recent build, I am seeing the following error message in the
"flink-tests" project.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing
with a stub.
@aljoscha This is probably a message generated by the Scala type analyzer.
Can you elaborate what this mea
Hi everyone,
I had a chat with some folks behind the H2O project (http://h2o.ai), and
they would be interested in having H2O run on top/inside of Flink. H2O is a
very performant system focused on Machine Learning.
A similar integration has been implemented for H2O on Spark (called
sparkling water
Thanks Fabian for starting the discussion.
I would be biased towards option (1) that Stephan highlighted for the
following reasons:
- A separate github project is one more infrastructure to manage, and it
lives outside the ASF. I would like to bring as much code as possible to
the Apache Software
Yes, a "flink-contrib" project would be great.
We have two options:
1) Make it part of the Apache Flink project.
- PRO: this makes it easy for users to get the code
- CONTRA: this means stronger requirements on the code, a blocker for code
that uses dependencies under certain licenses, etc.
2) Make
I just tried to build ("mvn clean install") on a fresh Ubuntu VM. Fails
with the same exception as natively on MacOS.
Something strange is going on...
2015-01-24 11:19 GMT+01:00 Fabian Hueske :
> Thanks Robert! Sounds indeed like an environment problem.
> Will run the tests again and send you the
I think a top-level Maven module called "flink-contrib" is reasonable. There are
other projects with a contrib package, such as Akka and Django.
Regards, Chiwan Park (Sent with iPhone)
On Jan 24, 2015, at 7:15 PM, Fabian Hueske wrote:
> Hi all,
>
> we got a few contribution requests lately to add cool but "no
Rahul Mahindrakar created FLINK-1441:
Summary: Documentation SVN checkout link is wrong
Key: FLINK-1441
URL: https://issues.apache.org/jira/browse/FLINK-1441
Project: Flink
Issue Type: Bu
Thanks Robert! Sounds indeed like an environment problem.
Will run the tests again and send you the output.
2015-01-24 11:11 GMT+01:00 Robert Metzger :
> Okay, the tests have finished on my local machine, and they passed. So it
> looks like an environment specific issue.
> Maybe the log helps me
Hi all,
we got a few contribution requests lately to add cool but "non-core"
features to our API.
In previous discussions, concerns were raised about bloating the APIs with
too many "shortcut", "syntactic sugar", or special-case features.
Instead, we could set up a place to add Input/OutputFormats, c
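To make that concrete, a self-contained add-on could be as small as the
(hypothetical) InputFormat sketched below, which emits the integers
0..count-1 by extending GenericInputFormat; a format for an external system
would follow the same skeleton.

import java.io.IOException;

import org.apache.flink.api.common.io.GenericInputFormat;

// Hypothetical contrib-style format: emits the integers 0..count-1.
// GenericInputFormat takes care of splits and configuration; the add-on
// only implements reachedEnd() and nextRecord(). For simplicity, every
// parallel instance emits the full range.
public class RangeInputFormat extends GenericInputFormat<Integer> {

    private final int count;
    private int next;

    public RangeInputFormat(int count) {
        this.count = count;
    }

    @Override
    public boolean reachedEnd() throws IOException {
        return next >= count;
    }

    @Override
    public Integer nextRecord(Integer reuse) throws IOException {
        return next++;
    }
}

A user would pick it up with env.createInput(new RangeInputFormat(100)),
without the core API having to know about it.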
Okay, the tests have finished on my local machine, and they passed. So it
looks like an environment specific issue.
Maybe the log already helps me to figure out what the issue is.
We should make sure that our tests are passing on all platforms ;)
On Sat, Jan 24, 2015 at 11:06 AM, Robert Metzger
wro
Hi Mustafa,
that would be a nice contribution!
We are currently discussing how to add "non-core" API features into Flink
[1].
I will move this discussion onto the mailing list to decide where to add
cool add-ons like yours.
Cheers, Fabian
[1] https://issues.apache.org/jira/browse/FLINK-1398
20
Hi,
the tests are passing on Travis. Maybe it's an issue with your environment.
I'm currently running the tests on my machine as well, just to make sure.
I haven't run the tests on OS X, maybe that's causing the issues.
Can you send me (privately) the full output of the tests?
Best,
Robert
On S
Hi Henry,
running "mvn -DskipTests clean install" before "mvn clean install" did not
fix the build for me.
The failing tests are also integration tests (*ITCase), which are only
executed in Maven's verify phase and therefore not triggered if you run "mvn
clean test".
If I run "mvn test" without "mvn ins