This is the JIRA: https://issues.apache.org/jira/browse/FLINK-3565
It has been resolved by Max.
I'll merge the fix and create a new RC now
On Wed, Mar 2, 2016 at 12:15 PM, Aljoscha Krettek wrote:
> Hi,
> I saw this one when trying my job that was built against Scala 2.11:
>
> java.lang.NoClassD
I opened this JIRA; if anyone has good examples, please add them in the
comments:
https://issues.apache.org/jira/browse/FLINK-3566
Gyula
Gyula Fóra wrote (on Wed, 2 Mar 2016, 15:54):
> Okay, I will open a JIRA issue
>
> Gyula
>
> Timo Walther wrote (on Wed, 2 Mar 2016,
Okay, I will open a JIRA issue
Gyula
Timo Walther wrote (on Wed, 2 Mar 2016, 15:42):
> Can you open an issue with an example of your custom TypeInfo? I will
> then open a suitable PR for it.
>
>
> On 02.03.2016 15:33, Gyula Fóra wrote:
> > Would that work with generic classes?
> >
Gyula Fora created FLINK-3566:
-
Summary: Input type validation often fails on custom TypeInfo
implementations
Key: FLINK-3566
URL: https://issues.apache.org/jira/browse/FLINK-3566
Project: Flink
Can you open an issue with an example of your custom TypeInfo? I will
then open a suitable PR for it.
On 02.03.2016 15:33, Gyula Fóra wrote:
Would that work with generic classes?
Timo Walther wrote (on Wed, 2 Mar 2016, 15:22):
After thinking about it, I think an even better so
Would that work with generic classes?
Timo Walther wrote (on Wed, 2 Mar 2016, 15:22):
> After thinking about it, I think an even better solution is to provide
> an interface for the TypeExtractor where the user can register mappings
> from class to TypeInformation.
> So that the Typ
After thinking about it, I think an even better solution is to provide
an interface for the TypeExtractor where the user can register mappings
from class to TypeInformation.
So that the TypeExtractor is more extensible. This would also solve your
problem. What do you think?
On 02.03.2016 15:00
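For illustration, the kind of interface being proposed could look roughly like
this; the name TypeInfoRegistry and both method signatures are hypothetical,
nothing like this exists in the TypeExtractor at this point:

import org.apache.flink.api.common.typeinfo.TypeInformation;

// Hypothetical sketch only: the interface name and method signatures are
// invented to illustrate the idea of registering Class -> TypeInformation
// mappings that the TypeExtractor could consult before its own analysis.
public interface TypeInfoRegistry {

    // Register a custom TypeInformation for a user class.
    <T> void registerTypeInfo(Class<T> clazz, TypeInformation<T> typeInfo);

    // Look up a previously registered TypeInformation, or null if none exists.
    <T> TypeInformation<T> getTypeInfo(Class<T> clazz);
}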
Hi!
Yes, I think that sounds good :) We just need to make sure that this works
with things like the TupleTypeInfo, which is built in but where I can still mix
in new types for the fields.
Thanks,
Gyula
Timo Walther wrote (on Wed, 2 Mar 2016, 14:02):
> The TypeExtractor's input type vali
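A minimal sketch of the mixing mentioned above, assuming a hypothetical user
class MyType with a custom MyCustomTypeInfo: a built-in TupleTypeInfo whose
second field uses the custom TypeInformation.

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;

// Built-in TupleTypeInfo whose second field uses a custom TypeInformation.
// MyType and MyCustomTypeInfo are hypothetical user-defined classes.
TupleTypeInfo<Tuple2<String, MyType>> tupleInfo =
        new TupleTypeInfo<>(BasicTypeInfo.STRING_TYPE_INFO, new MyCustomTypeInfo());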
Aljoscha Krettek created FLINK-3565:
---
Summary: FlinkKafkaConsumer does not work with Scala 2.11
Key: FLINK-3565
URL: https://issues.apache.org/jira/browse/FLINK-3565
Project: Flink
Issue T
The TypeExtractor's input type validation was designed for the built-in
TypeInformation classes.
In your case of a new, unknown TypeInformation, the validation should
simply be skipped, because we can assume that the user knows what they are doing.
I can open a PR for that.
On 02.03.2016 11:34, Al
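Roughly, the behaviour described above amounts to a check like the one below;
this is only an illustration of the idea, not the actual TypeExtractor code,
and the list of built-in classes is deliberately incomplete.

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.PojoTypeInfo;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;

// Illustration only (not the actual TypeExtractor code): validate input types
// only for TypeInformation classes the extractor knows about; for unknown,
// user-defined implementations, skip validation and assume the user knows
// what they are doing.
static boolean shouldValidate(TypeInformation<?> typeInfo) {
    return typeInfo instanceof BasicTypeInfo
            || typeInfo instanceof TupleTypeInfo
            || typeInfo instanceof PojoTypeInfo;
}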
Timo Walther created FLINK-3564:
---
Summary: Implement distinct() for Table API
Key: FLINK-3564
URL: https://issues.apache.org/jira/browse/FLINK-3564
Project: Flink
Issue Type: New Feature
Simone Robutti created FLINK-3563:
-
Summary: .returns() doesn't compile when using .map() with a
custom MapFunction
Key: FLINK-3563
URL: https://issues.apache.org/jira/browse/FLINK-3563
Project: Flink
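For context, the pattern named in the issue title looks roughly like the
snippet below; MyCustomMapFunction is a hypothetical stand-in, and the exact
reproducer is in the JIRA.

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;

// Sketch of the .map(...).returns(...) pattern named in the issue title.
// MyCustomMapFunction is a hypothetical user-defined MapFunction; the actual
// reproducer is attached to FLINK-3563.
ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

DataSet<Tuple2<String, Integer>> result = env
        .fromElements("a", "b", "c")
        .map(new MyCustomMapFunction())
        .returns(new TupleTypeInfo<Tuple2<String, Integer>>(
                BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO));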
Hi,
I saw this one when trying my job that was built against Scala 2.11:
java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at kafka.utils.Pool.<init>(Pool.scala:28)
at
kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala:60)
Do we continue using the Google shared doc for RC3 for the release testing
coordination?
On Wed, Mar 2, 2016 at 11:31 AM, Aljoscha Krettek wrote:
> By the way, these are the commits that were added since rc3, so most of the
> testing that we already did should also be valid for this RC:
>
> $ git
I think you have a point. Another user also just ran into problems with the
TypeExtractor. (The “Java Maps and TypeInformation” email).
So let’s figure out what needs to be changed to make it work for everyone.
Cheers,
Aljoscha
> On 02 Mar 2016, at 11:15, Gyula Fóra wrote:
>
> Hey,
>
> I ha
Maximilian Michels created FLINK-3562:
-
Summary: Update docs in the course of EventTimeSourceFunction
removal
Key: FLINK-3562
URL: https://issues.apache.org/jira/browse/FLINK-3562
Project: Flink
By the way, these are the commits that were added since rc3, so most of the
testing that we already did should also be valid for this RC:
$ git log origin/release-1.0.0-rc3..origin/release-1.0.0-rc4
commit 1b0a8c4e9d5df35a7dea9cdd6d2c6e35489bfefa
Author: Robert Metzger
Date: Tue Mar 1 16:40:31
That is cool Nikolaas :-) Looking forward to the scala-shell for streaming
:-)
On Wed, Mar 2, 2016 at 10:53 AM, Nikolaas s wrote:
> Hi guys,
>
> I've integrated streaming in zeppelin for flink.
> It works using the scala shell, which I extended to support the streaming
> application.
> Unfortuna
Hey,
I have brought up this issue a couple of months back, but I would like to do it
again.
I think the current way of validating the input type of UDFs against the
output type of the preceding operators is too aggressive and breaks a lot of
code that should otherwise work.
This issue appears all the
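To make that concrete, the kind of pipeline in question looks roughly like the
sketch below; MyEvent, MySource, and MyCustomTypeInfo are hypothetical names,
not actual code from the thread.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Hypothetical sketch: a source whose output type is described by a custom
// TypeInformation, followed by a map UDF. The TypeExtractor validates the
// UDF's declared input type against MyCustomTypeInfo, which can fail even
// when the types are compatible in practice.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

DataStream<MyEvent> events = env
        .addSource(new MySource())
        .returns(new MyCustomTypeInfo());

DataStream<String> names = events
        .map(new MapFunction<MyEvent, String>() {
            @Override
            public String map(MyEvent event) {
                return event.name;
            }
        });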
Hi guys,
I've integrated streaming in Zeppelin for Flink.
It works using the scala shell, which I extended to support the streaming
application.
Unfortunately, the scala-shell for streaming is not yet included in the
Flink master, and it changed a bit upon public request since I implemented the
zeppel
Maximilian Michels created FLINK-3561:
-
Summary: ExecutionConfig's timestampsEnabled is unused
Key: FLINK-3561
URL: https://issues.apache.org/jira/browse/FLINK-3561
Project: Flink
Issue T
Great to hear that you two are giving a talk at ApacheCon.
As far as I know, there's nobody working on a streaming interpreter for
Zeppelin. People thought about doing it, but so far it has not been realized.
But I think it should not be too difficult to implement. So if you want to
take the lead there
The release binaries are now located here:
http://home.apache.org/~rmetzger/flink-1.0.0-rc4/
On Wed, Mar 2, 2016 at 10:16 AM, Robert Metzger wrote:
> Yes, there was an email from Infra that they are going to shut down
> people.apache.org on March 1.
> I'll try to move the binaries to the new se
Yes, there was an email from Infra that they are going to shut down
people.apache.org on March 1.
I'll try to move the binaries to the new server ("home.apache.org").
On Wed, Mar 2, 2016 at 9:27 AM, Ufuk Celebi wrote:
> I get a 404 for the binaries. It's an INFRA thing I guess, because my
> perso
I get a 404 for the binaries. It's an INFRA thing I guess, because my
personal Apache user page is also down/gone. :-(
On Tue, Mar 1, 2016 at 10:42 PM, Robert Metzger wrote:
> Dear Flink community,
>
> Please vote on releasing the following candidate as Apache Flink version 1.0.0.
>
> This is t