wouter ligtenberg created FLINK-5152:
Summary: accepting NullValue for EV in Gelly examples
Key: FLINK-5152
URL: https://issues.apache.org/jira/browse/FLINK-5152
Project: Flink
Issue Type
Robert Metzger created FLINK-5153:
Summary: Allow setting custom application tags for Flink on YARN
Key: FLINK-5153
URL: https://issues.apache.org/jira/browse/FLINK-5153
Project: Flink
Issue
Aljoscha Krettek created FLINK-5154:
Summary: Duplicate TypeSerializer when writing RocksDB Snapshot
Key: FLINK-5154
URL: https://issues.apache.org/jira/browse/FLINK-5154
Project: Flink
Is
Aljoscha Krettek created FLINK-5155:
Summary: Deprecate ValueStateDescriptor constructors with default
value
Key: FLINK-5155
URL: https://issues.apache.org/jira/browse/FLINK-5155
Project: Flink
Márton Balassi created FLINK-5156:
Summary: Consolidate streaming FieldAccessor functionality
Key: FLINK-5156
URL: https://issues.apache.org/jira/browse/FLINK-5156
Project: Flink
Issue Type:
Just a friendly reminder that PR 2094 resolved some of the issues mentioned
here, but some work remains to fully consolidate the semantics.
I will merge the PR as soon as Travis comes back green.
The reminder is documented in [1].
[1] https://issues.apache.org/jira/browse/FLINK-5156
On Thu, Nov 3
Ventura Del Monte created FLINK-5157:
Summary: Extending AllWindow Function Metadata
Key: FLINK-5157
URL: https://issues.apache.org/jira/browse/FLINK-5157
Project: Flink
Issue Type: New F
In the initial discussion / proposal for interface annotations we decided
to annotate only the very core APIs with @Public and give the libraries
more freedom to evolve over time.
I think we should not have a general rule for the libraries and decide this
case-by-case. If you feel that Gelly is mat
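The @Public / @PublicEvolving stability markers mentioned above can be sketched as plain Java annotations. This is a standalone, illustrative version, not Flink's actual org.apache.flink.annotation package; the class names (CoreApi, LibraryApi) and the reflection check in main are assumptions made up for the example.

```java
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

public class StabilityAnnotations {

    // Marks types whose interface is frozen across minor releases.
    @Documented
    @Target(ElementType.TYPE)
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Public {}

    // Marks types that are user-facing but still free to evolve.
    @Documented
    @Target(ElementType.TYPE)
    @Retention(RetentionPolicy.RUNTIME)
    public @interface PublicEvolving {}

    // A core API class, annotated with the strong guarantee.
    @Public
    public static class CoreApi {}

    // A library class (e.g. from Gelly) left room to evolve.
    @PublicEvolving
    public static class LibraryApi {}

    public static void main(String[] args) {
        // RUNTIME retention lets a compatibility checker inspect the markers.
        System.out.println(CoreApi.class.isAnnotationPresent(Public.class));
        System.out.println(LibraryApi.class.isAnnotationPresent(PublicEvolving.class));
    }
}
```

Because the retention policy is RUNTIME, an external compatibility tool can scan a jar and enforce different evolution rules per marker, which is the case-by-case flexibility discussed above.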
Hello,
In Scala, case classes can store a huge number of fields, which is really
helpful for reading wide CSV files, but this is currently used only in the
Table API.
What about this issue (https://issues.apache.org/jira/browse/FLINK-2186) --
should we use the Table API in the machine learning library?
To solve the issue #readC
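The pattern being discussed, mapping each column of a wide CSV row onto a named, typed field, can be sketched outside Flink. A Java record plays the role of the Scala case class here; the SensorRow type, its fields, and the parse helper are invented for illustration and are not part of any Flink API.

```java
public class WideCsvDemo {

    // Rough Java analog of a Scala case class: one named, typed field
    // per CSV column (a real "wide" row would have many more).
    record SensorRow(String id, double temp, double humidity, long timestamp) {

        // Parse one comma-separated line into a typed row.
        static SensorRow parse(String line) {
            String[] f = line.split(",");
            return new SensorRow(
                    f[0],
                    Double.parseDouble(f[1]),
                    Double.parseDouble(f[2]),
                    Long.parseLong(f[3]));
        }
    }

    public static void main(String[] args) {
        SensorRow row = SensorRow.parse("s1,21.5,0.44,1700000000");
        // Fields are accessed by name, not by positional tuple index.
        System.out.println(row.id() + " " + row.temp());
    }
}
```

The benefit over tuples is exactly what the mail points out: the field count is not capped at a small fixed arity, and each column keeps a descriptive name and type.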
Till Rohrmann created FLINK-5158:
Summary: Handle ZooKeeperCompletedCheckpointStore exceptions in
CheckpointCoordinator
Key: FLINK-5158
URL: https://issues.apache.org/jira/browse/FLINK-5158
Project: Flink
Hi Naveen,
The new Kerberos authentication code in Flink assumes that we're
running against vanilla Hadoop. Unmodified Hadoop's behavior is to
skip the secure login if security is not configured. This is different
for the MapR Hadoop version.
Thus, we need to make sure we don't perform any logi