vinoyang created FLINK-9989:
---
Summary: The documentation of API Migration Guides is out of date
Key: FLINK-9989
URL: https://issues.apache.org/jira/browse/FLINK-9989
Project: Flink
Issue Type: Improvement
As an update to this thread, Stephan opted to split the internal/external
configuration (by providing overrides for a common SSL configuration):
https://github.com/apache/flink/pull/6326
Note that Akka doesn't support hostname verification in its 'classic'
remoting implementation (though the new
Pavlo Petrychenko created FLINK-9988:
Summary: job manager does not respect property
jobmanager.web.address
Key: FLINK-9988
URL: https://issues.apache.org/jira/browse/FLINK-9988
Project: Flink
Chesnay Schepler created FLINK-9987:
---
Summary: Rework ClassLoaderITCase to not rely on
.version.properties file
Key: FLINK-9987
URL: https://issues.apache.org/jira/browse/FLINK-9987
Project: Flink
Chesnay Schepler created FLINK-9986:
---
Summary: Remove remote from .version.properties file
Key: FLINK-9986
URL: https://issues.apache.org/jira/browse/FLINK-9986
Project: Flink
Issue Type: I
Thanks Timo,
the custom function worked for me with no further exceptions.
Thanks.
---
*Amol Suryawanshi*
Java Developer
am...@iprogrammer.com
*iProgrammer Solutions Pvt. Ltd.*
*Office 103, 104, 1st Floor Pride Portal,Shivaji Housing Society,
Bahira
zhangminglei created FLINK-9985:
---
Summary: Incorrect parameter order in document
Key: FLINK-9985
URL: https://issues.apache.org/jira/browse/FLINK-9985
Project: Flink
Issue Type: Bug
C
Timo Walther created FLINK-9984:
---
Summary: Add a byte array table format factory
Key: FLINK-9984
URL: https://issues.apache.org/jira/browse/FLINK-9984
Project: Flink
Issue Type: Sub-task
I tried to reproduce your error but everything worked fine. Which Flink
version are you using?
Inner joins are a Flink 1.5 feature.
Am 27.07.18 um 13:28 schrieb Amol S - iProgrammer:
Table master = table1.filter("ns === 'Master'").select("o as master,
'accessBasicDBObject(applicationId,o)' as
Aljoscha Krettek created FLINK-9983:
---
Summary: Savepoints should count as checkpoints when recovering
Key: FLINK-9983
URL: https://issues.apache.org/jira/browse/FLINK-9983
Project: Flink
Is
Hello Timo,
I have implemented my own scalar function as below:
public class AccessBasicDBObject extends ScalarFunction {
    public String eval(String key, BasicDBObject basicDBObject) {
        if (basicDBObject.getString(key) != null)
            return basicDBObject.getString(key);
        else
            return "";
    }
}
zhangminglei created FLINK-9982:
---
Summary: NPE in EnumValueSerializer#copy
Key: FLINK-9982
URL: https://issues.apache.org/jira/browse/FLINK-9982
Project: Flink
Issue Type: Bug
Repor
Hi,
I think the exception is self-explaining. BasicDBObject is not
recognized as a POJO by Flink. A POJO is required such that the Table
API knows the types of fields for following operations.
The easiest way is to implement your own scalar function, e.g. an
`accessBasicDBObject(obj, key)`.
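A minimal sketch of the scalar function suggested above. The class name, the Map-based signature, and the registration call are illustrative assumptions, not code from this thread; `BasicDBObject` implements `java.util.Map`, so declaring the parameter as a `Map` keeps the sketch independent of the MongoDB driver:

```java
import java.util.Map;

import org.apache.flink.table.functions.ScalarFunction;

// Hedged sketch: a scalar function that reads one field out of the raw
// document object, so the Table API never needs to treat it as a POJO.
public class AccessBasicDBObject extends ScalarFunction {
    public String eval(Map<String, Object> obj, String key) {
        // Return the field's string form, or null when the key is absent.
        Object value = obj.get(key);
        return value == null ? null : value.toString();
    }
}
```

It would be registered once on the table environment, e.g. `tableEnv.registerFunction("accessBasicDBObject", new AccessBasicDBObject());`, and then used inside `select`/`filter` expression strings.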
Hello Timo,
Thanks for the quick reply. With your suggestion the previous exception is
gone, but it is giving me the following exception:
Expression 'o.get(_id)' failed on input check: Cannot access field of
non-composite type 'GenericType'.
Stefan Richter created FLINK-9981:
-
Summary: Tune performance of RocksDB implementation
Key: FLINK-9981
URL: https://issues.apache.org/jira/browse/FLINK-9981
Project: Flink
Issue Type: Sub-task
David Anderson created FLINK-9980:
-
Summary: wiki-edits quickstart example fails when run outside of
IDE
Key: FLINK-9980
URL: https://issues.apache.org/jira/browse/FLINK-9980
Project: Flink
Timo Walther created FLINK-9979:
---
Summary: Support a custom FlinkKafkaPartitioner for a Kafka table
sink factory
Key: FLINK-9979
URL: https://issues.apache.org/jira/browse/FLINK-9979
Project: Flink
Hi Amol,
the dot operation is reserved for calling functions on fields. If you
want to get a nested field in the Table API, use the
`.get("applicationId")` operation. See also [1] under "Value access
functions".
Regards,
Timo
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.5/
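In the Java string-based expression syntax, that access would look roughly like the following; the table and field names are taken from the thread, but the snippet itself is illustrative:

```java
// 'o' must be a composite field (e.g. a POJO or Row); get(...) then reads
// the nested "applicationId" field, per the value access functions docs.
Table result = table1.select("o.get('applicationId') as applicationId");
```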
-- Forwarded message --
From: Apache Security Team
Date: Thu, Jul 26, 2018 at 6:04 PM
Subject: Fwd: [apache/flink-web] One of your dependencies may have a
security vulnerability
To: priv...@flink.apache.org
Hi Flink PMC,
we are still receiving this notification from github.
Regards,
Hello Fabian,
I am streaming my MongoDB oplog using Flink and want to use the Flink Table
API to join multiple tables. My code looks like:
DataStream streamSource = env
        .addSource(kafkaConsumer)
        .setParallelism(4);
StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);
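For the join itself, a rough Flink 1.5 Table API sketch might look like the following. The table names, field names, and the second stream are illustrative assumptions, not code from this thread:

```java
// Assumed setup: two oplog-derived streams (masterStream, detailStream)
// already exist; field names are chosen for illustration only.
tableEnv.registerDataStream("MasterOps", masterStream, "masterId, masterDoc");
tableEnv.registerDataStream("DetailOps", detailStream, "detailId, detailDoc");

// Inner join (a Flink 1.5 feature, per Timo's note) on the shared key.
Table joined = tableEnv.scan("MasterOps")
        .join(tableEnv.scan("DetailOps"))
        .where("masterId === detailId")
        .select("masterId, masterDoc, detailDoc");
```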