Chesnay Schepler created FLINK-27081:
Summary: Remove Azure integration
Key: FLINK-27081
URL: https://issues.apache.org/jira/browse/FLINK-27081
Project: Flink
Issue Type: Sub-task
Chesnay Schepler created FLINK-27080:
Summary: Enable builds for pull requests
Key: FLINK-27080
URL: https://issues.apache.org/jira/browse/FLINK-27080
Project: Flink
Issue Type: Sub-task
Chesnay Schepler created FLINK-27079:
Summary: Setup cron jobs
Key: FLINK-27079
URL: https://issues.apache.org/jira/browse/FLINK-27079
Project: Flink
Issue Type: Sub-task
Compon
Chesnay Schepler created FLINK-27077:
Summary: Setup per-commit builds for master
Key: FLINK-27077
URL: https://issues.apache.org/jira/browse/FLINK-27077
Project: Flink
Issue Type: Sub-task
Lijie Wang created FLINK-27078:
Summary: There is a performance gap between the new CSV
source (file system source + CSV format) and the legacy CsvTableSource.
Key: FLINK-27078
URL: https://issues.apache.org/jira/browse/FLINK-27078
Chesnay Schepler created FLINK-27076:
Summary: Setup runners
Key: FLINK-27076
URL: https://issues.apache.org/jira/browse/FLINK-27076
Project: Flink
Issue Type: Sub-task
Componen
Chesnay Schepler created FLINK-27075:
Summary: Migrate CI from Azure to Github Actions
Key: FLINK-27075
URL: https://issues.apache.org/jira/browse/FLINK-27075
Project: Flink
Issue Type: T
Chesnay Schepler created FLINK-27074:
Summary: python_job.py should write to TEST_DATA_DIR
Key: FLINK-27074
URL: https://issues.apache.org/jira/browse/FLINK-27074
Project: Flink
Issue Typ
Yun Gao created FLINK-27073:
Summary: HAQueryableStateFsBackendITCase hangs on azure
Key: FLINK-27073
URL: https://issues.apache.org/jira/browse/FLINK-27073
Project: Flink
Issue Type: Bug
Zhipeng Zhang created FLINK-27072:
Summary: Add Bucketizer in FlinkML
Key: FLINK-27072
URL: https://issues.apache.org/jira/browse/FLINK-27072
Project: Flink
Issue Type: New Feature
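For readers unfamiliar with the term, here is a tiny generic sketch of what a Bucketizer does (plain Python for illustration only, not the FlinkML API this ticket proposes): it maps continuous values to bucket indices defined by split points.

import bisect

# Generic illustration of bucketizing, not the FlinkML API: each continuous
# value is mapped to the index of the bucket formed by the split points.
splits = [0.0, 10.0, 100.0]            # hypothetical split points
values = [-5.0, 3.0, 42.0, 250.0]
buckets = [bisect.bisect_right(splits, v) for v in values]
print(buckets)  # [0, 1, 2, 3]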
haixiaCao created FLINK-27071:
Summary: Support configuring the table name via a SQL parameter for the
JDBC sink function
Key: FLINK-27071
URL: https://issues.apache.org/jira/browse/FLINK-27071
Project: Flink
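For context, a minimal PyFlink sketch of how the target table name is configured for the JDBC connector today, via the static 'table-name' option in the DDL; this ticket asks to make it configurable through a SQL parameter. All connection details below are hypothetical placeholders.

from pyflink.table import EnvironmentSettings, TableEnvironment

# Minimal sketch; every connection value here is a placeholder.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
t_env.execute_sql("""
    CREATE TABLE jdbc_sink (
        id BIGINT,
        name STRING
    ) WITH (
        'connector' = 'jdbc',
        'url' = 'jdbc:mysql://localhost:3306/mydb',
        'table-name' = 'users',
        'username' = 'user',
        'password' = 'secret'
    )
""")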
Caizhi Weng created FLINK-27070:
Summary: Reuse FileFormat instead of DecodingFormat/EncodingFormat
to ensure thread safety.
Key: FLINK-27070
URL: https://issues.apache.org/jira/browse/FLINK-27070
Project: Flink
Actually, I had built/compiled
- pyarrow==2.0.0 (test skipped)
- apache-beam==2.27.0 (test skipped)
on Python 3.9 and tested with an example Python job (bin/flink run
-pyclientexec python3.7 -pyexec python3.9 -py
examples/python/table/word_count.py),
but got the following exceptions:
Caused by: java.util.co
Huang Xingbo created FLINK-27069:
Summary: Fix the potential memory corruption in Thread Mode
Key: FLINK-27069
URL: https://issues.apache.org/jira/browse/FLINK-27069
Project: Flink
Issue Type
Huang Xingbo created FLINK-27068:
Summary: test_keyed_min_and_max and test_keyed_min_by_and_max_by
failed in py36,37
Key: FLINK-27068
URL: https://issues.apache.org/jira/browse/FLINK-27068
Project: Flink
It seems that you have already got this done :)
One more question, do we need to create an official docker image repo for
flink-kubernetes-operator [1]? Then users could pull the image directly via
"docker pull flink-kubernetes-operator".
The drawback is we always need to create a PR[2] and wait for
Dian Fu created FLINK-27067:
Summary: Prevent usage of deprecated APIs in PyFlink examples
Key: FLINK-27067
URL: https://issues.apache.org/jira/browse/FLINK-27067
Project: Flink
Issue Type: Improvement
Hi Timo,
Thanks for your reply!
> It would be great to further investigate which other commands are required
> that would usually be executed via CLI commands. I would like to avoid a
> large amount of FLIPs each adding a special job lifecycle command.
Okay. I listed only the commands about j
Thanks Gyula for the great work. It is a big step for making Flink more
cloud-native.
Best,
Yang
Xintong Song wrote on Wednesday, April 6, 2022 at 10:37:
> Thanks Gyula for driving this, and everyone who helped make this release
> possible.
>
> Kind reminder: It seems we have not added this release to the ASF rep
I also lean toward the second option since v1alpha1 is a preview release.
But we need to be more careful about introducing other incompatible changes
after v1beta1.
Maybe we also need a simple manual on how to upgrade the operator,
especially when the CRD version changes.
Best,
Yang
Gyula Fóra wrote on April 5, 2022:
Thanks Gyula for driving this, and everyone who helped make this release
possible.
Kind reminder: It seems we have not added this release to the ASF report
database. See "Recordkeeping" in the release process.[1]
Thank you~
Xintong Song
[1]
https://cwiki.apache.org/confluence/display/FLINK/C
Hi Josemon,
Can you provide the version of PyFlink you are using? It would also be
helpful if you could provide code snippets that easily reproduce
this problem.
Best,
Xingbo
Josemon Maliakal wrote on Tuesday, April 5, 2022 at 13:31:
> Hello There,
> I started using pyflink recently,
> Does anyone know w
Hi Martijn and Luan,
As of now, the main reason why PyFlink has not declared support for Python
3.9 is that apache-beam, and the versions of numpy and
pyarrow that apache-beam depends on, do not provide corresponding whl
packages for Python 3.9. Users need to install from source, but
Hi Martin,
Thanks for the clarification. I would be interested in becoming an ASF committer.
I missed the sync meeting today because I thought 4pm CST meant Central Standard
Time, but it was actually 1am my time (Pacific Time).
Is there no contributor joining from the US? I will join next week's sync
f
Alexander Fedulov created FLINK-27066:
Summary: Reintroduce e2e tests in ES as Java tests.
Key: FLINK-27066
URL: https://issues.apache.org/jira/browse/FLINK-27066
Project: Flink
Issue Ty
Jing Ge created FLINK-27064:
Summary: Centralize ArchUnit rules for production code
Key: FLINK-27064
URL: https://issues.apache.org/jira/browse/FLINK-27064
Project: Flink
Issue Type: Improvement
Gyula Fora created FLINK-27065:
Summary: Store lastReconciledSpec as String in status
Key: FLINK-27065
URL: https://issues.apache.org/jira/browse/FLINK-27065
Project: Flink
Issue Type: Improvement
Martijn Visser created FLINK-27063:
Summary: Upgrade Hive 2.3 connector to version 2.3.6
Key: FLINK-27063
URL: https://issues.apache.org/jira/browse/FLINK-27063
Project: Flink
Issue Type: T
Martijn Visser created FLINK-27062:
Summary: Add Boring Cyborg bot to flink-connector-elasticsearch
Key: FLINK-27062
URL: https://issues.apache.org/jira/browse/FLINK-27062
Project: Flink
Is
Jing Ge created FLINK-27061:
Summary: Improve the ArchUnit test infra in the external
elasticsearch connector
Key: FLINK-27061
URL: https://issues.apache.org/jira/browse/FLINK-27061
Project: Flink
Zhanghao Chen created FLINK-27060:
Summary: Extending /jars/:jarid/run API to
Key: FLINK-27060
URL: https://issues.apache.org/jira/browse/FLINK-27060
Project: Flink
Issue Type: Improvement
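For reference, a rough Python sketch of how the existing /jars/:jarid/run REST endpoint is called today (the host, jar id, and entry class are hypothetical, and the request-body field names should be checked against the REST API docs for your Flink version):

import requests  # assumes the 'requests' package is available

# Hypothetical JobManager address and jar id (as returned by the /jars upload endpoint).
jar_id = "d7e8a1b2-word_count.jar"
resp = requests.post(
    "http://localhost:8081/jars/" + jar_id + "/run",
    json={
        "entryClass": "org.example.WordCount",  # hypothetical entry class
        "parallelism": 2,
        "programArgsList": ["--input", "/tmp/in", "--output", "/tmp/out"],
        "allowNonRestoredState": False,
    },
)
print(resp.json())  # on success, contains the id of the submitted job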
Hi everyone,
As part of our efforts to externalize connectors, I've also been in contact
with the ASF to ask if it's possible to enable the Dependabot functionality
of Github. It was previously possible to only enable the notification
feature of Dependabot (which notifies you if there's a security
Ryan Skraba created FLINK-27059:
Summary: [JUnit5 Migration] Module: flink-compress
Key: FLINK-27059
URL: https://issues.apache.org/jira/browse/FLINK-27059
Project: Flink
Issue Type: Sub-task
Hi Luan,
According to the documentation, Python 3.9 is indeed not currently
supported. I briefly checked the Jira tickets and also couldn't find one
about adding support for this, so I've created
https://issues.apache.org/jira/browse/FLINK-27058 for that.
@dian0511...@gmail.com @hxbks...@gmail.co
Martijn Visser created FLINK-27058:
Summary: Add support for Python 3.9
Key: FLINK-27058
URL: https://issues.apache.org/jira/browse/FLINK-27058
Project: Flink
Issue Type: Improvement
Jing Ge created FLINK-27057:
Summary: Connector to external repo migration - Elasticsearch
Connector
Key: FLINK-27057
URL: https://issues.apache.org/jira/browse/FLINK-27057
Project: Flink
Issue T
FLINK-26985 was discovered just before last weekend.
We will get it resolved first thing after the holiday (tomorrow).
Best
Yuan
On Tue, Apr 5, 2022 at 5:37 PM Yun Gao wrote:
> Hi Robert,
>
> Very sorry for the long delay before the rc1 could be published.
>
> For the open critical issues, I p
I see.
Would that mean contributing to the Flink connector source code, or can I
create it as part of another dependent jar?
On Tue, Apr 5, 2022 at 5:36 AM Martijn Visser
wrote:
> Hi Shameet,
>
> No, you'll need to add support for the Snowflake dialect in Flink. You can
> find more information h
Hi,
currently I need to run PyFlink UDFs on Python 3.9, which is not supported
right now.
I tried building
- pyarrow==2.0.0
- apache-beam==2.27.0
on Python 3.9 and testing Python jobs, but failed.
Are there any previous discussions/git branches about Python 3.9? (I didn't
find any in this dev list)
so I can c
Hi Robert,
Very sorry for the long delay before the rc1 could be published.
For the open critical issues, I previously checked with the owners of these
issues that they are optional for the release, thus the only blocker issue is
FLINK-26985.
Besides the blocker issue, we are waiting for
Hi Shameet,
No, you'll need to add support for the Snowflake dialect in Flink. You can
find more information here at the JdbcDialect documentation [1]
Best regards,
Martijn Visser
https://twitter.com/MartijnVisser82
https://github.com/MartijnVisser
[1]
https://nightlies.apache.org/flink/flink-d
Thanks Martijn.
So supplying the snowflake-jdbc jar as a dependency jar, as I have done, and
specifying the driver property doesn't help?
t_env.get_config().get_configuration().set_string("pipeline.jars",
"file:///Users/shameetdoshi/Downloads/flink-connector-jdbc_2.12-1.14.4.jar;file:///Users/shame
Hi Shameet,
There's currently no open source Flink Snowflake connector/sink available.
As mentioned in the documentation [1] this requires the implementation of a
dialect.
Best regards
Martijn Visser
https://twitter.com/MartijnVisser82
https://github.com/MartijnVisser
[1]
https://nightlies.apac