Hi,
Currently we are using Flink 1.16.3 with Opensearch connector version
1.0.1-1.16 as per link below:
https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/connectors/datastream/opensearch/
Now we want to upgrade the Flink version to 1.18+, but found that there is no
Opensearch connector release available for it.
G
On Wed, Mar 6, 2024 at 11:10 AM Kirti Dhar Upadhyay K via user
<user@flink.apache.org> wrote:
Hi Team,
I am using Flink File Source with Local File System.
I am facing an issue: if the source directory does not have read permission, it
returns the list of files as null instead of throwing a permission exception,
resulting in an NPE.
Subject: Re: SecurityManager in Flink
Hi Kirti Dhar,
What is your Java version? I guess this problem may be related to
FLINK-33309 [1]. Maybe you can try adding "-Djava.security.manager" to the Java
options.
[1] https://issues.apache.org/jira/browse/FLINK-33309
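A minimal sketch of how that option might be passed, assuming a flink-conf.yaml deployment (the exact key and the `=allow` value are assumptions to verify against your Flink and JDK versions):

```yaml
# flink-conf.yaml: forward the suggested flag to every JVM Flink starts.
# On JDK 17+ an explicit value such as "allow" may be required.
env.java.opts.all: -Djava.security.manager=allow
```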
Kirti Dhar Upadhyay K v
https://github.com/apache/flink/blob/9b1375520b6b351df7551d85fcecd920e553cc3a/flink-core/src/main/java/org/apache/flink/core/fs/local/LocalFileSystem.java#L161C32-L161C38
Kirti Dhar Upadhyay
Hi Team,
I am using Flink File Source with Local File System.
I am facing an issue: if the source directory does not have read permission, it
returns the list of files as null instead of throwing a permission exception
(refer to the highlighted line below), resulting in an NPE.
final FileStatus[] conta
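The NPE pattern above can be reproduced with plain JDK file APIs: `java.io.File#listFiles` (which `LocalFileSystem` builds on) returns null instead of throwing when a directory is missing or unreadable, so a defensive null check is one way to surface a clear error. A minimal sketch (class name and error message are illustrative):

```java
import java.io.File;

public class ListFilesNullCheck {
    // java.io.File#listFiles returns null (rather than throwing) when the
    // path is missing or not a readable directory, which is what surfaces
    // later as an NPE if the result is used unchecked.
    static File[] listOrFail(File dir) {
        File[] contents = dir.listFiles();
        if (contents == null) {
            throw new IllegalStateException(
                    "Cannot list directory (missing or unreadable): " + dir);
        }
        return contents;
    }

    public static void main(String[] args) {
        File missing = new File("definitely-no-such-dir-12345");
        System.out.println(missing.listFiles() == null); // prints "true"
        try {
            listOrFail(missing);
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```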
https://github.com/apache/flink-connector-kafka/pull/18
On 2024/02/01 11:58:29
On 2024/02/01 07:46:38, Kirti Dhar Upadhyay K via user wrote:
> Hi Mates,
>
> I have the below queries regarding the Flink Kafka Sink.
>
>
> 1. Does the Kafka Sink support a schema registry? If yes, is there any
> documentation on configuring it?
> 2. Does the Kafka Sink support sending messages (ProducerRecord) with headers?
Hi Mates,
I have the below queries regarding the Flink Kafka Sink.
1. Does the Kafka Sink support a schema registry? If yes, is there any
documentation on configuring it?
2. Does the Kafka Sink support sending messages (ProducerRecord) with headers?
Regards,
Kirti Dhar
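On question 2, the connector's `KafkaRecordSerializationSchema` produces a Kafka `ProducerRecord`, which does carry headers. The self-contained sketch below shows only the attach-a-header-while-serializing pattern; the `Record` type and the `trace-id` header are hypothetical stand-ins, not the real connector classes:

```java
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class HeaderedRecordSketch {
    // Hypothetical stand-in for a Kafka ProducerRecord: just enough shape
    // to show a per-record value plus attached headers.
    static final class Record {
        final byte[] value;
        final Map<String, byte[]> headers = new LinkedHashMap<>();
        Record(byte[] value) { this.value = value; }
    }

    // Serializer-side pattern: attach a header while serializing an event.
    static Record serialize(String event, String traceId) {
        Record record = new Record(event.getBytes(StandardCharsets.UTF_8));
        record.headers.put("trace-id", traceId.getBytes(StandardCharsets.UTF_8));
        return record;
    }

    public static void main(String[] args) {
        Record r = serialize("payload", "abc-123");
        System.out.println(r.headers.containsKey("trace-id")); // prints "true"
    }
}
```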
Using CSV with AVRO-generated classes sounds rather strange,
and you might want to reconsider your approach.
As for a quick fix, using aliases in your reader schema might help [1].
[1] https://avro.apache.org/docs/1.8.1/spec.html#Aliases
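A hedged sketch of what such an alias could look like for the schema discussed in this thread (the field names are assumptions for illustration): the reader schema declares the Java-friendly name and lists the on-wire name as an alias.

```json
{
  "namespace": "avro.employee",
  "type": "record",
  "name": "EmployeeTest",
  "fields": [
    {
      "name": "firstName",
      "type": "string",
      "aliases": ["first_name"]
    }
  ]
}
```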
Best,
Alexander Fedulov
On Thu, 26 Oct 2023 at 16:24, Kirti Dhar Upadhyay K via user wrote:
Hi Team,
I am using the Flink CSV decoder with AVSC-generated Java objects and am facing
an issue if a field name contains an underscore (_) or starts with an uppercase letter.
Sample Schema:
{
  "namespace": "avro.employee",
  "type": "record",
  "name": "EmployeeTest",
  "fields": [
    {
      "name":
Hi Community,
Can someone help me here?
Regards,
Kirti Dhar
From: Kirti Dhar Upadhyay K
Sent: 10 October 2023 15:52
To: user@flink.apache.org
Subject: File Source Watermark Issue
Hi Team,
I am using the Flink File Source with a window aggregator as the process
function, and am stuck with a weird issue.
Hi Team,
I am using the Flink File Source with a window aggregator as the process
function, and am stuck with a weird issue.
The file source doesn't seem to emit/progress watermarks, whereas if I add a
delay (say 100 ms) while extracting the timestamp from an event, it works fine.
I found something similar
will be cleaned up.
Best,
Shammon FY
On Fri, Sep 22, 2023 at 3:11 PM Kirti Dhar Upadhyay K via user
<user@flink.apache.org> wrote:
Hi Community,
I am using Flink File Source with Amazon S3.
Please help me with the below questions:
1. When the Split Enumerator assigns a split to the Source
Hi Community,
I am using Flink File Source with Amazon S3.
Please help me with the below questions:
1. When the Split Enumerator assigns a split to the Source Reader, does it
download the file temporarily and then start reading/decoding the records from
the file, or does it create a direct stream with S3?
2. I
Hi Team,
I am using the CSV decoder with the Flink file source.
I am stuck with decoding issues as below:
1. If there is a blank line between two records, or blank lines at the end of
the file, it returns a blank object. E.g.:
Input Records:
id,name,age,isPermanent,tenure,salary,gender,contact
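One possible workaround (an assumption, not a facility of the connector itself): pre-filter blank or whitespace-only lines before they reach the decoder, so no blank objects are produced. A self-contained sketch:

```java
import java.util.List;
import java.util.stream.Collectors;

public class BlankLineFilter {
    // Drop blank/whitespace-only lines before handing the CSV text to the
    // decoder, so no "blank object" records are emitted downstream.
    static List<String> dropBlankLines(List<String> lines) {
        return lines.stream()
                .filter(line -> !line.isBlank())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = List.of(
                "id,name,age",
                "",                 // blank line between records
                "1,Alice,30",
                "   ",              // whitespace-only line at end of file
                "2,Bob,25");
        System.out.println(dropBlankLines(raw).size()); // prints 3
    }
}
```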
Hello Guys,
I am using the Flink File Source with Amazon S3.
AFAIK, the file source first downloads the file to a temporary location and
then starts reading the file and emitting the records.
By default the download location is the /tmp directory.
In the case of a containerized environment, where Pods have limited
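For the limited-storage concern above, one commonly used knob (verify against your Flink version; the path below is an assumption) is pointing Flink's temporary directories at a mounted volume:

```yaml
# flink-conf.yaml: put Flink's scratch/staging space on a volume with room,
# instead of the Pod's default /tmp.
io.tmp.dirs: /mnt/flink-scratch
```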
Hi Team,
I am using the Flink File Source in one of my use cases.
I observed that, while reading a file, the source reader stores its position in
the checkpointed data.
If the application crashes, it restores its position from the checkpointed data
once the application comes back up, which may result in re-emitting
`AbstractFileSource#createReader` and
`AbstractFileSource#createEnumerator`.
Best,
Hang
Kirti Dhar Upadhyay K via user
<user@flink.apache.org> wrote on Mon, Jun 5, 2023 at 22:57:
Hi Community,
I am trying to add a new counter for the number of files collected by the Flink
File Source.
Referring to the doc
https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/
metric group from the context, like `SourceReaderContext` and
`SplitEnumeratorContext`. These contexts could be found when creating readers
and enumerators. See `AbstractFileSource#createReader` and
`AbstractFileSource#createEnumerator`.
Best,
Hang
Kirti Dhar Upadhyay K via user
<user@flink.apache.org> wrote:
Hi Community,
I am trying to add a new counter for the number of files collected by the Flink
File Source.
Referring to the doc
https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/ I
understand how to add a new counter on any operator.
this.counter = getRuntimeContext().getMetricGroup(
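Following Hang's pointer to the enumerator/reader contexts, the counting pattern itself is simple. The sketch below uses a hypothetical local `Counter` stand-in; in a real job the counter would come from the context's metric group (e.g. `SplitEnumeratorContext#metricGroup`, per the reply above):

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

public class FileCounterSketch {
    // Minimal stand-in for a metrics counter (hypothetical, not Flink's own
    // org.apache.flink.metrics.Counter).
    static final class Counter {
        private final AtomicLong count = new AtomicLong();
        void inc() { count.incrementAndGet(); }
        long getCount() { return count.get(); }
    }

    // Enumerator-side pattern: bump the counter once per discovered file
    // as its split is handed to a reader.
    static long countSplits(List<String> discoveredFiles, Counter filesCollected) {
        for (String file : discoveredFiles) {
            filesCollected.inc(); // one increment per collected file
        }
        return filesCollected.getCount();
    }

    public static void main(String[] args) {
        Counter c = new Counter();
        countSplits(List.of("a.csv", "b.csv", "c.csv"), c);
        System.out.println(c.getCount()); // prints 3
    }
}
```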
Hi Community,
I am planning to use the FileSource (with S3) in my application, and
encountered the below limitations:
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/filesystem/#current-limitations
1. Watermarking does not work very well for large backlogs of files.
-limitations
Best regards,
Martijn
On Thu, Apr 20, 2023 at 2:30 PM Kirti Dhar Upadhyay K via user
<user@flink.apache.org> wrote:
Hi Community,
I have started using file source of Flink 1.17.x recently.
I was going through the FLIP-27 documentation, and as much as I understand,
the SplitEnumerator
Hi Community,
I have started using file source of Flink 1.17.x recently.
I was going through the FLIP-27 documentation, and as much as I understand,
the SplitEnumerator lists files (splits) and assigns them to SourceReaders. A
single instance of the SplitEnumerator runs, whereas parallelism can be applied
on the SourceReader side.
Hi Community,
I am reading CSV data using the data stream file source connector and need to
convert it into AVRO-generated specific objects.
I am using CsvReaderFormat with CSVSchema, but this supports only primitive
AVRO types (and even those excluding null and bytes).
Is there any support provided for this?
Hi,
I am using the data stream file source connector in one of my use cases.
I was going through the documentation, where I found the below limitations:
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/filesystem/#current-limitations
1. Watermarking does not work very well