Unsubscribe
On Tue, Jul 4, 2023 at 1:33 PM Bauddhik Anand wrote:
> Unsubscribe
>
Hi Soumen,
I want to unsubscribe from this mailing list.
Thanks & Regards
Ragini Manjaiah
On Fri, Feb 3, 2023 at 4:07 PM Soumen Choudhury wrote:
>
>
> --
> Regards
> Soumen Choudhury
> Cell : +91865316168
> mail to : sou@gmail.com
>
Hi Sanket,
I have a similar use case. How are you measuring the time for the Async1
function to return its result and for the external API call?
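Not from the thread itself, but one way to measure this is to capture `System.nanoTime()` when the request is issued and record the delta when the future completes. A plain-Java sketch with no Flink dependencies (names are illustrative; inside a Flink `AsyncFunction` the same wrapper would go around the client call, typically reporting to a histogram metric):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class AsyncTiming {
    // Wraps a future so its completion latency is recorded into elapsedMillis[0].
    static <T> CompletableFuture<T> timed(CompletableFuture<T> f, long[] elapsedMillis) {
        long start = System.nanoTime();
        return f.whenComplete((result, error) ->
                elapsedMillis[0] = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start));
    }

    public static void main(String[] args) throws Exception {
        long[] elapsed = new long[1];
        // Stand-in for an external API call
        CompletableFuture<String> call =
                timed(CompletableFuture.supplyAsync(() -> "response"), elapsed);
        System.out.println(call.get() + " took " + elapsed[0] + " ms");
    }
}
```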
On Wed, Sep 29, 2021 at 10:47 AM Sanket Agrawal
wrote:
> Hi @Piotr Nowojski ,
>
>
>
> Thank you for replying back. Yes, first async is taking between 1300-1500
>
s, input data distribution, async mode or sync mode lookup.
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-master/docs/ops/monitoring/back_pressure/
>
> Best,
> JING ZHANG
>
> Ragini Manjaiah wrote on Monday, Sep 27, 2021 at 2:05 PM:
>
>> Hi ,
>> I have a flink real t
drastically to 30 TPS. What are the things I need to look into in such a
situation? There are no exceptions caught. How do I check for the bottleneck
area? Can someone throw some light on this?
Thanks & Regards
Ragini Manjaiah
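Besides the web UI, backpressure can be sampled through Flink's REST API (see the monitoring docs linked above). A sketch of the call; host, job ID, and vertex ID are placeholders for your cluster:

```shell
# Sample backpressure for one job vertex (placeholders, not real IDs)
curl http://localhost:8081/jobs/<job-id>/vertices/<vertex-id>/backpressure
```

A vertex reporting HIGH backpressure is usually downstream of the actual bottleneck operator.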
Hi,
In what scenarios do we hit java.lang.OutOfMemoryError: Java heap space
while publishing to Kafka? I hit this exception, and as a resolution I
added the property .setProperty("security.protocol", "SSL"); in the Flink
application.
Later I started encountering org.apache.kafka.common.errors.Timeou
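For reference, a minimal sketch of the producer configuration described above; the broker address and truststore path are hypothetical, and an SSL listener usually also needs truststore settings alongside `security.protocol`:

```java
import java.util.Properties;

public class KafkaSslProps {
    static Properties producerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker1:9093"); // hypothetical SSL listener
        props.setProperty("security.protocol", "SSL");
        // Truststore settings commonly accompany security.protocol=SSL (path is a placeholder)
        props.setProperty("ssl.truststore.location", "/path/to/truststore.jks");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("security.protocol"));
    }
}
```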
fy the exclusion.
>
> Best,
> D.
>
> On Tue, Sep 14, 2021 at 2:15 PM Ragini Manjaiah
> wrote:
>
>> Hi David,
>> please find my pom.xml, where I have excluded the slf4j-log4j12
>> dependency. Even after excluding it, I am still encountering this issue.
>>
2, which is no longer supported. Can you
> try getting rid of the slf4j-log4j12 dependency?
>
> Best,
> D.
>
> On Tue, Sep 14, 2021 at 1:51 PM Ragini Manjaiah
> wrote:
>
>> when I try to run a Flink 1.13 application I encounter the below
>> mentioned issue. What
when I try to run a Flink 1.13 application I encounter the below mentioned
issue. What dependency am I missing? Can you please help me?
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/Users/z004t01/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.1
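David's suggestion above translates into a pom.xml exclusion. A sketch; the group and artifact of the dependency that transitively pulls in slf4j-log4j12 are placeholders to be replaced with whatever `mvn dependency:tree` identifies:

```xml
<!-- Exclude the conflicting SLF4J binding from the dependency that carries it
     (some.group:some-artifact is a placeholder) -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>some-artifact</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```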
ong an asynchronous request may take
>> before it is considered failed. This parameter guards against dead/failed
>> requests.
>
>
> Regards,
> Rahul
>
> On Wed, Jul 14, 2021 at 9:29 AM Ragini Manjaiah
> wrote:
>
>> Hi ,
>> I am facing the below iss
Hi,
I am facing the below issue while processing streaming events. In what
scenarios do we hit java.lang.Exception: Could not complete the stream
element? Can you please help me here? The job fails after this exception is hit.
2021-07-13 13:24:58,781 INFO
org.apache.flink.runtime.executiongraph.Exec
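One common cause of "Could not complete the stream element" is an async request exceeding its timeout, as discussed later in the thread. The behavior can be illustrated with plain-Java `CompletableFuture.orTimeout` (a stand-in for Flink's async timeout, not the actual Flink code path):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class AsyncTimeoutDemo {
    // Returns true if the future fails with a timeout within the given bound.
    static boolean timedOut(CompletableFuture<String> f, long millis) {
        f.orTimeout(millis, TimeUnit.MILLISECONDS);
        try {
            f.join();
            return false;
        } catch (Exception e) {
            // join() throws CompletionException wrapping TimeoutException
            return e.getCause() instanceof TimeoutException;
        }
    }

    public static void main(String[] args) {
        // A request that never completes, guarded by a 100 ms timeout
        CompletableFuture<String> slow = new CompletableFuture<>();
        System.out.println("timed out: " + timedOut(slow, 100));
    }
}
```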
> [INFO] | +- org.apache.kafka:kafka_2.12:jar:5.5.1-ccs:compile
>
> [INFO] +- io.confluent:kafka-schema-registry:jar:5.5.1:compile
> [INFO] | | \-
> com.kjetland:mbknor-jackson-jsonschema_2.12:jar:1.0.39:compile
>
> On 5/7/2021 2:47 PM, Ragini Manjaiah wrote:
>
> Hi ,
>
] \- org.springframework:spring-jcl:jar:5.0.9.RELEASE:test
On Fri, May 7, 2021 at 5:58 PM Chesnay Schepler wrote:
> Can you show us the dependency tree of your project?
> (If you are using maven, run "mvn dependency:tree")
>
> On 5/7/2021 2:15 PM, Ragini Manjaiah wrote:
>
>
The Scala version is the same across the pom file: 2.11.
On Fri, May 7, 2021 at 5:06 PM Chesnay Schepler wrote:
> It looks like you have different scala versions on the classpath. Please
> check that all your dependencies use the same scala version.
>
> On 5/7/2021 1:25 PM, Ragini Ma
Hi,
I am seeing this when submitting the Flink job from the IntelliJ IDE. What
could the issue be? Do I need to change the Scala version?
flink 1.11.3
scala 2.11
Exception in thread "main" java.lang.NoSuchMethodError:
scala.Function1.$init$(Lscala/Function1;)V
at
scala.concurrent.java8.FuturesConvertersImpl$CF.&lt;init&gt;(
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288)
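A common cause of this NoSuchMethodError is mixing Scala 2.11 and 2.12 artifacts on the classpath, as Chesnay suggests above. One way to avoid drift is pinning the Scala suffix through a single Maven property; a sketch using the versions mentioned in the thread:

```xml
<properties>
  <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
    <version>1.11.3</version>
  </dependency>
</dependencies>
```

Running `mvn dependency:tree` and searching for stray `_2.12` artifacts is a quick way to spot a mismatch.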
On Tue, May 4, 2021 at 11:47 AM Ragini Manjaiah
wrote:
> Thank you for the clarification.
>
> On Mon, May 3, 2021 at 6:57 PM Matthias Pohl
> wrote:
>
>> Hi Ragini,
>> this is a dependency version issue. F
Hi Team,
I am trying to submit a Flink job of version 1.11.3. The actual
application is developed in Flink 1.8.1.
Since the Hadoop cluster is Apache 3.2.0, I downloaded Flink 1.11.3
(flink-1.11.3-bin-scala_2.11.tgz) and tried to submit the job.
While submitting, I am facing the below mentioned excepti
ould need to upgrade to a more recent Flink version.
>
> Best,
> Matthias
>
> [1]
> https://flink.apache.org/news/2020/07/06/release-1.11.0.html#important-changes
> [2] https://issues.apache.org/jira/browse/FLINK-11086
>
> On Mon, May 3, 2021 at 3:05 PM Ragini Manjaiah
> wrot
Hi,
One of my Flink applications needs to get and put records from HBase for
every event while processing in real time. When there are few events, the
application processes them without any issues. When the number of events
increases, we start hitting the below mentioned exception. Can these
excepti
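Not suggested in the thread itself, but a common mitigation for per-event HBase round trips under load is to buffer requests and issue them in batches. A generic plain-Java sketch with no HBase client; the flusher callback stands in for a bulk put:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BatchingBuffer<T> {
    private final List<T> buffer = new ArrayList<>();
    private final int batchSize;
    private final Consumer<List<T>> flusher; // e.g. a bulk HBase put (hypothetical)

    BatchingBuffer(int batchSize, Consumer<List<T>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    // Buffers one element and flushes automatically once the batch is full.
    void add(T element) {
        buffer.add(element);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Flushes any buffered elements; call on checkpoint/close to avoid loss.
    void flush() {
        if (!buffer.isEmpty()) {
            flusher.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        BatchingBuffer<String> puts = new BatchingBuffer<>(2,
                batch -> System.out.println("flushing " + batch.size() + " puts"));
        puts.add("row1");
        puts.add("row2"); // triggers an automatic flush of 2
        puts.add("row3");
        puts.flush();     // flush the remainder explicitly
    }
}
```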
Hi Team,
I have Flink 1.8.1 and open source Hadoop 3.2.0. My Flink jobs run
without issues on HDP 2.5.3. When run on open source Hadoop 3.2.0, I am
encountering the below mentioned exception.
I have set the Hadoop environment:
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH=`hadoop classpath`
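For completeness, a sketch of how these exports fit into a YARN submission; the entry class, jar path, and cluster mode flag are placeholders for your setup:

```shell
# Environment expected by Flink's Hadoop integration (paths are placeholders)
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH=$(hadoop classpath)

# Submit the job against YARN (class and jar are hypothetical)
./bin/flink run -m yarn-cluster -c com.example.MyJob /path/to/myjob.jar
```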