Getting java.lang.Exception when trying to fetch data from Kafka

2016-04-22 Thread prateekarora
Hi, I am sending data using the KafkaProducer API: imageRecord = new ProducerRecord(topic, messageKey, imageData); producer.send(imageRecord); In my Flink program I try to fetch the data using FlinkKafkaConsumer08. Below is the sample code: def main(a
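
For context, a minimal sketch of the consuming side in a Flink 1.0 program. The topic name, group id, and broker/ZooKeeper addresses are placeholders, and SimpleStringSchema only fits UTF-8 string values; binary payloads such as image data need a custom DeserializationSchema.

    import java.util.Properties;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class KafkaReadJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // The Kafka 0.8 consumer needs both the ZooKeeper quorum and the broker list.
            Properties props = new Properties();
            props.setProperty("zookeeper.connect", "localhost:2181");
            props.setProperty("bootstrap.servers", "localhost:9092");
            props.setProperty("group.id", "flink-consumer");

            // Read the topic as plain strings.
            DataStream<String> stream = env.addSource(
                    new FlinkKafkaConsumer08<>("my-topic", new SimpleStringSchema(), props));

            stream.print();
            env.execute("Kafka read job");
        }
    }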

Re: Getting java.lang.Exception when trying to fetch data from Kafka

2016-04-25 Thread prateekarora
Hi, I have a Java program that sends data into a Kafka topic using the Kafka client API (0.8.2). Here is the sample code used to send data to the Kafka topic: import org.apache.kafka.clients.producer.Producer; import org.apache.kafka.clients.producer.ProducerConfig; import org.apache.kafka.clients.producer
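
For reference, a hedged sketch of what those producer imports typically go with in the 0.8.2 client, with the serializer classes supplied through ProducerConfig keys; the broker address, topic, key, and payload are placeholders.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class SimpleKafkaSender {
        public static void main(String[] args) {
            // Serializer classes are given via ProducerConfig keys in this variant.
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.ByteArraySerializer");

            Producer<String, byte[]> producer = new KafkaProducer<>(props);
            producer.send(new ProducerRecord<String, byte[]>("my-topic", "key-1", new byte[]{1, 2, 3}));
            producer.close();
        }
    }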

Re: Getting java.lang.Exception when trying to fetch data from Kafka

2016-04-25 Thread prateekarora
Hi, I have a Java program that sends data into a Kafka topic. Below is the code for this: private Producer producer = null; Serializer keySerializer = new StringSerializer(); Serializer valueSerializer = new ByteArraySerializer(); producer = new KafkaProducer(props, keySerializer, valueSerializer); Produce
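
A hedged completion of the truncated snippet: a producer constructed with explicit StringSerializer/ByteArraySerializer instances, sending a keyed byte[] record. The topic, key, payload, and broker address are placeholders, not taken from the original message.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.ByteArraySerializer;
    import org.apache.kafka.common.serialization.Serializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ImageSender {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");

            // Explicit serializer instances instead of class names in the config.
            Serializer<String> keySerializer = new StringSerializer();
            Serializer<byte[]> valueSerializer = new ByteArraySerializer();
            Producer<String, byte[]> producer =
                    new KafkaProducer<>(props, keySerializer, valueSerializer);

            byte[] imageData = new byte[]{1, 2, 3};  // placeholder payload
            ProducerRecord<String, byte[]> imageRecord =
                    new ProducerRecord<>("my-topic", "frame-0001", imageData);
            producer.send(imageRecord);
            producer.close();
        }
    }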

Re: Getting java.lang.Exception when trying to fetch data from Kafka

2016-04-27 Thread prateekarora
ger <[hidden email]> wrote: Hi Prateek, were the messages written to the Kafka topic by Flink, using the TypeInformationKeyValueSerializationSchema? If not,
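
If the records were written by a plain Kafka producer (as in the earlier messages) rather than by Flink with the TypeInformationKeyValueSerializationSchema, one option is a custom KeyedDeserializationSchema that exposes both key and value. This is a hedged sketch against the Flink 1.0 interface; the Tuple2<String, byte[]> output shape is an assumption.

    import java.io.IOException;
    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.common.typeinfo.PrimitiveArrayTypeInfo;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.api.java.typeutils.TupleTypeInfo;
    import org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema;

    // Turns a raw Kafka (key, value) pair into a Tuple2 of the String key and the byte[] payload.
    public class KeyedByteSchema implements KeyedDeserializationSchema<Tuple2<String, byte[]>> {

        @Override
        public Tuple2<String, byte[]> deserialize(byte[] messageKey, byte[] message,
                                                  String topic, int partition, long offset) throws IOException {
            String key = messageKey == null ? null : new String(messageKey, "UTF-8");
            return new Tuple2<>(key, message);
        }

        @Override
        public boolean isEndOfStream(Tuple2<String, byte[]> nextElement) {
            return false;  // keep consuming
        }

        @Override
        public TypeInformation<Tuple2<String, byte[]>> getProducedType() {
            return new TupleTypeInfo<Tuple2<String, byte[]>>(
                    BasicTypeInfo.STRING_TYPE_INFO,
                    PrimitiveArrayTypeInfo.BYTE_PRIMITIVE_ARRAY_TYPE_INFO);
        }
    }

An instance of this schema can then be passed to the FlinkKafkaConsumer08 constructor that accepts a KeyedDeserializationSchema instead of a plain DeserializationSchema.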

How to measure Flink performance

2016-05-06 Thread prateekarora
Hi, I am new to Apache Flink and am using Flink 1.0.1. I have a streaming program that fetches data from Kafka, performs some computation, and sends the result back to Kafka. I want to compare results between Flink and Spark. I have the information below from Spark. Can I get similar information fro
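
One low-tech way to get record counts out of a Flink 1.0 job is an accumulator in a rich function; the operator and accumulator names below are illustrative, and throughput is then the accumulated count divided by the job's runtime.

    import org.apache.flink.api.common.accumulators.LongCounter;
    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;

    // Counts every record passing through this operator; the total is reported
    // with the job's accumulators and can be divided by runtime to get throughput.
    public class CountingMap<T> extends RichMapFunction<T, T> {

        private final LongCounter recordCount = new LongCounter();

        @Override
        public void open(Configuration parameters) {
            getRuntimeContext().addAccumulator("num-records", recordCount);
        }

        @Override
        public T map(T value) {
            recordCount.add(1L);
            return value;
        }
    }

Inserting .map(new CountingMap<String>()) right after the Kafka source counts the records read from the topic; the count should show up with the job's accumulator results (for example in the web dashboard's job view).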

Re: How to measure Flink performance

2016-05-12 Thread prateekarora
Hi, how can I measure the throughput and latency of my application in Flink 1.0.2? Regards, Prateek
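
For a rough end-to-end latency figure, one approach (a sketch, not a built-in facility of Flink 1.0.2) is to embed a wall-clock timestamp in each record on the producer side and subtract it just before the result is written back to Kafka. It assumes reasonably synchronized clocks and a record layout like "<timestampMillis>|<payload>", which is made up for this example.

    import org.apache.flink.api.common.functions.MapFunction;

    // Expects records of the form "<timestampMillis>|<payload>" and appends the
    // measured latency so it can be inspected in the output topic.
    public class LatencyStamper implements MapFunction<String, String> {

        @Override
        public String map(String value) {
            int sep = value.indexOf('|');
            long producedAt = Long.parseLong(value.substring(0, sep));
            long latencyMillis = System.currentTimeMillis() - producedAt;
            return value + "|latencyMs=" + latencyMillis;
        }
    }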

Flink Kafka Streaming - Source [Bytes/records Received] and Sink [Bytes/records sent] show zero messages

2016-05-16 Thread prateekarora
Hi, in my Flink Kafka streaming application I am fetching data from one topic, processing it, and sending it to an output topic. My application is working fine, but the Flink dashboard shows Source [Bytes/records Received] and Sink [Bytes/records sent] as zero. Duration Name

Re: Flink Kafka Streaming - Source [Bytes/records Received] and Sink [Bytes/records sent] show zero messages

2016-05-16 Thread prateekarora
o external systems (and not Flink operators), the dashboard does not show received bytes for sources and sent bytes for sinks. Best, Fabian 2016-05-16 23:02 GMT+02:00 prateekarora <[hidden email]>
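
A related effect (an assumption here, not stated in this thread) is operator chaining: if the whole source-to-sink pipeline runs as one chained task, no records cross a task boundary, so the intermediate counters can also stay at zero. Breaking the chain makes the records exchanged between tasks show up, at the cost of extra serialization.

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class ChainBreakExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // With chaining disabled, each operator runs as its own task, so the
            // records exchanged between tasks appear in the dashboard counters.
            env.disableOperatorChaining();

            DataStream<Long> numbers = env.generateSequence(1, 1000);
            numbers.map(new MapFunction<Long, Long>() {
                @Override
                public Long map(Long n) {
                    return n * 2;
                }
            }).print();

            env.execute("chain break example");
        }
    }

Bytes received by sources from Kafka and sent by sinks to Kafka remain uncounted either way, as explained above.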

What is the purpose or impact of the -yst (--yarnstreaming) argument

2016-05-25 Thread prateekarora
Hi, I am running a Flink Kafka streaming application and have not seen any impact of the -yst (--yarnstreaming) argument on my application. I thought this argument was introduced in 1.0.2. Can anyone explain the purpose of this argument? Regards, Prateek

How to perform multiple stream join functionality

2016-05-25 Thread prateekarora
Hi, I am trying to port my Spark application to Flink. In Spark I used the command below to join multiple streams: val stream = stream1.join(stream2).join(stream3).join(stream4) As per my understanding, Flink requires a window operation because Flink does not work on RDDs like Spark. So I tried
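
For reference, a hedged sketch of what one pairwise windowed join looks like in Flink's Java DataStream API; the class, stream, and key names are illustrative, and the window assigner name matches the Flink 1.x API. Since the API only offers binary joins, joining stream3 and stream4 means repeating this pattern on the result of the previous join.

    import org.apache.flink.api.common.functions.JoinFunction;
    import org.apache.flink.api.java.functions.KeySelector;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;

    public class StreamJoinSketch {

        // Joins two streams of (key, payload) pairs over 10-second processing-time windows.
        public static DataStream<Tuple2<String, String>> joinStreams(
                DataStream<Tuple2<String, String>> stream1,
                DataStream<Tuple2<String, String>> stream2) {

            KeySelector<Tuple2<String, String>, String> byKey =
                    new KeySelector<Tuple2<String, String>, String>() {
                        @Override
                        public String getKey(Tuple2<String, String> value) {
                            return value.f0;
                        }
                    };

            return stream1.join(stream2)
                    .where(byKey)
                    .equalTo(byKey)
                    .window(TumblingProcessingTimeWindows.of(Time.seconds(10)))
                    .apply(new JoinFunction<Tuple2<String, String>, Tuple2<String, String>,
                            Tuple2<String, String>>() {
                        @Override
                        public Tuple2<String, String> join(Tuple2<String, String> first,
                                                           Tuple2<String, String> second) {
                            // Combine the two payloads that share the same key.
                            return new Tuple2<>(first.f0, first.f1 + "," + second.f1);
                        }
                    });
        }
    }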

Re: How to perform multiple stream join functionality

2016-05-26 Thread prateekarora
Hi, thanks for the information. It would be good if in the future you provide an API to implement such use cases more pleasantly. Regards, Prateek

Re: What is the purpose or impact of the -yst (--yarnstreaming) argument

2016-05-26 Thread prateekarora
Thanks for the information. Regards, Prateek

YARN kills container due to running beyond physical memory limits [how can I debug the memory issue?]

2016-05-31 Thread prateekarora
Hi, I am running Flink 1.0.2 with YARN. After running the application for some time, YARN kills my container due to running beyond physical memory limits. How can I debug the memory issue? Below are the logs: Container container_1463184272818_0165_01_12 is completed with diagnostics: Container

Re: YARN kills container due to running beyond physical memory limits [how can I debug the memory issue?]

2016-06-01 Thread prateekarora
Hi, thanks for the reply. I have a 6-node YARN cluster with 107.66 GB of memory and 48 vcores in total. Configuration: 5 nodes: each configured with 19.53 GiB (yarn.nodemanager.resource.memory-mb = 19.53 GB); 1 node: configured with 10 GiB (yarn.nodemanager

Re: YARN kills container due to running beyond physical memory limits [how can I debug the memory issue?]

2016-06-01 Thread prateekarora
Hi, I have not changed the "yarn.heap-cutoff-ratio" or "yarn.heap-cutoff-ratio.min" configuration. As per the log, Flink assigns 4608 MB out of 6 GB, so I thought the configuration was working fine. /usr/lib/jvm/java-7-oracle-cloudera/bin/java -Xms4608m -Xmx4608m -XX:MaxDirectMemorySize=4608m Regards, Pratee
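
The numbers in that command line are consistent with the default cutoff (assuming the default yarn.heap-cutoff-ratio of 0.25, which matches the log):

    container size           = 6144 MB
    heap cutoff (ratio 0.25) = 6144 * 0.25 = 1536 MB
    JVM heap (-Xms/-Xmx)     = 6144 - 1536 = 4608 MB

Since the heap is capped at 4608 MB, a container that YARN kills for exceeding 6 GB of physical memory is typically being pushed over the limit by memory outside the heap, such as direct buffers (bounded here by -XX:MaxDirectMemorySize=4608m), native libraries, or thread stacks.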