Okay so acks will be 1 then. Anything in the broker logs when this occurs?

On Wed, 5 Sep. 2018, 3:32 pm LEE, Tung-Yu, <tun...@gmail.com> wrote:

> We didn't specify the acks in our producer.properties. I guess it will be
> the default value?
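> (For reference: in the Kafka 1.0.x producer the default is acks=1, meaning
> only the partition leader must acknowledge a write. A minimal
> producer.properties fragment to make the choice explicit - the values shown
> here are illustrative, not taken from the original config - might look like:)

```
# Kafka 1.0.x producer default: only the partition leader acknowledges.
acks=1
# Alternatives: acks=0 (no acknowledgement), acks=all (all in-sync replicas).
```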
>
> Liam Clarke <liam.cla...@adscale.co.nz> wrote on Wed, 5 Sep 2018 at 11:09:
>
> > What are your producer acks configured to?
> >
> > On Wed, 5 Sep. 2018, 2:46 pm LEE, Tung-Yu, <tun...@gmail.com> wrote:
> >
> > > Actually we are not sure how often the blocking happens. Whenever it
> > > happens, it never seems to stop until we kill the process.
> > >
> > >
> > >
> > > We start 12 processes at the same time via crontab, each using
> > > KafkaProducer to send a CSV containing around 80,000 records, every
> > > minute.
> > >
> > >
> > >
> > > Like this:
> > >
> > > */1 * * * *  java -jar ./kafka-producer.jar ./producer.properties ./folder01/test.csv
> > >
> > > */1 * * * *  java -jar ./kafka-producer.jar ./producer.properties ./folder02/test.csv
> > >
> > > */1 * * * *  java -jar ./kafka-producer.jar ./producer.properties ./folder03/test.csv
> > >
> > > …
> > >
> > > */1 * * * *  java -jar ./kafka-producer.jar ./producer.properties ./folder12/test.csv
> > >
> > >
> > >
> > > We deliberately use the same data to test it. Normally, every process
> > > runs and finishes. However, we've noticed that sometimes a process just
> > > blocks and never stops. It doesn't happen very often, maybe once every
> > > few hours to once every few days, and it seems to have no fixed
> > > frequency.
> > >
> > >
> > >
> > > We have considered calling flush() immediately after sending data, but
> > > we were concerned about increasing latency, so we didn't try it. We will
> > > give it a try.
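> > > (Besides flush(), another way to make a hang visible is to bound the
> > > wait on the Future that send() returns, e.g. future.get(60,
> > > TimeUnit.SECONDS), which throws TimeoutException instead of blocking
> > > forever. A minimal sketch of that pattern using only
> > > java.util.concurrent - a stalled task stands in for a send whose
> > > acknowledgement never arrives; no Kafka dependency is assumed:)

```java
import java.util.concurrent.*;

public class BoundedWaitSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Stand-in for producer.send(record): a task whose result
        // (the acknowledgement) never arrives in time.
        Future<String> ack = pool.submit(() -> {
            Thread.sleep(10_000);  // simulated stalled broker
            return "ok";
        });

        try {
            // Bound the wait instead of blocking indefinitely.
            String result = ack.get(200, TimeUnit.MILLISECONDS);
            System.out.println("acked: " + result);
        } catch (TimeoutException e) {
            System.out.println("send timed out; record may still be in flight");
        } finally {
            pool.shutdownNow();  // interrupts the sleeping task
        }
    }
}
```

> > > (Note that calling get() on every send serializes the producer and costs
> > > throughput; the two-argument send(record, callback) reports failures
> > > without blocking.)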
> > >
> > >
> > > Any thoughts? Thanks!
> > >
> > > M. Manna <manme...@gmail.com> wrote on Tue, 4 Sep 2018 at 18:17:
> > >
> > > > When you say "blocked" - for how long did the blocking happen? Also,
> > > > have you considered flushing for immediate availability?
> > > >
> > > >
> > > >
> > > > On Tue, 4 Sep 2018 at 11:03, LEE, Tung-Yu <tun...@gmail.com> wrote:
> > > >
> > > > > Hello,
> > > > >
> > > > > We currently use Kafka 1.0.2 and find that sometimes
> > > > > KafkaProducer.send() blocks and doesn't throw any exception.
> > > > >
> > > > >
> > > > >
> > > > > Some code snippets and configuration are as follows.
> > > > >
> > > > > Any feedback is welcomed, thank you.
> > > > >
> > > > >
> > > > >
> > > > > Tung-Yu
> > > > >
> > > > >
> > > > >
> > > > > ##### code snippets ########
> > > > >
> > > > >
> > > > >
> > > > > Producer<String, String> producer = new KafkaProducer<>(props);
> > > > >
> > > > > try (BufferedReader br = new BufferedReader(
> > > > >         new FileReader(new File(targetFilePath)))) {
> > > > >     String line = br.readLine();
> > > > >     while (line != null) {
> > > > >         ProducerRecord<String, String> data =
> > > > >                 new ProducerRecord<>(topicName, line);
> > > > >         producer.send(data); // sometimes it blocks here
> > > > >         line = br.readLine();
> > > > >     }
> > > > > } catch (Exception e) {
> > > > >     logger.error(e.getMessage(), e);
> > > > > }
> > > > >
> > > > > producer.close();
> > > > >
> > > > >
> > > > >
> > > > > ##### code snippet ########
> > > > >
> > > > >
> > > > > #### some producer configuration ######
> > > > >
> > > > >
> > > > >
> > > > > request.timeout.ms=60000
> > > > >
> > > > > batch.size=20000
> > > > >
> > > > > buffer.memory=33554432
> > > > >
> > > > > retries=3
> > > > >
> > > > >
> > > > >
> > > > > #### some producer configuration ######
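> > > > > (One setting not shown above is max.block.ms, which in Kafka 1.0.x
> > > > > bounds how long send() itself may block when buffer.memory is
> > > > > exhausted or topic metadata is unavailable; after it elapses, send()
> > > > > throws a TimeoutException rather than hanging. Its default is
> > > > > 60000 ms, so a truly indefinite hang suggests something else, but
> > > > > setting it explicitly documents the bound. A hypothetical addition:)

```
# Bound how long send() may block on a full buffer or missing metadata.
# 60000 ms is already the 1.0.x default; shown here only to make it explicit.
max.block.ms=60000
```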
> > > > >
> > > >
> > >
> >
>
