Yup. I hacked together a small bash script to do it for all files and per file
as well.
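
A rough sketch of such a script might look like the following (the log
directory is a placeholder, and it assumes kafka-run-class.sh is on the PATH
and kafka.tools.DumpLogSegments as the dump tool):

#!/usr/bin/env bash
# Directory holding the log segments for one topic-partition (placeholder).
LOG_DIR=/var/kafka-logs/my-topic-0

# Dump all segments in one go: build the comma-separated list for --files.
FILES=$(ls "$LOG_DIR"/*.log | paste -sd, -)
kafka-run-class.sh kafka.tools.DumpLogSegments --print-data-log \
  --files "$FILES" > all-segments.txt

# Or dump each segment into its own output file.
for f in "$LOG_DIR"/*.log; do
  kafka-run-class.sh kafka.tools.DumpLogSegments --print-data-log \
    --files "$f" > "$(basename "$f").txt"
done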

Thanks.

On Thu, Mar 23, 2017 at 2:31 PM, Marko Bonaći <marko.bon...@sematext.com>
wrote:

> You can use something like this to get a comma-separated list of all files
> in a folder:
>
> ls -l | awk '{print $9}' ORS=','
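>
> One thing to watch out for: the first line of ls -l output is the "total"
> summary, whose ninth field is empty, so the list starts with a stray comma
> (and also ends with a trailing one). A variant that avoids both, assuming
> the segments live under a placeholder directory, would be:
>
> ls /var/kafka-logs/my-topic-0/*.log | paste -sd, -
>
> The resulting list can then be passed straight to the tool's --files option.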
>
> Marko Bonaći
> Monitoring | Alerting | Anomaly Detection | Centralized Log Management
> Solr & Elasticsearch Support
> Sematext <http://sematext.com/> | Contact
> <http://sematext.com/about/contact.html>
>
> On Thu, Mar 23, 2017 at 9:28 PM, Milind Vaidya <kava...@gmail.com> wrote:
>
> > That looks like a faster option.
> >
> > Now the thing is that --files requires a comma-separated list of files.
> > Is there any way to point it at all the files in a directory?
> >
> >
> > I tried *.log, but that did not work. Or will I have to script something
> > to do that?
> >
> > On Sat, Mar 4, 2017 at 9:04 PM, Guozhang Wang <wangg...@gmail.com>
> wrote:
> >
> > > Hi Milind,
> > >
> > > You can try the DumpLogSegments tool to read the logs on the broker
> > > machines directly as well:
> > >
> > > https://cwiki.apache.org/confluence/display/KAFKA/System+Tools#SystemTools-DumpLogSegment
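> > >
> > > As a minimal sketch (the segment path is just a placeholder), an
> > > invocation looks roughly like:
> > >
> > > kafka-run-class.sh kafka.tools.DumpLogSegments --print-data-log \
> > >   --files /var/kafka-logs/my-topic-0/00000000000000000000.log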
> > >
> > > Guozhang
> > >
> > > On Sat, Mar 4, 2017 at 9:48 AM, Anish Mashankar <
> > an...@systeminsights.com>
> > > wrote:
> > >
> > > > Try Presto https://prestodb.io. It may solve your problem.
> > > >
> > > > On Sat, 4 Mar 2017, 03:18 Milind Vaidya, <kava...@gmail.com> wrote:
> > > >
> > > > > I have a 6-broker Kafka setup.
> > > > >
> > > > > The retention period is 48 hours.
> > > > >
> > > > > To check whether certain data has reached Kafka, I am using the
> > > > > command-line consumer and piping its output to grep. But that takes
> > > > > a huge amount of time and may not succeed either.
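> > > > >
> > > > > Such a pipeline looks roughly like this (broker address, topic name
> > > > > and search string are placeholders):
> > > > >
> > > > > kafka-console-consumer.sh --bootstrap-server broker1:9092 \
> > > > >   --topic my-topic --from-beginning | grep "order-12345"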
> > > > >
> > > > > Is there another way to search for something in Kafka without using
> > > > > a consumer?
> > > > >
> > > > --
> > > >
> > > > Regards,
> > > > Anish Samir Mashankar
> > > > R&D Engineer
> > > > System Insights
> > > > +91-9789870733
> > > >
> > >
> > >
> > >
> > > --
> > > -- Guozhang
> > >
> >
>
