On Fri, Apr 01, 2016 at 10:08:47AM PDT, N.J. Thomas spake thusly:
> * Tracy Reed [2016-03-31 12:15:55-0700]:
> > I have done a lot of work with Splunk also and have seriously mixed
> > feelings about it.
>
> Apart from the cost, can you expand on that?
Their demo is awesome and it is theoretical
* Tracy Reed [2016-03-31 12:15:55-0700]:
> I have done a lot of work with Splunk also and have seriously mixed
> feelings about it.
Apart from the cost, can you expand on that?
thanks,
Thomas
___
Tech mailing list
Tech@lists.lopsa.org
https://lists.lop
Have you tried grep -A 10 -B 10 *.log? or if they are gzipped
use zgrep.
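A quick self-contained sketch of that (the log name and call ID here are made up, not from Asterisk):

```shell
# Fabricate a tiny log, gzip a copy, then search both with context.
printf 'line1\ncall C-1234 start\nline3\n' > /tmp/demo.log
gzip -kf /tmp/demo.log                       # -k keeps /tmp/demo.log, writes /tmp/demo.log.gz
grep  -A 10 -B 10 'C-1234' /tmp/demo.log     # plain file, 10 lines of context each side
zgrep -A 10 -B 10 'C-1234' /tmp/demo.log.gz  # same search on the gzipped copy
```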
Thanks!
Leila Kemery
On Thu, Mar 31, 2016 at 3:46 AM, Simon Lyall wrote:
> I'm looking for a tool that might handle this nicely.
>
> I have some asterisk log files that are generated by daemontools'
> multilog. The problem is that daemontools rotates logs every few
> minutes at the volumes I do so a single call can be scattered across
> many files.
-Pete
From: on behalf of Graham Dunn
Date: Thursday, March 31, 2016 at 06:43
To: Simon Lyall
Cc: LOPSA Tech List
Subject: Re: [lopsa-tech] Tool for searching and browsing log files.
I've found a small ELK stack reasonable (Kibana is pretty useful for finding
stuff), but needs to be monitored lots (ie, logstash will stop working for
no reason, same with elasticsearch).
On Thu, Mar 31, 2016 at 03:46:53AM PDT, Simon Lyall spake thusly:
> * logstash and Elasticsearch would probably be the longer term option
> although there doesn't seem to be a good built-in asterisk filter for
> grok.
I have been very happy with and have done wonderful things with the ELK
(elasticsearch, logstash, Kibana) stack.
On Thu, Mar 31, 2016 at 11:46:53PM +1300, Simon Lyall wrote:
> I'm looking for a tool that might handle this nicely.
>
> I have some asterisk log files that are generated by daemontools'
> multilog. The problem is that daemontools rotates logs every few
> minutes at the volumes I do so a single call can be scattered across
> many files.
As I usually go the straightforward route, I would start with grep. I
routinely have to search multiple files and would do
grep "the search string" *filename
The above assumes that the file names are common with maybe a date.
To make things manageable I would do
grep "the search string" *filename
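For multilog directories specifically, the archived files are named
@<tai64-timestamp>.s, so a plain shell glob already sorts them
chronologically. A sketch, assuming a typical daemontools layout (the
path and call ID below are made up):

```shell
# Concatenate rotated files plus 'current' in time order, then grep
# with context; the glob order is chronological because tai64 names
# sort lexicographically.
cat /service/asterisk/log/main/@*.s /service/asterisk/log/main/current \
  | grep -B 5 -A 5 'C-0000abcd'
```

If you want human-readable timestamps, pipe through tai64nlocal (ships
with daemontools) before the grep.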
Big fan of the ELK stack. We're in the process of implementing it here. You
should be able to get the cohesion across files by tagging or custom
fields. It depends on what your source data looks like and I'm not very
familiar with Asterisk or its logs.
Also, don't underestimate the value of using L
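A sketch of what that tagging might look like in a logstash filter. The
grok pattern and field name here are pure invention (as noted upthread,
there's no stock Asterisk pattern), so treat this as a starting point
only:

```
filter {
  grok {
    # invented pattern: extract a call id so Kibana can group every
    # line belonging to one call, regardless of which file it landed in
    match => { "message" => "C-%{BASE16NUM:call_id}" }
  }
  mutate {
    add_tag => [ "asterisk" ]
  }
}
```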
Ditto on Splunk. If you work for a non-profit or education they have a
nice discount.
cheers,
ski
On 03/31/2016 06:43 AM, Graham Dunn wrote:
> I've found a small ELK stack reasonable (Kibana is pretty useful for
> finding stuff), but needs to be monitored lots (ie, logstash will stop
> working for no reason, same with elasticsearch).
On 2016-03-31 09:15, Guus Snijders wrote:
> The first thing that comes to mind is grep, with -A and -B (after/before)
> parameters. Not sure how it will perform with such big datasets, but it's
> probably a lot quicker than vi ;).
If you are going to use grep, I strongly suggest that you take a lo
On 31 Mar. 2016 12:47, "Simon Lyall" wrote:
>
> I'm looking for a tool that might handle this nicely.
>
> I have some asterisk log files that are generated by daemontools'
> multilog. The problem is that daemontools rotates logs every few minutes at
> the volumes I do so a single call can be scattered across many files.
I've found a small ELK stack reasonable (Kibana is pretty useful for
finding stuff), but needs to be monitored lots (ie, logstash will stop
working for no reason, same with elasticsearch). TBH, splunk is very very
good at this and easy to set up. It *can* be expensive, but if it's worth
money to you.