J wrote:
> Hello,
Hello, J,
This is totally off-topic, but I was wondering why you are posting
duplicate (sometimes triplicate) messages all over the place?
Your reply-to is set to comp.lang.pyt...@googlegroups.com, and you
cc to python-list@python.org... and your stuff is showing up in
newsgroups
On May 17, 11:07 am, J wrote:
> Hello,
>
> I have managed to get my script finished in the end by taking bits from
> everyone who answered. Thank you so much. The finished script looks
> like this (still not the best, but it gets the job done). Once I learn to code
> more with Python I w
Hello,
I have managed to get my script finished in the end by taking bits from
everyone who answered. Thank you so much. The finished script looks
like this (still not the best, but it gets the job done). Once I learn to code
more with Python I will probably go back to it and rewrite it.
On 16/05/2011 09:19, J wrote:
[snip]
#!/usr/bin/python
# Import the regular-expression module
import re as regex
# Log file to work on
filetoread = open('/tmp/pdu_log.log', "r")
# File to write output to
filetowrite = open('/tmp/pdu_log_clean.log', "w")
# Perform filtering on the log file: read it in as a list of lines
linetoread = filetoread.readlines()
This doesn't directly bear upon the posted example, but I found the
following tutorial extremely helpful for learning how to parse log
files with idiomatic Python. Maybe you'll find it useful, too.
http://www.dabeaz.com/generators/
http://www.dabeaz.com/generators/Generators.pdf
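In the spirit of that tutorial, a generator pipeline for log filtering might look like the sketch below. The function name and the sample lines are made up for illustration; only the line format is taken from elsewhere in this thread.

```python
import re

def matching(lines, pattern):
    """Yield only the lines whose text matches pattern."""
    pat = re.compile(pattern)
    return (line for line in lines if pat.search(line))

# Chained generators process the log lazily, line by line, so even a
# huge file never has to be held in memory all at once.
sample = [
    "2011-05-16 09:46:22,361 [Thread-4847133] PDU D",
    "2011-05-16 09:46:23,100 [Thread-4847134] OTHER X",
]
for line in matching(sample, r"\bPDU\b"):
    print(line)  # prints only the first sample line
```

The same `matching()` call accepts an open file object in place of the list, since files iterate line by line.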
J writes:
> cat logs/pdu_log_fe.log | awk -F\- '{print $1,$NF}' | awk -F\. '{print
> $1,$NF}' | awk '{print $1,$4,$5}' | sort | uniq | while read service command
> status; do echo "Service: $service, Command: $command, Status: $status,
> Occurrences: `grep $service logs/pdu_log_fe.log | grep $
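The quoted awk/sort/uniq pipeline can be approximated in a single Python pass. This is only a sketch: the field indices, the triple of names, and the sample lines are assumptions based on the quoted command and the log line shown elsewhere in the thread, not J's actual layout.

```python
from collections import Counter

def count_triples(lines):
    """Count how often each (service, command, status) triple occurs,
    replacing the sort | uniq | grep stages with one in-memory pass."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) < 6:
            continue  # skip lines that don't match the expected layout
        # Field positions are placeholders; adjust to the real format.
        service, command, status = fields[5], fields[3], fields[4]
        counts[(service, command, status)] += 1
    return counts

# Hypothetical sample lines in the shape quoted in this thread
# (a real script would iterate over open('logs/pdu_log_fe.log')):
sample = [
    "2011-05-16 09:46:22,361 [Thread-4847133] PDU D CC_SMS_SERVICE_514",
    "2011-05-16 09:46:23,100 [Thread-4847133] PDU D CC_SMS_SERVICE_514",
]
for (service, command, status), n in sorted(count_triples(sample).items()):
    print("Service: %s, Command: %s, Status: %s, Occurrences: %d"
          % (service, command, status, n))
```

Because the counting happens in one dictionary, the log is read exactly once, instead of once per unique triple as the `grep` inside the shell loop does.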
Thanks for the suggestions, Peter. I will give them a try.
Peter Otten wrote:
> J wrote:
>
> > Hello Peter, Angelico,
> >
> > Ok lets see, My aim is to filter out several fields from a log file and
> > write them to a new log file. The current log file, as I mentioned
> > previously, has thousands o
J wrote:
> Hello Peter, Angelico,
>
> Ok lets see, My aim is to filter out several fields from a log file and
> write them to a new log file. The current log file, as I mentioned
> previously, has thousands of lines like this:- 2011-05-16 09:46:22,361
> [Thread-4847133] PDU D CC_SMS_SERVICE_514
On Mon, 16 May 2011 03:57:49 -0700, J wrote:
> Most of the fields are separated by
> spaces except for couple of them which I am processing with AWK
> (removing " to do is evaluate each line in the log file and break them down into
> fields which I can call individually and write them to a new log
Hello Peter, Angelico,
OK, let's see. My aim is to filter out several fields from a log file and write
them to a new log file. The current log file, as I mentioned previously, has
thousands of lines like this:
2011-05-16 09:46:22,361 [Thread-4847133] PDU D CC_SMS_SERVICE_514
All the lines in the log file are similar.
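One minimal way to break a line of that shape into fields is a named-group regex. The group names below are illustrative guesses of mine, not names taken from J's script.

```python
import re

# Pattern for lines like the sample quoted above:
#   date  time  [thread]  remaining fields
LINE = re.compile(
    r"(?P<date>\S+)\s+(?P<time>\S+)\s+"
    r"\[(?P<thread>[^\]]+)\]\s+"
    r"(?P<rest>.*)"
)

m = LINE.match("2011-05-16 09:46:22,361 [Thread-4847133] PDU D CC_SMS_SERVICE_514")
print(m.group("date"), m.group("thread"), m.group("rest"))
# -> 2011-05-16 Thread-4847133 PDU D CC_SMS_SERVICE_514
```

Once each field has a name, writing a cleaned line to the output file is just string formatting, with no awk round trips.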
J wrote:
> Good morning all,
> Wondering if you could please help me with the following query:-
> I have just started learning Python last weekend after a colleague of mine
> showed me how to dramatically cut the time a Bash script takes to execute
> by re-writing it in Python. I was amazed at ho
On Mon, May 16, 2011 at 6:43 PM, J wrote:
> Good morning Angelico,
> Do I understand correctly? Do you mean incorporating a Python dict inside the
> AWK command? How can I do this?
No, inside Python. What I mean is that you can achieve the same
uniqueness requirement by simply storing the intermediate results in a dict.
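Angelico's suggestion of storing intermediate results in a dict can be sketched like this (a minimal illustration with made-up records, not his actual code):

```python
# A dict keyed on the record gives uniqueness and counting in one pass,
# replacing the external sort | uniq stage of the shell pipeline.
counts = {}
for record in ["PDU D", "PDU D", "PDU E"]:
    counts[record] = counts.get(record, 0) + 1
print(counts)  # {'PDU D': 2, 'PDU E': 1}
```

Each distinct record appears exactly once as a key, so uniqueness falls out for free, and the value tracks the occurrence count at the same time.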
Good morning Angelico,
Do I understand correctly? Do you mean incorporating a Python dict inside the
AWK command? How can I do this?
--
http://mail.python.org/mailman/listinfo/python-list
On Mon, May 16, 2011 at 6:19 PM, J wrote:
> cat logs/pdu_log_fe.log | awk -F\- '{print $1,$NF}' | awk -F\. '{print
> $1,$NF}' | awk '{print $1,$4,$5}' | sort | uniq | while read service command
> status; do echo "Service: $service, Command: $command, Status: $status,
> Occurrences: `grep $servi