From: John Doe
> A quick and dirty example (only prints the extra duplicate lines; not the
> original duplicate):
> awk -F: ' { v[$3]=v[$3]+1; if (v[$3]>1) print $0; } ' datafile
Here's the version with the 1st duplicate included:
awk -F: ' { v[$3]=v[$3]+1; if (v[$3] == 1) { f[$3]=$0; } else { if (v[$3] == 2) { print f[$3]; } print $0; } } ' datafile
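For anyone who wants to try it, here is a self-contained run of that idea on the sample data from the thread; the /tmp path is just for illustration:

```shell
# Sample data from the thread, saved to a scratch file (path is arbitrary).
cat > /tmp/datafile <<'EOF'
host6:dev406mum.dd.mum.test.com:22:11:11:no
host7:dev258mum.dd.mum.test.com:36:17:19:no
host7:dev258mum.dd.mum.test.com:36:17:19:no
EOF

# Count field 3; on the second sighting, print the stashed first
# occurrence, then every occurrence from there on.
awk -F: '{ v[$3]++
           if (v[$3] == 1) f[$3] = $0
           else { if (v[$3] == 2) print f[$3]; print $0 } }' /tmp/datafile
# prints both host7 lines
```
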
On Wed, Oct 28, 2009 at 10:39:41PM +0530, Truejack wrote:
>
>Need scripting help to sort out a list and list all the duplicate lines.
>
>My data looks something like this
>
>host6:dev406mum.dd.mum.test.com:22:11:11:no
>host7:dev258mum.dd.mum.test.com:36:17:19:no
A key to your
> m.r...@5-cent.us wrote:
>>> Need scripting help to sort out a list and list all the duplicate
>>> lines.
>>>
>>> My data looks something like this
>>>
>>> host6:dev406mum.dd.mum.test.com:22:11:11:no
>>> host7:dev258mum.dd.mum.test.com:36:17:19:no
>>> host7:dev258mum.dd.mum.test.com:36:17:19:no
m.r...@5-cent.us wrote:
>> Need scripting help to sort out a list and list all the duplicate lines.
>>
>> My data looks something like this
>>
>> host6:dev406mum.dd.mum.test.com:22:11:11:no
>> host7:dev258mum.dd.mum.test.com:36:17:19:no
>> host7:dev258mum.dd.mum.test.com:36:17:19:no
>> host17:de
> Need scripting help to sort out a list and list all the duplicate lines.
>
> My data looks something like this
>
> host6:dev406mum.dd.mum.test.com:22:11:11:no
> host7:dev258mum.dd.mum.test.com:36:17:19:no
> host7:dev258mum.dd.mum.test.com:36:17:19:no
> host17:dev258mum.dd.mum.test.com:31:17:19
I think it can be optimized, and if the programming language doesn't matter:

#!/usr/bin/python
file = "test.txt"
fl = open(file, 'r')
toParse = fl.readlines()
fl.close()
duplicates = []
# third fields (the key) already seen once
firstOne = []
for ln in toParse:
    ln = ln.strip()
    lnMap = ln.split(':')
    target = lnMap[2]
    if target in firstOne:
        duplicates.append(ln)
    else:
        firstOne.append(target)
for ln in duplicates:
    print(ln)
On 2009-10-28 18:09, Truejack wrote:
> Need scripting help to sort out a list and list all the duplicate lines.
>
> My data looks something like this
>
> host6:dev406mum.dd.mum.test.com:22:11:11:no
> host7:dev258mum.dd.mum.test.com:36:17:19:no
> host7:dev258mum.dd.mum.test.com:36:17:19:no
> ho
>
>From: Truejack
>To: centos@centos.org
>Sent: Wed, October 28, 2009 6:09:41 PM
>Subject: [CentOS] Scripting help please
>
>Need scripting help to sort out a list and list all the duplicate lines.
>
>My data looks something like this
>
>host6:dev4
2009/10/28 Neil Aggarwal :
> I dont know how to do this in a script.
Could be a job for awk.
Bit too busy at work to look into it further at the moment though.
Ben
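If awk feels heavy, the same check can be sketched with cut/sort/uniq instead; a rough take on the sample data, treating field 3 as the key (the scratch path is made up):

```shell
# Sample data from the thread.
cat > /tmp/hosts.txt <<'EOF'
host6:dev406mum.dd.mum.test.com:22:11:11:no
host7:dev258mum.dd.mum.test.com:36:17:19:no
host7:dev258mum.dd.mum.test.com:36:17:19:no
host17:dev258mum.dd.mum.test.com:31:17:19:no
EOF

# uniq -d keeps one copy of each value that occurs more than once.
for key in $(cut -d: -f3 /tmp/hosts.txt | sort | uniq -d); do
    # Pull back every full line whose third field matches that key.
    grep "^[^:]*:[^:]*:$key:" /tmp/hosts.txt
done
# prints both host7 lines (key 36)
```

This is slower than a single awk pass (it rescans the file per duplicated key), but it keeps each step inspectable on its own.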
___
CentOS mailing list
CentOS@centos.org
http://lists.centos.org/mailman/listinfo/cento
From: centos-boun...@centos.org [mailto:centos-boun...@centos.org] On Behalf
Of Truejack
Sent: Wednesday, October 28, 2009 12:10 PM
To: centos@centos.org
Subject: [CentOS] Scripting help please
Need scripting help to sort out a list and list all the duplicate lines.
My data looks something like this
Need scripting help to sort out a list and list all the duplicate lines.
My data looks something like this
host6:dev406mum.dd.mum.test.com:22:11:11:no
host7:dev258mum.dd.mum.test.com:36:17:19:no
host7:dev258mum.dd.mum.test.com:36:17:19:no
host17:dev258mum.dd.mum.test.com:31:17:19:no
host12:dev