Hello,

I need to gather information that is contained in various files.

Like so:

file1:
=====================
foo : 1 2
bar : 2 4
baz : 3
=====================

file2:
=====================
foo : 5
bar : 6
baz : 7
=====================

file3:
=====================
foo : 4 18
bar : 8
=====================


The straightforward way to solve this problem is to build a dictionary
that maps each key to the list of its values. Like so:


[...]

a, b = get_information(line)
if a in data.keys():        # 'data' is the dictionary (renamed to avoid shadowing the built-in 'dict')
    data[a].append(b)
else:
    data[a] = [b]
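
To make it concrete, a simplified, self-contained version of what I am
doing would be something like the following (get_information() here just
splits a line on the ':', and the glob pattern merely stands in for
however the actual file names are collected):

import glob

def get_information(line):
    # split "foo : 1 2" into ("foo", "1 2")
    key, _, values = line.partition(":")
    return key.strip(), values.strip()

data = {}
for filename in glob.glob("file*"):
    for line in open(filename):
        line = line.strip()
        if ":" not in line:
            continue                # skip blank lines and the ===== separators
        a, b = get_information(line)
        if a in data.keys():
            data[a].append(b)
        else:
            data[a] = [b]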


However, I have 43 such files, together about 4.1 MB in size, and in the
future they will probably become much larger. At the moment the process
takes several hours, and since I have to run it very often, I would like
it to be faster.

How could the problem be solved more efficiently?


Klaus