Thanks Fredrik,
very nice examples.
André
AMD wrote:
For reading delimited fields in Python, you can use the .split string
method.
Yes, that is what I use right now, but I still have to do the
conversions to integers, floats, and dates as several separate steps. What
is nice about the scanf
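The split-then-convert pattern described above can be sketched as follows; the three-field "name count price" record layout is an invented example, not from the thread:

```python
# Minimal sketch of split() followed by per-field conversion;
# the "name count price" record layout is hypothetical.
line = "widget 12 3.75"
name, count, price = line.split()
count = int(count)    # second field converted to integer
price = float(price)  # third field converted to float
```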
In message <[EMAIL PROTECTED]>, AMD wrote:
Actually it is quite common; it is used for processing files, not for
reading parameters. You can use it whenever you need to read a simple
CSV file or a fixed-format file which contains many lines with several
fields per line.
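For the simple-CSV case mentioned above, the stdlib csv module already handles the field splitting; a small sketch, where the (label, int, float) column layout is an assumption for illustration:

```python
import csv
import io

# Sketch: parse CSV lines with the stdlib csv module, then convert
# fields by hand; the (label, int, float) column layout is invented.
data = io.StringIO("a,1,2.5\nb,2,3.5\n")
rows = [(label, int(n), float(x)) for label, n, x in csv.reader(data)]
```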
I do that all th
AMD wrote:
I had seen this pure-Python implementation, but it is not as fast or
as elegant as an implementation written in C directly within
Python, with no need for an import.
maybe you should hold off on disparaging comments about how Python is not
what you want it to be until you
Robert Kern wrote:
AMD wrote:
Hello,
I often need to parse strings which contain a mix of characters,
integers, and floats; the C-language scanf function is very practical
for this purpose.
I've been looking for such a feature and I have been quite surprised
to find that it has
I'm pretty certain Python won't grow an additional operator for this.
Yet you are free to create a scanf implementation as a third-party module.
IMHO the usability of the approach is very limited though. First of all,
the need to capture more than one input token arises *very* seldom - nearly
all co
Hello,
I often need to parse strings which contain a mix of characters,
integers, and floats; the C-language scanf function is very practical for
this purpose.
I've been looking for such a feature and I have been quite surprised to
find that it has been discussed as far back as 2001 but never
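In the absence of a built-in scanf, one common substitute is a regular expression whose groups are converted by hand; a minimal sketch, where both the pattern and the sample line are made up for illustration:

```python
import re

# Sketch of scanf-like extraction with the re module; the pattern and
# the sample line are invented, not taken from the thread.
line = "x = 42 y = 3.14"
m = re.match(r"x = (\d+) y = (\d+\.\d+)", line)
x = int(m.group(1))    # integer field
y = float(m.group(2))  # float field
```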
Thank you everyone,
I ended up using a solution similar to what Gary Herron suggested:
caching the output in a list of lists, one per file, and only doing the
IO when a list reaches a certain threshold.
After playing around with the list threshold I ended up with faster
execution times than
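The caching scheme described above might look roughly like this; the THRESHOLD value and the append-mode writes are assumptions, not details from the thread:

```python
from collections import defaultdict

# Sketch of the buffer-then-flush idea: lines are accumulated per output
# file and written in one batch once a buffer reaches THRESHOLD lines.
# THRESHOLD is an arbitrary choice here.
THRESHOLD = 1000
buffers = defaultdict(list)

def flush(path):
    """Append all buffered lines for `path` and clear its buffer."""
    with open(path, "a") as f:
        f.writelines(buffers[path])
    buffers[path].clear()

def add_line(path, line):
    buffers[path].append(line)
    if len(buffers[path]) >= THRESHOLD:
        flush(path)
```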
Hello,
I need to split a very big file (10 gigabytes) into several thousand
smaller files according to a hash algorithm; I do this one line at a
time. The problem I have is that opening a file in append mode, writing
the line, and closing the file is very time consuming. I'd rather have
the files
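One way to map each line to one of several thousand output files, as described above, is a stable hash modulo the file count; a sketch in which N and the "bucket_XXXX.txt" naming scheme are invented for illustration:

```python
import hashlib

# Sketch: pick a bucket file per line via a stable hash; N and the
# "bucket_XXXX.txt" naming scheme are assumptions.
N = 1000

def bucket_for(line):
    digest = hashlib.md5(line.encode("utf-8")).hexdigest()
    return "bucket_%04d.txt" % (int(digest, 16) % N)
```

Using a stable hash (rather than Python's built-in `hash`, which is salted per process) keeps the line-to-file mapping reproducible across runs.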
Thanks Marc,
I just tried shelve but it is very slow :(
I haven't tried the dbs yet.
Andre
Marc 'BlackJack' Rintsch wrote:
> On Mon, 15 Oct 2007 11:31:59 +0200, amdescombes wrote:
>
>> Are there any classes that implement disk based dictionaries?
>
> Take a look at the `shelve` module from
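Basic usage of the suggested shelve module looks like this; the temporary file path is arbitrary, chosen only for demonstration:

```python
import os
import shelve
import tempfile

# Minimal shelve sketch: a dict-like object persisted on disk.
# The temporary path is only for demonstration.
path = os.path.join(tempfile.mkdtemp(), "cache")
with shelve.open(path) as db:
    db["key"] = [1, 2, 3]   # values are pickled to disk
with shelve.open(path) as db:
    value = db["key"]
```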
Hi Brad,
I do the reading one line at a time, the problem seems to be with the
dictionary I am creating.
Andre
> amdescombes wrote:
>> Hi,
>>
>> I am using Python 2.5.1
>> I have an application that reads a file and generates a key in a
>> dictionary for each line it reads. I have managed to r