John, thanks for your reply. I will then use the files as input to generate
an index. So the files are temporary, and provide some attributes in the
index. So I do this multiple times to gather different attributes, merge, etc.
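A minimal sketch of that kind of multi-pass merge, assuming each temporary
file holds tab-separated doc_id / attribute / value lines (the file names and
the format are only illustrative, not taken from the thread):

    import csv
    from collections import defaultdict

    def merge_pass(index, attr_file):
        # Merge one temporary attribute file into the in-memory index.
        # Each row is assumed to be: doc_id, attribute name, value.
        with open(attr_file, newline="") as f:
            for doc_id, attr, value in csv.reader(f, delimiter="\t"):
                index[doc_id][attr] = value
        return index

    # One pass per temporary file; the index accumulates attributes.
    index = defaultdict(dict)
    for path in ["titles.tsv", "dates.tsv", "sizes.tsv"]:  # hypothetical files
        merge_pass(index, path)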
"John Machin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
On May 27, 11:24 am, "Jack" <[EMAIL PROTECTED]> wrote:
> I'll save them in a file for further processing.
Further processing would be what?
Did you read the remainder of what I wrote?
I'll save them in a file for further processing.
"John Machin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> On May 26, 6:17 pm, "Jack" <[EMAIL PROTECTED]> wrote:
>> I have tens of millions (could be more) of documents in files. Each of
>> them has other properties in separate files.
On May 26, 6:17 pm, "Jack" <[EMAIL PROTECTED]> wrote:
> I have tens of millions (could be more) of documents in files. Each of them
> has other properties in separate files. I need to check if they exist,
> update and merge properties, etc.
And then save the results where?
Option (0) retain it in
Jack wrote:
> "John Nagle" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]
>> Jack wrote:
>>> I need to process a large amount of data. The data structure fits well
>>> in a dictionary but the amount is large - close to or more than the size
>>> of physical memory. I wonder what will happen if I try to load the data
>>> into a dictionary. Will Python use swap memory or will it fail?
In <[EMAIL PROTECTED]>, Jack wrote:
> I have tens of millions (could be more) of documents in files. Each of them
> has other properties in separate files. I need to check if they exist,
> update and merge properties, etc.
> And this is not a one time job. Because of the quantity of the files, I
>
If swap memory cannot handle this efficiently, I may need to partition the
data to multiple servers and use RPC to communicate.
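A rough sketch of what that partitioning could look like, assuming a fixed
list of worker servers that each hold one shard of the dictionary and expose
an XML-RPC method; the server URLs and the update_record method are invented
for illustration:

    import hashlib
    import xmlrpc.client  # xmlrpclib on Python 2

    # Hypothetical shard servers, each owning part of the key space.
    SERVERS = [
        xmlrpc.client.ServerProxy("http://node1:8000"),
        xmlrpc.client.ServerProxy("http://node2:8000"),
    ]

    def shard_for(key):
        # Hash the key so a given key always lands on the same server.
        h = int(hashlib.md5(key.encode("utf-8")).hexdigest(), 16)
        return SERVERS[h % len(SERVERS)]

    def update_record(key, props):
        # The remote side would merge `props` into its local dict for `key`.
        return shard_for(key).update_record(key, props)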
"Dennis Lee Bieber" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> On Fri, 25 May 2007 11:11:28 -0700, "Jack" <[EMAIL PROTECTED]>
> declaimed the following i
I suppose I can, but it won't be very efficient. I can have a smaller
hashtable, process the records that are in the hashtable, and save the ones
that are not in the hashtable for another round of processing. But a chunked
hashtable won't work that well because you don't know if they exist in other
chunks.
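One way to sketch that multi-round idea (the record format, file naming and
chunking function here are assumptions, not something proposed in the thread):

    import zlib

    def process_in_chunks(input_path, num_chunks, handle_record):
        # Process key<TAB>value records in several passes so that only
        # about 1/num_chunks of the keys sit in the in-memory table at once;
        # everything else is rewritten to a leftover file for a later pass.
        current = input_path
        for chunk in range(num_chunks):
            table = {}
            leftover_path = "%s.pass%d" % (input_path, chunk)
            with open(current) as src, open(leftover_path, "w") as leftover:
                for line in src:
                    key, value = line.rstrip("\n").split("\t", 1)
                    if zlib.crc32(key.encode("utf-8")) % num_chunks == chunk:
                        table[key] = value      # belongs to this round
                    else:
                        leftover.write(line)    # postpone to a later round
            for key, value in table.items():
                handle_record(key, value)
            current = leftover_path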
I have tens of millions (could be more) of documents in files. Each of them
has other properties in separate files. I need to check if they exist,
update and merge properties, etc.
And this is not a one-time job. Because of the quantity of the files, I
think querying and updating a database will
Jack wrote:
> I need to process a large amount of data. The data structure fits well
> in a dictionary but the amount is large - close to or more than the size
> of physical memory. I wonder what will happen if I try to load the data
> into a dictionary. Will Python use swap memory or will it fail?
>
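As a side note to the question quoted above: CPython will simply keep
allocating, so the dictionary grows into whatever virtual memory (swap) the
OS provides and only raises MemoryError if an allocation actually fails; the
practical problem is usually how slow things get once swapping starts. A very
rough way to gauge the footprint beforehand, using an invented sample
key/value shape and scaling up (sys.getsizeof ignores anything the values
reference, so treat this as a lower bound):

    import sys

    def estimate_dict_bytes(sample_items, total_items):
        # Measure a small sample dict and scale the per-entry cost up.
        sample = dict(sample_items)
        size = sys.getsizeof(sample)
        size += sum(sys.getsizeof(k) + sys.getsizeof(v)
                    for k, v in sample.items())
        return size * total_items // max(len(sample), 1)

    # e.g. 10,000 sampled entries scaled to 50 million documents
    sample = (("doc%08d" % i, {"size": i}) for i in range(10000))
    print(estimate_dict_bytes(sample, 50 * 10**6))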
Larry Bates wrote:
> Jack wrote:
>> Thanks for the replies!
>>
>> Database will be too slow for what I want to do.
>>
>> "Marc 'BlackJack' Rintsch" <[EMAIL PROTECTED]> wrote in message
>> news:[EMAIL PROTECTED]
>>> In <[EMAIL PROTECTED]>, Jack wrote:
>>>
I need to process a large amount of data
On 5/25/07, Jack <[EMAIL PROTECTED]> wrote:
> I need to process a large amount of data. The data structure fits well
> in a dictionary but the amount is large - close to or more than the size
> of physical memory. I wonder what will happen if I try to load the data
> into a dictionary. Will Python use swap memory or will it fail?
Jack wrote:
> Thanks for the replies!
>
> Database will be too slow for what I want to do.
>
> "Marc 'BlackJack' Rintsch" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]
>> In <[EMAIL PROTECTED]>, Jack wrote:
>>
>>> I need to process a large amount of data. The data structure fits well
Thanks for the replies!
Database will be too slow for what I want to do.
"Marc 'BlackJack' Rintsch" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> In <[EMAIL PROTECTED]>, Jack wrote:
>
>> I need to process a large amount of data. The data structure fits well
>> in a dictionary but the amount is large - close to or more than the size
>> of physical memory.
In <[EMAIL PROTECTED]>, Jack wrote:
> I need to process a large amount of data. The data structure fits well
> in a dictionary but the amount is large - close to or more than the size
> of physical memory. I wonder what will happen if I try to load the data
> into a dictionary. Will Python use swap memory or will it fail?
On May 25, 10:50 am, "Jack" <[EMAIL PROTECTED]> wrote:
> I need to process a large amount of data. The data structure fits well
> in a dictionary but the amount is large - close to or more than the size
> of physical memory. I wonder what will happen if I try to load the data
> into a dictionary. Will Python use swap memory or will it fail?