On Sun, Jul 31, 2011 at 9:38 AM, Rama Rao Polneni wrote:
> Earlier I had many duplicate strings in different rows retrieved from
> the database.
> I created a list of the unique strings, and their indexes are used in
> the actual computations. So a lot of space is saved by having integers
> instead of strings.
Hi Ulrich,
Thanks for your idea.
I resolved the issue by using integers instead of strings.
Earlier I had many duplicate strings in different rows retrieved from
the database.
I created a list of the unique strings, and their indexes are used in
the actual computations. So a lot of space is saved by having integers
instead of strings.
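
A minimal sketch of the de-duplication approach described above; the
names index_of, unique_strings, intern_string and the sample rows are
illustrative, not from the original post:

# Map each distinct string to a small integer index so repeated values
# are stored only once; rows then carry integers instead of strings.
index_of = {}        # string -> integer index
unique_strings = []  # integer index -> string, to look the value back up

def intern_string(s):
    # Return the integer index for s, adding it on first sight.
    idx = index_of.get(s)
    if idx is None:
        idx = len(unique_strings)
        index_of[s] = idx
        unique_strings.append(s)
    return idx

# Example rows with many duplicate strings, as fetched from the database.
rows = [("ALPHA", 1), ("BETA", 2), ("ALPHA", 3), ("ALPHA", 4)]
compact_rows = [(intern_string(name), value) for name, value in rows]

print(compact_rows)       # [(0, 1), (1, 2), (0, 3), (0, 4)]
print(unique_strings[0])  # 'ALPHA', the original string is still recoverable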
On 07/06/2011 02:49 AM, Rama Rao Polneni wrote:
After storing 1.99 GB of data into the dictionary, Python stopped to
store the remaining data into the dictionary.
Is there any alternate solution to resolve this issue, like splitting
the dictionaries or writing the data to the hard disk instead of to
memory?
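
A hedged sketch of the "write the data to hard disk instead" idea, using
the standard-library shelve module; the file name and keys below are
made up for illustration, and sqlite3 would be another standard-library
option:

import shelve

# A shelf behaves like a dict but keeps its values in a file on disk,
# so only the entries currently being accessed have to live in memory.
db = shelve.open("metrics_cache.db")   # hypothetical file name
try:
    db["row:1"] = ("ALPHA", 42)        # shelve keys must be strings
    db["row:2"] = ("BETA", 7)
    print(db["row:1"])                 # ('ALPHA', 42)
finally:
    db.close()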
Rama Rao Polneni wrote:
> After storing 1.99 GB of data into the dictionary, Python stopped to
> store the remaining data into the dictionary.
Questions here:
- Which Python?
- "stopped to store" (you mean "stopped storing", btw): how does it
  behave? Hang? Throw exceptions? Crash right away?
> Memo
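
For the "Which Python?" question above: the ~2 GB ceiling is what a
32-bit process typically hits regardless of installed RAM. A quick way
to check the interpreter (just a suggestion, not from the original
thread):

import sys
import platform

print(sys.version)                  # interpreter version
print(platform.architecture()[0])   # '32bit' or '64bit'
# On a 32-bit build sys.maxsize is 2**31 - 1, so a single process
# cannot address much more than about 2 GB.
print(sys.maxsize > 2**32)          # True only on a 64-bit build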
Yes, the data is from a table, which is retrieved using some queries
into a cx_Oracle cursor. I am able to read the data row by row.
One more piece of information: I am able to load the data into the
dictionary by removing
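
A sketch of reading the cursor row by row while building the integer
index on the fly, assuming the cx_Oracle driver mentioned above; the
connection string, query, and batch size are illustrative only:

import cx_Oracle

conn = cx_Oracle.connect("user/password@host/service")  # hypothetical credentials
cursor = conn.cursor()
cursor.execute("SELECT name_col, value_col FROM some_table")  # hypothetical query

index_of, unique_strings = {}, []
compact_rows = []

# fetchmany() keeps only one modest batch of raw rows in memory at a
# time, instead of pulling the whole result set in with fetchall().
while True:
    batch = cursor.fetchmany(10000)
    if not batch:
        break
    for name, value in batch:
        idx = index_of.setdefault(name, len(unique_strings))
        if idx == len(unique_strings):
            unique_strings.append(name)
        compact_rows.append((idx, value))

cursor.close()
conn.close()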
On Wed, Jul 6, 2011 at 5:49 PM, Rama Rao Polneni wrote:
> Hi All,
>
> I am facing a problem when I am storing cursor-fetched (Oracle 10G)
> data into a dictionary.
> As I don't have the option to manipulate the data in Oracle 10G, I had
> to stick to Python to parse the data for some metrics.
>
> After storing 1.99 GB of data into the dictionary, Python stopped to
> store the remaining data into the dictionary.
Hi All,
I am facing a problem when I am storing cursor-fetched (Oracle 10G)
data into a dictionary.
As I don't have the option to manipulate the data in Oracle 10G, I had
to stick to Python to parse the data for some metrics.
After storing 1.99 GB of data into the dictionary, Python stopped to
store the remaining data into the dictionary.