Below is the code. I have also tried writing it row by row with xlsxwriter,
passing the argument constant_memory=True. I do not know if my looping code is
inefficient.
sheet_names = []
title = []
key = 0
for req_param in request.GET.get("Req-Tables").split(","):
    sheet_names.append(req_param)
    title.append(req_param)
    report_data = db_exec_query(param_table_map[req_param])
    sheet_names[key] = wb.active
    if key != 0:
        # sheet titles are truncated; compare the length, not the string
        sheet_names[key] = wb.create_sheet(
            title=title[key][:15] if len(title[key]) > 15 else title[key])
    print(req_param, sys.getsizeof(report_data))
    for r in report_data:
        try:
            sheet_names[key].append(r)
        except TypeError:
            # fall back to decoding each cell (Python 2 byte strings)
            sheet_names[key].append([str(c).decode('cp1252') for c in r])
    key = key + 1
wb.save(response)
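For comparison, here is a minimal sketch of the row-by-row xlsxwriter approach with constant_memory mentioned above. The `rows()` generator is a hypothetical stand-in for `db_exec_query()`, and the filename is made up; in constant_memory mode each row is flushed to disk as it is written, so memory use stays flat, but rows must be written strictly in order and each row only once.

```python
import xlsxwriter

def rows():
    # Hypothetical stand-in for db_exec_query(): yields tuples lazily,
    # so the full result set never sits in memory at once.
    for i in range(1000):
        yield (i, "name-%d" % i, i * 1.5)

# constant_memory=True streams each completed row to disk immediately.
wb = xlsxwriter.Workbook("report.xlsx", {"constant_memory": True})
ws = wb.add_worksheet("table1"[:31])  # sheet names are capped at 31 chars

for row_idx, record in enumerate(rows()):
    ws.write_row(row_idx, 0, record)  # rows must be written in order

wb.close()
```

With a server-side MySQL cursor feeding `rows()`, this pattern should handle 600k rows without holding them all in memory.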
On Wednesday, July 13, 2016 at 2:41:43 PM UTC-7, Chris Angelico wrote:
> On Thu, Jul 14, 2016 at 7:29 AM, vineeth menneni
> wrote:
> > Hi I am finding it difficult to create a excel sheet using openpyxl or
> > xlsxwriter. The problem is that i am loading a table data from MYSQL db
> > which has 600k rows and 15 columns (approximately 100mb data). The error
> > that the terminal shows is that "MemoryError". I just wanted to know if it
> > is possible to create a excel sheet with the above said data in less than
> > one minute using any packages in python.
> >
>
> You can probably build it progressively, row by row. That way, you
> shouldn't need to keep everything in memory at once. But beyond that,
> I can't say without seeing your code.
>
> ChrisA
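The progressive, row-by-row approach Chris describes maps onto openpyxl's write-only mode, which streams rows to disk instead of building the whole worksheet tree in memory. A minimal sketch, with hypothetical rows and filename standing in for the real query results:

```python
from openpyxl import Workbook

# write_only=True streams appended rows to disk rather than keeping
# every cell object in memory, which avoids MemoryError on big tables.
wb = Workbook(write_only=True)
ws = wb.create_sheet(title="table1"[:15])

for i in range(1000):
    # hypothetical rows standing in for the db_exec_query() result
    ws.append((i, "name-%d" % i, i * 1.5))

wb.save("report.xlsx")
```

Note that in write-only mode sheets are created with `create_sheet()` (there is no `wb.active` sheet to reuse) and rows can only be appended, not revisited.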
--
https://mail.python.org/mailman/listinfo/python-list