On Monday 27 April 2009 05:01:22 Carbon Man wrote:
> I have a program that is generated from a generic process. Its job is to
> check whether records (replicated from another system) exist in a local
> table, and if they don't, to add them. I have one of these programs for
> every table in the database. Everything works well until I get to the
> postcode table. The generated code is 5MB for a system with no current
> data. Normally the file would not be this big, as only the changes are
> copied over. Python just quits; I have tried stepping through the code in
> the debugger but it doesn't even start.
> I am thinking that dynamically generating the programs to run might not be
> such a good idea. It would be a shame to drop it, because the system needs
> to be generic and it runs from an XML file, so the resulting code could be
> pretty complex, and I am new to Python. The program did generate a .pyc,
> so it was able to compile.
> Thoughts anyone?
I agree with what most people here have said: don't generate Python code,
it's a bad idea. Put your data in CSV files instead (one per table, named
after the table, for instance). You might want to either have a separate
file with metadata (column names, etc.) or have the first line be a header:

"<col1_name:col1_type>","<col2_name:col2_type>",...
"<data_for_col1>","<data_for_col2>",...
"<data_for_col1>","<data_for_col2>",...

Then use the csv module (in the standard library) to read the file. If your
original data is in XML, you might want to parse it using lxml instead.

Cheers,
Emm
-- 
http://mail.python.org/mailman/listinfo/python-list
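To make the suggestion above concrete, here is a minimal sketch of reading
such a file with the csv module. The column names, type names, and sample
data are made up for illustration; adjust the converter table to whatever
types your tables actually use.

```python
import csv
import io

# Hypothetical sample: first line is a "name:type" header as described
# above, remaining lines are data rows.
sample = '''"code:str","population:int"
"2000","1234"
"3000","5678"
'''

# Map the type names used in the header to Python converters.
converters = {"str": str, "int": int, "float": float}

reader = csv.reader(io.StringIO(sample))
header = next(reader)
names, types = zip(*(col.split(":") for col in header))

# Build one dict per row, converting each field to its declared type.
rows = [
    {name: converters[t](value)
     for name, t, value in zip(names, types, row)}
    for row in reader
]

print(rows)
# → [{'code': '2000', 'population': 1234}, {'code': '3000', 'population': 5678}]
```

In the real program you would open the per-table file with
`open(filename, newline="")` instead of `io.StringIO`, and the row dicts
can then be compared against the local table to decide which records to
insert.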