On Friday, August 17, 2012 8:29:12 AM UTC-4, Mike Girard wrote:
>
> The data will be coming from a large XML file, so my script will parse 
> that and make inserts into several different tables. It's fairly 
> straightforward.
>
> So is it correct to say that -
>
> 1. There is no compelling reason to do this without the DAL, and
> 2. my options in the DAL are bulk_insert, looping db.query, and CSV import, 
> and that performance-wise they're similar? 
>
2 is correct, as long as you are going through the DAL; db.executesql would 
be the non-DAL way from within web2py (and of course, you could use your 
database's native facilities).
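
For reference, a minimal sketch of the row-by-row versus bulk styles being 
compared. This uses the stdlib sqlite3 module rather than web2py's DAL (the 
table and column names are invented for illustration), but the same 
distinction applies: many single-insert calls versus one batched call.

```python
import sqlite3

# Hypothetical schema, invented for this sketch (not from the thread).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT)")

rows = [("alpha",), ("beta",), ("gamma",)]

# Row-by-row: analogous to looping db.table.insert(...) through the DAL.
for (name,) in rows:
    conn.execute("INSERT INTO item (name) VALUES (?)", (name,))

# Batched: analogous in spirit to bulk_insert or a single executesql call --
# one driver call handles the whole list.
conn.executemany("INSERT INTO item (name) VALUES (?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM item").fetchone()[0]
print(count)  # 6 rows: 3 from the loop, 3 from executemany
```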

1 is correct if you are not doing this often - e.g., with the DAL it might 
take 10 minutes as opposed to 1 minute without it (just a guess, not based 
on any actual measurement). For a one-off import, so what?

If you do this once an hour, however, then DAL processing and individual 
record insertion (even via bulk_insert or CSV) might be too slow for you, 
and you would be better off looking at your database's native bulk-loading 
facilities.
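
A minimal sketch of the parse-and-bulk-load pipeline described above, using 
only the stdlib: xml.etree for parsing and sqlite3 standing in for the real 
database (a native loader like PostgreSQL's COPY or MySQL's LOAD DATA wants 
the same shape of batched rows). The element and column names are invented.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Invented sample standing in for the large XML file in question.
xml_data = """
<records>
  <record><name>alpha</name><value>1</value></record>
  <record><name>beta</name><value>2</value></record>
</records>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE record (name TEXT, value INTEGER)")

# Parse once, collect plain tuples, then hand the whole batch over in a
# single call rather than inserting record by record.
root = ET.fromstring(xml_data)
batch = [(r.findtext("name"), int(r.findtext("value"))) for r in root]
conn.executemany("INSERT INTO record (name, value) VALUES (?, ?)", batch)
conn.commit()
```

For a file too large to hold in memory, the same idea works incrementally: 
accumulate tuples with ET.iterparse and flush every few thousand rows.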
