I'm reading through the profile log, not that I'm an expert on profiling or 
anything, but I do have a few possible tips:

   1. Most of your calls are done in ~86 ms. That's pretty respectable. 
   2. Most of your time is spent on database calls, with compiling being 
   the second biggest time consumer. So compiling your application ahead of 
   time should give you a quick speed boost without having to change any code.
   3. The biggest bottleneck is almost always the queries to the database, 
   regardless of whether or not you use the DAL. So spend more time 
   optimizing your queries before optimizing your code.
   4. Your "adviewer/viewads" makes 10 calls to the database. Try to cut 
   that down by writing fewer queries, creating a database view, and/or 
   selecting only the fields you actually need (there's a rough sketch after 
   this list). Also make sure the criteria are right so that you (ideally) 
   don't pull back any extra, unneeded rows.
   5. I assume "trees/index", "trees/get_binary_tree.json", 
   "trees/team_members", etc. are showing a parent-child type of 
   relationship. I've done this before with folders in a document management 
   system, and trees are tough. The way I handled it was to load only the 
   top level plus one level below it in a single database query (so I knew 
   whether a folder had subfolders), and then to load the rest whenever the 
   user "opened" or "expanded" a folder. If your application uses that kind 
   of tree, maybe try the same approach. Otherwise, try to load two or three 
   levels per query; even using a sub-select will be faster than issuing 
   repeated queries (see the second sketch after this list).
   6. Moving models to modules and loading only the tables you need has 
   already been mentioned, but it's worth adding to the list of tips.
   7. When you absolutely have to load a few thousand rows (or more) in a 
   query (avoid this whenever you can), try using "db.executesql(query)" to 
   run a hand-crafted SQL query. That is almost always faster than going 
   through the DAL (see the last sketch after this list).
   8. Another point about executesql: the obvious downside is reduced 
   portability, but if you only plan to run on PostgreSQL, you can 
   hand-craft a SQL query and profile it against PostgreSQL (EXPLAIN ANALYZE 
   helps here) for maximum performance. Once it returns only the data you 
   want, copy and paste that query into executesql.
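
Here's roughly what I mean in point 4 about selecting only the fields you 
need. The table and field names are just guesses (I don't know your model), 
and it assumes the usual web2py setup where db, auth, etc. already exist in 
a model file:

    def viewads():
        # select() with no arguments fetches every column of every matching
        # row; instead, ask only for the columns the view actually uses and
        # cap the number of rows
        ads = db(db.ad.owner_id == auth.user_id).select(
            db.ad.id, db.ad.title, db.ad.created_on,
            orderby=~db.ad.created_on,
            limitby=(0, 50),   # ideally no extra, unneeded rows
        )
        return dict(ads=ads)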
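
For point 5, here's the sub-select idea in DAL terms: the roots plus their 
direct children come back in two small queries instead of one query per 
node. The "node" table and "parent_id" field are invented names, so adapt 
them to your own tree tables:

    # roots: rows with no parent
    roots = db(db.node.parent_id == None).select(db.node.id, db.node.name)

    # all of their direct children in one query, via a nested select
    children = db(db.node.parent_id.belongs(
        db(db.node.parent_id == None)._select(db.node.id)
    )).select(db.node.id, db.node.parent_id, db.node.name)

    # deeper levels get loaded on demand, e.g. by an ajax action that takes
    # the expanded node's id and selects db.node.parent_id == node_id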
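
And for points 7 and 8, a sketch of executesql with a hand-crafted query. 
The SQL itself is made up; write your own, profile it against PostgreSQL 
with EXPLAIN ANALYZE until you're happy, then paste it in. Placeholders 
keep it safe from injection, and executesql returns plain tuples unless 
you pass fields= or colnames= to get DAL-style rows back:

    sql = """
        SELECT a.id, a.title, count(v.id) AS views
        FROM ad a
        LEFT JOIN ad_view v ON v.ad_id = a.id
        WHERE a.owner_id = %s
        GROUP BY a.id, a.title
        ORDER BY views DESC
        LIMIT 50;
    """
    rows = db.executesql(sql, placeholders=(auth.user_id,))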

I don't know how much, if any, of this applies to your situation, but I'd 
like to think it's good advice in general.
