Sim Zacks wrote:
I disagree with you that a human brain would be better than a machine for optimizing purposes. If the system is programmed to optimize correctly, then it will know when to stick data into a temp table and update columns instead of doing a select, because some number of joins is too much for the select. Humans may not know the optimal number of joins before the query becomes inefficient.

You're thinking about single queries here ;) A piece of software can very well optimize a single query, but it will be limited to that.

I was referring to changing parts of your database so that the query can be written in a simpler, more optimal way.

As an example, say that you have a tree structure of factories, each with a multitude of departments, and you want to query the number of employees (sum of all departments) in a specific factory.

You could write a query to read in all those departments and then count all the employees in them.
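For illustration, such a query might look like this (the table and column names are made up for the example, and any recursion over the factory tree is ignored):

    -- Hypothetical schema: factories(id), departments(id, factory_id),
    -- employees(id, department_id).
    SELECT count(e.id)
      FROM departments d
      JOIN employees e ON e.department_id = d.id
     WHERE d.factory_id = 42;

Every execution has to visit every employee row in every department of that factory.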

You could also keep track of the employee count for each department when you insert or delete employee records, and summarize these values for each factory record (using a trigger, most likely). If you do that, you only need to read the employee count of the factory record.
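A minimal sketch of such a trigger, assuming the same hypothetical tables plus an employee_count column on departments (and ignoring UPDATEs that move an employee between departments):

    CREATE FUNCTION update_department_count() RETURNS trigger AS $$
    BEGIN
        IF TG_OP = 'INSERT' THEN
            UPDATE departments SET employee_count = employee_count + 1
             WHERE id = NEW.department_id;
        ELSIF TG_OP = 'DELETE' THEN
            UPDATE departments SET employee_count = employee_count - 1
             WHERE id = OLD.department_id;
        END IF;
        RETURN NULL;  -- return value of an AFTER trigger is ignored
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER employees_count_trg
        AFTER INSERT OR DELETE ON employees
        FOR EACH ROW EXECUTE PROCEDURE update_department_count();

The same trick can be cascaded up to the factory record, after which the factory-level query reduces to reading a single row.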

Which do you think is faster? And how do you think a piece of software could arrive at this solution just from reading the EXPLAIN output of your query?

--
Alban Hertroys
[EMAIL PROTECTED]

magproductions b.v.

T: ++31(0)534346874
F: ++31(0)534346876
M:
I: www.magproductions.nl
A: Postbus 416
   7500 AK Enschede

// Integrate Your World //
