As far as I am concerned, it depends on your implementation rather than on
the big O alone.
You can optimize your program and count the instruction lines in the
compiled assembly. Each processor has an approximate MIPS (millions of
instructions per second) rating. Since MIPS is measured in millions per
second, Instructions / MIPS gives the approximate number of microseconds
your program needs to run.
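As a rough sketch of that estimate applied to the original question: the
instruction-per-comparison count and the MIPS rating below are made-up
assumptions for illustration, not measurements.

```python
# Rough runtime estimate from an instruction count and a MIPS rating.
# microseconds = instructions / MIPS, because MIPS is millions of
# instructions per second.

def estimated_microseconds(instruction_count, mips):
    """Estimated running time in microseconds."""
    return instruction_count / mips

# Selection sort on n items performs about n^2 / 2 comparisons.
# Assume (hypothetically) ~4 machine instructions per comparison
# and a hypothetical 2000-MIPS processor.
n = 25000
instructions = (n * n / 2) * 4
print(estimated_microseconds(instructions, 2000))
```

The constants (4 instructions per comparison, 2000 MIPS) are the part you
would have to measure for your own compiler and CPU; the n^2/2 comparison
count comes from the structure of selection sort itself.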
Best Regards,
James Fang
-----Original Message-----
From: [email protected] [mailto:[EMAIL PROTECTED] on behalf of
Sherry
Sent: November 23, 2007, 4:26
To: Algorithm Geeks
Subject: [algogeeks] How is the Big O actually calculated, time wise?
I know how the complexity of an algorithm is calculated, but how does
this relate to the time it takes? Let's say I have 25000 random numbers
I'd like to sort with selection sort. How could I use Big O notation to
estimate the time taken to sort these numbers? I understand it's an
O(n^2) sort, but how do you approximate the time taken?
Thanks in advance.