On Tue, 6 Aug 2024 07:12:13 GMT, Johan Sjölen <jsjo...@openjdk.org> wrote:

>> When it is said that an algorithm has log(n) time complexity, it means 
>> that if the input grows n times, the time grows log(n) times. The tree 
>> data structure has log(n) time complexity. VMATree may not have exactly 
>> log(n) response times due to self-balancing costs, but it is still expected 
>> to be less than O(n).
>
> Hi Afshin, could we take a step back and do some asymptotic time complexity 
> analysis of this problem?
> 
> The test is measuring the following code:
> 
> ```c++
> for (int i = 0; i < n; i++) {
>   int a = (os::random() % n) * 100;
>   treap.upsert(a, 0);
> }
> ```
> 
> It is the time complexity of this loop that we are trying to understand. 
> First, let's simplify the code slightly:
> 
> ```c++
> auto f = [&](int i) { int a = (os::random() % n) * 100; treap.upsert(a, 0); 
> };
> for (int i = 0; i < n; i++) {
>   f(i);
> }
> ```
> 
> Clearly, `f(n)` is executed `n` times, agreed? Then the time complexity of 
> the whole program must be `O(n*f(n))`, correct? It's the time complexity of 
> `f(n)` performed `n` times.
> 
> Let's replace `f` with something else to see if we can understand the time 
> complexity of the whole code snippet again.
> 
> ```c++
> int arr[n];
> auto f = [&](int i) { arr[i] = 0; };
> for (int i = 0; i < n; i++) {
>   f(i);
> }
> ```
> 
> Now, we both agree that assigning to an array element has time complexity 
> `O(1)`, correct? Then, if we substitute that into our expression 
> `O(n * f(n))` we get `O(n * 1) = O(n)`, correct? In other words, the time 
> complexity of the algorithm the test is measuring is *linear*: with an array 
> we ought to expect the loop to take 10x longer for 10k elements than for 
> 1k elements.
> 
> OK, now let's *assume* that `f(n)` has time complexity `O(log n)`. Then 
> shouldn't the algorithm we're measuring have time complexity `O(n * log n)`, 
> which is actually *slower* than `O(n)`?
> 
> In conclusion: if `treap.upsert()` has time complexity `O(log n)` then the 
> whole algorithm should have time complexity `O(n * log n)` and the 
> measurements we're seeing are as expected *and the test shouldn't fail*.
> 
> Have I missed anything or made any mistakes? Please let me know.

Big O is a _notation_, not a math function, so `O(a * b)` is not always the 
same as `O(a) * O(b)`.
Stick to this _definition_: "when an algorithm has time complexity 
O(`g(n)`), it means that if the input grows `n` times, the running time 
grows `g(n)` times", where `g()` is `log()` in our case.
IOW, the big O answers the following question:
$t_1$ = time for running `f()` on $n_1$ items
$t_2$ = time for running `f()` on $n_2$ items
if we know $\Large{\frac{n_2}{n_1} = n}$, what is the expected value of $t_2$?

A detailed description can be found 
[here](https://en.wikipedia.org/wiki/Big_O_notation).

-------------

PR Review Comment: https://git.openjdk.org/jdk/pull/20425#discussion_r1705161898
