O(log(A)) + O(log(B)) = O(log(A * B))?

This identity is correct:

log(A) + log(B) = log(A * B)    [0]

      

Is this also true?

O(log(A)) + O(log(B)) = O(log(A * B)) [1]

      

From what I understand

O(f(n)) + O(g(n)) = max( O(f(n)), O(g(n)) ) [2]

or, in other words: if one function grows asymptotically faster than the other, then only that function matters for the big-O bound. So maybe this equation is true instead:

O(log(A)) + O(log(B)) = max( O(log(A)), O(log(B)) ) [3]

      

+3




3 answers


O is linear. Therefore O(a) + O(b) = O(a + b).

So O(log(A)) + O(log(B)) = O(log(A) + log(B)) = O(log(A * B)).




As for [3], you're right: if m = O(n), then O(n + m) = O(2n) = 2*O(n) = O(n), since 2 is a constant.
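
As a quick illustration (a minimal sketch added here, not part of the original answer; binary_search_steps is a hypothetical helper): counting the halving steps of a binary search over a range of size A and then a range of size B gives a total of roughly log2(A) + log2(B) = log2(A * B), and the larger of the two terms already accounts for at least half of that total.

import math

def binary_search_steps(n):
    # number of halving steps a binary search needs on a range of size n
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

A, B = 1_000_000, 1_000
steps_a = binary_search_steps(A)            # 19, about log2(A)
steps_b = binary_search_steps(B)            # 9, about log2(B)
total = steps_a + steps_b                   # 28

print(total, math.log2(A * B))              # 28 vs log2(A * B) ~ 29.9
print(total <= 2 * max(steps_a, steps_b))   # True: the max term dominates up to a factor of 2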

+2




This is always true: O(X + Y) = O(X) + O(Y)



Also, complexity classes are coarser than algebra: if two expressions are equal algebraically, they must be equal in complexity (of course, two things that are equal in complexity don't have to be equal algebraically).
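
For example (a small sketch added for illustration, not part of the original answer): 2n and n are different expressions but have the same complexity, because their ratio is bounded by a constant, while log(A) + log(B) and log(A * B) are equal as expressions and therefore trivially equal in complexity.

import math

# Different expressions, same complexity: (2n)/n is bounded by the constant 2,
# so O(2n) = O(n) even though 2n != n as algebraic expressions.
assert all((2 * n) / n <= 2 for n in (10, 1_000, 1_000_000))

# Equal expressions, hence equal complexity: log(A) + log(B) == log(A * B).
for a, b in ((8, 4), (1_000, 37), (10**6, 10**3)):
    assert math.isclose(math.log(a) + math.log(b), math.log(a * b))

print("checks passed")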

+1




[1] is true. Here is a proof: let c_1 and c_2 be the constants you get from the definition of Big-O. Then you have:

O(log(A)) + O(log(B))  
= c_1*log(A) + c_2*log(B)
<= c*log(A) + c*log(B)  where c=max{c_1,c_2}
= c*(log(A)+log(B))
= O(log(A*B))

      

By the same logic you can see that [3] is also correct. Again you have:

   O(log(A)) + O(log(B)) 
    = c_1*log(A) + c_2*log(B) [5]

      

Assuming, without loss of generality, that log(A) is the maximum of log(A) and log(B), continue from [5]:

    <= c_1*log(A) + c_2*log(A)
    = (c_1+c_2)*log(A)
    = O(log(A)) 
    = O(max(log(A), log(B)))

      

Which can be simplified to:

    = max(O(log(A)), O(log(B)) )
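
A numeric spot check of the two chains above (a minimal sketch; the constants 3 and 5 below are arbitrary stand-ins for c_1 and c_2): whatever the hidden constants are, c_1*log(A) + c_2*log(B) is bounded both by a constant times log(A * B) and by a constant times max(log(A), log(B)).

import math

c1, c2 = 3, 5                    # arbitrary stand-ins for the Big-O constants
c = max(c1, c2)

for A, B in ((1_000, 8), (10**6, 10**4), (2, 10**9)):
    lhs = c1 * math.log(A) + c2 * math.log(B)

    # first chain:  lhs <= c*(log(A) + log(B)) = c*log(A*B)
    assert lhs <= c * math.log(A * B) + 1e-9

    # second chain: lhs <= (c1 + c2) * max(log(A), log(B))
    assert lhs <= (c1 + c2) * max(math.log(A), math.log(B)) + 1e-9

print("both bounds hold for all sample pairs")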

      

0








