O(log(A)) + O(log(B)) = O(log(A*B))?
This identity is correct:
log(A) + log(B) = log(A * B) [0]
Is this also true?
O(log(A)) + O(log(B)) = O(log(A * B)) [1]
From what I understand,
O(f(n)) + O(g(n)) = max( O(f(n)), O(g(n)) ) [2]
or, in other words: if one function grows asymptotically faster than the other, only that function matters in big-O notation. So perhaps this equation is true instead:
O(log(A)) + O(log(B)) = max( O(log(A)), O(log(B)) ) [3]
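As a quick numeric sanity check of the underlying identity [0] (a Python sketch; the test values are arbitrary):

```python
import math

# Check the identity log(A) + log(B) = log(A*B) for a few
# arbitrary positive values (the base does not matter; base e here).
for a, b in [(2.0, 8.0), (10.0, 1000.0), (3.7, 0.5)]:
    assert math.isclose(math.log(a) + math.log(b), math.log(a * b))
```

The big-O question, of course, is about growth rates rather than exact values, which the answers below address.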
[1] is true. Here is a proof. Let c_1 and c_2 be the constants given by the definition of big-O; then:
O(log(A)) + O(log(B))
= c_1*log(A) + c_2*log(B)
<= c*log(A) + c*log(B) where c=max{c_1,c_2}
= c*(log(A)+log(B))
= c*log(A*B)
= O(log(A*B))
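To make the bound concrete, here is a small Python sketch. The helper `halving_steps` is a hypothetical stand-in for the work done by any O(log n) routine, such as binary search:

```python
import math

def halving_steps(n):
    """Count how many times n can be halved down to 1 --
    a stand-in for the work done by an O(log n) routine."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

# Running an O(log A) job followed by an O(log B) job costs
# about log2(A) + log2(B) = log2(A*B) steps in total, i.e. the
# sum stays within a constant factor of log2(A*B).
for a, b in [(1024, 16), (10**6, 10**3)]:
    total = halving_steps(a) + halving_steps(b)
    assert total <= 2 * math.log2(a * b)
```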
By the same logic, you can see that [3] is also correct:
O(log(A)) + O(log(B))
= c_1*log(A) + c_2*log(B) [5]
Assuming log(A) is the maximum of log(A) and log(B) (the other case is symmetric), continue from [5]:
<= c_1*log(A) + c_2*log(A)
= (c_1+c_2)*log(A)
= O(log(A))
= O(max(log(A), log(B)))
Which can be simplified to:
= max( O(log(A)), O(log(B)) )
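Numerically, log(A*B) is sandwiched between max(log A, log B) and twice that maximum (for A, B >= 2), which is why the two big-O forms describe the same class. A Python sketch of this sandwich bound:

```python
import math

# For A, B >= 2: max(log A, log B) <= log(A*B) <= 2*max(log A, log B),
# so O(log(A*B)) and O(max(log A, log B)) denote the same class.
for a, b in [(4, 1024), (10**6, 100), (7, 7)]:
    m = max(math.log2(a), math.log2(b))
    assert m <= math.log2(a * b) <= 2 * m
```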