Algorithm complexity when a value is subtracted
I have the following formula that I need to simplify to get the time complexity of my algorithm: (n^2 - n) / 3. Are there any rules that would let me simplify this expression further to something more "general", like Θ(n^2)? (I suspect this guess may be wrong.)
The subtraction is what I don't know how to deal with. Normally, when two terms are added, you only keep the larger one when analyzing the complexity of the algorithm. What should I do in this case?
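A quick numerical sanity check (a minimal sketch, not part of the original question; the helper name cost is hypothetical) suggests that Θ(n^2) is a reasonable guess, since the ratio of the formula to n^2 settles near the constant 1/3:

    # Minimal sketch: compare the cost formula (n^2 - n) / 3 against n^2.
    def cost(n):
        return (n ** 2 - n) / 3

    for n in (10, 100, 1000, 10000):
        # The ratio approaches 1/3, a positive constant, which is what
        # membership in Theta(n^2) amounts to.
        print(n, cost(n) / n ** 2)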
Since Θ is a tight bound, there is no better simplification for it: if a function f(n) is in both Θ(h(n)) and Θ(g(n)), it means that Θ(h(n)) = Θ(g(n)), so for any other function you might find, there is no information gain over Θ(n^2) in your example.
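For the concrete formula in the question, the bound can also be checked directly against the definition of Θ. A short worked derivation (the constants 1/6 and 1/3 are just one valid choice, not from the original answer):

    \[
      \frac{n^2 - n}{3} \;\le\; \frac{1}{3}\,n^2 \quad (n \ge 0),
      \qquad
      \frac{n^2 - n}{3} \;\ge\; \frac{n^2 - \tfrac{1}{2}n^2}{3} \;=\; \frac{1}{6}\,n^2 \quad (n \ge 2),
    \]
    \[
      \text{so with } c_1 = \tfrac{1}{6},\; c_2 = \tfrac{1}{3},\; n \ge 2:
      \quad c_1 n^2 \;\le\; \frac{n^2 - n}{3} \;\le\; c_2 n^2,
      \quad\text{i.e.}\quad \frac{n^2 - n}{3} \in \Theta(n^2).
    \]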
When you are dealing with n^k - n^m, where k > m, you can simply "throw away" the n^m term in big-Θ analysis.
This is true because:

    n^k - n^m <= n^k

and thus it is in O(n^k).
On the other hand, for every m, k with k > m there is some value N such that for every n > N:

    0.5 * n^k >= n^m

and thus:

    n^k - n^m >= n^k - 0.5 * n^k = 0.5 * n^k    for n > N

so it is also in Ω(n^k).
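The value N claimed above can be made explicit. A small worked derivation (this particular choice of N is an illustration, not part of the original answer), assuming n > 0:

    \[
      \tfrac{1}{2} n^k \;\ge\; n^m
      \;\Longleftrightarrow\;
      n^{k-m} \;\ge\; 2
      \;\Longleftrightarrow\;
      n \;\ge\; 2^{1/(k-m)},
    \]
    \[
      \text{so } N = 2^{1/(k-m)} \text{ works whenever } k > m
      \text{ (for example, } k = 2,\ m = 1 \text{ gives } N = 2\text{).}
    \]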
Since we have found both an upper and a lower bound, we can conclude that n^k - n^m, with k > m, is in Θ(n^k).
(A similar proof can be done for general f(n), g(n) that lie in different Θ complexity classes.)
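To make the sandwich argument above tangible, here is a minimal Python sketch (the exponents k = 3, m = 1 and the variable names are arbitrary illustrations) that checks both bounds numerically once n exceeds the explicit N derived above:

    # Minimal sketch: check  0.5 * n^k  <=  n^k - n^m  <=  n^k  for n > N.
    k, m = 3, 1                       # arbitrary example exponents with k > m
    N = 2 ** (1 / (k - m))            # the explicit N derived above

    for n in (2, 10, 100, 1000):
        diff = n ** k - n ** m
        assert diff <= n ** k               # upper bound holds for every n
        if n > N:
            assert diff >= 0.5 * n ** k     # lower bound holds once n > N
        print(n, diff / n ** k)             # ratio tends to 1, i.e. Theta(n^k)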