Asymptotic complexity for typical expressions

The increasing order of asymptotic complexity of the functions shown in the figure below is:

(A) f1(n); f4(n); f2(n); f3(n)

(B) f1(n); f2(n); f3(n); f4(n)

(C) f2(n); f1(n); f4(n); f3(n)

(D) f1(n); f2(n); f4(n); f3(n)

[Figure: definitions of the four functions; as reconstructed from the answers below, f1(n) = (n^0.9999)*log(n), f2(n) = n, f3(n) = 1.00001^n, f4(n) = n^2]

a) The expected order for this question places (n^0.99)*(log n) below n. How? log n may be a slow-growing function, but it still grows faster than a constant.

b) Suppose instead that f1(n) = (n^1.0001)*(log n). What would the answer be then?

Whenever an expression is a product of a logarithmic factor and a polynomial factor, does the logarithmic factor outweigh the polynomial one?

c) How do we check in such cases? For example:

1) (n^2)*log(n) vs n^1.5: which has higher time complexity?

2) (n^1.5)*log(n) vs n^2: which has higher time complexity?





2 answers


If we consider C_1 and C_2 such that C_1 < C_2, then we can confidently say the following:

(n^C_2)*log(n) grows faster than (n^C_1)

      

This is because (n^C_1) grows slower than (n^C_2) (obviously).

Also, for values of n larger than 2 (with log taken in base 2), log(n) grows faster than 1.

In fact, log(n) is asymptotically greater than any constant C, because log(n) -> inf as n -> inf.

If (n^C_2) is asymptotically greater than (n^C_1) AND log(n) is asymptotically greater than 1, then we can certainly say that
(n^2)*log(n) has greater complexity than (n^1.5)

      

We see log(n) as a "slow growing" function, but it still grows faster than 1, which is the key here.
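To make this concrete, here is a small Python sketch (Python is just my choice for illustration; the exponents 1.5 and 2 are the pair from part (c) of the question) comparing (n^2)*log(n) against n^1.5 at a few values of n:

import math

# Sanity check of the claim with C_1 = 1.5 and C_2 = 2:
# (n^C_2) * log(n) should pull away from (n^C_1) as n grows.
for n in (10, 10**3, 10**6, 10**9):
    lhs = n**2 * math.log(n, 2)   # (n^C_2) * log(n), log taken in base 2
    rhs = n**1.5                  # (n^C_1)
    print(f"n = {n:>10}: ratio (n^2)*log(n) / n^1.5 = {lhs / rhs:.3e}")
# The ratio grows without bound, so (n^2)*log(n) has the higher complexity.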


coder101 asked an interesting question in the comments, essentially

is n^e = Ω((n^c)*log_d(n))?
where e = c + ϵ for arbitrarily small ϵ

      

Let's do some algebra.



n^e = (n^c)*(n^ϵ)
so the question boils down to
is n^ϵ = Ω(log_d(n))
or is it the other way around, namely:
is log_d(n) = Ω(n^ϵ)

      

To do this, we ask for which ϵ the inequality n^ϵ > log_d(n) holds:

n^ϵ > log_d(n)
ϵ*ln(n) > ln(log_d(n))
ϵ > ln(log_d(n)) / ln(n)

      

Since we know that

c * ln(n) > ln(ln(n))        (1)
for any constant c > 0 and large enough n,

we can say that, for an arbitrarily small ϵ, there exists an n large enough to
satisfy ϵ > ln(log_d(n)) / ln(n),

because, by (1), ln(log_d(n)) / ln(n) -> 0 as n -> infinity.

      

Using this knowledge, we can say that

n^ϵ = Ω(log_d(n))
for arbitrarily small ϵ

which means that
n^(c + ϵ) = Ω((n^c)*log_d(n))
for arbitrarily small ϵ.
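As a numeric illustration of why that ratio goes to 0, here is a short Python sketch (d = 2 is an arbitrary choice for the example):

import math

# Illustration that ln(log_d(n)) / ln(n) -> 0 as n -> infinity (here d = 2),
# so any fixed eps > 0 eventually exceeds the ratio and n**eps overtakes log_d(n).
d = 2
for n in (10**3, 10**6, 10**12, 10**24, 10**48):
    ratio = math.log(math.log(n, d)) / math.log(n)
    print(f"n = {n:.0e}: ln(log_2(n)) / ln(n) = {ratio:.4f}")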

      

In layman's terms:

n^1.1 > n * ln(n)
for some n

also

n^1.001 > n * ln(n)
for some much, much bigger n

and even
n^1.0000000000000001 > n * ln(n)
for some very, very big n.
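That "much, much bigger n" can be estimated with a quick search. The sketch below (Python, working with x = ln(n) so nothing overflows) looks for roughly where eps*ln(n) exceeds ln(ln(n)), which is the same as n^(1+eps) overtaking n*ln(n):

import math

def crossover_log_n(eps):
    # Roughly where n^(1+eps) overtakes n*ln(n), i.e. where n^eps > ln(n).
    # Working with x = ln(n), the condition becomes eps*x > ln(x); a doubling
    # search gives an order-of-magnitude estimate of that x.
    x = 2.0
    while eps * x <= math.log(x):
        x *= 2
    return x

for eps in (0.1, 0.01, 0.001):
    x = crossover_log_n(eps)
    print(f"eps = {eps}: n^(1+{eps}) > n*ln(n) once ln(n) is on the order of {x:.0f}")
# The required n (about e**x) explodes as eps shrinks, matching the statements above.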

      



Replacing f1 = (n^0.9999)*log(n) with f1 = (n^1.0001)*log(n) gives answer (C): n, (n^1.0001)*log(n), n^2, 1.00001^n.

The rationale is as follows:

- (n^1.0001)*log(n) has higher complexity than n, obviously.



- n^2 is higher than (n^1.0001)*log(n), since the polynomial part asymptotically dominates the logarithmic part, so the polynomial of greater degree, n^2, wins.

- 1.00001^n dominates n^2, because 1.00001^n has exponential growth and n^2 has polynomial growth. Exponential growth wins asymptotically.

BTW, 1.00001^n looks a bit like a family called "subexponential" growth, usually denoted (1 + ε)^n. However, no matter how small ε is, subexponential growth still dominates any polynomial growth.
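As a rough numeric cross-check of order (C), the following Python sketch compares the four functions in log space at n = 10^8 (log space keeps 1.00001^n representable; natural logs are used throughout, which does not affect the asymptotic ordering):

import math

# Natural log of each candidate function, so that 1.00001**n stays representable.
def log_f2(n): return math.log(n)                                    # f2 = n
def log_f1(n): return 1.0001 * math.log(n) + math.log(math.log(n))   # f1 = n^1.0001 * log n
def log_f4(n): return 2 * math.log(n)                                 # f4 = n^2
def log_f3(n): return n * math.log(1.00001)                           # f3 = 1.00001^n

n = 10**8
for name, f in [("f2 = n", log_f2), ("f1 = n^1.0001 * log n", log_f1),
                ("f4 = n^2", log_f4), ("f3 = 1.00001^n", log_f3)]:
    print(f"ln({name}) at n = 10^8: {f(n):.1f}")
# The values increase down the list, matching order (C): f2 < f1 < f4 < f3.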







