A complexity question from a recent exam that confused people

Is the following statement correct?

If Θ(f(n)) = Θ(g(n)) AND g(n) > 0 everywhere THEN f(n)/g(n) ∈ Θ(1)


We're having a bit of an argument with our professor about it.

+3




4 answers


f(n) = Θ(g(n)) means there exist constants c, d > 0 and n0 such that c·g(n) <= f(n) <= d·g(n) for all n > n0.

Then, since g(n) > 0, we can divide through by g(n): c <= f(n)/g(n) <= d for all n > n0.

So f(n)/g(n) = Θ(1).
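The inequality above is easy to check numerically. Here is a small Python sketch; f and g are example functions of my own choosing, not from the question:

```python
# Numeric sanity check of the argument above; f and g are example
# functions I chose, not from the question.
def f(n):
    return 3 * n**2 + 5 * n      # f(n) = Θ(n^2)

def g(n):
    return n**2                  # g(n) = Θ(n^2), positive for n >= 1

ratios = [f(n) / g(n) for n in range(1, 10001)]
# The ratio stays within fixed constants c = 3, d = 8 for all sampled n:
assert 3 <= min(ratios) and max(ratios) <= 8
```

Sampling is not a proof, of course, but it makes the "squeezed between c and d" picture concrete.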

+4




Dividing the functions f(n), g(n) is not the same as dividing their Big-O classes. For example, let:

f(n) = n^3 + n^2 + n
g(n) = n^3

Then:

O(f(n)) = O(n^3)
O(g(n)) = O(n^3)

and

f(n)/g(n) = 1 + 1/n + 1/n^2, which is not a constant!
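This can be checked numerically; the short Python sketch below (my own illustration) shows that the ratio 1 + 1/n + 1/n^2 is not a constant function, yet stays between positive constants, which is all Θ(1) asks for:

```python
# Sketch: the ratio from the example above is not constant, but it is
# bounded between positive constants, which is what Θ(1) requires.
def ratio(n):
    return 1 + 1/n + 1/n**2

values = [ratio(n) for n in range(1, 1001)]
assert values[0] != values[-1]          # not a constant function
assert all(1 < v <= 3 for v in values)  # but bounded, so ratio = Θ(1)
```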



[Edit1]

But as kfx says, you are comparing complexities, so what you want is:

O(f(n)/g(n)) = O(1 + 1/n + 1/n^2) = O(1)

So the answer is: yes.

But beware, complexity theory is not really my cup of tea, and I have no further context for your question.

0




Using the definitions of Landau notation ( https://en.wikipedia.org/wiki/Big_O_notation ), it is easy to conclude that this is true: the limit of the quotient must be finite and greater than 0.

It doesn't have to be exactly 1, but it must be a finite positive constant, which is Θ(1).

A counterexample would be nice, and should be easy to state if the claim were false. A rigorous positive proof would probably have to go from the limit definition of the quotient to the formal definition to establish their equivalence.

I am using this definition and have not seen it fail. I suppose the disagreement might lie in the precise definition of Θ; people are known to use these notations with minor differences, especially Big O. Or maybe in some tricky edge cases. For positive functions and sequences, I don't think this fails.

0




Basically, there are three options for any two (positive) functions f, g: the first grows asymptotically slower, and we write f = o(g) (note the small o); the first grows asymptotically faster: f = ω(g) (again, small omega); or they are asymptotically tight: f = Θ(g).

Note that f = o(g) is stricter than Big O, since it does not allow f = Θ(g) to hold; f = O(g) includes both f = o(g) and f = Θ(g); and o, Θ, and ω are mutually exclusive.

To find out which case holds, it is enough to estimate the limit of f(n)/g(n) as n goes to infinity: if it is zero, f = o(g) holds; if it is infinite, f = ω(g) holds; and if it is a finite positive number, f = Θ(g) is your answer. This is not the definition, but simply a way of evaluating the statement, provided the limit exists. (One of the assumptions I made here is that both f and g are positive.)

Special case: if the limit of f(n)/1 = f(n) as n goes to infinity is a finite positive number, that means f(n) = Θ(1) (basically we chose the constant function 1 for g).
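The limit test described above can be approximated numerically; here is a quick Python sketch (my own example functions with arbitrary names, not part of the answer):

```python
# Rough numeric version of the limit test above; f and g are my own
# example functions, not from the question.
def f(n):
    return 2 * n**3 + n

def g(n):
    return n**3

# Sample the ratio at n = 10, 100, ..., 10^6; it settles near the
# finite, nonzero value 2, suggesting f = Θ(g) by the limit test.
samples = [f(10**k) / g(10**k) for k in range(1, 7)]
assert all(abs(s - 2) < 0.1 for s in samples)
```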

Now to your problem: since f = Θ(g) implies f = O(g), we know there are c2 > 0 and n0 such that f(n) <= c2·g(n) for all n > n0. Thus f(n)/g(n) <= (c2·g(n))/g(n) = c2 for all n > n0. The same can be done with Ω, only with the opposite inequality sign, giving some c1 > 0 with f(n)/g(n) >= c1. So f(n)/g(n) is squeezed between c1 and c2 from some n0 on, and these are finite positive constants by the definition of Θ. Being bounded between two positive constants is exactly what f(n)/g(n) = Θ(1) means.
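As a concrete sanity check of this sandwich argument (the functions and constants below are my own illustrative choices, not from the question): take f(n) = 2n^2 + 3n and g(n) = n^2, which satisfy f = Θ(g) with witnesses c1 = 1, c2 = 4, n0 = 10.

```python
# Sketch: check c1 <= f(n)/g(n) <= c2 for all sampled n > n0.
# f, g, c1, c2, n0 are illustrative choices, not from the question.
def f(n):
    return 2 * n**2 + 3 * n

def g(n):
    return n**2

C1, C2, N0 = 1, 4, 10
assert all(C1 <= f(n) / g(n) <= C2 for n in range(N0 + 1, 10000))
```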

Conclusion: I believe you are right, and I would ask your professor to suggest a counterexample to refute the claim. If something doesn't make sense, feel free to ask in the comments; I'll try to clarify.

0








