How to correctly compare $f(n)$ and $g(n)$ when working with big-$O$ notation?


I'm going through the theory, missing the idea, and need a bit of help. The starting point is:

$$f(n) = O(g(n))$$

Assume that $f$ and $g$ are both nondecreasing and always bigger than 1. From my understanding, $f$ must be less than or equal to $g$, because $g$ represents an upper bound on the complexity. But if we add unequal factors like:

$$f(n)\cdot\log_2(f(n)^c) = O(g(n)\cdot\log_2(f(n)))$$

it becomes a bit messy for me. Assume that $c$ is some positive constant.

As I see it, if, for example, we previously had $f(n) = n$ and $g(n) = n$, then adding the power $c$ (even inside the $\log$) could make the left-hand side bigger.

But because big-$O$ focuses on the highest-order terms, I would need to exclude the $\log_2$ factor anyway, so the whole statement stays valid. Is it valid, and do I understand the logic correctly?


1 Answer


First things first: writing $f(n) = O(g(n))$ is a shortcut for $f\in O(g)$, i.e. $O(g)$ is a set of functions. But that is a little pedantry we will not bother with.

Now, the above does not imply that $f\leq g$! It means that there exist $N\in\mathbb{N}$ and some $C > 0$ such that$$ f(n) \leq C\cdot g(n) $$for all $n \geq N$. That means that for $n$ large, $f$ is at most a constant factor bigger than $g$.
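A small numerical sketch of this definition (the functions, constants $C$ and $N$ are my own illustrative choices, not from the question): $f(n) = 3n + 5$ is everywhere bigger than $g(n) = n$, yet $f(n) = O(g(n))$ because a constant factor absorbs the difference for large $n$.

```python
# Sketch: the Big-O definition f(n) <= C * g(n) for all n >= N.
# Here f(n) = 3n + 5 and g(n) = n are illustrative choices:
# f > g pointwise, but f = O(g) with C = 4 and N = 5,
# since 3n + 5 <= 4n exactly when n >= 5.
def f(n):
    return 3 * n + 5

def g(n):
    return n

C, N = 4, 5
assert all(f(n) <= C * g(n) for n in range(N, 10_000))
```

So "$f = O(g)$" is a statement about eventual growth up to a constant factor, not a pointwise comparison.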

Next, you need to use the properties of the logarithm: for any $x, c > 0$, we have$$ \log(x^c) = c\cdot\log x. $$This is true for all logarithms. Now, assume that $f(n) = O(g(n))$, i.e. we have the above inequality for $n$ large enough. Then$$ f(n)\cdot \log(f(n)^c) = c\cdot f(n)\cdot \log(f(n)) \leq c\cdot C\cdot g(n)\cdot \log(f(n)) $$for $n$ large enough. Taking $c\cdot C > 0$ as a new constant $C'$, we get that $f(n)\cdot \log(f(n)^c) = O(g(n)\cdot \log(f(n)))$.
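The power rule of the logarithm used above can be checked numerically (the sample values of $x$ and $c$ below are arbitrary choices for illustration):

```python
import math

# Sketch: the identity log(x^c) = c * log(x) for x, c > 0,
# checked at a few arbitrary sample points.
for x in [2.0, 10.0, 1e6]:
    for c in [0.5, 3.0]:
        assert math.isclose(math.log(x ** c), c * math.log(x))
```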

Also, you need to be very careful: a logarithmic term is asymptotically smaller than a linear term, i.e.$$ \log(n) = O(n), $$BUT if you multiply by a logarithmic term, you cannot simply forget it, i.e.$$ n + \log n = O(n)\qquad\text{ but }\qquad n\cdot \log n = O(n\cdot \log n) \neq O(n). $$You would, at most, have $n\cdot \log n = O(n^2)$.
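One way to see this numerically: the ratio $(n\log n)/n = \log n$ grows without bound, so no single constant $C$ can make $n\log n \leq C\cdot n$ hold for all large $n$; against $n^2$ the ratio shrinks toward zero, consistent with $n\log n = O(n^2)$. A quick sketch:

```python
import math

# Sketch: n*log(n) is NOT O(n), but it IS O(n^2).
# (n*log n)/n   = log(n)   -> grows without bound
# (n*log n)/n^2 = log(n)/n -> shrinks toward 0
for n in [10, 10_000, 10_000_000]:
    ratio_linear = (n * math.log(n)) / n
    ratio_quadratic = (n * math.log(n)) / n ** 2
    print(n, ratio_linear, ratio_quadratic)
```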

It might be a good exercise for you to take the above definition and prove basic properties of the Landau notation, e.g.$$ f\in O(g)\text{ and } f'\in O(g')\qquad\text{ then }\qquad ff'\in O(gg') \subseteq O(\max\{g,g'\}^2). $$
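A numerical sanity check of the product property (the concrete functions and constants below are illustrative choices, not part of the exercise): if $f(n)\leq C\,g(n)$ and $f'(n)\leq C'\,g'(n)$, then $f(n)f'(n)\leq C\,C'\,g(n)g'(n)$.

```python
import math

# Sketch: if f = O(g) with constant C, and fp = O(gp) with constant Cp,
# then f*fp = O(g*gp) with constant C*Cp. Illustrative choices:
# f(n) = 2n = O(n), fp(n) = log(n) = O(log n).
def f(n):  return 2 * n
def g(n):  return n
def fp(n): return math.log(n)
def gp(n): return math.log(n)

C, Cp = 2, 1
assert all(f(n) * fp(n) <= C * Cp * g(n) * gp(n) for n in range(2, 10_000))
```

The proof itself is one line: multiply the two defining inequalities for $n$ beyond the larger of the two thresholds.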

