For the function F(n) = n log n:
Let's first understand what an asymptotic upper bound is: it is an upper limit that the function cannot grow beyond, no matter how large an input you give the function. Notice that this is simply an upper bound, not necessarily the tightest upper bound. The tightest upper bound gives the most precise characterization.
If we assume g(n) = n^k,
then g(n) is an upper bound of F(n) if and only if it satisfies the condition below:
F(n) <= g(n)
That is, for every sufficiently large input n, F(n) is never more than g(n).
---------
if F(n) <= g(n), then
n log n <= n^k
log_n(n log n) <= k ,,,, taking log of both sides; base of the log is n here
1 + log_n(log n) <= k
Basically, k is a constant whose value must be greater than a certain number for F(n) <= g(n) to hold. As n grows, log_n(log n) shrinks toward 0, so that certain number is 1: the inequality holds for every constant k > 1, but fails at k = 1 itself, since n log n grows strictly faster than n. So the most restrictive polynomial upper bound is O(n^(1+ε)) for arbitrarily small ε > 0; if k must be an integer, the answer is k = 2.
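As a quick numerical check (a sketch I'm adding, not part of the derivation above), we can compute log_n(F(n)) = log_n(n log n) for growing n and watch it approach 1 from above, which is exactly why every constant k > 1 works but k = 1 does not:

```python
import math

# For F(n) = n * log(n), log base n of F(n) is the smallest exponent k
# for which n log n <= n^k at that particular n.
# log_n(n log n) = 1 + log_n(log n), which sinks toward 1 as n grows.
for n in [10, 10**3, 10**6, 10**9, 10**12]:
    f = n * math.log(n)
    k_min = math.log(f, n)  # two-argument math.log: log of f in base n
    print(n, round(k_min, 4))

# The printed exponents decrease toward 1 but stay above it:
# n log n = O(n^k) for every constant k > 1, yet n log n is not O(n).
```

Each printed exponent is the break-even k at that n; since the sequence keeps decreasing toward 1 without reaching it, no single k <= 1 can ever serve as a polynomial upper bound.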
-----------------------------------------------------------------------------------------------------------------------------------------------------------
Hope this is what you were asking; by the way, the question was a little unclear about what exactly you wanted to know.
For any doubts or clarification, please comment; if you meant to ask something different, please ask that in the comments as well.
Waiting for your positive feedback :)
(Original question: for F(n) = n log n, what is the most restrictive polynomial-time asymptotic upper bound O(n^k), and what is k?)