Friday, October 21, 2011

Spoilers for practice test

Please use this thread for posting and discussing solutions, AFTER you have taken the practice test.

Use the thread below for other midterm-related topics that will not spoil the practice test.

30 comments:

  1. I created a Google Doc so that we can all collaborate on our notes and have a good review!

    https://docs.google.com/document/d/18LGfKfJPtsoqKHSzVKUEOaXbyBX9_GCzGrv06RH0t-A/edit?hl=en_US

  2. Midterm 2010, Problem 2

    (1) Theta, (2) none, (3) O

    Is that right?

  3. @Lucia: I think it might be helpful if you listed the justifications for your answers.

    For Problem 2(1) I got: f(n) = (n^2 - n)/2 which is order n^2, and g(n) = n^2+2n which is also order n^2, so I believe f(n) is O(g)...I *think*

    Does this look correct?

  4. And as for Problem 1:

    1. (c), the running time is at most c*n^3.

    2. A contradiction would be (c), The worst case running time is O(n log n).

    3. (c), we defined what an instability is first.

    4. I believe the answer is Theta(n), which is (b): even though the overall running time of the Gale-Shapley algorithm is O(n^2), a specific woman can be proposed to at most once by each of the n men... so it would be Theta(n).

    5. I believe that, by definition, a linear-time graph algorithm runs in O(n+m) time... which is (a).

    How do these answers look to everyone??

    -TIM
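    The per-woman bound in point 4 can be seen by instrumenting the algorithm. Below is a sketch of the standard textbook Gale-Shapley (my own code, not necessarily the exact version from lecture) that counts proposals per woman: each man proposes to each woman at most once, so a fixed woman receives at most n proposals, while the whole run is O(n^2).

```python
def gale_shapley(men_prefs, women_prefs):
    """men_prefs[m] and women_prefs[w] are preference lists, most preferred first."""
    # rank[w][m]: position of man m in woman w's list (lower = preferred)
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}  # index of next woman m proposes to
    fiance = {}                              # woman -> current partner
    proposals = {w: 0 for w in women_prefs}  # proposals received per woman
    free_men = list(men_prefs)
    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        proposals[w] += 1
        if w not in fiance:
            fiance[w] = m                    # w was free: accept
        elif rank[w][m] < rank[w][fiance[w]]:
            free_men.append(fiance[w])       # w trades up; old partner is free again
            fiance[w] = m
        else:
            free_men.append(m)               # w rejects m
    return fiance, proposals

men = {"a": ["x", "y"], "b": ["x", "y"]}
women = {"x": ["b", "a"], "y": ["b", "a"]}
matching, proposals = gale_shapley(men, women)
print(matching)                              # {'x': 'b', 'y': 'a'}
print(max(proposals.values()) <= len(men))   # True: no woman saw more than n proposals
```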

  5. Tim,
    (1) C1 (n^2+2n) <= (1/2)(n^2 - n) <= C2 (n^2+2n) for suitable constants, which gives us Theta(g).

    (2) I think these two functions are different. I'd say that g starts slow and then shoots up, while f starts faster but grows more slowly. I interpreted that as meaning f = O(g) does not hold for all n >= n0 >= 0.

    (3) g grows faster, so f = O(g)

    I am sure for (1), but (2) and (3) are tricky.

  6. Tim, I agree with your answers, with one correction: #2 is (c) and (e). If the worst-case running time is Omega(n^2), then it cannot be Omega(n^3), because Omega is the lower bound. Do you agree?

  7. Tim and Lucia,
    I got the following (PLEASE correct me if my reasoning is not correct)

    Definitions in short:
    Big O: f(n) < g(n)
    Big Omega: f(n) > g(n)
    Big Theta: f(n) = g(n)

    Problem 2
    1) Since f(n) and g(n) are both order n^2, I said Theta.
    2) Because n^(log n) > (log n)^n, I have f(n) > g(n), which would be Omega.
    3) I have f(n) < g(n), which is big O.

    Julian

  8. Lucia,

    On P1.2 - you cannot have two answers. I am pretty sure Tim is right about C since the worst case time cannot be less than the omega value, since omega is a lower bound. E is not a contradiction since the worst case can have an omega greater than the original function.

  9. Julian

    Pretty sure your thinking is right but my guess is Dr Hayes might take exception to the incomplete definitions you gave (although I agree with you).

    Karl

  10. Julian, for (3) you have to give FORMAL definition, so there is C>0, n0>=0 such that f(n)>=C*g(n) for all n>=n0.

    Karl, Problem 1 is multiple choice, so I can give as many answers as I need. Right?

  11. Lucia and Karl, yeah, I didn't write out the complete definitions for this post, just the basic idea behind them. For the exam I will make sure to write out the formal definitions. Lucia, I am pretty sure he is just looking for one answer for each multiple choice question.

    Julian

  12. Julian,

    for 2.2 I'm getting O()... because,

    (log n)^n grows much faster than n^(log n) ... so f <= g, and therefore f is O(g)...

    I guess I'm confused on what exactly constitutes 'none of the above' ...

  13. Tim - I think Julian is right on 2.2 - you cannot directly compare (log n)^n to n^(log n) in terms of growth rates. You need to translate one into the same base as the other and I think you will see what Julian is talking about.

    Lucia - guess you could answer more than one for each question but not sure on 1.

    Karl

  14. On problem 2, the correct answers are:
    1. f=Theta(g),
    2. f=O(g),
    3. f = Omega(g)
    4. There exist numbers C > 0 and n_0 > 0 such that, for all n >= n_0, we have f(n) <= C g(n).
    (@Lucia, you've got an inequality going the wrong way in your version.)

    For parts 2 and 3 the key is to write both functions as powers of the same base, and compare the powers correctly.

    @Tim, for an answer in problem 2 to be "none of the above", it would have to be the case that f and g go "back and forth" between f being bigger than any constant times g, and g being bigger than any constant times f. This is unusual, but not unheard of, in practice.
    A silly example: g(n) = n^2. f(n) = n if n is odd, n^3 if n is even.
    Another example: f(n) = 2^{2^{sqrt(n)}}. g(n) = 2^{2^{round(sqrt(n))}},
    where "round" is the function that rounds to the nearest integer.
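    The "back and forth" behavior in the silly example is easy to check numerically (my own sketch, not part of the test): f(n)/g(n) is 1/n on odd n but n on even n, so no single constant C works in either direction, and neither f = O(g) nor f = Omega(g) holds.

```python
def f(n):
    return n if n % 2 == 1 else n ** 3   # odd n: n, even n: n^3

def g(n):
    return n ** 2

# The ratio swings between arbitrarily small and arbitrarily large values.
for n in (101, 102, 1001, 1002):
    print(n, f(n) / g(n))
```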

    The right answers for problem 1 are: c, c, c, b, a

  15. For problem 5, my thinking was that calculating the sum takes n time, finding all the subsets takes 2^n time (if we are ignoring the order), and a subset can be at most size n, so summing one takes n time, giving O(n + n*2^n).
    When trying to think about how to get a Theta(n) factor of speedup, I thought the main way would be to ensure that you are not repeating subsets, but I'm not sure what the running time would be for that.
    Any ideas??

  16. Sorry Tim - I thought Julian had O on 2.2. Guess my only point was what Dr Hayes said about changing them into a single base for comparison but maybe you did that.

    And Julian - I had to look again at 2.3 before I understood it.

    Anyone - isn't #4 just a special case of a hwk problem we already did or am I reading it wrong?

    KS

  17. Question 6: I am not sure how to interpret part 1 of the question. I initially thought I was looking for a spanning tree T of graph G that could be arrived at from both DFS and BFS. From homework 3.6, we know that this is not possible unless G is a tree. So I must have interpreted the question wrong. Are we simply to find two separate trees using BFS and DFS and indicate which node was selected as root in each case?

    Ken Potter

  18. I interpreted number 6 part 1 as: find one tree, but give two different roots for it, one that makes it a BFS tree and one that makes it a DFS tree.
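    One way to experiment with these interpretations is to compute the BFS tree and the DFS tree from a chosen root and compare edge sets. A small sandbox (my own sketch; the helper names are made up):

```python
from collections import deque

def bfs_tree(adj, root):
    """Return the set of tree edges found by BFS from root."""
    seen, edges, queue = {root}, set(), deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                edges.add(frozenset((u, v)))
                queue.append(v)
    return edges

def dfs_tree(adj, root):
    """Return the set of tree edges found by recursive DFS from root."""
    seen, edges = {root}, set()
    def visit(u):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                edges.add(frozenset((u, v)))
                visit(v)
    visit(root)
    return edges

# If G is itself a tree (here, the path 0-1-2), both searches recover G,
# consistent with homework 3.6; on a 4-cycle the two trees differ.
path = {0: [1], 1: [0, 2], 2: [1]}
print(bfs_tree(path, 0) == dfs_tree(path, 0))  # True
```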

  19. Thanks Amanda, That helped a lot.

  20. For problem 5, O(n*2^n) would be a little better (although it would also be O(n + n*2^n), so that isn't wrong). Since the inner loop actually only takes n/2 steps on average, I had n + (n/2)*2^n as the worst-case running time, but this is big O of the same thing, so it really doesn't matter. It's kind of funny: in 341 (the assembly/low-level class) a 2x speedup is absolutely huge; in this class it's not actually a change.

    I'm stuck on 5.2 though. I found a much better speedup, but it isn't theta(n). Since the "return true" condition is equivalent to the subtotal being half the total, for any subset T that this works for, the subset S-T also works. So we only have to check subsets of size <= (n/2). This changes it to O(n*2^(n/2)), but it isn't a theta(n) speedup.
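    For reference, the brute-force version being discussed can be sketched like this (my own code, assuming the straightforward subset enumeration): the outer loop runs 2^n times and each sum costs up to n additions, which is the O(n * 2^n) bound above.

```python
def has_equal_partition(S):
    """Return True if some subset of S sums to exactly half of sum(S)."""
    total = sum(S)
    if total % 2 != 0:
        return False
    n = len(S)
    for mask in range(1 << n):  # every subset of S as a bitmask
        subtotal = sum(S[i] for i in range(n) if mask & (1 << i))
        if 2 * subtotal == total:
            return True
    return False

print(has_equal_partition([3, 1, 4, 2]))  # True: {3, 2} vs {1, 4}
print(has_equal_partition([1, 2, 4]))     # False
```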

  21. @Amanda, for problem 5, refer back to homework problem 2.6. The savings comes from not recalculating the entire sum, when most of it has already been computed before.

    @Michael, your "much better" speedup is not as good as you think. There are 2^(n-1) subsets of size <= n/2, not 2^(n/2). So the factor saved by this trick is only 2. You've proved this, in fact, by your observation that, for every subset T, either |T| <= n/2 or |S-T| <= n/2.

    @KS, Yes, problem 4 is a special case of homework problem 1.1
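    The homework-2.6 savings can be sketched concretely (my own code, assuming incremental sum updates are the intended trick): enumerate subsets in Gray-code order, so consecutive subsets differ in exactly one element, and maintain the running sum in O(1) per subset. That turns O(n * 2^n) into O(2^n), a Theta(n)-factor speedup.

```python
def has_equal_partition_fast(S):
    """Same question as the brute force, but with O(1) work per subset."""
    total = sum(S)
    if total % 2 != 0:
        return False
    n = len(S)
    subtotal = 0               # sum of the current subset (starts empty)
    prev_gray = 0
    for k in range(1, 1 << n):
        gray = k ^ (k >> 1)    # k-th subset in Gray-code order
        bit = gray ^ prev_gray # exactly one bit differs from the previous subset
        i = bit.bit_length() - 1
        subtotal += S[i] if gray & bit else -S[i]  # add or drop the flipped element
        if 2 * subtotal == total:
            return True
        prev_gray = gray
    return False

print(has_equal_partition_fast([3, 1, 4, 2]))  # True
print(has_equal_partition_fast([1, 2, 4]))     # False
```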

  22. Problem 5. So shall we use an nxn matrix to keep track of all sums that have been computed before?

  23. So, for problem 2.2, I’m stuck on the actual proof. Please let me know if I’m going in the right direction.

    If I want to determine if f is O(g), then I have so far:

    f(n) = n^(log n), g(n) = (log n)^n,

    So, taking the log of both sides on both equations yields:

    log f = log(n)log(n).
    log g = n log(log(n)).

    Is this correct so far?

    So, to determine if f is O(g), by the definition of big-O I have:

    log(n)log(n) <= c * n log(log(n)), for all n >= n0,

    Does this look correct so far?

    If so, I guess I don’t know where to proceed from here… If not, what would be a better way to prove this?

    Thanks everybody for your help.

    -TIM

  24. Tim,

    log(n)log(n) < n log(log(n)), therefore n^(log n) < (log n)^n. There's not much to talk about after that: "<" means grows slower.

  25. Hi Lucia,

    Thanks for the response. I guess my question is, will this be enough to get full credit on the question? Will it be enough of a justification to say:

    'therefore n^(log n) < (log n)^n, therefore f is O(g)'??

    I guess I just want to make sure I'm justifying my answer enough.

    Thanks for the response, though.

  26. Tim, I got full credit for my HW and I didn't have much English there, only math.

  27. @Tim: What you have is right so far, unless I've made another mistake.

    (log n)*(log n) = (log n)^2 and
    n(log log n) > n = (sqrt n)^2

    log n is O(sqrt n)

    is most of the rest of my work on that one. I also did a little work to show that we can actually make a conclusion about f and g.
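    A quick numeric sanity check of that comparison (my own sketch; it doesn't replace the proof, which needs the inequality for all n >= n0): compare log f(n) = (log n)^2 against log g(n) = n log(log n). Where log f <= log g (and both are positive), f <= g follows.

```python
import math

def log_f(n):
    return math.log(n) ** 2            # log of f(n) = n^(log n)

def log_g(n):
    return n * math.log(math.log(n))   # log of g(n) = (log n)^n

for n in (10, 100, 1000, 10**6):
    print(n, log_f(n) < log_g(n))      # True at each of these n
```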

  28. What building and room is the lecture in? I will be attending the midterm in class, but I don't know where the lecture hall is located.

  29. Hey Luis.
    Woodward Lecture Hall 147

    See ya there.

    -TIM

