Thursday, September 29, 2011

Problem Set #5

Problems 2.1, 2.3, 2.5, due Thursday 10/6.
10 points each. Discussion in the comments section.

Everyone who posts a comment in this thread by Sunday night gets +3 points on this assignment.
Be sure to include your name so we can credit you with these points.

Don't forget: Reading assignment for this week is Sections 3.1 to 3.4

34 comments:

  1. On question 2.5(a), I don't know if it's my book or the problem, but there is a ' in my book after log base 2 of f(n), so it looks like a derivative, e.g. log2(f(n))'. Is that right, or is it just f(n)?

    Thanks!

    Nialls Chavez

  2. 2.5 (a)
    log2(f(n)) = O(log2(g(n))). It must be a misprint in your book.

    Also, for 2.1, do we just show f(n+1), or should we also find the general equation, e.g. for f(n+2) and so on?

    ~Krista Conley

  3. Krista, I guess it'll be just (a) f(2n) and (b) f(n+1).

  4. I actually solved all of 2.5 last week for a better understanding of how to do 2.5(b), so I have a quick question on 2.5(a). It can go both ways. Should we explain the general case only, or do we have to address the special case as well?

  5. One clarification: on problem 2.5, don't bother doing part (b), which was previously assigned. The assignment should read "2.5 (a) and (c)" (5 points each).

    @Lucia 9/29 6:39:
    You are right about 2.5(a). I want everyone to assume for this problem that the functions f(n) and g(n) are at least 2, for all large enough values of n. I encourage you to figure out why this assumption changes the correct answer to this problem, and explain briefly (in a line or two) in your solution.

    Solution set for problem set #3 will go up later tonight, but don't wait up.

    Can anyone tell me what all the partying on Johnson field tonight was in honor of? It looked like fun.

  6. I believe it was a pep rally. The football team is playing New Mexico State on Saturday.

    Matt A

  7. If anyone in this class commutes from Santa Fe, it might be nice to study together up there on evenings or weekends :)
    Amanda Minnich

  8. It looks like 2.3 is going to be a little easier than 2.4. I'm wondering what approach people took for 2.4; I know that a few people did limits and others did graphing. 2.1 looks like it's just algebra, since it asks for a quantitative answer to how much slower the algorithms are.

    Is plugging in a huge value for f(n), like f(n) = 2 x 10^50 * n and g(n) = n, generally a good idea for finding a counterexample in problems like #5?

    --Lucas Nunno

  9. Happy to see problem 2.1 and 2.3 on the assignment, since I did them last week to help with homework #4. :)

    Julian Apodaca

  10. In problem 2.1 I looked at the limit of f(2n)/f(n) and f(n+1)/f(n). For f(2n)/f(n) this provides a meaningful result, however for f(n+1)/f(n) the result is less instructive. Should I be concerned with smaller values of n rather than consider the limit as n gets large?

    (Ken Potter +3)

  11. I was wondering why you couldn't just take the log of both sides for last week's assignment problem 2.5b, and then I studied the solution more carefully.

    *blows dust off algebra II book* logs and exponents, we meet again

    Mark S Montoya

  12. Ken,

    I had the same question: with log(n+1)/log(n) as n >> 1, the limit does not seem very helpful!

    Pat Conley

  13. Lucas,

    I was wondering the same thing about how other people solved 2.3 and 2.4. For most of them, I tried to simplify them to smaller problems. For example, to see if n <= n^2, an n can be factored out of each side to give 1 <= n, where it is easier to see that the right-hand side grows faster. For those that were tougher to simplify, I did find myself taking the limit of f(n)/g(n) to confirm my assumptions about the growth rates. I really wasn't sure if either of these approaches was the best way to do it, though.

    April

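April's simplify-or-take-the-limit approach above is easy to sanity-check numerically. Here is a minimal Python sketch (the helper name `ratio_trend` is mine, not from the book): if f(n)/g(n) heads toward 0 as n grows, that suggests f grows strictly slower than g.

```python
def ratio_trend(f, g, ns=(10, 100, 1000, 10_000)):
    """Return f(n)/g(n) for growing n; values heading toward 0
    suggest f grows strictly slower than g."""
    return [f(n) / g(n) for n in ns]

# Example from the comment above: n vs n^2.
# The ratio n / n^2 = 1/n shrinks toward 0, so n grows slower than n^2.
print(ratio_trend(lambda n: n, lambda n: n * n))
```

This is only numeric evidence, of course, not a proof; a slowly diverging ratio like log(n) can look flat over a small range of n.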
  14. This is a cheap way to get the points for the post...but, it's relevant for chapter 2, and funny:

    http://xkcd.com/835/

    (Tim C'de Baca)

  15. This comment has been removed by the author.

  16. The pep rally was the burning of the Aggie. They do it every year that UNM plays against State. My favorite xkcd is Substitute (http://xkcd.com/135/).

    Benjamin Lande

  17. Ken and Pat,

    To see how much slower the algorithm runs, you don't need to divide by f(n). In several of the cases in this problem, the answer is not a multiplicative factor but an additive amount of extra time.

  18. How would you handle (n + 1) log (n + 1)? I thought I was all right with math, but this one threw me.

  19. 2.1d
    well, since for most of them we're saying 2 times slower, or 2^n times slower, etc... in this case we could say...

    "with (n+1), it is ( (n+1)log(n+1) / nlog(n) ) times slower"?

    my aforementioned de-dusted algebra book offers no insight into reducing log(n+1), so I think this is the most reduced form?

    -Mark Montoya

  20. It kind of doesn't make sense, though. Say, in the case where it is the previous time + 2, can we say it was (n + 2)/n times slower? Or x (1 + 2/n)? Funny.

  21. The problem says "How much slower" not "How many times slower".
    So if it's more informative, you can answer by specifying f(n+1) - f(n) or by specifying f(n+1)/f(n). Or both.

    If n * log(n) is giving you trouble, I suggest you print out a plot or table of these values for a wide range of n, and see if it gives you any ideas.

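A quick way to act on the plot-or-table suggestion above, in Python (just an illustration; `f` here is n log2 n, the function people are stuck on):

```python
import math

def f(n):
    return n * math.log2(n)

# Tabulate the additive slowdown f(n+1) - f(n) and the
# multiplicative slowdown f(n+1) / f(n) over a wide range of n.
print(f"{'n':>10} {'f(n)':>14} {'additive':>10} {'ratio':>8}")
for n in (8, 64, 1024, 2**20):
    print(f"{n:>10} {f(n):>14.1f} {f(n + 1) - f(n):>10.3f} {f(n + 1) / f(n):>8.5f}")
```

The table makes the contrast visible: the additive gap keeps growing (roughly like log n), while the ratio heads toward 1, which is why the ratio alone felt uninformative.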
  22. Heh, do we have to provide an "exact" number (or formula), or will an answer like "somewhat sort of logarithmish" do?

  23. I did the hard parts of 2.3 and 2.4 by taking the log of both sides and/or dividing both sides by n or n^2 or something. Taking the log of both sides was more useful.

    As far as log(n+1) vs. log(n) goes, they're almost equal. For base 2, log(2n) = log(n) + 1, so log(n+1) is somewhere between log(n) and log(n) + 1, and it's closer to log(n).

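The sandwich bound in the comment above (log2(n) <= log2(n+1) <= log2(n) + 1 for n >= 1) is easy to verify numerically with a short Python loop:

```python
import math

# log2(n) <= log2(n+1) <= log2(n) + 1 for n >= 1, and the gap
# log2(n+1) - log2(n) shrinks toward 0 as n grows.
for n in (1, 2, 16, 256, 65536):
    lo, mid = math.log2(n), math.log2(n + 1)
    assert lo <= mid <= lo + 1
    print(n, round(mid - lo, 6))
```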
  24. Yes, it is obviously n log n plus the log of something. That's what I am talking about. Should we calculate this log of something or not?

  25. OK - another try...

    Graphing can help, but you need to be careful when combining graphs of several functions - sometimes the differing scales can obscure the true growth of the function.

    Ken Kressin

  26. Yes, Ken, but what if you are graphing (n+1)log(n+1)-n log n? It won't obscure anything.

  27. That one can work - also, Lucia - not sure how much help, but one property of logs is: nlogn = log(n^n), which might also help with (n+1)log(n+1)...

    I'll need to look at it better tomorrow.

    Ken Kressin

  28. Yes, I think it is one of those problems where you have to compare your f(n+1) - f(n) with some logs, like we did in #2. See if it grows faster than log(n-1) or log(n+1) - n log n or something like that.

  29. One of my friends sent me this on Steam, it made me chuckle:

    I'm going to assume that you know what bogosort is. If you don't, here's the algorithm:
    1. If the list is sorted, stop.
    2. If it isn't, shuffle the list randomly.
    3. Go back to step 1.

    Obviously this is extremely inefficient. However, there's a solution to this inefficiency, and this solution is quantum computing. Because of the many-universes hypothesis in quantum physics, a list that is sorted randomly will create an infinite number of universes, with some universes containing a sorted list. In this case, quantum computing can make bogosort work in O(n). Here's the new algorithm:
    1. Shuffle the list randomly.
    2. If the list is sorted, stop.
    3. If it isn't sorted, destroy the entire universe.

    Since the only survivors of this rather apocalyptic approach to computing will be in universes where the list was sorted after the first shuffle, it is quite efficient. Checking if a list is sorted requires n-1 comparisons, and I'm going to assume an entire universe can be destroyed in O(1), as it only ever has to happen once. Thus, bogosort becomes an O(n) sort algorithm.

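For anyone who'd rather not destroy the universe, here is the classical version of the algorithm above as a runnable Python sketch (on very small lists only):

```python
import random

def is_sorted(xs):
    # n - 1 adjacent comparisons, as the comment above notes.
    return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

def bogosort(xs):
    """Shuffle until sorted: expected O(n * n!) time, so keep n tiny."""
    xs = list(xs)
    while not is_sorted(xs):
        random.shuffle(xs)
    return xs

print(bogosort([3, 1, 2]))
```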
  30. While Lucas's comment is obviously intended as a joke, it is not that far from the reality of some algorithms for quantum computers. For example, check out Grover's algorithm, which can tell you if your array of n integers contains any 7's (for instance), in only sqrt(n) time!

    If one believes in the many worlds interpretation of quantum physics, then one can view Grover's algorithm as picking a random entry in the database, checking whether it has a 7, and if not, using an Hadamard transformation to "cancel out" that universe. Of course, this "magic" has rules, and one has to be very careful trying to talk about such algorithms casually, rather than in the language of the deep mathematics that underlies them.

  31. This comment has been removed by the author.

  32. This comment has been removed by the author.

    Replies
    1. This blog is now closed.
