Friday, November 21, 2014

The Unsolvable Problem

Note: This post covers Week 12 (there was no class during Week 11).

Since the most difficult concept always seems to get introduced in the last class before the exam, this week we explored the halting problem. Essentially, we were asked to create a function halt that predicts whether or not another function will halt (i.e., stop) on a given input, a task we soon proved was impossible. Normally I wouldn't complain about a problem being proved impossible; intuitively, that means we no longer have to try to solve it. Unfortunately, the halting problem doesn't let us off that easily: to show that some other function is non-computable, we try to reduce halt to it. Since halt is not computable, if halt could be written using a function f, then f cannot be computable either.
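To make the impossibility concrete, here is a rough Python sketch of the classic contradiction (the function names and the return value are mine, purely for illustration): if a correct halt(f, i) existed, we could use it to build a function that defeats it.

```python
def halt(f, i):
    """Hypothetical: returns True iff f(i) eventually stops."""
    ...  # assumed to exist; the argument shows no correct body is possible


def confused(f):
    """Built on top of halt: do the opposite of whatever halt predicts for f(f)."""
    if halt(f, f):
        while True:   # halt says f(f) stops, so loop forever
            pass
    else:
        return 42     # halt says f(f) runs forever, so stop immediately


# Does confused(confused) halt?
# If halt(confused, confused) returns True, then confused(confused) loops forever.
# If it returns False, then confused(confused) returns right away.
# Either way halt is wrong about confused, so no such halt can exist.
```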

Friday, November 14, 2014

Generalizing Big-Oh Proofs

Note: This post covers Week 10.

This was probably the heftiest lecture of the semester. We took big-Oh proofs, which tend to be moderately difficult on their own, and learned how to prove general properties (e.g., if f is in big-Oh of g, then g is in big-Omega of f). Basically, we were given a statement (e.g., f is in O(g)) which provided us with a constant and a breakpoint, and we had to figure out how to mould those variables to prove the consequent (e.g., that g is in Ω(f)).
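As a reminder of how these proofs recycle the given constants, here is a quick sketch of that example in symbols (the particular letters c and B are just the ones I'm used to):

```latex
\text{Given } f \in O(g):\ \exists c > 0,\ \exists B \in \mathbb{N},\ \forall n \ge B,\ f(n) \le c \cdot g(n). \\
\text{Want } g \in \Omega(f):\ \exists c' > 0,\ \exists B' \in \mathbb{N},\ \forall n \ge B',\ g(n) \ge c' \cdot f(n). \\
\text{Pick } c' = \tfrac{1}{c} \text{ and } B' = B. \text{ Then for all } n \ge B',\ g(n) \ge \tfrac{1}{c} f(n) = c' \cdot f(n). \qquad \square
```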

An interesting example (which was asked on A3) was the claim that for any two functions f and g, f is in O(g) or f is in Ω(g). I originally tried to prove this, but then realized that oscillating functions, or a simplified Dirichlet-style function, make the statement false. Hopefully with a bit more practice these proofs will come more easily to me!
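For the record, one concrete counterexample along those lines (my own choice of functions, not necessarily the pair intended on A3) is two functions that take turns dominating each other:

```latex
f(n) = \begin{cases} n^{2} & \text{if } n \text{ is even} \\ 1 & \text{if } n \text{ is odd} \end{cases}
\qquad
g(n) = \begin{cases} 1 & \text{if } n \text{ is even} \\ n^{2} & \text{if } n \text{ is odd} \end{cases}
```

Past any proposed breakpoint there are even n where f(n) exceeds every constant multiple of g(n), and odd n where it falls below, so neither f ∈ O(g) nor f ∈ Ω(g) holds.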

Friday, November 7, 2014

Big-Oh

Note: This post covers both Weeks 8 and 9.

Prof. Zhang introduced the formal definitions of big-Oh and big-Omega in Week 8. We learned how to do some simple proofs involving them, which were daunting at first but turned out to be quite easy once we learned to overestimate (for big-Oh proofs) or underestimate (for big-Omega proofs) the function. I’m glad we’ve had so much practice with proofs, because these were fairly easy to do. The most important part for me was understanding the intuition behind the proofs, which was covered very well in lecture.
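For reference, the definitions look roughly like this (this is how I remember them; the exact symbols used in lecture may differ slightly):

```latex
f \in O(g) \iff \exists c \in \mathbb{R}^{+},\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow f(n) \le c \cdot g(n) \\
f \in \Omega(g) \iff \exists c \in \mathbb{R}^{+},\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow f(n) \ge c \cdot g(n)
```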

We started off Week 9 with both a midterm and the A2 due date. They were probably the most challenging pieces of coursework that we’ve done so far. We had quite a few hints in the previous lecture, though, so they were both fair.

This week we went much more in depth into big-Oh proofs, going over examples of negations and limit-based proofs. We started with a regular proof, but discovered that we could not only overestimate the function but also underestimate the bound, which makes dealing with more complex proofs a bit simpler (though it lets the profs ask us to do more complex proofs on tests…). We also learned negations (which were fairly intuitive) and, yet again, a proof involving limits. If the ratio of the bound to the function approaches infinity as n approaches infinity, then the bound grows more rapidly than the function, and therefore the function is in big-Oh of the bound. We used l'Hôpital's rule to simplify the limits and then followed up with a regular big-Oh proof. This was a fairly complex proof, so much like my classmate Aqib writes here, I had to attend office hours to fully understand how to do it. I like these proofs now, though, because you don't have to hunt for appropriate constants or breakpoints: the limit guarantees that the variables needed to prove big-Oh exist.
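As a small worked instance of the limit technique (my own example, not necessarily the one from lecture): to argue that n^2 is in O(2^n), apply l'Hôpital's rule twice to the ratio of the bound to the function.

```latex
\lim_{n \to \infty} \frac{2^{n}}{n^{2}}
  = \lim_{n \to \infty} \frac{2^{n} \ln 2}{2n}
  = \lim_{n \to \infty} \frac{2^{n} (\ln 2)^{2}}{2}
  = \infty
```

Since the limit is infinite, past some breakpoint B we have 2^n / n^2 ≥ 1, i.e., n^2 ≤ 1 · 2^n for all n ≥ B, which is exactly the constant and breakpoint the big-Oh definition asks for.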

Thursday, October 23, 2014

Algorithm Analysis

Note: This post covers Week 7.

Finally! Some more typical computer science material!

We started off the lecture by discussing proofs by cases (useful for proofs involving absolute values, or when a variable can be either odd or even; I've sketched a small example below) and then reviewed all the proof structures we had done so far. This was a good synopsis to copy down for the exam aid sheet. Then we moved on to "Algorithm Analysis and Asymptotic Notation."
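Here's the proof-by-cases example I mean (a toy statement of my own, not one from lecture): to show that n^2 + n is even for every natural number n, split on whether n is even or odd.

```latex
\text{Case } n = 2k:\quad n^{2} + n = 4k^{2} + 2k = 2(2k^{2} + k). \\
\text{Case } n = 2k + 1:\quad n^{2} + n = (4k^{2} + 4k + 1) + (2k + 1) = 2(2k^{2} + 3k + 1). \\
\text{In both cases } n^{2} + n \text{ is twice a natural number, hence even.} \qquad \square
```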

Prof. Zhang lectured about running time, juxtaposing merge sort (which operates in n log(n) time) and bubble sort (which operates in quadratic time) as an example. Since only the asymptotic behaviour of algorithms is of interest, which is to say only their behaviour as input sizes get very large, we learned that we can place algorithms into separate classes depending on their rate of growth. Thus big-Oh notation was introduced.
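For my own reference, here are standard textbook versions of the two algorithms (not the exact code from lecture): bubble sort's nested loops perform on the order of n^2 comparisons, while merge sort splits the input in half about log(n) times and does linear work per level.

```python
def bubble_sort(a):
    """Quadratic time: the nested loops compare roughly n*(n-1)/2 pairs."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a


def merge_sort(a):
    """n log(n) time: about log2(n) levels of splitting, each merging in linear time."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```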

Friday, October 17, 2014

Proofs, Proofs and More Proofs

Note: This entry combines both Weeks 5 & 6.

Midterm week! This test was fairly easy, as a few minutes spent practicing with last year's test would leave you well prepared. We also finished up assignment 1 this week. It was more challenging than any of the other course material, but quite doable in a few hours.

Prof. Zhang started off Week 5's lecture by proving that "if you did bad, then it is not bad", which was an entertaining example of a direct proof. We learned several new proof techniques, such as proof by contrapositive (when proving P => Q directly is too hard, try not Q => not P; there's a small example sketched below), proof by contradiction (assume the statement is false and arrive at a contradiction), and how to prove statements about sequences. This was a much more challenging class than the previous ones, and it takes a while to get used to proofs. It's confusing at first when you don't know how to start, but I guess that's just something that goes away with practice.
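Here's the contrapositive example I had in mind (my own illustration, not one from the lecture): to prove that if n^2 is even then n is even, it's much easier to prove the contrapositive, that if n is odd then n^2 is odd.

```latex
\text{Claim: } \forall n \in \mathbb{N},\ n^{2} \text{ even} \Rightarrow n \text{ even}. \\
\text{Contrapositive: } n \text{ odd} \Rightarrow n^{2} \text{ odd}. \\
\text{Assume } n = 2k + 1. \text{ Then } n^{2} = 4k^{2} + 4k + 1 = 2(2k^{2} + 2k) + 1, \text{ which is odd.} \qquad \square
```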


Week 6 was spent building on what was learned in the previous week, with examples of how to prove statements involving non-boolean functions (e.g., floor), disprove statements (by proving the negation), and prove limits.

Proofs using the floor function were fairly difficult to do because we had to mould the definition to suit our purposes, as opposed to simply applying memorized laws. This was a major part of assignment 2 and one of the hardest parts of the course so far. Delta-epsilon limit proofs were fairly straightforward because I had already learned them in calculus, but we encountered more difficult instances on the assignment.
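As an example of the definition-moulding involved (a simpler statement of my own choosing than the assignment ones), the defining property of floor is usually the only thing you get to work with:

```latex
\text{Definition: } \lfloor x \rfloor \text{ is the unique integer satisfying } \lfloor x \rfloor \le x < \lfloor x \rfloor + 1. \\
\text{Claim: } \forall x \in \mathbb{R},\ \lfloor x \rfloor > x - 1. \\
\text{Proof: from the definition, } x < \lfloor x \rfloor + 1, \text{ so } x - 1 < \lfloor x \rfloor. \qquad \square
```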

This was an interesting two weeks learning the basics of proofs and I'm interested to see how we'll apply these techniques to more technical computer science scenarios.

Friday, October 10, 2014

Problem Solving Exercise

I will be using Polya's method to attempt to solve the problem depicted here.

Understand the Problem

The problem at hand is to predict whether the creases in a repeatedly folded piece of paper will be hills or valleys.

Devise a Plan

I will try to fold the paper and record the pattern of hills/valleys after each successive fold, in an attempt to uncover a pattern.

Carry Out the Plan

Folds | Hills | Valleys
  1   |   0   |    1
  2   |   1   |    3
  3   |   3   |    8
  4   |   8   |   14

It seems that the number of valleys is the same as the number of hills for the next fold. I can't figure out another pattern beyond that though...

Look Back

I tried one method; perhaps I should try again, because I don't see a pattern yet.
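If I do come back to this, a small simulation might help double-check my hand counts (the folding model below, always folding the strip over the same way, is my assumption; folding the other way would swap the hill/valley labels):

```python
def crease_sequence(folds):
    """Creases read left to right after unfolding: 'V' = valley, 'H' = hill.
    Assumes every fold is made in the same direction."""
    seq = []
    for _ in range(folds):
        # Each new fold keeps the old creases, adds a valley in the middle,
        # and appends the old sequence reversed with hills/valleys swapped.
        flipped = ['V' if c == 'H' else 'H' for c in reversed(seq)]
        seq = seq + ['V'] + flipped
    return seq


for n in range(1, 5):
    s = crease_sequence(n)
    print(n, "folds:", s.count('H'), "hills,", s.count('V'), "valleys")
```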

Acknowledge When and How You're Stuck

I don't really see a pattern so I'm quite lost...

Saturday, October 4, 2014

Finishing up the "Language of Math"

Note: This post covers Week 4.

We added bi-implications and transitivity to our mathematical toolbox this week, and mixed them up with multiple quantifiers. It was a fairly easy lecture, but a good opportunity to practice using the various predicate laws we learned last week (e.g., distributivity, DeMorgan's).
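For my own exam aid sheet, the forms of DeMorgan's laws I reach for most often (standard statements, written the way I remember them) are:

```latex
\neg (P \wedge Q) \equiv \neg P \vee \neg Q
\qquad
\neg (P \vee Q) \equiv \neg P \wedge \neg Q \\
\neg \forall x,\ P(x) \equiv \exists x,\ \neg P(x)
\qquad
\neg \exists x,\ P(x) \equiv \forall x,\ \neg P(x)
```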

We started proofs this week. First we were presented with an example limit proof, which was pretty straightforward because it had already been covered in MAT137 (though it did bring back some déjà vu... bad memories). My friend Isobel is absolutely right here when she writes that taking 137 is very useful for this course; it made this lesson so much easier than if this had been the first time I was exposed to these ideas. We were also introduced to the indented, Python-like proof structure, which is very different (much more organized and symbol-heavy) from the style used in calculus.
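Roughly, that indented structure looks something like this (my reconstruction of the general shape for a universally quantified implication, not the exact course template):

```
Assume x ∈ ℝ                       # introduce a generic element
    Assume P(x)                    # antecedent of the implication
        ...                        # the actual work goes here
        Then Q(x)
    Then P(x) => Q(x)              # implication introduced
Then ∀x ∈ ℝ, P(x) => Q(x)          # x was arbitrary, so generalize
```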