Monday, March 31, 2014

Week 11: Sorting and Efficiency

Sorting is the procedure of arranging a set of elements in a specific order. There are many different sorting algorithms in Python, and each of them has its own benefits. These sorting algorithms have many uses in programming and also in our everyday life (e.g. phone books, iTunes libraries, etc.). We talked about several sorting algorithms in class, so why don't we have a brief look at each of them and their complexities.

1) Bubble Sort:

This algorithm starts by looking at the first two elements in a list and checks to see if they are out of order relative to each other. If yes, it swaps them; otherwise it leaves them where they are. Next, the algorithm looks at the second and third elements of the list and repeats the same procedure on them. This continues until the algorithm goes through the full list. After a complete pass through the list, this entire procedure happens again and again until the list is completely sorted.

Complexity: O(n²)
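Here is how I would sketch those passes in Python (my own minimal version, not the exact code from class, and the function name is just my choice):

```python
def bubble_sort(lst):
    """Sort lst in place using bubble sort."""
    n = len(lst)
    for i in range(n - 1):
        # After pass i, the largest remaining element has
        # "bubbled" up to position n - 1 - i, so we can stop there.
        for j in range(n - 1 - i):
            if lst[j] > lst[j + 1]:
                # Adjacent pair out of order: swap them.
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
```

The nested loops are exactly where the O(n²) comes from: up to n - 1 passes, each looking at up to n - 1 adjacent pairs.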


2) Insertion Sort:

This algorithm divides the list into 2 parts, an unsorted part and a sorted part. At each step of the algorithm an element moves from the unsorted part to the sorted part until the entire list is sorted. The algorithm starts by considering the entire list to be unsorted and moving the first element of the unsorted part to the sorted part. After the placement of the first element in the sorted part, it compares the first element of the unsorted part to the entire sorted part and places it in the right position. As the unsorted element goes through the sorted part of the list until it finds its right place, the elements in the sorted part shift up a place in order to make room for the new number. This procedure continues until no unsorted element is left and the entire list is sorted.

Complexity: O(n²)
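The shifting-to-make-room idea can be written like this (again just my own sketch of the algorithm, not the course's code):

```python
def insertion_sort(lst):
    """Sort lst in place using insertion sort."""
    # lst[:i] is the sorted part, lst[i:] is the unsorted part.
    for i in range(1, len(lst)):
        current = lst[i]  # first element of the unsorted part
        j = i
        # Shift sorted elements up a place until current's spot is found.
        while j > 0 and lst[j - 1] > current:
            lst[j] = lst[j - 1]
            j -= 1
        lst[j] = current
```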


3) Selection Sort:

Selection Sort also divides the list into 2 parts, a sorted and an unsorted one. The difference from Insertion Sort is that instead of comparing the first element of the unsorted part to the sorted part, it takes the smallest element of the unsorted part and simply appends it to the end of the sorted part on each iteration through the list. This procedure continues until there is only 1 element left in the unsorted part, which means all of the other elements are sorted, and since the unsorted element is greater than all of them, the entire list is sorted.

Complexity: O(n²)
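A minimal Python version of this find-the-smallest-and-append idea (my own sketch):

```python
def selection_sort(lst):
    """Sort lst in place using selection sort."""
    # lst[:i] is the sorted part, lst[i:] is the unsorted part.
    for i in range(len(lst) - 1):
        # Find the index of the smallest element of the unsorted part.
        smallest = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[smallest]:
                smallest = j
        # "Append" it to the end of the sorted part by swapping.
        lst[i], lst[smallest] = lst[smallest], lst[i]
```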


4) Quick Sort:

This algorithm starts by choosing the last element of a list as the pivot and puts an imaginary wall before the first element. The goal is to divide the list into 2 parts: elements smaller than the pivot and elements greater than it. After the pivot is chosen, the algorithm looks at every element to the right of the wall and compares it with the pivot. If the element is smaller than the pivot, it swaps that element with the first element to the right of the wall and then shifts the wall up by one index; otherwise, it leaves the element there and goes to the next one. This is repeated until the algorithm reaches the pivot, at which point it swaps the pivot with the first element to the right of the wall. Now the pivot is in its final position, and the algorithm is recursively called on both the smaller and the greater part around the pivot until the full list is covered and in sorted order.

Complexity: O(n²) in the worst case
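The "wall" description above is a partition scheme, and here is how I might code it (a rough sketch of my own; `wall` is just my name for the imaginary wall's index):

```python
def quick_sort(lst, start=0, end=None):
    """Sort lst[start:end + 1] in place using quick sort."""
    if end is None:
        end = len(lst) - 1
    if start >= end:
        return
    pivot = lst[end]  # choose the last element as the pivot
    wall = start      # everything before the wall is smaller than the pivot
    for i in range(start, end):
        if lst[i] < pivot:
            # Swap with the first element to the right of the wall,
            # then shift the wall up by one index.
            lst[i], lst[wall] = lst[wall], lst[i]
            wall += 1
    # Put the pivot in its final position, right at the wall.
    lst[end], lst[wall] = lst[wall], lst[end]
    # Recurse on the smaller and greater parts around the pivot.
    quick_sort(lst, start, wall - 1)
    quick_sort(lst, wall + 1, end)
```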


5) Merge Sort:

Merging is the process of creating a single sorted list out of two other sorted lists. The algorithm for Merge compares the first elements of the two lists and moves the smaller one to the merged list. This process continues until all the elements are in the merged list. The Merge Sort algorithm just recursively divides the list into roughly 2 equal parts until there is only one element in each list and then merges them together, which results in a sorted list.

Complexity: O(n log n)
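Both halves of that description can be sketched in Python like this (my own version of merge and merge sort, not the posted course code):

```python
def merge(left, right):
    """Return a single sorted list made from the sorted lists left and right."""
    merged = []
    i = j = 0
    # Repeatedly move the smaller front element into the merged list.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One list is used up; the leftovers of the other are already sorted.
    return merged + left[i:] + right[j:]


def merge_sort(lst):
    """Return a new sorted list with the elements of lst."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    # Split into two roughly equal halves, sort each, merge the results.
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))
```

Notice that `merge_sort` returns a new list rather than sorting in place, which is exactly the extra-memory price discussed below.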


By comparing the complexities of these algorithms, we can see the difference between the worst-case run-times it takes for them to sort a really large list. From the lab exercises and comparisons that we did in the course, we saw that bubble sort has the worst performance, followed by insertion and selection sort. Also, even though the worst-case run-time of Quick Sort is O(n²), that only happens when we choose the worst pivot possible on each iteration through the list. This explains why the average case of Quick Sort is only O(n log n). We also cannot conclude that Merge Sort will perform faster for sorting every list. As was said before, these complexities are for really large lists, and in fact that is what is important to us programmers, since the difference in run-time of any of these algorithms on a short list would not be really noticeable. It should also be remembered that faster algorithms like Merge Sort do not come without a price. Unlike Bubble, Insertion or Selection Sort, Merge Sort produces a new list instead of modifying the original one, and it takes some additional space in memory for that. This space is a bit of a price, but it usually is reasonable.

Saturday, March 22, 2014

Week 10: Assignment 2

This week, I became a Canadian citizen, celebrated Persian new year and also handed in assignment 2. Persian new year was exactly on the same day that assignment 2 was due, and even though I had been working on it for a while, I still had to spend a good couple of hours on it on that day. I personally found assignment 2 much easier compared to the first assignment, but it still had its own difficulties, which got easily solved with the help of the slides and Danny's tips. We also talked about sorting algorithms this week, but I won't talk about them since the next blog is all about that. I'm still not receiving the mark that I really wanted for the slogs, and that is the reason the next blog will probably be much longer. From what I have learned by looking at the slogs of other people, whose names I sadly don't know, I figured out that our slogs should usually be longer, but sadly it is way too late for me to realize that. The reference links to those slogs are posted below. One other big thing that I accomplished this week was finally figuring out a way to do part (b) of exercise 3, which was by using a tuple, using the first element as a reference and returning the second element, which qualifies for what we wanted. Even though there will be 1 more blog, I consider this the last "open" blog that I will write this term. It was a great year and I really had fun writing these blogs, and I hope I have helped some people the way others' blogs have helped me.






Slog 1: http://csc148lschilling.blogspot.ca/
Slog 2: http://ethanwen1994.blogspot.ca/

Saturday, March 15, 2014

Week 9: Big-Oh

This week was a little different for me since I missed the Wednesday lecture. The good thing is that I am familiar with run-time and the concept of Big-oh, which lessens the cost of that missed lecture. Being able to use what I have learned in one course in another one made me really glad that I chose computer science. Just when we finished learning about Big-oh notation and proofs in CSC165, CSC148 started to talk about it, and this connection made me more interested in both courses. Even though we were never really concerned about efficiency before, I personally always thought about how much work the computer does in order to run our code and whether there is any way to make that less work. Big-oh gives us a general idea about those run-times, and it can really help us choose a better and more efficient algorithm. This week's lab was pretty straightforward once we got BST_rec1.py to work, but I had trouble with the exercise. After working on part (a) for a while I was able to figure it out, but I couldn't really find a way to deal with part (b), which was really frustrating. I hope I'll be able to find out the answer so I don't lose marks the same way on a test.

Sunday, March 9, 2014

Week 8: Linked List

This week was mainly about linked lists. I don't know why, but I see a linked list as a bunch of nodes who are all trying to blame each other. The very first node points at the second one, the second one blames the third and so forth, and at the end there is this poor node who cannot point at anyone else other than a "None". We had two different approaches towards linked lists: one as just a linked list, and one with a wrapper class. The difference between those two approaches is that instead of being a linked list itself, the wrapper class consists of some linked list nodes which all together make a linked list. This week's lab was one of the most useful ones so far. It was really related to the lecture, and I totally felt its effect on my understanding of linked lists. The concept of a wrapper function was also used in a binary search tree, which made it much more understandable for me.
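The wrapper-class approach could look something like this (a sketch of my own, with made-up class names `LLNode` and `LinkedList` rather than the ones from class):

```python
class LLNode:
    """One node of a linked list: a value plus a pointer to the next node."""

    def __init__(self, value, nxt=None):
        self.value = value
        self.nxt = nxt


class LinkedList:
    """A wrapper class: not a node itself, just a reference to the first node."""

    def __init__(self):
        self.front = None

    def prepend(self, value):
        """Add value to the front of the list."""
        self.front = LLNode(value, self.front)

    def __str__(self):
        values, node = [], self.front
        while node is not None:  # the last node points at None
            values.append(str(node.value))
            node = node.nxt
        return ' -> '.join(values + ['None'])
```

Each node "blames" the next one through its `nxt` attribute, and the wrapper just remembers who starts the chain.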

Sunday, March 2, 2014

Week 7: Recursion

Recursion occurs when a function calls itself inside its own body. At the start, the concept of recursion seems really confusing to most people, including myself, but as you practice more problems involving recursion it becomes pretty simple. Being able to trace code is one of the most important abilities for a computer scientist to have, and tracing recursive code is a difficult but interesting task. I found the key to this type of tracing in re-reading the function over and over, each time you get to the part where the function calls itself, until you reach a point where the condition isn't satisfied anymore, which results in using the base case. So what is the base case? Generally, a recursive algorithm repeats itself until the conditions stated in the function are no longer true. At that point, there is another value which gets calculated and returned: the simplest case that can occur in this function, also known as the base case. Starting from the base case and moving forward to more complicated cases makes it much easier to understand how a recursive function works, since you can re-use the previous results in order to calculate new values. So basically, we can conclude that recursion is breaking a complex problem into smaller and simpler cases until we reach the simplest case, which becomes the starting point for us to work our way back up to the original problem while taking easier and more understandable steps.
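The classic factorial function is a tiny example of everything above, a base case plus a smaller version of the same problem (this particular example is my own choice, not one tied to the course):

```python
def factorial(n):
    """Return n! computed recursively."""
    if n == 0:
        # Base case: the simplest input, no recursive call needed.
        return 1
    # Recursive case: re-use the result of a smaller, simpler problem.
    return n * factorial(n - 1)
```

Tracing `factorial(3)` the way described above: it calls `factorial(2)`, which calls `factorial(1)`, which calls `factorial(0)`, the base case; then the results work their way back up as 1, 1, 2, 6.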

Tuesday, February 18, 2014

Week 6: Trees

This week we learned about trees and their terminology. This kind of Abstract Data Type was one of the most interesting ones that we have dealt with so far. Being able to recursively use one procedure on every node in a tree is a really useful way to deal with the entire tree. I also really enjoyed the different types of traversals that we did in class. At first, I didn't know what traversing a tree means, but after researching it a bit more and re-reading the posted code, I saw that it is just going through every element of a tree in a certain order, and that was when I really got what was happening in in-order, pre-order and post-order traversals. I also finished Assignment 1, which was actually harder than what I expected after doing some parts of it. I had never worked with this many different files which all work together at once, and I can say that was a bit of a struggle for me, but I managed to work it out by re-reading the code over and over and also talking to Danny. The main disappointment for me was not being able to figure out the bonus function, but other than that, I have a strong feeling that the rest were fine.
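The three traversals only differ in when the node's own value is visited relative to its children, which a sketch makes obvious (my own minimal version for binary trees, with a made-up `BTNode` class):

```python
class BTNode:
    """A binary tree node with a value and optional left/right children."""

    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right


def inorder(node):
    """Return the values of the tree rooted at node, left, self, right."""
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)


def preorder(node):
    """Return the values of the tree rooted at node, self, left, right."""
    if node is None:
        return []
    return [node.value] + preorder(node.left) + preorder(node.right)


def postorder(node):
    """Return the values of the tree rooted at node, left, right, self."""
    if node is None:
        return []
    return postorder(node.left) + postorder(node.right) + [node.value]
```

Each function is the same recursive walk over every node; only the position of `[node.value]` changes.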

Saturday, February 8, 2014

Week 5: Assignment 1

We are getting closer and closer to assignment 1, and with the first look that we had this week at the Tower of Hanoi game and how the cheeses should move, I started to get afraid of it. I stayed in the library for a few hours and tried to just code it, but it didn't really work. After a while, I made a couple of cheeses out of paper, drew the 4 stools on a piece of paper and tried to do it by hand to get a better understanding of how the algorithm goes. That really helped me, and when I started to code again, it was much easier. This week's lab was really long, and unfortunately we didn't get to finish all of it; even though it was pretty understandable and easy, we didn't have enough time to finish it. My main concern for now is assignment 1, and I still haven't finished it yet, but I will keep you updated in later blogs.
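For anyone else stuck at the same point, the classic three-stool version of the cheese-moving recursion looks like this (my own sketch, not the assignment's four-stool solution, and the function name is made up):

```python
def move_cheeses(n, source, intermediate, destination):
    """Return the list of moves (from_stool, to_stool) that transfer
    n cheeses from source to destination, using intermediate as a spare.
    This is the classic three-stool Tower of Hanoi recursion."""
    if n == 0:
        return []  # no cheeses, no moves
    # Move n - 1 cheeses out of the way, move the biggest one,
    # then move the n - 1 cheeses on top of it.
    return (move_cheeses(n - 1, source, destination, intermediate)
            + [(source, destination)]
            + move_cheeses(n - 1, intermediate, source, destination))
```

Doing this by hand with paper cheeses is exactly tracing this recursion, and the three-stool version takes 2ⁿ − 1 moves; the whole point of the assignment's fourth stool is to do better than that.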