Sunday 30 March 2014

Week 11

During week 11, we discussed the performance and efficiency of sorting algorithms. As a continuation from last week, our professor touched on some topics regarding Big-Oh. Personally, I was confused about how to determine the Big-Oh of an algorithm simply from the given code. In order to get a clearer understanding of how one piece of code can differ in runtime from another, I used online resources. I found that the first thing to check is what kind of job a function performs (searching, sorting, etc.). It is easy enough to see that sorting takes longer than searching: the worst runtime among the sorting algorithms discussed in class and lab was O(n^2), while the worst among the searching algorithms (binary search and linear search) was O(n).

During lab 10, the main task was to graph the runtime of the sorting algorithms. Based on the graph, my partner and I guessed and then checked the efficiency of each case, and in the end we worked out the best and worst cases for selection sort, insertion sort, bubble sort, merge sort, quick sort, and Python's built-in list sort. The key in determining each algorithm's efficiency was whether or not the list was nearly sorted and whether or not it was reversed. Given the list sizes used in this lab (multiples of 400 elements), selection sort had no particular best or worst case; its performance was always O(n^2). For insertion sort, the best case was when the list was already sorted and the worst case was when the list was reversed. For bubble sort, the best case happens when the list is already (or nearly) sorted, since the sort can terminate after the first pass; the worst case happens when the smallest element is located at the far end of the list. For merge sort, both increasing and decreasing ordered lists give the best case; if the input alternates between large and small values, it gives the worst case (more comparisons are needed while merging). For quick sort, a sorted list generates the worst case because then the pivot is always the largest element, meaning the list is partitioned into sub-arrays of size n-1 and 0. Overall, according to our general observations in the lab, Python's built-in sorting function was the most efficient algorithm of all.
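To convince myself of the insertion sort cases described above, I put together a small timing sketch. This is only my own illustration, not the actual lab code; the insertion_sort and time_sort names are mine. On a sorted list the inner while loop never runs, and on a reversed list it runs the maximum number of times, so the timings should show the gap.

import random
import time


def insertion_sort(lst):
    """Sort lst in place using insertion sort."""
    for i in range(1, len(lst)):
        value = lst[i]
        j = i
        # Shift larger elements one step right until value fits.
        while j > 0 and lst[j - 1] > value:
            lst[j] = lst[j - 1]
            j -= 1
        lst[j] = value


def time_sort(sort, lst):
    """Return the seconds taken by sort() on a copy of lst."""
    copy = lst[:]
    start = time.perf_counter()
    sort(copy)
    return time.perf_counter() - start


if __name__ == '__main__':
    n = 1600  # a multiple of 400, like the list sizes in the lab
    cases = [('sorted', list(range(n))),
             ('reversed', list(range(n, 0, -1))),
             ('random', [random.randint(0, n) for _ in range(n)])]
    for name, data in cases:
        print(name, time_sort(insertion_sort, data))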

Thursday 27 March 2014

Week 10

During this week, the main topics we dealt with were Big-Oh and sorting. Big-Oh notation was something that coincided with the material covered in CSC165. In CSC165, I got the chance to see how to prove Big-Oh (an upper bound), a lower bound, and a tight bound, and this background knowledge helped me understand this week's material in CSC148. The topic was interesting because we got to see the runtimes of several kinds of sorting algorithms. Apart from this, we had assignment part 2 due on Thursday. Part 2 of the assignment was about regular expressions. In part 1, we wrote a class RegexTree and other subclasses that inherit from RegexTree. This time, in part 2, we dealt with the string representations and tree representations of regular expressions. We mainly worked on how to break a string apart at its key character. My partner and I struggled with how to split the string at the right position; the multiple brackets in the regex string representation were an obstacle. However, once we figured out a proper algorithm for breaking down the string representation, the rest of the assignment was not as hard.
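As a rough illustration of the bracket-counting idea, here is a sketch in my own words; this is not the actual assignment code, and the exact regex string format it assumes (strings such as '(1.(0|2))') may differ from the handout's. The split point of a bracketed string can be found by tracking the nesting depth as you scan it.

def find_split(s):
    """Return the index of the top-level '.' or '|' inside a bracketed
    regex string such as '(1.(0|2))', or -1 if there is none.
    Only a sketch of the bracket-counting idea, not the assignment code."""
    depth = 0
    # Walk the characters between the outermost pair of brackets.
    for i, ch in enumerate(s[1:-1], start=1):
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
        elif ch in '.|' and depth == 0:
            return i
    return -1


if __name__ == '__main__':
    print(find_split('(1.(0|2))'))   # 2 -> split into '1' and '(0|2)'
    print(find_split('((1.2)|0)'))   # 6 -> split into '(1.2)' and '0'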

Saturday 15 March 2014

Week 9

This week's lecture material was about the class BSTNode and the class BST. One challenging task this week was completing Lab 8. In lab 8, we were asked to write four functions that essentially perform the same job. The only difference between the functions was where they were written: we had to write one function in BSTNode (with no modification to BST), and, in another file, we had to write the function in BST (with no modification to BSTNode), and so on. Also, whether or not to use a helper function was something we had to figure out during the lab. During this process, my partner and I came up with a vague idea of what the function must do in each case, but it took a lot of trial and error to find the precise way of coding it. The big hint in this lab emerged after we understood the two different sets of attributes (those of BSTNode and those of BST). The tip I learned today is that, when writing a method that alters an object of a newly defined class, it is important to know the attributes and operations of that class before writing any code.
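To summarize the difference for myself, here is a small sketch of the same job written once inside BSTNode and once inside BST with a helper. The attribute names (value, left, right, root) are assumptions for this sketch and may not match the lab's exact classes.

class BSTNode:
    """A node in a binary search tree (attribute names assumed for this sketch)."""

    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

    def count(self):
        """Return the number of nodes in the subtree rooted at self,
        written entirely inside BSTNode."""
        left_count = self.left.count() if self.left else 0
        right_count = self.right.count() if self.right else 0
        return 1 + left_count + right_count


class BST:
    """A binary search tree wrapper with a root attribute (assumed)."""

    def __init__(self, root=None):
        self.root = root

    def count(self):
        """Return the number of nodes, written entirely inside BST using a
        helper that walks BSTNode objects without modifying BSTNode."""
        def _count(node):
            if node is None:
                return 0
            return 1 + _count(node.left) + _count(node.right)
        return _count(self.root)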

Friday 7 March 2014

Week 8

During week 8, we mostly discussed two topics: linked lists and binary search trees. In Monday's and Wednesday's lectures, our professor introduced us to the class LinkedList, which behaves similarly to an ordinary list but has different attributes and its own functionality. Since I had only dealt with simple data structures (those that use Python's built-in classes, such as lists and dictionaries), it was difficult at first to get an idea of how the "linking" occurs in a LinkedList. However, drawing the nodes and labelling the value (head) and next value (rest) helped me get a clear picture of its structure. During Tuesday's lab, our task was to write the operations __len__, __setitem__, delete item, and insert. Through this lab, my partner and I spent most of our time trying to figure out the base case and how to use recursive calls on LinkedLists. I believe that, except for figuring out the proper base case, there was nothing tricky about what each operation does (how exactly, or where exactly, the item is inserted or deleted, and so on).
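Here is a minimal sketch of the picture I drew, written as code; the class and attribute names (empty, value, rest) are my own and not necessarily the lecture's exact LinkedList. The recursive __len__ shows the base-case-plus-recursive-call shape we kept running into.

class LinkedList:
    """A minimal linked list sketch: either an empty list, or a value (the
    head) followed by another LinkedList (the rest)."""

    def __init__(self):
        self.empty = True
        self.value = None
        self.rest = None

    def prepend(self, value):
        """Insert value at the front of the list."""
        rest = LinkedList()
        rest.empty, rest.value, rest.rest = self.empty, self.value, self.rest
        self.empty, self.value, self.rest = False, value, rest

    def __len__(self):
        # Base case: an empty list has length 0.
        if self.empty:
            return 0
        # Recursive step: one node plus the length of the rest.
        return 1 + len(self.rest)


if __name__ == '__main__':
    lst = LinkedList()
    for v in [3, 2, 1]:
        lst.prepend(v)
    print(len(lst))  # 3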

Thursday 27 February 2014

Week 7

Recursion, by definition, refers to "...a method where a solution to a problem depends on solutions to smaller instances of the same problem" (Google dictionary). For the last several weeks, including week 7, our class has dealt with coding examples and problems involving recursion. Earlier, the professor gave us a visual demonstration of recursion through a turtle example. When visualized, we were able to see a pattern where the large picture consists of smaller copies of itself. A tree with three branches from the root, each of which ends in three more branches, each of which ends in another three, and so on, is an example of a recursive pattern. In terms of practical coding, we learned that list comprehensions are useful for writing recursive functions. The common pattern of recursive code I have seen is roughly [f(item) if (condition) else (base case) for item in object]. Depending on the situation, there was variation in where the recursive call and the base case sit within this structure; often I encountered cases where the base case was given as the else clause of the conditional expression, and the normal case, the one that makes the recursive call f(item), was written at the front with its if clause attached. One of the challenging tasks related to this topic was coming up with the right condition statement. The latest topic discussed in week 7 was linked lists, and Lab 6 was indeed about linked lists. The task given to us was to insert and delete items from a tree (of a class other than list). Although linked lists are difficult to follow, so far we have only dealt with trees of arity 1 (branching factor 1), so we were able to draw a parallel between the tree and a sequence of list items. In summary, week 7's material was a continuation of recursion plus linked lists; the most important thing to pick up from the lectures was how to implement functions that can modify an existing list.
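As a small example of that pattern (my own example, not one from the course handouts), a recursive sum over a possibly nested list puts both the recursive call and the base case inside the comprehension:

def nested_sum(obj):
    """Return the sum of all numbers in a possibly nested list.

    The recursive call runs when an item is itself a list; otherwise the
    item is the base case and is used directly."""
    return sum(nested_sum(item) if isinstance(item, list) else item
               for item in obj)


if __name__ == '__main__':
    print(nested_sum([1, [2, 3], [4, [5]]]))  # 15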

Saturday 22 February 2014

Week 6

This week we continued learning about recursion. In the lab, we worked on coding the dot product and the matrix-vector product. During the process, we were asked to write the code without using list comprehensions; mostly, our task was simply to write code that uses a for loop (in some cases, nested loops). The task was relatively easy compared to previous labs. The difficult job assigned was A1, the Tower of Hanoi assignment. Personally, I spent most of my time with my partner debugging TOAHModel and discussing tour.py over and over. In order to approach parts 5 and 6 of the assignment, which required a concrete understanding of how recursion works, we first traced simple cases of the function that moves cheeses recursively. After that we were able to get a clue about how to adapt the three_stool method into the four_stool method. Since my learning style was passive during the lectures, I had to revisit the lecture notes and review them. One good tip I can suggest for future reference is to trace as many example cases as possible, from the simplest to slightly more complicated ones that fit the recursive function, before actually writing the code.
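For reference, here is roughly what the loop-based lab functions looked like in my own words (a sketch under my own naming, not the exact lab solution):

def dot_product(u, v):
    """Return the dot product of equal-length number lists u and v,
    using an explicit loop instead of a list comprehension."""
    total = 0
    for i in range(len(u)):
        total += u[i] * v[i]
    return total


def matrix_vector_product(m, v):
    """Return the product of matrix m (a list of rows) and vector v,
    again written with plain loops."""
    result = []
    for row in m:
        result.append(dot_product(row, v))
    return result


if __name__ == '__main__':
    print(dot_product([1, 2, 3], [4, 5, 6]))                 # 32
    print(matrix_vector_product([[1, 0], [0, 2]], [3, 4]))   # [3, 8]

And the three-stool move we traced follows the classic pattern below; again this is a plain sketch with my own function and parameter names, not the assignment's TOAHModel or tour.py code.

def move_cheeses(n, source, intermediate, destination):
    """Print the moves that transfer n cheeses from source to destination
    using one intermediate stool (the three-stool pattern traced by hand)."""
    if n == 1:
        # Base case: a single cheese moves directly.
        print(source, '->', destination)
    else:
        # Move the top n-1 out of the way, move the biggest, then stack back.
        move_cheeses(n - 1, source, destination, intermediate)
        move_cheeses(1, source, intermediate, destination)
        move_cheeses(n - 1, intermediate, source, destination)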

Friday 7 February 2014

Week 5

During this week, we continued dealing with recursive code, and we were also introduced to a new topic: how Python's scoping rules operate and how names and values are looked up. First, at the beginning of Monday's lecture, Professor Heap briefly mentioned the strategy we could use for the Tower of Hanoi code. Understanding the recursive structure of the code used to move cheeses (the code for the Tower of Hanoi) gave me a good starting point for completing the assignment. In the Week 5 lab, my partner and I again worked on tracing and writing recursive code; mainly, we wrote the body of the SearchList function. Since this was the first time in my coding experience that I had written a recursive function, I struggled with writing the base case. On top of the course material discussed this week, the supplemental readings on the course website were useful for my lab partner and me in implementing the function. Communicating with other classmates also helped us see variations in coding. One tip I would recommend for future reference is to create simple test cases before moving on to defining the function.
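Looking back, the kind of base case I struggled with looks something like this. This is only my reconstruction of the idea behind a recursive search over a nested list; the real SearchList function in the lab may have had a different signature or behaviour.

def search_list(value, obj):
    """Return True if value appears anywhere in the possibly nested list obj."""
    # Base case: obj is not a list, so just compare it to value.
    if not isinstance(obj, list):
        return obj == value
    # Recursive case: value is found if it appears in any element of obj.
    return any(search_list(value, item) for item in obj)


if __name__ == '__main__':
    print(search_list(5, [1, [2, [5]], 3]))  # True
    print(search_list(9, [1, [2, [5]], 3]))  # False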