Monday, February 23, 2026

Week 7

 Week 7 (2/18-2/24)

This week covered a lot of topics and algorithms. These are some of the concepts I learned:

Counting Sort: Count how many times each value appears (frequency), then build the cumulative distribution to place items in sorted order.
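To make this concrete for myself, here is a minimal Python sketch of the distribution-counting idea (the function name is my own; it assumes non-negative integer keys up to a known max_val):

```python
def counting_sort(a, max_val):
    # Frequency of each value 0..max_val
    count = [0] * (max_val + 1)
    for x in a:
        count[x] += 1
    # Cumulative distribution: count[v] = number of items <= v
    for v in range(1, max_val + 1):
        count[v] += count[v - 1]
    # Place items into their final positions; scanning right-to-left keeps it stable
    out = [0] * len(a)
    for x in reversed(a):
        count[x] -= 1
        out[count[x]] = x
    return out
```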

Radix Sort (LSD): Sort numbers digit by digit starting from the least significant digit (ones place), using a stable sort each pass.
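A small sketch of the LSD idea in Python, using bucket lists per digit as the stable pass (my own function name; assumes non-negative integers):

```python
def radix_sort_lsd(nums):
    # Sort digit by digit, least significant first.
    # Each pass must be stable; appending to per-digit buckets preserves order.
    if not nums:
        return nums
    exp = 1
    while max(nums) // exp > 0:
        buckets = [[] for _ in range(10)]
        for x in nums:
            buckets[(x // exp) % 10].append(x)
        nums = [x for b in buckets for x in b]
        exp *= 10
    return nums
```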

Dynamic Programming (DP): Solve problems by storing results of smaller subproblems and building up to the final answer.

Coin-Collecting Problem: In a grid, find the maximum coins collectable when moving only right or down using a DP table.
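A sketch of the DP table for this problem (grid values and the function name are my own example):

```python
def coin_collecting(grid):
    # F[i][j] = max coins collectable on arrival at cell (i, j),
    # moving only right or down from the top-left corner.
    n, m = len(grid), len(grid[0])
    F = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            best_above = F[i - 1][j] if i > 0 else 0
            best_left = F[i][j - 1] if j > 0 else 0
            F[i][j] = max(best_above, best_left) + grid[i][j]
    return F[n - 1][m - 1]
```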

Coin-Row Problem: Choose coins in a row to maximize value without taking adjacent coins.
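The recurrence is F(i) = max(F(i-1), F(i-2) + c_i): either skip coin i or take it and add F(i-2). A minimal Python sketch (function name is mine):

```python
def coin_row(coins):
    # prev1 = F(i-1), prev2 = F(i-2); roll the two values forward
    prev2, prev1 = 0, 0
    for c in coins:
        prev2, prev1 = prev1, max(prev1, prev2 + c)
    return prev1
```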

Warshall’s Algorithm: Finds the transitive closure of a graph (reachability: whether a path exists between vertices).

Floyd’s Algorithm: Finds all-pairs shortest paths by updating distances with intermediate vertices.
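Both algorithms share the same triple loop over an "intermediate vertex" k; Warshall's version uses or/and on booleans where Floyd's uses min/+ on distances. A sketch of Floyd's (matrix layout and names are my own):

```python
INF = float('inf')

def floyd(dist):
    # dist[i][j]: direct edge weight, INF if no edge, 0 on the diagonal
    n = len(dist)
    d = [row[:] for row in dist]
    for k in range(n):              # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```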

Greedy Method: Build a solution step by step by always choosing the best local option at the moment.

MST (Minimum Spanning Tree): Connect all vertices in a weighted graph with the minimum total edge weight and no cycles.

Prim’s Algorithm: A greedy MST algorithm that starts from one vertex and repeatedly adds the smallest edge connecting the current tree to a new vertex (weights matter first; alphabet only breaks ties).
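A sketch of Prim's using Python's heapq: storing (weight, vertex) tuples makes the heap compare weight first and vertex name second, which matches the "weights first, alphabet breaks ties" rule. The graph format is my own choice:

```python
import heapq

def prim_mst(graph, start):
    # graph: {vertex: [(weight, neighbor), ...]}
    visited = {start}
    edges = list(graph[start])
    heapq.heapify(edges)
    mst = []                        # (weight, vertex added) pairs
    while edges and len(visited) < len(graph):
        w, v = heapq.heappop(edges)  # smallest edge leaving the current tree
        if v in visited:
            continue
        visited.add(v)
        mst.append((w, v))
        for e in graph[v]:
            if e[1] not in visited:
                heapq.heappush(edges, e)
    return mst
```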

Monday, February 16, 2026

Week 6

 Week 6 (2/11-2/17)

For this week, we practiced AVL trees by inserting values and fixing balance with rotations. We also worked with 2–3 trees: inserting keys, splitting 3-nodes, and reading the results level by level. With max heaps, we practiced inserting (bubble/sift up), deleting the max twice (swap with the last element, then sift down), and connected this to heapsort and the array-based bottom-up heap-construction method. Finally, we covered hashing: using K mod m, detecting collisions, resolving them with separate chaining or linear probing, and using load-factor thresholds to trigger rehashing to a larger table size. Going to office hours has been extremely helpful for me in this class to better grasp the concepts.
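To remind myself how linear probing works, here is a tiny sketch of K mod m insertion (the table size m = 7 and the keys are made-up practice values):

```python
def hash_insert(table, key):
    # h(K) = K mod m; on a collision, linear probing scans forward
    # (wrapping around) until it finds a free slot.
    m = len(table)
    i = key % m
    while table[i] is not None:
        i = (i + 1) % m
    table[i] = key
    return i

table = [None] * 7
for k in [19, 26, 13, 48]:      # 19 -> slot 5; 26, 13, 48 all collide and probe
    hash_insert(table, k)
```

In a real implementation, a load factor above some threshold (e.g. 0.75) would trigger rehashing into a larger table before insertion.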

Sunday, February 8, 2026

Week 5

 Week 5 (2/4-2/10)

I made a little summary of what I learned from this week's content. I am hoping this summary helps me study for the final.

QuickSort:

QuickSort is a divide-and-conquer sort. It picks a pivot, then partitions the list so numbers smaller than the pivot go left and numbers bigger go right. After the first partition, QuickSort recursively sorts the left and right parts.

Partitioning (i and j pointers):

You keep moving i from the left (looking for a number that is too big) and j from the right (looking for a number that is too small). In other words, i moves from the left until it finds a value > pivot (too big, so it belongs on the right), and j moves from the right until it finds a value < pivot (too small, so it belongs on the left). Then you swap those two values.

When i and j cross, you swap the pivot into its final spot. 
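A Python sketch of this i/j partitioning (Hoare-style, with the first element as pivot; function names are my own):

```python
def partition(a, lo, hi):
    # i scans right for a value > pivot; j scans left for a value < pivot;
    # swap them until the pointers cross, then swap the pivot into place.
    pivot = a[lo]
    i, j = lo, hi + 1
    while True:
        i += 1
        while i <= hi and a[i] < pivot:
            i += 1
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]
    a[lo], a[j] = a[j], a[lo]   # pivot lands in its final spot
    return j

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        s = partition(a, lo, hi)
        quicksort(a, lo, s - 1)    # recursively sort the left part
        quicksort(a, s + 1, hi)    # and the right part
    return a
```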

Median-of-Three Partitioning:

This improves QuickSort when the data is already sorted or reverse-sorted (which can make plain QuickSort slow). It chooses the pivot as the median of the first, middle, and last elements, then partitions using that pivot. Also, median-of-three is not applied when a subarray has 3 or fewer elements, since there are not enough elements to pick a meaningful median from.

Binary Tree Traversals:

Traversal is just “the order you visit nodes”:

Preorder: root, left, right

Inorder: left, root, right

Postorder: left, right, root
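The three orders only differ in where the root is visited. A sketch using nested tuples (value, left, right) as the tree representation (my own choice):

```python
# Sample tree:        A
#                    / \
#                   B   C
#                  / \
#                 D   E
tree = ('A', ('B', ('D', None, None), ('E', None, None)), ('C', None, None))

def preorder(t):
    return [] if t is None else [t[0]] + preorder(t[1]) + preorder(t[2])

def inorder(t):
    return [] if t is None else inorder(t[1]) + [t[0]] + inorder(t[2])

def postorder(t):
    return [] if t is None else postorder(t[1]) + postorder(t[2]) + [t[0]]
```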

Binary Search:

Binary search works on a sorted list. You check the middle value, then go left half or right half depending on whether the target is smaller or bigger. Each step cuts the search space in half, so it’s fast (log time).
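The halving step in code (a minimal iterative sketch; returns the index, or -1 if the target is absent):

```python
def binary_search(a, target):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif target < a[mid]:
            hi = mid - 1        # target is smaller: keep the left half
        else:
            lo = mid + 1        # target is bigger: keep the right half
    return -1
```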

DAG and Topological Sorting:

A DAG is a directed graph with no directed cycles.

A topological order is a list of nodes where every arrow goes from earlier to later. If there’s a cycle, you cannot do a topological sort.

Kahn’s Algorithm:

Finds nodes with in-degree 0 (no incoming arrows), removes one (using alphabetical order to break ties), updates the in-degrees of the nodes it pointed to, and repeats until every node is removed.
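A sketch of those steps in Python, keeping the zero in-degree candidates sorted so ties break alphabetically (graph format and names are my own):

```python
def kahn(graph):
    # graph: {node: [successors]}
    indeg = {u: 0 for u in graph}
    for u in graph:
        for v in graph[u]:
            indeg[v] += 1
    ready = sorted(u for u in graph if indeg[u] == 0)
    order = []
    while ready:
        u = ready.pop(0)           # alphabetically smallest zero in-degree node
        order.append(u)
        for v in graph[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
        ready.sort()
    if len(order) != len(graph):
        raise ValueError("graph has a cycle; no topological order exists")
    return order
```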

Monday, February 2, 2026

Week 4

 Week 4 (1/28-2/3)

Besides studying mainly for the midterm, this week's content was focused on merge sort. Merge sort is a clear divide-and-conquer method: split the array, sort each half, then merge them back in order. It runs in O(n log n) time, is stable, and works well for large data. The tradeoff is that it needs extra memory for merging, so it’s often used when predictable performance matters most. As always, office hours help me to clarify all my questions and to have a wider view of the theory and the logic. Looking forward to the next half of the class.
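The split/sort/merge structure can be sketched in a few lines of Python (the extra output list is the O(n) memory tradeoff mentioned above):

```python
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])   # sort each half
    # Merge the two sorted halves back together in order
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:     # <= keeps equal keys in order (stability)
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```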

Sunday, January 25, 2026

Week 3

 Week 3 (1/21-1/27)

This week's content was very interesting to me. Brute Force, TSP, Knapsack, Divide & Conquer, the Master Theorem, and the Depth-First Search and Breadth-First Search algorithms were the main concepts for this week. I personally found DFS and BFS the most interesting, not just because of their graph drawings but also because of the logic behind them and how they behave in different situations. DFS goes deep from a vertex and backtracks when it gets stuck, while BFS visits all of a vertex's neighbors before moving on. TSP and the Master Theorem were both easy for me to understand because their time efficiency is very clear, especially in real-world cases.
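The deep-then-backtrack versus neighbors-first difference shows up clearly in code. A small sketch of both (adjacency-list format and names are my own):

```python
from collections import deque

def dfs(graph, start):
    # Go deep: follow one neighbor as far as possible, then backtrack
    visited, order = set(), []
    def visit(u):
        visited.add(u)
        order.append(u)
        for v in graph[u]:
            if v not in visited:
                visit(v)
    visit(start)
    return order

def bfs(graph, start):
    # Visit all of a vertex's neighbors before moving deeper
    visited, order, queue = {start}, [], deque([start])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in graph[u]:
            if v not in visited:
                visited.add(v)
                queue.append(v)
    return order
```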

I am looking forward to next week's content and to applying these algorithms in more advanced settings.

Sunday, January 18, 2026

Week 2

 Week 2 ( 1/14-1/20)

This week's content was very math-oriented. Out of all the topics (recursive and nonrecursive algorithms, use cases of Theta notation, recurrence relations and backward substitution, and brute-force algorithms), the one I had the most trouble with was backward substitution. It is good practice to look at different examples to see this technique from different perspectives. I attended office hours, which helped me quite a lot. I compare this technique to proof by induction in math classes: you show the steps to identify a pattern that eventually leads to the general formula needed to solve the recurrence relation with its initial condition. Below is the example I was having trouble with, along with how each step helped me identify the general number i.

M(n) = M(n-1) + 2 // recurrence relation


As an initial condition, we know that if the input number n is 1, there is no multiplication. We can express it as follows.

M(1) = 0 // initial condition


We can solve the recurrence relation using the backward substitution as below

M(n) = M(n-1) + 2 // replace M(n-1) with M(n-2) + 2

     = [M(n-2) + 2] + 2

     = M(n-2) + 2 + 2 // replace M(n-2) with M(n-3) + 2

     = [M(n-3) + 2] + 2 + 2

     = M(n-3) + 2 + 2 + 2

     = M(n-i) + 2*i // for a general number i, we get this

     ...

     = M(1) + 2*(n-1) // note that the initial condition gives M(1) = 0

     = 0 + 2*(n-1)

     = 2*(n-1)
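A quick way to sanity-check the closed form is to evaluate the recurrence directly and compare it against 2*(n-1):

```python
def M(n):
    # M(n) = M(n-1) + 2 with initial condition M(1) = 0
    return 0 if n == 1 else M(n - 1) + 2

# The closed form from backward substitution: M(n) = 2*(n-1)
for n in range(1, 20):
    assert M(n) == 2 * (n - 1)
```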


Sunday, January 11, 2026

Week 1

 Week 1: Algorithms

This week, I learned pseudocode, GCD, graphs, trees, and the algorithm analysis framework. The book has been a great resource for understanding the content before watching the lectures. What I had the most trouble with was the basic operation. I thought there was only one basic operation: the most-executed operation in the innermost loop, used to measure the efficiency of the algorithm. But then I learned that there is not always just one. Sometimes a few operations in the same innermost loop depend on each other, and together they determine the work needed for the loop to execute a full cycle.

As an example, from the algorithm analysis framework, the pseudocode Display_2D_Stars:

1. Algorithm Display_2D_Stars (n)

2. // Input: A positive integer number n

3. // Output: Display the * symbol in two dimensions.

4. i ← 0             

5. while ( i < n ) do

6.     j ← 0

7.     while ( j < n ) do

8.         write '*'  

9.         j ← j + 1

10.     write '\n'

11.     i ← i + 1    

12. return 0

Lines 7, 8, and 9 are part of the same innermost loop; lines 8 and 9 depend on the execution of line 7, and their execution cost differs little from the < comparison. Therefore, all the operations on lines 7, 8, and 9 (<, write, and +) are part of the basic operations.
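A Python translation of the pseudocode above (built as a string so the result is easy to check; the inner-loop body, where the basic operations live, runs n*n times):

```python
def display_2d_stars(n):
    # Mirrors the pseudocode: an outer while over rows,
    # an inner while writing '*' n times per row.
    rows = []
    i = 0
    while i < n:
        j = 0
        row = []
        while j < n:          # inner loop: <, write, and + execute together
            row.append('*')
            j += 1
        rows.append(''.join(row))
        i += 1
    return '\n'.join(rows)

print(display_2d_stars(3))
```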
