Monday, February 23, 2026

Week 7

 Week 7 (2/18-2/24)

This week's content had a lot of topics and algorithms to cover. These are some of the concepts I learned:

Counting Sort: Count how many times each value appears (frequency), then build the cumulative distribution to place items in sorted order.
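
To help myself remember, here's a minimal Python sketch of counting sort (the function name and the assumption that values are non-negative integers up to a known max_val are mine):

```python
def counting_sort(a, max_val):
    # Frequency: count[v] = how many times v appears
    count = [0] * (max_val + 1)
    for x in a:
        count[x] += 1
    # Cumulative distribution: count[v] = number of elements <= v
    for v in range(1, max_val + 1):
        count[v] += count[v - 1]
    # Place items in sorted order; scanning right-to-left keeps it stable
    out = [0] * len(a)
    for x in reversed(a):
        count[x] -= 1
        out[count[x]] = x
    return out
```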

Radix Sort (LSD): Sort numbers digit by digit starting from the least significant digit (ones place), using a stable sort each pass.
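
A sketch of LSD radix sort for non-negative integers, using ten buckets per decimal digit as the stable pass (names and the bucket approach are my own choice):

```python
def radix_sort_lsd(nums):
    if not nums:
        return []
    passes = len(str(max(nums)))      # one pass per decimal digit
    exp = 1                           # 1 = ones place, 10 = tens place, ...
    for _ in range(passes):
        buckets = [[] for _ in range(10)]
        for n in nums:
            buckets[(n // exp) % 10].append(n)   # appending preserves order: stable
        nums = [n for b in buckets for n in b]
        exp *= 10
    return nums
```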

Dynamic Programming (DP): Solve problems by storing results of smaller subproblems and building up to the final answer.
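
The classic small example of this idea is bottom-up Fibonacci: each table entry is built from the two smaller subproblems already solved (this example is mine, not from class):

```python
def fib(n):
    # table[i] stores the answer to subproblem i; build up from the base cases
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```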

Coin-Collecting Problem: In a grid, find the maximum coins collectable when moving only right or down using a DP table.
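
A sketch of the DP table for this one, assuming the robot starts at the top-left cell and a grid of coin counts (function name is mine):

```python
def max_coins(grid):
    # dp[r][c] = max coins collectable on arrival at cell (r, c),
    # coming only from above (down move) or from the left (right move)
    rows, cols = len(grid), len(grid[0])
    dp = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            best_prev = max(dp[r - 1][c] if r > 0 else 0,
                            dp[r][c - 1] if c > 0 else 0)
            dp[r][c] = best_prev + grid[r][c]
    return dp[rows - 1][cols - 1]     # answer at the bottom-right cell
```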

Coin-Row Problem: Choose coins in a row to maximize total value without taking two adjacent coins.
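
The recurrence is "take this coin plus the best two back, or skip it and keep the best one back." A sketch with rolling variables instead of a full table (my own compression of the DP):

```python
def coin_row(coins):
    prev2, prev1 = 0, 0               # best using first i-2 and i-1 coins
    for c in coins:
        # take c (can't also take the previous coin) vs. skip c
        prev2, prev1 = prev1, max(prev1, prev2 + c)
    return prev1
```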

Warshall’s Algorithm: Finds the transitive closure of a graph (reachability: whether a path exists between vertices).
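
A sketch of Warshall's algorithm on a boolean adjacency matrix (representation choice is mine): after considering vertex k, reach[i][j] is true if j is reachable from i using only intermediates from the first k vertices.

```python
def warshall(adj):
    # adj[i][j]: True if there is an edge i -> j
    n = len(adj)
    reach = [row[:] for row in adj]   # copy so the input isn't modified
    for k in range(n):                # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach
```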

Floyd’s Algorithm: Finds all-pairs shortest paths by updating distances with intermediate vertices.
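
Floyd's algorithm has the same triple-loop shape, but with weights and min instead of booleans and or. A sketch using a distance matrix with infinity for missing edges (my own setup):

```python
INF = float('inf')

def floyd(dist):
    # dist[i][j]: edge weight i -> j, INF if no edge, 0 on the diagonal
    n = len(dist)
    d = [row[:] for row in dist]
    for k in range(n):                # try routing through intermediate vertex k
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```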

Greedy Method: Build a solution step by step by always choosing the best local option at the moment.

MST (Minimum Spanning Tree): Connect all vertices in a weighted graph with the minimum total edge weight and no cycles.

Prim’s Algorithm: A greedy MST algorithm that starts from one vertex and repeatedly adds the smallest edge connecting the current tree to a new vertex (weights matter first; alphabet only breaks ties).
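
A Prim's sketch using a min-heap of (weight, vertex) pairs; since tuples compare by weight first and vertex name second, alphabetical order only matters on weight ties, which matches the rule above (graph format and names are my assumptions):

```python
import heapq

def prim(graph, start):
    # graph: {vertex: [(weight, neighbor), ...]} for an undirected weighted graph
    visited = {start}
    edges = list(graph[start])        # candidate edges leaving the current tree
    heapq.heapify(edges)
    mst_weight = 0
    while edges and len(visited) < len(graph):
        w, v = heapq.heappop(edges)   # smallest edge; ties broken alphabetically
        if v in visited:
            continue                  # stale edge into the tree, skip it
        visited.add(v)
        mst_weight += w
        for e in graph[v]:
            if e[1] not in visited:
                heapq.heappush(edges, e)
    return mst_weight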

Monday, February 16, 2026

Week 6

 Week 6 (2/11-2/17)

For this week, we practiced AVL trees by inserting values and restoring balance with rotations. We built 2-3 trees by inserting keys, splitting 3-nodes, and reading the results level by level. We worked with max heaps: inserting (sifting up), deleting the max twice (swap with the last element, then sift down), and we connected this to heapsort and the bottom-up, array-based heap construction. Finally, we covered hashing: computing K mod m, detecting collisions, resolving them with separate chaining or linear probing, and using load-factor thresholds to trigger rehashing into a larger table. Going to office hours has been extremely helpful for me in this class to better grasp the concepts.
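
For the hashing part, here's a tiny linear-probing sketch I wrote to test myself (names are mine; it assumes the table still has a free slot, since in practice we'd rehash before it fills):

```python
def insert_linear_probe(table, key):
    # Hash by key mod m; on collision, step forward one slot, wrapping around
    m = len(table)
    i = key % m
    while table[i] is not None:
        i = (i + 1) % m               # linear probing
    table[i] = key
    return i                          # slot where the key landed
```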

Sunday, February 8, 2026

Week 5

 Week 5 (2/4-2/10)

I made a little summary based on what I learned from this week's content. I am hoping that this summary helps me study for the final.

QuickSort:

QuickSort is a divide-and-conquer sort. It picks a pivot, then partitions the list so numbers smaller than the pivot go left and numbers bigger go right. After the first partition, QuickSort recursively sorts the left and right parts.

Partitioning (i and j pointers):

You scan with two pointers: i moves from the left until it finds a value > pivot (too big, it belongs on the right), while j moves from the right until it finds a value < pivot (too small, it belongs on the left). Then you swap those two.

When i and j cross, you swap the pivot into its final spot. 

Median-of-Three Partitioning:

This improves QuickSort when the data is already sorted or reverse-sorted (cases that can make plain QuickSort slow). It chooses the pivot as the median of the first, middle, and last elements, then partitions using that pivot. Also, median-of-three isn't applied when a subarray has 3 or fewer elements.

Binary Tree Traversals:

Traversal is just “the order you visit nodes”:

Preorder: root, left, right

Inorder: left, root, right

Postorder: left, right, root
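
The three orders are easiest for me to see side by side in code. A sketch where each node is a (value, left, right) tuple, or None for an empty subtree (the representation is my choice):

```python
def preorder(t):
    if t is None:
        return []
    return [t[0]] + preorder(t[1]) + preorder(t[2])    # root, left, right

def inorder(t):
    if t is None:
        return []
    return inorder(t[1]) + [t[0]] + inorder(t[2])      # left, root, right

def postorder(t):
    if t is None:
        return []
    return postorder(t[1]) + postorder(t[2]) + [t[0]]  # left, right, root
```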

Binary Search:

Binary search works on a sorted list. You check the middle value, then go left half or right half depending on whether the target is smaller or bigger. Each step cuts the search space in half, so it’s fast (log time).
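
A quick iterative sketch of binary search (returns the index, or -1 if the target isn't there; that convention is my choice):

```python
def binary_search(a, target):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # check the middle value
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1              # target is bigger: keep the right half
        else:
            hi = mid - 1              # target is smaller: keep the left half
    return -1                         # search space emptied: not found
```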

DAG and Topological Sorting:

A DAG is a directed graph with no directed cycles.

A topological order is a list of nodes where every arrow goes from earlier to later. If there’s a cycle, you cannot do a topological sort.

Kahn’s Algorithm:

It finds nodes with in-degree 0 (no incoming arrows), removes one (using alphabetical order to break ties), updates the in-degrees of that node's successors, and repeats until every node has been output.
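
The steps above can be sketched like this, using a min-heap of node names so alphabetical order breaks the ties (the adjacency-dict format is my assumption):

```python
import heapq

def kahn(graph):
    # graph: {node: [successor, ...]} for a directed graph
    indeg = {u: 0 for u in graph}
    for u in graph:
        for v in graph[u]:
            indeg[v] += 1
    ready = [u for u in graph if indeg[u] == 0]   # nodes with no incoming arrows
    heapq.heapify(ready)
    order = []
    while ready:
        u = heapq.heappop(ready)      # alphabetically first among in-degree-0 nodes
        order.append(u)
        for v in graph[u]:            # "remove" u by dropping its outgoing arrows
            indeg[v] -= 1
            if indeg[v] == 0:
                heapq.heappush(ready, v)
    if len(order) != len(graph):
        raise ValueError("cycle detected: no topological order exists")
    return order
```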

Monday, February 2, 2026

Week 4

 Week 4 (1/28-2/3)

Besides studying mainly for the midterm, this week's content was focused on merge sort. Merge sort is a clear divide-and-conquer method: split the array, sort each half, then merge them back in order. It runs in O(n log n) time, is stable, and works well for large data. The tradeoff is that it needs extra memory for merging, so it’s often used when predictable performance matters most. As always, office hours help me to clarify all my questions and to have a wider view of the theory and the logic. Looking forward to the next half of the class.
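
Here's my own short sketch of merge sort; the extra memory mentioned above shows up as the out list built during each merge (function name is mine):

```python
def merge_sort(a):
    if len(a) <= 1:
        return a                      # base case: already sorted
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge: repeatedly take the smaller front element; <= keeps it stable
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]  # one side may have leftovers
```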
