Introduction: Running time analysis — recall of asymptotic notation (big-O, big-Theta, big-Omega), and introduction of little-o and little-omega. Worst-case and average-case analysis.
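As a quick reference for this unit, the limit-based characterizations of these notations can be summarized as follows (a sketch, assuming f(n) and g(n) are eventually positive; the course may instead use the usual constant-and-threshold definitions, which agree for such functions):

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \limsup_{n \to \infty} f(n)/g(n) < \infty \\
f(n) = \Omega(g(n)) &\iff \liminf_{n \to \infty} f(n)/g(n) > 0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n)) \\
f(n) = o(g(n))      &\iff \lim_{n \to \infty} f(n)/g(n) = 0 \\
f(n) = \omega(g(n)) &\iff \lim_{n \to \infty} f(n)/g(n) = \infty
\end{align*}
```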
Basic design paradigms with illustrative examples — incremental design (e.g., incremental sorting, interpolating polynomials), decremental design (e.g., GCD, with a discussion of input size; factorial), and pruning (e.g., order statistics).
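As one possible illustration of the decremental idea (a minimal sketch, not taken from the source): Euclid's GCD algorithm repeatedly replaces the instance (a, b) with the strictly smaller instance (b, a mod b), and the input-size discussion hinges on measuring size in bits rather than in the magnitude of the numbers.

```python
def gcd(a, b):
    """Euclid's algorithm as a decremental design: each iteration replaces
    the problem (a, b) by the strictly smaller problem (b, a % b)."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # 21; the number of iterations is O(log min(a, b))
```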
Divide and Conquer: Integer multiplication revisited with an efficient algorithm that motivates and leads into recurrences. Solving recurrences using recursion trees and repeated substitution; statement of the master theorem. Brief recall of merge sort and its recurrence. Median in worst-case linear time.
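A minimal sketch of the divide-and-conquer multiplication this unit alludes to, assuming Karatsuba's three-half-sized-products scheme is the intended algorithm; it yields the recurrence T(n) = 3T(n/2) + O(n), which the master theorem solves to O(n^log2 3):

```python
def karatsuba(x, y):
    """Multiply non-negative integers with three recursive half-size products,
    giving T(n) = 3T(n/2) + O(n) instead of the schoolbook four products."""
    if x < 10 or y < 10:                      # base case: a single-digit operand
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    shift = 10 ** half
    xh, xl = divmod(x, shift)                 # split x into high and low halves
    yh, yl = divmod(y, shift)
    a = karatsuba(xh, yh)                     # high * high
    c = karatsuba(xl, yl)                     # low * low
    b = karatsuba(xh + xl, yh + yl) - a - c   # cross terms from one extra product
    return a * shift * shift + b * shift + c

print(karatsuba(1234, 5678))  # 7006652
```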
Greedy Algorithms: Greedy choice and optimal substructure properties, minimum spanning trees — Prim's and Kruskal's algorithms, Dijkstra's shortest path using arrays and heaps, fractional knapsack, and Huffman coding (use of a priority queue). Dynamic Programming: Integral knapsack (contrasted with the fractional variant), longest increasing subsequence.
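To illustrate the contrast the syllabus draws between the fractional and integral knapsack, here is a small sketch (the greedy ratio rule is optimal only for the fractional version; the item data below is made up for illustration):

```python
def fractional_knapsack(items, capacity):
    """Greedy: take items in decreasing value/weight ratio, splitting the last one.
    items is a list of (value, weight) pairs."""
    total = 0.0
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)
        total += value * take / weight
        capacity -= take
    return total

def integral_knapsack(items, capacity):
    """Dynamic programming over capacities: the greedy choice can fail here,
    but optimal substructure still holds."""
    best = [0] * (capacity + 1)                    # best[c] = max value within capacity c
    for value, weight in items:
        for c in range(capacity, weight - 1, -1):  # downward scan: each item used at most once
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

items = [(60, 10), (100, 20), (120, 30)]   # (value, weight), hypothetical data
print(fractional_knapsack(items, 50))      # 240.0 (takes 2/3 of the last item)
print(integral_knapsack(items, 50))        # 220   (second and third items)
```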
Graph Algorithms: Graph traversal (BFS and DFS).
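A minimal sketch of the two traversals on an adjacency-list graph (illustrative only; the adjacency-list representation is an assumption):

```python
from collections import deque

def bfs(adj, source):
    """Breadth-first search; returns vertices in the order they are visited."""
    seen, order, queue = {source}, [], deque([source])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return order

def dfs(adj, source):
    """Iterative depth-first search; returns vertices in visit order."""
    seen, order, stack = set(), [], [source]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        order.append(u)
        stack.extend(reversed(adj[u]))   # reversed so neighbours are explored in listed order
    return order

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}   # a small undirected example
print(bfs(adj, 0))   # [0, 1, 2, 3]
print(dfs(adj, 0))   # [0, 1, 3, 2]
```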
String Matching: Boyer-Moore, Knuth-Morris-Pratt (KMP), and Rabin-Karp. NP-completeness: the classes P, NP, and NP-complete; polynomial-time reductions among problems.
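For the string-matching portion, a minimal sketch of the KMP idea: a failure table (longest proper prefix that is also a suffix) followed by a single left-to-right scan. This is an illustrative implementation, not necessarily the one followed in the course:

```python
def kmp_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1."""
    n, m = len(text), len(pattern)
    if m == 0:
        return 0
    fail = [0] * m                   # fail[i]: longest proper prefix of pattern[:i+1]
    k = 0                            # that is also a suffix of it
    for i in range(1, m):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    k = 0                            # scan the text, never moving i backwards
    for i in range(n):
        while k > 0 and text[i] != pattern[k]:
            k = fail[k - 1]
        if text[i] == pattern[k]:
            k += 1
        if k == m:
            return i - m + 1
    return -1

print(kmp_search("ababcabcabababd", "ababd"))  # 10
```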