Algorithms and Data Structures I 2025/26 -- Lecture

I teach Algorithms and Data Structures I (NTIN060) every Monday at 12:20 in N1 (IMPAKT). I taught this class two and three years ago, and there are some (somewhat problematic) recordings from then (see the links).

There are three tutorials for the class: two taught by Jamie Fravel and one by Petr Kučera. All tutorials will cover the same content and have very similar criteria for obtaining credit, though there will be some variation.

The best way to reach me is on Discord: my handle is mkoutecky and there is a channel for ADS1 on the MFF-U Discord Server (invite link). You can also e-mail me at koutecky+ads1@iuuk.mff.cuni.cz; please include the text [ADS1] in the email subject.

date | what was taught [resources]
16. 2. The Random Access Machine (RAM) model of computation, instruction cost (unit, logarithmic, relative logarithmic) [ALG 2], [A 0], Wiki: Random Access Machine, Big-Oh notation ($\mathcal{O}, \Omega, \Theta$)
23. 2. Graph problems, DFS: in- and out-orderings, edge classification, cycle detection. [ALG 5.6], [A, up to 3.3.2], Also learn the $\mathcal{O}(n+m)$ algorithm to find all bridges of an undirected graph [ALG 5.7], notes
2. 3. DFS: topological ordering, detecting strongly connected components. [ALG 5.8-5.9], [A, remainder of Chap 3].
9. 3. Finish SCC algorithm [ALG 5.9], [A, Chap 3]; begin shortest paths: BFS [ALG 5.2], Dijkstra [ALG 6.0-6.2], [A, Chap 4]
16. 3. binary, $d$-ary, Fibonacci heaps for Dijkstra; general relaxation algorithm, start Bellman-Ford [ALG 6.1-6.3], [A, Chap 4].
23. 3. Bellman-Ford [ALG 6.3], [A 4] (Also see [JE 8, 9], but that's a lot more information than necessary.) Start Min Spanning Trees: cut lemma; if edge weights are distinct, the MST is unique; Jarník's algorithm; Borůvka's algorithm [ALG 7-7.3], [A 5], [JE 7].
30. 3. Kruskal's algorithm; Union-Find problem and data structure via forests („bushes“), $\mathcal{O}(\log n)$ Union and Find [ALG 7.4], [A 5.1.4]. Start data structures: Binary Search Trees; perfectly balanced BSTs have depth $\mathcal{O}(\log n)$ [ALG 8.1] (intro to Algs notes)
6. 4. Easter Monday — no lecture.
13. 4. AVL trees: depth lower bound $\Omega(\log n)$ and upper bound $\mathcal{O}(\log n)$; FIND runs in $\mathcal{O}(\log n)$; INSERT and DELETE via rotations also run in $\mathcal{O}(\log n)$. [ALG 8.2]
20. 4. $(a,b)$-trees and brief mention of Left-Leaning Red-Black (LLRB) trees [ALG 8.3] (original LLRB paper, slides, controversy). Tries [ALG 4.3].
27. 4. Hashing with chaining [ALG 11.3], open addressing (linear probing, double hashing) [ALG 11.4], universal hashing [ALG 11.5]. Brief mention of amortized complexity and expanding arrays [ALG 9.1].
4. 5. Divide and conquer: analysis of Mergesort via the substitution method and the recursion tree technique [ALG 10.2], Karatsuba's algorithm for multiplication of $n$-digit numbers in time $\mathcal{O}(n^{\log_2 3})$ [ALG 10.3], started the Master theorem [ALG 10.4] (to be finished next time).
11. 5. Plan: Finish Master theorem [ALG 10.4]. QuickSelect, QuickSort [ALG 10.6-10.7], finding the $k$-th smallest element in linear time [ALG 10.8], start dynamic programming [ALG 12].
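To give a flavor of the lectures on shortest paths, here is a minimal Python sketch of Dijkstra's algorithm with a binary heap (as covered on 16. 3. and 9. 3.). This is an illustration only, not the lecture's reference code; the graph representation (a dict of adjacency lists) is my own choice.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source, assuming non-negative
    edge weights; graph is {u: [(v, w), ...]}.
    With a binary heap this runs in O((n + m) log n) time."""
    dist = {source: 0}
    heap = [(0, source)]          # (distance estimate, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue              # stale heap entry; skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd      # relax the edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist
```

For example, `dijkstra({'s': [('a', 1), ('b', 4)], 'a': [('b', 2)], 'b': []}, 's')` returns `{'s': 0, 'a': 1, 'b': 3}`. Note that instead of a DecreaseKey operation (as with Fibonacci heaps in the lecture), this version simply pushes a new entry and discards stale ones on pop, a common practical simplification.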

Useful Resources

  • [ALG] Algorithm Labyrinth Guide by Mareš and Valla. EXPERIMENTAL: this is a work-in-progress, AI-generated translation of the excellent Czech textbook Průvodce Labyrintem Algoritmů. It may be rough or even wrong (due to translation issues); I'll be working on improving it, especially the parts directly relevant to covered material.
  • [A] Algorithms by Dasgupta, Papadimitriou, and Vazirani
  • [JE] Algorithms by Jeff Erickson (the page contains various PDFs suitable for screen, printing etc.)
  • [CLRS] Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein. Find it on libgen

ADS1 Exam / 2026

The exam will consist of:

1. Two questions about an algorithm or a data structure from the lecture: describe the algorithm or data structure, prove it is correct, and give the complexity analysis.
2. Two tasks similar to those from the tutorials: either apply an algorithm / data structure from the lecture to some problem (you need to model the problem appropriately), or adapt the algorithm / data structure to solve some problem (you need to understand how it works internally to be able to adapt it appropriately).

The form of the exam is as follows: you come, get the question sheet, and work on the answers. Once you are finished with one of the answers, you hand it in; I (or one of my colleagues) will read it, point out anything that is missing or incorrect, give you hints if needed, and you can revise it. At some point either you reach a correct solution, or you no longer want to improve it, or I feel I cannot give you any more hints, and then we settle on a grade.

Grading

To get a 3 it suffices to know definitions and algorithms / data structures and (with hints) finish at least one of the „type 2“ tasks (perhaps suboptimally).

To get a 2 you need to be able to solve the tasks with some hints, or find a suboptimal (complexity-wise) algorithm. If we proved some intermediate claims during the lecture, and these are used in the proofs of correctness/complexity, you need to at least know the statements of these claims.

To get a 1 you need to analyze the algorithms (correctness and complexity) including the proofs, and you need to be able to solve the „type 2“ tasks mostly independently. (Advice like „try dynamic programming“ and similar is still fine though.)

Topics

Disclaimer: the topics below are NOT specific instances of „type 1“ questions. Clearly, some topics are broader and some are narrower. However, if you have a good understanding of all the topics below, nothing during the exam should surprise you.

  • BFS, DFS and their edge classifications
  • DFS applications: topological sorting, detecting strongly connected components
  • Dijkstra's algorithm, $d$-ary heaps
  • Bellman-Ford's algorithm
  • Floyd-Warshall's algorithm (small topic)
  • Jarník's algorithm; cut lemma
  • Borůvka's algorithm
  • Kruskal's algorithm + Union-Find data structure using trees
  • Binary Search Trees (BSTs) in general
  • AVL trees
  • $(a,b)$-trees
  • Tries (FIND, INSERT, DELETE in $O(L)$ for a string of length $L$)
  • Hashing with chaining (each bucket is a linked list)
  • Hashing with open addressing: just know that it exists, with various schemes (collisions are handled by putting elements elsewhere in the same array, not by creating a chain)
  • Universal hashing – construction of a $c$-universal family
  • Expanding arrays – brief mention, be able to prove that inserting $n$ elements into an expanding array (start from size 1, if it would overflow $\to$ double size and re-insert) takes time $O(n)$.
  • Master theorem – analyzing the complexity of Divide & Conquer algorithms using the recursion tree
  • Integer multiplication using Karatsuba's algorithm
  • LinearSelect (finding the $k$-th smallest element in an array in linear time)
  • QuickSelect with random pivots works in expected $\mathcal{O}(n)$ time, QuickSort with random pivots in expected $\mathcal{O}(n \log n)$ time.
  • Edit Distance dynamic programming algorithm
  • Longest Increasing Subsequence dynamic programming algorithm
  • Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (only applies to deterministic and comparison-based algorithms)
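As an illustration of one topic from the list above, here is a short Python sketch of Karatsuba's integer multiplication with its three recursive calls, giving $\mathcal{O}(n^{\log_2 3})$ digit operations. This is my own illustrative version, not code from the lecture; it splits numbers by decimal digits for readability.

```python
def karatsuba(x, y):
    """Multiply non-negative integers x and y using Karatsuba's scheme:
    three recursive multiplications of half-length numbers instead of four."""
    if x < 10 or y < 10:
        return x * y                     # base case: a single-digit factor
    n = max(len(str(x)), len(str(y))) // 2
    base = 10 ** n
    xh, xl = divmod(x, base)             # split x = xh * base + xl
    yh, yl = divmod(y, base)             # split y = yh * base + yl
    a = karatsuba(xh, yh)                # high * high
    c = karatsuba(xl, yl)                # low * low
    b = karatsuba(xh + xl, yh + yl) - a - c   # cross terms via one product
    return a * base * base + b * base + c
```

The key trick is computing the cross term $x_h y_l + x_l y_h$ as $(x_h + x_l)(y_h + y_l) - x_h y_h - x_l y_l$, which reuses the two products already computed and so needs only one extra recursive multiplication.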
teaching/ads12526_lecture.txt · Last modified by: Martin Koutecky