
Algorithms and Data Structures I 2023/24 -- Lecture

I teach Algorithms and Data Structures I (NTIN060) every Tuesday at 12:20 in S9 (Malá strana). I taught this class last year, and there are some (somewhat problematic) recordings from then (see the link).

I also teach a tutorial for this class on Monday at 15:40, and another one is taught on Tuesday at 17:20 by Todor Antić. Both tutorials will cover the same content and have the same criteria for obtaining credit.

If you want to talk to me, schedule a meeting with me. You can e-mail me at koutecky+ads1@iuuk.mff.cuni.cz and/or include the text [ADS1] in the email subject.

date | what was taught [resources]
19. 2. The Random Access Machine (RAM) model of computation, instruction cost (unit, logarithmic, relative logarithmic) [A Chapter 0], Wiki: Random Access Machine, Big-Oh notation ($\mathcal{O}, \Omega, \Theta$)
27. 2. Graph problems, DFS: identifying connected components, pre- and post-orderings, cycle detection. [A, up to 3.3.2]
5. 3. DFS: topological ordering, detecting strongly connected components. [A, remainder of Chap 3]. Also learn the $\mathcal{O}(n+m)$ algorithm to find all bridges of an undirected graph: notes, demo.
12. 3. Finish SCC algorithm [A, Chap 3]; begin shortest paths: BFS, Dijkstra. [A, Chap 4]
19. 3. $d$-ary heaps for Dijkstra; general relaxation algorithm, start Bellman-Ford [A, Chap 4].
26. 3. Bellman-Ford [A 4] (also see [JE 8, 9], but that's a lot more information than necessary). Beginning of Min Spanning Trees: cut lemma; if edge weights are distinct, the MST is unique; Jarník's algorithm; Borůvka's algorithm [A 5], [JE 7].
2. 4. Finish Borůvka's algorithm; Kruskal's algorithm for MSTs, Union-Find problem and data structure. [A 5.1.4]. Intro to Data structures. Binary Search Trees (intro to Algs notes)
9. 4. Perfectly balanced trees (balancing turns out to be too expensive, this is a dead end); depth-balanced trees = AVL trees (slides from U of Washington). OneNote notes. AVL tree rebalancing. Possible resources for AVL trees: 1, 2, 3
16. 4. $(a,b)$-trees 1, 2; hashing (hashing notes from Jeff E.). In tutorials: we showed a construction of a $1$-universal family of hashing functions, see here. This theorem may appear at the exam.
23. 4. Brief mention of amortized analysis: expanding arrays can do $n$ INSERT / DELETE operations in total time $O(n)$, even though not every individual operation is $O(1)$ (JeffE). Brief mention of LLRB trees: original paper, slides, controversy. Divide & conquer: the recursion tree technique, Karatsuba's algorithm for multiplying $n$-digit numbers in time $\mathcal{O}(n^{\log_2 3})$ [JE 1], specifically here
30. 4. Master theorem [JE 1]; QuickSelect is fast in expectation, and thus QuickSort is fast in expectation (Wiki)
7. 5. Finding the $k$-th element in $O(n)$ time by the „median of medians“ technique [JE 1]. Sorting lower bound JeffE's notes. Begin dynamic programming: Longest Increasing Subsequence in $O(n^2)$ time with a table; brief mention that it can be done in $O(n \log n)$ with an enhanced balanced binary search tree JE 3.
14. 5. No lecture!
21. 5. Dynamic programming: edit distance, Floyd-Warshall, optimal binary search trees [JE 3], [A 6], specifically here.
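As a concrete illustration of the table-based dynamic programming from the last lecture, here is a minimal sketch of the edit-distance algorithm (the standard Wagner–Fischer recurrence; this is my own illustration, not the course's code):

```python
def edit_distance(a: str, b: str) -> int:
    """Minimum number of insertions, deletions and substitutions
    turning string a into string b, in O(|a|*|b|) time and space."""
    n, m = len(a), len(b)
    # dp[i][j] = edit distance between the prefixes a[:i] and b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i          # delete all i characters of a[:i]
    for j in range(m + 1):
        dp[0][j] = j          # insert all j characters of b[:j]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = min(
                dp[i - 1][j] + 1,                           # delete a[i-1]
                dp[i][j - 1] + 1,                           # insert b[j-1]
                dp[i - 1][j - 1] + (a[i - 1] != b[j - 1]),  # match/substitute
            )
    return dp[n][m]
```

For example, `edit_distance("kitten", "sitting")` returns 3 (two substitutions and one insertion).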

Useful Resources

ADS1 Exam / 2024

The exam will consist of:

1. Two questions about an algorithm or a data structure from the lecture – describing the algorithm or data structure, proving it is correct, and giving the complexity analysis.

2. Two tasks similar to those from the tutorial – either apply an algorithm / data structure from the lecture to some problem (you need to model the problem appropriately), OR adapt the algorithm / data structure to solve some problem (you need to understand how it works internally to be able to adapt it appropriately).

The exam works as follows: you come, get the question sheet, and work on the answers. Once you finish one of the answers, you hand it in; I (or one of my colleagues) will read it, point out anything missing or incorrect, give you hints if needed, and you can revise it. At some point you will either reach a correct solution, or you won't want to improve it further, or I will feel I can't give you any more hints, and then we'll settle on a grade.

Grading

To get a 3, it suffices to know the definitions and the algorithms / data structures, and to finish (with hints) at least one of the „type 2“ tasks (perhaps suboptimally).

To get a 2 you need to be able to solve the tasks with some hints, or find a suboptimal (complexity-wise) algorithm. If we proved some intermediate claims during the lecture, and these are used in the proofs of correctness/complexity, you need to at least know the statements of these claims.

To get a 1 you need to analyze the algorithms (correctness and complexity) including the proofs, and you need to be able to solve the „type 2“ tasks mostly independently. (Advice like „try dynamic programming“ and similar is still fine though.)

Topics

Disclaimer: the topics below are NOT specific instances of “type 1” questions. It is clear that some topics are wider and some narrower. However, if you have a good understanding of all the topics below, you should not be surprised by anything during the exam.

  • BFS, DFS and their edge classifications
  • DFS applications: topological sorting, detecting strongly connected components
  • Dijkstra's algorithm, $d$-ary heaps
  • Bellman-Ford's algorithm
  • Floyd-Warshall's algorithm (small topic)
  • Jarník's algorithm; cut lemma
  • Borůvka's algorithm
  • Kruskal's algorithm + Union-Find data structure using trees
  • Binary Search Trees (BSTs) in general
  • AVL trees
  • $(a,b)$-trees
  • Hashing with chaining (each bucket is a linked list)
  • Universal hashing
  • Master theorem – analyzing the complexity of Divide & Conquer algorithms using the recursion tree
  • Integer multiplication using Karatsuba's algorithm
  • MomSelect (finding the $k$-th smallest element in an array in linear time)
  • QuickSelect with random pivots works in expected $\mathcal{O}(n)$ time, QuickSort with random pivots in expected $\mathcal{O}(n \log n)$ time.
  • Edit Distance dynamic programming algorithm
  • Longest Increasing Subsequence dynamic programming algorithm
  • Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (only applies to deterministic and comparison-based algorithms)
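As an illustration of the universal-hashing topic above, here is a sketch of the classical Carter–Wegman family $h_{a,b}(x) = ((ax + b) \bmod p) \bmod m$ with prime $p$; whether this matches the exact $1$-universal family shown in the tutorial is an assumption on my part:

```python
import random

def make_universal_hash(m: int, p: int = 2_147_483_647):
    """Draw one random function from the Carter-Wegman family
    h_{a,b}(x) = ((a*x + b) mod p) mod m, for integer keys 0 <= x < p.
    p must be a prime larger than the key universe; here p = 2^31 - 1.
    For any two distinct keys x != y, Pr[h(x) == h(y)] <= 1/m."""
    a = random.randrange(1, p)   # a is nonzero mod p
    b = random.randrange(0, p)
    return lambda x: ((a * x + b) % p) % m

# Usage: pick a function once, then hash keys into m = 10 buckets.
h = make_universal_hash(10)
assert all(0 <= h(x) < 10 for x in range(1000))
```

The collision bound holds over the random choice of $a, b$, not over the keys, which is exactly what makes hashing with chaining run in expected $\mathcal{O}(1)$ time per operation for any fixed input.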
teaching/ads12324_lecture.txt · Last modified: 2024/05/29 20:41 by Martin Koutecky