====== Algorithms and Data Structures I 2023/24 -- Lecture ======

I teach Algorithms and Data Structures I ([[https://is.cuni.cz/studium/eng/predmety/index.php?do=predmet&kod=NTIN060|NTIN060]]) every Tuesday at 12:20 in S9 (Malá strana). [[https://research.koutecky.name/db/teaching:ads12122_lecture|I taught this class last year]] and there are some (somewhat problematic) recordings from then (see the link).

I also teach a [[https://research.koutecky.name/db/teaching:ads12324_tutorial|tutorial for this class on Monday at 15:40]], and another one is taught on [[https://sites.google.com/view/todor-antic/home|Tuesday at 17:20 by Todor Antić]]. Both tutorials cover the same content and have the same criteria for obtaining credit.

If you want to talk to me, schedule a meeting with me. You can e-mail me at ''koutecky+ads1@iuuk.mff.cuni.cz'' and/or include the text ''[ADS1]'' in the email subject.

{{tablelayout?colwidth="100px,-"&rowsHeaderSource=1&rowsVisible=100&float=left}}
^ date ^ what was taught [resources] ^
| 19. 2. | The Random Access Machine (RAM) model of computation, instruction cost (unit, logarithmic, relative logarithmic) **[A Chapter 0]**, **[[https://en.wikipedia.org/wiki/Random-access_machine#Formal_definition|Wiki: Random Access Machine]]**, Big-Oh notation ($\mathcal{O}, \Omega, \Theta$) |
| 27. 2. | Graph problems, DFS: identifying connected components, pre- and post-orderings, cycle detection. **[A, up to 3.3.2]** |
| 5. 3. | DFS: topological ordering, detecting strongly connected components. **[A, remainder of Chap 3]**. **Also learn** the $\mathcal{O}(n+m)$ algorithm to find all bridges of an undirected graph: [[https://cp-algorithms.com/graph/bridge-searching.html|notes]], [[https://syga.app/#/algorithm/detail/syga/dfs-bridges/latest|demo]]. |
| 12. 3. | Finish the SCC algorithm **[A, Chap 3]**; begin shortest paths: BFS, Dijkstra. **[A, Chap 4]** |
| 19. 3. | $d$-ary heaps for Dijkstra; the general relaxation algorithm, start of Bellman-Ford **[A, Chap 4]**. |
| 26. 3. | Bellman-Ford **[A 4]** //(Also see **[JE 8, 9]**, but that is a lot more information than necessary.)// Beginning of Minimum Spanning Trees: the cut lemma; if edge weights are distinct, the MST is unique; Jarník's algorithm; Borůvka's algorithm **[A 5]**, **[JE 7]**. |
| 2. 4. | Finish Borůvka's algorithm; Kruskal's algorithm for MSTs, the Union-Find problem and data structure **[A 5.1.4]**. Intro to data structures. Binary Search Trees ([[https://ksvi.mff.cuni.cz/~dingle/2021-2/algs/notes_10.html|intro to Algs notes]]) |
| 9. 4. | Perfectly balanced trees (//balancing turns out to be too expensive, this is a dead end//); depth-balanced trees = AVL trees ([[https://courses.cs.washington.edu/courses/cse373/19su/files/lectures/slides/lecture09.pdf|slides from U of Washington]]). **{{ :teaching:ads12122:ads_bsts.pdf |OneNote notes}}**. AVL tree rebalancing. Possible resources for AVL trees: [[https://www.programiz.com/dsa/avl-tree|1]], [[http://gtu-paper-solution.com/Paper-Solution/DataStructure-2130702/AVL%20Trees/Summer-2019/Question-4-c-OR|2]], [[https://courses.cs.washington.edu/courses/cse373/19su/files/lectures/slides/lecture09.pdf|3]] |
| 16. 4. | $(a,b)$-trees [[https://cs.lmu.edu/~ray/notes/abtrees/|1]], [[http://www14.in.tum.de/lehre/2016WS/ea/split/sub-ab-trees-single.pdf|2]]; hashing [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/05-hashing.pdf|hashing notes from Jeff E]]. **In tutorials we showed a construction of a $1$-universal family of hashing functions, see [[https://research.koutecky.name/db/_media/teaching:ads12324:09-cv.pdf|here]]**. This theorem may appear on the exam. |
| 23. 4. | Brief mention of amortized analysis: expanding arrays can perform $n$ INSERT/DELETE operations in total time $O(n)$, even though not every single operation is $O(1)$ [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/09-amortize.pdf|JeffE]]. Brief mention of LLRB trees: [[https://sedgewick.io/wp-content/themes/sedgewick/papers/2008LLRB.pdf|original paper]], [[https://sedgewick.io/wp-content/uploads/2022/03/2008-09LLRB.pdf|slides]], [[https://read.seas.harvard.edu/~kohler/notes/llrb.html|controversy]]. Divide & conquer -- the recursion tree technique, Karatsuba's algorithm for multiplying $n$-digit numbers in time $n^{\log_2 3}$ **[JE 1]**, specifically [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/01-recursion.pdf|here]] |
| 30. 4. | Master theorem [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/01-recursion.pdf|JE 1]]; QuickSelect is fast in expectation, thus QuickSort is fast in expectation [[https://en.wikipedia.org/wiki/Quicksort#Average-case_analysis|Wiki]] |
| 7. 5. | Finding the $k$-th element in $O(n)$ time by the "median of medians" technique **[JE 1]**. Sorting lower bound [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/12-lowerbounds.pdf|JeffE's notes]]. Begin dynamic programming: Longest Increasing Subsequence in $O(n^2)$ time with a table; brief mention that it can be done in $O(n \log n)$ with an enhanced balanced binary search tree [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/03-dynprog.pdf|JE 3]]. |
| 14. 5. | //No lecture!// |
| 21. 5. | //Plan: dynamic programming -- edit distance, Floyd-Warshall [JE 3], [A 6], specifically [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/03-dynprog.pdf|here]].// |

/*
| 4. 5. | Divide & conquer -- Karatsuba's algorithm for multiplying $n$-digit numbers in time $n^{\log_2 3}$, finding the median (or any $k$-th element) in $O(n)$ time by the "median of medians" technique. [JE 1], specifically [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/01-recursion.pdf|here]]. [[https://stream.cuni.cz/cs/Detail/17289|recording]] |
| 18. 5. | Sorting lower bound [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/12-lowerbounds.pdf|JeffE's notes]], dynamic programming [JE 3], [A 6], specifically [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/03-dynprog.pdf|here]]. [[https://stream.cuni.cz/cs/Detail/17423|recording]] |
*/

{{page>teaching:bits:resources_ads1}}

{{ :teaching:ads12122:exam2122.pdf | Exam topics and format (2022) (PDF).}}

/*
====== ADS1 Exam / 2022 ======

The exam will consist of:
  - Two questions about an algorithm or a data structure from the lecture -- describe the algorithm or data structure, prove it is correct, and give the complexity analysis.
  - Two tasks similar to those from the tutorial -- either apply an algorithm / data structure from the lecture to some problem (you need to model the problem appropriately), or adapt the algorithm / data structure to solve some problem (you need to understand how it works internally to be able to adapt it appropriately).

The form of the exam: you come, get the question sheet, and work on the answers. Whenever you finish one of the answers, you hand it in; I read it, point out anything that is missing or incorrect, and give you hints if needed, and you can then revise it. At some point either you reach a correct solution, or you no longer want to improve it, or I feel I cannot give you any more hints, and then we settle on a grade.

==== Grading ====

**To get a 3** it suffices to know the definitions and the algorithms / data structures and (with hints) finish at least one of the "type 2" tasks (perhaps suboptimally).

**To get a 2** you need to be able to solve the tasks with some hints, or find an algorithm of suboptimal complexity. If we proved some intermediate claims during the lecture that are used in the proofs of correctness/complexity, you need to at least know the statements of these claims.

**To get a 1** you need to analyze the algorithms (correctness and complexity) including the proofs, and you need to be able to solve the "type 2" tasks mostly independently. (Advice like "try dynamic programming" and similar is still fine, though.)

===== Topics =====

**Disclaimer:** the topics below are NOT specific instances of "type 1" questions. Clearly, some topics are wider and some narrower. However, if you have a good understanding of all the topics below, you should not be surprised by anything at the exam.

  * BFS, DFS and their edge classifications
  * DFS applications: topological sorting, detecting strongly connected components
  * Dijkstra's algorithm
  * Bellman-Ford's algorithm
  * Floyd-Warshall's algorithm (small topic)
  * Jarník's algorithm
  * Borůvka's algorithm
  * Kruskal's algorithm + the Union-Find data structure using trees
  * Binary Search Trees (BSTs) in general
  * AVL trees
  * $(a,b)$-trees
  * Red-black trees -- we covered these very briefly, but you should be able to describe the bijection between LLRBs and $(2,4)$-trees
  * Hashing with chaining
  * Hashing with open addressing (warning: I did this at the tutorial; if you don't understand it, please read JeffE's notes)
  * Universal hashing
  * Master theorem -- analyzing the complexity of Divide & Conquer algorithms using the recursion tree
  * Integer multiplication using Karatsuba's algorithm
  * MomSelect (finding the $k$-th smallest element in an array in linear time)
  * Edit Distance dynamic programming algorithm
  * Longest Increasing Subsequence dynamic programming algorithm
  * Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (applies only to deterministic, comparison-based algorithms)
*/
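As a small exam-prep illustration of one of the lecture topics, here is a minimal sketch of Dijkstra's algorithm with a binary heap (the version with $d$-ary heaps from the lecture generalizes this). This is not from the lecture materials; the adjacency-list representation (a dict mapping each vertex to a list of ''(neighbor, weight)'' pairs) and the function name are my own choices.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    graph: dict mapping vertex -> list of (neighbor, weight) pairs.
    Returns a dict of distances from source to every reachable vertex.
    """
    dist = {source: 0}
    heap = [(0, source)]          # (distance estimate, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue              # stale heap entry; u was already finalized
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd      # relax edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))  # {'a': 0, 'b': 1, 'c': 3}
```

Instead of a ''decrease-key'' operation, this sketch pushes a fresh heap entry on every relaxation and discards stale entries when popped; with a binary heap this gives $O((n+m)\log n)$ time.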