====== Algorithms and Data Structures I 2021/22 -- Lecture ======

I teach Algorithms and Data Structures I ([[https://is.cuni.cz/studium/predmety/index.php?id=94f10a35af9ff69aeb38eb9166a34c3c&tid=&do=predmet&kod=NTIN060&skr=2019&fak=11320|NTIN060]]) every Wednesday at 9:00 in N2 (Troja/IMPAKT). If you want to talk to me, you're welcome in my office S326 at Malá Strana. You can also e-mail me at ''koutecky+ads1@iuuk.mff.cuni.cz'' and/or include the text ''[ADS1]'' in the email subject.

{{tablelayout?colwidth="100px,-"&rowsHeaderSource=1&rowsVisible=100&float=left}}
^ date ^ what was taught [resources] ^
| 17. 2. | Richest subsequence. Attempts at defining an algorithm. Von Neumann's computational model, Big-Oh notation ($\mathcal{O}, \mathcal{o}, \Omega, \omega$), the RAM (Random Access Machine) model **[A Chapter 0]**, **[[https://en.wikipedia.org/wiki/Random-access_machine#Formal_definition|Wiki: Random Access Machine]]**, [[https://stream.cuni.cz/cs/Detail/16489|recording]] |
| 24. 2. | Why graph problems; DFS: identifying connected components, pre- and post-orderings, cycle detection. **[A, up to 3.3.2]** [[https://stream.cuni.cz/cs/Detail/16577|recording]] |
| 2. 3. | DFS: topological ordering, detecting strongly connected components **[A, remainder of Chap 3]**; very brief intro to BFS **[A, beginning of Chap 4]**. [[https://stream.cuni.cz/cs/Detail/16721|recording]] |
| 9. 3. | Shortest paths: BFS, Dijkstra, general relaxation algorithm, Bellman-Ford, detecting negative cycles. **[A, Chap 4]**, [[https://stream.cuni.cz/cs/Detail/16723|recording]] |
| 16. 3. | Shortest paths: recap of Bellman-Ford; Floyd-Warshall for APSP. **[A 4]** //(Also see **[JE 8, 9]**, but that is a lot more information than necessary.)// Beginning of Minimum Spanning Trees: if the edge weights are distinct, the MST is unique. **[A 5]**, **[JE 7]**. [[https://stream.cuni.cz/cs/Detail/16775|recording]] |
| 23. 3. | Minimum Spanning Trees: Cut property (the MST contains every safe edge), Borůvka, Jarník, Kruskal **[JE 7]**, Union-Find data structure **[A 5.1.4]**. [[https://stream.cuni.cz/cs/Detail/16842|recording]] |
| 30. 3. | Union-Find data structure **[A 5.1.4]**; Binary Search Trees ([[https://ksvi.mff.cuni.cz/~dingle/2021-2/algs/notes_10.html|intro to Algs notes]]), perfect balancing is too expensive; depth-balanced trees = AVL trees ([[https://courses.cs.washington.edu/courses/cse373/19su/files/lectures/slides/lecture09.pdf|slides from U of Washington]]). **{{ :teaching:ads12122:ads_bsts.pdf |OneNote notes}}**. [[https://stream.cuni.cz/cs/Detail/16918|recording]] |
| 6. 4. | AVL tree rebalancing, intro to $(a,b)$-trees. [[https://stream.cuni.cz/cs/Detail/16972|recording]]. Possible resources for AVL trees: [[https://www.programiz.com/dsa/avl-tree|1]], [[http://gtu-paper-solution.com/Paper-Solution/DataStructure-2130702/AVL%20Trees/Summer-2019/Question-4-c-OR|2]], [[https://courses.cs.washington.edu/courses/cse373/19su/files/lectures/slides/lecture09.pdf|3]] |
| 13. 4. | $(a,b)$-trees [[https://cs.lmu.edu/~ray/notes/abtrees/|1]], [[http://www14.in.tum.de/lehre/2016WS/ea/split/sub-ab-trees-single.pdf|2]], LLRB trees [[https://sedgewick.io/wp-content/themes/sedgewick/papers/2008LLRB.pdf|original paper]], [[https://sedgewick.io/talks/?pdfjs_id=1281&_wpnonce=effd5be684#zoom=auto|slides]], [[https://read.seas.harvard.edu/~kohler/notes/llrb.html|controversy]]. [[https://stream.cuni.cz/cs/Detail/17028|recording]] |
| 20. 4. | Amortized analysis, hashing. [[https://stream.cuni.cz/cs/Detail/17063|recording]], [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/05-hashing.pdf|hashing notes from Jeff E]], [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/09-amortize.pdf|amortized analysis notes by Jeff E]] |
| 27. 4. | Universal and perfect hashing [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/05-hashing.pdf|hashing notes from Jeff E]]; open addressing will be done at the tutorial. [[https://stream.cuni.cz/cs/Detail/17230|recording]] |
| 4. 5. | Divide & conquer -- Karatsuba's algorithm for multiplying $n$-digit numbers in time $O(n^{\log_2 3})$, finding the median (or any $k$-th element) in $O(n)$ time by the "median of medians" technique. [JE 1], specifically [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/01-recursion.pdf|here]]. [[https://stream.cuni.cz/cs/Detail/17289|recording]] |
| 18. 5. | Sorting lower bound [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/12-lowerbounds.pdf|JeffE's notes]]; dynamic programming [JE 3], [A 6], specifically [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/03-dynprog.pdf|here]]. [[https://stream.cuni.cz/cs/Detail/17423|recording]] |

{{page>teaching:bits:resources_ads1}}

{{ :teaching:ads12122:exam2122.pdf | Exam topics and format (PDF).}}

====== ADS1 Exam / 2022 ======

The exam will consist of:
  - Two questions about an algorithm or a data structure from the lecture – describing the algorithm or data structure, proving it is correct, and giving the complexity analysis.
  - Two tasks similar to those from the tutorial – either apply an algorithm / data structure from the lecture to some problem (you need to model the problem appropriately), OR adapt the algorithm / data structure to solve some problem (you need to understand how it works internally to be able to adapt it appropriately).

The form of the exam: you will come, get the question sheet, and work on the answers. Once you have finished one of the answers, you hand it in; I will read it, point out anything that is missing or incorrect, give you hints if needed, and you can revise it.
At some point either you will reach a correct solution, or you will no longer want to improve it, or I will feel I cannot give you any more hints, and then we will settle on a grade.

==== Grading ====

**To get a 3** it suffices to know the definitions and the algorithms / data structures and (with hints) to finish at least one of the “type 2” tasks (perhaps suboptimally).

**To get a 2** you need to be able to solve the tasks with some hints, or to find an algorithm of suboptimal complexity. If we proved some intermediate claims during the lecture that are used in the proofs of correctness/complexity, you need to at least know the statements of these claims.

**To get a 1** you need to analyze the algorithms (correctness and complexity) including the proofs, and you need to be able to solve the “type 2” tasks mostly independently. (Advice like “try dynamic programming” and similar is still fine, though.)

===== Topics =====

**Disclaimer:** the topics below are NOT specific instances of “type 1” questions. Clearly, some topics are wider and some narrower. However, if you have a good understanding of all the topics below, nothing during the exam should surprise you.
  * BFS, DFS and their edge classifications
  * DFS applications: topological sorting, detecting strongly connected components
  * Dijkstra’s algorithm
  * Bellman-Ford algorithm
  * Floyd-Warshall algorithm (small topic)
  * Jarník’s algorithm
  * Borůvka’s algorithm
  * Kruskal’s algorithm + Union-Find data structure using trees
  * Binary Search Trees (BSTs) in general
  * AVL trees
  * $(a,b)$-trees
  * Red-black trees – we covered these very briefly, but you should be able to describe the bijection between LLRBs and $(2,4)$-trees
  * Hashing with chaining
  * Hashing with open addressing (warning: I did this at the tutorial; if you don’t understand it, please read JeffE’s notes)
  * Universal hashing
  * Master theorem – analyzing the complexity of Divide & Conquer algorithms using the recursion tree
  * Integer multiplication using Karatsuba’s algorithm
  * MomSelect (finding the $k$-th smallest element in an array in linear time)
  * Edit Distance dynamic programming algorithm
  * Longest Increasing Subsequence dynamic programming algorithm
  * Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (both apply only to deterministic, comparison-based algorithms)
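Several topics above (Kruskal’s algorithm, and the Union-Find item itself) rely on the Union-Find data structure from **[A 5.1.4]**. As a quick refresher, here is a minimal Python sketch – illustrative only, not the course’s reference implementation – using union by rank and path halving:

```python
class UnionFind:
    """Disjoint-set forest with union by rank and path halving."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.rank = [0] * n           # upper bound on tree height

    def find(self, x):
        # Path halving: every visited node is re-pointed at its grandparent,
        # which keeps the trees shallow over many operations.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False  # already in the same component
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx  # attach the shallower tree under the deeper one
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True


# Kruskal-style usage: process edges, skipping those inside one component.
uf = UnionFind(4)
uf.union(0, 1)
uf.union(2, 3)
print(uf.find(0) == uf.find(1))  # True
print(uf.find(1) == uf.find(2))  # False  (separate components so far)
uf.union(1, 2)
print(uf.find(0) == uf.find(3))  # True   (everything merged)
```

With both heuristics, any sequence of $m$ operations on $n$ elements runs in $O(m \cdot \alpha(n))$ time, effectively linear in practice.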