teaching:ads12526_lecture

Differences

Here you can see the differences between the selected revision and the current version of the page.

teaching:ads12526_lecture [2026/02/17 23:42] Martin Koutecky → teaching:ads12526_lecture [2026/05/11 14:04] (current) Martin Koutecky
Line 11:
 ^ date ^ what was taught [resources] ^
 | 16. 2.| The Random Access Machine (RAM) model of computation, instruction cost (unit, logarithmic, relative logarithmic) **[ALG 2]**, **[A 0]**, **[[https://en.wikipedia.org/wiki/Random-access_machine#Formal_definition|Wiki: Random Access Machine]]**, Big-Oh notation ($\mathcal{O}, \Omega, \Theta$)|
-| 23. 2. |//Plan: Graph problems, DFS: identifying connected components, pre- and post-orderings, cycle detection. **[A, up to 3.3.2]**//|
+| 23. 2. |Graph problems, DFS: in- and out-orderings, edge classification, cycle detection. **[ALG 5.6]**, **[A, up to 3.3.2]**, **Also learn** the $\mathcal{O}(n+m)$ algorithm to find all bridges of an undirected graph **[ALG 5.7]**, [[https://cp-algorithms.com/graph/bridge-searching.html|notes]]|
 +| 2. 3. | DFS: topological ordering, detecting strongly connected components. **[ALG 5.8-5.9]**, **[A, remainder of Chap 3]**. |
 +| 9. 3. | Finish SCC algorithm **[ALG 5.9]**, **[A, Chap 3]**; begin shortest paths: BFS **[ALG 5.2]**, Dijkstra **[ALG 6.0-6.2]**, **[A, Chap 4]**|
 +| 16. 3. | binary, $d$-ary, Fibonacci heaps for Dijkstra; general relaxation algorithm, start Bellman-Ford **[ALG 6.1-6.3]**, **[A, Chap 4]**.| 
 +| 23. 3. | Bellman-Ford **[ALG 6.3]**, **[A 4]** //(Also see **[JE 8, 9]** but that's a lot more information than necessary.)// Start Min Spanning Trees: cut lemma, if edge weights distinct, MST is unique; Jarník's algorithm; Borůvka's algorithm **[ALG 7-7.3]**, **[A 5]**, **[JE 7]**.|
 +| 30. 3. | Kruskal's algorithm; Union-Find problem and data structure via forests ("bushes"), $\mathcal{O}(\log n)$ Union and Find **[ALG 7.4]**, **[A 5.1.4]**. Start data structures: Binary Search Trees; perfectly balanced BSTs have depth $\mathcal{O}(\log n)$ **[ALG 8.1]** ([[https://ksvi.mff.cuni.cz/~dingle/2021-2/algs/notes_10.html|intro to Algs notes]])| 
 +| 6. 4. | //Easter Monday -- no lecture.// |
 +| 13. 4. | AVL trees: depth lower bound $\Omega(\log n)$ and upper bound $\mathcal{O}(\log n)$; FIND runs in $\mathcal{O}(\log n)$; INSERT and DELETE via rotations also run in $\mathcal{O}(\log n)$. **[ALG 8.2]** | 
 +| 20. 4. | $(a,b)$-trees and brief mention of Left-Leaning Red-Black (LLRB) trees **[ALG 8.3]** ([[https://sedgewick.io/wp-content/themes/sedgewick/papers/2008LLRB.pdf|original LLRB paper]], [[https://sedgewick.io/wp-content/uploads/2022/03/2008-09LLRB.pdf|slides]], [[https://read.seas.harvard.edu/~kohler/notes/llrb.html|controversy]]). Tries **[ALG 4.3]**.| 
 +| 27. 4. | Hashing with chaining **[ALG 11.3]**, open addressing (linear probing, double hashing) **[ALG 11.4]**, universal hashing **[ALG 11.5]**. Brief mention of amortized complexity and expanding arrays **[ALG 9.1]**.| 
 +| 4. 5. | Divide and conquer: analysis of Mergesort via the substitution method and the recursion tree technique **[ALG 10.2]**, Karatsuba's algorithm for multiplication of $n$-digit numbers in time $\mathcal{O}(n^{\log_2 3})$ **[ALG 10.3]**, started the Master theorem **[ALG 10.4]** //(to be finished next time)//.| 
 +| 11. 5. | Finish Master theorem **[ALG 10.4]**. QuickSelect, QuickSort **[ALG 10.6-10.7]**, finding the $k$-th smallest element in linear time **[ALG 10.8]**| 
 +| 18. 5. | //Plan: dynamic programming: Longest Increasing Subsequence in $O(n \log n)$ time. Edit Distance in $O(nm)$. Floyd-Warshall algorithm for All-Pairs Shortest Paths (APSP). **[ALG 12]**, [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/03-dynprog.pdf|JE 3]].//|
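As a companion to the bridge-finding topic in the 23. 2. row above: a minimal Python sketch of the low-link DFS in the spirit of the linked cp-algorithms notes. The function name and the adjacency-list encoding are illustrative choices, not taken from the lecture.

```python
# Hedged sketch of the O(n+m) bridge-finding DFS (low-link technique).
# graph[u] is the list of neighbours of vertex u (undirected graph).

def find_bridges(graph):
    n = len(graph)
    tin = [-1] * n   # DFS entry time of each vertex (-1 = unvisited)
    low = [-1] * n   # lowest entry time reachable using at most one back edge
    bridges = []
    timer = 0

    def dfs(v, parent):
        nonlocal timer
        tin[v] = low[v] = timer
        timer += 1
        skipped_parent = False
        for u in graph[v]:
            if u == parent and not skipped_parent:
                skipped_parent = True    # skip the edge we arrived by (only once)
                continue
            if tin[u] != -1:             # back edge
                low[v] = min(low[v], tin[u])
            else:                        # tree edge
                dfs(u, v)
                low[v] = min(low[v], low[u])
                if low[u] > tin[v]:      # u's subtree cannot reach above v
                    bridges.append((v, u))

    for v in range(n):
        if tin[v] == -1:
            dfs(v, -1)
    return bridges

# Example: a path 0-1-2 attached to a triangle 2-3-4; only the path edges are bridges.
g = [[1], [0, 2], [1, 3, 4], [2, 4], [2, 3]]
print(sorted(find_bridges(g)))  # -> [(0, 1), (1, 2)]
```

Each vertex and edge is processed a constant number of times, giving the $\mathcal{O}(n+m)$ bound from the lecture.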
  
 /*
-| 27. 2. | Graph problems, DFS: identifying connected components, pre- and post-orderings, cycle detection. **[A, up to 3.3.2]**|
-| 5. 3. | DFS: topological ordering, detecting strongly connected components. **[A, remainder of Chap 3]**. **Also learn** the $\mathcal{O}(n+m)$ algorithm to find all bridges of an undirected graph: [[https://cp-algorithms.com/graph/bridge-searching.html|notes]], [[https://syga.app/#/algorithm/detail/syga/dfs-bridges/latest|demo]].|
-| 12. 3. | Finish SCC algorithm **[A, Chap 3]**; begin shortest paths: BFS, Dijkstra. **[A, Chap 4]** |
-| 19. 3. | $d$-ary heaps for Dijkstra; general relaxation algorithm, start Bellman-Ford **[A, Chap 4]**.|
-| 26. 3. | Bellman-Ford **[A 4]** //(Also see **[JE 8, 9]** but that's a lot more information than necessary.)// Beginning of Min Spanning Trees: cut lemma, if edge weights distinct, MST is unique; Jarník's algorithm; Borůvka's algorithm **[A 5]**, **[JE 7]**.|
-| 2. 4. | Finish Borůvka's algorithm; Kruskal's algorithm for MSTs, Union-Find problem and data structure. **[A 5.1.4]**. Intro to Data structures. Binary Search Trees ([[https://ksvi.mff.cuni.cz/~dingle/2021-2/algs/notes_10.html|intro to Algs notes]]) |
+| 2. 4. |
 | 9. 4. | Perfectly balanced trees (//balancing turns out to be too expensive, this is a dead-end//); depth-balanced trees = AVL trees. [[https://courses.cs.washington.edu/courses/cse373/19su/files/lectures/slides/lecture09.pdf|slides from U of Washington]]. **{{ :teaching:ads12122:ads_bsts.pdf |OneNote notes}}**. AVL tree rebalancing. Possible resources for AVL trees: [[https://www.programiz.com/dsa/avl-tree|1]], [[http://gtu-paper-solution.com/Paper-Solution/DataStructure-2130702/AVL%20Trees/Summer-2019/Question-4-c-OR|2]], [[https://courses.cs.washington.edu/courses/cse373/19su/files/lectures/slides/lecture09.pdf|3]]|
 | 16. 4. | $(a,b)$-trees [[https://cs.lmu.edu/~ray/notes/abtrees/|1]], [[http://www14.in.tum.de/lehre/2016WS/ea/split/sub-ab-trees-single.pdf|2]], hashing [[https://jeffe.cs.illinois.edu/teaching/algorithms/notes/05-hashing.pdf|hashing notes from Jeff E]]. **In tutorials: we showed a construction of a $1$-universal family of hashing functions, see [[https://research.koutecky.name/db/_media/teaching:ads12324:09-cv.pdf|here]]**. This theorem may appear on the exam.|
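For intuition about universal families: the sketch below uses the standard Carter-Wegman construction $h_{a,b}(x) = ((ax+b) \bmod p) \bmod m$, which is a common textbook example and may differ in details from the family shown in the tutorial PDF linked above. It enumerates every $(a,b)$ to compute the exact collision probability of one fixed pair of keys.

```python
# Hedged sketch: the standard Carter-Wegman 1-universal family
#   h_{a,b}(x) = ((a*x + b) mod p) mod m,
# with p prime, a in {1,...,p-1}, b in {0,...,p-1}.  For any fixed
# x != y, the collision probability over a random (a,b) is ~1/m.

p = 101   # prime, larger than the universe {0, ..., 100}
m = 10    # number of hash buckets

def h(a, b, x):
    return ((a * x + b) % p) % m

def collision_prob(x, y):
    """Exact collision probability of x and y over all (a, b) choices."""
    hits = total = 0
    for a in range(1, p):
        for b in range(p):
            total += 1
            if h(a, b, x) == h(a, b, y):
                hits += 1
    return hits / total

print(collision_prob(17, 42) <= 1 / m)  # -> True (the probability is ~0.091)
```

The point of the exhaustive enumeration is that the $\le 1/m$ guarantee holds for every fixed pair of distinct keys, not just on average over keys.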
Line 33 → Line 40:
 {{page>teaching:bits:resources_ads1}}
  
-/* 
  
-{{ :teaching:ads12324:exam2324.pdf | Exam topics and format (PDF).}} 
  
-====== ADS1 Exam / 2024 ======
+/* {{ :teaching:ads12324:exam2324.pdf | Exam topics and format (PDF).}} */
 + 
 +====== ADS1 Exam / 2026 ======
  
 The exam will consist of:
Line 66 → Line 73:
   * Dijkstra's algorithm, $d$-ary heaps
   * Bellman-Ford's algorithm
-  * Floyd-Warshall's algorithm (small topic)
+  * Floyd-Warshall's algorithm (small topic -- covered with dynamic programming)
   * Jarník's algorithm; cut lemma
   * Borůvka's algorithm
Line 73 → Line 80:
   * AVL trees
   * $(a,b)$-trees
 +  * Tries (FIND, INSERT, DELETE in $O(L)$ for a string of length $L$)
   * Hashing with chaining (each bucket is a linked list)
-  * Universal hashing
+    * Hashing with open addressing: just know that it exists, with various schemes (collisions are handled by putting elements elsewhere in the same array, not by creating a chain)
 +  * Universal hashing -- construction of a $c$-universal family 
 +  * Expanding arrays -- brief mention, be able to prove that inserting $n$ elements into an expanding array (start from size 1, if it would overflow $\to$ double size and re-insert) takes time $O(n)$.
   * Master theorem -- analyzing the complexity of Divide & Conquer algorithms using the recursion tree
   * Integer multiplication using Karatsuba's algorithm
-  * MomSelect (finding the $k$-th smallest element in an array in linear time)
+  * LinearSelect (finding the $k$-th smallest element in an array in linear time)
   * QuickSelect with random pivots works in expected $\mathcal{O}(n)$ time, QuickSort with random pivots in expected $\mathcal{O}(n \log n)$ time.
   * Edit Distance dynamic programming algorithm
   * Longest Increasing Subsequence dynamic programming algorithm
-  * Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (only applies to deterministic and comparison-based algorithms) 
  
-*/
+/*  * Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (only applies to deterministic and comparison-based algorithms) */
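The expanding-array bound from the exam list above can be sanity-checked by counting element writes: each insert writes the new element once, and each doubling re-writes every stored element, so the copies cost $1 + 2 + 4 + \cdots < 2n$ and the total is below $3n$. A small Python simulation of exactly the policy stated above (start from size 1, double on overflow):

```python
# Count element writes when inserting n elements into an expanding array
# that starts at capacity 1 and doubles when it would overflow.
# Copies cost 1 + 2 + 4 + ... < 2n, so n inserts cost fewer than 3n writes.

def insert_cost(n):
    capacity, size, writes = 1, 0, 0
    for _ in range(n):
        if size == capacity:   # would overflow: double and re-insert all
            writes += size     # copy every element currently stored
            capacity *= 2
        size += 1
        writes += 1            # write the newly inserted element
    return writes

for n in (1, 10, 1000, 10**6):
    print(n, insert_cost(n), insert_cost(n) <= 3 * n)  # the bound always holds
```

Dividing by $n$ gives the amortized $\mathcal{O}(1)$ cost per insert mentioned in the 27. 4. lecture.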
  
  
teaching/ads12526_lecture.1771371728.txt.gz · Last modified: by Martin Koutecky