Differences

Here you can see the differences between the selected revision and the current revision of the page.

teaching:ads12526_lecture [2026/03/28 23:25] – revert test edit Claude Cowork
teaching:ads12526_lecture [2026/05/11 14:04] (current) Martin Koutecky

Line 16: Line 16:
 | 16. 3. | binary, $d$-ary, Fibonacci heaps for Dijkstra; general relaxation algorithm, start Bellman-Ford **[ALG 6.1-6.3]**, **[A, Chap 4]**.|
 | 23. 3. | Bellman-Ford **[ALG 6.3]**, **[A 4]** //(Also see **[JE 8, 9]** but that's a lot more information than necessary.)// Start Min Spanning Trees: cut lemma, if edge weights distinct, MST is unique; Jarník's algorithm; Borůvka's algorithm **[ALG 7-7.3]**, **[A 5]**, **[JE 7]**.|
-| 30. 3. | //Plan: Kruskal's algorithm for MSTs, Union-Find problem and data structure **[ALG 7.4]**, **[A 5.1.4]**. Intro to Data structures: Binary Search Trees **[ALG 8]** ([[https://ksvi.mff.cuni.cz/~dingle/2021-2/algs/notes_10.html|intro to Algs notes]])//|
+| 30. 3. | Kruskal's algorithm; Union-Find problem and data structure via forests ("bushes"), $\mathcal{O}(\log n)$ Union and Find **[ALG 7.4]**, **[A 5.1.4]**. Start data structures: Binary Search Trees; perfectly balanced BSTs have depth $\mathcal{O}(\log n)$ **[ALG 8.1]** ([[https://ksvi.mff.cuni.cz/~dingle/2021-2/algs/notes_10.html|intro to Algs notes]])|
 +| 6. 4. | //Easter Monday — no lecture.//|
 +| 13. 4. | AVL trees: depth lower bound $\Omega(\log n)$ and upper bound $\mathcal{O}(\log n)$; FIND runs in $\mathcal{O}(\log n)$; INSERT and DELETE via rotations also run in $\mathcal{O}(\log n)$. **[ALG 8.2]** | 
 +| 20. 4. | $(a,b)$-trees and brief mention of Left-Leaning Red-Black (LLRB) trees **[ALG 8.3]** ([[https://sedgewick.io/wp-content/themes/sedgewick/papers/2008LLRB.pdf|original LLRB paper]], [[https://sedgewick.io/wp-content/uploads/2022/03/2008-09LLRB.pdf|slides]], [[https://read.seas.harvard.edu/~kohler/notes/llrb.html|controversy]]). Tries **[ALG 4.3]**.| 
 +| 27. 4. | Hashing with chaining **[ALG 11.3]**, open addressing (linear probing, double hashing) **[ALG 11.4]**, universal hashing **[ALG 11.5]**. Brief mention of amortized complexity and expanding arrays **[ALG 9.1]**.| 
 +| 4. 5. | Divide and conquer: analysis of Mergesort via the substitution method and the recursion tree technique **[ALG 10.2]**, Karatsuba's algorithm for multiplication of $n$-digit numbers in time $\mathcal{O}(n^{\log_2 3})$ **[ALG 10.3]**, started the Master theorem **[ALG 10.4]** //(to be finished next time)//.| 
 +| 11. 5. | Finish Master theorem **[ALG 10.4]**. QuickSelect, QuickSort **[ALG 10.6-10.7]**, finding the $k$-th smallest element in linear time **[ALG 10.8]**| 
 +| 18. 5. | //Plan: dynamic programming: Longest Increasing Subsequence in $O(n \log n)$ time. Edit Distance in $O(nm)$. Floyd-Warshall algorithm for All-Pairs Shortest Paths (APSP). **[ALG 12]**, [[https://jeffe.cs.illinois.edu/teaching/algorithms/book/03-dynprog.pdf|JE 3]].//|
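The Union-Find structure from the 30. 3. row above can be sketched in a few lines. This is a minimal illustration, not code from the lecture notes: a forest ("bushes") with union by size and no path compression, which keeps every tree at depth $\mathcal{O}(\log n)$, so both Union and Find run in $\mathcal{O}(\log n)$ as stated in the schedule. All names here are illustrative.

```python
class UnionFind:
    """Forest-based Union-Find with union by size (a sketch)."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.size = [1] * n           # size of the tree rooted at each element

    def find(self, x):
        # Climb to the root; union by size keeps the depth O(log n).
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False              # already in the same component
        if self.size[rx] < self.size[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx          # hang the smaller tree under the larger
        self.size[rx] += self.size[ry]
        return True
```

In Kruskal's algorithm this is exactly the test "does this edge connect two different components?": `union` returns `True` precisely when the edge should be added to the spanning forest.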
  
 /*
Line 33: Line 40:
 {{page>teaching:bits:resources_ads1}}
  
-/* 
  
-{{ :teaching:ads12324:exam2324.pdf | Exam topics and format (PDF).}} 
  
-====== ADS1 Exam / 2024 ======
+/* {{ :teaching:ads12324:exam2324.pdf | Exam topics and format (PDF).}} */
 + 
 +====== ADS1 Exam / 2026 ======
  
 The exam will consist of:
Line 66: Line 73:
   * Dijkstra's algorithm, $d$-ary heaps
   * Bellman-Ford's algorithm
-  * Floyd-Warshall's algorithm (small topic)
+  * Floyd-Warshall's algorithm (small topic -- covered with dynamic programming)
   * Jarník's algorithm; cut lemma
   * Borůvka's algorithm
Řádek 73: Řádek 80:
   * AVL trees
   * $(a,b)$-trees
 +  * Tries (FIND, INSERT, DELETE in $O(L)$ for a string of length $L$)
   * Hashing with chaining (each bucket is a linked list)
-  * Universal hashing
+    * Hashing with open addressing: just know that it exists, with various schemes (collisions are handled by putting elements elsewhere in the same array, not by creating a chain)
 +  * Universal hashing -- construction of a $c$-universal family 
 +  * Expanding arrays -- brief mention, be able to prove that inserting $n$ elements into an expanding array (start from size 1, if it would overflow $\to$ double size and re-insert) takes time $O(n)$.
   * Master theorem -- analyzing the complexity of Divide & Conquer algorithms using the recursion tree
   * Integer multiplication using Karatsuba's algorithm
-  * MomSelect (finding the $k$-th smallest element in an array in linear time)
+  * LinearSelect (finding the $k$-th smallest element in an array in linear time)
   * QuickSelect with random pivots works in expected $\mathcal{O}(n)$ time, QuickSort with random pivots in expected $\mathcal{O}(n \log n)$ time.
   * Edit Distance dynamic programming algorithm
   * Longest Increasing Subsequence dynamic programming algorithm
-  * Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (only applies to deterministic and comparison-based algorithms) 
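The expanding-array bound mentioned in the list above (start from size 1, double on overflow, re-insert) can be checked with a short simulation. This is a sketch, not exam material: it counts only the element copies caused by resizing, and the doubling thresholds give $1 + 2 + 4 + \dots < 2n$, hence $O(n)$ total time for $n$ inserts.

```python
def append_all(n):
    """Simulate n appends into a doubling array; return copies done by resizes."""
    capacity, length, copies = 1, 0, 0
    for _ in range(n):
        if length == capacity:   # would overflow -> double and re-insert
            copies += length     # re-inserting all existing elements
            capacity *= 2
        length += 1
    return copies

# The resize copies form the geometric sum 1 + 2 + 4 + ... < 2n,
# which is the amortized O(n) bound from the lecture.
```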
  
-*/
+/*  * Lower bounds: searching in a sorted array is $\Omega(\log n)$, sorting is $\Omega(n \log n)$ (only applies to deterministic and comparison-based algorithms) */
  
  
teaching/ads12526_lecture.1774740329.txt.gz · Last modified by: Claude Cowork