I very much hope that you're continuing to enjoy the course. Thanks again for all of your contributions to the discussion forums, we're doing our best to debug all issues as they come up.

Let's talk about what's going on in week 3 of the course. (Sorry this email is a bit late, I just got back home after giving a lecture in Chicago.) All the required video lectures were posted on Sunday, and we just posted some additional optional ones today.

1. VII - We've posted a second part of the probability review. It's pretty short, just two topics, although quite tricky ones! (Namely, conditional probability and independence.) You need to review this material (via this video or some other source, as you wish) before studying the analysis of the randomized contraction algorithm (see below).
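If you want a concrete sanity check on these two definitions, here is a small sketch (my own illustration, not from the lectures) that computes P(A | B) = P(A and B) / P(B) by brute-force enumeration over two fair dice, and shows a pair of independent events:

```python
from fractions import Fraction

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

def prob(event):
    """P(event) under the uniform distribution on the 36 outcomes."""
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

def cond_prob(event_a, event_b):
    """Conditional probability: P(A | B) = P(A and B) / P(B)."""
    return prob(lambda o: event_a(o) and event_b(o)) / prob(event_b)

doubles = lambda o: o[0] == o[1]    # event A: the two dice match
first_is_six = lambda o: o[0] == 6  # event B: the first die shows 6

# A and B are independent: conditioning on B doesn't change P(A).
print(prob(doubles))                     # 1/6
print(cond_prob(doubles, first_is_six))  # 1/6
```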

2. VIII - Linear-Time Selection (Required). These lectures study the problem of computing the ith smallest element of an input array (e.g., the median). It's easy to solve this problem in O(n log n) time using sorting, but we can do better! The required material goes over a super-practical randomized algorithm, very much in the spirit of QuickSort (e.g., based on partitioning), that has *linear* expected running time. Don't forget that it takes linear time just to read the input! The analysis is somewhat different from what we studied for QuickSort, but equally slick. Basically, there's a cool way to think about the progress the algorithm makes in terms of a simple coin-flipping experiment. Linearity of expectation (yes, it's back...) seals the deal.
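To make this concrete, here is a minimal sketch of randomized selection (my own Python version, not the lecture's reference implementation): pick a random pivot, partition, and recurse only on the side containing the answer.

```python
import random

def rselect(arr, i):
    """Return the ith smallest element (1-indexed) of arr.
    Random pivoting gives expected O(n) running time."""
    a = list(arr)
    while True:
        if len(a) == 1:
            return a[0]
        pivot = random.choice(a)
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        if i <= len(less):
            a = less                          # answer is left of the pivot
        elif i <= len(less) + len(equal):
            return pivot                      # the pivot is the answer
        else:
            i -= len(less) + len(equal)       # answer is right of the pivot
            a = [x for x in a if x > pivot]

print(rselect([9, 1, 8, 2, 7, 3, 6, 4, 5], 5))  # the median: 5
```

Note that unlike QuickSort, only one side of each partition is kept, which is what drives the running time down from O(n log n) to expected O(n).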

3. VIII - Linear-Time Selection (Optional). I couldn't resist covering some advanced related material. The first topic is an algorithm with more Turing Award-winning authors than any other algorithm I know of: a deterministic (i.e., no randomization allowed) linear-time algorithm for the Selection problem, based on an ingenious "median-of-medians" idea for guaranteeing good pivot choices. The second optional topic answers the question "can we do better?" for sorting, unfortunately in the negative: a counting argument shows that no "comparison-based" sorting algorithm (like MergeSort, QuickSort, or HeapSort) has worst-case running time better than n log n.
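The median-of-medians idea can be sketched briefly (again my own simplified Python, not the lecture's version): split the array into groups of 5, recursively select the median of the group medians, and use it as a pivot that is guaranteed to be "near the middle".

```python
def dselect(a, i):
    """ith smallest element (1-indexed) of a, deterministically.
    Median-of-medians pivoting gives worst-case O(n) time."""
    if len(a) <= 5:
        return sorted(a)[i - 1]
    # Median of each group of (at most) 5 elements.
    medians = [sorted(a[j:j + 5])[len(a[j:j + 5]) // 2]
               for j in range(0, len(a), 5)]
    # Recursively pick the median of the medians as the pivot.
    pivot = dselect(medians, (len(medians) + 1) // 2)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    if i <= len(less):
        return dselect(less, i)
    if i <= len(less) + len(equal):
        return pivot
    return dselect([x for x in a if x > pivot], i - len(less) - len(equal))

print(dselect(list(range(100, 0, -1)), 50))  # 50
```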

4. IX - Graphs and the Contraction Algorithm. The second set of lectures for this week is a segue from randomized algorithms to graph algorithms. We first review graphs and the most standard ways of representing them (most commonly, by adjacency lists). We then consider the random contraction algorithm, discovered by Karger "only" 20 years ago (while a PhD student here at Stanford). This algorithm solves the minimum cut problem: given an undirected graph, separate the vertices into two non-empty groups to minimize the number of "crossing edges". Such problems come up when reasoning about, for example, physical networks, social networks, and images. This algorithm was perhaps the first strong evidence that graph problems could be added to the long list of "killer applications" of random sampling. Don't tune out before the final plot twist: a simple but useful trick for transforming an algorithm that almost always fails into one that almost always succeeds.
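A rough sketch of both ideas (my own simplified Python, assuming the graph is given as an edge list and using union-find to track contracted super-vertices; not the lecture's implementation): one run of the contraction algorithm, plus the repetition trick that boosts its success probability.

```python
import random

def contract(edges):
    """One run of the random contraction algorithm on an undirected
    multigraph given as an edge list. Returns the size of some cut."""
    parent = {}
    def find(v):  # union-find with path halving
        while parent.setdefault(v, v) != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    remaining = len({v for e in edges for v in e})
    while remaining > 2:
        u, v = random.choice(edges)       # pick a uniformly random edge
        ru, rv = find(u), find(v)
        if ru != rv:                      # skip self-loops
            parent[ru] = rv               # contract: merge the endpoints
            remaining -= 1
    # The cut: edges whose endpoints lie in different super-vertices.
    return sum(1 for u, v in edges if find(u) != find(v))

def min_cut(edges, trials=100):
    """Repeat the contraction algorithm and keep the best cut found:
    the trick that turns a likely-to-fail run into a likely-to-succeed
    overall algorithm."""
    return min(contract(edges) for _ in range(trials))

# Two triangles joined by a single bridge edge (2, 3): min cut = 1.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
print(min_cut(edges))  # 1
```

A single run can easily contract an edge of the minimum cut and return something larger; running it many independent times and taking the minimum drives the failure probability down exponentially in the number of trials.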

5. Problem Set #3 has five questions about the randomized selection algorithm, cuts in graphs, and the contraction algorithm. As usual, it's due in a little less than two weeks, and if you miss the deadline you can turn it in for half credit by the end of the course (April 22nd).

