Reports from the lectures

Here you will find brief reports from the lectures.

Wednesday January 17th: I reviewed some of the basic concepts of probability theory, partly following the appendix in the textbook. Among the things covered were: Probability spaces and conditional probabilities (section A.1), random variables (section A.2, but I skipped all the material on special distributions), and expectations and variance (section A.3, but again skipping everything on special distributions and everything after Theorem A.5).

Friday January 19th: Today I went through sections 1.1 and 1.2 (a little faster than expected, but I wanted to get to a point where the problems in the book make sense). In section 1.1, I went through the definition of a Markov chain (not all that clear in the book, I thought) and explained (examples similar to) Examples 1.1, 1.2, and 1.3 in the book, plus an example on random walks on the integers. I may return to some of the other examples later when we need them. In section 1.2, I proved the Chapman-Kolmogorov equations (1.2) and Theorem 1.1, and illustrated Theorem 1.1 by looking at what happens to the simple weather model in Example 1.3 when we look further and further into the future. 
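
To make the Chapman-Kolmogorov equations and Theorem 1.1 concrete, here is a minimal numerical sketch in the spirit of the weather example (the matrix below is made up for illustration and is not necessarily the one from Example 1.3): the n-step transition probabilities are the entries of the matrix power \(P^n\), and the rows settle down as n grows.

```python
import numpy as np

# Hypothetical 3-state weather chain (e.g. sunny, cloudy, rainy).
# Rows are the current state, columns the next state; each row sums to 1.
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.4, 0.4]])

# Chapman-Kolmogorov: the n-step transition probabilities are the
# entries of the matrix power P^n.  The rows converge as n increases.
for n in (1, 2, 5, 20):
    print(f"n = {n}")
    print(np.linalg.matrix_power(P, n))
```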

Wednesday January 24th: The physical lecture was cancelled, but I made a recorded lecture in three parts (see the links below). The material is from section 1.3, but I do things in a slightly different order than the book. The lecture on Friday January 26th will have the same format and be on the remaining parts of section 1.3 and section 1.4.

1. Stopping times and the strong Markov property: Notes. Video.

2. Recurrent and transient states: Notes. Video. (Look out for slips of the tongue: In one of the examples I first call some states "transient", before I correct myself and change to the correct "recurrent".)

3. A criterion for recurrence: Notes. Video.

Friday January 26th: The physical lecture was cancelled, but instead I recorded a lecture in three parts (see the links below). The first two are from section 1.3, the last one from section 1.4.

1. Decomposition of the state space. Notes. Video.

2. Communication classes. Notes. Video. (This year's textbook doesn't seem to use the term "communication classes" – or just "classes" – but they appear in so many old exam problems that they are good to know about).

3. Stationary distributions. Notes. Video. (Excuse the quality - I had to correct myself several times, but I didn't have the energy to make a new version).

Wednesday, January 31st: The physical lecture was cancelled due to a minor accident, and I have put out three videos instead. They are all from sections 1.4 and 1.5.

1. More about stationary distributions and especially the doubly stochastic situation. Notes. Video.

2. The detailed balance equation. Notes. Video.

3. Time reversed processes. Notes. Video.

Friday, February 2nd: I was not able to give this lecture in person (apologies for forgetting to cancel) and have prepared two videos instead. The first video has a format I don't really like (the text is written out beforehand, and I only talk my way through it), but sitting for long periods is uncomfortable at the moment. The second video combines material from sections 1.6 and 1.8 (I'm aiming at proving the Convergence Theorem 1.19 before I do section 1.7 and the rest of section 1.6).

1. Kolmogorov's criterion for the detailed balance equation. Notes. Video. (In the book, this argument is hard to read due to many q's appearing in random places where p should have been.)

2. Periodicity. Notes. Video.

Wednesday, February 7th: As I'm still not in top shape, the physical lecture was replaced by three videos, and the main event was the proof of the Convergence Theorem 1.23 (also known as Theorem 1.19). The proof is in section 1.8 and uses the so-called coupling method, which is a much-used technique in more advanced Markov chain theory. The two other videos are on important theorems in section 1.7. As the three videos below cover more material than an ordinary lecture, I will make the videos for Friday shorter – and also less theoretical (the course is not at all as theoretical as this week's videos may lead you to assume!).

1. The Convergence Theorem. Notes. Video.

2. Return to states (Theorems 1.20 and 1.21). Notes. Video.

3. Existence of stationary measures (Theorem 1.24). Notes. Video.

Friday, February 9th: Three more videos. The first video is on the last remaining result from sections 1.6/1.7, the so-called Ergodic Theorem 1.22. The proof in the book is very sketchy, and I have tried to fill in a few missing steps. The other two videos illustrate how you can use the tools we now have at our disposal to analyse simple Markov chains. 

1. The Ergodic Theorem. Notes. Video.

2. Example: An irreducible Markov chain. Notes. Video.

3. Example: A Markov chain with transient states. Notes. Video.

Wednesday, February 14th: As there has been a lot of theory lately, I thought it was time for an example that sums up much of what we have been doing so far, and at the same time introduces the themes we are going to look at next. So in the first video, you will find what is basically a solution of Problem 1 from the exam in 2005. If you want to try this problem on your own, be aware that some of the older exam problems use a different convention for transition matrices: What they call the transition matrix is the transpose of what we call the transition matrix, and hence you have to turn columns into rows before you start to solve the problem our way.
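
As a concrete illustration of the convention issue (the matrix below is invented for the example): if an old exam gives a column-stochastic matrix, transposing it produces a matrix in our row convention, and everything then proceeds as usual.

```python
import numpy as np

# Hypothetical transition matrix in the old exam convention:
# columns (not rows) sum to 1, i.e. entry (i, j) is P(next = i | now = j).
P_old = np.array([[0.5, 0.2],
                  [0.5, 0.8]])

# Transpose to get our convention, where rows sum to 1.
P = P_old.T
print(P.sum(axis=1))  # each row now sums to 1
```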

The second video is an alternative treatment of the last part of the first video, and it paves the way for the third video, which is again more theoretical. There is probably more material in these three videos than in an ordinary lecture, so the Friday lecture will be a bit shorter.

1. Long example based on Problem 1 from Exam 2005. Notes. Video.

2. An alternative (more matrix oriented) approach to the last part of the previous video. Notes. Video.

3. The fundamental theorem of exit probabilities (Theorem 1.28). Notes. Video.

Friday, February 16th. To compensate for the recent long and dense lectures, here are just a few short examples of applications of exit probabilities (and how to compute them). The next lecture will start with Section 1.10.

1. Endgame in tennis. Notes. Video.

2. An unfair game. Notes. Video.

Wednesday, February 21st: Today's topic was exit times from section 1.10. In addition to the material in the book, I have written a supplementary note on the expected time a Markov chain spends in transient states. As this has been a popular exam topic, it is useful to know something about it when one tries to solve old exam problems. As usual for Wednesdays, there is more material here than for a normal lecture, and I shall try to restrain myself on Friday.

1. Introductory examples. Notes. Video.

2. Proof of Theorem 1.29. Notes. Video.

3. Example: Three heads in a row. Notes. Video.

4. Time spent in transient states (see the supplementary note). By accident, the notes for the video got split in two: Notes (part 1), Notes (part 2). Video.
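
A small numerical sketch of the idea behind the supplementary note, using the standard fundamental-matrix formulation (the chain below is invented for illustration, and the note may organize the computation differently): if \(Q\) is the transition matrix restricted to the transient states, then entry \((i,j)\) of \((I-Q)^{-1}\) is the expected number of visits to transient state \(j\) when the chain starts in transient state \(i\).

```python
import numpy as np

# Hypothetical chain with transient states 0, 1 and an absorbing state 2.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.0, 0.0, 1.0]])

Q = P[:2, :2]                      # transitions among the transient states
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix (I - Q)^{-1}

print(N)              # N[i, j] = expected number of visits to j starting from i
print(N.sum(axis=1))  # expected total number of steps spent in transient states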

Friday, February 23rd: The main topic of the day is from section 1.11 and deals with the distinction between two kinds of recurrent behavior, positive recurrence and null recurrence. Null recurrence is a borderline kind of recurrence, and it can only occur when the state space is infinite. The first video introduces the topic and proves some fundamental theorems; the second video illustrates the concepts by looking at what happens with random walks on the half-line. The third video introduces the notion of branching processes and proves a fundamental theorem about when such a process will die out in finite time. These videos are the last ones from Chapter 1, and we shall continue with Chapter 2 next time.

1. Positive recurrence and null recurrence. Notes. Video.

2. Random walks on the half-line. Notes. Video.

3. Branching processes. Notes. Video.

Wednesday, February 28th: We started Chapter 2. The first (long) video is on exponential random variables; the second, much shorter one, is on Poisson distributions.

1. Exponential distributions (section 2.1): Notes. Video. (Note that I make a mistake with the limits of integration at about 41:00, but correct myself soon after).

2. Poisson distributions (from the beginning of section 2.2 - most of you have probably seen this before): Notes. Video.

Friday, March 1st: The main theme today is Poisson processes. The first video defines Poisson processes and shows how they can be constructed from exponential random variables. The second video tries to explain how Poisson processes often appear in practice (including exams), and contains another description of Poisson processes that is sometimes referred to as "the infinitesimal description" (I forgot to mention this in the video, but we shall prove that this description is equivalent to the definition in a later video). The third video proves a sometimes useful fact that does not seem to be in the textbook: If a continuous distribution is memoryless, it is necessarily exponential.

1. Poisson processes: Definition and construction: Notes. Video.

2. Poisson processes in practice. Notes. Video.

3. Bonus track: The exponential distributions are the only memoryless distributions with a density. Notes. Video.
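
As a small simulation sketch of the construction in the first video (all parameter values are chosen arbitrarily): summing i.i.d. exponential interarrival times gives the arrival times, and the number of arrivals in \([0,t]\) is then Poisson with mean \(\lambda t\).

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 10.0   # rate and time horizon (arbitrary values)

# Interarrival times are i.i.d. Exponential(lam); the arrival times are their
# cumulative sums, and N(t) counts the arrivals up to time t.
counts = []
for _ in range(10_000):
    arrivals = np.cumsum(rng.exponential(1 / lam, size=50))
    counts.append(np.sum(arrivals <= t))

print(np.mean(counts))   # should be close to lam * t = 20
```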

Wednesday, March 6th: The themes of the day were inhomogeneous Poisson processes (Poisson-like processes where the rate varies with time) and compound Poisson processes (where a Poisson process triggers other random variables).

1. Definition of inhomogeneous Poisson processes plus an example. Notes. Video.

2. Proof of the infinitesimal description of Poisson processes. Notes. Video.

3. Compound Poisson processes. Notes. Video.
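
A minimal simulation sketch of a compound Poisson process (the rate and the jump distribution below are invented for illustration): a Poisson number of events occurs in \([0,t]\), each event contributes an independent amount, and the expected total is \(\lambda t\,E[Y]\).

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t = 3.0, 5.0        # Poisson rate and time horizon (arbitrary)
mu, sigma = 2.0, 0.5     # parameters of the (normal) jump sizes (arbitrary)

totals = []
for _ in range(10_000):
    n = rng.poisson(lam * t)               # number of events in [0, t]
    jumps = rng.normal(mu, sigma, size=n)  # i.i.d. contributions Y_1, ..., Y_n
    totals.append(jumps.sum())

# E[S(t)] = lam * t * E[Y] by the compound Poisson formula.
print(np.mean(totals), lam * t * mu)
```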

Friday, March 8th: Today's material was from section 2.4, and the videos are the last from Chapter 2. As we have been working hard recently, there is a little less material than usual (but I have included a bonus track).

1. Combining and thinning Poisson processes. Notes. Video.

2. Conditioning. Notes. Video.

3. Bonus track from last year: An example of thinning applied to the collection of football cards. Notes. Video.

Wednesday, March 13th: Back to physical lectures! I covered sections 3.1 and 3.2.1 and tried to make a video with Panopto. Unfortunately, the first half is without sound (not the system's fault, I just didn't hear that the microphone fell out), but the second half is now restored (it starts at about 40:45) and has sound. I don't know if the video link works for everybody (it may require that you log in), but if not, you may find the video in Canvas (click the Panopto button in the menu in the left margin).

Notes. Video.

Friday, March 15th: Covered the rest of chapter 3 by selecting material carefully, dropping section 3.2.2 on Little's theorem and concentrating on sections 3.2.3 and 3.3.1. Again I lost the sound on the first part of the recording (must have used the same bad microphone as last time), but there is sound on the second part (from about 45:00).

Notes. Video.

Wednesday, April 3rd: I started Chapter 4 by going through the description of continuous time Markov chains both in terms of transition probabilities \(p_t(i,j)\) and rates \(\lambda_i\) and jump probabilities \(u(i,j)\). I then proved Kolmogorov's backward equation and explained Example 4.8.

Notes. Video.
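
For reference, here is the backward equation in the notation above, writing \(Q(i,j)=\lambda_i u(i,j)\) for \(j\neq i\) and \(Q(i,i)=-\lambda_i\) (the book's notation may differ slightly):

\[
\frac{d}{dt}p_t(i,j) \;=\; \sum_{k} Q(i,k)\,p_t(k,j), \qquad \text{or in matrix form } P_t' = Q P_t .
\]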

Friday, April 5th: I started by deriving Kolmogorov's forward equation in both coordinate and matrix form. The rest of the time I spent on the Yule process (Example 4.11), trying to explain in a little more detail how one uses Kolmogorov's forward equations to solve it. It was probably a little too much of a good thing...

Notes. Video.
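
For comparison with the backward equation, the forward equation differentiates at the right endpoint of the time interval (same notation, and the same caveat about the book's conventions):

\[
\frac{d}{dt}p_t(i,j) \;=\; \sum_{k} p_t(i,k)\,Q(k,j), \qquad \text{or in matrix form } P_t' = P_t Q .
\]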

Wednesday, April 10th: Went through most of section 4.3 on limit behavior. Got a bit overenthusiastic in the proof of Theorem 4.8 (didn't think much of the one in the book), and if you're not an \(\epsilon\)-\(\delta\) aficionado, you may want to skip the last part of the proof. Didn't quite have time to do Example 4.15 properly, and will say a little more about it on Friday.

Notes. Video.

Friday, April 12th: Started by recapitulating Example 4.15 and showing a simple example of how it can be used in queueing theory. Then I turned to exit distributions, where I spent most of the time working out an example (taken almost directly from "Mesternes mester") in two different ways. Much of the machinery will be used again next time when we get to exit times.

Notes. Video.

Wednesday, April 17th: The physical lecture was canceled and replaced by the four prerecorded videos below:

1. Exit times (section 4.4). This video recycles an example from the previous lecture ("Mesternes mester"): Notes. Video.

2. Queueing theory: Notation and survey (intro to Section 4.5): Notes. Video.

3. Variations on M/M/1-queues (from section 4.5.1): Notes. Video.

4. M/M/s-queues (from section 4.5.2): Notes. Video.
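
As a reminder of the standard formula behind the M/M/1 material (the videos may use slightly different notation): with arrival rate \(\lambda\), service rate \(\mu\) and traffic intensity \(\rho=\lambda/\mu<1\), the stationary distribution of the queue length is

\[
\pi(n) = (1-\rho)\rho^{n}, \qquad n = 0, 1, 2, \dots,
\]

so the expected queue length in equilibrium is \(\rho/(1-\rho)\).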

Friday, April 19th. I first went through section 4.5.3, spending some time fixing a gap in the book (that if the distribution satisfies the detailed balance equation, then \(\pi_i p_t(i,j)=\pi_j p_t(j,i)\)). Then I started section 4.6 where I covered the simplest case with two consecutive queues.

Notes. Video.

Wednesday April 24th: Due to personal circumstances, I had to drop the physical lecture, and the digital lecture is quite short. It does, however, finish section 4.6 and the syllabus from the textbook.

Notes. Video.

Friday April 26th: This was a digital lecture (videos) on the note on Brownian motion, which is the last part of the syllabus. The lecture is quite short and consists of two videos:

1. Random walks with vanishing increments. Notes. Video.

2. Brownian motion: Notes. Video.
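
The scaling behind the first video, written out explicitly (standard formulation; the note may use slightly different notation): if \(S_n=X_1+\dots+X_n\) is a random walk with i.i.d. steps of mean 0 and variance 1, then the rescaled processes

\[
B^{(n)}_t = \frac{S_{\lfloor nt\rfloor}}{\sqrt{n}}
\]

converge in distribution to a Brownian motion as \(n\to\infty\).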

Friday, May 3rd: Unfortunately, I also had to cancel this lecture and replace it with two videos covering the last two sections of the note on Brownian motion. This is the last lecture on new material; the rest will be review.

1. Itô's Formula: Notes. Video.

2. Geometric Brownian motion: Notes. Video.
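
For reference, the two formulas these videos revolve around, in their simplest forms (the note may state more general versions): Itô's formula for a smooth function of Brownian motion, and the solution of the geometric Brownian motion equation.

\[
f(B_t) = f(B_0) + \int_0^t f'(B_s)\,dB_s + \tfrac{1}{2}\int_0^t f''(B_s)\,ds,
\]
\[
dS_t = \mu S_t\,dt + \sigma S_t\,dB_t \quad\Longrightarrow\quad S_t = S_0\exp\!\Big(\big(\mu - \tfrac{\sigma^2}{2}\big)t + \sigma B_t\Big).
\]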

Wednesday, May 8th: This semester is turning into a comedy. When I was finally ready to give a physical lecture again, I was stopped by a train incident, and had to cancel this lecture. It is replaced by the four recordings below.

1. Markov chains and communication classes: Notes. Video.

2. Transience and recurrence: Notes. Video.

3. Stationary distributions and how to find them: Notes. Video.

4. Existence and uniqueness of stationary distributions: Notes. Video.

Friday, May 10th: A physical lecture! I covered the rest of Chapter 1 (convergence, exit distributions, exit times, plus expected number of visits to a transient state, see this note). I also covered exponential random variables from Chapter 2.

Notes. Video (the microphone fell out around 7:45, but the sound is soon back).

Wednesday, May 15th: Before the break, I reviewed Sections 2.2-2.4 on Poisson processes and Section 3.1 on renewal processes. After the break, I did two old exam problems: Problem 3 from Exam 2017 and Problem 3 from Exam 2009.

Notes. Video.

Wednesday, May 22nd: Reviewed continuous time Markov chains and did part of Problem 2 from Exam 2013.

Notes. Video.

Friday, May 24th: Ended the lectures by reviewing queueing theory and Brownian motion (including geometric Brownian motion). I also did Problem 2 from Exam 2006 and a computation with geometric Brownian motion.

Notes. Video.
