
Surprise Probabilities in Markov Chains

Published online by Cambridge University Press:  16 March 2017

JAMES NORRIS
Affiliation:
Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WB, UK (e-mail: james@statslab.cam.ac.uk)
YUVAL PERES
Affiliation:
Microsoft Research, Redmond, Washington, WA 98052, USA (e-mail: peres@microsoft.com)
ALEX ZHAI
Affiliation:
Department of Mathematics, Stanford University, Stanford, CA 94305, USA (e-mail: azhai@stanford.edu)

Abstract

In a Markov chain started at a state x, the hitting time τ(y) is the first time that the chain reaches another state y. We study the probability $\mathbb{P}_x(\tau(y) = t)$ that the first visit to y occurs precisely at a given time t. Informally speaking, the event that a new state is visited at a large time t may be considered a ‘surprise’. We prove the following three bounds.

  • In any Markov chain with n states, $\mathbb{P}_x(\tau(y) = t) \le {n}/{t}$ .

  • In a reversible chain with n states, $\mathbb{P}_x(\tau(y) = t) \le {\sqrt{2n}}/{t}$ for $t \ge 4n + 4$ .

  • For random walk on a simple graph with n ≥ 2 vertices, $\mathbb{P}_x(\tau(y) = t) \le 4e \log(n)/t$ .

We construct examples showing that these bounds are close to optimal. The main feature of our bounds is that they require very little knowledge of the structure of the Markov chain.
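As an illustration of the first bound (this sketch is not from the paper; the chain, state names, and trial counts below are hypothetical choices for demonstration), one can simulate a small Markov chain, estimate $\mathbb{P}_x(\tau(y) = t)$ by Monte Carlo, and check it against $n/t$:

```python
import random

def hitting_time(P, x, y, rng, t_max=10_000):
    """Run the chain with transition matrix P from state x and
    return the first time t >= 1 at which state y is visited
    (or None if y is not reached within t_max steps)."""
    state = x
    for t in range(1, t_max + 1):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        if state == y:
            return t
    return None

# Example chain (an arbitrary choice): a lazy directed cycle on 4 states,
# which stays put or moves forward, each with probability 1/2.
n_states = 4
P = [[0.0] * n_states for _ in range(n_states)]
for i in range(n_states):
    P[i][i] = 0.5
    P[i][(i + 1) % n_states] = 0.5

rng = random.Random(0)
trials = 20_000
t_probe = 10  # a fixed time t at which to estimate the surprise probability
hits = sum(hitting_time(P, 0, 3, rng) == t_probe for _ in range(trials))
estimate = hits / trials
print(f"P_0(tau(3) = {t_probe}) ~ {estimate:.4f}, bound n/t = {n_states / t_probe}")
```

For this chain the exact value is $\binom{t-1}{2} 2^{-t} \approx 0.035$ at $t = 10$, comfortably below the bound $n/t = 0.4$; the point of the theorem is that the $n/t$ bound needs no such chain-specific computation.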

To prove the bound for random walk on graphs, we establish the following estimate conjectured by Aldous, Ding and Oveis-Gharan (private communication): for random walk on an n-vertex graph, for every initial vertex x,

$$\sum_y \biggl( \sup_{t \ge 0} p^t(x, y) \biggr) = O(\log n). $$
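The quantity on the left can be computed directly for small graphs by iterating the transition matrix and tracking, for each $y$, the running maximum of $p^t(x, y)$. The sketch below (the graph, horizon `T`, and size `n` are illustrative assumptions, not from the paper) does this for simple random walk on a path:

```python
# Simple random walk on a path with n vertices (an illustrative graph choice).
n = 16

P = [[0.0] * n for _ in range(n)]
for i in range(n):
    nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
    for j in nbrs:
        P[i][j] = 1.0 / len(nbrs)

def step(P, v):
    """One step of the walk: multiply the row vector v by the matrix P."""
    m = len(P)
    return [sum(v[i] * P[i][j] for i in range(m)) for j in range(m)]

x = 0
dist = [1.0 if j == x else 0.0 for j in range(n)]  # p^0(x, .)
sup_p = dist[:]  # running value of sup_{t} p^t(x, y) for each y
T = 400  # finite horizon standing in for sup over all t >= 0
for _ in range(T):
    dist = step(P, dist)
    sup_p = [max(s, d) for s, d in zip(sup_p, dist)]

total = sum(sup_p)
print(f"sum_y sup_t p^t(x, y) = {total:.3f}  (n = {n})")
```

The sum always lies between 1 (from the $t = 0$ term at $y = x$) and $n$ (each supremum is at most 1); the content of the estimate is that it is in fact $O(\log n)$ uniformly over all $n$-vertex graphs.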

Copyright © Cambridge University Press 2017 
