# 20th European Young Statisticians Meeting

Aug 14 – 18, 2017
Uppsala University
Europe/Stockholm timezone

## Viterbi process for pairwise Markov models

Aug 17, 2017, 10:00 AM
30m
Ångströmslaboratoriet (Uppsala University)

### Speaker

Mr Joonas Sova (University of Tartu)

### Description

My talk is based on ongoing joint work with my supervisor Jüri Lember. We consider a Markov chain $Z = \{Z_k\}_{k \geq 1}$ with product state space $\mathcal{X}\times \mathcal{Y}$, where $\mathcal{Y}$ is a finite set (state space) and $\mathcal{X}$ is an arbitrary separable metric space (observation space). Thus, the process $Z$ decomposes as $Z=(X,Y)$, where $X=\{X_k \}_{k\geq 1}$ and $Y=\{Y_k \}_{k\geq 1}$ are random processes taking values in $\mathcal{X}$ and $\mathcal{Y}$, respectively. Following \cite{pairwise,pairwise2,pairwise3}, we call the process $Z$ a \textit{pairwise Markov model}. The process $X$ is identified as an observation process, and the process $Y$, sometimes called the \textit{regime}, models the hidden state sequence driving the observations. Therefore our general model contains many well-known stochastic models as special cases: hidden Markov models, Markov switching models, hidden Markov models with dependent noise and many more.

The \textit{segmentation} or \textit{path estimation} problem consists of estimating the realization of $(Y_1,\ldots,Y_n)$ given a realization $x_{1:n}$ of $(X_1,\ldots,X_n)$. A standard estimate is any path $v_{1:n}\in \mathcal{Y}^n$ having maximum posterior probability: $$v_{1:n}=\mathop{\mathrm{argmax}}_{y_{1:n}}P(Y_{1:n}=y_{1:n}|X_{1:n}=x_{1:n}).$$ Any such path is called a \textit{Viterbi path}, and we are interested in the behaviour of $v_{1:n}$ as $n$ grows. The study of the asymptotics of Viterbi paths is complicated by the fact that adding one more observation $x_{n+1}$ can change the whole path, and so it is not clear whether there exists a limiting infinite Viterbi path. We show that under some conditions the infinite Viterbi path indeed exists for almost every realization $x_{1:\infty}$ of $X$, thereby defining an infinite Viterbi decoding of $X$, called the \textit{Viterbi process}. This is done through the construction of \textit{barriers}.
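In the hidden-Markov special case, the maximum-posterior path above is computed by the classical Viterbi dynamic programme. As a minimal illustration (all parameters below are hypothetical toy values, not from the talk), a log-domain sketch:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Return an argmax path of P(Y_{1:n}=y_{1:n} | X_{1:n}=x_{1:n})
    for an HMM with initial distribution pi, transition matrix A and
    emission matrix B (all hypothetical toy parameters)."""
    n, K = len(obs), len(pi)
    delta = np.full((n, K), -np.inf)        # DP scores in the log domain
    psi = np.zeros((n, K), dtype=int)       # back-pointers
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, n):
        for j in range(K):
            scores = delta[t - 1] + np.log(A[:, j])
            psi[t, j] = int(np.argmax(scores))
            delta[t, j] = scores[psi[t, j]] + np.log(B[j, obs[t]])
    # backtrack from the best terminal state
    path = [int(np.argmax(delta[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# toy two-state example with binary observations
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.7, 0.3], [0.1, 0.9]])
v = viterbi([0, 0, 1, 1, 1], pi, A, B)  # → [0, 0, 1, 1, 1]
```

The log domain avoids numerical underflow for long observation sequences; ties in the argmax are broken arbitrarily, matching the fact that the Viterbi path need not be unique.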
A barrier is a fixed-sized block in the observations $x_{1:n}$ that fixes the Viterbi path up to itself: for every continuation of $x_{1:n}$, the Viterbi path up to the barrier remains unchanged. Therefore, if almost every realization of the $X$-process contains infinitely many barriers, then the Viterbi process exists. Having infinitely many barriers is not necessary for the existence of an infinite Viterbi path, but the barrier construction has several advantages. One of them is that it allows the infinite path to be constructed \textit{piecewise}: to determine the first $k$ elements $v_{1:k}$ of the infinite path, it suffices to observe $x_{1:n}$ for $n$ big enough. The barrier construction has another great advantage: namely, the process $(Z,V)=\{(Z_k,V_k)\}_{k \geq 1}$, where $V= \{V_k\}_{k \geq 1}$ denotes the Viterbi process, is under certain conditions regenerative. This can be proven by, roughly speaking, applying the Markov splitting method to construct regeneration times for $Z$ which coincide with the occurrences of barriers. Regenerativity of $(Z,V)$ makes it easy to prove limit theorems for understanding the asymptotic behaviour of inferences based on Viterbi paths. In fact, in the special case of a hidden Markov model this regenerative property has already been known to hold and has found several applications \cite{AV,AVacta,Vsmoothing,Vrisk,iowa}.
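The instability that motivates barriers appears already in tiny examples. The following brute-force sketch (a hypothetical two-state HMM, exhaustive enumeration, so suitable for tiny $n$ only, and only checking one-step continuations rather than all continuations as a true barrier requires) shows that a single extra observation can change the entire MAP path:

```python
import itertools
import numpy as np

# hypothetical toy HMM parameters, chosen only for illustration
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.6, 0.4], [0.4, 0.6]])

def map_path(obs):
    """argmax_y P(Y=y, X=obs) by exhaustive enumeration (tiny n only);
    maximising the joint is equivalent to maximising the posterior."""
    best, best_p = None, -1.0
    for y in itertools.product(range(2), repeat=len(obs)):
        p = pi[y[0]] * B[y[0], obs[0]]
        for t in range(1, len(obs)):
            p *= A[y[t - 1], y[t]] * B[y[t], obs[t]]
        if p > best_p:
            best, best_p = list(y), p
    return best

def first_disagreement(a, b):
    """Index of the first position where two paths differ, or len(a)."""
    for k, (s, t) in enumerate(zip(a, b)):
        if s != t:
            return k
    return len(a)

obs = [0, 1, 1]
v3 = map_path(obs)
# length of the path prefix that survives every one-step continuation
stable_prefix = min(
    first_disagreement(map_path(obs + [x])[: len(obs)], v3) for x in (0, 1)
)
```

For these toy parameters the decoded path for $x_{1:3}$ is $(1,1,1)$, yet appending the observation $0$ flips the whole decoded path to $(0,0,0,0)$, so not even the first state is stable: exactly the phenomenon that makes the existence of an infinite Viterbi path non-trivial.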

### Primary author

Mr Joonas Sova (University of Tartu)

### Presentation materials

There are no materials yet.