A time series concept, pretty technical but does come up in our fields.
Details #
A Markov, or Markovian, process (or chain) is a time series in which there is serial dependency: that is, the values at one point in time are not independent of the values at other points. The crucial feature of a Markov process is that the only direct relationship is with the immediately preceding point in time; any effect of earlier points comes only through the fact that each earlier point affected the point after it. As Wikipedia puts it nicely: “What happens next depends only on the state of affairs now.” This is a very basic time series model.
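A minimal sketch of that "state of affairs now" idea, using a hypothetical two-state chain of emotional states (the states and transition probabilities here are invented purely for illustration, not from any real data):

```python
# Hypothetical transition probabilities: each row gives the probability
# of the next state given ONLY the current state (the Markov property).
P = {
    "calm":    {"calm": 0.8, "anxious": 0.2},
    "anxious": {"calm": 0.4, "anxious": 0.6},
}

def step_dist(dist, P):
    """One step of the chain: the new probability of each state depends
    only on the current distribution, never on earlier history."""
    states = P.keys()
    return {s: sum(dist[r] * P[r][s] for r in states) for s in states}

# Start certainly "calm" and iterate; the chain settles towards its
# stationary distribution (here 2/3 calm, 1/3 anxious) regardless of
# where it started.
dist = {"calm": 1.0, "anxious": 0.0}
for _ in range(50):
    dist = step_dist(dist, P)
```

Because the next step uses only the current distribution, the whole future of the chain is summarised by where it is now.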
A Markov process can involve any sort of variable: categorical, which might be about transitions between emotional states; counts; or continuous. If continuous, the autocorrelation (q.v.) will show non-zero correlations at lag 1 and, decaying away, at larger lags, but the partial autocorrelation will show a non-zero correlation only at lag 1.
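That autocorrelation/partial autocorrelation signature can be sketched by simulating a continuous Markov process, i.e. an AR(1) series (the coefficient 0.7 and the series length are arbitrary illustrative choices):

```python
import random

random.seed(1)
phi, n = 0.7, 20_000
# AR(1): each value is phi times the previous value plus fresh noise,
# so the series depends on its past only through the last observation.
x = [0.0]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0, 1))

def acf(x, lag):
    """Sample autocorrelation of the series at the given lag."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, len(x)))
    den = sum((v - m) ** 2 for v in x)
    return num / den

r1, r2 = acf(x, 1), acf(x, 2)
# The autocorrelation decays geometrically: r2 is close to r1 squared.
# The lag-2 PARTIAL autocorrelation, computed from r1 and r2, is near
# zero: once lag 1 is allowed for, lag 2 adds nothing.
pacf2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
```

Here `r1` comes out near 0.7 and `r2` near 0.49, while `pacf2` hovers around zero: the long-lag correlations are entirely inherited from the lag 1 dependence.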
If a process seems to fit the Markovian model it is relatively easy to predict its behaviour, but in real life in our fields truly Markovian processes would, I think, be vanishingly rare as generally our processes have memories longer than lag 1.
Try also #
- Autocorrelation
- Time series analysis
Chapters #
Not covered in the OMbook.
Online resources #
Maybe an Rblog exemplar at some point or a Shiny app allowing you to create your own Markov process. Appealing but not really my priority as it’s not something I use.
Dates #
First created 26.i.25.