  1. Definition of Markov operator - Mathematics Stack Exchange

    Mar 26, 2021 · Tags: probability, stochastic-processes, stochastic-calculus, markov-process, stochastic-analysis

  2. When is a stochastic process defined via a SDE Markovian?

    On its own, the time-integral of an Ornstein-Uhlenbeck process is not a Markov process. However, the two-dimensional stochastic process, with one co-ordinate being the Ornstein-Uhlenbeck process and the other being its time-integral, …
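The situation in this snippet can be sketched numerically: simulate the Ornstein-Uhlenbeck process together with its running time-integral, so that the pair evolves as a two-dimensional Markov process (each step depends only on the current pair). The parameters, step size, and function name below are illustrative, not from the question; this is a plain Euler-Maruyama sketch.

```python
import numpy as np

def simulate_ou_with_integral(theta=1.0, mu=0.0, sigma=0.5,
                              x0=1.0, dt=1e-3, n_steps=10_000, seed=0):
    """Euler-Maruyama sketch: OU process X_t and its time-integral Y_t.

    The pair (X, Y) is updated using only its current value, which is
    exactly the (two-dimensional) Markov structure the answer describes;
    Y on its own would not be Markov.
    """
    rng = np.random.default_rng(seed)
    x, y = x0, 0.0
    xs, ys = [x], [y]
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment
        y += x * dt                          # integral co-ordinate
        x += theta * (mu - x) * dt + sigma * dw  # OU dynamics
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

xs, ys = simulate_ou_with_integral()
```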

  3. Ornstein-Uhlenbeck process: Markov, but not martingale?

    You are right; the Ornstein-Uhlenbeck process is a Markov process but not a martingale. It is simply not correct that every Markov process is a martingale (or vice versa).
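The "Markov but not martingale" point can be checked directly from the standard closed-form conditional mean of the OU process, E[X_{t+s} | X_t = x] = mu + (x - mu)·exp(-theta·s): it depends only on the current state x (Markov), yet it differs from x whenever theta > 0 and x ≠ mu, so the martingale condition E[X_{t+s} | X_t] = X_t fails. The parameter values below are illustrative.

```python
import math

def ou_conditional_mean(x, s, theta=1.0, mu=0.0):
    """Conditional mean E[X_{t+s} | X_t = x] of an OU process.

    A function of the current state x alone (consistent with the Markov
    property), but not equal to x: the process is pulled toward mu,
    so it is not a martingale.
    """
    return mu + (x - mu) * math.exp(-theta * s)

x = 2.0
print(ou_conditional_mean(x, s=1.0))  # below x: drifts toward mu = 0
```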

  4. Doeblin Condition of Markov Chains - Mathematics Stack Exchange

    Jan 2, 2023 · Tags: stochastic-processes, markov-chains

  5. Reference on Doob's h-transform - Mathematics Stack Exchange

    I am searching for a reference about conditioning a Markov process in the sense of Doob, i.e. using h-transforms. My particular concern is to condition a discrete-time Markov Process on a …
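For orientation, the discrete-time h-transform the question asks about can be sketched in a few lines: given a transition kernel P and a non-negative harmonic function h, the conditioned chain has kernel P^h(x, y) = P(x, y)·h(y)/h(x). The walk, state space, and harmonic function below are illustrative choices, not taken from the question: a simple random walk on {0, 1, 2, 3} absorbed at both ends, conditioned to be absorbed at 3, for which h(x) = P_x(hit 3 before 0) = x/3.

```python
import numpy as np

# Transition matrix of simple random walk on {0,1,2,3}, absorbed at 0 and 3.
P = np.zeros((4, 4))
P[0, 0] = P[3, 3] = 1.0
for x in (1, 2):
    P[x, x - 1] = P[x, x + 1] = 0.5

# h(x) = probability of hitting 3 before 0, starting from x; harmonic
# at the interior states 1 and 2.
h = np.array([0.0, 1 / 3, 2 / 3, 1.0])

# Doob h-transform: P^h(x, y) = P(x, y) * h(y) / h(x) where h(x) > 0.
Ph = np.zeros_like(P)
for x in (1, 2):
    for y in range(4):
        Ph[x, y] = P[x, y] * h[y] / h[x]
Ph[3, 3] = 1.0  # the target state stays absorbing

print(Ph[1])  # from state 1 the conditioned walk steps up with probability 1
```

Note that each transformed row still sums to 1 precisely because h is harmonic; that is the consistency check behind the construction.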

  6. probability - 'Markovian Property' vs 'Memoryless Property ...

    Aug 23, 2015 · Finally, note that n-grams, for instance, illustrate a canonical example of the distinction above between Markov processes and the simplest possible memoryless processes.
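The n-gram distinction mentioned above can be made concrete with a toy corpus (the corpus and counts below are made up for illustration): a unigram model is memoryless, assigning each word a probability independent of history, while a bigram model is an order-1 Markov model that conditions on the previous word only.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ran".split()

# Memoryless (unigram) model: P(w), no conditioning on history.
unigram = Counter(corpus)

# Order-1 Markov (bigram) model: P(w | previous word) -- the "state"
# is just the preceding word, nothing further back matters.
bigram = defaultdict(Counter)
for prev, cur in zip(corpus, corpus[1:]):
    bigram[prev][cur] += 1

# Under the bigram model "the" is only ever followed by "cat" or "mat",
# while the unigram model would happily emit any corpus word after "the".
print(dict(bigram["the"]))  # {'cat': 2, 'mat': 1}
```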

  7. reference request - Good introductory book for Markov processes ...

    Nov 21, 2011 · Which is a good introductory book for Markov chains and Markov processes? Thank you.

  8. stochastic processes - Does Markovian property imply …

    Jul 28, 2017 · A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present states) …
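The definition quoted in this snippet has a compact matrix form for a finite-state chain: the next-state distribution is computed from the current distribution alone, pi_{n+1} = pi_n P, with no reference to how the chain arrived there. A minimal sketch (the matrix and starting distribution are illustrative):

```python
import numpy as np

# A two-state transition matrix; row x gives P(next state | current state x).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

pi = np.array([1.0, 0.0])  # start surely in state 0
for _ in range(3):
    pi = pi @ P  # each update uses only the current distribution:
                 # the Markov property in matrix form
print(pi)
```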

  9. probability theory - Are Markov chains necessarily time …

    May 18, 2015 · Transition probabilities of Markov Chains most definitely can depend on time. The ones that don't are called time-homogeneous. For instance in a discrete time discrete state …
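The time-dependence described above is easy to exhibit: let the transition matrix itself be a function of the step index n. The matrices below are made up for illustration; the chain is still Markov (each step uses only the current distribution), just not time-homogeneous.

```python
import numpy as np

def P(n):
    """Transition matrix at step n: the mixing probability grows with n,
    so no single matrix describes every step (not time-homogeneous)."""
    p = 0.1 * (n + 1)
    return np.array([[1 - p, p],
                     [p, 1 - p]])

pi = np.array([1.0, 0.0])
for n in range(3):
    pi = pi @ P(n)  # a different matrix at each step
print(pi)
```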

  10. probability - Markov umbrellas - Mathematics Stack Exchange

    Tags: probability, markov-chains