A Markov process, named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property: predictions about the future of the process based solely on its present state are just as good as predictions made with knowledge of the process's full history.
A Markov chain is a Markov process with either a discrete state space or a discrete index set. It is common to define a Markov chain as a Markov process in either discrete or continuous time with a countable state space, but it is also common to define a Markov chain as one having discrete time on either a countable or a continuous state space.
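The definition above can be illustrated with a minimal sketch of a discrete-time Markov chain on a countable (here, two-element) state space. The states and transition probabilities below are invented for illustration; the key point is that each step depends only on the current state, never on the earlier history.

```python
import random

# Hypothetical two-state chain; these states and probabilities are
# illustrative only, not taken from any real system.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Draw the next state using only the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Simulate a trajectory of the chain for n_steps transitions."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

trajectory = simulate("sunny", 10)
```

Because `step` consults only its `state` argument, the simulation could be handed any present state and continue identically, regardless of how that state was reached.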
Markov chains have many applications as statistical models of real-world processes, such as studying cruise control systems in motor vehicles, queues of customers arriving at an airport, currency exchange rates, storage systems such as dams, and the population growth of certain animal species. They form the basis for general stochastic simulation methods known as Markov chain Monte Carlo (MCMC), such as Gibbs sampling, which are used for simulating random objects with specific probability distributions.
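As a concrete sketch of the MCMC idea, the following minimal random-walk Metropolis sampler (one MCMC method; Gibbs sampling is another) constructs a Markov chain whose long-run distribution matches a target density. The target here, a standard normal, and all tuning constants are chosen for illustration.

```python
import math
import random

def metropolis_sample(log_target, n_samples, start=0.0, step_size=1.0, seed=0):
    """Random-walk Metropolis: each proposal depends only on the current
    point, so the samples form a Markov chain targeting log_target."""
    rng = random.Random(seed)
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal density, up to an additive log-constant.
samples = metropolis_sample(lambda x: -0.5 * x * x, 20000)
mean = sum(samples) / len(samples)
```

The sample mean should settle near 0, the mean of the target distribution, which is the sense in which the chain "simulates a random object with a specific probability distribution."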