Markov chain

From Glossary of Meteorology

A stochastic process with a finite number of states in which the probability of occurrence of a future state depends only on the current state; past states are irrelevant (the Markov property).
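The definition above can be illustrated with a minimal sketch of a finite-state chain. The two weather states and the transition probabilities below are invented purely for illustration; note that the next state is drawn using only the current state, never the history.

```python
import random

# Hypothetical two-state chain ("dry", "wet"); the transition
# probabilities are made up for illustration only.
STATES = ["dry", "wet"]
P = {
    "dry": {"dry": 0.8, "wet": 0.2},
    "wet": {"dry": 0.4, "wet": 0.6},
}

def step(state, rng):
    """Draw the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, prob in P[state].items():
        cumulative += prob
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Generate a realization of the chain from a given starting state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain
```

Because each transition reads only `chain[-1]`, the full history could be discarded at every step without changing the process.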

In meteorology, Markov chains have been used to describe a raindrop size distribution in which the state at time step n + 1 is determined only by collisions between pairs of drops comprising the size distribution at time step n.
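A toy version of the raindrop application can be sketched as follows. This is not an actual meteorological collision-coalescence model; it is a hypothetical illustration in which the state is the list of drop sizes, and each step updates that list solely by a collision between one randomly chosen pair of drops from the current distribution.

```python
import random

def coalescence_step(sizes, rng, p_coalesce=0.5):
    """One Markov step: pick a random pair of drops; with probability
    p_coalesce they merge into one drop whose size is the sum of the
    pair. The update depends only on the current size distribution."""
    if len(sizes) < 2:
        return sizes
    i, j = rng.sample(range(len(sizes)), 2)
    if rng.random() < p_coalesce:
        merged = sizes[i] + sizes[j]
        sizes = [s for k, s in enumerate(sizes) if k not in (i, j)]
        sizes.append(merged)
    return sizes

def evolve(sizes, n_steps, seed=0):
    """Advance the size distribution n_steps time steps."""
    rng = random.Random(seed)
    for _ in range(n_steps):
        sizes = coalescence_step(list(sizes), rng)
    return sizes
```

Total drop mass is conserved at every step (merging only sums sizes), while the number of drops can only decrease, mimicking the growth of large drops at the expense of small ones.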


Copyright 2024 American Meteorological Society (AMS). For permission to reuse any portion of this work, please contact permissions@ametsoc.org. Any use of material in this work that is determined to be “fair use” under Section 107 of the U.S. Copyright Act (17 U.S. Code § 107) or that satisfies the conditions specified in Section 108 of the U.S. Copyright Act (17 USC § 108) does not require AMS’s permission. Republication, systematic reproduction, posting in electronic form, such as on a website or in a searchable database, or other uses of this material, except as exempted by the above statement, require written permission or a license from AMS. Additional details are provided in the AMS Copyright Policy statement.