Abstract: Our point of departure is the evolution equation of a Markov process, which describes how the transition probability changes as time passes. We compare the transition probability of an a priori model with the transition probability of the actually observed process to detect a mismatch between the expected and the measured data. To translate this idea into an algorithm, we characterise the involved measures by their moments. Specifically, we put forward a linear dynamic system that describes the evolution of the moments. As the final result, we define a moment divergence as a means of computing the distance between two sequences of moments. We see this work as a step towards merging model-driven and data-driven concepts in control engineering. To elucidate the concepts introduced, we include several simple examples.
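To make the abstract's pipeline concrete, the sketch below illustrates the idea for a finite-state Markov chain, where the distribution evolves linearly and therefore so does every moment. It is not the paper's method: the transition matrices, state values, moment orders, and the weighted squared-distance form of the divergence are all illustrative assumptions standing in for the paper's actual definitions.

```python
import numpy as np

# Hypothetical 3-state Markov chain with state values x. The distribution
# evolves linearly, p_{t+1} = p_t @ P, so every moment
# m_k(t) = sum_i p_t(i) * x_i**k is driven by the same linear dynamics.
x = np.array([-1.0, 0.0, 1.0])            # assumed state values
P_model = np.array([[0.8, 0.1, 0.1],      # a priori transition matrix (assumed)
                    [0.1, 0.8, 0.1],
                    [0.1, 0.1, 0.8]])
P_true = np.array([[0.7, 0.2, 0.1],       # "observed" transition matrix (assumed)
                   [0.1, 0.7, 0.2],
                   [0.2, 0.1, 0.7]])

def moment_sequence(P, p0, x, orders, T):
    """First `orders` moments of the chain over T time steps."""
    p = p0.copy()
    seq = []
    for _ in range(T):
        seq.append([np.dot(p, x**k) for k in range(1, orders + 1)])
        p = p @ P                          # linear evolution of the distribution
    return np.array(seq)                   # shape (T, orders)

def moment_divergence(m1, m2, weights=None):
    """Weighted squared distance between two moment sequences
    (one plausible choice; the paper's definition may differ)."""
    d = m1 - m2
    if weights is None:
        weights = np.ones(d.shape[1])
    return float(np.sum(weights * d**2))

p0 = np.array([1.0, 0.0, 0.0])             # start in the first state
m_model = moment_sequence(P_model, p0, x, orders=2, T=20)
m_true = moment_sequence(P_true, p0, x, orders=2, T=20)
print("moment divergence:", moment_divergence(m_model, m_true))
```

A nonzero divergence here flags the mismatch between the a priori model and the observed process, which is the detection idea the abstract describes.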