Markov process

Notes

Corpus

  1. A Markov process is a random process in which the future is independent of the past, given the present.[1]
  2. Thus, Markov processes are the natural stochastic analogs of the deterministic processes described by differential and difference equations.[1]
  3. Sometimes the term Markov process is restricted to sequences in which the random variables can assume continuous values, and analogous sequences of discrete-valued variables are called Markov chains.[2]
  4. The defining property of a Markov process is commonly called the Markov property; it was first stated by A.A. Markov.[3]
  5. Examples of continuous-time Markov processes are furnished by diffusion processes (cf. Diffusion process).[3]
  6. Then the corresponding Markov process can be taken to be right-continuous with left limits (that is, its trajectories can be chosen so).[3]
  7. In the theory of Markov processes most attention is given to homogeneous (in time) processes.[3]
  8. A Markov process with stationary transition probabilities may or may not be a stationary process in the sense of the preceding paragraph.[4]
  9. If the Ys are identically distributed as well as independent, this transition probability does not depend on t, and then X(t) is a Markov process with stationary transition probabilities.[4]
  10. Since both the Poisson process and Brownian motion are created from random walks by simple limiting processes, they, too, are Markov processes with stationary transition probabilities.[4]
  11. The Ornstein-Uhlenbeck process defined as the solution (19) to the stochastic differential equation (18) is also a Markov process with stationary transition probabilities.[4] (A simulation sketch of such a process follows this list.)
  12. The connection between the Markov processes and certain linear equations is now well understood and can be explained in many ways.[5]
  13. We continue by constructing a similar semigroup using Markov processes.[5]
  14. Therefore, the infinitesimal operators of Markov processes do not contain any zeroth-order terms.[5]
  15. Each number represents the probability of the Markov process changing from one state to another state, with the direction indicated by the arrow.[6] (See the transition-matrix sketch after this list.)
  16. In addition, there are other extensions of Markov processes that are referred to as such but do not necessarily fall within any of these four categories (see Markov model).[6]
  17. However, it is possible to model this scenario as a Markov process.[6]
  18. Markov chains and continuous-time Markov processes are useful in chemistry when physical systems closely approximate the Markov property.[6]
  19. When the range of possible values for a_i is either finite or denumerably infinite, as in this study, the Markov process may be referred to as a Markov chain.[7]
  20. A first-order Markov process is a Markov process where the transition from a class to any other does not require intermediate transitions to other states.[7]
  21. As described in the methodology, the main hypothesis to be tested in this study is that LUCC in the study area is generated by a first-order Markov process.[7]
  22. To prove H0, two subsidiary hypotheses must be verified: H1, that land use/cover in different time periods is not statistically independent, and H2, that LUCC in the study area is a Markov process.[7]
  23. The first correct mathematical construction of a Markov process with continuous trajectories was given by N. Wiener in 1923.[8]
  24. The general theory of Markov processes was developed in the 1930s and 1940s by A. N. Kolmogorov, W. Feller, W. Doeblin, P. Lévy, J. L. Doob, and others.[8]
  25. During the past ten years the theory of Markov processes has entered a new period of intensive development.[8]
  26. The methods of the theory of semigroups of linear operators made possible further progress in the classification of Markov processes by their infinitesimal characteristics.[8]
  27. The book begins with a review of basic probability, then covers the case of finite-state, discrete-time Markov processes.[9]
  28. Building on this, the text deals with the discrete-time, infinite-state case and provides background for continuous Markov processes with exponential random variables and Poisson processes.[9]
  29. It presents continuous Markov processes, which include the basic material of Kolmogorov’s equations, infinitesimal generators, and explosions.[9] (See the generator sketch after this list.)
  30. While Markov processes are touched on in probability courses, this book offers the opportunity to concentrate on the topic when additional study is required.[9]
  31. But in the literature, several empirical findings suggest that a Markov process is not appropriate for the credit rating process.[10]
  32. In the present paper the importance of the Markov process is shown by analysing the reliability function and availability of the feeding system of a sugar industry.[11]
  33. To investigate these complex repairable systems, various tools such as Markov processes, semi-Markov processes, Petri nets, and dynamic FTA have been developed.[11]
  34. Hence a system involving a Markov process is considered in this discussion under certain assumptions.[11]
  35. Thus the performance of a repairable system can be easily analysed by treating it as a time-homogeneous Markov process and modelling its finite states.[11]
  36. This work proposes a model for assessing the availability of fault-tolerant systems, based on the integration of continuous-time semi-Markov processes and Bayesian belief networks.[12]
  37. It also proposes a numerical procedure for solving the state probability equations of semi-Markov processes described in terms of transition rates.[12]
  38. (2004) who use semi-Markov processes to model a possible security intrusion and corresponding response of the fault tolerant software system to this event.[12]
  39. (1994) and Chandra & Kumar (1997) use Markov processes (MP) in order to model safety systems with stochastic TDTs.[12]
  40. The module first introduces the theory of Markov processes with continuous time parameter running on graphs.[13]
  41. In order to solve this problem we make use of Markov chains or Markov processes (which are a special type of stochastic process).[14]
  42. To consolidate your knowledge of Markov processes consider the following example.[14]
  43. One interesting application of Markov processes that I know of concerns the Norwegian offshore oil/gas industry.[14]
  44. To overcome this problem they model oil price as a Markov process with three price levels (states), corresponding to optimistic, most likely and pessimistic scenarios.[14]
  45. We consider a general homogeneous continuous-time Markov process with restarts.[15]
  46. We provide a connection between the transition probability functions of the original Markov process and the modified process with restarts.[15] (See the restart sketch after this list.)
  47. This paper presents a Markov process–based method for deterioration prediction of building components using condition data collected by the City of Kingston in Australia.[16]
  48. The paper presents a typical decision-making method based on the Markov process.[16]
  49. Transitions in LAMP may be influenced by states visited in the distant history of the process, but unlike higher-order Markov processes, LAMP retains an efficient parameterization.[17]
  50. Martingale problems for general Markov processes are systematically developed for the first time in book form.[18]
  51. The linking model for all these examples is the Markov process, which includes random walk, Markov chain and Markov jump processes.[19]
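
For note 11, a minimal sketch of an Ornstein-Uhlenbeck path via the Euler-Maruyama scheme. The equation numbers (18) and (19) belong to the cited source; the SDE form dX_t = -theta*X_t dt + sigma*dW_t and all parameter values below are generic illustrative choices, not values from that source.

```python
import numpy as np

# Euler-Maruyama simulation of an Ornstein-Uhlenbeck process
#   dX_t = -theta * X_t dt + sigma * dW_t
# (theta, sigma, x0, dt are illustrative, not taken from the cited source)
rng = np.random.default_rng(0)
theta, sigma = 1.0, 0.5          # mean-reversion rate, noise intensity
dt, n_steps = 0.01, 1000
x = np.empty(n_steps + 1)
x[0] = 2.0                       # initial state x0
for k in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt))                 # Brownian increment
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * dw  # one Euler-Maruyama step

print(x[-1])  # state after n_steps * dt time units; the path reverts towards 0
```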
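
For note 15, a two-state chain whose transition matrix carries exactly the numbers one would write on the arrows of a state-transition diagram; the values are illustrative.

```python
import numpy as np

# Row-stochastic transition matrix: P[i, j] is the probability of moving
# from state i to state j in one step (the number on the arrow i -> j).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

rng = np.random.default_rng(0)
state, path = 0, [0]
for _ in range(10):
    # Markov property: the next state depends only on the current state.
    state = int(rng.choice(2, p=P[state]))
    path.append(state)
print(path)  # a sample trajectory through states 0 and 1
```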
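
For note 29, a sketch of how Kolmogorov's equations look in the simplest continuous-time setting: for a finite-state chain with infinitesimal generator Q, both the forward and backward equations P'(t) = P(t)Q = QP(t) are solved by the matrix exponential. SciPy is assumed to be available, and the rates in Q are illustrative.

```python
import numpy as np
from scipy.linalg import expm

# Infinitesimal generator of a 3-state continuous-time Markov chain:
# off-diagonal entries are nonnegative jump rates, each row sums to zero.
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])

# Kolmogorov's forward/backward equations P'(t) = P(t) Q = Q P(t)
# have the solution P(t) = expm(t Q), a stochastic matrix for every t >= 0.
for t in (0.1, 1.0, 10.0):
    P_t = expm(t * Q)
    print(t, P_t[0])  # transition probabilities out of state 0 at time t
```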
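
For notes 45-46, one concrete instance of the connection between a process and its modification with restarts. Here the restarts are assumed to arrive by an independent rate-r Poisson clock that sends the chain back to state 0; conditioning on the time of the last restart before t expresses the modified transition function through the original one, which the sketch checks numerically. The generator and rate are illustrative, and this finite-state version is only a special case of the general result in the cited paper.

```python
import numpy as np
from scipy.linalg import expm

# Original chain: generator Q. Modified chain: additionally, at rate r,
# jump to state 0 (a "restart"). All numbers are illustrative.
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  2.0, -2.0]])
r = 0.7
E = np.zeros_like(Q)
E[:, 0] = 1.0                          # deterministic jump to state 0
Q_restart = Q + r * (E - np.eye(3))    # generator of the process with restarts

# Conditioning on the last restart time before t gives, in every row,
#   P_restart(t) = exp(-r t) P(t) + integral_0^t r exp(-r s) P(s)[0, :] ds.
t = 2.0
s = np.linspace(0.0, t, 2001)
rows = np.array([r * np.exp(-r * si) * expm(si * Q)[0] for si in s])
integral = ((rows[:-1] + rows[1:]) / 2.0 * np.diff(s)[:, None]).sum(axis=0)
lhs = expm(t * Q_restart)
rhs = np.exp(-r * t) * expm(t * Q) + np.tile(integral, (3, 1))
print(np.abs(lhs - rhs).max())         # near zero (quadrature error only)
```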

Sources

Metadata

Wikidata

Spacy pattern list

  • [{'LOWER': 'markov'}, {'LEMMA': 'process'}]
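
A minimal usage sketch for the pattern above, assuming spaCy and its small English model en_core_web_sm are installed:

```python
import spacy
from spacy.matcher import Matcher

nlp = spacy.load("en_core_web_sm")
matcher = Matcher(nlp.vocab)
# The pattern from the list above: a token spelled "markov" (any casing)
# followed by a token whose lemma is "process" ("process", "processes", ...).
matcher.add("MARKOV_PROCESS", [[{"LOWER": "markov"}, {"LEMMA": "process"}]])

doc = nlp("Markov chains and continuous-time Markov processes are useful in chemistry.")
for match_id, start, end in matcher(doc):
    print(doc[start:end].text)  # -> "Markov processes"
```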