"Markov process"의 두 판 사이의 차이
Pythagoras0 (talk | contribs)
Latest revision as of 00:32, 17 February 2021
Notes
Corpus
- A Markov process is a random process in which the future is independent of the past, given the present.[1]
- Thus, Markov processes are the natural stochastic analogs of the deterministic processes described by differential and difference equations.[1]
- Sometimes the term Markov process is restricted to sequences in which the random variables can assume continuous values, and analogous sequences of discrete-valued variables are called Markov chains.[2]
- The defining property of a Markov process is commonly called the Markov property; it was first stated by A.A. Markov .[3]
- Examples of continuous-time Markov processes are furnished by diffusion processes (cf.[3]
- Then the corresponding Markov process can be taken to be right-continuous and having left limits (that is, its trajectories can be chosen so).[3]
- In the theory of Markov processes most attention is given to homogeneous (in time) processes.[3]
- A Markov process with stationary transition probabilities may or may not be a stationary process in the sense of the preceding paragraph.[4]
- If the Ys are identically distributed as well as independent, this transition probability does not depend on t, and then X(t) is a Markov process with stationary transition probabilities.[4]
- Since both the Poisson process and Brownian motion are created from random walks by simple limiting processes, they, too, are Markov processes with stationary transition probabilities.[4]
- The Ornstein-Uhlenbeck process defined as the solution (19) to the stochastic differential equation (18) is also a Markov process with stationary transition probabilities.[4]
- The connection between the Markov processes and certain linear equations is now well understood and can be explained in many ways.[5]
- We continue by constructing a similar semigroup using Markov processes.[5]
- and therefore, the infinitesimal operators of Markov processes do not contain any zeroth-order terms.[5]
- Each number represents the probability of the Markov process changing from one state to another state, with the direction indicated by the arrow.[6]
- In addition, there are other extensions of Markov processes that are referred to as such but do not necessarily fall within any of these four categories (see Markov model).[6]
- However, it is possible to model this scenario as a Markov process.[6]
- Markov chains and continuous-time Markov processes are useful in chemistry when physical systems closely approximate the Markov property.[6]
- When the range of possible values for ai is either finite or denumerably infinite, as in this study, the Markov process may be referred to as a Markov chain.[7]
- A first-order Markov process is a Markov process where the transition from a class to any other does not require intermediate transitions to other states.[7]
- As described in the methodology, the main hypothesis to be tested in this study is that LUCC in the study area is generated by a first order Markov process.[7]
- To prove H0, two subsidiary hypotheses must be verified: H1 - land use/cover in different time periods is not statistically independent, and H2 - LUCC in the study area is a Markov process.[7]
- The first correct mathematical construction of a Markov process with continuous trajectories was given by N. WIENER in 1923.[8]
- The general theory of Markov processes was developed in the 1930's and 1940's by A. N. KOLMOGOROV, W. FELLER, W. DOEBLIN, P. LEVY, J. L. DOOB, and others.[8]
- During the past ten years the theory of Markov processes has entered a new period of intensive development.[8]
- The methods of the theory of semigroups of linear operators made possible further progress in the classification of Markov processes by their infinitesimal characteristics.[8]
- The book begins with a review of basic probability, then covers the case of finite state, discrete time Markov processes.[9]
- Building on this, the text deals with the discrete time, infinite state case and provides background for continuous Markov processes with exponential random variables and Poisson processes.[9]
- It presents continuous Markov processes which include the basic material of Kolmogorov’s equations, infinitesimal generators, and explosions.[9]
- While Markov processes are touched on in probability courses, this book offers the opportunity to concentrate on the topic when additional study is required.[9]
- But in the literature, several empirical findings suggest that a Markov process is not appropriate for the credit rating process.[10]
- In the present paper the importance of Markov process is shown by analysing the reliability function and availability of feeding system of sugar industry.[11]
- To investigate these complex repairable systems various tools such as Markov process, semi-Markov, Petri nets, and dynamic FTA are developed.[11]
- Hence system involving Markov process is considered in this discussion with certain assumptions.[11]
- Thus the performance of a repairable system can be easily analysed by exhibiting time-homogeneous Markov process and modelling its finite states.[11]
- In this work a model is proposed for the assessment of the availability of fault tolerant systems, based on the integration of continuous time semi-Markov processes and Bayesian belief networks.[12]
- A numerical procedure is also proposed for the solution of the state probability equations of semi-Markov processes described in terms of transition rates.[12]
- (2004) who use semi-Markov processes to model a possible security intrusion and corresponding response of the fault tolerant software system to this event.[12]
- (1994) and Chandra & Kumar (1997) use Markov processes (MP) in order to model safety systems with stochastic TDTs.[12]
- The module first introduces the theory of Markov processes with continuous time parameter running on graphs.[13]
- In order to solve this problem we make use of Markov chains or Markov processes (which are a special type of stochastic process).[14]
- To consolidate your knowledge of Markov processes consider the following example.[14]
- One interesting application of Markov processes that I know of concerns the Norwegian offshore oil/gas industry.[14]
- To overcome this problem they model oil price as a Markov process with three price levels (states), corresponding to optimistic, most likely and pessimistic scenarios.[14]
- We consider a general homogeneous continuous-time Markov process with restarts.[15]
- We provide a connection between the transition probability functions of the original Markov process and the modified process with restarts.[15]
- This paper presents a Markov process–based method for deterioration prediction of building components using condition data collected by the City of Kingston in Australia.[16]
- The paper presents a typical decision-making method based on the Markov process.[16]
- Transitions in LAMP may be influenced by states visited in the distant history of the process, but unlike higher-order Markov processes, LAMP retains an efficient parameterization.[17]
- Martingale problems for general Markov processes are systematically developed for the first time in book form.[18]
- The linking model for all these examples is the Markov process, which includes random walk, Markov chain and Markov jump processes.[19]
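Several of the excerpts above describe discrete-state Markov chains through a transition matrix whose entry (i, j) gives the probability of moving from state i to state j. As a minimal illustrative sketch (the two-state matrix below is a made-up example, not taken from any of the cited sources), the following pure-Python code simulates such a chain and finds its stationary distribution by power iteration:

```python
import random

# Hypothetical two-state chain: P[i][j] = probability of moving from state i to j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """One transition: the next state depends only on the current state
    (the Markov property), not on how the chain arrived there."""
    return rng.choices(range(len(P)), weights=P[state])[0]

def simulate(n_steps, start=0, seed=42):
    """Sample a trajectory of the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

def stationary(P, iters=500):
    """Power iteration: repeatedly multiply a distribution by P
    until it stops changing."""
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P)))
              for j in range(len(P))]
    return pi

pi = stationary(P)   # for this matrix, pi is approximately (5/6, 1/6)
```

For this matrix the balance equation πP = π gives π = (5/6, 1/6) exactly, which the power iteration reproduces to floating-point accuracy.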
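Kolmogorov's equations and the availability analyses quoted above can likewise be sketched on the simplest continuous-time example: a two-state up/down system with failure rate lam and repair rate mu (the rate values below are illustrative assumptions, not figures from the cited studies). The forward equation for the probability p(t) of being "up" is dp/dt = -lam*p + mu*(1-p), and its steady state is the availability mu/(lam+mu):

```python
import math

lam, mu = 0.02, 0.5   # hypothetical failure and repair rates (per hour)

def p_up_euler(t, p0=1.0, steps=100_000):
    """Numerically integrate Kolmogorov's forward equation
    dp/dt = -lam*p + mu*(1-p) with the explicit Euler method."""
    dt = t / steps
    p = p0
    for _ in range(steps):
        p += dt * (-lam * p + mu * (1.0 - p))
    return p

def p_up_exact(t, p0=1.0):
    """Closed-form solution of the same linear ODE."""
    a = mu / (lam + mu)                          # steady-state availability
    return a + (p0 - a) * math.exp(-(lam + mu) * t)

availability = mu / (lam + mu)   # long-run fraction of time the system is up
```

As t grows, both the numerical and the exact solutions settle at mu/(lam+mu), the time-homogeneous steady state mentioned in the reliability excerpts.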
Sources
- [1] Markov Processes
- [2] Markov process | mathematics
- [3] Encyclopedia of Mathematics
- [4] Probability theory - Markovian processes
- [5] Markov Process - an overview
- [6] Markov chain
- [7] Markov Processes in Modeling Land Use and Land Cover Changes in Sintra-Cascais, Portugal
- [8] Markov Processes - Volume 1
- [9] Markov Processes
- [10] Portfolio optimization of credit risky bonds: a semi-Markov process approach
- [11] Application of Markov Process in Performance Analysis of Feeding System of Sugar Industry
- [12] A continuous-time semi-Markov Bayesian belief network model for availability measure estimation of fault tolerant systems
- [13] MA3H2 Markov Processes and Percolation Theory
- [14] Markov processes
- [15] Markov Processes with Restart
- [16] Markov Process for Deterioration Modeling and Asset Management of Community Buildings
- [17] Linear Additive Markov Processes
- [18] Markov Processes: Characterization and Convergence
- [19] MATH2750 Introduction to Markov Processes
Metadata
Wikidata
- ID : Q2221775
Spacy pattern list
- [{'LOWER': 'markov'}, {'LEMMA': 'process'}]
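The pattern above is a spaCy Matcher rule: it matches a token whose lowercase text is 'markov' immediately followed by a token whose lemma is 'process' (so it also catches "Markov processes"). A minimal pure-Python sketch of that matching logic, with a tiny hand-written lemma table standing in for a real lemmatizer, might look like:

```python
# Each token is (text, lemma); the lemma table is a hand-written stand-in
# for a real lemmatizer such as spaCy's.
LEMMAS = {"processes": "process", "process": "process"}

def tokenize(text):
    return [(t, LEMMAS.get(t.lower(), t.lower())) for t in text.split()]

PATTERN = [{"LOWER": "markov"}, {"LEMMA": "process"}]

def token_matches(token, spec):
    """Check one token against one pattern entry.
    Only the LOWER and LEMMA attributes are supported in this sketch."""
    text, lemma = token
    if "LOWER" in spec and text.lower() != spec["LOWER"]:
        return False
    if "LEMMA" in spec and lemma != spec["LEMMA"]:
        return False
    return True

def find_matches(tokens, pattern):
    """Return start indices where the pattern matches consecutive tokens."""
    n = len(pattern)
    return [i for i in range(len(tokens) - n + 1)
            if all(token_matches(tokens[i + j], pattern[j]) for j in range(n))]

toks = tokenize("Hidden Markov processes and Markov process theory")
hits = find_matches(toks, PATTERN)   # -> [1, 4]
```

With spaCy itself, the same rule would be registered on a `Matcher` object and run over a parsed `Doc`; the sketch above only reproduces the token-by-token attribute test.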