"Gaussian process"의 두 판 사이의 차이
둘러보기로 가기
검색하러 가기
Pythagoras0 (토론 | 기여) (→노트: 새 문단) |
Pythagoras0 (토론 | 기여) |
||
(같은 사용자의 중간 판 5개는 보이지 않습니다) | |||
5번째 줄: | 5번째 줄: | ||
===말뭉치=== | ===말뭉치=== | ||
# Below is a collection of papers relevant to learning in Gaussian process models.<ref name="ref_dab8abe7">[http://www.gaussianprocess.org/ The Gaussian Processes Web Site]</ref> | # Below is a collection of papers relevant to learning in Gaussian process models.<ref name="ref_dab8abe7">[http://www.gaussianprocess.org/ The Gaussian Processes Web Site]</ref> | ||
− | # Since Gaussian process classification scales cubically with the size of the dataset, this might be considerably faster.<ref name="ref_bcf82937">[http://scikit-learn.org/stable/modules/gaussian_process.html 1.7. Gaussian Processes — scikit-learn 0. | + | # Since Gaussian process classification scales cubically with the size of the dataset, this might be considerably faster.<ref name="ref_bcf82937">[http://scikit-learn.org/stable/modules/gaussian_process.html 1.7. Gaussian Processes — scikit-learn 0.24.0 documentation]</ref> |
# In one-versus-rest, one binary Gaussian process classifier is fitted for each class, which is trained to separate this class from the rest.<ref name="ref_bcf82937" /> | # In one-versus-rest, one binary Gaussian process classifier is fitted for each class, which is trained to separate this class from the rest.<ref name="ref_bcf82937" /> | ||
# In “one_vs_one”, one binary Gaussian process classifier is fitted for each pair of classes, which is trained to separate these two classes.<ref name="ref_bcf82937" /> | # In “one_vs_one”, one binary Gaussian process classifier is fitted for each pair of classes, which is trained to separate these two classes.<ref name="ref_bcf82937" /> | ||
12번째 줄: | 12번째 줄: | ||
# Thus, if a Gaussian process is assumed to have mean zero, defining the covariance function completely defines the process' behaviour.<ref name="ref_d5a7291a" /> | # Thus, if a Gaussian process is assumed to have mean zero, defining the covariance function completely defines the process' behaviour.<ref name="ref_d5a7291a" /> | ||
# The effect of choosing different kernels on the prior function distribution of the Gaussian process.<ref name="ref_d5a7291a" /> | # The effect of choosing different kernels on the prior function distribution of the Gaussian process.<ref name="ref_d5a7291a" /> | ||
− | # A Wiener process (aka Brownian motion) is the integral of a white noise generalized Gaussian process .<ref name="ref_d5a7291a" /> | + | # A Wiener process (aka Brownian motion) is the integral of a white noise generalized Gaussian process.<ref name="ref_d5a7291a" /> |
# The covariance matrix Σ \Sigma Σ is determined by its covariance function k k k, which is often also called the kernel of the Gaussian process.<ref name="ref_fce2ec29">[https://distill.pub/2019/visual-exploration-gaussian-processes A Visual Exploration of Gaussian Processes]</ref> | # The covariance matrix Σ \Sigma Σ is determined by its covariance function k k k, which is often also called the kernel of the Gaussian process.<ref name="ref_fce2ec29">[https://distill.pub/2019/visual-exploration-gaussian-processes A Visual Exploration of Gaussian Processes]</ref> | ||
# Making a prediction using a Gaussian process ultimately boils down to drawing samples from this distribution.<ref name="ref_fce2ec29" /> | # Making a prediction using a Gaussian process ultimately boils down to drawing samples from this distribution.<ref name="ref_fce2ec29" /> | ||
# Clicking on the graph results in continuous samples drawn from a Gaussian process using the selected kernel.<ref name="ref_fce2ec29" /> | # Clicking on the graph results in continuous samples drawn from a Gaussian process using the selected kernel.<ref name="ref_fce2ec29" /> | ||
# Using the checkboxes, different kernels can be combined to form a new Gaussian process.<ref name="ref_fce2ec29" /> | # Using the checkboxes, different kernels can be combined to form a new Gaussian process.<ref name="ref_fce2ec29" /> | ||
+ | # This tutorial introduces the reader to Gaussian process regression as an expressive tool to model, actively explore and exploit unknown functions.<ref name="ref_e8ec2d45">[https://www.sciencedirect.com/science/article/pii/S0022249617302158 A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions]</ref> | ||
+ | # Gaussian process regression is a powerful, non-parametric Bayesian approach towards regression problems that can be utilized in exploration and exploitation scenarios.<ref name="ref_e8ec2d45" /> | ||
# In this article, we introduce Gaussian process dynamic programming (GPDP), an approximate value function-based RL algorithm.<ref name="ref_7e3598a4">[https://www.sciencedirect.com/science/article/pii/S0925231209000162 Gaussian process dynamic programming]</ref> | # In this article, we introduce Gaussian process dynamic programming (GPDP), an approximate value function-based RL algorithm.<ref name="ref_7e3598a4">[https://www.sciencedirect.com/science/article/pii/S0925231209000162 Gaussian process dynamic programming]</ref> | ||
# In this colab, we explore Gaussian process regression using TensorFlow and TensorFlow Probability.<ref name="ref_0fea4471">[https://www.tensorflow.org/probability/examples/Gaussian_Process_Regression_In_TFP Gaussian Process Regression in TensorFlow Probability]</ref> | # In this colab, we explore Gaussian process regression using TensorFlow and TensorFlow Probability.<ref name="ref_0fea4471">[https://www.tensorflow.org/probability/examples/Gaussian_Process_Regression_In_TFP Gaussian Process Regression in TensorFlow Probability]</ref> | ||
# Note that, according to the above definition, any finite-dimensional multivariate Gaussian distribution is also a Gaussian process.<ref name="ref_0fea4471" /> | # Note that, according to the above definition, any finite-dimensional multivariate Gaussian distribution is also a Gaussian process.<ref name="ref_0fea4471" /> | ||
− | # | + | # The defining feature of a Gaussian process is that the joint distribution of the function’s value at a finite number of input points is a multivariate normal distribution.<ref name="ref_fd4ad4d7">[https://mc-stan.org/docs/2_25/stan-users-guide/gaussian-processes-chapter.html 10 Gaussian Processes]</ref> |
+ | # Unlike a simple multivariate normal distribution, which is parameterized by a mean vector and covariance matrix, a Gaussian process is parameterized by a mean function and covariance function.<ref name="ref_fd4ad4d7" /> | ||
# A Gaussian Process places a prior over functions, and can be described as an infinite dimensional generalisation of a multivariate Normal distribution.<ref name="ref_b13d0a17">[https://github.com/STOR-i/GaussianProcesses.jl STOR-i/GaussianProcesses.jl: A Julia package for Gaussian Processes]</ref> | # A Gaussian Process places a prior over functions, and can be described as an infinite dimensional generalisation of a multivariate Normal distribution.<ref name="ref_b13d0a17">[https://github.com/STOR-i/GaussianProcesses.jl STOR-i/GaussianProcesses.jl: A Julia package for Gaussian Processes]</ref> | ||
# The package allows the user to fit exact Gaussian process models when the observations are Gaussian distributed about the latent function.<ref name="ref_b13d0a17" /> | # The package allows the user to fit exact Gaussian process models when the observations are Gaussian distributed about the latent function.<ref name="ref_b13d0a17" /> | ||
− | # | + | # Use the Gaussian Process platform to model the relationship between a continuous response and one or more predictors.<ref name="ref_8c144ef4">[https://www.jmp.com/support/help/en/15.2/jmp/gaussian-process.shtml Gaussian Process]</ref> |
− | # | + | # The Gaussian Process platform fits a spatial correlation model to the data.<ref name="ref_8c144ef4" /> |
+ | # 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 # evaluate a gaussian process classifier model on the dataset from numpy import mean from numpy import std from sklearn .<ref name="ref_b72e1c6d">[https://machinelearningmastery.com/gaussian-processes-for-classification-with-python/ Gaussian Processes for Classification With Python]</ref> | ||
+ | # 2 3 4 5 6 7 8 9 10 11 12 13 14 15 # make a prediction with a gaussian process classifier model on the dataset from sklearn .<ref name="ref_b72e1c6d" /> | ||
# In so doing, this Gaussian process joint modeling (GPJM) framework can be viewed as a temporal and nonparametric extension of the covariance-based linking function (8, 14).<ref name="ref_6b386c04">[https://www.pnas.org/content/117/47/29398 Gaussian process linking functions for mind, brain, and behavior]</ref> | # In so doing, this Gaussian process joint modeling (GPJM) framework can be viewed as a temporal and nonparametric extension of the covariance-based linking function (8, 14).<ref name="ref_6b386c04">[https://www.pnas.org/content/117/47/29398 Gaussian process linking functions for mind, brain, and behavior]</ref> | ||
# On the contrary, Gaussian process regression directly models the function f as a sample from a distribution over functions.<ref name="ref_6b386c04" /> | # On the contrary, Gaussian process regression directly models the function f as a sample from a distribution over functions.<ref name="ref_6b386c04" /> | ||
# In the Gaussian process framework, one way to enforce temporally lagged, convolved neural signals is to convolve the GP kernel with a secondary function—an HRF in our case.<ref name="ref_6b386c04" /> | # In the Gaussian process framework, one way to enforce temporally lagged, convolved neural signals is to convolve the GP kernel with a secondary function—an HRF in our case.<ref name="ref_6b386c04" /> | ||
# This approach is justified from the fact that convolution of a Gaussian process with another function is another Gaussian process (34⇓–36).<ref name="ref_6b386c04" /> | # This approach is justified from the fact that convolution of a Gaussian process with another function is another Gaussian process (34⇓–36).<ref name="ref_6b386c04" /> | ||
− | |||
− | |||
# The prior samples are taken from a Gaussian process without any data and the posterior samples are taken from a Gaussian process where the data are shown as black squares.<ref name="ref_bbc8f812">[https://pythonhosted.org/infpy/gps.html What are Gaussian processes? — infpy 0.4.13 documentation]</ref> | # The prior samples are taken from a Gaussian process without any data and the posterior samples are taken from a Gaussian process where the data are shown as black squares.<ref name="ref_bbc8f812">[https://pythonhosted.org/infpy/gps.html What are Gaussian processes? — infpy 0.4.13 documentation]</ref> | ||
+ | # A Gaussian process generalizes the multivariate normal to infinite dimension.<ref name="ref_76ac5a8a">[https://blog.dominodatalab.com/fitting-gaussian-process-models-python/ Fitting Gaussian Process Models in Python]</ref> | ||
+ | # So, we can describe a Gaussian process as a distribution over functions.<ref name="ref_76ac5a8a" /> | ||
+ | # It may seem odd to simply adopt the zero function to represent the mean function of the Gaussian process — surely we can do better than that!<ref name="ref_76ac5a8a" /> | ||
+ | # All we will do here is a sample from the prior Gaussian process, so before any data have been introduced.<ref name="ref_76ac5a8a" /> | ||
+ | # Gaussian process regression takes into account all possible functions that fit to the training data vector and gives a predictive distribution around a single prediction for a given input vector.<ref name="ref_a2151a53">[https://www.frontiersin.org/articles/10.3389/fbuil.2017.00052/full Automatic Kernel Selection for Gaussian Processes Regression with Approximate Bayesian Computation and Sequential Monte Carlo]</ref> | ||
+ | # The initial and basic step in order to apply Gaussian process regression is to obtain a mean and covariance function.<ref name="ref_a2151a53" /> | ||
+ | # Gaussian process model for control of an existing building, 6th International Building Physics Conference, IBPC 2015.<ref name="ref_a2151a53" /> | ||
+ | # A Gaussian process-based dynamic surrogate model for complex engineering structural reliability analysis.<ref name="ref_a2151a53" /> | ||
+ | # For example, if a random process is modeled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly.<ref name="ref_25aac3e6">[https://docs.rapidminer.com/latest/studio/operators/modeling/predictive/functions/gaussian_process.html RapidMiner Documentation]</ref> | ||
+ | # Formally, a Gaussian process generates data located throughout some domain such that any finite subset of the range follows a multivariate Gaussian distribution.<ref name="ref_25aac3e6" /> | ||
+ | # Gaussian Process is a powerful non-parametric machine learning technique for constructing comprehensive probabilistic models of real world problems.<ref name="ref_25aac3e6" /> | ||
# Gaussian process regression (GPR) models are nonparametric kernel-based probabilistic models.<ref name="ref_8e29a90e">[https://www.mathworks.com/help/stats/gaussian-process-regression-models.html Gaussian Process Regression Models]</ref> | # Gaussian process regression (GPR) models are nonparametric kernel-based probabilistic models.<ref name="ref_8e29a90e">[https://www.mathworks.com/help/stats/gaussian-process-regression-models.html Gaussian Process Regression Models]</ref> | ||
− | |||
− | |||
− | |||
− | |||
# I'm working my way through Rasmussen and Williams' classical work Gaussian Process for Machine Learning, and attempting to implement a lot of their theory in Python.<ref name="ref_d1106002">[https://stats.stackexchange.com/questions/351820/gaussian-process-instability-with-more-datapoints Gaussian Process instability with more datapoints]</ref> | # I'm working my way through Rasmussen and Williams' classical work Gaussian Process for Machine Learning, and attempting to implement a lot of their theory in Python.<ref name="ref_d1106002">[https://stats.stackexchange.com/questions/351820/gaussian-process-instability-with-more-datapoints Gaussian Process instability with more datapoints]</ref> | ||
# Some call it kriging, which is a term that comes from geostatistics (Matheron 1963); some call it Gaussian spatial modeling or a Gaussian stochastic process.<ref name="ref_b98383e7">[https://bookdown.org/rbg/surrogates/chap5.html Chapter 5 Gaussian Process Regression]</ref> | # Some call it kriging, which is a term that comes from geostatistics (Matheron 1963); some call it Gaussian spatial modeling or a Gaussian stochastic process.<ref name="ref_b98383e7">[https://bookdown.org/rbg/surrogates/chap5.html Chapter 5 Gaussian Process Regression]</ref> | ||
# Gaussian process is a generic term that pops up, taking on disparate but quite specific meanings, in various statistical and probabilistic modeling enterprises.<ref name="ref_b98383e7" /> | # Gaussian process is a generic term that pops up, taking on disparate but quite specific meanings, in various statistical and probabilistic modeling enterprises.<ref name="ref_b98383e7" /> | ||
− | # | + | # The used machine learning technique, namely, Gaussian process regression, is briefly described in Section 2.<ref name="ref_95582627">[https://www.hindawi.com/journals/ace/2019/9185756/ Gaussian Process-Based Response Surface Method for Slope Reliability Analysis]</ref> |
− | # Gaussian process | + | # Using the Bayesian rule, the posterior distribution for the Gaussian process outputs can be obtained.<ref name="ref_95582627" /> |
===소스=== | ===소스=== | ||
<references /> | <references /> | ||
+ | |||
+ | ==메타데이터== | ||
+ | ===위키데이터=== | ||
+ | * ID : [https://www.wikidata.org/wiki/Q1496376 Q1496376] | ||
+ | ===Spacy 패턴 목록=== | ||
+ | * [{'LOWER': 'gaussian'}, {'LEMMA': 'process'}] | ||
+ | * [{'LOWER': 'gaussian'}, {'LOWER': 'stochastic'}, {'LEMMA': 'process'}] |
Notes
Wikidata
- ID: Q1496376
Corpus
- Below is a collection of papers relevant to learning in Gaussian process models.[1]
- Since Gaussian process classification scales cubically with the size of the dataset, this might be considerably faster.[2]
- In one-versus-rest, one binary Gaussian process classifier is fitted for each class, which is trained to separate this class from the rest.[2]
- In “one_vs_one”, one binary Gaussian process classifier is fitted for each pair of classes, which is trained to separate these two classes.[2]
- A Gaussian process defines a prior over functions.[3]
- For example, if a random process is modelled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly.[4]
- Thus, if a Gaussian process is assumed to have mean zero, defining the covariance function completely defines the process' behaviour.[4]
- The effect of choosing different kernels on the prior function distribution of the Gaussian process.[4]
- A Wiener process (aka Brownian motion) is the integral of a white noise generalized Gaussian process.[4]
- The covariance matrix Σ is determined by its covariance function k, which is often also called the kernel of the Gaussian process.[5]
- Making a prediction using a Gaussian process ultimately boils down to drawing samples from this distribution.[5]
- Clicking on the graph results in continuous samples drawn from a Gaussian process using the selected kernel.[5]
- Using the checkboxes, different kernels can be combined to form a new Gaussian process.[5]
- This tutorial introduces the reader to Gaussian process regression as an expressive tool to model, actively explore and exploit unknown functions.[6]
- Gaussian process regression is a powerful, non-parametric Bayesian approach towards regression problems that can be utilized in exploration and exploitation scenarios.[6]
- In this article, we introduce Gaussian process dynamic programming (GPDP), an approximate value function-based RL algorithm.[7]
- In this colab, we explore Gaussian process regression using TensorFlow and TensorFlow Probability.[8]
- Note that, according to the above definition, any finite-dimensional multivariate Gaussian distribution is also a Gaussian process.[8]
- The defining feature of a Gaussian process is that the joint distribution of the function’s value at a finite number of input points is a multivariate normal distribution.[9]
- Unlike a simple multivariate normal distribution, which is parameterized by a mean vector and covariance matrix, a Gaussian process is parameterized by a mean function and covariance function.[9]
- A Gaussian Process places a prior over functions, and can be described as an infinite dimensional generalisation of a multivariate Normal distribution.[10]
- The package allows the user to fit exact Gaussian process models when the observations are Gaussian distributed about the latent function.[10]
- Use the Gaussian Process platform to model the relationship between a continuous response and one or more predictors.[11]
- The Gaussian Process platform fits a spatial correlation model to the data.[11]
- A code listing that evaluates a gaussian process classifier model on the dataset, importing mean and std from numpy and the classifier from sklearn (see the reconstructed sketch after this list).[12]
- A second listing from the same source that makes a prediction with a gaussian process classifier model on the dataset using sklearn (also reconstructed after this list).[12]
- In so doing, this Gaussian process joint modeling (GPJM) framework can be viewed as a temporal and nonparametric extension of the covariance-based linking function (8, 14).[13]
- On the contrary, Gaussian process regression directly models the function f as a sample from a distribution over functions.[13]
- In the Gaussian process framework, one way to enforce temporally lagged, convolved neural signals is to convolve the GP kernel with a secondary function—an HRF in our case.[13]
- This approach is justified by the fact that convolution of a Gaussian process with another function is another Gaussian process (34–36).[13]
- The prior samples are taken from a Gaussian process without any data and the posterior samples are taken from a Gaussian process where the data are shown as black squares.[14]
- A Gaussian process generalizes the multivariate normal to infinite dimension.[15]
- So, we can describe a Gaussian process as a distribution over functions.[15]
- It may seem odd to simply adopt the zero function to represent the mean function of the Gaussian process — surely we can do better than that![15]
- All we will do here is sample from the prior Gaussian process, before any data have been introduced.[15]
- Gaussian process regression takes into account all possible functions that fit to the training data vector and gives a predictive distribution around a single prediction for a given input vector.[16]
- The initial and basic step in order to apply Gaussian process regression is to obtain a mean and covariance function.[16]
- Gaussian process model for control of an existing building, 6th International Building Physics Conference, IBPC 2015.[16]
- A Gaussian process-based dynamic surrogate model for complex engineering structural reliability analysis.[16]
- For example, if a random process is modeled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly.[17]
- Formally, a Gaussian process generates data located throughout some domain such that any finite subset of the range follows a multivariate Gaussian distribution.[17]
- Gaussian Process is a powerful non-parametric machine learning technique for constructing comprehensive probabilistic models of real world problems.[17]
- Gaussian process regression (GPR) models are nonparametric kernel-based probabilistic models.[18]
- I'm working my way through Rasmussen and Williams' classical work Gaussian Process for Machine Learning, and attempting to implement a lot of their theory in Python.[19]
- Some call it kriging, which is a term that comes from geostatistics (Matheron 1963); some call it Gaussian spatial modeling or a Gaussian stochastic process.[20]
- Gaussian process is a generic term that pops up, taking on disparate but quite specific meanings, in various statistical and probabilistic modeling enterprises.[20]
- The machine learning technique used, namely Gaussian process regression, is briefly described in Section 2.[21]
- Using Bayes' rule, the posterior distribution for the Gaussian process outputs can be obtained.[21]
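Several of the items above describe a Gaussian process in terms of its mean and covariance functions and talk about drawing sample functions from the prior. As a concrete illustration, the following minimal numpy sketch (not code from any of the cited sources; the squared-exponential kernel and its unit hyperparameters are arbitrary choices) draws three functions from a zero-mean Gaussian process prior. Restricted to finitely many input points, the prior is just a multivariate normal with covariance matrix built from the kernel.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    # Squared-exponential covariance: k(x, x') = variance * exp(-(x - x')^2 / (2 * length_scale^2)).
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / length_scale**2)

x = np.linspace(-5, 5, 100)          # finite grid of input points
K = rbf_kernel(x, x)                 # prior covariance matrix at those points
jitter = 1e-8 * np.eye(len(x))       # tiny diagonal term for numerical stability

# Prior "functions" are draws from N(0, K), evaluated on the grid.
rng = np.random.default_rng(0)
prior_samples = rng.multivariate_normal(np.zeros(len(x)), K + jitter, size=3)
```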
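The items contrasting prior and posterior samples, and those describing the predictive distribution of Gaussian process regression, correspond to conditioning such a prior on observed data. Continuing the sketch above (it reuses rbf_kernel and the grid x; the training points, sine targets, and noise level are invented for illustration), the posterior mean is K_s^T (K + sigma^2 I)^(-1) y and the posterior covariance is K_ss - K_s^T (K + sigma^2 I)^(-1) K_s:

```python
# Hypothetical noisy observations of an unknown function (here, sin).
X_train = np.array([-4.0, -2.0, 0.0, 1.5, 3.0])
y_train = np.sin(X_train)
sigma = 0.1                                    # assumed observation-noise standard deviation

K = rbf_kernel(X_train, X_train) + sigma**2 * np.eye(len(X_train))
K_s = rbf_kernel(X_train, x)                   # cross-covariance, shape (5, 100)
K_ss = rbf_kernel(x, x)                        # test covariance, shape (100, 100)

# Posterior (predictive) distribution over function values at the grid points x.
mu_post = K_s.T @ np.linalg.solve(K, y_train)
cov_post = K_ss - K_s.T @ np.linalg.solve(K, K_s)

rng = np.random.default_rng(1)
posterior_samples = rng.multivariate_normal(mu_post, cov_post + 1e-8 * np.eye(len(x)), size=3)
```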
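The two code listings quoted from the classification tutorial above survive only as fragments. A plausible reconstruction, using the standard scikit-learn API and substituting a synthetic make_classification dataset for whatever data the original used (the cross-validation settings here are common defaults, not necessarily those of the source), looks like this:

```python
# Evaluate a Gaussian process classifier with repeated cross-validation,
# then refit it on all data and make a prediction for a single row.
from numpy import mean, std
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

X, y = make_classification(n_samples=100, n_features=20, random_state=1)
model = GaussianProcessClassifier()

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)
scores = cross_val_score(model, X, y, scoring="accuracy", cv=cv, n_jobs=-1)
print("Accuracy: %.3f (%.3f)" % (mean(scores), std(scores)))

model.fit(X, y)
print(model.predict(X[:1]))   # predicted class for one (in-sample) row
```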
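Finally, two recurring points in the corpus, that kernels can be combined to form a new Gaussian process, and that Gaussian process regression returns a predictive distribution rather than a point estimate, can both be shown with scikit-learn. This is a generic sketch with invented data; the RBF plus WhiteKernel sum is one arbitrary choice of combined kernel.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = np.linspace(-4, 4, 30).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)

# The sum of two kernels is itself a valid kernel; WhiteKernel absorbs observation noise.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)

# Each prediction comes with a standard deviation: a predictive distribution per input.
mean_pred, std_pred = gpr.predict(X, return_std=True)
```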
Sources
1. The Gaussian Processes Web Site (http://www.gaussianprocess.org/)
2. 1.7. Gaussian Processes — scikit-learn 0.24.0 documentation (http://scikit-learn.org/stable/modules/gaussian_process.html)
3. Martin Krasser's Blog
4. Gaussian process
5. A Visual Exploration of Gaussian Processes (https://distill.pub/2019/visual-exploration-gaussian-processes)
6. A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions (https://www.sciencedirect.com/science/article/pii/S0022249617302158)
7. Gaussian process dynamic programming (https://www.sciencedirect.com/science/article/pii/S0925231209000162)
8. Gaussian Process Regression in TensorFlow Probability (https://www.tensorflow.org/probability/examples/Gaussian_Process_Regression_In_TFP)
9. 10 Gaussian Processes (https://mc-stan.org/docs/2_25/stan-users-guide/gaussian-processes-chapter.html)
10. STOR-i/GaussianProcesses.jl: A Julia package for Gaussian Processes (https://github.com/STOR-i/GaussianProcesses.jl)
11. Gaussian Process (https://www.jmp.com/support/help/en/15.2/jmp/gaussian-process.shtml)
12. Gaussian Processes for Classification With Python (https://machinelearningmastery.com/gaussian-processes-for-classification-with-python/)
13. Gaussian process linking functions for mind, brain, and behavior (https://www.pnas.org/content/117/47/29398)
14. What are Gaussian processes? — infpy 0.4.13 documentation (https://pythonhosted.org/infpy/gps.html)
15. Fitting Gaussian Process Models in Python (https://blog.dominodatalab.com/fitting-gaussian-process-models-python/)
16. Automatic Kernel Selection for Gaussian Processes Regression with Approximate Bayesian Computation and Sequential Monte Carlo (https://www.frontiersin.org/articles/10.3389/fbuil.2017.00052/full)
17. RapidMiner Documentation (https://docs.rapidminer.com/latest/studio/operators/modeling/predictive/functions/gaussian_process.html)
18. Gaussian Process Regression Models (https://www.mathworks.com/help/stats/gaussian-process-regression-models.html)
19. Gaussian Process instability with more datapoints (https://stats.stackexchange.com/questions/351820/gaussian-process-instability-with-more-datapoints)
20. Chapter 5 Gaussian Process Regression (https://bookdown.org/rbg/surrogates/chap5.html)
21. Gaussian Process-Based Response Surface Method for Slope Reliability Analysis (https://www.hindawi.com/journals/ace/2019/9185756/)
Metadata
Wikidata
- ID: Q1496376
Spacy pattern list
- [{'LOWER': 'gaussian'}, {'LEMMA': 'process'}]
- [{'LOWER': 'gaussian'}, {'LOWER': 'stochastic'}, {'LEMMA': 'process'}]