Gaussian process

수학노트
 
== Notes ==

===Wikidata===

* ID : [https://www.wikidata.org/wiki/Q1496376 Q1496376]

===Corpus===

# modeling of massive datasets called globally approximate Gaussian process (GAGP).<ref name="ref_8631d1df">[https://asmedigitalcollection.asme.org/mechanicaldesign/article/141/11/111402/955350/Globally-Approximate-Gaussian-Processes-for-Big Globally Approximate Gaussian Processes for Big Data With Application to Data-Driven Metamaterials Design]</ref>
# Gaussian process (GP) models (also known as Kriging) have many attractive features that underpin their widespread use in engineering design.<ref name="ref_8631d1df" />
# Below is a collection of papers relevant to learning in Gaussian process models.<ref name="ref_dab8abe7">[http://www.gaussianprocess.org/ The Gaussian Processes Web Site]</ref>
# Since Gaussian process classification scales cubically with the size of the dataset, this might be considerably faster.<ref name="ref_bcf82937">[http://scikit-learn.org/stable/modules/gaussian_process.html 1.7. Gaussian Processes — scikit-learn 0.24.0 documentation]</ref>
# In one-versus-rest, one binary Gaussian process classifier is fitted for each class, which is trained to separate this class from the rest.<ref name="ref_bcf82937" />
# In “one_vs_one”, one binary Gaussian process classifier is fitted for each pair of classes, which is trained to separate these two classes.<ref name="ref_bcf82937" />
# A Gaussian process defines a prior over functions.<ref name="ref_762cbc2e">[http://krasserm.github.io/2018/03/19/gaussian-processes/ Martin Krasser's Blog]</ref>
# For example, if a random process is modelled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly.<ref name="ref_d5a7291a">[https://en.wikipedia.org/wiki/Gaussian_process Gaussian process]</ref>
# Thus, if a Gaussian process is assumed to have mean zero, defining the covariance function completely defines the process' behaviour.<ref name="ref_d5a7291a" />
# The effect of choosing different kernels on the prior function distribution of the Gaussian process.<ref name="ref_d5a7291a" />
# A Wiener process (aka Brownian motion) is the integral of a white noise generalized Gaussian process.<ref name="ref_d5a7291a" />
# The covariance matrix <math>\Sigma</math> is determined by its covariance function <math>k</math>, which is often also called the kernel of the Gaussian process.<ref name="ref_fce2ec29">[https://distill.pub/2019/visual-exploration-gaussian-processes A Visual Exploration of Gaussian Processes]</ref>
# Making a prediction using a Gaussian process ultimately boils down to drawing samples from this distribution.<ref name="ref_fce2ec29" />
# Clicking on the graph results in continuous samples drawn from a Gaussian process using the selected kernel.<ref name="ref_fce2ec29" />
# Using the checkboxes, different kernels can be combined to form a new Gaussian process.<ref name="ref_fce2ec29" />
# This tutorial introduces the reader to Gaussian process regression as an expressive tool to model, actively explore and exploit unknown functions.<ref name="ref_e8ec2d45">[https://www.sciencedirect.com/science/article/pii/S0022249617302158 A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions]</ref>
# Gaussian process regression is a powerful, non-parametric Bayesian approach towards regression problems that can be utilized in exploration and exploitation scenarios.<ref name="ref_e8ec2d45" />
# In this article, we introduce Gaussian process dynamic programming (GPDP), an approximate value function-based RL algorithm.<ref name="ref_7e3598a4">[https://www.sciencedirect.com/science/article/pii/S0925231209000162 Gaussian process dynamic programming]</ref>
# In this colab, we explore Gaussian process regression using TensorFlow and TensorFlow Probability.<ref name="ref_0fea4471">[https://www.tensorflow.org/probability/examples/Gaussian_Process_Regression_In_TFP Gaussian Process Regression in TensorFlow Probability]</ref>
# Note that, according to the above definition, any finite-dimensional multivariate Gaussian distribution is also a Gaussian process.<ref name="ref_0fea4471" />
# A Gaussian Process places a prior over functions, and can be described as an infinite dimensional generalisation of a multivariate Normal distribution.<ref name="ref_b13d0a17">[https://github.com/STOR-i/GaussianProcesses.jl STOR-i/GaussianProcesses.jl: A Julia package for Gaussian Processes]</ref>
# The package allows the user to fit exact Gaussian process models when the observations are Gaussian distributed about the latent function.<ref name="ref_b13d0a17" />
# The defining feature of a Gaussian process is that the joint distribution of the function’s value at a finite number of input points is a multivariate normal distribution.<ref name="ref_fd4ad4d7">[https://mc-stan.org/docs/2_25/stan-users-guide/gaussian-processes-chapter.html 10 Gaussian Processes]</ref>
# Unlike a simple multivariate normal distribution, which is parameterized by a mean vector and covariance matrix, a Gaussian process is parameterized by a mean function and covariance function.<ref name="ref_fd4ad4d7" />
# Use the Gaussian Process platform to model the relationship between a continuous response and one or more predictors.<ref name="ref_8c144ef4">[https://www.jmp.com/support/help/en/15.2/jmp/gaussian-process.shtml Gaussian Process]</ref>
# The Gaussian Process platform fits a spatial correlation model to the data.<ref name="ref_8c144ef4" />
# A worked example evaluates a Gaussian process classifier model on the dataset, importing mean and std from numpy and the classifier from sklearn.<ref name="ref_b72e1c6d">[https://machinelearningmastery.com/gaussian-processes-for-classification-with-python/ Gaussian Processes for Classification With Python]</ref>
# A second example makes a prediction with a Gaussian process classifier model on the dataset using sklearn.<ref name="ref_b72e1c6d" />
# In so doing, this Gaussian process joint modeling (GPJM) framework can be viewed as a temporal and nonparametric extension of the covariance-based linking function (8, 14).<ref name="ref_6b386c04">[https://www.pnas.org/content/117/47/29398 Gaussian process linking functions for mind, brain, and behavior]</ref>
# On the contrary, Gaussian process regression directly models the function f as a sample from a distribution over functions.<ref name="ref_6b386c04" />
# In the Gaussian process framework, one way to enforce temporally lagged, convolved neural signals is to convolve the GP kernel with a secondary function—an HRF in our case.<ref name="ref_6b386c04" />
# This approach is justified from the fact that convolution of a Gaussian process with another function is another Gaussian process (34–36).<ref name="ref_6b386c04" />
# The prior samples are taken from a Gaussian process without any data and the posterior samples are taken from a Gaussian process where the data are shown as black squares.<ref name="ref_bbc8f812">[https://pythonhosted.org/infpy/gps.html What are Gaussian processes? — infpy 0.4.13 documentation]</ref>
# Some call it kriging, which is a term that comes from geostatistics (Matheron 1963); some call it Gaussian spatial modeling or a Gaussian stochastic process.<ref name="ref_b98383e7">[https://bookdown.org/rbg/surrogates/chap5.html Chapter 5 Gaussian Process Regression]</ref>
# Gaussian process is a generic term that pops up, taking on disparate but quite specific meanings, in various statistical and probabilistic modeling enterprises.<ref name="ref_b98383e7" />
# This paper proposes the application of bagging to obtain more robust and accurate predictions using Gaussian process regression models.[8]
# Gaussian process regression (GPR) models are nonparametric kernel-based probabilistic models.[14]
# This paper develops an efficient computational method for solving a Gaussian process (GP) regression for large spatial data sets using a collection of suitably defined local GP regressions.[15]
# Gaussian process models are routinely used to solve hard machine learning problems.[16]
# A Gaussian process (GP) can be used as a prior probability distribution whose support is over the space of continuous functions.[17]
# PyMC3 is a great environment for working with fully Bayesian Gaussian Process models.[17]
# I'm working my way through Rasmussen and Williams' classical work Gaussian Process for Machine Learning, and attempting to implement a lot of their theory in Python.[18]
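Several recurring claims in the corpus above — a Gaussian process places a prior over functions, the covariance function <math>k</math> determines the covariance matrix <math>\Sigma</math>, and prediction boils down to conditioning a multivariate normal — can be made concrete with a short sketch. The following NumPy implementation of zero-mean GP regression with a squared-exponential kernel follows the standard Cholesky-based formulation; the function names, data, and hyperparameter values here are illustrative assumptions, not code from any of the cited sources.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    # Squared-exponential kernel: k(x, x') = sigma^2 * exp(-(x - x')^2 / (2 l^2))
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / length_scale ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-4):
    # Posterior mean and covariance of a zero-mean GP given noisy observations
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                 # K = L L^T, avoids explicit inverse
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = K_s.T @ alpha                        # posterior mean at X_test
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                      # posterior covariance at X_test
    return mu, cov

rng = np.random.default_rng(0)
X_test = np.linspace(0, 5, 50)

# Sample functions drawn from the GP prior (mean zero, RBF covariance)
prior_cov = rbf_kernel(X_test, X_test) + 1e-9 * np.eye(len(X_test))
prior_samples = rng.multivariate_normal(np.zeros(len(X_test)), prior_cov, size=3)

# Condition on noisy observations of sin(x) to get the posterior
X_train = np.linspace(0, 5, 8)
y_train = np.sin(X_train) + 0.01 * rng.standard_normal(len(X_train))
mu, cov = gp_posterior(X_train, y_train, X_test)
```

Because the process has mean zero, the covariance function alone determines both the prior samples and the posterior, matching the remark above that defining the covariance function completely defines the process' behaviour; with the noise term near zero the posterior mean passes through the observed data.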
 
===Sources===

<references />

Revision as of 02:47, 23 December 2020