Gaussian process
==Metadata==

===Wikidata===

* ID :  [https://www.wikidata.org/wiki/Q1496376 Q1496376]

===Spacy pattern list===

* [{'LOWER': 'gaussian'}, {'LEMMA': 'process'}]
* [{'LOWER': 'gaussian'}, {'LOWER': 'stochastic'}, {'LEMMA': 'process'}]

Latest revision as of 00:15, 17 February 2021 (Wednesday)

Notes

Wikidata

Corpus

  1. Below is a collection of papers relevant to learning in Gaussian process models.[1]
  2. Since Gaussian process classification scales cubically with the size of the dataset, this might be considerably faster.[2]
  3. In one-versus-rest, one binary Gaussian process classifier is fitted for each class, which is trained to separate this class from the rest.[2]
  4. In “one_vs_one”, one binary Gaussian process classifier is fitted for each pair of classes, which is trained to separate these two classes.[2]
  5. A Gaussian process defines a prior over functions.[3]
  6. For example, if a random process is modelled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly.[4]
  7. Thus, if a Gaussian process is assumed to have mean zero, defining the covariance function completely defines the process' behaviour.[4]
  8. The effect of choosing different kernels on the prior function distribution of the Gaussian process.[4]
  9. A Wiener process (aka Brownian motion) is the integral of a white noise generalized Gaussian process.[4]
  10. The covariance matrix Σ is determined by its covariance function k, which is often also called the kernel of the Gaussian process.[5]
  11. Making a prediction using a Gaussian process ultimately boils down to drawing samples from this distribution.[5]
  12. Clicking on the graph results in continuous samples drawn from a Gaussian process using the selected kernel.[5]
  13. Using the checkboxes, different kernels can be combined to form a new Gaussian process.[5]
  14. This tutorial introduces the reader to Gaussian process regression as an expressive tool to model, actively explore and exploit unknown functions.[6]
  15. Gaussian process regression is a powerful, non-parametric Bayesian approach towards regression problems that can be utilized in exploration and exploitation scenarios.[6]
  16. In this article, we introduce Gaussian process dynamic programming (GPDP), an approximate value function-based RL algorithm.[7]
  17. In this colab, we explore Gaussian process regression using TensorFlow and TensorFlow Probability.[8]
  18. Note that, according to the above definition, any finite-dimensional multivariate Gaussian distribution is also a Gaussian process.[8]
  19. The defining feature of a Gaussian process is that the joint distribution of the function’s value at a finite number of input points is a multivariate normal distribution.[9]
  20. Unlike a simple multivariate normal distribution, which is parameterized by a mean vector and covariance matrix, a Gaussian process is parameterized by a mean function and covariance function.[9]
  21. A Gaussian Process places a prior over functions, and can be described as an infinite dimensional generalisation of a multivariate Normal distribution.[10]
  22. The package allows the user to fit exact Gaussian process models when the observations are Gaussian distributed about the latent function.[10]
  23. Use the Gaussian Process platform to model the relationship between a continuous response and one or more predictors.[11]
  24. The Gaussian Process platform fits a spatial correlation model to the data.[11]
  25. # evaluate a gaussian process classifier model on the dataset — from numpy import mean; from numpy import std; from sklearn …[12]
  26. # make a prediction with a gaussian process classifier model on the dataset — from sklearn …[12]
  27. In so doing, this Gaussian process joint modeling (GPJM) framework can be viewed as a temporal and nonparametric extension of the covariance-based linking function (8, 14).[13]
  28. On the contrary, Gaussian process regression directly models the function f as a sample from a distribution over functions.[13]
  29. In the Gaussian process framework, one way to enforce temporally lagged, convolved neural signals is to convolve the GP kernel with a secondary function—an HRF in our case.[13]
  30. This approach is justified from the fact that convolution of a Gaussian process with another function is another Gaussian process (34–36).[13]
  31. The prior samples are taken from a Gaussian process without any data and the posterior samples are taken from a Gaussian process where the data are shown as black squares.[14]
  32. A Gaussian process generalizes the multivariate normal to infinite dimension.[15]
  33. So, we can describe a Gaussian process as a distribution over functions.[15]
  34. It may seem odd to simply adopt the zero function to represent the mean function of the Gaussian process — surely we can do better than that![15]
  35. All we will do here is sample from the prior Gaussian process, that is, before any data have been introduced.[15]
  36. Gaussian process regression takes into account all possible functions that fit to the training data vector and gives a predictive distribution around a single prediction for a given input vector.[16]
  37. The initial and basic step in order to apply Gaussian process regression is to obtain a mean and covariance function.[16]
  38. Gaussian process model for control of an existing building, 6th International Building Physics Conference, IBPC 2015.[16]
  39. A Gaussian process-based dynamic surrogate model for complex engineering structural reliability analysis.[16]
  40. For example, if a random process is modeled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly.[17]
  41. Formally, a Gaussian process generates data located throughout some domain such that any finite subset of the range follows a multivariate Gaussian distribution.[17]
  42. Gaussian Process is a powerful non-parametric machine learning technique for constructing comprehensive probabilistic models of real world problems.[17]
  43. Gaussian process regression (GPR) models are nonparametric kernel-based probabilistic models.[18]
  44. I'm working my way through Rasmussen and Williams' classical work Gaussian Process for Machine Learning, and attempting to implement a lot of their theory in Python.[19]
  45. Some call it kriging, which is a term that comes from geostatistics (Matheron 1963); some call it Gaussian spatial modeling or a Gaussian stochastic process.[20]
  46. Gaussian process is a generic term that pops up, taking on disparate but quite specific meanings, in various statistical and probabilistic modeling enterprises.[20]
  47. The used machine learning technique, namely, Gaussian process regression, is briefly described in Section 2.[21]
  48. Using the Bayesian rule, the posterior distribution for the Gaussian process outputs can be obtained.[21]
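Sentences 5, 7, 10, and 21 above all make the same point: a zero-mean Gaussian process is completely specified by its covariance function, and any finite set of inputs yields a multivariate normal. A minimal NumPy sketch of drawing prior samples under an RBF kernel (the kernel choice and parameter values here are illustrative, not taken from the cited sources):

```python
import numpy as np

def rbf_kernel(xa, xb, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance k(x, x') = v * exp(-|x - x'|^2 / (2 l^2))."""
    sq_dists = (xa[:, None] - xb[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 50)

# A zero mean function plus the kernel fully specify the prior (sentence 7).
mean = np.zeros(len(x))
cov = rbf_kernel(x, x)

# Every draw from this multivariate normal is one function sampled from the
# prior; a tiny jitter keeps the covariance numerically positive definite.
samples = rng.multivariate_normal(mean, cov + 1e-9 * np.eye(len(x)), size=3)
```

Each row of `samples` is one smooth random function evaluated on the 50-point grid; swapping in a different kernel changes the character of the sampled functions, as sentence 8 describes.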
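Sentences 11, 19–20, and 36–37 describe prediction: conditioning the joint Gaussian on training data gives a closed-form posterior mean and covariance. A sketch of that computation with a small noise term for numerical stability (the training inputs and targets are made up for illustration):

```python
import numpy as np

def rbf_kernel(xa, xb, length_scale=1.0):
    return np.exp(-0.5 * (xa[:, None] - xb[None, :]) ** 2 / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Condition the joint Gaussian on training data (zero prior mean assumed)."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)              # train/test cross-covariance
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)            # K^{-1} y
    mu = K_s.T @ alpha                             # posterior mean
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)   # posterior covariance
    return mu, cov

x_train = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])
y_train = np.sin(x_train)
x_test = np.linspace(-5.0, 5.0, 11)

mu, cov = gp_posterior(x_train, y_train, x_test)
```

At the training inputs the posterior mean nearly interpolates the targets and the posterior variance collapses toward the noise level; between them the variance grows, which is the "predictive distribution around a single prediction" of sentence 36.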
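Sentences 2–4 refer to scikit-learn's GaussianProcessClassifier and its multi_class option. A short usage sketch on the iris dataset (the kernel is an illustrative default, not one prescribed by the quoted documentation):

```python
from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

X, y = load_iris(return_X_y=True)

# "one_vs_rest": one binary GP classifier per class, trained to separate that
# class from the rest (sentence 3); "one_vs_one" would fit one per class pair.
clf = GaussianProcessClassifier(kernel=1.0 * RBF(1.0), multi_class="one_vs_rest")
clf.fit(X, y)

proba = clf.predict_proba(X[:3])   # one probability per class, rows sum to 1
```

As sentence 2 notes, fitting scales cubically in the number of training points, so this exact approach is only practical for modest dataset sizes.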
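Sentence 9 states that a Wiener process (Brownian motion) is the integral of white noise. Discretizing that integral as a cumulative sum of independent N(0, dt) increments gives a quick numerical check that Var[W(t)] = t (step size and path count are arbitrary choices for this sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps, n_paths = 1e-3, 1000, 2000

# White-noise increments: independent N(0, dt) steps covering t in [0, 1].
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))

# Integrating (cumulative sum) the white noise yields Brownian paths.
paths = np.cumsum(increments, axis=1)
endpoints = paths[:, -1]   # W(1) for each simulated path
```

The sample variance of the endpoints should be close to 1, matching Var[W(1)] = 1 for standard Brownian motion.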

Sources
