Importance sampling

수학노트
Revision as of 01:42, 21 December 2020 (Mon) by Pythagoras0 (talk | contribs) (→Notes: new section)

Notes

Wikidata

Corpus

  1. What importance sampling does, effectively, is replace the indicator functions in the above expression with their expectation.[1]
  2. The point of all this is to show that the importance sampling estimator of the mean can be seen as a “smoothed out” version of the rejection sampling estimator.[1]
  3. We introduce structured importance sampling, a new technique for efficiently rendering scenes illuminated by distant natural illumination given in an environment map.[2]
  4. In this tutorial we examine another sampling technique, importance sampling.[3]
  5. Importance sampling is useful when the area we are interested in may lie in a region that has a small probability of occurrence.[3]
  6. The example that follows is the minimal working example of importance sampling.[4]
  7. We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution.[5]
  8. Importance sampling is one approach that can lead to a reduction in the number of model evaluations.[6]
  9. Importance sampling uses a biasing distribution to sample the model more efficiently, but generating such a biasing distribution can be difficult and usually also requires model evaluations.[6]
  10. We introduce a multifidelity importance sampling (MFIS) method, which combines evaluations of both the high-fidelity and a surrogate model.[6]
  11. This method is based on the law of total expectation and variance in subintervals, and it combines the conditional Monte Carlo method and the importance sampling method.[7]
  12. The proposed method has a higher rate of convergence compared to the importance sampling method.[7]
  13. Computation of the theoretical moments arising in importance sampling is discussed and some numerical examples are given.[8]
  14. Recent results have empirically demonstrated that multiple policy optimization steps can be performed with the same batch by using off-distribution techniques based on importance sampling.[9]
  15. However, when dealing with off-distribution optimization, it is essential to take into account the uncertainty introduced by the importance sampling process.[9]
  16. The approach extends (Metelli et al., 2018) by incorporating two advanced variance reduction techniques: per-decision and multiple importance sampling.[9]
  17. It happens that importance sampling and quasi-Monte Carlo are two such solutions.[10]
  18. Importance sampling and quasi-Monte Carlo deserve a lesson of their own (which you will find later in this section).[10]
  19. (this is often possible in shading, as we will see in the lesson on importance sampling).[10]
  20. Before we look at a simple example (practical examples applied to rendering will be given in the lesson on importance sampling), let's explain where the term importance sampling comes from.[10]
  21. We test a family of one parameter trial wavefunctions for variational Monte Carlo in stereographically projected manifolds which can be used to produce importance sampling.[11]
  22. We find that diffusion Monte Carlo with importance sampling in manifolds is orders of magnitude more efficient compared to unguided diffusion Monte Carlo.[11]
  23. Additionally, diffusion Monte Carlo with importance sampling in manifolds can overcome problems with nonconfining potentials and can suppress quasiergodicity effectively.[11]
  24. Similar importance sampling methods can be used for other material models, such as the Lafortune BRDF (Lafortune et al. 1997) or Ward's anisotropic BRDF (Walter 2005).[12]
  25. As shown in Figure 20-5a, deterministic importance sampling causes sharp aliasing artifacts that look like duplicate specular reflections.[12]
  26. Thus, much like the wavelet-based approaches, the filtered importance sampling does not require many samples to produce accurate, glossy surface reflection.[12]
  27. Our GPU-based importance sampling is a real-time rendering algorithm for various parameterized material models illuminated by an environment map.[12]
  28. The standard estimator used in conjunction with importance sampling in Monte Carlo integration is unbiased but inefficient.[13]
  29. This paper seeks to identify computationally efficient importance sampling (IS) algorithms for estimating large deviation probabilities for the loss on a portfolio of loans.[14]
  30. Importance sampling is often used as a Monte Carlo integrator.[15]
  31. Importance sampling is a variance reduction technique that can be used in the Monte Carlo method.[15]
  32. The idea behind importance sampling is that certain values of the input random variables in a simulation have more impact on the parameter being estimated than others.[15]
  33. Hence, the basic methodology in importance sampling is to choose a distribution which "encourages" the important values.[15]
  34. Importance sampling is an approximation method rather than a sampling method.[16]
  35. Importance sampling (IS) is a method for estimating expectations.[17]
  36. In the importance sampling approach to simulation, we simulate a modified system in which the chance of failure has been artificially boosted and then correct for that boost.[18]
  37. The importance sampling approach is based on the following reasoning.[18]
  38. However, let us use it as a vehicle to explain how the principles of importance sampling could be used here.[18]
  39. In the importance sampling approach, we change the density function so that larger values of A and B are more likely.[18]

Sources