Random projection
Latest revision as of 01:56, 17 February 2021

Notes

  • After trying PCA, robust PCA, ICA, and removing highly correlated features, I was thinking of using random projection.[1]
  • However, it seems that random projection is not directly supported for dimension reduction.[1]
  • Is it possible to implement random projection easily in R?[1]
  • Or to take advantage of existing tools to do dimension reduction with random projection in R?[1]
  • In this paper, we propose an architecture that incorporates random projection into NEC to train with more stability.[2]
  • Recently, Random Projection (RP) has emerged as a powerful method for dimensionality reduction.[3]
  • Random projection allows one to substantially reduce dimensionality of data while still retaining a significant degree of problem structure.[4]
  • Random projections give you a kind of probabilistic guarantee against that situation.[5]
  • In this scenario of continuously learning from large streams of data, I believe random projections are a sensible and efficient approach.[5]
  • Random projection is a powerful method to construct Lipschitz mappings to realize dimensionality reduction with a high probability.[6]
  • Random projection does not introduce a significant distortion when the dimension and cardinality of data both are large.[6]
  • In Section 7.3, the justification of the validity of random projection is presented in detail.[6]
  • The applications of random projection are given in Section 7.4.[6]
  • However, using random projections is computationally significantly less expensive than using, e.g., principal component analysis.[7]
  • We also show experimentally that using a sparse random matrix gives additional computational savings in random projection.[7]
  • Initially, high-dimensional data are projected into a lower-dimensional Euclidean space using random projections.[8]
  • The pairwise distances between data points are preserved through random projections.[8]
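The quotes above describe one mechanism: multiply the data by a random matrix to reduce dimensionality while approximately preserving pairwise distances (the Johnson–Lindenstrauss property). A minimal NumPy sketch of a Gaussian random projection; the sizes n, d, k and the random seed are illustrative assumptions, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): 100 points in 10,000 dimensions,
# projected down to k = 1,000 dimensions.
n, d, k = 100, 10_000, 1_000
X = rng.standard_normal((n, d))

# Gaussian random projection: entries ~ N(0, 1/k), so squared Euclidean
# distances are preserved in expectation.
R = rng.standard_normal((d, k)) / np.sqrt(k)
X_proj = X @ R  # shape (n, k)

def pairwise_distances(A):
    """Euclidean distances over all unordered point pairs."""
    i, j = np.triu_indices(len(A), k=1)
    return np.linalg.norm(A[i] - A[j], axis=1)

# Ratios of projected to original distances concentrate around 1.
ratios = pairwise_distances(X_proj) / pairwise_distances(X)
print(ratios.min(), ratios.max())
```

The sparse random matrices mentioned in [7] give the same guarantee with cheaper matrix multiplication; scikit-learn provides both variants as sklearn.random_projection.GaussianRandomProjection and SparseRandomProjection.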

Sources

Metadata

Wikidata

  • ID : Q18393452 (https://www.wikidata.org/wiki/Q18393452)

Spacy pattern list

  • [{'LOWER': 'random'}, {'LEMMA': 'projection'}]