"Random projection"의 두 판 사이의 차이
둘러보기로 가기
검색하러 가기
Pythagoras0 (토론 | 기여) (→노트: 새 문단) |
Pythagoras0 (토론 | 기여) |
||
(같은 사용자의 중간 판 하나는 보이지 않습니다) | |||
20번째 줄: | 20번째 줄: | ||
===소스=== | ===소스=== | ||
<references /> | <references /> | ||
+ | |||
+ | ==메타데이터== | ||
+ | ===위키데이터=== | ||
+ | * ID : [https://www.wikidata.org/wiki/Q18393452 Q18393452] | ||
+ | ===Spacy 패턴 목록=== | ||
+ | * [{'LOWER': 'random'}, {'LEMMA': 'projection'}] |
Latest revision as of 01:56, 17 February 2021 (Wed)
Notes
- After trying PCA, robust PCA, ICA, removing highly correlated features, I was thinking to use Random Projection.[1]
- But, it seems that they don't support random projection directly for dimension reduction.[1]
- Is that possible to implement random projection easily in R?[1]
- Or, taking advantage of existing tools to do dimension reduction with Random Projection in R?[1]
- In this paper, we propose an architecture that incorporates random projection into NEC to train with more stability.[2]
- Recently, Random Projection (RP) has emerged as a powerful method for dimensionality reduction.[3]
- Random projection allows one to substantially reduce dimensionality of data while still retaining a significant degree of problem structure.[4]
- Random projections give you a kind of probabilistic guarantee against that situation.[5]
- In this scenario of continuously learning from large streams of data, I believe random projections are a sensible and efficient approach.[5]
- Random projection is a powerful method to construct Lipschitz mappings to realize dimensionality reduction with a high probability.[6]
- Random projection does not introduce a significant distortion when the dimension and cardinality of data both are large.[6]
- In Section 7.3, the justification of the validity of random projection is presented in detail.[6]
- The applications of random projection are given in Section 7.4.[6]
- However, using random projections is computationally significantly less expensive than using, e.g., principal component analysis.[7]
- We also show experimentally that using a sparse random matrix gives additional computational savings in random projection.[7]
- Initially, high-dimensional data are projected into a lower-dimensional Euclidean space using random projections.[8]
- The pairwise distances between data points are preserved through random projections.[8]
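The idea running through these notes — multiply the data by a random matrix and pairwise distances are approximately preserved — can be sketched in a few lines. The following is an illustrative pure-Python sketch (function names are my own, not from any cited source; in practice one would use a library such as scikit-learn's random projection transformers). Entries are drawn i.i.d. Gaussian with variance 1/k, so squared norms are preserved in expectation:

```python
import math
import random


def random_projection(data, k, seed=0):
    """Project the d-dimensional rows of `data` down to k dimensions
    using a dense Gaussian random matrix with entries ~ N(0, 1/k)."""
    rng = random.Random(seed)
    d = len(data[0])
    # Scaling by 1/sqrt(k) makes E[||Rx||^2] = ||x||^2.
    R = [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(k)]
         for _ in range(d)]
    return [[sum(x[i] * R[i][j] for i in range(d)) for j in range(k)]
            for x in data]


def dist(u, v):
    """Euclidean distance between two points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
```

With, say, 300-dimensional points projected to k = 100 dimensions, the ratio of projected to original pairwise distance concentrates around 1, which is the Johnson–Lindenstrauss phenomenon the notes above refer to.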
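The computational saving from a sparse random matrix noted above can be illustrated with an Achlioptas-style construction, in which roughly two thirds of the entries are exactly zero, so most multiply-adds can be skipped. Again a hedged pure-Python sketch with hypothetical names, not the cited authors' implementation:

```python
import math
import random


def sparse_random_matrix(d, k, seed=0):
    """Achlioptas-style sparse projection matrix: entries are
    +sqrt(3/k), 0, -sqrt(3/k) with probabilities 1/6, 2/3, 1/6,
    giving zero mean and variance 1/k per entry."""
    rng = random.Random(seed)
    scale = math.sqrt(3.0 / k)
    rows = []
    for _ in range(d):
        row = []
        for _ in range(k):
            u = rng.random()
            row.append(scale if u < 1 / 6 else -scale if u > 5 / 6 else 0.0)
        rows.append(row)
    return rows


def sparse_project(data, R):
    """Multiply each row of `data` by R, skipping the zero entries.
    (Written over dense lists for clarity; a real implementation
    would store only the nonzero positions.)"""
    d, k = len(R), len(R[0])
    return [[sum(x[i] * R[i][j] for i in range(d) if R[i][j] != 0.0)
             for j in range(k)] for x in data]
```

Because only about one third of the matrix entries are nonzero, the projection costs roughly a third of the dense multiply while giving the same distance-preservation guarantee in expectation.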
Sources
- [1] Any R implementation for dimension reduction using random projection?
- [2] Random Projection in Neural Episodic Control
- [3] Face recognition experiments with random projection
- [4] Convergence rates of learning algorithms by random projection
- [5] PCA vs. random projection
- [6] Random Projection
- [7] Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining
- [8] Random Projection Estimation of Discrete-Choice Models with Large Choice Sets
Metadata
Wikidata
- ID : [https://www.wikidata.org/wiki/Q18393452 Q18393452]
Spacy pattern list
- [{'LOWER': 'random'}, {'LEMMA': 'projection'}]