Isomap
Latest revision as of 01:55, 17 February 2021
Notes
- The most time-consuming step in Isomap is to compute the shortest paths between all pairs of data points based on a neighbourhood graph.[1]
- The classical Isomap (C-Isomap) is very slow, due to the use of Floyd’s algorithm to compute the shortest paths.[1]
- The purpose of this paper is to speed up Isomap.[1]
- Next, the proposed approach is applied to two image data sets and achieves improved performance over standard Isomap.[2]
- S-ISOMAP is a manifold learning algorithm, which is a supervised variant of ISOMAP.[3]
- Experimental results show that 3N-Isomap is a simpler and more practical algorithm than Isomap.[4]
- A semi-supervised local multi-manifold Isomap (SSMM-Isomap) by linear embedding is proposed.[5]
- We mainly evaluate SSMM-Isomap for manifold feature learning, data clustering and classification.[5]
- It is becoming more and more difficult for MDS and ISOMAP to solve the full eigenvector problem with the increasing sample size.[6]
- The basic idea: In MDS and ISOMAP, solving the full eigenvector problem leads to higher complexity.[6]
- If MDS and ISOMAP solved a sparse eigenvector problem instead, their execution would speed up greatly.[6]
- For ISOMAP, we computed the time of solving the full N×N eigenvector problem.[6]
- The result of ISOMAP, shown below, indicates successful extraction of the underlying manifold in the original 3D space.[7]
- Isomap can be viewed as an extension of Multi-dimensional Scaling (MDS) or Kernel PCA.[8]
- Isomap seeks a lower-dimensional embedding which maintains geodesic distances between all points.[8]
- The algorithm can be selected by the user with the path_method keyword of Isomap.[8]
- The eigensolver can be specified by the user with the eigen_solver keyword of Isomap.[8]
- As an immediate workaround, you can use eigen_solver='dense' when using Isomap.[9]
- No, there is currently no way to set a seed for the KernelPCA used by Isomap.[9]
- L-Isomap reduces both the time and space complexities of Isomap significantly.[10]
- So far, we have presented EL-Isomap by comparing it with L-Isomap (and Isomap) from various viewpoints.[10]
- The D embeddings of the data set were calculated by LLE, Laplacian eigenmap, Isomap, L-Isomap, and EL-Isomap.[10]
- The neighborhood sizes of LLE, Laplacian eigenmap, Isomap, L-Isomap, and EL-Isomap are set as , , , , and , respectively.[10]
- Isomap is distinguished by its use of the geodesic distance induced by a neighborhood graph embedded in the classical scaling.[11]
- In a similar manner, the geodesic distance matrix in Isomap can be viewed as a kernel matrix.[11]
- Isomap can be performed with the object Isomap.[12]
- The eigensolver can be specified by the user with the eigen_solver keyword of Isomap.[12]
- Isomap is viewed as a variant of metric multidimensional scaling (MDS) to model nonlinear data using its geodesic distance.[13]
- No such type of method has been used in recent years from the Isomap perspective.[13]
- The second section provides an overview and background work, followed by a brief discussion of manifold learning and Isomap.[13]
- Isomap stands for isometric mapping.[14]
- Isomap starts by creating a neighborhood network.[14]
- Isomap uses the above principle to create a similarity matrix for eigenvalue decomposition.[14]
- Classical MDS uses Euclidean distances as the similarity metric, while Isomap uses geodesic distances.[14]
- We will use the neighborhood graph to implement ISOMAP.[15]
- kNN (k-Nearest Neighbors), the most common choice for ISOMAP, connects each data point to its k nearest neighbors.[15]
- It uses ISOMAP to embed 698 face pictures of 4096 pixels each into 2D space, aligning them with light direction and up-down pose.[15]
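Several of the notes above refer to the scikit-learn Isomap estimator and its path_method and eigen_solver keywords. A minimal usage sketch (the swiss-roll data set is chosen here purely for illustration):

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Sample points from the classic "swiss roll" manifold embedded in 3D.
X, _ = make_swiss_roll(n_samples=500, random_state=0)

# path_method selects the shortest-path algorithm ('auto', 'FW' for
# Floyd-Warshall, or 'D' for Dijkstra); eigen_solver selects the
# eigendecomposition routine ('auto', 'arpack', or 'dense').
embedding = Isomap(n_neighbors=10, n_components=2,
                   path_method="auto", eigen_solver="dense")
X_2d = embedding.fit_transform(X)

print(X_2d.shape)  # (500, 2)
```

As note [9] observes, eigen_solver='dense' avoids the randomized 'arpack' initialization, making repeated runs deterministic.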
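The most expensive step described above — all-pairs shortest paths on the neighborhood graph — can be sketched directly with SciPy. This is an illustrative decomposition of the pipeline, not the exact procedure of any of the cited papers:

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # 100 points in 3D

# Step 1: build the kNN neighborhood graph (edge weights = Euclidean distances).
G = kneighbors_graph(X, n_neighbors=5, mode="distance")

# Step 2: approximate geodesic distances as shortest paths in the graph.
# method='FW' is Floyd-Warshall (the slow classical choice, O(N^3));
# method='D' is Dijkstra, usually faster on sparse graphs.
# Disconnected components, if any, yield inf entries.
D_geo = shortest_path(G, method="D", directed=False)

print(D_geo.shape)  # (100, 100)
```

Swapping method='FW' for method='D' here mirrors the speedup argument of note [1]: Floyd's algorithm ignores the sparsity of the kNN graph.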
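Notes [14] describe the final step: an eigenvalue decomposition of a similarity matrix built from geodesic distances, i.e. classical MDS applied to the geodesic rather than Euclidean distance matrix. A minimal NumPy sketch, assuming a precomputed distance matrix D:

```python
import numpy as np

def classical_mds(D, n_components=2):
    """Embed points from a (geodesic) distance matrix via classical MDS."""
    n = D.shape[0]
    # Double-center the squared distance matrix: B = -1/2 * J D^2 J.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    # Eigendecompose and keep the top n_components directions.
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0))

# Toy check: distances between 4 points on a line are recovered up to
# sign and translation.
pts = np.array([[0.0], [1.0], [2.0], [4.0]])
D = np.abs(pts - pts.T)
Y = classical_mds(D, n_components=1)
```

Solving only for the top few eigenvectors (a sparse eigenvector problem, as note [6] suggests) is what eigen_solver='arpack' does in scikit-learn.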
Sources
- ↑ 1.0 1.1 1.2 An improved Isomap method for manifold learning — https://www.emerald.com/insight/content/doi/10.1108/IJICC-03-2016-0014/full/html
- ↑ Adaptive graph construction for Isomap manifold learning — https://www.spiedigitallibrary.org/conference-proceedings-of-spie/9399/939904/Adaptive-graph-construction-for-Isomap-manifold-learning/10.1117/12.2082646.full
- ↑ S-Isomap — https://www.lamda.nju.edu.cn/code_S-Isomap.ashx
- ↑ Natural Nearest Neighbor for Isomap Algorithm without Free-Parameter — https://www.scientific.net/AMR.219-220.994
- ↑ 5.0 5.1 Semi-supervised local multi-manifold Isomap by linear embedding for feature extraction — https://dl.acm.org/citation.cfm?id=3178636
- ↑ 6.0 6.1 6.2 6.3 A Fast Manifold Learning Algorithm — https://scialert.net/fulltextmobile/?doi=itj.2012.380.383
- ↑ Machine Learning — https://sites.google.com/site/machinelearningbybobon/home/dr/isomap
- ↑ 8.0 8.1 8.2 8.3 Semantic portal — learn smart! — http://semantic-portal.net/base-machine-learning-scikit-learn-unsupervised-manifold-learning
- ↑ 9.0 9.1 Python: scikit-learn isomap results seem random, but no possibility to set random_state — https://stackoverflow.com/questions/44852406/python-scikit-learn-isomap-results-seem-random-but-no-possibility-to-set-rando
- ↑ 10.0 10.1 10.2 10.3 Enhancing Both Efficiency and Representational Capability of Isomap by Extensive Landmark Selection — https://www.hindawi.com/journals/mpe/2015/241436/
- ↑ 11.0 11.1 Wikipedia — https://en.wikipedia.org/wiki/Isomap
- ↑ 12.0 12.1 2.2. Manifold learning — scikit-learn 0.23.2 documentation — http://scikit-learn.org/stable/modules/manifold.html
- ↑ 13.0 13.1 13.2 An Extended Isomap Approach for Nonlinear Dimension Reduction — https://link.springer.com/article/10.1007/s42979-020-00179-y
- ↑ 14.0 14.1 14.2 14.3 Tutorial: Dimension Reduction
- ↑ 15.0 15.1 15.2 VC: ISOMAP, Manifolds Learning
Metadata
Wikidata
- ID : Q6086067
Spacy pattern list
- [{'LEMMA': 'Isomap'}]