<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.mathnt.net/index.php?action=history&amp;feed=atom&amp;title=Feature_selection</id>
	<title>Feature selection - edit history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.mathnt.net/index.php?action=history&amp;feed=atom&amp;title=Feature_selection"/>
	<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=Feature_selection&amp;action=history"/>
	<updated>2026-04-04T17:34:57Z</updated>
	<subtitle>Edit history for this page</subtitle>
	<generator>MediaWiki 1.35.0</generator>
	<entry>
		<id>https://wiki.mathnt.net/index.php?title=Feature_selection&amp;diff=51308&amp;oldid=prev</id>
		<title>Edit by Pythagoras0 at 08:14, 17 February 2021</title>
		<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=Feature_selection&amp;diff=51308&amp;oldid=prev"/>
		<updated>2021-02-17T08:14:40Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left diff-editfont-monospace&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 08:14, 17 February 2021&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l90&quot; &gt;Line 90:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 90:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt;−&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Metadata ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==Metadata==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt;−&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Wikidata===&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Wikidata===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* ID :  [https://www.wikidata.org/wiki/Q446488 Q446488]&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* ID :  [https://www.wikidata.org/wiki/Q446488 Q446488]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===Spacy pattern list===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;* [{&amp;#039;LOWER&amp;#039;: &amp;#039;feature&amp;#039;}, {&amp;#039;LEMMA&amp;#039;: &amp;#039;selection&amp;#039;}]&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Pythagoras0</name></author>
	</entry>
	<entry>
		<id>https://wiki.mathnt.net/index.php?title=Feature_selection&amp;diff=47105&amp;oldid=prev</id>
		<title>Pythagoras0: /* Metadata */ new section</title>
		<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=Feature_selection&amp;diff=47105&amp;oldid=prev"/>
		<updated>2020-12-26T12:21:09Z</updated>

		<summary type="html">&lt;p&gt;&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;메타데이터: &lt;/span&gt; 새 문단&lt;/span&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left diff-editfont-monospace&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 12:21, 26 December 2020&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l89&quot; &gt;Line 89:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 89:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Sources===&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Sources===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;== Metadata ==&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===Wikidata===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;* ID :  [https://www.wikidata.org/wiki/Q446488 Q446488]&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Pythagoras0</name></author>
	</entry>
	<entry>
		<id>https://wiki.mathnt.net/index.php?title=Feature_selection&amp;diff=46213&amp;oldid=prev</id>
		<title>Pythagoras0: /* Notes */ new section</title>
		<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=Feature_selection&amp;diff=46213&amp;oldid=prev"/>
		<updated>2020-12-21T09:53:23Z</updated>

		<summary type="html">&lt;p&gt;&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;노트: &lt;/span&gt; 새 문단&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;새 문서&lt;/b&gt;&lt;/p&gt;&lt;div&gt;== 노트 ==&lt;br /&gt;
&lt;br /&gt;
===Wikidata===&lt;br /&gt;
* ID :  [https://www.wikidata.org/wiki/Q446488 Q446488]&lt;br /&gt;
===Corpus===&lt;br /&gt;
# In this article, I will focus on one of the 2 critical parts of getting your models right – feature selection.&amp;lt;ref name=&amp;quot;ref_6a9877c4&amp;quot;&amp;gt;[https://www.analyticsvidhya.com/blog/2016/12/introduction-to-feature-selection-methods-with-an-example-or-how-to-select-the-right-variables/ Feature Selection Methods]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# This is just an example of how feature selection makes a difference.&amp;lt;ref name=&amp;quot;ref_6a9877c4&amp;quot; /&amp;gt;&lt;br /&gt;
# I believe that this article has given you a good idea of how you can perform feature selection to get the best out of your models.&amp;lt;ref name=&amp;quot;ref_6a9877c4&amp;quot; /&amp;gt;&lt;br /&gt;
# These are the broad categories that are commonly used for feature selection.&amp;lt;ref name=&amp;quot;ref_6a9877c4&amp;quot; /&amp;gt;&lt;br /&gt;
# Plenty of feature selection methods are available in literature due to the availability of data with hundreds of variables leading to data with very high dimension.&amp;lt;ref name=&amp;quot;ref_98813e87&amp;quot;&amp;gt;[https://www.sciencedirect.com/science/article/pii/S0045790613003066 A survey on feature selection methods ☆]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# We also apply some of the feature selection techniques on standard datasets to demonstrate the applicability of feature selection techniques.&amp;lt;ref name=&amp;quot;ref_98813e87&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature Selection is the process of selecting out the most significant features from a given dataset.&amp;lt;ref name=&amp;quot;ref_6c622de6&amp;quot;&amp;gt;[https://www.datacamp.com/community/tutorials/feature-selection-python (Tutorial) Feature Selection in Python]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# You got an informal introduction to Feature Selection and its importance in the world of Data Science and Machine Learning.&amp;lt;ref name=&amp;quot;ref_6c622de6&amp;quot; /&amp;gt;&lt;br /&gt;
# The importance of feature selection can best be recognized when you are dealing with a dataset that contains a vast number of features.&amp;lt;ref name=&amp;quot;ref_6c622de6&amp;quot; /&amp;gt;&lt;br /&gt;
# Sometimes, feature selection is mistaken for dimensionality reduction.&amp;lt;ref name=&amp;quot;ref_6c622de6&amp;quot; /&amp;gt;&lt;br /&gt;
# The logic behind using correlation for feature selection is that the good variables are highly correlated with the target.&amp;lt;ref name=&amp;quot;ref_617b1053&amp;quot;&amp;gt;[https://www.analyticsvidhya.com/blog/2020/10/feature-selection-techniques-in-machine-learning/ Feature Selection Techniques in Machine Learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# The feature selection process is based on a specific machine learning algorithm that we are trying to fit on a given dataset.&amp;lt;ref name=&amp;quot;ref_617b1053&amp;quot; /&amp;gt;&lt;br /&gt;
# This method works in exactly the opposite way to the Forward Feature Selection method.&amp;lt;ref name=&amp;quot;ref_617b1053&amp;quot; /&amp;gt;&lt;br /&gt;
# This is the most robust feature selection method covered so far.&amp;lt;ref name=&amp;quot;ref_617b1053&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature Selection is one of the core concepts in machine learning which hugely impacts the performance of your model.&amp;lt;ref name=&amp;quot;ref_4fbb8770&amp;quot;&amp;gt;[https://towardsdatascience.com/feature-selection-techniques-in-machine-learning-with-python-f24e7da3f36e Feature Selection Techniques in Machine Learning with Python]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In one of related works, a filter-based method has been introduced for use in online stream feature selection applications.&amp;lt;ref name=&amp;quot;ref_cf300b40&amp;quot;&amp;gt;[https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-019-2754-0 FeatureSelect: a software for feature selection based on machine learning approaches]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# This method has acceptable stability and scalability, and can also be used in offline feature selection applications.&amp;lt;ref name=&amp;quot;ref_cf300b40&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature selection for linear data types has also been studied, in a work that provides a framework and selects features with maximum relevance and minimum redundancy.&amp;lt;ref name=&amp;quot;ref_cf300b40&amp;quot; /&amp;gt;&lt;br /&gt;
# In a separate study, a feature selection method was proposed in which both unbalanced and balanced data can be classified, based on a genetic algorithm.&amp;lt;ref name=&amp;quot;ref_cf300b40&amp;quot; /&amp;gt;&lt;br /&gt;
# An important distinction to be made in feature selection is that of supervised and unsupervised methods.&amp;lt;ref name=&amp;quot;ref_188a1277&amp;quot;&amp;gt;[https://machinelearningmastery.com/feature-selection-with-real-and-categorical-data/ How to Choose a Feature Selection Method For Machine Learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Unsupervised feature selection techniques ignore the target variable, such as methods that remove redundant variables using correlation.&amp;lt;ref name=&amp;quot;ref_188a1277&amp;quot; /&amp;gt;&lt;br /&gt;
# Wrapper feature selection methods create many models with different subsets of input features and select those features that result in the best performing model according to a performance metric.&amp;lt;ref name=&amp;quot;ref_188a1277&amp;quot; /&amp;gt;&lt;br /&gt;
# Finally, there are some machine learning algorithms that perform feature selection automatically as part of learning the model.&amp;lt;ref name=&amp;quot;ref_188a1277&amp;quot; /&amp;gt;&lt;br /&gt;
# This post is about some of the most common feature selection techniques one can use while working with data.&amp;lt;ref name=&amp;quot;ref_9e03d499&amp;quot;&amp;gt;[https://towardsdatascience.com/the-5-feature-selection-algorithms-every-data-scientist-need-to-know-3a6b566efd2 The 5 Feature Selection Algorithms every Data Scientist should know]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# VarianceThreshold is a simple baseline approach to feature selection: it removes features with low variance.&amp;lt;ref name=&amp;quot;ref_4f7b127d&amp;quot;&amp;gt;[http://scikit-learn.org/stable/modules/feature_selection.html 1.13. Feature selection — scikit-learn 0.23.2 documentation]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Univariate feature selection works by selecting the best features based on univariate statistical tests.&amp;lt;ref name=&amp;quot;ref_4f7b127d&amp;quot; /&amp;gt;&lt;br /&gt;
# GenericUnivariateSelect allows performing univariate feature selection with a configurable strategy.&amp;lt;ref name=&amp;quot;ref_4f7b127d&amp;quot; /&amp;gt;&lt;br /&gt;
# SelectFromModel is a meta-transformer that can be used along with any estimator that has a coef_ or feature_importances_ attribute after fitting; a combined sketch follows this list.&amp;lt;ref name=&amp;quot;ref_4f7b127d&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature extraction creates new features from functions of the original features, whereas feature selection returns a subset of the features.&amp;lt;ref name=&amp;quot;ref_3f179fe9&amp;quot;&amp;gt;[https://en.wikipedia.org/wiki/Feature_selection Feature selection]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Feature selection techniques are often used in domains where there are many features and comparatively few samples (or data points).&amp;lt;ref name=&amp;quot;ref_3f179fe9&amp;quot; /&amp;gt;&lt;br /&gt;
# A feature selection algorithm can be seen as the combination of a search technique for proposing new feature subsets, along with an evaluation measure which scores the different feature subsets.&amp;lt;ref name=&amp;quot;ref_3f179fe9&amp;quot; /&amp;gt;&lt;br /&gt;
# Embedded methods are a catch-all group of techniques which perform feature selection as part of the model construction process.&amp;lt;ref name=&amp;quot;ref_3f179fe9&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature selection is the process by which a subset of relevant features, or variables, are selected from a larger data set for constructing models.&amp;lt;ref name=&amp;quot;ref_27829565&amp;quot;&amp;gt;[https://deepai.org/machine-learning-glossary-and-terms/feature-selection Feature Selection]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Variable selection, attribute selection or variable subset selection are all other names used for feature selection.&amp;lt;ref name=&amp;quot;ref_27829565&amp;quot; /&amp;gt;&lt;br /&gt;
# The main focus of feature selection is to choose features that represent the data set well by excluding redundant and irrelevant data.&amp;lt;ref name=&amp;quot;ref_27829565&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature selection is useful because it simplifies the learning models making interpretation of the model and the results easier for the user.&amp;lt;ref name=&amp;quot;ref_27829565&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature selection is the study of algorithms for reducing dimensionality of data to improve machine learning performance.&amp;lt;ref name=&amp;quot;ref_2bf3f818&amp;quot;&amp;gt;[https://link.springer.com/10.1007/978-0-387-30164-8_306 Feature Selection]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Feature selection is commonly used in applications where original features need to be retained.&amp;lt;ref name=&amp;quot;ref_2bf3f818&amp;quot; /&amp;gt;&lt;br /&gt;
# These models are thought to have built-in feature selection: ada, AdaBag, AdaBoost.&amp;lt;ref name=&amp;quot;ref_5198045d&amp;quot;&amp;gt;[https://topepo.github.io/caret/feature-selection-overview.html 18 Feature Selection Overview]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In many cases, using these models with built-in feature selection will be more efficient than algorithms where the search routine for the right predictors is external to the model.&amp;lt;ref name=&amp;quot;ref_5198045d&amp;quot; /&amp;gt;&lt;br /&gt;
# Apart from models with built-in feature selection, most approaches for reducing the number of predictors can be placed into two main categories.&amp;lt;ref name=&amp;quot;ref_5198045d&amp;quot; /&amp;gt;&lt;br /&gt;
# The crucial role played by the feature selection step has led many researchers to innovate and find different approaches to address this issue.&amp;lt;ref name=&amp;quot;ref_25a7bbbe&amp;quot;&amp;gt;[https://journalofbigdata.springeropen.com/articles/10.1186/s40537-019-0241-0 Feature selection methods and genomic big data: a systematic review]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# The initial feature selection type is the filter methods, in which the algorithm selecting relevant and non-redundant features in the data set is actually independent of the classifier used.&amp;lt;ref name=&amp;quot;ref_25a7bbbe&amp;quot; /&amp;gt;&lt;br /&gt;
# Many bioinformatics researchers have shown interest in this particular type of feature selection methods due to the simplicity of its implementation, its low computational cost and its speed.&amp;lt;ref name=&amp;quot;ref_25a7bbbe&amp;quot; /&amp;gt;&lt;br /&gt;
# Then, using real data they show evidence that their wrapper feature selection leads to higher predictive accuracy than mRMR.&amp;lt;ref name=&amp;quot;ref_25a7bbbe&amp;quot; /&amp;gt;&lt;br /&gt;
# We can view feature selection as a method for replacing a complex classifier (using all features) with a simpler one (using a subset of the features).&amp;lt;ref name=&amp;quot;ref_93b202b6&amp;quot;&amp;gt;[https://nlp.stanford.edu/IR-book/html/htmledition/feature-selection-1.html Feature selection]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# The basic feature selection algorithm is shown in Figure 13.6 .&amp;lt;ref name=&amp;quot;ref_93b202b6&amp;quot; /&amp;gt;&lt;br /&gt;
# This section mainly addresses feature selection for two-class classification tasks like China versus not-China.&amp;lt;ref name=&amp;quot;ref_93b202b6&amp;quot; /&amp;gt;&lt;br /&gt;
# Often feature selection based on a filter method is part of the data preprocessing and in a subsequent step a learning method is applied to the filtered data.&amp;lt;ref name=&amp;quot;ref_b6d30380&amp;quot;&amp;gt;[https://mlr.mlr-org.com/articles/tutorial/feature_selection.html Feature Selection]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In each resampling iteration, feature selection is carried out on the corresponding training data set before fitting the learner.&amp;lt;ref name=&amp;quot;ref_b6d30380&amp;quot; /&amp;gt;&lt;br /&gt;
# The software has been implemented to automate all machine learning steps, including data pre-processing, feature selection, model selection, and performance evaluation.&amp;lt;ref name=&amp;quot;ref_f6598aa6&amp;quot;&amp;gt;[https://www.frontiersin.org/articles/10.3389/fgene.2019.00452/full Large-Scale Automatic Feature Selection for Biomarker Discovery in High-Dimensional OMICs Data]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In this section, we describe the program procedures separated in three main components: preprocessing, feature selection and model selection.&amp;lt;ref name=&amp;quot;ref_f6598aa6&amp;quot; /&amp;gt;&lt;br /&gt;
# Preprocessing and feature selection procedures are fully parallelizable. When all feature-optimized models are computed, the model selection starts.&amp;lt;ref name=&amp;quot;ref_f6598aa6&amp;quot; /&amp;gt;&lt;br /&gt;
# This optimization procedure performed during feature selection either maximizes or minimizes the criterion, depending on whether it measures a performance or an error, respectively.&amp;lt;ref name=&amp;quot;ref_f6598aa6&amp;quot; /&amp;gt;&lt;br /&gt;
# Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available.&amp;lt;ref name=&amp;quot;ref_0021775c&amp;quot;&amp;gt;[https://dl.acm.org/citation.cfm?id=944919.944968 An introduction to variable and feature selection]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Dimensionality reduction is another concept that newcomers tend to lump together with feature selection.&amp;lt;ref name=&amp;quot;ref_676d35be&amp;quot;&amp;gt;[https://heartbeat.fritz.ai/hands-on-with-feature-selection-techniques-an-introduction-1d8dc6d86c16 Hands-on with Feature Selection Techniques: An Introduction]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Feature selection is a method of selecting a subset of all features provided with observations data to build the optimal Machine Learning model.&amp;lt;ref name=&amp;quot;ref_554859e5&amp;quot;&amp;gt;[https://blog.netcetera.com/feature-selection-in-details-217bac0ed98d Feature selection is a method of selecting a subset of all features provided with observations data to build the optimal Machine Learning model.]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Embedded methods perform feature selection during the model training process.&amp;lt;ref name=&amp;quot;ref_554859e5&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature selection using linear models assumes a multivariate dependency of the target on the values of available features, and that the values of available features are normally distributed.&amp;lt;ref name=&amp;quot;ref_554859e5&amp;quot; /&amp;gt;&lt;br /&gt;
# In this blog post, we shall continue our discussion further on “Feature Selection in Machine Learning”.&amp;lt;ref name=&amp;quot;ref_f5d9c43a&amp;quot;&amp;gt;[https://medium.com/@mehulved1503/feature-selection-in-machine-learning-variable-ranking-and-feature-subset-selection-methods-89b2896f2220 Feature Selection in Machine Learning: Variable Ranking and Feature Subset Selection Methods]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In the previous blog post, I’d introduced the basic definitions, terminologies and the motivation in Feature Selection.&amp;lt;ref name=&amp;quot;ref_f5d9c43a&amp;quot; /&amp;gt;&lt;br /&gt;
# Feature selection methodologies fall into three general classes: intrinsic (or implicit) methods, filter methods, and wrapper methods.&amp;lt;ref name=&amp;quot;ref_2cac3ad4&amp;quot;&amp;gt;[http://www.feat.engineering/classes-of-feature-selection-methodologies.html Feature Engineering and Selection: A Practical Approach for Predictive Models]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Intrinsic methods have feature selection naturally incorporated with the modeling process.&amp;lt;ref name=&amp;quot;ref_2cac3ad4&amp;quot; /&amp;gt;&lt;br /&gt;
# Filter and wrapper methods, in contrast, work to marry feature selection approaches with modeling techniques.&amp;lt;ref name=&amp;quot;ref_2cac3ad4&amp;quot; /&amp;gt;&lt;br /&gt;
# If the data are better fit by a non-intrinsic feature selection type of model, then predictive performance may be sub-optimal when all features are used.&amp;lt;ref name=&amp;quot;ref_2cac3ad4&amp;quot; /&amp;gt;&lt;br /&gt;
# This article describes how to use the Filter Based Feature Selection module in Azure Machine Learning designer.&amp;lt;ref name=&amp;quot;ref_d21cedc6&amp;quot;&amp;gt;[https://docs.microsoft.com/en-us/azure/machine-learning/algorithm-module-reference/filter-based-feature-selection Filter Based Feature Selection: Module reference - Azure Machine Learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In general, feature selection refers to the process of applying statistical tests to inputs, given a specified output.&amp;lt;ref name=&amp;quot;ref_d21cedc6&amp;quot; /&amp;gt;&lt;br /&gt;
# The Filter Based Feature Selection module provides multiple feature selection algorithms to choose from.&amp;lt;ref name=&amp;quot;ref_d21cedc6&amp;quot; /&amp;gt;&lt;br /&gt;
# When you use the Filter Based Feature Selection module, you provide a dataset and identify the column that contains the label or dependent variable.&amp;lt;ref name=&amp;quot;ref_d21cedc6&amp;quot; /&amp;gt;&lt;br /&gt;
# Good feature selection eliminates irrelevant or redundant columns from your dataset without sacrificing accuracy.&amp;lt;ref name=&amp;quot;ref_25000733&amp;quot;&amp;gt;[https://www.datarobot.com/wiki/feature-selection/ Feature Selection for Machine Learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Automated feature selection.&amp;lt;ref name=&amp;quot;ref_25000733&amp;quot; /&amp;gt;&lt;br /&gt;
# The process of feature selection can be briefly described as follows.&amp;lt;ref name=&amp;quot;ref_ca66cf16&amp;quot;&amp;gt;[https://www.nature.com/articles/s41598-019-53471-0 Feature selection with the Fisher score followed by the Maximal Clique Centrality algorithm can accurately identify the hub genes of hepatocellular carcinoma]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# To further evaluate the performance of the Fisher score algorithm, a series of control feature selection algorithms were utilized to select feature genes from the current integrated HCC dataset.&amp;lt;ref name=&amp;quot;ref_ca66cf16&amp;quot; /&amp;gt;&lt;br /&gt;
# According to Applied Predictive Modeling, 2013, feature selection is primarily focused on removing non-informative or redundant predictors from the model.&amp;lt;ref name=&amp;quot;ref_be82161c&amp;quot;&amp;gt;[https://codeburst.io/what-is-feature-selection-in-machine-learning-and-how-is-it-used-43042a0683c1 What is Feature Selection in Machine Learning and How is it Used?]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# So, given the fact that more and more features are becoming available for machine learning projects, feature selection algorithms are increasingly growing in significance.&amp;lt;ref name=&amp;quot;ref_5568189c&amp;quot;&amp;gt;[https://www.explorium.ai/blog/demystifying-feature-selection-filter-vs-wrapper-methods/ Data Science Feature Selection: Filter vs Wrapper Methods l Explorium]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# My team is responsible for locating algorithms and feature selection strategies.&amp;lt;ref name=&amp;quot;ref_5568189c&amp;quot; /&amp;gt;&lt;br /&gt;
# In order to examine the two feature selection methodologies, let’s take a look at a small sample of our Melbourne prices dataset.&amp;lt;ref name=&amp;quot;ref_5568189c&amp;quot; /&amp;gt;&lt;br /&gt;
# At this point, all the generated features will be clean and normalized, before being thrown into the feature selection phase.&amp;lt;ref name=&amp;quot;ref_5568189c&amp;quot; /&amp;gt;&lt;br /&gt;
# Before conducting these model developments, feature selection was applied in order to select the most important input parameters for PPV.&amp;lt;ref name=&amp;quot;ref_e3a45a7f&amp;quot;&amp;gt;[https://www.mdpi.com/2076-3417/10/3/869 A Combination of Feature Selection and Random Forest Techniques to Solve a Problem Related to Blast-Induced Ground Vibration]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In this study, we propose a feature selection method for text classification based on independent feature space search.&amp;lt;ref name=&amp;quot;ref_fdf96d52&amp;quot;&amp;gt;[https://www.hindawi.com/journals/mpe/2020/6076272/ A New Feature Selection Method for Text Classification Based on Independent Feature Space Search]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Therefore, the dimension reduction methods have been proposed to solve this problem, including feature extraction and feature selection.&amp;lt;ref name=&amp;quot;ref_fdf96d52&amp;quot; /&amp;gt;&lt;br /&gt;
# In this paper, we propose a novel and effective idea of feature selection and use the diagrams to illustrate the difference between this method and the general feature selection method.&amp;lt;ref name=&amp;quot;ref_fdf96d52&amp;quot; /&amp;gt;&lt;br /&gt;
# Figure 2 shows the process diagram of the new feature selection method, namely, the RDTFD method; step ① represents adding all features to the original feature set.&amp;lt;ref name=&amp;quot;ref_fdf96d52&amp;quot; /&amp;gt;&lt;br /&gt;
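The scikit-learn classes quoted above (VarianceThreshold, SelectKBest, SelectFromModel) can be chained into one workflow. The following is a minimal sketch, not taken from any of the cited sources; it assumes scikit-learn is installed and uses its bundled breast-cancer dataset purely as a stand-in, with the filter, embedded, and wrapper-style steps labeled to match the categories discussed above.&lt;br /&gt;
 # Minimal feature-selection sketch (illustrative only).&lt;br /&gt;
 from sklearn.datasets import load_breast_cancer&lt;br /&gt;
 from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, VarianceThreshold, f_classif&lt;br /&gt;
 from sklearn.linear_model import LogisticRegression&lt;br /&gt;
 &lt;br /&gt;
 X, y = load_breast_cancer(return_X_y=True)&lt;br /&gt;
 &lt;br /&gt;
 # Filter: drop near-constant features, then keep the 10 best by an ANOVA F-test.&lt;br /&gt;
 X_var = VarianceThreshold(threshold=0.01).fit_transform(X)&lt;br /&gt;
 X_uni = SelectKBest(f_classif, k=10).fit_transform(X_var, y)&lt;br /&gt;
 &lt;br /&gt;
 # Embedded: keep features with non-zero L1-regularized coefficients.&lt;br /&gt;
 lasso = LogisticRegression(penalty=&amp;quot;l1&amp;quot;, solver=&amp;quot;liblinear&amp;quot;, C=0.5).fit(X_uni, y)&lt;br /&gt;
 X_emb = SelectFromModel(lasso, prefit=True).transform(X_uni)&lt;br /&gt;
 &lt;br /&gt;
 # Wrapper-style: recursive feature elimination down to 5 features.&lt;br /&gt;
 X_rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=5).fit_transform(X_uni, y)&lt;br /&gt;
 &lt;br /&gt;
 print(X.shape, X_var.shape, X_uni.shape, X_emb.shape, X_rfe.shape)&lt;br /&gt;
Each step returns a subset of the original columns rather than new derived features, which is the distinction from feature extraction drawn in the Wikipedia quote above.&lt;br /&gt;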
===Sources===&lt;br /&gt;
 &amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Pythagoras0</name></author>
	</entry>
</feed>