<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="ko">
	<id>https://wiki.mathnt.net/index.php?action=history&amp;feed=atom&amp;title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC</id>
	<title>Decision tree - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.mathnt.net/index.php?action=history&amp;feed=atom&amp;title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC"/>
	<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;action=history"/>
	<updated>2026-04-05T10:02:38Z</updated>
	<subtitle>Revision history for this page</subtitle>
	<generator>MediaWiki 1.35.0</generator>
	<entry>
		<id>https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;diff=51245&amp;oldid=prev</id>
		<title>Edit by Pythagoras0 at 08:07, 17 February 2021</title>
		<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;diff=51245&amp;oldid=prev"/>
		<updated>2021-02-17T08:07:10Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left diff-editfont-monospace&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;ko&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 08:07, 17 February 2021&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l121&quot; &gt;Line 121:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 121:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt;−&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== 메타데이터 ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;==메타데이터==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt;−&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===위키데이터===&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===위키데이터===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* ID :  [https://www.wikidata.org/wiki/Q831366 Q831366]&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* ID :  [https://www.wikidata.org/wiki/Q831366 Q831366]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===Spacy 패턴 목록===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;* [{&amp;#039;LOWER&amp;#039;: &amp;#039;decision&amp;#039;}, {&amp;#039;LEMMA&amp;#039;: &amp;#039;tree&amp;#039;}]&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Pythagoras0</name></author>
	</entry>
	<entry>
		<id>https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;diff=47042&amp;oldid=prev</id>
		<title>Pythagoras0: /* 메타데이터 */ new section</title>
		<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;diff=47042&amp;oldid=prev"/>
		<updated>2020-12-26T12:17:00Z</updated>

		<summary type="html">&lt;p&gt;&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;메타데이터: &lt;/span&gt; 새 문단&lt;/span&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left diff-editfont-monospace&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;ko&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 12:17, 26 December 2020&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l120&quot; &gt;Line 120:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 120:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===소스===&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===소스===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;  &amp;lt;references /&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;== 메타데이터 ==&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;===위키데이터===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class=&#039;diff-marker&#039;&gt;+&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;* ID :  [https://www.wikidata.org/wiki/Q831366 Q831366]&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Pythagoras0</name></author>
	</entry>
	<entry>
		<id>https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;diff=46282&amp;oldid=prev</id>
		<title>Pythagoras0: /* 노트 */ new section</title>
		<link rel="alternate" type="text/html" href="https://wiki.mathnt.net/index.php?title=%EA%B2%B0%EC%A0%95_%ED%8A%B8%EB%A6%AC&amp;diff=46282&amp;oldid=prev"/>
		<updated>2020-12-21T10:58:18Z</updated>

		<summary type="html">&lt;p&gt;&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;노트: &lt;/span&gt; 새 문단&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;새 문서&lt;/b&gt;&lt;/p&gt;&lt;div&gt;== 노트 ==&lt;br /&gt;
&lt;br /&gt;
===위키데이터===&lt;br /&gt;
* ID :  [https://www.wikidata.org/wiki/Q831366 Q831366]&lt;br /&gt;
===말뭉치===&lt;br /&gt;
# This Decision Tree may be used as a tool to construct or test such a policy for your organisation.&amp;lt;ref name=&amp;quot;ref_a988f2d5&amp;quot;&amp;gt;[https://www.dpconline.org/handbook/organisational-activities/decision-tree Digital Preservation Handbook]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In psychology, the decision tree methods were used to model the human concept of learning.&amp;lt;ref name=&amp;quot;ref_7135c010&amp;quot;&amp;gt;[https://www.explorium.ai/blog/the-complete-guide-to-decision-trees/ Decision Trees: The Complete Guide to Decision Tree Classifier l Explorium]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# There is no more logical data to learn via decision tree classifier, than … tree classifications.&amp;lt;ref name=&amp;quot;ref_7135c010&amp;quot; /&amp;gt;&lt;br /&gt;
# Sometimes, it is very useful to visualize the final decision tree classifier model.&amp;lt;ref name=&amp;quot;ref_7135c010&amp;quot; /&amp;gt;&lt;br /&gt;
# Python supports various decision tree classifier visualization options, but only two of them are really popular.&amp;lt;ref name=&amp;quot;ref_7135c010&amp;quot; /&amp;gt;&lt;br /&gt;
# Decision tree software is used in data mining to simplify complex strategic challenges and evaluate the cost-effectiveness of research and business decisions.&amp;lt;ref name=&amp;quot;ref_4038695f&amp;quot;&amp;gt;[https://whatis.techtarget.com/definition/decision-tree Definition from WhatIs.com]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In this paper, we present fundamental theorems for the instability problem of decision tree classifiers.&amp;lt;ref name=&amp;quot;ref_e10ada1b&amp;quot;&amp;gt;[https://dl.acm.org/doi/10.1145/775047.775131 Instability of decision tree classification algorithms]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# As per Wikipedia, A decision tree is a decision support tool that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility.&amp;lt;ref name=&amp;quot;ref_0e78b4fe&amp;quot;&amp;gt;[https://datascience.foundation/sciencewhitepaper/understanding-decision-trees-with-python Understanding Decision Trees with Python]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Generally, a decision tree is drawn upside down with its root at the top (recommended) and it is known as Top-Down Approach.&amp;lt;ref name=&amp;quot;ref_0e78b4fe&amp;quot; /&amp;gt;&lt;br /&gt;
# A sub section of the decision tree is called branch or sub-tree.&amp;lt;ref name=&amp;quot;ref_0e78b4fe&amp;quot; /&amp;gt;&lt;br /&gt;
# Another very popular way to split nodes in the decision tree is Entropy.&amp;lt;ref name=&amp;quot;ref_0e78b4fe&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree helps to decide whether the net gain from a decision is worthwhile.&amp;lt;ref name=&amp;quot;ref_b3464e5b&amp;quot;&amp;gt;[https://www.tutor2u.net/business/reference/decision-trees Decision Trees]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Let&amp;#039;s look at an example of how a decision tree is constructed.&amp;lt;ref name=&amp;quot;ref_b3464e5b&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree starts with a decision to be made and the options that can be taken.&amp;lt;ref name=&amp;quot;ref_b3464e5b&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a branched flowchart showing multiple pathways for potential decisions and outcomes.&amp;lt;ref name=&amp;quot;ref_a661ed5f&amp;quot;&amp;gt;[https://courses.lumenlearning.com/wm-principlesofmanagement/chapter/using-a-decision-tree/ Principles of Management]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Even in only this simple form, a decision tree is useful to show the possibilities for a decision.&amp;lt;ref name=&amp;quot;ref_a661ed5f&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a supervised learning technique that has a pre-defined target variable and is most often used in classification problems.&amp;lt;ref name=&amp;quot;ref_6bf20a1d&amp;quot;&amp;gt;[https://deepai.org/machine-learning-glossary-and-terms/decision-tree Decision Tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A decision tree is a diagram or chart that people use to determine a course of action or show a statistical probability.&amp;lt;ref name=&amp;quot;ref_370b9492&amp;quot;&amp;gt;[https://www.investopedia.com/terms/d/decision-tree.asp Decision Tree Definition]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Each branch of the decision tree represents a possible decision, outcome, or reaction.&amp;lt;ref name=&amp;quot;ref_370b9492&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a graphical depiction of a decision and every potential outcome or result of making that decision.&amp;lt;ref name=&amp;quot;ref_370b9492&amp;quot; /&amp;gt;&lt;br /&gt;
# In the decision tree, each end result has an assigned risk and reward weight or number.&amp;lt;ref name=&amp;quot;ref_370b9492&amp;quot; /&amp;gt;&lt;br /&gt;
# The third experiment evaluates the accuracy of a selected tree compared to a randomly chosen decision tree.&amp;lt;ref name=&amp;quot;ref_43fe7429&amp;quot;&amp;gt;[https://journalofbigdata.springeropen.com/articles/10.1186/s40537-019-0186-3 Selecting a representative decision tree from an ensemble of decision-tree models for fast big data classification]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# For calculating the semantic similarity and choosing the most accurate decision tree, we run the decision trees over the development set that is 10% of the dataset.&amp;lt;ref name=&amp;quot;ref_43fe7429&amp;quot; /&amp;gt;&lt;br /&gt;
# We expect the single-tree approach to yield shorter classification times than the ensemble due to the fact that there is no need to run all decision tree models over the testing data.&amp;lt;ref name=&amp;quot;ref_43fe7429&amp;quot; /&amp;gt;&lt;br /&gt;
# In this way, although each induced decision tree sees only part of the trained dataset the voting combines their predictions over the testing dataset.&amp;lt;ref name=&amp;quot;ref_43fe7429&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a support tool with a tree-like structure that models probable outcomes, cost of resources, utilities, and possible consequences.&amp;lt;ref name=&amp;quot;ref_475596d9&amp;quot;&amp;gt;[https://corporatefinanceinstitute.com/resources/knowledge/other/decision-tree/ Overview, Decision Types, Applications]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A small change in the data can result in a major change in the structure of the decision tree, which can convey a different result from what users will get in a normal event.&amp;lt;ref name=&amp;quot;ref_475596d9&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a popular method of creating and visualizing predictive models and algorithms.&amp;lt;ref name=&amp;quot;ref_e18b3d12&amp;quot;&amp;gt;[https://www.aunalytics.com/decision-trees-an-overview/ Decision Trees: An Overview]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# The basic goal of a decision tree is to split a population of data into smaller segments.&amp;lt;ref name=&amp;quot;ref_e18b3d12&amp;quot; /&amp;gt;&lt;br /&gt;
# Since this data was not used to train the model, it will show whether or not the decision tree has overlearned the training data.&amp;lt;ref name=&amp;quot;ref_e18b3d12&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is created for each subset, and the results of each tree are combined.&amp;lt;ref name=&amp;quot;ref_e18b3d12&amp;quot; /&amp;gt;&lt;br /&gt;
# The decision tree is a greedy algorithm that performs a recursive binary partitioning of the feature space.&amp;lt;ref name=&amp;quot;ref_9442c955&amp;quot;&amp;gt;[https://spark.apache.org/docs/1.3.0/mllib-decision-tree.html Spark 1.3.0 Documentation]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Implementation details: For faster processing, the decision tree algorithm collects statistics about groups of nodes to split (rather than 1 node at a time).&amp;lt;ref name=&amp;quot;ref_9442c955&amp;quot; /&amp;gt;&lt;br /&gt;
# subsamplingRate : Fraction of the training data used for learning the decision tree.&amp;lt;ref name=&amp;quot;ref_9442c955&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a simple representation for classifying examples.&amp;lt;ref name=&amp;quot;ref_ca92485b&amp;quot;&amp;gt;[https://artint.info/html/ArtInt_177.html foundations of computational agents -- 7.3.1 Learning Decision Trees]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A decision tree or a classification tree is a tree in which each internal (non-leaf) node is labeled with an input feature.&amp;lt;ref name=&amp;quot;ref_ca92485b&amp;quot; /&amp;gt;&lt;br /&gt;
# Each decision tree can be used to classify examples according to the user&amp;#039;s action.&amp;lt;ref name=&amp;quot;ref_ca92485b&amp;quot; /&amp;gt;&lt;br /&gt;
# A deterministic decision tree, in which all of the leaves are classes, can be mapped into a set of rules, with each leaf of the tree corresponding to a rule.&amp;lt;ref name=&amp;quot;ref_ca92485b&amp;quot; /&amp;gt;&lt;br /&gt;
# The Decision Tree algorithm, like Naive Bayes, is based on conditional probabilities.&amp;lt;ref name=&amp;quot;ref_e0e16662&amp;quot;&amp;gt;[https://docs.oracle.com/cd/B28359_01/datamine.111/b28129/algo_decisiontree.htm Decision Tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Decision Tree Rules Oracle Data Mining supports several algorithms that provide rules.&amp;lt;ref name=&amp;quot;ref_e0e16662&amp;quot; /&amp;gt;&lt;br /&gt;
# Figure 11-1 shows a rule generated by a Decision Tree model.&amp;lt;ref name=&amp;quot;ref_e0e16662&amp;quot; /&amp;gt;&lt;br /&gt;
# This rule comes from a decision tree that predicts the probability that customers will increase spending if given a loyalty card.&amp;lt;ref name=&amp;quot;ref_e0e16662&amp;quot; /&amp;gt;&lt;br /&gt;
# This decision tree describes how to use the alt attribute of the &amp;lt;img&amp;gt; element in various situations.&amp;lt;ref name=&amp;quot;ref_45cc8300&amp;quot;&amp;gt;[https://www.w3.org/WAI/tutorials/images/decision-tree/ An alt Decision Tree • Images • WAI Web Accessibility Tutorials]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# This decision tree does not cover all cases.&amp;lt;ref name=&amp;quot;ref_45cc8300&amp;quot; /&amp;gt;&lt;br /&gt;
# then it is called a Categorical variable decision tree.&amp;lt;ref name=&amp;quot;ref_e00217fe&amp;quot;&amp;gt;[https://www.kdnuggets.com/2020/01/decision-tree-algorithm-explained.html Decision Tree Algorithm, Explained]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Now, as we know this is an important variable, then we can build a decision tree to predict customer income based on occupation, product, and various other variables.&amp;lt;ref name=&amp;quot;ref_e00217fe&amp;quot; /&amp;gt;&lt;br /&gt;
# The primary challenge in the decision tree implementation is to identify which attributes do we need to consider as the root node and each level.&amp;lt;ref name=&amp;quot;ref_e00217fe&amp;quot; /&amp;gt;&lt;br /&gt;
# If there is no limit set on a decision tree, it will give you 100% accuracy on the training data set because in the worst case it will end up making 1 leaf for each observation.&amp;lt;ref name=&amp;quot;ref_e00217fe&amp;quot; /&amp;gt;&lt;br /&gt;
# It breaks down a dataset into smaller and smaller subsets while at the same time an associated decision tree is incrementally developed.&amp;lt;ref name=&amp;quot;ref_8153d365&amp;quot;&amp;gt;[https://www.saedsayad.com/decision_tree.htm Decision Tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# ID3 uses Entropy and Information Gain to construct a decision tree.&amp;lt;ref name=&amp;quot;ref_8153d365&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is built top-down from a root node and involves partitioning the data into subsets that contain instances with similar values (homogenous).&amp;lt;ref name=&amp;quot;ref_8153d365&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is one of the supervised machine learning algorithms.&amp;lt;ref name=&amp;quot;ref_8aa6a8ab&amp;quot;&amp;gt;[https://towardsai.net/p/programming/decision-trees-explained-with-a-practical-example-fe47872d3b53 Decision Trees Explained With a Practical Example]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A part of the entire decision tree is called a branch or sub-tree.&amp;lt;ref name=&amp;quot;ref_8aa6a8ab&amp;quot; /&amp;gt;&lt;br /&gt;
# This is the end of the decision tree where it cannot be split into further sub-nodes.&amp;lt;ref name=&amp;quot;ref_8aa6a8ab&amp;quot; /&amp;gt;&lt;br /&gt;
# The small variation in the input data can result in a different decision tree.&amp;lt;ref name=&amp;quot;ref_8aa6a8ab&amp;quot; /&amp;gt;&lt;br /&gt;
# It is a tree-structured classifier; in a Decision tree, there are two nodes, which are the Decision Node and Leaf Node.&amp;lt;ref name=&amp;quot;ref_3f022d5f&amp;quot;&amp;gt;[https://www.javatpoint.com/machine-learning-decision-tree-classification-algorithm Machine Learning Decision Tree Classification Algorithm]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Note: A decision tree can contain categorical data (YES/NO) as well as numeric data.&amp;lt;ref name=&amp;quot;ref_3f022d5f&amp;quot; /&amp;gt;&lt;br /&gt;
# The logic behind the decision tree can be easily understood because it shows a tree-like structure.&amp;lt;ref name=&amp;quot;ref_3f022d5f&amp;quot; /&amp;gt;&lt;br /&gt;
# Root node is from where the decision tree starts.&amp;lt;ref name=&amp;quot;ref_3f022d5f&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a tree like collection of nodes intended to create a decision on values affiliation to a class or an estimate of a numerical target value.&amp;lt;ref name=&amp;quot;ref_47d4d34c&amp;quot;&amp;gt;[https://docs.rapidminer.com/latest/studio/operators/modeling/predictive/trees/parallel_decision_tree.html RapidMiner Documentation]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# After generation, the decision tree model can be applied to new Examples using the Apply Model Operator.&amp;lt;ref name=&amp;quot;ref_47d4d34c&amp;quot; /&amp;gt;&lt;br /&gt;
# The CHAID Operator provides a pruned decision tree that uses chi-squared based criterion instead of information gain or gain ratio criteria.&amp;lt;ref name=&amp;quot;ref_47d4d34c&amp;quot; /&amp;gt;&lt;br /&gt;
# The ID3 Operator provides a basic implementation of unpruned decision tree.&amp;lt;ref name=&amp;quot;ref_47d4d34c&amp;quot; /&amp;gt;&lt;br /&gt;
# In rpart decision tree library, you can control the parameters using the rpart.control() function.&amp;lt;ref name=&amp;quot;ref_6f8d8337&amp;quot;&amp;gt;[https://www.guru99.com/r-decision-trees.html Decision Tree in R | Classification Tree &amp;amp; Code in R with Example]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In its simplest form, a decision tree is a type of flowchart that shows a clear pathway to a decision.&amp;lt;ref name=&amp;quot;ref_3481ab5d&amp;quot;&amp;gt;[https://careerfoundry.com/en/blog/data-analytics/what-is-a-decision-tree/ What Is a Decision Tree and How Is It Used?]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Luckily, a lot of decision tree terminology follows the tree analogy, which makes it much easier to remember!&amp;lt;ref name=&amp;quot;ref_3481ab5d&amp;quot; /&amp;gt;&lt;br /&gt;
# By including options for what to do in the event of not being hungry, we’ve overcomplicated our decision tree.&amp;lt;ref name=&amp;quot;ref_3481ab5d&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a tree-structured classification model, which is easy to understand, even by nonexpert users, and can be efficiently induced from data.&amp;lt;ref name=&amp;quot;ref_94181150&amp;quot;&amp;gt;[https://link.springer.com/10.1007%2F978-0-387-30164-8_204 Decision Tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# An extensive survey of decision tree learning can be found in Murthy (1998).&amp;lt;ref name=&amp;quot;ref_94181150&amp;quot; /&amp;gt;&lt;br /&gt;
# Researchers from various disciplines such as statistics, machine learning, pattern recognition, and Data Mining have dealt with the issue of growing a decision tree from available data.&amp;lt;ref name=&amp;quot;ref_42c68945&amp;quot;&amp;gt;[https://link.springer.com/chapter/10.1007/0-387-25465-X_9 Decision Trees]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# This paper presents an updated survey of current methods for constructing decision tree classifiers in a top-down manner.&amp;lt;ref name=&amp;quot;ref_42c68945&amp;quot; /&amp;gt;&lt;br /&gt;
# Now that you know exactly what a decision tree is, it’s time to consider why this methodology is so effective.&amp;lt;ref name=&amp;quot;ref_1ecc6d62&amp;quot;&amp;gt;[https://venngage.com/blog/what-is-a-decision-tree/ What is a Decision Tree and How to Make One [Templates + Examples]]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A decision tree to help someone determine whether they should rent or buy, for example, would be a welcomed piece of content on your blog.&amp;lt;ref name=&amp;quot;ref_1ecc6d62&amp;quot; /&amp;gt;&lt;br /&gt;
# The overarching objective or decision you’re trying to make should be identified at the very top of your decision tree.&amp;lt;ref name=&amp;quot;ref_1ecc6d62&amp;quot; /&amp;gt;&lt;br /&gt;
# When creating your decision tree, it’s important to do research, so you can accurately predict the likelihood for success.&amp;lt;ref name=&amp;quot;ref_1ecc6d62&amp;quot; /&amp;gt;&lt;br /&gt;
# Fig 1. illustrates a learned decision tree.&amp;lt;ref name=&amp;quot;ref_775c88ec&amp;quot;&amp;gt;[https://www.hackerearth.com/practice/machine-learning/machine-learning-algorithms/ml-decision-tree/tutorial/ Decision Tree Tutorials &amp;amp; Notes]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# In Fig 3., we can see that there are two candidate concepts for producing the decision tree that performs the AND operation.&amp;lt;ref name=&amp;quot;ref_775c88ec&amp;quot; /&amp;gt;&lt;br /&gt;
# Attributes is a list of other attributes that may be tested by the learned decision tree.&amp;lt;ref name=&amp;quot;ref_775c88ec&amp;quot; /&amp;gt;&lt;br /&gt;
# #Importing the Decision tree classifier from the sklearn library.&amp;lt;ref name=&amp;quot;ref_775c88ec&amp;quot; /&amp;gt;&lt;br /&gt;
# We developed the additive tree, a theoretical approach to generate a more accurate and interpretable decision tree, which reveals connections between CART and gradient boosting.&amp;lt;ref name=&amp;quot;ref_4d83b225&amp;quot;&amp;gt;[https://www.pnas.org/content/116/40/19887 Building more accurate decision trees with the additive tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Decision tree learning and gradient boosting have been connected primarily through CART models used as the weak learners in boosting.&amp;lt;ref name=&amp;quot;ref_4d83b225&amp;quot; /&amp;gt;&lt;br /&gt;
# 26 proves that decision tree algorithms, specifically CART and C4.5 (27), are, in fact, boosting algorithms.&amp;lt;ref name=&amp;quot;ref_4d83b225&amp;quot; /&amp;gt;&lt;br /&gt;
# A sequence of weak classifiers on each branch of the decision tree was trained recursively using AdaBoost; therefore, rendering a decision tree where each branch conforms to a strong classifier.&amp;lt;ref name=&amp;quot;ref_4d83b225&amp;quot; /&amp;gt;&lt;br /&gt;
# Time to shine for the decision tree!&amp;lt;ref name=&amp;quot;ref_1368242a&amp;quot;&amp;gt;[https://christophm.github.io/interpretable-ml-book/tree.html Interpretable Machine Learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Individual predictions of a decision tree can be explained by decomposing the decision path into one component per feature.&amp;lt;ref name=&amp;quot;ref_1368242a&amp;quot; /&amp;gt;&lt;br /&gt;
# We want to predict the number of rented bikes on a certain day with a decision tree.&amp;lt;ref name=&amp;quot;ref_1368242a&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree is a decision support tool that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility.&amp;lt;ref name=&amp;quot;ref_b8441c85&amp;quot;&amp;gt;[https://en.wikipedia.org/wiki/Decision_tree Decision tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Drawn from left to right, a decision tree has only burst nodes (splitting paths) but no sink nodes (converging paths).&amp;lt;ref name=&amp;quot;ref_b8441c85&amp;quot; /&amp;gt;&lt;br /&gt;
# The decision tree illustrates that when sequentially distributing lifeguards, placing a first lifeguard on beach #1 would be optimal if there is only the budget for 1 lifeguard.&amp;lt;ref name=&amp;quot;ref_b8441c85&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree typically starts with a single node, which branches into possible outcomes.&amp;lt;ref name=&amp;quot;ref_0fa0bd9b&amp;quot;&amp;gt;[https://www.lucidchart.com/pages/decision-tree What is a Decision Tree Diagram]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# The construction of a decision tree classifier does not require any domain knowledge or parameter setting, and is therefore appropriate for exploratory knowledge discovery.&amp;lt;ref name=&amp;quot;ref_7de990f9&amp;quot;&amp;gt;[https://www.geeksforgeeks.org/decision-tree/ Decision Tree]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A decision tree can be computationally expensive to train.&amp;lt;ref name=&amp;quot;ref_7de990f9&amp;quot; /&amp;gt;&lt;br /&gt;
# The process of growing a decision tree is computationally expensive.&amp;lt;ref name=&amp;quot;ref_7de990f9&amp;quot; /&amp;gt;&lt;br /&gt;
# A decision tree (also referred to as a classification tree or a reduction tree) is a predictive model which is a mapping from observations about an item to conclusions about its target value.&amp;lt;ref name=&amp;quot;ref_b4f91818&amp;quot;&amp;gt;[https://www.sciencedirect.com/topics/computer-science/decision-trees Decision Trees - an overview]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Building a decision tree that is consistent with a given data set is easy.&amp;lt;ref name=&amp;quot;ref_b4f91818&amp;quot; /&amp;gt;&lt;br /&gt;
# Section 17.4.2.1 describes how iComment uses decision tree learning to build models to classify comments.&amp;lt;ref name=&amp;quot;ref_b4f91818&amp;quot; /&amp;gt;&lt;br /&gt;
# iComment uses decision tree learning because it works well and its results are easy to interpret.&amp;lt;ref name=&amp;quot;ref_b4f91818&amp;quot; /&amp;gt;&lt;br /&gt;
# In this article I shall present one recently developed concept called the “decision tree,” which has tremendous potential as a decision-making tool.&amp;lt;ref name=&amp;quot;ref_73d96d45&amp;quot;&amp;gt;[https://hbr.org/1964/07/decision-trees-for-decision-making Decision Trees for Decision Making]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# The decision tree can clarify for management, as can no other analytical tool that I know of, the choices, risks, objectives, monetary gains, and information needs involved in an investment problem.&amp;lt;ref name=&amp;quot;ref_73d96d45&amp;quot; /&amp;gt;&lt;br /&gt;
# Exhibit I illustrates a decision tree for the cocktail party problem.&amp;lt;ref name=&amp;quot;ref_73d96d45&amp;quot; /&amp;gt;&lt;br /&gt;
# In the decision tree you lay out only those decisions and events or results that are important to you and have consequences you wish to compare.&amp;lt;ref name=&amp;quot;ref_73d96d45&amp;quot; /&amp;gt;&lt;br /&gt;
# In decision analysis, a decision tree can be used to visually and explicitly represent decisions and decision making.&amp;lt;ref name=&amp;quot;ref_10a640db&amp;quot;&amp;gt;[https://towardsdatascience.com/decision-trees-in-machine-learning-641b9c4e8052 Decision Trees in Machine Learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# A decision tree is drawn upside down with its root at the top.&amp;lt;ref name=&amp;quot;ref_10a640db&amp;quot; /&amp;gt;&lt;br /&gt;
# This methodology is more commonly known as learning a decision tree from data, and the above tree is called a classification tree, as the target is to classify passengers as survived or died.&amp;lt;ref name=&amp;quot;ref_10a640db&amp;quot; /&amp;gt;&lt;br /&gt;
# Now the decision tree will start splitting by considering each feature in the training data.&amp;lt;ref name=&amp;quot;ref_10a640db&amp;quot; /&amp;gt;&lt;br /&gt;
# Decision tree learning is one of the predictive modelling approaches used in statistics, data mining and machine learning.&amp;lt;ref name=&amp;quot;ref_62f2f37f&amp;quot;&amp;gt;[https://en.wikipedia.org/wiki/Decision_tree_learning Decision tree learning]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# It uses a decision tree (as a predictive model) to go from observations about an item (represented in the branches) to conclusions about the item&amp;#039;s target value (represented in the leaves).&amp;lt;ref name=&amp;quot;ref_62f2f37f&amp;quot; /&amp;gt;&lt;br /&gt;
# In data mining, a decision tree describes data (but the resulting classification tree can be an input for decision making).&amp;lt;ref name=&amp;quot;ref_62f2f37f&amp;quot; /&amp;gt;&lt;br /&gt;
# To construct a decision tree on this data, we need to compare the information gain of each of four trees, each split on one of the four features.&amp;lt;ref name=&amp;quot;ref_62f2f37f&amp;quot; /&amp;gt;&lt;br /&gt;
# You start a Decision Tree with a decision that you need to make.&amp;lt;ref name=&amp;quot;ref_5703ec85&amp;quot;&amp;gt;[https://www.mindtools.com/dectree.html Decision Skills from MindTools.com]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# Now you are ready to evaluate the decision tree.&amp;lt;ref name=&amp;quot;ref_5703ec85&amp;quot; /&amp;gt;&lt;br /&gt;
# Start on the right hand side of the decision tree, and work back towards the left.&amp;lt;ref name=&amp;quot;ref_5703ec85&amp;quot; /&amp;gt;&lt;br /&gt;
# The use of multi-output trees for regression is demonstrated in Multi-output Decision Tree Regression.&amp;lt;ref name=&amp;quot;ref_1403dcd3&amp;quot;&amp;gt;[http://scikit-learn.org/stable/modules/tree.html 1.10. Decision Trees — scikit-learn 0.23.2 documentation]&amp;lt;/ref&amp;gt;&lt;br /&gt;
# C4.5, C5.0 and CART: What are all the various decision tree algorithms, and how do they differ from each other?&amp;lt;ref name=&amp;quot;ref_1403dcd3&amp;quot; /&amp;gt;&lt;br /&gt;
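The information-gain comparison mentioned above can be sketched in pure Python. This is a minimal illustration on toy AND-gate data (the rows and labels are assumptions for the example, not taken from the cited articles):

```python
# Information-gain sketch on a hypothetical AND-gate dataset.
# Entropy H(S) = -sum p_i log2 p_i over class proportions p_i; the gain
# of a split is H(parent) minus the weighted entropy of the child subsets.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, child_label_lists):
    """Entropy reduction achieved by splitting the parent into the given children."""
    n = len(parent_labels)
    weighted = sum(len(ch) / n * entropy(ch) for ch in child_label_lists)
    return entropy(parent_labels) - weighted

# AND gate: the label is 1 only when both inputs are 1.
labels = [0, 0, 0, 1]            # rows (0,0), (0,1), (1,0), (1,1)
split_on_x1 = [[0, 0], [0, 1]]   # x1=0 rows vs. x1=1 rows
print(round(information_gain(labels, split_on_x1), 3))  # prints 0.311
```

By symmetry, splitting on x2 yields the same gain here, so a greedy learner may pick either feature first; a real implementation would compare the gains of all candidate splits, as the quoted sentence describes.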
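The per-feature decomposition of a single prediction mentioned above (one contribution per feature along the decision path) can be sketched as follows. The tree is hand-built with hypothetical thresholds and node means, loosely echoing the bike-rental example:

```python
# Decompose one regression-tree prediction into per-feature contributions.
# Each internal node stores the feature it splits on, a threshold, and the
# mean target value of the training rows that reached it; a split's
# contribution is the change in node mean caused by taking that branch.
def explain_prediction(root, x):
    """Return (prediction, {feature: contribution}) for one sample x."""
    contributions = {}
    node = root
    current = node["mean"]
    while "feature" in node:
        child = node["right"] if x[node["feature"]] > node["threshold"] else node["left"]
        delta = child["mean"] - current
        contributions[node["feature"]] = contributions.get(node["feature"], 0.0) + delta
        current = child["mean"]
        node = child
    return current, contributions

# Toy bike-rental tree (all numbers are illustrative assumptions).
tree = {
    "mean": 4500.0, "feature": "temp", "threshold": 15.0,
    "left": {"mean": 3000.0},
    "right": {
        "mean": 5500.0, "feature": "humidity", "threshold": 70.0,
        "left": {"mean": 6000.0},
        "right": {"mean": 4800.0},
    },
}
pred, contrib = explain_prediction(tree, {"temp": 20.0, "humidity": 60.0})
print(pred, contrib)  # 6000.0 {'temp': 1000.0, 'humidity': 500.0}
```

The root mean plus the contributions always sums to the prediction (4500 + 1000 + 500 = 6000 here), which is what makes this decomposition a complete explanation of the decision path.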
===Sources===&lt;br /&gt;
 &amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Pythagoras0</name></author>
	</entry>
</feed>