Question

In: Statistics and Probability

In a decision tree, how does the algorithm pick the attributes for splitting?

Would you explain it logically and specifically?

Solutions

Expert Solution

Answer:

In a decision tree, the ID3 algorithm selects the splitting attribute as follows (a worked sketch follows below):

  • Compute the entropy of the target attribute.
  • The dataset is then split on each of the other attributes. The entropy of each branch is calculated and added proportionally (weighted by branch size) to give the total entropy of the split. This is subtracted from the entropy before the split; the result is the information gain, i.e., the decrease in entropy.
  • The attribute with the largest information gain is chosen as the decision node.
  • A branch with an entropy of 0 is a leaf node.
  • A branch with entropy greater than zero needs further splitting.

The ID3 algorithm is then run recursively on the non-leaf branches until all the data is classified.
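
To make those steps concrete, here is a minimal Python sketch of the entropy and information-gain calculation. The dataset, attribute names, and values below are invented purely for illustration; they are not from the question.

    import math
    from collections import Counter

    def entropy(labels):
        # Entropy (in bits) of a list of class labels.
        total = len(labels)
        return -sum((n / total) * math.log2(n / total)
                    for n in Counter(labels).values())

    def information_gain(rows, attribute, target):
        # Entropy before the split minus the weighted entropy after splitting on the attribute.
        before = entropy([row[target] for row in rows])
        total = len(rows)
        after = 0.0
        for value in set(row[attribute] for row in rows):
            branch = [row[target] for row in rows if row[attribute] == value]
            after += (len(branch) / total) * entropy(branch)
        return before - after

    # Toy records (hypothetical): decide "Play" from "Outlook" and "Windy".
    data = [
        {"Outlook": "Sunny",    "Windy": "No",  "Play": "No"},
        {"Outlook": "Sunny",    "Windy": "Yes", "Play": "No"},
        {"Outlook": "Overcast", "Windy": "No",  "Play": "Yes"},
        {"Outlook": "Rain",     "Windy": "No",  "Play": "Yes"},
        {"Outlook": "Rain",     "Windy": "Yes", "Play": "No"},
    ]

    gains = {a: information_gain(data, a, "Play") for a in ("Outlook", "Windy")}
    best = max(gains, key=gains.get)  # attribute with the largest information gain
    print(gains, "-> split on:", best)

ID3 would make that best attribute the decision node and then repeat the same calculation on every branch whose entropy is still above zero.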

A decision tree is built top-down from a root node and involves partitioning the data into subsets that contain instances with similar values (homogeneous). The ID3 algorithm uses entropy to measure the homogeneity of a sample. If the sample is completely homogeneous, the entropy is zero; if the sample is split equally between the classes, the entropy is one.
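
As a quick check of those two boundary cases with the usual entropy formula H(S) = -Σ p_i · log2(p_i): a sample split evenly between two classes gives H = -(0.5·log2 0.5 + 0.5·log2 0.5) = 1, while a completely homogeneous sample gives H = -(1·log2 1) = 0.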


Related Solutions

Design a decision tree that computes the logical AND function. How does it compare to the...
Design a decision tree that computes the logical AND function. How does it compare to the Perceptron solution? Can a perceptron be used to implement a 3 input exclusive NOR gate?
Apply the classification algorithm to the following set of data records. Draw a decision tree. The...
Apply the classification algorithm to the following set of data records. Draw a decision tree. The class attribute is Repeat Customer.
RID  Age     City  Gender  Education    Repeat Customer
101  20..30  NY    F       College      YES
102  20..30  SF    M       Graduate     YES
103  31..40  NY    F       College      YES
104  51..60  NY    F       College      NO
105  31..40  LA    M       High school  NO
106  41..50  NY    F       College      YES
107  41..50  NY    F       Graduate     YES
108  20..30  LA    M       College      YES
109  20..30  NY...
Health Economics: Please answer asap How do you construct a decision tree and what does it...
Health Economics: Please answer asap. How do you construct a decision tree and what does it mean to have an expected value and standard deviation?
  • Describe the risk preferences
  • What is scenario analysis and sensitivity analysis?
  • List and explain the approaches to managing risk
  • What is the difference among these concepts from a health economics perspective: HMO, POS, PPO, indemnity, risk pool, HSA, IPA, …
  • What are the payment systems?
How would you solve for octahedral splitting? Absorbed wavelength Octahedral splitting Formula (nm) (kJ/mol) 533 ?...
How would you solve for octahedral splitting?
Absorbed wavelength (nm)   Octahedral splitting (kJ/mol)   Formula
533                        ?                               [Co(NH3)5Cl]Cl2
483                        ?                               [Co(NH3)6]Cl3
510                        ?                               [Co(NH3)5CO3]NO3
For this problem, use the e1-p1.csv dataset. Using the decision tree algorithm that we discussed in...
For this problem, use the e1-p1.csv dataset. Using the decision tree algorithm that we discussed in the class, determine which attribute is the best attribute at the root level. You should not use Weka, JMP Pro, or any other data mining/machine learning software. You must show all intermediate results and calculations. For this problem, use the e1-p1.csv dataset. Using the decision tree algorithm that we discussed in the class, determine which attribute is the best attribute at the root level....
The following code will generate a Decision Tree. You need to run the code and explain...
The following code will generate a Decision Tree. You need to run the code and explain the tree. After you get the tree, you need to explain how it is drawn that way.
install.packages("rpart.plot") # install package rpart.plot
##########################################
# section 7.1.1 Overview of a Decision Tree
##########################################
library("rpart")
library("rpart.plot")
# Read the data
setwd("c:/data/")
banktrain <- read.table("bank-sample-test.csv", header=TRUE, sep=",")
## drop a few columns to simplify the tree
drops <- c("age", "balance", "day", "campaign", "pdays", "previous", "month")
banktrain <- banktrain[, !(names(banktrain) %in% drops)]...
With regard to the commercial lending decision tree, how does the UCA cash flow analysis improve...
With regard to the commercial lending decision tree, how does the UCA cash flow analysis improve the process?
Can you provide an example on how a decision tree might be used in practice?
Can you provide an example on how a decision tree might be used in practice?
Explain how the Mini-Max algorithm is used in decision-making and game theory. Make sure to explain...
Explain how the Mini-Max algorithm is used in decision-making and game theory. Make sure to explain how this algorithm applies the utility function to get the utility values for the terminal states. Feel free to add any diagram/tree structure to represent all the possible moves that allow a game to move from one state to the next state. Also, discuss how the alpha-beta pruning approach is used for optimization.
Explain how the risk analysis decision tree can be used in conjunction with the five-step process...
Explain how the risk analysis decision tree can be used in conjunction with the five-step process for ADA. Provide an example.