Question 5 [3 pts]: In the dataset shown in Table 1, use the Gini Index to...
Question 3 [5 pts]: Given the following six instances, each with five attributes (Outlook, Temperature, Humidity, Wind, Day) and one class label:
• Calculate the entropy of the whole system. [1 pt]
• Calculate the information gain for attribute "Outlook". [1 pt]
• Calculate the Gini index for attribute "Outlook". [1 pt]
• What are the information gain and Gini index for attribute "Day"? [1 pt]
• Explain why "Day" is NOT a good feature to use as the root node of a decision tree. How can you avoid using "Day" as the root node when creating the tree? ...
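Since the instance table itself is not reproduced here, the sketch below uses a made-up six-instance dataset (the `outlook`, `day`, and `label` values are placeholders, not the original table) to show how each quantity in the bullets above could be computed, and why a unique-per-row attribute like "Day" gets maximal information gain:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H = -sum p_i * log2(p_i) over the class frequencies in `labels`."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini index G = 1 - sum p_i^2 over the class frequencies in `labels`."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_stats(values, labels):
    """Weighted entropy and weighted Gini after splitting on an attribute."""
    n = len(labels)
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(v, []).append(y)
    w_entropy = sum(len(g) / n * entropy(g) for g in groups.values())
    w_gini = sum(len(g) / n * gini(g) for g in groups.values())
    return w_entropy, w_gini

# Hypothetical six-instance dataset (the actual Table is not shown in the text).
outlook = ["sunny", "sunny", "rain", "rain", "overcast", "overcast"]
day     = ["d1", "d2", "d3", "d4", "d5", "d6"]   # unique value per instance
label   = ["no", "no", "yes", "yes", "yes", "no"]

h_all = entropy(label)                    # entropy of the whole system
h_out, g_out = split_stats(outlook, label)
ig_outlook = h_all - h_out                # information gain for "Outlook"
h_day, g_day = split_stats(day, label)
ig_day = h_all - h_day                    # "Day" yields singleton partitions,
                                          # so ig_day == h_all (maximal gain)
```

Because every "Day" value occurs exactly once, each partition is pure, so its gain equals the full system entropy even though the attribute cannot generalize; gain-ratio or simply excluding identifier-like attributes are the usual remedies.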
1. Decision trees
As part of this question you will implement and compare the Information Gain, Gini Index, and CART evaluation measures for splits in decision tree construction. Let D = {(x_i, y_i)}, |D| = n, be a dataset with n samples. The entropy of the dataset is defined as H(D) = -Σ_i P(c_i|D) log2 P(c_i|D), where P(c_i|D) is the fraction of samples in class c_i. A split on an attribute of the form X_j ≤ c partitions the dataset into two subsets D_Y and D_N based on...
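A minimal sketch of the three split measures on a numeric attribute is shown below. The function name `evaluate_split` is invented here, and the CART measure is assumed to take the common form 2·(|D_Y|/n)·(|D_N|/n)·Σ_i |P(c_i|D_Y) − P(c_i|D_N)|, since the text is truncated before defining it:

```python
from collections import Counter
from math import log2

def class_probs(labels):
    """Fraction of samples in each class."""
    n = len(labels)
    return {c: k / n for c, k in Counter(labels).items()}

def entropy(labels):
    return -sum(p * log2(p) for p in class_probs(labels).values())

def gini(labels):
    return 1.0 - sum(p * p for p in class_probs(labels).values())

def evaluate_split(xs, ys, c):
    """Score the binary split X <= c with the three measures.

    Returns (information_gain, weighted_gini, cart), where higher gain,
    lower weighted Gini, and higher CART all indicate a better split.
    """
    n = len(ys)
    d_y = [y for x, y in zip(xs, ys) if x <= c]   # D_Y: X <= c
    d_n = [y for x, y in zip(xs, ys) if x > c]    # D_N: X > c
    ny, nn = len(d_y), len(d_n)
    gain = entropy(ys) - (ny / n) * entropy(d_y) - (nn / n) * entropy(d_n)
    w_gini = (ny / n) * gini(d_y) + (nn / n) * gini(d_n)
    # Assumed CART form: 2 * P(D_Y) * P(D_N) * sum_i |P(c_i|D_Y) - P(c_i|D_N)|
    py, pn = class_probs(d_y), class_probs(d_n)
    cart = 2 * (ny / n) * (nn / n) * sum(
        abs(py.get(k, 0.0) - pn.get(k, 0.0)) for k in set(py) | set(pn)
    )
    return gain, w_gini, cart
```

For a perfectly separating split (e.g. `xs = [1, 2, 3, 4]`, `ys = ["a", "a", "b", "b"]`, `c = 2`) the gain reaches the full entropy, the weighted Gini drops to 0, and the CART score reaches its maximum for a balanced split.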