Question

Question 5 [3 pts]: In the dataset shown in Table 1, use the Gini index to measure the correlation of each of the four attributes (outlook, temperature, humidity, wind) with the class label [2 pts]. Then rank the attributes and select the most important one to build the root node of the decision tree [1 pt].
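For reference, the standard definitions used below (stated here for convenience; Table 1 itself is not reproduced in this transcription):

  Gini(D)   = 1 - Σ_i p_i²                     where p_i is the fraction of records in D belonging to class i
  Gini_A(D) = Σ_v (|D_v| / |D|) · Gini(D_v)    where D_v is the subset of records with attribute A = v

The attribute with the smallest weighted Gini, Gini_A(D) (equivalently, the largest reduction Gini(D) - Gini_A(D)), is the best candidate for the root node.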

Answer #1

Gini index calculation (transcribed from handwritten work; several values are illegible):
• Gini of the whole dataset: Gini(D) ≈ 0.48.
• Outlook: consider the possible groupings of {Sunny, Overcast, Rain} into subsets and compute the weighted Gini for each split; the best grouping works out to roughly 0.36 + 0.10 ≈ 0.46.
• Temperature: group {Hot, Mild, Cool} into two subsets (e.g. {Hot} vs {Mild, Cool}, {Hot, Mild} vs {Cool}); the groupings give a weighted Gini of about 0.47.
• Wind: split on Weak vs Strong and compute the weighted Gini the same way.
• Humidity: split on its two values and compute the weighted Gini the same way.
• Rank the attributes by weighted Gini, lowest first; the top-ranked attribute is chosen as the root node. (The final values for wind and humidity, and the concluding ranking, are not legible in the transcription.)
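Table 1 is not reproduced above, so the sketch below uses the classic 14-row play-tennis weather data purely as a placeholder; replace rows with the actual Table 1 values to reproduce the numbers in the handwritten answer. It illustrates the mechanics: compute Gini(D), compute the weighted Gini of a split on each attribute (a simple multiway split here, whereas the handwritten work enumerates binary groupings of the attribute values), and rank the attributes by weighted Gini, lowest first.

from collections import Counter

# Placeholder data (assumption: Table 1 is not visible in this transcription).
# Each row is (outlook, temperature, humidity, wind, class).
rows = [
    ("Sunny",    "Hot",  "High",   "Weak",   "No"),
    ("Sunny",    "Hot",  "High",   "Strong", "No"),
    ("Overcast", "Hot",  "High",   "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny",    "Mild", "High",   "Weak",   "No"),
    ("Sunny",    "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "Normal", "Weak",   "Yes"),
    ("Sunny",    "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High",   "Strong", "Yes"),
    ("Overcast", "Hot",  "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Strong", "No"),
]
attributes = ["outlook", "temperature", "humidity", "wind"]

def gini(labels):
    # Gini impurity of a list of class labels: 1 - sum(p_i^2).
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def weighted_gini(rows, attr_index):
    # Weighted Gini after a multiway split on one attribute.
    groups = {}
    for r in rows:
        groups.setdefault(r[attr_index], []).append(r[-1])
    n = len(rows)
    return sum(len(g) / n * gini(g) for g in groups.values())

labels = [r[-1] for r in rows]
print(f"Gini(D) = {gini(labels):.3f}")
for i, name in enumerate(attributes):
    print(f"{name:12s} weighted Gini = {weighted_gini(rows, i):.3f}")
# The attribute with the lowest weighted Gini (largest drop from Gini(D))
# is ranked first and becomes the root of the decision tree.

With this placeholder data the script prints Gini(D) ≈ 0.459 and ranks outlook lowest (≈ 0.343); the real Table 1 numbers, such as the Gini(D) ≈ 0.48 in the handwritten work, will differ, but the selection rule is the same.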

