GEORGIA INSTITUTE OF TECHNOLOGY CS 6601, DECISION TREES, BAYES NETS (2020/2021)

Pages
7
Grade
A+
Uploaded on
08-11-2022
Written in
2020/2021



Document information

Type
Exam (elaborations)
Contains
Questions & answers
Content preview

import numpy as np
import math
from random import randint
from collections import Counter


class DecisionNode():
    """Class to represent a single node in
    a decision tree."""

    def __init__(self, left, right, decision_function, class_label=None):
        """Create a node with a left child, right child,
        decision function and optional class label
        for leaf nodes."""
        self.left = left
        self.right = right
        self.decision_function = decision_function
        self.class_label = class_label

    def decide(self, feature):
        """Return the label if node is a leaf,
        or pass the decision down to the node's
        left/right child (depending on decision
        function)."""
        if self.class_label is not None:
            return self.class_label
        elif self.decision_function(feature):
            return self.left.decide(feature)
        else:
            return self.right.decide(feature)


def build_decision_tree():
    """Create a decision tree
    capable of handling the provided
    data."""
    A3_decision_func = lambda feature: feature[2] == 1
    A3_node = DecisionNode(DecisionNode(None, None, None, 0),
                           DecisionNode(None, None, None, 1),
                           A3_decision_func, None)

    A2_decision_func = lambda feature: feature[1] == 1
    A2_node = DecisionNode(DecisionNode(None, None, None, 0),
                           DecisionNode(None, None, None, 1),
                           A2_decision_func, None)

    A4_decision_func = lambda feature: feature[3] == 1
    A4_node = DecisionNode(A2_node, A3_node, A4_decision_func, None)

    # Root should be A1
    A1_decision_func = lambda feature: feature[0] == 1
    decision_tree_root = DecisionNode(DecisionNode(None, None, None, 1),
                                      A4_node, A1_decision_func, None)

    return decision_tree_root
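As a quick illustration of how `decide` routes a feature vector, here is a minimal, self-contained sketch: the node class is reproduced from above, and the two-leaf tree is a hypothetical example, not part of the assignment tree.

```python
class DecisionNode:
    """Minimal copy of the node class above, so this snippet runs on its own."""

    def __init__(self, left, right, decision_function, class_label=None):
        self.left = left
        self.right = right
        self.decision_function = decision_function
        self.class_label = class_label

    def decide(self, feature):
        # Leaf: return its label. Otherwise recurse left on True, right on False.
        if self.class_label is not None:
            return self.class_label
        elif self.decision_function(feature):
            return self.left.decide(feature)
        else:
            return self.right.decide(feature)


# Hypothetical one-split tree: predict class 1 when feature[0] == 1, else 0.
root = DecisionNode(DecisionNode(None, None, None, 1),
                    DecisionNode(None, None, None, 0),
                    lambda feature: feature[0] == 1)

print(root.decide([1, 0]))  # 1
print(root.decide([0, 0]))  # 0
```

The recursion bottoms out at whichever leaf the chain of decision functions selects, so classification cost is the depth of the path, not the size of the tree.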

def confusion_matrix(classifier_output, true_labels):
    true_positive = 0.0
    false_negative = 0.0
    false_positive = 0.0
    true_negative = 0.0

    for i in range(len(classifier_output)):
        if true_labels[i] == 1:
            if true_labels[i] == classifier_output[i]:
                true_positive += 1.0
            else:
                false_negative += 1.0
        else:
            if true_labels[i] == classifier_output[i]:
                true_negative += 1.0
            else:
                false_positive += 1.0

    return [[true_positive, false_negative], [false_positive, true_negative]]
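To sanity-check the `[[TP, FN], [FP, TN]]` layout, a self-contained copy of the same counting logic can be run on a made-up five-example batch (the inputs below are illustrative, not assignment data):

```python
def confusion_matrix(classifier_output, true_labels):
    # Same layout as above: [[TP, FN], [FP, TN]].
    tp = fn = fp = tn = 0.0
    for predicted, actual in zip(classifier_output, true_labels):
        if actual == 1:
            if predicted == 1:
                tp += 1.0
            else:
                fn += 1.0
        else:
            if predicted == 0:
                tn += 1.0
            else:
                fp += 1.0
    return [[tp, fn], [fp, tn]]


output = [1, 1, 0, 0, 1]  # hypothetical classifier predictions
truth  = [1, 0, 0, 1, 1]  # hypothetical ground truth
print(confusion_matrix(output, truth))  # [[2.0, 1.0], [1.0, 1.0]]
```

Note the row index is the true class and the column index is whether the prediction agreed, which is why precision below reads down a column and recall reads across a row.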

def precision(classifier_output, true_labels):
    # true_positive / (true_positive + false_positive)
    matrix = confusion_matrix(classifier_output, true_labels)
    return matrix[0][0] / (matrix[0][0] + matrix[1][0])


def recall(classifier_output, true_labels):
    # true_positive / (true_positive + false_negative)
    matrix = confusion_matrix(classifier_output, true_labels)
    return matrix[0][0] / (matrix[0][0] + matrix[0][1])


def accuracy(classifier_output, true_labels):
    # correct_classifications / total_number_examples
    matrix = confusion_matrix(classifier_output, true_labels)
    return (matrix[0][0] + matrix[1][1]) / len(true_labels)
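Given the `[[TP, FN], [FP, TN]]` layout, all three metrics fall out directly from the four counts. A small worked example (the matrix values are hypothetical):

```python
matrix = [[2.0, 1.0], [1.0, 1.0]]  # hypothetical [[TP, FN], [FP, TN]] over 5 examples
tp, fn = matrix[0]
fp, tn = matrix[1]

precision = tp / (tp + fp)                  # of everything flagged positive, the fraction that was right
recall = tp / (tp + fn)                     # of all actual positives, the fraction that was found
accuracy = (tp + tn) / (tp + fn + fp + tn)  # overall fraction classified correctly

print(precision, recall, accuracy)
```

Precision and recall here both come out to 2/3 and accuracy to 3/5, which matches reading the same matrix through the functions above.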

def entropy(class_vector):
    """Compute the entropy for a list
    of classes (given as either 0 or 1)."""
    # Because the classes are 0 and 1 we can reduce the entropy function to
    # entropy: -(q*log2(q) + (1-q)*log2(1-q)) where q = p/(p+n)
    positive_count = class_vector.count(1)
    negative_count = class_vector.count(0)

    # Check for a pure split first, so we never divide by zero or take log(0).
    if positive_count == 0 or negative_count == 0:
        return 0.0

    q = float(positive_count) / (positive_count + negative_count)
    return -(q * math.log(q, 2) + (1 - q) * math.log(1 - q, 2))
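Two boundary cases are worth checking: a 50/50 class mix gives the maximum entropy of 1 bit, and a pure vector gives 0. A standalone copy of the function (with the pure-split check before the division) demonstrates both:

```python
import math


def entropy(class_vector):
    # Binary entropy: -(q*log2(q) + (1-q)*log2(1-q)) where q = p/(p+n).
    p = class_vector.count(1)
    n = class_vector.count(0)
    if p == 0 or n == 0:
        return 0.0  # pure split: no uncertainty, and avoids log(0)
    q = float(p) / (p + n)
    return -(q * math.log(q, 2) + (1 - q) * math.log(1 - q, 2))


print(entropy([1, 1, 0, 0]))  # 1.0 -- maximally mixed
print(entropy([1, 1, 1, 1]))  # 0.0 -- pure
```

Entropy is symmetric in the two classes, so `entropy([1, 0, 0, 0])` and `entropy([0, 1, 1, 1])` give the same value.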

def information_gain(previous_classes, current_classes):
    """Compute the information gain between the
    previous and current classes (each
    a list of 0 and 1 values)."""
    previous_entropy = entropy(previous_classes)
    total_example_count = len(previous_classes)

    remainder = 0
    for split in current_classes:
        remainder += (float(len(split)) / total_example_count) * entropy(split)

    return previous_entropy - remainder
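A quick check under assumed inputs: splitting a perfectly mixed parent into two pure children should recover the full 1 bit of entropy, while a split that changes nothing should gain 0. A self-contained sketch:

```python
import math


def entropy(class_vector):
    # Same binary entropy as above.
    p, n = class_vector.count(1), class_vector.count(0)
    if p == 0 or n == 0:
        return 0.0
    q = float(p) / (p + n)
    return -(q * math.log(q, 2) + (1 - q) * math.log(1 - q, 2))


def information_gain(previous_classes, current_classes):
    # Parent entropy minus the size-weighted entropy of the child splits.
    remainder = sum(
        (float(len(split)) / len(previous_classes)) * entropy(split)
        for split in current_classes)
    return entropy(previous_classes) - remainder


# Perfect split: two pure children recover the full bit of parent entropy.
print(information_gain([1, 1, 0, 0], [[1, 1], [0, 0]]))  # 1.0
# Useless split: each child is as mixed as the parent, so no gain.
print(information_gain([1, 1, 0, 0], [[1, 0], [1, 0]]))  # 0.0
```

Tree-building greedily picks the attribute whose split maximizes this quantity at each node.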

class DecisionTree():
    """Class for automatic tree-building
    and classification."""

    def __init__(self, depth_limit=200):
        """Create a decision tree with an empty root
        and the specified depth limit."""
        self.root = None
        self.depth_limit = depth_limit  # store the limit so tree-building can enforce it

This study source was downloaded by 100000850306617 from CourseHero.com on 11-08-2022 02:14:34 GMT -06:00


https://www.coursehero.com/file/24602714/decision-treespy/