Homework 4 QA

6 pages · Uploaded on 07-01-2023 · Written in 2021/2022

The paper provides an in-depth understanding of binary classification, decision trees, probability, classifiers, and categorical variables. It also explains autocorrelation, autocovariance, and logistic regression.


Running head: COMPUTER SCIENCE




Homework 4 QA

Name

Institution



Homework 4 QA

1. For a binary classification, describe the possible values of entropy. Under what conditions does entropy reach its minimum and maximum values?


For binary classification, entropy can take any value in the interval [0, 1] (using log base 2), not only the two endpoints. A value of 0 indicates a perfectly pure node, whereas a value of 1 signifies maximum uncertainty. Entropy reaches its maximum value when the random variable is uniformly distributed (both classes equally likely, p = 0.5), and reaches its minimum value when the random variable is constant (all examples in one class).
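The relationship above can be sketched with a small helper function (a minimal illustration, not part of the original paper):

```python
import math

def binary_entropy(p):
    """Entropy (base 2) of a binary variable with P(class 1) = p."""
    if p == 0.0 or p == 1.0:
        return 0.0  # constant variable: minimum entropy
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Minimum at a pure node, maximum at the uniform distribution:
print(binary_entropy(1.0))  # 0.0
print(binary_entropy(0.5))  # 1.0
```

Intermediate probabilities give intermediate values, e.g. binary_entropy(0.9) is roughly 0.469, which is why entropy is a useful continuous measure of node impurity.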

2. In a decision tree, how does the algorithm pick the attributes for splitting?


Splitting means dividing a node into sub-nodes. According to Rajesh (2018), algorithms pick the attribute for splitting by testing the statistical significance of the differences between the sub-nodes and the parent node (for example, with a chi-square test), by calculating the entropy of each candidate split and comparing it with the entropy of the parent node (information gain), or by calculating the variance reduction for each node. The attribute whose split produces the lowest impurity relative to the parent node is chosen.
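The entropy-based criterion described above can be sketched as an information-gain calculation; this toy example (not from the original paper) compares the parent node's entropy with the weighted entropy of its sub-nodes:

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, splits):
    """Parent entropy minus the size-weighted entropy of the sub-nodes."""
    n = len(parent_labels)
    weighted = sum(len(s) / n * entropy(s) for s in splits)
    return entropy(parent_labels) - weighted

# Toy data: this split produces two pure sub-nodes, so the gain
# equals the full parent entropy (1 bit).
parent = ["yes", "yes", "no", "no"]
print(information_gain(parent, [["yes", "yes"], ["no", "no"]]))  # 1.0
```

A splitting algorithm would compute this gain for every candidate attribute and pick the one with the largest gain, i.e. the split whose sub-nodes have the lowest impurity compared to the parent.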

3. John went to see the doctor about a severe headache. The doctor selected John at random to have a blood test for swine flu, which is suspected to affect 1 in 5,000 people in this country. The test is 99% accurate, in the sense that the probability of a false positive is 1%. The probability of a false negative is zero. John's test came back positive. What is the probability that John has swine flu?


Let P(D) be the probability that John has swine flu, so P(D) = 1/5,000 = 0.0002, and let P(T) be the probability of a positive test. Since the false-negative rate is zero, P(T|D) = 1, and the false-positive rate gives P(T|ND) = 0.01.

According to Bayes' theorem:

P(D|T) = P(T|D) P(D) / P(T)

P(D|T) = P(T|D) P(D) / (P(T|D) P(D) + P(T|ND) P(ND))

P(D|T) = (1 × 0.0002) / (1 × 0.0002 + 0.01 × 0.9998) ≈ 0.0196

So despite the positive test, the probability that John has swine flu is only about 2%.
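The Bayes calculation can be checked numerically with the figures stated in the question (prevalence 1 in 5,000, false-positive rate 1%, false-negative rate zero):

```python
p_d = 1 / 5000        # P(D): prior probability of swine flu
p_t_given_d = 1.0     # P(T|D): no false negatives
p_t_given_nd = 0.01   # P(T|ND): false-positive rate

# Total probability of a positive test, P(T):
p_t = p_t_given_d * p_d + p_t_given_nd * (1 - p_d)

# Bayes' theorem: P(D|T) = P(T|D) P(D) / P(T)
p_d_given_t = p_t_given_d * p_d / p_t

print(round(p_d_given_t, 4))  # 0.0196
```

The posterior is small because the disease is rare: almost all positive tests come from the 1% false positives among the healthy 4,999 out of every 5,000 people.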
