AccountingQueen

(3)

$16 per page / Negotiable

About AccountingQueen

Levels Taught:
Elementary, Middle School, High School, College, University, PhD

Expertise:
Accounting, Algebra, Applied Sciences, Architecture and Design, Art & Design, Biology, Business & Finance, Calculus, Chemistry, Communications, Computer Science, Economics, Engineering, English, Environmental Science, Essay Writing, Film, Foreign Languages, Geography, Geology, Geometry, Health & Medical, History, HR Management, Information Systems, Law, Literature, Management, Marketing, Math, Numerical Analysis, Philosophy, Physics, Precalculus, Political Science, Psychology, Programming, Science, Social Science, Statistics
Teaching Since: Jul 2017
Last Sign in: 270 Weeks Ago
Questions Answered: 5502
Tutorials Posted: 5501

Education

  • MBA, Graduate Psychology, PhD in HRM
    Strayer, Phoenix, University of California
    Feb-1999 - Mar-2006

Experience

  • PR Manager
    LSGH LLC
    Apr-2003 - Apr-2007

Category > Physics Posted 05 Sep 2017 My Price 8.00

The question/decision rule

Question 2 20 pts
Suppose we fit a classification tree to a given data set and construct the following tree: The question/decision rule corresponding to the first split is, "If X1<12.89 is true then go left, otherwise go right." If we go left, we end at a leaf node with 6123 training observations in class 0 and 88 training observations in class 1. Hence new observations that end at this leaf node would be predicted to be in class 0 with posterior probability 6123/(6123+88) = 0.98583159. If we go right we have another question/decision rule, "If X2<12.79 is true then go left, otherwise go right." And so on. Match the following data observations with their predicted class and posterior probabilities based on this decision tree:
(X1, X2, X3, X4, X5) = (13, 14, -0.3, 0.9, 0.2) [ Choose ]
(X1, X2, X3, X4, X5) = (14, 14, 0.1, 0.1, 0.1) [ Choose ]
(X1, X2, X3, X4, X5) = (13, 14, 0.3, 1.1, 0.1) [ Choose ]

Question 3 10 pts
For the scenario in Question 2, which one of the following describes the characteristics of an observation that has predicted class 0 and posterior probability 44/47 = 0.936?
a) X1 < 12.89, X2 < 12.79, and X4 < 0.1449
b) X1 < 12.89, X2 ≥ 12.79, and 0.1449 ≤ X4 < 0.1809
c) X1 ≥ 12.89, X2 < 12.79, and 0.1449 ≤ X4 < 0.1809
d) X1 ≥ 12.89, X2 ≥ 12.79, and 0.1449 ≤ X4 < 0.1809
e) X1 ≥ 12.89, X2 ≥ 12.79, and X4 ≥ 0.1809

Question 4 10 pts
For the scenario in Question 2, consider the bottom-right question/decision rule, "X1 ≥ 13.32," which, if true, leads to a leaf node with 33 training observations in class 0 and 7 in class 1 and, if false, leads to a leaf node with 6 training observations in class 0 and 19 in class 1. Suppose an alternative question/decision rule existed at this node which, if true, leads to a leaf node with 34 training observations in class 0 and 11 in class 1 and, if false, leads to a leaf node with 5 training observations in class 0 and 15 in class 1. True or false: this alternative question/decision rule is preferable to the original one. (Remember that the idea behind classification trees is to create "purer" child nodes.)
True
False

Question 5 10 pts
For the scenario in Question 2, X5 is not used at any of the splits. What single feature of classification trees does this best illustrate?
a) Robustness to outliers and misclassified observations in the training set.
b) Invariance under monotone transformations of quantitative predictors.
c) Automatic dimension reduction (variable selection).
d) Ease of interpretation via simple decision rules.
e) Provision of estimates of misclassification rates for test observations.
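The leaf-posterior arithmetic in Question 2 and the purity comparison in Question 4 can both be checked numerically. The sketch below is illustrative only, not the course's official solution: the question never names an impurity measure, so the Gini index used here is an assumption (it is the standard choice in CART-style trees).

```python
def leaf_posterior(n0, n1):
    """Posterior probability of the majority class at a leaf
    with n0 class-0 and n1 class-1 training observations."""
    return max(n0, n1) / (n0 + n1)

def gini(n0, n1):
    """Gini impurity of a node: 1 - p0^2 - p1^2."""
    n = n0 + n1
    p0, p1 = n0 / n, n1 / n
    return 1.0 - p0**2 - p1**2

def weighted_gini(left, right):
    """Impurity of a split: size-weighted average of the two child impurities.
    Lower means purer children."""
    n_left, n_right = sum(left), sum(right)
    n = n_left + n_right
    return (n_left * gini(*left) + n_right * gini(*right)) / n

# Question 2: the left-most leaf has 6123 class-0 and 88 class-1 observations.
print(leaf_posterior(6123, 88))   # 6123/6211, about 0.98583159 as stated

# Question 4: original split (33,7)/(6,19) vs. alternative (34,11)/(5,15).
original    = weighted_gini((33, 7), (6, 19))
alternative = weighted_gini((34, 11), (5, 15))
print(original, alternative)
```

Comparing the two printed impurities (and remembering that lower weighted impurity means purer children) answers the true/false in Question 4 directly.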

Answers

(3)
Status NEW Posted 05 Sep 2017 08:09 AM My Price 8.00

Hello Sir/Madam, thank you for using our website and acquisition of my posted solution. Please ping me on chat, I am online, or inbox me a message, I will

Not Rated (0)