Category > Management · Posted 08 Jun 2017

Classification of glass

Assignment 1

 

PART 1: 

1. Classification of glass.arff.

 

2. Run the following classifiers, with the default parameters, on this data: ZeroR, OneR, J48, and IBk, and construct a table of the training and cross-validation errors. What do you conclude from these results?
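To make the baselines concrete, here is a stdlib-only sketch (not Weka itself) of what ZeroR and OneR actually compute; the toy weather-style rows and labels below are invented for illustration:

```python
from collections import Counter

def zero_r(labels):
    """ZeroR: ignore the attributes and always predict the majority class."""
    return Counter(labels).most_common(1)[0][0]

def one_r(rows, labels):
    """OneR: for each attribute, build a one-level rule (each attribute value
    predicts its majority class) and keep the attribute with fewest errors."""
    best = None  # (errors, attribute_index, rule)
    for a in range(len(rows[0])):
        rule = {}
        for value in set(r[a] for r in rows):
            ys = [y for r, y in zip(rows, labels) if r[a] == value]
            rule[value] = Counter(ys).most_common(1)[0][0]
        errors = sum(rule[r[a]] != y for r, y in zip(rows, labels))
        if best is None or errors < best[0]:
            best = (errors, a, rule)
    return best

# Invented data: attribute 1 (temperature) happens to separate the classes.
rows   = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["no", "no", "no", "yes"]

print(zero_r(labels))            # prints "no" (the majority class)
errors, attr, rule = one_r(rows, labels)
print(attr, errors)              # prints "1 0": attribute 1, zero training errors
```

Comparing such a trivially simple rule against J48's tree is exactly the point of the error table the assignment asks for.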

 

3. Using the J48 classifier, can you find a combination of the C and M parameter values that minimizes the amount of overfitting? Include the results of your best five runs, including the parameter values, in your table of results.

 

4. Reset J48 parameters to their default values. What is the effect of lowering the number of examples in the training set?

 

5. Using the IBk classifier, can you find the value of k that minimizes the amount of overfitting? Include your runs in your table of results.
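Why k matters can be seen in a minimal nearest-neighbour sketch (plain Python, not Weka's IBk): with k = 1 every training point is its own nearest neighbour, so training error is zero while held-out error exposes the overfitting. The 1-D synthetic data below is made up:

```python
import random

def knn_predict(train, query, k):
    """Majority vote among the k nearest neighbours (1-D distance)."""
    nearest = sorted(train, key=lambda p: abs(p[0] - query))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

random.seed(0)
# Two classes drawn from Gaussians centred at 0 and 3.
train = [(random.gauss(0, 1), "a") for _ in range(30)] + \
        [(random.gauss(3, 1), "b") for _ in range(30)]
held_out = [(random.gauss(0, 1), "a") for _ in range(30)] + \
           [(random.gauss(3, 1), "b") for _ in range(30)]

for k in (1, 3, 9):
    tr = sum(knn_predict(train, x, k) != y for x, y in train) / len(train)
    ho = sum(knn_predict(train, x, k) != y for x, y in held_out) / len(held_out)
    # k=1 memorises the training set: training error 0.00, held-out error higher.
    print(f"k={k}  training error={tr:.2f}  held-out error={ho:.2f}")
```

The gap between the two columns is the overfitting the assignment asks you to minimize by tuning k.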

 

6. Try a number of other classifiers. Which one gives the best performance in terms of predictive accuracy and overfitting? Include your best five runs in your table of results.

 

7. Compare the accuracy of OneR and J48. What do you conclude?

 

8. What golden nuggets did you find, if any?

 

Submit: Up to two pages that describe what you did for each of the above questions and your results and conclusions.

 

PART B: NUMERIC PREDICTION

 

1. Numeric Prediction of cpu.with.vendor.arff 

 

2. Run the following classifiers, with default parameters, on this data: ZeroR, M5P, and IBk, and construct a table of the training and cross-validation errors. You may want to turn on “Output Predictions” to get a better sense of the magnitude of the error on each example. What do you conclude from these results?
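For numeric prediction the ZeroR baseline simply predicts the mean of the training targets. A minimal sketch with invented numbers, showing how the training MAE and RMSE that go into your error table are computed:

```python
import math

def zero_r_regression(targets):
    """ZeroR for regression: always predict the training mean."""
    return sum(targets) / len(targets)

targets = [20.0, 30.0, 25.0, 45.0, 80.0]   # hypothetical target values
pred = zero_r_regression(targets)           # 40.0 for every example

mae  = sum(abs(pred - y) for y in targets) / len(targets)
rmse = math.sqrt(sum((pred - y) ** 2 for y in targets) / len(targets))
print(pred, mae, rmse)   # 40.0, 18.0, about 21.68
```

Any classifier worth keeping in the table must beat this constant prediction on cross-validation, not just on the training set.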

 

3. Explore different parameter settings for M5P and IBk. Which values give the best performance in terms of predictive accuracy and overfitting? Include the results of the best five runs in your table of results.

 

4. Investigate three other classifiers for numeric prediction and their associated parameters. Include your best five runs in your table of results. Which classifier gives the best performance in terms of predictive accuracy and overfitting?

 

Submit: Up to one page that describes what you did for each of the above questions and your results and conclusions.

 

PART 3: CLUSTERING 

1. Clustering of student-data.arff.

 

This file contains data for some hypothetical students.

 

2. Run the K-means clustering algorithm on this data for the following values of K: 1, 2, 3, 4, 5, 10, 20. Analyse the resulting clusters. What do you conclude? How many clusters do you think there are in the data?
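The loop Weka's SimpleKMeans runs can be sketched in a few lines of plain Python: assign each point to its nearest centre, recompute each centre as its cluster's mean, repeat until nothing moves. The 1-D data and K below are made up:

```python
import random

def kmeans(points, k, seed=0, iters=100):
    """Plain K-means on 1-D points; returns the sorted final centres."""
    rng = random.Random(seed)          # the seed fixes the initial centres
    centres = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centres[c]))
            clusters[i].append(p)
        new = [sum(c) / len(c) if c else centres[i]
               for i, c in enumerate(clusters)]
        if new == centres:             # converged: assignments stopped changing
            break
        centres = new
    return sorted(centres)

points = [1.0, 1.2, 0.8, 5.0, 5.1, 4.9]   # two obvious groups
print(kmeans(points, 2))                   # centres near 1.0 and 5.0
```

Because the initial centres come from the random seed, rerunning with different seeds (as item 3 asks) can land K-means in different local optima, especially for poorly chosen K.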

 

3. Choose a value of K and run the algorithm with different seeds. What is the effect of changing the seed?

 

4. Run the EM algorithm on this data with the default parameters and analyse the output. Give an English-language description of the clusters.
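Weka's EM clusterer fits a Gaussian mixture by expectation-maximisation, giving each point a soft membership in every cluster rather than a hard assignment. As a stdlib-only sketch (not Weka's implementation), here is EM for a two-component 1-D mixture on invented data; the floor on the standard deviation plays the role of a minimum-standard-deviation parameter:

```python
import math

def gauss_pdf(x, mu, sd):
    return math.exp(-((x - mu) ** 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def em(xs, mu, sd=1.0, iters=50):
    """EM for a two-component 1-D Gaussian mixture; returns the sorted means."""
    mu, sds, ws = list(mu), [sd, sd], [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            p = [ws[j] * gauss_pdf(x, mu[j], sds[j]) for j in (0, 1)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means and standard deviations.
        for j in (0, 1):
            nj = sum(r[j] for r in resp)
            ws[j] = nj / len(xs)
            mu[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, xs)) / nj
            sds[j] = max(math.sqrt(var), 1e-6)   # floor the std dev
    return sorted(mu)

xs = [0.9, 1.1, 1.0, 4.9, 5.1, 5.0]     # two invented groups
print(em(xs, mu=[0.0, 6.0]))            # means converge near 1.0 and 5.0
```

The initial means here play the role of the random seed: different starts can converge to different local optima, which is what items 5 and 6 probe.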

 

5. Run the algorithm with different seeds. What is the effect of changing the seed?

6. Explore the effect of changing the standard deviation parameter. Carry out runs with values from 100 down to 1E-10. What do you conclude?

 

7. Compare the use of K-means and EM for clustering tasks. Which do you think is best? Why?

 

8. What golden nuggets did you find, if any?

 

Submit: Up to one page that describes what you did for each of the above questions and your results and conclusions.

 

PART 4: ASSOCIATION FINDING 

 

1. The files supermarket1.arff and supermarket2.arff contain the same details of shopping transactions represented in two different ways. You can use a text viewer to look at the files.

 

2. What is the difference in representations?

 

3. Load the file supermarket1.arff into Weka and run the Apriori algorithm on this data. You will need to restrict the number of attributes and/or the number of examples. What significant associations can you find?
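The frequent-itemset counting underlying Apriori is simple to state: an itemset's support is the fraction of transactions containing it. A stdlib sketch (not Weka's implementation) on made-up baskets:

```python
from itertools import combinations

# Invented transactions for illustration.
baskets = [
    {"bread", "milk"},
    {"bread", "milk", "butter"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]

def support(itemset):
    """Fraction of baskets that contain every item in the itemset."""
    return sum(itemset <= b for b in baskets) / len(baskets)

min_support = 0.6
items = sorted({i for b in baskets for i in b})
for size in (1, 2):
    for itemset in combinations(items, size):
        s = support(set(itemset))
        if s >= min_support:
            print(itemset, round(s, 2))   # all singletons at 0.8, all pairs at 0.6
```

Apriori's trick, which this brute-force version skips, is that only frequent itemsets can have frequent supersets, so candidate generation can be pruned level by level.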

 

4. Explore different possibilities of the metric type and associated parameters. What do you find?
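The selectable metric types reduce to simple ratios over transaction counts. A tiny worked example with invented counts for a hypothetical rule {bread} -> {milk}:

```python
# Hypothetical counts, chosen only to make the arithmetic visible.
n       = 5   # total transactions
n_bread = 4   # transactions containing bread
n_milk  = 4   # transactions containing milk
n_both  = 3   # transactions containing both

support    = n_both / n                 # 0.6: how often the rule applies at all
confidence = n_both / n_bread           # 0.75: P(milk | bread)
lift       = confidence / (n_milk / n)  # ~0.94: below 1, so bread slightly
                                        # *discourages* milk here
print(support, confidence, lift)
```

Ranking rules by confidence versus lift can reorder them substantially, which is why exploring the metric type is worthwhile.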

 

[Source: Data Warehousing and Data Mining, 14-Mar-2012]

5. Load the file supermarket2.arff into Weka and run the Apriori algorithm on this data. What do you find?

 

6. Explore different possibilities of the metric type and associated parameters. What do you find?

 

7. Try the other associators. What are the differences to Apriori?

 

8. What golden nuggets did you find, if any?

 

Submit: Up to one page that describes what you did for each of the above questions and your results and conclusions.
