
# Model Assessment - PowerPoint PPT Presentation


## PowerPoint Slideshow about 'Model Assessment' - omer

## Presentation Transcript

Chapter 7: Model Assessment

Assessment Types

The Model Comparison tool provides:

- Summary statistics (for example, C, KS, ASE)
- Statistical graphics

Summary Statistics Summary

| Prediction Type | Statistic |
| --- | --- |
| Decisions | Accuracy / Misclassification, Profit / Loss, KS-statistic |
| Rankings (1, 2, 3, …) | ROC Index (concordance), Gini coefficient |
| Estimates (p̂ ≈ E(Y)) | Average squared error, SBC / Likelihood |

• This demonstration illustrates the use of the Model Comparison tool, which collects assessment information from attached modeling nodes and enables you to easily compare model performance measures.
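As a rough illustration of the kind of summary statistics being compared, here is a minimal Python sketch (this is illustrative code, not SAS Enterprise Miner output; the outcomes `y` and predicted probabilities `p` are hypothetical):

```python
# Three of the summary statistics from the table above, computed for a
# small set of hypothetical validation cases. y holds true outcomes
# (1 = primary, 0 = secondary); p holds a model's predicted probabilities.

def average_squared_error(y, p):
    """ASE: mean of (y - p)^2 over all cases."""
    return sum((yi - pi) ** 2 for yi, pi in zip(y, p)) / len(y)

def misclassification_rate(y, p, cutoff=0.5):
    """Fraction of cases whose cutoff decision disagrees with y."""
    return sum((pi >= cutoff) != (yi == 1) for yi, pi in zip(y, p)) / len(y)

def roc_index(y, p):
    """C-statistic: fraction of (primary, secondary) pairs ranked
    correctly, counting ties as half a win."""
    primary = [pi for yi, pi in zip(y, p) if yi == 1]
    secondary = [pi for yi, pi in zip(y, p) if yi == 0]
    wins = sum((a > b) + 0.5 * (a == b) for a in primary for b in secondary)
    return wins / (len(primary) * len(secondary))

y = [1, 0, 1, 0, 1, 0]
p = [0.9, 0.2, 0.7, 0.4, 0.6, 0.3]
print(average_squared_error(y, p))   # lower is better
print(misclassification_rate(y, p))  # lower is better
print(roc_index(y, p))               # 1.0 = perfect ranking
```

Computing the same statistics for each candidate model on the same validation data is what makes the side-by-side comparison meaningful.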

Statistical Graphics Summary

Statistical graphics apply to rankings (1, 2, 3, …) and estimates (p̂ ≈ E(Y)):

- Sensitivity charts
- Response rate charts

Statistical Graphics – Prediction Ranks

On the validation data, an ROC curve is built from the prediction ranks:

1. Select the top n% of cases (for example, the top 40%).
2. Count the fraction of primary outcome cases in the selection; this is the sensitivity.
3. Count the fraction of secondary outcome cases in the selection; this is the false positive fraction (1 − specificity).
4. Repeat for all selection fractions, plotting sensitivity against the false positive fraction.

The area under the resulting curve is the ROC Index (c-statistic); a value of 0.5 corresponds to a model that ranks cases no better than chance.
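The construction above can be sketched in a few lines of Python (a hypothetical example, not the tool's implementation): sort cases by predicted probability, then grow the selection one case at a time, tracking the two fractions.

```python
# Trace out the ROC curve described above for hypothetical validation data.

def roc_points(y, p):
    """Return (false_positive_fraction, sensitivity) pairs, one per
    selection depth, starting from an empty selection."""
    ranked = [yi for _, yi in sorted(zip(p, y), key=lambda t: -t[0])]
    n_primary = sum(ranked)
    n_secondary = len(ranked) - n_primary
    points, tp, fp = [(0.0, 0.0)], 0, 0
    for yi in ranked:                 # grow the selection one case at a time
        tp += (yi == 1)               # primary case captured
        fp += (yi == 0)               # secondary case captured
        points.append((fp / n_secondary, tp / n_primary))
    return points

y = [1, 0, 1, 1, 0, 0, 1, 0]
p = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
for fpf, sens in roc_points(y, p):
    print(f"false positive fraction {fpf:.2f}  sensitivity {sens:.2f}")
```

A model that ranks well pushes the curve toward the upper-left corner, giving a larger area and hence a larger ROC Index.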

Response Rate Charts

A cumulative gain chart is built the same way from the prediction ranks:

1. Select the top n% of cases (for example, the top 40%).
2. Count the fraction of cases in the selection with the primary outcome.
3. Repeat for all selection fractions, plotting cumulative gain against the percent selected (decile).
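The steps above can be sketched as follows (a hypothetical example, using cumulative gain as the fraction of all primary outcomes captured at each decile):

```python
# Cumulative gain by decile for hypothetical scored cases: rank by
# predicted probability, then report primaries captured / total primaries
# at each selection depth.

def cumulative_gain(y, p, n_bins=10):
    """Gain at each decile: primaries captured so far / total primaries."""
    ranked = [yi for _, yi in sorted(zip(p, y), key=lambda t: -t[0])]
    total_primary = sum(ranked)
    gains = []
    for b in range(1, n_bins + 1):
        depth = round(len(ranked) * b / n_bins)   # cases in the top b deciles
        gains.append(sum(ranked[:depth]) / total_primary)
    return gains

# Ten cases, four primaries, already listed in rank order.
y = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
p = [0.95, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1]
print(cumulative_gain(y, p))  # rises to 1.0 once all primaries are selected
```

A model with no ranking power yields a diagonal gain curve (selecting 40% of cases captures 40% of primaries); a useful model rises above it.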

...

• This demonstration illustrates the use of statistical graphics to compare models.

• This demonstration illustrates how to adjust for separate sampling in SAS Enterprise Miner.

Separate Sampling

The sample size is determined not by the total number of cases but by the number of cases in the least common outcome (usually the primary outcome). Cases are sampled separately from each outcome.

Example:

- Sample all primary cases.
- Match each primary case with one or more secondary cases.

Consequences:

- Similar predictive power with a smaller case count
- Assessment statistics and graphics must be adjusted
- Prediction estimates must be adjusted for bias
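The bias adjustment can be sketched with the standard prior-correction formula (this formula is a common textbook adjustment, not taken from the slides; the priors and sample proportions below are hypothetical): rescale the sampled-data probability by the ratio of population priors to sample proportions, then renormalize.

```python
# Adjust a probability estimated on a separately sampled (oversampled)
# data set back to the population scale.
# pi1  = population proportion of the primary outcome
# rho1 = proportion of the primary outcome in the training sample

def adjust_for_oversampling(p_sample, pi1, rho1):
    pi0, rho0 = 1.0 - pi1, 1.0 - rho1
    num = p_sample * pi1 / rho1                  # reweighted primary mass
    den = num + (1.0 - p_sample) * pi0 / rho0    # plus reweighted secondary mass
    return num / den

# Example: a 50/50 training sample drawn from a population with 5% primaries.
print(adjust_for_oversampling(0.5, pi1=0.05, rho1=0.5))
```

A case scored 0.5 on the balanced sample maps back to the population prior of 0.05, which is why decisions and assessment statistics computed on the raw sampled scores would be misleading without the adjustment.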


• This demonstration illustrates how to create a profit matrix.

Profit Matrix

| Outcome | Solicit | Ignore |
| --- | --- | --- |
| Primary | 14.86 | 0 |
| Secondary | -0.68 | 0 |

The solicit column gives the profit distribution for the solicit decision.

Decision Expected Profits

Expected Profit(Solicit) = 14.86 p̂₁ − 0.68 p̂₀

Expected Profit(Ignore) = 0

Decision Threshold

p̂₁ ≥ 0.68 / 15.54 → Solicit

p̂₁ < 0.68 / 15.54 → Ignore

Average Profit

Average profit = (14.86 N_PS − 0.68 N_SS) / N

where N_PS = number of solicited primary outcome cases, N_SS = number of solicited secondary outcome cases, and N = total number of assessment cases.
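These profit computations can be sketched directly (the scored cases `y`, `p` below are hypothetical; the 14.86 / −0.68 profits come from the matrix above):

```python
# Expected profit per decision, the resulting solicit threshold, and
# average profit over a small hypothetical set of scored assessment cases.

PROFIT_PRIMARY, LOSS_SECONDARY = 14.86, 0.68

def expected_profit_solicit(p1):
    """Expected Profit(Solicit) = 14.86*p1 - 0.68*p0, with p0 = 1 - p1."""
    return PROFIT_PRIMARY * p1 - LOSS_SECONDARY * (1.0 - p1)

# Solicit exactly when the expected profit is non-negative: 0.68 / 15.54.
THRESHOLD = LOSS_SECONDARY / (PROFIT_PRIMARY + LOSS_SECONDARY)

def average_profit(y, p):
    """(14.86*N_PS - 0.68*N_SS) / N over the solicited cases."""
    solicited = [yi for yi, pi in zip(y, p) if pi >= THRESHOLD]
    n_ps = sum(solicited)             # solicited primary outcome cases
    n_ss = len(solicited) - n_ps      # solicited secondary outcome cases
    return (PROFIT_PRIMARY * n_ps - LOSS_SECONDARY * n_ss) / len(y)

y = [1, 0, 0, 1, 0]
p = [0.20, 0.01, 0.10, 0.50, 0.03]
print(round(THRESHOLD, 4))        # solicit cases scoring at or above this
print(average_profit(y, p))
```

Note how small the threshold is: because a primary response is worth far more than a wasted solicitation costs, it pays to solicit even fairly unlikely responders.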

• This demonstration illustrates viewing the consequences of incorporating a profit matrix.

• This demonstration illustrates several other assessments of possible interest.

Optimizing with Profit (Optional)

• This demonstration illustrates optimizing your model strictly on profit.

• This exercise reinforces the concepts discussed previously.

Chapter 7: Model Assessment

- Model Comparison: compare model summary statistics and statistical graphics.
- Data Source: create decision data; add prior probabilities and profit matrices.
- Modeling Tools: tune models with average squared error or an appropriate profit matrix.
- StatExplore: obtain means and other statistics on data source variables.