
### Statistics between Inductive Logic and Empirical Science

3rd PROGIC Workshop, Canterbury

Jan Sprenger

University of Bonn

Tilburg Center for Logic and Philosophy of Science

Inductive Logic

- Deductive logic discerns valid, truth-preserving inferences:
  P; P → Q ⊢ Q
- Inductive logic generalizes that idea to non-truth-preserving inferences:
  P; P supports Q ⊢ (more) probably Q

Inductive Logic

- Inductive logic: truth of premises indicates truth of conclusions
- Inductive inference: objective and independent of external factors

Main concepts: confirmation, evidential support
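As a minimal illustration of confirmation in the Bayesian sense, evidence E confirms hypothesis H iff P(H | E) > P(H). The probabilities below are made-up toy numbers, not from the talk:

```python
# Confirmation: E confirms H iff P(H | E) > P(H).
# All numbers are illustrative assumptions.
p_h = 0.3             # prior probability of the hypothesis
p_e_given_h = 0.9     # likelihood of the evidence if H is true
p_e_given_not_h = 0.2 # likelihood of the evidence if H is false

# Law of total probability, then Bayes' theorem.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e

# Difference measure of evidential support: d(H, E) = P(H|E) - P(H).
support = p_h_given_e - p_h
print(round(p_h_given_e, 3), round(support, 3))
```

Here the evidence raises the probability of H, so the difference measure registers positive evidential support.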

The Logical Image of Statistics

- Statistics infers from particular data to general models
- Formal theory of inductive inference, governed by general, universally applicable principles
- Separation of statistics and decision theory (statistics summarizes data in a way that makes a decision-theoretic analysis possible)

The Logical Image of Statistics

- Contains theoretical elements (mathematics, logic) as well as empirical elements (problem-based engineering of useful methods, interaction with "real science")

Where to locate statistics on that scale?

The Logical Image of Statistics

- Pro: mathematical, "logical" character of theoretical statistics
- Pro: mechanical character of a lot of statistical practice (SPSS & Co.)
- Pro: connection between Bayesian statistics and probabilistic logic
- Con: the problems presented in this talk...

A Simple Experiment

- Five random numbers are drawn from {1, 2, ..., N} (N unknown):
  21, 4, 26, 18, 12
- What is the optimal estimate of N on the basis of the data?

That depends on the loss function!
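To see how the loss function drives the choice of estimator, here is a small simulation sketch. The two estimators, the 10:1 asymmetric loss, and all numbers are illustrative assumptions, not part of the talk: it compares the sample maximum with a bias-corrected maximum, and the ranking flips between squared and asymmetric loss.

```python
import random

def risk(estimator, N, k=5, loss=None, trials=20000, seed=1):
    """Monte Carlo estimate of an estimator's expected loss (its risk)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = [rng.randint(1, N) for _ in range(k)]
        total += loss(estimator(sample), N)
    return total / trials

# Two candidate estimators of N (illustrative choices):
sample_max = max                          # the MLE; it never overestimates N
def corrected_max(s):                     # a bias-corrected variant of the maximum
    return max(s) * (len(s) + 1) / len(s) - 1

squared = lambda est, true: (est - true) ** 2
# Asymmetric loss: overestimating is ten times as costly as underestimating.
asym = lambda est, true: 10 * (est - true) if est > true else true - est

for loss in (squared, asym):
    print(risk(sample_max, 100, loss=loss), risk(corrected_max, 100, loss=loss))
```

Under squared loss the corrected maximum has lower risk (it removes the downward bias); under the asymmetric loss the raw maximum wins, because it can never overestimate N. Which estimate is "optimal" thus depends on the loss function.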

Estimation and Loss Functions

- Aim: estimated parameter value close to true value
- Loss function measures distance between estimated and true value
- Choice of loss function sensitive to external constraints

A Bayesian approach

- Elicit prior distribution for the parameter N
- Use incoming data for updating via conditionalization
- Summarize data in a posterior distribution (credal set, etc.)
- Perform a decision-theoretic analysis
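A minimal sketch of these four steps for the number-guessing experiment above. The uniform prior and its cutoff `N_MAX` are illustrative assumptions; the talk does not prescribe a prior:

```python
data = [21, 4, 26, 18, 12]   # the five draws from the experiment above
N_MAX = 1000                 # truncation point for the prior (an assumption)

# Step 1: elicit a prior -- here uniform over 1..N_MAX.
prior = {n: 1.0 / N_MAX for n in range(1, N_MAX + 1)}

# Each draw is uniform on {1,...,N}, so P(data | N) = N^(-k)
# if every observation is <= N, and 0 otherwise.
def likelihood(n, data):
    return 0.0 if max(data) > n else n ** (-len(data))

# Step 2/3: update by conditionalization, summarize as a posterior.
unnorm = {n: p * likelihood(n, data) for n, p in prior.items()}
z = sum(unnorm.values())
posterior = {n: w / z for n, w in unnorm.items()}

# Step 4: decision-theoretic analysis -- each loss picks a different estimate.
post_mean = sum(n * p for n, p in posterior.items())   # optimal under squared loss
post_mode = max(posterior, key=posterior.get)          # optimal under 0-1 loss
cum = 0.0
for n in sorted(posterior):                            # optimal under absolute loss
    cum += posterior[n]
    if cum >= 0.5:
        post_median = n
        break

print(post_mode, post_median, round(post_mean, 1))
```

The posterior mode is the sample maximum 26, while the median and mean are larger: the same posterior yields different "optimal" estimates once the loss function enters.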

Model Selection

- True model usually "out of reach"
- Main idea: minimizing the discrepancy between the approximating and the true model
- Discrepancy can be measured in various ways (cf. the choice of a loss function): Kullback-Leibler divergence, Gauss distance, etc.
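For instance, the Kullback-Leibler divergence of a candidate model from the true distribution can be computed directly for discrete models. The die distributions below are made-up toy numbers for illustration:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions
    given as dicts; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

# "True" distribution of a loaded die vs. two candidate models (toy numbers).
true_p  = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}
uniform = {x: 1 / 6 for x in range(1, 7)}
skewed  = {1: 0.05, 2: 0.05, 3: 0.1, 4: 0.1, 5: 0.2, 6: 0.5}

# Lower divergence = candidate is "closer" to the truth in the K-L sense.
print(kl_divergence(true_p, uniform), kl_divergence(true_p, skewed))
```

Here the skewed candidate is closer to the truth than the uniform one; a different discrepancy measure could rank candidates differently, which is exactly the analogy to choosing a loss function.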

Model Selection

- Many model selection procedures focus on estimating the discrepancy between each candidate model and the true model
- Choose the model with the lowest estimated discrepancy to the true model

That is easier said than done...

Problem-specific Premises

- Asymptotic behavior
- Small or large candidate model set?
- Nested vs. non-nested models
- Linear vs. non-linear models
- Random error structure

Scientific understanding required to fix the premises!

Bayesian Model Selection

- Idea: Search for the most probable model (or the model that has the highest Bayes factor)
- Variety of Bayesian methods (BIC, intrinsic and fractional Bayes factors, ...)

Does Bayes show a way out of the problems?
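As a concrete instance of one such method: BIC trades goodness of fit against parameter count via BIC = k·ln(n) − 2·ln(L̂). A self-contained toy comparison of two models of a coin (the toss counts are assumed for illustration):

```python
import math

# Data: 7 heads in 10 tosses (toy numbers, chosen for illustration).
heads, n = 7, 10

def log_lik(p):
    """Binomial log-likelihood of the data at bias p (up to a constant)."""
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

# Model A: fair coin, zero free parameters.
bic_fair = 0 * math.log(n) - 2 * log_lik(0.5)
# Model B: free bias, one free parameter, evaluated at its MLE p = heads/n.
bic_free = 1 * math.log(n) - 2 * log_lik(heads / n)

print(round(bic_fair, 2), round(bic_free, 2))  # lower BIC = preferred model
```

The free-bias model fits better, but its extra parameter costs more than the fit gains, so BIC prefers the fair coin here; with more extreme data the ranking would reverse.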

Bayesian Model Selection

- If the true model is not contained in the set of candidate models: must Bayesian methods be justified by their distance-minimizing properties?
- It is not trivial that a particular distance function (e.g. K-L divergence) is indeed minimized by the model with the highest posterior!
- Bayesian probabilities = probabilities of being close to the true model?

Model Selection and Parameter Estimation

- In the elementary parameter estimation case, posterior distributions were independent of decision-theoretic elements (utilities/loss functions)
- The reasonableness of a posterior distribution in Bayesian model selection is itself relative to the choice of a distance/loss function

Conclusions (I)

- Quality of a model selection method subject to a plethora of problem-specific premises
- Model selection methods must be adapted to a specific problem ("engineering")
- Bayesian methods in model selection should have an instrumental interpretation
- Difficult to separate proper statistics from decision theory

Conclusions (II)

- Optimality of an estimator is a highly ambiguous notion
- Statistics: more akin to scientific modelling than to a branch of mathematics?
- More empirical science than inductive logic?

Thanks a lot for your attention!!!

© by Jan Sprenger, Tilburg, September 2007
