Statistics in Particle Physics (1)
20-29 November 2006
Tatsuo Kawamoto, ICEPP, University of Tokyo

Outline
- Introduction
- Probability
- Distributions
- Fitting and extracting parameters
- Combination of measurements
- Errors, limits and confidence intervals
- Likelihood, ANN, and that sort of thing
Study of elementary particles that have been discovered
- Gauge bosons
And anything that has not been discovered
For each particle we want to know, e.g.
- mass, lifetime, spin, ….
Observation is a result of the fundamental rules of nature
- these are random, quantum-mechanical processes
Measurement uncertainties, too, are often of random nature
Systematic uncertainty is a subtle subject, but we have to do our best to say something about it and treat it reasonably.
Efficiency < 100%, Background > 0
mZ = 91.1853 ± 0.0029 GeV
ΓZ = 2.4947 ± 0.0041 GeV
σhad = 41.82 ± 0.044 nb
Use as much
W+W- → qqqq
which we don’t cover
What is it?
P(A) is a number obeying the rules (the Kolmogorov axioms):
- P(A) ≥ 0 for any event A
- P(S) = 1 for the whole sample space S
- P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + … if the Ai are disjoint events
And, that’s almost it.
From considerations of games of chance
Given by symmetry for equally-likely outcomes, for which
we are equally undecided.
Classify things into a certain number of equally-likely cases,
and count the number of favorable cases.
P(A) = (number of equally-likely favorable cases) / (total number of cases)
Tossing a coin: P(H) = 1/2. Throwing a die: P(1) = 1/6.
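The counting rule above can be sketched in a few lines of Python. The helper name `classical_probability` and the choice of exact fractions are illustrative, not part of the lecture:

```python
from fractions import Fraction

def classical_probability(outcomes, favorable):
    """P(A) = number of favorable equally-likely cases / total number of cases."""
    hits = sum(1 for o in outcomes if favorable(o))
    return Fraction(hits, len(outcomes))

# Tossing a coin: P(H) = 1/2
print(classical_probability(["H", "T"], lambda o: o == "H"))          # 1/2

# Throwing a die: P(1) = 1/6
print(classical_probability([1, 2, 3, 4, 5, 6], lambda o: o == 1))    # 1/6
```

Using `Fraction` keeps the ratio of counts exact, which matches the spirit of counting equally-likely cases.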
How to handle continuous variables ?
Probability is the limit of frequency (taken over some ensemble)
The event A either occurs or not. Relative frequency of occurrence:
Law of large numbers
Can’t say things like:
But one can say:
We come back to this later in the discussion of confidence levels
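The law of large numbers can be illustrated with a quick simulation of a fair coin; the relative frequency of heads approaches 1/2 as the number of trials grows. The function name and the fixed seed are illustrative choices:

```python
import random

def relative_frequency(n_trials, seed=0):
    """Relative frequency of heads in n_trials fair-coin tosses."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    heads = sum(rng.random() < 0.5 for _ in range(n_trials))
    return heads / n_trials

# The fluctuation around 1/2 shrinks roughly like 1/sqrt(N)
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```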
P(A) is the degree of belief in A
A can be anything:
Rain, LHC completion, SUSY, ….
You bet depending on the odds, P vs 1-P
Often used in subjective probability discussions
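The betting interpretation can be made concrete: a degree of belief P corresponds to staking P against 1-P. The helper name `odds` is a hypothetical choice for illustration:

```python
def odds(p):
    """Betting odds implied by a degree of belief p: stake p against 1 - p."""
    return p / (1.0 - p)

print(odds(0.5))   # 1.0 -> even odds
print(odds(0.75))  # 3.0 -> 3 to 1 in favor
```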
Conditional probability P(A|B)
Thomas Bayes 1702-1761
How does it work?
Bayes' theorem: P(Theory|Result) = P(Result|Theory) P(Theory) / P(Result)
The initial belief P(Theory) is modified by the experimental result
If the Result is negative, P(Result|Theory) = 0, and the Theory is killed
It's an extreme case. We will come back to this in the discussion of the Monty Hall problem.
Monty Hall problem
Should you stay or switch?
One might argue: you don't know anyway whether the prize is behind
door Nr 1 or Nr 2; they seem equally probable,
so staying or switching would give an equal chance.
A ‘classical’ reasoning (count the number of cases)
Before the door is opened
After the door is opened
Odds to win: stay 1/3, switch 2/3
P(Ci) = 1/3 : prize is behind door i
P(Ok) : Door k is opened
We want to know P(C1|O3) vs P(C2|O3)
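The counting can be checked directly with Bayes' theorem. Assuming the player picked door 1, the host then opened door 3, and the host never opens the player's door or the prize door:

```python
from fractions import Fraction

prior = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}  # P(Ci)

# P(O3 | Ci): if the prize is behind door 1, the host opens 2 or 3 at random;
# if behind door 2, the host must open 3; if behind door 3, he cannot open it.
likelihood = {1: Fraction(1, 2), 2: Fraction(1), 3: Fraction(0)}

evidence = sum(prior[i] * likelihood[i] for i in prior)              # P(O3)
posterior = {i: prior[i] * likelihood[i] / evidence for i in prior}  # P(Ci|O3)

print(posterior[1])  # 1/3 -> staying wins 1/3 of the time
print(posterior[2])  # 2/3 -> switching wins 2/3 of the time
```

Note that C3 is "killed" exactly as in the Theory/Result discussion: P(O3|C3) = 0.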
A disease X (maybe AIDS, SARS, ….)
P(X) = 0.001 Prior probability
P(no X) = 0.999
Consider a test of X
P(+ | X) = 0.998
P(+ | no X) = 0.03
If the test result is +, how worried should you be?
i.e. what is P(X | +) ?
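Plugging the numbers above into Bayes' theorem gives the answer; the function name is an illustrative choice:

```python
def posterior_positive(p_x, p_pos_given_x, p_pos_given_not_x):
    """P(X | +) via Bayes' theorem."""
    p_not_x = 1.0 - p_x
    evidence = p_pos_given_x * p_x + p_pos_given_not_x * p_not_x  # P(+)
    return p_pos_given_x * p_x / evidence

print(posterior_positive(0.001, 0.998, 0.03))  # about 0.032
```

Despite the accurate test, P(X|+) comes out near 3%: the disease is rare, so most positive results are false positives from the large healthy population.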