
Presentation Transcript

### Performance of Statistical Learning Methods

Jens Zimmermann

Max-Planck-Institut für Physik, München

Forschungszentrum Jülich GmbH

- Performance Examples from Astrophysics
- Performance vs. Control
- H1 Neural Network Trigger
- Controlling Statistical Learning Methods
  - Overtraining
  - Efficiencies
  - Uncertainties
- Comparison of Learning Methods
- Artificial Intelligence
- Higgs Parity Measurement at the ILC

Jens Zimmermann, MPI für Physik München, ACAT 2005 Zeuthen

### Performance of Statistical Learning Methods: MAGIC

Significance and number of excess events scale the uncertainties in the flux calculation.


### Performance of Statistical Learning Methods: XEUS

Pileup vs. single photon: pileups are not recognised by the classical algorithm ("XMM"), but they are recognised by the NN.

### Control of Statistical Learning Methods

There may be many different successful applications of statistical learning methods, and there may be great performance improvements compared to classical methods. This does not impress people who fear that statistical learning methods are not well under control.

First talk: understanding and interpretation. Now: control and correct evaluation.


### "L2NN": The Neural Network Trigger in the H1 Experiment

Trigger scheme of H1 at the HERA ep collider, DESY:

| Level | Decision time | Output rate |
|-------|---------------|-------------|
| (bunch crossings) |  | 10 MHz |
| L1 | 2.3 µs | 500 Hz |
| L2 | 20 µs | 50 Hz |
| L4 | 100 ms | 10 Hz |

Each neural network on L2 verifies a specific L1 sub-trigger.


### Triggering Deeply Virtual Compton Scattering (Theory)

Signal: DVCS. Background: upstream beam-gas interaction.

- L1 sub-trigger 41 triggers DVCS by requiring
  - a significant energy deposition in the SpaCal
  - within a time window
- The L2 neural network uses additional information:
  - liquid argon energies
  - SpaCal centre energies
  - z-vertex information

The trigger rate of 4 Hz must be reduced to 0.8 Hz.


Dataset split:

- 50% training set
- 25% selection set
- 25% test set

Tune training parameters (on the selection set) to
- avoid overtraining
- optimise performance

NN output: signal should peak at 1, background should peak at 0.
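The 50/25/25 split above can be sketched as follows (a minimal NumPy sketch; the function name and the fixed shuffling seed are illustrative, not from the talk):

```python
import numpy as np

def split_dataset(X, y, seed=0):
    """Shuffle and split a dataset into 50% training, 25% selection
    and 25% test set, following the scheme on the slide."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = len(X) // 2
    n_sel = len(X) // 4
    # np.split cuts the shuffled indices at the two boundaries
    train, sel, test = np.split(idx, [n_train, n_train + n_sel])
    return (X[train], y[train]), (X[sel], y[sel]), (X[test], y[test])
```

The selection set is used for tuning, so the test set stays untouched for the final, unbiased efficiency measurement.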


### Determine the Correct Efficiency

(Plots: efficiencies, in %, for the training set and for the test set.)


### Check Statistical Uncertainties: Efficiency

Statistical uncertainty of the efficiency via propagation of uncertainties, e.g. 80% ± 4% for 80 of 100 events.
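The quoted example follows from the standard binomial error formula (a minimal sketch; the function name is mine):

```python
import math

def efficiency_with_error(k, n):
    """Efficiency eps = k/n and its binomial statistical uncertainty
    sqrt(eps * (1 - eps) / n); 80 of 100 gives 80% +/- 4%."""
    eps = k / n
    return eps, math.sqrt(eps * (1 - eps) / n)
```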

Jens Zimmermann, MPI für Physik München, ACAT 2005 Zeuthen

### Check Systematic Uncertainties

Only the systematic uncertainties of the inputs are propagated.

Assuming:
- x1 with an absolute error σ1
- x2 with a relative error σ2 = 5%
- x3 with a relative error σ3 = 10%
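One way to implement this propagation, assuming access to a trained classifier's scoring function, is to shift each input by its systematic error and record the change of the output (a sketch; all names are mine, not from the talk):

```python
import numpy as np

def systematic_shifts(predict, X, abs_err=None, rel_err=None):
    """Propagate input systematics by shifting each input variable by
    its uncertainty and recording the mean change of the output.
    `predict` is any scoring callable X -> scores; abs_err / rel_err
    map column index -> absolute / relative error."""
    base = predict(X)
    shifts = {}
    for j, s in (abs_err or {}).items():
        Xs = X.copy()
        Xs[:, j] = Xs[:, j] + s          # absolute shift, e.g. x1 + s1
        shifts[j] = float(np.mean(predict(Xs) - base))
    for j, r in (rel_err or {}).items():
        Xs = X.copy()
        Xs[:, j] = Xs[:, j] * (1.0 + r)  # relative shift, e.g. x2 * 1.05
        shifts[j] = float(np.mean(predict(Xs) - base))
    return shifts
```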


### Check Systematic Uncertainties

Example: DVCS dataset.


### Comparison of Hypotheses

Efficiencies for a fixed rejection of 80%: NN 96.5% vs. SVM 95.7%. σm is the variation over different parts of the test set.

Statistically significant? Build a 95% confidence interval!
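The confidence-interval check can be sketched as follows, assuming independent Gaussian uncertainties on the two efficiencies (1.96σ gives the 95% interval; the σ arguments stand in for the talk's measured σm and are placeholders):

```python
import math

def ci95_difference(eff_a, eff_b, sigma_a, sigma_b):
    """95% confidence interval (1.96 sigma, Gaussian approximation) for
    the difference of two efficiencies with independent uncertainties."""
    diff = eff_a - eff_b
    sigma = math.hypot(sigma_a, sigma_b)  # add uncertainties in quadrature
    return diff - 1.96 * sigma, diff + 1.96 * sigma
```

If the interval contains zero, the difference between the two classifiers is not significant at the 95% level.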


### Comparison of Learning Methods

Compare performances over different training sets! σm is the variation over the different trainings. Efficiencies for a fixed rejection of 60%.

Cross-validation: divide the dataset into k parts and train k classifiers, using each part once as the test set.
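The cross-validation scheme described above, as a minimal sketch (the function name and seed are mine):

```python
import numpy as np

def kfold_splits(n, k, seed=0):
    """Divide n sample indices into k parts and yield (train, test)
    index pairs, using each part once as the test set."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), k)
    for i in range(k):
        # all folds except fold i form the training set
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, folds[i]
```

The spread of the k resulting test efficiencies gives the variation σm over the different trainings.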


### Artificial Intelligence: H1-L2NN Triggering Charged Current

(Event displays: two events with low NN output, labelled CC, cosmic and cosmic overlay.)


### Artificial Intelligence: H1-L2NN Triggering J/ψ

Background found in the J/ψ selection.


### Higgs Parity Measurement at the ILC: H/A → τ⁺τ⁻ → ρν ρν → ππν ππν

Classical approach: fit the angular distribution (from 0 to 2π).

- Parity induces a favoured ρ-configuration: anti-parallel for H, parallel for A.
- The significance is the amplitude divided by its uncertainty.
- Significance measured for 500 events and averaged over 600 pseudo-experiments: s = 5.09.


### Higgs Parity Measurement at the ILC: Statistical Learning Approach (Direct Discrimination)

The classifier is trained towards 0 for one parity hypothesis and towards 1 for the other. The significance is the difference of the measured means divided by its uncertainty.

Significance measured for 500 events and averaged over 600 pseudo-experiments: s = 6.26.
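The slide defines the significance as the difference of the measured means divided by its uncertainty; a minimal sketch, assuming that uncertainty is the standard errors of the two means added in quadrature (the function name is mine):

```python
import numpy as np

def separation_significance(out_0, out_1):
    """Difference of the mean classifier outputs for the two parity
    hypotheses, divided by the uncertainty of that difference
    (standard errors of the means added in quadrature)."""
    diff = np.mean(out_1) - np.mean(out_0)
    err = np.sqrt(np.var(out_0, ddof=1) / len(out_0)
                  + np.var(out_1, ddof=1) / len(out_1))
    return abs(diff) / err
```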


### Conclusion

- Statistical learning methods are successful in many applications in high energy physics and astrophysics.
- They bring significant performance improvements compared to classical algorithms.
- Statistical learning methods are well under control: efficiencies can be determined and uncertainties can be calculated.
- Comparison of learning methods reveals statistically significant differences.
- Statistical learning methods sometimes show more artificial intelligence than expected.

