Development of New Kaon Selectors
Kalanand Mishra, University of Cincinnati
Overview of Neural Net Training
• The input variables for the neural net are: likelihoods from the SVT, DCH, and DRC (both global and track-based), and the momentum and polar angle (θ) of the tracks.
• Separate neural net trainings for ‘Good Quality’ and ‘Poor Quality’* tracks give two families of selectors: “KNNGoodQual“ and “KNNNoQual” (a minimal training sketch follows this slide).
• * Poor Quality tracks are defined as belonging to at least one of the following categories:
  - outside the DIRC acceptance
  - passing through the cracks between DIRC bars
  - no DCH hits in layers > 35
  - EMC energy < 0.15 GeV
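A minimal sketch of a training like the one described above, using scikit-learn as a stand-in for the original neural-net framework. The column names (svt_lh, dch_lh, ..., is_kaon, good_quality), the network size, and the input DataFrame are illustrative assumptions, not taken from the slides.

```python
# Sketch: train one "KNN" selector family on likelihood + kinematic inputs.
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical feature names: SVT/DCH/DRC likelihoods plus momentum and theta.
FEATURES = ["svt_lh", "dch_lh", "drc_lh_global", "drc_lh_track", "p", "theta"]

def train_knn_selector(df):
    """Train one selector family (e.g. GoodQual-only or NoQual-only tracks)."""
    X = df[FEATURES].values
    y = df["is_kaon"].values            # 1 = kaon (signal), 0 = pion (background)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    net = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(16,), max_iter=500))
    net.fit(X_tr, y_tr)
    print("hold-out accuracy:", net.score(X_te, y_te))
    return net

# Two families, one per track-quality class (hypothetical 'tracks' DataFrame):
# good_net  = train_knn_selector(tracks[tracks.good_quality])
# noqual_net = train_knn_selector(tracks[~tracks.good_quality])
```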
Performance of ‘KNNxQual’ selectors
[Plots: background rejection vs. signal efficiency for the NoQual and GoodQual selectors; the higher curve/point represents better performance. Annotations mark an absolute improvement and an overall improvement, and a deterioration for GoodQual in the 0.3 < P < 0.5 GeV/c bin.]
Most of the tracks used in B-tagging have low momenta, and the B-tagging group is the biggest consumer of such a selector.
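For reference, the efficiency/rejection curves on this slide can be produced from any trained selector as below; this assumes the hypothetical classifier and hold-out arrays from the previous sketch.

```python
# Sketch: signal efficiency vs. background rejection (the curves on this slide).
from sklearn.metrics import roc_curve

def efficiency_rejection(clf, X_test, y_test):
    scores = clf.predict_proba(X_test)[:, 1]   # kaon probability
    fpr, tpr, _ = roc_curve(y_test, scores)
    signal_eff = tpr                           # kaon efficiency
    bkg_rejection = 1.0 - fpr                  # pion rejection
    return signal_eff, bkg_rejection

# A higher curve (more rejection at the same efficiency) is better, as noted above.
```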
Tried different algorithms ….
[Plots: classifier-output distributions (events vs. classifier output) for Binary AdaBoost, Simple Binary Split, Fisher, AdaBoost Decision Tree, and Bagger Decision Tree. The AdaBoost Decision Tree provides the best separation.]
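A hedged sketch of such a comparison, using scikit-learn classifiers as stand-ins for the ones named on the slide and ranking them by ROC area; the arrays and classifier settings are assumptions carried over from the earlier sketch.

```python
# Sketch: compare several classifiers on the same inputs and rank by separation.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.tree import DecisionTreeClassifier

def compare_classifiers(X_tr, y_tr, X_te, y_te):
    candidates = {
        "Fisher (LDA)":           LinearDiscriminantAnalysis(),
        "AdaBoost decision tree": AdaBoostClassifier(DecisionTreeClassifier(max_depth=3),
                                                     n_estimators=200),
        "Bagged decision tree":   BaggingClassifier(DecisionTreeClassifier(max_depth=6),
                                                    n_estimators=100),
    }
    for name, clf in candidates.items():
        clf.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        print(f"{name:24s}  ROC area = {auc:.3f}")
```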
AdaBoost Decision Tree
For details on the algorithms and software used, see arXiv:physics/0507143 (by Ilya Narsky).
[Plot: signal and background events vs. classifier output.]
• The decision tree splits nodes recursively until a stopping criterion is satisfied.
• AdaBoost combines weak classifiers by applying them sequentially. At each step it enhances the weights of misclassified events and reduces the weights of correctly classified events (see the sketch after this slide).
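A minimal sketch of the weight update just described, written as standard discrete AdaBoost with decision-tree stumps as the weak classifiers; X and y (labels in {-1, +1}) are hypothetical training arrays, and this is an illustration rather than the exact implementation used in the study.

```python
# Sketch: discrete AdaBoost with decision stumps.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, n_rounds=200):
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start with uniform event weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        h = stump.predict(X)
        err = np.sum(w[h != y])              # weighted misclassification rate
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        # misclassified events (y*h = -1) are boosted, correct ones reduced
        w *= np.exp(-alpha * y * h)
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_score(stumps, alphas, X):
    """Weighted vote of the weak classifiers: the 'classifier output'."""
    return sum(a * s.predict(X) for a, s in zip(alphas, stumps))
```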
AdaBoost Decision Tree
• Training performed on “real data”.
• Visual inspection shows a significant improvement over the neural-network performance.
• Need to retrain after randomizing the momentum distributions and with additional input variables (a reweighting sketch follows this slide).
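One possible way to "randomize" the momentum distributions before retraining is to flatten them with per-event weights, so the classifier cannot exploit the different momentum spectra of the kaon and pion samples. The binning, range, and weighting scheme below are assumptions for illustration only.

```python
# Sketch: flatten a momentum spectrum with per-event weights before retraining.
import numpy as np

def flatten_momentum_weights(p, n_bins=40, p_range=(0.2, 4.0)):
    """Per-event weights inversely proportional to the local momentum density."""
    hist, edges = np.histogram(p, bins=n_bins, range=p_range)
    idx = np.clip(np.digitize(p, edges) - 1, 0, n_bins - 1)
    w = 1.0 / np.maximum(hist[idx], 1)
    return w / w.mean()

# Applied separately to the kaon and pion samples and passed to the classifier
# via clf.fit(X, y, sample_weight=weights) when retraining.
```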
Performance in select momentum bins
• dE/dx - DRC transition region: 0.8 < P < 1.0 GeV/c
• Low momentum: 0.3 < P < 0.5 GeV/c
• Intermediate range: 1.9 < P < 2.1 GeV/c
• High momentum: 3.0 < P < 3.2 GeV/c
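A sketch of a per-momentum-bin performance check over the bins listed above, again assuming the hypothetical classifier, hold-out arrays, and a matching array of track momenta from the earlier sketches.

```python
# Sketch: separation power (ROC area) in selected momentum bins.
import numpy as np
from sklearn.metrics import roc_auc_score

BINS = [(0.8, 1.0), (0.3, 0.5), (1.9, 2.1), (3.0, 3.2)]   # GeV/c

def per_bin_auc(clf, X_test, y_test, p_test):
    scores = clf.predict_proba(X_test)[:, 1]
    for lo, hi in BINS:
        sel = (p_test > lo) & (p_test < hi)
        if sel.sum() and len(np.unique(y_test[sel])) == 2:
            auc = roc_auc_score(y_test[sel], scores[sel])
            print(f"{lo:.1f} < P < {hi:.1f} GeV/c : ROC area = {auc:.3f}")
```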
Things to do ….
• Randomize the momentum distributions of signal and background events before training.
• Add additional discriminating input variables:
  - number of signal and background Cherenkov photons in the ring
  - number of total drift-chamber hits and hits in the last 5 layers
  - number of hits in the silicon detector
  - …… other suggestions welcome!
• Add other background categories: proton, …..
• Finalize the cuts and implement the selectors (a cut-selection sketch follows below):
  - it will be a single family of selectors: no separate selectors for “good” and “poor” quality tracks
  - should we still call it a KNN selector?
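One way the final cuts might be chosen is to pick thresholds on the classifier output that deliver a set of target kaon efficiencies and to report the corresponding pion mis-identification rate at each working point; the target efficiencies below are illustrative and not taken from the slides.

```python
# Sketch: turn the classifier output into fixed-efficiency working points.
import numpy as np
from sklearn.metrics import roc_curve

def working_points(y_test, scores, target_effs=(0.95, 0.90, 0.80)):
    fpr, tpr, thresholds = roc_curve(y_test, scores)
    cuts = {}
    for eff in target_effs:
        i = np.argmax(tpr >= eff)             # loosest cut reaching the target
        cuts[eff] = (thresholds[i], fpr[i])
        print(f"eff >= {eff:.2f}: cut = {thresholds[i]:.3f}, pion mis-ID = {fpr[i]:.3f}")
    return cuts
```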
Summary
• A significant effort is underway to develop a “new version” of the KNN selectors.
• The goal is to develop a powerful non-LH kaon selector using the best-performing classifier (or a combination of classifiers).
• Such a selector is expected to be able to replace the current KNN selectors for B-tagging purposes, and should be a meaningful alternative to the LH selectors for physics analyses.