
First Results from MiniBooNE


Presentation Transcript


  1. First Results from MiniBooNE. Eric Prebys, FNAL/BooNE Collaboration. (HEP Seminar, UT Austin, April 30th, 2007)

  2. The MiniBooNE Collaboration: University of Alabama, Los Alamos National Laboratory, Bucknell University, Louisiana State University, University of Cincinnati, University of Michigan, University of Colorado, Princeton University, Columbia University, Saint Mary's University of Minnesota, Embry Riddle University, Virginia Polytechnic Institute, Fermi National Accelerator Laboratory, Western Illinois University, Indiana University, Yale University

  3. Outline • State of neutrino mixing measurements • History and background • Without LSND • LSND and Karmen • Experiment • Beam • Detector • Calibration and cross checks • Analysis • Reconstruction • Blindness • Errors and fitting • Unblinding • Results • Interpretation

  4. The Neutrino “Problem” • 1968: An experiment in the Homestake Mine first observes neutrinos from the Sun, but there are far fewer than predicted. Possibilities: • Experiment wrong? • Solar model wrong? (believed by most not involved) • Enough created, but maybe they oscillated (or decayed to something else) along the way. • ~1987: There also appeared to be too few atmospheric muon neutrinos, with less uncertainty in the prediction; the explanation was similar. • Both results confirmed by numerous experiments over the years. • 1998: SuperKamiokande observes clear oscillatory behavior in signals from atmospheric neutrinos. For most, this establishes neutrino oscillations “beyond a reasonable doubt”. [Figures: the Solar Problem; the Atmospheric Problem]

  5. Theory of Neutrino Oscillations • Neutrinos are produced and detected as weak (flavor) eigenstates (ν_e, ν_μ, or ν_τ). • These can be represented as linear combinations of mass eigenstates. • If the mixing matrix is not diagonal and the masses are not equal, then the net weak flavor content will oscillate as the neutrinos propagate. • Example: if there is mixing between ν_e and ν_μ, then the probability that a ν_e will be detected as a ν_μ after a distance L is P = sin²(2θ) sin²(1.27 Δm² L/E), with L in km, E in GeV, and Δm² in eV². • Only the magnitude of the difference of the squares of the masses is measured.
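As a rough numerical illustration (not from the original slides), the two-flavor appearance probability quoted above can be evaluated directly. The function name and the parameter values (sin²2θ, Δm², L, E) below are purely illustrative, not MiniBooNE or LSND results.

```python
import numpy as np

def appearance_probability(sin2_2theta, dm2_ev2, L_km, E_GeV):
    """Two-flavor nu_mu -> nu_e (or nu_e -> nu_mu) appearance probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    return sin2_2theta * np.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Illustrative values: an LSND-like dm^2 ~ 1 eV^2 evaluated at a
# MiniBooNE-like baseline and energy (L ~ 0.5 km, E ~ 0.5 GeV).
print(appearance_probability(sin2_2theta=0.004, dm2_ev2=1.0, L_km=0.5, E_GeV=0.5))
```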

  6. Probing Neutrino Mass Differences • Different experiments probe different ranges of path length and energy. • Accelerators use π decay to directly probe ν_μ → ν_e. • Reactors use ν̄_e disappearance. • Cerenkov detectors directly measure the ν_μ and ν_e content of atmospheric neutrinos and fit to ν_e, ν_μ → ν_τ mixing hypotheses; these regions are also probed by the new generation of “long baseline” accelerator and reactor experiments (MINOS, T2K, etc.). • Solar neutrino experiments typically measure the disappearance of ν_e.
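The connection between L/E and the Δm² range an experiment probes can be made concrete with the first-oscillation-maximum condition 1.27 Δm² L/E = π/2. The sketch below is illustrative only; the L and E values are rough, order-of-magnitude stand-ins rather than official experiment parameters.

```python
import math

def dm2_at_first_maximum(L_km, E_GeV):
    """dm^2 (in eV^2) for which the first oscillation maximum,
    1.27 * dm^2 * L/E = pi/2, falls at this L and E.
    Gives only the rough sensitivity scale of an experiment."""
    return (math.pi / 2.0) / (1.27 * L_km / E_GeV)

# Illustrative L/E choices (approximate, for scale only):
print(dm2_at_first_maximum(L_km=0.5, E_GeV=0.5))   # MiniBooNE-like: ~1 eV^2
print(dm2_at_first_maximum(L_km=735, E_GeV=3.0))   # long-baseline accelerator scale
```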

  7. Best Three Generation Picture (all experiments but LSND)

  8. The LSND Experiment (1993-1998) • Searched for ν̄_μ → ν̄_e mixing at a baseline of ~30 m and energies of 20-50 MeV. • Signature: a Cerenkov ring from the electron plus a delayed γ from neutron capture.

  9. LSND Result (best-fit parameters and excess signal shown on the slide) • Only exclusive appearance result to date. • Problem: Δm² ~ 1 eV² is not consistent with other results under simple three-generation mixing: atmospheric (Soudan, Kamiokande, MACRO, Super-K) and solar (Homestake, SAGE, GALLEX, Super-K, SNO, KamLAND).

  10. Possibilities • 4 neutrinos? • We know from the Z lineshape there are only 3 active flavors • Sterile? • CP or CPT Violation? • More exotic scenarios? • LSND Wrong? • Can’t throw it out just because people don’t like it.

  11. KARMEN II Experiment: not quite enough • Pulsed 800 MeV proton beam (ISIS) • 17.6 m baseline • 56 tons of liquid scintillator • Factor of 7 less statistical reach than LSND → NO SIGNAL • A combined analysis still leaves an allowed region.

  12. Role of MiniBooNE • Boo(ster) N(eutrino) E(xperiment); the full “BooNE” would have two detectors • Primary motivation: absolutely confirm or refute the LSND result • Optimized for L/E ~ 1 km/GeV • Higher energy beam → different systematics than LSND • E: ~30 MeV → ~500 MeV • L: ~30 m → ~500 m • Timeline: • Proposed: 12/97 • Approved: 1998 • Began construction: 10/99 • Completed: 5/02 • First beam: 8/02 • Began to run concurrently with NuMI: 3/05 • Oscillation results: 4/07

  13. MiniBooNE Neutrino Beam (diagram, not to scale): FNAL Booster (8 GeV protons, ~8E16 p/hr max) → Be target and horn → 50 m decay region → 500 m of dirt → detector • ~1 detected neutrino/minute • L/E ~ 1 (L ~ 0.5 km, E ~ 0.5 GeV) • “Little Muon Counter” (LMC): to understand the K flux

  14. Detector • 950,000 l of pure mineral oil • 1280 PMTs in the inner region • 240 PMTs in the outer veto region, separated by a light barrier • Light produced by Cerenkov radiation and scintillation • Triggers: all beam spills, cosmic ray triggers, laser/pulser triggers, NuMI trigger (first off-axis experiment), supernova trigger

  15. Neutrino Detection/Particle ID [Diagrams: charged-current quasi-elastic scattering ν_e + n → e⁻ + p and ν_μ + n → μ⁻ + p via W exchange; neutral-current π⁰ production ν_μ + N → ν_μ + Δ → ν_μ + N + π⁰ via Z exchange, an important background!]

  16. Selecting Neutrino Events • Collect data from -5 to +15 μs around each beam spill trigger. • Identify individual “events” within this window based on PMT hits clustered in time. [Event time distributions relative to the 1600 ns spill: no cuts; veto hits < 6; veto hits < 6 and tank hits > 200]
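A minimal sketch of the selection logic described above (in-spill timing, veto hits < 6, tank hits > 200). The type and function names are hypothetical illustrations, not MiniBooNE reconstruction code.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Subevent:
    time_ns: float   # cluster time relative to the beam trigger
    tank_hits: int   # PMT hits in the main tank
    veto_hits: int   # PMT hits in the outer veto region

def select_beam_neutrino_candidates(subevents: List[Subevent]) -> List[Subevent]:
    """Apply the slide's cuts: in-spill timing, veto hits < 6, tank hits > 200."""
    selected = []
    for ev in subevents:
        in_spill = 0.0 <= ev.time_ns <= 1600.0   # 1.6 us beam spill window
        if in_spill and ev.veto_hits < 6 and ev.tank_hits > 200:
            selected.append(ev)
    return selected
```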

  17. Delivering Protons • The requirements of MiniBooNE greatly exceed the historical performance of the 30+ year old 8 GeV Booster, pushing: • Average repetition rate • Above-ground radiation • Radiation damage and activation of accelerator components • Intense program to improve the Booster: • Shielding • Loss monitoring and analysis • Lattice improvements (result of Beam Physics involvement) • Collimation system • Very challenging to continue to operate the 8 GeV line during NuMI/MINOS operation; once believed impossible • Element of the lab’s “Proton Plan” • Goal: continue to deliver roughly 2E20 protons per year to the 8 GeV program even as NuMI intensities ramp up.

  18. Small but Hungry • MiniBooNE began taking data in the fall of 2002, and has now taken more protons than all other users in the history of Fermilab combined (including NuMI and pbar). • 6.3x10^20 protons in neutrino mode; this analysis uses (5.58±0.12)x10^20 protons. • Running in antineutrino mode since ~1/06. [Plot: total protons delivered (10^19)]

  19. Modeling Neutrino Flux • Production: GEANT4 model of target, horn, and beamline; MARS for protons and neutrons; Sanford-Wang fit to production data for π and K; data constrained by the HARP experiment. • Mesons allowed to decay in a model of the decay pipe; retain neutrinos which point at the detector. • ν_μ sources: π⁺ → μ⁺ ν_μ and K⁺ → μ⁺ ν_μ. • “Intrinsic” ν_e + ν̄_e sources: • μ⁺ → e⁺ ν̄_μ ν_e (52%) • K⁺ → π⁰ e⁺ ν_e (29%) • K⁰ → π e ν_e (14%) • Other (5%) • ν_e/ν_μ = 0.5%; antineutrino content: 6%

  20. MiniBooNE ν_μ Interactions • Cross sections based on the NUANCE v3 Monte Carlo, with NEUT and NEUGEN as cross checks. • Theoretical input: Llewellyn Smith free-nucleon cross sections, Rein-Sehgal resonant and coherent cross sections, Bodek-Yang DIS at low Q², standard DIS parametrization at high Q², Fermi-gas model, final-state interaction model. • Constraining NUANCE: from observed ν_μ data, M_A^eff (effective axial mass) and Elo_SF (Pauli-blocking parameter); from electron scattering data, E_b (binding energy) and p_F (Fermi momentum). • K2K and other experiments are also better explained by these modifications.

  21. Predicted Event Rates (NUANCE; D. Casper, Nucl. Phys. Proc. Suppl. 112 (2002) 161)

  22. Characterizing the Detector • Full GEANT 3.21 model of detector • Detailed (39-parameter!!) optical model of oil • Detailed model of PMT response • MC events produced at the raw data level and reconstructed in the same way as real events

  23. Calibrating the Detector

  24. Events Producing Pions • CC π⁺ (ν_μ N → μ⁻ N π⁺, ~25% of the rate): easy to tag due to 3 subevents; not a substantial background to the oscillation analysis. • NC π⁰ (ν_μ N → ν_μ N π⁰, ~8%): the π⁰ decays to 2 photons, which can look “electron-like”, mimicking the signal; <1% of π⁰s contribute to the background. • The Δ resonance also decays to a single photon (Δ → Nγ) with 0.56% probability.

  25. Bottom line: signal and background • If the LSND best fit is accurate, only about a third of our observed rate will come from oscillations. • Backgrounds come from both intrinsic ν_e and misidentified ν_μ; the energy distribution can help separate them. [Stacked energy distributions: signal, mis-ID, intrinsic ν_e]

  26. Analysis Philosophy • The “golden signal” in the detector is the Charged Current Quasi-Elastic (CCQE) event. • With a particular mass hypothesis, the neutrino energy can be reconstructed from the energy and angle of the observed particle (the slide’s formula; a sketch follows below). • If the LSND signal were due to neutrino oscillations, we would expect an excess in the energy range 300-1500 MeV. • The “signal” is therefore an event with • a high probability of being a ν_e CCQE event, and • a reconstructed energy in the range 300 to 1500 MeV.
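The slide's reconstruction formula did not survive transcription. What follows is the standard two-body CCQE energy reconstruction assuming the struck nucleon is at rest, with binding energy and the neutron-proton mass difference neglected; it is a simplified stand-in for whatever exact form the slide showed.

```python
import math

M_NUCLEON = 0.939   # GeV, approximate nucleon mass
M_MU = 0.1057       # GeV, muon mass

def reconstruct_Enu_qe(E_lep, cos_theta, m_lep=M_MU, M=M_NUCLEON):
    """Two-body CCQE neutrino energy from the outgoing lepton's energy and
    angle, struck nucleon at rest (binding energy neglected)."""
    p_lep = math.sqrt(E_lep**2 - m_lep**2)               # lepton momentum
    return (M * E_lep - 0.5 * m_lep**2) / (M - E_lep + p_lep * cos_theta)

# Example: a 0.6 GeV muon emitted at 20 degrees to the beam
print(reconstruct_Enu_qe(E_lep=0.6, cos_theta=math.cos(math.radians(20.0))))
```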

  27. Two Analyses • “Track Based” analysis: • Use detailed tracking information to form a particle ID likelihood. • Backgrounds weighted by observed events outside of the box. • “Boosting” analysis: • Particle ID based on feeding a large amount of event information into a “boosted decision tree” (details to follow). • “Box” defined by the boosted “score” and the mass range. • Backgrounds determined by models which are largely constrained by the data.

  28. Common Event Cuts • >200 tank hits • <6 veto hits • R < 500 cm (algorithm dependent) [data/MC comparison plots]

  29. Blindness • In general, the two analyses sequestered (hid) data with • a reconstructed neutrino energy in the range 300 to 1500 MeV, and • a high probability of being an electron. • For historical reasons, the “official box” is based on the Boosting analysis cuts. • This leaves the vast majority (99%) of the data available for analysis. • Individual open boxes were allowed, provided it could be established that an oscillation would not lead to a significant signal there. • In addition, detector-level data (hit times, PMT spectra, etc.) were available for all events. • Determination of the “primary” analysis was based on final sensitivity before opening the box.

  30. Track-based analysis: ν_e/ν_μ separation [Plots comparing ν_e CCQE MC with ν_μ CCQE, including the reconstructed radius cubed]

  31. NC π⁰ separation [Plots: using a mass cut and using log(L_e/L_π), for ν_e CCQE MC vs. ν_μ NC π⁰]

  32. Testing e-π⁰ separation using data • Selection: 1 subevent, log(L_e/L_μ) > 0 (e-like), log(L_e/L_π) < 0 (π-like), mass > 50 MeV (high mass). [Plots of invariant mass and log(L_e/L_π), Monte Carlo π⁰-only; signal region BLIND]

  33. Evaluating Sidebands • Selection: 1 subevent, log(L_e/L_μ) > 0 (e-like), log(L_e/L_π) < 0 (π-like), mass < 200 MeV (low mass). [Monte Carlo π⁰-only; signal region BLIND; next: look there] • χ² probability for mass < 50 MeV (“most signal-like”): 69%

  34. Summary of Track-based Cuts: “precuts” + log(L_e/L_μ) + log(L_e/L_π) + invariant mass [Plots of efficiency and backgrounds after cuts]

  35. Boosted Decision Trees* • Fundamental variables are subjected to a series of cuts, each of which classifies the event as “signal” or “background”. • However it is classified at each step, the event is still subjected to subsequent cuts. • In the end, the number of times the event is classified as “background” is subtracted from the number of times it is classified as “signal”, leading to a final score. • The algorithm is “trained” on Monte Carlo, with both the cut values and their order optimized to maximize signal-to-background separation. • Widely used outside of physics, e.g. Arditi and Pulket, “Predicting the Outcome of Construction Litigation Using Boosted Decision Trees”, J. Comp. Civ. Eng., vol. 19, iss. 4 (2005). *Byron P. Roe et al., NIM A543 (2005) 577.
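As a rough illustration of the idea of a score built from many sequentially trained trees, here is a sketch using scikit-learn's gradient boosting on invented toy data. It is not the collaboration's code, and gradient boosting is a stand-in for the AdaBoost-style algorithm of Roe et al. described above; the input variables are made up, not MiniBooNE reconstruction quantities.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(42)

# Toy "events": three reconstructed quantities drawn from slightly different
# distributions for signal and background (purely illustrative).
n = 5000
signal = rng.normal(loc=[1.0, 0.5, 0.2], scale=0.5, size=(n, 3))
background = rng.normal(loc=[0.6, 0.8, 0.4], scale=0.5, size=(n, 3))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = signal, 0 = background

# Many shallow trees, each trained to fix the mistakes of the previous ones;
# the summed tree outputs act as the final "boosting score".
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
bdt.fit(X, y)

scores = bdt.decision_function(X)   # more positive = more signal-like
print("mean score, signal vs background:",
      scores[y == 1].mean(), scores[y == 0].mean())
```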

  36. Some analysis variables • Resolutions: vertex 24 cm, direction 3.8°, energy 14% • Reconstructed quantities which are inputs to E_ν^QE: E_visible and U_Z = cosθ_z [plots for ν_μ CCQE]

  37. Example of a Decision Tree [Tree diagram: each node is labeled N_signal/N_bkgd; successive cuts on Variable 1, Variable 2, Variable 3, ... split events into signal-like and background-like branches. Node counts shown: 30,245/16,305; 9755/23695; 20455/3417; 1906/11828; 7849/11867; 9790/12888; etc.]

  38. “Box” Cuts on Energy and Boosting Score [Plot of boosting score (ν_e-like vs. background-like) vs. energy, showing the signal “box” and the “sideband” region]

  39. Test MC with Sideband Information

  40. Efficiencies from Boosted Decision Trees [Plots of signal and background efficiency after precuts]

  41. Background: ν_μ mis-ID and intrinsic ν_e (TB analysis)

  42. Sources of Uncertainty on the ν_e Background (Track Based / Boosted Decision Tree error, in %; √ = checked or constrained by MiniBooNE data, √√ = also further reduced by tying ν_e to ν_μ):
  • Flux from π⁺/μ⁺ decay: 6.2 / 4.3 √√
  • Flux from K⁺ decay: 3.3 / 1.0 √√
  • Flux from K⁰ decay: 1.5 / 0.4 √√
  • Target and beam models: 2.8 / 1.3 √
  • ν cross section: 12.3 / 10.5 √√
  • NC π⁰ yield: 1.8 / 1.5 √
  • External interactions (“dirt”): 0.8 / 3.4 √
  • Optical model: 6.1 / 10.5 √√
  • DAQ electronics model: 7.5 / 10.8 √

  43. Overall normalization: constraint from ν_μ CCQE events • The normalization and energy dependence of both background and signal are predicted from the observed ν_μ CCQE events. • Data/MC ratio: Boosted Decision Tree 1.22 ± 0.29, Track Based 1.32 ± 0.26. • Tying the ν_e background and signal prediction to the ν_μ flux constrains this analysis to a strict ν_μ → ν_e appearance-only search. (A schematic sketch of the idea follows below.)
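A back-of-the-envelope illustration of what this constraint does, assuming a simple overall scale factor; the real analysis applies the constraint through a full correlated error matrix, and every number below except the quoted Data/MC ratio is hypothetical.

```python
# The measured nu_mu CCQE Data/MC ratio rescales the flux-level prediction,
# so the intrinsic nu_e background, which comes from the same pi/K parents,
# moves with it.  Schematic only; numbers other than the ratio are invented.
ratio, ratio_err = 1.22, 0.29           # Boosted Decision Tree Data/MC from the slide

raw_nue_background = 300.0              # hypothetical unconstrained MC prediction (events)
constrained = raw_nue_background * ratio
constrained_err = raw_nue_background * ratio_err
print(f"{constrained:.0f} +/- {constrained_err:.0f} events (normalization error only)")
```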

  44. K⁺ and K⁰ Backgrounds • Predicted range of significant oscillation signal: 300 < E_ν^QE < 1500 MeV. • At high energies, above the “signal range”, ν_μ and “ν_e-like” events are largely due to kaon decay. [Plot of arbitrary units vs. neutrino energy (0-3 GeV) showing the signal range and signal examples for Δm² = 0.4, 0.7, and 1.0 eV²]

  45. Using low and high energy bins to constrain backgrounds • In both analyses, high energy bins (signal range extended up to 3000 MeV) constrain the ν_e background. • In the Boosted Decision Tree analysis, a low energy bin (200 < E_ν^QE < 300 MeV) constrains ν_μ mis-IDs: π⁰, Δ → Nγ, dirt, ... [TB and BDT energy distributions with the signal range indicated]

  46. Constraining π⁰ production • We constrain π⁰ production using data from our own detector. • This reduces the error on predicted mis-identified π⁰s. • Reweighting (as a function of π⁰ momentum) improves agreement in other variables. • Because this constrains the Δ resonance rate, it also constrains the rate of Δ → Nγ. (A schematic reweighting sketch follows below.)
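One simple way to picture this kind of data-driven constraint is histogram-ratio reweighting as a function of π⁰ momentum. The sketch below is schematic, uses invented spectra, and is not the actual MiniBooNE procedure; all names are hypothetical.

```python
import numpy as np

def momentum_reweight_factors(mc_momenta, data_momenta, bin_edges):
    """Per-bin weights that make the MC pi0 momentum spectrum match the one
    measured in data (shapes only; both spectra normalized to unit area)."""
    mc_counts, _ = np.histogram(mc_momenta, bins=bin_edges)
    data_counts, _ = np.histogram(data_momenta, bins=bin_edges)
    # Bins with no MC get weight 1 so they are left untouched.
    return np.divide(data_counts / data_counts.sum(),
                     mc_counts / mc_counts.sum(),
                     out=np.ones(len(mc_counts)),
                     where=mc_counts > 0)

def apply_weights(momenta, ratio, bin_edges):
    """Look up the per-bin weight for each MC event's pi0 momentum."""
    idx = np.clip(np.digitize(momenta, bin_edges) - 1, 0, len(ratio) - 1)
    return ratio[idx]

# Toy demonstration with made-up momentum spectra (GeV/c)
rng = np.random.default_rng(0)
mc_p = rng.exponential(0.30, size=10000)
data_p = rng.exponential(0.25, size=8000)
edges = np.linspace(0.0, 1.5, 16)
weights = apply_weights(mc_p, momentum_reweight_factors(mc_p, data_p, edges), edges)
print(weights.mean())
```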

  47. External Sources of Background • “Dirt” events: ν interactions outside of the detector; N_data/N_MC = 0.99 ± 0.15 with enhanced-background cuts. [Plot: event type of dirt events after PID cuts] • Cosmic rays: measured from out-of-beam data, 2.1 ± 0.5 events.

  48. Summary of Backgrounds [table/plot, with an example signal shown]

  49. Constraining the Measurement • Track-Based: re-weight MC predictions to match the measured ν_μ spectrum, taking into account correlations. • Boosted Decision Tree: include the ν_μ/ν_e correlations in the error matrix M; systematic and statistical uncertainties are included in M, binned in energy. (A sketch of the resulting χ² follows below.)
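A generic sketch of an energy-binned covariance-matrix χ² of the form described above, χ² = (d - p)ᵀ M⁻¹ (d - p), with statistical and systematic uncertainties folded into M. The binning, numbers, and function name are illustrative, not the collaboration's code or data.

```python
import numpy as np

def chi2(data, prediction, error_matrix):
    """chi^2 = (d - p)^T M^{-1} (d - p), with M the total (statistical +
    systematic) covariance matrix over the energy bins."""
    diff = np.asarray(data, dtype=float) - np.asarray(prediction, dtype=float)
    return float(diff @ np.linalg.solve(error_matrix, diff))

# Toy example with three energy bins and mild bin-to-bin correlations
data = [110.0, 95.0, 80.0]
prediction = [100.0, 100.0, 75.0]
M = np.array([[100.0, 20.0, 5.0],
              [20.0, 90.0, 15.0],
              [5.0, 15.0, 80.0]])
print(chi2(data, prediction, M))
```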

  50. Example: Cross Section Uncertainties (many are common to ν_μ and ν_e and cancel in the fit)
  • Determined from MiniBooNE ν_μ QE data: M_A^QE, Elo_SF: 6%, 2% (stat + bkg only); QE σ normalization: 10%; QE σ shape: function of E_ν; ν_e/ν_μ QE σ: function of E_ν
  • Determined from MiniBooNE ν_μ NC π⁰ data: NC π⁰ rate: function of π⁰ momentum; M_A^coh, coherent σ: ±25%; Δ → Nγ rate: function of γ momentum, + 7% branching fraction
  • Determined from other experiments: E_B, p_F: 9 MeV, 30 MeV; Δs: 10%; M_A^(1π): 25%; M_A^(Nπ): 40%; DIS σ: 25%
