
Diphoton+MET 2015: Tasks and Timelines

A living document outlining the tasks and timelines for the Diphoton+MET project, including estimates for data collection and analysis, background studies, and optimization.



  1. Diphoton+MET 2015: Tasks and Timelines A living document… Bruce Schumm SCIPP

  2. Timelines I • Best guess; potential 1 month delay due to sector short • 50 nsec running • Few pb-1 by end of May (but trigger in commissioning) • As much as 1 fb-1 by end of June • 2-3 fb-1 by end of August • can use for CONF NOTE • 25 nsec running • 10 fb-1 end of October • For journal publication • Proposal: Push/optimize for 2-3 fb-1 result

  2. Timelines II • If we push for “August” result… • Analysis walkthrough end of May • ~2 hr process, with much discussion • Expected to present unblinding case during walkthrough, up to necessary lacunae associated with data-driven studies • Editorial Board formed at that point • Draft of support note expected at that point • → Tall order! But we need to push.

  3. Tasks Overview • Code/Infrastructure • xAOD • Derivations • Higher-level infrastructure (Ryan’s package) • Event variables (MET with photons, etc.) • Event selection • Preliminary studies • Optimization • Backgrounds • QCD • Electroweak • Irreducible • Overlap • Models • SM samples • Strong & EW signal • Full vs. fast sim?

  5. Models • SM samples largely defined; all requests submitted (?) • Gluino, squark, wino grids defined • Requests still to be submitted • Full or fast? • Tommaso says our ~2M events OK for full sim if necessary • Need this soon! (optimization) • NB: As of 26 March, MC15 Full Sim available, Fast Sim not

  5. Backgrounds - QCD • Prior approach was to assume real diphotons are 75±25% of low-MET background • Diphoton MC used to estimate high-MET contribution • Pseudophoton control sample scaled to remainder of low-MET events used to estimate γ-jet contribution • Are exploring replacing pseudophoton control sample method with ABCD method • If this doesn’t work, will need to re-develop pseudophoton technique (potentially involved process)
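As a sanity check on the ABCD idea mentioned above, a minimal sketch of the arithmetic; the region definitions and counts below are purely illustrative, not from the analysis:

```python
# Minimal sketch of an ABCD background estimate. Regions A-D are defined
# by two discriminating variables assumed to be uncorrelated for background
# (e.g. photon ID tightness vs. isolation); all counts below are invented.

def abcd_estimate(n_b, n_c, n_d):
    """Predicted background in the signal-like region A: N_A = N_B * N_C / N_D."""
    if n_d == 0:
        raise ValueError("control region D is empty")
    return n_b * n_c / n_d

# Illustrative control-region counts:
n_a_pred = abcd_estimate(n_b=120.0, n_c=45.0, n_d=300.0)  # -> 18.0 events
```

The method only works to the extent the two variables really are uncorrelated for background, which is exactly what the proposed exploration would have to establish.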

  6. Backgrounds - EW • Estimate with eγ control sample scaled by e→γ fake rate • Need to select eγ control sample • Need to measure e→γ fake rate (tag and probe) • Wγ MC suggests that ~25% of EW background doesn’t arise from e→γ fakes • Some of this may be accounted for in QCD background • Some of QCD background may include e→γ fake events • Prior approach was to include 25% systematic error on the EW background • Should perform QCD/EW background overlap study
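A toy version of the tag-and-probe scaling. The fake-factor convention f/(1−f) used here is one common choice, not necessarily the analysis prescription, and every number is invented:

```python
# Toy e->gamma fake-rate measurement and EW background scaling.
# Tag-and-probe at the Z peak: count e-gamma pairs relative to ee pairs.

def fake_rate(n_egamma_peak, n_ee_peak):
    """f = N(e-gamma pairs at Z peak) / N(ee pairs at Z peak)."""
    return n_egamma_peak / n_ee_peak

def ew_background(n_control, f):
    """Scale the e-gamma control sample by the fake factor f / (1 - f)."""
    return n_control * f / (1.0 - f)

f = fake_rate(20.0, 1000.0)    # -> 0.02, i.e. a ~2% fake rate
n_ew = ew_background(50.0, f)  # ~1 expected fake-diphoton event
```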

  7. Backgrounds - Irreducible • Wγγ contribution estimated via ℓγγ control sample and simultaneous fit with SR • Question about comparison w/ VBFNLO expectation • Need to develop control sample and explore • Zγγ contribution from Sherpa, scaled to VBFNLO (via MadGraph) in relevant kinematic region • Big difference between VBFNLO and Sherpa not understood (Sherpa much larger) • Need to revisit
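The Sherpa-to-VBFNLO rescaling amounts to applying a k-factor in the chosen kinematic region; a sketch, where the cross-section values are placeholders rather than actual VBFNLO/MadGraph results:

```python
# Sketch of rescaling a generator-level yield to an NLO cross section.
# The sigma values (in pb) are placeholders, not real generator output.

def k_factor(sigma_nlo, sigma_lo):
    """NLO/LO cross-section ratio in the relevant kinematic region."""
    return sigma_nlo / sigma_lo

def scaled_yield(n_mc, sigma_nlo, sigma_lo):
    """Rescale the Monte Carlo yield by the k-factor."""
    return n_mc * k_factor(sigma_nlo, sigma_lo)

n_zgg = scaled_yield(3.2, sigma_nlo=0.045, sigma_lo=0.030)  # -> 4.8 events
```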

  8. Code/Infrastructure • xAOD-based analysis: TokyoTech, UCSC need to catch up • Derivations followed through on by Milano (status?) • Higher-level statistics and plotting utility (Ryan…) • Past quantities that have required study (do we need to look into these?) • MET • Isolation definition • ???

  9. Event Selection: Preliminary Studies • In past, formal optimization was last step, considering only Meff (or HT), MET • Individual, preliminary studies used to establish: • Photon pT cut; see e.g. https://indico.cern.ch/event/165989/contribution/0/material/slides/0.pdf • Δφγ-MET: make use of or not; cut value. Should we also cut on (Δφγ-MET)min? • Δφjet-MET: cut value. Should we also cut on (Δφjet-MET)min? • For 8 TeV, used Meff vs. MET visualization plane (see below) • Will need signal grid points for this already!

  10. Optimization: 8 TeV Approach • Last step done by inspection of Meff (or HT) vs. MET plane • Can be confounded by statistics; also look at background and signal stats over same plane • See 8 TeV backup note • [Figure: WP2 optimization in the Meff vs. MET plane]

  11. Optimization: The Conundrum • How to estimate backgrounds when final background estimates not available? • For 8 TeV analysis optimization, backgrounds estimated by: • QCD background estimated by scaling 1 tight + 1 non-isolated pseudophoton sample to the 2-tight-photon sample with no Meff (HT) cut for 0 < MET < 60 GeV (DATA) • EW background estimated by scaling eγ sample by uniform 2% e→γ scale factor (DATA) • Wγγ, Zγγ from MC • SUSY group will accept leaving final data-driven step and quick reoptimization before unblinding. Or, pre-optimize as a function of one to-be-determined background value
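The pre-optimization step can be sketched as a scan of a simple significance proxy over the Meff-MET plane; the 30% background systematic and all yields below are invented placeholders, not analysis numbers:

```python
import math

# Toy scan of (Meff, MET) cut pairs using s / sqrt(b + (rel_syst*b)^2)
# as a crude significance proxy; all yields are invented.

def significance(s, b, rel_syst=0.3):
    """Significance proxy including a relative background systematic."""
    if b <= 0:
        return float("inf")
    return s / math.sqrt(b + (rel_syst * b) ** 2)

# Hypothetical cumulative (signal, background) yields above each cut pair:
yields = {
    (1000, 150): (12.0, 9.0),
    (1500, 200): (8.0, 1.5),
    (2000, 250): (4.0, 0.2),
}

best_cuts = max(yields, key=lambda cuts: significance(*yields[cuts]))
# -> (2000, 250) for these toy numbers
```

The same scan could be redone in minutes once the one to-be-determined background value is filled in, which is what makes the "quick reoptimization before unblinding" option workable.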

  13. What SRs to Create? • For 8 TeV Analysis • Strong production: High Meff; backgrounds near 0 • EW production: Intermediate HT; backgrounds 1-2 events • Low mass bino, high mass bino for both • SP1, SP2, WP1, WP2 • Also: Model-independent SR (MIS), no Meff (HT) cut. Based on choosing MET cut at which EW and QCD backgrounds about the same (~1 event each)

  14. Model-Independent SR (?) • 8 TeV analysis: at MET = 250 GeV with no Meff cut, the EW and QCD backgrounds are about the same • Question: Should we rethink? What do we really want to do to minimize the chance that we miss a signal? • Hmmm…. How do we think about this?
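The "EW ≈ QCD" criterion for the model-independent MET cut can be illustrated with toy falling spectra; the normalizations and slopes are invented, tuned only so the toy crossing lands near 250 GeV:

```python
import math

# Toy background expectations (events above a given MET cut, in GeV);
# the normalizations and slopes are invented for illustration only.
def ew(met):
    return 10.0 * math.exp(-met / 100.0)

def qcd(met):
    return 53.0 * math.exp(-met / 60.0)

def crossing_met(met_values):
    """MET cut at which the EW and QCD expectations are closest."""
    return min(met_values, key=lambda m: abs(ew(m) - qcd(m)))

met_cut = crossing_met(range(100, 401, 10))  # -> 250 for these toy spectra
```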

  15. What Physics Could Hide Signal with Dominant BF into Photons and DM? • Degenerate SUSY scenarios? No – the energy has to go somewhere; we would see it in photons and/or MET. Photons will not be soft, because the decaying state will be either high-mass or boosted. • Low photonic BF? Would need to accelerate single-photon analyses; not really practical. • Long-lived scenarios? Need to re-create non-pointing photon reconstruction; probably no competition from CMS here anyway. • Perhaps most likely scenario is a lower-than-expected cross section from a non-SUSY process. Probably best addressed by what was done before, or perhaps just use no Meff or HT cut and the lower MET cut of the other, model-dependent SRs. • Could perhaps also maintain a low photon ET cut, but that could be a “can of worms”.

  16. Wrap Up • I haven’t mentioned limit setting within HistFitter • Immediate motivation is to unblind before or simultaneously with CMS • I’m not assuming we’ll necessarily be setting limits! • Our work is cut out for us. Thoughts? • We should start writing the skeleton of the backup note. If anyone is itching to do this, by all means. Otherwise, I’m very happy to do that.
