
MDC Status and Results




  1. MDC Status and Results D. A. Petyt 19th March 2005 • Talks in this session: • MDC overview David (5) • ND/CC parameter fitting and results David (15) • NC analysis status and results Tobi (10) • Nue MDC analysis status Chris (10) • Unveiling of MDC parameters Robert (5)

  2. MDC overview • The Mock Data Challenge (MDC) is designed to challenge the ability of the analysis groups to extract values of the oscillation parameters from datasets with unknown systematic and oscillation parameters. • There exist two sets of MDC files: • The ‘Challenge’ set – generated with tweaked values of selected systematic parameters and, for the FD, particular values of the oscillation parameters. These events should be treated as the ‘data’ – consequently, MC truth information is not available. • The ‘MC’ set – generated with nominal systematic parameters and no oscillations. • In addition to challenging the analysis groups to devise methods to extract these parameters, the exercise has two other benefits: • It motivates us to develop tools that will be useful for the analysis of ‘real’ data. • It exercises the machinery of generating and reconstructing large datasets. • The unknown oscillation parameters to be determined are: Δm²₂₃, sin²2θ₂₃, sin²2θ₁₃, f_sterile. • Systematic ‘tweaks’ are applied to the following parameters: • Quasi-elastic axial vector mass (3% variation) • Resonance production axial vector mass (3% variation) • DIS-resonance scale factor (4% variation) • BMPT hadron production uncertainties (~25 parameters, 1σ variations)

  3. Catalogue of data files • Far: • MC: • 40 numu files @ 6.5e20 POT • 8 nue files • 39 nutau files • MDC: • 1 challenge set file @ 7.4e20 POT • Near: • MC: • 222 files (550 spills/file @ 2.4e13 POT/spill) • MDC: • 246 files (550 spills/file) • All files processed with R1.12 (Christmas 2004 edition)
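
For scale (a back-of-the-envelope estimate only, assuming every ND file really contains the full 550 spills at 2.4e13 POT/spill): the ND MC set corresponds to roughly 222 × 550 × 2.4×10¹³ ≈ 2.9×10¹⁸ POT, and the ND MDC set to roughly 246 × 550 × 2.4×10¹³ ≈ 3.2×10¹⁸ POT.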

  4. MDC analysis tools • The MCReweight package: • Developed by Chris Smith – provides a method of reweighting GMINOS events to study the effect of changes in the MDC systematic parameters on particular physics analyses. • Incorporates NeugenInterface (C. Smith, H. Gallagher) for cross-section reweighting and GnumiInterface (C. Smith, M. Messier) for reweighting ND and FD neutrino events according to the BMPT parameterization of secondary hadron production. • Can in principle be extended to accommodate other reweighting schemes. The possibility of incorporating a more generalised beam reweighting scheme was discussed earlier at this meeting. • The Physics Analysis Ntuple (PAN): • A simple ROOT ntuple that contains “all relevant quantities” for a particular analysis. • Truth information for MC events, including sufficient information to permit event reweighting (i.e. at minimum E_nu, x, y, Q², W, E_mu, limited STDHEP information). Can be extended to accommodate other reweighting functions as required. • Reconstructed information necessary to perform the oscillation fit. • Quantities specific to a particular analysis, such as PID parameters and fiducial cuts, that enable specific event samples (CC/NC/nue-like) to be selected. • Association of tracks/showers/slices in the Near Detector. Identical format for ND and FD events. • Each group currently maintains its own version of the PAN, although there has been progress on standardising the common MC truth and reconstruction information between the various versions.
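
To make the PAN and reweighting ideas concrete, here is a minimal C++ sketch of the kind of per-event record such an ntuple might carry and how a multiplicative event weight could be applied on top of it. All names here (PanEvent, toyWeight, the linearised ma_qel response) are illustrative placeholders, not the actual PAN layout or the MCReweight/NeugenInterface API.

```cpp
// Illustrative per-event record in the spirit of the PAN: MC truth needed for
// reweighting, reconstructed quantities for the oscillation fit, and
// analysis-specific selection variables.  Field names are hypothetical.
#include <iostream>
#include <vector>

struct PanEvent {
  // MC truth (sufficient for cross-section reweighting)
  double trueEnu, x, y, Q2, W, trueEmu;
  int    interactionType;            // 0 = QEL, 1 = RES, 2 = DIS (illustrative coding)
  // Reconstructed quantities
  double recoEnu, recoShw, recoMu;
  // Analysis-specific quantities
  double pid;                        // CC/NC separation parameter
  bool   inFiducial;
};

// Toy stand-in for an MCReweight-style weight: scale QEL events according to a
// fractional shift in ma_qel.  The real package recomputes the cross section;
// this linearised response is purely illustrative.
double toyWeight(const PanEvent& evt, double maQelShift) {
  return (evt.interactionType == 0) ? 1.0 + 2.0 * maQelShift : 1.0;
}

int main() {
  std::vector<PanEvent> pan = {
      {3.2, 0.05, 0.2, 0.3, 1.1, 2.6, 0, 3.0, 0.6, 2.4, 0.8, true},
      {8.5, 0.30, 0.5, 2.1, 2.4, 4.2, 2, 8.1, 4.0, 4.1, 0.5, true},
  };
  double weightedSum = 0.0;
  for (const auto& evt : pan)
    if (evt.inFiducial && evt.pid > -0.2)   // ND CC-like selection cut from slide 6
      weightedSum += toyWeight(evt, 0.03);  // +3% ma_qel tweak
  std::cout << "weighted CC-like events: " << weightedSum << "\n";
}
```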

  5. Determination of systematic parameters from ND data D. Petyt, E. Lartey

  6. ND/CC MDC Analysis Procedure • Use the Mad package to process MDC files and produce the PID and event variables that are relevant to the ‘standard’ CC analysis. Make a cut on the PID parameter to select CC-like samples in both ND and FD (PID>-0.2 for ND, PID>-0.4 for FD) • Perform match-up between ND MC and MDC datasets to assess the level of agreement with nominal systematic parameters • Use the MCReweight package to perform a fit to the ND MDC data with a reduced set of systematic parameters in order to: • Determine if the level of agreement between MC/MDC samples can be improved • Obtain best-fit values and uncertainties on the systematic parameters • Use central values of these parameters in an oscillation fit to the FD MDC data set • Perform match-ups between FD MDC distributions and best-fit FD MC • Perform simultaneous ND/FD fit with systematics as ‘nuisance parameters’* *This has not yet been done

  7. ND MC/MDC matchup – after PID cut (MC and MDC distributions overlaid). Generally good agreement between MC and MDC in all variables.

  8. Reco_enu distribution – after cuts (curves: all MC events, true CC events, true NC events, challenge set). χ² = 36.2/30.

  9. MDC/MC matchup – nominal parameters. Match-up is pretty good – this implies that an FD fit with nominal beam/xsec parameters will be OK. An ND fit is required, however, to determine the allowed range of these parameters.

  10. Fit philosophy • A fit is performed to ND distributions that are sensitive to changes in the beam/cross-section systematic parameters, and the MCReweight package is used to predict how these distributions change when the parameters are varied. • The ND fit in this analysis is performed on the 2D e_reco vs reco_y distribution, where reco_y = reco_shw/reco_enu. • The reco_y dimension is necessary to provide some discrimination between QEL, RES and DIS events, and therefore some sensitivity to the differences between ma_qel, ma_res and disfact. • If only reco_enu is used, there is complete degeneracy between ma_qel and ma_res. • It is expected that the e_reco distribution will provide discrimination between the BMPT beam systematic parameters. • A total of 51 variable-width bins are employed in the fit (17 in e_reco and 3 in y), and a simple χ² is calculated between the observed and expected distributions. • An optional second term applies a penalty to the χ² whenever the parameters deviate from their nominal values, based on the uncertainties on these parameters measured from neutrino scattering data (x-sec) and NA20/SPY data (BMPT).
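
As a concrete illustration of this construction, the sketch below computes a binned χ² with an optional Gaussian penalty on the systematic parameters – the constrained/unconstrained distinction of the next slide. The bin contents, the single systematic parameter and its 1σ constraint are made-up placeholders; the real fit runs over the 51 e_reco × reco_y bins and the full parameter set via MCReweight.

```cpp
// Minimal binned chi-square with an optional penalty term on the systematic
// parameters (constrained fit).  All inputs below are toy numbers.
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// chi2 = sum_bins (obs - exp)^2 / exp   [ + sum_params ((p - p0)/sigma)^2 ]
double chi2(const std::vector<double>& observed, const std::vector<double>& expected,
            const std::vector<double>& params, const std::vector<double>& nominal,
            const std::vector<double>& sigma, bool constrained) {
  double c = 0.0;
  for (std::size_t i = 0; i < observed.size(); ++i)
    if (expected[i] > 0.0)
      c += std::pow(observed[i] - expected[i], 2) / expected[i];
  if (constrained)
    for (std::size_t j = 0; j < params.size(); ++j)
      c += std::pow((params[j] - nominal[j]) / sigma[j], 2);
  return c;
}

int main() {
  // Two illustrative bins and one systematic parameter (ma_qel shifted by +3%
  // against a 3% constraint, mimicking the penalty term described above).
  std::vector<double> obs{102, 48}, pred{95, 52};
  std::vector<double> par{1.03}, nom{1.00}, sig{0.03};
  std::cout << "unconstrained chi2: " << chi2(obs, pred, par, nom, sig, false) << "\n"
            << "constrained chi2:   " << chi2(obs, pred, par, nom, sig, true) << "\n";
}
```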

  11. Fit parameters and ranges • In ‘unconstrained’ fits, the parameters are allowed to vary freely within the ranges specified above, with no chisq penalty applied if they range far from the nominal values • In ‘constrained’ fits, a chisq penalty is applied when parameters deviate from their nominal values – the 1 sigma error is given by the ‘constraint’ column in the table above. (BMPT errors are taken from Alysia’s fits) * can be calculated outside of MCReweight

  12. Result of constrained fit: ma_qel = 1.032 +0.023/-0.022, ma_res = 1.032 +0.034/-0.016, alpha_pi = 3.52 +0.05/-0.11, a_pi = 6.22 +0.14/-0.15, A_pi = 64.8 +1.6/-3.6.

  13. Constrained fit MDC matchup (nominal and best-fit curves overlaid). χ² = 38.0/46. Not much of an improvement – unsurprising given the already good level of agreement.

  14. FD oscillation fit and parameter extraction D. Petyt

  15. Oscillation fit • Simple χ² fit to the FD reconstructed energy spectrum of selected CC-like events (PID>-0.4) • Uses 39 variable-width bins of Evis (either 0.5 or 1 GeV wide) • Subtract the NC component using the MC prediction of the NC background; assign a 10% error to this subtraction • Perform fits with nominal and best-fit systematic parameters
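
To illustrate the main ingredients of such a fit, the sketch below applies the standard two-flavour numu survival probability to a toy unoscillated FD prediction and adds an NC background term (equivalently, the NC component could be subtracted from the data, as described above). The baseline, binning and bin contents are illustrative assumptions, not the actual fit inputs; only the oscillation parameter values are taken from the MDC best fit quoted on slide 17.

```cpp
// Two-flavour survival probability applied to a toy FD prediction.
// P(numu->numu) = 1 - sin^2(2theta) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV])
#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

double survivalProb(double enuGeV, double dm2, double sin2_2theta,
                    double baselineKm = 735.0) {   // ~735 km Fermilab-Soudan baseline
  const double arg = 1.267 * dm2 * baselineKm / enuGeV;
  return 1.0 - sin2_2theta * std::sin(arg) * std::sin(arg);
}

int main() {
  const double dm2 = 2.175e-3, s22t = 0.925;       // MDC best-fit values (slide 17)
  // Toy unoscillated CC prediction and NC background in a few coarse Evis bins.
  std::vector<double> binCentreGeV{1.0, 2.0, 3.0, 5.0, 9.0};
  std::vector<double> ccUnosc{30.0, 55.0, 60.0, 45.0, 20.0};
  std::vector<double> ncBkg{5.0, 3.0, 2.0, 1.0, 1.0};
  for (std::size_t i = 0; i < binCentreGeV.size(); ++i) {
    const double pred = ccUnosc[i] * survivalProb(binCentreGeV[i], dm2, s22t) + ncBkg[i];
    std::cout << "Evis = " << binCentreGeV[i] << " GeV, predicted events = " << pred << "\n";
  }
}
```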

  16. Fit to reconstructed energy spectrum (curves: unoscillated, oscillated best fit, NC contamination, challenge set). χ² = 37/37 d.o.f.

  17. And the answer is… Best fit (×): Δm² = 2.175×10⁻³ eV², sin²2θ = 0.925; the Super-K best-fit point is also marked.
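
For orientation (a back-of-the-envelope cross-check, taking the ~735 km Fermilab–Soudan baseline): with the two-flavour formula P(numu→numu) = 1 − sin²2θ·sin²(1.267 Δm² L/E), the best-fit Δm² = 2.175×10⁻³ eV² puts the first oscillation maximum at E ≈ 2 × 1.267 × 2.175×10⁻³ × 735 / π ≈ 1.3 GeV, which is consistent with the ‘rise’ at low Evis seen in the spectrum ratios on the next slide.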

  18. Spectrum ratios: nominal systematic parameters (panels: NC subtracted, no NC subtraction; best-fit curve overlaid). Some evidence of a ‘rise’ at low Evis.

  19. Comparison of allowed regions • Contours and best-fit point do not change significantly when the FD MC is reweighted by the best-fit systematic parameters as determined by the ND. (Panels: constrained fit vs nominal parameters; unconstrained fit vs nominal parameters.)

  20. MDC result vs MC sensitivity (curves: MDC fit with nominal parameters, MC sensitivity). The MDC sample looks much like an ‘average’ dataset with Δm² = 2.175×10⁻³ eV², sin²2θ = 0.925.

  21. 1D χ² projections: MDC/sensitivity. Both Δm² and sin²2θ are measured to ~7% (1σ, 1 d.o.f.). sin²2θ = 1 is disfavoured at the 1σ level…

  22. Conclusions • We have performed a fit to the visible energy spectrum of selected CC-like events in the FD MDC ‘challenge’ sample. • The results seem reasonable, both in terms of the fit quality and the parameters obtained (given what we know about the generation of the ‘true’ parameters). • The allowed region is consistent with the Super-K best-fit point, with best-fit values Δm² = 2.175×10⁻³ eV², sin²2θ = 0.925. • The allowed region and best-fit point do not change significantly when we apply the best-fit systematic beam/x-sec parameters from the ND fit. • The ND fit seems able to separate the effects of the fitted systematic parameters (or at least the five I chose). • However, the match-up between ND MC and MDC samples is satisfactory even with nominal systematic parameters – this means either that the generated shifts were small or that there has been a miraculous cancellation between the various systematic parameters. • This means that a fit to the FD spectrum with nominal parameters is OK, at least to first order. • So, did we get the ‘right’ answer? • If so, I can leave here with my reputation intact. • If not, we will need to understand what went wrong, and which systematic parameters fooled us into obtaining incorrect values. • Robert will reveal all…

  23. First look at FD challenge set (curves: all MC events, true CC events, true NC events, challenge set). The distribution seems consistent with numu disappearance at a level expected for SK-like oscillation parameters…

  24. Effect of varying parameters (E. Lartey). These are largely equivalent.

  25. Effect of varying parameters - 2 (panels: ma_qel 9%, ma_res 9%, alpha_pi 5%, a_pi 6%).

  26. Near and Far ratios (ratio of weighted/nominal; panels: Near unconstrained, Far unconstrained; curves show individual components and the overall ratio, for both unconstrained and constrained fits). These plots show that large effects from single parameters can largely cancel when they are combined. There are some N/F differences in response to the beam parameters, probably exacerbated by the loose r < 1 m cut on the ND fiducial volume.
