
2005 Unbinned Point Source Analysis Update


Presentation Transcript


  1. 2005 Unbinned Point Source Analysis Update Jim Braun IceCube Fall 2006 Collaboration Meeting

  2. Review -- Inefficiency of Binned Methods
  [Figure: two example δ–α search bins, each with Nbin = 3 events; Case 1: Nch = 20, 24, 26; Case 2: Nch = 28, 60, 102]
  • Unused information
    • Event loss
    • Distribution of events within the bin
    • Track resolution
    • Event energy
  • Optimization
    • Bin sizes optimized to set the lowest flux limit are not optimal for 5σ discovery
  • Unbinned search methods should be better in every way
    • Except for the work needed to implement them

  3. Review -- Methods
  [Figure: events x1, x2]
  • Comparison of two likelihood approaches with the standard binned approach
    • Gaussian likelihood: assume the signal is distributed according to a 2D Gaussian determined from MC
    • Paraboloid likelihood: space angle error estimated on an event-by-event basis
  • The signal + uniform background hypothesis contains an unknown number of signal events out of the Nband total events in the declination band around the source. Minimize -log likelihood to find the best-fit number of signal events (see the sketch below).
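As a rough illustration of the signal-plus-uniform-background fit described above, the sketch below minimizes -log L over the number of signal events for a Gaussian signal PDF. It is a minimal Python sketch, not the analysis code; the function names (gaussian_signal_pdf, fit_n_signal), the small-angle Gaussian form, and the flat background density are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gaussian_signal_pdf(psi, sigma):
    """2D Gaussian signal PDF in space angle psi (radians), width sigma.
    Small-angle approximation; sigma would come from MC (Gaussian method)
    or per-event from the paraboloid error estimate."""
    return np.exp(-0.5 * (psi / sigma) ** 2) / (2.0 * np.pi * sigma ** 2)

def neg_log_likelihood(n_s, psi, sigma, bkg_density):
    """-log L for n_s signal events mixed with a uniform background.
    psi: space angles between events and the source; bkg_density: uniform
    background PDF value in the declination band (1 / solid angle)."""
    n_tot = len(psi)
    s = gaussian_signal_pdf(psi, sigma)
    mix = (n_s / n_tot) * s + (1.0 - n_s / n_tot) * bkg_density
    return -np.sum(np.log(mix))

def fit_n_signal(psi, sigma, bkg_density):
    """Best-fit number of signal events by minimizing -log L over n_s."""
    n_tot = len(psi)
    res = minimize_scalar(neg_log_likelihood, bounds=(0.0, float(n_tot)),
                          args=(psi, sigma, bkg_density), method="bounded")
    return res.x, res.fun
```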

  4. Review -- Methods
  • Test the hypothesis of no signal with the likelihood ratio
  • Compare the likelihood ratio to the distribution obtained in trials randomized in RA to compute the significance (see the sketch below)
  • Compare methods at fixed points in the sky
    • Simulate signal point source events with neutrino MC in fixed declination bands
    • Choose 1000 random background events from neutrino MC
    • Apply the 2005 filter and the 2000-2004 point source quality cuts
    • For the binned search, optimize the bin radius to minimize μ90(Nbkgd)/Ns
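A minimal sketch of the RA-scrambled significance estimate described above: the observed likelihood-ratio value is compared to the same statistic evaluated on trials in which event right ascensions are randomized while declinations are kept fixed. The callable test_stat and its interface are assumptions; in the real analysis it would wrap the likelihood fit sketched earlier.

```python
import numpy as np

rng = np.random.default_rng(42)

def scrambled_p_value(test_stat, ra, dec, observed_lambda, n_trials=10_000):
    """Estimate the chance probability of an observed likelihood-ratio value
    from RA-scrambled background-only trials.
    test_stat: callable(ra, dec) -> likelihood-ratio value (hypothetical).
    Declinations are kept fixed so the detector acceptance is preserved."""
    lambdas = np.empty(n_trials)
    for i in range(n_trials):
        scrambled_ra = rng.uniform(0.0, 2.0 * np.pi, size=len(ra))
        lambdas[i] = test_stat(scrambled_ra, dec)
    # p-value: fraction of scrambled trials at or above the observation
    return np.mean(lambdas >= observed_lambda)
```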

  5. Detection Probability
  [Figure: detection probability at δ = 22.5°, α = 180°, 1000 background events; likelihood vs. binned (cone) methods, 3σ and 5σ thresholds]
  • Gaussian and paraboloid methods perform similarly
    • A paraboloid resolution quality cut was applied to the simulation; the paraboloid method may improve with a looser cut
  • Clear 15%-20% decrease in the number of events needed to achieve a given significance and detection probability compared to the binned method
  • More to gain for hard spectra
    • Use energy information in the likelihood formulation

  6. What if there is no Signal?
  • In the absence of signal, how do the limits (sensitivity) of unbinned searches compare with binned searches?
  • Sensitivity of binned searches (see the sketch below):
    • Calculate Nbkgd for the optimal search bin at selected zenith angles
    • Look up μ90(Nbkgd) from Feldman-Cousins Poisson tables
    • Sensitivity = μ90(Nbkgd) × Φ / Ns(Φ)
  • Unbinned searches
    • No Poisson statistics -- no 'number' of observed events
    • Need to create analysis-specific Feldman-Cousins confidence tables
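The binned sensitivity recipe above can be written compactly: average the tabulated Feldman-Cousins 90% upper limits over the Poisson-distributed number of observed events, then scale a reference flux by the ratio of that average limit to the expected signal count. A sketch, where mu90_table is a hypothetical stand-in for a lookup into the published FC Poisson tables:

```python
import numpy as np
from scipy.stats import poisson

def average_upper_limit(n_bkg, mu90_table, n_max=50):
    """Average FC 90% upper limit for background-only observations:
    weight the tabulated limits mu90(n_obs) by the Poisson probability
    of observing n_obs events given background mean n_bkg."""
    n_obs = np.arange(n_max + 1)
    weights = poisson.pmf(n_obs, n_bkg)
    return np.sum(weights * np.array([mu90_table(n) for n in n_obs]))

def binned_sensitivity(n_bkg, mu90_table, phi_ref, n_sig_at_phi_ref):
    """Flux sensitivity: scale the reference flux phi_ref by the ratio of
    the average event upper limit to the expected signal events Ns(phi_ref)."""
    return average_upper_limit(n_bkg, mu90_table) * phi_ref / n_sig_at_phi_ref
```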

  7. Feldman-Cousins Tables
  • Given an observation of an observable o, we would like to place limits on some physical parameter μ
  • Past AMANDA point source searches:
    • Observable o = number of events in the search bin
    • Parameter μ = neutrino flux from a source in the direction of the search bin
  • We can calculate P(o|μ)
    • For a search bin with N events and B expected background, P(o|μ) is the Poisson probability of N events given mean (μ + B)
  • For each μ, integrate probability until the desired coverage is reached (typically 90%)
    • Order by P(o|μ)/P(o|μbest) to determine which values of the observable are included in the acceptance region (see the sketch below)
  • This 'confidence belt' in o–μ space contains 90% of the total probability
    • In 90% of observations of the observable o, the true value of μ will lie in the confidence belt
    • The 90% upper and lower confidence limits given an observed o correspond to the maximum and minimum values of μ in the confidence belt
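The ordering rule above, P(o|μ)/P(o|μbest), can be made concrete for the Poisson-with-background case. The sketch below builds the 90% acceptance interval in n for one value of μ; scanning μ and stacking the intervals gives the confidence belt. This is the standard textbook construction, not the analysis code.

```python
import numpy as np
from scipy.stats import poisson

def fc_acceptance_interval(mu, b, cl=0.90, n_max=100):
    """Feldman-Cousins acceptance region in the observed count n for signal
    mean mu on top of expected background b. Observations are ranked by
    P(n|mu) / P(n|mu_best) with mu_best = max(0, n - b), and accepted in
    decreasing rank until the coverage reaches cl."""
    n = np.arange(n_max + 1)
    p = poisson.pmf(n, mu + b)
    mu_best = np.maximum(n - b, 0.0)
    p_best = poisson.pmf(n, mu_best + b)
    rank = p / p_best
    order = np.argsort(rank)[::-1]          # most favored observations first
    accepted, coverage = [], 0.0
    for i in order:
        accepted.append(n[i])
        coverage += p[i]
        if coverage >= cl:
            break
    return min(accepted), max(accepted)
```

For an observed n, the 90% limits are then the largest and smallest μ whose acceptance interval still contains n.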

  8. Feldman-Cousins Tables
  • Construction of confidence belts for likelihood searches
    • μ = Poisson mean number of true events, corresponding to flux
    • o = ANY observable
    • Choose Till's significance estimate as the observable
  • Need a table of P(z|μ) on a fine grid of μ (see the sketch below)
    • Choose the number of signal events (N) from a Poisson distribution with mean μ
    • Calculate the significance estimate and repeat ~10k times
    • The significance estimate distribution yields P(z|μ)
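A sketch of the brute-force table construction described above: for each μ on the grid, draw the number of injected signal events from a Poisson distribution and record the significance estimate from one simulated experiment. The callable significance_of_trial is a hypothetical stand-in for a full trial (signal injection on background plus evaluation of the likelihood-ratio observable).

```python
import numpy as np

rng = np.random.default_rng(0)

def build_pzm_table(significance_of_trial, mu_grid, n_trials=10_000):
    """Tabulate P(z | mu) on a grid of Poisson means mu.
    significance_of_trial: callable(n_signal) -> z for one simulated
    experiment with n_signal injected events (hypothetical interface)."""
    table = {}
    for mu in mu_grid:
        z = np.empty(n_trials)
        for i in range(n_trials):
            n_sig = rng.poisson(mu)
            z[i] = significance_of_trial(n_sig)
        table[mu] = z          # empirical distribution of the observable
    return table
```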

  9. Feldman-Cousins Tables
  [Figures: P(z|μ) distribution and FC 90% confidence band, δ = 22.5°, 1000 background events]
  • Easier in practice: simulate sources with Nt events and weight by the Poisson probability of Nt for a given μ (see the sketch below)
  • Confidence belts constructed by integrating the probability for each μ to 90%
  • Average upper limit calculable using the confidence band and the z distribution for μ = 0
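One way to read the shortcut on this slide: generate trials once at fixed injected multiplicities Nt and re-weight them by the Poisson probability of Nt for any μ, rather than regenerating trials per μ. A minimal sketch under that assumption, where z_by_nt is a hypothetical pre-computed dictionary mapping Nt to the significance values of trials with exactly Nt injected events:

```python
import numpy as np
from scipy.stats import poisson

def weighted_pzm(z_by_nt, mu):
    """Build the weighted empirical distribution P(z | mu) from trials
    generated at fixed injected multiplicities n_t, weighting each
    multiplicity by the Poisson probability of n_t given mean mu."""
    z_all, w_all = [], []
    for n_t, z in z_by_nt.items():
        w = poisson.pmf(n_t, mu) / len(z)   # equal weight within a multiplicity
        z_all.append(np.asarray(z))
        w_all.append(np.full(len(z), w))
    z_all = np.concatenate(z_all)
    w_all = np.concatenate(w_all)
    order = np.argsort(z_all)
    return z_all[order], w_all[order]       # sorted values and their weights
```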

  10. Sensitivity Comparison
  [Figure: Gaussian LH and paraboloid LH, space angle θ]
  • Compare the sensitivity of the likelihood methods to the sensitivity of the binned cone search at three zenith angles
  • 22%-24% better sensitivity at δ = 22.5°, similar to the gain in detection probability
  • Again, more to gain for hard spectra with energy information in the likelihood function
    • If Nch is the cut parameter, then for E^-2 fluxes the limits should be better than with the optimal Nch cut

  11. Roadmap to Unblinding
  • Significant work yet to be done to unblind 2005!
    • Addition of an energy estimator to the likelihood function -- may be as simple as Nch
    • 2005 neutrino sample selection -- cuts intended to maximize neutrino efficiency
  • The future: analyze 2000-2005(6) (possibly 1997-2006)

  12. Questions/Comments
