Attempts of neural tracking in ALICE
Presentation Transcript

  1. Attempts of neural tracking in ALICE Alberto Pulvirenti – I.N.F.N. Catania ALICE Week Monday, March 18 2002

  2. Outline • Motivations • Implementation • pattern recognition method • cuts and working conditions • reconstruction • Results • Collateral developments: “combined” tracking • Summary and outlook

  3. Motivations • High pt tracking • Tracking without the TPC, for high-rate events and high transverse momentum particles [ pt > 1 GeV/c ] • This is a useful benchmark for the method. • There is earlier work in ALICE/ITS 99-34 (1999) to compare with (although that code is not in CVS and used an older version of the ALICE/ITS geometry and rec-point simulation). • Low pt tracking (main goal) • Increased tracking efficiency for low transverse momentum particles [ pt < 0.2 GeV/c ] or decaying particles which don’t leave enough rec-points in the TPC • Comparison with ALICE 95-50 (1995)

  4. Basics • Neural network constituents: • an array of “neurons” with a real “activation” value (a data member, ai) • a symmetric matrix of “synaptic weights” which represents the correlation between neurons (wij) • Neural network work-flow: • random initialization • asynchronous updating cycle (one neuron at a time) • stabilization • final result → binary map

  5. Implementation (1 – definitions) • Neurons: oriented track segments → 2 indices: [sij] • link two consecutive points in the particle’s path → important to define a direction • Weights: correlations between 2 segments → 4 indices [wijkl] • Geometrical constraint: the weight is non-zero only when the two neurons share a point; two possible configurations, depending on their orientations: • sequence (sij with sjk – one segment starts from the end of the other) → a possible guess for a tracklet (excitation → positive weight) • stronger correlation for well-aligned segments (selects high pt particles) • crossing (sij with sik or skj – two segments starting from or ending at the same point) → a choice is needed (inhibition → negative weight) [figure: good sequence, bad sequence and crossing configurations for points i, j, k]
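The sign rule above can be sketched in a few lines. This is an illustrative Python stand-in (the actual tracker is AliRoot C++); the `gain`/`cost` scale factors and the pure cosine alignment term are assumptions for illustration, not the weight formula of the slides:

```python
import math

def weight(seg_a, seg_b, gain=1.0, cost=1.0):
    """Illustrative weight between two oriented segments.

    A 'sequence' (seg_a ends where seg_b starts) is excitatory, scaled by
    alignment so that straight (high-pt) continuations are favoured; a
    'crossing' (shared start or shared end point) is inhibitory.
    A segment is a (tail, head) pair of (x, y) points.
    """
    (i, j), (k, l) = seg_a, seg_b
    if j == k:                      # sequence: i -> j -> l through shared point j
        ux, uy = j[0] - i[0], j[1] - i[1]
        vx, vy = l[0] - k[0], l[1] - k[1]
        cos_a = (ux * vx + uy * vy) / (math.hypot(ux, uy) * math.hypot(vx, vy))
        return gain * max(cos_a, 0.0)   # stronger for well-aligned segments
    if i == k or j == l:            # crossing: two segments compete for a point
        return -cost
    return 0.0                      # no shared point -> no correlation
```

Two collinear consecutive segments then get the maximal excitatory weight, while two segments competing for the same point inhibit each other.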

  6. Implementation (2 – cuts) Needed to limit the number of point pairs used to create neurons • Check only pairs on adjacent layers • Cut on the difference in polar angle (θ) • Cut on the curvature of the projected circle passing through the two points and the calculated vertex (by means of the AliITSVertex class) • “Helix matching cut”, where a is the corresponding projected circle arc

  7. Implementation (3 – work flow) “Step by step” procedure (removing the used points at the end of each step) • Many curvature-cut steps, with increasing cut value • Sectioning of the ITS barrel into N azimuthal sectors RISK: edge effects – tracks crossing a sector boundary will not be recognizable by the ANN tracker
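The azimuthal sectioning might be sketched as follows; the binning convention (sector 0 starting at φ = 0) is an assumption for illustration:

```python
import math

def sector_index(x, y, n_sectors=18):
    """Azimuthal sector of a rec-point, as used to split the ITS barrel.
    18 sectors is the test choice quoted later in the slides; the exact
    binning convention here is an illustrative assumption."""
    phi = math.atan2(y, x) % (2.0 * math.pi)        # phi folded into [0, 2*pi)
    return int(phi / (2.0 * math.pi / n_sectors))
```

A track whose points straddle a sector boundary gets split between two bins, which is exactly the edge effect flagged above.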

  8. Implementation (4 – classes) 2 AliRoot neural-tracking classes created (already in CVS) • Neuron class (for internal use): AliITSneuron • Provided only with a constructor and a method to calculate the segment angle w.r.t. another similar (related) object • Tracker class: AliITSneuralTracker • setters for the working parameters [ S, A/B, T, n, min, max ] • a global method that runs the whole processing • …and a service class: AliITSglobalRecPoint • just to allow storing the reconstructed points, in global coordinates, into a TObjArray

  9. Test trial ingredients • All detectors “on” and all physical effects “on”. • Full slow simulation and reconstruction in ITS. • Default detailed ITS geometry (AliITSvPPRasymm) • Parameterized HIJING generator: • 84210 particles in | h | < 8.0 • 21000 particles in | h | < 0.9

  10. Results (I) Number of found tracks, efficiency and CPU time as a function of the number of sectors. Only one event analyzed. Test choice: 18 sectors. CPU time: ~230 s (PC used: PIII 1 GHz).

  11. Results (II) [plots: good and fake track rates, Kalman vs Neural] Note: the “findable” tracks are counted among all ITS-findable tracks (not only the ones which are also findable in the TPC)

  12. Reconstruction (1 – method) C = curvature, Dt = transverse impact parameter, Dz = longitudinal impact parameter, λ = dip angle, γ0 = momentum azimuthal angle (a.k.a. φ) • Two steps • XY plane: fit of the bending circle with the Riemann-sphere mapping algorithm (thanks to R. Turrisi for pointing us to the existing bibliography) → gives C, Dt, γ0 • Whole space: linearized helix equation, calculating the arc length s for each point with the fit values from the previous step (gives tan λ, Dz)
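A minimal sketch of the Riemann-sphere circle fit (an illustrative reconstruction, not the AliRoot implementation): each transverse-plane point is mapped stereographically onto a sphere, a plane is fitted to the mapped points by least squares, and the plane is mapped back to the circle parameters:

```python
import math

def riemann_circle_fit(points):
    """Fit a circle to (x, y) points via the Riemann-sphere mapping.
    Returns (cx, cy, R): the circle centre and radius in the bending plane."""
    # stereographic map: (x, y) -> (u, v, z) on the sphere
    mapped = []
    for x, y in points:
        w = x * x + y * y
        mapped.append((x / (1.0 + w), y / (1.0 + w), w / (1.0 + w)))

    # least-squares fit of the plane z = a*u + b*v + d (normal equations)
    Suu = Suv = Svv = Su = Sv = Suz = Svz = Sz = 0.0
    n = float(len(mapped))
    for u, v, z in mapped:
        Suu += u * u; Suv += u * v; Svv += v * v
        Su += u; Sv += v
        Suz += u * z; Svz += v * z; Sz += z
    A = [[Suu, Suv, Su, Suz],
         [Suv, Svv, Sv, Svz],
         [Su,  Sv,  n,  Sz]]
    for col in range(3):                       # Gauss-Jordan elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    a, b, d = (A[i][3] / A[i][i] for i in range(3))

    # the plane a*u + b*v - z + d = 0 maps back to the circle
    # x^2 + y^2 + a/(d-1)*x + b/(d-1)*y + d/(d-1) = 0
    k = d - 1.0
    cx, cy = -a / (2.0 * k), -b / (2.0 * k)
    R = math.sqrt(cx * cx + cy * cy - d / k)
    return cx, cy, R
```

Because stereographic projection maps circles in the plane to circles (hence planes) on the sphere, an exact circle is recovered exactly, and the fit is linear rather than iterative.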

  13. Reconstruction (2 – results)

  14. Impact parameter resolution

  15. “Combined” tracking An attempt to increase the tracking efficiency by using the neural algorithm to recognize some findable tracks among the ITS points left over after the Kalman TPC + ITS tracking. • Different definition for the excitatory weight and/or cut criteria • No ITS azimuthal sectioning • More CPU time required The “findable” tracks are counted among all ITS-findable tracks (irrespective of whether they are also findable in the TPC) [plots: Kalman only vs Kalman + Neural]

  16. Summary & outlook • Neural network tracking for high transverse momentum tracks in the ITS stand-alone looks promising. • Tracking efficiency for pt > 1 GeV/c tracks is comparable with that of the TPC+ITS Kalman filter. • Track reconstruction included • ALICE Note already submitted • In progress: • Improving the neural algorithm performance for LOW transverse momentum tracks [ pt < 0.2 GeV/c ]. • Alternative possible techniques for the same purpose (elastic tracking, elastic arms algorithm…) • Other possible developments • Combined tracking using also the “remaining” TPC points after Kalman tracking

  17. Results (III) [plot: good and fake track rates] Results from the work of Kindiziuk et al. [ALICE/ITS 99-34 (1999)]

  18. Basics • Neural network constituents: • an array of “neurons” with a real “activation” value (a data member, ai) • a symmetric matrix of “synaptic weights” which represents the correlation between neurons (wij) • Neural network work-flow: • random initialization • asynchronous updating cycle (one neuron at a time): • sum the activations of all neurons, each multiplied by the weight of its connection to the one being updated • use this value as the argument of an activation function, and set the new activation to its result • stabilization • from cycle to cycle, the average of the activation variations (taken over the whole network) falls off; the network is considered “stable” when this average is less than a defined threshold S • final result • create a binary map by turning neurons “on” or “off” according to whether their activation is greater or smaller than a threshold value amin
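The work-flow above can be condensed into a short sketch. Python is used for illustration (the real code lives in AliITSneuralTracker), and the temperature, threshold values and toy weight matrix below are assumptions:

```python
import math
import random

def logistic(q, temperature=1.0):
    """'Logistic' activation: increasing, limited to (0, 1), with a slope
    (temperature) parameter."""
    return 1.0 / (1.0 + math.exp(-q / temperature))

def run_network(weights, stability_threshold=1e-4, act_min=0.5, seed=1):
    """Asynchronous updating as in the work-flow above: random initialization,
    one-neuron-at-a-time updates, stop when the mean activation variation
    falls below the threshold S, then binarize against a_min.
    `weights` is a symmetric n x n matrix with zero diagonal."""
    rng = random.Random(seed)
    n = len(weights)
    act = [rng.random() for _ in range(n)]        # random values in [0, 1]
    while True:
        total_change = 0.0
        for i in range(n):                        # asynchronous cycle
            q = sum(weights[i][k] * act[k] for k in range(n) if k != i)
            new = logistic(q)
            total_change += abs(new - act[i])
            act[i] = new                          # updated value used at once
        if total_change / n < stability_threshold:   # network is stable
            break
    return [a > act_min for a in act]             # final binary map
```

For example, with two mutually excitatory neurons and a third inhibited by both, the network stabilizes with the first two “on” and the third “off”, mirroring how competing segment hypotheses are suppressed.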

  19. Implementation (I) [figure: weight formula with the good sequence, bad sequence and crossing configurations for points i, j, k; the ‘gain’ and ‘cost’ contributions are indicated] Free parameters: it is enough to define only 2 of them (T and A/B)

  20. Implementation (II) [plot: the activation function vs its argument, rising from 0 to 1 with threshold 0.5 and a slope parameter] • Neurons: oriented track segments → 2 indices: [sij] • link two consecutive points in the particle’s path → important to define a direction • Weights: correlations between 2 segments → 4 indices [wijkl] • Geometrical constraint: the weight is non-zero only when the two neurons share a point; two possible configurations, depending on their orientations: • sequence (sij with sjk – one segment starts from the end of the other) → a possible guess for a tracklet (excitation → positive weight) • stronger correlation for well-aligned segments (selects high pt particles) • crossing (sij with sik or skj – two segments starting from or ending at the same point) → a choice is needed (inhibition → negative weight) • Activation function: • limited within [0,1] → low-activation units don’t influence the others • increasing function, so that a weight > 0 increases the activations while a weight < 0 does the opposite

  21. Implementation (III) Hopfield neural network with real-valued neurons [NIM A279 (1989) 537] [flow chart:] • Initialization of the activations ai with random values within [0,1] • Start updating cycle • Loop until all units have been updated: get the unit’s total input Qi = Σ wik ak and calculate its activation by the “logistic” function • End updating cycle • Is the mean activation variation lower than S? • no → start a new updating cycle • yes → switch “on” all neurons whose activation is > 0.5 → END

  22. Implementation (IV) The length of the circle arc (on the outer layer) where the good mates can be found depends on its distance from the inner layer • No cut in Δφ (it would cause a different cut strength for each pair of layers, due to the different layer distances)

  23. Test trial ingredients • AliRoot v3-06-Rev-02 and HEAD. • All detectors “on” and all phys. eff. “on”. • Full slow simulation and reconstruction in ITS. • Default detailed ITS geometry (AliITSvPPRasymm) • Parameterized HIJING generator: • 84210 particles in | h | < 8.0 • 21000 particles in | h | < 0.9