
Monte Carlo Instrument Simulation Activity at ISIS


Presentation Transcript


  1. Monte Carlo Instrument Simulation Activity at ISIS – Dickon Champion, ISIS Facility

  2. HET Fermi Chopper Simulation

  3. OSIRIS Back Scattering Instrument [guide schematic: m=2 section (10 cm × 10 cm), m=3.0 section (2 cm × 4 cm), lengths L1 and L2]

  4. Wish Diffractometer Guide

  5. Polarizing Mirror Component [figure: single-bounce geometry at 0.4° and 1.2°]

  6. Double bounce [figure: double-bounce geometry at 0.4° and 1.2°]

  7. New Vitess Module for ISIS

  8. Distributed Monte Carlo Instrument Simulations

  9. Introduction • What is Distributed Computing • The software we use • VITESS Specifics • McStas Specifics • Conclusions

  10. What do I mean by ‘Distributed Grid’? • A way of speeding up large, compute intensive tasks • Break large jobs into smaller chunks • Send these chunks out to (distributed) machines • Distributed machines do the work • Collate and merge the results

  11. Spare Cycles Concept • Typical PC usage is about 10% • Most PCs not used at all after 5pm • Even with ‘heavily used’ (Outlook, Word, IE) PCs, the CPU is still grossly underutilised • Everyone wants a fast PC! • Can we use (“steal?”) their unused CPU cycles? • SETI@home, World Community Grid (www.worldcommunitygrid.org)

  12. Visual Introduction to the Grid

  13. Suitable / Unsuitable Applications • CPU Intensive • Low to moderate memory use • Not too much file output • Coarse grained • Command line / batch driven • Licensing issues?

  14. Monte Carlo Speed-up Ideas • Two scenarios: • Single large simulation run • Split the neutron count into smaller chunks and execute them separately • Merge the results in some way • Many smaller runs • Parameter scan
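  As a rough illustration of the first scenario, the overall flow is a scatter/gather pattern: divide the neutron count into chunks, run each chunk independently with its own random seed, then merge. The sketch below is illustrative only; submit_chunk and merge_results are hypothetical stand-ins for the real grid submission and merge tools described on the following slides.

      # Minimal sketch of the split / execute / merge flow for one large run.
      # submit_chunk() and merge_results() are hypothetical stand-ins, not real tools.
      def distribute(total_neutrons, n_chunks, submit_chunk, merge_results):
          per_chunk = total_neutrons // n_chunks
          # Each chunk gets a smaller neutron count and its own seed; on the grid
          # these would run in parallel on separate machines.
          partials = [submit_chunk(per_chunk, seed=i + 1) for i in range(n_chunks)]
          return merge_results(partials)

  The second scenario is simpler still: each parameter-scan point is already an independent run, so the chunks are the runs themselves.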

  15. VITESS – Splitting It • Easy mode of operation: fixed executables + data files • Executables held on server • Split command line into bits – divide Ncount • Vary the random seed • Create data packages • Upload data packages

  16. VITESS – Running It • Use GUI to create instrument – Save As Command • “Parameter directory” set to “.” • Submit program parses the bat file • Substitutes ‘V’ and ‘P’ • Removes ‘header’ and ‘footer’ • Creates many new bat files, each with a different ‘--Z’
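  A hedged sketch of that splitting step, assuming a saved VITESS command line whose '-n' neutron count and '--Z' seed can be rewritten per chunk (those option names come from this slide and the console output on the next one; the real work is done by the compiled Vitess-Submit.exe, and the toy command line below is invented for illustration).

      # Illustrative only: rewrite one saved command line into per-chunk variants
      # with a reduced neutron count and a distinct '--Z' random seed.
      import re

      def make_chunk_commands(saved_cmd, total_neutrons, n_chunks):
          per_chunk = total_neutrons // n_chunks
          commands = []
          for i in range(n_chunks):
              cmd = re.sub(r"-n\d+", "-n%d" % per_chunk, saved_cmd)  # divide Ncount
              cmd = re.sub(r"--Z\d+", "--Z%d" % (i + 1), cmd)        # vary the random seed
              commands.append(cmd)
          return commands

      # e.g. 3e7 neutrons over 20 chunks -> 20 command lines with -n1500000 each
      for line in make_chunk_commands("vitess_module -n30000000 --Z1", 30_000_000, 20):
          print(line)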

  17. VITESS – Running It • Submit program creates many bat files

      C:\My_GRID\VITESSE\VITESSE\build>Vitess-Submit.exe example_job example.bat req_files 20
      logging in to https://bruce.nd.rl.ac.uk:18443/mgsi/rpc_soap.fcgi as tom....
      Adding Vitesse dataset....
      Adding Vitesse datas....
      3e+007 neutrons split into 20 chunks, of -n1500000 neutrons
      Total number of Vitesse 'runs' = 20
      Uploading data for run #1...
      Uploading data for run #2...
      .
      .
      Uploading data for run #19...
      Uploading data for run #20...
      Adding Vitesse datas to system....
      Adding job....
      Adding jobstep....
      Turning on automatic workunit generation....
      Closing jobstep....
      All done
      Your job_id is 4878

  18. VITESS – Merging It • Download the ‘chunks’ • Merge data files • DetectedNeutrons.dat: concatenate • vpipes: trajectories & count rate • Two classes of files • 1D – Values: sum & divide by number of chunks • Errors: square, sum and divide • 2D – Sum / number of chunks
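  A minimal sketch of those merge rules, assuming each chunk delivers 1D values and errors on the same grid: values are summed and divided by the number of chunks, errors are squared, summed and divided (with the final square root made explicit here), and 2D maps are simply summed and divided.

      from math import sqrt

      # Sketch of the 1D rule: average the values, combine the errors in quadrature.
      def merge_1d(value_chunks, error_chunks):
          n = len(value_chunks)
          bins = range(len(value_chunks[0]))
          values = [sum(c[i] for c in value_chunks) / n for i in bins]
          errors = [sqrt(sum(c[i] ** 2 for c in error_chunks)) / n for i in bins]
          return values, errors

      # Sketch of the 2D rule: sum the maps and divide by the number of chunks.
      def merge_2d(maps):
          n = len(maps)
          rows, cols = len(maps[0]), len(maps[0][0])
          return [[sum(m[r][c] for m in maps) / n for c in range(cols)] for r in range(rows)]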

  19. VITESS – Advantages and Problems • Many times faster: speed-up scales roughly linearly with the number of machines • Needs verification runs (x3) • Typically 11, potentially 30+, times faster • A 12-hour run completes in 1 hour! • Very large simulations hit random-number limits

  20. VITESS – Some Results [timing comparison: 176 hours vs 59 hours vs 6 hrs 20 mins]

  21. McStas – Splitting It • Different executable for every run • Executable must be uploaded at run time • Split -n into chunks • or run many instances (parameter scan) • Create data (+ executable) packages • Upload packages

  22. McStas – Running It • Use McGui to create and compile executable • Create input file for Submit program

  23. McStas – Running It • Large run • Submit program breaks up -n##### • Uploads new command line + data + executable • Parameter Scan • Send each run to a separate machine
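  For the parameter-scan case, each scan point is simply its own command line destined for a different machine. A small sketch follows, in which the instrument executable name and the scanned parameter are hypothetical; McStas instruments take parameters as name=value and the neutron count as -n on the command line.

      # Illustrative parameter scan: one independent command line per scan point,
      # each of which would be sent to a separate grid machine.
      def scan_commands(executable, parameter, values, ncount):
          return ["%s -n%d %s=%g" % (executable, ncount, parameter, v) for v in values]

      for cmd in scan_commands("my_instrument.exe", "lambda", [2.0, 4.0, 6.0], 1_000_000):
          print(cmd)   # e.g. my_instrument.exe -n1000000 lambda=2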

  24. McStas – Merging It • Many output files → separate merge program • PGPLOT and Matlab implemented • Very similar • PGPLOT • 1D – intensities: sum and divide. Errors: square, sum and divide. Events: sum • 2D – intensities: sum and divide. Errors: square, sum and divide. Events: sum • Matlab • 1D – same maths, different format • 2D – virtually the same • ‘Metadata’: leave untouched

  25. McStas – Advantages and Problems • Security: do we trust users? • 100 times faster [?] • Linux version much faster than Windows [?] • How do we merge certain fields? • values = '1.44156e+006 10459.9 30748'; • statistics = 'X0=3.5418; dX=1.52975; Y0=0.000822474; dY=1.0288;'; • Some issues related to the randomness of the moderator file
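  On the open question of merging header fields, one plausible approach (an assumption, not necessarily what was implemented) is to apply the slide-24 rules to the 'values' triplet of intensity, error and event count: average the intensities, combine the errors in quadrature, and sum the events.

      from math import sqrt

      # One possible way to merge the 'values' field (intensity, error, events)
      # across chunks, following the 1D rules above; not necessarily the method used.
      def merge_values(values_per_chunk):
          n = len(values_per_chunk)
          intensity = sum(v[0] for v in values_per_chunk) / n          # sum and divide
          error = sqrt(sum(v[1] ** 2 for v in values_per_chunk)) / n   # square, sum, divide
          events = sum(v[2] for v in values_per_chunk)                 # events just sum
          return intensity, error, events

      # First tuple is the example from the slide; the second is invented for illustration.
      print(merge_values([(1.44156e6, 10459.9, 30748), (1.43900e6, 10480.0, 30610)]))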

  26. Conclusions • Both run well under Grid MP • Submit & Retrieve: a few hours' work • Merge: a bit more • Needs to merge more output formats [?] • Issues with very large simulations • More info on Grid MP at www.ud.com

  27. Acknowledgements • Tom Griffin - GRID • Ed Abel - GRID • Stuart Ansell - MCNPX • Mark Telling - OSIRIS • Robert Dalgliesh - Polarization • Laurent Chapon - WISH • Judith Peters - HET • Heloisa Bordallo - HET • Geza Zsigmond - HET
