Report from XXI HTASC

Presentation Transcript


  1. Report from XXI HTASC Tobias Haas HEPCCC 29 June 2002

  2. General Remarks • Last HTASC: 6/7 June at Niels Bohr Institute, Copenhagen • Thanks to Björn Nilsson for the organization. • Upcoming meetings: • 26/27 September, CERN • 13/14 March 2003, CERN • Topics: • Grid and Networking Activities in the Nordic Countries • Reports from the Working Groups: • Videoconferencing (Chair: Hans Frese) • W2K (Chair: Gian Piero Siroli and Michel Jouvin) • Round table discussion: • Status of Tier-2 Centers Tobias Haas

  3. XXI HTASC Agenda • 6 June, 14:00, Organizational: Welcome and Introduction (T. Haas); Review of Action List; Report from HEPCCC (T. Haas); Coffee Break • 15:30, Nordic Activities: Nordic Networking (NORDUNet), Jan P. Sørensen; Nordic Grid Activities (NorduGrid), Anders Wäänänen • 7 June, 09:00, Subgroup Reports: Win2K; Videoconference; Coffee Break • 10:30, Site Reports etc.: Site Reports; Round Table; AOB • 13:00 Adjourn Tobias Haas

  4. Nordic Networking (NORDUNet): Jan P. Sørensen, XXI HTASC, 6-7 June 2002, Copenhagen Tobias Haas

  5.–15. [Slides 5–15: figures only, no transcript text available (presumably from the NORDUNet presentation by Jan P. Sørensen)] Tobias Haas

  16. NorduGrid Project HTASC XXI Copenhagen 6th June 2002 Anders Wäänänen <waananen@nbi.dk>

  17. Project Overview • Launched in spring 2001, with the aim of creating a Grid infrastructure in the Nordic countries • Partners from Denmark, Norway, Sweden, and Finland • Meant to be the Nordic branch of the EU DataGrid (EDG) project testbed • Relies on very limited human resources (3 full-time researchers and a few part-time ones), with funding from NorduNet2

  18. The development • The fabric of the NorduGrid was laid down by June 2001 • NorduGrid Authentication System was put into operation in May 2001 • The first middleware was deployed and the sites were Grid-enabled by July 2001 • Further Grid services were put into operation (November-December 2001): • NorduGrid User Management System (Virtual Organization) • NorduGrid Information System • Grid Data Mirroring Package (GDMP) • Data replication catalog • Deployment & evaluation of the first (Testbed 1) release of the EDG Middleware (December-January)

  19. Facing Reality • NorduGrid was only an 18-month project, compared to 3 years for EU DataGrid • It was expected to run the ATLAS Data Challenge in May 2002 on a working Grid testbed in the Nordic countries • Continuing problems with EDG testbed stability • Architecture problems, with bottlenecks and fragile system components • The urgent need for something stable and working led to the decision to create a new architecture, not necessarily compatible with EDG

  20. A Job Submission Example [diagram of the EDG job-submission chain, flattened in this transcript; roughly: the User Interface (UI) submits a job described in JDL, together with its input “sandbox”, to the Resource Broker, which consults the Information Service, the Replica Catalogue and Authorization & Authentication and hands the job via the Job Submission Service to a Compute Element with access to a Storage Element; job status is tracked by Logging & Book-keeping and the output “sandbox” is returned to the UI; one link is marked as a bottleneck]
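The JDL mentioned in the diagram is the EDG Job Description Language, a Condor-ClassAd-style syntax. Purely as an illustration (this example is not taken from the slides, and the file names are hypothetical), a minimal JDL description tying an executable to the input and output sandboxes would look roughly like this:

  Executable    = "myjob.sh";
  Arguments     = "100";
  StdOutput     = "std.out";
  StdError      = "std.err";
  InputSandbox  = {"myjob.sh"};
  OutputSandbox = {"std.out", "std.err"};

The input sandbox is shipped to the broker with the job and the output sandbox is retrieved by the user afterwards; the xRSL description on slide 24 plays the corresponding role in the NorduGrid architecture.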

  21. Strategy • Define new architecture with stability as main feature • Remove bottlenecks • Tune system to reflect reality • Implement robust core subsystems using Globus components • Use existing working subsystems from Globus and the EDG for the missing features and enhance where needed • Keep it simple – while functional

  22. Status • Working production Grid testbed exists • Stable information system (MDS) • Approximately 80 CPUs scattered across Denmark, Norway and Sweden • First job submitted on March 28 • Can now run the ATLAS data challenge 1 • Live status monitor available from the web site: http://www.nordugrid.org/
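The information system mentioned above, MDS, is the LDAP-based Globus Monitoring and Discovery Service. As a hedged illustration only (the host name is hypothetical, and the port and base DN are the customary Globus MDS 2.x defaults rather than values quoted on the slide), a site's local index can be queried with a standard LDAP client:

  ldapsearch -x -h grid.example.org -p 2135 -b "mds-vo-name=local, o=grid"

The live status monitor on the NorduGrid web site is, in essence, a friendlier front end to the same information system.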

  23. NorduGrid Load Monitor

  24. dc1.000017.simu.0001.nordugrid.xrsl
  &(executable="/$ATLAS_ROOT/bin/atlsim")
   (arguments="-w 0 -b dc1.kumac project=dc1 pgroup=nordugrid step=simu partition=0001 nskip=0 ntrig=100 dset=000017 nset=0017")
   (stdout=out.txt)(stderr=err.txt)
   (outputfiles=
     ("out.txt" "")
     ("err.txt" "")
     ("dc1.000017.simu.0001.nordugrid.zebra" "gsiftp://lscf.nbi.dk/ATLAS/dc1-17/dc1.000017.simu.0001.nordugrid.zebra")
     ("dc1.000017.simu.0001.nordugrid.his" "gsiftp://lscf.nbi.dk/ATLAS/dc1-17/dc1.000017.simu.0001.nordugrid.his"))
   (inputFiles=
     ("atlas.kumac" "http://www.nbi.dk/~waananen/atlas.kumac")
     ("atlsim.makefile" "http://www.nbi.dk/~waananen/atlsim.makefile")
     ("atlsim.logon.kumac" "http://www.nbi.dk/~waananen/atlsim.logon.kumac")
     ("dc1.kumac" "http://www.nbi.dk/~waananen/dc1.kumac")
     ("dc1.root" "rc://@grid.uio.no:389/lc=ATLAS,rc=NorduGrid,dc=nordugrid,dc=org/gen0017_1.root"))
   (jobname="dc1.000017.simu.0001.nordugrid")
   (notify="waananen@nbi.dk")
   (* 20 hours seem to be enough for 100 events *)
   (MaxCPUTime=1200)
   (* Try to make download faster *)
   (ftpthreads=4)
   (runtimeenvironment=ATLAS-3.0.1)
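For context (not shown on the slide): a job description like this is handed to the NorduGrid user-interface client for brokering and submission, e.g. with something along the lines of ngsub -f dc1.000017.simu.0001.nordugrid.xrsl; the exact option syntax is an assumption about the client of the time, and the xRSL string can also be passed directly on the command line.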

  25. Future work • Data management • Transparent shared access to data from computing nodes, even during execution • Authorization • Finer-grained access control to resources • Better user separation • Access control on files • Authentication • Distributed registration authority • More robust information system • Sites or nodes that are down should not have any effect on the remaining system

  26. Conclusion • Working testbed exists and will be stress tested with the ATLAS data challenge • Users are invited to try it out • Sites are invited to join by installing the software • Compatibility with EDG desirable • Documentation including an architecture overview is available from the web site

  27. Videoconferencing • Headed by H. Frese • No report • Originally planned for Spring • Hans Frese not able to attend this time … • And not next time … • There are some issues here: • European MCUs • ESNET Booking System • Ask Hans Frese how he thinks this should proceed! Tobias Haas

  28. W2K • Headed by Gian Piero Siroli and Michel Jouvin • Could not attend the Copenhagen meeting • Sent a progress report • Will report during the September meeting at CERN • Attach progress report to the minutes Tobias Haas

  29. History • Group originally created Spring 2000 • Headed by Gian Piero Siroli & Michel Jouvin • Mandate extended by 12 months in Spring 2001 • Group has continued working actively in the context of HEPIX/HEPNT Tobias Haas

  30. The Mandate • To investigate and test the new features of Microsoft's Windows 2000 operating system, with particular emphasis on those issues which may need coordination across HEP • To make recommendations to HTASC/HEPCCC on those areas where a coordinated migration plan is required. These plans should take into account any potential benefits, such as integration with Unix and improved access to shared resources across HEP • To share the expertise gained with other HEP Windows system managers by organising an open Windows 2000 workshop and/or by other appropriate means Tobias Haas

  31. List of members (Name, Lab, Email):
  Frederic Hemmer, CERN, frederic.hemmer@cern.ch
  Gunter Trowitzsch, DESY, gut@ifh.de
  Gian Piero Siroli, INFN, gian.piero.siroli@cern.ch
  Michel Jouvin, IN2P3, jouvin@lal.in2p3.fr
  Ian McArthur, Oxford, i.mcarthur1@physics.ox.ac.uk
  Joel Surget, SACLAY, surget@hep.saclay.cea.fr
  Enrico Fasanelli, INFN, Enrico.M.V.Fasanelli@le.infn.it
  Jean-Pierre Melot, Bristol, j.p.melot@bristol.ac.uk
  David Kelsey, RAL, d.p.kelsey@rl.ac.uk
  Ed McFadden, BNL, emc@bnl.gov
  Jack Schmidt, FNAL, schmidt@fnal.gov
  Sandy Philpott, JLAB, sandy.philpott@jlab.org
  Andrea Chan, SLAC, achan@slac.stanford.edu
  Tobias Haas

  32. Recent Meetings • October 2001 at NERSC (HEPIX/HEPNT) • December 2001 at CERN • April 2002 in Catania (HEPIX/HEPNT) • Another meeting planned for this summer • Wide participation from many laboratories at these meetings Tobias Haas

  33. Topics of Discussion • Win/UX integration • Interoperability Kerberos/AFS • Domain Structures • Current Approaches, future projects • DNS Configurations • Possibility of Sharing MSI files across HEP labs • Win migration paths towards W2K, XP • Print Management Services • Web Services (incl. SOAP and XML) Tobias Haas

  34. General Opinion • This group is a very useful forum for exchanging information • Organization follows a dual format: • Limited subgroup sessions, limited but not formally restricted in participation, often held via videoconference • HEPIX/HEPNT workshops Tobias Haas

  35. Plans • Document in the fall • Describe topics under discussion in more detail • Revise, discuss and possibly redefine the scope and mandate of the group after 2 years' experience in tracking HEP Windows technology and its evolution in many HEP organizations. Tobias Haas

  36. HEP Coordination Issues • Inter-lab WAN access to Win platforms and resource sharing • VPN access and security → These topics deserve more detailed consideration after the initial phase of Windows deployment at many HEP labs Tobias Haas

  37. Round Table Discussion: Status of Tier-2 Centers Tobias Haas

  38. Round Table Discussion • Reports from • Germany (Rainer Mankel) • Hungary (Jozsef Kadlecsik) • CERN (Jürgen Knobloch) • US (Irwin Gaines) • Switzerland (Chris Grab) • Italy (Francesco Forti) • Portugal (Jorge Gomez) • Spain (Nicanor Colino) • France (Francois Etienne) Tobias Haas

  39. Round Table Discussion: Status of Tier-2 Centers • Germany (Rainer Mankel) • GSI, Darmstadt (Alice): Proposal • RWTH Aachen: Thinking about it • US (Irwin Gaines) … this may be CMS-centric • Caltech • UCSD • SCSF • U Florida • U Wisconsin, Madison Tobias Haas

  40. Round Table Discussion: Status of Tier-2 Centers … cont’d • Italy (Francesco Forti) • Tier-1 at CNAF (Bologna) • Many (up to 13?) Tier-2 centers • Problem to manage farms (Farm tools?) • Portugal (Jorge Gomez) • LIP somewhere between Tier-2 and Tier-3 • Switzerland (Chris Grab) • Contribute to CERN for Tier-1 • Tier-2 at CSCS at Manno Tobias Haas

  41. Round Table Discussion: Status of Tier-2 Centers • Tier-1 structure is emerging • Thinking about Tier-2 has only just started • Smaller countries cannot support a Tier-1 • 10% of LHC computing is a large share! • Many countries are still waiting and observing Tobias Haas

  42. HTASC Future Discussion Topics • Trends in Commodity Computing/Storage • Cluster Management Tools • Videoconferencing and W2K groups • LCG Application Domain • New PASTA Report • … Tobias Haas

  43. Open Issues • HEPCCC Action Item 76 • Scientific Secretary… ongoing • HEPCCC Action Item 77 • Confirmation of Membership … ongoing • HEPCCC Action Item 83 • Security Summary … ongoing Tobias Haas
