
UK GRID Activities Glenn Patrick 11.09.00



Presentation Transcript


  1. UK GRID Activities (Glenn Patrick, 11.09.00)
Not particularly knowledgeable, just based on attending three meetings:
• 09.08.00 UK-HEP Globus meeting (RAL)
• 11.07.00 UK-Grid Meeting (Cosener's)
• 15.06.00 UK-Grid Testbed Meeting (RAL)
The Globus meeting on 9th August was the most useful, especially from the technical point of view.
Liverpool, 11.09.00

  2. Globus Activities I: RAL
• Globus 1.1.3 installed on RAL-CSF (RedHat 6.1 + PBS) with 2 gatekeepers (Andrew will cover).
• Interfacing GASS Server to RAL DataStore (T. Folkes). Backend code has been written and various technical issues identified concerning globus-cache, proxies and file opens. Users to access files using pseudo path name? (See the sketch below.)
• Gaining experience with GSI-ftp and GSI-ssh (B. Saunders). Now working on PPD Linux machines.
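A minimal sketch of the pseudo-path idea: a user-visible path under an agreed prefix is translated into a URL that a GASS server sitting in front of the DataStore could serve. The prefix, host name, port, URL scheme and the helper function itself are all illustrative assumptions, not the actual RAL setup.

    // Hypothetical helper: map a DataStore pseudo path such as
    // /datastore/lhcb/run1234.dat onto a URL served by a GASS server in
    // front of the DataStore. Prefix, host, port and scheme are assumed.
    #include <stdexcept>
    #include <string>

    std::string pseudoPathToGassUrl(const std::string& pseudoPath)
    {
        const std::string prefix = "/datastore/";         // assumed pseudo-path root
        const std::string host   = "datastore.rl.ac.uk";  // assumed GASS server host
        const std::string port   = "10000";               // assumed GASS server port

        if (pseudoPath.compare(0, prefix.size(), prefix) != 0)
            throw std::runtime_error("not a DataStore pseudo path: " + pseudoPath);

        // The remainder of the pseudo path is passed through unchanged, so the
        // backend code can resolve it to a real DataStore file.
        return "https://" + host + ":" + port + "/" + pseudoPath.substr(prefix.size());
    }

The actual open would then go through whatever GASS client call the backend exposes; that part is deliberately left out here.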

  3. Globus Activities II: Manchester (Andrew McNab)
• Globus packaged in RPM format (built on RH6.2/Globus 1.1.3). Interim measure, as Globus is supposed to be adopting RPM in future.
• GSI-ssh, GSI-ftp & GSI-gdm also being packaged as RPMs.
• "Grid-aware ROOT": first attempt at calling the GASS client API from ROOT by modifying the TFile class. However, ROOT already has mechanisms for remote files, and the next version will add Grid files to the list of recognised protocols (see the sketch below). Need MDS/LDAP names rather than URLs.
• Standardising the GASS cache on Manchester machines. Spool area for each user.
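For context on the "Grid-aware ROOT" point: ROOT's existing remote-file mechanism dispatches on the URL prefix, so a Grid protocol would appear as one more recognised prefix alongside the current ones. A rough sketch using the standard TFile::Open() factory; the server name and histogram name below are made up for illustration.

    // Sketch of ROOT's existing remote-file dispatch. TFile::Open() looks at
    // the URL prefix and returns the matching TFile subclass (e.g. TNetFile
    // for root://); a Grid/GASS-aware ROOT would register another prefix.
    #include "TFile.h"
    #include "TH1F.h"

    void remote_file_example()
    {
        // Plain local file.
        TFile* local = TFile::Open("hits.root");

        // File served remotely by rootd; host and path are invented.
        TFile* remote = TFile::Open("root://gridnode.example.ac.uk/data/hits.root");

        if (remote && !remote->IsZombie()) {
            TH1F* h = (TH1F*)remote->Get("hEnergy");  // hypothetical histogram name
            if (h) h->Draw();
        }

        delete local;
        delete remote;
    }

The MDS/LDAP point on the slide would then amount to resolving a logical name to such a URL before the Open() call.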

  4. Globus Activities III: QMW (Alex Martin)
• Globus 1.1.3 installed on several Linux boxes.
• PBS installed (packaged as an RPM).
• Developed a simple "quote" system using the LDAP protocol and Perl scripts. Currently only works for a single job and is based only on the CPU requirement (see the sketch below). What is really required in the wider batch world?
TASKS
• Common kit of parts / Globus distribution (e.g. RPM).
• Solution for handling certificates & Gridmap files.
• Common UKHEP GIIS service (gateway index)?
• Security implications.
Next meeting: 20th September (Manchester).
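To make the "quote" idea concrete, here is a guess at the logic in compilable form: each site advertises its free capacity (in the real system published via LDAP/MDS and queried from Perl), a quote is computed from the CPU requirement alone, and the best offer wins. Every name and number below is invented for illustration; this is not the QMW implementation.

    // Hypothetical CPU-only "quote" system: pick the site that can run one
    // job soonest, given only its advertised CPU power and free nodes.
    #include <iostream>
    #include <limits>
    #include <string>
    #include <vector>

    struct SiteInfo {
        std::string name;        // site identifier (as it might appear in MDS)
        double cpuSpecPerNode;   // advertised CPU power per node (arbitrary units)
        int    freeNodes;        // nodes currently free in the batch system
    };

    // Estimated time for a job needing 'cpuRequired' units of CPU,
    // or infinity if the site has nothing free.
    double quote(const SiteInfo& s, double cpuRequired)
    {
        if (s.freeNodes == 0) return std::numeric_limits<double>::infinity();
        return cpuRequired / s.cpuSpecPerNode;   // single job, single node
    }

    int main()
    {
        std::vector<SiteInfo> sites = {
            {"ral-csf",   10.0, 4},
            {"qmw-linux",  7.5, 1},
            {"man-farm",  12.0, 0},
        };

        const double cpuRequired = 120.0;        // illustrative job size
        const SiteInfo* best = nullptr;
        double bestTime = std::numeric_limits<double>::infinity();

        for (const auto& s : sites) {
            double t = quote(s, cpuRequired);
            std::cout << s.name << ": quote = " << t << "\n";
            if (t < bestTime) { bestTime = t; best = &s; }
        }

        if (best) std::cout << "Submit to " << best->name << "\n";
        return 0;
    }

The slide's open question (what the wider batch world really needs) is exactly what this toy ignores: memory, disk, queue times, priorities and multi-job requests.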

  5. LHC(UK) Proto-Centres/Testbeds?
RAL Tier 1 (ALICE/ATLAS/CMS/LHCb)
• Submitted to JIF in Feb for £5.9M. Outcome in Nov(?).
Liverpool MAP/COMPASS (LHCb/ATLAS)
• Funded by JREI in 1998. Operational. Upgrade requested.
Glasgow/Edinburgh Computing Centre (LHCb/ATLAS)
• Submitted to JREI in May. Outcome known in ~December.
Manchester-UCL-Sheffield (ATLAS WW scattering?)
• JREI bid. 32 CPUs + 5 TB disk.
Bristol (BaBar/CMS/LHCb)
• 8-node Linux farm now, 32 nodes plus a storage server later.
Birmingham (ALICE)
• Funded by JREI (1999). Farm and disk storage.
Others?

  6. UK GRID Organisation
Visible people seem to fall into two main camps:
• Potential "exploiters" for experiments & applications.
• System managers installing/developing Globus, etc.
Various people are then involved in DataGrid work packages, but the organisation of UK middleware work is not clear (to me). Significant funds are supposed to go into Grid and e-science. Some (hierarchical) structures have been proposed along the lines of:
• PPARC Grid Steering Committee
• Particle Physics Grid Management Board
• Technical Work Groups
• Sub-groups

  7. RAL GRID Organisation
Up until now there has been a "CLRC Grid Team", originally based around PPD+ITD, but it has gradually pulled in other departments/sciences. It is now just about to be split into:
• An e-science forum for all of CLRC.
• A Particle Physics Grid Team.
Not clear yet how this maps into existing structures and how it affects effort for LHCb applications.
