
Tier2 Centre in Prague



Presentation Transcript


  1. Tier2 Centre in Prague Jiří Chudoba FZU AV ČR - Institute of Physics of the Academy of Sciences of the Czech Republic

  2. Outline • Supported groups • Hardware • Middleware and software • Current status chudoba@fzu.cz

  3. Particle Physics in the Czech Republic Groups located at • Charles University in Prague • Czech Technical University in Prague • Institute of Physics of the Academy of Sciences of the Czech Republic • Nuclear Physics Institute of the Academy of Sciences of the Czech Republic Main Applications • Projects ATLAS, ALICE, D0, STAR, TOTEM • Groups of theoreticians • Approximate size of the community: 60 scientists, 20 engineers, 20 technicians and 40 students and PhD students chudoba@fzu.cz

  4. Hardware 2 independent computing farms • Golias, located at FZU • Skurut, from CESNET (the academic network provider) • older hardware (29 dual nodes, PIII 700 MHz) offered by CESNET • part used as a production farm, some nodes for tests and support for different VOs (VOCE, GILDA, CE testbed) • contribution to ATLAS DC2 at a level of 2% of all jobs finished on LCG chudoba@fzu.cz

  5. Available Resources - FZU Computer hall • 2 x 9 racks • 2 air-conditioning units • 180 kW electrical power, available from UPS and backed up by a Diesel generator • 1 Gbps optical connection to the CESNET metropolitan network • direct 1 Gbps optical connection to CzechLight, shared with other FZU activities chudoba@fzu.cz

  6. Hardware @ FZU Worker Nodes (September 2005) • 67x HP DL140, dual Intel Xeon 3.06 GHz with HT (enabled only on some nodes), 2 or 4 GB RAM, 80 GB HDD • 1x dual AMD Opteron 1.6 GHz, 2 GB RAM, 40 GB HDD • 24x HP LP1000r, 2x PIII 1.13 GHz, 1 GB RAM, 18 GB SCSI HDD • WNs connected via 1 Gbps (DL140) or 100 Mbps (LP1000r) Network components • 3x HP ProCurve Networking Switch 2848 (3x 48 ports) • HP 4108GL • ~30 kSI2K will be added this year chudoba@fzu.cz

  7. Golias Farm Hardware - servers • PBS server: HP DL360 – 2x Intel Xeon 2.8 GHz, 2 GB RAM • CE, SE, UI: HP LP1000, 2x 1.13 GHz PIII, 1 GB RAM, 100 Mbps (SE should be upgraded to 1 Gbps soon) • NFS servers • 1x HP DL145 – 2x AMD Opteron 1.6 GHz, connected disc array, 30 TB (raw capacity), ATA discs • 1x HP LP4100TC, 1 TB disc array, SCSI discs • 1x embedded server in EasySTOR 1400RP (PIII), 10 TB, ATA discs • dCache server • HP DL140 upgraded with a RAID controller, 2x 300 GB discs • not used for production, reserved for SC3 • Some other servers (www, SAM) chudoba@fzu.cz

  8. Middleware, Batch System GOLIAS: • LCG2 (2_6_0): • CE, SE, UI – SLC3 • WNs - RH7.3 (local D0 group not yet ready for SLC3) • PBSPro server not on CE • CE submits jobs to the node golias (PBSPro server) • local users can submit to local queues on golias SKURUT: • LCG2 (2_6_0), OpenPBS server on CE • all nodes - SLC3 chudoba@fzu.cz
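Since local users submit directly to the PBSPro server on golias, a minimal job script illustrates the workflow. This is a hedged sketch: the job name, queue choice, resource requests and payload script are invented for illustration and do not come from the presentation; only the queue names themselves appear on slide 9.

```shell
#!/bin/bash
# Illustrative PBS job script -- resource values and the payload
# executable are assumptions, not taken from the presentation.
#PBS -N atlas-sim          # job name (hypothetical)
#PBS -q atlas              # one of the per-experiment queues listed on slide 9
#PBS -l nodes=1:ppn=1      # one CPU slot on one worker node
#PBS -l walltime=12:00:00  # requested wall-clock limit (invented value)
cd "$PBS_O_WORKDIR"        # start in the directory the job was submitted from
./run_simulation.sh        # hypothetical payload executable
```

A local user would submit this on golias with `qsub job.pbs`; grid jobs instead arrive via the CE, which forwards them to the same PBSPro server.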

  9. Queues • Separate queues for different experiments and privileged users: atlas, atlasprod, lcgatlas, lcgatlasprod, alice, aliceprod, d0, d0prod, auger, star, ... short, long • priorities are set via PBS parameters: • max_running, max_user_running, priority, node properties • still not optimal in a heterogeneous environment chudoba@fzu.cz
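The PBS parameters named on this slide are normally set per queue through `qmgr`. The sketch below shows the shape of such a configuration; the queue names come from the slide, but every numeric value is invented for illustration only.

```shell
# Illustrative qmgr configuration -- queue names from slide 9,
# all limits and priorities are invented example values.
qmgr -c "set queue atlasprod max_running = 40"       # cap on concurrent jobs in the queue
qmgr -c "set queue atlasprod max_user_running = 20"  # cap on concurrent jobs per user
qmgr -c "set queue atlasprod Priority = 50"          # relative scheduling priority
qmgr -c "set queue short resources_max.walltime = 02:00:00"  # wall-clock limit for the short queue
```

Node properties (the fourth parameter mentioned) are attached to individual worker nodes and matched against job resource requests, which is one way to steer jobs in a heterogeneous farm like the DL140/LP1000r mix above.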

  10. Job statistics 2005/1-6 Used CPU time (in days) per activity, for January – June 2005 chudoba@fzu.cz

  11. Simulations for ATLAS chudoba@fzu.cz

  12. ALICE PDC – Phase2 (2004) 2004: ALICE jobs submitted to Golias via AliEn 2005: new version of AliEn is being installed chudoba@fzu.cz

  13. Tier1 – Tier2 relations • Requirements defined by experiments ATLAS and ALICE • Direct network connection between FZU and GridKa will be provided by GEANT2 next year • “Know-how” exchange welcomed chudoba@fzu.cz

  14. THE END chudoba@fzu.cz
