Computational Astrophysics and Cosmology
Nick Gnedin, FCPA Retreat
Contents
  • Dreams and reality
  • Science
  • Methods and Tools
  • Computational Cosmology in the World…
  • … and @ FNAL
Ultimate Dream of Every Cosmologist

Noah’s Ark of the Universe: a quantitative model of the universe in a large enough volume to represent all types of galaxies in sufficient detail.

“Every cosmologist carries a digital universe in his pack.” (Napoleon Bonaparte)

… and Reality

If we cannot have this…

Large-scale simulations
Small-scale simulations

Science: Computational Cosmology for Fundamental Physics

Weak lensing:
  • Directly probes the matter distribution (see the convergence integral below).
  • Unfortunately, we exist: baryons modify the small-scale matter distribution.
  • Baryonic effects can be calibrated by simulations.
  • The required computational effort is massive.

(Rudd et al 2007)
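For context (an addition, not on the slide): in a flat universe with a single source plane at comoving distance \chi_s, the lensing convergence is a direct line-of-sight integral over the matter overdensity \delta, with no assumptions about galaxy bias:

\kappa(\vec{\theta}) = \frac{3 H_0^2 \Omega_m}{2 c^2} \int_0^{\chi_s} d\chi\, \frac{\chi\,(\chi_s - \chi)}{\chi_s}\, \frac{\delta(\chi\vec{\theta}, \chi)}{a(\chi)}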

Science: Computational Cosmology for Fundamental Physics

Baryon Acoustic Oscillations:
  • Potentially the cleanest test for dark energy.
  • Galaxies are biased tracers of the matter distribution.
  • Baryonic effects can be calibrated by simulations.
  • We are still very far from reliably predicting galaxy colors from simulations.

(Tegmark et al 2006)
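For context (an addition, not on the slide): BAO works as a standard ruler because the comoving sound horizon at the drag epoch is fixed by well-understood early-universe physics:

r_s = \int_{z_d}^{\infty} \frac{c_s(z)}{H(z)}\, dz \approx 150\ \mathrm{Mpc}\ \text{(comoving)}

Measuring the apparent size of this scale in the galaxy distribution at different redshifts constrains the expansion history, and hence dark energy.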

Science: Computational Cosmology for Fundamental Physics

Galaxy Clusters:
  • Sensitive probes of dark energy (they live on the exponential tail of the mass function; see below).
  • Complex physical systems.
  • Baryonic effects can be calibrated by simulations.
  • It is not clear how much physics we need to include to model clusters with high precision (even ignoring the central 20%).

(Nagai et al 2007)
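The “exponential tail” remark can be made precise: in Press-Schechter-type theories, the abundance of the rarest, most massive halos scales as

\frac{dn}{d\ln M} \propto \exp\!\left[ -\frac{\delta_c^2}{2\,\sigma^2(M, z)} \right], \qquad \delta_c \simeq 1.686,

so small, dark-energy-dependent changes in the growth of \sigma(M, z) produce exponentially amplified changes in cluster counts.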

Science: Computational Cosmology for Fundamental Astrophysics

Galaxy Formation:
  • Allows direct comparison with most observations.
  • Important physics (such as star formation) will not be understood any time soon.
  • There exist simple scaling laws in the ISM (see below).
  • Models based on those laws make wrong galaxies.
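One well-known example of such a scaling law (not named on the slide) is the Kennicutt-Schmidt relation between star formation rate and gas surface density:

\Sigma_{\rm SFR} \propto \Sigma_{\rm gas}^{1.4}

(Kennicutt 1998 finds the exponent 1.4 ± 0.15 for disk galaxies and starbursts.)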

Science: Computational Cosmology for Fundamental Astrophysics

Reionization & 1st Stars:
  • The physics is reasonably well understood.
  • Radiative transfer is a hard computational problem.
  • Substantial progress has been achieved, though.
  • A very large dynamic range is required (1 kpc – 100 Mpc; see below).
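To put that range in numbers: 100 Mpc / 1 kpc = 10^5 per linear dimension, so a uniform grid resolving both scales would need of order (10^5)^3 = 10^{15} cells, which is why adaptive or approximate methods are unavoidable.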

Science: Computational Cosmology for Fundamental Astrophysics

Supermassive Black Holes:
  • Probably the most important unsolved astrophysical problem now.
  • Very large dynamic range and complex physics.
  • AMR can reach it now.
  • We may not have even the most rudimentary, conceptual understanding of the physics of AGN.

Methods and Tools
  • Dark Matter:
  • similar to plasma simulations
  • particle methods (N-body)
  • Gas Dynamics:
  • mostly Lagrangian methods
  • little use of engineering expertise
  • Radiative Transfer:
  • a 6D problem (see the transport equation after this list)
  • useful approximations exist
  • a full computational solution is still lacking
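The “6D” count refers to the arguments of the specific intensity I_\nu(t, \vec{x}, \hat{n}, \nu): three position coordinates, two direction angles, and frequency. The classical transport equation (cosmological codes add further redshifting and dilution terms) is

\frac{1}{c} \frac{\partial I_\nu}{\partial t} + \hat{n} \cdot \nabla I_\nu = -\kappa_\nu I_\nu + j_\nu,

where \kappa_\nu is the absorption coefficient and j_\nu the emissivity.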
Cosmological N-body
  • PM = particle-mesh
  • P3M = particle-particle particle-mesh
  • Tree
  • Hybrid codes: TreePM, Adaptive P3M
  • Adaptive Mesh Refinement (AMR)

The problem of cosmological N-body simulations is essentially solved.
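To illustrate the oldest of these methods, here is a minimal particle-mesh (PM) sketch in Python, reduced to 1D for brevity; the grid size, particle count, and code units are illustrative assumptions, not parameters from the talk. It deposits particles on a grid with cloud-in-cell weights, solves the Poisson equation with an FFT, and interpolates the force back to the particles:

import numpy as np

# Minimal 1D particle-mesh (PM) force calculation:
# 1. deposit particles onto a grid with cloud-in-cell (CIC) weights,
# 2. solve the Poisson equation in Fourier space,
# 3. difference the potential and interpolate the force back.

N_GRID, N_PART, BOX = 64, 1000, 1.0   # cells, particles, box size (code units)
dx = BOX / N_GRID

rng = np.random.default_rng(42)
x = rng.random(N_PART) * BOX          # particle positions in [0, BOX)

# CIC deposit: each particle shares unit mass with its two nearest cells.
i = np.floor(x / dx).astype(int) % N_GRID
f = x / dx - np.floor(x / dx)         # fractional offset within the cell
rho = np.zeros(N_GRID)
np.add.at(rho, i, 1.0 - f)
np.add.at(rho, (i + 1) % N_GRID, f)
rho = rho / dx - N_PART / BOX         # mass density with the mean removed

# Poisson solve: grad^2 phi = 4*pi*G*rho  =>  phi_k = -4*pi*G*rho_k / k^2.
G = 1.0
k = 2.0 * np.pi * np.fft.fftfreq(N_GRID, d=dx)
rho_k = np.fft.fft(rho)
phi_k = np.zeros_like(rho_k)
nz = k != 0
phi_k[nz] = -4.0 * np.pi * G * rho_k[nz] / k[nz] ** 2
phi = np.fft.ifft(phi_k).real

# Acceleration on the grid (g = -dphi/dx), then CIC interpolation back.
g_grid = -(np.roll(phi, -1) - np.roll(phi, 1)) / (2.0 * dx)
g_part = (1.0 - f) * g_grid[i] + f * g_grid[(i + 1) % N_GRID]
print("rms particle acceleration:", np.sqrt(np.mean(g_part ** 2)))

P3M, Tree, and TreePM codes refine exactly this scheme by adding short-range corrections below the grid scale.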

Cosmological Gas Dynamics
  • Little use of engineering expertise:
  • very high resolution required
  • complex physics
  • gas is gravitating
  • no solid boundaries
20th Century (1st Generation Codes)

(R: spatial dynamic range; N: number of resolution elements; “const”: fixed, uniform resolution)
  • Eulerian schemes:
  • R ~ 1,000, N ~ 10^9, const
  • SPH (“Simple Path to Hell”; properly Smoothed Particle Hydrodynamics):
  • R ~ 30,000, N ~ 10^8, const
  • Arbitrary Lagrangian-Eulerian (ALE):
  • R ~ 10,000, N ~ 10^7, const
21st Century (2nd Generation Codes)
  • Adaptive Mesh Refinement (AMR):
  • R > 100,000, N ~ 10^9, variable
  • can follow fragmentation
  • non-uniform initial conditions
  • spatially variable resolution
Our Tools
  • We (FNAL+KICP) have several cosmological codes
  • The main one is the Adaptive Refinement Tree (ART) code
  • An implementation of the Adaptive Mesh Refinement method
  • Refines on a cell-by-cell basis (a toy version of such a criterion is sketched below)
  • Fully 4D adaptive (space and time)
  • Includes:
  • - dark matter dynamics
  • - gas dynamics and chemistry
  • - radiative transfer
  • - star formation, etc.
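A toy illustration of cell-by-cell refinement flagging (the grid, Poisson-sampled counts, and threshold are illustrative assumptions, not ART’s actual criteria):

import numpy as np

def refine_flags(counts, threshold=8):
    """Boolean mask: True where a cell should be split to the next level."""
    return counts > threshold

# Fake per-cell particle counts on a 16x16 root grid.
counts = np.random.default_rng(1).poisson(lam=3.0, size=(16, 16))
flags = refine_flags(counts)
print(f"{flags.sum()} of {flags.size} cells flagged for refinement")

A real code applies such a test recursively, level by level, and can also refine on gas mass, density gradients, or other indicators.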
Computational Cosmology in the World
  • Europe: research mostly done by large groups:
  • Virgo Consortium [>30 people] (Germany + UK)
  • Project Horizon [>20 people] (France)
  • USA/Canada: large number of small groups (10-12 people):
  • Chicago (FNAL+KICP), UWashington, Harvard, CITA, SLAC, LANL, Princeton, UCLA, UIUC, Berkeley, UMass…
Computing Needs
  • Modern state-of-the-art cosmological simulations require in excess of 100,000 CPU-hours (≈140 CPU-months; see the arithmetic below). The biggest ones use > 1,000,000 CPU-hours.
  • A single simulation produces a terabyte-scale data set.
  • Even rudimentary analysis requires intermediate-level computing capability.

The time of “this is a workstation to analyze your simulation” is over.
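For reference, the CPU-month figure is simple arithmetic:

10^5\ \text{CPU-hours} \times \frac{1\ \text{CPU-month}}{720\ \text{CPU-hours}} \approx 140\ \text{CPU-months} \approx 12\ \text{CPU-years}.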

Solution (à la LQCD)
  • National consortium/collaboration (the level of integration may vary)
  • Powerful local, intermediate-level resources (10,000 CPUs; can be spread over a few separate places: FNAL, SLAC, ANL, LANL)
  • United approach to funding agencies and national supercomputer centers.
Our Plan
  • build up local resources on par with other US groups
  • create a Chicago-wide collaboration (FNAL, KICP, ANL)
  • charm Mont, the PAC, and the Director
  • overtake other US groups in local computing resources (reach 1,000 – 2,000 cores; there are good reasons why FNAL is the best place for that)
  • move towards a national consortium/collaboration:
    • join with DOE labs (LANL, LBL, SLAC)
    • invite universities to join
  • get lots of money along the way… (people, software support)