APC523/AST523 Scientific Computation in Astrophysics


##### Presentation Transcript

1. APC523/AST523 Scientific Computation in Astrophysics. Jim Stone, Department of Astrophysical Sciences & PACM, 125 Peyton Hall, ph. 8-3815, jmstone@princeton.edu. Robert Lupton, Department of Astrophysical Sciences, 134 Peyton Hall, ph. 8-3811, rhl@astro.princeton.edu. Plus exciting guest lecturers! (TBA)

2. Some definitions • computer science: the study of computers and computation. • scientific computation: the theory, tools, and techniques required to solve mathematical models of scientific problems on a computer. • computational science: the use of scientific computation to gain insight into scientific problems. The focus of the latter is on the science, which we all agree is the most important component: so why isn't this course called computational astrophysics?

3. Scientific computation versus computational science. Computational astrophysics is about the results of scientific computations. These results should be studied in the core physics courses, e.g. the results of stellar-dynamics simulations are studied in AST513. Separating results into a course of their own just because they were obtained on a computer is not a good way to organize physics. Instead, the topic that needs instruction is the mechanics of the computations themselves --> this course.

4. Methodology of traditional computational science • Identify an interesting astrophysical problem (what is the interior structure of the Sun?). • Develop a mathematical model of the problem (the equations of stellar structure). • Develop a numerical algorithm to compute solutions of the mathematical model (a relaxation scheme for solving two-point boundary value problems). • Visualize the data and interpret the results (Structure and Evolution of the Stars, by M. Schwarzschild).
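The relaxation idea mentioned in the third step can be illustrated on a toy problem. This is only a sketch: it solves the linear boundary value problem y'' = f(x) with y(0) = y(1) = 0 by Gauss-Seidel sweeps, not the actual (nonlinear) stellar-structure equations, and the function name `relax` and all parameter choices are illustrative.

```python
# Toy Gauss-Seidel relaxation for the two-point boundary value problem
# y'' = f(x) on [0, 1] with y(0) = y(1) = 0.  A stand-in for the
# relaxation schemes used on the stellar-structure equations.

def relax(f, n=50, sweeps=20000, tol=1e-10):
    """Solve y'' = f by sweeping the second-order finite-difference stencil."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    y = [0.0] * (n + 1)                # initial guess already satisfies the BCs
    for _ in range(sweeps):
        delta = 0.0
        for i in range(1, n):          # update interior points in place
            new = 0.5 * (y[i - 1] + y[i + 1] - h * h * f(x[i]))
            delta = max(delta, abs(new - y[i]))
            y[i] = new
        if delta < tol:                # stop when the updates stall
            break
    return x, y

# y'' = -2 has the exact solution y = x(1 - x), which peaks at 0.25
x, y = relax(lambda t: -2.0)
print(max(y))
```

Because the stencil is exact for quadratics, the converged discrete solution matches x(1 - x) at the grid points; the slow convergence of plain relaxation on fine grids is what motivates multigrid and other accelerations in production codes.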

5. Limitations of computation. "Computation has ruined theoretical astrophysics" (Scott Tremaine). In what ways are computations done badly? • Only numbers, no insight. • Focus on the software rather than the application (my code is bigger than your code). • Results incorrect due to numerical errors. • Numerical simulations based on an incorrect mathematical model. Use computations to address the greatest weakness of analytic models: they are sometimes too simplistic (spherical cows). For some problems, toy models are no longer sufficient, and computations are essential.

6. The goal of computation is insight, not numbers. R.W. Hamming

7. Is writing code becoming obsolete? Why write code in the era of Numerical Recipes, MATLAB, and Maple? We should use software libraries whenever possible. However, the goal of science is to explore the frontiers of knowledge: not even the algorithms, let alone the software, are available when new physics is studied. The era of the single graduate student writing their own code is nearly over. In the future, most codes will be built from modules from many sources, with new extensions added on top.

8. A brief history of computing: concepts • Babbage (1828-1839): designed the first digital computing machine. • Turing (1935): defined the concept of computation, and the idea of a machine with a stored program (which can modify itself). • von Neumann (1952): described many of the practical design principles used in electronic digital computers. The IAS machine was used for stellar structure calculations.

9. A brief history of computing: hardware • Colossus (1943): first fully functional electronic digital computing machine (special purpose). • ENIAC (1945): first general-purpose electronic digital computer. • UNIVAC (1951): first commercially available computer in the US. Cost $250,000. • IAS computer (1952): contained many important innovations (e.g. hierarchical memory); led to the IBM 701 series.

10. A brief history of computing: modern developments. Developments to 1953 are considered part of the First Generation of computers. • Second Generation (1954-1962): diodes and transistors replace vacuum tubes. High-level languages developed (FORTRAN, ALGOL, COBOL). • Third Generation (1963-1972): integrated circuits and silicon memory introduced. CDC6600 (1964) reaches 1 Mflop; CDC7600 (1969) reaches 10 Mflop. • Fourth Generation (1972-1984): VLSI takes over. Age of vector supercomputers (Cray-1 through Cray YMP). C and Unix developed. Workstations and PCs emerge. NSF supercomputer centers established. • Fifth Generation (1984-1990): parallel processing emerges. CM-2 and CM-5 are the first successful SIMD machines. • Sixth Generation (1990-??): revenge of the desktops. Clusters of cheap, mass-produced processors dominate.

11. A critical step was the development of the IEEE standards • standardized number of bits in a word • standardized representation of floating point numbers • standardized floating point arithmetic. To make code portable, always try to conform to standards. Avoid machine-dependent language extensions and libraries.
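The standardized representation can be inspected directly: under IEEE 754 a double is always the same 64 bits (1 sign bit, 11 exponent bits, 52 fraction bits), which is exactly what makes results reproducible across machines. A small sketch (the helper name `double_bits` is illustrative):

```python
# Inspect the IEEE 754 64-bit layout of a Python float.
import struct

def double_bits(x):
    """Return the 64-bit IEEE 754 representation of x as a binary string."""
    (bits,) = struct.unpack(">Q", struct.pack(">d", x))
    return f"{bits:064b}"

b = double_bits(0.1)
print("sign    :", b[0])       # 1 sign bit
print("exponent:", b[1:12])    # 11 biased-exponent bits
print("fraction:", b[12:])     # 52 fraction bits

# Rounding decimal 0.1 and 0.2 to 52 binary fraction bits is why
# their sum is not exactly 0.3 on any IEEE-conforming machine:
print(0.1 + 0.2 == 0.3)
```

The same bit patterns, and the same rounding behavior, appear on every conforming machine, which is the portability the slide is pointing at.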

12. Survey of the field today • 20% of all papers published in 2004 listed on ADS contain the words "computation", "simulation", or "numerical" in the abstract • This selection criterion is biased towards theory papers; including data-analysis papers would increase the fraction • Double the rate of 20 years ago • By 2050, all papers in astrophysics will be based on numerical calculations (?!)

13. Examples • 3D Stellar structure • 3D stellar convection • Structure formation in early Universe • SDSS data pipeline

14. 1. Fully 3D stellar evolution models. Colors show radial velocity inside a 4-solar-mass star; the model includes implicit hydrodynamics, radiation transfer, a realistic EOS, and nuclear reaction networks. Computed with the Djehuty code. Eggleton & Dearborn, LLNL.

15. 2. Time-dependent compressible convection in a full 3D model of an AGB star. Porter & Woodward, UMN. 512³ grid; 250,000 CPU hours.

16. 3. Structure formation. Computations can follow the growth of the perturbations that form clusters of galaxies (Bode & Ostriker). Mass density of visible matter.

17. 4. The SDSS data pipeline • 100 Mpixel camera, generates 4.6 Mb/s • To date, 15 Tb of data archived (compressed) • Each Mb of data requires ~10 s of CPU time for processing on a 1 GHz Pentium IV; about 1 CPU-year for the full data set. Background: SDSS image of the Perseus cluster.

18. The Future of Computational Astrophysics • What is certain: increases in hardware performance will enable larger problems to be tackled numerically • What is needed: • More accurate algorithms • Community codes & visualization software • More realistic physics • Students trained in computation: you are the real future