APC523/AST523 Scientific Computation in Astrophysics
Jim Stone, Department of Astrophysical Sciences & PACM, 125 Peyton Hall, ph. 8-3815, firstname.lastname@example.org
Robert Lupton, Department of Astrophysical Sciences, 134 Peyton Hall, ph. 8-3811, email@example.com
+ exciting guest lecturers! (TBA)
Why isn't this course called computational astrophysics?
The focus of computational astrophysics is on science, which we all agree is the most important component.
Computational astrophysics is about the results of scientific computations.
These results should be studied in core physics courses,
e.g. the results of stellar dynamics simulations are studied in AST513.
Separating results into a course just because they were obtained on a computer is not a good way to organize physics.
Instead, the topic that needs instruction is the mechanics of the computations themselves --> this course
“Computation has ruined theoretical astrophysics”
In what ways are computations done badly?
Only numbers, no insight.
Focus is on software rather than application (my code is bigger than your code).
Results incorrect due to numerical errors.
Numerical simulations based on incorrect mathematical model.
Use computations to address the greatest weakness of analytic models: they are sometimes too simplistic (spherical cows)
For some problems, toy models are no longer sufficient, and computations are essential.
Why write code in the era of Numerical Recipes, MATLAB, and Maple?
Should use software libraries whenever possible.
However, the goal of science is to explore the frontiers of knowledge. Not even the algorithms, let alone the software, are available when new physics is studied.
The era of a single graduate student writing their own code is nearly over. In the future, most codes will be built from modules from many sources, with new extensions added on top.
Developments up to 1953 are considered part of the First Generation of Computers.
Second Generation. (1954-1962): diodes and transistors replace vacuum tubes. High level languages developed (FORTRAN, ALGOL, COBOL)
Third Generation. (1963-1972): integrated circuits, silicon memory introduced. CDC6600 (1964) reaches 1Mflop. CDC7600 (1969) reaches 10 Mflop.
Fourth Generation. (1972-1984): VLSI takes over. Age of vector supercomputers (Cray-1 through Cray YMP). C and Unix developed. Workstations and PCs emerge. NSF supercomputer centers established.
Fifth Generation. (1984-1990): parallel processing emerges. The CM-2 and CM-5 are the first successful SIMD machines.
Sixth Generation. (1990-??): revenge of the desktops. Clusters of cheap, mass-produced processors dominate.
To make code portable, always try to conform to standards.
Avoid machine dependent language extensions and libraries.
Colors show the radial velocity inside a 4 solar mass star; the calculation includes implicit hydrodynamics, radiation transfer, a realistic EOS, and nuclear reaction networks.
Computed with Djehuty code
Eggleton & Dearborn, LLNL
Porter & Woodward, UMN
512³ grid; 250,000 CPU hours
Computations can follow the growth of perturbations that form clusters of galaxies
Mass density of visible matter
Background: SDSS image of Perseus cluster