Reconstructing the Very First Star

Cosmology recapitulates ontology


NPACI Online
Merry Maisel
Copyright © 2001 NPACI Online

October 3, 2001


Some orchestras (and their conductors) have amazing dynamic range. Arturo Toscanini, Bruno Walter, and Fritz Reiner were renowned for both thunder and whisper. Succeeding Reiner at the Chicago Symphony, Sir Georg Solti also won a reputation for controlling both the quiet tinkle of a triangle and the (actual) cannon of Tchaikovsky's 1812 Overture. Today, Simon Rattle, Claudio Abbado, Lorin Maazel, and Pierre Boulez come to mind as masters of music meek or mighty.

And if there is a Bruno Walter of cosmology, it is surely Michael L. Norman of UC San Diego. His cosmological structure code, Enzo, can span a spectacular 12 orders of magnitude in space and time. It can pursue the slightest gravitational perturbation of a nearly uniform primal gas all the way to its condensations in volumes so much tinier than the initial volume that the ratio can only be expressed in scientific notation: 10⁻³⁰. It can follow the story from the consequences of the big bang to the coalescence of the first star in exquisite detail; indeed, in "quadruple precision," in a manner faithful to the best idea we have of the initial physics and chemistry of the process.

With former students Greg Bryan (now a lecturer at Oxford University) and Tom Abel (now a visiting scientist at Cambridge University), Norman has submitted a paper on the most recent Enzo calculations as an entry for this year's Gordon Bell Award (to be presented at SC2001).

Getting to the First Star

The physics of present-day star formation is extremely complicated, because the interstellar medium from which stars are born is itself composed of the remnants of previous generations of stars, including not only hydrogen and helium but also heavier elements, in abundances important enough to affect the star-formation process.

"In contrast", said Norman, "the formation of the first star takes place in a much simpler environment: the gas is just hydrogen and helium, and the initial conditions can be precisely specified by cosmological models. It's a clean initial-value problem -- and it is the starting point for the formation of all other structure in the universe, from galaxies to superclusters." Using the IBM Blue Horizon machine at SDSC, Bryan, Abel, and Norman simulated the condensation and formation of the universe's very first star.

They began with the material composition of the universe: about ten percent ordinary (baryonic) matter, which is the primordial gas, and ninety percent "cold dark matter" (CDM). While the actual composition of the CDM is unknown, its role in gravitational condensation can be calculated. What makes the CDM interesting here is the power spectrum of its density fluctuations, which suggests that cosmic structure forms in a "bottom-up" fashion -- by the gravitational amplification of initially small fluctuations.


A slice through the protostellar object determined by Enzo in (left) temperature and (right) density. Copyright © 2001 Tom Abel; used by permission.

The scientific results will appear in Science later this year.

Structured Adaptive Mesh Refinement

So the computational problem becomes: how do you follow a reasonably large sample of the CDM-hydrogen-helium universe as its hydrodynamic perturbations lead to a collapsing protogalaxy and, within that, to protostellar clouds? Moreover, how do you follow all that with the correct chemistry and thermodynamics in a properly expanding cosmological spacetime continuum?

The gravitational problem alone has been attacked by N-body simulation, but Enzo is much more ambitious. The time-dependent calculation is carried out over the full three dimensions on a structured, adaptive grid hierarchy that follows the collapsing protogalaxy and subsequent protostellar cloud to near-stellar density, starting from primordial fluctuations a few million years after the big bang. Enzo thus combines an Euler solver for the primordial gas, an N-body solver for the collisionless CDM, a Poisson solver for the gravitational field, and a 12-species stiff reaction flow solver for the primordial gas chemistry (where the 12 species are hydrogen, deuterium, helium, and their various ionic states).
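In outline, such a multi-physics timestep might be organized like the toy C++ sketch below. Every name here is a hypothetical stand-in rather than Enzo's actual API; each empty stub merely marks where one of the solvers just described would do its work.

    // A toy, single-grid timestep in the spirit of Enzo's solver stack.
    // All names are hypothetical stand-ins, not Enzo's real interfaces.
    #include <iostream>

    struct Grid { /* gas fields, CDM particles, and potential live here */ };

    void solvePoisson(Grid&) { /* gravity sourced by gas plus dark matter */ }
    void advanceEuler(Grid&, double) { /* hydrodynamics of the primordial gas */ }
    void pushParticles(Grid&, double) { /* N-body update of the collisionless CDM */ }
    void integrateChemistry(Grid&, double) { /* stiff 12-species reaction network */ }

    void evolveOneStep(Grid& g, double dt) {
        solvePoisson(g);            // potential from all the matter
        advanceEuler(g, dt);        // Euler solver for the gas
        pushParticles(g, dt);       // collisionless cold dark matter
        integrateChemistry(g, dt);  // H, D, He species and their ions
    }

    int main() {
        Grid g;
        for (int step = 0; step < 3; ++step) evolveOneStep(g, 0.01);
        std::cout << "toy timestep loop finished\n";
    }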

"We can't do all this on a uniform mesh," said Norman, "because there is no computer large enough to contain all the spatiotemporal scales we must follow." Instead, the researchers use a method called structured adaptive mesh refinement. While solving the equations on a uniform grid, the code follows the quality of the solution and, when necessary, adds an additional fine mesh over any region that requires enhanced resolution. The finer, "child" mesh obtains its boundary conditions from the coarse, "parent" mesh. The finer grid is also used to improve the solution on its parent, As the evolution continues, the finer mesh may need to be moved, resized, or removed. Even finer meshes may be required, producing a tree structure that may continue to any depth.

Enzo spawns a new mesh when any cell accumulates enough mass that refinement is needed to preserve a given mass resolution in the solution, or when perturbations shrink below the minimum length that the current mesh can resolve. The figure shows the adaptive mesh after five levels have been spawned.


The ability of the Enzo code to direct the formation of new mesh as needed by the physical solution is key to its ability to follow developments in a large, nearly isotropic universe that nevertheless ultimately produces very compact objects like stars. Copyright © 2001 Tom Abel; used by permission.
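In code, the two refinement triggers just described might reduce to a check like the following toy C++ function, in which every threshold is a placeholder rather than one of Enzo's actual criteria:

    // Toy version of the two refinement triggers: flag a cell when it
    // holds too much mass for the target mass resolution, or when the
    // local perturbation scale is no longer covered by enough cells.
    #include <iostream>

    bool needsRefinement(double cellMass, double massResolution,
                         double perturbationLength, double cellSize) {
        const int cellsPerLength = 4;    // assumed resolution requirement
        bool tooMassive    = cellMass > massResolution;
        bool underResolved = perturbationLength < cellsPerLength * cellSize;
        return tooMassive || underResolved;
    }

    int main() {
        // A cell that has quietly accumulated ten times the target mass:
        std::cout << needsRefinement(10.0, 1.0, 100.0, 1.0) << "\n";  // 1
        // A quiet, well-resolved cell:
        std::cout << needsRefinement(0.5, 1.0, 100.0, 1.0) << "\n";   // 0
    }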

Norman pointed out that a fluctuation containing the mass of the Milky Way galaxy will collapse by a factor of about a thousand before it comes into dynamical equilibrium, and a code to follow such a collapse would need to have a spatial dynamic range (SDR) of 10⁵. Resolving the formation of individual stars -- even very large ones -- within a galaxy-full of gas would require even more resolution, an SDR on the order of 10²⁰. The work just completed is at an SDR of 10¹² -- roughly the ratio of the diameter of the earth to the diameter of a human cell.
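As a rough plausibility check on that comparison -- using round numbers of our own, not figures from the paper -- take the Earth's diameter to be about 1.3 × 10⁷ meters and a typical human cell to be about 10⁻⁵ meters across:

    d_Earth / d_cell ≈ (1.3 × 10⁷ m) / (10⁻⁵ m) ≈ 10¹²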

Code Details

Enzo is implemented in C++, with compute-intensive kernels in Fortran 77. The object-oriented approach provides two benefits: encapsulation (a single mesh or grid is the basic building block) and extensibility (new physics may be added easily at all levels).
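The division of labor between the two languages typically looks something like the sketch below: a C++ driver hands flat arrays to a compiled Fortran kernel. Here the kernel is mimicked by a C++ stand-in body so the example compiles on its own, and the trailing underscore is just one common, compiler-dependent Fortran-to-C naming convention -- assumptions for illustration, not a description of Enzo's actual build.

    // Sketch of a C++ driver calling a Fortran 77 compute kernel. In a
    // real build, euler_sweep_ would come from a Fortran object file;
    // the C++ stand-in body below keeps this sketch self-contained.
    #include <vector>

    extern "C" void euler_sweep_(double* density, int* nx, double* dt);

    // Stand-in for the Fortran kernel (placeholder arithmetic only).
    extern "C" void euler_sweep_(double* density, int* nx, double* dt) {
        for (int i = 0; i < *nx; ++i)
            density[i] *= 1.0 - 0.1 * (*dt);
    }

    void advanceGrid(std::vector<double>& rho, double dt) {
        int nx = static_cast<int>(rho.size());
        euler_sweep_(rho.data(), &nx, &dt);   // hot loop stays in "Fortran"
    }

    int main() {
        std::vector<double> rho(8, 1.0);
        advanceGrid(rho, 0.01);
    }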

"The hard part in running Enzo on a variety of platforms is parallelization and load balancing," said Robert Harkness of SDSC, who is working with Norman to prepare the Enzo code for runs across the TeraGrid. "We are using an MPI version that allows us to exploit the object-oriented design by distributing the objects over the processors, rather than attempting to distribute the grids themselves. The subgrids are generally small and numerous, and many may be quite short-lived." Other innovations include the creation of "sterile objects" that contain information about the size and location of a grid but without the solution. Each processor can hold the entire hierarchy of grids as sterile objects as it works the solution on any grid or grids that are local to the processor, which reduces communications traffic. Ultimately, however, the process of obtaining boundary values for the root (largest) grid is nonlocal; the sterile objects enable the code to "pipeline" the global communications so that they are received first where the solution can be examined first.

Accurate description of the positions of grids and particles within the problem domain requires the code to work at "extended precision" -- 128-bit arithmetic, corresponding to double precision on machines whose single-precision word is 64 bits, or quadruple precision where words are 32 bits. "Fortunately," Harkness said, "Blue Horizon supplies 128-bit arithmetic." The researchers have developed special methods of restricting the 128-bit work to those parts of the code that require it, and invoking it only when necessary.
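One simple way to confine wide arithmetic to the places that need it is to give absolute positions their own wide type, as in the sketch below. Here "long double" merely stands in for 128-bit arithmetic (its actual width is platform-dependent), and nothing in the sketch reflects Enzo's real type definitions.

    // Sketch of confining wide arithmetic: absolute positions carry the
    // wide type, while bulk field math stays in ordinary double.
    using pos_t   = long double;   // wide type for absolute positions
    using field_t = double;        // ordinary precision elsewhere

    struct Particle {
        pos_t x, y, z;             // position in the full problem domain
        field_t mass;              // needs no extra precision
    };

    // Difference positions in pos_t first, then narrow the small offset.
    field_t offsetWithinGrid(pos_t particleX, pos_t gridOriginX) {
        return static_cast<field_t>(particleX - gridOriginX);
    }

    int main() {
        Particle p{0.4999999999999999L, 0.5L, 0.5L, 1.0};
        field_t dx = offsetWithinGrid(p.x, 0.25L);
        return dx > 0 ? 0 : 1;     // trivially exercise the helper
    }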

Facing the Music

While Enzo in its present form is challenging enough for current platforms, the possibility of distributing it across the TeraGrid beckons -- and together with that, the necessity to increase the dynamic range of the code. At present, it begins with what might be called a statistically significant fraction of the early universe -- a large volume of isotropic CDM and gas -- and proceeds to follow the physics down to protogalactic objects (millions of solar masses) and protostellar objects (hundreds of solar masses).

"It seems that it will be possible, using this method, to extend the dynamic range on both ends of the spectrum," Norman said. In orchestral terms, the thunder will be more thunderous and the whispers will be all but inaudible -- and the computer platforms will be, collectively, the universe's largest concert hall.



Science Contact:

Professor Michael Norman (858) 822-4194
mnorman@mamacass.ucsd.edu