This paper describes the open-source code Enzo, which uses block-structured adaptive mesh refinement to provide high spatial and temporal resolution for modeling astrophysical fluid flows. The code is Cartesian, can be run in 1, 2, and 3 dimensions, …
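To make "block-structured adaptive mesh refinement" concrete, here is a minimal, hypothetical sketch of a grid-patch hierarchy in Python. The class and method names are invented for illustration and do not reproduce Enzo's actual data structures; the sketch only shows how a coarse Cartesian patch can be covered by finer child patches.

```python
# Hypothetical sketch of a block-structured AMR hierarchy (not Enzo's real code).
import numpy as np

class GridPatch:
    """A rectangular patch of cells at a single refinement level."""
    def __init__(self, left_edge, right_edge, dims, level=0):
        self.left_edge = np.asarray(left_edge, dtype=float)
        self.right_edge = np.asarray(right_edge, dtype=float)
        self.dims = np.asarray(dims, dtype=int)   # number of cells along each axis
        self.level = level
        self.children = []

    @property
    def dx(self):
        return (self.right_edge - self.left_edge) / self.dims

    def refine(self, lo, hi, factor=2):
        """Cover the cell-index box [lo, hi) with a child patch at `factor`
        times the resolution (block-structured refinement)."""
        lo, hi = np.asarray(lo), np.asarray(hi)
        child = GridPatch(self.left_edge + lo * self.dx,
                          self.left_edge + hi * self.dx,
                          (hi - lo) * factor,
                          level=self.level + 1)
        self.children.append(child)
        return child

# Root grid covering the unit cube with 32^3 cells; refine its central region.
root = GridPatch([0, 0, 0], [1, 1, 1], [32, 32, 32])
child = root.refine([8, 8, 8], [24, 24, 24])
print(child.level, child.dims, child.dx)   # level 1, 32^3 cells, half the root cell size
```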
As scientists' needs for computational techniques and tools grow, software developed in isolation can no longer support them. In many cases, these needs are being met by communities of practice, where software is developed by domain scientists …
Massive stars provide feedback that shapes the interstellar medium of galaxies at all redshifts and their resulting stellar populations. Here we present three adaptive mesh refinement radiation hydrodynamics simulations that illustrate the impact of momentum transfer from ionising radiation to the absorbing gas on star formation in high-redshift dwarf galaxies. Momentum transfer is …
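For orientation, the momentum argument can be summarised with a standard single-scattering estimate (given for context, not quoted from the paper): each absorbed ionising photon of energy hν deposits momentum hν/c, so gas absorbing a luminosity L_abs gains momentum at a rate of roughly

```latex
% Single-scattering estimate of radiation-pressure momentum deposition;
% L_abs is the absorbed ionising luminosity, c the speed of light.
\Delta p_{\gamma} = \frac{h\nu}{c}, \qquad
\dot{p}_{\mathrm{gas}} \simeq \frac{L_{\mathrm{abs}}}{c}
```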
We study the buildup of magnetic fields during the formation of Population III star-forming regions by conducting cosmological simulations from realistic initial conditions and varying the Jeans resolution. To investigate this in detail, we start …
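"Jeans resolution" here refers to the number of grid cells resolving the local Jeans length. The definitions below are the standard ones, with c_s the sound speed, ρ the gas density, G the gravitational constant, and Δx the cell size; they are supplied for context rather than taken from the abstract:

```latex
% Jeans length and the number of cells resolving it ("Jeans resolution")
\lambda_J = c_s \sqrt{\frac{\pi}{G\rho}}, \qquad
N_J \equiv \frac{\lambda_J}{\Delta x}
```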
By definition, Population III stars are metal-free, and their protostellar collapse is driven by molecular hydrogen cooling in the gas phase, leading to large characteristic masses. Population II stars with lower characteristic masses form when the …
The use of the high-level scripting language Python has enabled new mechanisms for the interrogation, discovery and visualization of scientific data. We present yt, an open source, community-developed astrophysical analysis and visualization …
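Since yt is the package being described, a minimal session illustrates the kind of interrogation and visualization the abstract refers to. yt.load, all_data, quantities.extrema, and SlicePlot are part of yt's public API; the dataset path is a placeholder.

```python
# Minimal yt session; the dataset path is a hypothetical Enzo output.
import yt

ds = yt.load("DD0046/DD0046")                      # open a simulation output
ad = ds.all_data()                                 # data object covering the domain
print(ad.quantities.extrema(("gas", "density")))   # min/max gas density
slc = yt.SlicePlot(ds, "z", ("gas", "density"))    # slice through the midplane
slc.save("density_slice.png")
```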
The transformation of atomic hydrogen to molecular hydrogen through three-body reactions is a crucial stage in the collapse of primordial, metal-free halos, where the first generation of stars (Population III stars) in the universe is formed. …
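The three-body reactions in question are the standard gas-phase formation channels, which become dominant at roughly n ≳ 10^8 cm^-3 (a commonly quoted threshold, stated here for context rather than taken from the abstract):

```latex
% Three-body formation of molecular hydrogen
\mathrm{H} + \mathrm{H} + \mathrm{H} \;\rightarrow\; \mathrm{H_2} + \mathrm{H}, \qquad
\mathrm{H} + \mathrm{H} + \mathrm{H_2} \;\rightarrow\; \mathrm{H_2} + \mathrm{H_2}
```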
The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these …
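Parallelization is one of the challenges named above, and yt addresses it through MPI-aware analysis. The sketch below uses yt's real enable_parallelism and is_root calls; the dataset name is again a placeholder, and the script only distributes work when launched under mpirun.

```python
# Run as: mpirun -np 8 python project_density.py
import yt
yt.enable_parallelism()       # harmless without MPI; distributes work under mpirun

ds = yt.load("DD0046/DD0046")                         # hypothetical dataset
prj = yt.ProjectionPlot(ds, "x", ("gas", "density"))  # projection computed across ranks
if yt.is_root():                                      # only rank 0 writes the image
    prj.save("density_projection.png")
```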
Modern N-body cosmological simulations contain billions ($10^9$) of dark matter particles. These simulations require hundreds to thousands of gigabytes of memory, and employ hundreds to tens of thousands of processing cores on many compute nodes. In …
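A rough bookkeeping exercise shows why the memory footprint reaches that scale. The per-particle byte count below is an assumption (positions, velocities, mass, and an ID stored at 64-bit precision); production codes carry additional tree structures and communication buffers on top of this.

```python
# Back-of-the-envelope memory estimate for raw particle data alone.
bytes_per_particle = 3 * 8 + 3 * 8 + 8 + 8        # pos + vel + mass + id = 64 bytes
for n in (1e9, 1e10, 3e10):
    gib = n * bytes_per_particle / 2**30
    print(f"{n:.0e} particles -> {gib:,.0f} GiB of raw particle data")
# Roughly 60 GiB at 10^9 particles, ~600 GiB at 10^10, and ~1.8 TiB at 3x10^10,
# consistent with the hundreds-to-thousands-of-gigabytes range quoted above.
```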