Abstract
Radiation feedback from stellar clusters is expected to play a key role in
setting the rate and efficiency of star formation in giant molecular clouds
(GMCs). To investigate how radiation forces influence realistic turbulent
systems, we have conducted a series of numerical simulations employing the
\textit{Hyperion} radiation hydrodynamics solver, considering the regime that is
optically thick to ultraviolet (UV) and optically thin to infrared (IR)
radiation. Our model clouds cover initial surface densities
$\Sigma_{\rm cl,0} = 10$--$300~M_\odot~{\rm pc}^{-2}$, with varying initial
turbulence. We follow them through turbulent, self-gravitating collapse,
formation of star clusters, and cloud dispersal by stellar radiation. All our
models display a lognormal distribution of gas surface density $\Sigma$; for an
initial virial parameter $\alpha_{\rm vir,0} = 2$, the lognormal standard
deviation is $\sigma_{\ln \Sigma} = 1$--$1.5$ and the star formation rate
coefficient $\varepsilon_{\rm ff,\bar\rho} = 0.3$--$0.5$, both of which are
sensitive to turbulence but not radiation feedback. The net star formation
efficiency $\varepsilon_{\rm final}$ increases with $\Sigma_{\rm cl,0}$ and
decreases with $\alpha_{\rm vir,0}$. We interpret these results via a simple
conceptual framework, whereby steady star formation increases the radiation
force, such that local gas patches at successively higher $\Sigma$ become
unbound. Based on this formalism (with fixed $\sigma_{\ln \Sigma}$), we
provide an analytic upper bound on $\varepsilon_{\rm final}$, which is in
good agreement with our numerical results. The final star formation efficiency
depends on the distribution of Eddington ratios in the cloud and is strongly
increased by turbulent compression of gas.
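The lognormal surface-density statistics quoted above can be written in the standard form for a mean-preserving lognormal; the specific normalization convention below is an illustrative assumption, not reproduced from the paper itself:

```latex
% Lognormal PDF of the normalized surface density x \equiv \Sigma/\bar{\Sigma};
% choosing \mu = -\sigma_{\ln\Sigma}^{2}/2 enforces \langle x \rangle = 1.
p(\ln x)\, d\ln x
  = \frac{1}{\sqrt{2\pi}\,\sigma_{\ln\Sigma}}
    \exp\!\left[ -\frac{(\ln x - \mu)^{2}}{2\,\sigma_{\ln\Sigma}^{2}} \right] d\ln x,
\qquad
\mu = -\frac{\sigma_{\ln\Sigma}^{2}}{2}.
```

With the measured $\sigma_{\ln \Sigma} = 1$--$1.5$, this distribution fixes the mass fraction residing at any given $\Sigma$, which is the quantity the Eddington-ratio argument above acts on.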