Abstract
Recently there have been suggestions that the Type Ia supernova data can be
explained using only general relativity and cold dark matter with no dark
energy. In "Swiss cheese" models of the Universe, the standard
Friedmann-Robertson-Walker picture is modified by the introduction of
mass-compensating spherical inhomogeneities, typically described by the
Lemaître-Tolman-Bondi metric. If these inhomogeneities correspond to underdense
cores surrounded by mass-compensating overdense shells, then they can modify
the luminosity distance-redshift relation in a way that can mimic accelerated
expansion. It has been argued that this effect could be large enough to explain
the supernova data without introducing dark energy or modified gravity. We show
that the large apparent acceleration seen in some models can be explained in
terms of standard weak field gravitational lensing together with insufficient
randomization of void locations. The underdense regions focus the light less
than the homogeneous background, thus dimming supernovae in a way that can
mimic the effects of acceleration. With insufficient randomization of the
spatial location of the voids and of the lines of sight, coherent defocusing
can lead to anomalously large demagnification effects. We show that a proper
randomization of the voids and lines of sight reduces the effect to the point
that it can no longer explain the supernova data.
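The coherent-versus-randomized defocusing argument can be illustrated with a toy weak-lensing Monte Carlo (this sketch is not from the paper; the void profile, the core radius, the per-void convergence scale, and the number of voids traversed are all hypothetical choices made only for illustration). Each mass-compensated void has an underdense core surrounded by an overdense shell, so the chord-integrated density contrast is negative through the core and positive through the shell; a line of sight threading every void center accumulates a coherent negative convergence (demagnification), while random impact parameters average to roughly zero because the projected mass of each void compensates.

```python
import numpy as np

rng = np.random.default_rng(0)

R = 1.0   # void radius, arbitrary units (hypothetical toy value)
a = 0.7   # underdense core radius (hypothetical toy value)
# Shell contrast chosen so the total mass of the void compensates:
# (4/3)*pi*a^3*(-1) + (4/3)*pi*(R^3 - a^3)*delta_s = 0
delta_s = a**3 / (R**3 - a**3)

def sigma(b):
    """Chord-integrated density contrast at impact parameter b.

    Negative through the empty core (delta = -1), positive through
    the compensating shell (delta = +delta_s).
    """
    shell = 2.0 * np.sqrt(np.clip(R**2 - b**2, 0.0, None)) * delta_s
    core_chord = 2.0 * np.sqrt(np.clip(a**2 - b**2, 0.0, None))
    # inside the core, the chord carries delta = -1 instead of +delta_s
    return shell + core_chord * (-1.0 - delta_s)

n_voids = 200   # voids traversed along one line of sight (toy value)
eps = 1e-3      # toy convergence per unit projected contrast (weak field)

# Insufficient randomization: every void is hit dead center (b = 0),
# so the defocusing adds coherently.
kappa_coherent = n_voids * eps * sigma(0.0)

# Proper randomization: impact parameters drawn uniformly over each
# void's cross-section, p(b) proportional to b on [0, R].
b = R * np.sqrt(rng.uniform(size=n_voids))
kappa_random = eps * sigma(b).sum()

print(f"coherent total convergence:   {kappa_coherent:+.4f}")
print(f"randomized total convergence: {kappa_random:+.4f}")
```

In this toy model the coherent line of sight picks up a large negative total convergence (anomalous demagnification, mimicking acceleration), while randomizing the impact parameters drives the net convergence toward zero, in line with the randomization argument above.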