Recent work on the representation of functions on sets has considered the use
of summation in a latent space to enforce permutation invariance. In
particular, it has been conjectured that the dimension of this latent space may
remain fixed as the cardinality of the sets under consideration increases.
However, we demonstrate that the analysis leading to this conjecture requires
mappings which are highly discontinuous, and argue that such mappings are of
limited practical use. Motivated by this observation, we prove that an implementation
of this model via continuous mappings (as provided by e.g. neural networks or
Gaussian processes) actually imposes a constraint on the dimensionality of the
latent space. Practical universal function representation for set inputs can
only be achieved with a latent dimension at least as large as the maximum
number of input elements.
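The sum-decomposition model discussed above can be sketched as follows. This is a minimal illustrative example (not the paper's implementation): a per-element encoder phi maps each input to a latent vector, summation over the set enforces permutation invariance, and a decoder rho maps the pooled latent to the output. The specific functions and weights here are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

latent_dim = 4
W_phi = rng.standard_normal((1, latent_dim))  # illustrative per-element encoder weights
w_rho = rng.standard_normal(latent_dim)       # illustrative decoder weights


def phi(xs):
    # Encode each scalar element into a latent_dim-dimensional vector.
    return np.tanh(xs[:, None] * W_phi)       # shape (n, latent_dim)


def f(xs):
    # f(X) = rho(sum over elements of phi(x)); the sum makes f
    # invariant to the ordering of the elements of xs.
    pooled = phi(xs).sum(axis=0)
    return float(w_rho @ pooled)


xs = np.array([0.3, -1.2, 0.7])
permuted = xs[[2, 0, 1]]
print(np.isclose(f(xs), f(permuted)))  # permuting the set leaves the output unchanged
```

Note that the latent dimension here is a free choice; the abstract's result is precisely that, for continuous phi and rho, universal representation requires this dimension to be at least the maximum set size.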