Article

Language Evolution and Information Theory

Joshua B. Plotkin and Martin A. Nowak.
Journal of Theoretical Biology, 205 (1): 147--159 (July 2000)
DOI: 10.1006/jtbi.2000.2053

Abstract

This paper places models of language evolution within the framework of information theory. We study how signals become associated with meaning. If there is a probability of mistaking signals for each other, then evolution leads to an error limit: increasing the number of signals does not increase the fitness of a language beyond a certain limit. This error limit can be overcome by word formation: a linear increase of the word length leads to an exponential increase of the maximum fitness. We develop a general model of word formation and demonstrate the connection between the error limit and Shannon's noisy coding theorem.
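To make the error limit concrete, here is a minimal sketch, not the paper's exact model: it assumes n signals placed evenly on a one-dimensional perceptual interval, with the chance of confusing two signals falling off with their distance (Gaussian confusability of width "noise"). The fitness of the language is taken as the expected number of meanings communicated correctly. The function name error_limit_fitness and the noise value 0.05 are illustrative choices, not quantities from the paper.

import numpy as np

def error_limit_fitness(n, noise=0.05):
    # n signals placed evenly on the unit perceptual interval [0, 1].
    x = np.linspace(0.0, 1.0, n)
    # Gaussian similarity: nearby signals are easy to mistake for each other.
    similarity = np.exp(-((x[:, None] - x[None, :]) / noise) ** 2)
    # u[i, j]: probability that signal i is perceived as signal j.
    u = similarity / similarity.sum(axis=1, keepdims=True)
    # Fitness = expected number of correctly understood meanings,
    # i.e. the sum over i of u[i, i].
    return float(np.trace(u))

# Adding signals crowds the perceptual space, so fitness saturates
# (the error limit) instead of growing with n.
for n in (2, 5, 10, 20, 50, 100):
    print(n, round(error_limit_fitness(n), 2))

Running the loop shows the fitness levelling off (around 11 to 12 with these parameters) no matter how many signals are added. In the paper's framework, word formation escapes this ceiling: stringing phonemes into words of length l lets the number of reliably distinguishable words, and hence the maximum fitness, grow exponentially in l, which is where Shannon's noisy coding theorem enters.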
