Excess entropy in natural language: present state and perspectives

by Łukasz Dębowski

Released as an article.

2011  

Abstract

We review recent progress in understanding the meaning of mutual information in natural language. Let us define words in a text as strings that occur sufficiently often. In a few previous papers, we have shown that a power-law distribution of words so defined (a.k.a. Herdan's law) is obeyed if there is a similar power-law growth of (algorithmic) mutual information between adjacent portions of texts of increasing length. Moreover, the power-law growth of information holds if texts describe a complicated, infinite, (algorithmically) random object in a highly repetitive way, according to an analogous power-law distribution. The described object may be immutable (like a mathematical or physical constant) or may evolve slowly in time (like cultural heritage). Here we reflect on the respective mathematical results in a less technical way. We also discuss the feasibility of deciding to what extent these results apply to actual human communication.
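The power-law vocabulary growth mentioned above (Herdan's law, also known as Heaps' law) states that the number of distinct word types V(n) among the first n tokens grows roughly like C * n^beta with 0 < beta < 1. The following is a minimal illustrative sketch, not the paper's method: it estimates the exponent beta by a log-log least-squares fit on a toy corpus drawn from a Zipfian-like distribution, which is known to produce sublinear vocabulary growth. All names (`herdan_exponent`, the corpus parameters) are hypothetical.

```python
import math
import random

def herdan_exponent(tokens):
    """Estimate beta in V(n) ~ C * n^beta, where V(n) is the number of
    distinct word types among the first n tokens, via ordinary
    least squares on (log n, log V(n))."""
    seen = set()
    xs, ys = [], []
    for n, w in enumerate(tokens, start=1):
        seen.add(w)
        xs.append(math.log(n))
        ys.append(math.log(len(seen)))
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Toy corpus: token ranks sampled with probability proportional to 1/rank
# (a Zipfian-like distribution), yielding power-law vocabulary growth.
random.seed(0)
ranks = list(range(1, 5001))
weights = [1.0 / r for r in ranks]
tokens = random.choices(ranks, weights=weights, k=20000)

beta = herdan_exponent(tokens)
print(0.0 < beta < 1.0)  # sublinear (power-law) growth
```

On such a corpus the fitted exponent falls strictly between 0 and 1, the sublinear regime that the abstract connects to the power-law growth of mutual information.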

Type: article
Stage: submitted
Date: 2011-08-08
Version: v2
Language: en
arXiv: 1105.1306v2