ResearchTrend.AI

arXiv:1804.07893 (v2, latest)

Taylor's law for Human Linguistic Sequences

21 April 2018
Tatsuru Kobayashi
Kumiko Tanaka-Ishii
Abstract

Taylor's law describes the fluctuation characteristics underlying a system in which the variance of an event within a time span grows as a power law of the mean. Although Taylor's law has been applied to many natural and social systems, its application to language has been scarce. This article describes a Taylor analysis of over 1,100 texts across 14 languages. The Taylor exponents of natural language texts cluster around almost the same value. The exponent was also computed for other language-related data, such as the CHILDES corpus, music, and programming languages. The results show how the Taylor exponent serves to quantify the fundamental structural complexity underlying a linguistic time series. The article also demonstrates the applicability of these findings to evaluating language models: a text generated by an LSTM exhibited a Taylor exponent of 0.50, identical to that of an i.i.d. process, revealing a limitation of that neural model.
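To make the quantity concrete, the following is a minimal sketch of how a Taylor exponent can be estimated from a word sequence: split the sequence into fixed-size windows, compute each word's mean and standard deviation of counts across windows, and fit the slope of log sigma versus log mu. The window size, the Zipf-like synthetic data, and the plain least-squares fit are illustrative choices, not the paper's exact procedure; for an i.i.d. sequence the fitted exponent should come out near 0.50, as the abstract notes.

```python
import random
from collections import Counter
from math import log, sqrt

def taylor_exponent(words, window=1000):
    """Estimate the Taylor exponent alpha in sigma = mu**alpha over fixed windows."""
    # Count each word's occurrences within non-overlapping windows.
    windows = [Counter(words[i:i + window])
               for i in range(0, len(words) - window + 1, window)]
    n = len(windows)
    xs, ys = [], []
    for w in set().union(*windows):
        counts = [win.get(w, 0) for win in windows]
        mu = sum(counts) / n
        var = sum((c - mu) ** 2 for c in counts) / n
        if mu > 0 and var > 0:
            # log sigma = alpha * log mu, so alpha is the log-log slope.
            xs.append(log(mu))
            ys.append(log(sqrt(var)))
    # Ordinary least-squares slope of log sigma against log mu.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic i.i.d. sequence with a Zipf-like (1/k) frequency profile,
# so that word means span a range and the log-log fit is well conditioned.
random.seed(0)
symbols = list(range(1, 51))
iid = random.choices(symbols, weights=[1 / k for k in symbols], k=200000)
print(taylor_exponent(iid, window=1000))  # should be close to 0.50
```

Because the synthetic sequence is i.i.d., per-window counts are approximately Poisson, giving variance roughly equal to the mean and hence an exponent near 0.5; natural language texts, per the abstract, deviate from this baseline.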
