Although Zipf's law was originally and most famously observed for word frequency, it is surprisingly limited in its applicability to human language, holding over no more than three to four orders of magnitude before hitting a clear break in scaling. Here, building on the simple observation that phrases of one or more words comprise the most coherent units of meaning in language, we show empirically that Zipf's law for phrases extends over as many as nine orders of rank magnitude. In doing so, we develop a principled and scalable statistical mechanical method of random text partitioning, which opens up a rich frontier of rigorous text analysis via a rank ordering of mixed-length phrases.
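The core idea of random text partitioning can be illustrated with a minimal sketch: cut a word sequence into phrases by deciding, independently at each word boundary, whether to end the current phrase, then rank the resulting mixed-length phrases by frequency. Note this is a hedged illustration only; the function name `random_partition` and the cut probability `q` are assumptions for this example, not the paper's exact formulation.

```python
import random
from collections import Counter

def random_partition(words, q=0.5, rng=None):
    """Illustrative sketch of random text partitioning:
    after each word, close the current phrase with probability q.
    (Names and parameterization are assumptions, not the paper's.)"""
    rng = rng or random.Random(0)
    phrases, current = [], []
    for word in words:
        current.append(word)
        if rng.random() < q:           # cut here with probability q
            phrases.append(" ".join(current))
            current = []
    if current:                        # flush any trailing phrase
        phrases.append(" ".join(current))
    return phrases

# Toy corpus: partition, then rank phrases by frequency.
text = "the quick brown fox jumps over the lazy dog " * 100
phrases = random_partition(text.split(), q=0.5)
ranked = Counter(phrases).most_common()
for rank, (phrase, count) in enumerate(ranked[:5], 1):
    print(rank, phrase, count)
```

On a real corpus, plotting phrase frequency against rank on log-log axes is what would reveal the extended Zipfian scaling the abstract describes.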
This work is licensed under a Creative Commons Attribution 4.0 International License.
Williams JR, Lessard PR, Desu S, Clark EM, Bagrow JP, Danforth CM, Dodds PS. Zipf's law holds for phrases, not words. Scientific Reports. 2015 Aug 11;5:12209.