Pictish art may have actually been written language

How do you tell the difference between art and written language?

Oh, yeah. It's math.

[Rob Lee] and colleagues Philip Jonathan and Pauline Ziman analyzed the engravings found on the few hundred known Pictish stones. The researchers used a mathematical measure known as Shannon entropy to study the order, direction, randomness and other characteristics of each engraving.

The resulting data was compared with that for numerous written languages, such as Egyptian hieroglyphs, Chinese texts and written Latin, Anglo-Saxon, Old Norse, Ancient Irish, Old Irish and Old Welsh. While the Pictish Stone engravings did not match any of these, they displayed characteristics of writing based on a spoken language.

There is, sadly, not a lot of detail about what specific characteristics make language stand out from decoration. I'm guessing it has something to do with finding patterns in the choice of symbols, or the way symbols are oriented, or how the patterns repeat. Wish there were more, though. For the record, even if this is language, nobody is even close to deciphering what it means.

On a side note: Shannon entropy is a measure of the amount of information that we get from knowing one English letter. It's kind of like the entropy of Wheel of Fortune—how many guesses does it take to figure out all the letters of a sentence using only the information provided by the letters previously guessed? Besides identifying ancient scripts, it makes for a fun, time-wasting applet game.
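If you want a feel for what that calculation looks like, here's a quick back-of-the-envelope sketch in Python. To be clear, this is my toy version, not the researchers' actual method: it just estimates entropy from symbol frequencies.

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Estimate Shannon entropy in bits per symbol from symbol frequencies."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

print(shannon_entropy("the quick brown fox jumps over the lazy dog"))  # varied text: around 4 bits per symbol
print(shannon_entropy("aaaaaaaaaaaaaaaa"))                             # one repeated symbol: 0 bits
```

The lower the number, the more predictable the text, and the fewer Wheel of Fortune guesses you need per letter.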


  1. Shannon Entropy is a measure of the information content of a set of symbols. It’s related to the minimal compressed size of a text, and one can use compression programs such as gzip to estimate it.

    Purely random sequences of symbols have high (maximal) entropy, but the entropy of human language is typically 5-10 times smaller. I expect the researchers just looked at the distribution of k-grams (sequences of k symbols) in the engravings, and ascertained that they’re far from random.
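    For instance, a toy sketch along those lines (mine, nothing to do with the actual paper), using Python’s zlib as the compressor:

    ```python
    import random
    import string
    import zlib
    from collections import Counter

    def compression_ratio(text):
        """Crude entropy proxy: compressed size over raw size (lower = more structure)."""
        raw = text.encode()
        return len(zlib.compress(raw, 9)) / len(raw)

    random.seed(0)
    # Purely random symbols: near-maximal entropy for a 26-letter alphabet.
    noise = "".join(random.choice(string.ascii_lowercase) for _ in range(5000))
    # Language-like text: repeated words and k-grams give the compressor traction.
    prose = "the stone carvers cut the crescent and the rod into the stone " * 80

    print(compression_ratio(noise))  # roughly 0.6: only the small alphabet helps
    print(compression_ratio(prose))  # far lower: the structure compresses away

    # The k-gram view of the same fact: in language a few bigrams dominate,
    # while in random noise the bigram distribution is nearly flat.
    print(Counter(prose[i:i + 2] for i in range(len(prose) - 1)).most_common(3))
    print(Counter(noise[i:i + 2] for i in range(len(noise) - 1)).most_common(3))
    ```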

  2. Oh god – here comes another stupid Dan Brown novel. They’ve even got a quote from “one of the world’s leading experts on signs and symbols”…

  3. Shannon entropy is great for analysis of information in images as well! Entropic consideration of spatially resolved signals can filter out random, high-amplitude noise.
    I work with people who use it in ultrasound imaging. Amongst other applications, we’ve tried it out to pick up defects in synthetic materials and to distinguish dystrophy-affected tissue in DMD patients.
    (Articles may require institutional access)

  4. Unfortunately, information theory doesn’t help figure out the actual meaning. Without a Rosetta Stone of some sort, we’ll never know.

  5. “There is, sadly, not a lot of detail about what specific characteristics make language stand out from decoration.”

    A very insulting comment. Ever heard of the field of linguistics?

    This is linguistic anthropology, with a little Shannon Entropy thrown in for decoration. Linguistics is by nature quite mathematical.

    I am a linguistic anthropologist.

    1. > I am a linguistic anthropologist.

      Am I the only one who just pictured Anon popping out of the bushes and making Alton Brown grumpy?

    2. Enlighten us, oh linguistic anthropologist, instead of taking insult when we ask you how you know your stuff works.

      I mean, really, you’re coming off like a dick here. Why don’t you get yourself an account, pretend you are not the guy who wrote that anon comment, and answer the question instead of getting mad that it was asked.

      I’ve read “The Horse, the Wheel, and Language” cover to cover and I spent a vacation once 20 years ago wandering around Scotland looking at Pictish stones. I can tell you the major themes (tongs, crescent & v-rod, peanut & z-rod, etc.). People here are interested in this subject. Sign in, and talk to us about it. You’ll rarely find such a mix of polymaths and autodidacts as you’ll find here.

    3. Not a lot of detail here. Obviously Maggie knows such details exist, given that she has linked to an article about using them; but that article doesn’t give too many. As a linguistic anthropologist, you’d think you’d pay a little more attention to context.

    4. I don’t think the observation you are talking about does anything to slight the field of linguistics. However, it might make a statement about the nature of art and language that humans produce.

      Both language and art consist of self-referential patterns that really aren’t that different from one another; depending on what language culture you were raised in, they can be quite hard to disentangle. In fact, I can think of a few languages that are founded in picture-drawing…

      This difficulty in art/language disentanglement is particularly heinous when a culture’s art and its language exist in a sort of superposition with one another. If you want to blow your mind with fascinating linguistics, watch “Cracking the Maya Code”:


  6. Until you know what bits of the pictures (pardon) contribute to meaning, you don’t know what to compress, and have no clue what its entropy might be. It’s possible to have writing that maps one-to-one with the 35,000 or so “words” of a language, assuming all languages are equally complex, with NO compressibility.

    And yes, Noam Chomsky demonstrated a beautifully elegant model of English grammar several decades ago, one that is today the foundation of electronic translation devices (the military versions, not Babelfish).

  7. We used techniques like this in my freshman engineering class at Purdue. We did a unit on cryptography, mostly using Matlab to crack 1960s-era U.S. and Soviet encryption systems.

    The concept of Shannon entropy sounds similar to the index of coincidence: the probability that two randomly chosen characters in a string of text are the same, usually normalized by the alphabet size. If I remember right, the normalized IC of English is something like 1.73. So you can take a string of encrypted text, manipulate it a zillion times using raw computing power, and then pick out the iteration that results in decrypted text by looking at the IC (graphing all the IC values for each iteration in a histogram is handy). That way you can find the decryption algorithm without having to examine every one of the hundreds or thousands of possible algorithms individually to determine if the text is readable. There’s a quick sketch of the calculation at the end of this comment.

    This makes me miss my days at Purdue… :(

    Or…Maybe I don’t miss the all-nighters in ENAD pounding away on a Sun workstation all that much…
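    For anyone who wants to play along at home, the IC calculation mentioned above is only a few lines. Here’s a sketch in Python rather than Matlab (written from memory, so treat the details as approximate):

    ```python
    import random
    import string
    from collections import Counter

    def index_of_coincidence(text, alphabet_size=26):
        """Normalized IC: chance that two randomly drawn letters match, times alphabet size."""
        letters = [c for c in text.lower() if c.isalpha()]
        n = len(letters)
        counts = Counter(letters)
        return alphabet_size * sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))

    english = (
        "we did a unit on cryptography in class and spent long nights "
        "breaking simple ciphers by counting letters and comparing the "
        "frequency tables against ordinary english prose of every kind"
    )
    print(index_of_coincidence(english))  # English prose typically lands near 1.73 (short samples vary)

    random.seed(0)
    noise = "".join(random.choice(string.ascii_lowercase) for _ in range(10000))
    print(index_of_coincidence(noise))    # uniform random text sits near 1.0
    ```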

  8. I reasonably believe that classic Chomskyites will argue that this is less evidence that Pictish pictograms are a written expression of a spoken human language and more evidence of the human brain having an innate grammar that imposes itself on linear expressions of human thought.

    Grikdog: you don’t pick and choose which subsets of the known pictograms to compress, you compress (run the Shannon-entropy analysis on) the entire set of known pictograms. (Otherwise you’re playing Bible Code.) Not all languages have an equal Shannon entropy — or even similar amounts of Shannon entropy — but it’s an accepted axiom in linguistics that all written human languages have a Shannon entropy some orders of magnitude less than a collection of grouped spatial graphic depictions, which has less than a collection of naturally-arisen ordering (allow me to flip the bird once more at Dembski), which has less than a collection of “randomness”.
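    To make that tiering concrete with a toy example (mine, not from the paper): compare the bigram entropy of a text against the same symbols shuffled. Shuffling keeps every symbol frequency identical but destroys the sequential grammar, and the entropy climbs accordingly:

    ```python
    import math
    import random
    from collections import Counter

    def bigram_entropy(seq):
        """Shannon entropy in bits over the sequence's adjacent symbol pairs."""
        pairs = Counter(zip(seq, seq[1:]))
        total = sum(pairs.values())
        return -sum((n / total) * math.log2(n / total) for n in pairs.values())

    text = list("the crescent and the rod beside the beast and the mirror " * 50)
    shuffled = text[:]
    random.seed(0)
    random.shuffle(shuffled)

    print(bigram_entropy(text))      # ordered language: lower bigram entropy
    print(bigram_entropy(shuffled))  # same symbols, grammar destroyed: higher
    ```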

    1. bardfinn – maybe I’m missing your point (and I have only barely skimmed the article, which probably raises the likelihood that I am).

      I took Grikdog’s “Until you know what bits of the pictures (pardon) contribute to meaning, you don’t know what to compress, and have no clue what its entropy might be” to mean, how do you know which features of a pictogram have meaning, and would create a distinct pictogram, versus which strokes are just a variation in handwriting style: This dog’s head has one ear cocked – do we call this another instance of “dog’s head”, or do we call it a new and distinct pictogram, “dog’s head with cocked ear”?

  9. What? No several species of small furry animals gathered together in a cave and grooving?
