
Phoenician ASCII

All communication, since the dawn of language, starts as synchronous communication and tends toward the asynchronous. Oral histories disappear from cultures once writing and literacy are common enough to allow it. Orders delivered through synchronous communication (verbal conversation) can run a tribe, but not an empire. Voicemail, e-mail, and text messages replace phone calls and face-to-face meetings. Having to be in the same place at the same time, let's face it, is inconvenient.

Some 3500 years ago, in one of the great and shining moments of human achievement (cue the monolith from 2001), the Phoenicians reprogrammed the human mind.

In the beginning, distinct sounds represented distinct thoughts. Words were the fundamental, atomic unit of communication. I say "horse", and the idea of the equine quadruped pops into your head. (Computer scientists still honor this tradition when talking about PC architecture: the natural chunk of memory a processor works with is called a "word". That's what "32-bit" refers to; a 32-bit architecture has registers and addresses 32 bits wide, and moves data through the processor 32 bits at a time, even though memory itself is still addressed byte by byte.) As language became more complex, so did the words. The first forms of asynchronous communication to develop all over the world were pictographic in nature. Just like a spoken word, a visual symbol is a placeholder for an idea. Simple ideas got simple pictograms, just as they got simple words; complex ideas became combinations of pictograms, just as they became combinations of words. If a language had a massive vocabulary, it also had, by necessity, a massive collection of pictograms to represent it. This is the state Egyptian hieroglyphics were in at the height of that civilization: writing was an absolute necessity for governing an empire so large, but the writing was so complex that only a very few were truly literate, and only those who held power.
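The word-size aside above can be poked at directly. Here's a minimal Python sketch (Python and its standard `struct` module are my illustration, not anything from the post) that reports the machine word width of whatever runs it, and shows that memory is nonetheless counted in single bytes:

```python
import struct

# A native pointer ("P") is one machine word wide:
# 4 bytes on a 32-bit build, 8 bytes on a 64-bit build.
word_bits = struct.calcsize("P") * 8
print(f"machine word: {word_bits} bits")

# Memory is still addressed byte by byte, so one ASCII
# character occupies exactly one byte, not one word.
print(len("h".encode("ascii")))  # prints 1
```

On most machines today this will report a 64-bit word; the point survives either way.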

The Phoenicians were a maritime trading culture: merchants who traded all along the Mediterranean Sea. Asynchronous communication was even more essential to a seafaring people, for whom geography and slow travel made synchronous communication nearly impossible. Wouldn't it be amazing if we could witness the moment in history when it dawned on that first Phoenician thinker that the number of possible sounds used to form language was far more finite than the number of distinct words that make up its vocabulary? That with just a few dozen symbols, every "word" in their language could be constructed in an easy-to-remember visual shorthand? What an incredible shift in thinking! That conceptual leap, from direct symbolic representation to another layer of abstraction, is nothing short of miraculous: translating the "idea" into the "word", then into component simple sounds! (The resemblance between "Phoenician" and "phonetic" is, strictly speaking, a happy accident; the latter comes from the Greek word for sound. But the association could hardly be more apt.)

The human mind is good at internalizing a vast vocabulary by binding sound to idea; our brains are wired for it. So that first moment of genius, when the sounds themselves were broken up into simpler building blocks, was akin to splitting the conceptual atom. The words "horse" and "house" sound similar to the ear yet evoke vastly different ideas in the human mind, while the sound that "h" makes doesn't evoke much of an idea at all. Who was the Phoenician madman who first thought to build in that layer of abstraction, and encode distinct ideas ("words") into a compressed and easily transmitted format? I am humbled by the insight that moment in human history must have taken.

We have so altered the way human consciousness works that the very idea of "words" being atomic to language seems alien to us. We can't help but think that letters make up words, and words make up ideas. But in reality, this isn't the case! Letters are wholly abstract; each is nothing more than an auditory shorthand that carries no meaning in and of itself. Letters, as an idea, rest on top of words, not beneath them. The irony of trying to convey this notion while using a phonetic alphabet isn't lost on me.

Tunnelling further down the rabbit hole, think about this: right now, the letters on this page were conveyed to you in *yet another* layer of abstraction. It takes only 8 little on/off switches (bits) to store a number between 0 and 255, and 128 of those numbers are mapped to our phonetic alphabet (plus digits, punctuation, and control codes), each number producing a specific letter on your screen right now. This is what the ASCII code is. (Of course, you're probably reading this on a 32-bit computer, and inside the processor each letter may ride around in a full 32-bit register, which is a lot of room for an 8-bit value. But in memory each letter still occupies just one byte, since the machine addresses memory byte by byte, not word by word.)
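The mapping is easy to see for yourself. A small Python sketch (my illustration, not part of the original post) round-trips a word through its ASCII numbers and bits:

```python
# Letter -> number -> bits: each ASCII character is a value
# from 0 to 127, fitting comfortably in one 8-bit byte.
for ch in "horse":
    code = ord(ch)
    print(ch, code, format(code, "08b"))  # e.g. h 104 01101000

# And back: numbers -> letters reconstructs the word.
codes = [ord(ch) for ch in "horse"]
assert "".join(chr(n) for n in codes) == "horse"
```

The idea of the horse, the spoken word, the five letters, the five numbers, the forty bits: each layer is a stand-in for the one above it.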

A phonetic alphabet is the basis for the earliest compression algorithms, and this paradigm shift in human language and thought is nothing short of a monumental development in human evolution.
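To put a toy number on that "compression": a pictographic system needs roughly one symbol per word, while a phonetic one spells an entire vocabulary from a couple dozen symbols. A back-of-the-envelope sketch in Python (the 50,000-word vocabulary is an illustrative assumption, not linguistic data):

```python
import math

alphabet_size = 26   # symbols a phonetic system must teach
vocabulary = 50_000  # assumed distinct words in the language

# Pictographic: roughly one symbol per word.
pictograms_needed = vocabulary

# Phonetic: how long must words get before 26 symbols can
# spell them all? ceil(log_26(50_000)) = 4 letters suffice,
# since 26**4 = 456,976 possible four-letter strings.
max_word_len = math.ceil(math.log(vocabulary, alphabet_size))
print(pictograms_needed, "symbols, versus", alphabet_size,
      "symbols and words up to length", max_word_len)
```

Fifty thousand symbols to memorize, versus twenty-six: that is the literacy gap between the scribe-priest and the merchant's clerk.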
