Information Languages

Illustration: Sean Serafini

The word information may bring to mind data tables, books, or computer systems. But it’s much more than that. Information is the ultimate unsplittable particle, the very building block of matter, dictating every structure, organization, and quantum state. In its simplest form, it is the bit: a choice, a yes or a no. Every particle, every field of force, even the space-time continuum itself derives its function, its very existence, from bits.1

Much of this understanding evolved from a 1948 paper by the mathematician Claude Shannon. Shannon’s paper provided a theory for how compactly and how fast information could be communicated through a given channel to “reproduce at one point either exactly or approximately a message selected at another point.” To solve this he needed to measure information. To measure it he needed to define it. And to define it, he had to find its simplest form. Shannon named that simplest form the bit. His theory built a bridge between information and uncertainty, and it revolutionized the sciences: physics, biology, economics, quantum mechanics, computer science. It became a way to measure, decipher, compress, store, and transmit information on a scale never before possible.3
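Shannon’s measure can be sketched in a line. As a gloss on the idea (the formula comes from his 1948 paper and is not quoted in this article): a source that emits symbols with probabilities p_i carries, on average,

% Shannon entropy: the average information per symbol, in bits
H = -\sum_{i} p_{i} \log_{2} p_{i}

bits per symbol. A fair coin flip, with p = 1/2 for each face, gives H = 1: one flip, one bit.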

James Gleick, in his brilliant book The Information: A History, a Theory, a Flood, proclaims that in the modern age, “Information is what our world runs on: the blood, the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge.” Force, mass, motion, time: the laws of physics explain our universe through algorithms, codes written in the language of numbers. “When photons and electrons and other particles interact, what are they really doing? Exchanging bits, transmitting quantum states, processing information.”1

We now know the building block of life as information: DNA, genes, four-letter replicating codes that transmit the instructions for building an organism. Six billion bits to form a human. Just as genes live on by successfully multiplying, their cultural twin, memes (brands, slogans, myths, catchphrases, songs, clichés), act the same way, using human or computer “carriers” to perpetuate and reproduce themselves. Successful memes survive; the rest go extinct.2
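The arithmetic behind that six-billion figure is worth a quick gloss (mine, not the article’s): with four possible bases, each position in the genome carries two bits, and the human genome runs to roughly three billion base pairs.

% two bits per base, times roughly three billion bases
\log_{2} 4 \times 3 \times 10^{9} = 2 \times 3 \times 10^{9} = 6 \times 10^{9} \text{ bits}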

The communication of information has always fundamentally altered human consciousness. Information dictates how we process what we see, hear, and believe. Our construction of language, codes for experiences and ideas, allowed us to find self-awareness, to organize communities, and to manipulate the food supply. Early written language enabled humans to stand on the shoulders of those who came before, to pass knowledge across generations, and to form concepts of religion and morals.

Gutenberg’s printing press accelerated the reliability and distribution of knowledge by duplicating information cheaply and quickly. It encouraged education, literacy, and free thinking, and propelled the sciences. “From the printing press came new species of information organizers: dictionaries, encyclopedias, almanacs—compendiums of words, classifiers of facts, trees of knowledge.”1

As we got better at distributing information, our world got smaller. Alphabets, numbers, dots, dashes, and flashes of light began to ride across the planet through telegraph codes, telephone lines, radio waves, and television signals. Symbols and signs are now coded into trillions of gigabytes’ worth of emails, business documents, credit card transactions, global stock exchanges, classified reports, Facebook photos, music, and games, all stored within thousands of server warehouses across the world. Search engines have replaced librarians, dictionaries, and encyclopedias as data is retrieved from the most comprehensive database of human knowledge, the Cloud.

Information has become cheap. But, as Gleick states, “when information is cheap, attention becomes expensive.” It’s not enough for information to simply exist; it needs to be relevant and easy to find. We are surrounded by what engineers refer to as noise: meaningless chatter along the lines of communication, drowning out the message being transmitted.
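Engineers can even put a number on how noise throttles a channel. As a sketch (this is the Shannon–Hartley theorem, which the article does not cite directly): a channel of bandwidth B hertz with signal-to-noise ratio S/N can carry at most

% channel capacity in bits per second
C = B \log_{2}\!\left(1 + \frac{S}{N}\right)

bits per second. As the noise swells relative to the signal, the capacity collapses toward zero, no matter how elaborate the encoding.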

Our problem of knowledge is akin to Jorge Luis Borges’s mythical ‘Library of Babel’. It was said to contain all knowledge, a library of every volume of every book in every language. Yet no knowledge can be found there, because all knowledge is there, shelved side by side with all falsehood.

Information is language. Translated properly, it could yield all knowledge. But the problem has always been the same: locating it amid the noise.

Notes:

1. Gleick, James. The Information: A History, a Theory, a Flood. New York: Pantheon, 2011.

2. Dawkins, Richard. The Selfish Gene. Oxford: Oxford University Press, 1990.

3. Shannon, C. E. “A Mathematical Theory of Communication.” Bell System Technical Journal, Vol. 27, 1948.