Info Peripeteia



This is Morgan Ames and Tricia Wang's research blog documenting challenges to and support of the rhetoric of neo-informationalism. Peripeteia is a sudden reversal dependent on intellect and logic.


Working definition of neo-informationalism: the belief that information should function like currency in free-market capitalism—borderless, free from regulation, and mobile. The logic of neo-informationalism rests on an ethical framework tied to what we call “information determinism,” the belief that free and open access to information can create real social change.


Keywords: neo-informationalism, information determinism, open access, open source, F/OSS, hackers, data, history, political, global, digital, ideology, rhetoric, semiotic, freedom of information

The Information: James Gleick

Into the breach steps the gifted science writer James Gleick. In his formidable new book, The Information, Gleick explains how we’ve progressed from seeing information as the expression of human thought and emotion to looking at it as a commodity that can be processed, like wheat or plutonium. It’s a long, complicated, and important story, beginning with tribal drummers and ending with quantum physics, and in Gleick’s hands it’s also a mesmerizing one. Wisely, he avoids getting bogged down in the arcane formulas and equations of information theory—though (fair warning) there are quite a few of those—but rather situates his tale in the remarkable lives and discoveries of a series of brilliant mathematicians, logicians, and engineers.

There’s the eccentric English polymath Charles Babbage, who in the middle of the 19th century designed an elaborate calculating machine, the Analytical Engine, that anticipated the modern computer. There’s Countess Ada Lovelace Byron, the poet’s daughter, who, inspired by Babbage’s work, came up with the idea of the software algorithm. There’s the great philosopher-mathematician Bertrand Russell, who imagined that the language of mathematics would provide a perfect system of logic. And there’s the troubled Austrian theorist Kurt Gödel, who dismantled Russell’s dream by showing that mathematics is as prone to paradoxes and mysteries as any other language.

The star of Gleick’s story is a shy, gangly Midwesterner named Claude Shannon. As a boy growing up in a northern Michigan town in the 1920s, Shannon became obsessed with the mechanics of transmitting information. He turned a barbed-wire fence near his home into a makeshift telegraph system, using it to exchange coded messages with a friend a half mile away. After earning a doctorate from MIT, he joined Bell Labs as a researcher. In 1948, the same year that saw the invention of the transistor, Shannon published a groundbreaking monograph titled “A Mathematical Theory of Communication.” The paper was, as Gleick writes, “a fulcrum around which the world began to turn.”

Human beings, Shannon saw, communicate through codes—the strings of letters that form words and sentences, the dots and dashes of telegraph messages, the patterns of electrical waves flowing down telephone lines. Information is a logical arrangement of symbols, and those symbols, regardless of their meaning, can be translated into the symbols of mathematics. Building on that insight, Shannon showed that information can be quantified. He coined the term “bit”—indicating a single binary choice: yes or no, on or off, one or zero—as the fundamental unit of information. He realized, as well, that there is a great deal of redundant information—extraneous bits—in human communication. The message “Where are you?” can be boiled down to “whr r u?” and remain understandable to its recipient. Prune away the redundancy, through mathematical analysis, and you can transmit more information more quickly and at a much lower cost.
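
A minimal sketch (ours, not Gleick’s or Shannon’s) makes the quantification concrete. Using empirical character frequencies as a crude stand-in for the source’s true symbol probabilities, the short Python program below estimates how many bits each message carries, and shows why pruning “Where are you?” down to “whr r u?” loses so little.

from collections import Counter
from math import log2

def entropy_bits_per_char(message):
    # Estimate the average information per character, in bits,
    # from the message's own character frequencies.
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

for msg in ("Where are you?", "whr r u?"):
    per_char = entropy_bits_per_char(msg)
    print(f"{msg!r}: {per_char:.2f} bits/char, about {per_char * len(msg):.0f} bits in total")

Under this crude single-character model, the full message works out to roughly 45 bits and the terse one to about 20, even though both convey the same question; the surplus is redundancy. Real compressors built on Shannon’s framework exploit far richer statistical structure than character frequencies alone.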

The impact of Shannon’s insights would be hard to overstate. They enabled phone companies to route more conversations through their wires, dramatically reducing the cost of communication and turning the telephone into a universal appliance. They paved the way for high-speed digital computers, software programming, mass data storage, and the Internet. Compression algorithms derived from Shannon’s work have become essential to modern media; they squeeze the music we listen to, the films we watch, the words we read. When you send a tweet, Google a keyword, or stream a Netflix movie, you are harvesting what Shannon sowed.

But information theory turned out to have applications far removed from communications systems. When, in the early 1950s, James Watson and Francis Crick discovered that genetic information was transmitted through a four-letter code—the nucleotide bases designated A, C, G, and T—biologists and geneticists began to draw on Shannon’s theory to decipher the secrets of life. Physicists, too, started to sense that the matter of the universe may be nothing more than the physical manifestation of information, that the most fundamental particles may be carriers and transmitters of messages. The bit, Gleick reports, could well turn out to be the basic unit of existence. The entire universe may be nothing more than “a cosmic information-processing machine.”
