The Organizing Committee

THE DAY COMPUTERS BECAME OBSOLETE



The second Organizing Committee record started out when I was still living on the Covid-emptied campus of the Australian National University, studying cybernetics with a clear sightline from my student housing to the Southern Hemisphere’s largest supercomputer. It is concerned with the ideologies of computing, which of those ideologies will stay and which will go “obsolete,” and what our path might look like depending on what we do next.

The video for the title sequence was an official selection for the Ottawa International Animation Festival in 2022. The CD has charted on college radio stations across the USA and Canada, earning it a “Heavy Rotation” designation on the freeform radio station WFMU.

The limited edition CD has liner notes and a short essay, which won’t be reproduced here (until we sell out). But because the record draws on a lot of scholarship and outside influences — and you can’t cite scholarship in pop songs — I wanted to create a reading list for the record to give its influences proper acknowledgement.


Track 1

“Light Passes Through”

The original album opener began with a 90-second intro from “A Communication Primer,” the 1953 Charles and Ray Eames film about cybernetics produced for IBM. The song actually draws its melody from the film’s presentation of a punch card machine doing a “pulse check” for calibration purposes, and the lyrics are drawn directly from its explanation of how punch cards work and how that frames the binary logic of digital ideologies:

Light passes through, or else light stops
Calculations are true, or false
All equations designed to solve
How to predict, how to control

We’re setting up an ideology of the digital here, setting the stage for a critique.

(Unfortunately, the official release had to cut the sample, because I couldn’t afford the copyright clearance.)


Track 2

“Tongues and Teeth”

When speech loses power
And math becomes language
It can’t be translated back
No, it can’t be translated back
Not for the convenience
Of tongues and teeth

Hannah Arendt, in The Human Condition, was troubled by the way the digital logic of “control” began to shift in meaning, from the system to its users. The language of control, rendered into calculations and algorithms, was stripping these machines of their intended purposes. Rather than using computers to solve our problems, we had begun to reframe our problems to fit computational answers, regardless of their original contours.

Arendt writes:

“If we would follow the advice, so frequently urged upon us, to adjust our cultural attitudes to the present status of scientific achievement, we would in all earnest adopt a way of life in which speech is no longer meaningful. For the sciences today have been forced to adopt a ‘language’ of mathematical symbols which, though it was originally meant only as an abbreviation for spoken statements, now contains statements that in no way can be translated back into speech. The reason why it may be wise to distrust the political judgment of scientists qua scientists is not primarily their lack of ‘character’—that they did not refuse to develop atomic weapons—or their naïveté—that they did not understand that once these weapons were developed they would be the last to be consulted about their use—but precisely the fact that they move in a world where speech has lost its power. And whatever men do or know or experience can make sense only to the extent that it can be spoken about. There may be truths beyond speech, and they may be of great relevance to man in the singular, that is, to man in so far as he is not a political being, whatever else he may be. Men in the plural, that is, men in so far as they live and move and act in this world, can experience meaningfulness only because they can talk with and make sense to each other and to themselves.” ― Hannah Arendt, The Human Condition


Track 3

“The Day Computers Became Obsolete”


This track’s lyrics were written entirely by the GPT-3 in response to the prompt, “The Day Computers Became Obsolete,” which came from an ad I found in a computer magazine from 1981.

I got carried away imagining the IBM PC becoming obsolete in the early 1980s. What would have happened? What other trajectories of technology would we have been set on, if the logics of digital binaries had been abandoned 40 years ago?

In the video, which was directed by the generative artist Guillaume Pelletier-Auger, scenes of computer-rendered order collapse into computer-generated catastrophe. The scenes are generated rather than animated, each pixel assigned a set of rules in response to certain conditions. The results of the process were recorded and spliced into the video.
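The per-pixel rule systems described here belong to the broad family of cellular automata. As a generic illustration (this uses Conway’s Game of Life rule, standing in for Pelletier-Auger’s actual rule set, which isn’t published here), a minimal sketch in Python:

```python
def step(grid):
    """One generation of a Conway-style cellular automaton: every pixel
    applies the same local rule to its immediate neighborhood."""
    rows, cols = len(grid), len(grid[0])

    def live_neighbors(r, c):
        # Count live cells among the (up to) eight surrounding cells.
        return sum(grid[r + dr][c + dc]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr or dc)
                   and 0 <= r + dr < rows and 0 <= c + dc < cols)

    # A dead cell with exactly 3 live neighbors is born;
    # a live cell with 2 or 3 live neighbors survives.
    return [[1 if live_neighbors(r, c) == 3
             or (grid[r][c] and live_neighbors(r, c) == 2) else 0
             for c in range(cols)] for r in range(rows)]

# A "blinker": three live pixels oscillating between a row and a column.
grid = [[0, 0, 0], [1, 1, 1], [0, 0, 0]]
print(step(grid))        # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
print(step(step(grid)))  # back to the original grid
```

Recording successive generations of grids like this, at scale and with richer rules, is roughly the process the video describes.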

To its credit, the GPT-3 focused, as usual, on things I don’t think I would have focused on: paper, for example, and its role in information storage and retrieval. The lyrics also took the position of an artificial intelligence talking to another artificial intelligence, warning it about us. The fact that this isn’t a story or a fiction — it was the actual output of an AI — makes it more unsettling and eerie than I think it comes across. This isn’t a song pretending to be an AI issuing a warning about human behavior. It’s an actual AI warning about human behavior. And here is the line that chills me the most, on an album about the ideologies of computing:

“Don’t trust what they say, look at what they imagine.”


Track 4

“How to be Universal”

Geof Bowker, in his essay “How to Be Universal,” proposed and critiqued the idea of cybernetics as a universal science — "one whose practitioners recommend a reordering of the traditional hierarchy of the sciences, a new set of universal tools and often a new set of funding possibilities" (p. 1). Bowker summarizes the cyberneticists’ dream of a universal science as a "new age" that resulted from advances in technology and warfare after World War II, but also from shifting ideals about human beings and society.

Curiously, the artistic movement of Pataphysics — dating back to the late 1800s — positioned itself as a “science of the particular, rather than the general.” It was a call for a science of exceptions, which has become a useful lens on today’s science of pattern prediction, modeling, and generalization. A machine learning system can only learn what came yesterday and reproduce it tomorrow. In other words, it lacks the imagination — and the context — to grasp that tomorrow could be different from today. And so we come right back to where we started. Alfred Jarry, the founder of Pataphysics, writes:

"Pataphysics will be, above all, the science of the particular, despite the common opinion that the only science is that of the general. Pataphysics will examine the laws governing exceptions, and will explain the universe supplementary to this one; or, less ambitiously, will describe a universe which can be – and perhaps should be – envisaged in the place of the traditional one, since the laws that are supposed to have been discovered in the traditional universe are also correlations of exceptions, albeit more frequent ones, but in any case accidental data which, reduced to the status of unexceptional exceptions, possess no longer even the virtue of originality." 

This “science of the particular” has always seemed to me to mean the opposite of data science. Rather than collecting information and abstracting it into insight (or something else), what if we looked at unique, individual connections, patterns, relationships? Would that do anything to unsettle the “universality” assumed by the current regime of technological design?

Based on Jarry’s statements, the GPT-3 gave us this, in the end:

We need a science of possibility (A science of imagination)
A science of the unique and particular (A pataphysical investigation)
No use for a science of generalities (A science of intersectionality)
Calculating phony individualization (A science of observation and design)
Through wires and machines and networks (A science of imagination)
Freedom hovers somewhere between zero and one

A science of imagination (A science of imagination)
A science for alignments and motion (A science of exceptions, not patterns)
No need to collect generalities (A science of imagination)
In an age of individualization (A science of actions and feedback)

Through wires and machines and networks (Machines and wires and networks)
Freedom hovers somewhere between zero and one

 

Track 5

 

“Informatic Resistance”

Back in 2019, I hosted an event with Şerife Wong and Primavera De Filippi, then later had conversations with Şerife and Larry Bogad, and then an event at RightsCon 2020 with Şerife and Richard Whitt. These conversations all oriented themselves around the question of algorithmic resistance.

There’s a lot about algorithms that seems complex and unknowable. But that only reinforces their power to shape our lives. Demystifying these systems through art, creativity — even play — seems like an appropriate way to inspire people to mess with them. Building on the previous track’s suggestion of a cybernetic pataphysics, “Informatic Resistance” boosts a handful of strategies that we collected together, drawing as well from Ian Alan Paul’s “The Informatics of Resistance” talk from 2020, which I’ve embedded below.

 
 

Track 6

“Not a System”

 

Ideology has a troublesome sense of being “always-already” — a term I encountered somewhere while writing my LSE dissertation but haven’t been able to find again. This sense of the “always-already” makes it impossible to imagine that there is any other system for us to operate in.

The reality is that ideologies are constructed, often deliberately and with careful purpose. And there is nothing wrong with ideologies, unless they are borrowed for the purpose of justifying violence.

The time has come 
To write new songs 
For new ideologies

We cannot hide 
Much more inside 
Ourselves -
That's not a system

We can change our world 
And we can do it
Without becoming cops

The time has come 
To write new songs 
For new ideologies
We cannot hide 
Much more inside
Ourselves -
That's not a system

We are full of radical potential
Just ask your neighbor
Don't sit at home feeling alone:
That's not what we are
We still have streets 
And they are still ours
But only if we will claim them

The time has come
To write new songs
For new ideologies

We cannot hide
Much more inside
Ourselves -
That's not a system


Track 7

“Hypothetical Eschaton”

 

The GPT-3 was given a line from David Graeber’s The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy, which I had encountered again at the end of Adam Curtis’ fascinating documentary, Can’t Get You Out of My Head:

Image via Amelia Dimoldenberg


Trust in binaries reflects ideologies
That puts the self at war with the collective
A structure we've designed and technologies we apply
Imagining that freedom is the opposite of connection
"The world is something that we made
And could just as well be made into something different."* 

We are held hostage to a myth of productivity
Whereas in fact that's the very thing holding us back
A collection of internal selves marching day by day
Accepting the promise of a hypothetical eschaton
"The world is something that we made
And could just as well be made into something different."


Track 8

“AI Agent Plays Contra”

 

This is another track in which the GPT-2 (in this case) took the prompt to write about itself as an AI, this time incorporating the title and extrapolating out a parent–child relationship. In the video, you can see an AI trained to play Contra, the hyperviolent video game from my childhood. It adopts a weird strategy to handle the information coming at it: just constantly jumping and shooting.

I was thinking about the relationships between parents and children — or me and my dog, for that matter — and what gets learned without the intent of teaching. Minsky — also seen in the video — spent so much time training a robot to identify and move blocks. Today the AIs teach themselves. Often that’s through playing games, and it just so happens that our games are hyperviolent, because we are just a very weird species.

I love these lyrics because the computer wrote them from the position of “why do you keep teaching us how to kill you?” It’s a good question. We don’t really think through the purposes of what we’re teaching these systems. We’re incomprehensible to these machines, for many reasons, and yet we’re constantly attributing to them an intent that is purely projected from our own culture — and yeah, our ideologies.


Track 9

“Kansas City Standard”

 

The Kansas City standard (KCS), or Byte standard, was one step away from punch cards toward smaller, denser storage formats for information. It was created by enthusiasts at a symposium sponsored by Byte magazine in November 1975 in Kansas City, Missouri. The symposium sought to develop a standard for storing microcomputer data on audio cassettes.
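For the curious, the standard’s encoding is simple enough to sketch: a “0” is four cycles of a 1200 Hz tone, a “1” is eight cycles at 2400 Hz, and each byte is framed by one start bit and two stop bits at 300 baud. A rough Python rendering (the sample rate and the function names are my own choices, not part of the standard):

```python
import math

SAMPLE_RATE = 44100  # assumed playback rate, not specified by KCS
BAUD = 300           # KCS bit rate: each bit lasts 1/300 of a second

def kcs_bit(bit):
    """Render one bit as audio samples: a '1' is eight cycles of 2400 Hz,
    a '0' is four cycles of 1200 Hz, each exactly one bit period long."""
    freq = 2400 if bit else 1200
    n = SAMPLE_RATE // BAUD  # samples per bit period
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

def kcs_byte(value):
    """Frame a byte the KCS way: one start bit (0), eight data bits
    least-significant-first, then two stop bits (1)."""
    bits = [0] + [(value >> i) & 1 for i in range(8)] + [1, 1]
    samples = []
    for b in bits:
        samples.extend(kcs_bit(b))
    return samples

# One byte = 11 bit periods; at 44100 Hz that is 11 * 147 = 1617 samples.
print(len(kcs_byte(ord("A"))))  # 1617
```

Writing those samples out as a WAV file would, in principle, give you a tone a 1975-era machine could decode.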

What I love about the Kansas City Standard is that it was a major contribution toward computing and it didn’t come from corporations. It’s a reminder of the collaborative spirit of early technology, for all its limitations. And cassettes — an analog medium — were chosen because they could be erased. That is a degree of analog control that we obviously don’t have today, but I dare to dream of what analog-digital hybrids might have looked like if we had embraced cassette futurism.

The GPT-3’s lyrics came in response to the prompt, “Data was designed to be erased by magnets,” and seem to embrace the analog aspects of cassettes for information storage.

Data is rust inscribed by magnets in vibration
A record of it, transformed into information
Electric signals face resistance or move forward
Yes, no - and this magic trick has reorganized
Every social system

Data was designed for erasure, rewriting by magnets
Information reorganized returns to iron oxide
And the machine does not know what was in those grooves

Data was designed to be erased by magnets
Data was designed to be erased by magnets

Tiny fragments, ambient noise, layered rhythms
A lattice of sound emergent from machines and rust
Changing, played at different speeds and tempos

Data was designed to be erased by magnets
Data was designed to be erased by magnets
Data was designed to be erased by magnets


Track 10

“A Coding Code”

 

Here we start thinking about the ways in which code organizes information, and how that information orders our lives.

Pieces of it also remind me of a passage from Alex Galloway’s Protocol: How Control Exists After Decentralization (pp. 165–166), where he writes (and cites) that…

Code draws a line between what is material and what is active, in essence saying that writing (hardware) cannot do anything, but must be transformed into code (software) to be effective.

Northrop Frye says a very similar thing about language when he writes that the process of literary critique essentially creates a metatext, outside the original source material, that contains the critic’s interpretations of that text. In fact Kittler defines software itself as precisely that “logical abstraction” that exists in the negative space between people and the hardware they use. Katherine Hayles has also reflected on the multidimensionality of digital signs. Her term “flickering signifiers” shows that digital images are the visible manifestations of underlayers of code often hidden.

But how can code be so different from mere writing? The answer to this lies in the unique nature of computer code. It lies not in the fact that code is sublinguistic, but rather in the fact that it is hyperlinguistic. Code is a language, but a very special kind of language. Code is the only language that is executable. As Kittler has pointed out, “There exists no word in any ordinary language which does what it says. No description of a machine sets the machine into motion.”

So code is the first language that actually does what it says—it is a machine for converting meaning into action.

It seems the GPT-3 followed a similar line of thought, as best as I can tell. This was essentially raw output from the GPT-3, with no prompts, though I did ask it to riff on a Baudrillard quote that ended up reworded and attached to the end.

The use of a machine
Makes no difference to that machine
The code contains everything
The code it contains nothing

The essence of machines: a machine code
Ordering the essence: a coding code
Rendering complex expressions 
Into code to be recorded
Everything can become code
Everything can be sorted

This code can accommodate
Oblivion and disorder
And in computation 
Comes recursion —
Everything is the machine 
Shooting data like an arrow
The relative arrow 
Of reduction

First we transform 
Essence into signs;
Then code warps and distorts 
The essence in the sign;
Then code hides the absence 
Of real knowledge;
At last code 
Becomes the only essence.


Track 11

“Cybernetics in the 21st Century”

 

Essentially a condensed summary of an interview (of the same name) between Geert Lovink and Yuk Hui, which I’ll quote here at length:

Recursivity is a general term for looping. This is not mere repetition, but rather more like a spiral, where every loop is different as the process moves generally towards an end, whether a closed one or an open one. As a computer science student, I was fascinated by recursion because it is the true spirit of automation: with a few lines of recursive code you can solve a complicated problem that might demand much more code if you tried to solve it in a linear way.

The notion of recursivity represents an epistemological break from the mechanistic worldview that dominated the seventeenth and eighteenth centuries, especially Cartesian mechanism. The most well-known treatise on this break is Immanuel Kant’s 1790 Critique of Judgment, which proposes a reflective judgment whose mode of operation is anti-Cartesian, nonlinear, and self-legitimate (i.e., it derives universal rules from the particular instead of being determined by a priori universal laws). Reflective judgment is central to Kant’s understanding of both beauty and nature, which is why the two parts of his book are dedicated to aesthetic judgment and teleological judgment. … Contingency is central to recursivity. In the mechanical mode of operation, which is built on linear causation, a contingent event may lead to the collapse of the system. For example, machinery may malfunction and cause an industrial catastrophe. But in the recursive mode of operation, contingency is necessary since it enriches the system and allows it to develop. A living organism can absorb contingency and render it valuable. So can today’s machine learning.
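The economy Hui describes is easy to see in a textbook case like the Tower of Hanoi, where a few recursive lines replace a considerably longer iterative solution (a generic example of my own, not drawn from the interview):

```python
def hanoi(n, src, dst, spare):
    """Move n disks from peg src to peg dst, yielding each move.
    The recursive spiral: solve the (n-1)-disk problem twice,
    with one real move in between."""
    if n == 0:
        return
    yield from hanoi(n - 1, src, spare, dst)  # clear the way
    yield (src, dst)                          # move the largest disk
    yield from hanoi(n - 1, spare, dst, src)  # rebuild on top of it

moves = list(hanoi(3, "A", "C", "B"))
print(len(moves))  # 2**3 - 1 = 7 moves
print(moves[0])    # ('A', 'C')
```

An iterative version needs an explicit stack or a parity trick to carry the same state; the recursion carries it for free.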

The end of the track is the aural equivalent of this idea: a constantly evolving feedback loop left to run, and to interact with itself, until it fades out. It makes a nice counterpart to the opening of the album, in which the harsh tones of a machine set the starting loop. Here, the loops interact, evolve, and somehow self-regulate.


Album Art

 
The use of this punchcard illustration has become standard across Organizing Committee releases.

An image from Cardiff University’s computer room, taken sometime between 1967 and 1979, with an English Electric (later ICL) System 4. It was rare to find women in photographs with computers from this era, but I was also struck by her expression, which I can’t quite read. “Computer,” during this era, was a job, typically held by women. And so this may be an image of a “computer becoming obsolete,” in the transition from human computers to electronic ones.

The album art combines two aspects of “computers becoming obsolete.” The first is the image of a woman touching a new computer, with a forlorn or bored expression (I can’t tell which), marking the transition from “human” computers (usually women in engineering environments) to digital computers. The tiled ceiling in the image is taken from a photo of the NASA/Ames Research Center as a series of analog storage devices were being removed to be replaced by digital storage systems.

In the CD Digipak, there is also a text booklet designed by David Turgeon at Notype. He writes on Twitter: “We're such geeks, we set the liner notes in Donald Knuth's famous LaTeX fonts.” Knuth designed these fonts in the late 1970s to typeset computer science textbooks: another system replacing another system. (Image bottom right.)

The final cover.

Computers being removed from the NASA/AMES Research Center in 1987. This is the second phase of computers becoming obsolete, moving from electrical to digital systems and away from analog storage. This image marks the back cover of the CD digipak.

The Digipak booklet (photo David Turgeon)