Creative Strategies for Algorithmic Resistance!

How machines designed to nudge us can be nudged back.

by Eryk Salvaggio


Notes from my portion of the RightsCon 2020 panel, “Data Dada: creative strategies for algorithmic resistance!” co-presented with Şerife Wong and Richard S. Whitt. Find more on Twitter #RobotRiot.

Usually when artists talk about AI or data art, they talk about things like using datasets to generate images or poems. I’ve done that work, but it isn’t especially useful as a frame for human rights or activism. More interesting is to share the kinds of questions you can ask to peek into the “black box” of how autonomous systems can be steered.

Our lives are heavily influenced by algorithms, whether we know it or not. In the textbook of our times, Data Feminism by Catherine D'Ignazio and Lauren F. Klein, these algorithms are described as extracting data from our bodies for the sake of science, surveillance, or selling. We’ll look at sales and surveillance.

Sales are the systems that collect data about what we do, and use that data to move us toward doing more of it, or toward doing something more profitable. If we buy a book, they move us toward buying a similar book. This process shares detailed portraits of our daily activities with largely unregulated and oftentimes predatory commercial and industrial enterprises, which use it to market to us in ways we find increasingly hard to resist.

Surveillance, which is not always distinct from sales, identifies where we are, what we are doing, and whom we are with, often with the goal of policing and regulating our behavior and our physical or mental presence. These systems will misidentify or over-identify black faces, for example. They may send more police to black neighborhoods, which drives up arrests and then convinces the algorithm that even more cops need to go to that neighborhood, which drives up even more arrests in an escalating feedback loop.

This approach to algorithmic art is focused on understanding how to identify and subvert tools that are misapplied to cause us harm. These strategies can be used to create activities that demystify automated systems, or they can be used as building blocks for interventions that catch the attention of the media and raise awareness of an exploitative system. They can also be used positively, as in finding out ways to make these systems more useful when we think they’ve gone awry.

Subvert or Steer?

I like to imagine that every algorithm has its own momentum: released into the wild, it starts rolling in one direction, picking up speed that keeps it moving forward. When it collects data, it accelerates toward its goal. When it learns more of the movies you’ve watched, for example, it can recommend others with better accuracy.

These decisions, like the momentum of a ball rolling down a mountain, can be steered when some force pushes them forward or pulls them back, or nudges them off course. That’s where the interesting stuff happens.

Purin rolls on ball, shattering barriers and melting hearts. (Photo credit: Guinness World Records)

Riding the momentum of an algorithm is like trying to walk on top of a giant ball that’s already in motion. Here, we can see the Guinness World Record holder for the fastest 10 meters traveled on a ball by a dog: Purin the beagle, who can cover those 10 meters in about 11 seconds.

This balancing act is in some ways the basis of cybernetic systems. Cybernetics comes from the Greek kybernetes, “steersman,” the root of the popular prefix cyber, and the field emerged from working out how to manage steering mechanisms. Early regulators for boats were autonomous systems: when a ship tilted in one direction, a weighted ball shifted, releasing steam, reducing pressure in the engine, and slowing the ship down.

Today we don’t steer a lot of ships, but we can work with algorithms in the same way. We could all move to one side of the boat to slow down the engine. Or we could make the regulator think the ship is tilted when it isn’t. So what Purin is doing is what all of us are doing whenever we try to guide a self-directed system. The ball moves forward on its own momentum, and to steer it, Purin can ride it, subverting that movement to get herself somewhere, or can resist it, pulling it back or pushing it forward by adjusting her posture or gait.
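
To make that feedback loop concrete, here is a toy sketch in Python. It is not any real governor’s code; the numbers and the “momentum” term are invented purely to show the pattern of measuring drift and pushing back against it.

```python
# Toy negative-feedback regulator, in the spirit of a ship's governor.
# All numbers are invented; the point is the pattern: measure the
# drift from a set point, then push back in proportion to it.

TARGET_SPEED = 10.0   # the set point the regulator defends
GAIN = 0.5            # how hard the regulator pushes back

speed = 12.0          # the system starts out moving too fast
for step in range(10):
    error = speed - TARGET_SPEED   # how far off course we are
    speed -= GAIN * error          # the corrective push (venting steam)
    speed += 0.2                   # the system's own forward momentum
    print(f"step {step}: speed = {speed:.2f}")
```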

Within autonomous systems, such as machine learning, that starting momentum is toward doing whatever it was designed to do. Whether you can subvert or steer its movement depends on the system. You have to anticipate what data the system will respond to, and then lean into it, or pull away from it. You create a relationship that gives you back a sliver of control — or at least begins to reveal how those reins might be used.  

Riding the Wave

An algorithmic resistance T-shirt by Twitter user Nirbion. The image was posted with the caption “Would love this on a t-shirt!” so that a bot, built to steal uncredited “cool” artwork from designers on Twitter, would grab it and sell it in an online store. The image did end up being sold on T-shirts.

Twitter users discovered that a T-shirt company had deployed a bot to search for Tweets that said “I would buy that shirt!” Then the bot would steal whatever image the phrase was attached to, automatically slapping it on shirts for an online store. Graphic designers were getting their work stolen, so they started Tweeting “I would buy that shirt!” at absolutely mundane, profane, or deliberately bizarre images that no human would ever curate into their fashion brand.
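
The bot’s actual code was never published, so everything below is a hypothetical reconstruction in Python: the stub functions stand in for the real Twitter search and print-on-demand store APIs. What matters is the shape of the loop, because that shape is what users exploited.

```python
# Hypothetical reconstruction of the T-shirt bot's loop. The real
# code was never published; search_tweets and list_product are stubs
# standing in for the actual Twitter and store APIs.

TRIGGER = "I would buy that shirt"

def search_tweets(phrase):
    """Stub: the real bot polled Twitter search for the trigger phrase."""
    return [{"text": "I would buy that shirt!", "image": "stolen_art.png"}]

def list_product(image, title):
    """Stub: the real bot pushed the image to a print-on-demand store."""
    print(f"Now selling {image} as {title!r}")

for tweet in search_tweets(TRIGGER):
    # No human curation happens here: whatever image rides along with
    # the phrase gets sold. That blind trust is what users leaned into.
    if tweet.get("image"):
        list_product(tweet["image"], title=tweet["text"])
```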

So in our framework, that’s an example of leaning into the algorithm’s momentum. It’s going in a direction, you know what that direction is, and you can jump on top of that ball and make it do something it wasn’t designed to do, though in the end it goes to the same place.  

Let’s look at some examples of artists who have reverse-engineered an algorithm to steer it to creative ends. These projects are really satisfying because they take systems that are widely feared and misunderstood and actually provide us with a sense of literacy and control.

Traffic Jams on the Information Highway

Here’s Simon Weckert, who was able to fake a traffic jam on an empty bridge. 

Let’s look at how this idea came to be through the lens of our balancing puppy.

It required the artist to understand what contributed to the “momentum” of Google Maps: where is that Google Maps ball going? What force drives it forward to that goal? The Google Maps system was pulling data from drivers, measuring their speed of movement, and if that movement was below a certain speed, it reported a traffic jam.

But it doesn’t measure cars. It measures devices connected to Google Maps. The phones are a proxy for cars. But ultimately, it’s the speed that phones were moving that “moved” the algorithm. So you put a bunch of cell phones together, turn on Google Maps, and then walk slowly across a bridge with the phones in a wagon, and you create a virtual traffic jam on an empty street.
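
Google’s actual pipeline is proprietary, but the logic Weckert leaned on can be sketched in a few lines. The threshold and speed readings below are invented; the point is that the system averages device speeds, not car speeds.

```python
# Toy sketch of the congestion inference Weckert exploited. Google's
# real pipeline is proprietary; the threshold and readings are invented.

JAM_THRESHOLD_KMH = 20  # below this average speed, report congestion

def road_status(device_speeds_kmh):
    """Infer traffic from device speeds (devices, not cars)."""
    if not device_speeds_kmh:
        return "no data"
    avg = sum(device_speeds_kmh) / len(device_speeds_kmh)
    return "traffic jam" if avg < JAM_THRESHOLD_KMH else "clear"

print(road_status([]))            # an empty street reads as "no data"
print(road_status([4.0] * 99))    # 99 phones at walking pace read as a jam
```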

The sensor is your oar. Phones often stand in for people, or for vehicles, which is another useful insight for art making. Mistaking people for the sensors that measure them is a surprisingly common error. Figure out how and where the data is being collected, and you figure out how to steer.

The T-shirts simply rode the momentum to an unexpected destination. Weckert’s work, I think, pulls the system back instead: it makes the system respond to something that wasn’t present, in a way that led it to misreport. The mechanism of Google Maps was used to transform the final destination of that data: pulling back against the momentum.

Takeaways

  • Identify what data the system uses.

  • Identify the sensor that collects that data.

Navigating Absence

Photo from the @BLMPrivacyBot Twitter account, which takes protest photos and anonymizes them.

The “data” a system is collecting isn’t always obvious, and that can be a great place for activists, artists and tech folks to collaborate.

Sometimes the absence of data is the problem, and creating space for meaningful data is a creative act. There is room for artwork that collects data for better, more socially constructive ends, or that scrutinizes the data already being collected or used to train a machine.

Caroline Sinders created a project called Feminist Data Set, which I highly recommend if you are interested in further exploring what a dataset for an art or community project can be. It’s a set of critical practices for finding and categorizing data that isn’t easy to surface, which is a different, constructive approach.

Mimi Onuoha has another great project, The Library of Missing Datasets. In that work, Onuoha presents empty dossiers in a filing cabinet, with only their labels, revealing, in her words, “aspects of a data-saturated society that have been excluded.”

When you can work with a community to create data-driven projects, it’s possible to use them to leverage the weights of a system. For example, “Gendering the Smart City” intervenes in Google Maps’ colonial lens on neighborhoods by working with the women who live in those spaces to tell their own stories. Projects such as a Wikipedia edit-a-thon documenting neighborhoods and cities from the perspective of local women get swept up by the data collection algorithms Google uses when it offers information on Maps or Search.

By understanding where Google’s data comes from, and how to gather data for that stream, the project creates a successful algorithmic intervention. It also creates competing knowledge sources: websites with photographs and stories created by locals become a counterweight to commercial sites such as Yelp or popular tourism sites written by outsiders. But primarily, it creates an ecosystem of resources for women, by women, that a major corporate entity could never rival.

The flip side is also true: sometimes removing data is the best way to protect the interests of your community. The BLM Privacy Bot is a facial recognition tool that scans an image and replaces each face it finds with an emoji, anonymizing photographs of protests, specifically in the context of Black Lives Matter protests.
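
As a rough illustration, and not the bot’s actual code, here is a minimal Python sketch using OpenCV’s bundled face detector. It blurs faces instead of pasting emoji over them, the Haar cascade it uses is far cruder than what a production anonymizer would rely on, and the filenames are hypothetical.

```python
# Minimal face-anonymization sketch with OpenCV. This is a stand-in,
# not the BLM Privacy Bot's code: it blurs detected faces rather than
# covering them with an emoji, using OpenCV's bundled Haar cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def anonymize(path_in, path_out):
    img = cv2.imread(path_in)
    if img is None:
        raise FileNotFoundError(path_in)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Blur each face region hard enough that it can't be recovered.
        img[y:y+h, x:x+w] = cv2.GaussianBlur(img[y:y+h, x:x+w], (51, 51), 30)
    cv2.imwrite(path_out, img)

anonymize("protest.jpg", "protest_anonymized.jpg")  # hypothetical filenames
```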

These works can be seen as building the ball outright, rather than steering. There’s no momentum at work, but this creates new systems that can start moving according to our own needs.

Takeaways

  • Sometimes the system’s weakness is what it doesn’t see, and there is power in showing it.

  • But think critically about what is missing and why, and about when excluding data from the system can become a way to steer it.

Hexing Autonomous Vehicles

Here’s another example, from artist James Bridle: Autonomous Trap 001, a piece made for an autonomous vehicle.

The artist came to understand how the vehicle interpreted data to guide its decisions. In this case, the data moving the system forward was street paint: dotted lines on the road mean “you may pass on this side,” whereas a solid line means “do not pass from this side.” The sensors were pulling visual feedback into a computer vision system, which translated those patterns into commands. So the lines on the road between lanes are what “pulls” the system toward an outcome.
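
A real driving stack is vastly more complicated, but the rule the paint exploits reduces to something like the sketch below. Every name in it is invented for illustration: classify the marking, then map the class to a crossing rule.

```python
# Toy sketch of the rule Bridle's piece hijacks: a vision system
# classifies a lane marking, then maps that class to a crossing rule.
# All names here are invented; real stacks are vastly more complex.

def classify_marking(segment_lengths, gap_lengths):
    """Dashed paint shows up as short segments with gaps; solid as one run."""
    return "dashed" if gap_lengths else "solid"

def may_cross(marking):
    # The car obeys the paint, not the road. That obedience is the handle:
    # a solid circle around the car means "do not cross" in every direction.
    return marking == "dashed"

line = classify_marking(segment_lengths=[120.0], gap_lengths=[])
print(line, may_cross(line))  # solid False: the vehicle traps itself
```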

If you have just one direction to move, like the T-shirts or Google Maps, you ride the ball where it wants to go. If you can create something that a system will respond to, you can start using that to manipulate the momentum of the system. I think of this as pulling left or right against the direction of the momentum of the rolling ball: creating an unexpected interface through which we can steer.

Takeaways

  • Identify how a system interprets, or reads, the data that comes in.

  • Think about ways to modify that data so that it can act differently.

Stage Against the Machine

We want to cultivate the exercise of positive freedom – freedom-to, rather than simply freedom-from – and urge feminists to equip themselves with the skills to redeploy existing technologies and invent novel cognitive and material tools in the service of common ends.
— The Xenofeminist Manifesto

So, we just looked at a few case studies of algorithmic resistance. You can consider sales and surveillance algorithms to be a new kind of audience. By performing for them in deliberately structured ways, we can learn to steer them.

This is where the Dada (finally) comes in. As an art movement, Dada was about breaking with the past — and the art “market” — by embracing the irrational and absurd: that is, things that could not be sold. It’s an interesting framework for thinking about corporate and surveillance algorithms, which are all about stabilizing and extending the past as far forward as they can. The Situationists are also useful here: their goal was to create novel situations that disrupted patterns of familiarity and inattention, seeking the so-called “revolution of everyday life.”

Hugo Ball

The Dadaists had a standard idea of an audience: most of their work happened on a stage in Zurich. The Situationists were the ones who took it to the streets. By thinking about algorithms as an audience, the way the Dadaists and Situationists thought about theirs, we can think about how to disrupt the rapt attention of unwelcome surveillance and sales algorithms. If we think about the “society of the spectacle,” it’s clear that our personal lives have become the spectacle that companies and governments are eager to observe.

Dada sought to create what hadn’t been seen before, on stage at the Cabaret Voltaire in Zurich. The Situationists aimed for the jolt of unexpected encounters in the streets of Paris. Today’s Data Dadaists would create new performances for a global audience of ever-observing sensors, refusing “high art,” or what we might now call “high data,” gathered for the sake of sales and surveillance, and replacing it with the absurd and unsellable.

For us, that means creating nonsensical or deliberately counter-productive data, consciously taking our bodies and decisions out of the observation cycle by replacing them with a facade intended to confuse those viewers. The Situationists talked about détournement, the hijacking of popular symbols toward new uses. Today’s systems of media ask us to hijack personal symbols: data, in these systems, are tiny abstractions made from our everyday lives. We do not have to hijack our lives, but we should think about how to hijack the mechanisms that collect and control pieces of them.

But this is only a short-term solution. The designers of these machines will learn and adjust, just as audiences have for decades. The avant-garde eventually becomes banal. Hugo Ball dressed in a cardboard suit in Zurich in 1916 is today a cliché. The “Make Love, Not War!” slogans spray-painted by the Situationists are now familiar to anyone who has been inside a bathroom stall at a liberal arts college. The irrationality of Dada and the political provocation of the Situationists both embodied a spirit that can challenge and confuse systems built on the expectation of stability. But disruptive forces must be perpetually rediscovered. Resistance is not a permanent fix for structural problems: these actions must always be paired with the goal of moving toward beneficial systems that reinforce justice and freedom.

Six Questions for Algorithmic Resistance

  • What autonomous system would you like to steer? 

  • What “weight” pulls that algorithm toward the goals of its creators? 

  • Where does information come into the system — what are the sensors, inputs, and how does it interpret them?

  • How can you push or pull those weights (data)? What data can you add or subtract?

  • Where can you reasonably steer that system? 

    • (Remember: Purin could only go 10 meters on top of a ball, and she is the world champion). 

  • Who can come with you, or who else can steer?

So, I encourage you to think about an algorithm that you want to steer. Then identify what momentum pushes that system forward. Then think about how you could apply what we discussed today to get the model to go where you want it to go. Then think of your destination — this comes last, because you can’t get to a mountain on a boat or go SCUBA diving in an airplane. The sad truth is that you don’t take ownership of the algorithm without rewriting the code. But with patient observation, we can figure out how to nudge the nudgers back.


Wrap Notes

As RightsCon was global this year, I was able to hear questions from an international audience, specifically from Latin America. It is worth noting that the complex autonomous system is a case of William Gibson’s idea that the future is already here, just not evenly distributed. From the handful of conversations after the talk, it seemed that establishing grassroots, community-centered practices for data protection and stewardship is the ripest area for intervention wherever data surveillance and sales systems are not yet implemented at scale.


Immense thanks to Şerife Wong, Larry Bogad, Primavera De Filippi, Richard Whitt, and the entire 3A Institute for inspiration, feedback, and/or ideas on this post, and to RightsCon 2020 for the space to share.


Find me on Twitter: @e_Salvaggio.