Note: Words in bold are defined in the Glossary below.
At the entrance to the Albright-Knox Art Gallery’s exhibition Difference Machines: Technology and Identity in Contemporary Art, visitors see four large color photographs by the artist Zach Blas. Each shows the head of a different person who looks towards the camera as if they are posing for a mugshot, but their faces are hidden behind opaque, futuristic-looking plastic shells. Like a superhero’s mask, these digitally designed sculptures confer a special power: they make their wearers undetectable by the facial recognition programs being used to monitor and police marginalized communities.
To the right of these photographs hangs an enormous banner comprising almost 32,000 individual photographs of mundane details from someone’s life: a plate of food, a receipt, a view from a window. Gathered at this scale, they suggest the enormity of the databases that keep track of every aspect of our lives—with and without our consent. Notably, the artist Hasan Elahi has been sharing these photographs online ever since the FBI wrongly suspected him of being a terrorist and spent six months investigating him in the wake of 9/11—years before the advent of Instagram.
Beyond this banner is an installation of fifty cutouts suspended like a children’s mobile. Each has a life-sized, pixelated photograph of a woman’s body part on one side and a tropical-print fabric on the other. Many of the women are tanned or dark-skinned and wear bikinis, typifying the “exotic” images one might find when searching for women from the Caribbean on the internet. By fragmenting their bodies, the artist Joiri Minaya points to the role algorithms play in perpetuating stereotypes and producing partial images of reality.
The last work on the north side of the building is by the artist Keith Piper, who presents an explicitly dystopian view of technology’s role in identifying and tracking social groups. Four television screens show a Black man’s head rotating in space while a target continuously follows his face. Behind him are matrices of digital numbers, samples of other bodies, and close-ups of maps and newspapers. Pairs of words are overlaid on top of him: “subject object,” “culture ethnicity,” “other boundaries,” “visible difference.” The audio includes snippets from news stories about data protection, civil rights, immigration, and hate crimes, all interspersed with the sound of sirens wailing and a relentless synth beat.
These four works are a fitting introduction to the first major museum exhibition focused on the increasingly complex relationship between the technologies we use and the identities we inhabit. Together, they suggest that technology now plays a major role in capturing and shaping our identities. More than just tools, these technologies are redefining how we see ourselves, and each other. The earliest computers were called “difference engines,” as they were used to calculate the differences between numbers. Today, we are surrounded by “difference machines,” or computers that are used to encode the differences between us.
Difference Machines: Technology and Identity in Contemporary Art includes seventeen of the most important artists and collectives who have explored the aesthetic and social potential of emerging technologies, and especially the relationship between digital technologies and our collective identities. Whether they identify as Black, Latinx, Middle Eastern, South Asian, East Asian, Indigenous, queer, or trans, each of them views technology through their own unique lived experiences and artistic perspectives. The exhibition includes projects that span the last three decades, ranging from software-based and internet art to animated videos, bioart experiments, digital games, and 3-D printed sculptures. Many of them explore how digital systems contribute to the exclusion, erasure, and exploitation of marginalized communities. Others emphasize how digital tools can be repurposed to tell more inclusive stories or imagine new ways of being. Dynamic and interactive, these projects transform the space of the museum into a laboratory for experimenting with our increasingly powerful “difference machines.”
“We need to learn how the tools that make up our lives are built and managed. Otherwise we will be the ones who are getting programmed.”
While Difference Machines highlights contemporary artworks that explore the intersection of technology and identity, this introduction aims to sketch their theoretical and historical contexts. The first section discusses how digital tools both empower marginalized communities and amplify systemic inequities. The second section proposes a history of digital art focusing on artists who have explored the concept of difference, and particularly the role that difference plays in our digital systems and on the internet. The essay concludes with an overview of the major themes explored by the artworks in the exhibition.
(De)Coding Bias: Technology and the Production of Difference
In a sense, computers have always been tied to how we define our identities. In 1821, Charles Babbage designed his first difference engine, a mechanical calculator that was used to generate large tables of numerical data, and which anticipated his analytical engine—the first general-purpose, programmable computer. By the turn of the twentieth century, some of the largest analog databases in the world were devoted not to mathematical equations, but to census and life insurance records. Aggregating this data allowed governments and corporations to produce statistical information about social groups. When computers first became commercially available in the 1950s, these records were digitized, laying the groundwork for what we call Big Data today. Thus, modern computers were shaped not only by military research conducted during World War II and the Cold War, but also by the peacetime need to quantify and manage our individual and collective identities.
“We’re a different kind of organism with our mobile phone than before we had them, and we have different kinds of sociality, we have different kinds of memory, we extend ourselves in different ways.”
—Graham Harwood of Mongrel
The miniaturization of computers in the 1970s, the popularization of personal computing in the 1980s, and the emergence of the World Wide Web in the 1990s transformed computers from an elite research tool into a consumer appliance and an increasingly ubiquitous part of our everyday lives. Popular culture soon began describing cyberspace as a kind of parallel world that allows us to escape our bodies and assume any identity. As early as 1993, a now-legendary New Yorker cartoon depicted one dog explaining to another, “On the internet, nobody knows you’re a dog,” pointing out the appeal—and absurdity—of what has been described as identity tourism. One early book on avatars even went so far as to argue that on the internet, identity is irrelevant: “one of the best features about life in digital space is that your skin color, race, sex, size, religion, or age does not matter,” it claimed.
But thinking that the internet allows us to change our identity ignores the nuanced social dynamics of cultural difference, as Jennifer González has argued. For example, pretending to be another race online necessarily depends upon and reinforces stereotypes about how a given race might talk or act. More profoundly, identity tourism suggests that identity is merely a superficial “skin” that can be changed as easily as we change clothes, rather than a fundamental part of who we are. Our identities may not be ingrained in our DNA, but they are deeply rooted in the way we interact with the world and experience power and privilege. Furthermore, while it is possible to be anonymous on the internet, the idea that we could totally escape our identities was always fantasy. As Lisa Nakamura noted in her groundbreaking 2002 book Cybertypes: Race, Ethnicity, and Identity on the Internet, even the language of early internet users—who could only represent themselves by using words, not images—reflected their ideas about race and gender. In other words, the way that we use the internet can reinforce as well as resist racial and gendered hierarchies, as Jessie Daniels has argued.
The social media platforms that allow us to express ourselves and find community online have further blurred the distinction between online and offline life. Facebook was the first to require that we use our real names instead of anonymous handles to create profiles, which became the basis of online “social networks” that mirror our pre-existing personal and professional networks. We now constantly and willingly upload all sorts of identifying information, from our gender, age, and sexuality to our family genealogies, medical histories, and even our GPS locations. This data is used to create digital doubles that reflect every dimension of who we are, including our personal preferences and daily habits. Social media sites, news organizations, and streaming services promote specific content to us (and other people like us) based on these doubles, which reinforces the boundaries of identities and subcultures. When users who are all experiencing similar content interact, they create “corners” on the web that can even become their own communities, such as “Black Twitter” or “lesbian TikTok.” In extreme cases, the algorithms that govern these interactions produce filter bubbles, creating progressively narrower echo chambers that can lead to the cultivation of increasingly extreme viewpoints. The recent increase in hate groups, for example, is partly due to how such algorithms facilitate the sharing of white supremacist beliefs and conspiracy theories.
“What happens when humans get reduced to a small bit of data? It might be our race. How does race get codified? How does gender get codified? How does a body enter a database?”
Over the past decades, scholars in various disciplines have explored the evolving relationship between technology and identity. Some chronicle how diverse communities use digital tools. Examples of this work include the essays in the anthology The Intersectional Internet: Race, Sex, Class, and Culture Online; the books by Charlton D. McIlwain and André Brock Jr. on Black networks; Jennifer Gómez Menjívar and Gloria Elizabeth Chacón’s volume on new technologies in Indigenous communities; and Bonnie Ruberg’s book on how queer people experience and design videogames. Other scholars explore the metaphorical and literal intersections of our concepts of identity and technology. Lisa Nakamura examines what she calls “digital racial formation,” or the formation of racial categories through visual digital technologies, while Legacy Russell proposes that the internet can be a space in which we glitch gender codes and Shaka McGlotten connects Black queer life to the mining of data. Some argue that digital tools are themselves “encoded” with ideas about identity. Tara McPherson contends that the principles behind UNIX—an important operating system created in the late 1960s during the civil rights era—“hardwired an emerging system of covert racism into our mainframes and our minds.” Similarly, Jacob Gaboury argues that computational media are inherently heteronormative in their reliance on binary logic and the act of identification. Inspired in part by McPherson, Kara Keeling proposes the idea of a “Queer OS,” or Queer Operating System: a framework for understanding how our media and information technologies shape—and also are shaped by—our identities. Scholars such as Wendy Hui Kyong Chun and Beth Coleman even propose that race itself is a technology that was “coded” during the Enlightenment.
Concurrently, activists fighting for social justice have turned digital technologies into an important part of their practice. Whereas traditional print and broadcast media are one-way communication streams controlled by people with various forms of privilege, digital media require comparatively fewer resources to operate, allowing even marginalized peoples the opportunity to promote their own narratives and build communities (if they can secure access). Famously, in 1994, the Zapatistas—a pro-democracy guerrilla group based in Chiapas, Mexico—coordinated between Indigenous and campesino groups in Chiapas and elsewhere via messages shared on internet listservs and forums. Today, hashtags on social media, such as #BlackLivesMatter, #TransDayofVisibility, #UsaTuVoz, and #CriptheVote, make identities visible and amplify demands for representation and social or political action.
Marginalized communities also are active in designing technologies according to their own specific needs. For example, disabled people have a long history of hacking or inventing their own technologies, from mobility aids to health-tracking programs, producing what is now referred to by some as CripTech. These projects are a counterpoint to the commercial devices that are designed by able-bodied people for disabled people, many of which are what the activist and designer Liz Jackson calls disability dongles: elegant and typically expensive technologies that fail to address the most important challenges that disabled people face.
While they can be empowering, digital tools also play a role in perpetuating—and even exacerbating—systemic forms of oppression. At the most basic level, many marginalized communities do not have equal access to computers (a situation referred to as the digital divide), which can limit academic and professional opportunities, especially with the recent explosion of remote schooling and working from home. These same communities also may not have access to high-speed internet: because internet access is not considered a utility (like water or electricity), internet service providers are allowed to choose where they want to invest in their infrastructure. As a result, they often do not invest in places that may be less profitable for them, like municipal housing developments or Native American reservations. This practice is known as digital redlining, as it echoes how banks and other businesses can offer inferior services to underprivileged neighborhoods. Because of the digital divide and digital redlining, both the perils and the promises of the internet are not evenly distributed.
In recent years, the way that databases and algorithms themselves contribute to systemic inequity has been met with greater public scrutiny, thanks in part to exposés such as the 2020 documentary Coded Bias. Increasingly, these tools influence decisions that impact our entire lives, from the grades we receive to the medical care we are provided and the jobs we are offered. Even our justice system has begun using algorithms to predict crimes (known as predictive policing) and determine sentences based on the calculated likelihood that defendants will commit a crime again (known as algorithmic risk assessment). These algorithms are notoriously discriminatory, giving rise to what Ruha Benjamin calls “the New Jim Code.” For example, in the spring of 2019, research in the American Criminal Law Review revealed that the most popular risk assessment program discriminates against Hispanic defendants, and in January 2020, an innocent Black man named Robert Julian-Borchak Williams was arrested because a facial recognition system wrongly matched him to surveillance footage of a robbery suspect.
“There’s a strange back layer to technology that’s often obscured by the terms and conditions, by those who moderate it, and by the code that is written, which seeks to do similar things that institutions do, such as erasing certain voices, deleting certain accounts, and creating a narrative around certain events.”
The problem with such digital tools is not just how they are used; it is also how they are designed. In her 2018 book Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble coined the term algorithmic oppression to describe how software programs contribute to systemic inequities—even in their very coding. “While we often think of terms such as ‘big data’ and ‘algorithms’ as being benign, neutral, or objective, they are anything but. The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy,” she writes. This is a problem because algorithms and databases are only as good as the information that is fed to them. For example, engineers “train” facial recognition algorithms using databases of faces, but if they fail to use diverse databases of images, the algorithms used by law enforcement to identify suspects (to cite one example) are more likely to produce false identifications of non-white subjects. This is why organizations such as the Algorithmic Justice League, Data for Black Lives, and Data & Society are increasing awareness about the uses and abuses of digital technologies and demanding greater transparency, accountability, and regulation in the technology sector.
“Saboteurs of Big Daddy Mainframe”: Difference in Digital Art
Today’s conversations about the intersection of technology and identity resonate with the history of digital art, which has always been tied to the social uses of digital tools. The development of Western art has long depended on artists embracing new tools, from Albrecht Dürer’s drawing grids to Vincent van Gogh’s use of commercial paint tubes. In the 1960s, researchers at companies such as Bell Labs and Siemens realized they could use software programs to produce abstract and figurative patterns, pioneering what was then known as computer art. But by then, “art” and “technology” seemed to be opposed terms, reflecting a division between the “two cultures” of the humanities and the sciences. The technological optimism of the Space Age was met by a technophobic backlash driven by the moral horror of nuclear warfare, the economic consequences of automation, the environmental costs of resource extraction, and the existential threat of artificial intelligence. As an integral part of the so-called military-industrial complex that had been developing since the 1950s, technology seemed incompatible with the humanistic ideals espoused by many in the art world. The sculptor Richard Serra famously wrote in 1970, “Technology is what we do to the Black Panthers and the Vietnamese under the guise of advancement in a materialistic theology.”
From the 1970s into the 1980s, artists working with computers and other forms of technology were widely suspected of being complicit with—or at the very least, insufficiently critical of—a newly technocratic society. It is true that many artists working with these tools focused primarily on developing new aesthetic forms, such as interactive texts, computer-generated paintings, and virtual environments. But many also addressed the role of technology in shaping identity and community in a globalized world. Some of the most important early video artists, for example, were women who used newly accessible video recording and editing techniques to critique the sexism of mass media film and television. Most famously, Dara Birnbaum’s Technology/Transformation: Wonder Woman, 1978–9, deconstructs the television show’s supposedly feminist representation of a powerful (and even “technological”) woman. Other artists who were early adopters of technology explored its capacity to bring communities and cultures together. Nam June Paik and John Godfrey’s iconic video Global Groove, 1973, celebrates the idea of using telecommunications systems to create global creative networks, as seen in its mash-up of performances by Koreans and Americans.
“When nearly everything today—our society, economy, military, environment, relationships—is mediated by digital technologies, it is inevitable that artists work with them to better understand ourselves.”
In the 1990s—in the wake of the culture wars and against the backdrop of the emergence of identity politics—a new generation of artists began using digital technologies to highlight the complexities of systemic bias, marginalization, and oppression. In 1993, the group (RT)mark, operating as the Barbie Liberation Organization, switched the voice boxes in some of Mattel’s Barbie and G.I. Joe dolls so that the former uttered phrases like “vengeance is mine,” while the latter excitedly asked if we would like “to go shopping.” The group secretly returned the dolls to store shelves so that unsuspecting families would be forced to consider how we teach our children to perform gender stereotypes. Ken Gonzales-Day’s digitally composited photographic series The Bone Grass Boy: The Secret Banks of the Conejos River, c. 1996, features the fictitious Indigenous and Latina two-spirit character Ramoncita, whose body queers and decolonizes our cultural imagination of the nineteenth-century American frontier. The multi-ethnic British group Mongrel similarly used digital editing to produce photographs that questioned national myths about race as part of their project National Heritage, 1997–99, swapping the skins of faces using visible sutures. Produced by Graham Harwood (one of Mongrel’s members) in close collaboration with the staff and patients of a high-security psychiatric hospital, the interactive CD-ROM and installation Rehearsal of Memory, 1996, allows users to “rehearse” the traumatic memories of people confined to the asylum. Tamiko Thiel and Zara Houshmand’s Beyond Manzanar, 2000, resurrects a different kind of traumatic memory, reimagining the first internment camp built for Japanese Americans during World War II as a virtual landscape. As a public art project, Ricardo Miranda Zúñiga’s Vagamundo, 2002, presented the specific plight of undocumented Mexicans in New York as a videogame that can be played either online or on a mobile cart.
As the web emerged, it allowed artists to not only make art, but also raise the visibility of their communities. As early as 1991, the Australian collective VNS Matrix authored A Cyberfeminist Manifesto for the 21st Century, which they circulated in the form of fax messages, emails, posters, and billboards. Using suggestive language that parodied what is now called techbro culture, they declared themselves “saboteurs of big daddy mainframe,” protesting the default association of digital technology with (white) men and advancing the idea of a feminist technoculture. In 1996, the Kanien’kehá:ka (Mohawk) artist Skawennati founded CyberPowWow, a website and series of chat rooms devoted to contemporary Indigenous art that subsequently evolved into AbTeC (Aboriginal Territories in Cyberspace), a network for promoting the representation of Indigenous people in virtual worlds such as Second Life. From 1999 to 2004, the Uruguayan artist Brian Mackern maintained a “netart latino database” (itself an artwork) to highlight net art projects by Latinx artists. The landing page displayed Mackern’s ASCII interpretation of América Invertida, a 1943 drawing by Joaquín Torres García that inverts North and South America, making the latter the privileged term.
“When I started back in the day, the internet was going to be this amazing information highway that was going to level the playing field and make everybody’s lives better. But what it’s turned out to be is a place where it’s very difficult to find the truth, where our privacy is threatened, and where our interaction with it becomes commodified. . . I’m very hopeful that we humans will use technology in a good way, in a way that helps people, which is how I think it was meant to be used.”
Many artists belonging to this first generation of net art targeted the idea that we leave behind our bodies and identities when we go online. In response to the increasing popularity of virtual avatars, Victoria Vesna’s 1996 website Bodies© Incorporated allowed visitors to assemble their own virtual bodies from body parts that recognizably belonged to different genders, races, and ages—but only after signing away the rights to their avatars. The work suggested that even our virtual identities are tied to bodies that belong to cultural value systems, including legal ones. Roshini Kempadoo’s website Sweetness and Light, commissioned for a 1996 project about technology and colonization called La Finca/The Homestead, explicitly connected the rush to colonize the new “territories” of cyberspace to the history of European colonization. In both, maximizing profits depends on the maintenance of racial hierarchies and power asymmetries. Shu Lea Cheang’s Brandon, 1998–99, was a complex interactive project combining texts, images, virtual and real spaces, and performances displayed across various interfaces. In response to the murder of trans man Brandon Teena and a virtual assault that took place in a chat room (both in 1993), Brandon presented gender as a social code that programs our experience of violence and power both offline and online. Guillermo Gómez-Peña and his collaborator Roberto Sifuentes made a similar claim about race and ethnicity in a website they launched around 1994 that was based on their live performance-installations called Temple of Confessions. The site invited anonymous internet users to “Confess Your Intercultural Cyber-Sins,” including “your fears, desires, fantasies and mythologies regarding the Latino and the indigenous other,” suggesting that we don’t leave our prejudices behind when we go online—and that the internet might actually amplify them.
These many artworks demonstrate that artists—including many diverse artists—who work with digital technologies have long considered the complex relationship between technology and difference. Unfortunately, as is true across contemporary art, these artists often have not been valued beyond their communities, thanks to the systemic biases of institutions including art museums and galleries. In 1999, the art historian María Fernández wrote about the distinct ways that people associated with technology and with art tend to understand identity, focusing on the artists who bridge that gap, including Keith Piper and Rafael Lozano-Hemmer (both artists in Difference Machines), as well as Difference Machines co-curator Paul Vanouse. The highly specific history of digital art outlined above builds on her work, as well as on texts by Jennifer Chan, Kimberly Drew, Ben Valentine, Aria Dean, and Lila Pagola, among others. This account also draws on a few pioneering exhibitions that are rarely mentioned alongside more general surveys of art and technology from the dot-com era. These include the aforementioned The Homestead/La Finca, which was organized by Paul Hertz in 1996 and presented again under the title Colonial Ventures in Cyberspace in 1997, and Race in Digital Space, organized by Erika Dalya Muhammad at the MIT List Visual Arts Center in 2001. In the introduction to his show, Hertz noted, “Only a fraction of the world’s people have a presence in cyberspace: the rest are outsiders. Will the outsiders eventually participate? Will borders and differences persist in cyberspace? Who decides these issues?” Almost twenty-five years later, cyberspace’s relationship to difference has only become more complex. With so much at stake, it is all the more urgent to rewrite the decades-long history of contemporary art that deals with technology and identity for the present.
Difference Machines: Technology and Identity in Contemporary Art
In his “First Law of Technology,” Melvin Kranzberg stated: “Technology is neither good nor bad; nor is it neutral.” These words were written in 1986; today, they perfectly describe the digital technologies that shape how we understand and experience our differences. As digital tools continue infiltrating every aspect of our lives in ways that are both obvious and obscure, Difference Machines: Technology and Identity in Contemporary Art invites us to pause and consider: How does technology shape our identities? More specifically: How does technology shape the way we understand the differences between us, including our race, ethnicity, gender, sexual orientation, and dis/ability? And how does technology contribute to—or allow us to resist—the systemic marginalization and oppression of people with certain identities? Art in particular can help us answer these questions by presenting technology and identity in a new light and creating space for them to be imagined differently. The works on view in this exhibition are particularly relevant to our historical moment, as we decide what role we want technology to play in our lives and in our communities while we strive to build a more equitable future.
Several works in the exhibition foreground the idea that identity categories based on physical attributes are not “natural,” but rather, are shaped by society—including by our technologies. One of the earliest works in the show is Mongrel’s Heritage Gold, 1997, a hack of Photoshop 1.0 that transforms identities into filters that can be applied at will, anticipating the widespread use of face filters today. A.M. Darke’s ‘Ye or Nay?, 2020, is an adaptation of the game “Guess Who?,” with one significant twist: all of the figures are Black male celebrities. In producing verbal descriptions of each man, the players perform the same acts of categorization that are central to the construction of collective identities, while also producing the kind of metadata that are tracked by Big Data. Rian Hammond’s Root Picker, 2021, exposes gender as being itself a kind of “code” that has been “programmed” by the biological sciences, the pharmaceutical industry, and colonialism. Joiri Minaya’s #dominicanwomengooglesearch, 2016, reveals that while the eroticization and exoticization of Latinx women has a long history, the internet is now amplifying these stereotypes.
Many artworks are particularly focused on the increasing use of digital technologies to perform surveillance, which can have a more negative impact on marginalized communities. In his 1992 work Surveillance: Tagging the Other, Keith Piper sounds the alarm about the use of digital surveillance to tag certain identities, and especially the United Kingdom’s Black subjects, as being “Other,” extending the long history of the surveillance and classification of Black people. Hasan Elahi’s Thousand Little Brothers, 2014, includes photos from the artist’s ongoing digital self-surveillance, which he began after being wrongly interrogated by the FBI following 9/11—an event that contributed to a massive erosion of the right to privacy, especially for immigrants, Muslims, and people of color. Morehshin Allahyari’s Material Speculation: ISIS, 2015–16, and South Ivan Heads, 2017, protest both the destruction of Middle Eastern cultural heritage by ISIS and the colonialist “capture” of that heritage by Western companies that transform it into their intellectual property. Zach Blas produced the masks in Facial Weaponization Suite, 2012–14, in conjunction with a series of workshops he organized about the dangers of biometric surveillance for women, gay men, and Black, Latinx, and Indigenous people, “weaponizing” surveillance against itself while suggesting that collective identities, including the ones constructed by technology, might themselves be “masks” that hide our individuality.
“I think…my approach to technology and ways of using technological tools has always been from this more conceptual, poetic, philosophical space, rather than being obsessed by technology for technology’s sake.”
While being visible within digital systems can make marginalized people vulnerable to surveillance, not being visible enough can become its own problem, too. Another recurring topic in the exhibition is the erasure of marginalized communities through digital technologies—whether by accident or by design. Mendi + Keith Obadike’s early net art projects Blackness for Sale, 2001, and The Interaction of Coloreds, 2002/2018, deploy satire to highlight the literal and metaphorical whiteness of our digital tools. In Level of Confidence, 2015, Rafael Lozano-Hemmer uses facial recognition technology to perform a futile search for the forty-three students who were kidnapped in Iguala, Mexico in 2014. By redirecting this form of surveillance, he underscores that technology is more often used to erode civil rights than address the needs of victims of injustice. For Insufficient Memory, 2020, Sean Fader used his own photographs and texts to create an interactive database of the murder victims of LGBTQ+ hate crimes, highlighting the “insufficiency” of the stories we use data to tell. Danielle Brathwaite-Shirley’s WE ARE HERE BECAUSE OF THOSE THAT ARE NOT, 2020, uses game mechanics to teach the importance of archiving the lives and experiences of Black trans women. Both Fader’s and Brathwaite-Shirley’s projects acknowledge that being visible can be traumatic and even dangerous for LGBTQ+ people, while also insisting that being excluded from digital systems is its own kind of violence.
Given the relationship between technology and inequity, many of the artists in the show emphasize the importance of reasserting control over the digital technologies that shape our identities. Skawennati’s video She Falls for Ages, 2017, occupies the platform Second Life to retell the traditional Haudenosaunee (Iroquois) creation story as a science fiction narrative. This prompts us to imagine Indigenous people as belonging to the future and not just the past, exemplifying what Anishinaabe author Grace Dillon calls Indigenous Futurism. Sondra Perry’s IT’S IN THE GAME…, 2018, which is based on the economic exploitation of her brother’s likeness by an NCAA Basketball videogame, attempts to reclaim his identity and agency through both technology and art. By questioning a Black female robot named Bina48 about topics such as racism, Stephanie Dinkins points out that the development of artificial intelligence raises profound questions about what it means to be human. Because the answers are necessarily influenced by racism, sexism, ageism, and other biases, AI must be shaped by input from diverse perspectives. Lior Zalmanson’s Excess Ability, 2014, uses Google’s error-prone auto-transcription software on the video of a publicity event in which the company claimed the technology would increase accessibility for the Deaf and hard of hearing. By highlighting the software’s flaws, the video critiques what Meredith Broussard describes as technochauvinism, and suggests the importance of creating technology outside of the limits of Silicon Valley.
“We get to explore, tinker, have fun with, and use things in ways they’re not necessarily made to be used, which means we have the capacity for discovery and ingenuity”
With her new video installation Landscape of Anticipation 2.0, 2021, Saya Woolfalk proposes a radical future for technology—and also for identity. The sci-fi figures we meet in this work belong to the artist’s imagined population of chimerical Empathics. These humanoids appear to defy categorization and embrace the idea of hybridity; one might say that they empathize so much with other organisms like plants and animals that they become them. In the tradition of Afrofuturism, Woolfalk’s work, like Skawennati’s, helps us see racialized bodies as belonging to the future (not just the past), and as agents (not just the subjects) of technology. In this utopian world, differences between bodies continue to exist, but are not articulated into rigid, easily identifiable categories. Woolfalk’s work therefore allows us to imagine difference without oppression—or amnesia. Art has always helped us imagine possible futures. What landscape could we anticipate more eagerly than this?
The quotations by the artists in Difference Machines that appear throughout this essay are excerpted from interviews conducted on the occasion of this exhibition, which may be viewed in the exhibition and on the Albright-Knox website.
 Other contemporary artists who have produced work relating to this topic include Sophia al-Maria, American Artist, micha cárdenas, Andrea Crespo, Caitlin Cherry, Aria Dean, Heather Dewey-Hagborg, Carla Gannis, Claudia Hart, Matthew Angelo Harrison, Shawné Michaelain Holloway, Ryan Kuo, Lynn Hershman Leeson, Juliana Huxtable, Josh Kline, Carolyn Lazard, Cannupa Hanska Luger, LaJuné McMillian, Jayson Musson, Rashaad Newsome, Mimi Onuoha, Tabita Rezaire, Antonio Roberts, Alfredo Salazar-Caro, Jacolby Satterwhite, Stephanie Syjuco, Theo Triantafyllidis, Lawrence Paul Yuxweluptun, and Amelia Winger-Bearskin—among many others.
 See “The Engines,” Computer History Museum, updated 2021, computerhistory.org/babbage/engines/.
 See, for example, JoAnne Yates, Structuring the Information Age: Life Insurance and Technology in the Twentieth Century (Baltimore: Johns Hopkins University Press, 2009).
 On the military origins of computing, see Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, MA: MIT Press, 1996). On the quantification of daily life, see Jacqueline Wernimont, Numbered Lives: Life and Death in Quantum Media (Cambridge, MA: MIT Press, 2019) and Jer Thorp, Living in Data: A Citizen’s Guide to a Better Information Future (New York: Farrar, Straus and Giroux, 2021).
 Bruce Damer, Avatars!: Exploring and Building Virtual Worlds on the Internet (Berkeley, CA: Peachpit Press, 1997), 136.
 Jennifer González, “The Face and the Public: Race, Secrecy, and Digital Art Practice,” in “Race and/as Technology,” ed. Wendy Hui Kyong Chun, special issue, Camera Obscura: Feminism, Culture, and Media Studies 70, vol. 24, no. 1 (2009): 37–65. Reprinted in Anna Dezeuze, ed., The “Do-It-Yourself” Artwork: Participation from Fluxus to New Media (Manchester: Manchester University Press, 2012), 185–205.
 Lisa Nakamura, Cybertypes: Race, Ethnicity, and Identity on the Internet (New York: Routledge, 2002).
 Jessie Daniels, “Rethinking Cyber-Feminism(s): Race, Gender, and Embodiment,” in “Technologies,” special issue, Women’s Studies Quarterly 37, no. 1/2 (Spring/Summer 2009): 101–24.
 Safiya Umoja Noble and Brendesha M. Tynes, eds., The Intersectional Internet: Race, Sex, Class, and Culture Online (New York: Peter Lang, 2016); Charlton D. McIlwain, Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter (New York: Oxford University Press, 2020); André Brock Jr., Distributed Blackness: African American Cybercultures (New York: New York University Press, 2020); Jennifer Gómez Menjívar and Gloria Elizabeth Chacón, eds., Indigenous Interfaces: Spaces, Technology, and Social Networks in Mexico and Central America (Tucson: The University of Arizona Press, 2019); and Bonnie Ruberg, Video Games Have Always Been Queer (New York: New York University Press, 2019). For general (though now dated) overviews of some of the models of thinking about race and the internet, see Jessie Daniels, “Race and Racism in Internet Studies: A Review and Critique,” new media & society 15, no. 5 (2012): 695–719, and Amber M. Hamilton, “A Genealogy of Critical Race and Digital Studies: Past, Present, and Future,” Sociology of Race and Ethnicity 6, no. 3 (2020): 292–301.
 Lisa Nakamura, Digitizing Race: Visual Cultures of the Internet (Minneapolis: University of Minnesota Press, 2007), 14; Legacy Russell, Glitch Feminism: A Manifesto (London: Verso Books, 2020); Shaka McGlotten, “Black Data,” in No Tea, No Shade: New Writings in Black Queer Studies, ed. E. Patrick Johnson (Durham: Duke University Press, 2016), 262–87. One example of Nakamura’s “digital racial formation” is the link between racial categories and morphing software as described by Evelynn M. Hammonds; see her “New Technologies of Race,” in The Gendered Cyborg: A Reader, ed. Gill Kirkup et al. (New York: Routledge, 2000), 305–17.
 Tara McPherson, “Why Are the Digital Humanities So White? or Thinking the Histories of Race and Computation,” in Debates in the Digital Humanities, ed. Matthew K. Gold (Minneapolis: University of Minnesota Press, 2012), dhdebates.gc.cuny.edu/projects/debates-in-the-digital-humanities.
 Jacob Gaboury, “Becoming NULL: Queer Relations in the Excluded Middle,” Women & Performance: A Journal of Feminist Theory 28, no. 2 (2018): 143–58.
 Kara Keeling, “Queer OS,” Cinema Journal 53, no. 2 (Winter 2014): 152–57. See also Fiona Barnett et al., “QueerOS: A User’s Manual,” in Debates in the Digital Humanities 2016, eds. Matthew K. Gold and Lauren F. Klein (Minneapolis: University of Minnesota Press, 2016), dhdebates.gc.cuny.edu/projects/debates-in-the-digital-humanities-2016.
 Wendy Hui Kyong Chun, “Introduction: Race and/as Technology; or, How to Do Things to Race,” and Beth Coleman, “Race as Technology,” in “Race and/as Technology,” special issue, Camera Obscura: 6–35, 177–207.
 See, for example, Harry Cleaver, “The Zapatistas and the Electronic Fabric of Struggle,” in Zapatista! Reinventing Revolution in Mexico, eds. John Holloway and Eloína Peláez (London: Pluto Press, 1998), 81–103.
 On CripTech, see Vanessa Chang and Lindsey D. Felt, “Recoding CripTech,” SOMArts Cultural Center, 2020, recodingcriptech.com, and “CripTech Incubator,” Leonardo, last modified April 23, 2021, leonardo.info/criptech.
 On disability dongles, see s.e. smith, “Disabled people don’t need so many fancy new gadgets. We just need more ramps,” Vox, April 30, 2019, vox.com/first-person/2019/4/30/18523006/disabled-wheelchair-access-ramps-stair-climbing.
 On the digital divide, see Bhaskar Chakravorti, “How to Close the Digital Divide in the U.S.,” Harvard Business Review, July 20, 2021, hbr.org/2021/07/how-to-close-the-digital-divide-in-the-u-s.
 Shara Tibken, “The Broadband Gap’s Dirty Secret: Redlining Still Exists in Digital Form,” CNET, June 28, 2021, cnet.com/features/the-broadbandgaps-dirty-secret-redlining-still-exists-in-digital-form/.
 These problems are not mere hypotheticals. In October 2019, a paper in the journal Science revealed that an algorithm used by medical centers and hospitals to allocate care was discriminating against Black patients. In August 2020, the United Kingdom used a program to guess what students would have scored on exams that were cancelled due to COVID; the program factored in where they lived, automatically lowering the grades of students from underprivileged neighborhoods. For more examples, see the Further Resources below.
 Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge, UK: Polity Press, 2019). See also Ruha Benjamin, ed., Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life (Durham: Duke University Press, 2019).
 Kashmir Hill, “Wrongfully Accused by an Algorithm,” The New York Times, August 3, 2020, nytimes.com/2020/06/24/technology/facial-recognition-arrest.html; Melissa Hamilton, “The Biased Algorithm: Evidence of Disparate Impact on Hispanics,” American Criminal Law Review 56, no. 4 (Fall 2019): 1533–77.
 Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York: New York University Press, 2018), 1–2.
 On the early history of digital art, see Grant D. Taylor, When the Machine Made Art: The Troubled History of Computer Art (New York: Bloomsbury, 2014).
 The term “the two cultures” was first popularized in 1959 by C.P. Snow and immediately became influential, although it has since been widely criticized as an oversimplification. See C.P. Snow, “Two Cultures,” Science 130, no. 3373 (1959): 419.
 See Anne Collins Goodyear, “Expo ’70 as Watershed: The Politics of American Art and Technology,” in Cold War Modern: Design 1945–1970, eds. David Crowley and Jane Pavitt (London: Victoria and Albert Museums, 2008), 197–203.
 Gail R. Scott, “Richard Serra,” in A Report on the Art and Technology Program of the Los Angeles County Museum of Art, 1967–71, ed. Maurice Tuchman (Los Angeles: Los Angeles County Museum of Art, 1971), 300. Tellingly, while acknowledging the way that technology contributes to the oppression of Black and Vietnamese people, Serra’s words might imply that they do not also have technology themselves. This perpetuates the idea of racialized “Others” as being sub-human, given the close identification of humanity with the ability to make tools.
 An excellent introduction to this field is Christiane Paul, Digital Art, 3rd edition (London: Thames and Hudson, 2015).
 See David Firestone, “While Barbie Talks Tough, G.I. Joe Goes Shopping,” The New York Times, December 31, 1993, nytimes.com/1993/12/31/us/while-barbie-talks-tough-g-i-joe-goes-shopping.html.
 See Ken Gonzales-Day, “The Bone-Grass Boy,” kengonzalesday.com/archive/the-bone-grass-boy/.
 VNS Matrix comprises Josephine Starrs, Julianne Pierce, Francesca da Rimini, and Virginia Barratt.
 See VNS Matrix, “A Cyberfeminist Manifesto for the 21st Century,” Net Art Anthology, Rhizome, 1991, anthology.rhizome.org/a-cyber-feminist-manifesto-for-the-21st-century.
 AbTeC now lives at abtec.org. See Rea McNamara, “Skawennati Makes Space for Indigenous Representation and Sovereignty in the Virtual World of Second Life,” Art in America, July 1, 2020, artnews.com/art-in-america/features/skawennati-abtec-island-indigenous-community-second-life-1202693110/.
 Brian Mackern, “Netart Latino Database,” Net Art Anthology, Rhizome, 1999–2004, anthology.rhizome.org/netart-latino-database. On art and technology, including digital art, in Latin America, see María Fernández, ed., Latin American Modernisms and Technology (Trenton, NJ: Africa World Press, 2018).
 An emulation of this work can be accessed via Victoria Vesna, “Bodies©Incorporated,” Net Art Anthology, Rhizome, 1996, anthology.rhizome.org/bodies-incorporated. See also Jennifer González, “The Appended Subject: Race and Identity as Digital Assemblage,” in Race in Cyberspace, eds. Beth E. Kolko, Lisa Nakamura, and Gilbert B. Rodman (New York: Routledge, 2000), 27–50.
 The project was recently conserved and is now available online again, albeit without the live elements, at Shu Lea Cheang, “Brandon: A One Year Narrative Project in Installments,” Guggenheim, brandon.guggenheim.org. See also Shu Lea Cheang, “Brandon,” Net Art Anthology, Rhizome, 1998–1999, anthology.rhizome.org/brandon.
 The original website, parts of which can still be accessed via the Internet Archive, was echonyc.com/~confess/. See Thomas Foster, “Cyber-Aztecs and Cholo-Punks: Guillermo Gómez-Peña’s Five-Worlds Theory,” in “Mobile Citizens, Media States,” ed. Carlos J. Alonso, special topic, PMLA 117, no. 1 (January 2002): 43–67. See also Evantheia Schibsted, “Confessions of a Webback,” Wired, January 1, 1997, wired.com/1997/01/ffpena/.
 María Fernández, “Postcolonial Media Theory,” Art Journal 58, no. 3 (Autumn 1999): 58–73.
 See Jennifer Chan, “Why Are There No Great Women Net Artists?: Vague Histories of Female Contribution According to Video and Internet Art,” Mouchette, 2011, about.mouchette.org/wp-content/uploads/2012/05/Jennifer_Chan2.pdf; Kimberly Drew, “Towards a New Digital Landscape,” SuperScript, May 11, 2015, walkerart.org/magazine/equity-representation-future-digital-art; Ben Valentine, “Where Are the Women of Color in New Media Art?,” Hyperallergic, April 7, 2015, hyperallergic.com/195049/where-are-the-women-of-color-innew-media-art/; Aria Dean, “Blackness in Circulation: A History of Net Art,” in The Art Happens Here: Net Art Anthology, ed. Michael Connor with Aria Dean and Dragan Espenschied (New York: Rhizome, 2019), 394–99; and Lila Pagola, “Netart Latino Database: The Inverted Map of Latin American Net Art,” in The Art Happens Here, 400–5.
 Paul Hertz, “Colonial Ventures in Cyberspace,” Leonardo 30, no. 4 (1997): 249–59.
 Melvin Kranzberg, “Technology and History: ‘Kranzberg’s Laws,’” Technology and Culture 27, no. 3 (July 1986): 545.
 On the relationship between Blackness and surveillance, see Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham: Duke University Press, 2015).
 See Indigenous Futurisms: Transcending Past/Present/Future (Santa Fe: IAIA Museum of Contemporary Native Arts, 2020).
 Meredith Broussard, Artificial Unintelligence: How Computers Misunderstand the World (Cambridge, MA: MIT Press, 2018).