Sherry Wong—Artist and Advocate

In celebration of all things Internet and art, we chatted with DPN board member Sherry Wong. A Turkish-Hawaiian artist and AI ethics advocate, Sherry has great insight into art, technology, and why we need to protect the Internet. Ready to get inspired?

Tell us about yourself

I’m an artist who works in AI governance, and I use research much in the same way I have used paint or video. I am a very hybrid person: I am Turkish-Hawaiian, and grew up in Turkey and England. I came to the U.S. for university and then spent 10 years in the New York art world before coming to San Francisco, where I worked at a hedge fund and then in tech. Now, I partner with institutions and universities and collaborate with other artists and partners from academia, government, and civil society.

For me, art is simply a form of connection between things, and I feel this in a powerful, life-affirming way. Over the past 20 years, my work has ranged from video to happenings, to paintings, to my newer art language: research and advocacy as an art in itself. These mediums all break and expand conceptual boundaries: what are the limits of language or image, and what should the role of the artist be? Beginning in 2016, my practice became oriented primarily around AI governance and increasingly around justice.

Tell us about your artistic process: how do you choose your subject and bring your ideas to life?

My art is driven by friendship, which echoes the impetus of the work itself: expansion and connection. My collaborators often become my friends, and my inspiration and knowledge of how to work often come from my friends. The painter Francis Ruyter is a big driver of the way I think about art. He taught me to think of artists as migrants moving between ideas and materials. With this momentum, artists bring to life generative forms, and much like in machine learning, we have our own pattern recognition that is informed by our training.

I’ve consistently been interested in power, labor, myth, ethics, and aesthetics, and how these things historically relate to each other in the formation of culture and in occult narratives. I look at technology, and AI in particular, because I am curious about the forces that animate our world, and tech is a major driving force of how our reality is constructed. Art helps me approach technology as an economic and political force. As I look at how AI is created, by whom, and why, I can see older allegories and mythologies playing out with a specificity to our time.

What do you feel is the ideal role of AI in the future, and what do we need to do to get there?

The market has decided that the ideal role of AI is to be a part of everything that can be digitized. AI is currently used to predict our future: it determines everything from the cost of shipping goods to access to medical care. AI is ubiquitous even in sensitive sectors where great harm can occur, such as child protection services, humanitarian aid, and criminal justice. The problems in AI go far beyond technical issues to societal ones. Since we tend to only recognize or talk about the technical issues with AI, we inevitably leave out the work that needs to be done on an institutional and structural level so we do not replicate and amplify our past failings. If we slowed down and limited some uses of AI, or looked at them more stringently, couldn’t we do a better job ensuring there are mechanisms in place to prevent harm and maintain the public trust?

This is where it is important to have creatives, researchers, activists, and the public: we need to tell a different story from the one we currently tell of market forces and the race for AI dominance between companies or countries. Instead, we can think of AI as a collective enterprise. After all, it is built from our collected data, animated by our collected narratives, and necessitates our collective governance. We can imagine collaborative outcomes for AI, driven by equity, that can shape the decisions being made by policymakers so that we can have laws that serve the public interest.

This ties into why I built Fluxus Landscape, an interactive map of over 500 stakeholders in AI ethics and governance, created in collaboration with the Center for Advanced Study in the Behavioral Sciences (CASBS) at Stanford. It is both a useful tool and an art piece. It helps bring to the surface tensions among the various entities engaged in AI discourse and the audience’s relationship to the language around AI. This includes work done in this space not only by academia, companies, and government, but also by the tech worker movement, grassroots organizations, and groups impacted by the use of AI in their communities.

What projects are you working on now?

I recently launched Fairer Tomorrow with CASBS, Stanford. It’s an immersive website (headphones on!), built by Lusion, that explores more than 100 ideas and solutions from the moral political economy program network in response to the problems and vulnerabilities COVID-19 magnified. This was a very special project to work on: instead of feeling powerless and isolated, I was able to connect to the work being done by many different thinkers who are laying the groundwork to make systemic change.

I’m currently researching data brokers with the artist Caroline Sinders for the Transformation of the Human Program at the Berggruen Institute. It is exciting to be creating work that unifies art, tech, and philosophy in a truly interdisciplinary way.

I am also thrilled to be working with AI policy researcher Raziye Buse Çetin and storytelling and legal expert Pinar Özütemiz to look at AI from a Turkish and feminist perspective in order to form new narratives that can inform policy. Eryk Salvaggio and I are creating these moments of a decentralized art collective, where we look at throwing dust in the gears of AI systems through a performance art lens. We are encouraging the research community and civil society to think of AI in a way that can lead to a more nuanced understanding of brittleness in AI systems and why communities might want to protect themselves.

Is there a specific cyberwarfare attack that left a strong impression on you?

The recent SolarWinds hack, which was so pervasive, made me feel more deeply how scary a major cyberwar attack on infrastructure could be.

As an intellectual conversation, the vulnerability of digital systems centers on risk assessment and mitigation; but on an emotional level, you bring into consideration emergency response systems, supply chains, the power grid, hospitals, and so on. It made me think of how interconnected we are: every digital system our lives depend on is connected to employees who are using the internet, and ultimately to my own family and friends. Cyberwarfare can also disproportionately impact the most vulnerable communities because of geopolitical scenarios in which they may have no involvement.

There are always going to be vulnerabilities in our systems that can be exploited. Technical fixes can only take us so far; we also need to change our norms on a global level to make cyberwarfare, with its potentially devastating consequences, less of a possibility.

How can art help in the fight against cyberwarfare?

It’s a big question. What does art do? In the context of cyberwar, there’s a quote from Anton Chekhov that I relate to: “The task of a writer is not to solve the problem, but to state the problem correctly.” There is no stronger force than art when it comes to making cultural change. Art is culture and, as such, a shaper of society. I believe that every single person can think of themselves as an artist, as a creator and shaper of their world. Just by asking the right questions, by feeling out the borders and boundaries of why cyberwarfare matters, we begin transformative work.

What made you want to get involved with Digital Peace Now?

Computers are much more than a collection of hardware and programs: they are our wedding announcements, the records of our health, the control of our traffic, our entertainment, our education, our whole lives.

For decades, we knew a pandemic was a likelihood. We had agencies to address it and global arrangements to collaborate, and yet here we are, breaking down on so many levels: psychologically, economically, politically. It is devastating. For me, cyberwar has that same kind of narrative. We see the devastating potential: the scale and scope of possible harm. Are we prepared? What else can be done?

We must change this story of the digital world as a place of attacks and fear and create a different one.