How ‘ghost work’ in Silicon Valley pressures the workforce, with Mary Gray

The phrase “pull yourself up by your own bootstraps” was originally meant sarcastically.

It’s not actually physically possible to do — especially while wearing Allbirds and having just fallen off a Bird scooter in downtown San Francisco. But I should get to my point.

This week, Ken Cuccinelli, the acting director of United States Citizenship and Immigration Services, repeatedly referred to the notion of bootstraps in announcing shifts in immigration policy, even going so far as to change the words of Emma Lazarus’s famous poem “The New Colossus”: no longer “give me your tired, your poor, your huddled masses yearning to breathe free,” but “give me your tired and your poor who can stand on their own two feet, and who will not become a public charge.”

We’ve come to expect “alternative facts” from this administration, but who could have foreseen alternative poems?

Still, the concept of ‘bootstrapping’ is far from limited to the rhetorical territory of the welfare state and social safety net. It’s also a favorite term of art in Silicon Valley tech and venture capital circles: see for example this excellent (and scary) recent piece by my editor Danny Crichton, in which young VC firms attempt to overcome a lack of the startup capital that is essential to their business model by creating, as perhaps an even more essential feature of that model, impossible working conditions for nearly everyone involved — often with predictably disastrous results.

It is in this context of unrealistic expectations about people’s labor that I want to introduce my most recent interviewee in this series of in-depth conversations about ethics and technology.

Mary L. Gray is a Fellow at Harvard University’s Berkman Klein Center for Internet and Society and a Senior Researcher at Microsoft Research. One of the world’s leading experts in the emerging field of ethics in AI, Mary is also an anthropologist who maintains a faculty position at Indiana University. With her co-author Siddharth Suri (a computer scientist), Gray coined the term “ghost work,” as in the title of their extraordinarily important 2019 book, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. 


Image via Mary L. Gray / Ghostwork / Adrianne Mathiowetz Photography

Ghost Work is a name for a rising new category of employment that involves people scheduling, managing, shipping, billing, etc. “through some combination of an application programming interface, APIs, the internet and maybe a sprinkle of artificial intelligence,” Gray told me earlier this summer. But what really distinguishes ghost work (and makes Mary’s scholarship around it so important) is the way it is presented and sold to the end consumer as artificial intelligence and the magic of computation.

In other words, just as we have long enjoyed telling ourselves that it’s possible to hoist ourselves up in life without help from anyone else (I like to think anyone who talks seriously about “bootstrapping” should be legally required to rephrase as “raising oneself from infancy”), we now attempt to convince ourselves and others that it’s possible, at scale, to get computers and robots to do work that only humans can actually do.

The purpose of ghost work, as I understand it, is to elevate the value of what the computers are doing (a minority of the work) and make us forget, as much as possible, about the actual messy human beings contributing to the services we use. Well, except for the founders, and maybe the occasional COO.

Facebook now has far more employees than Harvard has students, but many of us still talk about it as if it were little more than Mark Zuckerberg, Sheryl Sandberg, and a bunch of circuit boards.

But if working people are supposed to be ghosts, then when they speak up or otherwise make themselves visible, they are “haunting” us. And maybe it can be haunting to be reminded that you didn’t “bootstrap” yourself to billions or even to hundreds of thousands of dollars of net worth.

Sure, you worked hard. Sure, your circumstances may well have stunk. Most people’s do.

But none of us rise without help, without cooperation, without goodwill, both from those who look and think like us and those who do not. Not to mention dumb luck, even if only our incredible good fortune of being born with a relatively healthy mind and body, in a position to learn and grow, here on this planet, fourteen billion years or so after the Big Bang.

I’ll now turn to the conversation I recently had with Gray, which turned out to be surprisingly more hopeful than perhaps this introduction has made it seem.

Greg Epstein: One of the most central and least understood features of ghost work is the way it revolves around people constantly making themselves available to do it.

Mary Gray: Yes, [What Siddharth Suri and I call ghost work] values having a supply of people available, literally on demand. Their contributions are collective contributions.

It’s not one person you’re hiring to take you to the airport every day, or to confirm the identity of the driver, or to clean that data set. Unless we’re valuing that availability of a person, to participate in the moment of need, it can quickly slip into ghost work conditions.

Image via Getty Images / sorbetto

Epstein: We often forget that in order for us to have an on-demand service, we have to build an entire culture around ourselves where people are living a lot of their lives in readiness to meet our demands.

Gray: Exactly. [There is] a complete dependency on people making themselves available to us. What we have not worked out is recognizing the value of somebody making themselves available to us.

We have a long history of devaluing people who are compelled to serve us or just simply seem to be in the air available to us. This organization of labor by design means I am putting a good number of people in front of a consumer or a business, but they’ll never see them because there’s this invisible shield that makes them impossible to see.

But that is really what’s being offered: the availability of this collective pool of people. Won’t it be amazing if we can move culturally to a place where we actually value that availability?

Epstein: From your perspective as an anthropologist who works for Microsoft and has spent years researching this, why don’t we value that availability?

Gray: The historical reality is, somebody [who can be put] into service right now… we’ve equated that with somebody who’s not very valuable. Because if I can find them right away, and I don’t need them for very long, then they’re of no use beyond that moment.

The history chapter in Ghost Work [shows] we started doing that with slavery. We racialize, we gender, we nationalize disposability. We have organized labor [that way] for millennia.

As a cultural anthropologist, my belief is that there is nothing inherent about those cultural and social structures. We reproduce them every generation, but they’re not destiny.

Alongside the growth of industrial capitalism, we’ve had this belief that people, through education and other means, would be able to move away from the drudgery of manual labor. That there was a way to elevate yourself and move toward more cultivated work, a culture of the mind.

[But] if you look outside of very particular settings in the global north, most people’s day to day still involves subsistence agriculture, a kind of market capitalism that’s very much service-oriented, delivering very particular [goods] to particular people. It doesn’t look like this formalization of employment that we have built up in industrial countries.

Epstein: Are people in the tech world listening to your message? To the extent the world is a competition, and I don’t think it is but people do see it that way, they are already the winners of the winners, right?

They have so much wealth, so much privilege, so much opportunity, so much freedom relative to other people on Earth that there’s a tremendous incentive to either ignore or wish these truths away.

Gray: Two things. First, many rank and file bench engineers and computer scientists move toward this kind of work because they like to build things. They might have different personal motivations, but they like to create. Second, almost without exception, they know that work is done in teams.

They almost always come to the projects they build as systems building. It’s very rare to meet somebody who would ever claim, as an engineer, that they did everything entirely by themselves. [In] coding environments, and certainly open source, it’s a collective achievement.

What I think is possible here with Ghost Work, [is for people in the tech sector to say,] “it’s really your collective intelligence, which is its own phrase in computer science, that [determines] where you go. It’s never just one person’s input.”

You could think of [that argument] as the ultimate challenge to meritocracy. If meritocracy means believing that it was just my own grit and bootstrapping and my own individual effort that got me where I am, the counterpoint to that is, ‘we achieve nothing except by standing on the shoulders of others.’

Image via Getty Images / Caiaimage / Robert Daly

Epstein: On the one hand, I find what you’re saying very encouraging: tech folks work in teams, and they may be willing to expand the bounds of those teams and see more people as like themselves.

This raises a concern for me, however. Haven’t we documented the ways in which people will be willing to see others like them as part of a team to which they’re contributing, but don’t we then tend to marginalize everyone else and say, “All right, I need people like me, but I may not need or need to value the efforts of women as much, or I may not need to value the efforts of people of color as much, or I may not need to value the efforts of people who didn’t come from my socioeconomic background as much,” etc.?

How do we address that when what we’re trying to build on is people’s ability to see themselves as part of a team?

Gray: That’s a fantastic question. Take the conversation around fairness and bias in artificial intelligence: it is less than two years old, arguably, maybe three. If you think about the beginnings of those conversations within the industry, but certainly within those disciplines, it was less a matter of people actively claiming, “My group is special. We do the really hard work here.”

And more of an unexamined habit of doing that, passively doing that. As soon as [tech workers become aware of] the possibility that I as an engineer or computer scientist might be perpetuating bias, might be devaluing women’s contributions, folks of color’s contributions, thinking that people outside the United States can’t possibly offer what I have to offer… by and large, except for some really clear exceptions, there’s some real humility within the tech sector and a willingness to examine their biases.

I mean, which industry implemented implicit bias training [almost] overnight?

Does that mean [bias] goes away? Heavens, no. [But] I do think there’s a lot of potential in this sector, for particularly doing the self-reflection it takes to say, “I have been banking on privilege and haven’t acknowledged the implications of that.”

I see willingness to reckon with structural inequality and oppression in this sector more quickly than I’ve seen it hit healthcare, where it’s really obvious among medical professionals. I mean, I don’t disagree with your statement that there’s a great deal of privilege and inequality at work in these environments. I’m noting the willingness and the interest of folks within this environment to reflect.

Epstein: I didn’t expect you to be more optimistic about this than I am, but I’m happy you are.

Does your book include any documentation of this willingness to listen and to be influenced by information about bias, and of the positive gravitational pull that can have on people in the tech industry?

Gray: It’s touched on in the book through the cases of LeadGenius and Amara, particularly LeadGenius’s effort to challenge the xenophobia that can be built into platforms that, without reflection, don’t think about who might be able to access jobs, and how people might be disenfranchised from accessing jobs because of their English literacy or where they’re living.

[Saying,] “I’m not going to bait and switch and call something AI. I’m actually going to value that there are humans at work” is precisely where LeadGenius and Prayag Narula go with their argument.

Prayag is quite frank about how that means he’s going into meetings with VCs who are less likely to be excited about what he has to sell, because he’s not selling AI. He’s selling the value of leaning on this collective intelligence of people.

Epstein: I’ve interviewed him about that and very much enjoyed hearing his thoughts on the experience. [To be published in the future.]

I want to ask about your comments about work conditions versus type of work. Part of the Western patriarchal capitalist mindset that I’ve been very much a part of for the majority of my life, and am only relatively recently waking up from (so if anybody’s hypocrisy alert is going off when they read me mention any of these things, I’m guilty as charged), is that we value ourselves and one another by what we accomplish, not by who we are as human beings.

And yet the system in which we’re living is rigged to make some people much more likely to be able to accrue what we tend to view as “accomplishments,” and other people much less likely to accrue them. Right?

Gray: Yeah.

Epstein: It’s actually a pretty radical statement for you to say we should value work according to work conditions rather than what it is that we’re producing. I mean, it sounds relatively nice and simple coming from you, but you’re actually talking about flipping the tables on an enormous global system that is in some ways running billions of people’s lives.

Gray: I’m very glad that that’s what you see [me] saying. The book demonstrates that the information services consumers most enjoy and turn into a verb [rely] on people making themselves available.

We have a story we tell ourselves about who is valuable. Within a US articulation of capitalism, we have so quickly fused [our accomplishments] with who we are. We’re quite defensive of the idea that I am valuable because of what I do, which is who I am.

[But if] I’m looking at service work, at how I might value a doctor compared to a nurse, compared to a nurse’s aide, it’s very difficult to make a clear and empirically sound argument about why we value one of those people more than the other, other than gestures to higher education. It’s not unfair to say that person has more training, but it gets muddy if we think about how many people participating in these markets have some level of higher education. It gets really muddy if we start thinking about the inequality built into access to higher education.


Image via Getty Images / vasabii

Epstein: I used to run a community center that I helped found, called the Humanist Hub. We had a big lecture hall, and one of the things we did there was provide free space for courses on anti-racism and white supremacy.

I took some of the courses, but I didn’t take them all. One day, I walked into the lecture hall after one of the classes; they had their notes on the wall on giant Post-Its. One of the posters was a list of the features or characteristics of “White Supremacy Culture.”

The first thing it listed was “Perfectionism.” At first, I thought to myself, “Now, wait a second. This is not what we’re here to talk about.”

We’re here to talk about racism. Perfectionism is a separate issue, and much less important. Then I started thinking about it: what if the whole culture that allows us to be perfectionistic about our work is really actually very much reinforcing white supremacy?

Because it’s very easy for privileged, professional people to spend so much time obsessively critiquing ourselves and one another for supposed failures to do perfect work that we find ourselves too exhausted and distracted to pay any attention to the structures of oppression that are reinforced by all that work.

[Note: the “White Supremacy Culture” exercise was for a class in which the Anti-Racism Collaborative (ARC) used a “restorative justice circle” to discuss a “ChangeWork” curricular resource by activists Kenneth Jones and Tema Okun.]

Gray: Absolutely. Absolutely. The rules of decorum around how one is supposed to behave and how that gets encoded into professions, and how that has to by definition become racialized and gendered. Right?

Epstein: Yeah.

Gray: When I say doctor, for most people, they have a very clear image of who comes up. If I say firefighter, if I say priest, all of those are so culturally loaded and encoded with who can become that. I mean, there’s that one part of it. There’s a very hard wall that anybody will hit if they’re trying to access certain kinds of professions.

When the work involves caring and attending to somebody, paying attention to somebody… If I bring it back to the example of a nurse and a doctor, it’s a real coin flip to say one of those is more valuable than the other.

It might be, under the knife, the surgeon is most valuable. But if I’m asking my mom when she’s convalescent, that moment of a nurse coming in and turning her so that she doesn’t get bedsores is incredibly valuable to her.

We’re looking to the market to value human contribution of labor to each other. We have no evidence in history that the market has ever set the true value of human labor.

When it comes to employment, it’s a moral question who we value and how we value them. We’ve always had to rely on society, social activists, collective bargaining, collective action to effectively put a hand up to capitalism and say, “this is where you will stop playing any role in valuing human contribution.” The market can’t answer that question effectively.

Epstein: The market is as biased as our algorithms.

Gray: Absolutely. The market is biased towards pricing hard, durable goods. This is the failure of economics, to not have a better set of theories about the difference between labor capital and asset capital. There’s a difference between me contributing to somebody’s care and me selling my records on eBay, [but] we conflate the two and turn to the market to imagine it could effectively value human contribution.

To me that’s what’s unique about this moment: that the growth area of our employment is in this world of human contribution to others. It’s service. We will have to come up with a new social contract, because that service is… It’s pretty contingent.

I’ve been going back to [educator Paulo] Freire quite a bit, thinking of Pedagogy of the Oppressed [his landmark 1968 book, in which he argues that the learner, not just the teacher, should be seen as a co-creator of knowledge]. [A sustainable future] is not a banking model where I’m going to selfishly possess one employee and their individual capacity.

That notion of human capital is so empty. It’s going to be what groups of people are collectively able to offer. That’s actually been true for some time. We just haven’t valued it.

Epstein: How optimistic are you about our shared human future?

Gray: I am a relentless optimist, because I see a growing awareness, not just in the tech sector, of the end result of letting markets drive everything to the exclusion of valuing the nonmonetary. We’re really [starting to value] the nonmonetary, prioritizing that. That is what could make for an ethical future: saying we are going to find greater value beyond the financials, beyond the bottom line.

That’s why I’m optimistic. [As we explored in Ghost Work’s chapter] “The Double Bottom Line,” most companies now see the value in selling something other than a product. None of us really want an experience that’s just stripped of all sociality.

As we try to fold that into the future of work and business and commerce, it necessarily means we’re going to be looking for ways to experience the world, beyond a price tag. At the end of the day, we are more than our markets.

Epstein: Great answer for that question, one of my favorites.