hgrsd

Tech work, ethics, and indifference

I recently read Rebecca Solnit's article "In the Shadow of Silicon Valley".1 In it, she laments the negative consequences of the tech industry and its billionaire moguls for the city of San Francisco, where she has lived since her youth. The article is interesting for many reasons, and I urge you to read it. What struck me most, however, was Solnit's observation about the workers in tech.

She writes:

Many tech workers think of themselves as edgy, as outsiders, as countercultural, even as they're part of immense corporations that dominate culture, politics, and the economy.

This rings true for me. I am a tech worker in a mid-sized American corporation. I consider myself liberal and progressive; this is true for many of my colleagues too. I am often critical of what I perceive to be the serious negative influence of many, if not most, tech companies on the fabric of our societies. Again, this is true not just for me; many tech workers, including those at the molochs of capitalism like the so-called "FAANG" companies,2 lean left of center and hold progressive views.

Solnit's article provided a welcome jolt to my brain. While reading it, I became increasingly aware of a cognitive dissonance between the values that I hold and the fact that I work in the tech industry.

How come I, and others in similar positions, rarely think about the consequences for the world of the products and companies we have helped design and build? Most software engineers are more than capable of abstract thought, logical reasoning, and online research. With a bit of motivation, it is well within our means to figure out the concrete effects of the lines of code that we write. So why don't we do so more often?

(I want to make clear that I do not think there is a conflict between the practice of software engineering and liberal and progressive values; rather, I am concerned with the "tech industry".)

Why don't we care?

I want to catalogue a few reasons that might contribute to tech workers' apathy and indifference towards the societal impact of the products we build.

Necessity

People struggle to make ends meet. In many lines of work, no matter how much effort you put in and how talented you are, it is hard to afford a roof over your head and food on the table. Working for a tech corporation makes this easier. Tech work pays well. You might even get shares. Your health insurance will likely be taken care of for you.

If you have no feasible alternative that gives you the same capacity to obtain the things that you and your family need, then there is little space to question the effects of your work on the wider world. As Brecht wrote, erst kommt das Fressen, dann die Moral: food comes before morality.

Comfort

The office has free snacks; perhaps the company has an on-site chef. A hairdresser comes in every once in a while. With a bit of luck, we are even able to work from the comfort of our home. It is rare for us to come across poverty, suffering, violence, or the myriad other awful things that people in other jobs have to deal with. Working life as a tech worker is cushy, far more so than in most other careers. It is not easy to question the ethical foundation of a job that provides such a level of comfort.

Our convictions are just for show

This is perhaps the most cynical entry in this catalogue. Maybe we don't really hold the values that we purport to hold. Perhaps holding those values is just the "done thing" in our social and professional circles, so we make it seem like we do. But in reality, we have little regard for the ill effects that our hypercapitalist employers and their products have on society. In Dutch we have a (slightly crass) way of saying this: "links lullen, rechts vullen": to talk on the left, and to fill one's pockets on the right.

Moral distance

"Moral distance" is a concept from ethics used to explain why we might have less moral concern for those at a distance. Think, for exapmle, of drone operators who, from the safety and comfort of an operations centre in the United States, press a button and kill people multiple thousands of kilometers away.

I am not equating tech workers with drone operators. But I am suggesting that there is a significant distance between the day-to-day work of a software engineer and the negative impact of the company or product that they work for. This negative impact is, at least mentally, far removed from the problems that a software engineer is generally concerned with. They might be worrying about the scalability of their system, the maintainability of their code, or the specific algorithm that best fits the engineering problem at hand. The human beings who will be affected do not feature much, if at all, during this kind of work.

As a result, it might be harder to feel a strong sense of responsibility for the potential ill effects of your labour. Although I am no Marxist theorist, I'm sure there is a point about alienation to be made here.

Compartmentalisation

Compartmentalisation is a psychological defense mechanism. The gist is that our minds keep irreconcilable thoughts and feelings neatly separated, so that we don't have to resolve the conflict that exists between them.

At work, we focus on vapid growth objectives and KPIs. Our labour is gamified: we look at metrics like cycle time and lead time as proxies for the value we provide. Our minds are kept busy with insular, company-related issues. Our universe shrinks to "company culture". Perhaps we even uncritically adopt the company's mission statement as our own mission, which is guaranteed to be almost comically divorced from the reality of what the company does. (Take Meta's: "to give people the power to build community and bring the world closer together", which says precious little about advertising, the company's cash cow and core business.)

As a result, when we are at work, we are so caught up in our professional environment that we "forget" to think about the real-world consequences of what we do. The critical, societally engaged, and politically aware part of our brain is sedated and pushed to the background.

Conversely, when we are at home or amongst our friends, it is more convenient to "forget" that our day jobs might be incongruous with our beliefs about what is valuable in life. Resolving that conflict is much more challenging: it would require changing your actions or changing your beliefs. Best to repress the conflict, so we don't have to choose.

Now what?

This article resolves nothing. It merely lists a few reasons why we don't seem to care about the ill effects of the software we build. But perhaps if we know why, we can try to actively work against the indifference.

If it is because of comfort, we can ask ourselves whether we find that comfort important enough to blinker us. If it is because of moral distance, we can attempt to reduce that distance: how can we expand our circle of moral concern? If it is because of compartmentalisation, we can work to jolt "the better angels of our nature" into action, even when we assume our professional roles.

As engineers, we should practise asking ourselves who will benefit and who will suffer from the technologies that we help create. Doing so will help us fight back against the tunnel vision that is so endemic in our industry.

Notes

1 Rebecca Solnit - In the Shadow of Silicon Valley.

2 Facebook, Amazon, Apple, Netflix, Google. Although some of these companies have changed names since the acronym was coined, it remains widely used.

Tags: #tech industry #software #ethics