The Promise of Shitty Train AI

Promises were made. Promises were always made. The first Homo erectus to notice that a tree was still burning after a lightning strike and carry a branch back to the cave promised, “We will never be cold again.” The first Homo sapiens who realized that he could use one stick to throw another stick, inventing the atlatl, promised, “We will be able to hunt with less risk and we will never be hungry again,” in part because the atlatl allowed women to hunt as effectively as men. Every new technology is sold and spreads by someone making a promise.

Before thinking more about the promise, there are two aspects of technology that I want to highlight.

First, we live in a technological society. We all know that. But in addition, our ancestors for the past few thousand years also lived in technological societies. We often forget that.

The current discussion of technology too often starts from the assumption that technology began with the iPhone, which is an outgrowth of a proto-technology from the dark ages called the ARPANET, which is an outgrowth of a pre-technological artifact called the transistor. When people talk about “tech giants,” they mean Amazon, Apple, Alphabet, Meta, and so forth. When people say they have a problematic relationship with “technology,” they usually mean their phone.

Lost in all that is that it requires advanced technology for UPS to deliver 24 million packages each day in a trackable manner that lets you know at every moment where the package is, when it will arrive, and when it has finally arrived. UPS is a tech company with a lot of drivers in the way that Amazon is a tech company with a lot of warehouse pickers. There are differences, of course – I’ve never felt like UPS is actively trying to hijack my mind, destroy small businesses, and lock me into UPS so that I can no longer use FedEx – but the similarity shouldn’t be lost either.

People talk about going backpacking for a week to get away from technology. The vast majority will carry a smartphone with them, which is genuinely handy for navigation with services like GaiaGPS. More to the point, hardly a single item we carry with us in the mountains could have been produced without the technological advances of the late twentieth and early twenty-first centuries. Lightweight, waterproof, breathable clothing. Superlight tents and sleeping bags and cookstoves with titanium pots. Carbon fiber hiking poles. Shoes made with rubber compounds and foams that were unknown 25 years ago. The modern backpacker who leaves all electronic devices behind is nevertheless literally wrapped in advanced modern technology.

We do not want a technology-free world, nor could we possibly live as humans in one. There are good and bad technologies, helpful and unhelpful technologies, technologies that are worth the downsides and technologies that are not. But there is no human world aside from technology and there literally never has been – hominids began using basic technology before the emergence of Homo sapiens.

Second, Kevin Kelly’s metaphors notwithstanding, technology doesn’t want anything. People want things. We may be evolving technology that does want things, but most of our technology is inert in the domain of desires. Talking about what technology wants is like talking about what weather wants or what nature wants. And no, nature doesn’t want balance and equilibrium. Exhibit A: The Big Bang.

Nylon doesn’t want anything. People who sell nylon want things. When we talk about what technology wants or the promise of technology, we are absolving the human actors who make money by pretending that technology simply wants things and makes promises without an actual human wanter to do the wanting and a human huckster promiser to do the huckstering promising.

It’s a disguised passive, like the way some of my weaker history undergraduates would use the word “happened” (“then the Reformation happened”). These constructions are formally in active voice (technology does the wanting) but succeed just as well, better in fact, at removing the human actor.

A mere passive construction omits the human actor: “Many workers were laid off.” A muscled-up disguised passive actively hides the human actor and then hides even the fact that it is hiding something: “Many workers lost their jobs.”

That’s the problem with those damn workers. They have such short attention spans. They take their eyes off their jobs for a second and they lose them. Factory owners who buy robots do not fire their workers. No, those irresponsible workers simply, through inexplicable inattention, lose their jobs as though, with just a bit of diligent looking, they could find them again (and notice the asymmetry between “I lost MY job” and “I am looking for A job”).

Similarly, “Advances in technology destroyed thousands of jobs in the widget sector.” Like a glacier plowing down the mountain, technology appears to plow through labor markets and social structures without needing any human driver. Or as my undergrad history students might write, people had lots of jobs in the widget sector, but then automation happened.

Promises were made

And what is the nature of those promises? In my favorite non-fiction book of 2023, The Life We’re Looking For, Andy Crouch says that technology is typically sold by telling us what we will now be able to do or what we will no longer have to do. The wheelbarrow is sold on the promise that we will no longer have to carry heavy loads on our backs. The atlatl is sold on the promise that we will now be able to keep a safe distance as we kill mastodons (the examples are mine, not from the book).

New technology always brings more with it than the promise. It brings obligations. For each thing that you will no longer have to do, there is another thing you will no longer be able to do. For each thing you will now be able to do, there is often another thing that you now have to do.

Wheelbarrows and Matches

It would be hard to argue that a wheelbarrow made us weak by freeing us from carrying things on our backs. But already, the wheelbarrow brought with it the obligation of a single mason tender to deliver more bricks than he had to back when he just had an unwheeled barrow (aka basket) or his hands. And because it is more efficient, it brought with it the obligation of every mason tender to own a wheelbarrow, that marvel of ancient Greek technology (that’s right, brilliant as they were, the Babylonians were not technologically advanced enough to have wheelbarrows).

However, the wheelbarrow was such a marvel of technology, with such outsized benefits, that historians estimate a worker could pay for one in just three to four days of boosted productivity (quite different from modern truck drivers, who buy or lease trucks but frequently can’t ever pay them off due to exploitative practices by shipping companies). In short, the promise of the wheelbarrow was admirably fulfilled and the resultant obligations were minor.

I label the wheelbarrow a win.

Before matches were easily available in early-modern France, people had to keep embers burning. If the embers went out, they had to go fetch embers from a neighbor, often carrying them home in a wooden shoe. Unfortunately, the embers had a tendency to spill on the way home and burn down villages. Cheap safety matches, which in some places were not commonly available until after World War I, solved that problem. People no longer had to go to the neighbors to get embers to restart the fire. But this also meant that people were no longer able to go to a neighbor’s house and ask for embers, an important occasion for informal chats. By 1889, teachers in Lorraine had noticed this impact on village social life (Eugen Weber, Peasants into Frenchmen, pp. 16, 165, 527). Still, it seems a small price to pay for avoiding regularly burning the village down. Laurence Wylie, over the years he updated Village in the Vaucluse, noticed a similar effect caused by the refrigerator – it reduced food spoilage, but it also reduced informal socializing as people no longer had to go to the market every day.

Still, I like my matches and my refrigerator. I label matches and refrigerators wins.

Phones and Cars

With devices of the twenty-first century, though, we feel this tension acutely. To go for the obvious cliche, the promise of the smartphone sellers was that we would be able to check email from anywhere, but for many people it brought the obligation to check email from everywhere. The smartphone sellers told us we would now be able to work from home, but they did not tell us that we would no longer be able to fully leave work and that we would now have to work from home.

We can choose many other examples. Modern cars promise remote start, but obligate you to pay for a subscription if you want basic features to work. Modern televisions promise easy access to a universe of entertainment, but they obligate you to give up massive amounts of data to various large corporations – sometimes the streaming service, sometimes the TV manufacturer, sometimes both. Or even more simply, they promise that you will no longer have to be bored because you always have entertainment available, but in practice they mean you will frequently be bored or, worse, lonely, because you will no longer be able to see your neighbors, who are busy watching TV – functioning much like the match in the nineteenth century (on a personal note: we try to always have one TV series we are watching with our backyard neighbor, who lives alone, so we always have a reason to get together).

The examples are infinite and the benefit/cost ratio generally seems much lower than it was for the wheelbarrow, the atlatl, the safety match and the refrigerator.

This heuristic is just one of the reasons that I can’t get Andy Crouch’s book out of my mind (the personal vs personalized heuristic and the idea of the larger, possibly non-family household vs the nuclear family/couple being the other two).

Shitty Train AI

Meanwhile, the American Dialect Society chose “enshittification” as its 2023 Word of the Year, which also delighted me because that is the other lens that became instrumental to my understanding of the world in 2023, thanks to Cory Doctorow’s blog. For those new to the idea, enshittification is the process by which things, especially online platforms and services, become shitty over time, mostly by abusing their loyal users in order to make more money for shareholders.

All of this clicked into place as One Big Thing when I read Cory’s comment about how AI-powered self-driving tech promises that we will be able to put trucks together into more efficient groups of, say, 10 trucks that will follow each other super closely, communicate back and forth, and save a lot of fuel. Self-driving boosters have sold us this vision for a long time – if only our artificial intelligence were able to deliver on this lofty promise of efficiency.

Cory notes that what this lofty promise actually delivers is nothing more than a “shitty, failure-prone train.” I think you could add “dangerous” to the list of special properties these autonomous road trains will have. This much is fairly obvious.

The piece that had escaped me is that of course these will come, and of course we cannot allow them to be dangerous. So, because we will be able to gang autonomous trucks together into dangerous, shitty road trains, we will have to reconfigure highways so the shitty trains have their own lanes, special signage, and other design elements that will keep them from killing the humans on the same highway.

The “Aha!” for me is that even shitty technology brings with it obligations, things you must do and will no longer be able to do. I had previously thought of this as a benefit/cost calculation – some good, some bad.

The wheelbarrow dramatically reduces toil, but it increases the minimum capital investment to be a mason. The atlatl makes individual hunters and, especially, bands of hunters much more efficient, but it also leads to species extinctions everywhere humans show up. That’s real bad for Stone Age megafauna, but if you’re a hungry Stone Age hunter, it is definitely good.

But Shitty Train AI is a thing that even in its promise is mostly bad, and yet it still entails obligations and losses. The promise is that the freight company will no longer have to hire drivers for this one relatively limited piece of the supply chain, where a driver shortage exists only because the vast majority of drivers with commercial driver’s licenses refuse to work under the current, exploitative conditions. Nevertheless, we (that is, you and I) will now have to allocate even more land to roads and will no longer be able to use that lane or that land for something else.

Shitty Train AI is for the most part just another case of the benefit concentrating in the hands of the few (the freight company), with some small savings to consumers, but without the freight company needing to pay for the externalities. Socialized costs and privatized profits. Will they fund the extra lanes, including all the costs of the externalities?

Yes, that’s obviously a rhetorical question. Little pink houses for you and me.

Shitty Train AI is a specific version of this problem. It is the version where “technology wants” to make the roads efficient (according to the promise), but even the promise itself enshittifies the train while the accompanying obligation enshittifies the road or the green space along the road or, even worse, the neighborhood along the road.

Like Andy Crouch and Cory Doctorow, I’m a fan of technology – computers, clean drinking water, central heat, high-performance climbing ropes and protection, superlight tents, vaccines, telephones that people actually talk on, wheelbarrows and wheels. I love it all. I even think that some of the consequential things AI has already done, like dramatically increase the rate at which scientists can figure out protein folding, will result in material improvements for humanity.

In the past, I might have asked whether The New Thing offers benefits that outweigh costs or whether The New Thing has any benefits at all. For at least a couple of decades, I have used the term “cross-grade” to describe many new software versions since often what is lost equals or outweighs what is gained.

Now, I find myself asking a crystallizing question: Is this Shitty Train AI? In other words, is the promise itself that The New Thing will make things worse? Is the benefit/cost analysis a shell game?

We are told there is a benefit under one of the shells and we try to follow the sleight of hand and keep our eye on the benefit shell. But when we lift the shell to find the benefit, there is nothing there. The Shitty Train AI is hiding under one of the cost shells and the destroyed neighborhood next to the highway is hiding under the other shell.
