The Collective
This article appears in our first print issue, Pattern Machines.
The on-demand delivery technology boom, the latest spawn of the gestational revolutions of the app-driven gig economy, is in full swing. News outlets and commentators large and small, broad-based and technology-focused, began speculating as early as 2015 that “Uber for everything” was on the horizon. Now, in 2019, media giants no less significant than CBS have made it official, proclaiming “The next big retail push: Faster delivery of everything.” Those of us living in major metropolitan areas—especially the heart of the technology sector that is the San Francisco Bay Area and the center of finance and commerce that is greater New York City—already have access to a veritable bestiary of on-demand services, ranging from the banal (e.g., food delivery) to the novel, niche, and obscure (e.g., on-demand dog walking, on-demand cuddling, and on-demand space burial).
In between these polarities exists a range of services, many of which, while often ridiculous, are generally harmless and even very helpful. Within this subset are also those that, while appearing useful on the surface, pose significant ethical dilemmas beyond the well-worn issue of reprehensible gig economy labor practices. In the worst instances, they present serious and pernicious public health risks.
A prime example of this subset: on-demand alcohol delivery applications. A broader view might include any application that provides on-demand delivery of substances that have addictive qualities or adverse health effects, ranging from on-demand cannabis to on-demand fast food. Yet cannabis—while capable of fostering dependence—cannot result in overdose, and fast food—while definitively bad for your health—cannot result in life-threatening withdrawal. Alcohol, despite its central position in American culture, is by far our deadliest drug, even in the time of a rampant and lethal opioid epidemic. Consequently, it is alcohol delivery apps in particular that are of grave concern to public health.
The nature of the goods sold is only part of what makes the rise of on-demand alcohol apps so worrying. The risks inherent in alcohol consumption, and the costly societal price we pay for it, are neither new nor the primary concern here; we already collectively subsidize the irresponsible production, cultural fetishization, and overconsumption of alcohol. It is the very mechanisms through which a user orders alcohol on demand—most often from their smartphone—that threaten to exacerbate the already massive cost of addiction.
A central product feature and operational element of on-demand delivery applications is the use of “push notifications,” or any message that pops up unbidden on a mobile device to alert the user in some way. Pervasive data collection and the tracking of user behavior feed back into the algorithms which govern the content and cadence of the push notifications that a given user receives. No matter what service or good is being provided by an on-demand application, push notifications inevitably come into play.
Some are essential for their obvious usefulness: an alert that your ride has just pulled up, an alert about an impending delivery, an alert reminding you of an upcoming appointment. But push notifications can also be detrimental to your well-being and functioning, acting as an unwanted distraction at best and, at worst, providing a stimulus that over time can breed addictive behavior. Given this, we must continually ask ourselves whether we truly understand the ethical implications of our creations, and inscribe the practice of ethical framing and decision-making into our development processes.
In the wake of Natasha Schull’s groundbreaking book Addiction by Design, an “empirically rigorous examination of users, designers, and objects that deepens practical and philosophical questions about the capacities of [users] interacting with machines designed to entrance them” (Can Objects Be Evil?, Laura Noren, 2012), technologists, scholars, and the media alike have grappled with the question of whether smartphone addiction is a real phenomenon. More than half a decade after Noren’s article was published, internet gaming disorder now appears in the official Diagnostic and Statistical Manual of Mental Disorders as a condition warranting further study, but smartphone addiction still lingers in categorical limbo. Nevertheless, it is a concept widely discussed in both academic circles and the news media, and popular opinion is beginning to coalesce around the view that the addictive quality of our smartphones and the countless apps that inhabit them has always been an intentional design choice. The evidence is hidden in plain sight—most notably in the pages of the addiction-building business playbook Hooked: How to Build Habit-Forming Products by Nir Eyal, a tech entrepreneur who has taught at Stanford’s Graduate School of Business. Even if such adverse effects weren’t intended by its author, the book has contributed to the spread of a heinous and exploitative approach to user psychology and experience design.
Alcoholism and alcohol abuse are already serious problems that are poised to be exacerbated by becoming horrific bedfellows with digital addictions. If misery loves company, so does addiction; drug abuse is shockingly comorbid with mental illness of all kinds. On-demand delivery apps, which learn from user behavior and use push notifications to nudge their users towards a purchase, provide unprecedented and near-uninhibited access to one of the few drugs that can precipitate deadly withdrawal symptoms.
This is not to shame adults who wish to drink responsibly, those who’ve faced addiction, or those who use such apps out of concern for the risk of vehicular accidents. Nor is it a call to ban alcohol delivery. It is, rather, a call to acknowledge unsettling facts that warrant caution at the intersection of data collection, targeted push notifications, digital addiction, and substance abuse disorders.
We gave a few of these apps a try on our personal phones, and they are by no means equal in terms of usefulness or maliciousness. But one feature is relatively common among them: push—or, should we say, pusherman—notifications for various “deals,” “offers,” and “discounts.” Reminders to drink. Reminders to buy more alcohol. If this seems analogous to the typical targeted marketing strategies of online retailers, it is. But to us it feels particularly heinous that, in this case, these technological sales tactics are being used to sell something we all know can be deeply harmful. Imagine if these communications came from a person rather than a faceless corporate app. The picture becomes even more appalling: a bar owner who knows just when to call a friend who is trying to quit, or, more blatantly, a drug dealer selling a client a little too much one too many times.
This certainly paints an extreme picture, and that is by all means intentional. We must take worst-case scenarios seriously if we are to earnestly consider the ethics of our technological creations. Alcohol leads to the deaths of over 80,000 Americans every year; one in ten deaths among working-age adults is related to alcohol. Anyone who has known the face of addiction in their family or among friends has felt its impact firsthand and seen the shame and danger it can impose upon its victims. A deliver-to-your-door button holds both safety and danger: delivery may improve road safety by reducing the rate of alcohol-related vehicular incidents, but it also allows alcoholics to elude the social shame of being denied by sales clerks or bartenders, potentially deepening their addiction and dependence in the process.
Do the individuals delivering alcohol in the gig economy have any moral role, right, or responsibility to deny delivery to a customer who is quite obviously beyond their limit? Perhaps in an ideal world. Yet rating systems that fail to protect workers in such situations, along with the inherent precarity of gig-economy work, act as clear disincentives for employees caught in ethical dilemmas.
Moreover, on-demand delivery is often so impersonal an interaction, in an environment in which such interactions are so normalized, that individual workers may have little opportunity to make such a judgment in the first place.
“Amazon’s ‘Everything Store’ converges with just plain everything and, being ubiquitous, becomes invisible” (The Constant Consumer, Drew Austin, 2018). Although the alcohol delivery market is relatively niche compared to giants like Amazon, it is part of a larger trend: one that is individuating commerce and expanding commodification via the now-common combination of gig workers, mapping and routing software, and on-demand marketplace smartphone apps. The companies behind these apps are not only operating in a deliver-everything environment but are generally newer and modestly sized, which helps keep them out of the critical eye of journalists and regulators.
For all intents and purposes, the “app economy” is largely a self-regulated space which is divided into competing marketplaces that are managed by three primary corporate gatekeepers: Apple’s App Store, the Google Play store, and the Microsoft Store.
Each of these stores is governed by Terms of Service that apply to both users and developers, yet the degree to which these terms protect users is up for debate, as is whether the terms are effectively enforced. For example, it is unclear whether alcohol delivery apps such as Saucey may be violating Apple’s Developer ToS regarding commercial use of push notifications:
1.4 Physical Harm
If your app behaves in a way that risks physical harm, we may reject it. For example:
1.4.3 Apps that encourage consumption of tobacco products, illegal drugs, or excessive amounts of alcohol are not permitted on the App Store. Apps that encourage minors to consume any of these substances will be rejected. Facilitating the sale of marijuana, tobacco, or controlled substances (except for licensed pharmacies) isn’t allowed.
3.2 Other Business Model Issues
The lists below are not exhaustive, and your submission may trigger a change or update to our policies, but here are some additional do’s and don’ts to keep in mind:
3.2.2.(ii): Monetizing built-in capabilities provided by the hardware or operating system, such as Push Notifications, the camera, or the gyroscope; or Apple services, such as Apple Music access or iCloud storage.
Given that the top ten percent of drinkers—those who average more than ten drinks a day—account for well over 50% of all alcohol sales, we must ask hard questions about the ethics of apps that encourage alcohol purchases by their most profitable customers: alcoholics. While it is difficult to predict the extent to which alcohol delivery apps will create new addicts or push-notify recovering alcoholics off the wagon, their dangerous potential is clear. App marketplace gatekeepers and public health officials alike would be well served to pay attention to such a potent cocktail. ♦