By Jan Michael Bauer
◦ 2 min read ◦
Even before COVID, people were spending more and more time online. Mobile devices in particular have become a large part of our daily routines, and for many of us there are few moments when the phone is not within direct reach. While studies have shown that even teenagers think they waste too much time online, surprisingly little is done to stop this trend.
But how did we get here? Several dovetailing factors enabled this development and give me little hope that the trend will slow down any time soon. Technological advances in mobile internet and device components were necessary conditions, allowing easy and enjoyable interaction with platforms and services at all times and places. But the real champions of compulsive internet use are social and data scientists, driven by monetary incentives and unrestrained by any proper ethics training.
Despite frequent regrets about the many hours wasted on the internet, people struggle with self-regulation, and apps like “RescueTime”, whose sole purpose is to block their users from other apps, are becoming increasingly popular.
While internet addiction has not been officially recognized as a disorder by the WHO, close parallels can be drawn to the officially acknowledged gaming and gambling addictions.
And this is certainly no coincidence, as tech companies hire psychologists and designers to make their products and services as tempting as possible, frequently borrowing elements from the gambling industry. However, while some tweaks based on the knowledge of capable social scientists will increase user engagement, much more can be learned about consumer behavior, and how to manipulate it, through the application of the scientific method itself. Experimentation, the collection of big user data, and the application of machine learning algorithms are the big guns in the fight for users’ attention and money.
All these efforts serve to make social media more “engaging” and, ultimately, sales and advertising campaigns more effective. To this end, user interfaces and features are explicitly designed to grab attention and contain what have been termed “dark patterns”: design elements that tap into subconscious decision-making processes and manipulate users through purposefully curated interfaces. While such practices benefit the company, they can have detrimental effects on individuals and society as a whole.
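To make this concrete, here is a minimal, purely illustrative sketch of how such a dark-pattern consent dialog might be coded. All copy, element names, and styling are invented for the example; they are not taken from any real site or from our study.

```typescript
// Illustrative sketch of a "dark pattern" cookie dialog (hypothetical example).
// The accepting path is one bright, prominent button, while declining is hidden
// behind a second screen of pre-ticked toggles the user must untick one by one,
// exploiting defaults and friction.

function buildConsentDialog(): HTMLElement {
  const dialog = document.createElement("div");
  dialog.innerHTML = `
    <p>We value your privacy.</p>
    <button id="accept-all" style="background:#2b7cff;color:#fff;font-size:1.2em;">
      Accept all
    </button>
    <a id="manage" href="#" style="font-size:0.7em;color:#999;">Manage settings</a>
  `;

  // "Manage settings" leads to a second step in which every purpose is already
  // checked: most users keep whatever is pre-selected (the default effect).
  dialog.querySelector("#manage")!.addEventListener("click", () => {
    const purposes = ["Analytics", "Personalised ads", "Data sharing with partners"];
    dialog.innerHTML =
      purposes
        .map(p => `<label><input type="checkbox" checked> ${p}</label><br>`)
        .join("") + `<button id="confirm">Confirm choices</button>`;
  });

  return dialog;
}

document.body.appendChild(buildConsentDialog());
```

The asymmetry between the one-click “Accept all” button and the buried, pre-ticked settings screen illustrates two of the most commonly documented patterns: defaults and added friction on the privacy-protective path.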
We know that individual choices reflect individual preferences only under certain conditions, including the absence of deceptive choice architecture or marketing messages. Hence, I cannot stop wondering about the opportunity costs and side effects of these miraculous little devices in our pockets, which have grown into an ugly hybrid between a snake oil salesman and a one-armed bandit.
We have free markets based on the belief that they create value for society and make people better off by efficiently satisfying their needs. The recent U.S. opioid scandal has shown that, for some products, sellers’ profits may not be positively related to consumer value. It certainly gives me pause that the best offline equivalent to the “RescueTime” app is probably the Betty Ford Clinic.
We face many pressing issues that require our full attention. Yet people are increasingly plagued by credit card debt, the planet is suffering from overconsumption, and humanity has spent an estimated 30,000 years watching Gangnam Style on YouTube alone.
As for the larger point that any effort against these trends would hurt innovation, jobs, and growth: let us take a step back and note that the Western world has made it an imperative to secure individual property rights and outlaw the use of violence, with the explicit goal of increasing investment and productivity. People can simply do more good when they do not have to spend time protecting their property and family. Given our current technology and what we know from the behavioral sciences, I think we have seen enough and should start treating distraction and manipulation as similar threats to human flourishing.
So, what could we do? In the short run, we need to find ways to reduce the stream of big data feeding these efforts, force these practices out into the open, raise awareness about their use and effects, and find effective regulation that limits manipulation in a dynamic attention economy. In the long run, we probably need to go beyond such patches, as these issues not only hurt individual lives and careers but also erode the fabric of our democracy.
Further reading
We recently published a paper showing how users can be manipulated through dark patterns to provide more data:
Bauer, J. M., Bergstrøm, R., &amp; Foss-Madsen, R. (2021). Are you sure, you want a cookie? The effects of choice architecture on users’ decisions about sharing private online data. Computers in Human Behavior.
About the Author
Jan Michael Bauer is Associate Professor at Copenhagen Business School and part of the Consumer & Behavioural Insights Group at CBS Sustainability. His research interests are in the fields of sustainability, consumer behavior and decision-making.
Source: photo by ROBIN WORRALL on Unsplash