This essay is part of The State of Responsible IoT 2019 report, edited by Andrea Krajewski and Max Krüger and produced by ThingsCon. The authors were asked to think about recommendations for “escapes from surveillance capitalism”, hence the three recommendations at the end. Published under Creative Commons (attribution/non-commercial/share-alike: CC BY-NC-SA).
I am trying to finish writing this thought, but my phone keeps buzzing, helpfully informing me that another email has arrived, a new tweet was posted, there was a response to my Facebook post… This never-ending stream of helpfulness from all things digital can, of course, be turned off, deliberately silenced with yet another set of applications, which will themselves periodically remind me to turn them back on. Ours is the world of constant reminding, of making sure we do not miss that important item, message, happening. Human memory is fallible and needs support, but what about that little red number attached to the icon of my email program, where unread emails now number in the thousands? I am too exhausted to feel guilty about that anymore. Yet so much of my technology is there reminding me that I haven’t spent enough time with it lately – my Fitbit is languishing and Facebook would really like me to “re-engage”.
In 1971, Herbert Simon famously said that “…in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” Most designers know the maxim that attention is the scarce human resource, and yet here we are – having created a monstrous system that continuously competes for our attention in the most predatory and aggressive way possible. Sight, sound, haptics (how many different vibration patterns can be used to remind us?), temperature changes – we haven’t yet gotten to smell and taste, but there have been some attempts. Where did this all start and how did we end up here?
Shoshana Zuboff argues that the drive to entice ever greater levels of engagement with technology comes from surveillance capitalism’s move to commodify digitized private human experience. This explains why Facebook, Slack and their ilk attempt to colonize ever more aspects of life, focused as they are on constantly increasing engagement. Zuboff’s point is that none of this is ordained by digital technology, nothing about it is specific to the digital – “what people invented can be uninvented,” she said in one of her public talks – if we could only strip out the economic logic of trading on predictions of human behavior, we could have the positives of technology without the negatives. Certainly the fortunes of many companies trading on human attention rise and fall with their performance on various engagement metrics, but I disagree with Zuboff that this is mostly a regulation problem. While regulation as a route to a better digital future is important, I believe there is plenty to consider from a design point of view as well.
The attention economy is central not only to the business enterprise but to the logic of interaction design in general. In some ways, the purpose of interaction design is to create engagement – to excite, to entice and to entrance – to create an aesthetic and engaging experience. While engaging technology is important, what we tend to forget is to design for disengagement as well. All technology has to fit into the rhythms of life, and I don’t want my email to colonize all of my time (although my email clearly has different ideas). In our research, my student Nanna Gorm and I show pretty convincingly that people use some technologies episodically, not continuously (as their designers intend). Although Nanna focused on health tracking, I think this is true of almost any technology. Whether counting steps or checking Facebook, life intervenes and dictates its own rhythms of use. Research by O’Brien and Toms from a decade ago suggested that engagement is cyclical – people engage, disengage and re-engage – and that supporting the full cycle is key. Disengagement is simply part of how we engage with technology.
Somehow the disengagement part of the cycle has slipped our minds, and we do not design for it. After all, designing for disengagement seems counter-intuitive – don’t you want to keep as many people as possible using your service for as long as possible? This creates a kind of pathological cycle: designers create ever more compelling, effortlessly usable and enticing technologies; people use them and find it difficult to disconnect, leading to claims of addiction and problematic use of various kinds. A cottage industry of apps and software has emerged to help all of us with an apparent problem of self-control, to empower us (of course) so we can finally limit or temporarily stop using these technologies. Yet coming back after a period of abstaining can overwhelm (Facebook, with its flood of missed and new material) or dishearten (Fitbit, with its glaringly “empty” days that ruin all the averages and achievements). These technologies are designed to prevent disengagement, leveraging guilt and fear of missing out to make us stick with them.
The argument is that disengaging from technology is simply a matter of self-control. Yet self-control isn’t just something we have or do not have – it is a resource, and it can be easily depleted. Exercising self-control takes energy: not only is our capacity for it shaped by our social and economic backgrounds, it also requires expending cognitive effort. This means that the more self-control we have to exercise, the more mentally tired we get, making further exercise of self-control much harder. No wonder I am constantly feeling exhausted! If none of our high-engagement technologies are designed to let us gracefully disengage and then re-engage without the overwhelming feeling of having missed out on all things important, then we are designing in the very pathologies that create opportunities for what Zuboff calls surveillance capitalism.
I want a pause button, but not the kind I have right now, where I can turn something off only to be overwhelmed upon return. Our current choices are typically binary – use or not use. There are, of course, options where use is concerned: I can filter my content, set some boundaries, even modulate just how engaged I want to be. But there are so many options, settings and dashboards that I don’t have the time to think about how to adjust and manage each one. Besides, very few of them facilitate disconnection with grace or allow me to retain dignity in coming back to the technology. Perhaps Facebook could provide a short recap of things I have previously marked important, or a digest for when I come back? And only when I come back, instead of emailing me “your friend has made an update, come see what it is!” with ever-increasing frequency while I am away. The pause button should respect my choice to step away and accommodate calm re-engagement without pressuring me to return. The health tracking algorithm needs to be able to deal with missing data without guilt-inducing signaling of empty days and ruined averages. The statement “I forgot my Fitbit, so my steps do not count” should not seem funny because it is a little too close to true; it should be incomprehensible!
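To make the point about ruined averages concrete, here is a minimal sketch (in Python, with an entirely hypothetical step log and function names – none of this is any tracker’s actual code) of the difference between an averaging routine that treats a paused day as a zero and one that treats it as a genuine gap in the data:

```python
from datetime import date
from typing import Dict, Optional

# Hypothetical daily step log. None marks a day with no data at all
# (tracker forgotten, use paused) -- distinct from a recorded zero.
step_log: Dict[date, Optional[int]] = {
    date(2019, 9, 1): 8200,
    date(2019, 9, 2): None,   # paused: no data recorded
    date(2019, 9, 3): 10400,
    date(2019, 9, 4): None,   # paused
    date(2019, 9, 5): 7600,
}

def naive_average(log: Dict[date, Optional[int]]) -> float:
    """Counts every calendar day, so paused days act as zeros
    and drag the number down -- the 'ruined average'."""
    return sum(steps or 0 for steps in log.values()) / len(log)

def pause_aware_average(log: Dict[date, Optional[int]]) -> Optional[float]:
    """Averages only over days that actually have data, so a pause
    leaves a gap instead of a guilt-inducing dent in the numbers."""
    recorded = [steps for steps in log.values() if steps is not None]
    if not recorded:
        return None  # nothing recorded yet: no average, no verdict
    return sum(recorded) / len(recorded)

print(f"naive average:       {naive_average(step_log):.0f} steps/day")       # 5240
print(f"pause-aware average: {pause_aware_average(step_log):.0f} steps/day")  # 8733
```

The design choice here is tiny (an unrecorded day simply contributes nothing to the denominator), but it is exactly the kind of choice that determines whether a returning user is shown a gap or a failure.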
I want a pause button that serves as a mechanism to limit and challenge the dominance of platforms that want to read our lives as a digital text, made transparent to the alien vision of machines. Perhaps this is not a recipe for drastic change in surveillance capitalism, but a commitment to design for episodic use and disengagement is a commitment to treating users decently and with dignity. It is also a way to design for holes in the data and for a different kind of relationship with technology. It is a small step, but we all need to start somewhere.
So I want a pause button – a well-designed way of halting my engagement with technology, with no drama attached.
References:
O’Brien, H. L., & Toms, E. G. (2008). What is user engagement? A conceptual framework for defining user engagement with technology. Journal of the American Society for Information Science and Technology, 59(6), 938–955.
Simon, H. A. (1971). “Designing Organizations for an Information-Rich World.” In: Martin Greenberger (ed.), Computers, Communication, and the Public Interest. Baltimore, MD: The Johns Hopkins Press, pp. 40–41.
Zuboff, S. (2019). The Age of Surveillance Capitalism. Talk at Copenhagen Business School, Copenhagen, Denmark, September 30, 2019.