Consumer Manipulation
I have been in contact with a Dutch graduate student in a sustainable design program. Recently he directed me to the work of B. J. Fogg, a Stanford University researcher who directs the Persuasive Technology Lab there. I visited the lab’s web site and downloaded a few articles by Fogg to learn what was meant by “persuasive technology.” To my dismay, I found that it meant just what the name conveys. Fogg begins a short [paper](http://captology.stanford.edu/resources/thoughts-on-persuasive-technology.html) on his thoughts about the subject with:
> The world of technology has changed dramatically in twenty years. In 1993, I went to Stanford University to study how to automate persuasion. As a doctoral student, working with Cliff Nass and Byron Reeves, I was trained as an experimental psychologist. I had one question I wanted to answer: How could you computerize persuasion? In other words, how could you use the power of computers to change what people believed, how they behaved, and how could this then be applied to improve the world in some way?
At first glance, this all seems fabulous. Improve the world by changing what people believe and how they behave through the magic of computer technology. Shades of 1984 and Big Brother. Too good to be true. Yes! In another of Fogg’s [articles](http://www.thersa.org/fellowship/journal/archive/summer-2009/features/new-rules-of-persuasion), I read:
> Companies and individuals alike are writing software code that renders persuasive experiences, influencing what we think and do. Never before has the ability to automate persuasion been in the hands of so many. With Cliff and others at Stanford, we performed the first scientific experiments to show how computers could influence people in predictable ways. We published our results in academic journals. In today’s world, people don’t need scientific data to believe computing technology influences people. From websites to mobile phone software, the evidence is all around us now. Persuasive technology is part of our ordinary experience.
>
> - The fuel gauge in the Toyota Prius tells drivers how efficiently they are using gasoline. You can ask almost any Prius owner to hear stories of how they change their driving to get more miles to the gallon.
> - In the living room many families use Nintendo Wii to make physical activity more engaging. The persuasive point of this device is “get up off the couch and move!”
> - In the online world, we meet persuasion attempts at every click. In fact, virtually every website has a persuasive purpose: the creators intend to affect user attitudes or behaviours in some way: sign up for our service, tell a friend about this video, enter your email address.
> Facebook is perhaps the most successful example of persuasive technology to date, with more than 200 million people joining the service over the past two years. Facebook has created a system, driven by software code, that persuades us to upload our picture and divulge our personal information. We invite friends and accept friend invitations. Many of us log in regularly as part of our daily ritual, part of our ordinary lives like brushing our teeth.
I was OK through the first two bullets. Fogg was referring to a design methodology I had first met while teaching in the Netherlands. There I came across work by Jaap Jelsma, who called it behavior-steering design (not as ominous sounding as persuasive technology). I picked this up as an important way to use objects to guide behavior and, through them, to change beliefs. The French sociologist Bruno Latour noted that such designs have been around for a long time. The massive hotel key used in many European hotels makes it very difficult to forget to return it before exiting. Speed bumps force one to pay attention at dangerous crossings and may instill new, more concernful driving habits and the new hidden beliefs that govern them.
I was quite taken then by an object new to me: the two-button toilet. This feature has become more common today, but at the time (2001) I had never seen it in the US. It’s simple; instead of a single lever or button, there are two, one larger than the other, in proportion to the amount of water used in the flush. The choice is obvious. The first time one comes across this feature, it is impossible to mindlessly flush. The radically different design puts the brain in gear as one tries to figure out what is going on. It’s not a difficult puzzle to solve, but it begins a process of attentiveness to the amount of water used, and it has the ultimate potential to change one’s conservation awareness and practices.
The key to Jelsma’s and related theories is to encode messages in the design that break the mindlessness of routine habits. Only with such breakdowns or interruptions can one learn in a mindful sense. Even with a number of very interesting and positive examples, Jelsma and others were quick to note that an important ethical issue was involved. Who should decide what messages are to be encoded? Should engineers and designers with no particular authority or legitimacy to dictate “right” behaviors have a license to design such persuasive technology?
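To make that encoding idea concrete in software terms, here is a minimal, purely hypothetical sketch, mine rather than Jelsma’s or Fogg’s, and with invented numbers, of a two-button-toilet analogue in Python: the program interrupts the routine action by putting the resource cost in front of the user at the moment of choice.

```python
# A purely illustrative eco-feedback sketch: surface the resource cost at the
# moment of action so the routine choice can no longer be made mindlessly.
# The scenario and the liter figures are invented for illustration.

FULL_FLUSH_LITERS = 6.0
HALF_FLUSH_LITERS = 3.0

def flush_prompt(liters_used_today: float) -> float:
    """Show the running total, then make the user choose a flush size."""
    print(f"Water used so far today: {liters_used_today:.1f} L")
    choice = input("Flush: [f]ull (6 L) or [h]alf (3 L)? ").strip().lower()
    liters = HALF_FLUSH_LITERS if choice == "h" else FULL_FLUSH_LITERS
    return liters_used_today + liters

if __name__ == "__main__":
    total = 0.0
    total = flush_prompt(total)  # each call interrupts the habit with feedback
    print(f"New total: {total:.1f} L")
```

The design choice is the same as the toilet’s: it is the interruption, not the data itself, that breaks the mindless routine.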
I have not done an extensive search of Fogg’s work; I base what follows on the two short articles and some additional web searching. When I got to the third bullet above and the following mention of Facebook, I started getting more concerned with what I was reading. Vance Packard wrote The Hidden Persuaders in 1957, exposing the then-new techniques of manipulation in advertising that had grown out of innovative psychological motivation research. He and many who read his book were concerned about the moral implications of these new practices. How innocent we were! Today it is taken for granted, either consciously or unconsciously, that we are surrounded by media designed to pull us toward consuming this or that.
Fogg’s work, as he says, takes us a step further and adds an important new channel for manipulation (a more accurate word for what is happening than persuasion): the computer in all its forms, from desktops to tablets and smart phones. Fogg comes across to me, based on what I see on the web, as a proud promoter of the work he has done, and he may be using language more as a promotional vehicle than as a precise description of that work. But he comes across as quite flip about it. He notes that some people, on first seeing what computers could do to persuade, thought it was the devil’s work. I think it might be, at least in part.
My concerns are twofold. One is the concern Jelsma and others raised about the dangers of the designer or engineer using the methodology for immoral or improper purposes. This concern is amplified here relative to the kinds of objects involved in their work, which take special skills and training to design. Fogg emphasizes how easily people can learn the techniques of manipulation through computer programming, and how easy it is to apply them in the open structure of the internet. The ability to establish an ethical or moral baseline and to enforce it is very problematic, and I find this worrisome. Experience with the internet argues that someone will always use it for their own personal advantage.
The second is more tied to my particular interest in sustainability, which I define as the possibility of flourishing. Flourishing is simply a description of living as a fully developed human being. An important part of being human in this sense is living authentically. Authentic living is acting out of a set of cares that come from one’s own self; in psychological jargon, such action can be said to be self-motivated. It is easier to define its opposite, inauthentic behavior. Inauthentic behavior springs from responses to the voice of the outside culture or, in terms of this post, from acting out of someone else’s persuasiveness; that is, it is externally motivated.
Inauthentic behavior tends to leave a sense of emptiness that requires more of the same, exactly what Fogg commends Facebook for doing. Tonight on the ABC news, I watched a segment highlighting a rapidly growing illness: addiction to computer games, not just video games, but games like “Words with Friends” and farming games, and not just among kids (that’s old news by now) but among adults. Fogg and his acolytes are indeed successful, perhaps much too successful. While not yet a recognized mental illness, a computer game addiction counseling profession has already sprung up. Yesterday it was news that soon we would be eating Frankenfish secretly raised in Panama. Today, it’s addiction to computer games. I’m almost afraid to watch the news tomorrow.

2 Replies to “Persuasive Pernicious Technology”

  1. Dear John,
    Thanks for your post. I share your ethical concerns about persuasive technology. There is an ongoing discussion on the ethics of persuasive technology similar to the discussion on behavior-steering.
    You make a distinction between persuasive technology and behavior-steering technology, saying that persuasive technology is a form of manipulation whereas behavior-steering sounds ‘less ominous’. I do not share this perspective; I think there is not much difference between the two. The term ‘behavior steering’ implies that behavior is steered, as if there were no underlying attitude or human character. No individual seems to make the decision here; the technology does that for him or her. It seems that autonomy is completely absent. This also sounds quite ominous to me. However, both notions make some distinction between possible strategies. Eco-feedback is proposed as an example by researchers in both fields, and eco-feedback isn’t that ominous at all in my view.
    An interesting framework by Tromp et al. does not distinguish the two, but sees persuasive technology as one form of behavior-steering technology. Next to the persuasive aspect, technologies can also have decisive, coercive, and seductive properties. These properties all differ in salience and force.
    (www.mitpressjournals.org/doi/abs/10.1162/DESI_a_00087)
    I guess that for both behavior-steering technology as Jelsma defined it and persuasive technology as Fogg defined it, the strategies can be applied in a good or a bad way. Coercion sounds bad, limiting our freedom and all. Yet a speed bump is a form of coercion that makes our roads a little safer. Persuasion through, for instance, eco-feedback does not sound bad to me; users of technologies can still make their own decisions. I think it all depends on the situation.
    About your notion of presencing: I would not frame your concept of ‘presencing by design’ in either of these fields (behavior-steering or persuasive technology), as I think it is quite unique in that it tries to make people reflect. This concerns human character and not merely behavior. However, I think it can somehow make use of the mechanisms and strategies defined in the two fields discussed above.

  2. John,
    I’m not surprised to read about persuasive technology. The most pernicious and insidious effect of this technology, in my view, is the corporatization of the human psyche. This technology is employed most grandly and effectively by those who have the most money to spend.
    You mention the nightly news. I spent the first two days of this week in jury duty with TVs blaring in the waiting room. I normally don’t watch much TV. But I was taken (and literally sickened) by how much news programs have been hijacked by commercial interests. Even holiday toy drives are co-branded with corporate sponsors. The newscasts end up being commercials for the sponsors. I think the persuasive technology you highlight in your post has (over time) numbed people into accepting, processing, and acting upon corporate messages to the point where many cultural norms are infused with them.
