
On persuasive design, social media* and behaviour change

*Why it’s not your fault that you lose all sense of time in the infinite scroll

Social media platforms have become an integral part of people’s lives worldwide, opening up new avenues to connect with friends, family, and online communities. However, from international media to the US Congress, the conversation about our health and our social media habits is gaining prominence. Additionally, an increasing body of academic research suggests that people are struggling to control their use of these platforms, leading to distraction, disempowerment and anxiety. This project exists to question the design features that got us here.

Excessive social app use has been framed as a health crisis

Although research concerning the cognitive impacts of social app use is growing, the results remain contradictory and inconclusive [2]. Not all use is created equal; certain apps, approaches, or user intentions moderate the relationship between user experience and behavioural impacts. Even so, some scientists compare this social phenomenon to early warnings about the obesity epidemic or the climate crisis in the 1970s [3].

Today, increasing data capture and algorithmic personalisation mean technology’s impact is accelerating far faster than humans’ ability to understand it [4]. This shift points to a growing knowledge imbalance and, thus, a power disparity between users and social media platforms.

By combining persuasive design with the ability to collect and process our data at higher speeds than ever before, platforms are building a future in which habit-forming technologies are designed to capture more and more of users’ attention.

“If you’re too friendly with the user, giving them exactly what they want in the moment, then you’re being unfriendly in helping them achieve their highest level desires and goals.”

Justin Rosenstein, who helped create the ‘Like’ button

Image: Cichosz, via Unsplash

The design philosophy behind social apps

Whilst habitual social media use has been framed as an individual’s problem, there is a growing community of technologists speaking up about the design philosophy behind these apps and popularising terms such as ‘persuasive design’.

Persuasive design calls into question the human value of autonomy. To set and accomplish goals, an individual must have control over their own time and attention.

Autonomy can be violated by design through deception (making it difficult to adjust app settings such as notifications or subscriptions), coercion (design features that make it difficult to quit, leave, or disengage), or manipulation (personalised content through emotional profiling). Particularly vulnerable to persuasive technology are groups such as children and those with anxiety disorders [2].

Not all designs that influence user behaviour are malicious. The same kinds of techniques can be leveraged to influence users in neutral or positive ways, as in fitness tracking apps or online education tools. However, in the attention economy, the value of each user’s attention is growing, and alongside Google, social apps are at the front of the race to monetise this precious resource.

Creating habit-forming technologies

The Persuasive Technology Lab at Stanford University, founded by behaviour scientist BJ Fogg in 2002, introduced models for influencing user behaviour that inspired a generation of technologists. Graduates of the lab include the founders of Facebook, LinkedIn, Snapchat & Instagram. Researching slot machine technology, cults, sleight-of-hand magic, and habit formation, graduates went on to make their products habit-forming for billions of users [14].

While new technologies inevitably shape our human experience, persuasive technology is distinct in that changing behaviour is its primary design goal.

 

A spotlight on Instagram’s UX

As part of my Master’s in UX Design, I conducted a short, focused study of persuasive design within the Instagram user experience. The study examines the impact of design patterns within Instagram on individuals’ everyday cognition. I decided to share these findings, alongside related industry research, to shed some light on the following questions:

Why do we lose all sense of time when scrolling?

Why do we lack self-control when it comes to social app screen time?

How is excessive social media use affecting our everyday cognition?

Where does Instagram fit into the discussion of persuasive design ethics?

Who?

Olivia Blue

Hi, I’m Olivia, a UX design student at Falmouth University and an advocate for human-first design.
/about

This study isn’t anti-social media; it acknowledges Instagram as a creative, vibrant hub and a tool for human connection. It aims to question technologies that use coercive design techniques and an imbalance of knowledge to nudge the behaviour of their users for financial gain.

The full academic paper can be accessed here.

Why Instagram?

With over 1 billion monthly active users worldwide [15], Instagram is one of the fastest-growing social platforms in history.

I interviewed individuals about their personal use of Instagram and their perception of how particular design features impact their behaviour. The participants are all UK-based young people who consider themselves habitual* users of the Instagram mobile app.

*The user engages with the technology repeatedly to pass the time, rather than with a specific task in mind.

I found a course that I’m doing just through Instagram, so now I get to meet once a week with these really interesting people that have a real impact in my life.

My partner makes fashion accessories for kids, pretty much all of her custom comes from Instagram. So she probably wouldn’t have her business without Instagram.

I actually really appreciate that my friend posted loads of our uni photos on Instagram, it’s nice to look back.

I’m quite a visual person, it is a visual platform and I can consume lots of nice visual content, I think that’s the bit that I probably get most pleasure from.

Key findings

90% of participants expressed discontent regarding their own habitual and ‘impulsive’ use of Instagram.

All participants used language indicating a loss of autonomy in the user experience.

40% of users interviewed reported consciously reducing their use of or deleting the app due to mental health concerns.

Those who valued Instagram still expressed discontent with how content was delivered and the amount of time they spent using the app.

The design techniques behind Instagram’s persuasive UX

Variable reward scheduling

Variable reward scheduling is a technique borrowed from the gambling industry. It’s often referred to as the slot-machine technique [5], because it keeps the appearance of ‘rewards’, such as exciting pieces of content, unpredictable. This anticipation of the unknown stimulates dopamine receptors [6], keeping users coming back time after time.
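
To make this concrete, here’s a minimal sketch (in TypeScript) of a variable-ratio schedule. The function name, batch size and 30% ‘reward’ probability are my own illustrative assumptions, not Instagram’s actual code; the point is simply that no single check of the feed has a predictable payoff.

```typescript
// Illustrative sketch of a variable-ratio reward schedule (not Instagram's code).
// Each check of the feed returns a batch of items; whether the batch contains
// an "exciting" item is left to chance, so no single check is predictable.

type Item = { id: number; exciting: boolean };

let nextId = 0;

function refreshFeed(rewardProbability = 0.3): Item[] {
  const batch: Item[] = [];
  for (let i = 0; i < 5; i++) {
    batch.push({ id: nextId++, exciting: Math.random() < rewardProbability });
  }
  return batch;
}

// Ten checks: some deliver a "reward", some deliver nothing much.
// The uncertainty itself is what keeps users coming back.
for (let check = 1; check <= 10; check++) {
  const rewarded = refreshFeed().some((item) => item.exciting);
  console.log(`Check ${check}: ${rewarded ? "exciting new content" : "nothing much"}`);
}
```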

“One of my lessons from infinite scroll: that optimizing something for ease-of-use does not mean best for the user or humanity.”

Aza Raskin, inventor of the infinite scroll

The pull-to-refresh feature is an example of this technique. The action takes little effort, and the anticipation of what fresh content might appear next keeps users engaged.

 

The infinite scroll is an ongoing loop of new, unpredictable content, stimulating dopamine receptors in the brain. This erodes the user’s natural stopping cues, as they are no longer encouraged to pause or to exit the platform. The ease of scrolling, rather than clicking ‘next’, allows users to enter a trance-like state.
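
Below is a rough sketch of how an infinite scroll is commonly wired up on the web, using the browser’s IntersectionObserver API. The element ids, the /api/feed endpoint and the response shape are hypothetical, and this is a generic web pattern rather than Instagram’s implementation; what matters is that nothing in the pattern ever says ‘stop’.

```typescript
// Generic sketch of an infinite scroll (a common web pattern, not Instagram's code).
// A hidden "sentinel" element sits at the bottom of the feed; whenever it
// scrolls into view, another page of content is appended. There is no
// built-in end, and therefore no natural stopping cue.

const feed = document.getElementById("feed")!;
const sentinel = document.getElementById("sentinel")!; // empty div placed after the feed

let page = 0;

async function loadMore(): Promise<void> {
  page += 1;
  // Hypothetical endpoint; in a real app this would return the next batch of posts.
  const response = await fetch(`/api/feed?page=${page}`);
  const posts: { id: string; html: string }[] = await response.json();

  for (const post of posts) {
    const card = document.createElement("article");
    card.innerHTML = post.html;
    feed.appendChild(card);
  }
}

// Fire loadMore every time the sentinel approaches the viewport.
// Nothing here ever calls observer.disconnect(): the feed keeps
// growing for as long as the user keeps scrolling.
const observer = new IntersectionObserver((entries) => {
  if (entries.some((entry) => entry.isIntersecting)) {
    void loadMore();
  }
});
observer.observe(sentinel);
```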

I knew the dangers of the infinite scroll but still infinitely scrolled. I have an understanding of these addictive mechanics. If anything, it made me feel more guilty, that I was wasting my time, getting another dopamine hit.

The reason that I’d spend those half an hour windows of just being totally vacant I would say is directly because of the infinite scroll.

 I think that it’s overwhelming, you sometimes lose your notion of time. Nowadays, I think it’s like one of the features that mostly makes you get lost in the social network. And I think it’s dangerous, in some ways.

That’s what hooked me [the infinite scroll], that’s why I deleted it. I guess that’s the whole point, you don’t actually notice it, it just becomes impulsive.

The design techniques behind Instagram’s persuasive UX

Personalised content

A sophisticated and ever-shifting combination of algorithms allows Instagram to predict the content you’re most likely to engage with [7]. This means your content feed could look entirely different from your friend’s, even if you were to follow exactly the same accounts.

Even developers at Instagram have only partial knowledge of how the algorithms work [8]. These algorithms are machine-learning systems: once the initial metrics are set, the artificial intelligence continues learning, producing an ever more efficient algorithm. Instagram’s personalised feeds mediate communication and knowledge acquisition, consequently influencing users’ experience and behaviour on the app.

Instagram says its feed algorithm is optimised towards content you’re most likely to interact with, based on the popularity of a post, your interactions with the account that posted it, and your likes on other posts [7].
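
As a simplified illustration, the sketch below ranks posts with a fixed weighted score built from those three signals. The field names, weights and formula are invented for this example; Instagram’s real ranking is a continuously retrained machine-learning system, not a hand-written formula.

```typescript
// Hypothetical sketch of engagement-based feed ranking. The three signals
// mirror those Instagram describes publicly [7]; the field names, weights,
// and scoring formula are invented for illustration only.

interface Post {
  id: string;
  authorId: string;
  likeCount: number; // signal 1: popularity of the post
}

interface UserProfile {
  interactionsWithAuthor: Record<string, number>; // signal 2: your history with each account
  topicAffinity: Record<string, number>;          // signal 3: inferred from your past likes
}

function scorePost(post: Post, topics: string[], user: UserProfile): number {
  const popularity = Math.log1p(post.likeCount);
  const familiarity = user.interactionsWithAuthor[post.authorId] ?? 0;
  const affinity = topics.reduce((sum, t) => sum + (user.topicAffinity[t] ?? 0), 0);

  // Arbitrary weights: the real system is a continuously retrained model,
  // not a fixed formula like this one.
  return 0.4 * popularity + 0.35 * familiarity + 0.25 * affinity;
}

// Rank candidate posts so the most "engaging" ones appear first.
function rankFeed(candidates: { post: Post; topics: string[] }[], user: UserProfile): Post[] {
  return candidates
    .map(({ post, topics }) => ({ post, score: scorePost(post, topics, user) }))
    .sort((a, b) => b.score - a.score)
    .map(({ post }) => post);
}
```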

 

Wide-reaching platforms become more efficient as they gather more diverse user data across the app and from external partners. The more time users spend, the more fine-tuned the algorithmic predictions become, and thus the more engaging the feed becomes.

I feel like the algorithm can block meaningful use, because you’ve suddenly got adverts, I feel like it’s controlling what I’m seeing. I feel like I do actually miss certain posts that I would be interested in

I’d not even be able to tell you what I was looking at. I’d just be observing whatever the content algorithm served to me, and that was partly why I wanted to get away from it.

It makes meaningful content easily accessible. But I think the whole thing [algorithm] is designed to hook you in, the way it’s been designed is literally just to keep us on there, so that almost takes away from the meaningfulness.

Personalisation was interesting, but at the same time, you know, a little too much for me, it wasn’t healthy. It was about getting my attention all the time. So it worked well, for their [Instagram’s] own  interests.

Conclusion

This study indicates that habit-driven user experiences on Instagram led users to feel a loss of autonomy*. Participants reported that design patterns eroding natural stopping cues and personalised content feeds were prominent factors influencing their use of the app. All users interviewed wanted to reduce time spent on the app and expressed difficulty controlling their use.

The overarching sentiment is that the content itself is enjoyable, but its delivery is harmful. Participants related that Instagram’s UX has become more ‘manipulative’ over time, and several identified the growing power of data capture to personalise their feeds and direct their attention.

*The Stanford Encyclopedia of Philosophy defines cognitive autonomy as “the freedom to be the person one wants to be and pursue one’s own goals without unjustifiable hindrances or interference” [10].

“As technology companies set the standard for what other applications must do to remain competitive, they are increasingly finding that their economic value is synonymous with the strength of the habits they create.”

Tristan Harris, former Google design ethicist

What now?

Learning more about persuasive design and technology may help users regain autonomy

It’s essential to recognise that the anxieties associated with increased social media use don’t exist in isolation. Such feelings and behaviours may also arise from intersecting social and political contexts.

While designers and technologists do not have complete control, they do have a responsibility to consider and plan for the unintended consequences of their work.

Several investigations indicate that knowledge of design techniques can increase autonomy in the user experience [13]. Users have the right to choice and transparency so they can make informed decisions.

Whilst the responsibility for transforming the social media ecosystem into a safer place lies with technology companies and Congress, I believe that, in the short term, reducing the information asymmetry between users and social platforms can help users regain agency and control over their use.

The future of humane tech

There is a growing movement concerned with the ethical responsibility of technology platforms to examine how they persuade users, most notably the Centre for Humane Technology, which gained prominence through the Netflix documentary ‘The Social Dilemma’.

Several ex-big tech employees have launched organisations that research and challenge the monopoly of the leading platforms, and advocate for innovations that prioritise human wellbeing and equality.

Learn more

Get to know the organisations striving for a more humane future in the areas of social tech and AI.


References

[1] LUNDAHL, Outi. 2021. ‘Media Framing of Social Media Addiction in the UK and the US’. International journal of consumer studies .

[2] WILMER, Henry H., Lauren E. SHERMAN and Jason M. CHEIN. 2017. ‘Smartphones and Cognition: A Review of Research Exploring the Links between Mobile Technology Habits and Cognitive Functioning’. Frontiers in psychology.

[3] HARI, Johann. 2022. ‘Your Attention Didn’t Collapse. It Was Stolen’. The Guardian, 2 Jan [online]. Available at: https://www.theguardian.com/science/2022/jan/02/attenton-span-focus-screens-apps-smartphones-social-media.

[4] HARRIS, Tristan. 2021. ‘The Attention Economy’. The Centre For Humane Technology [online]. Available at: https://www.humanetech.com/youth/the-attention-economy .

[5] SCHÜLL, Natasha Dow and Caitlin ZALOOM. 2011. ‘The Shortsighted Brain: Neuroeconomics and the Governance of Choice in Time’. Social studies of science.

[6] BURHAN, Rasan and Jalal MORADZADEH. 2020. ‘Neurotransmitter Dopamine and Its Role in the Development of Social Media Addiction’. Journal of neurology & neurophysiology.

[7] MOSSERI, Adam. 2021. ‘Shedding More Light on How Instagram Works’. Instagram. Available at: https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works

[8] ‘The Facebook Files’. 2021. The Wall Street Journal, 15 Sept. Available at: www.wsj.com/articles/the-facebook-files

[9] LUKOFF, Kai, Cissy YU, Julie KIENTZ and Alexis HINIKER. 2018. ‘What Makes Smartphone Use Meaningful or Meaningless?’ Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.

[10] ROSKIES, Adina. 2021. ‘Neuroethics’. The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Available at: plato.stanford.edu/archives/spr2021/entries/neuroethics

[11] FOGG, B. J. 2008. ‘Mass Interpersonal Persuasion: An Early View of a New Phenomenon’. In Persuasive Technology.

[12] SIMON, H. A. 1994. ‘The Bottleneck of Attention: Connecting Thought with Motivation’. Nebraska Symposium on Motivation.

[13] CHASLOT, Guillaume. 2019. 4 – Down the Rabbit Hole by Design [Podcast]. The Centre for Humane Technology. Available at: humanetech.com/podcast/4-down-the-rabbit-hole-by-design

[14] HARRIS, Tristan. ‘Better Tech’. Ted.com. Available at: ted.com/talks/tristan_harris_how_better_tech_could_protect_us_from_distraction

Figures

1. Social icons on phone. Piotr Cichosz. Available at: Unsplash.

2. Distorted Insta Logo. By author.

3. Digital portrait. By author.

4. 3D Instagram logo. Eyestetix Studio. Available at: Unsplash.

5. Distorted Insta Logo. By author.

6. Hands and phone. Gilles Lambert via Unsplash.

7. CHT logo. Available at: humanetech.com.

8. DAIR logo. Available at: dair-institute.org.

9. Algo Transparency logo. Available at: algotransparency.org.

10. Integrity Institute logo. Available at: integrityinstitute.org.