Intro to academic UX research

On Research through Design

From what I gather, this module, titled ‘UX Research’, requires us to create a design artifact to back up or test the theories explored in our research paper. Using the process of design as an extension of research echoes the methodology of ‘Research through Design’ outlined in the book Ways of Knowing in HCI. Research through Design is an approach to conducting scholarly research that employs the methods, practices and processes of design practice with the intention of generating new knowledge; the knowledge produced functions as a proposal, not a prediction (Olson 2014). As I understand it, RtD both draws on design methodologies (such as Geoff Petty’s ICEDIP technique) as a framework for research and uses the act of designing an artifact or a prototype to inform that research.

In Ways of Knowing in HCI*, Olson references the Scandinavian field design movement and participatory design. Described as merging research practices from sociology and anthropology with design action, the field design movement reminds me of the exploratory [physical] product design work I saw at the Design Academy Eindhoven graduate show in 2014, as part of Dutch Design Week (DDW). The Design Academy Eindhoven sums up the importance of research for design in the context of using design as a tool for social change, which echoes my personal philosophy and motivations for studying UX.

“In a constantly changing world, research is fundamental to meaningful design. By learning from the past and the world around us, we can make informed decisions and contribute to shaping a future that we want to live in. Without this, we would create blindly, wasting both time and resources in an era where both are becoming increasingly rare and valuable commodities. Collectively, our position is informed by the concept of thinking through making, with the act of creation and the act of thought indelibly intertwined.” – DAE Philosophy

Over the past three years I’ve developed my career as a web designer, and have consequently turned my gaze to the impacts of ubiquitous technologies on human wellbeing. It’s an ever-expanding area, about which I have learnt a lot from the work of the Center for Humane Technology and First Draft.

*HCI refers to Human-Computer Interaction, a field of academic and scientific study encompassing computer science, human factors engineering and cognitive science. The historical context of HCI as a discipline is often traced back to 1982, when the first conference on Human Factors in Computer Systems took place in Maryland, USA, although some scholars argue that the practice goes back further (Lazar 2017).

WTF is humane technology?

I see humane technology as a fairly recent sub-field within HCI, a philosophical and sociological branch questioning our relationship with tech and dealing with humanity’s most pressing crises. The Center for Humane Technology (CHT) is made up of former Facebook and Google employees who became disillusioned with the ways in which the decisions of a small number of men in Silicon Valley were shaping the behavior and culture of the masses. CHT study the impact that social tech has on individual mental health and cognition, as well as on politics and world order, and examine how these intersect. They believe that technology exists in a complex system of human vulnerabilities, economic and social mechanisms, and deeply held paradigms of thought (CHT 2021). They warn that we’re in a time of crisis, perpetuated by technology that distracts us, divides us, and downgrades our collective ability to solve problems.

“When you look at the financial incentives for people’s attention – it’s like deforestation. A tree is worth more dead than alive, a human is worth more addicted to tech, and outrage is profitable.” – Aza Raskin, 2021 (designer of the infinite scroll)

Technology has the potential to strengthen and protect our human ability to solve pressing world issues, but the current paradigm leans towards optimising for attention, distraction and polarisation. This isn’t about moving away from financial incentives or cutting off social media, but about questioning how to create a new paradigm of success that minimises harm. We need technology that enhances our potential for focus and cooperation, and that protects our common good and our own well-being, so we can be of service to each other.

The CHT Ledger of Harms collates research on the ‘invisible harms to society’ created by tech industry models. Here I’ll pick out five findings from the research it collates that stand out to me:

  • Months after starting to use a smartphone, users experience a significant decrease in their mental arithmetic scores (indicating a reduction in their attentional capacity) and a significant increase in social conformity, as shown by randomized controlled experiments with 25-year-olds. In addition, brain scans show that heavy users have significantly reduced neural activity in their right prefrontal cortex, a condition also seen in ADHD and linked with serious behavioral abnormalities such as impulsivity and poor attention. (Hadar et al 2017)
  • 30% of 18-44-year-olds feel anxious if they haven’t checked Facebook in the last 2 hours, according to a recent survey of over 2,000 American adults that indicates a high incidence of potential Facebook addiction warning signs. In fact, many are so hooked that 31% report checking it while driving and 16% while making love. 😱 (Honest Data 2020)
  • A person’s social media usage level significantly predicts their level of neuroticism/anxiety one year later, as shown by a long-term study of 11,000 people aged 20-97. In addition, levels of neuroticism/anxiety predicted later levels of social media use, leading researchers to suggest a possible negative downward spiral linking these two processes. (Andrews et al 2020)
  • Exposure to a fake political news story can rewire your memories: in a study where over 3,000 voters were shown fake stories, many voters later not only “remembered” the fake stories as if they were real events but also “remembered” additional, rich details of how and when the events took place. (Murphy et al 2019)
  • Chamath Palihapitiya, former VP of user growth at Facebook, has said that: “I can control my decision, which is that I don’t use that sh%t. I can control my kids’ decisions, which is that they’re not allowed to use that sh%t… The short-term, dopamine-driven feedback loops that we have created are destroying how society works.” (Hern 2018)

Research next steps

The dissemination of fake news and the impact of social tech on political extremism and conspiratorial thinking over recent years has been painful to watch, and it is certainly an area of interest to me. I have seen its influence on political polarisation in Brazil, and how false narratives spread throughout the pandemic, causing real-life consequences close to home. ‘Bad actors’ have weaponised the information ecosystem, where a cacophony of voices and narratives has coalesced to create an environment of extreme uncertainty (First Draft 2021). We have to understand that individual pieces of content create larger attitude-shaping narratives, and these narratives fall into even larger, overarching topics that steer conversations.

However, I’m going to put this topic to one side for now, as I see it living in the realms of sociology and political science, and impacted by narratives beyond design. That said, I do feel that it will bleed into the other areas I’d like to explore.

The Center for Humane Technology’s research on cognition and mental health is both interesting and concerning. As they say: “We cannot meet the world’s most pressing challenges if our technology distracts us, divides us, and downgrades our collective ability to solve problems.” So long as social tech is designed to prey on our weaknesses, manipulate our psychology and degrade our capacity to make our own decisions, we’ll be stuck in this loop. I want to explore the ethics of design within this context, researching the influence of social tech on individuals’ cognition, mental health and wellbeing, and how this reverberates throughout society. However, I know that I need to narrow my ideas down further, either to focus on one of the specific effects, or on the impact on a segment of society. Today I’ll start brainstorming on paper, and later this week I’ll take my ideas to my tutors.


References

Andrews, N. P., Yogeeswaran, K., Wang, M.-J., Nash, K., Hawi, D. R. & Sibley, C. G. 2020. Cyberpsychology, Behavior, and Social Networking. Available at: https://www.liebertpub.com/doi/10.1089/cyber.2019.0744 Via: Ledger of Harms (https://ledger.humanetech.com/#study_161)

Center for Humane Technology. 2021. Ledger of Harms. Available at: https://ledger.humanetech.com/

Design Academy Eindhoven. Available at: https://www.designacademy.nl/p/research-and-debate

Design Disciplin. Available at: https://www.designdisciplin.com/hci-profession/

Hadar, A., Hadas, I., Lazarovits, A., Alyagon, U., Eliraz, D. & Zangen, A. 2017. PLoS One. Available at: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0180094 Via: Ledger of Harms (https://ledger.humanetech.com/#study_155)

Hern, A. 2018. The Guardian. Available at: https://www.theguardian.com/media/2018/jan/23/never-get-high-on-your-own-supply-why-social-media-bosses-dont-use-social-media Via: Ledger of Harms (https://ledger.humanetech.com/#study_76)

Honest Data. 2020. Available at: https://honestdata.com/facebook-addiction Via: Ledger of Harms (https://ledger.humanetech.com/#study_68)

Lazar, J., Feng, J. H. & Hochheiser, H. 2017. Research Methods in Human-Computer Interaction. Morgan Kaufmann.

Murphy, G., Loftus, E., Grady, R., Levine, L. J. & Greene, C. M. 2019. Psychological Science. Available at: https://journals.sagepub.com/doi/10.1177/0956797619864887 Via: Ledger of Harms (https://ledger.humanetech.com/#study_16)

Olson, J. S. & Kellogg, W. A. (eds.) 2014. Ways of Knowing in HCI. Springer.