Public debate has rightly focused on the perils and toxicity of new technologies, and questioned the motivations of the companies building them. Meanwhile, though, people are creatively adapting technology to their own social and psychological needs. Margie Morris explores this crucial space of personal innovation for social connection and well-being in her new book Left to Our Own Devices: Outsmarting Smart Technology to Reclaim Our Relationships, Health, and Focus.
Margie is a clinical psychologist, researcher, and inventor of technologies that support well-being. She led research on emotional technology at Intel, conducted user experience research at Amazon, and now teaches in the Department of Human-Centered Design and Engineering at the University of Washington. Based on years of primary and secondary research, as well as Margie’s own involvement in creating apps and other technologies, the book offers a fresh take on human-technology interaction, providing a counter-narrative to the techno-doom that frequently overlooks the creative agency people bring to these interactions. It also suggests interesting ways in which we as social scientists can make sense of the complex role and place of technology in our lives.
We spoke with Margie about her book and the lessons it has to offer for researchers and designers.
Anna: Can you tell us a bit about your background, your research and the work you've done in the past that led to the book?
Margie: I'm a clinical psychologist by training but spent most of my career in technology. Part of that was making new tools that I thought would help with emotional well-being, physical health and social connectedness – so working closely with designers to develop prototypes, and watching how people use them. And then part of my work is studying how people use existing technologies.
I noticed a constructive misuse of both prototypes and products – unexpected uses that led to positive effects. It’s a little like what Edward Tenner called ‘reverse revenge effects’. I then saw similar patterns in the way that people used existing tools like Tinder and even connected devices like lights.
So while part of the answer is continuing to do research to make technologies more emotionally intelligent and sensitive, part of what will be helpful for people is using what they have in ways that are more inventive and more intentional. There are so many technologies that are already woven into our lives, that we can use to improve our relationships and health. Rather than chasing a new gadget or app, we can use existing tools differently. So that was sort of the impetus for the book.
Giulia: You acknowledge there has been an emphasis on ‘perils’ when we talk about social media and smart technologies. Do you think that a discussion around benefits has been missing from the conversation?
Margie: The discussion implies that technology has an effect on us, but what’s missing from the conversation is how that effect depends on how we engage with technology, how much we make it our own and align it with our personal objectives. The idea that the amount of time you spend on your phone determines its effect on your well-being seems absurd to me, given the diversity of what you could be doing. Sketching on your touch screen and watching YouTube are clearly different, for example. Even the same app can be used in really different ways – social media apps could be a way to procrastinate or a deliberate way to connect with a family member. The idea of having agency and a two-way relationship with technology is missing from the conversation. There isn’t a metric for how people bring those screens and technologies into their lives, but I think that's what's important.
Anna: Thinking about that two-way relationship, early on in the book you talk about the therapeutic impact of social media. In a traditional therapist-patient relationship, the therapist bears a certain amount of responsibility. Where does responsibility sit in the two-way relationship with technology? Is it an alliance of equals? Who is in control?
Margie: Certainly technology companies have to take a lot more responsibility than they have for ensuring privacy and that there is no harm done to the end user. At the same time, I think we can exert more agency in how we use technologies. That requires reflecting on the changes we’d like to bring about in ourselves or in our relationships and then experimenting with how we can use technologies to support those goals.
Giulia: How do you think these ideas and insights could inform designers?
Margie: I think the examples in the book will help designers see what people are actually trying to make technologies do for them. They can see how people are repurposing and sometimes combining technologies to meet psychological needs – for example, using a dating app to get validation rather than a hookup.
Designers and developers may also learn from the nuanced ways that people are sharing data. In many apps the sharing options are still pretty rudimentary: “Share this with Facebook.” But the kinds of sharing that I see making a difference in people’s lives are within close relationships, especially when it comes to things like health or mood data.
Designers are often focused on pretty specific ‘use cases’, instead of how someone sees themselves over time – the self-view or identity narrative, as psychologist Dan McAdams calls it. By paying more attention to how people might see both their past and their future, we open the door for what technology can do for people. As opposed to “Oh, they're on this dating app to meet someone within two blocks within the next hour,” for example. That's a narrower ‘use-case’ kind of view.
Giulia: Yeah, that's an interesting way of looking at it. That idea of using technology to present a longer view of the self is drawn out in some case studies more than others. Do you see a distinction between the cases where people deliberately outsmart or adapt technology to make it work for them, and those in which people stumble upon unconventional use cases in a more accidental way?
Margie: Yeah, this is something I struggled with in terms of the tone of the book. For example, Miguel’s scale sends him an email saying, “Oh, congratulations, you've lost 15 pounds,” and he realizes that's not him, and it's not his house guest, so she must have a new boyfriend. That's not something he planned out in advance, of course, he kind of stumbled into it and then went with it once he saw it as a door to a conversation. That’s really different from the woman who upgrades her iPhone and actively struggles to get rid of the activity data in her Health app. This data is very unwanted; it threatens to trigger a latent eating disorder, so she starts a Twitter petition to Apple to make that optional. I think those two have pretty different flavours. But the commonality is that they're both using tech in line with their objectives. Miguel, with the scale, really wanted to have more warmth in his home, and closer relationships. And she really wants to be on this path to meaningful health, to maintain her recovery.
Anna: It’s partly down to the fact that people have different intentions when they join a particular app or start using a particular device. It’s difficult to compare across those in some ways because they all cater to such different needs and aspirations. But we also talked about it in the context of disruption. There is a lot of conversation around technology disrupting our lives, but what we're seeing here and in the examples in your book, is that the opposite happens too: people are trying to ‘disrupt’ the technology themselves.
Margie: Yeah, that's a great way of putting it: Disrupting the intended use. Sometimes it starts with a serendipitous thing and the intention is sparked a little later in the game. There is a difference between the thoughtfulness of, “Oh, maybe I'm going to take this email from my scale and forward it to my roommate and spark the conversation,” and being very deliberate about a particular intention from the get-go. There are a number of examples in the book about people who deliberately adapt smart lights for emotional communication – that’s disrupting the intended value of automation.
Anna: How are the emotional attachments that humans form with technology different from the kinds of interactions and relationships we form with other humans?
Margie: Well, to borrow a little bit from psychoanalysis, technology can act as this transitional object – something we latch onto when we're moving away from a secure base to take a risk. Similarly to how the proverbial baby blanket is clutched as the child is moving away from the mother, I think that the phone is clutched as people are going out by themselves to events that they may feel awkward being at alone. In that way, the phone allows them to take part in the world when there isn't a person playing that role for them.
There are also a lot of ways that people use technology to try on different selves. But I do think that we adjust technology, as we do physical objects like clothing, both as a way to reflect ourselves and as a way to shape who we are. The point is to always be expanding or refining how we are with other people. That’s the role that technology ideally serves.
Giulia: You talked about devices becoming more replaceable as a result of data moving onto the cloud. So the attachment is not to the device itself, but what's inside it – all those conversations or possibilities. How is our attachment to technology changing as devices and the data within them change?
Margie: That was definitely a speculative comment in the book, that our attachment to this singular physical object may be lessening as there's this sense that the data is also somewhere else. I might not remember whether it's on Dropbox, or iCloud, or whatever, but there's this sense of “it's out there,” and I think that is making this object feel a little bit less critical. Not that we don't need a phone to get by in the world, just to catch a ride, or to know where we're going. There are a lot of things that make it crucial, and difficult if you suddenly lose your phone. But I don't think there's that same sense of, “Oh my god, all my stuff was right inside this device and all my people were inside this device,” in the same way that, I think, people felt in the early 2000s. There was this sense of horror if you had lost your phone. And phones are expensive, and most people feel that cost, but I don't think there's that same feeling of “I've just lost my life.”
Giulia: What it sounds like you're saying is that it's what goes on beneath the surface that's actually rich and important and worthy of us studying and focusing on, not the thing itself. Perhaps it's the desire to continually connect and refresh feeds, and see what's going on, that should be the focus of media concern.
Margie: Right, the question is: what are we getting from this? I think a lot of this is validation along with self-expression. There's a lot of talk about how people are inundated by email or notifications. But if, when we pick up our phone, we’re thinking “Oh, I'm going to be inundated by things,” why would we be picking it up? I feel like we’re picking it up because there might be something really nice here for us, there might be something that says “Hey, you're special.” Some people talk about that in terms of variable intermittent reinforcement: every once in a while you win. So maybe the questions we should study are: why do people need so much validation, and how can they get it or the encouragement they need?
Anna: You trace several instances of technology helping people stay connected, sometimes by filling gaps or creating shortcuts. Is there a danger that people develop an over-reliance on technology, which could be damaging in an emotional context where personal effort is actually required and valued?
Margie: Yeah, I think tech can be used in ways that create closeness or work against it. One story in the book shows how, in telemedicine, that can come down to the position of the camera. By shifting the camera to show more of her home, a doctor reveals more about her background and strengthens her bond with patients. Of course, the opposite situation is common, where people use technology to avoid intimacy, for example, looking down at a phone during a conversation or choosing to text rather than call a friend who is going through a rough time.
Part of the reason I suggest pushing technology to align with personal values and goals is that technology is often designed by default to support immediate gratification, and that can run counter to our long-term values, whether those are related to health or relationships or just mental focus.
Getting back to the question about technology and where responsibility for it lies: technology should be able to understand our values and objectives, and support them – or at least not invite us to do things that are completely contrary to what we want for ourselves in the long run. Netflix automatically playing one episode after another, for example. Instead, these things should be helping us live the way we want to live. I think this sort of endless content feed is antithetical to what a lot of people really want.
Giulia: That brings us full-circle to the perils that we mentioned at the beginning, and how people can balance those with making technology fit around their goals and aspirations. One last question would be, is it always desirable to make technology align with our goals? Were there cases that you came across where it was better if technology just didn't play a part at all?
Margie: Yes, sometimes the ideal thing is to ditch the technology and just talk to someone. Technology may help us get to the point where we can have those unmediated conversations.
There's this example of a woman taking care of her elderly mother. Her mother lives alone, and the daughter was trying to help her have an engaged life, so she participated in a study where we were testing out a social health monitor. It gave feedback to both the mother and daughter in the form of a ‘solar system’ which depicted how much time the mother was spending with other people. One of the effects of this is that it gave them a vocabulary to talk about loneliness and relationships. It wasn't that they needed to be staring at the device to have that conversation; it was just that over the study, they both became familiar with it. They could use the language of circles and space that was in the display to have that conversation. This shared device had given them a reason to talk, and that allowed them to discuss a topic that had been kind of hard.
People can use tech to start conversations, and I think designers should try to cultivate this. Often it was when people brought tech into conversation that they benefited from it – sometimes by looking at an app with someone else, in other cases just by using it as a shared language. These conversations are the catalyst for change.
Anna Zavyalova is a Senior Consultant at Stripe Partners. A trained anthropologist, she is passionate about applying the ethnographic method to real business challenges. She has worked on projects spanning technology, healthcare and retail industries. With a particular interest in AI, Anna has carried out ethnographic studies of spoken interfaces, Smart Home, driverless cars and pharmaceutical R&D. Anna holds a BA in Archaeology and Anthropology from the University of Cambridge and an MSc in Social Anthropology from the University of Copenhagen.
Giulia Nicolini is a junior consultant at Stripe Partners. She holds a BA in Sociology from the University of Cambridge and an MA in Anthropology of Food from SOAS, and has experience working on food, sustainability and technology across different sectors.