Foresight. Trends. Megatrends. Forecasting. Speculative design. Predictive modelling. Impact estimation.
These are some of the established methods that researchers and analysts use to understand what the future might look like, and how the organisations we work for and with approach it. A variety of research and design techniques help us make sense of the future in a structured way. Ethnographers and anthropologists know how to study the present in order to speculate on the future; design teams employ futurecasts and speculative design; futures research draws on a wide range of methods that cut across disciplines. With the availability of big data, forecasting and predictive modelling are growing ever more sophisticated.
Sometimes I wonder, does the maturity of our methods and frameworks make us feel too confident about evidence-based views into the future? Has it become harder for us to say, ‘I don’t know’? What consequences does that have for our work?
The Significance of ‘I Don’t Know’
In their recent book Radical Uncertainty: Decision-making for an Unknowable Future, the economists John Kay and Mervyn King write polemically about changing the way we attempt to know the future.
They argue that we need to accept and embrace radical uncertainty when creating strategies. It’s rare that a system or a phenomenon is so stable that it’s possible to model and forecast its future with high confidence. More often than we’d like to admit, they argue, we should say ‘we don’t know.’ This is important to avoid a false sense of certainty and instead make decisions on a more honest foundation.
The approach that Kay and King suggest resonates with ethnographers and other researchers in the social sciences. To tackle uncertainty, we should ask a broader question: ‘What is going on here?’ To answer it, they say, the best approach is to create narratives about the future using a variety of data types, qualitative and numerical, and, importantly, to keep questioning our assumptions.
Approaching ‘I Don’t Know’ in Product Development
When companies invest in product R&D, they want to know which ideas are worth investing time and effort in. What will the gains be? How risky is the investment? How long will it take to build? In competitive and fast-moving sectors like online services, the cost of spending time on the wrong opportunity – or the benefit of investing in the right one at the right time – can be significant.
Each quarter, researchers and analysts try to find the best possible answer to these questions, while avoiding a false sense of confidence. Below, I want to share some of the ways in which teams at Spotify attempt to embrace the uncertainty of the future, while trying to de-risk decisions with their insights practice.
1. Questioning and Updating our Worldview
Unexpected events are an opportunity to question one’s worldview. When the COVID-19 pandemic started spreading and cities globally went into lockdown, a group of researchers at Spotify said they felt uncomfortable simply turning their field studies into remote ones. Most of us lived and worked in Northern Europe and North America, and we didn’t know what people globally were going through. These researchers wanted to ‘just listen’. They wanted to understand what lockdown life looked like for people in different places and situations, then work out what this meant for the usage of online services and what new expectations were emerging for them. This turned into an eight-month longitudinal study with people in Brazil, Indonesia and the US, where we tried to create as much open space for listening and learning as possible.
2. Creating Several Ideas for Solutions
Once product teams become familiar with the future trends that their research points to, they usually start to picture future scenarios involving their products and services. Design sprints kick in, making it possible to imagine how the services being developed could evolve in the future. This is the fun part of the exercise but also the riskiest one. In environments like consumer tech, where teams typically want to move fast to experimentation, it’s possible that only a couple of scenarios stick, that teams are drawn to pursuing their first ideas, and that they don’t explore the solution space widely.
To mitigate this bias that could later prove costly, the Spotify growth team created a framework that encourages wider exploration and critical thinking about the unknowns midway into the product development process. The goal of the Thoughtful Execution Framework is to make teams ask: are we settling on too few solution ideas, too early? What if we’re not even looking in the right direction yet?
With the framework, we want to encourage teams to explore ideas that are radically different from each other. This may help teams say, ‘I don’t know if this will work’ and create an openness to exploring more ideas and learning more broadly.
3. Being Honest about Your Center of Focus
One of the most anticipated events in the product development process is the multivariate test. It’s typically expected to conclude with clear answers and erase any uncertainty about which direction to take: which version met its targets, and which did not?
In reality, it’s not unusual for multivariate tests or structured usability tests to come back inconclusive. This can become one of the toughest moments in the research process as it means the researcher or data scientist will need to go back to their team with an ‘I don’t know’.
In a positive scenario, this will take the team back to addressing basic questions with rigour. How did we define the problem? Are we going after the right solution, or even the right problem? Is it a real problem for the people involved in our services, or, as the design leader Charlie Sutton has put it, “a business problem masquerading as a people problem”?
Spotify develops and tests products for a global audience, and to bring more depth into these fundamental questions we’ve started asking: who is the center of our research and design process? Despite our best efforts, have we ended up focusing on people similar to us and our colleagues? Do we know enough about people who are different from us, enough to understand their problems and how to create solutions for them?
Embracing the ‘I Don’t Know’
Working in online service development means having abundant data and insights about your product. We employ all of the above methods to know the future, from futurecasts to forecasts. But strategic foresight that is honest about the uncertainty of the future does not constrain itself to measuring events or patterns of the past, or trust that a robust method can explain the future. It’s important to keep asking open questions, to question our worldview and data sources, and to acknowledge the limitations in how we know and interpret our data – even in fast-paced, lean environments for product development.