How a Government Organisation Evolved to Embrace Ethnographic Methods for Service (and Team) Resilience: The Case of the Canadian Digital Service


Government websites and online services are often built with limited input from the people they serve. This approach limits their ability to respond to ever-changing needs and contexts. This case study describes a government digital team built from the ground up to embrace ethnographic methods in order to make government services more resilient.

The case study begins by tracing the organisation’s origins and relationship to other research-driven parts of its government. Then it shows how the organisation’s structure evolved as more projects included ethnography. It describes various approaches to locating skilled researchers within bureaucratic confines, as well as what responsibilities researchers took on as the organisation grew. It then summarises researchers’ experiences with matrixed, functional and hybrid organisation schemes.

The case study concludes by explaining how embracing ethnographic approaches (and values) increased not only online service resilience, but organisational resilience as well. Teams who embraced ethnography had deeper and more thoughtful responses to the pandemic and to inclusivity challenges within the organisation. It closes with lessons learned for other organisations attempting to scale an ethnographic research practice and seize its benefits for resilience.

Article citation: 2022 EPIC Proceedings pp 141–151, ISSN 1559-8918, https://www.epicpeople.org/epic

THE EMERGENCE OF GOVERNMENT DIGITAL SERVICES

Since the early 1990s, governments around the world have invested in digitising public services with the aim of decreasing the cost and increasing the quality of public services for citizens. Yet over this time it has become increasingly clear that digital governments have not delivered all the benefits that their users initially hoped for (Mergel 2017). One reason for this shortcoming is a lack of consideration of the needs and behaviours of citizens in the planning, development and delivery of public services. In response to this gap, several public sector teams have increasingly looked to the toolkits of design thinking and user-centred design to place the citizen “user experience” at the forefront of public service delivery (Clarke and Craft 2017). And yet, over the past decade, the results have been mixed, with some governments, such as the U.K., New Zealand, and Singapore, succeeding more than others in their efforts to deliver improved service outcomes (United Nations 2020).

In the Government of Canada, early approaches to introducing user-centred design into the public service largely took the form of embedding design generalists in policy development teams to inform the front end of policy design. At that time, the skill sets of designers were seen as a toolkit that could help with policy development, rather than as a skill set primarily relevant to service delivery (McGann, Wells and Blomkamp 2021). In this role, designers led workshops, projects and interventions with the overall aim of building “empathy” amongst public servants towards the needs of the people their policies served. While this expertise introduced fresh thinking and recommendations to increase citizen involvement in early policy planning phases, the outcomes didn’t go far enough to change the culture of government decision-making or to impact day-to-day citizen-facing service delivery (Hum and Thibaudeau 2019).

This case study shows how a central digital service unit in the Government of Canada, the Canadian Digital Service (CDS), created in 2017, evolved its approach to hiring researchers in response to increasing demand for an ethnographic research lens to improve how government understood the needs of citizens receiving services online. It also shows why this approach proved effective in building team and organisational resilience at a time of unprecedented change: the delivery of urgent online services at the start of the COVID-19 global pandemic.

DESIGN THINKING AND USER-CENTRICITY IN THE GOVERNMENT OF CANADA

The creation of CDS in 2017 was a response to, and equally influenced by, a history of the Canadian public service engaging with digital services and the need for user-centred design that goes back to the early internet era.

As the 2013 Fall Report of the Office of the Auditor General on Access to Online Services (“OAG”) shows, when “Government of Canada services began to be migrated online in the late 1990s and early 2000s, Canada was seen as a world leader. Leadership in customer service and efforts in providing its citizens with online offerings were two of the main reasons cited for the government’s success” (Office of the Auditor General of Canada 2013). However, the same report goes on to critique the government for losing this early momentum. A later report from the OAG found that one of the major reasons for the decline in the quality of Canada’s digital services was a lack of importance given to the needs of the users of government services. In the words of the report, “It is critical for government departments to understand that their services need to be built around citizens, not process—or they can expect that those services will be disrupted” (Office of the Auditor General of Canada 2016).

The truth of these words was felt directly by public servants themselves in 2016 with the federal government’s large-scale and ongoing IT failure of the Phoenix payroll system (May 2022). Amongst the many lessons learned was the failure to test the new service with real users before its launch. The situation cost the government over $400 million in continuing compensation for damages to federal public servants, cost taxpayers more than an estimated one billion dollars, and significantly disrupted the pay of thousands of public servants (May 2022).

Efforts to widen exposure to the citizen experience had begun earlier. In the 2010s, traditionally siloed government public engagement teams—responsible for consulting and engaging citizens and stakeholders—began improving links across government by creating communities of practice. The goal was to renew their ability to innovate and build a more flexible, knowledgeable member base. Around the same time, people were looking for inspiration to the growing number of public sector innovation units in other governments, such as Denmark’s MindLab, the U.K.’s NESTA, and UNDP’s Innovation Labs (McGann et al. 2018), and in particular at how these units embedded design talent and expertise to expand the traditional public consultation playbook.

Pursuing the promise of social innovation labs, in 2013 a government-wide initiative known as Blueprint 2020 was launched with the aim of public servants “working together with citizens, making smart use of new technologies and achieving the best possible outcomes with efficient, interconnected and nimble processes, structures and systems” by the year 2020 (Privy Council Office 2013). As a result, a series of policy innovation-focused “Hubs and Labs” were set up to grow the practice of “co-designing” policy and program solutions with citizens and stakeholders and documenting what works to support learning and replication. Outcomes and lessons were subsequently shared at an annual “Innovation Fair” held in the National Capital Region, Ottawa. Examples of government reform projects came from teams including the Privy Council Office’s Central Innovation Hub (now the Impact and Innovation Unit), Health Canada’s iHub, Indigenous Services Canada’s Indigenous Policy and Program Innovation Hub, Immigration, Refugees and Citizenship Canada’s Pier SIX – Service Insights and Experimentation, the Canadian Coast Guard’s Foresight & Innovation Hub, and more.

While these efforts created momentum and awareness of the need for a more nimble style of working, the following years showed mixed results. For many labs, the scale of their effort was limited to those within the lab’s boundaries, creating an “us versus them” culture between people seemingly privileged to hold the title of “creative innovator” and those who continued to represent an outdated style of working. Another critique concerned the placement of labs as separate entities within an organisation, removed from the day-to-day pressures facing the organisation’s core functions, and therefore removed from the realities of what it takes to create lasting change (Hum and Thibaudeau 2019). Perhaps the most glaring limitation was the disparity between knowledge of and involvement in early-stage policy making versus the practical implementation of policy grounded in people’s lived experience. As semi-autonomous entities running short bursts of co-design projects, with consultations sprinkled at the front end of policy making, the labs produced an imbalance in strategizing around the accountability, applicability and implementation of the resulting solutions (Barnes 2016).

All the while, the need to increase citizen satisfaction and demonstrate measurable outcomes meant that governments like Canada’s were looking to play catch-up to rising expectations of what “digital transformation” could bring to the public sector. The U.K.’s Government Digital Service showed targeted results in creating efficiencies and meeting client needs (Greenway et al. 2018). It was closely followed by the Obama Administration’s U.S. Digital Service, Australia’s Digital Transformation Agency, and Canada’s own Ontario Digital Service at the provincial level, each leading the strategic implementation of its government’s digital agenda (How the Canadian Digital Service Started 2017). Collectively, these efforts showed that the time was right to start exploring what a Canadian approach to digital government could look like.

GROWING THE RESEARCH PRACTICE AT THE CANADIAN DIGITAL SERVICE

The Canadian Digital Service (CDS) was created in 2017 within the central federal department of the Treasury Board of Canada Secretariat to “demonstrate the art of the possible” and build digital capacity for Federal departments (Elvas 2017). Initially founded as a three-year pilot, in 2019, the team received additional funding to deliver government enterprise platform services and continue partnering with departmental teams to increase digital skills and capabilities. In 2020, with the shifted focus to support the federal pandemic response, CDS’s budget was doubled and, in 2021, the organisation was established as a permanent federal program (Budget Implementation Act 2021) to scale its impact and reach.

Over CDS’s five-year evolution, its approach to embedding research and ethnography changed alongside its growth. This process can be broken down into five phases, each bringing its own definition, goals and challenges.

The first phase can be characterised as a “team of one.” CDS hired its first—and for over six months, only—researcher, one of this paper’s authors, to help bring the team along on how a service can be built around the needs of users, not of government. This early phase was lean and scrappy. As in any team of one, the researcher performed several roles—planner, designer, researcher and advocate. And the mission was singular: to begin shifting data and insight generation from a traditional top-down framework to a more ethnographic style, surfacing a bottom-up layer of evidence based on people’s experience of government.

The second phase can be described as promotional. Armed with the success of early research engagements with the Department of Veterans Affairs (Ferguson 2018), the Department of Immigration, Refugees and Citizenship Canada (Lorimer, Hillary and Naik, Mithula 2018) and others, the organisation emphasised the value of working in the open and hosted several research-focused meetups and workshops for public service teams. This phase also saw an increase in the hiring of designers and front-end developers, and of a second researcher, to collectively make a user experience team. The challenge now was balancing the growing interest from partners in unpacking bigger research questions against the team’s limited research capacity.

The third phase was defined by growth. There was a steady flow of requests to conduct ethnographic research on people’s complex relationships with government services, including with members of the armed forces, Veterans, newcomers to Canada, low-income taxpayers and disability benefit applicants (Canadian Digital Service 2019). This momentum and progress showed the need for greater craft-based guidance and leadership, and the first research manager, one of this paper’s authors, was hired to establish the research team. It was also at this stage that other disciplines such as design and development had grown in size, necessitating an expanded organisational structure. A matrix-style framework was developed, with the research team reporting to the head of product delivery and researchers reporting to multiple leaders.

The fourth phase saw the research practice mature in frequency, breadth and operations. Research was now built into every product phase and into the key decision points of product development (Lee 2020). Participant recruitment emphasised diversity in language, literacy, access to technology and disability. Research with end-users was treated as critical, alongside research with the public servants administering the service. Shareable artefacts, method toolkits and templates were prioritised to educate and guide people through the process. It was not entirely a surprise, then, that as research skills grew, enthusiasm for the fast-paced agile process declined. Researchers were tired of feeling limited to shipping usability findings when the data pointed to deeper structural concerns in service design.

And finally, the fifth and current phase represents research’s integration into various functions across the organisation. As a result of scaling to over a hundred staff, and in an effort to improve efficiency, the organisation shifted from a matrix to a divisional structure. One division delivers enterprise platform components for federal teams to adapt and reuse; there, researchers are embedded in product teams. The other is a consultancy providing bespoke guidance and coaching to federal teams; there, researchers are consulting strategists. In both units, researchers shifted from reporting to a research manager to, in most cases, a non-research manager. Researchers were also hired into new places. For example, the platform unit’s client experience team brought in researchers to consider the end-to-end client journey, while in the consulting unit the skill sets of researchers were sought across teams.

Transitioning to this new mode of management was not trivial. CDS team members had built up substantial identities around their discipline-specific communities. Senior managers also had to give up managing people just in their discipline, and change their scope of leadership (often stretching their professional skills). Importantly, the research team did not make this switch alone: engineers, designers and product managers all saw the end of their discipline-specific groupings and gained new management. Research “diffused” into the organisation as part of a broader effort that diffused many other job types.

Today’s challenge is an obvious one: how will researchers embedded in various parts of the organisation maintain a sense of community and connection, and tell a shared story? Is it relevant to do so as one group? Is it more impactful for them to leave research crumbs across disciplines and areas of the organisation? The next phases will tell.

Organisational charts showing changes in organisational growth from 2017 to 2020.

Figure 1. Diagram by Mithula Naik, 2022. Continuous organisational shape-shifting since 2017 — from “family-style” to “matrix” to “business units”.

Organisational structure showing research communication flows across the organisation.

Figure 2. Diagram by Mithula Naik, 2022. In the fifth and current organisational structure, research is integrated into various functions, enabling trails of connection across the organisation.

TENSIONS WITH THE GROWTH AND ADAPTATION OF ETHNOGRAPHIC METHODS AT CDS

As CDS’s ethnography practice grew and diffused, it created interesting organisational tensions. Researchers found themselves in conflict with developers, designers and leaders. Over time, these conflicts fell along predictable lines and exemplified elements of cultural and epistemological theory from the ethnographic world. The following section details three types of conflict that became common as CDS progressed, providing examples of each, explaining them through theory, and offering some perspective on their “usefulness” to the organisation’s growth.

Conflict 1: Emic vs. etic perspectives

“Why can’t we just call this person what he is—a user?” one developer exclaimed to one of this paper’s authors. On one hand, researchers often advocated describing people in the language those people used for themselves and their activities. On the other hand, developers and product managers tried to apply their own categories to the people at hand. The language of “customers,” “users” and “stakeholders” was non-specific, but common in their professional communities. This felt like a conflict between emic and etic perspectives: describing a group through its own language and categories versus through an observer’s (Alasuutari 1995). Stepping back from one’s own language is harder for some than others.

With time, the authors of this case study came to see the “word wars” as emblematic of a deeper conflict—one about the perspective taken when trying to describe a group of people. Many CDS team members made sense of their field by developing their own taxonomies and applying them to the subject at hand. In a given situation, they searched for “users” and “providers,” “transmitters” and “receivers.” Although research interviews challenged them to better understand these people’s behaviour, it was harder for them to escape the categorization schemes through which they viewed people. These schemes helped bring order to complicated situations, and, to some degree, a bit of comfort in a turbulent environment.

Yet, over time, more and more CDS staff embraced the language of the people their work was serving. With constant campaigning and reminders, the dreaded “user” fell out of favour, replaced by words more specific to the digital product at hand. Software developers, in particular, seemed to realise that their expertise did not hinge on calling people certain words; rather, those broad categories often obscured the nuances of the people behind the keyboard. Perhaps this indicated that non-researchers were more practised at taking on the language of others, at least in some basic ways. Indeed, this tension seemed to be a “growing pain” of introducing ethnographic methods to an organisation used to other methods. But as this tension released, others seemed to take its place.

Conflict 2: The time orientation of leaders and researchers

What matters more: a grounded understanding of a project’s past? The needs of the present? A vision of the future? These questions of “time orientation” (Seeley 2012) are key elements of culture, but they also became key fault lines for project teams. As the organisation grew, many a CDS team became embroiled in internal debate about which of these questions to focus on. Interaction designers tended to focus on sketching visions of the future, while product managers focused on the current state of the project and its tasks. When teams had dedicated researchers, those researchers often became the team historians, trying to trace the journey of the team’s thinking (and its relationship with users).

Far from being simple prioritisation decisions, these conflicts ran deep and caused substantial tension on teams. People often felt that their time orientation was “right” and forcefully advocated for it above others’. Sometimes these debates became matters of professional and personal integrity. Far from a simple attentional choice, they became markers of identity and culture.

Arguably, these tensions still exist at CDS. But as individual roles (like “researcher” or “product manager”) faded from prominence, these tensions did too. When people did not identify as a “researcher,” they did not seem as committed to maintaining that group’s identity by advocating a certain time orientation. Arguably, these debates about time orientation (and other cultural dimensions) were most prominent when the organisational structure created specialisations. Unlike the “word wars,” this tension seemed to be amplified by organisational choices championed by managers.
