Pundits, policy-makers, and ordinary people alike have recognized that the landscape of industries and work has changed rapidly, and that coming advances in technology and automation will end many jobs and fundamentally change others (Lamb, 2016). Since the 1990s, the Government of Canada has been working to define the Essential Skills (ES) broadly necessary for people in the workplace to fulfill their personal and economic potential throughout their lives. In anticipation of these shifts, policy-makers must urgently address the question: how might we effectively deliver programs that upskill and re-train adults to be ready to take on new roles and careers?
Research suggests that having a place in the economy of the future will increasingly depend on people having, and being able to learn, “soft” skills such as communication and collaboration (Heckman & Kautz, 2012; Berger et al., 2017; Conference Board of Canada, 2020). Soft skills have been shown to be the most transferable skills across jobs and to play a significant role in helping people keep jobs (Rudolph et al., 2017; OECD, 2020). In a world where machines and computing are increasingly sophisticated, this emphasis on the importance and relevance of social-emotional skills signals that participation in the labour market will increasingly rely on the skills that make us uniquely capable as humans.
In recognition of this, in 2019 the Government of Canada refreshed its Essential Skills framework, now named “Skills for Success,” to include the social-emotional skills of communication, collaboration, creativity, and adaptability alongside the pre-existing hard skills of literacy, numeracy, and digital literacy (OLES, 2020).
Blueprint, a Toronto-based research organization that works with policy-makers to generate evidence on which services and programs work for Canadians, was awarded a contract to implement and evaluate a jobseeker upskilling program rooted in the new Essential Skills framework. Blueprint partnered with Douglas College, a specialist in Essential Skills curriculum, and with WorkBC, the Province of British Columbia’s Employment Services program. Douglas College developed a six-week Essential Skills program called “Amplify,” with a plan to deliver the model to roughly 1,500 people in the WorkBC system over three years, through 2023. The goal of the project was to demonstrate how a short-term Essential Skills program could help jobseekers advance into post-secondary training and job placements that lead to long-term, sustainable employment.
Innovation in the public sector often calls for iteration, but in reality the nature of grant funding often requires programs to fix a delivery plan and budget from the outset, define anticipated outcomes, and generate evidence on those pre-defined outcomes using quantitative measures of impact. Like most large-scale public sector demonstration projects, the project’s original research approach was to focus purely on putting a model into the field and conducting an implementation and impact assessment of that model. However, although they are the gold standard for many publicly-funded projects, impact assessments such as Randomized Controlled Trials (RCTs) are expensive; they are ‘one and done,’ meaning their findings are taken at face value by decision-makers; and they often have trouble pinpointing a nuanced ‘why’ behind the outcomes they measure, making it challenging to know what to address in future delivery (Pearce & Raman, 2014; NESTA, 2018; OECD, 2017). As the field of evidence generation for public policy continues to evolve, there has been a recognition that “RCTs or quasi-experiments may work well when there is a simple intervention that can be tested. However, rarely do we have such simple interventions. In the complex world of social policy, it’s unlikely that your programme is the necessary or sufficient condition for success. It’s likely to be just one factor among many, part of a ‘causal package’.” (OECD, 2017).
In recognition of this, Blueprint worked with the funder early on to expand the project to incorporate testing and iterating the model after initial roll-out, in order to anticipate and mitigate potential challenges and increase the program’s chances of success. With this goal in mind, the Blueprint team incorporated a focus on understanding whether the design was working, for whom, and in what contexts, before starting to assess impact. The team also built a defined “refresh” period into the work-plan, during which design iterations could be made and tested before a stabilized model was re-deployed and measured for impact. Our ability to scope the project to include this fuller approach to evidence generation was a significant shift in how demonstration projects to inform policy choices have historically been carried out (Pearce & Raman, 2014; OECD, 2017).
The Amplify program that was studied serves a wide range of jobseekers with diverse backgrounds: adults who had worked in the service or hospitality industry for decades and were recently laid off due to the COVID-19 pandemic; single mothers, many survivors of violence, seeking to re-enter the workforce; immigrants to Canada with large gaps in their early educational history; individuals with chronic illnesses and disabilities who feel they have finally managed their condition enough to re-enter the workforce; and many more. Participants also varied widely in their skill levels: some had used programs like Microsoft Excel daily in their former jobs, while others were learning to use a computer for the very first time.
In order to answer the question of how the Amplify program could best set these jobseekers up for success, the design researchers built an approach to understand both the experience of participating in the program and how Amplify fit into jobseekers’ overall employment journeys. The team agreed to expand the original project scope in two key ways. First, instead of collecting data only from Amplify jobseekers, the research team would also conduct research with WorkBC case managers and the Amplify instructors/delivery team. Second, instead of focusing solely on the in-program experience, the research would seek to understand jobseekers’ lives, barriers, needs, and emotions, as well as case managers’ overall roles and longer client histories.
With this broadened approach, the research team designed two phases of work to precede the impact assessment. Phase 1 followed Amplify’s evolution over three cohorts at two delivery sites, a total of six classes. This phase culminated in a set of participatory co-design sessions with the implementation team, in which insights were shared back and redesign opportunity areas were identified. Phase 2 will begin after Amplify is iterated and re-deployed. In Phase 2, the design researchers will again interview jobseekers, case managers, and implementers, following two cohorts at four delivery sites (a total of eight classes) to understand whether the iterations improved experience and outcomes. Once the Amplify model is stabilized, the design researchers will pass the torch for the quantitative impact assessment to begin.
Ethnographic In-Depth Interviews (IDIs) were conducted before and after each cohort of the program with 2-3 voluntary jobseekers, 2-3 case managers who referred their clients, and all Amplify facilitators. When a surplus of jobseekers volunteered, selection was based on ensuring diversity across gender, Essential Skill level, employment barriers, and newcomer status.
Phase 1 research was conducted from October 2020 to May 2021 and resulted in 56 interviews. Phase 2 will begin in September 2021, end in May 2023, and is anticipated to result in a further ~80 interviews. This case study details the findings and outcomes from Phase 1 of the project.
Two aspects of the research approach were key in helping to develop a rich body of evidence: longitudinal IDIs and early triangulation with real-time quantitative data.
The longitudinal design of the research plan allowed us to understand the evolution of participants’ experiences and outcomes.
For the interviews with jobseeker participants, the researchers used participatory activities. Specific examples include a social network map and an activity entitled ‘the work I do,’ which captured how participants met needs such as housing, health care, and family responsibilities through a frame of agency and action rather than dependency and shame. Through these activities researchers captured jobseekers’ motivations, how they balanced their responsibilities at home and outside of the program, their networks of social support, and their experiences with employment services thus far. The interviews with caseworkers and facilitators used semi-structured protocols which probed their decision-making processes about whom to refer to the program, their perceptions of the value of essential skills programming along a jobseeker’s employment journey, and their definitions of success for clients coming out of the program.
Several caseworkers made referrals for multiple cohorts, which allowed the team to follow how caseworkers refined their understanding of which clients stood to benefit the most from the program and how they began to form relationships with Amplify facilitators to better strategize for and serve their clients. Likewise, as facilitators delivered successive cohorts, the pre- and post-interviews conducted throughout allowed the research team to understand how facilitators changed delivery based on the needs, barriers, and social dynamics of the jobseekers, and to chart how they developed increasingly close working relationships with caseworkers. We detail our insights in the following section.
Quantitative data on student outcomes
The research team triangulated the ethnographic research findings with real-time quantitative data about jobseekers’ outcomes. As part of the program’s impact measurement plan, several hard and soft skills assessments are administered at the start and end of each cohort of Amplify to track change. Assessments measure jobseekers’ reading skills, numeracy skills, and digital literacy skills. Additionally, multiple assessments measure different soft skills such as collaboration and communication, including a group task through which instructors would observe and rate student behaviors and interactions.
The research team further felt it would be beneficial to collect demographic information on jobseekers enrolled in Amplify as a whole, and developed a survey to capture that information at the end of the program alongside jobseekers’ opinions on their experience. The survey allowed the research team to understand how demographic differences between the delivery sites shaped varying perspectives on the program. For example, it revealed that one delivery site consistently had cohorts of jobseekers who were older and more likely to speak English as a second language, an important contextual factor when analyzing particular experiences shared about challenges in the classroom.
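To make the kind of cross-site comparison described above concrete, here is a minimal sketch in Python. The site names, fields, and values are entirely hypothetical illustrations, not the project's actual survey data or analysis pipeline.

```python
from collections import defaultdict

# Hypothetical end-of-program survey responses; field names and values
# are invented for illustration only.
responses = [
    {"site": "Site A", "age": 52, "esl": True},
    {"site": "Site A", "age": 47, "esl": True},
    {"site": "Site A", "age": 58, "esl": False},
    {"site": "Site B", "age": 31, "esl": False},
    {"site": "Site B", "age": 36, "esl": True},
    {"site": "Site B", "age": 28, "esl": False},
]

# Group responses by delivery site
by_site = defaultdict(list)
for r in responses:
    by_site[r["site"]].append(r)

# Summarize mean age and share of ESL respondents per site
summary = {}
for site, rows in by_site.items():
    summary[site] = {
        "mean_age": sum(r["age"] for r in rows) / len(rows),
        "esl_share": sum(r["esl"] for r in rows) / len(rows),
    }

for site in sorted(summary):
    s = summary[site]
    print(f"{site}: mean age {s['mean_age']:.1f}, ESL share {s['esl_share']:.0%}")
```

Even a simple per-site summary like this surfaces the kind of contrast the team observed: one site skewing older and more ESL than the other.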
KEY FINDINGS AND TAKEAWAYS
The ethnographic research expanded the team’s understanding of what it means to prepare someone for a new career in multiple ways. Firstly, while this program aimed to improve people’s skills, participants felt the most meaningful gain from the experience was the changes they saw in themselves and their potential. Secondly, while this program was designed to be a standalone intervention that would prepare people for either post-secondary education or directly entering the job market, many participants left the program with remaining Essential Skills gaps or ambiguity about where to go next. Lastly, the desire for the program to be an efficient model that could flexibly serve a diverse group of jobseekers created tensions that were challenging to balance when delivering the curriculum to people with a range of needs and skill levels.
Developing self-confidence matters
A key aspect of the program’s definition of success at the outset was seeing measurable improvement of skills, using a variety of assessments administered at the start and end of the program. However, early analysis of the quantitative assessment scores showed, on average, only a modest improvement across all skills. Based on the assessments alone, jobseekers were generally making minimal progress by participating in the program; some post-test scores even went down.
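The pre/post comparison underlying this finding can be sketched in a few lines of Python. The IDs and scores below are hypothetical placeholders, not the project's data; this shows only the shape of the analysis (per-person deltas, mean change, and the share whose post-test scores declined).

```python
def score_changes(records):
    """Summarize pre/post deltas for one assessment across a cohort.

    records: list of dicts with 'id', 'pre', and 'post' scores.
    """
    deltas = {r["id"]: r["post"] - r["pre"] for r in records}
    declined = [pid for pid, d in deltas.items() if d < 0]
    return {
        "mean_change": sum(deltas.values()) / len(deltas),
        "share_declined": len(declined) / len(deltas),
        "declined_ids": declined,
    }

# Hypothetical assessment scores for one cohort
cohort = [
    {"id": "A", "pre": 210, "post": 225},
    {"id": "B", "pre": 195, "post": 198},
    {"id": "C", "pre": 240, "post": 232},  # a post-test score that went down
    {"id": "D", "pre": 180, "post": 190},
]
result = score_changes(cohort)
print(result)  # modest mean gain, with one of four jobseekers declining
```

A summary like this makes visible both the modest average gain and the individuals whose scores fell, which is exactly the pattern that prompted the team to look beyond the assessments.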
Yet a common theme across participants was that they felt the assessment results did not reflect the amount of progress they had made. There is extensive literature on barriers to accurate assessment such as test anxiety and testing environments (Lu & Sireci, 2007; Cassady & Johnson, 2002), and some of those barriers surfaced in the situations recounted to the researchers. But more important to the jobseekers we interviewed, the assessments could not represent the full meaning of what they got out of the program and how it impacted their lives. What mattered was that the Amplify program changed how they saw themselves. After the program, jobseekers felt more confident in their ability to learn and to do well in a structured school environment, and when thinking about their overall employment journey they felt a groundbreaking sense of “I can do it.” Completing Amplify helped jobseekers shift from feeling scared by or avoidant of their next steps to feeling energized and motivated. Caseworkers who were interviewed also noted that in follow-up conversations with their clients after the program, jobseekers seemed transformed: they spoke more assertively, described themselves and their capabilities in more hopeful and positive ways, and were more diligent and proactive about moving toward the steps in their action plans. For many caseworkers, these were significant changes in clients they had been working with for years. Stories as simple as, “My client has sent me an email for the first time,” were shared as revelatory.
Participants, caseworkers, and facilitators viewed success as a set of visible indicators of progress that they termed a dramatic increase in “self-confidence.” Self-confidence was the word most often used to describe new behaviors, skills gains, and attitudes, and it was held up as a critical ingredient in clients’ ability to take on the challenge of further upskilling and to manage the fear that comes with changing careers.
A desire to link the program more deeply into the larger employment journey
Designed as a six-week program, Amplify was intended to be a quick and intensive on-ramp that would increase jobseekers’ Essential Skills enough to enable them to move on to further training or into sustained employment. However, at the end of the program, many jobseekers shared that their next step would primarily be to “continue to practice” the skills they were taught. While they felt mentally and emotionally ready to move forward, they still needed to develop their skills over a longer period than six weeks; for some, this program pushed them to learn how to use a computer for the first time. Yet there are few other structured programs in the WorkBC system for clients to continue improving their skills. Many jobseekers, particularly those with low skill levels, shared that they felt in limbo after the Amplify program ended. Individuals typically only have the option of continuing their learning online in self-guided modules or finding free public programs, such as those at a library. Because of their low skill levels, however, these jobseekers often lack the tools and capabilities to learn without structured guidance and support. As one jobseeker optimistically asked, “Is there an Amplify 2?”
The question of “what next?” after Amplify came up more often as the program evolved. Caseworkers increasingly shared how much they valued the facilitators’ feedback on a client’s performance to help them plan and determine next steps. Caseworkers felt that facilitators were a key resource for deeper learning about their clients: facilitators often spent more cumulative time with clients than caseworkers ever had, observing their skills and how they interacted in a training environment. Completing Amplify was perceived by both jobseekers and caseworkers to increase clarity on jobseekers’ future plans. Many jobseekers felt that gaining a greater sense of their skills and abilities should either validate a desired path forward or help them course-correct toward more practical options. However, those who did not come into the program with a strong idea of where to go next sometimes expressed feeling frustrated and ‘back at square one.’ There was a shared desire among clients and caseworkers alike for the Amplify program to tie in more closely to the overall process of career navigation, and for more interventions to be waiting at the end of the program to support the interstitial space between ‘getting started’ and ‘being ready to train or work.’
Systems-level pressures shape who is referred
As the research team followed the experiences of facilitators who taught successive cohorts, it became clear that the composition of each cohort played a big role in what could be taught and covered in each iteration of the program. One facilitator described the experience of teaching each day as feeling like “surgery,” constantly trying to cut and be precise about which parts to include. Many participants felt that the program’s design was stretched to serve everyone, leaving too little time for the learning to go as deep as they needed it to. Participants believed they could have gotten more out of a program that grouped them with more similar peers.
From the engagement with facilitators and caseworkers, the research team came to understand how and why this outcome was normalized. Because the Amplify program is delivered through the Province of British Columbia’s employment services system (WorkBC), the program inherits the system’s incentives and challenges. WorkBC currently operates on a pay-for-performance model in which employment service centers receive funding from the government based on the enrollment and completion of services. This creates pressure to serve enough clients per year and to fill cohorts of jobseekers at each delivery site in order to pay staff wages, which, facilitators shared, discourages sites from being selective about whom to admit to the program.
Further, eligibility limitations for other programs within the employment system meant that some clients were referred to Amplify for a variety of reasons beyond improving their skills. Most commonly, jobseekers seeking funding for further post-secondary education were referred to the program as a way for case managers to bolster the case for approving a client’s training package. Additionally, some jobseekers were not eligible for other programs within the employment services system (for example, immigrants to Canada who have obtained Canadian citizenship are ineligible for free language training programs) and so were enrolled even though Amplify might not have been the best fit for the skills they needed to focus on.
Finally, because past resources for Essential Skills programs in BC have been inconsistent, many caseworkers did not have a strong grasp of the function and role of Essential Skills programs. Many caseworkers thus perceived ‘Essential Skills’ as fuzzy and felt they needed more clarity on the program’s contents, support with explaining the program’s benefits, and time to make appropriate referrals. As a result, each cohort of the Amplify program reliably became a “catch-all” for jobseekers with a very wide range of skill levels and very diverse needs from the program.
The research team hosted a two-part virtual co-design session, which walked the implementation partners at Douglas College and the service delivery sites through the research insights, and solicited their input and ideas to validate the findings and address opportunity areas for program redesign.
It was important to immerse the implementation partners in the range of perspectives surfaced by the research in a short timeframe, and to provide easy and efficient ways for them to digest the breadth of insights gathered from jobseekers, caseworkers, and facilitators, in order to enable balanced decision-making about the program. During the research team’s presentation of the findings, audio clips from interviews were played to bring the findings to life, and quantitative data was woven in to give the narrative-based insights a sense of scale. The human focus of the presentation created a visceral connection to what otherwise would be words on a page, multiple-choice boxes on a survey, and assessment scores in a database. It gave a very literal ‘face’ to the people showing up at Amplify’s door, allowing them to state their own needs, fears, and hopes. It also provided a safe space for facilitators to speak candidly, and for us to share their stories free from concern or judgement.
To help the implementation partners quickly interpret and act on the findings, the research team created two design research artifacts: personas and journey maps. These became the anchors for the group: to see the program’s core users and understand more about them than their skill levels, to follow the thread from our insights to the overall Amplify experience, and finally to arrive at a clear set of opportunity areas for iteration. The journey maps summarized the experiences of each of the three user groups, enabling the team to quickly reference which pain points were shared and where experiences diverged. Further, the journey maps helped guide a conversation about the wider role of the program by situating the in-program experience within WorkBC-level on-ramps and off-ramps. Lastly, we developed a set of design principles and recommendations to provide scaffolding to kick-start the co-design process and guard rails to focus dialogue. As one facilitator of the program said, “I have to say I really appreciated the tools and steps that the team used. So often certain solutions are just top of mind, while certain stakeholders and priorities are completely overlooked. This was a highly detailed and rich set of activities and the way things were put together made it easy to follow, stay engaged and make meaningful contributions.”
Importantly, because our core partner was also a service delivery partner running two WorkBC centres, with a longstanding history of designing and delivering Essential Skills programs, our ethnographic insights and recommendations came into dialogue with existing beliefs and a clear priority of minimizing the work associated with implementing changes. The team was fortunate that the partners were open to exploration and dialogue, and the co-design sessions were structured as a sandbox in which the implementation team could focus on desirability and hear feedback. The experience generated some bold recommendations rooted in desirability, as well as some that were scaled down to respond to the limitations of the larger employment services system. The next section breaks down the changes to the program that were considered and ultimately chosen; the implementation team is currently in the process of implementing these changes to the program’s model.
Serving the people at our door
One major potential program redesign proposed by the team involved expanding the program’s duration and stratifying cohorts by skill level so that jobseekers would be more likely to learn alongside peers at a similar level. The research findings pointed to these changes being strongly desirable, as the vast majority of participants believed this would have enabled them to learn more. However, the anticipated challenges of implementing a longer, stratified program proved difficult to address.
In British Columbia, programs like Amplify are delivered by community-based service delivery partners whose staff salaries are funded per client who enrolls in and completes the program at their site. This means that service delivery sites typically operate on an extremely thin margin of available resources, and hiring multiple facilitators to deliver multiple concurrent cohorts for different skill levels would be financially risky. Likewise, expanding the program’s length and depth to meet the learning needs of more jobseekers would make it difficult to run enough cohorts each year to keep the program financially viable. Offering longer, stratified cohorts at every Amplify delivery site was simply not possible. Team members have agreed to revisit offering a longer or stratified model at two WorkBC offices that have greater capacity to experiment; however, the team decided to first evaluate the experience of delivering an in-person cohort in the fall of 2021 to see whether that improved instructors’ ability to give individual jobseekers’ learning needs more attention within a classroom.
Embracing the emotional and behavioural outcomes in the program
As a result of the research findings, quantitatively measured skills progression will no longer be the only way that success is formally documented in the program. It was important to the implementation team to optimize the assessment results where possible, such as by moving the assessment dates to times when jobseekers were less stressed overall and by improving certain sections of the curriculum. In tandem, our finding that the skills assessments alone do not give a complete view of what participants gain through the program has led to the exploration of alternative approaches to capturing jobseekers’ progress.
The largest shift is refocusing the curriculum’s delivery to bolster the psychological outcomes of learning that jobseekers experience. The implementation team is in the process of re-framing the curriculum around ‘milestones,’ which would make it easier for jobseekers to recognize the progress they make simply by participating in lessons. Moreover, the program is revising the structure of weekly one-on-one sessions with facilitators to encourage weekly goal planning: jobseekers set a goal for themselves that is slightly outside their comfort zone, which is then reviewed each week. The running list of weekly goals met creates an artifact that celebrates what they can do.
Further, materials are being developed to help facilitators communicate to caseworkers the breadth of observed improvements jobseekers made. Facilitators already comment on attendance and participation, and a greater emphasis will now be placed on qualitative changes in motivation, effort, and resilience. Some soft skills assessments already included qualitatively observed information, so these are being leveraged to provide a fuller picture of the importance of emotional and behavioral change. These changes are designed to intentionally recognize participants’ accomplishments beyond what the assessments alone could capture.
Plugging into the larger employment journey
As a result of the research findings, the implementation team is working to connect Amplify as much as possible to the stages and stakeholders of a client’s overall WorkBC experience. The team chose not to lean too far into weaving in career navigation supports, so as not to duplicate pre-existing WorkBC interventions, but instead built in moments in Amplify that tie in career goal setting, help facilitators stay aware of career navigation efforts that may have been completed before Amplify, and help plan for and direct clients to additional career navigation programming if needed.
Most significantly, design choices were made to the program’s marketing and training materials to more clearly communicate the value of the program to a client’s journey. This included a greater emphasis on what caseworkers could expect to see from clients as a result of their participation, as well as eligibility criteria that leveraged the personas developed in the research to help caseworkers understand which clients to refer. Additionally, it was suggested that other available WorkBC resources, such as career navigation workshops and counseling, should be utilized before enrollment in Amplify.
The program is also being revised to more systematically emphasize a warm hand-off back into the WorkBC system by introducing a mid-program touchpoint between caseworkers and facilitators, and by focusing end-of-program facilitator feedback on information that could help a caseworker determine whether an individual is ready for their planned next steps. For example, if an individual with a disability is working towards customized employment post-Amplify, post-program feedback would focus on clearly outlining the client’s strengths, capabilities, and potential, so it can more easily fold into the infographic resume that is a known output of the customized employment pathway.
This project uncovered a nuanced view of the process of skills development for adults and illuminated the ways in which learning, and preparing jobseekers with multiple barriers to participate in the economy of the future, is a largely emotional and human process. The research team was able to introduce changes to the program’s design and delivery that recognized the emotional dimensions of changing careers, and that validated the importance of incorporating soft skills and the role of confidence. Ultimately, the largest impact of the research was to iterate the program to maximize skills gains, in large part by focusing on the psychological process of preparing for change.
By deliberately engaging implementers and case managers, the research produced a deep appreciation of the complexity of the various design constraints within the Employment Services system. It was this last recognition that led to the group’s decision to shift the central focus of the project to ongoing research and iteration, rather than solely generating impact data on a pilot model.
Importantly, this shift also meant a shift in understanding the value for the Government of Canada as the funder. What emerged from our process was a reframe from handing over a stand-alone impact assessment of one model, to the wider utility of developing a set of insights on what works and doesn’t work for outcomes and implementation, and the need to more directly reckon with the constraints of current-state services.
Dismantling the barriers to Essential Skills progression, and ultimately to career navigation, is not a straightforward task. Career services and their outcomes cannot be easily parsed out from the complex forces and systems that affect who is able to get which job: the economy, our education system, our immigration system, and so on. However, career development interventions like Amplify that work to anticipate, acknowledge, and respond to this interconnectedness will likely have greater impacts, especially for people who face barriers within those broader systems. The policy ecosystem remains laser-focused on traditional quantitative evaluation as a funding requirement and locus of decision-making. Greater involvement from the ethnographic and design research community, to keep moving that ecosystem toward fuller evidence generation and toward testing and iteration as part of ‘learning,’ will be critical to supporting our ability to anticipate and respond to the changing labour market.
This project is funded by the Government of Canada’s Office of Skills for Success (formerly known as Office of Literacy and Essential Skills). The opinions and interpretations in this publication are those of the author and do not necessarily reflect those of the Government of Canada.
Berger, E. M., Koenig, G., Mueller, H., Schmidt, F., & Schunk, D. (2017). Self-Regulation Training and Job Search Effort: A Natural Field Experiment within an Active Labor Market Program. Johannes Gutenberg-Universität Mainz Working Paper, 1712.
Cassady, J. C., & Johnson, R. E. (2002). Cognitive test anxiety and academic performance. Contemporary educational psychology, 27(2), 270-295.
Conference Board of Canada (2020). The Future is Social and Emotional: Evolving Skills Needs in the 21st Century. Impact Paper. Retrieved from: https://www.conferenceboard.ca/temp/a421bb9b-42ce-44f1-b88d-cea09a918302/24357_10628_FSC_SES_Impact_Paper_EN.pdf
Heckman, J. J., & Kautz, T. (2012). Hard evidence on soft skills. Labour economics, 19(4), 451-464.
Lamb, C. (2016). The Talented Mr. Robot: The impact of automation on Canada’s workforce. Brookfield Institute. Retrieved from: https://brookfieldinstitute.ca/wp-content/uploads/TalentedMrRobot_BIIE-1.pdf
Lu, Y., & Sireci, S. G. (2007). Validity issues in test speededness. Educational Measurement: Issues and Practice, 26, 29-37.
NESTA (2018). Using research evidence: A practice guide. Retrieved from: https://media.nesta.org.uk/documents/Using_Research_Evidence_for_Success_-_A_Practice_Guide.pdf
Organisation for Economic Co-operation and Development (2017). Embracing Innovation in Government. Retrieved from: https://www.oecd.org/gov/innovative-government/embracing-innovation-in-government.pdf
Organisation for Economic Co-operation and Development (2020). Increasing Adult Learning Participation: Learning from Successful Reforms. Retrieved from: https://www.oecd-ilibrary.org/sites/cf5d9c21-en/1/1/1/index.html?itemId=/content/publication/cf5d9c21-en&_csp_=e0d124d9c63cfa3f39610aff50d62d5d&itemIGO=oecd&itemContentType=book
Pearce, W., & Raman, S. (2014). The new randomised controlled trials (RCT) movement in public policy: Challenges of epistemic governance. Policy Sciences, 47, 387-402.
Rudolph, C. W., Lavigne, K. N., & Zacher, H. (2017). Career adaptability: A meta-analysis of relationships with measures of adaptivity, adapting responses, and adaptation results. Journal of Vocational Behavior, 98, 17-34.