
STAND Where You Live: Activating Civic Renewal by Socially Constructing Big Ethno



Cite this article:

Ethnographic Praxis in Industry Conference Proceedings 2012, pp. 132–148. https://epicpeople.org/stand-where-you-live-activating-civic-renewal-by-socially-constructing-big-ethno/

This paper explains how STAND Chattanooga became the world’s largest community visioning process in 2009. Behind its public success, the authors relate the underlying ‘research story’ of how 26,263 viewpoints were achieved by changing course in midstream and adopting more ethnographic methods of survey collection. For an EPIC audience, we analyze STAND’s ultimately successful outcomes as a case of following the logic of ‘social fields’ (however unintentionally). The paper furthermore argues that STAND is a paradigm example of the way ethnographic principles can be deployed at various scales to accomplish goals (such as community renewal) outside the reach of most ‘Big Data’ analytics.

PROLOGUE

If we are living in an “Age of Analytics”, as some EPIC commentators dub the current research scene (Slobin & Cherkasky, 2010),1 must ethnography be content with a supporting role? These authors’ telling accounts “of analytics overshadowing ethnography” on various digital marketing projects (2010:195) led them to conclude the most productive path forward was one of ‘constructive engagement’ (our term), in which ethnographers secure a place on multi-disciplinary teams, then carve out a meaningful share of a project’s interpretive role in “partnership with data strategists” (2010:198).

Despite the necessity of engaging, we have to ask whether this partnership for sharing in “consumer understanding” will run so smoothly when (as Slobin and Cherkasky note) many clients are motivated to give analytics the whole ball of wax. Extrapolating to the future, can we anticipate a growing cadre of ethnographic practitioners (awed by the scale of “Big Data”) reframing our craft as that of niche specialists whose main role is providing cameos of “illustrative faces” or “contextual richness” so the invisible masses that populate customer databases can speak to their companies in a human voice?

This is a seemingly far-fetched, even Orwellian, vision that surely would amount to a ‘Handmaid’s Tale’ for our discipline (Atwood, 1998). However, like many so-called ‘Visions of the Future’, it is already being partially fulfilled. Whether this scenario will follow a path to dominance is as yet unknown. To explore such questions, foresighting teams look for ‘weak signals’, ‘anomalies’, or ‘reversals’ in prevalent trends to find signs that an existing paradigm might be losing its hold (or an emergent one taking shape).

Consider the story we tell here, then, as just such an anomaly (and possibly a strategic ‘pointer’) within our “Age of Analytics”. For this is a tale in which ethnographic principles lead, not follow, in the creation of some decidedly ‘Big’ data for the city of Chattanooga. As we describe how Chattanoogans pursued renewal as a city, ethnographic practitioners may find a sense of renewal of their own.

Framing

Our focus here builds on investigations from some of the authors’ previous EPIC papers in the following areas: how research can ground successful ‘community visioning’ (Miller & Jones, 2011); the importance of embodied group experience for motivating grass roots activism (Jones, 2005); and how ethnography succeeds or fails through continual ‘attention’ and ‘attunement’ to participants in the field (Jones, 2010). Knowledge and theoretical concepts from this previous work helped us understand the ways STAND accomplished its landmark achievement.

DIRTY OLD TOWN / NEW TRADITION

In 1984, fifteen full years after Walter Cronkite described Chattanooga, Tennessee as “the dirtiest city in America” (when its downtown had reached an undeniable state of decline), a handful of civic leaders formed ‘Chattanooga Venture’. After a short period of public consultation, this initiative launched ‘Vision2000’, selecting 40 goals for the city to achieve by the start of the Millennium. The regeneration targets fell under the categories of Places, People, Work, Play and Government, and involved initiatives ranging from improving the livability of downtown Chattanooga and solving air, water, and toxic waste problems to creating after-school programs. By the year 2000, many of these goals had been realized. Chattanooga had even been able to make the label “The Scenic City” stick, and in 2008 it was named one of the ‘Best places to live in the US’ by Outside Magazine.

In July of that same year, Volkswagen chose Chattanooga as the site of its first US manufacturing plant in 20 years (after a hard-fought battle with rival Southern cities). This decision was projected to bring an investment of $1 billion to the local economy. Immediately following this announcement, Chattanooga’s Mayor called on the city to continue, even intensify, its regeneration efforts (to better welcome the new arrivals). In characteristic Chattanooga fashion, the Mayor first made his appeal at a Rotary Club meeting. Later the same month his call was answered when STAND was formed by a diverse group of citizens, corporations, and non-profit organizations. Its goal was to build on Vision2000’s initiatives of the previous twenty-five years, but whereas the earlier civic regeneration efforts had sought limited public input (through large town hall meetings, which 2,000 or so people attended), STAND decided first to create a wider shared view of where to take Chattanooga’s future.

CreateHere (a nonprofit with experience redeveloping downtown Chattanooga) was charged with providing organizational support and a team for running STAND. It was instrumental in catalyzing the more publicly consultative or ‘inclusive’ approach to civic regeneration that emerged. Hence, the cornerstone of STAND’s ‘community visioning’ process was a survey to solicit citizens’ input on preferred futures for the region. STAND created a four-question questionnaire and set itself the ambitious goal of collecting 25,000 responses. This number was meaningful because it was close to 10% of the population (and because it would top the city of Calgary’s previous high of 19,000 viewpoints – a target the organizers thought was achievable given Chattanoogans’ strong enthusiasm for their city). The rationale for wanting high levels of participation was to identify directions for improving Chattanooga that the whole region could embrace (because a sizable number of its citizens had helped shape the direction themselves).

Launching STAND

The plans for conducting the survey were straightforward, if audacious. The STAND initiative would achieve high levels of ‘public’ awareness by way of high-visibility branding throughout the city. STAND would be marketed as a kind of movement that anyone around Chattanooga could contribute to. The survey’s four questions would appear on billboards, in print, and in various other media to generate interest, and in turn drive people to complete the questionnaire online. In addition to this core web channel, the plan was for STAND volunteers to staff stalls (and circulate through the crowds) at public events (like festivals, concerts, or open-air movies) for the duration of the survey’s five-month span. The third channel was a handful of large employers who promised to circulate the questionnaire among their staffs (via their corporate intranets). Through these means the number of completed questionnaires would spiral upwards until STAND reached its goal of 25,000 respondents.

THE FLIP-FLOP

However, research (as we all know) rarely runs exactly as planned. Three months into the survey’s five-month window, STAND had achieved 7,500 completions and the response rate was slowing. As co-author (and STAND researcher) Bijan Dhanani puts it, “The online survey just plateaued at that number then stopped”. Likewise the public events – where STAND volunteers offered people surveys – were not frequent enough (or netting sufficient responses) to reach the quota. And the returns from the corporate intranets were described as “abysmal”: employees had learned to ignore the STAND intranet questionnaire the way they ignored most things there. In writing this paper we carried out interviews with the co-founders of CreateHere, Helen Johnson and Josh McManus (as well as other CreateHere members), probing them about the little-known near-crisis that underlay STAND’s public acclaim. It became clear these organizers had made a concentrated effort to identify the factors causing the survey shortfall (and to correct course while there was still time).

According to Helen, there were two problems with STAND’s survey collection methodology. First, they were seeking a “broad spectrum of responses” from a wide demographic; and second, “We never expected it would be so hard to get to more than 7,500 completions with the online component!” The STAND survey had almost saturation coverage across local media, its own website, and a presence on Facebook and Twitter (the social media platforms with the world’s greatest ‘reach’). Yet, Josh added, “There was a week of NO returns and people got worried that the entire initiative would stall out”.

Summit

Yet the 25,000 target was a commitment they had made to the entire city; “We knew we had to get there,” explained Helen. So when the responses trickled to nothing, they called a mini-retreat of the CreateHere Board, Staff, and Fellows. One of the first decisions made was to dedicate more time and “people power” to the STAND initiative. Fellows and Staff working on other CreateHere projects were brought into the meetings, then seconded, or fully allocated, to STAND. One of these was Katherine Currin. Helen recalls her saying at the retreat, “We need to bring some practicality to this”. Katherine then got on the Chattanooga.gov ‘calendar of events’ and started looking for any events and meetings that would draw 25 people or more. She calculated that “We need this many completed surveys every day to reach the goal. So go out and get those surveys, and don’t come back until you do!”

Katherine remembers this turning point herself: “There were 15 to 20 fellows there, I realized if we all got ten completed surveys a day, we could hit our goal! Breaking it down to what it would take on an individual level enabled people to see that (reaching the quota) was possible”. Katherine went on to be a co-director of STAND alongside Sarah Lester (who had been very successful in getting businesses to support the initiative early on). The co-director leadership model for STAND mirrored that of CreateHere itself; and it is one Josh and Helen now advocate for community development work whenever possible.
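To make the arithmetic behind this turning point concrete, here is a minimal back-of-envelope sketch in Python. The 25,000 goal, the 7,500 completions at the three-month mark, the “15 to 20 fellows”, and the ten-surveys-a-day target all come from the account above; the assumption that roughly 60 collection days remained is ours, added only for illustration.

```python
# Rough arithmetic behind the "break it down to an individual level" moment.
# Goal, completions to date, fellow count, and the ten-a-day target come from
# the account above; the ~60 remaining collection days are our assumption.

goal, collected = 25_000, 7_500
days_remaining = 60                       # assumption: roughly two months left

needed = goal - collected                 # 17,500 surveys still to gather
needed_per_day = needed / days_remaining  # ~292 per day across the whole effort

fellows, per_fellow_per_day = 20, 10      # "15 to 20 fellows", ten surveys each
fellows_per_day = fellows * per_fellow_per_day

print(f"Still needed per day: {needed_per_day:.0f}")
print(f"Fellows alone at ten a day: {fellows_per_day}")
print(f"Left for field teams, volunteers, and supporters: "
      f"{needed_per_day - fellows_per_day:.0f} per day")
```

The point of Katherine’s framing, of course, was less the exact quotient than making each person’s share of the total feel achievable; as described below, the remaining daily gap was covered by field organizer teams, volunteers, and individual supporters.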

Outreach Strategy

“At this point people began to own the process,” says Helen. There was a change in collection methods; “Everything got tied to a time line”, but the more important shift was acknowledging that “We needed to take (the survey) to where the people were accessible”. This shift is referred to by some CreateHere Fellows as “the flip-flop”. Josh describes the new collection process like this:

We got extremely ‘creative’ in outreach. It was during the summer. We went everywhere there were more than 25 people gathered together: churches, concerts, neighborhood meetings…Every morning we scoured the paper to see what events were going on, that we could send people out to, so we could get to as many diverse pockets of the population as possible: retirement homes, soccer matches where the Hispanic community were involved, I went to a motorcycle rally once…

In fact the pre- and post-flip-flop survey processes were different enough (in terms of three important variables) that you could almost call them STAND Phase 1 and STAND Phase 2. The first was more like a traditional marketing campaign (albeit an extremely well-designed and well-branded one, especially for a community initiative). For all that, however, it was still trying to ‘convince’ a large population of people (based on limited messaging or information) to do something. In this it echoed the well-known AIDA model (Awareness-Interest-Desire-Action) for selling to (or influencing) people by moving them through these four states. Furthermore, even though there were many channels aiming at raising awareness and interest about the survey (from magazine ads, to yard signs, to coffee sleeves), unless you happened to bump into a STAND volunteer at an event, for most people online was the only channel through which to actually fill out the survey (at STAND’s website, or via Facebook and Twitter). Phase 2, on the other hand, was dominated by another channel, and is even called the period of “face-to-face canvassing”. As Josh and Helen described above, the whole recruiting mechanism or ‘conversion strategy’ for STAND had changed: from bringing the people to the survey before, to bringing the survey to the people after (to the places, occasions, or ‘habitual’ contexts where they lived, worked, played). These changes also brought about a third shift in effective targeting: from targeting the whole city of Chattanooga with the idea of the survey (via a marketing campaign) in Phase 1, to targeting groups and organizations one by one to complete the survey (via the logic of ‘social fields’), thereby indirectly achieving a study reflective of the whole city.

We have described these shifts in methodology (for a large-scale community visioning study) as involving a move toward ‘context’, toward ‘face-to-face’ research, and as ‘bringing the survey to the people’. Readers should therefore be able to guess what type of approach it now resembled. The ‘flip-flop’ summit explains how the “ethnographizing” of STAND happened from an organizational/managerial perspective.

THE GROUND GAME

But there are always multiple layers to any event or history. The mini-retreat had delivered a vivid new game plan, but now it had to be executed every day until the quota was reached. So how did STAND actually achieve its record response rate? The project had canvassers staffed from the beginning, but in small numbers (for secondary event collection, “to clean up around the edges,” as Josh put it). Now more CreateHere fellows would be working full time on STAND (many as canvassers), along with a number of dedicated volunteers. Also, since the impetus from the retreat made it clear canvassing was going to save the project, we can surmise the status of this role within the non-profit rose. Josh and Helen describe how at this time they shifted budget away from additional web marketing work to hiring and funding more fieldworkers. In STAND’s last month the original allocation of five ‘field organizers’, each with a team of 3-7 canvassers under them (depending on the occasion), doubled to ten field organizers busy with outreach.

Trial & Error

For writing this paper, we also carried out interviews with a selection of STAND canvassers, including co-author Bijan Dhanani, who served in this role himself. ‘Canvasser’ of course is just another word for a field researcher who works in a quantitative survey context (often for non-profits or political campaigns). Their job is to get respondents to answer a questionnaire. Canvassers usually carry clipboards (an emblem that alerts passersby to quickly head in the opposite direction). So it is perhaps an illuminating index of the success STAND eventually achieved that Bijan (who soon assumed a leadership role for STAND’s fieldwork) drew on past experiences of being a target of charity canvassers – and decided to do otherwise in his work for STAND. He recalled how past canvassers would stand in his way, use a cheesy line, and how their whole focus was to persuade him to say ‘Yes, I’ll do your survey’. Instead (after trial-and-mostly-error which brought back these memories), Bijan decided he would act less like a canvasser and more like a fellow citizen who wanted to talk to people about the future of Chattanooga. The STAND questionnaire itself facilitated this approach; it was a model of simplicity, consisting solely of four open-ended questions:

QUESTION 1 WHAT DO YOU LIKE ABOUT THE CHATTANOOGA REGION?
QUESTION 2 IMAGINE THE BEST POSSIBLE CHATTANOOGA REGION. DESCRIBE IT.
QUESTION 3 WHAT CHALLENGES MUST BE ADDRESSED?
QUESTION 4 WHAT ACTIONS, BIG OR SMALL, CAN YOU TAKE TO HELP?

Hence Bijan found that by simply adding a few words, and omitting any procedural tone (“first question, second question, next…”), he could ‘conduct a survey’ that was just like “having a conversation” (or at least felt that way to the respondent). (This hybrid style of interview is of course a hallmark of ethnography (Spradley, 1973), distinguishing it from positivistic forms of research.) Bijan said he went even further to ensure this feeling by trying hard to maintain eye contact, not looking down at the survey form, and copying down the respondent’s answers so they could focus on what they were thinking and saying. Josh (who while co-directing CreateHere did a fair share of STAND canvassing) independently echoed Bijan’s approach, saying, “The best experience was when I wrote for them – it made it easy and more conversational. That jogged things for them”. We do not know how standardized the research methods were across the STAND canvassers, but having confirmation of such techniques from separate interviews, at separate times and places, with two leaders of the canvassing, we believe there was a convergence on the methods that proved to work (for generating both more responses and better responses).

Further evidence (of the shared adoption of successful research practices) came from another canvasser, Blair Waddell. She said the canvassing lead-in for STAND that she eventually arrived at (for potential survey-takers) was, “We want to know what you want for the future of Chattanooga”. She learned quickly that the very word ‘survey’ was a “turn off” that made people freeze up. So as an alternative, she would follow that opening with, “I got just 4 questions for you…” and then she would begin talking through the questionnaire (above) as if it were she asking the questions, not the survey.

Some of the original canvassers had been learning these lessons even before the flip-flop. Another prime lesson was that if you simply ‘handed out’ the survey form, most of the forms never came back. Which is to say they learned the most successful form of face-to-face canvassing was not simply ‘distributing’ the survey, but ‘performing’ it in a dialogue with the respondent. Both Blair and Bijan talked about their role as facilitating respondents to better articulate their feelings and thoughts (not merely as getting them to ‘give answers’). Since these were open-ended questions, people hardly produced binary answers. And these particular short, simple questions opened up a host of issues about place (and people’s current and potential lives in a place) that could not always be easily captured or ‘processed’ (emotionally).

‘Accelerated Praxis’

So the ‘ground game’ we are describing here (added to the managerial account above) explains how the “ethnographizing” of STAND happened at the level of practice and execution. The STAND canvassers had in effect (through an intense period of experimentation and ‘accelerated learning’) managed to ethnographize their collection techniques to meet the survey’s high quotas for sample size and widespread community involvement. And this is all the more remarkable for the fact that none of these canvassers were trained researchers (and none of them were aware of the techniques of ethnographic research).

Yet the research learning process that occurred on STAND is descriptively similar to an approach mentioned in one author’s earlier EPIC paper, which locates researchers’ motivations for growth in ethnographic practice within an individual’s experience of the never fully fulfilled potential inherent in concrete research encounters. Accordingly, each of these canvassers was learning how best to conduct a survey in ‘attunement’ with the respondents they were facing in context (Jones, 2010:255-56). They were ‘attending to’/‘observing’ relevant critical variables, such as body language, level of interest or engagement, and whether this potential respondent versus that one agreed to talk (or was just taking a survey form as a way not to talk) in reaction to one version of their approach line (and questions) versus another. The STAND canvassers were then ‘adjusting’ their survey performance over sometimes hundreds of iterations a day. For professional ethnographers, this is a fascinating case study that queries the very nature of our praxis – for it suggests that non-researchers with a deep pragmatic motivation, but no theoretical basis, can arrive at something very close to ‘doing ethnography’ (especially evincing its more empathic or dialogical aspects) through concentrated field engagement.

‘LeadHere’

It appears that what these canvassers did not have in terms of theory (or training) was compensated for in volume of trial and error with people. But what was “learned” then got fed back into their practice so rapidly that maybe we should call what happened with them ‘accelerated praxis’ (instead of accelerated learning). We are not sure if their experiences ever got translated into propositional knowledge that could easily be verbally shared. The slogan “turning canvassing into conversations” was actually in STAND’s ‘Field Strategy’ document. But we also know there was no explicit, preparatory ‘canvassing training’ provided by CreateHere for STAND canvassers, so we expect realizing this ideal in practice, like most of their skill development, happened via ‘tacit’ or ‘apprenticeship’ learning. It was rooted in each researcher filtering their own canvassing attempts through what they saw their teammates doing (and having success with) around them (Lave & Wenger, 1991). This type of ‘experiential learning’ seemed to be part of the non-profit’s management model called ‘LeadHere’. It mandated learning while doing, and that those in intermediate leadership roles (closer to the work than HQ) should take responsibility for spreading the new skills their people needed. This might seem to put the burden for any canvassing training on the five (and later ten) ‘field organizers’. However, we believe sharing best practice for conducting the survey worked more like an “ad-hocracy”, flowing in both directions, up and down from field organizers (based on whoever’s numbers showed they were bringing in the surveys). In this way Bijan, starting on STAND as a line canvasser, became a field organizer himself. We also know some of these ‘new researchers’ (even though never formally trained themselves) did later give more explicit canvassing “crash courses” to those people called ‘individual supporters’ (more below) who were spreading or conducting surveys (even though not STAND staff or ‘official volunteers’). So there was most likely also a “zone of proximal development” in operation that especially deepened the abilities of such trainer-canvassers (Vygotsky, 1978).

In addition to these social-cognitive processes, there were also some overlaying organizational mechanisms that helped account for the STAND team’s productivity. Katherine Currin had said “Breaking it down to what it would take on an individual level enabled people to see that (reaching the quota) was possible”. This had helped pull team members out of their fear of failure and get the project back on track. But keeping the survey accounting “broken down to an individual level” also helps explain how the project stayed on track, and the response numbers kept rising. Canvassers worked in small teams of 2 to 7, so each canvasser knew how many surveys he or she was bringing in each day. Bijan relates there was a state of ‘healthy competition’ between canvassers about their numbers, which he says was “more like ‘pride’ in how well you were doing to help STAND” reach its goal. Once it was realized that many respondents would only do a survey if you ‘talked them through it’ face-to-face, a positive feedback loop was likely created by the fact that giving a survey this way was the only sure way a questionnaire would count toward ‘your’ total (vs. just handing them out). The CreateHere HQ also maintained a STAND “survey counter” which ticked up the response numbers each day, so canvassers could be aware of the progress the entire initiative was making towards its overall goal (and where their own personal contribution fit within this).
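As a way of picturing the accounting mechanism just described, the sketch below models individual tallies rolling up into the HQ “survey counter”. It is purely illustrative: our sources do not describe how CreateHere actually recorded its counts, and every name and number here is hypothetical.

```python
# Hypothetical sketch of per-canvasser tallies feeding a project-wide counter.
# Data structures, names, and figures are invented for illustration only.
from collections import Counter

GOAL = 25_000
running_total = 7_500       # completions already banked before the flip-flop
daily_tally = Counter()     # surveys credited to each canvasser today

def credit_surveys(canvasser: str, completed: int) -> None:
    """Add face-to-face completions to the canvasser's tally and the HQ counter."""
    global running_total
    daily_tally[canvasser] += completed
    running_total += completed

credit_surveys("canvasser_a", 12)
credit_surveys("canvasser_b", 9)

print("Today's individual tallies:", daily_tally.most_common())
print(f"Initiative progress: {running_total:,}/{GOAL:,} ({running_total / GOAL:.0%})")
```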

Mission Accomplished

Through the fieldwork processes described here, we know the shift initiated by the STAND flip-flop paid off. The project exceeded its overall target of 25,000 responses by over a thousand. Ultimately, the success of STAND can also be measured in terms of pure financial management. With a budget of $450,000 STAND achieved 26,263 responses, making it the world’s largest community visioning survey to date. Compare this to the former record holder, Calgary, Alberta (whose team members had actually been advisors to STAND). That city used a larger budget of $2.5 million to achieve 19,000 visioning survey responses (even while it had a larger population in its catchment area). This comparison raises fascinating questions about the “cost effectiveness” (contrary to popular opinion) of ethnographic, or face-to-face, research for this type of project (and therefore others). Here is a firm case of the STAND team, led by CreateHere, achieving more with less.
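The “more with less” claim can be made concrete with a quick cost-per-response calculation. The budgets and response counts below are the figures reported above; dividing one by the other is of course only a rough gauge of cost effectiveness, since the two projects differed in population, scope, and timing.

```python
# Cost per response, using the budget and response figures cited above.
stand_budget, stand_responses = 450_000, 26_263        # STAND Chattanooga
calgary_budget, calgary_responses = 2_500_000, 19_000  # previous record holder

stand_cost = stand_budget / stand_responses        # ~$17 per viewpoint
calgary_cost = calgary_budget / calgary_responses  # ~$132 per viewpoint

print(f"STAND:   ${stand_cost:,.2f} per response")
print(f"Calgary: ${calgary_cost:,.2f} per response")
print(f"Roughly {calgary_cost / stand_cost:.1f}x more cost-effective")
```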

Ethnographic, face-to-face, or contextual surveying techniques were not ‘part of the problem’ when this initiative neared crisis; they were quite literally the way out of it. In fact, the ethnographic canvassing approach helped make STAND successful along three key dimensions: achieving and exceeding its target sample; achieving the desired ‘diversity’ of sample (by going into neighborhoods whose ‘demographics’ had not come to the survey themselves – and in many cases facilitating non-literate respondents to dictate their answers to canvassers); and finally (as we now know) completing the project within budget. So more data, and better data, for less money. The STAND model has already set the benchmark for scale and breadth in community visioning. It is very likely also to set the benchmark for methodology and sound management, once interested parties realize the first two attributes were a consequence of the second two.

Consequently, in the project’s final channel accounting, 80% of STAND’s record 26,263 responses were collected via face-to-face canvassing, compared to the 20% of responses that were self-completed online. What this means is that these data points were quite literally ‘socially constructed’ (through the kind of “encounters, situations, experiences within various groups to which each individual belongs” which alternative sociologists like Maffesoli (1996:88) believe constitute the true experiential basis for whatever sense of ‘Society’ still remains). And these encounters between canvassers and citizens, or citizen to citizen, occurred on occasions or events that STAND either set up, or ‘joined’, as part of its outreach to a plethora of ‘social fields’. Therefore, this community visioning process is a true (not merely metaphorical) example of the social construction of a set of ‘Big Data’ for the city of Chattanooga. So it is probably more precise to call it ‘Big Ethno’, to reference the way it was collected and created. STAND’s survey data were socially constructed all the way down, even to the fact that a majority of its responses were hand coded from thousands of paper survey forms which canvassers and respondents scrawled on while engaged in live conversations in Chattanooga neighborhoods, offices, churches, parks (and countless other spaces) within the city and surrounding counties.

Josh McManus, co-director of CreateHere, reflected on this unexpected outcome during the current heyday of online research (and ‘Age of Analytics’):

It’s not that Chattanooga wasn’t a ‘tech savvy’ community, but it was much more effective for us to interact with people face-to-face. You need the human interaction to connect with people, so they know someone really cares and that their opinion actually matters.

This statement gives a good outline of the driving influences at work, and further explanation is furnished by considering the nature of ‘social fields.’

ENGAGING THE LOGIC OF SOCIAL FIELDS

As a result of STAND’s shift in methodology, canvassers started to visit hundreds of group meetings, where they soon learned to approach survey collection through the prevailing logic of the ‘social field’ each group of people belonged to.

In advanced societies, people do not face an undifferentiated social space. The various spheres of life; art, science, religion, law, economy, politics, and so on, tend to form distinct microcosms endowed with their own rules, regularities, and forms of authority… (Wacquant, 1998)

A ‘field’ is a patterned system of objective forces much in the manner of a magnetic field… (Bourdieu and Wacquant, 1992)

The French sociologist and anthropologist Pierre Bourdieu arrived at the notion of the ‘social field’ as a flexible way to describe the balance between the structuring forces within different subcultures (or ‘forms of life’) and the relative autonomy individuals had in deciding to follow or resist the ‘objective forces’ within them. By using the metaphor of a ‘field’ he also wanted to imply that there was always a sense of ‘play’ within a field (as in a sports field), but that this play was constrained by certain ‘rules of the game’ and never took place willy-nilly. Yet Bourdieu simultaneously deploys the analogy of magnetic fields (and the forces of attraction and repulsion they generate) to dramatize the influences actors within a field are subject to. With this analogy, he was drawing on an older conceptualization from Kurt Lewin (1951) based on field theory in physics.

The most relevant concepts here (which illuminate what happened on STAND) come from Bourdieu’s contention that fields structure the action of those within them by imposing on their players: 1) a ‘logic’ (about the way things work ‘within’ the field), and 2) ‘stakes’ or interests ‘within’ the field (which they seek to maintain or grow – in part by working with forces ‘outside’ the field). The key point for STAND is that before the flip-flop it was taking advantage of very few of these ‘forces of attraction’ (within the many subcultures or microcosms of Chattanooga) to promote the survey, because the Phase 1 marketing strategy was focused on targeting the whole city.

That decision was based on the strategic judgment, according to Josh, that “this was the widest expected channel” from ‘general public’ to website survey. We should always remember that most research projects (especially large ones), whether commercial or non-profit (as this one was), are shaped as much by management decisions (starting with budget size) as by purely methodological ones. It seemed an efficient, even elegant, strategy (similar to Calgary’s) to try to sell the concept of the STAND process city-wide, then wait for people to stream to the website to complete the questionnaire. And, as we said before, the STAND branding was exceptional; the billboard and print ad communications were visually arresting and carried witty slogans like “Will another visioning process really make a difference?”

Despite this, as Mr. McLuhan opined so long ago, the mass medium became the message; in terms of creating a reason to actually complete the survey, all the executions distilled down to the same core proposition, something akin to: ‘As a Chattanoogan you should want to help Chattanooga, so take this survey to be a good citizen’. As we have already described, the STAND organizers based their 25,000 target on their belief that Chattanoogans had a very high level of civic enthusiasm, so this message was aligned to that belief. But the fact that the Phase 1 communication approach did not translate such enthusiasm into sufficient numbers filling out the online questionnaire does not necessarily mean the enthusiasm is not characteristic of the city. This response only shows that the kind of messaging you can do at this level (however good the copy) is always going to be ‘untailored’ or “one size fits all” for motivating people, compared to the kind of enthusiasm you can unlock when speaking directly to a single group.

So after the new community outreach plan went into effect, not only did STAND canvassers come into the physical space of disparate ‘social fields’ (from Chattanooga dog show people, to the Ruritan rodeo cowboys, to suburban elementary school moms and dads, urban churchgoers, and inner city street party rappers), but each social field was spoken to with a message and language that appealed, through the group, to its shared stake in the city. The field was thus the prime intermediary to civic concerns about Chattanooga, not vice versa. The STAND organizers had learned that even though different groups nest within the same city, they could not all be motivated by the same logic. So during face-to-face canvassing the STAND appeal was adapted to the ‘logic’ and ‘stakes’ within each group.

For example, with the Lions Club (an explicit community service organization) the appeal to the ‘logic’ of their organization was closest to the generic message above – that it was their duty as Lions Club members to help the city shape its planning through this survey; their ‘stake’ was that they did not want to see their organization’s share of influence in civic affairs reduced. A similar ‘logic’ worked for the college sorority women (at the University of Tennessee-Chattanooga) who, as a rule of their club, have a monthly service requirement each member must perform. So when connected through a friend of a friend working for STAND, every girl in a sixty-person sorority was asked to get ten contacts to fill out the survey (thereby fulfilling that month’s service obligation).

From a very different demographic, a downtown African-American Baptist church was responsible for contributing several thousand surveys. The motivating ‘logic’ that (like magnetism) pulled the surveys into these churchgoers’ hands (and later into STAND’s) was that the highly esteemed elder preacher of the church stopped his services, asked his congregation to fill out the survey, and gave them time (during church) to complete their questionnaires. The ‘stake’ he invoked was equally persuasive: that their area of town needed improvement, and it would come sooner if the voice of their community was represented. In contrast, when STAND fellows worked with an outlying Methodist church, the survey questionnaire was simply inserted in the church bulletin (not becoming part of the service) and resulted in a much lower response rate.

One of the most frequent outreach targets was homeowner associations, or HOAs. (These were all listed by neighborhood in the Chattanooga.gov database mentioned earlier.) A field organizer team would travel out to these regularly scheduled meetings (usually of ten or so people, led by elected HOA officers) and describe the STAND initiative and its objectives. Then the canvassers would ‘work the room’, doing full survey interviews with some, handing out questionnaires to others, and being on hand to explain the questions or talk these respondents through the survey. The consistent ‘logic’ of these events was that the HOA officers represented their subdivisions; and the ‘stake’ for HOA officers and residents alike was that their neighborhood, in their zip code, needed to be heard from.

Ultimately, through this process of face-to-face outreach, STAND collaborated with many ‘social fields’ and sub-groups – including 40 businesses, 20 religious organizations, and 48 nonprofit interest groups – to achieve its record number of responses. Perhaps most effectively of all, this strategy motivated 239 individual supporters. ‘Individual supporters’ is the STAND team’s word for people who attended one of these outreach events, then decided to become a survey distributor (or also a survey collector) within their own social world. This is probably the strongest example of the magnetism of ‘social fields’ pulling survey forms into the hands of new respondents. It happened, for instance, when an officer of a homeowner association (within one ‘social field’) realized he or she could extend it to another overlapping (or distinct) field, offering, as Bijan recalled it, “Give me some questionnaires and I will take them to my church and racquet club too”.

Likewise some HOA officers would take sheaves of questionnaire forms away from their HOA meetings (which only the most concerned subdivision residents usually attend) and take them door to door for residents to complete. The ‘logic’ and ‘stake’ motivating the HOA officer here was, in effect, ‘I am the conduit between the city authorities and my subdivision; this is my natural role’ (and of course by enacting it they were at least maintaining, and probably growing, their authority within the subdivision community).

Similarly, many of the college sorority women who had fulfilled their monthly service requirement earlier followed the ‘forces of attraction’ back to STAND to become canvassers at a large street festival. Their ‘stake’ in performing this ‘service’ was again to fulfill their organizational role (even more so), but also to increase the visibility of their sorority, take part in a fun street fair with a ‘public’ role in it, and hopefully to score one of the by-then sought-after bright yellow STAND T-shirts (which only volunteer canvassers who netted a sizable number of completed surveys were awarded). In cases like these the ‘logic’ of each ‘social field’ helped multiply the labor force of canvassers conducting the survey – making it more likely STAND would achieve its goal. Without STAND Phase 2’s outreach strategy, and the way it managed to harness the logics of (and thereby find a place within) a wide array of ‘social fields’, it is uncertain whether any of these people would even have filled out a single survey themselves.

Survey Collection as Service Design

The distinction between STAND Phase 1 and Phase 2 may have been slightly overdrawn in this paper – but only to the extent that the winning ground game of canvassing continued to profit from the marketing ‘mind share’ generated by STAND’s advertising campaign (which in any event ran throughout the entire survey collection period). It helped organizers like Katherine Currin in booking outreach meetings (when the gatekeepers at neighborhood associations, churches, or retirement homes already knew what STAND was), as well as canvassers in the field, who found people had heard of it too. But it is still a certainty that face-to-face canvassing was (in the language of service design) the crucial ‘touch point’ that enabled the STAND visioning process to achieve its goals; and furthermore it was by greatly expanding the role of this touch point (in comparison to others) that this effort prevailed. The affective-emotional motivators – feeling that your opinions really matter, and that you are ‘connected’ to others who really want to hear them (as Josh describes) – made the greatest difference; but this ‘effect’ of the outreach events was combined with some simple features of the events themselves (as a touch point) that made these “happenings” advantageous for the goal at hand (securing completed questionnaires).

It is a well-known tenet of user-centered design that at every stage, step, or click of a process you lose some percentage of ‘users’ who are not pulled over that threshold (or obstacle). Basically, the outreach meetings (mentioned above) collapsed all the stages of the AIDA model (Awareness-Interest-Desire-Action) into one event. You were hearing about STAND (possibly for the first time, but surely in more detail than ever before), hearing why it mattered for your ‘social field’ as well as the whole city, and then, without skipping a beat, you were completing the survey in conversation with a canvasser (or those around you). There was little opportunity to ‘drop out’ of this process. In fact, the outreach sessions STAND set up in neighborhood meetings and churches approximated the conditions of the alternative See-Feel-Change model (which behavior change theorists like Kotter and Cohen (2002) advocate over the AIDA approach).
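A small illustration of this funnel logic may help. In the sketch below every conversion rate is invented purely for illustration (STAND reported no stage-level figures); the point is only that a chain of separated stages multiplies its losses, while a hosted outreach meeting compresses awareness, interest, desire, and action into a single encounter.

```python
# Illustrative funnel arithmetic for the AIDA point above.
# All conversion rates are hypothetical; only the structure matters.

reached = 10_000                            # people touched by either channel

# Phase 1 style: each AIDA stage happens at a different time and place,
# and each stage sheds some share of the people who entered it.
stage_rates = [0.60, 0.40, 0.30, 0.25]      # awareness -> interest -> desire -> action
staged_completions = reached
for rate in stage_rates:
    staged_completions *= rate              # losses compound: 10,000 -> ~180

# Phase 2 style: explanation, motivation, and completion collapse into one
# face-to-face event, so there is a single (hypothetical) conversion step.
single_event_rate = 0.55
event_completions = reached * single_event_rate

print(f"Staged (AIDA-like) funnel: {staged_completions:.0f} completions")
print(f"Single outreach encounter: {event_completions:.0f} completions")
```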

STAND WHERE YOU LIVE: CONCLUSIONS

This paper has related a ‘research story’ that industry ethnographers will find telling and, in some respects, wearily familiar. The initial strategy for conducting the STAND community visioning survey relied on a “convenient” online questionnaire that, as a channel (or survey ‘format’), did not come close to achieving its target. As face-to-face outreach and interviewing ultimately drove the majority of STAND’s responses, it becomes important to ask why these methods succeeded (as a lesson for future community visioning projects, or any forms of ‘public research’ that will not rely on pre-recruited samples or commercial databases). The lessons on display in STAND are also useful for any inquiry that needs to stimulate an audience to ‘care enough’ to take part and to overcome the thresholds of sharing or apathy that research usually entails (especially when no monetary incentive is being offered).

We believe the reasons face-to-face outreach succeeded as a means of survey collection for STAND go far beyond issues of ‘channel access’, convenience, ‘reach’, or even the ‘digital divide’ (since most of the people who took the survey face-to-face also had internet access). In short, our view is that the online questionnaire did not fail so much as a transactional platform as it did as a conversion medium (in tandem with the above-the-line marketing that supported it). This is the consequence Josh McManus was pointing to earlier: face-to-face outreach was “more effective” at making people want to take the survey than billboards or print ads, which ‘tell’ you about the community visioning process but do not identify your stake in it – and thus more effective methodologically at generating responses for this type of project. We think that at 7,500 responses STAND had tapped out the number of people whose generalized interest in the issue of city regeneration (based on their generalized identity as Chattanooga citizens) made them ‘motivated’ enough to complete a questionnaire (whether they ‘liked’ it on Facebook or not). That is, until the STAND personnel met with groups within their ‘social field’ and let them know (in person) why and how getting their voice into this survey mattered. The face-to-face outreach, in a single event, had the power to tip people’s balance of possible ‘interest’ towards the more crucial state of ‘engagement’, by making this visioning process relevant, even meaningful, to them.

This occurred in part because, and it cannot be overemphasized, seeing an ad for a survey and completing it online is a very different experience from having a conversation about your city in which you feel ‘heard’ by a fellow citizen (who has their life staked in the city as well). The first experience is Gallup; the second is Goffman (1967). In other words, the affective-emotional and interactional-interpersonal content at play in these two modes of survey response had a very different ‘lived significance’ for the respondents (and canvassers alike). Both modes provide ‘data points’, but the second mode is both a data point and part of the social process it is supposed to be “reflecting”. Consequently, as a research methodology for a community visioning initiative (with an ultimate aim to catalyze community participation and ‘community development’) the second mode is more desirable. The online survey can capture and measure ‘beliefs’ about the city; but face-to-face methods are more fruitful for activating civic renewal by encouraging people towards engagement. If the first way captures the ‘public opinion’ of a community, the face-to-face method also helps generate ‘communitas’ (the feeling of solidarity, togetherness, or joint empathy) within one (Turner, 1969:132; Jones, 2005:39).

We maintain that this effect was further catalyzed (and intensified) by the specific questions in the STAND survey, and the relationship between the questions. After asking participants first what they liked about Chattanooga, then to imagine the best possible version of the city, and next to talk frankly about its “challenges”, the survey ‘conversation’ culminated in question number four. The first three questions served to establish what has been called “communalized empathy” (Maffesoli, 1996:136; Jones, 2005:39) between canvasser and respondent. This is the sense of a shared common fate that bonds people living in the same place together. The final question, building on this place-based empathy, was easier for canvassers to elicit, but it also added still another dimension to the survey encounter: “What actions big or small can you take to help (the challenges in Chattanooga)?” Answering this question (even to a previously unknown canvasser) amounts to a public ‘speech act’ (Searle, 1970) akin to a pledge, where there is a dimension of ‘witnessing’, along with some implicit obligation attached. Consequently, we hold that thousands and thousands of these ‘public conversations’ (across Chattanooga from May to September 2009) not only generated increased ‘communitas’ in the region, but ‘communitas’ with a directionality toward civic renewal.

It is beyond the scope of the present paper to actually determine the impact of STAND research on the post-STAND level of civic or social participation in Chattanooga (and such an ‘outcomes study’ has not yet been conducted). But it is our strong hypothesis that conducting this visioning research more ethnographically (via this highly empathic style of face-to-face canvassing and outreach) has encouraged more people to be more active in their city and communities. The examples we do know are the number of STAND and CreateHere fellows who (now that CreateHere has disbanded) have started their own non-profit or social entrepreneurial organizations2. The likelihood of further impact on citizens’ ongoing participation was increased by the fact that STAND staff (especially Josh and Bijan) undertook a ‘results outreach’ five months later, when they returned for meetings (with every group of over ten people that had hosted STAND while it was conducting the survey) to deliver and discuss the significance of the survey results for that group.

STAND Where WE Live: EPIC Learning from Big Data Ethno

Since the STAND survey was relatively unusual in asking its respondents what actions they could take to help their city, it is an excellent test bed from which to investigate the extent to which people’s experiences while taking part in an ostensible research ‘exercise’ can actually prime commitment to the kind of widespread ‘volunteerism’ (or social entrepreneurship) that makes civic renewal accomplishable. Furthermore, as part of an EPIC research program, we could examine the ways that ethnographic techniques (such as person-to-person open-ended questioning) specifically contribute to the success of such efforts. Since the intent of STAND’s fourth question was not merely ‘cognitive’ (or information-seeking) but also a kind of request aimed at ‘enrolling’ the research participant in action beyond the survey, its focus is consistent with EPIC’s concern to move from ‘descriptive’ to ‘generative’ (or from “could” to “should”).

Mack & Squires (2011) and Lovejoy, Cefkin, Anderson, and Liebow (2011) both outline how an important way forward for the EPIC community lies in moving from ‘findings’ and ‘analysis’ to making ethical recommendations and guiding sensitive decisions. The normative potential for ethnographic work suggested by the STAND case affirms this call, suggesting that ethnographers using their discipline’s traditional strengths to deploy tools such as large-scale surveys (or any teams working according to ethnographic principles) have the potential both to identify preferable options and then to mobilize populations toward achieving them. This would cast ethnography in the role of effecting change, in addition to reflecting upon it (in a way that most of the quantitative consumer research that goes into ‘Big Data’ currently does not).

It has been shown, in part, that this type of impact can be achieved purely by focusing community-wide “attention” on the outcomes of certain choices (Scharmer, 2007). In this manner, the EPIC community can take note that the results of the STAND survey expanded the Chattanooga community’s vision toward a heightened focus on overlooked problems (such as the city’s growing level of gang involvement) that had not been prominent in public discourse before. Now, three years later, a current Chattanooga mayoral candidate is even using STAND results as a core part of his election platform.

As Bruno Latour has argued for many years (1988, 1993) (against the reductionism of many brands of systems theory), perhaps the most profound truth of networks is that they remain at all points ‘local’. Similarly, we would say the research for the (very healthy) 26,263-person STAND survey remained at almost all points local and ethnographic (certainly the 80% of responses gathered from face-to-face canvassing) while remaining quantitatively significant. Thus, the STAND case supports Patel’s (2011) attack on the fallacy of the qual-quant divide as it applies to the ontology of the phenomena we study. But as students of context, we should note (as Patel would) the strong divergences that remain in the habitual contexts, effects, and applications of qualitative vs. quantitative data. Yet not all of these divergences stem from pure illusion, or the partisanship of tribal loyalties.

For example, STAND points to crucial differences in what ethnographic research per se (not only ‘Big Ethno’ on the scale of STAND) may be able to achieve that conventional customer analytics usually does not. By virtue of the way ethno data is “conquered, constructed, and confirmed” (Bourdieu, 1992b:41) in closer and more attuned interaction with its participants (who emically guide it in the direction of their concerns – and thereby feel ‘empowered’ in the process), ethnography can be recommended as a preferred methodology for a range of purposes that require the reflexive shaping of outcomes by stakeholders. These include a host of practice areas (often clustering around the ‘c’-word) where enrolling the participation of the agents studied (or people like them) is key to success, such as (among others): organizational change, change management, and behavior change, as well as community regeneration, urban renewal, medical compliance, and harm reduction3.

In all of these practice areas ethnography has made great inroads, yet it is still not conventionally a ‘core’ discipline (even if we can make a strong argument why it should be at least in the mix of methodologies for them all). If we are therefore, as a community of practitioners (working across industries and myriad types of organizations), looking forward to a future of more action-oriented research, and to shaping a greater range of outcomes, then the ethnographic research approach, which requires a greater level of collaboration from active participants (beyond that of a tick box), still has a powerful potential for growth, because of its considerable success at engendering the ‘transformations’ such practice areas aim for. Furthermore, the collaborative ‘competency’ of ethnography remains one which many forms of quantitative inquiry (no matter how ‘Big’ the data they generate) are hard-pressed to equal, unless they not only partner with a participatory discipline like ethnography (as Slobin & Cherkasky (2010) hope), but also then do not regard their new ‘partner’ as a mere handmaid (i.e. servant) to a worldview ‘constructed’ chiefly by analytics.

The STAND community visioning process gives us one more reason to believe that Big Data and Big (or small) Ethno can productively cooperate on the (ontological) common ground that Patel (2011) shows they already occupy (so long as all parties agree to share that ground – both its wealth and its discursive space). The STAND model can help the various sides recognize the distinct advantages of each approach; realize that size of reach is not all that counts; and see that ‘precision’ and ‘rigor’ (even ‘objectivity’) come in various forms (which may be thoughtfully and judiciously combined) for shaping different (and sometimes the same) realities.

ACKNOWLEDGMENTS

The authors would first like to thank Josh McManus and Helen Johnson for their role on STAND, and for taking the time to collaborate with us on studying a project they completed three years prior. We would also like to thank Katherine Currin and Blair Waddell for additional interviews about the shift in survey methodology and canvassing strategies. We want here to show our appreciation and respect for the work of Adrian Slobin and Todd Cherkasky (2010) and Neal Patel (2011), whose thoughtful papers gave us a frame through which to view the significance of the STAND model, as well as to establish a dialogue between this new case study and three years of EPIC conferences. Finally we thank Ed Liebow for his help in bringing this paper to fruition; Anthony Alvarez for his probing question at this paper’s EPIC 2012 presentation; and Dawn Nafus for being an inspiring session curator.

Stokes Jones trained as a Social Anthropologist at the London School of Economics. He is Principal of Lodestar Innovation and Adjunct Faculty at the IIT Institute of Design. His research interests include: consumption, practice theory, ANT, action research, interview methods, phenomenology, cultural models, cooperation, public/private spheres, and structural history.

Christine Miller is professor of Design Management at Savannah College of Art and Design. She studies how sociality and culture influence the design of products, processes, and technologies. Her research interests include technology-mediated communication within groups, teams, and networks and the emergence of co-located and technology-enabled collaborative innovation networks.

Bijan Dhanani is involved in advertising and special projects with Chattanooga, Tennessee-based studio, delegator. He is also founder of the UnFoundation, and holds a BA in International Affairs from the University of Colorado.

NOTES

1 We are not sure if their title ‘Ethnography in the Age of Analytics’ is ironically meant to recall Walter Benjamin’s meditations in Charles Baudelaire: A Lyric Poet in the Era of High Capitalism (Benjamin, 1997). But if so, they have found a nice homology for the role of the contemporary ethnographer; turning well-honed, traditional skills towards the identification of, and reflection on, emerging new phenomena (haunting social data sets the way Baudelaire haunted the Paris arcades).

2 These include Unfoundation – a crowd funding initiative established by co-author Bijan Dhanani; Glass House Collective – a neighborhood revitalization non-profit founded by Katherine Currin; Causeway – a cause-sourcing charity where people can find local initiatives to donate their time or money to, founded by Stephen Culp, a STAND board member; and Project PopUp – a small business incubator started by Blair Waddell that holds competitions to give away downtown Chattanooga retail space rent-free for 6 months.

3 A key point to make here is that “The Age of Analytics” may not have taken hold as firmly in all the areas above as it had for Slobin & Cherkasky’s “line of work – the marketing and IT space” (2010:189) when they gave their cases of ethnography being “overshadowed” by analytics.


REFERENCES CITED

Atwood, Margaret
1998 The Handmaid’s Tale. New York: Anchor.

Benjamin, Walter
1997 Charles Baudelaire: A Lyric Poet in the Era of High Capitalism. London: Verso.

Bourdieu, Pierre and Loic Wacquant
1992 An Invitation to Reflexive Sociology. Cambridge: Polity.

Bourdieu, Pierre
1992b Thinking About Limits. Theory, Culture & Society 9(1): 37-49.

Goffman, Erving
1967 Interaction Ritual: Essays on Face-to-Face Behavior. Garden City: Anchor.

Kotter, John and Dan Cohen
2002 The Heart of Change: Real-Life Stories of How People Change Their Organizations. Boston: Harvard Business School Press.

Jones, Stokes
2005 Grass Roots Campaigning as Elective Sociality (or Maffesoli meets ‘social software’): Lessons from the BBC iCan Project. in K. Anderson and T. Lovejoy eds. Proceedings of the Ethnographic Praxis in Industry Conference 2005. Washington, DC: American Anthropological Association.
2010 The Inner Game of Ethnography. in L. Arnal, S. Pullman, H. Tamura eds. Proceedings of the Ethnographic Praxis in Industry Conference 2010. Washington, DC: American Anthropological Association.

Latour, Bruno
1988 The Pasteurization of France. Cambridge: Harvard University Press.
1993 We Have Never Been Modern. New York: Harvester.

Lave, Jean, and Etienne Wenger
1991 Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.

Lewin, Kurt
1951 Field Theory in Social Science. New York: Harper & Row.

Lovejoy, Tracey, Melissa Cefkin, Ken Anderson, Ed Liebow
2011 The EPIC Conversation, 2011. in D. Flynn, M. Bezaitis, L. Arnal, and R. Robinson eds. Proceedings of the Ethnographic Praxis in Industry Conference 2011. Washington, DC: American Anthropological Association.

Maffesoli, Michel
1996 The Time of The Tribes: The Decline of Individualism in Mass Society. London: Sage.

Ochs Center for Metropolitan Studies
2010 Chattanooga Stand: Full Report. http://results.chattanoogastand.com/static-media/media/Ochs_Report_long.pdf

Patel, Neal
2011 For a Ruthless Criticism of Everything Existing: Rebellion Against the Quantitative/Qualitative Divide. in D. Flynn, M. Bezaitis, L. Arnal, and R. Robinson eds. Proceedings of the Ethnographic Praxis in Industry Conference 2011. Washington, DC: American Anthropological Association.

Miller, Christine, and Stokes Jones
2011 Reinvention and Revisioning in an Appalachian Industry Cluster. in D. Flynn, M. Bezaitis, L. Arnal, and R. Robinson eds. Proceedings of the Ethnographic Praxis in Industry Conference 2011. Washington, DC: American Anthropological Association.

Mack, Alexandra & Susan Squires
2011 Evolving Ethnographic Practitioners and their Impact on Ethnographic Practice. in D. Flynn, M. Bezaitis, L. Arnal, and R. Robinson eds. Proceedings of the Ethnographic Praxis in Industry Conference 2011. Washington, DC: American Anthropological Association.

Searle, John
1970 Speech Acts: An Essay in the Philosophy of Language. Cambridge: CUP.

Scharmer, Otto
2007 Theory U: Leading from the Future as it Emerges. Cambridge: The Society for Organizational Learning.

Slobin, Adrian & Todd Cherkasky
2010 Ethnography in the Age of Analytics. in L. Arnal, S. Pulman-Jones and H. Tamura eds. Proceedings of the Ethnographic Praxis in Industry Conference 2010. Washington, DC: American Anthropological Association.

Spradley, James
1973 The Ethnographic Interview. Fort Worth: Holt.

Turner, Victor
1969 The Ritual Process: Structure and Anti-Structure. Chicago: Aldine.

Vygotsky, Lev
1978 Mind in Society. Cambridge: Harvard University Press.

Wacquant, Loic
1998 Pierre Bourdieu. in Rob Stones ed. Key Sociological Thinkers. London: Palgrave.
