The value of a community survey might be summed up this way: You don’t know until you ask. You might have a good sense of how your constituents feel about a particular issue, but a survey helps you confirm — or revise — your assumptions, and it allows you to get more meaningful context.
That was the goal of the North Carolina Pandemic Recovery Office (NCPRO), which the state’s governor created in 2020 to manage the distribution of federal recovery funds. Surveys provided a way to understand the pandemic’s impact, both on constituents and the economy.
“We were in the heart of the pandemic, and we had this federal money, and we wanted to get it out the door to the right people and for the right uses,” said Stephanie McGarrah, Executive Director of NCPRO, which is part of the North Carolina Office of State Budget and Management.
McGarrah and Kiel Kinkade, Program Analyst at NCPRO, recently spoke to GovLoop’s CX Community of Practice (CoP) about survey creation. Here are some insights they shared.
Best Practices to Keep People Engaged
When creating a survey, think about the user experience. If people find a survey frustrating for any reason, they won’t hesitate to quit, said Kinkade. He offered some basic guiding principles:
- Ask questions in various formats. If every question is answered with yes, maybe or no, or with a rating of 1 to 5, respondents might lose focus.
- Write in a consistent style. Big shifts in tone or vocabulary might throw people off.
- Ensure that the questions and possible answers are easy to understand. If people aren’t sure what’s being asked, they are likely to give up.
- Keep the survey short and to the point. If you ask too many questions, or the questions take too long to answer, you’re only hurting yourself.
Imagine that the people taking the survey are looking for an excuse to quit. Don’t give it to them.
Question Formats: Multiple Options
To keep surveys engaging, Kinkade recommends using a variety of question formats, such as:
- Multiple choice (select one, top three, all that apply, etc.)
- Sliders (respondents move a button along a scale, such as 1 to 5)
- Ranking (order or prioritize a list of options)
- Open-ended/text entry
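To make the mix of formats concrete, here is a minimal sketch in Python of how a mixed-format survey might be modeled. The `Question` class and its field names are hypothetical illustrations, not drawn from NCPRO’s actual tooling:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Question:
    prompt: str
    kind: str  # "multiple_choice", "slider", "ranking", or "open_ended"
    options: list[str] = field(default_factory=list)  # used by choice and ranking formats
    scale: Optional[tuple[int, int]] = None           # used by sliders, e.g. (1, 5)

# A short survey that varies the question format, per Kinkade's advice.
survey = [
    Question("Which recovery services did you use?", "multiple_choice",
             options=["Food assistance", "Rental aid", "Small-business grants"]),
    Question("How easy was the application process?", "slider", scale=(1, 5)),
    Question("Rank these priorities for recovery funds.", "ranking",
             options=["Broadband", "Housing", "Public health"]),
    Question("In your own words, how has the pandemic affected you?", "open_ended"),
]
```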
The Value of Open-Ended Questions
Open-ended questions, which invite respondents to submit written responses, can play an important role in a survey: inviting the unexpected.
For example, early in the pandemic, an open-ended question alerted North Carolina officials to the challenges that some communities faced in distributing food to families in need. Existing systems were not designed to handle such a large-scale crisis, McGarrah said.
Traditional survey questions make assumptions about what’s important to ask and even what the possible answers are. “If you don’t give [respondents] the opportunity to answer open-ended questions, you might be making the wrong assumptions,” she said.
Open-ended questions can solicit a range of responses. They include:
- A word or two (often factual, such as birthplace)
- A short sentence (“In just a few words…”)
- Multiple sentences (“In your own words…”)
A pitfall of open-ended questions is that the responses are not always helpful. “Sometimes they’ll talk to you about things that are not exactly what you’re looking for and that are not clearly related to the topic at hand,” McGarrah said.
Note: Not everyone will answer an open-ended question. In a study by the Pew Research Center, which conducts national surveys, nonresponse rates for open-ended questions, whether they asked for a single word or a detailed response, ranged from 4% to 25%, with a median of 13%.
Tip: Be Honest About the Time Required
Before fielding a survey, test how long it takes and let people know up front. Two factors determine the length of a survey: the number of questions and their complexity. A survey with 15 multiple-choice questions could take five minutes or less, while one with 10 multiple-choice and three open-ended questions could take at least 15 minutes.
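As a rough illustration of that arithmetic, here is a short Python sketch that estimates completion time from per-format timing assumptions. The seconds-per-question values are hypothetical, chosen only to land near the ballpark figures above, and are no substitute for timing a real test run:

```python
# Hypothetical per-question timings, in seconds; calibrate against a live test.
SECONDS_PER_FORMAT = {
    "multiple_choice": 20,
    "slider": 10,
    "ranking": 30,
    "open_ended": 240,  # written answers take far longer than clicks
}

def estimate_minutes(counts: dict[str, int]) -> float:
    """Rough completion-time estimate from question counts by format."""
    total_seconds = sum(SECONDS_PER_FORMAT[fmt] * n for fmt, n in counts.items())
    return total_seconds / 60

print(estimate_minutes({"multiple_choice": 15}))                   # 5.0 minutes
print(estimate_minutes({"multiple_choice": 10, "open_ended": 3}))  # ~15.3 minutes
```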
“I really don’t like it when somebody tells me it’ll take longer than, say, five or 10 minutes,” McGarrah said. “I’m very unlikely to click on it, or I get into it and it’s taking me much longer than what people said it would take.” NCPRO surveys typically ask 15 questions and take three to five minutes to complete, she said.
Researchers say that if surveys take too long, one of two things happens: People drop off midway through, leading to a low completion rate, or they spend less time thinking about their answers. Either way, the quality of your data will suffer.
Recommended Resource:
The Pew Research Center wrote a detailed blog post explaining its approach to determining survey length.
Focus Groups: A Way to Dig Deeper
A community survey might answer some questions but raise others, either because the results need more context or they suggest a whole new line of questioning. When that happens, one option is a focus group.
The idea is to bring together a sampling of constituents or community leaders to explore issues in greater depth. For example, respondents to one North Carolina survey expressed frustration with spotty internet access, Kinkade said, even though state-level data showed broad coverage. Subsequent focus group discussions revealed that in some regions, the quality of connectivity varied widely for residents living outside downtown areas.
When inviting people to join a focus group, be clear about its purpose and its value, Kinkade said. “Give them a sense that this isn’t just an abstract thing,” he said. “Let them know, ‘What are we doing with this data? How is this helping the state? How is this helping you?’”
This article appears in our guide, “CX: Turning Good Ideas Into Practice.” For more insights into innovative practices that improve the customer experience, download it here: