By John Simpson, Engagement Consultant at GovDelivery
Surveys can be a great tool for gathering public feedback, project impressions, or even program success stories from your key audiences. However, a survey needs to be well structured to drive participation and deliver the insights you need. Organizations can get caught in the trap of asking too much and losing respondents’ attention, or asking too little and not having enough data to effectively show results. As you explore whether surveys are the right tool for your organization to gather stakeholder feedback, take a look at these six best practices that can help you plan and execute a successful survey.
1. Less is More
- You have your audience’s attention for a short amount of time. Make it count by cutting to the chase and only asking the questions that are crucial for the results you need.
- The longer the survey, the lower the completion rate will be.
2. Simplify the Answers with Room for Comments
- Fight the temptation to offer open text answer choices for every question.
- For most questions, offer multiple choice, multiple select, or Yes/No answers. This will save your team time down the road when reporting on results and focus your audience’s attention on what exists today or what may be possible in the future.
- If you or your team still wants to hear from respondents “in their own words,” you can add an open text field for comments at the end of each question or at the end of the survey.
3. Target Your Audience
- Design your survey for your audience. Don’t waste time over-explaining concepts your audience should already know, or asking stakeholders for information you already have. Put yourself in the shoes of your possible respondents and use their language in both the email request and the survey, rather than the technical terms your office may use internally.
- The more targeted your audience for the survey, the higher your conversion rate will be. When sending the initial email to your audience asking for their participation, target key stakeholders first. Don’t be afraid to personally follow up with those in your audience who are high priority stakeholders or subject matter experts.
4. Test the Ask
- Conversion rates for surveys can be low. You can increase the number of stakeholders taking your survey by testing a number of potential levers, such as the tone of your emails, the language or phrases you use, and how you personalize the benefits of taking the survey so respondents can more easily see what’s in it for them.
- Test out different messages on smaller audiences, track the results to analyze what works, and use the best message combination for the larger, remaining list (a rough sketch of that comparison follows this list). Be sure you are also calling out the expected time needed to complete the survey.
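For teams that can export send and completion counts from their email platform, a minimal sketch of that comparison might look like the following. The variant names and counts here are hypothetical placeholders, not figures from this article:

```python
# Minimal sketch: comparing invite-message variants by completion rate.
# Variant names and counts are hypothetical; substitute the send and
# completion numbers exported from your own email platform.

test_results = {
    "formal_tone":   {"sent": 500, "completed": 41},
    "personal_tone": {"sent": 500, "completed": 63},
}

for variant, counts in test_results.items():
    rate = counts["completed"] / counts["sent"]
    print(f"{variant}: {rate:.1%} ({counts['completed']}/{counts['sent']})")

# Use the stronger message for the larger, remaining list.
best = max(test_results, key=lambda v: test_results[v]["completed"] / test_results[v]["sent"])
print(f"Best-performing variant: {best}")
```

With test samples this small, treat differences as directional rather than conclusive; the point is simply to record each test so the winning combination is chosen from data rather than memory.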
5. Develop a Clear Strategy for Your Survey Results
- Understand how you want to slice and dice your results before finalizing your questions (see the sketch after this list). If you already have historical data or profile information on your audience, examine what background fields you may still want to gather in the survey. If you’re sending to a new audience with no background info, or your survey is anonymous, build into the survey the minimum fields you need to target and segment respondents’ data (profession, state, etc.).
- Ensure you have a clear idea of how you will use the data from each question, and eliminate any questions that don’t get to the heart of what you want to measure. If respondents answer X instead of Y, or Y instead of Z, make sure you have thought through what implications that may have for your plans after the survey.
- Before launching your survey, get internal buy-in for using your survey results to either modify, strengthen, or revamp processes, depending on your survey’s goals and questions.
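As a rough illustration of the “slice and dice” planning mentioned in the first bullet above, here is a minimal sketch, assuming your survey tool can export responses with background fields. The column names (profession, state, satisfaction) are hypothetical examples, not fields prescribed by this article:

```python
import pandas as pd

# Minimal sketch: segmenting exported survey responses by background fields.
# Column names are hypothetical; the point of this tip is deciding on such
# fields *before* launch so this segmentation is possible afterward.

responses = pd.DataFrame({
    "profession":   ["analyst", "manager", "analyst", "manager"],
    "state":        ["VA", "VA", "MD", "MD"],
    "satisfaction": [4, 5, 3, 4],  # e.g., answers to a 1-5 rating question
})

# Average satisfaction by profession, then by state.
print(responses.groupby("profession")["satisfaction"].mean())
print(responses.groupby("state")["satisfaction"].mean())
```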
6. Share Results and Future Plans
- While your initial outreach should explain how your audience will benefit from answering your survey, be sure you demonstrate that benefit after the survey is completed. If you told your audience that you would be publishing or sharing the results, do so in a timely manner. Highlight key results that your organization found interesting or that speak to a larger trend or issue, and explain to your audience how this survey will change what is being done today or the plan for the future.
- Respect the relationship. As we stated in tip 5, make sure you can use your respondents’ answers to have a real influence on how business is done. This will help your organization gain more than just improved engagement for your next survey; you’ll be building a trusting and reciprocal relationship between your organization and its stakeholders.
Surveys aren’t for everyone and aren’t always the best tool for the job. Before committing to the process of setting up and gaining approval for your survey questions, ensure you can defend to critics both the reasoning behind using a survey and how you will use and share the data from your results and analysis.
If you decide that a survey is the right tool for what your organization is hoping to accomplish, be sure to put yourself in the shoes of your possible respondents when deciding on the language of your questions and the message explaining why they should take the time to complete the survey. As you dive into developing your survey, we hope these best practices will help you maximize your completion rate and make your relationship with your stakeholders even more impactful.
Has your organization used surveys to engage with stakeholders? If yes, are there any tips you can share with your public sector peers? If not, what are some of the obstacles in your way?
1) Too many surveys end up being a shopping list of everybody’s pet ideas, and the folks tasked with instrument development typically cannot say “No” to suggestions from above. The result is a disconnected, uncoordinated mess. Be a scientist. Decide what needs to be known first. Then draw out a circles-and-arrows model of what you want to know and what you think might cause/predict/lead to it. That overall model will let you determine whether you have any given causal path covered off well enough, or whether you’re going to end up with gaps that leave question marks in your understanding.
2) Management confuses what they want to say in the end with the basis for an askable/answerable question. The first draft of what eventually became the FEVS (then the FHCS, in 2002) started out with a question, if you can believe it, that asked employees what they thought of their agency’s “human capital strategy”. Now, I think it entirely apropos that senior officials have something to evaluate and say about their agency’s human capital strategy, in the final analysis. But nobody uses that term, or thinks about it, outside of a few select circles. Thankfully, I wasn’t the only person who noted the unsuitability of that question (either that, or I have enormous influence over OPM), and it was changed.
3) Shorter is sort of better, I suppose. But actual and perceived length can be two different things. Often it is not the total number of questions but the ease with which individuals can “find their answer”. Case in point: some years back I was on a working group whose goal was to find out the reasons for voluntary departure. The group suggested providing a list of possible reasons and asking employees to indicate their first, second, and third most important reasons for leaving. On paper that looks like one question, but it involves a huge mental load to keep asking oneself, “Is this more important to me than that?” I recommended making each reason a separate item with a simple 3-point importance rating, so the respondent could just go down the list: “Yup, yup, nope, sort of, nope…”. Focus groups ended up saying that that year’s survey was pleasingly shorter, even though it was actually longer.
4) People will avoid providing opinions if you let them, because as much as they want to tell you things, it requires effort, and often time they don’t have. Sometimes it IS the case that people have no basis for judging something. For example, new hires are justifiably reluctant to render judgments about anything they deem to require a little more experience in the organization than they currently have. And one should never underestimate the diversity of public sector work environments that your question simply doesn’t map onto. But I draw your attention to the following:
– “Other (please specify)” is regularly interpreted as a license not to read the rest of the question, and will generally net you useless information that requires considerable time and effort just to grasp how useless it actually is.
– “Don’t know”, “Not applicable”, and “Neither agree nor disagree” can mean different things. You will need at least one of them, but using all three will frequently provide almost as many ways to not give an opinion as to give one. Use them judiciously.
– Open-ended questions will generally NOT result in anything in organized, bulleted point form. More often, the content may be excellent and impassioned but go off in a million different directions, frequently in one long run-on paragraph with no separation between issues and little that is easily divisible into points that might map onto a coding scheme. If you are going to let people provide open-ended comments, provide some introductory preamble to gently shape what goes in the blank field after it. That preamble could list some of the things you’re hoping to learn more about, and perhaps things that you are not looking for.
5) Try to insert at least one question that lets your recipients know that you get it. It may well be an ugly admission, or a bit of a risk, but people respect surveys that are willing to take such risks in the interest of authenticity, and the management team behind them. It gives the survey more face validity. Of course, this assumes that the question is not simply ignored afterward. At the same time, this can be a strong test of something important that might turn out better than you expected.
6) The resulting data should always be construed as a resource you can dip into when needed, NOT just an “accountability event” you can ignore once you know you’re out of the woods. If the results are actively used in decision-making going forward, and not just briefly reacted to, your respondents will have a greater vested interest in completing subsequent surveys.
7) Thank people for their participation. I know that seems like a no-brainer, but in this world of frenetic e-everything, it can often seem like there is no budget line item for decency and civility. So when your survey is done and taken down, contact everyone who participated and thank them, ideally letting them know when results might be available and where to find them. If you sent out e-mail invitations, contact those you invited, even if they didn’t respond, and let them know that everybody’s efforts were deeply appreciated.
8) Often, context is critical, and knowing more about the circumstances of the responder helps make a lot more sense of what they tell you. As I’ve repeated here far too often, it’s frequently not what they tell you but who is telling you. So get as much background information from your respondents as can comfortably fit. The circles-and-arrows model mentioned above can be helpful in determining what you need to be asking.
9) Look at ALL of your messaging, and coordinate it. It can easily happen that, somewhere between the preparatory corporate e-mail memos, the invite itself, and the survey’s opening page and introductory content, helpful orientation or critical info gets omitted. That whole sequence of messaging should be coordinated so that respondents are presented with everything they need to know, in the order they need to know it. The rationale for the survey probably shouldn’t be presented all at once and then never alluded to thereafter, and neither should the instructions be all bunched up in one spot. Think like a respondent, and determine what you’d want to know, or be asking yourself, at each point along the way: “Why the heck ARE they asking me this, and what business is it of theirs?” “Used for statistical purposes only” is meaningless to the vast majority of people; it needs explaining. If people are going to be branched from one part of the survey to another because of a prior response, they should know why they ended up there.