FEEDBACK METHODS
Collecting feedback is an important part of understanding what is and isn’t working in your programs. It can provide insight into how your volunteers are feeling in their roles, help you check your own assumptions, and uncover areas for improvement or opportunities for expansion.
This can be done with something as simple as a comment box, or through direct interviews with key members of your team or with clients. In this article, we discuss different feedback methods and best practices for question design.
SURVEYS & QUESTIONNAIRES
You can design questionnaires for a number of different purposes. The most effective questionnaires target a specific area or question rather than asking general “how are we doing” questions. For example, you can narrow the focus of a questionnaire to gather feedback on the rollout of a new service, such as home grocery deliveries, or to field questions, comments, and concerns from clients about their interactions with volunteers.
When creating these kinds of targeted questionnaires, think about how often you will distribute them; a new questionnaire every month may be too much and people will start to ignore them, while an end-of-year questionnaire can help you identify what to focus on in the upcoming year. Be sure to set a deadline for when you will stop accepting responses as well, so your team knows when they can start analyzing the data. If you aren’t getting a good response rate, consider reworking the survey itself, the distribution location or format, and the accessibility of your tools.
INTERVIEWS
One-on-one interviews are the best way to collect feedback, as they allow you to ask follow-up questions in addition to your list of scripted questions. You can dig deeper to find out why a person answered a question a certain way, or, if they provide a short response, give them space to explain their thought process with questions like, “Why do you think this?”
One natural time for an interview is at the end of a volunteer’s service with your organization, as an exit interview. If you do not have the capacity to conduct an interview, you can email them a survey to collect feedback on their experience. You can also find out whether anyone may be affected by their departure, whether it’s their volunteer mentees, peers, or even clients they have gotten to know well.
All feedback channels require consent from the participant. You should not coerce or pressure a volunteer to participate in an exit interview if they do not want to.
BEST PRACTICES
1. Start with a goal in mind. Avoid creating a “general” survey that tries to cover everything. Decide what information you need most, then design your questions around that goal. Avoid questions that do not serve it.
2. Think about what type of answer would help. Closed-ended questions help you collect quantitative data. They can involve checkboxes (respondents can select multiple options from a list), radio buttons (respondents can select only one option from a list), or a scale. Open-ended questions let respondents explain their thoughts and experiences as qualitative data; for example, you can ask someone why they gave their initial response or have them share a specific experience. When designing a question, consider whether a simple yes/no captures enough nuance, and if the question has space for a long-form answer, how you will analyze the responses. If you use a scale, aim for 1-5: it provides enough range to distinguish between levels, plus a middle point (3) for neutral feelings. A good questionnaire includes a mix of answer types for analysis.
3. Ask only one question per question. This seems straightforward, but when forming a question, you might try to clarify it with a follow-up question in the same line, which can leave respondents confused about which one to answer. Similarly, if your question combines two concepts, you may miss out on information. For example, if you ask, “How often do you pick up meat and vegetables at the co-op?” a respondent who does not pick up meat may get stuck on the wording and not know how to respond.
4. Ask neutral questions. Avoid influencing responses: ask “How would you rate our service?” rather than “Why do you love our service?” The former captures their actual feelings, while the latter pushes them to focus only on positives, leaving you in the dark about where you’re falling short. Leading questions can be intimidating for the respondent and will mislead you during your analysis.
5. Avoid an overly long survey. Aim for one that can be completed within 5 minutes so you don’t overwhelm respondents.
6. Consider whether you need to collect personal information like names, demographics, ethnicity, gender identity, etc. Many respondents find this intrusive or worry that it will be used to identify them. If it’s more of a “nice to have” than something with a direct effect on what you’re trying to solve, consider removing it to shorten your questionnaire, or make it optional rather than required.
7. Similarly, consider whether you want to capture contact information like an email address or phone number. If you’re using it to add people to a mailing list, this must be disclosed with the ability to opt out. If you plan to follow up with certain respondents, this also needs to be disclosed so they can choose whether to participate.
8. Keep questions relevant. If you’re using a digital survey, you can usually add conditional (skip) logic. For example, if a user selects “Yes” to a question, you may have some follow-up questions to ask. But if they answer “No,” you want them to be able to skip those follow-ups and simply go to the next relevant question.
9. Always test your survey or interview script first! Find people who haven’t been part of the design process and have them run through it. This will help you spot questions that aren’t phrased clearly or answers that don’t actually help you accomplish your goals. Edit and test again until you are comfortable. You can also walk through the flow of questions with someone to check whether the order makes sense.
TYPES OF SURVEYS
Online questionnaire services help you collect data quickly and display it in easy-to-read, shareable visualizations. They also improve accessibility: respondents can translate questions in their browsers, and blind or low-vision clients can use a screen reader to hear and answer questions.
Some services to consider include Google Forms, SurveyMonkey, and Typeform. These range from free to paid monthly plans depending on the number of respondents you expect, the analytics features included, and the scale at which you want to deploy your survey.
A paper questionnaire can be more accessible to people who aren’t digital natives, as it doesn’t require a device or computer literacy to fill out. You can easily slip one in with food boxes or as part of delivery services, but you’ll need a method for collection and someone to compile all the data. Handwriting can also be difficult to read, and paper surveys can be damaged by wear and tear.