Boost Survey Results with Carefully Crafted Questions
Planning programs is a big job with a lot to consider. Where will you have your program? How big is the space? Do you need to limit guests; will you need to have them register ahead of time? Will weather or parking spaces be factors? Can you choose a date and time when there are not too many other conflicts for your target audience? How much staff time or money will you need? How will you evaluate the program’s success afterward? These are just some of the questions you need to answer as you plan.
However, there are many things to consider before you even start planning. Perhaps the biggest question is, What kind of program should I have? Your events will be more successful if you actually know what sorts of events the people in your service area want and need.
Too many library employees decide on programs based on these questions:
- What have we always done?
- What would I enjoy organizing?
- What do I think my community needs?
- What does my director / manager / board want to have?
Choosing programs this way is simple and direct—but also dangerous. If you want to plan the programs that will have the best attendance (and therefore the biggest return on your time and money), here are the questions you should be asking:
- Who are the people in my service area?
- What do these people actually want and need?
- What’s the best time to schedule events for them?
- Are they getting this [info, entertainment, education] somewhere else already? And if so, what advantage can my library offer over that competition?
Does it take more time to make decisions this way? Yes. Does it result in more-useful events with higher attendance? Absolutely. If you want your programs to be more targeted, more interesting or helpful to your audience, and better-attended, then you need to find out what people actually want. Conducting surveys is one of the best ways to get that information.
Asking People What They Want
I know that there are plenty of arguments against asking library users (and non-users) what they want from you. These are common ones I’ve heard:
- It takes extra time and effort to do it.
- We already know what they like.
- We don’t have patrons’ email addresses OR We’ll invade their privacy by emailing them.
- We don’t know how and can’t afford to hire someone to do it.
And here are my replies to them:
- But you’ll save the time and effort of creating events that attract only three people.
- Unless you have asked, you are only guessing what people like. Guesses and assumptions are not real data.
- Interested people will give you their addresses if it means getting notices about things that are really worthwhile for them to attend.
- You can do basic surveying yourself, and any data you get is better than no data at all.
Whether you opt to do a survey face-to-face, by mail, by email, or online, the main ingredient you need for success is good questions. However, writing good questions is not as easy as it sounds. You need to carefully consider what you ask and how you word it if you are to get true answers that you can trust enough to build programs on.
Writing Survey Questions That Get Useful Answers
If you’ve never studied survey questions before, you probably don’t realize that the way you word the questions can greatly affect the answers. There are many points to consider when writing questions. Here are the biggest ones.
Define Your Goal
The first thing you’ll want to do before designing a survey is to decide exactly what its goal is. If the goal is too general, then you’ll ask general questions and discover later that the answers may not be actionable. If you ask for opinions when what you really need is to discover whether there is enough serious interest to make a certain event worthwhile, the answers could be misleading. Read these two questions and consider how the answers would yield different information:
A. Are you interested in learning how to play chess? Yes / No
B. How likely would you be to attend a weekly class on how to play chess? 5 (very likely), 4, 3, 2, 1 (not at all likely)
Remember, interest does not always translate into action! There might be 50 people who are “interested” but only five who would bother to leave home to attend.
Another point is to beware of vague, subjective questions. Consider the difference between these two similar queries:
A. Does the library offer enough public computers? Yes / No
B. In the past 6 months, when you’ve wanted to use a library computer, was one usually available? Yes / No
Everyone’s opinion of “enough” will differ, but each person can speak clearly to his or her own experience and needs. And having the timeframe in the question will force respondents to limit their answers (perhaps you added workstations 6 months ago and want to see if you still need more). So the more-specific data from the latter question is more trustworthy and useful, especially if your goal is to be able to say to funders, “When we asked patrons if computers were available when they wanted one, 64% said no. This proves that we need money to purchase even more computers.”
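That funder-ready statistic is simply the share of “No” answers. If your responses end up in a spreadsheet or a list, the figure takes one line to compute. Here’s a minimal sketch in Python; the sample answers are made up for illustration:

```python
# Yes/No answers to "was a computer usually available when you wanted one?"
# (sample data, not real survey results)
answers = ["No", "Yes", "No", "No", "Yes"]

pct_no = 100 * answers.count("No") / len(answers)
print(f"{pct_no:.0f}% said no")  # prints "60% said no" for this sample
```

The same one-liner works for any yes/no question, which is one more reason specific, closed questions are easier to act on than open opinions.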
Survey Length Matters
People don’t want to fill out long surveys, so you should only ask eight to ten questions. That means you need to pack a lot of punch in each one. An easy way to do that is to offer multiple-choice answers. So, rather than taking three questions to ask about three different program possibilities, you could ask, “Which of the following programs would you be most likely to attend? Rate them as Choice 1, 2 & 3. How to play chess / How to knit a scarf / How to make a fishing lure.”
Making the answers multiple choice also makes a survey faster to complete, and much faster to tabulate later.
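To see how quick the tabulation can be, here is a minimal sketch for the ranked-choice question above. The responses, program names, and the 3-2-1 point weighting (first choice worth most) are all illustrative assumptions, not a prescribed scoring method:

```python
from collections import Counter

# Each response lists the three programs ranked Choice 1, 2, 3 (best first).
responses = [
    ["chess", "knitting", "fishing lure"],
    ["knitting", "chess", "fishing lure"],
    ["chess", "fishing lure", "knitting"],
]

points = Counter()
for ranked in responses:
    for rank, program in enumerate(ranked):
        points[program] += 3 - rank  # 1st choice = 3 pts, 2nd = 2, 3rd = 1

for program, score in points.most_common():
    print(f"{program}: {score} points")
```

Whether you weight the ranks or just count first choices is up to you; the point is that structured answers tally themselves, while free-text answers must be read and interpreted one by one.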
Asking “When?” Is As Important As Asking “What?”
Multiple-choice answers are also essential for choosing the timing of certain events. You could put on the best event ever, but if you schedule it at the wrong time, it could flop. One common example is not having programs for senior citizens at night, because many prefer not to drive after dark. So you’ll want to include a question such as, “When are you most available for this monthly class? Indicate your top two: Weeknights after 7 p.m. / Friday nights / Saturday mornings / Sunday afternoons.”
Does everyone have different availability in their lives? Sure. Can you please everyone? No. However, questions like this can reveal widespread concerns or competition for patrons’ time. For instance, if many of the people who want to make fishing lures are men who participate in the local sportsman’s club, and those outings are every other Saturday morning, then the responses would indicate that you should not have your class on Saturday mornings. Maybe Friday night would work, so attendees could use their new lures for the next day’s fishing trip. (People love instant gratification.) Likewise, craft-making programs might be more enticing before holidays. So asking about months and weeks matters as much as asking about days and hours.
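Tallying the “top two” availability picks works the same way as tallying program choices. A minimal sketch, with made-up responses and slot names taken from the sample question above:

```python
from collections import Counter

# Each respondent's top-two availability picks (sample data)
availability = [
    ["Friday nights", "Sunday afternoons"],
    ["Saturday mornings", "Friday nights"],
    ["Friday nights", "Weeknights after 7 p.m."],
]

# Count how often each time slot was picked across all responses
slot_votes = Counter(slot for picks in availability for slot in picks)
best_slot, votes = slot_votes.most_common(1)[0]
print(best_slot, votes)  # "Friday nights" was picked by all three respondents
```

If you collect both answers (program interest and availability) from the same respondents, you can go further and count availability only among the people interested in a given class, which is exactly the fishing-lure scenario described above.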
Leave the Lingo Behind
Absolutely never use library lingo. You need to use the language that your target audience easily understands. Even some words that are common to you, like “reference” or “collections,” could baffle a layman. Even the word “program” is questionable, especially in its verb form. (Many minds associate “programming” with computers.) So don’t ask people what programming they want, ask them about events, classes, learning opportunities, clubs, etc.
One Survey Does Not Fit All
You won’t get the very best results by trying to ask everybody about everything at once. If you can swing it, you’ll get better results by targeting different surveys to different audiences. For example, if you put out a general community survey asking, “Should the library offer more programs in Spanish?” you’re likely to get negative answers from all the non-Latinos who respond. However, if you use that same question in a survey that goes only to that portion of your population, you’ll get data that’s more valid. Plus, you can ask follow-up questions about which types of Spanish programs they’d value and attend.
Avoid Leading Questions
Think about any crime- or court-related TV shows that you watch and you’ve probably seen a judge scolding a lawyer for asking a “leading question.” Here’s an example of a question that implies what the answer should be:
“Would you like the library to continue with its annual Fun With Monkeys fundraising event?”
This implies a “Yes” answer in a few ways. First, you say the event is annual, so someone who is involved has already deemed it worth doing over and over. Second, it has “fun” in its name (and monkeys!). Finally, it’s a fundraiser—so maybe saying “No” would mean that you don’t think the library needs to bring in that money.
Here’s a different way to ask the same question:
“Would you be interested in going to a library fundraising event called Fun With Monkeys?”
The answers to this question would offer you more insight. Does this sound interesting to them? Hopefully. Are you asking them to make a judgment about your fundraising? No. Although the question still has “fun” and “monkeys” in it, this wording prompts people to tell you more about their own interest (which is what you’re trying to gauge) than it does about “Do you think we should continue it?” Because, really, how would outsiders have any idea whether you should or should not do it again?
After Writing, Test, Fix, Repeat
After you’ve created questions with these potential pitfalls in mind, you can further improve your chances of getting valid data by testing your words on real people. (And by “real people” I mean folks similar to the ones who will actually take the survey. Fellow librarians are likely to understand the words and questions the same way you do, so they’re not valid testers.) Find a few willing patrons or pals and ask them to read each question aloud and tell you what it means to them and how they interpret it. You may find that something you thought was clear can actually be interpreted different ways by different brains. And when you’re talking with them in person, you can explain what you meant for the question to say, and they can suggest different words that work for them.
Testing the questions will also weed out the last of the lingo you might have missed. Sometimes it’ll even bring up follow-up questions or clarifications you hadn’t thought of before. I know this may sound like overkill, but taking ten or fifteen minutes to talk to just a few folks could reveal an error that would have made a question invalid, so it’s worth the time.
Reaching Clear Conclusions
The main point here is this: Don’t quickly write questions about what you think you want to know. Think hard about the endgame. Why are you doing a survey, and what do you want to do with the data afterward? Then work with a group to craft the questions, discuss various words, and talk about how they might be misinterpreted. Run your questions by the sort of people who will ultimately be answering your survey to ensure that they’ll get the sorts of answers you’ll be able to use.
Doing a survey well is a serious undertaking that requires staff time and sometimes precious money. You definitely don’t want to find yourselves looking at your survey results and saying, “Oh, I wish we’d thought to ask that!” when it’s too late. Careful planning will deliver clear conclusions.
I discuss many other points of doing surveys (how to find participants, tips for tabulating, etc.) and focus groups in my book The Accidental Library Marketer. And I’ve listed a few other helpful resources below. Happy surveying, everyone!
Arlene G. Fink, ed., How to Conduct Surveys: A Step-by-Step Guide (Sage Publications, 2008).
Floyd J. Fowler Jr., Improving Survey Questions: Design and Evaluation (Sage Publications, 1995).
Giuseppe Iarossi, The Power of Survey Design: A User’s Guide for Managing Surveys, Interpreting Results, and Influencing Respondents (World Bank Publications, 2006).
Karen L. Gill, “Surveying People Who Don’t Use Libraries,” Marketing Library Services (Mar./Apr. 2010): 1, 5–7.