Creating Customer Satisfaction Survey Questions
This post about creating customer satisfaction survey questions was prompted by a recent phone call I accepted. The guy on the line asked me if I was willing to take a survey. I always ask a couple of questions before I get started. The first is, what is the survey about? A lot of times these survey folks just introduce themselves (like I know them or something) and tell me who is sponsoring the survey (again… like I know them). This particular call was a political survey. The next question is, how long is this going to take? I’m trying to assess whether the survey interests me and whether I want to dedicate any of my time to it. Others have their own requirements… that’s mine.
So, the survey started out innocently enough. Do I live in my county? Do I have any party alignment? Would I be voting in the next election? Who did I plan to vote for? Then he read a measure that was up for vote and asked whether and how I planned to vote on it. I’d never heard of the measure, but as long as I’m at the booth, I’ll likely vote. That’s just what I do.
First, the measure was entirely too long for me to absorb. If you’ve ever read a measure, you know it’s full of all sorts of weird things… folks trying to sneak things by the voters. Well, this one was no different. Parts of it sounded decent, but some things stood out as odd to me. I had questions. He had few answers. In fact, I don’t even think he understood the initiative.
When he was done confusing me, I said I’d vote no. I don’t like measures that are confusing and stick strange things into them. Then he summarized it to sound more attractive… and my answer became an apprehensive yes, which is odd as I type it. I changed my answer after he reworded the question to sound better? Haha.
Now here’s where it started getting really weird, as if it wasn’t already weird. After I told him how I’d vote, he started asking follow-up questions. They were along the lines of, if you knew/understood this about the bill, how would you vote?
The first question was worded so that I had to give one answer to two questions. Here’s an example. Let’s say I went to a restaurant and ordered shrimp and a burger. The shrimp was the best seafood I’d ever eaten, and the burger was so disgusting I took one bite and didn’t take another. So the question would be, how satisfied were you with the seafood and the fast food burger?
I wasn’t even sure how to answer the question. My response to the surveyor was something like, that doesn’t even make sense and that’s a really bad question. Finally, I asked if there was a neutral choice and picked that.
Survey Question Tip: When you ask a question, be sure you’re only asking one question at a time. For example, better questions would have been:
How were your shrimp?
How was your burger?
There were a couple of vague questions in there that gave so little information, I couldn’t even give a true opinion. Here’s another restaurant example. Previous questions had established that I avoid fast food burgers, but I’m not opposed to eating a hamburger at select restaurants or one made at home (my home, by me or a family member). His question was something along the lines of, how likely are you to eat a hamburger?
Again, I wasn’t sure how to answer. He knew how I felt about hamburgers. I’d follow up with, does the question state where I’d get the hamburger? He’d say no. What was I supposed to do with that? My answer was neutral, because it really depended.
Survey Question Tip: Be as specific as you can, so there’s no room for interpretation.
How likely are you to eat a burger at a fast food restaurant?
How likely are you to eat a burger at home in the next week?
Loaded and leading questions go hand in hand. During the political survey I took, the guy’s questions became increasingly biased. Not only were the questions loaded and leading, but I swear his voice was full of judgment too. One particular question started out something like, how likely are you to vote in favor of this poorly worded… Yes, he called the measure poorly worded. As far as I’m concerned, politics is poorly worded. This is where I stopped the survey, asked him what the purpose of it was, and gave him my opinion about the awful design of his survey. If you truly want to know how your customers feel, don’t give them leading questions.
Using the restaurant example, a leading question would be, critics around the world have raved about our great burgers, how tasty was yours? The question has an air of judgment about it. If the participant doesn’t like the burger, does that mean their palate isn’t up to that of a food critic? Likewise, if I support a bill that’s poorly worded, what does that say about my intelligence?
A loaded question would be more along the lines of, where will you be enjoying your next fast food burger? This question is full of assumptions. First, it assumes that I’ll be eating a fast food burger in the first place. Then it assumes I’ll be enjoying it, if I do choose to have one.
Survey Question Tips: Don’t make assumptions, and don’t try to push your participants in any particular direction. Instead, take judgment out of your questions and keep the tone neutral.
How was your burger?
How likely are you to order this burger again?
Your business surveys aren’t about getting pats on the back. The general purpose of surveys is research, and the analysis you do is only as good as the actual data. Therefore, don’t waste your time and money on a poorly designed survey. Do it right. Even if the results aren’t what you wanted or expected to learn, at least they’re valid and will allow you to build a solid plan.
So you know… I don’t think the political survey was meant to get valid results. Instead, I believe it was some sort of push-poll to sway me in a particular direction. After I expressed my irritation over the survey, he told me it was the last question and didn’t even bother getting my response before hanging up.
Anyway… as consumers and entrepreneurs, be wary of the surveys you take yourself.