One of the most abused practices in survey design is forcing a response to a particular question.
You’ve probably experienced this: when doing a survey (usually online), you leave a question blank and try to continue. Rather than going to the next question, the survey highlights the blank question and requires you to answer before continuing.
In theory this might be OK if you were highly confident that the list of options you gave covered every possible customer and situation. Even then, I think there are legitimate reasons why someone might decline to answer almost any survey question.
In the real world, of course, this is not what happens. All too often there’s something the survey writer didn’t think of, and the customer is forced to answer a question she feels she can’t. The customer will do one of two things: abandon the survey (which is bad), or make something up (which is worse). Either way, the survey isn’t getting the data you wanted.
I encountered this just today. I was taking a customer survey for an organization I've done business with over the past few years, and I did have some feedback to offer: specific products I would like to see them carry in the future. The third question or so asked which products I'd purchased in the past two years, offering a list of about 20 different products. The stuff I'd actually bought wasn't on the list.
Naturally there was no option for “some other product,” and the survey wouldn’t let me continue without selecting something.
In this case, I abandoned the survey. In the past I’ve been known (to my eternal shame) to just make stuff up in order to be given the chance to give the company a piece of my mind.
This is why I think that forcing the customer to respond to a particular survey question is almost always bad practice. You may decide to discard some surveys after the fact because the customer skipped too many questions, but at least give the customer the opportunity to provide whatever feedback she wants.