On how to answer surveys

[As it is National Coming Out Day, I point out I did that here]

If you're a systems administrator, you've taken surveys. It seems any vendor with a support contract will send you one after you interact with support in any way, and most commercial websites have 'how are we doing' popups to capture opinion. Some of these are quick 3-question wonders just to test the waters of opinion, but once in a while a real survey lands in front of us.

Real surveys have multiple pages, ask the same kind of question multiple ways to cross-check opinion, and often attach an incentive to get you to take them. The real surveys are prepared by outside parties, presumably to bring some science to the opinion they're trying to assess (there is, in fact, a difference between 'it looks right' and 'there is science here').

There is an art to answering surveys, and it depends on what your goals are as a respondent and what the survey is attempting to gather.

The vast majority of surveys I get presented come in three flavors:

  1. Hand-wringy "how are we doing???" quickies.
  2. Standardized "how did we do?" longer form ones, usually after interacting with support.
  3. Formalized surveys assessing satisfaction with or overall opinion of a company or product (almost always from the company itself).

The first two kinds are the type of survey that will elicit a response from the company if you answer it right. If that's what you're looking for, perhaps because you had a bad experience with support or have an ongoing frustration you want to talk to someone about, here is how to get it.

5-point scales (5 is Very Satisfied, 1 is Very Dissatisfied)

4-5: Good! No contact needed.
3: Some contact likelihood.
2: Guaranteed contact. This customer is dissatisfied but hasn't written the company off yet. Can be salvaged!
1: Has written the company off.

A 1-star review is someone who is pissed and blowing off steam. A 2-star review is someone with specific problems who might be redeemable.

5-point scales are hard to get nuance out of, which is why most companies seem to be going with 10-pointers these days.

10-point scales (10 is Very Satisfied, 1 is Very Dissatisfied)

8-10: Good!
7: Neutral. Good option to pick to avoid getting called.
3-6: Bad, but redeemable. Nigh guaranteed to get a call.
2: Really bad. Probable call, and they'll handle you with kid gloves.
1: Written off.

This scale is nice since you get a little spectrum for how pissed off someone is. If you want someone to call you, anything from 3 to 6 is a good choice. 7 is good for "I don't like it, but I don't want to talk about it."
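To make that triage concrete, here's a minimal sketch of how a vendor might turn these scores into follow-up decisions. The cutoffs mirror the breakdown above, but the function name and thresholds are my own illustration, not any particular vendor's actual system:

    def follow_up_action(score: int) -> str:
        # Map a 1-10 satisfaction score to the likely vendor response.
        if score >= 7:
            return "no contact"          # 7-10: happy (or neutral) enough, left alone
        if score >= 3:
            return "call back"           # 3-6: dissatisfied but salvageable
        if score == 2:
            return "careful call back"   # very unhappy; kid gloves
        return "write-off"               # 1: vindictive, safely ignored

    for s in range(1, 11):
        print(s, "->", follow_up_action(s))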


Rule of thumb: 1-point responses are vindictive and probably come from people who can't be reasoned with on this. Safely ignorable.


Formalized surveys need as many respondents as possible, good and bad, to be scientific.

This is why they usually come with incentives: to lure in the people who don't give a damn. If you don't want to answer because it comes from a company that sells crap, this is your chance to let them know. Save the loaded language for Twitter, though; subtlety carries further.


Quickie 3-question wonder surveys are mostly to smoke out dissatisfied customers, and maybe take opinion temperature.

I've already described how these are used as a customer-service opportunity to win back the dissatisfied. As for opinion temperature, the internal dialog goes something like this:

Our TPS score dropped from 8.2 to 7.9 this month, and our 1-point reviews jumped from 5% to 8% of respondents. Clearly our customers hate the new interface.

The people who take the time to answer those are people who have something to say (they love or hate the product), or those struck by a passing whim. It won't catch the mushy middle who just don't care. They're about as useful for opinion targeting as a finger in the wind is for determining wind speed.
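As a back-of-the-envelope check on that internal dialog (using its own made-up numbers), it's worth asking how much of an 8.2-to-7.9 drop the jump in 1-point responses explains all by itself:

    def mean_score(happy_mean, one_fraction):
        # Overall average when one_fraction of respondents score 1
        # and everyone else averages happy_mean.
        return (1 - one_fraction) * happy_mean + one_fraction * 1.0

    # Non-1-point average implied by last month: 8.2 overall, 5% ones.
    happy_mean = (8.2 - 0.05 * 1.0) / (1 - 0.05)   # about 8.58

    # Same happy customers, but 8% ones instead of 5%.
    print(round(mean_score(happy_mean, 0.08), 2))  # about 7.97

Nearly the entire drop can come from a few more angry respondents, with everyone else's opinion completely unchanged. "Clearly our customers hate the new interface" doesn't follow.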


Companies do strange things with unscientific survey data

Strange things like coupling Support Engineer pay to the average score on the 3-question-wonder post-call surveys. If you happen to know what a certain company does with this data, you can further game it to your advantage.


Way back in the pre-internet era my family used to get phone survey calls periodically. One of the first questions they asked was this doozy:

Does anyone in your family work in market research or a related field?

Because people who know how surveys work can game them and ruin their scientific validity. It just so happens we did have such a person in our household. And it just so happens they're right to ask.