One of the most troublesome aspects of DIY Google Consumer Surveys (#gsurveys) is that no matter what you feed into the sausage maker, Google spits back graphed data, down to a tenth of a percentage point, with a detailed margin of sampling error attached to each answer, assuring the DIY researcher that his or her survey results are accurate to a knowable, measurable, finite degree. But let’s examine these results, as proudly showcased by Google, from a recent Google Consumer Survey:
The red arrows (all arrows added by SurveyUSA) highlight the fact that the question, as asked, is the opposite of what was intended and is therefore unanswerable. Answer-choice #1 should have been: “No; very unlikely.” Instead, the answer choice was asked as: “No; very likely,” destroying the integrity of the question and of all the data that came back.
This is the distilled essence of the danger of DIY research: there is no professional researcher to read your work and guarantee that you are asking what you intend to ask. That’s bad enough. But Google Consumer Surveys compounds the problem. Because #gsurveys are fully automated on the back end, Google happily spits back whatever data was gathered, no matter how meaningless. In this case, it is not enough that Google reports the answer choices with decimal-point precision (39.6% answering an unanswerable question; see the green arrow, above); Google goes further and tells the DIY researcher that the 39.6% is accurate to plus 4.1 percentage points or minus 4.0 percentage points (see the blue arrow, above).
This creates the appearance of precision when in fact there is no precision. Had a researcher been paying attention to this survey: a) the question would never have been asked incorrectly, and b) had it been asked incorrectly, it never would have been reported out to the client.
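For readers curious where asymmetric bounds like “+4.1 / −4.0 points” can come from, a standard Wilson score interval for a proportion produces exactly that shape: when the observed share is below 50%, the interval’s upper arm is slightly longer than its lower arm. The sketch below is illustrative only; the sample size of 575 is an assumption chosen to roughly reproduce the article’s figures, and this is not a claim about Google’s actual (undisclosed) methodology.

```python
from math import sqrt

def wilson_interval(p_hat, n, z=1.96):
    """Wilson score confidence interval for a proportion (95% by default)."""
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Hypothetical: 39.6% of ~575 respondents picked answer-choice #1.
p_hat, n = 0.396, 575
lo, hi = wilson_interval(p_hat, n)
print(f"{p_hat:.1%} (+{(hi - p_hat) * 100:.1f} / -{(p_hat - lo) * 100:.1f} pts)")
```

The point of the article stands regardless of the formula: the arithmetic is mechanical, so it will dutifully attach a precise-looking interval to an answer that is meaningless.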
In general, the research paradigm is: Garbage In, Garbage Out.
But Google Consumer Surveys violates that paradigm, in favor of a new, more dangerous paradigm: Garbage In, Precision Out.
The fundamental DIY error in answer-choice #1 is a pity, because it obscures an otherwise novel concept that is being tested: BedHub. As in: StubHub for hotel rooms.
If you are considering using a DIY research tool such as Google Consumer Surveys, consider hiring a professional researcher to help you. A good professional researcher can save you money in the long term. The poor folks at BedHub may have learned this lesson the hard way. Surely they should throw away the results of the DIY attempt they made using Google Consumer Surveys. The only question is whether the BedHub folks will hire a professional before they spend more money trying to dimension the size of their market.