Data Quality and the Tragedy of Mistrust

At the ESOMAR Insights Festival last week, Pete Cape of Dynata provided a wide-ranging “State of the Panel Industry” address, focusing on the theme of trust. He said, “I get to take a more helicopter view of issues affecting our industry. I get to see significant changes: in our industry, the sample industry, and in our clients’ business, the insights industry. My job is to help the two sides align. Data quality is something we are all always concerned about.” (All quotes should be considered paraphrases.)

Pete shared 2020 data showing that consumers lack trust in organizations’ use of their data:

  • 23% are unwilling to share their age, up 4 percentage points from 2019;
  • 50% are unwilling to share their home address, up 8 points;
  • 77% believe companies are dishonest about their use of data, up 5 points;
  • 83% believe consumers have lost control over PII (Personally Identifiable Information), up 2 points;
  • Because of this, 68% would provide false PII, up 8 points.

He said, “Panelists have changed. We’ve changed as people, too. All trust online is a bit less than it was.”

On fraud: “As long as there is any financial attraction to fraudsters, there will always be fraud. They are not here to take surveys, but to make money. We have to make their effort not worth the reward. We have gone past the ability of the human being to spot fraud. It requires technical innovation.” He discussed steps Dynata is taking to use Machine Learning and Artificial Intelligence (AI) to identify fraud, analyzing response patterns across surveys and paradata from participants’ interactions with questionnaires.
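To make the paradata idea concrete, here is a minimal, illustrative sketch of timing-based screening. It is not Dynata’s system, and the thresholds (`min_mean`, `min_spread`) are assumed values chosen for the example; a real ML approach would learn such signals from many surveys rather than hard-code them.

```python
# Toy sketch of paradata-based fraud screening (illustrative only, not
# Dynata's actual method). It flags respondents whose per-question answer
# times are implausibly fast or implausibly uniform -- two patterns
# consistent with bots or "speeders".

from statistics import mean, pstdev

def flag_suspicious(times_per_question, min_mean=2.0, min_spread=0.3):
    """Return reason codes for one respondent's timing paradata.

    times_per_question: seconds spent on each question.
    min_mean: assumed floor for a plausible human average per question.
    min_spread: near-zero variation across questions suggests automation.
    """
    reasons = []
    if mean(times_per_question) < min_mean:
        reasons.append("speeding")
    if pstdev(times_per_question) < min_spread:
        reasons.append("uniform_timing")
    return reasons

# A bot answering every question in ~1 second trips both checks;
# a plausible human trips neither.
bot = [1.0, 1.1, 0.9, 1.0, 1.0]
human = [4.2, 9.8, 3.1, 12.5, 6.0]
```

In practice such rules would be one weak signal among many, combined with cross-survey response-pattern analysis before any account is blocked.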

While trap questions are a common quality-control device, participants make honest mistakes, misreading or misclicking questions. Pete gave an example of a questionnaire using AI to prompt for verification of an unusual pattern.

In the above example, it is unlikely that someone young with a $50K income owns a Bugatti, but that could be a misclick. The AI looks for unusual patterns and then prompts for verification, letting a participant correct their mistakes and move on. Thanks to AI, these verification steps do not need to be configured by the survey programmer.
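A hand-rolled version of this consistency check might look like the sketch below. The brand list, income threshold, and function names are hypothetical; the point of the AI approach described above is precisely that unusual combinations are detected automatically rather than hard-coded like this.

```python
# Illustrative in-survey consistency check, in the spirit of the Bugatti
# example (hypothetical rule, not Dynata's implementation).

LUXURY_BRANDS = {"Bugatti", "Rolls-Royce", "Koenigsegg"}

def needs_verification(answers):
    """Flag an answer combination that is unlikely but could be a misclick."""
    return (
        answers.get("car_brand") in LUXURY_BRANDS
        and answers.get("income_usd", 0) < 100_000  # assumed threshold
    )

def verify(answers, confirm):
    """If flagged, re-ask the participant; clear the answer if not confirmed.

    `confirm` stands in for the survey UI's verification prompt.
    """
    if needs_verification(answers) and not confirm(answers):
        # Let the participant correct the misclick and move on,
        # rather than terminating them as a cheater.
        answers = dict(answers, car_brand=None)
    return answers
```

The design choice mirrors Pete’s point: treat an improbable answer as a possible honest mistake first, and only discard it after the participant declines to confirm it.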

Pete shared 10 ways to improve quality:

  1. Get educated on the realities of online sampling – “We are actively working to get better all the time. We have been honest and transparent about the realities of panelist recruitment, management, and sampling. We all need to think about our commons, the panelist, not just our projects. We need to understand that algorithms rather than email are what place participants into surveys. Please understand that your survey is one of thousands being fielded at any one time.”
  2. Broaden quality definition – Quality is not just what the participant does, but how researchers treat the participants. If the screener is poor, if the questionnaire isn’t designed well, if participants don’t have a good experience, then the quality will be lacking.
  3. Treat participants as clients (or colleagues) – “Try to please participants and delight them with your survey. Speak to them, not down to them.” Today, a panelist only qualifies for 20% of the surveys they start; for the rest, they are screened out or treated as overquota. Try accepting the age and gender data the panel provides rather than asking those questions again.
  4. Test every questionnaire on a neutral person – “Given the speed of business today, a pilot phase seems almost archaic,” Pete said. “While we are all short of time, try to have others test your surveys before they go live. Do a colleague’s survey to see how it feels; have them do yours.”
  5. Tackle survey length – Give respondents a choice between surveys, where they can select one to participate in based on its average length.
  6. Use valid in-survey QC questions – “Let’s use ones that are valid methods of intention.”
  7. Get mobile friendly / mobile optimized – Remember that half of participants take the survey on mobile and design it accordingly.
  8. Use concise wording – “Ditch the jargon; talk the way real people talk.”
  9. Provide adequate answer options – Make sure everyone’s answer is there as a choice in the survey.
  10. Start from a position of trust – “Any issues we have are always with a minority of participants. The vast majority of projects are successful, and a half century of research on research shows the importance of question wording. Prevent the disaster of distrust. Let’s work on improving what we do. It doesn’t need to be a revolution, an evolution will do. Let’s look for continuous improvement, for marginal gains in all aspects of the survey process. Because marginal gains will lead to significant improvement. Let’s turn the spiral of distrust into a virtuous circle.”

Note: Pete Cape is the author of the Principles Express course Quantitative Data Collection Methods, which teaches you how to conduct quantitative research around the world.

Photo credit: Bugatti

