4 Keys to Developing a Robust Data Quality Strategy

Ensuring data quality is a complex process that requires careful planning and execution. While the concept seems straightforward, implementation can be challenging. Ideally, the process would look like this:

✅ engage quality participants
✅ from a reliable supplier
✅ in a well-designed survey

…for a clear path to insights.


This simplified view might suggest that survey data easily drives business impact. However, the reality is more nuanced. Before we outline the essential elements of a robust data quality strategy, let’s consider a crucial factor: responsibility.

Who is responsible for data quality?

They say sh*t rolls downhill, and that is true when it comes to data quality. To be less crude, we could say that data quality is a top-down initiative that requires ongoing focus. Leadership plays a critical role, even if they are not directly involved in day-to-day quality assurance. Before a project begins, leadership decisions significantly impact data quality:

1. Establishing preferred supplier lists or procurement processes
2. Allocating budget for tech-enabled fraud prevention
3. Ensuring accountability to quality-related standard operating procedures

However, data quality isn’t solely a leadership responsibility. It demands constant attention from every team member throughout the entire project lifecycle.

At dtect, we recognize the complexity of maintaining data quality. Our data quality platform is designed to address these challenges, proactively preventing fraudulent participants from entering surveys and setting a new standard for data integrity in market research. We’re part of a robust data quality strategy. Still, there are other keys to getting things right, and we offer some thoughts as you evaluate your current process and consider the essential aspects of data quality management in your workflow.

Key 1: Choose your sample provider wisely.

In market research, the quality of insights depends heavily on the data source. However, the industry often lacks transparency, making it challenging to determine where data actually comes from. Few sample providers avoid aggregation entirely; many actively, or even exclusively, aggregate sample from other sources. This opacity can lead to unexpected biases and quality issues.

When working with panel providers or aggregators, researchers often have limited control over data quality. Different partners employ various quality assurance methods, yielding inconsistent results. To address this, we need a consistent approach to evaluate all partners, creating a level playing field for assessment.

Key considerations when selecting data sources:

  • Transparency in recruiting processes
  • Diverse recruitment methods to minimize bias
  • A clear understanding of the target audience before sampling

For example, a sample provider recruiting female participants exclusively via social media introduces inherent bias. Conversely, another provider sourcing only from loyalty memberships creates a significantly different audience profile. These nuanced differences can accumulate, impacting data quality more than initially anticipated. Asking providers direct questions about their recruiting and aggregation processes is a major step toward managing this first and most important key to data quality.

It’s easy to say, “Be selective with your panel providers,” but the realities of preferred vendors and compressed project timelines can make this easier said than done. Instead, consider how you can evaluate a provider’s performance. If you can’t implement a research tech (restech) solution that ranks or scores providers over time, come up with an internal way of reviewing reconciliations regularly.
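
To make that internal review concrete, here is a minimal sketch of what ranking providers from reconciliation data could look like. The provider names, field names, figures, and the 5% review threshold are all hypothetical placeholders, not a prescribed standard.

```python
# Minimal sketch: rank sample providers by reconciliation outcomes.
# The providers, fields, and the 5% review threshold are hypothetical --
# adapt them to whatever your reconciliation reports actually contain.
from collections import defaultdict

reconciliations = [
    {"provider": "Provider A", "completes": 400, "removals": 12},
    {"provider": "Provider B", "completes": 250, "removals": 41},
    {"provider": "Provider A", "completes": 300, "removals": 9},
]

totals = defaultdict(lambda: {"completes": 0, "removals": 0})
for rec in reconciliations:
    totals[rec["provider"]]["completes"] += rec["completes"]
    totals[rec["provider"]]["removals"] += rec["removals"]

print(f"{'Provider':<12}{'Removal rate':>14}")
for provider, t in sorted(
    totals.items(), key=lambda kv: kv[1]["removals"] / kv[1]["completes"]
):
    rate = t["removals"] / t["completes"]
    flag = "  <- review" if rate > 0.05 else ""
    print(f"{provider:<12}{rate:>13.1%}{flag}")
```

Even a lightweight report like this, run after every reconciliation, gives you a consistent basis for comparing partners over time.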

Key 2: Implement effective fraud prevention measures.

As technology advances, so do the methods of survey fraud. Traditional approaches to ensuring data quality, such as relying solely on specific sample partners or in-survey checks, are no longer enough. To combat tech-savvy fraudsters, we need equally sophisticated prevention measures.

Effective fraud prevention starts at the top of the funnel, aiming to eliminate problematic participants from surveys entirely. This proactive approach is far more efficient than identifying and removing fraudulent responses after data collection.

Critical aspects of modern fraud prevention:


  • Eliminate Bots
    Implement robust systems to block automated survey-taking technology.
  • Ask the Right Questions
    Validate that people are both real and who they say they are by asking them targeted questions.
  • Apply Validation Techniques
    Deploy multi-faceted validation methods to catch sophisticated fraudsters.

 

To stay ahead of fraudulent activities, researchers must keep up with industry trends and continuously update prevention strategies. Effective fraud prevention is not a matter of adding a few quality check questions to your survey. It’s about implementing a comprehensive, evolving, technology-driven strategy that stops fraud before it enters your data stream. The importance of this key cannot be overstated: once bad actors enter your survey, their responses are costly to weed out, or worse, never get weeded out.
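
As a rough illustration of what top-of-funnel gating means in practice, the sketch below screens entrants for duplicate device fingerprints and bot-like bursts of entries before they ever see a question. The fields and thresholds are assumptions made for the example; dedicated platforms layer far more sophisticated checks on top of ideas like these.

```python
# Illustrative pre-survey gate: reject entrants whose device fingerprint has
# already been seen, or who arrive implausibly fast from the same source.
# The fingerprint, source, and timestamp fields are hypothetical placeholders
# for whatever your survey platform actually captures.
from datetime import datetime, timedelta

seen_fingerprints: set[str] = set()
last_entry_by_source: dict[str, datetime] = {}

def admit(entrant: dict) -> bool:
    """Return True if the entrant should be allowed into the survey."""
    if entrant["fingerprint"] in seen_fingerprints:
        return False  # duplicate device: likely a repeat entrant or bot farm
    previous = last_entry_by_source.get(entrant["source"])
    if previous and entrant["timestamp"] - previous < timedelta(seconds=2):
        return False  # burst of entries from one source: bot-like behavior
    seen_fingerprints.add(entrant["fingerprint"])
    last_entry_by_source[entrant["source"]] = entrant["timestamp"]
    return True

print(admit({"fingerprint": "abc123", "source": "panel-x",
             "timestamp": datetime(2024, 5, 1, 9, 0, 0)}))  # True
print(admit({"fingerprint": "abc123", "source": "panel-x",
             "timestamp": datetime(2024, 5, 1, 9, 0, 5)}))  # False: duplicate device
```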

Key 3: Use best practices for survey design.

Even with the right audience and robust fraud prevention measures, poor survey design can undermine data quality. Effective survey design is crucial for capturing valuable insights and maintaining participant engagement, and a lot of the best practices are well-known but not always followed. The basics are as follows:

1. Questionnaire design

  • Review questions with best practices in mind, such as using clear and jargon-free language, avoiding double-barreled questions, and ordering questions in ways that reduce bias.
  • Optimize for mobile users as the majority of participants complete surveys on smartphones.
  • Simplify questions as much as possible, avoiding questions that participants struggle with, such as large grids.
  • Reduce the number of questions whenever possible, asking only what is truly needed.

2. Engagement strategies

  • Ask engaging, open-ended questions that elicit usable responses. Avoid the trap of assuming that participants will write a paragraph response to a simple, boring question.
  • Ensure the questionnaire flows logically and does not skip from topic to topic, thus reducing the cognitive load for participants.

3. Quality checks

  • Use multiple, sophisticated methods to verify response consistency.
  • Develop a scoring system for participants to make it easier to remove them objectively from the study. Consider a “three strikes” approach or similar scoring mechanism to identify low-quality responses (a minimal sketch follows below).
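
To show how simple a starting point can be, here is a minimal “three strikes” sketch. The specific checks (speeding, straightlining, thin open ends) and their thresholds are illustrative assumptions you would tune to your own studies.

```python
# Minimal "three strikes" sketch: each failed check adds a strike, and three
# strikes flags the respondent for removal. The checks and thresholds are
# illustrative assumptions, not a fixed standard.
def score_respondent(resp: dict) -> dict:
    strikes = []
    if resp["duration_seconds"] < 120:        # finished suspiciously fast
        strikes.append("speeder")
    if len(set(resp["grid_answers"])) == 1:   # straightlined a grid question
        strikes.append("straightliner")
    if len(resp["open_end"].split()) < 3:     # low-effort open end
        strikes.append("weak open end")
    return {"id": resp["id"], "strikes": strikes, "remove": len(strikes) >= 3}

print(score_respondent({
    "id": "r-001",
    "duration_seconds": 95,
    "grid_answers": [3, 3, 3, 3, 3],
    "open_end": "ok",
}))
# {'id': 'r-001', 'strikes': ['speeder', 'straightliner', 'weak open end'], 'remove': True}
```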

Remember, survey design isn’t about convenience for researchers—it’s about creating an optimal experience for participants that yields high-quality data. Thoughtful, participant-centric design is vital to maximizing the value of your research efforts.

Key 4: Establish a quality assurance process.

While technology is crucial to data quality, human oversight remains indispensable. A written Quality Assurance (QA) process ensures that all previous efforts culminate in reliable, actionable insights. Many companies say they have a QA program, but not all of them are documented; they may rely instead on legacy team knowledge. When teams change, that knowledge can disappear, leaving others to guess at what is actually being done. A written QA process, no matter how basic, provides a baseline for what should happen on every project. Depending on the size of your team, you might break these functions down into different roles, but the basics to consider include the following:

1. Data refinement

  • Standardize your duplicate removal process (a simple example follows this list).
  • Provide guidelines for acceptable open ends.
  • Keep a list of approved restech tools with a brief description of what they are most commonly used for and why.
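
Writing a rule like duplicate removal down as code is one way to keep it from living only in someone’s head. The sketch below assumes each response carries an email hash and a device fingerprint; those identifiers are placeholders for whatever your exports actually include.

```python
# Sketch of a documented duplicate-removal rule: keep the first response per
# (email hash, device fingerprint) pair. The identifier fields are hypothetical
# placeholders for whatever your data exports actually contain.
def remove_duplicates(responses: list[dict]) -> list[dict]:
    seen = set()
    kept = []
    for resp in responses:
        key = (resp["email_hash"], resp["device_fingerprint"])
        if key in seen:
            continue  # duplicate of an earlier response: drop it
        seen.add(key)
        kept.append(resp)
    return kept

data = [
    {"id": "r-001", "email_hash": "aaa", "device_fingerprint": "d1"},
    {"id": "r-002", "email_hash": "aaa", "device_fingerprint": "d1"},  # duplicate
    {"id": "r-003", "email_hash": "bbb", "device_fingerprint": "d2"},
]
print([r["id"] for r in remove_duplicates(data)])  # ['r-001', 'r-003']
```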

2. Cross-functional collaboration

  • Consider assigning checklist items not only to a particular team member but also to the department that role belongs in.
  • Store the original project brief in a central location so all involved in the QA process can access specs and requirements.

3. Continuous improvement

  • Set a regular review process for QA efforts that includes discussion about partner performance, data quality tools in use, and manual cleaning and analysis checklists. Be sure to tie that back to feedback from clients about project outcomes.
  • Choose an appropriate time to show clients your QA process to build trust and demonstrate value. Always be willing to accept their feedback.

Whatever you do, document it in any way that makes sense for your workflow. You do yourself, your team, and your clients a favor by not reinventing the wheel. Create it once and then refer to it over and over again.


Because we’re dtect and like to provide something extra, we’ve got one bonus key to data quality success!

No matter what your process looks like, the key to making it all work is consistency.


4 out of 5 might be a good number for dentists who recommend a specific toothpaste, but when it comes to implementing a data quality strategy, we want 5 out of 5.

Did we mention that consistency is key?

Consistency in data quality practices:

  • Provides the team with clear expectations and tools to achieve them
  • Improves efficiency as team members internalize processes
  • Allows for meaningful comparisons across studies
  • Facilitates continuous improvement through standardized metrics
  • Builds trust with clients and stakeholders

While a commitment to the highest standards for data quality starts at the top, it never stops. Executive sponsorship gets the ball rolling with much-needed support for DQM initiatives and the time required to implement them. However, everyone along the way has a tremendous opportunity to impact data quality and deliver for clients.

Putting together a robust data quality strategy that is consistently followed throughout each project lifecycle is key to success. The result is the ability to deliver reliable and representative data every time. This service is more than good client relations — it is the foundation of a serious competitive advantage.
