
5 steps to create an effective data quality management strategy

Data quality management (DQM) is a structured process for managing quality, checking validity, and maintaining integrity of market research data. The process covers the full workflow, including how study participants are vetted, and how data is collected, stored, processed, analyzed, and presented.

Why is DQM important?

Data is the most important asset a company has to make informed decisions. Data quality management provides a context-specific process for improving the quality of data used for analysis and decision-making. The goal is to ensure integrity and veracity of data regardless of the sample size. This means that a working data quality management system should require the same research rigor for both small and large data sets.

It’s crucial to understand that no data quality framework is ever complete. It requires continuous improvement and iteration to keep pace with emerging threats, especially as bad actors change how they cheat the system. Even setting fraud aside, new technologies continue to emerge that help assess the quality of participants before they enter a survey, dramatically reducing the amount of data cleaning required.

This matters especially for data quality in quantitative surveys. Traditionally, participants are sourced from companies that maintain a panel or pool of participants – essentially a database of people who have opted in to participate in surveys. The company conducting the survey selects them based on demographics, psychographics, or some combination of the two. The market researcher inputs the targeting information and quotas to meet the sample size of the study, and the selected participants are then directed to take the survey.

However, the quality of participants is largely unknown until they enter the survey. Through a series of behaviors and outcomes, the research professional monitoring the completion of the study may find that they need to “clean” or “scrub” the data prior to analysis because there are red flags that indicate low-quality data. Examples of low-quality data include nonsense or out-of-context answers in open ends, caused either by bots taking the survey or by actual humans moving too quickly and not providing meaningful answers. In some instances, participants may be members of multiple panels, whether for legitimate or malicious purposes, opening the door for them to complete a survey multiple times. Such duplication distorts survey results, as surveys are designed for participants to complete only once.
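
To make those red flags concrete, here is a minimal sketch of the kind of post-collection scrubbing a researcher might script before analysis. It assumes a flat export of responses with hypothetical column names (respondent_id, duration_seconds, open_end), and the thresholds are illustrative choices, not industry standards.

```python
import csv
from collections import Counter

SPEED_FLOOR_SECONDS = 120   # assumed minimum plausible completion time
MIN_OPEN_END_WORDS = 3      # assumed minimum for a meaningful open-ended answer

def flag_low_quality(path):
    """Return {respondent_id: [reasons]} for rows that look like low-quality data."""
    flags = {}
    id_counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rid = row["respondent_id"]
            id_counts[rid] += 1
            reasons = []
            if float(row["duration_seconds"]) < SPEED_FLOOR_SECONDS:
                reasons.append("speeder")
            if len(row["open_end"].split()) < MIN_OPEN_END_WORDS:
                reasons.append("low-effort open end")
            if reasons:
                flags[rid] = reasons
    # Duplicate completes distort results, since each respondent should appear only once.
    for rid, n in id_counts.items():
        if n > 1:
            flags.setdefault(rid, []).append(f"duplicate ({n} completes)")
    return flags
```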

An innovative answer to this problem is to build standard operating procedures into your DQM that vet potential participants before they enter the survey. In other words, DQM should not start only AFTER the survey data has been collected.

Failing to include data quality steps BEFORE participants begin the survey results in an inadequate system that will never reliably produce the high-quality data businesses need to make informed decisions.
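
As a simple illustration of what “vetting before entry” can mean in practice, the sketch below gates participants on a device/identity fingerprint before they are routed into the survey. The function and field names are hypothetical stand-ins, not dtect’s API; a production system would use a shared datastore and far richer behavioral signals.

```python
import hashlib

seen_fingerprints = set()   # in practice this would be a shared datastore, not in-memory

def fingerprint(ip_address: str, user_agent: str, panel_member_id: str) -> str:
    """Hash a few identity signals into a single comparable fingerprint."""
    raw = f"{ip_address}|{user_agent}|{panel_member_id}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def admit_participant(ip_address: str, user_agent: str, panel_member_id: str) -> bool:
    """Return True only if this participant has not already been seen."""
    fp = fingerprint(ip_address, user_agent, panel_member_id)
    if fp in seen_fingerprints:
        return False    # likely duplicate: blocked before the survey ever starts
    seen_fingerprints.add(fp)
    return True
```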

Here are the steps that need to be taken:

Step 1: Face the facts

Poor data quality isn’t just a minor inconvenience; it’s a business-breaking liability. Inaccurate insights, wasted resources, and botched decisions are all on the line. So, stop sweeping data quality issues under the rug and start tackling them head-on.

The Coalition for Advancing Sampling Excellence (CASE), a third-party, non-profit, brand-led coalition, has revealed startling findings about the state of survey data quality in the market research industry. According to their research, between 30% and 40% of survey data used in analysis is fraudulent. This means that without proper vetting, problematic participants and even bots end up taking surveys, compromising the integrity of the data and the insights derived from it.

Traditional approaches to mitigate this issue, such as simple demographic checks or digital fingerprinting, are no longer sufficient. The industry needs to adopt advanced technologies that use sophisticated behavioral checks to keep these potentially harmful participants out of surveys.

CASE emphasizes that it is neither sufficient nor efficient to “clean the data” after low-quality participants have completed the survey. Instead, data quality needs to be proactively managed before the survey begins. This proactive approach is crucial for ensuring valid and reliable findings from your data.

The mission of CASE is to address the issues impacting the ability of providers to ensure valid and reliable findings from marketing data intelligence and to help brands understand the trade-offs made between time, cost, and quality. By facilitating discussions between brands and C-level thought-leaders from top-tier agencies and providers, CASE aims to develop standards and KPIs for creating transparency and accountability across the research and data intelligence supply chains.

Understanding the current data collection and sampling challenges enables researchers and insights professionals to have more knowledgeable and meaningful dialogue with vendors to ensure a quality foundation for marketing decision-making. CASE plays a vital role in this process by:

    1. Collaborating with senior-level insights executives from across the industry, including brands, agencies, and providers.
    2. Advocating for transparency and accountability across the sample data intelligence supply chain.
    3. Spearheading new ideas for objectively measuring cost, time, and quality trade-offs in data collection and sampling.
    4. Educating brands on the current issues in data collection and sampling, which are impacting the industry’s ability to ensure valid and reliable findings.

By facing these facts head-on and embracing the need for proactive data quality management, market research firms can take the first step towards ensuring the integrity of their survey data and the insights they provide to their clients.

Step 2: Set your standards

Next, establish a clear set of data quality standards. What does “good” look like for your firm? Define your criteria for participant authenticity, survey completion rates, and data consistency. Don’t be afraid to set the bar high—your clients (and your bottom line) will thank you.

In 2023, a Nasdaq article stated, “Corporate revenues, profits, and share values are getting drained by an insidious force that’s worsening by the day. It’s causing businesses to waste billions of dollars and make strategic decisions that don’t necessarily serve their customers. All stakeholders, including investors, are paying a price for it. The problem is market research fraud. And the impact is larger than most executives realize.”

While these claims don’t take into consideration the usage of participant-vetting platforms like dtect, the problem is real. As you set your standards, be sure to gather the right stakeholders along with the key personnel who set up and run surveys. Together, they can establish data quality standards that make sense for your team. This helps market research professionals weigh the many factors involved in balancing the need for high-quality data with speed of delivery. A few good conversation starters include:

  1. How do we define “good enough” data quality?
    Determine the minimum acceptable level of data quality that still allows for meaningful insights. This may vary depending on the type of study, the client’s needs, and the decisions that will be made based on the results.
  2. How should we prioritize critical data points?
    Identify the most essential data points that directly impact the study’s objectives. Focus on ensuring the highest level of data quality for these key metrics while allowing for some flexibility on less critical data points.
  3. How can we implement a tiered approach?
    Consider creating different levels of data quality standards based on a study’s complexity, budget, and timeline. For example, a high-stakes, long-term project may require more stringent standards than a quick-turnaround exploratory study.
  4. How can we leverage automation and technology to improve data quality and still keep costs down?
    Consider using the dtect platform to streamline the process of participant authentication and fraud detection. Automating these processes allows you to maintain high data quality standards without sacrificing speed.
  5. How can we continuously monitor and adapt to what is happening in the sample industry?
    Regularly assess your data quality standards against industry benchmarks and your own historical data. A long-term benefit of using dtect is being able to see how each supplier’s data quality trends over time, so you can make data-driven decisions about who provides the highest quality participants. This can save your team time and money by reducing the need to request and follow up on reconciliations.

By carefully considering these factors and incorporating them into your data quality standards, you can establish a framework that ensures the integrity of your research data while still meeting the fast-paced demands of the business world. Remember, finding the right balance between quality and speed is an ongoing process that requires continuous evaluation and adjustment.
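
One practical way to make those standards concrete is to write them down as a shared, reviewable configuration that your team applies per study tier. The sketch below is only an illustration; the tiers, metric names, and thresholds are assumptions for discussion, not benchmarks.

```python
# Tiers and thresholds below are illustrative assumptions, not industry benchmarks.
QUALITY_STANDARDS = {
    "high_stakes": {                 # e.g. a long-term tracker or pricing study
        "max_fraud_rate": 0.02,      # share of completes flagged as fraudulent
        "max_speeder_rate": 0.05,
        "min_open_end_words": 5,
    },
    "exploratory": {                 # quick-turnaround directional work
        "max_fraud_rate": 0.05,
        "max_speeder_rate": 0.10,
        "min_open_end_words": 3,
    },
}

def meets_standard(tier: str, fraud_rate: float, speeder_rate: float) -> bool:
    """Check a finished study's headline metrics against its tier's thresholds."""
    std = QUALITY_STANDARDS[tier]
    return fraud_rate <= std["max_fraud_rate"] and speeder_rate <= std["max_speeder_rate"]
```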

Step 3: Choose your weapons

Now, it’s time to arm yourself with the right tools. Look for a data quality platform that integrates seamlessly with your existing workflows and offers a comprehensive approach to fraud prevention. From AI-powered screening to location validation and deduplication, make sure your chosen solution covers all the bases.
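
To see how such checks fit together, here is an illustrative composition of the kinds of pre-entry checks described above (location validation, deduplication, and screening) into a single gate. Every function and field name is a hypothetical stand-in, not dtect’s actual API or scoring logic.

```python
_seen_devices = set()   # shared state for the deduplication check (illustrative only)

def geo_ok(participant: dict) -> bool:
    # Assumption: the panel profile country should match the IP-derived country.
    return participant["claimed_country"] == participant["ip_country"]

def not_duplicate(participant: dict) -> bool:
    fp = participant["device_fingerprint"]
    if fp in _seen_devices:
        return False
    _seen_devices.add(fp)
    return True

def screener_ok(participant: dict) -> bool:
    # Assumption: a short pre-survey open end is screened for minimal effort.
    return len(participant["screener_answer"].split()) >= 3

CHECKS = [("location", geo_ok), ("deduplication", not_duplicate), ("screening", screener_ok)]

def gate(participant: dict) -> list[str]:
    """Return the names of failed checks; an empty list means admit the participant."""
    return [name for name, check in CHECKS if not check(participant)]
```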

 


 

Want to see what all the fuss is about with dtect? Let us show you how it all works.

Book a demo

 

Step 4: Make it a team effort

Data quality management isn’t a solo mission. Get your whole team on board, from project managers to programmers. Make sure everyone understands the importance of data integrity and their role in upholding your standards. Foster a culture of transparency and accountability, and plan regular meetings to share best practices and review feedback from completed projects.

One big benefit of using the dtect platform is that, over time, you can get reliable data on the quality of sample provided by specific suppliers. Because dtect qualifies and vets participants before they enter the survey, you can see how many were removed because of duplication or failing quality checks.

The platform provides a comprehensive overview of each supplier’s performance metrics. You can easily track the number of starts, completions, terminations, over-quota instances, and security-related terminations. As your projects accumulate, you’ll begin to see each supplier’s overall performance and make data-driven decisions about your partnerships.
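
If you track dispositions yourself, a simple aggregation like the sketch below can produce the same kind of supplier-level view. The record fields mirror the metrics mentioned above but are assumptions about your own tracking, not the dtect schema.

```python
from collections import defaultdict

def supplier_summary(records):
    """records: iterable of dicts with 'supplier' and 'disposition', where
    disposition is one of: complete, terminate, over_quota, security_term."""
    counts = defaultdict(lambda: defaultdict(int))
    for r in records:
        counts[r["supplier"]][r["disposition"]] += 1
    report = {}
    for supplier, c in counts.items():
        starts = sum(c.values())                     # every record is a survey start
        report[supplier] = {
            "starts": starts,
            "completes": c["complete"],
            "completion_rate": c["complete"] / starts if starts else 0.0,
            "security_term_rate": c["security_term"] / starts if starts else 0.0,
        }
    return report
```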

Step 5: Keep clients in the loop

Finally, don’t be afraid to have tough conversations with your clients. Educate them on the importance of data quality and the steps you’re taking to ensure it. The supplier data you gather through dtect enables transparency about your sourcing decisions and the actions you take to select the right partner for the job. Remember, a proactive approach to data quality is a powerful differentiator in a crowded market. We have found that proactively addressing data quality can serve as a significant competitive advantage even when clients are not asking about it. It’s better to initiate these discussions early than to risk losing a client who hasn’t yet voiced their concerns about data quality.

The world of market research moves fast. Keeping pace with the speed of business and the quickly changing attitudes, behaviors, and actions of consumers is essential for brands to gain and maintain a competitive advantage. Ensuring clients get high-quality data to make informed decisions is critical. If any of these steps are new to you, it’s time to refine your DQM process.
