 Poor Data Quality Costing Companies Millions of Dollars Annually
  By Jeff Kelly, News Editor, SearchDataManagement.com

While the use of data quality software has hit an all-time high, companies, by their own admission, are still losing boatloads of money because of inaccurate data, according to a 2009 survey.

The average organization surveyed by Gartner said it loses $8.2 million annually through poor data quality. Furthermore, of the 140 companies surveyed, 22 percent estimated their annual losses resulting from bad data at $20 million. Four percent put that figure as high as an astounding $100 million.

Much of this loss is due to lost productivity among workers who, realizing their data is incorrect, are forced to compensate for the inaccuracies or create workarounds when using both operational and analytic applications, Ted Friedman, an analyst with the Stamford, Conn.-based research firm, said in an interview.

Still, losses could be even higher were it not for the increasing adoption of data quality tools. According to Gartner, the data quality tools market grew by 26 percent in 2008, to $425 million.

Of those companies that use data quality tools, the 2009 survey found, many have begun deploying them to support projects other than business intelligence (BI) and data warehousing (DW), previously the two most common data quality use cases.

"The tools are not cheap, so people are doing the right thing by finding multiple ways to use them," Friedman said.

Specifically, around 50 percent of survey respondents said they are using data quality tools to support master data management (MDM) initiatives, and more than 40 percent are using data quality technologies to assist in systems and data migration projects.

As the survey indicates, however, most companies still have a long way to go to achieve comprehensive data quality processes. A common shortcoming, Friedman said, is that most data quality tools are difficult for non-power users to understand, and consequently are used by only a small group of workers, usually IT staff.

According to the survey, at 58 percent of organizations only one to five workers regularly interact with data quality tools. At another 22 percent of organizations, between six and 10 workers use them.

To improve data quality throughout the organization, Friedman said, vendors must make data quality tools simple enough for business users to operate, so those users can begin taking responsibility for the quality of their own data.

"In particular, providing data profiling and visualization functionality (reporting and dash-boarding of data quality metrics and exceptions) to a broader set of business users would increase awareness of data quality issues and facilitate data stewardship activities," Friedman wrote in an accompanying report.

"Directly engaging users in specifying and maintaining business rules for cleansing, matching, and monitoring would also aid a shift in culture toward the business, having responsibility and accountability for properly managing data," he wrote.

Friedman also said organizations are increasingly applying data quality to data domains other than customer data, but more still needs to be done. He said the quality of financial data, in particular, costs some companies considerable money in the form of fines for incorrect regulatory filings.

Companies should also invest in technology that applies data quality rules to data at the point of capture or creation, he said, not just "downstream," as when loading data into a data warehouse.

According to the survey, less than half of respondents currently use data quality tools at the point of capture or creation, which often happens in operational systems such as CRM software.
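A brief sketch of what enforcement at the point of capture might look like, as opposed to batch cleansing after the fact: the CrmStore class and its ZIP-code rule below are hypothetical illustrations, not a description of any particular CRM product.

import re

ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")

class ValidationError(ValueError):
    pass

class CrmStore:
    """Toy operational store that rejects bad data as it is entered."""
    def __init__(self):
        self._rows = []

    def create_contact(self, name: str, zip_code: str) -> dict:
        # Enforce quality rules here, at capture, instead of discovering
        # the problem downstream when loading a data warehouse.
        if not name.strip():
            raise ValidationError("name is required")
        if not ZIP_RE.match(zip_code):
            raise ValidationError(f"invalid ZIP code: {zip_code!r}")
        row = {"name": name.strip(), "zip": zip_code}
        self._rows.append(row)
        return row

store = CrmStore()
store.create_contact("Jane Doe", "92688")     # accepted
try:
    store.create_contact("John Roe", "9268")  # rejected at capture
except ValidationError as err:
    print("rejected:", err)

The same rule could still run downstream as a safety net; the point is that it runs first, where the data is born.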

"Historically, data quality tools have been most often used in an offline, batch mode—cleansing data at a point in time outside the boundaries of operational applications and processes," Friedman wrote. "Gartner advises clients to consider pervasive data quality controls throughout their infrastructure, ensuring conformance of data to quality rules at the point of capture and maintenance, as well as downstream."

Source: SearchDataManagement.com, Aug. 25, 2009. The survey mentioned in the article was conducted in 2009.