Rancho Santa Margarita, Calif. - April 25, 2016 - Melissa Data, a leading provider of global data quality solutions, today introduced flexible, scalable data quality to the Hadoop framework for storing and processing Big Data in a distributed environment. Fueled by its partnership with Pentaho, a Hitachi Group Company, and close integration with Pentaho's Big Data Integration and Analytics platform, Melissa Data's global data quality tools and services can be scaled across a Hadoop cluster to cleanse and verify billions of data center records. This gives enterprise IT and data managers a significant advantage, leaving them better equipped to leverage the distributed computing power of Hadoop to handle the rapidly expanding data volumes feeding master data management systems.


Melissa Data will host a free webinar demonstrating quick and easy integration and analysis of large data sets, leveraging Pentaho Business Analytics for Hadoop deployments. Attendees will learn orchestration and automation techniques that build on Hadoop capabilities to transform data into clean, reliable assets. Click here to register for the live online event on Tuesday, May 3, 2016, at 1:00 p.m. Eastern.


Pentaho Data Integration offers intuitive drag-and-drop data integration coupled with data-agnostic connectivity, and is designed to deliver accurate, analytics-ready data to business users from any source. Combined with Melissa Data's integrated data quality tools, available via API or local web service, it eliminates the complex, time-consuming coding traditionally required to achieve data quality on Hadoop. Processes can be automated through the chosen Melissa Data component for enhancing, verifying, correcting, standardizing, or deduplicating customer records, with options spanning full-spectrum data quality that supports the entire Big Data lifecycle. These operations can also leverage Hadoop data processing frameworks, further maximizing the investment in Hadoop infrastructure.
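
As a rough illustration of the web-service approach described above, the following Python sketch posts a single customer record to an address-verification endpoint and merges the standardized fields back into the record. The endpoint URL, field names, and response layout are illustrative placeholders, not Melissa Data's actual API; in a Hadoop deployment, a function like this would typically run per record inside a mapper or a Pentaho Data Integration step.

```python
# Illustrative sketch only: the endpoint, field names, and response
# format below are hypothetical placeholders, not Melissa Data's API.
import requests

VERIFY_URL = "https://example.com/address/verify"  # hypothetical endpoint


def cleanse_record(record: dict) -> dict:
    """Send one customer record for verification and merge any
    standardized fields returned by the service back into the record."""
    payload = {
        "address": record.get("address", ""),
        "city": record.get("city", ""),
        "state": record.get("state", ""),
        "postal_code": record.get("postal_code", ""),
    }
    resp = requests.post(VERIFY_URL, json=payload, timeout=10)
    resp.raise_for_status()
    verified = resp.json()  # assumed to return corrected/standardized fields
    # Overwrite raw values with standardized ones where the service returned them.
    record.update({k: v for k, v in verified.items() if v})
    return record


if __name__ == "__main__":
    sample = {
        "address": "123 main st",
        "city": "rancho santa margarita",
        "state": "ca",
        "postal_code": "92688",
    }
    print(cleanse_record(sample))
```

Because each record is cleansed independently, the same call can be distributed across a Hadoop cluster and scaled horizontally with the data volume.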


"Consistently excellent data quality is essential to protect and maximize the long-term value of analytics, yet cleansing the vast number of records on a Hadoop cluster is not an inherently simple task," said Bud Walker, vice president enterprise sales and strategy, Melissa Data. "By pairing data quality with Pentaho Data Integration, users can quickly automate sophisticated data quality initiatives - capitalizing on the potentially massive scope of a Hadoop system to optimize business intelligence and reporting."
