Big data is everywhere. Now widely accepted as the “currency of the new economy”, it extends far beyond business. The state itself is one of the largest producers and owners of data. In this context, Data Veracity takes on a new dimension.
Data derives its value not from its quantity but from its trustworthiness. In our increasingly data-driven economy, the consequences of bad data being used by a company or public service cannot be overestimated. How to regulate, legislate and protect our data is occupying the minds of decision-makers at the national and European levels. Data Veracity is also identified as one of the five trends in Accenture’s 2018 Technology Vision.
Determining the accuracy and trustworthiness of data is one of the toughest challenges private and public organizations face. Scrutinizing and verifying raw, unfiltered data takes time, money and an understanding of the different types of bias, noise and anomalies that may occur. The sheer scale of the datasets involved makes it highly likely that they will contain at least some degree of addition (the corruption of a data set by the introduction of false records) or falsification (the corruption of existing records).
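To make the distinction between addition and falsification concrete, here is a minimal Python sketch, not Accenture’s methodology: it assumes a trusted baseline snapshot of the data exists, and the `fingerprint` and `audit` helpers and the toy records are hypothetical. Records whose key is unknown to the baseline are flagged as possible additions; records whose contents no longer match their baseline fingerprint are flagged as possible falsifications.

```python
import hashlib

def fingerprint(record: dict) -> str:
    """Hash a record's contents so any change to a field changes the digest."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def audit(baseline: list, current: list, key: str = "id"):
    """Compare a dataset against a trusted baseline snapshot.

    Returns (added, falsified):
      added     -- records whose key is absent from the baseline (possible addition)
      falsified -- records whose key matches but whose contents differ (possible falsification)
    """
    trusted = {r[key]: fingerprint(r) for r in baseline}
    added, falsified = [], []
    for record in current:
        digest = trusted.get(record[key])
        if digest is None:
            added.append(record)
        elif digest != fingerprint(record):
            falsified.append(record)
    return added, falsified

# Hypothetical toy data: one injected record and one altered balance.
baseline = [{"id": 1, "balance": 100}, {"id": 2, "balance": 250}]
current  = [{"id": 1, "balance": 100},
            {"id": 2, "balance": 999},   # falsification: existing record corrupted
            {"id": 3, "balance": 50}]    # addition: false record injected

added, falsified = audit(baseline, current)
print("possible additions:", added)           # [{'id': 3, 'balance': 50}]
print("possible falsifications:", falsified)  # [{'id': 2, 'balance': 999}]
```

In practice such a baseline comparison is only one layer of verification; statistical anomaly detection and provenance tracking would complement it, but the sketch shows why the two corruption modes call for different checks.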
Source: Accenture-Insights