Data quality observability is one of the core feature sets of Accurity; we like to say it is the most robust solution we offer. The past year, however, was marked not only by the obvious newsworthy happenings worldwide, but also by a subtle yet substantial change in the data management market. Many would let it go unnoticed, but not us.
Data management, and data quality management in particular, has seen serious yet quietly overlooked changes, consistently reported by reputable market research outlets including Gartner and Forrester.
Interesting trends are emerging from all directions, and we will now introduce each of them to you.
This is the most important and widespread trend to keep up with! Data quality was a separate data management discipline for decades, but with the rise of decentralized architectures such as data fabric and data mesh, data management tools, including data quality tools, are now more often evaluated on their performance as a building block of these architectures.
These tools and systems are now expected to fit together like a jigsaw puzzle, where it does not matter how large a puzzle piece is or what it depicts, only how seamlessly it connects with the others.
Systems that generate data for other systems to use, without the need for a central data repository or data standardization, and with a data management headquarters of interconnected platforms overseeing and governing the entire data exchange, are becoming the norm.
Apart from doing a ‘quality job’ as a quality tool, customers’ evaluation criteria now increasingly ask how well the tool can do its part as a building block of a larger platform made up of a variety of interconnected tools. This is why we made Accurity able to integrate with most other data management tools through its powerful REST API, as well as through our growing network of partnerships.
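To make the ‘puzzle piece’ idea concrete, here is a minimal sketch of what such an integration could look like in practice: reading a rule’s latest measurement from a data quality tool over REST and forwarding the score to a data catalog. The endpoint paths, field names, and URLs are hypothetical illustrations for this example, not Accurity’s documented API.

```python
# Sketch of wiring a data quality tool into a larger platform via REST.
# All endpoints, URLs, and field names below are invented for illustration.
import requests

DQ_BASE = "https://accurity.example.com/api"      # assumed base URL
CATALOG_BASE = "https://catalog.example.com/api"  # some other "puzzle piece"

def sync_quality_result(rule_id: str, token: str) -> None:
    """Pull the latest result of one data quality rule and forward it."""
    headers = {"Authorization": f"Bearer {token}"}

    # 1. Read the rule's latest measurement from the data quality tool.
    result = requests.get(
        f"{DQ_BASE}/rules/{rule_id}/results/latest",
        headers=headers,
        timeout=30,
    )
    result.raise_for_status()

    # 2. Push the score to a data catalog so other tools can see it
    #    alongside the asset's metadata.
    requests.post(
        f"{CATALOG_BASE}/assets/quality-scores",
        json={"ruleId": rule_id, "score": result.json().get("score")},
        headers=headers,
        timeout=30,
    ).raise_for_status()
```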
An offshoot of the previous trend, this one is here to make your data quality processes easier. It is one of the many ways data quality is becoming automated for the benefit of its users. By leveraging a close and efficient integration with a metadata management tool, such as a business glossary or a data catalog, a data quality tool can use the technical and business metadata stored in the metadata manager to link data quality rules to connected data assets.
Proven use cases for this kind of automation include automatically reusing certain data quality rules based on the entity types of connected data, or applying data quality rules based on a shared level of data sensitivity.
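As a toy illustration of metadata-driven automation, the sketch below attaches rule templates to a data asset based on its entity type and sensitivity level read from a metadata manager. All names and structures are invented for the example and do not reflect Accurity’s internal model.

```python
# Toy example: metadata (entity type, sensitivity) decides which data
# quality rule templates get attached to an asset. Names are invented.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    entity_type: str        # e.g. "customer", "transaction"
    sensitivity: str        # e.g. "public", "pii"
    rules: list[str] = field(default_factory=list)

# Rule templates keyed by the metadata that triggers them.
RULES_BY_ENTITY = {"customer": ["email_format", "name_not_null"]}
RULES_BY_SENSITIVITY = {"pii": ["masking_applied", "access_audited"]}

def apply_rules(asset: Asset) -> Asset:
    """Attach every rule template implied by the asset's metadata."""
    asset.rules += RULES_BY_ENTITY.get(asset.entity_type, [])
    asset.rules += RULES_BY_SENSITIVITY.get(asset.sensitivity, [])
    return asset

print(apply_rules(Asset("crm.customers", "customer", "pii")).rules)
# ['email_format', 'name_not_null', 'masking_applied', 'access_audited']
```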
Accurity goes one step further, as the metadata manager is part of the same application – integration does not get more seamless than this!
Knowledge graphs are primarily a trend in metadata management, and their data quality application is likewise strongly tied to metadata integration. Knowledge graphs describe semantic, contextual connections between data elements in a graphical way. Thanks to them, you are able to describe relationships between metadata and discover what we call network effects.
Network effects are useful. For example, recommendation and suggestion engines use them to locate similarly interconnected data that might benefit from having similar data quality rules applied to it.
Another use of knowledge graphs in data quality is using the connections between metadata to discover duplicated information. How? Duplicate data objects tend to share the same connections to the same data of the same type. Overlapping relationships of suspiciously similar type can alert data quality managers that there are data quality issues to solve.
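A minimal sketch of that idea, assuming the knowledge graph is available as plain neighbor sets: two elements whose connections overlap almost completely are flagged as possible duplicates. The graph contents and the similarity threshold below are illustrative only.

```python
# Flag possible duplicates in a knowledge graph: nodes whose sets of
# relationships overlap almost completely. Data and threshold are toy values.
def jaccard(a: set, b: set) -> float:
    """Overlap of two neighbor sets (1.0 = identical connections)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Each data element mapped to the set of elements it is connected to.
graph = {
    "customer": {"order", "address", "email"},
    "client":   {"order", "address", "email"},  # suspicious twin
    "supplier": {"order", "invoice"},
}

THRESHOLD = 0.9  # arbitrary cut-off for "suspiciously similar"
nodes = list(graph)
for i, a in enumerate(nodes):
    for b in nodes[i + 1:]:
        score = jaccard(graph[a], graph[b])
        if score >= THRESHOLD:
            print(f"possible duplicate: {a} / {b} (overlap {score:.2f})")
# possible duplicate: customer / client (overlap 1.00)
```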
That is why Accurity allows its users to create knowledge graphs between metadata. Information about connections and relationships between data elements can then be exploited when defining data quality rules, especially from a business perspective. Speaking of which…
One of the most prevalent trends, showing up not only in the data quality segment but in the data management industry as a whole, is so-called “data democratization”. What a buzzword, right?
Well, yes, but behind it lies a very beneficial force of transformation. For years, Accurity has pushed the idea that business users should be the ones in charge of data initiatives, and we built the Accurity platform accordingly.
This trend now brings our philosophy to the forefront of the overall market. The idea that data quality platforms should be built so they can be used successfully by business users, without extensive teams of data specialists, now permeates the pages of most data management market research studies.
Accurity’s Data Quality and Data Observability solution is built with business users in mind. Business-defined data quality rules are the main building blocks on which data quality stands, and from which all technical measurements performed over the data are derived. Connected to both the real, quality-measured data and to metadata with its insights and relationships, they create the perfect bridge between the business and technical sides of data quality.
Following on from the overall democratization trend is another one that is particularly useful to business users. The consensus among market research outlets is now that customers want data quality tools that provide insights and reporting that are useful to, and usable by, business users.
Such reports cannot be built if data quality rules are not defined by business users based on their KPIs, so these two trends go hand in hand.
Once you have your data quality rules defined from the business perspective, you can create detailed reporting and tie your data’s quality directly to your business performance.
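As a hedged illustration of what tying quality to business performance can mean at the simplest level, the sketch below evaluates one business-defined rule (a completeness threshold) and reports it against the KPI it supports. The field names and the KPI mapping are assumptions made for the example.

```python
# Illustrative only: a business-defined rule expressed as a completeness
# threshold over a field, reported against the KPI it supports.
records = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 3, "email": "c@example.com"},
]

def completeness(rows: list[dict], column: str) -> float:
    """Share of rows where the column is populated."""
    return sum(r[column] is not None for r in rows) / len(rows)

score = completeness(records, "email")
rule = {"name": "Email completeness", "kpi": "Campaign reach", "target": 0.95}

status = "PASS" if score >= rule["target"] else "FAIL"
print(f"{rule['name']}: {score:.0%} (target {rule['target']:.0%}) "
      f"-> {status}, affects KPI '{rule['kpi']}'")
# Email completeness: 67% (target 95%) -> FAIL, affects KPI 'Campaign reach'
```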
You may think that data quality and business have little to do with each other, but you could not be further from the truth. The quality of your data determines how well you can monetize your data, or how effective your business processes can be. Without quality data, you cannot trust your reporting and are unable to make informed decisions based on it.
That is why business users should be the ones to take the reins of data quality and, by extension, data management. And with Accurity’s interactive data quality dashboards, it is easier than ever. With widgets displaying everything from basic measurement results to trend graphs and quality performance visualizations, the dashboards give business users everything they need to understand and gain insight into their business’ data quality.
Transformation within the data quality segment of data management is not yet done, and there will be many interesting trends to watch in the coming months and years. Accurity is a flexible and powerful tool to help you guide your organization through these industry changes without losing your grip on data quality itself.
If you would like to know more about our data quality capabilities, why not schedule a demo with one of our product specialists who can guide you through Accurity and show you how it can be tailored to best fit your use case?