Data Quality is moving into the mainstream.
Where five or six years ago there were a few focused providers, today nearly everybody claims to have a data quality offering.
Most of these providers say similar things, so it is no wonder that it is difficult to decide what is best for your organisation and environment.
What are the key factors that you should consider when making your choice?
- How much support will you need from the supplier?
- Does the solution give you options and choice to support your current (and future) architecture?
- Support for batch and real-time / Service-oriented Architecture (SOA)
- Local and International Support
- Out of the box experience
We will enable your success
A data quality tool, by itself, will not solve data quality problems. At the same time, data governance and data quality processes must leverage technology in order to achieve practical results.
Our approach will save you substantial time and money by focusing your data quality efforts where they will deliver value. We can ensure that you leverage the tools we provide to maximum advantage - applying the appropriate components and approaches to solve your business and technology challenges in a pragmatic way.
Our extensive experience means that we are prepared to take responsibility for an end-to-end solution, including an initial implementation and skills transfer to enable your team to move forward alone.
We will enable your team with both the methodology and the technical knowledge to successfully maintain and enhance your data quality processes.
We are supported in this endeavour by our international partners who share our focus and commitment to data quality - ensuring that the tools and platform continue to evolve to meet the demands of enterprise clients.
The platform gives you choice
The only thing that is certain is change.
Despite all efforts to the contrary, data is typically stored, shared and manipulated in multiple applications, ranging from legacy mainframe systems and client-server applications such as ERP and CRM, to data warehouses, data integration platforms, and web-based applications.
When applications reach the end of their life they are replaced - but the data lives on and must be migrated to the new system.
It should not be necessary to reinvent and redevelop data quality standards and validations for every system, or after every system change, as this raises costs and increases the risk of inconsistent standards.
The Trillium Software System® has been uniquely architected based on the premise that enterprise-wide data quality solutions must work seamlessly with all platforms, in any application, and should be able to integrate into any technical environment. As a result, the core product can be deployed, in a variety of ways, throughout the enterprise.
This approach offers significant benefits:
- Maintain a single instance of the Data Quality platform
- Leverage a standard set of data quality rules
- Support data governance through the consistent application of data quality standards and processes
- Streamline development and deployment efforts
- Support portability across platforms
- Produce the same results across heterogeneous platforms, regardless of how functions are invoked
- Scale easily to grow with your organisation
- Fit into any data integration architecture
Click here for more detail on connectivity options
Seamlessly Deploy to batch or real time
The Trillium Software System core engines improve data quality in both real-time and in batch. Uniquely, Trillium Software System is architected to provide the same set of core functionality in multiple ways throughout the enterprise, as needs dictate.
Vendors of major operational systems, such as SAP, Oracle, IBM and Microsoft, have certified tightly coupled connectivity between the Trillium Software System and their platforms, in order to enhance data quality within and across these environments.
Trillium Software also provides a range of loosely coupled connectivity options ranging from Web Services to high performance APIs to ensure the accessibility of these data quality services via multiple calling mechanisms, including:
• Requested as an on-demand batch process
• Invoked as a scheduled batch job
• Embedded within ETL workflows
• Tightly coupled with a third-party enterprise application
• Loosely coupled as a web service
• Integrated through open standards APIs
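The pattern behind these options can be sketched in a few lines. This is an illustrative stand-in, not Trillium's actual API: a single core cleansing function represents the engine, and the same function is invoked both record-by-record (as a real-time or web service call would) and over a whole dataset (as a batch job would).

```python
# Illustrative sketch only (not the Trillium Software System API):
# one core engine, invoked the same way in real time and in batch.

def cleanse_record(record: dict) -> dict:
    """Stand-in for a core data quality engine: trim whitespace and
    standardise casing on every field of a record."""
    return {key: " ".join(str(value).split()).upper()
            for key, value in record.items()}

# Real-time style: one record per call (e.g. behind a web service endpoint).
single = cleanse_record({"city": "  cape   town "})

# Batch style: the identical engine applied to a whole dataset.
batch = [cleanse_record(r) for r in [{"city": "durban"}, {"city": " pretoria "}]]
```

Because both paths call the same function, the results are identical regardless of how the engine is invoked, which is the point the list above is making.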
The benefits of this approach are obvious: the platform can be seamlessly extended to support additional or replacement systems with minimal effort.
Local and International Support
Knowledge of local culture, language and standards can have a significant impact on data quality efforts - particularly when dealing with party (name and address) data.
A solution that deals adequately with American data may fail to deal with South African data.
Trillium Software is consistently recognised for its global capabilities.
Trillium Software offers native support for all geographies you may require - including (but not limited to) common South African trading partners such as:
the USA, the UK, Brazil, and multiple African countries.
As the South African partner, MDM works with Trillium Software to maintain and enhance the South African country rules.
Our off the shelf rules seamlessly handle English and Afrikaans address and business variations, spelling errors, renaming of towns and cities, and similar complexities - saving hundreds of hours of development effort in comparison to solutions that do not offer this head start.
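One of the rule classes mentioned above, the renaming of towns and cities, can be illustrated with a simple lookup. This is a hypothetical sketch, not the actual country-rule format; the rename entries are real South African examples, but a production rule set would be far larger.

```python
# Hypothetical illustration of a town-rename rule (not the actual rule format).
# The entries are real South African renames, shown here as examples only.
TOWN_RENAMES = {
    "PIETERSBURG": "POLOKWANE",
    "PORT ELIZABETH": "GQEBERHA",
}

def standardise_town(town: str) -> str:
    """Normalise whitespace and casing, then apply the rename lookup,
    leaving towns that have not been renamed unchanged."""
    key = " ".join(town.split()).upper()
    return TOWN_RENAMES.get(key, key)
```

A rule table like this lets superseded names in legacy records resolve to the current official name without any change to the calling code.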
Find out more about handling international address data
Data governance is playing an increasingly important role in defining and driving data quality efforts.
As a result, an enterprise data quality platform must support both business data stewards and the technical deployment team via a single interface.
The out of the box experience (OOBE) must go beyond simple "look and feel" to provide templates and guidelines that quick-start your data management processes and support the application of best practice.
The Trillium Software System provides native support and off the shelf projects to quick-start common data cleaning and matching tasks, such as matching individuals, businesses, households and related parties, or performing address validation and geocoding.
Specific rules can be quickly and easily added or modified with no programming required.
Rules can be developed centrally and easily deployed to any operating system, application or environment using a variety of off the shelf options.
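The idea of rules that are defined once and deployed anywhere can be sketched as rules expressed as data rather than code. This is an illustrative sketch, not the product's actual rule format: a rule is stored as JSON and interpreted by a small generic engine, so changing behaviour means editing data, not programming. The postal-code padding rule reflects the four-digit South African format.

```python
import json

# Illustrative only (not the actual rule format): a rule defined as data,
# interpreted by the same engine wherever it is deployed.
RULES_JSON = '[{"field": "postal_code", "op": "pad_left", "width": 4, "fill": "0"}]'

def apply_rules(record: dict, rules: list) -> dict:
    """Apply declarative rules to a record; only 'pad_left' is implemented
    in this sketch (left-pads a field to a fixed width)."""
    out = dict(record)
    for rule in rules:
        if rule["op"] == "pad_left":
            out[rule["field"]] = str(out[rule["field"]]).rjust(rule["width"], rule["fill"])
    return out

cleaned = apply_rules({"postal_code": "81"}, json.loads(RULES_JSON))
```

Because the rule travels as data, the same definition produces the same result on any operating system or environment that hosts the engine.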
This ensures rapid time to value and a low total cost of ownership while maintaining standards and ensuring consistent treatment of data across the enterprise.
Enterprise data quality drives business value!
The modern enterprise is dependent on a huge number of interconnected systems to enable key business processes and provide decision support. The emerging Big Data paradigm is a symptom of the overarching power of information to enable competitive advantage, while simultaneously adding to the complexities that already threaten to overwhelm overstretched IT teams.
True enterprise data quality enables your organisation to integrate data quality across all of your enterprise applications ensuring that peak-condition data is seamlessly shared across teams, lines of business and systems.
The consistent application of data quality standards and processes enables competitive advantage as both decision makers and operational staff are supported by information that is accurate, consistent, current and complete.
Enterprise data quality can be deployed in an incremental fashion to ensure rapid time to value.
Our proven approach establishes alignment between the business and IT with data governance to focus, prioritise and measure data quality.
We then deploy solutions, processes and resources to discover data conditions, develop business rules and processes to clean data, deploy these through batch and real-time services, and provide ongoing monitoring and remediation of data quality issues.