Today, confidence in data quality is needed more than ever.  Executives must constantly respond to a multitude of changes and need accurate, timely data to make the right decisions on what actions to take.  It’s therefore more critical than ever that organisations have a clear strategy for ongoing financial data quality management (FDQM).  Why?  Well, better decisions produce better outcomes.  And high-quality data gives organisations the confidence needed to get it right.  That confidence in data ultimately drives better performance and reduces risk.

Why Is Data Quality Important?

Data quality matters for several key reasons, as research shows.  According to Gartner,[1] poor data quality costs organisations an average of $12.9 million annually.  Those costs accumulate not only from time wasted in financial and operational processes but also – and even more importantly – from missed revenue opportunities.

As organisations turn to better technology to help drive better future direction, the emphasis placed on financial data quality management in enterprise systems has only increased.  Gartner also predicted that, by 2022, 70% of organisations would rigorously track data quality levels via metrics, improving quality by 60% to significantly reduce operational risks and costs.

In a report on closing the data–value gap, Accenture[2] said that “without trust in data, organisations can’t build a strong data foundation”.  Only one-third of firms, according to the report, trust their data enough to use it effectively and derive value from it.

In short, the importance of data quality is becoming increasingly clear to many organisations – creating a compelling case for a change in how financial reporting software is evaluated.  There’s now real momentum to break down the historic barriers and move forward with renewed confidence.

Breaking Down the Barriers to Data Quality

Financial data quality management has been difficult, if not virtually impossible, to achieve for many organisations.  Why?  Well, in many cases, legacy corporate performance management (CPM) and ERP systems were simply not built to work together naturally and instead tended to remain separate, siloed applications, each with its own purpose.  Little or no connectivity exists between the systems, so users are often forced to manually retrieve data from one system, manually transform it, and then load it into another – a process that takes significant time and risks lower-quality data.

There’s also often a lack of front-office controls, which leads to a host of further issues.

All of that contributes to significant barriers to ensuring good data quality.  Here are the top 3 barriers:

  1. Multiple disconnected systems – Multiple systems and siloed applications make it difficult to merge, aggregate, and standardise data in a timely manner.  Individual systems are added at different times, often using different technology and different dimensional structures.
  2. Poor quality data – Data in source systems is often incomplete, inconsistent, or out of date.  Plus, many integration methods simply copy data complete with any existing errors – lacking any validations or governance.
  3. Executive buy-in – Failing to get executive buy-in could be due to perception or approach.  After all, an effective data quality strategy requires focus and investment.  With so many competing projects in any organisation, the case for a strategy must be compelling and effectively demonstrate the value that data quality can deliver.

Yet even when such barriers are acknowledged, getting the Finance and Operations teams to give up their financial reporting software tools and traditional ways of working can be extremely challenging.  The barriers, however, are often not as difficult to surmount as some believe.

The solution may be as simple as demonstrating how much time can be saved by transitioning to newer tools and technology.  Such transitions not only reduce the time invested but can also dramatically improve data quality through effective integration, with validation and control built into the system itself.

The Solution

Some believe the Pareto principle applies to data quality: typically, 20% of data enables 80% of use cases.  It is therefore critical that an organisation follows these 3 steps before embarking on any project to improve financial data quality:

  1. Define quality – Determine what quality means to the organisation, agree on the definition, and set metrics to achieve the level with which everyone will feel confident.
  2. Streamline collection of data – Minimise the number of disparate systems and ensure the integrations use world-class technology with consistent data collection processes.
  3. Identify the importance of data – Know which data is the most critical for the organisation and start there – with the 20%.  Then move on when the organisation is ready.
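The first step above – defining quality – becomes actionable once the agreed definitions are turned into measurable checks.  Here is a minimal Python sketch of that idea; the record fields, entity codes, and the 31-day freshness window are illustrative assumptions, not part of any actual product:

```python
from datetime import date

# Hypothetical trial-balance records; field names are illustrative only.
records = [
    {"entity": "DE01", "account": "4000", "amount": 1250.0, "as_of": date(2024, 3, 31)},
    {"entity": "DE01", "account": "4010", "amount": None,   "as_of": date(2024, 3, 31)},
    {"entity": "FR02", "account": "4000", "amount": 980.0,  "as_of": date(2024, 1, 31)},
]

def quality_metrics(rows, reporting_date, max_age_days=31):
    """Score a data set against two agreed definitions of 'quality':
    completeness (no missing amounts) and freshness (data is recent)."""
    total = len(rows)
    complete = sum(1 for r in rows if r["amount"] is not None)
    fresh = sum(1 for r in rows
                if (reporting_date - r["as_of"]).days <= max_age_days)
    return {"completeness": complete / total, "freshness": fresh / total}

# Each metric is the share of records meeting the agreed target.
print(quality_metrics(records, date(2024, 3, 31)))
```

Once metrics like these are agreed and tracked, "quality" stops being a matter of opinion and becomes a number everyone can monitor over time.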

At its core, a fully integrated CPM software platform with built-in financial data quality (see Figure 1) is critical for organisations to drive effective transformation across Finance and Lines of Business.  A key requirement is providing 100% visibility from reports to data sources – meaning all financial and operational data must be clearly visible and easily accessible.  Key financial processes should be automated, and a single interface should let the enterprise utilise its core financial and operational data with full integration to all ERPs and other systems.

Figure 1: Built-In Financial Data Quality Management in OneStream

The solution should also include guided workflows that protect business users from complexity by guiding them step by step through all data management, verification, analysis, certification, and locking processes.

OneStream offers all of that and more.  With a strong foundation in financial data quality, OneStream allows organisations to integrate and validate data from multiple sources and make confident decisions based on accurate financial and operating results.  OneStream’s financial data quality management is not a module or separate product but built into the core of the OneStream platform — providing strict controls to deliver the confidence and reliability needed to ensure quality data.

In our e-book on Financial Data Quality Management, we shared the following 3 goals for effective financial data quality management with CPM:

  1. Simplify data integration – Provide direct integration to any open GL/ERP system and empower users to drill back and drill through to source data.
  2. Improve data integrity – Powerful pre- and post-load validations and confirmations ensure the right data is available at every step in the process.
  3. Increase transparency – 100% transparency and audit trails for data, metadata, and process change visibility from report to source.
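The first two goals can be illustrated with a short sketch.  The following Python is a hypothetical demonstration of pre-load validation and post-load confirmation – the entity codes, field names, functions, and tolerance are assumptions for illustration, not OneStream's actual interfaces:

```python
# Hypothetical sketch of pre-load validation and post-load confirmation.
# Entity codes, field names, and the tolerance are illustrative only.

VALID_ENTITIES = {"DE01", "FR02", "US03"}

def pre_load_validate(rows):
    """Flag rows that should be rejected before anything is loaded."""
    errors = []
    for i, r in enumerate(rows):
        if r.get("entity") not in VALID_ENTITIES:
            errors.append((i, "unknown entity"))
        if not isinstance(r.get("amount"), (int, float)):
            errors.append((i, "non-numeric amount"))
    return errors

def post_load_confirm(source_rows, loaded_rows, tolerance=0.01):
    """Confirm a load with a control total: source and target must agree."""
    src_total = sum(r["amount"] for r in source_rows)
    tgt_total = sum(r["amount"] for r in loaded_rows)
    return abs(src_total - tgt_total) < tolerance

batch = [
    {"entity": "DE01", "amount": 100.0},
    {"entity": "XX99", "amount": 50.0},  # unknown entity: caught pre-load
]
print(pre_load_validate(batch))  # [(1, 'unknown entity')]
```

The point of the sketch is the two-sided check: bad rows never enter the system, and control totals confirm that what arrived matches what was sent.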

Delivering 100% Customer Success

Here’s one example of an organisation that has streamlined data collection and improved data quality in the financial close, consolidation, and reporting process by leveraging OneStream’s unified platform.

MEC Holding GmbH – headquartered in Bad Soden, Germany – manufactures and supplies industrial welding consumables and services, cutting systems, and medical instruments for OEMs in Germany and internationally.  The company operates through three units: Castolin Eutectic Systems, Messer Cutting Systems, and BIT Analytical Instruments.

MEC has 36 countries reporting monthly, spanning over 70 entities and 15 local currencies, so the financial consolidation and reporting process includes a high volume of intercompany activity.  With OneStream, data collection is now much easier, with Guided Workflows leading users through their tasks.  Users upload trial balances on their own instead of sending them to corporate, which speeds the process and ensures data quality.  Plus, the new system was very easy for users to learn and adopt with limited training.

MEC found that the confidence from having ‘one version of the truth’ is entirely possible with OneStream.

Learn More

If your Finance organisation is being hindered from unleashing its true value, maybe it’s time to evaluate your internal systems and processes and start identifying areas for improvement.  To learn how, read our whitepaper on Conquering Complexity in the Financial Close.

Download the White Paper

Financial data quality management (FDQM) has often been developed as merely an afterthought or presented as simply an option by most corporate performance management (CPM) vendors.  In reality, it should be the foundation of any CPM system.  Why?  Well, having robust FDQM capabilities reduces not only errors and their consequences, but also downtime caused by breaks in processes and the associated costs of inefficiency.

Having effective FDQM means providing strict audit controls alongside standard, defined, and repeatable processes for maximum confidence and reliability in any business user process.  Effective FDQM also enables an organisation to shorten financial close and budgeting cycles and get critical information to end-users faster and more easily.

Continuing our Re-Imagining the Close Blog series here, we examine why financial data quality management is an essential requirement in today’s corporate reporting environment.  Poor data quality can lead to errors or omissions in financial statements, which often result in compliance penalties, loss of confidence from stakeholders, and potentially a reduction in market value.

Increasing Interest in Financial Data Quality Management

Organisations are now realising that FDQM is critical.  Why?  Finance teams must ensure not only that the financial reporting gets out of the door accurately but also that all forward-looking data and guidance is fully supported by ‘quality’ actual results.

An old expression used around financial reporting captures this idea well – ‘if you put garbage in, you get garbage out’.


A lack of robust data integration capabilities creates a multitude of problems.

In many cases, legacy CPM and ERP systems were simply not built to work together naturally and tend to remain separate, siloed applications, each with its own purpose.  Often, little or no connectivity exists between the systems, and users are forced to manually retrieve data from one system, manually transform it, and then load it into another.

As organisations get increasingly more complex and data volumes grow ever larger, it is only natural to turn attention to the quality of the data being input.  Not to mention, as more advanced capabilities are explored and adopted – such as artificial intelligence (AI) and machine learning (ML) – organisations are being forced to first examine their data and then take steps to ensure effective data quality.  Why are these steps necessary?  Simply put, the best results from AI, ML, and other advanced technologies are entirely dependent on good, clean quality data right from the start.

What’s the Solution?

A fully integrated CPM/EPM platform with FDQM at its core is critical for organisations to drive effective transformation across Finance and lines of business.  A key requirement is 100% visibility from reports to sources – all financial and operational data must be clearly visible and easily accessible.  Key financial processes should be automated, and a single interface should let the enterprise utilise its core financial and operational data with full integration to all ERPs and other systems.


The solution must also include guided workflows that protect business users from complexity by guiding them step by step through all data management, verification, analysis, certification, and locking processes.

Users should be able to achieve effective FDQM and verification through standardised and simplified data collection and analysis with reports at every step in the workflow.  The workflows must be guided to provide standard, defined, and repeatable processes for maximum confidence and reliability in a business user-driven process.  What’s the end result?  The simplification of business processes and a reduction in errors and inefficiencies across the enterprise.

Why OneStream?

OneStream’s strong foundations in the FDQM arena allow for unparalleled flexibility and visibility into the data loading and integration process.

OneStream’s Financial Data Quality Management is not a module or separate product but is a core part of OneStream’s unified platform – providing strict controls to deliver confidence and reliability in the quality of your data.  How?  Financial data quality risk is managed using fully auditable system integration maps, and validations are used to control submissions from remote sites.  Data can be loaded via direct connections to source databases or via any file format.  Audit reports can be filtered based on materiality thresholds – ensuring one-time reviews at appropriate points in the process.
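The materiality-threshold idea can be sketched in a few lines.  In this hypothetical Python example (the account numbers and the 5% threshold are illustrative assumptions, not OneStream functionality), only variances large enough to matter are surfaced for review:

```python
# Hypothetical variance filter: only period-over-period changes that
# exceed a materiality threshold are surfaced for review.

def material_variances(current, prior, threshold=0.05):
    """Return accounts whose change versus the prior period exceeds the
    threshold (as a fraction of the prior balance).  New accounts with a
    non-zero balance are always flagged, since they have no baseline."""
    flagged = {}
    for account, now in current.items():
        before = prior.get(account, 0.0)
        if before == 0.0:
            if now != 0.0:
                flagged[account] = now  # new account, no prior baseline
            continue
        change = now - before
        if abs(change) / abs(before) > threshold:
            flagged[account] = change
    return flagged

current = {"4000": 1100.0, "4010": 500.0, "4020": 10.0}
prior = {"4000": 1000.0, "4010": 499.0}
# 4000 moved 10% (material); 4010 moved ~0.2% (not); 4020 is new.
print(material_variances(current, prior))
```

Filtering this way keeps reviewers focused on the handful of items that genuinely warrant attention, rather than re-checking every immaterial movement each period.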

In essence, OneStream’s unified platform offers market-leading data integration capabilities with seamless connections to multiple sources.

OneStream Integration Connectors offer direct integration with any open GL/ERP or other source systems.

100% Customer Success


AAA Life Insurance implemented OneStream to support all its financial consolidation, reporting, budgeting, and analysis needs.  Ever since, AAA Life has streamlined the financial close process and dramatically improved not only visibility into data but also transparency into results.  The team can now drill down from OneStream’s calculated or consolidated numbers all the way back to the transactional ERP system to get rapid answers to critical questions.

The dream of ‘one version of the truth’ is entirely possible with OneStream.

Learn More

To learn more about how you can re-imagine the financial close with the unrivalled power of OneStream’s Intelligent Finance Platform, download our whitepaper.  And don’t forget to tune in for additional posts from our Re-Imagining the Financial Close blog series.

Today’s CFOs and controllers need to manage their critical, enterprise-wide financial data and processes as effectively as possible. That data needs to be timely, accurate and easily accessible for insightful reporting and analysis to maintain a competitive edge. Accordingly, their corporate performance management (CPM) solutions need to be robust, scalable and provide full integration with their ERP, HCM, CRM and other systems.


Whether you’re shopping around for an EPM solution or looking to switch to a new one, OneStream’s drill-through capability is the functionality you never knew you needed.


Many vendors claim to handle complex consolidations, but there are several significant differences between the underlying engine needed to run true financial consolidation and the engine used to run straight aggregation or simple consolidations. Any vendor that uses Microsoft Analysis Services, Oracle Essbase, Cognos, TM1, SAP HANA, or a similar straight-aggregation engine will face numerous challenges addressing core financial consolidation requirements. Let’s examine this more closely.
