
Organizations everywhere rely on data management to make informed decisions and improve the bottom line. But with the vast amount of data generated today, things can quickly get out of control and spawn data gremlins (i.e., little pockets of disconnected, ungoverned data) that wreak havoc on the organization.

Remember those adorable creatures from the 1984 comedy horror film Gremlins that turned into destructive, mischievous monsters when fed after midnight? (see Figure 1)

Data Gremlins
Figure 1: Warner Bros. Pictures/Amblin E/Sunset Boulevard/Corbis via Getty Images

In the same way, data gremlins – aka "technical debt" – can arise when your systems lack the flexibility Finance needs to deliver. Effectively managing data to keep gremlins from wreaking havoc is crucial in any modern organization – and that requires first understanding the rise of data gremlins.

Do You Have Data Gremlins?

How do data gremlins arise, and how do they proliferate so rapidly? It all started with Excel. People in Finance and Operations would get big stacks of green bar reports (see Figure 2). Don't remember those? They looked like the image below and were filed and stacked in big rooms.

Green Bar Reports Stacks
Figure 2: Green Bar Paper

When you needed data, you pulled the report, re-typed the data into Excel, added some formatting and calculations, printed the spreadsheet and dropped it in your boss's inbox. Your boss would then review it and make suggestions and additions until confident (a relative term here!) that sharing the spreadsheet with upper management would be useful, and voila, a gremlin is born.

And gremlins are bad for the organization.

The Rise of Data Gremlins

Data gremlins are not a new phenomenon but one that can severely impact the organization, leading to wasted time and resources, lost revenue, and a damaged reputation.  For that reason, having a robust data management strategy is essential to prevent data gremlins from causing havoc.

As systems and integration got more sophisticated and general ledgers became a reliable book of record, data gremlins should have faded out of existence. But did they? Nope, not even a little. In fact, data gremlins grew faster than ever, thanks in part to the rise of the most popular button on any report, anywhere, at the time – you guessed it – the "Export to .CSV" button. Creating new gremlins became even easier and faster, and management started habitually asking for more and more analysis that could easily be created in spreadsheets (see Figure 3). To match the demand, Microsoft raised the number of rows a single worksheet could hold to more than a million with Excel 2007. A million! And people cheered!

Excel is not a database
Figure 3: ©Fox Television The Simpsons™

However, errors were buried in those million-row spreadsheets, not just in some spreadsheets but in almost ALL of them. The spreadsheets had no overarching governance and, in practice, were never checked automatically. As a result, those errors would live on for months or even years. Any consultant who worked with those spreadsheets can confirm stories of people adding "+1,000,000" to a formula as a last-minute adjustment and then forgetting to remove it later. Major companies reported incorrect numbers to the Street, and people lost jobs over such errors.
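Such checks can be bolted on after the fact, though they rarely were. As a minimal sketch (the workbook name is a hypothetical placeholder and the six-digit threshold is an arbitrary assumption), a few lines of Python using openpyxl can flag exactly the kind of hardcoded "+1,000,000" adjustment described above:

```python
# Minimal sketch: scan a workbook for large hardcoded constants buried in formulas --
# the kind of "+1,000,000" last-minute adjustment that never gets removed.
# Assumes a local file named "forecast.xlsx" (hypothetical) and the openpyxl package.
import re
from openpyxl import load_workbook

HARDCODED = re.compile(r"[-+]\s*(\d{6,})")  # a +/- followed by a literal of 6+ digits

wb = load_workbook("forecast.xlsx", data_only=False)  # keep formulas, not cached values
for ws in wb.worksheets:
    for row in ws.iter_rows():
        for cell in row:
            if isinstance(cell.value, str) and cell.value.startswith("="):
                for literal in HARDCODED.findall(cell.value):
                    print(f"{ws.title}!{cell.coordinate}: hardcoded {literal} in {cell.value}")
```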

As time passed, the tools got more sophisticated – from MS Excel and MS Access to departmental planning solutions such as Anaplan, Essbase, Vena Solutions, Workday Adaptive and others. Yet none stand up to the level of reliability IT is tasked with achieving. The controls and audits are nothing compared to what enterprise resource planning (ERP) solutions provide. So why do such tools continue to proliferate? Who feeds them after midnight, so to speak? The real reason is that these departmental planning systems are like a hammer. Every time a new model or need for analysis crops up, it's "Let's build another cube." Even if that particular data structure is not the best answer, Finance has a hammer, and they are going to pound something with it. And why should they do the due diligence to understand the proliferation of gremlins? (see Figure 4)

Mean Gremlin
Figure 4: Warner Bros. Pictures/Amblin E/Sunset Boulevard/Corbis via Getty Images

The relationship between Finance and IT can best be described as "complicated." ERPs and data warehouses are secure, managed environments, but they offer almost no flexibility for Finance or Operations to do any sophisticated reporting or analysis that hasn't been created for them by IT. Requests for new reports are made, often through a ticket system, and the better IT is at satisfying these requests, the longer the ticket queue becomes. Suddenly the IT team is doing nothing but reporting, which leaves Finance thinking, "How the heck is there a team of people in IT not focused on making the business more efficient?"

The answer suggested by the mega ERP vendors is to stop doing that.  End users don’t really need that data, that level of granularity or that flexibility.  They need to learn to simplify and not worry about trivial things.  For example, end users don’t need visibility into what happens in a legal consolidation or a sophisticated forecast.  “Just trust us.  We will do it,” ERP vendors say.

Here's the problem: Finance is tasked with delivering the right data at the right time with the right analysis. What is the result of just "trusting" the ERP vendors? Even more data gremlins. More manual reconciliations. Less security and control over the company's most critical data. In other words, an organization can easily spend $30M on a secure, well-designed ERP system that doesn't fix the gremlin problem – which is exactly why organizations must control the data management process.

The Importance of Controlling Data Management and Reducing Technical Debt

Effective data management enables businesses to make informed decisions based on accurate and reliable data. But controlling data management is what prevents data gremlins and reduces technical debt. Gremlins perpetuate data quality issues, which arise when proper data management practices aren't in place. These issues include incomplete, inconsistent or inaccurate data, leading to incorrect conclusions, poor decision-making and wasted resources.

One of the biggest oversights when dealing with data gremlins is focusing only on Return on Investment (ROI) and dismissing fully burdened technical debt (see Figure 5). Many finance teams use performance measurements such as Total Cost of Ownership (TCO) and ROI to justify that a solution is good for the organization, but the organization rarely dives deeper – beyond those measurements – to include opportunities to reduce implementation and maintenance waste. Unfortunately, this view does not account for the hidden complexities and costs that come with data gremlin growth.

Technical Debt and TCO
Figure 5: Technical Debt is More than the Total Cost of Ownership
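To make "fully burdened" concrete, here is a purely illustrative sketch. Every figure in it is a hypothetical placeholder rather than a benchmark; the point is only structural – recurring, gremlin-driven costs sit outside what a license-plus-maintenance TCO model counts:

```python
# Illustrative only: all numbers below are hypothetical placeholders, not benchmarks.
# The structure is the point -- fully burdened technical debt adds recurring hidden
# costs that a classic TCO/ROI view never captures.

annual_tco_run_cost = 150_000              # what the TCO model counts (maintenance/subscription)

hidden_gremlin_costs = {
    "manual_reconciliation_hours": 2_000,  # analyst hours per year tying out spreadsheets
    "loaded_hourly_rate": 85,              # fully loaded cost per analyst hour
    "error_remediation": 120_000,          # rework, restatements, audit findings per year
}

annual_hidden = (hidden_gremlin_costs["manual_reconciliation_hours"]
                 * hidden_gremlin_costs["loaded_hourly_rate"]
                 + hidden_gremlin_costs["error_remediation"])

print(f"Annual run cost, TCO view:            ${annual_tco_run_cost:,.0f}")
print(f"Annual run cost, fully burdened view: ${annual_tco_run_cost + annual_hidden:,.0f}")
```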

Unfortunately, data gremlins can occur at any stage of the data management process, from data collection to analysis and reporting.  These gremlins can be caused by various factors, such as human error, system glitches or even malicious activity.  For example, a data gremlin could be a missing or incorrect field in a database, resulting in inaccurate calculations or reporting.
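As a minimal sketch of the kind of automated check that catches such a gremlin early (the file name and column names below are assumptions for illustration, not a reference to any specific system):

```python
# Minimal sketch: validate a hypothetical GL extract ("gl_extract.csv") before it
# feeds any calculation or report. Flags the two gremlins named above:
# missing fields and values that cannot be parsed as numbers.
import pandas as pd

REQUIRED = ["entity", "account", "period", "amount"]

df = pd.read_csv("gl_extract.csv", dtype=str)

missing_cols = [c for c in REQUIRED if c not in df.columns]
if missing_cols:
    raise ValueError(f"Extract is missing required columns: {missing_cols}")

blank_rows = df[df[REQUIRED].isna().any(axis=1)]                        # empty required fields
bad_amounts = df[pd.to_numeric(df["amount"], errors="coerce").isna()]   # non-numeric amounts

print(f"{len(blank_rows)} rows have blank required fields")
print(f"{len(bad_amounts)} rows have amounts that are not valid numbers")
```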

Want proof?  Just think about the long, slow process you and your team engage in when tracking down information between fragmented sources and tools rather than analyzing results and helping your business partners act.  Does this drawn-out process sound familiar?

The good news is that another option exists. Unifying these multiple processes and tools can provide more automation, remove the complexities of the past and meet the diverse requirements of even the most complex organization, both today and well into the future. The key is building a flexible yet governed environment in which problems can be solved and new analyses created within the framework – no gremlins.

Sunlight on the Data Gremlins

At OneStream, we've lived and managed this complicated relationship our entire careers. We even made gremlins back when the answer for everything was "build another spreadsheet." And we've eliminated the gremlins in spreadsheets, only to replace them with departmental apps or cubes that are just bigger, nastier gremlins. Our battle scars have taught us that gremlins, while easy to use and manage, are not the answer. Instead, they proliferate and create newer, bigger, harder-to-solve problems of endless data reconciliation.

For that reason, OneStream was designed and built from the ground up to eliminate gremlins – and the need for them in the future (see Figure 6). OneStream combines all the security, governance and audit capabilities needed to ensure accurate data, and it does it all in one place, without the need to step outside the governed environment, while still allowing flexibility inside the centrally defined framework. In OneStream, organizations can leverage Extensible Dimensionality to let end users "do their thing" without ever pushing the "Export to .CSV" button.

OneStream Unified Platform
Figure 6: OneStream Unified Platform Capabilities

OneStream is also a platform in the truest sense of the word. It provides direct data integration with source systems, drill-back to that source data, and flexible, easy-to-use reporting and dashboarding tools.

That allows IT to eliminate the non-value-add cost of authoring reports AND the gremlins – all in one fell swoop.

Finally, OneStream is the only EPM platform that allows organizations to develop their own functionality directly on the platform. That's correct – OneStream is a full development platform where organizations can leverage all the integration and reporting resources they need to deliver their own Intellectual Property (IP). They can even encrypt that IP in the platform or share it with others.

OneStream, in other words, allows organizations to manage their data effectively. In our e-book on financial data quality management, we shared the top three goals for effective financial data quality management with CPM:

Conclusion

Data gremlins can disrupt business operations and lead to severe business implications, making it essential for organizations to control their data management processes before data gremlins emerge.  By developing a data management strategy, investing in robust data management systems, conducting regular data audits, training employees on data management best practices and having a disaster recovery plan, organizations can prevent data-related issues and ensure business continuity.

Learn More

To learn more about how organizations are moving on from their data gremlins, download our whitepaper titled “Unify Connected Planning or Face the Hidden Cost.” 


The life of FP&A professionals doesn't have to mean collating reports and files from different sources just to get the right data and information. In fact, FP&A should move away from spending time on data integration, cleansing and harmonization tasks and instead spend more time applying the team's knowledge to build the right insights for timely, agile decisions. The latter is what drives business performance.

A day in the life of an FP&A team working with fragmented systems and a look under the hood of Enterprise Performance Management (EPM) systems both underscore why the integrated data that comes with Intelligent Finance helps teams elevate their game.

Monday Again! A Day in the Life of a Financial Planning and Analysis Team

It's 9:00 am on Monday. Loria di Frangelico – a fictional character profiling a diligent FP&A leader at a large corporation – reviews the week's goals and prepares the actions and schedule for the coming days. Getting the 3+9 Forecast with actuals that include the previous week's data has always been a hassle. Loria calls Joe Murphy, the FP&A Analyst: "Joe, I'd love to get the figures earlier this time, on Wednesday."

Joe responds: "Of course, Loria. You know we get the product line information automatically from the ERP with little manual enrichment. But for the service lines, we depend on data files extracted by IT. Let me reach out to them now to get that going!"

On Wednesday, Loria follows up with Joe, and he responds: "The IT folks had a data integration issue. They're still cleansing the data and reviewing the mappings since the data transfer didn't go as expected. However, I can get you the product line information now."

Thursday arrives, and Joe finally provides a complete forecast consolidated in a spreadsheet. Loria not only wishes she could have gotten the information earlier in the week but also expects it to be accurate. Yet Joe's file shows a $250K variance in the service line! Joe is requesting clarifications from the field service department, but Loria gave up the fight with IT a long time ago. With no time left, she'll have to report the inaccurate figures to her boss and take the criticism that will surely follow.

Loria di Frangelico isn't happy when she finds out about a $250K variance

A Look under the Hood

Arguably, the narrative above isn't a one-off story for FP&A teams, since getting good information on time is often a drag. Why? Well, the analyst is almost always juggling miscellaneous data sets, files and systems. This struggle occurs in almost every organization with complex product portfolios and diverse business models. But a look under the hood of the EPM activities at many organizations offers some insight into what can go wrong:

If the goal is to provide best-in-class products and services, why should an organization live with underperforming processes and archaic technology? The answer? It shouldn't. Getting the right data at the right time should be an instant process that makes data available to the FP&A analyst whenever required. A unified EPM platform minimizes system and data integration needs and empowers the FP&A team to provide better insights and more agility in critical decision-making.

A Better Performance: Breaking Away from the EPM Toolkit Chaos

FP&A teams at organizations with under-performing processes and out-of-date technology face a reality much like Joe Murphy's in the narrative above – drowning in an ocean of reports, files and spreadsheets. Such professionals accept their fate because they don't understand the root cause of this chaos, let alone imagine a better way.

One of the main issues – the EPM Toolkit chaos (i.e., the complex IT infrastructure that supports these essential processes) – lies buried six feet under, unseen by the team and by those who most need to see it.

Organizational and process changes work around this complex infrastructure but don't really try to fix it – yet fixing it is a necessity for modern FP&A teams. After all, getting the right system setup ought to be a priority for organizations that aspire to react more quickly to new market dynamics.

Today, organizations can replace complex EPM infrastructures with one single platform.

Figure 1 (The Pyramid of Empowerment for the FP&A Team) shows how such an integrated system is possible and how the benefits flow through to the daily job of planners and analysts. With one platform that directly connects to the systems of record (ERP/MES) and systems of engagement (CRM), data can be loaded and transformed natively, just once, to support all the planning and consolidation requirements of modern organizations – including the ability to support M&A activities without adding complexity.

The pyramid of empowerment for the FP&A team
Figure 1: The Pyramid of Empowerment for the FP&A Team
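As a hedged, much-simplified sketch of that "load and transform once" idea (the file names, column names and mapping table below are all assumptions for illustration), the same governed dataset can feed both a consolidation view and a planning view without a second export:

```python
# Hypothetical sketch: two source extracts pass through one shared account mapping,
# and the same governed dataset then feeds both consolidation and planning views.
import pandas as pd

erp = pd.read_csv("erp_actuals.csv")          # entity, source_account, period, amount
crm = pd.read_csv("crm_pipeline.csv")         # entity, source_account, period, amount
account_map = pd.read_csv("account_map.csv")  # source_account, target_account

combined = pd.concat([erp.assign(source="ERP"), crm.assign(source="CRM")])
governed = combined.merge(account_map, on="source_account", how="left", validate="many_to_one")

unmapped = governed[governed["target_account"].isna()]
if not unmapped.empty:
    raise ValueError(f"{len(unmapped)} rows have no account mapping -- fix the map, not the data")

# One dataset, two downstream views -- no re-keying, no second export.
consolidation_view = governed.groupby(["entity", "target_account", "period"])["amount"].sum()
planning_view = governed.groupby(["source", "target_account", "period"])["amount"].sum()
```

The design choice that matters here is that a mapping failure stops the load instead of quietly spawning another disconnected data set.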

The result is impressive: analysts and planners are empowered to apply their knowledge and expertise to build sensible insights.  And the business benefits for the organization are ample:

Built-in financial data quality management in OneStream
Figure 2: Built-In Financial Data Quality Management in OneStream

By contrast, organizations that choose to stay with silos of planning and analysis tools and models face the hidden costs of a fragmented EPM landscape.

Conclusion

The need for data integration isn't going away for FP&A teams – but the process can be greatly simplified. Doing so just requires adopting a well-reasoned strategy and investing in an EPM solution that reduces dependence on systems and data-handling activities to give more time back to analysts and planners. Organizations with scattered and complex EPM landscapes should consider investing in a solution with embedded capabilities to handle data and information from the source systems. When choosing a solution, organizations should also weigh the following key considerations (among others):

A solution of this kind increases the transparency of information and takes planning, analysis and strategic decision-making to the highest level.

At OneStream, we call this Intelligent Finance.

Learn More

Want to learn more about industry-leading data integration practices for EPM? Click here to see how OneStream can help you take things to the next level.


When organisations are faced with uncertainty, they typically re-forecast more often. For example, in the last three months, many businesses have re-forecast their financials at least weekly and their cash positions almost daily. It's what FSN calls the 'hamster wheel' effect – the wheel is turning faster and faster, but unfortunately the process isn't delivering any more substance by way of insight – and that's because, in most cases, fundamentally nothing has changed. But fascinatingly, recent experience with the COVID-19 crisis suggests that businesses are now looking more deeply to non-financial and operational data to provide a better handle on their future prospects.

(more…)

During all my years designing and implementing integration solutions for EPM processes, one thing has become clear to me: data integration is a vital part of the equation. After all, it is what brings together any CPM process, such as Financial Consolidation and Planning & Budgeting. As an integration solution architect, I believe that having someone who builds the bridge between all stakeholders is key for any CPM project. Indeed, one of our main responsibilities is to align source system owners with finance teams.

(more…)

Data continuity is one of the greatest challenges in the Finance department. Many hands touch the data, enrich it, modify it and push it from one system to another until the end of the financial reporting process. The problem is that each of those handoffs breaks data continuity, and those breaks have a cost: you lose the audit trail and valuable information along the way. It is so normal that most Finance departments take it for granted that this is just the way it has to be. But does it still have to be that way?

(more…)

The market disruption we are experiencing is a new challenge for most business models. As work from home has become the norm, we are often asked whether a software implementation project can kick off or continue remotely. The answer is an emphatic yes! In fact, in the normal course of our business, some portion, and sometimes more than half, of OneStream project delivery is performed remotely.

(more…)

Today’s CFOs and controllers need to manage their critical, enterprise-wide financial data and processes as effectively as possible. That data needs to be timely, accurate and easily accessible for insightful reporting and analysis to maintain a competitive edge. Accordingly, their corporate performance management (CPM) solutions need to be robust, scalable and provide full integration with their ERP, HCM, CRM and other systems.

(more…)

In today's volatile business environment, change is the norm, and organizations need the ability to adapt rapidly to changing business conditions, changing regulatory requirements and changing organization structures – including the impact of mergers and acquisitions. One of the key benefits of today's modern corporate performance management (CPM) software platforms is the agility they give organizations to plan, forecast and report through rapidly changing business conditions. But another key benefit is the capability to model the impact of reorganizations, mergers and acquisitions and to quickly integrate these changes without disrupting reporting and planning cycles.

M&As on the Uptick

Global merger and acquisition (M&A) activity has been picking up steam in recent years, with 2017 coming in as one of the most active years on record. According to a recent Harvard Law School article, total deal volume in 2017 reached $3.7 trillion globally (roughly equal to 2016), making it the fourth busiest year on record. Key drivers of this increasing M&A activity include:

  • Low interest rates
  • Increasing stock market performance
  • Tax reform in the US
  • Appetite for digital technologies
  • Shareholder activism

According to a recent M&A trends report by Deloitte, M&A activity is expected to continue at high levels in 2018 and beyond.  The factors cited above, combined with increasing corporate cash levels, will continue to drive high M&A activity in corporations as well as in private equity companies.

The Deloitte report goes on to cite that companies and private equity firms appear to be getting better at achieving their goals for their deals.  Deloitte’s surveys consistently show that well-planned, carefully-executed integrations yield transaction success.

The report states “More than 6 in 10 respondents (63 percent) say they now incorporate the use of non-spreadsheet-based M&A technology tools as part of their deal processes. The respondents cite a raft of benefits. These analytical tools make post-deal integration smoother and faster, reduce costs and conflict, and shorten the time it takes to complete them.”

CPM Software Critical in Supporting M&As

As the Deloitte report cited above indicates, having the right tools in place to support M&A activity is critical to successful M&A integration. Using spreadsheets to integrate the financials of acquired companies and analyze the impact of acquisitions takes too much time and effort and is prone to errors. In fact, Deloitte reported that those who have not used M&A technology tools yet would like to do so going forward: "Sixty-two percent of those who still rely on spreadsheets want to tap into these new M&A tools to integrate their acquisitions faster and more smoothly and to reduce costs and conflicts."

CPM software provides many capabilities that make it essential to planning and executing successful M&As and reorganizations.  Here are a few examples:

M&A Modeling – planning and forecasting the financial impact of M&As on consolidated financial results, including factoring in cost synergies and impact on corporate taxes.

Planning Reorganizations – changing legal entity structures to simulate the impact on financials is not something that's wise to do in a General Ledger (GL). Modern CPM solutions support the creation of multiple hierarchies and the ability to create "what if" scenarios based on potential reorganizations (a simple illustration follows these examples).

M&A Integration – integrating new companies, collecting data from new GL/ERP systems, mapping disparate charts of accounts, and generating consolidated financial results for internal and external reporting.

Ongoing Performance Monitoring – post-M&A tracking of financial and operational performance and key performance indicators (KPIs), and making mid-course corrections is critical to realizing expected synergies and maximizing the return from M&As.
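As a simple illustration of the "Planning Reorganizations" example above (the entity names and amounts are hypothetical), the same entity-level results can be rolled up under the current legal hierarchy and under a proposed post-acquisition hierarchy without ever touching the GL:

```python
# Hypothetical "what if" rollup: the same entity-level results, viewed under the
# current legal hierarchy and under a proposed post-acquisition hierarchy.
import pandas as pd

actuals = pd.DataFrame({
    "entity": ["US01", "US02", "DE01", "ACQ1"],
    "amount": [1_200, 800, 650, 400],   # illustrative figures only
})

hierarchies = {
    "current_legal": {"US01": "North America", "US02": "North America",
                      "DE01": "EMEA", "ACQ1": "Unconsolidated"},
    "post_merger_whatif": {"US01": "North America", "US02": "North America",
                           "DE01": "EMEA", "ACQ1": "EMEA"},
}

for name, rollup in hierarchies.items():
    view = actuals.assign(region=actuals["entity"].map(rollup)).groupby("region")["amount"].sum()
    print(f"\n{name}")
    print(view)
```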

Real-World Customer Examples

Several customers using OneStream's SmartCPM™ platform have cited M&A support as a key benefit of implementing the solution.

TEAM Inc. – implemented OneStream for financial consolidation, reporting, budgeting, forecasting and account reconciliations, replacing their GL and spreadsheet-based approach to integrating acquisitions.

Cleaver-Brooks – selected and implemented OneStream for unified budgeting, planning, and reporting. In addition to streamlining these processes, the solution made M&A integration much easier and faster.

To learn more, visit the customer testimonials page on our web site.  And contact OneStream if your organization needs a better solution for planning and executing successful M&As.

John O'Rourke is Vice President of Product Marketing at OneStream Software. With a background in accounting and finance, John has over 30 years of experience in the software industry, including 20 years of experience in product marketing at Hyperion Solutions, Oracle and Host Analytics. He has worked with many customers and partners on financial reporting and planning initiatives and has spoken and written on many topics in corporate performance management. John has also held positions in strategic marketing and product marketing at Dun & Bradstreet Software, Kenan Systems and Decisyon.

 

(more…)

Data integration is one of the most critical aspects of CPM solutions. Why?  Because the effectiveness of your budgeting, planning, consolidation and reporting processes is fully dependent on getting timely and accurate data from GL/ERP, HCM and other systems.

(more…)
