Traditionally, financial professionals have prepared financial reports for both external and internal users. The first category involves preparing enterprise-level financial statements that meet public transparency requirements; the second is much broader, as internal reports serve managerial decision-making needs. Providing a consistent, insightful and data-rich reporting infrastructure is a substantial challenge.
Over the past two decades, financial professionals at every level have experienced the so-called ‘spreadsheet revolution’, which covers most areas of FP&A activity. The spreadsheet appears to be an irreplaceable tool for analysis and planning; consequently, every FP&A practitioner becomes an analytics modeller and report creator. With many diverse tasks to complete, and if we treat each spreadsheet as a model or report, a practitioner generates hundreds of different reports for many types of users.
Within an enterprise there may be thousands, even tens of thousands, of such reports, generated either by FP&A professionals or by analysts in other business areas. If spreadsheet report creation is not coordinated, the validity of their data – especially for reports that are regularly updated – is open to question and can lead to irrelevant information being included in decision-making.
Although FP&A teams deal with and process primarily financial information, there is often a need to explore operational details analytically and provide appropriate management reports. At the same time, business-line departments work independently, producing sets of analytics and reports for their own operational needs. The result is independent data silos, which in turn lead to not fully comparable reports at the operational level, while those prepared by the FP&A team use aggregations from accounting ledgers without operational detail.
Because financial reports contain internal reclassifications, reallocations and adjustments, they become distant from the business metrics used by business-line operators and analysts. Generating financial reports with specialised financial reporting software solves the problem of organised financial data preparation, which spreadsheets do not, but it does not solve that of enterprise reporting alignment. In other words, specialised financial software establishes a new, well-organised financial data silo.
Structuring Enterprise Reporting
If we try to build a reporting hierarchy within an enterprise, the financial reports take priority, as they provide the main categories for evaluating the business as a whole. Metrics such as revenue, gross profit and operating profit are pivotal to understanding enterprise performance. They may be ‘sliced and diced’ across the enterprise in order to identify the key business drivers that define their behaviour.
When diving deep into operational analytics and collecting specific data metrics from sales and marketing, supply chain management systems, customer relationship systems and human resources, we soon discover that these key areas of business activity use their own categories, which are often unaligned with the financial logic applied by the finance department.
Having defined FP&A as pivotal in the analytics and reporting environment, and as the function responsible for the company’s overall financial result, the challenge is to find a way to create enterprise-level metrics and data comparability across each business unit. The first step in such an initiative would be to create a reporting description matrix, with segments that assign each data competence centre responsibility for a specific business area and its reporting.
These segments are grouped together to define enterprise-level financial metrics such as revenue, cost of sales and gross profit, while at the same time being split into more specific metrics and categories (dimensions) that define the performance of each business segment.
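A minimal sketch of such a reporting description matrix is shown below. All segment names, metric names and dimensions are hypothetical, chosen only to illustrate how segments can be grouped under enterprise-level financial metrics while retaining their own dimensions.

```python
# Hypothetical reporting description matrix: each segment (data competence
# centre) is mapped to the enterprise-level financial metric it rolls up
# into, plus the dimensions used for its own operational reporting.
REPORTING_MATRIX = {
    "retail_sales": {"financial_metric": "revenue",       "dimensions": ["region", "store"]},
    "online_sales": {"financial_metric": "revenue",       "dimensions": ["channel", "campaign"]},
    "procurement":  {"financial_metric": "cost_of_sales", "dimensions": ["supplier", "category"]},
    "logistics":    {"financial_metric": "cost_of_sales", "dimensions": ["warehouse", "route"]},
}

def segments_for(metric):
    """List the segments that together define one enterprise-level metric."""
    return sorted(s for s, desc in REPORTING_MATRIX.items()
                  if desc["financial_metric"] == metric)

print(segments_for("revenue"))  # segments that together define enterprise revenue
```

In practice the same structure is usually kept in a spreadsheet; the point is that each enterprise metric has an explicit, queryable list of the segments responsible for it.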
Once the reporting matrix is built, the next step is checking the data from the business sections in the reporting description matrix and validating the definitions of business categories. The mathematical addition, or other appropriate aggregation, of all business sections should correspond to the company’s overall financial results – revenue, for example. If it does not, there is a consistency mismatch, which may be handled in a number of ways.
An ideal solution for closing the gaps in consistency would be to create an enterprise-level data model as a business intelligence solution. This would encompass all software applications generating business information and build reports from this model as a single source, enabling interconnectivity between reports, especially by linking the financial, operational and planning modules of the enterprise ecosystem. In practice, however, such a project often proves impractical, demanding considerable time and effort across the entire business.
It is more reasonable to start with a simpler approach: adjusting business segment reports to the financial rules, either in spreadsheets or in the applications generating them. Changing applications is possible but often resource-consuming. So what is the simplest solution? The answer is a report validation matrix, which can be realised in spreadsheets – the FP&A team’s favourite tool.
The reporting validation matrix is based on the reporting description matrix described above, which is simply a list of all business segment metrics aligned with the company’s financial indicators and targets. The validation matrix, however, is more detailed and complex. It has to contain the detailed list of segment reports, with the metrics produced by each of them, as well as a summarisation of quantitative information or meaningful business metrics.
These summarised business metrics are linked to at least two summary tables: a business unit summary table and a business direction (or channel, depending on the company’s internal structure) summary table, which are cross-linked and cross-checked through comparison formulas. In addition, there should be a top-level summary of these mid-level summaries, comparing the company’s overall financial results with those of the business segments or units.
The exact structure of the reporting validation matrix may vary with business specifics, but the basic idea is to concentrate (possibly through links) the summarised outputs of various reports and provide horizontal and vertical arithmetical comparisons in one self-created model. This reveals any discrepancies in the reporting information between business units or application modules.
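The horizontal and vertical comparisons can be sketched as follows. This is a deliberately simplified model with hypothetical units, channels and figures: rows are business units, columns are channels, and the row totals, column totals and grand total are each checked against independently reported summaries.

```python
# Illustrative validation matrix: business units x channels.
matrix = {
    "unit_a": {"direct": 500, "partner": 300},
    "unit_b": {"direct": 200, "partner": 400},
}
unit_summaries    = {"unit_a": 800, "unit_b": 600}   # from unit reports
channel_summaries = {"direct": 700, "partner": 700}  # from channel reports
company_total     = 1400                             # from financial statements

def find_discrepancies(matrix, units, channels, total):
    """Cross-check the matrix horizontally, vertically and in total."""
    issues = []
    for u, row in matrix.items():                     # horizontal: per unit
        if sum(row.values()) != units[u]:
            issues.append(f"unit {u} mismatch")
    for c in channels:                                # vertical: per channel
        if sum(row[c] for row in matrix.values()) != channels[c]:
            issues.append(f"channel {c} mismatch")
    grand = sum(sum(row.values()) for row in matrix.values())
    if grand != total:                                # grand total vs financials
        issues.append("company total mismatch")
    return issues

print(find_discrepancies(matrix, unit_summaries, channel_summaries, company_total))
```

In a spreadsheet the same logic appears as comparison formulas along the margins of the summary tables; an empty result list corresponds to every cross-check formula returning zero.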
Some companies even introduce special reporting applications, which upload reports in a standard format and then automatically validate all parameters, detecting discrepancies as they arise. However, not all companies have the resources for such a project. If there is a need for reporting consolidation and validation, the matrices described above may be the first step towards more automated solutions.
Once the reporting validation matrix is built, the issue of updates is inevitable. Where changes to either financial definitions or business applications are imminent, there should be a strategy for managing them. Firstly, changes to business applications should, wherever possible, be agreed with the finance department if they touch key business metrics; here, coordination with the IT team is crucial.
Secondly, changes to financial definitions – driven either by GAAP (IFRS) or by group accounting modifications – should be evaluated in terms of the cost of implementing them, whether in the applications themselves or only at the reporting level. If changes are applied only at the reporting level, all reporting definitions in the summary tables must be updated to ensure consistency. Recognising that the reporting environment changes constantly means appointing a dedicated reporting specialist within the FP&A team to coordinate all the processes described.
As experience shows, proper validation techniques within the reporting ecosystem increase the efficiency of the FP&A team’s analytic function and provide the ability to manage performance measurement better. As a result, this methodology produces a more solid decision-support tool. Finally, it should be stressed that the principles of reporting validation do not require any significant investment; it is simply a managerial initiative that proves highly effective.