By Michael J. Huthwaite, Director of Product Management, Intelligent Reporting at Walmart
Scenario Planning is a common topic among FP&A professionals and the software and services vendors who support them. Case studies, whitepapers and webinars are common media forms that highlight best-in-class scenario modelling.
But beneath the fancy rhetoric, slick analogies and often glossy images lies the less glamorous mechanics of scenario modelling. This is the layer that diehard modellers want to understand and operate within.
What methods are you using? How complex is the scenario you are running, and are any shortcuts being taken? In this article, we explore the most common techniques used for financial scenario modelling.
Before we discuss the most common techniques for running financial scenarios, it is important to start by revisiting the difference between Deterministic vs Stochastic modelling.
Below is a quick table.
| | Deterministic modelling | Stochastic modelling |
|---|---|---|
| **Used to** | Calculate very specific forecasted targets based on key assumptions and logic | Quantify the likely range of outputs that could occur |
| **Good for** | Interpreting what exact actions to take (due to discrete assumptions) | Passive observation of what is going on |
| **Based on** | Concrete assumptions and logic | Statistics (probability) |
| **Output** | The exact outputs are always reproduced, provided the assumptions and the logic do not change | The results are likely to be slightly different every time the analysis is run |
Deterministic modelling is the most common form of financial modelling.
This method leverages concrete assumptions and logic to calculate financial projections that attempt to approximate the future. In doing so, deterministic modelling provides management with a clear and discrete set of assumptions to try to manage the business.
On the other hand, stochastic modelling leverages probability and statistics to quantify the likely range of outputs that may occur, based on running numerous trials. Stochastic modelling differs from deterministic modelling in that the results are likely to be slightly different every time the analysis is run, although, as one would expect, those differences are statistically insignificant. With deterministic modelling, the same outputs are always reproduced, provided the assumptions and the logic do not change.
The stochastic method is arguably more scientifically accurate (complete with confidence intervals). However, because it does not use discrete assumptions, it is often difficult for management to interpret what exact actions to take. As a result, most FP&A organisations still prefer to use deterministic methods. Stochastic modelling tends to be a better option for modellers who want to observe what is going on rather than try to manage the outcomes.
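The contrast between the two methods can be sketched in a few lines of Python. This is a minimal illustration, not a production model: the growth figures and the use of a normal distribution for the stochastic case are assumptions chosen purely for the example.

```python
import random

def deterministic_forecast(base_revenue, growth_rate, years):
    """Fixed assumptions and logic: the same inputs always reproduce the same outputs."""
    return [base_revenue * (1 + growth_rate) ** t for t in range(1, years + 1)]

def stochastic_forecast(base_revenue, mean_growth, growth_stddev, years, rng):
    """Growth is drawn from a distribution each period, so each run differs slightly."""
    revenue, path = base_revenue, []
    for _ in range(years):
        revenue *= 1 + rng.gauss(mean_growth, growth_stddev)
        path.append(revenue)
    return path

# Deterministic: identical on every run.
print(deterministic_forecast(1000.0, 0.05, 3))

# Stochastic: slightly different each run (unless the random seed is fixed).
print(stochastic_forecast(1000.0, 0.05, 0.02, 3, random.Random()))
```

Rerunning the first call always prints the same path; rerunning the second produces a new one, which is exactly the reproducibility distinction drawn in the table above.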
Below are four scenario techniques that are commonly used in deterministic scenario modelling.
The most common approach to running scenarios is via versioning. In spreadsheet terms, this is simply done by changing the numbers and hitting “SAVE AS”.
In more purpose-built solutions, model owners can often leverage a version dimension to create additional members, which can then be compared against each other.
How it works: In both cases, a version is a fairly rudimentary way to run an alternative scenario. In both the spreadsheet and the dimension example, the entire dataset is duplicated, allowing the modeller to change one dataset while leaving the other constant.
Callouts: Versions work well as long as the initial or underlying dataset does not change.
Challenges: Version control. How do you ensure that people look at the same version, and how do you merge versions to bring the best ideas together under a single version? This can be a real challenge for many organisations, and most financial planning solutions do little to solve this problem.
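In code terms, versioning amounts to a full copy of the dataset, spreadsheet "SAVE AS" style. The sketch below uses illustrative assumption names and values to show both the mechanic and the drift problem described above.

```python
import copy

# A "version" duplicates the entire dataset (all assumptions, not just changes).
base_version = {"units_sold": 10_000, "price": 25.0, "cogs_pct": 0.60}

# Creating a scenario copies everything, then edits a few assumptions.
upside_version = copy.deepcopy(base_version)
upside_version["price"] = 27.5

def revenue(version):
    return version["units_sold"] * version["price"]

# Both full datasets now live side by side and can drift independently:
# if the base assumptions change later, the copy does not pick them up.
print(revenue(base_version))    # baseline version
print(revenue(upside_version))  # upside version
```

Every new scenario multiplies the data to be managed, which is why the version-control challenge above bites as the number of versions grows.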
Overlays (sometimes referred to as sensitivities) are more purpose-built than versions. Rather than duplicating the entire dataset, they simply store the assumption and/or logic differences unique to a given scenario. This is a much easier way to juggle multiple scenarios simultaneously.
How it works: Let’s assume we are working with a model that has 100 input assumptions. Under versioning, that new scenario would create an additional 100 inputs that would need to be managed. But with an overlay, the system only stores the assumptions you want to change. In most cases, this is just a handful of incremental differences; thus, it is a very efficient way to manage multiple scenarios.
Overlay scenarios are great for running corridor plans or high-low scenarios, where modellers set a baseline assumption for sales and then two additional sales assumptions representing both a high and a low assumption. The model then recalculates three times, providing a corridor of likely outcomes that ripples across the financials.
Callouts: Overlays can be manually built into the logic of a financial model, but purpose-built scenario planning tools often include a Scenario Manager feature designed to handle this process.
Challenges: Overlays require significant in-memory computing. Because only the inputs are stored to disk, the calculations must run in memory. Systems that rely on lengthy batch processes are poorly suited to overlay logic.
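A minimal sketch of the overlay idea, again with illustrative assumption names: each scenario stores only its differences, and every read resolves against the shared baseline, so a baseline change flows through all scenarios automatically. Python's `ChainMap` makes the lookup chain explicit.

```python
from collections import ChainMap

# Baseline assumptions: the single source of truth.
baseline = {"units_sold": 10_000, "price": 25.0, "cogs_pct": 0.60}

# Overlays store only the handful of assumptions that differ per scenario.
overlays = {
    "high": {"price": 27.5},
    "low":  {"price": 22.5},
}

def gross_profit(assumptions):
    revenue = assumptions["units_sold"] * assumptions["price"]
    return revenue * (1 - assumptions["cogs_pct"])

# Resolve each scenario against the baseline at read time and recalculate,
# producing the high-low "corridor" of outcomes described above.
corridor = {
    name: gross_profit(ChainMap(diff, baseline))
    for name, diff in {"base": {}, **overlays}.items()
}
print(corridor)
```

Note that the recalculation happens on the fly rather than against stored copies, which is why overlays demand in-memory computing rather than batch processing.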
Many companies think of scenarios as strategic initiatives. As a result, scenario combinations are a popular option.
How it works: Under this approach, scenarios are additive business opportunities that can be toggled on and off to identify the right mix of initiatives. In this environment, it is more akin to a consolidation of scenarios. Modellers start with a baseline (no initiatives) and tack on discrete initiatives to optimise various financial metrics.
Callouts: This is usually the preferred approach for organisations that want to treat scenarios like another hierarchical dimension (like an entity structure or an account structure).
Challenges: Because this approach is additive, it works best when the scenarios have discrete impacts. Economies of scale are more difficult to model using this approach.
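The toggle-on/toggle-off mechanic can be sketched as follows. The initiative names and profit impacts are purely illustrative; the point is that each initiative carries a discrete, additive impact that is consolidated onto a baseline, and the mix can be searched for the best combination.

```python
from itertools import combinations

# Baseline operating profit with no initiatives switched on.
BASELINE_PROFIT = 1_000.0

# Each strategic initiative carries a discrete, additive profit impact.
initiatives = {"new_store": 250.0, "price_increase": 180.0, "cost_program": -40.0}

def combined_profit(selected):
    """Consolidate the toggled-on initiatives onto the baseline."""
    return BASELINE_PROFIT + sum(initiatives[name] for name in selected)

# Enumerate every on/off mix to identify the optimal combination.
best = max(
    (combo for r in range(len(initiatives) + 1)
     for combo in combinations(initiatives, r)),
    key=combined_profit,
)
print(best, combined_profit(best))
```

Because the impacts simply sum, any interaction between initiatives (such as economies of scale) is invisible to the model, which is the limitation flagged in the Challenges note.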
The last method commonly used in deterministic financial modelling is a timing offset. This can represent a delay in a project, or it can be used to roll out a scalable business model across a repeatable curve. This is often used in the retail industry with new stores or in a catalogue-based business where issuing a new catalogue generally follows an expected income curve over time.
How it works: This type of scenario is modelled along a discrete timeline. The entire timeline can then be shifted. For example, T0 – T12 may represent the time range of the model, but since T is a variable, it can be shifted.
Callouts: This technique is most often used to delay the timing of a scenario or project.
Challenges: Delays in projects can often come at increasing costs, which are difficult to model using this approach.
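The timing-offset mechanic can be sketched as shifting a relative-time curve along an absolute timeline. The income curve below is an invented example in the spirit of the catalogue illustration above.

```python
# Expected income curve for one catalogue issue, indexed by relative time T0..T5.
income_curve = [0.0, 50.0, 120.0, 90.0, 40.0, 10.0]

def shift(curve, offset, horizon):
    """Place a relative-time curve onto an absolute timeline, delayed by `offset` periods."""
    timeline = [0.0] * horizon
    for t, value in enumerate(curve):
        if t + offset < horizon:
            timeline[t + offset] += value
    return timeline

on_time = shift(income_curve, offset=0, horizon=8)
delayed = shift(income_curve, offset=2, horizon=8)  # project slips two periods

# Total value is preserved; only the timing moves. A real cost of delay
# would need extra logic, which is exactly the challenge noted above.
print(on_time)
print(delayed)
```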
Below are two approaches that are widely used for stochastic modelling. It is important to note that the future of stochastic modelling will undoubtedly look much different as big data and computers evolve, but here is a look at what’s being used today.
Not too long ago, one of the best ways to analyse financial models using stochastic modelling techniques was through Monte Carlo analysis.
How it works: This technique runs hundreds if not thousands of trials, drawing assumptions from specified criteria or ranges (a distribution curve). The outputs are then analysed to derive a confidence interval that helps predict a likely range of outcomes.
Callouts: Are you on track, or are you at risk? This is the most common way to analyse a Monte Carlo simulation (de-risking tool).
Challenges: This approach tends to be more difficult for management to actively rally around as the assumptions are based on a range.
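A minimal Monte Carlo sketch using only the Python standard library. The distributions and their parameters are illustrative assumptions; in practice they would be calibrated to historical data.

```python
import random
import statistics

def simulate_profit(rng):
    """One trial: draw each assumption from its distribution."""
    units = rng.gauss(10_000, 1_500)   # demand uncertainty (normal)
    price = rng.uniform(24.0, 26.0)    # pricing range (uniform)
    cogs = rng.gauss(0.60, 0.03)       # cost-of-goods % uncertainty (normal)
    return units * price * (1 - cogs)

rng = random.Random(42)  # fixed seed so the analysis is reproducible
trials = sorted(simulate_profit(rng) for _ in range(10_000))

# Read a ~90% confidence interval off the empirical distribution.
p5, p95 = trials[len(trials) // 20], trials[-(len(trials) // 20)]
print(f"mean profit: {statistics.mean(trials):,.0f}")
print(f"90% interval: {p5:,.0f} to {p95:,.0f}")
```

The interval, rather than a single number, is the output management sees, which illustrates both the de-risking value and the "hard to rally around" challenge noted above.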
The next frontier for stochastic modelling is undoubtedly Machine Learning (ML). ML builds on the techniques used in Monte Carlo analysis, but rather than relying on humans to provide the constraints and algorithms, the machine learns and improves them over time.
How it works: Machine Learning leverages high volumes of data to run trials using algorithms that evolve over time.
Callouts: Although Machine Learning is a relatively new technique for FP&A, it looks to be a game changer for Predictive Analytics.
Challenges: Artificial Intelligence/Machine Learning (AI/ML) run best on larger datasets. When data availability is low or non-existent, AI/ML is not highly effective.
Whether we are performing deterministic or stochastic modelling, we must understand the techniques that lurk deep within our models.
To optimise FP&A’s ability to run meaningful scenarios, we need to recognise a common tool kit of options that provide flexibility and speed.
Recognising what mathematical techniques underpin the various scenarios we run will not only bring more validity to our scenarios but also provide transparency to our colleagues as we try to build consensus around the what-if.
This article was first published on the D!gitalist Magazine blog.