The Evolution of Scenario Planning

By Michael J. Huthwaite, Founder and CEO of FinanceSeer LLC 

We live in a world of uncertainty. But in that uncertainty lies a great deal of opportunity for organizations capable of successfully executing a winning plan.

To help navigate this challenge, we often turn to scenario planning to guide us. Yet, despite all the advances in technology, scenario planning itself remains stuck in the past.

Variance analysis, corridor planning, checkerboard diagrams and various other forms of scenario planning are just a few of the techniques routinely discussed as ways in which organizations can evaluate future business outcomes.

However, before FP&A can begin to master any of these techniques, it is important that we understand where we are in the functional evolution of scenario planning and where we need to go in order to make scenario planning a truly powerful force for combating uncertainty.

Modeling for Uncertainty

Models essentially come in two distinct forms. 

Stochastic models deal with probability-based algorithms such as regression analysis, Monte Carlo simulation or even machine learning (AI).

The other form, deterministic models, deals with specific assumptions or driver-based inputs.

Stochastic or probabilistic modeling works best when you are dealing with constraints such as time (short-term planning) or limited resources (the number of salespeople in the field). These natural limitations narrow the range of potential outcomes and therefore make it practical to think in terms of probabilities.

On the other hand, deterministic modeling doesn't necessarily face those same constraints. In fact, strategy modeling is often about expanding boundaries such as time horizons and funding limitations in order to evaluate growth opportunities.

With this in mind, I'm choosing to focus this discussion on deterministic scenario models only.

The Evolution

Save As (The Spreadsheet Answer)

Spreadsheets are the ultimate personal productivity tool.  We all have them and we all use them.  So, we’re all familiar with running scenarios this way. 

The process starts off simply enough. You change a few assumptions, evaluate the results and then click "Save As" to record your changes.

However, if you plan on collaborating with other team members or combining scenarios, then you're in for an error-prone, uphill battle. That's because "Save As" is essentially a technique for taking scenarios "offline".

Versioning

Over the last 20+ years, companies have begun to shift over to Enterprise Performance Management (EPM) solutions to help streamline their financial planning capabilities. 

EPM solutions (which often rely on OLAP or hybrid OLAP technologies) use "Versioning" as their primary form of scenario management. Versioning works especially well for tactical planning where the focus is on variance analysis (e.g. "Plan to Budget", "Budget to Latest Forecast" and "Latest Forecast to Actuals").
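As a rough illustration (purely hypothetical account names and numbers, not any particular EPM product), version-to-version variance analysis boils down to line-by-line differences between two complete datasets:

```python
# Hypothetical budget and latest-forecast versions; each version is a
# complete dataset in its own right.
budget = {"Revenue": 1_000, "COGS": -600, "Opex": -250}
latest_forecast = {"Revenue": 1_050, "COGS": -630, "Opex": -255}

def variance(version_a, version_b):
    """Line-by-line variance between two versions (version_b minus version_a)."""
    return {account: version_b[account] - version_a[account] for account in version_a}

print(variance(budget, latest_forecast))
# {'Revenue': 50, 'COGS': -30, 'Opex': -5}
```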

Unlike scenarios performed in spreadsheets, Versioning remains an online activity, which enables everyone to work together on a given scenario. 

Yet, despite the online benefits of Versioning, it also has some significant limitations. Perhaps the most important is its inability to address meaningful strategic scenarios.

In strategy, scenario analysis is all about managing multiple views of the future so that organizations can understand which scenarios could potentially be combined to establish an optimal strategy.

This is difficult to do in Versioning as each scenario represents a complete standalone dataset.  Trying to maintain multiple datasets at the same time is a nightmare.  Furthermore, because each version is a complete dataset, it’s difficult to combine or merge different datasets together to settle on an optimal plan. 

Sensitivity Scenarios

One way to begin to address scenario planning in a more strategic way is via Sensitivity scenarios. 

Sensitivity scenarios (sometimes called Overlays) work by allowing users to store alternative inputs that essentially override the underlying or initial assumption in the model. 

I like to think of them as clear transparencies laid over an underlying dataset (often called a "Baseline"), enabling the user to simply recalculate the model on the fly (using in-memory technology).

This approach is far more efficient than Versioning: there is no need to maintain duplicate datasets, because all you store is the handful of assumptions that differ from the original dataset (the baseline).
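To make the idea concrete, here is a minimal sketch (in Python, with made-up drivers, and not any vendor's actual engine) of an overlay that stores only the changed assumptions and is applied to the baseline at recalculation time:

```python
# Hypothetical baseline assumptions and a Sensitivity overlay that
# only stores the inputs that differ from the baseline.
baseline = {"volume": 10_000, "price": 12.0, "unit_cost": 7.5}
price_cut = {"price": 11.0}  # the overlay: a single changed assumption

def recalculate(assumptions):
    """Toy driver-based model: revenue, cost and margin from three inputs."""
    revenue = assumptions["volume"] * assumptions["price"]
    cost = assumptions["volume"] * assumptions["unit_cost"]
    return {"revenue": revenue, "cost": cost, "margin": revenue - cost}

# Apply the overlay on the fly: baseline values are kept unless overridden.
scenario_result = recalculate({**baseline, **price_cut})
baseline_result = recalculate(baseline)
```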

Because Sensitivity scenarios are easy to create and maintain, they are often used for evaluating multiple concurrent scenarios (a must for strategic modeling).  Modelers can easily create corridor or contingency plans that stay relevant throughout the entire planning process, giving modelers multiple views of the future, rather than a single version of the truth. 

In addition to maintaining multiple scenarios at the same time, sensitivity scenarios also enable users to evaluate combined scenarios (two or more scenarios occurring at the same time).  This works by simply layering Sensitivity scenarios on top of each other.  

Sensitivity scenarios are amazingly simple and work well in a true in-memory modeling engine, but they also have limitations.

Combining Sensitivity scenarios works best when the assumption overlays are unique (no conflicting intersections). Generally, the only way to resolve a conflict is to let the user choose which overlay takes precedence (often handled by ordering sensitivities so that priority determines which assumption wins).
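Continuing the sketch above, layering several overlays with a priority order can be as simple as applying them in sequence, so the highest-priority (last-applied) overlay wins any conflicting intersection:

```python
# Same hypothetical baseline as before; overlays are listed in priority order,
# with later entries winning any conflicting assumption.
baseline = {"volume": 10_000, "price": 12.0, "unit_cost": 7.5}
price_cut = {"price": 11.0}
volume_push = {"volume": 12_000, "price": 11.5}  # conflicts with price_cut on "price"

def combine_overlays(baseline, overlays):
    """Layer overlays onto the baseline; the last overlay applied wins a conflict."""
    combined = dict(baseline)
    for overlay in overlays:
        combined.update(overlay)
    return combined

# volume_push is applied last, so its price of 11.5 overrides price_cut's 11.0.
combined_assumptions = combine_overlays(baseline, [price_cut, volume_push])
```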

Impact Scenarios

To get the most out of scenario planning, organizations must also be able to understand the impact of a scenario as it relates to an overall strategy.  This is difficult to do with Sensitivity scenarios as we are simply swapping out assumptions and recalculating the model. 

This is where Impact Scenarios come into play.  They look and feel like Sensitivity scenarios, but they take the process one step further by calculating the financial impact.    

This enables us to visualize the impact of scenario planning in a powerful way.

Normally, we are used to viewing financial data across time or across entity, but with Impact Scenarios, users can also pivot the data by scenario contribution, where the "baseline" represents the default outcome and each additional scenario's contribution is added on top.

Because each scenario is captured and evaluated incrementally, users are free to include or exclude scenarios, or even delay ("time shift") a scenario by several months, to arrive at an optimal strategic plan of action.

Impact Scenarios are created in much the same way as Sensitivity scenarios, but rather than simply overlaying the assumptions and recalculating, the engine recalculates each scenario separately against the baseline and then stores the delta impact of that scenario.
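As a hedged sketch of that idea (toy monthly numbers, not a real planning engine), each Impact Scenario can be stored as the delta it produces against the baseline, which makes it trivial to include, exclude or time-shift scenarios when assembling a plan:

```python
# Hypothetical monthly baseline margin and per-scenario delta impacts.
baseline_margin = [100, 100, 100, 100, 100, 100]   # six months
impact_price_cut = [-10, -10, -10, -10, -10, -10]   # delta vs. baseline
impact_new_product = [0, 0, 15, 20, 25, 30]         # ramps up over time

def time_shift(impact, months):
    """Delay a scenario's impact by pushing its deltas back by a few periods."""
    return [0] * months + impact[: len(impact) - months]

def combine(baseline, impacts):
    """Add the selected scenario deltas on top of the baseline, period by period."""
    return [base + sum(deltas) for base, *deltas in zip(baseline, *impacts)]

# Include both scenarios, but delay the new product launch by two months.
plan = combine(baseline_margin, [impact_price_cut, time_shift(impact_new_product, 2)])
```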

Not only is this analytically superior to Sensitivity scenarios, it also overcomes the problem of conflicting or overlapping assumptions from scenario to scenario.

Impact Scenarios are ideal for evaluating checkerboard style strategies where different initiatives can be combined under different risk profiles (the optimal way to plan in a world of uncertainty).  

Conclusion

In a world of uncertainty, scenario planning is the key to evaluating what is possible. The best strategies are rarely achieved by constructing a single scenario plan. Rather, the best strategies are the ones that enable peers to take action based on a combination of quantifiable scenarios.

Putting your organization in a position to measure the impact of alternative scenarios is the key to achieving long-term success. But in order to get there, we need to continue to evolve our capabilities for meaningful scenario planning.

This article was first published on the Prevero Blog.

 
