Forecasting: Should the Mean APE Rule the Accuracy Planet?
February 27, 2018

By Hans Levenbach, PhD, CPDF, author of Change & Chance Embraced

FP&A Tags
Modelling and Forecasting
Forecasting Quality
Financial Planning and Analysis

Planners and managers in supply chain organizations are accustomed to using the Mean Absolute Percentage Error (MAPE) as their best (and sometimes only) answer to measuring forecast accuracy. It is so ubiquitous that it is hardly questioned. Yet among practitioners who participate in forecasting workshops around the world, I do not even find a consensus on the definition of forecast error: for most, Actual (A) minus Forecast (F) is the forecast error; for others, it is just the opposite.

Among practitioners, it is a jungle out there trying to understand the role of the APEs in the measurement of forecast accuracy. Forecast accuracy is commonly measured and reported by just the Mean Absolute Percentage Error (MAPE), which comes out the same no matter which definition of forecast error one uses, because the absolute value removes the sign.
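
As a quick check, here is a minimal Python sketch with made-up actuals and forecasts; it simply confirms that the MAPE is identical under either error convention:

    # Made-up actuals and forecasts for four periods.
    actuals   = [100, 120,  90, 110]
    forecasts = [ 95, 130, 100, 105]

    def mape(errors, actuals):
        """Mean Absolute Percentage Error, in percent."""
        return 100 * sum(abs(e) / abs(a) for e, a in zip(errors, actuals)) / len(actuals)

    errors_a_minus_f = [a - f for a, f in zip(actuals, forecasts)]   # A - F convention
    errors_f_minus_a = [f - a for a, f in zip(actuals, forecasts)]   # F - A convention

    print(mape(errors_a_minus_f, actuals))   # same value...
    print(mape(errors_f_minus_a, actuals))   # ...either way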

Bias is the other component of accuracy, but it is not consistently defined either, because it inherits the same sign-convention problem. If bias is the average difference between Actual and Forecast, what should the sign be of a reported underforecast or overforecast? Who is right, and why?
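
To make the sign question concrete, take a hypothetical single period where the actual is 110 and the forecast is 100, i.e. an underforecast:

    # Hypothetical one-period example: an underforecast of 10 units.
    actual, forecast = 110, 100

    error_a_minus_f = actual - forecast   # +10 under the A - F convention
    error_f_minus_a = forecast - actual   # -10 under the F - A convention

    # The same underforecast carries opposite signs under the two conventions,
    # which is why the convention must be stated whenever bias is reported.
    print(error_a_minus_f, error_f_minus_a)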

Outliers in forecast errors and other sources of unusual data values should never be ignored in the accuracy measurement process. For a measurement of bias, for example, the mean forecast error ME (the arithmetic mean of Actual (A) minus Forecast (F)) is pulled towards the outlier, so an otherwise unbiased pattern of performance can be distorted by just a single unusual value.
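
A small illustration with invented forecast errors shows the effect: a single outlier drags the mean error (ME) well away from zero, while a resistant summary such as the median barely moves.

    from statistics import mean, median

    # Invented forecast errors (A - F): six small errors around zero plus one outlier.
    errors = [2, -3, 1, -2, 3, -1, 40]

    print(mean(errors))    # about 5.7 -- dragged toward the outlier
    print(median(errors))  # 1 -- essentially unaffected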

When an outlier-resistant measure is close to the conventional measure, report the conventional measure. If not, check the APEs for anything that appears unusual, and then work with domain experts to find a credible rationale (stockouts, weather, strikes, etc.).
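
A sketch of that screening step, using hypothetical APE values and an arbitrary working threshold (both are assumptions for illustration, not rules from the article): compare the MAPE with a resistant alternative such as the Median APE, and when they disagree noticeably, pull out the largest APEs for review with domain experts.

    from statistics import mean, median

    # Hypothetical absolute percentage errors (in percent) for one item.
    apes = [4.0, 6.5, 5.2, 3.8, 7.1, 48.0, 5.9]

    mape_value  = mean(apes)     # 11.5 -- inflated by the single large APE
    mdape_value = median(apes)   #  5.9 -- close to the bulk of the APEs

    # If the two disagree by more than some working threshold (a judgment call),
    # list the largest APEs and look for a credible cause: stockouts, weather,
    # strikes, promotions, and so on.
    if abs(mape_value - mdape_value) > 0.25 * mdape_value:
        suspects = sorted(apes, reverse=True)[:3]
        print("Review these APEs with domain experts:", suspects)
    else:
        print("Report the conventional MAPE:", round(mape_value, 1))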

Are There More Reliable Measures Than the MAPE?  

The M-estimation method, introduced in Chapter 2 of my new book, can be used to automatically reduce the effect of outliers by appropriately down-weighting values 'far away' from a typical MAPE. The method is based on an estimator that makes repeated use of the underlying data in an iterative procedure. In the case of the MAPE, a family of robust estimators, called M-estimators, is obtained by minimizing a specified function of the absolute percentage errors (APE). Alternate forms of the function produce the various M-estimators. Generally, the estimates are computed by iterated weighted least squares.
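
The iterative idea can be sketched in a few lines of Python. This is a generic illustration, not the book's code: a 'typical' APE is re-estimated by iterated weighted averaging (weighted least squares for a single location parameter), with a Huber-type weight function and a median-absolute-deviation scale assumed for concreteness.

    from statistics import median

    def huber_weight(u, k=1.345):
        # Huber scheme (assumed tuning constant): full weight near the centre,
        # proportionally down-weighted beyond k.
        return 1.0 if abs(u) <= k else k / abs(u)

    def typical_ape(apes, weight_fn=huber_weight, iterations=3):
        """M-estimate of a 'typical' APE by iterated weighted averaging."""
        m = median(apes)                                    # resistant starting point
        for _ in range(iterations):
            s = median(abs(x - m) for x in apes) or 1e-9    # resistant scale (MAD)
            w = [weight_fn((x - m) / s) for x in apes]      # down-weight far-away APEs
            m = sum(wi * x for wi, x in zip(w, apes)) / sum(w)
        return m

    apes = [4.0, 6.5, 5.2, 3.8, 7.1, 48.0, 5.9]             # made-up APEs with one outlier
    print(typical_ape(apes))                                 # well below the 11.5 of the plain MAPE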

It is worth noting that the Bisquare weighting scheme is more severe than the Huber weighting scheme. In the bisquare scheme, all data for which |e_i| ≤ Ks will have a weight less than 1. Data having weights greater than 0.9 are not considered extreme. Data with weights less than 0.5 are regarded as extreme, and data with zero weight are, of course, ignored. To counteract the impact of outliers, the bisquare estimator gives zero weight to data whose forecast errors are quite far from zero.
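
A short comparison of the two weighting schemes makes the 'more severe' point visible. The tuning constants below (1.345 for Huber, 4.685 for bisquare) are commonly used defaults assumed for illustration, not values taken from the book; u is the forecast error scaled by a spread estimate.

    def huber_weight(u, k=1.345):
        """Huber scheme: full weight within k, then proportional down-weighting."""
        return 1.0 if abs(u) <= k else k / abs(u)

    def bisquare_weight(u, K=4.685):
        """Bisquare (Tukey biweight) scheme: smooth taper to zero beyond K."""
        return (1 - (u / K) ** 2) ** 2 if abs(u) <= K else 0.0

    # Scaled errors from well-behaved to extreme.
    for u in [0.0, 1.0, 2.0, 4.0, 6.0, 10.0]:
        hw, bw = huber_weight(u), bisquare_weight(u)
        flag = "extreme" if bw < 0.5 else ("not extreme" if bw > 0.9 else "borderline")
        print(f"u = {u:5.1f}   Huber = {hw:.2f}   bisquare = {bw:.2f}   ({flag})")

The Huber weight never reaches zero, so every observation keeps some influence; the bisquare weight drops to exactly zero for scaled errors beyond its cutoff, which is what lets it ignore extreme APEs entirely.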

What we need, for best practice, are robust/resistant procedures: resistant to outlying values and robust against non-normal characteristics in the data distribution, so that they give rise to estimates that are more reliable and credible than those based on normality assumptions.

Taking a data-driven approach with the APE data to measure precision, we can create more useful TAPE (Typical APE) measures. Start with the Median APE (MdAPE) for the first iteration, then use the Huber scheme for the next iteration, and finish with one or two more iterations of the Bisquare scheme. The resulting Huber-Bisquare-Bisquare Typical APE (HBB TAPE) measure has worked quite well for me in practice and can be readily automated, even in a spreadsheet. It is worth testing with your own data to convince yourself whether the Mean APE should remain King of the accuracy jungle!
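
A minimal sketch of that recipe, under the same assumptions as the earlier snippets (iterated weighted averaging, a median-absolute-deviation scale, and commonly used tuning constants; an illustration, not the book's implementation): start from the MdAPE, apply one Huber pass, then finish with two bisquare passes.

    from statistics import median

    def huber_weight(u, k=1.345):
        return 1.0 if abs(u) <= k else k / abs(u)

    def bisquare_weight(u, K=4.685):
        return (1 - (u / K) ** 2) ** 2 if abs(u) <= K else 0.0

    def hbb_tape(apes):
        """Typical APE: MdAPE start, one Huber pass, two bisquare passes."""
        m = median(apes)                                    # iteration 0: the MdAPE
        for weight_fn in (huber_weight, bisquare_weight, bisquare_weight):
            s = median(abs(x - m) for x in apes) or 1e-9    # resistant scale (MAD)
            w = [weight_fn((x - m) / s) for x in apes]
            m = sum(wi * x for wi, x in zip(w, apes)) / sum(w)
        return m

    apes = [4.0, 6.5, 5.2, 3.8, 7.1, 48.0, 5.9]             # made-up APEs with one outlier
    print(f"MAPE = {sum(apes) / len(apes):.1f}%")            # 11.5% -- pulled up by the outlier
    print(f"HBB TAPE = {hbb_tape(apes):.1f}%")               # about 5.4% -- close to the bulk

The same loop translates directly to spreadsheet formulas: one column of weights per iteration and a weighted average at the bottom.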

Details may be found in Chapter 4 of Change & Chance Embraced: Achieving Agility with Demand Forecasting in the Supply Chain.
