This article explores the attributes of Big Data and considers whether having more data is always good. First, let's get back to basics. What is Big Data? Put simply, Big Data refers to the vast variety of data with which an organisation can engage. The aim is to use Big Data to improve business performance. This applies to both profit and non-profit organisations, and to financial and operational measures alike.
Big Data has various attributes, categorised into volume, veracity, variety, velocity and value. Volume refers to the quantity of data. Veracity is the quality of the data in terms of accuracy. Variety refers to the diversity of data. Velocity denotes how fast the data can be processed and made available for use. Value is the insight the data can provide.
We are inclined to believe that the more data we have, the better the insights we can generate. However, can we have too much data? I venture to say an organisation can be overwhelmed by data if it is not properly structured or lacks a purpose. The key here is the analytical ability to derive insights from a vast quantity of quality data in a timely fashion. Let us look at each point.
Accuracy is key to driving good insight into the business. In a trade-off amongst the various characteristics of data, accuracy wins hands down. Data should never be assumed accurate at face value. Some form of validation should take place to provide confidence. Accountants will be familiar with the "reasonableness" test. Does the data appear reasonable?
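A reasonableness test of this kind can be automated. The sketch below is a minimal illustration, not a prescribed method: the sales figures, field names and 50% tolerance threshold are all invented for the example.

```python
# A minimal sketch of a "reasonableness" test: flag a new figure that
# deviates sharply from its own history. Thresholds and figures here
# are illustrative assumptions only.

def is_reasonable(value, history, tolerance=0.5):
    """Return True if `value` is within `tolerance` (50% by default)
    of the historical average."""
    if not history:
        return True  # nothing to compare against yet
    avg = sum(history) / len(history)
    if avg == 0:
        return value == 0
    return abs(value - avg) / abs(avg) <= tolerance

monthly_sales = [102_000, 98_500, 101_200, 99_800]
print(is_reasonable(100_500, monthly_sales))    # in line with history
print(is_reasonable(1_005_000, monthly_sales))  # likely a data-entry error
```

In practice the tolerance would be calibrated to the volatility of the measure in question; the point is simply that validation can be a routine, automated step rather than an afterthought.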
In addition to accuracy, we also need to consider the suitability of the data. To determine suitability, we first need to answer the question: what are we trying to achieve here? Or, what is our hypothesis? Data can only be considered good quality if it helps answer an intended question, or prove or disprove a hypothesis.
Data alone cannot drive business improvements. An organisation needs to be able to draw insights from the data collected. This normally falls within the remit of the FP&A team. In some organisations, there are specific teams to investigate specific characteristics of the data. Data architects help organisations manage data in a structured manner. Data analysts extract knowledge and insight from the data. A word of caution: one should not fall into the trap of drawing conclusions or correlations from data simply because they are expected. The analysis should set out to prove or disprove the hypothesis. The ability to do so comes with experience and knowledge sharing with other parts of the organisation.
Data quality deteriorates with time. Big Data is generally used to drive forward-looking projections. An agile organisation strives to respond to the ever-changing environment with the data it can access. Insights drawn from outdated material can lead to non-actionable conclusions. Not all data needs to be real-time. True real-time refers to a continuous stream of data, which is useful for, say, traffic management. Data can also be "right time", i.e. data available at the time when it is needed to allow an organisation to take corrective action. For example, real-time Christmas toy sales data does not necessarily help a toy store's sales plan, because the store has already stocked up for the season. However, this data made available at planning time (usually in the early part of the calendar year), together with other drivers, can improve the quality of the next Christmas sales projection and stock order.
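The real-time versus "right time" distinction can be expressed as a simple freshness check: each dataset carries a window within which it is still useful for its purpose. The dataset names and windows below are illustrative assumptions, not a standard.

```python
# A minimal sketch of a "right time" check: data need not be real-time,
# only fresh enough for the decision at hand. Dataset names and
# freshness windows are illustrative assumptions.
from datetime import datetime, timedelta

FRESHNESS = {
    "traffic_feed": timedelta(minutes=1),    # a true real-time need
    "christmas_sales": timedelta(days=365),  # needed at annual planning
}

def is_right_time(dataset, captured_at, now=None):
    """Return True if the data is still fresh enough for its purpose."""
    now = now or datetime.now()
    return now - captured_at <= FRESHNESS[dataset]

planning_date = datetime(2024, 1, 15)
# Last season's sales are still useful at planning time...
print(is_right_time("christmas_sales", datetime(2023, 12, 25), planning_date))
# ...but a five-minute-old traffic reading is already stale.
print(is_right_time("traffic_feed", planning_date - timedelta(minutes=5), planning_date))
```

The design point is that "freshness" is a property of the decision, not of the data: the same capture date can be stale for one purpose and perfectly timely for another.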
Finally, as Francois Badenhorst pointed out in his recent blog on Accountingweb, it's not data for data's sake. Data is the most valuable asset. FP&A should take an interest in, and be involved in, defining data capture requirements and structure. This enables FP&A to better equip an organisation to manage its performance.
Any views or opinions expressed are solely those of the author and do not necessarily represent those of The Warranty Group. The Warranty Group is part of Assurant.
This article was first published on the Unit 4 Prevero Blog.