Unless you have been shut off from the outside world in recent times you are probably aware that big data is one of the current flavours of the month in business. As an I/O psychologist, I’m particularly interested in how this concept of big data is shaping thinking about people problems in companies. Indeed, a common request made to OPRA, whether in Australia, New Zealand or Singapore, is for help with supposed big data projects. The irony is that many of these requests are neither primarily about data nor do they involve big data sets. Rather, the proliferation of talk about big data has made companies realise that they need to start incorporating data into their people decisions.
Big data itself is nothing new. OPRA were involved in what could be described, in a New Zealand context, as a big data project in the 1990s: an attempt to predict future unemployment from, among other variables, psychological data in order to help formulate policy on government assistance. What is new is the technology that has made this type of study far more accessible, the requirement for evidence-based HR decisions, and the natural evolution of people analytics into a core part of HR.
While data has value in solving organisational problems, it is not a solution in and of itself. This point is lost on many clients who assume that simply having data means the solution to problems such as retention, selection success and training evaluation will somehow reveal itself. As captured in the title of this blog post, data is not the solution. Rather, data is one ingredient in better understanding the problem being faced. However, to solve problems with data we must first know how to use that ingredient correctly to inform decisions.
- Define what you want to cook and whether the meal is worth eating: Start with a research paradigm and define the problem you want to assess. Before we can begin to chuck data at a problem we first need to identify the problem. A classic research paradigm helps here: at a basic level, think about the problem you wish to solve, map the antecedents and consequences of that problem, and generate some hypotheses to be tested.
- Make sure you have all your ingredients: Having mapped the problem, you then need to check which variables you have data for and which are missing. Where are the gaps in your understanding of the problem? Where are you currently missing the data needed to solve it? These are the types of questions you need to be able to answer before starting any analysis. Where there is a gap, look at how to collect the data so that there is no glaring omission from your analysis.
- Make sure your ingredients are ready for processing: Not all data is equal. Before you can even begin working with your data you need to make sure it is fit to be analysed. Many statistical operations, for example, assume that your data is normally distributed and that you will be able to differentiate people against this model. This is often not the case; a common example is performance data, which is often positively skewed. Making your data fit for purpose is vital before you begin willy-nilly chucking statistical operations at it in the hope of finding something conclusive (the first sketch after this list shows a basic check of this kind).
- A simple dish is often most easily consumed: In previous posts we have discussed the idea that the best solutions to problems are often the simplest. There are levels of sophistication that can be applied to data analysis, but this does not mean we must always adopt the most complicated analysis. On the contrary, the purpose remains to solve a problem, and this can often be achieved using qualitative and quantitative techniques to ultimately tell a story. As reiterated throughout this blog post, data is but an ingredient. Look first at simple techniques for telling that story, such as graphing, simple descriptive and inferential statistics, and simple multivariate models (the second sketch after this list shows how far these basics can take you). Never forget that the purpose is not to be blinded by statistics but to use statistics to see more clearly.
- The proof is in the pudding: Doing the analysis is one thing, but the findings need to be evaluated. Evaluation is far more than a measure of statistical significance; it is about the practical significance of the findings. Will the findings have an impact on the organisation? Is the difference between these two divisions large enough to really make a difference? Even if this intervention worked, is there a cheaper way of getting to the same outcome? These questions will not be answered by statistics alone; they require an evaluative framework, such as the Key Evaluation Checklist, to turn the data into decisions (the final sketch after this list shows one simple gauge of practical significance).
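To make the cooking metaphor a little more concrete, here is a minimal sketch of the kind of readiness check described above. It assumes, purely for illustration, that performance ratings sit in a pandas DataFrame with a column called `performance_rating`; the data and the column name are invented, not drawn from any real project.

```python
import pandas as pd
from scipy import stats

# Illustrative data only: a handful of annual performance ratings
df = pd.DataFrame({"performance_rating": [2.9, 3.1, 3.2, 3.4, 3.5, 3.6, 4.1, 4.4, 4.8, 5.0]})

ratings = df["performance_rating"].dropna()  # remove missing values before testing

# How far does the distribution depart from symmetry?
print("Skewness:", stats.skew(ratings))

# Shapiro-Wilk test of normality (suited to small samples; p < .05 suggests non-normality)
w, p = stats.shapiro(ratings)
print(f"Shapiro-Wilk: W = {w:.3f}, p = {p:.3f}")
```

If the ratings turn out to be heavily skewed, a transformation or a non-parametric technique may be a better choice than pressing ahead with methods that assume normality.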
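In the same spirit of keeping the dish simple, the next sketch compares engagement scores for two hypothetical divisions using nothing more than descriptive statistics, a basic chart and a t-test. Again, the division names and scores are made up for illustration.

```python
import pandas as pd
from scipy import stats
import matplotlib.pyplot as plt

# Invented engagement scores for two divisions
sales = pd.Series([3.1, 3.4, 3.6, 3.8, 4.0, 4.1, 4.3], name="Sales")
operations = pd.Series([2.8, 3.0, 3.2, 3.3, 3.5, 3.6, 3.9], name="Operations")

# Descriptive statistics and a simple picture often tell most of the story
print(sales.describe())
print(operations.describe())

plt.hist([sales, operations], label=["Sales", "Operations"])
plt.xlabel("Engagement score")
plt.legend()
plt.show()

# A simple inferential check: are the division means reliably different?
t, p = stats.ttest_ind(sales, operations)
print(f"t = {t:.2f}, p = {p:.3f}")
```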
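Finally, statistical significance alone does not answer the "does it really matter?" question. One common, if rough, gauge of practical significance is a standardised effect size such as Cohen's d, sketched below using the same invented division scores.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Standardised mean difference (Cohen's d) between two groups."""
    a, b = np.asarray(group_a, dtype=float), np.asarray(group_b, dtype=float)
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

d = cohens_d([3.1, 3.4, 3.6, 3.8, 4.0, 4.1, 4.3],
             [2.8, 3.0, 3.2, 3.3, 3.5, 3.6, 3.9])
print(f"Cohen's d = {d:.2f}")  # rough benchmarks: ~0.2 small, ~0.5 medium, ~0.8 large
```

An effect size still has to be weighed against cost and context, which is exactly where an evaluative framework such as the Key Evaluation Checklist earns its keep.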
The skill of working with data is now a core part of the strategic HR professional’s toolkit. Not surprisingly, this model of working, from understanding the problem through to evaluation, is central to the OPRA Consulting model. While starting to work with data, and big data in particular, can be daunting at first, that fear is alleviated once some basic fundamentals are understood. As with cooking, you may still need a qualified chef to make sure everything is on track. However, once you have grasped the fundamentals there is a whole raft of dishes you can cook for yourself. At worst, you will develop enough of a palate to know what to order and to appreciate the end product.