One of the articles of faith of the analytics movement is that we should move towards predictive analytics. There is a lot of appeal in the idea that we should ditch backward-looking measures for forward-looking ones. “You can’t drive by looking in the rear-view mirror”, we might quip with a smile.
The first hint that there might be something wrong with predictive analytics is when you go to a conference and hear three or four presenters mentioning the concept. The first speaks about how important predictive analytics is and gives the example of predicting flight risk. The second is a case study where they implemented predictive analytics on flight risk. The third is a vendor explaining how they have built-in predictive analytics, in this case predicting flight risk.
Whether we are at a conference or in the workplace, we should be attuned to clues that subtly disturb us, even when we fundamentally agree with the point being made. Yes, we support the idea of predictive analytics, yet we can’t help but be troubled that we keep hearing the same single example. We have a clue that there may be something wrong with the notion of predictive analytics. Let’s see if we can figure out what it might be.
Examples of predictive analytics
Let’s imagine that engagement scores have gone down three years in a row. The analyst points out, “With this trend, engagement will be below 60% in a year or two.” That is clearly predictive analytics, since we are predicting something. However (and here’s another clue that something is amiss), it doesn’t feel the way we expect predictive analytics to feel. We are expecting a sophisticated model such as we saw in the flight risk example, not a self-evident comment on trends.
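To see just how self-evident that kind of prediction is, here is a minimal sketch in Python. The engagement scores are made-up figures for illustration, not data from any real organization; the “model” is nothing more than a straight line fitted to three points.

```python
# A minimal sketch of the engagement "prediction" above: a plain
# linear trend fitted to three hypothetical annual scores.
import numpy as np

years = np.array([2021, 2022, 2023])
engagement = np.array([0.68, 0.65, 0.62])  # three years of decline (made-up)

slope, intercept = np.polyfit(years, engagement, 1)
for year in (2024, 2025):
    print(f"{year}: projected engagement {slope * year + intercept:.0%}")
# 2024: projected engagement 59% -- "below 60% in a year or two"
# 2025: projected engagement 56%
```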
Another example of predictive analytics is workforce planning. We predict that we’ll need, for example, 50 more programmers next year. That feels a little better than the engagement trends example, since workforce planning takes more sophistication. In this case, however, the discomfort is that predictive analytics turns out to be something we’ve been doing for 50 years, not a breakthrough that represents the future of analytics.
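To make the point concrete, here is a minimal sketch of the kind of arithmetic that has sat behind such a forecast for decades. All the figures (headcount, growth rate, attrition rate) are hypothetical assumptions.

```python
# A minimal sketch of classic workforce-planning arithmetic,
# using hypothetical figures.
current_programmers = 200
planned_growth_rate = 0.15      # business plans to grow the team by 15%
expected_attrition_rate = 0.10  # historical attrition, assumed to hold

growth_hires = current_programmers * planned_growth_rate           # 30
replacement_hires = current_programmers * expected_attrition_rate  # 20
print(f"Programmers needed next year: {growth_hires + replacement_hires:.0f}")
# Programmers needed next year: 50
```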
It’s at this point that the recruiting people chime in and ask, “We use assessment tests. Does that count as predictive analytics?” It’s hard to say that it doesn’t; they are using a test to predict performance. Sadly, just like workforce planning, if assessment is an example of predictive analytics, then it’s a decades-old approach.
The issue boils down to whether pursuing predictive analytics as the next phase in analytics (which many people suggest is the right thing to do) will lead anywhere. Or will we discover that predictive analytics isn’t the real deal after all?
Why we thought we needed predictive analytics
The reason we wanted to move from descriptive analytics to predictive analytics wasn’t, at heart, that we wanted to be forward-looking. The underlying reason is that our descriptive analytics were of little value: there was no “So what?” to them, and managers quickly lost interest. We needed something new.
Searching for a better approach, we hypothesized that the problem was that our descriptive analytics were backward-looking measures, and that hypothesis led to the article of faith that predictive analytics was the next step on the maturity curve.
I have a different hypothesis for why descriptive analytics often fail to add value: they provide data that doesn’t answer a specific, important question. Let’s consider a typical example of how valuable descriptive analytics can be when we start with a question, compared to how little value they have when we don’t.
For this example, let’s use a report on turnover. That report contains backward-looking data, and we can imagine a tedious presentation where HR takes leadership through slide after slide of it: “And now, on slide 72, we see the turnover of left-handed cashiers in the Eastern Region.” It’s easy to understand why the audience would beg for a new approach to end the tedium, and if predictive analytics is mentioned, they’ll latch on to the idea.
Now, imagine that the project had started with a question. A business unit head says, “I want to invest in turnover reduction this year. Can you show me the three branches where turnover is the highest? We’ll start there.” The answer that comes back is descriptive (last year’s turnover in each branch, sorted from high to low), but it is exactly what the business wants. No one asks “So what?” of this backward-looking descriptive data, because it addresses a specific business need.
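As a sketch of how simple that relevant, descriptive answer can be, here is what the query might look like in Python with pandas. The branch names and turnover rates are hypothetical.

```python
# A minimal sketch of the descriptive answer, using made-up branch data.
import pandas as pd

turnover = pd.DataFrame({
    "branch": ["North", "South", "East", "West", "Central"],
    "turnover_rate": [0.12, 0.31, 0.19, 0.27, 0.08],  # last year's rates
})

# Last year's turnover per branch, sorted high to low; the top three
# branches are where the business unit head will invest first.
top_three = turnover.sort_values("turnover_rate", ascending=False).head(3)
print(top_three.to_string(index=False))
```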
We don’t need predictive analytics, we need relevant analytics
The next step in the maturity of people analytics is to provide relevant analytics. Those analytics may be descriptive or they may be predictive; we don’t really care. In fact, we care so little that it’s unlikely we will even bother using those terms.
There is a sweet irony in the fact that the prime example of predictive analytics is flight risk. In my experience, many organizations say that, as impressive as the flight-risk algorithms are, they can’t use the results because they don’t trust managers with information on who is likely to leave. Predictive analytics are no better than descriptive ones if they don’t answer a question that leads to an action.
The key to relevance is starting with a specific, important question that the business needs answered. Data that answers that question is valuable; other data is not. It’s time to retire the idea of predictive analytics; it’s not a useful goal. Look for relevant analytics instead.