AI and People Analytics have taken off.
As I’ve written about in the past, the workplace has become a highly instrumented place. Companies use surveys and feedback tools to get our opinions, new tools monitor emails and our networks of communication (organizational network analysis, or ONA), we capture data on travel, location, and mobility, and organizations now have data on our wellbeing, fitness, and health.
And added to this is a new stream of data: video (every video conference can be recorded, and more than 40% of job interviews are recorded), audio (meeting-recording tools can sense mood), and facial recognition that identifies us wherever we are.
In the early days of HR analytics, companies captured employee data to measure span of control, the distribution of performance ratings, succession pipeline, and other talent-related topics.
Today, with all this new information entering the workplace (virtually everywhere you click at work is stored somewhere), the domain of people analytics is getting very personal.
While I know HR professionals take the job of ethics and safety seriously, I’d like to point out some ethical issues we need to consider.
The Risk of Data Abuse
First, let me give you a little motivation. While you may go out and buy a great new employee engagement tool or “retention risk predictor” from your HR software company, these new systems bring risk. When you buy the system you really don’t know how it works, so every decision, recommendation, or suggestion it makes becomes your organization’s problem.
Suppose, for example, you use Pymetrics, HireVue, or another advanced assessment technology to assess job candidates.
While these vendors work hard to remove racial, gender, and generational bias from their tools, if you implement them and a job candidate sues you, your company is responsible. And this happens all the time. (Read how Amazon inadvertently created its own gender-biased recruitment system.)
I had this happen to me. Years ago we were interviewing a candidate for a secretarial position, and I had to go out of town the day of the interview. The candidate came to the office, and our office manager told her we had to reschedule the meeting. She immediately sued us for discrimination because she was a member of a protected class. I felt terrible, and we paid her for her time, but I could see how she felt.
Here's another example. A company that turned on the “retention predictor” in its HCM system told me that managers look at these ratings and do all sorts of strange things when they see a flight risk. Some managers actually stop talking to these people and reduce the support they get at work, presumably because they think “they're leaving anyway.”
Obviously this is not good management: if we don't govern this data carefully, people will misuse it.
And of course, there are other things that could go wrong. If you have access to employee health data and you use it to assess or discuss an employee's performance, I'm sure you're in legal jeopardy (I'm not an employment lawyer). If you leak or inadvertently publish employee health data, you violate HIPAA rules.
There are lots and lots of places to get in trouble. Just look at what's happened to Facebook, Equifax, Marriott, and every other major company that thought it was protecting data. People make mistakes and employees do bad things, so we have to protect the data, the algorithms, and the managerial behaviors around them.
And as AI becomes more prevalent, we no longer see the data itself but rather a “nudge” or a “recommendation.” What if that nudge is biased in some way and an employee becomes upset? Do you know how that software works, and can you go back and make sure it didn't discriminate based on some incorrect criteria?
Finally, if you’re an analyst and you do your own analysis (read my article on ONA and how email traffic predicts performance), are you ready to defend your findings and recommendations under attack?
If someone challenges your findings and wants to see the data broken down by age, gender, race, or even location or season, are you ready to show that it's valid and reliable? I know this is all something we can do with statistical tools, but we do have to be careful.
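To make that concrete, here is a minimal sketch of one such statistical check: computing selection rates by group and flagging any group whose adverse-impact ratio falls below the common four-fifths threshold. The column names, sample data, and threshold are my assumptions for illustration, not a prescription.

```python
# Minimal sketch (illustrative assumptions throughout): an adverse-impact
# check on a set of selection decisions, using the common "four-fifths" rule.
import pandas as pd

def adverse_impact_ratios(df: pd.DataFrame, group_col: str = "group",
                          outcome_col: str = "selected") -> pd.Series:
    """Each group's selection rate divided by the highest group's rate."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return rates / rates.max()

# Hypothetical data: 1 = selected, 0 = not selected.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B"],
    "selected": [1,   1,   0,   1,   0,   0,   0],
})
ratios = adverse_impact_ratios(decisions)
flagged = ratios[ratios < 0.8]  # four-fifths rule of thumb
print(ratios.round(2))
print("Groups needing review:", list(flagged.index))
```

The same breakdown can be repeated for age bands, locations, or seasons; the point is to run it yourself before someone else makes you run it.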
And remember, trust is one of the most important things we have in business today. Not only might you lose your job if something bad happens, but the damage to company reputation can be enormous.
What Should We Do?
I’ve been doing a lot of work in this area, including spending quite a bit of time with IBM, the folks at O’Reilly, and of course talking with many HR leaders, people analytics leaders, and vendors. To help you understand the issue of ethics with people analytics, let me present the following framework.

As you can see from this framework, there are two dimensions to ethics.
- First, is the data and algorithm you’re using fair? Does it accurately reflect the performance or productivity data you want without excluding, discriminating, or inadvertently biasing the result? This is tricky and I’ll discuss it below.
- Second, is the data system and algorithm safe? Are we protecting privacy, confidentiality, and security? Who has access, and how do we audit its use and path through the company? This is a well-known problem in IT, but now one we have to deal with in HR (the sketch after this list shows what such an audit trail can look like).
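As a sketch only, with the roles, access policy, and record store all assumed for illustration, here is what "who has access, and can we audit it" can look like in code: every read attempt is checked against an allowed-roles list and written to an audit trail, whether it succeeds or not.

```python
# Minimal sketch (roles, policy, and storage are illustrative assumptions):
# role-based access to employee records with an audit trail of every attempt.
from dataclasses import dataclass, field
from datetime import datetime, timezone

ALLOWED_ROLES = {"hr_analyst", "hr_admin"}  # assumed access policy

@dataclass
class AuditLog:
    entries: list = field(default_factory=list)

    def record(self, user: str, role: str, record_id: str, granted: bool) -> None:
        self.entries.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "user": user, "role": role, "record": record_id, "granted": granted,
        })

def read_employee_record(user: str, role: str, record_id: str,
                         store: dict, log: AuditLog) -> dict:
    granted = role in ALLOWED_ROLES
    log.record(user, role, record_id, granted)  # denials are logged too
    if not granted:
        raise PermissionError(f"{user} ({role}) may not read {record_id}")
    return store[record_id]

# Usage: one permitted read, fully audited.
log = AuditLog()
records = {"E100": {"name": "Pat", "engagement_score": 0.72}}
read_employee_record("dana", "hr_analyst", "E100", records, log)
print(log.entries[-1])
```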
When you look at these two dimensions, you essentially find four elements of trust.

1. Privacy
The first ethical issue to consider is privacy. As the chart above shows, companies like Facebook, CVS, Yahoo, and many others have gotten in trouble here. When employees join your company they give you the right to collect a lot of data, but we as employers do not have the right to expose this data, share it, or link it with personally identifiable information.
Under GDPR rules, organizations also have to “forget” this data if the employee asks, so there are some important business practices to consider. If you look at a few of the questions above, they all deal with issues of disclosure and protection.
Who can access this data and have these people been trained on privacy rules and procedures?
At Deloitte, all consultants take a mandatory annual course on privacy, our PCs are scanned, and we are trained not to store any client information in a form that could be disclosed. In the case of HR, we need to tell employees what data we are collecting and make sure they understand that this data is being used for positive purposes.
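And to illustrate the GDPR “forget” obligation mentioned above, here is a minimal sketch of an erasure-request handler. The store interfaces and the compliance log are assumptions for illustration; a real implementation would also have to reach backups and downstream systems.

```python
# Minimal sketch (store layout and log format are illustrative assumptions):
# fulfilling a "right to be forgotten" request across several data stores,
# keeping only enough of a record to prove the request was honored.
from datetime import datetime, timezone

def handle_erasure_request(employee_id: str, stores: list,
                           erasure_log: list) -> None:
    for store in stores:  # e.g. HRIS, survey tool, analytics warehouse
        store.pop(employee_id, None)
    erasure_log.append({
        "employee_id": employee_id,  # kept only to evidence compliance
        "erased_at": datetime.now(timezone.utc).isoformat(),
    })

# Usage with two toy stores.
hris = {"E100": {"salary": 90000}}
surveys = {"E100": {"engagement": 0.72}}
erasures = []
handle_erasure_request("E100", [hris, surveys], erasures)
print(hris, surveys, erasures)
```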