Be Cautious — 8 Lessons for Sound Talent Analytics


Data gathering and analysis are touted as the solution to many talent acquisition issues. Talent functions have added a statistician, a data scientist, or at least someone good at crunching data, but very few have thought through the implications of data analysis or established guidelines for using the results.

A few companies, including Google, a pioneer in using analytics to change how it hires and manages its employees, have focused on specific issues, such as employee selection or engagement, and used data analytics to help understand and improve them.

But gathering data and analyzing it may present as many problems as it solves. Good analytics requires a deep understanding of the limits, shortfalls, and biases that are most likely to occur. Be cautious. Be thoughtful.

There are at least eight specific requirements for a sound, thoughtful, and beneficial analytical capability.

  1. No Magic Bullet: First of all, analytics is not a magic bullet. Data can help you understand an issue and perhaps help you argue more effectively for a course of action, but it does not replace the need for empathy or human reasoning. Even when a statistic points in a particular direction, that alone may not be reason enough to act. Knowing the context of the situation being analyzed is essential to success. Knowing that none of the last eight people interviewed received an offer doesn't tell you very much. You also need to know, among other things, who interviewed them and why they chose not to make offers, the economic circumstances of the company or function, and who the recruiter was and how competent they are. It takes contextual knowledge, factual knowledge, caution, and good judgment to use the results of analysis effectively.
  1. Know What You Want to Know: Be crystal clear about what you want to analyze or measure, and ensure that it is possible to analyze or measure it accurately. As the saying often attributed to Einstein goes, "Not everything that counts can be counted, and not everything that can be counted counts." You can measure quantity, source, time, and cost reliably. But it is hard to measure quality, satisfaction, or engagement reliably or accurately, because these are subjective. Survey results can be interpreted in many ways, survey questions can be (and often are) biased, and employees may answer with what they think you want to hear, or in the hope that their answer will change something.
  1. Use the Appropriate Method: The method of gathering data may also be a problem. One use of analytics is to illuminate a problem or find possible reasons for something, yet even that can be daunting and error-prone. For example, employee turnover may be caused by a wide variety of factors, including poor management, lack of confidence in the organization's performance, personal grievances, poor pay or benefits, antagonistic fellow employees, personal feuds, family issues, or a lack of affinity with the culture. A survey or focus group may uncover some of these and even give particular weight to one or two of them, but it is very difficult to determine whether you have found the root cause. Answers to survey questions are likely to be subjective, and as we all realize, many reasons people give for leaving do not reflect the truth but are designed to avoid "burning any bridges." In this case, it may be better to trust your instincts than to rely too heavily on the results of a survey.
  1. Passive Data May Be Better Than Solicited Data: It is far easier to gather passive data about something than to solicit valid data from people. It is relatively straightforward to gather factual data from the results of actions and decisions. Good analysis can help recruiters understand where their hires come from, which sources provide the most employees, or which social media messages are the most effective at generating leads. But even with passive data there are significant challenges. For example, it is possible to identify what traits the most productive or longest-tenured employees have in common, but it is hard to prove whether those traits actually cause the productivity or tenure. Mistaking correlation for causation is a common trap and has to be vigilantly avoided.
  1. A Supportive Culture Is Important: To make good use of data, there has to be an accepting leadership and corporate culture that values data and is willing to use it to make decisions. If your organizational or functional leadership does not make use of analytics, you may find your work coming to naught. Leaders who understand the value of data and already use it to better understand manufacturing or marketing will be far more receptive to you.
  1. Focus is King: It is tempting to try to measure everything, especially in the early days of establishing an analytics function. Instead, focus on two or three key questions you would like answered. This will allow you time to gather better data and analyze it more completely. Data scientists can get so excited about the insights they are digging out of the data that they lose sight of the original goal or purpose of their search; it requires discipline to stay focused. Define the inquiry carefully and as narrowly as possible to get useful, actionable data. For example, determining which sources of candidates lead to the highest number of offers for a particular function or hiring manager can be useful. Trying to extrapolate that to the entire organization may result in some poor decisions.
  1. Data Is Not Pure: We have a tendency to put analytical data on a pedestal and assume it is pure and uncontaminated by politics or opinion. Unfortunately, data and data analysis are as subject to bias and opinion as anything else. Politics plays a part in determining what data you gather, what you measure, how and when you measure it, how much focus an area gets, and what conclusions and decisions are drawn from the data. Stakeholders, customers, and employees all have opinions and need to be listened to. When there is broad consensus on the source of the data and on what methods and analysis will be used, the results are far more likely to be trusted and acted upon.
  1. Keep it Simple: Take the time to list what you really would like to know to improve your overall recruiting capabilities. What data would help you make a better case for more resources, or would answer the pressing questions management has asked you? Initially, stick to the passive data that comes in every day from your social media channels, your website, and the results of your recruiting efforts. Then work with your analytics person to determine what can be honestly and reliably provided. It is very tempting to go after solicited data through surveys, focus groups, interviews, and so forth. But as mentioned above, data gathered in this manner requires a sophisticated process, and even then it can be manipulated or misinterpreted because of bias, intentional or not.
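The narrow, source-level question suggested in the "Focus is King" item can be answered directly from passive ATS data. Below is a minimal Python sketch; the source names, candidate list, and offer flags are all invented for illustration:

```python
from collections import Counter

# Hypothetical passive data pulled from an ATS export:
# (candidate source, was an offer made?) pairs.
candidates = [
    ("referral", True), ("referral", True), ("job_board", False),
    ("job_board", True), ("linkedin", False), ("referral", False),
    ("linkedin", True), ("job_board", False), ("referral", True),
]

# Count offers and total candidates per source.
offers_by_source = Counter(src for src, offered in candidates if offered)
totals_by_source = Counter(src for src, _ in candidates)

# Offer rate per source: a narrow, answerable question,
# rather than an attempt to measure something subjective like "quality of hire".
rates = {
    src: offers_by_source[src] / totals_by_source[src]
    for src in totals_by_source
}
print(rates)  # e.g. {'referral': 0.75, 'job_board': 0.333..., 'linkedin': 0.5}
```

Counting offers per source like this stays within what passive data can honestly support; extending the same rates to the whole organization would repeat the extrapolation mistake warned against above.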

Data analysis is difficult and fraught with problems of interpretation and understanding. Any good use of analytics requires thoughtful discussions and an understanding of limitations before any actions are taken. Google has created a website with additional useful thoughts about gathering and using people data and analytics.

  • Ben Sian

    Might I also add, "Learn about your data sources." Your sources of data may or may not stem from a system of record. Heck, you might not even know what your system of record is. You might have different systems, e.g., an ATS and an HCM, and their data could conflict. For example, a CHRO might ask, "How many hires did we have this quarter?" Should those data come from your ATS? Should they come from your enterprise HCM? Make sure you know the answer. And even if you know which is the system of record, if the data conflict, how do you account for the discrepancy? You'll need to know, because, trust me, your senior leaders will ask, and your credibility is riding on the answer.

    Even within a single system, your data may vary due to historical changes, and you need to educate yourself on those changes. Why does one data field in your ATS say one thing before 2012 and another after 2012? It could be that a new law was enacted. It could be that organizational changes led to the difference. Or it could be that no one who can tell you the answer is still around from 2012. Know your data, not just the output and reports.
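The ATS-versus-HCM discrepancy the comment describes can be made visible with a simple set comparison. This is a hypothetical sketch; the employee IDs and the reasons in the comments are invented for illustration:

```python
# Hires recorded this quarter in each system, keyed by employee ID.
ats_hires = {"E101", "E102", "E103", "E105"}  # from the ATS
hcm_hires = {"E101", "E102", "E104", "E105"}  # from the HCM

in_both = ats_hires & hcm_hires   # counted consistently in both systems
only_ats = ats_hires - hcm_hires  # e.g. offer accepted but start date slipped
only_hcm = hcm_hires - ats_hires  # e.g. internal transfer entered directly in HCM

print(len(in_both), sorted(only_ats), sorted(only_hcm))
```

Reconciling the two lists record by record, rather than comparing headline counts, lets you explain each discrepancy before a senior leader asks about it.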