Hiring and Big Data: Those Who Could Be Left Behind
Recruiters, HR managers, and investors have always sought better ways of identifying who fits, who has potential, and how to allocate opportunities accordingly. Big data holds the promise of not only vastly improved efficiency but also greater objectivity in our unconsciously biased human decision-making. And without a doubt, predictive “people analytics” are starting to transform how employers hire, fire, and promote. As a recent Atlantic article argues, “What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.”
But that’s just the tip of the iceberg. One of the developments that will undoubtedly cement the relationship between big data and talent processes is the rise of massive open online courses, or MOOCs. Business schools are diving into them headfirst. Soon, your MOOC performance will be sold to online recruiters taking advantage of the kinds of fine distinctions that big data allows—not only in content mastery but also in participation in, contribution to, and status within associated online communities. But what if these new possibilities—used by recruiters and managers to efficiently and objectively find the best talent—only bake in current inequities? Or create new ones?
Though MOOCs are lauded as purveyors of equality, the data show not only that most MOOC-takers are well-educated, employed, young, and male, but that most of the teachers, especially the “stars,” are men. And as a recent article entitled “Masculine Open Online Courses” warns, MOOCs may be taking academe back “to the days of huge gender gaps, when senior scholars overwhelmingly were men.” Yet who teaches us matters in more ways than one. Look at any piece of research about the subtle, systemic, or “second-generation” bias holding back women and minorities in business and you will find lack of role models at the top of the list. After all, who are among our first role models (after our parents, if we’re lucky)? Our teachers. Speaking from experience, I know that I would not have ended up with a Yale PhD if my department head at the University of Miami, Dr. Robert Tallerico, hadn’t personally encouraged and mentored me from day one.
Critics argue that, far from democratizing education, MOOCs will only reinforce those with power and weaken those without it. Early evidence from MOOCs suggests huge drop-off rates. After Udacity founder Sebastian Thrun’s very public defection from the MOOC church, he was lambasted for conducting a for-profit “experiment” at San Jose State without considering whether completion rates might differ across racial and class lines. I can’t help but wonder what would have happened to me if my first year at university had been all MOOCs. Did Thrun and his colleagues consider the possibility that the issue was not a “difficult neighborhood without good access to computers” but a lack of contact and identification with the faculty?
And let’s look more closely at those online games that Don Peck reports on in his Atlantic piece. As more recruiters use gaming data for hiring decisions, are they inadvertently ensuring a homogeneous workforce? Males rack up many more hours of practice at these kinds of games than females, a recent Sex Roles study demonstrates. Gaming is also associated with less time spent doing homework—that is, working hard, the essential ingredient girls, minorities, and immigrants (I know, I tick all three) rely on to get ahead. I cannot imagine my parents saying, “Honey, put away those textbooks and work on your games or you’ll never get anywhere in life.”
And yet recruiters are taking this data seriously. “How long you hesitate before taking every action, the sequence of actions you take, how you solve problems,” says one purveyor of workforce analytics, “all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality.” Even after only twenty minutes of play, you will generate several megabytes of data that “compose a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.”
There’s more. The Sex Roles study’s co-author suggests another possible contributor to girls’ lack of interest in gaming: the scarcity of women working in the game-design industry. “88 percent of game developers are male,” Heeter says, adding that “games designed to optimally appeal to women might minimize in-game performance pressure, provide real-world benefits such as stress relief, brain exercise or more quality time with family and friends, and be playable in short chunks of time.”
Which leads to another question: What if “in-game performance pressure” triggers stereotype threat? Decades ago psychologist Claude Steele showed that women and African Americans underachieved academically and on standardized tests not because of incapacity but because of stereotype threat—the fear of being stereotyped and underestimated on the basis of one’s race or gender. Steele also found that the dropout rate for African American students was much higher than for their white peers, even though they were good students with excellent SAT scores. As forms of online learning and screening get more sophisticated, adding more elements of participation and linking more explicitly to career gatekeepers, will we be plugging leaks in the diversity pipeline—or adding more?
The beauty (and danger) of big data is that it’s not limited to the tests a person takes voluntarily as part of the hiring process—it can also scour our digital traces for leading indicators correlated with on-the-job performance. The vast number of data points that miners marshal affords them surgical precision in discerning which attributes correlate best with success in different jobs. For instance, it turns out that which browser an applicant uses to take the online test matters a lot, especially for technical roles, because using the most sophisticated browsers requires “a measure of savvy and initiative to download them.” Other predictors are so troubling that companies don’t use them despite their power. One start-up that applies people analytics to screen job applicants found that the distance between home and work is strongly associated with employee engagement and retention. Another finds that the strongest coders tend to be fans of a particular Japanese manga site. What is the difference between the pattern recognition afforded by big data and profiling on the basis of gender, race, or class?
Even someone who’s never ventured beyond Statistics 101 knows that correlation is not causation, a truism acknowledged by the creators of the algorithms. Google, for example, stopped using puzzles and brainteasers in hiring after finding they did not predict work performance or leadership capacity. And in many cases, unmeasured factors cause both the predictor variable and the outcome it is meant to predict. What if which browser you use, or what you read for fun in your spare time, depends in part on your social network? It is well documented, for example, that innovations diffuse via social networks that are notoriously “homophilous”—that is, they connect people who are similar on demographics like class, race, and gender. Will algorithms based on our social interactions not only digitally recreate but exponentially empower the “old (white) boys’ club”?
Big data offers the promise of greater predictability, but that should not be confused with objectivity. As a researcher myself, I offer a word of caution: Before we go whole hog in our embrace of this evidence-based revolution, shouldn’t we follow its own tenets and actually conduct some studies about the diversity dynamics of this brave new world of talent management?