Apple asks some prospective Design Engineers to infer, from a table of deconstructed parts, the intended function of a motorcycle pump. Prospective employees at Fujifilm are asked to write personal essays about two experiences in their lives, from childhood to the present, that together form a metaphor for their careers. They’re given the example of someone who enjoyed mending bird wings as a child and later grew up to become a “corporate-turnaround artist.” Google once asked prospective hires to estimate how many golf balls could be stored in a school bus and to explain why manhole covers are round, a practice it later admitted was “a complete waste of time.”
These pointless tests are not just a matter of misapplied practice; they’re symptoms of the steady degradation of Human Resources, whose purpose seems to have reversed itself in recent years. HR now acts as much as a gatekeeper as anything else, attempting to rationalize the labor market’s irrationalities with junk science and false empiricism. Maintaining a large workforce is often seen as a liability, especially in tech-related industries.
Facebook’s 1.1 billion monthly users are served by only 4,900 employees. Twitter’s 200 million accounts are maintained by fewer than 1,000. Amazon is a relative behemoth, with 91,300 employees worldwide, yet tens of thousands of those are brought on through temp agencies, which pay low hourly wages, offer few medical or retirement benefits, and leave workers subject to at-will termination. HR departments don’t necessarily cause any of these phenomena, but they are almost always the mechanism that keeps them in place. What was once meant to be a safeguard against managerial exploitation has become an enforcement mechanism for it.
It wasn’t always this way. In the 19th century, Human Resources originated as a means of providing on-site factory regulation to ensure working conditions met a certain minimum standard. Mary Wood is often identified as the first formal HR employee, hired by Rowntree in the late 1890s to see to the morale of its workforce by putting flowers around the workrooms and tending to sick workers. In 1913 the Welfare Workers’ Association was founded to organize people in this new position, and after World War I there were over 500 people, mostly women, responsible for monitoring working conditions. Some companies began to experiment with using these positions to handle employment as well as worker morale. In 1921 the National Institute of Industrial Psychology published a collection of interview techniques and selection tests, putting in place the framework of modern HR.
This shift in focus effectively reversed the purpose of HR departments, tasking them not just with maintaining adequate working conditions but with worker adequacy itself, something that has become a corporate currency in today’s economy. In a paper on “Developing HR as an Internal Consulting Organization,” Richard M. Vosburgh, former Senior Vice President of Mirage Resorts and MGM Grand, writes that HR departments are most effective when they make “rigorous data-based decisions about human capital management.” HR has effectively transformed labor into a kind of investment banking, with people analyzed and packaged instead of shares.
Even as the logic underwriting this system grows more transparently exploitative and dysfunctional, ridding ourselves of its delusions is difficult. We want the dogma of HR hiring to be true, if only as a form of self-validation. In the same way that the doubling of home prices in a few years was both self-evidently unsustainable and impossible to walk away from, the fantasy of meritocracy is tattooed into the American DNA. All meritocracies need prizes, and so we carve out the benefits of job security, healthcare, and a decent standard of living as things that should be withheld, given only to those able to outperform the baseline in the human capital market.
This isn’t meritocracy but another form of freelance feudalism, in which a narrative of individual achievement becomes a prerequisite for being considered basically employable. All of these employment structures obscure the deeper truths of the labor market, where nepotism often matters more than pure merit. According to Maribeth Bailey, Deloitte and Touche gets 49 percent of its experienced hires from personal referrals, while almost 40 percent of Enterprise Rent-A-Car’s new hires in the last two years have come from employee referrals. The food services company Sodexo favors employee referrals by a factor of 10 to 1. These statistics aren’t proof of institutional corruption but a reminder that people are always more adaptable than HR templates suggest.
I’ve worked in a number of different fields and found that, with few exceptions, the technical skills required in most roles can be quickly learned. I’ve built databases, negotiated licensing contracts, budgeted and scheduled a feature film, taught English as a foreign language, worked as a health educator, taught myself cinematography and video editing, learned HTML, and become conversationally fluent in two languages (in addition to the two I’d spoken in childhood and the third I picked up in high school). I wouldn’t describe myself as an expert in any of these skills, but in anywhere from a few weeks to a few months I became relatively productive in a variety of professions.
From that side of the HR wall, one learns that all of those intimidating skill requirements are mostly set dressing. It’s doubtful Steve Jobs would have passed Apple’s motorcycle pump identification test, and who knows if Sergey Brin could have explained the conundrum of manhole covers. Which in a way only confirms that HR’s screening policies are functioning as intended, creating a quantitative basis for keeping people out rather than ensuring communities are working as productively as they can.
It should be that the benefits of digitization and the widely accessible information it brings create a job market driven by the cross-pollination of ideas and disciplines, supported by groups of people training one another, working together, and benefiting from having old standards challenged from the outside. Instead, we seem trapped in a period of corporate rigidity, with employers using an irrational holdover of industrial-age empiricism to evaluate who is capable of contributing and who isn’t, a standard that is readily set aside whenever a friend of a friend comes calling. Human Resources has become the junk science of rationalizing why most of the people we live among shouldn’t be allowed to contribute, and the closer one looks at its hiring practices, the more nonsensical they become. It is an example of how easily a structure meant to enrich the conditions of workers can be diverted into the business of proving why only a chosen few should be allowed access to that coveted and increasingly rare status.