When the Toronto Raptors traded small forward Rudy Gay to the Sacramento Kings on Dec. 8, the stats geeks who follow professional basketball exulted. The news that Toronto was dumping Gay provided definitive proof, in their minds, of the value of "advanced statistics." For years, they'd claimed that a set of statistical methods -- basically basketball's version of Sabermetrics -- exposed Gay as overcompensated and underperforming.
The eyeball test says that Gay looks like a prototypically great NBA athlete, but the newfangled "analytics" that are all the rage in professional basketball -- "Effective Field Goal Percentage," "True Shooting Percentage" and "Offensive Rating" -- declared otherwise. And now here he was, traded for the second time in less than a year: from star to persona non grata in less time than it takes to dunk.
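For readers who haven't run into those metrics, the first two are simple arithmetic on box-score numbers. Here is a minimal sketch of the standard public formulas -- the inputs in the example are invented for illustration, not Gay's actual statistics:

```python
# Standard public definitions of two "advanced" shooting metrics.
# The box-score inputs below are hypothetical, not Rudy Gay's actual numbers.

def effective_fg_pct(fgm: float, threes_made: float, fga: float) -> float:
    """Effective field goal percentage: credits 3-pointers at 1.5x a 2-pointer."""
    return (fgm + 0.5 * threes_made) / fga

def true_shooting_pct(points: float, fga: float, fta: float) -> float:
    """True shooting percentage: folds free throws into overall shooting efficiency."""
    return points / (2 * (fga + 0.44 * fta))

# Example with made-up numbers: 7-of-18 from the field, 1 three, 4-of-5 free throws (19 points).
print(effective_fg_pct(7, 1, 18))    # ~0.417
print(true_shooting_pct(19, 18, 5))  # ~0.470
```

Numbers like these, accumulated over thousands of possessions, are what led the analytics crowd to sour on Gay's shot selection.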
The NBA's "Moneyball" moment had arrived. The stat geeks had won.
Gay will make $18 million this year and $19 million next, so few regular people -- by which I mean non-athlete working stiffs -- are likely to shed tears over the ruthlessness with which arcane numbers have exposed the holes in Gay's game. But we should all still be paying attention, because the same kind of number crunching is increasingly being focused on us.
"People analytics" -- the assessment of whether particular workers are suitable for particular kinds of employment or performing well at their jobs -- is booming like never before. Everything that can be measured about us, whether via personality tests, biometrics or the big-data trails we leave in the cloud as we go about our work, is being captured and analyzed.
Proponents of "people analytics" say this new number crunching will lead to a fairer, more efficient workplace, in which employees are better suited to their jobs. They may be right. But critics worry about what happens to all the people who don't make the grade in an algorithmically driven pure meritocracy. And they ask: In the long run, will it be healthy for society to run everything by the numbers?
It's a question with no good, definitive, Rudy-Gay-is-clearly-a-bad-shooter answer at the moment. But the emerging trend-line is impossible to ignore: We're all fodder for the stats geeks. We're all Rudy Gay, powerless before the remorseless tale told by our digits.
* * *
Pre-employment "assessment training" is already widespread. According to one expert I talked with, about 75 million assessment tests are performed every year. One company alone, Kenexa, reportedly does 20 million assessments a year. For millions of available jobs, particularly entry-level hourly jobs, one of the very first hurdles applicants are required to clear is an online personality test that quickly culls prospective workers into green (yes) or red (no) groups.
The sophistication of such tests is rapidly advancing into scary territory. One brand-new San Francisco start-up, Prophecy Sciences, wires up applicants with sensors, gives them a 30-minute test and some video games to play, and (according to TechCrunch) aims "to analyze the unique blend of chemical reactions, electrical impulses, reflexes and behaviors that make you who you are, and measure how you respond to group dynamics, before ultimately identifying trends between you and your colleagues."
Potentially even more valuable insights can be derived from the vast amounts of data created after a worker gets a job. Another Bay Area start-up, Evolv, combines pre-employment assessment testing with the patterns that emerge from the "big data" generated by worker actions. Attendance records, salaries, the duration of customer transactions and their positive or negative outcomes -- anything that can be measured gets pulled into Evolv's maw, where its "artificially intelligent machine learning engine" searches for useful correlations.
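Evolv doesn't publish its methods, but the general idea of mining workforce records for useful correlations can be sketched in a few lines. The snippet below is a toy illustration with fabricated data and fields -- it is not Evolv's engine -- showing how a system might check which measured attributes track with how long employees stay:

```python
# Toy illustration of correlation mining on workforce data.
# The records and fields are fabricated; this is not Evolv's actual system.
from math import sqrt

# Each record: (has_criminal_record, years_relevant_experience, months_on_the_job)
records = [
    (1, 0.0, 14), (0, 3.0, 6), (1, 1.0, 18), (0, 5.0, 5),
    (0, 0.5, 9),  (1, 2.0, 16), (0, 4.0, 4), (1, 0.0, 12),
]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

tenure = [r[2] for r in records]
for i, name in enumerate(["criminal_record", "relevant_experience"]):
    feature = [r[i] for r in records]
    print(name, round(pearson(feature, tenure), 2))
```

A production system would run this kind of search across hundreds of variables and flag whichever correlations survive statistical scrutiny; the principle, though, is the same.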
"The hiring and managing of people has long been done by gut and intuition," declares a white paper prepared by Evolv. "Now, companies are turning to data to uncover the facts and drive decisions."
Max Simkoff, Evolv's co-founder and CEO, told me that his company's big-data crunching had revealed a stream of intriguing, contrarian results. For example, "people with a criminal background stay longer on the job and perform better at entry-level hourly jobs," he said. Having "relevant experience" for a job didn't track with later productivity. Indeed, the relative quality of a manager or supervisor was more important in influencing worker attrition and productivity than the background of the individual workers. Other useful insights -- as reported by the Atlantic's Don Peck in a comprehensive recent feature story, "They're Watching You at Work" -- include the nugget that educational attainment is not as big a factor in job success as the conventional wisdom believes. Another interesting data point: Being unemployed for a long period of time does not make you a worse worker, if hired.
Put it all together, says Simkoff, and you end up with a better world: Listening to the wisdom of the algorithm, he believes, results in a fairer workplace, less tainted by bias and discrimination.
"A lot of people like to talk about the Big Brother angle of Big Data -- how you are being watched all the time," says Simkoff. "But here's the fact. For over a hundred years, or longer -- as long as people have been hired up until now -- the decisions that people made about who to hire, how to promote them, when to terminate them, were made using almost 100 percent intuition and gut feel and as a result there were a bunch of really nasty practices permeating the workforce. One of them -- the best example we've been able to debunk -- is the idea that people need to have previous work experience. The idea that people need relevant experience for entry level jobs is factually false.
"What big data analytics is doing here is enabling a wider playing field. It is taking all these people who used to get unfairly screened out from these jobs and saying, 'no, these people are every bit as capable of doing these jobs as the people who have been hired with a lot of personal bias in the past.' I think that is fascinating. Especially in a world where we talk about the obvious negative connotation of a 9 percent unemployment rate."
Simkoff makes a compelling argument. His vision of a job market in which applicants are judged by their actual suitability for the work, rather than by the name of the college they attended, their criminal record, how many times they've changed jobs in the past year, or how long it's been since they last had a job at all, is inherently attractive.
But others see a darker side. I was originally tipped off to the brave new world of assessment testing by Roland Behm, a lawyer who writes a blog covering employment-testing issues. Behm has a personal reason for following this issue: he believes his son was rejected for a part-time job at the Kroger supermarket chain because the personality test screened him out on the basis of his bipolar disorder.
"Looking into the tests, I came to the conclusion that they were illegal medical examinations under the American Disabilities Act, and that they unlawfully screened out persons with disabilities (also in violation of the ADA). I communicated with a number of employers that utilize personality tests as part of their employment screening process -- some by exchanging letters and emails, some by phone, and some by meetings. The majority of their responses were along the lines of 'it will take you a long time (7-10 years) to litigate the issues,' 'we have more resources than you' and 'our test may be problematic, but it's not as bad as some of the others.'"
Behm is indeed litigating the issue, while also spreading the word as widely as he can about the potential for "systemic discrimination" through assessment testing. But he doesn't deny that the kinds of things Evolv does might have some positive value.
"We don't take the position that all pre-employment assessments are bad or illegal," says Behm. "Might some of the tests, in some of the circumstances, provide a benefit to the employer and the applicant by bringing in a highly qualified applicant who might previously have been dismissed out-of-hand (non-Ivy Leagues or persons with criminal records)? Of course, and that's what we're looking for with regard to applicants with mental illness. Currently, we believe that many of the pre-employment tests illegally exclude persons with mental illness due to the use of personality tests and the use of location-based screening (e.g., applicant lives more than X miles from the job site or has a commute longer than Y minutes). The fact that the testing may benefit some groups doesn't offset the illegal and negative effects it has on others."
"Many persons who are ready, willing and able to work are not being given the opportunity. Not only does that have an immediate negative impact on the person with mental illness who is not considered for employment, it has a medium- to long-term negative impact on society and taxpayers."
In the cutthroat world of American capitalism -- especially as currently exemplified by Silicon Valley's vaunted triumph-of-the-meritocracy-over-all-else philosophy -- the notion that companies might have a legal responsibility to hire the mentally ill might not be the most pleasant thought for H.R. managers. Where's the competitive advantage for the company? But from society's point of view, the more people with jobs, the better. And that raises another point about "people analytics" that stretches far beyond the letter of employment law: Might we not actually be better off if even the bad workers have jobs, too?
It sounds silly. But if you game out people analytics far enough, with companies hiring only the perfect workers for each job, you end up with a lot of unemployed imperfect workers. And that creates a further drain on society's resources. Perhaps the most socially beneficial job market is one in which employers accept a certain level of imperfection and lower productivity from some workers, because it's better for society at large to have as many people employed as possible.
From Simkoff's perspective, Evolv facilitates better "matching" -- putting the right worker in the right job. One could argue that the productivity gains from better matching would contribute to stronger overall economic growth, which would in turn create a healthier job market for everyone -- even the not-so-great workers. But there's a darker scenario, one that increasingly seems to be playing out already: the best workers reap huge rewards, while everyone else struggles for the scraps.
Because that's the logic of the algorithm: reward productivity and punish inefficiency. It's a great model for an NBA team, with only 11 or 12 spots on the roster. But it's not at all clear that it's a great way to run an entire society.