"In the spring of 1974 — purely speculatively, I told myself — I took the Law School Admissions Test.
— Scott Turow, "One L: The Turbulent True Story of a First Year at Harvard Law School"
Unlike Scott Turow, I always wanted to be a lawyer. Once I entered law school in 1976, it never occurred to me that using my JD to earn a living would be a significant challenge, or that my student loans from college and law school—roughly $50,000 in 2012 dollars—would be anything other than a minor inconvenience. I’d heard stories about unemployed lawyers driving taxicabs, but they were irrelevant to the life I’d planned. In that respect, I was similar to most of today’s prelaw students, who are convinced that bad things happen only to someone else. The difference is that the current prospects for law graduates are far worse than my contemporaries’ and mine ever were. Over the past two decades, the situation has deteriorated as student enrollments have grown to outpace the number of available new legal jobs by almost two to one. Deans who are determined to fill their classrooms have exploited prospective students who depend on federal student loan money to pay tuition. The result has been an unsustainable bubble.
Law school applicants continue to outnumber the places available to them, ignoring data that on their face should steer most aspiring attorneys away from a legal career. Only about half of today’s graduates can expect to find a full-time position requiring a legal degree. Meanwhile, law schools have grown in number and size to accommodate demand without regard to whether there will be jobs for their graduates. The first part of the equation—student demand—is the product of media images projecting the glamour of attorneys’ lives, the perception that a legal degree ensures financial security, and law school’s status as the traditional default option for students with no idea what to do with their lives. The second part of the equation—the increase in law school supply—was made possible by a revolutionary change in the method of legal education more than a century ago. It gave educators an easy way to transform law schools into profit centers for their universities. Decades later, student loans would provide the funding.
Today there’s a lawyer for every 265 Americans—more than twice the per capita number in 1970—yet there won’t be enough legal jobs for more than half of the attorneys still to come. In 2008, the U.S. Department of Labor’s Bureau of Labor Statistics (BLS) estimated that for the ten-year period ending in 2018, the economy would produce an additional 98,500 legal jobs. In 2012, after the Great Recession decimated the market for attorneys, the BLS revised that estimate downward, to 73,600 openings from 2010 through 2020. Another prediction considered attrition in combination with the number of anticipated new attorneys on a state-by-state basis and concluded that through 2015 the number of new attorneys passing the bar exam would be more than twice the expected number of openings. Whichever of these projections turns out to be closest, there’s little doubt that law graduates are already feeling the crunch. Fewer than half of 2011 graduates found jobs in private practice. Nine months after graduation, only 55 percent held full-time, long-term positions requiring a legal degree.
Along with their degrees and dubious job prospects, 85 percent of 2010 graduates from ABA-accredited law schools carried debt, and the average debt load was almost $100,000. Average law school debt for the graduating class of 2011 broke six figures, and that number has been growing in tandem with unemployment rates for new graduates. Even if a career in law turns out to be the right path, the financial burden can be staggering. If the law ends up being the wrong path, then debt becomes the rock that Sisyphus was condemned to push uphill forever.
* * *
For most lawyers, the idea of pursuing a legal career comes early in life. One-third of respondents to a survey of recent applicants said that they had wanted to attend law school since childhood and, while still in high school, made the decision to apply after college. Another third made the decision as undergraduates, in either their freshman or sophomore year. One reason for this phenomenon is the media: popular images make a legal career look attractive to young people long before they get to college. Any middle school student who reads "To Kill a Mockingbird" (1960) or "Inherit the Wind" (1955) takes in an image of the admirable lawyer-statesman. Recent portrayals include the CBS hit series "The Good Wife," which continues a legacy of noble lawyers in television dating back to Perry Mason and proceeding through "The Defenders," "L.A. Law," "Law & Order," and others. Every week, an episode of "The Good Wife" focuses on junior associate Alicia Florrick, a single mom who was raising two teenagers by herself until her philandering husband, a former state’s attorney, got out of jail near the end of the first season. She regularly finds herself in tense courtroom scenes, cross-examining key witnesses in high-stakes trials. She makes a lot of money while finding clever ways to unearth critical facts, reveal the truth, and vindicate her clients. Then she goes home every evening in time for dinner with her kids.
There are negative images out there, too, most notably in the work of John Grisham. For example, no prelaw student should want to emulate the crooked attorneys in "The Firm," his 1991 best seller about lawyers who operate their enterprise as a front for the mob. But they also should be wary of identifying with the novel’s protagonist, Mitch McDeere. He follows the very track to which most of them aspire: he graduates from a top law school and joins a high-paying law firm to earn big money. However, he gets swept away by the billable-hour culture, which deprives him of sleep and a home life, and his marriage deteriorates. These pressures, which nearly destroy him, exist wholly apart from the underlying criminality that his firm’s partners pursue.
Yet most prelaw students ignore the persistent warnings. Somehow those negative images can’t compete with the positive ones. Psychologist Daniel Kahneman, who won a Nobel Prize in economics, may have a partial explanation. Kahneman researches and writes about a universal human characteristic: clinging to preconceived notions, even as contrary information and unambiguous data undermine them. The phenomenon is a variant of confirmation bias, the tendency to credit information that comports with established beliefs and jettison anything that doesn’t. In the context of the legal profession, most prelaw students think they’ll be the exceptions—the traps that ensnare people like Mitch McDeere won’t get them.
* * *
Another reason that people become lawyers is to make money. But if prospective lawyers allow themselves to be dazzled by headlines about the wealthiest attorneys, such as the partner who recently left one big firm to join another where he’d earn a reported $5 million a year, they’re making a mistake. Nine months after graduation, members of the law school class of 2009 fortunate enough to have any full-time job had a median salary of $72,000, comparable in buying power to the $50,000 median salary for new lawyers in 1990. That may not sound bad, but even that number is misleadingly high, as it masks a skewed income distribution. Each year 10 to 15 percent of graduates get jobs in big law firms, where the starting salary can be as high as $160,000. But those firms constitute only a tiny slice of the profession, and it’s shrinking. Furthermore, the median salary has been falling. For all law firms, the median starting salary for the class of 2011 was $85,000; for all lawyers who graduated that year, it was $60,000 (a 17 percent drop compared to the $72,000 median starting salary for the class of 2009). Even those numbers overstate new graduates’ financial reality for another reason: they’re based solely on salary information for the 65 percent of graduates reported to be working full-time in a position lasting at least a year.
For most employed lawyers, the money gets better. The median annual income of all practicing lawyers in 2010 was $112,000—double that of all US households. The nagging problem is that the seemingly decent (but shrinking) payoff usually isn’t sufficient to justify the enormous investment in time and money. Professor Herwig Schlunk of Vanderbilt University Law School calculates that for the vast majority of graduates, getting a legal degree will never yield a return equal to the financial cost of becoming a lawyer.
* * *
Some people go to law school because it’s the last resort of the liberal arts major who doesn’t know what to do next. In that respect, the decision to enroll has long resulted from a process of elimination that proceeds something like this: being a member of a profession is the ultimate achievement, but medical school requires science-oriented interests and talents that don’t fit most students in the humanities; postgraduate degrees in history, philosophy, English, and the social sciences are for future professors; business school is for those whose principal ambition is to make lots of money. That leaves law school, which offers students a three-year reprieve from the world while they pursue a noble course that presumably creates even more options. For some, that plan works out okay; for too many others, it leads to a place where dreams go to die.
Proof that law school is a default solution for the undecided lies everywhere, even in newspapers’ sports pages. In the fall of 2011, twenty-six-year-old infielder Josh Satin made his major league debut for the New York Mets. An article about him included this line: “After graduating as a political science major from Cal, Satin was selected by the Mets in the sixth round of the 2008 draft. And like any number of 20-somethings with a liberal arts degree and nebulous career prospects, he kept law school applications at the ready.”
* * *
On the supply side of the lawyer bubble, some of the necessary conditions for its creation date to a nineteenth-century innovation in legal education—the case method. Credit for that development goes to former Harvard Law School dean Christopher Columbus Langdell. Prior to 1890, no other law school used the case method of instruction that he pioneered; today it’s pervasive.
Langdell didn’t set out to create what became an essential basis for the current mass production model of legal education. Rather, he was simply pursuing his penchant for thoroughness. He viewed the law as a science and believed that its ultimate truths could be discovered through the study of primary specimens, namely, the decisions of appellate court judges. Law students could divine general principles that, once mastered, would enable a graduate to practice anywhere. As Langdell saw it, differences in state law were inconsequential to the overall jurisprudential picture.
The large body of common law itself created a challenge for Langdell’s approach. No student could read every reported decision going back to Blackstone’s Commentaries on the Laws of England, an eighteenth-century treatise that first summarized the English common law as part of a unified system. For his Harvard contracts course, Langdell instead collected a selection of reported cases (there were more than two thousand at the time) from which an entire classroom of students could induce general legal principles.
The Langdell case method was a radical departure. Previously, prospective attorneys had learned the law from secondary sources as rules to memorize and skills to hone before engaging in one-on-one apprenticeships. For example, after a year of study consisting of the traditional lecture and drilling at the University of Michigan in the 1870s, Clarence Darrow received on-the-job legal training while working for an attorney in rural Ohio. He then proved his competence to a few lawyers before whom he literally sat to be examined for the bar. Darrow passed. A system that required students to learn specific legal rules and then receive training with practicing attorneys constrained the number of new lawyers admitted to the bar each year.
Langdell changed that model with what he regarded as a noble aim. Practical aspects—simply learning the rules—weren’t the key. Instead, a true lawyer’s most important work was to understand the governing principles so as to “be able to apply them with consistent facility and certainty to the ever-tangled skein of human affairs.” One by-product of the approach was that large groups of students could receive simultaneous legal training from a handful of instructors. The system became an early building block in the current business model of legal education.
Langdell’s new teaching protocol didn’t create the current lawyer bubble, but it provided an essential foundation that facilitated the mass production of attorneys. From 1890 to 1916, the number of law schools more than doubled, from 61 to 139, and the schools themselves grew larger, so the number of law students increased fivefold—from 4,500 to almost 23,000. As recently as 1963, there were still only 135 law schools, but total JD enrollment had doubled to 47,000 students.
During the next decade, baby boomers made their way into higher education as the Vietnam War popularized three-year law school deferments from the draft. Enrollment more than doubled again, to 100,000 by 1972, but there were still fewer than 150 law schools. As the last of the boomers made their way through law school, enrollment leveled off, hovering around 127,000 through the 1990s. On a per capita basis, the United States had 1.58 lawyers per 1,000 citizens in 1960; by 1980, the number had grown to 2.38 lawyers per 1,000. But that was only the beginning.
In the 1990s, U.S. News & World Report’s law school rankings began to gain in popularity and became a key element in the competition for new students. Meanwhile, as applications to first-year classes rose generally, universities increasingly saw law schools as profit centers worth expanding. Recently the Maryland Department of Legislative Services concluded that the University of Baltimore School of Law sent 31 percent of its 2010 revenue back into the general university budget. For private schools the data are difficult to uncover, but the University of Baltimore report corroborates a widely held view that universities in general impose a “tax” amounting to between 20 and 25 percent of their law schools’ gross revenues.
Law school enrollments climbed even as tuition rose faster than at undergraduate colleges. In 2003, there were more than 98,000 applicants for a first-year class that enrolled about 48,000 students nationwide. Average annual tuition for private law schools was $26,000. By 2010, it had increased to more than $37,000. Even as law school applications declined sharply after 2010, private law school tuition went up annually by 4 percent—more than twice the rate of inflation—to an average of $40,585 per year in 2012. Public law schools have followed an even steeper curve: for in-state residents, average tuition nearly doubled, from $11,860 in 2003 to $23,590. In 2012 alone, it went up by more than 6 percent.
When U.S. News published its first rankings in 1987, total law school enrollment in the 175 ABA-accredited institutions had remained around 120,000 for a decade. Since then, twenty-five more law schools have come online, and enrollments have risen steadily to more than 145,000. By 2010, there were more than 1.2 million lawyers in the United States—almost 4 for every 1,000 citizens. In the United Kingdom, the comparable number is about 2.5 per 1,000; in Germany, it’s slightly more than 1.5.
* * *
Law school deans defended the growth and proliferation of law schools after 2000 as a market reaction to student demand. After all, an excess of applicants over available spots sent an unambiguous signal: consumers wanted more openings in law schools. Anyone running a business would respond as most deans did: raise tuition, increase profits, and add capacity. Wrapping themselves in the rhetoric of free markets and individual choice, even deans at some of the best law schools avoided important disclosures, including meaningful employment and salary data for their recent graduates. After all, better information about the limited opportunities actually available to new attorneys might reduce student demand.
Of course, some of the widespread career dissatisfaction among attorneys is the fault of college students making shortsighted and unsound judgments about their future. But bad information shares the blame for what turned out to be a poor career choice for many of them. Law schools operating on the outer perimeter of candor to fill their classrooms worsened the problem. But without free-flowing student loan money for which law school deans never have to account, the entire system would look much different.
The law school business model permitted (and still permits) a perverse market response—increasing tuition in the face of declining demand for lawyers—for two reasons: student demand for law school still exceeds supply, and students have little difficulty borrowing whatever they need to cover the cost of a degree. For decades, lenders faced no risk of default because the federal government guaranteed the loans.
Then in 2008, out of concern that the credit market freeze would leave insufficient financing for student loans, the government essentially took over most such lending directly. Two years later, it completed the transition from insuring all loans to issuing the vast majority of them. Meanwhile, revisions to the bankruptcy laws essentially bar students from ever discharging public or private educational debt. In its totality, the current regime insulates law schools from the problem of graduates who can’t find jobs needed to repay their student loans, while giving schools no incentive to control tuition costs. Of the various parties involved—students, government, private lenders, and law schools—only the students and, to a growing extent under new income-based repayment programs, the federal treasury bear any significant risk that such borrowing might turn out to have been imprudent.
The combination of irresponsible lending and inadequate law school accountability has been deadly for many attorneys and the profession. It’s a story of good intentions gone awry.
The origins of the government student loan program generally date to 1958, when Congress followed the recommendation of economist Milton Friedman in creating a system of direct federal loans for higher education. When it expanded the program in 1965, existing federal budget accounting rules required booking direct student loans as total losses in the year made, regardless of whether they would be repaid in full with interest. But the rules also provided that a loan guarantee didn’t count as a federal budget cost item—not a penny. At the urging of economists, Congress finally revised the budget rules in 1990, but the most important feature remained: federal guarantees of all private and public student loans.
For lenders, such guarantees mean no risk of nonrepayment because the government picks up the tab for any shortfall. For students, they mean the growth of another industry that will chase them forever: debt collectors. When someone defaults on a student loan, the government turns it over to private collection agencies. In 2011, the US Department of Education paid more than $1.4 billion to such companies. Summarizing that industry’s attitude, a business consultant described his thoughts in 2011 as he watched Occupy protesters at New York University wearing T-shirts with the amounts of their student debt scribbled across the front: “I couldn’t believe the accumulated wealth they represent—for our industry. It was lip-smacking.” His article included a picture of some students in their T-shirts, including one with “the fine sum of $90,000” and another with “a really attractive $120,000.” Another consultant suggested that student loans might be the accounts receivable industry’s “new oil well.” Something is terribly amiss in a society where policies and incentive structures make debt collection a growth business.
In addition to government guarantees, private lenders gained another layer of protection against losses from their student loan portfolios. As noted previously, today such debt almost always survives a young lawyer’s bankruptcy filing. The cumulative impact of these policies is becoming clearer. As one recent graduate observed, a federally guaranteed student loan may be “the closest thing to debtor prison that there is on this earth.”
It wasn’t always so. In the early 1970s, the federal student loan program was still relatively new and the US Department of Health, Education, and Welfare sought to avoid any negative public image that might tarnish the young system. The agency proposed making government student loans nondischargeable in bankruptcy unless a borrower had been in default for at least five years or could prove “undue hardship.” Enacted in 1976, the undue-hardship requirement placed student loans in the same category as child support, alimony, court restitution orders, criminal fines, and certain taxes. No data supported the suggestion of a student loan default problem, but anecdotal media reports of isolated abuse carried the day.
The concern was moral hazard—the fear that graduates on the verge of lucrative careers would avoid responsibility for the federal educational loans that had made those careers possible. But as the legislative history makes clear, the basis for such concerns was “more myth and media hype than reality.” A lead editorial in the July 25, 2012, edition of the Wall Street Journal reveals the enduring power of that myth thirty-five years later: “After a surge in former students declaring bankruptcy to avoid repaying their loans, Congress acted to protect lenders beginning in 1977.” That’s simply not true. Although a House of Representatives report and analysis from the General Accounting Office had confirmed that abuse was “virtually non-existent,” the provision found its way into the Bankruptcy Reform Act of 1978.
In 1990, Congress extended the requisite five-year default period, requiring a seven-year wait as a precondition to relief from educational debt. In 1997, the Bankruptcy Reform Commission found no evidence to support claims of earlier systematic abuse. Even so, in 1998 Congress amended the statute to provide that no amount of time would render federal educational debt dischargeable in bankruptcy. In 2005, Congress extended nondischargeability to private lenders as well, although, as Senator Dick Durbin asked in 2012, “How in the world did that provision get into the law? It was a mystery amendment. We can’t find out who offered it.” A fruitful place to begin the search might be with lobbyists for the banking industry.
Apart from the unwillingness of any legislator to claim responsibility for the now orphaned provision, there was little factual justification for it or for the earlier revisions that eliminated bankruptcy relief for federal loans in the first place. Nonfederal loans accounted for only 7 percent of all student borrowing in the 2010–2011 academic year. Repeated legislative inquiry yielded no empirical evidence to validate stated fears about systemic abuse for either private or government loans. But now that the limitations are in place, some have theorized that returning even to pre-2005 rules could lead to a parade of horribles, including higher interest rates for all students, reduced affordability, and tighter credit requirements throughout the system.
Two recent cases applying the undue-hardship requirement illustrate the daunting task facing a debtor who seeks relief from educational debt today. In May 2012, a sixty-three-year-old Maryland woman had more than $330,000 in school loans dating back to her enrollment at the University of Baltimore School of Law in 1992. She didn’t graduate. Later, she received a master’s degree from Towson University and a PhD from an unaccredited online school. The judge decided that the debtor’s Asperger’s syndrome qualified her for relief from her student loan debt. Expecting that she could “ever break the grip of autism and meaningfully channel her energies toward tasks that are not in some way either dictated, or circumscribed, by the demands of her disorder,” the court wrote, “would be to dream the impossible dream.” Even the debtor’s attorney expressed surprise that his client had succeeded in discharging her debt under the demanding undue-hardship standard.
In July 2012, a sixty-four-year-old woman, who had worked on an assembly line for $11 an hour until she received a layoff notice, obtained a discharge of loans she had first taken out in 1981, when she was thirty-three and enrolled in Canisius College. After pursuing a five-year partial repayment plan under Chapter 13 of the Bankruptcy Code, she’d whittled only $2,400 from her loan balance and still owed more than $56,000, most of which was accrued interest on her original $17,000 loan. The court concluded that the debtor was “at the end of her ‘rope’ at age sixty-four, facing job loss and no prospects other than Social Security,” and ordered her loans discharged.
Cases in which debtors win relief from burdensome student loan debt are unusual. In fact, the applicable legal standard for discharge isn’t even consistent across the federal circuits. Some appellate courts require judges to predict the future and conclude, as a prerequisite to discharge, that a debtor will never be able to repay the loans—that is, the “certainty of hopelessness.” One attorney described how he jokes about the absurdity of the standard: “What I say to the judge is that as long as we’ve got a lottery, there is no certainty of hopelessness. They smile, and then they rule against you.”
More attorneys are finding themselves in plights similar to that of the thirty-four-year-old lawyer with more than $200,000 in school loans and a job that would never pay enough to retire them: “It’s a noose around my neck that I see no way out of.” It takes little imagination to foresee the domino effects as she and similarly situated others become unable to fund their children’s higher education. The accumulating social costs over generations could haunt America for a long time.
* * *
As a consequence of these dynamics, some not-so-funny things happen to many of those who choose law school for the wrong reasons—or for no particularly good reason. The promise of a secure future at a well-paying job is often illusory. The persistent problem of lawyer oversupply has risen to crisis levels, and the market for new talent remains weak. Compounding the difficulties with which they began law school, newly minted, less-than-passionate, and deeply indebted lawyers are now having trouble finding the secure, well-paying, and exciting work they thought would be waiting for them when they graduated. For most of the nation’s forty-four thousand annual graduates today, those positions were never there at all.
Because students rely on rankings to choose a school, such listings are now a critical element in the prevailing law school business model. U.S. News & World Report publishes what everyone regards as the gold standard. As a consequence, deans use its methodological criteria to run their institutions. Single-minded self-interest in selling a law school education—and the failure of colleges and law schools to offer a competing perspective that challenges students’ assumptions about most lawyers’ actual lives—has disserved many graduates and damaged the profession. But try telling that to deans who pander to the annual U.S. News rankings.
Excerpted with permission from "The Lawyer Bubble: A Profession In Crisis" by Steven J. Harper. Available from Basic Books, a member of The Perseus Books Group. Copyright © 2013.