The pharmaceutical industry, like oil companies and arms manufacturers, is not held in high regard in the public imagination.
And for good reason. There is growing awareness of an inherent conflict of interest in the testing of drugs by the companies that manufacture them — like Pfizer, Merck and Eli Lilly — and a steady stream of tales from journalists, researchers and doctors of deliberately dodgy trials, buried unfavorable results, and purchased academic journals.
Yet the greatest crime of the world’s major private pharmaceutical companies is not what they do, but what they don’t do. In the ongoing war against bugs and infection, these companies have abandoned their posts at the most critical time: when the enemy is mounting its most ferocious attack in generations. As these firms continue to shirk their duties — effectively abandoning antibiotic research for some 30 years now — senior public health officials are warning that the world could soon return to the pre-antibiotic era, a miserable, fearful time that few people alive now remember.
Market reports, medical journals, philanthropic organization analyses, government studies, and the pharmaceutical sector’s own assessments prefer a more delicate approach, attributing the danger to “insufficient market incentive.” My solution is a bit more elegant: socialization of the entire industry.
Policy options such as fresh regulation and keener oversight could moderately temper some areas of Big Pharma malfeasance, such as the massaging of research results. But in the War on Bugs, these measures are either radically insufficient or of no use. There are a handful of emergency preventative steps that hospitals and livestock farmers can take to slow the advance of the enemy, but these attempts can do no more than postpone the impending doom. Socializing drug development is the only way of solving this problem.
A Threat Akin to Climate Change
In March, the director of the US Centers for Disease Control and Prevention, Thomas Frieden, warned authorities of their “limited window of opportunity” to deal with the “nightmare” presented by the rise of a family of bacteria highly resistant to what is often our last line of antibiotic defense: the suite of drugs known as carbapenems. A few months earlier, the UK’s chief medical officer, Sally Davies, used similar language to describe a future “apocalyptic scenario” in 20 years’ time, when people will be dying from infections that are currently understood to be trivial, “because we have run out of antibiotics.”
Davies described how the phenomenon “poses a catastrophic threat” to humanity akin to that of climate change and imagined a scenario in the coming decades in which “we will find ourselves in a health system not dissimilar to the early 19th century,” where any one of us could go to the hospital for minor surgery and die from an ordinary infection that can no longer be treated. Major interventions like organ transplants, chemotherapy, hip replacements and care for premature babies would become impossible.
For generations we have grown accustomed to what are, frankly, superhuman feats of medicine, viewing them as unexceptional and permanent, when in fact they depend on our ability to prevent microbial infection. Antibiotics revolutionized healthcare: the treatment of trauma, heart attacks, strokes and other illnesses requiring extensive care with catheters, intravenous feeding and mechanical ventilation cannot proceed without access to antimicrobial drugs. As the population ages, demand for this sort of intensive care will only increase.
So what did the pre-antibiotic era look like? Pneumonia killed around 30 percent of those who contracted it; mortality from appendicitis or a ruptured bowel, absent surgery, was 100 percent. Before Alexander Fleming’s serendipitous discovery of the first antibiotic, penicillin, hospitals were filled with people who had contracted blood poisoning through cuts and scratches that developed into life-threatening infections. Amputation, or surgery to scrape out the infected tissue, was the common response. Neither is pleasant or preferred, but they were the only options left to the doctors of 19-year-old David Ricci of Seattle following his train accident in India a few years ago: Ricci suffered infections from drug-resistant bacteria that even highly toxic last-resort antibiotics could not treat.
We have forgotten how common and deadly infectious disease once was. We’ve taken antibiotics for granted, but we can hardly blame ourselves for such complacency. US Surgeon General William H. Stewart is infamous for declaring in the late 1960s that it was “time to close the book on infectious diseases and declare the war against pestilence won.” By the 1980s, cases of tuberculosis — humanity’s first known infectious disease and one of our deadliest foes, killing 1.4 million people in 2011 — had dropped to such low rates that policymakers frequently spoke of eradicating the disease.
New TB infections and mortality rates are falling, but this fragile victory is overshadowed by the rise of multi-drug-resistant (MDR) TB, a form not susceptible to the four standard antibiotics, and extensively drug-resistant (XDR) TB, strains of the disease that are not susceptible to second-line drugs either. For normal, drug-sensitive TB, the drug regimen typically lasts six months; for MDR TB, treatment takes about 20 months and involves broad-spectrum antibiotics that are much more toxic and less effective.
Carbapenem antibiotics are last-resort drugs used when all else fails. Carbapenem-resistant Enterobacteriaceae (CRE) were first identified in the US in 1996 and have since spread around the world. As of March, all but eight states have confirmed CRE cases. CRE are frightening for three reasons, as Frieden pointed out: “First, they’re resistant to all or nearly all antibiotics. Second, they have high mortality rates: they kill up to half of the people who get serious infections with them. And third, they can spread their resistance to other bacteria. So one form of bacteria, for example, carbapenem-resistant Klebsiella, can spread the genes that destroy our last antibiotics to other bacteria, such as E. coli, and make E. coli resistant to those antibiotics also.”
Some 80 percent of gonorrhea cases are now resistant to tetracycline, a frontline antibiotic, and a number of countries, including Australia, France, Japan, Norway, Sweden and the UK, are reporting cases of resistance to cephalosporin antibiotics, the last treatment option available for the STD.
Drug resistance is being reported in every sort of infectious disease. A recent survey found that 60 percent of infectious disease specialists had encountered infections that were resistant to every antibiotic.
So how did we get here? The World Health Organization categorizes antimicrobial resistance as one of the three greatest threats to human health, and a 2012 Washington Post article by Brian Vastag on the scarcity of new antibiotics put the cause most concisely: “It’s a case of evolution outrunning capitalism.”
Perpetual Arms Race
When someone takes a course of antibiotics, the drug kills off the bacteria, but there will inevitably be a small number of bacteria carrying random mutations that make them resistant. The antibiotic thus exerts what biologists call selection pressure: the hardier strains of the bugs survive and multiply, producing offspring with the same mutations. This is fine — indeed, it’s evolution, just happening at a breakneck pace. We come up with a class of antibiotics, microbes develop resistance to it, we develop new antibiotics, microbes develop resistance to those in turn, and so forth. It’s an arms race. We’ll never truly defeat microbial resistance; we can only keep pace with it by maintaining in perpetuity a steady, unceasing development of new classes of antibiotics.
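To make the dynamic concrete, here is a minimal toy simulation of selection pressure. It is a sketch only: the population size, mutation rate, and kill rate are illustrative assumptions, not empirical figures.

```python
import random

def simulate(generations=10, pop_size=100_000,
             mutation_rate=1e-5, kill_rate=0.99):
    """Toy model: a drug kills almost all susceptible bacteria each
    generation, while rare resistant mutants survive and repopulate."""
    susceptible, resistant = pop_size, 0
    for gen in range(generations):
        # A tiny fraction of susceptible cells acquire a resistance mutation.
        mutants = sum(1 for _ in range(susceptible)
                      if random.random() < mutation_rate)
        susceptible -= mutants
        resistant += mutants
        # The antibiotic kills ~99% of susceptible cells; mutants are spared.
        susceptible = int(susceptible * (1 - kill_rate))
        # Survivors multiply back toward the original carrying capacity,
        # so the resistant share of the population compounds each round.
        total = susceptible + resistant
        if total:
            susceptible = int(susceptible * pop_size / total)
            resistant = int(resistant * pop_size / total)
        print(f"generation {gen}: {resistant / pop_size:.1%} resistant")

simulate()
```

On a typical run, the resistant strain goes from a rounding error to dominating the population within a handful of simulated generations: the arms-race problem in miniature.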
But if we stop developing these antibiotics, there are tremendous public health costs.
Drug firms produced 13 different families of antibiotics between 1945 and 1968. These were, as it were, the low-hanging fruit — the easiest to develop. Since then, just two new families of antibiotics have been brought on-stream, and by the 1980s pharmaceutical companies had essentially stopped developing them.
The reason that Big Pharma has gotten out of the game is that it takes years to develop any new drug and costs between $500 million and $1 billion per agent approved by regulators; what’s more, antibiotics deliver a much lower return on investment than other types of medicines. Unlike drugs for chronic illnesses such as heart disease, which millions of people have to take for the rest of their lives — drugs that suppress symptoms but do not cure — antibiotics are usually taken for a few weeks or months at most. This makes antibiotics unfavorable for capitalism. As a ‘call to arms’ paper from the Infectious Diseases Society of America put it in 2008: “[Antibiotics] are less desirable to drug companies and venture capitalists because they are more successful than other drugs.” It is long-term therapy — not cures — that drives interest in drug development, the paper concluded.
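A back-of-envelope comparison shows the shape of that arithmetic. Every number below (prices, course lengths, patient counts) is a hypothetical round figure chosen for illustration, not real pricing data:

```python
def gross_revenue(price_per_day, days_on_drug, patients):
    """Rough gross revenue over the period patients actually take a drug."""
    return price_per_day * days_on_drug * patients

# A chronic-illness drug: taken daily for decades, suppressing symptoms.
chronic = gross_revenue(price_per_day=2.00,
                        days_on_drug=365 * 20,  # twenty years of daily doses
                        patients=1_000_000)

# An antibiotic: a short curative course; ideally it is also held in
# reserve to slow resistance, which shrinks sales even further.
antibiotic = gross_revenue(price_per_day=10.00,
                           days_on_drug=10,     # a ten-day course
                           patients=1_000_000)

print(f"chronic drug: ${chronic:,.0f}")    # $14,600,000,000
print(f"antibiotic:   ${antibiotic:,.0f}") # $100,000,000
```

On these toy assumptions, the cure earns two orders of magnitude less than the treatment, before even counting the stewardship pressure to keep any new antibiotic on the shelf as a last resort.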
Many large pharmaceutical companies have closed their antibiotic research centers; only four of the global Big Pharma 12 are still engaged in antibiotic research.[2] The loss of these teams will make the situation hard to turn around. Even if the political volition were there, it would take time to rebuild the highly skilled scientific workforce that has been lost over the past two decades as companies abandoned antibacterial drug development. As one call to arms from infectious disease researchers puts it: “[We] urge immediate, grassroots action by the medical community to attempt to address the deepening antimicrobial resistance crisis and, in particular, the need to significantly revitalise antibiotic R&D.”

Drug resistance is also accelerated when patients do not complete their full course of an antibiotic. The dismantling of public health infrastructure and social support systems makes it more likely that patients will abandon their medicines partway through the prescribed regimen, since there are fewer ways of monitoring adherence.
The antibiotic struggle is closely linked to one’s geographic position, class status and wealth. Resistant microorganisms can emerge and spread wherever poor-quality antimicrobials are used, and it does not take much to imagine situations in which people on limited incomes, or cash-strapped, austerity-bludgeoned hospitals and clinics, might turn to cheaper options. Matters are made worse by the easy and inappropriate over-the-counter dishing out of antibiotics by pharmacies, particularly in the developing world, but also in Eastern Europe and the former Soviet Union.
So what will work?
Begging and Bribing Big Pharma
In the immediate term, experts are demanding that officials shift to a war footing and ration the use of existing antibiotics. Greater surveillance, tracking networks and international coordination of efforts are imperative, as are a federal office dedicated to coordinating efforts against antimicrobial resistance and a national strategic research plan. Hospitals, clinics and nursing homes can boost infection-control precautions such as radical cleanliness, hand-washing, gowning and gloving, grouping drug-resistant patients together, and reserving certain equipment for these patients alone.
But again, these tactics can only slow the enemy’s galloping advance. They slow the rate at which drug resistance spreads, but they don’t confront the phenomenon of drug resistance itself. These efforts are important, but only because they’ll buy us time.
Fundamentally, what we need to take the fight to the microbes — to move from defense to offense — is to consistently develop new classes of antibiotics: a goal that most policymakers now recognize. But taking the job out of the private sector is not under consideration by anyone. Instead, policy proposals from the likes of the IDSA, the WHO and the European Union amount to begging and bribing the pharmaceutical companies to lift a finger.
In the US, options under consideration include tax credits for critically needed drugs and grants for priority antibiotic development; nationally funded advance-purchase commitments or other ‘promised markets’; ‘transferable priority review vouchers,’ which give another product of the company’s choosing the right to be expedited through FDA review in return for the company achieving FDA approval for a priority antibiotic; and extending patent life or market exclusivity to 25 or 30 years for new drugs that are considered truly innovative. The last option has provoked understandable controversy for its threat to generic drug production and the accessibility of cheap antibiotics in the developing world. “Wild-card patent extensions,” which award a company a patent extension of six months to two years on another drug of its choosing, are the incentive that drug companies say is most likely to get them off their asses, and they have also sparked the most controversy.
We are still allowing these firms to cherry-pick the products that make the most money for their shareholders, such as Viagra or Lipitor, while through tax bungs, grants or public-private partnerships we pay them to research and develop what makes them millions instead of billions. The public bears the risk, but the companies take the profit. If these companies were brought into the public sector, under the rubric of the National Institutes of Health or a similar stand-alone body, the money made from the profitable drugs could subsidize the research and development of less profitable but vital ones. With the industry in public hands, barriers to open pharmaceutical research would dissolve, accelerating outcomes and limiting duplication.
Finding future generations of antibiotics — assuming they’re out there to be discovered — will be devilishly difficult. But this is all the more reason to bring the sector into the public sphere: increased difficulty means increased costs, but for the same risible profit opportunities. There are completely novel strategies that avoid the antibiotic arms race altogether, but these are all highly uncertain, risky, and require years of expensive basic research, which demands heavy public intervention.
There was a time before the particularly virulent infectious disease known as neo-liberalism when Washington was much more open to direct government intervention in the sector. In time of war, our leaders did not trust the private sector to be up to the task. But we are now at war with an invisible enemy more vicious than any Nazi, and the private sector is not only unwilling; it has gone AWOL. There is overwhelming evidence that Big Pharma is an innovation desert. Meanwhile, popular suspicion of the drug firms has pushed millions into the arms of alternative medicine quackery. If all the individual-focused time and energy spent on ‘natural’ remedies were spent collectively trying to bring Big Pharma under the yoke of democratic control, we’d already be halfway there.
For too long, the most common criticism of these companies from progressive quarters has been that their profit-seeking hurts the poor of the developed and developing world, who can’t afford their drugs. This is true as far as it goes, but it doesn’t grasp the scale of the problem. The private pharmaceutical sector is a threat to public health and needs to be done away with entirely.