Saturday, July 03, 2004

NY Review of Books: The Truth About the Drug Companies

"Every day Americans are subjected to a barrage of advertising by the pharmaceutical industry. Mixed in with the pitches for a particular drug—usually featuring beautiful people enjoying themselves in the great outdoors—is a more general message. Boiled down to its essentials, it is this: "Yes, prescription drugs are expensive, but that shows how valuable they are. Besides, our research and development costs are enormous, and we need to cover them somehow. As 'research-based' companies, we turn out a steady stream of innovative medicines that lengthen life, enhance its quality, and avert more expensive medical care. You are the beneficiaries of this ongoing achievement of the American free enterprise system, so be grateful, quit whining, and pay up." More prosaically, what the industry is saying is that you get what you pay for.
Is any of this true? Well, the first part certainly is. Prescription drug costs are indeed high—and rising fast. Americans now spend a staggering $200 billion a year on prescription drugs, and that figure is growing at a rate of about 12 percent a year (down from a high of 18 percent in 1999).[1] Drugs are the fastest-growing part of the health care bill—which itself is rising at an alarming rate. The increase in drug spending reflects, in almost equal parts, the facts that people are taking a lot more drugs than they used to, that those drugs are more likely to be expensive new ones instead of older, cheaper ones, and that the prices of the most heavily prescribed drugs are routinely jacked up, sometimes several times a year.
Before its patent ran out, for example, the price of Schering-Plough's top-selling allergy pill, Claritin, was raised thirteen times over five years, for a cumulative increase of more than 50 percent—over four times the rate of general inflation.[2] As a spokeswoman for one company explained, "Price increases are not uncommon in the industry and this allows us to be able to invest in R&D."[3] In 2002, the average price of the fifty drugs most used by senior citizens was nearly $1,500 for a year's supply. (Pricing varies greatly, but this refers to what the companies call the average wholesale price, which is usually pretty close to what an individual without insurance pays at the pharmacy.). . .

The election of Ronald Reagan in 1980 was perhaps the fundamental element in the rapid rise of big pharma—the collective name for the largest drug companies. With the Reagan administration came a strong pro-business shift not only in government policies but in society at large. And with the shift, the public attitude toward great wealth changed. Before then, there was something faintly disreputable about really big fortunes. You could choose to do well or you could choose to do good, but most people who had any choice in the matter thought it difficult to do both. That belief was particularly strong among scientists and other intellectuals. They could choose to live a comfortable but not luxurious life in academia, hoping to do exciting cutting-edge research, or they could "sell out" to industry and do less important but more remunerative work. Starting in the Reagan years and continuing through the 1990s, Americans changed their tune. It became not only reputable to be wealthy, but something close to virtuous. There were "winners" and there were "losers," and the winners were rich and deserved to be. The gap between the rich and poor, which had been narrowing since World War II, suddenly began to widen again, until today it is a chasm.
The pharmaceutical industry and its CEOs quickly joined the ranks of the winners as a result of a number of business-friendly government actions. I won't enumerate all of them, but two are especially important. Beginning in 1980, Congress enacted a series of laws designed to speed the translation of tax-supported basic research into useful new products—a process sometimes referred to as "technology transfer." The goal was also to improve the position of American-owned high-tech businesses in world markets.
The most important of these laws is known as the Bayh-Dole Act, after its chief sponsors, Senator Birch Bayh (D-Ind.) and Senator Robert Dole (R-Kans.). Bayh-Dole enabled universities and small businesses to patent discoveries emanating from research sponsored by the National Institutes of Health, the major distributor of tax dollars for medical research, and then to grant exclusive licenses to drug companies. Until then, taxpayer-financed discoveries were in the public domain, available to any company that wanted to use them. But now universities, where most NIH-sponsored work is carried out, can patent and license their discoveries, and charge royalties. Similar legislation permitted the NIH itself to enter into deals with drug companies that would directly transfer NIH discoveries to industry.
Bayh-Dole gave a tremendous boost to the nascent biotechnology industry, as well as to big pharma. Small biotech companies, many of them founded by university researchers to exploit their discoveries, proliferated rapidly. They now ring the major academic research institutions and often carry out the initial phases of drug development, hoping for lucrative deals with big drug companies that can market the new drugs. Usually both academic researchers and their institutions own equity in the biotechnology companies they are involved with. Thus, when a patent held by a university or a small biotech company is eventually licensed to a big drug company, all parties cash in on the public investment in research.

These laws mean that drug companies no longer have to rely on their own research for new drugs, and few of the large ones do. Increasingly, they rely on academia, small biotech startup companies, and the NIH for that.[7] At least a third of drugs marketed by the major drug companies are now licensed from universities or small biotech companies, and these tend to be the most innovative ones.[8] While Bayh-Dole was clearly a bonanza for big pharma and the biotech industry, whether its enactment was a net benefit to the public is arguable.
The Reagan years and Bayh-Dole also transformed the ethos of medical schools and teaching hospitals. These nonprofit institutions started to see themselves as "partners" of industry, and they became just as enthusiastic as any entrepreneur about the opportunities to parlay their discoveries into financial gain. Faculty researchers were encouraged to obtain patents on their work (which were assigned to their universities), and they shared in the royalties. Many medical schools and teaching hospitals set up "technology transfer" offices to help in this activity and capitalize on faculty discoveries. As the entrepreneurial spirit grew during the 1990s, medical school faculty entered into other lucrative financial arrangements with drug companies, as did their parent institutions.
One of the results has been a growing pro-industry bias in medical research—exactly where such bias doesn't belong. Faculty members who had earlier contented themselves with what was once referred to as a "threadbare but genteel" lifestyle began to ask themselves, in the words of my grandmother, "If you're so smart, why aren't you rich?" Medical schools and teaching hospitals, for their part, put more resources into searching for commercial opportunities.
Starting in 1984, with legislation known as the Hatch-Waxman Act, Congress passed another series of laws that were just as big a bonanza for the pharmaceutical industry. These laws extended monopoly rights for brand-name drugs. Exclusivity is the lifeblood of the industry because it means that no other company may sell the same drug for a set period. After exclusive marketing rights expire, copies (called generic drugs) enter the market, and the price usually falls to as little as 20 percent of what it was.[9] There are two forms of monopoly rights—patents granted by the US Patent and Trademark Office (USPTO) and exclusivity granted by the FDA. While related, they operate somewhat independently, almost as backups for each other. Hatch-Waxman, named for Senator Orrin Hatch (R-Utah) and Representative Henry Waxman (D-Calif.), was meant mainly to stimulate the foundering generic industry by short-circuiting some of the FDA requirements for bringing generic drugs to market. While successful in doing that, Hatch-Waxman also lengthened the patent life for brand-name drugs. Since then, industry lawyers have manipulated some of its provisions to extend patents far longer than the lawmakers intended.
In the 1990s, Congress enacted other laws that further increased the patent life of brand-name drugs. Drug companies now employ small armies of lawyers to milk these laws for all they're worth—and they're worth a lot. The result is that the effective patent life of brand-name drugs increased from about eight years in 1980 to about fourteen years in 2000.[10] For a blockbuster—usually defined as a drug with sales of over a billion dollars a year (like Lipitor or Celebrex or Zoloft)—those six years of additional exclusivity are golden. They can add billions of dollars to sales—enough to buy a lot of lawyers and have plenty of change left over. No wonder big pharma will do almost anything to protect exclusive marketing rights, despite the fact that doing so flies in the face of all its rhetoric about the free market.

As their profits skyrocketed during the 1980s and 1990s, so did the political power of drug companies. By 1990, the industry had assumed its present contours as a business with unprecedented control over its own fortunes. For example, if it didn't like something about the FDA, the federal agency that is supposed to regulate the industry, it could change it through direct pressure or through its friends in Congress. The top ten drug companies (which included European companies) had profits of nearly 25 percent of sales in 1990, and except for a dip at the time of President Bill Clinton's health care reform proposal, profits as a percentage of sales remained about the same for the next decade. (Of course, in absolute terms, as sales mounted, so did profits.) In 2001, the ten American drug companies in the Fortune 500 list (not quite the same as the top ten worldwide, but their profit margins are much the same) ranked far above all other American industries in average net return, whether as a percentage of sales (18.5 percent), of assets (16.3 percent), or of shareholders' equity (33.2 percent). These are astonishing margins. For comparison, the median net return for all other industries in the Fortune 500 was only 3.3 percent of sales. Commercial banking, itself no slouch as an aggressive industry with many friends in high places, was a distant second, at 13.5 percent of sales.[11]
In 2002, as the economic downturn continued, big pharma showed only a slight drop in profits—from 18.5 to 17.0 percent of sales. The most startling fact about 2002 is that the combined profits for the ten drug companies in the Fortune 500 ($35.9 billion) were more than the profits for all the other 490 businesses put together ($33.7 billion).[12] In 2003 profits of the Fortune 500 drug companies dropped to 14.3 percent of sales, still well above the median for all industries of 4.6 percent for that year. When I say this is a profitable industry, I mean really profitable. It is difficult to conceive of how awash in money big pharma is.
Drug industry expenditures for research and development, while large, were consistently far less than profits. For the top ten companies, they amounted to only 11 percent of sales in 1990, rising slightly to 14 percent in 2000. The biggest single item in the budget is neither R&D nor even profits but something usually called "marketing and administration"—a name that varies slightly from company to company. In 1990, a staggering 36 percent of sales revenues went into this category, and that proportion remained about the same for over a decade.[13] Note that this is two and a half times the expenditures for R&D.
These figures are drawn from the industry's own annual reports to the Securities and Exchange Commission (SEC) and to stockholders, but what actually goes into these categories is not at all clear, because drug companies hold that information very close to their chests. It is likely, for instance, that R&D includes many activities most people would consider marketing, but no one can know for sure. For its part, "marketing and administration" is a gigantic black box that probably includes what the industry calls "education," as well as advertising and promotion, legal costs, and executive salaries—which are whopping. According to a report by the non-profit group Families USA, the former chairman and CEO of Bristol-Myers Squibb, Charles A. Heimbold Jr., made $74,890,918 in 2001, not counting his $76,095,611 worth of unexercised stock options. The chairman of Wyeth made $40,521,011, exclusive of his $40,629,459 in stock options. And so on.[14]"
