Out of Order: Substance over volume

For all of our gains in throughput and other technological achievements, it is not clear that we've seen much improvement in the number or quality of drugs being produced, and we certainly haven't made them less expensive. Maybe we need to ask some better questions about what we're doing, how we're doing it, and why.

Randall C Willis
When you meet someone who does not speak your language, the clichéd response is to talk louder to make yourself understood. Something within many of us says that if we simply pump up the volume, we can overcome the disconnect.
 
A couple of months ago, Tufts University released its latest estimate of the average cost of developing a new drug: $2.6 billion (I've seen estimates as high as $5 billion). Eleven years ago, the same group put the cost at $0.8 billion.
 
Now, every time these estimates arise, the hand-wringing begins over how the costs were calculated, which factors make sense and which are overreaching. What no one seems to argue, however, is that drugs are less expensive to develop today than they were a decade ago.
 
So what does this have to do with speaking louder?
 
The same period has seen amazing technological achievements designed to facilitate and accelerate drug discovery and development.
 
Combinatorial chemistry was heralded as a way to expand compound libraries from hundreds to hundreds of thousands. High-throughput and high-content screening, as well as miniaturization and automation, were lauded as ways to screen all of these compounds faster under the paradigm of “fail early, fail often.” And given the masses of data these technologies would churn out, the informatics revolution was supposed to convert data into knowledge and knowledge into healthcare.
 
And yet, for all of these improvements in throughput, I question whether we have seen much improvement in the number or quality of drugs being produced. We certainly haven’t made them less expensive.
 
Please understand, I don't fault the technologies themselves; they are truly marvels of engineering. Rather, I question how they are applied and what we expect of them.
 
Almost two years ago, GSK CEO Andrew Witty told a London healthcare conference: “It’s entirely achievable that we can improve the efficiency of the industry and pass that forward in terms of reduced prices.”
 
The pivotal question here, I believe, is how one defines efficiency.
 
I wonder how many people simply felt economies of scale would improve discovery, much as mass production made Henry Ford a rich man. But drugs are not cars, and while throughput and scale make sense when you have a fully characterized end product, they have their limitations during exploration.
 
When I was a protein biochemist in an NMR structural biology lab, I spent some time trying to wrap my head around two concepts: precision and accuracy. A 3-Å protein structure is very precise, but if the structure isn’t truly reflective of what happens in nature, it is meaningless. A 30-Å protein structure is much less precise, but if it is more accurate, more in tune with nature, then it is likely more useful.
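For readers who prefer numbers to analogies, here is a minimal toy simulation of that distinction. The "instruments" and values below are purely illustrative assumptions of mine, not anything from my lab days:

import random

random.seed(1)
TRUE_VALUE = 10.0  # the quantity nature actually holds

# Precise but inaccurate: readings cluster tightly around the wrong value
precise_biased = [random.gauss(12.0, 0.1) for _ in range(1000)]

# Imprecise but accurate: readings scatter widely but center on the truth
noisy_unbiased = [random.gauss(10.0, 2.0) for _ in range(1000)]

for label, readings in (("precise but biased", precise_biased),
                        ("noisy but unbiased", noisy_unbiased)):
    mean = sum(readings) / len(readings)
    print(f"{label}: mean {mean:.2f}, error vs truth {abs(mean - TRUE_VALUE):.2f}")

The tight instrument looks better on any single reading, yet every reading is wrong; the noisy one, averaged, lands on the right answer. That, in miniature, is the 30-Å structure outperforming the 3-Å one.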
 
By comparison, I wonder if our zeal to equate efficiency with throughput hasn’t improved our precision at the cost of our accuracy. If you ask the wrong question, all of the throughput in the world won’t get you closer to the right answer.
 
In researching the DDNews Special Reports over the last couple of years, I have spoken at length to several pharma and biotech specialists about this topic, and many feel that the industrialization of drug discovery and development has underwhelmed, if not outright failed. Several have suggested it is time to step back and learn to ask better questions of our technologies.
 
But let me get back to the issue of costs.
 
I know many will rightly point out that the largest expense comes from clinical trials. To address this challenge, new technologies and methodologies are being developed to get the most useful information out of the smallest patient populations.
 
Here again, however, no one segment of the drug development process stands in isolation, and I think back to the compounds reaching the clinic and question the expense of incremental improvements.
 
Oncolytics CEO Brad Thompson discussed the challenge in "Cancer in the Clinic" (DDNews, June 2014):
 
“If you could double [overall survival], you could show that in a couple of hundred patients. If you want to do a 10-percent improvement, you’re talking thousands of patients to do it to the statistical level that everybody would prefer to see. How do you run a study like that?”
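To put rough numbers behind Thompson's point, here is a back-of-the-envelope sketch using the standard Schoenfeld approximation for the number of events a two-arm survival trial needs. The significance level, the power, and the translation of "doubling" and "10 percent" into hazard ratios are my illustrative assumptions, not figures from Thompson or Oncolytics:

import math
from statistics import NormalDist

def events_needed(hazard_ratio, alpha=0.05, power=0.80):
    """Schoenfeld approximation: events for a 1:1 randomized log-rank test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_power = NormalDist().inv_cdf(power)          # desired statistical power
    return 4 * (z_alpha + z_power) ** 2 / math.log(hazard_ratio) ** 2

# Doubling median survival corresponds to a hazard ratio near 0.5 (assuming
# roughly exponential survival); a 10-percent improvement, to roughly 1/1.1.
print(round(events_needed(0.5)))      # ~65 events: a few hundred patients
print(round(events_needed(1 / 1.1)))  # ~3,456 events: thousands of patients

Roughly 65 events versus roughly 3,500: the same gulf Thompson describes between a couple of hundred patients and thousands.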
 
That is a huge difference in financial expenditure, and it raises the question of whether an efficacy improvement of just 10 percent is worth the cost.
 
From an individual patient perspective, assuredly. From a pharmacoeconomic perspective, maybe not, particularly given the growing prevalence of high-cost targeted biologics. Maybe we need to aim for bigger improvements before moving candidates forward, a decision that happens long before the clinic.
 
Again, I’m not placing blame. The history of any industry is filled with experimentation in different methodologies and technologies. Everyone involved had the best of intentions.
 
But after a couple of decades of middling results, perhaps it is time to question how and when many of these advancements are applied. Simply yelling at a higher volume doesn’t seem to be enough.
