Feb. 10, 2026 - Epoch Times
05:54
Dr. Jay Bhattacharya on Ending Funding of Dangerous Gain of Function Research

Dr. Jay Bhattacharya explains the administration's policy of ending funding for dangerous gain-of-function research, distinguishing legitimate gain of function (such as engineering bacteria to produce insulin) from dangerous work like making viruses taken from southern China's caves more transmissible among humans in labs with poor biosafety. He traces the research to a roughly two-decade-old "butterfly collecting" strategy: cataloging wild pathogens, enhancing them to gauge their potential to infect humans, and stockpiling vaccines and antivirals in advance, countermeasures that may fail against the strain that actually makes the leap, since evolution is hard to predict. He calls the biodefense rationale flawed, notes the research can backfire on one's own country, and addresses concerns that it serves as cover for bioweapons ambitions despite the Bioweapons Convention, ultimately arguing it is a reckless endeavor that does not deserve support. [Automatically generated summary]


Gain of Function Research Explained 00:01:23
You know, there's a kind of commitment. This administration, your administration, came into power with an agenda to end gain of function research.
So for starters, where are we at with that?
We made a lot of progress, and the White House is still working on a formal policy.
But actually, can I back up just to set the stage for this?
So there's a few things.
So one is, what is gain of function research?
And why would anyone want to do it?
So that I think is important to understand.
And then second, how should you regulate it so that dangerous projects that have the risk of catastrophic harm to human populations never happen?
So let's do that in stages, right?
So first, gain of function research by itself is not necessarily a bad thing.
Dangerous gain of function research is necessarily bad.
It sounds like I'm making too fine a distinction, but let me just give you an example, right?
So human insulin, which is used to treat diabetes, is often produced by taking a bacterium and giving it a gene that allows it to produce insulin.
And you cook the bacteria up and it produces insulin.
That's a gain of function.
Bacteria didn't used to be able to produce insulin, but the genetic manipulation makes them able to produce insulin.
Right.
And now you can create large amounts of insulin for use inexpensively, right?
So that's completely legitimate.
Bacteria and Insulin Production 00:04:30
And we don't want to get rid of that, right?
Because diabetics depend on having the availability of insulin.
On the other hand, going into the back caves of southern China, bringing out a virus that had never previously infected any human, or maybe one or two at most, taking it into a lab in a huge city in China with poor biosafety protocols, and then manipulating it to make it more transmissible among humans, well, that's a gain of function.
That's a dangerous gain of function that should not ever be supported, should not ever be done.
So there's a distinction between gain of function and dangerous gain of function that's really important to know.
We want to make it so that there's never any support or interest in doing that kind of dangerous work ever again.
That's the policy of the administration.
The president signed an executive order, I think in April or May, where he said that that is the policy. I wholeheartedly support that policy.
I think that is a very, very wise policy.
The question is, how do you implement it?
How do you create the incentive so that this sort of research doesn't happen again?
So let me go backwards again.
Because I'm sure people that are listening are asking, why on earth would anyone support such a research program in the first place?
Bringing the viruses out of the back caves and so on.
It arose out of a utopian vision by certain scientists that we could prevent all pandemics if we were allowed to do this kind of research.
The idea, going back probably two decades, certainly a decade and a half, was that if we could go out into the wild places, capture every single virus or pathogen that's out there, and bring it into the lab, we could then test each one to see if it has some chance of infecting humans.
If they're close in evolutionary space, so they have a chance of making a leap into humans, then we should prepare in advance for all of them that are close.
But how do you tell if they're close?
Well, you do that by manipulating the viruses or pathogens to make them more pathogenic and seeing whether they infect human cells.
How much manipulation do you have to do before it infects human cells?
If it's only a little, then we should prepare for that.
If it takes a lot or it's not possible, then you can just ignore it.
It's essentially a triaging kind of operation.
First, butterfly collecting all the pathogens in the world.
Trillions and trillions, not possible to do all of them.
But you can certainly pay people to make a good start.
And then after you've identified which ones are most likely to make the leap, you prepare countermeasures in advance.
Vaccines, antivirals, and so on.
Stockpile them, even though that virus has never made a leap into humans at that stage of the project, right?
So the vaccines you prepare will never have been tested in humans.
The countermeasures you prepared will never have been tested in humans for efficacy against the pathogen.
Sounds like a great business model.
It is a great business model, or was.
But if it ever does happen to make a leap into humans, the irony is that evolution is very difficult to predict.
I know you have evolutionary biology experience.
You can tell me this firsthand.
What that means is that when the virus makes the leap, the countermeasures you prepared against an earlier version may have no efficacy whatsoever against the virus that actually makes the leap.
It's a foolish playbook, a utopian playbook.
But that was the justification for doing this dangerous gain of function.
Well, a number of scientists I've spoken with have basically believed that this explanation you just offered is more of a cover for actually doing bioweapons research.
Much of this work is dual use, that's the term of art, right?
So the U.S. is a signatory to the Bioweapons Convention.
The U.S. does not do offensive bioweapons research.
But other countries, who knows. This research, as we found during the pandemic, can backfire very, very easily; even just doing research on these pathogens can end up hurting your own country.
And the idea that you need this for biodefense, well, if the biodefense effort ends up hurting your own country also, that also makes little sense.
It's pretend biodefense, because you're producing countermeasures against a thing that, when it actually makes the leap, may have nothing to do with what you prepared for.
And so either way, it's an agenda that doesn't deserve support.