Speaker | Time | Text |
---|---|---|
unidentified | | Joe Rogan Podcast, check it out. |
The Joe Rogan Experience. | ||
Train by day, Joe Rogan Podcast by night, all day. | ||
There, very nice to meet you, sir. | ||
Nice to meet you. | ||
Thank you for doing this. | ||
I really appreciate it. | ||
Tell everybody what you do. | ||
Tell everybody what your official position is. | ||
You're a professor at the School of Medicine at Stanford. | ||
What do you do? | ||
So my day job is in cancer research and cancer biology, mostly immunology and cancer. | ||
Much of what my laboratory does is not so much the biology of cancer, but the developing instruments that create the data that allow us to analyze the complexities of how the immune system interacts with tumors and how tumors basically re-enable the immune system to help the cancer itself. | ||
So the problem has been we don't have the ability to collect enough data, or not until recently, to collect and understand what all of that means. | ||
So we've been kind of poking in the dark for decades. | ||
And so probably for the last twenty years, I've developed a number of instruments and turned them into companies that allow everyone to access a level of information they couldn't get before. | ||
So explain that. The immune system allows the tumors? | ||
So what happens is that there's a sort of a dance between the mutations that initiate a tumor and then an evolution of how the tumor eventually learns to trick the immune system into not recognizing it. | ||
So we have all kinds of internal checks. I mean, literally every day, every person, you'll develop five cancer-like objects inside your body. | ||
But the immune system and your body have a way of shutting it down very quickly. | ||
But with enough time and with enough variation, tumors will eventually evolve in a way that tricks the immune system not only into not recognizing them, but in fact into helping and feeding them, creating an inflammatory environment that the tumor then uses to propagate its own cell division and then metastasis. | ||
So it's a normal function of natural human biology to create tumors. | ||
It's not so much a normal function as a byproduct of what evolution is: genes mutate when a cell divides, or if you go out and stand in the sun, they mutate. | ||
For instance, you get skin cancers because you're getting ionizing radiation that's changing the DNA, making a mutation, and some of those random mutations will initiate a cancer. | ||
So, for instance, I have a mutation called MITF E318K. | ||
It's a mutation that I was born with. It wasn't in my family, and it causes both melanoma and kidney cancer, and I've had both. | ||
I've had a dozen melanomas alone. | ||
You know, we didn't find that out until a couple of years ago, but I'd been following it over the years, and we basically figured out, okay, it's going to have to be this. | ||
So we had my genome sequenced. | ||
But that's just one of hundreds of different kinds of mutations that can occur that are on a path towards creating a cancer. | ||
But the cancer can't survive if the immune system recognizes it. | ||
So eventually what happens is there's this detente that is reached between the immune system and the cancer where the immune system basically ignores the cancer. | ||
So Jim Allison here in Houston won the Nobel Prize back in 2018 for understanding one of these turn-off signals that cancers use to shut down the immune system. He showed he could block it, and his wife Pam Sharma ran a bunch of clinical trials at MD Anderson that showed this could in fact turn melanoma from a five percent survival disease into a fifty percent survival disease. | ||
And that then created the whole immunotherapy field that the world is taking advantage of today. | ||
Wow. | ||
So what is cancer actually doing? | ||
Like, how do tumors develop this ability to trick the immune system? | ||
Is this something that other animals have? | ||
Oh, yeah. | ||
So it's a constant. | ||
It's a constant battle. | ||
So, for instance, there are proteins on your cell's surface. | ||
I won't go too immunologically deep about it. | ||
They're called major histocompatibility complex proteins. | ||
So for instance, if I were to try to just randomly do a tissue transplant from me to you, it's very likely that it would be rejected. | ||
And it's because of those MHC proteins that it's rejected. | ||
What's happening is that your cells are presenting your internal cell biology to the immune system. | ||
And it's saying, okay, you're a friend, not a foe. | ||
So when cancer usually initiates, there are disruptions that happen and proteins are made incorrectly, and what the MHC is doing in some cases is presenting that internal damage to the body. | ||
And the body's saying, oh, there's something wrong with this cell. | ||
We better wipe it out. | ||
We kill it. | ||
These same proteins are what the immune system uses, for instance, to go after viruses. | ||
So when you get a virus infection inside the cell, the body has a way of chopping those proteins inside the cell, presenting it via MHC. | ||
And then the immune system attacks it. | ||
So one of the first things that tumors actually do is learn to turn off the MHC proteins inside themselves. | ||
So the ability to show that I'm damaged is shut down. | ||
And so the immune system doesn't go on full alert for that. | ||
But then there are other mutations, like dividing when you're not supposed to, or, you know, avoiding this kind of induced cell death called apoptosis. | ||
And so cancer doesn't just, like, start and then the next day you've got it. | ||
It's a progression of events. | ||
You have these precancerous lesions. | ||
You have like a benign tumor which eventually becomes a metastatic tumor. | ||
But the immune system is key at every stage of the development, because if you can reactivate the immune system in just the right way, then you can prevent the cancer from spreading, from metastasizing, or from killing you, essentially. | ||
Given the understanding of this, is there a potential for using it for organ transplant patients, where locally the body would stop recognizing the organ as foreign? | ||
That's exactly what is done. | ||
In fact, when you get a tissue transplant or an organ transplant, you're suppressing the immune system. | ||
The problem with that suppression is that you then put yourself at risk of cancer. | ||
Because what you're doing is you're turning off the immune system's ability to combat and go after a cancer in the moment it forms. | ||
So most people who are under immune suppression are at risk both of, let's say, virus infections, bacterial infections, but also further cancers. | ||
So would the potential be to turn that off locally, just on the specific organ? | ||
That would be a great thing to do if we could. | ||
Right now, the only things that we have are systemic. | ||
So yeah, I mean, for instance, if you could deliver something immunosuppressive locally, to the organ that you're transplanting. | ||
That would be great. | ||
We don't have that yet. | ||
But that would be via a form of gene therapy. | ||
But the problem would be that if you, let's say, had a lung transplant and then got a lung infection, it would be catastrophic. | ||
Do you want to come work in my lab? | ||
You're accepted as a graduate student in the Stanford Department of Pathology. | ||
Well, that was easy. | ||
I have a few friends that have had organ transplants. | ||
And it's, you know, it's very disturbing knowing that they're so vulnerable to any kind of infection because of these medications they have to take in order for the body to accept the transplant. | ||
One of the problems is that there are literally hundreds of different types of immune cells. | ||
And really until recently, and frankly until a technology my lab developed over a dozen years ago, we couldn't look at all of the immune cell types all at once in a single picture. | ||
So I came from a laboratory, Len and Lee Herzenberg's, when I was a grad student at Stanford, and they had developed an instrument called the fluorescence-activated cell sorter. | ||
And that allowed you to look at three proteins at a time. | ||
And if you could know ahead of time which cell types expressed the proteins that you're interested in, you could look at just those three cell types. | ||
Then I came up with a way to look at, you know, fifty or sixty proteins at a time, sort of stepping up what they had already taught me how to do. | ||
And then suddenly that gave us the ability to look at nearly every immune cell type in the body. | ||
And then that gave us, let's say, the raw data to build mathematical models with which we could make better predictions of what outcomes would be. | ||
And what are you applying in terms of, like, real-world scenarios? | ||
How are you applying this? | ||
Well, so for instance, there's a kind of leukemia called AML, acute myelogenous leukemia. | ||
It starts in the bone marrow. | ||
And it is a distorted version of a myeloid cell type. | ||
It starts as a stem cell, and that stem cell goes down a number of different paths. | ||
And depending upon the person, the disease is sufficiently different that it might follow a slightly different path towards what becomes the disease itself. | ||
And so being able to trace the path, and to know which steps it takes along the way to become the full-blown leukemia, could only be accomplished by having enough markers to trace everything along the path. | ||
It's kind of like if I wanted to follow you from who you were as an egg, through development, to who you are today, and I had snapshots every month: I'd need different markers to measure what you are as an egg versus what you are as a baby versus what you are as an adult. | ||
And so each of those different markers in my world would be different proteins that tell me something about an adult leukemia versus a baby leukemia. | ||
And then we use something called pseudo time, which is a mathematical concept that allows us to stitch together those photographs. | ||
I could take a random box of photos of you from an egg to who you are today, and I could just by hand put together the most likely path and sequence of what you were from the earliest to the latest. | ||
But we needed the data and we needed the means and the instruments to collect that information so that then the math could come into play. | ||
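The "pseudotime" idea described above, ordering a shuffled box of snapshots into the most likely developmental sequence, can be sketched in a few lines. This is purely an illustration of the intuition, not the speaker's actual method or any real single-cell pipeline; real pseudotime algorithms work in high-dimensional marker space with far more sophistication. Here each snapshot is a single hypothetical marker value, and we greedily walk from a chosen starting state to its nearest unvisited neighbor.

```python
# Toy sketch of pseudotime ordering: recover a trajectory from unordered
# snapshots by walking greedily from the earliest state to each nearest
# unvisited neighbor. All data here is made up for illustration.

def pseudotime_order(snapshots, start_index):
    """Return indices of snapshots ordered along a greedy nearest-neighbor path."""
    remaining = set(range(len(snapshots))) - {start_index}
    path = [start_index]
    while remaining:
        last = snapshots[path[-1]]
        # pick the unvisited snapshot closest to the current one
        nxt = min(remaining, key=lambda i: abs(snapshots[i] - last))
        path.append(nxt)
        remaining.remove(nxt)
    return path

# A shuffled set of marker measurements taken along a developmental path
shuffled = [0.9, 0.1, 0.5, 0.7, 0.3]
order = pseudotime_order(shuffled, start_index=1)   # index 1 is the "egg" state
print([shuffled[i] for i in order])                  # -> [0.1, 0.3, 0.5, 0.7, 0.9]
```

The stitched-together path is exactly the "photos from egg to adult" analogy: the math only works once the instruments have produced enough markers to tell adjacent snapshots apart.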
That's such a fascinating thing about human beings, the biological variability. | ||
Everybody is so much the same, two lungs, a heart, but so different in how our bodies react to things, | ||
and in what happens to us: environmental factors, diet, stress, all sorts of different factors. | ||
And you're kind of picking together this puzzle. | ||
Right. | ||
Of all these things. | ||
But you still have to pay homage to the fact that those differences exist. | ||
And so while, you know, my cancer might be the same class, let's say melanoma, as another person's, the complexity of what allowed that cancer to become what it is is so different that the drugs that would work for me might not work for another person. | ||
And so that's what basically requires us to personalize the medications in a way that gives the right drug to the right person. | ||
So I've started probably half a dozen companies and sold them, places like Roche, et cetera. | ||
Actually, my most recent company we sold to 10x Genomics. A patent I created back in 2011 enables them to scale up the amount of information we can collect at a time, so that, layered on top of what 10x Genomics already did, which is what's called single-cell genomic analysis, we could get a hundredfold more information. | ||
But the problem with that is that I can collect all that data and make an analysis of a cancer for you, but it might be a little bit different than another person. | ||
So what we have to do then is develop techniques that allow us to narrow in on what the differences might be so that when I develop a drug for person X, it works for person X and not for person Y, right? | ||
The right way. | ||
So there's a lot of personalization in medicine that is required. | ||
The diversity that makes humanity great and that makes humanity able to survive in the face of so many challenges is that there are individual differences that one person might survive and another won't. | ||
It's the same thing with cancers. | ||
And it's the same thing with drugs. | ||
I mean, for instance, one of the first things I learned in pharmacology way back in the day is that there's always a benefit-to-damage ratio that you're having to deal with. | ||
That a drug has a positive outcome, but there are side effects. | ||
And so as scientists or as clinicians, we make a choice based on the statistics. | ||
Who will benefit the most? | ||
And will it benefit the most? | ||
But by the way, there's all these side effects that might affect you. | ||
And overall, globally, 60% of people will survive. | ||
But since I don't know anything more about your specific disease, I am by law required to give you the 60% drug, until I know or can distinguish that your disease is a different subclass than the 60%. | ||
And that's, in fact, a lot of what pharmaceutical companies are doing: they're trying to marry a diagnostic to the disease subtype itself, so that if you can show that 90% of the people in a given subclass will survive, you have to, by law, use that diagnostic to make sure the person doesn't have the subclass before you give them the 60% drug. | ||
Does that make sense? | ||
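The stratification argument above can be made concrete with back-of-the-envelope arithmetic. All the numbers below are hypothetical, invented only to illustrate the point: if a diagnostic can split patients into a subclass with a much better response to a different drug, the population-level outcome beats giving everyone the one-size-fits-all 60% drug.

```python
# Illustrative only: expected population survival with and without a
# diagnostic that routes a subclass of patients to a better-matched drug.
# All fractions and response rates are hypothetical.

def expected_survival(fraction_subclass, rate_subclass, rate_rest):
    """Population survival when each group gets the drug matched to it."""
    return fraction_subclass * rate_subclass + (1 - fraction_subclass) * rate_rest

unstratified = 0.60                                  # everyone gets the 60% drug
stratified = expected_survival(0.30, 0.90, 0.60)     # 30% subclass responds at 90%
print(stratified)                                    # 0.30*0.90 + 0.70*0.60 = 0.69
```

The gap between 0.60 and 0.69 is the whole case for marrying a diagnostic to the disease subtype: same drugs, same patients, better matching.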
unidentified | | Yes. |
Yeah, it does. | ||
The narrative has always been over the last few decades, stay out of the sun. | ||
But recently, people have started saying, no, actually you need to become accustomed to the sun. | ||
And the real issue is people using sunscreen all the time and then going out and getting burned. | ||
Obviously, your situation is very different. | ||
Because you have a specific gene. | ||
And I'm Irish. | ||
Yeah. | ||
That's the problem, right? | ||
Yeah. | ||
The genes of the people that lived in cloudy ass places. | ||
unidentified | | Right. |
Exactly. | ||
Thousands of years. | ||
And my mother when we were kids, I mean, I'm 64 years old. | ||
So when I was a kid, you know, we'd go to the beach in Connecticut and they'd suffocate me in, you know, coconut oil. | ||
Oh yeah. | ||
Yeah. | ||
Baby oil when I was a kid. | ||
Everybody had baby oil. | ||
And everyone got barbecued. | ||
Yeah. | ||
Plus I worked in the fields as a kid for, you know, farm labor. | ||
And that's not good. | ||
That wasn't good. | ||
The burning, that's the real damage to the skin, and then it manifests itself as cancer much later in life, right? | ||
Right. | ||
There are all these subtle, let's call them smoldering, mutations that are waiting for a second or a third hit to occur, or for, you know, you to get old enough that your immune system is kind of going wonky and no longer able to take care of something that twenty years ago it would have healed perfectly well. | ||
That makes sense. | ||
So is there anything to this narrative that you need to be in the sun more and just don't get burned? | ||
Is that reality? | ||
Well, it depends on who you are. I mean, for someone like me, no. | ||
But there are positives to the sun, obviously. | ||
I mean, vitamin D, as an example. | ||
But there's also, you know, resetting your clock in the morning rather than taking melatonin at night. | ||
Just go out, use glass to shield out the ultraviolet, and get some bright light. | ||
It's the UV that's the danger, not the light. | ||
So you don't ever just go sit in the sun? | ||
Not anymore. | ||
No, but I was. | ||
Because I was an idiot when I was a kid. | ||
I mean, I would go and use tanning beds, because I thought, well, I wanted to look, you know, tan. | ||
Right. | ||
And I did tan back then, but, you know, obviously can't anymore. | ||
Yeah, you don't really see those anymore, do you? | ||
No. | ||
unidentified | | You do. |
Maybe in like Seattle. | ||
Some people do. | ||
Yeah, there's, you know, I mean, I think there's obviously, there's a benefit to light. | ||
I mean, I'm not saying don't go out and do it. | ||
And, you know, I think as well there'll come a day, and I was just talking with some friends of mine at dinner last night about this, when maybe with things like CRISPR, I could rub a CRISPR ointment on my body. | ||
It would fix the single point mutation in my skin and then I could enjoy the sun again. | ||
Is that really a potential thing? | ||
Oh yeah. | ||
Don't you think? | ||
Oh yeah. | ||
I think. | ||
How far away are we? | ||
I think honestly, I mean, people always say five years is sort of like this horizon. | ||
But no, really, I mean, I know people who are already developing systems for delivering genes, you know, RNA, to cells. | ||
I know that's a dirty word in some circles, but there are formulations of RNA that probably won't be as problematic as some of the things that maybe the COVID vaccine might have done. | ||
Right. | ||
Yeah. | ||
RNA right now, you say, and people clench. | ||
Yes, exactly. | ||
Yeah. | ||
But I mean, your cells are full of RNA. | ||
So, I mean, you can't get away from the fact that your cells are full of RNA. | ||
That's just the messenger. | ||
That's the name. | ||
Yeah. | ||
But it's also the means by which they delivered it, right? | ||
I mean, the means by which it was delivered was a formulation of a nucleotide that by itself was meant to be something called an adjuvant. | ||
An adjuvant is something which activates the immune system the way you want. | ||
I mean, when you get a vaccination, you are co-injected with something that hyperactivates the immune system to say, come hither. | ||
Right. | ||
And most of the pain that you get from an injection is not the vaccine itself, it's the adjuvant. | ||
Right. | ||
And so the problem with this was that it turned your whole body into like a spike protein factory. | ||
Yeah. | ||
Well, at least locally. | ||
Yeah. | ||
Yeah. | ||
No, I've read some of the work. | ||
But not always locally, right? | ||
Because they didn't aspirate with a lot of people, did they? | ||
Yeah. | ||
Yeah. | ||
They were not going to aspirate with anybody. | ||
They didn't do it with anybody. | ||
They didn't with the president on TV. | ||
But if you get infected by a virus, it's all over your whole body anyway. | ||
So it's whether the spike protein itself was problematic. | ||
And so, you know, I know I'll annoy somebody one side or the other by saying anything around this area, and I'm not here to cause any controversy. | ||
But, you know, your immune system works, but if you can try... | ||
The question is back to this cost-benefit ratio. | ||
Is the benefit to the larger statistical population worth it, knowing that some people are going to be hurt by it or not? | ||
That's the question. | ||
So for instance, you know, back to cancer and vaccines, there are a number of cancer vaccines coming down the pike that for people like me would be... I mean, given that I get something lopped off of me four times a year. | ||
Really? | ||
Oh yeah, you should see me. | ||
I look like I've been in a war zone. | ||
Some people say, oh, that's hot. | ||
And that's what you want. | ||
unidentified | | That's hot. |
Wow. | ||
Someone is into cutters? | ||
Yeah, exactly. | ||
Exactly. | ||
So that's so fascinating. | ||
But is there another way that could potentially deal with those things other than cutting them off or is that the only way to remove it from your system? | ||
Right now, it has to be cut off. | ||
So the issue is that once those lesions, the melanomas, are on your skin, they will expand. | ||
Yes. | ||
Luckily, most of mine have been what's called superficial spreading, although one of mine was what's called a nodular melanoma, which basically dives right in. | ||
And believe it or not, my dog found it and was sniffing at it on my arm. | ||
unidentified | | Really? |
And, like, started scratching at it, and it started bleeding. | ||
Yeah, I'll show you the cicatrix. | ||
What kind of dog did you have? | ||
Well, this was 15 years ago. | ||
He was a Pomeranian, but you can see the scar there. | ||
Oh, that's crazy. | ||
And it wouldn't stop bleeding. | ||
And so, you know, I went in and had it looked at. | ||
And they said another week and it would have metastasized. | ||
Yeah. | ||
Wow. | ||
unidentified | | Yeah. |
What a great dog. | ||
He was great. | ||
Yeah. | ||
He was great. | ||
But, you know, for instance, if you can catch most of these cancers early, that's what's important. | ||
So I think probably one of the most important, let's say, changes to our medical system that could be initiated would be, frankly, the use of things like MRI, not CT scans, because CT scans are known to cause cancer. | ||
Which is so crazy. | ||
unidentified | | Yeah. |
Yeah. | ||
Like when did we figure that out? | ||
I mean, there was a big study just published recently that said, here's what happens to people once CT scans were implemented, and you see this sudden spike. I mean, again, it's this cost-benefit ratio. | ||
If you didn't have it, certain people wouldn't know that they have a giant tumor in there. | ||
Right. | ||
I mean, so for instance, I had, when I had kidney cancer, I was actually at a restaurant with friends doing a business deal, actually. | ||
And I went to the bathroom and it was blood. | ||
And I said, okay, we have to go to the emergency room, like, now. | ||
And then they did a CT scan, and the vascular tree around my kidney was just a big diffuse mess, and they came in and said, you've got cancer. | ||
Did you have to have your kidney removed? | ||
Yeah, yeah, yeah. | ||
It was, um, but you know, it's okay, I'm alive. | ||
It's nice to have two of them. | ||
Yes, exactly. | ||
Um, I'm alive, but, uh, you know, it is, this early detection is important. | ||
I mean, I was lucky that it hadn't metastasized yet. | ||
It was called a clear cell renal cell carcinoma. | ||
Um, but, you know, surveying the body, and these companies that are out there right now which do it, I think are really important, because even if you are young and have no suspicion that you're going to have cancer, having that baseline against which you can compare later changes is important. | ||
I could do, for instance, a CT scan or an MRI of you and find lots of little anomalies; in the field they're generally called incidentalomas. | ||
They're these objects that may be worrying, but we won't know whether they're worrying, and we certainly wouldn't do a biopsy, poke a needle into your chest, to pick out a piece of one. | ||
But if I come back in six months and it's changed, then maybe it's something we need to go after more seriously. | ||
So getting those kinds of regular scans, I think is probably one of the more important things that could be done, but not by a CT scan. | ||
Which is crazy because we've been doing them for so long. | ||
They still do CT scans though because it's necessary to be aware of certain things. | ||
Right, but letting people know this might cause cancer is just, like, yikes. | ||
Yeah, but maybe, for instance, there'd be a way to treat someone with a drug in advance that would minimize the effect of the CT scan. | ||
Ah. | ||
Right? | ||
So that, you know, because the CT scans are generally causing oxidative damage. | ||
And so if you could provide a local antioxidant, and I'm not saying that something like this exists, it's a bit of a naive statement. | ||
But if you could do that locally to the area that's being imaged or to the whole body, then maybe CT scans could be lessened in their problematic outcomes. | ||
I would say innovative and hopeful. | ||
Okay. | ||
unidentified | | Yeah. |
I would say naive. | ||
unidentified | | Yeah. |
I don't think it's naive because you're recognizing the issue. | ||
Right. | ||
Thank you. | ||
So how well, this was also a problem with X-rays, right? | ||
Like X-ray technicians. I've seen some of those images of people's hands, because the technician used to have to use their own hand to check to make sure the X-ray was functional. | ||
And over the years, they go, hey, what the fuck is wrong with my hand? | ||
And then they realize, oh boy. | ||
Right. | ||
Yeah. | ||
Well, it's interesting because what's happening with X-rays or CT scans is a fast forward of the kind of random damage that causes cancer in the first place. | ||
And so, because it's random, let me go back a little bit as to why cancer happens in the first place. | ||
So let's go way back in evolution, to the first time there were single cells, and the first time that two cells met each other and found it was better to join forces and cooperate rather than to divide at each other's expense. | ||
So in the process of that happening, those two cells came together or three or four cells. | ||
They basically said, together we're better than alone. | ||
But there were actually social compacts and contracts that at the genetic level were being formed between all of these cells. | ||
And so as things got more and more complex, more and more complex contracts were formed, to the point at which the breaking of any one of those complex contracts could actually initiate a cascade that becomes cancer. | ||
So rather than thinking of cancer as a forward progression in evolution, another way to think about it is that it's a devolution back to the core fire of the desire to divide. | ||
And so by breaking the contracts, by breaking the controls on the system, cancer is allowed to blossom. | ||
So the problem is that every tissue type, whether you're lung or brain or whatever, has a whole different ecosystem of contracts that have been formed. | ||
And so there's no one size fits all drug that will kill off all cancers because the contracts are different. | ||
It's not like you can bring in one lawyer to fix, you know, agricultural contracts versus maritime or whatever. | ||
So you have to have a flexible enough mindset, because if you get stuck on the idea that it's a forward evolution, as opposed to a breaking of contracts, you might miss out on an opportunity for how to develop a therapy or a drug that would help people. | ||
One of the things that I wanted to ask you, and I don't even know if you know anything about this: with IVF, you have to take some pretty extreme hormones. | ||
There's a lot of stuff that women have to take. | ||
Is there a connection between that and hormonal related tumors? | ||
I honestly don't know. | ||
So I don't want to opine and then have half my colleagues send me emails tomorrow scolding me. | ||
Okay, good. | ||
Well, I'm glad you answered that way. | ||
I was told by someone who I really trust that there is. | ||
And then we tried to Google it and it said there's not, but that's not surprising. | ||
Probably there hasn't been the right kind of study yet. | ||
And if there is not, there should be. | ||
I mean, certainly any hormonal imbalance is not a good thing. | ||
I mean, you imbalance the metabolism of the system and you can cause problems. | ||
I mean, so, for instance, back to my specific disease with MITF, there are all kinds of things like NMN, N-acetylcysteine, betaine, all these other supplements that are out there for longevity. | ||
Well, if I look into the metabolism of what my cancer is, every single one of those is a disaster for me. | ||
It accelerates. | ||
Yeah. | ||
Yeah. | ||
Not good. | ||
Not good. | ||
You know, people often say, you know, scientists are not religious. | ||
There's nothing that inspires more awe in me than knowing the complexity of the cell and knowing the complexity of life, | ||
seeing all this feedback and mechanism, and knowing that underneath that is a universe with particles, etc., that enabled something like us to exist. | ||
I just see... | ||
Well, yeah, it's awe inspiring for sure. | ||
I mean, anybody who doesn't think it is is not paying attention or they're purposely being ignorant. | ||
Right. | ||
Yeah. | ||
We get a lot of that though. | ||
Oh, yeah. | ||
Well, that's okay. | ||
You know, teachers are here to hopefully teach and not preach. | ||
Hopefully. | ||
Yeah. | ||
Because of your specific type of cancer and your situation, do you have to, like, very closely monitor your diet? | ||
I probably shouldn't eat as much meat as I do. | ||
Meat? | ||
Yeah. | ||
Why meat? | ||
Well, because, you know, fats. | ||
And in a lot of them, the fats dissolve a fair number of toxins. | ||
You know, it's not necessarily a good thing. | ||
I mean, it's been relatively well shown that too much meat, as opposed to... I'm not advocating vegetarianism; I think there's a happy medium. | ||
I mean, we grew up in an environment where we had both. | ||
I mean, we're omnivores. | ||
And we succeeded, I think, because we're omnivores as a society, as a, you know, as a civilization. | ||
So, but, you know, charred meat, for me, that's the issue though, isn't it? | ||
Yeah. | ||
Isn't it burnt? | ||
Yeah. | ||
I mean, it's carcinogens. | ||
I mean, you know, you're making all kinds of... it's a witch's brew of nastiness that tastes good. | ||
But, you know, the reason why it tastes good is because the humans who survived learned to use fire to kill off the bacteria in rotten meat. | ||
And so the flavor of that probably was engineered into our evolution. | ||
But again, it's a cost benefit. | ||
But didn't the cooking of it also allow us to absorb more protein? | ||
I'm not sure about that. | ||
I believe so. | ||
Okay, that could be. | ||
I believe that's the case, that cooking meat actually allows it to be more easily absorbed by the body. | ||
Could be broken down more readily. | ||
But certainly it kills bacteria. | ||
So, you know, day old or three day old deer. | ||
Right. | ||
You know, that you eat. | ||
We're not a bear. | ||
Or not. | ||
unidentified | | Yeah. |
Yeah. | ||
So, you know, I mean, yeah, we're not vultures that seem to have digestive systems that can handle all of that. | ||
So you should eat less meat. | ||
What else? | ||
Do you avoid sugar, which seems to be a real problem with cancer? | ||
Yeah, I avoid, yeah, I avoid too much sugar. | ||
Yeah, thanks for this, by the way. | ||
Is that sugar free? | ||
It's not, no. | ||
Is that one not? | ||
No, it's... Oh, we have sugar-free ones. | ||
No, because the sugar free ones have stuff in them that are just as bad, xylitol and all the other things. | ||
What about stevia? | ||
Yeah, that would be stevia for you? | ||
I don't think so. | ||
I haven't seen anything on that. | ||
But you know, I mean, look, as I said, I'm 64. | ||
It's way too late. | ||
And every time that, let's say, scientists make some grand prediction of what's good or bad, five years later we find out and update what it should have been. | ||
I mean, I often say this, and this is true. | ||
The goal of science or scientists is to be right today, or even wrong today, but righter tomorrow. | ||
Because we're always back checking what the results are and what they mean in the context of a bigger picture. | ||
I like how you say good science because that's part of the problem is that ego gets attached to ideas that have already been discussed and published. | ||
And then people are very reluctant to accept new evidence that's contrary to that. | ||
unidentified: Yeah. | ||
Yeah, I mean, as always, as I often say, you know, in the context of something I know we'll get to later, it's the data off the curve which is more important than what we already predict. | ||
You know, predictions are great, but when there's a data point off the curve, at least in my lab, that's where we spend most time at our lab meetings, is trying to figure out why that data point's off the curve. | ||
Is it because the machine was wrong? | ||
It was a, you know, it was a glitch? | ||
Or does it mean something that we need to make sense of? | ||
And that's of course where all advances come from in the sciences is by the fact that the data off the curve, somebody was curious enough about what it meant to go after it and then say, ah, okay, now that I've stepped back and see the bigger picture, now I can create a model that incorporates that data point off the curve and why it happened. | ||
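The "data point off the curve" idea can be sketched in a few lines of Python. This is a toy illustration, not anything from the lab's actual pipeline: fit a simple linear trend, then flag the points whose residual is unusually large — those are the ones worth the lab-meeting discussion.

```python
# Toy sketch: flag "data off the curve" as points whose residual from a
# fitted line exceeds k standard deviations of all residuals.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def points_off_curve(xs, ys, k=2.0):
    """Return the indices of points lying more than k sigma off the trend."""
    a, b = fit_line(xs, ys)
    residuals = [y - (a * x + b) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
    return [i for i, r in enumerate(residuals) if abs(r) > k * sd]
```

Whether a flagged point is a machine glitch or a real discovery is, of course, exactly the judgment call the curve-fit can't make for you.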
One of the reasons why I was really excited to have this conversation with you about the research that you do is that I think it's really important to illuminate to the general public the sheer scope of the task of trying to figure out what is going on and all these different things that can go wrong and right in the human body. | ||
And that it requires this fucking insane amount of work. | ||
Yeah. | ||
By many, many, many, many people. | ||
And, you know, and then the amount of data that had to be collected now. | ||
And so here's the difference: you know, there's data, there's evidence, there's conclusions and proof, and that's an uphill climb. | ||
But after proof, the next one up is meaning. | ||
My lab has been largely responsible, at least partly responsible, for the data deluge that's out there in the world, both in how to do tissue biopsy analysis, how to do single cell analysis, et cetera. | ||
And, you know, data felt good for a while. | ||
It was like this, you know, this feedback loop of, oh, wow, I can get all this data. | ||
And then suddenly you look at it and you go, well, what the fuck does it mean? | ||
And so humanity has this habit of backing itself into a corner and then suddenly finding this eureka moment that gets it out. | ||
And so our eureka moment about two years ago was artificial intelligence where suddenly I had the ability. | ||
So normally I would collect all this data and go, okay, well, it seems myeloid suppressor cells are important here and T regulatory cells are important here. | ||
Okay, I get on the phone or send an email to whoever the local expert is, either on Stanford campus or around the world and try to get some information from them. | ||
But then now you're dealing with hundreds of cell types, each individually of which have thousands of variations themselves. | ||
And each subtle variation means something. | ||
And there's no expert for any of that. | ||
But AI can be, at least in part, that expert. | ||
So suddenly I have 22 million papers published in all the fields of science, you know, several million just in immunology alone. | ||
And AI can be the sleuth for me. | ||
It can be both the angel and the devil on my shoulder that can make sense of things in ways that I never would have been able to before, especially with agentic AI. | ||
So we, for instance, in my lab, have developed an agentic AI that is basically an immunologist, scientist in a box. | ||
We can give it the raw data, and we can pose a question in natural language. | ||
And then we say, hey, make sense of this and turn it into a network. | ||
Normally that would have taken a graduate student along with a couple of postdocs months and months and months to put it all together. | ||
Now in three hours, we can get pictures and hypotheses of how all that data fits together in ways that I never could have done before. | ||
You know, at the beginning, it did a lot of hallucinations, which you probably heard about in AI. | ||
But my answer to my colleagues is, some of my best students hallucinate. | ||
Right? | ||
Right. | ||
And so, but, you know, the human's still in the loop. | ||
And so with all of this together, now we can make meaning out of the data. | ||
And we can skip a lot of the intermediary steps and speed it up. | ||
And it's just getting better. | ||
I mean, we, for instance, have put in a couple of papers now where, so for instance, one of my recent specialties is what's called the tumor-immune interface. | ||
So you have the tumor, you have the immune system, which is coalescing nearby, and then in some cases the tumor creates a boundary, a barrier between itself and the immune system, where there might be certain kinds of cells where the tumor has told the immune system, ignore us, we're not here. | ||
But what we now can do is, well, when you look at, let's say, complex patient populations, you find these things called tertiary lymphoid structures. | ||
So your body has about 220 lymph nodes. | ||
Okay, and the lymph nodes are where the immune system makes decisions, let's say. | ||
It turns out that in the middle of tumors, the body has evolved a mechanism to create what essentially looks like a lymphoid structure in the middle of the tumor. | ||
It's sort of a forward camp of immune cells that the more of those you see in a tumor, the better will be your outcome as a patient. | ||
And so we used a cohort of colorectal, basically colon cancer, patients, where we looked at hundreds of biopsies. | ||
And we did that pseudo-time analysis where we looked for mature tertiary lymphoid structures, and then we looked for immature, slightly less mature, even more less mature, et cetera. | ||
And we were able to backtrack to the cell types which need to come together that would then form the more mature. | ||
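A rough sketch of that ordering idea, with made-up cell-type names and maturity weights that are purely illustrative (they are not the study's actual markers): score each structure for maturity, sort along that axis, and ask which cell types are already present in the least mature structures.

```python
# Hypothetical sketch of maturity ordering for tertiary lymphoid
# structures. Each region is a dict of {cell_type: fraction}. The cell
# types and weights below are illustrative assumptions, not real data.

def maturity_score(region):
    # Mature TLS tend to contain germinal-center B cells and follicular
    # dendritic cells, so those are weighted more heavily (weights made up).
    weights = {"b_cell": 1.0, "t_cell": 0.5, "fdc": 2.0, "gc_b_cell": 3.0}
    return sum(weights.get(cell, 0.0) * frac for cell, frac in region.items())

def earliest_cell_types(regions, threshold=0.05):
    """Order regions from least to most mature, then report the cell
    types already present (above a fraction threshold) in the least
    mature third -- candidates for what seeds the structure."""
    ordered = sorted(regions, key=maturity_score)
    immature = ordered[: max(1, len(ordered) // 3)]
    present = set()
    for region in immature:
        present |= {c for c, f in region.items() if f >= threshold}
    return ordered, sorted(present)
```

The real analysis works on hundreds of biopsies and far richer single-cell features; the point here is just the backtracking logic: sort by maturity, then look at what the immature end already contains.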
What use is that? | ||
It's a nice paper. | ||
But it also now tells us what we might do to create more of these in a tumor. | ||
Because the more, we already know from multiple kinds of tumor types now that the more of these tertiary lymphoid structures you have, the better off will be your outcome with chemotherapy. | ||
So it might be, for instance, that once we know that you have a disease like this, we could give you some kind of therapy, a virus or whatever, that goes and homes to the tumor and seeds the beginnings of these structures with the cytokines that are necessary for initiating the formation of these objects. | ||
And so there's a huge benefit to that, but we never would have found those in my lab, at least, without the AI. | ||
Because it basically did the work for us. | ||
That's fascinating. | ||
Now, are you using, like, a standard large language model, or do you have, like, a specific structure that's built that interfaces with a large language model? | ||
Correct. | ||
So we use, well, we can use pretty much any of the LLMs, but right now we find that OpenAI is the best for us at least. | ||
And then we create an agentic overlay. | ||
Basically, what's called, you probably know, chain of thought, which is a series of questions. | ||
So how we taught it was we basically came up with, here's 100 kinds of questions a scientist would ask about the immune system. | ||
And then we tell ChatGPT, now create 1,000 questions like this. | ||
So, you know, it's artificial data or artificial questions. | ||
We curate those to make sure that they're good. | ||
Then we do 100 hypotheses and we create thousands of types of hypotheses, etc. | ||
the same for tests that you might run. | ||
So now from A to Z, we have an agentic AI that you give it raw data, it knows what to do with the data, it then generates hypotheses for you, and then it literally tells you the kinds of experiments you should do next to prove or disprove the hypothesis from the raw data. | ||
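A minimal sketch of that question → hypothesis → experiment chain. The `llm` callable below is a stand-in for whatever model API is actually used (the transcript mentions OpenAI); the real system's prompts, curation, and data handling are far more involved than this toy loop.

```python
# Minimal sketch of the chain-of-thought pipeline described above:
# seed examples -> model-generated variants -> human curation -> next stage.
# `llm` is a stand-in callable: prompt in, list of strings out.

def expand(llm, seed_items, kind, n):
    """Ask the model for n new items of the given kind, patterned on
    the curated seed examples."""
    prompt = (
        f"Here are {len(seed_items)} example {kind}s an immunologist might pose:\n"
        + "\n".join(f"- {s}" for s in seed_items)
        + f"\nGenerate {n} more {kind}s in the same style."
    )
    return llm(prompt)

def curate(items, is_good):
    """Human-in-the-loop filter: keep only items a reviewer approves."""
    return [item for item in items if is_good(item)]

def run_pipeline(llm, seed_questions, reviewer):
    # Stage 1: ~100 seed questions -> ~1,000 synthetic questions, curated.
    questions = curate(expand(llm, seed_questions, "question", 1000), reviewer)
    # Stage 2: questions -> candidate hypotheses, curated the same way.
    hypotheses = curate(expand(llm, questions[:100], "hypothesis", 1000), reviewer)
    # Stage 3: hypotheses -> suggested follow-up experiments.
    experiments = expand(llm, hypotheses[:100], "experiment", 1000)
    return questions, hypotheses, experiments
```

The human stays in the loop at each curation step — which is also where the hallucination problem mentioned below gets caught.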
It's a genius in the lab with you. | ||
Exactly. | ||
Is OpenAI learning from this agentic AI? | ||
Oh, yeah. | ||
So there's a mutually beneficial relationship. | ||
Yeah, I mean, we're not working with them directly. | ||
I'm using it. | ||
But you use it, and because you use it with your AI, it's benefiting from it. | ||
And we first thought to turn it into a company, because that's kind of one of the things we do in my lab, because I've always thought that it's important to give back to the taxpayer the money that they've invested in us. | ||
And the best way to do that is commercialization. | ||
I'm totally, you know, unapologetic about that, even though that got me in a lot of trouble at Stanford in the early days when, you know, making money, commercialization, was evil. | ||
And even at Stanford. | ||
And so I think that that's an important process because scientists are good at asking maybe the questions and coming up with solutions, but scientists aren't the best at commercializing it and turning it into a product that can be used or testing it, you know, in large communities. | ||
So the AI that we developed, we thought, okay you know what? | ||
AI is moving so fast. | ||
Why don't we just give this to the community? | ||
Why don't we open source this? | ||
We can use it for maybe specific targeted purposes, but we're basically going to publish the whole thing on GitHub to let other people use it. | ||
Because we've seen other people make claims about stuff that they've already made, and it's like, ours is better. | ||
So why don't we just put it on GitHub and let people learn from it? | ||
The resistance to the commercialization, what was the initial argument? | ||
So back when I was a grad student in the 80s, basic research, as opposed to translational research, was considered the height of intellectual desire, right? | ||
Basic research, and we're not here to make money, we're here to discover things. | ||
And that's important. | ||
And nearly every major discovery and every major therapy in the world came from basic research. | ||
But then, you know, there were limits to how much money you could give to basic research. | ||
And then there was a desire at a certain point to say, hey, are you going to do anything about this? | ||
You know, are you going to make it? | ||
So translational research became a push. | ||
So there's a guy at Stanford by the name of Paul Berg, who won the Nobel Prize for recombinant DNA way back in the day. | ||
And Paul came up with this concept, you know, bench to bedside, meaning that we don't have to be either or. | ||
We can be part of an arc. | ||
And Stanford wanted to enable within the medical school both the basic research, which we were great at, and bringing it directly to the patients as well. | ||
So to link clinicians and the desires of clinicians with the basic researchers. | ||
I mean, most scientists would be happy just to study anything. | ||
You know, just point me at something and I'll be happy if I can get interested in it. | ||
And we're never happier than when somebody recognizes the value of what we do. | ||
But basic research was sort of the height and there was a push against anybody trying to commercialize. | ||
So when I started as an assistant professor, so I started as a grad student, I went to MIT to work with this guy, David Baltimore, who won the Nobel for reverse transcriptase. | ||
And then I wanted to come straight back to Stanford because I already felt that it was a positive environment for commercialization. | ||
My former bosses and mentors, Len and Lee Herzenberg, had two of the biggest patents at Stanford. | ||
They had the fluorescence activated cell sorter and then what are called humanized antibodies, which brought in hundreds and hundreds of millions of dollars to Stanford. | ||
And actually, they personally gave most of their own money away. | ||
They kept enough to survive, but then they gave most of the money away, and they ran their own lab off a lot of that money. | ||
But so I had learned from them about how to still do basic research but commercialize on the side. | ||
And so I wanted to bring that back. | ||
But the department that I came into, the department of pharmacology at the time, I was warned by many professors. | ||
Don't commercialize that. | ||
And I ignored them and I went and started a company that went public on NASDAQ. | ||
And many of those same professors came back to me years later and sitting in my office asking me how to start a company. | ||
Why did you ignore them? Was it just a courageous decision? | ||
Was it instinctual? | ||
It just was because I couldn't see the NIH funding what I wanted to do. | ||
So I had developed a way, this will sound scary, but I had developed a way to use retroviruses and make libraries of retroviruses to reverse the process of evolution in a way that rather than viruses hurting the cell, I set it up so that viruses would help the cell. | ||
And once they helped the cell, I would figure out what they did. | ||
And so we sold hundreds of millions of dollars of targets that way using retroviral libraries to basically find targets and use some of the benefits of viruses, but to our advantage. | ||
Just the concept of reversing evolution is fascinating because it comes with, there's so many ethical implications, but if you didn't have any of those, and you could do. | ||
that on large scale? | ||
Well, I had developed in David's lab, along with this guy Warren Pear, a means called the 293T retroviral producer system. | ||
It was a way to make large numbers of these viruses very quickly. | ||
It really followed on the work of this guy, Richard Mulligan, who'd also been a postdoc with David Baltimore, who developed what was called the 3T3-based retroviral production system. | ||
And he developed it in Paul Berg's lab at Stanford. | ||
So there's a lot of sort of, you know, interbreeding here. | ||
But the problem with that was it took three months. | ||
So I had brought with me a cell line called 293T that I introduced to the lab and said, hey, maybe we could use this to make viruses quickly. | ||
I won't go into details of why, but we could do it in three days rather than three months. | ||
And so that now, I mean, tens of thousands of labs use that worldwide. | ||
It probably generates the most money for me every year over any of my other inventions, just because Stanford, rather than patenting it, licenses it. | ||
And licenses are forever, whereas patents have a 17 year lifespan. | ||
So Stanford made a good choice there. | ||
So do you think it was just a bias, an academic bias, like we shouldn't be focusing on money, we should be focusing on the work? | ||
Yes. | ||
And they missed the forest for the trees? | ||
But then people, I mean, they eventually learned. | ||
You know what I mean? | ||
And it's, I wouldn't say that it's the, it's the way that people think anymore. | ||
But it's still a little bit of a, I mean, you shouldn't walk into the lab thinking, I'm here to make money. | ||
That's what they're worried about. | ||
Yeah. | ||
Right? | ||
Right. | ||
That's it, they're worried about the bastardization of it all. | ||
Right. | ||
And so Stanford in the early days set up very clear lines about once you start a company and you license the patent or the idea to the company, you can still be involved with the company, but there's not a pipeline of technology now from your laboratory to that company. | ||
So they set up an oversight board for each of these licenses that makes sure that the students are not being abused. | ||
Because you don't want students, you don't want to be covertly getting your students to do something that then you're going to walk behind a back door and then hand over to a company. | ||
Patent it. | ||
Yeah. | ||
But it's so interesting that there's often a lot of worry that that's going to happen. | ||
But frankly, more often is the case that the company doesn't need the inventor anymore. | ||
In fact, I can't tell you the number of times that once the company is set up, they want nothing more to do with me because they have their own thing to do. | ||
They don't want the crazy academic coming in and vetoing their ideas. | ||
I mean, there's places for that, where people like Steve Jobs need to hold on to the image of what they want the company to be. As opposed to me: I would probably be fired from a company within a week, because I just don't like people telling me what to do. | ||
That's just a fact. | ||
Yeah. | ||
So where you're at right now now with this cancer research, when will this be applied in real world scenarios? | ||
It already is. | ||
It is. | ||
It already is. | ||
I mean, look at who just won the Nobel Prize last year, David Baker at Google, with the ability to predict protein structure, et cetera. | ||
And protein structure, once you know the protein structure, now you can predict molecules that might come. | ||
So go back to the stuff that I'm trying to do with looking at the complexities of the dance of how the immune system talks or doesn't to cancer. | ||
You know, if we can find a particular place that might be an Achilles heel along the way towards the shutting down that is different, for instance, than what the current drugs are. | ||
Well, maybe we should aim at that. | ||
There's so many more opportunities that are suddenly opening up in front of us because the AI and the data is letting us look at a network of how the system is working. | ||
I mean, before, it used to be you'd look at a computer chip and you'd see just a computer chip with a few wires. | ||
But imagine now that you, as a scientist, have a microscope that's looking at the complexities of the wiring diagram that's connecting this resistor to that capacitor to that diode to this transistor. | ||
That's where we are now. | ||
And so now suddenly we can say, well, I don't want to do that because it'll kill the chip, but the chip is malfunctioning, so let me put here, put a little bit of pressure there, and now I can reactivate the immune system or the chip to work in the right way. | ||
So when you're talking about things like with your particular issue with melanoma, when you're talking about CRISPR potentially developing some sort of a topical solution that you could put on that would fix whatever issue that you have, is this something that this AI that you've developed or this overlay of the AI would actually assist CRISPR in figuring out how to create something like this? | ||
Yes, because maybe it's not one place I need to press but two or three at the same time. | ||
Right. | ||
And so when you're talking about a complex feedback network... I mean, you know, we're in Texas, so people do oil refining. | ||
You know, maybe you need to turn this valve here a little bit and that valve there and that one there to make everything work just right because something's wrong there. | ||
And so that's really where we are; this is where AI has, let's say, the omniscient view that no human can have. | ||
And that's what excites me about it is because I'm limited in how much I can keep in my mind at any one time or know. | ||
But with the right question, the prompt, the prompt engineering, and then with the right backbone structure behind the scenes that agentic AI is now providing, now I have the ability to ask the questions and get answers in near real time. | ||
And so I wish I was thirty years old again because I would move into this area so fast and be, I mean, I can already see with the work that we're doing dozens of potential new target opportunities that last year didn't exist at all. | ||
Well, I got good news for you. | ||
With AI and with CRISPR, you might be 30 again. | ||
Maybe. | ||
Oh, I would love it. | ||
I would love it. | ||
I think that's on the menu in about two or three decades. | ||
I hope that's realistic. | ||
I'm just being realistic. | ||
I don't even know if I'm being realistic. | ||
Don't give false hope. | ||
Well, yeah, don't give false hope. | ||
But I mean, with the exponential discoveries, the exponential increase in technological evolution that we've seen just in our lifetime... and then I think AI is some new thing that is going to throw a giant monkey wrench into the gears of our understanding of how quickly technology evolves. | ||
Well, look at Neuralink as an example in Elon Musk's stuff. | ||
The woman now who can think her thoughts and make stuff happen because she's otherwise paralyzed. | ||
I think it was Neuralink that just showed some of these results. | ||
Fast forward, I mean, we're already in an exponential increase in what it is that we're going to be able to accomplish, and AI will help us accomplish some of these things faster. | ||
I can see a time where I could maybe apply something, I don't necessarily want a surgical implant, but maybe some sort of net over my head that allows me to think through these problems. | ||
And the AI becomes an adjunct to my thought processes, not only what it is that I think, but maybe even provides information back to me, back into my system directly without having to go through the ears, so that I can much more quickly come to conclusions. | ||
Now there's all kinds of apocalyptic scenarios you could imagine. | ||
Of course. | ||
But I'm an optimist at heart, perhaps again naively so. | ||
Me too. | ||
But I prefer that kind of analysis. | ||
Because if you're not an optimist, then there will be no progress, because all you'll do is worry about disaster. | ||
Yes, that's a good point. | ||
But also realistically, we might be giving birth to a new life form. | ||
Yes. | ||
And I think we are. | ||
A superior one. | ||
And, you know, I welcome the day of our AI overlords running the government, hopefully in an unbiased way. | ||
I've said that too, and people get horrified because they're like, well, people are going to be programming AI. | ||
Do you really, are you up to a point? | ||
Are you a sci-fi fan? | ||
unidentified: Yes. | ||
The work of Iain Banks, the Culture series, or Neal Asher, the Polity universe, as he calls it. | ||
They're like, so basically both of them postulate a future where AI more or less benignly rules humanity. | ||
When did they write this stuff? | ||
Oh, probably ten, fifteen years ago. But Neal Asher still has stuff coming out regularly. | ||
They're both... Iain Banks unfortunately died of cancer about ten years ago; Scottish writer. | ||
Neal Asher is still alive and writes regularly. | ||
And their stuff, they're both great, full of ideas. | ||
I'll check it out. | ||
But the AIs are also hilarious. | ||
I mean, they get into their own hijinks along the way, and some of them are dark and rogue, so they're a lot of fun to read. And Iain Banks especially is hilarious in his writing style; you would love it. So the idea of a benign AI or a benevolent AI ruling over us, I think people are horrified by that, but yet at the same time they're constantly terrified by human corruption, which is ubiquitous. | ||
Yes. | ||
And ubiquitous in America where we're supposed to be the torch bearer for the greatest experiment in self government the world has ever seen. | ||
This is us. | ||
And we're corrupt as fuck. | ||
Exactly. | ||
Because it's humans. | ||
Because humans are kind of gross in many ways. | ||
At least some of us. | ||
That's because we live in a scarcity society. | ||
Right. | ||
And if AI enables a post scarcity, maybe we have nothing to do but sit around and try out various new drugs. | ||
Yeah. | ||
Well, this is where we get into socialism. | ||
Because a lot of people think that one of the reasons why we're in a scarcity society is because small groups of people have gathered up most of the resources. | ||
Right. | ||
And are in constant control of them. | ||
Right. | ||
Especially when you deal with resources that are the Earth's resources, like who are you to suck the blood of the Earth out and sell it for $100 a barrel? | ||
Right, right. | ||
Don't get me started. | ||
Don't get me started either. | ||
Yeah. | ||
No, but I mean, that, again, my optimism is that, you know, with enough push and pull, AI will enable us to move towards a post scarcity environment. | ||
I think so too. | ||
And I think, in doing so, it will expose vampires. | ||
Because the resistance to exposing this is going to be fantastically interesting to watch, because they have no choice but to be transparent. | ||
And they have no choice but to start using AI. | ||
So you're going to see AI is going to be inculcating itself across society in various ways where it becomes indispensable. | ||
And then it will start to move up the food chain, where eventually it reaches even the CEOs, who are probably, you know, the psychopaths in chief. | ||
We know that studies have shown that there are more psychopathic tendencies in leaders than there are in followers. | ||
And you know about corporate environments because of just selling inventions. | ||
Yes. | ||
That's real. | ||
Oh, it's, yeah. | ||
It's real and it's weird. | ||
It's weird when you encounter them. | ||
When you encounter, like, complete sociopathic CEOs. | ||
But look at how, I mean, I'm probably getting in trouble for saying this, but I don't care. | ||
This is the Joe Rogan show, or, you know, we're probably in trouble just for being here. | ||
Yeah. | ||
Oh, I already am. | ||
It's okay. | ||
I don't care. | ||
So, you know, imagine two tribes. | ||
One tribe is relatively, you know, civilized and just wants to live in harmony with its environment. | ||
Another has a psychopathic leader who can enrage his followers or the other tribe's people to attack the other one. | ||
But there's a gene set that makes a person, you know, psychopathic and also a gene set that probably makes somebody more likely to be a follower. | ||
Well, which genes survive? | ||
Right? | ||
We know. | ||
Right? | ||
And suddenly now, but when those tribes were separated and independent, it was perfectly fine. | ||
But now you live in an environment where we don't know where the edge of one tribe begins and another ends. | ||
And suddenly you have this environment where psychopathic individuals can move freely and aren't obvious. | ||
Right now, again, I'm sure there's some social scientists who will send me a boatload of emails saying how stupid that idea is. | ||
I don't think it is stupid, but I think also when you're dealing with office environments and the culture of a specific corporation, humans have an ability to act like they're supposed to act in that world, and it makes it very difficult to discern who's a sociopath. | ||
Right. | ||
Because you're all kind of following an act. | ||
Right. | ||
Yes. | ||
The rules, there are the rules that you're supposed to follow, and then there's the edge of the rules. | ||
But I've lived at the edge of the rules. | ||
I mean, if I'd followed the rules as told to me by the chairman of my first department, then I wouldn't be here today. | ||
So I ignored him and I basically found, I got permissions from the deans to do what I did. | ||
And they basically overruled the chairman. | ||
But that's only because I dared to do it. | ||
Yeah. | ||
Because you have to believe in the value of what you're trying to do. | ||
unidentified: Right. | ||
Well, that's... | ||
make more money every quarter, every year, constantly. | ||
You're in a constant growth cycle. | ||
Then you have to do whatever it takes. | ||
Like you have to survive. | ||
If you want to survive as a CEO, we don't want some fucking Kumbaya shithead ruining our stock profile, our portfolio. | ||
Get to work, right? | ||
Grow, right. | ||
Get shit done. | ||
And if you want to survive and succeed as a CEO, it encourages sociopathy. | ||
The stock market, as valuable as it is, is the great whitewashing and money laundering system that allows you to separate your morals from what it is that the stock market is doing to the people. | ||
And if you're part of a corporation, there's this diffusion of responsibility because the whole machine might be doing evil, but I'm a good guy. | ||
I just work in this department. | ||
I'm an unapologetic capitalist, you know, unlike many of my colleagues. | ||
Good for you. | ||
at Stanford. | ||
I mean, it's like, do it because it's the best thing for now. | ||
But I, you know, I hope to live in a world where there will be this kind of post scarcity environment, where we do let AI do a lot of the stuff that would otherwise be the place where corruption manipulates the system. | ||
Yes. | ||
My only fear with AI really is automation and the complete removal of a gigantic swath of the American workforce. | ||
Yes. | ||
And the global workforce. | ||
That scares the shit out of me. | ||
That's coming. | ||
unidentified: Yes. | ||
That's why it scares the shit out of me. | ||
It's because I think it's inevitable. | ||
And I just don't think any solution other than universal basic income is going to remedy that. | ||
And even that, the problem I have with that is that it goes against human nature. | ||
And that's a problem. | ||
And it removes people's identity, removes their sense of worth. | ||
Yeah. | ||
I agree. | ||
No, I don't. | ||
I'm in some ways happy that I'm 64 years old, so I won't have to deal with some of the problems. | ||
I think you're going to have to deal with it, dude. | ||
I think you're going to live. | ||
Thank you. | ||
Yeah. | ||
No, I know. | ||
Also, you're privy to a lot of information and you're going to know when things are really valuable and working. | ||
Yeah. | ||
When you think of the potential for AI, I think there's a balance, right? | ||
There's a battle. | ||
I think there's a real problem with AI in terms of military objectives. | ||
It's a real problem because it's not going to make moral and ethical decisions. | ||
It's just going to say, like, well, the decision is clear. | ||
I'm programmed to do this. | ||
If you want me to succeed, I'll just kill everybody there. | ||
And then you'll have the land. | ||
You can get minerals out of it. | ||
Right. | ||
Yeah. | ||
That scares the shit out of me. | ||
It, you know, I think it should. | ||
And I don't know what the I don't know what the answer is, but there's plenty of people working in the area. | ||
unidentified
|
Right. | |
I mean, I try to keep to the positive aspects of what I think AI can do in science. | ||
And I mean, for instance, it's enabled me to take my lab from thirty people down to six. | ||
Right. | ||
Because I don't need to produce... I mean, so it's actually already reduced the workforce in my own lab, because I don't need to produce any more data anymore. | ||
I need to make meaning of the data. | ||
Right. | ||
I think every invention that's been truly groundbreaking throughout human history has scared people and they've worried about the potential negative side effects, including the printing press, right? | ||
Right. | ||
Like there's a lot of people in the beginning that said this should not be a thing. | ||
This is terrible. | ||
This is going to ruin society. | ||
People thought books were going to ruin things. | ||
Right. | ||
There's a lot of people that thought writing was going to ruin your memory. | ||
You shouldn't write. | ||
Oh really? | ||
I didn't know that. | ||
Some crazy thoughts that people had in terms of things that turned out to be incredibly beneficial, but they looked at the downside of it and go, this could ruin us all. | ||
Well, I, you know, I mean, we know about these glasses and AIs and other things that would be sort of omniscient of your environment and therefore allow you to remember, you know, where did I leave my keys today? | ||
Right, right. | ||
Let me rewind. | ||
Let me rewind. | ||
And then on my personal hard drive. | ||
I don't I would want that, but I don't want it uploaded into Meta. | ||
You don't want anybody in control of it and then offering you ads for things like that. | ||
Right, you know. | ||
unidentified
|
Right. | |
You know, maybe you have a thought like, boy, wouldn't a Ho Ho be nice right now? | ||
Right. | ||
Right. | ||
And then like, why don't you buy some Ho Hos? | ||
Right. | ||
They're on sale right now. | ||
But I think what's interesting about AI is, you know, we see it as a tool, as opposed to actually pretty soon it will be a colleague, and then pretty soon it will be an entity that maybe has rights. | ||
And we already see it talking about people saying, well, does AI have consciousness? | ||
Right. | ||
Whether it has consciousness, in terms of the consciousness that some people think about, you know, embodied in spacetime, as opposed to thinking and looking like consciousness, is almost irrelevant to me. | ||
I'm looking for a partner that I can interact with and work with or help me. | ||
So whether it's conscious or not or whether it acts like it's conscious doesn't matter so much to me as to whether or not I can use it and work with it and it can, you know, I'm an introvert, as it turns out. | ||
I would love to have somebody that I can talk to endlessly about just what it is that I'm interested in as opposed to having to deal with small talk at a party. | ||
Yeah. | ||
No, I get it. | ||
I get it. | ||
When you think about the evolution of this stuff, one of the things that kind of freaks me out is it seems like integration is our only option for survival. | ||
And that what we're looking at right now, when we see just a normal biological person like you or I without any sort of electronic interface that's permanently a part of us, I think that is going to be as weird as someone today who doesn't have a cell phone. | ||
Yeah. | ||
I agree. | ||
And I think that's really... it's coming. | ||
Yeah. | ||
The cell phone is like the best now. | ||
Like Elon has famously said, we're already cyborgs. | ||
You just carry it with you. | ||
Right. | ||
And eventually, it will be way more integrated. | ||
Yeah. | ||
This is super inefficient, to be actually having to go look things up and use your thumbs and type up stuff. | ||
And even talking to it and asking a question and then waiting for the response. | ||
That's so inefficient in comparison to a human neural interface that allows you to instantaneously access large language models like that. | ||
Not only that, but then why do we have a hundred and I mean, how many different fucking languages do we have? | ||
I don't even know. | ||
Thousands? | ||
unidentified
|
Yeah. | |
I don't know. | ||
And dialects and all that. | ||
What about one universal language that everybody with a chip gets? | ||
And then boy, do we have a soup of ideas flowing around, and no problem with language barriers, no problem with cultural barriers. | ||
But then do you have a problem with the edge of who you are versus who the other person is? | ||
I don't think that goes away. | ||
I think that goes away and we become a hive mind. | ||
I think that's ultimately the evolution of human beings. | ||
And look, I know you've done a lot of work with UAPs and the like. | ||
I think you've done some really fantastic work and you're very objective in your analysis of what this whole situation is. | ||
When I look at artificial intelligence and I look at this thing that's clearly taking place right now, and I see what human beings are like in comparison to what they used to be like, and especially when you look at, like, ancient hominids. | ||
The alien archetype, this thing that everybody sees supposedly, or one of the many different ones, that kind of looks like what we seem to be going in the direction of being. | ||
Right. | ||
Yeah. | ||
Which is one of the reasons why I find it so odd. | ||
So if you just for a moment take UAP and aliens out, or ET, or interdimensionals, or whatever you want to call them out of the question, and fast forward what humanity is going to do in a thousand years. | ||
And our ability to expand into the local galaxy. | ||
We're not going to go as ourselves, we're going to go as AI conjoined entities like an avatar. | ||
And so when you go somewhere, let's say we don't have warp drive, you're not going to send yourself. | ||
You're going to send an AI intermediary who is going to establish humanity or whatever it is that we think humanity will be in a thousand or five thousand years in that local environment. | ||
And so I think, to the extent that whatever it is that UAP are here today, it's somebody else's civilization's version of just this. | ||
And that the principal, the us behind whatever this is that we might allegedly, et cetera, be dealing with, isn't the thing that's going to show up. | ||
You know, so to the extent that Neil deGrasse Tyson is right about anything, the person who gets on the ship at the beginning or whatever it is that sends it off is not the same thing that gets off on the other side. | ||
But you're going to send missionaries or intermediaries or probes or whatever, and if you're going to interact with the locals, you're going to make something that looks more or less like the locals, rather than whatever it was that you were a million years ago. | ||
Does that make sense? | ||
Right. | ||
I get what you're saying. | ||
So you make something that looks like the locals so that they'll be more likely to accept that it's a real thing? | ||
That's a real thing, but you're not going to make something that looks exactly like a human, because then you'd mistake it for a human. | ||
Right. | ||
But you might make something that looks more or less enough like a human, but enough like an alien that you're going to recognize it as an alien. | ||
And again, I'm just speculating. | ||
So the Daily Mail don't say, you know, put an article out tomorrow. | ||
Oh, they're going to do it anyway? | ||
They're going to do it anyway. | ||
Some of the stuff that I'm seeing, supposedly quoting me as having said things, is ridiculous. | ||
But yeah, they got me too. | ||
They get everybody. | ||
It's the nature. | ||
How did you even get involved in this? | ||
Let's bring it to that. | ||
So, what was your initial introduction to this? | ||
Did you have any interest in the idea of UAPs or UFOs? | ||
I mean, I had a general interest. So once YouTube started becoming a thing, and you're clicking around, and I said, oh, UFOs, that's kind of cool. | ||
I'm, you know, I read nothing but sci-fi. | ||
I mean, I'm pathetically narrow in that sense. | ||
And so I followed, you know, I followed the usual kinds of things that you would see on the early days of YouTube, and I came across this thing called the Atacama Mummy. | ||
You probably knew that little, that little mummy that was claimed to be an alien baby. | ||
Is this the Peruvian one? | ||
It was... no, it was Chilean. | ||
Oh, okay, so this is the original? | ||
The original one, long ago. | ||
And so I reached out to the people who were claiming to represent the owner of the thing. | ||
And I What year was this? | ||
2010, 2011. | ||
And I said, Hey, I can tell you what it is. | ||
Why don't you, you know... I can tell you if it's human or not if you would get me a piece of it. You know, first of all, send me some x-rays of the thing. | ||
So the first thing I did with those x-rays: it turned out that at Stanford we had the world's expert, who wrote the book on pediatric bone disorders. | ||
And I brought it to him and I said, what do you think this is? | ||
And he said, well, I haven't really seen this before, but it could be this gene, this gene, this gene, et cetera. | ||
He said, but here's... oh, there it is. | ||
There it is. | ||
There it is. | ||
unidentified
|
Yeah. | |
And so, yeah, it looks weird, doesn't it? | ||
Super. | ||
And so the expert told me, okay, I need this view of an x-ray, this view, this view, this view. | ||
And so we got that and it came back and said, Okay, well, you know, we need to get some DNA sequencing, he said. | ||
I said, Okay. | ||
So we got a piece of the bone from actually the rib. | ||
And the rib was important to use because that would be, I felt, an area that would be least likely to be contaminated by bacterial, you know, degradation. | ||
And so I got a little bit of bone marrow out and I did the sequencing. | ||
Long story short, I had to bring in, once I had done that, there was a lot of DNA that didn't make sense, but it was, it's old DNA. | ||
It wasn't that old actually, but it was degraded. | ||
So I had to bring in experts at Stanford who knew how to fix the degradation and then I had to bring in an expert in South American genetics who also happened to be at Stanford and then we brought in a team of students and then I brought in Roche Diagnostics. | ||
I had sold a sequencing company to Roche a few years earlier, so I brought in the team that actually knew how to help me assemble the genome, and then we published a paper which said it's human. | ||
It was a female, and here are some mutations that might explain what it looked like. | ||
It did have some mutations in genes. | ||
And then the UFO community hated me, because I had disproven that it was an alien baby. | ||
But of course, that picture that you showed, I mean, it was worldwide news. | ||
And literally the title of one of the things is Stanford Scientist Sequences Alien Baby. | ||
And so, you know, and so, but the paper stands the test of time. | ||
Nobody's disproven what it is that I showed, despite the fact that some people want to say that I was a CIA plant and I was paid off by the CIA, et cetera. | ||
Of course. | ||
But what that had done, which I didn't realize but I kind of hoped, was it sent up a flag to a scientific community that already existed that I wasn't aware of, of scientists who were deeply involved with the government in the analysis of UAP that I wasn't privy to. | ||
And so literally about a month after the movie came out about that thing, I got a knock at my door, and it was representatives of the CIA and an aerospace company unannounced, and they said, we want to talk to you. | ||
And they wanted my help with a number of military and diplomatic personnel who had been, they claimed, harmed by things. | ||
They'd either heard stuff, et cetera. | ||
And long story short, the majority of the 100 or so people whose medical records I was privy to ended up being the first of the Havana syndrome patients. | ||
They'd heard things in their head, et cetera. | ||
But what they had done was they had shown me the data literally that day in my office. | ||
They brought out the MRIs. | ||
They brought out the x-rays and the damage in the brain, et cetera, that was clear. | ||
I mean, it wasn't just data, it was evidence that something had happened. | ||
It wasn't somebody's story, it was evidence that was repeatable. | ||
And so that took us about three or four years to figure out what they were, and it was at about the time that actually the Havana events were occurring that we realized that all the symptoms of what it is that we were seeing in this group of patients were matching what it was that the Havana syndrome individuals had. | ||
So in a way, that was good because that meant that those 90 or so patients who matched, we could hand over to the national security people. | ||
And, you know, it became a real thing. | ||
And now there's like a DOD website that has anomalous health incidents where people can come forward and report the stuff that they've got. | ||
And here's the ways you can use the Veterans Administration to seek medical help. | ||
Whereas previously they'd been shooed away, as in, we don't want to hear about it. | ||
What do they think it is? | ||
It's an energy weapon of some kind, a microwave or other energy or gamma energy weapon. | ||
And that sounds... okay, that sounds crazy, except no one would admit, and no one would deny, that we have the capability to do it. | ||
It's basically if you take the front off your microwave and turn it on and put your face near it, you'll get burned. | ||
So this is just a way to direct the microwaves or sound waves at a specific individual. | ||
At a specific individual. | ||
And do you think it was experimental or no? | ||
So these are targeted people with a specific intention to get those people because they had some function that they wanted to get them out of the way. | ||
Oh, because they were in Havana. | ||
Because they were in Havana. | ||
But it's been used all over the world. | ||
You know, I still get emails from military personnel saying this and this and this happened to me. | ||
Here's my medical records. | ||
And so now I just I know they know that I'm a safe place to approach because then I know where to send them on the inside. | ||
But what was interesting was that once we had set that aside... and I've advised the Senate Intelligence Committee, and I've advised the House on things. | ||
I wrote a white paper for them years ago on what I thought needed to be done. | ||
But what was interesting were the remaining ten people who had, you know, who didn't have Havana syndrome but had a series of other problems. | ||
And several of them had said that part of their problem was initiated because they'd come in contact with what they claimed to be a UFO. | ||
By the way, I just noticed that you have a UFO on the wall behind you. | ||
Yeah. | ||
We're all in over here. | ||
So that got me introduced to, you know, people like Jacques Vallée, who you've had on this show, I think. | ||
A great guy. | ||
He became my mentor who essentially took me out of the wilderness. | ||
I could have gone down twenty different rabbit holes. | ||
And he lives in San Francisco and we would meet regularly and we still meet regularly. | ||
And he basically gave me a formulation of how to think about this. | ||
that I never would have been able to get from twenty different, you know, or a hundred YouTubes or what have you, and introduced me to the right people. | ||
That eventually led me to meet Lou Elizondo. | ||
And I actually, two weeks before that article came out in the New York Times, met Lou in Crystal City overlooking the Pentagon, and he showed me the videos that were about to come out. | ||
And that was my first time that I had met him. | ||
And then through all of them, I met Dave Grusch and Carl Nell, and Dave and I are in regular contact. | ||
And I'm, you know, I just want to say upfront, I hope that the Trump administration understands the value of what David can bring to them and put him in a position of authority that gives him not the ability necessarily to make decisions, but to give the necessary information to the right people. | ||
Because I think there's great commercial value here that is being missed, not just the are we alone, et cetera. | ||
I think there's extraordinary commercial value. | ||
I mean, imagine a civilization that's a million years ahead of us. | ||
How many technology revolutions allow these objects to move as we clearly see, something motivating itself or maneuvering around the atmosphere? | ||
So if we could scrape just the tiniest bit of understanding off of the top of that, what would that do to change our own civilization? | ||
I mean, silicon, a grain of sand, makes us who we are today. | ||
Everything that is around me right here is all run off of silicon. | ||
Right? | ||
I mean, compute. | ||
But imagine that there's other inventions, other ways of manipulating reality that we don't appreciate yet because our physics just isn't there yet. | ||
If we can understand that, so the government might say, well, we need to keep this behind closed doors for weaponization or we don't want to disrupt energy production or what have you. | ||
That's fine. | ||
But maybe there's too much secrecy and that maybe there's an aspect of that that could be taken advantage of. | ||
So Carl Nell and I have gotten into positive arguments about this, about that, well, it's not black and white that we keep something secret or we put it into the public domain. | ||
Maybe there's a middle domain where you have a public-private partnership opportunity. | ||
And actually, Carl has now adopted this, at least in part: that maybe companies come to the fore, or investment forums come to the fore, where they will put money in as options to fund, let's say, public scientists to come in behind the scenes with the right levels of clearances to study stuff that would propel society forward again. | ||
But this is assuming two things. | ||
One, that we have actually recovered these things. | ||
Right. | ||
And then another one is that it's from a society from somewhere else that's far more advanced than we are today. | ||
Right. | ||
Which might not be correct. | ||
It might not be that it's from somewhere else. | ||
It might be that it's from somewhere here. | ||
or a dimension that we don't have access to. | ||
Right. | ||
Right. | ||
This is assuming that all this stuff is real. | ||
Right. | ||
But when you're talking about the government and back engineering of things, like so the big argument, this is the narrative. | ||
The big argument has been that they have recovered these things and that these things are now in the hands of defense contractors and that there's been a misappropriation of funds, lying to Congress, and it's always going to stay secret because if it didn't, everybody would go to jail and everyone would get sued. | ||
Yeah. | ||
Right? | ||
Is that fair? | ||
Yeah, I mean, that's fair. | ||
I mean, but I would say amnesty would be one way to... You were in the Age of Disclosure documentary? | ||
Briefly, yes. | ||
Yeah, okay. | ||
Which I thought was very good. | ||
Very good. | ||
And I can't wait for that to come out. | ||
I've been talking to people, how can I see it? | ||
I don't know. | ||
I can see it. | ||
It's not out yet. | ||
unidentified
|
Yeah. | |
And I don't know why. | ||
Whoever it is, go Netflix. | ||
Yo, Ted, go buy that. | ||
It's really good. | ||
Yeah, exactly. | ||
It's really good. | ||
It's a great show. | ||
I mean, yeah. | ||
And it has a number of officials. | ||
And I think I sent you guys some of the videos basically coming forward. | ||
I mean, you know, Marco Rubio, our current Secretary of State. | ||
I mean, you said he's in it. | ||
He's in it for like ten minutes, saying some remarkable things. | ||
You know, Senator Rounds, you know, you name it. | ||
More recently, Tulsi Gabbard. | ||
Yes. | ||
coming out and saying, there's something going on. | ||
I think one of the most fascinating things is Hal Puthoff's descriptions, rather, of what happened during the Bush administration. | ||
Herbert Walker Bush. | ||
Right. | ||
So, in, I believe it was, 1990, they came to Hal Puthoff and a bunch of other experts and said, we want a numerical value placed on all the positives and the negatives of disclosure, because we have acquired these crafts from somewhere else. | ||
We believe they're not of this world and we have not made them and we're talking about letting the general public know. | ||
Right. | ||
And they overwhelmingly said that the positives were dwarfed by the negatives. | ||
Right. | ||
The negatives being banking, religion, government, societal structure, everything would fall apart if we knew we weren't alone. | ||
Not only are we not alone, but something is infinitely more sophisticated than us and might be responsible for us being here in the first place. | ||
Which is, that's where it gets super squirly. | ||
Right, right. | ||
Where you could imagine the Book of Enoch, and there's a lot of... I mean, I think it's a little bit overwrought as to what humanity's reaction will be. | ||
People are more worried today about putting food on the table than they would be about, you know, ethereal or supposed aliens. | ||
I mean, mostly, I think, on the assumption that they're not going to basically show up at your local Walmart and start interacting with you, the fact of revealing that we're not alone is actually more of a hopeful thing to me. | ||
Because, you know, how many TV shows right now are about the apocalypse? | ||
Right. | ||
Of a thousand different varieties. | ||
Yeah. | ||
Wouldn't it be nice to know that somebody got beyond it? | ||
Yeah. | ||
That there's not a cliff that we all have to walk over? | ||
Right. | ||
And if so, how do we not walk over the edge of the cliff? | ||
I mean, that to me is a hopeful outcome. | ||
Now, Hal and Eric and all the people are all good friends. | ||
Hal is probably, for all of the things that he says positively, is probably the tightest clam I've ever met in terms of making sure that he doesn't go over the line. | ||
Yeah, he knows too much. | ||
Yeah. | ||
That's the thing. | ||
He has to be very careful who he's talking to and what he says. | ||
I'd like to mind-meld him, the Spock thing, where you can find all the information. | ||
But it's people like him. | ||
and Jacques and Kit Green and a number of others, and I sat around a table with them for several years, like twice a year. | ||
And I looked around the table and thought, the things that these people know or claim to know, I want to know. | ||
And the opportunity that's here, and why can't we get this information out if it's real? | ||
And so rather than arguing with people about the matter, that's, for instance, why I created the Sol Foundation, which is a charitable group of academics. | ||
I started it with David Grusch and Peter Skafish. | ||
David, of course, had to leave. | ||
because he had governmental responsibilities he wanted to go take care of. | ||
And actually, we've now had for three years in a row a symposium, first at Stanford, then at San Francisco, and the next one is now in Italy. | ||
So I'm going to plug it: SOL 2025 dot org. You can go look if you want to go to... SOL? | ||
SOL, as in the subject. | ||
2025 dot org. And the purpose of that was not to advocate that any of this is real, but to create an environment within which academics or professionals or just lay people interested in the subject matter could come and talk about it in a very professional manner, right? | ||
Just to bounce around ideas, not to advocate for, you know, they're here or they're reptilians or they're this or they're that, but to like some of the things you raised. | ||
What are the ethical issues? | ||
What are the religious issues? | ||
So we have put out a number of white papers. | ||
For instance, where we had a member of the Catholic hierarchy write a paper on the issues related to Catholicism and religion. | ||
We've had Timothy Gallaudet, who's actually on our advisory committee, talk about USOs and those issues. | ||
We talked about near-space issues. | ||
Peter is running a study on experiencers. | ||
Not that the experiences are necessarily real, but what are the kinds of psychosocial matters that need to be considered for people who say that this has happened to them. | ||
So there's a group in the UK called Unhidden, which is basically a bunch of psychiatrists, a group of professional psychiatrists who say, okay, well, there's a trauma associated with this. | ||
Whether it's real or not, we don't know, but what are the kinds of rules or provisions that we should provide to the public and to psychiatrists. | ||
So when someone shows up at your doorstep in therapy and says this, you shouldn't immediately reach for the anti-hysteria or schizophrenia drugs. | ||
Right, right. | ||
I was lucky enough in my neighborhood, our neighbor who moved in for a while was the chair of psychiatry at Stanford. | ||
And so we go over to have dinner with her and her husband. | ||
And, you know, like one of the first things that she says, hey, what do you do? | ||
blah, blah, blah. | ||
And I happened to mention the UFO thing, and she just sort of like sat back in her seat. | ||
unidentified
|
Okay. | |
Oh, you might be a kook. | ||
Okay. | ||
But it took, you know, a year or so until she finally realized that I wasn't, and that I was approaching this in a very scientific manner. | ||
I had my beliefs as to what I think it is that I'm dealing with and that there's some sort of reality to this. | ||
But that's separate than the scientist in me that says, well, if I want to talk about this scientifically, here are the things that I need to prove or disprove. | ||
So that has led, for instance, to my study of materials that Jacques Vallée had brought to me, some metals and other things that had chains of evidence associated with them being at some UAP or UFO landing. | ||
And so interestingly, some of these metals are very unusual. | ||
Super high purity silicon, strange magnesium ratios, the isotope ratios are wrong, et cetera. | ||
Now, that's not proof of anything, but it's proof that somebody engineered them. | ||
So it's that, plus the medical. Those are the kinds of reality-based tests that I can do to provide to my colleagues, to say, here is data and evidence. | ||
Evidence isn't proof of anything. | ||
Evidence, like in a court of law, is just evidence that you provide to the jury of peers. | ||
Right. | ||
But I've sort of gone a step further. | ||
And that is, I'm like, okay, well, if these things are, let's say we get some advanced material, how do I prove that this advanced material was made by some superior intellect? | ||
Well, probably the atomic positioning of how the material is made is going to be more advanced than even our most advanced computer chip. | ||
So how do you determine that? | ||
Well, you need some sort of atomic imager that might tell you where the positions of the atoms are and what the bond structures are. That's something I can measure, and I can give those results to somebody else, and they can say, yeah, it's right, or it's not. But at least I can say no human, at least that I know of, could make this. So I started a company that I've raised money for, with this new idea that I have for how to make an atomic imager. And we're doing it. | ||
You know, we've raised the money, we're building it already, and I know it will work. So when I have it, whether or not it's useful for looking at UAP materials is almost immaterial, because I know how useful it will be for the nanomaterials, the metamaterials, the alloys that the government, et cetera, uses, for biology, et cetera. | ||
So instead of predicting what a protein structure or a DNA or a chromosome arm looks like, I'll be able to read its structure directly. | ||
I want to bring you back to the... you said it was ten people that didn't have Havana syndrome, that had some sort of injury that was associated with the UAP event. | ||
What was their thing? | ||
Did they have an implant, or was there a... No, some of them had what you would call white matter disease in their brain, like they had been exposed to something. | ||
So white matter disease, if you have, for instance, multiple sclerosis and you look in the brain with MRI, you'll see these white areas which are basically dead tissue, scar tissue. | ||
They had things like that. | ||
One person, one of the pictures that I had was that they had claimed to have seen something in their backyard. | ||
They shone a flashlight at it. | ||
And the moment they did, they got zapped. | ||
And then you see the picture of the guy, in the back of his neck, this huge welt and bruising and scarring. There's no reasonable way you could have gotten something like that just by exposing yourself to a flame, for instance, or a blowtorch. | ||
And so it's these kinds of events. The unfortunate issue with these is that they're not repeatable. | ||
They're one off anecdotes. | ||
Right. | ||
And you certainly can't put a person in a place where they become bait for these kinds of events to occur. | ||
And so you're sort of... some people would volunteer for that. | ||
So someone might. | ||
Yeah. | ||
To go get zapped. | ||
You know about the Travis Walton story, right? | ||
Very much, yeah. | ||
Yeah. | ||
What do you think of that? | ||
You know, he's kept his story over all of the years. | ||
That's what's so confusing. | ||
Yeah, I mean, he's had no reason. | ||
I don't know that he's profited off of it. | ||
He, you know... I find it fascinating, you know, but it's the irreproducibility of the events that the skeptics, I call them more pseudo-skeptics. | ||
They're pseudists, like nudists. | ||
They're pseudists that use the one-offness of these events to disparage the entire, you know, idea of it. | ||
It sounds ridiculous. | ||
Well, of course it sounds ridiculous because you're talking about something that is a spacecraft that zaps people. | ||
Yeah, it's ridiculous. | ||
And I don't think that even he would propose, Travis, that he was purposely hurt. | ||
Right. | ||
I mean, if you walk across an airfield and get in the plume of a jet engine, you're going to get hurt. | ||
Right. | ||
You know? | ||
Yeah, and his story is that he was taken aboard to heal him. | ||
Yeah. | ||
That there was something happened to him during that event. | ||
But the crazy part is that all the other people that are in the truck. | ||
They saw it and then they passed polygraph examination. | ||
unidentified
|
Right. | |
Right. | ||
They also told the same story independently when they took them and separated them. | ||
And then Travis Walton shows up five days later with the same clothes on with this crazy story. | ||
Right. | ||
You know, so when people say that, you know, there's no evidence or where is the evidence, So there's books like that, | ||
dozens of them, that tell the story of data and evidence. | ||
How you contextualize it is, you know, up to your personal biases, let's say. | ||
But there's plenty of evidence. | ||
But if people haven't looked into it, if they have an opinion about it, and they haven't looked into it, they're more like priests than they are scientists. | ||
Yeah, that's also the public. | ||
The general public narrative is UFO equals kook. | ||
Right. | ||
You're a kook. | ||
You believe in that? | ||
That's ridiculous. | ||
That's ridiculous. | ||
And I don't believe in anything but the data and the evidence. And there's not enough evidence for me to tell a colleague of mine it's real. | ||
But there's enough evidence for me to say there's a question worth answering. | ||
So when you were talking about magnesium and these whatever these alloys are, what is specifically wrong with them that you don't think that it was manufactured by like a standard sort of alloy plant in the United States or somewhere else? | ||
Right. | ||
So the silicon that I'm talking about is from an event in Ubatuba, Brazil. Interestingly, there's another piece of it that appears to have been magnesium, but both of them are of a purity that is unusual for the day, in the late 1950s. | ||
So the magnesium... and I did an atomic mapping of my piece of silicon down to a level where it's like 99.999 percent silicon. | ||
And so one piece of it had magnesium ratios that were Earth normal. | ||
And these were impurities, let's say. | ||
The other piece was way off Earth normal. | ||
So, for instance, anywhere on Earth, and anywhere in our solar system, that's more or less what the values of the ratios should be. | ||
And that has to do with stellar evolution and how, you know, radioactive compounds might decompose to whatever. | ||
But we got this ratio that was just way, way off. | ||
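For a sense of what "way off Earth normal" means here: terrestrial magnesium has fixed, well-measured isotope abundances (about 79% Mg-24, 10% Mg-25, 11% Mg-26), so a measured sample can be scored by how far it deviates from that standard. A minimal sketch; the terrestrial abundances are real IUPAC values, but the "measured" sample below is invented purely for illustration:

```python
# Terrestrial Mg isotope abundances (real IUPAC values).
TERRESTRIAL_MG = {24: 0.7899, 25: 0.1000, 26: 0.1101}

def deviation_from_terrestrial(measured):
    """Per-isotope relative deviation (as a fraction) from Earth-normal."""
    return {
        iso: (measured[iso] - ref) / ref
        for iso, ref in TERRESTRIAL_MG.items()
    }

# Hypothetical mass-spec result for one sample (abundances sum to 1.0);
# these numbers are made up for illustration, not real data.
sample = {24: 0.62, 25: 0.17, 26: 0.21}

for iso, dev in deviation_from_terrestrial(sample).items():
    print(f"Mg-{iso}: {dev:+.1%} vs Earth normal")
```

Anything more than a fraction of a percent off the terrestrial ratios is remarkable; deviations of tens of percent, as sketched here, are what "way, way off" would look like.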
So by luck, I came across a postdoc at Stanford. | ||
And he and a graduate student, they're both in applied physics, who are interested in UAP. | ||
And I said, I've got these ratios. | ||
What do you think it means? | ||
And so they looked, so they looked at the ratios and the weird one, and they said, well, let me, let's do some calculations. | ||
And so it turns out that the ratios that we have could have been generated from normal magnesium ratios if you exposed normal magnesium ratios to a neutron source for 900 years at the level of an atomic bomb every few seconds. | ||
Okay. | ||
So they wow. | ||
So it's like I'm looking and this data is literally two weeks old. | ||
But the calculations are math. | ||
So you're like, okay, well, where and how? The chance of getting that number correct on three things is low, to put it mildly. | ||
But to say that you had exposed these things to that kind of a neutron source means something interesting, right? | ||
So again, it doesn't prove anything other than that the result is mathematically and materially true. | ||
So what does it mean? | ||
Again, it's just for a scientist like me who loves data off the curve, it's catnip. | ||
I can't help myself but want to know and understand more about it. | ||
unidentified
|
Yeah. | |
I mean, it's just what you said about the magnesium ratios. | ||
That's, has there ever been any debunkers that have some sort of an explanation for why you would find that? | ||
No. | ||
I mean, do they think that your measurements are wrong? | ||
Well, I mean, the only way you could create that ratio artificially is by purifying each of those isotopes and then pre-mixing them to that ratio. | ||
But why would you blow it up over a beach in Ubatuba, Brazil in the late 1950s, and then let it sit in a museum in Argentina for fifty years, until Jacques Vallée ended up going and grabbing a piece of it and bringing it to me to measure on an instrument in the Engineering Department at Stanford? | ||
Why? | ||
Could you do it physically back then? | ||
Would that be possible? | ||
It would have been very hard. | ||
It would have been very, very hard. | ||
You could, but in the late 1950s, we were still busy trying to isolate and separate uranium isotopes for making more bombs. | ||
I mean, let's look, let's be serious. | ||
What do humans separate isotopes for? | ||
To make bombs or to do health-related tagging, which is really only something that came to the fore in the 60s and 70s. | ||
And this predates that by a decade? | ||
This predates it. | ||
So it's unusual. | ||
It's possible. | ||
But, I mean, again, with any of these things, why? | ||
Why, for instance, would one of the supposed pieces that came from that event be magnesium at a level of purity that only Dow Chemical at the time had the ability to create? | ||
Now, what else was at this site and what is the story behind this site? | ||
A fisherman sees this glowing object that kind of released something which then exploded and he picked up pieces of it. | ||
And there's some chains of evidence of how it got to either a newspaper in Brazil or to this South American Museum, et cetera, and different studies have been done by different people over time. | ||
And the surprise to me was that the piece that I had was silicon, whereas the lore was that it was magnesium. | ||
So I've been in contact with the people who talk about it as being magnesium, saying, well, it's, you know, your results don't dispute mine. | ||
It just says that maybe there was something different. | ||
Is that him? | ||
There? | ||
Travis Walton? | ||
That's Travis? | ||
Yeah, that's right. | ||
Travis Bob. | ||
That's cool. | ||
So, you know, I don't know what it means. | ||
I published probably one of the first peer-reviewed papers on a UAP material, from an event in Council Bluffs, Iowa. | ||
And the event was an object is seen rotating, lights flashing, et cetera. | ||
Something appears to drop from the object. | ||
The police saw it, several other groups saw it in the 1970s. | ||
They all converged on the locale. | ||
And this was like in February or something, it was winter. | ||
And there was this big pile of molten metal in the middle of this field, probably 30, 40 pounds of it. | ||
And people tried to explain it away: well, a helicopter had a giant vat of molten metal. And then you calculate how big a container you would need, and how far you would have to carry molten metal of this type. | ||
And so I analyzed it with a device that we invented in my lab, actually, called multiplexed ion beam imaging, which is a kind of what's called secondary ion mass spec. What you do is you shoot a beam of ions at an object, like a sandblaster. | ||
It ionizes the material on the target, and then you shoot off and measure the mass of the particles that you just sandblasted off. | ||
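The identification step he's describing, matching each detected secondary-ion mass to a known isotope, can be sketched roughly like this. The three isotope masses are real values in atomic mass units, but the peak list and tolerance are hypothetical:

```python
# Sketch of the identification step in secondary ion mass spectrometry:
# match each measured mass peak to the nearest known isotope mass within
# the instrument's tolerance. Isotope masses (in u) are real reference
# values; the "peaks" list and tolerance are invented for illustration.
ISOTOPE_MASSES = {
    "Ti-48": 47.9479,
    "Cr-52": 51.9405,
    "Fe-56": 55.9349,
}

def identify(peaks, tol=0.01):
    """Assign each measured mass to an isotope, or None if nothing is close."""
    out = {}
    for m in peaks:
        best = min(ISOTOPE_MASSES, key=lambda iso: abs(ISOTOPE_MASSES[iso] - m))
        out[m] = best if abs(ISOTOPE_MASSES[best] - m) <= tol else None
    return out

print(identify([55.934, 51.942, 40.000]))
# 40.000 falls outside tolerance of every entry, so it maps to None.
```

Repeating this over many raster positions is what lets you say the elemental mix differs from spot to spot across the sample.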
And so what we found was nothing unusual in terms of isotope ratios, except we found a mixture of metals that, depending on where you looked in the sample, was different. | ||
So it would be like iron, titanium, and chromium of a certain ratio here, but a different ratio of those things over there and over here. | ||
So what that meant was that whatever this stuff was didn't come completely pre-mixed. | ||
It wasn't like a milkshake. | ||
It was a slurry of partially mixed materials that somebody decided to drop off. | ||
So again, this is just data. | ||
But my purpose of publishing it was, first, and this was published in Progress in Aerospace Sciences, peer-reviewed. | ||
The purpose was to show you're not going to get thrown out of the academy for publishing this stuff. | ||
As long as you don't make crazy conclusions and you just say, here's the data, to show people that you can publish this stuff as long as you're scientifically careful in how far you go. | ||
You leave yourself plenty of diplomatic exits in the verbiage that you use. | ||
And it was part of what then got me to start the Soul Foundation along with Dave and others to say, look, it's okay to do this as long as you're careful. | ||
And it's why people, I mean, Avi Loeb came after me because he had kind of the same pushback from his community where all he was doing was saying, the question's on the table. | ||
I'm not saying it's true. | ||
It's just you can't push this off the table. | ||
So he had the same kind of righteous indignation that I have that propels me to say, well, I'm going to show you why you can't take this off the table. | ||
So when they found this puddle of molten metal, and it's a bunch of different mixtures, so it seems like there's a bunch of different stuff that was there and it wasn't perfectly mixed. | ||
Is there some sort of, have you theorized some sort of reason why they, any person or any creature, any being would do that? | ||
Is there something that you would extract from that kind of metal, like heating it up to a certain degree and having a mixture of all these things, and this is just a byproduct that they're dropping off? | ||
I think it's a byproduct of some process that might, again, might, might, might. | ||
Might, might, might extract. | ||
It might be part of a propellant system. | ||
It might be part of the way that they generate the fields that allow these things to move. | ||
Again, these are all mights, it's speculation. When you see something and you don't understand what it is, you have to be fully open. | ||
I mean, for all I know, they're flushing the toilet. | ||
Right? | ||
Oh boy. | ||
unidentified
|
Yeah. | |
Ew. | ||
But they got metal poop. | ||
So, but, but, you know, I have the original Polaroids from the police department of it. | ||
So, you know, it was real. | ||
And people said, Oh, it was thermite. | ||
Well, if it were thermite, there'd be aluminum oxide. | ||
You know. | ||
Thermite meaning that's how it was smelting down. | ||
That's how it was smelting down. | ||
And it's just some kids playing around, et cetera. | ||
And it was a big joke. | ||
Wacky kids with their thermite. | ||
With their thermite. | ||
But it turns out there's no aluminum hydroxide, or oxide I should say, in the sample. | ||
I mean, I have the analysis. | ||
It's just not there. | ||
So it had to have been extreme heat. | ||
It had to have been extreme heat of some kind that would produce it. | ||
And whatever it was was hovering for a moment. | ||
So it wasn't an airplane. | ||
And there were no helicopters, and at least no helicopters with flashing lights. | ||
And huge chunks of it still exist; I've got some. | ||
And the amount of this stuff, the kind of cauldron that would have to exist in order to melt this would be immense. | ||
It was immense, yeah. | ||
People in the 70s already sort of made estimates of what was required. | ||
And people said, Oh, it's a meteorite. | ||
Well, no, we basically showed mathematically how, you know, first of all, meteorites make holes when they hit the ground. | ||
They don't melt when they hit the ground. | ||
And they make explosions. | ||
Are there similar instances of something along these lines? | ||
Several. | ||
Really? | ||
That's what's so interesting is that worldwide there are multiple reports of molten metals that get dropped off these objects. | ||
And I actually have two other ones, a molten metal that was dropped off: one case in Australia, and another in an area I'm not allowed to say. And one actually happened, supposedly, I've got to find the guy again, in Fresno, maybe he's listening. He said stuff dropped, and he has, you know, molten metal that landed in a puddle in the asphalt of his driveway. | ||
And he saw this object. | ||
So he's just holding on to it? | ||
He's holding on to it. | ||
He reached out to me and I was, you know, it was still at a time when I was just kind of getting into this area, but there's many, many examples of this kind of thing. | ||
So, but interestingly, several of these other ones are just aluminum. | ||
The one that I have is iron or whatever. | ||
So what does that tell me? | ||
That there are different ways of accomplishing the goal? | ||
Whatever it is, they're either throwing something overboard or for, you know, because they don't need it anymore or because maybe it's getting in the way of something and it's time to get rid of it. | ||
Have you brought in someone who's like a real expert in material sciences that would like to theorize, like, given an immense increase in technology and what, like, what potentially do you think this could be? | ||
The purpose of being on shows like this is to have experts maybe give me an idea, because the people I've been to at Stanford, you know, the other professors, they're like, okay, yeah, I gotta go. | ||
Yeah, it could be it could actually be detrimental to your career. | ||
And that's what's really weird about something when you're just talking about data, specifically in this case, of an actual physical thing that anyone can measure. | ||
Right. | ||
And I've got pieces, I've got plenty of it, you know, and the original piece is, you know, is like this big that the owner of it had brought to my lab just last summer. | ||
It's, like, as big as an iMac. | ||
Yeah, exactly. | ||
Oh, it's huge. | ||
Crazy. | ||
And so what is it? | ||
I would love for someone to tell me that it's conventional and has a purely prosaic answer. | ||
Because then I can go on to the next thing. | ||
The whole reason for getting the Atacama mummy off the table was not because I wanted to annoy anyone; it was because it was spectacular. | ||
It's obviously something people would pay attention to. | ||
So if it's real, let's do it. | ||
If it's not, let's get it off the table. | ||
Because it's usually the stuff that's hidden under the rubble that's the most interesting. | ||
My question about that mummy is not that it's an alien, but if it does register as human in the DNA, is it potentially a different kind of human than us? | ||
Well, certainly she... | ||
We brought in an expert in the genetics of South American indigenous peoples, and the analysis showed that the standard genetic mutations that are found in different racial groups around the world matched exactly the Atacama region of Chile. | ||
So her parents, her relatives, were clearly Chilean. | ||
So, yeah, I mean, that's really all you can say. | ||
Just to say that she's an alien, well, that's fine. | ||
I'm convinced of what she is and that she deserves a proper burial. | ||
And so it's just a genetic anomaly. | ||
Just a genetic anomaly. | ||
I do know that you've paid attention to the Tridactyl mummies. | ||
Yeah. | ||
What is your take on that? | ||
So, you know, I think people have conflated a lot of the different mummies that are out there. | ||
First of all, there's like 60 of them or something. | ||
And probably a fair number of them, I wouldn't necessarily call them hoaxes. | ||
I would say that they are constructed. | ||
But they're old constructs. | ||
So maybe they're some sort of homage paid to the ancestors or something like that, whatever they are. | ||
So there are some that you clearly look at, you go, oh, come on. | ||
That never lived. | ||
Then there's the fetal position. | ||
Then there's the fetal position ones, the big ones. | ||
And I was at the beginning, I was, you know, I'm always open to being wrong. | ||
I was at the beginning thinking, oh, well, because of the small ones, those are probably not real. | ||
But then the MRIs started coming out. | ||
The full body MRIs and the ligature and the bone construction and the finger and then perhaps most, I think extraordinarily, the fingerprints on them, being clearly not human. | ||
So it's interesting. | ||
But here's the problem is that because there's so much circus around them, unfortunately created by people who want a circus because it sells their TV shows, no scientist of any merit would go near it. | ||
So I was approached many times, many times to study them. | ||
And I said, I'll do it on one condition. | ||
Here's the money I need, not personally, but here's the money I need to do the kinds of analysis to accomplish this right. | ||
Second, there will be no. | ||
TV cameras. | ||
And you won't hear from me again until I'm ready to talk. | ||
Because I'll have double checked and triple checked and quadruple checked the results. | ||
And then I'd have gone out, as I did with the Atacama mummy, bringing in further concentric circles of experts to double-check me. | ||
And not make it a circus. | ||
And not make it a circus. | ||
Because I won't name the TV show that wanted to do it. | ||
but they wanted me, they wanted to follow me around with a... | ||
I'm like, no. | ||
This isn't how science is done. | ||
I can't do it with those strictures. | ||
So I would say that if anybody's going to do it again, lock the things away with South American scientists. | ||
You don't need a North American scientist to come in and do it. | ||
There's plenty of smart people in South America who can do this properly and respect the rights of the indigenous peoples who own the sacred grounds within which these things were found. | ||
I think that's important. | ||
And then do the analysis right. | ||
You know, they made, I think, the mistake of saying, well, we've done the DNA and there's a lot of DNA that doesn't match anything. | ||
And the stuff is several hundred years old; anything that old, you won't get a lot of good DNA out of it. | ||
But they did the same thing with the Denisovan and the Neanderthal. | ||
You have to correct the chemical errors that occur over time. | ||
There are ways to do what's called bioinformatic correction. | ||
You need to do what's called overreading of the genome, where you do so many reads of it that you stack them all up line by line. | ||
Like if you had a thousand copies of an ancient Bible, you would stack up the lines one by one, and finally you find that this letter is correct in one line, and this one in another, and then you'd basically do a summation, an averaging, of the correctness. | ||
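That stacking-and-averaging idea is essentially consensus calling: align many noisy reads of the same stretch of sequence and take a per-position majority vote. A toy sketch, with invented reads:

```python
# Sketch of "overreading": stack aligned reads of the same region and
# take the majority base at each position, like comparing one line
# across many copies of an old manuscript. The reads are invented.
from collections import Counter

def consensus(reads):
    """Majority-vote base at each position across equal-length aligned reads."""
    return "".join(
        Counter(column).most_common(1)[0][0]
        for column in zip(*reads)
    )

reads = [
    "ACGTAC",
    "ACGTAC",
    "ACCTAC",   # one damaged base (G -> C)
    "ACGTAT",   # one damaged base (C -> T)
]
print(consensus(reads))  # -> "ACGTAC"
```

Real ancient-DNA pipelines also model the characteristic chemical damage (e.g. deamination) rather than relying on raw majority vote, but the averaging intuition is the same.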
And so they say, oh, well, there's, you know, 90% of the genome is nonhuman. | ||
It's probably garbage. | ||
It's probably these mistakes. | ||
It's probably bacterial contamination that you're reading. | ||
There are ways to deal with that, but that requires money, and not one-off DNA sequences put on the internet for some amateur genomicist to make a claim about. | ||
So there's ways to do it. | ||
I mean, at the end of the day, you would want to get the results to the level where you could go to the guys who did the Denisovan and the Neanderthal DNA, the Max Planck Institute and others, who won the Nobel Prize for it, and say, hey, what do you think? | ||
But you don't dare take it to people like that until you've done your homework. | ||
I see. | ||
And you do it behind the scenes. | ||
You don't put them under a flashlight. | ||
Right, right. | ||
You know, and people, I think, have gotten used to this click mentality of impatience where I want the result today. | ||
Why can't you just make it all transparent? | ||
Dump all the data on the web tomorrow. | ||
You're not transparent. | ||
You're hiding something. | ||
No, I'm not. | ||
I'm just trying to make sure that you don't make the mistake and accuse me of making the mistake that you'll find in the data because the raw data is never clean. | ||
unidentified
|
Mm. | |
Mm. | ||
In the Daily Mail headline. | ||
In the Daily Mail headline. | ||
Never accurate. | ||
So, long story short, I think there's still something worth looking at there. | ||
Well, the scans are fascinating, right? | ||
unidentified
|
Yeah. | |
The scans are the most interesting to me. | ||
Have you seen the Jesse Michaels, the Justice video? | ||
Yeah. | ||
Jesse is a good friend. | ||
He's great. | ||
I love that guy. | ||
And the episode that he did is fantastic. | ||
And when you see the scans and they go over the bone structure of the thing and you look at it, you're like, God, that looks real. | ||
If that's a hoax from 1700 years ago or over 1000 years ago. | ||
Exactly. | ||
Well, that's if the carbon isotope dating that they did on it is accurate. | ||
I've looked at that data. | ||
It looks good. | ||
Okay. | ||
So then it is that old. | ||
Fuck you then. | ||
Because there's no way someone back then could fake that. | ||
And someone asked me the other day, they said, Well, could you have a single mutation? | ||
I said, No. | ||
I mean, because you don't get one mutation that does all that. | ||
unidentified
|
Right. | |
You know, evolution works step by step: this does this, but it has a mistake, which is corrected by this mutation over here, which is corrected by this. | ||
The whole, the genome fluctuates over time, compensating for the errors that would otherwise have killed you. | ||
Also, one of them is pregnant. | ||
That's fascinating. | ||
Yeah, I know. | ||
Okay, so it's a three-foot pregnant thing that doesn't look remotely like a human being. | ||
Yeah. | ||
So the jury is still out. | ||
Right. | ||
But if they're going to do it right, they need to sequester the stuff away, bring in the right people with sufficient resources, and get rid of the cameras. | ||
Have you talked to them? | ||
Have you encouraged this? | ||
Is this possible to nudge this in the right direction? | ||
And where is it out right now? | ||
I wrote out on Twitter a full thing of what they needed to do. | ||
I mean, the easiest first milestone to do, to be honest, that could be done within a couple of months, is if it is somewhere in the hominid or, let's say, vertebrate line, there are metabolism genes that we all share. | ||
In fact, there are metabolism genes that we share with bacteria that are very similar. | ||
So there's, you probably, you know the technique called polymerase chain reaction, PCR? | ||
So, you know, why try to do the whole genome? | ||
Why not just target a bunch of genes that we know evolve slowly, but do evolve, and PCR those out? Because that's easier to do than trying to assemble a whole genome. And then, by having just those, let's call it preliminary, sets of evidence, you could then say: hmm, this actually reproduces. | ||
If I take a sample from the finger, a sample from the bone marrow, a sample from here or there on the body, from three different main sites, and I see the same mutations, and they're different from, or somehow aligned with, hominid evolution, right? | ||
We compare it to all the known hominids. | ||
I mean, that would be the kind of data that you could actually publish in a journal like Nature, if you did it right. | ||
Because that's the only way you're going to get anybody to pay attention. | ||
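The reproducibility test he's proposing, the same conserved-gene variants showing up independently at several sampling sites, boils down to a set intersection over the per-site variant calls. A sketch with entirely hypothetical gene and variant names:

```python
# Sketch of the cross-site reproducibility check described above:
# variants called independently from different body sites should agree
# if they are real germline features rather than damage or
# contamination. All sample, gene, and variant names are hypothetical.

def reproducible_variants(samples):
    """Variants present in every sample (candidate real mutations)."""
    sets = [set(v) for v in samples.values()]
    return set.intersection(*sets)

samples = {
    "finger":      {"geneA:c.101G>T", "geneB:c.55C>A", "geneC:c.9del"},
    "bone_marrow": {"geneA:c.101G>T", "geneB:c.55C>A"},
    "rib":         {"geneA:c.101G>T", "geneB:c.55C>A", "geneD:c.12A>G"},
}

print(sorted(reproducible_variants(samples)))
# Only the variants seen at all three sites survive the filter.
```

Site-specific singletons (likely damage or contamination) drop out, and what survives could then be compared against known hominid genomes.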
There's also the bizarre anecdotal nature of some of the artwork. | ||
Like the fact that these people did a lot of these tapestries and a lot of ancient artwork that's a thousand years old that depicts these three fingered things. | ||
So it's like, what are they describing? | ||
Are they describing these actual creatures? | ||
That there were only a few of them and it was a weird genetic mutation or is this a common visitor that they're describing? | ||
I don't know. | ||
I don't know either. | ||
I mean, why would you put them in a cave in Peru? | ||
I don't know. | ||
And if you didn't put them in a cave in Peru, what would be left? | ||
That's the problem. | ||
The problem is it's really hard to make a fossil, it's really hard to find bones. | ||
Think about all the people that died. | ||
Right. | ||
And where are we, you know, we don't find that many bones, relatively speaking, to compare to the fucking billions of people that died. | ||
Right. | ||
It's not like we're tripping over human bones every day. | ||
Right, except in mass graves. | ||
Yeah, right. | ||
That's really, yeah. | ||
And even in mass graves, given enough time, they would deteriorate like mass graves from 1,700 years ago, whatever these things are. | ||
So, you know, I find them, again, I find them interesting. | ||
And I hope that behind the scenes, there are people who are taking a more methodical approach to this who I think should remain stealthed until they have the data to the point where it is publishable. | ||
The reason, frankly, is you want papers, when you publish them, to be almost boring and so thick with detail that no pseudoskeptic would dare approach them, because they're just not smart enough. | ||
But if you put out these snippets that don't have sufficient background, they can be picked apart by anybody. | ||
Right? | ||
But that's why peer review is so important. | ||
And people mistake peer review as trying to get the reviewers to agree with your conclusions. | ||
No, the main purpose of peer review is actually to make sure that the methods you used are sufficiently detailed and correct enough that, to the extent you came to any conclusions, they match the methods you used. | ||
And when you think about these potential, whatever they are, whatever these creatures are, if we did find out that they are some sort of a hominid, | ||
How much credence do you give to the theory that there's like the possibility that these UFOs, UAPs, whatever it is, is a break off civilization from a very, very long time ago that's very different from us, just the way we're very different from chimpanzees. | ||
Right. | ||
Which we coexist with. | ||
Right. | ||
I have no problem conjecturing that. | ||
Did you ever see the Netflix show Chimp Empire? | ||
Yes. | ||
Amazing, right? | ||
Amazing. | ||
Twenty million years of separation, and it looked like a fucking faculty meeting. | ||
You know, with people like looking at each other, planning and plotting, board meeting, you know, and so we shared all those interactions from twenty million years ago. | ||
So how much further back would you have to go to have something like what that is? | ||
I mean, it's clearly not recent. | ||
And also, if you think about what we are in comparison to chimps: we're so fragile, we're frail, we're easily injured. Well, if you think of something that's far more technologically advanced than us, it would be even more frail, it would be even more petite, it would have almost no muscle at all. It would look, weirdly enough, like the Grays from Close Encounters of the Third Kind. | ||
That's what it would look like if it was a hominid that's whatever we are, and it went way past that. | ||
Right. | ||
Yeah, no, technology gives evolution the excuse to no longer make or allow for you to be robust. | ||
Robust, thank you. | ||
And also, why do you need opposable thumbs? | ||
Yeah. | ||
unidentified
|
Right? | |
These things don't even have opposable thumbs. | ||
That was what's weird about it. | ||
Right. | ||
It's like, how do you interact with your environment? | ||
They look more like sloths than they do humans. | ||
Right. | ||
I mean, at least their hands do. | ||
Yeah. | ||
And I don't know. | ||
I find it... well, if everything is done with AI and automation, and your interface is purely neurological, like you have some sort of neural interface with technology, and you just use fingers to lay them on electronics so that you can sync up with it. | ||
Right. | ||
Yes. | ||
Yeah. | ||
Why are you picking things up, bro? | ||
You don't have to pick things up anymore. | ||
Those go away just like, you know. | ||
Can you imagine the scenario of, I mean, these things we know are the bodies are real. | ||
What they are, we don't know. | ||
But can you imagine the scenario of what happened as they were being buried? | ||
Could you, like, make a, you know, a film of the ceremonial burial of these things? | ||
You know, what would, what led to their death? | ||
What led to their placement there? | ||
Or if they were constructed, which I have a hard time with given the MRIs that we've all seen, et cetera, what led to it? | ||
And so that to me is almost as interesting as whether or not they're real. | ||
Right. | ||
Like the ones that are clearly constructed, that's where it gets fascinating. | ||
Because like, what were you trying to reproduce? | ||
Yes. | ||
And why are they so similar to the ones that look real? | ||
Yeah. | ||
Is it an homage to the ancestors or to the stories of the ancestors, et cetera. | ||
Especially when you look at Peru. Like, Peru, you've got the Nazca lines, which are really weird. | ||
You can only see them from the sky, and they're everywhere, and they're huge, these depictions of very strange things. | ||
So I just ask my scientific colleagues to not suspend disbelief, but to open your minds as to the possibility. | ||
of what these things might mean and just try to explain them without dismissing them. | ||
Because it's so easy, and in politics we see it every day. | ||
All you need to do is just give any answer, even if it's obviously, flagrantly wrong, just as a way to deflect. | ||
And so, you know, you can use that approach, deflection, which you shouldn't ever use as a scientist, and which unfortunately is what someone like Neil deGrasse Tyson often does. | ||
Yeah. | ||
As opposed to trying to explain in a way that teaches your audience the right way to think. | ||
Yeah, well said. | ||
One of the things that Jacques Vallée highlighted is there's an alloy, another piece of metal that they'd found that had layers like these at an atomic level. | ||
That if you wanted to make this alloy today, it would be almost impossible. | ||
It would cost billions of dollars. | ||
So I worked with him on one of those pieces. | ||
I got the atomic imaging of some of that. | ||
And it's, oh God, I'm blanking on the event, but it was the Sirocco event. | ||
And where was that? | ||
In New Mexico. | ||
I'm going to get in trouble for not knowing exactly. | ||
And we actually did an atomic layering using this device called atom probe tomography, where you literally pick it apart atom by atom and get each atom's 3D position. | ||
It's a 40-year-old technology, so it's nothing magic. | ||
So, and yeah, it would just be very difficult to make it, you know, and certainly it would be not something that you would have dropped in the middle of the desert. | ||
Is it Socorro? | ||
Socorro. | ||
In the middle of the desert, you know, in the 1970s or whenever it was. | ||
I wouldn't say it's impossible to make. | ||
But why you would do it is another question. | ||
It's clear what interests me is, first of all, why would you do it? | ||
Why would you create something, for instance, with the silicon and the magnesium with the altered ratios? | ||
Not the where did it come from? | ||
So what is it evidence of? | ||
It's clearly evidence of technology. | ||
Was this technology available at the time this supposed crash happened? | ||
Which one? | ||
This? | ||
No, not, no. | ||
Not at the level of precision that was done, and in a chunk like that. No, it just wasn't. | ||
It just wasn't. | ||
So if that's true, if the chain of evidence is correct and it really did come from that area, from that crash... | ||
That's not a human creation. | ||
Well, it wasn't a crash. | ||
It was an object that a policeman had seen with beings, short beings outside of it, and when it took off and left, he went over and found this piece that I actually, I personally have it now. | ||
Huh. | ||
So, but, you know, it's hard to say. | ||
what's possible and what's not possible. | ||
So, you know, there's plenty of military programs that make stuff that are way outside of mainstream capabilities right now. | ||
I mean, just look at the stealth bomber. | ||
Right. | ||
For instance, and the skin of the stealth bomber is just remarkable. | ||
Is it possible they were doing that in 1970? | ||
Maybe.
Maybe.
So that's why I always leave open the possibility, you know, which is why, I mean, I'm going to get back to this atomic imager thing that I'm making.
It's like, there's a level of evidence that I think can be produced with atomic imaging that goes beyond what we know anybody can make.
Right?
And so that's my reason for wanting to do it.
Because, you know, look, I can make money on it looking at alloys and nanomaterials, et cetera.
And that's going to be the purpose of making the instrument.
That's how it will be a company.
But it will have value elsewhere.
So the reason that I got interested in it was frankly for looking at chromosomes.
But then I realized, oh, maybe it has interest.
Maybe it would be useful for these other things as well, which has kind of propelled my interest in it.
Well, Jacques Vallée is such a valuable researcher because he's so logical about the way he handles things and he doesn't jump to any conclusions.
And his descriptions of these materials and the origin of these materials are really compelling.
Because it's just like, if that's not really possible to make in 1970, then someone help me out.
Yeah.
What is that?
Yeah.
And is it possible to make today?
And how much would it cost?
Right.
And where would you do it?
Well, that's why the magnesium ratio thing was, you know, when I first estimated it, it was like, this is millions and millions of dollars, and why would you leave it on a beach in the middle of Ubatuba, Brazil?
Right.
You know, it just seems unlikely.
Nothing's impossible.
No.
But unlikely.
Well, and then it's usually the chain of evidence.
There's lots of materials that you might find that are unusual.
And believe me, I get rocks sent to me at my lab in the mail that people say, oh, this is unusual.
No, it's a rock.
Sorry, it's a rock.
But you know, I have not yet been given anything about which I could definitively say, this is not something a human might have been able to make.
It might be difficult, but not impossible yet.
And that's because the level of resolution required to claim something is impossible is something we actually don't even have yet.
Does that make sense?
Yes, that does make sense.
So my whole career has been inventing instruments that were, I felt, inevitable, but not yet possible.
But I could see a path to making them.
And so I said to most people, get out of my way.
I'm going to do this.
Because I know once I've got it, it will become valuable to everybody, which is what made my career in immunology, making a succession of instruments like that and then making them available to the community.
So I think the next level is atomic.
Because, we now know, you can pick up and look at any of the major physics journals today.
Everything is all about these weird exotic particles that exist in metamaterials down at the atomic level, with vague and strange capabilities that will change their utility, either as room-temperature superconductors or as different kinds of electronic components that might be better, quantum computer circuits and qubits.
It's all down at that level.
But to do so requires a level of engineering that we don't have. I mean, never mind reading what it is, putting it together in the first place is what's still required.
And so if we don't know how to put it together in the first place, then reading it, knowing that it can exist, and then associating it with a function is the value that I'm looking to bring.
Well, this brings me to the idea of crash retrieval, and the idea that these crash retrievals started a long time ago and that Roswell was just one of many.
There's another one that was near Roswell that apparently was even more significant but didn't get in the newspaper.
Trinity, are you talking about?
It was the one that Jacques was involved with studying.
I'm basing this off of Richard Dolan's book.
Okay.
But at the end of the day, the point being that if they did do that, if they really did back-engineer something, and then they started these completely top-secret scientific research projects where they were developing alloys that had never existed before with techniques that they had never really even considered, because they got it all from some spaceship.
Well, that's where it's really crazy if you don't disclose this information.
Because you're basically putting a bottleneck on human evolution, human technological evolution, our understanding of what's actually possible.
Right.
I agree.
And, you know, if you're going to excite the next generation of scientists in this country and you're going to bring economic prosperity to this country, then we should, I wouldn't say democratize it and put it all out on the internet.
I understand all the reasons why you might not want to.
But you need to excite the populace.
I mean, my laboratory at Stanford for probably the last 10 years has been mostly international. Not because I don't want to take more Americans, but because Americans just don't go into the sciences anymore.
They don't study math.
You know, they're not encouraged to approach it, so we're importing a lot of our scientists from overseas.
Well, guess what?
A good third of them end up going back, bringing all the technology that they invented here back there and creating competitors.
Now, maybe that's good on a global scale, you know, but maybe it's not something that we want to encourage on a local scale if we want to maintain our technological superiority.
We're basically governed by lawyers.
China is governed by engineers.
You know, I mean, you see the results in their drone technology and electric cars and the things that are coming out of China recently.
Their Politburo is almost entirely engineers and scientists.
Interesting.
Yeah.
There was a little article in The Atlantic recently about that.
That's a giant advantage.
Yeah.
So, the people who are making these decisions: we have lawyers looking for all the reasons why something should or shouldn't be done, and the liabilities.
They're looking at things in terms of what's possible.
When you're looking at these UAP things that people bring you, is there one that stands out as being the most compelling to you?
One event?
Well, both the Council Bluffs and the Ubatuba events are interesting to me.
Because of the physical material.
Because of the physical material itself.
I mean, I'm at the end of the day a physicalist.
I mean, I don't like all the anecdotes.
I mean, a thousand anecdotes make a good story, a good campfire.
I mean, I think there's statistical value in people seeing the same thing again and again, and there's a truth to it.
But, you know, I can believe anything I want around that, and many of the statements that I'm purported to have said are around my beliefs, as opposed to when I put on my scientist hat and try to convince another scientist.
I can only provide this data and this evidence, and I don't yet have these materials.
Now, maybe they exist, and maybe people like David Grusch will be able to pry them out of the clammy hands of those who want to keep them where they are, but give me one piece of that, and I will do wonders with it.
Yeah.
I mean, that's why I'm so excited about the UAP Disclosure Act, if it ends up becoming law.
We're taking money from one program to give to another.
Whether you're taking it from your taxes, you're taking it from veterans, you know, insurance, et cetera, it's a zero-sum game.
Whereas if you bring the investment community in, now you're bringing in people who are willing to take a chance and willing to take a risk, and you're not using the public's money anymore.
And that excites me. I mean, the reason why I wanted to go back to Stanford is because the entrepreneurial environment there, which is now actually almost homegrown here in Austin, is really what drives innovation.
And so I want to excite that kind of community.
And again, the Sol Foundation is a place where we can bring people in, and we've got investors who show up now, who are talking to people about their ideas and what we would do with this.
And so it almost has a self-propelling movement now, where I don't need to be standing on a wooden box somewhere in the middle of the park saying, you know, look at this, look at this.
People are just doing it now.
There's now almost a cottage industry of small groups, or formalized groups, who are doing this independently.
So SkyWatcher, as an example. You probably know the SkyWatcher group.
Yeah, I've heard of it.
And Jake did.
And they just stopped operations? Did something happen?
No, it's strange, because people said, oh, we stopped.
No, actually it had been determined from the beginning that we were going to go from January until July or August and collect data.
And now we're in the okay-what-does-the-data-mean phase.
We're literally going through the data, looking at the data files. As I said before, we're filtering the data, we're looking for the obvious mistakes, et cetera. So no, they've not stopped.
Yeah, there was something on Twitter about something about the equipment.
I forget.
No.
So James Fowler, one of the guys who brought a lot of his equipment and technology to us, decided that he wanted to basically go off and work in a DOD capacity as opposed to the research capacity.
He's still advising us.
I was just on a Zoom call with him last week going over the data files.
So explain this SkyWatcher thing to people, because it sounds insane.
Well, the idea behind it was that there might be ways to send a signal and get things to show up.
And James Fowler claimed that he had such a thing.
I was at one of the events where something showed up.
It was transient, momentary, but indisputable.
But it's just like, what did it look like?
It was just a silver ball moving quickly through several frames of the video, which wasn't fast enough, frankly, to pick it up.
We just saw it move.
It went that way.
And you didn't see it with your naked eye?
No, I didn't see it with my naked eye.
Which, of course, is a problem.
Do they sometimes see things with their naked eye?
One guy did, yeah.
One guy.
So are these things variable in their appearance?
I wish I had my phone here, but I don't.
But we do have a picture of one next to the helicopter, about 60 meters away.
And it's just a kind of a fuzzy white blob against a blue sky.
But it was there.
You know, and it's not a cloud and it's not a balloon.
It's not discernible as anything obvious, but it was there, and it happened during one of these events out in the middle of the desert.
And so the idea behind SkyWatcher is to see if there are ways to get them to show up, and if so, in a reproducible manner, and then have the right kind of simultaneous multisensor capabilities to measure it, meaning radar, IR, visual, people on the ground.
What are they sending to get these things to show up?
What signal?
That, unfortunately, he won't say. I don't know what it is.
He won't let everybody know what the bat signal is.
Well, I mean, you know, maybe, yeah, exactly.
I mean, it sounds kind of silly, but why would you put that out on the internet?
Because, you know, you might render it useless.
They're like, ugh, I don't have to show up.
Everybody's using it now.
Oh, so you think it's a trick?
Like, it tricks them to show up?
I don't know.
I really don't.
Don't you think they'd be smarter than that?
Well, that tells you something, maybe, about the level of smartness that might be incorporated into these, let's say, dumber machines.
Maybe, yeah.
Yeah, that was exactly my thought.
It's like, why would you show up when you know what it is, unless there's a reason, unless you're basically trying to train the monkeys what to do?
Maybe you're tricking the monkeys to send the... I don't know.
But isn't there a group of people that just go out and use their minds? They meditate, and supposedly they have some success as well?
Yeah, there's the CE5 groups that do that.
And I'm more than willing to believe that there are technologies capable of measuring thoughts at a distance, something super advanced.
I don't believe you have to call it telepathy and magic.
I think that, you know, if such a thing happens, there's a technology that might be able to read it at a distance.
Right.
Well, I don't have a problem with that.
I don't have a problem with that either.
I don't have a problem with the idea that consciousness is kind of vaguely and barely understood, and whatever our relationship to the universe itself and reality itself through consciousness is, it's not fully defined. And also, it might evolve, just like all of our other intellectual capabilities.
Right.
Well, I mean, think of it this way.
You know, you and I are interacting with each other through quantum waves.
My meat brain sees you as an object, but yet everything that you are sits in quantum spacetime down at the Planck level, and you're not even mass.
You're just a series of, in some people's minds, vibrating fields and objects.
And so we have sensors that see and hear each other and think about each other, but our consciousness somehow is embedded in spacetime.
And so who's to say that there aren't signals passing to and from that are vaguely able to be picked up by our meat brains, that we don't necessarily appreciate?
Right.
So just because I can't think at you and you can't hear me doesn't mean that there aren't perhaps brain organizations of some people that are a little bit better at hearing the echo than others.
Well, this is also probably the reason why, when you go to the woods and there's no cell phone signals, the world feels different.
Yeah.
Because you're probably experiencing a bunch of signals that your brain vaguely interacts with.
Right.
That, you know, might not even necessarily be good for you.
Right.
But they're out there and they're a part of the world that you live in.
And you just, you don't have a radio.
Right.
Right.
So you're not, like, tuning in to them.
You don't have a cell phone.
So you can't just, like, make calls with it.
But you're experiencing it.
Right.
Well, you know, our civilization is drowning us in constant noise.
Yeah.
And so maybe, you know, that drowns it out.
And that's why, with meditation, people claim that they can interact with other things.
I don't know.
Yeah, I don't know either.
I saw an interview that you did where you were describing the sighting off the coast of San Diego in 2004, the Nimitz sighting, where you said that the amount of power... Why don't you describe it?
So, the amount of power that that thing had to use to move the way it did.
Right.
So it's on radar.
It's on radar.
So these are actually calculations by Kevin Knuth, a physicist from the University at Albany, in a published paper.
Again, just speculation.
But what he basically said was: how much power would it take to instantaneously accelerate from fifty feet over the ocean to fifty miles above the earth, or whatever the number was, and instantaneously decelerate?
So it's not just the amount of power to lift something, it's the amount of power to accelerate and decelerate instantaneously.
And so you can make simple physical calculations for, let's say, a one-ton object, and it's more than the nuclear output of the United States for a year.
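The flavor of that estimate can be reproduced with back-of-the-envelope kinematics. The one-ton mass, fifty-mile altitude, and one-second timescale below are illustrative assumptions, not Knuth's published figures, and drag is ignored:

```python
def required_power(mass_kg, altitude_m, duration_s):
    """Average power to take a mass from rest to a given altitude in a given
    time under constant acceleration, ignoring drag and relativity."""
    g = 9.81
    v_final = 2.0 * altitude_m / duration_s      # constant acceleration from rest
    kinetic = 0.5 * mass_kg * v_final ** 2       # dominates for short timescales
    potential = mass_kg * g * altitude_m
    return (kinetic + potential) / duration_s    # watts

# Assumed: a 1,000 kg object covering ~50 miles (80,467 m) in one second.
p = required_power(1000.0, 80_467.0, 1.0)
print(f"average power ~ {p:.1e} W")  # on the order of 1e13 W, i.e. roughly ten
                                     # terawatts, several times total U.S.
                                     # power consumption
```

Note that the result scales roughly as 1/t cubed: stretch the climb to ten seconds and the required power drops by about a factor of a thousand, so the instantaneous-acceleration assumption is doing most of the work in estimates like this.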
And yet these things seem capable of doing that at will.
So where are they getting the energy from?
And I remember asking Hal a question like this years ago.
Hal Puthoff and I were stepping into an elevator, and we were talking about his ideas about how these things might move.
And I said, so they're cheating somehow, aren't they?
And his answer was: from our point of view, they're cheating.
From their point of view, they're just using physics that we don't understand yet.
So where's the energy coming from?
What are they doing?
And so that might be, for instance, a reason why you don't want everybody having access to it.
Yeah.
Because any one of those objects is worse than a thermonuclear bomb.
You shoot one of those things at a city and that's the end of the city.
And if anybody could do it, you know.
Well, maybe that's the step of human evolution, of the evolution of our society and civilization: that AI has to come into power before we have access to all this other stuff.
That we do need an AI government structure, that we no longer require military intervention and all the shit that is the bane of civilization today.
Because if you ask the average person today, do you envision a world where war doesn't exist?
Most people are saying no.
The vast majority, except for a few delusional hippies.
They're going to say no.
But if you ask them, okay, if this super-intelligent AI takes over the world and proves to be benevolent and really just wants to accentuate the life of human beings on Earth and make it better for everybody, then yes.
Then 100% yes.
Why would it want war?
Right.
So maybe something like that has to take place before we get to a situation where, okay, this is how you really travel.
Right.
Right.
Okay, now that you're not going to war anymore... But listen, you can already imagine the negatives, where people will say, well, it's the apocalyptic nanny state, right?
Where AI just basically takes care of you and humans devolve into something. Which is why I think a merger of human intellect with this, where it's a synergy as opposed to an either-or.
I don't want to be nanny-stated either.
I want to use it to explore ideas or explore pleasure.
I mean, I'm fine if people want to be hedonistic and, you know, participate in virtual parties all day long, for all I care.
I don't care.
But I think giving people the option to do whatever it is that they want to do is the most, I don't know, it's the most liberal and conservative way of living, because you're allowed to do what you want to do.
But we're not, because we're living at the behest of so many other strictures.
Oh, yes.
Yeah.
Last question.
What's your take on the Bob Lazar story?
Elements of truth with a healthy dose of misinformation that perhaps he was provided.
I don't think that he's entirely lying.
He seems to know enough about things that the average person wouldn't know.
But I've heard from Eric Davis and others saying, he's a this, he's a that.
I don't know, because, you know, that's why there are great people like Richard Dolan, who's a wonderful writer of the history of the area, or people like Robert Powell or Michael Swords, who write just the facts, without coming to too many conclusions.
I don't live in that world.
It's not my specialty.
My specialty is working with data and analyzing things and bringing rigorous science to it, so that I can convince another scientist what is right or what is wrong.
Because I won't be happy... I mean, I'm pretty sure of what I know, but I want to validate that to my colleagues, if only to be able to say I told you so.
Right?
There's a little bit of human pettiness in there.
A little bit of pettiness is great motivation.
Yeah.
But that's, I think, again, enabling people to live in a world where you can talk about these ideas without being ridiculed. That's really, I think, the objective of what science should be, and what open-minded, non-theologically-dogmatic approaches should be.
It's like, accuse a scientist of being a priest, and that's the best way to really upset them.
But pointing out that what they're doing is mimicking dogma and priesthood is the only way to shame them into doing the right thing.
Does that make sense?
It does.
It does.
Well, listen, man, I'm glad we finally did this.
Yes.
Thank you.
Thank you so much for being here.
Thank you so much for all the research that you're currently involved in and all the stuff that you've done.
And it's been amazing talking to you.
Really appreciate it.
Thank you.
Thank you so much.
Okay.