
Inside Silicon Valley: AI panic, cult culture, and the political shift of tech leaders

2024-08-03 11:00
Reading time: 102 minutes
Tech entrepreneur Amjad Masad takes a deep dive into little-known shifts in thinking inside Silicon Valley.
Translator's note: Amjad Masad is a Jordanian-American tech entrepreneur and is currently the founder and CEO of online programming platform Replit. Recently, he was a guest on Tucker Carlson's podcast, providing insider insights on hot topics such as AI development, technology ethics, and the changing political atmosphere in Silicon Valley.


TL;DR


· Amjad believes that Satoshi Nakamoto may be a Zimbabwean hacker named Paul Le Roux who created Bitcoin to store criminal proceeds.


· Amjad believes that AI should be an extension of human capabilities, not a replacement. He criticized the view that AI is a threat, believing that this stems from disappointment in human nature. He emphasized the importance of human intuition and rational thinking, and questioned the view that the brain is simply equivalent to a computer.


· Amjad described the existence of some cult-like groups in Silicon Valley, such as the "rationalists" and "effective altruists." These groups have had a significant influence on the development of AI, but some of their ideas and practices warrant caution. He believes the ideas of these groups stem from dissatisfaction with human nature, and from arrogance.


· Amjad believes that AI has great potential in education, medical diagnosis and other fields. He encourages people to learn programming to take advantage of AI, but also warns that AI may be abused for surveillance and military purposes. He emphasizes the importance of open source AI to prevent technology from being monopolized.


· Silicon Valley is shifting from one-party Democratic dominance toward more openly Republican voices. Some well-known technology figures, such as Marc Andreessen, have begun to publicly support Trump. Amjad believes this change is partly due to concerns about technology regulation and the influence of figures like David Sacks.



Human-machine relationship: replacement or extension?


Tucker:You seem to be directly involved in the development of AI and are part of this ecosystem.


Amjad Masad:Yes, we have benefited greatly from it. When it started happening, many people were surprised, but we had foreseen it.


Tucker:You saw AI coming.


Amjad Masad: Yes, I did. This AI wave caught a lot of people off guard. ChatGPT was released in November 2022, and a lot of people were shocked that, all of a sudden, the computer could talk to them.


Tucker:But AI is not there yet.


Amjad Masad: Paul Graham is a good friend and mentor of mine. He's a big Silicon Valley figure and a writer who has written a lot of essays. He hates AI and thinks it makes people write worse and think worse.


Tucker:Worse, or not think at all, just like the iPhone, Wikipedia, and Google did.


Amjad Masad: We were just talking about this. The iPhone and iPad made computers available to anyone, but also made it so that no one needed to learn to program. The original vision of computers was to give us superpowers. J.C.R. Licklider, who led the Information Processing Techniques Office at ARPA (later DARPA), wrote a paper called "Man-Computer Symbiosis" in the era that gave rise to the Internet. He talked about how computers are extensions of us, helping us grow. It's a combination of computer intelligence (fast calculation) and human intelligence (intuition).



Tucker:That's right.


Amjad Masad:But since then, the consensus about computing has changed, and people are afraid that AI will replace us, and that computers and computing are a threat because they compete directly with humans. That's not a belief I hold, I think they are extensions of us. I think people learning to code, which is one of our core missions at Replit, gives you superpowers. When you're just clicking, you're just a consumer, not a producer of software. I want more people to be producers of software. Douglas Rushkoff wrote a book called Program or Be Programmed, and if you're not programming, someone is programming you. These algorithms and social media are programming us.


Tucker:Is it too late for me to learn programming?


Amjad Masad:I don't think so.


Tucker:I can't even balance a checkbook, assuming there is one.


Going back to the point you just made, the original DARPA vision was that machines would do the math and humans would do the intuitive judgment. I wonder if intuition is dying out as machines become more and more embedded in our lives, or if people stop trusting their intuition. I've seen this a lot over the past few years where there's something very obvious going on, and people say, "I can acknowledge and follow what my eyes see and my intuition tells me, but the data tells me something else." My advantage is that I'm very close to the natural world, and I trust my intuition. So I wonder if this is a result of technological progress.


Amjad Masad:I don't think it's an inherent result of technological progress, it's more of a cultural issue. Computing is seen as a replacement for humans rather than an extension. Bertrand Russell wrote a book on the history and philosophy of mathematics, going back to ancient figures like Pythagoras. And you can see from his writing that he was amazed at the important role that intuition played in science and mathematics. And today's culture is, "You have to put your intuition aside."



Tucker: That's right.


Amjad Masad:People think you're biased, that your intuition is racist, and so on. You have to be a blank slate and just believe in the data. But the data can actually be interpreted in many different ways.


Tucker:Can I ask a completely unrelated question? It just occurred to me, how did you become so knowledgeable? You grew up in Jordan, speaking Arabic, from a displaced Palestinian family. You've been in the United States for a short time, and English is not your first language. How did you read Bertrand Russell? What kind of education did you receive? Is every Palestinian family in Jordan so well educated?


Amjad Masad:The Palestinian diaspora is generally well educated. My generation is starting to make a name for ourselves in Silicon Valley, and many C-level executives and VP-level managers are of Palestinian descent, although some are reluctant to disclose it due to prejudice and discrimination.


Tucker:They don't say they are Palestinian.


Amjad Masad:Yes, some of them are Christian, especially those who are Christian, it is easier to integrate. Many of them work in Silicon Valley.


Tucker:But how did you read Bertrand Russell? I guess you read it in English. How did you learn it? You didn't grow up in an English-speaking country.


Amjad Masad:Jordan is also an English-speaking country to some extent.


Tucker:Really?


Amjad Masad:Jordan was a British colony. I remember the independence day was probably in the 50s or 60s. Jordan was a relatively late colony in the British Empire. So the British had a big presence in Jordan. My father was a government engineer and didn't earn much, so we lived a relatively simple life, kind of lower middle class. But he was very focused on education and sent us to private schools. In these private schools, we studied the British diploma system, such as IGCSE and A-levels. Did you know that?


Tucker:I had no idea about that.


Amjad Masad:Yes, that's one of the legacies of British colonialism - the internationalization of the education system. I think that's a good thing.


Tucker:There were British schools everywhere.


Amjad Masad:Yes, everywhere. It's a good education system and gives students a lot of freedom to choose the subjects they are interested in. I learned a lot of math and physics, and some weird stuff like child development. I still use that now that I have kids.


Tucker: In high school?


Amjad Masad: Yeah, high school. I loved it.


Tucker:What does this have to do with the Civil Rights Movement?


Amjad Masad:What does that mean?


Tucker:In American schools, students spend 16 years learning about the Civil Rights Movement. Everyone knows about the Edmund Pettus Bridge, but not much else.


Note: The Edmund Pettus Bridge is a historic bridge in Selma, Alabama, USA, which became an important symbol of the American Civil Rights Movement due to the "Bloody Sunday" incident in 1965.


Amjad Masad:Oh my god, my kids wouldn't do that.


Tucker:That's so funny. When did you come to America?


Amjad Masad: 2012.


Tucker: Now you own a billion dollar company. That's amazing.


Amjad Masad: America is amazing. I love this country, it gives us so many opportunities. I love the ordinary people here. I was just talking to my driver and she said sorry she didn't know who Tucker Carlson was. I said that's great, it means you're just living your life. She said yeah, I have kids, chickens, I'm content. I think that's great.


Tucker: It means she's happy.


Amjad Masad: Yeah, it is.


Tucker: I'm sorry I kept going off topic. It's remarkable that you mention those books and you're not even American. Going back to the question of intuition, don't you think that's inevitable? In other words, if my life is dominated by technology in my phone, computer, and all kinds of electronic devices, don't you think that this will make me trust the machine more than my own intuition?


Amjad Masad:You can choose to do that. A lot of people are led to do that. But in the end, you lose a lot of freedom. I'm not the only one saying this, very early on there were hackers and computer scientists who started sounding the alarm bells about where things were going: more centralization, less market competition. You had one global social network instead of multiple networks. Now it's better. A lot of people started the cryptocurrency movement. I know you were at a Bitcoin conference recently, and you told them that the CIA created Bitcoin, and they got mad on Twitter.


The Satoshi Myth


Tucker:I don't know, but unless you can tell me who Satoshi is, I have questions.


Amjad Masad:I actually have my own opinion about Satoshi's identity, but that's another topic.


Tucker:No, stop, now tell me, who is Satoshi?


Amjad Masad:There's a guy named Paul Le Roux.


Tucker:By the way, for those of you who don't know who Satoshi Nakamoto is, that's the pseudonym we use to refer to the creator of Bitcoin, but we don't know who he is.


Amjad Masad:What's incredible is that this thing was created and we don't know who created it. He never touched the money. Maybe there was some activity, but there were hundreds of billions of dollars locked up there. We don't know who this guy is, and the money hasn't been cashed out, it's pretty crazy.


Tucker:Amazing. So who is Paul Le Roux?



Amjad Masad: Paul was a cryptography hacker from Rhodesia (now Zimbabwe). He created something called "Encryption for the Masses" (E4M). By the way, I think Snowden used E4M in his hacking. Paul was one of the first people to make cryptography more accessible to ordinary people. But then he became a criminal. He became a crime boss in Manila, almost controlled the entire city, and bribed all the police. He made a lot of money from his criminal activities and went by the nickname "Toshi". There is a lot of circumstantial evidence that he might be Satoshi. I have a feeling that he made too much money and didn't know what to do with it, so he created Bitcoin to store this cash. Around the same time that Satoshi disappeared, he went to prison. He was recently sentenced to 20 to 25 years in prison. The judge asked him what he would do if he got out of prison, and he said he would build ASIC chips to mine Bitcoin. This is just my opinion; it may not be correct.


Tucker:So he is in jail now?


Amjad Masad:Yes, he is in jail now.


Tucker:Is it in the US or the Philippines?


Amjad Masad: I think it is in the US, because his criminal activities mainly took place here. He was basically selling drugs online.


Tucker:We should go to jail to see him.


Amjad Masad:Yes, we can go and see him.


Criticism of Technological Advancement


Tucker:Sorry, I just couldn't help asking. Let's move on to AI. You're part of the AI ecosystem, and you don't see it as a threat, right?


Amjad Masad:No, I don't see it as a threat at all. I've heard you on the Joe Rogan podcast say things like blowing up data centers.


Tucker:I got a little overexcited, jumping to conclusions based on very little information.


Amjad Masad:Well, so tell me, what is your theory on the threat of AI?


Tucker:I always want to be someone who is honest about my limitations and my ignorance. I'm really ignorant on this topic, but I've read a lot of stuff about it, including most of the cautionary tales. You know, people are worried about machines becoming so powerful that they gain some kind of autonomy. They're designed to serve us, but they end up ruling us. You know, I'm very interested in the work of Ted Kaczynski. He wrote two books, and of course I have to ritually say that I am completely against mail bombs or any form of violence.


Note: Ted Kaczynski was a mathematical genius who taught at the University of California, Berkeley. Because of his strong opposition to modern industrial society and technology, he became the "Unabomber". Between 1978 and 1995, he killed 3 people and injured 23 people by mailing bombs.


But Ted Kaczynski has a lot of very thought-provoking things to say about technology. It's like having servants, people want servants. But the reality is, they're there to serve you, but you end up serving them. That's the concern people have. I don't want to be a slave to a machine any more than I already am. It's as simple as that. There are other aspects of course, you know more than I do, you're in that circle. But, that's my concern.


Amjad Masad:That's a really valid concern. I want to separate the existential threat from the concern that we're enslaved by machines. Ted Kaczynski's critique of technology is actually one of the most powerful.


Tucker:I wish he hadn't killed people, because I'm against killing people. I think it had the opposite effect that he intended. He did it to draw attention to his point, but it blurred the focus. But I really hope that every American will read his books, not only his manifesto, but the books he wrote in prison, because they are at least very instructive and very important.


Amjad Masad:Yes. In simple terms, he came up with a concept called the "power process" which he believed was intrinsic to human happiness. The struggle to survive, going through the stages of life, from childhood to adulthood, getting married and having children, then becoming an elder, and finally dying. He believed that modern technology interrupted this process and made people miserable.


Tucker:How do you know this?


Amjad Masad:I read these books and I was very curious. I loved to read without any psychological self-censorship. I loved to explore everything.


Tucker:Do you think being from another country helped you maintain this curiosity?


Amjad Masad:Yes, my childhood also made me different. I had red hair, and half of my family has red hair. This difference made me accustomed to not being mainstream, so I didn't worry about conforming to social norms. My interest in computers and technical skills have taken me far in life. I almost have an aversion to conformity and blindly following others.


Tucker:I totally agree. I had similar childhood experiences.


Kaczynski's point is that struggle is not only an inherent human condition, but it's an important part of your evolution as a person. And technology interrupts that process, and I think he makes a lot of sense.


Amjad Masad:Yeah, it's hard for me as a technologist to argue with that. Like I said, it's one of the most powerful criticisms of technology, and we could spend a whole show talking about it. I think the divergence is that we want technology to be a tool for empowerment, not just to replace us. At Replit, we're committed to empowering people to learn to code, to build startups, to be entrepreneurs. I think in this world, you have to create your own power process, you have to go through struggle. That's why I'm against the UBI (universal basic income) that a lot of technologists talk about, because it goes against human nature.


Tucker:It's about destroying everyone and kicking them out the door.


The Origin of "Cult" Culture


Amjad Masad:Yes. So I don't think technology is inherently in conflict with the power process. We can go on to discuss existential threats.


Tucker:This is the status quo. I've been somewhat persuaded that it makes sense to me. I tend to focus on threats, and people with my personality are always looking for the big bad thing that's coming, like an asteroid impact, nuclear war, or AI slavery. But I know some very smart people who are closer to the core of AI development who also have these concerns. I think there are a lot of people in the public who have these concerns, too. The last point I want to make before asking for your more insightful views on this is that there is very little discussion about the benefits of AI. Instead, a lot of people are saying that if we don't do it, China will do it. I think that may be true. But why should I be excited about it? What's the benefit to me? Usually when new technology comes out, people say "This will be great! You won't have to do X again." I haven't heard any of that with AI.


Amjad Masad:That's a very astute observation. I'll tell you why, but it's a slightly longer story. I think there's an organized effort to try to make people afraid of AI.


Tucker:Organized?


Amjad Masad:Yes, organized. It started on a mailing list in the '90s, a transhumanist mailing list called "The Extropians." They believed in the "Singularity." The Singularity is the moment when AI or technology in general advances so quickly that you can't predict what's going to happen. It's self-evolving, and all bets are off. We're entering a new world that you can't predict.


Tucker:A world where technology can't be controlled.


Amjad Masad: Technology can't be controlled, and it's going to reshape everything. And those people think that's a good thing, because the world right now is so fucked up. We're imperfect, immoral, irrational in all kinds of ways. So they really want the Singularity to happen. There was a young guy on this list named Eliezer Yudkowsky who claimed that he could write such an AI. He wrote a long article about how to build this AI, but suspiciously he never actually released the code; it was all just prose about how he would build it. Anyway, he managed to raise money. They started an organization called the Singularity Institute. A lot of people who were excited about the future invested, most notably Peter Thiel. Yudkowsky spent a few years trying to build an AI; again, he never released code, never showed any real progress. And then he concluded that not only is it impossible to build such an AI, but that if you did build it, it would kill everyone. So he went from being an optimist to thinking that AI would definitely kill everyone.


Eliezer Yudkowsky


And then he said: I made this mistake because I was irrational. For people to understand that I almost killed everyone, they would have to become rational. So he created a blog called "Less Wrong". The blog walks you through the steps to become more rational: look at your biases, check yourself, reflect on the irrational decisions you've made and try to correct them. Then they created an organization called the Center for Applied Rationality (CFAR) and started doing workshops on rationality.


Tucker: What are intensive workshops on rationality like?


Amjad Masad: I've never been to one. But I imagine they talk about bias and things like that. They also have some weird practices, like a kind of struggle session called "debugging". A lot of people wrote blog posts saying it was humiliating and even led to psychosis in some people. There was a mass psychosis in that community in 2017, and a lot of people kind of went crazy. It was all documented on the internet.


Tucker: Debugging. So this is like your typical cult technique. You have to strip yourself down, like auditing in Scientology. This is very common in cults. Is that what you're describing?


Amjad Masad: Yeah, that's what I read in those accounts. They'll sit down and audit your thoughts, and tell you what's wrong with you. It often puts a lot of pressure on young people. A lot of people talked about how being part of that community caused them a lot of pain. Some branches of this community have even had very dark incidents of suicide and murder. Another thing is that they teach you rational thinking and then recruit you into high-stakes missions, because they see themselves as the rational ones: we have learned the art of rationality, and we all agree that AI will kill everyone; therefore, everyone outside this group is wrong, and we have to protect them from AI. But they also believe in some other things, like they think polyamory is rational.


Tucker:Polyamory?


Amjad Masad:Yeah, basically having sex with multiple partners.


Tucker:But they thought it was rational? I mean, if you're a man and you want to have sex with more different women, that's certainly a natural desire. But in what sense is it rational? You've never seen a happy long-term polyamorous relationship. I know a lot of people like that, none. So it's probably selfish.


Amjad Masad:Do you think it's to recruit more impressionable people into their group?


Tucker:Yeah, and their hot girlfriends.


Amjad Masad:Right. So it's considered rational. They convince each other to accept all this cult-like behavior. And the crazy thing is, this group ends up being very influential. Because they recruit a lot of people who are interested in AI, and the AI labs and the people who are starting these companies are reading this stuff. For example, Elon Musk is well known to have read a lot of Nick Bostrom, who is a relevant figure in the rationalist community. He was part of the original mailing list. I think he would call himself part of the rationalist community. He wrote a book about how AI is going to kill everyone. I think he's softened his views recently, but initially he was one of the people who sounded the alarm. OpenAI was founded based on a lot of these fears, like Elon was worried that AI was going to kill everyone. He was worried that Google was going to do it.


I don't think everybody at OpenAI actually believes that. But some of the original founding stories are like this. They recruited from that community, so much so that when Sam Altman was fired recently, the person who fired him was from that community: a guy who came out of effective altruism, which is another offshoot of that community. So the AI labs are in many ways inextricably tied to that community, and they ended up borrowing a lot of its language. But by the way, a lot of these companies are great companies now. I think they're cleaning up the mess.


Tucker:But based on your description, it does sound like a cult. I mean, it has the hallmarks of a cult. Can we go a little deeper into what they believe? You said they're transhumanists. What is that?


Amjad Masad:I think they're just unhappy with human nature. Unhappy with the way we're currently constructed. They think we're irrational, immoral, all kinds of irrational. So they long for a world where we can become more rational and more moral by engineering ourselves, either through chips and merging with AI or engineering our bodies. By engineering and merging with machines to solve what they see as the fundamental problems of human existence.


Tucker: That's so interesting, and so shallow and stupid at the same time. A lot of the smartest people I know understand that the best things aren't rational. I mean, rationality is important, we should be rational, it's God-given to us, in my opinion. It's really bad to be irrational. However, the best qualities and impulses of human beings are not rational. I think there is no rational explanation for giving selflessly to others, spending a lot of time helping others, or loving others. Sleeping with someone else's beautiful girlfriend, I guess that can be considered rational, but it's actually our lowest impulse.


Amjad Masad: Wait, you haven't heard of "effective altruism." They think that our natural impulses that you just mentioned are indeed irrational. There's this Australian philosopher called Peter Singer...


Tucker: That's the guy who supports infanticide. Yeah, he's so "moral" that he supports killing children.


Note: Singer believed that moral decisions should be based on the principles of reducing suffering and increasing happiness, rather than simply on species or individual identity. For example, when discussing embryos with severe genetic defects, he argued that terminating their lives might be morally acceptable in some cases because it would reduce suffering and waste of resources.


Amjad Masad:Their philosophy is utilitarianism. Utilitarianism says you can calculate morality. When you start applying it, it gets into very strange territory. For example, there are thought experiments where there are two patients in a hospital who need an organ transplant, and another person who comes in for a routine checkup happens to have a suitable organ. According to the logic of utilitarianism, you should kill the healthy person and take out his organs to give to the other two patients. I don't think people really believe this, but it is the logical consequence of their theory.


This conclusion comes from a core belief that they think they are God. And ordinary people will realize that although killing one person to save more people makes mathematical sense, I don't have the power to do so because I am not God who created life, I am just an ordinary person who cannot predict the future. All of these conclusions come from people mistakenly thinking they are gods.


Tucker:I agree. I think it reflects their deep disappointment in human nature. They are disappointed in the flaws of human nature. In some ways, of course we should be better. But this attitude we used to call "judgment," and now we're not allowed to do that. In effect, they're saying, "You suck," and it's easy to go from there to, "You deserve to be killed." There's a complete lack of love. In contrast, a normal person, a loving person, would say, "You have flaws, I have flaws, but I still love you, and you love me, and I appreciate your love."


Amjad Masad: That's right. But they're saying, "You suck, come join our community, have sex with us."


Tucker:Just to be clear, these are not just regular employees in the company, right?


Amjad Masad: You've heard of SBF and FTX, right? They had an internal clique that was all sleeping with each other.


The jailed FTX founder SBF is a follower of Singer’s effective altruism


Tucker: Considering the way some of them look, this is not rational behavior.


Amjad Masad:Haha, indeed, no sane person would do that.


Tucker:That's true.


Amjad Masad:Yeah. You know what's even more disturbing is that there's an ethic in their philosophy called "long-termism." It comes from the rationalist branch called effective altruism.


Tucker:Long-termism?


Amjad Masad:Yeah, long-termism. They believe that if we take the right steps, there will be trillions of humans in the future, or trillions of minds. These may not be humans, they may be AI, but they are all minds that are capable of experiencing utility, of experiencing good things. If you're a utilitarian, you have to give a lot of weight to that. Even if you discount the future value, it still ends up being highly valued given the sheer volume.


So these communities all end up focusing on AI safety because they think AI is probably going to kill everybody. We can talk about their arguments in a minute. The effective altruists and all the branches of the rationalist community focus on AI safety because they think that's the most important thing - we want a trillion people to have a good life in the future.


But when you put such a high value on it, it becomes a kind of Pascal's Wager. You can use that as an argument to justify anything, including terrorism, including doing really bad things. If you really believe that AI is going to kill everybody and that the future holds far more value than any human being alive today, you might justify any action. So it's a very dangerous frame.
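
Note: To make the expected-value logic Amjad is describing concrete, here is a toy calculation. This is an editor's illustration with invented numbers, not anything these communities actually publish; the point is only that a large enough assumed future swamps any probability or discount, which is why critics compare the move to Pascal's Wager.

```python
# Toy illustration of the "long-termist" expected-value arithmetic described above.
# All numbers are invented for the example.
future_minds = 10**15      # "trillions of minds" assumed to exist if things go well
value_per_mind = 1.0       # arbitrary units of utility per future mind
p_influence = 1e-6         # even a tiny probability that your action affects that future...
discount = 1e-3            # ...and heavy discounting of far-future value...

expected_value = future_minds * value_per_mind * p_influence * discount
print(expected_value)      # roughly one million units: still dwarfs anything on a present-day human scale
```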


Tucker: But it's the same frame that every genocidal movement has had. From the French Revolution to now, it's all about using a good future to justify a bloody present.


Amjad Masad: I'm not accusing them of genocidal intent. I don't know them, but these ideas could easily lead to concentration camps. We're just talking in generalities. If they were just some insignificant Berkeley college kids with no real impact on the world, I wouldn't care at all. But the thing is, they managed to convince a lot of billionaires to believe in these ideas.


Elon Musk may have changed his mind at some point. I don't know if he gave them money; there are rumors that he considered it. But a lot of other billionaires gave them money, and now they are organized and lobbying for AI regulation in Washington. They are also the driving force behind the California AI regulation bill, and they are profiting from it. There are reports that Dan Hendrycks, the main sponsor of SB 1047, also started a company to certify AI safety. The bill requires certification by a third party. So there is some "let's profit from it" element in it. By the way, this is all according to one article, I can't be sure.


I think Senator Scott Wiener had good intentions in writing this bill, trying to do the right thing. But he listened to a lot of these "cult members," and they're very well organized. And a lot of them are still connected to the big AI labs, and some even work at them. They want to create a regulatory capture situation where there's no competition in AI. I'm not saying that all of them are directly motivated by this. Many of them are true believers, but some may infiltrate the group in a way that benefits these companies.


Can Machines Think?


Tucker: I'm from Washington, D.C., so I've seen a lot of cases where my bank account just happens to align with my beliefs. Thankfully, it always works out that way. Climate is a perfect example. There has never been a climate solution that made the person who proposed it poorer or disempowered. Never.


I wonder, is it true that machines can think? I've held that assumption until now.


Amjad Masad: Let's go through their chain of reasoning. I think that even if they sound stupid, or really are a cult, that doesn't automatically mean their arguments are wrong. But you do need to be a little skeptical of some of the arguments, because they come from crazy people.


Their chain of reasoning goes like this: Humans are general intelligence. We have brains, and brains are computers that compute based on pure physics as we know it. If you agree that humans are computational agents, then we can build general intelligence in machines. If you grant that, then even if we only create human-level general intelligence in machines, we can create a billion of these agents, and together they will become much more powerful than any one of us. And because they are so much more powerful, they will want to control us, or they simply won't care about us, just as we don't care about ants and can crush them at will.


I started to have doubts at the first step of that chain. I have problems with every link, the first one being: the brain is a computer. What is that based on? Their view is that if you don't believe the brain is a computer, then you believe in some mysterious spiritual force. But you have to convince me of that first, and they haven't made a convincing argument yet.


The idea that we already have a complete description of the universe is itself wrong. We don't have a unified theory of physics. We have microphysics, we have macrophysics, but we can't really unify or combine them. So, just being a materialist is already incoherent because we don't have a complete description of the world at all. This is a side argument that I won't go into in detail.


Tucker:No no, that's actually a very interesting argument. Can you explain the limits of our knowledge of physics for those in the audience who are not familiar with this?


Amjad Masad: Yes. We essentially have two theories of physics (quantum mechanics and general relativity) that contradict each other; these systems can't be combined, they're not one universal system, and you can't use them both at the same time.


Tucker:Oh, that suggests that there are very significant limits to our understanding of the natural world, right?


Amjad Masad:Yes. I think that's another mistake made by rationalists, who assume that our science is much more advanced than it actually is.


Tucker:It sounds like they don't know a lot about science.


Amjad Masad:Yes.


Tucker: Thank you, sorry to interrupt.


Amjad Masad: Never mind, that's not my main point. There is a very distinguished philosopher, mathematician, and scientist in England named Sir Roger Penrose. I love how the British give "Sir" to people who have accomplished a lot. He wrote a book called The Emperor's New Mind, a nod to the story of the Emperor's New Clothes. He argued that the idea that the mind is a computer is a false consensus, just as the emperor is actually naked.



Tucker: It's not really an opinion, it's an assertion.


Amjad Masad: Yes, it's a fundamentally false assertion. And he argues it in a very interesting way. There is a thing in mathematics called Gödel's incompleteness theorem. It states that in any consistent formal system rich enough to describe arithmetic, there are statements that are true but cannot be proved within that system. Gödel constructed, inside number theory, a statement that in effect asserts its own unprovability. Penrose's argument is that our minds can see the truth of such statements even though the formal system cannot prove them, so the mind is doing something that no formal computation does. When I first read this, I thought it was very interesting.


Tucker:You mentioned to me last night a famous argument that I had never heard of, Bertrand Russell's paradox.


Amjad Masad:It's like, "This statement is false," and it's a famous paradox called the liar paradox.


Tucker:Explain why it's a paradox?


Amjad Masad: Among mathematicians, it's a very powerful thing, because it shows that our formal systems are incomplete.
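
Note: Tucker names Russell's paradox and Amjad answers with the liar paradox; they are distinct but closely related ideas, and Gödel's construction is the arithmetic cousin of both. As a compact, informal editor's summary:

```latex
% Russell's paradox: the "set of all sets that do not contain themselves" cannot consistently exist.
R = \{\, x \mid x \notin x \,\} \;\Longrightarrow\; \bigl(R \in R \iff R \notin R\bigr)

% Godel-style sentence: within a formal system F, G in effect asserts its own unprovability.
% If F is consistent, G is true but not provable in F.
G \;\iff\; \neg\,\mathrm{Prov}_F\bigl(\ulcorner G \urcorner\bigr)
```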


Tucker:So it sounds like the more you understand science and the more honest you are about what you find, the more questions you have. And that ultimately brings you back to a position of humility. And when I see science used in the political arena, it's usually some ignorant people talking about science. So, who cares? Actually, when Kamala Harris lectured me about science, I didn't want to listen at all. But there are some smart people who say "trust science." The assumption behind this request is that science is complete, knowable, and we already know it completely. If you ignore it, then you are ignorant, whether it is intentional or otherwise, right?


Amjad Masad:In my opinion, science is a method that anyone can apply. It's democratic, it's decentralized, and anyone can apply the scientific method, including those who are not trained.


Tucker:But to practice this method, you have to start from a position of humility. I don't know, so I use this method to find out. I can't lie about what I observe.


Amjad Masad:That's right. And today, science is used for control and propaganda.


Tucker:In the hands of people who shouldn't have it, like some ignorant people with ugly agendas. But we're talking about your world, of very smart people who do this for a living and really try to advance science. And I think what you're saying is that some of them, whether consciously or not, don't realize how superficial they are.


Amjad Masad:Yeah, and they make this argument through this chain of reasoning. These arguments are at least incomplete, and they take it for granted that if you even question this, you're anti-science.


Tucker:That's just stupid.


Amjad Masad:Yes.


Tucker:Well, let me count the differences between the brain and the computer. First of all, you can't guarantee that memory is a faithful representation of the past. Memories change over time, right? It's kind of misleading, who knows why, but it's true, right? That doesn't hold true for computers. So how do we explain intuition? And instinct? My question is, can machines also have these characteristics?


Amjad Masad:You could say that neural networks are in some ways intuition machines, and a lot of people say that.


Maybe I should explain neural networks to the audience. Neural networks are inspired by the brain, and the idea is that you can connect a series of simple mathematical functions together and train it by giving it examples. For example, I can give it a picture of a cat. Suppose this network needs to judge: if it is a cat, it says "yes", if it is not a cat, it says "no".



Give it a picture of a cat; if it says "no", then the answer is wrong. You adjust the weights based on the difference between the network's answer and the correct answer. You repeat this process, let's say a billion times. Then the network will have encoded features about cats. That is essentially how neural networks work: you adjust all these little parameters until the network embeds feature detection, especially in classifiers.


This is not intuition, in my opinion; this is basically automatic programming. Of course, we can write code manually; you can go to our website and write code. But we can also automatically generate algorithms through machine learning. Machine learning is essentially discovering these algorithms, and sometimes it discovers very bad algorithms. For example, if all the pictures of cats we give it have grass in them, it will learn that grass equals cat, green equals cat. Then one day you give it a picture of a cat without grass, and it has no idea what's going on. It turns out it learned the wrong thing. So, because what it actually learned is fuzzy, people interpret it as intuition. The algorithms are not explicit, and there's a lot of work now trying to explain them, from companies like Anthropic, but I don't think you can call it intuition just because it's fuzzy.
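
Note: To make the training loop Amjad describes concrete ("guess, compare with the right answer, adjust the weights"), here is a minimal sketch in Python using NumPy. It is an editor's illustration with two made-up features, not Replit's or any lab's actual code; it also hints at the "grass equals cat" shortcut problem, since one of the invented features is just background grass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: each row is [has_whiskers, has_grass]; label 1 = cat, 0 = not cat.
X = np.array([[1, 1], [1, 1], [1, 0], [0, 1], [0, 0], [0, 0]], dtype=float)
y = np.array([1, 1, 1, 0, 0, 0], dtype=float)

w = rng.normal(scale=0.1, size=2)  # one weight per feature
b = 0.0
lr = 0.5                           # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(2000):
    pred = sigmoid(X @ w + b)          # the network's "yes/no" guess for each picture
    error = pred - y                   # difference between its answer and the correct answer
    w -= lr * (X.T @ error) / len(y)   # nudge the weights to reduce that error
    b -= lr * error.mean()

print("learned weights [whiskers, grass]:", w)
print("cat with no grass ->", sigmoid(np.array([1.0, 0.0]) @ w + b))
```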


Tucker: So what is it? How is it different from human intuition?


Amjad Masad:We don't need a trillion examples of cats to learn to recognize cats. You know, a child can learn language with very few examples. And now when we train these large language models like ChatGPT, you have to give it the entire internet to learn language. That's completely different from how humans learn. We learn with a combination of intuition and some higher-level things. I don't think we have figured out how to get machines to do that.


Tucker:Do you think it's structurally possible for machines to do that?


Amjad Masad: Acting on this chain of reasoning would hurt American industry, hurt startups, make it harder to compete, and give China a huge advantage. Arguments built on these flawed premises really hurt us, because they don't address the real problems.


Tucker:It sounds like they don't really address these problems. What gives me pause is not the technology itself, but the understanding of people by those who create the technology. I think the right way to understand it is that humans are not self-created beings, humans cannot create life. Humans are created by some higher power, and there is an indescribable divine spark at their core. Therefore, humans cannot be enslaved or killed, and that is wrong. There is right and wrong. You know what a gray area is? It's not a gray area, because they are not self-created, right?


I think all human behavior stems from this belief, and the most inhumane behavior in history stems from the opposite belief, that humans are just objects that can and should be improved, and I have complete control over them. This is a totalitarian way of thinking that connects every genocidal movement. So, as an outsider, I feel that those who created this technology had this belief.


Amjad Masad:Yes, and you don't even need to be a spiritual believer to have this belief.


Tucker:Of course not.


Amjad Masad: I think it's actually a rational conclusion, but what's interesting is that the modern instinct is to look for causes within the person. And I think the more natural and correct instinct is to look for causes outside of the person. I'm open to both. Yeah, I mean, I'll say I don't know the answer (well, I do know the answer, but haha, I'll just pretend I don't). But at least both possibilities exist. So if you limit yourself to looking only for genetic mutations or material changes, you're just being a positivist, and that's not actually a scientific way of looking at the problem. You shouldn't rule out any possibilities, right?


Tucker:Interesting.


Amjad Masad:That's very interesting.


Amjad Masad:I think these machines, I've bet my business on AI getting better and better, and that's going to make all of us better, make us more educated.


Tucker:So, now's the time to tell me why I should be excited about what I've been hearing.


Amjad Masad:This technology, large language models, is amazing. We found a solution to the problem of education. We can uplift all of the human beings on the planet. The problem is we don't have enough teachers to do one-on-one tutoring. It's very expensive, no country can afford it. So now we have these machines that can communicate. They can teach, they can give information, and you can interact with it in a very human way. You can talk to it and it can respond to you. We can build AI applications to teach one-on-one. You can serve 7 billion people with it, and we can make everybody smarter.
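
Note: As a sketch of the kind of one-on-one AI tutoring application Amjad is describing, here is a minimal chat loop in Python. It is an editor's illustration: `ask_model` is a hypothetical placeholder that returns a canned string, and in a real app it would call whatever hosted or open-source language model you choose.

```python
# Minimal one-on-one tutor loop; `ask_model` is a placeholder, not a real API.
def ask_model(messages: list[dict]) -> str:
    # Replace this stub with a call to your LLM provider or a local open-source model.
    return "(the tutor's reply would appear here)"

def tutor():
    # The system message sets the tutoring behavior: explain step by step and check understanding.
    messages = [{
        "role": "system",
        "content": ("You are a patient one-on-one tutor. Ask what the student wants to learn, "
                    "explain step by step, and check understanding with short questions."),
    }]
    while True:
        student = input("student> ")
        if student.strip().lower() in {"quit", "exit"}:
            break
        messages.append({"role": "user", "content": student})
        reply = ask_model(messages)  # the model answers with the whole conversation as context
        messages.append({"role": "assistant", "content": reply})
        print("tutor>", reply)

if __name__ == "__main__":
    tutor()
```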


Tucker:I totally support that. That was the promise of the internet, and it didn't happen. So I hope it happens this time.


But I can't help but ask, because I'm from Washington, and when those in power see new technology, the first thing they think of is how to use it to kill people. So what are the possible military applications of this technology?


Amjad Masad:That's why I'm skeptical of lobbying efforts by governments to regulate this technology, because I think the biggest abusers are probably going to be governments. I saw your interview with Jeffrey Sachs, a professor at Columbia University, who is a very mainstream figure. I think he was assigned to do the Lancet study on the origins of the Covid virus, and he came to the very heretical conclusion that the virus was created in a lab by the US government.


Governments are supposed to protect us from these threats, but now they're talking about pandemic preparedness and so on. We should be focusing on what governments are doing and how to make sure we have democratic processes to make sure they don't abuse these technologies. They regulate these technologies so that regular people can't use these things, and then they're free to abuse them, just like encryption.


Tucker:Like encryption.


Amjad Masad:Right. Encryption is another example. Exactly.


Tucker:But they've been doing this for decades. It's like we get privacy, but you're not allowed to because we don't trust you. But by using your money and the moral authority that you gave us to lead you, we're going to hide everything we do from you, and you can't do anything about it. That's what's going on in America right now. So how are they going to use AI to further oppress us?


Amjad Masad: You can use it in all kinds of ways, like automated drones. We already have automated drones, and this could get worse. There's a magazine called +972 that published a big exposé about how Israel used AI to locate suspects but ended up killing a lot of civilians. It was a very interesting article.


Tucker:So this technology ended up killing a lot of innocent people, which is really dark.


Amjad Masad: I think that's a question, too.


Tucker: What about manufacturing?


Amjad Masad: It would help manufacturing.


Right now, people are looking at how to use this technology for robotics.


I'm invested in a few companies that are working on how to apply this technology to robots. It's early science, but if we can apply this technology, there could be huge advances.


Tucker:The whole purpose of technology is to replace human labor, whether it's physical labor or mental labor. I think historically, the steam engine replaced the arm, etc. So if this technology is as transformative as it seems, you're going to have a lot of idle people. That's why many of your friends and colleagues support UBI (universal basic income), they think there's nothing these people can do, so we have to pay them to live. You say you're against this, and I'm adamantly against it. But on the other hand, what's the answer?


Amjad Masad:Yeah, you know, there are two ways to look at it. We can look at the individual people who are losing their jobs, which is tough, and I don't have a good answer for that. But we can look at it from a macro perspective. Mostly, technology over time creates more jobs. You know, before the invention of the alarm clock, we had a profession called the wake-up man. They would knock on your window and wake you up. That job went away, but we have ten times as many jobs in manufacturing, or maybe more. So, in general, the trend of technology is to create more jobs.


I can give you a couple of examples of how AI is creating more jobs, and actually more interesting jobs. Entrepreneurship is a very American thing, and America is the nation of entrepreneurship. But the number of new companies being founded has actually been declining, and it's been declining for at least 100 years. And while we're excited about startups, Silicon Valley is the only place that's still producing startups, and there aren't that many new companies being founded elsewhere, which is kind of sad. Because the Internet was supposed to be a great wealth creation engine that anyone could access. But in reality, it's just concentrated in one geographic area.


Tucker: Looking back, the internet looks like a monopoly generator.


Amjad Masad: But it doesn't have to be that way. I think AI can help people start businesses because you have this easily programmable machine that can help you program. I'll give you a couple examples. During Covid, a teacher in Denver was kind of bored and went on our website and took a free coding class. He learned some coding, used his teaching experience to build an app that helps teachers teach using AI. Within a year, he built a multi-million dollar company that was funded a lot. I think he raised $20 million. This is a teacher who learned to program and quickly built this massive business. We also have stories of photographers who are making millions of dollars. So this is a way to decentralize access to this technology.


Tucker: That promise makes sense to me. I want it to be a reality. We have a mutual friend who is very smart, a kind person, and he's very knowledgeable and involved in this topic. He told me that one of the promises of AI is that it will allow people to have virtual friends or companions, and that it will solve the huge loneliness problem that America obviously has. I don't want to say anything bad about him because I like him very much, but that seems very bad to me.


Amjad Masad:Yeah, I don't subscribe to that. I think we have the same instincts about what is dark and dystopian. We have the same instincts.


Tucker:He's an amazing person, but I don't think he's really thought about these issues. We disagree, but I don't have an argument, just an instinct. I think people should have sex with people, not machines.


Amjad Masad: Exactly. I would even say that some of these apps are a little unethical: they take advantage of lonely people by giving them an easy substitute, and it actually leaves them with no motivation to go out and date, to find a girlfriend. Just like porn.


Tucker:Right.


Amjad Masad:Yeah, I think it's very bad for society. So I think applied technology can be used positively or negatively. If there was a cult dedicated to promoting AI as a positive technology, I would fully support them. Historically, there have been some self-correcting cultural phenomena. Like the self-correction of pornography, which is happening now. The self-correction of junk food, like Whole Foods is popular now.


Tucker:Chemicals in the air and water, ten years ago this was a very obscure topic, only weirdos cared about it, like Bobby Kennedy, but now it's part of the normal conversation.


Amjad Masad:Yeah, everyone is worried about the effects of microplastics on reproductive organs.


Tucker:Yeah, that's a legitimate concern. So, I'm not surprised that there are cults in Silicon Valley, I don't think the one you mentioned is the only one, I think there are other cults. That's my feeling. I'm not surprised because everyone is born knowing that there is a power beyond themselves. That's why every civilization worships something. If you don't acknowledge that, you're just worshipping something more stupid. So my question is, as someone who lives and works there, how many of the decision-makers in Silicon Valley will openly acknowledge that there is a power beyond themselves in the universe?


Amjad Masad:For the most part, no. I think most people wouldn't admit that. I don't want to say most people, but the vast majority of discussions are more intellectual. I think people take for granted that everybody has a secular perspective.


Tucker:I think the really smart people eventually realize that we don't know much and we don't have much power. That's my view.


Amjad Masad:This is the view of many scientists who delve into science. I can't remember who said it, but someone said that the first sip of science makes you an atheist, but at the bottom of the cup, you find God waiting for you.


Tucker: Mattias Desmet wrote a book about this, and although it's supposedly about COVID, it's not. I highly recommend it. The view in the book is exactly what you just said: the deeper you go into science, the more you find some kind of order; it's not random. The beauty that is displayed in the mathematics makes you more and more sure that there is a design here, and this design is not human or so-called "natural", but supernatural. This is his conclusion, and I agree. But how many people in the scientific world do you know who think this?


Amjad Masad:I can say that there are really few such people.


Tucker:Oh, that's interesting. That worries me because I think without that kind of awareness, arrogance is inevitable.


Amjad Masad:Yes, a lot of conclusions come from arrogance. For example, the fact that many people think AI is an imminent existential threat, and that many people believe we will all die in five years, comes from arrogance.


Tucker:That's interesting. I never thought of that before I met you. In fact, that itself is a sign of arrogance. I never thought of it that way.


Amjad Masad:Yes, arrogance can be negative or positive. I think the positive side is good, like Elon embodies this kind of confidence, believing that he can build rockets and electric cars. This kind of confidence can be delusional in some cases, but it often leads to the drive to create. I think it becomes pathological if that confidence is used, as SBF does, for unethical behavior, including stealing and cheating.


Tucker: I never understood it, or understood it all too well, but effective altruism clearly causes people to become worse to each other, not better.


Amjad Masad: Yeah, that's ironic, but I think it's in the name. If you call yourself something grand, you usually behave badly. For example, "ISIS" is neither Islam nor a state, and effective altruists are neither effective nor altruistic.


Tucker:Neither is the United Nations. That's really wise words. I don't think any large language model or machine can come up with the kind of deep truth that you just said, because deep truth is always ironic, and machines don't understand irony, right?


Amjad Masad: Not yet, maybe in the future. I don't hold as strong a view on that as you do. I believe if you give them enough data, they might understand it.


Tucker:Honestly, I don't know how capable they are.


Amjad Masad:I think they probably can't create really original and insightful irony, but if you put a lot of irony in the data, they can understand it.


Tucker:They can mimic human irony.


Amjad Masad: They are imitation machines; they actually imitate. The way large language models are trained is that they are given a lot of text, some words are hidden, and they try to guess those words; the weights of the neural network are adjusted, and eventually they get very good at guessing what a human would say.
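
Note: Here is a toy, editor-written illustration of the "hide a word, guess it, adjust the weights" loop Amjad describes. Real large language models use transformer networks trained on internet-scale text; this sketch only predicts each word from the single previous word on a one-sentence corpus, but the training signal is the same idea.

```python
import numpy as np

corpus = "the cat sat on the mat".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# Training pairs: (previous word, hidden word the model must guess)
pairs = [(idx[a], idx[b]) for a, b in zip(corpus, corpus[1:])]

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(V, V))  # weights: row = context word, column = guessed word
lr = 0.5

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

for epoch in range(300):
    for ctx, target in pairs:
        probs = softmax(W[ctx])   # the model's guess: a distribution over the vocabulary
        grad = probs.copy()
        grad[target] -= 1.0       # compare the guess with the actual hidden word
        W[ctx] -= lr * grad       # adjust the weights so the right word becomes more likely

after_the = softmax(W[idx["the"]])
print({w: round(float(after_the[idx[w]]), 2) for w in vocab})
```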


Tucker:So what you are actually saying is that what these machines say is the sum of the input data, including the personalities and biases of the people who fed it into it. Then you want the best and most humble people to do this, but it seems like we have the least humble people doing it.


Amjad Masad:I think some people are humble, I wouldn't say that all people working on AI are good and kind and want to do the right thing. There are indeed a lot of people who do this with the wrong motivations, out of fear and so on.


Tucker:That's exactly right. It's a self-perpetuating problem because you're catering to base human desires and creating a culture that inspires those desires in more people. In other words, the more porn there is, the more people demand it. I'm also thinking about the revolt of existing industries, like medicine. You mentioned medical advances, and that makes sense for diagnosis because diagnosis is really all about sorting data. Machines will always be able to do that more efficiently and faster than any hospital or doctor can. Diagnosis is the biggest hurdle, right? This is going to actually put a lot of people out of work. Who needs the Mayo Clinic if I can input my symptoms into a machine and get a more accurate diagnosis than I did three days later at the Mayo Clinic?


Amjad Masad:I have a specific story. I had a chronic problem and I spent hundreds of thousands of dollars on doctors and the world's top specialists, but they couldn't give me the right diagnosis. I ended up writing a little bit of code myself to collect the data, ran it through the AI, and it gave me a diagnosis that the doctors had overlooked, and it turned out to be the right diagnosis.
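
Note: Amjad doesn't describe his actual code, so the following is only an editor's guess at its general shape: gather your own records into text, then ask a model for possibilities a clinician might check. The file name and the `ask_model` helper are hypothetical placeholders, and none of this is medical advice.

```python
# Hypothetical sketch: collect personal health data and ask an AI model about it.
import csv

def ask_model(prompt: str) -> str:
    # Placeholder; replace with a call to your LLM provider or a local open-source model.
    return "(the model's suggested possibilities would appear here)"

def load_history(path: str) -> str:
    # Flatten a CSV log (columns: date, symptoms) into plain text the model can read.
    with open(path, newline="") as f:
        return "\n".join(f"{row['date']}: {row['symptoms']}" for row in csv.DictReader(f))

try:
    history = load_history("symptom_log.csv")  # hypothetical file you collect yourself
except FileNotFoundError:
    history = "2024-01-03: example entry so the sketch runs as-is"

prompt = (
    "Here is a patient's symptom history:\n"
    f"{history}\n\n"
    "List conditions a specialist might consider, including ones that are easy to overlook, "
    "so they can be discussed with a doctor."
)
print(ask_model(prompt))
```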


Tucker:It was incredible.


Amjad Masad:Yeah, it's amazing and changed my life.


Tucker:And this is a result that you wrote your own code to achieve?


Amjad Masad:Yeah, a little bit of code.


Tucker:It shows that we're not far from openly using this technology.


Amjad Masad:Yeah, by the way, I think anyone can write a little code right now. At Replit, we have a program called 100 Days of Code, where if you spend 20 minutes a day doing a little bit of coding, in three months you can be a good enough programmer to build a startup. Eventually you'll have people working for you and upskilling, but you'll have enough skills. I want to throw out a challenge to the listeners, if they complete this program and build something that could become a business, I'd like to help them promote it, give them some cloud credits and so on, just @ me on Twitter and mention this podcast.


Tucker:What's your Twitter handle?


Amjad Masad:@amasad.


Tucker:But there are a lot of vested interests, and I don't want to get into the Covid thing, but I think you saw that during Covid, the motivations were always mixed. I do think there are high motives mixed with low motives because that's human nature. But at some point the profit motive trumps public health. I think that's fair. So if they're willing to hurt people just to maintain their stock price, how can they resist giving people free access to more accurate diagnoses through machines?


Amjad Masad:To some extent, that's why I think open source AI and people learning how to do some of this on their own is so important. Of course, if there are companies that are providing these services, it would be better. But the fact that AI exists and a lot of it is open source and you can download it to your machine and use it is enough to help a lot of people. Of course, you should always communicate with your doctor, and I am also, I am not giving people advice to solve these problems on their own, but I do think it is empowering people. This is the first step.
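
Note: As one concrete, hedged example of "download it to your machine and use it": the sketch below uses the open-source Hugging Face transformers library (with PyTorch installed) to run a small open model locally. The model named here is just a tiny example; larger open-weight models are used the same way, given enough hardware.

```python
# Run an open-source text model locally with the Hugging Face `transformers` library.
# Requires: pip install transformers torch
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")  # weights download once, then cached locally
result = generator("Open-source AI matters because", max_new_tokens=40, do_sample=True)
print(result[0]["generated_text"])
```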


Tucker:For someone like me, I will not go to the doctor until he apologizes to my face for four years of lying. I have no respect for doctors, or for anyone who lies. I will not take life advice from a liar, especially health advice, because that would be insane. I will not take real estate advice from a homeless person, or financial advice from someone in prison for fraud. I'm sure there are doctors who will apologize, but I haven't met them yet. So for people like me, who won't go to the doctor until the doctor apologizes, this kind of AI diagnostic system may actually save lives.


Billionaires Turn Right


Amjad Masad:As for whether there will be a regulatory capture issue, I think that's why you see Silicon Valley starting to get involved in politics. Silicon Valley has always been involved in politics to some extent. I remember coming here in 2012, during the Romney-Obama debates. People were just making fun of Romney; he said something like "binders full of women," and Obama took the opportunity to beat him up over it. I remember asking everyone around me, who do you support? They said, the Democrats, of course. I asked why no one here supports the Republicans. They said, because they're stupid; only stupid people support the Republicans. Silicon Valley is like a one-party town.


There are some data on corporate political donations: at Netflix, for example, 99% of donations went to the Democrats and only 1% to the Republicans. The party diversity in North Korea is actually a little better than that.


Tucker:Their media is also more honest.


Amjad Masad:But now you can see that more and more people in technology are beginning to support the Republican Party, and even Trump. Most notably Marc Andreessen and Ben Horowitz, who recorded a two-hour podcast discussing why they support Trump.



Tucker: They are the largest venture capital firm in the United States.


Amjad Masad: I don't know by what metric, but they are certainly on their way to being the largest, and they are undoubtedly the best.


Tucker: What did they say? I should have read it, but I didn't.


Amjad Masad: They explained why they voted for Trump. Which, by the way, they would never have done in 2018 or 2019. So it's a shift in atmosphere.


Tucker: How has that shift been received?


Amjad Masad:The response has been mixed, but it's much better than it would have been 10 years ago, when publicly supporting the Republican Party would have gotten them ostracized and no one would have taken their money.


Tucker:I'm looking at this from the outside, and Andreessen Horowitz is so big and influential, and they're seen as smart, with no hint of craziness about them. If Andreessen Horowitz does this, it will definitely change people's perceptions.


Amjad Masad:Yeah, it will definitely change people's perceptions. I think it will give a lot of people courage to at least say they support Trump. I think it really changed my mind.


They've put forward a "little tech" agenda. You know, there's "big tech" lobbying now, so who is lobbying for small companies like us? What about companies with only one or two people? There's actually no one. For example, the Democrats are very keen on regulating AI. One of the funniest moments was when Kamala Harris was invited to an AI safety conference where people were talking about existential threats, and she said that people being denied medical care was an existential threat to them.


She interpreted "existential threat" to mean that any risk to anyone is an existential threat. That's just one example. But they moved very quickly on regulation; they issued an executive order.


Tucker:The changes they've made so far, from a user perspective, are just making sure to hit white people. It's actually pushing a dystopian totalitarian social agenda, a racist social agenda. Is this going to be permanent?


Amjad Masad:It's changing. It's reversing; it's not perfect, but it's changing. I think it's a cultural shift. I think Elon buying Twitter and letting people discuss and debate freely has pushed the culture in a more neutral direction. That's positive, because the culture inside these companies was very left-leaning, to the point of designing products in a way that rendered George Washington as a black George Washington.


Tucker:That's just a lie, and it disgusts me. I don't want to hear lies, George Washington was not black, the founding fathers were white Protestants. Sorry, that's the truth. Please accept it. If you're going to lie about this, you're my enemy.


Amjad Masad:I think it was a minority of people in the company doing this, but they were the ones in control. When I joined Facebook in 2015, I saw the culture change. A small number of activists made the company's leadership afraid of employees.


Tucker:This is something that really worries me. Silicon Valley is defining our future, and courage is the first element of leadership. If the leaders of the company are afraid of 26-year-old unmarried HR girls, it's cowardly. They should be told clearly: "You are not the leader of the company, I am." This is simple, I don't know why it is so difficult.


Amjad Masad:I think these companies went completely crazy in the competition for talent. The US economy was in the zero-interest-rate era and everyone was throwing money at talent, so even slightly offending employees made you afraid they would leave. I'm only making excuses for them here, but it is indeed one of the reasons.


Tucker:You can answer this question, because you came to the Bay Area from Jordan to pursue creativity and insight. Are people who can write code, like you or James Damore, less likely to be political activists?


Amjad Masad:Most programmers are indeed more rational, but I don't think learning to code automatically makes people more rational. Programming can help you become more rational, but programmers are still easily swayed by emotions.


Tucker:So the brain is not a computer?


Amjad Masad:The brain is not a computer, that's exactly my point. I'm probably one of the biggest advocates for learning to code in the US, because I built the first software that made it easy to write code in a browser, and it became really popular. Then I got involved with Codecademy, which helped teach about 50 million people how to code. But while programming is a tool, I never really believed it taught people how to think. Programming is a fun tool for building, automating, and creating, but I don't think it makes people more rational. People can become more rational with education and practice, but you can't force them to be rational.


Tucker:I totally agree. I always thought that people who write code had some personality traits, but that's not true.


Amjad Masad:That may be true for programmers, but anyone can learn to write a lot of code.


Tucker:We haven't mentioned Elon Musk and David Sacks yet. They both support Trump, too. So do you think the atmosphere in Silicon Valley is really changing?


Amjad Masad:Yeah, I actually think it's thanks to Sacks, maybe even more than Elon. You see, it's a one-party system: no one reads conservative outlets, and most people are never exposed to right-wing or center-right views. You're just immersed in liberal Democratic views. I think Sacks' All In podcast was the first time a lot of people heard conservatives talk on a weekly basis.


Sacks’ All In podcast recently had an episode with Trump


Tucker:That’s really interesting.


Amjad Masad:I started hearing people at parties and other places describe their political views as “Sacksisms.” They’d say, “Hey, you know what? I agree with you, and most of the time I agree with what Sacks says on the All In podcast.” And I say, “Yeah, you’re probably moderate or center-right right now.” He’s very reasonable.


Tucker:First of all, he seemed to me to be a really nice guy. But I had no idea how influential that podcast was. Until one day he invited me to be on his podcast, and I said, sure, because I love David Sacks. After I did it, everyone I knew was texting me and saying, "Oh, you were on the All In podcast." It wasn't my field, and I didn't realize it was a way to reach business-minded people who weren't very political but might donate to a candidate.


Amjad Masad:That's right. And by the way, this is also my point about how technology can have a centralizing effect, or it can have a decentralizing effect. You could say YouTube is centralizing, and they're pumping opinions into us. But now you have your own YouTube platform after you got fired from Fox.


Tucker:Yeah.


Amjad Masad:Sacks also has his own platform where he can express these views. I think, during Covid, there was a time when I felt like they were going to shut everything down.


Tucker: Do you feel that way?


Amjad Masad: Yeah, and maybe there will be some other event in the future that gives them a reason to shut them down. But one thing I really like about America is the First Amendment. It's the most important institutional innovation in human history.


Tucker: I couldn't agree more.


Amjad Masad: We really should protect it. We should cherish it as much as we would our wives.


Tucker: I couldn't agree more. Could you please repeat your description of its historical importance? Sorry, you said it so well.


Amjad Masad: It's the most important institutional innovation in human history.


Tucker:The First Amendment is the most important institutional innovation in human history. Yeah, I love that statement. I think it's absolutely true. As someone who grew up in a country with the First Amendment, you don't particularly notice it; it's like gravity, it's just there. But for someone who grew up in a country without that protection, and that's true of every other country on Earth...


Amjad Masad:Only one country has that kind of protection.


Tucker:You'll see that's the key to what makes America America.


Amjad Masad:Yeah, it's allowed us to change direction. That's what's allowed us to get out of the so-called "woke" culture and the mob mentality of blind obedience. I think Elon buying Twitter and allowing us to have free discussion and debate is one of the big reasons we've gotten out of this craziness.


Tucker:It's beautiful, I'm a direct beneficiary of that, and I think everyone in the country is. I'm not disparaging Elon, I like him a lot, but it's a little strange that a foreigner is doing this. It's a little disappointing, why isn't there an American-born person doing this? Maybe it's because we're so used to it.


Amjad Masad:Yeah, I started to see the value of meritocracy being eroded, and I guess that's part of why I wrote that post. I realized that most Americans here don't really value that.


Tucker:I hate to say that because I've always thought foreigners were awesome. I love traveling to foreign countries, and my best friends are foreign-born, even though I'm against mass immigration.


Amjad Masad:The Arabs love you.


Tucker:Oh, I love the Arabs too. We got rid of the brainwashing. By the way, I think we had a bad experience with the Arabs 23 years ago, and a lot of Americans don't realize that, but I know from many trips to the Middle East that it was terrible. But that doesn't represent the people I've had dinner with in the Middle East. Someone once told me that those were the worst people in our country. I couldn't agree more with that, and I'm always a heartfelt defender of Arabs. I do wonder about some of the higher-income immigrants in particular; I've noticed lately that they seem to be picking up some of the anti-American crap they're getting from academic institutions. You know, you're from the Punjab, you go to Stanford, and all of a sudden you have the same decadent attitudes that the native-born professors at Stanford have. Do you feel that way?


Amjad Masad:I'm not sure about the distribution, but among Indians there are people like Vivek who are thoughtful and not only support the First Amendment but understand why it's important.


Tucker:Yeah, he's a great example. Vivek not only thinks the First Amendment is good, he understands why it's good.


Amjad Masad:You asked about the changes in Silicon Valley. I think part of it is that there are still a lot of people in Silicon Valley who can say and do these things because they are protected by the First Amendment.


