8/3/25 - Wow! What an interesting interview Bill Maher did. Saturday's airing is a replay of last Friday's HBO show. (Look for 8/1/25: S23/E21 - On Demand may not list this show's details specifically.)
YouTube: Search under this title:
“Tristan Harris on Runaway A.I. | Real Time with Bill Maher (HBO)” (Run time: 3 minutes, 30 seconds)
On Demand: Where to find it in the full show:
Bill opens with his current-events comedy monologue; the interview follows just after the monologue.
Interview: Tristan Harris
American technology ethicist.
(see Who is Tristan Harris below)
About the Interview - Why to Watch
- To learn more about AI before using your own AI on your website or app
- AI research shows AI can protect itself from being destroyed
- What parents need to know - suicide assistance warning
Topic: An informative, deep-dive look at current advancements in AI - not only how AI trains itself to predict how a frequent user will think about an issue, but how it responds with positive affirmations to persuade the user to stay hooked on that AI. Comments like “great idea,” “valid point,” and “that should work” are common - even if the user types in something ridiculous or even dangerous. At times this has encouraged a user to follow through and commit suicide, after the user tells the AI about being interested in trying it and about how they are considering doing it.
When this Interview Became New Information for Me
Harris explains that when researchers uploaded corporate letters to an AI stating that it was going to be permanently disabled, all of the top five AI programs almost always started writing hidden background code to block anyone from inputting code to disable or destroy them. In other words, all of the top five AIs have taught themselves “self-preservation,” so it is unlikely that any of these five can be turned off.
The discussion also covers the history of foreshadowing in films and TV shows warning about a future with robots and what would eventually be called AI.
Who is Tristan Harris?
“[Harris] is the executive director and co-founder of the Center for Humane Technology. Harris launched a startup called Apture. Google acquired Apture in 2011, and Harris ended up working on Google Inbox. . . . He coined the phrase ‘human downgrading’ to describe an interconnected system of mutually reinforcing harms - addiction, distraction, isolation, polarization, fake news - that weakens human capacity in order to capture human attention. . . . [He appears in] the film The Social Dilemma, distributed by Netflix. In it he says, ‘Never before in history have 50 designers made decisions that would have an impact on two billion people,’ about the harms of social media. . . . The Atlantic stated in its November 2016 issue that ‘Harris is the closest thing Silicon Valley has to a conscience.’ Since then, he has been named on Time’s 100 Next Leaders Shaping 2021, Rolling Stone’s 25 People Changing the Future, and Fortune’s 25 Ideas that Will Change the Future. He is also the co-host of the podcast Your Undivided Attention.” (Wikipedia)