Neil deGrasse Tyson has pushed back against AI-generated videos that use his likeness to promote ideas he does not hold. In a StarTalk clip posted October 30, he said some of the clips are blatant fakes and warned they cross a line when viewers can't tell they're parodies.
AI tools are making it easier than ever to graft a public face onto scripted lines, and Tyson says that’s a problem when the result looks real enough to convince people. Some of the most extreme clips have pushed conspiracy-style claims, including flat earth rhetoric, and that has prompted him to call them out publicly.
“I’m flattered that people want to put me into content in a way that attracts audiences. I’m seriously flattered by that. However, it’s a deepfake,” he says in the video. He added that when a clip isn’t obviously a parody and viewers accept it as genuine, it becomes transgressive to the integrity he has worked so hard to build, and something needs to be done about it.
Tyson said actor Terry Crews fell for one of the fake clips, which pushed him to call it out. He also noted he doesn’t endorse products, so videos that put him behind ads for soft drinks or sneakers are clear red flags. “If you do (see it), it’s not simply ‘chances are it’s a deepfake,’ it is a deepfake! Pure and simple,” he said. Tyson allowed that obvious parodies can be fine, and he’s enjoyed lighter trends that rework public figures for laughs, such as the babyfied edit trend on TikTok, but he draws a clear line at content meant to deceive rather than amuse.
Tyson also said he rarely gives actionable instructions in his appearances beyond his signature line to keep looking up, so videos in which he tells viewers to take specific actions are another sign they’re not genuine.
Join the conversation and share your thoughts on X, Bluesky, or YouTube.