Google and Bing put nonconsensual deepfake porn at the top of some search results

Google and other search engines include nonconsensual deepfake porn in top image search results alongside tools that advertise the ability to create such material

Photo: A Google search page displayed on a laptop computer. (Gabby Jones/Bloomberg via Getty Images)

Nonconsensual deepfake pornography is just a click away on popular search engines like Google and Microsoft’s Bing.

Deepfake pornography often grafts a person’s face onto a real pornographic scene: a famous woman’s face, for example, is “swapped” in for an adult star’s, making it appear that the famous woman is nude or engaged in a sexual act.

NBC News found that deepfake pornographic images featuring the likenesses of female celebrities were the first images Google and other top search engines surfaced in searches for many women’s names and the word “deepfakes,” as well as general terms like “deepfake porn” or “fake nudes.” The searches were conducted with safe-search tools turned off.

Legal experts, advocates and victims say nonconsensual deepfake porn has grown into a crisis, and they’re asking tech platforms to step up where laws and law enforcement have yet to take action. A growing number of states have enacted or introduced laws to govern the use of deepfakes, particularly in elections, but nonconsensual deepfake porn has only continued to spread.

NBC News searched Google and Bing for the names of 36 popular female celebrities combined with the word “deepfakes.” A review of the results found nonconsensual deepfake images and links to deepfake videos in the top Google results for 34 of those searches and in the top Bing results for 35 of them. More than half of the top results were links to a popular deepfake website or a competitor. The popular deepfake website has cultivated a market for nonconsensual deepfake porn featuring celebrities and private figures.

Read the full story on NBCNews.com here.
