X limits searches for Taylor Swift after sexually explicit AI photos went public.

Searches for Taylor Swift on X, formerly known as Twitter, no longer return results, days after sexually explicit AI-generated photos of the singer went viral on the platform. As of Monday morning, searching her name returned the message "Error. Try reload." Putting quotation marks around her name still surfaced posts that mention her.

The search block follows a flurry of sexually explicit deepfake images of Swift spreading across social media, angering fans and underscoring the dangers of the technology. The BBC and the Associated Press reported that X's head of business operations, Joe Benarroch, called the decision a "temporary action" taken to prioritize user safety.

One AI-generated spoof image depicted Swift posing inappropriately at a Kansas City Chiefs game. The Grammy winner has been attending Chiefs games more often to support her boyfriend, tight end Travis Kelce.

"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X, and we have a zero-tolerance policy towards such content," the platform said in a statement. "Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them."

A smiling Swift kissed Kelce at Sunday's game between the Chiefs and the Baltimore Ravens, which the Chiefs won 17-10 to reach the Super Bowl. The episode highlights how AI images can be generated from simple text prompts without a subject's consent, raising privacy concerns.

AI-generated deepfakes, media manipulated by machine learning to produce realistic but fabricated images, video, and audio, are also being used to manufacture phony celebrity endorsements.

In recent years, several phony photographs have gone viral, including images last year that appeared to show former President Donald Trump being arrested and dragged away by police. For now, AI-generated photos can still be examined for telltale flaws; one Trump arrest image depicted him with three legs. Eventually, researchers argue, there will be no visible difference between a genuine photograph and an AI-generated one.

"I'm very confident in saying that in the long run, it will be impossible to tell the difference between a generated image and a real one," Berkeley computer science professor James O'Brien told USA TODAY. "The generated images are just going to keep getting better."