YouTube will cut off ad revenue for creators of artificial intelligence-generated and mass-produced videos starting July 15, targeting what the company calls “inauthentic” content.
The policy targets automated material flooding the platform, including AI-powered music channels that have attracted millions of followers, automated news videos, and AI-generated true crime series.
YouTube will revise its partner program rules to provide clearer criteria for identifying problematic content. Previously, the company required “original” content for monetization but lacked specific guidelines for AI-generated material.
Starting July 15th YouTube will only pay creators who use their real voice and produce original content.

Following will not be eligible for monetization:
– Reused or repurposed videos
– Copied content
– Low effort uploads
– Fully AI-generated videos

— SAY CHEESE! 👄🧀 (@SaycheeseDGTL) July 9, 2025
Some creators expressed concern that the changes could limit monetization of reaction videos or content featuring clips, but YouTube’s head of editorial and creator liaison said the update targets only clearly inauthentic material.
The move comes as the video platform struggles with AI-generated phishing scams and fake content despite existing detection tools.
The crackdown could influence how other social media platforms address similar content quality challenges.
YouTube, owned by Alphabet Inc. (Nasdaq: GOOG), generates revenue by sharing advertising income with qualified creators.
Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.