YouTube rolls out new rules to curb AI content: Who is affected (and who is safe)?
Big changes are coming to YouTube, and if you’re a creator, you’ll want to pay attention. From July 15, 2025, YouTube will stop monetising videos that are largely AI-generated, repetitive, or made with little real human input. The platform wants to make sure that thoughtful, original content doesn’t get lost in a sea of low-effort uploads.
In its official statement, YouTube said: “In order to monetise as part of the YouTube Partner Programme (YPP), YouTube has always required creators to upload ‘original’ and ‘authentic’ content. On July 15, 2025, YouTube is updating our guidelines to better identify mass-produced and repetitious content. This update better reflects what ‘inauthentic’ content looks like today.”
Who’s most at risk?
Creators who rely heavily on automated videos could be impacted. That includes channels that lean on AI voiceovers, basic slideshow visuals, or the same format repeated over and over with little variation. This kind of content has become especially common in YouTube Shorts, and some reports say that more than 40 per cent of Shorts now include AI in some way.
YouTube isn’t banning AI, but it’s sending a clear message. If your content feels like it was made by a machine without any personal effort or creative input, it may no longer qualify for monetisation.
Who’s safe?
If you show up in your videos, speak in your own voice, or bring your personality and ideas to the content, you’re in the clear. YouTube has stressed that this policy isn’t targeting creators who use AI as a helpful tool. It’s focused on those who use it to replace originality altogether.
So, if you create explainers, tutorials, reactions, or any kind of video where your voice or vision is present, you’re doing exactly what YouTube still values. In fact, your content may even have a better chance of standing out.
The basic requirements for monetisation haven’t changed. You still need 1,000 subscribers and either 4,000 valid public watch hours in the past year or 10 million Shorts views in the past 90 days. But now, YouTube will also be looking closely at how your content is made. If it looks and feels mass-produced, it may not make the cut.
YouTube is also rolling out a new rule for transparency. From now on, creators will need to clearly disclose if their videos include AI-generated voices, faces, or visuals that could mislead viewers. If they fail to do so, their videos could be taken down or demonetised. This is especially important in cases involving deepfakes or impersonations.
AI tools have made it easier than ever to create content, but not necessarily meaningful content. With this update, YouTube’s message is clear: technology should support creativity, not replace it.