Summary
- YouTube creators can now control whether their videos are used for AI training, a positive step forward.
- Through Creator Studio settings, users can choose whether third parties may access their content for AI training.
- Despite YouTube’s new feature, concerns remain that Google may still train its own AI on the platform’s videos.
The importance of user-generated content has never been greater, and not just because AI-generated content coexists with it on the internet. Platforms like YouTube, Reddit, and other social media sites serve as hubs for original work created by people like you and me, and the use of that work to train AI models has been a bone of contention for a couple of years now. These big platforms are finally catching up, and YouTube just added an option for creators to control the role of their videos in AI training.
While Google’s Gemini AI willingly summarizes videos from YouTube, creators remain concerned that hours of their effort will be used to train AI models with no fair compensation in exchange. However, YouTube is slowly making amends, perhaps to end the year on a positive note. Following through on plans from 2024, the company just introduced an option for creators to decide whether their videos may be used by third parties for AI training, TechCrunch reports.
In the coming days, Creator Studio for YouTube uploaders will gain a new subsection in Studio Settings called Third-party training. Here, original creators and rights holders can choose to offer their videos to third parties for AI training, and YouTube may extend the option to owners detected through Content ID as well. Detailed support documentation for the feature is already available.
The right way to use YouTube is more complex now
But nothing online is truly private
It’s worth noting that Creator Studio’s new addition only controls third-party access to your content for AI training. Moreover, YouTube isn’t exactly angelic in its ways, and has worded its announcement carefully. We believe that since nothing is said about first-party AI training, Google will continue training Gemini and its other AI products on the trove of original videos on YouTube while depriving rivals of the same resource.
On the flip side, YouTube repeatedly reiterates that scraping videos is neither ethical nor permitted. However, given the liberal access its APIs allow, there’s little standing between a determined AI engineer and YouTube’s content library. So, a simple toggle in the settings sure isn’t a foolproof way to ensure precious content is handled correctly. We just hope tech companies stick to their word and develop AI sustainably because, cool as the applications may be, development remains largely self-governed while lawmakers catch up.