How does YouTube help keep kids and teens protected on the platform?

Kids and teenagers today can access a world of possibilities on YouTube, and they come to the platform to find positive communities that reflect their unique interests. While we provide them with a space and tools to explore their imagination and curiosity, we also understand that these same young people need special protections and considerations when it comes to what they experience and discover online.

Fostering youth safety

What are YouTube’s Youth Principles?

Our Youth Principles guide our work at YouTube and are core to our efforts to create a safer, more enriching environment for young people.

Over the years, we have been dedicated to fostering a safe, high-quality, helpful platform that enriches the lives of kids and families around the world. We’ve put extensive resources into building a robust set of policies and services that protect our entire community and connect people to trusted sources, from combating the promotion of eating disorders and suicide to guarding against dangerous challenges.

What are YouTube Kids and the supervised experiences for pre-teens and teens?

YouTube offers several experiences to meet the unique developmental needs of children and teens, and it equips caregivers with the tools, features, and support to choose the online experiences that are right for their families. And as the ways children and teens show up online continue to evolve, so do our services and policies.

  • YouTube Kids (for children 12 and under) is a separate app built from the ground up to be a safer and simpler experience for kids to explore, with tools for parents and caregivers to guide their journey.

  • Our supervised experience for pre-teens (children under 13 or the relevant age in their country / region) is designed for parents who decide their child is ready to explore the broad array of content on the main YouTube app, but want to be able to tailor their experience. It offers three content settings that account for different parenting styles and individual differences in child development, with each successive setting offering greater access to YouTube’s world of content.

  • Our supervised experience for teens (ages 13-17 in most countries and regions) allows parents and teens to voluntarily link accounts to help give parents insights about their teens’ creation and participation journey on YouTube, spark conversations between parents and teens, and provide timely learning opportunities about how to safely create on YouTube.

How does YouTube help teens spot misinformation and share responsibly?

As part of our efforts to provide teens with a safe and enriching experience online, we partnered with Poynter MediaWise to develop a new media literacy curriculum and resources. Teachers around the world can access these free educational materials for use in the classroom and other learning environments.

The curriculum includes lessons on important media literacy topics, from evaluating sources and evidence to recognizing AI-generated content, along with tips for sharing content responsibly. The full set of lesson plans, slide decks, and videos is available on hitpausewithmediawise.com for educators everywhere, with more materials to come over the next few months.

This collaboration with MediaWise further expands YouTube’s Hit Pause initiative, which uses short videos to teach essential media literacy skills. Hit Pause was first introduced in 2022 with input from the National Association of Media Literacy Education, and its videos have been viewed billions of times.

What child safety measures exist on the main YouTube app?

We invest heavily in the technology and teams that help provide kids and families with the best protection possible.

On YouTube, we require users to be at least 13 years of age (or the applicable age in their country) unless otherwise enabled by a parent or caregiver. Our Community Guidelines outline content that is not allowed on the platform, and specific child safety policies prohibit content that may put children at particular risk.

On YouTube Kids, we work to identify content that is age-appropriate, adheres to our quality principles, and is diverse enough to meet the varied interests of kids globally. This includes specific content policies designed with feedback from external specialists in children’s media, child development, digital learning, and citizenship.

We continue to evolve our platform to ensure an appropriate environment for family content on YouTube, and we make improvements to our products and policies that reflect the input of outside experts and internal specialists where relevant.

Is YouTube collecting children’s data to serve them ads?

We treat data from anyone watching content identified as 'made for kids' on YouTube as coming from a child, regardless of the age of the user. This means that on videos 'made for kids', we limit data collection and use, and as a result, we need to restrict or disable some product features. For example, we do not serve personalized ads on content 'made for kids', and some features are not available on these videos, like comments and notifications. All Creators are required to indicate whether or not their content is 'made for kids'.

Personalized ads are prohibited on YouTube Kids, as well as for users in a supervised experience on YouTube. This means the ads that appear are matched to videos being watched based on the content, not the specific user watching.
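In practice, this amounts to a single policy flag driving several downstream decisions about data use and feature availability. The sketch below is a minimal illustration in Python with entirely hypothetical names; it is not YouTube's actual code or API, only one way such a gate could be expressed:

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    made_for_kids: bool  # hypothetical flag: designation set by the creator

def playback_features(video: Video) -> dict[str, bool]:
    """Illustrative only: derive feature availability from the
    'made for kids' designation, per the rules described above."""
    if video.made_for_kids:
        # Data from viewers of this video is treated as coming from a child:
        # limit data collection, serve only contextual (non-personalized)
        # ads, and disable features such as comments and notifications.
        return {
            "personalized_ads": False,
            "comments": False,
            "notifications": False,
            "limited_data_collection": True,
        }
    return {
        "personalized_ads": True,
        "comments": True,
        "notifications": True,
        "limited_data_collection": False,
    }
```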

How is YouTube protecting kids who post content as a Creator or feature in videos by other Creators?

As noted in our Terms of Service, children under the relevant age of consent may use YouTube or YouTube Kids (where available) if enabled by a parent or legal guardian.

On YouTube Kids, and within a parent supervised experience on YouTube, children aren't able to upload content, live stream, read or write comments, or access any features that involve financial transactions. We have additional protections for minors on live streams, such as terminating streams featuring minors who aren't visibly accompanied by an adult.

We also provide best practices for child safety and prominent prompts for Creators with kids in their videos so they understand their legal obligations. In addition to securing consent, it is their responsibility to comply with all laws, rules, and regulations applicable to kids’ appearance in their content, including required permits, wages or revenue sharing, schooling and education, and working environment and hours.

How does YouTube restrict access to mature content?

We are committed to providing age-appropriate experiences when people come to YouTube. We recognize that Creators are generally the best judge of who should see their content, so we give them the ability to age-restrict their own content when appropriate. When content is age-restricted, viewers must be signed in and their account age must be 18 or older to view the video. If they aren’t, they see a warning and are redirected to other, age-appropriate content. Our Community Guidelines include guidance for uploaders about when content should be age-restricted.

Our reviewers apply age restrictions if, in the course of reviewing content, they encounter a video that isn’t appropriate for viewers under 18. We also use machine learning to detect such content and help us automatically apply age restrictions where appropriate. Uploaders can appeal a decision if they believe it was applied incorrectly.

To keep viewing experiences consistent, viewers attempting to access age-restricted videos on most third-party websites are redirected to YouTube, where they must sign in and be over 18 to view the video. This helps ensure that, no matter where a video is discovered, it is only viewable by the appropriate audience.
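Conceptually, the enforcement flow described above is a sign-in-plus-age gate. Here is a minimal sketch, assuming a hypothetical account model; it is not YouTube's actual implementation:

```python
from datetime import date
from typing import Optional

MIN_AGE = 18  # account age required to view age-restricted content

def age(birth_date: date, today: date) -> int:
    years = today.year - birth_date.year
    # Subtract one if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def can_view_age_restricted(signed_in: bool, birth_date: Optional[date]) -> bool:
    """Viewer must be signed in and their account age must be 18 or older;
    otherwise they are warned and redirected to age-appropriate content."""
    if not signed_in or birth_date is None:
        return False
    return age(birth_date, date.today()) >= MIN_AGE
```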

How does YouTube protect children who appear in risky situations in videos?

We have always had clear policies against videos, playlists, thumbnails, and comments on YouTube that sexualize or exploit children. We use machine learning systems that help proactively detect violations of these policies, and human reviewers around the world quickly remove violations detected by our systems or flagged by users. We immediately terminate the accounts of those seeking to sexualize or exploit minors, and we report illegal activity to the National Center for Missing and Exploited Children (NCMEC), which liaises with global law enforcement agencies.

While some content featuring minors may not violate our policies, we recognize that minors could still be at risk of online or offline exploitation. This is why we take an extra-cautious approach to enforcement. Our machine learning systems help proactively identify videos that may put minors at risk, and we apply our protections at scale, such as restricting live features, disabling comments, and limiting recommendations.

We also work with the industry, including technology companies and NGOs, by offering our industry-leading machine learning technology for combating CSAI (Child Sexual Abuse Imagery) online. This technology allows us to identify known CSAI content in a sea of innocent content. When a match is found, it is flagged to partners so they can responsibly report it in accordance with local laws and regulations.
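At its core, matching known content is a fingerprint lookup: new material is fingerprinted and compared against a database of fingerprints of previously identified material. The sketch below uses an exact cryptographic hash purely for illustration; production systems rely on far more robust fingerprinting that survives re-encoding and edits, and every name here is hypothetical:

```python
import hashlib

# Hypothetical database of fingerprints of previously identified material.
KNOWN_FINGERPRINTS: set[str] = set()

def fingerprint(content: bytes) -> str:
    # Exact cryptographic hash for illustration only; real systems use
    # perceptual fingerprints that tolerate re-encoding and edits.
    return hashlib.sha256(content).hexdigest()

def matches_known_content(content: bytes) -> bool:
    """Flag content whose fingerprint matches the known database,
    so partners can report it in accordance with local laws."""
    return fingerprint(content) in KNOWN_FINGERPRINTS
```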
