Ask any parent and they’ll tell you that kids love YouTube.

But is YouTube respectful or responsible in how it interacts with our children?

Recent revelations concerning the safety of the site for children have left parents and schools worried about the impacts of the social media giant. Many school districts block children from unfettered access to YouTube on school technology, a recognition that the site isn’t entirely safe. Parents, meanwhile, are often less equipped to handle some of the content that flows into their children’s feeds.

YouTube says the site is not intended for kids younger than 13. But, of course, many of the site’s core users are children. Several of the site’s most-viewed channels market directly to children. Once inside the site, children have access to an enormous library of content, often without parental controls.

YouTube also has come under fire in recent months for how it handles children’s content. YouTube’s autoplay feature selects and automatically plays content the site thinks viewers will enjoy, but this means children or parents can select an appropriate video but end up in a wormhole of unsupervised and inappropriate content. The site also has been criticized for tracking children’s data, creating issues of privacy.

The site has publicly sought to ameliorate these issues by creating a separate platform for children called YouTube Kids, eliminating autoplay suggestions on children’s content and halting the tracking of children’s data. But is this enough?

Some of these issues aren’t going away. The open nature of the site means that once you’re in, you’re in. Kids can access any videos that aren’t specifically flagged for audiences over 18, a designation that typically requires nudity or graphic violence. Many of the remaining videos simply aren’t made for kids, such as sexually explicit music videos. Sometimes adult content can be hard to distinguish from children’s content, such as explicit parodies of children’s shows.

For teenagers, the site presents other obstacles. YouTube’s wormhole-like algorithm has been criticized for its role in spreading misinformation and exposing people to increasingly radical ideas.

For example, if a teen watches a video about debunking conspiracies surrounding the moon landing, the viewer might get a recommendation for a video about flat-earth conspiracy theories. Clicking on that video can lead to other conspiracy theory video recommendations.

While this seems like an innocuous example, the point is that increasingly, our social media platforms pull users to the extremes. The breadth of content and the seemingly endless black hole of recommendations create a dangerous reality for the impressionable.

In Texas, Dallas, Fort Worth and Plano school districts don’t allow students full access to YouTube on school technology. All three allow limited use of the site, primarily for educational videos only.

By limiting students’ access, school districts have shown they believe YouTube is something that should be used in moderation and with care. Perhaps they are onto something.
