YouTube’s child-safety problem persists despite the company’s various efforts to control and eliminate content unfit for children.
Issues center on YouTube’s autoplay and recommendation functions, which child abusers are known to manipulate so that unsavory content surfaces from an otherwise innocent YouTube search.
The problems raised here have broader implications for how we will go about accessing content on the web in the future.
It also raises recurring questions about free speech in a social landscape marked by hate, child exploitation, and an ever-growing flood of data.
How will YouTube resolve the problem?
Over the years, YouTube and its parent company Google have made various efforts to eliminate content that exploits children.
As officially announced on YouTube’s blog recently:
"Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families. Over the years, we’ve heavily invested in a number of technologies and efforts to protect young people on our platform, such as our CSAI Match technology. And in 2015, because YouTube has never been for kids under 13, we created YouTube Kids as a way for kids to be able to safely explore their interests and for parents to have more control. Accounts belonging to people under 13 are terminated when discovered. In fact, we terminate thousands of accounts per week as part of this process."
Here we have it: a two-pronged effort to eliminate abusive content (via CSAI Match technology) and to manage demographic exposure through a child-friendly platform, YouTube Kids.
CSAI Match technology
CSAI (child sexual abuse imagery) Match is an algorithm that scans videos for patterns matching previously flagged child-exploitation content.
CSAI Match also relies on human reviewers who, through NGOs and partner companies, develop and maintain the reference databases and review flagged content. Organizations currently on board include Reddit, Tumblr, and the Canadian Centre for Child Protection.
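YouTube has not published CSAI Match’s internals, but the core idea, comparing fingerprints of uploaded frames against a shared database of known abusive material, can be sketched as follows. The function names and the use of a cryptographic hash here are illustrative assumptions; real systems use perceptual hashes that survive re-encoding and cropping, and the database is curated by partner organizations rather than hard-coded.

```python
import hashlib

# Hypothetical database of fingerprints of previously flagged content.
# In production this would hold perceptual hashes contributed by partner
# organizations, not simple cryptographic digests of raw bytes.
KNOWN_FLAGGED = {
    hashlib.sha256(b"example-flagged-frame").hexdigest(),
}

def fingerprint(frame_bytes: bytes) -> str:
    """Reduce a video frame to a compact, comparable fingerprint."""
    return hashlib.sha256(frame_bytes).hexdigest()

def scan_video(frames: list) -> list:
    """Return indices of frames whose fingerprints match known flagged content."""
    return [i for i, frame in enumerate(frames)
            if fingerprint(frame) in KNOWN_FLAGGED]

frames = [b"innocent-frame", b"example-flagged-frame", b"another-frame"]
print(scan_video(frames))  # -> [1]
```

The important design point is that matching happens on fingerprints rather than raw video, so databases of known abusive material can be shared between platforms without ever distributing the material itself.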
The second prong is YouTube Kids, which, for those unfamiliar, is a child-oriented YouTube platform with specially monitored content and parental filter controls.
As YouTube officially comments:
“The app makes it safer and easier for children to find videos on topics they want to explore and is available for free on Google Play and the App Store in the U.S.”
As recently reported in Bloomberg, YouTube’s efforts are centered on its child site:
"The app, created four years ago, filters videos from the main site specifically for children under thirteen, who are protected by federal law from forms of digital data collection. The app has faced criticism, that it’s too addictive, lowbrow and unedited, but YouTube Kids is, relatively speaking, a haven from the dangers of the open web and YouTube.com. “We strongly encourage parents that the general site is not made for kids,” Blum-Ross said."
The shortcomings of YouTube’s approach
Although YouTube Kids sounds hopeful, the company still faces many challenges around this issue.
As one University of Michigan assistant professor of pediatrics and expert on childhood development stated, “Many parents have expressed that their child refuses to go back to YouTube Kids.”
She went on: “It’s too baby-ish, too restrictive. Now that they’ve let the genie out of the bottle with YouTube main, it’s hard to reverse course.”
If YouTube doesn’t manage to reverse these trends, it will have to step up efforts on the main site by restricting live features, disabling comments on videos featuring minors, and reducing recommendations.