Twitch is, Vaguely, Working to Fix its Child Predator and Groomer Problem

Source: 11/23/22

Twitch has been hounded for months after a September report showed just how many sexual predators were stalking the streaming platform’s halls, targeting children that were not supposed to be there in the first place. Now the company says it’s developed systems to combat child sexual abuse material, though it’s not exactly sharing the details.

In a Tuesday blog post, Twitch said it is working on a “constantly evolving approach” to limit harm to young people. The user-streaming platform said it’s creating phone verification requirements for “potentially vulnerable accounts,” AKA those accounts made by young people pretending to be over 13, before they can livestream. The company said it’s also working to delete more accounts belonging to users under 13 years old.

Read the full article




We welcome a lively discussion with all viewpoints - keeping in mind...


  1. Your submission will be reviewed by one of our volunteer moderators. Moderating decisions may be subjective.
  2. Please keep the tone of your comment civil and courteous. This is a public forum.
  3. Swear words should be starred out such as f*k and s*t
  4. Please stay on topic - both in terms of the organization in general and this post in particular.
  5. Please refrain from general political statements in (dis)favor of one of the major parties or their representatives.
  6. Please take personal conversations off this forum.
  7. We will not publish any comments advocating for violent or any illegal action.
  8. We cannot connect participants privately - feel free to leave your contact info here. You may want to create a new / free, readily available email address.
  9. Please refrain from copying and pasting repetitive and lengthy amounts of text.
  10. Please do not post in all Caps.
  11. If you wish to link to a serious and relevant media article, legitimate advocacy group or other pertinent web site / document, please provide the full link. No abbreviated / obfuscated links. Posts that include a URL may take considerably longer to be approved.
  12. We suggest composing lengthy comments in a desktop text editor and copying and pasting them into the comment form.
  13. We will not publish any posts containing any names not mentioned in the original article.
  14. Please choose a short user name that does not contain links to other web sites or identify real people
  15. Please do not solicit funds
  16. If you use any abbreviation such as Failure To Register (FTR), or any others, the first time you use it please expand it for new people to better understand.
  17. All commenters are required to provide a real email address where we can contact them.  It will not be displayed on the site.
  18. Please send any input regarding moderation or other website issues via email to moderator [at] all4consolaws [dot] org
  19. We no longer post articles about arrests, only selected convictions. If your comment contains a link to an arrest article we will not approve your comment.
ACSOL, including but not limited to its board members and agents, does not provide legal advice on this website.  In addition, ACSOL warns that those who provide comments on this website may or may not be legal professionals on whose advice one can reasonably rely.  


Interesting that they’re using technology to try to combat the issue, yet they refuse to say what that technology is.

Adriana Chechik is on Twitch. Perhaps kids shouldn’t be online at all.
Humans successfully reared children without the internet from the beginning. A computer need not be connected to a network to function. If we restrict children’s access to internet culture, we’re much better off. IMO lending children unattended access is irresponsible.
I suspect too many use the DDI as a babysitter. Twitch is a very unique use of the DDI, and unlike YT, no algorithm is used to select what you watch. You just do not get the extraneous clickbait vids as you do on other social media platforms. Twitch is the ideal venue to engage in study in the field of communication studies. Twitch LIVE streamers are engaged in a labor market, a market only made possible by the ability to communicate at near light speed. That fact has been harnessed by purveyors of databases and their most creative programmers to create unique protected markets. Twitch has done a fine job with their platform dynamics and I can find no fault in them.
As for what’s available to children, in my mind that isn’t their problem at all. Any social blowback they’re going through mirrors the rest of the American landscape, sapped by invasions of free speech.

Kids aren’t stupid enough to meet some random old man somewhere. Not saying it doesn’t happen, but it’s very rare.
The only people on Twitch grooming people are law enforcement. It’s like fishing for them: they bait the hook and let it sit, hoping to snag someone, and the sad part is people are stupid enough to fall for it.


So it’s like how TikTok made the minimum age to go live 18.

So the sky is falling down because this one researcher, who requires anonymity, says so. At least the Psychology Today researcher put his name to his fake news article and the “frightening and high” statistic.

I am somewhat concerned about the opacity of the technology, given the stakes involved, and there are reasonable concerns about people trying to “game the system.” There could also, of course, be more to these efforts than the article discusses. Overall, though, I am impressed and pleased to see Twitch taking what seems to be a more thoughtful and likely more productive approach to this issue than companies more prone to moral panic (or really PR panic stemming from the former).

This is for two interrelated reasons. The first is that Twitch is willing to acknowledge implicitly, despite the canned PR statement, that as Mike Masnick has put it, content moderation at scale is fundamentally impossible. This means that on a platform as massive in terms of both user population and content volume as Twitch:

  1. You are likely never going to eliminate *all* problematic content – something is always likely to slip through the cracks due to the sheer amount
  2. Relatedly, there are always going to be moderation decisions that are judgment calls gone wrong: content or accounts which shouldn’t be removed but are, and vice-versa

Neither of these, however, implies that problems cannot be significantly *mitigated* and their incidence meaningfully reduced. Twitch seems to recognize this, and measures which acknowledge this reality are more likely to meaningfully move the needle, with fewer purely symbolic outrage measures or disproportionate or even counterproductive ones. Their existing default message settings are a good example.

Also relatedly, the focus on under-13s is sensible. If content moderation at scale is fundamentally impossible, it makes sense to allocate scarce resources to the most vulnerable population, which is also the easiest to detect. It also recognizes that it is disproportionate and unfair to restrict the activities of, say, high schoolers on the site over this, many of whom, like my son, are big gamers and know perfectly well how to use the biggest and most important game streaming platform on the planet.

Here is a case from the civil rights sphere where cameras and photos of homes posted online via social media are a hot topic. Listen to the fury of one WV judge complaining about a defendant posting a photo of his home. The commenter is John H Bryan Esq.
Title: Wild WV Judges update. Take note of this YouTuber’s demure, shattering closing statement. And amen to it.