Platforms like Twitter, Discord & Reddit Facilitate Child Grooming & Abuse

June 7, 2021 – In recent years, overwhelming evidence has emerged that big tech platforms such as Twitter, Discord and Reddit allow and even facilitate the exploitation and abuse of minors. This is a massive problem, and it lurks beneath the surface of even the biggest platforms, including Facebook, Snapchat and YouTube.

These platforms have poured tons of resources into monitoring, “moderating” and censoring Conservative and Right wing content. Simultaneously they not only ignore the problem of child grooming and pro-pedophile content on their platforms, but they are actively protecting it. These platforms provide special privileges to so-called “protected groups” like the LGBTQ community, but when people try to alert platforms that the groups glorify child sexual abuse, they are accused of “harassment” or “mass flagging” and the pro-pedophile content is protected.

To understand how massive this problem is, let’s look at some of the reporting on it.

Matt Agorist of the Free Thought Project writes “While Banning Pro-Peace Accounts, Twitter Openly Allows Pedophiles to Discuss Raping Children”:

Previously, TFTP reported how a disturbing push was made to attempt to normalize pedophilia as a mainstream ‘sexual orientation.’ The move involved pedophiles rebranding themselves as ‘Minor Attracted Persons’ (MAP) with the hope that they will be accepted like the LGBTQ community. Disgustingly enough, it was somewhat effective as multiple outlets reported it like it was totally acceptable to be sexually attracted to children. While this incident was extremely disturbing, even more worrisome is that this normalization appears to be spreading and as some recent activity on Twitter illustrates, it’s condoned by social media giants.

For those who don’t recall, TFTP had multiple accounts unceremoniously deleted by Twitter in October of 2018 on the very same day Facebook removed all of our accounts. TFTP has never once spread hatred, racism, violence, or anything that even closely relates to these repugnant topics. We have only advocated for peace by ending wars and transparency and accountability in government. Nevertheless, Twitter wiped us away like specs of dirt from under their shoe. They also wiped from existence multiple other pro-peace and freedom accounts like Carey Wedler and the Antimedia.

The reason for mentioning the censorship above is to draw a parallel between speech that helps to make the world a better place getting banned and the current speech which is simply fine and dandy according to Twitter.

Since we reported on Minor Attracted Persons several years ago, the terminology became so popular that it morphed into multiple categories and abbreviations. There are now NOMAPS, which apparently are the ‘best kind’ of MAP because the ‘NO’ means they don’t want to have sex with children. That’s where the pro-c MAPs come in. The ‘pro-c’ denotes pro-contact as in the belief that children can consent into having physical contact and sex with an adult. Children cannot consent to sex with an adult.

Nevertheless, tags on Twitter now openly trend to promote this content. Case in point, #mappositivity. When searching this term on Twitter, one can find an entire community of Minor Attracted Persons both those claiming to be anti-touching as well as those claiming it’s just fine and dandy to rape a child. – TFTP

The article goes on to document screenshots of extremely disturbing content. It should be noted that both my personal Twitter account and the main PSB account were banned for simply reporting on the news in a way the legacy media didn’t like. These pro-child-abuse accounts remain up on Twitter. Here is one of the images that was included in the article; look very closely:

What you see in the profile is a child’s hand, intertwined with an adult hand, suggesting inappropriate and illegal acts. It claims to be with its “baby boy” and indicates its preferred pronouns, another indication that the LGBTQ community is being used to protect and normalize pedophilia.

The Print reports on a professor who claims Twitter will not remove this kind of child grooming content “Twitter is promoting child abuse under new policy, alleges Australia-based professor”:

Twitter’s child sexual exploitation policy has been pulled up by an Australia-based academic, who said the micro-blogging site promotes pedophilia and encourages talk about abuse of minors.

Michael Salter, an associate professor of criminology at the University of New South Wales in Australia, said on the social media platform Friday: ‘…Twitter quietly changed its terms of service to permit discussion about attraction towards minors…’

Twitter’s March 2019 policy on child sexual abuse says that any discussion on attraction towards minors is permitted, ‘provided they don’t promote or glorify child sexual exploitation in any way’.

Approached for comment on Salter’s allegations, a Twitter spokesperson said the social media giant ‘has a zero-tolerance policy for child sexual exploitation and we remain steadfastly committed to preventing the sexual exploitation of minors everywhere’.

Salter, who is also a board director at the Washington D.C.-based International Society for the Study of Trauma and Dissociation, said: ‘…sexual desires and social inclusion of pedophiles have been prioritized by Twitter over the safety of children on the platform.’

Salter’s accusations come barely two months after UK-based Internet Watch Foundation revealed that nearly half of child sexual abuse content in the social media space was being shared openly on Twitter.

Salter alleged that pedophile networks have ‘exploded’ on Twitter in the past one year. ‘This includes users who justify child sexual abuse material, and demand access to child sexual abuse dolls,’ he wrote. He added that he had no confidence in Twitter taking action against a user who claims to be ‘attracted to children, advocates for contact offending against children, and has an image of him with a child in his bio picture.’

Salter has alleged that Twitter’s head of product approves encryption of the site’s private messaging service, and it is only likely to exacerbate pedophiliac content. – The Print

Twitter is often dishonest and provides no real transparency about its actions. There is no consistency in its moderation efforts, nor any way to see how decisions are being made behind the scenes.

End Sexual Exploitation, an organization dedicated to preventing sexual abuse, has reported on this issue with Discord, another platform that went to great lengths to ban “Qanon” accounts but seemingly allows pro-pedophilia content because … well, priorities. These Q accounts also spent a large amount of time speaking out against child sex trafficking and abuse, and it raises the question: is this the real reason they were banned? From the ESE website “Discord Is A Haven For Gamers—And Sexual Exploiters”:

Discord, a popular communication service used by over 100 million active monthly users, has exploded in popularity since the onset of COVID-19 sent most of the world into the digital space. But what started in 2015 as a haven for gamers has quickly morphed into a virtual meeting spot where sexual exploitation and abuse thrive. Now, exploiters go to Discord to groom children for sexual abuse or sex trafficking, and to trade pornography—including child sexual abuse materials, non-consensually recorded and/or shared pornography, and more.

The Discord platform is a widely used text, video, and audio chat app that can be found on computers, browsers, and mobile phones. Originally designed for video game users to chat with each other while playing online games, the app has gained popularity even outside of the gamer community, with over 30% of users claiming they use Discord for activities other than gaming. Discord itself has capitalized on this expansion of their brand, advertising Discord as a way for teachers to reach their classrooms, virtual book clubs to discuss their latest read, and even positioning Discord as a viable workplace alternative to Slack or Microsoft Teams.

The main feature of Discord is the servers–chat rooms where like-minded individuals can join and play and talk about virtually anything together, all in real-time. Servers can be open to the public or private, requiring invites and special passwords to join. Moderation within these servers has historically been mostly left up to the members themselves, relying on user reports to catch bad behavior. Discord’s marketing chief confirmed this, saying: ‘We will not go into a private server unless something is reported to us. We believe deeply that privacy is a right and something we should support as a company.’ The age verification procedures are similarly lax. Users were not even asked their birthday upon signup until mid 2020—which now remains the only real age verification Discord provides for its users.

The laissez-faire moderation attitude from Discord has created an environment that fosters sexual exploitation and abuse on a mass scale.

This is why we’ve named Discord to the 2021 Dirty Dozen List, an annual campaign naming entities that profit from and facilitate sexual exploitation. Learn more about the rest of the list here, and TAKE ACTION urging Discord to take a stand against abuse and exploitation here.

With the expansion of Discord’s brand to include more than just gamers, the platform has opened its doors to predators and abusers and is allowing them to settle comfortably within Discord’s servers. While individual channels within a server can be age-gated with a ‘NSFW’ tag, this is easily circumvented by Discord’s lack of true age verification or moderation. And finding these channels is not a difficult task—over 30,000 servers contain tagged NSFW content.

The real problem is not just the sheer amount of graphic content embedded throughout Discord, but the nature of that content. Pornography trading has become popular on the platform, where users can share links and images of themselves and others. Entire servers are dedicated to finding and sharing non-consensual sexual images of girls and women—sometimes known as revenge porn. Discord made international news in 2020 when one server revealed over 140,000 images of women and even minors had been widely shared and distributed. – ESE

As a company, Discord seemed far more concerned with moderating and banning “Qanon” content than it was with moderating images and videos of child sexual abuse. This says a lot about Discord as a company and about its employees. The article continues:

Discord prides itself as a company and platform that treats others and the community with respect, and keeps itself as ‘a safe and friendly place for everyone.’ However, in addition to Discord’s troubling association with hosting and normalizing exploitative explicit images, Discord has facilitated a space for sexual grooming by abusers or sex traffickers, leaving it anything but a safe and friendly place for minors. These abusers or traffickers know minors are on Discord, and utilize the platform to engage with minor users through mutual servers and direct messaging.

Grooming is more than adults simply talking online with children. Research shows just how dangerous this open contact from predators to vulnerable minors really is—one study found that predators who actually made contact with child victims are more likely to use the Internet to locate potential sexual abuse victims and engage in grooming behavior. Groomers use a variety of tactics to gain the victim’s trust, including using pornography as a tool to manipulate the child into believing the sexual abuse is normal.

Here are just a few examples of children being groomed and abused through Discord:

  • A 12 year old girl was groomed for over two months and manipulated into leaving her California home in the middle of the night by a 40 year old predator.
  • Two teenage boys were found in a sex trafficking ring where the traffickers used Discord to contact the victims.
  • A 22 year old man was arrested when a 12 year old girl he had been talking to on Discord disappeared with him in a car.
  • Several predators groomed a 12 year old boy by sending him explicit messages and even calling him over a period of six weeks, before his mother found out and deleted Discord.

Clearly Discord can be counted among the technological advances propelling predators’ access to minors. This fact is greatly concerning as sexual harassment and assault continues to become more rampant in society. Even more disconcerting is the potential role Discord is playing in the facilitation of child abuse and sex trafficking.

There are no official statistics reporting on the age breakdown of Discord users, but with the company’s rapid growth and the popularity of collaborative games like Fortnite, Minecraft, and Among Us, it’s safe to say millions of teenagers and kids are using Discord regularly. And while some changes have been made recently—including a new ‘Transparency Report’ to be published bi-annually and a ‘Safety Center’—it is clear that Discord is still not willing to prioritize child safety by providing parental controls and enhanced safety defaults for minor accounts. – ESE

There are probably far more instances of this kind of activity that have simply never been reported. Many parents are not as tech-savvy as their children and may not be aware of these apps, let alone how to monitor them.

Another article from the End Sexual Exploitation organization looked at a specific instance of Discord being used to lure a boy into sex trafficking:

A recent devastating news story revealed that the popular gaming app Discord was used as a vehicle to groom and lure a teenage boy into sex trafficking.

A 17-year-old boy was lured from Louisiana to Florida by a group of suspected traffickers, all communicated and arranged through the app. The seven suspects were arrested by police in St. Petersburg, Florida in a human trafficking investigation, where authorities found another missing 16-year-old boy who had been hidden in a trailer there for at least a year. The boy was used as a sex slave, and police are investigating the possibility of other victims.

Discord, popular with gamers, is a widely used text, video, and audio chat app that can be found on computers, browsers, and mobile phones. Designed for video game users to chat with each other, the app has gained popularity even outside of the gamer community. The main feature of Discord is the servers–chat rooms where like-minded individuals can join and play and chat about video games together. Servers can be private, requiring invites and special passwords to join. Private, individual chat functions are also available for groups up to ten people.

The problem with Discord and its servers are the mass amounts of people who can abuse the system for grooming, as happened in the case of the teenage boy in St. Petersburg. – ESE

These apps provide a very easy way for adult predators to find young, lonely children whom they can target, groom and eventually lure into sex trafficking or sexual abuse. They also provide easy ways for pedophile groups to form and to exchange images, videos and information about how to groom and sexually abuse minors.

Even Newsweek decided to report on the issues with Discord “Discord Comes Under Fire for Alleged Moderator Abuse and Furry Corruption”:

Every day, Discord gathers 19 million people in chat rooms who discuss everything from video games to Steven Universe fan fiction. The platform, launched in May 2015, now faces scrutiny over allegations of illegal activity. Forbes reported that the FBI is investigating whether the chat application has been used as a marketplace for stolen items (including online passwords and accounts), hacking tips and even child grooming (befriending a minor online to persuade them into sexual abuse).

In the midst of all this public turmoil, Discord has seen an internal scandal rise: site moderators who fail to enforce the platform’s rules because of personal bias, specifically among moderators and community members who identify as Furries. A #ChangeDiscord movement on social media grew once users learned that some moderators, in violation of their own code of conduct, selectively banned communities that shared sexually suggestive art depicting minors.

Discord has Community Guidelines like all other social media platforms. People who break the rules can get banned by mods, and entire communities can lose their partner status – which includes prompt technical support from Discord and crucial moderation tools.

In late January 2019, a user on the Discord subreddit posted a conversation they had with a platform admin known as TinyFeex. In the emails, TinyFeex argues that ‘cub’ content is not in violation of Discord’s terms of service. ‘Cub’ is a term used in the furry community for underage members, with ‘cub play’ being used to describe sexually explicit acts. Discord users cried foul in the comments, and soon subreddit admins deleted the post and pinned a response from Discord Trust and Safety mod karrdian.

‘There is some overlap between ‘cub’ and loli,’ karrdian wrote. ‘There is also some segment of ‘cub’ art that is not, in fact, human or humanlike at all, but instead, for example, mythological creatures.’

Users began to share their interactions with the Terms of Service team at Discord. Twitter user MrTempestilence posted a Twitter thread February 3 detailing a wide range of accusations, from instances of zoophilia on Discord to moderators allowing ‘cub play.’ Once these tweets started to gain traction, Discord’s ‘gay zoo and feral’ community was shut down soon after. – Newsweek

Once again, there is an overlap between the gay community and pedophilia content. Many of these users were part of the LGBTQ community, and we must ask why there is such a prevalence of pedophilia content among this community. Is it because homosexuality is a form of sexual degeneracy, as is pedophilia, so it isn’t much of a stretch to go from one to the other? Discord began banning people who were reporting the pedophiliac content.

This was a Grooming & Blackmailing group that operated on Discord

Then we get to Reddit, and the lawsuit that was initiated against them back in April of this year, as The Verge reports “Reddit faces lawsuit for failing to remove child sexual abuse material”:

A woman has sued Reddit for allowing an ex-boyfriend to repeatedly post pornographic images of her as a 16-year-old. The lawsuit applies controversial measures instituted in 2018 under FOSTA-SESTA to a site that’s drawn particular criticism for child sexualization. The resulting case will test the limits of platforms’ legal shields amid ongoing efforts to pare back the law behind them.

The woman, identified under the pseudonym Jane Doe, argues that ‘Reddit knowingly benefits from lax enforcement of its content policies, including for child pornography.’ She claims that in 2019, an abusive ex-boyfriend posted sexual photos and videos that he’d taken without her knowledge or consent. But when she alerted Reddit moderators, they could wait ‘several days’ before removing the content, while Reddit administrators allowed the man to keep posting and create a new account when his old account was banned.

The lawsuit argues that Reddit knew its site was a hub for illegal photos and videos, based on news coverage and tips from users themselves, and it should have done more to protect victims. ‘Reddit has itself admitted that it is aware of the presence of child pornography on its website,’ the complaint reads. Among other questionable content, it lists several now-removed subreddits with titles referencing ‘jailbait,’ including an infamous forum that was removed in 2011 after media controversy. (That subreddit did not allow nude images, but it encouraged sexually suggestive ones.) – The Verge

Of course, these platforms always deny they are doing anything wrong, but at this point their word should not be taken at face value; their actions have betrayed these denials, especially in light of the recent scandal surrounding one of Reddit’s employees.

The Rooster reports on this employee and incident at Reddit and why it is so disturbing “Reddit’s most popular subs going private, protesting their hiring/defense of pedophile-connected admin, Aimee (Challenor) Knight“:

If you’ve been on Reddit in the last 24-hours you’ve probably noticed an eruption of outraged posts, decrying the site for having hired someone by the name of Aimee (Challenor) Knight. And you’ve probably also noticed that some of the most popular subs have gone ‘private’ in protest of alleged censorship.

And it’s all connected, believe it or not, to pedophilia.

See, when Knight’s father was arrested in 2016, for the rape and torture of a 10-year-old girl (among 22 other sexual offenses), Knight was a rising star in UK politics. She’d just become the LGBTQ spokesperson for the Green Party. But when they began an investigation into why she’d allowed her father to continue running her campaign, after his conviction, she left outraged. Knight called the Green Party ‘transphobic’ and started working for the Liberal Democrats instead.

However, her tenure there would be short lived as well. In 2019, Knight’s husband Tweeted about having sexual fantasies about children, and they too dropped her like a slimy freak.

Enter Reddit. For whatever reason, the social media platform swooped in to the rescue, bent down and helped lift Knight back up, giving her a job as an admin for the site.

And that’s where shit started to hit the fan: Naturally, when people caught wind that Knight had been hired by Reddit, they were wary and confused. One moderator of r/UKPolitics posted an article from Spectator about her father, David Challenor and the crimes he was convicted of, which also briefly mentioned Aimee Knight, and referenced her in passing with a ‘three letter word.’

That post not only got removed, but, without any explanation beyond ‘doxing,’ the moderator who had posted it was permanently suspended from Reddit by admins.

Outraged, the sub’s other moderators scrambled to figure out what was going on. They soon posted the following message, stating that a censorship campaign was under way and urging their members not to mention Knight lest they get permanently banned as well.

‘As we had no idea what had happened, or why posting this article resulted in a permanent suspension, we took the emergency step of making the subreddit private and immediately contacting the admins for clarification,’ the subreddit statement from r/UKPolitics’ mods reads. ‘We took this step to protect both the users of the subreddit, and ourselves, from further action by the Reddit admin staff. It later became apparent that Reddit has hired this individual as an Reddit admin, and were banning people from discussing her past to protect their employee from harassment.’

It seems, Reddit has not only hired someone unnervingly proximal to known pedophiles, but they’re now actively defending her from public scrutiny. Why?

Because, diversity. They wanted a token LGBTQ employee, and decided that Knight was precisely the person for the job — never mind her father’s past, or her intent to put him in a government position working with children; never mind her husband’s sick sexual fantasies. That’s all beyond reproach, apparently. – The Rooster

I believe The Rooster is incorrect in their conclusion here about Reddit’s behaviors. Knight was NOT a one time incident, this is the norm at these companies. “She” is not the first pedophile they employed and let’s not forget Ghislaine Maxwell’s massive Reddit account. They seem to cater to these people. We at Patriots Soapbox have written two articles about the issues at Reddit, one about Ellen Pao and Ghislaine Maxwell and the other about Knight. I recommend you read them both.

In 2019, Wired UK reported on the problems YouTube was having with pedophile networks hiding in plain sight on the platform, “On YouTube, a network of paedophiles is hiding in plain sight”:

Videos of children showing their exposed buttocks, underwear and genitals are racking up millions of views on YouTube – with the site displaying advertising from major cosmetics and car brands alongside the content.

Comments beneath scores of videos appear to show paedophiles sharing timestamps for parts of the videos where exposed genitals can be seen, or when a child does the splits or lifts up their top to show their nipples. Some of the children in the videos, most of whom are girls, appear to be as young as five. Many of the videos have hundreds of thousands, if not millions of views, with hundreds of comments.

The videos are also being monetised by YouTube, including pre-roll adverts from Alfa Romeo, Fiat, Fortnite, Grammarly, L’Oreal, Maybelline, Metro: Exodus, Peloton and SingleMuslims.com. Banner advertising for Google and the World Business Forum also appeared alongside some of the videos. As well as providing YouTube with our research, we contacted the advertisers to alert them to the issue.

Videos of little girls playing Twister, doing gymnastics, playing in the pool and eating ice lollies are all routinely descended upon by hordes of semi-anonymous commenters, sharing time codes for crotch shots, directing other people to similar videos of children and exchanging phone numbers along with a promise to swap more videos via WhatsApp or Kik. On some videos, confused children who have uploaded videos of them playing in the garden respond to comments asking them how old they are. On one video, a young girl appears to ask another commenter why one of the videos had made him ‘grow’. The video shows the child and her friend doing yoga and is accompanied by pre-roll advertising from L’Oreal. The video has almost two million views.

YouTube says that it’s 99 per cent effective at ensuring that adverts only appear on appropriate content and that it takes every instance of ads showing up where they shouldn’t very seriously.

But with a blank YouTube account, and a couple of quick searches, hundreds of videos that are seemingly popular with paedophiles are surfaced by YouTube’s recommendation system. Worse still, YouTube doesn’t just recommend you watch more videos of children innocently playing, its algorithm specifically suggests videos that are seemingly popular with other paedophiles, most of which have hundreds of thousands of views and dozens of disturbing comments. Many include pre-roll advertising.

In one monetised video with 410,300 views, a prepubescent girl performs a dance routine in a dingy flat, flashing the camera half-way through in a definitely-illegal, distressingly exploitative bare crotch shot that’s shared in the comments with a time stamp. We’ve seen it accompanied by adverts for Fiat and Shen Yun.

Although some prominent YouTube channels have been taken down over child abuse revelations, we were still able to find a number dedicated to ‘pre-teen models’ and groups of young girls bathing, doing stretches and talking through their morning routines. Other YouTube profiles sharing these videos are anonymous, minimally filled out profiles that exist only to share videos of young children. – Wired UK

Again, it must be said that YouTube devotes massive resources, including teams of paid content moderators, to going after right-wing content, and yet this child abuse and grooming material is not only on the platform, it is monetized and racking up views. YouTube has a long history of struggling to moderate child grooming content; there was the whole ElsaGate saga as well.

One article from The Verge claims that YouTube’s algorithm was making it easier for pedophiles to connect with each other on the platform, “YouTube still can’t stop child predators in its comments”:

YouTube is facing a new wave of criticism over the alarming number of predatory comments and videos targeting young children.

The latest concerns started with a Reddit post, submitted to r/Drama, and a YouTube video, exposing a ‘wormhole into a soft-core pedophilia ring on YouTube,’ according to Matt Watson. Watson, a former YouTube creator who returned with a single video and live stream about the topic, demonstrated how a search for something like ‘bikini haul,’ a subgenre of video where women show various bikinis they’ve purchased, can lead to disturbing and exploitative videos of children. The videos aren’t pornographic in nature, but the comment sections are full of people time stamping specific scenes that sexualize the child or children in the video. Comments about how beautiful young girls are also litter the comment section.

Although Watson’s video is gaining mainstream attention, this isn’t the first time that YouTube has dealt with this issue. In 2017, YouTube updated its policies to address an event known as ‘ElsaGate,’ in which disturbing, sexualized kids’ content was being recommended to children. That same year, YouTube decided to close some comment sections on videos with children in an attempt to block predatory behavior from pedophiles. As early as 2013, Google changed its search algorithm to prevent exploitative content from appearing in searches on both Google and YouTube. But despite years of public outcry, YouTube still hasn’t found a way to effectively deal with apparent predators on its platform. 

The heart of the problem is YouTube’s recommendation algorithm, a system that has been widely criticized in the past. It only took two clicks for Watson to venture away from a video of a woman showcasing bikinis she’s purchased to a video of a young girl playing. Although the video is innocent, the comments below — which include timestamps calling out certain angles in the video and predatory responses to the images — certainly aren’t. – The Verge

It is very strange how often right-leaning content is hit with strikes and removed so quickly on YouTube, and yet these child predators are often not caught until so many people complain that the platforms are FORCED to do something about it. Better than anything else, this fact showcases the priorities of the people who staff these companies. Their main goal is ideological political enforcement, not real moderation.

Then, of course, there was this from Facebook, “Facebook asked users if pedophiles should be able to ask kids for ‘sexual pictures’” (as if this would EVER be acceptable):

Facebook is under fire for asking users whether pedophiles should be able to proposition underage girls for sexually explicit photographs on the giant social network.

The survey is the latest in a series of missteps by the Silicon Valley company, which has been criticized for allowing content that exploits children.

From violence on its Live streaming service to hate speech to divisive messages sent by Russian operatives trying to meddle in the U.S. presidential election, toxic content flowing through its platform has heightened scrutiny of Facebook.

Facebook scrapped the survey that posed questions about teens being groomed by older men after it was spotted by media outlets in the United Kingdom. It now says the survey could have been better ‘designed.’

The company routinely uses surveys to get feedback from the social network’s more than 2 billion users. More recently, Facebook has been relying on user surveys to take their pulse on everything from the ‘fake news’ epidemic to whether Facebook makes them happy as people have stopped spending as much time there.

But the two questions in Sunday’s survey shocked and angered Facebook users.

‘In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures,’ Facebook asked.

Sexual contact with minors online, part of a ‘grooming process’ in which adults seek to gain trust and lower inhibition, is often a precursor to sexual abuse.

The possible responses Facebook offered to the question ranged from ‘this content should not be allowed on Facebook, and no one should be able to see it’ to ‘this content should be allowed on Facebook, and I would not mind seeing it.’

Another question asked who should decide whether an adult man can ask for sexual pictures on Facebook, with options ranging from ‘Facebook users decide the rules by voting and tell Facebook’ to ‘Facebook decides the rules on its own.’ – USA Today

Leave it up to the “journalists” at USA Today to try to politicize this topic and kvetch about the Russian collusion hoax and fake news, of which USA Today is a practitioner.

Facebook is notorious for being a spin off of the DARPA project called LifeLog, which was created to monitor a person’s entire existence. According to investigative journalist Whitney Webb, Facebook is staffed by at least 28 “former” members of Israel’s infamous Unit 8200. These same people in Silicon Valley were connected to Jeffrey Epstein and Ghislaine Maxwell’s sisters Christine and Isabel, so this should come as no surprise.

An apologist for a Discord Groomer who got banned

These predators try to paint the child grooming pedophiles as “victims” and try to claim pedophilia is a “mental disorder,” which is an attempt to absolve these people of guilt and responsibility for their criminal behavior.

Discord protecting sexual perverts, punishing those who report them

I think that we have seen a pattern of behavior with all of these platforms. Many of these platforms were created as intelligence cutouts, to bypass laws on domestic spying by using a so-called “private company” in order to not have to provide transparency or accountability. We know the intelligence community often uses sexual blackmail to compromise people and control them. This is why I think they protect this material on their platforms, because the intelligence community stands to benefit from it.

There is also crossover between these companies and their staffs. One member of the Discord “trust and safety” team also had a history on Reddit as a moderator.

Pedophiles use the LGBTQ community as cover

Pedophiles are also hiding underneath the banner of LGBTQ, which now basically includes pedophiles and they use their special protected status to harm and groom innocent children.

I think people need to understand that liberalism is the cause for most of this. Being “tolerant” towards sexual predators and “tolerating” evil is a problem. This is how a society is slowly destroyed from within. The ideas of “sexual liberation” extrapolate to the point of normalizing pedophilia, and what exactly does “sexual liberalism” mean? A liberal “tolerant” view of sexual deviants?

Finally, I would like to give a shout-out to the YouTube channel Pax Tube, whose recent video led me back to this horrid rabbit hole once again:

I highly recommend you watch the above video explaining why this issue is so prevalent on these platforms.
