Disney Employees In Florida Arrested For Human Trafficking + Videos Appear Online Showing Top Disney Executives’ Desire To Sexualize Children

By Brian Shilhavy, MedicalKidnap.com

For all of you parents and grandparents out there who still believe that the Walt Disney Company produces "family-friendly" entertainment safe for young children, you need to pay attention to what is going on in Florida right now.

First, four Disney employees in Florida were recently arrested as part of a human trafficking sting operation, one of them a 27-year-old lifeguard who reportedly sent sexual images of himself and graphic sexual messages to an undercover detective posing online as a 14-year-old girl.

NBC 6 in Miami reported:

Four Disney employees were among more than 100 people arrested as part of a human trafficking operation in Florida, authorities said.

Polk County Sheriff Grady Judd on Wednesday announced the arrest of 108 people as part of “Operation March Sadness 2,” a six-day undercover operation.

One of the Disney employees was a 27-year-old man who worked as a lifeguard at Disney’s Polynesian Village Resort, Judd said.

That man allegedly sent sexual images of himself and graphic sexual messages to an undercover detective who was posing online as a 14-year-old girl, Judd said.

Another Disney employee arrested was a 24-year-old man who worked at the Cosmic Restaurant. Other Disney employees arrested were a 45-year-old IT worker and a 27-year-old software developer, Judd said.

“Four arrests of this magnitude in a week is simply remarkable,” Judd said at a news conference. (Full article).

Then, the Walt Disney Company publicly criticized a new law that was recently passed in Florida that is supposed to protect children from sexual predators.

Tucker Carlson recently interviewed Florida Governor Ron DeSantis to discuss why Disney opposed this new law he had just signed.



Disney is infested with pedophiles:

  1. 35 Disney Employees Arrested On Child Sex Charges In Less Than 10 Years.
  2. Child Star Bella Thorne Confessed She Was Raped at Disney From Ages 6 to 14.
  3. Child Star Reveals How Naming His Abuser Got Him Silenced As His CONVICTED Rapist Hired At Disney.
  4. Vice President of Disney Convicted of Child Rape – Gets Only 6 Years.
  5. Weinstein Scandal Exposes Disney for Giving Convicted Pedophile Access to Kids as Film Director.
  6. Reversing Disney’s Black Magic Sex Spells.

And then yesterday, a Twitter user named Christopher Rufo published three videos of three Disney executives from an internal meeting in which they discuss their transgender and LGBTQIA agenda for children. (One of these was included in the Tucker Carlson interview.)

This is a good time to remind everyone that Walt Disney himself was a 33rd degree Freemason and that occult symbols have been used in Disney entertainment since the beginning.

Australian Altiyan Childs, in his excellent presentation on Freemasonry and Satanism, has a section on Walt Disney and his company that we have excerpted into a minute-and-a-half video clip on our Bitchute channel.

And let’s not stop with just Walt Disney. This nation, the United States of America, was founded by many Freemasons such as George Washington, and while the Constitution they wrote and ratified was supposed to protect civil rights for all Americans, most of these founders were themselves owners of African slaves and involved in human trafficking.

One doesn’t have to look very far to see the influence of Freemasonry and its occult symbols on the founding of our nation. Just pull a U.S. one-dollar bill out of your wallet, and you will see many occult symbols.

Here is another video clip from Altiyan Childs’ presentation on Freemasonry and Satanism, also available on our Bitchute channel.

The Kids Online Safety Act Is A Heavy-Handed Plan To Force Platforms To Spy On Children

Putting children under surveillance and limiting their access to information doesn’t make them safer — in fact, research suggests just the opposite. Unfortunately, those tactics are the ones endorsed by the Kids Online Safety Act of 2022 (KOSA), introduced by Sens. Blumenthal and Blackburn. The bill deserves credit for attempting to improve online data privacy for young people and to update 1998’s Children’s Online Privacy Protection Act (COPPA). But its plan to require surveillance and censorship of anyone under sixteen would greatly endanger the rights, and safety, of young people online.

KOSA would require the following:

  • A new legal duty for platforms to prevent certain harms: KOSA outlines a wide collection of content that platforms could be sued over if young people encounter it, including “promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor.”
  • A requirement that platforms provide data to researchers
  • An elaborate age-verification system, likely run by a third-party provider
  • Parental controls, turned on and set to their highest settings, to block or filter a wide array of content

There are numerous concerns with this plan. The parental controls would in effect require a vast number of online platforms to create systems for parents to spy on — and control — the conversations young people are able to have online, and require those systems be turned on by default. It would also likely result in further tracking of all users.

To avoid liability for causing the listed harms, nearly every online platform would hide or remove huge swaths of content. And because each of the listed areas of concern involves significant gray areas, platforms would over-censor in an attempt to steer clear of the new liability risks.

These requirements would be applied far more broadly than the law KOSA hopes to update, COPPA. Whereas COPPA applies to anyone under thirteen, KOSA would apply to anyone under sixteen — an age group that child rights organizations agree has a greater need for privacy and independence than younger teens and kids. And in contrast to COPPA’s age self-verification scheme, KOSA would authorize a federal study of “the most technologically feasible options for developing systems to verify age at the device or operating system level.”

Age verification systems are troubling — requiring such systems could hand over significant power, and private data, to third-party identity verification companies like Clear or ID.me. Additionally, such a system would likely lead platforms to set up elaborate age-verification systems for everyone, meaning that all users would have to submit personal data. 

Lastly, KOSA’s incredibly broad definition of a covered platform would include any “commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.”

That would likely encompass everything from Apple’s iMessage and Signal to web browsers, email applications and VPN software, as well as platforms like Facebook and TikTok — platforms with wildly different user bases and uses.

It’s also unclear how deep into the ‘tech stack’ such a requirement would reach – web hosts or domain registries likely aren’t the intended platforms for KOSA, but depending on interpretation, could be subject to its requirements.

The bill also raises concerns about how providers of end-to-end encrypted messaging platforms like iMessage, Signal, and WhatsApp would interpret their duty to monitor minors’ communications, with the potential that companies will simply compromise encryption to avoid litigation.

Censorship Isn’t The Answer

KOSA would force sites to use filters to block content — filters that we’ve seen, time and time again, fail to properly distinguish “good” speech from “bad” speech. The types of content targeted by KOSA are complex, and often dangerous — but discussing them is not bad by default.

It’s very hard to differentiate between minors discussing these topics in a way that encourages them and minors discussing them in a way that discourages them. Under this bill, platforms would have to block all discussion and viewing of these topics by minors.

Research already exists showing bans like these don’t work: when Tumblr banned discussions of anorexia, it discovered that the keywords used in pro-anorexia content were the same ones used to discourage anorexia. Other research has shown that bans like these actually make the content easier to find by forcing people to create new keywords to discuss it (for example, “thinspiration” became “thynsperation”). 
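
To make concrete why keyword blocklists of this kind are so easy to evade, here is a minimal Python sketch. The keyword list and sample posts are invented for illustration and do not reflect Tumblr’s or any other platform’s actual filter.

```python
# Hypothetical sketch: a naive keyword blocklist of the kind a platform
# might deploy to satisfy a content-blocking mandate. The list and the
# sample posts are invented; no real platform's filter is shown here.
BLOCKED_KEYWORDS = {"thinspiration", "pro-ana"}

def is_blocked(post: str) -> bool:
    """Return True if the post contains any blocked keyword verbatim."""
    text = post.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

posts = [
    "thinspiration tips",        # caught: exact keyword match
    "thynsperation tips",        # missed: a one-letter variant slips through
    "recovering from anorexia",  # recovery content is at risk of being swept
                                 # up if the list is broadened to chase variants
]

for post in posts:
    print(f"{post!r} -> blocked={is_blocked(post)}")
```

The one-letter variant sails past the filter, while any attempt to broaden the list risks blocking the recovery-oriented discussions that use the same vocabulary, which is exactly what the research above found.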

The law also requires platforms to ban the potentially infinite category of “other matters that pose a risk to physical and mental health of a minor.” As we’ve seen in the past, whenever the legality of material is up for interpretation, it is far more likely to be banned outright, leaving huge holes in what information is accessible online. The law would seriously endanger teenagers’ access to information, as they may want to explore ideas without their parents’ knowledge or approval.

For example, they might have questions about sexual health that they do not feel safe asking their parents about, or they may want to help a friend with an eating disorder or a substance abuse problem. (Research has shown that a large majority of young people have used the internet for health-related research.)

KOSA would allow individual state attorneys general to bring actions against platforms when the state’s residents are “threatened or adversely affected by the engagement of any person in a practice that violates this Act.” This leaves it up to each state’s attorney general to decide what topics pose a risk to the physical and mental health of a minor. A co-author of this bill, Sen. Blackburn of Tennessee, has referred to education about race discrimination as “dangerous for kids.” Many states have agreed, and have recently moved to limit public education about the history of race, gender, and sexuality discrimination.

Recently, Texas’ governor directed the state’s Department of Family and Protective Services to investigate gender-affirming care as child abuse. KOSA would empower the Texas attorney general to define material that is harmful to children, and under the state’s current position that definition would include resources for trans youth. This would allow the state to force online services to remove and block access to that material everywhere — not only in Texas. That’s not to mention the frequent conflation by tech platforms of LGBTQ content with dangerous “sexually explicit” material. KOSA could result in loss of access to information that a vast majority of people would agree is not dangerous, but that is under political attack.

Surveillance Isn’t The Answer

Some legitimate concerns are driving KOSA. Data collection is a scourge for every internet user, regardless of age. Invasive tracking of young people by online platforms is particularly pernicious — EFF has long pushed back against remote proctoring, for example. 

But the answer to our lack of privacy isn’t more tracking. Despite the growing ubiquity of technology that makes it easy, surveillance of young people is actually bad for them, even in the healthiest household, and it is not a solution to helping young people navigate the internet. Parents have an interest in deciding what their children can view online, but no one could argue that this interest is the same whether a child is five or fifteen.

KOSA would put all children under sixteen in the same group, requiring that specific types of content be hidden from them and that other content be tracked and logged by parental tools. This would force platforms to more closely watch what all users do.

KOSA’s parental controls would give parents, by default, access to monitor and control a young person’s online use. While a tool like Apple’s Screen Time allows parents to restrict access to certain apps, or limit their usage to certain times, platforms would need to do much more under KOSA.

They would have to offer parents the ability to modify the results of any algorithmic recommendation system, “including the right to opt-out or down-rank types or categories of recommendations,” effectively deciding for young people what they see – or don’t see – online. It would also give parents the ability to delete their child’s account entirely if they’re unhappy with their use of the platform. 
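
To illustrate what “down-ranking types or categories of recommendations” could look like in practice, here is a minimal sketch. The category names, scores, and weighting factor are invented assumptions; KOSA does not specify an implementation, and real recommendation systems are far more complex.

```python
# Minimal sketch of down-ranking categories of recommendations in a feed.
# Categories, scores, and the weight value are invented for illustration;
# the bill does not define how platforms would implement this.
from dataclasses import dataclass

@dataclass
class Recommendation:
    title: str
    category: str
    score: float  # relevance score assigned by the platform's ranking model

def apply_parental_downrank(items, downranked_categories, weight=0.2):
    """Scale the score of items in parent-selected categories by `weight`
    (dropping them entirely when weight is 0), then re-sort the feed."""
    adjusted = []
    for item in items:
        if item.category in downranked_categories:
            if weight == 0:
                continue  # treated as a full opt-out: remove the item
            item = Recommendation(item.title, item.category, item.score * weight)
        adjusted.append(item)
    return sorted(adjusted, key=lambda r: r.score, reverse=True)

feed = [
    Recommendation("Study playlist", "music", 0.9),
    Recommendation("Fitness challenge", "health", 0.8),
    Recommendation("News explainer", "news", 0.7),
]

# A parent down-ranks the "health" category; the teen is never notified.
for rec in apply_parental_downrank(feed, {"health"}):
    print(rec.title, rec.category, round(rec.score, 2))
```

Even in this toy version, the re-ranking happens entirely outside the view of the young person whose feed is being reshaped, which is the dynamic described above.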

The bill tackles algorithmic systems by requiring that platforms provide “an overview of how algorithmic recommendation systems are used … to provide information to users of the platform who are minors, including how such systems use personal data belonging to minors.” Transparency about how a platform’s algorithms work, and tools to allow users to open up and create their own feeds, are critical for wider understanding of algorithmic curation, the kind of content it can incentivize, and the consequences it can have.

EFF has also supported giving users more control over the content they see online. But KOSA requires that parents be able to opt out of or down-rank types or categories of recommendations, without the consent or knowledge of the user, including teenage users.

Lastly, under KOSA, platforms would be required to prevent patterns of use that indicate addiction, and to offer parents the ability to limit features that “increase, sustain, or extend use of the covered platform by a minor, such as automatic playing of media, rewards for time spent on the platform, and notifications.” While minimizing dark patterns that can trick users into giving up personal information is a laudable goal, determining what features “cause addiction” is highly fraught.

If a sixteen-year-old spends three hours a day on Discord working through schoolwork or discussing music with their friends, would that qualify as “addictive” behavior? KOSA would likely cover features as different as Netflix’s auto-playing of episodes and iMessage’s new message notifications. Putting these features together under the heading of “addictive” misunderstands which dark patterns actually harm users, including young people.
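
As a purely hypothetical sketch of the kind of crude usage-threshold heuristic a platform might reach for under such a mandate (the cutoff and the sample sessions are invented; the bill defines none of this), the snippet below shows how easily it lumps schoolwork in with the behavior it is meant to flag.

```python
# Hypothetical sketch: a time-based "addictive use" heuristic. The cutoff
# and sample sessions are invented; KOSA does not define addiction-like use.
DAILY_LIMIT_MINUTES = 120  # arbitrary cutoff chosen for illustration

def flags_as_addictive(daily_minutes: int) -> bool:
    """Flag any account whose daily use exceeds the cutoff."""
    return daily_minutes > DAILY_LIMIT_MINUTES

sessions = {
    "homework help on Discord": 180,    # flagged, even though it is schoolwork
    "doomscrolling short videos": 180,  # flagged
    "brief daily check-in": 20,         # not flagged
}

for activity, minutes in sessions.items():
    print(activity, "->", "flagged" if flags_as_addictive(minutes) else "ok")
```

A raw time threshold cannot tell the two three-hour sessions apart, which is the core problem with treating time on a platform as a proxy for harm.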

EFF has long supported comprehensive data privacy legislation for all users. But the Kids Online Safety Act would not protect the privacy of children or adults. It is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is “not in their best interest,” as defined by the government, and interpreted by tech platforms. 

Source: EFF.org