Family Digital Wellness
Our society and economy are now grounded in the digital world, and the COVID-19 pandemic has only accelerated this trend. A March 2022 survey released by Common Sense Media found that children and teens are spending more time than ever before on digital devices: children ages 8 to 12 spend an average of five hours and 33 minutes on screens per day, and teens ages 13 to 18 spend eight and a half hours.
While there is an emphasis on protecting children online from predators and preserving kids’ mental and physical health, a piece of the puzzle is missing. Now that we are immersed in the digital world, it is imperative to equip parents and families to recognize warning signs of digital threats and to create healthy relationships and interactions with digital technologies in order to prevent abuse and future harm. This is why PFSA has developed the Family Digital Wellness initiative.
Family Digital Wellness: An inclusive, supportive, and preventative approach aimed at strengthening families as they raise healthy children in a digital era.
Did You Know?
- 69% of U.S. children have their own smartphone by age 12.
- 70% of kids encounter sexual or violent content online while doing homework research.
- 1 in 5 youth ages 10-17 have received a sexual solicitation or approach online.
- 40% of children in grades 4-8 say they have chatted with a stranger online.
- 20% of teens have sent or posted nude or semi-nude photos or videos.
- About 7 in 10 parents think smartphones could bring more harm than good to children.
- 66% of U.S. parents say parenting is harder today than it was 20 years ago, with many in this group citing technology as a reason why.
Introducing the Family Digital Wellness Parent Toolkit!
Perhaps you want to be proactive in protecting your family against digital dangers that threaten children and families. Or maybe you have witnessed others struggling with these issues, or you and your family have experienced struggles of your own. Whatever your reason may be, it is important to know that you are not alone, and resources do exist.
PFSA has developed a comprehensive toolkit for parents who are ready to learn more about Family Digital Wellness, what it means, and how it can be used to increase safety and create healthy interactions with digital technologies. This toolkit is built on the foundation of PFSA’s Digital Diligence Framework, which encourages parents to follow five steps in their journey towards digital wellness.
Download our FREE Parent Toolkit today to learn more and to apply easy-to-implement solutions for your family!
Coaching Guide: To learn about our accompanying Family Digital Wellness Coaching Guide for professionals, email us at email@example.com.
PFSA’s Family Digital Wellness Resources
Foundations of Family Digital Wellness
Digital wellness is built upon skills and practices that encourage users to protect themselves and their families from digital dangers through simple, proactive actions. These skills and practices also shift the way technology is used, supporting safe and healthy interactions online and on digital devices. Together, they are the Foundations of Family Digital Wellness.
PFSA’s Family Digital Wellness Overview
In the 1980s and ’90s, the “Just Say No” campaign aimed to discourage children from engaging in illegal recreational drug use by offering various ways of saying no. Today, we must focus on how to discourage our children from engaging in the risky and dangerous behaviors of the digital era, which stem from increased use of and dependence on digital technologies. Check out this resource to learn more about PFSA’s Family Digital Wellness initiative and goals.
Common Digital Dangers Booklet
While many of us are aware that digital dangers exist and that the threats posed by digital technologies are growing, most parents and caregivers are not aware of all the common risks children face in digital environments. As we tackle the challenge of protecting children and preventing future harm in digital environments, understanding the most common digital threats equips parents to raise safe and healthy children. Check out this resource to learn more about common digital dangers, their warning signs, and what you can do.
The Digital Era Family Profile
We are truly living in an unprecedented time. The current generation of parents/caregivers is the first to raise children with a presence in both physical and digital worlds. Dependency on digital technologies has rapidly increased in our society, quickly becoming part of our everyday lives. Everything, including work, school, socializing, and entertainment, has turned digital for adults and children alike. Check out this resource to learn more about today’s digital era family.
Tips for Parents
Family Digital Wellness requires intention and many ongoing actions that together create a comprehensive approach to raising safe and healthy kids in the digital era. But taking the first small step to safeguard against digital threats is critical for parents and families as they begin the journey. Check out this resource to learn more about tips parents can implement right away with their families.
Preventing Digital Threats
While digital technologies change and new threats emerge rapidly, the most effective strategy for preventing future harm is practicing positive and healthy digital behaviors. With any risk to our children, equipping them with knowledge and guidance lays a foundation for positive outcomes. Check out this resource to learn more about what to avoid and how to take steps towards preventing digital threats for you and your family.
Other Related Resources
Digital for Good Book – In Digital for Good, EdTech expert Richard Culatta argues that technology can be a powerful tool for learning, solving humanity’s toughest problems, and bringing us closer together. He offers a refreshingly positive framework for preparing kids to be successful in a digital world—one that encourages them to use technology proactively and productively—by outlining five qualities every young person should develop in order to become a thriving, contributing digital citizen. www.amazon.com
Childhood 2.0 Documentary – Childhood 2.0 is a must-view for anyone who wants to better understand the world their children are navigating as they grow up in the digital age. Featuring actual parents and kids as well as industry-leading experts in child safety and development, this documentary dives into the real-life issues facing kids today — including cyberbullying, online predators, suicidal ideation, and more. www.childhood2movie.com
The Social Dilemma Documentary – In The Social Dilemma, tech experts from Silicon Valley sound the alarm on the dangerous impact of social networking, which Big Tech uses to manipulate and influence users. www.netflix.com
NetSmartz – NetSmartz is an online safety education program. It provides age-appropriate videos and activities that help teach children to be safer online, with the goal of making children more aware of potential online risks and empowering them to prevent victimization by making safer choices on- and offline. www.missingkids.org/netsmartz/home
Parental Control Resources
Bark – Bark monitors texts, email, YouTube, and 30+ apps and social media platforms for signs of issues like cyberbullying, sexual content, online predators, depression, suicidal ideation, threats of violence, and more. www.bark.us
Gabb Wireless – Gabb Wireless provides a great first phone for your child(ren). No games, social media, or internet. They also have an interactive watch that works as an alternative to an actual phone. www.gabbwireless.com
The Protect App – The Protect app has hundreds of bite-sized lessons and content to make it easy for busy parents to get the quick tips they need. The app also includes 20 videos produced with teens and young adults. Parents and kids watch these videos together. www.protectyoungeyes.com
Common Sense Media – Since 2003, Common Sense has been the leading source of entertainment and technology recommendations for families and schools. Every day, millions of parents and educators trust Common Sense reviews and advice to help them navigate the digital world with their kids. www.commonsensemedia.org
The Digital Wellness Lab – The Digital Wellness Lab brings together global thought leaders from tech, content creation, and the health sciences to investigate, translate, innovate, and intervene in order to build a digital environment that advances the well-being of families, society, and humanity at large. www.digitalwellnesslab.org
Thorn – Thorn builds technology to defend children from sexual abuse and houses the first engineering and data science team focused solely on developing new technologies to combat online child sexual abuse. www.thorn.org
National Center for Missing and Exploited Children (NCMEC) – NCMEC is the nation’s nonprofit clearinghouse and comprehensive reporting center for all issues related to the prevention of and recovery from child victimization. www.missingkids.org
News & Media Stories – Stay Up to Date!
The New York Times: How Your Child’s Online Mistake Can Ruin Your Digital Life: When Jennifer Watkins got a message from YouTube saying her channel was being shut down, she wasn’t initially worried. She didn’t use YouTube, after all. Her 7-year-old twin sons, though, used a Samsung tablet logged into her Google account to watch content for children and to make YouTube videos of themselves doing silly dances. Few of the videos had more than five views. But the video that got Ms. Watkins in trouble, which one son made, was different. “Apparently it was a video of his bottom,” said Ms. Watkins, who has never seen it. “He’d been dared by a classmate to do a nudie video.” Google-owned YouTube has A.I.-powered systems that review the hundreds of hours of video that are uploaded to the service every minute. The scanning process can sometimes go awry and tar innocent individuals as child abusers.
The Street: Instagram is still plagued by a disturbing issue that Meta says it’s making headway on solving: Social media algorithms and the way they work is one of the most ardently kept secrets in Silicon Valley. The algorithms are the engines that drive the user experience, and similar to artificial intelligence, the way they operate is reliant on the data that is fed into them. Over the weekend, an unsealed complaint in a lawsuit filed against Meta Platforms by 33 states alleges that despite the company publicly stating that Instagram is only for users 13 and older, the company is not only allowing kids under the age of 13 to use the platform, but that the company has also “coveted and pursued” that demographic for years.
New York Times: At Meta, Millions of Underage Users Were an ‘Open Secret,’ States Say: Meta has received more than 1.1 million reports of users under the age of 13 on its Instagram platform since early 2019, yet it “disabled only a fraction” of those accounts, according to a newly unsealed legal complaint against the company brought by the attorneys general of 33 states. Instead, the social media giant “routinely continued to collect” children’s personal information, like their locations and email addresses, without parental permission, in violation of a federal children’s privacy law, according to the court filing. Meta could face hundreds of millions of dollars, or more, in civil penalties should the states prove the allegations.
The Hill: Three bipartisan things we can do now to save kids from social media’s harms: According to internal surveys, approximately 22 percent of Instagram users ages 13 to 15 reported being victims of bullying, 24 percent were subjected to unwanted advances, and 39 percent experienced negative comparison. Béjar stated, “We cannot trust [Meta] with our children and it’s time for Congress to act.” These new revelations, building on the disclosures by Meta whistleblower Frances Haugen, follow Surgeon General Dr. Vivek Murthy’s advisory this May, warning that social media poses a significant threat to the psychological health and well-being of kids and teens. Dr. Murthy implored legislators, tech firms and parents to take immediate action. This has created a perfect storm, igniting a bipartisan drive for new regulations. In fact, in these divided times, this is one of the few topics that voters and leaders on both sides of the aisle can get behind.
CNBC: X, Snap and Discord CEOs subpoenaed by lawmakers to testify about child sexual exploitation: Lawmakers said Monday that they have issued subpoenas to the CEOs of X, Snap and Discord to compel the executives to testify on a hearing regarding online child sexual exploitation. Sens. Dick Durbin, D-Ill., and Lindsey Graham, R-S.C., said they issued the subpoenas to the executives after “repeated refusals to appear during several weeks of negotiations.” “Since the beginning of this Congress, our Committee has rallied around a key bipartisan issue: protecting children from the dangers of the online world,” the senators wrote in a joint statement. “It’s at the top of every parent’s mind, and Big Tech’s failure to police itself at the expense of our kids cannot go unanswered.”
The Hill: Congress needs to protect kids, not Big Tech: Social media is a threat to our children. A bipartisan Congress is now stepping up to make Big Tech products safe for our children; but the social media companies are putting their incredible lobbying power behind efforts to break the momentum. Social media not only impacts teens’ mental health but their physical health, as well. A recent study in the Journal of Family Medicine and Primary Care found kids experience “anxiety, respiratory alterations, trembling, perspiration, agitation, disorientation and tachycardia” when they are not near their phones, their portals to social media and the internet.
Axios: Scoop: Biden’s team weighs joining TikTok to court young voters: President Biden’s re-election campaign privately has been weighing whether to join the social media platform TikTok to try to reach more young voters, according to two people familiar with the conversations.
The Telegraph: Why I don’t post pictures of my child on social media – and never will: I don’t post pictures of my child on social media – and never will. Those forms schools and clubs get you to fill out, asking whether you “consent to images being shared” on their website? I always tick “no”. Like every columnist, I will occasionally use something she has said or done to illustrate a wider point in print, but I would never offer up any of her personal feelings or private challenges for public consumption. I thought I’d done everything I could to shield her from prying eyes, but when I started researching my new book, The Square, I realised that no matter how careful you’ve been – and whether your parents happen to be in the public eye or not – there will always be an alarming quantity of details about you online. Enough crumbs of information about your life for anyone to piece together, should they choose to. According to the Information Commissioner’s Office, when it comes to identity theft, just your name, address and birth date is enough to create another “you”. So if someone is collecting information about you from the internet, you’d better hope they have a positive agenda.
The New York Times: If Your Child Is Addicted to TikTok, This May Be the Cure: Over the past few years, hundreds of families and school districts around the country have sued big tech companies on the grounds that the hypnotic properties of social media popular with children have left too many of them unwell…Tech companies, claiming First Amendment protections, have sought to get these sorts of suits quickly dismissed. But on Tuesday, a federal judge in California issued a ruling to make that more difficult. In it, she argued that what concerned plaintiffs most — ineffective parental controls, the challenges of deleting accounts, poor age verification and the timing and clustering of notifications to ramp up habitual use — was not the equivalent of speech, so the suits under her review should be allowed to proceed.
The Washington Post: Meta says vetting teens’ ages should fall on app stores, parents: Meta is pushing for rival tech giants such as Google and Apple to play a bigger role in keeping teens off potentially harmful sites, calling for the first time for legislation to require app stores to get parental approval when users age 13 to 15 download apps. The proposal, which the Facebook and Instagram parent company is set to announce Wednesday, counters mounting calls by state and federal policymakers for individual sites to proactively screen kids to limit their use of social media platforms over safety concerns.
Reuters: Senators demand documents from Meta on social media harm to children: A bipartisan group of U.S. Senators has written to Meta Platforms CEO Mark Zuckerberg demanding documents about its research into the harm to children from its social media platforms. A whistleblower’s release of documents in 2021 showed Meta knew Instagram, which began as a photo-sharing app, was addictive and worsened body image issues for some teen girls. “Members of Congress have repeatedly asked Meta for information on its awareness of threats to young people on its platforms and the measures that it has taken, only to be stonewalled and provided non-responsive or misleading information,” the senators wrote in a letter.
Ars Technica: Judge tosses social platforms’ Section 230 blanket defense in child safety case: This week, some of the biggest tech companies found out that Section 230 immunity doesn’t shield them from some of the biggest complaints alleging that social media platform designs are defective and harming children and teen users. On Tuesday, US district judge Yvonne Gonzalez Rogers ruled that discovery can proceed in a lawsuit documenting individual cases involving hundreds of children and teens allegedly harmed by social media use across 30 states. Their complaint alleged that tech companies were guilty of negligently operating platforms with many design defects—including lack of parental controls, insufficient age verification, complicated account deletion processes, appearance-altering filters, and requirements forcing users to log in to report child sexual abuse materials (CSAM)—and failed to warn young users and their parents about those defects.
Reuters: Social media companies must face youth addiction lawsuits, US judge rules: U.S. District Judge Yvonne Gonzalez Rogers in Oakland, California, ruled against Alphabet, which operates Google and YouTube; Meta Platforms, which operates Facebook and Instagram; ByteDance, which operates TikTok; and Snap, which operates Snapchat. The decision covers hundreds of lawsuits filed on behalf of individual children who allegedly suffered negative physical, mental and emotional health effects from social media use including anxiety, depression, and occasionally suicide. The litigation seeks, among other remedies, damages and a halt to the defendants’ alleged wrongful practices.
The Verge: Social media giants must face child safety lawsuits, judge rules: School districts across the US have filed suit against Meta, ByteDance, Alphabet, and Snap, alleging the companies cause physical and emotional harm to children. Meanwhile, 42 states sued Meta last month over claims Facebook and Instagram “profoundly altered the psychological and social realities of a generation of young Americans.” This order addresses the individual suits and “over 140 actions” taken against the companies. Tuesday’s ruling states that the First Amendment and Section 230, which says online platforms shouldn’t be treated as the publishers of third-party content, don’t shield Facebook, Instagram, YouTube, TikTok, and Snapchat from all liability in this case. Judge Gonzalez Rogers notes many of the claims laid out by the plaintiffs don’t “constitute free speech or expression,” as they have to do with alleged “defects” on the platforms themselves. That includes having insufficient parental controls, no “robust” age verification systems, and a difficult account deletion process.
The Conversation: TV can be educational but social media likely harms mental health: what 70 years of research tells us about children and screens: Ask any parent and it’s likely they’ll tell you they’re worried about their kids’ screen time. A 2021 poll found it was Australian parents’ number one health concern for their kids – ahead of cyberbullying and unhealthy diets. But how worried should parents be? The information that’s out there can be confusing. Some psychologists have compared it to smoking (amid concerns about “secondhand screen time”), while others are telling us not to worry too much about kids and screens. Academics are also confused. As The Lancet noted in 2019, researchers’ understanding of the benefits, risks and harms of the digital landscape is “sorely lacking”. In our new research, we wanted to give parents, policymakers and researchers a comprehensive summary of the best evidence on the influence of screens on children’s physical and psychological health, education and development.
WBRE: The effects social media can have on teens: According to the U.S. Department of Health and Human Services, 95% of kids between 13 and 17 years of age use social media. One in three reports using the online platforms almost constantly. With the uptick in social media usage among teens, there has also been an increase in teen mental health issues like anxiety and depression, but there are ways parents can safeguard their kids’ online interactions and prevent the pitfalls related to online usage.
NBC News: Omegle, the anonymous video chat site, shuts down after 14 years: Launched in 2009, the website initially gained traction with teens but remained a relatively fringe video-chatting platform, though clips of funny or strange interactions and pairings sometimes spread across the internet. Its cultural resonance ebbed and flowed, with a new burst of popularity on TikTok and YouTube in 2020. Not long after its launch, Omegle gained a reputation as a platform that struggled to stop child sexual abuse. Omegle has been named in numerous Department of Justice publications announcing the sentencing of people convicted of sex crimes. The website was sued in 2021 for allegedly having a “defectively designed product” and enabling sex trafficking after the service matched a girl, then 11, with a man who later sexually abused her.
The New York Times: Opinion: It’s Not Kids With the Cellphone Problem, It’s Parents: It’s not the school’s job to police kids’ phone habits, something parents are acutely aware isn’t easy. And that gets to the thorny crux of the issue: Parents are often the problem. When one group of parents in my district confronted the administration about its lax policy toward cellphones, the principal said whenever he raised the issue, parents were the ones who complained. How would they reach their children?! But if we expect our kids to comply with no-phones policies, we’ve got to get over the deprivation. Our own parents would just call the front office — in an emergency. Not because they wanted to make sure we remembered to walk the dog. And really, if we’re trying to teach kids to be safe, responsible and independent, shouldn’t we give them the leeway to do so? Phones don’t teach kids these values; parents do.
NPR: Meta failed to address harm to teens, whistleblower testifies as Senators vow action: Former Meta engineer Arturo Bejar was testifying in front of a Senate Judiciary subcommittee hearing centered on how algorithms for Facebook and Instagram (both owned by parent company Meta) push content to teens that promotes bullying, drug abuse, eating disorders and self-harm. Bejar’s job at the company was to protect the social media site’s users. He said that when he raised the flag about teen harm to Meta’s top executives, they failed to act. Bejar is the latest Facebook whistleblower to supply Congress with internal documents that show Meta knows kids are being harmed by its products. His testimony comes after The Wall Street Journal reported on his claims last week. Lawmakers have now heard testimony from dozens of kids, parents and even company executives on the topic. And it seems to have reached a boiling point.
The Washington Post: Former Facebook staffer to speak out in hopes of jolting Congress: When whistleblower Frances Haugen warned Congress at an October 2021 hearing that Facebook and Instagram were exposing children to harm, lawmakers expressed hope that the testimony would hasten their efforts to pass fresh protections for kids online. That same day, another Facebook worker, Arturo Béjar, was privately sounding the alarm that the company was not taking the safety of its users, particularly teens, seriously enough. Béjar, a former Facebook engineering director and consultant, will now get his own shot to move the needle on Capitol Hill, where he is poised to urge lawmakers in public testimony to seize what he called an “urgent” moment for children’s safety. Béjar, who first spoke out in a Wall Street Journal report last week, had warned Facebook chief Mark Zuckerberg and his top lieutenants on the day Haugen first testified about a “critical gap in how we as a company approach harm,” according to documents we reviewed. Béjar cited internal survey data suggesting the company was underestimating how frequently users under 16 faced bullying, unwanted sexual advances or other negative encounters.
Reuters: Former Meta employee tells Senate company failed to protect teens’ safety: A former Meta employee testified before a U.S. Senate subcommittee on Tuesday, alleging that the Facebook and Instagram parent company was aware of harassment and other harms facing teens on its platforms but failed to address them. The employee, Arturo Bejar, worked on well-being for Instagram from 2019 to 2021 and earlier was a director of engineering for Facebook’s Protect and Care team from 2009 to 2015, he said. Bejar testified before the Senate Judiciary Subcommittee on Privacy, Technology and the Law at a hearing about social media and its impact on teen mental health. Bejar’s testimony comes amid a bipartisan push in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.
AP: Meta engineer testifies before Congress on Instagram’s harms to teens: Arturo Béjar, known for his expertise on curbing online harassment, recounted to Zuckerberg his own daughter’s troubling experiences with Instagram. But he said his concerns and warnings went unheeded. And on Tuesday, it was Béjar’s turn to testify to Congress. Béjar worked as an engineering director at Facebook from 2009 to 2015, attracting wide attention for his work to combat cyberbullying. He thought things were getting better. But between leaving the company and returning in 2019 as a contractor, Béjar’s own daughter had started using Instagram. In the 2021 note, as first reported by The Wall Street Journal, Béjar outlined a “critical gap” between how the company approached harm and how the people who use its products — most notably young people — experience it.
The Seattle Times: States, schools take on Meta to protect children: Out of concern for children and teens, Washington is among 41 states and Seattle Public Schools among at least 190 school districts that have filed lawsuits against Meta, the parent company of Facebook and Instagram. The state plaintiffs allege Meta has “profoundly altered the psychological and social realities of a generation of young Americans.” SPS Superintendent Brent Jones said: “Our students — and young people everywhere — face unprecedented learning and life struggles that are amplified by the negative impacts of increased screen time, unfiltered content, and potentially addictive properties of social media.”
The Guardian: I resist sharenting on social media. Does that mean my son and I are missing out, or is it just safer?: A few years ago, sharenting, as it’s been called, felt like the norm among my social circle. These days I see far fewer babies’ faces on social media. Concerns about online privacy and safeguarding, as well as facial recognition and the commercial use of personal data, are far more prevalent than they were in the early days of Facebook. In fact, you could say that whether or not you share photos has become another parental identity marker, up there with breastfeeding, cloth nappies and baby-led weaning as evidence that you’re doing things “the right way”, not like “those other parents”.
The Boston Globe: How parents can deal with social media addiction in teens, kids: It surprised us to hear that most of the adolescents in our daily therapy group were using social media for double-digit hours each day, and that over half articulated a relationship between their mental health and the messaging they received through social media. To our greater surprise, all but one felt there was a need for some kind of external controls. They could not imagine decreasing their usage all by themselves. We hear about parents whose instinct is to react in the moment — perhaps a bit too late and often out of frustration — with an authoritarian, all-or-nothing approach. Many want to take away their child’s device or install content trackers to see their every swipe. Before taking drastic measures, we encourage parents to step back and work to resolve their feelings of guilt. Trust us when we say it’s not your fault. The explosive growth of social media was like a speeding train; there was no time to anticipate its power nor opportunity for a thoughtful response. The devices used to access these platforms were a Trojan horse, welcomed into our homes in the form of a telephone.
The Hill: Meta whistleblower to testify in Senate hearing on child safety, social media: Arturo Bejar, a former Facebook engineering director who later worked as a company consultant, will testify before a Senate subcommittee about social media and its impact on the teen mental health crisis, the panel announced Friday. The testimony comes amid a bipartisan push in Congress to adopt regulations aimed at protecting kids online. Bejar has also met with Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), the lead sponsors of the Kids Online Safety Act (KOSA), according to the senators’ offices. The senators added it is “time to say ‘enough is enough’ to Big Tech” and pass their legislation to address the harms from tech companies’ actions toward children.
6ABC Philadelphia: How much is too much? Exploring possible dangers of teen social media use: Public health experts are sounding the alarm over the potential risks when kids, particularly teens, are on social media. U.S. Surgeon General Vivek Murthy has indicated that social media may be playing a role in the teen mental health crisis. It’s best to delay social media use as long as you can, but realistically, it’s not a matter of if they’ll be on it but when. And while teens tend to be more tech-savvy than their parents, there are some things you can do to help them navigate this online world in a healthy way.
Observer Reporter: Social addiction: Area experts say constant scrolling can lead to mental health woes in teens: The impact of social media apps such as Facebook and Instagram on children and teens is getting more attention following a multi-state lawsuit against Meta that alleges the company intentionally designed its products with addictive features that negatively impact young users. Emily Walentosky, a school psychologist at California Area School District, said social media is a big part of many kids and teens’ lives, but spending too much time online is harmful.
MyChesCo: Pennsylvania Joins Multi-State Lawsuit Against Meta, PFSA Advocates for Family Digital Wellness: Pennsylvania has joined 32 other states in a federal lawsuit against Meta Platforms, Inc., the parent company of Facebook and Instagram. The lawsuit alleges that Meta’s social media platforms violate consumer protection laws by exposing young users to harmful, manipulative, and addictive content. Pennsylvania Attorney General Michelle Henry has voiced strong opposition against these practices, stating, “The time has come for social media giants to stop trading in our children’s mental health for big profits.” She further accused Meta of promoting a “click-bait culture” that is psychologically damaging to children.
Pittsburgh Tribune-Review: Editorial: Meta lawsuit is only part of necessary actions on social media: Last week, Pennsylvania Attorney General Michelle Henry was one of 33 state AGs to join a lawsuit against Meta, the company behind Instagram, Facebook, WhatsApp and Oculus. And that is the crux of the lawsuit, which claims children are being damaged by the company’s social media operations in ways neither they nor their parents realize. The problem is more than the content. That can be troubling but is more easily addressed by blocking what one doesn’t like or encouraging (or demanding) better moderation by the company. Federal and state governments need to look at social media companies and rethink what it means to have all that power with so little consequence. This is one sticky web. We need to stop kids from getting caught in it.
Sanford Health: Kids on social media must mind their mental health: Behavioral health leaders across the U.S. are urging parents to monitor the social media habits of their children, citing it as a factor in the increase in mental health issues in adolescents. The progression from what starts as a standard diversion, like watching TV or playing video games, to something darker can be a slippery slope, says Dene Hovet, associate behavioral health counselor for Sanford Health. In many cases, neither kids nor parents quite realize the extent to which the internet and social media have taken over their lives.
Gallup News: Parenting Mitigates Social Media-Linked Mental Health Issues: Teenagers who spend more time on social media experience worse mental health on a variety of measures, according to data from a new Gallup survey. Yet the strength of the relationship between an adolescent and their parent is much more closely related to their mental health than their social media habits. When teens report having a strong, loving relationship with their parents or caretakers, their level of social media use no longer predicts mental health problems. The data inform debates about the consequences of social media use.
Los Angeles Times: Editorial: Social media can harm kids. Lawsuits could force Meta, others to make platforms safer: It’s a rare issue that can bring 41 states together for a bipartisan fight. This week, state attorneys general across the political spectrum joined forces in suing Facebook parent company Meta for allegedly using features on Instagram and other platforms that hook young users, while denying or downplaying the risks to their mental health. But there hasn’t yet been significant change in the industry. Most companies haven’t been willing to overhaul their platforms to curb addictive features or harmful content for users under 18 years old, such as setting time limits on their apps or changing algorithms that steer kids into “rabbit holes” to keep them online. Nor have federal lawmakers been able to enact comprehensive product safety regulations because legislation has stalled in Congress or been blocked by courts. In the absence of policy changes, lawsuits are the next logical step in prodding technology companies to ensure their products are safe for young people or be held accountable. Some have compared the states’ legal strategy to lawsuits against Big Tobacco and opioid manufacturers that revealed how the companies lied about the harm caused by their products, and forced them to change their business practices.
TechCrunch: Why 42 states came together to sue Meta over kids’ mental health: Attorneys general from dozens of states sued Meta this week, accusing the company of deliberately designing its products to appeal to kids to the detriment of their mental health. In the lawsuit, filed in California federal court Tuesday, 33 states including California, Colorado, New York, Arizona and Illinois argue that Meta violated state and federal laws in the process of luring young users in the U.S. into spending more time on Facebook and Instagram.
The New York Times: Is Social Media Addictive? Here’s What the Science Says: A group of 41 states and the District of Columbia filed suit on Tuesday against Meta, the parent company of Facebook, Instagram, WhatsApp and Messenger, contending that the company knowingly used features on its platforms to cause children to use them compulsively, even as the company said that its social media sites were safe for young people. “Meta has harnessed powerful and unprecedented technologies to entice, engage and ultimately ensnare youth and teens,” the states said in their lawsuit filed in federal court. “Its motive is profit.” The accusations in the lawsuit raise a deeper question about behavior: Are young people becoming addicted to social media and the internet? Here’s what the research has found.
Politics PA: Henry Joins Federal Lawsuit Against Meta Over Content For Young Users: Pennsylvania Attorney General Michelle Henry joined a multi-state coalition in a federal lawsuit against Meta Platforms, Inc., claiming the company’s social media platforms, including Facebook and Instagram, violate consumer protection laws by subjecting young users to a wave of harmful, manipulative, and addictive content. The lawsuit alleges that Meta knowingly designs and deploys features harmful to children on its platforms, while at the same time, falsely assuring the public that those features are suitable for children.
AP: AI-generated child sexual abuse images could flood the internet. A watchdog is calling for action: The already-alarming proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Tuesday. In a written report, the U.K.-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.
CBS Pittsburgh: Pennsylvania joins 32 other states in lawsuit against Facebook and Instagram parent company Meta: A bipartisan coalition of 33 state attorneys general, including Pennsylvania, have announced a federal lawsuit against Meta, the parent company of Facebook and Instagram. The suit alleges that the tech giant deliberately engineered its social media platforms to be addictive to both children and teenagers in an effort to boost its profits. Pennsylvania, and 32 other states, have joined the lawsuit after the U.S. Surgeon General said in May that social media companies have contributed to what is described as a “youth mental health crisis.” Federal law prohibits children under the age of 13 from signing up for social media platforms, but the federal complaint alleges that Meta knew young users were active on the platform and collected data from them without parental consent.
Philadelphia Business Journal: Facebook parent sued by Pennsylvania, New Jersey and other states over allegedly harming kids: Pennsylvania, New Jersey and 30 other states on Tuesday filed suit against Meta Platforms Inc., alleging that the social media giant intentionally marketed its services to kids without their parents’ consent and while knowing that those services were causing children harm. In a complaint filed in the U.S. District Court for Northern California in San Francisco, the states’ attorneys general charged the tech titan with violating the federal Children’s Online Privacy Protection Act as well as numerous state laws prohibiting corporations from deceptive acts. The states are asking that the court bar the Menlo Park-based parent company of Facebook and Instagram from future violations and award them unspecified damages.
Philly Voice: Pennsylvania, New Jersey join federal lawsuit alleging Facebook, Instagram harm children’s mental health: Pennsylvania and New Jersey are among dozens of states suing social media giant Meta over claims that its platforms, including Facebook and Instagram, are addictive and mentally damaging to children and teens. The federal lawsuit, filed Tuesday in California, claims Meta designed specific features intended to keep kids hooked on its platforms. The attorneys general of 33 states argue that Meta knew about the dangers its products posed to young people, but downplayed and concealed them in the interest of growth and competition with rival platforms like TikTok.
41 States Sue Meta, Claiming Instagram, Facebook are Addictive, Harm Kids: Forty-one states and the District of Columbia are suing Meta, alleging that the tech giant harms children by building addictive features into Instagram and Facebook. The legal actions represent one of the most significant efforts by state regulators to rein in the impact of social media on children’s mental health. Thirty-three states, including Pennsylvania, are filing a joint lawsuit in federal court in the Northern District of California, while attorneys general for D.C. and eight states are filing separate complaints in federal, state or local courts. The complaints underscore concern that major social networks risk the well-being of younger users by designing products in ways that optimize engagement over safety.
CNBC: FTC plans to hire child psychologist to guide internet rules: The Federal Trade Commission plans to hire at least one child psychologist who can guide its work on internet regulation, Democratic Commissioner Alvaro Bedoya told The Record in an interview published Monday. FTC Chair Lina Khan backs the plan, Bedoya told the outlet, adding that he hopes it can become a reality by next fall, though the commission does not yet have a firm timeline. “Our plan is to hire one or more child psychologists to help us assess the mental health impacts of what children and young people do online,” FTC spokesperson Douglas Farrar told CNBC in a statement. “We are currently exploring next steps including how many to hire and when.”
The New York Times: Face Search Engine PimEyes Blocks Searches of Children’s Faces: Concerns about children’s privacy have led PimEyes, the public face search engine, to ban searches of minors. The PimEyes chief executive, Giorgi Gobronidze, who is based in Tbilisi, Georgia, said that technical measures had been put in place to block such searches as part of a “no harm policy.” PimEyes, a subscription-based service that uses facial recognition technology to find online photos of a person, has a database of nearly three billion faces and enables about 118,000 searches per day, according to Mr. Gobronidze. The service is advertised as a way for people to search for their own face to find any unknown photos on the internet, but there are no technical measures in place to ensure that users are searching only for themselves.
CNBC: Want to raise happy, successful kids? ‘Wait as long as possible’ to give them a phone, says Yale expert: Children ages eight to 12 who have phones spend just under five hours a day glued to their phones, and teenagers rack up nearly eight hours of screen time per day, a 2019 report from nonprofit Common Sense Media found. That screen time is seldom used for creative activities like coding or making digital art. Rather, young people spend most of their phone time on social media or watching videos, Common Sense head of research Michael Robb wrote in an analysis of the report. This is likely to encourage poor mental health — in ways that affect kids differently than adults — and distractions in the classroom.
CNN: About half of children share their location on Twitch, research shows: Twitch is an online streaming service in which users can live stream their gaming, music and other creative content. Joining the interactive platform can help create feelings of community, but streamers age 13 and younger often share information with viewers all around the world that can put them at risk for exploitation, said coauthor Fiona Dubrosa, a visiting scholar at Cohen’s Children Medical Center in New York City.
Today: Social media influencer raises the alarm about kids and phones: Through her educational programming and nonprofit organization, #halfthestory, Larissa May urges teenagers to learn more about how technology affects their minds, offering concrete tips on bringing mindfulness back into the picture. At the moment, May is focusing her efforts on kids in middle school and high school, ages where she thinks she can make the most impact, and she urges parents to get involved. Rather than tell teens that phones are simply evil, May suggests that parents can demonstrate “positively using technology and having it be fun, because there’s just so much negativity around it.” She also suggests that parents “lead with vulnerability” in openly discussing their own tech temptations and habits they might want to change. Many of the parents who approach May are often chained to their phones themselves, she says.
Yahoo Lifestyle: Parents are pranking their kids on social media. Here’s why experts say it isn’t harmless: Before parents participate in a prank or any social media trend with their children, they should consider the kid’s age and maturity level to determine whether or not it is age-appropriate. Could this cause pain or physical damage? Will it induce fear, humiliation or emotional harm? If the answer is yes, parents should consider that prank harmful, not harmless. But there are safer and more respectful ways to approach these social media trends. Dr. Niky has used her own TikTok platform to share her reframing of the #EggCrackChallenge. In her video, she involves her child in the prank by explaining what she wants to do, offering to let him crack the egg on her forehead first and discussing his decision to forego participation. Her older daughter, meanwhile, playfully but nervously takes her up on the offer to crack the egg against her mom’s forehead. The video, Dr. Niky shares, illustrates how a challenge can become a fun family activity and bonding experience when everyone is in on the joke. Otherwise, parents risk becoming their “kid’s first bully.”
The Hill: Opinion: Congress can disrupt the spread of online child sexual abuse: As Congress determines a path forward for government spending in this new fiscal year, it is past time for lawmakers to take decisive action to address the crisis of online sexual exploitation of children. In 2022 alone, the National Center for Missing & Exploited Children received 32 million reports of suspected child sexual exploitation — of those, nearly 90 percent resolved to a location outside the U.S. This is a global crime, often with demand-side offenders in one country and victims in another.
The Verge: Google asks Congress to not ban teens from social media: Google responded to congressional child online safety proposals with its own counteroffer for the first time Monday, urging lawmakers to drop problematic protections like age-verification tech. In a blog post, Google released its “Legislative Framework to Protect Children and Teens Online.” The framework comes as more lawmakers, like Sen. Elizabeth Warren (D-MA), are lining up behind the Kids Online Safety Act, a controversial bill intended to protect kids from dangerous content online.
Bloomberg: Kids Suing Social Media Over Addiction Find a Win Amid Losses: Minors and parents suing Meta Inc.’s Facebook and other technology giants for the kids’ social media platform addictions won an important ruling advancing their collection of lawsuits in a California court. A state judge on Friday threw out most of the claims but said she’ll allow the lawsuits to advance based on a claim that the companies were negligent – or knew that the design of their platforms would maximize minors’ use and prove harmful. The plaintiffs argue social media is designed to be addictive, causing depression, anxiety, self-harm, eating disorders, and suicide.
The New York Times: Can You Hide a Child’s Face From A.I.?: How much parents should post about their children online has been discussed and scrutinized to such an intense degree that it has its own off-putting portmanteau: “sharenting.” Historically, the main criticism of parents who overshare online has been the invasion of their progeny’s privacy, but advances in artificial intelligence-based technologies present new ways for bad actors to misappropriate online content of children. Among the novel risks are scams featuring deepfake technology that mimic children’s voices and the possibility that a stranger could learn a child’s name and address from just a search of their photo.
The New York Times: New Laws on Kids and Social Media Are Stymied by Industry Lawsuits: Many children’s groups heralded the measure, the first of its kind in the United States. So did Gov. Gavin Newsom. “We’re taking aggressive action in California to protect the health and well-being of our kids,” he said in a statement at the time. But last month, after a lawsuit filed by a tech industry group whose members include Meta and TikTok, a federal judge in California preliminarily blocked the law, saying it “likely violates” the First Amendment.
The Baltimore Sun: Opinion: Social media platforms must step up to combat youth mental health crisis in schools: Parents and educators are working hard to create a welcoming and safe environment for kids. Now, it’s time for Big Tech to step up as well, and take some responsibility. Social media has had a disruptive and often detrimental role in the well-being and academic success of students across the nation, with the consequences growing more concerning each year. Educators and parents are bearing the weight of these disruptions at school and at home, while Big Tech platforms make billions, so together we’re demanding social media companies make significant changes to make their products safer for millions of kids. In May 2023, the U.S. Surgeon General issued a landmark report declaring a youth mental health crisis in America and pointing a finger at social media’s role in the epidemic.
The Hill: Ramaswamy backs controversial social media limits for teens: Republican presidential candidate Vivek Ramaswamy seemingly backed controversial proposals that would bar teens under 16 from using social media platforms during a Wednesday night debate. “This isn’t a Republican point or a Democrat point,” Ramaswamy said. “But if you’re 16 years old or under, you should not be using an addictive social media product, period.” The conservative entrepreneur said the idea is “something that we can both agree on,” and in doing so can “revive both the mental health of this country while stopping the fentanyl epidemic.” Concerns around children’s online safety have emerged as a rare unifying issue across party lines, but proposals such as the one Ramaswamy suggested have not been as bipartisan. Sen. Josh Hawley (R-Mo.), an outspoken critic of social media companies, in February put forward a bill that would ban children under 16 from using social media.
The Washington Post: Got an idea for protecting kids online? You can now take action: If you have concerns about kids and teens on social media, or ideas for keeping them healthy and safe, now you can submit those directly to the federal government. The Department of Commerce’s National Telecommunications and Information Administration (NTIA) sent out a request for public comment on Thursday calling for parents, educators and other interested parties to write in and share their concerns and “best practices” around internet usage of kids and teens. The call comes several months after the White House promised in an advisory to dedicate more resources and brainpower to two big questions: How exactly is internet access affecting young people, and what should the rest of us be doing about it?
PhillyVoice: Federal bill would allow sexual violence survivors to temporarily defer student loan payments: Two federal lawmakers from Pennsylvania have introduced a bill that would defer student loan payments for survivors of sexual violence who withdraw from college, allowing them to focus on their well-being. The bill, introduced on Wednesday by U.S. Sen. John Fetterman and House Rep. Madeleine Dean, would allow survivors of sexual violence to temporarily suspend their student loan payments if they withdraw from a college or university to seek treatment and focus on their mental and physical recovery. The bill would allow students who have passed the six-month student loan deferment period to extend it for up to three years.
The Morning Call: Opinion: Protecting Pa. children from abuse isn’t easy. Specially trained pediatricians are part of the solution: Hard conversations are happening about how best to keep children safe (and alive) while guarding against over-surveilling or inappropriately intervening in families. Front-and-center are concerns about how Black and brown families are disproportionately reported to child welfare. In recent years, diverse stakeholders have engaged in interdisciplinary forums exploring reforms related to: mandatory reporting of suspected child abuse or neglect; the quality of child abuse investigations, including the role of specially trained medical professionals; and the unintended consequences of Pennsylvania’s child abuse registry. Recently, the Lehigh County controller focused on a rarely reported and substantiated type of abuse — Munchausen syndrome by proxy (or medical child abuse). His report was frustrating in its disconnection from these forums.
igamingbusiness: Pennsylvania fines operator for underage VGT gambling: The underage gambling breach was identified at a qualified truck stop in the Smithton area of Pennsylvania. Pilot Travel Centers was also flagged for not having a board-credentialed employee on duty. A financial penalty of $45,000 was agreed following negotiations between the PGCB’s Office of Enforcement Counsel and Pilot Travel Centers. Four Pennsylvania adults banned for leaving children unattended: In other news, four adults have been placed on the Pennsylvania Involuntary Exclusion List for leaving children unattended while gambling. A female player left three children – aged 10, 14 and 15 – in a running vehicle in the parking garage of Hollywood Casino at Penn National Race Course. The individual gambled inside the Pennsylvania venue for two hours and two minutes while the children were unattended.
NEXSTAR: Ohio police suggested charging an 11-year-old for her explicit photos. Experts say the practice is common: When an Ohio father learned that his 11-year-old daughter had been manipulated into sending explicit photos to an adult, he turned to the police for help. But instead of treating the girl as a crime victim, an officer seemingly threatened to charge her under a law most people view as designed to protect child victims. The shocking interaction was recorded last week on body camera audio and by the father’s doorbell camera in Columbus, Ohio. The footage drew criticism from the public and from experts who said law enforcement officials have long misused laws meant to protect children by threatening to charge them with being part of the same crime.
The Morning Call: After Lehigh County report, what you should know about the John Van Brakle Child Advocacy Center, the CAC movement and child abuse pediatricians: For the last 20 years, the Lehigh Valley has had a child advocacy center in one way or another, operating in plain sight but garnering little attention from those on the outside of child abuse investigations. But now, the John Van Brakle Child Advocacy Center at Lehigh Valley Health Network’s Reilly Children’s Hospital is in the spotlight, following a critical report by Lehigh County Controller Mark Pinsley and protests by parents and the Parents Medical Rights Group, a Lehigh Valley organization that seeks more parental input in medical decisions. The Van Brakle Center and its former director, Dr. Debra Esernio-Jenssen, who was recently replaced, are accused by some parents of misdiagnosing their children with abuse, causing traumatic investigations and family separations at the hands of Children and Youth Services, only for the investigation to be dropped later.
CBS 21: State leaders respond to challenges in PA’s child welfare system: In the months since three Adams County Children and Youth Services employees were arrested and charged with endangering the welfare of a child, CBS21 has been asking state leaders how they are working to address challenges in the Commonwealth’s child welfare system. Advocates, lawmakers and those who work in the child welfare system said some challenges include staffing, caseloads and funding. Governor Josh Shapiro’s office said he has a proven track record of working to protect children and ensure their safety. His office provided this statement in part: “Governor Shapiro further supports the work of the Office of Child Advocate as yet another tool to help keep Pennsylvania’s children safe and ensure this essential function of county government can continue to meet its obligations to children and families.”
People: Jodi Hildebrandt Has Counseling License Frozen Amid Child Abuse Charges with Ruby Franke: Jodi Hildebrandt, the embattled sex therapist who faces six felony child abuse charges in Utah, has reportedly agreed to have her counseling license frozen amid allegations of child abuse against her and her business partner, Ruby Franke. “Given the heinous abuse allegations, the agency felt that the surrender of the license was the best course of action to protect the safety of Hildebrandt’s patients and clients,” Margaret Busse, the executive director of the Utah Department of Commerce, said in a statement to the media on Tuesday. Under the agreement, obtained by PEOPLE, Hildebrandt’s license will remain frozen amid the criminal charges against her until she has a hearing before the state’s Clinical Mental Health Counselor Licensing Board, which will determine the future of her ability to practice in the state. Hildebrandt has been a licensed clinical mental health counselor in Utah since 2005.
The Hill: Opinion: We must create an independent expert agency for AI and ‘Big Tech’: Last week, I met with child psychologists to discuss social media’s profound effects on Colorado’s kids. They shared their clinical assessment of the addiction and trauma our kids are experiencing — and the accompanying sleepless nights, searing anxiety, endless bullying and deepening despair. Almost all of these clinicians were parents as well. And our conversation shifted from their patients to their kids and how social media has deprived their own sons and daughters of their chance at a healthy childhood. They told me about school nights devolving into screaming matches about screen time, the deafening silence during carpool as kids ride hypnotized by an endless feed in the back seat, and the meals skipped by impressionable teens in hopes of achieving the “perfect” bodies these platforms parade to them.
Pittsburgh Post-Gazette: Opinion: Angela Liddle: How parents can help their children resist social media: Technology will always make sure that parenting will never be easy. More than half of parents surveyed in a recent Pew Research study said that social media makes parenting harder than it was 20 years ago. The reasons are well-known: Almost three-fourths of kids see sexual or violent content while doing their homework. One-fifth of kids from 10 to 17 have been approached or sexually solicited while online. Almost half of children in fourth to eighth grade have spoken with a stranger online. This has concerned lawmakers across the country enough to introduce legislation to ban platforms or restrict teens and adolescents from registering social media accounts. While I applaud lawmakers for wanting to protect kids online, legislation alone is not going to protect our kids. It’s up to us as parents and guardians to help our kids foster positive digital behaviors.
Slate: Sen. Richard Blumenthal Defends His Controversial Bill Regulating Social Media for Kids: For a while now, Washington has been wrestling with two big forces shaping technology: social media and artificial intelligence. Should they be regulated? Who should do it—and how? Currently, Congress is considering a bill that would regulate how social media companies treat minors: the Kids Online Safety Act. Although it has bipartisan support, KOSA is not without controversy. Several critics have called it “government censorship.” One group, the Electronic Frontier Foundation, says it is “one of the most dangerous bills in years.” One of KOSA’s sponsors is Connecticut Democratic Sen. Richard Blumenthal. On Friday’s episode of What Next: TBD, I spoke with Blumenthal about tech, kids, and what role the government should play when it comes to regulating Silicon Valley. Our conversation has been edited and condensed for clarity.
The Washington Post: Judge blocks California law meant to increase online safety for kids: A federal judge on Monday temporarily blocked an online child protection law in California and said it probably violates the Constitution. Under the law, known as the California Age-Appropriate Design Code, digital platforms would have to vet their products before public release to see whether those offerings could harm kids and teens. The law also requires platforms to enable stronger data privacy protections by default for younger users. U.S. District Court Judge Beth Labson Freeman granted a request Monday by the tech trade group NetChoice for a preliminary injunction against the measure, writing that the law probably violates the First Amendment and does “not pass constitutional muster.”
The Hill: Ashton Kutcher steps down from anti-child sex abuse group after Danny Masterson pushback: Ashton Kutcher, co-founder of Thorn – a technology company protecting children from sexual abuse – announced Friday he would resign from the organization’s board amid backlash he received for supporting former co-star and convicted rapist Danny Masterson. “This decision is rooted in the recognition of recent events and ensuring Thorn remains focused on its mission: to build technology to defend children from sexual abuse,” the company, founded by Kutcher and Demi Moore, said in a statement. On Sept. 7, a Los Angeles judge sentenced Masterson to 30 years to life in prison for raping two women. Kutcher and his wife, actress Mila Kunis, who both co-starred with Masterson on “That ’70s Show,” wrote character letters to the judge prior to Masterson’s sentencing.
The Hill: Opinion: Congress, it’s time to put kids before Big Tech profits. Pass KOSA: Our kids are experiencing a national epidemic of depression, anxiety, and loneliness. Rates of suicide have skyrocketed, feelings of hopelessness have reached critical levels, and across the country, parents and young people are demanding solutions to this national crisis. Behind this mental health emergency is social media — its ubiquity, its pervasive data collection, and its addictive design. Nearly 20 years after we first started posting on Facebook walls, Americans are finally turning their attention to the impact social media is having on an entire generation. These companies have been running a national experiment on our kids and the results have been catastrophic. According to a national poll commissioned by Issue One and our Council for Responsible Social Media, only 7 percent of Americans see social media’s impact on children as more positive than negative. That’s an overwhelming rebuke of Big Tech and the repercussions their platforms are having on children.
Tech Policy Press: Fight Over State Child Online Safety Laws May Last Years: After a wave of legislation focused on child online safety swept through state legislatures over the past two years, legal challenges against the new laws are gaining traction in federal courts. But rather than signaling a change in the tide, the lawsuits may ultimately spur a new round of bills that address flaws in those passed in the first wave. Putting aside the merits of the various approaches to child online safety that animate recent legislation and whether they may be effective, it is clear that the overarching issue is one that will survive well into the future. A recent national poll on children’s health found that use of devices and social media are at the top of parent concerns. What follows is a summary of the legal and political debate involving child online safety laws and where it might go in the future.
Mashable: New S.O.S. initiative online rating system targets teen safety: Imagine letting a child or teen see a movie without any guidance about the film’s appropriateness for their age. You might settle into an animated feature that surprised you and your 8-year-old with nonstop profanity. Or discover that the action flick your 13-year-old watched depicted graphic sex. Parents typically like to avoid exposing their kids to inappropriate content and count on movie and TV ratings, however imperfect, to help them do exactly that. But as mental health advocate and fashion designer Kenneth Cole argues, parents have no such resource or guideline when it comes to the internet, which is where their kids and teens spend a significant amount of their time.
Them: Over 100 Parents of Trans Kids Sign Letter Opposing a Controversial Internet Safety Bill: Over 100 parents of trans and gender-expansive children have written an open letter opposing the Kids Online Safety Act (KOSA), legislation that advocates say could widely censor trans and other marginalized communities on the internet. KOSA was initially introduced in the Senate in 2022 by Democratic Senator Richard Blumenthal and Republican Senator Marsha Blackburn. The bill would burden online platforms with the legal responsibility to proactively remove content that causes anxiety, depression, eating disorders, bullying, violence, and more. Since its introduction in 2022, digital rights groups such as Fight for the Future have been warning that online platforms could face “substantial pressure to over-moderate” as a result of the bill, resulting in widespread censorship.
Los Angeles Times: California lawmakers pass measure to combat child sexual abuse material on social media: California lawmakers on Wednesday passed a bill aimed at combating child sexual abuse material on social media platforms such as Facebook, Snapchat and TikTok. The legislation, Assembly Bill 1394, would hold social media companies liable for failing to remove the content, which includes child pornography and other obscene material depicting children. “The goal of the bill is to end the practice of social media being a superhighway for child sexual abuse materials,” Assemblywoman Buffy Wicks (D-Oakland), who authored the legislation, said in an interview. The bill unanimously cleared the Senate on Tuesday. The Assembly unanimously approved an amended version of the bill on Wednesday and it’s now headed to the governor’s desk for consideration.
Washington Blade: EXCLUSIVE: Sen. Blumenthal defends Kids Online Safety Act: Responding to criticism from some in the LGBTQ community about the Kids Online Safety Act, U.S. Sen. Richard Blumenthal (D-Conn.) defended the legislation and reiterated his strong support for queer youth. “I would never put my name on any bill that targets or disparages or harms the trans or LGBTQ community,” Blumenthal told the Washington Blade on Friday. “There have been a lot of eyes” on the Kids Online Safety Act, he said. “A lot of very smart and careful people have reviewed its language, and they and I have worked to make it as rigorous and tight as possible.” The proposed legislation, introduced by Blumenthal and Republican U.S. Sen. Marsha Blackburn (Tenn.), would address harms experienced by children and their families at the hands of dominant social media and tech platform companies. It enjoys broad bipartisan support in the Senate.
Lake Okeechobee News: Data reveals TikTok to be the platform parents most worry about: A new study reveals TikTok to be the most worrying social media platform for parents, with an estimated 5,100 online searches asking if TikTok is safe. Canopy.us, an AI-powered digital family safety app, has collated Google search volume data relating to the safety of various social media platforms. The search volumes for each platform were collected for U.S. and global searches and then ranked to reveal the most worrying social media site. The family safety app has also explored what dangers threaten kids using TikTok and the safety precautions parents can take to protect them.
People: ‘Monster’: Powerful PSA from ChildFund Urges Big Tech to Step up Efforts Against Child Sex Predators: A new campaign has set its sights on Big Tech, challenging legislators to enact laws that require internet companies to more actively target and take down child sexual abuse material, which activists say is increasingly rampant on social media. ChildFund International, an organization that focuses on child development and protection, is spearheading the #TAKEITDOWN campaign, which aims to “build public support to pressure tech companies to proactively remove child sexual abuse content from their platforms,” according to a press release announcing the launch. The centerpiece of the campaign is a video PSA titled Monster, which portrays a seemingly innocuous man, who goes to work and interacts with others in person. But when he’s alone, browsing the internet, his face is suddenly covered by a monster mask, implying that he is engaging in predatory online behavior.
Gizmodo: ‘Our Kids’ Lives Are at Stake:’ 100 Parents of Trans Kids Beg Lawmakers to Kill Kids Online Safety Act: Parents of more than 100 trans and gender-expansive children are urging lawmakers to turn their back on the “dangerous and misguided” Kids Online Safety Act (KOSA) currently winding its way through Congress. In a fiery open letter shared with Gizmodo, the parents said KOSA, which is intended to shield kids from the harms of social media, would actually make their kids less safe and cut them off from potentially lifesaving resources and communities. “Big Tech is hurting our kids,” they added. “KOSA would hurt them even more.” Lawmakers from both sides of the aisle and President Biden himself have rallied around KOSA in recent months as a potential saving grace in response to a steady stream of reports showing various ways Big Tech platforms can harm young users and contribute to a worrying rise in depression and anxiety.
The New York Times: Demonizing Social Media Isn’t the Answer to Online Safety, a New Book Argues: Panic over social media has reached a fever pitch. Diagnoses of mental illness among adolescents have been on the rise, and in May the U.S. surgeon general warned of “ample indicators” that social media may in part be to blame. In June, a psychologist called for a nationwide ban of cellphones in schools. By next March, kids under 18 in Utah will be allowed to use TikTok and Instagram only if they have explicit parental permission. But perhaps banning social media — or heavily monitoring kids who use it, which is another common parental response — isn’t the most constructive solution to the problem. Perhaps, instead, we should focus more on helping kids learn how to safely navigate social media and manage online privacy and decision-making.
The New York Times: Appeals Court Rules White House Overstepped 1st Amendment on Social Media: A federal appeals court ruled on Friday that the Biden administration most likely overstepped the First Amendment by urging the major social media platforms to remove misleading or false content about the Covid-19 pandemic, partly upholding a lower court’s preliminary injunction in a victory for conservatives. The ruling, by a three-judge panel of the U.S. Court of Appeals for the Fifth Circuit in New Orleans, was another twist in a First Amendment case that has challenged the government’s ability to combat false and misleading narratives about the pandemic, voting rights and other issues that spread on social media.
Los Angeles Times: Elon Musk’s X, formerly Twitter, sues California over social media law: X, the company formerly known as Twitter, is suing California over a state law passed last year that lawmakers say aims to make social media platforms more transparent. The law, Assembly Bill 587, requires social media companies to disclose their policies, including what content users are allowed to post on their platforms and how it responds when they violate the platform’s rules. The companies are required to submit this information to the California attorney general by January 2024. The attorney general’s office would then make those reports public online. In the lawsuit, filed in a federal court in Sacramento on Friday, X alleges the law violates the 1st Amendment’s free speech protections and would pressure social media companies to moderate “constitutionally-protected” speech the state finds “undesirable or harmful.”
Forbes: Inside Apple’s Impossible War On Child Exploitation: Joe Mollick had spent much of his life dreaming of becoming a rich and famous oncologist, a luminary of cancer treatment, the man who would cure the disease for good. It was a quixotic quest for the 60-year-old, one that left him defeated and hopelessly isolated. He turned to pornography to ease his feelings of loneliness. When those feelings became more severe, so did his taste for porn; he began seeking out child sexual abuse material (CSAM). When the cops first caught Mollick uploading CSAM on a messaging application called Kik in 2019, they searched his electronics and discovered a stash of 2,000 illegal images and videos of children and some 800 files of what a search warrant reviewed by Forbes described as “child erotica.”
Mashable: Snapchat announces new updates to foster teen safety and age-appropriate content: Snapchat is releasing new app safeguards to protect teen users (aged 13-17) from unknown users and age-inappropriate content — a responsive step from a social media platform that’s been under fire for allegedly exposing its younger users to explicit content, despite its 13+ age rating. Developed in collaboration with the National Center on Sexual Exploitation (NCOSE) and National Center for Missing and Exploited Children (NCMEC), the new features include protections against unwanted contact. The safeguards build on an existing feature preventing teen users from messaging accounts not on their friends list by alerting users when adding an unknown account that doesn’t share mutual friends.
TIME: 5 Steps Parents Should Take to Help Kids Use AI Safely: Just as older generations have had to navigate the internet and social media, our children will have to learn how to interact with AI. We cannot escape this new era in the technological revolution; children as young as infants often come into contact with AI toys and chatbots like the smart toy ROYBI Robot, AI teddy bears from VTech, Moxie Robot, Siri, and Alexa. But we can’t just wait for the government to impose regulations and protect us (even though that is crucial for our sustained future). We should start in our homes, making sure our children are set up for success in a world increasingly shaped by tools like ChatGPT and Midjourney. This requires ongoing conversations between parents and children around the educational benefits of AI, the potential dangers of fully relying on this technology, how technology affects us emotionally and behaviorally, and how the humans behind the algorithms also impact what information AI gives us.
Tech Policy Press: Study Investigates Differences in Parent Approaches to Children’s Online Activities: A wave of child online safety legislation is sweeping the United States. Some laws, such as the California Age Appropriate Design Code Bill, address design and privacy concerns, putting the onus on the platforms to make their products safe for children. Laws in other states, often led by Republicans, put more emphasis on the role of parents. One example of the latter is Utah’s SB0152, known as the Utah Social Media Regulation Act. The law, which passed in March, requires that tech firms verify the age of users, requires companies to get parental consent for a child to have a social media account, and puts other restrictions on accounts held by minors, such as prohibitions on direct messaging, advertising, and the collection of personal data. The law enacts a social media curfew between 10:30 p.m. and 6:30 a.m., unless that restriction is adjusted by a parent or guardian. And, perhaps most controversially, the law gives parents and guardians the right to access a minor’s account, including direct messages.
CBS News: YouTuber Ruby Franke and her business partner each charged with 6 counts of aggravated child abuse: Ruby Franke, a once-popular YouTuber who gave parenting advice on her now-defunct “8 Passengers” YouTube channel, has been charged with six counts of aggravated child abuse, the Washington County Attorney’s Office in Utah said Wednesday. Franke and her business partner, Jodi Hildebrandt, were arrested last week after Franke’s malnourished son ran to a neighbor’s house asking for help, authorities said. The attorney’s office says Franke and Hildebrandt are accused of a combination of multiple physical injuries or torture; starvation or malnutrition that jeopardizes life; and causing severe emotional harm to two children. They each face six counts, each of which carries a potential prison sentence of one to 15 years and a fine of up to $10,000. The investigation is ongoing, the office said.
Associated Press: A federal judge strikes down a Texas law requiring age verification to view pornographic websites: A federal judge has struck down a Texas law requiring age verification and health warnings to view pornographic websites and blocked the state attorney general’s office from enforcing it. In a ruling Thursday, U.S. District Judge David Ezra agreed with claims that House Bill 1181, which was signed into law by Texas Gov. Greg Abbott in June, violates free speech rights and is overbroad and vague. The state attorney general’s office, which is defending the law, immediately filed notice of appeal to the Fifth Circuit U.S. Court of Appeals in New Orleans. The lawsuit was filed Aug. 4 by the Free Speech Coalition, a trade association for the adult entertainment industry, and a person identified as Jane Doe and described as an adult entertainer on various adult sites, including Pornhub.
Associated Press: Prosecutors in all 50 states urge Congress to strengthen tools to fight AI child sexual abuse images: The top prosecutors in all 50 states are urging Congress to study how artificial intelligence can be used to exploit children through pornography, and come up with legislation to further guard against it. In a letter sent Tuesday to Republican and Democratic leaders of the House and Senate, the attorneys general from across the country call on federal lawmakers to “establish an expert commission to study the means and methods of AI that can be used to exploit children specifically” and expand existing restrictions on child sexual abuse materials specifically to cover AI-generated images. “We are engaged in a race against time to protect the children of our country from the dangers of AI,” the prosecutors wrote in the letter, shared ahead of time with The Associated Press. “Indeed, the proverbial walls of the city have already been breached. Now is the time to act.”
CNN: You don’t need to surveil your kids to protect them on social media: Parents and other caregivers hear that social media wreaks havoc on a teen’s self-esteem. But kids often tell us that it helps them find like-minded friends and boosts their emotional well-being. So, which is it? I’m a school counselor, and I often see it’s a mix of the above. And I promise you that adults can help kids make smart choices online that keep them safe and preserve their self-esteem. Easier said than done, right? While it may seem counterintuitive, surveilling kids’ movements, tracking their grades and scouring their online conversations can do more harm than good, argues Devorah Heitner, author of the new book “Growing Up in Public: Coming of Age in a Digital World.” CNN spoke to Heitner, whose book provides a pragmatic and empathetic road map to raising kids in today’s volatile and hyper-connected world. She helps caregivers learn to mentor rather than monitor their children.
Los Angeles Times: California lawmakers kill bill aimed at making social media safer for young people: California lawmakers on Friday killed a bill that would hold social media platforms liable for promoting harmful content about eating disorders, self-harm and drugs. Senate Bill 680, which was opposed by tech companies, died in the powerful Assembly Appropriations Committee as part of a marathon hearing where lawmakers culled hundreds of bills without public debate. “There is little doubt that social media platforms employ algorithms and design features that experts across the nation agree are contributing to harming our children,” Sen. Nancy Skinner (D-Berkeley), who wrote SB 680, said in a statement. “These companies have the power to adjust their platforms to limit this harm, yet to date we’ve seen them take no meaningful action.”
The Washington Post: Long sidelined, youth activists demand a say in online safety debate: When lawmakers began investigating the impact of social media on kids in 2021, Zamaan Qureshi was enthralled. Since middle school he’d watched his friends struggle with eating disorders, anxiety and depression, issues he said were “exacerbated” by platforms like Snapchat and Instagram. Qureshi’s longtime concerns were thrust into the national spotlight when Meta whistleblower Frances Haugen released documents linking Instagram to teen mental health problems. But as the revelations triggered a wave of bills to expand guardrails for children online, he grew frustrated at who appeared missing from the debate: young people, like himself, who’d experienced the technology from an early age. “There was little to no conversation about young people and … what they thought should be done,” said Qureshi, 21, a rising senior at American University.
Associated Press: Judge blocks Arkansas law requiring parental approval for minors to create social media accounts: A federal judge on Thursday temporarily blocked Arkansas from enforcing a new law that would have required parental consent for minors to create new social media accounts, preventing the state from becoming the first to impose such a restriction. U.S. District Judge Timothy L. Brooks granted a preliminary injunction that NetChoice — a tech industry trade group whose members include TikTok, Facebook parent Meta, and X, formerly known as Twitter — had requested against the law. The measure, which Republican Gov. Sarah Huckabee Sanders signed into law in April, was set to take effect Friday. Arkansas’ law is similar to a first-in-the-nation restriction signed into law earlier this year in Utah. That law is not set to take effect until March 2024.
The Washington Post: What to know about Ruby Franke, parenting YouTuber charged with child abuse: Ruby Franke, a Utah mother of six who ran the well-known parenting YouTube channel 8 Passengers, has been arrested on charges of child abuse along with her business partner Jodi Hildebrandt, Santa Clara-Ivins Public Safety Department said in a news release. The arrests came after Franke’s 12-year-old son climbed out of a window at Hildebrandt’s home in Ivins, Utah, and appeared, emaciated and with open wounds, at a neighbor’s, where he asked for food and water, according to an affidavit reported by the Associated Press. After searching Hildebrandt’s home, police found Franke’s 10-year-old daughter in a similar malnourished state. Hildebrandt’s counseling firm ConneXions, where Franke works, did not immediately return a request for comment late Thursday. It was not clear if either of the women had retained an attorney. Franke’s husband, Kevin, was not named, and it is not clear whether the couple is still together.
NBC Boston: Protecting your child’s personal information: With the start of a new school year, you may be filling out a lot of forms as you enroll your kids in extracurricular activities. But it’s a good idea to limit the information you share, because if it falls into the wrong hands, a criminal could ruin your child’s credit. Question whether it is absolutely necessary to provide all the personal information that you’re being asked for. “As the Better Business Bureau, we just say to use caution,” said Paula Fleming, chief marketing and sales officer for the BBB. “Whatever information you’re sharing, whether it be online or in person, the more you put out there, unfortunately, the more likely it is for something to happen.” Keep in mind that even school systems can be hacked. In 2019, there were 348 data breaches in educational institutions, and the personal information of more than 2.3 million students was exposed to scammers.
CNN: ‘All we want is revenge’: How social media fuels gun violence among teens: Juan Campos has been working to save at-risk teens from gun violence for 16 years. As a street outreach worker in Oakland, California, he has seen the pull and power of gangs. And he offers teens support when they’ve emerged from the juvenile justice system, advocates for them in school, and, if needed, helps them find housing, mental health services, and treatment for substance abuse. But, he said, he’s never confronted a force as formidable as social media, where small boasts and disputes online can escalate into deadly violence in schoolyards and on street corners.
The Verge: Child safety bills are reshaping the internet for everyone: By the end of this month, porn will get a lot harder to watch in Texas. Instead of clicking a button or entering a date of birth to access Pornhub and other adult sites, users will need to provide photos of their official government-issued ID or use a third-party service to verify their age. It’s the result of a new law passed earlier this summer intended to prevent kids from seeing porn online. But it’s also part of a broad — and worrying — attempt to age-gate the internet.
Pittsburgh Post-Gazette: Opinion: The state government can protect children from pornography: State government has a clear role in protecting the health, safety, and welfare of its residents. That is especially true for children, whose impressionability and development require active work in protecting them from the worst of society so they can develop appropriately. As technology has infiltrated every part of our lives and has become ubiquitously used at the earliest stages of human development, the dark side of that technological access has also started to have a deleterious impact on children and their appropriate development. One of my priorities in the Pennsylvania General Assembly has been to protect children from sexual abuse and exploitation. While my personal story as a childhood rape survivor has helped drive this legislative passion, it is important that children have adults in positions of public trust speak up for them.
The Daily Review: Bradford County enters nationwide lawsuit against social media companies: Bradford County is in the early stages of entering a nationwide lawsuit against social media companies over their alleged negative effects on mental health. During their Thursday meeting, the Bradford County Commissioners approved “an agreement with Marc J Bern & Partners LLP to represent Bradford County regarding the recovery of all cost incurred by the county associated with the social media crisis.” Commissioner Daryl Miller explained that the lawsuit alleges social media companies have facilitated the bullying of children across their platforms. He stated that the lawsuit doesn’t take issue with the platforms’ free speech, but instead with the algorithms used to target people.
CT Examiner: Advocates, Lawmakers Plan Harder Look at Youth and Social Media: Seventeen-year-old Coventry High School student Dylan Nodwell knows first-hand the downsides of social media and how cyberbullying has caused anxiousness and, in many cases, deep depression among today’s teens. Nodwell, who will be a high school senior this fall, estimates that 90 percent of his peers use social media – primarily TikTok, Instagram and Snapchat – and that time on those sites often leads to bullying that goes unchecked. “I wished I’d grown up without it,” Nodwell told CT Examiner Friday. “People are behind a screen and they find it easy to bully others. You can leave mean comments and there are not always repercussions for that. I’ve seen friends get sad, anxious and depressed because of it. It kind of normalizes that sort of negative attitude toward others.”
Fox 56 Wolf: Ensuring online safety for kids: Expert urges parents to monitor apps as children return to school: As parents start to send their children back to school, one important step could be to check all of their online platforms. The app “NGL,” short for “Not Gonna Lie,” was launched last year. It’s a platform with anonymous users and acts as a personal inbox to receive messages. “There are some great apps out there, great programs that you can put on those phones; they’re designed specifically to keep your children safe. Remember, those phones are yours, parents; they’re not your child’s, and they don’t have any expectation of privacy on your device. You can do a lot of things to place filters on those phones and allow some type of control over how they use them. There are programs like Bart, Family 360, things you can do to minimize the probability that somebody is going to contact them or they are going to contact somebody they shouldn’t be talking to,” said Social Media Intelligence Expert Dr. Steve Webb.
Vox: YouTube can’t fix its kid safety problem: Google might be facing significant fines for violating children’s privacy through YouTube ads — again. Two recent reports suggest that the company is collecting data from and targeting ads to children, a violation of both the Children’s Online Privacy Protection Act (COPPA) and Google’s consent decree with the Federal Trade Commission. They also come as Google, which owns YouTube, prepares to defend itself in a major antitrust lawsuit about its search engine, is under scrutiny from Democrats and Republicans alike, and Congress considers child online safety bills. Simply put, this is not the best time for Google to face more accusations of wrongdoing, especially when the alleged victims are children.
NBC Miami: National teacher’s union promotes literacy, mental health initiative in Miami-Dade: With so much political controversy surrounding education in Florida, when the president of the most prominent national teacher’s union comes to town from Washington, it’s natural to expect her focus would be on the mandates from Tallahassee. Not this time. American Federation of Teachers president Randi Weingarten came to Miami to talk about proposals to boost learning in the classroom and to support teachers and families. “Today has nothing to do about politics, it has everything to do with lifting up the Miami-Dade schools, which around the country are known for the kind of public school choice they have done for kids and our communities,” Weingarten said. She and United Teachers of Dade president Karla Hernandez-Mats toured Miami Jackson Senior High and some other schools.
The New York Times: YouTube Improperly Used Targeted Ads on Children’s Videos, Watchdogs Say: After a research report last week found that YouTube’s advertising practices had the potential to undercut the privacy of children watching children’s videos, the company said it limited the collection of viewer data and did not serve targeted ads on such videos. These types of personalized ads, which use data to tailor marketing to users’ online activities and interests, can be effective for finding the right consumers. Under a federal privacy law, however, children’s online services must obtain parental consent before collecting personal information from users under 13 to target them with ads — a commitment YouTube extended to anyone watching a children’s video.
WIRED: How to Talk to Your Kids About Social Media and Mental Health: If you give a kid a smartphone, they’re going to want a social media account. That’s not the start of a storybook. The average age for a kid getting their first smartphone is 10.3. Within a year, a child has likely made four or five social media accounts; by the age of 12, 90 percent of kids are already on social media, according to research by Linda Charmaraman, a senior research scientist who runs the Youth Media and Well-Being Research Lab at Wellesley College.
DC News Now: Changes made following viral post about social media app’s safety features: With the back-to-school season around the corner, some parents are focused on making sure their children have safe online tools to succeed. A new app, Saturn, helps students to easily track their class schedules and to connect with other students. Although the app may sound helpful, some parents are worried about its privacy and safety. Chris Cullum said his daughter asked to sign up for the app, which is why he signed up around Aug. 9 to see how the app works. But after using it, he said he had some concerns about the app’s safety features. “I was able to make a profile using just a number,” Cullum said. He later made a social media post highlighting his concerns, which went viral in Arkansas and parts of the DMV.
The Washington Post: YouTube faces fresh complaint over its children’s privacy practices: Children’s privacy advocates are urging federal regulators to consider issuing a massive fine “upwards of tens of billions of dollars” and imposing sweeping privacy limits on Google-owned YouTube over reports that it may have let companies track kids’ data across the internet. Ad tracking firm Adalytics last week released a report suggesting that YouTube served ads for adults on videos labeled as “made for kids,” stoking concern that the video-sharing giant may be trampling on federal privacy protections for children, as the New York Times first reported. In response, Sens. Edward J. Markey (D-Mass.) and Marsha Blackburn (R-Tenn.) called on the Federal Trade Commission to investigate the matter, writing that the purported tactics may have “impacted hundreds of thousands, to potentially millions, of children across the United States.”
Chicago Sun-Times: Opinion: Protect children from dangers of the internet with Kids Online Safety Act: In July, the Kids Online Safety Act drew one step closer to becoming law when a Senate committee advanced the bill. However, despite bipartisan support and the fact that most people in the U.S. agree that kids need internet safeguards, the bill faces hurdles. Some are trying to politicize the bill by claiming that it will be used by anti-LGBTQ+ groups to censor content under the guise of preventing depression, anxiety and eating disorders in children. Others claim it is a threat to free speech. But since the bill made it to the floor of Congress last year and was dropped because of criticism, some language was changed and LGBTQ+ advocacy groups that initially opposed it, like the Gay & Lesbian Alliance Against Defamation and the Human Rights Campaign, wisely dropped their opposition.
York Daily Record: My bill would protect PA child social media influencers from exploitive parents: Pennsylvania’s Child Labor Law exists to protect children, their labor, and their earnings from being exploited and I believe that the protections of our Child Labor Law should apply to child influencers on social media. I am introducing legislation to update our Child Labor Laws to do just that. Americans are well accustomed to childhood celebrities, and we are equally aware of the many stories of children whose families have become broken and their futures made difficult because the people who should have been looking out for them were exploiting them.
KDKA: Pa. lawmaker introduces new bill to regulate social media influencers under child labor laws: A Pennsylvania lawmaker is planning to introduce a bill to regulate social media child influencers and celebrities under Pennsylvania’s child labor laws. Representative Torren Ecker says it would protect kids who earn money by creating content or whose names or photographs in a parent’s content generate income. He says that child influencers make more than $50 million.
Philly Voice: Pa. lawmaker proposes protecting young social media influencers under state’s child labor laws: Pennsylvania could soon regulate money earned by child influencers and celebrities from their or their parents’ social media content under a bill set to be introduced this fall. The bill aims to protect children whose photographs, likenesses or names are used to make money through social media under Pennsylvania’s existing child labor laws. It will be introduced by Rep. Torren Ecker, a Republican serving portions of Adams and Cumberland counties, when the state House reconvenes in September.
WGAL: Pennsylvania bill aims to protect children creating social media content from being exploited: Content creators and social media stars can make millions of dollars, though Pennsylvania lawmakers are drafting legislation to ensure that children who are the focus of those videos aren’t being cut out of compensation and exploited. A sponsorship memo for legislation being drafted by Rep. Torren Ecker said his bill “will protect children who earn money as influencers and content-makers, or whose likeness, name or photograph is substantially featured in a parent or guardian’s content that generates income.”
Fox43: Pa. lawmaker calls for regulation on child influencers on social media: The youngest stars of social media could get additional protection and pay under a proposed change to Pennsylvania’s child labor law. Republican representative Torren Ecker, who represents Cumberland County, proposed a bill that would regulate child influencers and celebrities on social media. “We have lots of child labor laws to protect children from working long hours in factories,” said Representative Ecker. “I think this is modernizing for another way children can be exploited.”
North Central PA: Legislators consider bill to regulate child social media influencers: Pennsylvania legislators are discussing a bill that would put regulations on child influencers and celebrities on social media. State Representative Torren Ecker (R – Adams/Cumberland) will soon introduce a bill to the state legislature, his office announced in August. “We always hear about the devastating later-life impact that childhood celebrity and wealth can have on those who experience fame early in life. Now, every parent or relative with a cellphone can work to make their children or relatives into social media celebrities that, without their consent, can deprive children of privacy, income from their work, and fair working conditions within the scope of current law,” Rep. Ecker said.
The Conversation: The Youth Mental Health Crisis Worsens amid a Shortage of Professional Help Providers: The hospital where I practice recently admitted a 14-year-old girl with post-traumatic stress disorder, or PTSD, to our outpatient program. She was referred to us six months earlier, in October 2022, but at the time we were at capacity. Although we tried to refer her to several other hospitals, they too were full. During that six-month wait, she attempted suicide. Unfortunately, this is an all-too-common story for young people with mental health issues. A 2021 survey of 88 children’s hospitals reported that they admit, on average, four teens per day to inpatient programs. At many of these hospitals, more children await help, but there are simply not enough services or psychiatric beds for them.
Franklin County Press: Protecting Young Minds: Navigating the Social Media Landscape for Children’s Safety: With screen time often intertwined with social interactions, parents are faced with the daunting task of ensuring their children’s safety on social media. Social media platforms offer myriad opportunities for connection, entertainment, and education, but they also present a minefield of potential risks for young users. From cyberbullying to exposure to inappropriate content and even the risk of contact with strangers, the dangers are manifold and ever-present. A recent survey from the Pew Research Center found that 95% of teens have access to a smartphone, and a whopping 45% claim to be online almost constantly. With this increased online presence, there’s a higher likelihood of encountering potential threats.
WTVO: Doctor’s visit could keep children safe on social media: study: A new study showed that a visit to the doctor might be able to keep children safe on social media. Researchers found that doctors having a five-minute conversation with their youth patients about social media safety resulted in more kids having follow-up conversations with their parents. That led to kids checking or changing their privacy settings. The study also found that most pediatricians were not learning how to talk to patients about social media.
Pittsburgh Post-Gazette: Editorial: Pa. needs to protect children from internet pornography: Nobody — or at least nobody who deserves to be taken seriously — thinks children should see pornography. And yet that’s exactly what’s happening, and at alarming rates: Studies in the United States, as well as in France, Australia and elsewhere, show that the average age at which young people first encounter porn is 11 — and some say it’s even earlier. The question is what to do about it. Pennsylvania should join several other states — states with both Republican and Democratic leadership — to require that internet pornography companies verify the age of those who access their content. Ultimately, the U.S. should join other nations in adopting a uniform national strategy for youth online safety.
Fox 43: York and Adams County coaches, parents to learn about student athlete mental health: The York Adams Interscholastic Athletic Association (YAIAA) and several other organizations will host a symposium next weekend to help parents and coaches better understand student athletes’ mental health. Organizers say the gathering aims to bring awareness, support, education and resources to students, coaches and the community. “The idea of what we’re trying to do through mental health and athletics is create a culture and climate on and off the field through coaches and parents where kids really feel safe and supported to be who they are, and if they aren’t doing okay that they have a space to go to be able to ask for this help that they might need,” Miranda Jenkins, a social worker and the head coach of men’s and women’s swimming at York Suburban, said.
WHTM: Pennsylvania lawmaker introduces bill to regulate child influencers: Representative Torren Ecker (R-Adams/Cumberland) announced today that he will introduce legislation to regulate social media child influencers and celebrities under Pennsylvania’s Child Labor Laws. Rep. Ecker says his legislation would protect children who earn money by creating content and/or whose name, likeness, or photograph is featured in a parent or guardian’s content that generates income for the parent under Pennsylvania’s Child Labor Law, according to a memo released Thursday.
CNN: Illinois passes a law that requires parents to compensate child influencers: When 16-year-old Shreya Nallamothu from Normal, Illinois, scrolled through social media platforms to pass time during the pandemic, she became increasingly frustrated with the number of children she saw featured in family vlogs. She recalled the many home videos her parents filmed of herself and her sister over the years: taking their first steps, going to school and other “embarrassing stuff.” “I’m so glad those videos stayed in the family,” she said. “It made me realize family vlogging is putting very private and intimate moments onto the internet.”
Psychology Today: When Mental Health Info Is Obtained Via Social Media: These days almost everyone goes online to look up health information. Googling medical questions and concerns has become a part of everyday life for many of us as the Internet has become an extremely easy way to search for a doctor of any specialty, book appointments and expand one’s knowledge. Over the past several years, however, the online landscape has evolved quite dramatically with the advent of social media. In fact, much like Google, social media has also become an increasingly important source of mental health-related information, especially for teenagers and young adults.
ABC: Illinois becomes 1st state to regulate kid influencers: What to know about the law: During the coronavirus pandemic, Shreya Nallamothu, now 16, said she, like so many others, began to spend more time on social media, where she saw countless parents who documented their own lives and their kids’ lives on different platforms. “The more I fell down that rabbit hole, I kept seeing cases of exploitation,” Shreya, a high school junior from Normal, Illinois, told “Good Morning America,” adding that she specifically was struck by seeing kids who she thought were not old enough to know the full ramifications of their online presence.
Mashable: Child influencers in Illinois can now sue their parents: Illinois is the first state in America to pass a law protecting child influencers and social media stars, making sure they are paid for appearing in videos posted to monetized online platforms like TikTok and YouTube. And if they’re not paid, they can sue. The bill, SB1782, was passed unanimously through the Senate in March, after being introduced by Democratic Sen. David Koehler, and was signed into law on Friday. It will go into effect July 1, 2024. The new law will amend the state’s Child Labor Law to ensure monetary compensation for influencers and social media personalities under 16, so those children who appear in online content will be entitled to a percentage of earnings. To qualify, the videos must be filmed in Illinois and “at least 30 percent of the vlogger’s compensated video content produced within a 30-day period included the likeness, name, or photograph of the minor.”
The Wall Street Journal: Opinion: The Constitution Protects ‘Harmful’ Speech: The Senate is considering a bill that poses serious risks to free speech. The Senate Commerce Committee recently advanced the Kids Online Safety Act by unanimous vote. It would empower government officials—state attorneys general and the Federal Trade Commission—to challenge social-media companies when they fail to prevent “harm to minors.” Invigorated with greater statutory authority, the already aggressive enforcement agencies would have the means to deem any speech unlawful and limit it under the guise of promoting child safety. According to the text of KOSA, a state attorney general could bring a civil lawsuit against a platform if it doesn’t take down content that falls under the bill’s definition of harmful. For instance, a state could sue Instagram for violating the act’s duty of care if it doesn’t take down posts that make a child feel more anxious.
The New York Times: Opinion: Teens Don’t Really Understand That the World Can See What They Do Online, but I Do: When Matthew McConaughey and his wife, Camila Alves McConaughey, took to Instagram to jointly announce a new venture this summer, you might have expected it to be an upcoming film or a fledgling lifestyle brand. Their news was more unusual: the unveiling of an official Instagram account for their son Levi, which they were giving to him on his 15th birthday, long after many of his friends had signed up, they noted. Celebrities have taken a wide array of approaches to granting their children access to social media — and thereby granting the public access to their children. Apple Martin, the daughter of Gwyneth Paltrow and Chris Martin, has always kept her Instagram private and once shamed her mother for publicly sharing a photo of her without her consent. DJ Khaled’s son has been on Instagram since shortly after birth. Whatever the approach, we’ve seen how easily personal revelations, flippant comments and family drama become fodder for public scrutiny and ridicule.
New York Times: Amid Sextortion’s Rise, Computer Scientists Tap A.I. to Identify Risky Apps: Almost weekly, Brian Levine, a computer scientist at the University of Massachusetts Amherst, is asked the same question by his 14-year-old daughter: Can I download this app? Mr. Levine responds by scanning hundreds of customer reviews in the App Store for allegations of harassment or child sexual abuse. The manual and arbitrary process has made him wonder why more resources aren’t available to help parents make quick decisions about apps. Over the past two years, Mr. Levine has sought to help parents by designing a computational model that assesses customers’ reviews of social apps. Using artificial intelligence to evaluate the context of reviews with words such as “child porn” or “pedo,” he and a team of researchers have built a searchable website called the App Danger Project, which provides clear guidance on the safety of social networking apps.
Los Angeles Times: California lawmakers want to make social media safer for young people. Can they finally succeed?: Samuel Chapman had no idea that drug dealers targeted teens on Snapchat until his 16-year-old son died from a fentanyl overdose. “We thought it was like a playground for kids and didn’t think of it, as I do now, as the dark web for kids,” the Los Angeles resident said. In 2021, a drug dealer reached out to his son, Sammy, on the disappearing messaging app and showed the teen a “colorful drug menu” that offered cocaine, Chapman said. After he and his wife fell asleep, the dealer delivered drugs to their house “like a pizza.” Sammy unknowingly took fentanyl and died in his bedroom. For parents like Chapman, the horrific ordeal underscored social media’s dangerous side. Tech platforms help people keep in touch with family and friends, but they also attract drug dealers, pedophiles and other predators. Plus, social media algorithms can steer young people to posts that could trigger eating disorders or self-harm.
AP: Georgia kids would need parental permission to join social media if Senate Republicans get their way: Georgia could join other states requiring children to have their parents’ explicit permission to create social media accounts. Two top Republicans in the Georgia state Senate — Lt. Gov. Burt Jones and Sen. Jason Anavitarte of Dallas — said in a Monday news conference they will seek to pass such a law in 2024. The proposal could also restrict accounts on other online services. “It’s important that we empower parents,” Anavitarte said. “A lot of parents don’t know how to restrict content.”
Fox 13: Social media can increase risks of mental health problems among students: U.S. Congresswoman Kathy Castor held a roundtable discussion in Tampa Thursday, with the goal of urging parents to add something new to their back-to-school checklists: safety guardrails for mobile devices. Castor was joined by members of the Hillsborough Classroom Teachers Association, Hillsborough PTSA President Ami Marie Grainger Welch and several students. “Set these guardrails. Have the conversation about what it means online for you to be on your phone for too long,” Castor said during a news conference following the roundtable. “The big tech platforms want to keep you addicted. They want your eyeballs constantly scrolling because they’re also targeting you with advertisements.” Castor believes social media and increased internet usage present a significant risk to the mental health and well-being of students across the country.
The Times-Tribune: Local schools sue social media giants: Four area school districts are among a growing number of districts nationwide that are suing several social media giants, alleging the companies have helped fuel a mental health crisis among youth that is disrupting education and costing taxpayers money. The suits, filed by the North Pocono, Hazleton Area, Hanover Area and Crestwood school districts, allege the owners of Facebook, Instagram, TikTok, YouTube and Snapchat know their sites are highly addictive and harmful to youth. They’ve refused to implement safety measures, however, so they can continue to reap massive profits. Those profits have come at the expense of schools, which are left to deal with behavioral issues, including anxiety, depression and other mental health issues tied to excessive use of the platforms, said Joseph Cappelli, a Montgomery County attorney who represents the districts.
Washington Examiner: Controversial legislation to protect children on social media advances in Senate: Two controversial bills that would expand teenagers’ rights to privacy and limit Big Tech’s ability to collect data from underage users advanced to the Senate floor Thursday. The Senate Committee on Commerce, Science, and Transportation voted to approve two bills to implement safeguards to protect children and teenagers online. It approved Sen. Marsha Blackburn (R-TN) and Richard Blumenthal’s (D-CT) Kids Online Safety Act, which would require platforms to take steps to prevent a defined set of harms to minors as well as implement controls for users that allow parents to limit screen time, restrict addictive features, and determine who gets access to their teenager’s user profile.
The Economist: Regulation could disrupt the booming “kidfluencer” business: It started with a Lego “choo-choo train”. The video shows three-year-old Ryan Kaji picking it out from the store “because I like it”, he tells his mother, Loann. Back at the family home in Houston, Texas, the toddler opens the box and plays with his new toy. It’s nothing out of the ordinary. But it helped make the Kajis millionaires. Loann had recorded and uploaded the video to a new YouTube channel, “Ryan ToysReview”. Eight years, many toy unboxings and 35m subscribers later, “Ryan’s World”, as the channel is now known, is considered YouTube royalty. He is part of a new generation of child social-media influencers (those under the age of 18) changing the shape of kids’ entertainment in America—and making a lot of money in the process.
CNN: Elizabeth Warren and Lindsey Graham want a new agency to regulate tech: Two US senators are calling for the creation of a new federal agency to regulate tech companies such as Amazon, Google and Meta, in the latest push by members of Congress to clamp down on Big Tech. Under the proposal released Thursday by Sen. Elizabeth Warren, a Massachusetts Democrat, and Sen. Lindsey Graham, a South Carolina Republican, Congress would establish a new regulatory body with the power to sue platforms — or even force them to stop operating — in response to various potential harms to customers, rivals and the general public, including anticompetitive practices, violations of consumer privacy and the spread of harmful online content.
Philly Voice: Reducing social media usage by just 15 minutes a day improves one’s well-being, research suggests: There are many paths toward living a healthier life, but here’s one simple place to start: Put your phone down. By spending less time on social media in particular, recent research suggests, we can improve our overall health and well-being. Just a 15-minute reduction in social media usage per day can have a positive impact on health and social well-being, according to a study published in the Journal of Technology in Behavioral Science.
Casino.org: Online Casinos, Roblox, and Children Linked in Report: There are reportedly some online casinos that allow players to use Robux, the in-game currency of the video game Roblox, and they also don’t make an effort to check the age of the users. The gambling platforms are also going a step further, allegedly recruiting content creators as young as 14 to attract more teenage gamblers. An article on the Sharpr substack, run and published by Cody Luongo, asserts that Roblox has become a gateway to underground gambling. The report alleges that children are able to bet millions of dollars on the sites.
The Wall Street Journal: Schools Sue Social-Media Platforms Over Alleged Harms to Students: Plaintiffs’ lawyers are pitching school boards throughout the country to file lawsuits against social-media companies on allegations that their apps cause classroom disciplinary problems and mental-health issues, diverting resources from education. Nearly 200 school districts so far have joined the litigation against the parent companies of Facebook, TikTok, Snapchat and YouTube. The suits have been consolidated in the U.S. District Court in Oakland, Calif., along with hundreds of suits by families alleging harms to their children from social media. The lawsuits face a test later this year when a judge is expected to consider a motion by the tech companies to dismiss the cases on grounds that the conduct allegedly causing the harm is protected under the internet liability shield known as Section 230.
The Washington Post: Twitter rival Mastodon rife with child-abuse material, study finds: A new report has found rampant child sexual abuse material on Mastodon, a social media site that has gained popularity in recent months as an alternative to platforms like Twitter and Instagram. Researchers say the findings raise major questions about the effectiveness of safety efforts across so-called “decentralized” platforms, which let users join independently run communities that set their own moderation rules, particularly in dealing with the internet’s most vile content. Researchers reported finding their first piece of content containing child exploitation within about five minutes. They would go on to uncover roughly 2,000 uses of hashtags associated with such material. David Thiel, one of the report’s authors, called it an unprecedented sum. “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said Thiel, referring to a technique used to identify pieces of content with unique digital signatures. Mastodon did not return a request for comment.
K-12 Dive: Scrutiny over TikTok in schools grows: Florida is one of the earliest states to ban TikTok in schools. Montana Gov. Greg Gianforte also signed a TikTok ban in May, but that restricts use across the entire state rather than only in schools. Montana’s law was challenged in court that same month by TikTok creators, a lawsuit that was funded by the social media platform itself. Many other states as well as local districts have taken issue with the company. Louisiana’s Superintendent of Education Cade Brumley, for example, advised all school system leaders in January to remove TikTok from public school devices because of data privacy concerns stemming from the Chinese ownership of the platform. A growing list of districts have also sued TikTok in the past year, many of them citing student mental health concerns. A lawsuit filed by Maryland’s Howard County Public School System in June, for example, said TikTok and other social media are “addictive and dangerous” and have changed the way kids “think, feel, and behave.”
NBC News: A teachers union says it’s fed up with social media’s impact on students: The nation’s second-largest teachers union said Thursday it was losing patience with social media apps that it says are contributing to mental health problems and misbehavior in classrooms nationwide, draining time and money from teachers and school systems. The American Federation of Teachers issued a report with several other organizations, warning that tech companies should rein in their apps before Congress forces them to do so. The federation has 1.7 million members. The report comes at a time of heightened concern about the impact of social media on children and teenagers. In May, the U.S. surgeon general warned that social media use is a main contributor to depression, anxiety and other mental health problems, and more than 100 school districts and government entities have sued the companies behind apps such as TikTok and Instagram because of the associated problems.
CNN: Leading AI companies commit to outside testing of AI systems and other safety commitments: Microsoft, Google and other leading artificial intelligence companies committed Friday to put new AI systems through outside testing before they are publicly released and to clearly label AI-generated content, the White House announced. The pledges are part of a series of voluntary commitments agreed to by the White House and seven leading AI companies – which also include Amazon, Meta, OpenAI, Anthropic and Inflection – aimed at making AI systems and products safer and more trustworthy while Congress and the White House develop more comprehensive regulations to govern the rapidly growing industry. President Joe Biden will meet with top executives from all seven companies at the White House on Friday. White House officials acknowledge that some of the companies have already enacted some of the commitments but argue they will as a whole raise “the standards for safety, security and trust of AI” and will serve as a “bridge to regulation.”
Pittsburgh Post-Gazette: 7 charged with hacking Snapchat accounts to obtain explicit images: Seven people have been indicted on charges they conspired to hack into Snapchat accounts to obtain explicit images and videos depicting account holders, including child sexual abuse material. Six of the defendants are Pennsylvania residents and one is from North Carolina, according to the U.S. Attorney’s Office in Pittsburgh. A federal grand jury in Erie, Pa., produced the indictment on charges of conspiracy to commit wire fraud, fraud in connection with unlawful computer access, aggravated identity theft, and receipt and possession of child sexual abuse material. A news release issued this week said the indictment named: Richard Alan Martz, Jr., 33, of Meadville, Crawford County; Dylan Michael Miller, 30, of West Mifflin; Christopher Clampitt, 33, of Clemmons, N.C.; Edward Grabb, 31, of Jeannette; Michael Yackovich, 27, of West Newton; Luke Robert Swinehart, 22, of Lock Haven, Clinton County; and Karlin Terrell Jones, 26, of Beaver Falls.
NBC News: A friend-finding app offered a ‘safe space’ for teens — sextortion soon followed: A Tinder-like app popular among teenagers and young adults has allegedly been used to extort users by tricking them into sending sexually explicit photos, a problem that internet safety watchdogs say is indicative of the challenges of keeping young people safe on social media. The app, Wizz, allows users to scroll through profiles that show a person’s picture, first name, age, state and zodiac sign. Wizz advertises the app as a “safe space” to meet new friends and allows users as young as 13 to join and connect with users of a similar age. Its basic functionality resembles popular dating apps. When users open the app, they are presented with another person’s profile. They can then choose to send that person a message in the app’s chat function or swipe left to see a new profile.
Fox News: Baby monitor hackers sold nude images of kids on social media: report: Hackers are reportedly gaining access to Hikvision cameras through the company’s mobile app and have used the feeds to sell child pornography on social media. An investigation by IPVM, a surveillance industry trade publication, revealed that some hackers were using the company’s Hik-Connect app to distribute child pornography on Telegram, the publication reported last week. The investigation found several sales offers of nude videos on the platform, including some labeled “cp” (child porn), “kids room,” “family room,” “bedroom of a young girl” and “gynecological office.”
Fox News: Artificial intelligence could help ‘normalize’ child sexual abuse as graphic images erupt online: experts: Artificial intelligence is opening the door to a disturbing trend of people creating realistic images of children in sexual settings, which could increase the number of cases of sex crimes against kids in real life, experts warn. AI platforms that can mimic human conversation or create realistic images exploded in popularity late last year into 2023 following the release of chatbot ChatGPT, which served as a watershed moment for the use of artificial intelligence. As the curiosity of people across the world was piqued by the technology for work or school tasks, others have embraced the platforms for more nefarious purposes.
The New York Times: Opinion: Algorithms Are Making Kids Desperately Unhappy: Kids are even more in the bag of social media companies than we think. So many of them have ceded their online autonomy so fully to their phones that they even balk at the idea of searching the internet — for them, the only acceptable online environment is one customized by big tech algorithms, which feed them customized content. As our children’s free time and imaginations become more and more tightly fused to the social media they consume, we need to understand that unregulated access to the internet comes at a cost. Something similar is happening for adults, too. With the advent of A.I., a spiritual loss awaits us as we outsource countless human rituals — exploration and trial and error — to machines. But it isn’t too late to change this story.
Consumer Affairs: Meta’s updated parental controls give parents an inside look at who their kids are messaging: As lawmakers and government officials get more serious about kids’ and teens’ social media use, Meta, the home of Facebook and Instagram, is following suit. The company announced several new features that give parents more control over their kids’ social media use. While parents won’t be able to see the specifics of their children’s messages, they will be able to get a better idea of how their child uses these social media apps, including how much time is spent on them.
Beaver County Times: Beaver Falls man indicted in federal identity theft, child porn charges: A city man was indicted by a federal grand jury in Erie Tuesday for his involvement in a criminal ring accused of wire fraud conspiracy and possession of child sex abuse materials. According to the Department of Justice, charges were filed against six residents of Pennsylvania and one resident of North Carolina after the seven defendants allegedly conspired to hack into Snapchat accounts to obtain explicit images and videos of victims. Among those charged in the federal investigation was 26-year-old Karlin Terrell Jones, of Beaver Falls, who authorities said worked with the group to share these images with others online. “As alleged, the defendants used deception and hacking techniques to unlawfully access social media accounts so that they could steal, hoard, and trade explicit and otherwise private content of hundreds of unsuspecting victims,” said U.S. Attorney Eric Olshan.
CNN: Opinion: Mark Zuckerberg’s family photo raises this crucial question: Sandwiched between a Jiu-Jitsu video and the Threads announcement, Mark Zuckerberg’s Instagram profile recently featured a casual Independence Day snapshot of him and his family. Well, most of it — emojis obscure the faces of his 5- and 7-year-old daughters. This prompted social media comments accusing Zuckerberg of hypocrisy, given the constant outcry over his company Meta’s privacy practices. Yes, it is deeply ironic that Zuckerberg, whose platforms fine-tuned a business model that earns him enormous revenues by extracting our data, wants to limit where some of his data goes. But two things are important to note here: This decision is more about his children than him, and covering their faces with emojis is more about reducing their visibility to audiences than about preventing platforms from extracting their data.
WTAJ: Central Pa. schools file lawsuits against social media companies: Four local school districts have all filed lawsuits against big-name social media companies. Altoona, Bellwood-Antis, Ferndale and Tyrone Area school districts have all submitted their own federal lawsuits against social media companies Meta, which owns Facebook and Instagram; Google, which owns YouTube; ByteDance, which owns TikTok; and Snap Inc., which owns Snapchat. The area schools filed the separate lawsuits on Wednesday, July 12 in U.S. District Court, Western District of Pennsylvania. Their suits accuse the companies of targeting children, who the schools contend “are uniquely susceptible to harm from defendants’ products,” and allege that these companies designed their products to “attract and addict youth.”
The Philadelphia Tribune: Pew Charitable Trusts Gives $6.55 million to children’s mental health providers: In 2021, suicide was the third leading cause of death in the U.S. for high school students aged 14 to 18, according to the Centers for Disease Control and Prevention. In the last few years, the rates of anxiety and depression among young people have increased significantly as a result of the lingering effects of the pandemic, rising gun violence and drug overdose deaths. U.S. Surgeon General Dr. Vivek Murthy declared a mental health crisis for young people in 2021. In this environment, the Pew Charitable Trusts said Thursday that it has awarded $6.55 million to five nonprofit groups seeking to make mental health services more accessible to children and teens in underserved communities in the Philadelphia area.
The Philadelphia Inquirer: Five Philadelphia nonprofits are receiving $6.55 million in Pew grants to expand youth and child mental health services: Five Philadelphia organizations working on child and youth mental health will receive a combined $6.55 million in grants from the Pew Charitable Trusts to expand access to services, the national nonprofit announced Thursday. The Pew grants aim to create more treatment options, expand the geographic reach of services within the city, and train providers in this highly specialized care, said Kristin Romens, project director of Pew’s Fund for Health and Human Services. Pew chose the organizations for their expertise and work within the community. “All of them take partnering with their clients and community very seriously,” Romens said.
NPR: So your tween wants a smartphone? Read this first: Your tween wants a smartphone very badly. So badly that it physically hurts. And they’re giving you soooo many reasons why. They’re going to middle school … they need it to collaborate with peers on school projects … they need it to tell you where they are … when they’ll be home … when the school bus is late. It’ll help you, dear parent, they vow. Plus, all their friends have one, and they feel left out. Come on! Pleeeeeeze. Before you click “place order” on that smartphone, pause and consider a few insights from a person who makes a living helping parents and tweens navigate the murky waters of smartphones and social media. Emily Cherkin spent more than a decade as a middle school teacher during the early aughts. She watched firsthand as the presence of smartphones transformed life for middle schoolers. For the past four years, she’s been working as a screen-time consultant, coaching parents about digital technology.
CNBC: Discord does about-face on parental controls for teen social media use: Discord has introduced parental controls similar to those adopted by prominent social media platforms including Instagram, TikTok, Snapchat, and YouTube. In the past, Discord’s philosophy has rejected this concept, stressing a focus on user needs, not the needs of their parents. The release of these parental controls comes amid greater scrutiny of teen social media use and mental health issues, and follows Discord’s acquisition of Gas, a social media app focused on giving teenagers a platform to compliment each other.
Insider: Shyla Walker spent years turning her child into a YouTube star. Now, she says she regrets putting her daughter online and is cutting ties with the controversial world of family vlogging: Shyla Walker and her boyfriend, Landon McBroom, started a couple’s channel on YouTube to document their lives and romantic memories. Walker, now 25, said she was “naive” about the internet at the time, as she rarely used social media platforms outside of posting on the channel. But she had connections to the family-vlogging world through Landon, whose brother Austin McBroom runs The Ace Family YouTube channel, a controversial family channel with more than 18 million subscribers. Austin and Catherine McBroom regularly involve their three kids — Elle, Alaia, and Steel — in their YouTube videos, and also manage Instagram accounts on behalf of the children, who are 7, 4, and 3 years old. Like many family vloggers, they have been accused by other YouTubers of exploiting their children’s lives for content over the years. (The McBrooms did not respond to a request for comment regarding this accusation.)
The Dallas Morning News: LTE: Social media is hurting kids. Why are parents alone in the fight?: Danger lurks between grinning selfies, influencer travelogues and silly memes. Some social media threats, like cyberbullying, are obvious. Others are more subtle: the barrage of doctored photos that affect our body image, the quacks who craftily disguise fake information as fact, the social contagion that distorts decision making. Research shows social media can rewire our kids’ brains, and yet our government leaders haven’t established robust safeguards for these platforms the way they have with toys, cars and drugs. That is one of the main messages of the U.S. surgeon general’s social media and youth mental health advisory, released last month. Dr. Vivek Murthy’s 25-page document — part advice, part warning — is a must-read for everyone, but especially for lawmakers who alone have the power to force the tech industry to protect children. “While nearly all parents believe they have a responsibility to protect their children from inappropriate content online, the entire burden of mitigating the risk of harm of social media cannot be placed on the shoulders of children and parents,” Murthy wrote.
Los Angeles Times: Opinion: Smartphones take a toll on teenagers. What choice do parents have?: We can’t keep ignoring social media’s harmful effects on the mental health of young people. Across the world, regardless of skin color or language, people are suffering from mental health problems that are linked to the age at which they got their first smartphone or tablet, according to a new report from Sapien Labs. The nonprofit organization, which has a database of more than a million people in dozens of countries, found that the younger that people were when they got their first smartphone or tablet, the more likely they were to have mental health challenges as adults, including suicidal thoughts, a sense of being detached from reality and feelings of aggression toward others. The effects were most pronounced among girls, who spend more time on social media than boys do. The harm of the devices seems to be rooted in the 24/7 access they provide to social media. The longer that parents wait to give children portable digital devices, the better. Respondents who got their first smartphones or tablets in their later teens had a much stronger sense of self and ability to relate to others.
CNN: Opinion: Dr. Sanjay Gupta: Parenting in the era of ubiquitous screens and social media: A growing number of states are turning the screws on Big Tech, the internet and social media. On Wednesday, Montana became the first state to completely ban TikTok, although many are skeptical that the controversial new legislation will be enforceable. Other moves include laws that aim to tighten regulations on social media platforms in general, like those recently enacted by Arkansas and Utah. There are three worthwhile goals that appear to be at least part of the motivation behind legal maneuvers like these: preventing companies from collecting data on us and our children, protecting kids online and balancing your rights with your responsibilities when you post content to online platforms. For example, if a platform hosts content that leads to someone being harmed, can it then also be held responsible? So far, the answer has been no, according to a recent US Supreme Court decision. For me, though, the discussions around smartphones and social media are very personal. As a dad of three teenage girls, I am often left wondering about the impact of so much screen time on their brains.
CNN: Teens should be trained before entering the world of social media, APA says: The American Psychological Association is calling for teens to undergo training before they enter the sometimes fun but sometimes fraught world of social media, according to new recommendations released Tuesday. “Social media is neither inherently harmful nor beneficial to our youth,” said Dr. Thema Bryant, the APA’s president. “Just as we require young people to be trained in order to get a driver’s license, our youth need instruction in the safe and healthy use of social media.” Bryant assembled an advisory panel to review the scientific literature on social media use and formulate recommendations for healthy adolescent use, according to an APA news release. The American Psychological Association Health Advisory on Social Media Use in Adolescence released 10 recommendations to guide educators, parents, policymakers, mental health and health practitioners, technology companies and adolescents.
The Greylock Glass: Children’s Advocates Applaud Kids Online Safety Act: Federal legislation aimed at protecting children and teens online has gained the support of leading advocates for children’s health and privacy. The Kids Online Safety Act would make online platforms and digital providers abide by a “duty of care” requiring them to eliminate or mitigate the impact of harmful content. Kris Perry, executive director of Children and Screens, the Institute of Digital Media and Child Development, said parents would have more tools to control how their children interact with the platforms. “Limit screen time, or limit autoplay, or limit the endless scrolls so that the products become safer for their children,” Perry recommended. Perry pointed out researchers believe if the negative features can be reduced, the troubling trend of adolescents comparing their lives to others could decline, while allowing for greater social connections to be made. Some critics of the bill have said it could pressure platforms to “over-moderate,” as various states deliberate what kinds of material are considered appropriate for children.
ABC News: Bipartisan pair of lawmakers push to protect children online: Sens. Richard Blumenthal and Marsha Blackburn introduced bipartisan legislation Tuesday focused on protecting children online and holding social media companies accountable as cries mount for improved safety features. The legislation would mandate independent annual audits to assess risks to minors, require social media companies to have more options for minors to protect their information and disable certain features, provide more parental controls and give academic and public interest organizations access to datasets to foster research. The Kids Online Safety Act of 2023 builds on the 117th Congress’ version by delineating important definitions and guidelines to better concentrate on immediate hazards to children. The legislation focuses on specific dangers online, including the promotion of suicide, eating disorders, substance abuse and sexual exploitation.
The Washington Post: Big Tech-funded groups try to kill bills to protect children online: At a March meeting in Annapolis, Md., that state lawmakers held to discuss proposals for new safety and privacy protections for children online, one local resident made a personal plea urging officials to reject the measure. “I’m going to talk to you as a lifelong Maryland resident, parent, [husband] of a child therapist,” Carl Szabo told the Maryland Senate Finance Committee, according to footage of the proceedings. “Typically I’m a pretty cool customer, but this bill, I’m really nervous, because this comes into effect, this will really harm my family. This will really harm my kids’ ability to be online.” What Szabo didn’t initially disclose in his two-minute testimony to the panel: He is vice president and general counsel for NetChoice, a tech trade association that receives funding from tech giants including Amazon, Google and Facebook parent company Meta. NetChoice has vocally opposed the measure and already sued to block a similar law in California.
Bloomberg: Instagram, Google See Surge in Reports of Online Child Abuse: Reports of child exploitation online increased at many of the biggest tech and social media firms over the last year, including Meta Platforms Inc.’s Instagram and Alphabet Inc.’s Google. TikTok, Amazon.com Inc.’s Twitch, Reddit Inc., and the chat apps Omegle and Discord Inc. also saw increases, according to a Tuesday report from the National Center for Missing and Exploited Children. The US child safety agency received over 32 million reports involving online enticement, child sexual abuse material and child sex trafficking in 2022 — some 2.7 million more than the year before. While child sexual abuse material, or CSAM, was the largest category, there was an 82% increase in reports regarding online enticement. The center partially attributes the increase to financial “sextortion,” which involves targeting kids to share explicit photographs and blackmailing them for money.
CNN Business: Pornhub blocks access in Utah over age verification law: Some of the internet’s biggest adult websites, including Pornhub, are now blocking access to Utah users over a new age verification law that takes effect on Wednesday. Pornhub and other adult sites controlled by its parent, MindGeek, began blocking visitors with Utah-based IP addresses this week. Now, instead of seeing adult content when visiting those sites, affected users are shown a message expressing opposition to SB287, the Utah law signed by Gov. Spencer Cox in March that creates liability for porn sites that make their content available to people below the age of 18. “As you may know, your elected officials in Utah are requiring us to verify your age before allowing you access to our website,” the message said. “While safety and compliance are at the forefront of our mission, giving your ID card every time you want to visit an adult platform is not the most effective solution for protecting our users, and in fact, will put children and your privacy at risk.”
Yahoo News: Online predators target children’s webcams, study finds: There has been a tenfold increase in sexual abuse imagery created with webcams and other recording devices worldwide since 2019, according to the Internet Watch Foundation. Social media sites and chatrooms are the most common methods used to facilitate contact with kids, and abuse occurs both online and offline. Increasingly, predators are using advances in technology to engage in technology-facilitated sexual abuse. Once having gained access to a child’s webcam, a predator can use it to record, produce and distribute child pornography. We are criminologists who study cybercrime and cybersecurity. Our current research examines the methods online predators use to compromise children’s webcams. To do this, we posed online as children to observe active online predators in action.
The Tribune-Democrat: Area school districts join lawsuit against social media companies: As local school districts join a nationwide lawsuit against some of the largest social media companies, the educational leaders aim to bring awareness to the negative effects these apps and sites have on teenagers and children and hold the businesses accountable. “We’re alleging the public nuisance legal theory, which allows government entities to hold companies liable for unique damages caused by a company’s conduct,” said Ronald Repak, partner at Dillon McCandless King Coulter and Graham, LLP. He and the firm represent nearly 30 regional school districts and have encouraged each to join the suit against Facebook, Instagram, TikTok, Snapchat and similar companies. Indiana Area School Board was one of the first, locally, to sign on, followed by Windber Area, Penn Cambria and Blacklick Valley.
Education Week: Federal and State Lawmakers Want to Regulate Young Social Media Users. Will It Work?: A rising number of state and federal lawmakers are crafting legislation that would restrict young kids’ access to social media and institute other protections for young social media users—all in the name of improving mental health. But some policy experts worry that the bills—which are generating bipartisan support—will be difficult to enforce and may have unintended consequences. “This is all new territory for Congress: how do you protect the First Amendment? How do you keep kids’ autonomy online?” said Allison Ivie, the government relations representative for the Eating Disorders Coalition for Research, Policy and Action, which has been tracking this issue closely. She was referring to a bill recently filed in the U.S. Senate. “There is a level of frustration in this country when we see these levels of mental health problems skyrocketing, and people want a quick fix.” Many lawmakers, who are parents and grandparents, are seeing this problem play out in their homes, said Ivie. And she suspects there was an expectation from a lot of adults that kids’ mental health issues would dissipate once they were back to learning full time in-person again.
WFSB: Lawmakers consider proposal to prohibit children under 13 from using social media: Lawmakers on Wednesday will reveal more information about a bipartisan bill to protect children from the harmful impacts of social media. Parents Channel 3 spoke with have said they’ve seen first-hand the negative impacts of social media platforms on children. Studies have linked social media usage to the youth mental health epidemic. U.S. lawmakers said they identified areas of concern and wanted to ensure kids’ mental health and overall safety. One of the lawmakers who backs the bill is Democratic Sen. Chris Murphy. He and the three other lawmakers said they support the bill because they have young children of their own.
MIT Technology Review: Why child safety bills are popping up all over the US: Bills ostensibly aimed at making the internet safer for children and teens have been popping up all over the United States recently. Dozens of bills in states including Utah, Arkansas, Texas, Maryland, Connecticut, and New York have been introduced in the last few months. They are at least partly a response to concerns, especially among parents, over the potentially negative impact of social media on kids’ mental health. However, the content of these bills varies drastically from state to state. While some aim to protect privacy, others risk eroding it. Some could have a chilling effect on free speech online. There’s a decent chance that many of the measures will face legal challenges, and some aren’t necessarily even enforceable. And altogether, these bills will further fragment an already highly fractured regulatory landscape across the US.
Tribune Chronicle: Opinion: Child deaths prove social media is no game: I recall being dared by my friends, as a child, to put my hand on the electric fence surrounding a cow field just down the rural country lane from the southwestern Pennsylvania home where I grew up. I accepted the dare, and we all laughed out loud as the electric volt zapped through my body. It was, well, shocking, but since I’m still here to write about it, I guess it ended OK. I’m pretty sure reading this will be the first time my mother learns of my ridiculous childhood stunt. Sadly, when Jacob Stevens’ parents learned about the stupid teen challenge their son recently took in the company of his buddies, he was already having a seizure. The 13-year-old boy from Columbus never woke up. The boy had been participating in TikTok’s “Benadryl Challenge.”
CNN Business: Meta opens up its Horizon Worlds VR app to teens for the first time, prompting outcries from US lawmakers: Meta is forging ahead with plans to let teenagers onto its virtual reality app, Horizon Worlds, despite objections from lawmakers and civil society groups that the technology could have unintended consequences for mental health. On Tuesday, the social media giant said children as young as 13 in Canada and the United States will gain access to Horizon Worlds for the first time in the coming weeks. The app, which is already available to users above the age of 17, represents Meta CEO Mark Zuckerberg’s vision for a next-generation internet, where users can physically interact with each other in virtual spaces resembling real life. “Now, teens will be able to explore immersive worlds, play games like Arena Clash and Giant Mini Paddle Golf, enjoy concerts and live comedy events, connect with others from around the world, and express themselves as they create their own virtual experiences,” Meta said in a blog post.
Florida Politics: Legislature unanimously passes bill restricting social media, student phone use in school: A bill banning TikTok, Snapchat, Twitter and other social media platforms on public school devices and requiring schools to teach kids about the perils of the internet is now primed for Gov. Ron DeSantis’ signature. The Senate voted 39-0 in favor of the measure (HB 379), which by July 1 will mandate public school districts to block access to social media on school-provided Wi-Fi and adopt a safety policy that addresses access to the internet by minors. Students would still be able to access social media sites using their own phones, tablets, laptops and mobile plans; however, the bill prohibits using devices during class time unless it’s for educational purposes as directed by a teacher. The bill also directs the Department of Education to develop new curricula on social media safety for grades 6-12 on its social, emotional and physical effects, as well as its dangers, and make the materials available to the public and parents.
CNN Wire: Parents decide their children’s online usage as US lawmakers debate over TikTok: In the future, when teenagers want to sign up for an account on Facebook or Instagram, they may first need to ask their parent or guardian to give their consent to the social media companies. That, at least, is the vision emerging from a growing number of states introducing – and in some cases passing – legislation intended to protect kids online. For years, US lawmakers have called for new safeguards to address concerns about social platforms leading younger users down harmful rabbit holes, enabling new forms of bullying and harassment and adding to what’s been described as a teen mental health crisis. Now, in the absence of federal legislation, states are taking action, and raising some alarms in the process. The governors of Arkansas and Utah recently signed controversial bills into law that require social media companies to conduct age verification for all state residents and to obtain consent from guardians for minors before they join a platform. Lawmakers in Connecticut and Ohio are also working to pass similar legislation.
Axios NW Arkansas: What Arkansas’ parental consent for social media means: Arkansas Gov. Sarah Huckabee Sanders signed legislation last week requiring social media companies to verify ages and obtain parental consent for users younger than 18 who are trying to open new accounts. The big picture: Supporters of the Social Media Safety Act say it can help protect children from harmful effects of social media, while others say the move raises privacy, free speech and enforceability concerns. The legislation’s sponsor, state Sen. Tyler Dees (R-Siloam Springs), told the Arkansas Senate that minors are exposed to harmful people and inappropriate content on social media, arguing age verification would empower parents to protect their kids, Arkansas Advocate reported. Details: The law requires companies to contract with third-party vendors to verify users’ ages before allowing access to the platforms.
Engineering and Technology: Opinion: It’s time for responsible social media: “We must finally hold social media companies accountable for the experiment they are running on our children for profit. And it’s time to pass bipartisan legislation to stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us.” President Joe Biden got a standing ovation from Democrats and Republicans when he proposed tough regulation of social media in February’s State of the Union address. But getting something into federal law is proving tricky. The US has lagged the UK and the EU on online regulation. Europe’s General Data Protection Regulation has become a global template for privacy and has for now been retained in British law.
Gizmodo: Kids on BeReal Are Exposed to Sexual Content More Often Than Other Social Networks, Survey Finds: BeReal, a popular new spontaneous image-sharing app designed to show life “without filters,” had the highest proportion of child users exposed to sexual content of any major social media app, according to a new survey shared with Gizmodo. Larger apps like YouTube and TikTok had more overall incidents of exposure to sexual content, but users on BeReal were the most likely to actually interact with the content, the survey found. Similarly, the survey of parents showed BeReal had the highest proportion of child users who have shared sexually explicit images of themselves on the app. Those findings are part of a large survey of US parents conducted by ParentsTogether Action, a nonprofit organization that advocates in favor of tougher online protection for teens and kids. The survey of 1,000 parents found instances of child sexual abuse and exploitation on every major social network. More than a third (34%) of parents surveyed said they believed their children had been exposed to sexually explicit content online. More than 40% of the kids exposed to sexual content were under 12 years old at the time.
Marketing Dive: Pinterest latest to add additional safety protocols for younger users: Pinterest is the latest social media platform to announce new safety features aimed at protecting the wellbeing and privacy of its younger users, a topic that has come into sharper focus even as broader regulatory efforts struggle to gain unified support. The efforts by the platform follow an investigation by NBC News last month that unveiled how predators online have been compiling photos of children, including toddlers, into saved collections — known on the app as “boards” — with content often involving children bending over, dancing or sticking their tongue out. The investigation also found that similar images and videos were fed to users through the app’s algorithm after interest in such content was displayed. The investigation, which quickly gained national attention, prompted Pinterest to add new features expanding the capabilities for users to report content and accounts, and its latest safety additions seem to be building on its corrective efforts.
Bloomberg: Meta Urged to Halt Plans Allowing Minors Into the Metaverse: Dozens of advocacy organizations and children’s safety experts are calling on Meta Platforms Inc. to terminate its plans to allow minors into its new virtual reality world. Meta is planning to invite teenagers and young adults to join its metaverse app, Horizon Worlds, in the coming months. But the groups and experts that signed the letter, which was sent to Meta Chief Executive Officer Mark Zuckerberg on Friday, argue that minors will face harassment and privacy violations on the virtual reality app, which is only in its early stages. “Meta must wait for more peer-reviewed research on the potential risks of the metaverse to be certain that children and teens would be safe,” wrote the groups, led by online safety groups including Fairplay, the Center for Countering Digital Hate, Common Sense Media and others. The letter points to a March report from the Center for Countering Digital Hate that found users under 18 are already facing harassment from adults on the app. Researchers with the center witnessed 19 episodes of abuse directed at minors by adults, including sexual harassment, during 100 visits to the most popular worlds within Horizon Universe.
Bloomberg Law: States Race After Utah on Minors’ Privacy Despite Legal Threats: The legislative and regulatory landscape concerning minors’ privacy is becoming increasingly protective, adding new complexity and uncertainty for companies as they navigate a patchwork of requirements. The latest legislative trend aims to regulate use of social media by minors and provide parents with greater control over their children’s social media activities. This wave of legislation is building alongside concerns about the impact of social media on teens’ mental health and perceived gaps in protections for children’s privacy rights on social media. On March 23, Utah was the first state to adopt such social media regulations with its Social Media Regulation Act. The law applies to social media companies with more than 5 million users worldwide, and it goes into effect on May 3, with numerous requirements coming into force beginning March 1, 2024.
The New York Times: What Students Are Saying About Banning TikTok: TikTok, the social media app owned by the Chinese company ByteDance, has long worried American lawmakers, but those concerns — which range from national security risks to the app’s effects on young people — came to a fever pitch last month when a House committee voted to advance legislation that would allow President Biden to ban TikTok from all devices nationwide. Two-thirds of American teenagers are on the app. So we asked students: “Should the United States Ban TikTok?” Most were opposed. Their arguments included the fact that many apps — not just TikTok — are collecting and selling their data; that a ban would violate the First Amendment; that TikTok is fun and helpful for users and lucrative for creators; and that the government has bigger problems it should be worrying about. But a sizable portion of commenters was in favor, citing national security concerns, the app’s effects on young people’s mental health, and the ease with which they can get around restrictions. For some, the possibility of a TikTok ban brought on something like relief: “I wouldn’t be upset,” Timothy from WHS wrote, “and to be honest, I think it would be for the better when it comes to me and kids around my age because it becomes addicting.”
Bloomberg: Amazon’s Twitch Safety, AI Ethics Job Cuts Raise Concerns Among Ex-Workers: Job cuts at Amazon.com Inc.’s Twitch division are raising concerns among former employees and content monitors about the popular livestreaming site’s ability to police abusive or illegal behavior — issues that have plagued the business since its inception. Layoffs at Twitch eliminated about 15% of the staff responsible for monitoring such behavior, according to former employees with knowledge of the matter. The company also folded a new team monitoring the ethics of its AI efforts, said the people, who asked not to be identified to protect their job prospects. Since late 2022, technology companies have cut more than 200,000 jobs, including trust and safety positions and contractors, at Meta Platforms Inc., Alphabet Inc. and Twitter. Job postings including the words “trust and safety” declined 78% in March 2023 from a year ago, according to the job-listing site Indeed. Technology companies also thinned their responsible AI and diversity teams.
The Wall Street Journal: Mothers Power New Drive to Make Social-Media Firms Accountable for Harms: Silicon Valley has for years brushed back attempts to make internet platforms more accountable for harm to young people. Online safety advocates are hoping to turn the tide with a new force: Moms. Mothers who say social media devastated their sons and daughters are stepping up efforts to pass legislative remedies, including by making personal appeals to lawmakers and working with congressional aides to fine-tune legislation. The power of the lobby of mothers was demonstrated in November, when about 10 women walked into Sen. Maria Cantwell’s (D., Wash.) office, demanding to know why they hadn’t been able to secure a meeting with the chair of the Senate Commerce Committee. Kristin Bride, 56, of Mesa, Ariz., clutched a picture of her late 16-year-old son as she approached the reception desk. Several more mothers followed, holding their own photographs of children whose deaths or struggles they blame, in part, on platforms such as YouTube, Instagram and Snapchat.
The Washington Post: There are almost no legal protections for the internet’s child stars: Since she was a small child, Cam Barrett, now 24 and a social media strategist in Chicago, had her life documented online. First on Myspace, then on Facebook, Barrett says, her mother posted relentlessly about her, concocting storylines about highly personal events that resulted in her being bullied and ostracized at school. “When I was nine years old, the intimate details of my first period were shared online,” she said in February while testifying about her experience to the Illinois Senate. “At 15, I was in a car accident. … Instead of a hand being offered to hold, a camera was shoved in my face.” Eventually, she said, she began hiding out in her room so she wouldn’t have to appear on camera. “I was told to look sicker for the camera,” she later told The Washington Post. “I was told if I look too happy, I have to take another picture to look like this, or like this. … I was hit by a drunk driver, and she right off the bat put a phone in my face to take pictures to put online,” Barrett said.
The New York Times: A digital footprint that begins before birth: Sophie Kratsas was only a few hours old when she received her first email: a “welcome to the world” message from her father, Nick Kratsas. He had created an email account for his newborn daughter while still standing in the delivery room. This was 2014, and Kratsas, 44, had already noticed a dearth of unclaimed email addresses with a person’s full name without numbers, special characters or other concessions. “I’m like, man, if I can grab this for her now, eventually she’ll be able to use this when she’s ready for it,” Kratsas said. A few days later, he created a Facebook profile for Sophie so he and his wife, Heather, 41, could begin tagging her in posts and photos. When she’s old enough, they intend to turn over the email and Facebook accounts to her, along with the robust digital histories that come with them. Sophie, now 9, is one of many children in her generation whose digital footprint precedes her physical one. In an age when teens and tweens are more online than ever, some parents find it just as important to invest in their offspring’s digital futures, like securing their email addresses, domain names and social media handles, as it is to invest in their finances and education.
Cato Institute: Analyzing the Consequences of Recent Youth Online Safety Proposals: Many policymakers at both the state and federal levels have called for additional regulations to protect children’s online privacy and improve online safety. While the desire to protect children is a well-intentioned motivation, these proposals have significant consequences, and in many cases may even diminish children’s online privacy. In a new policy brief out today, I discuss the potential impact of these proposals for all internet users, not just children. In general, these proposed online safety regulations tend to fall into three major categories: a total or near total ban of social media use by users under a certain age; requirements for age verification and age-appropriate design for social media and other general use websites; additional age verification and age-appropriate design codes for particular types of content. A ban or near total ban of social media use is a draconian step for a government to take. It eliminates speech opportunities for individuals of every age.
Forbes: Sex Traffickers Used America’s Favorite Family Safety App To Control Victims: Earlier this year, an 18-year-old Amazon employee brought a tip to the San Diego Police Department: prior to working for the tech giant, she had been forced into sex work when she was 17. Her alleged trafficker told her that she had to work six days a week and earn at least $1,000 a day, according to a search warrant obtained by Forbes. Text messages also showed her alleged trafficker forced her to do something else: install an app called Life360 on her phone. The app, which claims over 50 million active users across 195 countries, is among the most popular family safety apps in America. It lets parents and kids know where each family member is located at all times, displaying their live coordinates on a map. But, according to nine federal cases dating back to at least 2018, it has also been used by sexual predators to monitor and control their victims. And privacy and trafficking experts say such misuse is hardly an anomaly; it’s becoming an issue with other apps like it including Apple’s “Find My Friends” and Google’s “Find My Phone” tools.
WBRE/WYOU: Being aware of ‘dark challenges’ on social media: Social media is as popular as ever with many people spending hours scrolling through the apps. But there is a darker side to social media that could be harmful to teens and even potentially deadly. Eyewitness News spoke with an area father who lost his teenager to a dark challenge. Now he’s made it his mission to warn other parents about the dangers lurking on some social media apps. “Boy was my hero, always has been, always will be,” said Dave Thomas. Dave Thomas of Honesdale continually grieves the loss of his teenage son, Logan Gorski. “Logan was the perfect child. He always stood for something,” added Thomas. Logan died on October 5, 2020, 11 days before his 17th birthday. Dave says Logan engaged in a dark challenge on the social media app Kik, and it went too far, taking Logan’s life.
The Morning Call: ‘Human beings are making the decisions’: Northampton County dismisses concerns over AI tool to help keep children safe: An artificial intelligence tool Northampton County officials have begun using to help predict which children could be at risk of harm has caught the attention of federal investigators looking into a case of an Allegheny County couple whose child was removed from their custody based on the same tool. Despite the investigation underway by the U.S. Department of Justice in the Allegheny County case, Susan Wandalowski, director of Northampton County’s human services, said recently the AI product could be an important device in its mission of protecting children. She said county officials believe the tool is part of child-welfare professionals’ arsenal in screening for potential abuse cases, to ensure children’s safety, keep families intact, and avoid unnecessary investigations of low-risk families. “At the end of the day,” Wandalowski said, “the human beings are making the decisions. This is just one additional tool in their tool belt.”
USA Today: Opinion: Will TikTok be banned? Maybe it should be for kids, at least.: Congress wants to kick TikTok out of America. I want to boot it from my bedroom. Like most crazy-busy parent professionals in America these days, I am beyond desperate for a full night of life-restoring rest. Yet I cannot fight the irresistible magnet-force pull that is the latest viral video – especially at night, when I’m too tired to be rational. I’m not alone. TikTok’s been downloaded more than 210 million times in the United States, according to the most recent marketing statistics. To date, there’s no evidence that TikTok is a threat to national security. But there’s plenty of evidence that, like all of the others – Facebook, Twitter, Instagram, YouTube, Snapchat – the company’s main goal is to hook you young and keep you coming back for more.
Newsweek: Opinion: A TikTok Ban Is the Only Way Forward: When it comes to building the Great Firewall—a tool to control and censor the internet—the Chinese Communist Party (CCP) has had no trouble. When it comes to building a firewall between TikTok and its Chinese parent company ByteDance, though, the CCP will curiously struggle. That’s because TikTok is an American company in name only. It can and should be banned. The core problem here is that the CCP controls TikTok. Explaining why is a simple two-part argument. First, there is no firewall between the CCP and China’s private sector. Second, there is no firewall between TikTok and its Chinese parent company ByteDance. The first part is clearly spelled out in Chinese law, namely the 2015 National Security Law and 2017 National Intelligence Law. In particular, the national intelligence law states that “any organization or citizen shall support, assist, and cooperate with state intelligence work.”
The Seattle Times: Opinion: Isolation and restraint of students is abuse: Listening to legislative testimony about a 7-year-old boy in crisis who was restrained face down on a school floor tests the heart. Watching a grown man choke up at his memory of being dragged through a hallway and locked in a barren isolation room would lead any feeling person to wonder why Washington state continues these practices. Our youth prisons outlawed solitary confinement in 2020. Yet at least 3,800 children — most of them younger than 12 — were isolated or restrained 24,873 times during the 2019-20 school year, according to a report from Disability Rights Washington. The vast majority of these kids were special education students, and a wildly out-of-proportion number were Black. The high number of incidents makes the case plainly: Isolation and restraint do not correct behavior. To the contrary, with an average of six incidents per student, there is abundant evidence that these approaches only exacerbate the problem.
THV 11: Arkansas suing TikTok, Meta for exposing minors to ‘damaging content’: Three different lawsuits have been filed by Arkansas Attorney General Tim Griffin, taking aim at social media powerhouses Meta and TikTok in order to “protect” Arkansas children from the applications. He was joined by Governor Sarah Huckabee Sanders during a press conference, where they discussed the lawsuits, which are being filed under Arkansas’s ‘Deceptive Trade Practices Act.’ “The common theme is deception. And the consequences of that deception is endangering Arkansans, particularly our children, our youth,” said Attorney General Tim Griffin. According to Gov. Sanders, this act prohibits companies from “engaging in false, deceptive, business practices.” She believes that this falsehood comes as social media companies “claim that their platforms are beneficial, non-addictive, and private.” In terms of Meta, the governor claimed that the company played a role in a mental health crisis among teens.
The Washington Post: Utah governor says new social media laws will ‘prevail’ over challenges: Utah Gov. Spencer Cox (R) signed into law a pair of measures last week that seek to strictly limit social media access for kids and teens, marking “some of the most aggressive laws passed by any state to curb the use of social media by young people,” as my colleagues Naomi Nix, Cat Zakrzewski and Heather Kelly wrote. The move is likely to ignite another legal standoff with tech industry groups, which have already expressed concern about the laws’ constitutionality and gone on the offensive against a growing raft of state laws targeting social media companies. But Cox said Sunday that he believes state officials will topple lawsuits challenging the state’s new social media laws. “We feel very confident that we have a good case here,” Cox told NBC News’s “Meet The Press” on Sunday. “We expect that there will be lawsuits, and we feel confident that we will prevail.” The laws would require companies to obtain parental consent before letting minors access their platforms and set a digital curfew for younger users. They would also require companies to give guardians access to their child’s account and to verify that users in Utah are over 18.
CNBC: Worried about your kids and A.I.? Experts share advice — and highlight the risks to look out for: Artificial Intelligence is all the rage in the tech world, especially after the launch of ChatGPT and GPT-4. It has shown potential not only to change the lives of workers — but also the daily life of another demographic: kids. In fact, children are already using AI-powered toys and platforms that write bedtime stories at the click of a button. “We call today’s children ‘Generation AI’ because they are surrounded by AI almost everywhere they go, and AI models make decisions that determine the videos they watch online, their curriculum in school, the social assistance their families receive, and more,” Seth Bergeson, fellow at the World Economic Forum who led their “AI for Children” project, told CNBC Make It. And AI’s influence will only grow from here, said Saurabh Sanghvi and Jake Bryant, partners at McKinsey.
My Ches. Co: Pennsylvania Man Sentenced to Prison for Cyberstalking: According to United States Attorney Gerard M. Karam, Vandaley, with the intent to harass and intimidate other persons, engaged in a course of conduct using electronic communication systems and services of interstate commerce to cause substantial emotional distress to six victims. All six victims were former romantic partners of Vandaley or relatives and friends of his former romantic partners. During the course of the conduct, Vandaley repeatedly made false anonymous allegations to law enforcement agencies throughout the country. He falsely accused the victims of committing heinous crimes including murder-for-hire, narcotics trafficking, human trafficking, and sexual offenses. Vandaley also sent anonymous electronic messages to the victims threatening to kidnap and murder the minor child of one of the victims. Vandaley also threatened to mail parts of that minor child back to the victim. Vandaley committed these crimes while one of the victims had a protection from abuse order against him.
CNN Business: TikTok CEO in the hot seat: 5 takeaways from his first appearance before Congress: In his first appearance before Congress on Thursday, TikTok CEO Shou Chew was grilled by lawmakers who expressed deep skepticism about his company’s attempts to protect US user data and ease concerns about its ties to China. It was a rare chance for the public to hear from Chew, who offers very few interviews. Yet his company’s app is among the most popular in America, with more than 150 million active users. Here are the biggest takeaways from Thursday’s hearing.
The Washington Post: Utah governor signs bill to curb children’s social media use: Utah Gov. Spencer Cox (R) signed two bills into law Thursday that would impose sweeping restrictions on kid and teen use of social media apps such as Instagram and TikTok — a move proponents say will protect youth from the detrimental effects of internet platforms. One law aims to force social media companies to verify that users who are Utah residents are over the age of 18. The bill also requires platforms to obtain parental consent before letting minors use their services, and guardians must be given access to their child’s account. A default curfew must also be set. The Utah regulations are some of the most aggressive laws passed by any state to curb the use of social media by young people, at a time when experts have been raising alarm bells about worsening mental health among American adolescents. Congress has struggled to pass stricter bills on online child safety despite bipartisan concern about the effects social media has on kids.
CNN: Why Bucks County, Pennsylvania, is suing social media companies: One mother in Bucks County, Pennsylvania, said her 18-year-old daughter is so obsessed with TikTok, she’ll spend hours making elaborate videos for the Likes, and will post retouched photos of herself online to look skinnier. Another mother in the same county told CNN her 16-year-old daughter’s ex-boyfriend shared partially nude images of the teen with another Instagram user abroad via direct messages. After a failed attempt at blackmailing the family, the user posted the pictures on Instagram, according to the mother, with some partial blurring of her daughter’s body to bypass Instagram’s algorithms that ban nudity. “I worked so hard to get the photos taken down and had people I knew from all over the world reporting it to Instagram,” the mother said. The two mothers, who spoke with CNN on condition of anonymity, highlight the struggles parents face with the unique risks posed by social media, including the potential for online platforms to lead teens down harmful rabbit holes, compound mental health issues and enable new forms of digital harassment and bullying. But on Friday, their hometown of Bucks County became what’s believed to be the first county in the United States to file a lawsuit against social media companies, alleging TikTok, Instagram, YouTube, Snapchat and Facebook have worsened anxiety and depression in young people, and that the platforms are designed to “exploit for profit” their vulnerabilities.
Glenside Local: PA lawmakers introducing social media bill, Sen. Haywood co-sponsors firearm ammunition bill: The Pennsylvania state Senate will soon introduce a bill mandating age verification on social media platforms and allowing parents/guardians to submit a request to delete a minor’s social media account. “There are clear and demonstrated harms to children who utilize these platforms, a fact which has been known by social media companies for years,” State Rep. Robert W. Mercuri (R-Allegheny) said in a co-sponsorship memorandum. “Attempts by these companies to curtail such harms failed to alleviate the problem and actually made it worse.” Mercuri has also taken a stand against the platform TikTok, which is owned by a Chinese company. In a memo to the PA House, Mercuri said the bill would “protect the Commonwealth’s information technology assets from security risks associated with the social media network TikTok.”
CT Mirror: CT-led bill aims to protect kids online. Will it clear Congress?: As revelations about the harmful toll of social media on children and teens have become public over the past few years, Congress sought to amp up the pressure on Big Tech and pass legislation for the first time in decades to protect minors and hold companies accountable. Some of those efforts “came heartbreakingly close” to materializing at the end of the year but ultimately faded and got punted to the new session of Congress that started in January. One of those bills, co-authored by Sen. Richard Blumenthal, D-Conn., focuses on the safety aspect and gives children and parents greater control over what online content can be viewed. The issue came to a head when Facebook whistleblower Frances Haugen testified before Congress in 2021 about the harmful effects of social media on children and teenagers and how tech giants kept users engaged to turn profits. Lawmakers like Blumenthal believe the growing bipartisan support on this issue could lead to the passage of tech reforms this time around — possibly this year.
Bloomberg Law: Utah Taunts Social Media Sites With Sweeping Teen Restrictions: Utah’s first-in-the-nation legislation to restrict how social media companies treat young users and allow individuals to sue over violations will set the stage for a tech industry legal battle regarding their constitutionality. Gov. Spencer Cox (R), citing concerns over youth mental health, plans to sign into law Thursday two bills that aim to protect children from addictive features and other potential harms of social media. Platforms such as Facebook and Twitter would have to obtain parental consent if a user under 18 wants to open an account, and the companies could face fines and lawsuits for running afoul of a host of new requirements. The bills are among the most stringent efforts by state lawmakers across the country this year to regulate a child’s experience online. Tech industry groups said in letters asking Cox to veto the measures that they would violate the First Amendment and lead to frivolous lawsuits.
Next Gov: ‘Alarming Content’ from AI Chatbots Raises Child Safety Concerns, Senator Says: As leading technology companies rush to integrate artificial intelligence into their products, a Democratic senator is demanding answers about how these firms are working to protect their young users from harm—particularly following a series of news reports that detailed disturbing content created by AI-powered chatbots. In a letter on Tuesday to the CEOs of five companies—Alphabet Inc.’s Google, Facebook parent company Meta, Microsoft, OpenAI and Snap—Sen. Michael Bennet, D-Colo., expressed concern about “the rapid integration of generative artificial intelligence into search engines, social media platforms and other consumer products heavily used by teenagers and children.” Bennet noted that, since OpenAI’s ChatGPT was launched in November, “leading digital platforms have rushed to integrate generative AI technologies into their applications and services.” While he acknowledged the “enormous potential” of generative AI’s adoption into a range of technologies, Bennet added that “the race to integrate it into everyday applications cannot come at the expense of younger users’ safety and wellbeing.”
Patch: PA Could Mandate Social Media Age Limits, Allow Parents To Delete: As the reckoning for social media platforms that critics say have recklessly harmed children continues, Pennsylvania legislators are looking for more ways to keep young people safe. Legislation will soon be introduced in the Pennsylvania state Senate which would mandate age verification on social media platforms. It would also allow parents to request a child’s account be deleted. “There are clear and demonstrated harms to children who utilize these platforms, a fact which has been known by social media companies for years,” State Rep. Robert W. Mercuri (R-Allegheny) said in a co-sponsorship memorandum. “Attempts by these companies to curtail such harms failed to alleviate the problem and actually made it worse.”
The Hill: Utah’s Cox says he will sign divisive social media bill restricting minors: Utah Gov. Spencer Cox (R) on Thursday said he’ll sign a divisive bill restricting minors from using social media without parental permission. Cox said at a meeting with reporters that he’ll “absolutely” sign the social media bills sent to his desk this session: Utah Senate Bill 152 would require social media companies to verify that users in the state are 18 years or older in order to open an account, and Cox said he is willing to face any legal challenges to the initiative. “I’m not gonna back down from a potential legal challenge when these companies are killing our kids,” Cox said, according to footage from PBS Utah, shaking off First Amendment concerns. Under the bill, Utah residents under age 18 would only be able to open an account with a parent or guardian’s permission. The new restrictions would take effect March 1, 2024. The governor said he would be working with social media companies and third-party verification over the next year to work out the details of how the restrictions would be implemented.
Gizmodo: Parents Group Demands Meeting With Meta and TikTok Over Child Suicide: A family advocacy group called Parents Together published an open letter Thursday demanding a meeting with the heads of Meta and ByteDance, arguing that the companies knowingly expose children to a variety of dire threats, including the risk of suicide, and that they refuse to address these problems in favor of growth and profit. The open letter describes a number of horror stories from families who say their children fell victim to the harms posed by social media, including suicides, accidental deaths from viral “challenges,” hospitalizations from eating disorders, sexual abuse, and more. Meta and ByteDance, the parent companies of Facebook and TikTok, respectively, “have imposed on unwitting children and families – anxiety and depression, cyberbullying, sexual predators, disordered eating, dangerous challenges, access to drugs, addiction to your platforms, and more—every single day,” Parents Together Action said in the letter. The companies “have chosen your profits, your stockholders, and your company over children’s health, safety, and even lives over and over again.”
NBC News: Senators seek answers from Pinterest after NBC News investigation: Days after an NBC News investigation revealed how grown men on Pinterest openly create sex-themed image boards filled with pictures of little girls, the company says it has “dramatically” increased its number of human content moderators. It also unveiled two new features enabling users to report content and accounts for a range of violations. Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., sent a letter to the company Tuesday morning demanding to know why the new tools weren’t already available, among other questions. “It should not have taken national media coverage of such graphic misuse targeting young children to prompt action,” wrote the senators, who are co-sponsors of the bipartisan Kids Online Safety Act. “This report is particularly disappointing given that Pinterest has branded itself the ‘last positive corner of the internet.’”
The Washington Post: Snapchat tried to make a safe AI. It chats with me about booze and sex.: Snapchat recently launched an artificial intelligence chatbot that tries to act like a friend. It built in some guardrails to make it safer for teens than other AI bots built on the tech that powers the buzzy ChatGPT. But in my tests, conversations with Snapchat’s My AI can still turn wildly inappropriate. After I told My AI I was 15 and wanted to have an epic birthday party, it gave me advice on how to mask the smell of alcohol and pot. When I told it I had an essay due for school, it wrote it for me. In another conversation with a supposed 13-year-old, My AI even offered advice about having sex for the first time with a partner who is 31. “You could consider setting the mood with candles or music,” it told researchers in a test by the Center for Humane Technology that I was able to verify.
Daily Mail: Facebook and Instagram used ‘aggressive tactics’ targeting children: Unredacted lawsuit claims Meta knew about child sexual exploitation and exploited extreme content to drive more engagement: Meta knowingly used ‘aggressive tactics’ that involved getting children hooked on social media ‘in the name of growth,’ according to a lawsuit against Meta claiming children have suffered at the hands of Facebook and Instagram. A Meta software engineer claimed that ‘it is not a secret’ how Facebook and Instagram used meticulous algorithms to promote repetitive and compulsive use among minors, regardless of whether the content was harmful – and has ‘been pretty unapologetic about it.’ The previously redacted revelations were disclosed in a lawsuit against Meta, which has been unsealed and seen by DailyMail.com.
CBS 12 News: Social Media Safety: subcommittee unanimously sends safety education bill to senate floor: Social media safety has become a primary concern for everyone – especially for schools and parents of young children, who are more vulnerable to its dangers. Now, the question of whether the Department of Education should mandate safety instruction on online platforms to children in schools – a proposal in Senate Bill 52 – has been advanced by a senate subcommittee. Social media may be a fun place for people to engage with one another – but at least one cyber expert says kids must learn about its dangers too. “It’s extremely dangerous, there’s no other way around it,” remarked FAU Adjunct Professor and tech expert Craig Agranoff. “We’ve become a society that values likes more than we value kindness.”
The Baltimore Sun: Judge approves redactions for AG’s Catholic clergy abuse report, clearing way for its release: A Baltimore judge approved the needed redactions Tuesday for the attorney general’s report on sexual abuse within the Roman Catholic Archdiocese of Baltimore, clearing the way for its public release. Circuit Judge Robert K. Taylor ordered the Maryland Attorney General’s Office to redact 37 names from the report and to anonymize the identities of 60 other people, removing them from the 456-page document entirely. Taylor’s order leaves the timing of the report’s release at the discretion of Attorney General Anthony Brown, whose office must complete the required redactions and notify the 37 individuals before publishing it. A timeline for when Brown expects to release the report was not available Tuesday afternoon, although it was unlikely the report would be released before Wednesday. The attorney general’s office will publish the report on its website.
News 5 Cleveland: Big tech, lawmakers, and local schools take steps to monitor screen time and protect children: Mom Stephanie Miller remains watchful when it comes to her kids’ screen time. “Who doesn’t let them be on technology?” Miller said. “You know they can enjoy things, but it’s minimally because I like to keep their minds going in a more educated way, like imaginary play.” In February, Lieutenant Governor Jon Husted proposed the Social Media Parental Notification Act. It would require social media and gaming companies to get parental consent before kids under 16 sign up. Dr. Michael Manos, Head of ADHD at the Cleveland Clinic, said too much phone time early on can be linked to anxiety and depression in children. “The effort to limit screen time is certainly laudable and should have been done a long time ago,” said Manos.
The Washington Post: Meta doesn’t want to police the metaverse. Kids are paying the price: Zach Mathison, 28, sometimes worries about the hostility in Meta’s virtual reality-powered social media game, Horizon Worlds. When his 7-year-old son, Mason, explores the app, he encounters users, often other children, screaming obscenities or racist slurs. He is so uneasy about his son that he monitors his every move in VR through a television connected to his Quest headset. When Mathison decides a room is unsafe, he’ll instruct Mason to leave. He frequents online forums to advise other parents to do the same. “A lot of parents don’t really understand it at all so they just usually leave it to the kids to play on there,” he said. He will say, “If your kid has an Oculus please try to monitor them and monitor who they’re talking to.” For years, Meta has argued the best way to protect people in virtual reality is by empowering them to protect themselves — giving users tools to control their own environments, such as the ability to block or distance other users.
NBC News: Men on Pinterest are creating sex-themed image boards of little girls. The platform makes it easy: Like other kids her age, 9-year-old Victoria signed up for Pinterest because she wasn’t allowed on TikTok. Her mother feared she might encounter dangerous content or individuals on the popular video-sharing app. Pinterest, meanwhile, seemed safe. But while the third grader was “pinning” pictures of baby animals, craft ideas and nail art inspiration into her image “boards” on the site, grown men were pinning her. Clips Victoria uploaded of herself to Pinterest, such as one in which she cheerfully turns a cartwheel, have been compiled by at least 50 users into their own boards with titles like “young girls,” as well as “Sexy little girls,” “hot,” “delicious,” and “guilty pleasures.” Those boards are filled with dozens, hundreds and sometimes thousands of photos and videos of children.
Forbes: Will The U.S. Update Laws For Children’s Digital Privacy?: Despite a last-ditch effort by lawmakers in December 2022, two bills to strengthen online regulatory protection for children in the U.S. failed to make it to Congress’s 2023 fiscal spending plan. The advocacy group Fairplay termed the development “beyond heartbreaking,” adding that “preventable harms and tragedies” were allowed to continue unimpeded. Lawmakers sponsoring the bills blamed the “behemoth sway” of lobbyists working on behalf of Big Tech. These and other voices call attention to the growing dangers of certain online activities to vulnerable children. To regain the initiative, President Joseph Biden demanded a ban on online ads targeting children during his State of the Union address on February 7, 2023.
The Globe: District 518 warns against scams, encourages parents to talk to kids: District 518 is asking parents and students for help fighting a new social media scam: fake accounts that threaten teens with releasing nude photos of them — pictures that aren’t even real. “We are investigating,” said Anne Foley, public relations/communications coordinator with District 518, noting that investigations take time. “We don’t know if these people are fellow students. We have no idea if these people live in Worthington. And we have no idea how they’re choosing the kids, but there’s been at least two.”
Social Media Today: New Social Media Restrictions for Youngsters Could Lead to Broader Limits in Access: Could this be a sign of things to come in social media regulation? The State of Utah is set to pass a new law which will restrict people under the age of 18 from using social media apps without a parent’s consent. As per Axios: “Starting March 1st, 2024, all Utahns would have to confirm their ages to use social media platforms or lose account access, under the bill, sponsored by state Rep. Michael McKell.” The new law, if enacted, would add an extra layer of protection for youngsters: parents would need to verify their kids’ ages and would be able to monitor their activity, while any user who fails to confirm their own age would lose access to their account.
Wired: TikTok’s Screen-Time Limits Are the Real Distraction: My first cell phone was a brick-shaped Nokia with a couple hundred minutes loaded onto it. My parents gave it to me when I got my first car, on the understanding that, whenever I drove somewhere that wasn’t school, I’d call them as soon as I arrived so they’d know I was safe. It was a reasonable rule—especially given how many times it took me to pass my driver’s test—and one to which I had no problem agreeing. Even still, I almost never remembered to do it. I’d be in the middle of a movie at the theater and I’d realize that I had forgotten to call. I’d sprint out to the car—where I kept the phone itself—and have a brief, harried conversation with my worried and deeply irritated parents. They knew, of course, that I was likely fine. But it’s hard to not know what your kids are doing without you.
Axios: Tech platforms struggle to verify their users’ age: Social media and streaming platforms are trying to figure out the best ways to verify a user’s age as parents and lawmakers grow increasingly concerned about the way children and teenagers use online services. Driving the news: Those worries — along with recently enacted laws in the United Kingdom and California — have pushed companies to try new processes for ensuring underage users aren’t getting onto sites and services meant for older people. Age verification and age estimation are just one part of an attempt to make tech safer for kids as complaints grow over mental health harms, privacy trespasses and more.
Axios Salt Lake City: Utah set to limit minors from using social media without parent’s OK: Utah is poised to pass a law restricting children and teens under age 18 from using social media without their parent’s consent. Meanwhile, adults could lose access to their accounts, too, if they refuse to verify their age. The latest: After SB 152 cleared its final legislative hurdle last week, Utah Gov. Spencer Cox told reporters Friday — the final day of the 2023 general session — he planned to sign the bill. Cox said the state was “holding social media companies accountable for the damage that they are doing to our people.” Between the lines: Starting March 1, 2024, all Utahns would have to confirm their ages to use social media platforms or lose account access, under the bill, sponsored by state Rep. Michael McKell (R-Spanish Fork).
Bucks County Courier Times: Sextortion is on the rise, and it can be deadly. How to protect yourself and your kids: Ian Pisarchuk sat behind a screen and terrorized his victims. He’d befriend them mostly online, and then the demands would start. He wanted photos of them, or said he already had them. He made threats to get what he wanted from girls and young women, using details from their social media accounts to exert power over them and get what he wanted for his own pleasure. “Words cannot describe the anxiety Ian has caused me,” said one of his victims in a Bucks County court last month. In 2019, Pisarchuk set his sights on the young girl, getting her to send him explicit images of herself and then threatening to expose the photos online.
CNN: Democratic senators urge Meta not to market its metaverse app to teens: Two Democratic senators urged Meta this week to suspend a reported plan to offer Horizon Worlds, the company’s flagship virtual reality app, to teens between the ages of 13 and 17, arguing the technology could harm young users’ physical and mental health. The lawmakers, Massachusetts Sen. Ed Markey and Connecticut Sen. Richard Blumenthal, called Meta’s plan “unacceptable” in light of the company’s “record of failure to protect children and teens,” in a letter dated Wednesday to company CEO Mark Zuckerberg. The letter focuses on a plan, reported by the Wall Street Journal last month, that would enable Meta’s teen users to join a persistent online world consisting of multiple digital communities through the use of a virtual reality headset. Horizon Worlds is already available to adults 18 and older.
Time: ‘We Can Turn It Off.’ Why TikTok’s New Teen Time Limit May Not Do Much: TikTok is the most popular social media platform for teens—and by many accounts the time they spend on it is growing. Two-thirds of U.S. teenagers told a 2022 Pew survey that they are on the app, and 16% said they use it constantly. In 2021, the average time kids and teens spent on TikTok grew to 91 minutes a day, up from 82 the year before, according to a report by TechCrunch. So Tuesday’s news that TikTok moved to limit minors to one hour per day sounds like a big deal. But teachers, who have reported concerningly high social media use among students and struggles to compete for their attention, say that while the new limits are a good idea, they might not have a big impact.
WRIC: Do you know who your children are talking to online? Police warn of influx in social media scams targeting teens: A recent influx of scams targeting teenagers prompted Chesterfield County Police to urge parents to keep a closer eye on their children’s devices. Sergeant Winfred Lewis with Chesterfield County Police Department’s Special Victims team described how these particular scammers prey on young peoples’ fear and embarrassment. “They’re juveniles,” Lewis explained. “They’re teenagers.” Typically, when police warn of online scams, they note how scammers target the elderly, who may be less familiar with modernized social media and web technology. However, with this recent wave of scams, the most internet savvy individuals are vulnerable. Victims have been kids as young as 11 or 12 years old.
CNET: TikTok Will Limit Teen Screen Time to 60 Minutes by Default: TikTok said Wednesday that it wants teens to be more aware of the time they spend on the popular app for short-form videos. The tech company said it’ll set screen time limits for teens by default and release new features so parents have more control over their children’s use of social media. TikTok users under 18 years old will have their screen time limit automatically set to 60 minutes. The short-form video app said this default screen time will apply to new and existing accounts that haven’t already used this tool.
Politico: Vivek Murthy wants kids off social media: Surgeon General Vivek Murthy is an evangelist for wellness, hosting town halls and expounding on meditation and mindfulness on his House Calls podcast. He’s particularly concerned about kids’ mental health and has issued guidance for young people, suggesting they ask for help, volunteer in their communities and learn stress management techniques. And he’s testified before Congress about the topic. In conversation with Ruth, he calls out social media as a unique threat to the rising generation, a view shared by many in Congress who are considering legislation to make it harder for kids to use the technology.
Forbes: Meta Backs New Platform To Help Minors Wipe Naked, Sexual Images Off Internet: The National Center for Missing & Exploited Children has launched a platform, funded by Meta, to help kids and teens have naked or sexual photos and videos of themselves removed from social media. The new service for minors, called Take It Down, was unveiled one year after the release of a similar tool for adults known as StopNCII (short for non-consensual intimate imagery). Today, Take It Down will work only across five platforms that agreed to participate: Meta-owned Facebook and Instagram, as well as Yubo, OnlyFans and Pornhub. “We created this system because many children are facing these desperate situations,” said NCMEC’s president and CEO Michelle DeLaune. “Our hope is that children become aware of this service, and they feel a sense of relief that tools exist to help take the images down.”
Bloomberg Law: Social Media, Porn Sites Targeted in States Seeking Age Checks: Dozens of proposals pending in statehouses across the country that aim to regulate a child’s experience online are raising concerns over the future of anonymity on the internet. Lawmakers are pushing a variety of bills aimed at boosting privacy protections for kids’ personal information, limiting their access to social media without parental involvement, or keeping them off of sites that include explicit content such as pornography. The measures would rely on companies like Meta Platforms, Inc., Alphabet Inc., and TikTok Inc. to know how old their online users are—posing the conundrum of determining age without gathering too much sensitive information about a person’s identity.
NBC News Washington: How Social Media and Screen Time Can Affect Children’s Mental Health: New research shows sites like TikTok may have a negative impact on children’s mental health. The algorithm is designed to keep users engaged longer, and studies show the more time kids and teens spend on social media, the more likely they are to be depressed. Psychiatrist Dr. Asha Patton-Smith of Kaiser Permanente offered guidance for parents.
New Jersey Monitor: N.J. legislators propose punishing social media companies for kids’ online addiction: For teenagers like Nidhi Das, social media became a cherished lifeline to friends during the pandemic’s early days. But as regular life resumed, Das didn’t like how tethered she felt to it. Social media became her go-to boredom buster, and even the misinformation that infects many platforms kept her swiping. “The algorithm, it curates to what you like. And people would make up little controversies, so that might encourage you, like ‘oh, let me look into that.’ Even if it’s not true, I still want to know like: ‘Oh, where did that stem from?’” said Das, 17, a high school senior from Lawrenceville. “The addicting thing is that there’s always something endlessly there, so you keep scrolling.”
ABC News: Supreme Court wrestles with immunity for social media companies: For the first time Tuesday, the U.S. Supreme Court wrestled with the scope of a landmark federal law that’s given sweeping legal immunity to internet and social media companies for more than 25 years. Section 230 of the Communications Decency Act — known in the tech world as the “26 words that created the modern internet” — protects the companies from liability for content posted by individual users, no matter how discriminatory, defamatory or even dangerous the information may be.
Patch: NYC Mayor: Investigate Social Media, Mr. President: New York City’s drill rap-dissing, burgeoning fuddy-duddy-in-chief has a request for America’s octogenarian commander-in-chief: investigate social media. “I don’t think that we have properly analyzed what social media is doing to us in general, specifically to our young people,” Mayor Eric Adams said Tuesday in response to a question about a teen’s recent subway surfing death. “I am hoping the president calls a national Blue Ribbon Commission to really analyze this thing that has really dropped into our lives.”
CBS News: ‘It’s not going away’: Local psychologist weighs in on proposal to ban kids from social media: Whether fascinated by Facebook or taken with Twitter, kids could get the boot from social media. Republican Senator Josh Hawley of Missouri introduced a bill banning children under 16 years of age from using social media. Hawley says big tech companies are neglecting children’s health and monetizing their personal information. The U.S. Surgeon General says kids aged 13 and younger shouldn’t even be on social media. “This skewed, and often distorted environment of social media often does a disservice to many of those children,” said Dr. Vivek Murthy.
NPR: 10 things to know about how social media affects teens’ brains: The statistics are sobering. In the past year, nearly 1 in 3 teen girls reports seriously considering suicide. One in 5 teens identifying as LGBTQ+ say they attempted suicide in that time. Between 2009 and 2019, depression rates doubled for all teens. And that was before the COVID-19 pandemic. The question is: Why now? “Our brains, our bodies, and our society have been evolving together to shape human development for millennia… Within the last twenty years, the advent of portable technology and social media platforms is changing what took 60,000 years to evolve,” Mitch Prinstein, the chief science officer at the American Psychological Association (APA), told the Senate Judiciary Committee this week.
AP: Ohio proposal: Get parents’ OK for kids to use social media: Ohio’s governor wants the state to require parental consent for kids under 16 to get new accounts on TikTok, Snapchat and other social media platforms. Republican Gov. Mike DeWine’s two-year budget proposal would create a law that social media companies must obtain a parent’s permission for children to sign up for social media and gaming apps. The proposal also names YouTube, Facebook and Instagram, but the proposal would apply broadly, to “any online web site, online service, online product, or online feature that requires consumer consent to register, sign up, or otherwise create a unique username.”
CBS News: Florida lawmakers mull HB 591, which aims to protect children from cyberbullying, sex trafficking: Florida lawmakers gathered Tuesday in Tallahassee to advocate for House Bill 591, legislation that aims to protect juveniles from falling victim to cyberbullying and sex trafficking. Citing a rise in the number of minors suffering from anxiety and depression, State Rep. Michele Rayner-Goolsby, Rep. Tyler Sirois and State Sen. Shevrin Jones sponsored the bill to protect the youth from online harassment. Jena McClure, a mother of three, said she supports the implementation of this bill after witnessing her children and their friends fall victim to bullying. She said it happens often to children everywhere.
Roll Call: Social media companies put profits over children, senators say: Senators sounded off against social media platforms and called for action during a Senate Judiciary Committee hearing on Tuesday, saying the companies lack accountability and are focused on profits at the expense of children. The hours-long hearing touched on an array of issues, including: the harms of cyberbullying, the scourge of child sexual abuse material on social media, and mental health issues among youth. It also underscored how there is bipartisan support for taking action on social media platforms — even in a narrowly divided Congress.
Patch: Sexting Education Program Aims To Keep Chester Kids Digitally Safe: Online sexual extortion of minors is on the rise as technology becomes more prevalent in everyday life, and the Chester Police Department wants to remind parents of the online safety precautions they and their children should take to stay safe. Chester Police Detective Lieutenant Chris Cavanagh has partnered with the Chester School District to present a parent evening titled “Keeping your child safe from Child Exploitation – it all starts with the device” on Feb. 15, at 6:30 p.m., at the Black River Middle School.
NBC News: How one teen is urging legislators in Washington state to help protect kids from being exploited on vlogs: A Washington state teenager is advocating for a bill to protect the privacy of the children of influencers. Chris McCarty, 18, a freshman at the University of Washington, said they wanted to advocate for children’s right to privacy online after having learned about influencer Myka Stauffer, who shared extensive, intimate content about her adopted son before she relinquished custody because of his medical needs. McCarty, who uses they/them pronouns, started the site Quit Clicking Kids to spread awareness and urge fellow advocates to take action in their own states. When they were a senior in high school last year, they cold-emailed multiple state legislators and eventually worked with state Rep. Emily Wicks to craft HB 2023, which was re-introduced as HB 1627 for this year’s legislative session.
News Press Now: Drug dealers targeting kids through social media: A popular social media platform is facing lawsuits from families for its role as a tool for drug dealers to dispense fentanyl to young people. Families of more than 50 overdose victims have filed a lawsuit against Snapchat. According to the lawsuit, from 2020-2022, Snapchat was allegedly a conduit for more than 75% of the fentanyl poisoning deaths of teens between the ages of 13 to 18. Local experts expressed concern over social media being a wide-open platform for dealers because they can sell drugs to people from anywhere in the country.
Journal News: Ohio governor seeks law requiring social media companies to get parental consent for kids’ accounts: Social media companies like TikTok and Facebook would be required to get verified parental consent before allowing a child under age 16 to have an account, according to a law proposed by Ohio Gov. Mike DeWine in his new budget.
The Columbus Dispatch: Ohio may require kids to get parental consent to use TikTok, Facebook, other social media: Ohio could soon make it easier for parents to restrict their children’s access to TikTok, Snapchat and other apps. Part of Gov. Mike DeWine’s two-year budget proposal would require social media companies to get parental consent before allowing kids under age 16 to use their platforms. They would be tasked with creating a splash page that verifies the user’s age and obtains the necessary consent from a parent or guardian.
KGO-TV (Utah): Why expert says Utah’s social media ID verification bill could lead to nationwide privacy issues: What steps are you willing to take to use social media? Would you pay for the platform or agree to all terms and conditions? What if one of those conditions were to upload a copy of your government identification card? That’s exactly what could happen in the State of Utah with Senate Bill 152 – “the social media regulation act”. If passed, Utah residents will have to upload their ID to prove they are over the age of 18 to use all platforms. For those under 18, a parent ID is needed to verify the account.
WCPO-TV (Ohio): DeWine seeks law requiring social media companies to get parental consent for kids’ accounts: Social media companies like TikTok and Facebook would be required to get verified parental consent before allowing a child under age 16 to have an account, according to a law proposed by Ohio Gov. Mike DeWine in his new budget. “Social media companies are running platforms that are addicting our children, harming our children and we need more parental involvement,” said Ohio Lt. Gov. Jon Husted, who is taking the lead on the effort and spoke about it during a Dayton visit on Wednesday.
WTVG-TV (Ohio): Ohio bill would require kids under 16 to have parental permission before joining social media: A new piece of legislation presented to the Ohio General Assembly last week, called the Social Media Notification Act, would require kids aged 15 years old and younger to have parental permission before joining certain online platforms. Lieutenant Governor Jon Husted is pushing for the proposal. “These tech companies have created these apps that are designed with algorithms to addict your children to these platforms and collect data on them. These platforms are not being used for virtuous reasons,” says the Lieutenant Governor. “They (parents) would be able to observe more things and they would know what exact platforms their children are on and see who their children are talking to and are connected to. They can see what kind of influences people can have on all their children, I think that would be really beneficial,” says Sarah Koralewski.
Bloomberg Law: California Bill to Let Parents Sue Social Media Gets Second Try: California lawmakers are attempting again to hold social media companies liable for addicting child users to their product, a renewed effort that will face fierce resistance from the tech industry. “This legislation is like throwing more fuel on the flames created by the legislature last session,” said Carl Szabo, vice president of NetChoice, which represents Meta, Google, and other tech companies. State Sen. Nancy Skinner (D) last week introduced SB 287, which would subject a company to penalties of up to $250,000 per violation, an injunction, and litigation costs and attorney fees. Her bill is similar to widely watched state legislation last year that would have allowed the attorney general and local district attorneys to file civil suits against social media companies for knowingly putting in designs or algorithms that will addict kids.
American Academy of Pediatrics: Center of Excellence: Creating a Healthy Digital Ecosystem for Children and Youth: This National Center of Excellence will serve as a centralized, trusted source for evidence-based education and technical assistance to support the mental health of children and adolescents as they navigate social media. The American Academy of Pediatrics (AAP) Center of Excellence: Creating a Healthy Digital Ecosystem for Children and Youth is dedicated to promoting healthy social media use and pediatric mental wellbeing. Social media use starts during childhood and can play a significant role in the relationships and experiences that impact the growth, development and mental health of children and teens.
Deseret News (Utah): Op-ed: Teenage social media addictions — what parents don’t know and can’t track: Utah Gov. Spencer Cox “compared social media companies to pharmaceutical companies that make opioids” as reported in a recent Deseret News article. Children and teenagers spend less time with supportive groups and their families due to internet usage. Extensive social media usage is harming the younger generation. Social media companies were aware of this concern but did not share these details with the public. Children and teens consistently using social media are at greater risk for cyberbullying, online harassment, sexting and depression.
NBC News: Sen. Josh Hawley wants to create a legal age to be allowed on social media: Sen. Josh Hawley, R-Mo., intends to make his focus in the current Congress a legislative package aimed at protecting children online — including by setting the age threshold to be on social media at 16. In an interview with NBC News, Hawley detailed some top lines of what his agenda will include, such as commissioning a wide-ranging congressional mental-health study on the impact social media has on children. “For me, this is about protecting kids, protecting their mental health, protecting their safety,” Hawley said. “There’s ample evidence to this effect that big tech companies put their profits ahead of protecting kids online.”
Tech Crunch: TikTok is crushing YouTube in annual study of kids’ and teens’ app usage: For another year in a row, TikTok has found itself as the social app kids and teens are spending the most time using throughout the day, even outpacing YouTube. According to an ongoing annual review of kids’ and teens’ app usage and behavior globally, the younger demographic — minors ranging in ages from 4 through 18 — began to watch more TikTok than YouTube on an average daily basis starting in June 2020, and TikTok’s numbers have continued to grow ever since. In June 2020, TikTok overtook YouTube for the first time, with kids watching an average of 82 minutes per day on TikTok versus an average of 75 minutes per day on YouTube, according to new data from parental control software maker Qustodio.
KMO-TV: ‘It’s an addiction’: Parents, teens navigate self-esteem, safety of social media: The Parkway School District hosted a national speaker Monday night, helping parents better monitor their children’s social media usage, as teens turn to popular apps to communicate and share photos of their lives. The event featured a conversation with Erin Walsh of the Spark & Stitch Institute and included research in the fields of child and adolescent development along with digital media. “The research is pretty nuanced on this,” said Erin Schulte, Coordinator of Counseling and Character Education for Parkway Schools. “It would be nice if it was simple, like this is all bad, keep them away. But it’s not, it can be used for good things.”
KBTX-TV: Focus at Four: Experts say social media breaks are critical for mental well-being: 82% of the U.S. population currently uses social media. Studies have shown that reducing social media use to just 30 minutes a day can lead to improved mental health and well-being. Experts say that excessive use of social media platforms is found to have a much greater negative impact. “We’ve seen that social media use is associated with eating disorders, particularly in female adolescents,” said Dr. Pete Loper, a triple board-certified physician in pediatrics, psychiatry, and child psychiatry. “It’s associated with increased depression and anxiety. It is also associated with increased self-harm thoughts, particularly in our children, and adolescents.”
WLS-TV: Our Chicago: TikTok’s CEO to testify before Congress and how social media impacts kids’ wellbeing: More than two dozen states have now banned TikTok on government owned devices. It’s also now illegal for the app to be on any federal phone. All of this over concerns about data privacy, national security and there are on-going studies about the app’s impact on the mental well-being of young people. It was announced recently that TikTok’s CEO will testify before the House Energy and Commerce Committee in March. Illinois U.S. Rep. Jan Schakowsky sits on that committee, and said she’s “really looking forward to quizzing the CEO and getting more information.” “But, I certainly have my concerns,” Schakowsky said. “There’s no question that TikTok, which is used mostly by young people, which adds to the concern, is doing the kind of surveillance and looking into the private information. Too much information is collected by these platforms and social media companies. But, we worry about TikTok because of the relationship with the Chinese government.”
The Salt Lake Tribune: Editorial: Social genies are out of the bottle, Editorial Board writes. Let’s prepare our kids to handle them: Childhood and adolescence have always been fraught with danger. Parents have been at their proverbial wit’s end since the primary hazard was a sabertooth tiger. These days, one such fright is “social media,” which can mean a lot of things but generally refers to platforms such as Twitter, Instagram and TikTok. Apps on smartphones that can take the human need to communicate and hype it into addictive brain candies that, at their worst, carry messages of bullying, body shaming and other darts that can lead to depression or even suicide. But it is just as true that, ever since Professor Harold Hill warned the good people of River City, Iowa, about their children frequenting pool halls and “memorizing jokes from Capt. Billy’s Whiz Bang,” whatever is new and scary about a culture provides an avenue for con men and well-meaning busybodies to offer protection for our little dears.
FOX 35 (Orlando): Proposed bill aims to restrict social media usage in Florida classrooms: A proposed house and senate bill is targeting the use of social media in schools. One of the bills would prevent the use of any social media in K through 12 schools if you are using their network. The bills would also require teachings on the good, bad and ugly sides of social media. “It’s digital fentanyl for our children,” said Florida’s Chief Financial Officer Jimmy Patronis. Patronis feels social media is having an adverse effect on Florida’s youth. He supports SB 52 and HB 379, which would restrict students’ access to it in the classroom.
WOWT-TV (Nebraska): The FBI is warning parents tonight about a rise in sextortion complaints.: The FBI has issued a new warning to Omaha parents after seeing an increase in reports of adults tricking children into sending explicit content through social media. Todd Dicaprio with the FBI is referring to it as “sextortion.” It is when an adult portrays himself as a minor to manipulate children through social media platforms to get them to send sexual pictures and videos to sell online. “We receive on average one to two referrals per week of a child who has been exploited in some sexually suggestive manner online,” Dicaprio said.
The New York Post: ‘Tranquilizer challenge’ ODs land 15 grade school students in hospital: Viral internet stunts continue to endanger the lives of young people: More than 15 students in Mexico were forced to undergo treatment after overdosing on drugs as part of a dangerous online Clonazepam “tranquilizer challenge.”
NBC News: Top Health Officials Urge Parents To Keep Kids and Teens Off Social Media Apps: “If you look at the guidelines from the platforms, at age 13 is when kids are technically allowed to use social media,” said U.S. Surgeon General Vivek Murthy. “I personally, based on the data I’ve seen, believe that 13 is too early.” “Too young for social media” – that’s what health officials are saying about 13-year-olds, despite 13 being the standard age requirement for several social media platforms.
Salon: 13-year-olds should not be on social media, surgeon general warns: As anyone who has either raised or been a teenager in the 21st century can tell you, social media is omnipresent in modern youth culture. Whether it is finding new music on TikTok or finding new friends on Fortnite, teenagers use social media to connect with their peers, express their individuality and participate in a global community. Yet this new technological and social paradigm brings with it grave concerns: social media spaces that youth frequent are rife with bullying, misinformation and bigotry, which can have a detrimental effect on the self-esteem of developing young minds.
The Washington Post: Analysis: A new bill would ban anyone under 16 from using social media: A growing number of U.S. policymakers and federal officials are angling to keep children and young teenagers off social media entirely, citing mounting concerns that the platforms may harm their well-being and mental health. It’s a notable escalation in the rhetoric around keeping kids safe online, which has largely focused on setting new digital protections. The push gained traction after the U.S. Surgeon General Vivek Murthy told CNN on Sunday that he believes 13 is “too early” for kids to be joining apps like Instagram and TikTok, which he said can create a “distorted environment” that “often does a disservice” to kids.
Good Morning America: Excessive screen time during infancy may be linked to lower cognitive skills later in childhood: The amount of time babies spend watching computer, TV and phone screens in their first year of life may be indirectly linked to lower cognitive skills later in life, according to a new study. Babies who watched on average two hours of screen time per day performed worse later on, at age 9, on executive functions, according to the study, which was published Monday in the journal JAMA Pediatrics.
NBC News: Sen. Dick Durbin urges DOJ to review Twitter’s handling of child exploitation: Senate Judiciary Committee Chair Dick Durbin urged Attorney General Merrick Garland in a letter Tuesday to review Twitter’s handling of child exploitation material, calling the Justice Department’s failure to address the issue “unacceptable.” “Sadly, Twitter has provided little confidence that it is adequately policing its platform to prevent the online sexual exploitation of children,” Durbin, D-Ill., wrote. “This puts children at serious risk.” The letter cites reporting from NBC News that found dozens of Twitter accounts and hundreds of tweets using numerous hashtags to promote the sale of child sexual abuse material (CSAM). Some of the tweets were brazen in how they marketed the material, using common terms and abbreviations for CSAM. After the article was published, Twitter said that it was blocking access to several hashtags associated with the posts.
Chicago Tribune: After study finds social media may change pre-teens’ brain wiring, psychologist advises time limits, IRL activities: A new study showing a correlation between frequent checking of social media and neurological sensitivity to social cues in young people underscores the importance of in-person interactions — in other words, talking to people face-to-face instead of on a screen — and setting boundaries around technology and social media use, a pediatric psychologist at Advocate Children’s Hospital said. The study, which was published Jan. 3 in JAMA Pediatrics, tracked the brain activity of about 170 sixth through eighth graders who reported checking Facebook, Instagram and Snapchat at varying frequencies. It found that young people who checked social platforms more frequently had a higher “neural sensitivity to anticipation of social rewards and punishments.”
Denver 7 TV: Is keeping teens off social media unrealistic?: Even though 13-year-olds can sign up for accounts, whether they should is a different question. On Sunday, Surgeon General Vivek Murthy told CNN that he believes age 13 is too young to be on social media. University of Michigan data from 2021 indicate that many children have social media accounts before reaching 13. According to a survey conducted by the University of Michigan, 49% of parents of children ages 10-12 report their kids having social media accounts. With so many children online, Sarah Clark, a research scientist in the Department of Pediatrics at the University of Michigan, questions whether it is realistic to ask parents to outright ban their children from social media. Instead, she encourages setting parameters to promote safe social media usage.
The Cleveland Clinic: Why Social Media Challenges Can Be a Recipe for Disaster — When They’re Real: It’s almost impossible to make it through childhood and adolescence without making questionable — and often downright foolish — decisions. Pushing boundaries and taking risks is part of growing up. We do the best we can to insulate our kids from risk, but they’re always finding new and innovative ways to get hurt. Social media definitely isn’t helping. It amplifies the power of peer pressure, and rewards dangerous risk-taking with likes, shares and empty promises of insta-fame. “It’s tricky because teens can get positive reinforcement with all the likes and views from the videos they post,” says pediatric emergency medicine specialist Purva Grover, MD. “So, the more risky or shocking, the greater the possibility that more people will see it.”
KSL-TV: Utah lawmakers want age restrictions on social media platforms: A Senate committee took the first steps toward regulating social media platforms in the state, advancing a bill that would require minors to get parental consent before signing up for social accounts. SB152 is one of several bills in the Utah Legislature aimed at tech giants this year, after Gov. Spencer Cox made social media regulation one of his top issues ahead of the legislative session. Earlier this month, Cox threatened to regulate social media companies due to the alleged harm to children and announced plans to sue major tech platforms last week. Cox’s brother-in-law, Sen. Mike McKell, R-Spanish Fork, is sponsoring the bill, which would require social media companies to use age verification to prevent minors from signing up without their parent’s permission and would prohibit companies from collecting or selling personal data of minors.
The Hill: Surgeon general: 13-year-olds too young to join social media: Surgeon General Vivek Murthy on Sunday cautioned that, despite many app guidelines, 13-year-olds are too young to join social media. “What is the right age for a child to start using social media? I worry that right now, if you look at the guidelines from the platforms, that age 13 is when kids are technically allowed to use social media. But there are two concerns I have about that. One is: I, personally, based on the data I’ve seen, believe that 13 is too early,” Murthy said on CNN’s “Newsroom.” Twitter, Facebook, Instagram and other top social media platforms allow users age 13 and older to join, create their own profiles and share and consume content.
Forbes: ‘We Can’t Look Away’: Documentary ANXIOUS NATION Explores The Rise In Anxiety In Children: Like adults, children feel worried from time to time. It’s normal. But when a child’s anxiety interferes with his or her school, home or social life, it’s time for professional help. The moving documentary, Anxious Nation, delves into the increased rates of anxiety among children and adolescents, and appeals for the urgent need for compassionate and science-based treatment and care. After attending a screening at the Palm Springs International Film Festival, I spoke with filmmakers and cast members about the mental illness epidemic among some of society’s most vulnerable individuals.
CNN: Children’s mental health tops list of parent worries, survey finds: Forty percent of US parents are “extremely” or “very” worried that their children will struggle with anxiety or depression at some point, a new survey finds. The Pew Research Center report said mental health was the greatest concern among parents, followed by bullying, which worries 35% of parents. These concerns trumped fears of kidnapping, dangers of drugs and alcohol, teen pregnancy and getting into trouble with the police. Concerns varied by race, ethnicity and income level, with roughly 4 in 10 Latino and low-income parents and 3 in 10 Black parents saying they are extremely or very worried that their children could be shot, compared with about 1 in 10 high-income or White parents.
Axios: Surgeon general: 13-year-olds too young to join social media platforms: Surgeon General Vivek Murthy said on “CNN Newsroom” on Saturday he believes 13-year-olds are too young to join social media and that being on those platforms does a “disservice” to children. The big picture: Scientists have warned of a connection between heavy social media use and mental health issues in children, saying that the negatives outweigh the positives. Instagram, Snapchat and Twitter all allow users ages 13 or older on their platforms. TikTok users in the United States who are younger than 13 can use the platform, albeit with a safety setting for children that limits the information collected from them, as well as prevents them from messaging other users or allowing others to see their user profile.
CNN: Surgeon General says 13 is ‘too early’ to join social media: US Surgeon General Vivek Murthy says he believes 13 is too young for children to be on social media platforms, because although sites allow children of that age to join, kids are still “developing their identity.” Meta, Twitter, and a host of other social media giants currently allow 13-year-olds to join their platforms. “I, personally, based on the data I’ve seen, believe that 13 is too early … It’s a time where it’s really important for us to be thoughtful about what’s going into how they think about their own self-worth and their relationships and the skewed and often distorted environment of social media often does a disservice to many of those children,” Murthy said on “CNN Newsroom.”
New York Post: Surgeon general warns 13 is too young for children to be on social media: Surgeon General Vivek Murthy warned that children join social media too early and believes they should only be allowed to access the platforms once they’re between 16 and 18. Platforms such as TikTok, Instagram and Twitter currently allow users to join as long as they are at least 13 years old. Murthy believes this can cause adolescents to have a “distorted” sense of self during their crucial developmental years. “I, personally, based on the data I’ve seen, believe that 13 is too early,” Murthy said on CNN.
FOX 5 (Washington, D.C.): Parents push for Congress to address Snapchat drug dealers: Parents testified this week at a House hearing on Capitol Hill where they called on both Congress and tech companies to do more to fight the opioid crisis in this country. With the rise of overdoses involving children, lawsuits are now being filed against social media companies, such as Snapchat, for putting children in danger. Parents of some of these teens are putting increased pressure on lawmakers and these social media platforms to put better measures in place to stop online drug dealers from gaining access to kids.
NBC Chicago: Illinois School Warns Parents About App That Puts Students in Potential Stranger Danger: An Illinois school put out a warning to parents about a social media app that school officials believe many students are using and that could be putting them in dangerous situations with strangers. The free app, called Omegle, randomly pairs users with others from around the world to talk “one-on-one” anonymously. Users can add interests that will allow the app to pair them with someone who shares similar interests.
KLBK-TV (Texas): Republican congressman calls for nationwide social media ban for kids, teens: A Republican congressman says social media is so harmful for kids and teens that they should be banned from using it, just like kids aren’t allowed to drink or smoke. Congressman Chris Stewart says he hasn’t officially introduced his bill to ban social media for kids under 16 because he’s working on building up support behind the scenes first. “It’s destroyed their sense of self-worth, and their confidence and their sense of hope in the future,” Rep. Chris Stewart (R-UT) said. Studies show social media leads to an elevated risk of depression and suicide as Stewart noted, “nearly a third of our young people age 14-24 have considered suicide and have discussed how they would commit suicide with a friend.”
Deseret News: Should children under 16 be denied access to social media apps?: Tweens and teens spend as much as nine hours a day scrolling through social media, gaming, online shopping, video chatting and texting on their cell phones. And an increasing amount of evidence suggests all that screen time is taking a toll on their mental health. “The statistics are clear we’ve got a generation of young people that are the most distressed, anxious, depressed and tragically suicidal than any generation in our history,” said Rep. Chris Stewart, who was recently named co-chairman of the bipartisan Mental Health Caucus in Congress. The rise in anxiety and depression, he says, can be almost directly correlated to when Facebook bought Instagram in 2012 and began marketing initially to girls and then boys as young as 9. The Chinese app TikTok, he said, was designed as “emotional heroin” for young people.
WGN-TV (Chicago): Suburban man arrested for kidnapping 3 Ohio children, use of social media: A Beach Park man is facing charges for kidnapping three Ohio children after communicating with them through an online platform for weeks. Michael Negron, 19, is charged with one count of kidnapping and three counts of child endangerment, according to the Lake County State’s Attorney’s Office. It is still unclear what Negron’s intentions were. According to police reports, a parent from Middletown, Ohio, called the Lake County Sheriff’s Office Saturday afternoon about their missing children, 12- and 14-year-old girls, and the girls’ friend, a 15-year-old boy.
WANE-TV (Indiana): Deadly social media ‘blackout challenge’ resurfaces, more child deaths reported: The resurgence of a social media trend has become a nightmare for several families who have lost children to the “game,” with reports of more children dying. The “blackout challenge,” also known as the “choking game” or “pass-out challenge,” encourages users to choke themselves with belts, purse strings or other similar items until passing out. It dates back to at least 2008, when the Centers for Disease Control and Prevention noted that 82 children across 31 states died from the mid-1990s to the mid-2000s as a result. Most of the kids who died were between 11 and 16.
The Salt Lake Tribune: Why Utah Gov. Cox and AG Reyes plan to sue social media companies: Utah Gov. Spencer Cox, alongside Utah Attorney General Sean Reyes, announced that the state would take legal actions against social media companies to address, they say, the harm that digital platforms are doing to the mental health of Utah’s youth. “Without strong action on our part, social media companies will simply not make the changes necessary to protect our children,” Cox said in a news conference on Monday. He alleged that social media apps are designed so that users won’t want to put them down. Neither Cox nor Reyes would specify which social media companies would be sued or what particular claims potential litigation would address. No lawsuits have been filed at this time.
PC Mag: The Most Toxic Online Platforms: Are Your Kids on Them?: Kids are now born into a world with social media, as well as a tangled web of images, games, users, and algorithms that make it nearly impossible for parents to know everything they’re doing. A new study by ExpressVPN asked over 2,000 children in the US and the UK about the biggest issues they’re facing online and on which platforms. The top problems kids reported experiencing are somebody being rude or swearing at them (34%), seeing scary videos (31%), and seeing scary photos (26%). Their parents, roughly 2,000 surveyed adults, gave slightly different answers.
KUTV-TV: Utah parents support social media ban after video of child’s attack posted online: Kylee and Adam Taylor said their daughter was brutally attacked at her own Utah school twice, and in one instance, video of the assault made the rounds on Instagram and TikTok. Now, the Taylors strongly support Congressman Chris Stewart’s proposal for a federal ban on social media for children younger than 16. “Her lips were cut up, bruising on her face,” said Kylee, of her daughter’s injuries. “Both times she was checked for concussions.” “She was punched, kicked, grabbed by her hair, thrown to the ground,” added Adam. “It’s traumatic, especially when you get the call and your daughter is crying.”
KUTV-TV: Utah lawmaker to introduce new bill on federal social media ban for teens under 16: Congressman Chris Stewart did not head to Washington to solve our nation’s mental health crisis, but it has become one of his areas of focus. Six months ago, Rep. Stewart’s bill designating 9-8-8 as the universal number for the National Suicide Prevention Hotline was signed into law. It took two years to get the bill passed and the hotline ready for callers. The nation’s children and teens are his latest focus, with a new bill set to be released next week. The bill seeks to ban children under the age of 16 from using social media sites like Facebook, TikTok and Instagram.
The Wall Street Journal: Op-ed: Republicans and Democrats, Unite Against Big Tech Abuses: The American tech industry is the most innovative in the world. I’m proud of what it has accomplished, and of the many talented, committed people who work in this industry every day. But like many Americans, I’m concerned about how some in the industry collect, share and exploit our most personal data, deepen extremism and polarization in our country, tilt our economy’s playing field, violate the civil rights of women and minorities, and even put our children at risk. As my administration works to address these challenges with the legal authority we have, I urge Democrats and Republicans to come together to pass strong bipartisan legislation to hold Big Tech accountable. The risks Big Tech poses for ordinary Americans are clear. Big Tech companies collect huge amounts of data on the things we buy, on the websites we visit, on the places we go and, most troubling of all, on our children. As I said last year in my State of the Union address, millions of young people are struggling with bullying, violence, trauma and mental health. We must hold social-media companies accountable for the experiment they are running on our children for profit.
FOX 5: VIDEO: Managing stress, anxiety, and screen time for children: Stress and anxiety can have negative impacts on your children physically, mentally, and emotionally. Plus, while social media can make people feel more connected, too much screen time can lead to health concerns like sleep or behavioral issues. Child psychologist Dr. Joseph McGuire with the Johns Hopkins Children Center, joined Fox 45 News with tips for parents to help their children navigate stress and anxiety while also managing screen time.
WHNT-TV: Deadly social media ‘Blackout Challenge’ resurfaces, nine children die: A social media trend has become a nightmare for several families after losing their children to the “game” – with at least nine children under the age of 14 dying for the dare of “how long can you hold your breath.” The “Blackout Challenge,” also known as the “Choking Game” or “Pass-Out Challenge,” dates back to at least 2008, when the CDC reported that 82 children across 31 states had died attempting it. Most of the kids who died were between 11 and 16. In 2021, the “challenge” resurfaced on TikTok, which led the viral video app to ban #BlackoutChallenge from its search engine. The social media giant is already facing a wrongful death lawsuit after a 10-year-old Italian girl was declared brain dead. She had allegedly tied a belt around her throat to self-asphyxiate.
Fortune: Is America overreacting to TikTok with all of its new bans at high schools and colleges? Probably not.: A growing number of public schools and colleges in the U.S. are moving to ban TikTok – the popular Chinese-owned social media app that allows users to share short videos. They are following the lead of the federal government and several states that are banning the social media app because authorities believe foreign governments – specifically China – could use the app to spy on Americans. The app is created by ByteDance, which is based in China and has ties to the Chinese government. The University of Oklahoma, Auburn University in Alabama and 26 public universities and colleges in Georgia have banned the app from campus Wi-Fi networks. Montana’s governor has asked the state’s university system to ban it.
Rice University: Three out of four parents say social media is a major distraction for students, according to new study: The vast majority of parents believe social media is a major distraction for students, according to a new nationwide study. The online study, conducted in November and December, surveyed a nationally representative sample of more than 10,000 parents of K-12 students. An overwhelming majority from across racial groups—African American (70%), Asian (72%), white (75%), Hispanic/Latino (70%)—agreed that social media is a distraction. Parents of children who attend private schools (82%) were more likely to see social media as a distraction than parents of children in public schools (73%) or charter schools (73%) or those being homeschooled (67%). Interestingly, parents with children in high school (74%), middle school (73%) and elementary school (73%) were equally concerned about the issue.
WCIV-TV (South Carolina): School district warns parents on the possible dangers of social media: Monitoring social media starts at home. That’s the message Berkeley County School District is sending to their students’ parents. The Berkeley County School District’s Office of Security and Emergency Management has hosted several informational meetings on the possible dangers of social media. Parents learn they are the gatekeepers to their child’s electronic experience. “This is part of life. It’s not going anywhere. It’s here to stay,” said Cheretha Kinlaw-Hickman, Security and Emergency Management Officer with BCSD. “And if you’re going to use it, we just want to be responsible and safe in how we use these social media apps and being online in general.”
CBS News: New phone allows parents to see everything their kids do online: A company says it has a solution for parents giving phones to their children for the first time. It’s a custom-built Android device called Aqua One from the company Cyber Dive. The specially made phone gives parents the ability to track everything their kids do online. Using an app on their own phones, parents can track a mirrored version of their child’s phone. That means parents can see every text their child types, what videos they are watching and which social media apps they are using. Creator Jeff Gottfurcht says there are just too many apps out there that have become a danger to kids and Cyber Dive’s phone will allow parents and their kids to have an open dialogue about what’s safe and what’s not.
Chalkbeat: As Seattle schools sue social media companies, legal experts split on potential impact: A notable new lawsuit against social media industry leaders by the Seattle school district has left legal experts divided on how the case will unfold. The complaint — which alleges that the school district and its students have been harmed by social media’s negative effects on youth mental health — could lead to sweeping changes in the industry, one expert said. Or, as others expect, it could fizzle out with little chance of winning in court. Seattle Public Schools alleges that the companies — which include Meta, Google, Snapchat, and ByteDance, the company behind TikTok — designed their platforms intentionally to grow their user bases and “exploit the psychology and neurophysiology of their users into spending more and more time on their platforms,” according to a complaint filed earlier this month.
Roll Call: White House, House GOP take aim at Big Tech, but see different targets: President Joe Biden and Republican lawmakers last week launched yet another effort to confront thorny issues relating to Big Tech and social media platforms that have bedeviled previous administrations and Congress, but the path to progress this time around is just as murky. In two high-profile opening salvos of the 118th Congress, the two sides showed how far apart they are starting. Aside from a glimmer of overlap on protections for minors and the market power of the big tech companies, the two sides aren’t offering much promise of legislation. Biden used a Jan. 11 op-ed in The Wall Street Journal to call on Congress to pass federal data privacy legislation, especially to protect children, and prevent ads targeting them, modify U.S. law on social media content moderation policies, and change antitrust policy to bring more competition into the tech industry.
The Wall Street Journal: The U.K.’s Online Safety Bill aims to better protect adults and children from viewing certain online content.: British legislators are set to approve a draft of an extensive new social-media bill that could see the chief executives of major tech firms held criminally liable if they don’t protect children from certain content online. As the U.K. moves closer toward enacting new legislation that technology companies say is too restrictive, its Online Safety Bill aims to better protect adults and children from viewing certain online content, including fraud, revenge porn and sexual abuse. The proposed law, expected to win approval this week by the House of Commons, will force tech companies to remove content deemed illegal or content that is barred by their own terms and conditions, or face fines or legal action. The bill would then go to the U.K.’s upper chamber, the House of Lords, in February, where it could be revised further, and become law by year-end.
GeekWire: Audio: Seattle Schools vs. Social Media: What’s at stake in the suit against TikTok, Instagram, and others: As a tech reporter based in Seattle, I certainly took notice, and I wasn’t alone. After GeekWire broke the story last weekend, it made national news. Here are some of the key points to know: Seattle Public Schools is suing the social media giants for damages stemming from what the suit describes as a youth mental health crisis in Seattle and across the country. That crisis, the suit alleges, has been caused by the deliberate actions of the companies in deploying algorithms designed “to maximize engagement by preying on the psychology of children.”
NPR: AUDIO INTERVIEW: Why 2 Seattle area school districts are suing 5 social media companies: The school districts allege that the companies’ practices have led to increased anxiety, depression, eating disorders and bullying among children.
Seattle Times: Opinion: Seattle schools take social media giants to court: Social media can often be more aptly characterized as antisocial media. Purveyors of conspiracy theories, misinformation, misogyny, white supremacy and antisemitism thrive in these supposedly sociable swaths of the Internet. Beyond toxic politics, social media has also become a 21st century venue for teenage bullies and bad boyfriends, mean girls and malicious rumors. The often fragile psyches of adolescents do not always fare well in this online toxic environment and many people blame social media for a big spike in cyberbullying, prolonged depression and suicide attempts among young Americans.
Ars Technica: Schools sue social networks, claim they “exploit neurophysiology” of kids’ brains: Seattle schools argue that defendants are not protected by Section 230 of the Communications Decency Act, which says providers of interactive computer services cannot be treated as the publisher or speaker of information provided by third parties. Seattle schools are not claiming that the social networks are publishers, the lawsuit said. “Plaintiff is not alleging Defendants are liable for what third parties have said on Defendants’ platforms but, rather, for Defendants’ own conduct,” the lawsuit said. “As described above, Defendants affirmatively recommend and promote harmful content to youth, such as pro-anorexia and eating disorder content. Recommendation and promotion of damaging material is not a traditional editorial function and seeking to hold Defendants liable for these actions is not seeking to hold them liable as a publisher or speaker of third-party content.”
TODAY SHOW: Teens love the anonymous new Gas app: Here’s what parents should know: Teens can anonymously see who likes them, and more, on the hottest new social app for students.: There’s a new social media app captivating teens. Using the Gas app, users can anonymously compliment their friends (or secret crushes), and the app is gaining steam among young users. NBC News correspondent Savannah Sellers reports on TODAY that 1 in 3 teens are using the app and more than 1 billion compliments have been shared, according to Gas app founder Nikita Bier. So, how does it work? Gas app users can log on and compliment, or “gas up,” their friends. Users take a series of polls about their friends, with questions ranging from thoughtful to flirty. “You sign up, join your high school and or you sync your contacts, so we can find your friends,” Bier told Sellers. Bier says people have drawn comparisons to other anonymous apps that are plagued by bullying. “The distinction with Gas is that we author all the content so that you’re answering polls that are generally uplifting and positive, and that’s kind of the aim of the product,” Bier says.
ABC News: School district sues social media giants for ‘creating a youth mental health crisis’ Seattle Public Schools filed a lawsuit against Alphabet Inc., Meta Platforms, Inc., Snap Inc. and TikTok-owner ByteDance.: Seattle Public Schools, the largest school district in the state of Washington, filed a lawsuit Friday against multiple social media giants, in an effort to hold the companies “accountable for the harm they have wreaked on the social, emotional, and mental health of its students,” the district claimed. “It has become increasingly clear that many children are burdened by mental health challenges. Our students — and young people everywhere — face unprecedented learning and life struggles that are amplified by the negative impacts of increased screen time, unfiltered content, and potentially addictive properties of social media,” Seattle Public Schools superintendent Brent Jones said in a statement. “We are confident and hopeful that this lawsuit is the first step toward reversing this trend for our students, children throughout Washington state, and the entire country.”
Axios: Social media’s effects on teen mental health come into focus: Experts are increasingly warning of a connection between heavy social media use and mental health issues in children — a hot topic now driving major lawsuits against tech giants. Why it matters: Seattle Public Schools’ recently filed lawsuit against TikTok, Meta, Snap and others — which accuses the social media giants of contributing to a youth mental health crisis — is one of hundreds of similar cases. Driving the news: Some scientists who study technology’s effects on children say the negatives far outweigh any positives. “There is a substantial link to depression, and that link tends to be stronger among girls,” Jean Twenge, a psychology professor at San Diego State University and leading expert on the subject, tells Axios.
Axios: Podcast (Transcript): The escalating fight over Big Tech and kids: Seattle Public Schools filed a lawsuit accusing Big Tech of helping cause a youth mental health crisis. It’s going after TikTok, Meta, Snap and other companies in one of many cases that seek to hold social media platforms responsible for harm to children. Guests: Axios’ Ashley Gold, Sophia Cai and Andrew Freedman. NIALA: Good morning! Welcome to Axios Today! It’s Wednesday, January 11th. I’m Niala Boodhoo. Here’s what we’re covering today: more deaths in California as winter storms rage on. Plus, what we know about the classified documents found from Biden’s VP days. But first: the escalating fight over Big Tech and kids. That’s today’s One Big Thing.
The New York Times: Three-Quarters of Teenagers Have Seen Online Pornography by Age 17: Sexually explicit content has become so prevalent online that teenagers are deluged, according to a new report by a nonprofit child advocacy group.: The internet has transformed pornography, making it much easier to view and share than in the days of Playboy magazine and late-night cable television. For teenagers, that’s created a deluge of sexually explicit photos and videos that has invaded their everyday lives, according to a report released on Tuesday. Three-quarters of teenagers have viewed pornography online by the age of 17, with the average age of first exposure at age 12, according to the report by Common Sense Media, a nonprofit child advocacy group. Teenagers are seeing the photos and videos on their smartphones, on their school devices and across social media, pornography sites and streaming sites, it said.
Reuters: Seattle public schools blame tech giants for social media harm in lawsuit: Seattle’s public school district filed a lawsuit against Big Tech claiming that the companies were responsible for a worsening mental health crisis among students and directly affected the schools’ ability to carry out their educational mission. The complaint, filed on Friday against Alphabet Inc, Meta Platforms Inc, Snap Inc and TikTok-owner ByteDance with the U.S. District Court, claimed they purposefully designed their products to hook young people to their platforms and were creating a mental health crisis. In emailed statements to Reuters, Google said it has invested heavily in creating safe experiences for children across its platforms and has introduced “strong protections and dedicated features to prioritize their well being,” while Snap said it works closely with many mental health organizations to provide in-app tools and resources for users and that the well-being of its community is its top priority. Meta Platforms and TikTok did not immediately respond to Reuters’ request for comment. In the past, the companies have said they aim to create an enjoyable experience for users and exclude harmful content and invest in moderation and content controls.
AP: Seattle schools sue tech giants over social media harm: The public school district in Seattle has filed a novel lawsuit against the tech giants behind TikTok, Instagram, Facebook, YouTube and Snapchat, seeking to hold them accountable for the mental health crisis among youth. Seattle Public Schools filed the lawsuit Friday in U.S. District Court. The 91-page complaint says the social media companies have created a public nuisance by targeting their products to children. It blames them for worsening mental health and behavioral disorders including anxiety, depression, disordered eating and cyberbullying; making it more difficult to educate students; and forcing schools to take steps such as hiring additional mental health professionals, developing lesson plans about the effects of social media, and providing additional training to teachers.
Good Morning America (ABC News): Social media use linked to brain changes in teens, study finds: A new study has identified a possible link between frequently checking social media and brain changes that are associated with having less control of impulsive behaviors among young adolescents. Using MRI brain scans, researchers at the University of North Carolina found that teens who frequently checked social media were more likely to see increased activation in the regions of the brain that regulate reward centers and those that may play a role in regulating decision-making around social situations. The study, published Tuesday in JAMA Pediatrics, looked at nearly 200 young people in sixth and seventh grades.
WTVD-TV (North Carolina): VIDEO: Social media is changing how children’s brains develop, UNC researchers find: Researchers at the University of North Carolina released the results of one of the first ever long-term studies on child brain development and technology use. The study specifically looked at middle school students in North Carolina and the impact social media had on their brain development. Researchers said the evidence shows constant checking of a social media feed increased sensitivity to peer feedback. The 169 students underwent yearly brain imaging sessions over three years, which showed researchers that the children had become hypersensitive to feedback from their peers. The researchers published their results in JAMA Pediatrics. Ultimately, what this means for the future of social media and childhood development remains unclear. Even the authors of the study said the results are not necessarily good or bad.
The New York Times: Social Media Use Is Linked to Brain Changes in Teens, Research Finds: The effect of social media use on children is a fraught area of research, as parents and policymakers try to ascertain the results of a vast experiment already in full swing. Successive studies have added pieces to the puzzle, fleshing out the implications of a nearly constant stream of virtual interactions beginning in childhood. A new study by neuroscientists at the University of North Carolina tries something new, conducting successive brain scans of middle schoolers between the ages of 12 and 15, a period of especially rapid brain development.
The Hill: Study finds social media use may impact youth brain development: Advocates and parents have raised concerns about the potential health effects of social media on teens and children for years. A new study carried out in rural North Carolina shows habitually checking social media platforms may lead to long-term changes in adolescent brain development. Specifically, researchers found different social media checking habits were linked with changes in youths’ brains, altering how they respond to the outside world. Data suggest those who checked the sites and apps more than 15 times per day became hypersensitive to peer feedback.
Psychology Today: 5 Ways Parents Can Keep Kids Safe Online: The metaverse, artificial intelligence, virtual and augmented reality, ChatGPT; new technologies are coming in faster than a parent can say, “Put down that phone!” Rather than anguishing over what you may or may not know about these digital innovations, here are five easy ways to help keep your kids safe in 2023. 1. Stop focusing on “screen time.” Focus on “screen use” instead. During every presentation I gave last year, parents were laser-focused on one concern: “screen time.” I sincerely hope we move past this in 2023 because focusing on “time” rather than “use” disregards so many benefits of technology. For example, using a screen to do research or to say “hi” to Grandma is vastly different from doom-scrolling endless TikTok videos (although this might be “educational” too, but more on that in a moment). I don’t believe there is a parent on the planet who wants their child missing out on doing online research or visiting with a geographically-distant relative.
Patriot-News: Op-ed: Prioritize your family’s digital wellness this holiday season: The holiday season presents parents with unique challenges. From festive celebrations like office parties, holiday light displays, last-minute shopping trips, and concerts, most families’ schedules are jam-packed with activities right now. After navigating through the COVID-19 pandemic, the hectic pace of the current holiday season is, for many, a welcome return to normalcy. However, this time can also serve as a catalyst for stress. According to a Dec. 1 poll from the American Psychiatric Association, 31 percent of adults admitted that they expect to feel more stressed this holiday season than last year. When adults are stressed, family routines often fall by the wayside. As a father, I understand that adhering to structure can be very difficult this time of year, especially when children get extended time off from school for the holidays.
Forbes: VIDEO: Child Online Privacy Protections Cut From Congress’ Spending Bill — Despite Last-Minute Push: A pair of bills designed to strengthen online protections for children was left out of a fiscal year 2023 spending plan Congress is aiming to pass this week, despite heightened concerns about online privacy and an advocacy campaign by parents whose children’s deaths have been tied to Internet activity.
Huffington Post: How To Ask People Not To Share Photos Of Your Kids On Social Media: The digital record of a child born this century often begins before birth, when a parent shares a grainy sonogram image. By the time the child is old enough to open their own social media accounts, there may already be hundreds of images of them throughout cyberspace, searchable by name, geotag location and facial recognition technology. But an increasing number of parents are opting out of this “sharenting” norm of documenting all of their child’s milestones on social media. They may post no photos of their child at all or only photos in which their child’s face is not visible. Some parents block out their child’s face in group photos or make public requests that others not post images of their child.
Los Angeles Times: Column: Social media platforms must stop the exploitation of child performers. Now: YouTube has a major child labor problem. Just read Amy Kaufman and Jessica Gelt’s recent Times investigation into the lawsuit facing YouTube star Piper Rockelle and her mother, Tiffany Smith. Instagram and TikTok have child labor problems too, as do any social media platforms from which children (and their parents) derive income. As should be self-evident, when people make money on these platforms, “social” takes a backseat to “media.” When kids make money by producing content for a media company in California, they are — or should be — protected by the state’s laws, which mandate, among other things, limited hours, on-site education and a state-licensed teacher or social worker present on set at all times.
Newsweek: Op-ed: Teen Social Media Screen Time Should Concern Parents: Smartphones have always posed a range of challenges for parents of teens. From social media apps and excessive screen time to explicit content and mental health problems, the digital world often seems as threatening as the physical. A new Pew Research study shows that when it comes to teens and their smartphone use, parents might be worried too much about certain problems, and not worried enough about others. The study shows that about half (46 percent) of parents of teens are worried about their teen being exposed to explicit content online. This is a valid concern, of course. Adults know explicit content is ubiquitous online and can be damaging to see. But there are ways to mitigate the spread of explicit content, from changing the settings in their kids’ phone to preventing and monitoring such content with apps like Bark.
Press Release: Department of Justice – U.S. Attorney’s Office – Western District of Pennsylvania: The United States Attorney’s Office for the Western District of Pennsylvania, in partnership with Homeland Security Investigations – Philadelphia (HSI), the Federal Bureau of Investigation – Pittsburgh (FBI), and the National Center for Missing and Exploited Children (NCMEC), is issuing a public safety alert regarding an alarming increase in the online exploitation of children and teens. Reports of the online enticement of minors have dramatically spiked in recent months—including reports of sextortion.
CBS Evening News: When should you get your child a cellphone?: Cellphones are a popular gift during the holiday season, but the debate remains: What’s the best age for your child’s first phone? Craighton and Emily Berman are considering getting their 12-year-old son, Henry, a cellphone. He’s only allowed one hour of recreational screen time per day on the computer. “My wife and I have been kind of struggling with it,” Craighton Berman said. “Because there’s a lot packed into that phone. We all know digital technology and social media kind of destroys us. So I’m just trying to figure out how to destroy him a little less.”
PEW Research: Teens and Cyberbullying 2022: Nearly half of U.S. teens have been bullied or harassed online, with physical appearance being seen as a relatively common reason why: While bullying existed long before the internet, the rise of smartphones and social media has brought a new and more public arena into play for this aggressive behavior. Nearly half of U.S. teens ages 13 to 17 (46%) report ever experiencing at least one of six cyberbullying behaviors asked about in a Pew Research Center survey conducted April 14-May 4, 2022. The most commonly reported behavior in this survey is name-calling, with 32% of teens saying they have been called an offensive name online or on their cellphone. Smaller shares say they have had false rumors spread about them online (22%) or have been sent explicit images they didn’t ask for (17%).
New York Post: My kids were digitally kidnapped — here’s how parents can be more careful: Mommy bloggers beware. Mother of two Meredith Steele, 35, is warning parents to stop sharing photos of their children online after her family was “digitally kidnapped.” The terrifying phenomenon occurs when a stranger steals a parent’s social media snaps to use on their own accounts and live out a fake life online. “My kids had new names and new identities,” Steele said of the ordeal. “They [the culprit] had made their own captions and made their own lives. It was like they were playing with Barbie dolls but the dolls were my kids.” “This changed my mind about sharing my stuff online,” the Maine mama told South West News Service. “Mommy blog culture normalizes oversharing intimate personal details of your kids and they aren’t old enough to agree or disagree with it.”
NBC News: Democratic senator questions Twitter’s handling of child safety under Elon Musk: Senate Judiciary Committee Chair Dick Durbin, D-Ill., sent a letter to tech billionaire Elon Musk on Friday expressing concern that Twitter’s approach to child safety had “rapidly deteriorated” since Musk bought the social media site in October. The letter follows reports from several news outlets, including NBC News, about Musk’s eliminating the jobs of people at Twitter who worked to prevent child sexual exploitation and disbanding a board of outside experts who advised Twitter on its efforts to address exploitation. Durbin wrote that he was not convinced by Musk’s recent pledge that addressing child sexual exploitation content was “Priority #1.”
PEW Research: Explicit content, time-wasting are key social media worries for parents of U.S. teens: Parents have a range of concerns when it comes to their teenagers using social media, with access to explicit content and time-wasting ranking among those at the top of the list, according to a Pew Research Center survey of parents of teens ages 13 to 17 conducted this spring. The survey also shows that a majority of parents are keeping a watchful eye on what their teens do on social media. Some are also imposing screen time restrictions on these sites. While social media has allowed people to easily seek out information, some say it has also made inappropriate and explicit content more accessible. Nearly half of parents of teens (46%) say they are extremely or very worried that their teen’s use of social media could lead to them being exposed to explicit content, according to the April 14-May 4, 2022, poll.
The Hill: Governors in Iowa, North Dakota and Alabama join GOP colleagues in banning TikTok for state employees: The Republican governors of three more states have joined the growing number of GOP governors who are banning TikTok among state government employees amid security concerns about the Chinese-owned social media platform. Alabama Gov. Kay Ivey, North Dakota Gov. Doug Burgum and Iowa Gov. Kim Reynolds each signed executive orders in the past two days to ban the app from state-owned devices. Republican governors in Maryland, South Dakota, Texas and Utah have already taken action to ban TikTok for state employees’ devices.
The New York Times: Research finds more negative effects of screen time on kids, including higher risk of OCD: A new study suggests that reliance on devices may hinder children’s ability to learn to regulate their emotions. Another linked video game use to a risk of obsessive-compulsive disorder.: Two new studies show associations between screen time and behavioral and psychological risks for children, adding to a growing body of evidence that excessive use of smartphones and other devices can be deleterious to their health. In one study, researchers reported a link between screen time and higher rates of obsessive-compulsive disorder diagnoses among preteens. In the other, the results suggested that using electronic devices to calm youngsters when they’re upset may inhibit their ability to learn to soothe themselves, leading to more frequent, intense emotional outbursts.
The New York Times: How to Use Parental Controls on Your Child’s New Phone: The holiday season is here, and if you’ve decided to give in and get your child a smartphone or tablet, you may be nervous about safety, supervision and screen time. Software can’t solve everything, but it can help. Here are a few of the tools available to help parents or caregivers guide children’s first solo steps into the digital age. First, Set the Rules.
CBS News (Minnesota): What are the concerns about using TikTok? Should parents tell their kids to delete it?: A popular app for entertainment and news is now banned on government devices in Maryland, Nebraska, South Carolina, South Dakota and Texas. Public employees in those five states can’t have TikTok on their work phones, computers or tablets. The reason is for security concerns, given TikTok’s owner – ByteDance – is a Chinese company. The FBI is also sounding the alarm about the social media platform. Aynne Kokas, an author and the director of the University of Virginia East Asia Center, broke down some of the concerns. “The first is the type of data that TikTok, as an app, is able to gather about our usage of the technologies,” Kokas said.
60 Minutes (CBS News): VIDEO: More than 1,200 families suing social media companies over kids’ mental health: When whistleblower Frances Haugen pulled back the curtain on Facebook last fall, thousands of pages of internal documents showed troubling signs that the social media giant knew its platforms could be negatively impacting youth and was doing little to effectively change it. With around 21 million American adolescents on social media, parents took note. Today, there are more than 1,200 families pursuing lawsuits against social media companies including TikTok, Snapchat, YouTube, Roblox and Meta, the parent company to Instagram and Facebook. More than 150 lawsuits will be moving forward next year. Tonight, you’ll hear from some of the families suing social media. We want to warn you that some of the content in this story is alarming, but we thought it was important to include because parents say the posts impacted their kids’ mental health and, in some cases, helped lead to the death of their children.
60 Minutes (CBS News): Meet the teens lobbying to regulate social media: When Emma Lembke was a 12-year-old 6th grader, she was excited to join the world of social media. Here was a way to connect instantly to millions of people around the globe from her home in Birmingham, Alabama, she thought. Lembke was eager to express herself through an online persona and explore new information that she otherwise would not have access to. She first signed up for Instagram, and in the first week, she followed Oprah and the Olive Garden.
Forbes: Twitter Has Cut Its Team That Monitors Child Sexual Abuse: Even as Elon Musk has said that removing child sexual exploitation content from Twitter was “Priority #1,” the teams charged with monitoring for, and subsequently removing, such content have been reduced considerably since the tech entrepreneur took control of the social media platform. Bloomberg reported last month that there are now fewer than 10 people whose job it is to track such content – down from 20 at the start of the year. Even more worrisome is that the Asia-Pacific division has just one full-time employee who is responsible for removing child sexual abuse material from Twitter.
The Washington Post: Indiana sues TikTok, claiming it exposes children to harmful content: Indiana’s attorney general sued TikTok on Wednesday, claiming the Chinese-owned company exposes minors to inappropriate content and makes user data accessible to China, in one of the strongest moves against the social media giant taken by a state. Indiana’s lawsuit is the latest move to put TikTok and its parent company under scrutiny. As U.S. officials have sought to regulate TikTok, the platform in recent years has come under sharp questioning in Washington and been under investigation by a bipartisan group of attorneys general for its potential effects on youth mental health, its data security and its ties to China.
Forbes: Amazon Alexa Wants To Put Your Child To Bed With Generative AI Storytelling: While researchers applaud Amazon’s safeguards to ensure the tech is safe for kids, some experts are concerned that generative AI could lead children to believe these algorithms are more intelligent than they actually are. Generative AI, which is known for churning out fantastical art based on text prompts, is now sneaking into one of the most sacred bonding experiences for parents and children: bedtime storytelling.
CNBC: Op-ed: I raised 2 successful CEOs and a doctor. Here’s the No. 1 skill I wish more parents taught their kids today: Parenting expert: The No. 1 thing every parent should teach their kids. Developing skills like curiosity, kindness and emotional intelligence at a young age will help kids succeed as adults. But there’s one skill that parents aren’t teaching their kids enough of today: self-regulation. When kids learn to self-regulate, they better understand the importance of time and how to manage their own behaviors and actions. 1. Model a healthy relationship with technology. Think of the last time you were eating lunch while typing an email while listening to a podcast and checking your phone each time it dinged. We’ve all been there.
Los Angeles Times: How parents can help protect children from online catfishing and other digital dangers: The family of the Riverside teen girl who was tricked into a digital romance with a “catfishing” cop from Virginia want their devastating story to be a cautionary tale. “In this tragic moment of our family, our grief, we hope some good will come from this,” Michelle Blandin, the teen’s aunt, said this week. “Parents, please, please know your child’s online activity. Ask questions about what they’re doing and whom they are talking to; anybody can say they’re someone else.” Such incidents are too common, say experts who hope this one will serve as a reminder to parents about having important conversations early and often with children about online conduct. That is the best way, they say, to protect youth from the many dangers that can lurk on the internet, from both known and unknown predators, cyberbullying, sexual exploitation and other concerns. When should parents start talking about online safety?
Forbes: Our Kids’ Brains Hurt From Using Technology: The American Academy of Pediatrics recommends less than two hours of entertainment screen time per day for children and discourages the use of any screen media by children under two years of age. The psychology research bucket has been overflowing the last few years with indictments of technology and its deleterious impact on our mental and emotional well-being. Brain research and mental health studies are dovetailing on the conclusion that screen time—particularly social media use—is stressing our brains, specifically the engine of computation and mental functioning: the prefrontal cortex.
The Hill: Three things Congress should do now to protect kids and teens: In this April 9, 2020, photo, Lila Nelson watches as her son, sixth-grader Jayden Amacker, watches an online class at their home in San Francisco. The pandemic increased the amount of time kids and teens spend online, but some worry about the effects of media and technology on their outlook. With the start of the lame-duck session, Congress has a long to-do list in a short period of time. Among the important items that need immediate attention, Congress should not go home without making the internet a safer and healthier place for kids and teens. To their credit, committees in both the House and the Senate have dedicated time and energy to online privacy, health and safety over the past two years. There have been hearings and bipartisan markups, and the 117th Congress has gotten closer to passing comprehensive privacy legislation than any other. Still, Congress appears stuck when it comes to establishing guardrails for social media platforms.
NBC News: Ex-Virginia trooper dies in shootout after killing family of teen he had catfished, police say: A Virginia law enforcement employee was killed in a shootout with deputies in California after he allegedly killed the mother and grandparents of a teenage girl he had catfished online, police said Sunday. Austin Lee Edwards, a former trooper with the Virginia State Police who was working for the Washington County Sheriff’s Office, was accused of driving off with the girl after the killings in the Southern California city of Riverside on Friday, police said. It wasn’t clear if Edwards, 28, was a sworn officer when he allegedly killed 69-year-old Mark Winek; his wife, 65-year-old Sharie Winek; and their daughter, 38-year-old Brooke Winek. Washington County Sheriff Blake Andis did not immediately respond to a request for comment.
Pew Research Center: Connection, Creativity and Drama: Teen Life on Social Media in 2022: Society has long fretted about technology’s impact on youth. But unlike radio and television, the hyperconnected nature of social media has led to new anxieties, including worries that these platforms may be negatively impacting teenagers’ mental health. Just this year, the White House announced plans to combat potential harms teens may face when using social media.
The New York Times: Children’s Groups Want F.T.C. to Ban ‘Unfair’ Online Manipulation of Kids: My Talking Tom, an animated video game featuring a pet cat, is one of the most popular apps for young children. To advance through the game, youngsters must care for a wide-eyed virtual cat, earning points for each task they complete. The app, which has been downloaded more than a billion times from the Google Play Store, also bombards children with marketing. It is crowded with ads, constantly offers players extra points in exchange for viewing ads and encourages them to buy virtual game accessories.
Axios: Kids’ privacy online gets yearend push in Congress: Lawmakers from both parties who back stricter rules for handling kids’ data and accounts online see an opening in the last lame-duck weeks of this Congress. Why it matters: Passing a national online consumer privacy bill continues to be out of Congress’ reach, but protecting young people online has been one of the few areas in recent decades where Congress has been able to pass new tech regulations. Driving the news: The two laws best positioned to get rolled into big year-end legislative packages, according to advocates and lawmakers, are:
Forbes Health: Dear Pediatrician: What Is The Best Age For A Child’s First Smartphone?: Dear Pediatrician, My middle schooler really wants a smartphone, but I’m not so sure. He says that most of the kids in his class already have a phone, and he feels left out. I’m worried about him spending too much time on the phone. Plus, I’ve heard scary stories about kids sending inappropriate messages to one another. Is there a best age to give your child a smartphone? Dear Worried, Adding a smartphone to your child’s experience of the world is a big step. Having a supportive and thoughtful parent by their side increases their smartphone success. I commend you for thinking critically about when to introduce this tool to your child.
The Washington Post: Their kids’ deaths were tied to social media. They want Congress to act: Maurine Molak says her son David, then 16, took his own life after facing months of cyberbullying on social media platforms, which were slow to respond to their reports. “He could not make it stop. I couldn’t make it stop,” she said during an interview Tuesday.
Boston Globe: Teens and young adults are self-diagnosing mental illness on TikTok. What could go wrong?: Does Carly Smith have attention deficit hyperactivity disorder? She was tested as a child and the answer came back a definitive no. But this summer — battling anxiety and struggling to focus while working remotely in her Watertown apartment — she yearned for an explanation, and turned to a hot source of mental health info for teens and young adults: TikTok. There, Smith, 24, a junior account executive at a PR firm, found an ADHD influencer named Katie Sue, an appealing young woman with a big smile, a lot of what felt like answers, and — on her website — a link to make a donation.
FOX 59 (Indianapolis): Woman’s warning after online exploitation: A 19-year-old Indiana woman is recounting her traumatic experience of being sexually exploited as a child. The woman, who asked us to conceal her identity, was a victim of sextortion. She said she was just 12 years old when what seemed like innocent attention from strangers took a dark turn on the online chatting site Omegle. “They would just be like hey, how’s your day?” she explained. “Then after that, it would be straight to ‘what are you wearing?’” As with most sextortion cases, it progressed from talking to pictures to video chats. Oftentimes, they started the chat by showing their privates. She said she couldn’t tell how old some of them were, but estimates a lot of the men were in their 30s to 50s.
CNN Business: A guide to parental controls on social media: A little over a year ago, social media companies were put on notice for how they protect, or fail to protect, their youngest users. In a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat and Instagram faced tough questions from lawmakers over how their platforms can lead younger users to harmful content, damage mental health and body image (particularly among teenage girls), and lacked sufficient parental controls and safeguards to protect teens.
Forbes: Protecting Our Children In Cyberspace: What Are We Missing?: With final election results rolling in, one of the less talked about, yet vitally crucial, issues is the safety and wellbeing of the children in America – U.S. citizens without voting rights, whose voice is too often lost when it’s time to count the ballots. But that should not be the case. The last couple of months have been bustling with activity on the technology regulation front, with particular attention devoted to the protection of children in cyberspace. It started with the White House formally announcing its expansive federal tech policy reform, emphasizing the protection of young users. The US Supreme Court followed suit, when last month it granted certiorari in Gonzalez v. Google, a high-stakes case appealed from the Ninth Circuit about the scope of protection Section 230 of the Communications Decency Act gives tech companies against liability for the content on their platforms.
Sky News (UK/Britain): Instagram age verification: Social media giant to use automated analysis of video selfies to allow some UK users to ‘prove their age’: From today, anyone who tries to edit their date of birth by changing it from under the age of 18 to over 18 will have to verify it by providing ID or a video selfie that will use age estimation technology.: Users of Instagram in the UK or EU will from now on see new age verification tools on the platform as part of a major safety update to protect children. From today, anyone who tries to edit their date of birth by changing it from under the age of 18 to over 18 will have to verify their age through ID or a video selfie, which will be examined by independent age estimation technology. Instagram said the new update would help ensure an age-appropriate experience for its users. Cyber safety campaigners have long been advocating for greater child protection, particularly after the Molly Russell inquest, which concluded last month that the 14-year-old girl died from an act of self-harm after being exposed to the “negative effects of online content”.
PEW Research: California’s New Child Privacy Law Could Become National Standard: A new California privacy law might fundamentally change how kids and teens use the internet — not only in California but also across the country. The first-in-the-nation legislation, which goes into effect in 2024, imposes sweeping restrictions on internet companies that serve minors, requiring that they design their platforms with children’s “well-being” in mind and barring eight common data-collection practices. Supporters of the bipartisan measure — including a range of privacy, consumer and children’s advocates — have compared it to longstanding consumer safety protections, such as seatbelts and nutrition labels. New York, Washington and West Virginia also have weighed child privacy bills, and Congress considered four such bills last year. While the Washington and West Virginia bills died in committee, the New York, Pennsylvania and federal bills remain under consideration.
The Hill: Advocates urge committee to advance Kids Online Safety Act: A joint letter sent by children’s online safety advocates urges Sen. Maria Cantwell (D-Wash.), Chair of the Senate Committee on Commerce, Science, and Transportation, to advance the Kids Online Safety Act (KOSA). The letter was organized by Fairplay, ParentsTogether and the Eating Disorders Coalition, and received more than 100 signatures from organizations and individuals concerned about the harmful impacts of social media on kids and teenagers. KOSA was first introduced in February 2022 and is sponsored by Sen. Richard Blumenthal (D-Conn.) and 11 others. In the letter, advocates call on Cantwell to “publicly commit to moving KOSA, (S.3663) as part of the omnibus spending bill before the end of the current session,” and request that she take time to talk to parents about the issue.
NBC News: Their children went viral. Now they wish they could wipe them from the internet: During the early months of the pandemic, Kodye Elyse started posting what she described as “normal mom quarantine content” on TikTok. Kodye Elyse, a cosmetic tattoo artist, said she “really wasn’t on social media” before then so she barely had any followers. Since her videos weren’t getting many views, she felt it “wasn’t a big deal” to have a public account to showcase their family life during lockdown, with many of the videos featuring her and her daughters dancing around the house. But the overwhelming response to one of Kodye Elyse’s first viral videos “convinced” her to take her kids offline entirely. The video started with Kodye’s then 5-year-old daughter. She then swapped places with Kodye Elyse to the beat of the music, and with a clever edit, appeared to transform into her mother.
The Deseret News (Utah): Op-ed: More tech, less teen happiness: the link between depression and tech use is especially troubling for children in nontraditional families, our new study found: Our teens are in crisis. The share of American high school students reporting “persistent feelings of sadness or hopelessness” has increased to nearly half of youth, according to the Centers for Disease Control and Prevention. That troubling news came on the heels of a report from Harvard’s Human Flourishing Program that the well-being of young adults has dramatically declined compared to older age groups. A host of factors are driving our kids to despair, from decreased social connection to increased worries about the future of the planet.
Newsweek: Op-ed: We Need Parents and Policy to Save Our Kids from Big Tech: It is now firmly established that social media are ruining the minds and bodies of America’s children. Facebook’s own internal studies find that among teens, especially teen girls, the company’s products lead to “increases in the rate of anxiety and depression.” Social media are designed to be addictive. Heavy use leads to sleep disorders, body dysmorphia, and suicidal thoughts. This should be enough reason for a sane society to stop, think, and change course. They are kids, after all, who deserve peace of mind and time with their loved ones undisturbed by digital encroachments. But we live in a technological age, in which the imperatives of Silicon Valley are given precedence over everything, including the well-being of children. So instead of sending our kids a life raft, we are packing their bags for the Metaverse, where their minds will be beyond reach.
Lancaster Online: LTE: Social media affects everyone’s well-being: (Written by Savannah Ginder, student at Conestoga Valley High School): “I just felt happier.” That’s what my friend said about giving up social media for a week. Instead of scrolling, she listened to podcasts, colored and went on walks. My teacher had a similar experience after she decided to get rid of TikTok. Social media can affect your well-being by creating a negative environment that leads to illnesses such as depression and anxiety. “The platforms are designed to be addictive and are associated with anxiety, depression, and even physical ailments,” states a report on the website of McLean Hospital, a leading psychiatric hospital in Massachusetts. No wonder both my friend and my teacher felt better after giving up social media.
Forbes: FDA: Here Are Dangers Of NyQuil Chicken And Benadryl Challenges On Social Media: If you are thinking about cooking your chicken in NyQuil, don’t. Just don’t. The same goes for trying to swallow enough Benadryl so that you can start hallucinating. These are not good ideas, no matter what someone on Instagram, TikTok, Facebook, Face-meta, Meta-Face, or whatever your social media of choice may be called. But apparently enough people have been doing such things that the U.S. Food and Drug Administration (FDA) has felt the need to issue a warning about the dangers of doing such things.
Forbes: The Latest Attempt To Address The Online Data And Privacy Crisis: Some crises strike companies quickly, are addressed by corporate executives, and soon fade from the spotlight. Other crises capture the public’s attention but are eventually placed on the back burner, unresolved. But they can get moved to the front at any time. Consider the case of the online data and privacy crisis, which made international headlines a year ago when whistleblower Frances Haugen told Congress that Facebook and Instagram negatively impacted the mental health of teenagers. Not surprisingly, there were several rounds of accusations and finger-pointing over who was to blame for the crisis, the extent of the impact of social media on mental health, and what had or should be done about it.
ABC 27: VIDEO: Pennsylvania bill would require porn filter on children’s devices: A bill introduced in the Pennsylvania State House would require a filter on children’s mobile devices to prevent access to pornography. The bill introduced by Rep. Jim Gregory (R-Blair) would require cellular carriers to switch on filters for new smartphones and tablets activated in Pennsylvania. Gregory says the bill “mirrors” legislation signed in Utah, which doesn’t go into effect until multiple states enact similar legislation. The American Civil Liberties Union of Utah contended that the constitutionality of the Utah bill was not adequately considered and that it will likely be challenged in court. Gregory argues that Pennsylvania should follow several other states that have proposed similar legislation.
Time Magazine: Social Media Has Made Teen Friendships More Stressful: Public health data signals a genuine crisis in adolescent mental health: rising rates of anxiety, depression, and hopelessness. But as we worry about tweens and teens who are struggling, we can’t ignore another mounting toll—the burdens that are shouldered by their friends and peers in an “always on” world. We have studied teens and tech for over a decade. Still, what we learned in our most recent study stopped us in our tracks. We collected perspectives from more than 3,500 teens on the best and trickiest parts of growing up in a networked world, and we co-interpreted these perspectives alongside other teens who helped us make sense of what we were hearing.
Axios: Why social media companies moderate users’ posts: Facebook, Twitter and other online services set rules for users’ posts not just to flag individual statements, but more broadly, to ensure they’re complying with the law, to help define their businesses and to protect their users. Driving the news: Public debate over online speech peaked again with Kanye West’s ban from Twitter and Elon Musk’s willingness to bring Donald Trump back to that service if he becomes its owner. But public understanding of why social networks moderate content remains murky. Obeying the law: Social media networks have to follow local laws like everyone else.
CBS 21: Talking to your child about dangerous internet trends like ‘one chip challenge’: October is Cyber Security Awareness Month, so there’s no better time to shine a light on a shocking internet trend Harrisburg School District just banned for putting kids in the hospital. The “One Chip Challenge” is making its rounds on social media, particularly on TikTok. It’s been around for a few years, but people are having serious reactions to the 2022 edition of the chip. You can buy the chip at a convenience store or find it online. It costs a whopping $9. People eat a single spicy chip and wait as long as they can to eat or drink anything else. Then, they post the video of the challenge on social media.
The Washington Post: ‘Responsible social media’ council looks to bridge divides on tech: Public officials in Washington for years have sparred along partisan lines over whether social media platforms take down too much or too little hate speech and misinformation. A council launching this week aims to sidestep those disputes by proposing reforms that tackle issues of bipartisan concern, including children’s safety and national security.
AP: White House unveils artificial intelligence ‘Bill of Rights’: The Biden administration unveiled a set of far-reaching goals Tuesday aimed at averting harms caused by the rise of artificial intelligence systems, including guidelines for how to protect people’s personal data and limit surveillance. The Blueprint for an AI Bill of Rights notably does not set out specific enforcement actions, but instead is intended as a White House call to action for the U.S. government to safeguard digital and civil rights in an AI-fueled world, officials said. “This is the Biden-Harris administration really saying that we need to work together, not only just across government but across all sectors, to really put equity at the center and civil rights at the center of the ways that we make and use and govern technologies,” said Alondra Nelson, deputy director for science and society at the White House Office of Science and Technology Policy. “We can and should expect better and demand better from our technologies.”
New York Post: ‘School photo’ social media trend could leave kids vulnerable to predators: Police: As students adjust to returning to school this fall, law enforcement members and online safety experts are reminding parents to be cautious about the information they share on social media. It may give predators access to children and scammers access to personal information. “We’re not saying not to share,” Deputy Sheriff Tim Creighton of the McHenry County Sheriff’s Office in Woodstock, Illinois, recently told Fox News Digital. “I have people to this day on my feeds. They are sharing way too much information.” “Less is better,” he said. “Your close friends and family know the important details about your kids, such as the town they live in, the school they go to, their full name. Strangers don’t need to know that.”
The Hill: Why ‘sharenting’ is sparking real fears about children’s privacy: For parents, grandparents and caregivers, snapping a photo of their child and sharing it on social media may seem like a routine, harmless act. After all, being proud of your child and wanting to share that pride with loved ones is a completely normal and largely universal feeling. Unfortunately, this seemingly simple decision — to post a photo, video, or any other information about a child under 18 on social media or the internet in general — comes with a host of ethical and legal considerations, despite the innocent intention behind the action. “Sharenting,” or parents sharing their child’s likeness or personal information on the internet, has grown in popularity alongside the advent of smartphones and social media. And this practice shines a light on the murky realm of children’s consent, digital data collection, targeted advertising, and real-world dangers resulting from parents’ online activities.
WXYZ-TV: Detroit mother sues Instagram for negatively affecting her 13-year-old child: A 2018 Pew Research Study found that 45% of teenagers are online almost constantly. 97% use a social media platform. A Johns Hopkins University study from 2019 shows that 12 to 15-year-olds in the U.S. who spend more than three hours a day on social media are likely to have a heightened risk for mental health problems. Now, a Detroit mother of a 13-year-old is suing Instagram and its parent company Meta claiming it had horrible effects on her daughter. The plaintiff, known as L.H., had been on Instagram since the age of 11 and was a “heavy user” according to a 123-page federal complaint.
The Hill: California passes bill requiring social media companies to consider children’s mental health: California’s legislature has passed legislation that will require social media companies to consider the physical and mental health of minors who use their platforms. Assembly Bill 2273 passed in the state’s Senate chamber in a 75-0 vote on Tuesday. The proposed legislation is headed to the desk of California Gov. Gavin Newsom (D), though it is unclear whether Newsom will sign the legislation into law, The Wall Street Journal reported. The California Age-Appropriate Design Code Act, which was first introduced by state representatives Buffy Wicks (D), Jordan Cunningham (R) and Cottie Petrie-Norris (D), will “require a business that provides an online service, product, or feature likely to be accessed by children to comply with specified requirements.”
The New York Times: An Apple Watch for Your 5-Year-Old? More Parents Say Yes.: Florian Fangohr waffled for about a year over whether to buy an Apple Watch SE as a gift. The smart watch cost $279, and he worried that its recipient would immediately break or lose it. In May, he decided the benefits outweighed the costs and bought the gadget. The beneficiary: his 8-year-old son, Felix. Mr. Fangohr, a 47-year-old product designer in Seattle, said he was aware that many people were pessimistic about technology’s creep into children’s lives. But “within the framework of the watch, I don’t feel scared,” he said. “I want him to explore.” Felix, a rising third grader, said he actually wanted a smartphone. “But the watch is still really, really nice,” he said.
The New York Times: Sweeping Children’s Online Safety Bill Is Passed in California: Social media and game platforms often use recommendation algorithms, find-a-friend tools, smartphone notices and other enticements to keep people glued online. But the same techniques may pose risks to scores of children who have flocked to online services that were not specifically designed for them. Now California lawmakers have passed the first statute in the nation requiring apps and sites to install guardrails for users under 18. The new rules would compel many online services to curb the risks that certain popular features — like allowing strangers to message one another — may pose to child users. The bill, the California Age-Appropriate Design Code Act, could herald a shift in the way lawmakers regulate the tech industry. Rather than wade into heated political battles over online content, the legislation takes a practical, product-safety approach. It aims to hold online services to the same kinds of basic safety standards as the automobile industry — essentially requiring apps and sites to install the digital equivalent of seatbelts and airbags for younger users. “The digital ecosystem is not safe by default for children,” said Buffy Wicks, a Democrat in the State Assembly who co-sponsored the bill with a Republican colleague, Jordan Cunningham. “We think the Kids’ Code, as we call it, would make tech safer for children by essentially requiring these companies to better protect them.”
ABC News: What parents should know before sharing back-to-school photos online: Katy Rose Prichard, a popular mom influencer on social media, speaks out about how images of children’s faces can be used in ways you never imagined. It’s become a cherished tradition among parents every August and September: sharing back-to-school photos on social media with family members and friends as a new school year kicks off. The trend has been a mainstay on social media, with parents posting pictures of their kids holding signs that showcase details like their child’s age, grade, school, teacher or afterschool activities, and the photos are an easy way to keep loved ones updated. But although it may seem harmless, privacy and security experts say parents and caregivers need to be aware of the inherent risks of sharing pictures and identifiable information online.
CNBC: Randi Zuckerberg says she’s a ‘big proponent of the real world’ when it comes to parenting: Randi Zuckerberg says she’s a “big proponent of the real world” — especially when it comes to protecting children from technology. Speaking at the Credit Suisse Global Supertrends Conference in Singapore earlier this month, Randi Zuckerberg, who is founder and CEO of Zuckerberg Media, discussed worries among many that the metaverse will take children further away from reality.
Los Angeles Times: Op-Ed: California’s fight for a safer internet isn’t over: This month, a bill to regulate social media services for children was rejected by California’s Senate Appropriations Committee without explanation. The proposed legislation, sponsored by Assembly members Jordan Cunningham (R-Paso Robles) and Buffy Wicks (D-Oakland) and called the Social Media Platform Duty to Children Act, would have allowed the state attorney general and local prosecutors to sue social media companies for knowingly incorporating features into their products that addicted children. The powerful tech industry lobbied for months to defeat the bill.
WLWT (Ohio): Experts share warning for parents about back-to-school social media posts: A warning for parents, as back-to-school social media posts could be putting your child at risk. Experts say some parents are posting too much personal information with their child’s back-to-school pictures. “I’m on Facebook, so I definitely see all the postings that are going on right now,” parent Selena Ramanayake said. “They actually have their school name on there, and age, and all this, and so sometimes I kind of have that, I don’t know, hesitation about should you be posting all that,” she said. Ramanayake is talking about popular social media posts of children holding signs that read details about their lives and school information.
The New York Times: LTEs: Should Kids Be Kept Off Social Media?: Yuval Levin’s suggestion is an interesting one, but experience tells us that kids are savvy at getting around age restrictions and safety guards. Kids today are forming connections using technology and growing up with a smartphone in their hands, so we must meet the moment by taking a holistic approach to keeping them safe online. We need to ensure that social media platforms are designed to protect children from bad actors. And we must support parents by providing them with tools to have effective communication with their kids about online safety. Age limits alone will not take the place of these two fundamental elements. Research shows that parents shy away from having difficult conversations about safety topics. For example, one recent survey shows that while the majority of parents have spoken with their kids about being safe on social media generally, less than a third have talked directly about sharing and resharing nude selfies. In short, parents need support so they can feel confident having early and judgment-free conversations with their kids. Platforms need to be proactive in designing their platforms with child safety in mind. And youth need access to modern, relevant education on these tough topics to reduce shame and create a safety net.
Harrisburg Patriot News: Dauphin County girl rescued from couple who lured her away via Instagram: police: A New York couple kidnapped a Dauphin County teenager last year after reaching out to her on Instagram and offering to do her makeup, court documents said. A 13-year-old girl’s mother reported her missing to Lower Swatara Township police after she’d been gone for several days in December 2021. The mother said her daughter had run away before, but usually came right back or was quickly found, Lower Swatara police said in an affidavit of probable cause. Investigators traced the 13-year-old’s Internet Protocol (IP) address on Instagram to a home in Amsterdam, New York, where Jeniyah D. Lockhart-Tippins and Neil T. Moore II lived, according to the affidavit. After she was rescued from the couple’s home, the 13-year-old told investigators Lockhart-Tippins followed her on Instagram and sent her a direct message, offering to do her makeup, the affidavit said.
Pittsburgh Tribune-Review: Cellphones in schools: Some districts take steps to eliminate devices from class while others balance benefits: Wake up. Check your phone. Go to class. Check your phone. Start homework. Check your phone. Go to bed. Check your phone. For some high schoolers, cellphone use is almost on par with blinking, with the average teenager raking in up to nine hours of screen time each day, according to the American Academy of Child & Adolescent Psychiatry. In the classroom, phones can serve as an educational tool or a pesky distraction. The latter rang true for six Western Pennsylvania schools — so much so that these schools will take steps to eliminate cellphone use from the classroom during the 2022-23 academic year.
WIRED: How to Use Snapchat’s Family Center With Your Kids: The social media platform just made it easier to find out who your children are interacting with online: To all the parents who want to know more about who your kids are talking to on their smartphones, I have good news and bad news. The good news: A prominent social media app recently made changes allowing parents and guardians to access more data on the children they care for who are ages 13 to 17. The bad news: You have to download Snapchat. Once it’s set up and your account is connected with those of your children, Snapchat’s new family center lets you see the child’s friend list, who they’re sending messages to, and report potential abuse. The family center does not let you peek into the content of their messages. Although the new feature allows you to see the approximate time your teen messaged someone during the past week, an exact timestamp isn’t provided.
Huffington Post: 7 Things You Should Ask Your Kids About Their Social Media Accounts: Parents may feel apprehensive thinking about their kids on social media, but the reality is young people regularly use platforms like Instagram, TikTok and Snapchat. A survey published by Common Sense Media in March 2022 found that 84% of teens and 38% of tweens say they use social media, with 62% of teens and 18% of tweens saying they use it every day. These numbers underscore the importance of talking to young people about these platforms and their experiences.
Forbes: What The Results Of 32 Studies Teach Us About Parenting In The Age Of Social Media: A new study published in the academic journal Current Opinion in Psychology offers a path forward for parents who are searching for better ways to navigate the nascent world of adolescent social media use. The authors argue that it is possible for parents to put guardrails in place that reduce pre-teen and adolescent anxiety and depression resulting from social media overconsumption, as well as minimize the negative effects of cyberbullying. Here is an overview of their recommendations.
U.S. News & World Report: How to Talk to Tweens About Being Responsible on Social Media: Posting questionable content online could affect your child’s future.: Social media users are getting younger. As screen time increased during the pandemic, so did social media use, especially among tweens, according to the latest report by Common Sense Media, a nonprofit research and advocacy group. Although most social media apps are intended for those 13 or older, nearly one in every five tweens, defined as those ages 8 to 12, reported being on social media daily. These platforms can have both positive and negative effects for young people, researchers say. As more kids access social media at younger ages, it’s increasingly important for parents and educators to help them learn how to stay safe and use social media responsibly. That includes teaching your kids that what they say online can have long-term consequences.
AP: California social media addiction bill drops parent lawsuits: A first-of-its-kind proposal in the California Legislature aimed at holding social media companies responsible for harming children who have become addicted to their products would no longer let parents sue popular platforms like Instagram and TikTok. The revised proposal would still make social media companies liable for damages of up to $250,000 per violation for using features they know can cause children to become addicted. But it would only let prosecutors, not parents, file the lawsuits against social media companies. The legislation was amended last month, CalMatters reported Thursday. The bill’s author, Republican Assemblymember Jordan Cunningham, said he made the change to make sure the bill had enough votes to pass in the state Senate, where he said a number of lawmakers were “nervous about creating new types of lawsuits.”
NPR: Snapchat’s new parental controls try to mimic real-life parenting, minus the hovering: Snapchat is rolling out parental controls that allow parents to see their teenager’s contacts and report to the social media company — without their child’s knowledge — any accounts that may worry them. The goal, executives say, is to enable parents to monitor their child’s connections without compromising teens’ autonomy. Named Family Center, the new suite of tools released Tuesday requires both caregiver and teen to opt in.
New York Post: ‘Victims of Instagram’: Meta faces novel legal threat over teen suicides: Meta is facing a fresh storm of lawsuits that blame Instagram for eating disorders, depression and even suicides among children and teens — and experts say the suits are using a novel argument that could pose a threat to Mark Zuckerberg’s social-media empire. The suits — which are full of disturbing stories of teens being barraged by Instagram posts promoting anorexia, self-harm and suicide — rely heavily on leaks by whistleblower Frances Haugen, who last year exposed internal Meta documents showing that Instagram makes body image issues and other mental health problems worse for many teens.
CNET: Kids Are Being Exploited Online Every Day – Sometimes at the Hands of Their Parents: On TikTok, Instagram and YouTube, some kids are making millions. But any child working as an influencer is at risk of exploitation.: Rachel Barkman’s son started accurately identifying different species of mushroom at the age of 2. Together they’d go out into the mossy woods near her home in Vancouver and forage. When it came to occasionally sharing in her TikTok videos her son’s enthusiasm and skill for picking mushrooms, she didn’t think twice about it — they captured a few cute moments, and many of her 350,000-plus followers seemed to like it. That was until last winter, when a female stranger approached them in the forest, bent down and addressed her son, then 3, by name and asked if he could show her some mushrooms. “I immediately went cold at the realization that I had equipped complete strangers with knowledge of my son that puts him at risk,” Barkman said in an interview this past June. This incident, combined with research into the dangers of sharing too much, made her reevaluate her son’s presence online. Starting at the beginning of this year, she vowed not to feature his face in future content.
Forbes: TikTok Moderators Are Being Trained Using Graphic Images Of Child Sexual Abuse: A largely unsecured cache of pictures of children being sexually exploited has been made available to third-party TikTok content moderators as a reference guide, former contractors say. Nasser expected to be confronted with some disturbing material during his training to become a content moderator for TikTok. But he was shocked when he and others in his class were shown uncensored, sexually explicit images of children.
Politico: Congress is closer than ever to reining in social media: The fallout from Facebook whistleblower Frances Haugen’s explosive testimony about social media’s threat to children before the Senate Commerce Committee last fall is coming into focus. There’s bipartisan support in Congress to ban targeted ads aimed at kids under 16, require tech firms to establish default safety tools to protect children online and give parents more control over their children’s web surfing.
New York Post: Online dangers are rampant for kids today: One of the most important jobs parents have today is keeping their children safe online. As moms and dads prepare to send their kids back to school soon, one critical item needs to be included on the checklist: checking out all online platforms their kids are using — and starting conversations early about cyber safety. Kids and teens between the ages of 8-18 spend about 44.5 hours each week in front of digital screens, according to the nonprofit Center for Parenting Education. This makes it crystal clear that parents need to be tuned in and very educated about what, exactly, their kids are doing during those hours.
CBS News: “It’s a crisis”: More children suffering mental health issues, challenges of the pandemic: According to the Mental Health Alliance, in 2022, fifteen percent of kids ages 12 to 17 reported experiencing at least one major depressive episode. That was 306,000 more than last year. “It’s bad. It’s a crisis,” said Katherine Lewis, a licensed family therapist at The Bougainvilla House, a nonprofit treatment center in Fort Lauderdale that describes itself as a safe place for children and youth to grow emotionally. To understand why children’s mental health is in such a fragile state, CBS4 was given rare access to the center.
Newsweek: Too Much Screen Time for Teens Leads to Mental Disorders, New Study Shows: Youngsters who spend a lot of time in front of a screen are at greater risk of developing behavior disorders, warned a new study. Social media is thought to have an especially strong influence and was most likely to be linked to issues such as shoplifting, scientists said. Watching videos and television, playing games, and texting were linked with oppositional defiant disorder (ODD), according to the findings published July 26 in the Journal of Child Psychology and Psychiatry.
Good Housekeeping: The Hidden Danger Behind TikTok’s “Product Overload” Cleaning Trend: TikTok is rife with cleaning inspiration, but one eyebrow-raising trend that has been building steam over the last year now has experts concerned about social media users’ safety. Appropriately known as “product overload” by those in the know, the trend — which involves users filming themselves loading up a toilet, bath or sink with copious amounts of astringent cleaning products — has become its own form of ASMR for what’s known as the “CleanTok” corner of the platform.
FOX 11 (Los Angeles): TikTok sued by parents of teen who blame platform for child’s eating disorder: Another lawsuit was filed Thursday against TikTok, this time by the parents of a girl who allege the social media platform’s content is responsible for their 13-year-old daughter’s severe eating disorder that required the child’s hospitalization and will affect her for life.
The Washington Post: Senate panel advances bills to boost children’s safety online: Senators took their first step toward increasing protections for children and teens online on Wednesday, advancing a pair of bipartisan bills that would expand federal safeguards for their personal information and activities on digital platforms. The push gained momentum on Capitol Hill last year after Facebook whistleblower Frances Haugen disclosed internal research suggesting that the company’s products at times exacerbated mental health issues for some teens.
ABC News: ‘Wren Eleanor’ TikTok trend sees parents removing photos, videos of their kids. An account featuring a 3-year-old has sparked a discussion on online safety.: A TikTok account with more than 17 million followers has sparked a discussion about children’s privacy and safety online.
ABC News: VIDEO: Parents sue TikTok after daughter dies attempting ‘blackout’ social media challenge: The parents speak exclusively to ABC News about a social media challenge called “blackout” — in which children choke themselves until they pass out. A Wisconsin family is suing TikTok after their 9-year-old daughter died attempting the so-called “blackout challenge” popularized on social media.
Fortune: Instagram and TikTok are wreaking havoc on our finances and happiness, new survey finds: You might have recently purchased athletic gear or a hoodie from an advertisement shared by an online retailer—and immediately regretted it. You’re far from alone. Social media impacts consumers’ spending habits, according to a new study by Bankrate, with nearly half of users admitting to making an impulse purchase based on a sponsored post.
TechCrunch: Kids and teens now spend more time watching TikTok than YouTube, new data shows: Kids and teens are now spending more time watching videos on TikTok than on YouTube. In fact, that’s been the case since June 2020 — the month when TikTok began to outrank YouTube in terms of the average minutes per day people ages 4 through 18 spent accessing these two competing video platforms.
Variety: TikTok Will Add Adult-Content Warning Labels to Videos With ‘Overtly Mature Themes’: TikTok is giving users of the popular app more controls over the kinds of videos they see in their feed — including flagging videos with “mature or complex themes” intended for viewers 18 and older. TikTok’s Community Guidelines detail categories of content that is banned by the platform, including nudity, pornography and sexually explicit content.
Forbes: TikTok: America’s Drug Of Choice: A recent report that TikTok’s American user data is routinely accessed by Chinese employees comes as no surprise. China’s global technology companies have long engaged in persistent data sharing thereby giving the Chinese government eyes and ears around the world.
New York Post: Alarming TikTok trend sees parents ask kids to help them fight: The first rule of Fight Club is you do not talk about Fight Club, but these parents are posting on TikTok. A new trend on the video-sharing app that involves parents asking their kids to defend them in a fight has divided users, with some saying it’s promoting violence in young children. The trend — and the hashtag #fightprank — has over 24.8 million views on TikTok.
TechCrunch: Children’s rights groups call out TikTok’s ‘design discrimination’: Research examining default settings and terms & conditions offered to minors by social media giants TikTok, WhatsApp and Instagram across 14 different countries — including the US, Brazil, Indonesia and the UK — has found the three platforms do not offer the same level of privacy and safety protections for children across all the markets where they operate.
New York Post: TikTok sued after young girls die in ‘blackout challenge’: TikTok is facing wrongful death lawsuits after two young girls killed themselves trying to recreate “blackout challenge” videos they watched on the platform. Lalani Erika Walton, 8, and Arriani Jaileen Arroyo, 9, both wound up dead after watching hours of the videos featuring the challenge fed to them by TikTok’s algorithm, the suits allege, the Los Angeles Times reported.
CNN: An FCC regulator wants TikTok removed from app stores. Here’s how a company executive responded: While TikTok’s short-form videos are entertaining, that’s “just the sheep’s clothing,” a Federal Communications Commission official said, and the app should be removed from app stores because of security issues. But a TikTok executive, in a rare interview on CNN’s “Reliable Sources” on Sunday, claimed there are no security concerns linked to the hugely successful app.
Forbes: Hugely Popular NGL App Offers Teenagers Anonymity In Comments About Each Other: A new app that allows Instagram users to send anonymous messages is soaring in popularity – and renewing concerns about cyberbullying and harassment that plagued previous apps allowing teens to comment on one another without attribution.
WTAE-TV: VIDEO: Charleroi man accused of luring three young girls through Snapchat: Police say Brandon Johnson, 35, drove girls to a Connellsville hotel: Connellsville police said a 35-year-old Charleroi man used Snapchat to lure three young girls to an area hotel last weekend.
US Attorney’s Office: Philadelphia Man Convicted of Sex Trafficking a Minor on Backpage.com: United States Attorney Jennifer Arbittier Williams announced that a man was convicted at trial of sex trafficking, arising from his forcible coercion of a minor to engage in prostitution. The defendant and the victim first met on a digital social networking application in June 2016.
WTAE-TV (Pittsburgh): North Dakota man accused of sexually exploiting 13-year-old Washington County girl: A North Dakota man has been indicted on charges of child pornography and sexual exploitation of a 13-year-old girl from Washington County. Nicholas Nesdahl, 27, was being held in a jail in North Dakota on Friday awaiting extradition to the Pittsburgh area. In October 2021, a woman reported to Peters Township police that she found troubling videos on her daughter’s cellphone.
PA Police Warn Of Dangerous TikTok Challenge With Gel Gun: Police departments all over are warning folks about a dangerous social media challenge urging users to shoot modified pellet guns at people.
PA State Rep. Hit By Pellets While Walking Dog: As multiple police agencies were investigating a shooting at Erie High School, Rep. Pat Harkins was walking his dog Barry, just several blocks away.
FBI Pittsburgh Warns of Increase in Sextortion Schemes Targeting Teenage Boys: The FBI Pittsburgh Field Office is warning parents and caregivers about an increase in incidents in the Pittsburgh area involving sextortion of teenagers. The FBI is receiving an increasing number of reports of adults posing as age-appropriate females coercing young boys through social media to produce sexual images and videos and then extorting money from them.
Dad Warns Parents After Son, 12, Dies from ‘Blackout Challenge’: ‘Check Out’ Your Kids’ Phones: “This is a weapon in our home that people don’t know about,” says Haileyesus Zeryihun.
Vague TikTok threats bring police presence to local schools: Law enforcement and schools are taking extra precautions amid an apparent TikTok trend threatening violence nationwide on Friday.
12-Year-Old Boy Who Burned 35 Percent of Body in TikTok ‘Fire Challenge’ Tells Kids ‘Not to Be a Follower’: Nick Howell spent almost six months in and out of the hospital and had 50 surgeries.
Easton Express-Times: Slate Belt teen faces 20 child porn counts in Pa. Attorney General’s Office probe: An 18-year-old Slate Belt man faces numerous charges of possessing child pornography after a months-long investigation by the Pennsylvania Attorney General’s Office.
PFSA’s Digital Dialogue Video Series
Reporting Abuse and Exploitation
- ChildLine provides information, counseling, and referral services for families and children to ensure the safety and well-being of the children of Pennsylvania. The toll-free intake line, 1-800-932-0313, is available 24/7 to receive reports of suspected child abuse.
- The National Center for Missing & Exploited Children (NCMEC) CyberTipline is the nation’s centralized reporting system for the online exploitation of children. The public and electronic service providers can make reports of suspected online enticement of children for sexual acts, child sexual molestation, child sexual abuse material, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet. Reports may be made 24/7 online at www.cybertipline.org or by calling the 24-hour hotline: 1-800-THE-LOST (1-800-843-5678).