Family Digital Wellness


Our society and economy are now grounded in the digital world, and the COVID-19 pandemic has only accelerated this trend. A March 2022 survey released by Common Sense Media found that children and teens are spending more time than ever on digital devices: children ages 8 to 12 spend an average of five hours and 33 minutes per day on screens, and teens ages 13 to 18 spend about eight and a half hours.

While there is an emphasis on protecting children online from predators and on preserving kids’ mental and physical health, a piece of the puzzle is missing. Now that we are immersed in the digital world, it is imperative to equip parents and families to recognize the warning signs of digital threats and to build healthy relationships and interactions with digital technologies in order to prevent abuse and future harm. That is why PFSA has developed the Family Digital Wellness initiative.

Family Digital Wellness: An inclusive, supportive, and preventative approach aimed at strengthening families as they raise healthy children in a digital era.


Did You Know?

  • 69% of U.S. children have their own smartphone by age 12.
  • 70% of kids encounter sexual or violent content online while doing homework research.
  • 1 in 5 youth ages 10-17 received a sexual solicitation or were approached online.
  • 40% of children in grades 4-8 say they have chatted with a stranger online.
  • 20% of teens have sent or posted nude or semi-nude photos or videos.
  • About 7 in 10 parents think smartphones could bring more harm than good to children.
  • 66% of U.S. parents say parenting is harder today than it was 20 years ago, with many in this group citing technology as a reason why.

Introducing the Family Digital Wellness Parent Toolkit!

Perhaps you want to be proactive in protecting your family against digital dangers that threaten children and families. Or maybe you have witnessed others struggling with these issues, or you and your family have experienced struggles of your own. Whatever your reason may be, it is important to know that you are not alone, and resources do exist.

PFSA has developed a comprehensive toolkit for parents who are ready to learn more about Family Digital Wellness, what it means, and how it can be used to increase safety and create healthy interactions with digital technologies. This toolkit is built on the foundation of PFSA’s Digital Diligence Framework, which encourages parents to follow five steps in their journey towards digital wellness.

Download our FREE Parent Toolkit today to learn more and to apply easy-to-implement solutions for your family!

Coaching Guide: To learn about our accompanying Family Digital Wellness Coaching Guide for professionals, email us at info@pafsa.org.

Click to Download


PFSA’s Family Digital Wellness Resources

Foundations of Family Digital Wellness

Digital wellness is built upon skills and practices that encourage users to protect themselves and their families from digital dangers through simple, proactive actions, while also shifting how technology is used so that it supports safe and healthy interactions online and on digital devices. These skills and practices are the Foundations of Family Digital Wellness.

Click to Download

PFSA’s Family Digital Wellness Overview

In the 1980s and ’90s, the “Just Say No” campaign aimed to discourage children from engaging in illegal recreational drug use by offering various ways of saying no. Today, we must focus on discouraging our children from engaging in the risky and dangerous behaviors of the digital era that stem from increased use of, and dependence on, digital technologies. Check out this resource to learn more about PFSA’s Family Digital Wellness initiative and goals.

Click to Download

Common Digital Dangers Booklet

While many of us are aware that digital dangers exist and that threats from digital technologies are growing, most parents and caregivers are not aware of all the common risks children face in digital environments. As we tackle the challenge of protecting children and preventing future harm in digital environments, understanding the most common digital threats equips parents to raise safe and healthy children. Check out this resource to learn more about common digital dangers, warning signs, and what you can do.

Click to Download

The Digital Era Family Profile

We are truly living in an unprecedented time. The current generation of parents/caregivers is the first to raise children with a presence in both physical and digital worlds. Dependency on digital technologies has rapidly increased in our society, quickly becoming part of our everyday lives. Everything, including work, school, socializing, and entertainment, has turned digital for adults and children alike. Check out this resource to learn more about today’s digital era family.

Click to Download

Tips for Parents

Family Digital Wellness requires intention and many ongoing actions that together create a comprehensive approach to raising safe and healthy kids in the digital era. But taking the first small step to safeguard against digital threats is critical for parents and families as they begin the journey. Check out this resource for tips you can implement right away with your family.

Click to Download

Preventing Digital Threats

While digital technologies change and new threats emerge rapidly, the most effective strategy for preventing future harm is practicing positive and healthy digital behaviors. With any risk to our children, equipping them with knowledge and guidance lays a foundation for positive outcomes. Check out this resource to learn more about what to avoid and how to take steps towards preventing digital threats for you and your family.

Click to Download

Other Related Resources

Media Resources

Digital for Good Book – In Digital for Good, EdTech expert Richard Culatta argues that technology can be a powerful tool for learning, solving humanity’s toughest problems, and bringing us closer together. He offers a refreshingly positive framework for preparing kids to be successful in a digital world—one that encourages them to use technology proactively and productively—by outlining five qualities every young person should develop in order to become a thriving, contributing digital citizen. www.amazon.com


Childhood 2.0 Documentary – Childhood 2.0 is a must-view for anyone who wants to better understand the world their children are navigating as they grow up in the digital age. Featuring actual parents and kids as well as industry-leading experts in child safety and development, this documentary dives into the real-life issues facing kids today — including cyberbullying, online predators, suicidal ideation, and more. www.childhood2movie.com


The Social Dilemma Documentary – In The Social Dilemma, tech experts from Silicon Valley sound the alarm on the dangerous impact of social networking, which Big Tech uses to manipulate and influence users. www.netflix.com

NetSmartz – NetSmartz is an online safety education program. It provides age-appropriate videos and activities that help teach children to be safer online, with the goal of making them more aware of potential online risks and empowering them to help prevent victimization by making safer choices on- and offline. www.missingkids.org/netsmartz/home

Parental Control Resources

Bark – Bark monitors texts, email, YouTube, and 30+ apps and social media platforms for signs of issues like cyberbullying, sexual content, online predators, depression, suicidal ideation, threats of violence, and more. www.bark.us


Gabb Wireless – Gabb Wireless provides a great first phone for your child(ren). No games, social media, or internet. They also have an interactive watch that works as an alternative to an actual phone. www.gabbwireless.com


The Protect App – The Protect app has hundreds of bite-sized lessons and other content to make it easy for busy parents to get the quick tips they need. The app also includes 20 videos, produced with teens and young adults, for parents and kids to watch together. www.protectyoungeyes.com

Research Resources

Common Sense Media – Since 2003, Common Sense has been the leading source of entertainment and technology recommendations for families and schools. Every day, millions of parents and educators trust Common Sense reviews and advice to help them navigate the digital world with their kids. www.commonsensemedia.org


The Digital Wellness Lab – The Digital Wellness Lab brings together global thought leaders from tech, content creation, and the health sciences to investigate, translate, innovate, and intervene to build a digital environment that advances the well-being of families, society, and humanity at large. www.digitalwellnesslab.org


Thorn – Thorn builds technology to defend children from sexual abuse and houses the first engineering and data science team focused solely on developing new technologies to combat online child sexual abuse. www.thorn.org


National Center for Missing and Exploited Children (NCMEC) – NCMEC is the nation’s nonprofit clearinghouse and comprehensive reporting center for all issues related to the prevention of and recovery from child victimization. www.missingkids.org

Government Resources

FBI: Safe Online Surfing: Click to visit webpage

FBI Online Resource Page: Click to visit webpage

FTC Resources: Click to visit webpage

News & Media Stories – Stay Up to Date!

WUSF: DeSantis plans to sign a bill restricting kids from social media: Gov. Ron DeSantis on Friday made clear he will sign a measure that seeks to keep children off social-media platforms. The bill, in part, would prevent children under age 16 from opening social-media accounts — though it would allow parents to give consent for 14- and 15-year-olds to have accounts. Children under 14 could not open accounts.

ABC News: Utah governor replaces social media laws for youth as state faces lawsuits: Utah’s governor has approved an overhaul of social media laws meant to protect children as the state fends off multiple lawsuits challenging their constitutionality. The new laws require social media companies to verify the ages of their users and disable certain features on accounts owned by Utah youths. Default privacy settings for minor accounts must restrict access to direct messages and sharing features and disable elements such as autoplay and push notifications that lawmakers argue could lead to excessive use.

The Washington Post: Big tech won’t be able to track kids’ data in Maryland under new bill: Maryland’s legislature unanimously endorsed a bill Thursday aimed at bolstering children’s online security, putting the state on track to become the second in the nation that strictly limits what data technology companies may gather on minors. A growing number of states have pushed to add their own protections to a patchwork of federal policies governing data privacy in the United States. The most consequential federal measure on minors, the Children’s Online Privacy Protection Act, limits companies from collecting data from young children and is the reason most social media platforms restrict accounts to children who are 13 or older. But that law does not require tech companies to consider the risks their products pose to children or teenagers.

Institute for Family Studies: Online Age Verification Laws Are a Bet Worth Making: Seven states adopted legislation last year to require age verification proving that a user is 18 to gain access to pornography sites. In 2024, by some counts, as many as a dozen additional states are considering bills of their own. If successful, this trend would erect a multi-state barrier between children and pornography platforms, a strong position that could support efforts to enact federal legislation as well.

Des Moines Register: Social media’s threat to Iowa children includes dangerous terms of service: The Iowa House has approved a bill (House File 2523) requiring children under age 18 to get parental approval to open social media accounts on such sites as Instagram, Facebook and TikTok. The state attorney general also could sue any company violating those requirements.

Associated Press: Most teens report feeling happy or peaceful when they go without smartphones: survey: Nearly three-quarters of U.S. teens say they feel happy or peaceful when they don’t have their phones with them, according to a new report from the Pew Research Center. In a survey published Monday, Pew also found that despite the positive associations with going phone-free, most teens have not limited their phone or social media use. The survey comes as policymakers and children’s advocates are growing increasingly concerned with teens’ relationships with their phones and social media. Last fall, dozens of states, including California and New York, sued Instagram and Facebook owner Meta Platforms Inc. for harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features that addict children. In January, the CEOs of Meta, TikTok, X and other social media companies went before the Senate Judiciary Committee to testify about their platforms’ harms to young people.

Western Iowa Today: Iowa House Passes Bill to Help Parents and Kids Navigate Social Media: Iowa House Representative Tom Moore says a bill to help parents and kids navigate the world of social media passed through the House in a bipartisan manner. The bill would also make it against the law for social media companies to gather data on kids without permission from parents.

The New York Times: Biden Wants Congress to Reduce the Risks of Social Media for Children: In his State of the Union address in 2022, President Biden warned of social media harms to young people and called for new privacy protections for children online. “We must hold social media platforms accountable for the national experiment they’re conducting on our children for profit,” Mr. Biden said. The president made similar comments in his State of the Union speech last year, urging Congress to enact restrictions on social media services like TikTok and Instagram. “It’s time to pass bipartisan legislation to stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children and impose stricter limits on the personal data these companies collect on all of us,” Mr. Biden said at the time.

Associated Press: Social media ban for minors less restrictive in Florida lawmakers’ second attempt: Less than a week after Republican Gov. Ron DeSantis vetoed a social media ban for minors, Florida lawmakers sent him a new version on Wednesday that’s expected to withstand his scrutiny. The House passed the bill on a 109-4 vote, completing Republican Speaker Paul Renner’s top priority for the 60-day session that ends Friday. The bill will ban social media accounts for children under 14 and require parental permission for 15- and 16-year-olds. “We’re not opening a Pandora’s box, we’re closing one,” said Republican Rep. Tyler Sirois, who sponsored the bill. “The harm that it is causing our children is documented, and it is severe.”

Psychology Today: Digital Parenting: Here’s how parents can navigate the challenges of technology and social media: One thing we know is that technology and social media have changed the world and changed parenting. But both technology and social media have brought blessings and curses. The blessings include the educational content, opportunities for self-paced learning, and connection for the disconnected. For example, some kids live in more rural areas where same-age peers live far away. There are also urban city kids who live in neighborhoods where it’s dangerous to get together at a playground. And there are still other kids who feel marginalized and more isolated by their small numbers or by aspects of their identities (e.g., LGBTQ). The curses include the risk of harm that is possible because of children’s ability to venture virtually anywhere with limited or no supervision and social media influences that lead vulnerable adolescents to question their self-worth and engage in unhealthy social comparisons.

CBS News: Florida Senate passes revamped social media bill: Gearing up for an expected legal battle, the Florida Senate on Monday overwhelmingly passed a revamped plan aimed at keeping children off social media. The Senate voted 30-5 to approve the plan (HB 3), three days after Gov. Ron DeSantis vetoed an earlier version (HB 1). DeSantis and House Speaker Paul Renner, who has made a priority of the social-media issue, negotiated the revamped plan.

The New York Times: DeSantis Vetoes Florida Social Media Ban for Kids Under 16: While several states have recently passed laws requiring parental consent for children’s social media accounts, the Florida measure that Mr. DeSantis vetoed was designed as a more blanket ban. It would have required certain social networks to verify users’ ages, prevent people under 16 from signing up for accounts and terminate accounts that a platform knew or believed belonged to underage users. Parents’ groups including the Florida Parent-Teacher Association had urged Mr. DeSantis to veto the bill after the state’s Legislature passed it last week. The bill would almost certainly have faced constitutional challenges over young people’s rights to freely seek information. It also would have likely ignited online protests from teenagers who rely on social apps to communicate with friends and family, express themselves creatively, keep up with news and follow political, sports, food and fashion trends.

The Washington Post: DeSantis vetoes Florida bill banning social media for most kids: The bill is the latest Republican-led state measure aimed at tackling bipartisan concerns that platforms are worsening youth mental health and well-being. But unlike laws in Utah and Arkansas blocking kids under 13 from setting up accounts and requiring parents to provide consent for some teens to join platforms, the Florida bill would have prohibited all minors up to 15 from creating social media accounts, the most stringent such prohibition yet. But DeSantis said he is working with state officials on a “different, superior” alternative plan, without elaborating, meaning the state could still usher in new limits for children’s access to digital platforms like TikTok, Instagram and Snapchat.

Politico: DeSantis vetoes Florida’s social media restrictions for minors: Gov. Ron DeSantis on Friday vetoed legislation that would have created strict social media prohibitions for minors in Florida, triggering lawmakers to reconfigure the top legislative priority of GOP House Speaker Paul Renner in the final days of the annual legislative session. DeSantis for weeks signaled that he wasn’t fully on board with the legislation and decided to block the proposal even as lawmakers made several changes with the hopes of quelling the Republican governor’s concerns. DeSantis’ veto this week went from a possibility to a foregone conclusion as senators, bracing for the move, cleared a path for lawmakers to alter the proposal once again after reaching a deal with the governor.

The Washington Post: Bill banning kids on social media may throw wrench in Senate safety push: Earlier this month, a group of senators announced they secured more than 60 co-sponsors for the Kids Online Safety Act (KOSA), a sprawling measure requiring tech companies to take steps to prevent harms to children on their sites and beef up their privacy and safety settings. The milestone signaled that backers had clinched enough support to clear it out of the Senate, a step that would make it the most significant piece of internet regulation to do so in decades. But now a group of lawmakers is pushing to couple the proposal with a separate bill banning kids under 13 from social media altogether, which could muddle the talks.

The Verge: Passing the Kids Online Safety Act just got more complicated: Just a couple of weeks after the Kids Online Safety Act (KOSA) surged with enough support to position it to clear the Senate, the path to new child protections on the internet suddenly looks more complex. Seeing the momentum, other lawmakers and outside groups sense it might be time to promote their own favored solutions, which could snarl KOSA’s Senate passage. But that bill’s path may be less clear than KOSA’s. The Post reported that Republican leaders who previously backed the Protecting Kids on Social Media Act are pulling support, citing two unnamed sources. A new draft version viewed by the Post removed age verification requirements and parental sign-off for minors to use social media but left in place limits on algorithmic recommendations.

The New York Times: To the Editor: Re “Justices Mull State Laws Constraining Social Media” (Business, Feb. 27): Conservative politicians need to get their story straight regarding the regulation of social media platforms. At the end of January Republicans, such as Senator Josh Hawley, were demanding that Meta’s C.E.O., Mark Zuckerberg, apologize to parents for failing to protect children from harmful content. However, this week conservatives from Texas and Florida argued that social media platforms should not be able to choose what content goes onto their platforms; the states believe that social media should have to publish all messages, regardless of the content. Luckily, the justices of the Supreme Court seemed skeptical of the two states’ laws, which would open the floodgates of misinformation, hate speech and unimaginably harmful speech that could lead to suicide, eating disorders and even terrorist attacks.

The Hill: Opinion: Beware social media’s bailout bill, don’t let dangerous companies pass the buck: In 2023, amid rising reports of social media addiction and mental health concerns, elected officials across the country began the process of holding two of the world’s largest and most controversial social media companies accountable. In a recent Senate Judiciary Committee hearing, members of Congress grilled social media executives, including Facebook CEO Mark Zuckerberg and Shou Zi Chew, the head of TikTok, a company with strong ties to the Chinese Communist Party, about the impact of their platforms on children. But the hearing is just one of the latest measures to bring transparency and accountability to the social media industry.

Psychology Today: Can We Embrace These 3 Insights About Teens and Tech?: Mark Zuckerberg turned toward a group of grieving parents recently and issued a public apology. “I’m sorry for everything you’ve been through,” he said. “No one should go through what you and your families have suffered.” His latest admission at the Senate online child safety hearing was not the first time Zuckerberg has apologized. But for parents eager to see someone take responsibility for social media harms, it might have hit differently. Caregivers are tired of alarming headlines, managing parental controls and default settings, and watching their teens struggle to navigate a digital world that wasn’t built to benefit them. Aside from the emotional gravity of this exchange, though, what are we to make of the back-and-forth between members of Congress and the world’s top tech executives and the current patchwork of state-level solutions?

The New York Times: Takeaways From the Supreme Court Arguments on Social Media Laws: The Supreme Court heard arguments for nearly four hours on Monday on a pair of First Amendment cases challenging laws in Florida and Texas that seek to limit the ability of internet companies to moderate content on their platforms. Here are some takeaways: The cases could shape the future of internet discourse. As the public square has moved online in the 21st century and technology companies like Facebook, YouTube and X have grappled with objectionable content, new dilemmas have arisen over the scope and meaning of free speech.

The Washington Post: Justices skeptical of Tex., Fla. laws that bar platforms from deleting content: A majority of the Supreme Court seemed broadly skeptical Monday that state governments have the power to set rules for how social media platforms curate content, with both liberal and conservative justices inclined to stop Texas and Florida from immediately implementing laws that ban the removal of certain controversial posts or political content. Even as justices expressed concern about the power of social media giants that have become the dominant modern public forum, a majority of the court seemed to think the First Amendment prevents state governments from requiring platforms such as Facebook and YouTube to host certain content. The high court’s decision in the two cases, likely to come near the end of the term in June, will have a significant impact on the operation of online platforms that are playing an increasingly important role in U.S. elections, democracy and public discussion.

The New York Times: Instagram and Facebook Subscriptions Get New Scrutiny in Child Safety Suit: The New Mexico attorney general, who last year sued Meta alleging that it did not protect children from sexual predators and had made false claims about its platforms’ safety, announced Monday that his office would examine how the company’s paid-subscription services attract predators. Attorney General Raúl Torrez said he had formally requested documentation from the social media company about subscriptions on Facebook and Instagram, which are frequently available on children’s accounts run by parents. Instagram does not allow users under 13, but accounts that focus entirely on children are permitted as long as they are managed by an adult. The New York Times published an investigation on Thursday into girl influencers on the platform, reporting that the so-called mom-run accounts charge followers up to $19.99 a month for additional photos as well as chat sessions and other extras.

The New York Times: Florida Passes Sweeping Bill to Keep Young People Off Social Media: Florida’s Legislature has passed a sweeping social media bill that would make the state the first to effectively bar young people under 16 from holding accounts on platforms like TikTok and Instagram. The measure — which Gov. Ron DeSantis said he would “be wrestling with” over the weekend and has not yet signed — could potentially upend the lives of millions of young people in Florida. It would also probably face constitutional challenges. Federal courts have blocked less-restrictive youth social media laws enacted last year by Arkansas and Ohio. Judges in those cases said the new statutes most likely impinged on social media companies’ free speech rights to distribute information as well as young people’s rights to have access to it. The new rules in Florida, passed on Thursday, would require social networks to both prevent people under 16 from signing up for accounts and terminate accounts that a platform knew or believed belonged to underage users. It would apply to apps and sites with certain features, most likely including Facebook, Instagram, Snapchat, TikTok and YouTube.

CBS: Minnesota lawmakers re-consider bill to boost child safety features on digital platforms: The Minnesota Legislature is reconsidering a bill that would require companies to analyze and mitigate child safety risks on their platforms. The “Minnesota Age-Appropriate Design Code Act” cleared a key committee this week at the capitol as lawmakers try to protect kids from social media and other websites that can harm their mental, physical and emotional health. The proposal would limit the amount of data collected on children and implement rules on how it can be used. It also requires any business that has an online platform or service that can be accessed by kids to perform “impact assessments” on the risks to their safety before releasing new digital features or products. 

CNN: Florida House passes legislation that would prohibit kids under 16 from having certain social media accounts: The Florida House of Representatives passed legislation that would prohibit anyone under 16 in the state from holding accounts with certain social media platforms. The bill now heads for Republican Governor Ron DeSantis’ desk. On Thursday, the House passed House Bill 1 in a vote of 108-7, according to online records. HB1, or Online Protections for Minors, would require some social media platforms to verify the age of account holders, prohibit kids under 16 from creating a new account, and terminate the accounts of anyone they believe to be under 16. This vote came just hours after the Senate passed the legislation on Thursday morning in a vote of 23-14. It will now head to the desk of DeSantis, who could sign it or veto it. DeSantis has spoken previously about his beliefs that parents should have a role in this bill.

AP: Florida lawmakers pass ban on social media for kids under 16 despite constitutional concerns: A bill to create one of the nation’s most restrictive bans on minors’ use of social media is heading to Republican Florida Gov. Ron DeSantis, who has expressed concerns about the legislation to keep children under the age of 16 off popular platforms regardless of parental approval. The House passed the bill on a 108-7 vote Thursday just hours after the Senate approved it 23-14. The Senate made changes to the original House bill, which Republican Speaker Paul Renner said he hopes will address DeSantis’ questions about privacy. The bill targets any social media site that tracks user activity, allows children to upload material and interact with others, and uses addictive features designed to cause excessive or compulsive use. Supporters point to rising suicide rates among children, cyberbullying and predators using social media to prey on kids.

Politico: Florida passes strict social media restrictions for minors despite DeSantis’ misgivings: The Florida Republican-led House overwhelmingly passed legislation Thursday to create the strictest social media prohibitions in the country by cutting off anyone under 16 years old from many platforms despite some objections from Gov. Ron DeSantis. House members voted on the bill mere hours after it was backed by the Senate in a surprise move that procedurally could force DeSantis to act sooner on legislation that he has been skeptical of for weeks. Because the Legislature passed the bill, FL HB1 (24R), with two weeks remaining in session, DeSantis would have to either sign or veto the measure before lawmakers leave Tallahassee. The Republican governor has raised particular concerns about the legislation not giving parents a say in whether their children should be allowed on social media. But lawmakers were unwilling to add a carve-out to the bill that could allow some minors to access social media despite the threat of DeSantis’ possible veto.

The New York Times: A Marketplace of Girl Influencers Managed by Moms and Stalked by Men: Thousands of accounts examined by The Times offer disturbing insights into how social media is reshaping childhood, especially for girls, with direct parental encouragement and involvement. Some parents are the driving force behind the sale of photos, exclusive chat sessions and even the girls’ worn leotards and cheer outfits to mostly unknown followers. The most devoted customers spend thousands of dollars nurturing the underage relationships. The large audiences boosted by men can benefit the families, The Times found. The bigger followings look impressive to brands and bolster chances of getting discounts, products and other financial incentives, and the accounts themselves are rewarded by Instagram’s algorithm with greater visibility on the platform, which in turn attracts more followers. One calculation performed by an audience demographics firm found 32 million connections to male followers among the 5,000 accounts examined by The Times.

The Verge: Instagram and Facebook knowingly platform parents who sexually exploit children for profit, say reports: Investigations into “child influencer” accounts on Facebook and Instagram have found that Meta is knowingly allowing parents who sexually exploit their children for financial gain on the platform — and in some cases, using Meta’s paid subscription tools to do so.  According to separate reports published by The New York Times and The Wall Street Journal on Thursday, Facebook and Instagram have become a potentially lucrative endeavor for parents who run social media accounts for children — mostly girls — who aren’t old enough to meet the platforms’ minimum 13-year-old age requirements. Several of the “parent-managed minor accounts” investigated sold materials to their large audiences of adult men, including photos of their children in revealing attire, exclusive chat sessions, and their children’s used leotards and cheer outfits.

The Wall Street Journal: Meta Staff Found Instagram Tool Enabled Child Exploitation. The Company Pressed Ahead Anyway.: Meta Platforms safety staff warned last year that new paid subscription tools on Facebook and Instagram were being misused by adults seeking to profit from exploiting their own children. Two teams inside Meta raised alarms in internal reports, after finding that hundreds of what the company calls “parent-managed minor accounts” were using the subscription feature to sell exclusive content not available to nonpaying followers. The content, often featuring young girls in bikinis and leotards, was sold to an audience that was overwhelmingly male and often overt about sexual interest in the children in comments on posts or when they communicated with the parents, according to people familiar with the investigations, which determined that the payments feature was launched without basic child-safety protections. While the images of the girls didn’t involve nudity or other illegal content, Meta’s staffers found evidence that some parents understood they were producing content for other adults’ sexual gratification. Sometimes parents engaged in sexual banter about their own children or had their daughters interact with subscribers’ sexual messages.

Forbes: Zuckerberg Wants No Personal Legal Blame for Instagram, Facebook Addiction: Holding Mark Zuckerberg personally responsible may be a challenge because of a corporate law tradition of shielding executives from liability, especially at larger companies where decision-making is often layered. A victory against the billionaire who launched Facebook with friends as a Harvard undergraduate two decades ago could encourage claims against other CEOs in mass personal injury litigation. Zuckerberg faces allegations from young people and parents that he was repeatedly warned that Instagram and Facebook weren’t safe for children but ignored the findings and publicly stated the opposite was true. Plaintiffs contend that as the face of Meta, Zuckerberg has a responsibility to “speak fully and truthfully on the risks Meta’s platforms pose to children’s health.”

Forbes: Social Media And Youth Mental Health: Are You Worried Yet?:  We can no longer overlook the abundance of warning signs and risks to adolescent mental health posed by social media. With up to 95% of youth ages 13 to 17 using a social media platform, the issue has become so concerning that the Surgeon General issued a warning. Yet the warning signs continue to mount, and CEOs of social media companies have been subject to Congressional questioning regarding documented harm against children and adolescents on their platforms. Public officials are beginning to call on social media companies to take responsibility for the harm they have created. We know now that certain kinds of social media use during specific windows of brain and emotional development show a correlative relationship with poor mental health outcomes for children, i.e., depression and anxiety, social comparison and low self-esteem, and poor sleep quality—to name a few. It’s time to change the trajectory of this growing crisis.

The Hill: Opinion: Congress’s big show of protecting kids online stopped when the cameras did: Just a few weeks ago, Senate lawmakers displayed a rare glimpse of bipartisanship when they grilled the CEOs of social media platforms on being asleep at the switch in protecting children from online predators. The hearings were intense and attended by survivors, family members and advocates. Elected leaders from both parties slammed the tech executives for failing to take stronger action to protect America’s youth online — and for not using the technologies they’ve developed to build a safer space for children to learn, explore and grow. The contentious exchange sparked a tidal wave of national news coverage. In the weeks since, the American people have waited patiently for legislation to be brought to the Senate floor that would force the companies represented in that room to take these issues seriously. It hasn’t happened. For the companies, it’s business as usual. For those in Congress, it represents a failed moment, a lost opportunity, to govern and work toward a common goal to protect kids and hold companies accountable.

Politico Playbook: Why this lawmaker is fighting social media feeds: Growing concerns over kids’ mental health are fueling an effort in New York to combat algorithm-driven social media feeds. Meta, the parent company of Facebook and Instagram, has called for federal action rather than state legislation. Meta has also backed standards developed by the industry for age-appropriate content and standards for apps commonly used by teens. Personalization of targeted ads would be limited for kids under 16 as well. But state Sen. Andrew Gounardes isn’t convinced. The Brooklyn Democrat has sponsored legislation embraced by Hochul to address data privacy for kids. He spoke with Playbook about the effort.

Delco Times: Delaware County internet crime investigators advise parents: talk, be alert and devices are not babysitters: Talk to your kids, be alert and internet devices are not babysitters. Those were some of the main messages Pennsylvania’s internet crimes against children commander had for parents at a recent presentation in Upper Darby. Delaware County Criminal Investigation Division Detective Sgt. Kenneth Bellis, who is the commander of the Pennsylvania Internet Crimes against Children task force (ICAC), spoke to parents in the Upper Darby School District about internet, social media and texting safety. As children gain access to the internet at younger ages, the problem has increased and, like many issues, the pandemic made it worse, officials said.

The Washington Post: Senate poised to pass biggest piece of tech regulation in decades: After months of negotiations, senators announced Thursday that a sprawling bill to expand protections for children online had secured more than 60 backers, clearing a path to passage for what would be the most significant congressional attempt in decades to regulate tech companies. The Kids Online Safety Act, or KOSA, first introduced in 2022, would impose sweeping new obligations on an array of digital platforms, including requiring that companies “exercise reasonable care” to prevent their products from endangering kids. The safeguards would extend to their use of design features that could exacerbate depression, sexual exploitation, bullying, harassment and other harms. The measure would also require that platforms enable their most protective privacy and safety settings by default for younger users and offer parents greater tools to monitor their kids’ activity.

ABC News: New York City sues social media companies, accuses them of contributing to ‘youth mental health crisis’: The City of New York on Wednesday sued the companies behind SnapChat, Instagram, YouTube and TikTok, accusing them of fomenting a “nationwide youth mental health crisis” by exposing children “to a nonstop stream of harmful content.” The lawsuit accused the social media companies of manipulating users by making them feel compelled to respond to one positive action with another positive action.

CBS: Judge extends hold on Ohio enforcing social media parental consent law: A federal judge extended a block on enforcement Monday of an Ohio law that would require children under 16 to get parental consent to use social media apps, as a legal challenge proceeds. U.S. District Court Judge Algenon Marbley’s decision to grant a preliminary injunction prevents the law from taking effect while a lawsuit filed earlier this month by NetChoice winds its way through the courts. NetChoice is a trade group representing TikTok, Snapchat, Meta and other major tech companies. The group is fighting the law as overly broad, vague and an unconstitutional impediment to free speech. The law originally was set to take effect Jan. 15 and is similar to ones enacted in other states – including California and Arkansas, where NetChoice has won lawsuits.

Bucks County Herald: Central Bucks South students help craft social media bill: After Pennsylvania State Rep. Brian Munroe saw a video created by three Central Bucks South High School students, he knew he wanted to work with them. The video, made by Max Jin, Luka Jonjic and Dylan Schwartz as part of C-SPAN’s national video competition, examined the impact of social media on teens. Titled “America’s Silent Struggle,” it details the harsh effects social media can have on youths’ mental health.

The Wall Street Journal: Keeping Teens Safe Online Has to Go Beyond Parental Controls. Here’s What to Do: In the past five years or so, the major players have rolled out software to give parents more say over when kids can use devices and services and what shows up on their screens. But these tools are optional and often buried, and sometimes broken. Most parents don’t use them, according to a poll conducted last year by the market research firm Ipsos. Do parents just need more awareness about the tools? Or should tech companies take on more responsibility to protect young people? Facing political pressure, the social media platforms are building in protections for underage users, and they should continue doing more. Parents need to look to conversations, not controls, to ensure their kids aren’t meeting harm online. It’s time to reframe the discussion, and that’s why I won’t be recommending parental controls going forward.

The Hill: Opinion: Zuckerberg claims social media isn’t harmful to mental health — here’s what the science says: Last week, Meta CEO Mark Zuckerberg told the Senate Judiciary Committee that there is no link between social media and negative mental health outcomes among young people. An advisory from the American Psychological Association (APA), which was based on the best available science, showed that social media is related to psychological harm through online discrimination, prejudice, hate and cyberbullying. Research also has found that young people face serious risks when they are exposed to content about self-harm, harm to others or eating disorders. But what is key to know is that the advisory also outlined the science behind why certain features, functions and content on social media can be harmful to young people, whose brains have not yet fully developed. The potential for harm is baked into Facebook, Instagram and other social media platforms if young people use them as intended. Counts of followers or likes exploit children’s innate desire for social reward and their need to feel accepted by their peers. The endless scroll of posts challenges children’s ability to limit their social media use and time spent on screens before the brain’s neural inhibition centers have fully developed. Friending and direct messaging functions may expose minors to predators. Research also shows that ongoing engagement on social media platforms is the primary cause of youths’ sleep deprivation, which has substantial consequences for adolescents’ mood, academic performance, and even the size and function of their developing brains.

Philly Burbs: Should social media platforms monitor chats of minors? 3 Bucks teens think it could help: A new bill introduced by a Bucks County lawmaker and drafted with the help of three Central Bucks students could put tighter restrictions on social media companies and make them accountable for online bullying and harassment. During a press conference Tuesday at Tamanend Middle School, in Warrington, state Rep. Brian Munroe announced House Bill 2017, “co-authored” by three former Tamanend students, which would require social media companies to monitor chats of two or more minors and notify parents of “flagged sensitive or graphic content,” require parental consent for users under 16 to create social media accounts and restrict data mining on users under 18.

The Wall Street Journal: States Crack Down on Social Media for Teens: ‘There Are No Guardrails’: It’s not just Congress. States are taking on social media’s grip on teenagers, too. Officials in New York are pushing to restrict the algorithms that power a platform’s feed, making it the latest state to attempt to rein in the big tech companies in the wake of federal inaction. Other states have hit legal roadblocks by attempting to shield children from specific types of content or restrict minors from signing up for accounts. If New York is successful, it would offer other states a legal pathway to pursue.

WFMZ (Allentown): ‘Light within a dark tunnel’: Students help craft social media legislation aiming to improve young users’ mental health:  A group of Bucks County students teamed up with a state representative to craft a bill about social media. It aims to empower parents, protect children and hold tech companies accountable. State Rep. Brian Munroe announced his newest legislation at Tamanend Middle School, because he wrote it with three of its former students. They describe what House Bill 2017 would require social media companies to do for young users. “Monitoring social media accounts for red flags, notifying parents or guardians of harmful content, and especially strengthening the proof of age verification,” said Jonjic.

The Hill: Hundreds of families urge Schumer to pass children’s safety bill: Hundreds of parent advocates urged Senate Majority Leader Chuck Schumer (D-N.Y.) to pass the Kids Online Safety Act in a letter and full-page Wall Street Journal ad published Thursday. The call to action builds on pressure from parents at last week’s Senate Judiciary Committee hearing with the CEOs of Meta, TikTok, Discord, Snap and X, the company formerly known as Twitter. “We have paid the ultimate price for Congress’s failure to regulate social media. Our children have died from social media harms,” the parents wrote in the letter. Signatories include families whose children have died by suicide after being cyberbullied, sextorted or served pro-suicide content, according to Fairplay, one of the children’s advocacy groups that helped organize the letter.

The Washington Post: How to keep your kids safe online — without taking away their phone: If you have a child old enough to be on social media, there’s a lot to worry about. During a nearly four-hour hearing on kids’ safety online last week, senators sparred with tech CEOs from Meta, TikTok, Snapchat, Discord and X about the harm their apps pose to tweens and teens. There was talk of child sexual abuse material (also known as CSAM), suicide, bullying, drugs, lethal viral trends, extortion, disordered eating and mental health issues — all linked back to the use of social media. Given the popularity of social media and the prevalence of smartphones in teenagers’ lives, how can adults protect them from every worst-case scenario lurking in direct messages and algorithmic feeds?

News 4 Jax (Jacksonville FL): Florida Senate advances bill banning social media accounts for children under 16: Florida senators Monday evening moved forward with a proposal to try to prevent minors under age 16 from using social media, as lawmakers take on tech platforms that they say harm children. The Senate Judiciary Committee voted 7-2 to approve a bill (SB 1788) that is similar to a measure (HB 1) that passed the House last month and is a priority of House Speaker Paul Renner, R-Palm Coast. While supporters described social media as harmful to children’s mental health, they continued to face questions about the constitutionality of the proposal, which, in part, would prevent minors under 16 from creating social-media accounts. Critics contend it would violate First Amendment rights.

Roll Call: Gulf remains for tech CEOs, Senate on kids’ online safety: Tech executives gave little ground when it came to endorsing bipartisan legislation aimed at addressing online child safety, even as senators lambasted their handling of the issue at a packed Senate committee hearing. The well-televised confrontations played out last week, as lawmakers blamed social media platforms for harming children and failing to regulate themselves. The vast lobbying power of large technology companies, lawmakers argued, has made passing solutions more difficult.

The New York Times: Why Eating Disorder Content Keeps Spreading: In late January, a volunteer at the help line for the National Alliance for Eating Disorders fielded a call from someone who had seen an alarming trend on TikTok. The hashtag #legginglegs had started taking off as users posted about the slim bodies seemingly deemed the most desirable for leggings. The organization, which works directly with social media companies including TikTok, Meta and Pinterest, quickly flagged the trend to TikTok. Less than a day later, the platform banned the hashtag and began directing users who searched for it toward the organization’s hotline and other resources.

Inquirer: Opinion: Mark Zuckerberg’s apologies won’t keep kids safe online: How many children must die before social media companies do something about their role in the sharp rise in suicides, eating disorders, depression, and bullying that has impacted so many young people? How many more hearings will Congress hold before it acts? Executives from Meta, the company behind Facebook and Instagram, have testified 33 times since 2017 on issues ranging from election interference to social media’s role in the insurrection on Jan. 6, 2021. Yet nothing has been done.

The New York Post: Kids shouldn’t be on social media AT ALL: Zuckerberg’s contrition — whether real, fake or somewhere in between — doesn’t really matter one way or the other, though. The key question is why we are subjecting our children to a vast, real-time experiment in exposure to a radically new medium that evidence suggests is harmful to their emotional and mental health. This dubious venture is unquestionably a boon to the bottom line of Meta and its peer companies, but it’s doubtful that any parent in America has ever thought it was good for their kid.

The New York Times: Silicon Valley Battles States Over New Online Safety Laws for Children: Last summer, Ohio enacted a social media statute that would require Instagram, Snapchat, TikTok and YouTube to get a parent’s consent before permitting children under age 16 to use their platforms. But this month, just before the measure was to take effect, a tech industry group called NetChoice — which represents Google, Meta, Snap, TikTok and others — filed a lawsuit to block it on free speech grounds, persuading a Federal District Court judge to temporarily halt the new rules. The case is part of a sweeping litigation campaign by NetChoice to block new state laws protecting young people online — an anti-regulation effort likely to come under scrutiny on Wednesday as the Senate Judiciary Committee questions social media executives about child sexual exploitation online. The NetChoice lawsuits have rankled state officials and lawmakers who sought tech company input as they drafted the new measures.

Reuters: Tech CEOs told ‘you have blood on your hands’ at US Senate child safety hearing: U.S. senators on Wednesday grilled leaders of the biggest social media companies and said Congress must quickly pass legislation, as one lawmaker accused the companies of having “blood on their hands” for failing to protect children from escalating threats of sexual predation on their platforms. The hearing marks the latest effort by lawmakers to address the concerns of parents and mental health experts that social media companies put profits over guardrails that would ensure their platforms do not harm children.

Axios: Senators fail to press social media CEOs on AI at child safety hearing: Wednesday’s Senate hearing about protecting kids on social media focused on regulating yesterday’s and today’s technology — but lawmakers failed to grill executives on the rise of AI and the new problems it is generating. The hearing, like those that preceded it, looked for solutions to longstanding problems — failures in content moderation, age verification, protection of teens’ mental health and enforcement of laws against child sexual abuse material (CSAM) — but the tech industry keeps inventing new services that get put to bad use. Members of the Senate Judiciary Committee spent close to four hours grilling Meta CEO Mark Zuckerberg and his counterparts from TikTok, X, Snap and Discord on the sexual exploitation of children on their respective platforms.

Forbes: Adults And Kids Are In The ‘Bully’s Pulpit’ On Social Media, Experts Say: A lot of attention has been paid to social media’s impact on children—including during this week’s Senate hearings, where the heads of the platforms were grilled by lawmakers. However, many of social media’s biggest issues aren’t just limited to children. Perhaps as an extension of our political divide, or just the fact that there is little decorum online, social media has allowed many adults to act out. Cyberbullying isn’t something that just impacts children these days, and the problem could be getting worse, experts warned. It can include personal attacks that, instead of being constructive criticism, are often mean-spirited, hostile and oftentimes even threatening.

The New York Times: Six takeaways from a contentious online child safety hearing: After a series of tense exchanges between senators and tech executives that clocked in at just under four hours, the Senate Judiciary Committee hearing on online child safety came to an end on Wednesday with no clear resolutions in sight. The audience included several family members of victims, who cheered as senators berated the executives and listened stoically as Mark Zuckerberg, the chief executive of Meta, addressed the crowd directly. 

Yahoo Finance: Tech CEOs face US Senate grilling over youth content: Meta CEO Mark Zuckerberg and the chief executives of X, TikTok, Discord and Snap face a grilling by US lawmakers on Wednesday over the dangers that social media platforms bring to children and teens. The tech chieftains have been convened by the US Senate Judiciary Committee where they will be asked about the effects of social media in a session titled “Big Tech and the Online Child Sexual Exploitation Crisis.” The hearing could be grueling for executives confronting political anger for not doing enough to thwart online dangers for children, including from sexual predators. 

The Washington Post: Meta says its parental controls protect kids. But hardly anyone uses them.: Amid scrutiny of social media’s impact on kids and teens, tools that let parents track their children’s online activities have become increasingly popular. Snapchat, TikTok, Google and Discord all have rolled out parental controls in recent years; last week, Meta said these features “make it simpler for parents to shape their teens’ online experiences.” But inside Meta, kids safety experts have long raised red flags about relying on such features. And their use has been shockingly infrequent.

The Hill: Parent anger at social media companies boils over ahead of tech CEO hearing: The Senate is hauling in CEOs of social media companies to grill them over online harms to children on Wednesday, but parents and children’s safety advocates said the time for talking is over and Congress must act to protect children and teens. Parents who became advocates after losing their children to harms they say were created by social media companies will be among the crowd at Wednesday’s Judiciary Committee hearing. The hearing will feature testimony from Meta CEO Mark Zuckerberg, TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino, Snap CEO Evan Spiegel, and Discord CEO Jason Citron.

Los Angeles Times: To protect kids, California might require chronological feeds on social media: Social media companies design their feeds to be as gripping as possible, with complicated algorithms shuffling posts and ads into a never-ending stream of entertainment. A new California law would require companies to shut off those algorithms by default for users under 18, and implement other mandated tweaks that lawmakers say would reduce the negative mental health effects of social media on children.

The Verge: Instagram and Facebook will now prevent strangers from messaging minors by default: Meta is introducing changes to Instagram and Facebook Messenger that aim to better protect minors from unwanted contact online, placing greater restrictions on who can message teens while giving parents more control over their children’s security settings. Notably, the company announced that by default, teens under the age of 16 (or under 18 in some countries) will no longer be able to receive messages, or be added to group chats, by users they don’t follow or aren’t connected with on Instagram and Messenger. These new updates build upon a series of safeguards that Meta has introduced over the last year as it battles accusations that its algorithms helped turn Facebook and Instagram into a “marketplace for predators in search of children.” Unlike the previous restrictions, which only limited adults over 19 from DM’ing minors who don’t follow them, these new rules will apply to all users regardless of age. Meta says that Instagram users will be notified of the change via a message at the top of their Feed. Teens using supervised accounts will need to request permission from the parent or guardian monitoring their account to change this setting. Parental supervision tools on Instagram are also being expanded. Instead of simply being notified when their child makes a change to their safety and privacy settings, parents will now be prompted to approve or deny their requests — preventing a teen from switching their profile from private to public without approval, for example.

Reuters: Meta moves to protect teens from unwanted messages on Instagram, Facebook: Meta Platforms is building more safeguards to protect teen users from unwanted direct messages on Instagram and Facebook, the company said on Thursday. The move comes weeks after the WhatsApp owner said it would hide more content from teens after regulators pushed the world’s most popular social media network to protect children from harmful content on its apps. The regulatory scrutiny increased following testimony in the U.S. Senate by a former Meta employee who alleged the company was aware of harassment and other harms facing teens on its platforms but failed to act on them.

Engadget: Facebook and Instagram will block DMs to teens unless they’re from a friend: In 2021, Meta restricted adults on Instagram from being able to message under-18 users who don’t follow them. Now, it’s expanding that rule to help protect younger teens from potentially unwanted contact. Users under 16 — or 18, depending on their country — can no longer receive DMs from anybody they don’t follow by default, even if they’re sent by fellow teens. This new safety measure applies to both Instagram and Messenger. For Messenger, in particular, young users will only be able to receive messages from their Facebook friends or people in their phone contacts. Since this setting is enabled by default, teens who have accounts under parental supervision will need to get any changes to it approved by their guardian. Of course, the setting will have to depend on a user’s declared age and Meta’s technology designed to predict people’s ages, so it’s not 100 percent foolproof.

Axios: Florida House passes bill that would ban children under 16 from social media: Florida’s House passed a bill that would limit youth access to social media by banning new and existing accounts of users younger than 16 years old. Why it matters: The Republican-backed bill, which homes in on social media’s addictive features, would be one of the strictest social media restrictions in the country, if passed by the Senate and signed into law by Gov. Ron DeSantis.

The Washington Post: Florida lawmakers move to bar kids from social media in latest statehouse push: Florida’s House of Representatives has greenlit what could be one of the nation’s strictest laws aimed at protecting children online, passing a bill that would bar anyone 16 and younger from using social media. While lawmakers voiced some concerns about enforcement, parental rights and First Amendment issues, the bill passed with overwhelming bipartisan support, and on Thursday moved to the state Senate, which is expected to take up the bill soon. It’s the latest in a slew of statehouse proposals to crack down on what is increasingly seen as a threat to children — and their childhood. U.S. Surgeon General Vivek H. Murthy issued an advisory last year declaring social media “an important driver” of a “national youth mental health crisis.” According to the Pew Research Center, approximately 95 percent of kids ages 13 to 17 are on social media, with more than a third of them admitting “they use social media ‘almost constantly.’” The bill, if passed, is likely to encounter resistance in court. Critics, including free speech advocates, say such proposals are unconstitutional.

The Guardian: ‘They’re addicting kids and they know it’: the attorney challenging social media firms: Matthew P Bergman is the founding attorney of the Social Media Victims Law Center – a law firm dedicated exclusively to representing the families of children allegedly harmed by social media. The firm has filed cases against platforms including Meta, Snap, TikTok, and Discord. Bergman, who has been practicing law since 1991, had a storied career in asbestos litigation before turning his attention to social media. Inspired by the testimony of Facebook whistleblower Frances Haugen, Bergman said he noticed a number of parallels between social media and asbestos. Both are products initially thought to be beneficial before being exposed as detrimental to human health, he said. The corporations involved brazenly tried to cover up the harms they knew were coming, he added. Lawsuits against social media firms over alleged harms have long struggled to see their day in court due to Section 230 of the 1996 Communications Decency Act, a federal law in the US that shields online platforms from liability for illegal actions of their users. With this in mind, Bergman took a different tack, modeled after his days in asbestos litigation. Rather than arguing against content on the platforms, his firm leans on product liability laws and argues that the products are harmful by design.

Politico: First tech platform breaks ranks to support kids online safety bill: The owner of Snapchat is backing a bill meant to bolster online protections for children on social media, becoming the first company to publicly split from its trade association days before the company’s CEO prepares to testify on Capitol Hill. A Snap spokesperson told POLITICO about the company’s support of the Kids Online Safety Act. The popular messaging service’s position breaks ranks with its trade group NetChoice, which has opposed KOSA. The bill directs platforms to prevent the recommendation of harmful content to children, like posts on eating disorders or suicide. KOSA co-sponsors Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) applauded Snap’s endorsement. “We are pleased that at least some of these companies are seemingly joining the cause to make social media safer for kids, but this is long overdue,” they told POLITICO. “We will continue fighting to pass the Kids Online Safety Act, building on its great momentum to ensure it becomes law.” None of the other platforms testifying have taken public positions supporting KOSA to date. TikTok and Discord declined to comment on KOSA. X did not respond to a request for comment. Meta didn’t say whether it supports KOSA, but said it supports “internet regulation” and issued its own legislative “framework” this month calling on Congress to pass a bill to shift the responsibility to app stores, not platforms, to obtain parental consent for kids to download social media apps.

Reuters: Florida lawmakers vote to restrict children’s access to social media: The Florida House of Representatives approved on Wednesday a bill aimed at barring children aged 16 and younger from social media platforms, following similar action in several states to limit online risks to young teenagers. Passed by a bipartisan vote of 106 to 13, the measure would require social media platforms to terminate the accounts of anyone under 17 years old and use a third-party verification system to screen out underage users.

Politico: Florida’s GOP-controlled House passes strict social media restrictions for minors: Florida’s Republican-led House overwhelmingly passed legislation Wednesday that would create some of the strictest social media prohibitions in the country by cutting off anyone under 16 years old from many platforms. Still pending approval in the Senate, the proposal is a top priority of Republican Speaker Paul Renner on his conservative agenda to safeguard children in the state, alongside a bill curbing access to adult websites, which lawmakers also passed Wednesday. The social media restrictions would put Florida in line with several other states attempting to crack down on minors using “addictive” apps they consider harmful to mental health but may also open the state up to an eventual lawsuit from major tech firms.

The Washington Post: New York City designates social media a public health hazard: New York City on Wednesday designated social media a public health hazard for its effect on youth mental health, becoming the first major city in the United States to take such a step, Mayor Eric Adams (D) said in an address. In response, he said, New York City Health Commissioner Ashwin Vasan “is issuing a health commissioner advisory officially designating social media as a public health hazard in New York City.” In an advisory issued the same day, Vasan outlined the deteriorating state of youth mental health in New York City and offered guidance to young people on encouraging healthy social media use, such as by implementing tech-free times and places; monitoring emotions during use; and sharing concerns related to social media and mental health with adults.

The Guardian: Meta has not done enough to safeguard children, whistleblower says: Mark Zuckerberg’s Meta has not done enough to safeguard children after Molly Russell’s death, according to a whistleblower who said the social media company already has the infrastructure in place to shield teenagers from harmful content. Arturo Béjar, a former senior engineer and consultant at the Instagram and Facebook owner, said if the company had learned its lessons from Molly’s death and subsequent inquest it would have created a safer experience for young users. According to research conducted by Béjar on Instagram users, 8.4% of 13- to 15-year-olds had seen someone harm themselves or threaten to harm themselves in the past week.

CBS Philadelphia: Allentown high schools use app to connect kids with mental health support: Kids have a lot of stress these days. A school district in the Lehigh Valley asked how it could help, and students requested resources they could use on their phones. Mental health resources in different languages are now just a few taps away for nearly 5,000 high school students across the Allentown School District, and it’s already making a difference. The app ‘Counslr’ launched in Allentown schools ahead of schedule at the end of December, thanks to federal COVID relief funding. It provides kids with free 24/7 support from licensed counselors, anywhere, through a medium they’re already used to.

CNN: Children targeted with sexually explicit photos on Facebook and Instagram, lawsuit claims: The lawsuit accuses Meta of creating a “breeding ground” for child predators. One 2020 incident — detailed in the newly unredacted complaint, which cites internal documents — is part of a years-long history of people inside and outside of the social media giant run by Mark Zuckerberg raising concerns about young users’ exposure to sexual and inappropriate content, the attorney general alleges. According to the complaint, after an Apple executive flagged the issue to Meta, then known as Facebook, at least one Meta employee raised concerns internally that the incident could put the company at risk of having Facebook removed from the Apple App Store — a potentially devastating blow.

The Guardian: ‘Fundamentally against their safety’: the social media insiders fearing for their kids: Arturo Bejar would not have let his daughter use Instagram at the age of 14 if he’d known then what he knows now. Bejar, who spent six years at Facebook making it easier for users to report problems on the platform, left the company in 2015. But it wasn’t until his departure that he witnessed what he described in recent congressional testimony as the “true level of harm” the products his former employer built are inflicting on children and teens – his own included. Bejar discovered his then 14-year-old daughter and her friends were routinely subjected to unwanted sexual advances, harassment and misogyny on Instagram, according to his testimony.

The Hill: Big Tech is preying on children for profit, and Congress needs to stop it: Research from the Parents Television and Media Council, where I am vice president, reveals that Hollywood is marketing television shows with explicit adult content to young teens through social media apps such as TikTok and Instagram, which are popular with 13 to 17-year-olds. We are talking about programs such as “Euphoria,” “Sex Education” and “PEN15,” which carry TV-MA ratings and are not for children or teens. Social media is being used to get around parents and deliver this content directly to children. Meta, the parent company of Facebook and Instagram, continues to be revealed by media outlets, whistleblowers and lawsuits as fueling child sexual exploitation, providing a platform for pedophiles, and enabling sexually explicit and other harmful content that targets teens, especially teen girls. Meta has been sued by the District of Columbia and 41 states, which claim its products are addictive and potentially harmful to children and their mental health. Other social media platforms are no better. Snapchat has been used to “lure and sexually exploit children.” The New York Times reported last year that X has struggled to confront its child sexual exploitation problem. Parents are suing Roblox over sexual content on its platform.

Politico: States get serious about limiting kids’ social media exposure: An increasing number of states are moving to require social media companies to create child-safe versions of their sites as Washington struggles with how to shield kids. The states are moving because they believe social media is contributing to increasing rates of mental illness among children, and because Congress hasn’t acted. There’s bipartisan support on Capitol Hill to do more, but lawmakers there can’t agree on whether a national privacy standard should override state laws. Separately, 33 states sued Meta, parent company of Facebook and Instagram, in October in federal court in San Francisco alleging it violated children’s privacy. If successful, it also could force the company to change its sites. The legal battle is wide and the outcome far from clear. A 2022 California law, the first to mandate website design changes, is now in limbo after a tech industry group challenged it in federal court. The prospect of having to comply with varying state laws has alarmed the tech firms, which are moving to convince state lawmakers new rules aren’t needed. To do that, the firms are tightening their own controls over what kids see online. Meta is rolling out new protections that help children avoid content deemed harmful, such as posts about violence, sex or eating disorders. The firms insist they don’t oppose regulation, but would prefer a national standard to a patchwork of 50 state rules.

The New York Times: Opinion: Our Kids Are Living In a Different Digital World: Parents need to know that when children go online, they are entering a world of influencers, many of whom are hoping to make money by pushing dangerous products. It’s a world that’s invisible to us, because when we log on to our social media, we don’t see what they see. Thanks to algorithms and ad targeting, I see videos about the best lawn fertilizer and wrinkle laser masks, while Ian is being fed reviews of flavored vape pens and beautiful women livestreaming themselves gambling crypto and urging him to gamble, too. Smartphones are taking our kids to a different world. We know this, to some extent. We worry about bad actors bullying, luring or indoctrinating them online — all risks that have been deeply reported on by the media and that schools and public agencies like the Federal Trade Commission are taking great pains to address. The social-media giant Meta has been sued on allegations that using its platforms is associated with issues including childhood anxiety and depression. Yet all of this is, unfortunately, only part of what makes social media dangerous.

Roll Call: Advocates push for federal online safety laws as states take lead: At least 15 states have enacted or are pursuing legislation that would require online companies to protect the safety and privacy of kids using their platforms, putting pressure on Congress to pass more unifying federal legislation.

The National Desk: A teen’s perspective on social media’s harmful impact on adolescent self-perception: Children are often accused of wanting to grow up too fast, but is social media now robbing younger generations of normal childhood experiences and creativity? Fourteen-year-old Jordyn Kelman said many of her peers are in a rush to look grown up because of what they see on social media apps like TikTok. She shares the same concerns many adults have over how social media is negatively impacting teens and even tweens.

AP: A judge has temporarily halted enforcement of an Ohio law limiting kids’ use of social media: U.S. District Court Judge Algenon Marbley’s temporary restraining order came in a lawsuit brought Friday by NetChoice, a trade group representing TikTok, Snapchat, Meta and other major tech companies. The litigation argues that the law unconstitutionally impedes free speech and is overbroad and vague. While calling the intent to protect children “a laudable aim,” Marbley said it is unlikely that Ohio will be able to show the law is “narrowly tailored to any ends that it identifies.”

Ars Technica: Facebook, Instagram block teens from sensitive content, even from friends: Starting now, Meta will begin removing content from feeds and Stories about sensitive topics that have been flagged as harmful to teens by experts in adolescent development, psychology, and mental health. That includes content about self-harm, suicide, and eating disorders, as well as content discussing restricted goods or featuring nudity. Even if sensitive content is shared by friends or accounts that teens follow, the teen will be blocked from viewing it, Meta confirmed.

USA Today: Teens won’t be able to see certain posts on Facebook, Instagram: What Meta’s changes mean: Meta, the parent company of the social media platforms, revealed Tuesday it will begin restricting some of what young users can see on Facebook and Instagram. The announcement comes as the company faces mounting pressure from regulators who claim its social media sites are addictive and harmful to the mental health of younger users. In a blog post, Meta said the measures, which will roll out in the coming weeks, are designed “to give teens more age-appropriate experiences on our apps.” The protections will make it more difficult for teens to view and search for sensitive content such as suicide, self-harm and eating disorders, according to Meta.

Axios: Facebook, Instagram will hide more sensitive content from teens, Meta says: Meta will institute more protections for teenagers on Facebook and Instagram, the company announced Tuesday. A lawsuit last year from more than half the country’s attorneys general said Meta knowingly released products and features that harm teens’ mental health.

NBC: Meta to start blocking some content from reaching teens on Facebook and Instagram: Meta will begin removing some sensitive and “age-inappropriate” content from teenagers’ feeds, the company said in an announcement published Tuesday. Meta already restricted topics such as self-harm, eating disorders and mental illnesses from being recommended to teens on Reels and the Explore pages of its apps. The new update also restricts these topics from appearing in young users’ feeds and Stories, even if the content was posted by people they follow. The company said it is hiding more search results and terms relating to suicide, self-harm and eating disorders for everyone in coming weeks. Meta said it will be directing people to resources for help if they search these topics.

The Wall Street Journal: Instagram and Facebook Will Stop Treating Teens Like Adults: Meta Platforms plans to automatically restrict teen Instagram and Facebook accounts from harmful content including videos and posts about self-harm, graphic violence and eating disorders. The changes are expected to roll out in the coming weeks. This marks the biggest change the tech giant has made to ensure younger users have a more age-appropriate experience on its social-media sites. The new content restrictions come as more than 40 states are suing Meta, alleging the tech company misled the public about the dangers its platforms pose to young people.

The Washington Post: Instagram’s new teen safety features still fall short, critics say: Instagram and Facebook unveiled further limits on what teens can see on the apps, a move their parent company Meta says will reduce the amount of potentially harmful content young people encounter. Already, teens could opt to have Instagram’s algorithm recommend less “sensitive content” — which includes bare bodies, violence, drugs, firearms, weight-loss content and discussions of self-harm. Now, Meta says it will hide sensitive content even if it’s posted by friends or creators teens follow. The change comes weeks before Meta CEO Mark Zuckerberg is set to testify before the Senate Judiciary Committee about what lawmakers have called the company’s “failure to protect children online.” In the Jan. 31 session, Zuckerberg, along with executives from social apps TikTok, Snap, Discord and X, will respond to online safety concerns such as predatory advertising, bullying and posts promoting disordered eating.

6 ABC Philadelphia: Drexel University study finds screen time for kids under 2 is harmful for sensory development: A new study is revealing why children under the age of 2 should avoid all screen time. Drexel University psychologist David Bennett is the study’s senior author and says the research found that kids under 2 who were exposed to television and movies were more likely to develop sensory processing issues. By 12 months old, children exposed to screens were 105% more likely to develop high sensory behaviors than those with no screen exposure. By 18 months, each additional hour of screen time increased the odds by 23%. By 24 months, each additional hour increased the odds of developing high sensory behaviors by 20%.

Axios: Report: Twitch feature is used to record and share child abuse: A feature offered by Twitch, the Amazon-owned live video streaming platform that’s popular with teens and kids, is being used by predators to record and share child sexual abuse content, per a Bloomberg News analysis. The investigation reveals another way predators have used evolving media and technology to sexually exploit children and teens. Twitch’s “clips” feature allows users to capture seconds-long moments from live streams as short videos that can be edited and shared.

AP: A group representing TikTok, Meta and X sues Ohio over new law limiting kids’ use of social media: A trade group representing TikTok, Snapchat, Meta and other major tech companies sued Ohio on Friday over a pending law that requires children to get parental consent to use social media apps. The law was part of an $86.1 billion state budget bill that Republican Gov. Mike DeWine signed into law in July. It’s set to take effect Jan. 15. The administration pushed the measure as a way to protect children’s mental health, with Republican Lt. Gov. Jon Husted saying at the time that social media was “intentionally addictive” and harmful to kids.

CPO Magazine: FTC Proposal Looks to Bolster Children’s Privacy Online With Stronger Restrictions on Personal Information Monetization: The Federal Trade Commission (FTC) closed out its rulemaking year with a proposal aimed at improving privacy online for underage users, calling for new terms to be added to the existing Children’s Online Privacy Protection Rule (COPPA). The new amendments would bolster children’s privacy by further restricting how companies can collect, use and monetize the data of underage users, shifting a greater share of the responsibility in this area to service providers. The notice of proposed rulemaking was issued just before Christmas and is currently in a mandatory 60-day comment period. Since 2022, members of Congress have been calling for an even more expansive update of COPPA, which was last amended in 2013; some FTC Commissioners have in turn responded by calling on Congress to first pass new legislation before COPPA updates or new data privacy terms are taken up by the agency.

The Wrap: The ‘Sensical’ 7: Predictions for Kids’ Media in 2024: 2024 may emerge as one of media’s most memorable and disruptive years on record. The news over the last 12 months bore a striking resemblance to that of 2022, with familiar themes taking center stage. But when it comes to kids, there is a whole other set of topics to address in the next 12 months.

Penn Today: When young people seem to make threats on social media, do they mean it?: In New York City, law enforcement regularly monitors the social media use of young people who are Black, Indigenous, and people of color (BIPOC), compiling binders of Twitter and Facebook posts to link them to crimes or gangs. Something as benign as liking a photo on Facebook can be used as evidence of wrongdoing in a trial, so when police officers misinterpret social media posts—which often include slang, inside jokes, song lyrics, and references to pop culture—it can lead to serious consequences. To prevent these kinds of misinterpretations, SAFELab, a transdisciplinary research initiative at the Annenberg School for Communication and the School of Social Practice and Policy, has launched a new web-based app that teaches adults to look closer at social media posts: InterpretMe. The tool is currently open to members of three groups—educators, law enforcement, and the press.

NBC: Judge allows lawsuit against Snap from relatives of dead children to move forward: Relatives of over 60 young people who died from fentanyl overdoses sued Snap in October 2022 over its messaging platform Snapchat’s disappearing message feature. An extended version of the complaint filed in April 2023 said that “Snap and Snapchat’s role in illicit drug sales to teens was the foreseeable result of the designs, structures, and policies Snap chose to implement to increase its revenues.” The complaint said that Snapchat’s disappearing messages allow those engaging in illegal conduct to obscure their actions. Social media companies have typically been shielded from many lawsuits under Section 230 of the Communications Decency Act, a law that gives many online tech companies like Snap protection from legal claims stemming from activities that occur on their platforms. However, parts of this lawsuit appear to have sidestepped Section 230 for now. The plaintiffs are seeking unspecified compensatory and punitive damages.

Forbes: More Students Are Now Being Taught Media Literacy—Will It Stop Misinformation?: California became the latest state to require media literacy instruction at every grade level last fall. The Instructional Quality Commission has begun slowly rolling out a curriculum framework on the topic, and the Golden State will also consider how to incorporate media literacy into English language arts, math, science, history and social science lessons. In addition, the framework will guide media literacy instruction as a way to help build critical thinking skills while “developing strategies to strengthen digital citizenship” for each student, according to the law.

NPR: One reason social media companies aren’t doing more to protect children? Ad revenue: NPR’s Rob Schmitz asks Amanda Raffoul of Harvard’s School of Public Health about a new estimate of the amount of money social media companies make on advertisements to users 17 and younger.

USA Today: New law in Ohio cracks down on social media use among kids: What to know: Starting in January, a new state law will make it harder for children under 16 to access social media in Ohio. On Jan. 15, Ohio’s Social Media Parental Notification Act will go into effect, and big-name companies will have until then to comply. Ohio Gov. Mike DeWine signed the state’s annual budget in July, which included the Social Media Parental Notification Act. Here’s everything to know about Ohio’s new law and how it works.

Associated Press: Social media companies made $11 billion in US ad revenue from minors, Harvard study finds: Social media companies collectively made over $11 billion in U.S. advertising revenue from minors last year, according to a study from the Harvard T.H. Chan School of Public Health published on Wednesday. The researchers say the findings show a need for government regulation of social media since the companies that stand to make money from children who use their platforms have failed to meaningfully self-regulate. They note such regulations, as well as greater transparency from tech companies, could help alleviate harms to youth mental health and curtail potentially harmful advertising practices that target children and adolescents.

The Washington Post: Posting kids online is risky. Here’s how to remove their images: By now, we’re familiar with the risks of sharing photos and videos of minors to websites or social media apps, where they can be used for bullying or misused by strangers. An evolving threat is artificial intelligence tools, which are improving at a dizzying pace. They can be fed real images and photos to make “deep fakes.” It’s already happening. New Jersey high school students allegedly used AI tools to make sexualized images of their classmates using “original photos” last summer. A high school student in Issaquah, Wash., allegedly used real photos of classmates to make sexualized images, which were then shared around. And in Spain, parents of more than 20 girls between the ages of 11 and 17 say photos of their children were altered using AI tools to create sexual images. AI tools “need as little as one picture now,” says Wael Abd-Almageed, distinguished principal scientist and research director at the University of Southern California’s Information Sciences Institute. “You can train AI to pick up the facial features of somebody, so if the AI can pick up the facial features for a child, you can replace them in a video.”

Forbes: Social Media is Accelerating Health Care Concerns Of Our Youth: Recent research from the Social Media and Youth Mental Health Advisory has found that 95% of teenagers ages 13-17 use a social media app, and more than a third say they use it “almost constantly.” In addition, Common Sense Media research found that the average teen spends 7 hours and 22 minutes looking at screens each day. Since 2015, American teens’ screen time has increased by around 2 hours. In addition, teenage boys clock up nearly 1 hour more daily screen time than teenage girls. Looking specifically at social media, people under the age of 18 spend an average of 1 hour 47 minutes on TikTok every day. Research is showing that social media can perpetuate “body dissatisfaction, disordered eating behaviours, social comparison, and low self-esteem, especially among adolescent girls.” Nearly 1 in 3 adolescents report using screens until midnight or later, and most teenagers are glued to their cellphones and social media. These levels of addiction are alarming and pervasive.

PCMag: Opinion: A No-Screen-Time Policy for Kids Is the Wrong Move: Every parent today has to address this question: Should you let your kids use technology? It’s a fraught decision process. After careful consideration, I’ve concluded that my kids—and yours—are better off learning how to navigate the internet safely rather than being shielded from it. Not everyone will agree with this, and they’ll be sure to let you know. By the time my oldest child was toddling around, I already felt overwhelmed by the endless opinions other people had about how I should raise my kids in a technology-laden world. Extended family, other parents, and influencers warned me against the damage screens could cause, but I also needed to take a shower sometimes—and giving my kid a tablet with Cocomelon playing on it let me do just that. And more importantly, there isn’t any evidence that eliminating technology does any good.

The Hill: Opinion: Laws to protect children online are coming. Smart companies won’t need them: “We need a clear regulatory blueprint for the policies and services needed from websites and platforms across the internet.” Interestingly, those words came from the CEO of YouTube, Neal Mohan. Unfortunately, as whistleblower Frances Haugen wrote recently in this publication, the U.S. technology sector’s public acceptance of online safety is belied by behind-the-scenes attacks on proposals — federal and state — to make the internet safer. Regulation is common sense: Rules govern every industry to ensure products and services are safe for users and the public. Next month, the CEOs of Meta, Snap, Discord, TikTok and X (formerly Twitter) will appear before the Senate Judiciary Committee to discuss the online sexual exploitation of children — a crime that, as executive director of International Justice Mission’s Center to End the Online Sexual Exploitation of Children, I have been fighting for over a decade.

The New York Times: U.S. Regulators Propose New Online Privacy Safeguards for Children: The Federal Trade Commission on Wednesday proposed sweeping changes to bolster the key federal rule that has protected children’s privacy online, in one of the most significant attempts by the U.S. government to strengthen consumer privacy in more than a decade. The changes are intended to fortify the rules underlying the Children’s Online Privacy Protection Act of 1998, a law that restricts the online tracking of youngsters by services like social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the moves would “shift the burden” of online safety from parents to apps and other digital services while curbing how platforms may use and monetize children’s data.

New Jersey Monitor: New Jersey’s proposal to limit children’s social media use has a big obstacle: the First Amendment: A new bill moving with lightning speed through the Legislature has a noble intention: to protect children from the toxicity of social media. To get there, though, the bill would require social media companies to verify the age of all of their users in New Jersey. That means having to fork over your driver’s license to social media companies or using the same identity verification software now used to communicate online with the Internal Revenue Service.

The New York Times: Former YouTube Parenting Channel Host Pleads Guilty to Child Abuse: Ruby Franke, the host of a now-defunct YouTube parenting channel, pleaded guilty on Monday to four counts of aggravated child abuse over the treatment of her children. Ms. Franke, who was known for videos that chronicled her strict parenting style on social media, was arrested along with her business partner in Utah in September. The police said at the time that two of Ms. Franke’s children appeared malnourished, and one of them had duct tape on his ankles and wrists as well as open wounds. Her plea was part of an agreement with prosecutors.

Associated Press: A group representing TikTok, Meta and X sues Utah over strict new limits on app use for minors: A trade group that represents TikTok and other major tech companies sued Utah on Monday over its first-in-the-nation laws requiring children and teens to obtain parental consent to use social media apps. Two laws signed in March by Republican Gov. Spencer Cox will prohibit minors from using social media between the hours of 10:30 p.m. and 6:30 a.m. unless authorized by a parent — and require age verification to open and maintain a social media account in the state. The restrictions are designed to protect children from targeted advertisements and addictive features that could negatively impact their mental health. Both laws take effect March 1, 2024. The NetChoice trade group argues in its federal lawsuit that although Utah’s regulations are well-intentioned, they are unconstitutional because they restrict access to public content, compromise data security and undermine parental rights.

NBC: Is social media harming teens? A dive into the research cites risks but returns few hard answers: Is social media harming teenagers? And what can Congress, the Education Department and parents do about it? The answers are murky. The report’s authors surveyed hundreds of studies across more than a decade and came to complicated, occasionally contradictory conclusions. On one hand, they found there isn’t enough population data to specifically blame social media for changes in adolescent health. On the other hand, as shown in study after study cited by the report, social media has the clear potential to hurt the health of teenagers, and in situations where a teenager is already experiencing difficulties like a mental health crisis, social media tends to make it worse.

Forbes: Meta Remains A Dangerous Place For Children, Recent Lawsuits Claim: Just weeks after 33 states filed suit against the Facebook parent, the attorneys general of New Mexico and Montana have sued Meta, alleging the social network has endangered younger users. New Mexico Attorney General Raul Torrez seeks a $5,000 fine for each alleged violation of New Mexico’s Unfair Practices Act and an order enjoining the company from “engaging in unfair, unconscionable, or deceptive practices.” Meta responded to the claim in a statement, “We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.” It also has claimed to have disabled over 250,000 accounts on Instagram for violating these rules.

Axios: Pew: Many teens use social media “almost constantly”: Nearly 1 in 5 teens say they’re on YouTube or TikTok “almost constantly,” according to a Pew Research Center report. The report paints a picture of a rising generation whose lives are dominated by a handful of social platforms — amid ongoing debate over the possible mental health harms that could result. Pew’s latest survey on teens and technology — which polled 1,453 kids online, ages 13-17 — found roughly the same amount of internet use as last year, but substantially more than when the survey was conducted in 2014-2015.

The Washington Post: YouTube, wildly popular with teens, gets a pass for kids safety hearing: The Senate Judiciary Committee last month announced that it will hear testimony from the chief executives of five major tech companies about their “failures to protect children online.” It’s the latest such session as lawmakers increasingly scrutinize how social media platforms may be exposing children to harmful content and contributing to youth mental health problems. The hearing, set for Jan. 31, will be one of the biggest of its kind in years, with Meta’s Mark Zuckerberg and TikTok’s Shou Zi Chew set to testify alongside the CEOs of Snap, Discord and X, who all will be appearing for the first time for their companies. But the lineup for the session featured a notable omission: Google-owned YouTube. The video-sharing platform is one of the most popular digital services among kids and teens, outpacing rivals in key usage metrics according to numerous studies, but the panel has not yet publicly called on its executives to testify or indicated any plans to do so this session.

Newsweek: Opinion: When Protecting Kids Online, Don’t Let Apple and Google Off the Hook: In the last year, Utah, Arkansas, Louisiana, and Texas passed landmark laws mandating age verification and parental consent for minors to access social media. We have worked with many state legislators to draft principles for this legislation. With America’s youth caught in social media addiction and internet-mediated experiences influencing every part of their lives, parents need tools to control who is talking to and influencing their kids as they raise them. In response, some Big Tech companies have either lobbied against the bills or tried to carve out exceptions for themselves.

MSNBC: Opinion: How social media algorithms can lead users into temptation: If you’re an adult who follows “only young gymnasts, cheerleaders and other teen and preteen influencers active on” Instagram, what other content is the Instagram Reels algorithm likely to recommend that you check out? The answer, according to a recent Wall Street Journal investigation, is “jarring doses of salacious content … including risqué footage of children.” To understand what’s going on here, let’s step out of the digital world and go “brick and mortar.” Let’s think about that friend who always encourages you to order just one more drink at the bar. This friend can be a lot of fun, in moderation and in adults-only settings. In large doses and in an all-ages setting, this friend can become a dangerous creep — and turn you into one, too.

Pittsburgh Post-Gazette: Not all screen time is bad: A new study yields real-world advice for parents of young children: More screen time was associated with a higher risk of mental health issues, regardless of what was watched. But for the many families who do engage with screens, there are better ways to spend that time, the study suggests. Higher amounts of screen time spent on educational programming were associated with a lower risk of mental health problems, meaning there’s nothing wrong with a little Abby Cadabby or Big Bird. Conversely, higher amounts of non-child-directed programming — adult shows — were associated with a higher risk of mental health problems. Previous research has largely examined only the amount of screen time or only the type of programming; in the few studies that examined both, the assessments weren’t repeated, as they were in this study. These results lend credence to recommendations issued by the American Academy of Pediatrics, which ask caregivers to limit non-educational screen time for children ages 2 through 5 to one hour per day on weekdays and three hours per day on weekends.

CNBC: Harvard happiness expert’s ‘strict’ social media and news consumption policy that he recommends for everyone: Arthur C. Brooks, a professor at Harvard University and social scientist who studies happiness, thinks we should all use social media a lot less to improve our wellbeing. Brooks recommends that everyone be mindful of not just their social media use, but also their overall news consumption for their own happiness. It’s also important to note that the effects of extended social media use on mental health have become a major concern for children and teens. Parents can set a good example for their children by showing them what a healthy relationship with social media looks like.

Mashable: Do you know who’s posting pictures of your kid online?: Recently, a group of teen girls made the shocking discovery that boys in their New Jersey high school had rounded up images they’d posted of themselves on social media, then used those pictures to generate fake nudes. The boys, who shared the nudes in a group chat, allegedly did this with the help of a digital tool powered by artificial intelligence, according to the Wall Street Journal. The incident is a frightening violation of privacy. But it also illustrates just how rapidly AI is fundamentally reshaping expectations regarding what might happen to one’s online images. What this means for children and teens is particularly sobering.

The Wall Street Journal: Facebook and Instagram Steer Predators to Children, New Mexico Attorney General Alleges in Lawsuit: Facebook and Instagram recommend sexual content to underage users and promote minors’ accounts to apparent child predators, the state of New Mexico alleges in a lawsuit against parent company Meta Platforms and its CEO. The civil lawsuit, filed Tuesday in New Mexico state court, alleges that “Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey.” It also claims that Meta has failed to implement protections against usage by children below the age of 13 and has targeted the age-related vulnerabilities of children in the interest of increasing its advertising revenue. The suit says Chief Executive Mark Zuckerberg is personally responsible for product decisions that aggravated risks to children on Meta’s platforms.

The Verge: Facebook and Instagram accused of creating a “marketplace” for child predators in new lawsuit: Meta and its CEO, Mark Zuckerberg, allowed Facebook and Instagram to become a “marketplace for predators in search of children,” a new lawsuit from the New Mexico attorney general alleges, according to a report from The Wall Street Journal. The lawsuit, filed in state court on Tuesday, also claims Meta’s algorithms recommend sexual content to children. As outlined in the complaint, the New Mexico attorney general’s office conducted an investigation that involved creating test profiles on Facebook and Instagram that appeared to be teenagers or preteens. Not only did the office find inappropriate recommendations for each of the decoys, such as an account that “openly” posted adult pornography, but it also found that the decoys attracted predators.

The Hill: Al Gore calls social media algorithms ‘digital’ AR-15s: Former Vice President Al Gore took aim at social media algorithms Tuesday, saying sites that are “dominated by algorithms” are the “digital equivalent of AR-15s.” Gore, speaking at the Bloomberg Green at COP28 event, said spending too much time scrolling on social media could be dangerous and suggested algorithms be banned. Numerous lawmakers have raised concerns about the use of social media among children.

The Washington Post: How Meta can — or can be forced to — avoid addicting kids: The state AGs may struggle in the courts, but the excessive social media use they describe deserves attention from legislatures — careful attention. There are plenty of bad ways to prevent platforms from creating a health crisis for the country’s children. Many have been tried already by overzealous lawmakers: Rules that mandate the removal or forbid the promotion of particular types of content run up against the First Amendment. Similarly, policies that expose platforms to liability for generally “causing” a kid to become addicted to social media are far too broad. The wiser route is to focus on design: the little things in a platform’s makeup that push teen users toward overuse — and, indeed, those that pull them away from it.

Digital Information World: Excessive Social Media Usage Might Increase Risk Seeking Behaviors Among Children: A recent study conducted at the University of Glasgow has revealed a troubling link between excessive social media use and risk-seeking behaviors. There appears to be a positive correlation between minors who use social media excessively and their propensity to engage in unsafe sexual practices, consume drugs and alcohol, smoke, and gamble.

In The Know: Gen Z Activist Chris McCarty Is On A Mission To Advocate For The Children Of Influencers: Gen Z is the first generation to grow up totally online. However, growing up with influencer parents has its issues, especially when it comes to monetized content. In this episode of ITK: Behind the Screens, our host, Niamh Adkins, looks into how people advocate for new standards for children on social media. To better understand this topic, Niamh chats with Chris McCarty, the student founder and executive director of Quit Clicking Kids, which is now making waves as an advocate for children’s privacy online.

Forbes: Social Media Use Linked With Risky Behavior In Adolescents, Study Finds: Adolescents who use social media such as Instagram, Snapchat, YouTube and TikTok every day are more likely to engage in risky behavior, according to a new study. Children as young as 10 are at greater risk of using alcohol, tobacco and drugs, or taking part in anti-social behavior or risky sexual behavior if they are frequent or daily social media users. And spending at least two hours a day on social media doubles the odds of consuming alcohol compared with children who are on social media for less than two hours a day. But while researchers established the link between social media use and risky behavior, they say more work is needed to confirm that one causes the other.

Reuters: TikTok, Meta, X CEOs to testify at US Senate hearing in January: The chief executives of social media companies Meta, X, TikTok, Snap and Discord will testify on online child sexual exploitation at a Jan. 31 U.S. Senate hearing, the Senate Judiciary Committee said on Wednesday. Senator Dick Durbin, the panel’s Democratic chairman, and ranking Republican Lindsey Graham said Discord and X had initially balked at participating and refused to accept a subpoena. “Now that all five companies are cooperating, we look forward to hearing from their CEOs,” they said in a statement. It will be the first appearance by TikTok CEO Shou Zi Chew before U.S. lawmakers since March, when the Chinese-owned short video app company faced harsh questions, including some suggesting the app was damaging children’s mental health.

The Hill: Keeping children safe in a rapidly changing digital landscape: Recently, advocacy groups have been pushing fiercely for Congress to pass two bills: COPPA 2.0, which would update the data privacy rules for minors, and the Kids Online Safety Act (KOSA), which would create a duty of care for social media platforms to prevent harm. KOSA would give the FTC and state attorneys general enforcement authority. Both bills advanced out of the Senate Commerce Committee with bipartisan support in July. The bills also received bipartisan support from the committee last year but failed to make it to a full floor vote before the end of the session. Despite that bipartisan support, a coalition of LGBTQ rights groups has raised concerns that KOSA could be weaponized by state attorneys general to censor information about LGBTQ health.

Technical.ly: Philly parents worry about kids’ digital media use but see some benefits, too: I am a professor of library and information science at Drexel University’s College of Computing and Informatics. My colleague Yuanyuan Feng and I conducted in-depth research interviews in 2019-22 with 17 parents at three branches of the Free Library of Philadelphia. The goal was to study how parents manage media use within their families. All of the parents — who represented a range of educational, socioeconomic, racial and ethnic backgrounds — were Philadelphia residents with at least one child age 5 to 11. Although we did not set out to study parental concerns about children’s media use, every one of the parents expressed worries. Only eight parents discussed any positive aspects of media use. Our research suggests promoting balance — rather than preventing addiction — is a better goal for managing kids’ digital media use.

The New York Times: How Your Child’s Online Mistake Can Ruin Your Digital Life: When Jennifer Watkins got a message from YouTube saying her channel was being shut down, she wasn’t initially worried. She didn’t use YouTube, after all. Her 7-year-old twin sons, though, used a Samsung tablet logged into her Google account to watch content for children and to make YouTube videos of themselves doing silly dances. Few of the videos had more than five views. But the video that got Ms. Watkins in trouble, which one son made, was different. “Apparently it was a video of his bottom,” said Ms. Watkins, who has never seen it. “He’d been dared by a classmate to do a nudie video.” Google-owned YouTube has A.I.-powered systems that review the hundreds of hours of video that are uploaded to the service every minute. The scanning process can sometimes go awry and tar innocent individuals as child abusers.

The Street: Instagram is still plagued by a disturbing issue that Meta says it’s making headway on solving: Social media algorithms and the way they work are among the most closely guarded secrets in Silicon Valley. The algorithms are the engines that drive the user experience, and, similar to artificial intelligence, the way they operate is reliant on the data that is fed into them. Over the weekend, an unsealed complaint in a lawsuit filed against Meta Platforms by 33 states alleges that despite the company publicly stating that Instagram is only for users 13 and older, the company is not only allowing kids under the age of 13 to use the platform but has also “coveted and pursued” that demographic for years.

New York Times: At Meta, Millions of Underage Users Were an ‘Open Secret,’ States Say: Meta has received more than 1.1 million reports of users under the age of 13 on its Instagram platform since early 2019, yet it “disabled only a fraction” of those accounts, according to a newly unsealed legal complaint against the company brought by the attorneys general of 33 states. Instead, the social media giant “routinely continued to collect” children’s personal information, like their locations and email addresses, without parental permission, in violation of a federal children’s privacy law, according to the court filing. Meta could face hundreds of millions of dollars, or more, in civil penalties should the states prove the allegations.

The Hill: Three bipartisan things we can do now to save kids from social media’s harms: According to internal surveys, approximately 22 percent of Instagram users ages 13 to 15 reported being victims of bullying, 24 percent were subjected to unwanted advances, and 39 percent experienced negative comparison. Béjar stated, “We cannot trust [Meta] with our children and it’s time for Congress to act.”  These new revelations, building on the disclosures by Meta whistleblower Frances Haugen, follow Surgeon General Dr. Vivek Murthy’s advisory this May, warning that social media poses a significant threat to the psychological health and well-being of kids and teens. Dr. Murthy implored legislators, tech firms and parents to take immediate action.  This has created a perfect storm, igniting a bipartisan drive for new regulations. In fact, in these divided times, this is one of the few topics that voters and leaders on both sides of the aisle can get behind. 

CNBC: X, Snap and Discord CEOs subpoenaed by lawmakers to testify about child sexual exploitation: Lawmakers said Monday that they have issued subpoenas to the CEOs of X, Snap and Discord to compel the executives to testify at a hearing regarding online child sexual exploitation. Sens. Dick Durbin, D-Ill., and Lindsey Graham, R-S.C., said they issued the subpoenas to the executives after “repeated refusals to appear during several weeks of negotiations.” “Since the beginning of this Congress, our Committee has rallied around a key bipartisan issue: protecting children from the dangers of the online world,” the senators wrote in a joint statement. “It’s at the top of every parent’s mind, and Big Tech’s failure to police itself at the expense of our kids cannot go unanswered.”

The Hill: Congress needs to protect kids, not Big Tech: Social media is a threat to our children. A bipartisan Congress is now stepping up to make Big Tech products safe for our children, but the social media companies are putting their incredible lobbying power behind efforts to break the momentum. Social media impacts not only teens’ mental health but their physical health as well. A recent study in the Journal of Family Medicine and Primary Care found kids experience “anxiety, respiratory alterations, trembling, perspiration, agitation, disorientation and tachycardia” when they are not near their phones, their portals to social media and the internet.

Axios: Scoop: Biden’s team weighs joining TikTok to court young voters: President Biden’s re-election campaign privately has been weighing whether to join the social media platform TikTok to try to reach more young voters, according to two people familiar with the conversations.

The Telegraph: Why I don’t post pictures of my child on social media – and never will: I don’t post pictures of my child on social media – and never will. Those forms schools and clubs get you to fill out, asking whether you “consent to images being shared” on their website? I always tick “no”. Like every columnist, I will occasionally use something she has said or done to illustrate a wider point in print, but I would never offer up any of her personal feelings or private challenges for public consumption. I thought I’d done everything I could to shield her from prying eyes, but when I started researching my new book, The Square, I realised that no matter how careful you’ve been – and whether your parents happen to be in the public eye or not – there will always be an alarming quantity of details about you online. Enough crumbs of information about your life for anyone to piece together, should they choose to. According to the Information Commissioner’s Office, when it comes to identity theft, just your name, address and birth date is enough to create another “you”. So if someone is collecting information about you from the internet, you’d better hope they have a positive agenda.

The New York Times: If Your Child Is Addicted to TikTok, This May Be the Cure: Over the past few years, hundreds of families and school districts around the country have sued big tech companies on the grounds that the hypnotic properties of social media popular with children have left too many of them unwell…Tech companies, claiming First Amendment protections, have sought to get these sorts of suits quickly dismissed. But on Tuesday, a federal judge in California issued a ruling to make that more difficult. In it, she argued that what concerned plaintiffs most — ineffective parental controls, the challenges of deleting accounts, poor age verification and the timing and clustering of notifications to ramp up habitual use — was not the equivalent of speech, so the suits under her review should be allowed to proceed.

The Washington Post: Meta says vetting teens’ ages should fall on app stores, parents: Meta is pushing for rival tech giants such as Google and Apple to play a bigger role in keeping teens off potentially harmful sites, calling for the first time for legislation to require app stores to get parental approval when users age 13 to 15 download apps. The proposal, which the Facebook and Instagram parent company is set to announce Wednesday, counters mounting calls by state and federal policymakers for individual sites to proactively screen kids to limit their use of social media platforms over safety concerns.

Reuters: Senators demand documents from Meta on social media harm to children: A bipartisan group of U.S. Senators has written to Meta Platforms CEO Mark Zuckerberg demanding documents about its research into the harm to children from its social media platforms. A whistleblower’s release of documents in 2021 showed Meta knew Instagram, which began as a photo-sharing app, was addictive and worsened body image issues for some teen girls. “Members of Congress have repeatedly asked Meta for information on its awareness of threats to young people on its platforms and the measures that it has taken, only to be stonewalled and provided non-responsive or misleading information,” the senators wrote in a letter.

Ars Technica: Judge tosses social platforms’ Section 230 blanket defense in child safety case: This week, some of the biggest tech companies found out that Section 230 immunity doesn’t shield them from some of the biggest complaints alleging that social media platform designs are defective and harming children and teen users. On Tuesday, US district judge Yvonne Gonzalez Rogers ruled that discovery can proceed in a lawsuit documenting individual cases involving hundreds of children and teens allegedly harmed by social media use across 30 states. Their complaint alleged that tech companies were guilty of negligently operating platforms with many design defects—including lack of parental controls, insufficient age verification, complicated account deletion processes, appearance-altering filters, and requirements forcing users to log in to report child sexual abuse materials (CSAM)—and failed to warn young users and their parents about those defects.

Reuters: Social media companies must face youth addiction lawsuits, US judge rules: U.S. District Judge Yvonne Gonzalez Rogers in Oakland, California, ruled against Alphabet, which operates Google and YouTube; Meta Platforms, which operates Facebook and Instagram; ByteDance, which operates TikTok; and Snap, which operates Snapchat. The decision covers hundreds of lawsuits filed on behalf of individual children who allegedly suffered negative physical, mental and emotional health effects from social media use including anxiety, depression, and occasionally suicide. The litigation seeks, among other remedies, damages and a halt to the defendants’ alleged wrongful practices.

The Verge: Social media giants must face child safety lawsuits, judge rules: School districts across the US have filed suit against Meta, ByteDance, Alphabet, and Snap, alleging the companies cause physical and emotional harm to children. Meanwhile, 42 states sued Meta last month over claims Facebook and Instagram “profoundly altered the psychological and social realities of a generation of young Americans.” This order addresses the individual suits and “over 140 actions” taken against the companies. Tuesday’s ruling states that the First Amendment and Section 230, which says online platforms shouldn’t be treated as the publishers of third-party content, don’t shield Facebook, Instagram, YouTube, TikTok, and Snapchat from all liability in this case. Judge Gonzalez Rogers notes many of the claims laid out by the plaintiffs don’t “constitute free speech or expression,” as they have to do with alleged “defects” on the platforms themselves. That includes having insufficient parental controls, no “robust” age verification systems, and a difficult account deletion process.

The Conversation: TV can be educational but social media likely harms mental health: what 70 years of research tells us about children and screens: Ask any parent and it’s likely they’ll tell you they’re worried about their kids’ screen time. A 2021 poll found it was Australian parents’ number one health concern for their kids – ahead of cyberbullying and unhealthy diets. But how worried should parents be? The information that’s out there can be confusing. Some psychologists have compared it to smoking (amid concerns about “secondhand screen time”), while others are telling us not to worry too much about kids and screens. Academics are also confused. As The Lancet noted in 2019, researchers’ understanding of the benefits, risks and harms of the digital landscape is “sorely lacking”. In our new research, we wanted to give parents, policymakers and researchers a comprehensive summary of the best evidence on the influence of screens on children’s physical and psychological health, education and development.

WBRE: The effects social media can have on teens: According to the U.S. Department of Health and Human Services, 95% of kids between 13 and 17 years of age use social media. One in three reports using the online platforms almost constantly. With an uptick in social media usage among teens, there has also been an increase in teen mental health issues like anxiety and depression, but there are ways parents can safeguard their kids’ online interactions and prevent the pitfalls related to online usage.

NBC News: Omegle, the anonymous video chat site, shuts down after 14 years: Launched in 2009, the website initially gained traction with teens but remained a relatively fringe video-chatting platform, though clips of funny or strange interactions and pairings sometimes spread across the internet. Its cultural resonance ebbed and flowed, with a new burst of popularity on TikTok and YouTube in 2020. Not long after its launch, Omegle gained a reputation as a platform that struggled to stop child sexual abuse. Omegle has been named in numerous Department of Justice publications announcing the sentencing of people convicted of sex crimes.  The website was sued in 2021 for allegedly having a “defectively designed product” and enabling sex trafficking after the service matched a girl, then 11, with a man who later sexually abused her.

The New York Times: Opinion: It’s Not Kids With the Cellphone Problem, It’s Parents: It’s not the school’s job to police kids’ phone habits, something parents are acutely aware isn’t easy. And that gets to the thorny crux of the issue: Parents are often the problem. When one group of parents in my district confronted the administration about its lax policy toward cellphones, the principal said whenever he raised the issue, parents were the ones who complained. How would they reach their children?! But if we expect our kids to comply with no-phones policies, we’ve got to get over the deprivation. Our own parents would just call the front office — in an emergency. Not because they wanted to make sure we remembered to walk the dog. And really, if we’re trying to teach kids to be safe, responsible and independent, shouldn’t we give them the leeway to do so? Phones don’t teach kids these values; parents do.

NPR: Meta failed to address harm to teens, whistleblower testifies as Senators vow action: Former Meta engineer Arturo Bejar was testifying in front of a Senate Judiciary subcommittee hearing centered on how algorithms for Facebook and Instagram (both owned by parent company Meta) push content to teens that promotes bullying, drug abuse, eating disorders and self-harm. Bejar’s job at the company was to protect the social media site’s users. He said that when he raised the flag about teen harm to Meta’s top executives, they failed to act. Bejar is the latest Facebook whistleblower to supply Congress with internal documents that show Meta knows kids are being harmed by its products. His testimony comes after The Wall Street Journal reported on his claims last week. Lawmakers have now heard testimony from dozens of kids, parents and even company executives on the topic. And it seems to have reached a boiling point.

The Washington Post: Former Facebook staffer to speak out in hopes of jolting Congress: When whistleblower Frances Haugen warned Congress at an October 2021 hearing that Facebook and Instagram were exposing children to harm, lawmakers expressed hope that the testimony would hasten their efforts to pass fresh protections for kids online. That same day, another Facebook worker, Arturo Béjar, was privately sounding the alarm that the company was not taking the safety of its users, particularly teens, seriously enough. Béjar, a former Facebook engineering director and consultant, will now get his own shot to move the needle on Capitol Hill, where he is poised to urge lawmakers in public testimony to seize what he called an “urgent” moment for children’s safety. Béjar, who first spoke out in a Wall Street Journal report last week, had warned Facebook chief Mark Zuckerberg and his top lieutenants on the day Haugen first testified about a “critical gap in how we as a company approach harm,” according to documents we reviewed. Béjar cited internal survey data suggesting the company was underestimating how frequently users under 16 faced bullying, unwanted sexual advances or other negative encounters. 

Reuters: Former Meta employee tells Senate company failed to protect teens’ safety: A former Meta employee testified before a U.S. Senate subcommittee on Tuesday, alleging that the Facebook and Instagram parent company was aware of harassment and other harms facing teens on its platforms but failed to address them. The employee, Arturo Bejar, worked on well-being for Instagram from 2019 to 2021 and earlier was a director of engineering for Facebook’s Protect and Care team from 2009 to 2015, he said. Bejar testified before the Senate Judiciary Subcommittee on Privacy, Technology and the Law at a hearing about social media and its impact on teen mental health. Bejar’s testimony comes amid a bipartisan push in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.

AP: Meta engineer testifies before Congress on Instagram’s harms to teens: Arturo Béjar, known for his expertise on curbing online harassment, recounted to Zuckerberg his own daughter’s troubling experiences with Instagram. But he said his concerns and warnings went unheeded. And on Tuesday, it was Béjar’s turn to testify to Congress. Béjar worked as an engineering director at Facebook from 2009 to 2015, attracting wide attention for his work to combat cyberbullying. He thought things were getting better. But between leaving the company and returning in 2019 as a contractor, Béjar’s own daughter had started using Instagram. In the 2021 note, as first reported by The Wall Street Journal, Béjar outlined a “critical gap” between how the company approached harm and how the people who use its products — most notably young people — experience it.

The Seattle Times: States, schools take on Meta to protect children: Out of concern for children and teens, Washington is among 41 states and Seattle Public Schools among at least 190 school districts that have filed lawsuits against Meta, the parent company of Facebook and Instagram. The state plaintiffs allege Meta has “profoundly altered the psychological and social realities of a generation of young Americans.” SPS Superintendent Brent Jones said: “Our students — and young people everywhere — face unprecedented learning and life struggles that are amplified by the negative impacts of increased screen time, unfiltered content, and potentially addictive properties of social media.”

The Guardian: I resist sharenting on social media. Does that mean my son and I are missing out, or is it just safer?: A few years ago, sharenting, as it’s been called, felt like the norm among my social circle. These days I see far fewer babies’ faces on social media. Concerns about online privacy and safeguarding, as well as facial recognition and the commercial use of personal data, are far more prevalent than they were in the early days of Facebook. In fact, you could say that whether or not you share photos has become another parental identity marker, up there with breastfeeding, cloth nappies and baby-led weaning as evidence that you’re doing things “the right way”, not like “those other parents”.

The Boston Globe: How parents can deal with social media addiction in teens, kids: It surprised us to hear that most of the adolescents in our daily therapy group were using social media for double-digit hours each day, and that over half articulated a relationship between their mental health and the messaging they received through social media. To our greater surprise, all but one felt there was a need for some kind of external controls. They could not imagine decreasing their usage all by themselves. We hear about parents whose instinct is to react in the moment — perhaps a bit too late and often out of frustration — with an authoritarian, all-or-nothing approach. Many want to take away their child’s device or install content trackers to see their every swipe. Before taking drastic measures, we encourage parents to step back and work to resolve their feelings of guilt. Trust us when we say it’s not your fault. The explosive growth of social media was like a speeding train; there was no time to anticipate its power nor opportunity for a thoughtful response. The devices used to access these platforms were a Trojan horse, welcomed into our homes in the form of a telephone.

The Hill: Meta whistleblower to testify in Senate hearing on child safety, social media: Arturo Bejar, a former Facebook engineering director who later worked as a company consultant, will testify before a Senate subcommittee about social media and its impact on the teen mental health crisis, the panel announced Friday. The testimony comes amid a bipartisan push in Congress to adopt regulations aimed at protecting kids online. Bejar has also met with Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), the lead sponsors of the Kids Online Safety Act (KOSA), according to the senators’ offices. The senators added it is “time to say ‘enough is enough’ to Big Tech” and pass their legislation to address the harms from tech companies’ actions toward children.

6ABC Philadelphia: How much is too much? Exploring possible dangers of teen social media use: Public health experts are sounding the alarm over the potential risks when kids, particularly teens, are on social media. U.S. Surgeon General Vivek Murthy has indicated that social media may be playing a role in the teen mental health crisis. It’s best to delay social media use as long as you can, but realistically, it’s not a matter of if they’ll be on it but when. And while teens tend to be more tech-savvy than their parents, there are some things you can do to help them navigate this online world in a healthy way.

Observer Reporter: Social addiction: Area experts say constant scrolling can lead to mental health woes in teens: The impact of social media apps such as Facebook and Instagram on children and teens is getting more attention following a multi-state lawsuit against Meta that alleges the company intentionally designed its products with addictive features that negatively impact young users. Emily Walentosky, a school psychologist at California Area School District, said social media is a big part of many kids and teens’ lives, but spending too much time online is harmful.

MyChesCo: Pennsylvania Joins Multi-State Lawsuit Against Meta, PFSA Advocates for Family Digital Wellness: Pennsylvania has joined 32 other states in a federal lawsuit against Meta Platforms, Inc., the parent company of Facebook and Instagram. The lawsuit alleges that Meta’s social media platforms violate consumer protection laws by exposing young users to harmful, manipulative, and addictive content. Pennsylvania Attorney General Michelle Henry has voiced strong opposition to these practices, stating, “The time has come for social media giants to stop trading in our children’s mental health for big profits.” She further accused Meta of promoting a “click-bait culture” that is psychologically damaging to children.

Pittsburgh Tribune-Review: Editorial: Meta lawsuit is only part of necessary actions on social media: Last week, Pennsylvania Attorney General Michelle Henry was one of 33 state AGs to join a lawsuit against Meta, the company behind Instagram, Facebook, WhatsApp and Oculus. And that is the crux of the lawsuit, which claims children are being damaged by the company’s social media operations in ways neither they nor their parents realize. The problem is more than the content. That can be troubling but is more easily addressed by blocking what one doesn’t like or encouraging (or demanding) better moderation by the company. Federal and state governments need to look at social media companies and rethink what it means to have all that power with so little consequence. This is one sticky web. We need to stop kids from getting caught in it.

Sanford Health: Kids on social media must mind their mental health: Behavioral health leaders across the U.S. are urging parents to monitor the social media habits of their children, citing it as a factor in the increase in mental health issues in adolescents. The progression from what starts as a standard diversion, like watching TV or playing video games, to something darker can be a slippery slope, says Dene Hovet, associate behavioral health counselor for Sanford Health. In many cases, neither kids nor parents quite realize the extent to which the internet and social media have taken over their lives.

Gallup News: Parenting Mitigates Social Media-Linked Mental Health Issues: Teenagers who spend more time on social media experience worse mental health on a variety of measures, according to data from a new Gallup survey. Yet, the strength of the relationship between an adolescent and their parent is much more closely related to their mental health than their social media habits. When teens report having a strong, loving relationship with their parents or caretakers, their level of social media use no longer predicts mental health problems. The data inform debates about the consequences of social media use.

Los Angeles Times: Editorial: Social media can harm kids. Lawsuits could force Meta, others to make platforms safer: It’s a rare issue that can bring 41 states together for a bipartisan fight. This week, state attorneys general across the political spectrum joined forces in suing Facebook parent company Meta for allegedly using features on Instagram and other platforms that hook young users, while denying or downplaying the risks to their mental health. But there hasn’t yet been significant change in the industry. Most companies haven’t been willing to overhaul their platforms to curb addictive features or harmful content for users under 18 years old, such as setting time limits on their apps or changing algorithms that steer kids into “rabbit holes” to keep them online. Nor have federal lawmakers been able to enact comprehensive product safety regulations because legislation has stalled in Congress or been blocked by courts. In the absence of policy changes, lawsuits are the next logical step in prodding technology companies to ensure their products are safe for young people or be held accountable. Some have compared the states’ legal strategy to lawsuits against Big Tobacco and opioid manufacturers that revealed how the companies lied about the harm caused by their products, and forced them to change their business practices.

TechCrunch: Why 42 states came together to sue Meta over kids’ mental health: Attorneys general from dozens of states sued Meta this week, accusing the company of deliberately designing its products to appeal to kids to the detriment of their mental health. In the lawsuit, filed in California federal court Tuesday, 33 states including California, Colorado, New York, Arizona and Illinois argue that Meta violated state and federal laws in the process of luring young users in the U.S. into spending more time on Facebook and Instagram.

The New York Times: Is Social Media Addictive? Here’s What the Science Says: A group of 41 states and the District of Columbia filed suit on Tuesday against Meta, the parent company of Facebook, Instagram, WhatsApp and Messenger, contending that the company knowingly used features on its platforms to cause children to use them compulsively, even as the company said that its social media sites were safe for young people. “Meta has harnessed powerful and unprecedented technologies to entice, engage and ultimately ensnare youth and teens,” the states said in their lawsuit filed in federal court. “Its motive is profit.” The accusations in the lawsuit raise a deeper question about behavior: Are young people becoming addicted to social media and the internet? Here’s what the research has found.

Politics PA: Henry Joins Federal Lawsuit Against Meta Over Content For Young Users: Pennsylvania Attorney General Michelle Henry joined a multi-state coalition in a federal lawsuit against Meta Platforms, Inc., claiming the company’s social media platforms, including Facebook and Instagram, violate consumer protection laws by subjecting young users to a wave of harmful, manipulative, and addictive content. The lawsuit alleges that Meta knowingly designs and deploys features harmful to children on its platforms, while at the same time, falsely assuring the public that those features are suitable for children.

AP: AI-generated child sexual abuse images could flood the internet. A watchdog is calling for action: The already-alarming proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Tuesday. In a written report, the U.K.-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.

CBS Pittsburgh: Pennsylvania joins 32 other states in lawsuit against Facebook and Instagram parent company Meta: A bipartisan coalition of 33 state attorneys general, including Pennsylvania, has announced a federal lawsuit against Meta, the parent company of Facebook and Instagram. The suit alleges that the tech giant deliberately engineered its social media platforms to be addictive to both children and teenagers in an effort to boost its profits. Pennsylvania and 32 other states have joined the lawsuit after the U.S. Surgeon General said in May that social media companies have contributed to what is described as a “youth mental health crisis.” Federal law prohibits children under the age of 13 from signing up for social media platforms, but the federal complaint alleges that Meta knew young users were active on the platform and collected data from them without parental consent.

Philadelphia Business Journal: Facebook parent sued by Pennsylvania, New Jersey and other states over allegedly harming kids: Pennsylvania, New Jersey and 30 other states on Tuesday filed suit against Meta Platforms Inc., alleging that the social media giant intentionally marketed its services to kids without their parents’ consent and while knowing that those services were causing children harm. In a complaint filed in the U.S. District Court for the Northern District of California in San Francisco, the states’ attorneys general charged the tech titan with violating the federal Children’s Online Privacy Protection Act as well as numerous state laws prohibiting corporations from deceptive acts. The states are asking that the court bar the Menlo Park-based parent company of Facebook and Instagram from future violations and award them unspecified damages.

Philly Voice: Pennsylvania, New Jersey join federal lawsuit alleging Facebook, Instagram harm children’s mental health: Pennsylvania and New Jersey are among dozens of states suing social media giant Meta over claims that its platforms, including Facebook and Instagram, are addictive and mentally damaging to children and teens. The federal lawsuit, filed Tuesday in California, claims Meta designed specific features intended to keep kids hooked on its platforms. The attorneys general of 33 states argue that Meta knew about the dangers its products posed to young people, but downplayed and concealed them in the interest of growth and competition with rival platforms like TikTok.

41 States Sue Meta, Claiming Instagram, Facebook are Addictive, Harm Kids: Forty-one states and the District of Columbia are suing Meta, alleging that the tech giant harms children by building addictive features into Instagram and Facebook. The legal actions represent one of the most significant efforts by state regulators to rein in the impact of social media on children’s mental health. Thirty-three states, including Pennsylvania, are filing a joint lawsuit in federal court in the Northern District of California, while attorneys general for D.C. and eight states are filing separate complaints in federal, state or local courts. The complaints underscore concern that major social networks risk the well-being of younger users by designing products in ways that optimize engagement over safety.

CNBC: FTC plans to hire child psychologist to guide internet rules: The Federal Trade Commission plans to hire at least one child psychologist who can guide its work on internet regulation, Democratic Commissioner Alvaro Bedoya told The Record in an interview published Monday. FTC Chair Lina Khan backs the plan, Bedoya told the outlet, adding that he hopes it can become a reality by next fall, though the commission does not yet have a firm timeline. “Our plan is to hire one or more child psychologists to help us assess the mental health impacts of what children and young people do online,” FTC spokesperson Douglas Farrar told CNBC in a statement. “We are currently exploring next steps including how many to hire and when.”

The New York Times: Face Search Engine PimEyes Blocks Searches of Children’s Faces: Concerns about children’s privacy have led PimEyes, the public face search engine, to ban searches of minors. The PimEyes chief executive, Giorgi Gobronidze, who is based in Tbilisi, Georgia, said that technical measures had been put in place to block such searches as part of a “no harm policy.” PimEyes, a subscription-based service that uses facial recognition technology to find online photos of a person, has a database of nearly three billion faces and enables about 118,000 searches per day, according to Mr. Gobronidze. The service is advertised as a way for people to search for their own face to find any unknown photos on the internet, but there are no technical measures in place to ensure that users are searching only for themselves.

CNBC: Want to raise happy, successful kids? ‘Wait as long as possible’ to give them a phone, says Yale expert: Children ages eight to 12 who have phones spend just under five hours a day glued to their phones, and teenagers rack up nearly eight hours of screen time per day, a 2019 report from nonprofit Common Sense Media found. That screen time is seldom used for creative activities like coding or making digital art. Rather, young people spend most of their phone time on social media or watching videos, Common Sense head of research Michael Robb wrote in an analysis of the report. This is likely to encourage poor mental health — in ways that affect kids differently than adults — and distractions in the classroom.

CNN: About half of children share their location on Twitch, research shows: Twitch is an online streaming service in which users can live stream their gaming, music and other creative content. Joining the interactive platform can help create feelings of community, but streamers age 13 and younger often share information that can put them at risk for exploitation with viewers all around the world, said coauthor Fiona Dubrosa, a visiting scholar at Cohen Children’s Medical Center in New York City.

Today: Social media influencer raises the alarm about kids and phones: Through her educational programming and nonprofit organization, #halfthestory, Larissa May urges teenagers to learn more about how technology affects their minds, offering concrete tips on bringing mindfulness back into the picture. At the moment, May is focusing her efforts on kids in middle school and high school, ages where she thinks she can make the most impact, and she urges parents to get involved. Rather than tell teens that phones are simply evil, May suggests that parents can demonstrate “positively using technology and having it be fun, because there’s just so much negativity around it.” She also suggests that parents “lead with vulnerability” in openly discussing their own tech temptations and habits they might want to change. Many of the parents who approach May are often chained to their phones themselves, she says.

Yahoo Lifestyle: Parents are pranking their kids on social media. Here’s why experts say it isn’t harmless: Before parents participate in a prank or any social media trend with their children, they should consider the kid’s age and maturity level to determine whether or not it is age-appropriate. Could this cause pain or physical damage? Will it induce fear, humiliation or emotional harm? If the answer is yes, parents should consider that prank harmful, not harmless. But there are safer and more respectful ways to approach these social media trends. Dr. Niky has used her own TikTok platform to share her reframing of the #EggCrackChallenge. In her video, she involves her child in the prank by explaining what she wants to do, offering to let him crack the egg on her forehead first and discussing his decision to forego participation. Her older daughter, meanwhile, playfully but nervously takes her up on the offer to crack the egg against her mom’s forehead. The video, Dr. Niky shares, illustrates how a challenge can become a fun family activity and bonding experience when everyone is in on the joke. Otherwise, parents risk becoming their “kid’s first bully.”

The Hill: Opinion: Congress can disrupt the spread of online child sexual abuse: As Congress determines a path forward for government spending in this new fiscal year, it is past time for lawmakers to take decisive action to address the crisis of online sexual exploitation of children. In 2022 alone, the National Center for Missing & Exploited Children received 32 million reports of suspected child sexual exploitation — of those, nearly 90 percent resolved to a location outside the U.S. This is a global crime, often with demand-side offenders in one country and victims in another.

The Verge: Google asks Congress to not ban teens from social media: Google responded to congressional child online safety proposals with its own counteroffer for the first time Monday, urging lawmakers to drop problematic protections like age-verification tech. In a blog post, Google released its “Legislative Framework to Protect Children and Teens Online.” The framework comes as more lawmakers, like Sen. Elizabeth Warren (D-MA), are lining up behind the Kids Online Safety Act, a controversial bill intended to protect kids from dangerous content online.

Bloomberg: Kids Suing Social Media Over Addiction Find a Win Amid Losses: Minors and parents suing Meta Platforms Inc.’s Facebook and other technology giants for the kids’ social media platform addictions won an important ruling advancing their collection of lawsuits in a California court. A state judge on Friday threw out most of the claims but said she’ll allow the lawsuits to advance based on a claim that the companies were negligent – or knew that the design of their platforms would maximize minors’ use and prove harmful. The plaintiffs argue social media is designed to be addictive, causing depression, anxiety, self-harm, eating disorders, and suicide.

The New York Times: Can You Hide a Child’s Face From A.I.?: How much parents should post about their children online has been discussed and scrutinized to such an intense degree that it has its own off-putting portmanteau: “sharenting.” Historically, the main criticism of parents who overshare online has been the invasion of their progeny’s privacy, but advances in artificial intelligence-based technologies present new ways for bad actors to misappropriate online content of children. Among the novel risks are scams featuring deepfake technology that mimic children’s voices and the possibility that a stranger could learn a child’s name and address from just a search of their photo.

The New York Times: New Laws on Kids and Social Media Are Stymied by Industry Lawsuits: Many children’s groups heralded the measure, California’s Age-Appropriate Design Code, the first of its kind in the United States. So did Gov. Gavin Newsom. “We’re taking aggressive action in California to protect the health and well-being of our kids,” he said in a statement at the time. But last month, after a lawsuit filed by a tech industry group whose members include Meta and TikTok, a federal judge in California preliminarily blocked the law, saying it “likely violates” the First Amendment.

The Baltimore Sun: Opinion: Social media platforms must step up to combat youth mental health crisis in schools: Parents and educators are working hard to create a welcoming and safe environment for kids. Now, it’s time for Big Tech to step up as well, and take some responsibility. Social media has had a disruptive and often detrimental role in the well-being and academic success of students across the nation, with the consequences growing more concerning each year. Educators and parents are bearing the weight of these disruptions at school and at home, while Big Tech platforms make billions, so together we’re demanding social media companies make significant changes to make their products safer for millions of kids. In May 2023, the U.S. Surgeon General issued a landmark report declaring a youth mental health crisis in America and pointing a finger at social media’s role in the epidemic.

The Hill: Ramaswamy backs controversial social media limits for teens: Republican presidential candidate Vivek Ramaswamy seemingly backed controversial proposals that would bar teens under 16 from using social media platforms during a Wednesday night debate. “This isn’t a Republican point or a Democrat point,” Ramaswamy said. “But if you’re 16 years old or under, you should not be using an addictive social media product, period.” The conservative entrepreneur said the idea is “something that we can both agree on,” and in doing so can “revive both the mental health of this country while stopping the fentanyl epidemic.” Concerns around children’s online safety have emerged as a rare unifying issue across party lines, but proposals such as the one Ramaswamy suggested have not been as bipartisan. Sen. Josh Hawley (R-Mo.), an outspoken critic of social media companies, in February put forward a bill that would ban children under 16 from using social media.

The Washington Post: Got an idea for protecting kids online? You can now take action: If you have concerns about kids and teens on social media, or ideas for keeping them healthy and safe, you can now submit those directly to the federal government. The Department of Commerce’s National Telecommunications and Information Administration (NTIA) sent out a request for public comment on Thursday calling for parents, educators and other interested parties to write in and share their concerns and “best practices” around internet usage by kids and teens. The call comes several months after the White House promised in an advisory to dedicate more resources and brainpower to two big questions: How exactly is internet access affecting young people, and what should the rest of us be doing about it?

PhillyVoice: Federal bill would allow sexual violence survivors to temporarily defer student loan payments: Two federal lawmakers from Pennsylvania have introduced a bill that would defer student loan payments for survivors of sexual violence who withdraw from college, allowing them to focus on their well-being. The bill, introduced on Wednesday by U.S. Sen. John Fetterman and House Rep. Madeleine Dean, would allow survivors of sexual violence to temporarily suspend their student loan payments if they withdraw from a college or university to seek treatment and focus on their mental and physical recovery. The bill would allow students who have passed the six-month student loan deferment period to extend it for up to three years.

The Morning Call: Opinion: Protecting Pa. children from abuse isn’t easy. Specially trained pediatricians are part of the solution: Hard conversations are happening about how best to keep children safe (and alive) while guarding against over-surveilling or inappropriately intervening in families. Front-and-center are concerns about how Black and brown families are disproportionately reported to child welfare. In recent years, diverse stakeholders have engaged in interdisciplinary forums exploring reforms related to: mandatory reporting of suspected child abuse or neglect; the quality of child abuse investigations, including the role of specially trained medical professionals; and the unintended consequences of Pennsylvania’s child abuse registry. Recently, the Lehigh County controller focused on a rarely reported and substantiated type of abuse — Munchausen syndrome by proxy (or medical child abuse). His report was frustrating in its disconnection from these forums.

iGaming Business: Pennsylvania fines operator for underage VGT gambling: The underage gambling breach was identified at a qualified truck stop in the Smithton area of Pennsylvania. Pilot Travel Centers was also flagged for not having a board-credentialed employee on duty. A financial penalty of $45,000 was agreed following negotiations between the PGCB’s Office of Enforcement Counsel and Pilot Travel Centers. Four Pennsylvania adults banned for leaving children unattended: In other news, four adults have been placed on the Pennsylvania Involuntary Exclusion List for leaving children unattended while gambling. A female player left three children – aged 10, 14 and 15 – in a running vehicle in the parking garage of Hollywood Casino at Penn National Race Course. The individual gambled inside the Pennsylvania venue for two hours and two minutes while the children were unattended.

NEXSTAR: Ohio police suggested charging an 11-year-old for her explicit photos. Experts say the practice is common: When an Ohio father learned that his 11-year-old daughter had been manipulated into sending explicit photos to an adult, he turned to the police for help. But instead of treating the girl as a crime victim, an officer seemingly threatened to charge her under a law most people view as designed to protect child victims. The shocking interaction was recorded last week on body camera audio and by the father’s doorbell camera in Columbus, Ohio. The footage drew criticism from the public and from experts who said law enforcement officials have long misused laws meant to protect children by threatening to charge them with being part of the same crime.

The Morning Call: After Lehigh County report, what you should know about the John Van Brakle Child Advocacy Center, the CAC movement and child abuse pediatricians: For the last 20 years, the Lehigh Valley has had a child advocacy center in one way or another, operating in plain sight but garnering little attention from those on the outside of child abuse investigations. But now, the John Van Brakle Child Advocacy Center at Lehigh Valley Health Network’s Reilly Children’s Hospital is in the spotlight, following a critical report by Lehigh County Controller Mark Pinsley and protests by parents and the Parents Medical Rights Group, a Lehigh Valley organization that seeks more parental input in medical decisions. The Van Brakle Center and its former director, Dr. Debra Esernio-Jenssen, who was recently replaced, are accused by some parents of misdiagnosing their children with abuse, causing traumatic investigations and family separations at the hands of Children and Youth Services, only for the investigation to be dropped later.

CBS 21: State leaders respond to challenges in PA’s child welfare system: In the months since three Adams County Children and Youth Services employees were arrested and charged with endangering the welfare of a child, CBS21 has been asking state leaders how they are working to address challenges in the Commonwealth’s child welfare system. Advocates, lawmakers and those who work in the child welfare system said some challenges include staffing, caseloads and funding. Governor Josh Shapiro’s office said he has a proven track record of working to protect children and ensure their safety. His office provided this statement in part: “Governor Shapiro further supports the work of the Office of Child Advocate as yet another tool to help keep Pennsylvania’s children safe and ensure this essential function of county government can continue to meet its obligations to children and families.”

People: Jodi Hildebrandt Has Counseling License Frozen Amid Child Abuse Charges with Ruby Franke: Jodi Hildebrandt, the embattled sex therapist who faces six felony child abuse charges in Utah, has reportedly agreed to have her counseling license frozen amid allegations of child abuse against her and her business partner, Ruby Franke. “Given the heinous abuse allegations, the agency felt that the surrender of the license was the best course of action to protect the safety of Hildebrandt’s patients and clients,” Margaret Busse, the executive director of the Utah Department of Commerce, said in a statement to the media on Tuesday. The agreement, obtained by PEOPLE, provides that Hildebrandt’s license will be frozen amid the criminal charges against her and that she will have a hearing before the state’s Clinical Mental Health Counselor Licensing Board, which will determine the future of her ability to practice in the state. Hildebrandt has been a licensed clinical mental health counselor in Utah since 2005.

The Hill: Opinion: We must create an independent expert agency for AI and ‘Big Tech’: Last week, I met with child psychologists to discuss social media’s profound effects on Colorado’s kids. They shared their clinical assessment of the addiction and trauma our kids are experiencing — and the accompanying sleepless nights, searing anxiety, endless bullying and deepening despair.  Almost all of these clinicians were parents as well. And our conversation shifted from their patients to their kids and how social media has deprived their own sons and daughters of their chance at a healthy childhood. They told me about school nights devolving into screaming matches about screen time, the deafening silence during carpool as kids ride hypnotized by an endless feed in the back seat, and the meals skipped by impressionable teens in hopes of achieving the “perfect” bodies these platforms parade to them. 

Pittsburgh Post-Gazette: Opinion: Angela Liddle: How parents can help their children resist social media: Technology will always make sure that parenting will never be easy. More than half of parents surveyed in a recent Pew Research study said that social media makes parenting harder than it was 20 years ago. The reasons are well-known: Almost three-fourths of kids see sexual or violent content while doing their homework. One-fifth of kids from 10 to 17 have been approached or sexually solicited while online. Almost half of children in fourth to eighth grade have spoken with a stranger online. This has concerned lawmakers across the country enough to introduce legislation to ban platforms or restrict teens and adolescents from registering social media accounts. While I applaud lawmakers for wanting to protect kids online, legislation alone is not going to protect our kids. It’s up to us as parents and guardians to help our kids foster positive digital behaviors.

Slate: Sen. Richard Blumenthal Defends His Controversial Bill Regulating Social Media for Kids: For a while now, Washington has been wrestling with two big forces shaping technology: social media and artificial intelligence. Should they be regulated? Who should do it—and how? Currently, Congress is considering a bill that would regulate how social media companies treat minors: the Kids Online Safety Act. Although it has bipartisan support, KOSA is not without controversy. Several critics have called it “government censorship.” One group, the Electronic Frontier Foundation, says it is “one of the most dangerous bills in years.” One of KOSA’s sponsors is Connecticut Democratic Sen. Richard Blumenthal. On Friday’s episode of What Next: TBD, I spoke with Blumenthal about tech, kids, and what role the government should play when it comes to regulating Silicon Valley. Our conversation has been edited and condensed for clarity.

The Washington Post: Judge blocks California law meant to increase online safety for kids: A federal judge on Monday temporarily blocked an online child protection law in California and said it probably violates the Constitution. Under the law, known as the California Age-Appropriate Design Code, digital platforms would have to vet their products before public release to see whether those offerings could harm kids and teens. The law also requires platforms to enable stronger data privacy protections by default for younger users. U.S. District Court Judge Beth Labson Freeman granted a request Monday by the tech trade group NetChoice for a preliminary injunction against the measure, writing that the law probably violates the First Amendment and does “not pass constitutional muster.”

The Hill: Ashton Kutcher steps down from anti-child sex abuse group after Danny Masterson pushback: Ashton Kutcher, co-founder of Thorn – a technology company protecting children from sexual abuse – announced Friday he would resign from the organization’s board amid backlash he received for supporting former co-star and convicted rapist Danny Masterson. “This decision is rooted in the recognition of recent events and ensuring Thorn remains focused on its mission: to build technology to defend children from sexual abuse,” the company, founded by Kutcher and Demi Moore, said in a statement. On Sept. 7, a Los Angeles judge sentenced Masterson to 30 years to life in prison for raping two women. Kutcher and his wife, actress Mila Kunis, who both co-starred with Masterson on “That ’70s Show,” wrote character letters to the judge prior to Masterson’s sentencing.

The Hill: Opinion: Congress, it’s time to put kids before Big Tech profits. Pass KOSA: Our kids are experiencing a national epidemic of depression, anxiety, and loneliness. Rates of suicide have skyrocketed, feelings of hopelessness have reached critical levels, and across the country, parents and young people are demanding solutions to this national crisis. Behind this mental health emergency is social media — its ubiquity, its pervasive data collection, and its addictive design. Nearly 20 years after we first started posting on Facebook walls, Americans are finally turning their attention to the impact social media is having on an entire generation. These companies have been running a national experiment on our kids and the results have been catastrophic. According to a national poll commissioned by Issue One and our Council for Responsible Social Media, only 7 percent of Americans see social media’s impact on children as more positive than negative. That’s an overwhelming rebuke of Big Tech and the repercussions their platforms are having on children.

Tech Policy Press: Fight Over State Child Online Safety Laws May Last Years: After a wave of legislation focused on child online safety swept through state legislatures over the past two years, legal challenges against the new laws are gaining traction in federal courts. But rather than signaling a change in the tide, the lawsuits may ultimately spur a new round of bills that address flaws in those passed in the first wave. Putting aside the merits of the various approaches to child online safety that animate recent legislation and whether they may be effective, it is clear that the overarching issue is one that will survive well into the future. A recent national poll on children’s health found that use of devices and social media are at the top of parent concerns. What follows is a summary of the legal and political debate involving child online safety laws and where it might go in the future.     

Mashable: New S.O.S. initiative online rating system targets teen safety: Imagine letting a child or teen see a movie without any guidance about the film’s appropriateness for their age. You might settle into an animated feature that surprises you and your 8-year-old with nonstop profanity. Or discover that the action flick your 13-year-old watched depicted graphic sex. Parents typically like to avoid exposing their kids to inappropriate content and count on movie and TV ratings, however imperfect, to help them do exactly that. But as mental health advocate and fashion designer Kenneth Cole argues, parents have no such resource or guideline when it comes to the internet, which is where their kids and teens spend a significant amount of their time.

Them: Over 100 Parents of Trans Kids Sign Letter Opposing a Controversial Internet Safety Bill: Over 100 parents of trans and gender-expansive children have written an open letter opposing the Kids Online Safety Act (KOSA), legislation that advocates say could widely censor trans and other marginalized communities on the internet. KOSA was initially introduced in the Senate in 2022 by Democratic Senator Richard Blumenthal and Republican Senator Marsha Blackburn. The bill would burden online platforms with the legal responsibility to proactively remove content that causes anxiety, depression, eating disorders, bullying, violence, and more. Since its introduction in 2022, digital rights groups such as Fight for the Future have been warning that online platforms could face “substantial pressure to over-moderate” as a result of the bill, resulting in widespread censorship.

Los Angeles Times: California lawmakers pass measure to combat child sexual abuse material on social media: California lawmakers on Wednesday passed a bill aimed at combating child sexual abuse material on social media platforms such as Facebook, Snapchat and TikTok. The legislation, Assembly Bill 1394, would hold social media companies liable for failing to remove the content, which includes child pornography and other obscene material depicting children. “The goal of the bill is to end the practice of social media being a superhighway for child sexual abuse materials,” Assemblywoman Buffy Wicks (D-Oakland), who authored the legislation, said in an interview. The bill unanimously cleared the Senate on Tuesday. The Assembly unanimously approved an amended version of the bill on Wednesday and it’s now headed to the governor’s desk for consideration.

Washington Blade: EXCLUSIVE: Sen. Blumenthal defends Kids Online Safety Act: Responding to criticism from some in the LGBTQ community about the Kids Online Safety Act, U.S. Sen. Richard Blumenthal (D-Conn.) defended the legislation and reiterated his strong support for queer youth. “I would never put my name on any bill that targets or disparages or harms the trans or LGBTQ community,” Blumenthal told the Washington Blade on Friday. “There have been a lot of eyes” on the Kids Online Safety Act, he said. “A lot of very smart and careful people have reviewed its language, and they and I have worked to make it as rigorous and tight as possible.” The proposed legislation, introduced by Blumenthal and Republican U.S. Sen. Marsha Blackburn (Tenn.), would address harms experienced by children and their families at the hands of dominant social media and tech platform companies. It enjoys broad bipartisan support in the Senate. 

Lake Okeechobee News: Data reveals TikTok to be the platform parents most worry about: A new study reveals TikTok to be the most worrying social media platform for parents, with an estimated 5,100 online searches asking if TikTok is safe. Canopy.us, an AI digital family safety app, has collated Google search volume data relating to the safety of various social media platforms. The search volumes for each platform were collected for US and global searches, then ordered to reveal the most worrying social media site. The family safety app has also explored what dangers threaten kids using TikTok and the safety precautions parents can take to protect them.

People: ‘Monster’: Powerful PSA from ChildFund Urges Big Tech to Step up Efforts Against Child Sex Predators: A new campaign has set its sights on Big Tech, challenging legislators to enact laws that require internet companies to more actively target and take down child sexual abuse material, which activists say is increasingly rampant on social media. ChildFund International, an organization that focuses on child development and protection, is spearheading the #TAKEITDOWN campaign, which aims to “build public support to pressure tech companies to proactively remove child sexual abuse content from their platforms,” according to a press release announcing the launch. The centerpiece of the campaign is a video PSA titled Monster, which portrays a seemingly innocuous man, who goes to work and interacts with others in person. But when he’s alone, browsing the internet, his face is suddenly covered by a monster mask, implying that he is engaging in predatory online behavior. 

Gizmodo: ‘Our Kids’ Lives Are at Stake:’ 100 Parents of Trans Kids Beg Lawmakers to Kill Kids Online Safety Act: Parents of more than 100 trans and gender-expansive children are urging lawmakers to turn their back on the “dangerous and misguided” Kids Online Safety Act (KOSA) currently winding its way through Congress. In a fiery open letter shared with Gizmodo, the parents said KOSA, which is intended to shield kids from the harms of social media, would actually make their kids less safe and cut them off from potentially lifesaving resources and communities. “Big Tech is hurting our kids,” they added. “KOSA would hurt them even more.” Lawmakers from both sides of the aisle and President Biden himself have rallied around KOSA in recent months as a potential saving grace in response to a steady stream of reports showing various ways Big Tech platforms can harm young users and contribute to a worrying rise in depression and anxiety.  

The New York Times: Demonizing Social Media Isn’t the Answer to Online Safety, a New Book Argues: Panic over social media has reached a fever pitch. Diagnoses of mental illness among adolescents have been on the rise, and in May the U.S. surgeon general warned of “ample indicators” that social media may in part be to blame. In June, a psychologist called for a nationwide ban of cellphones in schools. By next March, kids under 18 in Utah will be allowed to use TikTok and Instagram only if they have explicit parental permission. But perhaps banning social media — or heavily monitoring kids who use it, which is another common parental response — isn’t the most constructive solution to the problem. Perhaps, instead, we should focus more on helping kids learn how to safely navigate social media and manage online privacy and decision-making. 

The New York Times: Appeals Court Rules White House Overstepped 1st Amendment on Social Media: A federal appeals court ruled on Friday that the Biden administration most likely overstepped the First Amendment by urging the major social media platforms to remove misleading or false content about the Covid-19 pandemic, partly upholding a lower court’s preliminary injunction in a victory for conservatives. The ruling, by a three-judge panel of the U.S. Court of Appeals for the Fifth Circuit in New Orleans, was another twist in a First Amendment case that has challenged the government’s ability to combat false and misleading narratives about the pandemic, voting rights and other issues that spread on social media.  

Los Angeles Times: Elon Musk’s X, formerly Twitter, sues California over social media law: X, the company formerly known as Twitter, is suing California over a state law passed last year that lawmakers say aims to make social media platforms more transparent. The law, Assembly Bill 587, requires social media companies to disclose their policies, including what content users are allowed to post on their platforms and how the companies respond when users violate the platform’s rules. The companies are required to submit this information to the California attorney general by January 2024. The attorney general’s office would then make those reports public online. In the lawsuit, filed in a federal court in Sacramento on Friday, X alleges the law violates the 1st Amendment’s free speech protections and would pressure social media companies to moderate “constitutionally-protected” speech the state finds “undesirable or harmful.”

Forbes: Inside Apple’s Impossible War On Child Exploitation: Joe Mollick had spent much of his life dreaming of becoming a rich and famous oncologist, a luminary of cancer treatment, the man who would cure the disease for good. It was a quixotic quest for the 60-year-old, one that left him defeated and hopelessly isolated. He turned to pornography to ease his feelings of loneliness. When those feelings became more severe, so did his taste for porn; he began seeking out child sexual abuse material (CSAM). When the cops first caught Mollick uploading CSAM on a messaging application called Kik in 2019, they searched his electronics and discovered a stash of 2,000 illegal images and videos of children and some 800 files of what a search warrant reviewed by Forbes described as “child erotica.” 

Mashable: Snapchat announces new updates to foster teen safety and age-appropriate content: Snapchat is releasing new app safeguards to protect teen users (aged 13-17) from unknown users and age-inappropriate content — a responsive step from a social media platform that’s been under fire for allegedly exposing its younger users to explicit content, despite its 13+ age rating. Developed in collaboration with the National Center on Sexual Exploitation (NCOSE) and the National Center for Missing and Exploited Children (NCMEC), the new features include protections against unwanted contact. The safeguards build on an existing feature that prevents teen users from messaging accounts not on their friends list, adding an alert when a teen tries to add an unknown account that doesn’t share any mutual friends.

TIME: 5 Steps Parents Should Take to Help Kids Use AI Safely: Just as older generations have had to navigate the internet and social media, our children will have to learn how to interact with AI. We cannot escape this new era in the technological revolution; children as young as infants often come into contact with AI toys and chatbots like the smart toy ROYBI Robot, AI teddy bears from VTech, Moxie Robot, Siri, and Alexa. But we can’t just wait for the government to impose regulations and protect us (even though that is crucial for our sustained future). We should start in our homes, making sure our children are set up for success in a world increasingly shaped by tools like ChatGPT and Midjourney. This requires ongoing conversations between parents and children about the educational benefits of AI, the potential dangers of fully relying on this technology, how technology affects us emotionally and behaviorally, and how the humans behind the algorithms also impact what information AI gives us.

Tech Policy Press: Study Investigates Differences in Parent Approaches to Children’s Online Activities: A wave of child online safety legislation is sweeping the United States. Some laws, such as the California Age Appropriate Design Code Bill, address design and privacy concerns, putting the onus on the platforms to make their products safe for children. Laws in other states, often led by Republicans, put more emphasis on the role of parents. One example of the latter is Utah’s SB0152, known as the Utah Social Media Regulation Act. The law, which passed in March, requires tech firms to verify the age of users and to get parental consent for a child to have a social media account, and it puts other restrictions on accounts held by minors, such as prohibitions on direct messaging, advertising, and the collection of personal data. The law enacts a social media curfew between 10:30 p.m. and 6:30 a.m., unless that restriction is adjusted by a parent or guardian. And, perhaps most controversially, the law gives parents and guardians the right to access a minor’s account, including direct messages.
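
The curfew provision described above amounts to a simple time-window rule, though one that spans midnight. The sketch below is purely illustrative of how such a default window could be checked in software; the function and parameter names are assumptions, not language from the statute or any platform’s actual implementation.

```python
from datetime import time

# Default statutory window described in the article: 10:30 p.m. to 6:30 a.m.
CURFEW_START = time(22, 30)
CURFEW_END = time(6, 30)

def in_default_curfew(local_time: time,
                      start: time = CURFEW_START,
                      end: time = CURFEW_END) -> bool:
    """Return True if a minor's local time falls inside the curfew window.

    Because the window crosses midnight, it is the union of two ranges:
    [start, midnight) and [midnight, end).
    """
    return local_time >= start or local_time < end

# Example: 11:15 p.m. is inside the window; 7:00 a.m. is not.
print(in_default_curfew(time(23, 15)))  # True
print(in_default_curfew(time(7, 0)))    # False
```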

CBS News: YouTuber Ruby Franke and her business partner each charged with 6 counts of aggravated child abuse: Ruby Franke, a once-popular YouTuber who gave parenting advice on her now-defunct “8 Passengers” YouTube channel, has been charged with six counts of aggravated child abuse, the Washington County Attorney’s Office in Utah said Wednesday. Franke and her business partner, Jodi Hildebrandt, were arrested last week after Franke’s malnourished son ran to a neighbor’s house asking for help, authorities said. The attorney’s office says Franke and Hildebrandt are accused of a combination of multiple physical injuries or torture; starvation or malnutrition that jeopardizes life; and causing severe emotional harm to two children. They each face six counts, each carrying a potential sentence of one to 15 years in prison and a fine of up to $10,000. The investigation is ongoing, the office said.

Associated Press: A federal judge strikes down a Texas law requiring age verification to view pornographic websites: A federal judge has struck down a Texas law requiring age verification and health warnings to view pornographic websites and blocked the state attorney general’s office from enforcing it. In a ruling Thursday, U.S. District Judge David Ezra agreed with claims that House Bill 1181, which was signed into law by Texas Gov. Greg Abbott in June, violates free speech rights and is overbroad and vague. The state attorney general’s office, which is defending the law, immediately filed notice of appeal to the Fifth Circuit U.S. Court of Appeals in New Orleans. The lawsuit was filed Aug. 4 by the Free Speech Coalition, a trade association for the adult entertainment industry, and a person identified as Jane Doe, described as an adult entertainer on various adult sites, including Pornhub.

Associated Press: Prosecutors in all 50 states urge Congress to strengthen tools to fight AI child sexual abuse images: The top prosecutors in all 50 states are urging Congress to study how artificial intelligence can be used to exploit children through pornography, and come up with legislation to further guard against it. In a letter sent Tuesday to Republican and Democratic leaders of the House and Senate, the attorneys general from across the country call on federal lawmakers to “establish an expert commission to study the means and methods of AI that can be used to exploit children specifically” and expand existing restrictions on child sexual abuse materials specifically to cover AI-generated images. “We are engaged in a race against time to protect the children of our country from the dangers of AI,” the prosecutors wrote in the letter, shared ahead of time with The Associated Press. “Indeed, the proverbial walls of the city have already been breached. Now is the time to act.” 

CNN: You don’t need to surveil your kids to protect them on social media: Parents and other caregivers hear that social media wreaks havoc on a teen’s self-esteem. But kids often tell us that it helps them find like-minded friends and boosts their emotional well-being. So, which is it? I’m a school counselor, and I often see it’s a mix of the above. And I promise you that adults can help kids make smart choices online that keep them safe and preserve their self-esteem. Easier said than done, right? While it may seem counterintuitive, surveilling kids’ movements, tracking their grades and scouring their online conversations can do more harm than good, argues Devorah Heitner, author of the new book “Growing Up in Public: Coming of Age in a Digital World.” CNN spoke to Heitner, whose book provides a pragmatic and empathetic road map to raising kids in today’s volatile and hyper-connected world. She helps caregivers learn to mentor rather than monitor their children. 

Los Angeles Times: California lawmakers kill bill aimed at making social media safer for young people: California lawmakers on Friday killed a bill that would hold social media platforms liable for promoting harmful content about eating disorders, self-harm and drugs. Senate Bill 680, which was opposed by tech companies, died in the powerful Assembly Appropriations Committee as part of a marathon hearing where lawmakers culled hundreds of bills without public debate. “There is little doubt that social media platforms employ algorithms and design features that experts across the nation agree are contributing to harming our children,” Sen. Nancy Skinner (D-Berkeley), who wrote SB 680, said in a statement. “These companies have the power to adjust their platforms to limit this harm, yet to date we’ve seen them take no meaningful action.” 

The Washington Post: Long sidelined, youth activists demand a say in online safety debate: When lawmakers began investigating the impact of social media on kids in 2021, Zamaan Qureshi was enthralled. Since middle school he’d watched his friends struggle with eating disorders, anxiety and depression, issues he said were “exacerbated” by platforms like Snapchat and Instagram. Qureshi’s longtime concerns were thrust into the national spotlight when Meta whistleblower Frances Haugen released documents linking Instagram to teen mental health problems. But as the revelations triggered a wave of bills to expand guardrails for children online, he grew frustrated at who appeared missing from the debate: young people, like himself, who’d experienced the technology from an early age. “There was little to no conversation about young people and … what they thought should be done,” said Qureshi, 21, a rising senior at American University. 

Associated Press: Judge blocks Arkansas law requiring parental approval for minors to create social media accounts: A federal judge on Thursday temporarily blocked Arkansas from enforcing a new law that would have required parental consent for minors to create new social media accounts, preventing the state from becoming the first to impose such a restriction. U.S. District Judge Timothy L. Brooks granted a preliminary injunction that NetChoice — a tech industry trade group whose members include TikTok, Facebook parent Meta, and X, formerly known as Twitter — had requested against the law. The measure, which Republican Gov. Sarah Huckabee Sanders signed into law in April, was set to take effect Friday. Arkansas’ law is similar to a first-in-the-nation restriction signed into law earlier this year in Utah. That law is not set to take effect until March 2024. 

The Washington Post: What to know about Ruby Franke, parenting YouTuber charged with child abuse: Ruby Franke, a Utah mother of six who ran the well-known parenting YouTube channel 8 Passengers, has been arrested on charges of child abuse along with her business partner Jodi Hildebrandt, the Santa Clara-Ivins Public Safety Department said in a news release. The arrests came after Franke’s 12-year-old son climbed out of a window at Hildebrandt’s home in Ivins, Utah, and appeared, emaciated and with open wounds, at a neighbor’s, where he asked for food and water, according to an affidavit reported by the Associated Press. After searching Hildebrandt’s home, police found Franke’s 10-year-old daughter in a similarly malnourished state. Hildebrandt’s counseling firm ConneXions, where Franke works, did not immediately return a request for comment late Thursday. It was not clear if either of the women had retained an attorney. Franke’s husband, Kevin, was not named, and it is not clear whether the couple is still together.

NBC Boston: Protecting your child’s personal information: With the start of a new school year, you may be filling out a lot of forms as you enroll your kids in extracurricular activities. But it’s a good idea to limit the information you share, because if it falls into the wrong hands, a criminal could ruin your child’s credit. Question whether it is absolutely necessary to provide all the personal information that you’re being asked for. “As the Better Business Bureau, we just say to use caution,” said Paula Fleming, chief marketing and sales officer for the BBB. “Whatever information you’re sharing, whether it be online or in person, the more you put out there, unfortunately, the more likely it is for something to happen.” Keep in mind that even school systems can be hacked. In 2019, there were 348 data breaches in educational institutions, and the personal information of more than 2.3 million students was exposed to scammers. 

CNN: ‘All we want is revenge’: How social media fuels gun violence among teens: Juan Campos has been working to save at-risk teens from gun violence for 16 years. As a street outreach worker in Oakland, California, he has seen the pull and power of gangs. And he offers teens support when they’ve emerged from the juvenile justice system, advocates for them in school, and, if needed, helps them find housing, mental health services, and treatment for substance abuse. But, he said, he’s never confronted a force as formidable as social media, where small boasts and disputes online can escalate into deadly violence in schoolyards and on street corners.

The Verge: Child safety bills are reshaping the internet for everyone: By the end of this month, porn will get a lot harder to watch in Texas. Instead of clicking a button or entering a date of birth to access Pornhub and other adult sites, users will need to provide photos of their official government-issued ID or use a third-party service to verify their age. It’s the result of a new law passed earlier this summer intended to prevent kids from seeing porn online. But it’s also part of a broad — and worrying — attempt to age-gate the internet.

Pittsburgh Post-Gazette: Opinion: The state government can protect children from pornography: State government has a clear role in protecting the health, safety, and welfare of its residents. That is especially true for children, whose impressionability and development require active work in protecting them from the worst of society so they can develop appropriately. As technology has infiltrated every part of our lives and has become ubiquitously used at the earliest stages of human development, the dark side of that technological access has also started to have a deleterious impact on children and their appropriate development. One of my priorities in the Pennsylvania General Assembly has been to protect children from sexual abuse and exploitation. While my personal story as a childhood rape survivor has helped drive this legislative passion, it is important that children have adults in positions of public trust speak up for them.

The Daily Review: Bradford County enters nationwide lawsuit against social media companies: Bradford County is in the early stages of entering a nationwide lawsuit against social media companies over their alleged negative effects on mental health. During their Thursday meeting, the Bradford County Commissioners approved “an agreement with Marc J Bern & Partners LLP to represent Bradford County regarding the recovery of all cost incurred by the county associated with the social media crisis.” Commissioner Daryl Miller explained that the lawsuit alleges social media companies have facilitated the bullying of children across their platforms. He stated that the lawsuit doesn’t take issue with the platforms’ free speech, but instead with the algorithms used to target people.

CT Examiner: Advocates, Lawmakers Plan Harder Look at Youth and Social Media: Seventeen-year-old Coventry High School student Dylan Nodwell knows first-hand the downsides of social media and how cyberbullying has caused anxiety and, in many cases, deep depression among today’s teens. Nodwell, who will be a high school senior this fall, estimates that 90 percent of his peers use social media – primarily TikTok, Instagram and Snapchat – and that time on those sites often leads to bullying that goes unchecked. “I wished I’d grown up without it,” Nodwell told CT Examiner Friday. “People are behind a screen and they find it easy to bully others. You can leave mean comments and there are not always repercussions for that. I’ve seen friends get sad, anxious and depressed because of it. It kind of normalizes that sort of negative attitude toward others.”

Fox 56 Wolf: Ensuring online safety for kids: Expert urges parents to monitor apps as children return to school: As parents send their children back to school, one important step could be to check all of their online platforms. The app “NGL,” also known as Not Gonna Lie, launched last year. It’s a platform with anonymous users and acts as a personal inbox to receive messages. “There are some great apps out there, great programs that you can put on those phones; they’re designed specifically to keep your children safe. Remember, those phones are yours, parents, they’re not your child’s; they don’t have any expectation of privacy on your device. You can do a lot of things to place filters on those phones and allow some type of control over how they use that. There are programs like Bart, family 360, things you can do to minimize the probability that somebody is going to contact them or they are going to contact somebody they shouldn’t be talking to,” said social media intelligence expert Dr. Steve Webb.

Vox: YouTube can’t fix its kid safety problem: Google might be facing significant fines for violating children’s privacy through YouTube ads — again. Two recent reports suggest that the company is collecting data from and targeting ads to children, a violation of both the Children’s Online Privacy Protection Act (COPPA) and Google’s consent decree with the Federal Trade Commission. They also come as Google, which owns YouTube, prepares to defend itself in a major antitrust lawsuit over its search engine and faces scrutiny from Democrats and Republicans alike, and as Congress considers child online safety bills. Simply put, this is not the best time for Google to face more accusations of wrongdoing, especially when the alleged victims are children.

NBC Miami: National teacher’s union promotes literacy, mental health initiative in Miami-Dade: With so much political controversy surrounding education in Florida, when the president of the most prominent national teacher’s union comes to town from Washington, it’s natural to expect her focus would be on the mandates from Tallahassee. Not this time. American Federation of Teachers president Randi Weingarten came to Miami to talk about proposals to boost learning in the classroom and to support teachers and families. “Today has nothing to do about politics, it has everything to do with lifting up the Miami-Dade schools, which around the country are known for the kind of public school choice they have done for kids and our communities,” Weingarten said. She and United Teachers of Dade president Karla Hernandez-Mats toured Miami Jackson Senior High and some other schools. 

The New York Times: YouTube Improperly Used Targeted Ads on Children’s Videos, Watchdogs Say: After a research report last week found that YouTube’s advertising practices had the potential to undercut the privacy of children watching children’s videos, the company said it limited the collection of viewer data and did not serve targeted ads on such videos. These types of personalized ads, which use data to tailor marketing to users’ online activities and interests, can be effective for finding the right consumers. Under a federal privacy law, however, children’s online services must obtain parental consent before collecting personal information from users under 13 to target them with ads — a commitment YouTube extended to anyone watching a children’s video.

WIRED: How to Talk to Your Kids About Social Media and Mental Health: If you give a kid a smartphone, they’re going to want a social media account. That’s not the start of a storybook. The average age for a kid getting their first smartphone is 10.3. Within a year, a child has likely made four or five social media accounts; by the age of 12, 90 percent of kids are already on social media, according to research by Linda Charmaraman, a senior research scientist who runs the Youth Media and Well-Being Research Lab at Wellesley College.

DC News Now: Changes made following viral post about social media app’s safety features: With the back-to-school season around the corner, some parents are focused on making sure their children have safe online tools to succeed. A new app, Saturn, helps students to easily track their class schedules and to connect with other students. Although the app may sound helpful, some parents are worried about its privacy and safety. Chris Cullum said his daughter asked to sign up for the app, which is why he signed up around Aug. 9 to see how the app works. But after using it, he said he had some concerns about the app’s safety features. “I was able to make a profile using just a number,” Cullum said. He later made a social media post highlighting his concerns, which went viral in Arkansas and parts of the DMV.

The Washington Post: YouTube faces fresh complaint over its children’s privacy practices: Children’s privacy advocates are urging federal regulators to consider issuing a massive fine “upwards of tens of billions of dollars” and imposing sweeping privacy limits on Google-owned YouTube over reports that it may have let companies track kids’ data across the internet. Ad tracking firm Adalytics last week released a report suggesting that YouTube served ads for adults on videos labeled as “made for kids,” stoking concern that the video-sharing giant may be trampling on federal privacy protections for children, as the New York Times first reported. In response, Sens. Edward J. Markey (D-Mass.) and Marsha Blackburn (R-Tenn.) called on the Federal Trade Commission to investigate the matter, writing that the purported tactics may have “impacted hundreds of thousands, to potentially millions, of children across the United States.” 

Chicago Sun-Times: Opinion: Protect children from dangers of the internet with Kids Online Safety Act: In July, the Kids Online Safety Act drew one step closer to becoming law when a Senate committee advanced the bill. However, despite bipartisan support and the fact that most people in the U.S. agree that kids need internet safeguards, the bill faces hurdles. Some are trying to politicize the bill by claiming that it will be used by anti-LGBTQ+ groups to censor content under the guise of preventing depression, anxiety and eating disorders in children. Others claim it is a threat to free speech. But since the bill made it to the floor of Congress last year and was dropped because of criticism, some language was changed and LGBTQ+ advocacy groups that initially opposed it, like the Gay & Lesbian Alliance Against Defamation and the Human Rights Campaign, wisely dropped their opposition. 

York Daily Record: My bill would protect PA child social media influencers from exploitive parents: Pennsylvania’s Child Labor Law exists to protect children, their labor, and their earnings from being exploited and I believe that the protections of our Child Labor Law should apply to child influencers on social media. I am introducing legislation to update our Child Labor Laws to do just that. Americans are well accustomed to childhood celebrities, and we are equally aware of the many stories of children whose families have become broken and their futures made difficult because the people who should have been looking out for them were exploiting them.

KDKA: Pa. lawmaker introduces new bill to regulate social media influencers under child labor laws: A Pennsylvania lawmaker is planning to introduce a bill to regulate social media child influencers and celebrities under Pennsylvania’s child labor laws. Representative Torren Ecker says it would protect kids who earn money by creating content or whose names or photographs in a parent’s content generate income. He says child influencers make more than $50 million.

Philly Voice: Pa. lawmaker proposes protecting young social media influencers under state’s child labor laws: Pennsylvania could soon regulate money earned by child influencers and celebrities from their or their parents’ social media content under a bill set to be introduced this fall. The bill aims to protect children whose photographs, likenesses or names are used to make money through social media under Pennsylvania’s existing child labor laws. It will be introduced by Rep. Torren Ecker, a Republican serving portions of Adams and Cumberland counties, when the state House reconvenes in September.

WGAL: Pennsylvania bill aims to protect children creating social media content from being exploited: Content creators and social media stars can make millions of dollars, though Pennsylvania lawmakers are drafting legislation to ensure that children who are the focus of those videos aren’t cut out of compensation or exploited. A sponsorship memo for legislation being drafted by Rep. Torren Ecker said his bill “will protect children who earn money as influencers and content-makers, or whose likeness, name or photograph is substantially featured in a parent or guardian’s content that generates income.”

Fox43: Pa. lawmaker calls for regulation on child influencers on social media: The youngest stars of social media could get additional protection and pay under a proposed change to Pennsylvania’s child labor law. Republican representative Torren Ecker, who represents Cumberland County, proposed a bill that would regulate child influencers and celebrities on social media. “We have lots of child labor laws to protect children from working long hours in factories,” said Representative Ecker. “I think this is modernizing for another way children can be exploited.”

North Central PA: Legislators consider bill to regulate child social media influencers: Pennsylvania legislators are discussing a bill that would put regulations on child influencers and celebrities on social media. State Representative Torren Ecker (R – Adams/Cumberland) will soon introduce a bill to the state legislature, his office announced in August. “We always hear about the devastating later-life impact that childhood celebrity and wealth can have on those who experience fame early in life. Now, every parent or relative with a cellphone can work to make their children or relatives into social media celebrities that, without their consent, can deprive children of privacy, income from their work, and fair working conditions within the scope of current law,” Rep. Ecker said.

The Conversation: The Youth Mental Health Crisis Worsens amid a Shortage of Professional Help Providers: The hospital where I practice recently admitted a 14-year-old girl with post-traumatic stress disorder, or PTSD, to our outpatient program. She was referred to us six months earlier, in October 2022, but at the time we were at capacity. Although we tried to refer her to several other hospitals, they too were full. During that six-month wait, she attempted suicide. Unfortunately, this is an all-too-common story for young people with mental health issues. A 2021 survey of 88 children’s hospitals reported that they admit, on average, four teens per day to inpatient programs. At many of these hospitals, more children await help, but there are simply not enough services or psychiatric beds for them. 

Franklin County Press: Protecting Young Minds: Navigating the Social Media Landscape for Children’s Safety: In an era when screen time is often intertwined with social interaction, parents face the daunting task of ensuring their children’s safety on social media. Social media platforms offer myriad opportunities for connection, entertainment, and education, but they also present a minefield of potential risks for young users. The dangers are manifold and ever-present, from cyberbullying to exposure to inappropriate content and even the risk of contact with strangers. A recent survey from the Pew Research Center found that 95% of teens have access to a smartphone, and a whopping 45% claim to be online almost constantly. With this increased online presence, there’s a higher likelihood of encountering potential threats.

WTVO: Doctor’s trip could keep children safe on social media: study: A new study showed that a visit to the doctor might be able to keep children safe on social media. Researchers found that doctors having a five-minute conversation with their youth patients about social media safety resulted in more kids having follow-up conversations with their parents. That led to kids checking or changing their privacy settings. The study also found that most pediatricians had not been trained in how to talk to patients about social media.

Pittsburgh Post-Gazette: Editorial: Pa. needs to protect children from internet pornography: Nobody — or at least nobody who deserves to be taken seriously — thinks children should see pornography. And yet that’s exactly what’s happening, and at alarming rates: Studies in the United States, as well as in France, Australia and elsewhere, show that the average age at which young people first encounter porn is 11 — and some say it’s even earlier. The question is what to do about it. Pennsylvania should join several other states — states with both Republican and Democratic leadership — to require that internet pornography companies verify the age of those who access their content. Ultimately, the U.S. should join other nations in adopting a uniform national strategy for youth online safety.

Fox 43: York and Adams County coaches, parents to learn about student athlete mental health: The York Adams Interscholastic Athletic Association (YAIAA) and several other organizations will host a symposium to help parents and coaches better understand student athletes’ mental health next weekend. Organizers say the gathering aims to bring awareness, support, education and resources to students, coaches and the community. “The idea of what we’re trying to do through mental health and athletics is create a culture and climate on and off the field through coaches and parents where kids really feel safe and supported to be who they are, and if they aren’t doing okay that they have a space to go to be able to ask for this help that they might need,” Miranda Jenkins, a social worker and the head coach of men’s and women’s swimming at York Suburban, said.

WHTM: Pennsylvania lawmaker introduces bill to regulate child influencers: Representative Torren Ecker (R-Adams/Cumberland) announced today that he will introduce legislation to regulate social media child influencers and celebrities under Pennsylvania’s Child Labor Laws. Rep. Ecker says his legislation would protect children who earn money by creating content and/or whose name, likeness, or photograph is featured in a parent or guardian’s content that generates income for the parent under Pennsylvania’s Child Labor Law, according to a memo released Thursday. 

CNN: Illinois passes a law that requires parents to compensate child influencers: When 16-year-old Shreya Nallamothu from Normal, Illinois, scrolled through social media platforms to pass time during the pandemic, she became increasingly frustrated with the number of children she saw featured in family vlogs. She recalled the many home videos her parents filmed of herself and her sister over the years: taking their first steps, going to school and other “embarrassing stuff.” “I’m so glad those videos stayed in the family,” she said. “It made me realize family vlogging is putting very private and intimate moments onto the internet.”

Psychology Today: When Mental Health Info Is Obtained Via Social Media: These days almost everyone goes online to look up health information. Googling medical questions and concerns has become a part of everyday life for many of us as the Internet has become an extremely easy way to search for a doctor of any specialty, book appointments and expand one’s knowledge. Over the past several years, however, the online landscape has evolved quite dramatically with the advent of social media. In fact, much like Google, social media has also become an increasingly important source of mental health-related information, especially for teenagers and young adults.

ABC: Illinois becomes 1st state to regulate kid influencers: What to know about the law: During the coronavirus pandemic, Shreya Nallamothu, now 16, said she, like so many others, began to spend more time on social media, where she saw countless parents who documented their own lives and their kids’ lives on different platforms. “The more I fell down that rabbit hole, I kept seeing cases of exploitation,” Shreya, a high school junior from Normal, Illinois, told “Good Morning America,” adding that she specifically was struck by seeing kids who she thought were not old enough to know the full ramifications of their online presence.

Mashable: Child influencers in Illinois can now sue their parents: Illinois is the first state in America to pass a law protecting child influencers and social media stars, making sure they are paid for appearing in videos posted to monetized online platforms like TikTok and YouTube. And if they’re not paid, they can sue. The bill, SB1782, was passed unanimously through the Senate in March, after being introduced by Democratic Sen. David Koehler, and was signed into law on Friday. It will go into effect July 1, 2024. The new law will amend the state’s Child Labor Law to ensure monetary compensation for influencers and social media personalities under 16, so those children who appear in online content will be entitled to a percentage of earnings. To qualify, the videos must be filmed in Illinois and “at least 30 percent of the vlogger’s compensated video content produced within a 30-day period included the likeness, name, or photograph of the minor.”
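
The qualification test described in the article is essentially an arithmetic threshold: within a 30-day window, did the minor appear in at least 30 percent of the vlogger’s compensated video content? A minimal sketch of that calculation follows; the data structure and field names are hypothetical, chosen only to illustrate the threshold, and the sketch assumes the percentage is measured by video count rather than duration.

```python
from dataclasses import dataclass

@dataclass
class Video:
    compensated: bool      # was this video monetized?
    features_minor: bool   # does it include the minor's likeness, name, or photo?

FEATURE_THRESHOLD = 0.30   # 30 percent, per the article's description of SB1782

def minor_meets_threshold(videos_in_30_days: list[Video]) -> bool:
    """Illustrative check: did the minor appear in at least 30% of the
    compensated videos produced within a single 30-day period?"""
    compensated = [v for v in videos_in_30_days if v.compensated]
    if not compensated:
        return False
    featured = sum(1 for v in compensated if v.features_minor)
    return featured / len(compensated) >= FEATURE_THRESHOLD

# Example: 4 of 10 compensated videos feature the minor (40%), so the check passes.
sample = [Video(compensated=True, features_minor=(i < 4)) for i in range(10)]
print(minor_meets_threshold(sample))  # True
```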

The Wall Street Journal: Opinion: The Constitution Protects ‘Harmful’ Speech: The Senate is considering a bill that poses serious risks to free speech. The Senate Commerce Committee recently advanced the Kids Online Safety Act by unanimous vote. It would empower government officials—state attorneys general and the Federal Trade Commission—to challenge social-media companies when they fail to prevent “harm to minors.” Invigorated with greater statutory authority, the already aggressive enforcement agencies would have the means to deem any speech unlawful and limit it under the guise of promoting child safety. According to the text of KOSA, a state attorney general could bring a civil lawsuit against a platform if it doesn’t take down content that falls under the bill’s definition of harmful. For instance, a state could sue Instagram for violating the act’s duty of care if it doesn’t take down posts that make a child feel more anxious. 

The New York Times: Opinion: Teens Don’t Really Understand That the World Can See What They Do Online, but I Do: When Matthew McConaughey and his wife, Camila Alves McConaughey, took to Instagram to jointly announce a new venture this summer, you might have expected it to be an upcoming film or a fledgling lifestyle brand. Their news was more unusual: the unveiling of an official Instagram account for their son Levi, which they were giving to him on his 15th birthday, long after many of his friends had signed up, they noted. Celebrities have taken a wide array of approaches to granting their children access to social media — and thereby granting the public access to their children. Apple Martin, the daughter of Gwyneth Paltrow and Chris Martin, has always kept her Instagram private and once shamed her mother for publicly sharing a photo of her without her consent. DJ Khaled’s son has been on Instagram since shortly after birth. Whatever the approach, we’ve seen how easily personal revelations, flippant comments and family drama become fodder for public scrutiny and ridicule. 

New York Times: Amid Sextortion’s Rise, Computer Scientists Tap A.I. to Identify Risky Apps: Almost weekly, Brian Levine, a computer scientist at the University of Massachusetts Amherst, is asked the same question by his 14-year-old daughter: Can I download this app? Mr. Levine responds by scanning hundreds of customer reviews in the App Store for allegations of harassment or child sexual abuse. The manual and arbitrary process has made him wonder why more resources aren’t available to help parents make quick decisions about apps. Over the past two years, Mr. Levine has sought to help parents by designing a computational model that assesses customers’ reviews of social apps. Using artificial intelligence to evaluate the context of reviews with words such as “child porn” or “pedo,” he and a team of researchers have built a searchable website called the App Danger Project, which provides clear guidance on the safety of social networking apps.
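
The article describes the basic idea behind the App Danger Project: scan large numbers of App Store reviews for language suggesting harassment or abuse and surface apps with clusters of such reports. The sketch below is not Mr. Levine’s model, which uses AI to evaluate the context of reviews; it is only a much cruder keyword-matching illustration of the same general workflow, with hypothetical review text and term lists.

```python
from collections import Counter

# Hypothetical risk terms; a real system evaluates context, not just keywords.
RISK_TERMS = ["harassment", "predator", "groomed", "unsolicited", "explicit"]

def flag_reviews(reviews: list[str], terms: list[str] = RISK_TERMS) -> Counter:
    """Count how many reviews mention each risk term (case-insensitive)."""
    hits = Counter()
    for review in reviews:
        text = review.lower()
        for term in terms:
            if term in text:
                hits[term] += 1
    return hits

def risk_score(reviews: list[str]) -> float:
    """Fraction of reviews containing at least one risk term."""
    if not reviews:
        return 0.0
    flagged = sum(any(term in r.lower() for term in RISK_TERMS) for r in reviews)
    return flagged / len(reviews)

# Example with made-up reviews:
sample = [
    "Fun app for chatting with friends.",
    "My daughter received unsolicited messages from strangers.",
    "Reported a predator and nothing happened.",
]
print(flag_reviews(sample))
print(f"{risk_score(sample):.0%} of reviews flagged")
```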

Los Angeles Times: California lawmakers want to make social media safer for young people. Can they finally succeed?: Samuel Chapman had no idea that drug dealers targeted teens on Snapchat until his 16-year-old son died from a fentanyl overdose. “We thought it was like a playground for kids and didn’t think of it, as I do now, as the dark web for kids,” the Los Angeles resident said. In 2021, a drug dealer reached out to his son, Sammy, on the disappearing messaging app and showed the teen a “colorful drug menu” that offered cocaine, Chapman said. After he and his wife fell asleep, the dealer delivered drugs to their house “like a pizza.” Sammy unknowingly took fentanyl and died in his bedroom. For parents like Chapman, the horrific ordeal underscored social media’s dangerous side. Tech platforms help people keep in touch with family and friends, but they also attract drug dealers, pedophiles and other predators. Plus, social media algorithms can steer young people to posts that could trigger eating disorders or self-harm. 

AP: Georgia kids would need parental permission to join social media if Senate Republicans get their way: Georgia could join other states requiring children to have their parents’ explicit permission to create social media accounts. Two top Republicans in the Georgia state Senate — Lt. Gov. Burt Jones and Sen. Jason Anavitarte of Dallas — said in a Monday news conference they will seek to pass such a law in 2024. The proposal could also restrict accounts on other online services. “It’s important that we empower parents,” Anavitarte said. “A lot of parents don’t know how to restrict content.”

Fox 13: Social media can increase risks of mental health problems among students: U.S. Congresswoman Kathy Castor held a roundtable discussion in Tampa Thursday, with the goal of urging parents to add something new to their back-to-school checklists: safety guardrails for mobile devices. Castor was joined by members of the Hillsborough Classroom Teachers Association, Hillsborough PTSA President Ami Marie Grainger Welch and several students. “Set these guardrails. Have the conversation about what it means online for you to be on your phone for too long,” Castor said during a news conference following the roundtable. “The big tech platforms want to keep you addicted. They want your eyeballs constantly scrolling because they’re also targeting you with advertisements.” Castor believes social media and increased internet usage present a significant risk to the mental health and well-being of students across the country.  

The Times-Tribune: Local schools sue social media giants: Four area school districts are among a growing number of districts nationwide that are suing several social media giants, alleging the companies have helped fuel a mental health crisis among youth that is disrupting education and costing taxpayers money. The suits, filed by the North Pocono, Hazleton Area, Hanover Area and Crestwood school districts, allege the owners of Facebook, Instagram, TikTok, YouTube and Snapchat know their sites are highly addictive and harmful to youth. They’ve refused to implement safety measures, however, so they can continue to reap massive profits. Those profits have come at the expense of schools, who are left to deal with behavioral issues, including anxiety, depression and other mental health issues tied to excessive use of the platforms, said Joseph Cappelli, a Montgomery County attorney who represents the districts.

Washington Examiner: Controversial legislation to protect children on social media advances in Senate: Two controversial bills that would expand teenagers’ rights to privacy and limit Big Tech’s ability to collect data from underage users advanced to the Senate floor Thursday. The Senate Committee on Commerce, Science, and Transportation voted to approve two bills to implement safeguards to protect children and teenagers online. It approved the Kids Online Safety Act from Sens. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), which would require platforms to take steps to prevent a defined set of harms to minors as well as implement controls for users that allow parents to limit screen time, restrict addictive features, and determine who gets access to their teenager’s user profile.

The Economist: Regulation could disrupt the booming “kidfluencer” business: It started with a Lego “choo-choo train”. The video shows three-year-old Ryan Kaji picking it out from the store “because I like it”, he tells his mother, Loann. Back at the family home in Houston, Texas, the toddler opens the box and plays with his new toy. It’s nothing out of the ordinary. But it helped make the Kajis millionaires. Loann had recorded and uploaded the video to a new YouTube channel, “Ryan ToysReview”. Eight years, many toy unboxings and 35m subscribers later, “Ryan’s World”, as the channel is now known, is considered YouTube royalty. He is part of a new generation of child social-media influencers (those under the age of 18) changing the shape of kids’ entertainment in America—and making a lot of money in the process.

CNN: Elizabeth Warren and Lindsey Graham want a new agency to regulate tech: Two US senators are calling for the creation of a new federal agency to regulate tech companies such as Amazon, Google and Meta, in the latest push by members of Congress to clamp down on Big Tech. Under the proposal released Thursday by Sen. Elizabeth Warren, a Massachusetts Democrat, and Sen. Lindsey Graham, a South Carolina Republican, Congress would establish a new regulatory body with the power to sue platforms — or even force them to stop operating — in response to various potential harms to customers, rivals and the general public, including anticompetitive practices, violations of consumer privacy and the spread of harmful online content.

Philly Voice: Reducing social media usage by just 15 minutes a day improves one’s well-being, research suggests: There are many paths toward living a healthier life, but here’s one simple place to start: Put your phone down.  By spending less time on social media in particular, recent research suggests, we can improve our overall health and well-being. Just a 15-minute reduction in social media usage per day can have a positive impact on health and social well-being, according to a study published in the Journal of Technology in Behavioral Science.

Casino.org: Online Casinos, Roblox, and Children Linked in Report: There are reportedly some online casinos that allow players to use Robux, the in-game currency of the video game Roblox, and they also don’t make an effort to check the age of the users. The gambling platforms are also going a step further, allegedly recruiting content creators as young as 14 to attract more teenage gamblers. An article on the Sharpr substack, run and published by Cody Luongo, asserts that Roblox has become a gateway to underground gambling. The report alleges that children are able to bet millions of dollars on the sites.

The Wall Street Journal: Schools Sue Social-Media Platforms Over Alleged Harms to Students: Plaintiffs’ lawyers are pitching school boards throughout the country to file lawsuits against social-media companies on allegations that their apps cause classroom disciplinary problems and mental-health issues, diverting resources from education. Nearly 200 school districts so far have joined the litigation against the parent companies of Facebook, TikTok, Snapchat and YouTube. The suits have been consolidated in the U.S. District Court in Oakland, Calif., along with hundreds of suits by families alleging harms to their children from social media. The lawsuits face a test later this year when a judge is expected to consider a motion by the tech companies to dismiss the cases on grounds that the conduct allegedly causing the harm is protected under the internet liability shield known as Section 230.

The Washington Post: Twitter rival Mastodon rife with child-abuse material, study finds: A new report has found rampant child sexual abuse material on Mastodon, a social media site that has gained popularity in recent months as an alternative to platforms like Twitter and Instagram. Researchers say the findings raise major questions about the effectiveness of safety efforts across so-called “decentralized” platforms, which let users join independently run communities that set their own moderation rules, particularly in dealing with the internet’s most vile content. Researchers reported finding their first piece of content containing child exploitation within about five minutes. They would go on to uncover roughly 2,000 uses of hashtags associated with such material. David Thiel, one of the report’s authors, called it an unprecedented sum. “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said Thiel, referring to a technique used to identify pieces of content with unique digital signatures. Mastodon did not return a request for comment. 
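
PhotoDNA, mentioned above, is a proprietary perceptual-hashing technique that matches images against signatures of known abuse material even after minor edits. The sketch below does not implement perceptual hashing; it only illustrates the general hash-list matching workflow using an ordinary cryptographic hash (SHA-256), which, unlike PhotoDNA, matches only exact copies. The file paths and the hash list are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad SHA-256 digests (placeholder value only).
KNOWN_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: Path) -> str:
    """Hash a file in chunks so large files are not loaded fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_directory(directory: Path) -> list[Path]:
    """Return files whose exact hash appears on the known-hash list."""
    return [
        p for p in directory.rglob("*")
        if p.is_file() and sha256_of_file(p) in KNOWN_HASHES
    ]

# Usage (hypothetical path):
# matches = scan_directory(Path("/srv/uploads"))
# for m in matches:
#     print("match:", m)
```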

K-12 Dive: Scrutiny over TikTok in schools grows: Florida is one of the earliest states to ban TikTok in schools. Montana Gov. Greg Gianforte also signed a TikTok ban in May, but that restricts use across the entire state rather than only in schools. Montana’s law was challenged in court that same month by TikTok creators, a lawsuit that was funded by the social media platform itself. Many other states as well as local districts have taken issue with the company. Louisiana’s Superintendent of Education Cade Brumley, for example, advised all school system leaders in January to remove TikTok from public school devices because of data privacy concerns stemming from the Chinese ownership of the platform. A growing list of districts have also sued TikTok in the past year, many of them citing student mental health concerns. A lawsuit filed by Maryland’s Howard County Public School System in June, for example, said TikTok and other social media are “addictive and dangerous” and have changed the way kids “think, feel, and behave.”

NBC News: A teachers union says it’s fed up with social media’s impact on students: The nation’s second-largest teachers union said Thursday it was losing patience with social media apps that it says are contributing to mental health problems and misbehavior in classrooms nationwide, draining time and money from teachers and school systems. The American Federation of Teachers issued a report with several other organizations, warning that tech companies should rein in their apps before Congress forces them to do so. The federation has 1.7 million members. The report comes at a time of heightened concern about the impact of social media on children and teenagers. In May, the U.S. surgeon general warned that social media use is a main contributor to depression, anxiety and other mental health problems, and more than 100 school districts and government entities have sued the companies behind apps such as TikTok and Instagram because of the associated problems.

CNN: Leading AI companies commit to outside testing of AI systems and other safety commitments: Microsoft, Google and other leading artificial intelligence companies committed Friday to put new AI systems through outside testing before they are publicly released and to clearly label AI-generated content, the White House announced. The pledges are part of a series of voluntary commitments agreed to by the White House and seven leading AI companies – which also include Amazon, Meta, OpenAI, Anthropic and Inflection – aimed at making AI systems and products safer and more trustworthy while Congress and the White House develop more comprehensive regulations to govern the rapidly growing industry. President Joe Biden will meet with top executives from all seven companies at the White House on Friday. White House officials acknowledge that some of the companies have already enacted some of the commitments but argue they will as a whole raise “the standards for safety, security and trust of AI” and will serve as a “bridge to regulation.” 

Pittsburgh Post-Gazette: 7 charged with hacking Snapchat accounts to obtain explicit images: Seven people have been indicted on charges they conspired to hack into Snapchat accounts to obtain explicit images and videos depicting account holders, including child sexual abuse material. Six of the defendants are Pennsylvania residents and one is from North Carolina, according to the U.S. Attorney’s Office in Pittsburgh. A federal grand jury in Erie, Pa., produced the indictment on charges of conspiracy to commit wire fraud, fraud in connection with unlawful computer access, aggravated identity theft, and receipt and possession of child sexual abuse material. A news release issued this week said the indictment named: Richard Alan Martz, Jr., 33, of Meadville, Crawford County; Dylan Michael Miller, 30, of West Mifflin; Christopher Clampitt, 33, of Clemmons, N.C.; Edward Grabb, 31, of Jeannette; Michael Yackovich, 27, of West Newton; Luke Robert Swinehart, 22, of Lock Haven, Clinton County; and Karlin Terrell Jones, 26, of Beaver Falls.

NBC News: A friend-finding app offered a ‘safe space’ for teens — sextortion soon followed: A Tinder-like app popular among teenagers and young adults has allegedly been used to extort users by tricking them into sending sexually explicit photos, a problem that internet safety watchdogs say is indicative of the challenges of keeping young people safe on social media. The app, Wizz, allows users to scroll through profiles that show a person’s picture, first name, age, state and zodiac sign. Wizz advertises the app as a “safe space” to meet new friends and allows users as young as 13 to join and connect with users of a similar age. Its basic functionality resembles popular dating apps. When users open the app, they are presented with another person’s profile. They can then choose to send that person a message in the app’s chat function or swipe left to see a new profile.

Fox News: Baby monitor hackers sold nude images of kids on social media: report: Hackers are reportedly gaining access to Hikvision cameras through the company’s mobile app and have used the feeds to sell child pornography on social media. An investigation by IPVM, a surveillance industry trade publication, revealed that some hackers were using the company’s Hik-Connect app to distribute child pornography on Telegram, the publication reported last week. The investigation found several sales offers of nude videos on the platform, including some labeled “cp” (child porn), “kids room,” “family room,” “bedroom of a young girl” and “gynecological office.”

Fox News: Artificial intelligence could help ‘normalize’ child sexual abuse as graphic images erupt online: experts: Artificial intelligence is opening the door to a disturbing trend of people creating realistic images of children in sexual settings, which could increase the number of cases of sex crimes against kids in real life, experts warn.  AI platforms that can mimic human conversation or create realistic images exploded in popularity late last year into 2023 following the release of chatbot ChatGPT, which served as a watershed moment for the use of artificial intelligence. As the curiosity of people across the world was piqued by the technology for work or school tasks, others have embraced the platforms for more nefarious purposes.

The New York Times: Opinion: Algorithms Are Making Kids Desperately Unhappy: Kids are even more in the bag of social media companies than we think. So many of them have ceded their online autonomy so fully to their phones that they even balk at the idea of searching the internet — for them, the only acceptable online environment is one customized by big tech algorithms, which feed them customized content. As our children’s free time and imaginations become more and more tightly fused to the social media they consume, we need to understand that unregulated access to the internet comes at a cost. Something similar is happening for adults, too. With the advent of A.I., a spiritual loss awaits us as we outsource countless human rituals — exploration and trial and error — to machines. But it isn’t too late to change this story. 

Consumer Affairs: Meta’s updated parental controls give parents an inside look at who their kids are messaging: As lawmakers and government officials get more serious about kids’ and teens’ social media use, Meta, the home of Facebook and Instagram, is following suit. The company announced several new features that give parents more control over their kids’ social media use. While parents won’t be able to see the specifics of their children’s messages, they will be able to get a better idea of how their child uses these social media apps, including how much time is spent on them.

Beaver County Times: Beaver Falls man indicted in federal identity theft, child porn charges: A city man was indicted by a federal grand jury in Erie Tuesday for his involvement in a criminal ring accused of wire fraud conspiracy and possession of child sex abuse materials. According to the Department of Justice, charges were filed against six residents of Pennsylvania and one resident of North Carolina after the seven defendants allegedly conspired to hack into Snapchat accounts to obtain explicit images and videos of victims. Among those charged in the federal investigation was 26-year-old Karlin Terrell Jones, of Beaver Falls, who authorities said worked with the group to share these images with others online. “As alleged, the defendants used deception and hacking techniques to unlawfully access social media accounts so that they could steal, hoard, and trade explicit and otherwise private content of hundreds of unsuspecting victims,” said U.S. Attorney Eric Olshan.  

CNN: Opinion: Mark Zuckerberg’s family photo raises this crucial question: Sandwiched between a Jiu-Jitsu video and the Threads announcement, Mark Zuckerberg’s Instagram profile recently featured a casual Independence Day snapshot of him and his family. Well, most of it — emojis obscure the faces of his 5- and 7-year-old daughters. This prompted social media comments accusing Zuckerberg of hypocrisy, given the constant outcry over his company Meta’s privacy practices. Yes, it is deeply ironic that Zuckerberg, whose platforms fine-tuned a business model that earns him enormous revenues by extracting our data, wants to limit where some of his data goes. But two things are important to note here: This decision is more about his children than him, and covering their faces with emojis is more about reducing their visibility to audiences than about preventing platforms from extracting their data. 

WTAJ: Central Pa. schools file lawsuits against social media companies: Four local school districts have filed lawsuits against big-name social media companies. Altoona, Bellwood-Antis, Ferndale and Tyrone Area school districts have each submitted their own federal lawsuits against Meta, which owns Facebook and Instagram; Google, which owns YouTube; ByteDance, which owns TikTok; and Snap Inc., which owns Snapchat. The area schools filed the separate lawsuits on Wednesday, July 12 in U.S. District Court, Western District of Pennsylvania. Their suits accuse the companies of targeting children, who the districts contend “are uniquely susceptible to harm from defendants’ products,” and of designing their products to “attract and addict youth.”

The Philadelphia Tribune: Pew Charitable Trusts Gives $6.55 million to children’s mental health providers: In 2021, suicide was the third leading cause of death in the U.S. for high school students aged 14 to 18, according to the Centers for Disease Control and Prevention. In the last few years, the rates of anxiety and depression among young people have increased significantly as a result of the lingering effects of the pandemic, rising gun violence and drug overdose deaths. U.S. Surgeon General Dr. Vivek Murthy declared a mental health crisis for young people in 2021. In this environment, the Pew Charitable Trusts said Thursday that it has awarded $6.55 million to five non-profit groups seeking to make mental health services more accessible to children and teens in underserved communities in the Philadelphia area.

The Philadelphia Inquirer: Five Philadelphia nonprofits are receiving $6.55 million in Pew grants to expand youth and child mental health services: Five Philadelphia organizations working on child and youth mental health will receive a combined $6.55 million in grants from the Pew Charitable Trusts to expand access to services, the national nonprofit announced Thursday. The Pew grants aim to create more treatment options, expand the geographic reach of services within the city, and train providers in this highly specialized care, said Kristin Romens, project director of Pew’s Fund for Health and Human Services. Pew chose the organizations for their expertise and work within the community. “All of them take partnering with their clients and community very seriously,” Romens said.

NPR: So your tween wants a smartphone? Read this first: Your tween wants a smartphone very badly. So badly that it physically hurts. And they’re giving you soooo many reasons why. They’re going to middle school … they need it to collaborate with peers on school projects … they need it to tell you where they are … when they’ll be home … when the school bus is late. It’ll help you, dear parent, they vow. Plus, all their friends have one, and they feel left out. Come on! Pleeeeeeze. Before you click “place order” on that smartphone, pause and consider a few insights from a person who makes a living helping parents and tweens navigate the murky waters of smartphones and social media. Emily Cherkin spent more than a decade as a middle school teacher during the early aughts. She watched firsthand as the presence of smartphones transformed life for middle schoolers. For the past four years, she’s been working as a screen-time consultant, coaching parents about digital technology.

CNBC: Discord does about-face on parental controls for teen social media use: Discord has introduced parental controls similar to those adopted by prominent social media platforms including Instagram, TikTok, Snapchat, and YouTube. In the past, Discord’s philosophy rejected this concept, stressing a focus on users’ needs, not the needs of their parents. The release of these parental controls comes amid greater scrutiny of teen social media use and mental health issues, and follows Discord’s acquisition of Gas, a social media app focused on giving teenagers a platform to compliment each other.

Insider: Shyla Walker spent years turning her child into a YouTube star. Now, she says she regrets putting her daughter online and is cutting ties with the controversial world of family vlogging.: Shyla Walker and her boyfriend, Landon McBroom, started a couple’s channel on YouTube to document their lives and romantic memories. Walker, now 25, said she was “naive” about the internet at the time, as she rarely used social media platforms outside of posting on the channel. But she had connections to the family-vlogging world through Landon, whose brother Austin McBroom runs The Ace Family YouTube channel, a controversial family channel with more than 18 million subscribers. Austin and Catherine McBroom regularly involve their three kids — Elle, Alaia, and Steel — in their YouTube videos, and also manage Instagram accounts on behalf of the children, who are 7, 4, and 3 years old. Like many family vloggers, they have been accused by other YouTubers of exploiting their children’s lives for content over the years. (The McBrooms did not respond to a request for comment regarding this accusation.)

The Dallas Morning News: LTE: Social media is hurting kids. Why are parents alone in the fight?: Danger lurks between grinning selfies, influencer travelogues and silly memes. Some social media threats, like cyberbullying, are obvious. Others are more subtle: the barrage of doctored photos that affect our body image, the quacks who craftily disguise fake information as fact, the social contagion that distorts decision making. Research shows social media can rewire our kids’ brains, and yet our government leaders haven’t established robust safeguards for these platforms the way they have with toys, cars and drugs. That is one of the main messages of the U.S. surgeon general’s social media and youth mental health advisory, released last month. Dr. Vivek Murthy’s 25-page document — part advice, part warning — is a must-read for everyone, but especially for lawmakers who alone have the power to force the tech industry to protect children. “While nearly all parents believe they have a responsibility to protect their children from inappropriate content online, the entire burden of mitigating the risk of harm of social media cannot be placed on the shoulders of children and parents,” Murthy wrote. 

Los Angeles Times: Opinion: Smartphones take a toll on teenagers. What choice do parents have?: We can’t keep ignoring social media’s harmful effects on the mental health of young people. Across the world, regardless of skin color or language, people are suffering from mental health problems that are linked to the age at which they got their first smartphone or tablet, according to a new report from Sapien Labs. The nonprofit organization, which has a database of more than a million people in dozens of countries, found that the younger that people were when they got their first smartphone or tablet, the more likely they were to have mental health challenges as adults, including suicidal thoughts, a sense of being detached from reality and feelings of aggression toward others. The effects were most pronounced among girls, who spend more time on social media than boys do. The harm of the devices seems to be rooted in the 24/7 access they provide to social media. The longer that parents wait to give children portable digital devices, the better. Respondents who got their first smartphones or tablets in their later teens had a much stronger sense of self and ability to relate to others. 

CNN: Opinion: Dr. Sanjay Gupta: Parenting in the era of ubiquitous screens and social media: A growing number of states are turning the screws on Big Tech, the internet and social media. On Wednesday, Montana became the first state to completely ban TikTok, although many are skeptical that the controversial new legislation will be enforceable. Other moves include laws that aim to tighten regulations on social media platforms in general, like those recently enacted by Arkansas and Utah. There are three worthwhile goals that appear to be at least part of the motivation behind legal maneuvers like these: preventing companies from collecting data on us and our children, protecting kids online and balancing your rights with your responsibilities when you post content to online platforms. For example, if a platform hosts content that leads to someone being harmed, can it then also be held responsible? So far, the answer has been no, according to a recent US Supreme Court decision. For me, though, the discussions around smartphones and social media are very personal. As a dad of three teenage girls, I am often left wondering about the impact of so much screen time on their brains.

CNN: Teens should be trained before entering the world of social media, APA says: The American Psychological Association is calling for teens to undergo training before they enter the sometimes fun but sometimes fraught world of social media, according to new recommendations released Tuesday. “Social media is neither inherently harmful nor beneficial to our youth,” said Dr. Thema Bryant, the APA’s president. “Just as we require young people to be trained in order to get a driver’s license, our youth need instruction in the safe and healthy use of social media.” Bryant assembled an advisory panel to review the scientific literature on social media use and formulate recommendations for healthy adolescent use, according to an APA news release. The American Psychological Association Health Advisory on Social Media Use in Adolescence released 10 recommendations to guide educators, parents, policymakers, mental health and health practitioners, technology companies and adolescents. 

The Greylock Glass: Children’s Advocates Applaud Kids Online Safety Act: Federal legislation aimed at protecting children and teens online has gained the support of leading advocates for children’s health and privacy. The Kids Online Safety Act would make online platforms and digital providers abide by a “duty of care” requiring them to eliminate or mitigate the impact of harmful content. Kris Perry, executive director of Children and Screens, the Institute of Digital Media and Child Development, said parents would have more tools to control how their children interact with the platforms. “Limit screen time, or limit autoplay, or limit the endless scrolls so that the products become safer for their children,” Perry recommended. Perry pointed out researchers believe if the negative features can be reduced, the troubling trend of adolescents comparing their lives to others could decline, while allowing for greater social connections to be made. Some critics of the bill have said it could pressure platforms to “over-moderate,” as various states deliberate what kinds of material are considered appropriate for children. 

ABC News: Bipartisan pair of lawmakers push to protect children online: Sens. Richard Blumenthal and Marsha Blackburn introduced bipartisan legislation Tuesday focused on protecting children online and holding social media companies accountable as cries mount for improved safety features. The legislation would mandate independent annual audits to assess risks to minors, require social media companies to have more options for minors to protect their information and disable certain features, provide more parental controls and give academic and public interest organizations access to datasets to foster research. The Kids Online Safety Act of 2023 builds on the 117th Congress’ version by delineating important definitions and guidelines to better concentrate on immediate hazards to children. The legislation focuses on specific dangers online, including the promotion of suicide, eating disorders, substance abuse and sexual exploitation. 

The Washington Post: Big Tech-funded groups try to kill bills to protect children online: At a March meeting in Annapolis, Md., that state lawmakers held to discuss proposals for new safety and privacy protections for children online, one local resident made a personal plea urging officials to reject the measure. “I’m going to talk to you as a lifelong Maryland resident, parent, [husband] of a child therapist,” Carl Szabo told the Maryland Senate Finance Committee, according to footage of the proceedings. “Typically I’m a pretty cool customer, but this bill, I’m really nervous, because this comes into effect, this will really harm my family. This will really harm my kids’ ability to be online.” What Szabo didn’t initially disclose in his two-minute testimony to the panel: He is vice president and general counsel for NetChoice, a tech trade association that receives funding from tech giants including Amazon, Google and Facebook parent company Meta. NetChoice has vocally opposed the measure and already sued to block a similar law in California. 

Bloomberg: Instagram, Google See Surge in Reports of Online Child Abuse: Reports of child exploitation online increased at many of the biggest tech and social media firms over the last year, including Meta Platforms Inc.’s Instagram and Alphabet Inc.’s Google. TikTok, Amazon.com Inc.’s Twitch, Reddit Inc., and the chat apps Omegle and Discord Inc. also saw increases, according to a Tuesday report from the National Center for Missing and Exploited Children. The US child safety agency received over 32 million reports involving online enticement, child sexual abuse material and child sex trafficking in 2022 — some 2.7 million more than the year before. While child sexual abuse material, or CSAM, was the largest category, there was an 82% increase in reports regarding online enticement. The center partially attributes the increase to financial “sextortion,” which involves targeting kids to share explicit photographs and blackmailing them for money. 

CNN Business: Pornhub blocks access in Utah over age verification law: Some of the internet’s biggest adult websites, including Pornhub, are now blocking access to Utah users over a new age verification law that takes effect on Wednesday. Pornhub and other adult sites controlled by its parent, MindGeek, began blocking visitors with Utah-based IP addresses this week. Now, instead of seeing adult content when visiting those sites, affected users are shown a message expressing opposition to SB287, the Utah law signed by Gov. Spencer Cox in March that creates liability for porn sites that make their content available to people below the age of 18. “As you may know, your elected officials in Utah are requiring us to verify your age before allowing you access to our website,” the message said. “While safety and compliance are at the forefront of our mission, giving your ID card every time you want to visit an adult platform is not the most effective solution for protecting our users, and in fact, will put children and your privacy at risk.” 

Yahoo News: Online predators target children’s webcams, study finds: There has been a tenfold increase in sexual abuse imagery created with webcams and other recording devices worldwide since 2019, according to the Internet Watch Foundation. Social media sites and chatrooms are the most common methods used to facilitate contact with kids, and abuse occurs both online and offline. Increasingly, predators are using advances in technology to engage in technology-facilitated sexual abuse. Once having gained access to a child’s webcam, a predator can use it to record, produce and distribute child pornography. We are criminologists who study cybercrime and cybersecurity. Our current research examines the methods online predators use to compromise children’s webcams. To do this, we posed online as children to observe active online predators in action.

The Tribune-Democrat: Area school districts join lawsuit against social media companies: As local school districts join a nationwide lawsuit against some of the largest social media companies, the educational leaders aim to bring awareness to the negative effects these apps and sites have on teenagers and children and hold the businesses accountable. “We’re alleging the public nuisance legal theory, which allows government entities to hold companies liable for unique damages caused by a company’s conduct,” said Ronald Repak, partner at Dillon McCandless King Coulter and Graham, LLP. He and the firm represent nearly 30 regional school districts and have encouraged each to join the suit against Facebook, Instagram, TikTok, Snapchat and similar companies. Indiana Area School Board was one of the first, locally, to sign on, followed by Windber Area, Penn Cambria and Blacklick Valley. 

Education Week: Federal and State Lawmakers Want to Regulate Young Social Media Users. Will It Work?: A rising number of state and federal lawmakers are crafting legislation that would restrict young kids’ access to social media and institute other protections for young social media users—all in the name of improving mental health. But some policy experts worry that the bills—which are generating bipartisan support—will be difficult to enforce and may have unintended consequences. “This is all new territory for Congress: how do you protect the First Amendment? How do you keep kids’ autonomy online?” said Allison Ivie, the government relations representative for the Eating Disorders Coalition for Research, Policy and Action, which has been tracking this issue closely. She was referring to a bill recently filed in the U.S. Senate. “There is a level of frustration in this country when we see these levels of mental health problems skyrocketing, and people want a quick fix.” Many lawmakers, who are parents and grandparents, are seeing this problem play out in their homes, said Ivie. And she suspects there was an expectation from a lot of adults that kids’ mental health issues would dissipate once they were back to learning full time in-person again.

WFSB: Lawmakers consider proposal to prohibit children under 13 from using social media: Lawmakers on Wednesday will reveal more information about a bipartisan bill to protect children from the harmful impacts of social media. Parents Channel 3 spoke with have said they’ve seen first-hand the negative impacts of social media platforms on children. Studies have shown social media usage is a cause of a mental health epidemic. U.S. lawmakers said they identified areas of concern and wanted to ensure kids’ mental health and overall safety. One of the lawmakers who backs the bill is Democratic Sen. Chris Murphy. He and the three other lawmakers said they support the bill because they have young children of their own. 

MIT Technology Review: Why child safety bills are popping up all over the US: Bills ostensibly aimed at making the internet safer for children and teens have been popping up all over the United States recently. Dozens of bills in states including Utah, Arkansas, Texas, Maryland, Connecticut, and New York have been introduced in the last few months. They are at least partly a response to concerns, especially among parents, over the potentially negative impact of social media on kids’ mental health. However, the content of these bills varies drastically from state to state. While some aim to protect privacy, others risk eroding it. Some could have a chilling effect on free speech online. There’s a decent chance that many of the measures will face legal challenges, and some aren’t necessarily even enforceable. And altogether, these bills will further fragment an already highly fractured regulatory landscape across the US.

Tribune Chronicle: Opinion: Child deaths prove social media is no game: I recall being dared by my friends, as a child, to put my hand on the electric fence surrounding a cow field just down the rural country lane from the southwestern Pennsylvania home where I grew up. I accepted the dare, and we all laughed out loud as the electric volt zapped through my body. It was, well, shocking, but since I’m still here to write about it, I guess it ended OK. I’m pretty sure reading this will be the first time my mother learns of my ridiculous childhood stunt. Sadly, when Jacob Stevens’ parents learned about the stupid teen challenge their son recently took in the company of his buddies, he was already having a seizure. The 13-year-old boy from Columbus never woke up. The boy had been participating in TikTok’s “Benadryl Challenge.”

CNN Business: Meta opens up its Horizon Worlds VR app to teens for the first time, prompting outcries from US lawmakers: Meta is forging ahead with plans to let teenagers onto its virtual reality app, Horizon Worlds, despite objections from lawmakers and civil society groups that the technology could have unintended consequences for mental health. On Tuesday, the social media giant said children as young as 13 in Canada and the United States will gain access to Horizon Worlds for the first time in the coming weeks. The app, which is already available to users above the age of 17, represents Meta CEO Mark Zuckerberg’s vision for a next-generation internet, where users can physically interact with each other in virtual spaces resembling real life. “Now, teens will be able to explore immersive worlds, play games like Arena Clash and Giant Mini Paddle Golf, enjoy concerts and live comedy events, connect with others from around the world, and express themselves as they create their own virtual experiences,” Meta said in a blog post.

Florida Politics: Legislature unanimously passes bill restricting social media, student phone use in school: A bill banning TikTok, Snapchat, Twitter and other social media platforms on public school devices and requiring schools to teach kids about the perils of the internet is now primed for Gov. Ron DeSantis’ signature. The Senate voted 39-0 in favor of the measure (HB 379), which by July 1 will mandate public school districts to block access to social media on school-provided Wi-Fi and adopt a safety policy that addresses access to the internet by minors. Students would still be able to access social media sites using their own phones, tablets, laptops and mobile plans; however, the bill prohibits using devices during class time unless it’s for educational purposes as directed by a teacher. The bill also directs the Department of Education to develop new curricula on social media safety for grades 6-12 on its social, emotional and physical effects, as well as its dangers, and make the materials available to the public and parents. 

CNN Wire: Parents decide their children’s online usage as US lawmakers debate over TikTok: In the future, when teenagers want to sign up for an account on Facebook or Instagram, they may first need to ask their parent or guardian to give their consent to the social media companies. That, at least, is the vision emerging from a growing number of states introducing – and in some cases passing – legislation intended to protect kids online. For years, US lawmakers have called for new safeguards to address concerns about social platforms leading younger users down harmful rabbit holes, enabling new forms of bullying and harassment and adding to what’s been described as a teen mental health crisis. Now, in the absence of federal legislation, states are taking action, and raising some alarms in the process. The governors of Arkansas and Utah recently signed controversial bills into law that require social media companies to conduct age verification for all state residents and to obtain consent from guardians for minors before they join a platform. Lawmakers in Connecticut and Ohio are also working to pass similar legislation. 

Axios NW Arkansas: What Arkansas’ parental consent for social media means: Arkansas Gov. Sarah Huckabee Sanders signed legislation last week requiring social media companies to verify ages and obtain parental consent for users younger than 18 who are trying to open new accounts. The big picture: Supporters of the Social Media Safety Act say it can help protect children from harmful effects of social media, while others say the move raises privacy, free speech and enforceability concerns. The legislation’s sponsor, state Sen. Tyler Dees (R-Siloam Springs), told the Arkansas Senate that minors are exposed to harmful people and inappropriate content on social media, arguing age verification would empower parents to protect their kids, Arkansas Advocate reported. Details: The law requires companies to contract with third-party vendors to verify users’ ages before allowing access to the platforms.

Engineering and Technology: Opinion: It’s time for responsible social media: “We must finally hold social media companies accountable for the experiment they are running on our children for profit. And it’s time to pass bipartisan legislation to stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us.” President Joe Biden got a standing ovation from Democrats and Republicans when he proposed tough regulation of social media in February’s State of the Union address. But getting something into federal law is proving tricky. The US has lagged the UK and the EU on online regulation. Europe’s General Data Protection Regulation has become a global template for privacy and has for now been retained in British law.

Gizmodo: Kids on BeReal Are Exposed to Sexual Content More Often Than Other Social Networks, Survey Finds: BeReal, a popular new spontaneous image-sharing app designed to show life “without filters,” had the highest proportion of child users exposed to sexual content of any major social media app, according to a new survey shared with Gizmodo. Larger apps like YouTube and TikTok had more overall incidents of exposure to sexual content, but users on BeReal were the most likely to actually interact with the content, the survey found. Similarly, the survey of parents showed BeReal had the highest proportion of child users who have shared sexually explicit images of themselves on the app. Those findings are part of a large survey of US parents conducted by ParentsTogether Action, a nonprofit organization that advocates in favor of tougher online protection for teens and kids. The survey of 1,000 parents found instances of child sexual abuse and exploitation on every major social network. More than a third (34%) of parents surveyed said they believed their children had been exposed to sexually explicit content online. More than 40% of the kids exposed to sexual content were under 12 years old at the time.

Marketing Dive: Pinterest latest to add additional safety protocols for younger users: Pinterest is the latest social media platform to announce new safety features aimed at protecting the wellbeing and privacy of its younger users, a topic that has come into sharper focus recently despite broader struggles to gain unified support. The efforts by the platform follow an investigation by NBC News last month that unveiled how predators online have been compiling photos of children, including toddlers, into saved collections — known on the app as “boards” — with content often involving children bending over, dancing or sticking their tongue out. The investigation also found that similar images and videos were fed to users through the app’s algorithm after interest in such content was displayed. The investigation, which quickly gained national attention, prompted Pinterest to add new features expanding the capabilities for users to report content and accounts, and its latest safety additions seem to be building on its corrective efforts.

Bloomberg: Meta Urged to Halt Plans Allowing Minors Into the Metaverse: Dozens of advocacy organizations and children’s safety experts are calling on Meta Platforms Inc. to terminate its plans to allow minors into its new virtual reality world. Meta is planning to invite teenagers and young adults to join its metaverse app, Horizon Worlds, in the coming months. But the groups and experts that signed the letter, which was sent to Meta Chief Executive Officer Mark Zuckerberg on Friday, argue that minors will face harassment and privacy violations on the virtual reality app, which is only in its early stages. “Meta must wait for more peer-reviewed research on the potential risks of the metaverse to be certain that children and teens would be safe,” wrote the groups, led by online safety groups including Fairplay, the Center for Countering Digital Hate, Common Sense Media and others. The letter points to a March report from the Center for Countering Digital Hate that found users under 18 are already facing harassment from adults on the app. Researchers with the center witnessed 19 episodes of abuse directed at minors by adults, including sexual harassment, during 100 visits to the most popular worlds within Horizon Universe. 

Bloomberg Law: States Race After Utah on Minors’ Privacy Despite Legal Threats: The legislative and regulatory landscape concerning minors’ privacy is becoming increasingly protective, adding new complexity and uncertainty for companies as they navigate a patchwork of requirements. The latest legislative trend aims to regulate use of social media by minors and provide parents with greater control over their children’s social media activities. This wave of legislation is building alongside concerns about the impact of social media on teens’ mental health and perceived gaps in protections for children’s privacy rights on social media. On March 23, Utah was the first state to adopt such social media regulations with its Social Media Regulation Act. The law applies to social media companies with more than 5 million users worldwide, and it goes into effect on May 3, with numerous requirements coming into force beginning March 1, 2024. 

The New York Times: What Students Are Saying About Banning TikTok: TikTok, the social media app owned by the Chinese company ByteDance, has long worried American lawmakers, but those concerns — which range from national security risks to the app’s effects on young people — came to a fever pitch last month when a House committee voted to advance legislation that would allow President Biden to ban TikTok from all devices nationwide. Two-thirds of American teenagers are on the app. So we asked students: “Should the United States Ban TikTok?” Most were opposed. Their arguments included the fact that many apps — not just TikTok — are collecting and selling their data; that a ban would violate the first amendment; that TikTok is fun and helpful for users and lucrative for creators; and that the government has bigger problems it should be worrying about. But a sizable portion of commenters was in favor, citing national security concerns, the app’s effects on young people’s mental health, and the ease with which they can get around restrictions. For some, the possibility of a TikTok ban brought on something like relief: “I wouldn’t be upset,” Timothy from WHS wrote, “and to be honest, I think it would be for the better when it comes to me and kids around my age because it becomes addicting.” 

Bloomberg: Amazon’s Twitch Safety, AI Ethics Job Cuts Raise Concerns Among Ex-Workers: Job cuts at Amazon.com Inc.’s Twitch division are raising concerns among former employees and content monitors about the popular livestreaming site’s ability to police abusive or illegal behavior — issues that have plagued the business since its inception. Layoffs at Twitch eliminated about 15% of the staff responsible for monitoring such behavior, according to former employees with knowledge of the matter. The company also folded a new team monitoring the ethics of its AI efforts, said the people, who asked not to be identified to protect their job prospects. Since late 2022, technology companies have cut more than 200,000 jobs, including trust and safety positions and contractors, at Meta Platforms Inc., Alphabet Inc. and Twitter. Job postings including the words “trust and safety” declined 78% in March 2023 from a year ago, according to the job-listing site Indeed. Technology companies also thinned their responsible AI and diversity teams. 

The Wall Street Journal: Mothers Power New Drive to Make Social-Media Firms Accountable for Harms: Silicon Valley has for years brushed back attempts to make internet platforms more accountable for harm to young people. Online safety advocates are hoping to turn the tide with a new force: Moms. Mothers who say social media devastated their sons and daughters are stepping up efforts to pass legislative remedies, including by making personal appeals to lawmakers and working with congressional aides to fine-tune legislation. The power of the lobby of mothers was demonstrated in November, when about 10 women walked into Sen. Maria Cantwell’s (D., Wash.) office, demanding to know why they hadn’t been able to secure a meeting with the chair of the Senate Commerce Committee. Kristin Bride, 56, of Mesa, Ariz., clutched a picture of her late 16-year-old son as she approached the reception desk. Several more mothers followed, holding their own photographs of children whose deaths or struggles they blame, in part, on platforms such as YouTube, Instagram and Snapchat. 

The Washington Post: There are almost no legal protections for the internet’s child stars: Since she was a small child, Cam Barrett, now 24 and a social media strategist in Chicago, had her life documented online. First on Myspace, then on Facebook, Barrett says, her mother posted relentlessly about her, concocting storylines about highly personal events that resulted in her being bullied and ostracized at school. “When I was nine years old, the intimate details of my first period were shared online,” she said in February while testifying about her experience to the Illinois Senate. “At 15, I was in a car accident. … Instead of a hand being offered to hold, a camera was shoved in my face.” Eventually, she said, she began hiding out in her room so she wouldn’t have to appear on camera. “I was told to look sicker for the camera,” she later told The Washington Post. “I was told if I look too happy, I have to take another picture to look like this, or like this. … I was hit by a drunk driver, and she right off the bat put a phone in my face to take pictures to put online,” Barrett said. 

The New York Times: A digital footprint that begins before birth: Sophie Kratsas was only a few hours old when she received her first email: a “welcome to the world” message from her father, Nick Kratsas. He had created an email account for his newborn daughter while still standing in the delivery room. This was 2014, and Kratsas, 44, had already noticed a dearth of unclaimed email addresses with a person’s full name without numbers, special characters or other concessions. “I’m like, man, if I can grab this for her now, eventually she’ll be able to use this when she’s ready for it,” Kratsas said. A few days later, he created a Facebook profile for Sophie so he and his wife, Heather, 41, could begin tagging her in posts and photos. When she’s old enough, they intend to turn over the email and Facebook accounts to her, along with the robust digital histories that come with them. Sophie, now 9, is one of many children in her generation whose digital footprint precedes her physical one. In an age when teens and tweens are more online than ever, some parents find it just as important to invest in their offspring’s digital futures, like securing their email addresses, domain names and social media handles, as it is to invest in their finances and education. 

Cato Institute: Analyzing the Consequences of Recent Youth Online Safety Proposals: Many policymakers at both the state and federal levels have called for additional regulations to protect children’s online privacy and improve online safety. While the desire to protect children is a well-intentioned motivation, these proposals have significant consequences, and in many cases may even diminish children’s online privacy. In a new policy brief out today, I discuss the potential impact of these proposals for all internet users, not just children. In general, these proposed online safety regulations tend to fall into three major categories: a total or near-total ban of social media use by users under a certain age; requirements for age verification and age-appropriate design for social media and other general-use websites; and additional age verification and age-appropriate design codes for particular types of content. A ban or near-total ban of social media use is a draconian step for a government to take. It eliminates speech opportunities for individuals of every age.

Forbes: Sex Traffickers Used America’s Favorite Family Safety App To Control Victims: Earlier this year, an 18-year-old Amazon employee brought a tip to the San Diego Police Department: prior to working for the tech giant, she had been forced into sex work when she was 17. Her alleged trafficker told her that she had to work six days a week and earn at least $1,000 a day, according to a search warrant obtained by Forbes. Text messages also showed her alleged trafficker forced her to do something else: install an app called Life360 on her phone. The app, which claims over 50 million active users across 195 countries, is among the most popular family safety apps in America. It lets parents and kids know where each family member is located at all times, displaying their live coordinates on a map. But, according to nine federal cases dating back to at least 2018, it has also been used by sexual predators to monitor and control their victims. And privacy and trafficking experts say such misuse is hardly an anomaly; it’s becoming an issue with other apps like it including Apple’s “Find My Friends” and Google’s “Find My Phone” tools. 

WBRE/WYOU: Being aware of ‘dark challenges’ on social media: Social media is as popular as ever with many people spending hours scrolling through the apps. But there is a darker side to social media that could be harmful to teens and even potentially deadly. Eyewitness News spoke with an area father who lost his teenager to a dark challenge. Now he’s made it his mission to warn other parents about the dangers lurking on some social media apps. “Boy was my hero, always has been, always will be,” said Dave Thomas. Dave Thomas of Honesdale continually grieves the loss of his teenage son, Logan Gorski. “Logan was the perfect child. He always stood for something,” added Thomas. Logan died on October 5, 2020, 11 days before his 17th birthday. Dave says Logan engaged in a dark challenge from the social media site called ‘Kik’, and it went too far, taking Logan’s life.

The Morning Call: ‘Human beings are making the decisions’: Northampton County dismisses concerns over AI tool to help keep children safe: An artificial intelligence tool Northampton County officials have begun using to help predict which children could be at risk of harm has caught the attention of federal investigators looking into a case of an Allegheny County couple whose child was removed from their custody based on the same tool. Despite the investigation underway by the U.S. Department of Justice in the Allegheny County case, Susan Wandalowski, director of Northampton County’s human services, said recently the AI product could be an important device in its mission of protecting children. She said county officials believe the tool is part of child-welfare professionals’ arsenal in screening for potential abuse cases, to ensure children’s safety, keep families intact, and avoid unnecessary investigations of low-risk families. “At the end of the day,” Wandalowski said, “the human beings are making the decisions. This is just one additional tool in their tool belt.”

USA Today: Opinion: Will TikTok be banned? Maybe it should be for kids, at least.: Congress wants to kick TikTok out of America. I want to boot it from my bedroom. Like most crazy-busy parent professionals in America these days, I am beyond desperate for a full night of life-restoring rest. Yet I cannot fight the irresistible magnet-force pull that is the latest viral video – especially at night, when I’m too tired to be rational. I’m not alone. TikTok’s been downloaded more than 210 million times in the United States, according to the most recent marketing statistics. To date, there’s no evidence that TikTok is a threat to national security. But there’s plenty of evidence that, like all of the others – Facebook, Twitter, Instagram, YouTube, Snapchat – the company’s main goal is to hook you young and keep you coming back for more.

Newsweek: Opinion: A TikTok Ban Is the Only Way Forward: When it comes to building the Great Firewall—a tool to control and censor the internet—the Chinese Communist Party (CCP) has had no trouble. When it comes to building a firewall between TikTok and its Chinese parent company ByteDance, though, the CCP will curiously struggle. That’s because TikTok is an American company in name only. It can and should be banned. The core problem here is that the CCP controls TikTok. Explaining why is a simple two-part argument. First, there is no firewall between the CCP and China’s private sector. Second, there is no firewall between TikTok and its Chinese parent company ByteDance. The first part is clearly spelled out in Chinese law, namely the 2015 National Security Law and 2017 National Intelligence Law. In particular, the national intelligence law states that “any organization or citizen shall support, assist, and cooperate with state intelligence work.” 

The Seattle Times: Opinion: Isolation and restraint of students is abuse: Listening to legislative testimony about a 7-year-old boy in crisis who was restrained face down on a school floor tests the heart. Watching a grown man choke up at his memory of being dragged through a hallway and locked in a barren isolation room would lead any feeling person to wonder why Washington state continues these practices. Our youth prisons outlawed solitary confinement in 2020. Yet at least 3,800 children — most of them younger than 12 — were isolated or restrained 24,873 times during the 2019-20 school year, according to a report from Disability Rights Washington. The vast majority of these kids were special education students, and a wildly out-of-proportion number were Black. The high number of incidents makes the case plainly: Isolation and restraint do not correct behavior. To the contrary, with an average of six incidents per student, there is abundant evidence that these approaches only exacerbate the problem. 

THV 11: Arkansas suing TikTok, Meta for exposing minors to ‘damaging content’: Three different lawsuits have been filed by Arkansas Attorney General Tim Griffin, taking aim at social media powerhouses Meta and TikTok in order to “protect” Arkansas children from the applications. He was joined by Governor Sarah Huckabee Sanders during a press conference, where they discussed the lawsuits which are being filed under Arkansas’s ‘Deceptive Trade Practices Act.’ “The common theme is deception. And the consequences of that deception is endangering Arkansans, particularly our children, our youth,” said Attorney General Tim Griffin. According to Gov. Sanders, this act prohibits companies from “engaging in false, deceptive, business practices.” She believes that this falsehood comes as social media companies “claim that their platforms are beneficial, non-addictive, and private.” In terms of Meta, the governor claimed that the company played a role in a mental health crisis among teens.

The Washington Post: Utah governor says new social media laws will ‘prevail’ over challenges: Utah Gov. Spencer Cox (R) signed into law a pair of measures last week that seek to strictly limit social media access for kids and teens, marking “some of the most aggressive laws passed by any state to curb the use of social media by young people,” as my colleagues Naomi Nix, Cat Zakrzewski and Heather Kelly wrote. The move is likely to ignite another legal standoff with tech industry groups, which have already expressed concern about the laws’ constitutionality and gone on the offensive against a growing raft of state laws targeting social media companies. But Cox said Sunday that he believes state officials will defeat lawsuits challenging the state’s new social media laws. “We feel very confident that we have a good case here,” Cox told NBC News’s “Meet The Press” on Sunday. “We expect that there will be lawsuits, and we feel confident that we will prevail.” The laws would require companies to obtain parental consent before letting minors access their platforms and set a digital curfew for younger users. They would also require companies to give guardians access to their child’s account and to verify that users in Utah are over 18.

CNBC: Worried about your kids and A.I.? Experts share advice — and highlight the risks to look out for: Artificial intelligence is all the rage in the tech world, especially after the launch of ChatGPT and GPT-4. It has shown potential not only to change the lives of workers — but also the daily life of another demographic: kids. In fact, children are already using AI-powered toys and platforms that write bedtime stories at the click of a button. “We call today’s children ‘Generation AI’ because they are surrounded by AI almost everywhere they go, and AI models make decisions that determine the videos they watch online, their curriculum in school, the social assistance their families receive, and more,” Seth Bergeson, fellow at the World Economic Forum who led their “AI for Children” project, told CNBC Make It. And AI’s influence will only grow from here, said Saurabh Sanghvi and Jake Bryant, partners at McKinsey.

My Ches. Co: Pennsylvania Man Sentenced to Prison for Cyberstalking: According to United States Attorney Gerard M. Karam, Vandaley, with the intent to harass and intimidate other persons, engaged in a course of conduct using electronic communication systems and services of interstate commerce to cause substantial emotional distress to six victims.  All six victims were former romantic partners of Vandaley or relatives and friends of his former romantic partners. During the course of the conduct, Vandaley repeatedly made false anonymous allegations to law enforcement agencies throughout the country.  He falsely accused the victims of committing heinous crimes including murder-for-hire, narcotics trafficking, human trafficking, and sexual offenses.  Vandaley also sent anonymous electronic messages to the victims threatening to kidnap and murder the minor child of one of the victims. Vandaley also threatened to mail parts of that minor child back to the victim.  Vandaley committed these crimes while one of the victims had a protection from abuse order against him. 

CNN Business: TikTok CEO in the hot seat: 5 takeaways from his first appearance before Congress: In his first appearance before Congress on Thursday, TikTok CEO Shou Chew was grilled by lawmakers who expressed deep skepticism about his company’s attempts to protect US user data and ease concerns about its ties to China. It was a rare chance for the public to hear from Chew, who offers very few interviews. Yet his company’s app is among the most popular in America, with more than 150 million active users. Here are the biggest takeaways from Thursday’s hearing.

The Washington Post: Utah governor signs bill to curb children’s social media use: Utah Gov. Spencer Cox (R) signed two bills into law Thursday that would impose sweeping restrictions on kid and teen use of social media apps such as Instagram and TikTok — a move proponents say will protect youth from the detrimental effects of internet platforms. One law aims to force social media companies to verify that users who are Utah residents are over the age of 18. The bill also requires platforms to obtain parental consent before letting minors use their services, and guardians must be given access to their child’s account. A default curfew must also be set. The Utah regulations are some of the most aggressive laws passed by any state to curb the use of social media by young people, at a time when experts have been raising alarm bells about worsening mental health among American adolescents. Congress has struggled to pass stricter bills on online child safety despite bipartisan concern about the effects social media has on kids. 

CNN: Why Bucks County, Pennsylvania, is suing social media companies: One mother in Bucks County, Pennsylvania, said her 18-year-old daughter is so obsessed with TikTok, she’ll spend hours making elaborate videos for the Likes, and will post retouched photos of herself online to look skinnier. Another mother in the same county told CNN her 16-year-old daughter’s ex-boyfriend shared partially nude images of the teen with another Instagram user abroad via direct messages. After a failed attempt at blackmailing the family, the user posted the pictures on Instagram, according to the mother, with some partial blurring of her daughter’s body to bypass Instagram’s algorithms that ban nudity. “I worked so hard to get the photos taken down and had people I knew from all over the world reporting it to Instagram,” the mother said. The two mothers, who spoke with CNN on condition of anonymity, highlight the struggles parents face with the unique risks posed by social media, including the potential for online platforms to lead teens down harmful rabbit holes, compound mental health issues and enable new forms of digital harassment and bullying. But on Friday, their hometown of Bucks County became what’s believed to be the first county in the United States to file a lawsuit against social media companies, alleging TikTok, Instagram, YouTube, Snapchat and Facebook have worsened anxiety and depression in young people, and that the platforms are designed to “exploit for profit” their vulnerabilities.

Glenside Local: PA lawmakers introducing social media bill, Sen. Haywood co-sponsors firearm ammunition bill: The Pennsylvania state Senate will soon introduce a bill mandating age verification on social media platforms and allowing parents/guardians to submit a request to delete a minor’s social media account. “There are clear and demonstrated harms to children who utilize these platforms, a fact which has been known by social media companies for years,” State Rep. Robert W. Mercuri (R-Allegheny) said in a co-sponsorship memorandum. “Attempts by these companies to curtail such harms failed to alleviate the problem and actually made it worse.” Mercuri has also taken a stand against the platform TikTok, which is owned by a Chinese company. In a memo to the PA House Mercuri said the bill would “protect the Commonwealth’s information technology assets from security risks associated with the social media network TikTok.”

CT Mirror: CT-led bill aims to protect kids online. Will it clear Congress?: As revelations about the harmful toll of social media on children and teens have become public over the past few years, Congress sought to amp up the pressure on Big Tech and pass legislation for the first time in decades to protect minors and hold companies accountable. Some of those efforts “came heartbreakingly close” to materializing at the end of the year but ultimately faded and got punted to the new session of Congress that started in January. One of those bills, co-authored by Sen. Richard Blumenthal, D-Conn., focuses on the safety aspect and gives children and parents greater control over what online content can be viewed. The issue came to a head when Facebook whistleblower Frances Haugen testified before Congress in 2021 about the harmful effects of social media on children and teenagers and how tech giants kept users engaged to turn profits. Lawmakers like Blumenthal believe the growing bipartisan support on this issue could lead to the passage of tech reforms this time around — possibly this year. 

Bloomberg Law: Utah Taunts Social Media Sites With Sweeping Teen Restrictions: Utah’s first-in-the-nation legislation to restrict how social media companies treat young users and allow individuals to sue over violations will set the stage for a tech industry legal battle regarding their constitutionality. Gov. Spencer Cox (R), citing concerns over youth mental health, plans to sign into law Thursday two bills that aim to protect children from addictive features and other potential harms of social media. Platforms such as Facebook and Twitter would have to obtain parental consent if a user under 18 wants to open an account, and the companies could face fines and lawsuits for running afoul of a host of new requirements. The bills are among the most stringent efforts by state lawmakers across the country this year to regulate a child’s experience online. Tech industry groups said in letters asking Cox to veto the measures that they would violate the First Amendment and lead to frivolous lawsuits. 

Next Gov: ‘Alarming Content’ from AI Chatbots Raises Child Safety Concerns, Senator Says: As leading technology companies rush to integrate artificial intelligence into their products, a Democratic senator is demanding answers about how these firms are working to protect their young users from harm—particularly following a series of news reports that detailed disturbing content created by AI-powered chatbots. In a letter on Tuesday to the CEOs of five companies—Alphabet Inc.’s Google, Facebook parent company Meta, Microsoft, OpenAI and Snap—Sen. Michael Bennet, D-Colo., expressed concern about “the rapid integration of generative artificial intelligence into search engines, social media platforms and other consumer products heavily used by teenagers and children.” Bennet noted that, since OpenAI’s ChatGPT was launched in November, “leading digital platforms have rushed to integrate generative AI technologies into their applications and services.” While he acknowledged the “enormous potential” of generative AI’s adoption into a range of technologies, Bennet added that “the race to integrate it into everyday applications cannot come at the expense of younger users’ safety and wellbeing.” 

Patch: PA Could Mandate Social Media Age Limits, Allow Parents To Delete: As the reckoning for social media platforms that critics say have recklessly harmed children continues, Pennsylvania legislators are looking for more ways to keep young people safe. Legislation will soon be introduced in the Pennsylvania state Senate which would mandate age verification on social media platforms. It would also allow parents to request a child’s account be deleted. “There are clear and demonstrated harms to children who utilize these platforms, a fact which has been known by social media companies for years,” State Rep. Robert W. Mercuri (R-Allegheny) said in a co-sponsorship memorandum. “Attempts by these companies to curtail such harms failed to alleviate the problem and actually made it worse.”

The Hill: Utah’s Cox says he will sign divisive social media bill restricting minors: Utah Gov. Spencer Cox (R) on Thursday said he’ll sign a divisive bill restricting minors from using social media without parental permission. Cox said at a meeting with reporters that he’ll “absolutely” sign the social media bills sent to his desk this session: Utah Senate Bill 152 would require social media companies to verify that users in the state are 18 years or older in order to open an account, and Cox said he is willing to face any legal challenges to the initiative. “I’m not gonna back down from a potential legal challenge when these companies are killing our kids,” Cox said, according to footage from PBS Utah, shaking off First Amendment concerns. Under the bill, Utah residents under age 18 would only be able to open an account with a parent or guardian’s permission. The new restrictions would take effect March 1, 2024. The governor said he would be working with social media companies and third-party verification over the next year to work out the details of how the restrictions would be implemented.

Gizmodo: Parents Group Demands Meeting With Meta and TikTok Over Child Suicide: A family advocacy group called Parents Together published an open letter Thursday demanding a meeting with the heads of Meta and ByteDance, arguing that the companies knowingly expose children to a variety of dire threats, including the risk of suicide, and that they have refused to address these problems in favor of growth and profit. The open letter describes a number of horror stories from families who say their children fell victim to the harms posed by social media, including suicides, accidental deaths from viral “challenges,” hospitalizations from eating disorders, sexual abuse, and more. Meta and ByteDance, the parent companies of Facebook and TikTok, respectively, “have imposed on unwitting children and families – anxiety and depression, cyberbullying, sexual predators, disordered eating, dangerous challenges, access to drugs, addiction to your platforms, and more—every single day,” Parents Together Action said in the letter. The companies “have chosen your profits, your stockholders, and your company over children’s health, safety, and even lives over and over again.”

NBC News: Senators seek answers from Pinterest after NBC News investigation: Days after an NBC News investigation revealed how grown men on Pinterest openly create sex-themed image boards filled with pictures of little girls, the company says it has “dramatically” increased its number of human content moderators. It also unveiled two new features enabling users to report content and accounts for a range of violations. Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., sent a letter to the company Tuesday morning demanding to know why the new tools weren’t already available, among other questions. “It should not have taken national media coverage of such graphic misuse targeting young children to prompt action,” wrote the senators, who are co-sponsors of  the bipartisan Kids Online Safety Act. “This report is particularly disappointing given that Pinterest has branded itself the ‘last positive corner of the internet.’” 

The Washington Post: Snapchat tried to make a safe AI. It chats with me about booze and sex.: Snapchat recently launched an artificial intelligence chatbot that tries to act like a friend. It built in some guardrails to make it safer for teens than other AI bots built on the tech that powers the buzzy ChatGPT. But in my tests, conversations with Snapchat’s My AI can still turn wildly inappropriate. After I told My AI I was 15 and wanted to have an epic birthday party, it gave me advice on how to mask the smell of alcohol and pot. When I told it I had an essay due for school, it wrote it for me. In another conversation with a supposed 13-year-old, My AI even offered advice about having sex for the first time with a partner who is 31. “You could consider setting the mood with candles or music,” it told researchers in a test by the Center for Humane Technology I was able to verify. 

Daily Mail: Facebook and Instagram used ‘aggressive tactics’ targeting children: Unredacted lawsuit claims Meta knew about child sexual exploitation and exploited extreme content to drive more engagement: Meta knowingly used ‘aggressive tactics’ that involved getting children hooked on social media ‘in the name of growth,’ according to a lawsuit against Meta claiming children have suffered at the hands of Facebook and Instagram. A Meta software engineer claimed that ‘it is not a secret’ how Facebook and Instagram used meticulous algorithms to promote repetitive and compulsive use among minors, regardless of whether the content was harmful, and that the company has ‘been pretty unapologetic about it.’ The previously redacted revelations were disclosed in a lawsuit against Meta, which has been unsealed and seen by DailyMail.com.

CBS 12 News: Social Media Safety: subcommittee unanimously sends safety education bill to senate floor: Social media safety has become a primary concern for everyone – especially for schools and parents of young children, who are more vulnerable to its dangers. Now, the question of whether the Department of Education should mandate safety instruction on online platforms to children in schools – a proposal in Senate Bill 52 – has been advanced by a senate subcommittee. Social media may be a fun place for people to engage with one another – but at least one cyber expert says kids must learn about its dangers too. “It’s extremely dangerous, there’s no other way around it,” remarked FAU Adjunct Professor and tech expert Craig Agranoff. “We’ve become a society that values likes more than we care, value kindness.”

The Baltimore Sun: Judge approves redactions for AG’s Catholic clergy abuse report, clearing way for its release: A Baltimore judge approved the needed redactions Tuesday for the attorney general’s report on sexual abuse within the Roman Catholic Archdiocese of Baltimore, clearing the way for its public release. Circuit Judge Robert K. Taylor ordered the Maryland Attorney General’s Office to redact 37 names from the report and to anonymize the identities of 60 other people, removing them from the 456-page document entirely. Taylor’s order leaves the timing of the report’s release at the discretion of Attorney General Anthony Brown, whose office must complete the required redactions and notify the 37 individuals before publishing it. A timeline for when Brown expects to release the report was not available Tuesday afternoon, although it was unlikely the report would be released before Wednesday. The attorney general’s office will publish the report on its website. 

News 5 Cleveland: Big tech, lawmakers, and local schools take steps to monitor screen time and protect children: Mom Stephanie Miller remains watchful when it comes to her kids’ screen time. “Who doesn’t let them be on technology?” Miller said. “You know they can enjoy things, but it’s minimally because I like to keep their minds going in a more educated way, like imaginary play.” In February, Lieutenant Governor Jon Husted proposed the Social Media Parental Notification Act. It would require social media and gaming companies to get parental consent before kids under 16 sign up. Dr. Michael Manos, head of the ADHD center at the Cleveland Clinic, said too much phone time early on can be linked to anxiety and depression in children. “The effort to limit screen time is certainly laudable and should have been done a long time ago,” said Manos.

The Washington Post: Meta doesn’t want to police the metaverse. Kids are paying the price: Zach Mathison, 28, sometimes worries about the hostility in Meta’s virtual reality-powered social media game, Horizon Worlds. When his 7-year-old son, Mason, explores the app, he encounters users, often other children, screaming obscenities or racist slurs. He is so uneasy about his son that he monitors his every move in VR through a television connected to his Quest headset. When Mathison decides a room is unsafe, he’ll instruct Mason to leave. He frequents online forums to advise other parents to do the same. “A lot of parents don’t really understand it at all so they just usually leave it to the kids to play on there,” he said, adding, “If your kid has an Oculus please try to monitor them and monitor who they’re talking to.” For years, Meta has argued the best way to protect people in virtual reality is by empowering them to protect themselves — giving users tools to control their own environments, such as the ability to block or distance other users.

NBC News: Men on Pinterest are creating sex-themed image boards of little girls. The platform makes it easy: Like other kids her age, 9-year-old Victoria signed up for Pinterest because she wasn’t allowed on TikTok. Her mother feared she might encounter dangerous content or individuals on the popular video-sharing app. Pinterest, meanwhile, seemed safe. But while the third grader was “pinning” pictures of baby animals, craft ideas and nail art inspiration into her image “boards” on the site, grown men were pinning her. Clips Victoria uploaded of herself to Pinterest, such as one in which she cheerfully turns a cartwheel, have been compiled by at least 50 users into their own boards with titles like “young girls,” as well as “Sexy little girls,” “hot,” “delicious,” and “guilty pleasures.” Those boards are filled with dozens, hundreds and sometimes thousands of photos and videos of children. 

Forbes: Will The U.S. Update Laws For Children’s Digital Privacy?: Despite a last-ditch effort by lawmakers in December 2022, two bills to strengthen online regulatory protection for children in the U.S. failed to make it to Congress’s 2023 fiscal spending plan. The advocacy group Fairplay termed the development “beyond heartbreaking,” adding that “preventable harms and tragedies” were allowed to continue unimpeded. Lawmakers sponsoring the bills blamed the “behemoth sway” of lobbyists working on behalf of Big Tech. These and other voices call attention to the growing dangers of certain online activities to vulnerable children. To regain the initiative, President Joseph Biden demanded a ban on online ads targeting children during his State of the Union address on February 7, 2023. 

The Globe: District 518 warns against scams, encourages parents to talk to kids: District 518 is asking parents and students for help fighting a new social media scam: fake accounts that threaten teens with releasing nude photos of them — pictures that aren’t even real. “We are investigating,” said Anne Foley, public relations/communications coordinator with District 518, noting that investigations take time. “We don’t know if these people are fellow students. We have no idea if these people live in Worthington. And we have no idea how they’re choosing the kids, but there’s been at least two.” 

Social Media Today: New Social Media Restrictions for Youngsters Could Lead to Broader Limits in Access: Could this be a sign of things to come in social media regulation? The State of Utah is set to pass a new law which will restrict people under the age of 18 from using social media apps without a parent’s consent. As per Axios: “Starting March 1st, 2024, all Utahns would have to confirm their ages to use social media platforms or lose account access, under the bill, sponsored by state Rep. Michael McKell.” The new law, if enacted, would add an extra level of protection for youngsters, with all users losing access to their accounts if they fail to verify their age, and parents given the ability to monitor their kids’ activity.

Wired: TikTok’s Screen-Time Limits Are the Real Distraction: My first cell phone was a brick-shaped Nokia with a couple hundred minutes loaded onto it. My parents gave it to me when I got my first car, on the understanding that, whenever I drove somewhere that wasn’t school, I’d call them as soon as I arrived so they’d know I was safe. It was a reasonable rule—especially given how many times it took me to pass my driver’s test—and one to which I had no problem agreeing. Even still, I almost never remembered to do it. I’d be in the middle of a movie at the theater and I’d realize that I had forgotten to call. I’d sprint out to the car—where I kept the phone itself—and have a brief, harried conversation with my worried and deeply irritated parents. They knew, of course, that I was likely fine. But it’s hard to not know what your kids are doing without you. 

Axios: Tech platforms struggle to verify their users’ age: Social media and streaming platforms are trying to figure out the best ways to verify a user’s age as parents and lawmakers grow increasingly concerned about the way children and teenagers use online services. Driving the news: Those worries — along with recently enacted laws in the United Kingdom and California — have pushed companies to try new processes for ensuring underage users aren’t getting onto sites and services meant for older people. Age verification and age estimation are just one part of an attempt to make tech safer for kids as complaints grow over mental health harms, privacy trespasses and more.

Axios Salt Lake City: Utah set to limit minors from using social media without parent’s OK: Utah is poised to pass a law restricting children and teens under age 18 from using social media without their parent’s consent. Meanwhile, adults could lose access to their accounts, too, if they refuse to verify their age. The latest: After SB 152 cleared its final legislative hurdle last week, Utah Gov. Spencer Cox told reporters Friday — the final day of the 2023 general session — he planned to sign the bill. Cox said the state was “holding social media companies accountable for the damage that they are doing to our people.” Between the lines: Starting March 1, 2024, all Utahns would have to confirm their ages to use social media platforms or lose account access, under the bill, sponsored by state Sen. Mike McKell (R-Spanish Fork).

Bucks County Courier Times: Sextortion is on the rise, and it can be deadly. How to protect yourself and your kids: Ian Pisarchuk sat behind a screen and terrorized his victims. He’d befriend them mostly online, and then the demands would start. He wanted photos of them, or said he already had them. He made threats to get what he wanted from girls and young women, using details from their social media accounts to exert power over them for his own pleasure. “Words cannot describe the anxiety Ian has caused me,” said one of his victims in a Bucks County court last month. In 2019, Pisarchuk set his sights on one young girl, getting her to send him explicit images of herself and then threatening to expose the photos online.

CNN: Democratic senators urge Meta not to market its metaverse app to teens: Two Democratic senators urged Meta this week to suspend a reported plan to offer Horizon Worlds, the company’s flagship virtual reality app, to teens between the ages of 13 and 17, arguing the technology could harm young users’ physical and mental health. The lawmakers, Massachusetts Sen. Ed Markey and Connecticut Sen. Richard Blumenthal, called Meta’s plan “unacceptable” in light of the company’s “record of failure to protect children and teens,” in a letter dated Wednesday to company CEO Mark Zuckerberg. The letter focuses on a plan, reported by the Wall Street Journal last month, that would enable Meta’s teen users to join a persistent online world consisting of multiple digital communities through the use of a virtual reality headset. Horizon Worlds is already available to adults 18 and older. 

Time: ‘We Can Turn It Off.’ Why TikTok’s New Teen Time Limit May Not Do Much: TikTok is the most popular social media platform for teens—and by many accounts the time they spend on it is growing. Two-thirds of U.S. teenagers told a 2022 Pew survey that they are on the app, and 16% said they use it constantly. In 2021, the average time kids and teens spent on TikTok grew to 91 minutes a day, up from 82 the year before, according to a report by TechCrunch. So Tuesday’s news that TikTok moved to limit minors to one hour per day sounds like a big deal. But teachers, who have reported concerningly high social media use among students and struggles to compete for their attention, say that while the new limits are a good idea, they might not have a big impact. 

WRIC: Do you know who your children are talking to online? Police warn of influx in social media scams targeting teens: A recent influx of scams targeting teenagers prompted Chesterfield County Police to urge parents to keep a closer eye on their children’s devices. Sergeant Winfred Lewis with Chesterfield County Police Department’s Special Victims team described how these particular scammers prey on young people’s fear and embarrassment. “They’re juveniles,” Lewis explained. “They’re teenagers.” Typically, when police warn of online scams, they note how scammers target the elderly, who may be less familiar with modernized social media and web technology. However, with this recent wave of scams, even the most internet-savvy individuals are vulnerable. Victims have been kids as young as 11 or 12 years old.

CNET: TikTok Will Limit Teen Screen Time to 60 Minutes by Default: TikTok said Wednesday that it wants teens to be more aware of the time they spend on the popular app for short-form videos. The tech company said it’ll set screen time limits for teens by default and release new features so parents have more control over their children’s use of social media. TikTok users under 18 years old will have their screen time limit automatically set to 60 minutes. The short-form video app said this default screen time will apply to new and existing accounts that haven’t already used this tool. 

Politico: Vivek Murthy wants kids off social media: Surgeon General Vivek Murthy is an evangelist for wellness, hosting town halls and expounding on meditation and mindfulness on his House Calls podcast. He’s particularly concerned about kids’ mental health and has issued guidance for young people, suggesting they ask for help, volunteer in their communities and learn stress management techniques. And he’s testified before Congress about the topic. In conversation with Ruth, he calls out social media as a unique threat to the rising generation, a view shared by many in Congress who are considering legislation to make it harder for kids to use the technology. 

Forbes: Meta Backs New Platform To Help Minors Wipe Naked, Sexual Images Off Internet: The National Center for Missing & Exploited Children has launched a platform, funded by Meta, to help kids and teens have naked or sexual photos and videos of themselves removed from social media. The new service for minors, called Take It Down, was unveiled one year after the release of a similar tool for adults known as StopNCII (short for non-consensual intimate imagery). Today, Take It Down will work only across five platforms that agreed to participate: Meta-owned Facebook and Instagram, as well as Yubo, OnlyFans and Pornhub. “We created this system because many children are facing these desperate situations,” said NCMEC’s president and CEO Michelle DeLaune. “Our hope is that children become aware of this service, and they feel a sense of relief that tools exist to help take the images down.” 

Bloomberg Law: Social Media, Porn Sites Targeted in States Seeking Age Checks: Dozens of proposals pending in statehouses across the country that aim to regulate a child’s experience online are raising concerns over the future of anonymity on the internet. Lawmakers are pushing a variety of bills aimed at boosting privacy protections for kids’ personal information, limiting their access to social media without parental involvement, or keeping them off of sites that include explicit content such as pornography. The measures would rely on companies like Meta Platforms, Inc., Alphabet Inc., and TikTok Inc. to know how old their online users are—posing the conundrum of determining age without gathering too much sensitive information about a person’s identity. 

NBC News Washington: How Social Media and Screen Time Can Affect Children’s Mental Health: New research shows sites like TikTok may have a negative impact on children’s mental health. The algorithm is designed to keep users engaged longer, and studies show the more time kids and teens spend on social media, the more likely they are to be depressed. Psychiatrist Dr. Asha Patton-Smith of Kaiser Permanente offered guidance for parents.

New Jersey Monitor: N.J. legislators propose punishing social media companies for kids’ online addiction: For teenagers like Nidhi Das, social media became a cherished lifeline to friends during the pandemic’s early days. But as regular life resumed, Das didn’t like how tethered she felt to it. Social media became her go-to boredom buster, and even the misinformation that infects many platforms kept her swiping. “The algorithm, it curates to what you like. And people would make up little controversies, so that might encourage you, like ‘oh, let me look into that.’ Even if it’s not true, I still want to know like: ‘Oh, where did that stem from?’” said Das, 17, a high school senior from Lawrenceville. “The addicting thing is that there’s always something endlessly there, so you keep scrolling.” 

ABC News: Supreme Court wrestles with immunity for social media companies: For the first time Tuesday, the U.S. Supreme Court wrestled with the scope of a landmark federal law that’s given sweeping legal immunity to internet and social media companies for more than 25 years. Section 230 of the Communications Decency Act — known in the tech world as the “26 words that created the modern internet” — protects the companies from liability for content posted by individual users, no matter how discriminatory, defamatory or even dangerous the information may be. 

Patch: NYC Mayor: Investigate Social Media, Mr. President: New York City’s drill rap-dissing, burgeoning fuddy-duddy-in-chief has a request for America’s octogenarian commander-in-chief: investigate social media. “I don’t think that we have properly analyzed what social media is doing to us in general, specifically to our young people,” Mayor Eric Adams said Tuesday in response to a question about a teen’s recent subway surfing death. “I am hoping the president calls a national Blue Ribbon Commission to really analyze this thing that has really dropped into our lives.” 

CBS News: ‘It’s not going away’: Local psychologist weighs in on proposal to ban kids from social media: Whether fascinated by Facebook or taken with Twitter, kids could get the boot from social media. Republican Senator Josh Hawley of Missouri introduced a bill banning children under 16 years of age from using social media. Hawley says big tech companies are neglecting children’s health and monetizing their personal information. The U.S. Surgeon General says kids aged 13 and younger shouldn’t even be on social media. “This skewed, and often distorted environment of social media often does a disservice to many of those children,” said Dr. Vivek Murthy. 

NPR: 10 things to know about how social media affects teens’ brains: The statistics are sobering. In the past year, nearly 1 in 3 teen girls reports seriously considering suicide. One in 5 teens identifying as LGBTQ+ say they attempted suicide in that time. Between 2009 and 2019, depression rates doubled for all teens. And that was before the COVID-19 pandemic. The question is: Why now? “Our brains, our bodies, and our society have been evolving together to shape human development for millennia… Within the last twenty years, the advent of portable technology and social media platforms is changing what took 60,000 years to evolve,” Mitch Prinstein, the chief science officer at the American Psychological Association (APA), told the Senate Judiciary Committee this week.  

AP: Ohio proposal: Get parents’ OK for kids to use social media: Ohio’s governor wants the state to require parental consent for kids under 16 to get new accounts on TikTok, Snapchat and other social media platforms. Republican Gov. Mike DeWine’s two-year budget proposal would create a law that social media companies must obtain a parent’s permission for children to sign up for social media and gaming apps. The proposal also names YouTube, Facebook and Instagram, but the proposal would apply broadly, to “any online web site, online service, online product, or online feature that requires consumer consent to register, sign up, or otherwise create a unique username.” 

CBS News: Florida lawmakers mull HB 591, which aims to protect children from cyberbullying, sex trafficking: Florida lawmakers gathered Tuesday in Tallahassee to advocate for House Bill 591, legislation that aims to protect juveniles from falling victim to cyberbullying and sex trafficking. Citing a rise in the number of minors suffering from anxiety and depression, State Rep. Michele Rayner-Goolsby, Rep. Tyler Sirois and State Sen. Shevrin Jones sponsored the bill to protect the youth from online harassment. Jena McClure, a mother of three, said she supports the implementation of this bill after witnessing her children and their friends fall victim to bullying. She said it happens often to children everywhere.

Roll Call: Social media companies put profits over children, senators say: Senators sounded off against social media platforms and called for action during a Senate Judiciary Committee hearing on Tuesday, saying the companies lack accountability and are focused on profits at the expense of children. The hours-long hearing touched on an array of issues, including: the harms of cyberbullying, the scourge of child sexual abuse material on social media, and mental health issues among youth. It also underscored how there is bipartisan support for taking action on social media platforms — even in a narrowly divided Congress.

Patch: Sexting Education Program Aims To Keep Chester Kids Digitally Safe: Online sexual extortion of minors is on the rise as technology becomes more prevalent in everyday life, and the Chester Police Department wants to remind parents of the online safety precautions they and their children should take to stay safe. Chester Police Detective Lieutenant Chris Cavanagh has partnered with the Chester School District to present a parent evening titled “Keeping your child safe from Child Exploitation – it all starts with the device” on Feb. 15, at 6:30 p.m., at the Black River Middle School. 

NBC News: How one teen is urging legislators in Washington state to help protect kids from being exploited on vlogs: A Washington state teenager is advocating for a bill to protect the privacy of the children of influencers. Chris McCarty, 18, a freshman at the University of Washington, said they wanted to advocate for children’s right to privacy online after having learned about influencer Myka Stauffer, who shared extensive, intimate content about her adopted son before she relinquished custody because of his medical needs. McCarty, who uses they/them pronouns, started the site Quit Clicking Kids to spread awareness and urge fellow advocates to take action in their own states. When they were a senior in high school last year, they cold-emailed multiple state legislators and eventually worked with state Rep. Emily Wicks to craft HB 2023, which was re-introduced as HB 1627 for this year’s legislative session. 

News Press Now: Drug dealers targeting kids through social media: A popular social media platform is facing lawsuits from families for its role as a tool for drug dealers to dispense fentanyl to young people. Families of more than 50 overdose victims have filed a lawsuit against Snapchat. According to the lawsuit, from 2020-2022, Snapchat was allegedly a conduit for more than 75% of the fentanyl poisoning deaths of teens between the ages of 13 and 18. Local experts expressed concern over social media being a wide-open platform for dealers because they can sell drugs to people from anywhere in the country.

Journal News: Ohio governor seeks law requiring social media companies to get parental consent for kids’ accounts: Social media companies like TikTok and Facebook would be required to get verified parental consent before allowing a child under age 16 to have an account, according to a law proposed by Ohio Gov. Mike DeWine in his new budget. 

The Columbus Dispatch: Ohio may require kids to get parental consent to use TikTok, Facebook, other social media: Ohio could soon make it easier for parents to restrict their children’s access to TikTok, Snapchat and other apps. Part of Gov. Mike DeWine’s two-year budget proposal would require social media companies to get parental consent before allowing kids under age 16 to use their platforms. They would be tasked with creating a splash page that verifies the user’s age and obtains the necessary consent from a parent or guardian. 

KGO-TV (Utah): Why expert says Utah’s social media ID verification bill could lead to nationwide privacy issues: What steps are you willing to take to use social media? Would you pay for the platform or agree to all terms and conditions? What if one of those conditions were to upload a copy of your government identification card? That’s exactly what could happen in the State of Utah with Senate Bill 152 – “the social media regulation act.” If it passes, Utah residents would have to upload their ID to prove they are over the age of 18 to use these platforms. For those under 18, a parent’s ID is needed to verify the account.

WCPO-TV (Ohio): DeWine seeks law requiring social media companies to get parental consent for kids’ accounts: Social media companies like TikTok and Facebook would be required to get verified parental consent before allowing a child under age 16 to have an account, according to a law proposed by Ohio Gov. Mike DeWine in his new budget. “Social media companies are running platforms that are addicting our children, harming our children and we need more parental involvement,” said Ohio Lt. Gov. Jon Husted, who is taking the lead on the effort and spoke about it during a Dayton visit on Wednesday.

WTVG-TV (Ohio): Ohio bill would require kids under 16 to have parental permission before joining social media: A new piece of legislation presented to the Ohio General Assembly last week, called the Social Media Parental Notification Act, would require kids aged 15 years old and younger to have parental permission before joining certain online platforms. Lieutenant Governor Jon Husted is pushing for the proposal. “These tech companies have created these apps that are designed with algorithms to addict your children to these platforms and collect data on them. These platforms are not being used for virtuous reasons,” says the Lieutenant Governor. “They (parents) would be able to observe more things and they would know what exact platforms their children are on and see who their children are talking to and are connected to. They can see what kind of influences people can have on all their children, I think that would be really beneficial,” says Sarah Koralewski.

Bloomberg Law: California Bill to Let Parents Sue Social Media Gets Second Try: California lawmakers are attempting again to hold social media companies liable for addicting child users to their product, a renewed effort that will face fierce resistance from the tech industry. “This legislation is like throwing more fuel on the flames created by the legislature last session,” said Carl Szabo, vice president of NetChoice, which represents Meta, Google, and other tech companies. State Sen. Nancy Skinner (D) last week introduced SB 287, which would subject a company to penalties of up to $250,000 per violation, an injunction, and litigation costs and attorney fees. Her bill is similar to widely watched state legislation last year that would have allowed the attorney general and local district attorneys to file civil suits against social media companies for knowingly putting in designs or algorithms that will addict kids.

American Academy of Pediatrics: Center of Excellence: Creating a Healthy Digital Ecosystem for Children and Youth: This National Center of Excellence will serve as a centralized, trusted source for evidence-based education and technical assistance to support the mental health of children and adolescents as they navigate social media. The American Academy of Pediatrics (AAP) Center of Excellence: Creating a Healthy Digital Ecosystem for Children and Youth is dedicated to promoting healthy social media use and pediatric mental wellbeing. Social media use starts during childhood and can play a significant role in the relationships and experiences that impact the growth, development and mental health of children and teens.

Deseret News (Utah): Op-ed: Teenage social media addictions — what parents don’t know and can’t track: Utah Gov. Spencer Cox “compared social media companies to pharmaceutical companies that make opioids” as reported in a recent Deseret News article. Children and teenagers spend less time with supportive groups and their families due to internet usage. Extensive social media usage is harming the younger generation. Social media companies were aware of this concern but did not share these details with the public. Children and teens consistently using social media are at greater risk for cyberbullying, online harassment, sexting and depression.

NBC News: Sen. Josh Hawley wants to create a legal age to be allowed on social media: Sen. Josh Hawley, R-Mo., intends to make his focus in the current Congress a legislative package aimed at protecting children online — including by setting the age threshold to be on social media at 16. In an interview with NBC News, Hawley detailed some top lines of what his agenda will include, such as commissioning a wide-ranging congressional mental-health study on the impact social media has on children. “For me, this is about protecting kids, protecting their mental health, protecting their safety,” Hawley said. “There’s ample evidence to this effect that big tech companies put their profits ahead of protecting kids online.”

TechCrunch: TikTok is crushing YouTube in annual study of kids’ and teens’ app usage: For another year in a row, TikTok has found itself as the social app kids and teens are spending the most time using throughout the day, even outpacing YouTube. According to an ongoing annual review of kids’ and teens’ app usage and behavior globally, the younger demographic — minors ranging in ages from 4 through 18 — began to watch more TikTok than YouTube on an average daily basis starting in June 2020, and TikTok’s numbers have continued to grow ever since. In June 2020, TikTok overtook YouTube for the first time, with kids watching an average of 82 minutes per day on TikTok versus an average of 75 minutes per day on YouTube, according to new data from parental control software maker Qustodio.

KMOV-TV: ‘It’s an addiction’: Parents, teens navigate self-esteem, safety of social media: The Parkway School District hosted a national speaker Monday night, helping parents better monitor their children’s social media usage, as teens turn to popular apps to communicate and share photos of their lives. The event featured a conversation with Erin Walsh of the Spark & Stitch Institute and included research in the fields of child and adolescent development along with digital media. “The research is pretty nuanced on this,” said Erin Schulte, Coordinator of Counseling and Character Education for Parkway Schools. “It would be nice if it was simple, like this is all bad, keep them away. But it’s not, it can be used for good things.”

KBTX-TV: Focus at Four: Experts say social media breaks are critical for mental well-being: 82% of the U.S. population currently uses social media. Studies have shown that reducing social media use to just 30 minutes a day can lead to improved mental health and well-being. Experts say that excessive use of social media platforms has been found to have a much greater negative impact. “We’ve seen that social media use is associated with eating disorders, particularly in female adolescents,” said Dr. Pete Loper, a triple board-certified physician in pediatrics, psychiatry, and child psychiatry. “It’s associated with increased depression and anxiety. It is also associated with increased self-harm thoughts, particularly in our children, and adolescents.”

WLS-TV: Our Chicago: TikTok’s CEO to testify before Congress and how social media impacts kids’ wellbeing: More than two dozen states have now banned TikTok on government owned devices. It’s also now illegal for the app to be on any federal phone. All of this comes amid concerns about data privacy and national security, as well as ongoing studies about the app’s impact on the mental well-being of young people. It was announced recently that TikTok’s CEO will testify before the House Energy and Commerce Committee in March. Illinois U.S. Rep. Jan Schakowsky sits on that committee, and said she’s “really looking forward to quizzing the CEO and getting more information.” “But, I certainly have my concerns,” Schakowsky said. “There’s no question that TikTok, which is used mostly by young people (which adds to the concern), is doing the kind of surveillance and looking into the private information. Too much information is collected by these platforms and social media companies. But, we worry about TikTok because of the relationship with the Chinese government.”

The Salt Lake Tribune: Editorial: Social genies are out of the bottle, Editorial Board writes. Let’s prepare our kids to handle them: Childhood and adolescence have always been fraught with danger. Parents have been at their proverbial wit’s end since the primary hazard was a sabertooth tiger. These days, one such fright is “social media,” which can mean a lot of things but generally refers to platforms such as Twitter, Instagram and TikTok. Apps on smartphones that can take the human need to communicate and hype it into addictive brain candies that, at their worst, carry messages of bullying, body shaming and other darts that can lead to depression or even suicide. But it is just as true that, ever since Professor Harold Hill warned the good people of River City, Iowa, about their children frequenting pool halls and “memorizing jokes from Capt. Billy’s Whiz Bang,” whatever is new and scary about a culture provides an avenue for con men and well-meaning busybodies to offer protection for our little dears.

FOX 35 (Orlando): Proposed bill aims to restrict social media usage in Florida classrooms: Proposed House and Senate bills are targeting the use of social media in schools. One of the bills would prevent the use of any social media in K through 12 schools when students are using the school’s network. The bills would also require instruction on the good, bad and ugly sides of social media. “It’s digital fentanyl for our children,” said Florida’s Chief Financial Officer Jimmy Patronis. Patronis feels social media is having an adverse effect on Florida’s youth. He supports SB 52 and HB 379, which would restrict students’ access to it in the classroom.

WOWT-TV (Nebraska): The FBI is warning parents tonight about a rise in sextortion complaints: The FBI has issued a new warning to Omaha parents after seeing an increase in reports of adults tricking children into sending explicit content through social media. Todd Dicaprio with the FBI is referring to it as “sextortion.” It is when an adult portrays himself as a minor to manipulate children through social media platforms to get them to send sexual pictures and videos to sell online. “We receive on average one to two referrals per week of a child who has been exploited in some sexually suggestive manner online,” Dicaprio said.

The New York Post: ‘Tranquilizer challenge’ ODs land 15 grade school students in hospital: Viral internet stunts continue to endanger the lives of young people: More than 15 students in Mexico were forced to undergo treatment after overdosing on drugs as part of a dangerous online Clonazepam “tranquilizer challenge.”

NBC News: Top Health Officials Urge Parents To Keep Kids and Teens Off Social Media Apps: “If you look at the guidelines from the platforms, at age 13 is when kids are technically allowed to use social media,” said U.S. Surgeon General Vivek Murthy. “I personally, based on the data I’ve seen, believe that 13 is too early.” “Too young for social media” is what health officials are saying about children age 13, even though that is the standard minimum age for several social media platforms.

Salon: 13-year-olds should not be on social media, surgeon general warns: As anyone who has either raised or been a teenager in the 21st century can tell you, social media is omnipresent in modern youth culture. Whether it is finding new music on TikTok or finding new friends on Fortnite, teenagers use social media to connect with their peers, express their individuality and participate in a global community. Yet this new technological and social paradigm brings with it grave concerns: social media spaces that youth frequent are rife with bullying, misinformation and bigotry, which can have a detrimental effect on the self-esteem of developing young minds.

The Washington Post: Analysis: A new bill would ban anyone under 16 from using social media: A growing number of U.S. policymakers and federal officials are angling to keep children and young teenagers off social media entirely, citing mounting concerns that the platforms may harm their well-being and mental health. It’s a notable escalation in the rhetoric around keeping kids safe online, which has largely focused on setting new digital protections. The push gained traction after the U.S. Surgeon General Vivek Murthy told CNN on Sunday that he believes 13 is “too early” for kids to be joining apps like Instagram and TikTok, which he said can create a “distorted environment” that “often does a disservice” to kids.

Good Morning America: Excessive screen time during infancy may be linked to lower cognitive skills later in childhood: The amount of time babies spend watching computer, TV and phone screens in their first year of life may be indirectly linked to lower cognitive skills later in life, according to a new study. Babies who watched on average two hours of screen time per day performed worse later on, at age 9, on executive functions, according to the study, which was published Monday in the journal JAMA Pediatrics.

NBC News: Sen. Dick Durbin urges DOJ to review Twitter’s handling of child exploitation: Senate Judiciary Committee Chair Dick Durbin urged Attorney General Merrick Garland in a letter Tuesday to review Twitter’s handling of child exploitation material, calling the Justice Department’s failure to address the issue “unacceptable.” “Sadly, Twitter has provided little confidence that it is adequately policing its platform to prevent the online sexual exploitation of children,” Durbin, D-Ill., wrote. “This puts children at serious risk.” The letter cites reporting from NBC News that found dozens of Twitter accounts and hundreds of tweets using numerous hashtags to promote the sale of child sexual abuse material (CSAM). Some of the tweets were brazen in how they marketed the material, using common terms and abbreviations for CSAM. After the article was published, Twitter said that it was blocking access to several hashtags associated with the posts.

Chicago Tribune: After study finds social media may change pre-teens’ brain wiring, psychologist advises time limits, IRL activities: A new study showing a correlation between frequent checking of social media and neurological sensitivity to social cues in young people underscores the importance of in-person interactions (in other words, talking to people face-to-face instead of on a screen) and setting boundaries around technology and social media use, a pediatric psychologist at Advocate Children’s Hospital said. The study, which was published Jan. 3 in JAMA Pediatrics, tracked the brain activity of about 170 sixth through eighth graders who reported checking Facebook, Instagram and Snapchat at varying frequencies. It found that young people who checked social platforms more frequently had a higher “neural sensitivity to anticipation of social rewards and punishments.”

Denver 7 TV: Is keeping teens off social media unrealistic?: Even though 13-year-olds can sign up for accounts, whether they should is a different question. On Sunday, Surgeon General Vivek Murthy told CNN that he believes age 13 is too young to be on social media. University of Michigan data from 2021 indicate that many children have social media accounts before reaching 13. According to a survey conducted by the University of Michigan, 49% of parents of children ages 10-12 report their kids having social media accounts. With so many children online, Sarah Clark, a research scientist in the Department of Pediatrics at the University of Michigan, questions whether it is realistic to ask parents to outright ban their children from social media. Instead, she encourages setting parameters to promote safe social media usage.

The Cleveland Clinic: Why Social Media Challenges Can Be a Recipe for Disaster — When They’re Real: It’s almost impossible to make it through childhood and adolescence without making questionable — and often downright foolish — decisions. Pushing boundaries and taking risks is part of growing up. We do the best we can to insulate our kids from risk, but they’re always finding new and innovative ways to get hurt. Social media definitely isn’t helping. It amplifies the power of peer pressure, and rewards dangerous risk-taking with likes, shares and empty promises of insta-fame. “It’s tricky because teens can get positive reinforcement with all the likes and views from the videos they post,” says pediatric emergency medicine specialist Purva Grover, MD. “So, the more risky or shocking, the greater the possibility that more people will see it.”

KSL-TV: Utah lawmakers want age restrictions on social media platforms: A Senate committee took the first steps toward regulating social media platforms in the state, advancing a bill that would require minors to get parental consent before signing up for social accounts. SB152 is one of several bills in the Utah Legislature aimed at tech giants this year, after Gov. Spencer Cox made social media regulation one of his top issues ahead of the legislative session. Earlier this month, Cox threatened to regulate social media companies over the alleged harm to children, and last week he announced plans to sue major tech platforms. Cox’s brother-in-law, Sen. Mike McKell, R-Spanish Fork, is sponsoring the bill, which would require social media companies to use age verification to prevent minors from signing up without their parent’s permission and would prohibit companies from collecting or selling personal data of minors.

The Hill: Surgeon general: 13-year-olds too young to join social media: Surgeon General Vivek Murthy on Sunday cautioned that, despite many app guidelines, 13-year-olds are too young to join social media. “What is the right age for a child to start using social media? I worry that right now, if you look at the guidelines from the platforms, that age 13 is when kids are technically allowed to use social media. But there are two concerns I have about that. One is: I, personally, based on the data I’ve seen, believe that 13 is too early,” Murthy said on CNN’s “Newsroom.” Twitter, Facebook, Instagram and other top social media platforms allow users age 13 and older to join, create their own profiles and share and consume content.

Forbes: ‘We Can’t Look Away’: Documentary ANXIOUS NATION Explores The Rise In Anxiety In Children: Like adults, children feel worried from time to time. It’s normal. But when a child’s anxiety interferes with his or her school, home or social life, it’s time for professional help. The moving documentary, Anxious Nation, delves into the increased rates of anxiety among children and adolescents, and appeals for the urgent need for compassionate and science-based treatment and care. After attending a screening at the Palm Springs International Film Festival, I spoke with filmmakers and cast members about the mental illness epidemic among some of society’s most vulnerable individuals.

CNN: Children’s mental health tops list of parent worries, survey finds: Forty percent of US parents are “extremely” or “very” worried that their children will struggle with anxiety or depression at some point, a new survey finds. The Pew Research Center report said mental health was the greatest concern among parents, followed by bullying, which worries 35% of parents. These concerns trumped fears of kidnapping, dangers of drugs and alcohol, teen pregnancy and getting into trouble with the police. Concerns varied by race, ethnicity and income level, with roughly 4 in 10 Latino and low-income parents and 3 in 10 Black parents saying they are extremely or very worried that their children could be shot, compared with about 1 in 10 high-income or White parents.

Axios: Surgeon general: 13-year-olds too young to join social media platforms: Surgeon General Vivek Murthy said on “CNN Newsroom” on Saturday he believes 13-year-olds are too young to join social media and that being on those platforms does a “disservice” to children. The big picture: Scientists have warned of a connection between heavy social media use and mental health issues in children, saying that the negatives outweigh the positives. Instagram, Snapchat and Twitter all allow users ages 13 or older on their platforms. TikTok users in the United States who are younger than 13 can use the platform, albeit with a safety setting for children that limits the information collected from them and prevents them from messaging other users or allowing others to see their user profile.

CNN: Surgeon General says 13 is ‘too early’ to join social media: US Surgeon General Vivek Murthy says he believes 13 is too young for children to be on social media platforms, because although sites allow children of that age to join, kids are still “developing their identity.” Meta, Twitter, and a host of other social media giants currently allow 13-year-olds to join their platforms. “I, personally, based on the data I’ve seen, believe that 13 is too early … It’s a time where it’s really important for us to be thoughtful about what’s going into how they think about their own self-worth and their relationships and the skewed and often distorted environment of social media often does a disservice to many of those children,” Murthy said on “CNN Newsroom.”

New York Post: Surgeon general warns 13 is too young for children to be on social media: Surgeon General Vivek Murthy warned that children join social media too early and believes they should only be allowed to access the platforms once they’re between 16 and 18. Platforms such as TikTok, Instagram and Twitter currently allow users to join as long as they are at least 13 years old. Murthy believes this can cause adolescents to have a “distorted” sense of self during their crucial developmental years. “I, personally, based on the data I’ve seen, believe that 13 is too early,” Murthy said on CNN.

FOX 5 (Washington, D.C.): Parents push for Congress to address Snapchat drug dealers: Parents testified this week at a House hearing on Capitol Hill where they called on both Congress and tech companies to do more to fight the opioid crisis in this country. With the rise of overdoses involving children, lawsuits are now being filed against social media companies, such as Snapchat, for putting children in danger. Parents of some of these teens are putting increased pressure on lawmakers and these social media platforms to put better measures in place to stop online drug dealers from gaining access to kids.

NBC Chicago: Illinois School Warns Parents About App That Puts Students in Potential Stranger Danger: An Illinois school put out a warning to parents surrounding a social media app that school officials believe many students are using and could be putting them in dangerous situations with strangers. The free app, called Omegle, randomly pairs users with others from around the world to talk “one-on-one” anonymously. Users can add interests that will allow the app to pair them with someone who shares similar interests.

KLBK-TV (Texas): Republican congressman calls for nationwide social media ban for kids, teens: A Republican congressman says social media is so harmful for kids and teens that they should be banned from using it, just like kids aren’t allowed to drink or smoke. Congressman Chris Stewart says he hasn’t officially introduced his bill to ban social media for kids under 16 because he’s working on building up support behind the scenes first. “It’s destroyed their sense of self-worth, and their confidence and their sense of hope in the future,” Rep. Chris Stewart (R-UT) said. Studies show social media leads to an elevated risk of depression and suicide. As Stewart noted, “nearly a third of our young people age 14-24 have considered suicide and have discussed how they would commit suicide with a friend.”

Deseret News: Should children under 16 be denied access to social media apps?: Tweens and teens spend as much as nine hours a day scrolling through social media, gaming, online shopping, video chatting and texting on their cell phones. And an increasing amount of evidence suggests all that screen time is taking a toll on their mental health. “The statistics are clear we’ve got a generation of young people that are the most distressed, anxious, depressed and tragically suicidal than any generation in our history,” said Rep. Chris Stewart, who was recently named co-chairman of the bipartisan Mental Health Caucus in Congress. The rise in anxiety and depression, he says, can be almost directly correlated to when Facebook bought Instagram in 2012 and began marketing initially to girls and then boys as young as 9. The Chinese app TikTok, he said, was designed as “emotional heroin” for young people.

WGN-TV (Chicago): Suburban man arrested for kidnapping 3 Ohio children, use of social media: A Beach Park man is facing charges for kidnapping three Ohio children after communicating with them through an online platform for weeks. Michael Negron, 19, is being charged with one count of kidnapping and three counts of child endangerment, according to the Lake County State’s Attorney’s Office. It is still unclear what Negron’s intentions were. According to police reports, a parent from Middleton, Ohio, called the Lake County Sheriff’s Office Saturday afternoon about their missing children, girls ages 12 and 14, and their friend, a 15-year-old boy.

WANE-TV (Indiana): Deadly social media ‘blackout challenge’ resurfaces, more child deaths reported: The resurgence of a social media trend has become a nightmare for several families who have lost children to the “game,” with reports of more children dying. The “blackout challenge,” also known as the “choking game” or “pass-out challenge,” encourages users to choke themselves with belts, purse strings or other similar items until passing out. It dates back to at least 2008, when the Centers for Disease Control and Prevention noted that 82 children across 31 states died from the mid-1990s to the mid-2000s as a result. Most of the kids who died were between 11 and 16.

The Salt Lake Tribune: Why Utah Gov. Cox and AG Reyes plan to sue social media companies: Utah Gov. Spencer Cox, alongside Utah Attorney General Sean Reyes, announced that the state would take legal actions against social media companies to address, they say, the harm that digital platforms are doing to the mental health of Utah’s youth. “Without strong action on our part, social media companies will simply not make the changes necessary to protect our children,” Cox said in a news conference on Monday. He alleged that social media apps are designed so that users won’t want to put them down. Neither Cox nor Reyes would specify which social media companies would be sued or what particular claims potential litigation would address. No lawsuits have been filed at this time.

PC Mag: The Most Toxic Online Platforms: Are Your Kids on Them?: Kids are now born into a world with social media, as well as a tangled web of images, games, users, and algorithms that make it nearly impossible for parents to know everything they’re doing. A new study by ExpressVPN asked over 2,000 children in the US and the UK about the biggest issues they’re facing online and on which platforms. The top problems kids reported experiencing are somebody being rude or swearing at them (34%), seeing scary videos (31%), and seeing scary photos (26%). Their parents, roughly 2,000 surveyed adults, gave slightly different answers.

KUTV-TV: Utah parents support social media ban after video of child’s attack posted online: Kylee and Adam Taylor said their daughter was brutally attacked at her own Utah school twice, and in one instance, video of the assault made the rounds on Instagram and TikTok. Now, the Taylors strongly support Congressman Chris Stewart’s proposal for a federal ban on social media for children younger than 16. “Her lips were cut up, bruising on her face,” said Kylee, of her daughter’s injuries. “Both times she was checked for concussions.” “She was punched, kicked, grabbed her hair, threw her to the ground,” added Adam. “It’s traumatic, especially when you get the call and your daughter is crying.”

KUTV-TV: Utah lawmaker to introduce new bill on federal social media ban for teens under 16: Congressman Chris Stewart did not head to Washington to solve our nation’s mental health crisis, but it has become one of his areas of focus. Six months ago, Rep. Stewart’s bill designating 9-8-8 as the universal number for the National Suicide Prevention Lifeline was signed into law. It took two years to get the bill passed and the hotline ready for callers. The nation’s children and teens are his latest focus, with a new bill set to be released next week. The bill seeks to ban children under the age of 16 from using social media sites like Facebook, TikTok and Instagram.

The Wall Street Journal: Op-ed: Republicans and Democrats, Unite Against Big Tech Abuses: The American tech industry is the most innovative in the world. I’m proud of what it has accomplished, and of the many talented, committed people who work in this industry every day. But like many Americans, I’m concerned about how some in the industry collect, share and exploit our most personal data, deepen extremism and polarization in our country, tilt our economy’s playing field, violate the civil rights of women and minorities, and even put our children at risk. As my administration works to address these challenges with the legal authority we have, I urge Democrats and Republicans to come together to pass strong bipartisan legislation to hold Big Tech accountable. The risks Big Tech poses for ordinary Americans are clear. Big Tech companies collect huge amounts of data on the things we buy, on the websites we visit, on the places we go and, most troubling of all, on our children. As I said last year in my State of the Union address, millions of young people are struggling with bullying, violence, trauma and mental health. We must hold social-media companies accountable for the experiment they are running on our children for profit.

FOX 5: VIDEO: Managing stress, anxiety, and screen time for children: Stress and anxiety can have negative impacts on your children physically, mentally, and emotionally. Plus, while social media can make people feel more connected, too much screen time can lead to health concerns like sleep or behavioral issues. Child psychologist Dr. Joseph McGuire of the Johns Hopkins Children’s Center joined Fox 45 News with tips for parents to help their children navigate stress and anxiety while also managing screen time.

WHNT-TV: Deadly social media ‘Blackout Challenge’ resurfaces, nine children die: A social media trend has become a nightmare for several families after losing their children to the “game” – with at least nine children under the age of 14 dying for the dare of “how long can you hold your breath.” The “Blackout Challenge,” also known as the “Choking Game” or “Pass-Out Challenge,” dates back to at least 2008, when the Centers for Disease Control and Prevention reported that 82 children across 31 states had died as a result; most of those who died were between 11 and 16. In 2021, the “challenge” resurfaced on TikTok, which led the viral video app to ban #BlackoutChallenge from its search engine. The social media giant is already facing a wrongful death lawsuit after a 10-year-old Italian girl was declared brain dead. She had allegedly tied a belt around her throat to self-asphyxiate.

Fortune: Is America overreacting to TikTok with all of its new bans at high schools and colleges? Probably not.: A growing number of public schools and colleges in the U.S. are moving to ban TikTok – the popular Chinese-owned social media app that allows users to share short videos. They are following the lead of the federal government and several states, which are banning the social media app because authorities believe foreign governments – specifically China – could use the app to spy on Americans. The app was created by ByteDance, which is based in China and has ties to the Chinese government. The University of Oklahoma, Auburn University in Alabama and 26 public universities and colleges in Georgia have banned the app from campus Wi-Fi networks. Montana’s governor has asked the state’s university system to ban it.

Rice University: Three out of four parents say social media is a major distraction for students, according to new study: The vast majority of parents believe social media is a major distraction for students, according to a new nationwide study. The online study, conducted in November and December, surveyed a nationally representative sample of more than 10,000 parents of K-12 students. An overwhelming majority from across racial groups—African American (70%), Asian (72%), white (75%), Hispanic/Latino (70%)—agreed that social media is a distraction. Parents of children who attend private schools (82%) were more likely to see social media as a distraction than parents of children in public schools (73%) or charter schools (73%) or those being homeschooled (67%). Interestingly, parents with children in high school (74%), middle school (73%) and elementary school (73%) were equally concerned about the issue.

WCIV-TV (South Carolina): School district warns parents on the possible dangers of social media: Monitoring social media starts at home. That’s the message Berkeley County School District is sending to its students’ parents. The Berkeley County School District’s Office of Security and Emergency Management has hosted several informational meetings on the possible dangers of social media. Parents learn they are the gatekeepers to their child’s electronic experience. “This is part of life. It’s not going anywhere. It’s here to stay,” said Cheretha Kinlaw-Hickman, Security and Emergency Management Officer with BCSD. “And if you’re going to use it, we just want to be responsible and safe in how we use these social media apps and being online in general.”

CBS News: New phone allows parents to see everything their kids do online: A company says it has a solution for parents giving phones to their children for the first time. It’s a custom-built Android device called Aqua One from the company Cyber Dive. The specially made phone gives parents the ability to track everything their kids do online. Using an app on their own phones, parents can track a mirrored version of their child’s phone. That means parents can see every text their child types, what videos they are watching and which social media apps they are using. Creator Jeff Gottfurcht says there are just too many apps out there that have become a danger to kids and Cyber Dive’s phone will allow parents and their kids to have an open dialogue about what’s safe and what’s not.

Chalkbeat: As Seattle schools sue social media companies, legal experts split on potential impact: A notable new lawsuit against social media industry leaders by the Seattle school district has left legal experts divided on how the case will unfold. The complaint — which alleges that the school district and its students have been harmed by social media’s negative effects on youth mental health — could lead to sweeping changes in the industry, one expert said. Or, as others expect, it could fizzle out with little chance of winning in court. Seattle Public Schools alleges that the companies — which include Meta, Google, Snapchat, and ByteDance, the company behind TikTok — designed their platforms intentionally to grow their user bases and “exploit the psychology and neurophysiology of their users into spending more and more time on their platforms,” according to a complaint filed earlier this month.

Roll Call: White House, House GOP take aim at Big Tech, but see different targets: President Joe Biden and Republican lawmakers last week launched yet another effort to confront thorny issues relating to Big Tech and social media platforms that have bedeviled previous administrations and Congress, but the path to progress this time around is just as murky. In two high-profile opening salvos of the 118th Congress, the two sides showed how far apart they are starting. Aside from a glimmer of overlap on protections for minors and the market power of the big tech companies, the two sides aren’t offering much promise of legislation. Biden used a Jan. 11 op-ed in The Wall Street Journal to call on Congress to pass federal data privacy legislation, especially to protect children, and prevent ads targeting them, modify U.S. law on social media content moderation policies, and change antitrust policy to bring more competition into the tech industry.

The Wall Street Journal: The U.K.’s Online Safety Bill aims to better protect adults and children from viewing certain online content.: British legislators are set to approve a draft of an extensive new social-media bill that could see the chief executives of major tech firms held criminally liable if they don’t protect children from certain content online. As the U.K. moves closer toward enacting new legislation that technology companies say is too restrictive, its Online Safety Bill aims to better protect adults and children from viewing certain online content, including fraud, revenge porn and sexual abuse. The proposed law, expected to win approval from the House of Commons this week, will force tech companies to remove content deemed illegal or content that is barred by their own terms and conditions, or face fines or legal action. The bill would then go to the U.K.’s upper chamber, the House of Lords, in February, where it could be revised further, and become law by year-end.

GeekWire: Audio: Seattle Schools vs. Social Media: What’s at stake in the suit against TikTok, Instagram, and others: As a tech reporter based in Seattle, I certainly took notice, and I wasn’t alone. After GeekWire broke the story last weekend, it made national news. Here are some of the key points to know: Seattle Public Schools is suing the social media giants for damages stemming from what the suit describes as a youth mental health crisis in Seattle and across the country. That crisis, the suit alleges, has been caused by the deliberate actions of the companies in deploying algorithms designed “to maximize engagement by preying on the psychology of children.”

NPR: AUDIO INTERVIEW: Why 2 Seattle area school districts are suing 5 social media companies: The school districts allege that the companies’ practices have led to increased anxiety, depression, eating disorders and bullying among children.

Seattle Times: Opinion: Seattle schools take social media giants to court: Social media can often be more aptly characterized as antisocial media. Purveyors of conspiracy theories, misinformation, misogyny, white supremacy and antisemitism thrive in these supposedly sociable swaths of the Internet. Beyond toxic politics, social media has also become a 21st century venue for teenage bullies and bad boyfriends, mean girls and malicious rumors. The often fragile psyches of adolescents do not always fare well in this toxic online environment, and many people blame social media for a big spike in cyberbullying, prolonged depression and suicide attempts among young Americans.

Ars Technica: Schools sue social networks, claim they “exploit neurophysiology” of kids’ brains: Seattle schools argue that defendants are not protected by Section 230 of the Communications Decency Act, which says providers of interactive computer services cannot be treated as the publisher or speaker of information provided by third parties. Seattle schools are not claiming that the social networks are publishers, the lawsuit said. “Plaintiff is not alleging Defendants are liable for what third parties have said on Defendants’ platforms but, rather, for Defendants’ own conduct,” the lawsuit said. “As described above, Defendants affirmatively recommend and promote harmful content to youth, such as proanorexia and eating disorder content. Recommendation and promotion of damaging material is not a traditional editorial function and seeking to hold Defendants liable for these actions is not seeking to hold them liable as a publisher or speaker of third-party content.”

TODAY SHOW: Teens love the anonymous new Gas app: Here’s what parents should know: Teens can anonymously see who likes them, and more, on the hottest new social app for students.: There’s a new social media app captivating teens. Using the Gas app, users can anonymously compliment their friends (or secret crushes), and the app is gaining steam among young users. NBC News correspondent Savannah Sellers reports on TODAY that 1 in 3 teens are using the app and more than 1 billion compliments have been shared, according to Gas app founder Nikita Bier. So, how does it work? Gas app users can log on and compliment, or “gas up,” their friends. Users take a series of polls about their friends, with questions ranging from thoughtful to flirty. “You sign up, join your high school and/or you sync your contacts, so we can find your friends,” Bier told Sellers. Bier says people have drawn comparisons to other anonymous apps that are plagued by bullying. “The distinction with Gas is that we author all the content so that you’re answering polls that are generally uplifting and positive, and that’s kind of the aim of the product,” Bier says.

ABC News: School district sues social media giants for ‘creating a youth mental health crisis’: Seattle Public Schools filed a lawsuit against Alphabet Inc., Meta Platforms, Inc., Snap Inc. and TikTok-owner ByteDance.: Seattle Public Schools, the largest school district in the state of Washington, filed a lawsuit Friday against multiple social media giants, in an effort to hold the companies “accountable for the harm they have wreaked on the social, emotional, and mental health of its students,” the district claimed. “It has become increasingly clear that many children are burdened by mental health challenges. Our students — and young people everywhere — face unprecedented learning and life struggles that are amplified by the negative impacts of increased screen time, unfiltered content, and potentially addictive properties of social media,” Seattle Public Schools superintendent Brent Jones said in a statement. “We are confident and hopeful that this lawsuit is the first step toward reversing this trend for our students, children throughout Washington state, and the entire country.”

Axios: Social media’s effects on teen mental health come into focus: Experts are increasingly warning of a connection between heavy social media use and mental health issues in children — a hot topic now driving major lawsuits against tech giants. Why it matters: Seattle Public Schools’ recently filed lawsuit against TikTok, Meta, Snap and others — which accuses the social media giants of contributing to a youth mental health crisis — is one of hundreds of similar cases. Driving the news: Some scientists who study technology’s effects on children say the negatives far outweigh any positives. “There is a substantial link to depression, and that link tends to be stronger among girls,” Jean Twenge, a psychology professor at San Diego State University and leading expert on the subject, tells Axios.

Axios: Podcast (Transcript): The escalating fight over Big Tech and kids: Seattle Public Schools filed a lawsuit accusing Big Tech of helping cause a youth mental health crisis. It’s going after TikTok, Meta, Snap and other companies in one of many cases that seek to hold social media platforms responsible for harm to children. Guests: Axios’ Ashley Gold, Sophia Cai and Andrew Freedman. NIALA: Good morning! Welcome to Axios Today! It’s Wednesday, January 11th. I’m Niala Boodhoo. Here’s what we’re covering today: more deaths in California as winter storms rage on. Plus, what we know about the classified documents found from Biden’s VP days. But first: the escalating fight over Big Tech and kids. That’s today’s One Big Thing.

The New York Times: Three-Quarters of Teenagers Have Seen Online Pornography by Age 17: Sexually explicit content has become so prevalent online that teenagers are deluged, according to a new report by a nonprofit child advocacy group.: The internet has transformed pornography, making it much easier to view and share than in the days of Playboy magazine and late-night cable television. For teenagers, that’s created a deluge of sexually explicit photos and videos that has invaded their everyday lives, according to a report released on Tuesday. Three-quarters of teenagers have viewed pornography online by the age of 17, with the average age of first exposure at age 12, according to the report by Common Sense Media, a nonprofit child advocacy group. Teenagers are seeing the photos and videos on their smartphones, on their school devices and across social media, pornography sites and streaming sites, it said.

Reuters: Seattle public schools blame tech giants for social media harm in lawsuit: Seattle’s public school district filed a lawsuit against Big Tech claiming that the companies were responsible for a worsening mental health crisis among students and directly affected the schools’ ability to carry out their educational mission. The complaint, filed on Friday against Alphabet Inc, Meta Platforms Inc, and TikTok-owner ByteDance with the U.S. District Court, claimed they purposefully designed their products to hook young people to their platforms and were creating a mental health crisis. In emailed statements to Reuters, Google said it has invested heavily in creating safe experiences for children across its platforms and has introduced “strong protections and dedicated features to prioritize their well being,” while Snap said it works closely with many mental health organizations to provide in-app tools and resources for users and that the well-being of its community is its top priority. Meta Platforms and TikTok did not immediately respond to Reuters’ request for comment. In the past, the companies have said they aim to create an enjoyable experience for users and exclude harmful content and invest in moderation and content controls.

AP: Seattle schools sue tech giants over social media harm: The public school district in Seattle has filed a novel lawsuit against the tech giants behind TikTok, Instagram, Facebook, YouTube and Snapchat, seeking to hold them accountable for the mental health crisis among youth. Seattle Public Schools filed the lawsuit Friday in U.S. District Court. The 91-page complaint says the social media companies have created a public nuisance by targeting their products to children. It blames them for worsening mental health and behavioral disorders including anxiety, depression, disordered eating and cyberbullying; making it more difficult to educate students; and forcing schools to take steps such as hiring additional mental health professionals, developing lesson plans about the effects of social media, and providing additional training to teachers.

Good Morning America (ABC News): Social media use linked to brain changes in teens, study finds: A new study has identified a possible link between frequently checking social media and brain changes that are associated with having less control of impulsive behaviors among young adolescents. Using MRI brain scans, researchers at the University of North Carolina found that teens who frequently checked social media were more likely to see increased activation in the regions of the brain that regulate reward centers and those that may play a role in regulating decision-making around social situations. The study, published Tuesday in JAMA Pediatrics, looked at nearly 200 young people in sixth and seventh grades.

WTVD-TV (North Carolina): VIDEO: Social media is changing how children’s brains develop, UNC researchers find: Researchers at the University of North Carolina released the results of one of the first ever long-term studies on child brain development and technology use. The study specifically looked at middle school students in North Carolina and the impact social media had on their brain development. Researchers said the evidence shows constant checking of a social media feed increased sensitivity to peer feedback. The 169 students underwent yearly brain imaging sessions over three years, which showed researchers that the children had become hypersensitive to feedback from their peers. The researchers published their results in JAMA Pediatrics. Ultimately, what this means for the future of social media and childhood development remains unclear. Even the authors of the study said the results are not necessarily good or bad.

The New York Times: Social Media Use Is Linked to Brain Changes in Teens, Research Finds: The effect of social media use on children is a fraught area of research, as parents and policymakers try to ascertain the results of a vast experiment already in full swing. Successive studies have added pieces to the puzzle, fleshing out the implications of a nearly constant stream of virtual interactions beginning in childhood. A new study by neuroscientists at the University of North Carolina tries something new, conducting successive brain scans of middle schoolers between the ages of 12 and 15, a period of especially rapid brain development.

The Hill: Study finds social media use may impact youth brain development: Advocates and parents have raised concerns about the potential health effects of social media on teens and children for years. A new study carried out in rural North Carolina shows habitually checking social media platforms may lead to long-term changes in adolescent brain development. Specifically, researchers found different social media checking habits were linked with changes in youths’ brains, altering how they respond to the outside world. Data suggest those who checked the sites and apps more than 15 times per day became hypersensitive to peer feedback.

Psychology Today: 5 Ways Parents Can Keep Kids Safe Online: The metaverse, artificial intelligence, virtual and augmented reality, ChatGPT; new technologies are coming in faster than a parent can say, “Put down that phone!” Rather than anguishing over what you may or may not know about these digital innovations, here are five easy ways to help keep your kids safe in 2023. 1. Stop focusing on “screen time.” Focus on “screen use” instead. During every presentation I gave last year, parents were laser-focused on one concern: “screen time.” I sincerely hope we move past this in 2023 because focusing on “time” rather than “use” disregards so many benefits of technology. For example, using a screen to do research or to say “hi” to Grandma is vastly different from doom-scrolling endless TikTok videos (although this might be “educational” too, but more on that in a moment). I don’t believe there is a parent on the planet who wants their child missing out on doing online research or visiting with a geographically-distant relative.

Patriot-News: Op-ed: Prioritize your family’s digital wellness this holiday season: The holiday season presents parents with unique challenges. From festive celebrations like office parties, holiday light displays, last-minute shopping trips, and concerts, most families’ schedules are jam-packed with activities right now. After navigating through the COVID-19 pandemic, the hectic pace of the current holiday season is, for many, a welcome return to normalcy. However, this time can also serve as a catalyst for stress. According to a Dec. 1 poll from the American Psychiatric Association, 31 percent of adults admitted that they expect to feel more stressed this holiday season than last year. When adults are stressed, family routines often fall by the wayside. As a father, I understand that adhering to structure can be very difficult this time of year, especially when children get extended time off from school for the holidays.

Forbes: VIDEO: Child Online Privacy Protections Cut From Congress’ Spending Bill— Despite Last-Minute Push: A pair of bills designed to strengthen online protections for children was left out of a fiscal year 2023 spending plan Congress is aiming to pass this week, despite heightened concerns about online privacy and an advocacy campaign by parents whose children’s deaths have been tied to Internet activity.

Huffington Post: How To Ask People Not To Share Photos Of Your Kids On Social Media: The digital record of a child born this century often begins before birth, when a parent shares a grainy sonogram image. By the time the child is old enough to open their own social media accounts, there may already be hundreds of images of them throughout cyberspace, searchable by name, geotag location and facial recognition technology. But an increasing number of parents are opting out of this “sharenting” norm of documenting all of their child’s milestones on social media. They may post no photos of their child at all or only photos in which their child’s face is not visible. Some parents block out their child’s face in group photos or make public requests that others not post images of their child.

Los Angeles Times: Column: Social media platforms must stop the exploitation of child performers. Now: YouTube has a major child labor problem. Just read Amy Kaufman and Jessica Gelt’s recent Times investigation into the lawsuit facing YouTube star Piper Rockelle and her mother, Tiffany Smith. Instagram and TikTok have child labor problems too, as do any social media platforms from which children (and their parents) derive income. As should be self-evident, when people make money on these platforms, “social” takes a backseat to “media.” When kids make money by producing content for a media company in California, they are — or should be — protected by the state’s laws, which mandate, among other things, limited hours, on-site education and a state-licensed teacher or social worker present on set at all times.

Newsweek: Op-ed: Teen Social Media Screen Time Should Concern Parents: Smartphones have always posed a range of challenges for parents of teens. From social media apps and excessive screen time to explicit content and mental health problems, the digital world often seems as threatening as the physical. A new Pew Research study shows that when it comes to teens and their smartphone use, parents might be worried too much about certain problems, and not worried enough about others. The study shows that about half (46 percent) of parents of teens are worried about their teen being exposed to explicit content online. This is a valid concern, of course. Adults know explicit content is ubiquitous online and can be damaging to see. But there are ways to mitigate the spread of explicit content, from changing the settings in their kids’ phone to preventing and monitoring such content with apps like Bark.

Press Release: Department of Justice – U.S. Attorney’s Office – Western District of Pennsylvania: The United States Attorney’s Office for the Western District of Pennsylvania, in partnership with Homeland Security Investigations – Philadelphia (HSI), the Federal Bureau of Investigation – Pittsburgh (FBI), and the National Center for Missing and Exploited Children (NCMEC), is issuing a public safety alert regarding an alarming increase in the online exploitation of children and teens. Reports of the online enticement of minors have dramatically spiked in recent months—including reports of sextortion.

CBS Evening News: When should you get your child a cellphone?: Cellphones are a popular gift during the holiday season, but the debate remains: What’s the best age for your child’s first phone? Craighton and Emily Berman are considering getting their 12-year-old son, Henry, a cellphone. He’s only allowed one hour of recreational screen time per day on the computer. “My wife and I have been kind of struggling with it,” Craighton Berman said. “Because there’s a lot packed into that phone. We all know digital technology and social media kind of destroys us. So I’m just trying to figure out how to destroy him a little less.”

PEW Research: Teens and Cyberbullying 2022: Nearly half of U.S. teens have been bullied or harassed online, with physical appearance being seen as a relatively common reason why: While bullying existed long before the internet, the rise of smartphones and social media has brought a new and more public arena into play for this aggressive behavior. Nearly half of U.S. teens ages 13 to 17 (46%) report ever experiencing at least one of six cyberbullying behaviors asked about in a Pew Research Center survey conducted April 14-May 4, 2022. The most commonly reported behavior in this survey is name-calling, with 32% of teens saying they have been called an offensive name online or on their cellphone. Smaller shares say they have had false rumors spread about them online (22%) or have been sent explicit images they didn’t ask for (17%).

New York Post: My kids were digitally kidnapped — here’s how parents can be more careful: Mommy bloggers beware. Mother of two Meredith Steele, 35, is warning parents to stop sharing photos of their children online after her family was “digitally kidnapped.” The terrifying phenomenon occurs when a stranger steals a parent’s social media snaps to use on their own accounts and live out a fake life online. “My kids had new names and new identities,” Steele said of the ordeal. “They [the culprit] had made their own captions and made their own lives. It was like they were playing with Barbie dolls but the dolls were my kids.” “This changed my mind about sharing my stuff online,” the Maine mama told South West News Service. “Mommy blog culture normalizes oversharing intimate personal details of your kids and they aren’t old enough to agree or disagree with it.”

NBC News: Democratic senator questions Twitter’s handling of child safety under Elon Musk: Senate Judiciary Committee Chair Dick Durbin, D-Ill., sent a letter to tech billionaire Elon Musk on Friday expressing concern that Twitter’s approach to child safety had “rapidly deteriorated” since Musk bought the social media site in October. The letter follows reports from several news outlets, including NBC News, about Musk’s eliminating the jobs of people at Twitter who worked to prevent child sexual exploitation and disbanding a board of outside experts who advised Twitter on its efforts to address exploitation. Durbin wrote that he was not convinced by Musk’s recent pledge that addressing child sexual exploitation content was “Priority #1.”

PEW Research: Explicit content, time-wasting are key social media worries for parents of U.S. teens: Parents have a range of concerns when it comes to their teenagers using social media, with access to explicit content and time-wasting ranking among those at the top of the list, according to a Pew Research Center survey of parents of teens ages 13 to 17 conducted this spring. The survey also shows that a majority of parents are keeping a watchful eye on what their teens do on social media. Some are also imposing screen time restrictions on these sites. While social media has allowed people to easily seek out information, some say it has also made inappropriate and explicit content more accessible. Nearly half of parents of teens (46%) say they are extremely or very worried that their teen’s use of social media could lead to them being exposed to explicit content, according to the April 14-May 4, 2022, poll.

The Hill: Governors in Iowa, North Dakota and Alabama join GOP colleagues in banning TikTok for state employees: The Republican governors of three more states have joined the growing number of GOP governors who are banning TikTok among state government employees amid security concerns about the Chinese-owned social media platform. Alabama Gov. Kay Ivey, North Dakota Gov. Doug Burgum and Iowa Gov. Kim Reynolds each signed executive orders in the past two days to ban the app from state-owned devices. Republican governors in Maryland, South Dakota, Texas and Utah have already taken action to ban TikTok for state employees’ devices.

The New York Times: Research finds more negative effects of screen time on kids, including higher risk of OCD: A new study suggests that reliance on devices may hinder children’s ability to learn to regulate their emotions. Another linked video game use to a risk of obsessive-compulsive disorder.: Two new studies show associations between screen time and behavioral and psychological risks for children, adding to a growing body of evidence that excessive use of smartphones and other devices can be deleterious to their health. In one study, researchers reported a link between screen time and higher rates of obsessive-compulsive disorder diagnoses among preteens. In the other, the results suggested that using electronic devices to calm youngsters when they’re upset may inhibit their ability to learn to soothe themselves, leading to more frequent, intense emotional outbursts.

The New York Times: How to Use Parental Controls on Your Child’s New Phone: The holiday season is here, and if you’ve decided to give in and get your child a smartphone or tablet, you may be nervous about safety, supervision and screen time. Software can’t solve everything, but it can help. Here are a few of the tools available to help parents or caregivers guide children’s first solo steps into the digital age. First, Set the Rules.

CBS News (Minnesota): What are the concerns about using TikTok? Should parents tell their kids to delete it?: A popular app for entertainment and news is now banned on government devices in Maryland, Nebraska, South Carolina, South Dakota and Texas. Public employees in those five states can’t have TikTok on their work phones, computers or tablets. The reason is for security concerns, given TikTok’s owner – ByteDance – is a Chinese company. The FBI is also sounding the alarm about the social media platform. Aynne Kokas, an author and the director of the University of Virginia East Asia Center, broke down some of the concerns. “The first is the type of data that TikTok, as an app, is able to gather about our usage of the technologies,” Kokas said.

60 Minutes (CBS News): VIDEO: More than 1,200 families suing social media companies over kids’ mental health: When whistleblower Frances Haugen pulled back the curtain on Facebook last fall, thousands of pages of internal documents showed troubling signs that the social media giant knew its platforms could be negatively impacting youth and were doing little to effectively change it. With around 21 million American adolescents on social media, parents took note. Today, there are more than 1,200 families pursuing lawsuits against social media companies including TikTok, Snapchat, YouTube, Roblox and Meta, the parent company to Instagram and Facebook. More than 150 lawsuits will be moving forward next year. Tonight, you’ll hear from some of the families suing social media. We want to warn you that some of the content in this story is alarming, but we thought it was important to include because parents say the posts impacted their kids’ mental health and, in some cases, helped lead to the death of their children.

60 Minutes (CBS News): Meet the teens lobbying to regulate social media: When Emma Lembke was a 12-year-old 6th grader, she was excited to join the world of social media. Here was a way to connect instantly to millions of people around the globe from her home in Birmingham, Alabama, she thought. Lembke was eager to express herself through an online persona and explore new information that she otherwise would not have access to. She first signed up for Instagram, and in the first week, she followed Oprah and the Olive Garden.

Forbes: Twitter Has Cut Its Team That Monitors Child Sexual Abuse: Even as Elon Musk has said that removing child sexual exploitation content from Twitter was “Priority #1,” the teams charged with monitoring for, and subsequently removing such content have been reduced considerably since the tech entrepreneur took control of the social media platform. Bloomberg reported last month that there are now fewer than 10 people whose job it is to track such content – down from 20 at the start of the year. Even more worrisome is that the Asia-Pacific division has just one full-time employee who is responsible for removing child sexual abuse material from Twitter.

The Washington Post: Indiana sues TikTok, claiming it exposes children to harmful content: Indiana’s attorney general sued TikTok on Wednesday, claiming the Chinese-owned company exposes minors to inappropriate content and makes user data accessible to China, in one of the strongest moves against the social media giant taken by a state. Indiana’s lawsuit is the latest move to put TikTok and its parent company under scrutiny. As U.S. officials have sought to regulate TikTok, the platform in recent years has come under sharp questioning in Washington and been under investigation by a bipartisan group of attorneys general for its potential effects on youth mental health, its data security and its ties to China.

Forbes: Amazon Alexa Wants To Put Your Child To Bed With Generative AI Storytelling: While researchers applaud Amazon’s safeguards to ensure the tech is safe for kids, some experts are concerned that generative AI could lead children to believe these algorithms are more intelligent than they actually are. Generative AI, which is known for churning out fantastical art based on text prompts, is now sneaking into one of the most sacred bonding experiences for parents and children: bedtime storytelling.

CNBC: Op-ed: I raised 2 successful CEOs and a doctor. Here’s the No. 1 skill I wish more parents taught their kids today: Parenting expert: The No. 1 thing every parent should teach their kids. Developing skills like curiosity, kindness and emotional intelligence at a young age will help kids succeed as adults. But there’s one skill that parents aren’t teaching their kids enough of today: self-regulation. When kids learn to self-regulate, they better understand the importance of time and how to manage their own behaviors and actions. 1. Model a healthy relationship with technology. Think of the last time you were eating lunch while typing an email while listening to a podcast and checking your phone each time it dinged. We’ve all been there.

Los Angeles Times: How parents can help protect children from online catfishing and other digital dangers: The family of the Riverside teen girl who was tricked into a digital romance with a “catfishing” cop from Virginia wants their devastating story to be a cautionary tale. “In this tragic moment of our family, our grief, we hope some good will come from this,” Michelle Blandin, the teen’s aunt, said this week. “Parents, please, please know your child’s online activity. Ask questions about what they’re doing and whom they are talking to; anybody can say they’re someone else.” Such incidents are too common, say experts who hope this one will serve as a reminder to parents about having important conversations early and often with children about online conduct. That is the best way, they say, to protect youth from the many dangers that can lurk on the internet, from both known and unknown predators, cyberbullying, sexual exploitation and other concerns. When should parents start talking about online safety?

Forbes: Our Kids’ Brains Hurt From Using Technology: The American Academy of Pediatrics recommends less than two hours of entertainment screen time per day for children and discourages the use of any screen media by children under two years of age. The psychology research bucket has been overflowing the last few years with indictments of technology and its deleterious impact on our mental and emotional well-being. Brain research and mental health studies are dovetailing on the conclusion that screen time—particularly social media use—is stressing our brains, specifically the engine of computation and mental functioning: the prefrontal cortex.

The Hill: Three things Congress should do now to protect kids and teens: The pandemic increased the amount of time kids and teens spend online, but some worry about the effects of media and technology on their outlook. With the start of the lame-duck session, Congress has a long to-do list in a short period of time. Among the important items that need immediate attention, Congress should not go home without making the internet a safer and healthier place for kids and teens. To their credit, committees in both the House and the Senate have dedicated time and energy to online privacy, health and safety over the past two years. There have been hearings and bipartisan markups, and the 117th Congress has gotten closer to passing comprehensive privacy legislation than any other. Still, Congress appears stuck when it comes to establishing guardrails for social media platforms.

NBC News: Ex-Virginia trooper dies in shootout after killing family of teen he had catfished, police say: A Virginia law enforcement employee was killed in a shootout with deputies in California after he allegedly killed the mother and grandparents of a teenage girl he had catfished online, police said Sunday. Austin Lee Edwards, a former trooper with the Virginia State Police who was working for the Washington County Sheriff’s Office, was accused of driving off with the girl after the killings in the Southern California city of Riverside on Friday, police said. It wasn’t clear if Edwards, 28, was a sworn officer when he allegedly killed 69-year-old Mark Winek; his wife, 65-year-old Sharie Winek; and their daughter, 38-year-old Brooke Winek. Washington County Sheriff Blake Andis did not immediately respond to a request for comment.

Pew Research Center: Connection, Creativity and Drama: Teen Life on Social Media in 2022: Society has long fretted about technology’s impact on youth. But unlike radio and television, the hyperconnected nature of social media has led to new anxieties, including worries that these platforms may be negatively impacting teenagers’ mental health. Just this year, the White House announced plans to combat potential harms teens may face when using social media.

The New York Times: Children’s Groups Want F.T.C. to Ban ‘Unfair’ Online Manipulation of Kids: My Talking Tom, an animated video game featuring a pet cat, is one of the most popular apps for young children. To advance through the game, youngsters must care for a wide-eyed virtual cat, earning points for each task they complete. The app, which has been downloaded more than a billion times from the Google Play Store, also bombards children with marketing. It is crowded with ads, constantly offers players extra points in exchange for viewing ads and encourages them to buy virtual game accessories.

Axios: Kids’ privacy online gets year-end push in Congress: Lawmakers from both parties who back stricter rules for handling kids’ data and accounts online see an opening in the last lame-duck weeks of this Congress. Why it matters: Passing a national online consumer privacy bill continues to be out of Congress’ reach, but protecting young people online has been one of the few areas in recent decades where Congress has been able to pass new tech regulations. Driving the news: According to advocates and lawmakers, two bills are best positioned to get rolled into big year-end legislative packages.

Forbes Health: Dear Pediatrician: What Is The Best Age For A Child’s First Smartphone?: Dear Pediatrician, My middle schooler really wants a smartphone, but I’m not so sure. He says that most of the kids in his class already have a phone, and he feels left out. I’m worried about him spending too much time on the phone. Plus, I’ve heard scary stories about kids sending inappropriate messages to one another. Is there a best age to give your child a smartphone? Dear Worried, Adding a smartphone to your child’s experience of the world is a big step. Having a supportive and thoughtful parent by their side increases their smartphone success. I commend you for thinking critically about when to introduce this tool to your child.

The Washington Post: Their kids’ deaths were tied to social media. They want Congress to act: Maurine Molak says her son David, then 16, took his own life after facing months of cyberbullying on social media platforms, which were slow to respond to their reports. “He could not make it stop. I couldn’t make it stop,” she said during an interview Tuesday.

Boston Globe: Teens and young adults are self-diagnosing mental illness on TikTok. What could go wrong?: Does Carly Smith have attention deficit hyperactivity disorder? She was tested as a child and the answer came back a definitive no. But this summer — battling anxiety and struggling to focus while working remotely in her Watertown apartment — she yearned for an explanation, and turned to a hot source of mental health info for teens and young adults: TikTok. There, Smith, 24, a junior account executive at a PR firm, found an ADHD influencer named Katie Sue, an appealing young woman with a big smile, a lot of what felt like answers, and — on her website — a link to make a donation.

FOX 59 (Indianapolis): Woman’s warning after online exploitation: A 19-year-old Indiana woman is recounting her traumatic experience of being sexually exploited as a child. The woman, who asked us to conceal her identity, was a victim of sextortion. She said she was just 12 years old when what seemed like innocent attention from strangers took a dark turn on the online chatting site Omegle. “They would just be like hey, how’s your day?” she explained. “Then after that, it would be straight to ‘what are you wearing?’” As with most sextortion cases, it progressed from talking to pictures to video chats. Oftentimes, they started the chat by showing their privates. She said she couldn’t tell how old some of them were, but estimates a lot of the men were in their 30s to 50s.

CNN Business: A guide to parental controls on social media: A little over a year ago, social media companies were put on notice for how they protect, or fail to protect, their youngest users. In a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat and Instagram faced tough questions from lawmakers over how their platforms can lead younger users to harmful content, damage mental health and body image (particularly among teenage girls), and lack sufficient parental controls and safeguards to protect teens.

Forbes: Protecting Our Children In Cyberspace: What Are We Missing?: With final election results rolling in, one of the less talked about yet vitally crucial issues is the safety and well-being of children in America – U.S. citizens without voting rights, whose voices are too often lost when it’s time to count the ballots. But that should not be the case. The last couple of months have been bustling with activity on the technology regulation front, with particular attention devoted to the protection of children in cyberspace. It started with the White House formally announcing its expansive federal tech policy reform, emphasizing the protection of young users. The US Supreme Court followed suit when, last month, it granted certiorari in Gonzalez v. Google, a high-stakes case appealed from the Ninth Circuit about the scope of protection Section 230 of the Communications Decency Act gives tech companies against liability for the content on their platforms.

Sky News (UK/Britain): Instagram age verification: Social media giant to use automated analysis of video selfies to allow some UK users to ‘prove their age’: From today, anyone who tries to edit their date of birth by changing it from under the age of 18 to over 18 will have to verify it by providing ID or a video selfie that will use age estimation technology.: Users of Instagram in the UK or EU will from now on see new age verification tools on the platform as part of a major safety update to protect children. From today, anyone who tries to edit their date of birth by changing it from under the age of 18 to over 18 will have to verify their age through ID or a video selfie, which will be examined by independent age estimation technology. Instagram said the new update would help ensure an age-appropriate experience for its users. Cyber safety campaigners have long been advocating for greater child protection, particularly after the Molly Russell inquest, which concluded last month that the 14-year-old girl died from an act of self-harm after being exposed to the “negative effects of online content”.

PEW Research: California’s New Child Privacy Law Could Become National Standard: A new California privacy law might fundamentally change how kids and teens use the internet — not only in California but also across the country. The first-in-the-nation legislation, which goes into effect in 2024, imposes sweeping restrictions on internet companies that serve minors, requiring that they design their platforms with children’s “well-being” in mind and barring eight common data-collection practices. Supporters of the bipartisan measure — including a range of privacy, consumer and children’s advocates — have compared it to longstanding consumer safety protections, such as seatbelts and nutrition labels. New York, Washington and West Virginia also have weighed child privacy bills, and Congress considered four such bills last year. While the Washington and West Virginia bills died in committee, the New York, Pennsylvania and federal bills remain under consideration.

The Hill: Advocates urge committee to advance Kids Online Safety Act: A joint letter sent by children’s online safety advocates urges Sen. Maria Cantwell (D-Wash.), chair of the Senate Committee on Commerce, Science, and Transportation, to advance the Kids Online Safety Act (KOSA). The letter was organized by Fairplay, ParentsTogether and the Eating Disorders Coalition and received more than 100 signatures from organizations and individuals concerned about the harmful impacts of social media on kids and teenagers. KOSA was first introduced in February 2022 and is sponsored by Sen. Richard Blumenthal (D-Conn.) and 11 others. In the letter, advocates call on Cantwell to “publicly commit to moving KOSA (S.3663) as part of the omnibus spending bill before the end of the current session” and request that she take time to talk to parents about the issue.

NBC News: Their children went viral. Now they wish they could wipe them from the internet: During the early months of the pandemic, Kodye Elyse started posting what she described as “normal mom quarantine content” on TikTok. Kodye Elyse, a cosmetic tattoo artist, said she “really wasn’t on social media” before then so she barely had any followers. Since her videos weren’t getting many views, she felt it “wasn’t a big deal” to have a public account to showcase their family life during lockdown, with many of the videos featuring her and her daughters dancing around the house. But the overwhelming response to one of Kodye Elyse’s first viral videos “convinced” her to take her kids offline entirely. The video started with Kodye’s then 5-year-old daughter. She then swapped places with Kodye Elyse to the beat of the music, and with a clever edit, appeared to transform into her mother.

The Deseret News (Utah): Op-ed: More tech, less teen happiness: the link between depression and tech use is especially troubling for children in nontraditional families, our new study found: Our teens are in crisis. The share of American high school students reporting “persistent feelings of sadness or hopelessness” has increased to nearly half of youth, according to the Centers for Disease Control and Prevention. That troubling news came on the heels of a report from Harvard’s Human Flourishing Program that the well-being of young adults has dramatically declined compared to older age groups. A host of factors are driving our kids to despair, from decreased social connection to increased worries about the future of the planet.

Newsweek: Op-ed: We Need Parents and Policy to Save Our Kids from Big Tech: It is now firmly established that social media are ruining the minds and bodies of America’s children. Facebook’s own internal studies find that among teens, especially teen girls, the company’s products lead to “increases in the rate of anxiety and depression.” Social media are designed to be addictive. Heavy use leads to sleep disorders, body dysmorphia, and suicidal thoughts. This should be enough reason for a sane society to stop, think, and change course. They are kids, after all, who deserve peace of mind and time with their loved ones undisturbed by digital encroachments. But we live in a technological age, in which the imperatives of Silicon Valley are given precedence over everything, including the well-being of children. So instead of sending our kids a life raft, we are packing their bags for the Metaverse, where their minds will be beyond reach.

Lancaster Online: LTE: Social media affects everyone’s well-being: (Written by Savannah Ginder, student at Conestoga Valley High School): “I just felt happier.” That’s what my friend said about giving up social media for a week. Instead of scrolling, she listened to podcasts, colored and went on walks. My teacher had a similar experience after she decided to get rid of TikTok. Social media can affect your well-being by creating a negative environment that leads to illnesses such as depression and anxiety. “The platforms are designed to be addictive and are associated with anxiety, depression, and even physical ailments,” states a report on the website of McLean Hospital, a leading psychiatric hospital in Massachusetts. No wonder both my friend and my teacher felt better after giving up social media.

Forbes: FDA: Here Are Dangers Of NyQuil Chicken And Benadryl Challenges On Social Media: If you are thinking about cooking your chicken in NyQuil, don’t. Just don’t. The same goes for trying to swallow enough Benadryl so that you can start hallucinating. These are not good ideas, no matter what someone on Instagram, TikTok, Facebook, Face-meta, Meta-Face, or whatever your social media of choice may be called. But apparently enough people have been doing such things that the U.S. Food and Drug Administration (FDA) has felt the need to issue a warning about the dangers of doing such things.

Forbes: The Latest Attempt To Address The Online Data And Privacy Crisis: Some crises strike companies quickly, are addressed by corporate executives, and soon fade from the spotlight. Other crises capture the public’s attention but are eventually placed on the back burner, unresolved. But they can get moved to the front at any time. Consider the case of the online data and privacy crisis, which made international headlines a year ago when whistleblower Frances Haugen told Congress that Facebook and Instagram negatively impacted the mental health of teenagers. Not surprisingly, there were several rounds of accusations and finger-pointing over who was to blame for the crisis, the extent of the impact of social media on mental health, and what had or should be done about it.

ABC 27: VIDEO: Pennsylvania bill would require porn filter on children’s devices: A bill introduced in the Pennsylvania State House would require a filter on children’s mobile devices to prevent access to pornography. The bill introduced by Rep. Jim Gregory (R-Blair) would require cellular carriers to switch on filters for new smartphones and tablets activated in Pennsylvania. Gregory says the bill “mirrors” legislation signed in Utah, which doesn’t go into effect until multiple states enact similar legislation. The American Civil Liberties Union of Utah argued the constitutionality of the Utah bill was not adequately considered and that it will likely be argued in court. Gregory argues that Pennsylvania should follow several other states that have proposed similar legislation.

Time Magazine: Social Media Has Made Teen Friendships More Stressful: Public health data signals a genuine crisis in adolescent mental health: rising rates of anxiety, depression, and hopelessness. But as we worry about tweens and teens who are struggling, we can’t ignore another mounting toll—the burdens that are shouldered by their friends and peers in an “always on” world. We have studied teens and tech for over a decade. Still, what we learned in our most recent study stopped us in our tracks. We collected perspectives from more than 3,500 teens on the best and trickiest parts of growing up in a networked world, and we co-interpreted these perspectives alongside other teens who helped us make sense of what we were hearing.

Axios: Why social media companies moderate users’ posts: Facebook, Twitter and other online services set rules for users’ posts not just to flag individual statements, but more broadly, to ensure they’re complying with the law, to help define their businesses and to protect their users. Driving the news: Public debate over online speech peaked again with Kanye West’s ban from Twitter and Elon Musk’s willingness to bring Donald Trump back to that service if he becomes its owner. But public understanding of why social networks moderate content remains murky. Obeying the law: Social media networks have to follow local laws like everyone else.

CBS 21: Talking to your child about dangerous internet trends like ‘one chip challenge’: October is Cyber Security Awareness Month, so there’s no better time to shine a light on a shocking internet trend Harrisburg School District just banned for putting kids in the hospital. The “One Chip Challenge” is making its rounds on social media, particularly on TikTok. It’s been around for a few years, but people are having serious reactions to the 2022 edition of the chip. You can buy the chip at a convenience store or find it online. It costs a whopping $9. People eat a single spicy chip and wait as long as they can to eat or drink anything else. Then, they post the video of the challenge on social media.

The Washington Post: ‘Responsible social media’ council looks to bridge divides on tech: The Biden administration announces a proposal affecting gig workers, and Meta’s metaverse pitch for businesses faces some challenges. First: ‘Responsible social media’ council looks to bridge divides on tech. Public officials in Washington for years have sparred along partisan lines over whether social media platforms take down too much or too little hate speech and misinformation. A council launching this week aims to sidestep those disputes by proposing reforms that tackle issues of bipartisan concern, including children’s safety and national security.

AP: White House unveils artificial intelligence ‘Bill of Rights’: The Biden administration unveiled a set of far-reaching goals Tuesday aimed at averting harms caused by the rise of artificial intelligence systems, including guidelines for how to protect people’s personal data and limit surveillance. The Blueprint for an AI Bill of Rights notably does not set out specific enforcement actions, but instead is intended as a White House call to action for the U.S. government to safeguard digital and civil rights in an AI-fueled world, officials said. “This is the Biden-Harris administration really saying that we need to work together, not only just across government but across all sectors, to really put equity at the center and civil rights at the center of the ways that we make and use and govern technologies,” said Alondra Nelson, deputy director for science and society at the White House Office of Science and Technology Policy. “We can and should expect better and demand better from our technologies.”

New York Post: ‘School photo’ social media trend could leave kids vulnerable to predators: Police: As students adjust to returning to school this fall, law enforcement members and online safety experts are reminding parents to be cautious about the information they share on social media. It may give predators access to children and scammers access to personal information. “We’re not saying not to share,” Deputy Sheriff Tim Creighton of the McHenry County Sheriff’s Office in Woodstock, Illinois, recently told Fox News Digital.  “I have people to this day on my feeds. They are sharing way too much information.” “Less is better,” he said. “Your close friends and family know the important details about your kids, such as the town they live in, the school they go to, their full name. Strangers don’t need to know that.”

The Hill: Why ‘sharenting’ is sparking real fears about children’s privacy: For parents, grandparents and caregivers, snapping a photo of their child and sharing it on social media may seem like a routine, harmless act. After all, being proud of your child and wanting to share that pride with loved ones is a completely normal and largely universal feeling. Unfortunately, this seemingly simple decision — to post a photo, video, or any other information about a child under 18 on social media or the internet in general — comes with a host of ethical and legal considerations, despite the innocent intention behind the action. “Sharenting,” or parents sharing their child’s likeness or personal information on the internet, has grown in popularity alongside the advent of smartphones and social media. And this practice shines a light on the murky realm of children’s consent, digital data collection, targeted advertising, and real-world dangers resulting from parents’ online activities.

WXYZ-TV: Detroit mother sues Instagram for negatively affecting her 13-year-old child: A 2018 Pew Research Study found that 45% of teenagers are online almost constantly. 97% use a social media platform. A Johns Hopkins University study from 2019 shows that 12 to 15-year-olds in the U.S. who spend more than three hours a day on social media are likely to have a heightened risk for mental health problems. Now, a Detroit mother of a 13-year-old is suing Instagram and its parent company Meta claiming it had horrible effects on her daughter. The plaintiff, known as L.H., had been on Instagram since the age of 11 and was a “heavy user” according to a 123-page federal complaint.

The Hill: California passes bill requiring social media companies to consider children’s mental health: California’s legislature has passed legislation that will require social media companies to consider the physical and mental health of minors who use their platforms. Assembly Bill 2273 passed in the state’s Assembly chamber in a 75-0 vote on Tuesday. The proposed legislation is headed to the desk of California Gov. Gavin Newsom (D), though it is unclear whether Newsom will sign the legislation into law, The Wall Street Journal reported. The California Age-Appropriate Design Code Act, which was first introduced by Assembly members Buffy Wicks (D), Jordan Cunningham (R) and Cottie Petrie-Norris (D), will “require a business that provides an online service, product, or feature likely to be accessed by children to comply with specified requirements.”

The New York Times: An Apple Watch for Your 5-Year-Old? More Parents Say Yes.: Florian Fangohr waffled for about a year over whether to buy an Apple Watch SE as a gift. The smart watch cost $279, and he worried that its recipient would immediately break or lose it. In May, he decided the benefits outweighed the costs and bought the gadget. The beneficiary: his 8-year-old son, Felix. Mr. Fangohr, a 47-year-old product designer in Seattle, said he was aware that many people were pessimistic about technology’s creep into children’s lives. But “within the framework of the watch, I don’t feel scared,” he said. “I want him to explore.” Felix, a rising third grader, said he actually wanted a smartphone. “But the watch is still really, really nice,” he said.

The New York Times: Sweeping Children’s Online Safety Bill Is Passed in California: Social media and game platforms often use recommendation algorithms, find-a-friend tools, smartphone notices and other enticements to keep people glued online. But the same techniques may pose risks to scores of children who have flocked to online services that were not specifically designed for them. Now California lawmakers have passed the first statute in the nation requiring apps and sites to install guardrails for users under 18. The new rules would compel many online services to curb the risks that certain popular features — like allowing strangers to message one another — may pose to child users. The bill, the California Age-Appropriate Design Code Act, could herald a shift in the way lawmakers regulate the tech industry. Rather than wade into heated political battles over online content, the legislation takes a practical, product-safety approach. It aims to hold online services to the same kinds of basic safety standards as the automobile industry — essentially requiring apps and sites to install the digital equivalent of seatbelts and airbags for younger users. “The digital ecosystem is not safe by default for children,” said Buffy Wicks, a Democrat in the State Assembly who co-sponsored the bill with a Republican colleague, Jordan Cunningham. “We think the Kids’ Code, as we call it, would make tech safer for children by essentially requiring these companies to better protect them.”

ABC News: What parents should know before sharing back-to-school photos online: Katy Rose Prichard, a popular mom influencer on social media, speaks out about how images of children’s faces can be used in ways you never imagined. It’s become a cherished tradition among parents every August and September: sharing back-to-school photos on social media with family members and friends as a new school year kicks off. The trend has been a mainstay on social media, with parents posting pictures of their kids holding signs that showcase details like their child’s age, grade, school, teacher or afterschool activities, and the photos are an easy way to keep loved ones updated. But although it may seem harmless, privacy and security experts say parents and caregivers need to be aware of the inherent risks of sharing pictures and identifiable information online.

CNBC: Randi Zuckerberg says she’s a ‘big proponent of the real world’ when it comes to parenting: Randi Zuckerberg says she’s a “big proponent of the real world” — especially when it comes to protecting children from technology. Speaking at the Credit Suisse Global Supertrends Conference in Singapore earlier this month, Randi Zuckerberg, who is founder and CEO of Zuckerberg Media, discussed worries among many that the metaverse will take children further away from reality. 

Los Angeles Times: Op-Ed: California’s fight for a safer internet isn’t over: This month, a bill to regulate social media services for children was rejected by California’s Senate Appropriations Committee without explanation. The proposed legislation, sponsored by Assembly members Jordan Cunningham (R-Paso Robles) and Buffy Wicks (D-Oakland) and called the Social Media Platform Duty to Children Act, would have allowed the state attorney general and local prosecutors to sue social media companies for knowingly incorporating features into their products that addicted children. The powerful tech industry lobbied for months to defeat the bill.

WLWT (Ohio): Experts share warning for parents about back-to-school social media posts: A warning for parents, as back-to-school social media posts could be putting your child at risk. Experts say some parents are posting too much personal information with their child’s back-to-school pictures. “I’m on Facebook, so I definitely see all the postings that are going on right now,” parent Selena Ramanayake said. “They actually have their school name on there, and age, and all this, and so sometimes I kind of have that, I don’t know, hesitation about should you be posting all that,” she said. Ramanayake is talking about popular social media posts of children holding signs that read details about their lives and school information.

The New York Times: LTEs: Should Kids Be Kept Off Social Media?: Yuval Levin’s suggestion is an interesting one, but experience tells us that kids are savvy at getting around age restrictions and safety guards. Kids today are forming connections using technology and growing up with a smartphone in their hands, so we must meet the moment by taking a holistic approach to keeping them safe online. We need to ensure that social media platforms are designed to protect children from bad actors. And we must support parents by providing them with tools to have effective communication with their kids about online safety. Age limits alone will not take the place of these two fundamental elements. Research shows that parents shy away from having difficult conversations about safety topics. For example, one recent survey shows that while the majority of parents have spoken with their kids about being safe on social media generally, less than a third have talked directly about sharing and resharing nude selfies. In short, parents need support so they can feel confident having early and judgment-free conversations with their kids. Platforms need to be proactive in designing their platforms with child safety in mind. And youth need access to modern, relevant education on these tough topics to reduce shame and create a safety net.

Harrisburg Patriot News: Dauphin County girl rescued from couple who lured her away via Instagram: police: A New York couple kidnapped a Dauphin County teenager last year after reaching out to her on Instagram and offering to do her makeup, court documents said. A 13-year-old girl’s mother reported her missing to Lower Swatara Township police after she’d been gone for several days in December 2021. The mother said her daughter had run away before, but usually came right back or was quickly found, Lower Swatara police said in an affidavit of probable cause. Investigators traced the 13-year-old’s Internet Protocol (IP) address on Instagram to a home in Amsterdam, New York, where Jeniyah D. Lockhart-Tippins and Neil T. Moore II lived, according to the affidavit.  After she was rescued from the couple’s home, the 13-year-old told investigators Lockhart-Tippins followed her on Instagram and sent her a direct message, offering to do her makeup, the affidavit said.

Pittsburgh Tribune-Review: Cellphones in schools: Some districts take steps to eliminate devices from class while others balance benefits: Wake up. Check your phone. Go to class. Check your phone. Start homework. Check your phone. Go to bed. Check your phone. For some high schoolers, cellphone use is almost on par with blinking, with the average teenager raking in up to nine hours of screen time each day, according to the American Academy of Child & Adolescent Psychiatry. In the classroom, phones can serve as an educational tool or a pesky distraction. The latter rang true for six Western Pennsylvania schools — so much so that these schools will take steps to eliminate cellphone use from the classroom during the 2022-23 academic year.

WIRED: How to Use Snapchat’s Family Center With Your Kids: The social media platform just made it easier to find out who your children are interacting with online: To all the parents who want to know more about who your kids are talking to on their smartphones, I have good news and bad news. The good news: A prominent social media app recently made changes allowing parents and guardians to access more data on the children they care for who are ages 13 to 17. The bad news: You have to download Snapchat. Once it’s set up and your account is connected with those of your children, Snapchat’s new family center lets you see the child’s friend list, who they’re sending messages to, and report potential abuse. The family center does not let you peek into the content of their messages. Although the new feature allows you to see the approximate time your teen messaged someone during the past week, an exact timestamp isn’t provided.

Huffington Post: 7 Things You Should Ask Your Kids About Their Social Media Accounts: Parents may feel apprehensive thinking about their kids on social media, but the reality is young people regularly use platforms like Instagram, TikTok and Snapchat. A survey published by Common Sense Media in March 2022 found that 84% of teens and 38% of tweens say they use social media, with 62% of teens and 18% of tweens saying they use it every day. These numbers underscore the importance of talking to young people about these platforms and their experiences.

Forbes: What The Results Of 32 Studies Teach Us About Parenting In The Age Of Social Media: A new study published in the academic journal Current Opinion in Psychology offers a path forward for parents who are searching for better ways to navigate the nascent world of adolescent social media use. The authors argue that it is possible for parents to put guardrails in place that reduce pre-teen and adolescent anxiety and depression resulting from social media overconsumption, as well as minimize the negative effects of cyberbullying. Here is an overview of their recommendations.

U.S. News & World Report: How to Talk to Tweens About Being Responsible on Social Media: Posting questionable content online could affect your child’s future.: Social media users are getting younger. As screen time increased during the pandemic, so did social media use, especially among tweens, according to the latest report by Common Sense Media, a nonprofit research and advocacy group. Although most social media apps are intended for those 13 or older, nearly one in every five tweens, defined as those ages 8 to 12, reported being on social media daily. These platforms can have both positive and negative effects for young people, researchers say. As more kids access social media at younger ages, it’s increasingly important for parents and educators to help them learn how to stay safe and use social media responsibly. That includes teaching your kids that what they say online can have long-term consequences.

AP: California social media addiction bill drops parent lawsuits: A first-of-its-kind proposal in the California Legislature aimed at holding social media companies responsible for harming children who have become addicted to their products would no longer let parents sue popular platforms like Instagram and TikTok. The revised proposal would still make social media companies liable for damages of up to $250,000 per violation for using features they know can cause children to become addicted. But it would only let prosecutors, not parents, file the lawsuits against social media companies. The legislation was amended last month, CalMatters reported Thursday. The bill’s author, Republican Assemblymember Jordan Cunningham, said he made the change to make sure the bill had enough votes to pass in the state Senate, where he said a number of lawmakers were “nervous about creating new types of lawsuits.”

NPR: Snapchat’s new parental controls try to mimic real-life parenting, minus the hovering: Snapchat is rolling out parental controls that allow parents to see their teenager’s contacts and report to the social media company — without their child’s knowledge — any accounts that may worry them. The goal, executives say, is to enable parents to monitor their child’s connections without compromising teens’ autonomy. Named Family Center, the new suite of tools released Tuesday requires both caregiver and teen to opt in.

New York Post: ‘Victims of Instagram’: Meta faces novel legal threat over teen suicides: Meta is facing a fresh storm of lawsuits that blame Instagram for eating disorders, depression and even suicides among children and teens — and experts say the suits are using a novel argument that could pose a threat to Mark Zuckerberg’s social-media empire. The suits — which are full of disturbing stories of teens being barraged by Instagram posts promoting anorexia, self-harm and suicide — rely heavily on leaks by whistleblower Frances Haugen, who last year exposed internal Meta documents showing that Instagram makes body image issues and other mental health problems worse for many teens.

CNET: Kids Are Being Exploited Online Every Day – Sometimes at the Hands of Their Parents: On TikTok, Instagram and YouTube, some kids are making millions. But any child working as an influencer is at risk of exploitation.: Rachel Barkman’s son started accurately identifying different species of mushroom at the age of 2. Together they’d go out into the mossy woods near her home in Vancouver and forage. When it came to occasionally sharing in her TikTok videos her son’s enthusiasm and skill for picking mushrooms, she didn’t think twice about it — they captured a few cute moments, and many of her 350,000-plus followers seemed to like it. That was until last winter, when a female stranger approached them in the forest, bent down and addressed her son, then 3, by name and asked if he could show her some mushrooms. “I immediately went cold at the realization that I had equipped complete strangers with knowledge of my son that puts him at risk,” Barkman said in an interview this past June. This incident, combined with research into the dangers of sharing too much, made her reevaluate her son’s presence online. Starting at the beginning of this year, she vowed not to feature his face in future content.

Forbes: TikTok Moderators Are Being Trained Using Graphic Images Of Child Sexual Abuse: A largely unsecured cache of pictures of children being sexually exploited has been made available to third-party TikTok content moderators as a reference guide, former contractors say. Nasser expected to be confronted with some disturbing material during his training to become a content moderator for TikTok. But he was shocked when he and others in his class were shown uncensored, sexually explicit images of children.

Politico: Congress is closer than ever to reining in social media: The fallout from Facebook whistleblower Frances Haugen’s explosive testimony about social media’s threat to children before the Senate Commerce Committee last fall is coming into focus. There’s bipartisan support in Congress to ban targeted ads aimed at kids under 16, require tech firms to establish default safety tools to protect children online and give parents more control over their children’s web surfing.

New York Post: Online dangers are rampant for kids today: One of the most important jobs parents have today is keeping their children safe online. As moms and dads prepare to send their kids back to school soon, one critical item needs to be included on the checklist: checking out all online platforms their kids are using — and starting conversations early about cyber safety. Kids and teens between the ages of 8-18 spend about 44.5 hours each week in front of digital screens, according to the nonprofit Center for Parenting Education. This makes it crystal clear that parents need to be tuned in and very educated about what, exactly, their kids are doing during those hours.

CBS News: “It’s a crisis”: More children suffering mental health issues, challenges of the pandemic: According to the Mental Health Alliance, in 2022, fifteen percent of kids ages 12 to 17 reported experiencing at least one major depressive episode. That was 306,000 more than last year. “It’s bad. It’s a crisis,” said Katherine Lewis, a licensed family therapist at The Bougainvilla House, a nonprofit treatment center in Fort Lauderdale that describes itself as a safe place for children and youth to grow emotionally. To understand why children’s mental health is in such a fragile state, CBS4 was given rare access to the center.

Newsweek: Too Much Screen Time for Teens Leads to Mental Disorders, New Study Shows: Youngsters who spend a lot of time in front of a screen are at greater risk of developing behavior disorders, warned a new study. Social media is thought to have an especially strong influence and was most likely to be linked to issues such as shoplifting, scientists said. Watching videos and television, playing games, and texting were linked with oppositional defiant disorder (ODD), according to the findings published July 26 in the Journal of Child Psychology and Psychiatry.

Good Housekeeping: The Hidden Danger Behind TikTok’s “Product Overload” Cleaning Trend: TikTok is rife with cleaning inspiration, but one eyebrow-raising trend that has been building steam over the last year now has experts concerned about social media users’ safety. Appropriately known as “product overload” by those in the know, the trend — which involves users filming themselves loading up a toilet, bath or sink with copious amounts of astringent cleaning products — has become its own form of ASMR for what’s known as the “CleanTok” corner of the platform.

FOX 11 (Los Angeles): TikTok sued by parents of teen who blame platform for child’s eating disorder: Another lawsuit was filed Thursday against TikTok, this time by the parents of a girl who allege the social media platform’s content is responsible for their 13-year-old daughter’s severe eating disorder that required the child’s hospitalization and will affect her for life.

The Washington Post: Senate panel advances bills to boost children’s safety online: Senators took their first step toward increasing protections for children and teens online on Wednesday, advancing a pair of bipartisan bills that would expand federal safeguards for their personal information and activities on digital platforms. The push gained momentum on Capitol Hill last year after Facebook whistleblower Frances Haugen disclosed internal research suggesting that the company’s products at times exacerbated mental health issues for some teens.

ABC News: ‘Wren Eleanor’ TikTok trend sees parents removing photos, videos of their kids. An account featuring a 3-year-old has sparked a discussion on online safety.: The families of two teens filed new lawsuits against Meta, the parent company of Instagram, claiming the platform causes eating disorders and is spurring a mental health crisis among young people. A TikTok account with more than 17 million followers has sparked a discussion about children’s privacy and safety online.

ABC News: VIDEO: Parents sue TikTok after daughter dies attempting ‘blackout’ social media challenge: The parents speak exclusively to ABC News about a social media challenge called “blackout” — in which children choke themselves until they pass out. A Wisconsin family is suing TikTok after their 9-year-old daughter died attempting the so-called “blackout challenge” popularized on social media.

Fortune: Instagram and TikTok are wreaking havoc on our finances and happiness, new survey finds: You might have recently purchased athletic gear or a hoodie from an advertisement shared by an online retailer—and immediately regretted it. You’re far from alone. Social media impacts consumers’ spending habits, according to a new study by Bankrate, with nearly half of users admitting to making an impulse purchase based on a sponsored post.

Tech Crunch: Kids and teens now spend more time watching TikTok than YouTube, new data shows: Kids and teens are now spending more time watching videos on TikTok than on YouTube. In fact, that’s been the case since June 2020 — the month when TikTok began to outrank YouTube in terms of the average minutes per day people ages 4 through 18 spent accessing these two competitive video platforms.

Variety: TikTok Will Add Adult-Content Warning Labels to Videos With ‘Overtly Mature Themes’: TikTok is giving users of the popular app more controls over the kinds of videos they see in their feed — including flagging videos with “mature or complex themes” intended for viewers 18 and older. TikTok’s Community Guidelines detail categories of content that is banned by the platform, including nudity, pornography and sexually explicit content.

Forbes: TikTok: America’s Drug Of Choice: A recent report that TikTok’s American user data is routinely accessed by Chinese employees comes as no surprise. China’s global technology companies have long engaged in persistent data sharing thereby giving the Chinese government eyes and ears around the world.

New York Post: Alarming TikTok trend sees parents ask kids to help them fight: The first rule of Fight Club is you do not talk about Fight Club, but these parents are posting on TikTok. A new trend on the video-sharing app that involves parents asking their kids to defend them in a fight has divided users, with some saying it’s promoting violence in young children. The trend — and the hashtag #fightprank — has over 24.8 million views on TikTok.

Tech Crunch: Children’s rights groups call out TikTok’s ‘design discrimination’: Research examining default settings and terms & conditions offered to minors by social media giants TikTok, WhatsApp and Instagram across 14 different countries — including the US, Brazil, Indonesia and the UK — has found the three platforms do not offer the same level of privacy and safety protections for children across all the markets where they operate.

New York Post: TikTok sued after young girls die in ‘blackout challenge’: TikTok is facing wrongful death lawsuits after two young girls killed themselves trying to recreate “blackout challenge” videos they watched on the platform. Lalani Erika Walton, 8, and Arriani Jaileen Arroyo, 9, both wound up dead after watching hours of the videos featuring the challenge fed to them by TikTok’s algorithm, the suits allege, the Los Angeles Times reported.

CNN: An FCC regulator wants TikTok removed from app stores. Here’s how a company executive responded: While TikTok’s short-form videos are entertaining, that’s “just the sheep’s clothing,” a Federal Communications Commission official said, and the app should be removed from app stores because of security issues. But a TikTok executive, in a rare interview on CNN’s “Reliable Sources” on Sunday, claimed there are no security concerns linked to the hugely successful app.

Forbes: Hugely Popular NGL App Offers Teenagers Anonymity In Comments About Each Other: A new app that allows Instagram users to send anonymous messages is soaring in popularity – and renewing concerns about cyberbullying and harassment that plagued previous apps allowing teens to comment on one another without attribution.

WTAE-TV: VIDEO: Charleroi man accused of luring three young girls through Snapchat: Police say Brandon Johnson, 35, drove girls to a Connellsville hotel: Connellsville police said a 35-year-old Charleroi man used Snapchat to lure three young girls to an area hotel last weekend.

US Attorney’s Office: Philadelphia Man Convicted of Sex Trafficking a Minor on Backpage.com: United States Attorney Jennifer Arbittier Williams announced that a man was convicted at trial of sex trafficking, arising from his forcible coercion of a minor to engage in prostitution. The defendant and the victim first met on a digital social networking application in June 2016.

WTAE-TV (Pittsburgh): North Dakota man accused of sexually exploiting 13-year-old Washington County girl: A North Dakota man has been indicted on charges of child pornography and sexual exploitation of a 13-year-old girl from Washington County. Nicholas Nesdahl, 27, was being held in a jail in North Dakota on Friday awaiting extradition to the Pittsburgh area. In October 2021, a woman reported to Peters Township police that she found troubling videos on her daughter’s cellphone.

PA Police Warn Of Dangerous TikTok Challenge With Gel Gun: Police departments all over are warning folks about a dangerous social media challenge urging users to shoot modified pellet guns at people.

PA State Rep. Hit By Pellets While Walking Dog: As multiple police agencies were investigating a shooting at Erie High School, Rep. Pat Harkins was walking his dog Barry, just several blocks away.

FBI Pittsburgh Warns of Increase in Sextortion Schemes Targeting Teenage Boys: The FBI Pittsburgh Field Office is warning parents and caregivers about an increase in incidents in the Pittsburgh area involving sextortion of teenagers. The FBI is receiving an increasing number of reports of adults posing as age-appropriate females coercing young boys through social media to produce sexual images and videos and then extorting money from them.

Dad Warns Parents After Son, 12, Dies from ‘Blackout Challenge’: ‘Check Out’ Your Kids’ Phones: “This is a weapon in our home that people don’t know about,” says Haileyesus Zeryihun.

Vague TikTok threats bring police presence to local schools: Law enforcement and schools are taking extra precautions amid an apparent TikTok trend threatening violence nationwide on Friday.

12-Year-Old Boy Who Burned 35 Percent of Body in TikTok ‘Fire Challenge’ Tells Kids ‘Not to Be a Follower’: Nick Howell spent almost six months in and out of the hospital and had 50 surgeries.

Easton Express-Times: Slate Belt teen faces 20 child porn counts in Pa. Attorney General’s Office probe: An 18-year-old Slate Belt man faces numerous charges of possessing child pornography after a months-long investigation by the Pennsylvania Attorney General’s Office.

PFSA’s Digital Dialogue Video Series

PFSA’s Digital Dialogue video series is a new resource that provides viewers with up-to-date information on current events and emerging trends in digital safety and digital wellbeing for families. This video series is intended to help increase awareness, educate families and professionals, and provide tips that can be quickly implemented as we navigate the digital era of parenting.

Check out the videos below and keep an eye out for new videos in this series!

Reporting Abuse and Exploitation

ChildLine

  • ChildLine provides information, counseling, and referral services for families and children to ensure the safety and well-being of the children of Pennsylvania. The toll-free intake line, 1-800-932-0313, is available 24/7 to receive reports of suspected child abuse.

NCMEC CyberTipline

  • The National Center for Missing & Exploited Children (NCMEC) CyberTipline is the nation’s centralized reporting system for the online exploitation of children. The public and electronic service providers can make reports of suspected online enticement of children for sexual acts, child sexual molestation, child sexual abuse material, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet. Reports may be made 24/7 online at www.cybertipline.org or by calling the 24-hour hotline at 1-800-THE-LOST (1-800-843-5678).
