Tag: technology

  • The Best Interest of the Child: A Look at the Impact of Social Media in Child Custody Proceedings in California

    By: Sally Vazquez-Castellanos

    Published April 12, 2025. Revised April 13, 2025.

    California’s Age-Appropriate Design Code

    California’s Age-Appropriate Design Code Act (CAADCA) was passed in 2022 to protect children’s online privacy and safety. The Act requires businesses that provide online services or products likely to be accessed by children under 18 to prioritize the interests of young users in the design of those products.

    In 2023, however, a federal judge issued a preliminary injunction blocking enforcement of the Act over free speech and other constitutional concerns. The CAADCA presently remains under judicial review.

    Algorithmic Integrity: The Social Media Algorithm Act

    In 2024, California also passed the Social Media Algorithm Act, effective January 2025. According to the New York Times, the legislation aims to protect young users from the adverse effects of algorithm-driven content, which can contribute to problems such as addiction and cyberbullying.(1)

    California’s Social Media Algorithm Act is complementary legislation to the CAADCA. It is a further effort by the state to address addictive, harmful algorithmic practices that target children and teens.

    By prioritizing chronological feeds, the law reportedly seeks to offer a safer online environment for children. Technology companies apparently have until 2027 to comply with these rules.
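
    To make the distinction concrete, the sketch below contrasts an engagement-ranked feed with the chronological ordering the law favors. It is a minimal conceptual illustration only; the Post fields and scores are hypothetical and are not drawn from the statute or from any platform’s actual code.

        # Hypothetical illustration: the same posts ordered two different ways.
        from dataclasses import dataclass
        from datetime import datetime

        @dataclass
        class Post:
            author: str
            created_at: datetime
            engagement_score: float  # e.g., predicted watch time (hypothetical)

        posts = [
            Post("a", datetime(2025, 1, 1, 9), engagement_score=0.9),
            Post("b", datetime(2025, 1, 1, 12), engagement_score=0.2),
            Post("c", datetime(2025, 1, 1, 10), engagement_score=0.7),
        ]

        # Engagement-ranked feed: an algorithm decides what the child sees first,
        # optimizing for continued use rather than recency.
        ranked = sorted(posts, key=lambda p: p.engagement_score, reverse=True)

        # Chronological feed: newest post first, with no behavioral optimization.
        chronological = sorted(posts, key=lambda p: p.created_at, reverse=True)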

    The law is significant because it requires businesses to confront the harms that can flow from the algorithm. Companies are being asked to examine closely how algorithms affect the minds of young children, harms that compound the more traditional societal risks children face in the digital age.

    The Children’s Code in the United Kingdom

    In September 2021, the United Kingdom’s own Age-Appropriate Design Code (UK AADC) took full effect, setting design standards for digital services likely to be accessed by children under 18 in order to protect their privacy and online safety.

    The UK’s Age-Appropriate Design Code, also known as the “Children’s Code,” is the first official code of its kind for online services likely to be accessed by children, and California’s legislation is modeled on it. Introduced by the Information Commissioner’s Office (ICO), the code helps businesses comply with UK data protection laws such as the UK General Data Protection Regulation (UK GDPR) and centers on protecting children’s data and best interests.

    As with California’s legislation, the ICO states that the purpose of the Children’s Code is to ensure that online services are designed and operated in the best interests of children, which includes promoting their safety, wellbeing, and development.

    The UK Age-Appropriate Design Code sets out 15 standards that serve as guidelines for data processing and service design that protect children online.

    The United Nations Convention on the Rights of the Child

    According to the ICO, the best interests of the child should be evaluated by reference to Article 3 of the United Nations Convention on the Rights of the Child (UNCRC). A family law court applying this standard may consider whether a business is acting in the best interests of children, and it may also consider how the business uses children’s data in relation to the rights outlined in the UNCRC.

    The range of rights under the UNCRC includes:

    • safety;

    • health;

    • wellbeing;

    • family relationships;

    • physical, psychological and emotional development;

    • identity;

    • freedom of expression;

    • privacy; and

    • agency to form their own views and have them heard.

    In a Child’s Best Interest: Taking a Fresh Look at Business Design & Operations

    In California, child custody disputes turn on the child’s best interest. Social media’s impact on mental and emotional health has become significant for all of us, and especially for our children.

    It’s important to design online experiences that are age-appropriate to ensure safety and to support emotional growth, while minimizing risks associated with social media.

    Social Media’s Impact on a Child’s Mental Health

    In California, family law courts consider a child’s use of social media when deciding what is in the child’s best interest. In recent years, social media use has become increasingly significant in these cases, particularly as it concerns the child’s mental and emotional well-being.

    The California legislation discussed above requires businesses to apply age-appropriate design principles to their products.

    Family law courts in California may consider a child’s exposure to social media platforms as part of the best interest evaluation. Courts may look at how social media use affects a child’s mental health, social interactions, and overall well-being when determining a custody arrangement.

    The TikTok Dilemma

    We are now also faced with a national emergency and the question of the TikTok application on our children’s phones. In many ways, I can envision scenarios in which parents and caregivers may need the court’s analysis to go a step further.

    A series of presidential executive orders has declared a national emergency that encompasses the current crisis over the TikTok app. The application is commonly found on smartphones across America and is especially popular among teenagers, and it raises serious questions about the privacy and security of its users.

    There are serious issues surrounding the collection and use of children’s data by foreign adversaries. At the same time, we face threats that go beyond intellectual property theft and retaliatory tariffs with countries like China, where TikTok’s parent company ByteDance is located.

    Online sexual exploitation of young women by experienced predators remains a serious concern within the app’s ecosystem.

    In light of these concerns, we must consider how children use social media. We must also consider how smartphone algorithms, and the security of these devices, have the potential to lead to the exploitation of children by advertisers, third parties, and foreign nations.

    These are serious issues for children and the adults who supervise and love them. Parents cannot be with their children 24/7, and it is an enormous responsibility for any parent or caregiver to shoulder while businesses continue to connect us to the world. It is especially troublesome when smartphones are essential to a child’s day-to-day life in schools across the country.

    Companies should understand the risks posed when third-party service providers or others have access to a child’s smartphone. This is an important consideration for any court or legal proceeding that must weigh the psychological impact of prolonged use on children.

    Enforcement and Penalties

    Neither California’s Age-Appropriate Design Code nor the UK Children’s Code establishes a private right of action for individuals. Both provide for enforcement through a regulatory body. In California, enforcement rests with the California Attorney General.

    The California Privacy Protection Agency also enforces state data protection laws, investigating complaints under statutes such as the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA).

    In the United Kingdom, the ICO sets the guidelines for the Children’s Code, and penalties take the form of the fines available under the GDPR for a data breach. In California, by contrast, penalties are assessed on a per-child, per-violation basis.

    Conclusion

    Overall, the integration of social media considerations into child custody disputes reflects the evolving nature of family law in addressing modern challenges that affect the well-being of children. As technology continues to advance and social media becomes an integral part of daily life, its impact on parenting cannot be overlooked.

    In child custody cases, it is essential for courts to consider California’s age-appropriate design principles, recognizing that algorithmic integrity and online engagement directly influence a child’s emotional development and safety.

    These issues are critically important to cases that reach well beyond the family law courts. By applying these principles, legal experts may be able to evaluate holistically how online behavior and interactions affect a child’s well-being, and to assess the potential risks associated with harmful content or the misuse of social media platforms.

    1. Shawn Hubler and Amy Qin, “Newsom Signs Bill That Adds Protection for Children on Social Media. The California Legislation Comes Amid Growing Concerns About the Impact of Cellphones and Social Media on Adolescents’ Mental Health,” The New York Times, published Sept. 21, 2024; updated Sept. 22, 2024.

  • Ensuring Children’s Privacy and Safety in the Digital Age: TikTok and the AI Dilemma

    By: Sally Vazquez-Castellanos

    Republished on September 23, 2025 at 7:02 pm.

    In the face of rapid technological advancement, both policymakers and tech companies are dealing with increasingly complex issues concerning the online safety and privacy of children. Global laws and regulations, such as the EU’s AI Act, have been implemented to address these challenges.

    When we consider the plain meaning of recent executive orders, as well as the ongoing conflicts in Ukraine and the Middle East, we must understand that TikTok’s issues are a national security nightmare for the United States.

    TikTok and National Security Concerns

    According to a recent Reuters article, President Trump’s proposal to have TikTok sell its U.S. interests remains on the table, days after the deal was said to be on hold.

    TikTok is at the forefront of the debate over national security and children’s privacy. In recent years, concerns have grown about how the platform handles user data. Executive orders have aimed to address the collection of sensitive data from American consumers by foreign adversaries, including recent presidential directives against TikTok intended to mitigate the risks to national security and children’s privacy.

    TikTok, an app widely used by children and teens, has been heavily scrutinized for the company’s data collection practices. The U.S. government, under both President Trump and President Biden, has taken steps to address these concerns, citing the need to protect national security and critical infrastructure and technologies.

    President Trump’s Executive Order Issued August 6, 2020

    President Trump’s executive order expressly states that a national emergency exists with respect to critical infrastructure and technologies. The presidential directive addresses the need to secure Information and Communications Technology and Services in the United States, and it targets mobile apps developed and owned by companies in China that threaten the national security, foreign policy, and economy of the United States.

    The AI Act and Global Perspectives

    On a global scale, the EU’s AI Act aims to regulate artificial intelligence technologies, focusing on transparency, accountability, and personal data protection, especially as the ‘internet of things’ becomes increasingly integrated into our lives. The use of artificial intelligence without guidelines or regulation, including on Google workstations and other smart technologies in schools nationwide, could pose a significant threat to our nation’s children and educational system.

    Artificial intelligence regulations are part of a broader effort to safeguard users globally, inspired by privacy frameworks such as the United Kingdom’s General Data Protection Regulation (GDPR), which treats data privacy as a fundamental human right. California has similar constitutional and state law authorities, such as the California Privacy Rights Act (CPRA), but under federal law the picture is far more complicated.

    Conclusion

    The intersection of technological innovation and children’s online safety demands ongoing attention and the adaptation of laws, policies, and practices. We must address national security concerns with platforms like TikTok, along with the ethical use of artificial intelligence.

    Security concerns for our nation, including the national economy, demonstrate a profound need for federal regulations and policies that carefully consider comprehensive frameworks like the EU’s AI Act. Meanwhile, stakeholders continue their important work toward a safer digital environment for children worldwide.