Published in Children’s Voice, Volume 34, Number 2
by Linda Anderson
The ring light is turned on. A child no older than ten adjusts the camera. Her mother stands just off screen, offering a practiced smile of encouragement and a subtle nod that signals it’s time to begin. Moments later, the girl launches into a skincare tutorial for her 2.3 million followers—her voice bright, her script rehearsed. The video is polished, charming, and (like so many others on her channel) sponsored by a major brand. By bedtime, her likeness has been viewed over a hundred thousand times, while her mother scrolls through comments and checks the next product deadline.
There is no judge, no courtroom, and no bailiff at the door. Yet this remains a legal frontier—one in which the rights, safety, and well-being of children engaged in influencer work are not yet fully protected. These children, who gain recognition and income through platforms like YouTube, Instagram, and TikTok, navigate a digital space where the spotlight rarely turns off. They are stars in a show that never ends, and unlike children engaged in traditional acting, their performances often unfold in bedrooms, kitchens, or playgrounds. Their private lives become the content—and the law is struggling to keep up.
A New Kind of Child Labor
In 2021, the Centers for Disease Control and Prevention reported a sharp and troubling rise in youth mental health concerns. Between 2007 and 2021, the suicide rate among young people ages 10-24 increased 62% (Tsai, 2023). According to the Youth Risk Behavior Survey (YRBS), nearly three in five (57%) U.S. teenage girls reported feeling persistently sad or hopeless in 2021, and almost one in three (30%) said they had seriously considered attempting suicide (Curtin & Garnett, 2023). While social media has not been labeled as the direct cause, the environment of constant online engagement and the pressures of maintaining a public persona can increase emotional strain.
Children involved in influencer activities are often excluded from labor protections that could help safeguard their well-being—which recent statistics show is rapidly declining (Hamilton, 2023; Simone, 2020). Without these protections, they face heightened risks of mental health challenges such as anxiety and depression—issues that research ties to the unpredictability of online attention and the pressures of turning their childhood experiences into income (Hamilton, 2023).
Traditionally, laws like California’s Coogan Act have helped protect children who are performers by requiring that a portion of their earnings be placed in trust accounts and that working conditions meet strict labor guidelines. Yet because young people engaged in influencer work often are not legally recognized as performers, they fall through the cracks of traditional labor laws designed for entertainment industries (Simone, 2020).
The rise of popular social media platforms has enabled a new form of labor (digital content creation) in which children are increasingly participating (Hamilton, 2023). While these platforms offer new opportunities for fame and income, they have also introduced new legal and ethical challenges, particularly for children who are featured in monetized content (Simone, 2020).
In 2023, Illinois became the first U.S. state to enact legislation directly addressing the rights of children in monetized family vlogs, requiring that children featured in such content receive a portion of the earnings and that platforms maintain documentation of this labor (Hamilton, 2023). While this marked a significant step forward, such protections remain rare and fragmented. To date, only a handful of states, including Illinois, California, Minnesota, and Utah, have implemented laws safeguarding children’s earnings and well-being. With no federal regulations in place and inconsistent policies across jurisdictions, the legal landscape remains ill-equipped to safeguard young people working in the digital sphere (Abrams, 2023). These challenges are not unique to the United States, as explored further in the “Expanding the Global Landscape” section below.
The Family Business: Balancing Parental Authority and Children’s Rights
Digital content featuring children is often produced within the home where parents take on multiple roles: caregiver, content manager, videographer, and business strategist. This convergence of responsibilities introduces a significant ethical and legal challenge: how to navigate the intersection of parental authority and a child’s rights in a setting where the child’s image and labor generate income that often supports the family’s financial needs. Unlike traditional entertainment industries, where external oversight and labor laws offer some degree of protection, content created in domestic environments typically lacks regulation, structure, and independent accountability (Hamilton, 2023; Simone, 2020).
Parents have long held the legal right to make decisions on behalf of their children, including those related to employment. This right is traditionally grounded in the presumption that parents act in the best interests of their children. However, when content creation becomes a source of both personal revenue and family livelihood, this presumption warrants closer examination. The influence of monetization, branding opportunities, and platform growth can complicate parental decision-making, especially when those decisions may also serve the family’s financial interests (Abrams, 2023).
The resulting dynamic raises concerns about consent, agency, and exploitation. Children whose lives are shared through monetized content are often too young to fully understand the implications of discussing their personal experiences online. Issues such as privacy, compensation, and long-term mental health risks are rarely addressed with the same seriousness afforded to child actors or traditional laborers (Simone, 2020; Boring, 2020). When parents are responsible for overseeing their child’s digital presence, the absence of third-party advocacy means that there are few safeguards to ensure that decisions are made with the child’s well-being as the primary consideration (Hamilton, 2023).
Existing legal frameworks offer limited protection within this evolving landscape. In the United States, for example, the Children’s Online Privacy Protection Act (COPPA) primarily focuses on data collection and requires parental consent for users under 13, but it does not address critical issues such as monetized content, working conditions, or the psychological impact of sustained digital engagement (Hamilton, 2023). In contrast, the European Union’s General Data Protection Regulation for Children (GDPR-K) grants individuals under age 16 enhanced rights—like the ability to remove personal data and content—reflecting a more nuanced recognition of children’s digital labor rights and autonomy (Simone, 2020).
Meanwhile, the relentless pressure to maintain a visible online presence, driven by algorithms and advertising revenue, has tangible effects on children’s mental health. Research by the Ethics and Public Policy Center concluded that many young people engaged in influencer work experience anxiety and depression directly linked to fluctuating views, likes, and audience engagement (Morell, 2023). When these metrics affect both personal validation and family income, children face heightened emotional risks with inadequate protective support (Curtin & Garnett, 2023).
These realities highlight the urgent need for a legal and ethical framework that goes beyond the longstanding assumption of parental good faith. Regulation must evolve to reflect the unique challenges of home-based digital labor, ensuring that children’s rights to privacy, psychological well-being, and future autonomy are upheld rather than subordinated to familial commercial interests. Legal protections such as guaranteed earnings safeguards, independent oversight, and the right to content removal can establish a more balanced and effective approach to protecting young people navigating the influencer economy (Abrams, 2023; Simone, 2020).
Expanding the Global Landscape
While protections for children in entertainment have long existed, they were designed for a different era—one of film sets and casting calls, not ring lights and living room studios. In the United Kingdom (UK), for example, traditional regulations stem from the Children and Young Persons Act of 1963, which governs hours, conditions, and earnings for child actors (Abrams, 2023). These laws, however, were not built for the always-on nature of social media. Recognizing this gap, the UK’s Office of Communications (Ofcom) proposed reforms in 2024 aimed at addressing online child safety. These include mandatory age verification for digital content featuring minors, limitations on data-driven advertising to children, enhanced record-keeping obligations for platforms, and an expansion of the “right to be forgotten” under the GDPR-K framework (Abrams, 2023).
Australia has taken similar steps. In 2024, the Office of the eSafety Commissioner released guidelines formally identifying children featured in commercial content as a vulnerable class of digital laborers. The rules require platforms to verify parental consent, create streamlined content removal processes for minors, and publish transparency reports on how frequently children appear in monetized content (Simone, 2024). These are promising developments—but they remain isolated.
France stands out as one of the first countries to create a law specifically designed for the social media influencer economy. In 2020, the French Parliament passed legislation requiring that children featured in monetized social media content receive legal protections similar to those afforded to traditional child actors. The law mandates prior administrative authorization for any work that generates income from platforms such as YouTube or Instagram. It also requires that a portion of the child’s earnings be placed into a dedicated savings account accessible only when they reach adulthood. Additionally, the law grants young people the right to have content removed and obligates platforms and parents to maintain detailed records of the child’s involvement in online productions (Boring, 2020).
Despite progress in select nations, legal protections for children in the influencer economy remain inconsistent and incomplete on a global scale. There is no unified international standard to safeguard young digital laborers, and many existing child labor or performance laws—like the Coogan Act—fail to account for the unique pressures and privacy risks of digital fame (Simone, 2024). Until laws evolve to meet the specific realities of social media, young people participating in content creation will remain exposed to commercial exploitation in a regulatory gray zone (Abrams, 2023; Simone, 2024).
A Uniform National Framework
What is needed is a uniform national framework that adapts child labor laws for the realities of the digital age. Such a framework must mandate fair compensation, restrict exploitative content, and ensure that young people have the right to remove content once they can fully understand its implications (Simone, 2020). It should also include mandatory mental health oversight, so that children and youth involved in digital content creation receive regular assessments and support from professionals, along with robust privacy protections that allow for data erasure and content removal upon request (Hamilton, 2023). The progress made by states like Illinois, California, Minnesota, and Utah, together with Australia’s eSafety guidelines and France’s comprehensive influencer legislation, demonstrates that effective, targeted protections can be implemented to safeguard young digital laborers. These examples highlight that a national framework—one that closes jurisdictional gaps, standardizes protections, and provides clear accountability—is not only necessary but achievable.
Recognizing young people who work as influencers as digital laborers and rights-holders—not just as viral content—means acknowledging that their well-being requires consistent, enforceable protections. Without a national framework, children’s rights remain subject to patchwork laws, platform discretion, and uneven enforcement. A standardized approach would ensure that all children, regardless of where they live or on what platform they appear, are afforded the same baseline protections for their privacy, earnings, mental health, and future autonomy. Policy-makers must act to modernize child labor policy at the federal level—introducing legislation that reflects the realities of the digital economy and prioritizes the safety, dignity, and future of every child featured in monetized online content.
Equally essential is public education and awareness. Many caregivers, educators, and even content creators themselves are unaware of the legal gaps that leave children unprotected online. By informing communities, raising awareness in schools, and promoting dialogue among policy-makers, media platforms, and families, we can build momentum for reform. Educating the public is not only a tool for change—it is a necessary step toward creating a culture that values and defends the rights of children growing up in the digital spotlight.
Advancing Digital Safety for Young People
The rise of children participating in the influencer economy has opened new doors—but also new dangers—for how children experience work, family, and fame in the digital age. As this article has explored, the legal protections currently in place are often outdated, inconsistent, or altogether absent when it comes to safeguarding young creators. Existing labor laws were not designed for a world where childhood can be broadcast, monetized, and shaped by algorithms.
The well-being of children and youth must be the starting point for any serious conversation about digital labor. That means rethinking how we define work, how we balance parental authority with children’s rights, and how we ensure that young people are not being asked—implicitly or explicitly—to sacrifice privacy, mental health, or identity for engagement metrics.
Laws cannot prevent every harm, but they can create boundaries that protect children from being overexposed, overworked, or overlooked in the name of content. A more uniform, child-centered legal framework would not only address issues like compensation and consent—it would reflect a deeper commitment to protecting the humanity of the children behind the screens.
This is a pivotal moment. Policy-makers, educators, parents, and platform designers alike must step forward and acknowledge that the digital economy has outpaced the protections meant to keep children safe. We need stronger legislation, clearer ethical standards, and a shared willingness to prioritize the long-term well-being of children over short-term clicks and profit. It is not just about reforming policy—it’s about reimagining what we owe to children growing up online. Because childhood should be protected, not marketed.
Linda Anderson earned her Bachelor of Criminal Justice with highest honors from New Mexico State University. During her studies, she distinguished herself as a campus and community leader, holding multiple leadership roles and contributing to a variety of service initiatives. Linda’s professional experience spans criminal defense, immigration, and medical malpractice law, where she has developed a deep understanding of the legal system and a strong commitment to advocacy. She is especially passionate about protecting the rights of children and supporting communities that are underserved, and she plans to continue her legal education in pursuit of becoming an attorney.
References
Abrams, R. (2023). Family influencing in the best interests of the child. Chicago Journal of International Law, 2(2). https://cjil.uchicago.edu/online-archive/family-influencing-best-interests-child
Boring, N. (2020, October 30). France: Parliament adopts law to protect child “influencers” on social media. Library of Congress. https://www.loc.gov/item/global-legal-monitor/2020-10-30/france-parliament-adopts-law-to-protect-child-influencers-on-social-media/
Curtin, S., & Garnett, M. (2023). Suicide and homicide death rates among youth and young adults aged 10-24: United States, 2001-2021 (NCHS Data Brief No. 471). Centers for Disease Control and Prevention. https://www.cdc.gov/nchs/data/databriefs/db471.pdf
Hamilton, B. E. (2023). Anything for views parenting: Framing privacy, ethics, and norms for children of influencers on YouTube. DukeSpace: Duke University Libraries. https://dukespace.lib.duke.edu/server/api/core/bitstreams/28b50eac-1234-4bc7-a20d-195c12cf1877/content
Morell, C. (2023, August 31). Social media and harm to children. Ethics & Public Policy Center. https://eppc.org/publication/social-media-and-harm-to-children/
Simone, C. (2020). When parents decide that all the world’s a stage: Expanding publicity rights to protect children in monetized social media content. Columbia Journal of Law & Social Problems, 1(1), 47-100. https://jlsp.law.columbia.edu/files/2024/10/Simone.pdf
Tsai, B. (2023, June 15). Suicide and homicide rates increase among young Americans. NCHS: A Blog of the National Center for Health Statistics. Centers for Disease Control and Prevention. https://blogs.cdc.gov/nchs/2023/06/15/7396
