
- Meta has launched teen accounts, which require parental approval for teens under 16 to use Messenger.
- The move aligns with global trends in tech companies enhancing child protection.
- While the move is welcome globally, Africa faces hurdles enforcing similar protections: low digital literacy and weak regulation could leave African teens exposed.
Meta is rolling out teen accounts with stricter controls for users under 18 on its Messenger platform, a move it says is aimed at improving safety for teenagers online. The accounts introduce stricter default privacy settings and parental controls designed to create a more secure environment for young users.
The new development responds to rising pressure on tech companies to do more to protect children. Regulators in the US and EU have stepped up scrutiny of how platforms handle minors’ data and whom they allow minors to communicate with.
TikTok, for instance, has faced similar scrutiny over content moderation and child safety, leading to large-scale content removals in several regions.
In the African context, implementing such safety measures presents unique challenges: digital literacy varies widely across the continent, and regulatory frameworks are often still developing.
In 2019, for instance, Nigerian lawmakers proposed a bill to curb the online distribution of child sexual abuse material, part of ongoing efforts to establish legal structures for online child protection.
Unlike in Europe or North America, most African countries lack specific child data protection laws. Nigeria, for example, has a Child Rights Act but no robust mechanisms for enforcing it online. Efforts to police digital spaces are further hampered by limited technical capacity and fragmented policy frameworks.
In practice, many children access social media through shared devices, making age verification difficult to enforce. And in places where digital literacy is low, parents often lack the tools or awareness to supervise their children’s online activity—even if platforms provide the features.
These pressures play out locally too: TikTok came under fire in Kenya for failing to moderate harmful content aimed at teens, prompting national hearings. Such cases highlight the gap between tech companies’ global policies and local realities.
For Africa’s growing population of connected youth—many of whom access social media before turning 13—the question isn’t just whether platforms are doing enough but whether they are tailoring those efforts to environments with weaker protections and oversight.
Ultimately, Meta’s new teen protections are a step forward. But their success in Africa will depend on how well they’re supported by local partnerships, better public awareness, and stronger regulatory frameworks that put child safety front and centre.
Meta previously introduced teen accounts for Instagram in September 2024, in a similar bid to manage teenagers’ activity on the platform.