In June 2025, a coalition of civil society organizations delivered a sharply worded memorandum to the Communications Authority of Kenya (CA), urging revisions to the country’s Draft Industry Guidelines for Child Online Protection and Safety in Kenya, 2025. The guidelines, issued in April 2025 and set to take effect on October 29, 2025, aim to shield children under 18 from online harms such as child sexual abuse material (CSAM), grooming, and cyberbullying, but advocates say they fall dangerously short. The Nairobi-based regulator, tasked with enforcing the Kenya Information and Communications Act, 1998, now faces mounting pressure to close critical gaps before the deadline.
What the Guidelines Require
The draft framework, grounded in Kenya’s 2010 Constitution, lays out nine non-negotiable principles, among them upholding children’s right to free expression, prioritizing their best interests, embedding data protection into product design, and fostering digital citizenship. But it’s the technical mandates that have sparked the most debate. All licensed ICT providers, from Safaricom and Telkom Kenya to app developers and device manufacturers, must now deploy age-verification tools, set default privacy settings to maximum protection for minors, and build content filters at the network, device, and service levels. They’re also required to publish transparent processes for removing harmful content, including CSAM, and to submit their child protection policies to the CA for approval. Quarterly compliance reports will be made public, a move meant to force accountability.
It’s not just about blocking bad content. The guidelines push companies to design safety in from the start — not as an afterthought. That means apps mustn’t allow strangers to message minors without consent, platforms can’t recommend risky content to children, and default settings mustn’t expose location or personal data. The goal? To make Kenya’s digital space as safe as its schoolyards.
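What might those defaults look like in practice? Here is a minimal sketch, in Python, of safe-by-default account provisioning for a user under 18. Every field name and threshold is an illustrative assumption; the draft guidelines prescribe outcomes, not implementations.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Privacy and safety settings applied at account creation."""
    profile_visibility: str = "private"     # who can see the profile
    direct_messages: str = "contacts_only"  # who may initiate messages
    location_sharing: bool = False          # never expose location by default
    personalized_feed: bool = True          # algorithmic recommendations
    appears_in_search: bool = True          # discoverable by strangers

def default_settings_for(age: int) -> AccountSettings:
    """Return settings for a new account; strictest profile for minors.

    Hypothetical logic: the CA's draft requires maximum default
    protection for under-18s but does not specify these fields.
    """
    if age < 18:
        return AccountSettings(
            profile_visibility="private",
            direct_messages="contacts_only",  # strangers cannot message without consent
            location_sharing=False,           # no location or personal data exposed
            personalized_feed=False,          # no recommender pushing risky content
            appears_in_search=False,
        )
    # Adults may opt into more open settings after account creation.
    return AccountSettings(profile_visibility="public", direct_messages="anyone")
```

Encoding the defaults as data rather than scattered flags has one practical benefit: an auditor can diff the settings a minor’s account actually receives against what the guidelines demand.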
Why Civil Society Is Pushing Back
Here’s the thing: the memorandum didn’t just praise the effort; it tore into its blind spots. While the public summary doesn’t list every gap, insiders say the concerns are structural. For one, the guidelines don’t specify how age verification should work in practice. Should a 13-year-old in Kisumu have to upload a national ID? What about kids without formal documents? The rules also lack clear penalties for non-compliance. Without teeth, the guidelines risk becoming promises on paper.
“We’re asking for a child’s safety to be treated like a public health emergency,” said one anonymous advocate familiar with the memo. “But right now, it’s treated like a compliance checkbox.”
Another major concern: the guidelines don’t address the role of social media giants headquartered outside Kenya. TikTok, Instagram, and YouTube aren’t licensed by the CA — yet they’re where most Kenyan children spend their time. The document calls for multistakeholder cooperation, but doesn’t mandate cross-border enforcement mechanisms. That’s a glaring hole in a world where harmful content flows freely across borders.
A Decade in the Making
This isn’t Kenya’s first attempt. Back in 2011, the country’s communications regulator (then the Communications Commission of Kenya) hosted a landmark workshop titled “Protecting Children in Cyberspace: Whose responsibility is it?” That event lit a fuse. By 2015, the Child Online Protection (COP) programme had launched, training parents, teachers, and kids in digital literacy. Then, in 2022, the State Department for Children Services rolled out its National Plan of Action to Tackle Online Child Sexual Exploitation and Abuse (OCSEA), followed by Standard Operating Procedures developed in 2024.
Meanwhile, ChildFund Kenya’s Safe CLICS project has been quietly building local infrastructure — helping the government draft a child-friendly OCSEA manual and pushing for more funding. These efforts weren’t just bureaucratic. They were grassroots. Teachers in Nakuru started teaching kids to screenshot suspicious messages. Community leaders in Mombasa held town halls on sextortion. The guidelines are the formalization of that groundswell.
Who’s Really Affected?
It’s easy to think this only matters to tech companies. But it’s the 12-year-old scrolling TikTok after school. The single mother in Kibera who can’t afford parental control apps. The boy in Kakuma refugee camp who uses a borrowed phone to stay in touch with his sister abroad. If the guidelines are too rigid, they’ll block access to education and support networks. If they’re too loose, they’ll leave children exposed to predators who know how to slip through the cracks.
The CA has said it will publish a public consultation draft in July. That’s the window. And civil society is ready.
What’s Next?
By late July, the Communications Authority of Kenya must respond to the memorandum. The next milestone? A public forum in August, where parents, tech firms, and children’s rights groups will debate revisions. If the CA ignores the concerns, legal challenges could follow, possibly brought by the Kenya Human Rights Commission or litigated before the Children’s Court.
And if the guidelines go into effect as written on October 29, 2025? The real test begins then: will enforcement be consistent? Will complaints be acted on within 72 hours, as recommended by the UN? Will data on removed content be made public? Those answers will determine whether Kenya becomes a regional leader — or another cautionary tale.
Frequently Asked Questions
How will age verification work for children without official ID in Kenya?
The guidelines don’t specify, but experts warn that requiring national ID cards would exclude millions of children, especially in rural and refugee communities. Alternatives under discussion include AI-based age estimation from behavioral signals, parental consent portals, and trusted third-party verification through schools, but no consensus has been reached. Without a clear solution, enforcement could become discriminatory.
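To see why the design choice matters, here is one hypothetical shape a tiered verification flow could take: try the strongest available signal first, fall back to weaker ones, and fail safe by treating unverifiable accounts as minors’. None of these methods or names come from the draft; they are assumptions for the sketch.

```python
from typing import Callable, Optional

# Hypothetical verifiers: each returns an age band ("under_18" / "adult"),
# or None if it cannot decide. The draft guidelines name no such methods.
def check_national_id(user_id: str) -> Optional[str]:
    return None  # placeholder: query an ID registry where documents exist

def check_school_attestation(user_id: str) -> Optional[str]:
    return None  # placeholder: trusted third-party verification via a school

def check_parental_consent(user_id: str) -> Optional[str]:
    return None  # placeholder: a guardian vouches through a consent portal

VERIFICATION_CHAIN: list[Callable[[str], Optional[str]]] = [
    check_national_id,         # strongest signal, but excludes undocumented children
    check_school_attestation,  # covers children in formal education
    check_parental_consent,    # last resort before defaulting
]

def resolve_age_band(user_id: str) -> str:
    """Run verifiers in order of strength; fail safe if none succeeds."""
    for verify in VERIFICATION_CHAIN:
        band = verify(user_id)
        if band is not None:
            return band
    # Unverifiable users get a minor's protections, not an adult's freedoms.
    return "under_18"
```

The fail-safe direction is the point: when age cannot be established, uncertainty should raise protections rather than lower them.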
What happens if a company like Safaricom doesn’t comply by October 2025?
The draft doesn’t outline fines or license suspensions, which is a major criticism. Currently, the CA can issue warnings and publish non-compliance lists — but has no power to levy financial penalties. Civil society is pushing for penalties of up to 5% of annual revenue, similar to the EU’s Digital Services Act. Without real consequences, many firms may delay compliance until forced by public pressure or lawsuits.
Are foreign platforms like TikTok and YouTube bound by these rules?
No — not directly. Since they aren’t licensed by the CA, they’re not legally required to follow the guidelines. But the memorandum urges the CA to negotiate with international platforms through bilateral agreements and to require app stores like Google Play and Apple App Store to enforce compliance for apps available in Kenya. Without this, children will still be exposed to unregulated content.
How does this affect children’s right to free expression online?
The guidelines explicitly protect children’s right to access information — but critics fear overblocking. Filters designed to stop CSAM might also block educational content on sexual health or LGBTQ+ topics. The CA has promised to use “proportionate” tools, but without independent oversight, there’s no guarantee. A 2024 UNICEF study in Kenya found 43% of teens had been blocked from accessing health resources due to overzealous filters.
Why is this timeline so tight — only six months to comply?
The CA argues that tech evolves fast, and delays risk more children being harmed. But small developers and local startups say six months is unrealistic. Building age verification and safety tools from scratch takes 12–18 months. The memorandum requests a phased rollout, with SMEs getting an extra six months. Without flexibility, many local tech innovators could be forced out of the market — reducing competition and innovation in child-safe apps.
Who is monitoring compliance after October 2025?
The CA’s Consumer Protection Unit is designated as the lead, but it currently has only 12 staff handling all ICT compliance issues. Civil society is demanding an independent Child Online Safety Monitoring Office with at least 30 specialists, including data analysts and child psychologists. Without adequate staffing, quarterly reports will become paperwork exercises — not real oversight.