
UAE Federal Decree-Law on Child Digital Safety: A New Legal Framework for Protecting Children Online
The UAE has taken a significant step towards protecting children from the risks of the digital environment with the passing of Federal Decree-Law No. 26, which came into force in early January this year. The Child Digital Safety (CDS) Law marks the first comprehensive legislation in the Emirates dedicated specifically to child safety in the digital age, and it establishes a wide range of obligations and regulations with which digital platforms, internet service providers, and caregivers alike must comply.
The CDS Law is a response to the rapid expansion of digital services and children's consequent increasing exposure to online content, recognising the need for a robust legal structure that not only restricts harmful material but also promotes responsible use of the digital space. It reflects the UAE's commitment to safeguarding children's physical, psychological, and moral well-being while balancing technological innovation and digital access.
Scope and Applicability
The CDS Law currently sets out foundational principles rather than exhaustive rules; clarifications and detailed compliance requirements will follow in its implementing regulations. The law applies to a broad array of digital actors and entities, the most notable of which are as follows:
i. Digital Platforms operating within the UAE or targeting users in the UAE. This extends to websites, search engines, social media, messaging applications, streaming services, online gaming, e-commerce, and similar services.
ii. Internet Service Providers (ISPs) licensed under UAE telecommunications law, which will be monitored and brought under scrutiny under the new law. The only providers currently licensed are du and Etisalat (e&).
iii. Parents and Guardians, who are also brought within the ambit of the law and vested with responsibilities, as elaborated below, although the law has not yet clarified how such obligations will be put into practice.
Core Obligations for Platforms
Having defined its regulatory reach, the CDS Law shifts focus to how digital platforms must actively safeguard child users. The legislation introduces a set of baseline obligations that prioritise prevention and accountability, including:
i. Age Verification and Classification – Digital platforms must implement age verification tools and mechanisms and subsequently restrict access to content based on age. Platforms will eventually be classified according to their risk level, determining the extent and nature of the obligations they must follow.
ii. Gambling and Commercial Gaming – Platforms must take adequate measures to prevent children from accessing online commercial games, whether via advertising or promotions. ISPs must abide by the same restrictions.
iii. Harmful Content and Controls – Platforms must put blocking and filtering systems in place to prevent children from accessing harmful digital content. This covers tools for content moderation, age-based content classification, and controls on targeted advertising that could exploit or manipulate minors.
iv. Child Data Privacy and Consent – The law generally prohibits platforms from processing personal data of children under 13 years of age unless specific custodial consent is obtained and usage is transparent. Data collection for targeted advertising or behavioural profiling is expressly prohibited in such cases.
v. Reporting and Takedown of Harmful Content – Platforms must implement user-friendly mechanisms for reporting harmful content, including child sexual abuse material (CSAM), and must disclose their content moderation policies and enforcement actions to the competent authorities.
vi. Custodian Controls – Platforms must offer features allowing parents and guardians to set daily time limits, monitor usage, and manage account settings for minors' profiles, reinforcing custodians' oversight of children's digital interactions.
These obligations collectively signal a shift from a self-regulatory model towards a structured regulatory environment that emphasises proactive child protection and operational transparency.
Obligations for Internet Service Providers
ISPs are also tasked with distinct responsibilities under the new law, including:
i. Activation of network level content filtering systems consistent with the law’s prohibition of harmful content.
ii. Equipping their respective services with parental monitoring and control tools.
iii. Reporting Child Sexual Abuse Material (CSAM) and harmful content to the responsible authorities, along with information on the involved parties.
These requirements extend ISPs' obligations beyond connectivity to an active role in enhancing the safety of child users, particularly with regard to controlling access and supervising use.
Obligations for Parents and Guardians
The law sets out specific expectations for parents and guardians of minors (under 18), recognising that legal protection must extend beyond platform compliance to incorporate responsible parenting practices. Notably, parents must:
i. Monitor children’s digital activities.
ii. Use parental controls to prevent access to inappropriate content.
iii. Refrain from creating or facilitating children's access to age-inappropriate platforms.
iv. Report harmful or exploitative content to the competent authority.
While the enforcement mechanisms for custodial duties are yet to be clarified by the regulations, these provisions underscore the law’s collaborative approach to child safety.
Regulation and Enforcement
The Telecommunications and Digital Government Regulatory Authority (TDRA) has been tasked with overseeing compliance and enforcement of the CDS Law, supported by a Child Digital Safety Council chaired by the Minister of Family. This Council coordinates national efforts, develops policy, and engages in public awareness and regulatory development to protect children online.
Non-compliance may lead to a range of administrative sanctions, including blocking or closure of platforms, removal orders, and other penalties that will be elaborated in future regulations. Penalty structures are expected to be risk-based, with greater obligations and sanctions for platforms that pose higher risks to children.
What does this mean for businesses?
Businesses operating in the digital sector should begin preparatory compliance adjustments without delay. These include assessing products and services that could potentially impact minors, and reviewing and updating privacy policies, age-verification systems, and content moderation procedures.
Furthermore, businesses should ensure their operations are transparent and establish mechanisms for reporting harmful content to the authorities. They should also prepare to comply with the implementing regulations, which will be issued soon, including the forthcoming platform classification system that will detail risk categories and specify the remaining obligations.
Conclusion
The Child Digital Safety Law marks an important step in the UAE's wider effort to protect children in an increasingly digital society. The implementation of the law reflects a broader national commitment to family wellbeing and quality of life, in line with the declaration of 2026 as the Year of the Family. Furthermore, it builds on existing digital safety agreements that emphasise cooperation between government authorities, private sector actors, and the community. Together, these measures reinforce child online protection not merely as a regulatory requirement, but as a shared societal responsibility embedded within the UAE's long-term social and digital policy framework.
References
- UAE Government, Federal Decree-Law on Child Digital Safety (UAE Legislation Portal, 26 December 2025) https://uaelegislation.gov.ae/en/news/uae-government-issues-a-federal-decree-law-on-child-digital-safety accessed 19 January 2026.
- Baker McKenzie, United Arab Emirates issues new Child Digital Safety law (Baker McKenzie, 8 January 2026) https://www.bakermckenzie.com/en/insight/publications/2026/01/uae-issues-new-child-digital-safety-law accessed 19 January 2026.
- UAE issues law to protect children from harmful digital content (The National, 26 December 2025) https://www.thenationalnews.com/news/uae/2025/12/26/uae-issues-law-to-protect-children-from-harmful-digital-content/ accessed 19 January 2026.
- UAE Federal Decree-Law on Child Digital Safety (Khaleej Times) https://www.khaleejtimes.com/uae/government/federal-decree-law-on-child-digital-safety accessed 19 January 2026.
- UAE is tracking 4,000 digital platforms as new law aims to protect children (The National, 8 January 2026) https://www.thenationalnews.com/news/uae/2026/01/08/uae-tracking-4000-digital-platforms-as-new-law-aims-to-protect-children/ accessed 19 January 2026.
- UAE Issues Federal Decree-Law on Child Digital Safety (Emirati Times, December 2025) https://emiratitimes.com/uae-child-digital-safety-law/ accessed 19 January 2026.
- UAE Federal Decree Law No. 26 of 2025 on Child Digital Safety: Liability for Digital Platforms and Internet Service Providers (BSA LAW, January 2026) https://www.bsalaw.com/insight/uae-federal-decree-law-no-26-of-2025-on-child-digital-safety-liability-for-digital-platforms-and-internet-service-providers/ accessed 19 January 2026.
- UAE Child Safety: New Federal Law Shields Kids Online (Gulf News) https://gulfnews.com/uae/uae-introduces-federal-law-to-safeguard-children-online-1.500390837/ accessed 19 January 2026.
FAQs
What enforcement powers do the authorities have?
Competent authorities may issue binding directions, require corrective measures, suspend services, or impose penalties for non-compliance.
What must platforms do to comply?
The law requires platforms to adopt reasonable technical and organisational measures to reduce risks to children, which may include changes to content moderation systems, data practices, or platform design.
What counts as harmful content?
Harmful content includes material that threatens a child's physical, psychological, moral, or social wellbeing, as determined by regulatory authorities and implementing regulations.



