The Federal Competition and Consumer Protection Commission (FCCPC) of Nigeria has levied a $220 million fine against Meta, alleging that the privacy policy of its messaging platform, WhatsApp, breaches the country’s data and privacy laws.
According to a press release issued by the FCCPC on Friday, July 19, the fine is due to Meta’s violations of the Federal Competition and Consumer Protection Act (FCCPA) 2018, the Nigeria Data Protection Regulation 2019 (NDPR), and other pertinent regulations.
The FCCPC outlined several specific violations, including “abusive and invasive practices against data subjects/consumers in Nigeria, such as appropriating personal data or information without consent, discriminatory practices against Nigerian data subjects/consumers, and disparate treatment of consumers/data subjects compared to other jurisdictions with similar regulatory frameworks.” The commission also accused Meta of abusing its dominant market position by enforcing unscrupulous, exploitative, and non-compliant privacy policies that appropriated consumer personal information without giving users the option to consent to, or withhold consent from, the gathering, use, and sharing of their personal data.
In response to the fine, a WhatsApp spokesperson told Bloomberg, “In 2021, we explained to users globally how interactions with businesses would work, and despite initial confusion, these features have become quite popular. We disagree with today’s decision and the fine, and we are appealing the decision.”
This fine comes shortly after Meta announced it would suspend its generative artificial intelligence (AI) tools in Brazil following objections from one of the country’s regulators regarding the company’s privacy policy. Earlier in July, Brazil’s National Data Protection Authority suspended Meta’s new privacy policy, demanding the exclusion of sections related to the processing of personal data for AI training purposes.
In a related development, Meta paused the launch of its AI assistant, Meta AI, in Europe in early June. The decision came after the Irish Data Protection Commission, acting on behalf of European data protection authorities, asked Meta to delay training its large language models on content shared publicly by adults on Facebook and Instagram.
These regulatory actions highlight the growing scrutiny and challenges that tech giants like Meta face in navigating privacy laws and data protection regulations across different jurisdictions.