Two-Minute Recap of Platform Law Developments - 2023 Summer Edition
Published on 25 September 2023.
CJEU ruled that: “A national competition authority can find, in the context of the examination of an abuse of a dominant position, that the GDPR has been infringed”
On July 4, the Court of Justice of the European Union (“CJEU”) announced its decision in the case involving Meta Platforms Ireland (“Meta”) and the German Federal Cartel Office (“Bundeskartellamt”). According to this decision, competition authorities in European Union member states can evaluate the compliance of activities not only with competition law but also with other legal regulations, including personal data protection law, when investigating the abuse of a dominant position.
Prior to this Decision, the German Federal Cartel Office had found that Facebook had unlawfully processed personal data. The investigation revealed that data related to users’ visits to other websites, apps and their engagement with other online services within the Meta group, including Instagram and WhatsApp (“non-Facebook data” or “off-Facebook data”), was collected to deliver personalized advertising messages to users. Consequently, Facebook was prohibited from processing non-Facebook data of users in Germany under general terms without their consent. This action was deemed non-compliant with the European Union’s General Data Protection Regulation (“GDPR”) and was seen as an abuse of Meta’s dominant position in the online social networking market.
In the lawsuit challenging the decision of the German Federal Cartel Office, the Higher Regional Court of Düsseldorf sought guidance from the CJEU on whether competition authorities have the jurisdiction to examine compliance with personal data protection legislation. On July 4, the CJEU clarified that competition authorities:
• may inspect compliance with other legal regulations, including personal data protection legislation, when determining whether undertakings are abusing their dominant position; and
• should coordinate with data protection authorities, ensuring that previous decisions made by those authorities are binding on competition authorities.
Furthermore, the CJEU ruled that Facebook’s interest in funding its operations through personalized advertising does not justify the processing of data without the consent of users.
Following the CJEU’s decision, the Higher Regional Court of Düsseldorf is expected to conclude the case between Meta and the German Federal Cartel Office.
US Federal Trade Commission has initiated an investigation into OpenAI
On July 10, the US Federal Trade Commission (the “FTC”) imposed administrative fines on Meta, Amazon, and Twitter for breaches of consumer protection laws.
Concurrently, the FTC initiated an investigation into OpenAI, the organization behind ChatGPT, to determine whether it violated consumer protection laws by putting personal data and reputations at risk. During this investigation, OpenAI was asked to address complaints that ChatGPT had made “false, misleading, derogatory and harmful” statements about individuals.
OpenAI’s CEO, Sam Altman, expressed his disappointment with the FTC’s investigation, asserting that it was not intended to build trust. However, he emphasized the importance of ensuring the security and user-friendliness of their technology, and he also expressed confidence in the company’s compliance with the law.
The investigation is currently underway, and a decision from the FTC is expected in due course.
The Norwegian Data Protection Authority (“Datatilsynet”) has banned Meta from using personalized ads
Following the CJEU’s determination that Meta’s processing of Facebook users’ personal data breached the GDPR, the Norwegian Data Protection Authority emerged as the first national data protection authority to rule that online behavioural advertising on Facebook and Instagram was in violation of the GDPR.
In the relevant decision, Meta was found not to have secured valid consent from its users for the lawful processing of their personal data. An advertising ban was therefore imposed on both the Facebook and Instagram platforms pursuant to the decision. The advertising ban is set to take effect on August 4 and will last for at least three months. The prohibition will remain in effect until Meta makes the required arrangements to align with Norwegian legislation. If Meta fails to comply, it risks an administrative fine of approximately EUR 90,000 for each day the violation continues.
First “Deepfake” trial in the Netherlands
In the Netherlands, a criminal case has been initiated against an individual who produced pornographic content using the face of television presenter Welmoed Sijtsma through “deepfake” technology. The case is significant as the first in the Netherlands to address the infringement of personal rights through the use of “deepfake” technology. In a statement released by the Public Prosecution Office in Amsterdam, it was stated that while the identity of the accused remains undisclosed, they will stand trial on charges of “creating and disseminating fabricated sexual images”.
Meta will pay $68.5 million to Instagram users living in Illinois as part of a settlement
A settlement has been reached in a class-action lawsuit accusing Meta of infringing user privacy by collecting and storing biometric identifiers and data in violation of the Illinois Biometric Information Privacy Act, which prohibits companies from collecting and storing such information without informed consent. As a result of the class-action lawsuit, known as Parris v. Meta, Meta has agreed to pay $68.5 million to Instagram users residing in Illinois. The class action covers users who used the app from August 10, 2015 through August 16, 2023. While Meta has asserted that it is not admitting any fault in the matter, it agreed to a settlement with the parties involved to avoid prolonging the legal process.
Joint statement on data scraping and privacy risks by data protection authorities
On August 24, 2023, data protection authorities from 12 countries released a joint statement, published on the Information Commissioner’s Office’s (“ICO”) website, warning against the privacy risks linked to the scraping of personal data from social media platforms and other online sites.
Data scraping is, in general terms, the automated extraction of data from the web. The joint statement aims to:
• articulate the primary privacy concerns linked to data scraping,
• establish guidelines for social media platforms and other websites to safeguard individuals’ personal data against unauthorized scraping in alignment with regulatory standards, and
• provide recommendations for individuals to mitigate privacy risks stemming from scraping activities.
The joint statement noted that data scraping can lead to several privacy risks, including targeted cyber-attacks, identity fraud, surveillance, unauthorized political or intelligence activities, as well as unsolicited direct marketing or spam.
European Commission invites public input on consumer profiling reporting template
The European Commission is seeking feedback from the public on a proposed reporting template designed for consumer profiling techniques, as outlined in Article 15 of the Digital Markets Act (“DMA”). This template is intended for designated gatekeepers to utilize when submitting their annual reports, with the goal of enhancing transparency, fairness, and competition in the market.
The European Commission has opened the template for public feedback, with the intention of enhancing and updating it based on the feedback received. It encourages participation from various stakeholders, including companies, consumer advocacy groups, and data experts. Submissions in any official EU language will be accepted until September 15, 2023. The non-confidential version of the decisions will be made available on the Commission’s DMA website once confidentiality issues have been resolved.
Google strengthens privacy tools to protect personal data
On August 3, 2023, Google introduced improvements to its privacy tool “Results about you”, designed to help users identify and remove personal information from search results. This update comes in response to significant privacy concerns and underscores Google’s commitment to facilitating better management of personal information on the internet. The tool enables users to request the removal of personal details that may appear in search results, such as photographs, phone numbers, email addresses, or physical addresses. Notably, the tool remains functional even if you have removed an image yourself: if someone else displays your image on their website, you can still request its removal from Google Search. However, this policy does not apply to content that you are “currently commercializing”, that is, content you have chosen to advertise on the internet.
Google faces copyright lawsuit in Denmark
Google is facing a copyright lawsuit filed by the Danish Media Association (“Danske Medier”) on behalf of Jobindex, an online job search platform based in Denmark. The lawsuit, initiated on August 24, alleges that Google transferred Jobindex’s job postings to its servers without permission and seeks compensation for copyright infringements. This is the first case to be filed in the Danish courts under the new EU copyright regulations, enacted in 2021, which attribute liability to platforms for the content they host. Last year, Jobindex also lodged a complaint with the European Commission against Google for Jobs, Google’s job search service, for abusing its dominant position, violating the GDPR and transferring job postings to its servers without permission.
ICO and CMA have expressed concerns about online design that manipulates consumers into sharing additional personal data
On August 9, 2023, as stated on the ICO’s website, the ICO and the Competition and Markets Authority (“CMA”) warned businesses, urging them to stop using manipulative website designs that lead consumers to share more personal data than they intend.
The ICO and CMA are working together to put an end to these harmful design practices. Default settings, bundled privacy choices, and overly complex privacy controls reduce consumers’ control over their personal data and encourage them to disclose more data than they originally intended. These techniques push consumers into making decisions about their personal data while browsing websites. A common example is consumers giving up control over their advertising preferences by accepting cookies on websites. Lack of control over personal data not only poses potential harm to consumers but also undermines competition in the market. In response to these concerns, the CMA will launch the “Rip Off Tip Off” campaign, aimed at educating consumers about these harmful techniques, while the ICO will take enforcement action to protect personal data.