Major digital platforms designated
The European Union's Digital Services Act (DSA) entered into force on November 16, 2022. It applies to all digital services through which goods, services, or content are offered to consumers, and it imposes extensive new obligations on online platforms to reduce harm, combat online risks, and better protect users' rights. The largest platforms carry the most extensive obligations. On April 25, 2023, the European Commission (EC) designated which services fall into this category: 17 very large online platforms and 2 very large online search engines, each with at least 45 million monthly active users. They are as follows:
- Very large online platforms: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube, Zalando
- Very large online search engines: Bing, Google Search
The designations are based on the user numbers that the services had to publish by February 17, 2023. Once designated, a company has four months to begin fulfilling all of the new obligations arising from the Digital Services Act. These include, for example:
Greater user empowerment:
- users receive clear information about why particular content is recommended to them and have the right to opt out of recommender systems based on profiling;
- users can easily report illegal content and platforms must handle such reports diligently;
- advertisements may not be targeted on the basis of sensitive user data (for example, ethnic origin, political views, or sexual orientation);
- platforms must label all advertisements and inform users on whose behalf the advertisement is presented;
- platforms must provide an easily understandable and clear summary of their terms of use in the languages of the Member States in which they operate.
Better protection of minors:
- platforms must redesign their systems to ensure a high level of privacy, safety, and security for minors;
- targeted advertising aimed at children on the basis of profiling is no longer permitted;
- specific risk assessments, including assessments of negative effects on mental health, must be submitted to the EC within four months of designation and made public no later than one year afterwards;
- platforms must redesign their services, including user interfaces, recommendation systems and terms of use, to reduce these risks.
More careful content moderation and less disinformation:
- platforms and search engines must address the risks associated with the distribution of illegal online content and the negative impact on freedom of expression and information;
- platforms must have clear terms of service and must enforce them diligently and non-arbitrarily;
- platforms must have a mechanism to allow users to report illegal content and must respond promptly to such reports;
- platforms must analyse their specific risks and establish risk mitigation measures, for example to prevent the spread of disinformation and to prevent inauthentic use of the service.
Greater transparency and accountability:
- platforms must ensure that independent external audits are carried out on their risk assessments and on their compliance with the obligations of the Digital Services Act;
- platforms must give researchers access to publicly available data;
- a special access mechanism for verified researchers will be created later;
- platforms must publish data repositories related to all advertisements served through their interface;
- platforms must publish transparency reports on content moderation decisions and risk management.
No later than four months after being notified of the designation decision, designated platforms and search engines must adapt their compliance systems, resources, and processes; establish an independent compliance function; and carry out their first annual risk assessment and report on it to the EC.
Compliance with all of these obligations is enforced through a pan-European supervisory structure. Under the regulation, the EC itself is the competent authority for supervising designated platforms and search engines. The regulation also provides for a supervisory framework formed by the Digital Services Coordinators of the Member States: national bodies that EU Member States must establish by February 17, 2024. The coordinators are also responsible for supervising smaller platforms and search engines. By that same date, all other platforms must likewise fulfil their obligations under the Digital Services Act and offer their users the protections and safeguards it provides.
Text of the Digital Services Act in the Official Journal of the European Union