Europe Toughens Rules On Large Search Engines And Online Platforms – Analysis
By Goce Trpkovski
Online services businesses, from hosting service providers to search engines such as Google or social networks run by companies like Meta and Twitter, will need to change the way they work in the European market when two new acts published in the Official Journal of the European Union, the Digital Services Act and the Digital Markets Act, enter into effect.
The first will regulate how providers manage the content published through their services, while the second focuses on their market behaviour and their relations with competitors, users and the businesses operating through their platforms.
The regulations had been in preparation for a long time, and the final versions were published on October 27 last year. Implementation will start gradually this year and will take 12 months, giving companies time to adapt to the new rules.
“The services covered by these acts seriously influence our way of life. Since everything is digitalized, there are big changes in the way we communicate, work, shop, educate ourselves, get informed, get entertained, and most importantly, the way we enter the public sphere to influence social and political processes,” Snezana Trpevska, a researcher at the Skopje-based RESIS Institute, who follows communication and media policies, told BIRN.
The acts will basically cover all companies providing online services such as search engines and social media platforms, but there will be a special set of rules for very large search engines and platforms. As defined in the acts, a search engine or platform will be considered “very large” if it has more than 45 million monthly active users, which is 10 per cent of the EU’s 450-million population.
“An environment was created where a few very large private platforms control ecosystems in the digital economy and play the role of ‘gatekeepers’ that dictate the terms and conditions in the digital markets, imposing their own rules. These rules create unfair conditions for the businesses that use the platforms, leaving limited opportunities for the users and customers,” Trpevska added.
According to her, the benefits of online services are undisputed because of the opportunities they create, but they can also harm economies and societies when their algorithms are abused to spread disinformation or facilitate illegal trade.
Improving security and curbing illegal content
One of the main goals of the new acts is to improve security online and to limit the spread of illegal content, such as dangerous disinformation and hate speech.
All providers of digital services, regardless of which category they fall into, will need to fulfil minimal requirements, such as being transparent about how information is spread, respecting the fundamental rights of their users, cooperating with relevant national institutions, and providing a point of contact and an official legal representative.
The very large online platforms and search engines, however, those used by more than 10 per cent of the EU’s total population, will be under direct surveillance of the European Commission, Charles Manoury, a digital economy press officer for the European Commission, told BIRN.
“The very large online platforms and very large search engines will have four months to adapt to the Digital Services Act upon being designated by the European Commission,” Manoury said.
He added that by January 2024, governments of the EU member states will need to impose new rules on regulating smaller platforms and non-systemic aspects of the work of larger platforms.
According to Trpevska, the large online platforms and search engines will be subjected to the strictest public scrutiny to make sure that they respect the rules.
“Those platforms carry the biggest risk of spreading disinformation, hate speech and other harmful or illegal content. The member states will have the major role in overseeing their operations, with the support of the European Digital Services Board, while surveillance of the largest platforms will be an obligation of the European Commission,” she added.
Platforms will not be obliged to check content before publishing it, nor will they bear responsibility for what their users publish. Their main obligation will be to react immediately if certain content is reported as illegal.
“The spread of disinformation is a profitable business for online platforms. The more provocative and shocking the content spread through the platforms, the longer the users stay on them. Besides, the automated ‘recommendation systems’ and the algorithms used by the platforms are designed to serve users content similar to what they’ve already viewed,” Trpevska explained.
She recalled the testimony to the European Parliament of a Meta whistleblower who revealed how the platform encourages the spread of disinformation and violent content through its content-recommendation algorithms.

The whistleblower said that when extreme content and disinformation appear in the feed, users tend to view them more, and therefore stay longer on the platform, generating more income.
The rules on content will apply to all platforms, regardless of whether some of them already have pre-publishing checks. Some platforms have introduced active content control and are able to refuse to publish certain content or remind users that it is not allowed under their rules. It was unclear whether they could be held responsible if illegal content slips through these checks, but the new act establishes that they will also enjoy immunity.
Besides having to react immediately when requested, the very large online platforms and search engines will be obliged to publish reports on moderating content and to allow the European Commission access to all data needed to monitor whether they’re compliant with the Digital Services Act. In some cases, that means allowing access to the algorithms and their premises. They will also need to pay a supervisory fee to the Commission. Fines for non-compliance can reach 6 per cent of their global revenue.
Companies like Meta and Google, for example, have reported annual revenues of 120 to 260 billion US dollars, which means the maximum 6 per cent fine would amount to roughly 7 to 16 billion dollars, comparable to the annual GDP of North Macedonia.
‘Trusted flaggers’ to report illegal content
One of the categories introduced with the act is the trusted flagger. Trusted flaggers are entities with proven expertise in identifying illegal content, who are independent of any platform regulated by the DSA. The content reported by them will have priority in processing, without unnecessary delays.
Manoury from the EC said that the national coordinators for digital services in each member state will determine the trusted flaggers.
A court or another relevant authority may order online service providers to take down certain content, but such an order must meet certain conditions. For example, it must define the territory on which the content cannot be shown, and it must include clear information, such as a URL and other data, that enables the company to locate and identify the content.
The online platforms are also obliged to notify the authorities once they have executed a takedown order.
Trpevska said the two acts do not only impose restrictive measures but also encourage self-regulation, which she called a better approach because it is less threatening to freedom of expression online.
“It’s not a good idea to let the authorities decide whether certain content is disinformation or not, and to direct takedown demands towards Google or Meta. That would be a threat to democracy. The restrictive approach should be reserved for extreme content,” she added.
Greater scrutiny of traders
Online platforms that enable their users to make contact with traders of various goods and services and to shop online will need to require traders to provide essential information about their business before admitting them. The design of their pages should also enhance transparency and allow users to see all necessary data about a trader.
Online service providers will also be obliged to notify consumers if it is revealed that an illegal product or service has been offered through their platforms. They must also reveal the identity of the traders offering such products or services and inform consumers about what action they can take to get reimbursed.
The companies must also have a high level of security for minors and will not be allowed to profile them as potential consumers and target them with ads.
The Digital Markets Act regulates the trade conducted through online platforms. Users will also have the option to turn off targeted ads.
The companies will need to maintain interoperability with other systems and allow alternative payment systems, as well as to allow business users access to the data on their operations.
The European Commission will assign special teams that will monitor implementation of the two acts.
Macedonian users can expect benefits
The Digital Services Act and the Digital Markets Act are a part of the EU legislation, and therefore North Macedonia will need to implement them at some point, especially the DSA, Trpevska said.
However, it is hard to estimate when this obligation would need to be fulfilled. Before that, Trpevska added, broad analyses and public consultations need to be conducted with various bodies and institutions, as well as with civil society.
EU member states have, as noted earlier, until 2024 to adapt their national legislation to the acts. The largest online platforms and search engines, however, will need to abide by the strict obligations imposed on them from the middle of 2023 onwards. Their users may expect to feel the benefits of these acts from then on.
“Macedonian users will also feel the benefits from the new rules because the online platforms operate on a global scale. We can expect less illegal content and products, simpler explanations on how the algorithms work, as well as less disinformation and manipulative content,” Trpevska concluded.