Social media platforms - in Hungary primarily Facebook - have gained a significant role in shaping public debate. They enable large numbers of people to share, encounter and comment on each other's opinions. However, they operate according to market goals, and they are most attractive to users when the content they display reflects each user's own world view. Facebook shows us what we already agree with. This encloses people in bubbles, leaving them less and less able to understand one another.
The content circulating in these bubbles includes not only opinions but also facts and, often, misinformation that also influences how voting rights are exercised. State regulation of social media platforms has been on the agenda in many countries at least since the 2016 presidential election in the USA. In Hungary, the governing party announced that it would pass a "Facebook act" this spring, but according to the latest news it will instead postpone national legislation until the regulation being prepared by the EU comes into force.
State regulation of social media platforms affects fundamental rights. We therefore considered it important to set out, in a statement, the cornerstones with which any such regulation should comply. The task is complicated by the fact that the platform operators are private entities: any state intervention that affects the freedom of expression of users also interferes with the freedom of the entities operating the platforms.
Platform operators are criticised mostly for their decisions affecting user content: for deleting content, or, on the contrary, for failing to delete content flagged by other users. In our view, however, it is not the task of state regulation to decide under what circumstances operators may delete content and profiles, or under what conditions they may show content to users. To secure users' fundamental rights, the first requirement is transparency in the platforms' operation. Regulating their internal workings is the operators' own task, but users must be able to learn the background of decisions that affect them and their rights. Only then can a user make an informed decision about whether the risks of using social media are acceptable.
Freedom of expression faces a heightened danger if future state regulation sanctions social media platforms for failing to delete certain content. The conduct and omissions that trigger sanctions must be precisely defined, and it is of crucial importance that the specific circumstances of each case can be taken into account. If it is not completely clear and foreseeable when sanctions apply, platform operators will be pushed towards deleting, in case of doubt, all content to which sanctions might apply.
State intervention may aim only at enforcing the removal of unlawful content. Even then, platforms cannot take over the task of courts by performing activities that require legal competence. Fundamental rights must enjoy the same protection on social media platforms as in offline communication: a court decision is necessary before the state may enforce the removal of allegedly unlawful content.
Platform operators may decide to remove content that is harmful but not unlawful, provided the criteria for doing so are transparent. State regulation may provide users with remedies against such decisions, but judicial remedies must remain accessible in these cases too. The argument that the court system could not cope with the resulting caseload does not hold: the burden could be managed by establishing online courts specialised in online content disputes, deciding without personal attendance, through electronic communication - but, as we emphasise, with decisions made by humans, not by an algorithm.
The criteria set out in our statement are minimum requirements: they describe not the ideal, but the absolutely necessary level of protection. We had to formulate them in circumstances where the domestic regulation is being prepared behind closed doors. What is known of the planned regulation comes only from press releases and obscure references by government actors. Yet the legitimacy of a regulation prepared without public discussion is questionable: a basic condition of transparent legislation is that civil society and those concerned - beyond the platform operators, anyone who may be affected as a user - have easily accessible and diverse opportunities to offer substantive opinions.
This is why we consider it important to share our statement now, offering viewpoints for the law being prepared: since the government has decided to wait for the EU regulation, there is enough time for the Hungarian regulation to be drafted after a substantive public discussion.