Statement on the minimum requirements for the regulation of social media platforms

The aim of the following statement is to lay out the fundamentals of the regulation of social media platform managers, which has been heralded in the press many times [1] and was promised for this spring (although it was not listed in the legislative programme for this spring [2]), and the cornerstones with which, under international hard and soft law, any state regulation must comply [3]. The requirements defined in this statement are general for the time being, since every question can only be answered once the actual rules are known. Their aim is to set out the minimum protection with which the regulation must in any event comply. The regulation may therefore provide more guarantees, but not fewer, in order for the relevant fundamental rights to prevail.

Social media platforms have gained a significant role in hosting public debate. These interfaces are accessible to everyone at relatively low entry cost, and they make it possible for people to understand, share, and comment on each other’s opinions. Alongside this democratising effect, however, social media also acts as a kind of filter. Social media platforms operate along free market lines; their aim is to attract and retain users. They make accessible the kind of content users agree with most, since people prefer content that reflects their own world view. The bubble this creates undeniably contributes to us understanding each other less.

State regulation of how platforms function affects fundamental rights, but the issue is more complex than direct state interference with citizens’ freedom. On the one hand, platforms provide a space for others to express their opinions; on the other, platforms are private entities that may hold opinions of their own. Any state interference that affects the opinions expressed on a platform therefore also interferes with the freedom of the entities managing it.

The fundamental rights affected extend far beyond freedom of expression. For the managers of social media platforms to select content that interests the user, they have to know us.

The processing of data about us sustains the bubble that determines what information is available to us to confirm our views. Moreover, the bubble contains not only facts but also opinions, misinformation and hoaxes that influence the exercise of democratic participation rights very broadly, from freedom of information through freedom of expression to the exercise of voting rights.

In HCLU’s opinion, the primary goal of state regulation should not be to settle under which circumstances platform managers must provide information to users, under which circumstances they may delete content and profiles, or whether they must provide users with remedies in this regard. The duty of state regulation is to enforce transparency in the functioning of algorithms [4], ensuring that users can examine the background of decisions by platform managers that affect them personally. Regulation to this end is therefore absolutely necessary. The necessity of any regulation beyond transparency, however, should be strictly scrutinised.

Minimum requirements for the future Hungarian regulation:

  • First of all, the transparency of the platforms’ functioning shall be ensured in order to protect the fundamental rights of users.

  • The requirement of transparency shall also be fulfilled with regard to state interference.

  • The primary means of enforcing transparency is the publication of reports whose content is prescribed by the regulation.

  • The legitimacy of the regulation is ensured by involving the persons affected in its development, alongside a broad professional and public discussion.

  • The regulation shall fit into the framework set by international law and the European Union.

  • The personal scope and subject matter of the law package being prepared shall be clear.

  • It should be realistic and should keep in mind the criteria of practical applicability.

  • An impact assessment should precede and follow it, and real control mechanisms should ensure its compliance with the constitution.

  • The goal of the regulation is only legitimate if it complies with the relevant international human rights norms and guidelines.

  • The same level of fundamental rights protection shall apply to content on the platforms as to communication outside the platforms.

  • A court decision is necessary in order for the state to enforce the deletion of unlawful content.

  • The requirement of proportionality shall prevail with regard both to the behaviour sanctioned and to the sanctions introduced.

  • Should the regulation seek to hold platform managers legally responsible for failing to delete content, holding them responsible for third-party content is only proportionate if the platform explicitly interfered with the content, or if it failed to comply with a deletion decision reached by an independent control body in a fair procedure although it would have been able to comply.

  • If the regulation introduces sanctions against platform managers, judicial review of the sanction shall be made available.

  • If the regulation provides users with remedies against the actions of social media platform managers, judicial remedies shall be ensured against legal injuries.

I. The justification of the regulation

The first question arising in relation to the law package being prepared is the justification of the regulation: is it absolutely necessary for the Hungarian legislator to regulate this area at all? A further question is, if the legislator is committed to regulation that can be justified, how the enactment of the law package should be timed and what considerations should be kept in mind.

Among the matters to be regulated, the following frequently arise: state regulation of content deletion; remedies against the deletion of content; review of status decisions [5]; and enforcing the transparency of social media platforms. In our view, state regulation should establish the transparent functioning of platforms.

We are convinced that although the legislator could approach the matter from many angles, its primary duty is to enforce transparency. Although platform managers are mostly criticised for their decisions regarding content, this should not be the main focus of state regulation. Regulating the platform managers’ decisions on content directly interferes with the fundamental rights of users, as well as with the inner workings of the platforms. State interference should be avoided where possible and be present only to the extent absolutely necessary. Regulation should first settle the questions that can be settled without such direct interference, since the platform itself is a holder of fundamental rights. Obliging platform managers to host lawful or harmless content that they simply do not wish to host is a different level of interference from a state regulation that merely requires the platform to reveal how its algorithm functions. Enforcing transparency is the entry level of state regulation: a regulatory direction that involves the least possible interference, allows social media platform managers to regulate their own inner workings, but holds them responsible for that inner regulation.

In the present context, transparency should be understood as follows. It is the basic interest and right of users to know on what grounds platforms make decisions restricting their rights to communication and private life, whether internal review of complaints is possible, and whether decisions may be contested. International experience shows that, without external guarantees, platforms do not adequately comply with transparency requirements, and it remains solely their decision whether to take any action to protect users’ fundamental rights. If it is transparent to the wider public whether they took such actions or failed to take them, users can make an informed decision on whether the risks of using a given social media platform are acceptable to them. Ensuring this informed user status is not an optional legislative duty. The state’s possibilities for regulating the substantive decisions of platform managers are more limited, but enforcing transparency may also render such regulation unnecessary. In HCLU’s view, enforcing transparency is absolutely necessary, so external regulation securing transparency cannot be avoided. At the same time, this entails that all regulation beyond securing transparency must be viewed critically. Regulating content-related matters can only be allowed if platform managers fail to comply with the transparency requirements.

Beyond transparency, there are other areas where establishing regulation is not an option but an obligation for states under international law [6]. This does not mean, however, that the Hungarian legislator should regulate social media platforms at all costs. Regulating platforms raises complex jurisdictional questions and may give rise to cross-border legal disputes. In HCLU’s view, it would therefore be worthwhile to keep the regulation at the level of the European Union, or at least to await the EU regulation currently being prepared [7], so that the Hungarian regulation complies with EU law. Should the National Assembly disregard the characteristics of the EU regulation currently being prepared, the intention to regulate cannot be supported: if the law package does not fit into the framework set by EU law, it is not suitable to form a unified framework for the functioning of social media and hence for the rules applying to users.

It is already clear that the European Union is committed to regulation beyond the matter of transparency. In light of the above, how far this is justified may be professionally controversial. Nevertheless, it must be noted that two EU legal instruments covering security, the protection of users’ fundamental rights, and innovation, growth and competitiveness are currently being prepared:

  • the Digital Services Act (DSA) regarding online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms; and

  • the Digital Markets Act (DMA), regarding gatekeepers connecting businesses and consumers in the internal market.

For the Hungarian regulation, the most important fact is that the EU framework will ensure guarantees for users: as part of this, it will regulate users’ possibilities to seek review of the platforms’ content-restricting decisions and will prescribe broad transparency obligations for platform managers [8].

II. Importance of public participation

The future Hungarian regulation must not only keep in mind the framework set by the European Union; other circumstances are also relevant to the regulation’s legitimacy.

One of these requirements is legal certainty, which is part of the rule of law, one of the fundamental pillars of the Council of Europe. The rule of law also appears in the preamble of the European Convention on Human Rights, setting a framework for the interpretation of the Convention. This requirement is present not only at the level of principles but also in the specific rules of the international framework with which Hungarian legislation should comply. According to Article 10(2) of the European Convention on Human Rights, freedom of expression may only be restricted by law, and according to Article 19(3) of the International Covenant on Civil and Political Rights, freedom of expression may only be restricted by rules specifically provided by law. Although in practice any publicly promulgated legislation would satisfy this criterion, since the regulation will directly concern freedom of expression, in Hungary the regulation of social media is only conceivable by way of an act of parliament [9]. The reason lies in the guarantees built into the legislative process, which are meant to secure deliberation and the transparency of law-making. Consequently, not only the finished product of legislation but the process itself must meet certain criteria: the legitimacy of a regulation established without public discussion and without involving civil society and the persons concerned is questionable [10]. By persons concerned we mean, on the one hand, the social media platform managers and, on the other, anyone who may be indirectly affected as a user of the platforms. The broadest possible public shall be made aware of the possibility of participation, and easily accessible and diverse channels shall be provided for having a say in the content of the regulation, with time limits that allow opinions to be offered in substance.

III. Requirements regarding content

  • Requirement of legal certainty

The requirement of legal certainty prevails with regard to the content of the regulation if the regulation’s effect is foreseeable and it is defined precisely enough for the addressees of the norm to adjust their behaviour to it [11]. The addressee of the regulation on social media is primarily the manager of the platform providing the service, but the regulation also has an indirect effect on users, and its content will be enforced with the cooperation of (as yet unknown) law-applying bodies. The regulation therefore has to withstand the test of norm clarity with respect to multiple addressees. Obscure and ambiguous regulation may lead to the hollowing out of fundamental rights [12]. A uniform and predictable standard of application is therefore necessary, which can only develop against a sufficiently clear regulatory background.

Since the regulation may also cover content restriction, and may envisage sanctions against social media platforms for failing to remove content, it can pose a heightened danger to freedom of expression. Should it not be completely clear and foreseeable in which cases sanctions apply, platform managers may, in cases of doubt, be driven to delete all content that might attract sanctions in order to minimise their regulatory risk. The personal and material scope of the law package being prepared shall be clear. It shall be clearly defined whom the regulation concerns and whom it does not, including how social media is to be interpreted when applying the law. It shall also precisely define the behaviours and omissions falling within the regulation’s scope [13].

Besides the requirement of clarity, it is important that the regulation’s framework should enable the unique circumstances of each case to be analysed [14] and should not be casuistic. The requirement of legal certainty does not mean that discretion may not be granted to those applying the law where necessary; however, the wording of the law shall set out the criteria for and extent of that discretion [15].

The regulation shall be applicable in practice. For example, it may not set unrealistically short deadlines for the removal of content, platform managers shall not be obliged to fulfil impossible requirements, and the legal remedies provided for users shall be genuinely accessible. Platforms cannot carry out activities that enforce Hungarian law and require legal competence, as if they were replacing the courts. Beyond the fact that platform managers do not in practice possess the necessary competence, law enforcement activities may not, as a matter of principle, be outsourced to non-state entities. Platform managers shall therefore not be assigned a role in which they would, for example, have to analyse the statutory elements of criminal offences set out in the Criminal Code.

One method of testing applicability is an impact assessment carried out both in advance and afterwards, ideally two years after entry into force [16]. Failing to do this poses a significant danger to the prevalence of fundamental rights. Without a preliminary impact assessment, the extent of the restriction of the affected fundamental rights cannot even be estimated, since there is no baseline against which the effect of the regulation can be compared. Nor can the follow-up on a regulation already in force be neglected, since subsequent correction may be necessary to avoid restrictions beyond what is absolutely necessary. Furthermore, it remains essential that the regulation’s conformity with the constitution and international law can be enforced, and that effective mechanisms for its review are in fact available.

  • Regulation of online content

The regulation of content-related matters is secondary to transparency. Fewer possibilities are available to the legislator in this area, and this narrower framework is set by the following limitations.

It is important as a matter of principle that the regulation of social media shall enforce the same fundamental rights standards that apply to communication outside the platforms [17]. This also means that the proportionality of a restriction shall be judged by the same benchmark that applies to restrictions of freedom of expression offline. The requirement of proportionality shall prevail both for the scope of sanctioned behaviour and for the sanctions themselves. These requirements were developed and refined by international human rights organisations; in judging the proportionality of the regulation, the practice of these organisations, chiefly the European Court of Human Rights, shall therefore be applied. However, this will only be possible once the actual regulation is known.

If the restriction concerns freedom of expression, the primary criterion is that those applying the law shall interpret all limitations restrictively, in the narrowest possible scope, with regard to the role freedom of expression plays in constitutional democracies.

The relevant international conventions determine which legitimate aims a regulation limiting freedom of expression may serve. The legitimate aims may include

  • respect for the rights or reputation of others,

  • preventing the disclosure of information received in confidence,

  • protection of state or national security, of public order or public safety, prevention of disorder or crime, protection of public health or morals, or even

  • maintaining the authority and impartiality of the judiciary [18].

The scope of legitimate aims is narrowed down by the UN’s guidelines on business and human rights [19], which provide guidance for states and companies on the human rights responsibilities attached to the activities of business entities. The guidelines consolidate international and national legal standards and are relevant in practice to judging the activities of companies and, therefore, to their regulation. According to the UN Guiding Principles, social platforms bear a heightened responsibility to comply with international human rights norms in their internal rules, given their power and role in the public sphere and their effect on freedom of expression and the fundamental rights of users. The UN Special Rapporteur on freedom of expression has called attention in multiple reports to the fact that social platforms are also responsible for complying with international human rights norms in the course of their operations and for carrying out regular internal assessments of how their services affect users’ human rights. Discussing the requirements the UN Guiding Principles set for business entities is not part of the present statement, which deals with the role of the state. However, the first pillar of the guidelines states that it is the state’s duty to protect human rights within its jurisdiction. Of the legitimate aims listed above, therefore, the protection of the rights of others should primarily guide the regulation.

With regard to the possible objects of the regulation, a distinction should be made between unlawful content and content that is not unlawful but harmful. State intervention may only be aimed at enforcing the removal of unlawful content; the state may not interfere with the removal of lawful content. The law should define the scope of unlawful content and the possible actions against it, as the Hungarian law currently in force already provides within the framework of the civil and criminal justice system. The social media platform may decide on the removal of lawful content according to its own rules (community principles, codes of conduct), but it shall make the criteria on which such decisions are based transparent.

Should a user be wronged by the actions of a social media platform manager, a judicial remedy shall be made available if the state regulation aims to provide remedies for such cases. The requirement of a judicial remedy also holds in the reverse case: if the platform fails to delete potentially unlawful content, a content-restricting decision may only be enforced through an independent and impartial judicial body; a court decision is necessary to remove unlawful content.

It does not fulfil the requirements of international law if national regulation entrusts the adjudication of the lawfulness of content to law enforcement instead of the courts. What is at stake in these decisions is the prevalence of individual fundamental rights, and only the courts are entitled to decide on fundamental rights with final effect.

For fundamental rights to be enforceable, judicial remedies must be made available. Such enforcement of rights may, as a secondary guarantee, follow the proceedings of some other entity (usually an authority). In the organisational system currently protecting fundamental rights, however, this is not workable. If the main proceeding is conducted by an authority, the framework of administrative proceedings provides only limited scope for substantively weighing fundamental rights considerations. Moreover, judicial review of an administrative authority’s decision cannot settle the underlying conflict of fundamental rights; it concerns only the lawfulness of the main proceeding. Courts should therefore be involved not as bodies reviewing the main case but as the primary adjudicators of the conflict of fundamental rights. Since only courts can perform this task, proceedings should be directed towards this goal from the outset. This is especially true for the effectiveness of the remedy, since deciding cases quickly requires that the weighing of fundamental rights take place as early as possible. No entity other than the courts has a comparably independent status guaranteed by the constitution combined with the power to render binding decisions. The judiciary is an independent branch of power, and this is not merely a legal declaration but has a substance defined by law: statutory guarantees secure the organisational independence of the courts, the personal independence of judges, and the independence and impartiality of the judge proceeding in a particular case. Courts provide a uniquely high level of safeguards for a fair proceeding. The matters arising here must be decided with finality, through enforceable judgments delivered effectively. In the current organisational framework, a fair procedure cannot be guaranteed in proceedings conducted by entities other than the independent courts. In our view, establishing a new, as yet non-existent body for the protection of fundamental rights would not provide sufficient guarantees for the effective enforcement of rights: if the protection of rights can be guaranteed within the current organisational framework, there is no argument for creating a body whose independence, professionalism and effectiveness would be uncertain.

The possibility of judicial review (by e-courts) cannot be limited or excluded by arguing that the court system could not cope with the burden of a mass of cases. A similar mass of cases characterises some data protection matters, where excluding the judicial route never arose. The state shall guarantee the accessibility and effectiveness of court proceedings, and those exercising public authority shall ensure the regulation’s operability. In forming our view we are aware of the challenges court proceedings will face; however, these practical considerations cannot lead to disregarding requirements of principled significance regarding the judicial enforcement of rights.

A non-automated decision made by a human is a guarantee in itself. By e-courts we therefore mean traditional proceedings not conducted through algorithmic decision-making, in which contact takes place electronically and there are no hearings requiring personal attendance; it shall be ensured that the public can monitor the decision-making and that the guarantees of its transparency prevail, taking into account the characteristics of the proceeding. Establishing e-courts specialised in online content regulation may allow the know-how necessary for the effective management of these cases, which have quite special characteristics, to develop within these bodies. In addition, the guarantees of court proceedings would also ensure the legitimacy of the decisions.

With regard to the proportionality of the platform manager’s responsibility, it should also be taken into account that holding a platform responsible for third-party content is only proportionate if the platform explicitly interfered with the content, or if it failed to comply with a removal decision reached by an independent control body in a fair procedure although it would have been able to comply [20]. A state regulation requiring continuous, proactive monitoring of online content certainly does not meet the requirement of proportionality, since it is inconsistent with both freedom of expression and the right to private life [21].

If the regulation provides for sanctions against platform managers, the sanctions should be capable of promoting the platforms’ compliance with the law (e.g. in the area of transparency), but they must not be disproportionate, especially for failure to delete content, since that would have a chilling effect on freedom of expression. As to their nature, criminal sanctions can conform with international law only within a very narrow scope: criminal law is the ‘ultima ratio’, reserved for cases where sanctions of another nature have proved insufficient. The requirement of fair proceedings also means that platform managers are entitled to effective remedies when sanctions are imposed on them.

  • Requirement of transparency

The requirement of transparency shall prevail in the regulation in two directions: the state shall require transparency of platform managers and of the algorithms they use, and transparency shall also apply to the state itself.

The gravest transparency problem in the functioning of social media platforms is that the adjudication of deletion requests based on the terms and conditions is not transparent. It is of fundamental significance that the decision-making mechanisms regarding content and status be transparent. Platforms shall publish precise, simple reports, easily accessible to the public and readable on various devices, on every interference with users’ expression of opinion and private life. Their transparency reports shall include the following:

  • number of incoming requests;

  • categories of requesting persons: private person / state entity;

  • the criteria for assessing the lawfulness of the content, or the way in which it breached the terms and conditions;

  • the number of incoming requests for remedies and the outcome of their adjudication.

The state shall also regularly publish genuine, public data about its activities in its supervisory role. Publishing data is a precondition for evaluating enforcement practice; without it, there is no way to review and correct the legislation in light of how far the regulation achieved its original goals. Data shall be published on at least the following matters (an illustrative sketch of a machine-readable report structure follows the list):

  • the amount and type of content restrictions, including the categories of personal data requested from intermediaries;

  • all content-based requests submitted to the intermediary;

  • a clear explanation of each request’s legal basis;

  • the measures taken following each request.
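
Neither the planned Hungarian regulation nor the EU proposals prescribe a concrete format for such reports. Purely as an illustration of how the data points listed above could be published in a machine-readable form, the following minimal sketch shows one possible structure; every type and field name in it is our own assumption.

```typescript
// Purely illustrative sketch: all type and field names are assumptions,
// not a format prescribed by any regulation.

type RequesterCategory = "private_person" | "state_entity";

// Fields a platform's transparency report could expose
// (see the list in the "Requirement of transparency" subsection above).
interface PlatformTransparencyReport {
  period: { from: string; to: string };                 // reporting period, ISO dates
  incomingRequests: number;                             // number of incoming requests
  requestsByCategory: Record<RequesterCategory, number>;
  assessmentCriteria: string[];                         // lawfulness / terms-and-conditions criteria
  remedyRequests: number;                               // requests for remedies received
  remedyOutcomes: { upheld: number; rejected: number }; // outcome of their adjudication
}

// Fields the state's own reporting could expose.
interface StateTransparencyReport {
  period: { from: string; to: string };
  contentRestrictions: { type: string; amount: number }[]; // amount and type of restrictions
  personalDataCategoriesRequested: string[];            // categories requested from intermediaries
  contentBasedRequests: {
    legalBasis: string;                                 // clear explanation of the legal basis
    measuresTaken: string;                              // measures taken following the request
  }[];
}
```

Publishing data in such a structured form, rather than as free-text reports, would make it easier for the public to compare reporting periods and platforms, which is precisely the precondition for evaluating enforcement practice described above.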

IV. Final thoughts

The above criteria are, as emphasised, minimum requirements: they must be met for the planned regulation to comply with the fundamental requirements derived from international law, which must prevail unconditionally. This set of minimum requirements, besides demanding of the Hungarian regulation not an ideal level of protection but only the unconditionally necessary one, is also not complete, since we had to set out these requirements while the domestic regulation is being prepared behind closed doors. Knowledge of the planned regulation can only be gained from press releases or from obscure references made by government actors.

HCLU intends to review continuously the fulfilment of the above minimum requirements as the details of the regulation become public. The solution, however, is not in our hands: the Ministry of Justice could remedy the lack of transparency in the legislative process. The minimum requirements aim to reinforce this need while keeping in mind that social media has become a new cornerstone of our democracy.

[1] https://hvg.hu/itthon/20210118...
https://24.hu/belfold/2021/01/...
https://www.portfolio.hu/globa...

[2] https://www.parlament.hu/docum...

[3] The planned content of the regulation is not known at the time of publishing the present statement. We therefore collected the requirements based on the main direction of the regulation as it can be surmised from the White Book of the Digital Freedom Working Group of the Ministry of Justice, which takes part in preparing the regulation, as well as from news accessible in the press and on social media. Since the package known as the “Facebook Act” will presumably deal only with the activities of social media platforms, the statement focuses on the possible regulation of these platforms and does not deal with the adjudication of disputes between users or with their legal responsibilities.

[4] Freedom and Accountability: A Transatlantic Framework for Moderating Speech Online (Philadelphia: Annenberg Public Policy Center, June 2020). https://www.annenbergpublicpol...

[5] By status decisions we mean the platform’s decisions regarding user accounts (in particular, suspending and deleting accounts).

[6] For example, according to Article 20(2) of the International Covenant on Civil and Political Rights, any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence shall be prohibited by law.

[7] https://digital-strategy.ec.eu...

[8] The DSA proposal primarily relevant with regard to social media platforms is accessible here: https://eur-lex.europa.eu/lega...

[9] Article I(3) of the Fundamental Law. “Not every connection with fundamental rights requires regulation by an act. However, the substance of a fundamental right can only be defined, and its essential guarantees set out, in an act; an act is also required for any indirect and substantial restriction of a fundamental right.” [Decision No. 64/1991. (XII. 17.) of the Hungarian Constitutional Court]

[10] According to report No. A/HRC/38/35 of the UN Special Rapporteur on freedom of expression, a broad public discussion shall be carried out when the regulation of online content is planned. https://ap.ohchr.org/documents...

[11] https://www.venice.coe.int/web... [44]

[12] VC [47]

[13] This simultaneously means that it may be necessary to regulate the guarantees of the uniform application of the law.
VC [50]

[14] https://www.venice.coe.int/web... [49]

[15] https://www.venice.coe.int/web... [45]

[16] VC [51]

[17] https://documents-dds-ny.un.or...

[18] Article 19(3) of the International Covenant on Civil and Political Rights and Article 10(2) of the European Convention on Human Rights.

[19] Guiding Principles on Business and Human Rights (UNGP).

[20] Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda (2017) [https://www.osce.org/files/f/d...], as well as Article 14(1) of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’).

[21] Report No. A/HRC/38/35 of the UN Special Rapporteur on freedom of expression [67]. https://ap.ohchr.org/documents...
