Friday, 22 September 2023 00:10
Unable to distinguish between what is permitted and what is an offence, content creators, intermediaries and ISPs are likely to engage in excessively strict self-regulation
Article 14(1)(a) of the Constitution guarantees us the freedom of speech and expression including publication in any medium, subject to such restrictions as may be prescribed by law in the interests of racial and religious harmony or in relation to parliamentary privilege, contempt of court, defamation or incitement to an offence. Yet, the Cabinet has approved a bill that grievously limits online speech, expression and related actions. The limitations go beyond what is set out in Article 15(2) of the Constitution.
The bill seeks to establish what may be described as an “Online Truth Commission” that will decide what speech will be permitted and what will be prohibited based on subjective judgments of what is true and what is false. This creature of the executive (and not the judiciary) will, if the Supreme Court lets this bill go ahead, perform its functions through procedures that are inconsistent with the principles of natural justice.
Decision makers exercising public functions are subject to a set of common law procedural rules described as the rules of natural justice. Those who drafted this bill appear to have slept through that part of their legal education. The bill not only violates the division of powers set out in the Constitution, but also the basic tenets of administrative law.
Activating punishment for speech
All that it takes to activate the “Online Truth Commission” is for a person aggrieved by the communication of a prohibited statement that is seen, heard or otherwise perceived by users of internet-based communication services in Sri Lanka to make a complaint to the Commission, orally, in writing or in electronic form, providing information pertaining to such communication (s. 26). Shakthika Sathkumara’s persecution under the ICCPR Act (No. 56 of 2007) is illustrative of how this is likely to work.
The Commission may issue notice to the person who communicated the “prohibited statement” to prevent its circulation. There is no requirement for both sides to be heard. If the statement is not taken down within 24 hours, action can then be taken through the Internet Service Provider (ISP) or the internet intermediary/platform. There is no requirement for the ISP or the platform to be heard. The core principle of natural justice that no person should be judged without a fair hearing in which each party is given the opportunity to respond to the evidence against them is violated.
Alternatively, an aggrieved person may directly seek an interim order from a magistrate (s. 27). The hearing is required to be completed within two weeks. However, the presumption of delivery of notice and the seven-day period within which the parties must appear to show cause why the conditional order by the magistrate should not be made final preclude a fair hearing in the normal sense. The simplicity of the Section 26 procedure is such that Section 27 is likely to be rarely used. It may have been inserted as a shield against the expected assault on the Online Truth Commission.
One good thing is that the mistake made in the ICCPR Act to make bail difficult to obtain has not been repeated. Section 47 states that an arrest can only be made with a warrant and that these offences are bailable.
Vague terminology leading to excessive self-regulation
Among the many ways the bill recognises for a person to become aggrieved are:
- the communication of a false statement claimed to threaten national security or public order;
- the communication of a statement with the intention of outraging or wounding religious feelings;
- the communication of a false statement that promotes ill will between classes.
The above paraphrased descriptions are of a selected set of ill-defined offences that directly impinge on freedom of expression. Other offences such as cheating (s. 18) and impersonation (called online personation, s. 19) which are offences under existing law have also been included in the bill, possibly to take advantage of the summary procedures. Matters such as revenge porn that are not covered by existing law have also been included (s. 23).
The lengthy interpretation section (s. 56) goes to the extent of defining what a “fact” is. But it does not venture to define key elements of the offences described above such as “outraging or wounding of religious feelings” or “ill will between classes.” Based on recent experience with the ICCPR Act, abuse is likely. The uncertainty that will be engendered by the vaguely defined offences will be even more pernicious. Unable to distinguish between what is permitted and what is an offence, content creators, intermediaries and ISPs are likely to engage in excessively strict self-regulation. Perhaps this is the real objective of this bill.
Truth to be determined by President’s appointees
The above offences hinge on the Commission or a magistrate determining that statements are false and that they pose threats to subjectively defined matters such as national security, public order, ill-will, hurting of religious feelings, etc. A false statement is defined as a “statement that is known or believed by its maker to be incorrect or untrue and is made especially with intent to deceive or mislead but does not include a caution, an opinion or imputation made in good faith” (s. 56). One may not be overly concerned if the determination of whether a statement is false or not is made by a member of the judiciary, after allowing the accused to present his or her side and challenge the veracity of the accusation as set out in long-accepted laws of procedure.
But that is not so. In most cases, these determinations will be made by the “Online Truth Commission,” made up of persons appointed by the President. They may be removed “for reasons assigned” by the President. This is not a judicial body, nor an independent commission that is appointed with the approval of the Constitutional Council. If they do not do the bidding of the appointing/removing authority, they can be asked to resign or be removed for any made-up reason.
Effective solutions that will safeguard expression
It can be conceded that revenge porn is a novel phenomenon that requires the creation of a new offence, ideally through an amendment to the Penal Code. It may be argued that contempt of court is deserving of attention given the injustices that have occurred because of legislative lacunae. However, there is little justification for a new law that addresses only online contempt when the problem applies to all forms of contempt. The greatest concern, and the focus here, is online speech that is perceived to cause harm.
Many forms of harm from online speech are perceived, and remedies are sought from platform providers and from the State. The harms range from online bullying and intellectual property violations to incitement to, or facilitation of, violence. Unlike in the past, when printing presses, publishers and broadcasters enabled effective assignment of responsibility for harmful content, the internet has enabled millions to publish user-generated content (UGC) directly, in some cases anonymously or pseudonymously. Publication can also be done from outside the jurisdiction of specific governments. It must be conceded that these are novel problems worthy of attention.
Because platforms enable extremely rapid and articulated dissemination of UGC, remedies obtained through court orders or administrative actions cannot fully satisfy aggrieved parties and State authorities. Therefore, the responsibility for remedial action to remove or reduce the reach of UGC perceived as causing harm tends to fall on platform providers, who have the technical capability. The online safety bill may be seen as an attempt by the State to do that job directly.
Safeguarding freedom of speech and expression is so important that it is constitutionally protected in most civilised countries, as it is in Sri Lanka. Legislators seeking to address the new problems posed by the rapid and articulated dissemination of UGC must first decide what the priority is. If it is rapid takedown (to avoid situations such as the live-streaming of the Christchurch massacre), the solution is not what is proposed in this bill. By the time the “Online Truth Commission,” likely to be ill-resourced like most regulatory bodies, issues its orders, the damage will be done.
What the Government must do is to endorse the ongoing efforts in Sri Lanka that draw on the experience of countries such as New Zealand to establish an industry code of conduct. This code will, through transparent and responsible procedures, ensure rapid takedown of harmful content that falls within the broadly consulted and codified definitions. The rapidly changing technology and the innovativeness of users (who could have imagined live-streaming a massacre using a body cam?) necessitate experimentation and nimbleness on the part of those trying to maintain standards of responsible behaviour.
An industry code of the type that has been drafted for Sri Lanka can be envisioned as a regulatory sandbox. It can be used to learn how the process works, how it can be improved, and to design light-touch but effective legislation where necessary. In other words, the Government should withdraw this bill, allow the code to be implemented, observe how effective it is in mitigating harms, and then work up a co-regulatory scheme that leverages the industry’s strengths and the State’s ability to punish following proper legal procedures.