By Dr. Elad Gil
14.03.2022
To
The Committee for the Law’s Compliance with Technological Challenges (Davidi Committee)
By email lawtech-committee@justice.gov.il
Minister of Communications Advisory Taskforce
By email sellam@moc.gov.il
Dear Sirs,
We are pleased to submit the position of Tachlith – The Institute for Israeli Policy and the Federmann Cyber Security Research Center at the Hebrew University, in response to the open call distributed by the Ministry of Communications on February 24, 2022, regarding the regulation of digital content platforms. Due to the short timetable available, our response focuses on presenting an outline of the main points of the recommended regulatory policy and on elucidating a number of specific arrangements with respect to the legal liability regime we propose and its scope of application. We would be happy to speak with you and elaborate on our detailed position regarding the shaping of legislation and regulation in this area, should you be interested.
Summary of Our Position
The objective function
Intervention in the online platform market must proceed from a well-defined objective function derived from the needs and characteristics of Israel’s economy and society. We would define the Israeli objective function as follows: the existence of an online space in which Israeli citizens enjoy, on an equal footing, a broad right of access to the web and its content; that supports a flourishing culture of freedom of expression; that provides a safe and protected environment in which citizens’ privacy is maintained; and that affords an opportunity to develop an innovative, entrepreneurial economy.
The organizing idea: optimal distribution of work between the private sector and the Government
In selecting the regulatory model, we propose adopting a simple guiding principle: comparative institutional (sectoral) advantage. Realizing the objective function requires a series of moral, professional, and technological decisions that balance competing values. The regulatory arrangement should grant the regulator powers only with respect to decisions it is well positioned to advance, and should leave discretion in the platforms’ hands with respect to decisions they are better positioned to make, while creating suitable incentives.
Regulatory package deal: “safe harbor” in exchange for a content policy suitable to the public interest
In accordance with this guiding principle, we recommend adopting a model of structural-procedural regulation. At the foundation of this model, central parts of which have been adopted in legislative bills currently being advanced in the European Union and the United States, lies a “package deal” between the state and the platforms: the platforms receive a “safe harbor”, meaning conditional immunity from tort and criminal liability for user content, in exchange for undertaking a series of regulatory obligations intended to shape the structure of incentives and to impose procedural requirements for the supervision and formulation of an optimal content policy.
This paper does not purport to detail all components of the proposed model or to claim that it represents the single correct approach. Our aim is to point out its advantages and to formulate five core principles which, in our view, can help shape the desired regulatory plan:
1. Regulation that does not focus on the tip of the iceberg – A large part of the legislation extant or currently being advanced around the world is based on a distorted picture of how platforms work, one that likens them to enforcement systems essentially similar to state courts; as a result, it focuses on the “tip of the iceberg” of their content-supervision policy. This is a mistake with grave implications for the efficiency and effectiveness of regulation. We explain how to avoid it.
2. Supervision of illegal publications: obeying courts + “notice and action” mechanism – In dealing with the large volume of illegal content on the web, a combined mechanism should be considered. The state would retain ultimate responsibility through the courts, which would issue orders to take down extreme content that there is a clear public interest in removing. Platforms would be required to adopt an innovative “notice and action” mechanism, within which users could report illegal content in an accessible way, in Hebrew, and publishers could respond with a “counter notice”. The platforms would retain immunity on condition that they make timely, professional decisions in good faith, even if they choose not to take down a publication or opt to use more proportionate enforcement tools. This solution suitably confronts the disadvantages of the familiar “notice and takedown” model, which has been shown to lead to over-censorship.
3. Supervision of self-enforcement mechanisms: strengthening the consumer duty owed to Israel’s citizens – Platforms today already have a practice of adopting content-supervision policies to limit harmful speech, even when it is legal. The regulatory arrangement should institutionalize the obligation to implement this practice and create standardization, alongside a series of provisions to guarantee minimal due process and loyalty to the public across the broad spectrum of decision-making mechanisms, inside and outside the platforms, that are involved in content enforcement, including government bodies. At the same time, an indiscriminate “overloading” of due-process obligations that harms other values should be avoided.
4. Effective transparency – Regulatory obligations should be formulated to promote transparency in a way that affects all decision-making mechanisms on the platforms and leads to the publication of information that is useful to the regulator and the public. These obligations should aim to improve our understanding of the mechanisms operating in Israel, and particularly in Hebrew.
5. Scope of application tailored to size and level of activity – Regulation must be tailored to the regulated parties according to their influence on speech in cyberspace, the risks stemming from their activity, and their resources. As a rule of thumb, the scope of regulatory obligations should be differentiated along two main criteria: the type of activity and its scale.
Dr. Elad Gil is a senior fellow and head of research at the Tachlith Institute for Israeli Policy.
Note – For references cited in this article, see the original Hebrew text.