The practice of content moderation, whereby online platforms monitor and
remove harmful content, has proliferated in the digital sphere. The control
of digital speech by private corporations raises serious challenges for democracies.
Should private power in the digital sphere therefore be regulated and, if
so, should content moderation practices be constrained to protect
fundamental rights? The EU has initiated a line of legislative efforts that
together generate a body of digital governance norms for regulating content
moderation practices. Inspecting these initiatives reveals that although
they are intended to enshrine human rights guarantees in the digital sphere,
they fail to provide individual end users with procedural justice rights,
also known as procedural due process rights, that would protect them against
the online platforms. Although the preambles of these legislative
initiatives proclaim the principles of transparency and accountability,
those principles are not translated into concrete obligations.