Crushing terrorism online – or curtailing free speech? The proposed EU Regulation on online terrorist content


Professor Lorna Woods, University of Essex
On 12th September 2018, the Commission published a proposal for a regulation (COM(2018) 640 final) aiming to require Member States to require certain internet intermediaries to take proactive, if not pre-emptive, action against terrorist content online, as well as to ensure that state actors have the necessary capacity to take action against such illegal content. It is described as “[a] contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018”. The proposal is a development from existing voluntary frameworks and partnerships, for example the EU Internet Forum, and the non-binding Commission Recommendation on measures to effectively tackle illegal content online (C(2018) 1177 final, 1st March 2018) and its earlier Communication on tackling illegal content online (COM(2017) 555 final). In moving from non-binding to legislative form, the Commission is stepping up action against such content; this move may also be seen as part of a general tightening of requirements for internet intermediaries, which can also be seen in the video-sharing platform provisions in the revised Audiovisual Media Services Directive and in the proposals regarding copyright. Since the proposal has an “internal market” legal base, it would apply to all Member States.
Article 1 of the proposed Regulation sets out its subject matter, including its geographic scope. The proposed regulation is directed at certain service providers (“hosting service providers”) in respect of specified content (“illegal terrorist content”). Terms are defined in Article 2. Article 2(1) defines “hosting service provider” (HSP) as “a provider of information society services consisting in the storage of information provided by and at the request of the content provider and in making the information stored available to third parties”. The definition of illegal terrorist content, found in Article 2(5), covers one (or more) of the following types of information:
(a) inciting or advocating, including by glorifying, the commission of terrorist offences, thereby causing a danger that such acts be committed;
(b) encouraging the contribution to terrorist offences;
(c) promoting the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541;
(d) instructing on methods or techniques for the purpose of committing terrorist offences.
The format does not matter: terrorist content can thus be found in text, images, sound recordings and videos.
Article 3 specifies the obligations of the HSPs. In addition to a specific obligation to prohibit terrorist content in their terms and conditions, HSPs are obliged to take appropriate, reasonable and proportionate actions against terrorist content, though those actions must take into account fundamental rights, specifically freedom of expression.
Article 4 introduces the idea of a removal order, and requires that the competent authorities of the Member States are empowered to issue such orders; requirements relating to removal orders are set out in Article 4(3). It does not seem that the issuing of such orders requires judicial authorisation, though the Regulation does envisage mechanisms for HSPs or the “content provider” to ask for reasons; HSPs may also notify issuing authorities when the HSP views the order as defective (on the basis set out in Article 4(8)), or notify the issuing authority of force majeure. Article 4(2) states:
Hosting service providers shall remove terrorist content or disable access to it within one hour from receipt of the removal order.
The Regulation also envisages referral orders; these do not necessitate the removal of content, nor – unlike the position for removal orders – does the Regulation specify deadlines for action. On receipt of a referral order, an HSP should assess the notified content for compatibility with its own terms and conditions. It is obliged to have in place a system for carrying out such assessments. There is also an obligation in Article 6 for HSPs in appropriate circumstances to take (unspecified) effective and proportionate proactive measures, and to report upon these measures. Article 6 also envisages the possibility that competent authorities may – in certain circumstances – require a hosting service provider to take specified action.
Article 7 requires hosting service providers to preserve data for certain periods. The hosting service provider is also required to provide transparency reports, as well as to operate within certain safeguards specified in Section III, including transparency reporting, human oversight of decisions, complaints mechanisms and information to content providers – these are important safeguards to ensure that content is not removed erroneously. Section IV deals with cooperation between the relevant authorities and with the HSPs. Cooperation with European bodies (e.g. Europol) is also envisaged. As part of this, HSPs are to establish points of contact.
The Regulation catches services based in the EU but also those outside it which provide services in the EU; such non-EU providers should designate a legal representative, and the Member State in which the representative is based has jurisdiction in relation to Articles 6 (proactive measures), 18 (penalties) and 21 (monitoring). For providers established in the EU, jurisdiction for those purposes goes to the Member State in which the provider has its main establishment. Failure to designate a representative means that all Member States would have jurisdiction. Note that as the legal form of the proposal is a Regulation, national implementing measures would not be required more generally.
Member States are required to designate competent authorities for the purposes of the regulation, and also to ensure that penalties are available in relation to specified articles, such penalties to be effective, proportionate and dissuasive. The Regulation also envisages a monitoring programme in respect of action taken by the authorities and the HSPs. Member States are to ensure that their competent authorities have the necessary capacity to tackle terrorist content online.
The proposal is in addition to the Terrorism Directive, the implementation date for which is September 2018. That directive includes provisions requiring the blocking and removal of content; the assumption seems to be that – even before they are legally required to be in place – these provisions are being seen as ineffective.
This is also another example of what seems to be a change in attitude towards intermediaries, particularly those platforms that host third-party content. Rather than the approach from the early 2000s – exemplified in the e-Commerce Directive safe harbour provisions – that these providers are, and to some extent should be expected to be, content-neutral, it now seems that they are being treated as a policy tool for reaching content viewed as problematic. From the definition in the Regulation, it seems that some of the HSPs could have – provided they were neutral – fallen within the terms of Article 14 e-Commerce Directive: they are information society service providers that provide hosting services. The main body of the proposed regulation does not deal with the priority of the respective laws, but in terms of the impact on HSPs, the recitals claim
“any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met”.
This reading in of what is effectively a good Samaritan saving clause follows the approach that the Commission had taken with regard to its Recommendation – albeit in that instance without any judicial or legislative backing. Here it seems that the recitals of one instrument (the Regulation) are being deployed to interpret another (the e-Commerce Directive).
The recitals here also specify that although Article 3 puts HSPs under a duty of care to take proactive measures, this should not constitute ‘general monitoring’; such general monitoring is precluded by Article 15 e-Commerce Directive. How this boundary is to be drawn remains to be seen, especially as the regulation envisages prevention of uploads as well as swift take-downs.
Further, recital 19 also recognises that
“[c]onsidering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons”.
This is a new departure in the interpretation of Article 15 e-Commerce Directive.
The Commission press release suggests the following could be caught: social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services, and websites where users can make comments or post reviews. There is a limitation in that the content hosted should be made available to third parties. Does this mean that if no one other than the content provider can access the content, the provider is not an HSP? This boundary might prove difficult in practice. The test does not seem to be one of public display, so services where users who are content providers can choose to let others have access (even without the knowledge of the host) might fall within the definition. What would be the position of a webmail service where a user shared his or her credentials so that others within that closed circle could access the information? Note that the Commission also envisages that services whose primary purpose is not hosting but which allow user-generated content – e.g. a news website or even Amazon – fall within the definition.
The scope of HSP is broad and may to some extent overlap with that of video-sharing platforms or even audiovisual media service providers for the purposes of the Audiovisual Media Services Directive (AVMSD). Priorities and conflicts will need to be ironed out in that respect. The second element of this broadness is that the HSP provisions apply not just to the big companies, the ones to some extent already cooperating with the Commission, but also to small companies. In the view of the Commission, terrorist content may be spread just as much by small platforms as large. Similar to the approach in the AVMSD, the Commission claims that the regulatory burden will be proportionate, as proportionality will mean that the level of risk as well as the economic capabilities of the provider would be taken into account.
In line with the approach in other recent legislation (e.g. the GDPR, the video-sharing platform provisions in the AVMSD), the proposal has an extraterritorial dimension. HSPs would be caught if they provide a service in the EU. The recitals clarify that “the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation” [rec 10]; instead a substantial connection is required [rec 11]. Whether this will have a black-out effect similar to the GDPR remains to be seen; it may depend on whether the operator is aware enough of the law, how central the hosting element is and how large a part of its operations the EU market is.
While criminal law, in principle, is a matter for Member States, the definition of terrorist content relies on a European definition – though whether this definition is ideal is questionable. For companies that operate across borders, this is presumably something of a relief (and as noted above, the proposal is based on Article 114 TFEU, the internal market harmonisation power). The Commission also envisages this as a mechanism limiting the possible scope of the obligations – only material that falls within the EU definition falls within the scope of this obligation – thereby minimising the impact on freedom of expression (Proposal p. 8). Whether national standards will consequently be precluded is a different question. Note that the provisions in the AVMSD that focus on video-sharing platforms were originally envisaged as maximum harmonisation but, as a result of amendments from the Council, returned to minimum harmonisation (the Council amendments also introduced provisions on terrorist content into the AVMSD based on the same definition).
The removal notice is a novelty aimed at addressing differential approaches in the Member States in this regard (an on-going problem within the safe harbour provisions of the e-Commerce Directive), but also at ensuring that such take-down requests are enforceable. Note, however, that it is up to each Member State to specify the competent authorities, which may give rise to differences between the Member States, perhaps also indicating differences in approach. The startling point is probably the very short timescale: one hour (a complete contrast to the timing specified, for example, in the UK’s Terrorism Act 2006). The removal notices have been a source of concern. One hour is not very long, which will mean that – especially with non-domestic providers and taking into account time differences – HSPs will need to think about how to staff such a requirement (unless the HSPs plan to automate their responses to notices), especially if the HSP hopes to challenge ‘unsatisfactory’ notices (Art 4(8)).
Given the size of the penalties in view, industry commentators have suggested that all reported content will be taken down. This would certainly be a concern in relation to situations where the HSPs had to identify terrorist content themselves (i.e. ascertain not just that it was in a certain location but also that it met the legal criteria). It is not clear that this criticism is fully appropriate here, however. Here, HSPs do not have to decide whether or not the relevant content is terrorist – the notice will make that choice for them. Further, the notice is made not by private companies with a profit agenda but instead by public authorities (presumably) orientated to the public good and with some experience in the topic as well as in legal safeguards. Furthermore, the authority must include reasons. Indeed, the Commission takes the view that the fact that such notices are limited to the competent authorities, which will have to explain their decisions, ensures the proportionality of such notices (Proposal p. 8). Nonetheless, a one-hour time frame is a very short period of time.
Another ambiguity arises in the context of referral notices. It seems that the objective here is to put the existing voluntary arrangements on a statutory footing, but with no obligation on the HSP to take the content down within a specified period. Rather, the HSP is to assess whether the content referred is compatible with the HSP’s terms of service (not whether the content is illegal terrorist content). Note that this is different from the situation where the HSP discovers the content itself and there has been no official view as to whether the content falls within the definition of terrorist content or not. This seems rather devoid of purpose: the relevant authorities have either decided that the content is a problem (in which case the removal notice seems preferable, as the decision is made by competent authorities, not private companies), or the notice refers to content which is not quite bad enough to fall within the content prohibited by the regulation but which the relevant authorities would still like taken down, with the responsibility for that decision being pushed on to the HSP. Such an approach seems undesirable.
Article 6 requires HSPs to take effective proactive measures. These are not specified in the Regulation, and may therefore allow the HSPs some leeway to take measures that seem appropriate in the light of each HSP’s own service and priorities, though it seems here that there may also be concerns about the HSPs’ interpretation of relevant terrorist content. It is perhaps here that criticisms about the privatisation of the fight against terror come to the fore. Note, however, that Article 6(4) allows a designated authority to impose measures specified by the authority on the HSP. Given that this is dealt with at the national level, some fragmentation across the EU may arise; there seems to be no cooperation mechanism or EU coordination of responses under Article 6(4).
There is also the question of freedom of expression. Clearly, state-mandated removal of content should be limited, but is it the intention that HSPs have no freedom to remove objectionable content for other reasons? At some points, the recitals suggest precisely this: “hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content” [rec 17]. Presumably the intention is that HSPs should take steps to avoid mistakenly considering content to be terrorist. They clearly are under obligations to take other forms of content down, e.g. child pornography and hate speech.
More questionable is the position with regard to other types of content: the controversial and the objectionable, for example. As HSPs are private entities, human rights obligations do not bite on them in the same way as they do with regard to States, so there may be questions about the extent to which a content provider can claim freedom of expression against an unwilling HSP (e.g. for Mastodon, the different instances have different community standards set up by each community – should those communities not be entitled to enforce those standards (provided that they are not themselves illegal)?). There may moreover be differences between the various Member States as to how such human rights have horizontal effect and the deference given to contractual autonomy. With regard to the video-sharing platforms, it seems that room is given to the platforms to enforce higher standards if they so choose; there is no such explicit provision here.
A final point to note is the size of the penalties that are proposed. The proposal implicitly distinguishes between one-off failings and a ‘systematic failure to comply with obligations’. In the latter case, penalties of up to 4% of global turnover are envisaged – in this there are similarities to the scale of penalties under the GDPR. This seems to be developing into a standard approach in this sector.
Barnard & Peers: chapter 25, chapter 9


