The European Union's executive body is doubling down on its push for platforms to pre-filter the Internet, publishing a proposal today for all websites to monitor uploads so they can quickly remove terrorist content.
The Commission handed platforms an informal one-hour rule for removing terrorist content back in March. It is now proposing to turn that into law, to prevent such violent propaganda from spreading over the Internet.
For now the ‘rule of thumb’ regime continues to apply. But it is putting meat on the bones of its thinking, fleshing out a more expansive proposal for a regulation aimed at “preventing the dissemination of terrorist content online”.
As per usual EU processes, the Commission's proposal would need to gain the backing of Member States and the EU parliament before it could be cemented into law.
One major point to note here is that existing EU law does not allow Member States to impose a general obligation on hosting service providers to monitor the information that users transmit or store. But in the proposal the Commission argues that, given the “grave risks associated with the dissemination of terrorist content”, states could be allowed to “exceptionally derogate from this principle under an EU framework”.
So it is essentially suggesting that Europeans' fundamental rights might not, in fact, be so fundamental. (Albeit, European judges might well take a different view, and it is very likely the proposals could face legal challenges should they be cast into law.)
What is being suggested would also apply to any hosting service provider that offers services in the EU, “regardless of their place of establishment or their size”.
So, seemingly, not just large platforms, like Facebook or YouTube, but — for example — anyone hosting a blog that includes a
free-to-post comment section.
Websites that fail to promptly take down terrorist content would face fines, with the level of penalties being determined by EU Member States. (Germany has already legislated to enforce social media hate speech takedowns within 24 hours, setting the maximum fine at €50M.)
“Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation,” the Commission writes, envisaging the most severe penalties being reserved for systematic failures to remove terrorist material within one hour.
It adds: “When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider.” So, for example, individuals with websites who fail to moderate their comment section fast enough might not be served the very largest fines, presumably.
The proposal also encourages platforms to develop “automated detection tools” so they can take what it terms “proactive measures proportionate to the level of risk and to remove terrorist material from their services”.
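The proposal does not specify what such tooling should look like. Purely as an illustration of the kind of ‘automated detection’ being gestured at, here is a minimal sketch of a toy text classifier in Python; the training examples, labels, threshold and review workflow are all hypothetical, and bear no relation to any system the Commission has endorsed:

```python
# A toy illustration of an "automated detection tool": a TF-IDF +
# logistic regression text classifier. All data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training examples (1 = flagged, 0 = benign).
texts = [
    "example of material a moderator previously flagged",
    "another previously flagged example",
    "an ordinary comment about the weather",
    "a benign post about football scores",
]
labels = [1, 1, 0, 0]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(texts, labels)

# Score a new upload; anything over an (arbitrary) threshold would be
# queued for human review rather than removed automatically.
REVIEW_THRESHOLD = 0.6
score = classifier.predict_proba(["a newly uploaded comment"])[0][1]
if score >= REVIEW_THRESHOLD:
    print(f"queue for human review (score={score:.2f})")
```

Real-world systems are of course vastly more complex, and the false positive problem is exactly why the rest of the proposal has to bolt on complaint and review mechanisms.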
So the Commission's continued push for Internet pre-filtering is clear. (This is also a feature of its copyright reform, which is being voted on by MEPs later today.)
Albeit, it is not alone on that front. Earlier this year the UK government went so far as to pay an AI company to develop a terrorist propaganda detection tool, using machine learning algorithms trained to automatically detect propaganda produced by the Islamic State terror group, with a claimed “extremely high degree of accuracy”. (At the time it said it had not ruled out forcing tech giants to use it.)
What is terrorist content for the purposes of this proposal? The Commission refers to an earlier EU directive on combating terrorism, which defines the material as information used to “incite and glorify the commission of terrorist offences, encouraging the contribution to and providing instructions for committing terrorist offences as well as promoting participation in terrorist groups”.
And on that front you do have to wonder whether, for example, some of United States president Donald Trump's comments last year after the far right rally in Charlottesville, where a counter-protester was murdered by a white supremacist (he suggested there were “fine people” among those same murderous and violent white supremacists), might not fall under that ‘glorifying the commission of terrorist offences’ umbrella, should, say, someone repost them to a comment section that was viewable in the EU…
Safe to say, even terrorist propaganda can be subjective.
And the proposed regime will inevitably encourage borderline content to be taken down — having a knock-on impact upon online freedom of
expression.
The Commission also wants websites and platforms to share information with law enforcement and other relevant authorities and with each other, suggesting the use of “standardised templates”, “response forms” and “authenticated submission channels” to facilitate “cooperation and the exchange of information”.
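The proposal names these instruments without pinning down any concrete format. As a purely hypothetical sketch of what a machine-readable removal order and its matching response form might contain (every field name and value below is invented for illustration):

```python
# Hypothetical sketch of a "standardised template" for a removal order
# and a matching "response form"; the proposal specifies no schema, so
# all field names and values here are invented.
import json
from datetime import datetime, timezone

removal_order = {
    "order_id": "example-0001",
    "issuing_authority": "Example Member State authority",
    "content_url": "https://example.com/post/123",
    "legal_basis": "proposed EU regulation on terrorist content online",
    "issued_at": datetime.now(timezone.utc).isoformat(),
    "removal_deadline_hours": 1,  # the proposal's one-hour expectation
}

# The host's acknowledgement, sent back over an authenticated channel.
response_form = {
    "order_id": removal_order["order_id"],
    "action_taken": "removed",  # or e.g. "disabled", "contested"
    "actioned_at": datetime.now(timezone.utc).isoformat(),
}

print(json.dumps(removal_order, indent=2))
print(json.dumps(response_form, indent=2))
```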
It tackles the problem of what it refers to as “erroneous removal” (i.e. content that is removed after being reported or erroneously identified as terrorist propaganda but which, on review, is subsequently determined not to be) by placing an obligation on providers to have “remedies and complaint mechanisms to ensure that users can challenge the removal of their content”.
So platforms and websites will be obligated to police and judge speech, which they already do, of course, but the proposal doubles down on turning online content hosts into judges and arbiters of that same content.
The regulation also includes transparency obligations on the steps being taken against terrorist content by hosting service providers, which the Commission claims will ensure “accountability towards users, citizens and public authorities”.
Other perspectives are of course
available…
There is no way a hosting provider (including your private website, if it includes comments section) can comply with these obligations without #UploadFilters
It's not limited to large platforms
The @EU_Commission has ignored 100% of the #copyright discussion
#SaveYourInternet pic.twitter.com/WeV0GwDZVD
— Julia Reda (@Senficon) September 12, 2018
The Commission envisages all taken-down content being retained by the host for a period of six months so that it could be reinstated if required, i.e. after a valid complaint, to ensure what it couches as “the effectiveness of complaint and review procedures in view of protecting freedom of expression and information”.
It also sees the retention of takedowns helping law enforcement — meaning platforms and websites
will continue to be co-opted into state law enforcement and intelligence regimes, getting further saddled with the burden and cost of having
to safely store and protect all this sensitive data.
(On that the EC just says: “Hosting service providers need to put in place technical and organisational safeguards to ensure the data is not used for other purposes.”)
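For a sense of the mechanics involved, here is a minimal sketch of the six-month retention window the proposal envisages. Only the six-month period comes from the text; the exact day count, function name and hold logic are assumptions made for illustration:

```python
# Minimal sketch of a retention check for taken-down content. The
# six-month window comes from the proposal; everything else (the exact
# day count, the complaint and law-enforcement "hold" flags) is assumed.
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_PERIOD = timedelta(days=183)  # roughly six months

def can_purge(removed_at: datetime,
              has_pending_complaint: bool = False,
              has_law_enforcement_hold: bool = False,
              now: Optional[datetime] = None) -> bool:
    """True once a taken-down item is past the retention window and is
    not subject to a pending complaint or a law-enforcement hold."""
    now = now or datetime.now(timezone.utc)
    expired = now - removed_at > RETENTION_PERIOD
    return expired and not (has_pending_complaint or has_law_enforcement_hold)
```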
The Commission would also create a system for monitoring the monitoring it is proposing platforms and websites undertake, thereby further extending the proposed bureaucracy. It says it would establish a “detailed programme for monitoring the outputs, results and impacts” within one year of the regulation being applied; report on the implementation and the transparency elements within two years; and evaluate the entire functioning of it four years after it comes into force.
The executive body says it consulted widely ahead of forming the proposals, including running an open public consultation, carrying out a survey of 33,500 EU residents, and talking to Member States' authorities and hosting service providers.
“By and large, most stakeholders expressed that terrorist content online is a serious societal problem affecting internet users and business models of hosting service providers,” the Commission writes.
“More generally, 65% of respondents to the Eurobarometer survey considered that the internet is not safe for its users and 90% of the respondents consider it important to limit the spread of illegal content online. Consultations with Member States revealed that while voluntary arrangements are producing results, many see the need for binding obligations on terrorist content, a sentiment echoed in the European Council Conclusions of June 2018. While overall, the hosting service providers were in favour of the continuation of voluntary measures, they noted the potential negative effects of emerging legal fragmentation in the Union.
Many stakeholders also noted the need to ensure that any regulatory measures for removal of content, particularly proactive measures and strict timeframes, should be balanced with safeguards for fundamental rights, notably freedom of speech.
Stakeholders noted a number of necessary measures relating to transparency, accountability as well as the need for human review in deploying automated detection tools.”