On the Internet Archive’s blog on April 10, Office Manager Chris Butler wrote that over the previous week, the Archive had received a series of email notices from a French government agency erroneously identifying more than 550 archive.org URLs as “terrorist propaganda.” The blog post includes an extensive list; among the flagged URLs were major collection pages such as Project Gutenberg, the Smithsonian, the Grateful Dead, and the Prelinger Archive of public-domain audio and video content. Also flagged were scholarly research material, U.S. Government-produced content, and other user-uploaded materials where a glance at the title should be enough to show no possible connection to terrorism.
The government agency in question is France’s Internet Referral Unit (IRU), which is tasked with investigating and removing terrorist material from the French-accessible Internet. As Butler points out, this comes as the EU is considering legislation that would require such content to be removed within one hour of notification, lest the site hosting it be placed on an Internet block list. But thanks to time zone differences, all these notifications arrived in the middle of the night, while the Internet Archive’s employees were asleep; so, to comply with the law, the Archive would have had to take down the URLs automatically and review them after the fact.
As Butler puts it: “Thus, we are left to ask – how can the proposed legislation realistically be said to honor freedom of speech if these are the types of reports that are currently coming from EU law enforcement and designated governmental reporting entities? It is not possible for us to process these reports using human review within a very limited timeframe like one hour. Are we to simply take what’s reported as ‘terrorism’ at face value and risk the automatic removal of things like THE primary collection page for all books on archive.org?”
Laws and False Positives
It seems likely that such broadly targeted requests are the result of badly tuned search algorithms spewing out hundreds of false positives that no one bothered to review before emailing them out. For example, one commenter on Butler’s post notes that a research paper was targeted because it uses an abbreviation for a scientific term that happens to consist of the same four letters as an abbreviation for Al Qaeda. It wouldn’t be the first time algorithmic content takedowns have caused problems: YouTube has so much content uploaded every second of the day that it can’t possibly review it all, and its algorithmic copyright watchdog has meant endless headaches for content creators who use excerpts from copyrighted material within the bounds of fair use.
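To see how easily this kind of filter misfires, consider a minimal sketch of a naive substring matcher. The watch list and titles below are invented for illustration; the actual terms involved in the Archive’s case aren’t public.

```python
# Sketch of a naive substring filter of the sort that could produce
# false positives like these. Watch list and titles are invented examples.
WATCH_LIST = ["isis", "aqap"]

def flag_titles(titles, watch_list=WATCH_LIST):
    """Return every title containing any watch-list string as a bare substring."""
    return [t for t in titles if any(term in t.lower() for term in watch_list)]

titles = [
    "Isis: A Journal of the History of Science",  # innocent hit on "isis"
    "Gardening in Small Spaces",
]
print(flag_titles(titles))  # only the (innocent) journal title is flagged
```

A human reviewer would catch a hit like this instantly; an unattended pipeline that emails its raw matches straight to site operators would not.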
The new antiterrorist legislation may not be entirely bad. Commenter “kravietz” notes that the new law would replace a patchwork of confusing, often contradictory existing laws with one consistent rule, including an appeal process and safeguards against unjustified takedown requests. He also points out that the requests the Archive received are not mandatory court-ordered removal orders, but more advisory-style “referrals” asking ISPs to check for possibly illegal content on a “best effort” basis. And he adds:
Recital (9) also contains the following statement:
“Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content”
Takedown, You’re Busted
But I’m skeptical those safeguards could be effective. Certainly, the American Digital Millennium Copyright Act also includes penalties for sending unjustified copyright takedown orders—but as far as I know, not one bad actor has ever been penalized under those provisions. Sometimes they settle out of court before it can come to that—but even when it gets all the way through a court, all the offender has to do is bat their eyes at the judge and say, “Gosh, your honor, it sure looked like a copyright violation to me” to get let off with a warning.
Even in one of the most blatant takedown abuses to make it to trial—the case where a mother filmed her toddler dancing to a Prince song—the courts ruled that content owners should consider fair use before sending takedowns, but only have to demonstrate a “subjective good faith belief” that it isn’t fair use to avoid a penalty. (Or, in other words, “Gosh, your honor…”) And that one ended up being settled, too.
As I was reading the news this morning, I ran across another such case: cable network Starz filed takedown requests against tweets promoting a news story covering a major leak of pirated unaired television episodes. The news story didn’t provide any links to the pirated content, but Starz acted as if it did. Don’t look for any fines to be levied if that one ever goes to trial, either.
And I suspect something similar will apply to any anti-abuse provisions built into this new terrorism law, too. “Gosh, your honor, it sure looked like terrorism to me.”
The Transatlantic Culture War
The potential for harm to the Internet’s openness is disturbing. Another commenter on the Archive.org blog post points out that the US, Canada, Australia, Britain, Europe, and Russia have all lately, almost simultaneously, passed legislation attacking the open nature of the Internet.
On top of that, the laws of many countries treat material posted to the Internet as if it were made available in the physical location of any local computer capable of downloading it. This means that material that might be just fine under one country’s laws could run afoul of another’s—which recently happened to Project Gutenberg. PG ended up having to geoblock Germany from its entire site after a German court ruled that hosting German material that was public domain in the US but not in Germany violated German copyrights. (And Project Gutenberg was also mentioned in that list of “terrorist material on the Internet Archive.”) And speaking of copyright, the EU also just finished passing its controversial “meme ban” copyright directive, which could give additional ammunition to European courts seeking to remove content from foreign Internet services.
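The geoblocking Project Gutenberg resorted to is conceptually simple: look up the requesting IP address’s country and refuse service to blocked ones. Here is a minimal sketch, with a hard-coded stand-in for a real GeoIP database (the addresses and mapping are invented):

```python
# Minimal sketch of IP-based geoblocking. A real site would query a GeoIP
# database; lookup_country here is a hard-coded stand-in for illustration.
BLOCKED_COUNTRIES = {"DE"}  # ISO 3166-1 country codes to refuse

def lookup_country(ip):
    demo = {"203.0.113.7": "DE", "198.51.100.9": "US"}  # invented mapping
    return demo.get(ip, "??")

def handle_request(ip):
    """Return an HTTP status code: 451 for blocked countries, 200 otherwise."""
    if lookup_country(ip) in BLOCKED_COUNTRIES:
        return 451  # "Unavailable For Legal Reasons"
    return 200
```

HTTP 451, defined in RFC 7725, exists for exactly this situation: content withheld for legal reasons.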
But copyright might be just the beginning. Freedom of speech is one of the USA’s fundamental values—it’s right there in the First Amendment to the Constitution. Europe doesn’t have such a strong tradition of individual free speech—indeed, America’s tradition is in large part a vehement reaction to Europe’s lack of one at the time. Europe has gotten better since then, but its governments still by and large place more emphasis on the group than the individual, and reserve more power to regulate speech they find harmful. For example, Germany has very strong restrictions on the use of Nazi iconography. Until the regulations changed just last year, video games like those in the Wolfenstein series had to strip out all their swastikas for release in Germany. That brings us back to those false-positive terrorist content warnings.
What if ordinary, free speech over here runs afoul of a more restrictive European law?
Judging by what Gutenberg administrator Dr. Gregory Newby said in the story about Gutenberg blocking Germany, European courts do have ways to compel obedience through the US court system, which could well end in more sites being required to geoblock European IP addresses. A good number of comments on that Internet Archive blog post suggest the Archive should just geoblock Europe altogether and be done with it, but it would be sad if that way of thinking won out. It’s the Internet’s very ubiquity that makes it so useful. If it fragmented into national networks with very limited contact with each other, it would be a real tragedy.
It’s interesting to note that one of my favorite new animated television series, Rooster Teeth’s gen:LOCK, is set against a backdrop of a war between two different cultures rather than nations. Series creator Gray Haddock told Newsweek he had been influenced by the divisive 2016 US Presidential campaign, but it seems to me that these US vs. Europe Internet tensions could be just as relevant.
Really great article, but a tiny bit of optimism is warranted re your comment:
“but as far as I know, not one bad actor has ever been penalized under those provisions.”
I’ve litigated a few cases on “illegal” take-down notices. You are correct in that, often, the parties do settle before trial. But that happens in any type of litigation.
E.g., in one case, the plaintiff (the opposing party) was just trying to make a living on eBay. He was selling legit products manufactured by my client, and his eBay business was doing quite well. The client was unhappy with its “high end” products being sold on eBay and sent a “takedown notice.” eBay removed the plaintiff’s shop/site and he was losing tons of $$. When my clients refused to work with the plaintiff and eBay to get his site back up, the plaintiff sued them. And he won in a fairly big way. After reviewing the facts of the case and the law with them, I had advised my clients to work with the plaintiff to resolve the issue and get his site back up, but they got angry and took their case to another attorney. They lost the case, and as far as I know, lost their appeal as well.
Yeah, it’s great for both parties in any given case if they can come to an equitable agreement between themselves. But it doesn’t exactly lead to precedent being set that flouting that particular law leads to fines. It just makes the law appear effectively toothless—which in turn leads to more copyright owners ignoring it.