Bono Still Hasn't Found What He's Looking For
I don’t get to blog much about technology policy issues anymore, but every once in a while something juicy comes along that is worth spending a few minutes responding to. In Sunday’s New York Times, Bono (U2’s lead singer) wrote a piece advocating 10 ideas for the next decade. Among them was a call for widespread filtering and deep packet inspection of the Internet to stop sharing of copyrighted content:
We’re the post office, they tell us; who knows what’s in the brown-paper packages? But we know from America’s noble effort to stop child pornography, not to mention China’s ignoble effort to suppress online dissent, that it’s perfectly possible to track content.
Perhaps Bono shouldn’t try to play policymaker before understanding the limits and inherent dangers of this approach. Yes, some level of packet sniffing and filtering is possible, but the approach is dangerous and ultimately futile for dealing with the underlying problem. Setting aside the obvious civil liberties issues, USACM’s technical experts made three points about the technology itself in a letter to Congress when it was considering such a filtering/inspection mandate for all US universities receiving federal funds:
- there are known techniques for bypassing filtering (e.g., encryption); in fact it can be proven mathematically that this race will never be won by the side seeking to filter
- installing deep packet inspection technology opens up a Pandora’s box of security issues as attackers can target a specific point that looks at all traffic on the network
- no filter can determine what is and what isn’t legal use of a copyrighted work
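The first point can be illustrated with a short sketch. The signature, packet layout, and function names below are hypothetical, and a toy hash-counter keystream stands in for a real cipher such as AES-CTR: a signature-based filter matches known content in plaintext traffic, but the same bytes become unrecognizable once the payload is encrypted.

```python
import hashlib
from itertools import count

# Hypothetical content fingerprint a DPI filter might scan for.
SIGNATURE = b"COPYRIGHTED-TRACK-0001"

def dpi_flags(packet: bytes) -> bool:
    """Signature-based deep packet inspection: flag packets containing the fingerprint."""
    return SIGNATURE in packet

def keystream(key: bytes, n: int) -> bytes:
    """Toy hash-counter keystream -- a stand-in for a real stream cipher, not for real use."""
    out = b""
    for i in count():
        if len(out) >= n:
            return out[:n]
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

packet = b"p2p-header|" + SIGNATURE + b"|payload"
print(dpi_flags(packet))                           # True: plaintext traffic is flagged
print(dpi_flags(encrypt(b"session-key", packet)))  # False: the ciphertext reveals nothing
```

Once the peer-to-peer application encrypts its traffic, the filter sees only ciphertext, which is statistically indistinguishable from any other encrypted stream.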
All of these are just as valid in response to Bono’s idea. From USACM’s letter:
Encryption is an even more effective countermeasure as strong encryption of traffic will render filtering technology useless. When traffic is encrypted it becomes impossible for any technology to distinguish infringing traffic from non-infringing traffic, or even from routine encrypted traffic such as e-commerce transactions or corporate applications such as virtual private network traffic. Encryption is a widely available technology and one that could be readily incorporated into peer-to-peer applications.
Second, because filtering technologies depend on seeing all traffic flowing over a network they raise significant new security risks. An attacker (external or internal to the filtering organization) can potentially use this infrastructure to gain the same look into the network traffic that the filter uses. This access would be very valuable for attackers trying to steal identities, personal or financial information or gain illicit access to valuable research.
Finally, filters can undermine existing freedoms, rights and research. Even the best filters cannot determine what is a fair use of a copyrighted work. A policy requiring or encouraging filtering without having a process to resolve fair use claims would undermine existing, long-established rights as overly aggressive filters blocked otherwise legal activities. Having such a process is not possible if the intention is to block content in real time.
Further, false positives — blocking content that is in the public domain because it happens to share a signature with a copyrighted work — could have a significant negative impact on distribution of educational material at universities. False positives may also hinder legitimate academic research endeavors that rely upon an open and flexible Internet as a platform for experimentation and innovation. Overly broad filters might interfere with legitimate research on peer-to-peer networks, as well as grid or cloud computing efforts.
Infringement of copyrighted works on university networks is a serious issue. However, a Federal policy that promotes or requires filtering will indirectly add to the costs of education and university research, introduce new security and privacy issues, degrade existing rights under copyright, and have little or no lasting impact on infringement of copyrighted works.
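The letter’s false-positive point can also be sketched. Assume a hypothetical filter that, for speed, matches content by a coarse truncated checksum; real systems use richer fingerprints, but any fixed-size signature has the same pigeonhole problem, so unrelated public-domain content can collide with a blocklisted work:

```python
import zlib

def coarse_fingerprint(data: bytes) -> int:
    # Hypothetical filter fingerprint: the low 12 bits of a CRC32. Real filters
    # use larger or fuzzier hashes, but any fixed-size signature admits collisions.
    return zlib.crc32(data) & 0xFFF

# Fingerprint of a (made-up) copyrighted work the filter is meant to block.
blocklist = {coarse_fingerprint(b"verse 1 of a copyrighted song ...")}

# Search public-domain-style strings until one collides with the blocklist.
false_positive = None
for i in range(1_000_000):
    candidate = f"public-domain pamphlet, page {i}".encode()
    if coarse_fingerprint(candidate) in blocklist:
        false_positive = candidate
        break

print(false_positive)  # an innocent string the filter would nonetheless block
```

With only 4,096 possible fingerprint values, a collision turns up after a few thousand candidates on average, and the filter blocks material it has no business touching.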
(Thanks to Peter Harsha at CRA for titling this post for me.)