Fordham CLIP Symposium: Panel, "Intermediaries as Legal Filters" - Fordham Intellectual Property, Media & Entertainment Law Journal

Fordham CLIP Symposium: Panel, “Intermediaries as Legal Filters”

Live Blog of the Fordham Center for Law & Information Policy “Third Law & Information Society Symposium” — Intermediaries in the Information Society

Panel, “Intermediaries as Legal Filters”
Moderator: Rob Frieden
Panelists: Dr. Ian Brown, Wendy Gordon, Dawn C. Nunziato, James Grimmelmann

[Blogged live by Jason Lunardi, citations edited subsequently]

[2:38] Moderator Rob Frieden: Mentions he wrote an article for the Fordham Intellectual Property Law Journal on network filtering and deep packet inspection [article available at]. Filtering can be very aggressive and proactive. There are also net-neutrality questions. Are there commercial possibilities for companies to be non-neutral? If so, what does that mean for their DMCA safe harbor?

[Introduces the panelists]

[2:42] First Speaker, Dr. Ian Brown:
[powerpoint slides] What he will talk about: Infringement, gossip, and child abuse. Constitutional code and information anarchists.

The E-commerce Directive is similar to the CDA and DMCA; it shows that telecom lobbyists were very successful in the 90s. The Directive's language says "mere conduit" (art. 12).
Copyright Directive: the fair-use-style exceptions are optional for member states; the ISP protection is the only mandatory element of the Copyright Directive.
Proportionality (3-strikes model): infringes rights to privacy, expression, association, education, commerce, civic engagement.
Oversight: lack of transparency, etc. If a challenge gets anywhere, EU courts are much slower to enforce. It might be 20 years after passage! Look at the DNA sample case; they only ruled in Dec. 2008.

The Data Protection Directive applies not only to governments, but to companies, and INDIVIDUALS! It does not exclude individuals. The only ECJ case was Lindqvist (a Swedish case): gossip posted on a website, but it was medical info, and this qualified as a protected piece of info. [C-101/01]

How far can we regulate individuals as well as intermediaries as data controllers? Easier to regulate companies than individuals, as the music industry has realized.

Blocking child abuse images: a British Telecom system blocks access to pages on a secret blacklist, without notice to the sites concerned. There was a huge media storm in Britain a few months ago when Wikipedia got on the blacklist, triggered by the cover of a Scorpions album. The list was imposed by the government on ISPs. Now being considered by the EU, with little consideration of constitutional issues. The list has also expanded past child porn to other unwanted speech, such as terrorism, bomb-making, and pornography in general.

What role can technology also play in protecting rights? Can the attacks on P2P systems be a boon for free speech? Resulted in replicated peers, more users.

Can you design for privacy? Data minimization is key: limit what you collect, and limit how long you keep it. Users must be notified and consent. This is EU law already, in the Directive!
Why not encrypt the data?

We want to encourage systems that support constitutional values.
Dr. Brown asks, “How far has public disgust with political corruption damaged the rule of law?”

[2:57] Second Speaker, Wendy Gordon:
Her concern is with methodology.
Prior to Grokster, the absence of filtering was seen as a way to escape contributory or vicarious liability. Now, a service may nevertheless be liable because it intends to help infringement.

Immense debate as to what Grokster means. One step in the direction of requiring filtering. But this has free speech downsides! Could block this generation’s Pentagon Papers. We need a way to circulate modes of rebellion.

Policywise, maintain a system of laws that maintains a way to have pure peer-to-peer communications to allow this filterless free speech.

Tim Wu, looking at Grokster, said decision is “morally bankrupt” and ducking the hard questions — Supreme Court avoided the big questions by going with intent test.

Does it ever make sense to use an intent test?
Some things are so bad, like torturing a child, or cutting up a live person to distribute his organs to save others, that no one would ever consider them.
The Grokster Court "blinded" itself to the free-speech aspects of the case. Maybe what was being encouraged is actually a very bad thing. Even if a user thought he was within fair use by uploading to Grokster, copyright is strict liability, so he would be infringing. But some legally infringing conduct may be morally all right. She can't see why the underlying conduct in Grokster should be treated as prohibited at all costs, akin to murder!

To measure harm, look at how the party would have fared "but for" the actions of the other. Is it better never to hear the song than to be prohibited from reuse and making copies? Sometimes exact copying is an important part of free expression: saying the Pledge of Allegiance, part of every religion, any institutional ritual. She can't see why the record companies, the copyright owners, are merely giving a "benefit to the users that they can simply withdraw at will". The record companies don't have a blanket claim to all uploaded content. If it doesn't square with the moral core, then maybe there was nothing wrong with what Grokster was doing!

[3:12] Third Speaker, Dawn Nunziato:
How broadband service providers are regulated, and how they should be.
Backdrop of the FCC's Aug. 2008 order that Comcast was unlawfully discriminating against P2P protocols. Comcast was accused of deep packet inspection and blocking certain protocols.

Did the FCC act with proper legal authority?
There has been a pattern of deregulation over the last decade.

Telecommunications Act of 1996.
Policy of minimal regulation for Internet.
Also a policy of user control over what we receive over the Internet.
Immunization for ISPs for blocking content that is objectionable.

How are telecoms regulated?
First, it was a common carriage regime under the Act, from an ancient doctrine that imposes a duty on communication facilitators to act without discrimination. That is how it was regulated in the early 1990s.
When cable broadband came, there was a question whether the common carrier regime would continue. In 2002, the FCC decided cable broadband was not under the common carrier regime.
In Brand X decision, S.Ct. says fine for FCC to treat cable broadband as immune from common carriage regulation.

At the same time, in 2005, the FCC issued some policy statements. They state that users have four freedoms, including choice of Internet content and choice to run the service of their choice.
They are not enforceable documents, but the FCC said it won't hesitate to take action if they are violated!

After this uncertain regulatory regime, better to proceed cautiously. In her book, "Virtual Freedom," she documents instances of broadband providers discriminating against certain communications, e.g. Cindy Sheehan, Downing Street, Verizon Wireless censoring NARAL, and Comcast blocking P2P sharing using deep packet inspection, then lying about it.

FCC said that Comcast action was illegal, in violation of the 2005 policy statement. Comcast argues that FCC had been deregulating providers for 10 years! FCC argues they have ancillary jurisdiction to regulate non-common carriers. Comcast then responds that Scalia in dissent to Brand X asks “ancillary to what?”

Her policy view is that the broadband companies are just the pipelines, and they have no business interfering with our free-speech rights. They may have economic interests in controlling content, but that should not trump our right to access what we will. The concern is with keeping the pipes open to preserve free-speech rights.

[3:29] Moderator says a way to conceptualize "ancillary to what" is to think of "A is to B; B is to C; therefore A is to C."

[3:30] Fourth Speaker, James Grimmelmann
Presentation of paper about regulation of search engines as opposed to other intermediaries.

Website operators aren’t the only intermediaries; users choose what they view by using search. You can squelch speech in another way at the search engine level.

Proposals … (1) Could repeal Section 230 for search engines. (2) A notice-and-takedown provision. (3) Pressure to make websites less visible, requiring them to use robots.txt or nofollow metatags.
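For readers unfamiliar with the opt-out mechanisms mentioned in the third proposal, this is roughly what they look like (a minimal sketch using the standard conventions; the example URL is hypothetical):

```
# robots.txt, served at the site root: asks compliant crawlers
# not to crawl the whole site
User-agent: *
Disallow: /

<!-- Or, per page, a robots meta tag in the page's <head>: -->
<meta name="robots" content="noindex, nofollow">

<!-- Per link, rel="nofollow" asks engines not to follow or credit it: -->
<a href="http://example.com/" rel="nofollow">example link</a>
```

Note that all of these rely on search engines voluntarily honoring the convention; none of them technically prevents indexing.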

Interventions that make search less useful make it harder for users to lead independent lives.
Some information would be unfindable. Some websites would take steps to remove themselves from the search ecosystem.
At some point search engines would stop indexing if the liability was too much.

Opportunity for gamesmanship at an important choke-point. Notice/takedown would create this problem.

Keeping all information up seems to permit the excuse that everyone is allowed to speak. What may be more important is the listener’s right to hear, rather than the speaker’s right to speak.

Better to target the owner of site where content appears, rather than search engine. Otherwise, like “shooting the messenger.”

[3:42] Jerry Lewis, Comcast (in audience)
Often lost in discussion: underlying technical reasons for network management — namely network congestion. The technology, though, is content neutral.
The new technique is content agnostic and PROTOCOL AGNOSTIC.
This can be used for brief times to alleviate network congestion.
Management will not go away in any regime — we have to deal with it. More disclosure, transparency, working with customers, vet it through regulatory bodies.
Free speech is always a risk; you can always find some examples, but they don’t always necessitate a change. Some edge cases, but they don’t make the case.

[3:47] Questions & Rebuttals

Susan Scafidi (from audience): Isn’t it a broad claim that for anyone who has created something and released it, it is immoral to try to control it? The logical conclusion would be that there should be no copyright. Better to have loved and lost than never to have loved at all?

Gordon: She is talking about an elaborate mental conception of the moral core of copyright, not questioning the consequentialist conception of copyright. She is only asking: what is the moral minimum of copyright?
First, the bit about Grokster asks if what the kids are doing violates morals so much that court can avoid looking at consequences.
Second, even within the moral universe, no such thing as infinite slippery slope.

[3:53] Audience member question: (to Grimmelmann) How extreme a position are you arguing?

Grimmelmann: Purely that the hosting site is a much better target than the search engine.

Audience member question: The move toward increasing liability of intermediaries includes deputizing gatekeepers such as financial companies to cut off money to gambling sites. How does that affect things?

Grimmelmann: If you wholly internalize things in the government, you get the secret-blacklist type of thing. If we have this commitment against Internet gambling, how do we accomplish it while preserving due process concerns?

Audience member question: With public records, it used to be that only a certain number of people bothered to go in and read them. Now there is less friction for dissemination. How will that affect things?

Grimmelmann: Peter Winn has a draft on this issue. Courts have the responsibility to make responsible choices in deciding to make this info public: a procedure where most things are available immediately, sensitive things take more review, and highly sensitive materials are extremely scrutinized.

Brown: Comments that Grimmelmann’s take on the law is like the classic Internet "end-to-end" conception, where it is more efficient to make decisions at the ends and let the middle be neutral.

Jason Lunardi