Greetings from the Internet Law Program #6. Below are my rough notes for the opening session -- an introduction to what things regulate the Internet, using efforts to control speech (pornography) as a case study. The back-and-forth between Larry and JZ is, as usual, in equal parts amusing and illuminating.
[Update: Frank Field is here as well, catching what I'm missing -- do check it out.]
Introducing the faculty and people behind ILAW: Yochai from Yale -- one of the originals; me; Jerry Kang at UCLA, will discuss privacy; Lawrence Lessig, formerly @ Harvard, now @ Stanford; Charlie Nesson is the Berkman Center's founder; John Palfrey, Berkman's ED, which he runs w/great skill and does an increasing amount of teaching; and Jonathan Zittrain, original ED of the Berkman Center, who perhaps knows more than any of us about how all of this works. Behind the scenes: Robyn Mintz, Derek Rumbauer (sp?); Catherine Bracey, Mary Bridges; officials from the MA state government. Wendy Koslow; Hal Roberts handling tech globally, Jesse Ross locally.
Larry Lessig steps to the podium & begins.
Larry: Let's start w/conceptual frameworks... Jonathan was anxious to review the ways that we regulate. He wasn't sure you reviewed the DVD. He suggested a review session.
JZ: I suggested a quiz!
Larry: Okay, so this is how Jonathan is.
So we have selective regulation of speech; picked to frame the full range of tools that government will use.
Think about what regulates, rather than the lawyer-centric view of regulation. Law, norms, architecture, the market.
These doors out here are padlocked; you are barred physically from leaving. Also norms; you can't mix with the crazy Harvard students.
These modalities are not static; they affect one another. Law can change architecture -- there are ramps in buildings like this, for wheelchairs. Four elements are operating together. In a full lecture, we'd get to nuances -- "Can't the market be used to change the law?" Answer is yes, but we won't get into that now.
JZ: Are you admiring your slides or my clothes?
Larry: Your tie.
JZ: Thank you. I like your cardigan. [Big laugh.]
Larry: Plenty of laws regulate speech. One kind is a law that regulates obscenity; ban obscenity. But pornography can only be banned for one group: people under the age of 18. Regulatory problem: how do we deal w/regulations that separate out the world? Focus first on the "some" problem.
Rules, laws: You can't sell porn to kids.
Norms: Most people don't think kids should have porn.
Market: Kids don't have a lot of money -- porn in realspace costs.
Architecture: Hard to hide the fact that you are a kid.
The fact that realspace gives us relatively self-authenticating age means regulation this way works pretty well.
Porn = regulable
But in cyberspace, this changes.
Market = not very restrictive. Much is free.
In cyberspace, no one knows you're a dog; the server doesn't know you're a kid.
Architecture in cyberspace is not the same kind of restraint.
How do we solve this problem? Think through modalities of regulation. If content is free and identity invisible, what's a government to do?
Laws affect things that regulate. How might the law change these features? If (age = minor) & (content = porn), then BLOCK.
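[Larry's conditional could be sketched as a few lines of code -- a hypothetical illustration of the rule as stated, not any real system:]

```python
# Hypothetical sketch of the regulator's ideal rule -- not a real API.
def should_block(viewer_age: int, content_is_porn: bool) -> bool:
    """Block only when both conditions hold: minor viewer AND porn content."""
    return viewer_age < 18 and content_is_porn

# The hard part isn't the rule; it's learning either input online.
print(should_block(12, True))   # True  -- blocked
print(should_block(30, True))   # False -- adults may view
```

[The rest of the session is about how a government could actually learn either input.]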
How to get there?
Options -- if age = minor. Communications Decency Act (CDA) -- it said it in a clearly unconstitutional way: serve porn to minors and we'll throw you in jail.
Then you have, "IDs, please." Authenticate your age.
JZ: CDA approach is definitely wrong. Won't work; too burdensome.
Larry: First thing you said, "You can't do anything about it!" Cyberlibertarians said this about the government. They were wrong. If you're fighting this, it means the government can do it.
Then you said it's a terrible burden.
JZ: Yes, we won't bother about it; too burdensome.
Larry: CDA solution burdens everybody -- why do I need an ID? Don't burden us, only the kids.
Idea 2: "accounts." Increasingly common feature of computers. Impose constraints for users of different accounts. Here's a solution -- how about an account/browser that says, "Hey, I'm a kid, I'm a kid!"
JZ: That's a great idea. Problem w/this, people then know who is a kid! They'll provide special bad stuff designed for a kid. This is a privacy problem.
Larry: Idea 3, then. Rather than broadcast it; we will set up a kid-mode browser. KMB will sit there and be aware of a tag that exists on a site -- the "harmful to minors" tag. This idea only burdens kids. What's wrong w/this?
It doesn't suffer from your CDA complaint, doesn't compromise privacy, yet assures Internet is age-aware.
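[The kid-mode-browser idea reduces to a simple client-side check. A minimal sketch -- the tag name and meta format here are my own hypothetical illustration, not a real standard:]

```python
# Sketch of the "kid-mode browser": the browser, not the server, checks
# each page for a self-applied "harmful to minors" label and refuses to
# render labeled pages. Tag name/format are hypothetical illustrations.
def kid_mode_allows(page_html: str) -> bool:
    """Return False if the page carries the (hypothetical) HTM label."""
    return '<meta name="rating" content="harmful-to-minors">' not in page_html.lower()

adult_page = '<html><head><meta name="rating" content="harmful-to-minors"></head></html>'
plain_page = '<html><head><title>News</title></head></html>'
print(kid_mode_allows(adult_page))  # False -- kid-mode browser blocks it
print(kid_mode_allows(plain_page))  # True -- unlabeled pages pass
```

[Note how the burden falls only on labeled sites and kid-mode clients; adults' browsers never run the check.]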
JZ: How do you know what porn is?
Larry: You think that's hard?
JZ: It's too hard for parents.
Larry: For me?
JZ: Yes, maybe!
Here's how Microsoft has solved this problem -- right inside my browser, I can go to Internet options, can choose level of offensiveness. Killing, killing w/blood and gore, wanton and gratuitous violence. Blood and gore and they didn't deserve it.
Larry: People who compile lists of these sites...like Net Nanny...how does that work?
JZ: Net Nanny you can install -- it gives you updates to your machine, dictates what's offensive or not. Site doesn't have to rate itself.
Larry: That's like a list of banned books...can I see the list?
JZ: As Seth Finkelstein -- who is here -- can tell you, no. He tried to find out.
Larry: He spends his life on the phone asking questions?
JZ: No, they don't take his calls anymore. [Laugh.]
[Demos how you can't see Seth's site w/censorware on; Seth has been censored.]
Larry: Seth's story is a little sharper than we've seen. When you become a critic of the censorware companies, they block your site and the world begins to become the world as seen by Net Nanny. Critics can't see the list to say, "Hey, you've blocked my site."
How do we enable selective blocking of sites for pornography? Violence isn't blocked; otherwise there would be no video game industry. But these companies have an over-inclusiveness problem. This solution is both secret and over-inclusive.
JZ: What about PICS...it came out of tech community.
Larry: Which means it must be great.
JZ: PICS says anyone can set up a set of ratings; you can choose your filter.
Larry: Christian right and loony left -- choose either?
JZ: Both. But you can prevent over-inclusiveness because the lists are transparent.
Larry: I can see how it takes care of the first problem -- it's no longer secret. But what about the second? I'm not so sure. We have ways of criticizing the sites. But it's basically a general system for blocking stuff on the Net.
JZ: Horizontal portability problem. This solution is meant to scale.
Your claim is that it's now really easy to slice and dice speech. But why wouldn't we want that?
Larry: Why so eager to build this into the Internet?
JZ: You're putting your politics into this. Don't you want to protect your kid?
Larry: I do that with a really long password. [Laugh.] But do we want to make the Internet itself the best censor ever created?
Government says, "If you have content harmful to minors, you must have a tag."
Problem: for whom/by whom? In different jurisdictions, what is blockable speech changes.
JZ: Maybe there is a way to have the Internet respect local boundaries -- supply-side filtering. Here are my slides...my font -- brighter and less tortured than Larry's.
Larry: Matches your shirt, too!
JZ: Google localizes for hate speech -- "Stormfront." Zoning. In Google.de this site doesn't exist. But it only deprives Germans of it -- not us. Google was willing to do it, for two reasons. Rule: Never anger a G8 country. Plus, they wanted to do the right thing.
What's wrong with that?
Larry: Obvious criticism -- what about moving offshore -- literally? Build a site in the middle of the ocean. We're free of any jurisdiction, right?
JZ: This is Sealand, for those of you who haven't seen it. A guy claimed it and called it his own. Tax disputes. He won. Issued passports from Sealand. Invasion of Sealand. Real international hub. By day, Roy works in a shellfish processing plant. But otherwise he's prince of Sealand -- and he set up servers there.
But why not target the ISP instead? The one law in Sealand is "no child porn." Because they know they'd lose their Internet access.
Start as a government, filter from supply side. Domain name servers in China not far away from government.
A few years ago we did studies -- we searched for "Tibet," related terms. All ten sites unavailable. "Democracy" blocked. Net Nanny-style implemented "vertical portability" -- up one click to ISP.
Government of the US asked circumvention engine to filter out porno. Can't get any site with "ass" in it; can't see American EmbASSy. Can't get "Bush" or "hot" (Hotmail). "TV" is blocked -- Tuvalu's domain. Global warming is about to make Tuvalu moot, but in the meantime...
This filtering is happening in Saudi Arabia. China. And in the US. The state of Pennsylvania -- you can suggest a site to be blocked because it's obscene. Here's a form where you can report to the governor that you have been....seeing...some child porn. Ended up blocking all of Geocities. Incentive for Geocities to clean it up.
Larry: Wait -- the AG gets an email -- and what happens? Is there any way to second guess these choices?
JZ: Hard to know HOW it's been blocked. They won't reveal the sites being blocked.
Larry: Why isn't there a Seth response to this?
Seth Finkelstein: I don't touch child porn.
JZ: This is the answer. Most won't touch it.
JZ: Small effort there, gingerly. But CDT raised procedural issue -- not substantive issue...
Dave Winer: Next round of pictures from Iraq -- present issues? Political content?
JZ: Q we overlook -- what defines obscenity? Case from the '70s. Appeals to the prurient interest -- something you're both attracted to and disgusted by. No cultural value. Patently offensive.
Larry: Tomorrow oral argument in CDT case...
JZ: Quieter. No press conferences...
What Playboy publishes isn't obscene...there are defined boundaries...settled norms. They could settle this in part because they had lawyers to help draw the lines. This has been up-ended by the Internet. Pressure on mainstream to get raunchier.
Hard to build censorware to help people see good speech and not bad.
Q: Do tech solutions rely on people tagging stuff -- is there a problem w/forcing people to do this?
Larry: Great q; unanswered right now. In realspace, could you require stores to put up a sign that says, "Porno here"? A real burden. But in cyberspace -- html tags... it's hidden. Could find out that B&N has these tags -- but different burden. Spam laws similar. The differences between realspace and cyberspace will determine the outcome here.
JZ: How can you rate the organic Web? Must ask people who provide the content to do it. But it never caught on for people to self-rate. And if the site doesn't offer a rating, you're not going to see it.
Q: Searching/filtering using text...anyone looked at tagging the images themselves? I use a text-based browser. Tagging images would be a smaller job.
JZ: How granular should this be? Let children read news but not see the image?
Late '90s businesses would analyze pictures on-the-fly. Color analysis -- how much flesh? Actually used in airports -- it gives you a good first cut.
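[The "how much flesh?" first cut JZ describes can be sketched in a few lines. The skin-color range and the threshold below are illustrative guesses of mine, not any vendor's actual values:]

```python
# Rough sketch of a late-'90s-style flesh-tone heuristic: flag an image
# if too large a share of its pixels fall in a crude skin-color range.
# The range and threshold are illustrative, not any real product's.
def looks_fleshy(pixels, threshold=0.4):
    """pixels: list of (r, g, b) tuples, 0-255. True if the share of
    skin-toned pixels exceeds the threshold."""
    def is_skin(r, g, b):
        # Very crude heuristic: reddish, r > g > b, moderately bright.
        return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15
    skin = sum(1 for (r, g, b) in pixels if is_skin(r, g, b))
    return skin / max(len(pixels), 1) > threshold

# A patch that is mostly skin-toned trips the filter; a blue one doesn't.
print(looks_fleshy([(220, 170, 140)] * 9 + [(0, 0, 255)]))  # True
print(looks_fleshy([(30, 60, 200)] * 10))                   # False
```

[Which is why it's only "a good first cut": beach photos and faces trip it just as easily.]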
Q: One of the things you didn't discuss -- cultural/educational solution. You can generally see what your kid has looked at, talk about it.
Larry: Good point. We began by thinking about modalities. Law can assert such & such. Tags in material. Works by creating incentive. But there are alts. Work in the same way. Norms. Children/parents and "self-policing." Congress passes baldly unconstitutional laws -- they rely on courts to strike it down. Pass it again. Teaching and morals education to deal w/this -- not tech or law.
We introduced this to think about regulation functions.
Seth F.: You touched on Anonymizer -- I wanted to stress architectural regulation...government cannot allow people anonymity/privacy. Government blocks Anonymizer. Interesting category in censoring called a "loophole." A privacy site, a language translation site, an anonymizing site. This is a threat to control. If you go to Google images, you search for "breasts." You get not quite...wardrobe malfunctions. It was an architectural requirement.
Larry: [...] Two ways to regulate speech: 1.) hide it all, or 2.) change the mix, so people don't get too much. One Madonna album won't destroy Hungarian culture. Speed bumps. But if you're absolute, you do have to take the steps you're talking about. Underlying aim -- do you need absolute control? Not w/porno.
JZ: It is getting cheaper and easier to censor. Small fences keep in large mammals. If they manage to jump it, you start to watch. Anonymizer doesn't make you anonymous to the country you're in, but to the sites you visit from there. It's a good way to identify people who want to jump.