Friday, June 15, 2007

DISCLAIMER: I don't feel like proof-reading this. It's probably muddled and repetitive and [random disparaging adjective], but I just don't care that much. Sorry.

Background, part I: Shortly before the sixth Harry Potter book was released, a bookseller in Canada accidentally sold copies to fourteen people. The courts issued an injunction prohibiting the buyers from reading said copies until the intended release date.

Background, part II: Richard Stallman called on us to boycott Harry Potter because of this.

His argument is, at its core, that you have a right to read any book you own. This seems reasonable, but he doesn't really address--to my knowledge--the reasoning behind the injunction. That reasoning is reasonable, though still wrong.

Suppose I promise you that I'll eat a hamburger. Then I remember that I don't eat meat. Too bad: I made a promise and I have to keep it. You have a right to have promises made to you kept, and the courts come in and force-feed me the hamburger to protect that right. (I used to have a right not to eat a hamburger, but then I freely promised I would.)

Suppose, however, that I promise you that I'll eat a hamburger, and promise my brother that I won't. I've now created a situation where either you or my brother will have your rights violated. The courts can come in and enforce one of your rights, but not the other. The solution is to force me to offer one of you something sufficient that you'll say 'Okay, this is just as valuable to me as your eating/not eating a hamburger.' I only have to keep my promise to you so long as you want me to.

Suppose, however, that you want a million dollars and my brother wants a billion dollars. I don't have a million dollars, and I certainly don't have a billion dollars. I can't pay either of you off.

So the courts come in and say, 'Ah! Someone's rights will be violated whatever we do, so the only two differences are:
* whether Luca wants to eat the hamburger (I don't), and
* the value each of you puts on my fulfilling my promise.'

It seems reasonable (though arguable) that I gave up my right to choose whether I eat the hamburger, so the only thing to consider is the harm of the two options: breaking a million-dollar promise to you or a billion-dollar promise to my brother.

The courts should make me break the million-dollar promise. No hamburger eating for me. Sorry.

However, you could pop up and say 'No, it's worth a hundred quintillion dollars to me!' and who are we to refute you? (Call your bluff by offering you a million dollars? Nope. You're too smart for that. And maybe you're not bluffing.) Or you say it's priceless. And my brother says the same. At that point, the courts have to decide what a reasonable value is, and that sucks, but it's how things actually work.
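To make the courts' rule concrete, here's a toy sketch (purely illustrative; the function name and the 'reasonable value' cap are my own framing, not anything a court actually computes): cap each claimed value at whatever the court deems reasonable, then break whichever promise costs less.

```python
def promise_to_break(claims, reasonable_cap):
    """Pick which promise the court lets me break.

    claims: dict mapping promisee -> the value they claim the promise has.
    reasonable_cap: the court's ceiling on claims, since 'a hundred
    quintillion dollars' can't be taken at face value.
    """
    # Cap each claim at what the court considers a reasonable value.
    assessed = {who: min(value, reasonable_cap) for who, value in claims.items()}
    # Utilitarian rule: break the promise whose loss is smallest.
    return min(assessed, key=assessed.get)

# The hamburger case: you claim $1M, my brother claims $1B,
# and the court caps all claims at $10M.
print(promise_to_break({"you": 1_000_000, "brother": 1_000_000_000}, 10_000_000))
# -> 'you': the million-dollar promise gets broken. No hamburger for me.
```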

The hamburger is a silly example. The original issue was a Harry Potter book. (The bookseller promised the publisher they wouldn't let anyone read it, and by selling it they promised the buyers they could read it.) RMS offers the example of the accidental publication of a product's health effects. I offer what I think is a stronger counter-example: what if a doctor's office accidentally publishes my medical records?

Slashdotters often argue against copyright by saying 'Don't publish what you don't want public' (which is a bad argument for reasons I won't detail here), but if I share information with my doctor on the basis that he won't share it, and then he does, it's not my fault. I had a reasonable expectation of privacy. Now you have my medical records. This is terrible. You'll exploit them for your own nefarious purposes, I just know it.

This, like the Harry Potter case, is different from the hamburger promise case. In the hamburger case, we had to decide which promise would be broken. In this case, one promise (the doctor's promise of secrecy made to me) is already broken. The question is whether to break the doctor's promise to you ('here, you can have (==can read) these medical records!') by taking them back.

Instead of these two choices:
* Break a promise to you/publisher + bad consequences X
* Break a promise to brother/consumer + bad consequences Y

We now have these two choices:
* Bad consequences X (promise to me already broken)
* Break a promise + bad consequences Y (promise to me already broken)

Before, we chose which promise to break based on Utilitarianism ('minimise bad consequences') because the Deontologist (the 'keep promises' moral theorist) had no way of choosing one over the other. But now the Deontologist has a way of deciding. He doesn't give a damn about consequences.1
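To spell out the difference (again just an illustrative sketch, with made-up numbers): the Utilitarian compares total bad consequences, while the Deontologist counts only the promises each option would newly break, and the already-broken promise to me drops out of both ledgers.

```python
# Each option: promises it would newly break, and how bad its consequences are.
# The promise to me is already broken either way, so it appears in neither.
options = {
    "leave the records public": {"newly_broken": 0, "consequences": 100},  # X
    "take the records back":    {"newly_broken": 1, "consequences": 10},   # Y
}

# Utilitarian: minimise bad consequences; promises be damned.
utilitarian = min(options, key=lambda o: options[o]["consequences"])

# Deontologist: minimise newly broken promises; consequences be damned.
deontologist = min(options, key=lambda o: options[o]["newly_broken"])

print(utilitarian)   # 'take the records back' (whenever Y < X)
print(deontologist)  # 'leave the records public': no new promise broken
```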

So the Harry Potter injunction is wrong because Deontology is the correct moral theory. If (Act) Utilitarianism is correct, the injunction is probably right.

That's why the injunction is reasonable. Utilitarianism is a reasonable position to take. It just happens to be the wrong one. And, no, I can't prove that.


1 People always assume Utilitarianism is the cold, calculating moral theory of logical people--it's the one Vulcans hold--but the Deontologist is as ruthless and cold as any Utilitarian, and it's the only moral theory I know of that actually tries to derive its principles from logic. (It makes basic assumptions, of course, but its 'emotional' rules are derived logically from the basic premises, whereas Utilitarianism doesn't need to bother with any logic.)

2 comments:

Leif K-Brooks said...

Well-written, well-thought-out, and interesting. Very excellent. I learned from this post.

All of the possible outcomes result in wrongness; that's bad. Obviously, the ideal is for situations like this—ones where some level of wrongness is inevitable—to never occur.

Now, what if that could be enforced at a governmental level? If you think it sounds like I'm talking about Big Brother looking over book store owners' shoulders to make sure they don't release books early, you're right, sort of.

Essentially, what I'm suggesting is to have a device—I'm going to call it a chip, but it could really be a circuit board containing multiple chips, or something else entirely—implanted in the brain of every human being. These chips wouldn't alter thoughts, but they would alter behavior to prevent wrong actions. If you tried to punch me in the face, for example, your hand would stop obeying you in mid-air before it reached my face. Wrong actions would be roughly defined as:

- Violating promises. The chip wouldn't require people to do anything special in order to establish a promise; it would be as simple as saying, 'I'll lend you my car on Tuesday if you fill it up with gas before you bring it back.' Rather than parsing the language itself, the chip would read the brain's understanding of what was being said. This lets the chip invalidate a promise if people understand it differently, and it guarantees that language changes won't require the chip to be updated. Promises would also be invalidated for conflicting with a promise one of the parties has already made (there's a rough sketch of this rule after the list).

- Touching/hurting someone's body. By default, any kind of physical contact—whether it's a high-five or a knife to the stomach—would be impossible. This could be changed using promises: 'you can poke me if I can poke you back'. Promises of this type (and any other type) could also be one-sided.

This sounds annoying, but since the chip's promise handling code would be based on understanding, promises wouldn't necessarily need to be verbal: if someone's extending his hand to you and both of you understand that as an invitation to shake hands, that's a promise which allows you to shake his hand.

- Violating personal property. This would work about the same way as touching/hurting people's bodies. To enter someone else's property, you would need their permission in the form of a (possibly one-sided) promise. People could set ridiculous requirements for entering their property; it would be possible to make people agree that you can kill them if they come on your property, for instance. This shouldn't be much of a problem: if you don't like the conditions, go somewhere else.
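Here's a rough sketch of how the chip's promise bookkeeping might work (entirely hypothetical, of course; the real chip would read understanding straight out of the brain rather than matching strings). The key rule from above: a new promise is refused if it contradicts one its maker is already bound by, so the conflicting-promises dilemma from the original post can never arise.

```python
class Chip:
    """Toy model of one person's promise-enforcing implant."""

    def __init__(self, owner):
        self.owner = owner
        self.promises = set()  # propositions the owner is committed to

    def make_promise(self, proposition):
        """Record a promise unless it contradicts an existing one.

        A 'not ' prefix stands in for negation; the real chip would
        compare the parties' understanding, not strings.
        """
        negation = (proposition[4:] if proposition.startswith("not ")
                    else "not " + proposition)
        if negation in self.promises:
            return False  # conflicting promise: invalidated, never takes effect
        self.promises.add(proposition)
        return True

    def permits(self, action):
        """The chip physically blocks anything the owner promised not to do."""
        return ("not " + action) not in self.promises


me = Chip("Luca")
print(me.make_promise("eat the hamburger"))      # True: promise recorded
print(me.make_promise("not eat the hamburger"))  # False: conflicts, refused
print(me.permits("eat the hamburger"))           # True: the first promise stands
```

Default-deny rules like touching would just be prohibitions pre-seeded into everyone's set, which a (possibly one-sided) promise could then lift.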

One issue with property is how to distribute it at the beginning. Even if everyone gets an exactly equal amount of land, how do you decide who gets which piece? What if there are disputes? And what if the only path from property A to property B goes through property C, whose owner doesn't let anyone else on his land? I don't know what the solution to this issue is. There may not be one.

The chips would completely replace all other government. There would be no one anywhere who could control the chips; they would always do exactly what their logic told them to do, with no exceptions. (Obviously, this means the software would need to be the most thoroughly tested thing ever.)

Lots of things would just naturally fall into place under this system. Copyright, for example, would just be a simple promise: in order for you to buy a CD, you would have to promise not to copy it, and your chip would enforce that promise on you, physically stopping you from doing so. This would be bad for fair use, but fair use seems like a utilitarian thing, and my chip system is already very much non-utilitarian.

Getting back to the Harry Potter example, the book store would've promised the publisher not to sell the books before the release date, and so their employees' chips would have prevented them from selling the books. No wrongness occurs, and the moral dilemma you posted about never comes into being.

There are some problems with this system, of course. I already mentioned the problem with distributing land. There's also a question of when/how new-born babies will get the chip installed, and I'm not sure how much authority their parents should have over them, and for how long. And, of course, there are numerous technical issues; I don't think computer-brain interfaces are even close to where my system would need them to be, for instance.

If nothing else, I think this makes an interesting thought experiment.

LKBM said...

That is essentially how I justify copyright (promises, not mind-control chips), but it gets slightly messy. When I buy a record, I promise not to copy it, but the person who overhears it as I'm listening has made no such promise. /That/ restriction /is/ a government-granted right, I guess as an enforcement of the zeitgeist.

If the popular opinion is 'no copyright', the government should stop enforcing restrictions that go beyond promises.

Right now loads of people pirate, but I think they believe it's wrong in most cases. With out-of-print stuff, possibly not (though that's messy), and with backing up stuff you've purchased, surely not.

Sadly, the less respect people have for copyright, the more strongly the government enforces it, when that's the exact opposite of how it should go. Legal restrictions are justified by the social conscience and must follow it.