Speaker 1: Thank you. I'm going to warn you right now, there are no audiovisuals, and part of that's because if you're looking at someone, I want you to be looking at me, and part of that's because after 30 years of working in technology, I don't trust it. But don't worry, I have notecards. When I told the organizers this, they looked slightly chagrined, but I said don't worry, it's only for the long, boring quotes I want to use and the statistics that I want to bore the audience with, and that made them feel a great deal better. So let me say, when I heard the topic was entropy, nothing came to mind faster than data, because we are surrounded by data that seems to be spinning out of control. Data being lost by corporations, data being stolen from government agencies, data that we are volunteering that's being collected about us, billions of bytes a day that seem hopelessly out of control. And just to continue the focus on entropy, it seems to be getting worse. So I thought what I might do this evening is talk about the particular challenge of personal data and privacy. Remember, much of the data we are talking about we are volunteering. We are posting those pictures of our delicious meals that we think our friends care about. We are sending millions and millions of texts a second. We are posting images and videos at a colossal rate, a rate that could not have been imagined. It's become almost meaningless to talk about the volume, because unless you're a computer scientist talking about things like petabytes and terabytes, it just adds up to the point that it means nothing. But it has a tremendous impact on our privacy. It has a tremendous impact on this data that we are volunteering, that's being collected about us. In some cases, it's being calculated or inferred about us. Are you a good credit risk? Should you be able to buy that car? Are you somebody we want to market to?
These may not even be data that really exist about you, but rather data that are being created. The New York Times reported in 2017 (this begins the statistics) that a company you've never heard of, not Facebook, not Amazon, not a company that trips off your tongue, engages in 50 trillion personal data transactions a year. That's buying and selling your data and mine, every year. It seems completely out of control, and along with it, our privacy. There are many reasons for this, but the one that I want to focus on, which I hope will be of interest to you, and which I think is a tiny bit controversial, is the role that consent plays in data protection and privacy today. Modern privacy law really came about in the 1960s. It came about from an academic study, which makes people who work in universities very happy. Dr. Alan Westin, who was at Columbia University, wrote his doctoral dissertation, for which he later got funding to turn into a book (this sounds familiar so far) called Privacy and Freedom. In that book, he defined privacy in a way that every country in the world now follows. That is, and I quote, "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others." By the 1990s, every country had followed suit. In fact, in the New York Times, William Safire wrote, "Excepting legitimate needs of law enforcement and public interest, control of information must rest with the person himself." Now, you might not care about Alan Westin or William Safire, but the Supreme Court went along with this view as well, and in 1989, Department of Justice v. Reporters Committee gave us the definition that we use today: "both the common law and the literal understandings of privacy encompass the individual's control of information concerning his or her person." Now, just quickly, unless you think this is just a U.S.
phenomenon, Europe and Asia and many other countries have followed suit. Europe, you may know, enacted a new General Data Protection Regulation. It took effect in May 2018, and in that regulation, although the Europeans are quick to say they've not made the mistake the U.S. has made of focusing so much on consent, they used the term 108 times. It's still pretty important. And California has adopted our most recent privacy law, the California Consumer Privacy Act, which gives individuals the right to consent to uses of their data that are collected online. Well, look, this sounds great. I mean, who could be against consent? In fact, challenging consent seems totally counterintuitive in the world of privacy, because privacy is, after all, so closely linked to ourselves and our autonomy. But for seven quick reasons I want to outline, I think it's both impractical and undesirable that we focus on consent, and I think that focus explains why privacy law is in the dreadful state it is today. Okay, first, think of the complexity of privacy notices. You see them all the time. You probably ignore them. That's okay, almost everybody does, unless you're a lawyer and get paid to read them. But you go visit the doctor, you get a privacy notice. You log on to a website, you get that little cookie privacy notice that's required by European law; that's why you get it. You know, researchers like to count things, and if you count, for example, PayPal's privacy notice is 36,275 words. That's, by the way, longer than Hamlet. iTunes' privacy policy comes to 19,972 words, just longer than Macbeth. One 2008 study calculated that to read the privacy policies of the 40 or 45 most popular websites in the world would take an individual 30 full working days a year. So these notices are complex because the things they describe are complex. They are difficult to understand, and we often just pass them by. Second, they are often simply inaccessible. For example, how many of you have phones?
Okay, everybody in this audience has a phone, I'm willing to bet, and every one of you could be recording right now, despite that nice sign at the entrance that says, please don't record. Did you give me a notice? Did I consent to that? Did we discuss that? In other words, how do you provide consent in environments in which you're in a group? How about when you walk down the streets of Bloomington? There are cameras now. Did you consent to that? How do we manage consent in a world in which data is being inferred about you or collected as part of a group? A third reason is that consent has proven incredibly ineffective, mainly because people just ignore it. And I love this quote; it's why I carry it with me. Federal Trade Commission chairman Jon Leibowitz, back in 2009 (and remember, the FTC is the United States' largest privacy regulator, the people who make the rest of us post notices and obtain consent), said: "We all agree that consumers don't read privacy policies." Well, you know, that's a little troubling from the guy who's done more than anyone else on earth to make us have them. In fact, his predecessor, FTC chairman Timothy Muris, commented at the end of 2001 on a new law that required more notices and consent opportunities in the financial services market. Chairman Muris said this: "Acres of trees died to produce a blizzard of barely comprehensible privacy notices. Indeed, this is a statute that only lawyers could love, until they found out it applied to them too." Okay, fourth, what we often find is that the consent we're giving is entirely illusory. We have no choice. Try to update your iPhone, right? There's new software; it comes out every couple of weeks. If you don't update it quickly, you're prompted to update. Then it starts nagging you. Then it says it's going to update for you automatically. Then the phone stops working.
The first thing it does when you go to update is say: here are the 74 screens of our privacy policy. You can download it. You can email it. You can agree to it. But you cannot not agree to it. That's illusory consent. That's meaningless. It may make lawyers feel better. In fact, Apple makes it pop up twice: consent, yes, and then it asks, do you really consent? The alternative being, would you like us to turn your very expensive phone into a brick, which is what happens if you say no. Okay, fifth, all of these consent opportunities place a huge burden on individuals. And while consent is often talked about as a right, sometimes even a human right, it's realistically much more of a duty, much more of a burden on individuals. And remember, when you make that choice, it has the legal effect of shifting the liability from the data processor to you. It's just like when you drive into the garage and take that ticket, and on the back it says, we have no liability for anything. We can crush and melt your car. We can throw it off the edge of the garage. We're not liable for anything. That may look like a right to you, like you got the right to consent to that by driving into the garage. But realistically, it was the imposition of a burden on you and on me. And that burden is quite significant at times because of the legal significance that can attach to those choices. Sixth, consent often does an enormous disservice to both individuals and society. Think about press coverage. Do we really want the president to have a right to privacy right now, such that only his consent will allow coverage of what he's been up to? Fraud prevention, crime detection: do we want to wait until the criminals consent before we can use their data? So what we do in these cases is override the consent. We acknowledge that consent doesn't work.
Research, something close to my heart, often depends on being able to use past data, often in an anonymized fashion, without going back and getting individual consent. But finally and most importantly, the real challenge with consent is that it leads to lousy privacy protection. Clicking "I agree" is not the same thing as getting privacy. You agree to terms that often eliminate your privacy. You agree to broad terms that appear to have no limit. We're being asked to do things that we could never be asked to do in other consumer protection settings. Right? Imagine a consumer protection law that said, well, you can always ask the consumer if it's okay to defraud them. We don't allow that. We set rules and then we hold people to them. We don't allow you to consent them away. Okay, so I'm tempted to end there, but I won't. I have four more things I want to say very quickly, and that's because I want to say something positive. I don't just want to be a drain on your otherwise stimulating evening. So what might we do differently? Right? One thing would be less focus on consent and more on stewardship of data. If you collect my data, if you use my data and something goes wrong that causes harm, you should be liable for it. You can't shift that liability by asking me to consent to take it on myself. You would be the steward of my data. And that's the way we treat other things. That's the way a lawyer acts, in the best interests of his or her client. That's the way a banker acts. That's the way a doctor behaves. Why wouldn't we say, if you're using my most personal information, you should be held to the same requirements? You should be a steward. You should be acting in trust for the person whose data you're using. Second, we might think more about what things we agree can be done with data in normal circumstances and what shouldn't be done with it. Stalking is out. Fraud is out. But some things should be in.
You know, bill collection, fraud detection, maybe even research if you want to keep the vice president for research happy. There are some things we might be able to slide over into the generally permitted category, so long as you use good security, and not have to burden people by telling them the bleeding obvious, like: if you give me your credit card, I'm going to use it to charge you money. Third, we might think more about redress, because no matter what happens, something's going to go wrong. That's the one thing we know. That's the one certainty in a world of entropy. And right now, we are often left out in the cold when something does go wrong. In fact, we often learn about it from the newspaper. And finally, when we do ask for consent, let's make it meaningful, timely, and effective. And since I've criticized the iPhone, I'm going to say something nice about Apple now. You know, the just-in-time message, "did you know this app is using your location data right now?", is kind of a useful prompt. It gives you a chance to say, no, I'm going to go in and shut that off; I don't like that. But using consent in all of these other settings has the unintended effect of making us tend to ignore it, when we could be making meaningful, effective choices that would protect our privacy. Thank you very much. Thank you. Thank you.