It's very nice of Bruce Schneier to express opposition to invasions of privacy, and I agree with him. But unless I totally misunderstand Schmidt's comments, Schmidt is not suggesting that people shouldn't make love to their wives, for example, but rather that if you are going to do something you don't want the government to know about, you shouldn't tell it to someone who is legally obligated to tell the government about it if they ask.
Regardless of how Google feels about privacy, it's unreasonable to expect them to martyr themselves for our sakes when the Department of Justice or the FBI knocks on their door with a subpoena. The way to protect your privacy is to not tell people things you don't want them to know (or to use Tor or the like).
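As a concrete illustration of that last parenthetical, here is a minimal sketch (not an endorsement of any particular setup) of routing a request through a locally running Tor client. It assumes Tor is already listening on its default SOCKS port, 9050, and that Python's requests library is installed with SOCKS support (pip install requests[socks]); the check.torproject.org endpoint simply reports whether the request arrived via a Tor exit node.

    # Route an HTTP request through a locally running Tor client.
    # Assumes Tor is listening on its default SOCKS port (9050) and that
    # requests is installed with SOCKS support: pip install requests[socks]
    import requests

    # "socks5h" (note the trailing h) resolves DNS inside Tor as well,
    # so the hostname lookup doesn't leak to the local network.
    TOR_PROXY = "socks5h://127.0.0.1:9050"

    session = requests.Session()
    session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

    # check.torproject.org reports whether the request came through Tor
    resp = session.get("https://check.torproject.org/api/ip", timeout=30)
    print(resp.json())  # e.g. {"IsTor": true, "IP": "..."}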
Eric Schmidt is an American business leader. He has the same First Amendment rights as anybody else. He is obligated to follow the law, but he is not obligated to do so without complaint, let alone make excuses for his activities and those of his government.
Schmidt could have said "We understand that Google collects a lot of data; we have privacy policies that protect that data and we fire anyone who abuses them. We take data privacy very seriously. But we're aware that people are unhappy that various governments can compel us to disclose data. We believe that this is wrong. We believe that a world where anonymous government employees can see anything you do is a dangerous world. We believe in the power of data, but we want to encourage society to take steps to prevent this kind of abuse."
But he didn't, of course. Instead he told us that the innocent have nothing to hide.
it's unreasonable to expect [Google] to martyr themselves for our sakes
"Martyr", indeed. That is no mere hypothetical. Read this:
This is Herta Mueller, formerly of Romania, describing her life in the Ceausescu regime. There you could, indeed, face harsh reprisals, ranging all the way up to martyrdom, for refusing to spy on your fellow citizens.
I'm not sure it's reasonable to expect those people -- Mueller's friends, stuck in a scary totalitarian regime -- to stick up for their principles and resist the state's order to inform on their friends. (Though many did resist, and paid the price for it.) But Google? Google isn't even human. It has no fingernails to pull out. It is an American corporation, whose leaders live in America, with enough legal budget to sue God himself. I don't think it's unreasonable to expect the company's leader to use some of that power to lobby for good, instead of looking sheepishly at his feet, kicking the dust, and claiming that he's only following orders.
"But he didn't, of course. Instead he told us that the innocent have nothing to hide."
Please keep in mind you're commenting about a snippet of a televised interview. For all you know Eric Schmidt agrees with everything you wrote above. (He may have even said something along these lines and it was edited out.) He certainly did not say or even imply "the innocent have nothing to hide."
I want to live in a world where people have to make dry, quantified, lawyer-pleasing statements all the time just to make sure there is no speculation a lot less than I want to live in a world where Google may hold data about me which the US government can demand. :)
at the least, i think you are misreading the tone and giving a rather generous interpretation of what was said. schmidt could have said what you wrote. he could have said something much more critical of the position he was in. he did neither.
also, your definition of "reasonable" is not one we all need share. there is a trade-off between profit and privacy here - the balance google has chosen is not the only one possible.
so twice you slant your argument to suit google. once by interpreting what was said in as favourable a way as possible, and once by presenting the false dichotomy of "google v martyr".
"People are treating Google like their most trusted friend. Should they be?"
"I think judgment matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place. If you really need that kind of privacy, the reality is that search engines -- including Google -- do retain this information for some time and it's important, for example, that we are all subject in the United States to the Patriot Act and it is possible that all that information could be made available to the authorities."
The quote was in response to a question about to what extent it is wise to trust Google (not a moral question, but a pragmatic one). All of Schmidt's response, except perhaps one sentence ("If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."), which is itself ambiguous, describes pragmatic reasons why Google cannot legally protect you from the government. Given that context, "maybe you shouldn't be doing it" either means you shouldn't be doing anything you want kept hidden, or it means you shouldn't trust Google with the knowledge that you are doing something you don't want the government to know about, because Google may not be able to keep that information from the government without violating the law. I'm inclined to interpret it as the latter.
With regard to "google v martyr" being a false dichotomy, how could Google not respond to a subpoena for the search data for a certain IP address without suffering legal consequences?
That's a reasonable statement, and it's supported by parts of his interview. However, most people are complaining about this quote: "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place." It's really a different sentiment.
I realise that Schmidt was taken out of context for media hype purposes but...
Even in context what he says is worrying, though. It is not just advising you on how to keep your data out of government hands; it is also making some sort of statement about Google's position.
Surely their position is 100% clear: they cannot deny a legal government request for information.
The point at which it becomes contentious (and something this doesn't touch on) is whether they need to keep XYZ data in a personalized way, and where holding said data tips over from giving us value into putting us in danger.
I don't buy it. I think it's an underhanded tactic. I think he meant to say it, but to say it in a way that was deniable.
The excuse that people should only worry about privacy if they have something to hide is nothing new. I think Eric Schmidt is good at recognizing patterns, and therefore must have noticed what he was saying while he was giving the interview. I know I couldn't let those words slip by if I said them.
Under what conditions does Google have to turn over search records? Can the government say, "Give us all the records for those who searched for 'murder'-related queries"? I'm guessing not, and that a warrant must be presented, but I'm still curious.
My search history probably makes me out to be a psychopath, when in fact I just have a wide array of strange interests.
Search warrants have to be very specific regarding what they target. "Give us all the records for those who searched for 'murder' related queries" isn't specific enough for a warrant. "Give us the records of [some user] searching for [terms related to some specific technique of disposing of bodies] during March 2009" might be valid, but IANAL.
If you're ever brought to trial using evidence gained by a warrant which didn't list that evidence to some reasonable specificity, any decent lawyer will have that evidence removed from consideration.
What a fantastic last paragraph. Something from the original essay that was left out:
"A future in which privacy would face constant assault was so alien to the framers of the Constitution that it never occurred to them to call out privacy as an explicit right. Privacy was inherent to the nobility of their being and their cause. Of course being watched in your own home was unreasonable. Watching at all was an act so unseemly as to be inconceivable among gentlemen in their day. You watched convicted criminals, not free citizens. You ruled your own home. It's intrinsic to the concept of liberty."
Foucault argued that the very act of being watched was dehumanizing. Google shouldn't watch us, not because it violates our rights, but because it's evil.
EDIT: I hear this claim all the time on the internet: Collecting data is evil. So please, go beyond the claim, and give me an explanation. Why is knowledge about people inherently evil?
EDIT 2: I know you're not supposed to complain about downmodding, but this comment has been downmodded four times. If you disagree so vehemently, then try defending your position rather than censoring me :)
To reply to csallen's question about why being watched is evil - these answers appear in part elsewhere in the discussion that has already gone on about my comment, but they bear repeating.
1) It's not being watched that's evil. It's watching - my argument is that being watched consistently has negative effects on a person.
2) I haven't said that collecting data is evil - by all means collect data. Data about a specific person, or tied to a specific person, on the other hand, is a mode of control of that person, whether or not there is intent to control the person with the data.
This claim, while not empirical, is reflected in the literature about humanity. Foucault, as I already mentioned, wrote extensively about the modes of security that a nation employs (note that security is a technical term that Foucault has defined to mean, roughly, the ways that an entity in power may maintain that power). One such way is surveillance of the citizenry. Another is spectacle (drawing and quartering). He argues that as a society we've switched to a discipline/rehabilitation-focused system of control. If you want to read more on this, see his work "Discipline and Punish".
Even fiction authors have seen fit to point out the control one possesses over another when one knows things about them. In the "Earthsea" series, for example, knowing someone or something's name gives you power over it - and not just because of a mystical quality to the name. By the time the series is over, Le Guin has made her point that the name of a thing allows you to control it, at least in part: you can make requests or commands of it, whereas when you did not know its name you had no such power.
I could write a dissertation on this (and in fact, I did), but there's no point. I just figured that a question deserved a response. I'd love to hear if some disagree with me on this and why. Always love a discussion on Foucault.
I really object to people collecting data about me without my consent, because I value my ability to lie and misrepresent myself when it serves my purposes. I think that's an important part of human freedom, and taking it away is evil.
If something is 'inherent' it cannot be explained further almost by definition. I think this is what that paragraph is claiming.
It is claiming that invasion of privacy is evil in the way that hurting people is inherently evil. You do not call on someone to explain why hurting people is evil. It is calling on you to accept that invasion of privacy is just evil.
It's not the knowledge, it's the power.
Access to intimate knowledge about people, by a radical political party that used a hate plank to get into power, would be death to those at the other end of the plank.
Search queries are pretty intimate knowledge. Who else has searched for the Anarchist Cookbook or DeCSS?
Going even further with your question, what's so wrong with the Patriot Act when it's there to ultimately protect us? If there is information in one of our Gmail accounts that can help unveil the plans of an imminent terrorist attack that could potentially kill hundreds, would you not compromise 'privacy'? Where do you draw the line?
It's a bit facile to say that it's evil for Google to watch us. Yes, a company can abuse the data in its possession, and yes, all that data in one place makes the government lick its lips in anticipation.
But don't forget, Google does a lot of wonderful things. Millions (billions?) of people's lives are enriched by Google's products, and most of those people (certainly those of us on HN) know that Google needs some information about us in order to deliver the things it delivers. The startups that will change the world in the next decade or two are likely to be built with the help of things Google makes — and which Google cannot make as well without some personal data about its users.
So yes, companies now know more about you than they did 20 years ago, and we should certainly note the dangers of that. But don't ignore the huge value of what you get in return.
> Foucault argued that the very act of being watched was dehumanizing.
Was any data involved in his 'studies'? Was 'dehumanising' somehow defined independently? I suspect his arguments are more philosophical musing relying on circular logic, unstated assumptions and social norms, rather like pretty much everything I seem to read from Schneier. I'm sure they sound rather clever but do they really amount to much more than a circle jerk?
One could just as easily argue that 'need for privacy' is just a learned social construct.
Yes, there was data involved in his studies. It sounds like you don't value philosophical knowledge - to that end, I won't endeavor to persuade you of its merit. I would advise you to read his work before you criticize it.
I remember this essay from when it was new. To me, it is Schneier at his best, and it makes a perfect retort to the most pessimistic interpretation of what Schmidt said. And bravo to Schneier for doing so in a timely manner.
That said, I wonder what Bruce would say in response to a more optimistic interpretation of Schmidt's statement. What if Eric's sentiment was more along the lines of "if you don't want people to see you making love, then don't do it behind the bushes in Buena Vista Park"?
You know, the government told librarians they had to turn over the lists of books people checked out from the library, and so libraries stopped storing that information. Now they delete it.
I applaud the librarians for standing up to the government. Google... not so much.
Google (and the vast majority of their users) has a whole lot to lose by deleting search history while its competitors pursue personalized search. Librarians - nothing at all.
Then couldn't Google offer some sort of non-personalized version, where such data is discarded, that privacy-conscious users could choose to use? That would bring me back to believing that Google isn't "evil".
Well, it puts an end to certain types of programmes that use loan histories. For example, some libraries allow people to check their personal loan histories. This is useful if you've ever dug deeply into a given subject at a library, and would like to check what you've read already. Another example is the use of loan histories to generate lists of recommended material. So there is a cost, but I think it's a cost that librarians were willing to pay to protect people's privacy.
I see your point, though: the costs to Google and their users are higher.
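For what it's worth, the recommendation feature doesn't strictly require keeping personal loan histories. Here is a purely hypothetical sketch (not how any real library or search system necessarily works) that keeps only anonymous title co-occurrence counts: the patron is forgotten the moment their checkout is folded into the aggregate, which preserves "people who borrowed X also borrowed Y" while giving up the personal-history feature.

    # Hypothetical: aggregate "borrowed together" counts with no patron IDs,
    # trading the personal-history feature for privacy-preserving recommendations.
    from collections import defaultdict
    from itertools import combinations

    # title -> title -> co-occurrence count (no patron identifiers anywhere)
    co_counts = defaultdict(lambda: defaultdict(int))

    def record_checkout(titles):
        """Fold one patron's checkout into the aggregate, then forget the patron."""
        for a, b in combinations(sorted(set(titles)), 2):
            co_counts[a][b] += 1
            co_counts[b][a] += 1

    def recommend(title, top_n=3):
        """Titles most often borrowed alongside the given one."""
        ranked = sorted(co_counts[title].items(), key=lambda kv: kv[1], reverse=True)
        return [t for t, _ in ranked[:top_n]]

    record_checkout(["Discipline and Punish", "1984", "The Trial"])
    record_checkout(["1984", "The Trial"])
    print(recommend("1984"))  # ['The Trial', 'Discipline and Punish']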
In a world where watching other people is easier and easier, I don't think that asserting that not being watched is a human need is going to hold up. Data collection is only going to get easier in general, no matter what laws are passed. Cameras and listening devices are only going to become more prevalent, in spite of last ditch attempts by theater owners and angry police. Aggregation of everything that goes on around a person will become a tool that is so useful that those without it are like those refusing to use search engines today: able to get along, but at a noticeable disadvantage.
No amount of posturing and rhetoric about rights and privacy will long delay the world we're moving into; those are political activities, and the coming abolition of privacy is a technological and economic matter; politics can only weakly affect such things without becoming overwhelmingly invasive -- a solution more to be feared than the problem.
Corporate abuse too. If someone is outspoken against Google, what's to stop Google from silencing them by threatening to reveal secret information about them (or by planting information about them, like child porn searches in their search history)?
Not only that, but it makes it easier for third-party groups to try to gain information to silence opponents, Scientology being the first to come to mind.
In general, all of this information will benefit people, but possibly only people that 'keep their head down' and don't 'go against the flow.' For people that have strongly held beliefs that are against the 'norm,' it will make it easier for them to be attacked and (eventually) silenced. It will be the tool used to bludgeon people into becoming a homogenized society.
While I think that it is worthwhile trying to increase privacy on the data collection side (eg Google recording search queries), the more important part is what happens after. Is it anonymized? Is it available to government organisations? What happens after they get it?
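To make "anonymized" a bit more concrete, one commonly described step (a hypothetical sketch here, not a claim about what Google or anyone else actually does) is to coarsen the client IP and drop the cookie before a query log line is retained, for example zeroing the last octet of an IPv4 address.

    # Hypothetical anonymization step: coarsen the client IP, drop the cookie.
    import ipaddress

    def anonymize(entry):
        ip = ipaddress.ip_address(entry["ip"])
        if ip.version == 4:
            # keep the /24, zero the last octet: 203.0.113.42 -> 203.0.113.0
            masked = ipaddress.ip_network(f"{ip}/24", strict=False).network_address
        else:
            # for IPv6, keep only the /48 prefix
            masked = ipaddress.ip_network(f"{ip}/48", strict=False).network_address
        return {"ip": str(masked), "query": entry["query"]}  # cookie is not copied over

    raw = {"ip": "203.0.113.42", "query": "flu symptoms", "cookie": "PREF=abc123"}
    print(anonymize(raw))  # {'ip': '203.0.113.0', 'query': 'flu symptoms'}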
Has there ever been any society free of busybodies, thoughtcrime, and persecution? I think Schmidt is blaming the victims and advocating an inhuman degree of conformity.
I think the charitable interpretation above nailed it; after carefully re-reading exactly what Schmidt said, I believe that was the intended meaning.
Quoting from the privacy page of Google's recently released Public DNS service:
"The temporary logs store the full IP address of the machine you're using... We _delete_ these temporary logs within 24 to 48 hours."
http://code.google.com/speed/public-dns/privacy.html
Is there anything in the Patriot Act that says explicitly that search engines should keep search logs but not DNS queries?
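For anyone curious what a 24-to-48-hour retention window looks like mechanically, a minimal sketch (hypothetical paths, not Google's actual pipeline) is just a periodic job that purges log files older than the cutoff:

    # Hypothetical retention job: delete log files older than 48 hours.
    import time
    from pathlib import Path

    MAX_AGE_SECONDS = 48 * 3600
    LOG_DIR = Path("/var/log/resolver")  # hypothetical log directory

    def purge_old_logs(log_dir=LOG_DIR, max_age=MAX_AGE_SECONDS):
        cutoff = time.time() - max_age
        for path in log_dir.glob("*.log"):
            if path.stat().st_mtime < cutoff:
                path.unlink()  # permanently remove the expired file

    if __name__ == "__main__":
        purge_old_logs()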
"A future in which privacy would face constant assault was so alien to the framers of the Constitution that it never occurred to them to call out privacy as an explicit right. Privacy was inherent to the nobility of their being and their cause. Of course being watched in your own home was unreasonable. Watching at all was an act so unseemly as to be inconceivable among gentlemen in their day. You watched convicted criminals, not free citizens. You ruled your own home. It's intrinsic to the concept of liberty."
Foucault argued that the very act of being watched was dehumanizing. Google shouldn't watch us because it violates our rights, but because it's evil.
EDIT: I hear this claim all the time on the internet: Collecting data is evil. So please, go beyond the claim, and give me an explanation. Why is knowledge about people inherently evil?
EDIT 2: I know you're not supposed to complain about downmodding, but this comment has been downmodded four times. If you disagree so vehemently, then try defending your position rather than censoring me :)
1) It's not being watched that's evil. It's watching - my argument is that being watched consistently has negative effects on a person.
2) I haven't said that collecting data is evil - by all means collect data. Data about a specific person or tied to a specific person on the other hand, is a mode of control of a person, whether or not there is intent to control the person with the data.
This fact, while not empirical, is reflected in the literature about humanity. Foucault, as I already mentioned, wrote extensively about the modes of security that a nation employs (note that security is a technical term that Foucault has defined to mean the ways that an entity in power may maintain that power, roughly). One such way is surveillance of the citizenry. Another is through spectacle (drawing and quartering). He argues that as a society, we've switched to a disciple/rehabilitation focused system of control. If you want to read more on this, see his work "Discipline and Punish".
Even fiction authors have found it fit to point out the control one possesses over another when they know things about someone. In "The Wizard of Earthsea" series, for example, knowing someone or something's name gives you power over it - and not just because of a mystical quality to a name. By the time the series is over, LeGuin has made her point that the name of a thing allows you to control it in part; at least make requests or commands of it, whereas when you did not know it's name you had no such power.
I could write a dissertation on this (and in fact, I did), but there's no point. I just figured that a question deserved a response. I'd love to hear if some disagree with me on this and why. Always love a discussion on Foucault.
It is claiming that invasion of privacy is evil in the way that hurting people is inherently evil. You do not call on someone to explain why hurting people is evil. It is calling on you to accept that invasion of privacy is just evil.
Search queries are pretty intermate knowledge. Who else has searched the anarchist's cookbook or DeCSS?
Dead Comment
But don't forget, Google does a lot of wonderful things. Millions (billions?) of people's lives are enriched by Google's products, and most of those people (certainly those of us on HN) know that Google needs some information about us in order to deliver the things it delivers. The startups that will change the world in the next decade or two are likely to be built with the help of things Google makes — and which Google cannot make as well without some personal data about its users.
So yes, companies now know more about you than they did 20 years ago, and we should certainly note the dangers of that. But don't ignore the huge value of what you get in return.
Was any data involved in his 'studies'? Was 'dehumanising' somehow defined independently? I suspect his arguments are more philosophical musing relying on circular logic, unstated assumptions and social norms, rather like pretty much everything I seem to read from Schneier. I'm sure they sound rather clever but do they really amount to much more than a circle jerk?
One could just as easily argue that 'need for privacy' is just a learned social construct.
That said, I wonder what Bruce would say in response to a more optimistic interpretation of Schmidt's statement. What if Eric's sentiment was more along the lines of "if you don't want people to see you making love, then don't do it behind the bushes in Buena Vista Park"?
I applaud the librarians for standing up to the government. Google... not so much.
I see your point though, the costs to Google and their users is higher.
No amount of posturing and rhetoric about rights and privacy will long delay the world we're moving into; those are political activities, and the coming abolition of privacy is a technological and economic matter; politics can only weakly affect such things without becoming overwhelmingly invasive -- a solution more to be feared than the problem.
Not only that, but it makes it easier for 3rd party groups to try and gain information to silence opponents, Scientology being the first to come to mind.
In general, all of this information will benefit people, but possibly only people that 'keep their head down' and don't 'go against the flow.' For people that have strongly held beliefs that are against the 'norm,' it will make it easier for them to be attacked and (eventually) silenced. It will be the tool used to bludgeon people into becoming a homogenized society.
While I think that it is worthwhile trying to increase privacy on the data collection side (eg Google recording search queries), the more important part is what happens after. Is it anonymized? Is it available to government organisations? What happens after they get it?
Deleted Comment
Schneier's response is to be expected from someone living in a fear-based society such as ours.