"privacy advocates and industry stakeholders are debating ... how shoppers should be informed about when their faces are scanned"
I don't want to be informed when my face is being scanned, in the same way that I don't want to be informed when a website is about to sell my data. What I want is for the law not to allow such things.
This. The question should be why they should be allowed to do that in the first place, so "debating" is basically what they want, since it fundamentally side-tracks the discussion. I'm usually against the government supervising anything, but if anything, it should be this: why should facilities that are pretty much the primary example of a "public place" in modern society be allowed to spy on people without a good reason?
I'm also against government supervision, but there's a difference between supervision/regulation (checking that legal activities are indeed being conducted in a legal way) and making some activities illegal, such as theft, killing, and yes, also spying.
I want the law to allow such things, because it is extremely useful to me as a consumer and a customer if the system can automatically identify me and attach me to my past history with the company.
This dimension of privacy is still open for debate.
Strongly disagree. While I can accept that it's supposedly valuable for you as a consumer to be "identified and attached" to a company (even though it's something I want nothing to do with and consider a net bad for society), the notion that wholesale identification of persons entering a store may be an acceptable default behavior is absurd, even before considering the so-called "dimension of privacy."
As the other child comment mentioned, there are many ways to identify yourself to a store that don't require facial recognition. The YMCA for example has a system that lets users track their exercise history; there's no reason Macy's couldn't do the same for your purchases. If indiscriminate ID'ing is the default position, then you have decided that _your convenience is more important than the freedom of every other person in the store._ It should be the responsibility of the person who wants this convenience to opt in, not the responsibility of the rest of us to opt out.
The desire to not have PII stored by anyone should be reason enough to close the debate. But if it's not, we fortunately have hundreds (thousands?) of cases of corporate consumer abuse and irresponsible data storage to point to.
Back in the day, growing up, the storekeepers may not have known my name, but they certainly knew my parents. I didn't realize until much later that a number of them were my parents' second and third cousins.
There are already many services for accessing databases of personal data; it's just a matter of combining them into some sort of mesh to really tip the scales to the extreme.
With facial recognition able to query a face against a number of databases, and payment information able to be queried as well, it just takes a few SaaS subscriptions to know someone's name, address, email/phone/social media, wealth, education, employment history, what they look like, where they go, who they associate with, when they go places, what they buy/own, etc.
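As a rough sketch of how little glue code such a "mesh" would take, assuming hypothetical data-broker APIs (every function, service, and field name below is invented purely for illustration):

```python
# Hypothetical sketch: stitching imagined data-broker lookups into one
# profile. None of these services exist under these names; the stubs
# stand in for the kinds of SaaS lookups described above.

def face_lookup(image_path):
    # Imagined facial-recognition SaaS: face image -> candidate identity
    return {"name": "J. Doe", "social_media": ["@jdoe"]}

def payment_lookup(name):
    # Imagined payment-data broker: name -> purchase and wealth signals
    return {"purchases": ["groceries", "electronics"], "est_wealth": "high"}

def people_search(name):
    # Imagined people-search service: name -> contact and employment data
    return {"address": "123 Example St", "email": "jdoe@example.com",
            "employer": "Acme Corp"}

def build_profile(image_path):
    # One face image in, a merged dossier out.
    profile = face_lookup(image_path)
    profile.update(payment_lookup(profile["name"]))
    profile.update(people_search(profile["name"]))
    return profile  # name, address, wealth, purchases, contacts, ...
```

The point of the sketch is that the "mesh" is just a handful of dictionary merges; all of the hard work is already sold as a subscription.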
It's like having a stalker behind your back at all times, telling subscribers someone's life history so they can extract as much money and value out of that person as possible until they are completely used up.
We might as well just go back to feudal society and slavery, because the game may be different but the motivations are the same.
Maybe it's time to run an e-commerce marketplace with no tracking, no "personalization" of anything, no ads, no marketing emails, none of that bullshit.
With a focus on quality, peace of mind, and best value-for-money products, and on minimizing SEO plays and fake reviews.
The infrastructure exists and is rentable from many providers (although a bit pricey).
Maybe all of those, together, would offer enough value to enough people to make prices reasonable.
Good luck. We may constantly decry the erosion of privacy on HN, but I guarantee you that the populace at large (like 95%+), even if they may sometimes complain about a lack of privacy, also generally really like the stuff that personalization provides. People (again, the vast populace at large) make their purchasing decisions based on things like price and perceived quality; privacy considerations are a distant follow-up. Why do you think the ad-supported web is the dominant model in the first place? Because to most users it feels "free", because the stuff they're giving up (like privacy and the cost of advertising baked into things they buy) isn't readily apparent.
Privacy is a concern that really doesn't matter much, until it does. For other concerns like this it seems like regulation is the only thing that works. For example, insurance companies are highly regulated because the default risk of an insurer is something so amorphous for the average consumer to determine that it's basically impossible for them to judge on their own. So that's when government steps in and says there are basic standards an insurance company must adhere to to operate at an acceptable level of risk. I feel like that's the same for privacy. The average user isn't really able to comprehend, or value, the gradual long term effects of privacy erosion, so that's where the government steps in to demand all players must have basic standards.
As a merchant in general, an e-commerce site is also responsible for mitigating fraud. Too much online anonymity can compromise their ability to comply with the law and to protect themselves from money-losing scams.
Would those databases also have the option to trace the data back to its origin?
It wouldn't solve the privacy issue but I think if companies are compelled to hold a record from whom they purchased/obtained data, it would help. It would help to determine if the originator had informed consent to share it and also if that included every company in the chain.
Currently, if a company 'loses' its data, then people who never even interacted with that company get their data exposed without knowing. If you're dealing with personal data, why not keep a record of how, where, and when you obtained it?
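A minimal sketch of what such a provenance record could look like, attached to every piece of personal data (this is one assumed design, not any real system; all field names are invented):

```python
from dataclasses import dataclass, field
from datetime import date

# Sketch: personal data carries a provenance chain recording who it was
# obtained from, when, and under what consent at each hop.

@dataclass
class ProvenanceEntry:
    source: str         # company the data was obtained from
    obtained_on: date   # when it changed hands
    consent_scope: str  # uses the subject consented to at that hop

@dataclass
class PersonalRecord:
    subject: str
    data: dict
    provenance: list = field(default_factory=list)

    def acquired_from(self, source, on, scope):
        self.provenance.append(ProvenanceEntry(source, on, scope))

    def consent_covers(self, use):
        # Usable only if every hop in the chain permits this use.
        return all(use in e.consent_scope for e in self.provenance)
```

With a chain like this, a regulator (or the subject) could check whether informed consent survived every resale, and a breach could be traced back through each company that handled the data.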
I think I did hear brick-and-mortar stores complaining recently about having too many customers, and wondering whether there was something else they could do, something they hadn't thought of yet, to annoy and anger customers, maybe add a little friction to keep them from walking in the door. /irony
No but, ever since "the Schneier piece" [0] I've been thinking differently about this whole issue. It's about identifying you, and well, the store already knows exactly who you are as soon as you use your credit card. So does Amazon.com of course, though ironically they might not have an image of your face.
If you were going to that physical store and planning to use cash (or shoplift), THEN this affects you. Although even without "AI," you would still get captured on security cameras. So if your presence there became a matter of interest - let's say for example, to corroborate or debunk your alibi for a more serious crime elsewhere - someone would invest the time to identify your face the old-fashioned way. Like probably a subordinate of the senior detective on the case, watching hours of camera footage looking for you.
You can see how it was a series of small, slow, incremental steps, each one not necessarily big enough to create an uproar, that got us here.
I work on identity resolution problems among other data challenges at a big retailer. We get very few details when a credit card is used in-store -- pretty much just the basic card info. We don't own the payment networks and a variety of legal and business reasons prevent us from doing what may seem to be possible to get an individual's data based on a card number (from the networks themselves or other data brokers). What's possible is mostly limited to what you can do with the name on the card and store location. We've invested significantly in the problem and the answers are a patchwork of guesses.
There's an interesting related issue here for brick & mortar businesses with CCPA and GDPR in effect: you can do some useful analytics, personalization, and fraud prevention work with probabilistic identity info, but if someone verifies they actually are Person X and wants to download or delete whatever data you have on them, what can you confidently say is actually their data?
Will companies be held to different standards based on how much money they've invested and success they've had in identity resolution, in which case this might be a factor dissuading them from doing more identification and personalization? Or if they haven't invested millions in trying to figure out who people are, but it's possible to do so, are they liable for some kind of misconduct if they don't produce all the data they have that could have been tied together for that person? Is the choice binary, i.e. either invest big in identity resolution and take it as far as possible (with parallel governance investment) or de-identify everything you can? A privacy advocate might think on first pass that it's as simple as choosing the latter, but that's mostly not possible due to requirements we face related to other regulation and business realities: fraud, anti-money laundering, age-related laws, shoplifting, intense competition in a razor thin margin industry, etc.
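A toy version of the patchwork-of-guesses matching described above, using only the name on the card and the store location (all records, the location boost, and the 0.8 threshold are made up for illustration):

```python
from difflib import SequenceMatcher

# Toy identity resolution: match a name-on-card plus store location
# against known customer records, producing scored guesses rather than
# a certain identity. Data and tuning values are invented.

customers = [
    {"id": 1, "name": "maria garcia", "home_store": "chicago-03"},
    {"id": 2, "name": "mario garza", "home_store": "chicago-03"},
    {"id": 3, "name": "maria garcia", "home_store": "austin-01"},
]

def resolve(card_name, store, threshold=0.8):
    card_name = card_name.lower().strip()
    candidates = []
    for c in customers:
        score = SequenceMatcher(None, card_name, c["name"]).ratio()
        if c["home_store"] == store:
            score += 0.05  # small boost for a matching location
        if score >= threshold:
            candidates.append((round(score, 2), c["id"]))
    # Best guess first; may return several plausible identities or none.
    return sorted(candidates, reverse=True)
```

Note that a purchase by "MARIA GARCIA" at chicago-03 matches three different people here, including a near-miss on a similar name, which is exactly the CCPA/GDPR problem: when someone verified as Person X asks for their data, which of these rows can you confidently say is theirs?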
What do you do? Start growing all of your vegetables in your backyard, and order what you must with a fake identity from behind a proxy online?
I was going to question if we'd reached the point where it is not only impossible to avoid handing over intimate identifiable data, but necessary to participate in society - and I realized that we probably passed that threshold a few years ago, although this does make things seemingly worse.
I wonder how long before they'll ban adversarial masks or hats in the name of safety or some other such nonsense...
In all seriousness, it does open a business opportunity to create proxies to do your shopping for you, essentially hiding your identity through third parties. I don't think that's the real solution. We need to legislate these issues.
Years ago these were clearly possible options. It's only a matter of time before we start to see dynamic pricing based on identity as well. Consumers need to stop allowing these practices to happen. Spend more and shop at smaller shops if you have to. Elect people who will stop letting corporate greed interests take over at the detriment of our shared society and living conditions.
The irony is that it does not prevent theft, or how do you know it does? "Please check the box if you were thinking of stealing this item." Someone who never stole before, or was never caught, might steal anyway, and someone who stole before and whom you prevent from entering might mean a lost sale or customer.
"I wonder how long before they'll ban adversarial masks or hats in the name of safety or some other such nonsense..."
In the US it's long been illegal to wear masks in public. From what I understand, this was done as a response to KKK members wearing masks to intimidate their victims and hide their identity.
However, I've personally never seen or heard of the anti-mask laws enforced, and I see Asian people wearing surgical masks in public relatively often, without any legal consequence that I'm aware of. People wearing masks on Halloween also seems to be not only tolerated but encouraged. And, while concern about the Wuhan coronavirus lasts, I expect to see a lot more masks worn in public.
Of course there are certain locations like banks, jewellery stores, courthouses and other government buildings, where wearing a mask probably won't be looked upon too kindly.
Actually, it's not clear to what extent such laws are constitutional in the US (https://en.wikipedia.org/wiki/Anti-mask_law#United_States). Anonymous speech (among other things) is very clearly protected, but threats of violence obviously aren't.
If challenged, I wonder if arguing that the mask was your way of expressing your wish not to be tracked would be successful. Of course this is all irrelevant since it likely won't be law enforcement you have to contend with - presumably businesses that cared would simply refuse entry to those with face coverings.
Surgical masks you can probably get away with, particularly if there is wildfire smoke in the region or a disease scare. In very cold weather regions you might be able to get away with a scarf covering your lower face.
Walking into a store with other sorts of masks on in America runs a not-insignificant chance of getting you shot, or at least held at gunpoint. But more importantly, you'd be traumatizing everybody else in the store. When somebody walks into a store with a mask on, more people expect 'robber' than 'privacy enthusiast'.
I've been telling myself to pay with cash more often, primarily because I think it's crazy that a finance company gets 3%-5% of every purchase made with a CC. Obviously some of that covers overhead, but it's mostly a mafia-style shakedown that society is blindly accepting as a matter of convenience. Stores linking facial recognition with CC details is just extra motivation for me.
First ask them why they need this system. If it's reasonable then implement rules and laws so it can only be used for this specific purpose.
If the reason for installing it is to prevent theft then they shouldn't have any problems with rules limiting them to this one specific usage and a fine if they used it for anything else (eg. marketing).
We can also make up rules where it is acceptable to use it for marketing (informed consent being the bare minimum).
These things, like face recognition of customers, have a tendency to spin out of control. They say they originally used it for purpose X, but because they have the data... why not use it for purposes Y and Z?
When ordering the fake ID, you'd need some kind of method that allowed the buyer to remain anonymous - I guess there are a few cryptocurrencies that fit the bill.
You'd also need to get the fake ID delivered somewhere other than your home, so it couldn't be trivially tied back to you.
Six years ago I interviewed at a company that was working on this technology. My background was in image processing and segmentation. I knew someone else would work on it if I didn't, but I still couldn't have slept at night working for a company whose monetization depends on data-mining people. It deeply saddens me that even the shopping mall I'm currently posting from tracked customers secretly and only got busted by a stroke of luck. Eventually they will win. :(
It's a funny thing. You hear on HN constantly that everyone's backend is a mess, that the SQL tables are incomprehensible, the AWS server has crashed again. Surely these scummy companies must be in the same state?
Loss/theft prevention is a priority for retail adoption of FR, but the cherry on top is the creation of real-world consumer tracking of the kind every website gets "for free" from the web platform, which until FR had no equivalent in the real world. With an FR-enhanced retail location, the store knows your face visited just as a website knows your IP address visited. This will overlay the real world with a tracking capability as extensive as the out-of-control tracking of people online. For this key reason, we need regulation to prevent the sharing of any and all personal data between organizations. Wrong identifications and junk/incorrect/false data in retail, non-authoritative databases must never be combined into an uber-database of dirty data presented to us as "official".
Yet somehow there are twice as many Amazon Go stores in Illinois as in California [1]. And biometric-based time-tracking for hourly employees is extremely common. It's because the Illinois law doesn't prohibit biometric collection and use; it prohibits collection and use without the explicit permission of each individual involved.
PS - Illinois also prohibits discrimination against employees that do not consent to biometric-based tracking and thus all of the systems for sale in Illinois have traditional methods that can be used at all times.
Someone else may have purchased the items. Eventually. Or, in the case of perishables, it may have spoiled and been thrown out. Or it may have been sold on sale. Or it may not have sold at all before being destroyed or liquidated.
Lots of maybes there. A sale at sticker price can't be assumed just because the product exists in a store. Had the story used the cost of the item, not imagined sales, I wouldn't have brought it up.
Um, no. They might have just sat on the shelf. The whole reason stores have sales is to get rid of merchandise that isn't moving at the price they hoped it would.
I was going to complain about "consent" being removed from the headline, but the consent model for this is all wrong anyway.
Even if you had to sign a EULA when you entered the store, there's no way to meaningfully consent to submitting your biometric data into a legally unlimited universe of analysis, cross-referencing, marketing, credit reporting, law enforcement, and whatever else ingenious minds can invent. This just isn't a consent problem. It's a regulatory problem.
Under a proper regulatory regime, you shouldn't need to consent to a computer recognizing your face any more than you should for an employee to recognize your face, because the laws constrain what the computer can do with that recognition, and you can reasonably expect those limits to be in line with what the employee could do.
Cheers - where everybody knows your name.
We take precautions to prevent unauthorized access to or misuse of data about you.
We do not run ads, other than the classifieds posted by our users.
We do not share your data with third parties for marketing purposes.
We do not engage in cross-marketing or link-referral programs.
We do not employ tracking devices for marketing purposes.
We do not send you unsolicited communications for marketing purposes.
We do not engage in affiliate marketing (and prohibit it on CL).
We do provide email proxy & relay services to reduce unwanted email.
Please review privacy policies of any third party sites linked to from CL.
https://www.craigslist.org/about/privacy.policy
[0] https://www.nytimes.com/2020/01/20/opinion/facial-recognitio...
Data privacy is complicated.
We are wringing our hands over the issue when in reality we can solve it like the Gordian Knot.
Online, you're screwed. They link you with your phone number and/or email address and those are often necessary for purchase.
Except that a fake ID will have your picture on it.
I know that security through obfuscation is not a valid method for storing bitcoin keys. But privacy via idiocy is more common than not (exceptions do apply: https://en.wikipedia.org/wiki/History_of_the_Jews_in_the_Net...)
It makes me want to move to Illinois. How tone deaf is this company?
I also noticed that they couch their losses in “lost sales”, as if the shoplifters would have purchased the items had they only been caught.
[1] https://en.wikipedia.org/wiki/Amazon_Go
How often is a desired product unavailable because some of the inventory was shoplifted?