_kb · 9 years ago
You'd be surprised / scared / outraged if you knew how common this is. Any time you've been in a public place for the past few years, you've likely been watched, analysed and optimised for. Advertising in the physical world is just as scummy as its online equivalent.

Check out the video here http://sightcorp.com/ for an ultra creepy overview. You can even try their live demo: https://face-api.sightcorp.com/demo_basic/.

_kb · 9 years ago
Woah, so this got a bit of interest. To be honest I'm a little surprised this seems to be news to the HN crowd.

I feel a little bad about calling out one API provider specifically, so here's a bunch more:

https://www.kairos.com/
https://skybiometry.com/
https://azure.microsoft.com/en-us/services/cognitive-service...
http://www.affectiva.com/
http://www.crowdemotion.co.uk/
http://emovu.com/e/
https://www.faceplusplus.com/

Face tracking, emotional analytics and vision-based demographics analysis make up a pretty huge industry. There's an entire spectrum of uses for this tech, from the altruistic (psychology labs, human factors research) to the, well, not.

white-flame · 9 years ago
I've lost track of how many times I've said this on HN:

We need HIPAA for all personal information, not just medical. We have an expectation of privacy in being "lost in the crowd" when we're out and about. Our physical & online whereabouts, who we're physically with, who we're communicating with, our personal contact information, and obviously payment information is private information that can be harmful if not kept private (false positives in automated legal systems, identity theft, and all the same reasons we defend medical information).

Anybody who chooses to hold such information must regard it with a high level of respect and privacy. Since nobody is doing so, and there are no penalties for violating privacy, and this gets into fundamental rights and proper functioning of society, it seems applicable to federal law.

danek · 9 years ago
To be honest this is news to me and I lurk on hn every day :-/

I thought Minority Report was still a few years away...

Thanks for the links, this stuff is both fascinating and scary.

jedberg · 9 years ago
Amazon too:

https://aws.amazon.com/rekognition/

Sentiment analysis for everyone!
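For the curious, the interesting bit of a Rekognition DetectFaces response is the per-face emotion list. A rough sketch of pulling out the dominant emotion - the `sample_response` dict below is hand-written to mirror the response shape, not actual API output:

```python
# Stand-in for what boto3's rekognition.detect_faces(Image=...,
# Attributes=['ALL']) returns; the values here are made up.
sample_response = {
    "FaceDetails": [{
        "AgeRange": {"Low": 26, "High": 38},
        "Gender": {"Value": "Male", "Confidence": 99.1},
        "Emotions": [
            {"Type": "HAPPY", "Confidence": 71.4},
            {"Type": "CALM", "Confidence": 20.2},
            {"Type": "ANGRY", "Confidence": 3.1},
        ],
    }]
}

def dominant_emotion(response):
    """Return the highest-confidence emotion label for the first detected face."""
    faces = response.get("FaceDetails", [])
    if not faces or not faces[0].get("Emotions"):
        return None
    return max(faces[0]["Emotions"], key=lambda e: e["Confidence"])["Type"]

print(dominant_emotion(sample_response))  # HAPPY
```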

Deleted Comment

rpeden · 9 years ago
Indeed. This isn't new. It's everywhere.

The cameras retailers use with their surveillance systems are coming with facial recognition built in now. [1]

And lots of retailers, banks, etc, are using systems that track people's visits across multiple locations. [2]

You'll see a lot of these systems being sold as fraud/loss prevention solutions. The reason is that it's a relatively easy sell this way - customers can count how many thieves they've caught to easily determine the ROI they're getting on the system. Once the systems are in place, it's relatively easy to start using them for marketing-related purposes.

Not all uses of systems like these are necessarily unethical. Consider a case where you want to set up a rule like 'if the average lineup length at the checkouts exceeds 5 people, call backup cashiers'. The problem is that once you have something like this in place, it's very tempting for company execs to want to use the data for legal but less than ethical purposes.
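That checkout rule really is that simple once the camera feed is reduced to per-lane people counts - a toy sketch (the names are mine, not any vendor's API):

```python
def should_call_backup(queue_lengths, threshold=5):
    """queue_lengths: number of people waiting at each open checkout."""
    if not queue_lengths:
        return False
    average = sum(queue_lengths) / len(queue_lengths)
    return average > threshold

print(should_call_backup([7, 6, 8]))  # True: average lineup is 7
print(should_call_backup([2, 3, 1]))  # False: average lineup is 2
```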

[1] https://www.axis.com/ca/en/solutions-by-application/facial-r... [2] https://www.facefirst.com/solutions/face-recognition-predict...

fixermark · 9 years ago
That's always going to be tempting, and the only real tractable solution is for society to have a larger conversation on the ethics so the law can catch up with it.

Note that some ethical consensus is key---without it, companies can just price "Well, some customers think image recognition is creepy" into the risk model and do it anyway. Compare privacy concerns---people talk big about their concerns over privacy, but in practice, we're still in a world where a survey-taker can get very personal information from a random individual at a mall by offering a free candy bar. Until and unless people arrive at a common consensus that their personal information---including their face---has value or they have a proprietary right to that information, even in public, there's no real tractable solution to this problem.

... because there's no real agreement that there's a problem to solve.

jhonovich · 9 years ago
"The cameras retailers use with their surveillance systems are coming with facial recognition built in now. [1]"

Your source is the marketing material of an IP camera manufacturer.

We research that space and I can guarantee that less than 0.1% of IP cameras have facial recognition built in or running. Manufacturers like Axis, whom you cite, would love to offer such capabilities, but they are still very uncommon.

fps · 9 years ago
Just yesterday I was hearing news of how most of the retail giants and lots of smaller retail stores are going out of business due to competition from ecommerce. If that means the end of practices like this, then good riddance.

Deleted Comment

Stratoscope · 9 years ago
I'm 65 and it says I am 39.

This is a wonderful app. I will use it every day!

It also picked up the colors in my aloha shirt perfectly. (Anyone who knows me knows that I am to aloha shirts as Steve Jobs was to black turtlenecks.)

When I want to feel young and go shopping for shirts, now I know what to do!

jstanley · 9 years ago
I'm a man and it thinks I'm a woman.

But it also scores me high for anger and sadness, despite (what I thought to be!) a rather neutral expression. Perhaps it knows more than we think :)

TFortunato · 9 years ago
I tried the demo. A little sad, that it sees me as 12 years older than I am, and that I apparently always look angry and disgusted! I'm going to blame it on my glasses and bushy beard, and try to look at the bright side - apparently face scanning systems aren't quite good enough to get a read on me yet. (And try not to be too sad about looking like a grumpy old man)

(Anyone else with glasses, a beard, or other non-typical facial features want to comment? I'm curious now how well their system handles these.)

pravda · 9 years ago
I uploaded Comrade Putin. [1]

Thinks he is 45 years old. He is 64! Not calibrated for the superior Russian genetics.

[1] https://pbs.twimg.com/media/CuV5wciUAAA0aBz.jpg

dingo_bat · 9 years ago
Same experience here. It shows my age as 10-15 years more than actual. I tried to smile and it just filled up the "disgust" bar. A neutral expression shows a high amount of "sadness". I wear glasses too, and have a slight beard.

I don't think glasses and beard are non-typical facial features!

aiahopeful · 9 years ago
I'm a 25 year old male, bald with a full beard, and it thinks I'm a 33 year old woman. At least it could tell I was happy?
elygre · 9 years ago
Of course you look angry and disgusted. That's only natural, considering that you are aware of what's going on.
cortesoft · 9 years ago
Bearded male here, and it got me exactly right the first time - 33 yr old male, 100% happiness.

The second time it thought I was 28, which increased my happiness even more.

Gustomaximus · 9 years ago
MS did this a while back to estimate age. I loaded some family members and for the majority it was quite accurate.

https://how-old.net/

bigiain · 9 years ago
With a bald head, beard/mo, and reading glasses it doesn't detect a face. Without the glasses it estimates me as 7 years younger than I am - and reasonably high on anger and sadness...
rukuu001 · 9 years ago
It added 5 years to my age (42) but got everything else right.

My partner tried it and it took 10 years off her age, and found an angry 31 year old man hiding in the folds of her clothing!

sbuttgereit · 9 years ago
I did well... it said I was an angry 33 y/o male. Well, I am male... I'm almost 50... and I didn't think smiling at the camera conveyed anger... but who am I to question our AI overlords ;-)

(edit)... on the other hand they probably were just trying to sell product so thought flattery was the right approach...

ClassyJacket · 9 years ago
Says I'm 4 years older than I am (31 / 27) and have high levels of sadness.

Covered up my receding hairline a bit and it said 29. I reckon if I shaved I could get it down to about 22 since that's how old people usually think I am.

Pulled a disgusted face and it said 47. Hmm.

geowwy · 9 years ago
It thinks I'm 15-20 years older than I am, and even if I smile it thinks I'm angry.
finolex1 · 9 years ago
The real way to judge its effectiveness is to compare its accuracy to an average human observer's responses. I doubt a human would do a lot better at estimating someone's age.
ako · 9 years ago
From the responses it feels like they are using AWS Rekognition.
froindt · 9 years ago
I wonder if the age error is symmetrical, or if it tends towards guessing lower or higher? It would make for an interesting study.
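A quick way to check, if anyone collects a few (guessed, actual) pairs: look at the mean signed error - near zero suggests symmetric error, strongly positive or negative suggests a bias. Toy numbers loosely echoing the anecdotes in this thread:

```python
def signed_errors(pairs):
    # Positive = system guessed too old, negative = guessed too young.
    return [guess - actual for guess, actual in pairs]

pairs = [(39, 65), (57, 35), (33, 25), (28, 40), (47, 42)]
errors = signed_errors(pairs)
bias = sum(errors) / len(errors)
print(errors)  # [-26, 22, 8, -12, 5]
print(bias)    # -0.6: roughly symmetric on this tiny sample
```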
spc476 · 9 years ago
Glasses and a beard. Pegged me as an angry white guy about five years younger than I am.
allengeorge · 9 years ago
Apparently I'm a 40+ yr-old male. At least it got the "male" part right.
mixedbit · 9 years ago
I was very unhappy to discover that shoes now often have RFIDs built into the soles. This, plus the anti-theft RFID readers already deployed by the entry of most stores, makes it easy to assign unique IDs to shoppers.
avian · 9 years ago
Most anti-theft tags are not RFID and the gates are not full RFID readers. At least in Europe, the vast majority I see are still based on simple resonators that get disabled at checkout. Effectively, the gates only provide a yes/no signal and can't be used for tracking.

Applied Science has a good video on how they work:

https://benkrasnow.blogspot.si/2015/11/how-anti-theft-tags-w...

erjjones · 9 years ago
Retailers also use your phone's MAC address, which is constantly being broadcast unless you take precautions.

http://lifehacker.com/how-retail-stores-track-you-using-your...
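The mechanics are mundane: a sensor logs Wi-Fi probe requests, and a backend just counts distinct (store, date) sightings per device. A minimal sketch of the idea (my own names, not any vendor's code) - note that salting and hashing the MAC still yields a stable per-device key, so "we don't store MAC addresses" isn't much comfort:

```python
import hashlib
from collections import defaultdict

def device_key(mac, salt="chain-wide-secret"):
    # A salted hash is still a stable identifier -- "anonymized" in name only.
    return hashlib.sha256((salt + mac).encode()).hexdigest()[:16]

sightings = defaultdict(set)  # device key -> {(store, date), ...}

def record(mac, store, date):
    sightings[device_key(mac)].add((store, date))

record("aa:bb:cc:dd:ee:ff", "downtown", "2017-05-01")
record("aa:bb:cc:dd:ee:ff", "mall", "2017-05-03")
record("11:22:33:44:55:66", "mall", "2017-05-03")

repeat_visitors = sum(1 for visits in sightings.values() if len(visits) > 1)
print(repeat_visitors)  # 1
```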

hawski · 9 years ago
Would putting my newly bought shoes into a microwave be a good countermeasure?
genmon · 9 years ago
Do you have a source for this? All I can find online is the occasional use of RFID for stock management or the odd marketing campaign. But nothing about customer tracking
askvictor · 9 years ago
bballer · 9 years ago
Pretty sure they already are built into credit cards for this exact purpose.
padraic7a · 9 years ago
Any sources for this?
amingilani · 9 years ago
By grinning like an idiot I was classified as a very happy 33 year old female. I'm a 25 year old male.

I tried variations of the standard expressions and pulled off sad, disgust, anger quite easily.

I knew binge watching Lie To Me before my psychology mid-term would come in handy at some point!

AznHisoka · 9 years ago
If you're ever homeless you could be like the guy from The Imposter (2012 film) and trick a family into believing you are their long-lost son.

Deleted Comment

laen · 9 years ago
I was blown away by the accuracy of Microsoft's https://how-old.net/ This just kicked it up a notch.
mfarris · 9 years ago
Counterpoint: this tool pegged my 8-year-old as 13 (and she looks younger than her age) and me as 56... well into the double-digit error zone.

Someone recently thought I was 30. People aren't any better than computers.

Angostura · 9 years ago
It thinks I'm 91.

I am not 91.

metal_guru · 9 years ago
"Emotion Recognition

Understand how your customers feel. Detect and measure facial expressions like happiness, surprise, sadness, disgust, anger and fear."

Creepy, indeed.

fixermark · 9 years ago
Perhaps. On the other hand, it's something that good salespeople are doing internally already; there's an argument to be made that this is just automating yet another part of the customer service process.

(There's an old joke that sometimes shows up on HN about augmenting an automated bugtracker to snap a photo when a crash is detected or a bug is reported, so developers can be reminded that bugs tie down to real people who are actually sad / angry that the software failed them ;) ).

noobermin · 9 years ago
>You can even try their live demo

I'd rather not give them my facial image so they can optimize for me.

frakr · 9 years ago
Matched to your IP, no less. I wonder why Microsoft made that "how old are you?" web app...
asher_ · 9 years ago
Why is this scummy exactly? If a salesperson was to try to sell to you in a store, they would take into account how you appear and act to tailor the sale. There's nothing wrong with that. Why is it suddenly bad if a machine does it?
r0muald · 9 years ago
Because when you talk to a salesperson you know you're being looked at (and reciprocally you're looking at them), and human memory is limited so it's unlikely they will retain any "data" about you when the contact is finished.

Here, instead, there is no indication that you're being watched, analyzed and kept recorded for indefinite amounts of time.

tuxxy · 9 years ago
Well, your behavior and appearance aren't logged in some computer somewhere, available for someone to look at whenever they want. Not to mention, face-to-face interaction means you know someone else is watching. This allows someone to do it without your knowledge.

It's just creepy.

pasta · 9 years ago
Because a machine has much more capabilities than one salesperson.

The salesperson doesn't know in what shops you have been before.

The salesperson might also not know you talked to his colleague the day before.

This is about trust and privacy. You can't trust what they do with your data.

dmoo · 9 years ago
Also if it records images it would fall under data protection legislation in the EU
MichaelMoser123 · 9 years ago
I find it funny that the store doesn't trust its salespeople to make such a judgement on their own. Probably they hope to do analytics on what kind of people are visiting and when. Selling the data would only make sense if they were able to link it to an identity, and I am not sure they can legally do that.

Well, you never know what dystopia you are heading into...

794CD01 · 9 years ago
Most humans today are prejudiced against nonorganic life due to not growing up interacting with anyone but other meatbags.

There's a huge double standard in place that makes it somehow wrong for computers to do what humans have been doing without objection for decades or millennia.

aub3bhat · 9 years ago
Every time I eat at Chipotle I smile at the sign on the door that proclaims this property is protected by Envysion.
Grollicus · 9 years ago
> Any time you've been in a public place for the past few years, you've likely been watched, analysed and optimised for

I don't believe this. This kind of advertisement in public would be illegal in Germany, as it is mass surveillance.

fixermark · 9 years ago
Germany is an outlier; their history with the Nazis makes the country unusually conservative about anything that could be abused for mass-surveillance purposes.

Not that this is a bad thing---being able to think differently like this is one of the positives of having countries!---but relative to the rest of the world, what Germany considers "surveillance" is unusual and sometimes surprising.

tzs · 9 years ago
That demo is amusing. I did a Google image search for "N year old faces" for N = 5, 9, 10 and 14, and eliminated results where the accompanying text did not confirm the age, and then gave some of the remaining ones to it. It was always at least 10 years too old on its guess for these children. It got the gender right maybe 3/4 of the time.

I also tried it on a few internet porn images. It looks like it is definitely only relying on the face for determining gender, or it thinks that there are a lot of women with hairy flat chests and large penises...

joshmn · 9 years ago
Really neat, except, it's about... 22 years and 5 months off. Plus a gender. http://i.imgur.com/wVmPdDj.png
kefka · 9 years ago
I have my own facial recognition open-sourced as well.

https://gitlab.com/crankylinuxuser/uWho

Runs on CPU only, realtime 1280x720 @ 15 fps.

Is it creepy? Sure. But anyone can run it. I was looking at a rewrite to work in CLI with a web interface instead. But the core loop is the magic part that makes everything work nicely.

amelius · 9 years ago
Isn't it illegal in most countries to put cameras in public places? Especially if they don't contain a warning?
TheSpiceIsLife · 9 years ago
No one's going to go to jail for their first offence of putting a camera in an advertising sign.

What's the worst that could happen? The local advertising regulator will order you to meet the regulatory standards or remove the cameras, but give you x number of weeks to act per unit installed.

lawless123 · 9 years ago
haha, i did something like this in college for a project. It was confined to emotions only though, and used images scraped from Google to train it.

Maybe i should try to sell it..

Shivetya · 9 years ago
I will be honest: I have no real problem with this. Then again, I enjoyed some of the concepts for advertising shown in Minority Report, which featured ads that could identify you.

The idea of collecting who looked at your display is invaluable. It would be beneficial for both government and ad agencies. The ad agency case is obvious, but government could learn whether displays present information people want, and whether it was presented in a manner that catches their attention. The negative aspects of government use could be limited through privacy laws and such.

kebman · 9 years ago
Then the new Hitler (Erdogan) enters office and starts using said device to deport infidels like it's 1915.
bartkappenburg · 9 years ago
I've tested it a few times and it's indecisive on my gender and age (depends if I'm smiling, angle etc etc).

Microsoft's service is much more consistent and accurate (I've tested the same images...): https://azure.microsoft.com/en-us/services/cognitive-service...

Deleted Comment

custos · 9 years ago
Wearing a baseball cap makes me invisible to their scan...
yeukhon · 9 years ago
I tried, but nothing from the response... is that normal?
tluyben2 · 9 years ago
I took my glasses off, keeping exactly the same expression and position: 15 years younger, suddenly a lot African(?) and very happy.
smoyer · 9 years ago
Fearful with my glasses on and angry with them off - I'm sitting here while my morning coffee is steeping so I was expecting sleepy!
ge96 · 9 years ago
Try it but first take some clay and apply it all over your face that's skin-tone... then use that as your face ha.
Godel_unicode · 9 years ago
I love how many people replied to your comment about how creepy this is with their age, appearance, and gender.
wfunction · 9 years ago
I love how people tried it in the first place. Now the site has faces tied to IP addresses.
joshrotenberg · 9 years ago
Now your employer can monitor your facial expressions to determine if you are actually working or just reading HN.
paulcole · 9 years ago
The future is here:

"WorkSmart can track workers' keystroke activity and take webcam images to ensure they're doing their jobs."

http://www.oregonlive.com/silicon-forest/index.ssf/2017/05/j...

akhilcacharya · 9 years ago
Oh dear it thinks I'm a 35 year old woman...
paulddraper · 9 years ago
And I suppose you're not?
qb45 · 9 years ago
It's AI, it knows you are secretly trans.

Deleted Comment

jswo3901 · 9 years ago
shitgoose · 9 years ago
just tried it... 57 yo?! Fucking kids wrote this code. Everyone above 30 is an old man to them.
muppetman · 9 years ago
I'm 40 and it keeps saying I'm a 28 y/o.
mythrwy · 9 years ago
It just said I'm 23 on one picture. The other one unfortunately was pretty close to my real age.
mrep · 9 years ago
It said I was 35 years old and yet I am 23 so I don't think it is just you
anabis · 9 years ago
In Japan at least, before automated facial recognition, cashiers recorded buyer demographics by hand. I would think other places do it too.

Edit: Here is what the buttons look like. Gender and age. https://image.slidesharecdn.com/hvc-c-android-prototype20141...

preinheimer · 9 years ago
I've seen this in Canada, at a Dairy Queen.

The cash register had a matrix of buttons

M: [0-9][10-19][20-29][30-54][55+]

F: [0-9][10-19][20-29][30-54][55+]

They'd just push a button as they punched in your order.
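The whole "system" behind a button matrix like that is just a two-key tally - a sketch of the idea, obviously not DQ's actual register software:

```python
from collections import Counter

AGE_BUCKETS = ("0-9", "10-19", "20-29", "30-54", "55+")
tally = Counter()

def press(sex, bucket):
    """One button push per customer, alongside the order entry."""
    assert sex in ("M", "F") and bucket in AGE_BUCKETS
    tally[(sex, bucket)] += 1

press("M", "20-29")
press("F", "30-54")
press("M", "20-29")
print(tally[("M", "20-29")])  # 2
```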

caf · 9 years ago
Presumably most cashiers would just optimise to pressing the same button on every transaction, since doing it "right" makes no difference observable to them.
cyberferret · 9 years ago
In a few stores here in Australia, I am asked for my post code (or country of residence if no post code) by the cashier when ringing up a sale. Predominantly at tourist/tour-related points of sale, but I've also had it at electronics and white goods stores.

No idea if the operator is also recording gender and perceived age group etc., but I do know that on most occasions, you can opt not to answer the post code question.

dis-sys · 9 years ago
When you go to those weekend open inspections in Sydney, you are almost guaranteed to be asked for your postcode. They record that as well.

I actually did some experiments - for different properties in roughly the same area/price range, I told different real estate agents different postcodes. It is beyond reasonable doubt that the postcode you tell them plays a huge role in how they rank you as a potential buyer. When you tell them a random north shore postcode, you are guaranteed to receive a nice & friendly follow-up on the coming Monday; however, if you tell them that you live in the west (while mostly inspecting north shore properties), they will smile and immediately end the whole conversation.

The sample size here is ~50, which I believe is big enough to draw some reasonable conclusions.

punk_coder · 9 years ago
I'm not sure if it's true, but a friend once told me the reason a store asks for your zip code is to see if they have a large audience coming from a certain area. This lets them know where they could possibly open other stores.
plink · 9 years ago
By cross-referencing your name (from your credit/debit card) with your zip/post-code, stores are able to determine specifically who you are with greater probability than without the zip/post-code.
gumby · 9 years ago
I always give the post code of the shop, if I know it, or one nearby that I do know.
closeparen · 9 years ago
Isn't it a credit card security feature? That's what it's for at gas stations.
tomjen3 · 9 years ago
Of course you can refuse, but you can also just give them a false one.
partiallypro · 9 years ago
Don't forget loyalty cards are a way to track both demographics and purchasing trends.
erroneousfunk · 9 years ago
I volunteer at the Boston Museum of Science on Sundays and we also track how many people we interact with at the various activities. We log by group, so a log might read "1 man, 1 woman, 2 boys, 1 girl (family)" or "3 women, 10 girls, 12 boys (school group)".

It's really handy to see how many people the activities attract, and who they appeal to most. You're tracked everywhere!

logicallee · 9 years ago
Could you be more specific? What could they possibly [Edit:] have recorded in a second other than male/female, Japanese/foreigner, or child/adult?

Facial recognition is a unique identifier but cashiers have access to almost nothing they can record... [Edit] What was it?

Edit: clarified that I am asking about the history here, what information was manually collected by cashiers as parent stated

anabis · 9 years ago
Demographics and sales data are a primary example of big data.

The tweeted advertisement system also looks like it's only recording demographics. Not individual personal IDs.

derefr · 9 years ago
You could distinguish people who are married or are parents (with false negatives) by recording people who are at the till with their spouse or children. (Send out demographic-research cards once, to a few of the same stores you've collected this info from, to derive a normalization factor that will make such collected observations useful from then on.)

You could make a note of a person's seeming affect—positive/negative/neutral emotion.
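The normalization trick in numbers (all made up): if the one-off survey cards say 60 of 100 customers are parents, but till observation alone only flags 40 of the same 100, scale the observed counts by 60/40 from then on:

```python
def normalization_factor(survey_count, observed_count):
    # survey_count: parents per the demographic-research cards (ground truth-ish).
    # observed_count: parents flagged by till observation over the same sample.
    return survey_count / observed_count

def corrected_estimate(observed, factor):
    return round(observed * factor)

factor = normalization_factor(60, 40)
print(factor)                           # 1.5
print(corrected_estimate(200, factor))  # 300 parents estimated from 200 flagged
```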

Deleted Comment

samtho · 9 years ago
During 2010-2012, I was part of a startup called Clownfish Media. We basically created something very similar to this and got scarily accurate results even then. Given how accessible computer vision has become, the image in the tweet comes as no surprise to me.

Best part - we got a first-gen Raspberry Pi to crunch all the data locally at 2-5 fps. Gender, age group (child, youth, teen, young adult, middle age, senior), and approximate ethnicity were all recorded and logged. Everyone had a unique profile and we could track people between cameras and across days (underlying facial features do not change).

Next time you look at digital signage, just be aware that it is probably looking back at you.

mattcoles · 9 years ago
Do you feel it was ethical to work on that?
deadbunny · 9 years ago
Not GP but I've worked in a similar industry.

For me, I knew how our data was anonymized. So while our system would be able to say "I have seen person 1234 at locations 4, 7, 9, 11 on dates x, y, z", we had absolutely no way of knowing who 1234 was or anything about them; even the unique identifier was just a hash.

Obviously it depends on how much data you collect/store; personally I don't think the things shown in the OP are all that onerous (sex, age group, emotion, time spent looking at the ad).

csmattryder · 9 years ago
I work on digital signage. Our product isn't using facial expression recognition yet, but it has been asked for and will eventually be part of the system.

What's the difference between this and an anonymised dataset? No PII is tracked, it's just looking at you and calculating what emotion you're likely feeling to show more targeted advertising.

I mean, I'm personally against it, but we've got to prove a higher and higher ROI to justify the cost of digital signage, and this delivers just that.

_-_T_-_ · 9 years ago
(Supposedly) Lee Gamble's comment on Reddit -

"Hi. I am the original taker of the photo. There is a screen that normally shows peppes pizza advertisements in front of peppes pizza in Oslo S. The advertisements had crashed revealing what was running underneath the ads. As I approached the screen to take a picture, the screen began scrolling with my generic information - That I am young male (sorry my profile picture was misleading, not a woman), wearing glasses, where I was looking, and if I was smiling and how much I was smiling. The intention behind my original post on facebook was merely to point out that people may not know that these sort of demographics are being collected about them merely by approaching and looking at an advertisement. the camera was not, at a glance, evident. It was merely meant as informational, maybe to point out what we all know or suspect anyway, but just to put it out in the open. I believe the only intent behind the data collected is engagement and demographic statistics for better targeted advertisements."

Source: https://www.reddit.com/r/norge/comments/67jox4/denne_kr%C3%A...

sleepyhead · 9 years ago
Not Lee Gamble. He just shared the photo without any source. See this article: https://translate.google.com/translate?sl=auto&tl=en&js=y&pr.... Photo taken by Jeff Newman.
baldfat · 9 years ago
It is still a BIG ethical issue for some people. Myself, I see this as just the natural progression we are headed toward. If we don't have rules about this kind of technology it will very much be "Minority Report" in a decade.
794CD01 · 9 years ago
How is it an ethical issue for strangers to look at your face when you are out in public?
mk-61 · 9 years ago
For me, a far bigger ethical issue is what millions share on social networks, of their own free will.

Deleted Comment

gedrap · 9 years ago
To be honest, I am fairly surprised at the reaction here on HN. It's not really surprising to see such a system; it would be more surprising if it did not exist, because offline ads are a huge business and the technology is here. This goes together with conversion tracking at physical shops, etc.

I am equally surprised by the comments about how engineers come to implement such systems, how they find it ethical, etc. I'm sorry, but it sounds just a bit out of touch with the real world, or at least with the world outside the HN bubble. Given the things that money motivates people to do, this is probably one of the least unethical.

I am not judging that this is right or wrong, I am simply stating the fact that nothing about this should be surprising. Yes, this is slightly sad, but that's simply the reality of technological advancement. It's not really possible to expect the rest of the world to use the technology only for things considered 'right', etc.

TeMPOraL · 9 years ago
Well, nothing here is surprising beyond maybe the scale of things (if a random pizza joint now uses facial recognition in ads, who else is using them?). But those things still need to be called out and opposed, because peer pressure is an important part of morality in society. People are social animals, and are less likely to do things that are disliked by their friends.

Looking at things from a little distance, the whole thing is abhorrent, and paints a really sad state of our society. I wrote this many times, and will keep writing it: if you did the same things personally to your friend that people in advertising industry do to everyone, you'd most likely get punched in the face. And yet somehow marketing became a respectable occupation.

fixermark · 9 years ago
Do they need to be opposed though?

There isn't really much consensus---even on HN---that passive demographic data collection is a bad thing alone. People claim it is, and I believe they feel it is---then they turn around and do things that compromise their stated beliefs because it's convenient.

I liken it to the gap between the rhetoric around open source and free software and the reality that Windows and Mac OS make up approximately 90% of OS marketshare. You can believe what you want to believe, but from a business standpoint you'd be putting yourself at a disadvantage if you structure your business requiring FOSS operating systems to climb to even 25% of marketshare; there's a similar situation, probably, for customer data tracking and advertising preference tracking.

jstanley · 9 years ago
It's not surprising that it's possible. It's surprising that it's already happening.

It feels straight out of Black Mirror.

ashark · 9 years ago
Lots of Black Mirror is commentary on where we are now, not where we're going, even if the episode itself is set in future (15 Million Merits is an easy example).
mk-61 · 9 years ago
I think such a reaction is just because the article is less "techy" than it could be, and more about the "moral" aspect.

OTOH, what is so interesting about simple face recognition, innit? That future became the past quite fast; meanwhile, the human rights angle never gets old. (smile.jpg)

ffriend · 9 years ago
As someone working on a similar project (specifically, emotion recognition), I'm highly interested to hear what such a product should look like in order not to be considered unethical. So far from the comments I see that:

- it should be made clear that you are being analyzed, e.g. by a big yellow sticker near the camera

- no raw data should be stored

- it should be used to collect statistics, not identify individuals (?)

Is it sufficient to consider such software fair use? What else would you add to the list to make it reasonable?
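On the "statistics, not individuals" point, the property I'd aim for is that nothing per-person survives the frame: classify, bump a coarse counter, discard. A sketch of that aggregate-only shape (function and bucket names are mine):

```python
from collections import Counter

stats = Counter()

def ingest_detection(age, gender, emotion):
    """Fold one detection into coarse aggregate buckets, retaining no image,
    embedding, or per-person identifier -- only bucket counts."""
    stats[("age", f"{(age // 10) * 10}s")] += 1
    stats[("gender", gender)] += 1
    stats[("emotion", emotion)] += 1

ingest_detection(34, "F", "happy")
ingest_detection(37, "M", "neutral")
print(stats[("age", "30s")])  # 2
```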

abalone · 9 years ago
The ethics are simple: If you don't get opt-in consent, many subjects are going to feel violated. Even if you assure them you anonymize the data.

It's not enough to put a warning next to the camera because you've already captured them at that point and it's too late. If anywhere it would need to be at the entrance to the store.

But guess what? Customers HATE this stuff. The backlash and lost business is not worth it. See: http://abc7news.com/business/philz-to-stop-tracking-customer...

bigbugbag · 9 years ago
If a store has a warning at the door that this happens inside, it's good because now I can avoid getting inside the store and silently hate and boycott the brand.

If a store has a warning label on the device engaging in this, it's bad because it's too late to avoid entering the store. I'm gonna complain to the store manager on the spot, maybe call the cops or sue. I'll be vocal about actively hating the store, the brand, the manager, the employees.

If I went to a store engaging in this without telling and I later learn about it, then I'm calling Keyser Söze and it's pitchforks and beheading time.

I suppose it will take a couple more generations of brainwashing to have the population ready to accept this kind of highly invasive technology. IIRC, about 10 years ago the Big Brother Awards were given to a French industry group for their blue book describing how to condition a population, over a few generations, to accept surveillance and control technology.

TeMPOraL · 9 years ago
It's actually pretty simple - don't use it on people.

Advertising? No. Sales? Definitely no.

Augmenting that single-player video game so that it adjusts content depending on emotions and gaze of the player? Ok. Better if the player is explicitly told the game will track their reactions though.

EDIT:

Also, another angle. Even for advertisers / "sales optimization", I'd forgive you if that was a local, on-site system. But if it's meant as a SaaS, with deployments connected to vendor's butt, then I am gonna actively try to screw with it if I learn there's one installed anywhere I frequent. Hopefully new EU laws will curb that, though.

fixermark · 9 years ago
Editing note: unless it was intentional, you appear to have your "cloud to butt" web extension enabled. ;)
spuz · 9 years ago
I'm not sure what you mean by your last paragraph. How would facial recognition apply to SaaS products?
arximboldi · 9 years ago
The only ethical possibility in my view is for it to not exist. I don't like having my emotions manipulated to make me buy more stuff, regardless of whether I am anonymized or not. But then again, I think similarly of a lot of non-targeted advertising; the recognition just adds a whole new level of disgusting.
ianai · 9 years ago
It feels both like an invasion of privacy and unhelpful beyond a point. I don't see the benefit rising above the societal cost.
ffriend · 9 years ago
> I don't like having my emotions manipulated

What about collecting statistics to make better decisions? Let's say you go to your favorite jeans store, but find the current collection disgusting. Does it sound OK to you if some sort of system analyzed your attitude toward the product in order to improve later versions of it?

ctdonath · 9 years ago
I've watched the same hysteria & concerns about all kinds of privacy-invading systems. Social Security Numbers, credit cards, computer IDs, camera GPS, search queries, and piles of other tech all start popularization with "OMG evil people can do evil things with that data to hurt you!" Save for a few holdouts (usually much older folks), society at large has completely accepted all that tech as normal. Just takes about a decade of the convenience overwhelming the fear. I despise SSNs, but cutting my taxes by $1500/yr (child tax credit) is motivating; credit cards suck for a zillion reasons, but swipe-and-done is so damn convenient; no question Google has an impressive model of me but those search results are enormously useful; etc.
maze-le · 9 years ago
I have a question: Do you do trials in a controlled environment, where you actually have proper feedback and a distinct comparison between self-described state and machine analysis? Because in my opinion, systems like these are a modern-day version of astrology (at least when they are based only on vision, and not on things like fMRI imaging or proper psychological analysis). I know seriously depressed people who always had a smile on their face (maybe a social coping mechanism), as well as "angry"-looking coworkers who were in a very good mood most of the time. It's very easy to misinterpret a person's mood when the only "interaction" is looking at them and analyzing their facial features.

When these things are used outside a controlled environment, things could get even more complicated: weird beards, squinting because of bright sunshine, reflective glasses, etc.

ffriend · 9 years ago
I see 2 separate issues here:

1. Accurate collection of facial features. Illumination, occlusions, head rotation, etc. may seriously affect accuracy, but this is exactly our main focus right now. We are at the very start of the process, yet early experiments and some recent papers show that it should be doable.

2. Correlation between real and detected emotional state. At the moment we concentrate on the 6 basic emotions and don't detect less common expressions like depression behind a smiling face. This topic is definitely interesting and I'm pretty sure it's possible to implement given enough training data, but right now we're concentrating on other things.
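To make the "statistics, not individuals" point above concrete, here's a minimal sketch in Python. The `classify_emotion` function is a hypothetical stand-in for a trained model (not any real product's API); the point is that only aggregate counts survive, while raw frames and per-person records are never stored:

```python
from collections import Counter

# The six basic emotions referred to above (Ekman's set).
BASIC_EMOTIONS = ("anger", "disgust", "fear", "happiness", "sadness", "surprise")

def classify_emotion(face_crop):
    """Hypothetical stand-in for a trained classifier.

    Returns a dict mapping each basic emotion to a probability.
    A real system would run a CNN over the face crop here; this
    stub just returns a uniform distribution.
    """
    n = len(BASIC_EMOTIONS)
    return {emotion: 1.0 / n for emotion in BASIC_EMOTIONS}

def aggregate(face_crops):
    """Tally only the most likely emotion per detected face.

    The crops themselves are discarded after classification, so
    nothing identifying an individual is retained -- only counts.
    """
    counts = Counter()
    for crop in face_crops:
        probs = classify_emotion(crop)
        counts[max(probs, key=probs.get)] += 1
    return counts
```

Whether a design like this is *sufficient* to be ethical is exactly the question being debated in this thread; it only shows that the "no raw data, aggregates only" requirement is technically easy to satisfy.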

criddell · 9 years ago
The fact that you are already working on this says something about your willingness to do something distasteful to earn a paycheck. A slightly bigger paycheck would probably mean you'd relax your morals even further. Even if your product starts out with stickers and no logging, I bet it doesn't stay that way for long. Not if the paycheck can be bigger.
bigbugbag · 9 years ago
To me the only way this could be ethical is by the project being limited to a private space (a lab, a room in your house). No data is to be recorded, ever; it runs on an airgapped computer; it doesn't try to identify people; and everyone subjected to it has to be fully aware of what it is about and the implications it can have.

Facial recognition is by essence creepy.

spuz · 9 years ago
The opinion of most people here is that facial recognition technology is for the most part creepy if used in a commercial setting. Mine is slightly different. I think it's fine if you want to show me a different advertisement or sign based on an interpretation of my expression. I also think it's OK to track my position within a mall and see which shops I visit and when. I would draw the line at attaching personally identifiable information to that data, such as a name or a photo of my face. Anyone who decides to do that is probably going to cause harm or inconvenience to me (I don't want junk email from shops I happened to visit but didn't buy anything in).

I should also state that I think the first use of my data is ultimately unprofitable. Will the extra cents you make by advertising cinnabon to depressed looking people or hairdressers to long haired people really offset the cost of developing such a system? If applied to a broad population any customisation effects will be marginal.

I also believe that the non-anonymous tracking system is much more likely to produce value for companies, and it would be very tempting when gathering anonymous data to cross-reference it with actual individual information. My concern about any tracking system is that, motivated by profit, it could easily shift from an ethical to a non-ethical space.

pjc50 · 9 years ago
I'm kind of surprised that it didn't have some sort of Data Protection warning near it already, but I'm not sure if the EU data protection directive covers Norway as well.
Foxboron · 9 years ago
We have pretty strong laws regarding this. It has generated several news articles over the past week, and the Norwegian Data Protection Authority has already commented and said they don't believe this is legal. Stickers were added after the initial discovery.
forgottenpass · 9 years ago
> What else would you add to the list to make it reasonable?

For it to not be used without opt-in consent. And to not hold any other benefits/privileges/incentives behind the wall of opting in.

fhars · 9 years ago
It should work without taking pictures or movement profiles of people that did not give consent in writing or using a qualified digital signature.
retube · 9 years ago
I think you're going to have to accept that what you are doing is disliked and considered unethical by the majority.
ffriend · 9 years ago
I'm curious where the boundary between ethical and unethical lies. People constantly analyze each other's moods and it's perceived positively. But doing the same thing at scale using automated tools is often considered inappropriate. So is it because of the use of technology, the massiveness, the purposes? I hope there's a way to make such things both efficient and ethical.
tawayway · 9 years ago
An opportunity to opt out? If I need to be near the unit for whatever reason I would be more comfortable if I could switch it off.
monochromatic · 9 years ago
The fact that it's secret is what bothers me the most.
cyberferret · 9 years ago
Uh, I got sidetracked and brain-hammered by the devolving discussion on that Twitter thread, so I couldn't find the context for this pizza shop kiosk. Is it a customer service portal that attempts to identify the person in front of it to match them up with an order, or a plain advertising display trying to capture the demographics of the people who happen to stop in front of it and look at it?
nom · 9 years ago
It's an ad. May be targeted advertising, or just simple engagement analysis.

http://www.dinside.no/okonomi/reklameskilt-ser-hvem-du-er/67...

Edit: there is more information in the article. Not going to read it, though.

vilhelm_s · 9 years ago
To summarize, it's an experimental project, there is so far only one such screen at the train station. If someone stands within 5 meters of the screen it will try to classify their age and gender and show a targeted ad, and record how long they looked at it. The raw images are not saved.

The screen uses a software called Kairos to analyze faces. It can estimate age, gender, and whether you are "white, black, hispanic, asian, or other".

According to the marketing manager at Peppe's Pizza, he thought there would be a label on the screen saying what's going on, but in fact there is just a small sticker on the back of it, which is quite hard to see.

The company making the screen, ProtoTV, says that people should be okay with this because ads on the internet are even more targeted. A government representative says that the system might violate laws about surveillance cameras.
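The pipeline described in the summary (trigger within 5 meters, estimate demographics, swap the ad, keep only derived data) can be sketched roughly like this. Everything here is an illustrative reconstruction, not the actual ProtoTV/Kairos code; the `Viewer`, `pick_ad`, and `handle_frame` names are made up:

```python
from dataclasses import dataclass

@dataclass
class Viewer:
    """Demographics estimated from a single frame."""
    age_bracket: str
    gender: str

def pick_ad(viewer):
    """Hypothetical targeting rule based on the estimated demographics."""
    if viewer.age_bracket == "18-25":
        return "pizza_student_deal"
    return "pizza_family_deal"

def handle_frame(distance_m, estimate):
    """Process one camera frame.

    `estimate` is a callable that classifies the current frame into a
    Viewer. Only the derived demographics and the chosen ad are kept;
    the frame itself is discarded, matching the claim that raw images
    are not saved. Outside the 5 m trigger radius, nothing happens.
    """
    if distance_m > 5.0:
        return None
    viewer = estimate()  # classify age and gender from the frame
    return {
        "ad": pick_ad(viewer),
        "age": viewer.age_bracket,
        "gender": viewer.gender,
    }
```

Note that even this "anonymous" version retains a per-viewing event record (ad shown, age, gender), which is exactly the kind of derived data the regulator quoted in the article might still consider surveillance.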

SmellyGeekBoy · 9 years ago
A UK-based company called Amscreen make a similar system: http://www.amscreen.eu/products/ds24-2/

(If the "Am" part of the name seems familiar it's because it's Alan Sugar's son)

They installed one at a local petrol station so they lost my business. Perhaps I should be more vocal about it.

tjpnz · 9 years ago
It saddens me that there are people in our profession for whom implementing such a thing presents no moral or ethical dilemma.
edanm · 9 years ago
It saddens me that most people refuse to accept that not everybody shares their morals, and that it might be them that are wrong.

I'm not saying there isn't an absolute right and wrong, but I certainly don't find this as abhorrent as most people here, apparently.

lloeki · 9 years ago
So what am I supposed to do if I find this invading my privacy? Not walk within 10m of such a billboard? It's not like there's any other active way to opt out of this.

I suppose a banner saying "you are being tracked" on top of such a billboard could have quite an interesting effect. Come to think of it, I've seen "Smile, you're being recorded!" in some shops.

bigbugbag · 9 years ago
You'd have a point if this was a general statement, but in the case of automated facial recognition it's almost universally despised across the world.

If you don't share this position, it could be that you are younger, or that you have been subjected to the conditioning by industries meant to make intrusive surveillance technology acceptable to the population, which has been going on for at least 15+ years AFAIK.

ihm · 9 years ago
Agreed. Who is agreeing to build such dystopian technologies?
grayhatter · 9 years ago
I would totally get interested in building this and have a half-good working prototype, all before I'd even considered that someone else might use it for evil...
thatwebdude · 9 years ago
The technology interests me and I would gladly work on a system that implemented such features.

As an ethical programmer, I'd be sure to incorporate security, anonymization, and be able to draw the line so that I can help businesses make more money (since that's what they pay me for), and advance technology at the same time.

This is only unethical when it's used in a system that infers more information beyond general demographics (which, BTW, nearly everywhere collects), and makes that data vulnerable to interception by parties outside the pizza company.

blackoil · 9 years ago
How is this any different from Google, Facebook and the rest of the advertising-driven web companies?
falcolas · 9 years ago
It's happening in a space previously devoid of such automated tracking.
andbberger · 9 years ago
Has nothing to do with morals - people gotta eat
tspike · 9 years ago
The alternative in our industry is not starvation.
bigiain · 9 years ago
"You can’t buy ethics offsets for the terrible things you do at your day job." https://deardesignstudent.com/ethics-cant-be-a-side-hustle-b...

and from the same guy:

"Don’t ask how you’re going to pay your rent working ethically. Ask why you’re open to behaving unethically in the first place." https://deardesignstudent.com/ethics-and-paying-rent-86e972c...

bigbugbag · 9 years ago
Has everything to do with ethics and morality. You can eat without being a cog in the mega big machine trying to crush us all.
ludjer · 9 years ago
It saddens me that people are complaining about this when others in our profession are doing far worse: extorting businesses for money through ransomware, hacking personal information and bank accounts, creating robots that kill people. But no, people are worried that while walking around in a public place a picture is taken of them and an ad is changed to target them.
spuz · 9 years ago
I wouldn't call writing malware 'our profession'. Maybe you are misinterpreting the meaning of 'Hacker' in 'Hacker news'.