kypro · 3 years ago
Well executed.

I was aware of most of the data protection and privacy concerns presented, but I wasn't aware facial recognition systems are being used as widely as suggested here.

If these systems ever become widely adopted I might seriously consider obscuring my face in public.

Here in the UK I've noticed over the last year or so that many supermarket self-checkouts have been fitted with cameras and screens. I'm not naive enough to believe I wasn't being recorded previously, but I can't help but find this trend of sticking a camera directly in my face whenever I'm trying to make a purchase extremely insulting and a violation of my privacy. After watching this I have almost no doubt that facial recognition software is installed on these systems.

I've spoken to other people about the rudeness of this, but most seem to think it's fine. Perhaps I'm just weird and more bothered about this stuff than most people. If sticking a camera in someone's face while they're trying to make a purchase in a pharmacy isn't going too far, though, I do wonder whether the average person would really care about anything presented here.

bko · 3 years ago
I played around with Amazon's Rekognition software. I took one of those YouTube videos where someone takes a picture of themselves every day for 10 years. The conditions were fairly ideal (consistent lighting, same pose for the most part), but the kid also went from 12 years old to 22, so his face definitely changed a lot. I used the first image (age 12 at the time) as the reference to compare the rest against, and I was surprised that it matched almost all of them with a high degree of confidence (80%+). The ones around 80% had terrible lighting, sunglasses, or were an image of his girlfriend he slipped in there.

Even the sunglasses, beard, face paint, bad lighting or puberty didn't throw off the model.

The open source dlib model was considerably worse, but AWS Rekognition was incredible.

https://mleverything.substack.com/p/how-facial-recognition-w...
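The comparison loop described above can be sketched with boto3's Rekognition CompareFaces call. This is a minimal sketch, not the commenter's actual code: the AWS client is replaced with an offline stub (`FakeRekognition`) and the similarity scores are invented, so only the call shape and the 80% thresholding reflect the real API.

```python
def best_similarity(client, source_bytes, target_bytes):
    """Return the highest similarity score between faces in two images."""
    resp = client.compare_faces(
        SourceImage={"Bytes": source_bytes},
        TargetImage={"Bytes": target_bytes},
        SimilarityThreshold=0.0,  # return even low-confidence matches
    )
    matches = resp.get("FaceMatches", [])
    return max((m["Similarity"] for m in matches), default=0.0)

# Offline stub standing in for boto3.client("rekognition"),
# purely for illustration; the scores below are made up.
class FakeRekognition:
    def __init__(self, scores):
        self._scores = iter(scores)

    def compare_faces(self, **kwargs):
        return {"FaceMatches": [{"Similarity": next(self._scores)}]}

client = FakeRekognition([99.2, 97.5, 81.3])
frames = [b"day1", b"day2", b"day3"]  # would be real JPEG bytes per frame
scores = [best_similarity(client, b"reference", f) for f in frames]
confident = [s for s in scores if s >= 80.0]  # the 80%+ bar from the comment
print(confident)
```

With real credentials you would pass `boto3.client("rekognition")` and actual image bytes; the `FaceMatches` / `Similarity` response shape matches the real CompareFaces API.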

april_22 · 3 years ago
There was one incident in China where the facial recognition system mistook the face of a Chinese celebrity on a bus for a jaywalker... so the system isn't perfect in unusual conditions/environments yet. However, I do believe that today's results are already outstanding and will only get better.

https://www.theverge.com/2018/11/22/18107885/china-facial-re...

https://you.com/search?q=facial+recognition+in+china

throwaway1777 · 3 years ago
Shops and many other places have had video cameras for decades to deter theft and crime, so people are used to it.
kypro · 3 years ago
They're completely different. This is solely for intimidation purposes.

Cameras previously would show a wide angle view of the store so you can see when someone put something under their jacket, etc. I can understand and accept this.

In comparison, these new cameras have a very narrow angle of view; they're zoomed in on your face and they're in portrait. They would be completely useless if you wanted to see whether someone was, say, putting something under their jacket at the checkout.

These are there purposefully to record your face and let you know that they're doing so in full colour and HD, whether you like it or not. It's vile and extremely rude. I never signed an agreement accepting such an obtrusive and unreasonable violation of my privacy when I entered a supermarket - at least online I'd have to accept T&Cs before placing my order.

nsv · 3 years ago
The key difference here is that most CCTV up to this point hasn't been high enough resolution to run facial recognition on.
m463 · 3 years ago
In about 2000 a friend of mine worked in Las Vegas. At that time, they used facial recognition to identify everyone coming in the door, to detect card counters and other troublemakers. They shared this information widely among all the casinos.

It is > 20 years later now... I wonder how accommodating they are when a whale walks in?

stevofolife · 3 years ago
It's also a good reason to wear a mask.
Quarrelsome · 3 years ago
I thought it was so people behind the till could do age checks and bypass machine blocks without having to move.
jstanley · 3 years ago
I was initially put off by the webcam permission requirement, but the terms and conditions page says it's basically an art project and they don't send any data off (unless you explicitly accept it at the end) so I gave it a chance.

I'm glad I allowed webcam permission because it was an interesting, informative, and fun look at biometric tracking.

Apparently I'm "violently average" which is not a way I would previously have described myself. According to this site the most unusual thing about me is that I read the terms and conditions before ticking the "accept" box.

blablabla123 · 3 years ago
Same, but I think people generally overestimate how different from average they are. (It's from the webcam's POV anyway.) Usually I don't like interactions with the webcam either, but this was interesting enough.
cambalache · 3 years ago
and now your face is linked to your ip in a db somewhere
richrichardsson · 3 years ago
I found that the most shocking: 96% of people who used this didn't read the conditions first.
teraku · 3 years ago
It's skewed. I read the terms first, but then the website bugged out, so I had to reload. And then it said I hadn't read the terms first. So...
DJHenk · 3 years ago
I didn't read them because I don't believe them anyway. If a website uses my camera, I'll just assume that everything is recorded and sent to advertisers and shady governments before I close the tab, so I did not want to continue with this one. It was only after reading the comments here that I decided to give it a go. The terms and conditions played no role.
maest · 3 years ago
That's lower than I expected, tbh.
montebicyclelo · 3 years ago
> According to this site the most unusual thing about me is that I read the terms and conditions before ticking the "accept" box.

I’d put forward the hypothesis that people who read the terms, and who are therefore concerned about privacy, are also less likely to be willing to agree to submit the data at the end.

avodonosov · 3 years ago
So you haven't even tried without ticking the "accept" box. It works the same, BTW.
Bromeo · 3 years ago
I think many people commenting on the model making bad predictions are missing the point. The speaker argues that even though these models are known to be inaccurate, companies like Tinder or insurance companies might still use the model outputs, since they have nothing better. Therefore, at some point in the future (or already today?) you could suffer from bad model predictions because you are "not normal enough" for the model to make good predictions, and might therefore receive a wrong predicted life expectancy and higher insurance bills.
ensignavenger · 3 years ago
Insurance companies hire very smart actuaries... and thus currently use actuarial models. Actuarial models aren't perfect either. However, throwing them out to use one of these machine prediction models would almost certainly be a disaster for the insurance company. And there is currently a lot of competition for life insurance.
lowercased · 3 years ago
Using this sort of data to enrich and refine existing data, vs throwing current stuff all out in favor of this newer data... that's what I'd expect (enrichment vs replacement). I'm fairly confident insurance companies have areas in their models where they know there's stuff they don't know. If more data can enrich their models to provide better accuracy, why wouldn't they?
cloverich · 3 years ago
If one of these models is on average better, then they would gain an advantage by using it. The problem is for the "not normal enough" folks, it may be _harder_ to remedy an invalid classification, particularly if there are no fallbacks or workarounds. I was clued into this once by an ML book that gave an example of a fraud detection company using an actually worse algorithm, because when it gave false positives it was easier to understand and hence easier to manually override. But if it is less profitable to operate this way, and there is no regulation around it, people who get falsely classified may be out of luck. That's where the discussion around regulation needs to happen, I think.
car_analogy · 3 years ago
> you can suffer from bad model predictions

This is the worst anti-surveillance argument. The last thing I want is to be accurately predicted. As far as I'm concerned, once the models are perfected and they can accurately predict everything you will do or say, things will be far worse.

aendruk · 3 years ago
Already experiencing this with automated fraud detection systems. Excessive captchas, “there was a problem with your order”, unexplained 403s, etc.
kjkjadksj · 3 years ago
This isn't an unsolvable problem, though, since you can calculate the strength of the model fit for a particular data point. You learn how to do this with linear models in stats 101, so everyone who is paid to be a data scientist will no doubt understand this.
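The "stats 101" point above can be made concrete: in ordinary least squares, the standard error of a prediction grows with the distance of the input from the training mean, so an atypical ("not normal") data point comes with visibly wider uncertainty. A toy sketch with made-up data:

```python
import numpy as np

# Synthetic training data (all numbers here are invented for illustration).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)
y = 2.0 * x + rng.normal(0.0, 0.5, 200)

# Fit y = alpha + beta * x by ordinary least squares.
n = len(x)
x_bar = x.mean()
sxx = ((x - x_bar) ** 2).sum()
beta = ((x - x_bar) * (y - y.mean())).sum() / sxx
alpha = y.mean() - beta * x_bar
resid = y - (alpha + beta * x)
s2 = (resid ** 2).sum() / (n - 2)  # residual variance estimate

def pred_se(x0):
    # Standard error of the mean prediction at x0: grows with (x0 - x_bar)^2.
    return np.sqrt(s2 * (1 / n + (x0 - x_bar) ** 2 / sxx))

print(pred_se(0.0) < pred_se(5.0))  # prints True: atypical inputs are less certain
```

The deep-learning caveat in the reply below this comment still applies: this closed-form uncertainty exists for linear models, not for the neural networks the demo uses.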
Bromeo · 3 years ago
Unfortunately the theory for linear models does not translate easily to deep learning based models, which this demo is based on. The "strength of model fit" becomes much more complicated and is an active field of deep learning research.
canaus · 3 years ago
It might not be an unsolvable problem, but it is a problem currently apparent in the technologies we use.
hinkley · 3 years ago
Credit already works this way.
NightWhistler · 3 years ago
For people worried this might be a scam: the guy in the video is Tijmen Schep and he's a privacy researcher and activist.

I had the pleasure of meeting him a while ago and seeing him give a talk about how companies gather data and the chilling effects on society.

Deleted Comment

kuu · 3 years ago
Sharing my face with my camera with a random website? Sure, what could go wrong... /s
avnigo · 3 years ago
For the extra paranoid, I can confirm that it does work offline after you press the button and let it load. You can hide your face until it loads, and then cut the connection. Be sure to close the tab before you go online again.
danuker · 3 years ago
Service Workers are a thing.

If you're extra extra paranoid, use a disposable VM in Qubes OS.

https://www.qubes-os.org/doc/how-to-use-disposables/

NoboruWataya · 3 years ago
Worth it to be told I am more attractive than 100% of the Spice Girls.
Hamuko · 3 years ago
Spice Girls back then or Spice Girls now?
cal85 · 3 years ago
Devil’s advocate here. What could go wrong?

If you open a website in a fresh browser context and let it use your camera, isn’t this about the same as walking down a street with CCTV cameras?

Gigachad · 3 years ago
Most people have such weird and illogical views on privacy. A website collecting face pictures without anything else is pretty useless. Walking into a retail store owned by a company that uses facial recognition across a huge range of its stores is a serious privacy issue.
slightwinder · 3 years ago
> Devil’s advocate here. What could go wrong?

It could be used for a scam, by a stalker, for social engineering, or several other evil ploys. There are so many possible bad actions happening today... On average they are unlikely, but not impossible. And who knows in the long run.

> If you open a website in a fresh browser context and let it use your camera, isn’t this about the same as walking down a street with CCTV cameras?

In my country there are harsh regulations on public cameras. Private individuals are not even allowed to film you outside, except incidentally in the background.

nkrisc · 3 years ago
You have to weigh “what could go wrong?” with “what could go right?”

The worst case scenario is far greater in magnitude than the best case scenario.

Just not worth any risk here.

jeroenhd · 3 years ago
A website can collect millions of faces in a few hours. A CCTV camera can only collect data from a single area.

It's more akin to those spying doorbells from Amazon and friends, which I personally would try to avoid when I can.

The concentration of data and the lack of necessity of your face being recorded in the first place change the decision making process significantly.

5e92cb50239222b · 3 years ago
Someone could use that to impersonate you in a mobile banking application through biometric authentication. That's getting pretty popular in my country (I think it's required by law or something).
RyEgswuCsn · 3 years ago
> fresh browser context

I get your point, but I am not sure how fresh our browser context is.

kkjjkgjjgg · 3 years ago
A website has much more information about you that it can connect this to.
Rerarom · 3 years ago
Don't we all have tons of pictures of ourselves online already?
Joeboy · 3 years ago
Almost all, yes. The people who don't are obviously not normal, so they don't need the website.
mlatu · 3 years ago
not all of us ^^
barbarbar · 3 years ago
No. Zero.
croes · 3 years ago
No

Deleted Comment

seanw444 · 3 years ago
Gonna pile on another "no."
oakpond · 3 years ago
No
cynix · 3 years ago
Nope.
ccbccccbbcccbb · 3 years ago
No. This kind of auto-defaulting to "tons" baffles me quite hard.
Aleksdev · 3 years ago
N-not really…
cambalache · 3 years ago
I will give you 1 million USD if you find one of mine. People are different.
ajsnigrutin · 3 years ago
They give a pinky promise, that they won't do evil!
arisAlexis · 3 years ago
EU funded project, really not very risky.
seanw444 · 3 years ago
> Government-funded project, really not very risky.

No thanks.

raverbashing · 3 years ago
Did you read the terms of service? ;)
croes · 3 years ago
Do you believe the terms of service?
luplex · 3 years ago
I stumbled across this when it launched. Here's the talk by the artist that gives some background information: https://youtu.be/bp23r-Gtdkk

Rather than talking about webcam permissions, I think we should talk about how much we use and rely on bad ML models. Dating apps might want to rate attractiveness, but we have no checks in place to see how we're being rated. Free, open-access models in particular probably don't come with a thorough bias & limitations datasheet.

rramadass · 3 years ago
Holy Moly !

That video by the creator of this website is a must-watch for everybody reading this thread; it is not all fun and games.

Thanks for posting this.

Joeboy · 3 years ago
I was not really prepared for being told how unattractive I am this morning.
kleene_op · 3 years ago
I was a 6.3 as I stepped out of bed.

I took a shower, shaved, brushed my teeth, put on a nice shirt, fixed my hair and offered a good angle of my best smile at the camera. I was now a 9.

rossdavidh · 3 years ago
That's actually fairly similar to how humans would be influenced by showering, shaving, etc. so in some ways that's pretty impressive.
jeroenhd · 3 years ago
I'm also reasonably sure other factors influence these algorithms: hair style, framing, lighting, the clothes on your shoulders, angle, distortion from the camera lens, and most importantly, similarity to high-scoring faces in the source data set.
altdataseller · 3 years ago
I was a 6.3 but I felt pleasantly surprised because I personally think I'm a 1.5
Hammershaft · 3 years ago
Interesting! I thought it would be purely based on facial symmetry.

Deleted Comment

vanderZwan · 3 years ago
Then you're missing the point the website makes about this and other algorithms like it being extremely unreliable. There are tons of biases in the training data, and not just the ethnic/cultural ones mentioned. For starters, you're comparing a crappy webcam image to a model that in all likelihood was trained on people's best selfies.

The model tries to fit you into a very narrow niche. On top of that it will do so poorly.

stackbutterflow · 3 years ago
Interestingly, if you click on the ToS you end up on a page explaining how it works (can't link it).

The beauty scoring model was found on GitHub (this or this one). The models that predict age, gender, and facial expression/emotion are part of FaceApiJS, which forms the backbone of this project. Note that its developer doesn't fully divulge which photos the models were trained on. Also, FaceApiJS is bad at detecting "Asian guys".

Apparently some dating apps rate their users with these sorts of algorithms. Maybe I'm living under a rock, but I did not know that was a thing.

This project has received funding from the European Union’s Horizon 2020 research and innovation programme, under grant agreement No 786641.

schleck8 · 3 years ago
How does a patchwork of APIs and open-source pretrained models receive funding from the EU...
eurasiantiger · 3 years ago
Here is the project funding information page.

https://cordis.europa.eu/project/id/786641

gspr · 3 years ago
Indeed - I was also surprised by this. One possibility, which I hope is true, is that the authors are funded for something else (some actual research). If they spent a few hours of their workday to throw this together, they might be obligated to cite their funding agency.

I know for a fact that I've acknowledged funding agencies on papers about topics that were at best extremely tangential to my grant.

A less flattering possibility is that they want to use EU affiliation as a badge of respect for privacy or something like that.