there are two things here, passkeys as a mechanism to sign into your google account, and passkeys in general. i'll assume you're asking about passkeys in general, even though the announcement here is about passkeys to sign into your google account - passkeys as a general standard were introduced earlier this year.
1. no, it's a cryptographic key stored on your device. but also yes, because if you're using a passkey on a device you sign into using your google account, google can lock you out of that account and potentially that device. if that device is your only way of logging into a service, you'd be locked out of the service in the same way that losing or destroying the device would lock you out of the service. but authorizing multiple passkeys and adding backup authentication methods is encouraged, so for practical purposes the answer is no.
2. you should understand that passkeys are issued by your device or browser. so if you don't want to trust google, you can use a passkey issued by safari or iPhone instead of Chrome or Android. if you're using an apple-issued passkey to sign into some other service that isn't your google account, google has literally no involvement anywhere in that chain.
3. not according to the spec. a passkey is a private key that gets stored on your device; the service you register with only ever sees the matching public key. your device shares a signature (not the key) with the service you're signing into to prove you have access to the private key. but google is generating that key for you through their proprietary software. if you don't trust google not to snag a copy of the passkey when they generate it for you, you should use passkeys generated by an issuer that you do trust. (in apple's implementations, passkeys are synced through iCloud, so the answer there is yes - apple does have your passkeys and could sign in to services as you if they wanted. google is not doing that.)
4. you'd have to read the ToS for yourself to figure out what you could sue google for. but the more relevant thing than legal recourse is that this scheme doesn't really give google any power they don't already have, and in many ways distributes power out to third-parties so you have to trust google less. apple's passkey implementation is leading google's. others will follow. this is about allowing you to use an open standard to authenticate with your google account.
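to make point 3 concrete, here's a toy sketch of the challenge-response flow passkeys use. this is not the real WebAuthn protocol - it's textbook RSA with tiny fixed primes, purely to show the shape of it: the service stores only the public key and verifies a signature over a fresh challenge, and the private key never leaves the device.

```python
import hashlib
import secrets

# Toy model of the passkey ceremony. Tiny primes, NOT secure - for
# illustration only. Real passkeys use proper elliptic-curve keys.
p, q = 1000003, 1000033
n, e = p * q, 65537                  # (n, e) is the public key the service stores
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent, stays on the device

def sign(challenge: bytes) -> int:
    """Runs on the device: prove possession of the private key."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(h, d, n)

def verify(challenge: bytes, sig: int) -> bool:
    """Runs on the server: only (n, e) is needed, never the private key."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(sig, e, n) == h

challenge = secrets.token_bytes(32)  # fresh server-issued challenge prevents replay
assert verify(challenge, sign(challenge))
```

the important property is in `verify`: the server checks the signature using only the public key, so a database breach at the service leaks nothing that lets anyone sign in as you.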
On point 1: if google decides to lock you out of its service, it generally doesn't stop at a single device - all other devices you own and associated accounts can also be impacted.
So, the solution here seems to be not just creating backup devices (using the same account) but diversifying platforms as well: use a mix of Android, YubiKey, iOS, and any other supported platforms.
Doubt many people would have the time and resources to do it correctly.
> (in apple's implementations, passkeys are synced through iCloud, so the answer there is yes - apple does have your passkeys and could sign in to services as you if they wanted. google is not doing that.)
Passkeys are stored in the iCloud Keychain which is end-to-end encrypted with 256-bit AES. Passkeys cannot be read by Apple anywhere but on your own device.
1. Google can lock you out, but it’s ok because 1% of users have a workaround,
2. You will lose access, but it’s ok because you won’t if you didn’t use this product,
3. Google can log in as you, but it’s ok because we should trust them not to, and
4. We’ve got no recourse, but it’s ok because we were probably helpless anyway?
Companies may get away with this for other products, but the bar is way higher for security products. There are real solutions, or at least ameliorations, and it’d be unsafe to use the product without ensuring they’re there.
Yes, but if you use the “Login with Google” function on many 3rd party websites, then what? If Google locks you out, can you get back into your account on the 3rd party website?
I think this is a great question, and that the answer depends on the 3rd party site.
Yet they still don't have a way to disable the 2FA prompt offered in the Gmail app (called "Google prompt").
It's a shame that you can add hardware security keys to your Google account and all of that can be bypassed just by pressing "approve" in the Gmail app when you're trying to login.
The attack vector that I'm thinking of here is your phone being stolen in public while unlocked; the app doesn't even prompt for further biometrics when approving a login.
Advanced Protection is nice but practically a non-starter, or at least very difficult, in a lot of situations: e.g. if you're travelling and all of your security keys get stolen while your backup security key is halfway across the globe, you'll wish you had 2FA backup codes.
> The attack vector that I'm thinking of here is your phone being stolen
This is exactly why I'm a big fan of larger, heavier devices, such as my rack-mounted desktop PC, being their own 2FA device. It's pretty hard to steal my 4U desktop that is bolted down, and there's no reason why it should depend on anything else for 2FA.
Similarly, laptops should be their own 2FA device and not depend on a phone, since people are generally more careful to not take laptops into high-crime areas.
I didn’t mention it in that post but that is _exactly_ the attack vector I had in mind also- even if someone stole my phone, Touch ID should stop them from getting at my passwords.
> Yet they still don't have a way to disable the 2FA prompt offered in the Gmail app (called "Google prompt").
There used to be a way to intentionally (after 2FA) add or remove devices to "Google prompt". These days, it just seems to be any device I'm logged in to, sadly.
That setting doesn’t affect what I’m saying. I want to disable Google Prompt as it’s insecure for my threat model. All “skip password when possible” does is give you the option to use a passkey to login without a password, it doesn’t prevent an attacker from using a password + Google Prompt to login, bypassing the need for any of the configured 2FA security keys or passkeys.
Clickbait title: the password is not going away, unlike the dolphins who left earth...
>Creating a passkey on your Google Account makes it an option for sign-in. Existing methods, including your password, will still work in case you need them, for example when using devices that don't support passkeys yet. Passkeys are still new and it will take some time before they work everywhere. However, creating a passkey today still comes with security benefits as it allows us to pay closer attention to the sign-ins that fall back to passwords. Over time, we'll increasingly scrutinize these as passkeys gain broader support and familiarity.
How is this, practically, any better than existing 2FA? A 2FA code is stored on a device just like a passkey is.
Passwords had a security and a usability problem, I guess, and so the solution was to add 2FA, which allegedly improves security. Now, we’re dropping the security of passwords to solve the usability issue. This doesn’t seem to be a big improvement to me.
How do I back these up in case of catastrophic data loss, such as my house and all my possessions burning down? (There are approximately 350,000 house fires a year in the USA, so it's worth worrying about when it's your entire digital life.) I take my security seriously, but I take the durability of my digital life much more seriously.
Right now, I am able to back up all my personal passwords and all my personal TOTP secrets into printed paper form that I keep in tamper-evident packaging and distribute to a safe-deposit box and the basements/attics of different trusted friends/family members. The printed packet of paper has instructions on how to use it, the passwords and TOTP secrets themselves, and the brief source code for one program, one that lets you generate TOTP codes from all my secrets (TOTP is very simple, it's like 30 lines of python[0]). I've tested it all and it is sufficient to access any account I have. Since it's all recorded on paper, each packet will function for my entire lifetime; I don't have to worry about storing a device and that device's charging/power equipment, and I don't have to worry about that device's capacitors or battery going bad in 10 years.
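For reference, a TOTP generator really is about that small. Here's a minimal sketch in the spirit of the linked mintotp - essentially RFC 6238 on top of RFC 4226's HOTP, using the common defaults (SHA-1, 6 digits, 30-second steps):

```python
import base64
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC the counter, dynamically truncate, take N digits."""
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, step: int = 30) -> str:
    """RFC 6238 TOTP: HOTP with the counter derived from the current time."""
    key = base64.b32decode(secret_b32.upper())
    return hotp(key, int(time.time()) // step)
```

because the whole scheme reduces to an HMAC over a shared secret, a paper copy of the base32 secret plus these few lines really is a complete, lifetime-durable backup of the second factor.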
So in the world of "passkeys", how do I simply and durably record them so that if need be, someone who is not me, who I've never met, and who has access to none of my electronic devices, can authenticate as me given that this person is willing to put several days effort into dealing with my authentication archive (e.g. an estate lawyer)?
I know that this is "possible"; it's all based on data and secrets recorded somewhere, but I'm having a hard time understanding the FIDO description, and I don't see an Open-Source equivalent of what Google's offering here. Is there a "KeePassX" of the passkey world?
> When a user sets up a new Android device by transferring data from an older device, existing end-to-end encryption keys are securely transferred to the new device. In some cases, for example, when the older device was lost or damaged, users may need to recover the end-to-end encryption keys from a secure online backup.
> To recover the end-to-end encryption key, the user must provide the lock screen PIN, password, or pattern of another existing device that had access to those keys.
> Screen lock PINs, passwords or patterns themselves are not known to Google. The data that allows Google to verify correct input of a device's screen lock is stored on Google's servers in secure hardware enclaves and cannot be read by Google or any other entity.
How does Google know my device's local PIN in the first place? Are Android phones phoning home to Google with their local PINs, like how they already "helpfully" backup your Wi-Fi passwords to Google servers? And what's stopping the FBI from asking Google to send them every device's PIN as it's created?
If I had to guess, when you set up an Android device's passcode, it generates a key pair, signs a message and ships the public key to Google, and then encrypts the private key with your pin (maybe with your Google password and/or a Google-provided salt at the end) and ships that to Google. So when you restore access on a new device, it downloads that private key bundle and tries to decrypt it using the PIN you provide.
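that guess can be sketched as follows. everything here is hypothetical - the function names, the use of scrypt, and XOR standing in for a real AEAD cipher - the point is just the shape of a PIN-wrapped escrow where only the wrapped blob ever leaves the device:

```python
import hashlib
import secrets

def wrap(blob: bytes, pin: str, salt: bytes) -> bytes:
    """Wrap (or, since XOR is symmetric, unwrap) a blob under a PIN-stretched key.

    Toy stand-in for a real cipher: XOR against an scrypt-derived keystream.
    """
    kek = hashlib.scrypt(pin.encode(), salt=salt, n=2**14, r=8, p=1,
                         maxmem=64 * 1024 * 1024, dklen=len(blob))
    return bytes(a ^ b for a, b in zip(blob, kek))

passkey = secrets.token_bytes(32)   # the secret that stays on-device
salt = secrets.token_bytes(16)
uploaded = wrap(passkey, "1234", salt)       # only this blob goes to the server
recovered = wrap(uploaded, "1234", salt)     # new device + correct PIN unwraps it
assert recovered == passkey
```

under this model google would hold only `uploaded` and `salt`, never the PIN or the passkey itself - which is consistent with the blog's claim, but also exactly why the PIN's strength matters so much.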
> And what's stopping the FBI from asking Google to send them every device's PIN as it's created?
device PINs are never sent to Google in this scenario - but nothing is stopping Google from being asked to log your password the next time you sign in. it's up to you to trust that they'd never do this, even when asked via national security letter, since doing so would destroy all the trust that people (and Google Workspace enterprise businesses) have in their login system.
What's even more ridiculous is calling it end-to-end encryption when it's secured by a short numeric PIN (which is what most people use). The search space for brute-forcing it is so insanely small that it doesn't matter how much password stretching you do - it's trivial for Google, or anyone with access to the keystore, to brute-force.
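concretely, the search-space problem looks like this. a 4-digit PIN has only 10,000 candidates, so whoever holds the wrapped blob can try them all offline (iteration count here is deliberately tiny for the demo; real stretching is far heavier, but that only scales the cost linearly):

```python
import hashlib

def stretch(pin: str, salt: bytes, iters: int = 1_000) -> bytes:
    """Illustrative PIN stretching; real deployments use far more iterations."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, iters)

salt = b"demo-salt"
captured = stretch("0417", salt)   # what an attacker with keystore access holds

# Offline brute force: walk the entire 4-digit space until it matches.
recovered = next(f"{i:04d}" for i in range(10_000)
                 if stretch(f"{i:04d}", salt) == captured)
assert recovered == "0417"
```

this is why the hardware-enforced guess limit (discussed further down the thread) is doing the real security work here, not the "encryption" per se.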
1. If you are using this to authenticate a non-Google service, can Google cut off your access to that service?
2. If your Google account is terminated, do you still have access to non-Google services?
3. Does Google possess sufficient information to log into non-Google services as you?
4. In the event that any of the above happen, do you still have the right to sue Google?
AFAIK this removes the ability to use a phone prompt as 2fa.
> Skip password when possible - You’ll be able to sign in securely with just a passkey or device prompt.
2FA just means 2 factors.
Passkeys are just a factor. Usually a more secure factor than a password.
So Passkey = Something you have, PIN = Something you know.
That's not how 2FA works.
[0] - https://github.com/susam/mintotp/blob/main/mintotp.py
> Is there a "KeePassX" of the passkey world?
Both 1Password and Bitwarden have announced support as soon as OS support becomes available; the latter is open source.
https://security.googleblog.com/2022/10/SecurityofPasskeysin... claims it's encrypted using "an encryption key that is only accessible on the user's own devices".
In short we can only verify PINs, and only a very limited number of guesses, enforced by hardware.
They could lie about that, but they could do a lot of things.
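a toy model of that hardware-enforced guess limit (class and method names are made up; the point is that PIN verification sits behind an attempt counter the caller can't reset, so even the operator gets only a handful of tries):

```python
import hashlib
import hmac

class ToyEnclave:
    """Sketch of a secure element that verifies a PIN with a hard attempt cap."""

    def __init__(self, pin: str, max_attempts: int = 10):
        self._salt = b"demo-salt"
        self._pin_hash = hashlib.pbkdf2_hmac("sha256", pin.encode(), self._salt, 1000)
        self._attempts = 0
        self._max = max_attempts

    def verify(self, guess: str) -> bool:
        if self._attempts >= self._max:
            raise PermissionError("enclave locked: attempt limit reached")
        self._attempts += 1
        h = hashlib.pbkdf2_hmac("sha256", guess.encode(), self._salt, 1000)
        ok = hmac.compare_digest(h, self._pin_hash)
        if ok:
            self._attempts = 0   # a correct PIN resets the counter
        return ok

enclave = ToyEnclave("4821", max_attempts=3)
assert not enclave.verify("0000")
assert enclave.verify("4821")
```

with the counter enforced in hardware, the 10,000-candidate PIN space stops mattering - the attacker gets 3 (or 10) guesses total, not 10,000. the remaining question, as noted, is whether you believe the hardware does what's claimed.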