When I try to log into the AWS console and have to solve their captcha, I always think that the audience it is designed to stop can easily automate it (some open source models can solve captchas without all that "AI safety" gatekeeping), while the majority of users simply suffer through this "feature".
Apple has proposed a solution to captcha[1] which I can't wait to be standardized and widely used.
> Automatic Verification helps protect your privacy when you sign in to an app or website. Instead of being asked to complete a CAPTCHA:
> An Apple server validates your device and Apple Account.
> This verification is sent to a third-party token issuance server, which has been verified by Apple. The token issuance server generates a private access token that verifies you to the app or website.
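The flow quoted above is based on the IETF Privacy Pass work, where the token rides on a dedicated HTTP auth scheme. A minimal sketch of what that exchange looks like on the wire, assuming the `PrivateToken` scheme's `challenge`/`token-key` header shape; the actual cryptographic check (an RSA blind-signature verification against the issuer's public key, which is what keeps the issuer from learning which site you visit) is stubbed out here as a placeholder:

```python
import base64
import secrets

def build_challenge_header(token_key: bytes) -> str:
    """Server side: a 401 response carries this WWW-Authenticate
    header, asking the client to redeem a token from a trusted issuer."""
    challenge = secrets.token_bytes(32)  # opaque per-request challenge
    return (
        "PrivateToken "
        f"challenge={base64.urlsafe_b64encode(challenge).decode()}, "
        f"token-key={base64.urlsafe_b64encode(token_key).decode()}"
    )

def parse_token_header(authorization: str) -> bytes:
    """Server side: extract the token the client sends back in
    'Authorization: PrivateToken token=...'."""
    scheme, _, params = authorization.partition(" ")
    if scheme != "PrivateToken":
        raise ValueError("unexpected auth scheme")
    for part in params.split(","):
        key, _, value = part.strip().partition("=")
        if key == "token":
            return base64.urlsafe_b64decode(value)
    raise ValueError("no token parameter")

def verify_token(token: bytes) -> bool:
    # Placeholder: the real check is a blind-signature verification
    # against the issuer's public key, not a length test.
    return len(token) > 0
```

The point of the design is visible even in this stub: the website only ever sees an unlinkable token and the issuer's key, never the Apple Account that vouched for the device.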
This is definitely not better for privacy and gives Apple even more control. I would rather spend 3 extra seconds on captchas.
The cryptography that Apple uses for _their users_ is novel and useful. I'm sure there could be completely open solutions that don't require Apple servers; at the moment this is Apple specific. I'm guessing most users will use Apple/Google solutions (just like how Passkeys are used today), but for the security and privacy conscious there are always ways to keep your own private keys. Most people hand over their private keys to Google and Apple for Passkeys today. That's not necessarily a bad thing: most people reuse passwords, so it's an improvement.
If Apple's solution only tells the site that the current user has a verified Apple account, without providing further info, I guess it can be acceptable for most. If the site being visited gets info about who that specific user is, it's a no-go.
On a more meta level, it is tragic that we won't be able to use our computers online without being signed in to an online account and verified.
[1] https://support.apple.com/en-us/102591