Actually, it would be beyond reckless for Apple to do anything other than implement this as a safeguard. The cops just gave up the game: their only way into a locked phone is one in an AFU (After First Unlock) state. Apple doesn't give backdoors to law enforcement, and since it can't always patch these vulnerabilities, it absolutely should implement protections against them, including this one we just heard about from the horse's mouth.
If Apple doesn't make this an official feature, or worse, fixes this issue for the convenience of law enforcement, we need to read that as Apple selling out our privacy to the government.
Apple is in a weird position. On one hand, they HAVE to give the US government a way to access people's iPhones (CIA, NSA), and in a less direct way the rest of the US government too (local cops). On the other hand, privacy is a main point of their marketing, so they have to look like they're doing things to protect their users.
So they obviously have a direct backdoor for the big ones like the CIA, and they leave some wiggle room for 'security' companies that sell 0-day exploits to local cops. If they didn't, there would be lobbying until those agencies inevitably got their backdoor too, which would look bad for Apple. It would kill the myth of iPhone privacy; any cop could leak it.
I suspect this is either a bug or a feature that won't really prevent cops from accessing suspects' iPhones; they'll just be annoyed until their 'unlock tool' gets updated.
Don't count on Apple to actually fight any government to protect their customers' privacy. If they were willing to, they would never have set up an alternate iCloud on CCP-controlled servers for their Chinese customers; they would have pulled out of the Chinese market instead.
"While all of the vulnerabilities were zero-day bugs when patched, one attracted particular attention, because it turned out to be an undocumented hardware vulnerability.
Larin described the bug as “insane”, saying it’s a hardware feature in Apple’s A12 to A16 Bionic system-on-chip (SoC).
The feature, he said, allows attackers to “bypass the hardware-based kernel memory protection” in target iPhones, if they write data to “unknown memory-mapped input-output (MMIO) hardware registers” that Apple’s firmware doesn’t use.
Larin said the research team found six undocumented MMIO addresses used by the Triangulation exploit, which “basically, bypass all hardware-based kernel memory protections”.
He said they appear to be ARM/Apple CoreSight debug registers for GPUs, since they sit near identified MMIO registers.
In a statement, Larin said that "due to the closed nature of the iOS ecosystem, the discovery process was both challenging and time-consuming.”