> the claim that open source is automatically better than closed source, when it comes to security, is also strange. Remember xz utils backdoor?
The XZ attack was an extremely rare event, likely coming from a state actor, which actually shows that FLOSS is a big target that isn't easy to attack without huge effort. It was also caught not least thanks to the open nature of the project. Also, AFAIK the malicious build change wasn't even in the repo itself; it only shipped in the release tarballs.
In short, using FLOSS is the way to ensure security. Whenever you touch proprietary stuff, be careful and use compartmentalization.
Yeah, I found this comment to be weird. At least the XZ backdoor was found before it reached any stable releases. How many companies were hit by the SolarWinds supply chain attack?
Yeah, but it doesn't include the apps that lay-people want to use, such as Facebook, Venmo, and Google Maps. I like open source software as much as the next guy, but most of it seems very off-brand to your average joe. Setting people up with Firefox is one thing, trying to get them to use AntennaPod instead of Spotify is a much taller order.
Doesn't this mean that no matter how securely your phone is locked, Apple (and probably the three-letter agencies) can always unlock it by installing an appropriate update?
Not necessarily. If the secret is protected in the secure element behind something only you can provide (physical presence of an RFID token, a password, biometrics, etc.), then it is ok.
BUT you must trust the entire Apple trusted chain to protect you.
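To make the "only you can provide" property concrete, here's a minimal sketch (not Apple's actual design, just the general pattern): the data key is wrapped by a key derived from both a device-bound secret and the user's passcode, so a signed OS image on its own can't recover the data.

```python
# Minimal sketch, NOT Apple's real scheme: a data key wrapped by a key derived
# from both a device-bound secret and the user's passcode, so an OS update
# alone -- even a validly signed one -- cannot recover the data.
import os
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

device_secret = os.urandom(32)   # assumption: never leaves the secure element
data_key = os.urandom(32)        # the key that actually encrypts user data

def wrapping_key(passcode: str) -> bytes:
    # Entangle the user-provided factor with the hardware-bound secret.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=device_secret, iterations=600_000)
    return kdf.derive(passcode.encode())

# Wrap the data key when the passcode is set.
nonce = os.urandom(12)
wrapped = AESGCM(wrapping_key("correct horse")).encrypt(nonce, data_key, None)

# Unwrapping later requires the passcode again; pushed code alone doesn't
# have it, so it can't produce the data key.
recovered = AESGCM(wrapping_key("correct horse")).decrypt(nonce, wrapped, None)
assert recovered == data_key
```

The point of the entanglement is that even someone who can push code to the device still has to come up with the user factor, or attack the secure element's rate limiting.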
> If the secret is protected in the secure element against something only you can provide (physical presence of RFID, password, biometric etc) then it is ok.
But we already established that unlocking is not possible, so going with that argument, it's implied there is a side channel. Nothing but a secret in your brain is something only you can (willingly) provide. Certainly not biometric data, which you distribute freely at every moment. RFID can be relayed; see relay attacks on keyless car entry.
If you can side-step the password to potentially install malware or a backdoor, that inherently compromises security.
If the data you care about is encrypted with a token locked behind your passcode input, and it's not trivially brute-forceable by being a 4-character, numeric-only thing, then not easily, no.
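Rough numbers on why the passcode shape matters (back-of-the-envelope arithmetic only, assuming an attacker could test guesses): a 4-digit PIN is just 10^4 candidates, so the hardware-enforced attempt limits and delays are doing most of the work.

```python
# Back-of-the-envelope: search space of a 4-digit PIN vs. a 6-character
# lowercase-alphanumeric passcode, and why attempt limits matter so much.
import string

pin_space = 10 ** 4                                              # 0000..9999
alnum_space = len(string.ascii_lowercase + string.digits) ** 6   # 36^6

print(pin_space)     # 10,000 guesses: instant if keys can be derived offline
print(alnum_space)   # ~2.2 billion: still small without a slow KDF and hardware limits

# With a hardware-enforced attempt limit (say 10 tries, then wipe),
# only a tiny fraction of even the PIN space can ever be searched.
def searchable_fraction(space: int, attempt_limit: int = 10) -> float:
    return attempt_limit / space

print(searchable_fraction(pin_space))   # 0.001 -> 0.1% of PINs before lockout/wipe
```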
Could they produce an update that is bespoke and stops encrypting the next time you unlock, push it to your phone before seizing it, wait for some phone home to tell them it worked, and then grab it?
Perhaps, but the barrier to making Apple do that is much higher than "give us the key you already have", and only works if it's a long planned thing, not a "we got this random phone, unlock it for us".
(It's also something of a mutually-assured destruction scenario - if you ever compel Apple to do that, and it's used in a scenario where it's visibly the case that 'the iPhone was backdoored' is the only way you could have gotten that data, it's game over for people trusting Apple devices to not do that, including in your own organization, even if you somehow found a legal way to compel them to not be permitted to do it for any other organization.)
> Perhaps, but the barrier to making Apple do that is much higher than "give us the key you already have", and only works if it's a long planned thing, not a "we got this random phone, unlock it for us".
The attack situation would be e.g. at the airport security check, where you have to part with your device for a moment. That's a common way for law enforcement and intelligence to get a backdoor onto a device. Happens all the time. You wouldn't be able to attribute it to Apple collaborating with agencies or to them using some zero-day exploit. For starters, you likely wouldn't be aware of the attack at all. If you came home to a shut-down phone, would you send your $1,000 device to some security researcher thinking it's conceivably compromised, or just connect it to a charger?
If you can manually install anything on a locked phone, that significantly increases the attack surface. You wouldn't have to get around the individual key to unlock the device; you would mess with the code verification process instead. The latter is an attractive target, since any exploit or leaked/stolen/shared key would potentially be usable on many devices.
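A toy illustration of why a global signing key is such an attractive target (Ed25519 here purely for illustration, not Apple's actual scheme): every device checks updates against the same baked-in public key, so one leaked or coerced private key produces "valid" updates for the whole fleet, whereas a per-device unlock secret only opens one phone.

```python
# Toy sketch, not the real update pipeline: a single vendor signing key that
# every device's baked-in public key will accept.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()     # held by the vendor (or whoever obtains it)
baked_in_pubkey = vendor_key.public_key()     # identical on every device in the fleet

def device_accepts(update_image: bytes, signature: bytes) -> bool:
    try:
        baked_in_pubkey.verify(signature, update_image)
        return True
    except InvalidSignature:
        return False

legit = b"release build"
bespoke = b"release build, but encryption quietly disabled"

# Both verify identically: the device itself cannot tell a bespoke build apart.
print(device_accepts(legit, vendor_key.sign(legit)))       # True
print(device_accepts(bespoke, vendor_key.sign(bespoke)))   # True
```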
Part of the reason e.g. Cellebrite is obsessive about not telling people many specifics about their product capabilities outside of NDA is that Apple is quite serious about trying to fix these things, and "we can crack every iPhone before the 14" probably tells them a fair bit about what might have a flaw.
Tools like that lose a lot of value if anyone paying enough attention can infer they exist, even indirectly, like if all the TSA agents you know suddenly switch to Android phones, or some of them tell you not to bring iPhones through security and won't tell you why, or a thousand other vectors for rumors to start.
All it takes is enough rumors for people to say it's enough to not trust any more, and suddenly you've lost a lot of the value of a secret information source.
So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.
There is a difference between targeted software supply-chain attacks and weakening encryption for everyone by introducing a master key. Apple would be required to cooperate by US law, and it might never become public either. But as I said, Apple doesn't have to know, or "know". This feature inherently compromises security. Contrary to device encryption, OS update security depends on a single key held by Apple (or rather, by several DevOps guys...), which could be stolen, leaked, or shared.
Would you bet that the NSA can't sign iOS updates?
> So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.
Of course. This is reserved for targeted attacks against journalists and other enemies of the state.
> All it takes is enough rumors for people to say it's enough to not trust any more, and suddenly you've lost a lot of the value of a secret information source.
None of those articles are inconsistent with the claim that Apple cares about security, though?
"We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)
But yes, I would probably, at the moment, bet that if the NSA can sign a custom iOS build for consumer hardware, Apple doesn't know how, both because that's a very hard secret to keep, and because you'd see a massive uptick in people avoiding Apple devices in governments that might be of interest to US intelligence if even a rumor of that got out.
> None of those articles are inconsistent with the claim that Apple cares about security, though?
You are moving the goalpost.
> "We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)
They do have the signing keys your iPhone will gladly accept to circumvent encryption, which is the argument.
I'm not the one moving the goalpost; my argument was that Apple's incentives are not in favor of them permitting even the appearance that they might allow that kind of compromise. Your argument, with that wall of articles, appeared to be that Apple has a history of making decisions inconsistent with that, which I disputed. If that wasn't your intended argument, you might wish to be more explicit than a wall of links and "As if Apple users would care...".
> They do have the signing keys your iPhone will gladly accept to circumvent encryption, which is the argument.
Yes, and my argument is that the plumbing for this (either multiple release signing keys, one of which is never seen in the wild, or some way to keep a second "iOS 13.1.5" or whatever with different build information from showing up in various telemetry and leaking its existence) is very difficult to have built without far too many people finding out and spreading rumors about it, and even that rumor would be a problem.
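For what it's worth, the "showing up in telemetry" point is basically a binary-transparency argument. A hypothetical sketch (all names and data made up): if enough independent parties record which build hash goes with a given version string, a bespoke signed build reusing that version string shows up as a singleton.

```python
# Hypothetical detector: flag build hashes that almost nobody else has reported
# for the same version string. Observations below are made-up example data.
from collections import Counter

observations = [
    ("13.1.5", "a3f1"), ("13.1.5", "a3f1"), ("13.1.5", "a3f1"),
    ("13.1.5", "9c07"),     # a build hash nobody else has ever seen
]

def suspicious_builds(obs, min_count: int = 2):
    counts = Counter(obs)
    return [pair for pair, n in counts.items() if n < min_count]

print(suspicious_builds(observations))   # [('13.1.5', '9c07')]
```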
So the most plausible thing, to me, would be that if such a capability exists, it's a "nuclear option" for whoever holds it to only use in a circumstance where it's so important they don't mind potentially never being able to use it again, whether that's because it's an exploit chain that will be fixed or because it's been coerced out of the target company and they will probably be compelled to fix it if it gets out.