Apple couldn't give two fucks about media pressure. Their long history of giving in to pressure is almost exclusively responsive to their users, either:
- pro/dev users who act as their trend setters (see recent backpedals on keyboards and Mac Pro form factor)
- people hacking stuff where it’s popular enough that they want more influence on the UX (Boot Camp)
I cited a pretty huge example (Boot Camp) where there was no media pressure at all. There was definitely tech-community pressure, but the media had no dog in the fight.
It's possible that they saw it as a purely positive move that would improve security on macOS, and then after seeing everyone's response they reevaluated its importance.
Or they started getting security bugs from the fuzzer teams. (It's a hard problem to sanitize so many executables, and the people who use VPNs are probably high-value customers, like business people working from home.)
Having to be told by outsiders that this backdoor could be abused by malware is pretty embarrassing. It's hard to imagine Apple's engineers weren't aware of that.
As much as the community wants to think they are evil and want to purposefully violate trust, the simpler explanation usually works very well: it's an oversight or a resourcing issue.
Tweet[1] by Russ, an Apple developer:
"Some system processes bypassing NetworkExtensions in macOS is a bug, in case you were wondering."
Reply[2] by David Dudok de Wit, developer of TripMode:
"Glad to see it's being reconsidered as a bug, because Apple told us it 'behaves as designed' (FB7740671 + FB7665551). And why is there an exclusion list in the first place? I'd love to know more and see this documented."
Reply[3] by Russ:
"Can't get too specific but I promise it's really mundane/boring software development stuff... like two features that interact in an unintended way kind of boring."
Comment[4] on Russ's original tweet by Sérgio Silva:
"Yes. A bug with its own configuration file /System/Library/Frameworks/NetworkExtension.framework/Resources/Info.plist ContentFilterExclusionList"
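The key in question is an ordinary property-list array, so it's easy to inspect. A minimal sketch using Python's `plistlib`, run here against a small hypothetical sample since the real file only exists on a Big Sur install (the `com.apple.example.*` bundle IDs below are placeholders, not the actual contents of Apple's list):

```python
import plistlib

# Hypothetical stand-in for
# /System/Library/Frameworks/NetworkExtension.framework/Resources/Info.plist;
# on a real Big Sur machine you would read that file instead.
sample = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>ContentFilterExclusionList</key>
    <array>
        <string>com.apple.example.one</string>
        <string>com.apple.example.two</string>
    </array>
</dict>
</plist>
"""

info = plistlib.loads(sample)
# Every bundle ID in this array bypasses NetworkExtension content filters.
for bundle_id in info["ContentFilterExclusionList"]:
    print(bundle_id)
```

On an actual Big Sur system, `plutil -p` on that path shows the same structure from the command line.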
The tweet by the Apple developer has been deleted - hope he didn't lose his job, and at worst only earned a reprimand. (Nobody with experience would call it a bug when it was clearly a deliberate design decision.)
Inexperience is less concerning than trying to publicly whitewash the misdeeds of a corporation. I'm not sure you can even chalk this up to inexperience; my charitable guess is that he didn't look at the code or config, assumed the company he likes would only do something like this by accident, went to Twitter to say as much, and then got a little carried away in the heat of it.
As much as I'd like to believe it was just an oversight, how do you accidentally have your services bypass the firewall? That feels like it would have to be a deliberate choice under the assumption that "our apps are signed by us, and the OS verifies that, so all traffic through these apps should be OK, right?" I don't mean this snarkily; it's a genuine question. I don't know how OSes work.
My guess is that this started small (“we shouldn’t let firewalls block security updates or Find My Mac”), and once that mechanism existed people kept adding other things to it with support in mind (security filters are notorious for users blocking things without understanding the implications and then filing bug reports), but without thinking of the users who would be upset about not being able to block those services.
I don't see why half the shit in that list would be needed during a recovery process, let alone need to bypass a VPN as well. If Apple wants to claim this as their defense, let them. Until then, I see little value in dreaming up excuses they aren't willing to make for themselves.
> Before, I was trying to figure out how Macs would ever be used anywhere near something classified or secret for a company.
Relying on a personal firewall on the device itself seems ill-advised. Maybe it could be considered an additional layer of security, but I've yet to work at a place where a personal firewall is part of the security concept, no matter which OS. It's either firewalls at the gateway, maybe additional ones for certain departments, or mandatory proxy servers if you're stuck in the 90s.
An application firewall on the device serves a different purpose from one running off-device, namely the ability to filter traffic based on the originating (or destination) application.
Clearly if your kernel or userspace are compromised that's not much use, and that's where external controls kick in.
Once traffic leaves the device, you can't determine (absent some custom network and protocols) which piece of software was responsible for a given packet, so the device is currently the best place to do that. If you want to impose policies controlling the hosts and protocols an application can use, implement this on-device, then firewall for the superset of all of those at the network level.
In essence it's about raising the number of independent failures required to result in a compromise. If you imagine the application firewall on the device has its policies managed rather than selected by the user, it starts to make more sense.
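To make that concrete, here is a toy sketch of a managed per-application egress policy. All bundle IDs and the policy table are hypothetical, and this is not Apple's API; on macOS the real decision point would be a NetworkExtension content filter, which can see which app is behind each flow:

```python
# Toy model of an on-device application firewall: decisions are keyed on
# (application, destination host), which is exactly the information that
# is lost once the packet leaves the device.
ALLOWED = {
    "com.example.mailclient": {"imap.example.com", "smtp.example.com"},
    "com.example.browser": None,  # None means any host is allowed
}

def verdict(app_id: str, host: str) -> str:
    """Return 'allow' or 'drop' for a new flow from app_id to host."""
    hosts = ALLOWED.get(app_id)
    if hosts is None and app_id in ALLOWED:
        return "allow"  # wildcard entry: app may reach any host
    if hosts and host in hosts:
        return "allow"  # host is on this app's allowlist
    return "drop"       # default-deny, including unknown apps

print(verdict("com.example.mailclient", "imap.example.com"))     # allow
print(verdict("com.example.mailclient", "tracker.example.net"))  # drop
print(verdict("com.example.unknown", "imap.example.com"))        # drop
```

The point of the managed-policy framing: the table above would be pushed by an administrator, not clicked through popup-by-popup by the user.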
To fight apps phoning home, I agree. But even the tweet linked in OP refers to a tweet showing how to abuse the now-removed whitelist by piggybacking your traffic through one of the whitelisted apps. On a locked-down system like Android or iOS this isn't trivial, but in a classic desktop OS use case it's easy to abuse another app to exfiltrate data.
> In essence it's about raising the number of independent failures required to result in a compromise.
Sure, it doesn't hurt, except perhaps in the case where a vulnerability in that firewall itself is exploited.
> If you imagine the application firewall on the device has its policies managed rather than selected by the user, it starts to make more sense.
That's a requirement, I guess. You don't want accountants and HR people handling popups from a firewall app. :-)
I mostly agree with your assessment of their usefulness, but any organisation that handles credit cards likely has to use personal firewalls. Requirement 1.4 of the DSS:
> Install personal firewall software or equivalent functionality on any portable computing devices (including company and/or employee-owned) that connect to the Internet when outside the network (for example, laptops used by employees), and which are also used to access the CDE.
The most common place this would come up would be with SREs/Devs that have access to prod (and thus the "Cardholder Data Environment") from their laptops. It can also apply to business users that have access to certain admin dashboards in some organisations.
Defense in depth suggests using controls to protect each individual resource. This includes individual apps, individual servers and individual user devices. A firewall is a single control, but never the only control. This from experience working with... government things.
I wasn't aware that VS Code was "proprietary malware"? All the code for that is right on GitHub.[0] If you don't trust the prebuilt binaries, you're free to build it from source yourself.
Want something else? Their GitHub repo list has almost 4000 repositories totaling 130 pages![1]
Just because the OS itself isn't open source[a] doesn't mean that Microsoft doesn't open source a whole crap ton of stuff. And sure, Windows' telemetry can easily be construed as "malware", but Windows is not the entirety of Microsoft.
[a]: And that's a lie too (sort of). It is open source.[2] (Ahem, source available - sorry, FSF.) You just need a valid reason to look at it besides "I want to". Side note: I personally hope that Windows gets open sourced, but I'm not holding my breath.
Of course VS Code is proprietary malware - unless, as you suggest, you build it yourself and get rid of the proprietary telemetry malware (https://github.com/VSCodium/vscodium).
But please don't suggest that we should praise Microsoft because they're decent enough to almost give us a somewhat convenient (if you're a dev) way to avoid being tracked.
> [a]: And that's a lie too. It is open source.[2] You just need a valid reason to look at it besides "I want to"
Google Search is open source as well, you just need a valid reason for them to hire you for their search team! /s
I agree with your overall point that Microsoft is not all bad, but calling Windows 'open source' when the only ways to access it involve a long application process and a lot of money is quite a stretch. Nearly all source code is open under that definition, as you can always pay for access, get hired, hack their servers, or outright buy the company. That's not what people usually consider open.
> I wasn't aware that VS Code was "proprietary malware"?
I'm sorry you're only discovering this now. VS Code is under this proprietary license [1]. As for the malware part:
> Data Collection. The software may collect information about you and your use of the software, and send that to Microsoft. Microsoft may use this information to provide services and improve our products and services. You may opt-out of many of these scenarios, but not all, as described in the product documentation located at https://code.visualstudio.com/docs/supporting/faq#_how-to-di.... There may also be some features in the software that may enable you and Microsoft to collect data from users of your applications.
(emphasis mine).
Indeed you can rebuild your copy or get Codium without the telemetry under the MIT license, and the software is really good, but it is a crippled version and does not make VS Code free software.
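For completeness, the documented (partial) opt-out lives in the settings file. In recent VS Code releases the key is `telemetry.telemetryLevel` (older releases used the boolean `telemetry.enableTelemetry`), and, as the license text above says, this covers "many of these scenarios, but not all":

```jsonc
// settings.json (VS Code user settings; comments are permitted here)
{
  "telemetry.telemetryLevel": "off"
}
```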
(Edit: it's not pure/pointless theory! I'm sure there is an agenda behind VS Code not being truly free. I would not be surprised if more and more "convenient" or important features were released as proprietary, and they also control the extension center (the marketplace), which is one of the main reasons Theia [5] exists. Beware of locking yourself into this ecosystem too much.)
> Just because the OS itself isn't open source[a] doesn't mean that Microsoft doesn't open source a whole crap ton of stuff
Microsoft is huge. The ratio of the code they open source to their proprietary code is probably tiny. More importantly, they only release developer-related things, never things that target end users, except their telemetry-riddled Calculator [3].
> It is open source.[2] (ahem source available (sorry, FSF))
Hmm. The reference for Open Source is the Open Source Initiative, not the FSF [2]. The FSF defines Free Software. These are almost equivalent things but still have two separate definitions.
Microsoft is not an open source software company. They just happen to be a huge software company, and every huge software company open sources a lot of code when it is strategic. I'm not judging; it's a fact. I'm happy to use some of their quality open source software targeted at the developer community, like TypeScript, but an open source company would release its important code and base its business model around it.
And no, we can't even really call Windows source-available software. They share its source code with big entities so they can audit it, probably under NDA, and not everybody can access it. You can actually find leaked code, but it is just that: leaked code. Mapbox-gl-js is source-available software which is not open source (anymore) [4].
> Indeed you can rebuild your copy or get Codium without the telemetry under the MIT license, and the software is really good, but it is a crippled version and does not make VS Code free software.
I'm not in the business of commending or admonishing.
The question was whether Apple could/should be used in government classified/sensitive business environments.
I find it amazing that lately, on a presumably ‘hacker’ forum, opinions expressing a ‘software freedom’ perspective get a bullying response in the form of simple downvoting to shut the person up. I urge the admins to stop this practice. I want to hear such points of view and consider things from that perspective.
It is perfectly logical to assume that once you have no direct access to a piece of software's source, that software could do the things malware does. Yet this obviously logical reminder gets downvoted as if it were irrational or off-topic.
It is on topic, it is rational, and it is a good reminder: we see Microsoft and Apple consistently disrespect a person's right to control their own _Personal_ Computer (PC). On a recent M1 you can't even run your own OS without Apple's permission, which makes it a useless brick for me. Do people still understand what ‘personal’ means?
There are very few, if not zero, modern computers with a completely open source firmware and bootloader - including computers with open source as their primary selling point.
Some people find it psychologically easier to sit in the shit when they discover they are not alone. Some go even further and try to justify the shit when there is a lot of it.
I am not one of them, so for me shit is still shit and I prefer to see it for what it is.
I am not suggesting it is perfect, but it may be a little hyperbolic to say any OS you load on a Mac is not its 'own OS' because it relies on closed bootloaders/firmware. If that is the case, then what does every other computer run? Is Linus's own operating system not his own because he loads it on an Intel or AMD processor?
We will know the answer to that question once we thoroughly study every version of microcode those processors have had. The Alto had what, 128K, and a big part of that was display memory? And it ran a full OS. Just imagine what you can fit in firmware and bootloaders nowadays. Is that so hard to imagine?
You're changing the subject. The discussion is about your implication that one "can’t even have own OS without Apple permission", and that is simply not true.
Am I? Then what is ‘closing the bootloader’ if not their form of granting permission? If I understood correctly, they would not “help” with specs, and we do not know what the bootloader is capable of. Why, if their goal is to keep it open? Let’s look at the tendency.
They have had complete control over iOS devices since the beginning of iOS, including some apps not being allowed at all. Since then, it seems, they have been trying to bring this into the ‘personal computer’ domain:
- they started slowly but steadily putting more and more control over Mac apps,
- then they started limiting root access,
- now the bootloader...
So where are they going? As I see it, the tendency is to close macOS completely, just like iOS, unless they face strong resistance; then they fall back to an ‘as much as we can get’ or ‘as much as we can get away with’ strategy, feeding us some ‘calming pills’ along the way that some are perhaps happy to swallow. They have moved from “of course it is not our business what you boot” to “of course, we may allow unsigned kernels... for now” (if that's true at all; we still need to see it in reality). Yes, that is not as strong as “we will not allow it at all”, but it is certainly their permission now. You may say it's just like before. No, it's not: they have taken away some existing freedom and possibly intend to take more next year. And who knows what their bootloader does? What if it can be controlled remotely? What if loading those “unsigned kernels” (which used to be called just kernels, by the way) still requires some online check? Who knows? Which of the above is untrue or incorrect?
Before, I was trying to figure out how Macs would ever be used anywhere near something classified or secret for a company.