jbeckford | a day ago
From article:

> They can submit a perfectly innocent app for notarization, get the app notarized, and then flip a switch on their own server to download a malware software update when the victim opens the "innocent" notarized app.

And then Apple can block the publisher of that notarized app, and possibly take legal action against the registered Apple Developer legal identity. That doesn't sound like theater. What am I missing?
Vaelatern | a day ago
People with our skillset tend to want to find technologically perfect solutions, but do not understand that businesses and entities and human beings in general have solutions out of band from the technical to solve problems.
freddyb | a day ago
The promise of notarizing is that Apple has seen the code and can decide whether it’s benign or malicious (which in itself is laughable). Notably, the claim is that this will achieve malware prevention before it runs on a user machine.
This is undermined if the signed and analyzed code is only a subset of what is actually executed.
You’re arguing outside of the technical discussion, i.e., Apple has legal means to enforce compliance. But that’s only after the fact.
kingmob | a day ago
Yeah. Once malware has run, the entire device might be compromised in a way that persists, even if the original app is blocked from running.
AKA, "locking the barn door after the horse has bolted".
glhaynes | 23 hours ago
But if there are hundreds of millions of horses in the barn, it's worth having a door you can lock after you've noticed a few thousand have escaped.
kingmob | 16 hours ago
Sure, but that's not how Apple touts notarization. Apple could still revoke signing keys and block/delete malicious apps without requiring the lengthy notarization process. The value of notarization is to claim that software is safe before it runs.
To continue with the analogy, would you buy a lock from the company whose ads say "Usually Works! Stops Thieves Sometimes!" or would you go with the one promising "100% Deterrent, Iron-clad Guarantee"?
(Whether these are realistic claims to make is a separate matter, but it's what's being claimed, so it's what they should be judged on.)
jbeckford | 18 hours ago
I can certainly see how the marketing of notarization may be hyperbolic / highly simplified / etc. But the statement "notarization is security theater" (the title) is a general claim that is not limited to a technical discussion.
But even limiting the discussion to just the narrow dataset collected during binary analysis ... I'd love to see a paper / study / anything that says binary analysis can detect only a statistically insignificant percentage of malicious programs. (I think that is what you are arguing by using the adjective "laughable", but correct me if I'm wrong). So far the article doesn't provide any real evidence, but it does lean on the implication that because malware rates are going up, binary analysis (notarization) is useless ("theater"). That of course is not sound reasoning, but the claim might nonetheless still be true. Hence my genuine curiosity about real evidence.
crmsnbleyd | a day ago
What does this have to do with notarisation, though?
kevinc | a day ago
I thought the same so I dug a little. The article they refer to as their past prediction notes a scenario in which old app versions should have had their notarization revoked but didn't:

> What may surprise you, though, is that older versions of Panic software that were signed with the revoked cert still pass Gatekeeper checks.
I don't know if I'm convinced yet; one data point doesn't make a trend.
orib | a day ago
Simply requiring ID to participate in the app store accomplishes this. There's no need for a technical change to anything on the end user's side.
strugee | a day ago
Notarization allows revocation, particularly for those who have installed the app, but haven't launched it yet (this is the only group for whom it's not too late - because they're not already compromised - but for whom yanking the app from the App Store listing isn't enough).
I'm not arguing that this is the correct freedom/privacy/security tradeoff, only that it does do something.
orib | a day ago
Can't the app store already push updates to the app, including ones that would disable it? In fact, isn't that the attack vector that malicious actors are using in this report? I think you could accomplish the same things without the notarization today, with the systems we have.
davidbalbert | a day ago
Notarization is primarily about software that's distributed outside the App Store.
enobayram | a day ago
And what if their server was compromised?
k749gtnc9l3w | a day ago
Next you will ask what happens when the domain expires.
hyperpape | a day ago
Then they should have to do a postmortem and prove that they've taken remedial steps to be allowed to continue distributing software.
enobayram | a day ago
That's not my point, if the system isn't secure against them acting maliciously, then the system isn't secure period, because them acting maliciously is one social engineering episode away.
david_chisnall | a day ago
One thing people often miss about security: it’s not just about prevention, it’s also about attribution.
CCTV cameras don’t stop someone breaking in but they make it easier to identify the culprit, for example. Notarisation is in the same category. It’s not about preventing malware, it’s about being able to answer the question ‘who is to blame if this is malware?’. If you submit a benign app for notarisation and then use it to distribute malware, there is a clear link between the information that you submitted to Apple and the malware.
As with CCTV, whether the privacy concerns outweigh the attribution benefit is a much more complex question.
benjajaja | a day ago
The problem is the CCTV camera gets turned off right after someone enters the store. Or something; it's a bad analogy. Is this notarization just making it harder for malware to sneak directly into the App Store? Could that be prevented in a way that doesn't make it harder to develop apps or annoy users?
david_chisnall | 10 hours ago
No, as I said, it’s about attribution. If someone does the attack mentioned in the article (ships something that downloads a malicious payload) then that app is tied to their identity via notarisation. It doesn’t prevent the app sneaking into the App Store, just as CCTV doesn’t prevent shoplifters sneaking into the shop, it makes it easier to identify them afterwards.
mxey | a day ago
Can’t Apple now revoke the certificate of the malware author and it won’t run on any more machines? I always thought that was the point of mandatory code signing.
peter-leonov | a day ago
Unfortunately for us, the digital world has become too hostile.
Want to be (somewhat) safe and have software freedom? Use two computers, where one is disconnected from the network, turned off, and sitting one meter underground 🤷♂️
baetylboy | a day ago

The downloaded malware update doesn't need to be notarized, because the software updater will delete the quarantine attribute, thus bypassing Gatekeeper. It's impossible for Apple to detect this beforehand, because the malware update won't be made available for download until after Apple notarizes the original app.
mxey | a day ago
Apple could probably scan for code that removes the quarantine attribute on files. But also, Gatekeeper should probably not allow apps to do just that without user interaction.
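For a sense of how little there is to scan for: stripping the quarantine flag is a single extended-attribute call on any xattr-capable filesystem. A rough sketch in Python (`strip_xattr` is a hypothetical helper; `com.apple.quarantine` is the attribute name macOS uses for Gatekeeper's File Quarantine):

```python
import os

# macOS Gatekeeper tags downloaded files with this extended attribute.
QUARANTINE_ATTR = "com.apple.quarantine"

def strip_xattr(path: str, attr: str = QUARANTINE_ATTR) -> bool:
    """Remove an extended attribute from a file if present.

    Returns True if the attribute was removed, False if it was not set
    (or extended attributes are unsupported on this filesystem).
    """
    try:
        os.removexattr(path, attr)  # the call a malicious updater would make
        return True
    except OSError:
        return False
```

Anything with filesystem access to the downloaded payload can do this, and the call can just as easily live in the (never-notarized) payload itself, be obfuscated, or shell out to `xattr -d`, which is why statically scanning the notarized binary for it seems unlikely to help much.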
classichasclass | a day ago
Unfortunately, Apple will now use this as a pretext to just eliminate all but officially sanctioned software. It's already going that direction.
mxey | a day ago
How is it going in that direction?
strugee | a day ago
https://www.macrumors.com/2024/08/06/macos-sequoia-gatekeeper-security-change/, for example.