Microsoft’s Recall was supposed to be the marquee feature for the new Copilot+ PCs the company announced in May 2024. Its stated goal was to give Windows 11 users an AI-powered “photographic memory” to help them instantly find something they’d previously seen on their PC.
In theory, Recall offers a clever solution to a classic problem of information overload, tapping powerful neural processing units to turn a vague search into a specific result. However, the initial design created the potential for serious privacy and security issues and unleashed a torrent of criticism from security experts who called it a “privacy nightmare.”
The criticism was so intense, in fact, that the company scrapped its plans to launch a preview of the feature as part of the Copilot+ PC launch, instead sending the entire codebase back to the developers for a major overhaul.
So, what have they been doing for the past four months?
Today’s blog post from David Weston, VP of Enterprise and OS Security at Microsoft, has the answers. In a remarkable departure from typical corporate pronouncements from Redmond, this one reads like it was written by engineers rather than lawyers, and it contains an astonishing level of detail about sweeping changes to the security architecture of Recall.
Here are the highlights.
Recall will work only on Copilot+ PCs running Windows 11
The Recall feature will be available only on Copilot+ PCs, Microsoft says. Those devices must meet the secured-core standard, and the feature will be enabled only if Windows can verify that the system drive is encrypted and a Trusted Platform Module (TPM) meeting the 2.0 specification is active. The TPM, Microsoft says, provides the root of trust for the secure platform and manages the keys used to encrypt and decrypt data.
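Those two prerequisites are easy to verify on your own machine. Here’s a minimal Python sketch, assuming the standard Get-Tpm and Get-BitLockerVolume PowerShell cmdlets are available; it mirrors the checks Recall requires but is not how Windows performs them, and it skips the TPM 2.0 spec-version check for brevity.

```python
# Illustrative sketch only: query TPM presence and system-drive encryption
# status via the built-in Get-Tpm and Get-BitLockerVolume PowerShell cmdlets.
# This mirrors the checks Recall requires; it is not Microsoft's implementation.
import subprocess

def run_ps(command: str) -> str:
    """Run a PowerShell command and return its trimmed stdout."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def recall_prerequisites_met() -> bool:
    # TpmPresent is True when a TPM is visible to Windows.
    # (Checking that it meets the 2.0 spec is omitted for brevity.)
    tpm_present = run_ps("(Get-Tpm).TpmPresent") == "True"
    # ProtectionStatus is 'On' when the system drive is BitLocker-protected.
    # (Requires an elevated prompt.)
    drive_encrypted = run_ps(
        "(Get-BitLockerVolume -MountPoint 'C:').ProtectionStatus"
    ) == "On"
    return tpm_present and drive_encrypted

if __name__ == "__main__":
    print("TPM present and system drive encrypted:", recall_prerequisites_met())
```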
In addition, the feature as it will ship takes advantage of some core security features of Windows 11, including Virtualization-Based Security, Hypervisor-enforced Code Integrity, and Kernel DMA Protection. It will also use the Measured Boot and System Guard Secure Launch features to block the use of Recall if a machine is not booted securely (so-called “early boot” attacks).
Although it might be possible for security researchers to find hacks that allow them to test Recall on incompatible hardware, those workarounds should be significantly more difficult than they were in the leaked May preview that was the subject of the initial disclosures.
Recall will be opt-in only
One of the critics’ biggest concerns was that Microsoft would try to push Windows users into adopting the feature. Today’s announcement says, “Recall is an opt-in experience,” and in a separate interview, Weston emphasized that the feature will remain off unless you specifically choose to turn it on.
The blog post says, “During the set-up experience for Copilot+ PCs, users are given a clear option whether to opt-in to saving snapshots using Recall. If a user doesn’t proactively choose to turn it on, it will be off, and snapshots will not be taken or saved.”
In addition, customers running OEM and retail versions of Windows 11 (Home and Pro) will be able to completely remove Recall by using the Optional Features settings in Windows 11. (That’s a change from previous reports based on leaked builds.)
On PCs running Windows 11 Enterprise, the feature will not be available as part of a standard installation, Weston told me. Administrators who want to use Recall in their organizations must deploy the feature separately and enable it using Group Policy or other management tools. Even then, individual users will still have to authenticate with Windows Hello biometrics on supported hardware to enable the feature.
New privacy settings add extra control over personal data
Microsoft says an icon in the system tray will notify users each time a Recall snapshot is saved and also provide the option to pause the feature.
Some types of content will never be saved as a Recall snapshot. Any browsing done in a private session in a supported browser (Edge, Chrome, Firefox, and Opera) is filtered out by default, and you can exclude specific apps and websites as well.
Recall also filters out sensitive information types, such as passwords, credit card numbers, and national ID numbers. The library that powers this feature is the same one used by enterprises that subscribe to Microsoft’s Purview information protection product.
If the Recall analysis phase determines that a snapshot contains sensitive information or content from a filtered app or website, the entire snapshot is discarded and its contents aren’t saved to the Recall database.
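Microsoft hasn’t published the Purview classifiers Recall uses, so the sketch below is a hypothetical stand-in that illustrates the all-or-nothing discard rule: a Luhn checksum for card-like numbers plus a small, made-up filter list, with the whole snapshot dropped on any match.

```python
# Hypothetical, simplified stand-in for Recall's snapshot filtering.
# The real feature uses Microsoft Purview's sensitive-information classifiers;
# here a Luhn check and a filter list illustrate the discard-everything rule.
import re
from dataclasses import dataclass

FILTERED_APPS = {"KeePass", "1Password"}   # example user-configured filter list
FILTERED_SITES = {"bank.example.com"}      # example filtered website

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total, double_it = 0, False
    for d in reversed(digits):
        n = int(d)
        if double_it:
            n *= 2
            if n > 9:
                n -= 9
        total += n
        double_it = not double_it
    return total % 10 == 0

def looks_like_card_number(text: str) -> bool:
    """Flag any 13-19 digit run (spaces/dashes allowed) that passes Luhn."""
    for match in re.finditer(r"\b(?:\d[ -]?){13,19}\b", text):
        digits = re.sub(r"\D", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return True
    return False

@dataclass
class Snapshot:
    app: str
    url: str
    extracted_text: str

def should_store(snapshot: Snapshot) -> bool:
    """Discard the entire snapshot if any filter or classifier matches."""
    if snapshot.app in FILTERED_APPS:
        return False
    if any(site in snapshot.url for site in FILTERED_SITES):
        return False
    if looks_like_card_number(snapshot.extracted_text):
        return False
    return True
```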
Additional configuration tools allow users to retroactively delete all snapshots from a specific time range, all content from a given app or website, or everything returned by a Recall search.
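As a rough illustration of what those delete operations amount to, here’s a simplified sketch against a plain SQLite index; Recall’s real store is encrypted and protected by the measures described below, so this only shows the shape of the two operations.

```python
# Illustrative only: retroactive deletion by time range or by app against a
# local snapshot index. Recall's actual store is encrypted; this just shows
# the two kinds of delete operations the settings expose.
import sqlite3
from datetime import datetime

conn = sqlite3.connect("snapshots_demo.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS snapshots ("
    " id INTEGER PRIMARY KEY, app TEXT, url TEXT, captured_at TEXT)"
)

def delete_time_range(start: datetime, end: datetime) -> int:
    """Remove every snapshot captured between start and end."""
    cur = conn.execute(
        "DELETE FROM snapshots WHERE captured_at BETWEEN ? AND ?",
        (start.isoformat(), end.isoformat()),
    )
    conn.commit()
    return cur.rowcount

def delete_app(app_name: str) -> int:
    """Remove every snapshot captured from a given app."""
    cur = conn.execute("DELETE FROM snapshots WHERE app = ?", (app_name,))
    conn.commit()
    return cur.rowcount
```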
Recall’s security architecture leverages core Windows features
The biggest concern with the initial announcement of Recall was that it offered a prime target for attackers, with scenarios that included local attacks (another user on the same Windows 11 PC) and remote ones (via malware or a remote access session).
The revised architecture offers multiple layers of protection against those scenarios.
First, setting up Recall requires biometric authentication to the user’s account, and additional operations are tied to that account using Windows Hello Enhanced Sign-in Security. That ensures that Recall searches and other operations are possible only when the user is physically present and confirmed by biometrics.
Next, snapshot data is encrypted, as is the vector database that contains the information used to search through stored snapshots. Decrypting those databases also requires biometric authentication, and any operations on that data (saving, searching, and so on) take place within a secure environment called a virtualization-based security (VBS) enclave. This design ensures that other users can’t access the decryption keys and thus can’t read the contents of the database.
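The enclave itself can’t be reproduced in a few lines, but the underlying pattern — keep the index encrypted at rest, release it only after the user proves presence, and only then search it — can be sketched. The example below is a simplified illustration using AES-GCM from the Python cryptography package and a stubbed-out presence check; the key handling and similarity search are assumptions for demonstration, not Microsoft’s implementation.

```python
# Simplified illustration of the pattern Microsoft describes: the vector
# index stays encrypted at rest, the key is released only after a (stubbed)
# biometric presence check, and search runs on the decrypted copy. Recall's
# real keys live in the TPM and the work happens inside a VBS enclave; none
# of that is reproduced here.
import json
import math
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def user_is_present() -> bool:
    """Stand-in for a Windows Hello biometric check."""
    return True  # assume the user just authenticated

def encrypt_index(index: list[dict], key: bytes) -> bytes:
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, json.dumps(index).encode(), None)
    return nonce + ciphertext

def decrypt_index(blob: bytes, key: bytes) -> list[dict]:
    if not user_is_present():
        raise PermissionError("biometric check failed")
    nonce, ciphertext = blob[:12], blob[12:]
    return json.loads(AESGCM(key).decrypt(nonce, ciphertext, None))

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def search(blob: bytes, key: bytes, query_vec: list[float]) -> dict:
    """Decrypt the index (only if the user is present), then rank by similarity."""
    index = decrypt_index(blob, key)
    return max(index, key=lambda item: cosine(item["embedding"], query_vec))

# Usage: two fake snapshot embeddings, encrypted, then searched.
key = AESGCM.generate_key(bit_length=256)
blob = encrypt_index(
    [{"snapshot": "budget.xlsx", "embedding": [0.9, 0.1]},
     {"snapshot": "travel-blog", "embedding": [0.2, 0.8]}],
    key,
)
print(search(blob, key, query_vec=[0.85, 0.2])["snapshot"])  # -> budget.xlsx
```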
The Recall services that operate on snapshots and the associated database are isolated, making it nearly impossible for other processes, including malware, to take over those services. Other protections against malware include rate-limiting and anti-hammering measures designed to stop brute-force attacks.
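Microsoft doesn’t say how that rate limiting is tuned, but anti-hammering protections generally follow a familiar pattern: count failed attempts and lock things out for progressively longer periods. A minimal sketch of that general technique, with made-up limits:

```python
# Generic anti-hammering sketch: after each failed unlock attempt the
# lockout window doubles, defeating brute-force loops. This illustrates
# the general technique only; Recall's actual limits are not published.
import time

class AntiHammerGuard:
    def __init__(self, base_lockout_seconds: float = 1.0,
                 max_lockout_seconds: float = 3600.0):
        self.base = base_lockout_seconds
        self.max = max_lockout_seconds
        self.failures = 0
        self.locked_until = 0.0

    def attempt(self, credential_ok: bool) -> bool:
        """Return True only if not locked out and the credential is valid."""
        now = time.monotonic()
        if now < self.locked_until:
            return False  # still locked out; don't even evaluate the credential
        if credential_ok:
            self.failures = 0
            return True
        self.failures += 1
        lockout = min(self.base * (2 ** (self.failures - 1)), self.max)
        self.locked_until = now + lockout
        return False
```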
Microsoft conducted security reviews
Under the heading “Recall Security Reviews,” the company claims that it has conducted multiple reviews of the new security architecture. Internally, it’s been red-team tested by the Microsoft Offensive Research and Security Engineering team (MORSE). In addition, the company says it hired an unnamed third-party security vendor to perform an independent security design review and penetration test.
Finally, Redmond says it has completed a “Responsible AI Impact Assessment (RAI)” covering “risks, harms, and mitigations analysis across our six RAI principles (Fairness, Reliability & Safety, Privacy & Security, Inclusion, Transparency, Accountability).”
And, of course, the company says it will pay bug bounties to anyone who reports a serious security issue that can be verified.
Will it satisfy critics?
The botched initial rollout of Recall squandered a lot of goodwill, so security experts have a right to be skeptical. Still, today’s announcement contains a wealth of detail, and the Insider testing that will start in October should provide an ample opportunity for additional feedback.
That feedback will have a huge impact on Microsoft’s AI plans, so I expect that everyone up to and including CEO Satya Nadella will be paying close attention.