Microsoft's New 'Recall' Feature Is Equal Parts Cool and Dangerous
How does a feature that records everything you do on your PC sound?
We take search for granted when it works well. If you search for a particular email, photo, or document on your PC and it pops right up, you don't think twice about it. But if you spend 10 minutes scouring your hard drive for that one file, you lose your mind. That's where Microsoft hopes its new Recall feature can help, even if it comes with some major security risks.
What is Recall?
Recall, at its core, is simple: The feature quietly takes screenshots of what you're doing on your PC throughout your session. Whenever you search with Recall, it pulls from those screenshots to find the moments in your PC activity history that might be what you're looking for, stitching them together into a scrollable timeline. For example, if you're looking for a slideshow you were building for work, a search may pull up the sessions where you worked on it in PowerPoint, as well as the presentation you gave with it. The same goes for images: If you're looking for the photo of your dog at the park, you may see it both from when you opened it in your photo library and from the messaging app you used to send it to friends and family.
Recall associates these screenshots with the active app, as well: As you scroll through your timeline, you can see not only the window you had open, but also which app was running and when. So if you know you want the PowerPoint session itself from February, you can skip over any screenshots from Teams.
While it's certainly a novel feature, Microsoft isn't the first to build something like this. Rewind offers a similar experience on macOS, recording all your activity (including transcribing your audio) to make everything you do on your Mac searchable. The big difference, of course, is that Recall is built by Microsoft itself, while Rewind comes from a third-party developer.
You also won't be able to use Recall on just any PC, even if it's running Windows 11. Instead, Recall is exclusive to Copilot+ PCs, Microsoft's new AI-powered PC standard. These machines are equipped with Snapdragon X Plus and Snapdragon X Elite chips, which have a dedicated neural processing unit (NPU) for handling local AI processes. Unless you have one of these new machines, like the new Surface Pro or Surface Laptop, you won't be able to try Recall when it launches. (At least, not officially.)
Is Recall safe to use?
Look, there's no getting around it: Recall takes screenshots of almost everything you do on your PC (assuming you haven't adjusted its settings yourself). That means it won't stop taking screenshots when you enter or access sensitive information like passwords, your Social Security number, or banking data: If you can see it on-screen, chances are Recall is recording it. To use Recall is to accept that this happens on your PC at all times.
That said, from Microsoft's perspective, Recall is totally safe to use. Because it only runs on Copilot+ PCs, Recall is handled entirely on-device, with no processing outsourced to the cloud. That means everything, from the AI processing to the screenshots themselves, happens on your PC.
Plus, you have control over which apps and websites Recall takes screenshots of. If you don't want Recall to take screenshots when you use WhatsApp, you can tell it not to. You can also pause Recall for periods of time, and delete either recently taken screenshots or all screenshots stored on your device. Private browsing sessions in certain browsers, like Microsoft Edge and Chrome, as well as DRM content, like Netflix shows and movies, will also not be recorded. (Your secrets really are safe with private browsing, I guess.)
Recall will also now be opt-in only, according to The Verge: Previously, the feature was slated to be enabled by default, leaving you to disable it after setting up your PC. Now, you'll choose whether to turn it on during the initial setup process. On top of that, Microsoft will require you to authenticate with Windows Hello both when setting up Recall and when accessing its data, and it will encrypt the screenshots Recall takes as well as the search index database: Without Windows Hello authentication, you can't get at any of it.
This is excellent news, as Recall did not originally have these protections. Instead, the Recall database was decrypted when you logged into your PC, which opened a huge security hole for bad actors to take advantage of. This was apparent just from Microsoft's original demo of the feature, but we saw the true security risks this week thanks to security researcher Kevin Beaumont. Beaumont tested the feature out for himself on a PC without an NPU, and concluded that hackers would have no problem scraping your Recall information once you unlock your PC.
Beaumont discovered that when Windows saved these screenshots to your machine, it also extracted all of the text from the images and stored it as plain text. That's nearly everything you do on your PC, including banking information, private websites, and messages, saved as plain text (minus the aforementioned exceptions, of course). This information wasn't deleted when you deleted the associated data or app, either: If you deleted a message in Teams, for example, it would live on in your Recall database indefinitely. It wouldn't matter if the messages were set to auto-delete: If it appeared on screen, it would likely be saved to the database.
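To see why plain-text storage is the crux of the problem, here's a minimal sketch in Python. The file path, table, and column names are invented for illustration and are not Recall's actual schema; the point is simply that any program running under your account could search an unencrypted, plain-text database of your screen activity:

```python
# Illustrative sketch only: the path and schema below are assumptions,
# not Recall's real layout. It shows that an unencrypted, plain-text
# database is readable by any process running as the logged-in user.
import sqlite3
from pathlib import Path

# Hypothetical location of a plain-text activity database (assumption).
db_path = Path.home() / "AppData" / "Local" / "ExampleRecallStore" / "activity.db"

with sqlite3.connect(db_path) as conn:
    # A simple keyword search over captured screen text, e.g. for "password".
    rows = conn.execute(
        "SELECT timestamp, app_name, captured_text FROM screen_text "
        "WHERE captured_text LIKE ?",
        ("%password%",),
    ).fetchall()

for timestamp, app_name, text in rows:
    print(timestamp, app_name, text[:80])
```

No exploit, no privilege escalation: if the data sits on disk in the clear, reading it is a few lines of ordinary code, which is exactly what off-the-shelf infostealers automate.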
Further, Beaumont found hackers could employ readily available infostealers to scrape your entire Recall database in seconds. They wouldn't even need physical access to your computer: All they would need is for you to log in and decrypt your drive, at which point they could use remote hacking software to steal your Recall data. Beaumont actually did it to his own PC: Windows' built-in security tool, Microsoft Defender, did identify the infostealer he was using, but it took over 10 minutes to block it, and by then the infostealer had already scraped all of his Recall data.
The changes Microsoft has implemented to Recall following these findings are definitely positive: Beaumont likely wouldn't have been able to break in and scrape his Recall data so easily had the database remained encrypted after login. But there are still questions here: Recall will likely still save data from apps and files you delete yourself, for example. Will there be any way to easily erase this data if you want it gone from your computer completely? Plus, there's no avoiding the fact that Recall saves private information unredacted. That's a big risk should somebody figure out your Windows Hello PIN.
Microsoft is slowly figuring out how to implement Recall in a way that protects users, but it doesn't seem quite there yet. I no longer believe you should avoid this feature at all costs, but I don't necessarily endorse it, either. My advice? Keep an eye on how Microsoft continues to evolve the security surrounding Recall. Perhaps it'll find the right combination of protections to ensure a feature like this can't be abused.
If you do want to try Recall, and any other Copilot+ PC-exclusive features, you can preorder one of Microsoft's new Surface devices below:
Surface Laptop: Starting at $999.99
Surface Pro: Starting at $999.99