Concerns Remain Over Screenshot-Capture Feature and Microsoft’s Security Practices
How in the world has Microsoft’s leadership managed to get the debut of its forthcoming Recall feature for Windows so wrong on the security and privacy fronts?
Recall for Copilot+ is designed to capture screenshots of everything a user does on their Windows PC for potential replay later.
To give Microsoft the benefit of the doubt, perhaps Recall will astound. The company is certainly banking on it and other AI-enabled features to help its Windows Copilot+ laptops, set for release on June 18, better compete against Apple’s MacBooks.
At the same time, capturing all of this information in a single place on a device that already gets regularly targeted by hackers – that is, the Windows PC – seems risky for device users as well as employers. Taking screenshots of a user’s system is a feature that’s long been built not just into spyware, but also information-stealing malware, as thieves seek ways of recovering targets’ passwords or other sensitive information, not least to drain their bank accounts.
“The old security adage, ‘if you don’t store it you then don’t have to secure it,’ rings very true with regards to Recall,” said cybersecurity expert Brian Honan, who heads Dublin-based BH Consulting. “The Recall database on a device now provides attackers with a very valuable target to focus their resources on and we will no doubt see tools and attack methods quickly evolve to compromise that data.”
Silicon Valley has too often allowed an engineering mindset to trump privacy considerations. That’s what landed Google in hot water with the U.S. Federal Communications Commission in 2012, after an engineer turned the search giant’s Street View program into a “wardriving” exercise, capturing all unencrypted Wi-Fi data the program’s vehicles encountered, just in case it might prove useful later. Too often functionality gets added before anyone has properly answered the question of whether it should be there, or how such features must first be secured.
The Recall security stumble arrives as Microsoft has already been put on notice. In April, the U.S. Department of Homeland Security’s Cyber Safety Review Board issued an excoriating report concluding that Chinese espionage hackers’ hit on government users of Microsoft’s cloud-based products “was preventable and should never have occurred.”
In response, Microsoft CEO Satya Nadella pledged to do better, telling employees in a May 3 memo that they must prioritize “security above all else” and that “if you’re faced with the tradeoff between security and another priority, your answer is clear: Do security.”
Why then has Microsoft messed up the security protections it’s only belatedly bolting onto Recall?
After getting blowback over the feature, which is a cornerstone of Microsoft’s new offerings driven by artificial intelligence, Pavan Davuluri, Windows’ chief, on Friday announced several changes:
- Disabled by default: Recall will now ship disabled. “If you don’t proactively choose to turn it on, it will be off by default,” Davuluri said. Microsoft previously said users can also prevent Recall from taking screenshots of certain apps or websites.
- Extra checks: Accessing Recall will require a user to first enroll in Windows Hello, which uses facial recognition, a fingerprint or PIN code as an added authentication check.
- More encryption: Microsoft says it’s “encrypted the search index database” and also that “Recall snapshots will only be decrypted and accessible when the user authenticates.”
British cybersecurity expert Kevin Beaumont, a leading critic of Recall since its announcement, welcomed the latest assurances. “There are obviously going to be devils in the details – potentially big ones – but there’s some good elements here,” he wrote on social media. “Microsoft needs to commit to not trying to sneak users to enable it in the future, and it needs turning off by default in Group Policy and Intune for enterprise.” He also cautioned against enabling Recall anytime soon, saying “it’s not ready” for primetime.
No advance assurance from Microsoft can reliably predict how these features might yet be abused by attackers, whether through technological means or trickery (see: Microsoft Tweaks Recall for Security).
Alexander Hagenah, formerly an executive at commercial surveillance firm FinFisher, last week released a proof-of-concept tool called “TotalRecall” to extract a Recall database from a Windows PC.
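To illustrate the class of attack such tools demonstrate – this is a minimal, self-contained Python sketch, not Hagenah’s actual implementation, and the table and column names are assumptions rather than Recall’s real schema – consider what an info-stealer could do once it reaches an unencrypted, searchable index of everything shown on screen:

```python
import sqlite3

# Illustrative only: "captures" and its columns are hypothetical stand-ins
# for a Recall-style index of screenshots plus OCR'd screen text.
def build_demo_db(path):
    """Create a toy database standing in for a Recall-style search index."""
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE captures (ts INTEGER, app TEXT, window_title TEXT, ocr_text TEXT)"
    )
    con.executemany(
        "INSERT INTO captures VALUES (?, ?, ?, ?)",
        [
            (1718000000, "browser", "Online Banking - Log In", "username jdoe"),
            (1718000060, "mail", "Inbox", "password reset link"),
        ],
    )
    con.commit()
    return con

def search_captures(con, keyword):
    """One query trawls the entire index for secrets - titles and screen text alike."""
    return con.execute(
        "SELECT ts, app, window_title FROM captures "
        "WHERE ocr_text LIKE ? OR window_title LIKE ?",
        (f"%{keyword}%", f"%{keyword}%"),
    ).fetchall()

con = build_demo_db(":memory:")
for ts, app, title in search_captures(con, "password"):
    print(ts, app, title)
```

The point is the concentration of risk: once screenshots and their extracted text sit in a single database, recovering passwords or banking details reduces to a keyword query. That is why encryption at rest tied to user authentication, which Microsoft now promises, matters so much.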
Hagenah, who’s now head of the offensive cybersecurity team at Swiss financial market infrastructure firm Six, said Microsoft’s fixes looked “great on paper,” at least, and should block his tool from working.
“My main concern,” Hagenah told Information Security Media Group on Friday, “is that they decided just in the past three to four days to implement everything and still aim for a June 18 release. That leaves them with about 11 days to thoroughly threat model, develop and test. It’s hard to imagine that this will result in a secure, high-quality outcome.”
Microsoft’s much-hyped new feature and deployment timeline appear set to debut with important enterprise and privacy questions left unanswered.
“To be frank, the way Recall has been rolled out by Microsoft and this rushed response to the concerns raised by various cybersecurity and privacy professionals flies in the face of the recent ‘prioritizing security above all else’ announcement by the Microsoft CEO Satya Nadella,” Honan told me.
The feature could also pose “huge headaches for organizations that have strict regulatory requirements to comply with,” for example, with the EU General Data Protection Regulation or Payment Card Industry’s Data Security Standards, he said. “Should Recall be enabled on a device, then what controls and compliance policies can the organization enforce regarding the securing, storing, deletion and retention of personal data, credit card information, and other sensitive data?”
Expect regulators to soon pose precise questions to Microsoft executives about the security and privacy protections they included by design – and “we can’t recall” likely won’t be an answer they accept.