Medical Market Report

Microsoft’s New AI Product Sounds Like A Dystopian Nightmare

It’s been a big couple of days for Microsoft. The company’s annual Build conference, held from May 21-23 in Seattle, saw a slew of announcements about upcoming tech, many of them centering on the integration of AI into future devices. But one announcement in particular has already proven controversial – and might even land the company in legal hot water. 

The AI-powered “Recall” feature, touted by Microsoft as “an explorable timeline of your PC’s past,” has raised eyebrows among experts and laypeople alike since its reveal – and understandably so, since its entire shtick is to take screenshots of your active screen every few seconds and save them locally. The program can then search through these screenshots, as well as all of a user’s past activity – files, photos, emails, browsing history, you name it.
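To make the mechanism concrete, here is a minimal, purely illustrative sketch of what a Recall-style "capture and search" loop might look like. This is not Microsoft's implementation – the class name, schema, and the assumption that each snapshot has already been reduced to extracted text (a real system would OCR the screenshot image) are all hypothetical, using Python's built-in SQLite full-text search:

```python
import sqlite3
import time

class SnapshotIndex:
    """Hypothetical local index of periodic screen snapshots.

    Assumes each snapshot has already been converted to text;
    a real Recall-like system would store images and OCR them.
    """

    def __init__(self, path=":memory:"):
        # Everything stays on the local machine, mirroring the
        # "kept on the local hard disk" design the article describes.
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE VIRTUAL TABLE IF NOT EXISTS snaps USING fts5(ts, text)"
        )

    def capture(self, screen_text):
        # In the real feature a screenshot is taken every few seconds;
        # here we just record the (simulated) on-screen text with a timestamp.
        self.db.execute(
            "INSERT INTO snaps VALUES (?, ?)", (str(time.time()), screen_text)
        )

    def search(self, query):
        # Full-text search over everything ever captured.
        cur = self.db.execute(
            "SELECT text FROM snaps WHERE snaps MATCH ?", (query,)
        )
        return [row[0] for row in cur.fetchall()]

idx = SnapshotIndex()
idx.capture("Draft email to Alice about Q3 budget")
idx.capture("Browsing: flight prices to Seattle")
print(idx.search("budget"))
```

The sketch also makes the security concern below tangible: a single searchable database accumulating everything that ever appeared on screen is exactly the "single location" that experts worry about.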


It is, AI and privacy advisor Dr Kris Shrishak told the BBC, a potential “privacy nightmare.”

“The mere fact that screenshots will be taken during use of the device could have a chilling effect on people,” Shrishak said. “People might avoid visiting certain websites and accessing documents, especially confidential documents, when Microsoft is taking screenshots every few seconds.”

Microsoft, meanwhile, has stressed that there are safeguards in place: privacy has been “built […] into Recall’s design from the ground up,” the company states, with users able to limit when screenshots are taken.

“Recall snapshots are kept on […] on the local hard disk, and are protected using data encryption on your device,” the company says. “Recall screenshots are only linked to a specific user profile and Recall does not share them with other users, make them available for Microsoft to view, or use them for targeting advertisements […] Recall screenshots are not available to other users or accessed by other applications or services.”


Sounds reassuring – but security experts aren’t convinced. Microsoft itself notes that sensitive information like passwords and financial account numbers won’t be hidden in screenshots; a computer whose Recall privacy settings are misconfigured, that is targeted by malware, or whose safeguards simply fail could therefore pose a security threat on a massive scale.

“It is a one-shot attack for criminals,” Muhammad Yahya Patel, lead security engineer at Check Point, told TechRadar.

“[It’s] like a grab and go, but with Recall they will essentially have everything in a single location [your screenshot database],” Patel said. “Imagine the goldmine of information that will be stored on a machine, and what threat actors can do with it.” 

In fact, so concerning are the questions surrounding Recall that Microsoft has already found itself in legal trouble – at least in the UK, where the Information Commissioner’s Office (ICO), the national data protection authority that reports directly to Parliament, announced that it is “making enquiries with Microsoft to understand the safeguards in place to protect user privacy.”


“We expect organisations to be transparent with users about how their data is being used and only process personal data to the extent that it is necessary to achieve a specific purpose,” the office said in a statement released Wednesday. “Industry must consider data protection from the outset and rigorously assess and mitigate risks to people’s rights and freedoms before bringing products to market.”

